WorldWideScience

Sample records for maximum credible earthquake

  1. The MCE (Maximum Credible Earthquake) - an approach to reduction of seismic risk

    International Nuclear Information System (INIS)

    Asmis, G.J.K.; Atchison, R.J.

    1979-01-01

    It is the responsibility of the Regulatory Body (in Canada, the AECB) to ensure that radiological risks resulting from the effects of earthquakes on nuclear facilities do not exceed acceptable levels. In simplified numerical terms this means that the frequency of an unacceptable radiation dose must be kept below 10^-6 per annum. Unfortunately, seismic events fall into the class of external events which are not well defined at these low frequency levels. Thus, design earthquakes have been chosen at the 10^-3 to 10^-4 frequency level, a level commensurate with the limits of statistical data. There exists, therefore, a need to define an additional level of earthquake. A seismic design explicitly and implicitly recognizes three levels of earthquake loading: one comfortably below yield, one at or about yield, and one at ultimate. The ultimate-level earthquake, contrary to the first two, has been addressed only implicitly by conscientious designers, through the choice of systems, materials and details compatible with postulated dynamic forces. It is the purpose of this paper to discuss the regulatory specifications required to quantify this third level, or Maximum Credible Earthquake (MCE). (orig.)

  2. LCLS Maximum Credible Beam Power

    International Nuclear Information System (INIS)

    Clendenin, J.

    2005-01-01

    The maximum credible beam power is defined as the highest credible average beam power that the accelerator can deliver to the point in question, given the laws of physics, the beam line design, and assuming all protection devices have failed. For a new accelerator project, the official maximum credible beam power is determined by project staff in consultation with the Radiation Physics Department, after examining the arguments and evidence presented by the appropriate accelerator physicist(s) and beam line engineers. The definitive parameter becomes part of the project's safety envelope. This technical note will first review the studies that were done for the Gun Test Facility (GTF) at SSRL, where a photoinjector similar to the one proposed for the LCLS is being tested. In Section 3 the maximum charge out of the gun for a single rf pulse is calculated. In Section 4, PARMELA simulations are used to track the beam from the gun to the end of the photoinjector. Finally, in Section 5 the beam through the matching section and injected into Linac-1 is discussed.
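
    The average-beam-power definition above reduces to charge per pulse times beam energy times repetition rate. The following is an illustrative sketch only, not the official LCLS safety-envelope calculation; all numbers in the example are hypothetical placeholders.

```python
# Illustrative sketch: average beam power from per-pulse charge, beam
# energy, and repetition rate. Hypothetical numbers, not official
# LCLS safety-envelope values.

def average_beam_power(charge_per_pulse_nc, beam_energy_gev, rep_rate_hz):
    """Average beam power in watts: P = Q * E * f.

    A charge Q (coulombs) at beam energy E (eV per particle) carries
    Q * E joules per pulse, since 1 eV per elementary charge = 1 J/C.
    """
    charge_c = charge_per_pulse_nc * 1e-9   # nC -> C
    energy_ev = beam_energy_gev * 1e9       # GeV -> eV
    return charge_c * energy_ev * rep_rate_hz

# Hypothetical example: 1 nC per pulse at 14 GeV, 120 Hz.
print(average_beam_power(1.0, 14.0, 120))  # -> 1680.0 (W)
```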

  3. Maximum credible earthquake (MCE) magnitude of structures affecting the Ujung Lemahabang site

    International Nuclear Information System (INIS)

    Soerjodibroto, M.

    1997-01-01

    This report analyses the geological structures in and around the Muria Peninsula that might pose a potential earthquake hazard to the selected NPP site, Ujung Lemahabang (ULA). The analysis focused on the Lasem fault and the AF-1/AF-4 offshore faults, which are considered the determinant structures affecting the seismicity of ULA (Nira, 1979; Newjec, 1994). Methods for estimating the MCE of these structures include the maximum historical earthquake and relationships between fault length and the magnitude of earthquakes originating from the known structure (Tocher, Iida, Matsuda, Wells and Coppersmith). The MCE magnitudes estimated by these methods for earthquakes originating along the Lasem and AF-1/AF-4 faults vary from M2.1 to M7.0. Comparison between the historical data and the fault length-magnitude relationships, however, suggests an MCE magnitude of Ms=7.0 for both fault zones. (author)
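
    The fault length-magnitude relationships cited (Tocher, Iida, Matsuda, Wells and Coppersmith) generally take the form M = a + b*log10(L). As one representative choice, a minimal sketch using the all-slip-type surface-rupture-length regression of Wells and Coppersmith (1994); this is a generic illustration, not the report's own calculation for the Lasem or AF-1/AF-4 faults:

```python
import math

def mw_from_surface_rupture_length(srl_km):
    """Moment magnitude from surface rupture length (km), using the
    all-slip-type regression of Wells and Coppersmith (1994):
    M = 5.08 + 1.16 * log10(SRL)."""
    return 5.08 + 1.16 * math.log10(srl_km)

# Hypothetical 100 km rupture:
print(round(mw_from_surface_rupture_length(100.0), 2))  # -> 7.4
```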

  4. Maximum credible earthquake (MCE) magnitude of structures affecting the Ujung Lemahabang site

    Energy Technology Data Exchange (ETDEWEB)

    Soerjodibroto, M. [National Atomic Energy Agency, Jakarta (Indonesia)]

    1997-03-01

    This report analyses the geological structures in and around the Muria Peninsula that might pose a potential earthquake hazard to the selected NPP site, Ujung Lemahabang (ULA). The analysis focused on the Lasem fault and the AF-1/AF-4 offshore faults, which are considered the determinant structures affecting the seismicity of ULA (Nira, 1979; Newjec, 1994). Methods for estimating the MCE of these structures include the maximum historical earthquake and relationships between fault length and the magnitude of earthquakes originating from the known structure (Tocher, Iida, Matsuda, Wells and Coppersmith). The MCE magnitudes estimated by these methods for earthquakes originating along the Lasem and AF-1/AF-4 faults vary from M2.1 to M7.0. Comparison between the historical data and the fault length-magnitude relationships, however, suggests an MCE magnitude of Ms=7.0 for both fault zones. (author)

  5. Estimation of maximum credible atmospheric radioactivity concentrations and dose rates from nuclear tests

    International Nuclear Information System (INIS)

    Telegadas, K.

    1979-01-01

    A simple technique is presented for estimating maximum credible gross beta air concentrations from nuclear detonations in the atmosphere, based on aircraft sampling of radioactivity following each Chinese nuclear test from 1964 to 1976. The calculated concentration is a function of the total yield and fission yield, initial vertical radioactivity distribution, time after detonation, and rate of horizontal spread of the debris with time. Calculated maximum credible concentrations are compared with the highest concentrations measured during aircraft sampling. The technique provides a reasonable estimate of maximum air concentrations from 1 to 10 days after a detonation. An estimate of the whole-body external gamma dose rate corresponding to the maximum credible gross beta concentration is also given. (author)

  6. Dose assessment around TR-2 reactor due to maximum credible accident

    International Nuclear Information System (INIS)

    Turgut, M. H.; Adalioglu, U.; Aytekin, A.

    2001-01-01

    The revision of the safety analysis report of the TR-2 research reactor was initiated in 1995. The whole accident analysis and the accepted scenario for the maximum credible accident have been revised according to the new safety concepts, and the environmental impact of this scenario has been assessed. This paper comprises all results of these calculations. The accepted maximum credible accident scenario is partial blockage of the whole reactor core, which results in the release of 25% of the core inventory. The DOSER code, which uses very conservative modelling of atmospheric dispersion, was modified for the assessment calculations. Pasquill conditions based on the local weather observations, topography, and building effects were considered. The thyroid and whole-body doses for 16 sectors and up to 10 km of distance around CNAEM were obtained. Release models were a puff release and a prolonged release of two hours' duration. Realistic release fractions for the active isotopes were chosen from the literature.

  7. Maximum credible accident analysis for TR-2 reactor conceptual design

    International Nuclear Information System (INIS)

    Manopulo, E.

    1981-01-01

    A new reactor, TR-2, of 5 MW, designed in cooperation with CEN/Grenoble, is under construction in the open pool of the TR-1 reactor of 1 MW set up by AMF Atomics at the Cekmece Nuclear Research and Training Center. In this report the fission product inventory and the doses released after the maximum credible accident have been studied. The diffusion of the gaseous fission products to the environment and the potential radiation risks to the population have been evaluated.

  8. Credible occurrence probabilities for extreme geophysical events: earthquakes, volcanic eruptions, magnetic storms

    Science.gov (United States)

    Love, Jeffrey J.

    2012-01-01

    Statistical analysis is made of rare, extreme geophysical events recorded in historical data -- counting the number of events $k$ with sizes that exceed chosen thresholds during specific durations of time $\tau$. Under transformations that stabilize data and model-parameter variances, the most likely Poisson-event occurrence rate, $k/\tau$, applies for frequentist inference and, also, for Bayesian inference with a Jeffreys prior that ensures posterior invariance under changes of variables. Frequentist confidence intervals and Bayesian (Jeffreys) credibility intervals are approximately the same and easy to calculate: $(1/\tau)[(\sqrt{k} - z/2)^{2},(\sqrt{k} + z/2)^{2}]$, where $z$ is a parameter that specifies the width, $z=1$ ($z=2$) corresponding to $1\sigma$, $68.3\%$ ($2\sigma$, $95.4\%$). If only a few events have been observed, as is usually the case for extreme events, then these "error-bar" intervals might be considered to be relatively wide. From historical records, we estimate most likely long-term occurrence rates, 10-yr occurrence probabilities, and intervals of frequentist confidence and Bayesian credibility for large earthquakes, explosive volcanic eruptions, and magnetic storms.
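
    The interval quoted in the abstract is easy to evaluate directly. A minimal sketch of that formula, $(1/\tau)[(\sqrt{k} - z/2)^{2},(\sqrt{k} + z/2)^{2}]$ (the event counts in the example are hypothetical):

```python
import math

def poisson_rate_interval(k, tau, z=1.0):
    """Approximate confidence/credibility interval for a Poisson
    occurrence rate, using the variance-stabilized form from the
    abstract: (1/tau) * [(sqrt(k) - z/2)^2, (sqrt(k) + z/2)^2].
    z=1 corresponds to ~68.3%, z=2 to ~95.4%."""
    lo = max(math.sqrt(k) - z / 2.0, 0.0) ** 2 / tau
    hi = (math.sqrt(k) + z / 2.0) ** 2 / tau
    return lo, hi

# Hypothetical record: 4 extreme events in 100 years, 1-sigma width.
lo, hi = poisson_rate_interval(4, 100.0, z=1.0)
print(lo, hi)  # -> 0.0225 0.0625 (events per year)
```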

  9. Probable Maximum Earthquake Magnitudes for the Cascadia Subduction

    Science.gov (United States)

    Rong, Y.; Jackson, D. D.; Magistrale, H.; Goldfinger, C.

    2013-12-01

    The concept of maximum earthquake magnitude (mx) is widely used in seismic hazard and risk analysis. However, absolute mx lacks a precise definition and cannot be determined from a finite earthquake history. The surprising magnitudes of the 2004 Sumatra and the 2011 Tohoku earthquakes showed that most methods for estimating mx underestimate the true maximum if it exists. Thus, we introduced the alternate concept of mp(T), the probable maximum magnitude within a time interval T. The mp(T) can be solved using theoretical magnitude-frequency distributions such as the tapered Gutenberg-Richter (TGR) distribution. The two TGR parameters, the β-value (which equals 2/3 of the b-value in the GR distribution) and the corner magnitude (mc), can be obtained by applying the maximum likelihood method to earthquake catalogs with an additional constraint from the tectonic moment rate. Here, we integrate the paleoseismic data in the Cascadia subduction zone to estimate mp. The Cascadia subduction zone has been seismically quiescent since at least 1900. Fortunately, turbidite studies have unearthed a 10,000 year record of great earthquakes along the subduction zone. We thoroughly investigate the earthquake magnitude-frequency distribution of the region by combining instrumental and paleoseismic data, and using the tectonic moment rate information. To use the paleoseismic data, we first estimate event magnitudes, which we achieve by using the time interval between events, the rupture extent of the events, and turbidite thickness. We estimate three sets of TGR parameters: for the first two sets, we consider a geographically large Cascadia region that includes the subduction zone, and the Explorer, Juan de Fuca, and Gorda plates; for the third set, we consider a narrow geographic region straddling the subduction zone. In the first set, the β-value is derived using the GCMT catalog. In the second and third sets, the β-value is derived using both the GCMT and paleoseismic data. Next, we calculate the corresponding mc
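
    The TGR complementary cumulative distribution in moment space is commonly written S(M) = (Mt/M)^β · exp((Mt − M)/Mc), with Mt the threshold moment and Mc the corner moment. A sketch under that assumption, using the standard Kanamori moment-magnitude conversion; this is not the authors' code, and the parameter values in the example are illustrative only:

```python
import math

def moment_from_mw(mw):
    """Seismic moment in N·m (Kanamori): M0 = 10**(1.5*Mw + 9.05)."""
    return 10.0 ** (1.5 * mw + 9.05)

def tgr_survivor(mw, mw_t, beta, mw_c):
    """Tapered Gutenberg-Richter complementary CDF in moment space:
    S(M) = (Mt/M)**beta * exp((Mt - M)/Mc) for M >= Mt,
    with threshold magnitude mw_t and corner magnitude mw_c."""
    m, mt, mc = (moment_from_mw(x) for x in (mw, mw_t, mw_c))
    return (mt / m) ** beta * math.exp((mt - m) / mc)

# Illustrative: fraction of events at or above Mw 9, given threshold
# Mw 5, beta = 0.65, and corner magnitude Mw 8.5.
print(tgr_survivor(9.0, 5.0, 0.65, 8.5))
```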

  10. Simulation of the WWER-440/213 maximum credible accident at the EhNITs stand

    International Nuclear Information System (INIS)

    Blinkov, V.N.; Melikhov, O.I.; Melikhov, V.I.; Davydov, M.V.; Sokolin, A.V.; Shchepetil'nikov, Eh.Yu.

    2000-01-01

    Calculations of thermohydraulic processes with the ATHLET code are presented, aimed at determining optimal conditions for modelling the coolant leakage at the EhNITs test stand during the maximum credible accident at an NPP with a WWER-440/213 reactor. Parametric calculations determine the nozzle diameters at the stand that satisfy the local criterion of agreement with the NPP data (maximum flow rate) and the integral criterion of agreement (mass and energy of the coolant discharged during 10 s).

  11. High-frequency maximum observable shaking map of Italy from fault sources

    KAUST Repository

    Zonno, Gaetano

    2012-03-17

    We present a strategy for obtaining fault-based maximum observable shaking (MOS) maps, which represent an innovative concept for assessing deterministic seismic ground motion at a regional scale. Our approach uses the fault sources supplied for Italy by the Database of Individual Seismogenic Sources, and particularly by its composite seismogenic sources (CSS), a spatially continuous simplified 3-D representation of a fault system. For each CSS, we consider the associated Typical Fault, i.e., the portion of the corresponding CSS that can generate the maximum credible earthquake. We then compute the high-frequency (1-50 Hz) ground shaking for a rupture model derived from its associated maximum credible earthquake. As the Typical Fault floats within its CSS to occupy all possible positions of the rupture, the high-frequency shaking is updated in the area surrounding the fault, and the maximum from that scenario is extracted and displayed on a map. The final high-frequency MOS map of Italy is then obtained by merging 8,859 individual scenario simulations, from which the ground shaking parameters have been extracted. To explore the internal consistency of our calculations and validate the results of the procedure we compare our results (1) with predictions based on the Next Generation Attenuation ground-motion equations for an earthquake of Mw 7.1, (2) with the predictions of the official Italian seismic hazard map, and (3) with macroseismic intensities included in the DBMI04 Italian database. We then examine the uncertainties and analyse the variability of ground motion for different fault geometries and slip distributions. © 2012 Springer Science+Business Media B.V.

  12. High-frequency maximum observable shaking map of Italy from fault sources

    KAUST Repository

    Zonno, Gaetano; Basili, Roberto; Meroni, Fabrizio; Musacchio, Gemma; Mai, Paul Martin; Valensise, Gianluca

    2012-01-01

    We present a strategy for obtaining fault-based maximum observable shaking (MOS) maps, which represent an innovative concept for assessing deterministic seismic ground motion at a regional scale. Our approach uses the fault sources supplied for Italy by the Database of Individual Seismogenic Sources, and particularly by its composite seismogenic sources (CSS), a spatially continuous simplified 3-D representation of a fault system. For each CSS, we consider the associated Typical Fault, i.e., the portion of the corresponding CSS that can generate the maximum credible earthquake. We then compute the high-frequency (1-50 Hz) ground shaking for a rupture model derived from its associated maximum credible earthquake. As the Typical Fault floats within its CSS to occupy all possible positions of the rupture, the high-frequency shaking is updated in the area surrounding the fault, and the maximum from that scenario is extracted and displayed on a map. The final high-frequency MOS map of Italy is then obtained by merging 8,859 individual scenario simulations, from which the ground shaking parameters have been extracted. To explore the internal consistency of our calculations and validate the results of the procedure we compare our results (1) with predictions based on the Next Generation Attenuation ground-motion equations for an earthquake of Mw 7.1, (2) with the predictions of the official Italian seismic hazard map, and (3) with macroseismic intensities included in the DBMI04 Italian database. We then examine the uncertainties and analyse the variability of ground motion for different fault geometries and slip distributions. © 2012 Springer Science+Business Media B.V.

  13. Assessment of precast beam-column using capacity demand response spectrum subject to design basis earthquake and maximum considered earthquake

    Science.gov (United States)

    Ghani, Kay Dora Abd.; Tukiar, Mohd Azuan; Hamid, Nor Hayati Abdul

    2017-08-01

    Malaysia is surrounded by the tectonic features of the Sumatera area, which consists of two seismically active inter-plate boundaries, namely the Indo-Australian and Eurasian Plates on the west and the Philippine Plate on the east. Hence, Malaysia experiences tremors from distant earthquakes occurring in Banda Aceh, Nias Island, Padang and other parts of Sumatera, Indonesia. In order to assess the safety of precast buildings in Malaysia under near-field ground motion, response spectrum analysis can be used to deal with future earthquakes whose specific nature is unknown. This paper aims to develop capacity-demand response spectra subject to the Design Basis Earthquake (DBE) and the Maximum Considered Earthquake (MCE) in order to assess the performance of precast beam-column joints. From the capacity-demand response spectrum analysis, it can be concluded that the precast beam-column joints would not survive earthquake excitation with a surface-wave magnitude, Mw, of more than 5.5 (Type 1 spectra). This means that a beam-column joint designed using the current code of practice (BS8110) would be severely damaged when subjected to high earthquake excitation. The capacity-demand response spectrum analysis also shows that the precast beam-column joints in the prototype studied would be severely damaged when subjected to the Maximum Considered Earthquake (MCE) with PGA = 0.22g and a surface-wave magnitude of more than 5.5 (Type 1 spectra).

  14. Safety analysis of RA reactor operation, I-III, Part III - Environmental effect of the maximum credible accident

    International Nuclear Information System (INIS)

    Raisic, N.

    1963-02-01

    A maximum credible accident at the RA reactor would involve the release of fission products into the environment, resulting from fuel element failure or meltdown due to loss of coolant. The analysis presented in this report assumes that the reactor was operating at nominal power at the moment of the accident. The report includes calculations of fission product activity at the moment of the accident, the total activity released during the accident, the concentration of radioactive material in the air in the reactor neighbourhood, and an analysis of the accident's environmental effects.

  15. Predicting the Maximum Earthquake Magnitude from Seismic Data in Israel and Its Neighboring Countries.

    Science.gov (United States)

    Last, Mark; Rabinowitz, Nitzan; Leonard, Gideon

    2016-01-01

    This paper explores several data mining and time series analysis methods for predicting the magnitude of the largest seismic event in the next year based on the previously recorded seismic events in the same region. The methods are evaluated on a catalog of 9,042 earthquake events, which took place between 01/01/1983 and 31/12/2010 in the area of Israel and its neighboring countries. The data was obtained from the Geophysical Institute of Israel. Each earthquake record in the catalog is associated with one of 33 seismic regions. The data was cleaned by removing foreshocks and aftershocks. In our study, we have focused on the ten most active regions, which account for more than 80% of the total number of earthquakes in the area. The goal is to predict whether the maximum earthquake magnitude in the following year will exceed the median of maximum yearly magnitudes in the same region. Since the analyzed catalog includes only 28 years of complete data, the last five annual records of each region (referring to the years 2006-2010) are kept for testing while using the previous annual records for training. The predictive features are based on the Gutenberg-Richter Ratio as well as on some new seismic indicators based on the moving averages of the number of earthquakes in each area. The new predictive features prove to be much more useful than the indicators traditionally used in the earthquake prediction literature. The most accurate result (AUC = 0.698) is reached by the Multi-Objective Info-Fuzzy Network (M-IFN) algorithm, which takes into account the association between two target variables: the number of earthquakes and the maximum earthquake magnitude during the same year.

  16. What controls the maximum magnitude of injection-induced earthquakes?

    Science.gov (United States)

    Eaton, D. W. S.

    2017-12-01

    Three different approaches for estimation of maximum magnitude are considered here, along with their implications for managing risk. The first approach is based on a deterministic limit for seismic moment proposed by McGarr (1976), which was originally designed for application to mining-induced seismicity. This approach has since been reformulated for earthquakes induced by fluid injection (McGarr, 2014). In essence, this method assumes that the upper limit for seismic moment release is constrained by the pressure-induced stress change. A deterministic limit is given by the product of the shear modulus and the net injected fluid volume. This method is based on the assumptions that the medium is fully saturated and in a state of incipient failure. An alternative geometrical approach was proposed by Shapiro et al. (2011), who postulated that the rupture area for an induced earthquake falls entirely within the stimulated volume. This assumption reduces the maximum-magnitude problem to one of estimating the largest potential slip surface area within a given stimulated volume. Finally, van der Elst et al. (2016) proposed that the maximum observed magnitude, statistically speaking, is the expected maximum value for a finite sample drawn from an unbounded Gutenberg-Richter distribution. These three models imply different approaches for risk management. The deterministic method proposed by McGarr (2014) implies that a ceiling on the maximum magnitude can be imposed by limiting the net injected volume, whereas the approach developed by Shapiro et al. (2011) implies that the time-dependent maximum magnitude is governed by the spatial size of the microseismic event cloud. Finally, the sample-size hypothesis of van der Elst et al. (2016) implies that the best available estimate of the maximum magnitude is based upon observed seismicity rate. The latter two approaches suggest that real-time monitoring is essential for effective management of risk. A reliable estimate of maximum
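
    The McGarr (2014) deterministic bound described above, M0_max = G·ΔV, converts to a moment magnitude in a few lines. A minimal sketch with the standard Kanamori conversion; the shear modulus and injected volume in the example are illustrative assumptions, not values from this abstract:

```python
import math

def mcgarr_max_moment(shear_modulus_pa, injected_volume_m3):
    """Deterministic upper bound on cumulative seismic moment
    (McGarr, 2014): M0_max = G * dV, with G the shear modulus (Pa)
    and dV the net injected fluid volume (m^3)."""
    return shear_modulus_pa * injected_volume_m3

def mw_from_moment(m0_nm):
    """Moment magnitude (Kanamori): Mw = (2/3) * (log10(M0) - 9.05)."""
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.05)

# Illustrative case: G = 30 GPa, 100,000 m^3 net injected volume.
m0 = mcgarr_max_moment(30e9, 1e5)        # 3e15 N·m
print(round(mw_from_moment(m0), 2))      # -> 4.28
```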

  17. Mining-induced earthquakes monitored during pit closure in the Midlothian Coalfield

    Energy Technology Data Exchange (ETDEWEB)

    Redmayne, D.W.; Richards, J.A.; Wild, P.W. [British Geological Survey, Edinburgh (United Kingdom). Global Seismology and Geomagnetism Group

    1998-06-01

    The British Geological Survey installed a seismometer network to monitor earthquakes around Rosslyn Chapel in the Midlothian Coalfield from November 1987 until January 1990. Accurate locations were obtained for 247 events and a close spatial and temporal association with concurrent coal mining, with a rapid decay of earthquake activity following pit closure, was demonstrated, indicating a mining-induced cause. Residual stress from past mining appears to have been an important factor in generating seismicity, and observations indicate that limiting the width of the workings or rate of extraction may significantly reduce or eliminate mining-induced earthquake activity. A frequency-magnitude analysis indicates a relatively high abundance of small events in this coalfield area. The maximum magnitude of a mining-induced earthquake likely to have been experienced during the life of the coalfield (maximum credible magnitude) was ML 3.0, although an extreme event (maximum possible magnitude) as large as ML 3.4 was remotely possible. Significant seismic amplification was observed at Rosslyn Chapel, which is founded on sand and gravel, compared with a nearby bedrock site. As a consequence, relatively small magnitude events caused high, and occasionally damaging, seismic intensities at the chapel.

  18. Global correlations between maximum magnitudes of subduction zone interface thrust earthquakes and physical parameters of subduction zones

    NARCIS (Netherlands)

    Schellart, W. P.; Rawlinson, N.

    2013-01-01

    The maximum earthquake magnitude recorded for subduction zone plate boundaries varies considerably on Earth, with some subduction zone segments producing giant subduction zone thrust earthquakes (e.g. Chile, Alaska, Sumatra-Andaman, Japan) and others producing relatively small earthquakes (e.g.

  19. Maximum credible yield for deuterium-filled double shell imaging targets meeting requirements for yield bin Category A

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, Douglas Carl [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Loomis, Eric Nicholas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-17

    We are anticipating our first NIF double shell shot using an aluminum ablator and a glass inner shell filled with deuterium, shown in figure 1. The expected yield is between a few 10^10 and a few 10^11 DD neutrons. The maximum credible yield is 5×10^13. This memo describes why, and what would be expected with variations on the target. It evaluates the maximum credible yield for deuterium-filled double shell capsule targets with an aluminum ablator shell and a glass inner shell in yield Category A (< 10^14 neutrons). It also pertains to fills of gas diluted with hydrogen, helium (3He or 4He), or any other fuel except tritium. This memo does not apply to lower-Z ablator dopants, such as beryllium, as these would increase the ablation efficiency. This evaluation is for 5.75-scale hohlraum targets of either gold or uranium with helium gas fills of density between 0 and 1.6 mg/cc. It could be extended to other hohlraum sizes and shapes with slight modifications. At present only laser pulse energies up to 1.5 MJ were considered, with a single-step laser pulse of arbitrary shape. Since yield decreases with laser energy for this target, the memo could be extended to higher laser energies if desired. These maximum laser parameters are near the edge of NIF's capability and constitute the operating envelope for experiments covered by this memo. We have not considered multiple-step pulses, which would probably offer no performance advantage and are not planned for double shell capsules. The main target variables are summarized in Table 1 and explained in detail in the memo. Predicted neutron yields are based on 1D and 2D clean simulations.

  20. Maximum magnitude of injection-induced earthquakes: A criterion to assess the influence of pressure migration along faults

    Science.gov (United States)

    Norbeck, Jack H.; Horne, Roland N.

    2018-05-01

    The maximum expected earthquake magnitude is an important parameter in seismic hazard and risk analysis because of its strong influence on ground motion. In the context of injection-induced seismicity, the processes that control how large an earthquake will grow may be influenced by operational factors under engineering control as well as natural tectonic factors. Determining the relative influence of these effects on maximum magnitude will impact the design and implementation of induced seismicity management strategies. In this work, we apply a numerical model that considers the coupled interactions of fluid flow in faulted porous media and quasidynamic elasticity to investigate the earthquake nucleation, rupture, and arrest processes for cases of induced seismicity. We find that under certain conditions, earthquake ruptures are confined to a pressurized region along the fault with a length-scale that is set by injection operations. However, earthquakes are sometimes able to propagate as sustained ruptures outside of the zone that experienced a pressure perturbation. We propose a faulting criterion that depends primarily on the state of stress and the earthquake stress drop to characterize the transition between pressure-constrained and runaway rupture behavior.

  1. RA reactor safety analysis I-III, Part III - Environmental effect of the maximum credible accident

    Energy Technology Data Exchange (ETDEWEB)

    Raisic, N [Institute of Nuclear Sciences Boris Kidric, Vinca, Beograd (Serbia and Montenegro)

    1963-02-15

    The objective of the maximum credible accident analysis was to determine the integral radiation doses in the vicinity of the reactor and in the environment. In the case of the RA reactor, the maximum credible accident, meaning release of fission products, would be caused by fuel element meltdown. This analysis includes the following calculation results: activity of the fission products, volatility of the fission products, concentration of radioactive materials in the air, and an analysis of the accident's environmental effects.

  2. Localization of b-values and maximum earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Kurimoto, H

    1996-05-01

    It has been suggested that temporal and spatial gaps in earthquake activity contribute to earthquake occurrence probability. On the assumption that, if so, this tendency should also appear in statistical parameters of seismicity, earthquake activity in each ten-year period was investigated through the relation between the spatial distribution of b-values (the slope of the frequency-magnitude line) and the epicenters of earthquakes of M≥7.0. The surveyed field is the Japanese Islands and the surrounding ocean; each unit region is a circle of 100 km radius centered on grid points spaced at 1° intervals of latitude and longitude. Focal depths are divided into shallower and deeper than 60 km. As a result, the following was found: most epicenters of earthquakes with M≥7.0 during the 100-year survey period lie in areas with b-value ≤ 0.75, although some lie in areas with b ≥ 0.75 in the region from the ocean near the Izu Peninsula to the ocean off western Hokkaido; epicenters in areas with b ≤ 0.75 appear not to approach the center of the contour indicating the maximum b-value. 7 refs., 2 figs.
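
    The b-value mapped in studies like this is, in modern practice, usually estimated by maximum likelihood rather than by fitting the frequency-magnitude line directly. A sketch of the standard Aki (1965) estimator, applied to a hypothetical catalog (not data from this study):

```python
import math

def b_value_aki(magnitudes, m_min):
    """Maximum-likelihood b-value (Aki, 1965):
    b = log10(e) / (mean(M) - Mmin),
    where Mmin is the completeness magnitude. Events below m_min
    are excluded before taking the mean."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - m_min)

# Hypothetical catalog with completeness magnitude 4.0:
catalog = [4.0, 4.2, 4.1, 4.9, 4.3, 5.6, 4.05, 4.5]
print(round(b_value_aki(catalog, 4.0), 2))  # -> 0.95
```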

  3. Characterising large scenario earthquakes and their influence on NDSHA maps

    Science.gov (United States)

    Magrin, Andrea; Peresan, Antonella; Panza, Giuliano F.

    2016-04-01

    The neo-deterministic approach to seismic zoning, NDSHA, relies on physically sound modelling of ground shaking from a large set of credible scenario earthquakes, which can be defined based on seismic history and seismotectonics, as well as incorporating information from a wide set of geological and geophysical data (e.g. morphostructural features and present day deformation processes identified by Earth observations). NDSHA is based on the calculation of complete synthetic seismograms; hence it does not make use of empirical attenuation models (i.e. ground motion prediction equations). From the set of synthetic seismograms, maps of seismic hazard that describe the maximum of different ground shaking parameters at the bedrock can be produced. As a rule, the NDSHA, defines the hazard as the envelope ground shaking at the site, computed from all of the defined seismic sources; accordingly, the simplest outcome of this method is a map where the maximum of a given seismic parameter is associated to each site. In this way, the standard NDSHA maps permit to account for the largest observed or credible earthquake sources identified in the region in a quite straightforward manner. This study aims to assess the influence of unavoidable uncertainties in the characterisation of large scenario earthquakes on the NDSHA estimates. The treatment of uncertainties is performed by sensitivity analyses for key modelling parameters and accounts for the uncertainty in the prediction of fault radiation and in the use of Green's function for a given medium. Results from sensitivity analyses with respect to the definition of possible seismic sources are discussed. A key parameter is the magnitude of seismic sources used in the simulation, which is based on information from earthquake catalogue, seismogenic zones and seismogenic nodes. 
Most of the existing Italian catalogues are based on macroseismic intensities; a rough estimate of the error in peak values of ground motion can
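The envelope logic described in the abstract above can be sketched in a few lines; the grid of per-scenario peak values below is purely hypothetical and stands in for the synthetic-seismogram outputs:

```python
import numpy as np

# Hypothetical peak ground accelerations (PGA, in g) computed at 5 sites
# for 3 credible scenario earthquakes; all values are illustrative only.
pga_per_scenario = np.array([
    [0.10, 0.22, 0.05, 0.31, 0.12],  # scenario 1
    [0.18, 0.15, 0.09, 0.25, 0.30],  # scenario 2
    [0.07, 0.28, 0.11, 0.19, 0.08],  # scenario 3
])

# NDSHA-style hazard map value: the envelope (site-wise maximum) of the
# chosen ground-shaking parameter over all defined seismic sources.
hazard_map = pga_per_scenario.max(axis=0)
print(hazard_map)  # site-wise maxima: 0.18, 0.28, 0.11, 0.31, 0.30
```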

  4. Tsunami hazard in the Caribbean: Regional exposure derived from credible worst case scenarios

    Science.gov (United States)

    Harbitz, C. B.; Glimsdal, S.; Bazin, S.; Zamora, N.; Løvholt, F.; Bungum, H.; Smebye, H.; Gauer, P.; Kjekstad, O.

    2012-04-01

    The present study documents a high tsunami hazard in the Caribbean region, with several thousand lives lost in tsunamis and associated earthquakes since the 19th century. Over that period, the coastal population of the Caribbean and the Central West Atlantic region has grown significantly and is still growing. Understanding this hazard is therefore essential for the development of efficient mitigation measures. To this end, we report a regional tsunami exposure assessment based on potential and credible seismic and non-seismic tsunamigenic sources. Regional tsunami databases have been compiled and reviewed, and on this basis five main scenarios have been selected to estimate the exposure. The scenarios comprise two Mw8 earthquake tsunamis (north of Hispaniola and east of Lesser Antilles), two subaerial/submarine volcano flank collapse tsunamis (Montserrat and Saint Lucia), and one tsunami resulting from a landslide on the flanks of the Kick'em Jenny submarine volcano (north of Grenada). Offshore tsunami water surface elevations as well as maximum water level distributions along the shorelines are computed and discussed for each of the scenarios. The number of exposed people has been estimated in each case, together with a summary of the tsunami exposure for the earthquake and the landslide tsunami scenarios. For the earthquake scenarios, the highest tsunami exposure relative to the population is found for Guadeloupe (6.5%) and Antigua (7.5%), while Saint Lucia (4.5%) and Antigua (5%) have been found to have the highest tsunami exposure relative to the population for the landslide scenarios. Such high exposure levels clearly warrant more attention on dedicated mitigation measures in the Caribbean region.

  5. The maximum earthquake in future T years: Checking by a real catalog

    International Nuclear Information System (INIS)

    Pisarenko, V.F.; Rodkin, M.V.

    2015-01-01

    Studies of disaster statistics have been carried out extensively in recent decades; some recent achievements in the field can be found in Pisarenko and Rodkin (2010). An important aspect of seismic risk assessment is the use of historical earthquake catalogs and the combination of historical data with instrumental data, since historical catalogs cover very long time periods and can considerably improve seismic statistics in the higher magnitude domain. We suggest a new statistical technique for this purpose and apply it to two historical Japanese catalogs and the instrumental JMA catalog. The main focus of these approaches is on the occurrence of disasters of extreme size, as the most important ones from a practical point of view. Our method of statistical analysis of the size distribution in the uppermost range of extremely rare events is based on the maximum size Mmax(τ) (e.g. earthquake energy, ground acceleration caused by an earthquake, victims and economic losses from natural catastrophes, etc.) that will occur in a prescribed time interval τ. A new approach to the problem of discrete data, which we call "magnitude spreading", is suggested. This method converts discrete random values into continuous ones by adding small uniformly distributed random components. We analyze this method in detail and apply it to the verification of parameters derived from two historical catalogs: the Usami earthquake catalog (599–1884) and the Utsu catalog (1885–1925). We compare their parameters with those derived from the instrumental JMA catalog (1926–2014). The results of this verification are as follows: the Usami catalog is incompatible with the instrumental one, whereas parameters estimated from the Utsu catalog are statistically compatible in the higher magnitude domain with the sample of Mmax(τ) derived from the JMA catalog
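The "magnitude spreading" idea described above is simple enough to sketch directly; the magnitudes and the 0.1-unit discretization step below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Catalog magnitudes reported on a discrete 0.1-unit grid (illustrative values).
discrete_mags = np.array([5.0, 5.0, 5.1, 5.3, 5.3, 5.3, 5.6])

# "Magnitude spreading": add a small uniform component on (-delta/2, delta/2),
# with delta the discretization step, turning discrete values into continuous ones.
delta = 0.1
spread_mags = discrete_mags + rng.uniform(-delta / 2, delta / 2, size=discrete_mags.size)

# Every spread value stays within half a bin of its reported magnitude.
print(bool(np.abs(spread_mags - discrete_mags).max() < delta / 2))  # True
```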

  6. Prediction of maximum earthquake intensities for the San Francisco Bay region

    Science.gov (United States)

    Borcherdt, Roger D.; Gibbs, James F.

    1975-01-01

    The intensity data for the California earthquake of April 18, 1906, are strongly dependent on distance from the zone of surface faulting and the geological character of the ground. Considering only those sites (approximately one square city block in size) for which there is good evidence for the degree of ascribed intensity, the empirical relation derived between 1906 intensities and distance perpendicular to the fault for 917 sites underlain by rocks of the Franciscan Formation is: Intensity = 2.69 - 1.90 log (Distance) (km). For sites on other geologic units, intensity increments, derived with respect to this empirical relation, correlate strongly with the Average Horizontal Spectral Amplifications (AHSA) determined from 99 three-component recordings of ground motion generated by nuclear explosions in Nevada. The resulting empirical relation is: Intensity Increment = 0.27 + 2.70 log (AHSA), and average intensity increments for the various geologic units are -0.29 for granite, 0.19 for Franciscan Formation, 0.64 for the Great Valley Sequence, 0.82 for Santa Clara Formation, 1.34 for alluvium, and 2.43 for bay mud. The maximum intensity map predicted from these empirical relations delineates areas in the San Francisco Bay region of potentially high intensity from future earthquakes on either the San Andreas fault or the Hayward fault.

  7. Prediction of maximum earthquake intensities for the San Francisco Bay region

    Energy Technology Data Exchange (ETDEWEB)

    Borcherdt, R.D.; Gibbs, J.F.

    1975-01-01

    The intensity data for the California earthquake of Apr 18, 1906, are strongly dependent on distance from the zone of surface faulting and the geological character of the ground. Considering only those sites (approximately one square city block in size) for which there is good evidence for the degree of ascribed intensity, the empirical relation derived between 1906 intensities and distance perpendicular to the fault for 917 sites underlain by rocks of the Franciscan formation is intensity = 2.69 - 1.90 log (distance) (km). For sites on other geologic units, intensity increments, derived with respect to this empirical relation, correlate strongly with the average horizontal spectral amplifications (AHSA) determined from 99 three-component recordings of ground motion generated by nuclear explosions in Nevada. The resulting empirical relation is intensity increment = 0.27 + 2.70 log (AHSA), and average intensity increments for the various geologic units are -0.29 for granite, 0.19 for Franciscan formation, 0.64 for the Great Valley sequence, 0.82 for Santa Clara formation, 1.34 for alluvium, and 2.43 for bay mud. The maximum intensity map predicted from these empirical relations delineates areas in the San Francisco Bay region of potentially high intensity from future earthquakes on either the San Andreas fault or the Hayward fault.
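Taken at face value, the two empirical relations quoted above combine into a simple site-intensity predictor. Base-10 logarithms are assumed here (the abstract does not state the base), and the example distance and AHSA value are hypothetical:

```python
import math

# Empirical relations quoted in the abstract above; log base 10 is assumed.
def base_intensity(distance_km):
    """1906 intensity vs. fault-perpendicular distance, Franciscan Formation sites."""
    return 2.69 - 1.90 * math.log10(distance_km)

def intensity_increment(ahsa):
    """Intensity increment from the Average Horizontal Spectral Amplification."""
    return 0.27 + 2.70 * math.log10(ahsa)

# Example: a site 10 km from the fault on ground with AHSA = 2.
print(round(base_intensity(10.0) + intensity_increment(2.0), 2))  # 1.87
```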

  8. Crustal seismicity and the earthquake catalog maximum moment magnitudes (Mcmax) in stable continental regions (SCRs): correlation with the seismic velocity of the lithosphere

    Science.gov (United States)

    Mooney, Walter D.; Ritsema, Jeroen; Hwang, Yong Keun

    2012-01-01

    A joint analysis of global seismicity and seismic tomography indicates that the seismic potential of continental intraplate regions is correlated with the seismic properties of the lithosphere. Archean and Early Proterozoic cratons with cold, stable continental lithospheric roots have fewer crustal earthquakes and a lower maximum earthquake catalog moment magnitude (Mcmax). The geographic distribution of thick lithospheric roots is inferred from the global seismic model S40RTS that displays shear-velocity perturbations (δVS) relative to the Preliminary Reference Earth Model (PREM). We compare δVS at a depth of 175 km with the locations and moment magnitudes (Mw) of intraplate earthquakes in the crust (Schulte and Mooney, 2005). Many intraplate earthquakes concentrate around the pronounced lateral gradients in lithospheric thickness that surround the cratons and few earthquakes occur within cratonic interiors. Globally, 27% of stable continental lithosphere is underlain by δVS≥3.0%, yet only 6.5% of crustal earthquakes with Mw>4.5 occur above these regions with thick lithosphere. No earthquakes in our catalog with Mw>6 have occurred above mantle lithosphere with δVS>3.5%, although such lithosphere comprises 19% of stable continental regions. Thus, for cratonic interiors with seismically determined thick lithosphere (1) there is a significant decrease in the number of crustal earthquakes, and (2) the maximum moment magnitude found in the earthquake catalog is Mcmax=6.0. We attribute these observations to higher lithospheric strength beneath cratonic interiors due to lower temperatures and dehydration in both the lower crust and the highly depleted lithospheric root.

  9. Crustal seismicity and the earthquake catalog maximum moment magnitude (Mcmax) in stable continental regions (SCRs): Correlation with the seismic velocity of the lithosphere

    Science.gov (United States)

    Mooney, Walter D.; Ritsema, Jeroen; Hwang, Yong Keun

    2012-12-01

    A joint analysis of global seismicity and seismic tomography indicates that the seismic potential of continental intraplate regions is correlated with the seismic properties of the lithosphere. Archean and Early Proterozoic cratons with cold, stable continental lithospheric roots have fewer crustal earthquakes and a lower maximum earthquake catalog moment magnitude (Mcmax). The geographic distribution of thick lithospheric roots is inferred from the global seismic model S40RTS that displays shear-velocity perturbations (δVS) relative to the Preliminary Reference Earth Model (PREM). We compare δVS at a depth of 175 km with the locations and moment magnitudes (Mw) of intraplate earthquakes in the crust (Schulte and Mooney, 2005). Many intraplate earthquakes concentrate around the pronounced lateral gradients in lithospheric thickness that surround the cratons and few earthquakes occur within cratonic interiors. Globally, 27% of stable continental lithosphere is underlain by δVS≥3.0%, yet only 6.5% of crustal earthquakes with Mw>4.5 occur above these regions with thick lithosphere. No earthquakes in our catalog with Mw>6 have occurred above mantle lithosphere with δVS>3.5%, although such lithosphere comprises 19% of stable continental regions. Thus, for cratonic interiors with seismically determined thick lithosphere (1) there is a significant decrease in the number of crustal earthquakes, and (2) the maximum moment magnitude found in the earthquake catalog is Mcmax=6.0. We attribute these observations to higher lithospheric strength beneath cratonic interiors due to lower temperatures and dehydration in both the lower crust and the highly depleted lithospheric root.

  10. Extreme value distribution of earthquake magnitude

    Science.gov (United States)

    Zi, Jun Gan; Tung, C. C.

    1983-07-01

    Probability distribution of maximum earthquake magnitude is first derived for an unspecified probability distribution of earthquake magnitude. A model for energy release of large earthquakes, similar to that of Adler-Lomnitz and Lomnitz, is introduced from which the probability distribution of earthquake magnitude is obtained. An extensive set of world data for shallow earthquakes, covering the period from 1904 to 1980, is used to determine the parameters of the probability distribution of maximum earthquake magnitude. Because of the special form of probability distribution of earthquake magnitude, a simple iterative scheme is devised to facilitate the estimation of these parameters by the method of least-squares. The agreement between the empirical and derived probability distributions of maximum earthquake magnitude is excellent.
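The first step quoted above, passing from the distribution of single-event magnitudes to that of the maximum, can be illustrated generically (this is not the paper's specific energy-release model; the Gutenberg-Richter form and parameter values are assumptions):

```python
import math

# If individual magnitudes follow a Gutenberg-Richter (exponential) law with
# CDF F(m) = 1 - exp(-beta * (m - m0)) for m >= m0, the maximum of n
# independent events has CDF F(m)**n, from which any quantile follows.
beta, m0, n = 2.3, 4.0, 100  # illustrative parameter values

def cdf_single(m):
    return 1.0 - math.exp(-beta * (m - m0))

def cdf_max(m):
    return cdf_single(m) ** n

# Median of the maximum magnitude over n events, found by bisection.
lo, hi = m0, 10.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if cdf_max(mid) < 0.5:
        lo = mid
    else:
        hi = mid
print(round(lo, 2))  # 6.16
```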

  11. Deterministic Earthquake Hazard Assessment by Public Agencies in California

    Science.gov (United States)

    Mualchin, L.

    2005-12-01

    Even in its short recorded history, California has experienced a number of damaging earthquakes that have resulted in new codes and other legislation for public safety. In particular, the 1971 San Fernando earthquake produced some of the most lasting results such as the Hospital Safety Act, the Strong Motion Instrumentation Program, the Alquist-Priolo Special Studies Zone Act, and the California Department of Transportation (Caltrans) fault-based deterministic seismic hazard (DSH) map. The latter product provides values for earthquake ground motions based on Maximum Credible Earthquakes (MCEs), defined as the largest earthquakes that can reasonably be expected on faults in the current tectonic regime. For surface fault rupture displacement hazards, detailed studies of the same faults apply. Originally, hospital, dam, and other critical facilities used seismic design criteria based on deterministic seismic hazard analyses (DSHA). However, probabilistic methods grew and took hold by introducing earthquake design criteria based on time factors and quantifying "uncertainties" by procedures such as logic trees. These probabilistic seismic hazard analyses (PSHA) ignored the DSH approach, and some agencies were influenced to adopt only the PSHA method. However, deficiencies in the PSHA method are becoming recognized, and its use is now the focus of strong debate. Caltrans is in the process of producing the fourth edition of its DSH map. The reason for preferring the DSH method is that Caltrans believes it is more realistic than the probabilistic method for assessing earthquake hazards that may affect critical facilities, and is the best available method for ensuring public safety. Its time-invariant values help to produce robust design criteria that are soundly based on physical evidence, and it is the method that leaves the least opportunity for unwelcome surprises.

  12. Improving Bayesian credibility intervals for classifier error rates using maximum entropy empirical priors.

    Science.gov (United States)

    Gustafsson, Mats G; Wallman, Mikael; Wickenberg Bolin, Ulrika; Göransson, Hanna; Fryknäs, M; Andersson, Claes R; Isaksson, Anders

    2010-06-01

    Successful use of classifiers that learn to make decisions from a set of patient examples requires robust methods for performance estimation. Recently, many promising approaches for determining an upper bound for the error rate of a single classifier have been reported, but the Bayesian credibility interval (CI) obtained from a conventional holdout test still delivers one of the tightest bounds. The conventional Bayesian CI becomes unacceptably large in real-world applications where the test set sizes are less than a few hundred. The source of this problem is the fact that the CI is determined exclusively by the result on the test examples; in other words, no information at all is provided by the uniform prior density distribution employed, which reflects a complete lack of prior knowledge about the unknown error rate. Therefore, the aim of the study reported here was to investigate a maximum entropy (ME) based approach to improved prior knowledge and Bayesian CIs, demonstrating its relevance for biomedical research and clinical practice. It is demonstrated how a refined non-uniform prior density distribution can be obtained by means of the ME principle using empirical results from a few designs and tests on non-overlapping sets of examples. Experimental results show that ME-based priors improve the CIs when applied to four quite different simulated and two real-world data sets. An empirically derived ME prior seems promising for improving the Bayesian CI for the unknown error rate of a designed classifier. Copyright 2010 Elsevier B.V. All rights reserved.
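The conventional baseline the abstract starts from, a Bayesian CI for the error rate under a uniform prior, can be sketched as follows; the holdout counts are illustrative, and quantiles are estimated by Monte Carlo sampling to stay dependency-free:

```python
import random

random.seed(0)

# Conventional Bayesian credibility interval (CI) for a classifier error rate:
# with a uniform Beta(1, 1) prior and k errors observed on n holdout examples,
# the posterior over the unknown error rate is Beta(k + 1, n - k + 1).
k, n = 5, 50  # illustrative holdout result: 5 errors in 50 test examples
samples = sorted(random.betavariate(k + 1, n - k + 1) for _ in range(100_000))
lower, upper = samples[2_500], samples[97_500]  # central 95% interval
print(f"95% CI for the error rate: [{lower:.3f}, {upper:.3f}]")
```

The paper's contribution is to replace the uniform Beta(1, 1) prior with a non-uniform, maximum-entropy prior fitted to results from earlier designs, which tightens this interval; those prior parameters are not reproduced here.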

  13. The guideline and practical procedures for earthquake-resistant design of nuclear power plants in Japan

    International Nuclear Information System (INIS)

    Watabe, M.

    1985-01-01

    The Guideline for the aseismic design of nuclear reactor facilities, revised in 1981, is introduced. The basic philosophy entails structural integrity against a major earthquake, a rigid structure for less deformation, and foundation on rock. The classification of facilities is then explained, and some practical examples are tabulated. In the light of the above classifications, evaluation procedures for aseismic design are defined. Design basis earthquake ground motions, S1 and S2, are defined: S1 is the maximum possible earthquake ground motion, while S2 is the maximum credible one. The relation between active faults and the S1 and S2 motions is explained; seismic forces induced by S1 and S2 are expressed in terms of response spectra. Static seismic coefficient procedures are also applied to evaluate seismic forces, as a minimum guideline based on dynamic analysis. Combinations of seismic forces and allowable limits are then explained. In the second part of the paper, seismic analysis of reactor buildings, as part of design practice, is outlined. There are three key points in practical aseismic design. The first is the input design earthquake motions, in which soil/foundation interaction problems are also included. In practice, ground motions at the free-field rock surface have to be convoluted or deconvoluted to obtain base rock motions, which are applied to estimate input design earthquake motions by way of finite element analysis or a lumped-mass lattice model. Also introduced is dynamic modelling of the reactor building, with its non-linear behaviour represented by plastic deformation of reinforced concrete members as well as by uplift characteristics of foundations. Then an evaluation of aseismic safety is introduced. (author)

  14. Maximum Credible Event Analysis Methods-Tools and Applications in Biosecurity Programs

    International Nuclear Information System (INIS)

    Rao, V.

    2007-01-01

    Maximum Credible Event (MCE) analyses are analogous to worst-case scenarios: they involve a likely mishap scenario in biotechnology bioprocessing operations, biological products testing laboratories, or biological specimen repository facilities leading to the release of particulate/aerosolized etiologic agents into the environment. The purpose of MCE analyses is to estimate the effectiveness of existing safeguards such as the engineering controls, administrative procedures, and attributes of facility design that, in combination, reduce the probability of release of potentially pathogenic or toxic material from the test facility to the external environment. As part of our support to the United States Chemical Biological Defense Program, we have developed a unique set of realistic MCE worst-case scenarios for all laboratory and industrial aspects of a biological product development process. Although MCE analysis is part of an overall facility biosafety assessment, our approach also considered biosecurity-related issues such as facility vulnerability, employment procedures and worker background investigations, exercises and drills involving local law enforcement and the emergency response community, the records and audit process, and facility biosafety and biosecurity oversight and governance. Our standard operating procedure for tracking biological material transfer agreements, our operating procedures for materials transfer, and an integrated checklist for biosafety/biosecurity facility inspection and evaluation were used to ensure compliance with all biosafety and biosecurity guidelines. The result of an MCE analysis, described in terms of the potential hazard of exposure of workers and the immediate environment to etiologic agents from the manufacturing process, is a quasi-quantitative estimate of the nature and extent of the adverse impact on health and the immediate environment in the vicinity. 
Etiologic agent exposure concentrations are estimated based on a Gaussian air dispersion
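A Gaussian air dispersion estimate of the kind mentioned above can be illustrated with the textbook Gaussian plume formula; this is a generic stand-in, not the report's actual model, and all parameter values are hypothetical:

```python
import math

# Steady-state Gaussian plume: concentration of an aerosolized agent released
# at rate q from effective height h, with wind speed u and lateral/vertical
# dispersion parameters sigma_y, sigma_z; the second vertical term accounts
# for reflection of the plume off the ground.
def plume_concentration(q, u, h, y, z, sigma_y, sigma_z):
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2))
                + math.exp(-(z + h)**2 / (2 * sigma_z**2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Illustrative values: 1 g/s release, 3 m/s wind, 10 m effective height,
# receptor on the plume centerline at ground level.
c = plume_concentration(q=1.0, u=3.0, h=10.0, y=0.0, z=0.0,
                        sigma_y=20.0, sigma_z=10.0)
print(f"{c:.2e} g/m^3")
```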

  15. Seismic Stability of St. Stephen Hydropower Plant, South Carolina

    National Research Council Canada - National Science Library

    Ebeling, Robert M; Hall, Robert L; Strom, Ralph W; Yule, Donald E; Chowdhury, Mostafiz

    2006-01-01

    .... Two site-specific design response spectra were used to evaluate the structure. These included a 2,475-year probabilistic earthquake event and a deterministic Maximum Credible Earthquake plus one standard deviation event. The St...

  16. A three-step Maximum-A-Posterior probability method for InSAR data inversion of coseismic rupture with application to four recent large earthquakes in Asia

    Science.gov (United States)

    Sun, J.; Shen, Z.; Burgmann, R.; Liang, F.

    2012-12-01

    We develop a three-step Maximum-A-Posterior probability (MAP) method for coseismic rupture inversion, which aims at maximizing the a posteriori probability density function (PDF) of elastic solutions of earthquake rupture. The method originates from the Fully Bayesian Inversion (FBI) and the mixed linear-nonlinear Bayesian inversion (MBI) methods, shares the same a posteriori PDF with them, and keeps most of their merits, while overcoming their convergence difficulty when large numbers of low-quality data are used and greatly improving the convergence rate through optimization procedures. A highly efficient global optimization algorithm, Adaptive Simulated Annealing (ASA), is used to search for the maximum posterior probability in the first step. The non-slip parameters are determined by the global optimization method, and the slip parameters are inverted for using the least squares method, initially without a positivity constraint and then damped to a physically reasonable range. This first-step MAP inversion brings the inversion close to the 'true' solution quickly and jumps over local maximum regions in the high-dimensional parameter space. The second-step inversion approaches the 'true' solution further, with positivity constraints subsequently applied on the slip parameters using the Monte Carlo Inversion (MCI) technique, with all parameters obtained from step one as the initial solution. Then the slip artifacts are eliminated from the slip models in the third-step MAP inversion with the fault geometry parameters fixed. We first used a designed model with a 45-degree dipping angle and oblique slip, and corresponding synthetic InSAR data sets, to validate the efficiency and accuracy of the method. We then applied the method to four recent large earthquakes in Asia, namely the 2010 Yushu, China earthquake, the 2011 Burma earthquake, the 2011 New Zealand earthquake and the 2008 Qinghai, China earthquake, and compared our results with those from other groups. Our results show the effectiveness of
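A toy analogue of the first step can be sketched without assuming anything about the authors' actual parameterisation: a hand-rolled annealing loop stands in for ASA over one nonlinear "geometry" parameter, with ordinary least squares (clipped for positivity) solving the linear "slip" part at each candidate. The forward model and all values are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical forward model G(theta): two basis functions whose shape
# depends on a nonlinear geometry parameter theta.
x = np.linspace(0.0, 1.0, 50)
true_theta = 4.0
true_slip = np.array([1.5, 0.8])

def design(theta):
    return np.column_stack([np.sin(theta * x), np.cos(theta * x)])

data = design(true_theta) @ true_slip  # noise-free synthetic "InSAR" data

def misfit(theta):
    g = design(theta)
    slip, *_ = np.linalg.lstsq(g, data, rcond=None)  # unconstrained LSQ
    slip = np.clip(slip, 0.0, None)                  # damp to positive range
    return float(np.sum((g @ slip - data) ** 2)), slip

# Simple simulated-annealing loop over theta (standing in for ASA).
theta = 1.0
cur_f, cur_slip = misfit(theta)
best_theta, best_f, best_slip = theta, cur_f, cur_slip
temp = 1.0
for _ in range(4000):
    cand = theta + rng.normal(0.0, 0.3)
    f, s = misfit(cand)
    if f < cur_f or rng.random() < np.exp(-(f - cur_f) / temp):
        theta, cur_f = cand, f
        if f < best_f:
            best_theta, best_f, best_slip = cand, f, s
    temp *= 0.998
print(round(best_theta, 2), np.round(best_slip, 2))
```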

  17. Extreme value statistics and thermodynamics of earthquakes. Large earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lavenda, B. [Camerino Univ., Camerino, MC (Italy); Cipollone, E. [ENEA, Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). National Centre for Research on Thermodynamics

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Frechet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into non-scaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Frechet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. A numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same catalogue of Chinese earthquakes. An analogy is drawn between large earthquakes and high-energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Frechet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
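The energy-magnitude transformation mentioned above, Frechet in energy becoming Gumbel in magnitude, follows from magnitude being a logarithm of energy; this can be checked numerically (the shape and scale values are illustrative, not from the paper):

```python
import math
import random

random.seed(42)

# If maximum energies E follow a Frechet distribution with CDF
#   P(E <= e) = exp(-(e / s)**(-a)),
# then M = log10(E) follows a Gumbel distribution, since
#   P(M <= m) = P(E <= 10**m) = exp(-exp(-a * ln(10) * (m - log10(s)))).
a, s = 1.2, 1e12  # illustrative shape and scale parameters

def frechet_sample():
    u = random.random()
    return s * (-math.log(u)) ** (-1.0 / a)  # inverse-CDF sampling

mags = [math.log10(frechet_sample()) for _ in range(50_000)]

# Compare the empirical CDF of the magnitudes to the implied Gumbel CDF.
m = 12.5
empirical = sum(x <= m for x in mags) / len(mags)
theoretical = math.exp(-math.exp(-a * math.log(10) * (m - math.log10(s))))
print(abs(empirical - theoretical) < 0.01)  # True
```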

  18. Safety analysis of RA reactor operation, I-III, Part III - Environmental effect of the maximum credible accident; Analiza sigurnosti rada reaktora RA - I-III, III deo - Posledica maksimalno moguceg akcidenta na okolinu reaktora

    Energy Technology Data Exchange (ETDEWEB)

    Raisic, N [Institute of Nuclear Sciences Boris Kidric, Vinca, Beograd (Serbia and Montenegro)

    1963-02-15

    A maximum credible accident at the RA reactor would involve the release of fission products into the environment, resulting from fuel element failure or meltdown due to loss of coolant. The analysis presented in this report assumes that the reactor was operating at nominal power at the moment of the maximum possible accident. The report includes calculations of fission product activity at the moment of the accident, the total activity released during the accident, and the concentration of radioactive material in the air in the reactor's neighbourhood, together with an analysis of the accident's environmental effects.

  19. Comparison of credible patients of very low intelligence and non-credible patients on neurocognitive performance validity indicators.

    Science.gov (United States)

    Smith, Klayton; Boone, Kyle; Victor, Tara; Miora, Deborah; Cottingham, Maria; Ziegler, Elizabeth; Zeller, Michelle; Wright, Matthew

    2014-01-01

    The purpose of this archival study was to identify performance validity tests (PVTs) and standard IQ and neurocognitive test scores, which singly or in combination, differentiate credible patients of low IQ (FSIQ ≤ 75; n = 55) from non-credible patients. We compared the credible participants against a sample of 74 non-credible patients who appeared to have been attempting to feign low intelligence specifically (FSIQ ≤ 75), as well as a larger non-credible sample (n = 383) unselected for IQ. The entire non-credible group scored significantly higher than the credible participants on measures of verbal crystallized intelligence/semantic memory and manipulation of overlearned information, while the credible group performed significantly better on many processing speed and memory tests. Additionally, credible women showed faster finger-tapping speeds than non-credible women. The credible group also scored significantly higher than the non-credible subgroup with low IQ scores on measures of attention, visual perceptual/spatial tasks, processing speed, verbal learning/list learning, and visual memory, and credible women continued to outperform non-credible women on finger tapping. When cut-offs were selected to maintain approximately 90% specificity in the credible group, sensitivity rates were highest for verbal and visual memory measures (i.e., TOMM trials 1 and 2; Warrington Words correct and time; Rey Word Recognition Test total; RAVLT Effort Equation, Trial 5, total across learning trials, short delay, recognition, and RAVLT/RO discriminant function; and Digit Symbol recognition), followed by select attentional PVT scores (i.e., b Test omissions and time to recite four digits forward). When failure rates were tabulated across seven most sensitive scores, a cut-off of ≥ 2 failures was associated with 85.4% specificity and 85.7% sensitivity, while a cut-off of ≥ 3 failures resulted in 95.1% specificity and 66.0% sensitivity. Results are discussed in light of

  20. A Credibility-Based Chance-Constrained Transfer Point Location Model for the Relief Logistics Design (Case Study: Earthquake Disaster on Region 1 of Tehran City)

    Directory of Open Access Journals (Sweden)

    Ahmad Mohamadi

    2015-02-01

    The occurrence of natural disasters inflicts irreparable injuries on humans. In such conditions, affected people await medical services and relief commodities; thus, quick provision of medical services and supply of relief commodities play important roles in improving natural disaster management. In this paper, a multi-objective non-linear credibility-based fuzzy mathematical programming model under uncertainty is presented, which addresses two vital needs in disaster time, medical services and relief commodities, through the location of hospitals and transfer points and the location-routing of relief depots. The proposed model approaches reality by considering time, cost, failure probability on routes, and parameter uncertainty. The problem is first linearized, and then the global criterion method is applied to solve the multi-objective model. Moreover, to illustrate the model's efficiency, a case study is performed on Region 1 of Tehran city for an earthquake disaster. Results demonstrate that if decision-makers want to face uncertainty with lower risk, they have to choose a high minimum constraint feasibility degree, even though the objective function value will worsen.

  1. Measuring Credibility Perceptions in CSR Communication: A Scale Development to Test Readers’ Perceived Credibility of CSR Reports

    Science.gov (United States)

    Lock, Irina; Seele, Peter

    2017-01-01

    Credibility is central to communication but often jeopardized by “credibility gaps.” This is especially true for communication about corporate social responsibility (CSR). To date, no tool has been available to analyze stakeholders’ credibility perceptions of CSR communication. This article presents a series of studies conducted to develop a scale to assess the perceived credibility of CSR reports, one of CSR communication’s most important tools. The scale provides a novel operationalization of credibility using validity claims of Habermas’s ideal speech situation as subdimensions. The scale development process, carried out in five studies including a literature review, a Delphi study, and three validation studies applying confirmatory factor analysis, resulted in the 16-item Perceived Credibility (PERCRED) scale. The scale shows convergent, discriminant, concurrent, and nomological validity and is the first validated measure for analyzing credibility perceptions of CSR reports. PMID:29278260

  2. Measuring Credibility Perceptions in CSR Communication: A Scale Development to Test Readers' Perceived Credibility of CSR Reports.

    Science.gov (United States)

    Lock, Irina; Seele, Peter

    2017-11-01

    Credibility is central to communication but often jeopardized by "credibility gaps." This is especially true for communication about corporate social responsibility (CSR). To date, no tool has been available to analyze stakeholders' credibility perceptions of CSR communication. This article presents a series of studies conducted to develop a scale to assess the perceived credibility of CSR reports, one of CSR communication's most important tools. The scale provides a novel operationalization of credibility using validity claims of Habermas's ideal speech situation as subdimensions. The scale development process, carried out in five studies including a literature review, a Delphi study, and three validation studies applying confirmatory factor analysis, resulted in the 16-item Perceived Credibility (PERCRED) scale. The scale shows convergent, discriminant, concurrent, and nomological validity and is the first validated measure for analyzing credibility perceptions of CSR reports.

  3. Calibration of Crustal Historical Earthquakes from Intra-Carpathian Region of Romania

    Science.gov (United States)

    Oros, Eugen; Popa, Mihaela; Rogozea, Maria

    2017-12-01

    The main task of this study is to elaborate a set of mutual conversion relations between macroseismic intensity and magnitude, needed to calibrate the historical crustal earthquakes of the Intra-Carpathian region of Romania, as a prerequisite for homogenizing the parametric catalogue of Romanian earthquakes. To this end, we selected a set of earthquakes for which we have quality macroseismic data and instrumentally determined moment magnitudes Mw. These seismic events were used to determine relations between Mw and the maximum/epicentral intensity and the isoseismal surface areas for I=3, I=4 and I=5: Mw = f(Imax/Io), Mw = f(Imax/Io, h), Mw = f(A3, A4, A5). We investigated several variants of such relationships and combinations, taking into account that the macroseismic data needed for re-evaluating historical earthquakes in the investigated region are available in several forms. Revision of the initial historical data yielded various kinds of information: 1) intensity data points (IDPs), assimilated or not with the epicentral intensity after analysing their correlation with recent seismicity data and/or active tectonics/seismotectonics; 2) sets of intensities obtained in several localities (IDPs) with variable values whose maxima can be considered equal to the epicentral intensity (Io); 3) sets of intensities obtained in several localities (IDPs) but without obvious maximum values assimilable with the epicentral intensity; 4) maps with isoseismals; 5) information on the areas in which the investigated earthquake was felt, the area of perceptibility (e.g. I = 3 EMS during the day and I = 4 EMS at night), or the surfaces corresponding to a well-defined intensity level. The obtained relationships were validated using a set of earthquakes with instrumental source parameters (localization, depth, Mw). These relationships lead to redundant results meaningful in
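    As a rough illustration of how conversion relations of the form Mw = f(Io, h) can be calibrated, the sketch below fits a linear model by ordinary least squares. The calibration events, the fitted coefficients, and the chosen functional form are hypothetical stand-ins for illustration, not values from the study.

```python
import numpy as np

# Hypothetical calibration events (NOT values from the study): epicentral
# intensity Io (EMS), focal depth h (km), instrumental moment magnitude Mw.
Io = np.array([5.0, 6.0, 6.5, 7.0, 7.5, 8.0])
h = np.array([10.0, 8.0, 12.0, 15.0, 10.0, 20.0])
Mw = np.array([4.1, 4.8, 5.2, 5.6, 5.9, 6.5])

# Ordinary least squares for the assumed form Mw = a + b*Io + c*log10(h).
A = np.column_stack([np.ones_like(Io), Io, np.log10(h)])
(a, b, c), *_ = np.linalg.lstsq(A, Mw, rcond=None)

def mw_from_intensity(io, depth_km):
    """Predict Mw from epicentral intensity and focal depth with the fit."""
    return a + b * io + c * np.log10(depth_km)

print(mw_from_intensity(7.0, 12.0))
```

    In practice each functional variant (with or without depth, or using isoseismal areas A3, A4, A5) would be fitted and validated against the independent instrumental events, as the abstract describes.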

  4. Credibility and advocacy in conservation science

    Science.gov (United States)

    Horton, Cristi C.; Peterson, Tarla Rai; Banerjee, Paulami

    2015-01-01

    Abstract Conservation policy sits at the nexus of natural science and politics. On the one hand, conservation scientists strive to maintain scientific credibility by emphasizing that their research findings are the result of disinterested observations of reality. On the other hand, conservation scientists are committed to conservation even if they do not advocate a particular policy. The professional conservation literature offers guidance on negotiating the relationship between scientific objectivity and political advocacy without damaging conservation science's credibility. The value of this guidance, however, may be restricted by limited recognition of credibility's multidimensionality and emergent nature: it emerges through perceptions of expertise, goodwill, and trustworthiness. We used content analysis of the literature to determine how credibility is framed in conservation science as it relates to apparent contradictions between science and advocacy. Credibility typically was framed as a static entity lacking dimensionality. Authors identified expertise or trustworthiness as important, but rarely mentioned goodwill. They usually did not identify expertise, goodwill, or trustworthiness as dimensions of credibility or recognize interactions among these 3 dimensions of credibility. This oversimplification may limit the ability of conservation scientists to contribute to biodiversity conservation. Accounting for the emergent quality and multidimensionality of credibility should enable conservation scientists to advance biodiversity conservation more effectively. PMID:26041036

  5. Extreme value statistics and thermodynamics of earthquakes: large earthquakes

    Directory of Open Access Journals (Sweden)

    B. H. Lavenda

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter that can discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Fréchet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into nonscaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. A numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same Catalogue of Chinese Earthquakes. An analogy is drawn between large earthquakes and high-energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which has no right endpoint, show why there cannot be a maximum earthquake energy.
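    The energy-to-magnitude mapping described above implies Gumbel-distributed maximum magnitudes. A minimal numpy sketch, with invented sample parameters, fits a Gumbel distribution by the method of moments and reads off a 100-year return level:

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic "annual maximum magnitude" sample (illustrative only): the
# energy-magnitude transformation maps a Pareto/Frechet tail in energy
# onto a Gumbel distribution in magnitude, as the abstract notes.
annual_max = rng.gumbel(loc=6.0, scale=0.4, size=200)

# Method-of-moments Gumbel fit: scale from the standard deviation,
# location from the mean (0.5772 is the Euler-Mascheroni constant).
scale = annual_max.std() * np.sqrt(6) / np.pi
loc = annual_max.mean() - 0.5772 * scale

# Return level: the magnitude exceeded on average once every 100 years.
m100 = loc - scale * np.log(-np.log(1 - 1 / 100))
print(loc, scale, m100)
```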

  6. Credibility enacted : Understanding the meaning of credible political leadership in the Dutch parliamentary election campaign of 2010

    NARCIS (Netherlands)

    van Zuydam, Sabine; Hendriks, Frank

    2018-01-01

    In times of perception politics, the credibility of electoral candidates is a crucial asset in political marketing. This raises the question to which political leaders citizens attribute credibility and how political credibility is gained and lost through media performance. We analyze and compare

  7. Credibility enacted : Understanding the meaning of credible political leadership in the Dutch parliamentary election campaign of 2010

    NARCIS (Netherlands)

    van Zuydam, Sabine; Hendriks, Frank

    2015-01-01

    In times of perception politics, the credibility of electoral candidates is a crucial asset in political marketing. This raises the question to which political leaders citizens attribute credibility and how political credibility is gained and lost through media performance. We analyze and compare

  8. Near-real-time and scenario earthquake loss estimates for Mexico

    Science.gov (United States)

    Wyss, M.; Zuñiga, R.

    2017-12-01

    The large earthquakes of 8 September 2017 (M8.1) and 19 September 2017 (M7.1) have focused attention on the dangers of Mexican seismicity. The near-real-time alerts by QLARM estimated 10 to 300 fatalities and 0 to 200 fatalities, respectively; at the time of this submission, the reported death tolls are 96 and 226. These alerts were issued within 96 and 57 minutes of the occurrence times. For the M8.1 earthquake, the losses could be calculated with a line model: a line of length L=110 km extending from the initial epicenter to the NE, where the USGS had reported aftershocks. On 19 September, no aftershocks were available in near-real-time, so a point source had to be used for the quick calculation of likely casualties. In both cases, the casualties were at least an order of magnitude smaller than they could have been, because on 8 September the source was relatively far offshore and on 19 September the hypocenter was relatively deep. The largest historic earthquake in Mexico occurred on 28 March 1787, with a likely rupture length of 450 km and M8.6. Based on this event, and after verifying our tool for Mexico, we estimated the order of magnitude of a disaster in a maximum credible earthquake along the Pacific coast, given the current population. In the countryside along the coast we expect approximately 27,000 fatalities and 480,000 injured. In the special case of Mexico City, the casualties in a worst possible earthquake along the Pacific plate boundary would likely be counted in five-digit numbers. The large agglomeration of the capital, with its lake-bed soil, attracts most attention. Nevertheless, one should pay attention to the fact that the poor, rural segment of society, living in buildings with weak resistance to shaking, is likely to sustain a mortality rate about 20% higher than that of populations in cities on average soil.

  9. Maximum Credible Incidents

    CERN Document Server

    Strait, J

    2009-01-01

    Following the incident in sector 34, considerable effort has been made to improve the systems for detecting similar faults and to improve the safety systems to limit the damage if a similar incident should occur. Nevertheless, even after the consolidation and repairs are completed, other faults may still occur in the superconducting magnet systems, which could result in damage to the LHC. Such faults include both direct failures of a particular component or system, or an incorrect response to a “normal” upset condition, for example a quench. I will review a range of faults which could be reasonably expected to occur in the superconducting magnet systems, and which could result in substantial damage and down-time to the LHC. I will evaluate the probability and the consequences of such faults, and suggest what mitigations, if any, are possible to protect against each.

  10. Deterministic earthquake scenarios for the city of Sofia

    CERN Document Server

    Slavov, S I; Panza, G F; Paskaleva, I; Vaccari, P

    2002-01-01

    The city of Sofia is exposed to high seismic risk: macroseismic intensities in the range VIII-X (MSK) can be expected in the city. The earthquakes that can influence the hazard at Sofia either originate beneath the city or are caused by seismic sources located within a radius of 40 km. Sofia is also exposed to the remote Vrancea seismic zone in Romania, to which the long-period elements of the built environment are particularly vulnerable. The high seismic risk and the lack of instrumental recordings of the regional seismicity make it necessary to use credible earthquake scenarios and ground-motion modelling approaches to define the seismic input for the city of Sofia. Complete synthetic seismic signals, due to several earthquake scenarios, were computed along chosen geological profiles crossing the city, applying a hybrid technique based on the modal summation technique and finite differences. The modelling takes into account simultaneously the geotechnical properties of the si...

  11. Credibility Discourse of PR Agencies

    DEFF Research Database (Denmark)

    Isaksson, Maria; Jørgensen, Poul Erik Flyvholm

    2008-01-01

    to giving assurance of their expertise, trustworthiness and empathy, thus confirming our overall expectation that corporate credibility discourse is relatively uniform from a European perspective. However, contrary to our assumptions, the results of our study show that PR credibility discourse demonstrates...

  12. Factors Contributing to the Catastrophe in Mexico City During the Earthquake of September 19, 1985

    OpenAIRE

    Beck, James L.; Hall, John F.

    1986-01-01

    The extensive damage to high‐rise buildings in Mexico City during the September 19, 1985 earthquake is primarily due to the intensity of the ground shaking exceeding what was previously considered credible for the city by Mexican engineers. There were two major factors contributing to the catastrophe, resonance in the sediments of an ancient lake that once existed in the Valley of Mexico, and the long duration of shaking compared with other coastal earthquakes in the last 50 years. Both of th...

  13. Earthquake potential revealed by tidal influence on earthquake size-frequency statistics

    Science.gov (United States)

    Ide, Satoshi; Yabe, Suguru; Tanaka, Yoshiyuki

    2016-11-01

    The possibility that tidal stress can trigger earthquakes has long been debated. In particular, a clear causal relationship between small earthquakes and the phase of tidal stress is elusive. However, tectonic tremors deep within subduction zones are highly sensitive to tidal stress levels, with the tremor rate increasing exponentially with rising tidal stress. Thus, slow deformation, and the possibility of earthquakes at subduction plate boundaries, may be enhanced during periods of large tidal stress. Here we calculate the tidal stress history, and specifically the amplitude of tidal stress, on the fault plane in the two weeks before large earthquakes globally, based on data from the global, Japanese, and Californian earthquake catalogues. We find that very large earthquakes, including the 2004 Sumatra earthquake, the 2010 Maule earthquake in Chile, and the 2011 Tohoku-Oki earthquake in Japan, tend to occur near the time of maximum tidal stress amplitude. This tendency is not obvious for small earthquakes. However, we also find that the fraction of large earthquakes increases (the b-value of the Gutenberg-Richter relation decreases) as the amplitude of tidal shear stress increases. This relationship is reasonable, considering the well-known dependence of the b-value on stress. It suggests that the probability of a tiny rock failure expanding into a gigantic rupture increases with increasing tidal stress levels. We conclude that large earthquakes are more probable during periods of high tidal stress.
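    The b-value comparison described above presupposes estimating b from catalogue subsets. A common choice is Aki's maximum-likelihood estimator; the sketch below applies it to a synthetic Gutenberg-Richter catalogue (the catalogue and completeness magnitude are invented for illustration):

```python
import numpy as np

def b_value_aki(mags, m_c):
    """Aki (1965) maximum-likelihood b-value for magnitudes above the
    completeness magnitude m_c (for binned catalogues, m_c should be
    replaced by m_c - dm/2, where dm is the bin width)."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - m_c)

# Synthetic Gutenberg-Richter catalogue with true b = 1.0 (illustrative):
# magnitudes above m_c are exponentially distributed with scale log10(e)/b.
rng = np.random.default_rng(0)
mags = 4.0 + rng.exponential(scale=np.log10(np.e) / 1.0, size=5000)

print(b_value_aki(mags, m_c=4.0))
```

    Splitting a real catalogue by tidal shear stress amplitude and comparing the two b estimates would mirror the comparison made in the abstract.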

  14. "Would a madman have been so wise as this?" The effects of source credibility and message credibility on validation.

    Science.gov (United States)

    Foy, Jeffrey E; LoCasto, Paul C; Briner, Stephen W; Dyar, Samantha

    2017-02-01

    Readers rapidly check new information against prior knowledge during validation, but research is inconsistent as to whether source credibility affects validation. We argue that readers are likely to accept highly plausible assertions regardless of source, but that high source credibility may boost acceptance of claims that are less plausible based on general world knowledge. In Experiment 1, participants read narratives with assertions for which the plausibility varied depending on the source. For high credibility sources, we found that readers were faster to read information confirming these assertions relative to contradictory information. We found the opposite patterns for low credibility characters. In Experiment 2, readers read claims from the same high or low credibility sources, but the claims were always plausible based on general world knowledge. Readers consistently took longer to read contradictory information, regardless of source. In Experiment 3, participants read modified versions of "The Tell-Tale Heart," which was narrated entirely by an unreliable source. We manipulated the plausibility of a target event, as well as whether high credibility characters within the story provided confirmatory or contradictory information about the narrator's description of the target event. Though readers rated the narrator as being insane, they were more likely to believe the narrator's assertions about the target event when it was plausible and corroborated by other characters. We argue that sourcing research would benefit from focusing on the relationship between source credibility, message credibility, and multiple sources within a text.

  15. The dual impact of "appeal" and "researcher credibility" on mail survey response rate in the context of preventive health care.

    Science.gov (United States)

    Angur, M G; Nataraajan, R; Chawla, S K

    1994-01-01

    Health and fitness centers are becoming increasingly aware of their importance in the realm of preventive health care. Many hospitals have begun to open and run fitness centers, a trend that seems likely to continue. In a competitive environment, every center wants to obtain the maximum amount of valid customer information at minimum cost, and this paper addresses that issue. The authors investigate the combined effect of appeal and researcher credibility on mail questionnaire response rates among the metropolitan membership of a large fitness center. A personal appeal with high researcher credibility was found to generate a significantly higher response rate, followed by the hybrid appeal with low researcher credibility.

  16. Induced seismicity provides insight into why earthquake ruptures stop

    KAUST Repository

    Galis, Martin

    2017-12-21

    Injection-induced earthquakes pose a serious seismic hazard but also offer an opportunity to gain insight into earthquake physics. Currently used models relating the maximum magnitude of injection-induced earthquakes to injection parameters do not incorporate rupture physics. We develop theoretical estimates, validated by simulations, of the size of ruptures induced by localized pore-pressure perturbations and propagating on prestressed faults. Our model accounts for ruptures growing beyond the perturbed area and distinguishes self-arrested from runaway ruptures. We develop a theoretical scaling relation between the largest magnitude of self-arrested earthquakes and the injected volume and find it consistent with observed maximum magnitudes of injection-induced earthquakes over a broad range of injected volumes, suggesting that, although runaway ruptures are possible, most injection-induced events so far have been self-arrested ruptures.
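    The paper's actual scaling relation and constants are not reproduced here; the sketch below only illustrates how a power-law bound on seismic moment versus injected volume translates into a maximum magnitude, using a hypothetical prefactor `gamma` and the standard Kanamori moment-magnitude conversion:

```python
import numpy as np

def max_arrested_mw(delta_v_m3, gamma=1.5e9):
    """Illustrative bound on self-arrested rupture size, modelled as a
    power law M0_max = gamma * dV**1.5; gamma is a hypothetical placeholder,
    not a constant taken from the paper."""
    m0 = gamma * delta_v_m3 ** 1.5              # seismic moment in N*m
    return (np.log10(m0) - 9.1) / 1.5           # Kanamori moment magnitude

for dv in (1e3, 1e4, 1e5):                      # injected volume in m^3
    print(f"dV = {dv:.0e} m^3  ->  Mw_max ~ {max_arrested_mw(dv):.2f}")
```

    With this exponent, each tenfold increase in injected volume raises the bound by one magnitude unit; runaway ruptures, as the abstract notes, are not limited by such a bound.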

  17. Cross-validation of the Dot Counting Test in a large sample of credible and non-credible patients referred for neuropsychological testing.

    Science.gov (United States)

    McCaul, Courtney; Boone, Kyle B; Ermshar, Annette; Cottingham, Maria; Victor, Tara L; Ziegler, Elizabeth; Zeller, Michelle A; Wright, Matthew

    2018-01-18

    To cross-validate the Dot Counting Test in a large neuropsychological sample, Dot Counting Test scores were compared between credible (n = 142) and non-credible (n = 335) neuropsychology referrals. Non-credible patients scored significantly higher than credible patients on all Dot Counting Test scores. While the original E-score cut-off of ≥17 achieved excellent specificity (96.5%), it was associated with mediocre sensitivity (52.8%). However, the cut-off could be substantially lowered to ≥13.80 while still maintaining adequate specificity (≥90%) and raising sensitivity to 70.0%. Examination of non-credible subgroups revealed that Dot Counting Test sensitivity in feigned mild traumatic brain injury (mTBI) was 55.8%, whereas sensitivity was 90.6% in patients with non-credible cognitive dysfunction in the context of claimed psychosis, and 81.0% in patients with non-credible cognitive performance in depression or severe TBI. Thus, the Dot Counting Test may have a particular role in the detection of non-credible cognitive symptoms in claimed psychiatric disorders. As an alternative to the E-score, failure on ≥1 cut-offs applied to individual Dot Counting Test scores (≥6.0″ for mean grouped dot counting time, ≥10.0″ for mean ungrouped dot counting time, and ≥4 errors) occurred in 11.3% of the credible sample, while nearly two-thirds (63.6%) of the non-credible sample failed one or more of these cut-offs. An E-score cut-off of 13.80, or failure on ≥1 individual score cut-offs, resulted in few false-positive identifications in credible patients and achieved high sensitivity (64.0-70.0%), and therefore appears appropriate for use in identifying neurocognitive performance invalidity.
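    The decision rule reported above can be written directly as code. The cut-off values are taken from the abstract; the function name and interface are illustrative:

```python
def dct_noncredible(mean_grouped_s, mean_ungrouped_s, errors, e_score):
    """Flag a Dot Counting Test performance as non-credible using the
    cut-offs reported in the abstract: E-score >= 13.80, or failure on
    any individual cut-off (mean grouped time >= 6.0 s, mean ungrouped
    time >= 10.0 s, or >= 4 errors)."""
    individual_fail = (mean_grouped_s >= 6.0
                       or mean_ungrouped_s >= 10.0
                       or errors >= 4)
    return e_score >= 13.80 or individual_fail

print(dct_noncredible(5.0, 8.0, 1, 12.0))   # within all cut-offs: False
print(dct_noncredible(6.5, 8.0, 1, 12.0))   # fails grouped-time cut-off: True
```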

  18. Seismic experience in power and industrial facilities as it relates to small magnitude earthquakes

    International Nuclear Information System (INIS)

    Swan, S.W.; Horstman, N.G.

    1987-01-01

    The database on the performance of power and industrial facilities in small-magnitude earthquakes (M = 4.0-5.5) is potentially very large. In California alone, many earthquakes in this magnitude range occur every year, often near industrial areas; in 1986, for example, there were 76 earthquakes between Richter magnitude 4.0 and 5.5 in northern California alone. Experience has shown that the effects of small-magnitude earthquakes are seldom significant to well-engineered facilities. (The term well-engineered is here defined to include most modern industrial installations, as well as power plants and substations.) Therefore, detailed investigations of small-magnitude earthquakes are normally not considered worthwhile. The purpose of this paper is to review the tendency toward seismic damage of equipment installations representative of nuclear power plant safety systems. Estimates are made of the thresholds of seismic damage to certain types of equipment in terms of conventional means of measuring the damage potential of an earthquake. The objective is to define thresholds of damage that can be correlated with Richter magnitude. In this manner an earthquake magnitude might be chosen below which damage to nuclear plant safety systems is not considered credible

  19. Earthquake-induced water-level fluctuations at Yucca Mountain, Nevada, June 1992

    International Nuclear Information System (INIS)

    O'Brien, G.M.

    1993-01-01

    This report presents earthquake-induced water-level and fluid-pressure data for wells in the Yucca Mountain area, Nevada, during June 1992. Three earthquakes occurred that caused significant water-level and fluid-pressure responses in wells. Wells USW H-5 and USW H-6 are continuously monitored to detect short-term responses caused by earthquakes. Two wells, monitored hourly, had significant longer-term responses in water level following the earthquakes. On June 28, 1992, a 7.5-magnitude earthquake occurred near Landers, California, causing an estimated maximum water-level change of 90 centimeters in well USW H-5. Three hours later a 6.6-magnitude earthquake occurred near Big Bear Lake, California; the maximum water-level fluctuation was 20 centimeters in well USW H-5. A 5.6-magnitude earthquake occurred at Little Skull Mountain, Nevada, on June 29, approximately 23 kilometers from Yucca Mountain. The maximum estimated short-term water-level fluctuation from the Little Skull Mountain earthquake was 40 centimeters in well USW H-5. The water level in well UE-25p #1, monitored hourly, decreased approximately 50 centimeters over 3 days following the Little Skull Mountain earthquake and returned to pre-earthquake levels in approximately 6 months. The water level in the lower interval of well USW H-3 increased 28 centimeters following the Little Skull Mountain earthquake. The Landers and Little Skull Mountain earthquakes caused responses in 17 intervals of 14 hourly monitored wells; however, most responses were small and of short duration. For several days following the major earthquakes, many smaller-magnitude aftershocks occurred, causing measurable responses in the continuously monitored wells

  20. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

    Science.gov (United States)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake, which has an annual probability of occurrence of about 2%. This paper examines the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme (TCIP) operational in the country. The TCIP system is designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and with approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure, and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering the building losses incurred in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140 and 300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification of the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, the purchase of larger re-insurance covers, and the development of a claim-processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul, under which losses would not be indemnified but calculated directly on the basis of indexed ground-motion levels and damages. The immediate improvement of a parametric insurance model over the existing one would be the elimination of the claim processing

  1. ON POTENTIAL REPRESENTATIONS OF THE DISTRIBUTION LAW OF RARE STRONGEST EARTHQUAKES

    Directory of Open Access Journals (Sweden)

    M. V. Rodkin

    2014-01-01

    Assessment of long-term seismic hazard depends critically on the behavior of the tail of the distribution function of rare strongest earthquakes. Analyses of empirical data cannot, however, yield a credible solution to this problem, because instrumental earthquake catalogs are available only for rather short time intervals and the uncertainty in magnitude estimates of paleoearthquakes is high. From the available data it has been possible only to propose a number of alternative models characterizing the distribution of rare strongest earthquakes: the model based on the Gutenberg-Richter law, suggested to be valid up to a maximum possible seismic event (Mmax); models of 'bend-down' of the earthquake recurrence curve; and the characteristic earthquakes model. We discuss these models from general physical concepts supported by the theory of extreme values (with reference to the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD)) and the multiplicative cascade model of the seismic regime. In terms of the multiplicative cascade model, the seismic regime is treated as a large number of episodes of avalanche-type relaxation of metastable states, taking place in a set of metastable sub-systems. The model of magnitude-unlimited continuation of the Gutenberg-Richter law is invalid from the physical point of view because it corresponds to an infinite mean value of seismic energy and infinite capacity of the process generating seismicity. A model with an abrupt cut of this law at a maximum possible event, Mmax, is not fully logical either. A model with a 'bend-down' of the earthquake recurrence curve can ensure both continuity of the distribution law and finiteness of the seismic energy value. Results of studies using the theory of extreme values provide convincing support for the model of 'bend-down' of the earthquakes' recurrence curve. Moreover they testify also that the
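    The 'bend-down' model discussed above corresponds, in GPD terms, to a negative shape parameter and hence a finite right endpoint. A self-contained numpy sketch, with an invented shape value, draws synthetic threshold exceedances by inverse-transform sampling and recovers the parameters by the method of moments:

```python
import numpy as np

rng = np.random.default_rng(1)
xi_true, sigma_true = -0.2, 1.0   # invented GPD parameters for illustration

# Inverse-transform sampling of GPD exceedances over a high threshold;
# a shape xi < 0 gives a finite right endpoint at sigma/|xi|, i.e. a soft
# "bend-down" of the recurrence curve rather than an abrupt cut at Mmax.
u = rng.uniform(size=2000)
exceedances = sigma_true / xi_true * ((1 - u) ** (-xi_true) - 1)

# Method-of-moments GPD estimates from the sample mean and variance.
m, v = exceedances.mean(), exceedances.var()
xi_hat = 0.5 * (1 - m * m / v)
sigma_hat = m * (1 - xi_hat)

print(xi_hat, sigma_hat)
if xi_hat < 0:
    print("implied finite endpoint above threshold:", -sigma_hat / xi_hat)
```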

  2. Building credibility in international banking and financial markets

    DEFF Research Database (Denmark)

    Jørgensen, Poul Erik Flyvholm; Isaksson, Maria

    2008-01-01

    Purpose - The research draws a detailed picture of how international corporate banks and financial institutions approach image advertising to enhance impressions of their credibility. The purpose of the work is twofold, namely to demonstrate (1) how corporate credibility can be conceptualised ... appeal forms. A corpus of 74 print adverts was then analysed in order to establish how financial marketers use the appeal forms to strengthen their corporate reputations. The patterns of credibility appeals obtained were then linked to the supporting visuals to provide a fuller picture of the industry. There is also clear evidence that corporate advertising is in fact strongly focussed on communicating credibility, with less than 10% of discourse and visuals devoted to credibility-free themes and issues. Research implications/limitations - The study takes a production perspective, using discourse ...

  3. Estimation of Surface Deformation due to Pasni Earthquake Using SAR Interferometry

    Science.gov (United States)

    Ali, M.; Shahzad, M. I.; Nazeer, M.; Kazmi, J. H.

    2018-04-01

    Earthquakes cause ground deformation in sedimented surface areas like Pasni, and such earthquake-induced ground displacements are a hazard that can seriously damage building structures. On 7 February 2017, an earthquake of magnitude 6.3 struck near Pasni. We successfully distinguished widely spread ground displacements for the Pasni earthquake using InSAR-based analysis with Sentinel-1 satellite C-band data, and generated maps of the surface displacement field resulting from the earthquake. Sentinel-1 Wide Swath data acquired from 9 December 2016 to 28 February 2017 were used to generate the displacement map. The interferogram revealed the area of deformation, and the comparison of interferometric vertical displacement over different time periods was treated as evidence of deformation caused by the earthquake. Profile graphs of the interferogram were created to estimate the vertical displacement range and trend. Pasni lies in an area strongly affected by earthquakes. The major surface-deformation areas were divided into zones based on the significance of the deformation. The average displacement in Pasni is estimated at about 250 mm. Most of the Pasni area was uplifted by the earthquake, with a maximum uplift of about 1200 mm. Some areas subsided, such as those near the shoreline, with a maximum subsidence estimated at about 1500 mm. Pasni faces many problems due to increasing sea-water intrusion under the prevailing climatic change, and land deformation due to a strong earthquake can augment its vulnerability.
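    For context on how such displacement maps are derived, the standard phase-to-displacement conversion for repeat-pass InSAR is sketched below. The Sentinel-1 wavelength is approximate, and the sign convention varies between processors:

```python
import numpy as np

# Approximate Sentinel-1 C-band radar wavelength in metres.
WAVELENGTH_M = 0.0555

def los_displacement_m(unwrapped_phase_rad):
    """Convert unwrapped interferometric phase to line-of-sight (LOS)
    displacement: one full fringe (2*pi of phase) corresponds to half a
    wavelength of LOS motion in a repeat-pass interferogram."""
    return unwrapped_phase_rad * WAVELENGTH_M / (4 * np.pi)

# One fringe of phase change is roughly 27.8 mm of line-of-sight motion.
print(los_displacement_m(2 * np.pi) * 1000)
```

    Converting LOS motion to vertical displacement additionally requires the local incidence angle (and assumptions about horizontal motion), which this sketch omits.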

  4. Can diligent and extensive mapping of faults provide reliable estimates of the expected maximum earthquakes at these faults? No. (Invited)

    Science.gov (United States)

    Bird, P.

    2010-12-01

    The hope expressed in the title question above can be contradicted in 5 ways, listed below. To summarize, an earthquake rupture can be larger than anticipated either because the fault system has not been fully mapped, or because the rupture is not limited to the pre-existing fault network. 1. Geologic mapping of faults is always incomplete due to four limitations: (a) Map-scale limitation: Faults below a certain (scale-dependent) apparent offset are omitted; (b) Field-time limitation: The most obvious fault(s) get(s) the most attention; (c) Outcrop limitation: You can't map what you can't see; and (d) Lithologic-contrast limitation: Intra-formation faults can be tough to map, so they are often assumed to be minor and omitted. If mapping is incomplete, fault traces may be longer and/or better-connected than we realize. 2. Fault trace “lengths” are unreliable guides to maximum magnitude. Fault networks have multiply-branching, quasi-fractal shapes, so fault “length” may be meaningless. Naming conventions for main strands are unclear, and rarely reviewed. Gaps due to Quaternary alluvial cover may not reflect deeper seismogenic structure. Mapped kinks and other “segment boundary asperities” may be only shallow structures. Also, some recent earthquakes have jumped and linked “separate” faults (Landers, California 1992; Denali, Alaska, 2002) [Wesnousky, 2006; Black, 2008]. 3. Distributed faulting (“eventually occurring everywhere”) is predicted by several simple theories: (a) Viscoelastic stress redistribution in plate/microplate interiors concentrates deviatoric stress upward until they fail by faulting; (b) Unstable triple-junctions (e.g., between 3 strike-slip faults) in 2-D plate theory require new faults to form; and (c) Faults which appear to end (on a geologic map) imply distributed permanent deformation. This means that all fault networks evolve and that even a perfect fault map would be incomplete for future ruptures. 4. A recent attempt

  5. Enhancing the quality and credibility of qualitative analysis.

    OpenAIRE

    Patton, M Q

    1999-01-01

    Varying philosophical and theoretical orientations to qualitative inquiry remind us that issues of quality and credibility intersect with audience and intended research purposes. This overview examines ways of enhancing the quality and credibility of qualitative analysis by dealing with three distinct but related inquiry concerns: rigorous techniques and methods for gathering and analyzing qualitative data, including attention to validity, reliability, and triangulation; the credibility, comp...

  6. Maximum spectral demands in the near-fault region

    Science.gov (United States)

    Huang, Yin-Nan; Whittaker, Andrew S.; Luco, Nicolas

    2008-01-01

    The Next Generation Attenuation (NGA) relationships for shallow crustal earthquakes in the western United States predict a rotated geometric mean of horizontal spectral demand, termed GMRotI50, and not maximum spectral demand. Differences between strike-normal, strike-parallel, geometric-mean, and maximum spectral demands in the near-fault region are investigated using 147 pairs of records selected from the NGA strong motion database. The selected records are for earthquakes with moment magnitude greater than 6.5 and for closest site-to-fault distance less than 15 km. Ratios of maximum spectral demand to NGA-predicted GMRotI50 for each pair of ground motions are presented. The ratio shows a clear dependence on period and the Somerville directivity parameters. Maximum demands can substantially exceed NGA-predicted GMRotI50 demands in the near-fault region, which has significant implications for seismic design, seismic performance assessment, and the next-generation seismic design maps. Strike-normal spectral demands are a significantly unconservative surrogate for maximum spectral demands for closest distance greater than 3 to 5 km. Scale factors that transform NGA-predicted GMRotI50 to a maximum spectral demand in the near-fault region are proposed.

  7. Quantitative prediction of strong motion for a potential earthquake fault

    Directory of Open Access Journals (Sweden)

    Shamita Das

    2010-02-01

Full Text Available This paper describes a new method for calculating strong motion records for a given seismic region on the basis of the laws of physics, using information on the tectonics and physical properties of the earthquake fault. Our method is based on an earthquake model, called a «barrier model», which is characterized by five source parameters: fault length, width, maximum slip, rupture velocity, and barrier interval. The first three parameters may be constrained from plate tectonics, and the fourth parameter is roughly a constant. The most important parameter controlling the earthquake strong motion is the last one, the «barrier interval». There are three methods to estimate the barrier interval for a given seismic region: 1) surface measurement of slip across fault breaks, 2) model fitting with observed near- and far-field seismograms, and 3) scaling law data for small earthquakes in the region. The barrier intervals were estimated for a dozen earthquakes and four seismic regions by the above three methods. Our preliminary results for California suggest that the barrier interval may be determined if the maximum slip is given. The relation between the barrier interval and maximum slip varies from one seismic region to another. For example, the interval appears to be unusually long for Kilauea, Hawaii, which may explain why only scattered evidence of strong ground shaking was observed in the epicentral area of the Island of Hawaii earthquake of November 29, 1975. The stress drop associated with an individual fault segment, estimated from the barrier interval and maximum slip, lies between 100 and 1000 bars. These values are about one order of magnitude greater than those estimated earlier using crack models without barriers. Thus, the barrier model can resolve, at least partially, the well-known discrepancy between the stress drops measured in the laboratory and those estimated for earthquakes.

  8. Antioptimization of earthquake excitation and response

    Directory of Open Access Journals (Sweden)

    G. Zuccaro

    1998-01-01

Full Text Available The paper presents a novel approach to predicting the response of earthquake-excited structures. The earthquake excitation is expanded in terms of a series of deterministic functions. The coefficients of the series are represented as a point in N-dimensional space. Each available accelerogram at a certain site is then represented as a point in the above space, modeling the available fragmentary historical data. The minimum-volume ellipsoid containing all points is constructed, and ellipsoidal models of uncertainty pertinent to earthquake excitation are developed. The maximum response of a structure subjected to the earthquake excitation, within the ellipsoidal model of the latter, is determined. This procedure of determining the least favorable response was termed in the literature (Elishakoff, 1991) an antioptimization. It appears that under the inherent uncertainty of earthquake excitation, antioptimization analysis is a viable alternative to the stochastic approach.

  9. Credibility of Policy Announcements Under Asymmetric Information

    DEFF Research Database (Denmark)

    Christensen, Michael

    1999-01-01

In a simple macro-economic model, where the monetary authorities process superior information about real shocks, the scope for an active stabilization policy is shown to depend on the credibility of the policy maker. Lack of credibility increases the need for an active stabilization policy...

  10. Multivariate statistical analysis to investigate the subduction zone parameters favoring the occurrence of giant megathrust earthquakes

    Science.gov (United States)

    Brizzi, S.; Sandri, L.; Funiciello, F.; Corbi, F.; Piromallo, C.; Heuret, A.

    2018-03-01

    The observed maximum magnitude of subduction megathrust earthquakes is highly variable worldwide. One key question is which conditions, if any, favor the occurrence of giant earthquakes (Mw ≥ 8.5). Here we carry out a multivariate statistical study in order to investigate the factors affecting the maximum magnitude of subduction megathrust earthquakes. We find that the trench-parallel extent of subduction zones and the thickness of trench sediments provide the largest discriminating capability between subduction zones that have experienced giant earthquakes and those having significantly lower maximum magnitude. Monte Carlo simulations show that the observed spatial distribution of giant earthquakes cannot be explained by pure chance to a statistically significant level. We suggest that the combination of a long subduction zone with thick trench sediments likely promotes a great lateral rupture propagation, characteristic of almost all giant earthquakes.
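The Monte Carlo significance test mentioned in the abstract above can be illustrated with a toy one-sided permutation test; the data values and variable names below are purely illustrative, not the study's:

```python
import random

def perm_test_mean_diff(values, labels, n_iter=10000, seed=42):
    """One-sided permutation test: is the mean of the labeled ('giant')
    group larger than random relabeling would produce?"""
    rng = random.Random(seed)
    giant = [v for v, l in zip(values, labels) if l]
    rest = [v for v, l in zip(values, labels) if not l]
    observed = sum(giant) / len(giant) - sum(rest) / len(rest)
    k = len(giant)
    pool = list(values)
    count = 0
    for _ in range(n_iter):
        rng.shuffle(pool)  # random relabeling: first k values play the 'giant' role
        diff = sum(pool[:k]) / k - sum(pool[k:]) / len(pool[k:])
        if diff >= observed:
            count += 1
    return count / n_iter  # Monte Carlo p-value

# Hypothetical trench-parallel lengths (km); True marks zones with Mw >= 8.5 events
lengths = [6000, 5500, 5200, 4800, 1500, 1200, 1100, 900, 800, 700]
giant_flags = [True, True, True, True, False, False, False, False, False, False]
p = perm_test_mean_diff(lengths, giant_flags)
```

With the giant-flagged zones holding the four largest lengths, the Monte Carlo p-value comes out well below 0.05, i.e. the association would not be explained by pure chance.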

  11. Credibility and trust in risk communication

    International Nuclear Information System (INIS)

    Renn, O.; Levine, D.

    1989-01-01

    The paper attempts to summarize the major findings of the psychological and sociological literature on trust and credibility, and to apply these findings to the specific arena of risk communication. A few guidelines for risk communication that appear appropriate for the social and institutional context in which the risk debate takes place are presented. The case studies of credibility of nuclear energy, biotechnology, medicine, and aviation are discussed. (DG)

  12. Instruction system upon occurrence of earthquakes

    International Nuclear Information System (INIS)

    Inagaki, Masakatsu; Morikawa, Matsuo; Suzuki, Satoshi; Fukushi, Naomi.

    1987-01-01

Purpose: To enable rapid re-starting of a nuclear reactor after an earthquake by informing operators of the properties of the earthquake encountered and properly displaying the state of damage in comparison with the design standard values of the facilities. Constitution: Even when the maximum accelerations of an encountered earthquake exceed the design standard values, the equipment may still remain intact, depending on the wave components of the seismic motion and the vibration properties inherent to the equipment. Taking note of this fact, the instruction device comprises a system that indicates the relationship between the seismic waveforms of the encountered earthquake and the scram setting values, a system that compares the floor response spectrum of the encountered seismic waveforms with the design floor response spectrum used in designing the equipment, and a system that indicates the equipment requiring inspection after the earthquake. Accordingly, it is possible to improve operability upon scram of a nuclear power plant undergoing an earthquake, and to improve power saving and safety by clearly defining the portions to be inspected after the earthquake. (Kawakami, Y.)

  13. Analysis of pre-earthquake ionospheric anomalies before the global M = 7.0+ earthquakes in 2010

    Directory of Open Access Journals (Sweden)

    W. F. Peng

    2012-03-01

Full Text Available The pre-earthquake ionospheric anomalies that occurred before the global M = 7.0+ earthquakes in 2010 are investigated using the total electron content (TEC) from the global ionosphere map (GIM). We analyze the possible causes of the ionospheric anomalies based on the space environment and magnetic field status. Results show that some anomalies are related to the earthquakes. By analyzing the time of occurrence, duration, and spatial distribution of these ionospheric anomalies, a number of new conclusions are drawn: earthquake-related ionospheric anomalies are not bound to appear; both positive and negative anomalies are likely to occur; and the earthquake-related ionospheric anomalies discussed in the current study occurred 0-2 days before the associated earthquakes and in the afternoon to sunset (i.e. between 12:00 and 20:00 local time). Pre-earthquake ionospheric anomalies occur mainly in areas near the epicenter. However, the maximum affected area in the ionosphere does not coincide with the vertical projection of the epicenter of the subsequent earthquake, and the directions of deviation from the epicenters do not follow a fixed rule. The corresponding ionospheric effects can also be observed in the magnetically conjugated region, although both the probability of appearance and the extent of the anomalies there are smaller than near the epicenter. Deep-focus earthquakes may also exhibit very significant pre-earthquake ionospheric anomalies.

  14. Credibility is the first principle

    International Nuclear Information System (INIS)

    Beecher, William

    2002-01-01

The first principle of an effective public affairs program on nuclear energy is credibility. If credibility is lacking, no matter how artful the message, it will not be persuasive. This has long been a problem in the United States. For years, much of the industry followed the practice of telling the public there was 'no release' whenever an event at a nuclear power plant resulted in an unplanned release of radioactivity that remained below the technical specifications the NRC mandates as safe. The NRC is a safety regulator: it can tell nuclear power plant operators what to do, or not do, when it comes to safety, but it does not have the right to tell them what to say to the public. The example of an emergency exercise, and the NRC press release issued on that occasion, shows how companies could be influenced to behave so as to prevent such avoidably negative news coverage, i.e. to maintain credibility where public anxiety is concerned

  15. Credible nuclear waste management: a legislative perspective

    International Nuclear Information System (INIS)

    Jeffords, J.M.

    1978-01-01

The past credibility of the AEC, ERDA, and NRC, along with the present credibility of DOE and NRC, is questioned. The results of voter responses to a moratorium on expansion of nuclear power are linked to the question of the past credibility of these Federal agencies. It is proposed that the future of nuclear power be linked directly to the Executive Branch of the government via a new bureaucracy, a Waste Management Authority. This new bureaucracy would be completely separated from the construction and licensing phases of nuclear power, except that it would have final say over any nuclear power expansion pending an acceptable solution to the waste reprocessing question

  16. Extracting Credible Dependencies for Averaged One-Dependence Estimator Analysis

    Directory of Open Access Journals (Sweden)

    LiMin Wang

    2014-01-01

Full Text Available Of the numerous proposals to improve the accuracy of naive Bayes (NB) by weakening the conditional independence assumption, the averaged one-dependence estimator (AODE) demonstrates remarkable zero-one loss performance. However, indiscriminate superparent attributes bring both considerable computational cost and a negative effect on classification accuracy. In this paper, to extract the most credible dependencies we present a new type of seminaive Bayesian operation, which selects superparent attributes by building a maximum weighted spanning tree and removes highly correlated children attributes by functional dependency and canonical cover analysis. Our extensive experimental comparison on UCI data sets shows that this operation efficiently identifies possible superparent attributes at training time and eliminates redundant children attributes at classification time.
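The superparent-selection step described above hinges on building a maximum weighted spanning tree. A minimal sketch of that step (Kruskal's algorithm run in descending weight order; the edge weights stand in for attribute-dependency scores and are purely illustrative):

```python
def max_spanning_tree(n, edges):
    """Kruskal's algorithm, processing edges in descending weight order.
    edges: list of (u, v, weight); returns (chosen edges, total weight)."""
    parent = list(range(n))  # union-find forest

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    tree, total = [], 0.0
    for u, v, w in sorted(edges, key=lambda e: -e[2]):
        ru, rv = find(u), find(v)
        if ru != rv:           # adding this edge creates no cycle
            parent[ru] = rv
            tree.append((u, v, w))
            total += w
    return tree, total

# Illustrative attribute-dependency weights (hypothetical values)
edges = [(0, 1, 5.0), (1, 2, 3.0), (0, 2, 4.0), (2, 3, 2.0), (1, 3, 1.0)]
tree, total = max_spanning_tree(4, edges)
```

On this toy graph the tree keeps the three heaviest cycle-free edges, with total weight 11.0; in the AODE setting the weights would be dependency scores between attributes rather than these made-up numbers.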

  17. Accident analysis of railway transportation of low-level radioactive and hazardous chemical wastes: Application of the "Maximum Credible Accident" concept

    Energy Technology Data Exchange (ETDEWEB)

    Ricci, E.; McLean, R.B.

    1988-09-01

The maximum credible accident (MCA) approach to accident analysis places an upper bound on the potential adverse effects of a proposed action by using conservative but simplifying assumptions. It is often used when data are lacking to support a more realistic scenario or when MCA calculations result in acceptable consequences. The MCA approach can also be combined with realistic scenarios to assess potential adverse effects. This report presents a guide for the preparation of transportation accident analyses based on the use of the MCA concept. Rail transportation of contaminated wastes is used as an example. The example is the analysis of the environmental impact of the potential derailment of a train transporting a large shipment of wastes. The shipment is assumed to be contaminated with polychlorinated biphenyls and low-level radioactivities of uranium and technetium. The train is assumed to plunge into a river used as a source of drinking water. The conclusions from the example accident analysis are based on the calculation of the number of foreseeable premature cancer deaths that might result as a consequence of this accident. These calculations are presented, and the reference material forming the basis for all assumptions and calculations is also provided.

  18. Accident analysis of railway transportation of low-level radioactive and hazardous chemical wastes: Application of the "Maximum Credible Accident" concept

    International Nuclear Information System (INIS)

    Ricci, E.; McLean, R.B.

    1988-09-01

The maximum credible accident (MCA) approach to accident analysis places an upper bound on the potential adverse effects of a proposed action by using conservative but simplifying assumptions. It is often used when data are lacking to support a more realistic scenario or when MCA calculations result in acceptable consequences. The MCA approach can also be combined with realistic scenarios to assess potential adverse effects. This report presents a guide for the preparation of transportation accident analyses based on the use of the MCA concept. Rail transportation of contaminated wastes is used as an example. The example is the analysis of the environmental impact of the potential derailment of a train transporting a large shipment of wastes. The shipment is assumed to be contaminated with polychlorinated biphenyls and low-level radioactivities of uranium and technetium. The train is assumed to plunge into a river used as a source of drinking water. The conclusions from the example accident analysis are based on the calculation of the number of foreseeable premature cancer deaths that might result as a consequence of this accident. These calculations are presented, and the reference material forming the basis for all assumptions and calculations is also provided.

  19. The Credibility of Science Communication

    Directory of Open Access Journals (Sweden)

    Nielsen, L. H.

    2007-10-01

Full Text Available Current developments in the media marketplace and an increased need for visibility to secure funding are leading inevitably to faster, simpler, and more aggressive science communication. This article presents the results of an exploratory study of potential credibility problems in astronomy press releases, including their causes, consequences, and possible remedies. The study consisted of eleven open-ended interviews with journalists, scientists, and public information officers. Results suggest that credibility issues are central to communication, deeply integrated into the workflow, and can have severe consequences for the actors (especially the scientist), but are an unavoidable part of the communication process.

  20. New approach of determinations of earthquake moment magnitude using near earthquake source duration and maximum displacement amplitude of high frequency energy radiation

    Energy Technology Data Exchange (ETDEWEB)

    Gunawan, H.; Puspito, N. T.; Ibrahim, G.; Harjadi, P. J. P. [ITB, Faculty of Earth Sciences and Technology (Indonesia); BMKG (Indonesia)

    2012-06-20

The new approach to determining magnitude using the relationships between displacement amplitude (A), epicentral distance (Δ), and duration of high-frequency radiation (t) has been investigated for the Tasikmalaya earthquake of September 2, 2009, and its aftershocks. The moment magnitude scale commonly uses seismic surface waves at teleseismic range with periods greater than 200 seconds, or a moment magnitude of the P wave using teleseismic seismogram data in the 10-60 second range. In this research, a new approach has been developed to determine the displacement amplitude and the duration of high-frequency radiation using near earthquakes. The duration of high-frequency radiation is determined from half the period of the P waves on the displacement seismograms. This is due to the very complex rupture process of near earthquakes: the P-wave data mix with other waves (the S wave) before the duration runs out, so it is difficult to separate or determine the end of the P wave. From application to 68 earthquakes recorded by station CISI, Garut, West Java, the following relationship is obtained: Mw = 0.78 log (A) + 0.83 log (Δ) + 0.69 log (t) + 6.46, with A (m), Δ (km), and t (seconds). The moment magnitude from this new approach is quite reliable and faster to process, making it useful for early warning.

  1. New approach of determinations of earthquake moment magnitude using near earthquake source duration and maximum displacement amplitude of high frequency energy radiation

    Science.gov (United States)

    Gunawan, H.; Puspito, N. T.; Ibrahim, G.; Harjadi, P. J. P.

    2012-06-01

The new approach to determining magnitude using the relationships between displacement amplitude (A), epicentral distance (Δ), and duration of high-frequency radiation (t) has been investigated for the Tasikmalaya earthquake of September 2, 2009, and its aftershocks. The moment magnitude scale commonly uses seismic surface waves at teleseismic range with periods greater than 200 seconds, or a moment magnitude of the P wave using teleseismic seismogram data in the 10-60 second range. In this research, a new approach has been developed to determine the displacement amplitude and the duration of high-frequency radiation using near earthquakes. The duration of high-frequency radiation is determined from half the period of the P waves on the displacement seismograms. This is due to the very complex rupture process of near earthquakes: the P-wave data mix with other waves (the S wave) before the duration runs out, so it is difficult to separate or determine the end of the P wave. From application to 68 earthquakes recorded by station CISI, Garut, West Java, the following relationship is obtained: Mw = 0.78 log (A) + 0.83 log (Δ) + 0.69 log (t) + 6.46, with A (m), Δ (km), and t (seconds). The moment magnitude from this new approach is quite reliable and faster to process, making it useful for early warning.

  2. New approach of determinations of earthquake moment magnitude using near earthquake source duration and maximum displacement amplitude of high frequency energy radiation

    International Nuclear Information System (INIS)

    Gunawan, H.; Puspito, N. T.; Ibrahim, G.; Harjadi, P. J. P.

    2012-01-01

The new approach to determining magnitude using the relationships between displacement amplitude (A), epicentral distance (Δ), and duration of high-frequency radiation (t) has been investigated for the Tasikmalaya earthquake of September 2, 2009, and its aftershocks. The moment magnitude scale commonly uses seismic surface waves at teleseismic range with periods greater than 200 seconds, or a moment magnitude of the P wave using teleseismic seismogram data in the 10-60 second range. In this research, a new approach has been developed to determine the displacement amplitude and the duration of high-frequency radiation using near earthquakes. The duration of high-frequency radiation is determined from half the period of the P waves on the displacement seismograms. This is due to the very complex rupture process of near earthquakes: the P-wave data mix with other waves (the S wave) before the duration runs out, so it is difficult to separate or determine the end of the P wave. From application to 68 earthquakes recorded by station CISI, Garut, West Java, the following relationship is obtained: Mw = 0.78 log (A) + 0.83 log (Δ) + 0.69 log (t) + 6.46, with A (m), Δ (km), and t (seconds). The moment magnitude from this new approach is quite reliable and faster to process, making it useful for early warning.
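The regression reported in the abstract above is straightforward to evaluate. A small sketch using base-10 logarithms and the stated units (A in metres, Δ in km, t in seconds); the input values are illustrative, not taken from the paper:

```python
import math

def moment_magnitude(A, delta, t):
    """Mw = 0.78 log(A) + 0.83 log(delta) + 0.69 log(t) + 6.46
    A: maximum displacement amplitude (m); delta: epicentral distance (km);
    t: duration of high-frequency radiation (s). Logs are base 10."""
    return (0.78 * math.log10(A)
            + 0.83 * math.log10(delta)
            + 0.69 * math.log10(t)
            + 6.46)

# Illustrative values: 1 mm displacement, 100 km away, 4 s duration
mw = moment_magnitude(A=0.001, delta=100.0, t=4.0)  # ≈ 6.2
```

Because each term needs only one amplitude pick, one distance, and a half-period duration from a single near station, the estimate is available almost immediately, which is the early-warning appeal noted in the abstract.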

  3. Maximum Likelihood and Bayes Estimation in Randomly Censored Geometric Distribution

    Directory of Open Access Journals (Sweden)

    Hare Krishna

    2017-01-01

Full Text Available In this article, we study the geometric distribution under randomly censored data. Maximum likelihood estimators and confidence intervals based on the Fisher information matrix are derived for the unknown parameters with randomly censored data. Bayes estimators are also developed using beta priors under generalized entropy and LINEX loss functions. Bayesian credible and highest posterior density (HPD) credible intervals are also obtained for the parameters. Expected time on test and reliability characteristics are analyzed as well. To compare the various estimates developed in the article, a Monte Carlo simulation study is carried out. Finally, for illustration purposes, a randomly censored real data set is discussed.
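As a sketch of the Bayesian machinery described above: in the uncensored special case the beta prior is conjugate, so with a Beta(a, b) prior on p and complete geometric observations on {1, 2, ...} the posterior is Beta(a + n, b + Σx − n), and a percentile credible interval can be drawn by simulation with the standard library. The censored likelihood treated in the article adds censoring terms not shown here, and the data below are illustrative:

```python
import random

def geometric_credible_interval(data, a=1.0, b=1.0, level=0.95,
                                n_draws=100000, seed=7):
    """Equal-tailed credible interval for p of a geometric distribution
    on {1, 2, ...} with complete (uncensored) data and a Beta(a, b) prior.
    Posterior: Beta(a + n, b + sum(x) - n); interval via posterior sampling."""
    n, s = len(data), sum(data)
    rng = random.Random(seed)
    draws = sorted(rng.betavariate(a + n, b + s - n) for _ in range(n_draws))
    lo = draws[int((1 - level) / 2 * n_draws)]
    hi = draws[int((1 + level) / 2 * n_draws)]
    return lo, hi

# Illustrative data: 20 hypothetical geometric observations
data = [1, 2, 1, 5, 3, 1, 2, 4, 1, 1, 6, 2, 3, 1, 2, 1, 8, 2, 3, 4]
lo, hi = geometric_credible_interval(data)
```

Here n = 20 and Σx = 53, so the posterior under a uniform Beta(1, 1) prior is Beta(21, 34), and the sampled 95% interval brackets the posterior mean 21/55 ≈ 0.38. An HPD interval, as used in the article, would instead pick the narrowest interval of the requested mass.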

  4. Exchange rate stabilization under imperfect credibility

    OpenAIRE

    Calvo, Guillermo; Vegh, Carlos

    1991-01-01

    This paper analyzes stabilization policy under predetermined exchange rates in a cash-in-advance, staggered-prices model. Under full credibility, a reduction in the rate of devaluation results in an immediate and permanent reduction in the inflation rate, with no effect on output or consumption. In contrast, a non-credible stabilization results in an initial expansion of output, followed by a later recession. The inflation rate of home goods remains above the rate of devaluation throughout...

  5. Credibility judgments of narratives: language, plausibility, and absorption.

    Science.gov (United States)

    Nahari, Galit; Glicksohn, Joseph; Nachson, Israel

    2010-01-01

    Two experiments were conducted in order to find out whether textual features of narratives differentially affect credibility judgments made by judges having different levels of absorption (a disposition associated with rich visual imagination). Participants in both experiments were exposed to a textual narrative and requested to judge whether the narrator actually experienced the event he described in his story. In Experiment 1, the narrative varied in terms of language (literal, figurative) and plausibility (ordinary, anomalous). In Experiment 2, the narrative varied in terms of language only. The participants' perceptions of the plausibility of the story described and the extent to which they were absorbed in reading were measured. The data from both experiments together suggest that the groups applied entirely different criteria in credibility judgments. For high-absorption individuals, their credibility judgment depends on the degree to which the text can be assimilated into their own vivid imagination, whereas for low-absorption individuals it depends mainly on plausibility. That is, high-absorption individuals applied an experiential mental set while judging the credibility of the narrator, whereas low-absorption individuals applied an instrumental mental set. Possible cognitive mechanisms and implications for credibility judgments are discussed.

  6. Credibility assessment in child sexual abuse investigations: A descriptive analysis.

    Science.gov (United States)

    Melkman, Eran P; Hershkowitz, Irit; Zur, Ronit

    2017-05-01

A major challenge in cases of child sexual abuse (CSA) is determining the credibility of children's reports. Consequently, cases may be misclassified as false or deemed 'no judgment possible'. Based on a large national sample of reports of CSA made in Israel in 2014, the study examines the child and event characteristics contributing to the probability that reports of abuse would be judged credible. National data files of all children aged 3-14 who were referred for investigation following suspected victimization of sexual abuse, and who had disclosed sexual abuse, were analyzed. Cases were classified as either 'credible' or 'no judgment possible'. The probability of reaching a 'credible' judgment was examined in relation to characteristics of the child (age, gender, cognitive delay, marital status of the parents) and of the abusive event (abuse severity, frequency, perpetrator-victim relationship, perpetrator's use of grooming, and perpetrator's use of coercion), controlling for investigator identity at the cluster level of the analysis. Of the 1563 cases analyzed, 57.9% were assessed as credible. The most powerful predictors of a credible judgment were older age and the absence of a cognitive delay. Reports by children of married parents who experienced a single abusive event involving the perpetrator's use of grooming were also more likely to be judged credible. The rates of credible judgments found are lower than expected, suggesting under-identification of truthful reports of CSA. In particular, cases of severe and multiple abuse involving younger and cognitively delayed children have the lowest chances of being assessed as credible. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Spatial Distribution of earthquakes off the coast of Fukushima Two Years after the M9 Earthquake: the Southern Area of the 2011 Tohoku Earthquake Rupture Zone

    Science.gov (United States)

    Yamada, T.; Nakahigashi, K.; Shinohara, M.; Mochizuki, K.; Shiobara, H.

    2014-12-01

Huge earthquakes cause vast stress-field changes around their rupture zones, and many aftershocks and related geophysical phenomena, such as geodetic movements, have been observed. It is important to determine the spatiotemporal distribution of seismicity during the relaxation process in order to understand the giant earthquake cycle. In this study, we focus on the southern rupture area of the 2011 Tohoku earthquake (M9.0), where the seismicity rate remains high compared with that before the 2011 earthquake. Many studies using ocean bottom seismometers (OBSs) have been carried out since soon after the 2011 Tohoku earthquake in order to record aftershock activity precisely. Here we present one such study off the coast of Fukushima, which is located in the southern part of the rupture area of the 2011 Tohoku earthquake. We deployed 4 broadband-type OBSs (BBOBSs) and 12 short-period-type OBSs (SOBSs) in August 2012. Another 4 BBOBSs fitted with absolute pressure gauges, together with 20 SOBSs, were added in November 2012. We recovered 36 OBSs, including 8 BBOBSs, in November 2013. We selected 1,000 events in the vicinity of the OBS network based on a hypocenter catalog published by the Japan Meteorological Agency and extracted the data after correcting for each internal clock. P- and S-wave arrival times, P-wave polarity, and maximum amplitude were picked manually on a computer display. We assumed a one-dimensional velocity structure based on the result of an active-source experiment across our network and applied station corrections to remove ambiguity in the assumed structure. We then adopted a maximum-likelihood estimation technique and calculated the hypocenters. The results show intensive activity near the Japan Trench, while a quiet seismic zone lies between the trench zone and the landward zone of high activity.

  8. Earthquake Strong Ground Motion Scenario at the 2008 Olympic Games Sites, Beijing, China

    Science.gov (United States)

    Liu, L.; Rohrbach, E. A.; Chen, Q.; Chen, Y.

    2006-12-01

Historical earthquake records indicate that moderate to strong earthquakes have frequently hit the greater Beijing metropolitan area, which is going to host the 2008 summer Olympic Games. As part of readiness preparations for emergency response to earthquake shaking during a mega event in a mega city like Beijing in summer 2008, this paper constructs strong ground motion scenarios at a number of gymnasium sites for the 2008 Olympic Games. During the last 500 years (the Ming and Qing Dynasties), for which the historical earthquake record is thorough and complete, at least 12 earthquake events with a maximum intensity of VI or greater occurred within a 100 km radius centered at Tiananmen Square, the center of Beijing City. Numerical simulation of the seismic wave propagation and surface strong ground motion is carried out by pseudospectral time-domain methods with viscoelastic material properties. To improve modeling efficiency and accuracy, a multi-scale approach is adopted: the seismic wave propagation originating from an earthquake rupture source is first simulated on a larger physical domain with coarser grids; the wavefield at a given plane is then taken as the source input for the small-scale, fine-grid model used for the strong ground motion study at the sites. The earthquake source rupture scenario is based on two particular historical earthquake events. One is the great 1679 Sanhe-Pinggu earthquake (M~8, maximum intensity XI at the epicenter and intensity VIII in the city center), whose epicenter is about 60 km ENE of the city center. The other is the 1730 Haidian earthquake (M~6, maximum intensity IX at the epicenter and intensity VIII in the city center), with an epicentral distance of less than 20 km from the city center in the NW Haidian District. The existence of thick Tertiary-Quaternary sediments (maximum thickness ~2 km) in the Beijing area plays a critical role in estimating the surface ground motion at the Olympic Games sites, which

  9. Aggregated trustworthiness: Redefining online credibility through social validation

    DEFF Research Database (Denmark)

    Jessen, Johan; Jørgensen, Anker Helms

    2012-01-01

    This article investigates the impact of social dynamics on online credibility. Empirical studies by Pettingill (2006) and Hargittai, et al. (2010) suggest that social validation and online trustees play increasingly important roles when evaluating credibility online. This dynamic puts pressure...

  10. Credibility Perceptions of User Generated Content

    OpenAIRE

    Murugan, S.; Nagarajan, Dr. P.S.

    2017-01-01

    Social media users generate a large volume of user generated content in various social media platforms to share their experiences in using a brand or a service. In the travel industry, the user generated content reviews are used by the prospective travellers to decide their travel plans. In the 1950’s credibility research of the media was started when television was introduced as a new media in the world dominated by newspapers. In the Social Media platforms, the credibility assessment is muc...

  11. Evaluation method of nuclear nonproliferation credibility

    International Nuclear Information System (INIS)

    Kwon, Eun-ha; Ko, Won Il

    2009-01-01

    This paper presents an integrated multicriteria analysis method for the quantitative evaluation of a state's nuclear nonproliferation credibility level. Underscoring the implications of policy on the sources of political demand for nuclear weapons rather than focusing on efforts to restrict the supply of specific weapons technology from the 'haves' to the 'have-nots', the proposed methodology considers the political, social, and cultural dimensions of nuclear proliferation. This methodology comprises three steps: (1) identifying the factors that influence credibility formation and employing them to construct a criteria tree that will illustrate the relationships among these factors; (2) defining the weight coefficients of each criterion through pairwise comparisons of the Analytical Hierarchy Process (AHP); and (3) assigning numerical scores to a state under each criterion and combining them with the weight coefficients in order to provide an overall assessment of the state. The functionality of this methodology is examined by assessing the current level of nuclear nonproliferation credibility of four countries: Japan, North Korea, South Korea, and Switzerland.
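
The pairwise-comparison step of the AHP described above can be sketched briefly. The snippet below uses the row geometric-mean method, a standard approximation to the principal-eigenvector priority weights; the 3x3 comparison matrix is purely illustrative and not taken from the paper:

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights with the row geometric-mean method.

    `pairwise` is a reciprocal comparison matrix: pairwise[i][j] holds the
    judged importance of criterion i relative to criterion j."""
    geo_means = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Hypothetical 3-criterion comparison (e.g. political, social, cultural):
matrix = [
    [1.0,     3.0, 5.0],
    [1 / 3.0, 1.0, 2.0],
    [1 / 5.0, 0.5, 1.0],
]
weights = ahp_weights(matrix)
print([round(w, 3) for w in weights])
```

The weights sum to one and preserve the ranking implied by the pairwise judgments, which is all that the subsequent scoring step of such a methodology requires.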

  12. Limitation of the Predominant-Period Estimator for Earthquake Early Warning and the Initial Rupture of Earthquakes

    Science.gov (United States)

    Yamada, T.; Ide, S.

    2007-12-01

    Earthquake early warning is an important and challenging issue for the reduction of seismic damage, especially for the mitigation of human suffering. One of the most important problems in earthquake early warning systems is how quickly we can estimate the final size of an earthquake after we observe the ground motion. This is related to the question of whether the initial rupture of an earthquake carries information about its final size. Nakamura (1988) developed the Urgent Earthquake Detection and Alarm System (UrEDAS), which calculates the predominant period of the P wave (τp) and estimates the magnitude of an earthquake immediately after the P-wave arrival from the value of τpmax, the maximum value of τp. A similar approach has been adopted by other earthquake alarm systems (e.g., Allen and Kanamori (2003)). To investigate the characteristics of the parameter τp and the effect of the length of the time window (TW) in the τpmax calculation, we analyze high-frequency recordings of earthquakes at very close distances in the Mponeng mine in South Africa. We find that values of τpmax have upper and lower limits. For larger earthquakes whose source durations are longer than TW, the values of τpmax have an upper limit which depends on TW. On the other hand, the values for smaller earthquakes have a lower limit which is proportional to the sampling interval. For intermediate earthquakes, the values of τpmax are close to their typical source durations. These two limits and the slope for intermediate earthquakes yield an artificial final-size dependence of τpmax over a wide size range. The parameter τpmax is useful for detecting large earthquakes and broadcasting earthquake early warnings. However, its dependence on the final size of earthquakes does not imply that the earthquake rupture is deterministic, because τpmax does not always have a direct relation to the physical quantities of an earthquake.
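
The recursive predominant-period estimate referred to above can be sketched as follows. This is a minimal implementation of the smoothing scheme popularized by Nakamura (1988) and Allen and Kanamori (2003); the smoothing constant and the synthetic test signal are illustrative assumptions, not values from the paper:

```python
import math

def predominant_period(velocity, dt, alpha=0.99):
    """Recursive predominant-period estimate tau_p of a ground-velocity
    record: tau_p = 2*pi*sqrt(X/D), where X and D are exponentially
    smoothed powers of the signal and of its time derivative."""
    x_s = d_s = 0.0
    prev = velocity[0]
    taus = []
    for v in velocity[1:]:
        deriv = (v - prev) / dt            # numerical time derivative
        x_s = alpha * x_s + v * v          # smoothed signal power
        d_s = alpha * d_s + deriv * deriv  # smoothed derivative power
        taus.append(2.0 * math.pi * math.sqrt(x_s / d_s) if d_s > 0 else 0.0)
        prev = v
    return taus

# Synthetic check: a pure 1 Hz sine sampled at 100 Hz should give tau_p
# near its 1 s period once the smoothing has settled.
dt = 0.01
signal = [math.sin(2.0 * math.pi * 1.0 * i * dt) for i in range(2000)]
taus = predominant_period(signal, dt)
print(round(taus[-1], 2))
```

Taking the running maximum of the returned series gives τpmax; the lower limit at the sampling interval and the saturation at the smoothing-window length discussed in the abstract both follow from this recursion.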

  13. Design basis earthquakes for critical industrial facilities and their characteristics, and the Southern Hyogo prefecture earthquake, 17 January 1995

    Energy Technology Data Exchange (ETDEWEB)

    Shibata, Heki

    1998-12-01

    This paper deals with how to establish the concept of the design basis earthquake (DBE) for critical industrial facilities such as nuclear power plants in consideration of disasters such as the Southern Hyogo prefecture earthquake, the so-called Kobe earthquake, of 1995. The author once discussed various DBEs at the 7th World Conference on Earthquake Engineering. At that time, the author assumed that the strongest effective PGA would be 0.7 G, and compared the accelerations of a structure obtained by various codes in Japan and other countries. The maximum PGA observed by an instrument in the 1995 Southern Hyogo prefecture earthquake exceeded the author's previous assumption, even though the results of the previous paper had been considered pessimistic. Based on the experience of the Kobe event, the author points out the necessity of a third design basis earthquake, S_s, in addition to the S_1 and S_2 of previous DBEs.

  14. Enhancing the quality and credibility of qualitative analysis.

    Science.gov (United States)

    Patton, M Q

    1999-12-01

    Varying philosophical and theoretical orientations to qualitative inquiry remind us that issues of quality and credibility intersect with audience and intended research purposes. This overview examines ways of enhancing the quality and credibility of qualitative analysis by dealing with three distinct but related inquiry concerns: rigorous techniques and methods for gathering and analyzing qualitative data, including attention to validity, reliability, and triangulation; the credibility, competence, and perceived trustworthiness of the qualitative researcher; and the philosophical beliefs of evaluation users about such paradigm-based preferences as objectivity versus subjectivity, truth versus perspective, and generalizations versus extrapolations. Although this overview examines some general approaches to issues of credibility and data quality in qualitative analysis, it is important to acknowledge that particular philosophical underpinnings, specific paradigms, and special purposes for qualitative inquiry will typically include additional or substitute criteria for assuring and judging quality, validity, and credibility. Moreover, the context for these considerations has evolved. In early literature on evaluation methods the debate between qualitative and quantitative methodologists was often strident. In recent years the debate has softened. A consensus has gradually emerged that the important challenge is to match appropriately the methods to empirical questions and issues, and not to universally advocate any single methodological approach for all problems.

  15. Non extensivity and frequency magnitude distribution of earthquakes

    International Nuclear Information System (INIS)

    Sotolongo-Costa, Oscar; Posadas, Antonio

    2003-01-01

    Starting from first principles (in this case a non-extensive formulation of the maximum entropy principle) and a phenomenological approach, an explicit formula for the magnitude distribution of earthquakes is derived which describes earthquakes over the whole range of magnitudes. The Gutenberg-Richter law appears as a particular case of the obtained formula. Comparison with geophysical data shows very good agreement.
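
As a concrete illustration of the Gutenberg-Richter special case mentioned above, the b-value of a catalog can be estimated with the classical Aki maximum-likelihood formula; the synthetic catalog below is illustrative and unrelated to the paper's data:

```python
import math
import random

def gutenberg_richter_b(magnitudes, m_min):
    """Aki (1965) maximum-likelihood estimate of the b-value in the
    Gutenberg-Richter law log10 N(>=m) = a - b*m, for a catalog complete
    above magnitude m_min."""
    mean_m = sum(magnitudes) / len(magnitudes)
    return math.log10(math.e) / (mean_m - m_min)

# Synthetic catalog drawn from a G-R law with b = 1.0 above m_min = 3.0;
# magnitudes above the completeness level are exponentially distributed.
random.seed(0)
b_true, m_min = 1.0, 3.0
beta = b_true * math.log(10.0)
catalog = [m_min + random.expovariate(beta) for _ in range(50000)]
b_est = gutenberg_richter_b(catalog, m_min)
print(round(b_est, 2))
```

With a large enough catalog the estimator recovers the b-value used to generate the magnitudes; the non-extensive formula of the paper departs from this pure exponential form at the extremes of the magnitude range.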

  16. Deterministic earthquake scenarios for the city of Sofia

    International Nuclear Information System (INIS)

    Slavov, S.; Paskaleva, I.; Kouteva, M.; Vaccari, P.; Panza, G.F.

    2002-08-01

    The city of Sofia is exposed to a high seismic risk. Macroseismic intensities in the range of VIII-X (MSK) can be expected in the city. The earthquakes that can influence the hazard at Sofia either originate beneath the city or are caused by seismic sources located within a radius of 40 km. The city of Sofia is also exposed to the remote Vrancea seismic zone in Romania, to which the long-period elements of the built environment are particularly vulnerable. The high seismic risk and the lack of instrumental recordings of the regional seismicity make it necessary to use appropriate, credible earthquake scenarios and ground motion modelling approaches to define the seismic input for the city of Sofia. Complete synthetic seismic signals, due to several earthquake scenarios, were computed along chosen geological profiles crossing the city, applying a hybrid technique based on the modal summation technique and finite differences. The modelling takes into account simultaneously the geotechnical properties of the site, the position and geometry of the seismic source, and the mechanical properties of the propagation medium. Acceleration, velocity and displacement time histories and related quantities of earthquake engineering interest (e.g. response spectra, ground motion amplification along the profiles) have been supplied. The approach applied in this study allows us to obtain the definition of the seismic input at low cost, exploiting large quantities of existing data (e.g. geotechnical, geological, seismological). It may be efficiently used to estimate the ground motion for the purposes of microzonation, urban planning, retrofitting or insurance of the built environment, etc. (author)

  17. Surface latent heat flux as an earthquake precursor

    Directory of Open Access Journals (Sweden)

    S. Dey

    2003-01-01

    Full Text Available The analysis of surface latent heat flux (SLHF) from the epicentral regions of five recent earthquakes that occurred in close proximity to the oceans has been found to show anomalous behavior. The maximum increase of SLHF is found 2–7 days prior to the main earthquake event. This increase is likely due to an ocean-land-atmosphere interaction. The increase of SLHF prior to the main earthquake event is attributed to the increase in infrared thermal (IR) temperature in the epicentral and surrounding region. The anomalous increase in SLHF shows great potential for providing early warning of a disastrous earthquake, provided that there is a better understanding of the background noise due to the tides and monsoon in surface latent heat flux. Efforts have been made to understand the level of background noise in the epicentral regions of the five earthquakes considered in the present paper. A comparison of SLHF from the epicentral regions of the coastal earthquakes and of the earthquakes that occurred far away from the coast has been made, and it has been found that the anomalous behavior of SLHF prior to the main earthquake event is only associated with the coastal earthquakes.

  18. Assessing the credibility of diverting through containment penetrations

    International Nuclear Information System (INIS)

    Cooley, J.N.; Swindle, D.W. Jr.

    1980-01-01

    A viable approach has been developed for identifying those containment penetrations in a nuclear fuel reprocessing plant which are credible diversion routes. The approach is based upon systematic engineering and design analyses and is applied to each type of penetration to determine which penetrations could be utilized to divert nuclear material from a reprocessing facility. The approach is described and the results of an application are discussed. In addition, the concept of credibility is developed and discussed. For a typical reprocessing plant design, the number of penetrations determined to be credible without process or piping modifications was approx. 16% of the penetrations originally identified

  19. Institutional credibility and the future of Danish journalism

    DEFF Research Database (Denmark)

    Ørsten, Mark; Burkal, Rasmus

    Credibility is frequently represented as both an ideal goal for journalism as a profession (Vultree, 2010) and as an integral part of the survival strategy of the news industry (Meyer, 2004). Yet there exists no widely accepted operationalization of the concept of credibility. In this paper we...... ethics among Danish journalists....

  20. eWOM credibility on social networking sites: A framework

    OpenAIRE

    Moran, Gillian; Muzellec, Laurent

    2017-01-01

    Social networking sites (SNS) offer brands the ability to spread positive electronic Word of Mouth (eWOM) for the purposes of building awareness and acquiring new customers. However, the credibility of eWOM is threatened of late as marketers increasingly try to manipulate eWOM practices on SNS. A greater understanding of eWOM credibility is necessary to better enable marketers to leverage true consumer engagement by generating credible peer-to-peer communications. Yet, to date, there is no on...

  1. Relating stick-slip friction experiments to earthquake source parameters

    Science.gov (United States)

    McGarr, Arthur F.

    2012-01-01

    Analytical results for parameters, such as static stress drop, for stick-slip friction experiments, with arbitrary input parameters, can be determined by solving an energy-balance equation. These results can then be related to a given earthquake based on its seismic moment and the maximum slip within its rupture zone, assuming that the rupture process entails the same physics as stick-slip friction. This analysis yields overshoots and ratios of apparent stress to static stress drop of about 0.25. The inferred earthquake source parameters static stress drop, apparent stress, slip rate, and radiated energy are robust inasmuch as they are largely independent of the experimental parameters used in their estimation. Instead, these earthquake parameters depend on C, the ratio of maximum slip to the cube root of the seismic moment. C is controlled by the normal stress applied to the rupture plane and the difference between the static and dynamic coefficients of friction. Estimating yield stress and seismic efficiency using the same procedure is only possible when the actual static and dynamic coefficients of friction are known within the earthquake rupture zone.
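
The controlling parameter C defined in this abstract, the ratio of maximum slip to the cube root of the seismic moment, is straightforward to compute; the event values below are hypothetical, chosen only to show the units involved:

```python
def slip_moment_ratio(max_slip, seismic_moment):
    """C = maximum slip / cube root of seismic moment, the ratio the
    abstract identifies as controlling the inferred source parameters.
    Units follow the inputs; metres and newton-metres are assumed here."""
    return max_slip / seismic_moment ** (1.0 / 3.0)

# Hypothetical magnitude ~2 event: seismic moment 1.6e12 N*m, 10 mm slip.
c = slip_moment_ratio(0.010, 1.6e12)
print(f"C = {c:.2e} m / (N*m)^(1/3)")
```

Per the abstract, C in turn reflects the normal stress on the rupture plane and the difference between the static and dynamic friction coefficients.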

  2. Tsunami evacuation plans for future megathrust earthquakes in Padang, Indonesia, considering stochastic earthquake scenarios

    Directory of Open Access Journals (Sweden)

    A. Muhammad

    2017-12-01

    This study develops tsunami evacuation plans in Padang, Indonesia, using a stochastic tsunami simulation method. The stochastic results are based on multiple earthquake scenarios for different magnitudes (Mw 8.5, 8.75, and 9.0) that reflect asperity characteristics of the 1797 historical event in the same region. The generation of the earthquake scenarios involves probabilistic models of earthquake source parameters and stochastic synthesis of earthquake slip distributions. In total, 300 source models are generated to produce comprehensive tsunami evacuation plans in Padang. The tsunami hazard assessment results show that Padang may face significant tsunamis causing maximum tsunami inundation heights and depths of 15 and 10 m, respectively. A comprehensive tsunami evacuation plan, including horizontal evacuation area maps, assessment of temporary shelters considering the impact due to ground shaking and tsunami, and integrated horizontal–vertical evacuation time maps, has been developed based on the stochastic tsunami simulation results. The developed evacuation plans highlight that comprehensive mitigation policies can be produced from the stochastic tsunami simulation for future tsunamigenic events.

  3. Characteristic behavior of underground and semi-underground structure at earthquake

    International Nuclear Information System (INIS)

    Sawada, Yoshihiro; Komada, Hiroya

    1985-01-01

    An appropriate earthquake-resistant repository design is required to ensure the safety of the radioactive wastes (shallow or deep ground disposal of low- and high-level wastes, respectively). It is particularly important to understand the propagation characteristics of seismic waves and the behaviors of underground hollow structures at the time of an earthquake. This report deals with seismologic observations of rock beds and undergound structures. The maximum acceleration deep under the ground is found to be about 1/2 - 1/3 of that at the ground surface or along the rock bed in the horizontal direction and about 1/1 - 1/2 in the longitudinal direction. A large attenuation cannot be expected in shallow ground. The decrease in displacement amplitude is small compared to that in acceleration. The attenuation effect is larger for a small earthquake and at a short hypocentral distance. The attenuation factor reaches a maximum at a depth of several tens of meters. The seismic spectrum under the ground is flatter than that at the surface. The maximum acceleration along the side wall of a cavity is almost the same as that in the surrounding rock bed. An underground cavity shows complicated phase characteristics at the time of a small earthquake at a short hypocentral distance. (Nogami, K.)

  4. How Organizational Design Can Make Delegation Credible

    OpenAIRE

    Foss, Kirsten; Foss, Nicolai J.

    2005-01-01

    Credible delegation of discretion obtains when it is a rational strategy for managers not to overrule employee decisions that are based on delegated decision rights or renege on the level of delegated discretion (and this is common knowledge). Making delegation of discretion credible becomes a crucial issue when organizations want to sustain the advantages that may flow from delegation: Such advantages are dependent on motivated employees, and managerial overruling or reneging is harmful to m...

  5. Extension and Application of Credibility Models in Predicting Claim Frequency

    Directory of Open Access Journals (Sweden)

    Yuan-tao Xie

    2018-01-01

    In nonlife actuarial science, credibility models are one of the main methods of experience ratemaking. The Bühlmann-Straub credibility model can be expressed as a special case of linear mixed models (LMMs) under an underlying assumption of normality. In this paper, we extend the assumptions of the Bühlmann-Straub model to include the Poisson and negative binomial distributions, as they are more appropriate for describing the distribution of the number of claims. By using the framework of generalized linear mixed models (GLMMs), we obtain generalized credibility premiums that contain, as particular cases, other credibility premiums in the literature. Compared to generalized linear mixed models, our extended credibility models also have the advantage that the credibility factor falls into the range from 0 to 1. The performance of our models in comparison with an existing model in the literature is evaluated through numerical studies, which show that our approach produces premium estimates close to the optima. In addition, our proposed model can also be applied to the most commonly used ratemaking approaches, namely the net premium and the optimal Bonus-Malus system.
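
For background, the classical normal-theory premium that the Bühlmann-Straub model generalizes can be sketched as follows. This is the simple Bühlmann estimator with equal exposures per period (a special case of Bühlmann-Straub), and the claim histories are invented for illustration:

```python
def buhlmann_premiums(claims_by_risk):
    """Simple Buhlmann credibility premiums for equally weighted periods.

    Each risk i receives Z * Xbar_i + (1 - Z) * Xbar, with credibility
    factor Z = n / (n + v/a), where v is the expected within-risk process
    variance and a the variance of the hypothetical means."""
    r = len(claims_by_risk)
    n = len(claims_by_risk[0])
    means = [sum(h) / n for h in claims_by_risk]
    grand = sum(means) / r
    v = sum(sum((x - means[i]) ** 2 for x in h) / (n - 1)
            for i, h in enumerate(claims_by_risk)) / r
    a = max(sum((m - grand) ** 2 for m in means) / (r - 1) - v / n, 0.0)
    z = n / (n + v / a) if a > 0 else 0.0
    return [z * m + (1.0 - z) * grand for m in means]

# Invented claim counts for three risks over four periods:
histories = [[2, 3, 1, 2], [0, 1, 0, 1], [4, 5, 3, 4]]
premiums = buhlmann_premiums(histories)
print([round(p, 2) for p in premiums])
```

Each premium is shrunk from the risk's own mean toward the collective mean, with Z guaranteed to lie between 0 and 1, the property the paper preserves in its GLMM extension.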

  6. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran Kumar; Mai, Paul Martin

    2016-01-01

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show that the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.
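
A truncated exponential law of the kind fitted in this study can be sampled by inverting its CDF; the scale and truncation point below are hypothetical, not the paper's fitted parameters:

```python
import math
import random

def sample_truncated_exponential(scale, upper, rng):
    """Inverse-CDF draw from an exponential density truncated at `upper`:
    pdf(x) proportional to exp(-x / scale) on [0, upper]."""
    u = rng.random()
    return -scale * math.log(1.0 - u * (1.0 - math.exp(-upper / scale)))

# Hypothetical slip distribution: scale 2 m, maximum possible slip 5 m.
rng = random.Random(1)
scale, upper = 2.0, 5.0
draws = [sample_truncated_exponential(scale, upper, rng)
         for _ in range(100000)]
mean_slip = sum(draws) / len(draws)
print(round(mean_slip, 2), round(max(draws), 2))
```

No draw exceeds the truncation point, mirroring the physical bound on maximum slip, while the mean falls below the untruncated exponential mean of 2 m.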

  8. Repetition of large stress drop earthquakes on Wairarapa fault, New Zealand, revealed by LiDAR data

    Science.gov (United States)

    Delor, E.; Manighetti, I.; Garambois, S.; Beaupretre, S.; Vitard, C.

    2013-12-01

    We have acquired high-resolution LiDAR topographic data over most of the onland trace of the 120 km-long Wairarapa strike-slip fault, New Zealand. The Wairarapa fault broke in a large earthquake in 1855, and this historical earthquake is suggested to have produced up to 18 m of lateral slip at the ground surface. This would make the event remarkable, with a stress drop much higher than commonly observed for other earthquakes worldwide. The LiDAR data allowed us to examine the ground surface morphology along the fault. A statistical analysis of the cumulative offsets per segment reveals that the alluvial morphology has well recorded, at every step along the fault, no more than a few (3-6) well-distinct cumulative slips, all lower than 80 m. Plotted along the entire fault, the statistically defined cumulative slip values document four fairly continuous slip profiles that we attribute to the four most recent large earthquakes on the Wairarapa fault. The four slip profiles have a roughly triangular and asymmetric envelope shape that is similar to the coseismic slip distributions described for most large earthquakes worldwide. The four slip profiles have their maximum slip at the same place, in the northeastern third of the fault trace. The maximum slips vary from one event to another in the range 7-15 m; the most recent, 1855 earthquake produced a maximum coseismic slip of 15 ± 2 m at the ground surface. Our results thus confirm that the Wairarapa fault breaks in remarkably large stress drop earthquakes. Those repeating large earthquakes share both similar (rupture length, slip-length distribution, location of maximum slip) and distinct (maximum slip amplitudes) characteristics. Furthermore, the seismic behavior of the Wairarapa fault is markedly different from that of nearby large strike-slip faults (Wellington, Hope).
The reasons for those differences in rupture behavior might reside in the intrinsic properties of the broken faults, especially

  9. EVALUATION OF NATIONAL BANK OF ROMANIA MONETARY POLICY CREDIBILITY

    Directory of Open Access Journals (Sweden)

    Toader Valentin

    2008-05-01

    In this paper, using models from the economic literature, the authors study the credibility level of the National Bank of Romania (NBR) during the time span March 2007 – March 2008. We use three types of credibility indexes: two from the economic literature and one proposed by the authors. We also emphasize the impact of unpredictable shocks, namely the natural calamities (drought) which affected the aggregate supply in the summer of 2007 and the depreciation of the RON against the euro, on NBR credibility.

  10. Defeating Earthquakes

    Science.gov (United States)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the Century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the 2010 M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake.
GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  11. Brand emotional credibility: effects of mixed emotions about branded products with varying credibility.

    Science.gov (United States)

    Mileti, Antonio; Prete, M Irene; Guido, Gianluigi

    2013-10-01

    This research investigates the effects of mixed emotions on the positioning and on the intention to purchase different categories of branded products (i.e., Attractiveness-products, Expertise-products, and Trustworthiness-products), in relation to their main component of credibility (Ohanian, 1990). On the basis of a focus group (n = 12) aimed to identify the three branded products used as stimuli and a pre-test (n = 240) directed to discover emotions elicited by them, two studies (n = 630; n = 240) were carried out. Positioning and multiple regression analyses showed that positive and negative emotions are positively related with the positioning and the purchase intention of Attractiveness-products, and, respectively, positively and negatively related with those of Trustworthiness-products; whereas negative emotions are negatively associated with those of Expertise-products. Brand Emotional Credibility--i.e., the emotional believability of the brand positioning signals--may help to identify unconscious elements and the simultaneous importance of mixed emotions associated with different products to match consumers' desires and expectations.

  12. Exchange-Rate-Based Stabilization under Imperfect Credibility

    OpenAIRE

    Guillermo Calvo; Carlos A. Végh Gramont

    1991-01-01

    This paper analyzes stabilization policy under predetermined exchange rates in a cash-in-advance, staggered-prices model. Under full credibility, a reduction in the rate of devaluation results in an immediate and permanent reduction in the inflation rate, with no effect on output or consumption. In contrast, a non-credible stabilization results in an initial expansion of output, followed by a later recession. The inflation rate of home goods remains above the rate of devaluation throughout th...

  13. Broadband records of earthquakes in deep gold mines and a comparison with results from SAFOD, California

    Science.gov (United States)

    McGarr, Arthur F.; Boettcher, M.; Fletcher, Jon Peter B.; Sell, Russell; Johnston, Malcolm J.; Durrheim, R.; Spottiswoode, S.; Milev, A.

    2009-01-01

    For one week during September 2007, we deployed a temporary network of field recorders and accelerometers at four sites within two deep, seismically active mines. The ground-motion data, recorded at 200 samples/sec, are well suited to determining source and ground-motion parameters for the mining-induced earthquakes within and adjacent to our network. Four earthquakes with magnitudes close to 2 were recorded with high signal/noise at all four sites. Analysis of seismic moments and peak velocities, in conjunction with the results of laboratory stick-slip friction experiments, was used to estimate source processes that are key to understanding source physics and to assessing underground seismic hazard. The maximum displacements on the rupture surfaces can be estimated from a parameter that combines the peak ground velocity at a given recording site with the hypocentral distance R. For each earthquake, the maximum slip and seismic moment can be combined with results from laboratory friction experiments to estimate the maximum slip rate within the rupture zone. Analysis of the four M 2 earthquakes recorded during our deployment and one of special interest recorded by the in-mine seismic network in 2004 revealed maximum slips ranging from 4 to 27 mm and maximum slip rates from 1.1 to 6.3 m/sec. Applying the same analyses to an M 2.1 earthquake within a cluster of repeating earthquakes near the San Andreas Fault Observatory at Depth site, California, yielded similar results for maximum slip and slip rate: 14 mm and 4.0 m/sec.

  14. Topic familiarity and information skills in online credibility evaluation

    NARCIS (Netherlands)

    Lucassen, T.; Schraagen, J.M.C.

    2013-01-01

    With the rise of user generated content, evaluating the credibility of information has become increasingly important. It is already known that various user characteristics influence the way credibility evaluation is performed. Domain experts on the topic at hand primarily focus on semantic features

  15. Impact of Celebrity Credibility on Advertising Effectiveness

    Directory of Open Access Journals (Sweden)

    Sadia Aziz

    2013-05-01

    Full Text Available Advertisers often make use of endorsers or representatives as trustworthy sources of persuasion for consumers' attitudes. Promotion of products through celebrities is a trendy advertising practice around the world. The present study judged the impact of celebrity credibility on advertising effectiveness in terms of consumers' attitude towards the advertisement, attitude towards the brand, and their purchase intention. This study also explored the differences in respondents' responses to advertisements of a brand featuring famous celebrities as well as unknown celebrities. Different TV advertisements were used for the experiment. Several statistical tools were applied to test the hypotheses and identify significant differences and the proposed relationships among the variables. Overall, the findings suggest that the respondents considered the famous celebrities of the brand the most credible, having a positive impact on consumers' attitude towards the advertisement, attitude towards the brand, and favorable purchase intentions, as compared to an unknown celebrity with less credibility.

  16. Conquering Credibility for Monetary Policy Under Sticky Confidence

    Directory of Open Access Journals (Sweden)

    Jaylson Jair da Silveira

    2015-06-01

    Full Text Available We derive a best-reply monetary policy when the confidence by price setters in the monetary authority's commitment to price level targeting may be both incomplete and sticky. We find that complete confidence (or full credibility) is not a necessary condition for the achievement of a price level target, even when heterogeneity in firms' price level expectations is endogenously time-varying and may emerge as a long-run equilibrium outcome. In fact, in the absence of exogenous perturbations to the dynamics of confidence building, it is rather the achievement of a price level target for long enough that, due to stickiness in the state of confidence, ensures the conquering of full credibility. This result has relevant implications for the conduct of monetary policy in pursuit of price stability. One implication is that setting a price level target matters more as a means to provide monetary policy with a sharper focus on price stability than as a device to conquer credibility. As regards the conquering of credibility for monetary policy, it turns out that actions speak louder than words, as the continuing achievement of price stability is what ultimately performs better as a confidence-building device.

  17. Comparison of the inelastic response of steel building frames to strong earthquake and underground nuclear explosion ground motion

    International Nuclear Information System (INIS)

    Murray, R.C.; Tokarz, F.J.

    1976-01-01

    Analytic studies were made of the adequacy of simulating earthquake effects at the Nevada Test Site for structural testing purposes. It is concluded that underground nuclear explosion ground motion will produce inelastic behavior and damage comparable to that produced by strong earthquakes. The generally longer duration of earthquakes compared with underground nuclear explosions does not appear to significantly affect the structural behavior of the building frames considered. A comparison of maximum ductility ratios, maximum story drifts, and maximum displacements indicates similar structural behavior for both types of ground motion. Low-yield (10-kt) underground nuclear explosions are capable of producing inelastic behavior in large structures. Ground motion produced by underground nuclear explosions can produce inelastic earthquake-like effects in large structures and could be used for testing large structures in the inelastic response regime. The Nevada Test Site is a feasible earthquake simulator for testing large structures.

  18. The Effect of Top-Level Domains and Advertisements on Health Web Site Credibility

    Science.gov (United States)

    Wang, Zuoming; Loh, Tracy

    2004-01-01

    Background Concerns over health information on the Internet have generated efforts to enhance credibility markers; yet how users actually assess the credibility of online health information is largely unknown. Objective This study set out to (1) establish a parsimonious and valid questionnaire instrument to measure credibility of Internet health information by drawing on various previous measures of source, news, and other credibility scales; and (2) to identify the effects of Web-site domains and advertising on credibility perceptions. Methods Respondents (N = 156) examined one of 12 Web-site mock-ups and completed credibility scales in a 3 x 2 x 2 between-subjects experimental design. Factor analysis and validity checks were used for item reduction, and analysis of variance was employed for hypothesis testing of Web-site features' effects. Results In an attempt to construct a credibility instrument, three dimensions of credibility (safety, trustworthiness, and dynamism) were retained, reflecting traditional credibility sub-themes, but composed of items from disparate sources. When testing the effect of the presence or absence of advertising on a Web site on credibility, we found that this depends on the site's domain, with a trend for advertisements having deleterious effects on the credibility of sites with .org domain, but positive effects on sites with .com or .edu domains. Conclusions Health-information Web-site providers should select domains purposefully when they can, especially if they must accept on-site advertising. Credibility perceptions may not be invariant or stable, but rather are sensitive to topic and context. Future research may employ these findings in order to compare other forms of health-information delivery to optimal Web-site features. PMID:15471750

  19. Does the Credible Fiscal Policy Support the Prices Stabilization?

    Directory of Open Access Journals (Sweden)

    Kuncoro Haryo

    2015-06-01

    Full Text Available This paper aims at analyzing the co-movement between fiscal policy and monetary policy rules in the context of price stabilization. More specifically, we observe the potential impact of fiscal policy credibility on price stabilization in the inflation targeting framework. Motivated by the fact that empirical studies concerning this aspect are still limited, we take the case of Indonesia over the period 2001-2013. Based on the quarterly data analysis, we found that the impact of credibility typically depends on the characteristics of the fiscal rule commitment. On the one hand, the credibility of the debt rule reduces the inflation rate. In contrast, the incredible deficit rule policy has no impact on the inflation rate and therefore does not support inflation targeting. Given those results, we conclude that credibility matters in stabilizing price levels. Accordingly, those findings suggest tightening the coordination between monetary and fiscal policy to maintain fiscal sustainability in accordance with price stabilization policy.

  20. Credibility judgments in web page design - a brief review.

    Science.gov (United States)

    Selejan, O; Muresanu, D F; Popa, L; Muresanu-Oloeriu, I; Iudean, D; Buzoianu, A; Suciu, S

    2016-01-01

    Today, more than ever, it is accepted that the analysis of interface appearance is a crucial point in the field of human-computer interaction. As nowadays virtually anyone can publish information on the web, the role of credibility has grown increasingly important in relation to web-based content. Areas like trust, credibility, and behavior, together with overall impression and user expectation, are today in the spotlight of research, compared with earlier periods when more pragmatic areas such as usability and utility were the focus. Credibility has been discussed as a theoretical construct in the field of communication in the past decades, and research has revealed that people tend to evaluate the credibility of communication primarily by the communicator's expertise. Other factors involved in the content communication process are trustworthiness and dynamism, as well as various other criteria to a lower extent. In this brief review, factors like web page aesthetics, browsing experience, and user experience are considered.

  1. Public confidence in local management officials: organizational credibility and emergency behavior

    Energy Technology Data Exchange (ETDEWEB)

    Sorensen, J.H.

    1984-01-01

    Confidence issues create potential risks for the public in any emergency situation. They do so because credibility and associated perceptions of legitimacy and competency of organizations are determinants of human behavior in disasters. Credibility, however, is only one of numerous factors that shape response of people or organizations to a threatening event. The purposes of this paper are to review what is known about the way in which credibility and related constructs influence emergency response, discuss how this knowledge applies to radiological emergency planning, and suggest how credibility-induced risk can be minimized in emergency planning and response.

  2. Public confidence in local management officials: organizational credibility and emergency behavior

    International Nuclear Information System (INIS)

    Sorensen, J.H.

    1984-01-01

    Confidence issues create potential risks for the public in any emergency situation. They do so because credibility and associated perceptions of legitimacy and competency of organizations are determinants of human behavior in disasters. Credibility, however, is only one of numerous factors that shape response of people or organizations to a threatening event. The purposes of this paper are to review what is known about the way in which credibility and related constructs influence emergency response, discuss how this knowledge applies to radiological emergency planning, and suggest how credibility-induced risk can be minimized in emergency planning and response.

  3. How Experienced SoTL Researchers Develop the Credibility of Their Work

    Directory of Open Access Journals (Sweden)

    Jennie Billot

    2017-03-01

    Full Text Available Teaching and learning research in higher education, often referred to as the Scholarship of Teaching and Learning (SoTL, is still relatively novel in many academic contexts compared to the mainstay of disciplinary research. One indication of this is the challenges those who engage in SoTL report in terms of how this work is valued or considered credible amongst disciplinary colleagues and in the face of institutional policies and practices. This paper moves beyond the literature that describes these specific challenges to investigate how 23 experienced SoTL researchers from five different countries understood the notion of credibility in relationship to their SoTL research and how they went about developing credibility for their work. Semi-structured interviews were conducted and analyzed using inductive analysis. Findings indicate that notions of credibility encompassed putting SoTL research into action and building capacity and community around research findings, as well as gaining external validation through traditional indicators such as publishing. SoTL researchers reported a variety of strategies and approaches they were using, both formal and informal, to develop credibility for their work. The direct focus of this paper on credibility of SoTL work as perceived by experienced SoTL researchers, and how they go about developing credibility, is a distinct contribution to the discussions about the valuing of SoTL work.

  4. Credibility of experts and institutions in emergency situations

    International Nuclear Information System (INIS)

    Renn, O.; Kastenholz, H.G.

    1997-01-01

    This article deals with the most important results of communication and psychological studies on credibility and trust in emergency situations. The central insight of this paper is that communication about the different elements of credibility needs to start before any emergency in order to build trust among the various constituents. In addition, emergency-handling institutions need to reflect on the experiences they have gained during an emergency and involve the public in a social learning process to improve the existing emergency plans. Credibility of institutions relies on the congruence between the expectations of the public with respect to the demanded performance and the perception of the actual performance. Any institution needs to adjust its communication about public expectations as well as institute sufficient control and monitoring to improve actual performance. (orig.)

  5. Deposit Insurance Coverage, Credibility of Non-insurance, and Banking Crises

    DEFF Research Database (Denmark)

    Angkinand, Apanard; Wihlborg, Clas

    2005-01-01

    The ambiguity in existing empirical work with respect to effects of deposit insurance schemes on banks' risk-taking can be resolved if it is recognized that absence of deposit insurance is rarely credible and that the credibility of non-insurance can be enhanced by explicit deposit insurance schemes. We show that, under reasonable conditions for the effects on risk-taking of creditor protection in banking, and for the effects on the credibility of non-insurance of explicit coverage of deposit insurance schemes, there exists a partial level of coverage that maximizes market discipline and minimizes moral hazard. Reducing coverage to this level requires analyses of institutional factors affecting the credibility of non-insurance. In particular, the implementation of effective distress resolution procedures for banks would allow governments to reduce explicit deposit insurance coverage and, thereby, to strengthen market discipline.

  6. Vigilance and reason - The keys to continued credibility

    International Nuclear Information System (INIS)

    Arlotto, G.A.

    1994-01-01

    I have entitled my speech "Vigilance and Reason - The Keys to Continued Credibility". A partitioning of the words gives insight into my view of where we are, where we may be going, and that we have control of our fate. "Continued" indicates that I believe, at present, the American Society of Mechanical Engineers (ASME) Codes and Standards process has credibility. "Keys" connotes that we are at a crossroads and something must be done to stay on track. And "Vigilance and Reason" suggests that we can achieve our goal of continued credibility through exercising vigilance and reason. We cannot rest on our laurels. We must take conscious, positive actions to strengthen the credibility of our products; otherwise we will backslide. It is up to us.

  7. Earthquake prediction rumors can help in building earthquake awareness: the case of May the 11th 2011 in Rome (Italy)

    Science.gov (United States)

    Amato, A.; Arcoraci, L.; Casarotti, E.; Cultrera, G.; Di Stefano, R.; Margheriti, L.; Nostro, C.; Selvaggi, G.; May-11 Team

    2012-04-01

    Banner headlines in an Italian newspaper read on May 11, 2011: "Absence boom in offices: the urban legend in Rome become psychosis". This was the effect of a large-magnitude earthquake prediction in Rome for May 11, 2011. This prediction was never officially released, but it grew on the Internet and was amplified by the media. It was erroneously ascribed to Raffaele Bendandi, an Italian self-taught natural scientist who studied planetary motions and related them to earthquakes. Indeed, around May 11, 2011, there was a planetary alignment, and this increased the earthquake prediction's credibility. Given the echo of this earthquake prediction, INGV decided to organize on May 11 (the same day the earthquake was predicted to happen) an Open Day at its headquarters in Rome to inform on the Italian seismicity and the earthquake physics. The Open Day was preceded by a press conference two days before, attended by about 40 journalists from newspapers, local and national TV's, press agencies and web news magazines. Hundreds of articles appeared in the following two days, advertising the 11 May Open Day. On May 11 the INGV headquarters was peacefully invaded by over 3,000 visitors from 9am to 9pm: families, students, civil protection groups and many journalists. The program included conferences on a wide variety of subjects (from social impact of rumors to seismic risk reduction) and distribution of books and brochures, in addition to several activities: meetings with INGV researchers to discuss scientific issues, visits to the seismic monitoring room (open 24h/7 all year), guided tours through interactive exhibitions on earthquakes and Earth's deep structure. During the same day, thirteen new videos were also posted on our youtube/INGVterremoti channel to explain the earthquake process and hazard, and to provide real time periodic updates on seismicity in Italy. On May 11 no large earthquake happened in Italy. The initiative, built up in few weeks, had a very large feedback

  8. Mathematical Validation and Credibility of Diagnostic Blocks for Spinal Pain.

    Science.gov (United States)

    Engel, Andrew J; Bogduk, Nikolai

    2016-10-01

    Diagnostic blocks are used in different ways for the diagnosis of spinal pain, but their validity has not been fully evaluated. Four clinical protocols were analyzed mathematically to determine the probability of correct responses arising by chance. The complement of this probability was adopted as a measure of the credibility of correct responses. The credibility of responses varied from 50% to 95%, and was determined less by the agents used than by what information was given to patients and whether the agents were fully randomized for each block. Randomized, comparative local anesthetic blocks offer a credibility of 75%, but randomized, placebo-controlled blocks provide a credibility of 95%, and are thereby suitable as a criterion standard for diagnostic blocks. © 2016 American Academy of Pain Medicine. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
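    The paper's credibility measure is the complement of the chance probability. A minimal sketch of that arithmetic, under the simplifying (and hypothetical) assumption that each fully randomized block is an independent binary outcome guessed correctly with probability 0.5:

```python
def chance_probability(p_guess, n_blocks):
    """Probability that an apparently correct response pattern arises purely
    by chance, with independent blocks each guessed correctly with p_guess."""
    return p_guess ** n_blocks

def credibility(p_guess, n_blocks):
    """Credibility of a correct response = 1 - chance probability."""
    return 1.0 - chance_probability(p_guess, n_blocks)

# Two fully randomized binary-outcome blocks: chance = 0.25, credibility = 0.75
print(credibility(0.5, 2))  # 0.75
```

    Under this toy model, adding randomized controls lowers the chance probability and raises the credibility, which is the qualitative pattern the paper reports (75% for comparative blocks, 95% for placebo-controlled blocks).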

  9. Between Sunda subduction and Himalayan collision: fertility, people and earthquakes on the Ganges-Brahmaputra Delta

    Science.gov (United States)

    Seeber, L.; Steckler, M. S.; Akhter, S. H.; Goodbred, S. L., Jr.; Gale, J.; McHugh, C. M.; Ferguson, E. K.; Mondal, D. R.; Paola, C.; Reitz, M. D.; Wilson, C.

    2014-12-01

    A foreland (Ganges) and a suture (Brahmaputra) river, which both drain the Himalaya, have coalesced to form the Ganges-Brahmaputra Delta (GBD), the world's largest. The GBD progrades along the continental margin, coupled with an advancing subduction-to-collision transition, deforming the delta as it grows. A better understanding of this time-transgressive system is urgent now that humans are increasing their forcing of the system and exposure to environmental hazards. Among these, earthquake risk is rapidly growing as people move from rural settings into expanding cities, creating unprecedented exposure. The megathrust 1950 M8.7 earthquake in Assam occurred during the monsoon and released 10x the annual sediment load, causing progradation at the coast and a pulse of river widening that propagated downstream. The 1762 M8.8(?) along the Arakan coast extended into the shelf of the delta, where coastal tsunami deposits have been identified recently. These events bracket a segment with no credible historic megathrust earthquakes, but one that could affect far more people. Geodetic and geologic data along this 300 km boundary facing the GBD show oblique contraction. The subaerial accretionary prism (Burma Ranges) is up to 250 km wide, with a blind thrust front that reaches halfway across the delta. The GPS convergence rate of 14 mm/y is consistent with large displacements and long interseismic times, which can account for the lack of historic ruptures, but also for the potential for catastrophic events. Active folds and shallow thrust earthquakes point to an additional threat from upper-plate seismicity. Much of the current seismicity is in the lower plate and reaches as far west as Dhaka; it may pose an immediate threat. The folds, and the uplift and subsidence patterns, also influence the courses of the rivers. North of the delta, the Shillong plateau is a huge basement-cored anticline bounded by the north-dipping Dauki thrust fault, with 7 mm/y of N-S shortening and 5 km of structural relief here

  10. Multi-Sensors Observations of Pre-Earthquake Signals. What We Learned from the Great Tohoku Earthquake?

    Science.gov (United States)

    Ouzonounov, D.; Pulinets, S.; Papadopoulos, G.; Kunitsyn, V.; Nesterov, I.; Hattori, K.; Kafatos, M.; Taylor, P.

    2012-01-01

    The lessons learned from the Great Tohoku earthquake (Japan, 2011) and how they will shape our future observations and analysis are the main focus of this presentation. Multi-sensor observations and multidisciplinary research are presented in our study of the phenomena preceding major earthquakes. Our approach is based on a systematic analysis of several physical and environmental parameters that have been reported by others in connection with earthquake processes: thermal infrared radiation; temperature; concentration of electrons in the ionosphere; radon/ion activities; and atmospheric temperature/humidity [Ouzounov et al., 2011]. We used the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model, one of several possible paradigms [Pulinets and Ouzounov, 2011], to interpret our observations. We retrospectively analyzed the temporal and spatial variations of three different physical parameters characterizing the state of the atmosphere, the ionosphere, and the ground surface several days before the March 11, 2011 M9 Tohoku earthquake, namely: (i) outgoing longwave radiation (OLR) measured at the top of the atmosphere; (ii) anomalous variations of ionospheric parameters revealed by multi-sensor observations; and (iii) the change in the foreshock sequence (rate, space, and time). Our results show that on March 8th, 2011 a rapid increase of emitted infrared radiation was observed and an anomaly developed near the epicenter, with the largest value occurring on March 11 at 07.30 LT. The GPS/TEC data indicate an increase and variation in electron density reaching a maximum value on March 8. Starting from this day, an abnormal TEC variation was also observed over the epicenter in the lower ionosphere. From March 3 to 11 a large increase in electron concentration was recorded at all four Japanese ground-based ionosondes, which returned to normal after the main earthquake. We use the Japanese GPS network stations and the method of radio tomography to study the spatiotemporal structure of ionospheric

  11. Tidal controls on earthquake size-frequency statistics

    Science.gov (United States)

    Ide, S.; Yabe, S.; Tanaka, Y.

    2016-12-01

    The possibility that tidal stresses can trigger earthquakes is a long-standing issue in seismology. Except in some special cases, a causal relationship between seismicity and the phase of tidal stress has been rejected on the basis of studies using many small events. However, recently discovered deep tectonic tremors are highly sensitive to tidal stress levels, with the relationship being governed by a nonlinear law according to which the tremor rate increases exponentially with increasing stress; thus, slow deformation (and the probability of earthquakes) may be enhanced during periods of large tidal stress. Here, we show the influence of tidal stress on seismicity by calculating histories of tidal shear stress during the 2-week period before earthquakes. Very large earthquakes tend to occur near the time of maximum tidal stress, but this tendency is not obvious for small earthquakes. Rather, we found that tidal stress controls the earthquake size-frequency statistics; i.e., the fraction of large events increases (i.e. the b-value of the Gutenberg-Richter relation decreases) as the tidal shear stress increases. This correlation is apparent in data from the global catalog and in relatively homogeneous regional catalogues of earthquakes in Japan. The relationship is also reasonable, considering the well-known relationship between stress and the b-value. Our findings indicate that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. This finding has clear implications for probabilistic earthquake forecasting.
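    The reported b-value link between tidal stress and event size can be made concrete with the Gutenberg-Richter relation, log10 N(>=M) = a - b*M. A small sketch using illustrative magnitudes and b-values (not the study's fitted numbers):

```python
def fraction_at_or_above(m_thresh, m_min, b):
    """Under Gutenberg-Richter, the fraction of events with M >= m_thresh
    among all events with M >= m_min is 10^(-b * (m_thresh - m_min))."""
    return 10 ** (-b * (m_thresh - m_min))

# A lower b-value (as reported under high tidal shear stress) raises the
# share of large events in the catalog.
for b in (1.0, 0.9, 0.8):
    frac = fraction_at_or_above(7.0, 5.0, b)
    print(f"b = {b}: fraction of M>=7 among M>=5 events = {frac:.4f}")
```

    For example, dropping b from 1.0 to 0.8 more than doubles the fraction of M>=7 events among M>=5 events (from 1.0% to about 2.5%), which is the sense in which a stress-dependent b-value modulates the probability of large ruptures.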

  12. Strong motion duration and earthquake magnitude relationships

    International Nuclear Information System (INIS)

    Salmon, M.W.; Short, S.A.; Kennedy, R.P.

    1992-06-01

    Earthquake duration is the total time of ground shaking from the arrival of seismic waves until the return to ambient conditions. Much of this time is at relatively low shaking levels which have little effect on seismic structural response and on earthquake damage potential. As a result, a parameter termed "strong motion duration" has been defined by a number of investigators to be used for the purpose of evaluating seismic response and assessing the potential for structural damage due to earthquakes. This report presents methods for determining strong motion duration and a time history envelope function appropriate for various evaluation purposes, for earthquake magnitude and distance, and for site soil properties. There are numerous definitions of strong motion duration. For most of these definitions, empirical studies have been completed which relate duration to earthquake magnitude and distance and to site soil properties. Each of these definitions recognizes that only the portion of an earthquake record which has sufficiently high acceleration amplitude, energy content, or some other parameter significantly affects seismic response. Studies have been performed which indicate that the portion of an earthquake record in which the power (average rate of energy input) is maximum correlates most closely with potential damage to stiff nuclear power plant structures. Hence, this report concentrates on energy-based strong motion duration definitions.
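    One widely used energy-based definition (offered here as an illustration, not necessarily the definition the report settles on) is the significant duration: the time between the instants at which the cumulative squared acceleration, proportional to Arias intensity, reaches 5% and 95% of its final value. A minimal sketch for a uniformly sampled record:

```python
def significant_duration(accel, dt, lo=0.05, hi=0.95):
    """Time between the lo and hi fractions of the final cumulative squared
    acceleration (proportional to Arias intensity) of a sampled record."""
    cum, total = [], 0.0
    for a in accel:
        total += a * a * dt
        cum.append(total)
    t_lo = next(i for i, c in enumerate(cum) if c >= lo * total) * dt
    t_hi = next(i for i, c in enumerate(cum) if c >= hi * total) * dt
    return t_hi - t_lo

# A record whose energy arrives in a short burst has a short strong-motion
# duration even if low-level shaking continues afterwards.
record = [0.0] * 10 + [1.0] * 10 + [0.0] * 10   # synthetic record, dt = 0.01 s
print(significant_duration(record, 0.01))
```

    This captures the point made above: only the high-energy portion of the record counts toward the duration used in seismic response evaluation.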

  13. Earthquake responses of a beam supported by a mechanical snubber

    International Nuclear Information System (INIS)

    Ohmata, Kenichiro; Ishizu, Seiji.

    1989-01-01

    The mechanical snubber is an earthquake-proof device for piping systems under particular circumstances such as high temperature and radioactivity. It has nonlinearities in both load and frequency response. In this report, the resisting force characteristics of the snubber and the earthquake responses of piping (a simply supported beam) supported by the snubber are simulated using Continuous System Simulation Language (CSSL). Digital simulations are carried out for various kinds of physical properties of the snubber. The restraint effect and the maximum resisting force of the snubber during earthquakes are discussed and compared with the case of an oil damper. The earthquake waves used here are El Centro N-S and Akita Harbour N-S (Nihonkai-Chubu earthquake). (author)

  14. Inter-Disciplinary Validation of Pre Earthquake Signals. Case Study for Major Earthquakes in Asia (2004-2010) and for 2011 Tohoku Earthquake

    Science.gov (United States)

    Ouzounov, D.; Pulinets, S.; Hattori, K.; Liu, J.-Y.; Yang. T. Y.; Parrot, M.; Kafatos, M.; Taylor, P.

    2012-01-01

    We carried out multi-sensor observations in our investigation of phenomena preceding major earthquakes. Our approach is based on a systematic analysis of several physical and environmental parameters which we found associated with earthquake processes: thermal infrared radiation, temperature and concentration of electrons in the ionosphere, radon/ion activities, and air temperature/humidity in the atmosphere. We used satellite and ground observations and interpreted them with the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model, one of the possible paradigms we study and support. We made two independent continuous hind-cast investigations in Taiwan and Japan for a total of 102 earthquakes (M>6) occurring from 2004-2011. We analyzed: (1) ionospheric electromagnetic radiation, plasma and energetic electron measurements from DEMETER; (2) emitted long-wavelength radiation (OLR) from NOAA/AVHRR and NASA/EOS; (3) radon/ion variations (in situ data); and (4) GPS Total Electron Content (TEC) measurements collected from space- and ground-based observations. This joint analysis of ground and satellite data has shown that one to six (or more) days prior to the largest earthquakes there were anomalies in all of the analyzed physical observations. For the latest, March 11, 2011 Tohoku earthquake, our analysis shows again the same relationship between several independent observations characterizing the lithosphere/atmosphere coupling. On March 7th we found a rapid increase of emitted infrared radiation observed from satellite data, and subsequently an anomaly developed near the epicenter. The GPS/TEC data indicated an increase and variation in electron density reaching a maximum value on March 8. Beginning from this day we confirmed an abnormal TEC variation over the epicenter in the lower ionosphere. These findings revealed the existence of atmospheric and ionospheric phenomena occurring prior to the 2011 Tohoku earthquake, which indicated new evidence of a distinct
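    The kind of baseline-relative screening described (flagging OLR or TEC values that stand out against their recent history) can be sketched as a toy rule. This is an illustrative stand-in, not the authors' actual detection algorithm:

```python
from statistics import mean, stdev

def flag_anomalies(series, window=15, k=2.0):
    """Return indices whose value exceeds the trailing-window mean by more
    than k standard deviations -- a toy baseline-relative anomaly screen."""
    flags = []
    for i in range(window, len(series)):
        ref = series[i - window:i]
        m, s = mean(ref), stdev(ref)
        if s > 0 and series[i] > m + k * s:
            flags.append(i)
    return flags

# A single spike against a mildly noisy baseline is flagged.
print(flag_anomalies([1.0, 1.1] * 10 + [5.0]))  # [20]
```

    Real pre-earthquake screening must additionally correct for seasonal and diurnal cycles and for space-weather effects, which is why the abstracts above stress joint analysis across several independent observables.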

  15. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  16. Source modeling of the 2015 Mw 7.8 Nepal (Gorkha) earthquake sequence: Implications for geodynamics and earthquake hazards

    Science.gov (United States)

    McNamara, D. E.; Yeck, W. L.; Barnhart, W. D.; Schulte-Pelkum, V.; Bergman, E.; Adhikari, L. B.; Dixit, A.; Hough, S. E.; Benz, H. M.; Earle, P. S.

    2017-09-01

    The Gorkha earthquake on April 25th, 2015 was a long anticipated, low-angle thrust-faulting event on the shallow décollement between the India and Eurasia plates. We present a detailed multiple-event hypocenter relocation analysis of the Mw 7.8 Gorkha Nepal earthquake sequence, constrained by local seismic stations, and a geodetic rupture model based on InSAR and GPS data. We integrate these observations to place the Gorkha earthquake sequence into a seismotectonic context and evaluate potential earthquake hazard. Major results from this study include (1) a comprehensive catalog of calibrated hypocenters for the Gorkha earthquake sequence; (2) the Gorkha earthquake ruptured a 150 × 60 km patch of the Main Himalayan Thrust (MHT), the décollement defining the plate boundary at depth, over an area surrounding but predominantly north of the capital city of Kathmandu; (3) the distribution of aftershock seismicity surrounds the mainshock maximum slip patch; (4) aftershocks occur at or below the mainshock rupture plane with depths generally increasing to the north beneath the higher Himalaya, possibly outlining a 10-15 km thick subduction channel between the overriding Eurasian and subducting Indian plates; (5) the largest Mw 7.3 aftershock and the highest concentration of aftershocks occurred to the southeast of the mainshock rupture, on a segment of the MHT décollement that was positively stressed towards failure; (6) the near-surface portion of the MHT south of Kathmandu shows no aftershocks or slip during the mainshock. Results from this study characterize the details of the Gorkha earthquake sequence and provide constraints on where earthquake hazard remains high, and thus where future, damaging earthquakes may occur in this densely populated region. Up-dip segments of the MHT should be considered high-hazard for future damaging earthquakes.

  17. Revisiting Organizational Credibility and Organizational Reputation – A Situational Crisis Communication Approach

    Directory of Open Access Journals (Sweden)

    Jamal Jamilah

    2017-01-01

Full Text Available Organizational credibility, the extent to which an organization as the source of messages is perceived as trustworthy and reliable, is one important aspect in determining an organization’s survival. The perceived credibility of the messages will either strengthen or worsen an organization's reputation. The primary objective of this paper is to revisit the concept of organizational credibility and its interaction with organizational outcomes such as organizational reputation. Based on situational crisis communication theory (SCCT), this paper focuses on the impact of organizational credibility on organizational reputation following a crisis. Even though SCCT has been widely used in crisis communication research, the theory still has limitations in explaining factors that could potentially affect the reputation of an organization. This study proposes a model integrating organizational credibility into the SCCT theoretical framework. Derived from the theoretical framework, three propositions are advanced to determine the relationships between organizational credibility, crisis responsibility, and perceived organizational reputation. This paper contributes to further establishing SCCT and posits key attributes in the organizational reputation process.

  18. Credibility and Crisis Stress Testing

    Directory of Open Access Journals (Sweden)

    Li Lian Ong

    2014-02-01

    Full Text Available Credibility is the bedrock of any crisis stress test. The use of stress tests to manage systemic risk was introduced by the U.S. authorities in 2009 in the form of the Supervisory Capital Assessment Program. Since then, supervisory authorities in other jurisdictions have also conducted similar exercises. In some of those cases, the design and implementation of certain elements of the framework have been criticized for their lack of credibility. This paper proposes a set of guidelines for constructing an effective crisis stress test. It combines financial markets impact studies of previous exercises with relevant case study information gleaned from those experiences to identify the key elements and to formulate their appropriate design. Pertinent concepts, issues and nuances particular to crisis stress testing are also discussed. The findings may be useful for country authorities seeking to include stress tests in their crisis management arsenal, as well as for the design of crisis programs.

  19. Fault parameters and macroseismic observations of the May 10, 1997 Ardekul-Ghaen earthquake

    Science.gov (United States)

    Amini, H.; Zare, M.; Ansari, A.

    2018-01-01

The Ardekul (Zirkuh) earthquake (May 10, 1997) is the largest recent earthquake that occurred in the Ardekul-Ghaen region of Eastern Iran. The greatest destruction was concentrated around Ardekul, Haji-Abad, Esfargh, Pishbar, Bashiran, Abiz-Qadim, and Fakhr-Abad (completely destroyed). The total surface fault rupture was about 125 km, with the longest uninterrupted segment in the south of the region. The maximum horizontal and vertical displacements, about 210 and 70 cm respectively, were reported in Korizan and Bohn-Abad; other building damage and environmental effects were also reported for this earthquake. In this study, an intensity value of XI on the European Macroseismic Scale (EMS) and the Environmental Seismic Intensity (ESI) scale was assigned to this earthquake according to the maximum effects on the macroseismic data points it affected. Then, based on the macroseismic data points of this earthquake and the Boxer code, macroseismic parameters including magnitude, location, source dimensions, and orientation were estimated at 7.3, 33.52° N-59.99° E, 75 km long and 21 km wide, and 152°, respectively. As the estimated macroseismic parameters are consistent with the instrumental ones (the Global Centroid Moment Tensor (GCMT) location and magnitude are 33.58° N-60.02° E and 7.2, respectively), this method and dataset are suggested not only for other instrumental earthquakes, but also for historical events.

  20. Connecting slow earthquakes to huge earthquakes

    OpenAIRE

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  1. Turning the rumor of May 11, 2011 earthquake prediction In Rome, Italy, into an information day on earthquake hazard

    Science.gov (United States)

    Amato, A.; Cultrera, G.; Margheriti, L.; Nostro, C.; Selvaggi, G.; INGVterremoti Team

    2011-12-01

A devastating earthquake had been predicted for May 11, 2011 in Rome. This prediction was never released officially by anyone, but it grew on the Internet and was amplified by the media. It was erroneously ascribed to Raffaele Bendandi, an Italian self-taught natural scientist who studied planetary motions. Indeed, around May 11, 2011, a planetary alignment was in fact expected, and this contributed to giving credibility to the earthquake prediction among the public. During the previous months, INGV was overwhelmed with requests for information about this supposed prediction from Roman inhabitants and tourists. Given the considerable media impact of this expected earthquake, INGV decided to organize an Open Day at its headquarters in Rome for people who wanted to learn more about Italian seismicity and the earthquake as a natural phenomenon. The Open Day was preceded by a press conference two days before, in which we talked about this prediction, presented the Open Day, and had a scientific discussion with journalists about earthquake prediction and, more generally, about the real problem of seismic risk in Italy. About 40 journalists from newspapers, local and national TV stations, press agencies and web news outlets attended the press conference, and hundreds of articles appeared in the following days, advertising the 11 May Open Day. The INGV opened to the public all day long (9am - 9pm) with the following program: i) meetings with INGV researchers to discuss scientific issues; ii) visits to the seismic monitoring room, open 24h/7 all year; iii) guided tours through interactive exhibitions on earthquakes and the Earth's deep structure; iv) lectures on general topics, from the social impact of rumors to seismic risk reduction; v) 13 new videos on the channel YouTube.com/INGVterremoti to explain the earthquake process and give updates on various aspects of seismic monitoring in Italy; vi) distribution of books and brochures. Surprisingly, more than 3000 visitors came to visit INGV

  2. Improving credibility evaluations on Wikipedia

    NARCIS (Netherlands)

    Lucassen, T.; Schmettow, Martin; Wiering, Caro H.; Pieters, Jules M.; Boer, Henk

    2011-01-01

    In this chapter, ongoing research on trust in Wikipedia is used as a case study to illustrate the design process of a support tool for Wikipedia, following the ASCE-model. This research is performed from a cognitive perspective and aims at users actively evaluating the credibility of information on

  3. Data Credibility: A Perspective from Systematic Reviews in Environmental Management

    Science.gov (United States)

    Pullin, Andrew S.; Knight, Teri M.

    2009-01-01

    To use environmental program evaluation to increase effectiveness, predictive power, and resource allocation efficiency, evaluators need good data. Data require sufficient credibility in terms of fitness for purpose and quality to develop the necessary evidence base. The authors examine elements of data credibility using experience from critical…

  4. The Importance of Being…Social? Instructor Credibility and the Millennials

    Science.gov (United States)

    Gerhardt, Megan W.

    2016-01-01

    Using the framework of generational identity, the current study explores how a range of characteristics impact Millennial perceptions of instructor credibility. Millennial Generation student ratings of the impact of competence, character, and sociability on instructor credibility were compared to faculty ratings of the same characteristics.…

  5. Large magnitude earthquakes on the Awatere Fault, Marlborough

    International Nuclear Information System (INIS)

    Mason, D.P.M.; Little, T.A.; Van Dissen, R.J.

    2006-01-01

The Awatere Fault is a principal active strike-slip fault within the Marlborough fault system, and last ruptured in October 1848, in the Mw ∼7.5 Marlborough earthquake. The coseismic slip distribution and maximum traceable length of this rupture are calculated from the magnitude and distribution of small, metre-scale geomorphic displacements attributable to this earthquake. These data suggest this event ruptured ∼110 km of the fault, with a mean horizontal surface displacement of 5.3 ± 1.6 m. Based on these parameters, the moment magnitude of this earthquake would be Mw ∼7.4-7.7. Paleoseismic trenching investigations along the eastern section reveal evidence for at least eight, and possibly ten, surface-rupturing paleoearthquakes in the last 8600 years, including the 1848 rupture. The coseismic slip distribution and rupture length of the 1848 earthquake, in combination with the paleoearthquake age data, suggest the eastern section of the Awatere Fault ruptures in Mw ∼7.5 earthquakes, with over 5 m of surface displacement, every 860-1080 years. (author). 21 refs., 10 figs., 7 tabs

  6. Strategy to Increase U.S. Credibility

    National Research Council Canada - National Science Library

    Shanks, Wayne M

    2006-01-01

    ...) National Security Strategy (NSS). The public's mistrust of the United States is born out of a widespread misunderstanding and mistrust of its policies and a lack of USG credibility especially in the Greater Middle East...

  7. Marginality, Credibility, and Impression Management: The Asian Sociologist in America.

    Science.gov (United States)

    Unnithan, N. Prabha

    1988-01-01

    Relates personal experiences of a sociologist of Asian origin in an effort to illustrate problems inherent in the process of becoming accepted as an academic sociologist. Identifies important themes of marginality, credibility, and impression management. Points out ways in which the Asian sociologists can go about achieving credibility. (KO)

  8. The effects of stakeholder involvement on perceptions of an evaluation's credibility.

    Science.gov (United States)

    Jacobson, Miriam R; Azzam, Tarek

    2018-06-01

    This article presents a study of the effects of stakeholder involvement on perceptions of an evaluation's credibility. Crowdsourced members of the public and a group of educational administrators read a description of a hypothetical program and two evaluations of the program: one conducted by a researcher and one conducted by program staff (i.e. program stakeholders). Study participants were randomly assigned versions of the scenario with different levels of stakeholder credibility and types of findings. Results showed that both samples perceived the researcher's evaluation findings to be more credible than the program staff's, but that this difference was significantly reduced when the program staff were described to be highly credible. The article concludes with implications for theory and research on evaluation dissemination and stakeholder involvement. Copyright © 2018 Elsevier Ltd. All rights reserved.

  9. Earthquake design for controlled structures

    Directory of Open Access Journals (Sweden)

    Nikos G. Pnevmatikos

    2017-04-01

Full Text Available An alternative design philosophy, for structures equipped with control devices, capable of resisting an expected earthquake while remaining in the elastic range, is described. The idea is that a portion of the earthquake loading is undertaken by the control system and the remainder by the structure, which is designed to resist elastically. The earthquake forces assuming elastic behavior (elastic forces) and elastoplastic behavior (design forces) are first calculated according to the codes. The required control forces are calculated as the difference between the elastic and design forces. The maximum capacity of the control devices is then compared to the required control force. If the capacity of the control devices is larger than the required control force, then the control devices are accepted and installed in the structure, and the structure is designed according to the design forces. If the capacity is smaller than the required control force, then a scale factor, α, reducing the elastic forces to new design forces is calculated. The structure is redesigned and the devices are installed. The proposed procedure ensures that the structure behaves elastically (without damage) for the expected earthquake at no additional cost, excluding that of buying and installing the control devices.
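
The capacity check described in this abstract can be sketched in a few lines. This is a hypothetical illustration (the function name, the reading of the scale factor α, and all numbers are invented), not code from the paper:

```python
def control_design(elastic_force, design_force, device_capacity):
    """Return (accepted, scale_factor) for the control-device capacity check."""
    required = elastic_force - design_force   # portion carried by the control system
    if device_capacity >= required:
        return True, 1.0                      # devices accepted; use the code design forces
    # Insufficient capacity: compute a scale factor reducing the elastic forces
    # to new design forces the devices can cover (one plausible reading of alpha).
    alpha = (elastic_force - device_capacity) / elastic_force
    return False, alpha

print(control_design(1000.0, 400.0, 500.0))  # (False, 0.5): redesign with scaled forces
print(control_design(1000.0, 400.0, 700.0))  # (True, 1.0): devices accepted as-is
```

In the first case the devices can only carry 500 of the 600 units of required control force, so the elastic forces are scaled down and the structure is redesigned, mirroring the iteration the abstract describes.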

  10. Evaluation of steam generator tube integrity during earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Kusakabe, Takaya; Kodama, Toshio [Mitsubishi Heavy Industries Ltd., Kobe (Japan). Kobe Shipyard and Machinery Works; Takamatsu, Hiroshi; Matsunaga, Tomoya

    1999-07-01

This report presents an experimental study on the strength of PWR steam generator (SG) tubes with various defects under cyclic loads that simulate earthquakes. The tests were done using the same SG tubing as in actual plants, with axial and circumferential defects of various lengths and depths. In the tests, straight tubes were loaded with cyclic bending moments to simulate earthquake waves, and the number of load cycles at which tube leakage started or tube burst occurred was counted. The test results showed that even tubes with very long EDM-made cracks of more than 80% depth could withstand the maximum earthquake, and tubes with corrosion cracks were far stronger than those. Thus the integrity of SG tubes with minute potential defects was demonstrated. (author)

  11. Risky Business: Communicating with Credibility

    Science.gov (United States)

    Greenberger, Leonard S.

    2011-01-01

    In hostile situations, a communicator's goal is to establish and maintain trust and credibility with the audience. School business officials need the special skills and techniques of what's known as "risk communication." Few people are natural risk communicators. Those who do it well honed their skills over many years spent in hostile…

  12. Geological and historical evidence of irregular recurrent earthquakes in Japan.

    Science.gov (United States)

    Satake, Kenji

    2015-10-28

Great (M∼8) earthquakes repeatedly occur along the subduction zones around Japan and cause fault slip of a few to several metres, releasing strains accumulated over decades to centuries of plate motion. Assuming a simple 'characteristic earthquake' model in which similar earthquakes repeat at regular intervals, probabilities of future earthquake occurrence have been calculated by a government committee. However, recent studies on past earthquakes, including geological traces from giant (M∼9) earthquakes, indicate a variety of sizes and recurrence intervals of interplate earthquakes. Along the Kuril Trench off Hokkaido, limited historical records indicate that the average recurrence interval of great earthquakes is approximately 100 years, but tsunami deposits show that giant earthquakes occurred at a much longer interval of approximately 400 years. Along the Japan Trench off northern Honshu, recurrence of giant earthquakes similar to the 2011 Tohoku earthquake with an interval of approximately 600 years is inferred from historical records and tsunami deposits. Along the Sagami Trough near Tokyo, two types of Kanto earthquakes with recurrence intervals of a few hundred years and a few thousand years had been recognized, but studies show that the three most recent Kanto earthquakes had different source extents. Along the Nankai Trough off western Japan, recurrence of great earthquakes with an interval of approximately 100 years has been identified from historical literature, but tsunami deposits indicate that the sizes of the recurrent earthquakes are variable. Such variability makes it difficult to apply a simple 'characteristic earthquake' model for the long-term forecast, and several approaches, such as the use of geological data for the evaluation of future earthquake probabilities or the estimation of the maximum earthquake size in each subduction zone, are being pursued by government committees. © 2015 The Author(s).

  13. The HayWired Earthquake Scenario—Earthquake Hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude-6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  14. Automatic Earthquake Shear Stress Measurement Method Developed for Accurate Time- Prediction Analysis of Forthcoming Major Earthquakes Along Shallow Active Faults

    Science.gov (United States)

    Serata, S.

    2006-12-01

The Serata Stressmeter has been developed to measure and monitor earthquake shear stress build-up along shallow active faults. The development work over the past 25 years has established the Stressmeter as an automatic stress measurement system for studying the timing of forthcoming major earthquakes, in support of current earthquake prediction studies based on statistical analysis of seismological observations. In early 1982, a series of major man-made earthquakes (magnitude 4.5-5.0) suddenly occurred in an area over a deep underground potash mine in Saskatchewan, Canada. By measuring the underground stress condition of the mine, the direct cause of the earthquakes was disclosed. The cause was successfully eliminated by controlling the stress condition of the mine. The Japanese government was interested in this development, and the Stressmeter was introduced into the Japanese government research program for earthquake stress studies. In Japan the Stressmeter was first utilized for direct measurement of the intrinsic lateral tectonic stress gradient G. The measurement, conducted at the Mt. Fuji Underground Research Center of the Japanese government, disclosed constant natural gradients of the maximum and minimum lateral stresses in excellent agreement with the theoretical value, i.e., G = 0.25. All the conventional methods of overcoring, hydrofracturing and deformation, which were introduced to compete with the Serata method, failed, demonstrating the fundamental difficulties of the conventional methods. The intrinsic lateral stress gradient determined by the Stressmeter for the Japanese government was found to be the same as in all the other measurements made by the Stressmeter in Japan. The stress measurement results obtained by the major international stress measurement work in the Hot Dry Rock Projects conducted in the USA, England and Germany are found to be in good agreement with the Stressmeter results obtained in Japan.
Based on this broad agreement, a solid geomechanical

  15. VLF radio wave anomalies associated with the 2010 Ms 7.1 Yushu earthquake

    Science.gov (United States)

    Shen, Xuhui; Zhima, Zeren; Zhao, Shufan; Qian, Geng; Ye, Qing; Ruzhin, Yuri

    2017-05-01

The VLF radio signals recorded both by the ground-based VLF radio wave monitoring network and by the DEMETER satellite are investigated for the 2010 Ms 7.1 Yushu earthquake. The ground-based observations show that the disturbance intensity of the VLF wave amplitude relative to the background reached enhancements of over 22% at 11.9 kHz, 27% at 12.6 kHz and 62% at 14.9 kHz along the path from Novosibirsk to TH one day before the main shock, as compared to the maximum of 20% observed during non-earthquake time. The space-based observations indicate a decrease of the signal-to-noise ratio (SNR) in the power spectral density of the 14.9 kHz VLF radio signal in the electric field four days before the main shock, with a disturbance intensity exceeding the background by over 5%, as compared to the maximum of 3% observed during non-earthquake time. The geoelectric field observations in the epicentral region also show that a sharp enhancement from ∼340 to 430 mV/km appeared simultaneously at two monitors 14 days before the main shock. The comparative analysis of the ground- and space-based observations during earthquake and non-earthquake time provides convincing evidence that seismic anomalies exist in VLF radio wave propagation before the 2010 Ms 7.1 Yushu earthquake. The possible mechanism for the VLF radio signal propagation anomaly during the 2010 Yushu earthquake may be related to changes in the geoelectric field near the earthquake zone.
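
The percentage figures quoted in this abstract are relative amplitude anomalies. As a hedged sketch (the function and the input values are illustrative, not taken from the paper's data), the measure can be written as:

```python
def disturbance_pct(amplitude, background):
    """VLF amplitude disturbance relative to the background level, in percent."""
    return 100.0 * (amplitude - background) / background

# An amplitude 1.62 times the background corresponds to the ~62% anomaly
# reported at 14.9 kHz; 1.20 times would match the ~20% non-earthquake maximum.
print(round(disturbance_pct(1.62, 1.0)))  # 62
print(round(disturbance_pct(1.20, 1.0)))  # 20
```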

  16. Use of Credibility Heuristics in a Social Question-Answering Service

    Science.gov (United States)

    Matthews, Paul

    2015-01-01

    Introduction: This study looked at the effect of community peripheral cues (specifically voting score and answerer's reputation) on the user's credibility rating of answers. Method: Students in technology and philosophy were asked to assess the credibility of answers to questions posted on a social question-answering platform. Through the use of a…

  17. A three-step maximum a posteriori probability method for InSAR data inversion of coseismic rupture with application to the 14 April 2010 Mw 6.9 Yushu, China, earthquake

    Science.gov (United States)

    Sun, Jianbao; Shen, Zheng-Kang; Bürgmann, Roland; Wang, Min; Chen, Lichun; Xu, Xiwei

    2013-08-01

We develop a three-step maximum a posteriori probability method for coseismic rupture inversion, which aims at maximizing the a posteriori probability density function (PDF) of elastic deformation solutions of earthquake rupture. The method originates from the fully Bayesian inversion and mixed linear-nonlinear Bayesian inversion methods and shares the same posterior PDF with them, while overcoming difficulties with convergence when large numbers of low-quality data are used and greatly improving the convergence rate using optimization procedures. A highly efficient global optimization algorithm, adaptive simulated annealing, is used to search for the maximum of the posterior PDF (the "mode" in statistics) in the first step. The second-step inversion approaches the "true" solution further using the Monte Carlo inversion technique with positivity constraints, with all parameters obtained from the first step as the initial solution. Slip artifacts are then eliminated from the slip models in the third step using the same procedure as the second step, with fixed fault geometry parameters. We first design a fault model with a 45° dip angle and oblique slip, and produce corresponding synthetic interferometric synthetic aperture radar (InSAR) data sets to validate the reliability and efficiency of the new method. We then apply this method to InSAR data inversion for the coseismic slip distribution of the 14 April 2010 Mw 6.9 Yushu, China earthquake. Our preferred slip model is composed of three segments, with most of the slip occurring within 15 km depth; the maximum slip reaches 1.38 m at the surface. The seismic moment released is estimated to be 2.32e+19 Nm, consistent with the seismic estimate of 2.50e+19 Nm.
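
The first step described in this abstract, locating the mode of the posterior PDF by simulated annealing, can be illustrated with a minimal sketch. This is a generic (non-adaptive) annealing loop on an invented 2-D Gaussian log-posterior, not the authors' adaptive variant or their fault parameterization:

```python
import math
import random

# Toy log-posterior: a 2-D Gaussian whose mode is known to be (1.0, -2.0).
def log_posterior(x, y):
    return -((x - 1.0) ** 2 + (y + 2.0) ** 2)

def anneal(log_pdf, x0, steps=20000, t0=1.0, seed=0):
    """Search for the mode of log_pdf by simulated annealing (linear cooling)."""
    rng = random.Random(seed)
    x = list(x0)
    lp = log_pdf(*x)
    best, best_lp = list(x), lp
    for i in range(steps):
        t = t0 * (1.0 - i / steps) + 1e-6          # temperature schedule
        cand = [xi + rng.gauss(0.0, 0.5) for xi in x]
        cand_lp = log_pdf(*cand)
        # Always accept uphill moves; accept downhill with Boltzmann probability.
        if cand_lp > lp or rng.random() < math.exp((cand_lp - lp) / t):
            x, lp = cand, cand_lp
            if lp > best_lp:
                best, best_lp = list(x), lp
    return best

mode = anneal(log_posterior, [0.0, 0.0])
print(mode)  # close to [1.0, -2.0]
```

In the method above, this mode search would then seed the second-step Monte Carlo refinement with positivity constraints; only the first step is sketched here.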

  18. The effect of vertical earthquake component on the uplift of the nuclear reactor building

    International Nuclear Information System (INIS)

    Kobayashi, Toshio

    1986-01-01

During a strong earthquake, the base mat of a nuclear reactor building may be partially lifted by the response overturning moment, causing a geometrically nonlinear interaction between the base mat and the rock foundation beneath it. In order to avoid this uplift phenomenon, the base mat and/or plan of the building is enlarged in some cases. These special designs need more cost and/or time in construction. In the evaluation of the uplift phenomenon, a parameter "η", named the "contact ratio", is used, defined as the ratio of the compression stress zone area of the base mat to the total area of the base mat. Usually this contact ratio is calculated under the combination of the maximum overturning moment obtained by linear earthquake response analysis and the normal force from gravity, considering the effect of the vertical earthquake component. In this report, the effect of the vertical earthquake component on the uplift phenomenon is studied, and it is concluded that the vertical earthquake component has little influence on the contact ratio. In order to obtain a more reasonable contact ratio, a nonlinear rocking analysis subjected to horizontal and vertical earthquake motions simultaneously is proposed in this report. As the second-best method, the combination of the maximum overturning moment obtained by linear analysis and the normal force from gravity only, without the vertical earthquake effect, is proposed. (author)
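
For a rigid base mat with a linear contact-stress distribution, the contact ratio η has a simple closed form. The sketch below is a textbook-style approximation under those assumptions (all values invented), not the nonlinear rocking analysis proposed in the report:

```python
# Contact ratio of a rigid rectangular base of width B under overturning
# moment M and normal force N. For eccentricity e = M/N within the middle
# third (e <= B/6) the full base stays in contact; beyond that, a triangular
# compression zone of length 3*(B/2 - e) remains in contact.

def contact_ratio(M, N, B):
    """Contact ratio eta for a rigid base (linear stress distribution)."""
    e = M / N                    # eccentricity of the resultant force
    if e <= B / 6.0:
        return 1.0               # no uplift: full contact
    if e >= B / 2.0:
        return 0.0               # resultant outside the base: overturning
    return 3.0 * (B / 2.0 - e) / B

print(contact_ratio(M=2.0e5, N=1.0e5, B=20.0))  # e = 2 m <= B/6, eta = 1.0
print(contact_ratio(M=5.0e5, N=1.0e5, B=20.0))  # e = 5 m, eta = 3*(10-5)/20 = 0.75
```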

  19. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    Science.gov (United States)

    Mualchin, Lalliana

    2011-03-01

Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, the seismic hazard in the event of the Maximum Credible Earthquake (MCE) magnitude from each of the known seismogenic faults within and near the state is assessed. The likely occurrence of the MCE has been assumed qualitatively by using late Quaternary and younger faults that are presumed to be seismogenic, but not when, or within what time interval, the MCE may occur. The MCE is the largest or upper-bound potential earthquake in moment magnitude, and it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), which has been called the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process of large important projects, for example, dams and nuclear power plants, continued to challenge the map(s).
The second edition of the map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest published

  20. Revisiting Organizational Credibility and Organizational Reputation – A Situational Crisis Communication Approach

    OpenAIRE

    Jamal Jamilah; Abu Bakar Hassan

    2017-01-01

    Organizational credibility, the extent of which an organization as the source of messages is perceived as trustworthy and reliable, is one important aspect to determine organization’s survival. The perceived credibility of the messages will either strengthen or worsen an organization reputation. The primary objective of this paper is to revisit the concept of organizational credibility and its interaction with organizational outcomes such as organizational reputation. Based on the situational...

  1. Earthquake response of heavily damaged historical masonry mosques after restoration

    Science.gov (United States)

    Altunışık, Ahmet Can; Fuat Genç, Ali

    2017-10-01

Restoration work has accelerated substantially in Turkey in the last decade. Many historical buildings, mosques, minarets, bridges, towers and other structures have been restored. With these restorations an important issue arises, namely how restoration work affects the structure. For this reason, we aimed to investigate the effect of restoration on the earthquake response of a historical masonry mosque, considering the openings in the masonry dome. For this purpose, we used the Hüsrev Pasha Mosque, which is located in the Ortakapı district in the old city of Van, Turkey. The region of Van is in an active seismic zone; therefore, earthquake analyses were performed in this study. First, a finite element model of the mosque was constructed based on the restoration drawings, with 16 window openings in the dome. Then a model was constructed with eight window openings. Structural analyses were performed under dead load and earthquake load, and the mode superposition method was used in the analyses. Maximum displacements, maximum and minimum principal stresses, and shear stresses are given with contour diagrams. The results are analyzed according to the Turkish Earthquake Code (TEC, 2007) and compared between the 8- and 16-window-opening cases. The results show that reducing the window openings affected the structural behavior of the mosque positively.

  2. Real Time Earthquake Information System in Japan

    Science.gov (United States)

    Doi, K.; Kato, T.

    2003-12-01

An early earthquake notification system in Japan was developed by the Japan Meteorological Agency (JMA), the governmental organization responsible for issuing earthquake information and tsunami forecasts. The system was primarily developed for prompt provision of a tsunami forecast to the public, locating an earthquake and estimating its magnitude as quickly as possible. Years later, a system for prompt provision of seismic intensity information, as an index of the degree of disaster caused by strong ground motion, was also developed so that concerned governmental organizations could decide whether they needed to launch an emergency response. At present, JMA issues the following kinds of information successively when a large earthquake occurs: 1) a prompt report of the occurrence of a large earthquake and the major seismic intensities it caused, in about two minutes after the earthquake occurrence; 2) a tsunami forecast in around three minutes; 3) information on expected arrival times and maximum heights of tsunami waves in around five minutes; 4) information on the hypocenter and magnitude of the earthquake, the seismic intensity at each observation station, and the times of high tides in addition to the expected tsunami arrival times, in 5-7 minutes. To issue the information above, JMA has established: an advanced nationwide seismic network with about 180 stations for seismic wave observation and about 3,400 stations for instrumental seismic intensity observation, including about 2,800 seismic intensity stations maintained by local governments; data telemetry networks via landlines and partly via a satellite communication link; real-time data processing techniques, for example, the automatic calculation of earthquake location and magnitude and a database-driven method for quantitative tsunami estimation; and dissemination networks, via computer-to-computer communications and facsimile through dedicated telephone lines. JMA operationally

  3. Establishing model credibility involves more than validation

    International Nuclear Information System (INIS)

    Kirchner, T.

    1991-01-01

    One widely used definition of validation is the quantitative test of the performance of a model through the comparison of model predictions with independent sets of observations from the system being simulated. The ability to show that the model predictions compare well with observations is often thought to be the most rigorous test that can be used to establish credibility for a model in the scientific community. However, such tests are only part of the process used to establish credibility, and in some cases may be either unnecessary or misleading. Naylor and Finger extended the concept of validation to include the establishment of validity for the postulates embodied in the model and the testing of the assumptions used to select postulates for the model. Validity of postulates is established through concurrence by experts in the field of study that the mathematical or conceptual model contains the structural components and mathematical relationships necessary to adequately represent the system with respect to the goals for the model. This extended definition of validation provides for consideration of the structure of the model, not just its performance, in establishing credibility. Evaluation of a simulation model should establish the correctness of the code and the efficacy of the model within its domain of applicability. (24 refs., 6 figs.)

  4. Credibility and Consumer Behavior of Islamic Bank in Indonesia: A Literature Review

    Directory of Open Access Journals (Sweden)

    Naufal BACHRI

    2016-06-01

    Full Text Available The concept of “credibility” has received significant attention from academics and practitioners because it plays an important role in creating and maintaining consumer behavior. This study uses twenty-seven references relating to credibility, customer value, satisfaction, and loyalty. Several studies have discussed the relationship between credibility and consumer behavior and have also elaborated the dimensions of credibility. The review also presents the shortcomings of current research and the trends for future study in Islamic banking.

  5. Fundamental principles of earthquake resistance calculation to be reflected in the next generation regulations

    OpenAIRE

    Mkrtychev Oleg; Dzhinchvelashvili Guram

    2016-01-01

    The article scrutinizes the pressing issues of regulation in the domain of seismic construction. The existing code of rules SNIP II-7-81* “Construction in seismic areas” provides that earthquake resistance calculation be performed on two levels of impact: basic safety earthquake (BSE) and maximum considered earthquake (MCE). However, the very nature of such calculation cannot be deemed well-founded and contradicts the fundamental standards of foreign countries. The authors of the article have...

  6. Earthquake behavior at deep underground observed by three-dimensional array

    International Nuclear Information System (INIS)

    Komada, Hiroya; Sawada, Yoshihiro; Aoyama, Shigeo.

    1989-01-01

    Earthquake observation has been carried out using an eight-point three-dimensional array between the ground surface and a depth of about 400 m at the Hosokura Mine in Miyagi prefecture, for the purpose of obtaining basic data on the characteristics of seismic waves for the earthquake-resistant design of deep underground disposal facilities for high-level waste. The following results were obtained. (1) The maximum accelerations underground are damped to about 60% of those at the ground surface in the horizontal direction and to about 70% in the vertical direction. (2) Although the frequency characteristics of the seismic waves vary for each earthquake, the transfer characteristics of seismic waves from deep underground to the ground surface are the same for each earthquake. (3) The horizontal directions of seismic wave incidence are similar to the directions from the epicenters of each earthquake. The vertical directions of seismic wave incidence are in the range of about 3deg to 35deg from the vertical line. (author)

  7. The 2012 Mw5.6 earthquake in Sofia seismogenic zone - is it a slow earthquake

    Science.gov (United States)

    Raykova, Plamena; Solakov, Dimcho; Slavcheva, Krasimira; Simeonova, Stela; Aleksandrova, Irena

    2017-04-01

    very low rupture velocity. The low rupture velocity can mean slow faulting, which leads to a slow release of the accumulated seismic energy. Such slow energy release principally causes little to moderate damage. Additionally, the waveform of the earthquake shows low-frequency content of the P-waves (the P-wave maximum is at 1.19 Hz), and the P-wave displacement spectrum is characterized by a poorly expressed spectral plateau and corner frequency. These and other signs lead us to the conclusion that the 2012 Mw5.6 earthquake can be considered a type of slow earthquake, like a low-frequency quake. The study is based on data from the Bulgarian seismological network (NOTSSI), the local network (LSN) deployed around Kozloduy NPP, and the System of Accelerographs for Seismic Monitoring of Equipment and Structures (SASMES) installed in the Kozloduy NPP. NOTSSI jointly with LSN and SASMES provide reliable information for multiple studies on seismicity at regional scale.

  8. Effect of slip-area scaling on the earthquake frequency-magnitude relationship

    Science.gov (United States)

    Senatorski, Piotr

    2017-06-01

    The earthquake frequency-magnitude relationship is considered from the maximum entropy principle (MEP) perspective. The MEP suggests sampling with constraints as a simple stochastic model of seismicity. The model is based on von Neumann's acceptance-rejection method, with the b-value as the parameter that breaks the symmetry between small and large earthquakes. The Gutenberg-Richter law's b-value forms a link between earthquake statistics and physics. A dependence between the b-value and the rupture area vs. slip scaling exponent is derived. The relationship enables us to explain the observed ranges of b-values for different types of earthquakes. Specifically, the different b-value ranges for tectonic and for induced, hydraulic-fracturing seismicity are explained in terms of their different triggering mechanisms: by applied stress increase and by fault strength reduction, respectively.
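    The sampling scheme named in the abstract can be illustrated with a short sketch. The block below draws magnitudes from a truncated Gutenberg-Richter distribution using von Neumann acceptance-rejection; the parameter values (b = 1.0, magnitude range 2.0-8.0) are illustrative assumptions, not values from the paper.

```python
import math
import random

def sample_magnitudes(b=1.0, m_min=2.0, m_max=8.0, n=10000, seed=1):
    """Draw magnitudes from a truncated Gutenberg-Richter distribution,
    p(m) proportional to 10**(-b*m) on [m_min, m_max], via von Neumann
    acceptance-rejection: propose uniformly, accept with the scaled density."""
    rng = random.Random(seed)
    beta = b * math.log(10.0)
    samples = []
    while len(samples) < n:
        m = rng.uniform(m_min, m_max)                     # uniform proposal
        if rng.random() < math.exp(-beta * (m - m_min)):  # accept/reject step
            samples.append(m)
    return samples

mags = sample_magnitudes()
```

    By the Gutenberg-Richter law, roughly a fraction 10^(-b) of the accepted samples should lie one magnitude unit or more above the cutoff; with b = 1 that is about 10%.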

  9. Induced seismicity provides insight into why earthquake ruptures stop

    KAUST Repository

    Galis, Martin; Ampuero, Jean Paul; Mai, Paul Martin; Cappa, Fré dé ric

    2017-01-01

    the perturbed area and distinguishes self-arrested from runaway ruptures. We develop a theoretical scaling relation between the largest magnitude of self-arrested earthquakes and the injected volume and find it consistent with observed maximum magnitudes

  10. Credibility and the media as a political institution

    DEFF Research Database (Denmark)

    Ørsten, Mark; Burkal, Rasmus

    2014-01-01

    of credibility in Danish news media. Credibility is defined at an institutional level by two dimensions: A) the accuracy and reliability of the news stories featured in leading Danish news media, and B) journalists’ knowledge and understanding of the Danish code of press ethics. The results show that sources...... only find objective errors in 14.1% of the news stories, which is a lower figure than most other studies report. The results also show that Danish journalists find bad press ethics to be an increasing problem and attribute this problem to increased pressure in the newsroom....

  11. An Overview of Soil Models for Earthquake Response Analysis

    Directory of Open Access Journals (Sweden)

    Halida Yunita

    2015-01-01

    Full Text Available Earthquakes can damage thousands of buildings and infrastructure as well as cause the loss of thousands of lives. During an earthquake, the damage to buildings is mostly caused by the effect of local soil conditions. Depending on the soil type, the earthquake waves propagating from the epicenter to the ground surface will result in various behaviors of the soil. Several studies have been conducted to accurately obtain the soil response during an earthquake. The soil model used must be able to characterize the stress-strain behavior of the soil during the earthquake. This paper compares equivalent linear and nonlinear soil model responses. Analysis was performed on two soil types, Site Class D and Site Class E. An equivalent linear soil model leads to a constant value of shear modulus, while in a nonlinear soil model the shear modulus changes constantly, depending on the stress level, and shows inelastic behavior. The results from a comparison of both soil models are displayed in the form of maximum acceleration profiles and stress-strain curves.
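    The contrast between the two model families can be sketched numerically. The block below compares a constant (equivalent linear) shear modulus with the strain-dependent secant modulus of a simple hyperbolic nonlinear model; g_max and the reference strain are hypothetical illustration values, not site parameters from the paper.

```python
def secant_modulus(gamma, g_max=80000.0, gamma_ref=1e-3):
    """Secant shear modulus (kPa) of a hyperbolic nonlinear soil model:
    tau = g_max * gamma / (1 + gamma / gamma_ref), hence
    G_sec = tau / gamma = g_max / (1 + gamma / gamma_ref)."""
    return g_max / (1.0 + gamma / gamma_ref)

strains = [1e-5, 1e-4, 1e-3, 1e-2]                  # shear strain levels
equiv_linear = [80000.0 for _ in strains]           # constant modulus
nonlinear = [secant_modulus(g) for g in strains]    # degrades with strain
```

    At small strains the two models nearly coincide; at large strains the secant modulus of the hyperbolic model drops to a small fraction of g_max, which is the inelastic behavior the abstract describes.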

  12. Credibility improves topical blog post retrieval

    NARCIS (Netherlands)

    Weerkamp, W.; de Rijke, M.

    2008-01-01

    Topical blog post retrieval is the task of ranking blog posts with respect to their relevance for a given topic. To improve topical blog post retrieval we incorporate textual credibility indicators in the retrieval process. We consider two groups of indicators: post level (determined using

  13. The Credibility of Fiscal Rules Policy and Business Cycle Volatility

    Directory of Open Access Journals (Sweden)

    Kuncoro Haryo

    2016-06-01

    Full Text Available The aim of this paper is two-fold: first, it studies the impact of the credibility of fiscal rule policy on the stability of output growth; second, it compares the effectiveness of fiscal rule policy with discretionary and automatic stabilizer fiscal policies in addressing fluctuations of output growth. Employing quarterly data over the period 2001-2013 for the case of Indonesia, we find that the credible debt rule leads to a decrease in the volatility of output growth, while the non-credible deficit rule does not have any effect. Both the unsystematic and systematic components of discretionary fiscal policy have a stabilizing function. Interestingly, automatic stabilization tends to induce volatility of output growth. Given those results, we infer that government spending is not a good automatic stabilizer. It seems that a lower ratio of government expenditure to GDP, along with improving credibility of deficit rule policy, has a smoothing effect on the economy. Therefore, the results implicitly support expenditure cuts when implementing fiscal adjustment, with the purpose of reaching fiscal sustainability in the short run and stable economic growth in the long run.

  14. Probabilistic Seismic Hazard Assessment for Himalayan-Tibetan Region from Historical and Instrumental Earthquake Catalogs

    Science.gov (United States)

    Rahman, M. Moklesur; Bai, Ling; Khan, Nangyal Ghani; Li, Guohui

    2018-02-01

    The Himalayan-Tibetan region has a long history of devastating earthquakes with wide-spread casualties and socio-economic damage. Here, we conduct a probabilistic seismic hazard analysis by incorporating the incomplete historical earthquake records along with the instrumental earthquake catalogs for the Himalayan-Tibetan region. Historical earthquake records going back more than 1000 years and an updated, homogenized and declustered instrumental earthquake catalog since 1906 are utilized. The essential seismicity parameters, namely the mean seismicity rate γ, the Gutenberg-Richter b value, and the maximum expected magnitude M max, are estimated using the maximum likelihood algorithm, allowing for the incompleteness of the catalog. To compute the hazard values, three seismogenic source models (smoothed gridded, linear, and areal sources) and two sets of ground motion prediction equations are combined by means of a logic tree to account for the epistemic uncertainties. The peak ground acceleration (PGA) and spectral acceleration (SA) at 0.2 and 1.0 s are predicted for 2 and 10% probabilities of exceedance over 50 years assuming bedrock conditions. The resulting PGA and SA maps show a significant spatio-temporal variation in the hazard values. In general, hazard values are found to be much higher than in previous studies for regions where great earthquakes have actually occurred. The use of the historical and instrumental earthquake catalogs in combination with multiple seismogenic source models provides better seismic hazard constraints for the Himalayan-Tibetan region.
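    The maximum likelihood estimation of the Gutenberg-Richter b value mentioned here is commonly done with the Aki-Utsu estimator. The sketch below applies it to a synthetic catalog; the completeness magnitude and the synthetic data are illustrative assumptions, not the authors' actual implementation, which additionally handles catalog incompleteness.

```python
import math
import random

def aki_utsu_b(mags, m_c, dm=0.0):
    """Aki-Utsu maximum likelihood b-value for magnitudes >= m_c.
    dm is the magnitude bin width (Utsu's dm/2 correction); use 0 for
    continuous magnitudes."""
    m = [x for x in mags if x >= m_c]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))

# Synthetic catalog drawn from an exponential (Gutenberg-Richter) law, b = 1
rng = random.Random(42)
beta = math.log(10.0)                       # b * ln(10) with b = 1
catalog = [4.0 + rng.expovariate(beta) for _ in range(5000)]
b_hat = aki_utsu_b(catalog, m_c=4.0)
```

    With 5000 events the estimator recovers b to within a few percent; on a real catalog the result is only meaningful above the magnitude of completeness.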

  15. Modified mercalli intensities for nine earthquakes in central and western Washington between 1989 and 1999

    Science.gov (United States)

    Brocher, Thomas M.; Dewey, James W.; Cassidy, John F.

    2017-08-15

    We determine Modified Mercalli (Seismic) Intensities (MMI) for nine onshore earthquakes of magnitude 4.5 and larger that occurred in central and western Washington between 1989 and 1999, on the basis of effects reported in postal questionnaires, the press, and professional collaborators. The earthquakes studied include four earthquakes of M5 and larger: the M5.0 Deming earthquake of April 13, 1990, the M5.0 Point Robinson earthquake of January 29, 1995, the M5.4 Duvall earthquake of May 3, 1996, and the M5.8 Satsop earthquake of July 3, 1999. The MMI are assigned using data and procedures that evolved at the U.S. Geological Survey (USGS) and its Department of Commerce predecessors and that were used to assign MMI to felt earthquakes occurring in the United States between 1931 and 1986. We refer to the MMI assigned in this report as traditional MMI, because they are based on responses to postal questionnaires and on newspaper reports, and to distinguish them from MMI calculated from data contributed by the public by way of the internet. Maximum traditional MMI documented for the M5 and larger earthquakes are VII for the 1990 Deming earthquake, V for the 1995 Point Robinson earthquake, VI for the 1996 Duvall earthquake, and VII for the 1999 Satsop earthquake; the five other earthquakes were variously assigned maximum intensities of IV, V, or VI. Starting in 1995, the Pacific Northwest Seismic Network (PNSN) published MMI maps for four of the studied earthquakes, based on macroseismic observations submitted by the public by way of the internet. With the availability now of the traditional USGS MMI interpreted for all the sites from which USGS postal questionnaires were returned, the four Washington earthquakes join a rather small group of earthquakes for which both traditional USGS MMI and some type of internet-based MMI have been assigned. 
The values and distributions of the traditional MMI are broadly similar to the internet-based PNSN intensities; we discuss some

  16. Thermal anomalies detection before strong earthquakes (M > 6.0) using interquartile, wavelet and Kalman filter methods

    Directory of Open Access Journals (Sweden)

    M. Akhoondzadeh

    2011-04-01

    , indicating that the anomalous behaviors can be related to impending earthquakes. The proposed method receives its credibility from the overall capabilities of the three integrated methods.

  17. Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes

    Science.gov (United States)

    Cheong, Siew Ann; Tan, Teck Liang; Chen, Chien-Chih; Chang, Wu-Lung; Liu, Zheng; Chew, Lock Yue; Sloot, Peter M. A.; Johnson, Neil F.

    2014-01-01

    Predicting how large an earthquake can be, where and when it will strike remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting from catalog data. We show how the equilibrium dynamics of this model very naturally explains the Gutenberg-Richter law. Using the high-resolution earthquake catalog of Taiwan between Jan 1994 and Feb 2009, we illustrate how out-of-equilibrium spatio-temporal signatures in the time interval between earthquakes and the integrated energy released by earthquakes can be used to reliably determine the times, magnitudes, and locations of large earthquakes, as well as the maximum numbers of large aftershocks that would follow. PMID:24406467

  18. Self-exciting point process in modeling earthquake occurrences

    International Nuclear Information System (INIS)

    Pratiwi, H.; Slamet, I.; Respatiwulan; Saputro, D. R. S.

    2017-01-01

    In this paper, we present a procedure for modeling earthquakes based on a spatial-temporal point process. The magnitude distribution is expressed as a truncated exponential, and the event frequency is modeled with a spatial-temporal point process that is characterized uniquely by its associated conditional intensity process. Earthquakes can be regarded as point patterns with a temporal clustering feature, so we use a self-exciting point process to model the conditional intensity function. The choice of main shocks is conducted via the window algorithm of Gardner and Knopoff, and the model can be fitted by the maximum likelihood method for three random variables. (paper)
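    A minimal sketch of the self-exciting idea: the conditional intensity below is a one-dimensional Hawkes process with an exponential kernel, λ(t) = μ + Σ_{t_i < t} α·exp(-β(t - t_i)). The parameter values and event times are illustrative assumptions; the paper's full spatial-temporal model is richer than this.

```python
import math

def hawkes_intensity(t, events, mu=0.1, alpha=0.5, beta=1.0):
    """Conditional intensity of a self-exciting (Hawkes) point process:
    a background rate mu plus an exponentially decaying boost from each
    past event, so every earthquake temporarily raises the event rate."""
    return mu + sum(alpha * math.exp(-beta * (t - ti))
                    for ti in events if ti < t)

events = [1.0, 2.5, 2.7]                    # hypothetical occurrence times
before = hawkes_intensity(0.99, events)     # only the background rate
after = hawkes_intensity(2.71, events)      # boosted by the cluster near t = 2.6
```

    The jump in intensity right after a cluster of events is exactly the temporal clustering feature the abstract invokes to justify a self-exciting model.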

  19. Experimental evidence that thrust earthquake ruptures might open faults.

    Science.gov (United States)

    Gabuchian, Vahe; Rosakis, Ares J; Bhat, Harsha S; Madariaga, Raúl; Kanamori, Hiroo

    2017-05-18

    Many of Earth's great earthquakes occur on thrust faults. These earthquakes predominantly occur within subduction zones, such as the 2011 moment magnitude 9.0 earthquake in Tohoku-Oki, Japan, or along large collision zones, such as the 1999 moment magnitude 7.7 earthquake in Chi-Chi, Taiwan. Notably, these two earthquakes had a maximum slip that was very close to the surface. This contributed to the destructive tsunami that occurred during the Tohoku-Oki event and to the large amount of structural damage caused by the Chi-Chi event. The mechanism that results in such large slip near the surface is poorly understood, as shallow parts of thrust faults are considered to be frictionally stable. Here we use earthquake rupture experiments to reveal the existence of a torquing mechanism of thrust fault ruptures near the free surface that causes them to unclamp and slip large distances. Complementary numerical modelling of the experiments confirms that the hanging-wall wedge undergoes pronounced rotation in one direction as the earthquake rupture approaches the free surface, and this torque is released as soon as the rupture breaks the free surface, resulting in the unclamping and violent 'flapping' of the hanging-wall wedge. Our results imply that the shallow extent of the seismogenic zone of a subducting interface is not fixed and can extend up to the trench during great earthquakes through a torquing mechanism.

  20. Is earthquake rate in south Iceland modified by seasonal loading?

    Science.gov (United States)

    Jonsson, S.; Aoki, Y.; Drouin, V.

    2017-12-01

    Several temporally varying processes have the potential of modifying the rate of earthquakes in the south Iceland seismic zone, one of the two most active seismic zones in Iceland. These include solid earth tides, seasonal meteorological effects, influence from passing weather systems, and variations in snow and glacier loads. In this study we investigate the influence these processes may have on crustal stresses and stressing rates in the seismic zone and assess whether they appear to be influencing the earthquake rate. While historical earthquakes in south Iceland have preferentially occurred in early summer, this tendency is less clear for small earthquakes. The local earthquake catalogue (going back to 1991, with a low magnitude of completeness) is strongly influenced by aftershocks of the two M6+ earthquakes, which occurred in June 2000 and May 2008. Standard Reasenberg earthquake declustering or more involved model-independent stochastic declustering algorithms are not capable of fully eliminating the aftershocks from the catalogue. We therefore inspected the catalogue for the time period before 2000, and it shows limited seasonal tendency in earthquake occurrence. Our preliminary results show no clear correlation between earthquake rates and short-term stressing variations induced by solid earth tides or passing storms. Seasonal meteorological effects also appear to be too small to influence the earthquake activity. Snow and glacier load variations induce significant vertical motions in the area, with peak loading occurring in spring (April-May) and maximum unloading in fall (Sept.-Oct.). The early summer occurrence of historical earthquakes therefore correlates with early unloading rather than with the peak unloading or the unloading rate, which appears to indicate a limited influence of this seasonal process on the earthquake activity.
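    The window-based declustering family mentioned above can be illustrated with the classic Gardner-Knopoff approach: each mainshock removes smaller events falling inside a magnitude-dependent space-time window. The window formulas below are the commonly quoted Gardner-Knopoff (1974) fits, and the tiny catalog is invented for illustration; Reasenberg's algorithm, used in the study, is a more involved linked-cluster method.

```python
import math

def gk_window(m):
    """Commonly quoted Gardner-Knopoff (1974) space-time window fits for a
    magnitude-m mainshock: distance in km, duration in days."""
    d_km = 10 ** (0.1238 * m + 0.983)
    t_days = 10 ** (0.032 * m + 2.7389) if m >= 6.5 else 10 ** (0.5409 * m - 0.547)
    return d_km, t_days

def decluster(catalog):
    """Keep mainshocks; drop any smaller event inside a larger event's window.
    Each event is (time_days, x_km, y_km, magnitude); aftershock windows
    only look forward in time in this simplified sketch."""
    kept = []
    for ev in sorted(catalog, key=lambda e: -e[3]):   # largest first
        t, x, y, m = ev
        inside = any(
            0 <= t - tm <= gk_window(mm)[1]
            and math.hypot(x - xm, y - ym) <= gk_window(mm)[0]
            for (tm, xm, ym, mm) in kept
        )
        if not inside:
            kept.append(ev)
    return sorted(kept)

catalog = [
    (0.0,    0.0, 0.0, 6.4),   # mainshock
    (3.0,    5.0, 0.0, 4.0),   # close in time and space -> aftershock
    (1000.0, 5.0, 0.0, 4.1),   # same place, years later  -> independent
    (1.0,  300.0, 0.0, 4.2),   # soon after but far away  -> independent
]
kept = decluster(catalog)
```

    The M4.0 event falls inside the M6.4 window (roughly 60 km and over two years for this magnitude) and is removed; the other two survive.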

  1. On the reliability of the geomagnetic quake as a short time earthquake's precursor for the Sofia region

    Directory of Open Access Journals (Sweden)

    S. Cht. Mavrodiev

    2004-01-01

    Full Text Available The local 'when' of earthquake prediction is based on the connection between geomagnetic 'quakes' and the next incoming minimum or maximum of the tidal gravitational potential. The probability time window for the predicted earthquake is approximately ±1 day around the tidal minimum and ±2 days around the maximum. A preliminary statistical estimate based on the distribution of the time difference between occurred and predicted earthquakes for the period 2002-2003 for the Sofia region is given. The possibility of creating a local 'when, where' earthquake research and prediction NETWORK is based on accurate monitoring of the electromagnetic field, with special space and time scales, under, on and over the Earth's surface. The periodically upgraded information from seismic hazard maps and other standard geodetic information, as well as other precursory information, is essential.

  2. Trust and credibility: measured by multidimensional scaling

    International Nuclear Information System (INIS)

    Warg, L.E.; Bodin, L.

    1998-01-01

    Full text of publication follows: a focus of much of today's research interest in risk communication is the fact that communities do not trust policy and decision makers such as politicians, government or industry people. This is especially serious in the years to come, when risk issues concerning, for example, the nuclear industry, global warming and hazardous waste are expected to be even higher on the political and social agenda all over the world. Despite the research efforts devoted to trust, society needs an in-depth understanding of trust for conducting successful communication regarding environmental hazards. The present abstract concerns an experimental study in psychology whose focus was the possibility of using the multidimensional scaling technique to explore the characteristics people consider to be of importance when they say that certain persons are credible. In the study, a total of 61 students of the University of Oerebro, Sweden, were required to make comparisons of the similarity between 12 well-known Swedish persons from politics, science, media, industry, the 'TV world' and literature (two persons at a time), regarding their credibility when making statements about risks in society. In addition, the subjects rated the importance of 19 factors for the credibility of a source. The 61 persons comprised three groups of students: pedagogists, business economists, and chemists. There were 61% women and 39% men, and the mean age was 23 years. The results will be analyzed using the multidimensional scaling technique. Differences between the three groups will be analyzed and presented, as well as those between men and women. In addition, the 19 factors will be discussed and considered when trying to label the dimensions accounted for by the multidimensional scaling technique. The results from this study will contribute to our understanding of important factors behind human judgments concerning trust and credibility. It will also point to a
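    Multidimensional scaling of pairwise similarity judgments like these can be sketched with classical (Torgerson) MDS: double-center the squared dissimilarity matrix and embed with its top eigenvectors. The block below recovers a 2-D configuration from a synthetic dissimilarity matrix; it is a generic illustration, not the analysis actually run in the study.

```python
import numpy as np

def classical_mds(d, k=2):
    """Classical (Torgerson) MDS: embed an n x n dissimilarity matrix d
    into k dimensions via double centering and eigendecomposition."""
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    b = -0.5 * j @ (d ** 2) @ j                  # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)
    order = np.argsort(vals)[::-1][:k]           # largest eigenvalues first
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

# Synthetic "stimuli" laid out in a plane; their distance matrix plays the
# role of averaged credibility-similarity judgments between persons.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0], [3.0, 1.0]])
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
coords = classical_mds(d, k=2)
```

    For exact Euclidean dissimilarities the embedding reproduces the input distances up to rotation; for real judgment data the researcher inspects how much variance the leading dimensions absorb and then tries to label them, as the abstract describes.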

  3. One feature of the activated southern Ordos block: the Ziwuling small earthquake cluster

    Directory of Open Access Journals (Sweden)

    Li Yuhang

    2014-08-01

    Full Text Available Small earthquakes (Ms > 2.0) have been recorded from 1970 to the present day and reveal a significant difference in seismicity between the stable Ordos block and its active surroundings. In the southern Ordos block there is a conspicuous small-earthquake belt, clustered and isolated along the NNW direction, which extends into the stable interior of the Ordos block; no active fault can be matched with this small-earthquake cluster. In this paper, we analyze the dynamic mechanism of this small-earthquake cluster based on the GPS velocity field (from 1999 to 2007, mainly from the Crustal Movement Observation Network of China (CMONOC)) with respect to the north and south China blocks. The principal directions of the strain rate field, the expansion rate field, the maximum shear strain rate, and the rotation rate were constrained using the GPS velocity field. The results show that the velocity field, which is bounded by the small-earthquake cluster from Tongchuan to Weinan, differs from the strain rate field, and the crustal deformation is left-lateral shear. This left-lateral shear belt coincides spatially not only with the Neo-tectonic belt in the Weihe Basin but also with the NNW small-earthquake cluster (the Ziwuling small earthquake cluster). Based on these studies, we speculate that the NNW small-earthquake cluster is caused by left-lateral shear slip, which is prone to strain accumulation. When the strain is released along the weak zone of the structure, small earthquakes diffuse within the upper crust. The maximum principal compression stress direction changed from NE-SW to NEE-SWW, and the former reverse faults in the southwestern margin of the Ordos block became left-lateral strike-slip faults due to readjustment of the tectonic stress field after the middle Pleistocene. The NNW Neo-tectonic belt in the Weihe Basin, the different movement character of the inner Weihe Basin (which was demonstrated through GPS measurements) and the small-earthquake cluster belt reflect the activated

  4. The earthquakes of stable continental regions. Volume 2: Appendices A to E. Final report

    International Nuclear Information System (INIS)

    Johnston, A.C.; Kanter, L.R.; Coppersmith, K.J.; Cornell, C.A.

    1994-12-01

    The objectives of the study were to develop a comprehensive database of earthquakes in stable continental regions (SCRs) and to statistically examine use of the database for the assessment of large earthquake potential. We identified nine major and several minor SCRs worldwide and compiled a database of geologic characteristics of tectonic domains within each SCR. We examined all available earthquake data from SCRs, from historical accounts of events with no instrumental ground-motion data to present-day instrumentally recorded events. In all, 1,385 events were analyzed. Using moment magnitude 4.5 as the lower bound threshold for inclusion in the database, 870 were assigned to an SCR, 124 were found to be transitional to an SCR, and 391 were examined, but rejected. We then performed a seismotectonic analysis to determine what distinguishes seismic activity in SCRs from other types of crust, such as active plate margins or active continental regions. General observations are: (1) SCRs comprise nearly two-thirds of all continental crust of which 25% is considered to be extended (i.e., rifted); (2) the majority of seismic energy release and the largest earthquakes in SCRs have occurred in extended crust; and (3) active plate margins release seismic energy at a rate per unit area approximately 7,000 times the average for non-extended SCRs. Finally, results of a statistical examination of distributions of historical maximum earthquakes between different crustal domain types indicated that additional information is needed in order to adequately constrain estimates of maximum earthquakes for any given region. Thus, a Bayesian approach was developed in which statistical constraints from the database were used to develop a prior distribution, which may then be combined with source-specific information to constrain maximum magnitude assessments for use in probabilistic seismic hazard analyses
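    The Bayesian approach described here (a prior distribution on maximum magnitude from the global database, updated with source-specific observations) can be sketched on a discrete grid. The prior shape, b-value and catalog numbers below are invented for illustration; the likelihood used is the standard truncated Gutenberg-Richter form, not necessarily the exact one in the report.

```python
import math

def posterior_mmax(grid, prior, m0, m_obs, n, b=1.0):
    """Discrete Bayesian update for maximum magnitude Mmax:
    posterior(Mmax) is proportional to prior(Mmax) * F(m_obs | Mmax)**n,
    where F is the truncated-exponential (Gutenberg-Richter) CDF on
    [m0, Mmax], m_obs the largest observed magnitude, n the event count."""
    beta = b * math.log(10.0)
    post = []
    for mmax, p in zip(grid, prior):
        if mmax < m_obs:
            post.append(0.0)        # Mmax cannot lie below the observed max
            continue
        cdf = ((1.0 - math.exp(-beta * (m_obs - m0)))
               / (1.0 - math.exp(-beta * (mmax - m0))))
        post.append(p * cdf ** n)
    total = sum(post)
    return [x / total for x in post]

grid = [round(5.0 + 0.1 * i, 1) for i in range(31)]   # Mmax candidates 5.0-8.0
prior = [1.0 / len(grid)] * len(grid)                 # flat prior, illustrative
post = posterior_mmax(grid, prior, m0=4.5, m_obs=6.2, n=200)
```

    With a flat prior the posterior peaks at the observed maximum and decays for larger Mmax; an informative prior from the global SCR database would shift that mass, which is the point of the approach.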

  5. Establishing Credibility in the Multicultural Classroom: When the Instructor Speaks with an Accent

    Science.gov (United States)

    McLean, Chikako Akamatsu

    2007-01-01

    Applying theories of cultural dimensions, teacher credibility, and nonverbal immediacy, this chapter explores classroom management techniques used by Asian female teachers to establish credibility. (Contains 1 note.)

  6. Facts learnt from the Hanshin-Awaji disaster and consideration on design basis earthquake

    International Nuclear Information System (INIS)

    Shibata, Heki

    1997-01-01

    This paper deals with how to establish the concept of the design basis earthquake for critical industrial facilities such as nuclear power plants, in light of the disasters induced by the 1995 Hyogoken-Nanbu Earthquake (Southern Hyogo-prefecture Earthquake, 1995), the so-called Kobe earthquake. The author once discussed various DBEs at the 7th WCEE. At that time, the author assumed that the strongest effective PGA would be 0.7 G, and compared it to the values of accelerations on a structure obtained by various codes in Japan and other countries. The maximum PGA observed by an instrument in the Southern Hyogo-prefecture Earthquake of 1995 exceeded the author's previous assumption, even though the evaluation results of the previous paper had been pessimistic. Based on the experience of the Kobe event, the author points out the necessity of a third earthquake level, Ss, in addition to S1 and S2, the previous DBEs. (author)

  7. Development of a Severe Accident Mitigation Support with Speediness and Credibility

    International Nuclear Information System (INIS)

    Hur, Sup; Park, Jae Chang; Choi, Jong Gyun; Kim, Jung Taek; Kim, Chang Hwoi

    2014-01-01

    This study suggests a methodology for severe accident mitigation support with speediness and credibility. Using this methodology, the severe accident is automatically identified based on an information credibility check. Then, the proper mitigation function, the available mitigation routes, and an optimal mitigation path are automatically suggested. The basic logic of the information credibility check rests on environmental evaluation, historical evaluation and some conventional methods such as redundancy and diversity comparison of instruments. To identify the available mitigation routes, the availability of paths and components, source status, process limitations, expected adverse effects, and the mitigation capability of each path were automatically evaluated. Among the available routes, the optimal mitigation path was finally suggested based on the path priority criteria and physical relationships.

  8. Variations of Background Seismic Noise Before Strong Earthquakes, Kamchatka.

    Science.gov (United States)

    Kasimova, V.; Kopylova, G.; Lyubushin, A.

    2017-12-01

    The network of broadband seismic stations of the Geophysical Service (Russian Academy of Sciences) operates on the territory of the Kamchatka peninsula in the Far East of Russia. We used continuous records on Z-channels at 21 stations to create background seismic noise time series for 2011-2017. Average daily parameters of the multi-fractal singularity spectra were calculated at each station using 1-minute records. Maps and graphs of their spatial distribution and temporal changes were constructed at time scales from days to several years. We also analyzed the coherent behavior of the time series of these statistics. The technique involved splitting the seismic network into groups of stations, taking into account the coastal effect, the network configuration and the main tectonic elements of Kamchatka. Time series of median values of the noise parameters were then computed for each group of stations, and the frequency-time diagrams of the evolution of the spectral measure of coherent behavior of the four time series were analyzed. The time intervals and frequency bands of the maximum values, showing increased coherence in the changes of all statistics, were evaluated. Strong earthquakes with magnitudes M=6.9-8.3 occurred near the Kamchatka peninsula during the observations. Synchronous variations of the background noise parameters and an increase in the coherent behavior of the median values of the statistical parameters were observed before the two earthquakes of 2013 (February 28, Mw=6.9; May 24, Mw=8.3), within 3-9 months, and before the earthquake of January 30, 2016, Mw=7.2, within 3-6 months. The maximum effect of increased coherence in the range of periods of 4-5.5 days corresponds to the time of preparation of the two strong earthquakes in 2013 and their aftershock processes. Peculiarities in the changes of the statistical parameters at the stages of preparation of strong earthquakes indicate the attenuation of high-amplitude outliers and the loss of multi-fractal properties in

  9. On a method of evaluation of failure rate of equipment and pipings under excess-earthquake loadings

    International Nuclear Information System (INIS)

    Shibata, H.; Okamura, H.

    1979-01-01

    This paper deals with a method of evaluating the failure rate of equipment and piping in nuclear power plants under an earthquake that exceeds the design basis earthquake. If the ratio of the maximum ground acceleration of an earthquake to that of the design basis earthquake is denoted n, then the failure rate, or probability of failure, is a function of n, p(n). The purpose of this study is to establish a procedure for evaluating the relation between n and p(n). (orig.)
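
    The record leaves the form of p(n) open. A minimal sketch of such a relation, using the common lognormal fragility-curve parameterization (the median ratio and dispersion below are illustrative assumptions, not values from the paper):

```python
from math import erf, log, sqrt

def fragility(n, median=2.0, beta=0.5):
    """Lognormal fragility curve: probability of failure when peak ground
    acceleration is n times the design-basis value.
    median: ratio n at which failure probability reaches 50% (assumed).
    beta: lognormal dispersion of the capacity (assumed)."""
    if n <= 0:
        return 0.0
    # Standard normal CDF evaluated at ln(n / median) / beta
    z = log(n / median) / beta
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# At the design-basis level (n = 1) the failure probability is small;
# at n = median it is exactly 0.5 and rises toward 1 beyond that.
print(fragility(2.0))  # -> 0.5
```

    The sigmoid shape (near zero at the design basis, saturating above the median capacity) is the standard assumption in seismic probabilistic risk assessment; the paper's actual p(n) procedure may differ.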

  10. 8 CFR 1208.30 - Credible fear determinations involving stowaways and applicants for admission found inadmissible...

    Science.gov (United States)

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Credible fear determinations involving..., DEPARTMENT OF JUSTICE IMMIGRATION REGULATIONS PROCEDURES FOR ASYLUM AND WITHHOLDING OF REMOVAL Credible Fear of Persecution § 1208.30 Credible fear determinations involving stowaways and applicants for...

  11. Consider the source: persuasion of implicit evaluations is moderated by source credibility.

    Science.gov (United States)

    Smith, Colin Tucker; De Houwer, Jan; Nosek, Brian A

    2013-02-01

    The long history of persuasion research shows how to change explicit, self-reported evaluations through direct appeals. At the same time, research on how to change implicit evaluations has focused almost entirely on techniques of retraining existing evaluations or manipulating contexts. In five studies, we examined whether direct appeals can change implicit evaluations in the same way as they do explicit evaluations. Both explicit and implicit evaluations showed greater evidence of persuasion following information presented by a highly credible source than by a source low in credibility. Whereas cognitive load did not alter the effect of source credibility on explicit evaluations, source credibility affected the persuasion of implicit evaluations only when participants were encouraged and able to consider information about the source. Our findings reveal the relevance of persuasion research for changing implicit evaluations and provide new ideas about the processes underlying both types of evaluation.

  12. Characteristics of global strong earthquakes and their implications ...

    Indian Academy of Sciences (India)

    as important sources for describing the present-day stress field and regime. ..... happened there will indicate relative movements between Pacific plate and Australia ... time, and (b) earthquake slip occurs in the direction of maximum shear stress .... circum-pacific seismic belt and the Himalaya collision boundary as shown in ...

  13. Estimation of Slip Distribution of the 2007 Bengkulu Earthquake from GPS Observation Using Least Squares Inversion Method

    Directory of Open Access Journals (Sweden)

    Moehammad Awaluddin

    2012-07-01

    Continuous Global Positioning System (GPS) observations showed significant crustal displacements as a result of the Bengkulu earthquake of September 12, 2007. A maximum horizontal displacement of 2.11 m was observed at the PRKB station, while the vertical component at the BSAT station was uplifted by a maximum of 0.73 m and the vertical component at the LAIS station subsided by 0.97 m. Adding constraints to the inversion for the Bengkulu earthquake slip distribution from GPS observations helps solve a least squares inversion under an otherwise under-determined condition. Checkerboard tests were performed to guide the weighting of the constraints. The inversion of the Bengkulu earthquake slip distribution yielded an optimum slip distribution with a smoothing-constraint weight of 0.001 and a slip-value constraint weight of 0 at the edge of the earthquake rupture area. The maximum coseismic slip from the optimal inversion was 5.12 m in the lower area of the PRKB and BSAT stations. The seismic moment calculated from the optimal slip distribution was 7.14 x 10^21 Nm, which is equivalent to a magnitude of 8.5.
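
    The two computational steps the abstract mentions can be sketched as follows: a damped least squares solve (stacking weighted smoothing rows under the data equations, a standard trick that stands in for the constrained inversion described; the matrices here are toy stand-ins, not the Bengkulu data) and the standard moment-magnitude conversion, which reproduces the quoted Mw of 8.5 from the quoted moment:

```python
import numpy as np

def damped_lstsq(G, d, L, lam):
    """Least squares with a smoothing constraint: minimize
    ||G m - d||^2 + lam^2 ||L m||^2 by appending the weighted
    regularization rows beneath the data equations."""
    A = np.vstack([G, lam * L])
    b = np.concatenate([d, np.zeros(L.shape[0])])
    m, *_ = np.linalg.lstsq(A, b, rcond=None)
    return m

def moment_magnitude(m0_newton_meters):
    """Standard moment magnitude: Mw = (2/3)(log10 M0 - 9.1), M0 in N m."""
    return (2.0 / 3.0) * (np.log10(m0_newton_meters) - 9.1)

# The quoted seismic moment of 7.14e21 N m indeed corresponds to Mw ~ 8.5:
print(round(moment_magnitude(7.14e21), 2))  # -> 8.5
```

    Larger smoothing weights pull the slip model toward a flat solution; the study's reported weight of 0.001 balances data fit against roughness in this same spirit.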

  14. Real-Time Earthquake Monitoring with Spatio-Temporal Fields

    Science.gov (United States)

    Whittier, J. C.; Nittel, S.; Subasinghe, I.

    2017-10-01

    With live streaming sensors and sensor networks, increasingly large numbers of individual sensors are deployed in physical space. Sensor data streams are a fundamentally novel mechanism for delivering observations to information systems. They enable us to represent spatio-temporally continuous phenomena such as radiation accidents, toxic plumes, or earthquakes almost as instantaneously as they happen in the real world. Sensor data streams sample an earthquake discretely, while the earthquake is continuous over space and time. Programmers attempting to integrate many streams to analyze earthquake activity and scope must integrate potentially very large sets of asynchronously sampled, concurrent streams in tedious application code. In previous work, we proposed the field stream data model (Liang et al., 2016) for data stream engines. Abstracting the stream of an individual sensor as a temporal field, the field represents the Earth's movement at the sensor position as continuous. This simplifies analysis across many sensors significantly. In this paper, we undertake a feasibility study of using the field stream model and the open source data stream engine (DSE) Apache Spark (Apache Spark, 2017) to implement real-time earthquake event detection with a subset of the 250 GPS sensor data streams of the Southern California Integrated GPS Network (SCIGN). The field-based real-time stream queries compute maximum displacement values over the latest query window of each stream and relate spatially neighboring streams to identify earthquake events and their extent. Further, we correlated the detected events with a USGS earthquake event feed. The query results are visualized in real time.
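
    The query logic described (per-stream windowed maximum displacement, then an event test over neighboring streams) can be sketched outside Spark. This toy version, with invented station names, window size, and threshold, illustrates only the shape of the computation, not the authors' DSE implementation:

```python
from collections import deque

class WindowMax:
    """Sliding-window maximum of absolute displacement for one GPS
    stream (a stand-in for the per-sensor field-stream query)."""
    def __init__(self, window_size):
        self.window = deque(maxlen=window_size)

    def push(self, displacement):
        # Keep only the latest window_size samples; report their peak.
        self.window.append(abs(displacement))
        return max(self.window)

def detect(station_peaks, threshold):
    """Flag an event when at least two stations exceed the displacement
    threshold in the same window (the spatial-neighbor step, simplified
    here to a simple count)."""
    exceeding = [name for name, peak in station_peaks.items() if peak >= threshold]
    return exceeding if len(exceeding) >= 2 else []

w = WindowMax(window_size=3)
for x in [0.01, -0.02, 0.5, 0.04]:
    peak = w.push(x)
print(peak)  # max over the last 3 samples: 0.5
```

    A real deployment would replace the count-based test with an actual spatial-neighborhood query over station coordinates.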

  15. Listening to the 2011 magnitude 9.0 Tohoku-Oki, Japan, earthquake

    Science.gov (United States)

    Peng, Zhigang; Aiken, Chastity; Kilb, Debi; Shelly, David R.; Enescu, Bogdan

    2012-01-01

    The magnitude 9.0 Tohoku-Oki, Japan, earthquake on 11 March 2011 is the largest earthquake to date in Japan’s modern history and is ranked as the fourth largest earthquake in the world since 1900. This earthquake occurred within the northeast Japan subduction zone (Figure 1), where the Pacific plate is subducting beneath the Okhotsk plate at a rate of ∼8–9 cm/yr (DeMets et al. 2010). This type of extremely large earthquake within a subduction zone is generally termed a “megathrust” earthquake. Strong shaking from this magnitude 9 earthquake engulfed the entire Japanese Islands, reaching a maximum acceleration ∼3 times that of gravity (3 g). Two days prior to the main event, a foreshock sequence occurred, including one earthquake of magnitude 7.2. Following the main event, numerous aftershocks occurred around the main slip region; the largest of these was magnitude 7.9. The entire foreshock-mainshock-aftershock sequence was well recorded by thousands of sensitive seismometers and geodetic instruments across Japan, resulting in the best-recorded megathrust earthquake in history. This devastating earthquake resulted in significant damage and high death tolls caused primarily by the associated large tsunami. This tsunami reached heights of more than 30 m, and inundation propagated inland more than 5 km from the Pacific coast, which also caused a nuclear crisis that is still affecting people’s lives in certain regions of Japan.

  16. Scientists Examine Challenges and Lessons From Japan's Earthquake and Tsunami

    Science.gov (United States)

    Showstack, Randy

    2011-03-01

    A week after the magnitude 9.0 great Tohoku earthquake and the resulting tragic and damaging tsunami of 11 March struck Japan, the ramifications continued, with a series of major aftershocks (as Eos went to press, there had been about 4 dozen with magnitudes greater than 6); the grim search for missing people—the death toll was expected to approximate 10,000; the urgent assistance needed for the more than 400,000 homeless and the 1 million people without water; and the frantic efforts to avert an environmental catastrophe at Japan's damaged Fukushima Daiichi Nuclear Power Station, about 225 kilometers northeast of Tokyo, where radiation was leaking. The earthquake offshore of Honshu in northeastern Japan (see Figure 1) was a plate boundary rupture along the Japan Trench subduction zone, with the source area of the earthquake estimated at 400-500 kilometers long with a maximum slip of 20 meters, determined through various means including Global Positioning System (GPS) and seismographic data, according to Kenji Satake, professor at the Earthquake Research Institute of the University of Tokyo. In some places the tsunami may have topped 7 meters—the maximum instrumental measurement at many coastal tide gauges—and some parts of the coastline may have been inundated more than 5 kilometers inland, Satake indicated. The International Tsunami Information Center (ITIC) noted that eyewitnesses reported that the highest tsunami waves were 13 meters high. Satake also noted that continuous GPS stations indicate that the coast near Sendai—which is 130 kilometers west of the earthquake and is the largest city in the Tohoku region of Honshu—moved more than 4 meters horizontally and subsided about 0.8 meter.

  17. Effects of message repetition and negativity on credibility judgments and political attitudes

    NARCIS (Netherlands)

    Ernst, N.; Kühne, R.; Wirth, W.

    2017-01-01

    Research on the truth effect has demonstrated that statements are rated as more credible when they are repeatedly presented. However, current research indicates that there are limits to the truth effect and that too many repetitions can decrease message credibility. This study investigates whether

  18. Fault geometry and earthquake mechanics

    Directory of Open Access Journals (Sweden)

    D. J. Andrews

    1994-06-01

    volume increment for a given slip increment becomes larger. A junction with past accumulated slip u0 is a strong barrier to earthquakes with maximum slip um < 2(P/µ)u0 = u0/50. As slip continues to occur elsewhere in the fault system, a stress concentration will grow at the old junction. A fresh fracture may occur in the stress concentration, establishing a new triple junction and allowing continuity of slip in the fault system. The fresh fracture could provide the instability needed to explain earthquakes. Perhaps a small fraction (on the order of P/µ) of the surface that slips in any earthquake is fresh fracture. Stress drop occurs only on this small fraction of the rupture surface, the asperities. Strain change in the asperities is on the order of P/µ. Therefore this model predicts the average strain change in an earthquake to be on the order of (P/µ)^2 = 0.0001, as is observed.

  19. Time history nonlinear earthquake response analysis considering materials and geometrical nonlinearity

    International Nuclear Information System (INIS)

    Kobayashi, T.; Yoshikawa, K.; Takaoka, E.; Nakazawa, M.; Shikama, Y.

    2002-01-01

    A time history nonlinear earthquake response analysis method was proposed and applied to earthquake response prediction for the Large Scale Seismic Test (LSST) Program in Hualien, Taiwan, in which a 1/4-scale model of a nuclear reactor containment structure was constructed on a sandy gravel layer. The analysis considered both strain-dependent material nonlinearity and geometrical nonlinearity due to base mat uplift. The 'Lattice Model' was employed for the soil-structure interaction model. An earthquake record on the soil surface at the site was used as the control motion and deconvoluted to the input motion of the analysis model at GL-52 m, with a maximum acceleration of 300 Gal. Two analyses were compared: (A) time history nonlinear and (B) equivalent linear, and the advantage of the time history nonlinear earthquake response analysis method is discussed

  20. Large earthquake rates from geologic, geodetic, and seismological perspectives

    Science.gov (United States)

    Jackson, D. D.

    2017-12-01

    up to about magnitude 7. Regional forecasts for a few decades, like those in UCERF3, could be improved by calibrating tectonic moment rate to past seismicity rates. Century-long forecasts must be speculative. Estimates of maximum magnitude and rate of giant earthquakes over geologic time scales require more than science.

  1. Facts learnt from the Hanshin-Awaji disaster and consideration on design basis earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Shibata, Heki [Yokohama National Univ. (Japan). Faculty of Engineering

    1997-03-01

    This paper deals with how to establish the concept of the design basis earthquake for critical industrial facilities, such as nuclear power plants, in light of the disasters induced by the 1995 Hyogoken-Nanbu Earthquake (Southern Hyogo Prefecture Earthquake, 1995), the so-called Kobe earthquake. The author once discussed various DBEs at the 7th WCEE. At that time, the author assumed that the strongest effective PGA would be 0.7 G, and compared this to the structural acceleration values obtained from various codes in Japan and other countries. The maximum PGA observed by an instrument in the Southern Hyogo Prefecture Earthquake of 1995 exceeded the author's previous assumption, even though the evaluation results of the previous paper had been pessimistic. Based on the experience of the Kobe event, the author points out the necessity of a third earthquake level, S_s, in addition to the previous DBEs S_1 and S_2. (author)

  2. The 'Thin film of gold': monetary rules and policy credibility

    OpenAIRE

    Niall Ferguson; Moritz Schularick

    2012-01-01

    This paper asks whether developing countries can reap credibility gains from submitting policy to a strict monetary rule. Following earlier work, we look at the gold standard era (1880-1914) as a "natural experiment" to test whether adoption of a rule-based monetary framework such as the gold standard increased policy credibility. On the basis of the largest possible dataset covering almost sixty independent and colonial borrowers in the London market, we challenge the traditional view that g...

  3. Earthquakes

    Science.gov (United States)

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  4. Intensity estimation of historical earthquakes through seismic analysis of wooden house

    International Nuclear Information System (INIS)

    Choi, I. K.; Soe, J. M.

    1999-01-01

    The intensities of historical earthquake records related to house collapses are estimated through seismic analyses of a traditional three-bay straw-roof house. Eighteen artificial time histories for magnitudes 6-8, epicentral distances of 5 km - 350 km, and hard and soft soil conditions were generated for the analyses. Nonlinear dynamic analyses were performed for the traditional three-bay straw-roof house. The damage level of the wooden house under the input earthquake motions and the corresponding MM intensity were estimated from the maximum displacement response at the tops of the columns. Considering the structural characteristics of the three-bay straw-roof house, the largest historical earthquake related to house collapse is about MMI VIII

  5. Credible accident analyses for TRIGA and TRIGA-fueled reactors

    International Nuclear Information System (INIS)

    Hawley, S.C.; Kathren, R.L.

    1982-04-01

    Credible accidents were developed and analyzed for TRIGA and TRIGA-fueled reactors. The only potential for offsite exposure appears to be from a fuel-handling accident that, based on highly conservative assumptions, would result in dose equivalents of less than or equal to 1 mrem to the total body from noble gases and less than or equal to 1.2 rem to the thyroid from radioiodines. Credible accidents from excess reactivity insertions, metal-water reactions, lost, misplaced, or inadvertent experiments, core rearrangements, and changes in fuel morphology and ZrH_x composition are also evaluated, and suggestions for further study are provided

  6. Determination of Love- and Rayleigh-Wave Magnitudes for Earthquakes and Explosions and Other Studies

    Science.gov (United States)

    2012-12-30

    Jessie L. Bonner, Anastasia Stroujkova, Dale Anderson, Jonathan... (Weston Geophysical Corporation). Includes the section "Maximum Likelihood Estimation: Application to Middle East Earthquake Data" by Anastasia Stroujkova and Jessie Bonner.

  7. Examining the Effect of Endorser Credibility on the Consumers' Buying Intentions: An Empirical Study in Turkey

    OpenAIRE

    Sertoglu, Aysegul Ermec; Catlı, Ozlem; Korkmaz, Sezer

    2014-01-01

    The purpose of this study is to test whether the source credibility affects buying intention and measure the perceived credibility differences between created spokesperson and celebrity endorser. The influence that endorser credibility dimensions (i.e. attractiveness, trustworthiness and expertise) have on purchase intentions of 326 young consumers has been examined. The results showed that all of the three credibility dimensions for both celebrity endorser and created spokesperson have a pos...

  9. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    International Nuclear Information System (INIS)

    Saragoni, G. Rodolfo

    2008-01-01

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus Messina-Reggio Calabria 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some still-existing buildings that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods in use. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile 1985, Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study, the only three centennial three-story buildings that survived both earthquakes almost undamaged were identified. Since accelerograms of the 1985 earthquake were recorded both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. It is therefore recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand

  10. Lower bound earthquake magnitude for probabilistic seismic hazard evaluation

    International Nuclear Information System (INIS)

    McCann, M.W. Jr.; Reed, J.W.

    1990-01-01

    This paper presents the results of a study that develops an engineering and seismological basis for selecting a lower-bound magnitude (LBM) for use in seismic hazard assessment. As part of a seismic hazard analysis the range of earthquake magnitudes that are included in the assessment of the probability of exceedance of ground motion must be defined. The upper-bound magnitude is established by earth science experts based on their interpretation of the maximum size of earthquakes that can be generated by a seismic source. The lower-bound or smallest earthquake that is considered in the analysis must also be specified. The LBM limits the earthquakes that are considered in assessing the probability that specified ground motion levels are exceeded. In the past there has not been a direct consideration of the appropriate LBM value that should be used in a seismic hazard assessment. This study specifically looks at the selection of a LBM for use in seismic hazard analyses that are input to the evaluation/design of nuclear power plants (NPPs). Topics addressed in the evaluation of a LBM are earthquake experience data at heavy industrial facilities, engineering characteristics of ground motions associated with small-magnitude earthquakes, probabilistic seismic risk assessments (seismic PRAs), and seismic margin evaluations. The results of this study and the recommendations concerning a LBM for use in seismic hazard assessments are discussed. (orig.)

  11. Geological and seismotectonic characteristics of the broader area of the October 15, 2016, earthquake (Ioannina, Greece)

    Science.gov (United States)

    Pavlides, Spyros; Ganas, Athanasios; Chatzipetros, Alexandros; Sboras, Sotiris; Valkaniotis, Sotiris; Papathanassiou, George; Thomaidou, Efi; Georgiadis, George

    2017-04-01

    This paper examines the seismotectonic setting of the moderate earthquake of October 15, 2016, Mw=5.3 (or 5.5), in the broader area of Ioannina (Epirus, Greece). In this region, the problem of reviewing the geological structure with new and modern methods and techniques, in relation to the geological-seismological evidence of the recent seismic sequence, is addressed. The seismic triggering of landslides and other ground deformations is also examined. The earthquake is interpreted as indicative of a geotectonic environment of lithospheric compression, which comprises the backbone of the Pindos mountain range; this zone starts from southern Albania and traverses western Greece in an almost N-S direction. This is a seismically active region with a history of strong and moderate earthquakes, such as those of 1969 (Ms=5.8), 1960 (southern Albania, M>6.5, maximum intensity VIII+), and 1967 (Arta-Ioannina, M=6.4, maximum intensity IX). The recent earthquake is associated with a known fault zone as recorded and identified in the Greek Database of Seismogenic Sources (GreDaSS, www.gredass.unife.it). Focal mechanism data indicate that the seismic fault is a reverse or high-angle thrust fault, striking NNW-SSE and dipping to the east. The upper (brittle) part of the Epirus crust, which has an estimated maximum thickness of 10 km, does not show any significant seismicity. The deeper seismicity at 10-20 km, such as that of the recent earthquake, is caused by deep crustal processes on reverse or high-angle thrust faults. We suggest that this earthquake is a peculiar, complex case that requires careful study and attention. The precise determination of the seismogenic fault and its dimensions, although not possible through direct field observations, can be assessed through the study of seismological and geodetic data (GPS, satellite images, stress transfer), as well as its seismic behavior. Field work in the broader area, in combination with instrumental data, can contribute to

  12. Methods and problems of determination of paleoearthquake magnitudes from fault source parameters

    International Nuclear Information System (INIS)

    Chang, C. J.; Choi, W. H.; Yeon, K. H.; Park, D. H.; Im, C. B.

    2004-01-01

    It has been debated whether some of the Quaternary faults discovered near the nuclear power plant site in the southeastern Korean peninsula are capable; it was therefore necessary to estimate the maximum earthquake potential from the fault source parameters. In this study, we reviewed and analyzed methods of evaluating the maximum earthquake potential, and evaluated the maximum credible earthquake from the fault source parameters, excluding the factor of faulting time. We obtained a paleomagnitude range of M 6.82-7.21, with a mean of M 6.98, from a fault with 1.5 m of displacement among the Quaternary faults surveyed along the coastline of the East Sea. We also obtained mean values of M 5.36, M 7.47, and M 6.46 from another fault based on its fault surface length of 1.5 km, its displacement of 4 m, and its rate of seismic moment release, respectively. We consider that the differing paleomagnitudes arise from over- and under-estimation factors in estimating the earthquake potential, and from incomplete identification of the detailed geometry and dynamics of the faults
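
    The record does not name the regressions it used to convert fault parameters to paleomagnitudes. Assuming the widely used Wells and Coppersmith (1994) all-slip-type relations (an assumption, not the study's stated method), a 1.5 m maximum displacement and a 1.5 km surface rupture length give magnitudes in the same neighborhood as the quoted values:

```python
from math import log10

def mw_from_surface_rupture_length(srl_km):
    """Wells & Coppersmith (1994), all slip types:
    M = 5.08 + 1.16 log10(surface rupture length in km)."""
    return 5.08 + 1.16 * log10(srl_km)

def mw_from_max_displacement(md_m):
    """Wells & Coppersmith (1994), all slip types:
    M = 6.69 + 0.74 log10(maximum displacement in m)."""
    return 6.69 + 0.74 * log10(md_m)

# A 1.5 m displacement gives M ~ 6.8, close to the record's lower bound
# of M 6.82; a 1.5 km rupture length gives M ~ 5.3, in the neighborhood
# of the quoted M 5.36 (the study's exact regressions may differ).
print(round(mw_from_max_displacement(1.5), 2))        # -> 6.82
print(round(mw_from_surface_rupture_length(1.5), 2))  # -> 5.28
```

    The spread between length-based and displacement-based estimates in this sketch mirrors the spread the record reports between its three parameter-based paleomagnitudes.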

  13. Long-term earthquake forecasts based on the epidemic-type aftershock sequence (ETAS) model for short-term clustering

    Directory of Open Access Journals (Sweden)

    Jiancang Zhuang

    2012-07-01

    Based on the ETAS (epidemic-type aftershock sequence) model, which describes the short-term clustering of earthquake occurrence, this paper presents theories and techniques for evaluating the probability distribution of the maximum magnitude in a given space-time window, where the Gutenberg-Richter law for the earthquake magnitude distribution cannot be applied directly. It is seen that the distribution of the maximum magnitude in a given space-time volume is determined in the long term by the background seismicity rate and the magnitude distribution of the largest events in each earthquake cluster. The techniques introduced were applied to seismicity in the Japan region from 1926 to 2009. It was found that the regions most likely to have big earthquakes are along the Tohoku (northeastern Japan) Arc and the Kuril Arc, both with much higher probabilities than the offshore Nankai and Tokai regions.
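
    The time-independent core of such a maximum-magnitude calculation (a Poisson background rate combined with Gutenberg-Richter magnitudes) can be sketched as follows. This deliberately omits the ETAS clustering terms that are the paper's actual contribution, and all parameter values are illustrative:

```python
from math import exp

def prob_max_magnitude_exceeds(m, rate_per_year, years, b=1.0, m_min=5.0):
    """Under a Poisson background with Gutenberg-Richter magnitudes, the
    expected number of events with magnitude >= m in the window is
        N(m) = rate * years * 10**(-b * (m - m_min)),
    so the chance the window's maximum magnitude reaches m is
        P(max >= m) = 1 - exp(-N(m)).
    The ETAS model adds short-term clustering on top of this."""
    n = rate_per_year * years * 10.0 ** (-b * (m - m_min))
    return 1.0 - exp(-n)

# With 10 M>=5 events per year, the 30-year chance of an M>=7 event:
p = prob_max_magnitude_exceeds(7.0, rate_per_year=10.0, years=30)
print(round(p, 2))  # N = 300 * 10**-2 = 3.0, so p = 1 - e**-3 ~ 0.95
```

    Regions with higher background rates (like the Tohoku and Kuril Arcs in the study) push N(m), and hence this probability, upward for any fixed target magnitude.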

  14. In Your Facebook: Examining Facebook Usage as Misbehavior on Perceived Teacher Credibility

    Science.gov (United States)

    Hutchens, Jason S.; Hayes, Timothy

    2014-01-01

    Teachers sometimes do things that negatively impact their own credibility in classroom settings. One way instructors maintain credibility among students is by keeping a veil between their personal and professional personas. The advent of Facebook presents new challenges for instructors seeking to keep their personal lives private in order to…

  15. Preliminary Results on Earthquake Recurrence Intervals, Rupture Segmentation, and Potential Earthquake Moment Magnitudes along the Tahoe-Sierra Frontal Fault Zone, Lake Tahoe, California

    Science.gov (United States)

    Howle, J.; Bawden, G. W.; Schweickert, R. A.; Hunter, L. E.; Rose, R.

    2012-12-01

    Utilizing high-resolution bare-earth LiDAR topography, field observations, and the earlier results of Howle et al. (2012), we estimate latest Pleistocene/Holocene earthquake recurrence intervals, propose scenarios for earthquake rupture segmentation, and estimate potential earthquake moment magnitudes for the Tahoe-Sierra frontal fault zone (TSFFZ), west of Lake Tahoe, California. We have developed a new technique to estimate the vertical separation for the most recent and the previous ground-rupturing earthquakes at five sites along the Echo Peak and Mt. Tallac segments of the TSFFZ. At these sites are fault scarps with two bevels separated by an inflection point (compound fault scarps), indicating that the cumulative vertical separation (VS) across the scarp resulted from two events. This technique, modified from the modeling methods of Howle et al. (2012), uses the far-field plunge of the best-fit footwall vector and the fault-scarp morphology from high-resolution LiDAR profiles to estimate the per-event VS. From these data, we conclude that the adjacent and overlapping Echo Peak and Mt. Tallac segments have ruptured coseismically twice during the Holocene. The right-stepping, en echelon range-front segments of the TSFFZ show progressively greater VS rates and shorter earthquake recurrence intervals from southeast to northwest. Our preliminary estimates suggest latest Pleistocene/Holocene earthquake recurrence intervals of 4.8 ± 0.9 × 10^3 years for a coseismic rupture of the Echo Peak and Mt. Tallac segments, located at the southeastern end of the TSFFZ. For the Rubicon Peak segment, northwest of the Echo Peak and Mt. Tallac segments, our preliminary estimate of the maximum earthquake recurrence interval is 2.8 ± 1.0 × 10^3 years, based on data from two sites. The correspondence between high VS rates and short recurrence intervals suggests that earthquake sequences along the TSFFZ may initiate in the northwest part of the zone and then occur to the southeast with a lower

  16. Establishing the credibility of qualitative research findings: the plot thickens.

    Science.gov (United States)

    Cutcliffe, J R; McKenna, H P

    1999-08-01

    Qualitative research is increasingly recognized and valued, and its unique place in nursing research is highlighted by many. Despite this, some nurse researchers continue to raise epistemological issues about the problems of objectivity and the validity of qualitative research findings. This paper explores the issues relating to the representativeness or credibility of qualitative research findings. It therefore critiques the existing distinct philosophical and methodological positions concerning the trustworthiness of qualitative research findings, which are described as follows: qualitative studies should be judged using the same criteria and terminology as quantitative studies; it is impossible, in a meaningful way, for any criteria to be used to judge qualitative studies; qualitative studies should be judged using criteria that are developed for and fit the qualitative paradigm; and the credibility of qualitative research findings could be established by testing out the emerging theory by means of conducting a deductive quantitative study. The authors conclude by providing some guidelines for establishing the credibility of qualitative research findings.

  17. Exceptional Ground Accelerations and Velocities Caused by Earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, John

    2008-01-17

    This project aims to understand the characteristics of the free-field strong-motion records that have yielded the 100 largest peak accelerations and the 100 largest peak velocities recorded to date. The peak is defined as the maximum magnitude of the acceleration or velocity vector during the strong shaking. This compilation includes 35 records with peak acceleration greater than gravity, and 41 records with peak velocities greater than 100 cm/s. The results represent an estimated 150,000 instrument-years of strong-motion recordings. The mean horizontal acceleration or velocity, as used for the NGA ground motion models, is typically 0.76 times the magnitude of this vector peak. Accelerations in the top 100 come from earthquakes as small as magnitude 5, while velocities in the top 100 all come from earthquakes with magnitude 6 or larger. Records are dominated by crustal earthquakes with thrust, oblique-thrust, or strike-slip mechanisms. Normal faulting mechanisms in crustal earthquakes constitute under 5% of the records in the databases searched, and an even smaller percentage of the exceptional records. All NEHRP site categories have contributed exceptional records, in proportions similar to the extent that they are represented in the larger database.
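
    The peak definition used in this record, the maximum over time of the magnitude of the acceleration or velocity vector, can be written directly. The three-component record below is synthetic, purely to illustrate the definition:

```python
import numpy as np

def vector_peak(ax, ay, az):
    """Peak ground motion as defined in the record: the maximum over
    time of the magnitude of the three-component acceleration or
    velocity vector."""
    a = np.sqrt(np.asarray(ax) ** 2 + np.asarray(ay) ** 2 + np.asarray(az) ** 2)
    return float(a.max())

# Toy three-component record (synthetic values, in g):
ax = [0.1, 0.6, -0.3]
ay = [0.0, 0.8, 0.2]
az = [0.0, 0.0, 0.1]
peak = vector_peak(ax, ay, az)
print(round(peak, 2))  # sqrt(0.6**2 + 0.8**2) = 1.0 at the second sample
```

    Note that this vector peak is generally larger than any single-component peak, which is why the record's mean horizontal measure comes out around 0.76 times the vector peak.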

  18. The Influence of Peer Reviews on Source Credibility and Purchase Intention

    Directory of Open Access Journals (Sweden)

    Kristine L. Nowak

    2014-12-01

    Full Text Available Electronic Word of Mouth (eWOM is information shared on the Internet about a product, which allows people to receive information from others they may not otherwise encounter. Online product reviews are a type of eWOM where a user posts a comment about a product and selects an image to represent the self. The perception of the image and the text in the product review can influence source credibility and the perception of the product, as well as the likelihood that someone will purchase the product. This study examines the effect of the product reviews and their different images and text on perceived credibility, source trustworthiness and purchase intention. Consistent with predictions based on the information processing theory, perceived anthropomorphism influences perceived credibility, source trust, and purchase intention.

  19. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  20. Coulomb Failure Stress Accumulation in Nepal After the 2015 Mw 7.8 Gorkha Earthquake: Testing Earthquake Triggering Hypothesis and Evaluating Seismic Hazards

    Science.gov (United States)

    Xiong, N.; Niu, F.

    2017-12-01

    A Mw 7.8 earthquake struck Gorkha, Nepal, on April 25, 2015, resulting in more than 8000 deaths and 3.5 million homeless. The earthquake initiated 70 km west of Kathmandu and propagated eastward, rupturing an area approximately 150 km by 60 km in size. However, the earthquake failed to fully rupture the locked fault beneath the Himalaya, suggesting that the regions south of Kathmandu and west of the current rupture are still locked and a much more powerful earthquake might occur in the future. Therefore, the seismic hazard of the unruptured region is of great concern. In this study, we investigated the Coulomb failure stress (CFS) accumulation on the unruptured fault transferred by the Gorkha earthquake and some nearby historical great earthquakes. First, we calculated the co-seismic CFS changes of the Gorkha earthquake on the nodal planes of 16 large aftershocks to examine quantitatively whether they were brought closer to failure by the mainshock. At least 12 of the 16 aftershocks were encouraged by an increase in CFS of 0.1-3 MPa. The correspondence between the distribution of off-fault aftershocks and the pattern of increased CFS also validates the applicability of the earthquake-triggering hypothesis in the thrust regime of Nepal. With this validation as confidence, we calculated the co-seismic CFS change on the locked region imparted by the Gorkha earthquake and historical great earthquakes. A newly proposed ramp-flat-ramp-flat fault geometry model was employed, and the source parameters of historical earthquakes were computed with an empirical scaling relationship. A broad region south of Kathmandu and west of the current rupture was shown to be positively stressed, with CFS changes roughly ranging between 0.01 and 0.5 MPa. The maximum CFS increase (>1 MPa) was found in the updip segment south of the current rupture, implying a high seismic hazard. 
Since the locked region may be additionally stressed by the post-seismic relaxation of the lower
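
    The standard Coulomb failure stress arithmetic behind studies like this is ΔCFS = Δτ + μ′Δσn (tension-positive normal stress). A minimal sketch follows; the stress values and effective friction coefficient are illustrative assumptions, not values from the study.

```python
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    """Change in Coulomb failure stress on a receiver fault (MPa).

    d_shear  : shear stress change resolved in the slip direction (MPa)
    d_normal : normal stress change, tension (unclamping) positive (MPa)
    mu_eff   : effective friction coefficient (assumed, commonly ~0.4)
    """
    return d_shear + mu_eff * d_normal

# Illustrative numbers: a fault both loaded in shear and unclamped
dcfs = coulomb_stress_change(d_shear=0.3, d_normal=0.2)
# Positive dCFS brings the receiver fault closer to failure.
```

    A positive ΔCFS on the nodal plane of an aftershock is what the abstract means by the aftershock being "encouraged" by the mainshock.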

  1. Credible enough? Forward guidance and perceived National Bank of Poland’s policy rule

    OpenAIRE

    Baranowski, Paweł; Gajewski, Paweł

    2015-01-01

    Credible forward guidance should bring down the perceived impact of macroeconomic variables on the interest rate. Using a micro-level dataset we test the perception of monetary policy in Poland among professional forecasters and find evidence for forward guidance credibility.

  2. Gravity variation before the Akto Ms6.7 earthquake, Xinjiang

    Directory of Open Access Journals (Sweden)

    Hongtao Hao

    2017-03-01

    Full Text Available The relationship between gravity variation and the Akto Ms6.7 earthquake on November 25, 2016, was studied using mobile gravity observation data from the China continental structural environmental monitoring network. The results revealed that before the Akto earthquake, a large positive gravity variation was observed in the Pamir tectonic knot region (with a maximum magnitude of approximately +80 microgal), which was consistent with existing knowledge of gravity anomalies and the locations of strong earthquakes. In view of the recent strong seismic activity in the Pamir tectonic knot region, as well as the strong upward crustal movement and compressive strain, it is believed that the gravity change in the Pamir tectonic knot region reflects the recent strong seismic activity and crustal movement.

  3. Trust and Credibility in Web-Based Health Information: A Review and Agenda for Future Research.

    Science.gov (United States)

    Sbaffi, Laura; Rowley, Jennifer

    2017-06-19

    Internet sources are becoming increasingly important in seeking health information, such that they may have a significant effect on health care decisions and outcomes. Hence, given the wide range of different sources of Web-based health information (WHI) from different organizations and individuals, it is important to understand how information seekers evaluate and select the sources that they use, and more specifically, how they assess their credibility and trustworthiness. The aim of this study was to review empirical studies on trust and credibility in the use of WHI. The article seeks to present a profile of the research conducted on trust and credibility in WHI seeking, to identify the factors that impact judgments of trustworthiness and credibility, and to explore the role of demographic factors affecting trust formation. On this basis, it aimed to identify the gaps in current knowledge and to propose an agenda for future research. A systematic literature review was conducted. Searches were conducted using a variety of combinations of the terms WHI, trust, credibility, and their variants in four multi-disciplinary and four health-oriented databases. Articles selected were published in English from 2000 onwards; this process generated 3827 unique records. After the application of the exclusion criteria, 73 were analyzed fully. Interest in this topic has persisted over the last 15 years, with articles being published in medicine, social science, and computer science and originating mostly from the United States and the United Kingdom. Documents in the final dataset fell into 3 categories: (1) those using trust or credibility as a dependent variable, (2) those using trust or credibility as an independent variable, and (3) studies of the demographic factors that influence the role of trust or credibility in WHI seeking. 
There is a consensus that website design, clear layout, interactive features, and the authority of the owner have a positive effect on trust or

  4. Brief communication "Fast-track earthquake risk assessment for selected urban areas in Turkey"

    Directory of Open Access Journals (Sweden)

    D. Kepekci

    2011-02-01

    Full Text Available This study is presented as a contribution to earthquake disaster mitigation studies for selected cities in Turkey. The risk evaluations must be based on earthquake hazard analysis and city information. To estimate the ground motion level, data for earthquakes with a magnitude greater than 4.5 and an epicenter within a 100-km radius of each city were used for the period from 1900 to 2006, as recorded at the Kandilli Observatory and Earthquake Research Institute. Probabilistic seismic hazard analysis for each city was carried out using a Poisson probabilistic approach. The ground motion level was estimated as the acceleration with a 10% probability of exceedance in a 50-year period for each city. The risk level of each city was evaluated using the number of houses, the per-capita income of city residents, the population, and the ground motion level. The maximum risk level obtained among the cities was taken as a reference value for relative risk assessment, and other risk values were estimated relative to it. When the selected cities were classified according to their relative risk levels, the five most risky cities were found to be, in descending order of risk, Istanbul, Izmir, Ankara, Bursa, and Kocaeli.
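
    The "10% exceedance in 50 years" criterion follows directly from the Poisson occurrence model named in the abstract. A minimal sketch of that arithmetic:

```python
import math

def exceedance_prob(annual_rate, years):
    """P(at least one exceedance in `years`) under a Poisson model."""
    return 1.0 - math.exp(-annual_rate * years)

def annual_rate_for(prob, years):
    """Annual exceedance rate implied by a target probability."""
    return -math.log(1.0 - prob) / years

rate = annual_rate_for(0.10, 50)   # ~0.0021 exceedances per year
return_period = 1.0 / rate         # ~475 years, the familiar design level
```

    Inverting 10%-in-50-years gives the well-known ~475-year return period used as a design ground motion level in many codes.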

  5. Credibility of the emotional witness: a study of ratings by court judges.

    Science.gov (United States)

    Wessel, Ellen; Drevland, Guri C B; Eilertsen, Dag Erik; Magnussen, Svein

    2006-04-01

    Previous studies have shown that the emotional behavior displayed during testimony may affect the perceived credibility of the witness. The present study compares credibility ratings by Norwegian court judges with those made by lay people. The participants viewed one of three video-recorded versions of a rape victim's statement, role played by a professional actress. The statement was given in a free-recall manner with one of three kinds of emotions displayed, termed congruent, neutral, and incongruent emotional expression. The results show that, in contrast to lay people, the credibility ratings of court judges and their votes for a guilty verdict were not influenced by the emotions displayed by the witness. Results are discussed in terms of professional expertise.

  6. Earthquake Early Warning in Japan - Result of recent two years -

    Science.gov (United States)

    Shimoyama, T.; Doi, K.; Kiyomoto, M.; Hoshiba, M.

    2009-12-01

    The Japan Meteorological Agency (JMA) started to provide Earthquake Early Warning (EEW) to the general public in October 2007. This had been preceded, from August 2006, by provision of EEW to a limited number of users who understand its technical limits and can utilize it for automatic control. In Japan, Earthquake Early Warning means information on the estimated amplitude and arrival time of strong ground motion, issued after a fault rupture has occurred; in other words, the EEW provided by JMA is a forecast of strong ground motion before it arrives. Its purpose is to enable advance countermeasures against disasters caused by strong ground motion by providing a warning message before the S-wave arrival. However, because the available lead time is very short, measures are needed to deliver EEW rapidly and to use it properly. - EEW is issued to the general public when a maximum seismic intensity of 5 lower (JMA scale) or greater is expected. - The EEW message contains the origin time, the epicentral region name, and the names of areas (each about 1/3 to 1/4 of a prefecture) where seismic intensity 4 or greater is expected. Expected arrival time is not included because it differs substantially even within one unit area. - EEW is broadcast through the media (TV, radio, and City Administrative Disaster Management Radio) and is delivered to cellular phones through a cell broadcast system. For those who would like more precise estimates and information on smaller earthquakes at the locations of their properties, JMA allows designated private companies to provide forecasts of strong ground motion, containing the estimated seismic intensity as well as the S-wave arrival time at arbitrary places, under JMA’s technical assurance. From October 2007 to August 2009, JMA issued 11 warnings to the general public expecting seismic intensity “5 lower” or greater, including M=7.2 inland

  7. Credible Phenomenological Research: A Mixed-Methods Study

    Science.gov (United States)

    Flynn, Stephen V.; Korcuska, James S.

    2018-01-01

    The authors conducted a 3-phase investigation into the credible standards for phenomenological research practices identified in the literature and endorsed by a sample of counselor education qualitative research experts. Utilizing a mixed-methods approach, the findings offer evidence that professional counseling has a distinctive format in which…

  8. Vrancea earthquakes. Specific actions to mitigate seismic risk

    International Nuclear Information System (INIS)

    Marmureanu, Gheorghe; Marmureanu, Alexandru

    2005-01-01

    Earthquakes have been known in Romania since Roman times, when Trajan's legionnaires began the colonization of the rich plains stretching from the Carpathian Mountains to the Danube River. Since readings from seismographic stations became available after 1940, it has been established that the most frequent large earthquakes arise from deep Vrancea sources at the bend of the Carpathians. Earthquakes in the Carpathian-Pannonian region are confined to the crust, except for the Vrancea zone, where earthquakes with focal depths down to 200 km occur. For example, the ruptured area migrated in depth from 150 km to 180 km (November 10, 1940, Mw = 7.7), from 90 to 110 km (March 4, 1977, Mw = 7.4), from 130 to 150 km (August 30, 1986, Mw = 7.1), and from 70 to 90 km (May 30, 1990, Mw = 6.9). The depth interval between 110 km and 130 km has remained unruptured since October 26, 1802, when the strongest known earthquake in this part of Central Europe occurred. The magnitude is assumed to have been Mw = 7.9-8.0, and this depth interval is a natural candidate for the next strong Vrancea event. The maximum intensity for strong deep Vrancea earthquakes is quite distant from the actual epicenter and greater than the epicentral intensity. For the 1977 strong earthquake (Mw = 7.4), the estimated intensity at its Vrancea epicenter was only VII (MMI scale), while some 170 km away, in the capital city of Bucharest, the estimated maximum intensity was IX 1/2 to X (MMI). The intensely deforming Vrancea zone shows a quite enigmatic seismic pattern (anomalous peak ground accelerations and intensities, characteristic response spectra with long periods of 1.5 seconds, no significant attenuation on Romanian territory, large amplifications far from the epicenter, etc.). While no country in the world is entirely safe, the lack of capacity to limit the impact of seismic hazards remains a major burden for all countries, and the world has witnessed an exponential increase in human and material losses due to

  9. Uniform risk spectra of strong earthquake ground motion: NEQRISK

    International Nuclear Information System (INIS)

    Lee, V.W.; Trifunac, M.D.

    1987-01-01

    The concept of uniform risk spectra of Anderson and Trifunac (1977) has been generalized to include (1) a more refined description of earthquake source zones, (2) the uncertainties in estimating the seismicity parameters a and b in log10 N = a - bM, (3) the uncertainties in estimating the maximum earthquake size in each source zone, and (4) the most recent results on empirical scaling of strong-motion amplitudes at a site. Examples of using the new NEQRISK program are presented and compared with the corresponding case studies of Anderson and Trifunac (1977). The organization of the computer program NEQRISK is also briefly described
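
    The recurrence relation log10 N = a - bM cited above (the Gutenberg-Richter law) can be sketched directly; the a and b values below are illustrative assumptions, not values from the paper.

```python
def gr_cumulative_rate(m, a, b):
    """Annual rate N of earthquakes with magnitude >= m,
    from the Gutenberg-Richter relation log10 N = a - b*M."""
    return 10.0 ** (a - b * m)

# Illustrative source-zone parameters: a = 4.0, b = 1.0
n_m5 = gr_cumulative_rate(5.0, a=4.0, b=1.0)   # -> 0.1 events/yr with M >= 5
n_m6 = gr_cumulative_rate(6.0, a=4.0, b=1.0)   # -> 0.01 events/yr with M >= 6
```

    The paper's generalization treats a and b themselves as uncertain, so in practice the rate above would be integrated over their joint distribution rather than evaluated at point estimates.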

  10. Distribution system reliability evaluation using credibility theory

    African Journals Online (AJOL)

    Xufeng Xu, Joydeep Mitra

    have found that credibility theory, which broadens the scope of fuzzy set theory, is an effective tool for representing fuzzy events, and have developed a theoretical .... Based on the status of switches, the distribution system can be divided into multiple SPSS, which are connected with tie switches. For example, SPSS.

  11. Earthquake related dynamic groundwater pressure changes observed at the Kamaishi Mine

    International Nuclear Information System (INIS)

    Sasaki, Shunji; Yasuike, Shinji; Komada, Hiroya; Kobayashi, Yoshimasa; Kawamura, Makoto; Aoki, Kazuhiro

    1999-01-01

    From 342 seismic records observed at the Kamaishi Mine from 1990 to 1998, a total of 92 records whose acceleration was greater than 1 gal or whose ground water pressure change was greater than 1 kPa were selected, and dynamic ground water pressure changes associated with earthquakes were studied. The results obtained are as follows: (1) A total of 27 earthquakes accompanied by static ground water pressure changes were observed. Earthquake-related static ground water pressure changes are smaller than 1/10 of the annual range of ground water pressure changes. There is also a tendency for the ground water pressure to recover to its original trend within several weeks after an earthquake. (2) Dynamic ground water pressure changes associated with earthquakes begin when P-waves arrive. However, the largest dynamic ground water pressure changes occur at the S-wave arrival, where the amplitude of the seismic wave is largest. A positive correlation is recognized between the maximum value of the velocity waveform and that of the dynamic ground water pressure change. (3) The characteristics of dynamic changes in ground water pressure due to earthquakes can be explained qualitatively by a mechanism in which a P-wave converted from an incident SV wave propagates along the borehole. (author)

  12. Exposing therapists to trauma-focused treatment in psychosis: effects on credibility, expected burden, and harm expectancies

    Directory of Open Access Journals (Sweden)

    David P. G. van den Berg

    2016-09-01

    Full Text Available Background: Despite robust empirical support for the efficacy of trauma-focused treatments, the dissemination proves difficult, especially in relation to patients with comorbid psychosis. Many therapists endorse negative beliefs about the credibility, burden, and harm of such treatment. Objective: This feasibility study explores the impact of specialized training on therapists’ beliefs about trauma-focused treatment within a randomized controlled trial. Method: Therapist-rated (n=16) credibility, expected burden, and harm expectancies of trauma-focused treatment were assessed at baseline, post-theoretical training, post-technical training, post-supervised practical training, and at 2-year follow-up. Credibility and burden beliefs of therapists concerning the treatment of every specific patient in the trial were also assessed. Results: Over time, therapist-rated credibility of trauma-focused treatment showed a significant increase, whereas therapists’ expected burden and harm expectancies decreased significantly. In treating posttraumatic stress disorder (PTSD) in patients with psychotic disorders (n=79), pre-treatment symptom severity was not associated with therapist-rated credibility or expected burden of that specific treatment. Treatment outcome had no influence on patient-specific credibility or burden expectancies of therapists. Conclusions: These findings support the notion that specialized training, including practical training with supervision, has long-term positive effects on therapists’ credibility, burden, and harm beliefs concerning trauma-focused treatment.

  13. Evaluating the quality of Websites related to Hospital-Based Home Care: The Credibility Indicator as a prognostic factor

    Directory of Open Access Journals (Sweden)

    María Sanz-Lorente

    2017-04-01

    Full Text Available Objective: To evaluate the documental quality of websites related to Home Care Services. Method: This is a descriptive cross-sectional study of websites on Home Care Services, using Google searches to access the study population. The “fallacy sample” of this search engine was taken into account. Quality was studied through the 8 variables of the Credibility Indicator (CI). Results: A total of 215 active websites, mainly belonging to the media, were evaluated. None of the websites met all 8 items of the CI (mean 2.12 ± 0.07; minimum 0; maximum 5; median 3). Of the websites studied, 74 (34.42%) presented both authorship and affiliation. There was an association between CI accomplishment and websites that had these 2 variables (p < 0.001). Conclusions: The quality of websites covering issues of Hospital-Based Home Care services is still poor. It is confirmed that identifying authorship and affiliation is an important factor in predicting the quality of the information. The Credibility Indicator is a useful aid when determining the quality of a website.

  14. A new Bayesian Inference-based Phase Associator for Earthquake Early Warning

    Science.gov (United States)

    Meier, Men-Andrin; Heaton, Thomas; Clinton, John; Wiemer, Stefan

    2013-04-01

    State of the art network-based Earthquake Early Warning (EEW) systems can provide warnings for large magnitude 7+ earthquakes. Although regions in the direct vicinity of the epicenter will not receive warnings prior to damaging shaking, real-time event characterization is available before the destructive S-wave arrival across much of the strongly affected region. In contrast, for the more frequent medium-size events, such as the devastating 1994 Mw 6.7 Northridge, California, earthquake, providing timely warning to the smaller damage zone is more difficult. For such events the "blind zone" of current systems (e.g. the CISN ShakeAlert system in California) is similar in size to the area over which severe damage occurs. We propose a faster and more robust Bayesian inference-based event associator that, in contrast to the current standard associators (e.g. Earthworm Binder), is tailored to EEW and exploits information other than phase arrival times alone. In particular, the associator potentially allows reliable automated event association with as few as two observations, which, compared to the ShakeAlert system, would speed up real-time characterization by about ten seconds and thus reduce the blind-zone area by up to 80%. We compile an extensive data set of regional and teleseismic earthquake and noise waveforms spanning a wide range of earthquake magnitudes and tectonic regimes. We pass these waveforms through a causal real-time filterbank with passband filters between 0.1 and 50 Hz and, updating every second from the event detection, extract the maximum amplitudes in each frequency band. Using this dataset, we define distributions of amplitude maxima in each passband as a function of epicentral distance and magnitude. For the real-time data, we pass incoming broadband and strong-motion waveforms through the same filterbank and extract an evolving set of maximum amplitudes in each passband. We use the maximum amplitude distributions to check
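
    The filterbank step described above might be sketched as follows, assuming SciPy's Butterworth design and causal (one-pass) filtering; the passbands, filter order, and synthetic trace are illustrative choices, not the authors' exact configuration.

```python
import numpy as np
from scipy.signal import butter, lfilter

def filterbank_maxima(x, fs, bands, order=2):
    """For each (lo, hi) passband, filter the trace causally and return
    the running maximum absolute amplitude, as a real-time system would
    observe it sample by sample."""
    out = {}
    nyq = fs / 2.0
    for lo, hi in bands:
        b, a = butter(order, [lo / nyq, hi / nyq], btype="band")
        y = lfilter(b, a, x)                    # causal, real-time style
        out[(lo, hi)] = np.maximum.accumulate(np.abs(y))
    return out

# Synthetic 20 s trace at 100 Hz: strong 1 Hz signal plus weak 10 Hz signal
fs = 100.0
t = np.arange(0.0, 20.0, 1.0 / fs)
x = np.sin(2 * np.pi * 1.0 * t) + 0.3 * np.sin(2 * np.pi * 10.0 * t)
maxima = filterbank_maxima(x, fs, bands=[(0.5, 2.0), (5.0, 20.0)])
```

    The running maxima are non-decreasing by construction, matching the abstract's "evolving set of maximum amplitudes" updated as new samples arrive.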

  15. Earthquake Loss Scenarios in the Himalayas

    Science.gov (United States)

    Wyss, M.; Gupta, S.; Rosset, P.; Chamlagain, D.

    2017-12-01

    We estimate quantitatively that in repeats of the 1555 and 1505 great Himalayan earthquakes the fatalities may range from 51K to 549K, the injured from 157K to 1,700K, and the strongly affected population (Intensity ≥ VI) from 15 to 75 million, depending on the details of the assumed earthquake parameters. For up-dip ruptures in the stressed segments of the M7.8 Gorkha 2015, the M7.9 Subansiri 1947 and the M7.8 Kangra 1905 earthquakes, we estimate 62K, 100K and 200K fatalities, respectively. The numbers of strongly affected people we estimate as 8, 12, and 33 million, respectively. These loss calculations are based on verifications of the QLARM algorithms and data set in the cases of the M7.8 Gorkha 2015, the M7.8 Kashmir 2005, the M6.6 Chamoli 1999, the M6.8 Uttarkashi 1991 and the M7.8 Kangra 1905 earthquakes. The requirement of verification that was fulfilled in these test cases was that the reported intensity field and the fatality count had to match approximately, using the known parameters of the earthquakes. The apparent attenuation factor was a free parameter and ranged within acceptable values. Population numbers were adjusted for the years in question from the latest census. The hour of day was assumed to be at night, with maximum occupancy. The assumption that the upper half of the Main Frontal Thrust (MFT) will rupture in companion earthquakes to historic earthquakes in the down-dip half is based on observations of several meters of displacement in trenches across the MFT outcrop. Among mitigation measures, awareness training and adherence to construction codes rank highest. Retrofitting of schools and hospitals would save lives and prevent injuries. Preparation plans for helping millions of strongly affected people should be put in place. These mitigation efforts should focus on an approximately 7 km wide strip along the MFT on the up-thrown side, because the strong motions there are likely to be doubled. We emphasize that our estimates

  16. Projections of Flood Risk using Credible Climate Signals in the Ohio River Basin

    Science.gov (United States)

    Schlef, K.; Robertson, A. W.; Brown, C.

    2017-12-01

    Estimating future hydrologic flood risk under non-stationary climate is a key challenge to the design of long-term water resources infrastructure and flood management strategies. In this work, we demonstrate how projections of large-scale climate patterns can be credibly used to create projections of long-term flood risk. Our study area is the northwest region of the Ohio River Basin in the United States Midwest. In the region, three major teleconnections have been previously demonstrated to affect synoptic patterns that influence extreme precipitation and streamflow: the El Nino Southern Oscillation, the Pacific North American pattern, and the Pacific Decadal Oscillation. These teleconnections are strongest during the winter season (January-March), which also experiences the greatest number of peak flow events. For this reason, flood events are defined as the maximum daily streamflow to occur in the winter season. For each gage in the region, the location parameter of a log Pearson type 3 distribution is conditioned on the first principal component of the three teleconnections to create a statistical model of flood events. Future projections of flood risk are created by forcing the statistical model with projections of the teleconnections from general circulation models selected for skill. We compare the results of our method to the results of two other methods: the traditional model chain (i.e., general circulation model projections to downscaling method to hydrologic model to flood frequency analysis) and that of using the historic trend. We also discuss the potential for developing credible projections of flood events for the continental United States.
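
    The statistical model described above, a log Pearson type 3 flood distribution whose location parameter is conditioned on the leading principal component of the teleconnections, might be sketched as follows; every parameter value here is an illustrative assumption, not a fitted value from the study.

```python
from scipy.stats import pearson3

def winter_flood_quantile(p, pc1, skew=0.2, loc0=3.0, beta=0.1, scale=0.25):
    """Winter-season flood magnitude not exceeded with probability p,
    under a log-Pearson III model: log10(flow) ~ PearsonIII(skew, loc, scale)
    with the location parameter a linear function of the climate index PC1.
    All parameter values are hypothetical placeholders."""
    loc = loc0 + beta * pc1          # climate-informed location parameter
    return 10.0 ** pearson3.ppf(p, skew, loc=loc, scale=scale)

q100_neutral = winter_flood_quantile(0.99, pc1=0.0)   # "100-year" winter flood
q100_strong = winter_flood_quantile(0.99, pc1=1.5)    # stronger teleconnection state
```

    With a positive beta, a stronger positive climate-index state shifts the whole flood distribution upward, which is how projections of the teleconnections translate into projections of flood risk.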

  17. Credibility of the Printed Media: The Swine Flu as a Case Study

    Directory of Open Access Journals (Sweden)

    Ksenija Žlof

    2011-12-01

    Full Text Available The issue of credibility becomes especially pronounced in times of crisis, which characteristically abound in the unknown, uncertainty, and doubt. Such crises are mostly sudden, often complex, and sometimes mired in controversial events. The public consequently craves more information in times of crisis, so that they may obtain more precise guidance and cope more easily. Given the relatively low frequency of crisis situations, most people lack actual experience relevant to a given predicament. The appearance of Virus A (H1N1) at the onset of 2009 is one such case. Despite H1N1’s classification as a broad-scale, serious health hazard, preventive vaccinations failed to reach a large segment of the population. We contend that the lack of credibility in informing the public through the media contributed considerably to this failure. Therefore, the aim of this paper is to determine the level of credible information provided by the print media from which the general public could have taken an informed position on the crisis in question. Quantitative research and content analysis of a body of print media sources with national coverage reveal that the Croatian print media, contrary to our expectations, largely rely on official sources and transparently cite authors, which contributes to a higher degree of credibility. Yet further analysis of the number of sources suggests that most journalists used on average only one named source or none, which significantly reduces the credibility of the published articles.

  18. Methodology for identifying credible disruptions to isolation of nuclear waste within Columbia River basalts

    International Nuclear Information System (INIS)

    Davis, J.D.

    1982-01-01

    Analysis of potential preclosure and postclosure disruptive events and processes is comprised of evaluation of (1) potential uncertainties and omissions associated with characterization of the site; (2) credible events and processes resulting from the dynamics of natural systems; (3) potential, credible changes in isolation conditions induced by the presence of the repository, and (4) potential, credible future changes impacting isolation capability resulting from man's activities, independent of repository construction/operation. This report presents the overall methodology for identification, classification and analysis of disruptive events based on Nuclear Regulatory Commission and Environmental Protection Agency guidelines proposed in drafts of 10 CFR 60 and 40 CFR 191. Potential credible disruptive events, processes, and conditions are considered with respect to whether they are anticipated or unanticipated, and determined to have reasonably foreseeable, very unlikely, or extremely unlikely probability of occurrence. 85 refs., 2 figs., 7 tabs

  19. Strategies Employed by Citizen Science Programs to Increase the Credibility of Their Data

    Directory of Open Access Journals (Sweden)

    Amy Freitag

    2016-05-01

    Full Text Available The success of citizen science in producing important and unique data is attracting interest from scientists and resource managers. Nonetheless, questions remain about the credibility of citizen science data. Citizen science programs desire to meet the same standards of credibility as academic science, but they usually work within a different context, for example, training and managing significant numbers of volunteers with limited resources. We surveyed the credibility-building strategies of 30 citizen science programs that monitor environmental aspects of the California coast. We identified a total of twelve strategies: Three that are applied during training and planning; four that are applied during data collection; and five that are applied during data analysis and program evaluation. Variation in the application of these strategies by program is related to factors such as the number of participants, the focus on group or individual work, and the time commitment required of volunteers. The structure of each program and available resources require program designers to navigate tradeoffs in the choices of their credibility strategies. Our results illustrate those tradeoffs and provide a framework for the necessary discussions between citizen science programs and potential users of their data—including scientists and decision makers—about shared expectations for credibility and practical approaches for meeting those expectations. This article has been corrected here: http://dx.doi.org/10.5334/cstp.91

  20. Credible enough? Forward guidance and perceived National Bank of Poland’s policy rule

    OpenAIRE

    Pawel Baranowski; Pawel Gajewski

    2015-01-01

    Credible forward guidance should reduce the perceived impact of macroeconomic variables on the interest rate. Using a micro-level dataset we test the perception of monetary policy in Poland among professional forecasters and find evidence for forward guidance credibility.

  1. Information Literacy on the Web: How College Students Use Visual and Textual Cues to Assess Credibility on Health Websites

    Directory of Open Access Journals (Sweden)

    Katrina L. Pariera

    2012-12-01

    Full Text Available One of the most important literacy skills in today’s information society is the ability to determine the credibility of online information. Users sort through a staggering number of websites while discerning which will provide satisfactory information. In this study, 70 college students assessed the credibility of health websites with a low and high design quality, in either low or high credibility groups. The study’s purpose was to understand if students relied more on textual or visual cues in determining credibility, and to understand if this affected their recall of those cues later. The results indicate that when viewing a high credibility website, high design quality will bolster the credibility perception, but design quality will not compensate for a low credibility website. The recall test also indicated that credibility does impact the participants’ recall of visual and textual cues. Implications are discussed in light of the Elaboration Likelihood Model.

  2. Earthquake Scenario-Based Tsunami Wave Heights in the Eastern Mediterranean and Connected Seas

    Science.gov (United States)

    Necmioglu, Ocal; Özel, Nurcan Meral

    2015-12-01

    We identified a set of tsunami scenario input parameters in a 0.5° × 0.5° uniformly gridded area in the Eastern Mediterranean, Aegean (both for shallow- and intermediate-depth earthquakes) and Black Seas (only shallow earthquakes) and calculated tsunami scenarios using the SWAN-Joint Research Centre (SWAN-JRC) code (Mader 2004; Annunziato 2007) with 2-arcmin resolution bathymetry data for the range of 6.5 to Mw,max with an Mw increment of 0.1 at each grid point, in order to realize a comprehensive analysis of tsunami wave heights from earthquakes originating in the region. We defined characteristic earthquake source parameters from a compiled set of sources, such as existing moment tensor catalogues and various reference studies, together with the Mw,max assigned in the literature, where possible. Results from 2,415 scenarios show that in the Eastern Mediterranean and its connected seas (Aegean and Black Sea), shallow earthquakes with Mw ≥ 6.5 may result in coastal wave heights of 0.5 m, whereas the same wave height would be expected only from intermediate-depth earthquakes with Mw ≥ 7.0. The distribution of calculated maximum wave heights indicates that tsunami wave heights up to 1 m could be expected in the northern Aegean, whereas along the Black Sea, Cyprus, the Levantine coasts, northern Libya, eastern Sicily, southern Italy, and western Greece, wave heights up to 3 m are possible. Crete, the southern Aegean, and the area between northeast Libya and Alexandria (Egypt) are prone to maximum tsunami wave heights of >3 m. Considering that calculations are performed at a minimum bathymetry depth of 20 m, these wave heights may, according to Green's Law, be amplified by a factor of 2 at the coastline. The study can provide a basis for detailed tsunami hazard studies in the region.
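    The factor-of-2 amplification cited above follows from Green's law, under which the height of a shoaling long wave scales with water depth as d^(-1/4). A minimal sketch; the 1 m near-shore depth used below is an illustrative assumption, not a value from the study:

```python
# Green's law: tsunami amplitude scales with water depth as h ~ d**(-1/4)
# (linear long-wave theory, energy-flux conservation).
def greens_law_amplify(h_offshore, d_offshore_m, d_coast_m):
    """Height of a wave of offshore height h_offshore after shoaling."""
    return h_offshore * (d_offshore_m / d_coast_m) ** 0.25

# A 1.0 m wave computed at the 20 m isobath, shoaling to ~1 m depth:
print(round(greens_law_amplify(1.0, 20.0, 1.0), 2))  # prints 2.11
```

Shoaling from the 20 m computation depth to roughly 1 m thus gives 20^(1/4) ≈ 2.1, matching the factor of 2 quoted in the abstract.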

  3. Seven years of postseismic deformation following the 2003 Mw = 6.8 Zemmouri earthquake (Algeria) from InSAR time series

    KAUST Repository

    Cetin, Esra

    2012-05-28

    We study the postseismic surface deformation of the Mw 6.8, 2003 Zemmouri earthquake (northern Algeria) using the Multi-Temporal Small Baseline InSAR technique. InSAR time series obtained from 31 Envisat ASAR images from 2003 to 2010 reveal sub-cm coastline ground movements between Cap Matifou and Dellys. Two regions display subsidence at a maximum rate of 2 mm/yr in Cap Djenet and 3.5 mm/yr in Boumerdes. These regions correlate well with the areas of maximum coseismic uplift and their association with two rupture segments. Inverse modeling suggests that subsidence in the areas of high coseismic uplift can be explained by afterslip on shallow sections (<5 km) of the fault above the areas of coseismic slip, in agreement with previous GPS observations. The earthquake's impact on soft sediments and the groundwater table southwest of the earthquake area produced ground deformation of non-tectonic origin. The cumulative postseismic moment due to 7 years of afterslip is equivalent to an Mw 6.3 earthquake. Therefore, the postseismic deformation and stress buildup have significant implications for earthquake cycle models and the recurrence intervals of large earthquakes in the Algiers area.

  4. Seven years of postseismic deformation following the 2003 Mw = 6.8 Zemmouri earthquake (Algeria) from InSAR time series

    KAUST Repository

    Cetin, Esra; Meghraoui, Mustapha; Cakir, Ziyadin; Akoglu, Ahmet M.; Mimouni, Omar; Chebbah, Mouloud

    2012-01-01

    We study the postseismic surface deformation of the Mw 6.8, 2003 Zemmouri earthquake (northern Algeria) using the Multi-Temporal Small Baseline InSAR technique. InSAR time series obtained from 31 Envisat ASAR images from 2003 to 2010 reveal sub-cm coastline ground movements between Cap Matifou and Dellys. Two regions display subsidence at a maximum rate of 2 mm/yr in Cap Djenet and 3.5 mm/yr in Boumerdes. These regions correlate well with the areas of maximum coseismic uplift and their association with two rupture segments. Inverse modeling suggests that subsidence in the areas of high coseismic uplift can be explained by afterslip on shallow sections (<5 km) of the fault above the areas of coseismic slip, in agreement with previous GPS observations. The earthquake's impact on soft sediments and the groundwater table southwest of the earthquake area produced ground deformation of non-tectonic origin. The cumulative postseismic moment due to 7 years of afterslip is equivalent to an Mw 6.3 earthquake. Therefore, the postseismic deformation and stress buildup have significant implications for earthquake cycle models and the recurrence intervals of large earthquakes in the Algiers area.

  5. Characteristics of Gyeongju earthquake, moment magnitude 5.5 and relative relocations of aftershocks

    Science.gov (United States)

    Cho, ChangSoo; Son, Minkyung

    2017-04-01

    Seismicity is low in the Korean Peninsula. According to records in historical books, several strong earthquakes have struck the Korean Peninsula. In particular, in Gyeongju, the capital city of the Silla dynasty, a few strong earthquakes about 1,300 years ago caused several hundred fatalities, damaged houses, and collapsed castle walls. A moderately strong earthquake of moment magnitude 5.5 hit the city on September 12, 2016. Over 1,000 aftershocks were detected. The number of aftershock occurrences over time follows Omori's law well. The distribution of relative locations of 561 events, obtained by clustering aftershocks through cross-correlation of the P and S waveforms of the events, shows a strike of NNE 25-30° and a dip of 68-74° for the causative fault plane, matching the fault-plane solution of the moment tensor inversion well. The depth range of the events is from 11 km to 16 km, and the distribution of event locations is about 5 km wide. The direction of maximum horizontal stress obtained by stress inversion of the moment solutions of the main event and large aftershocks is similar to the known maximum horizontal stress direction of the Korean Peninsula. The relation curve between the moment magnitude and local magnitude of the aftershocks shows that the moment magnitude increases slightly faster for events smaller than magnitude 2.0.
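    The Omori-law decay mentioned above models the aftershock rate as n(t) = K / (c + t)^p. A small sketch with illustrative parameters; K, c, and p below are placeholders, not values estimated for the Gyeongju sequence:

```python
import math

# Modified Omori law: aftershock rate n(t) = K / (c + t)**p, t in days.
def omori_rate(t, K=200.0, c=0.1, p=1.0):
    return K / (c + t) ** p

def omori_count(t1, t2, K=200.0, c=0.1):
    """Expected number of aftershocks in [t1, t2]; closed form for p == 1."""
    return K * math.log((c + t2) / (c + t1))

# The power-law decay implies far fewer events in the second ten days
# than in the first ten days after the mainshock:
print(round(omori_count(0.0, 10.0)), round(omori_count(10.0, 20.0)))
```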

  6. Earthquake Culture: A Significant Element in Earthquake Disaster Risk Assessment and Earthquake Disaster Risk Management

    OpenAIRE

    Ibrion, Mihaela

    2018-01-01

    This book chapter brings to attention the dramatic impact of large earthquake disasters on local communities and society and highlights the necessity of building and enhancing an earthquake culture. Iran was considered as a research case study, and fifteen large earthquake disasters in Iran were investigated and analyzed over a period of more than a century. It was found that the earthquake culture in Iran was and still is conditioned by many factors or parameters which are not integrated and...

  7. On the principles of the determination of the safe shut-down earthquake for nuclear power plants in Austria

    International Nuclear Information System (INIS)

    Drimmel, J.

    1976-01-01

    At present no legal guidelines exist in Austria for the determination of the Safe Shut-Down Earthquake. According to experience, the present requirements for a nuclear power plant site are the following: it must be free of marked tectonic faults, and it must never have been situated within the epicentral region of a strong earthquake. The maximum expected earthquake and the Safe Shut-Down Earthquake, respectively, are fixed with the aid of a frequency map of strong earthquakes and a map of extreme earthquake intensities in Austria based on macroseismic data since 1201 A.D. The corresponding values of acceleration will be prescribed according to the state of science, but must be at least 0.10 g for the horizontal and 0.05 g for the vertical component of acceleration at the basement

  8. Analytical investigations of the earthquake resistance of the support base of an oil-gas platform

    Energy Technology Data Exchange (ETDEWEB)

    Glagovskii, V. B.; Kassirova, N. A.; Turchina, O. A.; Finagenov, O. M.; Tsirukhin, N. A. [JSC ' VNIIG im. B. E. Vedeneeva' (Russian Federation)

    2012-01-15

    In designing stationary oil-gas recovery platforms on the continental shelf, the need arises to compute the strength of their support base during seismic events; this paper is devoted to that estimation. The paper examines a structure consisting of the superstructure of an oil-gas platform and its gravity-type base, between which it is possible to install earthquake-insulating supports. Calculations performed for the design earthquake indicated that the gravity base can resist a seismic effect without special additional measures. During the maximum design earthquake, however, significant stresses may develop in the zone of the base where the columns are connected to the upper slab of the caisson. In that case, the earthquake insulation considered for the top of the platform becomes critical.

  9. Analytical investigations of the earthquake resistance of the support base of an oil-gas platform

    International Nuclear Information System (INIS)

    Glagovskii, V. B.; Kassirova, N. A.; Turchina, O. A.; Finagenov, O. M.; Tsirukhin, N. A.

    2012-01-01

    In designing stationary oil-gas recovery platforms on the continental shelf, the need arises to compute the strength of their support base during seismic events; this paper is devoted to that estimation. The paper examines a structure consisting of the superstructure of an oil-gas platform and its gravity-type base, between which it is possible to install earthquake-insulating supports. Calculations performed for the design earthquake indicated that the gravity base can resist a seismic effect without special additional measures. During the maximum design earthquake, however, significant stresses may develop in the zone of the base where the columns are connected to the upper slab of the caisson. In that case, the earthquake insulation considered for the top of the platform becomes critical.

  10. The Hengill geothermal area, Iceland: Variation of temperature gradients deduced from the maximum depth of seismogenesis

    Science.gov (United States)

    Foulger, G. R.

    1995-04-01

    Given a uniform lithology and strain rate and a full seismic data set, the maximum depth of earthquakes may be viewed, to first order, as an isotherm. These conditions are approached at the Hengill geothermal area, S. Iceland, a dominantly basaltic area. The likely strain rate calculated from thermal and tectonic considerations is 10^-15 s^-1, and temperature measurements from four drill sites within the area indicate average near-surface geothermal gradients of up to 150 °C km^-1 throughout the upper 2 km. The temperature at which seismic failure ceases for the strain rates likely at the Hengill geothermal area is determined by analogy with oceanic crust, and is about 650 ± 50 °C. The topographies of the top and bottom of the seismogenic layer were mapped using 617 earthquakes located highly accurately by performing a simultaneous inversion for three-dimensional structure and hypocentral parameters. The thickness of the seismogenic layer is roughly constant at about 3 km. A shallow, aseismic, low-velocity volume within the spreading plate boundary that crosses the area occurs above the top of the seismogenic layer and is interpreted as an isolated body of partial melt. The base of the seismogenic layer has a maximum depth of about 6.5 km beneath the spreading axis and deepens to about 7 km beneath a transform zone in the south of the area. Beneath the high-temperature part of the geothermal area, the maximum depth of earthquakes may be as shallow as 4 km. The geothermal gradient below drilling depths in various parts of the area ranges from 84 ± 9 °C km^-1 within the low-temperature geothermal area of the transform zone to 138 ± 15 °C km^-1 below the centre of the high-temperature geothermal area. Shallow maximum depths of earthquakes, and therefore high average geothermal gradients, tend to correlate with the intensity of the geothermal area and not with the location of the currently active spreading axis.
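    Treating the base of the seismogenic layer as the ~650 °C isotherm, a crude surface-referenced gradient follows directly from the maximum earthquake depth. A first-order sketch using depths from the abstract; note that the abstract's 84-138 °C km^-1 values refer to gradients below drilling depths, so they differ from this simple surface-to-isotherm estimate:

```python
# First-order geothermal gradient, taking the maximum depth of seismicity
# as the ~650 C brittle-ductile transition isotherm.
def gradient_c_per_km(z_max_km, t_isotherm_c=650.0, t_surface_c=0.0):
    return (t_isotherm_c - t_surface_c) / z_max_km

# Depths from the abstract: 4 km (high-temperature area), 6.5 km (spreading
# axis), 7 km (transform zone) -- shallower seismicity implies a hotter crust.
for z_km in (4.0, 6.5, 7.0):
    print(z_km, round(gradient_c_per_km(z_km), 1))
```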

  11. Credibility engineering in the food industry: linking science, regulation, and marketing in a corporate context.

    Science.gov (United States)

    Penders, Bart; Nelis, Annemiek P

    2011-12-01

    We expand upon the notion of the "credibility cycle" through a study of credibility engineering by the food industry. Research and development (R&D) as well as marketing contribute to the credibility of the food company Unilever and its claims. Innovation encompasses the development, marketing, and sales of products. These are directed towards three distinct audiences: scientific peers, regulators, and consumers. R&D uses scientific articles to create credit for itself amongst peers and regulators. These articles are used to support health claims on products. However, R&D, regulation, and marketing are not separate realms. A single strategy of credibility engineering connects health claims to a specific public through linking that public to a health issue and a food product.

  12. Co-seismic deformation and gravity changes of the 2011 India-Nepal and Myanmar earthquakes

    Directory of Open Access Journals (Sweden)

    Liu Chengli

    2012-02-01

    Full Text Available Co-seismic deformation and gravity field changes caused by the 2011 Mw 6.8 Myanmar and Mw 6.9 India-Nepal earthquakes are calculated with a finite-element model and an average-slip model, respectively, based on the multi-layered elastic half-space dislocation theory. The calculated maximum horizontal displacement of the Myanmar earthquake is 36 cm, which is larger than the value of 9.5 cm for the India-Nepal earthquake. This difference is attributed to their different focal depths and our use of different models. Except for certain differences in the near field, both models give similar deformation and gravity results for the Myanmar event.

  13. Inflation Targeting and Liquidity Traps under Endogenous Credibility

    NARCIS (Netherlands)

    Hommes, C.; Lustenhouwer, J.

    2015-01-01

    We derive policy implications for an inflation-targeting central bank whose credibility is endogenous and depends on its past ability to achieve its targets. We do this in a New Keynesian framework with heterogeneous agents and boundedly rational expectations. Our assumptions about expectation

  14. Precursory earthquakes of the 1943 eruption of Paricutin volcano, Michoacan, Mexico

    Science.gov (United States)

    Yokoyama, I.; de la Cruz-Reyna, S.

    1990-12-01

    Paricutin volcano is a monogenetic volcano whose birth and growth were observed by modern volcanological techniques. At the time of its birth in 1943, the seismic activity in central Mexico was mainly recorded by the Wiechert seismographs at the Tacubaya seismic station in Mexico City, about 320 km east of the volcano area. In this paper we aim to find characteristics of the precursory earthquakes of the monogenetic eruption, though there are limits to the available information, such as imprecise location of hypocenters and a lack of earthquake data with magnitudes under 3.0. The available data show that the first precursory earthquake occurred on January 7, 1943, with a magnitude of 4.4. Subsequently, 21 earthquakes ranging from 3.2 to 4.5 in magnitude occurred before the outbreak of the eruption on February 20. The (S - P) durations of the precursory earthquakes do not show any systematic changes within the observational errors. The hypocenters were rather shallow and did not migrate. The precursory earthquakes had a characteristic tectonic signature, which was retained through the whole period of activity. However, the spectra of the P-waves of the Paricutin earthquakes show minor differences from those of tectonic earthquakes. This fact helped in the identification of Paricutin earthquakes. Except for the first shock, the maximum earthquake magnitudes show an increasing tendency with time towards the outbreak. The total seismic energy released by the precursory earthquakes amounted to 2 × 10^19 ergs. Considering that, statistically, there is a threshold of cumulative seismic energy release (10^17-18 ergs) by precursory earthquakes at polygenetic volcanoes erupting after long quiescence, the above cumulative energy is exceptionally large. This suggests that a monogenetic volcano may need much more energy to clear a passage for magma to the earth's surface than a polygenetic one. The magma ascent before the outbreak of Paricutin volcano is interpretable by a model
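    The cumulative energy quoted above can be sanity-checked with the standard Gutenberg-Richter energy relation, log10 E[erg] = 11.8 + 1.5 M. A sketch; only the M 4.4 first shock from the abstract is used, and the comparison to the 2 × 10^19 erg total is indicative, not a re-derivation of the paper's sum:

```python
import math

# Gutenberg-Richter energy relation: log10 E [erg] = 11.8 + 1.5 * M.
def seismic_energy_erg(m):
    return 10 ** (11.8 + 1.5 * m)

# The M 4.4 first precursor alone releases ~2.5e18 erg, so 21 further
# shocks of M 3.2-4.5 plausibly sum to the ~2e19 erg quoted.
print(f"{seismic_energy_erg(4.4):.1e}")
```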

  15. Inspection and repair in JRR-3 after the 2011 off the Pacific coast of Tohoku Earthquake

    International Nuclear Information System (INIS)

    Hosoya, Toshiaki; Nagadomi, Hideki; Torii, Yoshiya

    2014-01-01

    In the 2011 off the Pacific Coast of Tohoku Earthquake, a seismic intensity of lower 6 (JMA scale) was observed at Tokai Village. However, the maximum acceleration of ground motion observed in the JRR-3 reactor facilities exceeded the maximum response acceleration assumed at the time of design. Therefore, to confirm whether the predetermined performance of the facility equipment of the reactor facilities had been maintained after the earthquake, a soundness confirmation inspection was carried out. In the inspection, the soundness of equipment and facilities was evaluated from the results of the equipment inspection and a seismic impact assessment, and repair work was applied when necessary. As a result, it was confirmed that after the earthquake the equipment of the JRR-3 reactor facilities maintained its predetermined performance, and that it was possible to resume operation. The following items are reported here: (1) overview of JRR-3, (2) conditions of the JRR-3 reactor facilities during the earthquake, (3) basic principle for soundness evaluation of facilities, (4) soundness confirmation of buildings and structures, (5) contents of repair, and (6) soundness verification and comprehensive evaluation of each facility and equipment. (A.O.)

  16. Pareto versus lognormal: a maximum entropy test.

    Science.gov (United States)

    Bee, Marco; Riccaboni, Massimo; Schiavo, Stefano

    2011-08-01

    It is commonly found that distributions that seem to be lognormal over a broad range change to a power-law (Pareto) distribution for the last few percentiles. The distributions of many physical, natural, and social events (earthquake size, species abundance, income and wealth, as well as file, city, and firm sizes) display this structure. We present a test for the occurrence of power-law tails in statistical distributions based on maximum entropy. This methodology allows one to identify the true data-generating processes even in the case when it is neither lognormal nor Pareto. The maximum entropy approach is then compared with other widely used methods and applied to different levels of aggregation of complex systems. Our results provide support for the theory that distributions with lognormal body and Pareto tail can be generated as mixtures of lognormally distributed units.
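    For the Pareto tail itself, the maximum-likelihood exponent has a simple closed form (the Hill estimator). A self-contained sketch on synthetic data; the paper's maximum entropy test is a separate, more elaborate procedure, and the sample below is constructed, not real data:

```python
import math

# Maximum-likelihood estimate of the Pareto tail exponent above x_min:
# alpha_hat = n / sum(ln(x_i / x_min))  (the Hill estimator).
def pareto_alpha_mle(xs, x_min):
    tail = [x for x in xs if x >= x_min]
    return len(tail) / sum(math.log(x / x_min) for x in tail)

# Synthetic sample: exact midpoint quantiles of a Pareto(alpha = 2)
# distribution with x_min = 1, so the estimator should recover alpha ~ 2.
n = 1000
sample = [(1 - (i + 0.5) / n) ** (-1 / 2.0) for i in range(n)]
print(round(pareto_alpha_mle(sample, 1.0), 2))
```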

  17. Earthquakes Versus Surface Deformation: Qualitative and Quantitative Relationships From The Aegean

    Science.gov (United States)

    Pavlides, S.; Caputo, R.

    Historical seismicity of the Aegean Region has been revised in order to associate major earthquakes with specific seismogenic structures. Only earthquakes associated with normal faulting have been considered. All available historical and seismotectonic data relative to co-seismic surface faulting have been collected in order to evaluate the surface rupture length (SRL) and the maximum displacement (MD). In order to perform seismic hazard analyses, empirical relationships between these parameters and the magnitude have been inferred and the best-fitting regression functions have been calculated. Both co-seismic fault rupture lengths and maximum displacements show logarithmic relationships, but our data from the Aegean Region have systematically lower values than the same parameters world-wide, though they are similar to those of the Eastern Mediterranean-Middle East region. The upper envelopes of our diagrams (SRL vs Mw and MD vs Mw) have also been estimated and discussed, because they give useful information on the worst-case scenarios; these curves will also be discussed. Furthermore, geological and morphological criteria have been used to recognise the tectonic structures along which historical earthquakes occurred in order to define the geological fault length (GFL). Accordingly, the SRL/GFL ratio seems to have a bimodal distribution, with a major peak at about 0.8-1.0, indicating that several earthquakes break through almost the entire geological fault length, and a second peak around 0.5, related to the possible segmentation of these major neotectonic faults. In contrast, no relationship can be discerned between the SRL/GFL ratio and the magnitude of the corresponding events.
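    The magnitude-SRL regression described above is a log-linear fit of the form Mw = a + b * log10(SRL). A minimal least-squares sketch on made-up (SRL, Mw) pairs, not the paper's Aegean data set:

```python
import math

# Least-squares fit of Mw = a + b * log10(SRL) from illustrative
# (surface rupture length [km], Mw) pairs -- placeholder data only.
pairs = [(10, 6.0), (20, 6.4), (40, 6.8), (80, 7.2), (160, 7.6)]

xs = [math.log10(srl) for srl, _ in pairs]
ys = [mw for _, mw in pairs]
n = len(pairs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum((x - xbar) ** 2 for x in xs)
a = ybar - b * xbar
print(round(a, 2), round(b, 2))  # intercept and slope of the log-linear law
```

Here each doubling of rupture length adds 0.4 magnitude units, so the fitted slope is 0.4 / log10(2).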

  18. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    Science.gov (United States)

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the (1) study of earthquakes; (2) origin, propagation, and energy of seismic phenomena; (3) prediction of these phenomena; and (4) investigation of the structure of the earth. Earthquake engineering or engineering seismology includes the…

  19. Spatial Distribution of the Coefficient of Variation for the Paleo-Earthquakes in Japan

    Science.gov (United States)

    Nomura, S.; Ogata, Y.

    2015-12-01

    Renewal processes, point processes in which the intervals between consecutive events are independently and identically distributed, are frequently used to describe the repeating earthquake mechanism and to forecast the next earthquakes. However, one of the difficulties in applying recurrent earthquake models is the scarcity of historical data. Most studied fault segments have few, or only one, observed earthquakes, which often have poorly constrained historic and/or radiocarbon ages. The maximum likelihood estimate from such a small data set can have a large bias and error, which tends to yield a high probability for the next event in a very short time span when the recurrence intervals have similar lengths. On the other hand, recurrence intervals at a fault depend on average on the long-term slip rate caused by tectonic motion. In addition, recurrence times are also perturbed by nearby earthquakes or fault activities, which encourage or discourage the surrounding seismicity. These factors have spatial trends due to the heterogeneity of tectonic motion and seismicity. Thus, this paper introduces a spatial structure on the key parameters of renewal processes for recurrent earthquakes and estimates it by using spatial statistics. The spatial variation of the mean and variance parameters of recurrence times is estimated in a Bayesian framework, and the next earthquakes are forecasted by Bayesian predictive distributions. The proposed model is applied to the recurrent earthquake catalog of Japan and its results are compared with the current forecast adopted by the Earthquake Research Committee of Japan.

  20. Temporal distribution of earthquakes using renewal process in the Dasht-e-Bayaz region

    Science.gov (United States)

    Mousavi, Mehdi; Salehi, Masoud

    2018-01-01

    The temporal distribution of earthquakes with Mw > 6 in the Dasht-e-Bayaz region, eastern Iran, has been investigated using time-dependent models. In these models, it is assumed that the times between consecutive large earthquakes follow a certain statistical distribution. For this purpose, four time-dependent inter-event distributions, the Weibull, Gamma, Lognormal, and Brownian Passage Time (BPT) distributions, are used in this study, and the associated parameters are estimated by maximum likelihood. The suitable distribution is selected based on the log-likelihood function and the Bayesian Information Criterion. The probability of the occurrence of the next large earthquake during a specified interval of time was calculated for each model. Then, the concept of conditional probability was applied to forecast the next major (Mw > 6) earthquake at the site of interest. The emphasis is on statistical methods which attempt to quantify the probability of an earthquake occurring within specified time, space, and magnitude windows. According to the obtained results, the probability of occurrence of an earthquake with Mw > 6 in the near future is significantly high.
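    The conditional (time-dependent) probability used above is P(T <= t + dt | T > t) = (F(t + dt) - F(t)) / (1 - F(t)) for an inter-event distribution F. A sketch with a lognormal F and illustrative parameters (median recurrence ~100 yr, sigma = 0.5); these are not values estimated for Dasht-e-Bayaz:

```python
import math

# Lognormal CDF via the error function.
def lognorm_cdf(t, mu, sigma):
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

# P(next event within dt years | t years already elapsed without an event).
def conditional_prob(t, dt, mu, sigma):
    f_t = lognorm_cdf(t, mu, sigma)
    return (lognorm_cdf(t + dt, mu, sigma) - f_t) / (1.0 - f_t)

mu, sigma = math.log(100.0), 0.5  # illustrative median recurrence ~100 yr
print(round(conditional_prob(90.0, 30.0, mu, sigma), 3))
```

Conditioning on the elapsed open interval always raises the 30-year probability above the unconditional value F(120) - F(90), which is why such forecasts grow as the quiescence lengthens.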

  1. A framework to establish credibility of computational models in biology.

    Science.gov (United States)

    Patterson, Eann A; Whelan, Maurice P

    2017-10-01

    Computational models in biology and biomedical science are often constructed to aid people's understanding of phenomena or to inform decisions with socioeconomic consequences. Model credibility is the willingness of people to trust a model's predictions and is often difficult to establish for computational biology models. A 3 × 3 matrix has been proposed to allow such models to be categorised with respect to their testability and epistemic foundation in order to guide the selection of an appropriate process of validation to supply evidence to establish credibility. Three approaches to validation are identified that can be deployed depending on whether a model is deemed untestable, testable, or lies somewhere in between. In the latter two cases, the validation process involves the quantification of uncertainty, which is a key output. The issues arising due to the complexity and inherent variability of biological systems are discussed, and the creation of 'digital twins' is proposed as a means to alleviate the issues and provide a more robust, transparent and traceable route to model credibility and acceptance. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. Analog earthquakes

    International Nuclear Information System (INIS)

    Hofmann, R.B.

    1995-01-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository

  3. Shortcomings of InSAR for studying megathrust earthquakes: The case of the M w 9.0 Tohoku-Oki earthquake

    KAUST Repository

    Feng, Guangcai

    2012-05-28

    Interferometric Synthetic Aperture Radar (InSAR) observations are sometimes the only geodetic data of large subduction-zone earthquakes. However, these data usually suffer from spatially long-wavelength orbital and atmospheric errors that can be difficult to distinguish from the coseismic deformation and may therefore result in biased fault-slip inversions. To study how well InSAR constrains fault slip of large subduction-zone earthquakes, we use data of the 11 March 2011 Tohoku-Oki earthquake (Mw 9.0) and test InSAR-derived fault-slip models against models constrained by GPS data from the extensive nationwide network in Japan. The coseismic deformation field was mapped using InSAR data acquired from multiple ascending and descending passes of the ALOS and Envisat satellites. We then estimated several fault-slip distribution models that were constrained using the InSAR data alone, onland and seafloor GPS/acoustic data, or combinations of the different data sets. Based on comparisons of the slip models, we find that there is no real gain by including InSAR observations for determining the fault-slip distribution of this earthquake. That said, however, some of the main fault-slip patterns can be retrieved using the InSAR data alone when estimating long-wavelength orbital/atmospheric ramps as a part of the modeling. Our final preferred fault-slip solution of the Tohoku-Oki earthquake is based only on the GPS data and has maximum reverse- and strike-slip of 36.0 m and 6.0 m, respectively, located northeast of the epicenter at a depth of 6 km, and a total geodetic moment of 3.6 × 10^22 Nm (Mw 9.01), similar to seismological estimates.
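    The quoted magnitude can be checked from the geodetic moment via the standard moment-magnitude relation Mw = (2/3)(log10 M0 - 9.1), with M0 in N m. A sketch; the paper's Mw 9.01 suggests a slightly different constant convention than the 9.1 used here:

```python
import math

# Moment magnitude from seismic moment M0 in N m (Hanks-Kanamori relation).
def moment_magnitude(m0_nm):
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.1)

# Geodetic moment of the preferred GPS-only slip model:
print(round(moment_magnitude(3.6e22), 2))  # ~8.97, i.e. Mw 9.0 after rounding
```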

  4. Data base and seismicity studies for Fagaras, Romania crustal earthquakes

    International Nuclear Information System (INIS)

    Moldovan, I.-A.; Enescu, B. D.; Pantea, A.; Constantin, A.; Bazacliu, O.; Malita, Z.; Moldoveanu, T.

    2002-01-01

    Besides the major impact of the Vrancea seismic region, one of the most important intermediate-depth earthquake sources in Europe, the Romanian crustal earthquake sources in the Fagaras, Banat, Crisana, Bucovina, and Dobrogea regions have to be taken into consideration for seismicity studies and seismic hazard assessment. To determine the characteristics of the seismicity of the Fagaras seismogenic region, a revised and updated catalogue of Romanian earthquakes, recently compiled by Oncescu et al. (1999), is used. The catalogue contains 471 tectonic earthquakes and 338 induced earthquakes and is homogeneous starting with 1471 for I > VIII and starting with 1801 for I > VII. The catalogue is complete for magnitudes larger than 3 starting with 1982. In the studied zone only normal earthquakes occur, related to intracrustal fractures situated at 5 to 30 km depth. Most of them are of low energy, but once in a century a large destructive event occurs with epicentral intensity larger than VIII. The maximum expected magnitude is M_GR = 6.5, and the epicenter distribution outlines significant clustering in the zones and on the lines mentioned in the tectonic studies. Taking into account the date of the last major earthquake (1916) and return periods of severe damaging shocks of over 85 years, a large shock is to be expected in the area very soon. That is why a seismicity and hazard study for this zone is necessary. In the paper, the variation of the b parameter (mean value 0.69), the activity value, and the return periods are studied, and seismicity maps and various histograms are plotted. The explosions from the Campulung quarry are excluded from the catalogue, and because the catalogue contains the aftershocks of the 1916 earthquake, these shocks have also been excluded for the seismicity studies. (authors)
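    A b value such as the mean 0.69 reported above is conventionally estimated from the Gutenberg-Richter relation log10 N = a - bM with Aki's maximum-likelihood formula. A sketch on a synthetic catalogue; the magnitudes below are generated, not the Fagaras data:

```python
import math

# Aki's maximum-likelihood b-value: b = log10(e) / (mean(M) - Mmin),
# valid for a catalogue complete above the threshold magnitude Mmin.
def b_value_mle(mags, m_min):
    above = [m for m in mags if m >= m_min]
    return math.log10(math.e) / (sum(above) / len(above) - m_min)

# Synthetic catalogue: exponential magnitude excesses above Mmin = 3.0,
# drawn as exact quantiles for b = 0.69, so the estimator recovers ~0.69.
m_min, b_true, n = 3.0, 0.69, 2000
mags = [m_min - math.log(1 - (i + 0.5) / n) / (b_true * math.log(10)) for i in range(n)]
print(round(b_value_mle(mags, m_min), 2))
```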

  5. Outline of geophysical investigations on the great earthquake in the south-west Japan on Dec. 21, 1946

    Science.gov (United States)

    Nagata, Takeshi

    1947-01-01

In the early morning of Dec. 21, 1946, a great destructive earthquake occurred in southwestern Japan. According to the seismogram obtained at our university, the earthquake motion began at Tokyo at 4 h 20 m 10.4 s on Dec. 21, 1946. The maximum amplitudes of the NS, EW, and up-down components of the earthquake motion at Tokyo were 12.0 mm, 14.0 mm and 3.0 mm respectively, while the initial motion was composed of movements of 80 μ south, 67 μ west and 20 μ down.

  6. FEATURES AND PROBLEMS WITH HISTORICAL GREAT EARTHQUAKES AND TSUNAMIS IN THE MEDITERRANEAN SEA

    Directory of Open Access Journals (Sweden)

    Lobkovsky L.

    2016-11-01

Full Text Available The present study examines the historical earthquakes and tsunamis of 21 July 365 and of 9 February 1948 in the Eastern Mediterranean Sea. Numerical simulations were performed for the tsunamis generated by underwater seismic sources within the framework of the keyboard model, as well as for their propagation in the Mediterranean Sea basin. Three different types of seismic sources were examined at the same location near the Island of Crete for the earthquake of 21 July 365, and two different types of seismic sources for the earthquake of 9 February 1948 near the Island of Karpathos. For each scenario, the tsunami wave field characteristics from the earthquake source to coastal zones in the Mediterranean Sea basin were obtained, and histograms were constructed showing the distribution of maximum tsunami wave heights along a 5-m isobath. Comparison of the tsunami wave characteristics for all the above-mentioned scenarios demonstrates that underwater earthquakes with magnitude M > 7 in the Eastern Mediterranean basin can generate waves with coastal runup of up to 9 m.

  7. ARMA models for earthquake ground motions. Seismic Safety Margins Research Program

    International Nuclear Information System (INIS)

    Chang, Mark K.; Kwiatkowski, Jan W.; Nau, Robert F.; Oliver, Robert M.; Pister, Karl S.

    1981-02-01

    This report contains an analysis of four major California earthquake records using a class of discrete linear time-domain processes commonly referred to as ARMA (Autoregressive/Moving-Average) models. It has been possible to analyze these different earthquakes, identify the order of the appropriate ARMA model(s), estimate parameters and test the residuals generated by these models. It has also been possible to show the connections, similarities and differences between the traditional continuous models (with parameter estimates based on spectral analyses) and the discrete models with parameters estimated by various maximum likelihood techniques applied to digitized acceleration data in the time domain. The methodology proposed in this report is suitable for simulating earthquake ground motions in the time domain and appears to be easily adapted to serve as inputs for nonlinear discrete time models of structural motions. (author)
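The ARMA models identified in the report generate each sample of a discretized ground-motion record from past samples plus filtered noise. A minimal time-domain simulator in the spirit of the report (the model orders and coefficients below are illustrative, not taken from the report's fitted models) could be sketched as:

```python
import random

def simulate_arma(phi, theta, n, sigma=1.0, seed=42):
    """Simulate an ARMA(p, q) process in the time domain:
    x_t = sum_i phi[i] * x_{t-1-i} + e_t + sum_j theta[j] * e_{t-1-j},
    with Gaussian innovations e_t ~ N(0, sigma^2)."""
    rng = random.Random(seed)
    p, q = len(phi), len(theta)
    x, e = [], []
    for t in range(n):
        e_t = rng.gauss(0.0, sigma)
        ar = sum(phi[i] * x[t - 1 - i] for i in range(p) if t - 1 - i >= 0)
        ma = sum(theta[j] * e[t - 1 - j] for j in range(q) if t - 1 - j >= 0)
        x.append(ar + e_t + ma)
        e.append(e_t)
    return x

# A stationary AR(2) with an MA(1) term, purely illustrative parameters
series = simulate_arma(phi=[0.5, -0.3], theta=[0.4], n=500)
```

Such a synthetic series can serve, as the report suggests, as an input motion for nonlinear discrete-time structural models.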

  8. Building Credible Human Capital Analytics for Organizational Competitive Advantage

    DEFF Research Database (Denmark)

    Minbaeva, Dana

    2018-01-01

Despite the enormous interest in human capital analytics (HCA), organizations have struggled to move from operational reporting to HCA. This is mainly the result of the inability of analytics teams to establish credible internal HCA and demonstrate its value. In this article, we stress the importance of conceptualizing HCA as an organizational capability and suggest a method for its operationalization. We argue that the development of HCA within an organization requires working with three dimensions of HCA: data quality, analytics capabilities, and strategic ability to act. Moreover, such work...

  9. Can Simulation Credibility Be Improved Using Sensitivity Analysis to Understand Input Data Effects on Model Outcome?

    Science.gov (United States)

    Myers, Jerry G.; Young, M.; Goodenow, Debra A.; Keenan, A.; Walton, M.; Boley, L.

    2015-01-01

Model and simulation (MS) credibility is defined as the quality to elicit belief or trust in MS results. NASA-STD-7009 [1] delineates eight components (Verification, Validation, Input Pedigree, Results Uncertainty, Results Robustness, Use History, MS Management, People Qualifications) that address quantifying model credibility, and provides guidance to model developers, analysts, and end users for assessing MS credibility. Of the eight characteristics, input pedigree, or the quality of the data used to develop model input parameters, governing functions, or initial conditions, can vary significantly. These data-quality differences have varying consequences across the range of MS applications. NASA-STD-7009 requires that the lowest input data quality be used to represent the entire set of input data when scoring the input-pedigree credibility of the model. This requirement provides a conservative assessment of model inputs and maximizes communication of the potential level of risk of using model outputs. Unfortunately, in practice, this may result in overly pessimistic communication of the MS output, undermining the credibility of simulation predictions to decision makers. This presentation proposes an alternative assessment mechanism, utilizing results-parameter robustness, also known as model input sensitivity, to improve the credibility scoring process for specific simulations.

  10. A generalized fuzzy credibility-constrained linear fractional programming approach for optimal irrigation water allocation under uncertainty

    Science.gov (United States)

    Zhang, Chenglong; Guo, Ping

    2017-10-01

The vague and fuzzy parametric information is a challenging issue in irrigation water management problems. In response to this problem, a generalized fuzzy credibility-constrained linear fractional programming (GFCCFP) model is developed for optimal irrigation water allocation under uncertainty. The model is derived by integrating generalized fuzzy credibility-constrained programming (GFCCP) into a linear fractional programming (LFP) optimization framework. It can therefore solve ratio optimization problems with fuzzy parameters, and examine the variation of results under different credibility levels and weight coefficients of possibility and necessity. It has advantages in: (1) balancing the economic and resources objectives directly; (2) analyzing system efficiency; (3) generating more flexible decision solutions by varying credibility levels and weight coefficients of possibility and necessity; and (4) supporting in-depth analysis of the interrelationships among system efficiency, credibility level and weight coefficient. The model is applied to a case study of irrigation water allocation in the middle reaches of the Heihe River Basin, northwest China, from which optimal irrigation water allocation solutions are obtained. Moreover, factorial analysis of the two parameters (i.e. λ and γ) indicates that the weight coefficient, rather than the credibility level, is the main factor for system efficiency. These results can effectively support reasonable irrigation water resources management and agricultural production.
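The credibility measure underlying models of this kind blends possibility and necessity; for a triangular fuzzy parameter the chance that it exceeds a threshold has a simple closed form. A minimal sketch (hypothetical helper names; the weight w generalizes the classical measure, which is recovered at w = 0.5) might be:

```python
def possibility_geq(tri, r):
    """Pos{xi >= r} for a triangular fuzzy number tri = (a, b, c)."""
    a, b, c = tri
    if r <= b:
        return 1.0
    if r >= c:
        return 0.0
    return (c - r) / (c - b)

def necessity_geq(tri, r):
    """Nec{xi >= r} = 1 - Pos{xi < r} for a triangular fuzzy number."""
    a, b, c = tri
    if r <= a:
        return 1.0
    if r >= b:
        return 0.0
    return (b - r) / (b - a)

def credibility_geq(tri, r, w=0.5):
    """Generalized credibility as a weighted blend of possibility and
    necessity; w = 0.5 gives the classical credibility measure."""
    return w * possibility_geq(tri, r) + (1.0 - w) * necessity_geq(tri, r)

# e.g. chance that a fuzzy water supply (2, 4, 6) covers a demand of 3
cr = credibility_geq((2.0, 4.0, 6.0), 3.0)
```

A credibility-constrained program then requires such a value to meet a prescribed credibility level for each fuzzy constraint.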

  11. Performance of Real-time Earthquake Information System in Japan

    Science.gov (United States)

    Nakamura, H.; Horiuchi, S.; Wu, C.; Yamamoto, S.

    2008-12-01

Horiuchi et al. (2005) developed a real-time earthquake information system (REIS) using Hi-net, a densely deployed nationwide seismic network consisting of about 800 stations operated by NIED, Japan. REIS determines hypocenter locations and earthquake magnitudes automatically within a few seconds after P waves arrive at the closest station, and calculates focal mechanisms within about 15 seconds. The obtained hypocenter parameters are transferred immediately in XML format to a computer at the Japan Meteorological Agency (JMA), which started an EEW service for special users in June 2005. JMA also developed an EEW system using 200 stations, and the results of the two systems are merged. Among all first-issued EEW reports from both systems, REIS information accounts for about 80 percent. This study examines the rapidity and credibility of REIS by analyzing the 4050 earthquakes with magnitude larger than 3.0 that have occurred around the Japanese islands since 2005. REIS re-determines hypocenter parameters every second as waveform data are revised; here we discuss only the first reports. On rapidity, our results show that about 44 percent of the first reports are issued within 5 seconds after the P waves arrive at the closest stations. Note that this 5-second window includes a data-packaging and transmission delay of about 2 seconds. REIS waits until two stations detect P waves for events inside the network, but four stations for events outside it, so as to obtain reliable solutions. For earthquakes with hypocentral distance less than 100 km, 55 percent of earthquakes are warned within 5 seconds and 87 percent within 10 seconds. Most events with long delays are small and triggered by S-wave arrivals. About 80 percent of events have epicenter differences of less than 20 km relative to JMA manually determined locations. Because of the existence of large lateral heterogeneity in seismic velocity, the difference depends

  12. Earthquake accelerations estimation for construction calculating with different responsibility degrees

    International Nuclear Information System (INIS)

    Dolgaya, A.A.; Uzdin, A.M.; Indeykin, A.V.

    1993-01-01

The object of this investigation is the design amplitude of accelerograms used in evaluating the seismic stability of critical structures, first and foremost nuclear power stations (NPS). The amplitude level is established depending on the degree of responsibility of the structure and on the prevailing period of earthquake action at the construction site. The procedure is based on a statistical analysis of 310 earthquakes. At the first stage of data processing, we established the correlation of both the mathematical expectation and the root-mean-square deviation of the peak acceleration of an earthquake with its prevailing period. At the second stage, the most suitable law for the distribution of acceleration about the mean was chosen. To determine the parameters of this distribution, we specified the maximum conceivable acceleration, which must not be exceeded; the other distribution parameters are determined from the statistical data. At the third stage, the dependence of the design amplitude on the prevailing period of the seismic effect was established for different structures and equipment. The data obtained make it possible to recommend levels for the safe-shutdown earthquake (SSE) and operating-basis earthquake (OBE) for objects of various responsibility categories when designing NPS. (author)

  13. Vote Buying or Campaign Promises? Electoral Strategies When Party Credibility is Limited

    OpenAIRE

    Hanusch, Marek; Keefer, Philip; Vlaicu, Razvan

    2016-01-01

    What explains significant variation across countries in the use of vote buying instead of campaign promises to secure voter support? This paper explicitly models the tradeoff parties face between engaging in vote buying and making campaign promises, and explores the distributional consequences of this decision, in a setting where party credibility can vary. When parties are less credible they spend more on vote buying and target vote buying more heavily toward groups that do not believe campa...

  14. Relations between source parameters for large Persian earthquakes

    Directory of Open Access Journals (Sweden)

    Majid Nemati

    2015-11-01

Full Text Available Empirical relationships for magnitude scales and fault parameters were produced using 436 Iranian intraplate earthquakes from recent regional databases, since continental events represent a large portion of the total seismicity of Iran. The relations between different source parameters of the earthquakes were derived using input information provided by the databases after 1900. The suggested equations relate the body-wave, surface-wave and local magnitude scales to the scalar moment of the earthquakes. The dependence of source parameters such as surface and subsurface rupture length and maximum surface displacement on moment magnitude was also investigated for some well-documented earthquakes. To this end, ordinary linear regression procedures were employed for all relations. Our evaluations reveal fair agreement between the obtained relations and the equations described in other worldwide and regional works in the literature. The M0-mb and M0-MS equations correlate well with the worldwide relations, and both the M0-MS and M0-ML relations agree well with regional studies in Taiwan. The equations derived in this study largely confirm the results of global investigations of the rupture length of historical and instrumental events. However, some relations, such as MW-MN and MN-ML, differ remarkably from available regional works (e.g., American and Canadian).
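Relations of this kind are fitted with ordinary linear regression; a self-contained least-squares sketch (the data pairs below are illustrative placeholders, not the paper's catalogue) is:

```python
def ols_fit(x, y):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Illustrative pairs of log10 scalar moment vs surface-wave magnitude
log_m0 = [17.0, 17.8, 18.5, 19.1, 20.0]
ms     = [5.0, 5.5, 6.0, 6.4, 7.0]
slope, intercept = ols_fit(log_m0, ms)
```

The fitted line necessarily passes through the sample means, a useful sanity check when reproducing empirical magnitude relations.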

  15. INVESTIGATING THE SOURCE ATTRIBUTES INFLUENCING CONSUMERS' CREDIBILITY EVALUATIONS OF AN ATHLETE-CELEBRITY ENDORSED PRODUCT

    Directory of Open Access Journals (Sweden)

    Bafokeng Bafokeng Mahao

    2017-01-01

Full Text Available The use of celebrity endorsement as an advertising strategy has been widely embraced by numerous organisations. Nonetheless, scholarly wisdom suggests the need for cumulative research that seeks to identify the unique set of source factors that enhance the credibility of advertising communication messages delivered by celebrities across countries. In this vein, the primary purpose of this research was to apply the dimensions of the source attributes theory (Ohanian, 1990) to understand the underlying factors that influence South African consumers to have positive perceptions towards purchasing a product that has been endorsed by a local athlete celebrity-endorser. A quantitative research approach was applied, wherein a self-administered survey questionnaire comprising 20 scale items was adapted for this research. Data were collected from a sample of 456 consumers based in Gauteng, South Africa. Upon applying exploratory factor analysis and mean score rankings, source likeability, source trust, source authority and source credibility were established as the underlying factors influencing consumers' credibility evaluations, in descending order of importance. Moreover, the inter-factor correlation matrix revealed positive relationships among the identified factors. Insights gained from this study could assist practitioners to design effective advertisement strategies that foster positive credibility evaluations through known product endorsers.

  16. Credibility, Replicability, and Reproducibility in Simulation for Biomedicine and Clinical Applications in Neuroscience

    Science.gov (United States)

    Mulugeta, Lealem; Drach, Andrew; Erdemir, Ahmet; Hunt, C. A.; Horner, Marc; Ku, Joy P.; Myers Jr., Jerry G.; Vadigepalli, Rajanikanth; Lytton, William W.

    2018-01-01

    Modeling and simulation in computational neuroscience is currently a research enterprise to better understand neural systems. It is not yet directly applicable to the problems of patients with brain disease. To be used for clinical applications, there must not only be considerable progress in the field but also a concerted effort to use best practices in order to demonstrate model credibility to regulatory bodies, to clinics and hospitals, to doctors, and to patients. In doing this for neuroscience, we can learn lessons from long-standing practices in other areas of simulation (aircraft, computer chips), from software engineering, and from other biomedical disciplines. In this manuscript, we introduce some basic concepts that will be important in the development of credible clinical neuroscience models: reproducibility and replicability; verification and validation; model configuration; and procedures and processes for credible mechanistic multiscale modeling. We also discuss how garnering strong community involvement can promote model credibility. Finally, in addition to direct usage with patients, we note the potential for simulation usage in the area of Simulation-Based Medical Education, an area which to date has been primarily reliant on physical models (mannequins) and scenario-based simulations rather than on numerical simulations. PMID:29713272

  17. Credibility, Replicability, and Reproducibility in Simulation for Biomedicine and Clinical Applications in Neuroscience

    Directory of Open Access Journals (Sweden)

    Lealem Mulugeta

    2018-04-01

Full Text Available Modeling and simulation in computational neuroscience is currently a research enterprise to better understand neural systems. It is not yet directly applicable to the problems of patients with brain disease. To be used for clinical applications, there must not only be considerable progress in the field but also a concerted effort to use best practices in order to demonstrate model credibility to regulatory bodies, to clinics and hospitals, to doctors, and to patients. In doing this for neuroscience, we can learn lessons from long-standing practices in other areas of simulation (aircraft, computer chips), from software engineering, and from other biomedical disciplines. In this manuscript, we introduce some basic concepts that will be important in the development of credible clinical neuroscience models: reproducibility and replicability; verification and validation; model configuration; and procedures and processes for credible mechanistic multiscale modeling. We also discuss how garnering strong community involvement can promote model credibility. Finally, in addition to direct usage with patients, we note the potential for simulation usage in the area of Simulation-Based Medical Education, an area which to date has been primarily reliant on physical models (mannequins) and scenario-based simulations rather than on numerical simulations.

  18. Goce derived geoid changes before the Pisagua 2014 earthquake

    Directory of Open Access Journals (Sweden)

    Orlando Álvarez

    2018-01-01

Full Text Available The analysis of space-time surface deformation during earthquakes reveals the variable state of stress at deep crustal levels, and this information can be used to better understand the seismic cycle. Understanding the possible mechanisms that produce earthquake precursors is a key issue for earthquake prediction. In recent years, modern geodesy has been able to map the degree of seismic coupling during the interseismic period, as well as the coseismic and postseismic slip of great earthquakes along subduction zones. Earthquakes usually involve mass transfer and consequent gravity variations; such changes have been monitored for intraplate earthquakes by means of terrestrial gravity measurements. When stresses and the corresponding rupture areas are large, affecting hundreds of thousands of square kilometres (as occurs in some segments along plate interface zones), satellite gravimetry data become relevant. This is due to the higher spatial resolution of this type of data compared with terrestrial data, and also to their homogeneous precision and availability across the whole Earth. Satellite gravity missions such as GOCE can map the Earth's gravity field with unprecedented precision and resolution. We mapped geoid changes from two GOCE satellite models obtained by the direct approach, which combines data from other gravity missions, such as GRACE and LAGEOS, exploiting their best characteristics. The results show that the geoid height diminished from a year to five months before the main seismic event in the region where maximum slip occurred after the Pisagua Mw = 8.2 great megathrust earthquake. This diminution is interpreted as accelerated inland-directed interseismic mass transfer before the earthquake, coinciding with the intermediate degree of seismic coupling reported in the region. We highlight the advantage of satellite data for modelling surficial deformation related to pre-seismic displacements. This deformation, combined to

  19. A comparison of inflation expectations and inflation credibility in South Africa: results from survey data

    Directory of Open Access Journals (Sweden)

    Jannie Rossouw

    2011-08-01

    Full Text Available This paper reports a comparison of South African household inflation expectations and inflation credibility surveys undertaken in 2006 and 2008. It tests for possible feed-through between inflation credibility and inflation expectations. It supplements earlier research that focused only on the 2006 survey results. The comparison shows that inflation expectations differed between different income groups in both 2006 and 2008. Inflation credibility differed between male and female respondents, but this difference did not feed through to inflation expectations. More periodic survey data will be required for developing final conclusions on the possibility of feed-through effects. To this end the structure of credibility surveys should be reconsidered, as a large percentage of respondents indicated that they ‘don’t know’ whether the historic rate of inflation is an accurate indication of price increases.

  20. Credible or not : A study on the factors influencing consumers' credibility assessment of product placements on Instagram

    OpenAIRE

    Kulin, Elin; Blomgren, Linnéa

    2016-01-01

    Background: To align with the new trend of using social media in the marketing mix, product placement has been adapted to social media platforms as one strategy to create attention. Especially on Instagram, product placements have gained popularity among companies. While scholars have focused on measuring the effectiveness of the strategy, suggesting that credibility is one component necessary for success, a gap in the research is illuminated when focusing on what makes a product placement on...

  1. Distribution system reliability evaluation using credibility theory | Xu ...

    African Journals Online (AJOL)

    In this paper, a hybrid algorithm based on fuzzy simulation and Failure Mode and Effect Analysis (FMEA) is applied to determine fuzzy reliability indices of distribution system. This approach can obtain fuzzy expected values and their variances of reliability indices, and the credibilities of reliability indices meeting specified ...

  2. Earthquake Activities Along the Strike-Slip Fault System on the Thailand-Myanmar Border

    Directory of Open Access Journals (Sweden)

    Santi Pailoplee

    2014-01-01

Full Text Available This study investigates the present-day seismicity along the strike-slip fault system on the Thailand-Myanmar border. Using the earthquake catalogue, the parameters representing seismic activity were evaluated in terms of the possible maximum magnitude, return period and earthquake occurrence probabilities. Three hazardous areas could be distinguished from the obtained results. The most seismic-prone area is located along the northern segment of the fault system and can generate earthquakes of magnitude 5.0, 5.8, and 6.8 mb in the next 5, 10, and 50 years, respectively. The second most prone area is the southern segment, where earthquakes of magnitude 5.0, 6.0, and 7.0 mb might be generated every 18, 60, and 300 years, respectively. For the central segment, there is less than a 30 and 10% probability that 6.0- and 7.0-mb earthquakes will be generated in the next 50 years. With regard to the significant infrastructure (dams) in the vicinity, the operational Wachiralongkorn dam is situated in a low seismic hazard area with a return period of around 30 - 3000 years for a 5.0 - 7.0 mb earthquake. In contrast, the Hut Gyi, Srinakarin and Tha Thung Na dams are seismically at risk from earthquakes of mb 6.4 - 6.5 in the next 50 years. Plans for a seismic retrofit should therefore be completed and implemented, and seismic monitoring in this region is indispensable.
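Return periods translate into occurrence probabilities over a design window when events are treated as a Poisson process, a standard hazard assumption (the abstract does not spell out its model, so this is a generic sketch):

```python
import math

def exceedance_probability(return_period_years, window_years):
    """Probability of at least one event within a time window, assuming
    a Poisson process with the given mean return period:
    P = 1 - exp(-t / T)."""
    return 1.0 - math.exp(-window_years / return_period_years)

# e.g. an event with a 300-year return period over a 50-year window
p = exceedance_probability(300.0, 50.0)
```

This is how a "7.0-mb earthquake every 300 years" maps to a roughly 15% chance of occurrence in the next 50 years.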

  3. Credibility assessment of testimonies provided by victims with intellectual disabilities

    Directory of Open Access Journals (Sweden)

    Antonio L. MANZANERO

    2017-09-01

Full Text Available One of the main obstacles to access to justice for victims with intellectual disability comes from stereotypes about their ability to produce a statement in police and legal procedures: some consider their statements less reliable than those of other victims, while others consider their statements more reliable given their supposed inability to create complex lies. This article reviews three of the most recent studies by the UCM group on the Psychology of Testimony, with the objective of analyzing the role of experience and intuition in the evaluation of the credibility of people with intellectual disability (ID), and of testing whether credibility analysis procedures such as Reality Monitoring (RM) and Statement Validity Assessment (SVA) are valid for discriminating between true and false statements in this population. From the results of these studies, it can be deduced that experience does not seem to be enough to discriminate between real and simulated victims, but analyzing the characteristics of the statements as the only indicator does not seem to be enough either. As an alternative, the general procedure HELPT is proposed for evaluating the credibility of people with ID.

  4. The Source and Credibility of Colorectal Cancer Information on Twitter.

    Science.gov (United States)

    Park, SoHyun; Oh, Heung-Kwon; Park, Gibeom; Suh, Bongwon; Bae, Woo Kyung; Kim, Jin Won; Yoon, Hyuk; Kim, Duck-Woo; Kang, Sung-Bum

    2016-02-01

    Despite the rapid penetration of social media in modern life, there has been limited research conducted on whether social media serves as a credible source of health information. In this study, we propose to identify colorectal cancer information on Twitter and assess its informational credibility. We collected Twitter messages containing colorectal cancer-related keywords, over a 3-month period. A review of sample tweets yielded content and user categorization schemes. The results of the sample analysis were applied to classify all collected tweets and users, using a machine learning technique. The credibility of the information in the sampled tweets was evaluated. A total of 76,119 tweets were analyzed. Individual users authored the majority of tweets (n = 68,982, 90.6%). They mostly tweeted about news articles/research (n = 16,761, 22.0%) and risk/prevention (n = 14,767, 19.4%). Medical professional users generated only 2.0% of total tweets (n = 1509), and medical institutions rarely tweeted (n = 417, 0.6%). Organizations tended to tweet more about information than did individuals (85.2% vs 63.1%; P users. Coupled with the Internet's potential to increase social support, Twitter may contribute to enhancing public health and empowering users, when used with proper caution.

  5. Empirical Bayes Credibility Models for Economic Catastrophic Losses by Regions

    Directory of Open Access Journals (Sweden)

    Jindrová Pavla

    2017-01-01

Full Text Available Catastrophic events affect various regions of the world with increasing frequency and intensity. The number of catastrophic events and the amount of economic losses vary across world regions, and part of these losses is covered by insurance. Catastrophic events in recent years have been associated with increases in premiums for some lines of business. The article focuses on estimating the amount of net premiums that would be needed to cover the total or insured catastrophic losses in different world regions, using Bühlmann and Bühlmann-Straub empirical credibility models based on data from Sigma Swiss Re 2010-2016. The empirical credibility models have been developed to estimate insurance premiums for short-term insurance contracts using two ingredients: past data from the risk itself and collateral data from other sources considered relevant. In this article we apply these models to real data on the number of catastrophic events and on the total economic and insured catastrophe losses in seven regions of the world over the period 2009-2015. The estimated credible premiums by world region indicate how much money will be needed in the monitored regions to cover total and insured catastrophic losses in the next year.
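The Bühlmann model referenced above weights a region's own loss experience against the collective mean via the credibility factor Z = n/(n + k). A minimal sketch (the loss figures and structural parameters EPV and VHM below are invented for illustration, not Sigma Swiss Re data) might look like:

```python
def buhlmann_premium(claims, collective_mean, epv, vhm):
    """Bühlmann credibility premium: Z * individual mean + (1 - Z) *
    collective mean, with Z = n / (n + k) and k = EPV / VHM
    (expected process variance over variance of hypothetical means)."""
    n = len(claims)
    k = epv / vhm
    z = n / (n + k)
    individual_mean = sum(claims) / n
    return z * individual_mean + (1.0 - z) * collective_mean

# Illustrative region: 7 years of losses, assumed structural parameters
premium = buhlmann_premium([10, 12, 9, 15, 11, 14, 13],
                           collective_mean=10.0, epv=8.0, vhm=4.0)
```

The Bühlmann-Straub variant additionally weights each year by its exposure, which matters when regional portfolios differ in size from year to year.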

  6. The 8 September 2017 Tsunami Triggered by the M w 8.2 Intraplate Earthquake, Chiapas, Mexico

    Science.gov (United States)

    Ramírez-Herrera, María Teresa; Corona, Néstor; Ruiz-Angulo, Angel; Melgar, Diego; Zavala-Hidalgo, Jorge

    2018-01-01

    The 8 September 2017, M w 8.2 earthquake offshore Chiapas, Mexico, is the largest earthquake in recorded history in Chiapas since 1902. It caused damage in the states of Oaxaca, Chiapas and Tabasco, including more than 100 fatalities, over 1.5 million people were affected, and 41,000 homes were damaged in the state of Chiapas alone. This earthquake, an intraplate event on a normal fault on the oceanic subducting plate, generated a tsunami recorded at several tide gauge stations in Mexico and on the Pacific Ocean. Here, we report the physical effects of the tsunami on the Chiapas coast and analyze the societal implications of this tsunami on the basis of our post-tsunami field survey. The associated tsunami waves were recorded first at Huatulco tide gauge station at 5:04 (GMT) 12 min after the earthquake. We covered ground observations along 41 km of the coast of Chiapas, encompassing the sites with the highest projected wave heights based on our preliminary tsunami model (maximum tsunami amplitudes between 94.5° and 93.0°W). Runup and inundation distances were measured along eight sites. The tsunami occurred at low tide. The maximum runup was 3 m at Boca del Cielo, and maximum inundation distance was 190 m in Puerto Arista, corresponding to the coast in front of the epicenter and in the central sector of the Gulf of Tehuantepec. Tsunami scour and erosion was evident along the Chiapas coast. Tsunami deposits, mainly sand, reached up to 32 cm thickness thinning landward up to 172 m distance.

  7. Prevention of strong earthquakes: Goal or utopia?

    Science.gov (United States)

    Mukhamediev, Sh. A.

    2010-11-01

    In the present paper, we consider ideas suggesting various kinds of industrial impact on the close-to-failure block of the Earth’s crust in order to break a pending strong earthquake (PSE) into a number of smaller quakes or aseismic slips. Among the published proposals on the prevention of a forthcoming strong earthquake, methods based on water injection and vibro influence merit greater attention as they are based on field observations and the results of laboratory tests. In spite of this, the cited proofs are, for various reasons, insufficient to acknowledge the proposed techniques as highly substantiated; in addition, the physical essence of these methods has still not been fully understood. First, the key concept of the methods, namely, the release of the accumulated stresses (or excessive elastic energy) in the source region of a forthcoming strong earthquake, is open to objection. If we treat an earthquake as a phenomenon of a loss in stability, then, the heterogeneities of the physicomechanical properties and stresses along the existing fault or its future trajectory, rather than the absolute values of stresses, play the most important role. In the present paper, this statement is illustrated by the classical examples of stable and unstable fractures and by the examples of the calculated stress fields, which were realized in the source regions of the tsunamigenic earthquakes of December 26, 2004 near the Sumatra Island and of September 29, 2009 near the Samoa Island. Here, just before the earthquakes, there were no excessive stresses in the source regions. Quite the opposite, the maximum shear stresses τmax were close to their minimum value, compared to τmax in the adjacent territory. In the present paper, we provide quantitative examples that falsify the theory of the prevention of PSE in its current form. It is shown that the measures for the prevention of PSE, even when successful for an already existing fault, can trigger or accelerate a catastrophic

  8. Credibility and the effect of music as a therapeutic resource in health

    Directory of Open Access Journals (Sweden)

    Karyne Cristine da Fonseca

    2006-12-01

    Full Text Available This article analyses music therapists' perceptions of the credibility and acceptance of music therapy among their clients. It is a qualitative study developed in Goiânia-GO, Brazil, between August 2003 and June 2004. We found that the majority of professionals observed their clients' belief in the capacity of music to transmit pleasant sensations and to act efficiently on the healing process of some diseases. They also noted that music therapy needs to be publicized more effectively to the population.

  9. Earthquake, GIS and multimedia. The 1883 Casamicciola earthquake

    Directory of Open Access Journals (Sweden)

    M. Rebuffat

    1995-06-01

    Full Text Available A series of multimedia monographs concerning the main seismic events that have affected the Italian territory are in the process of being produced for the Documental Integrated Multimedia Project (DIMP) started by the Italian National Seismic Survey (NSS). The purpose of the project is to reconstruct the historical record of earthquakes and promote earthquake public education. Producing the monographs, developed in ARC INFO and working under UNIX, involved designing a special filing and management methodology to integrate heterogeneous information (images, papers, cartographies, etc.). This paper describes the possibilities of a GIS (Geographic Information System) in the filing and management of documental information. As an example we present the first monograph, on the 1883 Casamicciola earthquake on the island of Ischia (Campania, Italy). This earthquake is particularly interesting for the following reasons: 1) its historical-cultural context (the first destructive seismic event after the unification of Italy); 2) its features (a volcanic earthquake); 3) the socioeconomic consequences for such an important seaside resort.

  10. Foreshock sequences and short-term earthquake predictability on East Pacific Rise transform faults.

    Science.gov (United States)

    McGuire, Jeffrey J; Boettcher, Margaret S; Jordan, Thomas H

    2005-03-24

    East Pacific Rise transform faults are characterized by high slip rates (more than ten centimetres a year), predominantly aseismic slip and maximum earthquake magnitudes of about 6.5. Using recordings from a hydroacoustic array deployed by the National Oceanic and Atmospheric Administration, we show here that East Pacific Rise transform faults also have a low number of aftershocks and high foreshock rates compared to continental strike-slip faults. The high ratio of foreshocks to aftershocks implies that such transform-fault seismicity cannot be explained by seismic triggering models in which there is no fundamental distinction between foreshocks, mainshocks and aftershocks. The foreshock sequences on East Pacific Rise transform faults can be used to predict (retrospectively) earthquakes of magnitude 5.4 or greater, in narrow spatial and temporal windows and with a high probability gain. The predictability of such transform earthquakes is consistent with a model in which slow slip transients trigger earthquakes, enrich their low-frequency radiation and accommodate much of the aseismic plate motion.

  11. Eletronuclear's relationship with the Brazilian media: transparency and credibility

    International Nuclear Information System (INIS)

    Alvarez, Gloria

    2013-01-01

    In a capitalist economy the most valued assets are not money, shares or facilities, but credibility. Lack of money can ruin a company, but often it is reputation that delivers the final blow. It has become challenging to safeguard reputation in a world where communication is increasingly connected and the flow of information so intense and lightning fast. This is particularly true for the electricity sector: a commodity so prevalent in everyday modern life, but whose business dealings are hardly known by the general public. When it comes to nuclear energy, the challenge of establishing effective communication with transparency and credibility touches on even more complex elements. The topic of this paper is the scenario through which the communication process, along with its characteristics and approaches, unfolds between the nuclear sector and the Brazilian media. (author)

  12. OMG Earthquake! Can Twitter improve earthquake response?

    Science.gov (United States)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information

  13. The Quanzhou large earthquake: environment impact and deep process

    Science.gov (United States)

    WANG, Y.; Gao*, R.; Ye, Z.; Wang, C.

    2017-12-01

    The Quanzhou earthquake is the largest historical earthquake on China's southeast coast. The ancient city of Quanzhou and its adjacent areas suffered serious damage. Analysis of the impact of the Quanzhou earthquake on human activities, the ecological environment and social development will provide an example for research on environment-human interaction. According to historical records, on the night of December 29, 1604, a Ms 8.0 earthquake occurred in the sea area east of Quanzhou (25.0°N, 119.5°E) with a focal depth of 25 kilometers. Its effects reached a maximum distance of 220 kilometers from the epicenter and caused serious damage. Quanzhou, which had been known as one of the world's largest trade ports during the Song and Yuan periods, was heavily damaged by this earthquake. The destruction of the ancient city was very serious and widespread. The city wall collapsed in Putian, Nanan, Tongan and other places. The East and West Towers of Kaiyuan Temple, famous in history for their magnificent architecture, were seriously damaged. An enormous earthquake can therefore exert devastating effects on human activities and social development. It is estimated that an earthquake of more than Ms 5.0 in the economically developed coastal areas of China can directly cause economic losses of more than one hundred million yuan. This devastating large earthquake that severely destroyed the Quanzhou city was triggered under a tectonic-extensional circumstance. In this coastal area of Fujian Province, the crust gradually thins eastward from inland to coast (less than 29 km thick beneath the coast), the lithosphere is also rather thin (60-70 km), and the Poisson's ratio of the crust here appears relatively high. The historical Quanzhou earthquake was probably correlated with the NE-striking Littoral Fault Zone, which is characterized by right-lateral slip and exhibits the most active seismicity in the coastal area of Fujian. Meanwhile, tectonic

  14. Error evaluation of inelastic response spectrum method for earthquake design

    International Nuclear Information System (INIS)

    Paz, M.; Wong, J.

    1981-01-01

    Two-story, four-story and ten-story shear building-type frames subjected to earthquake excitation were analyzed at several levels of their yield resistance. These frames were subjected at their base to the motion recorded for the north-south component of the 1940 El Centro earthquake, and to an artificial earthquake that would produce the response spectral charts recommended for design. The frames were first subjected to 25% or 50% of the intensity level of these earthquakes. The resulting maximum relative displacement for each story of the frames was assumed to be the yield resistance for the subsequent analyses at 100% of the excitation intensity. The frames analyzed were uniform along their height, with the stiffness adjusted so as to give a fundamental period of 0.20 seconds for the two-story frame, 0.40 seconds for the four-story frame and 1.0 second for the ten-story frame. Results of the study provided the following conclusions: (1) the percentage error in floor displacement for linear behavior was less than 10%; (2) the percentage error in floor displacement for inelastic (elastoplastic) behavior could be as high as 100%; (3) in most of the cases analyzed, the error increased with damping in the system; (4) as a general rule, the error increased as the modal yield resistance decreased; (5) the error was lower for the structures subjected to the 1940 El Centro earthquake than for the same structures subjected to an artificial earthquake generated from the response spectra for design. (orig./HP)

  15. Earthquake Early Warning Systems

    OpenAIRE

    Pei-Yang Lin

    2011-01-01

    Because of its unique geographical environment, Taiwan experiences frequent earthquake disasters. The Central Weather Bureau collated earthquake data from between 1901 and 2006 (Central Weather Bureau, 2007) and found that 97 earthquakes had occurred, of which 52 resulted in casualties. The 921 Chichi Earthquake had the most profound impact. Because earthquakes have instant destructive power and current scientific technologies cannot provide precise early warnings in advance, earthquake ...

  16. Looking into the Credibility of Appearance: Exploring the Role of Color in Interface Aesthetics and How it Affects our Perception on System’s Credibility

    Directory of Open Access Journals (Sweden)

    Achmad Syarief

    2007-03-01

    Full Text Available This study examines the results of three experiments that follow up on earlier work by Kurosu-Kashimura [1] and Noam Tractinsky [2] on the relation between users' perceptions and the aesthetic and usability qualities of interface appearance. Based on two main premises, namely that aesthetic perception is influenced by cultural background and that an attractive appearance can influence the perceived reliability of a product, we evaluated how migrant users (Indonesians living in Japan) perceive the relation between aesthetic appearance and apparent usability of a product interface. The experiments investigated the effect of color in a product interface on perceived trustworthiness and credibility of the product in general. As stimuli, we used main-screen layouts modified from bank ATM screens in Japan. The results show that the aesthetic quality of an interface influences users' perceptions of an object's credibility and trustworthiness. Users' cultural background has no significant influence on the aesthetic perception of an interface once users have adapted experientially, i.e., have had interaction experience with products of a similar layout composition. Furthermore, the results show that color plays an important role in increasing attractiveness, perceived credibility, and user acceptability. Further experiments are needed to determine how particular color combinations in an interface can meaningfully affect the usability of a product.

  17. GROWTH, LIQUIDITY CONSTRAINTS, CREDIBILITY AND THE EFFECTS OF SHOCKS UNDER A NON-CREDIBLE GOVERNMENT

    Directory of Open Access Journals (Sweden)

    DURMUŞ ÖZDEMİR

    2013-06-01

    Full Text Available This paper presents an overlapping generations model for a small open economy. The model is calibrated to fit data for Turkey. Simulations suggest that for a fairly open economy such as Turkey, credibility and liquidity constraints matter, and the choice of the income tax rate, the mix of government spending and the long-run government debt/GDP ratio can all significantly affect economic growth. The paper also examines the effectiveness of fiscal policy under different levels of liquidity constraint in an open economy within a dynamic framework. It shows that liquidity constraints can affect the outcome of any fiscal policy. Hence fiscal policy is even more important for the less developed economies of the world.

  18. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
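The short-term-average, long-term-average trigger described above can be sketched as follows; the window lengths, threshold, and per-minute tweet counts are illustrative assumptions, not the USGS settings:

```python
def sta_lta_detect(counts, sta_len=2, lta_len=10, threshold=5.0):
    """Flag minutes where the short-term average (STA) of tweet counts
    exceeds `threshold` times the preceding long-term average (LTA)."""
    detections = []
    for i in range(lta_len + sta_len, len(counts) + 1):
        lta = sum(counts[i - lta_len - sta_len:i - sta_len]) / lta_len
        sta = sum(counts[i - sta_len:i]) / sta_len
        if lta > 0 and sta / lta >= threshold:
            detections.append(i - 1)  # index of the triggering minute
    return detections

# Synthetic series: background of 0-1 tweets/min, then a felt-event burst.
series = [0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 150, 140, 80]
print(sta_lta_detect(series))  # → [11, 12, 13]
```

In practice a de-trigger condition and a minimum event separation would also be needed, but the ratio test above is the core of the detector.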

  19. Source Credibility and the Biasing Effect of Narrative Information on the Perception of Vaccination Risks.

    Science.gov (United States)

    Haase, Niels; Betsch, Cornelia; Renkewitz, Frank

    2015-08-01

    Immunization rates are below the targets of the Global Immunization Vision and Strategy established by the World Health Organization. One reason for this is the activity of anti-vaccination activists, who use the Internet to disseminate their agenda, frequently by publishing narrative reports about alleged vaccine adverse events. In health communication, the use of narrative information has been shown to be effectively persuasive. Furthermore, persuasion research indicates that the credibility of an information source may serve as a cue to discount or augment the communicated message. Thus, the present study investigated the effect of source credibility on the biasing effect of narrative information on the perception of vaccination risks. 265 participants were provided with statistical information (20%) regarding the occurrence of vaccine adverse events after vaccination against a fictitious disease. This was followed by 20 personalized narratives from an online forum on vaccination experiences. The authors varied the relative frequency of narratives reporting vaccine adverse events (35% vs. 85%), narrative source credibility (anti-vaccination website vs. neutral health forum), and the credibility of the statistical information (reliable data vs. unreliable data vs. control) in a between-subjects design. Results showed a stable narrative bias on risk perception that was not affected by credibility cues. However, narratives from an anti-vaccination website generally led to lower perceptions of vaccination risks.

  20. Frictional heating processes during laboratory earthquakes

    Science.gov (United States)

    Aubry, J.; Passelegue, F. X.; Deldicque, D.; Lahfid, A.; Girault, F.; Pinquier, Y.; Escartin, J.; Schubnel, A.

    2017-12-01

    Frictional heating during seismic slip plays a crucial role in the dynamics of earthquakes because it controls fault weakening. This study proposes (i) to image frictional heating by combining an in-situ carbon thermometer and Raman microspectrometric mapping, (ii) to combine these observations with fault surface roughness and heat production, and (iii) to estimate the mechanical energy dissipated during laboratory earthquakes. Laboratory earthquakes were performed in a triaxial oil loading press, at 45, 90 and 180 MPa of confining pressure, using saw-cut samples of Westerly granite. The initial topography of the fault surface was ±30 microns. We use a carbon layer as a local temperature tracer on the fault plane and a type K thermocouple to measure temperature approximately 6 mm away from the fault surface. The thermocouple measures the bulk temperature of the fault plane, while the in-situ carbon thermometer images the heterogeneity of heat production at the micro-scale. Raman microspectrometry on the amorphous carbon patch allowed mapping the temperature heterogeneities on the fault surface after sliding, overlaid within a few micrometers on the final fault roughness. The maximum temperature achieved during laboratory earthquakes remains high for all experiments and generally increases with the confining pressure. In addition, the melted area of the fault surface during seismic slip increases drastically with confining pressure. While melting is systematically observed, the strength drop increases with confining pressure. These results suggest that the dynamic friction coefficient is a function of the area of the fault melted during stick-slip. Using the thermocouple, we inverted the heat dissipated during each event. We show that for rough faults under low confining pressure, less than 20% of the total mechanical work is dissipated into heat. The ratio of frictional heating to total mechanical work decreases with cumulated slip (i.e. number of events), and decreases with

  1. Excess Claims and Data Trimming in the Context of Credibility Rating Procedures,

    Science.gov (United States)

    1981-11-01

    Trimming in the Context of Credibility Rating Procedures, by Hans Bühlmann, Alois Gisler, William S. Jewell. 1. Motivation: In ratemaking and in experience ... work on the ETH computer. 2. The Basic Model: Throughout the paper we work with the most simple model in credibility ... the additional structure is summed up by stating that the density f_θ(x) has the following form: f_θ(x) = (1 − r)·p₀(x|θ) + r·p_θ(x). 3. The Basic Problem: ...

  2. Ground water and earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Ts'ai, T.H.

    1977-11-01

    Chinese folk wisdom has long seen a relationship between ground water and earthquakes. Before an earthquake there is often an unusual change in the ground water level and volume of flow. Changes in the amount of particulate matter in ground water as well as changes in color, bubbling, gas emission, and noises and geysers are also often observed before earthquakes. Analysis of these features can help predict earthquakes. Other factors unrelated to earthquakes can cause some of these changes, too. As a first step it is necessary to find sites which are sensitive to changes in ground stress to be used as sensor points for predicting earthquakes. The necessary features are described. Recording of seismic waves of earthquake aftershocks is also an important part of earthquake predictions.

  3. Controversy matters: Impacts of topic and solution controversy on the perceived credibility of a scientist who advocates.

    Directory of Open Access Journals (Sweden)

    Lindsey Beall

    Full Text Available In this article, we focus on the potential influence of a scientist's advocacy position on the public's perceived credibility of scientists as a whole. Further, we examine how the scientist's solution position (information only, non-controversial, and controversial) affects the public's perception of the scientist's motivation for sharing information about specific issues (flu, marijuana, climate change, severe weather). Finally, we assess how perceived motivations mediate the relationship between solution position and credibility. Using data from a quota sample of American adults obtained by Qualtrics (n = 2,453), we found that in some conditions advocating for a solution positively predicted credibility, while in one condition, it negatively predicted scientist credibility. We also found that the influence of solution position on perceived credibility was mediated by several motivation perceptions; most notably through perception that the scientist was motivated to: (a) serve the public and (b) persuade the public. Further results and implications are discussed.

  4. Correlation between hypocenter depth, antecedent precipitation and earthquake-induced landslide spatial distribution

    Science.gov (United States)

    Fukuoka, Hiroshi; Watanabe, Eisuke

    2017-04-01

    Since Keefer's 1984 paper on the relations between earthquake magnitude and the affected area and the maximum epicentral/fault distance of induced landslide distributions, which showed the envelopes of such plots, many studies on this topic have been conducted. It has generally been supposed that landslides are triggered by shallow quakes and that more landslides are likely to occur when heavy rainfall immediately precedes the quake. In order to confirm this, we collected 22 case records of earthquake-induced landslide distributions in Japan and examined the effects of hypocenter depth and antecedent precipitation. The JMA (Japan Meteorological Agency) earthquake magnitudes of the cases range from 4.5 to 9.0. Analysis of hypocenter depth showed that deeper quakes caused wider distributions. Antecedent precipitation was evaluated using the Soil Water Index (SWI), which was developed by JMA for issuing landslide alerts. We could not find a meaningful correlation between SWI and the earthquake-induced landslide distribution. Additionally, we found that a smaller minimum size of collected landslides results in a wider distribution, especially for sizes between 1,000 and 100,000 m².

  5. Impact of Celebrity Credibility on Advertising Effectiveness

    OpenAIRE

    Sadia Aziz; Usman Ghani; Abdullah Niazi

    2013-01-01

    Advertisers often make use of endorsers or representatives as trustworthy sources of persuasion for consumers' attitudes. Promotion of products through celebrities is a trendy advertising practice around the world. The present study assessed the impact of celebrity credibility on advertising effectiveness in terms of consumers' attitude towards the advertisement, attitude towards the brand and their purchase intention. This study also explored the differences in respondents' responses towards t...

  6. Reported credibility techniques in higher education evaluation studies that use qualitative methods: A research synthesis.

    Science.gov (United States)

    Liao, Hongjing; Hitchcock, John

    2018-06-01

    This synthesis study examined the reported use of credibility techniques in higher education evaluation articles that use qualitative methods. The sample included 118 articles published in six leading higher education evaluation journals from 2003 to 2012. Mixed methods approaches were used to identify key credibility techniques reported across the articles, document the frequency of these techniques, and describe their use and properties. Two broad sets of techniques were of interest: primary design techniques (i.e., basic), such as sampling/participant recruitment strategies, data collection methods, analytic details, and additional qualitative credibility techniques (e.g., member checking, negative case analyses, peer debriefing). The majority of evaluation articles reported use of primary techniques although there was wide variation in the amount of supporting detail; most of the articles did not describe the use of additional credibility techniques. This suggests that editors of evaluation journals should encourage the reporting of qualitative design details and authors should develop strategies yielding fuller methodological description. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Investors Assessment of the Credibility of Management Disclosures ...

    African Journals Online (AJOL)

    The objective of this study is to examine the assessment of the credibility of management disclosures about a company from the perspective of investors. It presents the results from a questionnaire survey of a sample of financial analysts, accountants and other investors. The data were analyzed using the one ...

  8. Source credibility and the effectiveness of firewise information

    Science.gov (United States)

    Alan D. Bright; Andrew W. Don Carlos; Jerry J. Vaske; James D. Absher

    2007-01-01

    Understanding how residents of the wildland-urban interface (WUI) react to information about firewise behavior can enhance efforts to communicate safety information to the public. This study explored the multiple roles of source credibility in the elaboration and impact of messages about conducting firewise behaviors in the WUI. A mail-back survey to residents of the...

  9. Moderators of Framing Effects on Political Attitudes: Is Source Credibility Worth Investigating?

    Directory of Open Access Journals (Sweden)

    Dana Raluca Buturoiu

    2015-08-01

    Full Text Available This research paper focuses on indirect (mediated) media effects. In particular, we discuss which independent variables might intervene in and moderate the impact of framing effects on public attitudes (namely political trust), both in short-term and medium-term contexts. Among these, we focus on source credibility as a possible moderator of framing effects over time. The purpose of this study was to examine if and how source credibility influences individuals' political trust. The moderator role of source credibility is analysed according to the exposure to different types of frames (repetitive or competitive) at different moments (one week or one month). By means of a framing experiment (N=769) on political topics, we argue that media frames could influence political trust: source credibility has a marginal influence, which suggests that, with stronger stimulus material (video, as opposed to written press articles), the source could play an important role in the willingness of people to trust political figures in general. Thus, we might argue that the media play a significant role not only in offering information about politics and politicians, but also in altering people's perceptions of them. On the other hand, time seems to matter, since framing effects are more powerful after competitive media exposures. This study proposes new theoretical insights into framing effects, in the sense that classical theories should be revisited in various cultural or political contexts.

  10. Evaluation measures for relevance and credibility in ranked lists

    DEFF Research Database (Denmark)

    Lioma, Christina; Simonsen, Jakob Grue; Larsen, Birger

    2017-01-01

    Recent discussions on alternative facts, fake news, and post truth politics have motivated research on creating technologies that allow people not only to access information, but also to assess the credibility of the information presented to them by information retrieval systems. Whereas technology...

  11. Ranking the Online Documents Based on Relative Credibility Measures

    Directory of Open Access Journals (Sweden)

    Ahmad Dahlan

    2013-09-01

    Full Text Available Information searching is the most popular activity on the Internet. Usually the search engine provides search results ranked by relevance. However, for certain purposes that are concerned with information credibility, particularly citing information in scientific works, another approach to ranking the search engine results is required. This paper presents a study on developing a new ranking method based on the credibility of information. The method is built upon two well-known algorithms, PageRank and Citation Analysis. The result of the experiment, which used the Spearman rank correlation coefficient to compare the proposed rank (generated by the method) with the standard rank (generated manually by a group of experts), showed that the average Spearman coefficient satisfied 0 < rS < critical value. This means that a correlation exists but is not significant. Hence the proposed rank does not yet satisfy the standard, but the performance could be improved.
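The Spearman rank correlation coefficient used to compare the two rankings above can be sketched as follows (no-ties formula; the document ranks are toy values, not the paper's data):

```python
def spearman_rho(rank_a, rank_b):
    """Spearman coefficient r_s = 1 - 6*sum(d_i^2) / (n*(n^2 - 1))
    for two rankings of the same n items, assuming no tied ranks."""
    n = len(rank_a)
    d2 = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Toy example: proposed rank vs. expert rank for five documents.
proposed = [1, 2, 3, 4, 5]
expert   = [2, 1, 4, 3, 5]
print(spearman_rho(proposed, expert))  # ≈ 0.8
```

The computed r_s would then be compared against the critical value for n at the chosen significance level, which is the test the abstract reports as not passed.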


  13. Credible baseline analysis for multi-model public policy studies

    Energy Technology Data Exchange (ETDEWEB)

    Parikh, S.C.; Gass, S.I.

    1981-01-01

    The nature of public decision-making and resource allocation is such that many complex interactions can best be examined and understood by quantitative analysis. Most organizations do not possess the totality of models and needed analytical skills to perform detailed and systematic quantitative analysis. Hence, the need for coordinated, multi-organization studies that support public decision-making has grown in recent years. This trend is expected not only to continue, but to increase. This paper describes the authors' views on the process of multi-model analysis based on their participation in an analytical exercise, the ORNL/MITRE Study. One of the authors was the exercise coordinator. During the study, the authors were concerned with the issue of measuring and conveying credibility of the analysis. This work led them to identify several key determinants, described in this paper, that could be used to develop a rating of credibility.

  14. The effect of source credibility on consumers' perceptions of the quality of health information on the Internet.

    Science.gov (United States)

    Bates, Benjamin R; Romina, Sharon; Ahmed, Rukhsana; Hopson, Danielle

    2006-03-01

    Recent use of the Internet as a source of health information has raised concerns about consumers' ability to tell 'good' information from 'bad' information. Although consumers report that they use source credibility to judge information quality, several observational studies suggest that consumers make little use of source credibility. This study examines consumer evaluations of web pages attributed to a credible source as compared to generic web pages on measures of message quality. In spring 2005, a community-wide convenience survey was distributed in a regional hub city in Ohio, USA. 519 participants were randomly assigned one of six messages discussing lung cancer prevention: three messages each attributed to a highly credible national organization and three identical messages each attributed to a generic web page. Independent-sample t-tests were conducted to compare each attributed message to its counterpart attributed to a generic web page on measures of trustworthiness, truthfulness, readability, and completeness. The results demonstrated that differences in attribution to a source did not have a significant effect on consumers' evaluations of the quality of the information. Conclusions: The authors offer suggestions for national organizations to promote credibility to consumers as a heuristic for choosing better online health information through the use of media co-channels to emphasize credibility.

  15. Source attribution and credibility of health and appearance exercise advertisements: relationship with implicit and explicit attitudes and intentions.

    Science.gov (United States)

    Berry, Tanya R; Shields, Chris

    2014-02-01

    The relationship of attributed source (commercial or nonprofit) and credibility of exercise advertisements to explicit and implicit exercise-related attitudes and intentions was examined. Male and female participants (N = 227) were randomly assigned to watch health or appearance-related advertisements and then completed an implicit attitudes task and questionnaires. Health advertisements and those attributed to a nonprofit source were rated more credible. Appearance condition participants who attributed the advertisement to a nonprofit source also rated the advertisement as more credible. Participants who rated a commercial advertisement as credible reported higher implicit instrumental attitudes. Implications for exercise promotion are discussed.

  16. Gravity Variations Related to Earthquakes in the BTTZ Region in China

    Science.gov (United States)

    Zheng, J.; Liu, K.; Lu, H.; Liu, D.; Chen, Y.; Kuo, J. T.

    2006-05-01

    Temporal variations of gravity before and after earthquakes have been observed since the 1960s, but no definitive conclusion has been reached concerning the relationship between gravity variation and earthquake occurrence. Since 1980, the first US/China joint scientific research project has been monitoring micro-gravity variations related to earthquakes in the Beijing-Tianjin-Tangshan-Zhangjiakou (BTTZ) region in China through a network of spatially and temporally continuous and discrete gravity stations. With the temporally continuous and discrete gravity variation data accumulated and analyzed, a general picture of gravity variation associated with the seismogenesis and occurrence of earthquakes in the BTTZ region has emerged clearly. Some of the major findings are: 1. Gravity variations before and after earthquakes exist spatially and temporally; 2. Temporally continuous gravity measurements are essential for monitoring gravity variations related to earthquakes, unless temporally discrete gravity data are collected at very close time intervals; 3. The concept of the epicentroid and hypocentroid with respect to the maximum values of gravity variation is valid and has been experimentally verified; 4. The gravity variations related to the occurrence of magnitude 4-5 earthquakes in the BTTZ region support the proposed "combined dilatation model", i.e., a dual dilatancy combining the diffusion dilatancy (D/D) and fault zone dilatancy (FZD) models; 5. Although the temporally discrete gravity data were collected at a larger time interval of about six months in the BTTZ region, these data in some cases indicate that the variations are related to the occurrence of earthquakes; 6. Subsurface fluids play a very important role in the gravity variations, a role that had not been recognized and emphasized previously; 7. With the temporally continuous gravity variation data, the

  17. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the Landers earthquake

    Science.gov (United States)

    Jones, Lucile M.

    1994-01-01

    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because non-foreshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
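The mechanism described above can be sketched numerically: treat the foreshock probability as the foreshock rate divided by the total rate of candidate events, where an Omori-law aftershock term temporarily inflates the denominator. All rates and parameters below are illustrative assumptions, not values from Jones (1994):

```python
def omori_rate(t_days, k=50.0, c=0.05, p=1.1):
    """Modified Omori law: aftershock rate k / (t + c)^p.
    k, c, p here are illustrative, not fitted values."""
    return k / (t_days + c) ** p

def foreshock_probability(background_rate, t_days, foreshock_rate=0.01):
    """Probability that a new event near the fault is a foreshock:
    the foreshock rate divided by the total candidate-event rate,
    which the decaying aftershock sequence temporarily inflates."""
    total = background_rate + foreshock_rate + omori_rate(t_days)
    return foreshock_rate / total

early = foreshock_probability(0.1, t_days=1.0)    # soon after the mainshock
late = foreshock_probability(0.1, t_days=365.0)   # after aftershocks decay
assert early < late  # the probability recovers as aftershocks die off
```

This reproduces only the qualitative behaviour in the abstract; the paper's actual formulation also folds in the fault-interaction term that raises the long-term characteristic-earthquake probability.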

  18. Management of limb fractures in a teaching hospital: comparison between Wenchuan and Yushu earthquakes.

    Science.gov (United States)

    Min, Li; Tu, Chong-qi; Liu, Lei; Zhang, Wen-li; Yi, Min; Song, Yue-ming; Huang, Fu-guo; Yang, Tian-fu; Pei, Fu-xing

    2013-01-01

    To comparatively analyze the medical records of patients with limb fractures as well as rescue strategies in the Wenchuan and Yushu earthquakes, so as to provide references for post-earthquake rescue, we retrospectively investigated 944 patients sustaining limb fractures, including 891 in the Wenchuan earthquake and 53 in the Yushu earthquake, who were admitted to West China Hospital (WCH) of Sichuan University. In the Wenchuan earthquake, WCH met three peaks of limb-fracture patient influx, on post-earthquake days (PED) 2, 8 and 14 respectively. Between PED 3-14, 585 patients were transferred from WCH to other hospitals outside Sichuan Province. In the Yushu earthquake, the maximum influx of limb-fracture patients happened on PED 3, and no one was shifted to other hospitals. In both the Wenchuan and Yushu earthquakes, most limb fractures were caused by blunt strike and crush/burying. In the Wenchuan earthquake, there were 396 (396/942, 42.0%) open limb fractures, including 28 Gustilo I, 201 Gustilo II and 167 Gustilo III injuries. In the Yushu earthquake, the incidence of open limb fracture was much lower (6/61, 9.8%). The percentage of patients with acute complications in the Wenchuan earthquake (167/891, 18.7%) was much higher than that in the Yushu earthquake (5/53, 3.8%). In the Wenchuan earthquake rescue, 1,018 surgeries were done, comprising debridement in 376, internal fixation in 283, external fixation in 119, and vacuum sealing drainage in 117, etc. Among the 64 surgeries in the Yushu earthquake rescue, internal fixation for limb fracture was most commonly adopted. All patients received proper treatment and survived, except one who died of multiple organ failure in the Wenchuan earthquake. Provision of suitable and sufficient medical care in a catastrophe can only be achieved by construction of a sophisticated national disaster medical system, prediction of the injury types and number of injuries, and confirmation of participating hospitals' exact roles.
Based on the valuable rescue experiences

  19. Coseismic slip in the 2010 Yushu earthquake (China), constrained by wide-swath and strip-map InSAR

    Directory of Open Access Journals (Sweden)

    Y. Wen

    2013-01-01

    Full Text Available On 14 April 2010, an Mw = 6.9 earthquake occurred in Yushu county, China, which caused ~3000 people to lose their lives. Integrated with information from the observed surface ruptures and aftershock locations, the faulting pattern of this earthquake is derived from descending wide-swath and ascending strip-map PALSAR data collected by the ALOS satellite. We used a layered crustal model and a stress-drop smoothing constraint to infer the coseismic slip distribution. Our model suggests that the earthquake fault can be divided into four segments and that slip mainly occurs within the upper 12 km, with a maximum slip of 2.0 m at a depth of 3 km on the Jiegu segment. The rupture of the upper 12 km is dominated by left-lateral strike-slip motion. The relatively small slip along the SE region of the Yushu segment suggests a slip deficit there. The inverted geodetic moment corresponds to approximately Mw = 6.9, consistent with the seismological results. The average stress drop caused by the earthquake is about 2 MPa, with a maximum stress drop of 8.3 MPa. Furthermore, the calculated static Coulomb stress changes in the surrounding regions show increased Coulomb stress in the SE region along the Yushu segment but with fewer aftershocks, indicating an increased seismic hazard in this region after the earthquake.

  20. Earthquake forecasting and warning

    Energy Technology Data Exchange (ETDEWEB)

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings on precursory phenomena are included. A summary of the circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all of magnitude ≥ 7.0) in China, and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  1. The numerical simulation study of the dynamic evolutionary processes in an earthquake cycle on the Longmen Shan Fault

    Science.gov (United States)

    Tao, Wei; Shen, Zheng-Kang; Zhang, Yong

    2016-04-01

    The Longmen Shan, located at the junction of the eastern margin of the Tibet plateau and the Sichuan basin, is a typical area for studying the deformation pattern of the Tibet plateau. Following the 2008 Mw 7.9 Wenchuan earthquake (WE) rupturing the Longmen Shan Fault (LSF), a great number of observations and studies on geology, geophysics, and geodesy have been carried out for this region, with results published successively in recent years. Using a 2D viscoelastic finite element model and introducing the rate-state friction law on the fault, this work models the earthquake recurrence process and the dynamic evolutionary processes over an earthquake cycle of ten thousand years. By analyzing the displacement, velocity, stress, strain energy and strain energy increment fields, this work reaches the following conclusions: (1) The maximum coseismic displacement on the fault is at the surface, and the damage on the hanging wall is much more serious than that on the foot wall of the fault. If the detachment layer is absent, the coseismic displacement would be smaller and the relative displacement between the hanging wall and foot wall would also be smaller. (2) In every stage of the earthquake cycle, the velocities (especially the vertical velocities) on the hanging wall of the fault are larger than those on the foot wall, and the values and distribution patterns of the velocity fields are similar. In the locking stage prior to the earthquake, the velocities in the crust and the relative velocities between the hanging wall and foot wall decrease. For the model without the detachment layer, the velocities in the crust in the post-seismic stage are much larger than those in other stages. (3) The maximum principal stress and the maximum shear stress concentrate around the joint of the fault and the detachment layer, so the earthquake would nucleate and start there. (4) The strain density distribution patterns in the stages of the earthquake cycle are similar.
There are two

  2. Two-year survey comparing earthquake activity and injection-well locations in the Barnett Shale, Texas

    Science.gov (United States)

    Frohlich, Cliff

    2012-01-01

    Between November 2009 and September 2011, temporary seismographs deployed under the EarthScope USArray program were situated on a 70-km grid covering the Barnett Shale in Texas, recording data that allowed sensing and locating regional earthquakes with magnitudes 1.5 and larger. I analyzed these data and located 67 earthquakes, more than eight times as many as reported by the National Earthquake Information Center. All 24 of the most reliably located epicenters occurred in eight groups within 3.2 km of one or more injection wells. These included wells near Dallas–Fort Worth and Cleburne, Texas, where earthquakes near injection wells were reported by the media in 2008 and 2009, as well as wells in six other locations, including several where no earthquakes have been reported previously. This suggests injection-triggered earthquakes are more common than is generally recognized. All the wells nearest to the earthquake groups reported maximum monthly injection rates exceeding 150,000 barrels of water per month (24,000 m3/mo) since October 2006. However, while 9 of 27 such wells in Johnson County were near earthquakes, elsewhere no earthquakes occurred near wells with similar injection rates. A plausible hypothesis to explain these observations is that injection only triggers earthquakes if injected fluids reach and relieve friction on a suitably oriented, nearby fault that is experiencing regional tectonic stress. Testing this hypothesis would require identifying geographic regions where there is interpreted subsurface structure information available to determine whether there are faults near seismically active and seismically quiescent injection wells. PMID:22869701
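The injection-rate threshold quoted above pairs two units; the conversion behind the "150,000 barrels per month (24,000 m3/mo)" figure can be checked directly using the standard US oil-barrel volume (the tidy 24,000 in the abstract is a rounding of the exact value):

```python
BBL_TO_M3 = 0.158987  # volume of one US oil barrel in cubic metres

def monthly_injection_m3(barrels_per_month):
    """Convert a monthly injection rate from barrels to cubic metres."""
    return barrels_per_month * BBL_TO_M3

print(round(monthly_injection_m3(150_000)))  # 23848, i.e. the ~24,000 m3/mo in the text
```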

  3. Earthquake-induced static stress change on magma pathway in promoting the 2012 Copahue eruption

    Science.gov (United States)

    Bonali, F. L.

    2013-11-01

    This study examines how tectonic earthquake-induced static stress changes could have contributed to favouring the 22 December 2012 major eruption at Copahue volcano, Chile. Numerical modelling indicates that the vertical N60°E-striking magma pathway below Copahue was affected by a normal stress reduction induced by the Mw 8.8 Chile earthquake of 27 February 2010. A sensitivity analysis suggests that N-, NE- and E-striking vertical planes are affected by a normal stress decrease (maximum on the NE-striking plane), and that a possible inclined N60°E plane is also affected by this reduction. Copahue had not had any magmatic event since 2000. Seismic signals of awakening started in April 2012 and the first volcanic event occurred in July 2012. A possible earthquake-induced feedback effect on the crust below the volcanic arc is therefore suggested, lasting up to at least 3 years after a large subduction earthquake and favouring new eruptions.

  4. Seismicity and earthquake risk in western Sicily

    Directory of Open Access Journals (Sweden)

    P. COSENTINO

    1978-06-01

    Full Text Available The seismicity and the earthquake risk in Western Sicily are evaluated here on the basis of experimental data referring to the historical and instrumentally recorded earthquakes in this area (from 1248 up to 1968), which have been thoroughly collected, analyzed, tested and normalized in order to assure the quasi-stationarity of the series of events. The approximated magnitude values, obtained by means of a compared analysis of the magnitude and epicentral intensity values of the latest events, have made it possible to study the parameters of the frequency-magnitude relation with both the classical exponential model and the truncated exponential model previously proposed by the author. The basic parameters, including the maximum possible regional magnitude, have thus been estimated by means of different procedures, and their behaviour has been studied as a function of the threshold magnitude.

  5. Daytime dependence of disturbances of ionospheric Es-layers connected to earthquakes

    Science.gov (United States)

    Liperovskaya, E. V.; Liperovsky, A. V.; Meister, C.-V.; Silina, A. S.

    2012-04-01

    In the present work, variations of the semi-transparency of the sporadic E-layer of the ionosphere due to seismic activity are studied. The semi-transparency Q is determined by the blanketing frequency fbEs and the characteristic frequency foEs, Q = (foEs - fbEs)/fbEs. At low values of the blanketing frequency fbEs, the critical frequency foEs does not describe the maximum ionisation density of the Es-layer, as the critical frequencies of regular ionospheric layers (e.g. foF2) do, but rather the occurrence of small-scale (tenths of meters) inhomogeneities of the ionisation density along the vertical in the layer. The maximum ionisation density of the sporadic layer is proportional to the square of fbEs. In the case of vertical ionospheric sounding, the sporadic layer becomes transparent for signals with frequencies larger than fbEs. Investigations showed that about three days before an earthquake an increase of the semi-transparency interval is observed during sunset and sunrise. In the present work, analogous results are found for data of the vertical sounding stations "Tokyo" and "Petropavlovsk-Kamchatsky". Using the method of superposed epochs, more than 50 earthquakes with magnitudes M > 5, depths h < 40 km, and distances between the station and the epicenter R < 300 km are considered for the vertical sounding station "Tokyo". More than 20 earthquakes with such parameters were analysed for the station "Petropavlovsk-Kamchatsky". Days with strong geomagnetic activity were excluded from the analysis. According to the station "Petropavlovsk-Kamchatsky", about 1-3 days before earthquakes an increase of Es-spread is observed a few hours before midnight. This increase is a sign of large-scale inhomogeneities in the sporadic layers.
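The semi-transparency defined in the abstract is a one-line computation; a sketch with illustrative ionogram frequencies (not actual station readings) follows:

```python
def semi_transparency(foEs_mhz, fbEs_mhz):
    """Semi-transparency of the sporadic E-layer: Q = (foEs - fbEs) / fbEs,
    with both frequencies read from an ionogram in MHz."""
    if fbEs_mhz <= 0:
        raise ValueError("fbEs must be positive")
    return (foEs_mhz - fbEs_mhz) / fbEs_mhz

# Illustrative readings: foEs = 4.2 MHz, fbEs = 3.0 MHz.
print(round(semi_transparency(4.2, 3.0), 2))  # 0.4
```

Since the maximum ionisation density scales with the square of fbEs, a rising Q at fixed fbEs signals more small-scale structure, not a denser layer.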

  6. Ionospheric earthquake precursors

    International Nuclear Information System (INIS)

    Bulachenko, A.L.; Oraevskij, V.N.; Pokhotelov, O.A.; Sorokin, V.N.; Strakhov, V.N.; Chmyrev, V.M.

    1996-01-01

    Results of experimental studies of ionospheric earthquake precursors, the development of a program on processes in the earthquake focus, and the physical mechanisms of formation of various types of precursors are considered. The composition of an experimental space system for monitoring earthquake precursors is determined. 36 refs., 5 figs

  7. Evaluation of earthquake vibration on aseismic design of nuclear power plant judging from recent earthquakes

    International Nuclear Information System (INIS)

    Dan, Kazuo

    2006-01-01

    The Regulatory Guide for Aseismic Design of Nuclear Reactor Facilities was revised on 19 September 2006. Six factors for the evaluation of earthquake vibration are considered on the basis of recent earthquakes: 1) evaluation of earthquake vibration by a method using a fault model, 2) investigation and approval of active faults, 3) direct-hit earthquakes, 4) assumption of a short active fault as the hypocentral fault, 5) locality of the earthquake and the earthquake vibration, and 6) remaining risk. A guiding principle of the revision required a new evaluation method of earthquake vibration using a fault model and an evaluation of the probability of earthquake vibration. The remaining risk means that the facilities and people are endangered when an earthquake stronger than the design basis occurs; accordingly, scatter has to be considered in the evaluation of earthquake vibration. The seismic belt of the 1995 Hyogo-ken Nanbu earthquake and its strong vibration pulses, the relation between the length of the surface earthquake fault and the hypocentral fault, and the distribution of seismic intensity of the 1993 off-Kushiro earthquake are shown. (S.Y.)

  8. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    Science.gov (United States)

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  9. Joint inversion of GNSS and teleseismic data for the rupture process of the 2017 Mw 6.5 Jiuzhaigou, China, earthquake

    Science.gov (United States)

    Li, Qi; Tan, Kai; Wang, Dong Zhen; Zhao, Bin; Zhang, Rui; Li, Yu; Qi, Yu Jie

    2018-05-01

    The spatio-temporal slip distribution of the earthquake that occurred on 8 August 2017 in Jiuzhaigou, China, was estimated from teleseismic body waves and near-field Global Navigation Satellite System (GNSS) data (coseismic displacements and high-rate GPS data) based on a finite fault model. Compared with the inversion results from the teleseismic body waves alone, the near-field GNSS data better constrain the rupture area, the maximum slip, the source time function, and the surface rupture. The results show that the maximum slip of the earthquake approaches 1.4 m, the scalar seismic moment is 8.0 × 10^18 N·m (Mw ≈ 6.5), and the centroid depth is 15 km. The slip is dominated by left-lateral strike-slip motion, and it is initially inferred that the seismogenic fault is the south branch of the Tazang fault or an undetected NW-trending left-lateral strike-slip fault, belonging to one of the tail structures at the easternmost end of the eastern Kunlun fault zone. The earthquake rupture is mainly concentrated at depths of 5-15 km, which results in the complete rupture of the seismic gap left by the previous four earthquakes with magnitudes > 6.0 in 1973 and 1976. Therefore, the possibility of a strong aftershock on the Huya fault is low. The source duration is 30 s and there are two major ruptures: the main rupture peaks 4 s after the earthquake onset, within the first 10 s, and the second rupture peak arrives at 17 s. In addition, a Coulomb stress study shows that the epicenter of the earthquake is located in an area where the static Coulomb stress change increased because of the 12 May 2008 Mw 7.9 Wenchuan, China, earthquake. Therefore, the Wenchuan earthquake promoted the occurrence of the 8 August 2017 Jiuzhaigou earthquake.
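The abstract's pairing of an 8.0 × 10^18 N·m scalar moment with Mw ≈ 6.5 follows from the standard Hanks-Kanamori moment-magnitude relation:

```python
import math

def moment_magnitude(m0_newton_metres):
    """Hanks-Kanamori moment magnitude: Mw = (2/3) * (log10(M0) - 9.1),
    with the scalar seismic moment M0 in newton-metres."""
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

print(round(moment_magnitude(8.0e18), 1))  # 6.5, matching the abstract
```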

  10. Ferrocyanide safety program: Credibility of drying out ferrocyanide tank waste by hot spots

    International Nuclear Information System (INIS)

    Dickinson, D.R.; McLaren, J.M.; Borsheim, G.L.; Crippen, M.D.

    1993-04-01

    The single-shell waste tanks at the Hanford Site that contain significant quantities of ferrocyanide have been considered a possible hazard, since under certain conditions the ferrocyanide in the waste tanks could undergo an exothermic chemical reaction with the nitrates and nitrites that are also present in the tanks. The purpose of this report is to assess the credibility of local dryout of ferrocyanide due to a hot spot. This report considers the following: What amount of decay heat generation within what volume would be necessary to raise the temperature of the liquid in the sludge to its boiling point? What mechanisms could produce a significant local concentration of heat sources? Is it credible that a waste tank heat concentration could be as large as that required to reach the dryout temperatures? This report also provides a recommendation as to whether infrared scanning of the ferrocyanide tanks is needed. From the analyses presented in this report it is evident that formation of dry, and thus chemically reactive, regions in the ferrocyanide sludge by local hot spots is not credible. This conclusion is subject to reevaluation if future analyses of tank core samples show much higher 137Cs or 90Sr concentrations than expected. Since hot spots of concern are not credible, infrared scanning to detect such hot spots is not required for safe storage of the tank waste.

  11. Tohoku's earthquake of Friday March 11, 2011 (5:46 UT), magnitude 9.0, off Honshu island (Japan)

    International Nuclear Information System (INIS)

    2011-01-01

    On Friday March 11, 2011, at 5:46 UT (2:46 PM local time), a magnitude 9.0 earthquake took place 80 km east of Honshu island (Japan). The earthquake generated a tsunami which led to the loss of the cooling systems of the Fukushima Dai-ichi and Fukushima Daini power plants. This paper describes the seismo-tectonic and historical seismic context of the Japanese archipelago and the first analyses of the Tohoku earthquake's impact: magnitudes of the first shock and of the aftershocks, and impact on nuclear facilities (maximum acceleration values detected with respect to design-basis values, subsidence of coastal areas, and submersion of power plant platforms). (J.S.)

  12. Seismic hazard study for the TREAT Reactor facility at the INEL, Idaho

    International Nuclear Information System (INIS)

    1979-01-01

    The TREAT Reactor is founded on a thick unfaulted sequence of Plio-Pleistocene basalt on the Snake River Plain. The plain is presently aseismic; however, seismic activity occurs in the mountains around the plain. The Howe Scarp is located 19 miles from TREAT and contains a known capable fault. Evaluation of this and other faults in the region indicates the Howe Scarp is the most significant earthquake fault for TREAT. A maximum credible earthquake on this fault could produce a maximum ground motion of about 0.22 g at TREAT. A study of three range-front fault systems north of the Snake River Plain indicates the fault systems have not ruptured as a unit in the past, and cross-range faults, mountain spurs and reentrants generally bound the definable fault sets in the range-front systems. This study indicates future surface fault rupture and earthquake events will follow a similar pattern of contiguous faulting; each individual surface rupture event should only involve a single fault set of the range-front fault system. Surface faulting on contiguous fault sets should be separated by significant intervals of geologic time. Certain volcanic hazards have also been examined and discussed.

  13. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    Full Text Available The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average/long-term-average (STA/LTA) algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
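The short-term-average/long-term-average detector described above can be sketched on a synthetic tweet-count series; the window lengths and trigger threshold below are illustrative assumptions, not the USGS settings:

```python
def sta_lta_triggers(series, sta_len=3, lta_len=30, threshold=4.0):
    """Short-term-average / long-term-average detector: flag index i when
    the mean of the last sta_len samples exceeds threshold times the mean
    of the last lta_len samples. Windows and threshold are illustrative."""
    triggers = []
    for i in range(lta_len, len(series)):
        sta = sum(series[i - sta_len:i]) / sta_len
        lta = sum(series[i - lta_len:i]) / lta_len
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers

# Flat background of ~2 tweets per interval with a burst at indices 40-42:
counts = [2] * 40 + [40, 60, 50] + [2] * 10
print(sta_lta_triggers(counts))  # [41, 42, 43, 44]
```

The long window adapts to the background chatter level, which is why a fixed-count threshold alone would not work on a global tweet stream.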

  14. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    Science.gov (United States)

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.
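Gridded forecast tests of this kind are typically scored with a per-cell Poisson joint log-likelihood; the sketch below applies that standard score to a toy grid, not the actual RELM data or test statistics:

```python
import math

def poisson_log_likelihood(forecast, observed):
    """Joint Poisson log-likelihood of observed per-cell counts n_i given
    forecast per-cell rates lambda_i:
    L = sum_i ( -lambda_i + n_i * log(lambda_i) - log(n_i!) )."""
    total = 0.0
    for lam, n in zip(forecast, observed):
        total += -lam + n * math.log(lam) - math.lgamma(n + 1)
    return total

# Toy 4-cell grid: one earthquake observed in cell 0.
observed = [1, 0, 0, 0]
forecast_a = [0.7, 0.1, 0.1, 0.1]    # concentrates rate where the event hit
forecast_b = [0.25, 0.25, 0.25, 0.25]  # spreads the same total rate evenly
assert poisson_log_likelihood(forecast_a, observed) > poisson_log_likelihood(forecast_b, observed)
```

A higher (less negative) log-likelihood means the forecast placed more of its rate where earthquakes actually occurred, which is the sense in which one submission is more "successful" than another.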

  15. Machine learning methods for credibility assessment of interviewees based on posturographic data.

    Science.gov (United States)

    Saripalle, Sashi K; Vemulapalli, Spandana; King, Gregory W; Burgoon, Judee K; Derakhshani, Reza

    2015-01-01

    This paper discusses the advantages of using posturographic signals from force plates for non-invasive credibility assessment. The contributions of our work are twofold: first, the proposed method is highly efficient and non-invasive; second, the feasibility of creating an autonomous credibility assessment system using machine-learning algorithms is studied. This study employs an interview paradigm that includes subjects responding with truthful and deceptive intent while their center of pressure (COP) signal is being recorded. Classification models utilizing sets of COP features for deceptive responses are derived, and a best accuracy of 93.5% on the test interval is reported.
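The classification step described above can be illustrated with a deliberately tiny stand-in: a nearest-centroid classifier over made-up COP-derived features (the study's actual features, algorithms, and data are not reproduced here):

```python
def nearest_centroid_fit(features, labels):
    """Compute one centroid (per-dimension mean) per class label
    over feature vectors such as COP sway range and mean velocity."""
    centroids = {}
    for label in set(labels):
        rows = [f for f, l in zip(features, labels) if l == label]
        centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
    return centroids

def nearest_centroid_predict(centroids, x):
    """Assign x to the class whose centroid is closest (squared Euclidean)."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda label: dist2(centroids[label], x))

# Hypothetical COP features (e.g. sway range, mean velocity) per trial:
X = [[0.2, 1.0], [0.3, 1.1], [0.9, 2.0], [1.0, 2.2]]
y = ["truthful", "truthful", "deceptive", "deceptive"]
model = nearest_centroid_fit(X, y)
print(nearest_centroid_predict(model, [0.95, 2.1]))  # deceptive
```

In practice one would extract many COP features per interview response and use a cross-validated classifier, as the paper's reported 93.5% test accuracy implies.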

  16. Motivation and treatment credibility predict alliance in cognitive behavioral treatment for youth with anxiety disorders in community clinics.

    Science.gov (United States)

    Fjermestad, K W; Lerner, M D; McLeod, B D; Wergeland, G J H; Haugland, B S M; Havik, O E; Öst, L-G; Silverman, W K

    2017-11-16

    We examined whether motivation and treatment credibility predicted alliance in a 10-session cognitive behavioral treatment delivered in community clinics for youth anxiety disorders. Ninety-one clinic-referred youths (mean age = 11.4 years, standard deviation = 2.1, range 8-15 years, 49.5% boys) with anxiety disorders rated treatment motivation at pretreatment and perceived treatment credibility after session 1. Youths and therapists (YT) rated alliance after session 3 (early) and session 7 (late). Hierarchical linear models were applied to examine whether motivation and treatment credibility predicted YT early alliance, YT alliance change, and YT alliance agreement. Motivation predicted high early YT alliance, but not YT alliance change or alliance agreement. Youth-rated treatment credibility predicted high early youth alliance and high YT positive alliance change, but not early therapist alliance or alliance agreement. In conclusion, efforts to enhance youth motivation and treatment credibility early in treatment could facilitate the formation of a strong YT alliance. © 2017 Wiley Periodicals, Inc.

  17. The Influence of Brand Credibility on Information Efficiency and Risk Reduction, and Its Impact on Repurchase Intention

    OpenAIRE

    Faisal, Aekram

    2015-01-01

    This research was conducted to determine the influence of Brand Credibility on Information Efficiency and Risk Reduction, as well as the influence of Information Efficiency and Risk Reduction on Repurchase Intention. It also aimed to determine the influence of Brand Credibility on Repurchase Intention as mediated by Information Efficiency and Risk Reduction. The methodology of this research is hypothesis testing. The sample was collected by questionnaire from 150 respondents from Starb...

  18. Earthquakes, September-October 1986

    Science.gov (United States)

    Person, W.J.

    1987-01-01

    There was one great earthquake (8.0 and above) during this reporting period, in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  19. EARTHQUAKE-INDUCED DEFORMATION STRUCTURES AND RELATED TO EARTHQUAKE MAGNITUDES

    Directory of Open Access Journals (Sweden)

    Savaş TOPAL

    2003-02-01

    Full Text Available Earthquake-induced deformation structures, which are called seismites, may be helpful in classifying the paleoseismic history of a location and in estimating the magnitudes of potential future earthquakes. In this paper, seismites were investigated according to the types formed in deep and shallow lake sediments. Seismites are observed in the form of sand dikes, intruded and fractured gravels, and pillow structures in shallow lake sediments, and of pseudonodules, mushroom-like silts protruding into laminites, mixed layers, disturbed varved lamination and loop bedding in deep lake sediments. Drawing on previous studies, these earthquake-induced deformation structures were ordered according to their mode of formation and associated earthquake magnitudes. In this ordering, the lowest earthquake record is loop bedding and the highest is intruded and fractured gravels in lacustrine deposits.

  20. The Loma Prieta, California, Earthquake of October 17, 1989: Societal Response

    Science.gov (United States)

    Coordinated by Mileti, Dennis S.

    1993-01-01

    Professional Paper 1553 describes how people and organizations responded to the earthquake and how the earthquake impacted people and society. The investigations evaluate the tools available to the research community to measure the nature, extent, and causes of damage and losses. They describe human behavior during and immediately after the earthquake and how citizens participated in emergency response. They review the challenges confronted by police and fire departments and disruptions to transbay transportation systems. And they survey the challenges of post-earthquake recovery. Some significant findings were:
    * Loma Prieta provided the first test of ATC-20, the red, yellow, and green tagging of buildings. Its successful application has led to widespread use in other disasters, including the September 11, 2001, New York City terrorist incident.
    * Most people responded calmly and without panic to the earthquake and acted to get themselves to a safe location.
    * Actions by people to help alleviate emergency conditions were proportional to the level of need at the community level.
    * Some solutions caused problems of their own. The police perimeter around the Cypress Viaduct isolated businesses from their customers, leading to a loss of business, and the evacuation of employees from those businesses hindered the movement of supplies to the disaster scene.
    * Emergency transbay ferry service was established 6 days after the earthquake but required constant revision of service contracts and schedules.
    * The Loma Prieta earthquake produced minimal disruption to the regional economy. The total economic disruption resulted in maximum losses to the Gross Regional Product of $725 million in 1 month and $2.9 billion in 2 months, but 80% of the loss was recovered during the first 6 months of 1990. Approximately 7,100 workers were laid off.

  1. Management of limb fractures in a teaching hospital: comparison between Wenchuan and Yushu earthquakes

    Directory of Open Access Journals (Sweden)

    MIN Li

    2013-02-01

    Full Text Available Objective: To comparatively analyze the medical records of patients with limb fractures as well as rescue strategies in the Wenchuan and Yushu earthquakes so as to provide references for post-earthquake rescue. Methods: We retrospectively investigated 944 patients sustaining limb fractures, including 891 in the Wenchuan earthquake and 53 in the Yushu earthquake, who were admitted to West China Hospital (WCH) of Sichuan University. Results: In the Wenchuan earthquake, WCH met three peaks of limb-fracture patient influx, on post-earthquake days (PED) 2, 8 and 14 respectively. Between PED 3-14, 585 patients were transferred from WCH to other hospitals outside Sichuan Province. In the Yushu earthquake, the maximum influx of limb-fracture patients happened on PED 3, and no one was shifted to other hospitals. In both the Wenchuan and Yushu earthquakes, most limb fractures were caused by blunt strikes and crush/burying. In the Wenchuan earthquake, there were 396 (396/942, 42.0%) open limb fractures, including 28 Gustilo I, 201 Gustilo II and 167 Gustilo III injuries. In the Yushu earthquake, the incidence of open limb fracture was much lower (6/61, 9.8%). The percentage of patients with acute complications in the Wenchuan earthquake (167/891, 18.7%) was much higher than that in the Yushu earthquake (5/53, 3.8%). In the Wenchuan earthquake rescue, 1,018 surgeries were performed, comprising debridement in 376, internal fixation in 283, external fixation in 119, and vacuum sealing drainage in 117, etc. Among the 64 surgeries in the Yushu earthquake rescue, internal fixation for limb fracture was mostly adopted. All patients received proper treatment and survived except one who died due to multiple organ failure in the Wenchuan earthquake. Conclusion: Provision of suitable and sufficient medical care in a catastrophe can only be achieved by construction of a sophisticated national disaster medical system, prediction of the injury types and number of injuries, and confirmation of

  2. GIS learning tool for world's largest earthquakes and their causes

    Science.gov (United States)

    Chatterjee, Moumita

    The objective of this thesis is to increase awareness about earthquakes among people, especially young students, by showing the five largest and two most predictable earthquake locations in the world and their plate tectonic settings. This is a geography-based interactive tool that can be used for learning about the causes of great earthquakes in the past and the safest places on the earth for avoiding the direct effects of earthquakes. This approach provides an effective way of learning for students, as it is very user friendly and more aligned to the interests of the younger generation. In this tool the user can click on various points located on the world map; each opens a picture and a link to the webpage for that point, showing detailed information on the earthquake history of that place, including the magnitudes and years of past quakes and the plate tectonic settings that made the place earthquake prone. Apart from this earthquake-related information, students will also be able to customize the tool to suit their needs or interests: add/remove layers, measure the distance between any two points on the map, select any place on the map and get more information about it, create a layer for detailed analysis, run a query, change display settings, etc. At the end, the user goes through earthquake safety guidelines in order to be safe during an earthquake. The tool is written in Java and uses Map Objects Java Edition (MOJO) provided by ESRI. It was developed for educational purposes, so its interface has been kept simple and easy to use so that students can gain maximum knowledge from it instead of having a hard time installing it. There are many details to explore that show what a GIS-based tool is capable of. The only thing needed to run the tool is the latest Java edition installed on the machine. This approach makes study more fun and

  3. The Credibility of Policy Reporting Across Learning Disciplines

    Energy Technology Data Exchange (ETDEWEB)

    Carley, S.; Youtie, J.; Solomon, G.; Porter, A.

    2016-07-01

    The notion of a credibility map argues that everyone has a distinctive map that dictates the preference given to different types and sources of information. When seeking to influence other academic fields, scholars will likely turn to scientific and technical information, though other types, such as policy reports, may also be relevant. We draw on the credibility mapping concept to understand how a major policy report is taken up by the target academic community. The report, How People Learn, was published by the US National Academies in 2000 to expose the education community (mainly educational researchers but also knowledge-seeking practitioners) to major cognitive science research findings of relevance to learning. We applied several search strings to measure the take-up of this report in the target community. Using Google Scholar, we show that How People Learn was cited in nearly 15,000 publications, that these citations grew particularly steeply from 2000 to 2008, and that most were in education-related journal papers. We performed a similar analysis using the Web of Science, which showed that most of the citations were substantial as opposed to perfunctory. (Author)

  4. Megathrust earthquakes in Central Chile: What is next after the Maule 2010 earthquake?

    Science.gov (United States)

    Madariaga, R.

    2013-05-01

    The 27 February 2010 Maule earthquake occurred in a well-identified gap in the Chilean subduction zone. The event has now been studied in detail using far-field and near-field seismic and geodetic data; we will review the information gathered so far. The event broke a region that was much longer along strike than the gap left over from the 1835 Concepcion earthquake, sometimes called the Darwin earthquake because Darwin was in the area when it occurred and made many observations. Recent studies of contemporary documents by Udias et al. indicate that the area broken by the Maule earthquake in 2010 had previously been broken by a similar earthquake in 1751, but several events in the magnitude 8 range occurred in the area, principally in 1835 as already mentioned and, more recently, on 1 December 1928 to the north and on 21 May 1960 (1 1/2 days before the big Chilean earthquake of 1960). Currently the area of the 2010 earthquake and the region immediately to the north is undergoing a very large increase in seismicity, with numerous clusters of seismicity that move along the plate interface. Examination of the seismicity of Chile in the 18th and 19th centuries shows that the region immediately to the north of the 2010 earthquake broke in a very large megathrust event in July 1730. This is the largest known earthquake in central Chile. The region where this event occurred has broken on many occasions with M 8 range earthquakes, in 1822, 1880, 1906, 1971 and 1985. Is it preparing for a new very large megathrust event? The 1906 earthquake of Mw 8.3 filled the central part of the gap, but the region has broken again on several occasions, in 1971, 1973 and 1985. The main question is whether the 1906 earthquake relieved enough stress from the 1730 rupture zone. Geodetic data show that most of the region that broke in 1730 is currently almost fully locked, from the northern end of the Maule earthquake at 34.5°S to 30°S, near the southern end of the Mw 8.5 Atacama earthquake of 11

  5. EXAMINING THE EFFECT OF ENDORSER CREDIBILITY ON THE CONSUMERS' BUYING INTENTIONS: AN EMPIRICAL STUDY IN TURKEY

    Directory of Open Access Journals (Sweden)

    Aysegul Ermec Sertoglu

    2014-01-01

    Full Text Available The purpose of this study is to test whether source credibility affects buying intention and to measure the perceived credibility differences between a created spokesperson and a celebrity endorser. The influence that the endorser credibility dimensions (i.e., attractiveness, trustworthiness and expertise) have on the purchase intentions of 326 young consumers has been examined. The results showed that all three credibility dimensions, for both the celebrity endorser and the created spokesperson, have a positive relationship with purchase intention. The created spokesperson is perceived to be more trustworthy and competent, whereas the celebrity endorser is found to be more attractive by the respondents. This study is unique in that it covers the fairly new and rapidly growing Turkish market. One factor that makes this study unique in Turkey, where the use of celebrity endorsers plays a significant part in the marketing of products, is the lack of studies measuring the effectiveness of this method.

  6. Earthquake focal mechanism forecasting in Italy for PSHA purposes

    Science.gov (United States)

    Roselli, Pamela; Marzocchi, Warner; Mariucci, Maria Teresa; Montone, Paola

    2018-01-01

    In this paper, we put forward a procedure that aims to forecast the focal mechanisms of future earthquakes. One of the primary uses of such forecasts is in probabilistic seismic hazard analysis (PSHA); in fact, aiming to reduce epistemic uncertainty, most of the newer ground motion prediction equations consider, besides the seismicity rates, the forecast focal mechanism of the next large earthquakes as input data. The data set used for this purpose consists of focal mechanisms taken from the latest stress map release for Italy, containing 392 well-constrained solutions of events from 1908 to 2015 with Mw ≥ 4 and depths from 0 down to 40 km. The data set considers polarity focal mechanism solutions up to 1975 (23 events), whereas for 1976-2015 it takes into account only Centroid Moment Tensor (CMT)-like earthquake focal solutions, for data homogeneity. The forecasting model is rooted in the Total Weighted Moment Tensor concept, which weighs information from past focal mechanisms evenly distributed in space according to their distance from the spatial cells and their magnitude. Specifically, for each cell of a regular 0.1° × 0.1° spatial grid, the model estimates the probability of observing a normal, reverse, or strike-slip fault plane solution for the next large earthquakes, the expected moment tensor and the related maximum horizontal stress orientation. These results will be available for the new PSHA model for Italy under development. Finally, to evaluate the reliability of the forecasts, we test them against an independent data set consisting of some of the strongest earthquakes (Mw ≥ 3.9) that occurred during 2016 in different Italian tectonic provinces.
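    The idea of weighting past mechanisms by distance and magnitude can be sketched as follows. This is a loose illustration of the Total Weighted Moment Tensor concept only: the exponential distance kernel, the moment-proportional weight, and all event data are invented, not the paper's actual formulation.

```python
import math

def style_probabilities(cell, past_events, h_km=50.0):
    """Probability of each faulting style in a grid cell, from past focal
    mechanisms weighted by a distance kernel times seismic moment.
    cell: (x_km, y_km); past_events: (x_km, y_km, Mw, style) tuples."""
    weights = {}
    for x, y, mw, style in past_events:
        d = math.hypot(cell[0] - x, cell[1] - y)
        w = math.exp(-d / h_km) * 10 ** (1.5 * mw)  # distance decay x moment
        weights[style] = weights.get(style, 0.0) + w
    total = sum(weights.values())
    return {s: w / total for s, w in weights.items()}

# Two nearby normal-faulting events and one distant reverse event (invented)
events = [(10, 0, 5.0, "normal"), (12, 5, 4.2, "normal"), (200, 120, 6.0, "reverse")]
probs = style_probabilities((11, 2), events)
print(max(probs, key=probs.get))  # nearby normal events dominate this cell
```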

  7. Maximum foreseeable accident analysis of a sodium leak on the BN-800 primary circuit and the most constraining accident development scenario

    International Nuclear Information System (INIS)

    Ivanenko, V.N.; Zybin, V.A.

    1988-01-01

    In this paper the different possible developments of the BN-800 maximum credible accident, in case of leakage and fire of primary sodium, are examined. The most constraining scenario is presented. During the scenario analysis the accidental release of radioactive materials into the environment has been studied. These releases are below the authorized values.

  8. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    The geographic information system of the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of damage and the isoseismals of the earthquake are studied. By comparison with the standard earthquake intensity attenuation relationship, the abnormal damage distribution of the earthquake is identified, and its relationship to tectonics, site conditions and basins is analyzed. The influence on ground motion of the earthquake source and of the underground structures near the source is also studied, and the implications of the abnormal damage distribution for seismic zonation, anti-earthquake design, earthquake prediction and earthquake emergency response are discussed.

  9. Earthquakes, November-December 1977

    Science.gov (United States)

    Person, W.J.

    1978-01-01

    Two major earthquakes occurred in the last 2 months of the year. A magnitude 7.0 earthquake struck San Juan Province, Argentina, on November 23, causing fatalities and damage. The second major earthquake was a magnitude 7.0 in the Bonin Islands region, an unpopulated area. On December 19, Iran experienced a destructive earthquake, which killed over 500 people.

  10. Protecting your family from earthquakes: The seven steps to earthquake safety

    Science.gov (United States)

    Developed by American Red Cross, Asian Pacific Fund

    2007-01-01

    This book is provided here because of the importance of preparing for earthquakes before they happen. Experts say it is very likely there will be a damaging San Francisco Bay Area earthquake in the next 30 years and that it will strike without warning. It may be hard to find the supplies and services we need after this earthquake. For example, hospitals may have more patients than they can treat, and grocery stores may be closed for weeks. You will need to provide for your family until help arrives. To keep our loved ones and our community safe, we must prepare now. Some of us come from places where earthquakes are also common. However, the dangers of earthquakes in our homelands may be very different from those in the Bay Area. For example, many people in Asian countries die in major earthquakes when buildings collapse or from big sea waves called tsunamis. In the Bay Area, the main danger is from objects inside buildings falling on people. Take action now to make sure your family will be safe in an earthquake. The first step is to read this book carefully and follow its advice. By making your home safer, you help make our community safer. Preparing for earthquakes is important, and together we can make sure our families and community are ready. English version p. 3-13 Chinese version p. 14-24 Vietnamese version p. 25-36 Korean version p. 37-48

  11. Earthquake Triggering in the September 2017 Mexican Earthquake Sequence

    Science.gov (United States)

    Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.

    2017-12-01

    Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M>6 quakes in southern Mexico to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the subduction of the Cocos plate. By the traditional definition, an event is likely an aftershock if it occurs within two rupture lengths of the main shock soon afterwards. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Tehuantepec gulf and one on 23 September near Ixtepec in Oaxaca, both fit as traditional aftershocks, within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from the Copernicus Sentinel-1A and -1B satellites and the JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extending > 150 km NW from the hypocenter, longer than the slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress
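    The traditional aftershock rule of thumb quoted above is easy to state in code. A minimal sketch, assuming a placeholder time window (the abstract says only "soon afterwards") and an order-of-magnitude rupture length for the M8.2 event:

```python
def is_traditional_aftershock(dist_km, days_after, rupture_length_km, window_days=90):
    """Traditional rule of thumb: likely an aftershock if within two rupture
    lengths of the main shock and soon afterwards (window is an assumption)."""
    return dist_km <= 2 * rupture_length_km and 0 <= days_after <= window_days

# Assuming a ~150 km rupture length for the M8.2 main shock
print(is_traditional_aftershock(180, 15, 150))   # within 300 km -> True
print(is_traditional_aftershock(600, 11, 150))   # Puebla-like distance -> False
```

By this criterion the two Mw 6.1 events qualify as aftershocks while the Mw 7.1 Puebla event, at ~600 km, does not, which is why the paper turns to static and dynamic stress transfer instead.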

  12. A Study of the Historical Earthquake Catalog and Gutenberg-richter Parameter Values of the Korean Peninsula

    International Nuclear Information System (INIS)

    Seo, Jeong Moon; Choi, In Kil; Rhee, Hyun Me

    2010-01-01

    The KIER's Korean historical earthquake catalog was revised for MMI ≥ VI events recorded from 27 A.D. to 1904. The magnitude of each event was determined directly from the criteria suggested by Seo. The criteria incorporate the damage phenomena of the Japanese historical earthquake catalog, recent seismological studies, and the results of tests performed on ancient structures in Korea. Thus, the uncertainty in the magnitudes of the Korean historical earthquakes can be reduced. The Gutenberg-Richter parameter values were then estimated from the revised catalog. It was determined that the magnitudes of the maximum inland and minimum offshore events were approximately 6.3 and 6.5, respectively. The Gutenberg-Richter parameter pairs of the historical earthquake catalog were estimated to be a=5.32±0.21, b=0.95±0.19, somewhat lower than those obtained from recent complete instrumental earthquake records. No apparent change in the Gutenberg-Richter parameters is observed for the seismically active period of the 16th-17th centuries.
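    The b value of a catalog like this is commonly estimated with Aki's maximum-likelihood formula, b = log10(e) / (mean(M) - Mc), with Utsu's half-bin correction for magnitude rounding. A minimal sketch with a synthetic catalog (the magnitudes below are invented, not the Korean data):

```python
import math

def aki_b_value(mags, m_min, dm=0.1):
    """Aki (1965) maximum-likelihood b-value for magnitudes >= m_min,
    with Utsu's correction dm/2 for magnitude binning."""
    m = [x for x in mags if x >= m_min]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (m_min - dm / 2.0))

# Synthetic catalog roughly following Gutenberg-Richter with b near 1
catalog = [4.0, 4.1, 4.3, 4.0, 4.6, 5.0, 4.2, 4.4, 4.1, 5.3]
print(round(aki_b_value(catalog, 4.0), 2))  # -> 0.97
```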

  13. Evidence for strong Holocene earthquake(s) in the Wabash Valley seismic zone

    International Nuclear Information System (INIS)

    Obermeier, S.

    1991-01-01

    Many small and slightly damaging earthquakes have taken place in the region of the lower Wabash River Valley of Indiana and Illinois during the 200 years of historic record. Seismologists have long suspected the Wabash Valley seismic zone to be capable of producing earthquakes much stronger than the largest of record (mb 5.8). The seismic zone contains the poorly defined Wabash Valley fault zone and also appears to contain other vaguely defined faults at the depths from which the strongest earthquakes presently originate. Faults near the surface are generally covered with thick alluvium in lowlands and a veneer of loess in uplands, which makes direct observation of faults difficult. Partly because of this difficulty, a search for paleoliquefaction features was begun in 1990. Conclusions of the study are as follows: (1) an earthquake much stronger than any historic earthquake struck the lower Wabash Valley between 1,500 and 7,500 years ago; (2) the epicentral region of the prehistoric strong earthquake was the Wabash Valley seismic zone; (3) apparent sites have been located where 1811-12 earthquake accelerations can be bracketed.

  14. Development of damage probability matrices based on Greek earthquake damage data

    Science.gov (United States)

    Eleftheriadou, Anastasia K.; Karabinis, Athanasios I.

    2011-03-01

    A comprehensive study is presented for empirical seismic vulnerability assessment of typical structural types, representative of the building stock of Southern Europe, based on a large set of damage statistics. The observational database was obtained from post-earthquake surveys carried out in the area struck by the September 7, 1999 Athens earthquake. After analysis of the collected observational data, a unified damage database was created comprising 180,945 damaged buildings from the near-field area of the earthquake. The damaged buildings are classified into specific structural types according to the materials, seismic codes and construction techniques of Southern Europe. The seismic demand is described in terms of both the regional macroseismic intensity and the ratio ag/ao, where ag is the maximum peak ground acceleration (PGA) of the earthquake event and ao is the unique PGA value that characterizes each municipality on the Greek hazard map. The relative and cumulative frequencies of the different damage states for each structural type and each intensity level are computed in terms of damage ratio. Damage probability matrices (DPMs) and vulnerability curves are obtained for specific structural types. A comparison analysis is carried out between the produced and the existing vulnerability models.
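    An empirical damage probability matrix of this kind is essentially a table of relative frequencies of damage states per intensity level. A minimal sketch from toy survey records (the intensities and damage states below are invented):

```python
from collections import Counter

def damage_probability_matrix(surveys):
    """Relative frequency of each damage state per intensity level,
    from (intensity, damage_state) survey records."""
    by_intensity = {}
    for intensity, state in surveys:
        by_intensity.setdefault(intensity, Counter())[state] += 1
    dpm = {}
    for intensity, counts in by_intensity.items():
        total = sum(counts.values())
        dpm[intensity] = {s: n / total for s, n in sorted(counts.items())}
    return dpm

# Hypothetical records: (macroseismic intensity, damage state 0=none .. 3=heavy)
records = [("VII", 0), ("VII", 0), ("VII", 1), ("VII", 2),
           ("VIII", 1), ("VIII", 2), ("VIII", 2), ("VIII", 3)]
print(damage_probability_matrix(records)["VII"][0])  # -> 0.5
```

Cumulative frequencies (the probability of reaching or exceeding each damage state) follow by summing the rows from the highest state downward.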

  15. Characteristics and damage investigation of the 1998 Papua New Guinea earthquake tsunami

    International Nuclear Information System (INIS)

    Matsuyama, Masashi

    1998-01-01

    On 17 July 1998, an earthquake with moment magnitude Mw 7.1 (estimated by Harvard University) occurred at 18:49 (local time) off the northwest coast of Papua New Guinea. Several minutes after the main shock, a huge tsunami struck the north coast at Sissano and Malol, where the coast consists of a straight white-sand beach and about 7,000 people lived in high-floor wooden houses. The tsunami killed more than 2,000 people. To investigate the damage, a seven-member survey team was organized in Japan. The author took part in the team, which was headed by Prof. Kawata of Kyoto University. We stayed in Papua New Guinea from 30 July through 10 August 1998 to investigate the maximum water level, to interview people about the phenomena caused by the earthquake and the tsunami, and to install three seismographs. The results imply that: (1) During the main shock, an earthquake intensity of 6 on the Richter scale was felt in Sissano and Malol, and liquefaction took place in the coastal area nearby. (2) More than 2,000 people were killed, mainly by the tsunami. (3) The maximum water level of the tsunami was about 15 m. (4) It seems the tsunami was caused not only by crustal movement but also by other factors. This is suggested by the fact that the measured maximum water level was more than 10 times larger than the estimated one, calculated by numerical simulation based on the known fault parameters. It is highly probable that a submarine landslide was one of the main factors that amplified the tsunami. (author)

  16. Simulating Terrorism: Credible Commitment, Costly Signaling, and Strategic Behavior

    Science.gov (United States)

    Siegel, David A.; Young, Joseph K.

    2009-01-01

    We present two simulations designed to convey the strategic nature of terrorism and counterterrorism. The first is a simulated hostage crisis, designed primarily to illustrate the concepts of credible commitment and costly signaling. The second explores high-level decision making of both a terrorist group and the state, and is designed to…

  17. Earthquake hazard analysis for the different regions in and around Ağrı

    Energy Technology Data Exchange (ETDEWEB)

    Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr [Karadeniz Technical University, Trabzon (Turkey); Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr [Ağrı İbrahim Çeçen University, Ağrı (Turkey)

    2016-04-18

    We investigated earthquake hazard parameters for the eastern part of Turkey by determining the a and b parameters of the Gutenberg–Richter magnitude–frequency relationship. For this purpose, the study area was divided into seven source zones based on their tectonic and seismotectonic regimes. The database used in this work was compiled from different sources and catalogues, such as TURKNET, the International Seismological Centre (ISC), Incorporated Research Institutions for Seismology (IRIS) and The Scientific and Technological Research Council of Turkey (TUBITAK), for the instrumental period. We calculated the a value and the b value, the slope of the Gutenberg–Richter frequency–magnitude relationship, by the maximum likelihood method (ML). We also estimated the mean return periods, the most probable maximum magnitude in a time period of t years, and the probability of occurrence of an earthquake with magnitude ≥ M during a time span of t years, using the Zmap software. The lowest b value was calculated in Region 1, which covers the Cobandede Fault Zone. We obtained the highest a value in Region 2, which covers the Kagizman Fault Zone. This conclusion is strongly supported by the probability value, which is largest (87%) for an earthquake with magnitude greater than or equal to 6.0. The mean return period for such a magnitude is lowest in this region (49 years). The most probable magnitude in the next 100 years was calculated, and the highest value was determined around the Cobandede Fault Zone. According to these parameters, Region 1, covering the Cobandede Fault Zone, is the most dangerous area in the eastern part of Turkey.
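    Given Gutenberg–Richter parameters a and b, the mean return period and occurrence probability follow directly from log10 N(M) = a - bM and a Poisson occurrence model. A minimal sketch with illustrative values (not the paper's regional estimates; a is assumed normalized to events per year):

```python
import math

def annual_rate(a, b, m):
    """Cumulative annual rate of events with magnitude >= m from
    log10 N(m) = a - b*m."""
    return 10 ** (a - b * m)

def prob_at_least_one(a, b, m, t_years):
    """Poisson probability of at least one M >= m event in t years."""
    return 1.0 - math.exp(-annual_rate(a, b, m) * t_years)

# Illustrative parameter values only
a, b = 4.5, 1.0
print(round(1.0 / annual_rate(a, b, 6.0), 1))        # mean return period, years
print(round(prob_at_least_one(a, b, 6.0, 100), 2))   # probability in 100 years
```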

  18. I FEEL CONNECTED: A COMPARATIVE STUDY OF STUDENT ENGAGEMENT AND LECTURERS’ CREDIBILITY

    Directory of Open Access Journals (Sweden)

    Padma Pillai

    2016-12-01

    Full Text Available Communication is essential: having the ability to communicate thoughts, ideas, and feelings is crucial in all environments. The education industry regards communication as a core business for transferring knowledge. This paper focuses on how two different groups of students at Sunway University, Malaysia, perceived Lecturers' Credibility (LC) in class as it relates to Students' Engagement (SE). Groups of 50 to 60 students from the Faculty of Arts (FoA) and the School of Business (SoB) completed measures of LC and SE using McCroskey and Teven's (1999) Source Credibility Questionnaire (SCQ) and the Students Engagement Survey from the Indicators of Positive Development Conference, Child Trends. The variables for LC comprise competence, character and caring (CCC), and the variables for SE comprise cognition, behaviour and emotion (CBE). The study aims to determine whether there are any differences in SE between students from the FoA and the SoB with respect to their perceived LC. Hopefully, the study sheds some light on the research question: "Are there any differences between Faculty of Arts students and School of Business students in the relationship between lecturers' credibility and students' engagement?"

  19. Truth and Credibility in Sincere Policy Analysis: Alternative Approaches for the Production of Policy-Relevant Knowledge.

    Science.gov (United States)

    Bozeman, Barry; Landsbergen, David

    1989-01-01

    Two competing approaches to policy analysis are distinguished: a credibility approach, and a truth approach. According to the credibility approach, the policy analyst's role is to search for plausible argument rather than truth. Each approach has pragmatic tradeoffs in fulfilling the goal of providing usable knowledge to decision makers. (TJH)

  20. Regional dependence in earthquake early warning and real time seismology

    International Nuclear Information System (INIS)

    Caprio, M.

    2013-01-01

    An effective earthquake prediction method is still a chimera. What we can do at the moment, after the occurrence of a seismic event, is to provide the maximum available information as soon as possible. This can help reduce the impact of the quake on the population and better organize post-event rescue operations. This study strives to improve the evaluation of earthquake parameters shortly after the occurrence of a major earthquake, and the characterization of regional dependencies in Real-Time Seismology. The recent earthquake experience from Tohoku (M 9.0, 11.03.2011) showed how an efficient EEW system can inform numerous people and thus potentially reduce the economic and human losses by distributing warning messages several seconds before the arrival of seismic waves. In the case of devastating earthquakes, the common communication channels can be overloaded or broken in the first minutes to days after the main shock. In such cases, a precise knowledge of the macroseismic intensity distribution represents a decisive contribution to relief management and to the evaluation of losses. In this work, I focused on improving the adaptability of EEW systems (chapters 1 and 2) and on deriving a global relationship for converting peak ground motion into macroseismic intensity and vice versa (chapter 3). For EEW applications, in chapter 1 we present an evolutionary approach to magnitude estimation for earthquake early warning based on real-time inversion of displacement spectra. The Spectrum Inversion (SI) method estimates magnitude and its uncertainty by inferring the shape of the entire displacement spectral curve from the part of the spectrum constrained by the available data. Our method can be applied in any region without the need for calibration. SI magnitude and uncertainty estimates are updated each second following the initial P detection and potentially stabilize within 10 seconds of the initial earthquake detection.

  1. Regional dependence in earthquake early warning and real time seismology

    Energy Technology Data Exchange (ETDEWEB)

    Caprio, M.

    2013-07-01

    An effective earthquake prediction method is still a chimera. What we can do at the moment, after the occurrence of a seismic event, is to provide the maximum available information as soon as possible. This can help reduce the impact of the quake on the population and better organize post-event rescue operations. This study strives to improve the evaluation of earthquake parameters shortly after the occurrence of a major earthquake, and the characterization of regional dependencies in Real-Time Seismology. The recent earthquake experience from Tohoku (M 9.0, 11.03.2011) showed how an efficient EEW system can inform numerous people and thus potentially reduce the economic and human losses by distributing warning messages several seconds before the arrival of seismic waves. In the case of devastating earthquakes, the common communication channels can be overloaded or broken in the first minutes to days after the main shock. In such cases, a precise knowledge of the macroseismic intensity distribution represents a decisive contribution to relief management and to the evaluation of losses. In this work, I focused on improving the adaptability of EEW systems (chapters 1 and 2) and on deriving a global relationship for converting peak ground motion into macroseismic intensity and vice versa (chapter 3). For EEW applications, in chapter 1 we present an evolutionary approach to magnitude estimation for earthquake early warning based on real-time inversion of displacement spectra. The Spectrum Inversion (SI) method estimates magnitude and its uncertainty by inferring the shape of the entire displacement spectral curve from the part of the spectrum constrained by the available data. Our method can be applied in any region without the need for calibration. SI magnitude and uncertainty estimates are updated each second following the initial P detection and potentially stabilize within 10 seconds of the initial earthquake detection.

  2. The 1985 central chile earthquake: a repeat of previous great earthquakes in the region?

    Science.gov (United States)

    Comte, D; Eisenberg, A; Lorca, E; Pardo, M; Ponce, L; Saragoni, R; Singh, S K; Suárez, G

    1986-07-25

    A great earthquake (surface-wave magnitude, 7.8) occurred along the coast of central Chile on 3 March 1985, causing heavy damage to coastal towns. Intense foreshock activity near the epicenter of the main shock occurred for 11 days before the earthquake. The aftershocks of the 1985 earthquake define a rupture area of 170 by 110 square kilometers. The earthquake was forecast on the basis of the nearly constant repeat time (83 +/- 9 years) of great earthquakes in this region. An analysis of previous earthquakes suggests that the rupture lengths of great shocks in the region vary by a factor of about 3. The nearly constant repeat time and variable rupture lengths cannot be reconciled with time- or slip-predictable models of earthquake recurrence. The great earthquakes in the region seem to involve a variable rupture mode and yet, for unknown reasons, remain periodic. Historical data suggest that the region south of the 1985 rupture zone should now be considered a gap of high seismic potential that may rupture in a great earthquake in the next few tens of years.

  3. Scientific risk communication about controversial issues influences public perceptions of scientists' political orientations and credibility.

    Science.gov (United States)

    Vraga, Emily; Myers, Teresa; Kotcher, John; Beall, Lindsey; Maibach, Ed

    2018-02-01

    Many scientists communicate with the public about risks associated with scientific issues, but such communication may have unintended consequences for how the public views the political orientations and the credibility of the communicating scientist. We explore this possibility using an experiment with a nationally representative sample of Americans in the fall of 2015. We find that risk communication on controversial scientific issues sometimes influences perceptions of the political orientations and credibility of the communicating scientist when the scientist addresses the risks of issues associated with conservative or liberal groups. This relationship is moderated by participant political ideology, with liberals adjusting their perceptions of the scientists' political beliefs more substantially when the scientist addressed the risks of marijuana use when compared with other issues. Conservatives' political perceptions were less impacted by the issue context of the scientific risk communication but indirectly influenced credibility perceptions. Our results support a contextual model of audience interpretation of scientific risk communication. Scientists should be cognizant that audience members may make inferences about the communicating scientist's political orientations and credibility when they engage in risk communication efforts about controversial issues.

  4. Currency option pricing in a credible exchange rate target zone

    NARCIS (Netherlands)

    Veestraeten, D.

    2013-01-01

    This article examines currency option pricing within a credible target zone arrangement where interventions at the boundaries push the exchange rate back into its fluctuation band. Valuation of such options is complicated by the requirement that the reflection mechanism should prevent the arbitrage

  5. Currency option pricing in a credible exchange rate target zone

    NARCIS (Netherlands)

    Veestraeten, D.

    2012-01-01

    This article examines currency option pricing within a credible target zone arrangement where interventions at the boundaries push the exchange rate back into its fluctuation band. Valuation of such options is complicated by the requirement that the reflection mechanism should prevent the arbitrage

  6. Geological and Seismological Analysis of the 13 February 2001 Mw 6.6 El Salvador Earthquake: Evidence for Surface Rupture and Implications for Seismic Hazard

    OpenAIRE

    Canora Catalán, Carolina; Martínez Díaz, José J.; Villamor Pérez, María Pilar; Berryman, K.R.; Álvarez Gómez, José Antonio; Pullinger, Carlos; Capote del Villar, Ramón

    2010-01-01

    The El Salvador earthquake of 13 February 2001 (Mw 6.6) caused tectonic rupture on the El Salvador fault zone (ESFZ). Right-lateral strike-slip surface rupture of the east–west trending fault zone had a maximum surface displacement of 0.60 m. No vertical component was observed. The earthquake resulted in widespread landslides in the epicentral area, where bedrock is composed of volcanic sediments, tephra, and weak ignimbrites. In the aftermath of the earthquake, widespread dama...

  7. Student Perceptions of Peer Credibility Based on Email Addresses

    Science.gov (United States)

    Livermore, Jeffrey A.; Scafe, Marla G.; Wiechowski, Linda S.; Maier, David J.

    2013-01-01

    The purpose of this study was to evaluate students' perceptions of their peer's credibility based on email addresses. The survey was conducted at a community college in Michigan where all students were registered and actively taking at least one course. The survey results show that a student's selection of an email address does influence other…

  8. Student Perceptions of Faculty Credibility Based on Email Addresses

    Science.gov (United States)

    Livermore, Jeffrey A.; Scafe, Marla G.; Wiechowski, Linda S.

    2010-01-01

    The purpose of this study was to evaluate students' perceptions of faculty credibility based on email addresses. The survey was conducted at an upper division business school in Michigan where all students have completed at least two years of college courses. The survey results show that a faculty member's selection of an email address does…

  9. The relationship between earthquake exposure and posttraumatic stress disorder in 2013 Lushan earthquake

    Science.gov (United States)

    Wang, Yan; Lu, Yi

    2018-01-01

    The objective of this study is to explore the relationship between earthquake exposure and the incidence of PTSD. A stratified random sample survey was conducted to collect data in the Longmenshan thrust fault area three years after the Lushan earthquake. We used the Children's Revised Impact of Event Scale (CRIES-13) and the Earthquake Experience Scale. Subjects in this study included 3944 student survivors in eleven local schools. The prevalence of probable PTSD was relatively higher among those who were trapped in the earthquake, were injured in it, or had relatives who died in it. The study concludes that researchers need to pay more attention to children and adolescents; the government should pay more attention to these people and provide more economic support.

  10. Determinants of Judgments of Explanatory Power: Credibility, Generality, and Statistical Relevance

    Science.gov (United States)

    Colombo, Matteo; Bucher, Leandra; Sprenger, Jan

    2017-01-01

    Explanation is a central concept in human psychology. Drawing upon philosophical theories of explanation, psychologists have recently begun to examine the relationship between explanation, probability and causality. Our study advances this growing literature at the intersection of psychology and philosophy of science by systematically investigating how judgments of explanatory power are affected by (i) the prior credibility of an explanatory hypothesis, (ii) the causal framing of the hypothesis, (iii) the perceived generalizability of the explanation, and (iv) the relation of statistical relevance between hypothesis and evidence. Collectively, the results of our five experiments support the hypothesis that the prior credibility of a causal explanation plays a central role in explanatory reasoning: first, because of the presence of strong main effects on judgments of explanatory power, and second, because of the gate-keeping role it has for other factors. Highly credible explanations are not susceptible to causal framing effects, but they are sensitive to the effects of normatively relevant factors: the generalizability of an explanation, and its statistical relevance for the evidence. These results advance current literature in the philosophy and psychology of explanation in three ways. First, they yield a more nuanced understanding of the determinants of judgments of explanatory power, and the interaction between these factors. Second, they show the close relationship between prior beliefs and explanatory power. Third, they elucidate the nature of abductive reasoning. PMID:28928679

  11. Crowdsourced earthquake early warning

    Science.gov (United States)

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  12. Tsunamigenic Ratio of the Pacific Ocean earthquakes and a proposal for a Tsunami Index

    Directory of Open Access Journals (Sweden)

    A. Suppasri

    2012-01-01

    Full Text Available The Pacific Ocean is the location where two-thirds of tsunamis have occurred, resulting in a great number of casualties. Once information on an earthquake has been issued, it is important to understand whether there is a risk of tsunami generation associated with a specific earthquake magnitude or focal depth. This study proposes a Tsunamigenic Ratio (TR), defined as the ratio between the number of earthquake-generated tsunamis and the total number of earthquakes. The earthquake and tsunami data used in this study were selected from a database containing tsunamigenic earthquakes from before 1900 to 2011. The TR is calculated from earthquake events with a magnitude greater than 5.0, a focal depth shallower than 200 km and a sea depth less than 7 km. The results suggest that a great earthquake magnitude and a shallow focal depth have a high potential to generate tsunamis with a large tsunami height. The average TR in the Pacific Ocean is 0.4, whereas the TR for specific regions of the Pacific Ocean varies from 0.3 to 0.7. The TR calculated for each region shows the relationship between three influential parameters: earthquake magnitude, focal depth and sea depth. The three parameters were combined into a proposed dimensionless parameter called the Tsunami Index (TI). The TI expresses the relationship with the TR and with the maximum tsunami height better than the three parameters can individually. The results show that recent submarine earthquakes had a higher potential to generate a tsunami with a larger tsunami height than those of the last century. A tsunami is definitely generated if the TI is larger than 7.0. The proposed TR and TI will help ascertain the tsunami generation risk of each earthquake event based on a statistical analysis of the historical data and could be an important decision support tool during the early tsunami warning stage.
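    The TR itself is a simple empirical ratio over a filtered catalogue. A minimal sketch, assuming hypothetical field names for the event records (the exact combination formula for the TI is not given in the abstract, so it is not reproduced here):

    ```python
    def tsunamigenic_ratio(events, min_mag=5.0, max_focal_km=200.0, max_sea_km=7.0):
        """TR = tsunami-generating earthquakes / all earthquakes, after applying
        the selection filters described in the study (magnitude > 5.0,
        focal depth < 200 km, sea depth < 7 km)."""
        selected = [e for e in events
                    if e["mag"] >= min_mag
                    and e["focal_depth_km"] <= max_focal_km
                    and e["sea_depth_km"] <= max_sea_km]
        if not selected:
            return 0.0
        tsunamis = sum(1 for e in selected if e["tsunami"])
        return tsunamis / len(selected)
    ```

    On a synthetic catalogue where half of the qualifying events generated tsunamis, the function returns 0.5, matching the order of magnitude of the regional TR values (0.3 to 0.7) reported above.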

  13. Encyclopedia of earthquake engineering

    CERN Document Server

    Kougioumtzoglou, Ioannis; Patelli, Edoardo; Au, Siu-Kui

    2015-01-01

    The Encyclopedia of Earthquake Engineering is designed to be the authoritative and comprehensive reference covering all major aspects of the science of earthquake engineering, specifically focusing on the interaction between earthquakes and infrastructure. The encyclopedia comprises approximately 265 contributions. Since earthquake engineering deals with the interaction between earthquake disturbances and the built infrastructure, the emphasis is on basic design processes important to both non-specialists and engineers so that readers become suitably well-informed without needing to deal with the details of specialist understanding. The content of this encyclopedia provides technically inclined and informed readers about the ways in which earthquakes can affect our infrastructure and how engineers would go about designing against, mitigating and remediating these effects. The coverage ranges from buildings, foundations, underground construction, lifelines and bridges, roads, embankments and slopes. The encycl...

  14. Earthquake hazard evaluation for Switzerland

    International Nuclear Information System (INIS)

    Ruettener, E.

    1995-01-01

    Earthquake hazard analysis is of considerable importance for Switzerland, a country with moderate seismic activity but high economic values at risk. The evaluation of earthquake hazard, i.e. the determination of return periods versus ground motion parameters, requires a description of earthquake occurrences in space and time. In this study the seismic hazard for major cities in Switzerland is determined. The seismic hazard analysis is based on historic earthquake records as well as instrumental data. The historic earthquake data show considerable uncertainties concerning epicenter location and epicentral intensity. A specific concept is required, therefore, which permits the description of the uncertainties of each individual earthquake. This is achieved by probability distributions for earthquake size and location. Historical considerations, which indicate changes in public earthquake awareness at various times (mainly due to large historical earthquakes), as well as statistical tests have been used to identify time periods of complete earthquake reporting as a function of intensity. As a result, the catalog is judged to be complete since 1878 for all earthquakes with epicentral intensities greater than IV, since 1750 for intensities greater than VI, since 1600 for intensities greater than VIII, and since 1300 for intensities greater than IX. Instrumental data provide accurate information about the depth distribution of earthquakes in Switzerland. In the Alps, focal depths are restricted to the uppermost 15 km of the crust, whereas below the northern Alpine foreland earthquakes are distributed throughout the entire crust (30 km). This depth distribution is considered in the final hazard analysis by probability distributions. (author) figs., tabs., refs

  15. Earthquake Clusters and Spatio-temporal Migration of earthquakes in Northeastern Tibetan Plateau: a Finite Element Modeling

    Science.gov (United States)

    Sun, Y.; Luo, G.

    2017-12-01

    Seismicity in a region is usually characterized by earthquake clusters and earthquake migration along its major fault zones. However, we do not fully understand why and how earthquake clusters and spatio-temporal migration of earthquakes occur. The northeastern Tibetan Plateau is a good example for us to investigate these problems. In this study, we construct and use a three-dimensional viscoelastoplastic finite-element model to simulate earthquake cycles and spatio-temporal migration of earthquakes along major fault zones in northeastern Tibetan Plateau. We calculate stress evolution and fault interactions, and explore effects of topographic loading and viscosity of middle-lower crust and upper mantle on model results. Model results show that earthquakes and fault interactions increase Coulomb stress on the neighboring faults or segments, accelerating the future earthquakes in this region. Thus, earthquakes occur sequentially in a short time, leading to regional earthquake clusters. Through long-term evolution, stresses on some seismogenic faults, which are far apart, may almost simultaneously reach the critical state of fault failure, probably also leading to regional earthquake clusters and earthquake migration. Based on our model synthetic seismic catalog and paleoseismic data, we analyze probability of earthquake migration between major faults in northeastern Tibetan Plateau. We find that following the 1920 M 8.5 Haiyuan earthquake and the 1927 M 8.0 Gulang earthquake, the next big event (M≥7) in northeastern Tibetan Plateau would be most likely to occur on the Haiyuan fault.

  16. Perception of earthquake risk in Taiwan: effects of gender and past earthquake experience.

    Science.gov (United States)

    Kung, Yi-Wen; Chen, Sue-Huei

    2012-09-01

    This study explored how individuals in Taiwan perceive the risk of earthquake and the relationship of past earthquake experience and gender to risk perception. Participants (n= 1,405), including earthquake survivors and those in the general population without prior direct earthquake exposure, were selected and interviewed through a computer-assisted telephone interviewing procedure using a random sampling and stratification method covering all 24 regions of Taiwan. A factor analysis of the interview data yielded a two-factor structure of risk perception in regard to earthquake. The first factor, "personal impact," encompassed perception of threat and fear related to earthquakes. The second factor, "controllability," encompassed a sense of efficacy of self-protection in regard to earthquakes. The findings indicated prior earthquake survivors and females reported higher scores on the personal impact factor than males and those with no prior direct earthquake experience, although there were no group differences on the controllability factor. The findings support that risk perception has multiple components, and suggest that past experience (survivor status) and gender (female) affect the perception of risk. Exploration of potential contributions of other demographic factors such as age, education, and marital status to personal impact, especially for females and survivors, is discussed. Future research on and intervention program with regard to risk perception are suggested accordingly. © 2012 Society for Risk Analysis.

  17. On the credibility of scientific findings

    International Nuclear Information System (INIS)

    Maier-Leibnitz, H.

    1987-01-01

    Since the beginning of the so-called nuclear controversy, problems of risk and of safety increasingly have come to be discussed also by persons not really qualified for the job; often, results and findings were presented which, although technically unfounded or based on wrong assumptions or conclusions, have greatly helped to create fear and concern about the peaceful uses of nuclear energy. In a study of the credibility gap faced by science, criteria are given for the competency of experts vis-a-vis specific problems. The most important aspect in arriving at the truth is felt to be the weighing of alternative decisions. (orig./HP) [de

  18. Crowdsourcing Rapid Assessment of Collapsed Buildings Early after the Earthquake Based on Aerial Remote Sensing Image: A Case Study of Yushu Earthquake

    Directory of Open Access Journals (Sweden)

    Shuai Xie

    2016-09-01

    Full Text Available Remote sensing (RS) images play a significant role in disaster emergency response. Web 2.0 has changed the way data are created, making it possible for the public to participate in scientific issues. In this paper, an experiment is designed to evaluate the reliability of crowdsourced building collapse assessment in the early period after an earthquake, based on aerial remote sensing imagery. The procedure of RS data pre-processing and crowdsourced data collection is presented. A probabilistic model combining maximum likelihood estimation (MLE), Bayes’ theorem and the expectation-maximization (EM) algorithm is applied to quantitatively estimate each participant's error rate and the “ground truth” from the assessment results of multiple participants. An experimental case from the Yushu earthquake area is used to present the results contributed by participants. Following the results, some discussion is provided regarding accuracy and variation among participants. The features of buildings labeled as the same damage type are found to be highly consistent. This suggests that building damage assessments contributed by crowdsourcing can be treated as reliable samples. This study shows the potential for rapid building collapse assessment through crowdsourcing, quantitatively inferring the “ground truth” from crowdsourced data in the early period after an earthquake based on aerial remote sensing imagery.
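    The MLE/Bayes/EM machinery described above is in the spirit of Dawid–Skene-style consensus estimation. A minimal binary sketch, assuming a symmetric error rate per annotator and a uniform prior (an illustrative reconstruction, not the paper's exact model):

    ```python
    def em_consensus(votes, n_iter=30):
        """votes: one dict per building, mapping annotator id -> label
        (1 = collapsed, 0 = intact). Returns (posterior P(collapsed) per
        building, estimated error rate per annotator)."""
        # Initialise each building's collapse probability from the raw vote average.
        p = [sum(v.values()) / len(v) for v in votes]
        annotators = {a for v in votes for a in v}
        err = {}
        for _ in range(n_iter):
            # M-step: expected disagreement rate of each annotator with the
            # current "ground truth" estimate.
            for a in annotators:
                num, den = 0.0, 0.0
                for pi, v in zip(p, votes):
                    if a in v:
                        num += pi * (1 - v[a]) + (1 - pi) * v[a]
                        den += 1.0
                err[a] = min(max(num / den, 1e-3), 0.999)
            # E-step: Bayes posterior per building under a uniform prior.
            for i, v in enumerate(votes):
                like1 = like0 = 1.0
                for a, lab in v.items():
                    like1 *= (1 - err[a]) if lab == 1 else err[a]
                    like0 *= (1 - err[a]) if lab == 0 else err[a]
                p[i] = like1 / (like1 + like0)
        return p, err
    ```

    The design choice here is that a noisy annotator who often disagrees with the emerging consensus receives a high error rate, so their votes are automatically down-weighted in the posterior, which is exactly the effect the abstract describes.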

  19. On operator diagnosis aid in severe earthquakes

    International Nuclear Information System (INIS)

    Lee, S.H.; Okrent, D.

    1988-01-01

    During a severe earthquake, any component, system, or structure may fail; the plant may be driven into a very complex situation in which instrumentation and control systems may also fail and provide operators with unreliable information about the process parameters crucial to plant safety. What can operators do when faced with such complexity? Even though the likelihood of such a severe earthquake may be very low, its consequences may be more serious if mitigative measures are not thought out and implemented in advance. The objective of the present study relates to measures to protect the plant from severe damage due to large earthquakes, namely, the improvement of operator capability to respond to seismic damage through the use of Emergency Procedure Guidelines (EPGs). The fact that the symptoms presented to operators may be unreliable in severe earthquakes endangers the validity of the actions in the EPGs. The purpose of this study is to design a tool with which the weaknesses of the EPGs may be identified in advance and, if possible, lessons may be learned from the practice results so that the EPGs may be improved to accommodate the complexity as far as possible. In other words, the present study intends to provide a tool that may simulate the available signals, including false ones, so that the EPGs may be examined and operator actions may be studied. It is hoped to develop some knowledge needed to complement the currently available knowledge. The final product of this study shall be a program that provides users the rationale for how it reaches its conclusions, so that users may improve their knowledge, and whose knowledge base may be updated through user interaction.

  20. Faculty Perceptions of Student Credibility Based on Email Addresses

    Science.gov (United States)

    Livermore, Jeffrey A.; Wiechowski, Linda S.; Scafe, Marla G.

    2011-01-01

    The purpose of this study was to evaluate faculty perceptions of student credibility based on email addresses. The survey was conducted at an upper division business school in Michigan where all students have completed at least two years of college courses. The survey results show that a student's selection of an email address does influence the…

  1. Design and implementation of a voluntary collective earthquake insurance policy to cover low-income homeowners in a developing country

    OpenAIRE

    Marulanda, M.; Cardona, O.; Mora, Miguel; Barbat, Alex

    2018-01-01

    Understanding and evaluating disaster risk due to natural hazard events such as earthquakes creates powerful incentives for countries to develop planning options and tools to reduce potential damages. The use of models for earthquake risk evaluation allows obtaining outputs such as the loss exceedance curve, the expected annual loss and the probable maximum loss, which are probabilistic metrics useful for risk analyses, for designing strategies for risk reduction and mitigation, for emergency...
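    The probabilistic metrics named above derive from the loss exceedance curve, the annual frequency with which each loss level is exceeded: the expected annual loss is its integral, and the probable maximum loss is the loss at a chosen return period. A minimal sketch (synthetic curve; trapezoidal integration and linear interpolation are illustrative choices, not the paper's method):

    ```python
    def expected_annual_loss(losses, rates):
        """EAL = integral of the loss exceedance curve, via the trapezoidal
        rule over (loss, annual exceedance rate) points, losses ascending."""
        return sum(0.5 * (rates[i] + rates[i + 1]) * (losses[i + 1] - losses[i])
                   for i in range(len(losses) - 1))

    def probable_maximum_loss(losses, rates, return_period):
        """PML = loss whose annual exceedance rate equals 1/return_period,
        by linear interpolation (rates decrease as losses grow)."""
        target = 1.0 / return_period
        for i in range(len(losses) - 1):
            if rates[i] >= target >= rates[i + 1]:
                frac = (rates[i] - target) / (rates[i] - rates[i + 1])
                return losses[i] + frac * (losses[i + 1] - losses[i])
        return losses[-1]  # return period beyond the modeled curve
    ```

    For a pricing application such as the collective policy above, the EAL is the starting point for the pure premium, while the PML at a long return period sizes the required risk capital.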

  2. Chapter 8: The credibility crisis

    International Nuclear Information System (INIS)

    Poumadere, M.

    1991-01-01

    In the credibility crisis, a generalized state of conflicting cognitions is probably prevalent, along with possible individual and social pathologies. These cognitive conflicts and emotional traumas are linked to both the characteristics of Chernobyl as a human-made disaster and the specific nature of Chernobyl as a nuclear disaster. Three major elements as constituents of this nature are identified: The rupture of a social contract, the loss of a socially valued object, and the sudden removal of established distances. Further research and basic information are needed in this area where little specific observation is reported. A better grasp of the impact of nuclear energy on our societies can lead to better adapted policy, increased local and social solidarity, more decentralized initiative and risk management, and better organized primary prevention in nuclear disaster. (orig./DG)

  3. Effects of Huge Earthquakes on Earth Rotation and the length of Day

    Directory of Open Access Journals (Sweden)

    Changyi Xu

    2013-01-01

    Full Text Available We calculated the co-seismic Earth rotation changes for several typical great earthquakes since 1960 based on Dahlen's analytical expression for the change in the Earth's moment of inertia, the excitation functions of polar motion, and the variation in the length of day (ΔLOD). We then derived a mathematical relation between polar motion and earthquake parameters to prove that the amplitude of polar motion is independent of longitude. Because the analytical expression of Dahlen's theory is useful for theoretically estimating rotation changes caused by earthquakes with different seismic parameters, we show results for polar motion and ΔLOD for various types of earthquakes in a comprehensive manner. The modeled results show that the seismic effect on the Earth's rotation decreases gradually with increasing latitude if the other parameters are unchanged. The Earth's rotational change is symmetrical for a 45° dip angle, and the maximum changes appear at the equator and the poles. Earthquakes at a medium dip angle and low latitudes produce large rotation changes. As an example, we calculate the polar motion and ΔLOD caused by the 2011 Tohoku-Oki earthquake using two different fault models. The results show that a fine slip fault model is useful for computing co-seismic Earth rotation change. The obtained results indicate that Dahlen's method gives good approximations for the computation of co-seismic rotation changes, but there are some differences if one considers detailed fault slip distributions. Finally, we analyze and discuss the co-seismic Earth rotation change signal using GRACE data, showing that such a signal is hard to detect at present, but it might be detected under some conditions. The numerical results of this study will serve as a good indicator to check whether satellite observations such as GRACE can detect a seismic rotation change when a great earthquake occurs.

  4. CredibleMeds.org: What does it offer?

    Science.gov (United States)

    Woosley, Raymond L; Black, Kristin; Heise, C William; Romero, Klaus

    2018-02-01

    Since the 1990s, when numerous non-cardiac drugs were first recognized to have the potential to prolong the QT interval and cause torsades de pointes (TdP), clinicians, drug regulators, drug developers, and clinical investigators have become aware of the complexities of assessing evidence and determining TdP causality for the many drugs being marketed or under development. To facilitate better understanding, the Arizona Center for Education and Research on Therapeutics, known as AZCERT, has developed the CredibleMeds.org website which includes QTdrugs, a listing of over 220 drugs placed in four risk categories based on their association with QT prolongation and TdP. Since the site was launched in 1999, it has become the single most reliable source of information of its kind for patients, healthcare providers, and research scientists. Over 96,000 registered users rely on the QTdrugs database as their primary resource to inform their medication use, their prescribing, or their clinical research into the impact of QT-prolonging drugs and drug-induced arrhythmias. The QTdrugs lists are increasingly used as the basis for clinical decision support systems in healthcare and for metrics of prescribing quality by healthcare insurers. A free smartphone app and an application program interface enable rapid and mobile access to the lists. Also, the CredibleMeds website offers numerous educational resources for patients, educators and healthcare providers that foster the safe use of medications. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Luka Brajnović – From Fidelity to Oneself towards Credibility of Profession

    Directory of Open Access Journals (Sweden)

    Danijel Labaš

    2010-12-01

    Full Text Available In certain periods, at certain places and in certain media, the reputation and credibility of the journalistic profession has suffered for a number of reasons, including ignorance, mediocrity, dishonorable or morally questionable methods of journalists, or scandalous, fabricated or partial news stories. This is the opinion of Luka Brajnović, whose reflections in a comparative analysis with other authors comprise the ”contemplative axis” of this article. The fundamental task and goal of this article is to present and analyze Mr Brajnović’s reflections on the possibility of saving or restoring the reputation and credibility of the journalistic profession. Journalists and the media will not be able to restore credibility as long as extravagant ideas exist about journalism as a profession that deals with ”public whispering, accusations and dissatisfaction with everything that has been established”, or as a neutral profession that is ethically hybrid and indifferent towards good and evil. Such an understanding of the journalistic profession runs against a positive image and reputation of journalism, a field which is in itself worthy of the respect of the entire public. In journalism, just as in other professions, unethical behavior on the part of a small number of journalists and media outlets casts a shadow on the journalistic profession as a whole, causing the reputation of the profession to become dependent upon a positive image and the reputation of those individuals working in it. As the results of this article show – which for the first time analytically approaches the scientific arguments and reflections of Mr Brajnović in the Croatian public sphere – ethical and intellectual health, which can restore credibility to the journalistic profession, are the very elements rooted deep inside of it.

  6. How influencers’ credibility on Instagram is perceived by consumers and its impact on purchase intention

    OpenAIRE

    Rebelo, Marta Figueiredo

    2017-01-01

    The purpose of this thesis is to understand the perception Instagram users, in other words consumers, have of influencers they follow on Instagram. Consumer perceived credibility of influencers, and its impact on the purchase intention, is therefore studied. This dissertation aims to highlight which credibility dimensions better explain the purchase intention. Gender is also explored to verify behavior differences between female and male consumers. To better analyze the perc...

  7. The earthquake problem in engineering design: generating earthquake design basis information

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1987-01-01

    Designing earthquake resistant structures requires certain design inputs specific to the seismotectonic status of the region in which a critical facility is to be located. Generating these inputs requires collection of earthquake related information using present day techniques in seismology and geology, and processing the collected information to integrate it to arrive at a consolidated picture of the seismotectonics of the region. The earthquake problem in engineering design has been outlined in the context of seismic design of nuclear power plants vis-à-vis current state of the art techniques. The extent to which the accepted procedures of assessing seismic risk in the region and generating the design inputs have been adhered to determines to a great extent the safety of the structures against future earthquakes. The document is a step towards developing an approach for generating these inputs, which form the earthquake design basis. (author)

  8. Ductile gap between the Wenchuan and Lushan earthquakes revealed from the two-dimensional Pg seismic tomography.

    Science.gov (United States)

    Pei, Shunping; Zhang, Haijiang; Su, Jinrong; Cui, Zhongxiong

    2014-09-30

    A high-resolution two-dimensional Pg-wave velocity model is obtained for the upper crust around the epicenters of the April 20, 2013 Ms7.0 Lushan earthquake and the May 12, 2008 Ms8.0 Wenchuan earthquake, China. The tomographic inversion uses 47,235 Pg arrival times from 6,812 aftershocks recorded by 61 stations around the Lushan and Wenchuan earthquakes. Across the front Longmenshan fault near the Lushan earthquake, there exists a strong velocity contrast with higher velocities to the west and lower velocities to the east. Along the Longmenshan fault system, there exist two high velocity patches showing an "X" shape with an obtuse angle along the near northwest-southeast (NW-SE) direction. They correspond to the Precambrian Pengguan and Baoxing complexes on the surface, but with a ~20 km shift, respectively. The aftershock gap of the 2008 Wenchuan and the 2013 Lushan earthquakes is associated with lower velocities. Based on the maximum effective moment criterion, this suggests that the aftershock gap is weak and that ductile deformation is more likely to occur in the upper crust within the gap under the near NW-SE compression. Therefore, our results suggest that large earthquakes may be unlikely to occur within the gap.

  9. Mentalizing skills do not differentiate believers from non-believers, but credibility enhancing displays do.

    Directory of Open Access Journals (Sweden)

    David L R Maij

    Full Text Available The ability to mentalize has been marked as an important cognitive mechanism enabling belief in supernatural agents. In five studies we cross-culturally investigated the relationship between mentalizing and belief in supernatural agents with large sample sizes (over 67,000 participants in total) and different operationalizations of mentalizing. The relative importance of mentalizing for endorsing supernatural beliefs was directly compared with that of credibility enhancing displays, that is, the extent to which people observed credible religious acts during their upbringing. We also compared autistic with neurotypical adolescents. The empathy quotient and the autism-spectrum quotient were not predictive of belief in supernatural agents in any country (i.e., The Netherlands, Switzerland and the United States), although we did observe a curvilinear effect in the United States. We further observed a strong influence of credibility enhancing displays on belief in supernatural agents. These findings highlight the importance of cultural learning for acquiring supernatural beliefs and call for a reconsideration of the importance of mentalizing.

  10. Effect of heterogeneities on evaluating earthquake triggering of volcanic eruptions

    Directory of Open Access Journals (Sweden)

    J. Takekawa

    2013-02-01

    Full Text Available Recent research has indicated coupling between volcanic eruptions and earthquakes. Some studies calculated the static stress transfer in the subsurface induced by the occurrence of earthquakes. Most of their analyses ignored the spatial heterogeneity of the subsurface, or only took into account the rigidity layering in the crust. On the other hand, a smaller scale heterogeneity of around hundreds of meters has been suggested by geophysical investigations. It is difficult to reflect that kind of heterogeneity in analysis models because accurate distributions of the fluctuation are not well understood in many cases. Thus, the effect of ignoring the smaller scale heterogeneity on evaluating the earthquake triggering of volcanic eruptions is also not well understood. In the present study, we investigate the influence of the assumption of homogeneity on evaluating earthquake triggering of volcanic eruptions using finite element simulations. The crust is treated as a stochastic medium with different heterogeneous parameters (correlation length and magnitude of velocity perturbation in our simulations. We adopt exponential and von Karman functions as spatial auto-correlation functions (ACF. In all our simulation results, ignoring the smaller scale heterogeneity leads to underestimation of the failure pressure around the chamber wall, which relates to dyke initiation. The magnitude of the velocity perturbation has a larger effect on the tensile failure at the chamber wall than the difference of the ACF and the correlation length. The maximum effect on the failure pressure in all our simulations is about twice that in the homogeneous case. This indicates that the estimation of earthquake triggering due to static stress transfer should take into account heterogeneity of around hundreds of meters.

  11. Deconvolution effect of near-fault earthquake ground motions on stochastic dynamic response of tunnel-soil deposit interaction systems

    Directory of Open Access Journals (Sweden)

    K. Hacıefendioğlu

    2012-04-01

    Full Text Available The deconvolution effect of near-fault earthquake ground motions on the stochastic dynamic response of tunnel-soil deposit interaction systems is investigated by using the finite element method. Two different earthquake input mechanisms are used to consider the deconvolution effects in the analyses: the standard rigid-base input model and the deconvolved-base-rock input model. The Bolu tunnel in Turkey is chosen as a numerical example. As the near-fault ground motion, the 1999 Kocaeli earthquake ground motion is selected. Interface finite elements are used between the tunnel and the soil deposit. The means of the maximum values of the quasi-static, dynamic and total responses obtained from the two input models are compared with each other.

  12. Great earthquakes and slow slip events along the Sagami trough and outline of the Kanto Asperity Project

    Science.gov (United States)

    Kobayashi, R.; Yamamoto, Y.; Sato, T.; Shishikura, M.; Ito, H.; Shinohara, M.; Kawamura, K.; Shibazaki, B.

    2010-12-01

    The Kanto region is one of the most densely populated urban areas in the world. Its plate configuration is complicated by a T-T-T type triple junction, an island arc-island arc collision zone, and the very shallow angle between the axis of the Sagami trough and the subduction direction. Great earthquakes along the Sagami trough have repeatedly occurred. The 1703 Genroku and 1923 (Taisho) Kanto earthquakes caused severe damage in the Tokyo metropolitan area. Intriguingly, slow slip events have also repeatedly occurred in an area adjacent to the asperities of the great earthquakes, off the Boso peninsula (e.g., Ozawa et al. 2007). In the cases of the Nankai and Cascadia subduction zones, slow slip events occur at deeper levels than the asperity, in a transition zone between the asperity and a region of steady slip. In contrast, slow slip events in the Kanto region have occurred at relatively shallow depths, at the same level as the asperity, raising the possibility of friction controlled by different conditions to those (temperature and pressure) encountered at Nankai and Cascadia. We focus on three different types of seismic events occurring repeatedly at almost the same depth of the seismogenic zone along the Sagami trough (5-20 km): (1) The 1923 M~7.9 Taisho earthquake, located in Sagami Bay. Maximum slip is about 6 m, the recurrence interval is 200-400 yr, and the coupling rate is 80-100% (“coupling rate” = “slip during earthquakes or slow-slip events” / [“rate of motion of the Philippine Sea Plate” × “recurrence interval”]). (2) The 1703 M~8.2 Genroku earthquake, located in Sagami Bay but also extending to the southern part of the Boso Peninsula. Maximum slip is 15-20 m, the recurrence interval is ~2000 yr, and the coupling rate at the southern part of the Boso Peninsula is 10-30%. (3) Boso slow-slip events, located southeast of the Boso Peninsula. Maximum slip is 15-20 cm over ~10 days, the recurrence interval is 5-6 yr, and the coupling rate is 70
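    The coupling-rate definition in the abstract can be checked with simple arithmetic: slip released in an event divided by the plate convergence accumulated over one recurrence interval. The sketch below is illustrative only; the ~30 mm/yr plate rate and the 250-yr recurrence (picked from the abstract's 200-400 yr range) are assumed values, not figures from the paper.

```python
# Hedged sketch of the coupling-rate arithmetic:
#   coupling = event slip / (plate rate * recurrence interval)
# Assumed values: plate rate ~30 mm/yr, recurrence 250 yr.

def coupling_rate(slip_m: float, plate_rate_m_per_yr: float, recurrence_yr: float) -> float:
    """Fraction of plate convergence released by one event."""
    return slip_m / (plate_rate_m_per_yr * recurrence_yr)

# 1923 Taisho earthquake: ~6 m maximum slip (from the abstract)
print(f"Taisho coupling ~ {coupling_rate(6.0, 0.030, 250):.0%}")  # -> 80%
```

With these assumed inputs the result falls inside the 80-100% range quoted for the Taisho event.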

  13. The relationship between clients' depression etiological beliefs and psychotherapy orientation preferences, expectations, and credibility beliefs.

    Science.gov (United States)

    Tompkins, Kelley A; Swift, Joshua K; Rousmaniere, Tony G; Whipple, Jason L

    2017-06-01

    The purpose of this study was to examine the relationship between clients' etiological beliefs for depression and treatment preferences, credibility beliefs, and outcome expectations for five different depression treatments-behavioral activation, cognitive therapy, interpersonal psychotherapy, pharmacotherapy, and psychodynamic psychotherapy. Adult psychotherapy clients (N = 98) were asked to complete an online survey that included the Reasons for Depression Questionnaire, a brief description of each of the five treatment options, and credibility, expectancy, and preference questions for each option. On average, the participating clients rated pharmacotherapy as significantly less credible, having a lower likelihood of success, and being less preferred than the four types of psychotherapy. In general, interpersonal psychotherapy was also rated more negatively than the other types of psychotherapy. However, these findings depended somewhat on whether the participating client was personally experiencing depression. Credibility beliefs, outcome expectations, and preferences for pharmacotherapy were positively associated with biological beliefs for depression; however, the other hypothesized relationships between etiological beliefs and treatment attitudes were not supported. Although the study is limited based on the specific sample and treatment descriptions that were used, the results may still have implications for psychotherapy research, training, and practice. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  14. Lithospheric flexure under the Hawaiian volcanic load: Internal stresses and a broken plate revealed by earthquakes

    Science.gov (United States)

    Klein, Fred W.

    2016-01-01

    Several lines of earthquake evidence indicate that the lithospheric plate is broken under the load of the island of Hawai`i, where the geometry of the lithosphere is circular with a central depression. The plate bends concave downward surrounding a stress-free hole, rather than bending concave upward as with past assumptions. Earthquake focal mechanisms show that the center of load stress and the weak hole is between the summits of Mauna Loa and Mauna Kea where the load is greatest. The earthquake gap at 21 km depth coincides with the predicted neutral plane of flexure where horizontal stress changes sign. Focal mechanism P axes below the neutral plane display a striking radial pattern pointing to the stress center. Earthquakes above the neutral plane in the north part of the island have opposite stress patterns; T axes tend to be radial. The M6.2 Honomu and M6.7 Kiholo main shocks (both at 39 km depth) are below the neutral plane and show radial compression, and the M6.0 Kiholo aftershock above the neutral plane has tangential compression. Earthquakes deeper than 20 km define a donut of seismicity around the stress center where flexural bending is a maximum. The hole is interpreted as the soft center where the lithospheric plate is broken. Kilauea's deep conduit is seismically active because it is in the ring of maximum bending. A simplified two-dimensional stress model for a bending slab with a load at one end yields stress orientations that agree with earthquake stress axes and radial P axes below the neutral plane. A previous inversion of deep Hawaiian focal mechanisms found a circular solution around the stress center that agrees with the model. For horizontal faults, the shear stress within the bending slab matches the slip in the deep Kilauea seismic zone and enhances outward slip of active flanks.

  15. Sun, Moon and Earthquakes

    Science.gov (United States)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km X 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location - longitude of Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences for magnitude 2.0 and above. This study, the authors conclude, proves how the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]: the left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.980, Mb 9.0, EQ count 376); the right-hand figure provides an earthquake plot of (EMD+SEM) vs. GMT timings for the same data. All 376 events, including the main event, faithfully follow the straight-line curve.

  16. Kinematics, mechanics, and potential earthquake hazards for faults in Pottawatomie County, Kansas, USA

    Science.gov (United States)

    Ohlmacher, G.C.; Berendsen, P.

    2005-01-01

    Many stable continental regions have subregions with poorly defined earthquake hazards. Analysis of minor structures (folds and faults) in these subregions can improve our understanding of the tectonics and earthquake hazards. Detailed structural mapping in Pottawatomie County has revealed a suite consisting of two uplifted blocks aligned along a northeast trend and surrounded by faults. The first uplift is located southwest of the second. The northwest and southeast sides of these uplifts are bounded by northeast-trending right-lateral faults. To the east, both uplifts are bounded by north-trending reverse faults, and the first uplift is bounded by a north-trending high-angle fault to the west. The structural suite occurs above a basement fault that is part of a series of north-northeast-trending faults that delineate the Humboldt Fault Zone of eastern Kansas, an integral part of the Midcontinent Rift System. The favored kinematic model is a contractional stepover (push-up) between echelon strike-slip faults. Mechanical modeling using the boundary element method supports the interpretation of the uplifts as contractional stepovers and indicates that an approximately east-northeast maximum compressive stress trajectory is responsible for the formation of the structural suite. This stress trajectory suggests potential activity during the Laramide Orogeny, which agrees with the age of kimberlite emplacement in adjacent Riley County. The current stress field in Kansas has a N85°W maximum compressive stress trajectory that could potentially produce earthquakes along the basement faults. Several epicenters of seismic events (maximum

  17. The use of tags and tag clouds to discern credible content in online health message forums.

    Science.gov (United States)

    O'Grady, Laura; Wathen, C Nadine; Charnaw-Burger, Jill; Betel, Lisa; Shachak, Aviv; Luke, Robert; Hockema, Stephen; Jadad, Alejandro R

    2012-01-01

    Web sites with health-oriented content are potentially harmful if inaccurate or inappropriate medical information is used to make health-related decisions. Checklists, rating systems and guidelines have been developed to help people determine what is credible, but recent Internet technologies emphasize applications that are collaborative in nature, including tags and tag clouds, where site users 'tag' or label online content, each using their own labelling system. Concepts such as the date, reference, author, testimonial and quotations are considered predictors of credible content. An understanding of these descriptive tools, how they relate to the depiction of credibility and how this relates to overall efforts to label data in relation to the semantic web has yet to emerge. This study investigates how structured (pre-determined) and unstructured (user-generated) tags and tag clouds with a multiple word search feature are used by participants to assess the credibility of messages posted in online message forums. The targeted respondents were those using web site message forums for disease self-management. We also explored the relevancy of our findings to the labelling or indexing of data in the context of the semantic web. Diabetes was chosen as the content area in this study, since (a) this is a condition with increasing prevalence and (b) diabetics have been shown to actively use the Internet to manage their condition. From January to March 2010 participants were recruited using purposive sampling techniques. A screening instrument was used to determine eligibility. The study consisted of a demographic and computer usage survey, a series of usability tests and an interview. We tested participants (N=22) on two scenarios, each involving tasks that assessed their ability to tag content and search using a tag cloud that included six structured credibility terms (statistics, date, reference, author, testimonial and quotations). MORAE Usability software (version 3

  18. The Hengill geothermal area, Iceland: variation of temperature gradients deduced from the maximum depth of seismogenesis

    Science.gov (United States)

    Foulger, G.R.

    1995-01-01

    Given a uniform lithology and strain rate and a full seismic data set, the maximum depth of earthquakes may be viewed to first order as an isotherm. These conditions are approached at the Hengill geothermal area, S. Iceland, a dominantly basaltic area. The temperature at which seismic failure ceases for the strain rates likely at the Hengill geothermal area is determined by analogy with oceanic crust, and is about 650 ± 50°C. The topographies of the top and bottom of the seismogenic layer were mapped using 617 earthquakes. The thickness of the seismogenic layer is roughly constant and about 3 km. A shallow, aseismic, low-velocity volume within the spreading plate boundary that crosses the area occurs above the top of the seismogenic layer and is interpreted as an isolated body of partial melt. The base of the seismogenic layer has a maximum depth of about 6.5 km beneath the spreading axis and deepens to about 7 km beneath a transform zone in the south of the area. -from Author
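    The abstract's premise implies a back-of-envelope gradient estimate: if seismicity ceases at the ~650°C isotherm and the deepest earthquakes lie at ~6.5 km, the mean thermal gradient follows directly. A minimal sketch, assuming a ~0°C surface temperature for simplicity (the paper's actual boundary condition is not given here):

```python
# Hedged sketch: mean thermal gradient implied by the depth of the
# brittle-ductile (seismic cutoff) isotherm. Surface temperature of
# 0 degC is an assumed simplification.

def mean_gradient_c_per_km(cutoff_temp_c: float, max_depth_km: float,
                           surface_temp_c: float = 0.0) -> float:
    """Average gradient in degC/km down to the seismic cutoff depth."""
    return (cutoff_temp_c - surface_temp_c) / max_depth_km

print(mean_gradient_c_per_km(650.0, 6.5))  # beneath the spreading axis -> 100.0
print(mean_gradient_c_per_km(650.0, 7.0))  # beneath the transform zone
```

Under these assumptions the deepening of the seismogenic base from 6.5 to 7 km corresponds to a modest decrease in the mean gradient, which is the sense in which the paper deduces temperature-gradient variation from seismogenesis depth.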

  19. Deflation Risk in the Euro Area and Central Bank Credibility

    NARCIS (Netherlands)

    G. Galati (Gabriele); Z. Gorgi (Zion); R. Moessner (Richhild); C. Zhou (Chen)

    2016-01-01

    textabstractThis paper investigates how the perceived risk that the euro area will experience deflation has evolved over time, and what this risk implies for the credibility of the ECB. We use a novel data set on market participants’ perceptions of short- to long-term deflation risk implied by

  20. Tsunami hazard assessment along Diba-Oman and Diba-Al-Emirates coasts

    Directory of Open Access Journals (Sweden)

    El-Hussain Issa

    2017-01-01

    Full Text Available Tsunamis are among the most devastating natural hazards, responsible for significant loss of life and property throughout history. The Sultanate of Oman and the United Arab Emirates are among the Indian Ocean countries that were subjected to one confirmed tsunami, on November 27, 1945, due to an Mw 8.1 earthquake in the Makran Subduction Zone. In this study, we present a preliminary deterministic tsunami hazard assessment for the coasts of Diba-Oman and Diba-Al-Emirates, which are located on the western coast of the Oman Sea. The tsunami vulnerability of these cities increases due to the construction of many critical infrastructures and urban concentration along their coasts. Therefore, tsunami hazard assessment is necessary to mitigate the risk to the socio-economic system and sustainable development. The major known source of tsunamis able to impact both the coasts of Oman and the United Arab Emirates is the Makran Subduction Zone (MSZ), which extends for approximately 900 km. The deterministic approach uses specific scenarios considering the maximum credible earthquakes occurring in the MSZ and computes the ensuing tsunami impact on the coasts of the study area. Maximum wave height graphs and inundation maps are obtained for tsunami scenarios caused by a magnitude 8.8 earthquake in the eastern MSZ and a magnitude 8.2 earthquake in the western MSZ. The Mw 8.8 eastern MSZ scenario causes a maximum inundation distance of 447 meters and a maximum flow depth of 1.37 meters. A maximum inundation distance larger than 420 meters occurs for the Mw 8.2 western MSZ scenario. For this scenario, numerical simulations show a maximum flow depth of about 2.34 meters.

  1. Earthquake engineering development before and after the March 4, 1977, Vrancea, Romania earthquake

    International Nuclear Information System (INIS)

    Georgescu, E.-S.

    2002-01-01

    Twenty-five years after the Vrancea earthquake of March 4, 1977, we can analyze in an open and critical way its impact on the evolution of earthquake engineering codes and protection policies in Romania. The earthquake (MG-R = 7.2; Mw = 7.5) produced 1,570 casualties and more than 11,300 injured persons (90% of the victims in Bucharest); seismic losses were estimated at more than USD 2 billion. The 1977 earthquake represented a significant episode of the 20th century in the seismic zones of Romania and neighboring countries. The INCERC seismic record of March 4, 1977 revealed, for the first time, the spectral content of long-period seismic motions of Vrancea earthquakes, the duration, the number of cycles and the values of actual accelerations, with important overloading effects upon flexible structures. The seismic coefficients ks, the spectral curve (the dynamic coefficient βr), the seismic zonation map and the requirements in the antiseismic design norms were drastically changed, while the microzonation maps of the time ceased to be used, and the specific Vrancea earthquake recurrence was reconsidered based on hazard studies. Thus, the paper emphasises: - the existing engineering knowledge, earthquake code and zoning map requirements until 1977, as well as seismological and structural lessons since 1977; - recent aspects of implementing the Earthquake Code P.100/1992 and harmonization with Eurocodes, in conjunction with the specifics of urban and rural seismic risk and enforcement policies on strengthening of existing buildings; - a strategic view of disaster prevention, using earthquake scenarios and loss assessments, insurance, earthquake education and training; - the need for a closer transfer of knowledge between seismologists, engineers and officials in charge of disaster prevention public policies. (author)

  2. The music of earthquakes and Earthquake Quartet #1

    Science.gov (United States)

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  3. Cultural Variation in Situation Assessment: Influence of Source Credibility and Rank Status

    National Research Council Canada - National Science Library

    Heacox, N

    2000-01-01

    .... Although information content, rank status, and source credibility have received much attention by researchers in command and control decision-making, cultural variations in these factors have seldom been studied...

  4. Questioning History, Nationality and Identity in Timberlake Wertenbaker’s Credible Witness

    Directory of Open Access Journals (Sweden)

    Nursen Gömceli

    2014-05-01

    Full Text Available The aim of this paper is to examine the Anglo-American playwright Timberlake Wertenbaker’s approach to the issues of history, nationality and identity in her play Credible Witness (2001, and to discuss the significance of these concepts in our modern world through a close analysis of the play. In Credible Witness, the playwright brings together people from diverse countries, such as Sri Lanka, Algeria, Eritrea, Somalia and Macedonia in a detention centre in London, and via the stories of these asylum seekers, and particularly through the dramatic encounter between Petra, a Macedonian woman with strong nationalistic pride, and her son Alexander, a history teacher forced to seek refuge in Britain for political reasons, Wertenbaker tries to demonstrate “what happens to people when they step outside, or are forced outside, their history, their identity” (Aston 2003, 13).

  5. Eletronuclear's relationship with the Brazilian media: transparency and credibility

    Energy Technology Data Exchange (ETDEWEB)

    Alvarez, Gloria, E-mail: galvarez@eletronuclear.gov.br [Eletrobras Termonuclear S.A. (ELETRONUCLEAR), Rio de Janeiro, RJ (Brazil)

    2013-07-01

    In a capitalist economy the most valued assets are not money, shares or facilities, but credibility. Lack of money can ruin a company, but often it is reputation that delivers the final blow. It has become challenging to safeguard reputation in a world where communication is increasingly connected and has such an intense and lightning fast flow of information. This is particularly true for the electricity sector - a commodity so prevalent in everyday modern life, but whose business dealings are hardly known by the general public. When it comes to nuclear energy, the challenge of establishing an effective communication with transparency and credibility touches on even more complex elements. The topic of this paper is the scenario through which the communication process, along with its characteristics and approaches, unfolds between the nuclear sector and the Brazilian media. (author)

  6. Toward real-time regional earthquake simulation of Taiwan earthquakes

    Science.gov (United States)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  7. Geophysical Anomalies and Earthquake Prediction

    Science.gov (United States)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  8. Historical earthquake research in Austria

    Science.gov (United States)

    Hammerl, Christa

    2017-12-01

    Austria has moderate seismicity; on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average recurrence period is about 75 years. For this reason historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies, and, as an example of a recently completed case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. Research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest, but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  9. Rapid modeling of complex multi-fault ruptures with simplistic models from real-time GPS: Perspectives from the 2016 Mw 7.8 Kaikoura earthquake

    Science.gov (United States)

    Crowell, B.; Melgar, D.

    2017-12-01

    The 2016 Mw 7.8 Kaikoura earthquake is one of the most complex earthquakes in recent history, rupturing across at least 10 disparate faults with varying faulting styles, and exhibiting intricate surface deformation patterns. The complexity of this event has motivated the need for multidisciplinary geophysical studies to get at the underlying source physics to better inform earthquake hazards models in the future. However, events like Kaikoura beg the question of how well (or how poorly) such earthquakes can be modeled automatically in real-time and still satisfy the general public and emergency managers. To investigate this question, we perform a retrospective real-time GPS analysis of the Kaikoura earthquake with the G-FAST early warning module. We first perform simple point source models of the earthquake using peak ground displacement scaling and a coseismic offset based centroid moment tensor (CMT) inversion. We predict ground motions based on these point sources as well as simple finite faults determined from source scaling studies, and validate against true recordings of peak ground acceleration and velocity. Secondly, we perform a slip inversion based upon the CMT fault orientations and forward model near-field tsunami maximum expected wave heights to compare against available tide gauge records. We find remarkably good agreement between recorded and predicted ground motions when using a simple fault plane, with the majority of disagreement in ground motions being attributable to local site effects, not earthquake source complexity. Similarly, the near-field tsunami maximum amplitude predictions match tide gauge records well. We conclude that even though our models for the Kaikoura earthquake are devoid of rich source complexities, the CMT driven finite fault is a good enough "average" source and provides useful constraints for rapid forecasting of ground motion and near-field tsunami amplitudes.

  10. An Atlas of ShakeMaps and population exposure catalog for earthquake loss modeling

    Science.gov (United States)

    Allen, T.I.; Wald, D.J.; Earle, P.S.; Marano, K.D.; Hotovec, A.J.; Lin, K.; Hearne, M.G.

    2009-01-01

    We present an Atlas of ShakeMaps and a catalog of human population exposures to moderate-to-strong ground shaking (EXPO-CAT) for recent historical earthquakes (1973-2007). The common purpose of the Atlas and exposure catalog is to calibrate earthquake loss models to be used in the US Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER). The full ShakeMap Atlas currently comprises over 5,600 earthquakes from January 1973 through December 2007, with almost 500 of these maps constrained, to varying degrees, by instrumental ground motions, macroseismic intensity data, community internet intensity observations, and published earthquake rupture models. The catalog of human exposures is derived using current PAGER methodologies. Exposure to discrete levels of shaking intensity is obtained by correlating Atlas ShakeMaps with a global population database. Combining this population exposure dataset with historical earthquake loss data, such as PAGER-CAT, provides a useful resource for calibrating loss methodologies against a systematically derived set of ShakeMap hazard outputs. We illustrate two example uses for EXPO-CAT: (1) simple objective ranking of country vulnerability to earthquakes and (2) the influence of time-of-day on earthquake mortality. In general, we observe that countries in similar geographic regions with similar construction practices tend to cluster spatially in terms of relative vulnerability. We also find little quantitative evidence to suggest that time-of-day is a significant factor in earthquake mortality. Moreover, earthquake mortality appears to be more systematically linked to the population exposed to severe ground shaking (Modified Mercalli Intensity VIII+). Finally, equipped with the full Atlas of ShakeMaps, we merge each of these maps and find the maximum estimated peak ground acceleration at any grid point in the world for the past 35 years. We subsequently compare this "composite ShakeMap" with existing global
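The "composite ShakeMap" described above, i.e. the maximum estimated peak ground acceleration at each grid point over all Atlas ShakeMaps, amounts to a cell-wise maximum over co-registered grids. A minimal sketch of that operation (the function name and toy grids are illustrative; real ShakeMap grids are large geo-referenced arrays on a common lat/lon mesh):

```python
import numpy as np

def composite_max_pga(shakemap_grids):
    """Cell-wise maximum PGA across a collection of co-registered
    ShakeMap grids (2-D arrays on a common lat/lon grid)."""
    return np.maximum.reduce(list(shakemap_grids))

# Three toy 2x2 PGA grids (in g) standing in for Atlas ShakeMaps.
grids = [
    np.array([[0.05, 0.10], [0.02, 0.01]]),
    np.array([[0.30, 0.04], [0.02, 0.20]]),
    np.array([[0.07, 0.12], [0.40, 0.03]]),
]
composite = composite_max_pga(grids)
# composite == [[0.30, 0.12], [0.40, 0.20]]
```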

  11. Persuading girls to take elective physical science courses in high school: Who are the credible communicators?

    Science.gov (United States)

    Koballa, Thomas R., Jr.

    Eighth-grade girls (N=257) randomly selected from nine different public junior high schools in central Texas were questioned in order to identify the communicators whom they perceive as highly credible regarding reasons for taking elective physical science courses in high school and the attributes associated with these communicators. Four persons were each identified by better than 10 percent of the sample as the best person to try to convince junior high school girls to take elective physical science courses in high school. In order of perceived credibility, these persons are father, woman science teacher, mother, and boy high school student. Slight variations in the order of perceived credibility were found when the responses from girls of the different ethnic groups represented in the sample (Caucasian, Hispanic, Black, and Asian) were examined separately. Attributes listed by the respondents for father, woman science teacher, mother, and boy high school student were examined and classified into the categories of prestige, trustworthiness, similarity, attractiveness, and power. Prestige and trustworthiness are the attributes associated most frequently with communicators identified as highly credible. Implications of the present study and suggestions for further research are discussed.

  12. Perceptions of Credibility of Male and Female Syndicated Political Columnists.

    Science.gov (United States)

    Andsager, Julie L.

    1990-01-01

    Examines perceptions of the credibility of male and female syndicated political columnists. Finds that college students exhibited little prejudice against female versus male bylines in political interpretive columns. Finds a small tendency for male readers to evaluate male bylines higher in stereotypical ways, but female readers do not do this.…

  13. Consistency between verbal and non-verbal affective cues: a clue to speaker credibility.

    Science.gov (United States)

    Gillis, Randall L; Nilsen, Elizabeth S

    2017-06-01

    Listeners are exposed to inconsistencies in communication; for example, when speakers' words (i.e. verbal) are discrepant with their demonstrated emotions (i.e. non-verbal). Such inconsistencies introduce ambiguity, which may render a speaker to be a less credible source of information. Two experiments examined whether children make credibility discriminations based on the consistency of speakers' affect cues. In Experiment 1, school-age children (7- to 8-year-olds) preferred to solicit information from consistent speakers (e.g. those who provided a negative statement with negative affect), over novel speakers, to a greater extent than they preferred to solicit information from inconsistent speakers (e.g. those who provided a negative statement with positive affect) over novel speakers. Preschoolers (4- to 5-year-olds) did not demonstrate this preference. Experiment 2 showed that school-age children's ratings of speakers were influenced by speakers' affect consistency when the attribute being judged was related to information acquisition (speakers' believability, "weird" speech), but not general characteristics (speakers' friendliness, likeability). Together, findings suggest that school-age children are sensitive to, and use, the congruency of affect cues to determine whether individuals are credible sources of information.

  14. CPSFS: A Credible Personalized Spam Filtering Scheme by Crowdsourcing

    Directory of Open Access Journals (Sweden)

    Xin Liu

    2017-01-01

    Full Text Available Email spam consumes a lot of network resources and threatens many systems because of its unwanted or malicious content. Most existing spam filters only target complete-spam but ignore semispam. This paper proposes a novel and comprehensive CPSFS scheme: Credible Personalized Spam Filtering Scheme, which classifies spam into two categories, complete-spam and semispam, and targets filtering both kinds of spam. Complete-spam is always spam for all users; semispam is an email identified as spam by some users and as regular email by other users. In CPSFS, Bayesian filtering is deployed at email servers to identify complete-spam, while semispam is identified at the client side by crowdsourcing. An email client can distinguish junk from legitimate emails according to spam reports from credible contacts with similar interests. Social trust and interest similarity between users and their contacts are calculated so that spam reports are more accurately targeted to similar users. The experimental results show that the proposed CPSFS can improve the accuracy rate of distinguishing spam from legitimate emails compared with that of a Bayesian filter alone.
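The crowdsourced semispam decision in CPSFS weights each contact's spam report by social trust and interest similarity. A minimal sketch of one plausible aggregation, a normalized weighted vote; the paper's actual formulas may differ, and the function name, weights, and threshold below are all illustrative:

```python
def semispam_score(reports, trust, similarity):
    """Aggregate spam reports from contacts, weighting each report by the
    recipient's social trust in the contact and their interest similarity.
    reports[c] is 1 if contact c flagged the email as spam, else 0.
    This is an illustrative scheme, not the paper's exact formula."""
    weighted = sum(trust[c] * similarity[c] * r for c, r in reports.items())
    total = sum(trust[c] * similarity[c] for c in reports)
    return weighted / total if total else 0.0

reports = {"alice": 1, "bob": 1, "carol": 0}
trust = {"alice": 0.9, "bob": 0.4, "carol": 0.8}
similarity = {"alice": 0.8, "bob": 0.5, "carol": 0.3}
score = semispam_score(reports, trust, similarity)
# weights: alice 0.72, bob 0.20, carol 0.24 -> (0.72+0.20)/1.16 ~= 0.793
is_semispam = score > 0.5   # hypothetical decision threshold
```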

  15. Where was the 1898 Mare Island Earthquake? Insights from the 2014 South Napa Earthquake

    Science.gov (United States)

    Hough, S. E.

    2014-12-01

    The 2014 South Napa earthquake provides an opportunity to reconsider the Mare Island earthquake of 31 March 1898, which caused severe damage to buildings at a Navy yard on the island. Revisiting archival accounts of the 1898 earthquake, I estimate a lower intensity magnitude, 5.8, than the value in the current Uniform California Earthquake Rupture Forecast (UCERF) catalog (6.4). However, I note that intensity magnitude can differ from Mw by upwards of half a unit depending on stress drop, which for a historical earthquake is unknowable. In the aftermath of the 2014 earthquake, there has been speculation that the apparently severe effects on Mare Island in 1898 were due to the vulnerability of local structures. No surface rupture has ever been identified from the 1898 event, which is commonly associated with the Hayward-Rodgers Creek fault system, some 10 km west of Mare Island (e.g., Parsons et al., 2003). Reconsideration of detailed archival accounts of the 1898 earthquake, together with a comparison of the intensity distributions for the two earthquakes, points to genuinely severe, likely near-field ground motions on Mare Island. The 2014 earthquake did cause significant damage to older brick buildings on Mare Island, but the level of damage does not match the severity of documented damage in 1898. The high-intensity fields for the two earthquakes are moreover spatially shifted, with the centroid of the 2014 distribution near the town of Napa and that of the 1898 distribution near Mare Island, east of the Hayward-Rodgers Creek system. I conclude that the 1898 Mare Island earthquake was centered on or near Mare Island, possibly involving rupture of one or both strands of the Franklin fault, a low-slip-rate fault sub-parallel to the Rodgers Creek fault to the west and the West Napa fault to the east. I estimate Mw5.8 assuming an average stress drop; data are also consistent with Mw6.4 if stress drop was a factor of ≈3 lower than average for California earthquakes. I

  16. A preliminary assessment of earthquake ground shaking hazard at Yucca Mountain, Nevada and implications to the Las Vegas region

    International Nuclear Information System (INIS)

    Wong, I.G.; Green, R.K.; Sun, J.I.; Pezzopane, S.K.; Abrahamson, N.A.; Quittmeyer, R.C.

    1996-01-01

    As part of early design studies for the potential Yucca Mountain nuclear waste repository, the authors have performed a preliminary probabilistic seismic hazard analysis of ground shaking. A total of 88 Quaternary faults within 100 km of the site were considered in the hazard analysis. They were characterized in terms of their probability of being seismogenic, and their geometry, maximum earthquake magnitude, recurrence model, and slip rate. Individual faults were characterized by maximum earthquakes that ranged from moment magnitude (Mw) 5.1 to 7.6. Fault slip rates ranged from a very low 0.00001 mm/yr to as much as 4 mm/yr. An areal source zone representing background earthquakes up to Mw 6¼ was also included in the analysis. Recurrence for these background events was based on the 1904-1994 historical record, which contains events up to Mw 5.6. Based on this analysis, the peak horizontal rock accelerations are 0.16, 0.21, 0.28, and 0.50 g for return periods of 500, 1,000, 2,000, and 10,000 years, respectively. In general, the dominant contributors to the ground shaking hazard at Yucca Mountain are background earthquakes, because of the low slip rates of the Basin and Range faults. A significant effect on the probabilistic ground motions is due to the inclusion of a new attenuation relation developed specifically for earthquakes in extensional tectonic regimes. This relation gives significantly lower peak accelerations than five other predominantly California-based relations used in the analysis, possibly due to the lower stress drops of extensional earthquakes compared to California events. Because Las Vegas is located within the same tectonic regime as Yucca Mountain, the seismic sources and the path and site factors affecting the seismic hazard at Yucca Mountain also have implications for Las Vegas. These implications are discussed in this paper
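The quoted hazard results map return periods to peak ground accelerations (0.16 g at 500 years up to 0.50 g at 10,000 years). As a small illustration of how such a hazard curve is read at intermediate return periods, the sketch below interpolates it in log-log space; the interpolation scheme is an assumption for illustration, not part of the original analysis:

```python
import numpy as np

# Hazard-curve points from the abstract: peak horizontal rock
# acceleration (g) versus return period (years) at Yucca Mountain.
return_periods = np.array([500.0, 1000.0, 2000.0, 10000.0])
pga = np.array([0.16, 0.21, 0.28, 0.50])

def pga_for_return_period(T):
    """Log-log interpolation of the hazard curve; T in years.
    A convenience for reading the curve between the published points."""
    return float(np.exp(np.interp(np.log(T), np.log(return_periods), np.log(pga))))

# For rare events the annual probability of exceedance is ~1/T.
p_annual_2000yr = 1.0 / 2000.0   # 5e-4 per year for the 0.28 g motion
```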

  17. Earthquakes, May-June 1991

    Science.gov (United States)

    Person, W.J.

    1992-01-01

    One major earthquake occurred during this reporting period. This was a magnitude 7.1 earthquake in Indonesia (Minahassa Peninsula) on June 20. Earthquake-related deaths were reported in the Western Caucasus (Georgia, USSR) on May 3 and June 15. One earthquake-related death was also reported in El Salvador on June 21.

  18. Civil Society and the Conduct of Free, Fair and Credible Election ...

    African Journals Online (AJOL)

    Civil Society and the Conduct of Free, Fair and Credible Election: Lessons from ... of Non Governmental agencies like civil society to prevent the government of the ... fair so as to rid the continent of its notorious record of post election violence.

  19. Modeling, Forecasting and Mitigating Extreme Earthquakes

    Science.gov (United States)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  20. Numerical relationship between surface deformation and a change of groundwater table before and after an earthquake

    International Nuclear Information System (INIS)

    Akao, Yoshihiko

    1995-01-01

    The purpose of this study is to estimate the effect of earthquakes upon groundwater flow around repositories for high-level radioactive wastes. Estimation of a groundwater-flow change before and after an earthquake or a volcanic eruption is one of the issues for a long-term safety assessment of the repositories. However, almost no systematic investigation of the causality between groundwater-flow changes and earthquakes or eruptions could be found, and no estimation formula has been published. The authors succeeded in obtaining a primitive relationship between a groundwater change and an earthquake in this study. The study consists of three stages. First, several survey reports describing field observations of groundwater anomalies caused by earthquakes or eruptions were collected. The necessary data were read from the literature and systematically arranged. Second, the source mechanisms of the corresponding earthquakes were inspected, and static displacements at the well positions were calculated using the dislocation theory of seismology. Third, parametric studies among the parameters of groundwater anomalies and earthquakes were carried out to find a numerical relationship between pairs of them. A preliminary relationship between the water-table change in a well and the static displacement at the well position was then found. The authors conclude that the temporary change of the water table appears to depend on the norm of the displacement vector. In this relationship, the maximum value of the water-table change would be approximately one hundred times the displacement
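The closing sentence states an empirical rule of thumb: the maximum water-table change is roughly one hundred times the norm of the static displacement vector at the well. A minimal sketch of that arithmetic (the function name and the factor-of-100 handling are illustrative of the abstract's approximate relationship, not a validated formula):

```python
import math

def predicted_water_table_change(displacement, factor=100.0):
    """Rough empirical estimate from the study: maximum water-table change
    is about `factor` (~100) times the norm of the static displacement
    vector at the well. Displacement components in metres."""
    ux, uy, uz = displacement
    norm = math.sqrt(ux**2 + uy**2 + uz**2)
    return factor * norm

# A well displaced 3 cm east, 4 cm north, 0 cm vertically:
dh = predicted_water_table_change((0.03, 0.04, 0.0))
# norm = 0.05 m, so the predicted change is about 5 m
```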

  1. Investigating the impact of viral message appeal and message credibility on consumer attitude toward brand

    Directory of Open Access Journals (Sweden)

    Majid Esmaeilpour

    2016-12-01

    Full Text Available Background - Due to the rapid growth of the Internet and the use of e-commerce in recent years, viral marketing has drawn the attention of manufacturing and service organizations. However, no research has been conducted to examine the impact of message appeal and message source credibility on consumers' attitude with the mediating role of consumers' intellectual involvement and their risk-taking level. Purpose - The aim of this study was to examine the impact of message appeal and message source credibility on consumers' attitude with the mediating role of consumers' intellectual involvement and their risk-taking level. Design/methodology/approach - The population of this study includes consumers of mobile phones (Samsung, Sony, Nokia, LG and iPhone) in Bushehr city (Iran). As the population of the study is unlimited, 430 questionnaires were distributed using an availability sampling method, and 391 questionnaires were collected and analyzed. Using structural equation modeling, data were analyzed with SmartPLS software. Findings - The results show that the appeal and credibility of the message source have an impact on consumer attitudes toward the brand. It was also found that the intellectual involvement of consumers plays a mediating role in the relationship between message appeal and consumer attitudes toward the brand. In the relationship between message source credibility and customer attitude toward the brand, the risk-taking level of consumers has no mediating role. Research limitations/implications - The data collection tool was a questionnaire, and questionnaires have some disadvantages that can affect the results. Additionally, this study was conducted in Bushehr city (Iran). Therefore, we should be cautious in generalizing the findings. Originality/value - In this study, the effect of message appeal and message source credibility on consumer attitude toward the brand was examined. The risk-taking level of the consumer and their involvement level were considered

  2. Earthquake Catalogue of the Caucasus

    Science.gov (United States)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (˜25 stations), Azerbaijan (˜35 stations), Armenia (˜14 stations)). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. The first arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved locations of the events and recalculated moment magnitudes in order to obtain a unified magnitude

  3. Seismic Regionalization of Michoacan, Mexico and Recurrence Periods for Earthquakes

    Science.gov (United States)

    Magaña García, N.; Figueroa-Soto, Á.; Garduño-Monroy, V. H.; Zúñiga, R.

    2017-12-01

    Michoacán is one of the states with the highest occurrence of earthquakes in Mexico. It lies on a convergent margin driven by the subduction of the Cocos plate beneath the North American plate along the Pacific coast of the country, and active faults, such as the Morelia-Acambay Fault System (MAFS), also exist within the state. It is important to combine seismological, paleoseismological and geological studies to allow sound planning and development of urban complexes and to mitigate disasters should destructive earthquakes occur. With statistical seismology it is possible to characterize the degree of seismic activity as well as to estimate recurrence periods for earthquakes. For this work, a seismicity catalog of Michoacán was compiled and homogenized in time and magnitude. This information was obtained from global and national agencies (SSN, CMT, etc.), some data published by Mendoza and Martínez-López (2016), and the seismic catalog homogenized by F. R. Zúñiga (personal communication). From the analysis of the different focal mechanisms reported in the literature and from geological studies, the seismic regionalization of the state of Michoacán complements the one presented by Vázquez-Rosas (2012), along with recurrence periods for earthquakes within four different seismotectonic regions. In addition, stable periods were determined for the b value of the Gutenberg-Richter (1944) relation using the Maximum Curvature (MAXC) and EMR (Entire Magnitude Range, 2005) techniques, which allowed us to determine recurrence periods: years for earthquakes above 7.5 for the subduction zone (A zone) with the EMR technique and years with the MAXC technique for the same zone; years for earthquakes above 5 for the B1 zone with the EMR technique and years with the MAXC technique; years for earthquakes above 7.0 for the B2 zone with the EMR technique and years with the MAXC technique; and, for the Morelia-Acambay Fault System zone (C zone), years for earthquakes
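The b-value estimation underlying such recurrence-period work can be illustrated with the standard Aki-Utsu maximum-likelihood estimator and a Gutenberg-Richter recurrence calculation. This is a generic sketch, not the authors' code: the function names, the synthetic catalog, and the demo Gutenberg-Richter parameters are all illustrative.

```python
import math
import random

def b_value_aki_utsu(mags, mc, dm=0.0):
    """Maximum-likelihood b-value (Aki 1965; dm/2 is Utsu's correction
    for magnitudes binned at width dm): b = log10(e) / (mean(M) - (Mc - dm/2))
    for magnitudes M >= completeness magnitude Mc."""
    m = [x for x in mags if x >= mc]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))

def recurrence_period(a, b, m, years):
    """Mean recurrence period (years) of events with magnitude >= m from a
    Gutenberg-Richter fit log10 N = a - b*m, where N counts events over a
    catalog span of `years`."""
    n_per_year = 10 ** (a - b * m) / years
    return 1.0 / n_per_year

# Synthetic (unbinned) catalog with b ~ 1: exponentially distributed
# magnitudes above Mc = 4.0, so dm = 0 in the estimator.
random.seed(0)
beta = math.log(10) * 1.0
mags = [4.0 + random.expovariate(beta) for _ in range(5000)]
b = b_value_aki_utsu(mags, mc=4.0)   # close to 1.0 for this synthetic set
```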

  4. Natural Time and Nowcasting Earthquakes: Are Large Global Earthquakes Temporally Clustered?

    Science.gov (United States)

    Luginbuhl, Molly; Rundle, John B.; Turcotte, Donald L.

    2018-02-01

    The objective of this paper is to analyze the temporal clustering of large global earthquakes with respect to natural time, or interevent count, as opposed to regular clock time. To do this, we use two techniques: (1) nowcasting, a new method of statistically classifying seismicity and seismic risk, and (2) time series analysis of interevent counts. We chose the sequences of M_λ ≥ 7.0 and M_λ ≥ 8.0 earthquakes from the global centroid moment tensor (CMT) catalog from 2004 to 2016 for analysis. A significant number of these earthquakes will be aftershocks of the largest events, but no satisfactory method of declustering the aftershocks in clock time is available. A major advantage of using natural time is that it eliminates the need for declustering aftershocks. The event count we utilize is the number of small earthquakes that occur between large earthquakes. The small earthquake magnitude is chosen to be as small as possible, such that the catalog is still complete based on the Gutenberg-Richter statistics. For the CMT catalog, starting in 2004, we found the completeness magnitude to be M_σ ≥ 5.1. For the nowcasting method, the cumulative probability distribution of these interevent counts is obtained. We quantify the distribution using the exponent, β, of the best fitting Weibull distribution; β = 1 for a random (exponential) distribution. We considered 197 earthquakes with M_λ ≥ 7.0 and found β = 0.83 ± 0.08. We considered 15 earthquakes with M_λ ≥ 8.0, but this number was considered too small to generate a meaningful distribution. For comparison, we generated synthetic catalogs of earthquakes that occur randomly with the Gutenberg-Richter frequency-magnitude statistics. We considered a synthetic catalog of 1.97 × 10^5 M_λ ≥ 7.0 earthquakes and found β = 0.99 ± 0.01. The random catalog converted to natural time was also random. We then generated 1.5 × 10^4 synthetic catalogs with 197 M_λ ≥ 7.0 in each catalog and
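The Weibull-exponent test the abstract describes, fitting β to interevent counts where β = 1 indicates a random (exponential) catalog, can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the bisection solver, the variable names, and the synthetic exponential "catalog" are assumptions, and real natural-time counts would be integers taken from an actual earthquake catalog.

```python
import math
import random

def weibull_shape_mle(xs, lo=0.05, hi=20.0, tol=1e-9):
    """Maximum-likelihood Weibull shape parameter (beta) found by bisection
    on the profile-likelihood score equation; beta = 1 corresponds to an
    exponential (temporally random) distribution of interevent counts,
    beta < 1 suggests clustering."""
    logs = [math.log(x) for x in xs]
    mean_log = sum(logs) / len(logs)

    def score(k):
        # MLE score for the shape k with the scale profiled out; this
        # function is increasing in k and crosses zero at the estimate.
        s1 = sum(x ** k * math.log(x) for x in xs)
        s0 = sum(x ** k for x in xs)
        return s1 / s0 - 1.0 / k - mean_log

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if score(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Exponential interevent counts standing in for a random catalog
# converted to natural time (continuous values for simplicity):
random.seed(1)
counts = [random.expovariate(1.0 / 50.0) for _ in range(2000)]
beta = weibull_shape_mle(counts)   # close to 1 for a random catalog
clustered = beta < 1.0
```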

  5. Earthquake hazard assessment and small earthquakes

    International Nuclear Information System (INIS)

    Reiter, L.

    1987-01-01

    The significance of small earthquakes and their treatment in nuclear power plant seismic hazard assessment is an issue which has received increased attention over the past few years. In probabilistic studies, sensitivity analyses have shown that the choice of the lower bound magnitude used in hazard calculations can have a larger than expected effect on the calculated hazard. Of particular interest is the fact that some of the difference in seismic hazard calculations between the Lawrence Livermore National Laboratory (LLNL) and Electric Power Research Institute (EPRI) studies can be attributed to this choice. The LLNL study assumed a lower bound magnitude of 3.75, while the EPRI study assumed a lower bound magnitude of 5.0; the magnitudes used were assumed to be body-wave magnitudes or their equivalents. In deterministic studies, recent ground-motion recordings of small to moderate earthquakes at or near nuclear power plants have shown that the high-frequency portions of design response spectra may be exceeded. These exceedances became important issues in the licensing of the Summer and Perry nuclear power plants. At various times in the past, particular concerns have been raised with respect to the hazard and damage potential of small to moderate earthquakes occurring at very shallow depths. In this paper a closer look is taken at these issues. Emphasis is given to the impact of the lower bound magnitude on probabilistic hazard calculations and to the historical record of damage from small to moderate earthquakes. Limited recommendations are made as to how these issues should be viewed.
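The sensitivity to the lower-bound magnitude that this abstract describes is easy to demonstrate with a toy probabilistic hazard integral. Everything below is hypothetical: the Gutenberg-Richter a and b values, and the logistic exceedance curve standing in for a real ground-motion model.

```python
import numpy as np

# Hypothetical Gutenberg-Richter source: log10 N(>= m) = a - b*m, capped at m_max.
a, b, m_max = 4.0, 1.0, 7.5

def annual_rate(m):
    """Annual rate of events with magnitude >= m (zero above m_max)."""
    return np.maximum(10.0**(a - b * m) - 10.0**(a - b * m_max), 0.0)

def exceed_prob(m):
    """Stand-in for P(ground motion > design level | magnitude m); not a real GMPE."""
    return 1.0 / (1.0 + np.exp(-(m - 5.5)))

hazard = {}
for m_min in (3.75, 5.0):
    edges = np.arange(m_min, m_max, 0.1)
    # Discretize the magnitude range into 0.1-unit bins and sum
    # bin rate * exceedance probability.
    bin_rates = annual_rate(edges) - annual_rate(edges + 0.1)
    hazard[m_min] = float(np.sum(bin_rates * exceed_prob(edges + 0.05)))

for m_min, h in hazard.items():
    print(f"lower bound M{m_min}: annual exceedance rate = {h:.4f}")
```

With these stand-in numbers, the M3.75 lower bound yields a noticeably higher computed hazard than the M5.0 bound, mirroring the LLNL/EPRI discrepancy the abstract discusses.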

  6. One Basin, One Stress Regime, One Orientation of Seismogenic Basement Faults, Variable Spatio-Temporal Slip Histories: Lessons from Fort Worth Basin Induced Earthquake Sequences

    Science.gov (United States)

    DeShon, H. R.; Brudzinski, M.; Frohlich, C.; Hayward, C.; Jeong, S.; Hornbach, M. J.; Magnani, M. B.; Ogwari, P.; Quinones, L.; Scales, M. M.; Stump, B. W.; Sufri, O.; Walter, J. I.

    2017-12-01

    Since October 2008, the Fort Worth basin in north Texas has experienced over 30 magnitude (M) 3.0+ earthquakes, including one M4.0. Five named earthquake sequences have been recorded by local seismic networks: DFW Airport, Cleburne-Johnson County, Azle, Irving-Dallas, and Venus-Johnson County. Earthquakes have occurred on northeast (NE)-southwest (SW) trending Precambrian basement faults and within the overlying Ellenburger limestone unit used for wastewater disposal. Focal mechanisms indicate primarily normal faulting, and stress inversions indicate maximum regional horizontal stress strikes 20-30° NE. The seismogenic sections of the faults, whether in the basement or within the Ellenburger, appear optimally oriented for failure within the modern stress regime. Stress drop estimates range from 10 to 75 bars, with little variability between and within the named sequences, and the values are consistent with intraplate earthquake stress drops in natural tectonic settings. However, the spatio-temporal history of each sequence relative to wastewater injection data varies. The May 2015 M4.0 Venus earthquake, for example, is only the largest event in nearly 10 years of earthquake activity on a single fault structure. Here, maximum earthquake size has increased with time and exhibits a log-linear relationship to cumulative injected volume from 5 nearby wells. At DFW Airport, where the causative well was shut-in within a few months of the initial earthquakes and soon after the well began operation, we document sporadic migration away from the injector along the same fault for nearly 6 km over 5 years. The Irving-Dallas and Azle sequences, like DFW Airport, appear to have started rather abruptly, with just a few small magnitude earthquakes in the weeks or months preceding the significant set of magnitude 3.5+ earthquakes associated with each sequence. There are no nearby (<10 km) injection operations to the Irving-Dallas sequence and the Azle linked wells operated for
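The log-linear relationship between maximum observed magnitude and cumulative injected volume mentioned for the Venus sequence can be sketched as a simple regression. The volumes and magnitudes below are invented for illustration; they are not the actual well or catalog data.

```python
import numpy as np

# Hypothetical observations: cumulative injected volume (m^3) vs. the largest
# earthquake magnitude observed to date (illustrative numbers only).
volume = np.array([1e5, 5e5, 1e6, 5e6, 1e7, 5e7])
m_max = np.array([2.0, 2.6, 2.9, 3.4, 3.7, 4.0])

# Least-squares fit of the log-linear relation M_max = c0 + c1 * log10(V).
c1, c0 = np.polyfit(np.log10(volume), m_max, 1)
print(f"M_max ~ {c0:.2f} + {c1:.2f} * log10(V)")

# Extrapolation (with all the usual caveats) to a larger cumulative volume:
print(f"predicted M_max at V = 1e8 m^3: {c0 + c1 * 8.0:.2f}")
```

A positive slope c1 is the signature of the observed growth of maximum magnitude with injected volume; extrapolating such a fit beyond the data is, of course, speculative.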

  7. Historic Eastern Canadian earthquakes

    International Nuclear Information System (INIS)

    Asmis, G.J.K.; Atchinson, R.J.

    1981-01-01

    Nuclear power plants licensed in Canada have been designed to resist earthquakes; not all plants, however, have been explicitly designed to the same level of earthquake-induced forces. Understanding of the nature of strong ground motion near the source of an earthquake is still very tentative. This paper reviews historical and scientific accounts of the three strongest earthquakes - St. Lawrence (1925), Temiskaming (1935), Cornwall (1944) - that have occurred in Canada in 'modern' times; field studies of near-field strong ground motion records and their resultant damage or non-damage to industrial facilities; and numerical modelling of earthquake sources and resultant wave propagation to produce accelerograms consistent with the above historical records and field studies. It is concluded that for future construction of NPPs, near-field strong motion must be explicitly considered in design.

  8. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  9. Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquake and arrange their life styles in compliance with this, damage they will suffer will reduce to that extent. In particular, a good training regarding earthquake to be received in primary schools is considered…

  10. Excel, Earthquakes, and Moneyball: exploring Cascadia earthquake probabilities using spreadsheets and baseball analogies

    Science.gov (United States)

    Campbell, M. R.; Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2017-12-01

    Much recent media attention focuses on Cascadia's earthquake hazard. A widely cited magazine article starts "An earthquake will destroy a sizable portion of the coastal Northwest. The question is when." Stories include statements like "a massive earthquake is overdue", "in the next 50 years, there is a 1-in-10 chance a 'really big one' will erupt," or "the odds of the big Cascadia earthquake happening in the next fifty years are roughly one in three." These lead students to ask where the quoted probabilities come from and what they mean. These probability estimates involve two primary choices: what data are used to describe when past earthquakes happened and what models are used to forecast when future earthquakes will happen. The data come from a 10,000-year record of large paleoearthquakes compiled from subsidence data on land and turbidites, offshore deposits recording submarine slope failure. Earthquakes seem to have happened in clusters of four or five events, separated by gaps. Earthquakes within a cluster occur more frequently and regularly than in the full record. Hence the next earthquake is more likely if we assume that we are in the recent cluster that started about 1700 years ago, than if we assume the cluster is over. Students can explore how changing assumptions drastically changes probability estimates using easy-to-write and display spreadsheets, like those shown below. Insight can also come from baseball analogies. The cluster issue is like deciding whether to assume that a hitter's performance in the next game is better described by his lifetime record, or by the past few games, since he may be hitting unusually well or in a slump. The other big choice is whether to assume that the probability of an earthquake is constant with time, or is small immediately after one occurs and then grows with time. This is like whether to assume that a player's performance is the same from year to year, or changes over their career. Thus saying "the chance of
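The spreadsheet exercise this abstract describes reduces to a one-line formula under the simplest (time-independent Poisson) model: the probability of at least one event in a window depends only on the assumed mean recurrence interval. The recurrence values below are illustrative stand-ins, not the actual Cascadia paleoseismic estimates.

```python
import numpy as np

def poisson_prob(mean_recurrence_yr, window_yr=50.0):
    """P(at least one event in the window) for a time-independent Poisson model."""
    return 1.0 - np.exp(-window_yr / mean_recurrence_yr)

# Hypothetical mean recurrence intervals (illustrative only):
full_record = 500.0   # average over a whole multi-millennial record
in_cluster = 300.0    # average within an assumed ongoing cluster

print(f"full-record assumption: {poisson_prob(full_record):.0%} chance in 50 yr")
print(f"in-cluster assumption:  {poisson_prob(in_cluster):.0%} chance in 50 yr")
```

This shows the core pedagogical point: the same formula gives noticeably different 50-year probabilities depending solely on whether one assumes the full-record or the in-cluster recurrence interval.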

  11. Predictors of premature termination from psychotherapy for anorexia nervosa: Low treatment credibility, early therapy alliance, and self-transcendence.

    Science.gov (United States)

    Jordan, Jennifer; McIntosh, Virginia V W; Carter, Frances A; Joyce, Peter R; Frampton, Christopher M A; Luty, Suzanne E; McKenzie, Janice M; Carter, Janet D; Bulik, Cynthia M

    2017-08-01

    Failure to complete treatment for anorexia nervosa (AN) is common, clinically concerning, and difficult to predict. This study examines whether therapy-related factors (patient-rated pretreatment credibility and early therapeutic alliance) predict subsequent premature termination of treatment (PTT), alongside self-transcendence (a previously identified clinical predictor), in women with AN. Participants were 56 women aged 17-40 years taking part in a randomized outpatient psychotherapy trial for AN. Treatment completion was defined as attending 15/20 planned sessions. Measures were the Treatment Credibility, Temperament and Character Inventory, Vanderbilt Therapeutic Alliance Scale and the Vanderbilt Psychotherapy Process Scale. Statistics were univariate tests, correlations, and logistic regression. Treatment credibility and certain early patient and therapist alliance/process subscales predicted PTT. Lower self-transcendence and lower early process accounted for 33% of the variance in predicting PTT. Routine assessment of treatment credibility and early process (comprehensively assessed from multiple perspectives) may help clinicians reduce PTT, thereby enhancing treatment outcomes. © 2017 Wiley Periodicals, Inc.

  12. Do Earthquakes Shake Stock Markets?

    Science.gov (United States)

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  13. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes

    Science.gov (United States)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh

    2014-06-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 min after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). A new island-wide, high resolution SEM mesh model is developed for the whole Taiwan in this study. We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  14. Earthquake scaling laws for rupture geometry and slip heterogeneity

    Science.gov (United States)

    Thingbaijam, Kiran K. S.; Mai, P. Martin; Goda, Katsuichiro

    2016-04-01

    We analyze an extensive compilation of finite-fault rupture models to investigate earthquake scaling of source geometry and slip heterogeneity to derive new relationships for seismic and tsunami hazard assessment. Our dataset comprises 158 earthquakes with a total of 316 rupture models selected from the SRCMOD database (http://equake-rc.info/srcmod). We find that fault-length does not saturate with earthquake magnitude, while fault-width reveals inhibited growth due to the finite seismogenic thickness. For strike-slip earthquakes, fault-length grows more rapidly with increasing magnitude compared to events of other faulting types. Interestingly, our derived relationship falls between the L-model and W-model end-members. In contrast, both reverse and normal dip-slip events are more consistent with self-similar scaling of fault-length. However, fault-width scaling relationships for large strike-slip and normal dip-slip events, occurring on steeply dipping faults (δ~90° for strike-slip faults, and δ~60° for normal faults), deviate from self-similarity. Although reverse dip-slip events in general show self-similar scaling, the restricted growth of down-dip fault extent (with upper limit of ~200 km) can be seen for mega-thrust subduction events (M~9.0). Despite this fact, for a given earthquake magnitude, subduction reverse dip-slip events occupy relatively larger rupture area, compared to shallow crustal events. In addition, we characterize slip heterogeneity in terms of its probability distribution and spatial correlation structure to develop a complete stochastic random-field characterization of earthquake slip. We find that truncated exponential law best describes the probability distribution of slip, with observable scale parameters determined by the average and maximum slip. Applying Box-Cox transformation to slip distributions (to create quasi-normal distributed data) supports cube-root transformation, which also implies distinctive non-Gaussian slip
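The truncated exponential slip law described above is straightforward to sample via the inverse-CDF method, with the scale and maximum slip as the two observable parameters. The numerical values below (1.5 m characteristic slip, 8 m maximum) are hypothetical, chosen only to illustrate the shape of the distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_truncated_exponential(scale, d_max, size):
    """Draw slip values from an exponential law truncated at d_max.

    Inverse-CDF method: F(d) = (1 - exp(-d/scale)) / (1 - exp(-d_max/scale)).
    """
    u = rng.uniform(0.0, 1.0, size)
    return -scale * np.log(1.0 - u * (1.0 - np.exp(-d_max / scale)))

# Hypothetical parameters: characteristic slip scale 1.5 m, maximum slip 8 m.
slip = sample_truncated_exponential(1.5, 8.0, 100000)
print(f"mean slip = {slip.mean():.2f} m, largest sample = {slip.max():.2f} m")
```

The truncation guarantees no sample exceeds the maximum slip, which is exactly the property that makes the law convenient for stochastic rupture generators.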

  15. Sedimentary Signatures of Submarine Earthquakes: Deciphering the Extent of Sediment Remobilization from the 2011 Tohoku Earthquake and Tsunami and 2010 Haiti Earthquake

    Science.gov (United States)

    McHugh, C. M.; Seeber, L.; Moernaut, J.; Strasser, M.; Kanamatsu, T.; Ikehara, K.; Bopp, R.; Mustaque, S.; Usami, K.; Schwestermann, T.; Kioka, A.; Moore, L. M.

    2017-12-01

    The 2004 Sumatra-Andaman Mw9.3 and the 2011 Tohoku (Japan) Mw9.0 earthquakes and tsunamis were huge geological events with major societal consequences. Both were along subduction boundaries and ruptured portions of these boundaries that had been deemed incapable of such events. Submarine strike-slip earthquakes, such as the 2010 Mw7.0 in Haiti, are smaller but may be closer to population centers and can be similarly catastrophic. Both classes of earthquakes remobilize sediment and leave distinct signatures in the geologic record by a wide range of processes that depends on both environment and earthquake characteristics. Understanding them has the potential of greatly expanding the record of past earthquakes, which is critical for geohazard analysis. Recent events offer precious ground truth about the earthquakes, and short-lived radioisotopes offer invaluable tools to identify sediments they remobilized. In the 2011 Mw9 Japan earthquake they document the spatial extent of remobilized sediment from water depths of 626m in the forearc slope to trench depths of 8000m. Subbottom profiles, multibeam bathymetry and 40 piston cores collected by the R/V Natsushima and R/V Sonne expeditions to the Japan Trench document multiple turbidites and high-density flows. Core tops enriched in excess 210Pb, 137Cs and 134Cs reveal sediment deposited by the 2011 Tohoku earthquake and tsunami. The thickest deposits (2m) were documented on a mid-slope terrace and trench (4000-8000m). Sediment was deposited on some terraces (600-3000m), but shed from the steep forearc slope (3000-4000m). The 2010 Haiti mainshock ruptured along the southern flank of Canal du Sud and triggered multiple nearshore sediment failures, generated turbidity currents and stirred fine sediment into suspension throughout this basin. A tsunami was modeled to stem from both sediment failures and tectonics. Remobilized sediment was tracked with short-lived radioisotopes from the nearshore, slope, in fault basins including the

  16. Factors that optimise the credibility of advertisements whilst promoting feelings of emotional well-being and satisfaction

    OpenAIRE

    Duck, Nicholas

    2017-01-01

    The aim of this thesis is to demonstrate that advertisements that challenge the security of consumers can undermine the impact and lasting influence of these messages. Conversely, advertisements could be used to evoke feelings of security and enhance emotional well-being whilst optimising the credibility and impact of messages. Specifically, research demonstrates that advertisements might elicit different motivational styles, referred to as an individual’s regulatory focus. The credibility of...

  17. Analysis of the Earthquake-Resistant Design Approach for Buildings in Mexico

    Directory of Open Access Journals (Sweden)

    Carrillo Julián

    2014-01-01

    Full Text Available The development of new codes for earthquake-resistant structures has made it possible to guarantee better performance of buildings when they are subjected to seismic actions. It is therefore desirable that current building design codes become conceptually transparent in defining the strength modification factors and assessing maximum lateral displacements, so that the design process can be clearly understood by structural engineers. The aim of this study is to analyze the transparency of the earthquake-resistant design approach for buildings in Mexico by means of a critical review of the factors for strength modification and displacement amplification. The approach of building design codes in the US is also analyzed. It is concluded that earthquake-resistant design in Mexico has evolved in refinement and complexity. It is also demonstrated that the procedure prescribed by the Mexican design codes allows the assessment of design strengths and displacements in a more rational way, in accordance not only with the present state of knowledge but also with contemporary tendencies in building codes. In contrast, the procedures used in US codes may not provide a clear view of the seismic response assessment of buildings.

  18. The 2008 Wenchuan Earthquake and the Rise and Fall of Earthquake Prediction in China

    Science.gov (United States)

    Chen, Q.; Wang, K.

    2009-12-01

    Regardless of the future potential of earthquake prediction, it is presently impractical to rely on it to mitigate earthquake disasters. The practical approach is to strengthen the resilience of our built environment to earthquakes based on hazard assessment. But this was not common understanding in China when the M 7.9 Wenchuan earthquake struck the Sichuan Province on 12 May 2008, claiming over 80,000 lives. In China, earthquake prediction is a government-sanctioned and law-regulated measure of disaster prevention. A sudden boom of the earthquake prediction program in 1966-1976 coincided with a succession of nine M > 7 damaging earthquakes in the densely populated region of the country and the political chaos of the Cultural Revolution. It climaxed with the prediction of the 1975 Haicheng earthquake, which was due mainly to an unusually pronounced foreshock sequence and the extraordinary readiness of some local officials to issue imminent warning and evacuation order. The Haicheng prediction was a success in practice and yielded useful lessons, but the experience cannot be applied to most other earthquakes and cultural environments. Since the disastrous Tangshan earthquake in 1976 that killed over 240,000 people, there have been two opposite trends in China: decreasing confidence in prediction and increasing emphasis on regulating construction design for earthquake resilience. In 1976, most of the seismic intensity XI areas of Tangshan were literally razed to the ground, but in 2008, many buildings in the intensity XI areas of Wenchuan did not collapse. Prediction did not save life in either of these events; the difference was made by construction standards. For regular buildings, there was no seismic design in Tangshan to resist any earthquake shaking in 1976, but limited seismic design was required for the Wenchuan area in 2008. Although the construction standards were later recognized to be too low, those buildings that met the standards suffered much less

  19. 1/f and the Earthquake Problem: Scaling constraints that facilitate operational earthquake forecasting

    Science.gov (United States)

    yoder, M. R.; Rundle, J. B.; Turcotte, D. L.

    2012-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or "1/f", nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this "1/f problem," it can be argued that catalog selection (or equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity-based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture-length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area) to the local earthquake magnitude potential - the magnitude of earthquake the region is expected to experience. From this, we introduce a new type of time-dependent hazard map for which the tuning-parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and also by incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS-type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f-type, problems can be constrained from scaling relations and finite extents. [Figure: record-breaking hazard map of southern California, 2012-08-06; "warm" colors indicate locally elevated hazard.]
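One of the scaling ingredients named above, Omori decay of the aftershock rate, can be combined with a magnitude-dependent productivity term to estimate how many aftershocks a time window should contain. The constants below (c, p, and the productivity scaling) are hypothetical placeholders, not the paper's fitted values.

```python
def omori_rate(t, K, c=0.05, p=1.1):
    """Modified Omori law: aftershock rate n(t) = K / (t + c)**p, t in days."""
    return K / (t + c) ** p

def expected_count(K, t1, t2, c=0.05, p=1.1):
    """Analytic integral of the Omori rate between t1 and t2 days (p != 1)."""
    return K * ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)

def productivity(m, m_c=3.0, alpha=1.0):
    """Hypothetical productivity scaling with mainshock magnitude m."""
    return 10.0 ** (alpha * (m - m_c) - 2.0)

K = productivity(7.0)
print(f"expected aftershocks, days 0-30:  {expected_count(K, 0.0, 30.0):.0f}")
print(f"expected aftershocks, days 30-60: {expected_count(K, 30.0, 60.0):.0f}")
```

The first window holds far more events than the second, the power-law decay that any Omori-constrained catalog partition has to respect.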

  20. Multi-Parameter Observation and Detection of Pre-Earthquake Signals in Seismically Active Areas

    Science.gov (United States)

    Ouzounov, D.; Pulinets, S.; Parrot, M.; Liu, J. Y.; Hattori, K.; Kafatos, M.; Taylor, P.

    2012-01-01

    .9) in Taiwan and Japan. We have found anomalous behavior before all of these events, with no false negatives. The calculated false alarm ratio for the same month over the entire period of analysis (2003-2009) is less than 10%. The commonalities in detecting atmospheric/ionospheric anomalies show that they may exist over both land and sea in regions of maximum stress (i.e., along plate boundaries). Our results indicate that the ISTF model could provide a capability to observe pre-earthquake atmospheric/ionospheric signals by combining this information into a common framework.

  1. Stress regimes in the northwest of Iran from stress inversion of earthquake focal mechanisms

    Science.gov (United States)

    Afra, Mahsa; Moradi, Ali; Pakzad, Mehrdad

    2017-11-01

    Northwestern Iran is one of the most seismically active regions in the world, with a high seismic risk. This area is part of a complex tectonic system arising from the interaction between Arabia, Anatolia and Eurasia. The purpose of this study is to deduce the stress regimes in northwestern Iran and surrounding regions from stress inversion of earthquake focal mechanisms. We compile 92 focal mechanisms from the Global CMT catalogue and other sources, and also determine the focal mechanisms of 14 earthquakes by applying moment tensor inversion. We divide the studied region into 9 zones using similarity of the horizontal GPS velocities and existing focal mechanisms. We implement two stress inversion methods, the Multiple Inverse Method and the Iterative Joint Inversion Method, which provide comparable results in terms of the orientations of the maximum horizontal stress axes SHmax; the agreement between the two methods gives us more confidence in the interpretations. Following the independent-focal-mechanisms hypothesis - that an inversion should include only events far enough apart that no previous event perturbs the stress field near the earthquake under consideration - we consider zones of exclusion surrounding all the earthquakes. The effect of this exclusion is significant only for eastern Anatolia, where the stress regime changes from oblique to strike-slip faulting. In eastern Anatolia, the direction of maximum horizontal stress is nearly north-south; it rotates to east-west in the Talesh region. Errors of σ1 are lower in all zones compared with those of σ2 and σ3, and there is a trade-off between data resolution and the covariance of the model. The results substantiate strike-slip and thrust-faulting stress regimes in the northwest of Iran.

  2. Post-earthquake building safety inspection: Lessons from the Canterbury, New Zealand, earthquakes

    Science.gov (United States)

    Marshall, J.; Jaiswal, Kishor; Gould, N.; Turner, F.; Lizundia, B.; Barnes, J.

    2013-01-01

    The authors discuss some of the unique aspects and lessons of the New Zealand post-earthquake building safety inspection program that was implemented following the Canterbury earthquake sequence of 2010–2011. The post-event safety assessment program was one of the largest and longest programs undertaken in recent times anywhere in the world. The effort engaged hundreds of engineering professionals throughout the country, and also sought expertise from outside, to perform post-earthquake structural safety inspections of more than 100,000 buildings in the city of Christchurch and the surrounding suburbs. While the building safety inspection procedure implemented was analogous to the ATC 20 program in the United States, many modifications were proposed and implemented in order to assess the large number of buildings that were subjected to strong and variable shaking during a period of two years. This note discusses some of the key aspects of the post-earthquake building safety inspection program and summarizes important lessons that can improve future earthquake response.

  3. Influence of Source Credibility on Consumer Acceptance of Genetically Modified Foods in China

    Directory of Open Access Journals (Sweden)

    Mingyang Zhang

    2016-09-01

    Full Text Available This paper examines the reasoning mechanism behind consumer acceptance of genetically modified foods (GMFs) in China, and investigates the influence of source credibility on consumer acceptance of GMFs. Based on the original Persuasion Model - developed by Carl Hovland, an American psychologist and pioneer in the study of communication and its effect on attitudes and beliefs - we conducted a survey using multistage sampling of 1167 urban residents, proportionally selected from six cities in three economic regions (south, central, and north) of Jiangsu province, through face-to-face interviews. Mixed-process regression, which can correct for endogeneity, and an ordered probit model were used to test the impact of source credibility on consumers' acceptance of GMFs. Our major finding was that consumer acceptance of GMFs is affected by such factors as information-source credibility, general attitudes, gender, and education level. The credibility of biotechnology research institutes, government offices devoted to the management of GM organisms (GMOs), and GMO technology experts increased urban consumer acceptance of GM soybean oil, whereas acceptance decreased with greater faith in environmental organizations. We also found that ignoring the endogeneity of the above-mentioned sources significantly undervalues their effect on consumers' acceptance. Moreover, the remaining three sources (non-GMO experts, food companies, and anonymous information found on the Internet) had almost no effect on consumer acceptance. Surprisingly, the more educated people in our survey were more skeptical towards GMFs. Our results contribute to the behavioral literature on consumer attitudes toward GMFs by developing a reasoning mechanism determining consumer acceptance of GMFs. In particular, this paper quantitatively studied the influence of different source credibility on consumer acceptance of GMFs by using mixed-process regression to

  4. Countermeasures to earthquakes in nuclear plants

    International Nuclear Information System (INIS)

    Sato, Kazuhide

    1979-01-01

    The contribution of atomic energy to mankind is immeasurable, but the hazard posed by radioactivity is unique. Therefore, safety has been regarded as paramount in the design of nuclear power plants, and in Japan, where earthquakes occur frequently, countermeasures to earthquakes are naturally incorporated in the safety review. The radioactive substances handled in nuclear power stations and spent fuel reprocessing plants are briefly explained. The occurrence of earthquakes cannot be predicted effectively, and the damage due to earthquakes can be remarkably large. In nuclear plants, the prevention of damage to the facilities and the maintenance of their functions are required at the time of earthquakes. Regarding the siting of nuclear plants, the history of earthquakes, the possible magnitude of earthquakes, the properties of the ground and the position of nuclear plants should be examined. After the place of installation has been decided, the design-basis earthquake is selected by evaluating active faults and determining the standard earthquakes. As the fundamentals of aseismic design, the classification according to importance, the design earthquakes corresponding to the classes of importance, the combination of loads, and allowable stresses are explained. (Kako, I.)

  5. Space-borne Observations of Atmospheric Pre-Earthquake Signals in Seismically Active Areas: Case Study for Greece 2008-2009

    Science.gov (United States)

    Ouzounov, D. P.; Pulinets, S. A.; Davidenko, D. A.; Kafatos, M.; Taylor, P. T.

    2013-01-01

    We are conducting theoretical studies and practical validation of atmosphere/ionosphere phenomena preceding major earthquakes. Our approach is based on monitoring two physical parameters from space: outgoing long-wavelength radiation (OLR) at the top of the atmosphere, and electron density variations in the ionosphere via GPS Total Electron Content (GPS/TEC). We retrospectively analyzed the temporal and spatial variations of the OLR and GPS/TEC parameters characterizing the state of the atmosphere and ionosphere several days before four major earthquakes (M>6) in Greece during 2008-2009: M6.9 of 02.12.08, M6.2 of 02.20.08, M6.4 of 06.08.08, and M6.4 of 07.01.09. We found anomalous behavior before all of these events (over land and sea) over regions of maximum stress. We expect that our analysis will reveal the underlying physics of pre-earthquake signals associated with some of the largest earthquakes in Greece.
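The OLR monitoring described above rests on flagging departures from a climatological baseline. A minimal sketch, assuming a simple standardized anomaly against the same calendar day in reference years; the authors' actual index is not given in the abstract, and the function name and numbers below are illustrative:

```python
def anomaly_index(current, reference_years):
    """Standardized anomaly of a daily value against the same calendar
    day across reference years: (x - mean) / sample standard deviation."""
    n = len(reference_years)
    mean = sum(reference_years) / n
    var = sum((x - mean) ** 2 for x in reference_years) / (n - 1)
    return (current - mean) / var ** 0.5

# Hypothetical top-of-atmosphere OLR values (W/m^2) for one grid cell;
# a strongly positive index would mark the kind of pre-earthquake
# thermal anomaly searched for in the study.
print(anomaly_index(285.0, [270.0, 268.5, 272.0, 269.0, 271.5]))
```

In practice such an index would be computed per grid cell and per day, and only sustained, spatially coherent positive excursions near active faults would be treated as candidate precursors.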

  6. Public Affairs: An Operational Planning Function to Safeguard Credibility and Public Opinion

    National Research Council Canada - National Science Library

    Cutler, Dawn

    2004-01-01

    ... context throughout all phases of conflict. Although the target audiences may differ when a commander is considering message dissemination through either the PA or IC channel, the consistency of the messages is important to credibility...

  7. Do earthquakes exhibit self-organized criticality?

    International Nuclear Information System (INIS)

    Yang Xiaosong; Ma Jin; Du Shuming

    2004-01-01

    If earthquakes are phenomena of self-organized criticality (SOC), the statistical characteristics of an earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely to be phenomena of SOC, because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) changes appreciably after the time series is rearranged. This suggests that SOC theory should not be used to oppose efforts at earthquake prediction.
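The rearrangement test described above can be sketched as follows: compute the first-return times of above-threshold events, randomly rearrange the catalog, and compare the two distributions. This is a toy illustration on synthetic data (the catalog, threshold, and seed are made up), not the authors' analysis of the Southern California catalog:

```python
import random

def first_return_times(times, mags, m_thresh):
    """Waiting times between successive events with magnitude >= m_thresh."""
    t = [ti for ti, m in zip(times, mags) if m >= m_thresh]
    return [b - a for a, b in zip(t, t[1:])]

random.seed(0)
# Synthetic stand-in for an earthquake catalog: sorted occurrence times
# and magnitudes with a Gutenberg-Richter-like exponential tail.
times = sorted(random.uniform(0.0, 1000.0) for _ in range(500))
mags = [2.0 + random.expovariate(2.0) for _ in range(500)]

original = first_return_times(times, mags, 3.0)

# Randomly rearrange which magnitude occurs at which time, then recompute.
shuffled = mags[:]
random.shuffle(shuffled)
rearranged = first_return_times(times, shuffled, 3.0)

# Under the SOC null hypothesis the two first-return-time distributions
# should look statistically alike; the Letter reports that for Southern
# California they do not.
print(sum(original) / len(original), sum(rearranged) / len(rearranged))
```

A real test would compare the full distributions (e.g. with a Kolmogorov-Smirnov statistic) rather than just the means printed here.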

  8. Earthquake precursors: spatial-temporal gravity changes before the great earthquakes in the Sichuan-Yunnan area

    Science.gov (United States)

    Zhu, Yi-Qing; Liang, Wei-Feng; Zhang, Song

    2018-01-01

    Using multiple-scale mobile gravity data in the Sichuan-Yunnan area, we systematically analyzed the relationships between spatial-temporal gravity changes and the 2014 Ludian, Yunnan Province Ms6.5 earthquake and the 2014 Kangding Ms6.3, 2013 Lushan Ms7.0, and 2008 Wenchuan Ms8.0 earthquakes in Sichuan Province. Our main results are as follows. (1) Before the occurrence of large earthquakes, gravity anomalies occur in a large area around the epicenters. The directions of gravity change gradient belts usually agree roughly with the directions of the main fault zones of the study area. Such gravity changes might reflect the increase of crustal stress, as well as the significant active tectonic movements and surface deformations along fault zones, during the period of gestation of great earthquakes. (2) Continuous significant changes of the multiple-scale gravity fields, as well as greater gravity changes with larger time scales, can be regarded as medium-range precursors of large earthquakes. The subsequent large earthquakes always occur in the area where the gravity changes greatly. (3) The spatial-temporal gravity changes are very useful in determining the epicenter of coming large earthquakes. The large gravity networks are useful to determine the general areas of coming large earthquakes. However, the local gravity networks with high spatial-temporal resolution are suitable for determining the location of epicenters. Therefore, denser gravity observation networks are necessary for better forecasts of the epicenters of large earthquakes. (4) Using gravity changes from mobile observation data, we made medium-range forecasts of the Kangding, Ludian, Lushan, and Wenchuan earthquakes, with especially successful forecasts of the location of their epicenters. Based on the above discussions, we emphasize that medium-/long-term potential for large earthquakes might exist nowadays in some areas with significant gravity anomalies in the study region. Thus, the monitoring

  9. Overestimation of the earthquake hazard along the Himalaya: constraints in bracketing of medieval earthquakes from paleoseismic studies

    Science.gov (United States)

    Arora, Shreya; Malik, Javed N.

    2017-12-01

    The Himalaya is one of the most seismically active regions of the world. The occurrence of several large magnitude earthquakes, viz. the 1905 Kangra earthquake (Mw 7.8), the 1934 Bihar-Nepal earthquake (Mw 8.2), the 1950 Assam earthquake (Mw 8.4), the 2005 Kashmir earthquake (Mw 7.6), and the 2015 Gorkha earthquake (Mw 7.8), is testimony to ongoing tectonic activity. In the last few decades, tremendous efforts have been made along the Himalayan arc to understand the patterns of earthquake occurrence, size, extent, and return period. Some of the large magnitude earthquakes produced surface rupture, while others remained blind. Furthermore, due to the incompleteness of the earthquake catalogue, very few events can be correlated with medieval earthquakes. Based on the existing paleoseismic data, there certainly exists a complexity in precisely determining the extent of surface rupture of these earthquakes, and also of those events which occurred during historic times. In this paper, we have compiled the paleoseismological data and recalibrated the radiocarbon ages from the trenches excavated by previous workers along the entire Himalaya, and compared the earthquake scenario with the past. Our studies suggest that there were multiple earthquake events with overlapping surface ruptures in small patches, with an average rupture length of 300 km limiting Mw to 7.8-8.0 for the Himalayan arc, rather than two or three giant earthquakes rupturing the whole front. It has been identified that the large magnitude Himalayan earthquakes, such as 1905 Kangra, 1934 Bihar-Nepal, and 1950 Assam, occurred within a time frame of 45 years. If these events are dated, there is a high possibility that, within a range of ±50 years, they may be considered remnants of one giant earthquake rupturing the entire Himalayan arc, leading to an overestimation of the seismic hazard scenario in the Himalaya.
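The Mw 7.8-8.0 bracket quoted above for ~300 km ruptures is consistent with standard empirical rupture-length scaling. As an illustration (the coefficients below follow the widely used Wells and Coppersmith 1994 all-slip-type regression, which is an assumption on my part, not a method named in the abstract):

```python
import math

def mw_from_rupture_length(length_km, a=5.08, b=1.16):
    """Empirical moment magnitude from surface rupture length (km),
    Mw = a + b * log10(L); coefficients follow the commonly used
    Wells & Coppersmith (1994) all-slip-type regression."""
    return a + b * math.log10(length_km)

# An average rupture length of ~300 km, as inferred in the paper:
print(round(mw_from_rupture_length(300.0), 2))  # 7.95, within the Mw 7.8-8.0 bracket
```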

  10. Numerical simulation of co-seismic deformation of the 2011 Japan Mw9.0 earthquake

    Directory of Open Access Journals (Sweden)

    Zhang Keliang

    2011-08-01

    Full Text Available Co-seismic displacements associated with the Mw9.0 earthquake of March 11, 2011 in Japan are numerically simulated on the basis of a finite-fault dislocation model with the PSGRN/PSCMP software. Compared with the inland GPS observations, 90% of the computed eastward, northward and vertical displacements have residuals less than 0.10 m, suggesting that the simulated results can, to a certain extent, be used to demonstrate the co-seismic deformation in the near field. In this model, the maximum eastward displacement increases from 6 m along the coast to 30 m near the epicenter, where the maximum southward displacement is 13 m. The three-dimensional display shows that the vertical displacement reaches a maximum uplift of 14.3 m, which is comparable to the tsunami height in the near-trench region. The maximum subsidence is 5.3 m.
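The 90%-within-0.10-m comparison against GPS reduces to a simple residual check. A sketch with made-up (east, north, up) displacement triples, not the actual model or GPS values:

```python
def residual_fraction_within(model, obs, tol=0.10):
    """Fraction of component-wise residuals |model - obs| below tol."""
    residuals = [abs(m - o)
                 for mv, ov in zip(model, obs)
                 for m, o in zip(mv, ov)]
    return sum(r < tol for r in residuals) / len(residuals)

# Hypothetical (east, north, up) co-seismic displacements in metres:
# finite-fault model predictions vs. inland GPS observations.
model = [(0.85, -0.32, 0.10), (2.10, -0.95, -0.05), (0.40, -0.15, 0.02)]
gps = [(0.80, -0.30, 0.12), (2.05, -1.02, -0.01), (0.47, -0.11, 0.05)]

print(residual_fraction_within(model, gps))  # 1.0: all residuals < 0.10 m here
```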

  11. Effects in Morocco of the Lisboa earthquake 1 November 1755

    International Nuclear Information System (INIS)

    Levret, A.

    1988-05-01

    Within the framework of a cooperative agreement Sofratome/Office National d'Electricite of Morocco and Sofratome/Electricidade de Portugal, a study has been conducted of the effects of the November 1, 1755 Lisbon earthquake in Morocco. This event, whose effects have been described at length in Portugal, was likewise strongly felt in Morocco, especially on the Atlantic coast, which was laid waste not only through the direct agency of seismic waves but also through that of a formidable tsunami. In old texts, the descriptions of these conjugate effects have been rendered with varying degrees of overstatement. The procedure adopted in order to arrive at a precise identification of the effects and their origin, and an evaluation of intensity, involves three stages: a) an assessment of the reliability of the documents used; b) a thoroughgoing analysis of the descriptions with the object of discriminating between the direct effects of the earthquake and those ascribable to the action of the tidal wave; c) a readjustment of the intensities by analysis of the global effects of the earthquake not only in Morocco but also in Portugal and Spain, followed by a comparison of these with the well-documented effects of the recent February 28, 1969 earthquake originating at the same source. Extrapolated isoseismals for the effects in Morocco of the 1755 event derived from this study are then assigned. In the light of current knowledge concerning the historical seismicity of the Iberian-African collision zone, an outline of the maximum observed intensities is proposed.

  12. Stress triggering of the Lushan M7.0 earthquake by the Wenchuan Ms8.0 earthquake

    Directory of Open Access Journals (Sweden)

    Wu Jianchao

    2013-08-01

    Full Text Available The Wenchuan Ms8.0 earthquake and the Lushan M7.0 earthquake occurred in the north and south segments of the Longmenshan nappe tectonic belt, respectively. Based on the focal mechanism and finite fault model of the Wenchuan Ms8.0 earthquake, we calculated the Coulomb failure stress change. The inverted Coulomb stress changes based on both the Nishimura and Chenji models show that the Lushan M7.0 earthquake occurred in an area of increased Coulomb failure stress induced by the Wenchuan Ms8.0 earthquake. The Coulomb failure stress increased by approximately 0.135-0.152 bar at the source of the Lushan M7.0 earthquake, which is far more than the stress triggering threshold. Therefore, the Lushan M7.0 earthquake was most likely triggered by the Coulomb failure stress change.
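The quantity being evaluated is the Coulomb failure stress change, ΔCFS = Δτ + μ′Δσn: the shear stress change resolved in the slip direction plus an effective friction coefficient times the normal stress change (unclamping positive). A sketch with assumed inputs; the resolved stress values and μ′ = 0.4 below are illustrative, not taken from the abstract:

```python
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    """dCFS = d_tau + mu' * d_sigma_n, with the shear stress change
    resolved in the slip direction of the receiver fault and the normal
    stress change positive in extension (unclamping)."""
    return d_shear + mu_eff * d_normal

# Assumed resolved stress changes (bar) on the Lushan rupture plane;
# the abstract reports dCFS ~ 0.135-0.152 bar, above the commonly cited
# triggering threshold of ~0.1 bar.
print(coulomb_stress_change(d_shear=0.12, d_normal=0.05))  # ~0.14 bar
```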

  13. Earthquake engineering for nuclear facilities

    CERN Document Server

    Kuno, Michiya

    2017-01-01

    This book is a comprehensive compilation of earthquake- and tsunami-related technologies and knowledge for the design and construction of nuclear facilities. As such, it covers a wide range of fields including civil engineering, architecture, geotechnical engineering, mechanical engineering, and nuclear engineering, for the development of new technologies providing greater resistance against earthquakes and tsunamis. It is crucial both for students of nuclear energy courses and for young engineers in nuclear power generation industries to understand the basics and principles of earthquake- and tsunami-resistant design of nuclear facilities. In Part I, "Seismic Design of Nuclear Power Plants", the design of nuclear power plants to withstand earthquakes and tsunamis is explained, focusing on buildings, equipment, and civil engineering structures. In Part II, "Basics of Earthquake Engineering", fundamental knowledge of earthquakes and tsunamis as well as the dynamic response of structures and foundation ground...

  14. The Effects of Source Credibility in the Presence or Absence of Prior Attitudes: Implications for the Design of Persuasive Communication Campaigns.

    Science.gov (United States)

    Kumkale, G Tarcan; Albarracín, Dolores; Seignourel, Paul J

    2010-06-01

    Most theories of persuasion predict that limited ability and motivation to think about communications should increase the impact of source credibility on persuasion. Furthermore, this effect is assumed to occur, regardless of whether or not the recipients have prior attitudes. In this study, the effects of source credibility, ability, and motivation (knowledge, message repetition, relevance) on persuasion were examined meta-analytically across both attitude formation and change conditions. Findings revealed that the Source Credibility × Ability/Motivation interaction emerged only when participants lacked prior attitudes and were unable to form a new attitude based on the message content. In such settings, the effects of source credibility decayed rapidly. The implications of these findings for applied communication campaigns are discussed.

  15. The 'credibility paradox' in China's science communication: Views from scientific practitioners.

    Science.gov (United States)

    Zhang, Joy Yueyue

    2015-11-01

    In contrast to increasing debates on China's rising status as a global scientific power, issues of China's science communication remain under-explored. Based on 21 in-depth interviews in three cities, this article examines Chinese scientists' accounts of the entangled web of influence which conditions the process of how scientific knowledge achieves (or fails to achieve) its civic authority. A main finding of this study is a 'credibility paradox' as a result of the over-politicisation of science and science communication in China. Respondents report that an absence of visible institutional endorsements renders them more public credibility and better communication outcomes. Thus, instead of exploiting formal channels of science communication, scientists interviewed were more keen to act as 'informal risk communicators' in grassroots and private events. Chinese scientists' perspectives on how to earn public support of their research sheds light on the nature and impact of a 'civic epistemology' in an authoritarian state. © The Author(s) 2015.

  16. Sensing the earthquake

    Science.gov (United States)

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of scientific content by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what principles underlie that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" earthquakes. We try to implement these results in a choreographic model with the aim of converting earthquake sound into a visual dance system, which could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience in a multisensory exploration of the earthquake phenomenon, through the stimulation of hearing, eyesight and the perception of movement (the neuromotor system). In essence, the main goal of this work is to develop a method for the simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  17. Source Parameters from Full Moment Tensor Inversions of Potentially Induced Earthquakes in Western Canada

    Science.gov (United States)

    Wang, R.; Gu, Y. J.; Schultz, R.; Kim, A.; Chen, Y.

    2015-12-01

    During the past four years, the number of earthquakes with magnitudes greater than three has substantially increased in the southern section of the Western Canada Sedimentary Basin (WCSB). While some of these events are likely associated with tectonic forces, especially along the foothills of the Canadian Rockies, a significant fraction occurred in previously quiescent regions and has been linked to wastewater disposal or hydraulic fracturing. A proper assessment of the origin and source properties of these 'induced earthquakes' requires careful analysis and modeling of regional broadband data, which have steadily improved during the past 8 years due to the recent establishment of regional broadband seismic networks such as CRANE, RAVEN and TD. Several earthquakes, especially those close to fracking activities (e.g. near the town of Fox Creek, Alberta), are analyzed. Our preliminary full moment tensor inversion results show maximum horizontal compressional orientations (P-axes) along the northeast-southwest direction, which agree with the regional stress directions from borehole breakout data and the P-axes of historical events. The decomposition of these moment tensors shows evidence of strike-slip mechanisms with near-vertical fault plane solutions, comparable to the focal mechanisms of injection-induced earthquakes in Oklahoma. Minimal isotropic components have been observed, while a modest percentage of compensated-linear-vector-dipole (CLVD) components, which have been linked to fluid migration, may be required to match the waveforms. To further evaluate the non-double-couple components, we compare the outcomes of full, deviatoric and pure double-couple (DC) inversions using multiple frequency ranges and phases. Improved location and depth information from a novel grid search greatly assists the identification and classification of earthquakes in potential connection with fluid injection or extraction. Overall, a systematic comparison of the source attributes of

  18. Thoracic Injuries in earthquake-related versus non-earthquake-related trauma patients: differentiation via Multi-detector Computed Tomography

    Science.gov (United States)

    Dong, Zhi-hui; Yang, Zhi-gang; Chen, Tian-wu; Chu, Zhi-gang; Deng, Wen; Shao, Heng

    2011-01-01

    PURPOSE: Massive earthquakes are harmful to humankind. This study of a historical cohort aimed to investigate the difference between earthquake-related crush thoracic traumas and thoracic traumas unrelated to earthquakes using multi-detector computed tomography (CT). METHODS: We retrospectively compared an earthquake-exposed cohort of 215 thoracic trauma crush victims of the Sichuan earthquake to a cohort of 215 non-earthquake-related thoracic trauma patients, focusing on the lesions and coexisting injuries to the thoracic cage and the pulmonary parenchyma and pleura using multi-detector CT. RESULTS: The incidence of rib fracture was elevated in the earthquake-exposed cohort (143 vs. 66 patients in the non-earthquake-exposed cohort, risk ratio (RR) = 2.2), as was bilateral involvement of the chest (45/143 vs. 11/66 patients, RR = 1.9). Thoracic traumas resulting from the earthquake were life threatening, with a high incidence of bony thoracic fractures. The ribs were frequently involved in bilateral and severe types of fractures, which were accompanied by non-rib fractures, and pulmonary parenchymal and pleural injuries. PMID:21789386
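The reported RR of 2.2 for rib fractures follows directly from the cohort counts. A minimal check:

```python
def risk_ratio(exposed_events, exposed_total, control_events, control_total):
    """Risk ratio: incidence in the exposed cohort divided by incidence
    in the unexposed cohort."""
    return (exposed_events / exposed_total) / (control_events / control_total)

# Rib fractures: 143 of 215 earthquake-exposed vs. 66 of 215 control patients.
print(round(risk_ratio(143, 215, 66, 215), 1))  # 2.2, matching the reported RR
```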

  19. 3. General principles of assessing seismic resistance of technological equipment of nuclear power plants

    International Nuclear Information System (INIS)

    1983-01-01

    The evaluation of the seismic resistance of technological equipment is performed by computation, by experimental trial, or possibly by a combination of both methods. Existing and planned standards in the field of seismic resistance of nuclear power plants are mentioned. Accelerograms and response spectra of the design-basis earthquake and the maximum credible earthquake serve as the basic data for evaluating seismic resistance. The nuclear power plant in Mochovce will be the first Czechoslovak nuclear power plant with a so-called partial seismic design. The problem of dynamic interaction of technological equipment and nuclear power plant systems with a bearing structure is discussed. (E.F.)

  20. Consideration for standard earthquake vibration (1). The Niigataken Chuetsu-oki Earthquake in 2007

    International Nuclear Information System (INIS)

    Ishibashi, Katsuhiko

    2007-01-01

    An outline of the new guideline for the quakeproof design standard of nuclear power plants and the standard earthquake vibration is given. The improvement points of the new guideline are discussed on the basis of the Kashiwazaki-Kariwa Nuclear Power Plant incidents, and the fundamental limits of the new guideline are pointed out. The placement of the quakeproof design standard of nuclear power plants, JEAG4601 of the Japan Electric Association, the new guideline, the standard earthquake vibration of the new guideline, the Niigataken Chuetsu-oki Earthquake in 2007, and the damage to the Kashiwazaki-Kariwa Nuclear Power Plant are discussed. The safety criteria of the safety review system, organization, standards and guidelines should be improved on the basis of this earthquake and nuclear plant accident. The general principle that 'a nuclear power plant is not constructed in an area where a large earthquake is expected' has to be upheld. The precondition that nuclear power plants must not cause any damage should apply to all plants. (S.Y.)

  1. Auto Correlation Analysis of Coda Waves from Local Earthquakes for Detecting Temporal Changes in Shallow Subsurface Structures: the 2011 Tohoku-Oki, Japan Earthquake

    Science.gov (United States)

    Nakahara, Hisashi

    2015-02-01

    For monitoring temporal changes in subsurface structures, I propose to use auto-correlation functions of coda waves from local earthquakes recorded at surface receivers, which probably contain more body waves than surface waves. The use of coda waves requires earthquakes, resulting in decreased time resolution for monitoring. Nonetheless, it may be possible to monitor subsurface structures at sufficient time resolution in regions with high seismicity. Studying the 2011 Tohoku-Oki, Japan earthquake (Mw 9.0), for which velocity changes have been previously reported, I try to validate the method. KiK-net stations in northern Honshu are used in this analysis. For each moderate earthquake, normalized auto-correlation functions of surface records are stacked with respect to time windows in the S-wave coda. Aligning the stacked, normalized auto-correlation functions with time, I search for changes in phase arrival times. Phases at lag times of <1 s are studied because changes at shallow depths are the focus. Temporal variations in the arrival times are measured at the stations using the stretching method. Clear phase delays are found to be associated with the mainshock and to gradually recover with time. The phase delays are 10% on average, with a maximum of about 50% at some stations. A deconvolution analysis using surface and subsurface records at the same stations is conducted for validation. The results show that the phase delays from the deconvolution analysis are slightly smaller than those from the auto-correlation analysis, which implies that the phases on the auto-correlations are caused by larger velocity changes at shallower depths. The auto-correlation analysis seems to have an accuracy of several percent, much coarser than that of methods using earthquake doublets and borehole array data, so it might be applicable only in detecting larger changes.
In spite of these disadvantages, this analysis is still attractive because it can
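The stretching method mentioned above estimates a relative phase delay by finding the time-axis stretch that best aligns a current trace with a reference. A self-contained sketch on synthetic sine traces; the window length, grid, and the 5% synthetic delay are illustrative, not values from the study:

```python
import math

def stretch_factor(reference, current, eps_grid):
    """Grid-search the stretch eps that best maps current(t) onto
    reference, i.e. maximizes corr(reference[t], current[t*(1+eps)]);
    the relative velocity change is then dv/v = -eps."""
    n = len(reference)
    emax = max(abs(e) for e in eps_grid)
    m = int((n - 1) / (1 + emax))  # samples in range for every trial stretch

    def interp(trace, x):  # linear interpolation at fractional index x
        i = int(x)
        frac = x - i
        return trace[i] * (1 - frac) + trace[i + 1] * frac

    def corr(a, b):  # Pearson correlation coefficient
        k = len(a)
        ma, mb = sum(a) / k, sum(b) / k
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        va = math.sqrt(sum((x - ma) ** 2 for x in a))
        vb = math.sqrt(sum((y - mb) ** 2 for y in b))
        return cov / (va * vb)

    return max(eps_grid,
               key=lambda e: corr(reference[:m],
                                  [interp(current, t * (1 + e)) for t in range(m)]))

# Synthetic check: the "current" trace is the reference slowed down by 5%,
# mimicking a phase delay caused by a co-seismic velocity drop.
ref = [math.sin(2 * math.pi * 5 * t / 200) for t in range(200)]
cur = [math.sin(2 * math.pi * 5 * t / (200 * 1.05)) for t in range(200)]
grid = [i / 100 for i in range(-10, 11)]
print(stretch_factor(ref, cur, grid))
```

The same grid search applied to successive stacked auto-correlations, rather than synthetic sines, yields the phase-delay time series discussed in the abstract.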

  2. Earthquake Emergency Education in Dushanbe, Tajikistan

    Science.gov (United States)

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  3. Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia

    Science.gov (United States)

    Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

    2014-05-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic risks are increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of mega cities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructure, and undertaken with insufficient knowledge of the regional seismicity peculiarities and seismic hazard. The assessment of seismic hazard and the generation of earthquake scenarios are the first link in the prevention chain and the first step in the evaluation of seismic risk. Earthquake scenarios are intended as a basic input for developing detailed earthquake damage scenarios for cities and can be used in earthquake-safe town and infrastructure planning. The city of Sofia is the capital of Bulgaria. It is situated in the centre of the Sofia area, which is the most populated (more than 1.2 million inhabitants), industrial and cultural region of Bulgaria, and faces considerable earthquake risk. The available historical documents prove the occurrence of destructive earthquakes during the 15th-18th centuries in the Sofia zone. In the 19th century the city of Sofia experienced two strong earthquakes: the 1818 earthquake with epicentral intensity I0=8-9 MSK and the 1858 earthquake with I0=9-10 MSK. During the 20th century the strongest event to occur in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK). Almost a century later (95 years), an earthquake of moment magnitude 5.6 (I0=7-8 MSK) hit the city of Sofia, on May 22nd, 2012. In the present study, the deterministic scenario event considered is a damaging earthquake with a higher probability of occurrence that could affect the city with intensity less than or equal to VIII

  4. Earthquake recurrence models fail when earthquakes fail to reset the stress field

    Science.gov (United States)

    Tormann, Thessa; Wiemer, Stefan; Hardebeck, Jeanne L.

    2012-01-01

    Parkfield's regularly occurring M6 mainshocks, about every 25 years, have for over two decades stoked seismologists' hopes of successfully predicting an earthquake of significant size. However, with the longest known inter-event time of 38 years, the latest M6 in the series (28 Sep 2004) did not conform to any of the applied forecast models, questioning once more the predictability of earthquakes in general. Our study investigates the spatial pattern of b-values along the Parkfield segment through the seismic cycle and documents a stably stressed structure. The forecast rate of M6 earthquakes based on Parkfield's microseismicity b-values corresponds well to observed rates. We interpret the observed b-value stability in terms of the evolution of the stress field in that area: the M6 Parkfield earthquakes do not fully unload the stress on the fault, explaining why time-recurrent models fail. We present the 1989 M6.9 Loma Prieta earthquake as a counter-example, which did release a significant portion of the stress along its fault segment and yields a substantial change in b-values.
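The forecast logic above, estimating b from microseismicity and extrapolating the Gutenberg-Richter relation to M6 rates, can be sketched as follows. The estimator is the standard Aki/Utsu maximum-likelihood form; the catalog is made up, and none of the numbers come from the Parkfield study:

```python
import math
import random

def b_value_mle(mags, m_c, dm=0.1):
    """Aki/Utsu maximum-likelihood b-value for events with M >= m_c,
    with magnitudes binned at dm."""
    m = [x for x in mags if x >= m_c]
    return math.log10(math.e) / (sum(m) / len(m) - (m_c - dm / 2))

def annual_rate_above(mags, years, m_target, m_c, dm=0.1):
    """Extrapolate the Gutenberg-Richter relation log10 N = a - b*M from
    the observed rate at m_c up to a target magnitude."""
    b = b_value_mle(mags, m_c, dm)
    n_c = sum(x >= m_c for x in mags) / years
    return n_c * 10 ** (-b * (m_target - m_c))

# Hypothetical microseismicity catalog: 1000 events in 20 years, b near 1.
random.seed(1)
mags = [1.0 + random.expovariate(math.log(10)) for _ in range(1000)]
print(b_value_mle(mags, m_c=1.0))
print(annual_rate_above(mags, years=20.0, m_target=6.0, m_c=1.0))
```

The second printed number, inverted, gives a recurrence time for M≥6 events; the paper's point is that such b-value-based rates matched Parkfield observations better than time-recurrent models did.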

  5. Establishing the credibility of archaeoastronomical sites

    Science.gov (United States)

    Ruggles, Clive L. N.

    2015-08-01

    This is not a talk about archaeoastronomy per se, but rather about how the Astronomy and World Heritage Initiative helps us deal with archaeoastronomical sites as potential World Heritage.In 2011, an attempt to nominate a prehistoric “observatory” site onto the World Heritage List proved unsuccessful because UNESCO rejected the interpretation as statistically and archaeologically unproven. The case highlights an issue at the heart of archaeoastronomical methodology and interpretation: the mere existence of astronomical alignments in ancient sites does not prove that they were important to those who constructed and used the sites, let alone giving us insights into their likely significance and meaning. Advances in archaeoastronomy over several decades have resulted in the development of a substantial body of theory and practice (Ruggles 2014) where the most favoured interpretations depend upon integrating methods from astronomy, anthropology and other disciplines, but individual cases can still engender considerable controversy.The fact that more archaeoastronomical sites are now appearing on national tentative lists prior to their WHL nomination means that this is no longer just an academic issue; establishing the credibility of the archaeoastronomical interpretations is crucial to any assessment of their value in heritage terms.In this talk I shall describe progress that has been made within the Astronomy and World Heritage Initiative towards establishing broadly acceptable measures of archaeoastronomical credibility that make sense in the context of the heritage evaluation process. I will focus particularly, but not exclusively, on sites that are included in the Thematic Studies and/or are already included on national Tentative Lists, such as the Portuguese/Spanish seven-stone antas (Neolithic dolmens) and Chankillo in Peru (solar observation device dating to c. 300BC). I will also mention how the recognition of astronomical attributes of potential

  6. Earthquake Damage Assessment Using Objective Image Segmentation: A Case Study of 2010 Haiti Earthquake

    Science.gov (United States)

    Oommen, Thomas; Rebbapragada, Umaa; Cerminaro, Daniel

    2012-01-01

    In this study, we perform a case study on imagery from the Haiti earthquake that evaluates a novel object-based approach for characterizing earthquake induced surface effects of liquefaction against a traditional pixel based change technique. Our technique, which combines object-oriented change detection with discriminant/categorical functions, shows the power of distinguishing earthquake-induced surface effects from changes in buildings using the object properties concavity, convexity, orthogonality and rectangularity. Our results suggest that object-based analysis holds promise in automatically extracting earthquake-induced damages from high-resolution aerial/satellite imagery.

  7. Rupture, waves and earthquakes.

    Science.gov (United States)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation caused by rupture (fracture) of the solid Earth. However, the physics of the dynamic process around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not yet been fully understood. Instead, much former investigation in seismology evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain with this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake. More detailed study of rupture and wave dynamics, namely the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction connecting rupture (1) and failures (2), would therefore be indispensable.

  8. Instructor Credibility across Disciplines: Identifying Students' Differentiated Expectations of Instructor Behaviors

    Science.gov (United States)

    Obermiller, Carl; Ruppert, Bryan; Atwood, April

    2012-01-01

    Business communication instructors can face a unique set of challenges to maintain their credibility with students. Communication plays an important role in the instructor-student relationship, and students judge instructors' ability to teach communication based on their ability to practice what they teach. The authors' empirical study shows that…

  9. The Effects of Teacher Self-Disclosure via "Facebook" on Teacher Credibility

    Science.gov (United States)

    Mazer, Joseph P.; Murphy, Richard E.; Simonds, Cheri J.

    2009-01-01

    Research suggests that teachers who personalize their teaching through the use of humor, stories, enthusiasm, and self-disclosure are perceived by their students to be effective in explaining course content. This experimental study examined the effects of computer-mediated teacher self-disclosure on perceptions of teacher credibility. Participants…

  10. Twitter Use and Its Effects on Student Perception of Instructor Credibility

    Science.gov (United States)

    DeGroot, Jocelyn M.; Young, Valerie J.; VanSlette, Sarah H.

    2015-01-01

This study investigates college student perceptions of instructor credibility based on the content of an instructor's Twitter feed and student beliefs about Twitter as a communication tool. Quantitative and qualitative methods were utilized to explore the effects of three manipulated Twitter feeds (e.g., tweeting social topics, professional topics,…

  11. The CATDAT damaging earthquakes database

    Science.gov (United States)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. In the authors' view, the lack of consistency and the errors in other frequently cited and widely used earthquake loss databases were major shortcomings that needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto earthquake (214 billion USD damage; 2011 HNDECI-adjusted dollars), compared with the 2011 Tohoku (>300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes, shows the increasing concern for economic loss in urban areas, and the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  12. The CATDAT damaging earthquakes database

    Directory of Open Access Journals (Sweden)

    J. E. Daniell

    2011-08-01

Full Text Available The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes.

In the authors' view, the lack of consistency and the errors in other frequently cited and widely used earthquake loss databases were major shortcomings that needed to be improved upon.

Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured).

Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto ($214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>$300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons.

This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  13. Towards Credible City Branding Practices: How Do Iran’s Largest Cities Face Ecological Modernization?

    Directory of Open Access Journals (Sweden)

    Negar Noori

    2018-04-01

Full Text Available City branding is not only increasingly practiced in cities in established economies, but also among municipal governments in countries, until quite recently, rather closed off from the outside world. One country with a strong drive to engage in urban (re)development in the post-oil era through enhancing its ‘ecological modernization’ is Iran. Megacities in Iran have all begun to venture into making profiles of what they think they are or would like to be. However, some of the adopted city branding strategies lack sophistication. In this article, the authors examine what indicators can be used for evaluating the credibility of city brands and apply these to Iran’s 15 megacities. After offering brief descriptions of the generic features of each of these cities, they map their use of city brand identities and popular city labels related to ecological modernization and analyze the credibility of their city branding practices. Based on their findings, the authors distinguish five types of cities and explain what makes some types more credible in their use of brands than others. Generally speaking, compared to cities in other nations, Iranian cities pay special attention to historical, natural, cultural, and religious aspects.

  14. Numerical simulation of faulting in the Sunda Trench shows that seamounts may generate megathrust earthquakes

    Science.gov (United States)

    Jiao, L.; Chan, C. H.; Tapponnier, P.

    2017-12-01

The role of seamounts in generating earthquakes has been debated: some studies suggest that seamounts could be truncated to generate megathrust events, while others indicate that subducting seamounts could lead to segmentation and thereby reduce the maximum size of megathrust earthquakes. The debate is highly relevant for the seamounts discovered along the Mentawai patch of the Sunda Trench, where previous studies have suggested that a megathrust earthquake will likely occur within decades. In order to model the dynamic behavior of the Mentawai patch, we simulated forearc faulting caused by seamount subduction using the Discrete Element Method. Our models show that rupture behavior in the subduction system is dominated by the stiffness of the overriding plate. When stiffness is low, a seamount can act as a barrier to rupture propagation, resulting in several smaller (M≤8.0) events. If, however, stiffness is high, a seamount can cause a megathrust earthquake (M8 class). In addition, we show that a splay fault in the subduction environment can only develop when a seamount is present, and a larger offset along the splay fault is expected when the stiffness of the overriding plate is higher. Our dynamic models are not only consistent with previous findings from seismic profiles and earthquake activity, but also better constrain the rupture behavior of the Mentawai patch, thus contributing to subsequent seismic hazard assessment.

  15. Comparison of aftershock sequences between 1975 Haicheng earthquake and 1976 Tangshan earthquake

    Science.gov (United States)

    Liu, B.

    2017-12-01

The 1975 ML 7.3 Haicheng earthquake and the 1976 ML 7.8 Tangshan earthquake occurred in the same tectonic unit. There are significant differences in the spatial-temporal distribution, number of aftershocks and time duration of the aftershock sequences that followed these two main shocks. As is well known, aftershocks can be triggered by the change in regional seismicity derived from the main shock, which is caused by the Coulomb stress perturbation. Based on the rate- and state-dependent friction law, we quantitatively estimated the possible aftershock time duration using a combination of seismicity data, and compared the results from different approaches. The results indicate that the aftershock time duration of the Tangshan main shock is several times that of the Haicheng main shock. This can be explained by the significant relationship between aftershock time duration and earthquake nucleation history, normal stress and the shear stress loading rate on the fault. In fact, the obvious difference in earthquake nucleation history between these two main shocks is the foreshocks: the 1975 Haicheng earthquake had clear and long-lasting foreshocks, while the 1976 Tangshan earthquake did not. Abundant foreshocks may indicate a long and active nucleation process that may have changed (weakened) the rocks in the source region, so such events should have shorter aftershock sequences because stress in weak rocks decays faster.
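
A minimal sketch of the kind of rate-and-state estimate the abstract refers to is Dieterich's (1994) relation for characteristic aftershock duration, t_a ≈ Aσ/τ̇. The function and the parameter values below are illustrative assumptions, not the values or the exact procedure used in the study:

```python
# Sketch of the rate-and-state aftershock-duration relation
# t_a ~ A*sigma / tau_dot (Dieterich, 1994). Parameter values below are
# illustrative assumptions, not values from the study.

def aftershock_duration_years(a_sigma_mpa, stressing_rate_mpa_per_yr):
    """Characteristic aftershock duration t_a = A*sigma / tau_dot.

    a_sigma_mpa: constitutive parameter A times effective normal stress (MPa).
    stressing_rate_mpa_per_yr: background shear stressing rate (MPa/yr).
    """
    return a_sigma_mpa / stressing_rate_mpa_per_yr

# A fault loaded more slowly (lower tau_dot) keeps its aftershock
# sequence alive longer, consistent with the comparison in the abstract:
t_slow_loading = aftershock_duration_years(0.2, 0.02)  # ~10 yr
t_fast_loading = aftershock_duration_years(0.2, 0.10)  # ~2 yr
```

Under this relation, the factor-of-several difference in duration between the two sequences maps directly onto differences in stressing rate and fault-zone strength.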

  16. Ionospheric phenomena before strong earthquakes

    Directory of Open Access Journals (Sweden)

    A. S. Silina

    2001-01-01

Full Text Available A statistical analysis of several ionospheric parameters before earthquakes with magnitude M > 5.5 located less than 500 km from an ionospheric vertical sounding station is performed. Ionospheric effects preceding "deep" (depth h > 33 km) and "crust" (h < 33 km) earthquakes were analysed separately. Data of nighttime measurements of the critical frequencies foF2 and foEs, the frequency fbEs and Es-spread at the middle latitude station Dushanbe were used. The frequencies foF2 and fbEs are proportional to the square root of the ionization density at heights of 300 km and 100 km, respectively. It is shown that two days before the earthquakes the values of foF2 averaged over the morning hours (00:00 LT–06:00 LT) and of fbEs averaged over the nighttime hours (18:00 LT–06:00 LT) decrease; the effect is stronger for the "deep" earthquakes. Analysing the coefficient of semitransparency which characterizes the degree of small-scale turbulence, it was shown that this value increases 1–4 days before "crust" earthquakes, and it does not change before "deep" earthquakes. Studying Es-spread which manifests itself as diffuse Es track on ionograms and characterizes the degree of large-scale turbulence, it was found that the number of Es-spread observations increases 1–3 days before the earthquakes; for "deep" earthquakes the effect is more intensive. Thus it may be concluded that different mechanisms of energy transfer from the region of earthquake preparation to the ionosphere occur for "deep" and "crust" events.

  17. Trading Time with Space - Development of subduction zone parameter database for a maximum magnitude correlation assessment

    Science.gov (United States)

    Schaefer, Andreas; Wenzel, Friedemann

    2017-04-01

Subduction zones are generally the sources of the earthquakes with the highest magnitudes. Not only in Japan and Chile, but also in Pakistan, the Solomon Islands and the Lesser Antilles, subduction zones pose a significant hazard to people. To understand the behavior of subduction zones, and especially to identify their capability to produce maximum magnitude earthquakes, various physical models have been developed, leading to a large number of datasets, e.g. from geodesy, geomagnetics, structural geology, etc. There have been various studies that utilize these data to compile a database of subduction zone parameters, but they mostly concentrate on only the major zones. Here, we compile the largest dataset of subduction zone parameters, both in parameter diversity and in the number of subduction zones considered. In total, more than 70 individual sources have been assessed, the aforementioned parametric data have been combined with seismological data, and many more sources have been compiled, leading to more than 60 individual parameters. Not all parameters have been resolved for each zone, since completeness depends on the data availability and quality for each source. In addition, the 3D down-dip geometry of a majority of the subduction zones has been resolved using historical earthquake hypocenter data and centroid moment tensors where available, and additionally compared and verified against results from previous studies. With such a database, a statistical study has been undertaken to identify not only correlations between those parameters, to derive a parameter-driven way of identifying the potential for maximum possible magnitudes, but also similarities between the sources themselves. This identification of similarities leads to a classification system for subduction zones: if two sources share enough common characteristics, other characteristics of interest may be expected to be similar as well. This concept

  18. Why Follow the Leader? Collective Action, Credible Commitment and Conflict

    OpenAIRE

    Keefer, Philip

    2012-01-01

    Most analyses of conflict assume that conflicting groups act in a unitary fashion. This assumption is often violated: to reduce their risk of replacement, group leaders prevent both group members and soldiers from acting collectively, making it difficult for leaders to make credible commitments to them. Lifting the assumption that groups are unitary shifts the analysis of a wide range of c...

  19. Supply Chain Risk Management: An Introduction to the Credible Threat

    Science.gov (United States)

    2016-08-01

Heath Ferry and Van Poindexter, Defense AT&L, July-August 2016. This article examines the elements of supply chain risk management, the national security risks associated with exploitation, and

  20. Tradable Earthquake Certificates

    NARCIS (Netherlands)

    Woerdman, Edwin; Dulleman, Minne

    2018-01-01

    This article presents a market-based idea to compensate for earthquake damage caused by the extraction of natural gas and applies it to the case of Groningen in the Netherlands. Earthquake certificates give homeowners a right to yearly compensation for both property damage and degradation of living

  1. LASSCI2009.2: layered earthquake rupture forecast model for central Italy, submitted to the CSEP project

    Directory of Open Access Journals (Sweden)

    Francesco Visini

    2010-11-01

Full Text Available The Collaboratory for the Study of Earthquake Predictability (CSEP) selected Italy as a testing region for probabilistic earthquake forecast models in October, 2008. The model we have submitted for the two medium-term forecast periods of 5 and 10 years (from 2009) is a time-dependent, geologically based earthquake rupture forecast that is defined for central Italy only (11-15˚ E; 41-45˚ N). The model took into account three separate layers of seismogenic sources: background seismicity; seismotectonic provinces; and individual faults that can produce major earthquakes (seismogenic boxes). For CSEP testing purposes, the background seismicity layer covered a range of magnitudes from 5.0 to 5.3, and the seismicity rates were obtained by truncated Gutenberg-Richter relationships for cells centered on the CSEP grid. The seismotectonic provinces layer then returned the expected rates of medium-to-large earthquakes following a traditional Cornell-type approach. Finally, for the seismogenic boxes layer, the rates were based on the geometry and kinematics of the faults, to which different earthquake recurrence models have been assigned, ranging from pure Gutenberg-Richter behavior to characteristic events, with the intermediate behavior named the hybrid model. The results for different magnitude ranges highlight the contribution of each of the three layers to the total computation. The expected rates for M >6.0 on April 1, 2009 (thus computed before the L'Aquila, 2009, MW = 6.3 earthquake) are of particular interest. They showed local maxima in the two seismogenic-box sources of Paganica and Sulmona, one of which was activated by the L'Aquila earthquake of April 6, 2009. Earthquake rates as of August 1, 2009 (now under test) also showed a maximum close to the Sulmona source for MW ~6.5; significant seismicity rates (10^-4 to 10^-3 in 5 years) for destructive events (magnitude up to 7.0) were located in other individual sources identified as being capable of such
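
The background-seismicity layer described above converts a truncated Gutenberg-Richter relation into per-cell rates for a magnitude bin. A generic sketch of that conversion is shown below; the a, b and m_max values are illustrative assumptions, not the model's calibrated parameters:

```python
# Sketch: expected annual rate of events in a magnitude bin from a
# Gutenberg-Richter relation log10 N(>=m) = a - b*m, truncated at m_max.
# The a, b, m_max values are illustrative, not those of LASSCI2009.2.

def gr_bin_rate(a, b, m_lo, m_hi, m_max):
    """Annual rate of events with m_lo <= M < m_hi under truncation at m_max."""
    def n_at_least(m):
        # Cumulative annual rate of events with magnitude >= m, zero above m_max.
        return 10.0 ** (a - b * m) if m < m_max else 0.0
    return n_at_least(m_lo) - n_at_least(min(m_hi, m_max))

# Background-seismicity-style bin, M 5.0-5.3, for a single grid cell:
rate = gr_bin_rate(a=3.5, b=1.0, m_lo=5.0, m_hi=5.3, m_max=7.0)
```

Summing such bin rates over the CSEP grid cells, magnitude bin by magnitude bin, is how a layer of this kind contributes to the total forecast.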

  2. Influence of User Ratings, Expert Ratings and Purposes of Information Use on the Credibility Judgments of College Students

    Science.gov (United States)

    Lim, Sook; Steffel, Nick

    2015-01-01

    Introduction: This study examined whether user ratings, expert ratings and the purpose of the use of a book on a user-generated site influenced the credibility of the book. It also examined whether the effects of user ratings and expert ratings on credibility judgments of the book varied according to the purpose of information use. In addition,…

  3. The Emotional Moves of a Rational Actor: Smiles, Scowls, and Other Credible Messages

    Directory of Open Access Journals (Sweden)

    Lawrence Ian Reed

    2017-03-01

    Full Text Available Many scholars turn to emotions to understand irrational behavior. We do the opposite: we turn to rationality and game theory to understand people’s emotions. We discuss a striking theory of emotions that began with the game theory of credible threats and promises, then was enriched by evolutionary biology and psychology, and now is being tested in psychological experiments. We review some of these experiments which use economic games to set up strategic situations with real payoffs. The experiments test whether a player’s emotional expressions lend credibility to promises, threats, and claims of danger or hardship. The results offer insights into the hidden strategies behind a warm smile, an angry scowl, a look of terror, and eyes of despair.

  4. Workshop on establishing institutional credibility for SEAB Task Force on Radioactive Waste Management

    International Nuclear Information System (INIS)

    1994-01-01

    At the request of the Secretary of Energy Advisory Board's Task Force on Civilian Radioactive Waste Management, the National Research Council sponsored a workshop on Establishing Institutional Credibility. The purpose of the workshop was to (1) identify the range of available knowledge regarding the theoretical and conceptual issues of how institutions establish their credibility and legitimacy with key constituents, and (2) to help explore and clarify fundamental concepts in management theory related to these issues. The examination was to include what is known about how organizations establish, maintain, lose, and regain public trust and confidence. There was to be no attempt to develop consensus on these issues or to suggest particular courses of action. The workshop was held on October 24-25, 1991, in Denver, Colorado

  5. What Can Sounds Tell Us About Earthquake Interactions?

    Science.gov (United States)

    Aiken, C.; Peng, Z.

    2012-12-01

It is important not only for seismologists but also for educators to effectively convey information about earthquakes and the influence earthquakes can have on each other. Recent studies using auditory display [e.g. Kilb et al., 2012; Peng et al., 2012] have depicted catastrophic earthquakes and the effects large earthquakes can have on other parts of the world. Auditory display of earthquakes, which combines static images with time-compressed sound of recorded seismic data, is a new approach to disseminating information to a general audience about earthquakes and earthquake interactions. Earthquake interactions are important for understanding the underlying physics of earthquakes and of other seismic phenomena such as tremor, in addition to their source characteristics (e.g. frequency content, amplitudes). Earthquake interactions include, for example, a large, shallow earthquake followed by increased seismicity around the mainshock rupture (i.e. aftershocks), or even a large earthquake triggering earthquakes or tremors several hundreds to thousands of kilometers away [Hill and Prejean, 2007; Peng and Gomberg, 2010]. We use standard tools like MATLAB, QuickTime Pro, and Python to produce animations that illustrate earthquake interactions. Our efforts are focused on producing animations that depict cross-section (side) views of tremors triggered along the San Andreas Fault by distant earthquakes, as well as map (bird's-eye) views of mainshock-aftershock sequences such as the 2011/08/23 Mw 5.8 Virginia earthquake sequence. These examples of earthquake interactions include sonifying earthquake and tremor catalogs as musical notes (e.g. piano keys) as well as audifying seismic data using time compression. Our overall goal is to use auditory display to invigorate a general interest in earthquake seismology that leads to an understanding of how earthquakes occur, how earthquakes influence one another as well as tremors, and what the musical properties of these

  6. Comparison of the November 2002 Denali and November 2001 Kunlun Earthquakes

    Science.gov (United States)

    Bufe, C. G.

    2002-12-01

    Major earthquakes occurred in Tibet on the central Kunlun fault (M 7.8) on November 14, 2001 (Lin and others, 2002) and in Alaska on the central Denali fault (M 7.9) on November 3, 2002. Both earthquakes generated large surface waves (Kunlun Ms 8.0 (USGS) and Denali Ms 8.5). Each event occurred on east-west-trending strike-slip faults and exhibited nearly unilateral rupture propagating several hundred kilometers from west to east. Surface rupture length estimates were about 400 km for Kunlun, 300 km for Denali. Maximum surface faulting and moment release were observed far to the east of the points of rupture initiation. Harvard moment centroids were located east of USGS epicenters by 182 km (Kunlun) and by 126 km (Denali). Maximum surface faulting was observed near 240 km (Kunlun, 16 m left lateral) and near 175 km (Denali, 9 m right lateral) east of the USGS epicenters. Significant thrust components were observed in the initiation of the Denali event (ERI analysis and mapped thrust) and in the termination of the Kunlun rupture, as evidenced by thrust mechanisms of the largest aftershocks which occurred near the eastern part of the Kunlun rupture. In each sequence the largest aftershock was about 2 orders of magnitude smaller than the mainshock. Moment release along the ruptured segments was examined for the 25-year periods preceding the main shocks. The Denali zone shows precursory accelerating moment release with the dominant events occurring on October 22, 1996 (M 5.8) and October 23, 2002 (M 6.7). The Kunlun zone shows nearly constant moment release over time with the last significant event before the main shock occurring on November 26, 2000 (M 5.4). Moment release data are consistent with previous observations of annual periodicity preceding major earthquakes, possibly due to the evolution of a critical state with seasonal and tidal triggering (Varnes and Bufe, 2001). Annual periodicity is also evident for the larger events in the greater San Francisco Bay

  7. From intermediation to disintermediation and apomediation: new models for consumers to access and assess the credibility of health information in the age of Web2.0.

    Science.gov (United States)

    Eysenbach, Gunther

    2007-01-01

This theoretical paper discusses the model that, as a result of the social process of disintermediation enabled by digital media, traditional intermediaries are replaced by what this author calls apomediaries: tools and peers standing by to guide consumers to trustworthy information, or adding credibility to information. For apomediation to be an attractive and successful model for consumers, the recipient has to reach a certain degree of maturity and autonomy. Different degrees of autonomy may explain differences in information-seeking and credibility-appraisal behaviours. It is hypothesized that in an apomediated environment, tools, influential peers and opinion leaders are the primary conveyors of trust and credibility. In this environment, apomediary credibility may become equally or more important than source credibility or even message credibility. It is suggested to use tools of network analysis to study the dynamics of apomediary credibility in a networked digital world. There are practical implications of the apomediation model for developers of consumer health websites which aspire to come across as "credible": consumers need and want to be able to be co-creators of content, not merely an audience that is broadcast to. Web 2.0 technology enables such sites. Engaging and credible websites are about building community, and communities are built upon personal and social needs.

  8. Investigating the impact of viral message appeal and message credibility on consumer attitude toward the brand

    Directory of Open Access Journals (Sweden)

    Esmaeilpour Majid

    2016-07-01

Full Text Available Due to the rapid growth of the Internet and the use of e-commerce in recent years, viral marketing has drawn the attention of manufacturing and service organizations. However, no research has been conducted to examine the impact of message appeal and message source credibility on consumers’ attitude with the mediating roles of consumers' intellectual involvement and their risk-taking level. The aim of this study was therefore to examine these effects. The population of this study includes consumers of mobile phones (Samsung, Sony, Nokia, LG and iPhone) in the city of Bushehr (Iran). As the population of the study is unlimited, 430 questionnaires were distributed using the available sampling method, and 391 questionnaires were collected and analyzed. Using structural equation modeling, we analysed the data with the Smart PLS software. The results show that the appeal and the credibility of the message source affect consumer attitudes toward the brand. We also found that the intellectual involvement of consumers plays a mediating role in the relationship between message appeal and consumer attitudes toward brands. In the relationship between message source credibility and customer attitude toward the brand, individuals' level of risk taking has no mediating role.

  9. A smartphone application for earthquakes that matter!

    Science.gov (United States)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

Smartphone applications have swiftly become one of the most popular tools for rapid reception of earthquake information by the public, some of them having been downloaded more than 1 million times! The advantages are obvious: wherever someone's location is, they can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet, as the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? Which earthquakes really matter to laypeople? One clue may be derived from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses scrap the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications relate to unfelt earthquakes, thereby only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones that matter most to the public (and authorities). They are the ones of societal importance, even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre) with the financial support of the Fondation MAIF aims to provide suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre). Potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment), developed and operated at EMSC. This rapidly assesses earthquake impact by comparing the population exposed to each expected

  10. Nowcasting Earthquakes and Tsunamis

    Science.gov (United States)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk
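
The counting idea described above can be sketched in a few lines: the "earthquake potential score" of a region is the rank (empirical CDF value) of the current count of small earthquakes since the last large one, relative to the historical counts of small events between successive large events. This is a generic illustration of that statistic, with made-up counts, not the authors' implementation:

```python
# Sketch of the nowcast statistic: rank the current small-earthquake
# count against the historical distribution of small-event counts
# between successive large events. Counts below are made-up numbers.

def nowcast_score(historical_counts, current_count):
    """Fraction of historical inter-large-event small-quake counts that do
    not exceed the current count (near 0 = cycle recently reset, near 1 =
    region far advanced through the cycle)."""
    return sum(1 for c in historical_counts if c <= current_count) / len(historical_counts)

# Small-event counts observed between past large (e.g. M >= 6.0) events
# in the large surrounding region:
history = [120, 260, 180, 400, 310, 220]
score = nowcast_score(history, current_count=290)  # 4 of 6 counts <= 290
```

Because the score is just data analysis on catalog counts, cities or regions can be ranked directly by comparing their scores, as the abstract describes.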

  11. Seismicity map tools for earthquake studies

    Science.gov (United States)

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new online set of tools for earthquake research, for use within Google Maps. We demonstrate this server-based online platform (developed with PHP, JavaScript, and MySQL) and its new tools using a database of earthquake data. The platform allows us to carry out statistical and deterministic analysis of earthquake data on Google Maps and to plot various seismicity graphs. The toolbox has been extended to draw line segments on the map, multiple straight lines both horizontally and vertically, and multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows the study of earthquake clustering and of earthquake-cluster shift between segments in space. The platform offers many filters, such as for plotting selected magnitude ranges or time periods. The plotting facility supports statistical plots such as cumulative earthquake-magnitude plots and earthquake-magnitude histograms, calculation of the 'b' value, etc. What is novel about the platform is its additional deterministic tools. Using the newly developed horizontal-line, vertical-line, and circle tools, we have studied the spatial distribution trends of many earthquakes, and we show here for the first time a link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow calculation of statistics as well as of deterministic precursors. We plan to show many new results based on our newly developed platform.
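
    The 'b' value mentioned above is conventionally estimated with the Aki (1965) maximum-likelihood formula, b = log10(e) / (mean(M) - Mc), applied to events at or above the completeness magnitude Mc. The abstract does not say which estimator the platform uses, so the following is a sketch of the standard approach with an illustrative catalog:

```python
import math

def b_value(magnitudes, mc):
    """Aki (1965) maximum-likelihood b-value estimate:
    b = log10(e) / (mean(M) - Mc), using only events with M >= Mc."""
    above = [m for m in magnitudes if m >= mc]
    return math.log10(math.e) / (sum(above) / len(above) - mc)

# Illustrative catalog: mean magnitude 3.0 above Mc = 2.5 gives b ~ 0.87.
mags = [2.5, 2.6, 2.8, 3.0, 3.0, 3.2, 3.4, 3.5]
print(round(b_value(mags, 2.5), 2))  # -> 0.87
```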

  12. Earthquake at 40 feet

    Science.gov (United States)

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of Richter magnitude 6.25 while at a depth of 40 feet. My wife Bonnie, fellow diver Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography, when the earthquake struck.

  13. Earthquakes and economic growth

    OpenAIRE

    Fisker, Peter Simonsen

    2012-01-01

    This study explores the economic consequences of earthquakes. In particular, it investigates how exposure to earthquakes affects economic growth, both across and within countries. The key result of the empirical analysis is that while there are no observable effects at the country level, earthquake exposure significantly decreases 5-year economic growth at the local level. Areas at lower stages of economic development suffer more, in terms of economic growth, than richer areas. In addition,...

  14. Global Zero and Deterrence Credibility: A Critical Analysis of Obama's Nuclear Policy and Extended Nuclear Deterrence Credibility on the Korean Peninsula

    OpenAIRE

    Ganss, Mathias

    2012-01-01

    This thesis is a qualitative case-study analysis of whether the nuclear policies of President Obama have weakened U.S. extended nuclear deterrence credibility on the Korean Peninsula. To answer this, the thesis employs two strategies. First, two variables are discussed: a nuclear-capabilities variable and a nuclear-policy variable. The purpose is to assess the impact the New START treaty has on U.S. nuclear capabilities, and to assess the implications of Obama's nuclear policy, expres...

  15. The Age of Emotionality? – How emotions influence consumers’ perception of credibility and trust in CSR communication

    OpenAIRE

    Reupsch, Anika

    2017-01-01

    Companies around the world are using different strategies for their corporate social responsibility (CSR) communication, but finding an appropriate strategy to enhance trust and credibility on the consumer side remains challenging. The constitutive aspect of emotions in CSR communication has long been overlooked. Therefore, this study investigates the influence emotions in CSR communication have on the credibility and trust consumers have in a firm’s CSR. Quantitative research with group divi...

  16. Radon anomalies prior to earthquakes (2). Atmospheric radon anomaly observed before the Hyogoken-Nanbu earthquake

    International Nuclear Information System (INIS)

    Ishikawa, Tetsuo; Tokonami, Shinji; Yasuoka, Yumi; Shinogi, Masaki; Nagahama, Hiroyuki; Omori, Yasutaka; Kawada, Yusuke

    2008-01-01

    Before the 1995 Hyogoken-Nanbu earthquake, various geochemical precursors were observed in the aftershock area: chloride-ion concentration, groundwater discharge rate, groundwater radon concentration, and so on. Kobe Pharmaceutical University (KPU) is located about 25 km northeast of the epicenter, within the aftershock area. Atmospheric radon concentration had been measured continuously at KPU since 1984, using a flow-type ionization chamber. The radon-concentration data were analyzed using smoothed residual values, which represent the daily minimum of radon concentration with the normalized seasonal variation excluded. The radon concentration (smoothed residual values) showed an upward trend about two months before the Hyogoken-Nanbu earthquake. The trend fits well to a log-periodic model related to earthquake fault dynamics. From the model fit, a critical point was calculated to fall between 13 and 27 January 1995, in good agreement with the occurrence date of the earthquake (17 January 1995). The mechanism of radon anomalies before earthquakes is not fully understood. However, it might be possible to detect an atmospheric radon anomaly as a precursor to a large earthquake if (1) the measurement is conducted near the earthquake fault, (2) the monitoring station is located on granite (radon-rich) areas, and (3) the measurement is conducted for more than several years before the earthquake to obtain background data. (author)
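
    The abstract does not give the functional form of the log-periodic model, so the sketch below uses a common form from critical-point models of failure, A + B(tc - t)^m [1 + C cos(ω ln(tc - t) + φ)], and recovers an assumed critical time tc from synthetic data by a simple grid search; every parameter value here is illustrative, not from the study.

```python
import math

def log_periodic(t, tc, A, B, m, C, omega, phi):
    """Generic log-periodic acceleration toward a critical time tc (t < tc).
    This form is an assumption; the study's exact model is not given here."""
    dt = tc - t
    return A + B * dt**m * (1.0 + C * math.cos(omega * math.log(dt) + phi))

# Synthetic "radon residual" series with a known critical time tc = 100.0.
true = dict(tc=100.0, A=1.0, B=-0.5, m=0.4, C=0.2, omega=6.0, phi=0.0)
times = list(range(0, 90))
data = [log_periodic(t, **true) for t in times]

# Grid search over candidate critical times (other parameters held fixed),
# minimizing the sum of squared residuals against the observed series.
def sse(tc):
    p = dict(true, tc=tc)
    return sum((log_periodic(t, **p) - y) ** 2 for t, y in zip(times, data))

best_tc = min((tc / 10 for tc in range(950, 1100)), key=sse)
print(best_tc)  # -> 100.0
```

In the study, the analogous fitted critical point (13-27 January 1995) bracketed the actual earthquake date.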

  17. Retrospective stress-forecasting of earthquakes

    Science.gov (United States)

    Gao, Yuan; Crampin, Stuart

    2015-04-01

    Observations of changes in azimuthally varying shear-wave splitting (SWS) above swarms of small earthquakes monitor stress-induced changes to the stress-aligned vertical microcracks pervading the upper crust, the lower crust, and the uppermost ~400 km of the mantle. (The microcracks are intergranular films of hydrolysed melt in the mantle.) Earthquakes release stress, and an appropriate amount of stress for the relevant magnitude must accumulate before each event. Iceland is on an extension of the Mid-Atlantic Ridge where, uniquely, two transform zones run onshore. These onshore transform zones provide semi-continuous swarms of small earthquakes and are the only place worldwide where SWS can be routinely monitored. Elsewhere, SWS must be monitored above temporally active occasional swarms of small earthquakes, or in infrequent SKS and other teleseismic reflections from the mantle. Observed changes in SWS time-delays are attributed to stress-induced changes in crack aspect-ratios, allowing stress-accumulation and stress-relaxation to be identified. Monitoring SWS in SW Iceland in 1988, stress accumulation before an impending earthquake was recognised, and emails were exchanged between the University of Edinburgh (EU) and the Iceland Meteorological Office (IMO). On 10 November 1988, EU emailed IMO that an M5 earthquake could occur soon on a seismically active fault plane where seismicity was still continuing following an M5.1 earthquake six months earlier. Three days later, IMO emailed EU that an M5 earthquake had just occurred on the specified fault plane. We suggest this is a successful earthquake stress-forecast; we refer to the procedure as stress-forecasting earthquakes, as opposed to predicting or forecasting, to emphasise the different formalism. Lack of funds has prevented us from monitoring SWS on Iceland seismograms; however, we have identified similar characteristic behaviour of SWS time-delays above swarms of small earthquakes which have enabled us to

  18. Investigating electronic word-of-mouth effects on online discussion forums: the role of perceived positive electronic word-of-mouth review credibility.

    Science.gov (United States)

    Chih, Wen-Hai; Wang, Kai-Yu; Hsu, Li-Chun; Huang, Su-Chen

    2013-09-01

    Electronic word of mouth (eWOM) has been an important factor influencing consumer purchase decisions. Using the ABC model of attitude, this study proposes a model to explain how eWOM affects online discussion forums. Specifically, we propose that platform (Web site reputation and source credibility) and customer (obtaining buying-related information and social orientation through information) factors influence purchase intentions via perceived positive eWOM review credibility, as well as product and Web site attitudes in an online community context. A total of 353 online discussion forum users in an online community (Fashion Guide) in Taiwan were recruited, and structural equation modeling (SEM) was used to test the research hypotheses. The results indicate that Web site reputation, source credibility, obtaining buying-related information, and social orientation through information positively influence perceived positive eWOM review credibility. In turn, perceived positive eWOM review credibility directly influences purchase intentions and also indirectly influences purchase intentions via product and Web site attitudes. Finally, we discuss the theoretical and managerial implications of the findings.

  19. The Importance of 'Likes': The Interplay of Message Framing, Source, and Social Endorsement on Credibility Perceptions of Health Information on Facebook.

    Science.gov (United States)

    Borah, Porismita; Xiao, Xizhu

    2018-01-01

    Online sources not only permeate the information-seeking environment of the younger generation, but also have profound influence in shaping their beliefs and behaviors. In this landscape, examining the factors responsible for credibility perceptions of online information is fundamental, particularly for health-related information. Using a 2 (frames: gain vs. loss) × 2 (source: expert vs. non-expert) × 2 (social endorsement: high vs. low) randomized between-subjects experimental design, this study examines the effect of health message framing and the moderating effects of social endorsement and source type on credibility perceptions of Facebook posts. Testing across two issues--physical activity and alcohol consumption--findings indicate that the gain-framed message was perceived as most credible. Additionally, significant three-way interactions suggest that social endorsement and source type affect the relationship between message framing and credibility perceptions. Specifically, the findings demonstrate that a gain-framed message from an expert source with high number of 'likes' is considered the most credible message. These findings have significant implications for information gathering from social media sources, such as the influence of 'likes' on health information.

  20. Charles Darwin's earthquake reports

    Science.gov (United States)

    Galiev, Shamil

    2010-05-01

    As 2009 marked the 200th anniversary of Darwin's birth, it also marked 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. The land waved, lifted, and cracked; volcanoes awoke; and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after the event. These effects are sometimes repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘…the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the Earth's evolution and dynamics. These ideas set the tone for the plate-tectonic theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with the results of recent publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the