WorldWideScience

Sample records for earthquakes turbulence financial

  1. Money matters: Rapid post-earthquake financial decision-making

    Science.gov (United States)

    Wald, David J.; Franco, Guillermo

    2016-01-01

    Post-earthquake financial decision-making is a realm beyond the everyday experience of most people. In the immediate aftermath of a damaging earthquake, billions of dollars of relief, recovery, and insurance funds are in the balance through new financial instruments that allow those with resources to hedge against disasters and those at risk to limit their earthquake losses and receive funds for response and recovery.

  2. Higher Order Analysis of Turbulent Changes Found in the ELF Range Electric Field Plasma Before Major Earthquakes

    Science.gov (United States)

    Kosciesza, M.; Blecki, J. S.; Parrot, M.

    2014-12-01

    We report a structure function analysis of changes found in the ELF-range electric field plasma turbulence registered in the ionosphere over the epicenter regions of major earthquakes with depths of less than 40 km that took place during the 6.5 years of the scientific mission of the DEMETER satellite. We compare the data for earthquakes for which we found turbulence with events without any turbulent changes. The structure functions were also calculated for the polar cusp region and the equatorial spread-F region. Basic studies of the turbulent processes were conducted using higher-order spectra and higher-order statistics. The structure function analysis was performed to locate, and check for, intermittent behavior in the ionospheric plasma over the epicenter regions of the earthquakes. These registrations are correlated with the plasma parameters measured onboard the DEMETER satellite and with geomagnetic indices.
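
    The structure-function diagnostic named in this abstract can be sketched in a few lines. This is an illustrative outline only, not the authors' code: the q-th order structure function S_q(τ) = ⟨|x(t+τ) − x(t)|^q⟩ is computed for a synthetic Brownian-like signal standing in for the DEMETER electric-field series, and the scaling of S_2 with lag is checked against the expected monofractal slope.

```python
import numpy as np

def structure_function(x, q, lags):
    """q-th order structure function S_q(tau) = <|x(t+tau) - x(t)|^q>."""
    return np.array([np.mean(np.abs(x[lag:] - x[:-lag]) ** q) for lag in lags])

# Synthetic Brownian-like signal, for which S_2(tau) grows linearly with tau
rng = np.random.default_rng(0)
signal = np.cumsum(rng.normal(size=100_000))
lags = np.array([1, 2, 4, 8, 16, 32])
s2 = structure_function(signal, 2, lags)

# Log-log slope of S_2 vs lag; for Brownian increments this is close to 1.0
slope = np.polyfit(np.log(lags), np.log(s2), 1)[0]
print(round(slope, 2))  # close to 1.0
```

    Deviations of the measured slopes from such monofractal predictions, at high orders q, are what signal the intermittency the abstract refers to.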

  3. The Financial Safety Net – a Necessity in a Turbulent Financial World

    Directory of Open Access Journals (Sweden)

    Peter BALOGH

    2011-11-01

    Over recent years we have observed that whenever a crisis hits, interest in guarantee arrangements rises. The current financial crisis is no exception in this respect. It turns the spotlight on the operation of the financial safety net and provides policy makers with a unique opportunity to monitor its performance and, more specifically, to identify its strengths and weaknesses. This paper focuses on the way parts of the financial safety net work and places special emphasis on the growing role of these safety nets in our turbulent financial world.

  4. The Fusion of Financial Analysis and Seismology: Statistical Methods from Financial Market Analysis Applied to Earthquake Data

    Science.gov (United States)

    Ohyanagi, S.; Dileonardo, C.

    2013-12-01

    As a natural phenomenon, earthquake occurrence is difficult to predict. Statistical analysis of earthquake data was performed using the candlestick chart and Bollinger Band methods. These statistical methods, commonly used in the financial world to analyze market trends, were tested against earthquake data. Earthquakes above Mw 4.0 located offshore of Sanriku (37.75°N–41.00°N, 143.00°E–144.50°E) from February 1973 to May 2013 were selected for analysis. Two specific patterns in earthquake occurrence were recognized through the analysis. One is a spread of the candlestick prior to the occurrence of events greater than Mw 6.0. A second pattern shows convergence in the Bollinger Band, which implies a positive or negative change in the trend of earthquakes. Both patterns match general models for the buildup and release of strain through the earthquake cycle, and agree with the characteristics of both candlestick chart and Bollinger Band analysis. These results show a high correlation between patterns in earthquake occurrence and trend analysis by these two statistical methods, supporting the appropriateness of applying these financial analysis methods to the study of earthquake occurrence.
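
    For readers unfamiliar with the financial tools named here, a minimal sketch of Bollinger Bands applied to an earthquake-like count series follows. The series, window size, and band width are invented for illustration; the abstract's actual catalog and parameters are not reproduced.

```python
import numpy as np

def bollinger(series, window=20, k=2.0):
    """Classic Bollinger Bands: rolling mean +/- k rolling standard deviations."""
    series = np.asarray(series, dtype=float)
    n = len(series) - window + 1
    mid = np.array([series[i:i + window].mean() for i in range(n)])
    sd = np.array([series[i:i + window].std(ddof=0) for i in range(n)])
    return mid - k * sd, mid, mid + k * sd

# Toy "activity" series standing in for monthly earthquake counts
counts = [3, 4, 2, 5, 3, 4, 6, 2, 3, 5, 4, 3, 2, 4, 5, 3, 9, 11, 10, 12]
lower, mid, upper = bollinger(counts, window=5)

# A narrowing band ("convergence") is the signal the abstract associates
# with an impending change in trend
band_width = upper - lower
print(band_width[:3])
```

    In the abstract's reading, band convergence before a large event plays the same role a volatility squeeze plays before a market breakout.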

  5. Theory of earthquakes interevent times applied to financial markets

    Science.gov (United States)

    Jagielski, Maciej; Kutner, Ryszard; Sornette, Didier

    2017-10-01

    We analyze the probability density function (PDF) of waiting times between financial loss exceedances. The empirical PDFs are fitted with the self-excited Hawkes conditional Poisson process with a long power law memory kernel. The Hawkes process is the simplest extension of the Poisson process that takes into account how past events influence the occurrence of future events. By analyzing the empirical data for 15 different financial assets, we show that the formalism of the Hawkes process used for earthquakes can successfully model the PDF of interevent times between successive market losses.
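
    A minimal sketch of the self-excited Hawkes conditional intensity with a long-memory power-law kernel, the formalism this abstract applies to market losses; the event times and all parameter values below are invented for illustration.

```python
import numpy as np

def hawkes_intensity(t, events, mu, alpha, c, theta):
    """Conditional intensity of a Hawkes process with a power-law (Omori-like)
    memory kernel phi(dt) = alpha / (dt + c)**(1 + theta): each past event
    raises the rate of future events, with slowly decaying influence."""
    past = events[events < t]
    return mu + np.sum(alpha / (t - past + c) ** (1.0 + theta))

# Illustrative loss-exceedance times (days)
events = np.array([1.0, 1.5, 1.7, 6.0, 6.2])
params = dict(mu=0.1, alpha=0.3, c=0.1, theta=0.3)

lam_quiet = hawkes_intensity(5.0, events, **params)  # long after a cluster
lam_burst = hawkes_intensity(1.8, events, **params)  # just after a cluster
print(lam_burst > lam_quiet)  # clustering: intensity is elevated after a burst
```

    The clustering produced by this self-excitation is what shapes the interevent-time PDF shared by aftershock sequences and successive market losses.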

  6. Associations between economic loss, financial strain and the psychological status of Wenchuan earthquake survivors.

    Science.gov (United States)

    Huang, Yunong; Wong, Hung; Tan, Ngoh Tiong

    2015-10-01

    This study examines the effects of economic loss on the life satisfaction and mental health of Wenchuan earthquake survivors. Economic loss is measured by earthquake impacts on the income and houses of the survivors. The correlation analysis shows that earthquake impact on income is significantly correlated with life satisfaction and depression. The regression analyses indicate that earthquake impact on income is indirectly associated with life satisfaction and depression through its effect on financial strain. The research highlights the importance of coping strategies in maintaining a balance between economic status and living demands for disaster survivors.

  7. Scientific, Engineering, and Financial Factors of the 1989 Human-Triggered Newcastle Earthquake in Australia

    Science.gov (United States)

    Klose, C. D.

    2006-12-01

    This presentation emphasizes the dualism of natural resources exploitation and economic growth versus geomechanical pollution and the risk of human-triggered earthquakes. Large-scale geoengineering activities, e.g., mining, reservoir impoundment, oil/gas production, water exploitation or fluid injection, alter pre-existing lithostatic stress states in the earth's crust and are anticipated to trigger earthquakes. Such processes of in-situ stress alteration are termed geomechanical pollution. Moreover, since the 19th century more than 200 earthquakes with seismic moment magnitudes of 4.5 or larger have been documented worldwide; the 1989 Newcastle, Australia, earthquake exemplifies the human and financial losses of triggered earthquakes. A hazard assessment, based on a geomechanical crust model, shows that only four deep coal mines were responsible for triggering this severe earthquake. A small-scale economic risk assessment identifies that the financial loss due to earthquake damage has reduced mining profits that had been re-invested in the Newcastle region for nearly two centuries, beginning in 1801. Furthermore, a large-scale economic risk assessment reveals that the financial loss is equivalent to 26% of the Australian Gross Domestic Product (GDP) growth in 1988/89. These costs account for 13% of the total costs of all natural disasters (e.g., flooding, drought, wild fires) and 94% of the costs of all earthquakes recorded in Australia between 1967 and 1999. In conclusion, the increasing number and size of geoengineering activities, such as coal mining near Newcastle or planned carbon dioxide geosequestration initiatives, represent a growing hazard potential, which can negatively affect socio-economic growth and sustainable development. Finally, hazard and risk degrees, based on geomechanical-mathematical models, can be forecast in space and over time for urban planning in order to prevent economic losses from human-triggered earthquakes in the future.

  8. Developing of the ionospheric plasma turbulence over the epicenters of the extremely strong earthquakes - the results of the DEMETER satellite observations

    Science.gov (United States)

    Blecki, J. S.; Parrot, M.; Wronowski, R.; Kosciesza, M.

    2011-12-01

    The French DEMETER microsatellite was launched in June 2004 and finished its operation in December 2010. During the time of the DEMETER satellite's operation some gigantic earthquakes took place. We report the electromagnetic effects registered by DEMETER prior to earthquakes with magnitudes over 8, or close to this value. We selected events with good coverage of measurements in the burst mode, when the waveform of the electric field variations was registered, because special attention is given to the spectral characteristics of these variations and to the search for nonlinear effects. This analysis is possible in the time intervals when the waveform has been transmitted. Using wavelet and bispectral analysis, as well as the statistical characteristics of the measured parameters, we find that the registered variations are associated with the development of ionospheric plasma turbulence, mainly of the Kolmogorov type. The payload of DEMETER allows important plasma parameters (ion composition, electron density and temperature, energetic particles) to be measured with high temporal resolution in the ionosphere over seismic regions. The correlation of the observed plasma turbulence with changes in the other parameters will also be given, together with an analysis of the low-frequency fluctuations of the electric and magnetic fields for the selected strong earthquakes. The mechanism of energy transmission from an earthquake to the ionosphere is not clear, but we can discuss the behavior of the ionospheric plasma and search for the instabilities which could be a source of the electromagnetic field variations; an attempt at this discussion will be given in the presentation. We present results obtained prior to some giant earthquakes (Peru 2007, Wenchuan, China 2008, Haiti 2010, Chile 2010).

  9. Competitive strategy in turbulent healthcare markets: an analysis of financially effective teaching hospitals.

    Science.gov (United States)

    Langabeer, J

    1998-01-01

    As the healthcare marketplace, characterized by declining revenues and heavy price competition, continues to evolve toward managed care, teaching hospitals are being forced to act more like traditional industrial organizations. Profit-oriented behavior, including emphases on market strategies and competitive advantage, is now a necessity if these hospitals are going to survive the transition to managed care. To help teaching hospitals evaluate strategic options that maximize financial effectiveness, this study examined the financial and operating data for 100 major U.S. teaching hospitals to determine relationships among competitive strategy, market environment, and financial return on invested capital. Results should help major hospitals formulate more effective strategies to combat environmental turbulence.

  10. Risk Management in Earthquakes, Financial Markets, and the Game of 21: The role of Forecasting, Nowcasting, and Timecasting

    Science.gov (United States)

    Rundle, J. B.

    2017-12-01

    Earthquakes and financial markets share surprising similarities [1]. For example, the well-known VIX index, which by definition is the implied volatility of the Standard & Poor's 500 index, behaves in a very similar quantitative fashion to time series of earthquake rates. Both display sudden increases at the time of an earthquake or an announcement of the US Federal Open Market Committee [2], and both decay as an inverse power of time. Both can be regarded as examples of first-order phase transitions [1], and display fractal and scaling behavior associated with critical transitions, such as power-law magnitude-frequency relations in the tails of the distributions. Early quantitative investors such as Edward Thorp and John Kelly invented novel methods to mitigate or manage risk in games of chance such as blackjack, and in markets using hedging techniques that are still in widespread use today. The basic idea is the concept of proportional betting, where the gambler/investor bets a fraction of the bankroll whose size is determined by the "edge," or inside knowledge of the real (and changing) odds. For earthquake systems, the "edge" over nature can only exist in the form of a forecast (probability of a future earthquake); a nowcast (knowledge of the current state of an earthquake fault system); or a timecast (statistical estimate of the waiting time until the next major earthquake). In our terminology, a forecast is a model, while nowcasts and timecasts are analysis methods using observed data only (no model). We also focus on defined geographic areas rather than on faults, thereby eliminating the need to consider specific fault data or fault interactions. The data used are online earthquake catalogs, generally since 1980. Forecasts are based on the Weibull (1952) probability law, and only a handful of parameters are needed.
These methods allow the development of real time hazard and risk estimation using cloud-based technologies, and permit the application of
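
    The proportional-betting idea the abstract attributes to Thorp and Kelly reduces to a one-line formula. The sketch below is the standard Kelly criterion for a simple binary bet, not anything taken from the abstract's own methods; it only illustrates how an "edge" translates into a bet size.

```python
def kelly_fraction(p, b):
    """Kelly-optimal fraction of bankroll for a bet paying b:1 that wins
    with probability p: f* = (p*b - (1 - p)) / b.
    Bet nothing when the edge is non-positive."""
    f = (p * b - (1.0 - p)) / b
    return max(f, 0.0)

# Even-money bet (b = 1) with a 55% win probability -> bet 10% of the bankroll
print(round(kelly_fraction(0.55, 1.0), 3))  # 0.1
# No edge -> stay out
print(kelly_fraction(0.50, 1.0))  # 0.0
```

    In the earthquake analogy, p would come from a forecast, nowcast, or timecast, and "betting" corresponds to committing resources to mitigation or risk transfer.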

  11. Do Earthquakes Shake Stock Markets?

    Science.gov (United States)

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  12. Ionospheric phenomena before strong earthquakes

    Directory of Open Access Journals (Sweden)

    A. S. Silina

    2001-01-01

    A statistical analysis of several ionospheric parameters before earthquakes with magnitude M > 5.5 located less than 500 km from an ionospheric vertical sounding station is performed. Ionospheric effects preceding "deep" (depth h > 33 km) and "crust" (h < 33 km) earthquakes were analysed separately. Data of nighttime measurements of the critical frequencies foF2 and foEs, the frequency fbEs, and Es-spread at the middle-latitude station Dushanbe were used. The frequencies foF2 and fbEs are proportional to the square root of the ionization density at heights of 300 km and 100 km, respectively. It is shown that two days before the earthquakes the values of foF2 averaged over the morning hours (00:00 LT–06:00 LT) and of fbEs averaged over the nighttime hours (18:00 LT–06:00 LT) decrease; the effect is stronger for the "deep" earthquakes. Analysing the coefficient of semitransparency, which characterizes the degree of small-scale turbulence, it was shown that this value increases 1–4 days before "crust" earthquakes and does not change before "deep" earthquakes. Studying Es-spread, which manifests itself as a diffuse Es trace on ionograms and characterizes the degree of large-scale turbulence, it was found that the number of Es-spread observations increases 1–3 days before the earthquakes; for "deep" earthquakes the effect is more intensive. Thus it may be concluded that different mechanisms of energy transfer from the region of earthquake preparation to the ionosphere occur for "deep" and "crust" events.
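
    The stated proportionality between critical frequency and the square root of ionization density is the standard plasma-frequency relation, f_p [Hz] ≈ 8.98 √(n_e [m⁻³]). A small sketch inverting it follows; the example frequency is illustrative, not taken from the Dushanbe data.

```python
def electron_density(f_hz):
    """Invert the standard plasma-frequency approximation
    f_p [Hz] ~ 8.98 * sqrt(n_e [m^-3]), so n_e = (f_p / 8.98)**2."""
    return (f_hz / 8.98) ** 2

# A representative foF2 of 5 MHz maps to the F2-peak electron density
ne = electron_density(5e6)
print(f"{ne:.2e}")  # ~ 3.10e+11 m^-3
```

    This is why a pre-earthquake decrease in foF2 or fbEs is read directly as a decrease in ionization density at the corresponding height.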

  13. Fractals and Forecasting in Earthquakes and Finance

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Turcotte, D. L.

    2011-12-01

    It is now recognized that Benoit Mandelbrot's fractals play a critical role in describing a vast range of physical and social phenomena. Here we focus on two systems, earthquakes and finance. Since 1942, earthquakes have been characterized by the Gutenberg-Richter magnitude-frequency relation, which in more recent times is often written as a moment-frequency power law. A similar relation can be shown to hold for financial markets. Moreover, a recent New York Times article titled "A Richter Scale for the Markets" [1] summarized the emerging viewpoint that stock market crashes can be described with ideas similar to those used for large and great earthquakes. The idea that stock market crashes can be related in any way to earthquake phenomena has its roots in Mandelbrot's 1963 work on speculative prices in commodities markets such as cotton [2]. He pointed out that Gaussian statistics do not account for the excessive number of booms and busts that characterize such markets. Here we show that both earthquakes and financial crashes can be described by a common Landau-Ginzburg-type free energy model involving the presence of a classical limit of stability, or spinodal. These metastable systems are characterized by fractal statistics near the spinodal. For earthquakes, the independent ("order") parameter is the slip deficit along a fault, whereas for financial markets it is the financial leverage in place. For financial markets, asset values play the role of a free energy. In both systems, a common set of techniques can be used to compute the probabilities of future earthquakes or crashes. In the case of financial models, the probabilities are closely related to implied volatility, an important component of Black-Scholes models for stock valuations. [2] B. Mandelbrot, The variation of certain speculative prices, J. Business, 36, 294 (1963)
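
    The Gutenberg-Richter relation referred to above, log₁₀ N(≥M) = a − bM, can be fitted to a catalog with the standard Aki maximum-likelihood estimator. The sketch below uses a synthetic catalog with a known b-value; it is an illustration of the relation, not the authors' analysis.

```python
import numpy as np

def b_value_mle(magnitudes, m_min):
    """Aki (1965) maximum-likelihood b-value estimate:
    b = log10(e) / (mean(M) - m_min), for magnitudes at or above
    the completeness magnitude m_min."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)

# Synthetic Gutenberg-Richter catalog with b = 1: magnitudes above m_min
# are exponentially distributed with rate b * ln(10)
rng = np.random.default_rng(1)
m_min, b_true = 4.0, 1.0
mags = m_min + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=50_000)
print(round(b_value_mle(mags, m_min), 2))  # close to 1.0
```

    The power-law tail this produces is the fractal statistic that the abstract's spinodal framework carries over to financial crash distributions.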

  14. Contagion Effect of Natural Disaster and Financial Crisis Events on International Stock Markets

    Directory of Open Access Journals (Sweden)

    Kuo-Jung Lee

    2018-03-01

    In the contemporary world, bustling with global trade, a natural disaster or financial crisis in one country (or region) can cause substantial economic losses and turbulence in the local financial markets, which may then affect the economic activities and financial assets of other countries (or regions). This study focuses on the major natural disasters that occurred worldwide during the last decade, especially those in the Asia-Pacific region, and the economic effects of global financial crises. The heteroscedasticity-bias correlation coefficient method and an exponential general autoregressive conditional heteroscedasticity (EGARCH) model are employed to compare the contagion effect of the initiating country's stock market on other countries, determining whether economically devastating events have contagion or spillover effects on other countries. The empirical results indicate that among all the natural disasters considered, the 2008 Sichuan earthquake in China caused the most substantial contagion effect in the stock markets of neighboring Asian countries. Regarding financial crises, the financial tsunami triggered by the subprime mortgage fallout in the United States generated the strongest contagion effect on the stock markets of developing and emerging economies. When building a diversified global investment portfolio, investors should be aware of the risks of major natural disasters and financial incidents.

  15. Earthquakes Threaten Many American Schools

    Science.gov (United States)

    Bailey, Nancy E.

    2010-01-01

    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  16. Earthquake Loss Assessment for the Evaluation of the Sovereign Risk and Financial Sustainability of Countries and Cities

    Science.gov (United States)

    Cardona, O. D.

    2013-05-01

    Recently, earthquakes have struck cities in both developing and developed countries, revealing significant knowledge gaps and the need to improve the quality of input data and of the assumptions of risk models. The earthquake and tsunami in Japan (2011) and the disasters due to earthquakes in Haiti (2010), Chile (2010), New Zealand (2011) and Spain (2011), to mention only some unexpected impacts in different regions, have left several concerns regarding hazard assessment as well as the uncertainties associated with the estimation of future losses. Understanding probable losses and reconstruction costs due to earthquakes creates powerful incentives for countries to develop planning options and tools to cope with sovereign risk, including allocating the sustained budgetary resources necessary to reduce those potential damages and safeguard development. Robust risk models are therefore needed to assess future economic impacts, the country's fiscal responsibilities and the contingent liabilities for governments, and to formulate, justify and implement risk reduction measures and optimal financial strategies of risk retention and transfer. Special attention should be paid to the understanding of risk metrics such as the Loss Exceedance Curve (empirical and analytical) and the Expected Annual Loss in the context of conjoint and cascading hazards.

  17. Measuring the effectiveness of earthquake forecasting in insurance strategies

    Science.gov (United States)

    Mignan, A.; Muir-Wood, R.

    2009-04-01

    Given the difficulty of judging whether the skill of a particular methodology of earthquake forecasts is offset by the inevitable false alarms and missed predictions, it is important to find a means to weigh the successes and failures according to a common currency. Rather than judge subjectively the relative costs and benefits of predictions, we develop a simple method to determine if the use of earthquake forecasts can increase the profitability of active financial risk management strategies employed in standard insurance procedures. Three types of risk management transactions are employed: (1) insurance underwriting, (2) reinsurance purchasing and (3) investment in CAT bonds. For each case premiums are collected based on modelled technical risk costs and losses are modelled for the portfolio in force at the time of the earthquake. A set of predetermined actions follow from the announcement of any change in earthquake hazard, so that, for each earthquake forecaster, the financial performance of an active risk management strategy can be compared with the equivalent passive strategy in which no notice is taken of earthquake forecasts. Overall performance can be tracked through time to determine which strategy gives the best long term financial performance. This will be determined by whether the skill in forecasting the location and timing of a significant earthquake (where loss is avoided) is outweighed by false predictions (when no premium is collected). This methodology is to be tested in California, where catastrophe modeling is reasonably mature and where a number of researchers issue earthquake forecasts.

  18. Impact of the Christchurch earthquakes on hospital staff.

    Science.gov (United States)

    Tovaranonte, Pleayo; Cawood, Tom J

    2013-06-01

    On September 4, 2010, a major earthquake caused widespread damage, but no loss of life, to Christchurch city and surrounding areas. There were numerous aftershocks, including one on February 22, 2011 which, in contrast, caused substantial loss of life and major damage to the city. The research aim was to assess how these two earthquakes affected the staff of the General Medicine Department at Christchurch Hospital. Problem: to date there have been no published data assessing the impact of this type of natural disaster on hospital staff in Australasia. A questionnaire examining seven domains (demographics, personal impact, psychological impact, emotional impact, impact on care for patients, work impact, and coping strategies) was handed out to General Medicine staff and students nine days after the September 2010 earthquake and 14 days after the February 2011 earthquake. Response rates were ≥99%. A fifth to a third of respondents had to find an alternative route of transport to get to work, but only 8% to 18% took time off work. Financial impact was more severe following the February earthquake, with 46% reporting damage of >NZ $1,000, compared with 15% following the September earthquake. Overall, the effect on General Medicine hospital staff was widespread, with minor financial impact from the first earthquake and much greater impact from the second; moderate psychological impact was experienced after both earthquakes. These data may be useful in preparing plans for future natural disasters.

  19. Hierarchical structure of stock price fluctuations in financial markets

    International Nuclear Information System (INIS)

    Gao, Ya-Chun; Cai, Shi-Min; Wang, Bing-Hong

    2012-01-01

    The financial market and turbulence have been broadly compared on account of the common quantitative methods and several common stylized facts they share. In this paper, the She-Leveque (SL) hierarchy, proposed to explain the anomalous scaling exponents that deviate from Kolmogorov monofractal scaling of the velocity fluctuations in fluid turbulence, is applied to study and quantify the hierarchical structure of stock price fluctuations in financial markets. We observe several interesting results: (i) the hierarchical structure related to multifractal scaling is generally present in all the stock price fluctuations we investigated; (ii) the statistical parameters that quantify the SL hierarchy differ distinctly between developed financial markets and emerging ones; (iii) for high-frequency stock price fluctuations, the hierarchical structure varies across time periods. These results provide a novel analogy between turbulence and financial market dynamics and an insight for a deeper understanding of multifractality in financial markets.
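
    The SL hierarchy referred to above has a closed-form prediction for the structure-function scaling exponents, ζ_p = p/9 + 2(1 − (2/3)^(p/3)), which corrects the Kolmogorov monofractal line ζ_p = p/3. The sketch below tabulates both; it is the standard She-Leveque formula, not code from the paper.

```python
def zeta_sl(p):
    """She-Leveque scaling exponent: zeta_p = p/9 + 2*(1 - (2/3)**(p/3))."""
    return p / 9.0 + 2.0 * (1.0 - (2.0 / 3.0) ** (p / 3.0))

def zeta_k41(p):
    """Kolmogorov 1941 monofractal prediction: zeta_p = p/3."""
    return p / 3.0

# Both models agree exactly at p = 3 (zeta_3 = 1); SL bends below K41 beyond it
for p in (1, 2, 3, 6, 9):
    print(p, round(zeta_k41(p), 3), round(zeta_sl(p), 3))
```

    The growing gap between the two curves at high order p is the multifractal (intermittency) signature that the paper looks for in stock price fluctuations.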

  20. The Impact of Earthquakes on the Domestic Stock Market

    NARCIS (Netherlands)

    Scholtens, Bert; Voorhorst, Yvonne

    How do financial markets respond to the impact of earthquakes? We investigate this for more than 100 earthquakes with fatalities in 21 countries from five continents in the period 1973-2011. Using an event study methodology we conclude that there are significant negative effects on stock market

  1. Turbulence and turmoil in the market or the language of a financial crisis

    Directory of Open Access Journals (Sweden)

    Michael White

    2004-04-01

    In the wake of developments in cognitive linguistics, work pointing out the metaphorical underpinnings of specialist discourse in many fields is increasing dramatically. In Spain alone, this is quite evident in full-scale doctoral dissertations: Civil Engineering and Urban Development (Roldán Riejos, 1995); Economics (White, 1996; Bueno Lajusticia, 1999); Publicity (Cortés del Río, 2001); Architecture (Úbeda, 2000; Caballero Rodriguez, 2001); Science (Cuadrado Esclapez, 2001); Mad Cow Disease (Martín de la Rosa, 2002), to give a few examples. Furthering this line of research, the present article focuses on how the press handles a very specific aspect of a financial crisis, namely the extreme fluctuation of currency values. Two lexical items, turbulence and turmoil, are reiteratively used to grasp and convey the nature of this issue to the general public. As metaphor researchers still find fundamental issues such as metaphor identification very difficult to pin down, both theoretically and in practice, the evidence presented impinges on a significant area in this field, namely usage whose metaphorical nature is open to question. The article first addresses the issue of whether the lexical items turbulence and turmoil are to be considered metaphoric or whether they have become lexicalised or near-lexicalised in the domain of economics. Co-textual evidence argues in favour of the metaphoric reading. A second issue is how metaphoric sources may be attributed to different domains (see Cameron, 1999; Kövecses, 2000) and how these overlap and work together. Finally, the role of metaphor in underpinning cohesion, coherence and communication is examined.

  2. A Case Study of the Bam Earthquake to Establish a Pattern for Earthquake Management in Iran

    Directory of Open Access Journals (Sweden)

    Keramatollah Ziari

    2015-03-01

    The field of crisis management knowledge and expertise is associated with a wide range of disciplines. Knowledge-based crisis management is a combination of science, art and practice. Iran is an earthquake-prone country: over the years several earthquakes have occurred there, resulting in many human and financial losses. According to scientific standards, the first 24 hours following an earthquake are the most valuable time for saving victims, yet in the case of Bam only 5% of the victims were rescued within the first 48 hours. The success of disaster management is evaluated in terms of programming, raising public participation, organizing and hiring manpower, and supervising the management process. In this study disaster management is divided into three stages in which different actions are required. The stages and actions are explained in detail. Moreover, the features, effects, and losses of the earthquake are described.

  3. The Financial Turbulence in the Economy: Case of Ukraine

    Directory of Open Access Journals (Sweden)

    Iryna Novikova

    2016-01-01

    The article analyzes the causes and consequences of violations of financial stability. Famous historical examples of emerging inflationary bursts are reviewed, as well as ways of restoring financial equilibrium. In particular, the article argues that the main cause of violations of financial stability is an inflationary boom arising from wars and socio-economic and political contradictions. The paper examines the impact of modern social and economic challenges on the growth of inflation and the deterioration of other macroeconomic indicators in Ukraine. Finally, recommendations for overcoming financial problems in the national economy are provided, and the importance of exchange-rate stability of the currency is emphasized.

  5. Plasma turbulence in the ionosphere prior to earthquakes, some remarks on the DEMETER registrations

    Science.gov (United States)

    Błęcki, Jan; Parrot, Michel; Wronowski, Roman

    2011-06-01

    The question of the presence of earthquake precursors has a long history. It remains unresolved, but researchers are looking for effects that can be registered prior to earthquakes. One factor that has been found is the variation of the electromagnetic field observed on the ground as well as onboard satellites. The disturbances of the electromagnetic field around the areas of earthquakes, as pre-seismic events, can occur a few hours or even a few days before the main shock. The payload of the French DEMETER microsatellite allows waves, and also some important plasma parameters (ion composition, electron density and temperature, energetic particles), to be measured with high temporal resolution in the ionosphere over seismic regions. In the present work, analyses of the low-frequency fluctuations of the electric fields for selected strong earthquakes in Japan (2004), China (2008), Taiwan (2006) and New Zealand (2009) are given. Special attention is given to the study of the spectral characteristics of these variations and the search for nonlinear effects. This analysis is possible in the time intervals where the waveform has been transmitted. The mechanism of energy transmission from earthquakes to the ionosphere is not clear, but we can discuss the behavior of the ionospheric plasma and search for instabilities which could be a source of electromagnetic field variations. A brief discussion of the characteristics of the spectra and multi-spectra is given in this paper. Particular attention is given to the effect prior to the earthquake in New Zealand, when a nonlinear interaction leading to lower hybrid wave generation was directly seen.

  6. Corporate Governance within Financial Institutions: Asset or Liability?

    Directory of Open Access Journals (Sweden)

    Dan CHIRLESAN

    2012-04-01

    Full Text Available Solid corporate governance of financial institutions is of vital concern not only to the institutions themselves but also to the entire financial system. After four years of financial turbulence, the issue of corporate governance is more important than ever, especially for financial institutions, which take on a significant role in the process of financial intermediation and are considered important players in the financial system, especially in the Euro Area. The main purpose of this paper is to set out a framework for analyzing and thinking about the core meaning, the advantages and the direction of specific practices regarding corporate governance in a company in general, and in financial institutions specifically.

  7. Statistical aspects and risks of human-caused earthquakes

    Science.gov (United States)

    Klose, C. D.

    2013-12-01

    The seismological community invests ample human capital and financial resources to research and predict risks associated with earthquakes. Industries such as the insurance and re-insurance sector are equally interested in using probabilistic risk models developed by the scientific community to transfer risks. These models are used to predict expected losses due to naturally occurring earthquakes. But what about the risks associated with human-caused earthquakes? Such risk models are largely absent from both industry and academic discourse. In countries around the world, informed citizens are becoming increasingly aware and concerned that this economic bias is not sustainable for long-term economic growth, environmental and human security. Ultimately, citizens look to their government officials to hold industry accountable. In the Netherlands, for example, the hydrocarbon industry is held accountable for causing earthquakes near Groningen. In Switzerland, geothermal power plants were shut down or suspended because they caused earthquakes in the cantons of Basel and St. Gallen. The public and the private non-extractive industry need access to information about earthquake risks in connection with sub/urban geoengineering activities, including natural gas production through fracking, geothermal energy production, carbon sequestration, mining and water irrigation. This presentation illuminates statistical aspects of human-caused earthquakes with respect to different geologic environments. Statistical findings are based on the first catalog of human-caused earthquakes (in Klose 2013). Findings are discussed, including the odds of dying during a medium-size earthquake that is set off by geomechanical pollution. Any kind of geoengineering activity causes this type of pollution and increases the likelihood of triggering nearby faults to rupture.

  8. Time Change and Universality in Turbulence and Finance

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole E.; Schmiegel, Jürgen; Shephard, Neil

    Empirical time series of turbulent flows and financial markets reveal some common basic stylized features. In particular, the densities of velocity increments and log returns are well fitted within the class of Normal inverse Gaussian distributions and show a similar evolution across time scales ...
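    The Normal inverse Gaussian fit mentioned in this abstract can be sketched with SciPy's `norminvgauss` distribution. The parameter values below are hypothetical stand-ins, not those estimated in the paper:

```python
import numpy as np
from scipy.stats import norminvgauss

rng = np.random.default_rng(42)

# Synthetic stand-in for log returns / velocity increments (heavy-tailed).
data = norminvgauss.rvs(a=1.5, b=0.0, loc=0.0, scale=1.0,
                        size=2000, random_state=rng)

# Maximum-likelihood fit of the four NIG parameters (tail a, asymmetry b,
# location, scale) to the sample.
a_hat, b_hat, loc_hat, scale_hat = norminvgauss.fit(data)
```

    Comparing such fits of velocity increments and of log returns across a range of time scales is the kind of exercise that reveals the common stylized features the abstract refers to.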

  9. Using remote sensing to predict earthquake impacts

    Science.gov (United States)

    Fylaktos, Asimakis; Yfantidou, Anastasia

    2017-09-01

    Natural hazards like earthquakes can result in enormous property damage and human casualties in mountainous areas. Italy has always been exposed to numerous earthquakes, mostly concentrated in the central and southern regions. Last year, two seismic events occurred near Norcia (central Italy), leading to substantial loss of life and extensive damage to properties, infrastructure and cultural heritage. This research utilizes remote sensing products and GIS software to provide a database of information. We used both SAR images from Sentinel 1A and optical imagery from Landsat 8 to examine differences in topography with the aid of the multi-temporal monitoring technique, which is well suited to observing any surface deformation. This database is a cluster of information regarding the consequences of the earthquakes, grouped into categories such as property and infrastructure damage, regional rifts, cultivation loss, landslides and surface deformations, among others, all mapped in GIS software. Relevant organizations can use these data to calculate the financial impact of earthquakes of this type. In the future, we can enrich this database with more regions and enhance the variety of its applications. For instance, we could predict the future impacts of any type of earthquake in several areas and design a preliminary emergency model for immediate evacuation and quick recovery response. It is important to know how the surface moves, particularly in geographical regions like Italy, Cyprus and Greece, where earthquakes are so frequent. We are not able to predict earthquakes, but using data from this research, we may assess the damage that could be caused in the future.

  10. Earthquake forewarning — A multidisciplinary challenge from the ground up to space

    Science.gov (United States)

    Freund, Friedemann

    2013-08-01

    Most destructive earthquakes nucleate at depths between about 5-7 km and 35-40 km. Before earthquakes, rocks are subjected to increasing stress. Not every stress increase leads to rupture. To understand pre-earthquake phenomena, we note that igneous and high-grade metamorphic rocks contain defects which, upon stressing, release defect electrons in the oxygen anion sublattice, known as positive holes. These charge carriers are highly mobile, able to flow out of stressed rocks into surrounding unstressed rocks. They form electric currents, which emit electromagnetic radiation, sometimes in pulses, sometimes sustained. The arrival of positive holes at the ground-air interface can lead to air ionization, often exclusively positive. Ionized air rising upward can lead to cloud condensation. The upward flow of positive ions can lead to instabilities in the mesosphere, to mesospheric lightning, to changes in the Total Electron Content (TEC) at the lower edge of the ionosphere, and to electric field turbulence. Advances in deciphering the earthquake process can only be achieved in a broadly multidisciplinary spirit.

  11. How to recover from the financial market flu.

    Science.gov (United States)

    Doody, Dennis

    2008-05-01

    The widely publicized subprime mortgage crisis and soaring crude oil prices have contributed to considerable market volatility in recent months, inducing queasiness among institutional investors. A four-layer approach to asset allocation that carefully considers assets, liquidity, currency, and risk may be the best strategy for maintaining an institution's financial health through today's volatile market. Perhaps the biggest challenge in such financially turbulent times is keeping fear in check.

  12. A smartphone application for earthquakes that matter!

    Science.gov (United States)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    Smartphone applications have swiftly become one of the most popular tools for rapid reception of earthquake information by the public, some of them having been downloaded more than 1 million times! The advantages are obvious: wherever they are, users can be automatically informed when an earthquake has struck. By simply setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet, as the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? What are the earthquakes that really matter to laypeople? One clue may be derived from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses scrap the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications relate to unfelt earthquakes, only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones that matter most to the public (and authorities). They are the ones of societal importance, even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre) with the financial support of the Fondation MAIF aims at providing suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre). Potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment), developed and operated at EMSC, which rapidly assesses earthquake impact by comparing the population exposed to each expected

  13. Density-ratio effects on buoyancy-driven variable-density turbulent mixing

    Science.gov (United States)

    Aslangil, Denis; Livescu, Daniel; Banerjee, Arindam

    2017-11-01

    Density-ratio effects on the turbulent mixing of two incompressible, miscible fluids with different densities subject to constant acceleration are studied by means of high-resolution Direct Numerical Simulations. In a triply periodic domain, turbulence is generated by stirring in response to the differential buoyancy forces within the flow. Later, as the fluids become molecularly mixed, dissipation starts to overcome turbulence generation by buoyancy. Thus, the flow evolution includes both turbulence growth and decay, and it displays features present in the core region of the mixing layer of the Rayleigh-Taylor as well as Richtmyer-Meshkov instabilities. We extend previous studies by investigating a broad range of density ratios, from approximately 1.1:1 to 14.4:1, corresponding to Atwood numbers of 0.05-0.87. Here, we focus on the Atwood-number dependence of the mixing efficiency, defined from the energy-conversion ratios from potential energy to total and turbulent kinetic energies, on the decay characteristics of buoyancy-assisted variable-density homogeneous turbulence, and on the effects of high density ratios on the turbulence structure and mixing process. The authors acknowledge financial support from DOE-SSAA (DE-NA0003195) and NSF CAREER (#1453056) awards.
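    The correspondence between density ratio and Atwood number quoted in this abstract follows directly from the definition A = (ρ2 − ρ1)/(ρ2 + ρ1):

```python
def atwood(density_ratio: float) -> float:
    """Atwood number A = (rho2 - rho1) / (rho2 + rho1),
    expressed in terms of the density ratio rho2/rho1."""
    return (density_ratio - 1.0) / (density_ratio + 1.0)

print(round(atwood(14.4), 2))  # a 14.4:1 density ratio gives A = 0.87
```

    Inverting the formula, A = 0.05 corresponds to a density ratio (1 + A)/(1 − A) of about 1.1:1.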

  14. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  15. Assessing Romanian financial sector stability: the importance of the international economic climate

    OpenAIRE

    Albulescu, Claudiu Tiberiu

    2008-01-01

    The aim of this paper is to develop an aggregate stability index for the Romanian financial system. The index, which is meant to enhance the set of analyses used by the central bank to assess financial stability, accurately reflects the financial stability dynamics and the periods of financial turbulence in Romania during 1997-2007. By applying a technique which enables the measurement of each component's contribution to the aggregate index volatility, we show that some individu...

  16. COMPARATIVE EVALUATION OF THE INFLUENCING EFFECTS OF GEOMAGNETIC SOLAR STORMS ON EARTHQUAKES IN ANATOLIAN PENINSULA

    Directory of Open Access Journals (Sweden)

    Yesugey Sadik Cengiz

    2009-07-01

    Full Text Available Earthquakes are tectonic events that take place within the fractures of the earth's crust, namely faults. Above a certain magnitude, earthquakes can result in widespread fatalities and substantial financial loss. In addition to the movement of tectonic plates relative to each other, it is widely discussed whether other external influences originating outside the Earth can trigger earthquakes. These influences are called "triggering effects". The purpose of this article is to present a statistical view to elaborate whether solar geomagnetic storms trigger earthquakes. As a model, the research focuses on the Anatolian peninsula, presenting 41 years of historical data on magnetic storms and earthquakes collated from national and international resources. As a result of the comparative assessment of the data, it is concluded that geomagnetic storms do not trigger earthquakes.

  17. Risk Management of the English Universities after the 2008 Financial Crisis

    Science.gov (United States)

    Yokoyama, Keiko

    2018-01-01

    The objective of the paper is to identify whether the global financial crisis in 2008 re-shaped risk management in the English universities in order to avoid future financial turbulence and manage risk in uncertain and insecure environments. The paper examined changes in the risk management mechanism of the English university system between 2008…

  18. Connecting slow earthquakes to huge earthquakes

    OpenAIRE

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  19. Empirical fractal geometry analysis of some speculative financial bubbles

    Science.gov (United States)

    Redelico, Francisco O.; Proto, Araceli N.

    2012-11-01

    Empirical evidence of a multifractal signature during the growth of a financial bubble leading to a crash is presented. The April 2000 crash in the NASDAQ composite index and a time series from the discrete Chakrabarti-Stinchcombe model for earthquakes are analyzed using a geometric approach, and some common patterns are identified. These patterns relate the geometry of the rising period of a financial bubble to the non-concave entropy problem.

  20. Transitional-turbulent spots and turbulent-turbulent spots in boundary layers.

    Science.gov (United States)

    Wu, Xiaohua; Moin, Parviz; Wallace, James M; Skarda, Jinhie; Lozano-Durán, Adrián; Hickey, Jean-Pierre

    2017-07-03

    Two observations drawn from a thoroughly validated direct numerical simulation of the canonical spatially developing, zero-pressure gradient, smooth, flat-plate boundary layer are presented here. The first is that, for bypass transition in the narrow sense defined herein, we found that the transitional-turbulent spot inception mechanism is analogous to the secondary instability of boundary-layer natural transition, namely a spanwise vortex filament becomes a [Formula: see text] vortex and then, a hairpin packet. Long streak meandering does occur but usually when a streak is infected by a nearby existing transitional-turbulent spot. Streak waviness and breakdown are, therefore, not the mechanisms for the inception of transitional-turbulent spots found here. Rather, they only facilitate the growth and spreading of existing transitional-turbulent spots. The second observation is the discovery, in the inner layer of the developed turbulent boundary layer, of what we call turbulent-turbulent spots. These turbulent-turbulent spots are dense concentrations of small-scale vortices with high swirling strength originating from hairpin packets. Although structurally quite similar to the transitional-turbulent spots, these turbulent-turbulent spots are generated locally in the fully turbulent environment, and they are persistent with a systematic variation of detection threshold level. They exert indentation, segmentation, and termination on the viscous sublayer streaks, and they coincide with local concentrations of high levels of Reynolds shear stress, enstrophy, and temperature fluctuations. The sublayer streaks seem to be passive and are often simply the rims of the indentation pockets arising from the turbulent-turbulent spots.

  1. The HayWired Earthquake Scenario—Earthquake Hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude-6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  2. Behaviour of turbulence models near a turbulent/non-turbulent interface revisited

    International Nuclear Information System (INIS)

    Ferrey, P.; Aupoix, B.

    2006-01-01

    The behaviour of turbulence models near a turbulent/non-turbulent interface is investigated. The analysis holds for two-equation models as well as for Reynolds stress turbulence models using the Daly and Harlow diffusion model. The behaviour near the interface is shown not to be a power law, as usually assumed, but a more complex parametric solution. Why previous works seemed to confirm the power-law solution numerically is explained. Constraints for turbulence modelling are drawn, i.e., constraints ensuring that models behave well near a turbulent/non-turbulent interface, so that the solution is not sensitive to small turbulence levels imposed in the irrotational flow.

  3. Depression and posttraumatic stress disorder in temporary settlement residents 1 year after the Sichuan earthquake.

    Science.gov (United States)

    Cheng, Zhang; Ma, Ning; Yang, Lei; Agho, Kingsley; Stevens, Garry; Raphael, Beverley; Cui, Lijun; Liu, Yongqiao; Yan, Baoping; Ma, Hong; Yu, Xin

    2015-03-01

    The authors sought to determine the prevalence of, and risk factors for, major depressive disorder and posttraumatic stress disorder (PTSD) among survivors living in temporary accommodation in the Yongxing settlement in Mianyang city 1 year after the Sichuan earthquake, to inform further interventions. They interviewed 182 residents using the Structured Clinical Interview for DSM-IV Axis I Disorders and a self-report questionnaire. The 12-month prevalences of depressive disorder and PTSD were 48.9% and 39.6%, respectively. Multivariate analysis indicated that bereaved survivors were 5.51 times (adjusted odds ratio [AOR] = 5.51; 95% confidence interval [CI] = 2.14-14.22) more likely to report PTSD and 2.42 times (AOR = 2.42; 95% CI = 1.00-5.48) more likely to report depressive disorder than nonbereaved survivors. Older age and receipt of government financial support were significantly associated with 12-month PTSD. Depressive disorder 12 months after the earthquake was associated with receipt of government financial support, pre-earthquake physical illness, single marital status, being currently employed, and Han ethnicity. © 2013 APJPH.

  4. Living with earthquakes - development and usage of earthquake-resistant construction methods in European and Asian Antiquity

    Science.gov (United States)

    Kázmér, Miklós; Major, Balázs; Hariyadi, Agus; Pramumijoyo, Subagyo; Ditto Haryana, Yohanes

    2010-05-01

    outermost layer was treated this way, the core of the shrines was made of simple rectangular blocks. The system resisted both in-plane and out-of-plane shaking quite well, as proven by survival of many shrines for more than a millennium, and by fracturing of blocks instead of displacement during the 2006 Yogyakarta earthquake. Systematic use or disuse of known earthquake-resistant techniques in any one society depends on the perception of earthquake risk and on available financial resources. Earthquake-resistant construction practice is significantly more expensive than regular construction. Perception is influenced mostly by short individual and longer social memory. If earthquake recurrence time is longer than the preservation of social memory, if damaging quakes fade into the past, societies commit the same construction mistakes again and again. Length of the memory is possibly about a generation's lifetime. Events occurring less frequently than 25-30 years can be readily forgotten, and the risk of recurrence considered as negligible, not worth the costs of safe construction practices. (Example of recurring flash floods in Hungary.) Frequent earthquakes maintain safe construction practices, like the Java masonry technique throughout at least two centuries, and like the Fachwerk tradition on Modern Aegean Samos throughout 500 years of political and technological development. (OTKA K67583)

  5. Spatial and Financial Fixes and the Global Financial Crisis: Does Labour Have the Knowledge and Power to Meet the Challenge?

    Science.gov (United States)

    Brown, Tony

    2013-01-01

    Five years after the global financial crisis, and trillions of dollars in stimulus spending later, the crisis not only remains unresolved, but risks entering a new deeper phase in southern Europe. The global turbulence, although experienced with differing degrees of intensity and dislocation around the world, manifests as high unemployment,…

  6. Special issue: Terrestrial fluids, earthquakes and volcanoes: The Hiroshi Wakita volume I

    Science.gov (United States)

    Perez, Nemesio M.; King, Chi-Yu; Gurrieri, Sergio; McGee, Kenneth A.

    2006-01-01

    Terrestrial Fluids, Earthquakes and Volcanoes: The Hiroshi Wakita Volume I is a special publication to honor Professor Hiroshi Wakita for his scientific contributions. This volume consists of 17 original papers dealing with various aspects of the role of terrestrial fluids in earthquake and volcanic processes, which reflect Prof. Wakita’s wide scope of research interests.Professor Wakita co-founded the Laboratory for Earthquake Chemistry in 1978 and served as its director from 1988 until his retirement from the university in 1997. He has made the laboratory a leading world center for studying earthquakes and volcanic activities by means of geochemical and hydrological methods. Together with his research team and a number of foreign guest researchers that he attracted, he has made many significant contributions in the above-mentioned scientific fields of interest. This achievement is a testimony to not only his scientific talent, but also his enthusiasm, his open mindedness, and his drive in obtaining both human and financial support.

  7. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

    Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities, which has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ("seismicity-based" models) to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests.
We show how the Reliability and
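    The "seismicity-based" idea described above, mapping the count of small earthquakes since the last large one into a large-event probability, can be caricatured with a Weibull distribution in natural time. This is a schematic sketch only, not the authors' NTW implementation; `n_bar` and `beta` are hypothetical parameters:

```python
import math

def weibull_probability(n_small: float, n_bar: float, beta: float = 1.4) -> float:
    """Schematic Weibull CDF in 'natural time': probability that a large
    earthquake has occurred by the time n_small small events have accumulated,
    where n_bar sets the typical small-earthquake count between large events."""
    return 1.0 - math.exp(-((n_small / n_bar) ** beta))
```

    The probability grows monotonically with the accumulated small-event count, which is the essential mechanism such forecasts exploit.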

  8. Statistical and machine learning approaches for the minimization of trigger errors in parametric earthquake catastrophe bonds

    OpenAIRE

    Calvet, Laura

    2017-01-01

    Catastrophe bonds are financial instruments designed to transfer risk of monetary losses arising from earthquakes, hurricanes, or floods to the capital markets. The insurance and reinsurance industry, governments, and private entities employ them frequently to obtain coverage. Parametric catastrophe bonds base their payments on physical features. For instance, given parameters such as magnitude of the earthquake and the location of its epicentre, the bond may pay a fixed amount or not pay at ...
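    A parametric trigger of the kind described, with payout decided purely by reported magnitude and epicentre location rather than loss adjustment, can be sketched as follows. The thresholds and box geometry are illustrative, not drawn from the thesis:

```python
from dataclasses import dataclass

@dataclass
class ParametricTrigger:
    """Toy parametric cat-bond trigger: pays a fixed amount if the event
    magnitude and epicentre fall inside the covered box (values hypothetical)."""
    min_magnitude: float
    lat_range: tuple   # (south, north) in degrees
    lon_range: tuple   # (west, east) in degrees
    payout: float

    def payout_for(self, magnitude: float, lat: float, lon: float) -> float:
        in_box = (self.lat_range[0] <= lat <= self.lat_range[1]
                  and self.lon_range[0] <= lon <= self.lon_range[1])
        return self.payout if magnitude >= self.min_magnitude and in_box else 0.0
```

    The trigger errors the thesis targets arise exactly here: a damaging event just below `min_magnitude`, or just outside the box, pays nothing, while a harmless event inside it pays in full.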

  9. A generalized self-similar spectrum for decaying homogeneous and isotropic turbulence

    Science.gov (United States)

    Yang, Pingfan; Pumir, Alain; Xu, Haitao

    2017-11-01

    The spectrum of turbulence in the dissipative and inertial ranges can be described by the celebrated Kolmogorov theory. However, there is no general solution for the spectrum at large scales, especially for statistically unsteady turbulent flows. Here we propose a generalized self-similar form that contains two length scales, the integral scale and the Kolmogorov scale, for decaying homogeneous and isotropic turbulence. With the help of the local spectral energy transfer hypothesis of Pao (Phys. Fluids, 1965), we derive and solve for the explicit forms of the energy spectrum and the energy transfer function, from which the second- and third-order velocity structure functions can also be obtained. We check and verify our assumptions by direct numerical simulations (DNS), and our solutions for the velocity structure functions compare well with hot-wire measurements of high-Reynolds-number wind-tunnel turbulence. Financial support from NSFC under Grant Number 11672157, from the Alexander von Humboldt Foundation, and from the MPG is gratefully acknowledged.
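    Pao's local spectral transfer hypothesis, invoked in this abstract, already yields a closed-form model spectrum in the classical single-cutoff case; the two-scale generalization of the paper is not reproduced here. A sketch, with the Kolmogorov constant C assumed:

```python
import numpy as np

def pao_spectrum(k, eps, nu, C=1.5):
    """Pao (1965) model spectrum: Kolmogorov k^(-5/3) inertial range with an
    exponential cutoff at the Kolmogorov scale eta = (nu^3/eps)^(1/4).
    eps is the dissipation rate, nu the kinematic viscosity."""
    eta = (nu**3 / eps) ** 0.25
    return C * eps**(2.0 / 3.0) * k**(-5.0 / 3.0) \
        * np.exp(-1.5 * C * (k * eta)**(4.0 / 3.0))
```

    For wavenumbers well below 1/eta the exponential factor is negligible and the spectrum reduces to the inertial-range k^(-5/3) law.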

  10. Linking Financial Market Dynamics and the Impact of News

    Science.gov (United States)

    Nacher, J. C.; Ochiai, T.

    2011-09-01

    In financial markets, the behavior of investors determines the prices of financial products. However, these investors can also be influenced by good and bad news. Here, we present a mathematical model to reproduce the price dynamics in real financial markets affected by news. The model has both positive and negative feedback mechanisms. Furthermore, the behavior of the model is examined by considering two different types of noise. Our results show that the dynamic balance of positive and negative feedback mechanisms, together with the noise effect, determines the asset price movement. For comparison with the real market, we have used Forex data corresponding to the time period of the recent Tohoku-Kanto earthquake in Japan.

  11. Earthquake history of the Republic of Ragusa (today Dubrovnik, Croatia) (Invited)

    Science.gov (United States)

    Albini, P.; Rovida, A.; Locati, M.

    2009-12-01

    Among the towns constellating the Dalmatian coast, Ragusa (today Dubrovnik, Croatia), stands out, both because of its location in the middle of the Eastern Adriatic coast and its long-lasting, independent history of a Modern Age town and its small coastal territory. An important intelligence crossroads, squeezed as it was in between powerful and influential neighbours, such as the Ottoman Empire and the Republic of Venice, in its history (1358-1808) the Republic of Ragusa did experience heavily damaging earthquakes. We narrate the story of these earthquakes, which were recorded in the historical documentation of the Republic (today stored at the State Archives of Dubrovnik - Drzavni arhiv u Dubrovniku) as well as in documents from officers of other Mediterranean countries and letters of individuals. Of special note is the 6 April 1667 earthquake, which inflicted a permanent scar on the Republic. The earthquake's direct effects and their consequences caused a serious financial crisis, so critical that it took over 50 years for Ragusa to recover. This large earthquake is reappraised on the basis of newly investigated sources, and effects of the damage within the city walls are detailed. A seismic history of Ragusa is finally proposed, supported by full-text coeval records.

  12. Priorities, concerns and unmet needs among Haitians in Boston after the 2010 earthquake.

    Science.gov (United States)

    Allen, Jennifer D; Leyva, Bryan; Hilaire, Dany M; Reich, Amanda J; Martinez, Linda Sprague

    2016-11-01

    In January 2010, a massive earthquake struck Haiti. The devastation not only affected those living in Haiti at the time but also those Haitians living in the United States (U.S.). Few studies have assessed the degree of impact of the earthquake in U.S. Haitian communities. The purpose of this study was to elicit information about health priorities, concerns and resources needed to improve the delivery of health and social care for Haitians in Boston, MA. We conducted six focus groups among 78 individuals in the spring of 2011. Participants were recruited through community organisations, including churches, Haitian social service centres, restaurants and by word of mouth. Analysis of qualitative data revealed an enormous psychological, emotional, financial and physical toll experienced by Boston-area Haitians following the earthquake. Participants described increased distress, depressive episodes, headaches and financial hardship. They also noted insufficient resources to meet the increased needs of those living in the U.S., and those who had immigrated after the earthquake. Most participants cited an increased need for mental health services, as well as assistance with finding employment, navigating the immigration system, and balancing the health and financial needs of families in the U.S. and in Haiti. Despite this, many reported that the tragedy created a sense of unity and solidarity within the Haitian community. These findings corroborate the need for culturally and linguistically appropriate mental health services, as well as for employment, immigration and healthcare navigation services. Participants suggested that interventions be offered through Haitian radio and television stations, as well as group events held in churches. Further research should assess the need for and barriers to utilisation of mental health services among the Haitian community. A multi-faceted approach that includes a variety of outreach strategies implemented through multiple

  13. Defeating Earthquakes

    Science.gov (United States)

    Stein, R. S.

    2012-12-01

    our actions. Using these global datasets will help to make the model as uniform as possible. The model must be built by scientists in the affected countries with GEM's support, augmented by their insights and data. The model will launch in 2014; to succeed it must be open, international, independent, and continuously tested. But the mission of GEM is not just to estimate the likelihood of ground shaking, but also to gauge the economic and social consequences of earthquakes, which greatly amplify the losses. For example, should the municipality of Istanbul retrofit schools, or increase its insurance reserves and recovery capacity? Should a homeowner in a high-risk area move or strengthen her building? This is why GEM is a public-private partnership. GEM's fourteen public sponsors and eight non-governmental organization members are standing for the developing world. To extend GEM into the financial world, we draw upon the expertise of companies. GEM's ten private sponsors have endorsed the acquisition of public knowledge over private gain. In a competitive world, this is a courageous act. GEM is but one link in a chain of preparedness: from earth science and engineering research, through groups like GEM, to mitigation, retrofit or relocate decisions, building codes and insurance, and finally to prepared hospitals, schools, and homes. But it is a link that our community can make strong.

  14. Earthquake research for the safer siting of critical facilities

    Energy Technology Data Exchange (ETDEWEB)

    Cluff, J.L. (ed.)

    1980-01-01

    The task of providing the necessities for living, such as adequate electrical power, water, and fuel, is becoming more complicated with time. Some of the facilities that provide these necessities would present potential hazards to the population if serious damage were to occur to them during earthquakes. Other facilities must remain operable immediately after an earthquake to provide life-support services to people who have been affected. The purpose of this report is to recommend research that will improve the information available to those who must decide where to site these critical facilities, and thereby mitigate the effects of the earthquake hazard. The term critical facility is used in this report to describe facilities that could seriously affect the public well-being through loss of life, large financial loss, or degradation of the environment if they were to fail. The term critical facility also is used to refer to facilities that, although they pose a limited hazard to the public, are considered critical because they must continue to function in the event of a disaster so that they can provide vital services.

  15. Small College Guide to Financial Health: Weathering Turbulent Times [with CD-ROM]

    Science.gov (United States)

    Townsley, Michael K.

    2009-01-01

    In this timely book, financial consultant and experienced college administrator Mike Townsley examines the financial and strategic resources that private colleges and universities must have in place to withstand the storm. Small college presidents, CFOs, planners, chief academic officers, and board members all have a hand on the tiller and will…

  16. The U.S. Money Market and the Term Auction Facility in the Financial Crisis of 2007–2009

    OpenAIRE

    Tao Wu

    2011-01-01

    The interbank money market in the United States and Europe became turbulent during the financial crisis of 2007–2009, with the counterparty default risk premiums and liquidity premiums of short-term financing among major financial institutions rising sharply to unprecedented levels. Using various measures of macroeconomic and financial risks, I find that the surges in counterparty risk premiums were predominantly driven by heightened uncertainties about the macroeconomy and financial market,...

  17. Earthquakes

    Science.gov (United States)

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  18. Tokyo Metropolitan Earthquake Preparedness Project - A Progress Report

    Science.gov (United States)

    Hayashi, H.

    2010-12-01

    Munich Re once ranked the Tokyo metropolitan region, the capital of Japan, as the most vulnerable area in the world for earthquake disasters, followed by the San Francisco Bay Area, US, and Osaka, Japan. Seismologists also predict that the Tokyo metropolitan region may have at least one near-field earthquake with a probability of 70% within the next 30 years. Given this prediction, the Japanese Government took it seriously, conducted damage estimations, and revealed that, as the worst-case scenario, if a magnitude 7.3 earthquake struck under heavy winds (as shown in Fig. 1), it would kill a total of 11,000 people, and total direct and indirect losses would amount to 112,000,000,000,000 yen (about US$1,300,000,000,000 at US$1 = 85 yen). In addition to mortality and financial losses, a total of 25 million people across four prefectures would be severely impacted by this earthquake. If this earthquake occurs, 300,000 elevators will stop suddenly, and 12,500 persons would be confined in them for a long time. Seven million people would come to use the over 20,000 public shelters spread across the impacted area. Over one million temporary housing units would have to be built to accommodate the 4.6 million people who lost their dwellings, and 2.5 million people would relocate outside the damaged area. In short, an unprecedented scale of earthquake disaster is expected, and we must prepare for it. Even though disaster mitigation is undoubtedly the best solution, it is more realistic to assume that the expected earthquake will hit before this work is complete. In other words, we must consider another solution to make the people and the assets in this region more resilient to the Tokyo metropolitan earthquake. This is the question we have been tackling for the last four years. To increase societal resilience to the Tokyo metropolitan earthquake, we adopted a holistic approach that integrates both emergency response and long-term recovery. There are three goals for long-term recovery, which consists of Physical recovery, Economic

  19. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    International Nuclear Information System (INIS)

    Saragoni, G. Rodolfo

    2008-01-01

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus Messina-Reggio Calabria 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some still-existing buildings that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods in use today. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile, 1985, Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is done in the neighborhood of Valparaiso harbor. In this study, the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since for the 1985 earthquake accelerograms were recorded both at El Almendral soil conditions and in rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended to use more suitable instrumental parameters in the future, such as the destructiveness potential factor, to describe earthquake demand

  20. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  1. Fractal Markets Hypothesis and the Global Financial Crisis: Wavelet Power Evidence

    Science.gov (United States)

    Kristoufek, Ladislav

    2013-10-01

    We analyze whether the prediction of the fractal markets hypothesis about a dominance of specific investment horizons during turbulent times holds. To do so, we utilize continuous wavelet transform analysis and obtain wavelet power spectra, which give crucial information about the distribution of variance across scales and its evolution in time. We show that the most turbulent times of the Global Financial Crisis can be very well characterized by the dominance of short investment horizons, which is in line with the assertions of the fractal markets hypothesis.
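The wavelet power spectrum at the heart of this analysis can be sketched in a few lines. Below is a minimal Morlet-wavelet implementation via FFT convolution; it is our own simplification (the normalization only approximately follows the common convention) and not the paper's code.

```python
import numpy as np

def wavelet_power(x, scales, dt=1.0, omega0=6.0):
    """Continuous wavelet power spectrum with a Morlet mother wavelet,
    computed by multiplying in Fourier space. Rows are scales, columns
    are time; larger values mark where variance lives at that scale."""
    n = len(x)
    xf = np.fft.fft(x - np.mean(x))
    omega = 2 * np.pi * np.fft.fftfreq(n, dt)
    power = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        # Fourier transform of the (analytic) Morlet wavelet at scale s
        psi_hat = (np.pi ** -0.25) * np.sqrt(2 * np.pi * s / dt) \
                  * np.exp(-0.5 * (s * omega - omega0) ** 2) * (omega > 0)
        w = np.fft.ifft(xf * np.conj(psi_hat))
        power[i] = np.abs(w) ** 2
    return power

# Example: a pure sine of period 32 samples concentrates power near one scale.
sig = np.sin(2 * np.pi * np.arange(512) / 32)
scales = np.arange(4, 64, dtype=float)
power = wavelet_power(sig, scales)
```

For omega0 = 6 the scale of peak power relates to the Fourier period roughly as scale ≈ period × (omega0 + √(2 + omega0²)) / (4π), i.e. near scale 31 for period 32.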

  2. Turbulence

    CERN Document Server

    Bailly, Christophe

    2015-01-01

    This book covers the major problems of turbulence and turbulent processes, including  physical phenomena, their modeling and their simulation. After a general introduction in Chapter 1 illustrating many aspects dealing with turbulent flows, averaged equations and kinetic energy budgets are provided in Chapter 2. The concept of turbulent viscosity as a closure of the Reynolds stress is also introduced. Wall-bounded flows are presented in Chapter 3, and aspects specific to boundary layers and channel or pipe flows are also pointed out. Free shear flows, namely free jets and wakes, are considered in Chapter 4. Chapter 5 deals with vortex dynamics. Homogeneous turbulence, isotropy, and dynamics of isotropic turbulence are presented in Chapters 6 and 7. Turbulence is then described both in the physical space and in the wave number space. Time dependent numerical simulations are presented in Chapter 8, where an introduction to large eddy simulation is offered. The last three chapters of the book summarize remarka...

  3. An alternative way to track the hot money in turbulent times

    Science.gov (United States)

    Sensoy, Ahmet

    2015-02-01

    During recent years, networks have proven to be an efficient way to characterize and investigate a wide range of complex financial systems. In this study, we first obtain the dynamic conditional correlations between filtered exchange rates (against the US dollar) of several countries and introduce a time-varying threshold correlation level to define dynamic strong correlations between these exchange rates. Then, using the evolving networks obtained from strong correlations, we propose an alternative approach to track the hot money in turbulent times. The approach is demonstrated for the time period including the financial turmoil of 2008. Other applications are also discussed.

  4. Research on the trend of Yen exchange rate and international crude oil price fluctuation affected by Japan’s earthquake

    Directory of Open Access Journals (Sweden)

    Xiaoguang Li

    2014-05-01

    Full Text Available Purpose: Whether this earthquake would become a turning point for the high oil price, and whether it would have a big impact on the yen exchange rate, are the two issues discussed in this paper. Design/methodology/approach: To analyze in depth the internal relations between the changes in the yen exchange rate caused by Japan's earthquake and the price fluctuation of international crude oil, this research uses the middle rate of the yen exchange rate during the 45 days around Japan's earthquake and international crude oil price data for an empirical study, applying a VAR model and HP trend decomposition to estimate the mutual effect of yen exchange rate changes and international crude oil price fluctuations in this period. Findings: The empirical study with the VAR model and the HP filter decomposition of the yen exchange rate and the international crude oil price during the 45 days around Japan's earthquake found that the fluctuation of the yen exchange rate around the earthquake was one of the main reasons for the drastic fluctuation of the international crude oil price in that period. The fluctuation of the international crude oil price directly triggered by the yen exchange rate accounts for 13.54% of its total variance. There is a long-term interactive relationship between the yen exchange rate and the international crude oil price. The upward trend of the international crude oil price after the earthquake was obvious, while the yen exchange rate remained relatively stable. Originality/value: As economic globalization deepens, the influence of natural disasters on international financial markets and the world economy will become more and more obvious. It is highly instructive to study further the impacts of each kind of natural disaster on international financial markets and the world economy.
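The HP trend decomposition used in the methodology splits a series into a smooth trend and a cyclical component by penalizing the trend's second differences. The following dense-matrix sketch is our own illustration, not the authors' code; the smoothing parameter λ would be tuned for daily exchange-rate data rather than the conventional quarterly value of 1600 used here.

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott decomposition: the trend solves
    (I + lam * D'D) trend = y, where D is the second-difference operator.
    A dense solve is fine for short series like a 45-day window."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = (1.0, -2.0, 1.0)
    trend = np.linalg.solve(np.eye(n) + lam * D.T @ D, y)
    cycle = y - trend
    return trend, cycle

# Example: a purely linear series has zero second differences,
# so the filter returns it unchanged and the cycle is zero.
trend, cycle = hp_filter(np.arange(50.0))
```

The VAR step would then be estimated on the cyclical components (or returns) of the two series to measure their mutual effects.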

  5. Turbulent/non-turbulent interfaces detected in DNS of incompressible turbulent boundary layers

    Science.gov (United States)

    Watanabe, T.; Zhang, X.; Nagata, K.

    2018-03-01

    The turbulent/non-turbulent interface (TNTI) detected in direct numerical simulations is studied for incompressible, temporally developing turbulent boundary layers at momentum thickness Reynolds number Reθ ≈ 2000. The outer edge of the TNTI layer is detected as an isosurface of the vorticity magnitude with the threshold determined with the dependence of the turbulent volume on a threshold level. The spanwise vorticity magnitude and passive scalar are shown to be good markers of turbulent fluids, where the conditional statistics on a distance from the outer edge of the TNTI layer are almost identical to the ones obtained with the vorticity magnitude. Significant differences are observed for the conditional statistics between the TNTI detected by the kinetic energy and vorticity magnitude. A widely used grid setting determined solely from the wall unit results in an insufficient resolution in a streamwise direction in the outer region, whose influence is found for the geometry of the TNTI and vorticity jump across the TNTI layer. The present results suggest that the grid spacing should be similar for the streamwise and spanwise directions. Comparison of the TNTI layer among different flows requires appropriate normalization of the conditional statistics. Reference quantities of the turbulence near the TNTI layer are obtained with the average of turbulent fluids in the intermittent region. The conditional statistics normalized by the reference turbulence characteristics show good quantitative agreement for the turbulent boundary layer and planar jet when they are plotted against the distance from the outer edge of the TNTI layer divided by the Kolmogorov scale defined for turbulent fluids in the intermittent region.

  6. The self-preservation of dissipation elements in homogeneous isotropic decaying turbulence

    Science.gov (United States)

    Gauding, Michael; Danaila, Luminita; Varea, Emilien

    2017-11-01

    The concept of self-preservation has played an important role in shaping the understanding of turbulent flows. The assumption of complete self-preservation imposes certain constraints on the dynamics of the flow, allowing statistics to be expressed by choosing an appropriate unique length scale. Another approach in turbulence research is to study the dynamics of geometrical objects, like dissipation elements (DE). DE appear as coherent space-filling structures in turbulent scalar fields and can be parameterized by the linear length between their ending points. This distance is a natural length scale that provides information about the local structure of turbulence. In this work, the evolution of DE in decaying turbulence is investigated from a self-preservation perspective. The analysis is based on data obtained from direct numerical simulations (DNS). The temporal evolution of DE is governed by a complex process, involving cutting and reconnection events, which change the number and consequently also the length of DE. An analysis of the evolution equation for the probability density function of the length of DE is carried out and leads to specific constraints for the self-preservation of DE, which are verified against DNS. Financial support was provided by Labex EMC3 (under the Grant VAVIDEN), Normandy Region and FEDER.

  7. Turbulent mass transfer in electrochemical systems: Turbulence for electrochemistry, electrochemistry for turbulence

    International Nuclear Information System (INIS)

    Vorotyntsev, M.A.

    1991-01-01

    Key problems of turbulent mass transfer at a solid wall are reviewed: closure problem for the concentration field, information on wall turbulence, applications of microelectrodes to study the structure of turbulence, correlation properties of current fluctuations. (author). 26 refs

  8. Major risks and financial guarantees provided by the State in France

    International Nuclear Information System (INIS)

    Brassard, Guy

    2012-01-01

    France's system for indemnifying damage from natural catastrophes is exemplary, whether for floods, storms, or subsidence. However, France does not have the financial capacity to deal with the damage resulting from an exceptional disaster, such as an earthquake on the Mediterranean coast or a nuclear meltdown. Major catastrophes could today pose a significant risk to the financial stability of the State, because the State is in fact the ultimate insurer of its citizens and its institutions. It would be wise to build up reserves in order to enhance the financial resources of the State and to provide a uniform guarantee covering major risks, whatever the cause of the damage may be. (author)

  9. Characterization and prediction of extreme events in turbulence

    Science.gov (United States)

    Fonda, Enrico; Iyer, Kartik P.; Sreenivasan, Katepalli R.

    2017-11-01

    Extreme events in Nature such as tornadoes, large floods and strong earthquakes are rare but can have devastating consequences. The predictability of these events is very limited at present. Extreme events in turbulence are the very large events in small scales that are intermittent in character. We examine events in energy dissipation rate and enstrophy which are several tens to hundreds to thousands of times the mean value. To this end we use our DNS database of homogeneous and isotropic turbulence with Taylor Reynolds numbers spanning a decade, computed with different small scale resolutions and different box sizes, and study the predictability of these events using machine learning. We start with an aggressive data augmentation to virtually increase the number of these rare events by two orders of magnitude and train a deep convolutional neural network to predict their occurrence in an independent data set. The goal of the work is to explore whether extreme events can be predicted with greater assurance than can be done by conventional methods (e.g., D.A. Donzis & K.R. Sreenivasan, J. Fluid Mech. 647, 13-26, 2010).

  10. Dynamic Asset Allocation Strategies Based on Volatility, Unexpected Volatility and Financial Turbulence

    OpenAIRE

    Grimsrud, David Borkner

    2015-01-01

    Master's thesis in economics and administration, Universitetet i Agder, 2015. This master thesis looks at the predictive ability of unexpected volatility and financial turbulence, and exploits these measures of financial risk, together with volatility, to create three dynamic asset allocation strategies and test whether they can outperform a passive and naively diversified buy-and-hold strategy. The idea behind the dynamic strategies is to increase the portfolio return by keeping the portfolio risk at a low a...

  11. Initiation process of earthquakes and its implications for seismic hazard reduction strategy.

    Science.gov (United States)

    Kanamori, H

    1996-04-30

    For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding.

  12. Turbulent cascades in foreign exchange markets

    Science.gov (United States)

    Ghashghaie, S.; Breymann, W.; Peinke, J.; Talkner, P.; Dodge, Y.

    1996-06-01

    The availability of high-frequency data for financial markets has made it possible to study market dynamics on timescales of less than a day [1]. For foreign exchange (FX) rates, Müller et al. [2] have shown that there is a net flow of information from long to short timescales: the behaviour of long-term traders (who watch the markets only from time to time) influences the behaviour of short-term traders (who watch the markets continuously). Motivated by this hierarchical feature, we have studied FX market dynamics in more detail, and report here an analogy between these dynamics and hydrodynamic turbulence [3-8]. Specifically, the relationship between the probability density of FX price changes (δx) and the time delay (δt) (Fig. 1a) is much the same as the relationship between the probability density of the velocity differences (δv) of two points in a turbulent flow and their spatial separation δr (Fig. 1b). Guided by this similarity we claim that there is an information cascade in FX market dynamics that corresponds to the energy cascade in hydrodynamic turbulence. On the basis of this analogy we can now rationalize the statistics of FX price differences at different time delays, which is important, for example, for option pricing. The analogy also provides a conceptual framework for understanding the short-term dynamics of speculative markets.
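The δt-dependence of the price-change PDF can be illustrated with a toy experiment (our own sketch, not the paper's data): heavy-tailed one-tick "returns" aggregated over longer delays become progressively more Gaussian, so excess kurtosis decays with δt, mimicking the way the shape of the δv distribution changes with separation δr in turbulence.

```python
import numpy as np

def excess_kurtosis(x):
    """Fourth standardized moment minus 3 (zero for a Gaussian)."""
    x = x - x.mean()
    return (x ** 4).mean() / (x ** 2).mean() ** 2 - 3.0

# Heavy-tailed one-tick "returns": a Student-t stand-in for FX tick data.
rng = np.random.default_rng(1)
r = rng.standard_t(df=5, size=200_000)

# The price change over delay dt is the sum of dt consecutive one-tick returns;
# by the central limit theorem the tails thin out as dt grows.
kurt = {}
for dt in (1, 4, 16, 64):
    agg = r[: len(r) // dt * dt].reshape(-1, dt).sum(axis=1)
    kurt[dt] = excess_kurtosis(agg)
```

The decay of `kurt[dt]` with increasing delay is the single-number summary of the "fat tails at short δt, near-Gaussian at long δt" behaviour the abstract describes.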

  13. Large Eddy Simulation of a cooling impinging jet to a turbulent crossflow

    Science.gov (United States)

    Georgiou, Michail; Papalexandris, Miltiadis

    2015-11-01

    In this talk we report on Large Eddy Simulations of a cooling jet impinging into a turbulent channel flow. The impinging jet enters the turbulent stream in an oblique direction. This type of flow is relevant to the so-called ``Pressurized Thermal Shock'' phenomenon that can occur in pressurized water reactors. First we elaborate on issues related to the set-up of the simulations of the flow of interest, such as the imposition of turbulent inflows, the choice of subgrid-scale model, and others. The issue of the commutator error due to the anisotropy of the spatial cut-off filter induced by non-uniform grids is also discussed. In the second part of the talk we present results of our simulations. In particular, we focus on the high-shear and recirculation zones that develop and on the characteristics of the temperature field. The budget for the mean kinetic energy of the resolved-scale turbulent velocity fluctuations is also discussed and analyzed. Financial support has been provided by Bel V, a subsidiary of the Federal Agency for Nuclear Control of Belgium.

  14. Magnetohydrodynamic turbulence

    CERN Document Server

    Biskamp, Dieter

    2003-01-01

    This book presents an introduction to, and modern account of, magnetohydrodynamic (MHD) turbulence, an active field both in general turbulence theory and in various areas of astrophysics. The book starts by introducing the MHD equations, certain useful approximations and the transition to turbulence. The second part of the book covers incompressible MHD turbulence, the macroscopic aspects connected with the different self-organization processes, the phenomenology of the turbulence spectra, two-point closure theory, and intermittency. The third considers two-dimensional turbulence and compressible turbulence.

  15. Reducing financial avalanches by random investments

    Science.gov (United States)

    Biondo, Alessio Emanuele; Pluchino, Alessandro; Rapisarda, Andrea; Helbing, Dirk

    2013-12-01

    Building on similarities between earthquakes and extreme financial events, we use a self-organized criticality-generating model to study herding and avalanche dynamics in financial markets. We consider a community of interacting investors, distributed in a small-world network, who bet on the bullish (increasing) or bearish (decreasing) behavior of the market, which has been specified according to the S&P 500 historical time series. Remarkably, we find that the size of herding-related avalanches in the community can be strongly reduced by the presence of a relatively small percentage of traders, randomly distributed inside the network, who adopt a random investment strategy. Our findings suggest a promising strategy to limit the size of financial bubbles and crashes. We also find that the resulting wealth distribution of all traders corresponds to the well-known Pareto power law, while that of random traders is exponential. In other words, technical traders face a much greater risk of losses relative to the probability of gains than random traders do.
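The damping effect of random traders on herding avalanches can be mimicked with a crude cascade model. This is our own, much simpler stand-in for the paper's self-organized criticality model on a small-world network: a flip spreads to herding neighbours with probability p on a ring, while random traders never join a cascade.

```python
import numpy as np

def avalanche_sizes(n=500, k=4, p=0.3, random_frac=0.0, trials=500, seed=0):
    """Toy herding-cascade model. Traders sit on a ring where each has k
    nearest neighbours. A cascade starts at one random trader; every herding
    neighbour of a newly flipped trader flips with probability p, while
    'random' traders ignore their neighbours and never propagate the cascade.
    Returns the array of cascade sizes over all trials."""
    rng = np.random.default_rng(seed)
    sizes = []
    for _ in range(trials):
        is_random = rng.random(n) < random_frac
        flipped = np.zeros(n, dtype=bool)
        start = int(rng.integers(n))
        flipped[start] = True
        frontier = [start]
        while frontier:
            new_frontier = []
            for i in frontier:
                for d in range(1, k // 2 + 1):
                    for j in ((i + d) % n, (i - d) % n):
                        if not flipped[j] and not is_random[j] and rng.random() < p:
                            flipped[j] = True
                            new_frontier.append(j)
            frontier = new_frontier
        sizes.append(int(flipped.sum()))
    return np.array(sizes)

sizes_no_random = avalanche_sizes(random_frac=0.0)
sizes_half_random = avalanche_sizes(random_frac=0.5)
```

Even in this stripped-down version, blocking cascades at randomly placed nodes shrinks the average avalanche, which is the qualitative effect the paper reports.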

  16. A study of earthquake-induced building detection by object oriented classification approach

    Science.gov (United States)

    Sabuncu, Asli; Damla Uca Avci, Zehra; Sunar, Filiz

    2017-04-01

    Among natural hazards, earthquakes are the most destructive disasters, causing huge loss of life, heavy infrastructure damage, and great financial losses every year all around the world. According to earthquake statistics, more than a million earthquakes occur each year, equivalent to two earthquakes per minute worldwide. Since 2001, natural disasters have caused more than 780,000 deaths, approximately 60% of which were due to earthquakes. A great earthquake took place at 38.75 N, 43.36 E in Van Province, in the eastern part of Turkey, on October 23rd, 2011. 604 people died, and about 4000 buildings were seriously damaged or collapsed in this earthquake. In recent years, the use of the object-oriented classification approach based on different object features, such as spectral, textural, shape and spatial information, has gained importance and become widespread for the classification of high-resolution satellite images and orthophotos. The motivation of this study is to detect the collapsed buildings and debris areas after the earthquake by using very high-resolution satellite images and orthophotos with object-oriented classification, and also to see how well remote sensing technology performed in determining the collapsed buildings. In this study, two different land surfaces were selected as homogeneous and heterogeneous case study areas. In the first step of the application, multi-resolution segmentation was applied and optimum parameters were selected to obtain the objects in each area after testing different color/shape and compactness/smoothness values. In the next step, two different classification approaches, namely "supervised" and "unsupervised", were applied and their classification performances were compared. Object-Based Image Analysis (OBIA) was performed using e-Cognition software.

  17. The Phenomenon of Financial Economics: Russia and the World Are in Current Global Turbulence

    Directory of Open Access Journals (Sweden)

    Valentine P. Akinina

    2009-12-01

    Full Text Available The article deals with the analysis of the current situation in the global financial arena, analyzing the chain of cause and effect behind the origins of the economic crisis and providing its possible logical outcomes. We try to prove here that the way the world economic situation develops will lead to either further growth or stagnation of national economies and will define their position in the global business, financial, and social spheres.

    We provide an analysis of the serious transformations financial economics has been undergoing at the end of the 20th and beginning of the 21st centuries. All these changes, such as the development of international mergers on financial markets, the creation of new financial instruments, products and services, and others, have been caused largely by (and also have led to) significant events in the global political arena. However, regardless of the transformations, world leadership remains in the hands of US government and business and that of their closest partners, while those societies that are not willing to support the “Americanized” world order end up on the blacklist of the World Bank, the IMF, and other international financial institutions.

    Finally, the article provides our views of the possible ways of dealing with the global economic stagnation. We highlight the importance of the strong and careful supervision of any global as well as national financial activities, the education of the public on the issues of wise investments, and the dangers of living on credit.

  18. Earthquake Culture: A Significant Element in Earthquake Disaster Risk Assessment and Earthquake Disaster Risk Management

    OpenAIRE

    Ibrion, Mihaela

    2018-01-01

    This book chapter brings to attention the dramatic impact of large earthquake disasters on local communities and society and highlights the necessity of building and enhancing the earthquake culture. Iran was considered as a research case study and fifteen large earthquake disasters in Iran were investigated and analyzed over more than a century-time period. It was found that the earthquake culture in Iran was and is still conditioned by many factors or parameters which are not integrated and...

  19. Structure and Connectivity Analysis of Financial Complex System Based on G-Causality Network

    Science.gov (United States)

    Xu, Chuan-Ming; Yan, Yan; Zhu, Xiao-Wu; Li, Xiao-Teng; Chen, Xiao-Song

    2013-11-01

    The recent financial crisis highlights the inherent weaknesses of the financial market. To explore the mechanism that maintains the financial market as a system, we study the interactions of the U.S. financial market from the network perspective. Applying conditional Granger causality network analysis, we use network density and in-degree and out-degree rankings as indicators to analyze the conditional causal relationships among financial agents and, further, to assess the stability of the U.S. financial system. It is found that the topological structure of the G-causality network in the U.S. financial market changed in different stages over the last decade, especially during the recent global financial crisis. Network density of the G-causality model is much higher during the 2007-2009 crisis stage, reaching its peak value in 2008, the most turbulent time of the crisis. Ranked by in-degree and out-degree, insurance companies appear at the top of the 68 financial institutions during the crisis. They act as hubs which are more easily influenced by other financial institutions and simultaneously influence others during the global financial disturbance.
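The network construction described above can be sketched in a few lines. This is an illustrative *pairwise* Granger test with a single lag and a hypothetical F-statistic threshold `f_crit`, not the authors' conditional G-causality pipeline; the toy series are assumptions.

```python
import numpy as np

def granger_stat(x, y, lag=1):
    # F-statistic for "x Granger-causes y": does adding x's past to an AR
    # model of y significantly reduce the residual sum of squares?
    # (Single-lag version for brevity.)
    n = len(y) - lag
    Y = y[lag:]
    X_r = np.column_stack([np.ones(n), y[:-lag]])             # restricted model
    X_f = np.column_stack([np.ones(n), y[:-lag], x[:-lag]])   # full model
    rss_r = np.sum((Y - X_r @ np.linalg.lstsq(X_r, Y, rcond=None)[0]) ** 2)
    rss_f = np.sum((Y - X_f @ np.linalg.lstsq(X_f, Y, rcond=None)[0]) ** 2)
    return (rss_r - rss_f) / (rss_f / (n - X_f.shape[1]))

def network_metrics(series, f_crit=4.0):
    # Draw a directed edge i -> j when the F-statistic clears f_crit, then
    # report network density plus in-degree and out-degree per node.
    m = len(series)
    adj = np.zeros((m, m), dtype=int)
    for i in range(m):
        for j in range(m):
            if i != j and granger_stat(series[i], series[j]) > f_crit:
                adj[i, j] = 1
    density = adj.sum() / (m * (m - 1))
    return adj, density, adj.sum(axis=0), adj.sum(axis=1)

# Toy three-node "market": series 0 drives series 1; series 2 is independent.
rng = np.random.default_rng(42)
n = 500
s0 = rng.normal(size=n)
s1 = np.zeros(n)
for t in range(1, n):
    s1[t] = 0.5 * s1[t - 1] + 0.8 * s0[t - 1] + 0.2 * rng.normal()
s2 = rng.normal(size=n)
adj, density, indeg, outdeg = network_metrics([s0, s1, s2])
```

On this toy market the strong 0 → 1 edge is recovered, and the density and degree vectors are the same quantities the abstract uses to rank institutions.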

  20. Stirring turbulence with turbulence

    NARCIS (Netherlands)

    Cekli, H.E.; Joosten, R.; van de Water, W.

    2015-01-01

    We stir wind-tunnel turbulence with an active grid that consists of rods with attached vanes. The time-varying angle of these rods is controlled by random numbers. We study the response of turbulence on the statistical properties of these random numbers. The random numbers are generated by the

  1. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    Science.gov (United States)

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the study of (1) earthquakes; (2) the origin, propagation, and energy of seismic phenomena; (3) the prediction of these phenomena; and (4) the structure of the earth. Earthquake engineering or engineering seismology includes the…

  2. The new international financial crisis: causes, consequences and perspectives

    Directory of Open Access Journals (Sweden)

    Flavio Vilela Vieira

    2011-06-01

    Full Text Available The paper investigates the recent financial crisis within a historical and comparative perspective, keeping in mind that it is ultimately a confidence crisis, initially associated with a chain of high-risk loans and financial innovations that spread throughout the international system, culminating in massive wealth losses. The financial market will eventually recover from the crisis, but the outcome should be followed by a different and more disciplined set of international institutions. There will be a change in how we perceive the widespread liberal argument that the market is always efficient, or at least more efficient than any State intervention, overcoming the false perception that the State stands in opposition to the market. A deep financial crisis brings a period of wealth losses and an adjustment process characterized by price corrections (commodity and equity price deflation) and real effects (recession and lower employment); a period of turbulence and the end of illusions is in place.

  3. Analog earthquakes

    International Nuclear Information System (INIS)

    Hofmann, R.B.

    1995-01-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository

  4. Superfluid turbulence

    International Nuclear Information System (INIS)

    Donnelly, R.J.

    1988-01-01

    Most flows of fluids, in nature and in technology, are turbulent. Since much of the energy expended by machines and devices that involve fluid flows is spent in overcoming drag caused by turbulence, there is a strong motivation to understand the phenomena. Surprisingly, the peculiar, quantum-mechanical form of turbulence that can form in superfluid helium may turn out to be much simpler to understand than the classical turbulence that forms in normal fluids. It now seems that the study of superfluid turbulence may provide simplified model systems for studying some forms of classical turbulence. There are also practical motivations for studying superfluid turbulence. For example, superfluid helium is often used as a coolant in superconducting machinery. Superfluid turbulence is the primary impediment to the transfer of heat by superfluid helium; an understanding of the phenomena may make it possible to design more efficient methods of refrigeration for superconducting devices. 8 figs

  5. Turbulent premixed flames on fractal-grid-generated turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Soulopoulos, N; Kerl, J; Sponfeldner, T; Beyrau, F; Hardalupas, Y; Taylor, A M K P [Mechanical Engineering Department, Imperial College London, London SW7 2AZ (United Kingdom); Vassilicos, J C, E-mail: ns6@ic.ac.uk [Department of Aeronautics, Imperial College London, London SW7 2AZ (United Kingdom)

    2013-12-15

    A space-filling, low blockage fractal grid is used as a novel turbulence generator in a premixed turbulent flame stabilized by a rod. The study compares the flame behaviour with a fractal grid to the behaviour when a standard square mesh grid with the same effective mesh size and solidity as the fractal grid is used. The isothermal gas flow turbulence characteristics, including mean flow velocity, rms of velocity fluctuations and Taylor length, were evaluated from hot-wire measurements. The behaviour of the flames was assessed with direct chemiluminescence emission from the flame and high-speed OH-laser-induced fluorescence. The characteristics of the two flames are considered in terms of turbulent flame thickness, local flame curvature and turbulent flame speed. It is found that, for the same flow rate and stoichiometry and at the same distance downstream of the location of the grid, fractal-grid-generated turbulence leads to a more turbulent flame with enhanced burning rate and increased flame surface area. (paper)

  6. Results of subionospheric radio LF monitoring prior to the Tokachi (M=8, Hokkaido, 25 September 2003 earthquake

    Directory of Open Access Journals (Sweden)

    A. V. Shvets

    2004-01-01

    Full Text Available Results of simultaneous LF subionospheric monitoring over two different propagation paths prior to the very strong Tokachi earthquake (near the east coast of Hokkaido Island, 25 September 2003) of magnitude 8.3 are presented for the first time. Nighttime amplitude fluctuations of the Japanese Time Standard Transmitter (JG2AS, 40kHz) signal received at Moshiri (Japan, 142°E, 44°N) and at Petropavlovsk-Kamchatski (Russia, 158°E, 53°N) were analyzed. As a possible precursory signature we observed synchronous intensification of quasi-periodic 16-day variations of the dispersion in the signals received at both observation stations before the earthquake. The strongest deviations observed were, as a rule, depletions of signal amplitude, probably connected with an increase of loss in the ionosphere through the enhancement of turbulence caused by the dissipation of internal gravity waves (IGW) at lower ionosphere heights. A scheme of seismo-IGW-planetary wave (PW) interconnection is put forward to explain the observed connection with strong earthquakes; it accounts for the seasonal variability in the signal.

  7. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

    Science.gov (United States)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering incurred building losses in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140 and 300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, the purchase of larger re-insurance covers and the development of a claim processing system. Also, to avoid adverse selection, penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model, losses will not be indemnified but will be calculated directly on the basis of indexed ground motion levels and damages. The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing
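The parametric scheme proposed above pays out on a measured ground-motion index rather than on assessed damage. A minimal sketch of such a trigger function follows; the peak-ground-acceleration tiers and payout fractions are entirely hypothetical illustrations, not TCIP's actual or proposed terms.

```python
def parametric_payout(pga_g, limit):
    # Illustrative tiered payout indexed to measured peak ground acceleration
    # (in units of g). No claim adjustment: the index alone sets the payment.
    if pga_g >= 0.40:
        return float(limit)        # full policy limit
    if pga_g >= 0.25:
        return 0.5 * limit
    if pga_g >= 0.15:
        return 0.25 * limit
    return 0.0                     # below the trigger: no payout

print(parametric_payout(0.30, 100000))  # → 50000.0
```

Because the payout depends only on the published shaking index, claims can be settled as soon as the ground-motion data are in, which is the speed advantage the abstract points to.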

  8. Earthquake potential revealed by tidal influence on earthquake size-frequency statistics

    Science.gov (United States)

    Ide, Satoshi; Yabe, Suguru; Tanaka, Yoshiyuki

    2016-11-01

    The possibility that tidal stress can trigger earthquakes has long been debated. In particular, a clear causal relationship between small earthquakes and the phase of tidal stress is elusive. However, tectonic tremors deep within subduction zones are highly sensitive to tidal stress levels, with tremor rate increasing at an exponential rate with rising tidal stress. Thus, slow deformation and the possibility of earthquakes at subduction plate boundaries may be enhanced during periods of large tidal stress. Here we calculate the tidal stress history, and specifically the amplitude of tidal stress, on a fault plane in the two weeks before large earthquakes globally, based on data from the global, Japanese, and Californian earthquake catalogues. We find that very large earthquakes, including the 2004 Sumatra earthquake, the 2010 Maule earthquake in Chile and the 2011 Tohoku-Oki earthquake in Japan, tend to occur near the time of maximum tidal stress amplitude. This tendency is not obvious for small earthquakes. However, we also find that the fraction of large earthquakes increases (the b-value of the Gutenberg-Richter relation decreases) as the amplitude of tidal shear stress increases. The relationship is also reasonable, considering the well-known relationship between stress and the b-value. This suggests that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. We conclude that large earthquakes are more probable during periods of high tidal stress.
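The b-value invoked above is commonly estimated with Aki's maximum-likelihood formula, b = log10(e) / (mean(M) - Mc). The sketch below is a standard textbook estimator (not the authors' code) that recovers a known b from a synthetic Gutenberg-Richter catalogue; the catalogue itself is made up.

```python
import math
import random

def b_value(mags, mc):
    # Aki (1965) maximum-likelihood estimate of the Gutenberg-Richter b-value,
    # using only events at or above the completeness magnitude mc.
    sel = [m for m in mags if m >= mc]
    mean_m = sum(sel) / len(sel)
    return math.log10(math.e) / (mean_m - mc)

# Synthetic catalogue: above Mc, magnitude excess M - Mc is exponentially
# distributed with rate beta = b * ln(10); recovering b checks the estimator.
random.seed(0)
b_true, mc = 1.0, 2.0
catalogue = [mc + random.expovariate(b_true * math.log(10)) for _ in range(50000)]
print(round(b_value(catalogue, mc), 2))
```

Applied to subsets of a catalogue binned by tidal shear-stress amplitude, this estimator is the kind of tool that would expose the b-value decrease the abstract reports.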

  9. Turbulence closure: turbulence, waves and the wave-turbulence transition – Part 1: Vanishing mean shear

    Directory of Open Access Journals (Sweden)

    H. Z. Baumert

    2009-03-01

    Full Text Available This paper extends a turbulence-closure-like model for stably stratified flows into a new dynamic domain in which turbulence is generated by internal gravity waves rather than mean shear. The balance of turbulent kinetic energy (TKE, K), the model's first equation, incorporates a term for the energy transfer from internal waves to turbulence. This energy source is in addition to the traditional shear production. The second variable of the new two-equation model is the turbulent enstrophy (Ω). Compared to the traditional shear-only case, the Ω-equation is modified to account for the effect of the waves on the turbulence time and space scales. This modification is based on the assumption of a non-zero constant flux Richardson number in the limit of vanishing mean shear, when turbulence is produced exclusively by internal waves. This paper is part 1 of a continuing theoretical development. It accounts for mean-shear- and internal-wave-driven mixing only in the two limits of mean shear without waves and waves without mean shear, respectively.

    The new model reproduces the wave-turbulence transition analyzed by D'Asaro and Lien (2000b). At small energy density E of the internal wave field, the turbulent dissipation rate ε scales like ε ~ E². This is what is observed in the deep sea. With increasing E, after the wave-turbulence transition has been passed, the scaling changes to ε ~ E¹. This is observed, for example, in the highly energetic tidal flow near a sill in Knight Inlet. The new model further exhibits a turbulent length scale proportional to the Ozmidov scale, as observed in the ocean, and predicts the ratio between the turbulent Thorpe and Ozmidov length scales well within the range observed in the ocean.

  10. Global Financial Governance: a Perspective from the International Monetary Fund

    Directory of Open Access Journals (Sweden)

    Ryszard Wilczyński

    2011-03-01

    Full Text Available The environment for the activities of the International Monetary Fund (the IMF) has fundamentally changed over the two recent decades. The strong development of financial innovations and of financial globalisation was among the major forces driving the change and shaping economic growth worldwide. While some economies were able, with support from financial markets, to accelerate their growth, other countries suffered from turbulences, which were reinforced and transferred internationally through the volatile financial markets. The process of international financial contagion makes the case for global financial governance, which so far has lagged behind the development of markets. The IMF is mandated to play a central role in the global governance designed to ensure financial stability. The article reconsiders the Fund's role and includes an overview and assessment of its activities, particularly in the context of the global financial crisis of 2007-2010. In the aftermath of this crisis, international financial stability may, however, again be at risk, as several external imbalances in the global economy may be hardly sustainable. It is argued in the paper that, in addition to gradually improving surveillance and lending and to adjusting the Fund's resources, enhanced credibility of the institution is needed so that its role in stabilising the global financial system is strong and effective.

  11. The Need for More Earthquake Science in Southeast Asia

    Science.gov (United States)

    Sieh, K.

    2015-12-01

    Many regions within SE Asia have as great a density of active seismic structures as does the western US - Sumatra, Myanmar, Bangladesh, New Guinea and the Philippines come first to mind. Much of Earth's release of seismic energy in the current millennium has, in fact, come from these regions, with great losses of life and livelihoods. Unfortunately, the scientific progress upon which seismic-risk reduction in SE Asia ultimately depends has been and continues to be slow. Last year at AGU, for example, I counted 57 talks about the M6 Napa earthquake. In contrast, I can't recall hearing any talk on a SE Asian M6 earthquake at any venue in the past many years. In fact, even M7+ earthquakes often go unstudied. Not uncommonly, the region's earthquake scientists face high financial and political impediments to conducting earthquake research. Their slow speed in the development of scientific knowledge doesn't bode well for speedy progress in the science of seismic hazards, the sine qua non for substantially reducing seismic risk. There are two basic necessities for the region to evolve significantly from the current state of affairs. Both involve the development of regional infrastructure: 1) Data: Robust and accessible geophysical monitoring systems would need to be installed, maintained and utilized by the region's earth scientists and their results shared internationally. Concomitantly, geological mapping (sensu lato) would need to be undertaken. 2) People: The training, employment, and enduring support of a new, young, international corps of earth scientists would need to accelerate markedly. The United States could play an important role in achieving the goal of significant seismic risk reduction in the most seismically active countries of SE Asia by taking the lead in establishing a coalition to robustly fund a multi-decadal program that supports scientists and their research institutions to work alongside local expertise.

  12. Turbulence modulation induced by interaction between a bubble swarm and decaying turbulence in oscillating-grid turbulence

    International Nuclear Information System (INIS)

    Imaizumi, Ryota; Morikawa, Koichi; Higuchi, Masamori; Saito, Takayuki

    2009-01-01

    In this study, the interaction between a bubble swarm and homogeneous isotropic turbulence was experimentally investigated. The objective is to clarify the turbulence modulation induced by the interaction between the bubble swarm and homogeneous isotropic turbulence without mean flow. In order to simultaneously generate ideally homogeneous isotropic turbulence and a sufficiently controlled bubble swarm, we employed both an oscillating grid and bubble generators equipped with audio speakers. First, the homogeneous isotropic turbulence was formed by operating the oscillating grid in a cylindrical acrylic pipe (height: 600 mm, inner diameter: 149 mm) filled with ion-exchanged and degassed water. Second, we stopped the oscillating grid at an arbitrary time after the homogeneous isotropic turbulence was achieved. A few moments later, the controlled bubble swarm (number of bubbles: 3, average equivalent bubble diameter: 3 mm, bubble Reynolds number: 859, Weber number: 3.48) was launched into the decaying turbulence described above, using the bubble generators. The bubble formation, bubble size and bubble-launch timing are controlled arbitrarily and precisely by this device. In this study, we conducted the following experiments: 1) measurement of the motion of bubbles in still water and in oscillating-grid turbulence via high-speed visualization, 2) measurement of the liquid-phase motion around the bubbles in still water via a PIV system with the LIF method, 3) measurement of the liquid-phase motion around the bubbles in oscillating-grid turbulence via a PIV system with the LIF method. In the visualization of the liquid-phase motion in both experiments, two high-speed video cameras were employed in order to simultaneously film large- and small-scale interrogation areas. The liquid-phase ambient turbulence hastened the change of the bubble motion from zigzag mode to spiral mode. The interaction between the bubble swarm and the liquid-phase turbulence increased the decay rate of the turbulence. (author)

  13. Earthquake, GIS and multimedia. The 1883 Casamicciola earthquake

    Directory of Open Access Journals (Sweden)

    M. Rebuffat

    1995-06-01

    Full Text Available A series of multimedia monographs concerning the main seismic events that have affected the Italian territory are in the process of being produced for the Documental Integrated Multimedia Project (DIMP) started by the Italian National Seismic Survey (NSS). The purpose of the project is to reconstruct the historical record of earthquakes and promote earthquake public education. Producing the monographs, developed in ARC/INFO and working in UNIX, involved designing a special filing and management methodology to integrate heterogeneous information (images, papers, cartographies, etc.). This paper describes the possibilities of a GIS (Geographic Information System) in the filing and management of documental information. As an example we present the first monograph, on the 1883 Casamicciola earthquake on the island of Ischia (Campania, Italy). This earthquake is particularly interesting for the following reasons: 1) its historical-cultural context (the first destructive seismic event after the unification of Italy); 2) its features (a volcanic earthquake); 3) the socioeconomic consequences it caused at such an important seaside resort.

  14. Structure and Connectivity Analysis of Financial Complex System Based on G-Causality Network

    International Nuclear Information System (INIS)

    Xu Chuan-Ming; Yan Yan; Zhu Xiao-Wu; Li Xiao-Teng; Chen Xiao-Song

    2013-01-01

    The recent financial crisis highlights the inherent weaknesses of the financial market. To explore the mechanism that maintains the financial market as a system, we study the interactions of the U.S. financial market from the network perspective. Applying conditional Granger causality network analysis, we use network density and in-degree and out-degree rankings as indicators to analyze the conditional causal relationships among financial agents and, further, to assess the stability of the U.S. financial system. It is found that the topological structure of the G-causality network in the U.S. financial market changed in different stages over the last decade, especially during the recent global financial crisis. Network density of the G-causality model is much higher during the 2007–2009 crisis stage, reaching its peak value in 2008, the most turbulent time of the crisis. Ranked by in-degree and out-degree, insurance companies appear at the top of the 68 financial institutions during the crisis. They act as hubs which are more easily influenced by other financial institutions and simultaneously influence others during the global financial disturbance. (interdisciplinary physics and related areas of science and technology)

  15. OMG Earthquake! Can Twitter improve earthquake response?

    Science.gov (United States)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information

  16. Earthquake Early Warning Systems

    OpenAIRE

    Pei-Yang Lin

    2011-01-01

    Because of Taiwan’s unique geographical environment, earthquake disasters occur frequently in Taiwan. The Central Weather Bureau collated earthquake data from 1901 to 2006 (Central Weather Bureau, 2007) and found that 97 earthquakes had occurred, of which 52 resulted in casualties. The 921 Chichi Earthquake had the most profound impact. Because earthquakes have instant destructive power and current scientific technologies cannot provide precise early warnings in advance, earthquake ...

  17. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
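The short-term-average / long-term-average (STA/LTA) detector named in the abstract can be sketched on a per-minute tweet-count series. The window lengths and trigger threshold below are illustrative choices, not the USGS's tuned values.

```python
def sta_lta(counts, n_sta=2, n_lta=20, eps=1e-9):
    # Classic STA/LTA ratio over a per-minute count series: a short trailing
    # average reacts to bursts faster than the long trailing average does.
    ratios = []
    for i in range(len(counts)):
        sta_win = counts[max(0, i - n_sta + 1): i + 1]
        lta_win = counts[max(0, i - n_lta + 1): i + 1]
        sta = sum(sta_win) / len(sta_win)
        lta = sum(lta_win) / len(lta_win)
        ratios.append(sta / (lta + eps))
    return ratios

def detect(counts, threshold=5.0):
    # Declare a detection whenever the STA/LTA ratio exceeds the threshold.
    return [i for i, r in enumerate(sta_lta(counts)) if r > threshold]

# Quiet background of ~1 "earthquake" tweet/min, then a burst at minute 30.
series = [1] * 30 + [150, 120, 80] + [1] * 27
print(detect(series))  # detections at minutes 30-32
```

The same logic applies to seismic amplitude data; here the "signal" is simply the count of tweets containing the keyword in each minute.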

  18. Ground water and earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Ts' ai, T H

    1977-11-01

    Chinese folk wisdom has long seen a relationship between ground water and earthquakes. Before an earthquake there is often an unusual change in the ground water level and volume of flow. Changes in the amount of particulate matter in ground water as well as changes in color, bubbling, gas emission, and noises and geysers are also often observed before earthquakes. Analysis of these features can help predict earthquakes. Other factors unrelated to earthquakes can cause some of these changes, too. As a first step it is necessary to find sites which are sensitive to changes in ground stress to be used as sensor points for predicting earthquakes. The necessary features are described. Recording of seismic waves of earthquake aftershocks is also an important part of earthquake predictions.

  19. Superhydrophobic and polymer drag reduction in turbulent Taylor-Couette flow

    Science.gov (United States)

    Rajappan, Anoop; McKinley, Gareth H.

    2017-11-01

    We use a custom-built Taylor-Couette apparatus (radius ratio η = 0.75) to study frictional drag reduction by dilute polymer solutions and superhydrophobic (SH) surfaces in turbulent flows at Reynolds numbers of order 15000. We also investigate drag reduction by dilute polymer solutions, and show that natural biopolymers from plant mucilage can be an inexpensive and effective alternative to synthetic polymers in drag reduction applications, approaching the same maximum drag reduction asymptote. Finally we explore combinations of the two methods - one arising from wall slip and the other due to changes in turbulence dynamics in the bulk flow - and find that the two effects are not additive; interestingly, the effectiveness of polymer drag reduction is drastically reduced in the presence of an SH coating on the wall. This study was financially supported by the Office of Naval Research (ONR) through Contract No. 3002453814.

  20. An Assessment of Malaysian Monetary Policy During the Global Financial Crisis of 2008-09

    OpenAIRE

    Selim Elekdag; Subir Lall; Harun Alp

    2012-01-01

    Malaysia was hit hard by the global financial crisis of 2008-09. Anticipating the downturn that would follow the episode of extreme financial turbulence, Bank Negara Malaysia (BNM) let the exchange rate depreciate as capital flowed out, and preemptively cut the policy rate by 150 basis points. Against this backdrop, this paper tries to quantify how much deeper the recession would have been without the BNM's monetary policy response. Taking the most intense year of the crisis as our baseline (...

  1. Turbulence modulation induced by bubble swarm in oscillating-grid turbulence

    International Nuclear Information System (INIS)

    Morikawa, Koichi; Urano, Shigeyuki; Saito, Takayuki

    2007-01-01

    In the present study, liquid-phase turbulence modulation induced by a bubble swarm ascending in arbitrary turbulence was experimentally investigated. Liquid-phase homogeneous isotropic turbulence was formed using an oscillating grid in a cylindrical acrylic vessel of 149 mm inner diameter. A bubble swarm consisting of 19 bubbles of 2.8 mm equivalent diameter was examined; the bubble size and launching time were completely controlled using a bubble launching device driven through audio speakers. This bubble launching device was able to repeatedly control the bubble swarm arbitrarily and precisely. The bubble swarm was launched at a frequency of 4 Hz. The liquid-phase motion was measured via two LDA (Laser Doppler Anemometry) probes. The turbulence intensity, spatial correlation and integral scale were calculated from the LDA data obtained by measurements at two spatially separated points. When the bubble swarm was added, the turbulence intensity changed dramatically. The original isotropic turbulence was modulated into anisotropic turbulence by the mutual interference between the bubble swarm and the ambient isotropic turbulence. The integral scales were calculated from the spatial correlation function. The effects of the bubble swarm on the integral scales showed tendencies similar to those on turbulence intensity. (author)

  2. Local regression type methods applied to the study of geophysics and high frequency financial data

    Science.gov (United States)

    Mariani, M. C.; Basu, K.

    2014-09-01

    In this work we applied locally weighted scatterplot smoothing techniques (Lowess/Loess) to geophysical and high-frequency financial data. We first analyze and apply this technique to California earthquake geological data. A spatial analysis was performed to show that the estimation of the earthquake magnitude at a fixed location is very accurate, to within a relative error of 0.01%. We also applied the same method to a high-frequency data set arising in the financial sector and obtained similarly satisfactory results. The application of this approach to the two different data sets demonstrates that the overall method is accurate and efficient, and that the Lowess approach is much more desirable than the Loess method. Previous works studied time-series analysis; in this paper our local regression models perform a spatial analysis of the geophysical data, providing different information. For the high-frequency data, our models estimate the curve of best fit where the data depend on time.
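The local regression idea behind Lowess can be sketched with a minimal tricube-weighted linear fit over nearest neighbours. This is an illustration of the general technique only, without robustness iterations, and is not the authors' implementation; the smoothing fraction and test data are assumptions:

```python
import numpy as np

def lowess(x, y, frac=0.3):
    """Minimal Lowess sketch: at each x[i], fit a weighted linear
    regression over the frac-nearest neighbours with tricube weights."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    n = len(x)
    k = max(2, int(np.ceil(frac * n)))
    fitted = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]                 # k nearest neighbours
        h = d[idx].max()                        # local bandwidth
        w = (1.0 - (d[idx] / h) ** 3) ** 3      # tricube kernel
        sw = np.sqrt(w)
        A = np.column_stack([np.ones(k), x[idx]])
        beta, *_ = np.linalg.lstsq(A * sw[:, None], y[idx] * sw, rcond=None)
        fitted[i] = beta[0] + beta[1] * x[i]
    return fitted

# Noisy linear trend: the smooth should track 2x closely.
xs = np.linspace(0.0, 1.0, 60)
rng = np.random.default_rng(1)
ys = 2.0 * xs + rng.normal(0.0, 0.05, xs.size)
smooth = lowess(xs, ys)
```

Loess differs mainly in fitting a local quadratic rather than a local linear model at each point.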

  3. Transitional–turbulent spots and turbulent–turbulent spots in boundary layers

    Science.gov (United States)

    Wu, Xiaohua; Moin, Parviz; Wallace, James M.; Skarda, Jinhie; Lozano-Durán, Adrián; Hickey, Jean-Pierre

    2017-01-01

    Two observations drawn from a thoroughly validated direct numerical simulation of the canonical spatially developing, zero-pressure gradient, smooth, flat-plate boundary layer are presented here. The first is that, for bypass transition in the narrow sense defined herein, we found that the transitional–turbulent spot inception mechanism is analogous to the secondary instability of boundary-layer natural transition, namely a spanwise vortex filament becomes a Λ vortex and then, a hairpin packet. Long streak meandering does occur but usually when a streak is infected by a nearby existing transitional–turbulent spot. Streak waviness and breakdown are, therefore, not the mechanisms for the inception of transitional–turbulent spots found here. Rather, they only facilitate the growth and spreading of existing transitional–turbulent spots. The second observation is the discovery, in the inner layer of the developed turbulent boundary layer, of what we call turbulent–turbulent spots. These turbulent–turbulent spots are dense concentrations of small-scale vortices with high swirling strength originating from hairpin packets. Although structurally quite similar to the transitional–turbulent spots, these turbulent–turbulent spots are generated locally in the fully turbulent environment, and they are persistent with a systematic variation of detection threshold level. They exert indentation, segmentation, and termination on the viscous sublayer streaks, and they coincide with local concentrations of high levels of Reynolds shear stress, enstrophy, and temperature fluctuations. The sublayer streaks seem to be passive and are often simply the rims of the indentation pockets arising from the turbulent–turbulent spots. PMID:28630304

  4. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake

    Science.gov (United States)

    Jones, Lucile M.

    1994-01-01

    The equation that determines the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because non-foreshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
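The mechanism described, aftershocks temporarily inflating the pool of candidate non-foreshocks, can be illustrated with a toy calculation. The Omori-law parameters and the rates below are hypothetical values chosen for illustration, not Jones' actual formulation:

```python
def omori_rate(t_days, K=10.0, c=0.1, p=1.1):
    """Modified Omori law: aftershock rate (events/day) t_days after a
    mainshock. K, c, p are hypothetical illustration values."""
    return K / (t_days + c) ** p

def foreshock_probability(t_days, foreshock_rate=0.02, background_rate=0.05):
    """Toy version of the idea in the abstract: the chance that a new
    event near the fault is a foreshock, when candidate non-foreshocks
    comprise background seismicity plus decaying aftershocks."""
    non_foreshock = background_rate + omori_rate(t_days)
    return foreshock_rate / (foreshock_rate + non_foreshock)

# The probability is suppressed just after the mainshock and recovers
# toward the aftershock-free value as the sequence decays.
early = foreshock_probability(1.0)
late = foreshock_probability(365.0)
baseline = 0.02 / (0.02 + 0.05)   # no aftershocks at all
```

The ordering `early < late < baseline` reproduces the qualitative behaviour in the abstract: foreshock probability dips while aftershocks dominate and rises again as they decay.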

  5. Earthquake forecasting and warning

    Energy Technology Data Exchange (ETDEWEB)

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings on precursory phenomena are included. A summary of the circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  6. Supply Chain Information Systems and Organisational Performance in Economic Turbulent Times

    OpenAIRE

    Argyropoulou, Maria; Reid, Iain; Michaelides, Roula; Ioannou, George

    2015-01-01

    The impact of Supply Chain Information Systems (SCIS) on organisational performance has been examined in a number of studies. This study seeks to extend that body of knowledge by adopting a fresh lens to explore empirically the relationship between organisational performance and SCIS in circumstances of economic downturn and financial turbulence. The statistical relationship between SCIS 'Effectiveness' and 'Organisational Performance' is tested and measured by m...

  7. Turbulence and fossil turbulence lead to life in the universe

    International Nuclear Information System (INIS)

    Gibson, Carl H

    2013-01-01

    Turbulence is defined as an eddy-like state of fluid motion where the inertial-vortex forces of the eddies are larger than all the other forces that tend to damp the eddies out. Fossil turbulence is a perturbation produced by turbulence that persists after the fluid ceases to be turbulent at the scale of the perturbation. Because vorticity is produced at small scales, turbulence must cascade from small scales to large, providing a consistent physical basis for Kolmogorovian universal similarity laws. Oceanic and astrophysical mixing and diffusion are dominated by fossil turbulence and fossil turbulent waves. Observations from space telescopes show turbulence and vorticity existed in the beginning of the universe and that their fossils persist. Fossils of big bang turbulence include spin and the dark matter of galaxies: clumps of ∼10^12 frozen hydrogen planets that make globular star clusters as seen by infrared and microwave space telescopes. When the planets were hot gas, they hosted the formation of life in a cosmic soup of hot-water oceans as they merged to form the first stars and chemicals. Because spontaneous life formation according to the standard cosmological model is virtually impossible, the existence of life falsifies the standard cosmological model. (paper)

  8. Long-term predictability of regions and dates of strong earthquakes

    Science.gov (United States)

    Kubyshen, Alexander; Doda, Leonid; Shopin, Sergey

    2016-04-01

    parameters and seismic events. A further development of the H-104 method is the plotting of H-104 trajectories in two-dimensional time coordinates. The method provides the dates of future earthquakes for several (3-4) sequential time intervals that are multiples of 104 days. The H-104 method could be used together with the empirical scheme for short-term earthquake prediction, reducing the date uncertainty. Using the H-104 method, the following long-term forecast of seismic activity has been developed. 1. The total number of M6+ earthquakes expected in the time frames: - 10.01-07.02: 14; - 08.02-08.03: 17; - 09.03-06.04: 9. 3. The potential days of M6+ earthquakes expected in the period 10.01.2016-06.04.2016 are the following: - in January: 17, 18, 23, 24, 26, 28, 31; - in February: 01, 02, 05, 12, 15, 18, 20, 23; - in March: 02, 04, 05, 07 (M7+ is possible), 09, 10, 17 (M7+ is possible), 19, 20 (M7+ is possible), 23 (M7+ is possible), 30; - in April: 02, 06. The work was financially supported by the Ministry of Education and Science of the Russian Federation (contract No. 14.577.21.0109, project UID RFMEFI57714X0109)

  9. Turbulence introduction to theory and applications of turbulent flows

    CERN Document Server

    Westerweel, Jerry; Nieuwstadt, Frans T M

    2016-01-01

    This book provides a general introduction to the topic of turbulent flows. Apart from classical topics in turbulence, attention is also paid to modern topics. After studying this work, the reader will have the basic knowledge needed to follow current topics on turbulence in the scientific literature. The theory is illustrated with a number of examples of applications, such as closure models, numerical simulations and turbulent diffusion, as well as experimental findings. The work also contains a number of illustrative exercises.

  10. Ionospheric earthquake precursors

    International Nuclear Information System (INIS)

    Bulachenko, A.L.; Oraevskij, V.N.; Pokhotelov, O.A.; Sorokin, V.N.; Strakhov, V.N.; Chmyrev, V.M.

    1996-01-01

    Results of an experimental study of ionospheric earthquake precursors, the development of a program on processes in the earthquake focus, and the physical mechanisms of formation of various types of precursors are considered. The composition of an experimental space system for monitoring earthquake precursors is determined. 36 refs., 5 figs

  11. Evaluation of earthquake vibration on aseismic design of nuclear power plant judging from recent earthquakes

    International Nuclear Information System (INIS)

    Dan, Kazuo

    2006-01-01

    The Regulatory Guide for Aseismic Design of Nuclear Reactor Facilities was revised on 19 September 2006. Six factors for the evaluation of earthquake vibration are considered on the basis of recent earthquakes: 1) evaluation of earthquake vibration by a method using a fault model, 2) investigation and approval of active faults, 3) direct-hit earthquakes, 4) assumption of a short active fault as the hypocentral fault, 5) locality of the earthquake and the earthquake vibration, and 6) remaining risk. A guiding principle of the revision required a new evaluation method of earthquake vibration using fault models, and evaluation of the probability of earthquake vibration. The remaining risk means that facilities and people are endangered when an earthquake stronger than the design basis occurs; accordingly, scatter has to be considered in the evaluation of earthquake vibration. The earthquake belt and strong vibration pulse of the 1995 Hyogo-Nanbu earthquake, the relation between the lengths of the surface earthquake fault and the hypocentral fault, and the distribution of seismic intensity of the 1993 off-Kushiro earthquake are shown. (S.Y.)

  12. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    Science.gov (United States)

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding emergencies to the general public and health professionals on a routine basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  13. Suppression of turbulent resistivity in turbulent Couette flow

    Science.gov (United States)

    Si, Jiahe; Colgate, Stirling A.; Sonnenfeld, Richard G.; Nornberg, Mark D.; Li, Hui; Colgate, Arthur S.; Westpfahl, David J.; Romero, Van D.; Martinic, Joe

    2015-07-01

    Turbulent transport in rapidly rotating shear flow very efficiently transports angular momentum, a critical feature of instabilities responsible both for the dynamics of accretion disks and the turbulent power dissipation in a centrifuge. Turbulent mixing can efficiently transport other quantities like heat and even magnetic flux by enhanced diffusion. This enhancement is particularly evident in homogeneous, isotropic turbulent flows of liquid metals. In the New Mexico dynamo experiment, the effective resistivity is measured using both differential rotation and pulsed magnetic field decay to demonstrate that at very high Reynolds number rotating shear flow can be described entirely by mean flow induction with very little contribution from correlated velocity fluctuations.

  14. Suppression of turbulent resistivity in turbulent Couette flow

    Energy Technology Data Exchange (ETDEWEB)

    Si, Jiahe, E-mail: jsi@nmt.edu; Sonnenfeld, Richard G.; Colgate, Arthur S.; Westpfahl, David J.; Romero, Van D.; Martinic, Joe [New Mexico Institute of Mining and Technology, Socorro, New Mexico 87801 (United States); Colgate, Stirling A.; Li, Hui [Los Alamos National Laboratory, Los Alamos, New Mexico 87544 (United States); Nornberg, Mark D. [University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States)

    2015-07-15

    Turbulent transport in rapidly rotating shear flow very efficiently transports angular momentum, a critical feature of instabilities responsible both for the dynamics of accretion disks and the turbulent power dissipation in a centrifuge. Turbulent mixing can efficiently transport other quantities like heat and even magnetic flux by enhanced diffusion. This enhancement is particularly evident in homogeneous, isotropic turbulent flows of liquid metals. In the New Mexico dynamo experiment, the effective resistivity is measured using both differential rotation and pulsed magnetic field decay to demonstrate that at very high Reynolds number rotating shear flow can be described entirely by mean flow induction with very little contribution from correlated velocity fluctuations.

  15. Suppression of turbulent resistivity in turbulent Couette flow

    International Nuclear Information System (INIS)

    Si, Jiahe; Sonnenfeld, Richard G.; Colgate, Arthur S.; Westpfahl, David J.; Romero, Van D.; Martinic, Joe; Colgate, Stirling A.; Li, Hui; Nornberg, Mark D.

    2015-01-01

    Turbulent transport in rapidly rotating shear flow very efficiently transports angular momentum, a critical feature of instabilities responsible both for the dynamics of accretion disks and the turbulent power dissipation in a centrifuge. Turbulent mixing can efficiently transport other quantities like heat and even magnetic flux by enhanced diffusion. This enhancement is particularly evident in homogeneous, isotropic turbulent flows of liquid metals. In the New Mexico dynamo experiment, the effective resistivity is measured using both differential rotation and pulsed magnetic field decay to demonstrate that at very high Reynolds number rotating shear flow can be described entirely by mean flow induction with very little contribution from correlated velocity fluctuations

  16. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    Full Text Available The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average/long-term-average (STA/LTA) algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provide very short first-impression narratives from people who experienced the shaking.
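An STA/LTA detector of the kind described can be sketched over a per-minute tweet-count series as follows. The window lengths, threshold, and synthetic counts are assumptions for illustration, not the USGS settings:

```python
import numpy as np

def sta_lta_triggers(counts, sta_win=2, lta_win=20, threshold=5.0):
    """Return indices where the short-term average of the count series
    first exceeds `threshold` times the preceding long-term average.
    The detector re-arms once the ratio drops back below threshold."""
    counts = np.asarray(counts, float)
    triggers = []
    armed = True
    for i in range(lta_win + sta_win - 1, len(counts)):
        sta = counts[i - sta_win + 1:i + 1].mean()
        lta = counts[i - sta_win + 1 - lta_win:i - sta_win + 1].mean()
        ratio = sta / max(lta, 1e-9)
        if armed and ratio >= threshold:
            triggers.append(i)
            armed = False
        elif ratio < threshold:
            armed = True
    return triggers

# Quiet baseline of ~1 "earthquake" tweet/minute with a sudden burst.
counts_demo = [1] * 60
counts_demo[40] = 30
print(sta_lta_triggers(counts_demo))   # → [40]
```

The short window reacts to the burst while the long window tracks the ambient tweet rate, which is why a single spike against a flat background triggers exactly once.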

  17. A mathematical model of turbulence for turbulent boundary layers

    International Nuclear Information System (INIS)

    Pereira Filho, H.D.V.

    1977-01-01

    Equations for the so-called Reynolds stress tensor (turbulent kinetic energy) and the dissipation rate are developed, and a turbulence-flux approximation is used. Our idea here is to use those equations to develop an economical and fast numerical procedure for the computation of turbulent boundary layers. (author) [pt

  18. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    Science.gov (United States)

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period 1 January 2006 to 31 December 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region, in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.
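The generic ingredient of gridded-forecast comparisons of this kind is a joint Poisson log-likelihood of the observed per-cell counts given the forecast rates. The sketch below illustrates that scoring idea only; the cell counts and rival forecasts are invented, and this is not the official CSEP/RELM test code:

```python
import numpy as np
from math import lgamma

def poisson_joint_log_likelihood(expected, observed):
    """Joint Poisson log-likelihood of observed per-cell earthquake
    counts given forecast expected counts (all expectations > 0)."""
    expected = np.asarray(expected, float)
    observed = np.asarray(observed, int)
    return float(sum(n * np.log(lam) - lam - lgamma(n + 1)
                     for lam, n in zip(expected, observed)))

# Two rival forecasts over five cells; the forecast closer to what
# actually happened scores a higher (less negative) log-likelihood.
observed   = [0, 2, 0, 1, 0]
forecast_a = [0.1, 1.8, 0.1, 0.9, 0.1]   # concentrates rate where events occurred
forecast_b = [0.6, 0.6, 0.6, 0.6, 0.6]   # spreads the same total rate uniformly
ll_a = poisson_joint_log_likelihood(forecast_a, observed)
ll_b = poisson_joint_log_likelihood(forecast_b, observed)
```

Comparing `ll_a` and `ll_b` is the spirit of judging which submitted forecast was most "successful" in terms of event locations.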

  19. Earthquakes, September-October 1986

    Science.gov (United States)

    Person, W.J.

    1987-01-01

    There was one great earthquake (8.0 and above) during this reporting period, in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  20. Strategic Orientation for Improving Financial Performance Case Study in Al-Qadissiya Governorate Banking

    Directory of Open Access Journals (Sweden)

    Basim Abbas Kraidy JASSMY

    2017-06-01

    Full Text Available This study investigated the relationships between market turbulence, competitive intensity, customer orientation, competitor orientation, inter-functional coordination, organizational commitment and financial performance in the banks of Al-Qadissiya governorate. A survey questionnaire was administered and data were collected from 170 managers working in these banks. To test these relationships, the authors examined all the variables in SPSS V20. The findings showed that market instability and competitive intensity affect strategic orientation, while market instability has no effect on organizational commitment or financial performance. At the same time, inter-functional coordination has no effect on organizational commitment. Furthermore, the findings showed a correlation between competitive intensity and organizational commitment, but no correlation between competitive intensity and financial performance. Finally, organizational commitment influences financial performance. According to the study, financial performance could be improved through all types of strategic orientation and by enhancing organizational commitment.

  1. EARTHQUAKE-INDUCED DEFORMATION STRUCTURES AND RELATED TO EARTHQUAKE MAGNITUDES

    Directory of Open Access Journals (Sweden)

    Savaş TOPAL

    2003-02-01

    Full Text Available Earthquake-induced deformation structures, which are called seismites, may be helpful in classifying the paleoseismic history of a location and in estimating the magnitudes of potential future earthquakes. In this paper, seismites were investigated according to the types formed in deep and shallow lake sediments. Seismites are observed in the form of sand dikes, introduced and fractured gravels and pillow structures in shallow-lake sediments, and as pseudonodules, mushroom-like silts protruding into laminites, mixed layers, disturbed varved lamination and loop bedding in deep-lake sediments. Drawing on previous studies, earthquake-induced deformation structures were ordered according to their formation and the corresponding earthquake magnitudes. In this ordering, the lowest earthquake record is loop bedding and the highest is introduced and fractured gravels in lacustrine deposits.

  2. Vrancea earthquakes. Specific actions to mitigate seismic risk

    International Nuclear Information System (INIS)

    Marmureanu, Gheorghe; Marmureanu, Alexandru

    2005-01-01

    natural disasters given by earthquakes, there is a need to reverse trends in seismic risk mitigation for future events. The main courses of specific action to mitigate the seismic risk from strong, deep Vrancea earthquakes should be considered key to future development projects, including: - Early warning system for industrial facilities; - Short- and long-term prediction program for strong Vrancea earthquakes; - Seismic hazard map of Romania; - Seismic microzonation of large populated cities; - Shake map; - Seismic tomography of dams for avoiding disasters. The quality of life and the security of infrastructure (including human services, civil and industrial structures, financial infrastructure, and information transmission and processing systems) in every nation are increasingly vulnerable to disasters caused by events that have geological, atmospheric, hydrologic, and technological origins. As UN Secretary General Kofi Annan pointed out, 'Building a culture of prevention is not easy. While the costs of prevention have to be paid in the present, its benefits lie in a distant future'. In other words: prevention pays off. This may not always become apparent immediately but, in the long run, the benefits of prevention measures will always outweigh their costs by far. Romania is an earthquake-prone area, and these specific actions contribute substantially to seismic risk mitigation. They are provided for in Law No. 372 of March 18, 2004, 'The National Program of Seismic Risk Management'. (authors)

  3. Megathrust earthquakes in Central Chile: What is next after the Maule 2010 earthquake?

    Science.gov (United States)

    Madariaga, R.

    2013-05-01

    The 27 February 2010 Maule earthquake occurred in a well-identified gap in the Chilean subduction zone. The event has now been studied in detail using far-field and near-field seismic and geodetic data; we will review the information gathered so far. The event broke a region that was much longer along strike than the gap left by the 1835 Concepcion earthquake, sometimes called the Darwin earthquake because Darwin was in the area when it occurred and made many observations. Recent studies of contemporary documents by Udias et al. indicate that the area broken by the Maule earthquake in 2010 had previously been broken by a similar earthquake in 1751, but several events in the magnitude-8 range occurred in the area, principally in 1835 as already mentioned and, more recently, on 1 December 1928 to the north and on 21 May 1960 (1 1/2 days before the big Chilean earthquake of 1960). Currently the area of the 2010 earthquake and the region immediately to the north are undergoing a very large increase in seismicity, with numerous clusters of seismicity that move along the plate interface. Examination of the seismicity of Chile in the 18th and 19th centuries shows that the region immediately to the north of the 2010 earthquake broke in a very large megathrust event in July 1730. This is the largest known earthquake in central Chile. The region where this event occurred has broken on many occasions with M 8 range earthquakes, in 1822, 1880, 1906, 1971 and 1985. Is it preparing for a new very large megathrust event? The 1906 earthquake of Mw 8.3 filled the central part of the gap, but the region has broken again on several occasions, in 1971, 1973 and 1985. The main question is whether the 1906 earthquake relieved enough stress from the 1730 rupture zone.
    Geodetic data show that most of the region that broke in 1730 is currently almost fully locked, from the northern end of the Maule earthquake at 34.5°S to 30°S, near the southern end of the Mw 8.5 Atacama earthquake of 11

  4. Implications of Navier-Stokes turbulence theory for plasma turbulence

    International Nuclear Information System (INIS)

    Montgomery, David

    1977-01-01

    A brief discussion of Navier-Stokes turbulence theory is given, with particular reference to the two-dimensional case. MHD turbulence is then introduced, with possible applications of techniques developed in Navier-Stokes theory. Turbulence in a Vlasov plasma is also discussed from the point of view of the ''direct interaction approximation'' (DIA). (A.K.)

  5. A Generalized turbulent dispersion model for bubbly flow numerical simulation in NEPTUNE-CFD

    Energy Technology Data Exchange (ETDEWEB)

    Laviéville, Jérôme, E-mail: Jerome-marcel.lavieville@edf.fr; Mérigoux, Nicolas, E-mail: nicolas.merigoux@edf.fr; Guingo, Mathieu, E-mail: mathieu.guingo@edf.fr; Baudry, Cyril, E-mail: Cyril.baudry@edf.fr; Mimouni, Stéphane, E-mail: stephane.mimouni@edf.fr

    2017-02-15

    The NEPTUNE-CFD code, based upon an Eulerian multi-fluid model, is developed within the framework of the NEPTUNE project, financially supported by EDF (Electricité de France), CEA (Commissariat à l’Energie Atomique et aux Energies Alternatives), IRSN (Institut de Radioprotection et de Sûreté Nucléaire) and AREVA-NP. NEPTUNE-CFD is mainly focused on nuclear safety applications involving two-phase water-steam flows, such as Pressurized Thermal Shock (PTS) and Departure from Nucleate Boiling (DNB). Many of these applications involve bubbly flows, particularly flows in PWR fuel assemblies, including studies related to DNB. In the usual model for the interfacial forces acting on bubbles, which includes drag, virtual mass and lift forces, a turbulent dispersion force is often added to moderate the lift effect in directions orthogonal to the main flow and to obtain the right dispersion shape. This paper presents a formal derivation of this force, considering, on the one hand, the fluctuating parts of drag and virtual mass and, on the other hand, a turbulent-pressure derivation obtained by comparing Lagrangian and Eulerian descriptions of bubble motion. An extension of Tchen’s theory is used to express the turbulent kinetic energy of bubbles and the two-fluid turbulent covariance tensor in terms of liquid turbulent velocities and time scale. The model obtained in this way, called the Generalized Turbulent Dispersion (GTD) model, does not require any user parameter. The model is validated against the Liu & Bankoff air-water experiment, the Arizona State University (ASU) experiment, the DEBORA experiment and Texas A&M University (TAMU) boiling flow experiments.

  6. SciDAC Center for Gyrokinetic Particle Simulation of Turbulent Transport in Burning Plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Zhihong [Univ. of California, Irvine, CA (United States)

    2013-12-18

    During the first year of the SciDAC gyrokinetic particle simulation (GPS) project, the GPS team (Zhihong Lin, Liu Chen, Yasutaro Nishimura, and Igor Holod) at the University of California, Irvine (UCI) studied tokamak electron transport driven by electron temperature gradient (ETG) turbulence, and by trapped electron mode (TEM) and ion temperature gradient (ITG) turbulence with kinetic electron effects, and extended our studies of ITG turbulence spreading to core-edge coupling. We have developed and optimized an elliptic solver using the finite element method (FEM), which enables the implementation of advanced kinetic electron models (split-weight scheme and hybrid model) in the SciDAC GPS production code GTC. The GTC code has been ported and optimized on both scalar and vector parallel computer architectures, and is being transformed into an object-oriented style to facilitate collaborative code development. During this period, the UCI team members presented 11 invited talks at major national and international conferences, and published 22 papers in peer-reviewed journals and 10 papers in conference proceedings. UCI hosted the annual SciDAC Workshop on Plasma Turbulence, sponsored by the GPS Center, in 2005-2007. The workshop was attended by about fifty US and foreign researchers and financially sponsored several graduate students from MIT, Princeton University, Germany, Switzerland, and Finland. A new SciDAC postdoc, Igor Holod, has arrived at UCI to initiate global particle simulation of magnetohydrodynamic turbulence driven by energetic particle modes. The PI, Z. Lin, has been promoted to Associate Professor with tenure at UCI.

  7. Homogeneous turbulence dynamics

    CERN Document Server

    Sagaut, Pierre

    2018-01-01

    This book provides state-of-the-art results and theories in homogeneous turbulence, including anisotropy and compressibility effects, with extensions to quantum turbulence, magnetohydrodynamic turbulence and turbulence in non-Newtonian fluids. Each chapter is devoted to a given type of interaction (strain, rotation, shear, etc.), and presents and compares experimental data, numerical results, analyses of the Reynolds stress budget equations and advanced multipoint spectral theories. The role of both linear and non-linear mechanisms is emphasized. The link between the statistical properties and the dynamics of coherent structures is also addressed. Despite its restriction to homogeneous turbulence, the book is of interest to all people working in turbulence, since the basic physical mechanisms which are present in all turbulent flows are explained. The reader will find a unified presentation of the results and a clear presentation of existing controversies. Special attention is given to bridge the results obta...

  8. Extreme value statistics and thermodynamics of earthquakes. Large earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lavenda, B. [Camerino Univ., Camerino, MC (Italy); Cipollone, E. [ENEA, Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). National Centre for Research on Thermodynamics

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Frechet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into non-scaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Frechet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same catalogue of Chinese earthquakes. An analogy is drawn between large earthquakes and high-energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Frechet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
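
    The Pareto-tail fit described above can be sketched numerically. The following is a minimal illustration, assuming synthetic Pareto-distributed "energies" and a standard Hill estimator for the tail index; neither the sample nor the estimator choice is taken from the paper:

```python
import math
import random

random.seed(42)

def hill_estimator(data, k):
    """Hill estimate of the Pareto tail index from the k largest observations."""
    xs = sorted(data, reverse=True)[: k + 1]
    threshold = xs[-1]                      # the (k+1)-th largest value
    logs = [math.log(x / threshold) for x in xs[:-1]]
    return k / sum(logs)                    # 1 / mean(log exceedance)

# Synthetic "earthquake energies" from a Pareto law P(X > x) = x^(-alpha),
# generated by inverse-transform sampling (illustrative, not a real catalogue)
alpha_true = 1.5
sample = [random.random() ** (-1.0 / alpha_true) for _ in range(20000)]

alpha_hat = hill_estimator(sample, k=2000)
print(round(alpha_hat, 2))  # close to the true tail index 1.5
```

    Because the fitted Pareto law has no right endpoint, the model implies arbitrarily large energies, which is the point the abstract makes about the absence of a maximum earthquake energy.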

  9. PREFACE: Turbulent Mixing and Beyond Turbulent Mixing and Beyond

    Science.gov (United States)

    Abarzhi, Snezhana I.; Gauthier, Serge; Rosner, Robert

    2008-10-01

    (continuous DNS/LES/RANS, Molecular dynamics, Monte-Carlo, predictive modeling) New Experimental Diagnostics (novel methods for flow visualization and control, high-tech) The First International Conference `Turbulent Mixing and Beyond' was organized by the following members of the Organizing Committee: Snezhana I Abarzhi (chairperson, Chicago, USA) Malcolm J Andrews (Los Alamos National Laboratory, USA) Sergei I Anisimov (Landau Institute for Theoretical Physics, Russia) Serge Gauthier (Commissariat à l'Energie Atomique, France) Donald Q Lamb (The University of Chicago, USA) Katsunobu Nishihara (Institute for Laser Engineering, Osaka, Japan) Bruce A Remington (Lawrence Livermore National Laboratory, USA) Robert Rosner (Argonne National Laboratory, USA) Katepalli R Sreenivasan (International Centre for Theoretical Physics, Italy) Alexander L Velikovich (Naval Research Laboratory, USA) The Organizing Committee gratefully acknowledges the financial support of the Conference Sponsors: National Science Foundation (NSF), USA (Divisions and Programs Directors: Drs A G Detwiler, L M Jameson, E L Lomon, P E Phelan, G A Prentice, J A Raper, W Schultz, P R Westmoreland; PI: Dr S I Abarzhi) Air Force Office of Scientific Research (AFOSR), USA (Program Director: Dr J D Schmisseur; PI: Dr S I Abarzhi) European Office of Aerospace Research and Development (EOARD) of the AFOSR, UK (Program Chief: Dr S Surampudi; PI: Dr S I Abarzhi) International Centre for Theoretical Physics (ICTP), Trieste, Italy (Centre's Director: Dr K R Sreenivasan) The University of Chicago and The Argonne National Laboratory (ANL), USA (Laboratory's Director: Dr R Rosner) Commissariat à l'Energie Atomique (CEA), France (Directeur de Recherche: Dr S Gauthier) Department of Energy, Los Alamos National Laboratory (LANL), USA (Program manager: Dr R J Hanrahan; Group Leader: Dr M J Andrew) The DOE ASC Alliance Center for Astrophysical Thermonuclear Flashes, The University of Chicago, USA (Center's Director: 
Dr D Q Lamb)

  10. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    A geographic information system (GIS) for the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of damage and the isoseismals of the earthquake are studied. By comparison with the standard earthquake intensity attenuation relationship, anomalous damage distributions of the earthquake are found, and the relationship of these anomalies to tectonics, site conditions and basins is analyzed. The influence on ground motion of the earthquake source and of the underground structures near the source is also studied, and the implications of the anomalous damage distribution for seismic zonation, earthquake-resistant design, earthquake prediction and earthquake emergency response are discussed.

  11. Earthquakes, November-December 1977

    Science.gov (United States)

    Person, W.J.

    1978-01-01

    Two major earthquakes occurred in the last 2 months of the year. A magnitude 7.0 earthquake struck San Juan Province, Argentina, on November 23, causing fatalities and damage. The second major earthquake was a magnitude 7.0 in the Bonin Islands region, an unpopulated area. On December 19, Iran experienced a destructive earthquake that killed over 500 people.

  12. Orientation damage in the Christchurch cemeteries generated during the Christchurch earthquakes of 2010

    Science.gov (United States)

    Martín-González, Fidel; Perez-Lopez, Raul; Rodrigez-Pascua, Miguel Angel; Martin-Velazquez, Silvia

    2014-05-01

    Intensity scales determine the damage caused by an earthquake. A newer methodology, however, takes into account not only the damage but also the type of damage, the "Earthquake Archaeological Effects" (EAEs), and its orientation (e.g. displaced masonry blocks, impact marks, conjugated fractures, fallen and oriented columns, dipping broken corners, etc.). It focuses not only on the amount of damage but also on its orientation, giving information about the ground motion during the earthquake. On 22 February 2011, an earthquake of magnitude 6.2 struck Christchurch (New Zealand), causing 185 casualties and making it the second-deadliest natural disaster in New Zealand. Owing to the scale of the catastrophe, the city centre (CBD) was closed and the most damaged buildings were sealed off and later demolished, so it was not possible to take samples or make observations in the most damaged areas. The cemeteries, however, were not closed, and a year later they still remained untouched, since the financial means available for recovery had been used to reconstruct the city's infrastructure and housing. This peculiarity of the cemeteries made measurements of the earthquake effects possible. Damage orientation was measured on the tombs, crosses and headstones of the cemeteries (mainly on fallen objects such as fallen crosses, obelisks, displaced tombstones, etc.). 140 measurements were taken in the most important cemeteries (Barbadoes, Addington, Pebleton, Woodston, Broomley and Linwood) covering much of the city area. The procedure involved two main phases: (a) inventory and identification of damage, and (b) analysis of the damage orientations. The orientation was calculated for each element, plotted on a map and summarized statistically in rose diagrams. The orientation dispersion is high in some cemeteries, but a dominant S-N and E-W damage orientation is observed. However, due to the multiple seismogenic faults responsible for earthquakes and damage in Christchurch during the year after the 2010 earthquake, a

  13. Protecting your family from earthquakes: The seven steps to earthquake safety

    Science.gov (United States)

    Developed by American Red Cross, Asian Pacific Fund

    2007-01-01

    This book is provided here because of the importance of preparing for earthquakes before they happen. Experts say it is very likely there will be a damaging San Francisco Bay Area earthquake in the next 30 years and that it will strike without warning. It may be hard to find the supplies and services we need after this earthquake. For example, hospitals may have more patients than they can treat, and grocery stores may be closed for weeks. You will need to provide for your family until help arrives. To keep our loved ones and our community safe, we must prepare now. Some of us come from places where earthquakes are also common. However, the dangers of earthquakes in our homelands may be very different than in the Bay Area. For example, many people in Asian countries die in major earthquakes when buildings collapse or from big sea waves called tsunami. In the Bay Area, the main danger is from objects inside buildings falling on people. Take action now to make sure your family will be safe in an earthquake. The first step is to read this book carefully and follow its advice. By making your home safer, you help make our community safer. Preparing for earthquakes is important, and together we can make sure our families and community are ready. English version p. 3-13 Chinese version p. 14-24 Vietnamese version p. 25-36 Korean version p. 37-48

  14. Graphic Turbulence Guidance

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Forecast turbulence hazards identified by the Graphical Turbulence Guidance algorithm. The Graphical Turbulence Guidance product depicts mid-level and upper-level...

  15. Earthquake Triggering in the September 2017 Mexican Earthquake Sequence

    Science.gov (United States)

    Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.

    2017-12-01

    Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M>6 quakes in southern Mexico to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the subduction of the Cocos plate. By the traditional definition, an event is likely an aftershock if it occurs within two rupture lengths of the main shock soon afterwards. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Tehuantepec gulf and one on 23 September near Ixtepec in Oaxaca, both fit as traditional aftershocks, within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite-fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from the Copernicus Sentinel-1A and -1B satellites and the JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extending > 150 km NW from the hypocenter, longer than the slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show that the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress

  16. Evidence for strong Holocene earthquake(s) in the Wabash Valley seismic zone

    International Nuclear Information System (INIS)

    Obermeier, S.

    1991-01-01

    Many small and slightly damaging earthquakes have taken place in the region of the lower Wabash River Valley of Indiana and Illinois during the 200 years of historic record. Seismologists have long suspected the Wabash Valley seismic zone to be capable of producing earthquakes much stronger than the largest of record (mb 5.8). The seismic zone contains the poorly defined Wabash Valley fault zone and also appears to contain other vaguely defined faults at depths from which the strongest earthquakes presently originate. Faults near the surface are generally covered with thick alluvium in lowlands and a veneer of loess in uplands, which makes direct observation of faults difficult. Partly because of this difficulty, a search for paleoliquefaction features was begun in 1990. Conclusions of the study are as follows: (1) an earthquake much stronger than any historic earthquake struck the lower Wabash Valley between 1,500 and 7,500 years ago; (2) the epicentral region of the prehistoric strong earthquake was the Wabash Valley seismic zone; (3) apparent sites have been located where 1811-12 earthquake accelerations can be bracketed

  17. Prediction of free turbulent mixing using a turbulent kinetic energy method

    Science.gov (United States)

    Harsha, P. T.

    1973-01-01

    Free turbulent mixing of two-dimensional and axisymmetric one- and two-stream flows is analyzed by a relatively simple turbulent kinetic energy method. This method incorporates a linear relationship between the turbulent shear and the turbulent kinetic energy and an algebraic relationship for the length scale appearing in the turbulent kinetic energy equation. Good results are obtained for a wide variety of flows. The technique is shown to be especially applicable to flows with heat and mass transfer, for which nonunity Prandtl and Schmidt numbers may be assumed.
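
    The closure at the heart of such a method can be sketched in a few lines. This is a hedged illustration: the proportionality constant (a Bradshaw-type a1 ≈ 0.3), the density and the Gaussian TKE profile are assumptions for the example, not values from the paper:

```python
import math

A1 = 0.3      # assumed shear/TKE proportionality constant
RHO = 1.225   # assumed density, kg/m^3

def turbulent_shear(k):
    """Turbulent shear stress from the linear closure tau = A1 * rho * k."""
    return A1 * RHO * k

# Hypothetical TKE profile k(y) across a free mixing layer, in m^2/s^2
ys = [i * 0.1 for i in range(-20, 21)]
k_profile = [2.0 * math.exp(-(y / 0.8) ** 2) for y in ys]
tau_profile = [turbulent_shear(k) for k in k_profile]

print(round(max(tau_profile), 3))  # → 0.735, the peak shear at the centreline
```

    The length-scale relation that closes the TKE transport equation itself is algebraic in the paper; only the shear/TKE link is shown here.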

  18. The 1985 central chile earthquake: a repeat of previous great earthquakes in the region?

    Science.gov (United States)

    Comte, D; Eisenberg, A; Lorca, E; Pardo, M; Ponce, L; Saragoni, R; Singh, S K; Suárez, G

    1986-07-25

    A great earthquake (surface-wave magnitude, 7.8) occurred along the coast of central Chile on 3 March 1985, causing heavy damage to coastal towns. Intense foreshock activity near the epicenter of the main shock occurred for 11 days before the earthquake. The aftershocks of the 1985 earthquake define a rupture area of 170 by 110 square kilometers. The earthquake was forecast on the basis of the nearly constant repeat time (83 +/- 9 years) of great earthquakes in this region. An analysis of previous earthquakes suggests that the rupture lengths of great shocks in the region vary by a factor of about 3. The nearly constant repeat time and variable rupture lengths cannot be reconciled with time- or slip-predictable models of earthquake recurrence. The great earthquakes in the region seem to involve a variable rupture mode and yet, for unknown reasons, remain periodic. Historical data suggest that the region south of the 1985 rupture zone should now be considered a gap of high seismic potential that may rupture in a great earthquake in the next few tens of years.

  19. Turbulent Fluid Motion 6: Turbulence, Nonlinear Dynamics, and Deterministic Chaos

    Science.gov (United States)

    Deissler, Robert G.

    1996-01-01

    Several turbulent and nonturbulent solutions of the Navier-Stokes equations are obtained. The unaveraged equations are used numerically in conjunction with tools and concepts from nonlinear dynamics, including time series, phase portraits, Poincare sections, Liapunov exponents, power spectra, and strange attractors. Initially neighboring solutions for a low-Reynolds-number fully developed turbulence are compared. The turbulence is sustained by a nonrandom time-independent external force. The solutions, on the average, separate exponentially with time, having a positive Liapunov exponent. Thus, the turbulence is characterized as chaotic. In a search for solutions which contrast with the turbulent ones, the Reynolds number (or strength of the forcing) is reduced. Several qualitatively different flows are noted. These are, respectively, fully chaotic, complex periodic, weakly chaotic, simple periodic, and fixed-point. Of these, we classify only the fully chaotic flows as turbulent. Those flows have both a positive Liapunov exponent and Poincare sections without pattern. By contrast, the weakly chaotic flows, although having positive Liapunov exponents, have some pattern in their Poincare sections. The fixed-point and periodic flows are nonturbulent, since turbulence, as generally understood, is both time-dependent and aperiodic.
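
    The chaos criterion used above, a positive Liapunov exponent together with patternless Poincare sections, can be illustrated on a much simpler system. A hedged sketch using the logistic map in place of the Navier-Stokes solutions (an illustrative substitution, not the paper's method):

```python
import math

def liapunov_exponent(r, x0=0.3, n=100_000, transient=1000):
    """Average log stretching rate |f'(x)| along an orbit of x -> r*x*(1-x)."""
    x = x0
    for _ in range(transient):   # discard the transient first
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1.0 - 2.0 * x)))  # |f'(x)| = |r(1-2x)|
        x = r * x * (1.0 - x)
    return total / n

lam_chaotic = liapunov_exponent(4.0)   # fully chaotic regime, lambda near ln 2
lam_periodic = liapunov_exponent(3.2)  # simple periodic regime (a 2-cycle)
print(lam_chaotic > 0, lam_periodic < 0)  # → True True
```

    By the classification in the abstract, only the first orbit would count as turbulent; the second is simple periodic and hence nonturbulent.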

  20. Acoustic, electromagnetic, neutron emissions from fracture and earthquakes

    CERN Document Server

    Lacidogna, Giuseppe; Manuello, Amedeo

    2015-01-01

    This book presents the relevant consequences of recently discovered and interdisciplinary phenomena, triggered by local mechanical instabilities. In particular, it looks at emissions from nano-scale mechanical instabilities such as fracture, turbulence, buckling and cavitation, focussing on vibrations at the TeraHertz frequency and Piezonuclear reactions. Future applications for this work could include earthquake precursors, climate change, energy production, and cellular biology. A series of fracture experiments on natural rocks demonstrates that the TeraHertz vibrations are able to induce fission reactions on medium weight elements accompanied by neutron emissions. The same phenomenon appears to have occurred in several different situations, particularly in the chemical evolution of the Earth and Solar System, through seismicity (rocky planets) and storms (gaseous planets). As the authors explore, these phenomena can also explain puzzles related to the history of our planet, like the ocean formation or th...

  1. The relationship between earthquake exposure and posttraumatic stress disorder in 2013 Lushan earthquake

    Science.gov (United States)

    Wang, Yan; Lu, Yi

    2018-01-01

    The objective of this study is to explore the relationship between earthquake exposure and the incidence of PTSD. A stratified random sample survey was conducted to collect data in the Longmenshan thrust fault area three years after the Lushan earthquake. We used the Children's Revised Impact of Event Scale (CRIES-13) and the Earthquake Experience Scale. Subjects in this study included 3944 student survivors in eleven local schools. The prevalence of probable PTSD was relatively higher among those who were trapped in the earthquake, were injured in it, or had relatives who died in it. The study concludes that researchers need to pay more attention to children and adolescents, and that the government should pay more attention to these people and provide more economic support.

  2. Crowdsourced earthquake early warning

    Science.gov (United States)

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  3. Encyclopedia of earthquake engineering

    CERN Document Server

    Kougioumtzoglou, Ioannis; Patelli, Edoardo; Au, Siu-Kui

    2015-01-01

    The Encyclopedia of Earthquake Engineering is designed to be the authoritative and comprehensive reference covering all major aspects of the science of earthquake engineering, specifically focusing on the interaction between earthquakes and infrastructure. The encyclopedia comprises approximately 265 contributions. Since earthquake engineering deals with the interaction between earthquake disturbances and the built infrastructure, the emphasis is on basic design processes important to both non-specialists and engineers so that readers become suitably well-informed without needing to deal with the details of specialist understanding. The content of this encyclopedia provides technically inclined and informed readers about the ways in which earthquakes can affect our infrastructure and how engineers would go about designing against, mitigating and remediating these effects. The coverage ranges from buildings, foundations, underground construction, lifelines and bridges, roads, embankments and slopes. The encycl...

  4. Earthquake hazard evaluation for Switzerland

    International Nuclear Information System (INIS)

    Ruettener, E.

    1995-01-01

    Earthquake hazard analysis is of considerable importance for Switzerland, a country with moderate seismic activity but high economic values at risk. The evaluation of earthquake hazard, i.e. the determination of return periods versus ground motion parameters, requires a description of earthquake occurrences in space and time. In this study the seismic hazard for major cities in Switzerland is determined. The seismic hazard analysis is based on historic earthquake records as well as instrumental data. The historic earthquake data show considerable uncertainties concerning epicenter location and epicentral intensity. A specific concept is required, therefore, which permits the description of the uncertainties of each individual earthquake. This is achieved by probability distributions for earthquake size and location. Historical considerations, which indicate changes in public earthquake awareness at various times (mainly due to large historical earthquakes), as well as statistical tests have been used to identify time periods of complete earthquake reporting as a function of intensity. As a result, the catalog is judged to be complete since 1878 for all earthquakes with epicentral intensities greater than IV, since 1750 for intensities greater than VI, since 1600 for intensities greater than VIII, and since 1300 for intensities greater than IX. Instrumental data provide accurate information about the depth distribution of earthquakes in Switzerland. In the Alps, focal depths are restricted to the uppermost 15 km of the crust, whereas below the northern Alpine foreland earthquakes are distributed throughout the entire crust (30 km). This depth distribution is considered in the final hazard analysis by probability distributions. (author) figs., tabs., refs
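
    The completeness periods quoted above translate directly into a filter for catalogue work. A minimal sketch; the event list is invented for illustration, and epicentral intensities are encoded as integers (V = 5, IX = 9, etc.):

```python
# (minimum intensity, complete since year), from the completeness statement:
# > IX since 1300, > VIII since 1600, > VI since 1750, > IV since 1878
COMPLETE_SINCE = [
    (10, 1300),
    (9, 1600),
    (7, 1750),
    (5, 1878),
]

def is_complete(year, intensity):
    """True if an event of this intensity falls in a complete reporting period."""
    for min_intensity, since in COMPLETE_SINCE:
        if intensity >= min_intensity:
            return year >= since
    return False  # intensity IV and below: no complete period in the catalogue

# Hypothetical (year, intensity) events, not taken from the Swiss catalogue
catalog = [(1356, 10), (1584, 9), (1601, 9), (1855, 8), (1946, 6)]
usable = [event for event in catalog if is_complete(*event)]
print(len(usable))  # → 4 (the 1584 intensity-IX event predates completeness)
```

    Only events passing such a filter should feed the occurrence-rate estimates that the hazard analysis converts into return periods.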

  5. Earthquake Clusters and Spatio-temporal Migration of earthquakes in Northeastern Tibetan Plateau: a Finite Element Modeling

    Science.gov (United States)

    Sun, Y.; Luo, G.

    2017-12-01

    Seismicity in a region is usually characterized by earthquake clusters and earthquake migration along its major fault zones. However, we do not fully understand why and how earthquake clusters and spatio-temporal migration of earthquakes occur. The northeastern Tibetan Plateau is a good example for us to investigate these problems. In this study, we construct and use a three-dimensional viscoelastoplastic finite-element model to simulate earthquake cycles and spatio-temporal migration of earthquakes along major fault zones in northeastern Tibetan Plateau. We calculate stress evolution and fault interactions, and explore effects of topographic loading and viscosity of middle-lower crust and upper mantle on model results. Model results show that earthquakes and fault interactions increase Coulomb stress on the neighboring faults or segments, accelerating the future earthquakes in this region. Thus, earthquakes occur sequentially in a short time, leading to regional earthquake clusters. Through long-term evolution, stresses on some seismogenic faults, which are far apart, may almost simultaneously reach the critical state of fault failure, probably also leading to regional earthquake clusters and earthquake migration. Based on our model synthetic seismic catalog and paleoseismic data, we analyze probability of earthquake migration between major faults in northeastern Tibetan Plateau. We find that following the 1920 M 8.5 Haiyuan earthquake and the 1927 M 8.0 Gulang earthquake, the next big event (M≥7) in northeastern Tibetan Plateau would be most likely to occur on the Haiyuan fault.
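
    The stress-transfer mechanism invoked here is usually quantified by the change in Coulomb failure stress on a receiver fault, dCFS = d_tau + mu' * d_sigma_n (normal stress positive in unclamping). A minimal sketch with assumed friction and stress values, not numbers from the model in the abstract:

```python
MU_EFF = 0.4  # assumed effective friction coefficient

def coulomb_stress_change(d_shear, d_normal):
    """dCFS on a receiver fault; positive values load it toward failure."""
    return d_shear + MU_EFF * d_normal

# Hypothetical stress changes (MPa) imposed on a neighbouring fault segment
dcfs = coulomb_stress_change(d_shear=0.15, d_normal=0.05)
print(round(dcfs, 3))  # → 0.17, the receiver fault is brought closer to failure
```

    A sequence of such positive increments on neighbouring segments is the mechanism by which the model produces the earthquake clusters described above.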

  6. Wave turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Nazarenko, Sergey [Warwick Univ., Coventry (United Kingdom). Mathematics Inst.

    2011-07-01

    Wave Turbulence refers to the statistical theory of weakly nonlinear dispersive waves. There is a wide and growing spectrum of physical applications, ranging from sea waves, to plasma waves, to superfluid turbulence, to nonlinear optics and Bose-Einstein condensates. Beyond the fundamentals, the book also covers new developments such as the interaction of random waves with coherent structures (vortices, solitons, wave breaks), inverse cascades leading to condensation and the transitions between weak and strong turbulence, turbulence intermittency, and finite system size effects such as 'frozen' turbulence, discrete wave resonances and avalanche-type energy cascades. This book is an outgrowth of several lecture courses held by the author and, as a result, is written and structured as a graduate text rather than a monograph, with many exercises and solutions offered along the way. The present compact description primarily addresses students and non-specialist researchers wishing to enter and work in this field. (orig.)

  7. Experimental Investigation of Turbulence-Chemistry Interaction in High-Reynolds-Number Turbulent Partially Premixed Flames

    Science.gov (United States)

    2016-06-23

    AFRL-AFOSR-VA-TR-2016-0277. Final report, approved for public release: experimental investigation of turbulence-chemistry interaction in high-Reynolds-number turbulent partially premixed flames.

  8. Perception of earthquake risk in Taiwan: effects of gender and past earthquake experience.

    Science.gov (United States)

    Kung, Yi-Wen; Chen, Sue-Huei

    2012-09-01

    This study explored how individuals in Taiwan perceive earthquake risk and how past earthquake experience and gender relate to risk perception. Participants (n = 1,405), including earthquake survivors and members of the general population without prior direct earthquake exposure, were selected and interviewed through a computer-assisted telephone interviewing procedure using a random sampling and stratification method covering all 24 regions of Taiwan. A factor analysis of the interview data yielded a two-factor structure of earthquake risk perception. The first factor, "personal impact," encompassed perception of threat and fear related to earthquakes. The second factor, "controllability," encompassed a sense of efficacy of self-protection with regard to earthquakes. The findings indicated that earthquake survivors and females reported higher scores on the personal impact factor than males and those with no prior direct earthquake experience, although there were no group differences on the controllability factor. The findings support the view that risk perception has multiple components, and suggest that past experience (survivor status) and gender (female) affect the perception of risk. Exploration of potential contributions of other demographic factors such as age, education, and marital status to personal impact, especially for females and survivors, is discussed. Future research on, and intervention programs with regard to, risk perception are suggested accordingly. © 2012 Society for Risk Analysis.

  9. The Canterbury Charity Hospital: an update (2010-2012) and effects of the earthquakes.

    Science.gov (United States)

    Bagshaw, Philip F; Maimbo-M'siska, Miriam; Nicholls, M Gary; Shaw, Carl G; Allardyce, Randall A; Bagshaw, Susan N; McNabb, Angela L; Johnson, Stuart S; Frampton, Christopher M; Stokes, Brian W

    2013-11-22

    To update the activities of the Canterbury Charity Hospital (CCH) and its Trust over the 3 years 2010-2012, during which the devastating Christchurch earthquakes occurred. Patients' treatments, the establishment of new services, the expansion of the CCH, staffing and finances were reviewed. Previously established services, including general surgery, continued as before; some services, such as ophthalmology, declined; and new services were established, including colonoscopy, dentistry and some gynaecological procedures. Counselling was provided following the earthquakes. Teaching and research endeavours increased. An adjacent property was purchased and renovated to accommodate the expansion. The Trust became financially self-sustaining in 2010; annual running costs of $340,000/year were maintained but were anticipated to increase soon. Of the money generously donated by the community to the Trust, 82% went directly to patient care. Although not formally recorded, hundreds of appointment requests were rejected because of service unavailability or unmet referral criteria. This 3-year review highlights substantial, undocumented unmet healthcare needs in the region, which were exacerbated by the 2010/2011 earthquakes. We contend that the level of unmet healthcare need in Canterbury and throughout the country should be regularly documented to inform the planning of public healthcare services.

  10. The earthquake problem in engineering design: generating earthquake design basis information

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1987-01-01

    Designing earthquake-resistant structures requires certain design inputs specific to the seismotectonic status of the region in which a critical facility is to be located. Generating these inputs requires collecting earthquake-related information using present-day techniques in seismology and geology, and processing the collected information to arrive at a consolidated picture of the seismotectonics of the region. The earthquake problem in engineering design is outlined in the context of the seismic design of nuclear power plants vis-a-vis current state-of-the-art techniques. The extent to which the accepted procedures for assessing seismic risk in the region and generating the design inputs have been adhered to largely determines the safety of the structures against future earthquakes. The document is a step towards developing an approach for generating these inputs, which form the earthquake design basis. (author)

  11. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    Science.gov (United States)

    Applegate, D.

    2010-12-01

    This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in April underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation’s gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. Strengthening that interaction is an opportunity for the next generation

  12. Global financial crisis

    Directory of Open Access Journals (Sweden)

    MSc. Jusuf Qarkaxhija

    2011-03-01

    Full Text Available The most recent developments in the economy are a clear indicator of many changes, which result from this fast pace. Market-economy processes occur as a result of the intertwining of many potential technological and human factors, thereby creating a system of numerous divergences and turbulences. Economics, a social science, is characterized by movements from one system to another, and is harmonized with elements or components which have influenced the development and application of economic policies. This can be illustrated by the passage from a commanded (centralized) system to a self-governing (decentralized) one; the movement from one system to another is known as transition. Such transition by its very nature bears problems of almost every kind (political, economic, social, etc.) and differs from country to country. A financial crisis is a phenomenon consisting of a perception of economic policies and the creation of economic and financial stability in regional and global structures. From this, one may assume that each system undergoes changes of its own nature, and as a result of these changes we have the crisis of that system. In the economic field, if we look closely, we face such a problem: development trends in both the human and technological fields have created a large gap between older times and today, thereby creating dynamics of high intensity. If we dwell on the problem and enter the financial world, we can see that the so-called industrialized countries have made giant leaps in development, while countries in transition have stalled in many fields as a result of high rates of corruption and unemployment; these indicators are directly connected, thereby striking the financial systems of these countries.
Corruption is an element which directly and indirectly

  13. Recent developments in plasma turbulence and turbulent transport

    Energy Technology Data Exchange (ETDEWEB)

    Terry, P.W. [Univ. of Wisconsin, Madison, WI (United States)

    1997-09-22

    This report contains viewgraphs of recent developments in plasma turbulence and turbulent transport. Localized nonlinear structures occur under a variety of circumstances in turbulent, magnetically confined plasmas, arising in both kinetic and fluid descriptions, i.e., in either wave-particle or three-wave coupling interactions. These structures are non-wavelike. They cannot be incorporated in the collective wave response, but interact with collective modes through their shielding by the plasma dielectric. These structures are predicted to modify turbulence-driven transport in a way that is consistent with, and in some cases confirmed by, recent experimental observations. In kinetic theory, non-wavelike structures are localized perturbations of phase space density. There are two types of structures. Holes are self-trapped, while clumps have a self-potential that is too weak to resist deformation and mixing by ambient potential fluctuations. Clumps remain correlated in turbulence if their spatial extent is smaller than the correlation length of the scattering fields. In magnetic turbulence, clumps travel along stochastic magnetic fields, shielded by the plasma dielectric. A drag on the clump macro-particle is exerted by the shielding, inducing emission into the collective response. The emission in turn damps back on the particle distribution via Landau damping. The exchange of energy between clumps and particles, as mediated by the collective mode, imposes constraints on transport. For a turbulent spectrum whose mean wavenumber along the equilibrium magnetic field is nonzero, the electron thermal flux is proportional to the ion thermal velocity. Conventional predictions (which account only for collective modes) are larger by the square root of the ion-to-electron mass ratio. Recent measurements are consistent with the small flux. In fluid plasmas, localized coherent structures can occur as intense vortices.

  14. GEM - The Global Earthquake Model

    Science.gov (United States)

    Smolka, A.

    2009-04-01

    Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only by experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public-private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments on the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scale. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public at large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a

  15. Limitation of the Predominant-Period Estimator for Earthquake Early Warning and the Initial Rupture of Earthquakes

    Science.gov (United States)

    Yamada, T.; Ide, S.

    2007-12-01

    Earthquake early warning is an important and challenging issue for the reduction of seismic damage, especially for the mitigation of human suffering. One of the most important problems in earthquake early warning systems is how quickly we can estimate the final size of an earthquake after we observe the ground motion. This is related to the question of whether the initial rupture of an earthquake carries information associated with its final size. Nakamura (1988) developed the Urgent Earthquake Detection and Alarm System (UrEDAS). It calculates the predominant period of the P wave (τp) and estimates the magnitude of an earthquake immediately after the P-wave arrival from the value of τpmax, the maximum value of τp. A similar approach has been adopted by other earthquake alarm systems (e.g., Allen and Kanamori (2003)). To investigate the characteristics of the parameter τp and the effect of the length of the time window (TW) in the τpmax calculation, we analyze high-frequency recordings of earthquakes at very close distances in the Mponeng mine in South Africa. We find that values of τpmax have upper and lower limits. For larger earthquakes whose source durations are longer than TW, the values of τpmax have an upper limit which depends on TW. On the other hand, the values for smaller earthquakes have a lower limit which is proportional to the sampling interval. For intermediate earthquakes, the values of τpmax are close to their typical source durations. These two limits and the slope for intermediate earthquakes yield an artificial final-size dependence of τpmax over a wide size range. The parameter τpmax is useful for detecting large earthquakes and broadcasting earthquake early warnings. However, its dependence on the final size of earthquakes does not imply that the earthquake rupture is deterministic, because τpmax does not always have a direct relation to the physical quantities of an earthquake.
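    The recursive predominant-period estimator at the heart of such systems can be sketched as follows; this is a minimal reading of the Nakamura-style recursion, with the smoothing constant and the synthetic test signal chosen here for illustration:

```python
import numpy as np

def predominant_period(v, dt, alpha=0.99):
    """Recursive predominant-period estimator tau_p, in the spirit of
    Nakamura (1988) and Allen & Kanamori (2003). v: vertical ground
    velocity samples; dt: sampling interval [s]; alpha: smoothing
    constant (an assumption here; tuned per system in practice)."""
    x = 0.0  # smoothed squared velocity
    d = 0.0  # smoothed squared velocity derivative
    tau = np.zeros(len(v))
    v_prev = v[0]
    for i, vi in enumerate(v):
        dv = (vi - v_prev) / dt
        x = alpha * x + vi * vi
        d = alpha * d + dv * dv
        tau[i] = 2.0 * np.pi * np.sqrt(x / d) if d > 0.0 else 0.0
        v_prev = vi
    return tau

# Sanity check on a synthetic sinusoid: tau_p should approach its period T.
dt, T = 0.01, 0.5
t = np.arange(0.0, 20.0, dt)
tau = predominant_period(np.sin(2.0 * np.pi * t / T), dt)
print(round(float(tau[-1]), 2))  # close to T = 0.5 s
```

    τpmax would then be the running maximum of `tau` over the chosen window TW after the P-wave pick, which is where the upper limit discussed above comes from.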

  16. Sun, Moon and Earthquakes

    Science.gov (United States)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km × 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location - longitude of Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study concludes that the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]: the left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.98, Mb 9.0, EQ count 376); the right-hand figure provides an earthquake plot of (EMD+SEM) vs GMT timings for the same data. All 376 events, including the main event, faithfully follow the straight line.

  17. A new quantitative method for the rapid evaluation of buildings against earthquakes

    International Nuclear Information System (INIS)

    Mahmoodzadeh, Amir; Mazaheri, Mohammad Mehdi

    2008-01-01

    At the present time there exist numerous weak buildings which are not able to withstand earthquakes. At the same time, both private and public developers are trying to use scientific methods to prioritize and allocate budget in order to reinforce these structures, because financial resources and time are limited. In recent years the procedure of seismic assessment before the rehabilitation of vulnerable buildings has been implemented in many countries. It now seems logical to reinforce the existing procedures with the mass of available data on the effects of earthquakes on buildings. The main idea is derived from FMEA (Failure Mode and Effect Analysis) in quality management, where the main procedure is to recognize each failure, its causes, and the priority of each cause and failure. By specifying the causes and effects that lead to a certain shortcoming in structural behavior during earthquakes, an inventory is developed and each building is rated through a yes-or-no procedure. In this way, the rating of the structure is based on standard forms which, along with relative weights, are developed in this study. The resulting criteria from the rapid assessment will indicate whether the structure is to be demolished, has high, medium or low vulnerability, or is invulnerable.
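    A minimal sketch of such a weighted yes-or-no rating might look like the following; the checklist items, weights, and cut-off values are hypothetical stand-ins, not the standard forms and relative weights actually developed in the study:

```python
# Hypothetical deficiency checklist with relative weights (illustrative only).
CHECKLIST = [
    ("soft_story",        0.25),  # open/weak ground floor
    ("short_columns",     0.15),
    ("plan_irregularity", 0.10),
    ("poor_material",     0.20),
    ("heavy_overhangs",   0.10),
    ("pounding_risk",     0.20),  # insufficient gap to adjacent buildings
]

def vulnerability_score(answers):
    """answers maps item name -> True if the deficiency is present.
    Returns a 0..1 score; higher means more vulnerable."""
    return sum(w for item, w in CHECKLIST if answers.get(item, False))

def rating(score):
    """Map the score onto the outcome classes named in the abstract
    (the threshold values here are assumptions)."""
    if score >= 0.8:
        return "to be demolished"
    if score >= 0.5:
        return "high vulnerability"
    if score >= 0.25:
        return "medium vulnerability"
    if score > 0.0:
        return "low vulnerability"
    return "invulnerable"

s = vulnerability_score({"soft_story": True, "poor_material": True})
print(s, rating(s))  # 0.45 medium vulnerability
```

    The point of the FMEA framing is that each yes answer is tied to a documented failure mode, so the weights can be calibrated against observed earthquake damage data rather than chosen ad hoc.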

  18. Soliton turbulence

    Science.gov (United States)

    Tchen, C. M.

    1986-01-01

    Theoretical and numerical works in atmospheric turbulence have used the Navier-Stokes fluid equations exclusively for describing large-scale motions. Controversy over the existence of an average temperature gradient for the very large eddies in the atmosphere suggested that a new theoretical basis for describing large-scale turbulence was necessary. A new soliton formalism as a fluid analogue that generalizes the Schrodinger equation and the Zakharov equations has been developed. This formalism, processing all the nonlinearities including those from modulation provided by the density fluctuations and from convection due to the emission of finite sound waves by velocity fluctuations, treats large-scale turbulence as coalescing and colliding solitons. The new soliton system describes large-scale instabilities more explicitly than the Navier-Stokes system because it has a nonlinearity of the gradient type, while the Navier-Stokes has a nonlinearity of the non-gradient type. The forced Schrodinger equation for strong fluctuations describes the micro-hydrodynamical state of soliton turbulence and is valid for large-scale turbulence in fluids and plasmas where internal waves can interact with velocity fluctuations.

  19. Ionospheric turbulence from ground-based and satellite VLF/LF transmitter signal observations for the Simushir earthquake (November 15, 2006)

    Directory of Open Access Journals (Sweden)

    Pier Francesco Biagi

    2012-04-01

    Full Text Available

    Signals from very low frequency (VLF) / low frequency (LF) transmitters recorded at the ground station at Petropavlovsk-Kamchatsky and on board the French DEMETER satellite were analyzed for the Simushir earthquake (M 8.3; November 15, 2006). The period of analysis was from October 1, 2006, to January 31, 2007. The ground and satellite data were processed by a method based on the difference between the real signal at night-time and a model signal. The model for the ground observations was the monthly averaged signal amplitudes and phases, calculated for the quiet days of every month. For the satellite data, a two-dimensional model of the signal distribution over the selected area was constructed. Preseismic effects were found several days before the earthquake, in both the ground and satellite observations.
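    The residual scheme for the ground observations (observed night-time signal minus a quiet-day model) can be sketched as below; the quiet-day averaging, the 2σ anomaly threshold, and the synthetic data are illustrative assumptions, not the authors' exact processing chain:

```python
import numpy as np

def nighttime_residuals(amplitude, quiet_mask):
    """Observed night-time amplitudes minus a quiet-day model.
    amplitude: one value per night for the month; quiet_mask: True for
    quiet nights used to build the monthly-average model."""
    model = amplitude[quiet_mask].mean()        # quiet-day average model
    sigma = amplitude[quiet_mask].std(ddof=1)   # quiet-day scatter
    residual = amplitude - model
    anomalous = np.abs(residual) > 2.0 * sigma  # flag large departures
    return residual, anomalous

# Synthetic month of VLF amplitudes with an injected pre-seismic-like drop.
rng = np.random.default_rng(0)
amp = rng.normal(50.0, 1.0, size=30)  # 30 nights, dB-like units
amp[25] -= 6.0                        # anomaly on night 26
quiet = np.ones(30, dtype=bool)
quiet[25] = False                     # exclude the disturbed night from the model
res, flags = nighttime_residuals(amp, quiet)
print(bool(flags[25]))  # True
```

    The satellite side of the method replaces the scalar quiet-day model with a two-dimensional map of signal distribution over the study area, but the differencing idea is the same.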

     

  20. Earthquake engineering development before and after the March 4, 1977, Vrancea, Romania earthquake

    International Nuclear Information System (INIS)

    Georgescu, E.-S.

    2002-01-01

    Twenty-five years after the Vrancea earthquake of March 4, 1977, we can analyze in an open and critical way its impact on the evolution of earthquake engineering codes and protection policies in Romania. The earthquake (M_G-R = 7.2; M_w = 7.5) produced 1,570 casualties and more than 11,300 injured persons (90% of the victims in Bucharest); seismic losses were estimated at more than USD 2 billion. The 1977 earthquake represented a significant episode of the 20th century in the seismic zones of Romania and neighboring countries. The INCERC seismic record of March 4, 1977, demonstrated for the first time the spectral content of long-period seismic motions of Vrancea earthquakes, together with the duration, the number of cycles and the values of actual accelerations, with important overloading effects upon flexible structures. The seismic coefficients k_s, the spectral curve (the dynamic coefficient β_r), the seismic zonation map and the requirements in the antiseismic design norms were drastically changed, the microzonation maps of the time ceased to be used, and the specific Vrancea earthquake recurrence was reconsidered based on hazard studies. Thus, the paper emphasises: - the existing engineering knowledge, earthquake code and zoning map requirements until 1977, as well as seismological and structural lessons since 1977; - recent aspects of the implementation of the Earthquake Code P.100/1992 and its harmonization with Eurocodes, in conjunction with the specifics of urban and rural seismic risk and enforcement policies on the strengthening of existing buildings; - a strategic view of disaster prevention, using earthquake scenarios and loss assessments, insurance, earthquake education and training; - the need for a closer transfer of knowledge between seismologists, engineers and officials in charge of disaster prevention public policies. (author)

  1. The music of earthquakes and Earthquake Quartet #1

    Science.gov (United States)

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  2. Kolmogorov Behavior of Near-Wall Turbulence and Its Application in Turbulence Modeling

    Science.gov (United States)

    Shih, Tsan-Hsing; Lumley, John L.

    1992-01-01

    The near-wall behavior of turbulence is re-examined in a way different from that proposed by Hanjalic and Launder and followers. It is shown that at a certain distance from the wall, all energetic large eddies will reduce to Kolmogorov eddies (the smallest eddies in turbulence). All the important wall parameters, such as friction velocity, viscous length scale, and mean strain rate at the wall, are characterized by Kolmogorov microscales. According to this Kolmogorov behavior of near-wall turbulence, the turbulence quantities, such as turbulent kinetic energy, dissipation rate, etc. at the location where the large eddies become Kolmogorov eddies, can be estimated by using both direct numerical simulation (DNS) data and asymptotic analysis of near-wall turbulence. This information will provide useful boundary conditions for the turbulent transport equations. As an example, the concept is incorporated in the standard k-epsilon model which is then applied to channel and boundary flows. Using appropriate boundary conditions (based on Kolmogorov behavior of near-wall turbulence), there is no need for any wall-modification to the k-epsilon equations (including model constants). Results compare very well with the DNS and experimental data.
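    For orientation, the Kolmogorov microscales invoked above follow directly from the kinematic viscosity ν and the dissipation rate ε; the numbers below are assumed, air-like illustrative values, not taken from the paper:

```python
nu = 1.5e-5   # kinematic viscosity of air [m^2/s] (assumed)
eps = 10.0    # turbulent dissipation rate [m^2/s^3] (assumed)

eta = (nu**3 / eps) ** 0.25   # Kolmogorov length scale [m]
u_eta = (nu * eps) ** 0.25    # Kolmogorov velocity scale [m/s]
tau_eta = (nu / eps) ** 0.5   # Kolmogorov time scale [s]

# By construction the Reynolds number built on these scales is unity,
# which is what makes them the smallest dynamically active eddies.
Re_eta = u_eta * eta / nu
print(f"eta = {eta:.2e} m, u_eta = {u_eta:.3f} m/s, Re_eta = {Re_eta:.1f}")
```

    In the k-epsilon setting discussed above, estimates like these at the matching location supply boundary conditions for k and ε without wall-function modifications.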

  3. Modeling of turbulent chemical reaction

    Science.gov (United States)

    Chen, J.-Y.

    1995-01-01

    Viewgraphs are presented on modeling turbulent reacting flows, regimes of turbulent combustion, regimes of premixed and regimes of non-premixed turbulent combustion, chemical closure models, flamelet model, conditional moment closure (CMC), NO(x) emissions from turbulent H2 jet flames, probability density function (PDF), departures from chemical equilibrium, mixing models for PDF methods, comparison of predicted and measured H2O mass fractions in turbulent nonpremixed jet flames, experimental evidence of preferential diffusion in turbulent jet flames, and computation of turbulent reacting flows.

  4. Toward real-time regional earthquake simulation of Taiwan earthquakes

    Science.gov (United States)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  5. Geophysical Anomalies and Earthquake Prediction

    Science.gov (United States)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  6. Historical earthquake research in Austria

    Science.gov (United States)

    Hammerl, Christa

    2017-12-01

    Austria has a moderate seismicity, and on average the population feels 40 earthquakes per year or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings ( I 0 > 8° EMS) occurs significantly less frequently, the average period of recurrence is about 75 years. For this reason the historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the past in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant will start up. The applied methods are introduced briefly along with the most important studies and last but not least as an example of a recently carried out case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest, but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  7. Airfoils in Turbulent Inflow

    DEFF Research Database (Denmark)

    Gilling, Lasse

    Wind turbines operate in inflow turbulence, whether it originates from the shear in the atmospheric boundary layer or from the wake of other wind turbines. Consequently, the airfoils of the wings experience turbulence in the inflow. The main topic of this thesis is to investigate the effect of resolved inflow turbulence on airfoil simulations in CFD. The detached-eddy simulation technique is used because it can resolve the inflow turbulence without becoming too computationally expensive, due to its limited requirements for mesh resolution in the boundary layer. It cannot resolve the turbulence that is formed in attached boundary layers, but the freestream turbulence can penetrate the boundary layer. The idea is that the resolved turbulence from the freestream should mix high-momentum flow into the boundary layer and thereby increase the resistance against separation and increase the maximum lift.

  8. Direct numerical simulation of turbulent mixing in grid-generated turbulence

    International Nuclear Information System (INIS)

    Nagata, Kouji; Suzuki, Hiroki; Sakai, Yasuhiko; Kubo, Takashi; Hayase, Toshiyuki

    2008-01-01

    Turbulent mixing of passive scalar (heat) in grid-generated turbulence (GGT) is simulated by means of direct numerical simulation (DNS). A turbulence-generating grid, on which the velocity components are set to zero, is located downstream of the channel entrance, and it is numerically constructed on the staggered mesh arrangement using the immersed boundary method. The grid types constructed are: (a) square-mesh biplane grid, (b) square-mesh single-plane grid, (c) composite grid consisting of parallel square-bars and (d) fractal grid. Two fluids with different temperatures are provided separately in the upper and lower streams upstream of the turbulence-generating grids, generating the thermal mixing layer behind the grids. For the grid (a), simulations for two different Prandtl numbers of 0.71 and 7.1, corresponding to air and water flows, are conducted to investigate the effect of the Prandtl number. The results show that the typical grid turbulence and shearless mixing layer are generated downstream of the grids. The results of the scalar field show that a typical thermal mixing layer is generated as well, and the effects of the Prandtl numbers on turbulent heat transfer are observed.

  9. Direct numerical simulation of turbulent mixing in grid-generated turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Nagata, Kouji; Suzuki, Hiroki; Sakai, Yasuhiko; Kubo, Takashi [Department of Mechanical Science and Engineering, Nagoya University, Nagoya 464-8603 (Japan); Hayase, Toshiyuki [Institute of Fluid Science, Tohoku University, Sendai 980-8577 (Japan)], E-mail: nagata@nagoya-u.jp, E-mail: hsuzuki@nagoya-u.jp, E-mail: ysakai@mech.nagoya-u.ac.jp, E-mail: t-kubo@nagoya-u.jp, E-mail: hayase@ifs.tohoku.ac.jp

    2008-12-15

    Turbulent mixing of passive scalar (heat) in grid-generated turbulence (GGT) is simulated by means of direct numerical simulation (DNS). A turbulence-generating grid, on which the velocity components are set to zero, is located downstream of the channel entrance, and it is numerically constructed on the staggered mesh arrangement using the immersed boundary method. The grid types constructed are: (a) square-mesh biplane grid, (b) square-mesh single-plane grid, (c) composite grid consisting of parallel square-bars and (d) fractal grid. Two fluids with different temperatures are provided separately in the upper and lower streams upstream of the turbulence-generating grids, generating the thermal mixing layer behind the grids. For the grid (a), simulations for two different Prandtl numbers of 0.71 and 7.1, corresponding to air and water flows, are conducted to investigate the effect of the Prandtl number. The results show that the typical grid turbulence and shearless mixing layer are generated downstream of the grids. The results of the scalar field show that a typical thermal mixing layer is generated as well, and the effects of the Prandtl numbers on turbulent heat transfer are observed.

  10. TURBULENT DISKS ARE NEVER STABLE: FRAGMENTATION AND TURBULENCE-PROMOTED PLANET FORMATION

    Energy Technology Data Exchange (ETDEWEB)

    Hopkins, Philip F. [TAPIR, Mailcode 350-17, California Institute of Technology, Pasadena, CA 91125 (United States); Christiansen, Jessie L., E-mail: phopkins@caltech.edu [SETI Institute/NASA Ames Research Center, M/S 244-30, Moffett Field, CA 94035 (United States)

    2013-10-10

    A fundamental assumption in our understanding of disks is that when the Toomre Q >> 1, the disk is stable against fragmentation into self-gravitating objects (and so cannot form planets via direct collapse). But if disks are turbulent, this neglects a spectrum of stochastic density fluctuations that can produce rare, high-density mass concentrations. Here, we use a recently developed analytic framework to predict the statistics of these fluctuations, i.e., the rate of fragmentation and mass spectrum of fragments formed in a turbulent Keplerian disk. Turbulent disks are never completely stable: we calculate the (always finite) probability of forming self-gravitating structures via stochastic turbulent density fluctuations in such disks. Modest sub-sonic turbulence above Mach number M∼0.1 can produce a few stochastic fragmentation or 'direct collapse' events over ∼Myr timescales, even if Q >> 1 and cooling is slow (t_cool >> t_orbit). In transsonic turbulence this extends to Q ∼ 100. We derive the true Q-criterion needed to suppress such events, which scales exponentially with Mach number. We specify to turbulence driven by magneto-rotational instability, convection, or spiral waves and derive equivalent criteria in terms of Q and the cooling time. Cooling times ≳ 50 t_dyn may be required to completely suppress fragmentation. These gravo-turbulent events produce mass spectra peaked near ∼(Q M_disk/M_*)^2 M_disk (rocky-to-giant planet masses, increasing with distance from the star). We apply this to protoplanetary disk models and show that even minimum-mass solar nebulae could experience stochastic collapse events, provided a source of turbulence.
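    The stability parameter referenced above can be made concrete: for a gas disk, Toomre's Q = c_s κ / (π G Σ). A rough evaluation with assumed, minimum-mass-solar-nebula-like values near 1 AU (every number below is an illustrative assumption, not from the paper):

```python
import math

G = 6.674e-11  # gravitational constant [m^3 kg^-1 s^-2]

def toomre_q(c_s, kappa, sigma):
    """Toomre stability parameter for a gas disk: Q = c_s*kappa/(pi*G*Sigma).
    c_s: sound speed [m/s]; kappa: epicyclic frequency [1/s];
    sigma: gas surface density [kg/m^2]."""
    return c_s * kappa / (math.pi * G * sigma)

# Illustrative values near 1 AU in a minimum-mass-solar-nebula-like disk:
c_s = 1.0e3     # sound speed ~1 km/s
kappa = 2.0e-7  # epicyclic ~ orbital frequency at 1 AU [1/s]
sigma = 1.7e4   # ~1700 g/cm^2 in SI units [kg/m^2]
print(round(toomre_q(c_s, kappa, sigma), 1))  # ~56: classically 'stable'
```

    A disk with Q in the tens like this is far above the classical Q = 1 fragmentation boundary, which is exactly the regime where the abstract argues rare turbulent density fluctuations can still trigger collapse events.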

  11. TURBULENT DISKS ARE NEVER STABLE: FRAGMENTATION AND TURBULENCE-PROMOTED PLANET FORMATION

    International Nuclear Information System (INIS)

    Hopkins, Philip F.; Christiansen, Jessie L.

    2013-01-01

    A fundamental assumption in our understanding of disks is that when the Toomre Q >> 1, the disk is stable against fragmentation into self-gravitating objects (and so cannot form planets via direct collapse). But if disks are turbulent, this neglects a spectrum of stochastic density fluctuations that can produce rare, high-density mass concentrations. Here, we use a recently developed analytic framework to predict the statistics of these fluctuations, i.e., the rate of fragmentation and mass spectrum of fragments formed in a turbulent Keplerian disk. Turbulent disks are never completely stable: we calculate the (always finite) probability of forming self-gravitating structures via stochastic turbulent density fluctuations in such disks. Modest sub-sonic turbulence above Mach number M∼0.1 can produce a few stochastic fragmentation or 'direct collapse' events over ∼Myr timescales, even if Q >> 1 and cooling is slow (t_cool >> t_orbit). In transsonic turbulence this extends to Q ∼ 100. We derive the true Q-criterion needed to suppress such events, which scales exponentially with Mach number. We specify to turbulence driven by magneto-rotational instability, convection, or spiral waves and derive equivalent criteria in terms of Q and the cooling time. Cooling times ≳ 50 t_dyn may be required to completely suppress fragmentation. These gravo-turbulent events produce mass spectra peaked near ∼(Q M_disk/M_*)^2 M_disk (rocky-to-giant planet masses, increasing with distance from the star). We apply this to protoplanetary disk models and show that even minimum-mass solar nebulae could experience stochastic collapse events, provided a source of turbulence

  12. Where was the 1898 Mare Island Earthquake? Insights from the 2014 South Napa Earthquake

    Science.gov (United States)

    Hough, S. E.

    2014-12-01

    The 2014 South Napa earthquake provides an opportunity to reconsider the Mare Island earthquake of 31 March 1898, which caused severe damage to buildings at a Navy yard on the island. Revisiting archival accounts of the 1898 earthquake, I estimate a lower intensity magnitude, 5.8, than the value in the current Uniform California Earthquake Rupture Forecast (UCERF) catalog (6.4). However, I note that intensity magnitude can differ from Mw by upwards of half a unit depending on stress drop, which for a historical earthquake is unknowable. In the aftermath of the 2014 earthquake, there has been speculation that apparently severe effects on Mare Island in 1898 were due to the vulnerability of local structures. No surface rupture has ever been identified from the 1898 event, which is commonly associated with the Hayward-Rodgers Creek fault system, some 10 km west of Mare Island (e.g., Parsons et al., 2003). Reconsideration of detailed archival accounts of the 1898 earthquake, together with a comparison of the intensity distributions for the two earthquakes, points to genuinely severe, likely near-field ground motions on Mare Island. The 2014 earthquake did cause significant damage to older brick buildings on Mare Island, but the level of damage does not match the severity of documented damage in 1898. The high-intensity fields for the two earthquakes are, moreover, spatially shifted, with the centroid of the 2014 distribution near the town of Napa and that of the 1898 distribution near Mare Island, east of the Hayward-Rodgers Creek system. I conclude that the 1898 Mare Island earthquake was centered on or near Mare Island, possibly involving rupture of one or both strands of the Franklin fault, a low-slip-rate fault sub-parallel to the Rodgers Creek fault to the west and the West Napa fault to the east. I estimate Mw5.8 assuming an average stress drop; data are also consistent with Mw6.4 if stress drop was a factor of ≈3 lower than average for California earthquakes.

  13. Earthquakes, May-June 1991

    Science.gov (United States)

    Person, W.J.

    1992-01-01

    One major earthquake occurred during this reporting period. This was a magnitude 7.1 earthquake in Indonesia (Minahassa Peninsula) on June 20. Earthquake-related deaths were reported in the Western Caucasus (Georgia, USSR) on May 3 and June 15. One earthquake-related death was also reported in El Salvador on June 21.

  14. Forest - added Turbulence: A parametric study on Turbulence intensity in and around forests

    International Nuclear Information System (INIS)

    Pedersen, Henrik Sundgaard; Langreder, Wiebke

    2007-01-01

    The scope of the investigation is to take on-site measured wind data from a number of sites inside and close to forests. From the collected on-site data the ambient turbulence intensity is calculated and analysed depending on the distance to the forest and height above the forest. From this forest turbulence intensity database it is possible to get an overview of the general behaviour of the turbulence above and downstream from the forest. The database currently consists of 65 measurement points from around the globe, and it will be continually updated as relevant sites are made available. Using the database a number of questions can be answered. How does the ambient turbulence intensity decay with height? How does the turbulence profile vary with wind speed? Do high wind speeds generally create movement in the canopy tops, resulting in higher turbulence? How does the ambient turbulence intensity decay at different heights as a function of distance to the forest? From the forest turbulence database it can be seen that, in general, the majority of the turbulence intensity created by the forest is visible within 5 times the forest height in the vertical direction and within 500 meters downstream from the forest edge in the horizontal direction. Outside these boundaries the ambient turbulence intensity rapidly approaches normal values.
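The quantity tabulated in such a database can be computed directly from a wind record. A minimal sketch (assumed, not the authors' database code) of the standard definition TI = σ_u / ū over one averaging period:

```python
# Minimal sketch: ambient turbulence intensity from one wind-speed averaging
# period (e.g. 10 minutes), TI = sigma_u / u_mean.  Sample values are synthetic.
import statistics

def turbulence_intensity(wind_speeds):
    """Turbulence intensity of a wind-speed sample."""
    mean = statistics.fmean(wind_speeds)
    return statistics.pstdev(wind_speeds) / mean

# Synthetic example: ~8 m/s mean flow with gusts
sample = [7.2, 8.1, 8.9, 7.6, 8.4, 7.9, 8.3, 7.6]
print(f"TI = {turbulence_intensity(sample):.3f}")
```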

  15. Modeling, Forecasting and Mitigating Extreme Earthquakes

    Science.gov (United States)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  16. Earthquake Catalogue of the Caucasus

    Science.gov (United States)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (˜25 stations), Azerbaijan (˜35 stations), Armenia (˜14 stations)). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved locations of the events and recalculated moment magnitudes in order to obtain unified magnitude

  17. Environmentally Friendly Solution to Ground Hazards in Design of Bridges in Earthquake Prone Areas Using Timber Piles

    Science.gov (United States)

    Sadeghi, H.

    2015-12-01

    Bridges are major elements of infrastructure in all societies. Their safety and continued serviceability guarantee transportation and emergency access in urban and rural areas. However, these important structures are subject to earthquake-induced damage in structures and foundations. The basic approaches to the proper support of foundations are a) distribution of imposed loads to the foundation in a way that it can resist those loads without excessive settlement and failure; b) modification of the foundation ground with various available methods; and c) a combination of "a" and "b". Engineers face the task of designing foundations that meet all safety and serviceability criteria, but sometimes, when there are numerous environmental and financial constraints, the use of some traditional methods becomes inevitable. This paper explains the application of timber piles to improve ground resistance to liquefaction and to secure the abutments of short- to medium-length bridges in an earthquake/liquefaction-prone area in Bohol Island, Philippines. The limitations of using the common ground improvement methods (i.e., injection, dynamic compaction) because of either environmental or financial concerns, along with the abundance of timber in the area, led the engineers to use a network of timber piles behind the backwalls of the bridge abutments. The suggested timber pile network is simulated by numerical methods and its safety is examined. The results show that the compaction caused by driving of the piles and the bearing capacity provided by the timbers reduce the settlement and lateral movements due to service and earthquake-induced loads.

  18. Multigrid solution of incompressible turbulent flows by using two-equation turbulence models

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, X.; Liu, C. [Front Range Scientific Computations, Inc., Denver, CO (United States); Sung, C.H. [David Taylor Model Basin, Bethesda, MD (United States)

    1996-12-31

    Most practical flows are turbulent. For engineering applications, simulation of realistic flows is usually done through solution of the Reynolds-averaged Navier-Stokes equations and turbulence model equations. It has been widely accepted that turbulence modeling plays a very important role in numerical simulation of practical flow problems, particularly when accuracy is of great concern. Among the turbulence models most used today, two-equation models appear to be favored for the reason that they are more general than algebraic models and affordable with currently available computer resources. However, investigators using two-equation models seem to have been more concerned with the solution of the N-S equations; less attention is paid to the solution method for the turbulence model equations. In most cases, the turbulence model equations are loosely coupled with the N-S equations, and multigrid acceleration is applied only to the solution of the N-S equations, perhaps due to the fact that the turbulence model equations are source-term dominant and very stiff in the sublayer region.

  19. Natural Time and Nowcasting Earthquakes: Are Large Global Earthquakes Temporally Clustered?

    Science.gov (United States)

    Luginbuhl, Molly; Rundle, John B.; Turcotte, Donald L.

    2018-02-01

    The objective of this paper is to analyze the temporal clustering of large global earthquakes with respect to natural time, or interevent count, as opposed to regular clock time. To do this, we use two techniques: (1) nowcasting, a new method of statistically classifying seismicity and seismic risk, and (2) time series analysis of interevent counts. We chose the sequences of M_λ ≥ 7.0 and M_λ ≥ 8.0 earthquakes from the global centroid moment tensor (CMT) catalog from 2004 to 2016 for analysis. A significant number of these earthquakes will be aftershocks of the largest events, but no satisfactory method of declustering the aftershocks in clock time is available. A major advantage of using natural time is that it eliminates the need for declustering aftershocks. The event count we utilize is the number of small earthquakes that occur between large earthquakes. The small earthquake magnitude is chosen to be as small as possible, such that the catalog is still complete based on the Gutenberg-Richter statistics. For the CMT catalog, starting in 2004, we found the completeness magnitude to be M_σ ≥ 5.1. For the nowcasting method, the cumulative probability distribution of these interevent counts is obtained. We quantify the distribution using the exponent, β, of the best fitting Weibull distribution; β = 1 for a random (exponential) distribution. We considered 197 earthquakes with M_λ ≥ 7.0 and found β = 0.83 ± 0.08. We considered 15 earthquakes with M_λ ≥ 8.0, but this number was considered too small to generate a meaningful distribution. For comparison, we generated synthetic catalogs of earthquakes that occur randomly with the Gutenberg-Richter frequency-magnitude statistics. We considered a synthetic catalog of 1.97 × 10^5 M_λ ≥ 7.0 earthquakes and found β = 0.99 ± 0.01. The random catalog converted to natural time was also random. We then generated 1.5 × 10^4 synthetic catalogs with 197 M_λ ≥ 7.0 in each catalog and
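The "natural time" bookkeeping described in this abstract reduces to a simple count. The sketch below is an assumed interface, not the authors' code: the interevent count is the number of small earthquakes (at or above the completeness magnitude) between successive large events, and the catalog here is a toy example.

```python
# Sketch of natural-time interevent counting (assumed interface, not the
# authors' code): count M >= m_small events between successive M >= m_large
# events in a time-ordered catalog.
def interevent_counts(magnitudes, m_small, m_large):
    """Return counts of M>=m_small events between successive M>=m_large events."""
    counts, n, started = [], 0, False
    for m in magnitudes:
        if m >= m_large:
            if started:          # close out the interval since the last large event
                counts.append(n)
            started, n = True, 0
        elif m >= m_small:
            n += 1
    return counts

# Toy catalog of magnitudes in time order (illustrative values)
catalog = [5.2, 5.5, 7.1, 5.3, 5.1, 5.9, 7.4, 5.2, 8.0]
print(interevent_counts(catalog, m_small=5.1, m_large=7.0))  # prints [3, 1]
```

The distribution of such counts is what the authors fit with a Weibull form; a shape exponent β = 1 would indicate Poissonian (unclustered) behavior.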

  20. Earthquake hazard assessment and small earthquakes

    International Nuclear Information System (INIS)

    Reiter, L.

    1987-01-01

    The significance of small earthquakes and their treatment in nuclear power plant seismic hazard assessment is an issue which has received increased attention over the past few years. In probabilistic studies, sensitivity studies showed that the choice of the lower bound magnitude used in hazard calculations can have a larger than expected effect on the calculated hazard. Of particular interest is the fact that some of the difference in seismic hazard calculations between the Lawrence Livermore National Laboratory (LLNL) and Electric Power Research Institute (EPRI) studies can be attributed to this choice. The LLNL study assumed a lower bound magnitude of 3.75 while the EPRI study assumed a lower bound magnitude of 5.0. The magnitudes used were assumed to be body wave magnitudes or their equivalents. In deterministic studies recent ground motion recordings of small to moderate earthquakes at or near nuclear power plants have shown that the high frequencies of design response spectra may be exceeded. These exceedances became important issues in the licensing of the Summer and Perry nuclear power plants. At various times in the past particular concerns have been raised with respect to the hazard and damage potential of small to moderate earthquakes occurring at very shallow depths. In this paper a closer look is taken at these issues. Emphasis is given to the impact of lower bound magnitude on probabilistic hazard calculations and the historical record of damage from small to moderate earthquakes. Limited recommendations are made as to how these issues should be viewed
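The sensitivity to the lower-bound magnitude discussed in this abstract is easy to illustrate with a Gutenberg-Richter recurrence relation. The sketch below is illustrative only (the a- and b-values are assumptions, not from the LLNL or EPRI studies); it shows how many more events enter a hazard sum at the 3.75 bound than at the 5.0 bound.

```python
# Illustrative only: how the lower-bound magnitude changes the rate of events
# entering a probabilistic hazard calculation, under Gutenberg-Richter
# log10 N(>=m) = a - b*m.  The a and b values are assumed for illustration.
def annual_rate(m_min, a=4.0, b=1.0):
    """Annual rate of events at or above magnitude m_min."""
    return 10 ** (a - b * m_min)

for m_min in (3.75, 5.0):  # the LLNL and EPRI lower bounds, respectively
    print(f"M >= {m_min}: {annual_rate(m_min):.2f} events/yr")
```

With b = 1, lowering the bound from 5.0 to 3.75 admits roughly 18 times as many events, which is why the choice can have a larger than expected effect on the calculated hazard.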

  1. Strong Langmuir turbulence

    International Nuclear Information System (INIS)

    Goldman, M.V.

    1984-01-01

    After a brief discussion of beam-excited Langmuir turbulence in the solar wind, we explain the criteria for wave-particle, three-wave and strong turbulence interactions. We then present the results of a numerical integration of the Zakharov equations, which describe the strong turbulence saturation of a weak (low-density) high energy, bump-on-tail beam instability. (author)

  2. Magnetohydrodynamic turbulence revisited

    International Nuclear Information System (INIS)

    Goldreich, P.; Sridhar, S.

    1997-01-01

    In 1965, Kraichnan proposed that MHD turbulence occurs as a result of collisions between oppositely directed Alfvén wave packets. Recent work has generated some controversy over the nature of nonlinear couplings between colliding Alfvén waves. We find that the resolution to much of the confusion lies in the existence of a new type of turbulence, intermediate turbulence, in which the cascade of energy in the inertial range exhibits properties intermediate between those of weak and strong turbulent cascades. Some properties of intermediate MHD turbulence are the following: (1) in common with weak turbulent cascades, wave packets belonging to the inertial range are long-lived; (2) however, components of the strain tensor are so large that, similar to the situation in strong turbulence, perturbation theory is not applicable; (3) the breakdown of perturbation theory results from the divergence of neighboring field lines due to wave packets whose perturbations in velocity and magnetic fields are localized, but whose perturbations in displacement are not; (4) three-wave interactions dominate individual collisions between wave packets, but interactions of all orders n≥3 make comparable contributions to the intermediate turbulent energy cascade; (5) successive collisions are correlated since wave packets are distorted as they follow diverging field lines; (6) in common with the weak MHD cascade, there is no parallel cascade of energy, and the cascade to small perpendicular scales strengthens as it reaches higher wavenumbers; (7) for an appropriate weak excitation, there is a natural progression from a weak, through an intermediate, to a strong cascade. copyright 1997 The American Astronomical Society

  3. Extreme value distribution of earthquake magnitude

    Science.gov (United States)

    Zi, Jun Gan; Tung, C. C.

    1983-07-01

    Probability distribution of maximum earthquake magnitude is first derived for an unspecified probability distribution of earthquake magnitude. A model for energy release of large earthquakes, similar to that of Adler-Lomnitz and Lomnitz, is introduced from which the probability distribution of earthquake magnitude is obtained. An extensive set of world data for shallow earthquakes, covering the period from 1904 to 1980, is used to determine the parameters of the probability distribution of maximum earthquake magnitude. Because of the special form of probability distribution of earthquake magnitude, a simple iterative scheme is devised to facilitate the estimation of these parameters by the method of least-squares. The agreement between the empirical and derived probability distributions of maximum earthquake magnitude is excellent.
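The general relation underlying such derivations can be sketched numerically. The example below is standard extreme-value theory, not the authors' specific Adler-Lomnitz-type model, and all parameter values are assumptions: for Poissonian occurrence at rate λ with magnitude CDF F(m), the maximum magnitude over T years has CDF G(m) = exp(-λT(1 - F(m))).

```python
# Worked sketch (standard extreme-value theory, not the authors' model):
# maximum-magnitude CDF over T years for Poissonian events with rate lam
# and a Gutenberg-Richter (exponential) magnitude distribution.
import math

def gr_cdf(m, m0=5.0, beta=math.log(10)):
    """Gutenberg-Richter magnitude CDF above threshold m0 (b = 1 assumed)."""
    return 1.0 - math.exp(-beta * (m - m0))

def max_mag_cdf(m, lam=10.0, T=50.0, m0=5.0):
    """P(largest event in T years <= m), with M>=m0 events at rate lam/yr."""
    return math.exp(-lam * T * (1.0 - gr_cdf(m, m0)))

# Illustrative parameter choices (assumed, not fitted to the world data set):
for m in (7.0, 7.5, 8.0):
    print(f"P(max <= {m}) = {max_mag_cdf(m):.3f}")
```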

  4. Historic Eastern Canadian earthquakes

    International Nuclear Information System (INIS)

    Asmis, G.J.K.; Atchinson, R.J.

    1981-01-01

    Nuclear power plants licensed in Canada have been designed to resist earthquakes: not all plants, however, have been explicitly designed to the same level of earthquake-induced forces. Understanding of the nature of strong ground motion near the source of an earthquake is still very tentative. This paper reviews historical and scientific accounts of the three strongest earthquakes - St. Lawrence (1925), Temiskaming (1935), Cornwall (1944) - that have occurred in Canada in 'modern' times, field studies of near-field strong ground motion records and their resultant damage or non-damage to industrial facilities, and numerical modelling of earthquake sources and resultant wave propagation to produce accelerograms consistent with the above historical record and field studies. It is concluded that for future construction of NPPs, near-field strong motion must be explicitly considered in design.

  5. Graphical Turbulence Guidance - Composite

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Forecast turbulence hazards identified by the Graphical Turbulence Guidance algorithm. The Graphical Turbulence Guidance product depicts mid-level and upper-level...

  6. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  7. Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquake and arrange their life styles in compliance with this, damage they will suffer will reduce to that extent. In particular, a good training regarding earthquake to be received in primary schools is considered…

  8. Excel, Earthquakes, and Moneyball: exploring Cascadia earthquake probabilities using spreadsheets and baseball analogies

    Science.gov (United States)

    Campbell, M. R.; Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2017-12-01

    Much recent media attention focuses on Cascadia's earthquake hazard. A widely cited magazine article starts "An earthquake will destroy a sizable portion of the coastal Northwest. The question is when." Stories include statements like "a massive earthquake is overdue", "in the next 50 years, there is a 1-in-10 chance a 'really big one' will erupt," or "the odds of the big Cascadia earthquake happening in the next fifty years are roughly one in three." These lead students to ask where the quoted probabilities come from and what they mean. These probability estimates involve two primary choices: what data are used to describe when past earthquakes happened and what models are used to forecast when future earthquakes will happen. The data come from a 10,000-year record of large paleoearthquakes compiled from subsidence data on land and turbidites, offshore deposits recording submarine slope failure. Earthquakes seem to have happened in clusters of four or five events, separated by gaps. Earthquakes within a cluster occur more frequently and regularly than in the full record. Hence the next earthquake is more likely if we assume that we are in the recent cluster that started about 1700 years ago, than if we assume the cluster is over. Students can explore how changing assumptions drastically changes probability estimates using easy-to-write and display spreadsheets, like those shown below. Insight can also come from baseball analogies. The cluster issue is like deciding whether to assume that a hitter's performance in the next game is better described by his lifetime record, or by the past few games, since he may be hitting unusually well or in a slump. The other big choice is whether to assume that the probability of an earthquake is constant with time, or is small immediately after one occurs and then grows with time. This is like whether to assume that a player's performance is the same from year to year, or changes over their career. Thus saying "the chance of
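The spreadsheet exercise the abstract describes can be sketched in a few lines. The example below is illustrative only (the recurrence intervals are assumed round numbers, not the paleoseismic record): a time-independent Poisson forecast P = 1 - exp(-t/τ) evaluated under the "within-cluster" and "full record" assumptions.

```python
# Spreadsheet-style sketch of the choice discussed above (recurrence values
# are illustrative assumptions, not the actual paleoseismic estimates):
# time-independent (Poisson) probability of at least one event in t years.
import math

def poisson_prob(t_years, tau_years):
    """P(>= 1 event in t_years) for mean recurrence interval tau_years."""
    return 1.0 - math.exp(-t_years / tau_years)

for label, tau in (("within-cluster", 300.0), ("full record", 500.0)):
    print(f"{label} (tau = {tau:.0f} yr): P(50 yr) = {poisson_prob(50.0, tau):.2f}")
```

Swapping in a time-dependent (e.g. renewal) model in place of the Poisson assumption is the second choice the abstract highlights; students can compare both in the same sheet.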

  9. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes

    Science.gov (United States)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh

    2014-06-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 min after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). A new island-wide, high-resolution SEM mesh model is developed for the whole of Taiwan in this study. We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  10. Assessment of the impact of strong earthquakes on the global economy by example of the Tohoku event

    Science.gov (United States)

    Tatiana, Skufina; Peter, Skuf'in; Sergey, Baranov; Vera, Samarina; Taisiya, Shatalova

    2016-04-01

    We examine the economic consequences of strong earthquakes using the example of the M9 Tohoku earthquake that occurred on March 11, 2011 off the northeast coast of Honshu, Japan. This earthquake became the strongest in the whole history of seismological observations in this part of the planet. The generated tsunami killed more than 15,700 people, damaged 332,395 buildings and 2,126 roads. The total economic loss in Japan was estimated at $309 billion. The catastrophe in Japan also impacted the global economy. To estimate its impact, we used regional and global stock indexes, production indexes, stock prices of the main Japanese, European and US companies, import and export dynamics, as well as the data provided by the customs of Japan. We also demonstrated that the catastrophe substantially affected the markets, and in the short run its effect on some indicators even exceeded that of the global financial crisis of 2008. Recent strong earthquakes in Nepal (25.04.2015, M7.8) and Chile (16.09.2015, M8.3) have renewed the relevance of cost assessments of the overall economic impact of seismic hazard. We concluded that it is necessary to treat strong earthquakes as one very important factor that affects the world economy depending on their location. The research was supported by the Russian Foundation for Basic Research (Project 16-06-00056A).

  11. Sedimentary Signatures of Submarine Earthquakes: Deciphering the Extent of Sediment Remobilization from the 2011 Tohoku Earthquake and Tsunami and 2010 Haiti Earthquake

    Science.gov (United States)

    McHugh, C. M.; Seeber, L.; Moernaut, J.; Strasser, M.; Kanamatsu, T.; Ikehara, K.; Bopp, R.; Mustaque, S.; Usami, K.; Schwestermann, T.; Kioka, A.; Moore, L. M.

    2017-12-01

    The 2004 Sumatra-Andaman Mw9.3 and the 2011 Tohoku (Japan) Mw9.0 earthquakes and tsunamis were huge geological events with major societal consequences. Both were along subduction boundaries and ruptured portions of these boundaries that had been deemed incapable of such events. Submarine strike-slip earthquakes, such as the 2010 Mw7.0 in Haiti, are smaller but may be closer to population centers and can be similarly catastrophic. Both classes of earthquakes remobilize sediment and leave distinct signatures in the geologic record through a wide range of processes that depend on both the environment and the earthquake characteristics. Understanding them has the potential of greatly expanding the record of past earthquakes, which is critical for geohazard analysis. Recent events offer precious ground truth about the earthquakes, and short-lived radioisotopes offer invaluable tools to identify the sediments they remobilized. In the 2011 Mw9 Japan earthquake they document the spatial extent of remobilized sediment from water depths of 626 m in the forearc slope to trench depths of 8000 m. Subbottom profiles, multibeam bathymetry and 40 piston cores collected by the R/V Natsushima and R/V Sonne expeditions to the Japan Trench document multiple turbidites and high-density flows. Core tops enriched in excess 210Pb, 137Cs and 134Cs reveal sediment deposited by the 2011 Tohoku earthquake and tsunami. The thickest deposits (2 m) were documented on a mid-slope terrace and trench (4000-8000 m). Sediment was deposited on some terraces (600-3000 m), but shed from the steep forearc slope (3000-4000 m). The 2010 Haiti mainshock ruptured along the southern flank of Canal du Sud and triggered multiple nearshore sediment failures, generated turbidity currents and stirred fine sediment into suspension throughout this basin. A tsunami was modeled to stem from both the sediment failures and tectonics.
Remobilized sediment was tracked with short-lived radioisotopes from the nearshore, slope, in fault basins including the

  12. Analysis of pre-earthquake ionospheric anomalies before the global M = 7.0+ earthquakes in 2010

    Directory of Open Access Journals (Sweden)

    W. F. Peng

    2012-03-01

    The pre-earthquake ionospheric anomalies that occurred before the global M = 7.0+ earthquakes in 2010 are investigated using the total electron content (TEC) from the global ionosphere map (GIM). We analyze the possible causes of the ionospheric anomalies based on the space environment and magnetic field status. Results show that some anomalies are related to the earthquakes. By analyzing the time of occurrence, duration, and spatial distribution of these ionospheric anomalies, a number of new conclusions are drawn, as follows: earthquake-related ionospheric anomalies are not bound to appear; both positive and negative anomalies are likely to occur; and the earthquake-related ionospheric anomalies discussed in the current study occurred 0–2 days before the associated earthquakes and in the afternoon to sunset (i.e. between 12:00 and 20:00 local time). Pre-earthquake ionospheric anomalies occur mainly in areas near the epicenter. However, the maximum affected area in the ionosphere does not coincide with the vertical projection of the epicenter of the subsequent earthquake. The directions deviating from the epicenters do not follow a fixed rule. The corresponding ionospheric effects can also be observed in the magnetically conjugate region. However, the probability of appearance and the extent of the anomalies in the magnetically conjugate region are smaller than those of the anomalies near the epicenter. Deep-focus earthquakes may also exhibit very significant pre-earthquake ionospheric anomalies.
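Anomaly detection in TEC series of this kind is commonly done with a sliding-window statistical bound. The sketch below is a typical choice in this literature, not necessarily the authors' exact method: a value is flagged when it falls outside the median ± 1.5 × IQR of the preceding window of days.

```python
# Sketch of a common TEC anomaly-detection scheme (a typical choice in this
# literature, not necessarily the authors' exact method): flag values outside
# median +/- k*IQR of the preceding `window` samples.
import statistics

def tec_anomalies(tec, window=15, k=1.5):
    """Indices where TEC falls outside median +/- k*IQR of the prior window."""
    flagged = []
    for i in range(window, len(tec)):
        prior = sorted(tec[i - window:i])
        med = statistics.median(prior)
        q1 = statistics.median(prior[:window // 2])     # lower-quartile estimate
        q3 = statistics.median(prior[-(window // 2):])  # upper-quartile estimate
        if abs(tec[i] - med) > k * (q3 - q1):
            flagged.append(i)
    return flagged

# Synthetic daily TEC series (TECU) with one positive anomaly injected
tec = [20.0 + 0.1 * (i % 3) for i in range(40)]
tec[25] = 21.0
print(tec_anomalies(tec))  # prints [25]
```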

  13. Using Indirect Turbulence Measurements for Real-Time Parameter Estimation in Turbulent Air

    Science.gov (United States)

    Martos, Borja; Morelli, Eugene A.

    2012-01-01

    The use of indirect turbulence measurements for real-time estimation of parameters in a linear longitudinal dynamics model in atmospheric turbulence was studied. It is shown that measuring the atmospheric turbulence makes it possible to treat the turbulence as a measured explanatory variable in the parameter estimation problem. Commercial off-the-shelf sensors were researched and evaluated, then compared to air data booms. Sources of colored noise in the explanatory variables resulting from typical turbulence measurement techniques were identified and studied. A major source of colored noise in the explanatory variables was identified as frequency dependent upwash and time delay. The resulting upwash and time delay corrections were analyzed and compared to previous time shift dynamic modeling research. Simulation data as well as flight test data in atmospheric turbulence were used to verify the time delay behavior. Recommendations are given for follow on flight research and instrumentation.
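The core idea, treating measured turbulence as an explanatory variable in the regression, can be sketched with ordinary least squares on synthetic data. Everything below is illustrative (the model structure, coefficient values, and noise levels are assumptions, not the flight-test setup): a measured gust velocity w_g enters the regressor matrix alongside the usual aircraft states.

```python
# Illustrative sketch (not the flight-test code): when turbulence w_g is
# measured, it becomes an explanatory variable in the parameter-estimation
# regression, e.g.  a_z = Z_alpha*alpha + Z_q*q + Z_wg*w_g + noise,
# estimated here by ordinary least squares on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n = 200
alpha = rng.normal(0, 0.02, n)  # angle-of-attack perturbation [rad] (synthetic)
q = rng.normal(0, 0.01, n)      # pitch rate [rad/s] (synthetic)
w_g = rng.normal(0, 1.0, n)     # measured vertical gust velocity [m/s] (synthetic)

true_theta = np.array([-150.0, -8.0, -0.9])  # assumed "true" coefficients
X = np.column_stack([alpha, q, w_g])
a_z = X @ true_theta + rng.normal(0, 0.05, n)  # noisy accelerometer output

theta_hat, *_ = np.linalg.lstsq(X, a_z, rcond=None)
print(theta_hat)  # estimates close to true_theta
```

In the real problem the turbulence measurement itself carries colored noise (the upwash and time-delay effects the abstract describes), which is why the sensor corrections matter before this regression step.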

  14. Lagrangian statistics across the turbulent-nonturbulent interface in a turbulent plane jet.

    Science.gov (United States)

    Taveira, Rodrigo R; Diogo, José S; Lopes, Diogo C; da Silva, Carlos B

    2013-10-01

    Lagrangian statistics from millions of particles are used to study the turbulent entrainment mechanism in a direct numerical simulation of a turbulent plane jet at Re(λ) ≈ 110. The particles (tracers) are initially seeded at the irrotational region of the jet near the turbulent shear layer and are followed as they are drawn into the turbulent region across the turbulent-nonturbulent interface (TNTI), allowing the study of the enstrophy buildup and thereby characterizing the turbulent entrainment mechanism in the jet. The use of Lagrangian statistics following fluid particles gives a more correct description of the entrainment mechanism than previous works, since the statistics in relation to the TNTI position involve data from the trajectories of the entraining fluid particles. The Lagrangian statistics for the particles show the existence of a velocity jump and a characteristic vorticity jump (with a thickness one order of magnitude greater than the Kolmogorov microscale), in agreement with previous results using Eulerian statistics. The particles initially acquire enstrophy by viscous diffusion and later by enstrophy production, which becomes "active" only deep inside the turbulent region. Both enstrophy diffusion and production near the TNTI differ substantially from inside the turbulent region. Only about 1% of all particles find their way into pockets of irrotational flow engulfed into the turbulent shear layer region, indicating that "engulfment" is not significant for the present flow, indirectly suggesting that the entrainment is largely due to "nibbling" small-scale mechanisms acting along the entire TNTI surface. Probability density functions of particle positions suggest that the particles spend more time crossing the region near the TNTI than traveling inside the turbulent region, consistent with the particles moving tangent to the interface around the time they cross it.

  15. The 2008 Wenchuan Earthquake and the Rise and Fall of Earthquake Prediction in China

    Science.gov (United States)

    Chen, Q.; Wang, K.

    2009-12-01

    Regardless of the future potential of earthquake prediction, it is presently impractical to rely on it to mitigate earthquake disasters. The practical approach is to strengthen the resilience of our built environment to earthquakes based on hazard assessment. But this was not common understanding in China when the M 7.9 Wenchuan earthquake struck the Sichuan Province on 12 May 2008, claiming over 80,000 lives. In China, earthquake prediction is a government-sanctioned and law-regulated measure of disaster prevention. A sudden boom of the earthquake prediction program in 1966-1976 coincided with a succession of nine M > 7 damaging earthquakes in the densely populated region of the country and the political chaos of the Cultural Revolution. It climaxed with the prediction of the 1975 Haicheng earthquake, which was due mainly to an unusually pronounced foreshock sequence and the extraordinary readiness of some local officials to issue imminent warning and evacuation order. The Haicheng prediction was a success in practice and yielded useful lessons, but the experience cannot be applied to most other earthquakes and cultural environments. Since the disastrous Tangshan earthquake in 1976 that killed over 240,000 people, there have been two opposite trends in China: decreasing confidence in prediction and increasing emphasis on regulating construction design for earthquake resilience. In 1976, most of the seismic intensity XI areas of Tangshan were literally razed to the ground, but in 2008, many buildings in the intensity XI areas of Wenchuan did not collapse. Prediction did not save life in either of these events; the difference was made by construction standards. For regular buildings, there was no seismic design in Tangshan to resist any earthquake shaking in 1976, but limited seismic design was required for the Wenchuan area in 2008. Although the construction standards were later recognized to be too low, those buildings that met the standards suffered much less

  16. 1/f and the Earthquake Problem: Scaling constraints that facilitate operational earthquake forecasting

    Science.gov (United States)

    yoder, M. R.; Rundle, J. B.; Turcotte, D. L.

    2012-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or "1/f", nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this "1/f problem," it can be argued that catalog selection (or equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity-based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area) to the local earthquake magnitude potential - the magnitude of earthquake the region is expected to experience. From this, we introduce a new type of time-dependent hazard map for which the tuning parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and also by incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS-type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f-type, problems can be constrained from scaling relations and finite extents. [Figure: record-breaking hazard map of southern California, 2012-08-06; "warm" colors indicate local acceleration (elevated hazard).]
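The Gutenberg-Richter scaling invoked above can be made concrete with the standard maximum-likelihood b-value estimator (often attributed to Aki, 1965): b = log10(e) / (mean(m) − m_min). The synthetic catalog below is an illustration, not data from the study:

```python
import math
import random

# Sketch: draw magnitudes from a Gutenberg-Richter distribution with b = 1.0
# above a completeness magnitude m_min, then recover b by maximum likelihood.
random.seed(7)
b_true, m_min = 1.0, 3.0
beta = b_true * math.log(10)                 # GR is exponential in magnitude
mags = [m_min + random.expovariate(beta) for _ in range(20000)]

b_hat = math.log10(math.e) / (sum(mags) / len(mags) - m_min)
print(round(b_hat, 2))
```

The same estimator, applied per spatial/temporal bin, is one way a catalog partition like the one described can be scored for local magnitude potential.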

  17. Earthquake Hazard and Risk in Alaska

    Science.gov (United States)

    Black Porto, N.; Nyst, M.

    2014-12-01

    Alaska is one of the most seismically active and tectonically diverse regions in the United States. To examine risk, we have updated the seismic hazard model in Alaska. The current RMS Alaska hazard model is based on the 2007 probabilistic seismic hazard maps for Alaska (Wesson et al., 2007; Boyd et al., 2007). The 2015 RMS model will update several key source parameters, including: extending the earthquake catalog, implementing a new set of crustal faults, and updating the subduction zone geometry and recurrence rate. First, we extend the earthquake catalog to 2013, decluster the catalog, and compute new background rates. We then create a crustal fault model based on the Alaska 2012 fault and fold database. This new model increases the number of crustal faults from ten in 2007 to 91 faults in the 2015 model. This includes the addition of the western Denali fault, the Cook Inlet folds near Anchorage, and thrust faults near Fairbanks. Previously the subduction zone was modeled at a uniform depth. In this update, we model the intraslab as a series of deep stepping events. We also use the best available data, such as Slab 1.0, to update the geometry of the subduction zone. The city of Anchorage represents 80% of the risk exposure in Alaska. In the 2007 model, the hazard in Alaska was dominated by the frequent rate of magnitude 7 to 8 events (Gutenberg-Richter distribution), and large magnitude 8+ events had a low recurrence rate (characteristic) and therefore did not contribute as highly to the overall risk. We will review these recurrence rates, and will present the results and impact for Anchorage. We will compare our hazard update to the 2007 USGS hazard map, and discuss the changes and drivers for these changes. Finally, we will examine the impact model changes have on Alaska earthquake risk. Risk metrics considered include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the

  18. Post-earthquake building safety inspection: Lessons from the Canterbury, New Zealand, earthquakes

    Science.gov (United States)

    Marshall, J.; Jaiswal, Kishor; Gould, N.; Turner, F.; Lizundia, B.; Barnes, J.

    2013-01-01

    The authors discuss some of the unique aspects and lessons of the New Zealand post-earthquake building safety inspection program that was implemented following the Canterbury earthquake sequence of 2010–2011. The post-event safety assessment program was one of the largest and longest programs undertaken in recent times anywhere in the world. The effort engaged hundreds of engineering professionals throughout the country, and also sought expertise from outside, to perform post-earthquake structural safety inspections of more than 100,000 buildings in the city of Christchurch and the surrounding suburbs. While the building safety inspection procedure implemented was analogous to the ATC 20 program in the United States, many modifications were proposed and implemented in order to assess the large number of buildings that were subjected to strong and variable shaking during a period of two years. This note discusses some of the key aspects of the post-earthquake building safety inspection program and summarizes important lessons that can improve future earthquake response.

  19. Particle clustering within a two-phase turbulent pipe jet

    Science.gov (United States)

    Lau, Timothy; Nathan, Graham

    2016-11-01

    A comprehensive study of the influence of Stokes number on the instantaneous distributions of particles within a well-characterised, two-phase, turbulent pipe jet in a weak co-flow was performed. The experiments utilised particles with a narrow size distribution, resulting in a truly mono-disperse particle-laden jet. The jet Reynolds number, based on the pipe diameter, was in the range 10000 … developed technique. The results show that particle clustering is significantly influenced by the exit Stokes number. Particle clustering was found to be significant for 0.3 … financial contributions by the Australian Research Council (Grant No. DP120102961) and the Australian Renewable Energy Agency (Grant No. USO034).

  20. Extreme value statistics and thermodynamics of earthquakes: large earthquakes

    Directory of Open Access Journals (Sweden)

    B. H. Lavenda

    2000-06-01

    Full Text Available A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Fréchet distribution, while the sample distribution of aftershocks are fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into nonscaling exponential distributions so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same Catalogue of Chinese Earthquakes. An analogy is drawn between large earthquakes and high energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
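The Pareto-tail fitting described above can be sketched with a standard diagnostic, the Hill estimator of the tail index, computed from the k largest order statistics. The tail index and sample sizes below are illustrative assumptions, not values from the paper:

```python
import math
import random

# Sketch: synthetic Pareto(alpha) data via inverse transform, x = u**(-1/alpha)
# with u ~ Uniform(0, 1], then the Hill estimate from the top-k order statistics:
#   alpha_hat = k / sum_{i<k} ln(x_(i) / x_(k))
random.seed(3)
alpha_true = 1.5
x = sorted(
    ((1.0 - random.random()) ** (-1.0 / alpha_true) for _ in range(50000)),
    reverse=True,
)

k = 2000  # number of upper order statistics used
alpha_hat = k / sum(math.log(x[i] / x[k]) for i in range(k))
print(round(alpha_hat, 2))
```

For a pure Pareto tail the estimate stabilizes as k varies; a drifting Hill plot is the usual sign that the exceedance threshold is set too low.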

  1. Aviation turbulence processes, detection, prediction

    CERN Document Server

    Lane, Todd

    2016-01-01

    Anyone who has experienced turbulence in flight knows that it is usually not pleasant, and may wonder why it is so difficult to avoid. The book includes papers by various aviation turbulence researchers and provides background on the nature and causes of atmospheric turbulence that affect aircraft motion, and contains surveys of the latest techniques for remote and in situ sensing and forecasting of the turbulence phenomenon. It provides updates on the state of the art since the earlier studies of clear-air turbulence in the 1960s, explains recent advances in understanding turbulence generation by thunderstorms, and summarizes future challenges in turbulence prediction and avoidance.

  2. Turbulence modelling; Modelisation de la turbulence isotherme

    Energy Technology Data Exchange (ETDEWEB)

    Laurence, D. [Electricite de France (EDF), Direction des Etudes et Recherches, 92 - Clamart (France)

    1997-12-31

    This paper is an introductory course in modelling turbulent thermohydraulics, aimed at computational fluid dynamics users. No specific knowledge other than the Navier-Stokes equations is required beforehand. Chapter I (which those who are not beginners can skip) provides basic ideas on turbulence physics and is taken up in a textbook prepared by the teaching team of the ENPC (Benque, Viollet). Chapter II describes turbulent-viscosity-type modelling and the two-equation k-ε model. It provides details of the channel flow case and the boundary conditions. Chapter III describes the "standard" Reynolds stress transport model (R_ij-ε) and introduces more recent models called "realizable". A second paper deals with heat transfer and the effects of gravity, and returns to the Reynolds stress transport model. (author). 37 refs.
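The two-equation k-ε closure described in Chapter II rests on the eddy-viscosity relation ν_t = C_μ k²/ε with the standard constant C_μ = 0.09; a minimal sketch with illustrative values (not from the course):

```python
# Sketch: turbulent (eddy) viscosity of the standard k-epsilon model,
#   nu_t = C_mu * k^2 / epsilon
# k   : turbulent kinetic energy      [m^2/s^2]
# eps : dissipation rate of k         [m^2/s^3]
def eddy_viscosity(k, eps, c_mu=0.09):
    return c_mu * k * k / eps

nu_t = eddy_viscosity(k=0.5, eps=1.2)
print(nu_t)
```

In a CFD code this ν_t multiplies the mean strain rate in the modeled Reynolds-stress term, which is what makes the closure a "turbulent viscosity" model.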

  3. Dislocations in FX Swap and Money Markets in Hong Kong and Policy Actions during the Financial Crisis of 2008

    OpenAIRE

    Laurence Fung; Ip-wing Yu

    2009-01-01

    When US dollar interbank markets malfunctioned during the global financial crisis of 2008, many non-US financial institutions relied heavily on the foreign-exchange (FX) swap markets for US-dollar funds. This one-sided market induced a risk premium of the FX swap-implied US-dollar rate across a range of funding currencies, i.e. a deviation from the covered interest parity (CIP) condition. The turbulence in the global interbank markets therefore spilled over to the FX swap markets, including t...

  4. Countermeasures to earthquakes in nuclear plants

    International Nuclear Information System (INIS)

    Sato, Kazuhide

    1979-01-01

    The contribution of atomic energy to mankind is immeasurable, but radioactivity poses a singular danger. Therefore, in the design of nuclear power plants safety has been regarded as paramount, and in Japan, where earthquakes occur frequently, countermeasures against earthquakes have naturally been incorporated in the examination of safety. The radioactive substances handled in nuclear power stations and spent fuel reprocessing plants are briefly explained. The occurrence of earthquakes cannot be predicted effectively, and earthquake disasters tend to be extremely large. In nuclear plants, the prevention of damage to the facilities and the maintenance of their functions are required at the time of earthquakes. Regarding the location of nuclear plants, the history of earthquakes, the possible magnitude of earthquakes, the properties of the ground, and the position of nuclear plants should be examined. After the place of installation has been decided, the earthquake used for design is selected by evaluating active faults and determining the standard earthquakes. As the fundamentals of aseismic design, the classification according to importance, the earthquakes for design corresponding to the classes of importance, the combination of loads, and allowable stress are explained. (Kako, I.)

  5. Do earthquakes exhibit self-organized criticality?

    International Nuclear Information System (INIS)

    Yang Xiaosong; Ma Jin; Du Shuming

    2004-01-01

    If earthquakes are phenomena of self-organized criticality (SOC), statistical characteristics of the earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely to be phenomena of SOC, because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) is apparently changed after the time series is rearranged. This suggests that SOC theory should not be used to oppose efforts at earthquake prediction.
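The rearrangement test described above can be sketched on a toy catalog. The burst structure and sizes below are illustrative assumptions; the point is only that clustered return times are not shuffle-invariant:

```python
import random
import statistics

# Toy catalog: "large" events (1) occur in bursts among background events (0).
catalog = ([1] * 10 + [0] * 90) * 5

def return_times(seq):
    """First-return times between successive large events."""
    idx = [i for i, m in enumerate(seq) if m == 1]
    return [b - a for a, b in zip(idx, idx[1:])]

rt_original = return_times(catalog)

random.seed(0)
shuffled = catalog[:]
random.shuffle(shuffled)          # randomly rearrange the event sequence
rt_shuffled = return_times(shuffled)

# Clustering inflates the spread of return times relative to the shuffled
# (memoryless) catalog, so P_M(T) statistics change under rearrangement.
print(statistics.stdev(rt_original) > statistics.stdev(rt_shuffled))
```

A genuinely memoryless (SOC-like) sequence would show no such change, which is exactly the invariance the Letter tests and rejects.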

  6. Posttraumatic Stress, Depression, and Coping Following the 2015 Nepal Earthquake: A Study on Adolescents.

    Science.gov (United States)

    Sharma, Asmita; Kar, Nilamadhab

    2018-05-24

    The study aimed to gather data on posttraumatic stress and depression in adolescents following the 2015 Nepal earthquake and explore the adolescents' coping strategies. In a questionnaire-based, cross-sectional study about 1 year after the earthquake, adolescents in two districts with different degrees of impact were evaluated for disaster experience, coping strategies, and symptoms of posttraumatic stress and depression, measured with the Child Posttraumatic Stress Scale and the Depression Self Rating Scale. In the studied sample (N=409), the estimated prevalence of posttraumatic stress disorder (PTSD) (43.3%) and depression (38.1%) was considerable. Prevalence of PTSD was significantly higher in the more affected area (49.0% vs 37.9%); however, the prevalence figures were comparable in adolescents who reported a stressor. The prevalence of depression was comparable. Female gender, joint family, financial problems, displacement, injury or being trapped in the earthquake, damage to livelihood, and fear of death were significantly associated with a probable PTSD diagnosis. Various coping strategies were used: talking to others, praying, helping others, hoping for the best, and certain shared activities were common. Drug abuse was rare. Most of the coping strategies were comparable among the clinical groups. A considerable proportion of adolescents had posttraumatic stress and depression 1 year after the earthquake. There is a need for clinical interventions and follow-up studies regarding the outcome. Disaster Med Public Health Preparedness. 2018;page 1 of 7.

  7. Earthquake precursors: spatial-temporal gravity changes before the great earthquakes in the Sichuan-Yunnan area

    Science.gov (United States)

    Zhu, Yi-Qing; Liang, Wei-Feng; Zhang, Song

    2018-01-01

    Using multiple-scale mobile gravity data in the Sichuan-Yunnan area, we systematically analyzed the relationships between spatial-temporal gravity changes and the 2014 Ludian, Yunnan Province Ms6.5 earthquake and the 2014 Kangding Ms6.3, 2013 Lushan Ms7.0, and 2008 Wenchuan Ms8.0 earthquakes in Sichuan Province. Our main results are as follows. (1) Before the occurrence of large earthquakes, gravity anomalies occur in a large area around the epicenters. The directions of gravity change gradient belts usually agree roughly with the directions of the main fault zones of the study area. Such gravity changes might reflect the increase of crustal stress, as well as the significant active tectonic movements and surface deformations along fault zones, during the period of gestation of great earthquakes. (2) Continuous significant changes of the multiple-scale gravity fields, as well as greater gravity changes with larger time scales, can be regarded as medium-range precursors of large earthquakes. The subsequent large earthquakes always occur in the area where the gravity changes greatly. (3) The spatial-temporal gravity changes are very useful in determining the epicenter of coming large earthquakes. The large gravity networks are useful to determine the general areas of coming large earthquakes. However, the local gravity networks with high spatial-temporal resolution are suitable for determining the location of epicenters. Therefore, denser gravity observation networks are necessary for better forecasts of the epicenters of large earthquakes. (4) Using gravity changes from mobile observation data, we made medium-range forecasts of the Kangding, Ludian, Lushan, and Wenchuan earthquakes, with especially successful forecasts of the location of their epicenters. Based on the above discussions, we emphasize that medium-/long-term potential for large earthquakes might exist nowadays in some areas with significant gravity anomalies in the study region. Thus, the monitoring

  8. Containerless Ripple Turbulence

    Science.gov (United States)

    Putterman, Seth; Wright, William; Duval, Walter; Panzarella, Charles

    2002-11-01

    One of the longest standing unsolved problems in physics relates to the behavior of fluids that are driven far from equilibrium, such as occurs when they become turbulent due to fast flow through a grid or tidal motions. In turbulent flows the distribution of vortex energy as a function of the inverse length scale [or wavenumber 'k'] of motion is proportional to 1/k^(5/3), which is the celebrated law of Kolmogorov. Although this law gives a good description of the average motion, fluctuations around the average are huge. This stands in contrast with thermally activated motion, where large fluctuations around thermal equilibrium are highly unfavorable. The problem of turbulence is the problem of understanding why large fluctuations are so prevalent, which is also called the problem of 'intermittency'. Turbulence is a remarkable problem in that its solution sits simultaneously at the forefront of physics, mathematics, engineering and computer science. A recent conference [March 2002] on 'Statistical Hydrodynamics' organized by the Los Alamos Laboratory Center for Nonlinear Studies brought together researchers in all of these fields. Although turbulence is generally thought to be described by the Navier-Stokes equations of fluid mechanics, the solution as well as its existence has eluded researchers for over 100 years. In fact, proof of the existence of such a solution qualifies for a $1M millennium prize. As part of our NASA funded research we have proposed building a bridge between vortex turbulence and wave turbulence. The latter occurs when high amplitude waves of various wavelengths are allowed to mutually interact in a fluid. In particular we have proposed measuring the interaction of ripples [capillary waves] that run around on the surface of a fluid sphere suspended in a microgravity environment. The problem of ripple turbulence poses similar mathematical challenges to the problem of vortex turbulence. The waves can have a high amplitude and a strong nonlinear
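The k^(-5/3) law quoted above can be illustrated by recovering the exponent from a synthetic spectrum with a log-log least-squares fit (the spectrum is generated, not measured; this only shows how the exponent is extracted):

```python
import math

# Sketch: synthetic inertial-range spectrum E(k) = k^(-5/3) over a decade of
# wavenumbers, with the scaling exponent recovered as the log-log slope.
ks = [2.0 ** i for i in range(1, 11)]
E = [k ** (-5.0 / 3.0) for k in ks]

lx = [math.log(k) for k in ks]
ly = [math.log(e) for e in E]
n = len(lx)
mx, my = sum(lx) / n, sum(ly) / n
slope = sum((x - mx) * (y - my) for x, y in zip(lx, ly)) / \
        sum((x - mx) ** 2 for x in lx)
print(round(slope, 3))
```

Real measured spectra carry noise and finite inertial ranges, so the fitted slope scatters around -5/3; the large excursions about that mean are the "intermittency" the abstract describes.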

  10. Plasma turbulence calculations on supercomputers

    International Nuclear Information System (INIS)

    Carreras, B.A.; Charlton, L.A.; Dominguez, N.; Drake, J.B.; Garcia, L.; Leboeuf, J.N.; Lee, D.K.; Lynch, V.E.; Sidikman, K.

    1991-01-01

    Although the single-particle picture of magnetic confinement is helpful in understanding some basic physics of plasma confinement, it does not give a full description. Collective effects dominate plasma behavior. Any analysis of plasma confinement requires a self-consistent treatment of the particles and fields. The general picture is further complicated because the plasma, in general, is turbulent. The study of fluid turbulence is a rather complex field by itself. In addition to the difficulties of classical fluid turbulence, plasma turbulence studies face the problems caused by the induced magnetic turbulence, which couples back to the fluid. Since the fluid is not a perfect conductor, this turbulence can lead to changes in the topology of the magnetic field structure, causing the magnetic field lines to wander radially. Because the plasma fluid flows along field lines, they carry the particles with them, and this enhances the losses caused by collisions. The changes in topology are critical for the plasma confinement. The study of plasma turbulence and the concomitant transport is a challenging problem. Because of the importance of solving the plasma turbulence problem for controlled thermonuclear research, the high complexity of the problem, and the necessity of attacking the problem with supercomputers, the study of plasma turbulence in magnetic confinement devices is a Grand Challenge problem.

  11. Overestimation of the earthquake hazard along the Himalaya: constraints in bracketing of medieval earthquakes from paleoseismic studies

    Science.gov (United States)

    Arora, Shreya; Malik, Javed N.

    2017-12-01

    The Himalaya is one of the most seismically active regions of the world. The occurrence of several large magnitude earthquakes, viz. the 1905 Kangra earthquake (Mw 7.8), 1934 Bihar-Nepal earthquake (Mw 8.2), 1950 Assam earthquake (Mw 8.4), 2005 Kashmir earthquake (Mw 7.6), and 2015 Gorkha earthquake (Mw 7.8), is testimony to ongoing tectonic activity. In the last few decades, tremendous efforts have been made along the Himalayan arc to understand the patterns of earthquake occurrence, size, extent, and return periods. Some of the large magnitude earthquakes produced surface rupture, while some remained blind. Furthermore, due to the incompleteness of the earthquake catalogue, very few events can be correlated with medieval earthquakes. Based on the existing paleoseismic data, it is difficult to precisely determine the extent of surface rupture of these earthquakes, and of those events which occurred during historic times. In this paper, we have compiled the paleoseismic data and recalibrated the radiocarbon ages from the trenches excavated by previous workers along the entire Himalaya, and compared the earthquake scenario with the past. Our studies suggest that there were multiple earthquake events with overlapping surface ruptures in small patches, with an average rupture length of 300 km limiting Mw 7.8-8.0 for the Himalayan arc, rather than two or three giant earthquakes rupturing the whole front. It has been identified that large magnitude Himalayan earthquakes, such as the 1905 Kangra, 1934 Bihar-Nepal, and 1950 Assam events, occurred within a time frame of 45 years. If such events are dated, there is a high possibility that, within a range of ±50 years, they may be misinterpreted as the remnants of one giant earthquake rupturing the entire Himalayan arc, leading to an overestimation of the seismic hazard scenario in the Himalaya.
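The link between a ~300 km rupture and an Mw 7.8-8.0 ceiling can be sketched with the Wells and Coppersmith (1994) all-slip-type surface-rupture-length regression, Mw = 5.08 + 1.16 log10(L). This is used here only as an illustration; the authors' own scaling arguments may differ:

```python
import math

# Sketch: empirical rupture-length-to-magnitude scaling
# (Wells & Coppersmith 1994, all slip types): Mw = 5.08 + 1.16*log10(L[km])
def mw_from_rupture_length(length_km):
    return 5.08 + 1.16 * math.log10(length_km)

mw = mw_from_rupture_length(300.0)
print(round(mw, 2))
assert 7.8 <= mw <= 8.0   # consistent with the abstract's Mw ceiling
```

Run in reverse, the same regression shows why merging two ~300 km ruptures into one dated event would imply an implausibly larger magnitude, i.e. the overestimation the abstract warns about.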

  12. Turbulence generation by waves

    Energy Technology Data Exchange (ETDEWEB)

    Kaftori, D.; Nan, X.S.; Banerjee, S. [Univ. of California, Santa Barbara, CA (United States)

    1995-12-31

    The interaction between two-dimensional mechanically generated waves and a turbulent stream was investigated experimentally in a horizontal channel, using 3-D LDA synchronized with a surface-position measuring device and micro-bubble tracer flow visualization with high-speed video. Results show that although the wave-induced orbital motion reached all the way to the wall, the characteristics of the turbulence wall structures and the turbulence intensity close to the wall were not altered, nor was the streaky nature of the wall layer. On the other hand, the mean velocity profile became more uniform and the mean friction velocity was increased. Close to the free surface, the turbulence intensity was substantially increased as well. Even in predominantly laminar flows, the introduction of 2-D waves causes three-dimensional turbulence. The turbulence enhancement is found to be proportional to the wave strength.
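The turbulence-intensity statistics behind such LDA measurements follow from the Reynolds decomposition u(t) = U + u′(t), with intensity u′_rms/U; a minimal sketch on a synthetic velocity record (values are illustrative, not from the experiment):

```python
import math

# Sketch: synthetic velocity record, mean flow 1.0 m/s plus a periodic
# fluctuation (10 full periods of 16 samples each).
u = [1.0 + 0.1 * math.sin(2.0 * math.pi * i / 16) for i in range(160)]

U = sum(u) / len(u)                                       # mean velocity
u_rms = math.sqrt(sum((v - U) ** 2 for v in u) / len(u))  # fluctuation rms
print(round(U, 3), round(u_rms / U, 3))                   # mean, intensity
```

In the experiment the same decomposition is applied per wall-normal position, which is how "turbulence intensity close to the wall" and "close to the free surface" are compared.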

  13. Progress in turbulence research

    International Nuclear Information System (INIS)

    Bradshaw, P.

    1990-01-01

    Recent developments in experiments and eddy simulations are reviewed as an introduction to a discussion of turbulence modeling for engineers. The most important advances in the last decade rely on computers: microcomputers to control laboratory experiments, especially for multidimensional imaging, and supercomputers to simulate turbulence. These basic studies in turbulence research are leading to genuine breakthroughs in prediction methods for engineers and earth scientists. The three main branches of turbulence research are discussed: experiments, simulations (numerically-accurate three-dimensional, time-dependent solutions of the Navier-Stokes equations, with any empiricism confined to the smallest eddies), and modeling (empirical closure of time-averaged equations for turbulent flow). 33 refs

  14. Stress triggering of the Lushan M7.0 earthquake by the Wenchuan Ms8.0 earthquake

    Directory of Open Access Journals (Sweden)

    Wu Jianchao

    2013-08-01

    Full Text Available The Wenchuan Ms8.0 earthquake and the Lushan M7.0 earthquake occurred in the north and south segments of the Longmenshan nappe tectonic belt, respectively. Based on the focal mechanism and finite fault model of the Wenchuan Ms8.0 earthquake, we calculated the Coulomb failure stress change. The inverted Coulomb stress changes based on the Nishimura and Chenji models both show that the Lushan M7.0 earthquake occurred in the area of increased Coulomb failure stress induced by the Wenchuan Ms8.0 earthquake. The Coulomb failure stress increased by approximately 0.135–0.152 bar at the source of the Lushan M7.0 earthquake, which is far more than the stress triggering threshold. Therefore, the Lushan M7.0 earthquake was most likely triggered by the Coulomb failure stress change.
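The quantity at issue is the Coulomb failure stress change, ΔCFS = Δτ + μ′Δσn (shear stress change plus effective friction times normal stress change), compared against the commonly cited ~0.1 bar triggering threshold. The stress values below are illustrative, not the inverted values from the paper:

```python
# Sketch: Coulomb failure stress change on a receiver fault.
#   dCFS = d_tau + mu_eff * d_sigma_n
# d_shear_bar  : shear stress change resolved onto the fault [bar]
# d_normal_bar : normal stress change, positive = unclamping  [bar]
def coulomb_stress_change(d_shear_bar, d_normal_bar, mu_eff=0.4):
    return d_shear_bar + mu_eff * d_normal_bar

dcfs = coulomb_stress_change(d_shear_bar=0.10, d_normal_bar=0.11)
print(round(dcfs, 3), dcfs > 0.1)   # value in bar; exceeds ~0.1 bar threshold
```

A positive ΔCFS above the threshold at a hypocenter, as reported here for Lushan, is the standard evidence for static stress triggering.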

  15. Earthquake engineering for nuclear facilities

    CERN Document Server

    Kuno, Michiya

    2017-01-01

    This book is a comprehensive compilation of earthquake- and tsunami-related technologies and knowledge for the design and construction of nuclear facilities. As such, it covers a wide range of fields, including civil engineering, architecture, geotechnical engineering, mechanical engineering, and nuclear engineering, for the development of new technologies providing greater resistance against earthquakes and tsunamis. It is crucial both for students of nuclear energy courses and for young engineers in nuclear power generation industries to understand the basics and principles of earthquake- and tsunami-resistant design of nuclear facilities. In Part I, "Seismic Design of Nuclear Power Plants", the design of nuclear power plants to withstand earthquakes and tsunamis is explained, focusing on buildings, equipment, and civil engineering structures. In Part II, "Basics of Earthquake Engineering", fundamental knowledge of earthquakes and tsunamis as well as the dynamic response of structures and foundation ground...

  16. Turbulence and wind turbines

    DEFF Research Database (Denmark)

    Brand, Arno J.; Peinke, Joachim; Mann, Jakob

    2011-01-01

    The nature of turbulent flow towards, near and behind a wind turbine, the effect of turbulence on the electricity production and the mechanical loading of individual and clustered wind turbines, and some future issues are discussed.

  17. High Turbulence

    CERN Multimedia

    EuHIT, Collaboration

    2015-01-01

    As a member of the EuHIT (European High-Performance Infrastructures in Turbulence - see here) consortium, CERN is participating in fundamental research on turbulence phenomena. To this end, the Laboratory provides European researchers with a cryogenic research infrastructure (see here), where the first tests have just been performed.

  18. Sensing the earthquake

    Science.gov (United States)

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of scientific content by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what principles underlie that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" earthquakes. We try to implement these results in a choreographic model with the aim of converting earthquake sound into a visual dance system, which could yield a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience in a multisensory exploration of the earthquake phenomenon, through the stimulation of hearing, eyesight and the perception of movement (neuromotor system). In essence, the main goal of this work is to develop a method for the simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  19. Thoracic Injuries in earthquake-related versus non-earthquake-related trauma patients: differentiation via Multi-detector Computed Tomography

    Science.gov (United States)

    Dong, Zhi-hui; Yang, Zhi-gang; Chen, Tian-wu; Chu, Zhi-gang; Deng, Wen; Shao, Heng

    2011-01-01

    PURPOSE: Massive earthquakes are harmful to humankind. This study of a historical cohort aimed to investigate the differences between earthquake-related crush thoracic traumas and thoracic traumas unrelated to earthquakes using multi-detector computed tomography (CT). METHODS: We retrospectively compared an earthquake-exposed cohort of 215 thoracic trauma crush victims of the Sichuan earthquake to a cohort of 215 non-earthquake-related thoracic trauma patients, focusing on the lesions and coexisting injuries to the thoracic cage and the pulmonary parenchyma and pleura using multi-detector CT. RESULTS: The incidence of rib fracture was elevated in the earthquake-exposed cohort (143 vs. 66 patients in the non-earthquake-exposed cohort, risk ratio (RR) = 2.2), as was flail chest (45/143 vs. 11/66 patients, RR = 1.9). CONCLUSIONS: Thoracic traumas resulting from the earthquake were life threatening, with a high incidence of bony thoracic fractures. The ribs were frequently involved in bilateral and severe types of fractures, which were accompanied by non-rib fractures, pulmonary parenchymal and pleural injuries. PMID:21789386
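
    The risk ratios quoted above follow from simple incidence arithmetic. A minimal sketch, using the cohort sizes given in the abstract:

```python
def risk_ratio(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Risk ratio: incidence in the exposed cohort divided by
    incidence in the unexposed cohort."""
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# Rib fracture: 143 of 215 earthquake-exposed vs. 66 of 215 non-exposed patients
rr = risk_ratio(143, 215, 66, 215)
print(round(rr, 1))  # → 2.2, matching the reported RR
```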

  20. Effects of premixed flames on turbulence and turbulent scalar transport

    Energy Technology Data Exchange (ETDEWEB)

    Lipatnikov, A.N.; Chomiak, J. [Department of Applied Mechanics, Chalmers University of Technology, 412 75 Goeteborg (Sweden)

    2010-02-15

    Experimental data and results of direct numerical simulations are reviewed in order to show that premixed combustion can change the basic characteristics of a fluctuating velocity field (the so-called flame-generated turbulence) and the direction of scalar fluxes (the so-called countergradient or pressure-driven transport) in a turbulent flow. Various approaches to modeling these phenomena are discussed and the lack of a well-elaborated and widely validated predictive approach is emphasized. Relevant basic issues (the transition from gradient to countergradient scalar transport, the role played by flame-generated turbulence in the combustion rate, the characterization of turbulence in premixed flames, etc.) are critically considered and certain widely accepted concepts are disputed. Despite the substantial progress made in understanding the discussed effects over the past decades, these basic issues still require further research. (author)

  1. Consideration for standard earthquake vibration (1). The Niigataken Chuetsu-oki Earthquake in 2007

    International Nuclear Information System (INIS)

    Ishibashi, Katsuhiko

    2007-01-01

    An outline of the new guideline for the quakeproof design standard of nuclear power plants and of the standard earthquake vibration is given. The improvement points of the new guideline are discussed on the basis of the Kashiwazaki-Kariwa Nuclear Power Plant incidents, and the fundamental limits of the new guideline are pointed out. The placement of the quakeproof design standard for nuclear power plants, JEAG4601 of the Japan Electric Association, the new guideline, the standard earthquake vibration of the new guideline, the 2007 Niigataken Chuetsu-oki Earthquake, and the damage to the Kashiwazaki-Kariwa Nuclear Power Plant are discussed. The safety review system, organization, standards and guidelines should be improved on the basis of this earthquake and nuclear plant accident. The general principle that a nuclear power plant should not be constructed in an area where a large earthquake is expected has to be realized. The precondition for all nuclear power plants should be that they cause no damage. (S.Y.)

  2. Earthquake Emergency Education in Dushanbe, Tajikistan

    Science.gov (United States)

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  3. Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia

    Science.gov (United States)

    Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

    2014-05-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic risk is increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of mega cities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructures, and undertaken with insufficient knowledge of the regional seismicity peculiarities and seismic hazard. The assessment of seismic hazard and the generation of earthquake scenarios are the first link in the prevention chain and the first step in the evaluation of seismic risk. Earthquake scenarios are intended as a basic input for developing detailed earthquake damage scenarios for cities and can be used in earthquake-safe town and infrastructure planning. The city of Sofia is the capital of Bulgaria. It is situated in the centre of the Sofia area, the most populated (more than 1.2 million inhabitants), industrial and cultural region of Bulgaria, which faces considerable earthquake risk. The available historical documents prove the occurrence of destructive earthquakes during the 15th-18th centuries in the Sofia zone. In the 19th century the city of Sofia experienced two strong earthquakes: the 1818 earthquake with epicentral intensity I0=8-9 MSK and the 1858 earthquake with I0=9-10 MSK. During the 20th century the strongest event to occur in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK). Almost a century later (95 years), an earthquake of moment magnitude 5.6 (I0=7-8 MSK) hit the city of Sofia on May 22nd, 2012. In the present study, the deterministic scenario event considered is a damaging earthquake with a higher probability of occurrence that could affect the city with intensity less than or equal to VIII

  4. Earthquake recurrence models fail when earthquakes fail to reset the stress field

    Science.gov (United States)

    Tormann, Thessa; Wiemer, Stefan; Hardebeck, Jeanne L.

    2012-01-01

    Parkfield's regularly occurring M6 mainshocks, about every 25 years, have over two decades stoked seismologists' hopes to successfully predict an earthquake of significant size. However, with the longest known inter-event time of 38 years, the latest M6 in the series (28 Sep 2004) did not conform to any of the applied forecast models, questioning once more the predictability of earthquakes in general. Our study investigates the spatial pattern of b-values along the Parkfield segment through the seismic cycle and documents a stably stressed structure. The forecasted rate of M6 earthquakes based on Parkfield's microseismicity b-values corresponds well to observed rates. We interpret the observed b-value stability in terms of the evolution of the stress field in that area: the M6 Parkfield earthquakes do not fully unload the stress on the fault, explaining why time-recurrent models fail. We present the 1989 M6.9 Loma Prieta earthquake as a counterexample, which did release a significant portion of the stress along its fault segment and yielded a substantial change in b-values.
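
    A b-value-based rate forecast of the kind described can be sketched with the standard Aki maximum-likelihood estimator and the Gutenberg-Richter relation. A minimal sketch on a synthetic catalog (the catalog, completeness magnitude, and binning correction are assumptions, not the study's data):

```python
import math
import random

def b_value_aki(mags, mc, dm=0.0):
    """Aki (1965) maximum-likelihood b-value for events with M >= mc.
    dm is the magnitude-binning correction (0 for continuous magnitudes)."""
    m = [x for x in mags if x >= mc]
    return math.log10(math.e) / (sum(m) / len(m) - (mc - dm / 2.0))

def annual_rate_above(mags, years, mc, m_target, dm=0.0):
    """Extrapolate Gutenberg-Richter (log10 N = a - b*M) from the
    completeness magnitude mc up to a larger target magnitude."""
    b = b_value_aki(mags, mc, dm)
    a = math.log10(sum(1 for x in mags if x >= mc) / years) + b * mc
    return 10 ** (a - b * m_target)

# Synthetic 20-year catalog drawn from a Gutenberg-Richter law with b = 1
random.seed(1)
mags = [2.0 - math.log10(random.random()) for _ in range(20000)]
print(b_value_aki(mags, 2.0))                   # close to 1
print(annual_rate_above(mags, 20.0, 2.0, 6.0))  # roughly 0.1 M6+ events per year
```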

  5. Earthquake Damage Assessment Using Objective Image Segmentation: A Case Study of 2010 Haiti Earthquake

    Science.gov (United States)

    Oommen, Thomas; Rebbapragada, Umaa; Cerminaro, Daniel

    2012-01-01

    In this study, we perform a case study on imagery from the Haiti earthquake that evaluates a novel object-based approach for characterizing earthquake-induced surface effects of liquefaction against a traditional pixel-based change-detection technique. Our technique, which combines object-oriented change detection with discriminant/categorical functions, shows the power of distinguishing earthquake-induced surface effects from changes in buildings using the object properties concavity, convexity, orthogonality and rectangularity. Our results suggest that object-based analysis holds promise for automatically extracting earthquake-induced damage from high-resolution aerial/satellite imagery.
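
    One of the object properties mentioned, rectangularity, is easy to illustrate. This is a generic shape metric on a pixel mask, a sketch rather than the paper's exact implementation:

```python
def rectangularity(pixels):
    """Region area divided by the area of its axis-aligned bounding box.
    Equals 1.0 for a perfect rectangle (e.g. an intact building footprint)
    and drops for irregular regions such as liquefaction patches."""
    rows = [r for r, c in pixels]
    cols = [c for r, c in pixels]
    bbox_area = (max(rows) - min(rows) + 1) * (max(cols) - min(cols) + 1)
    return len(pixels) / bbox_area

# A full 2x3 block is perfectly rectangular...
block = [(r, c) for r in range(2) for c in range(3)]
# ...while an L-shaped region spanning the same bounding box is not.
l_shape = [(0, 0), (1, 0), (1, 1), (1, 2)]
print(rectangularity(block), rectangularity(l_shape))
```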

  6. Rupture, waves and earthquakes.

    Science.gov (United States)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave-energy radiation by rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, is not yet fully understood. Instead, much former investigation in seismology evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components above 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain with this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake. More detailed study of rupture and wave dynamics, namely the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) the wave interaction that connects rupture (1) and failures (2), would therefore be indispensable.

  7. Plasma turbulence

    International Nuclear Information System (INIS)

    Horton, W.

    1998-07-01

    The origin of plasma turbulence from currents and spatial gradients in plasmas is described and shown to lead to the dominant transport mechanism in many plasma regimes. A wide variety of turbulent transport mechanisms exists in plasmas. In this survey the authors summarize some of the universally observed plasma transport rates

  8. The CATDAT damaging earthquakes database

    Science.gov (United States)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies from, and expand greatly upon existing global databases, and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. In the view of the authors, the lack of consistency and the errors in other frequently cited earthquake loss databases were major shortcomings that needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto (214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes shows the increasing concern for economic loss in urban areas, as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  9. The CATDAT damaging earthquakes database

    Directory of Open Access Journals (Sweden)

    J. E. Daniell

    2011-08-01

    Full Text Available The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes.

    In the view of the authors, the lack of consistency and the errors in other frequently cited earthquake loss databases were major shortcomings that needed to be improved upon.

    Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured).

    Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto ($214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>$300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes shows the increasing concern for economic loss in urban areas, as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons.

    This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global

  10. Comparison of aftershock sequences between 1975 Haicheng earthquake and 1976 Tangshan earthquake

    Science.gov (United States)

    Liu, B.

    2017-12-01

    The 1975 ML 7.3 Haicheng earthquake and the 1976 ML 7.8 Tangshan earthquake occurred in the same tectonic unit. There are significant differences in the spatial-temporal distribution, number of aftershocks and time duration of the aftershock sequences that followed these two main shocks. As is well known, aftershocks can be triggered by the change in regional seismicity caused by the main shock through Coulomb stress perturbation. Based on the rate- and state-dependent friction law, we quantitatively estimated the possible aftershock duration using seismicity data, and compared the results from different approaches. The results indicate that the aftershock duration for the Tangshan main shock is several times that for the Haicheng main shock. This can be explained by the significant relationship between aftershock duration and the earthquake nucleation history, the normal stress, and the shear stress loading rate on the fault. In fact, the obvious difference in nucleation history between these two main shocks lies in the foreshocks: the 1975 Haicheng earthquake had clear and long-lasting foreshocks, while the 1976 Tangshan earthquake did not. In that case, abundant foreshocks may indicate a long and active nucleation process that may have weakened the rocks in the source region, so such events should have shorter aftershock sequences because stress in weak rocks decays faster.
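
    Under the rate- and state-dependent friction framework cited above, the aftershock duration scales as t_a ≈ Aσ / τ̇ (Dieterich, 1994). A minimal sketch with purely illustrative parameter values, not the paper's estimates:

```python
def aftershock_duration_years(a_sigma_mpa, stressing_rate_mpa_per_yr):
    """Dieterich (1994) aftershock duration t_a ~ A*sigma / tau_dot,
    where A*sigma is the rate-state constitutive parameter times normal
    stress (MPa) and tau_dot is the background shear stressing rate (MPa/yr)."""
    return a_sigma_mpa / stressing_rate_mpa_per_yr

# Illustrative values only: the same A*sigma on a fault loaded four times
# more slowly yields a fourfold longer aftershock sequence.
fast = aftershock_duration_years(0.2, 0.02)   # faster-loaded fault
slow = aftershock_duration_years(0.2, 0.005)  # slower-loaded fault
print(fast, slow)
```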

  11. Tradable Earthquake Certificates

    NARCIS (Netherlands)

    Woerdman, Edwin; Dulleman, Minne

    2018-01-01

    This article presents a market-based idea to compensate for earthquake damage caused by the extraction of natural gas and applies it to the case of Groningen in the Netherlands. Earthquake certificates give homeowners a right to yearly compensation for both property damage and degradation of living

  12. Quantum Turbulence ---Another da Vinci Code---

    Science.gov (United States)

    Tsubota, M.

    Quantum turbulence comprises a tangle of quantized vortices, which are stable topological defects created by Bose-Einstein condensation, realized in superfluid helium and atomic Bose-Einstein condensates. In recent years there has been growing interest in quantum turbulence. One important motivation is to understand the relation between quantum and classical turbulence. Quantum turbulence is expected to be much simpler than usual classical turbulence and to provide a prototype of turbulence. This article briefly reviews recent research developments in quantum turbulence.

  13. Comparison of turbulence in a transitional boundary layer to turbulence in a developed boundary layer*

    Science.gov (United States)

    Park, G. I.; Wallace, J.; Wu, X.; Moin, P.

    2010-11-01

    Using a recent DNS of a flat-plate boundary layer, statistics of turbulence in transition at Reθ = 500, where spots merge (distributions of the mean velocity, rms velocity and vorticity fluctuations, Reynolds shear stress, kinetic energy production and dissipation rates, and enstrophy), have been compared to these statistics for developed boundary-layer turbulence at Reθ = 1850. When the distributions in the transitional region, determined in narrow planes 0.03 Reθ wide, exclude regions and times when the flow is not turbulent, they closely resemble those in the developed turbulent state at the higher Reynolds number, especially in the buffer and sublayers. The skin friction coefficient determined in this conditional manner in the transitional flow is, of course, much larger than that obtained by including both turbulent and non-turbulent information there, and is consistent with a value obtained by extrapolating from the developed turbulent region. We are attempting to carry this data analysis even further upstream in the transitioning flow, at Reθ = 300, where the turbulent spots are individuated. These results add further evidence to support the view that the structure of a developed turbulent boundary layer differs little from its structure in its embryonic form in turbulent spots. *CTR 2010 Summer Program research.
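
    The conditional sampling described, excluding regions and times when the flow is not turbulent, amounts to averaging over an intermittency mask. A generic sketch of the idea, not the study's code, with a toy signal:

```python
def conditional_mean(values, turbulent_flags):
    """Mean over samples flagged as turbulent only (a conditional statistic)."""
    turb = [v for v, flag in zip(values, turbulent_flags) if flag]
    return sum(turb) / len(turb)

def intermittency_factor(turbulent_flags):
    """Fraction of samples that are turbulent (gamma)."""
    return sum(turbulent_flags) / len(turbulent_flags)

# Toy signal: turbulent patches carry higher fluctuation levels, so the
# conditional mean exceeds the unconditional mean in transitional flow.
values = [0.1, 0.1, 0.9, 1.1, 0.1, 1.0]
flags = [False, False, True, True, False, True]
print(conditional_mean(values, flags), intermittency_factor(flags))
```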

  14. What Can Sounds Tell Us About Earthquake Interactions?

    Science.gov (United States)

    Aiken, C.; Peng, Z.

    2012-12-01

    It is important not only for seismologists but also for educators to effectively convey information about earthquakes and the influences earthquakes can have on each other. Recent studies using auditory display [e.g. Kilb et al., 2012; Peng et al., 2012] have depicted catastrophic earthquakes and the effects large earthquakes can have on other parts of the world. Auditory display of earthquakes, which combines static images with time-compressed sound of recorded seismic data, is a new approach to disseminating information to a general audience about earthquakes and earthquake interactions. Earthquake interactions are important for understanding the underlying physics of earthquakes and other seismic phenomena such as tremor, in addition to their source characteristics (e.g. frequency content, amplitudes). Earthquake interactions can include, for example, a large, shallow earthquake followed by increased seismicity around the mainshock rupture (i.e. aftershocks), or even a large earthquake triggering earthquakes or tremors several hundreds to thousands of kilometers away [Hill and Prejean, 2007; Peng and Gomberg, 2010]. We use standard tools like MATLAB, QuickTime Pro, and Python to produce animations that illustrate earthquake interactions. Our efforts are focused on producing animations that depict cross-section (side) views of tremors triggered along the San Andreas Fault by distant earthquakes, as well as map (bird's-eye) views of mainshock-aftershock sequences such as the 2011/08/23 Mw5.8 Virginia earthquake sequence. These examples of earthquake interactions include sonifying earthquake and tremor catalogs as musical notes (e.g. piano keys) as well as audifying seismic data using time compression. Our overall goal is to use auditory display to invigorate a general interest in earthquake seismology that leads to an understanding of how earthquakes occur, how earthquakes influence one another as well as tremors, and what the musical properties of these
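
    The audification-by-time-compression step mentioned above can be sketched simply: the seismic samples are left untouched and relabeled with a faster playback rate, which shifts sub-audio seismic frequencies into the audible band. A generic sketch; the speedup factor and 16-bit normalization are assumptions, not the authors' settings:

```python
def audify(samples, native_rate_hz, speedup):
    """Time-compress a seismogram for playback: keep the samples and play
    them back at native_rate * speedup, so e.g. a 0.5 Hz seismic signal
    at a 400x speedup is heard at 200 Hz."""
    playback_rate_hz = int(native_rate_hz * speedup)
    peak = max(abs(s) for s in samples) or 1.0
    # normalize to the 16-bit PCM range for writing to a WAV file
    pcm = [int(32767 * s / peak) for s in samples]
    return pcm, playback_rate_hz

pcm, rate = audify([0.0, 0.5, -1.0, 0.25], native_rate_hz=100, speedup=400)
print(rate, min(pcm), max(pcm))
```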

  15. Nowcasting Earthquakes and Tsunamis

    Science.gov (United States)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data, which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To estimate with the nowcast method how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, with "small" earthquakes determining the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk
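
    The nowcast idea, ranking the current count of small earthquakes against the counts accumulated in past large-earthquake cycles, can be sketched as follows. This is a simplified reading of the method; the magnitude thresholds and toy catalog are illustrative assumptions:

```python
def interevent_counts(mags, m_large):
    """Counts of small events between successive large (M >= m_large) events.
    Returns the completed-cycle counts and the count in the current open cycle."""
    completed, current = [], 0
    for m in mags:
        if m >= m_large:
            completed.append(current)
            current = 0
        else:
            current += 1
    return completed, current

def earthquake_potential_score(completed, current):
    """Fraction of past cycles that finished with fewer small events than
    have occurred so far in the current cycle (progress through the cycle)."""
    return sum(1 for n in completed if n < current) / len(completed)

# Toy catalog in time order: two completed M>=6 cycles, then 2 small events so far
mags = [4.0, 6.1, 4.2, 4.5, 4.1, 6.3, 4.0, 4.3]
completed, current = interevent_counts(mags, 6.0)
print(completed, current, earthquake_potential_score(completed, current))
```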

  16. Effect of turbulent collisions on diffusion in stationary plasma turbulence

    International Nuclear Information System (INIS)

    Xia, H.; Ishihara, O.

    1990-01-01

    Recently the velocity diffusion process was studied via the generalized Langevin equation derived by the projection operator method. Further study shows that the retarded friction function plays as important a role as the resonance broadening effect in suppressing particle diffusion in velocity space in stronger turbulence. The retarded frictional effect, produced by the effective collisions due to the plasma turbulence, is assumed to be a Gaussian, but non-Markovian and non-wide-sense-stationary, process. The relations between the proposed formulation and the extended resonance broadening theory are discussed. The authors also carry out a test-particle numerical experiment for Langmuir turbulence to test the theories. In stronger turbulence, a deviation of the diffusion rate from that predicted by both the quasilinear and the extended resonance theories has been observed and is explained qualitatively by the present formulation

  17. Seismicity map tools for earthquake studies

    Science.gov (United States)

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new online set of tools, for use within Google Maps, for earthquake research. We demonstrate this server-based online platform (developed with PHP, JavaScript and MySQL) with the new tools, using a database system with earthquake data. The platform allows us to carry out statistical and deterministic analysis of earthquake data using Google Maps and to plot various seismicity graphs. The toolbox has been extended to draw on the map line segments, multiple straight lines horizontally and vertically, as well as multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows studying earthquake clustering and earthquake cluster shift within the segments in space. The platform offers many filters, such as for plotting selected magnitude ranges or time periods. The plotting facility allows statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the b-value, etc. What is novel about the platform is the additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools we have studied the spatial distribution trends of many earthquakes, and we show here for the first time a link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform.
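
    The NxN regional segmentation can be sketched as simple grid binning of epicenters. The coordinates below are illustrative points in a box roughly covering Greece, not data from the platform:

```python
def grid_counts(events, lat_min, lat_max, lon_min, lon_max, n):
    """Count epicenters in an n x n grid over a rectangular region.
    events: iterable of (lat, lon) pairs; out-of-region events are skipped."""
    counts = [[0] * n for _ in range(n)]
    dlat = (lat_max - lat_min) / n
    dlon = (lon_max - lon_min) / n
    for lat, lon in events:
        i = int((lat - lat_min) / dlat)
        j = int((lon - lon_min) / dlon)
        if 0 <= i < n and 0 <= j < n:
            counts[i][j] += 1
    return counts

# Illustrative epicenters in a box roughly covering Greece (34-42N, 19-29E)
events = [(35.1, 25.2), (38.0, 23.7), (38.1, 23.8), (40.5, 22.9)]
counts = grid_counts(events, 34.0, 42.0, 19.0, 29.0, 4)
print(counts)
```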

  18. Earthquake at 40 feet

    Science.gov (United States)

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of 6.25 Richter magnitude while at 40 feet. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography when the earthquake struck.

  19. Earthquakes and economic growth

    OpenAIRE

    Fisker, Peter Simonsen

    2012-01-01

    This study explores the economic consequences of earthquakes. In particular, it investigates how exposure to earthquakes affects economic growth both across and within countries. The key result of the empirical analysis is that while there are no observable effects at the country level, earthquake exposure significantly decreases 5-year economic growth at the local level. Areas at lower stages of economic development suffer more in terms of economic growth than richer areas. In addition,...

  20. Modeling of turbulent bubbly flows; Modelisation des ecoulements turbulents a bulles

    Energy Technology Data Exchange (ETDEWEB)

    Bellakhal, Ghazi

    2005-03-15

    Two-phase flows involve interfacial interactions which significantly modify the structure of the mean and fluctuating flow fields. The design of two-fluid models adapted to industrial flows requires taking the effect of these interactions into account in the adopted closure relations. The work developed in this thesis concerns the development of first-order two-fluid models deduced by reduction of second-order closures. The adopted reasoning, based on the decomposition of the Reynolds stress tensor into two statistically independent contributions, turbulent and pseudo-turbulent parts, preserves the physical content of the second-order closure relations. Analysis of the turbulence structure in two basic flows, uniform homogeneous bubbly flows and homogeneous bubbly flows with a constant shear, allows a formulation of the two-phase turbulent viscosity involving the characteristic scales of bubbly turbulence to be deduced, as well as an analytical description of the modification of the homogeneous turbulence structure induced by the presence of the bubbles. The Eulerian two-fluid model was then generalized to the case of inhomogeneous flows with low void fractions. The numerical results obtained by applying this model, integrated in the computer code MELODIF, to a free sheared turbulent bubbly wake flow showed satisfactory agreement with the experimental data and made it possible to analyze the modification of the characteristic scales of such a flow by the interfacial interactions. Finally, the first-order two-fluid model is generalized to the case of high-void-fraction bubbly flows, where the hydrodynamic interactions between the bubbles are no longer negligible. (author)

  1. Radon anomalies prior to earthquakes (2). Atmospheric radon anomaly observed before the Hyogoken-Nanbu earthquake

    International Nuclear Information System (INIS)

    Ishikawa, Tetsuo; Tokonami, Shinji; Yasuoka, Yumi; Shinogi, Masaki; Nagahama, Hiroyuki; Omori, Yasutaka; Kawada, Yusuke

    2008-01-01

    Before the 1995 Hyogoken-Nanbu earthquake, various geochemical precursors were observed in the aftershock area: chloride ion concentration, groundwater discharge rate, groundwater radon concentration and so on. Kobe Pharmaceutical University (KPU) is located about 25 km northeast of the epicenter and within the aftershock area. Atmospheric radon concentration had been continuously measured since 1984 at KPU, using a flow-type ionization chamber. The radon concentration data were analyzed using smoothed residual values, which represent the daily minimum of radon concentration with the exclusion of normalized seasonal variation. The radon concentration (smoothed residual values) demonstrated an upward trend about two months before the Hyogoken-Nanbu earthquake. The trend can be well fitted to a log-periodic model related to earthquake fault dynamics. As a result of the model fitting, a critical point was calculated to fall between 13 and 27 January 1995, in good agreement with the occurrence date of the earthquake (17 January 1995). The mechanism of radon anomalies before earthquakes is not fully understood. However, it might be possible to detect an atmospheric radon anomaly as a precursor before a large earthquake, if (1) the measurement is conducted near the earthquake fault, (2) the monitoring station is located on granite (radon-rich) areas, and (3) the measurement is conducted for more than several years before the earthquake to obtain background data. (author)
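    The log-periodic fitting described in this record can be illustrated with a minimal numerical sketch. The functional form below, a log-periodic power law diverging toward a critical time tc, and every parameter value in it are illustrative assumptions, not the study's actual model or fitted parameters.

    ```python
    import numpy as np

    def log_periodic(t, A, B, tc, m, C, omega, phi):
        """Log-periodic power law: trend diverging toward t = tc with
        oscillations that are periodic in log(tc - t). Valid for t < tc."""
        dt = tc - t  # time remaining until the critical point
        return A + B * dt**m * (1.0 + C * np.cos(omega * np.log(dt) + phi))

    # Illustrative parameters (hypothetical): critical time tc = 100 days;
    # B < 0 gives an upward trend as t approaches tc, as in the radon anomaly.
    params = dict(A=10.0, B=-0.5, tc=100.0, m=0.5, C=0.2, omega=6.0, phi=0.0)

    t = np.linspace(0.0, 90.0, 200)      # observation window before tc
    signal = log_periodic(t, **params)   # synthetic "radon residual" series

    # The envelope rises toward the critical time, mimicking the reported trend.
    trend_up = signal[-1] > signal[0]
    ```

    In practice such a model would be fitted to the smoothed residual series (e.g. by non-linear least squares), and the estimated tc read off as the predicted critical point.
    
    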

  2. Retrospective stress-forecasting of earthquakes

    Science.gov (United States)

    Gao, Yuan; Crampin, Stuart

    2015-04-01

    Observations of changes in azimuthally varying shear-wave splitting (SWS) above swarms of small earthquakes monitor stress-induced changes to the stress-aligned vertical microcracks pervading the upper crust, lower crust, and uppermost ~400 km of the mantle. (The microcracks are intergranular films of hydrolysed melt in the mantle.) Earthquakes release stress, and an appropriate amount of stress for the relevant magnitude must accumulate before each event. Iceland is on an extension of the Mid-Atlantic Ridge, where, uniquely, two transform zones run onshore. These onshore transform zones provide semi-continuous swarms of small earthquakes and are the only place worldwide where SWS can be routinely monitored. Elsewhere SWS must be monitored above temporally active occasional swarms of small earthquakes, or in infrequent SKS and other teleseismic reflections from the mantle. Observations of changes in SWS time-delays are attributed to stress-induced changes in crack aspect ratios, allowing stress accumulation and stress relaxation to be identified. Monitoring SWS in SW Iceland in 1988, stress accumulation before an impending earthquake was recognised and emails were exchanged between the University of Edinburgh (EU) and the Iceland Meteorological Office (IMO). On 10th November 1988, EU emailed IMO that a M5 earthquake could occur soon on a seismically active fault plane where seismicity was still continuing following a M5.1 earthquake six months earlier. Three days later, IMO emailed EU that a M5 earthquake had just occurred on the specified fault plane. We suggest this is a successful earthquake stress-forecast, where we refer to the procedure as stress-forecasting earthquakes, as opposed to predicting or forecasting, to emphasise the different formalism. Lack of funds has prevented us monitoring SWS on Iceland seismograms; however, we have identified similar characteristic behaviour of SWS time-delays above swarms of small earthquakes which have enabled us to

  3. Charles Darwin's earthquake reports

    Science.gov (United States)

    Galiev, Shamil

    2010-05-01

    As it is the 200th anniversary of Darwin's birth, 2009 has also been marked as 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked, volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after. These effects are sometimes repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of '…the rending of strata, at a point not very deep below the surface of the earth…' and '…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the earth's evolution and its dynamics. These ideas set the tone for the tectonic plate theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: '...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with results of recent publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

  4. Impact of a Large San Andreas Fault Earthquake on Tall Buildings in Southern California

    Science.gov (United States)

    Krishnan, S.; Ji, C.; Komatitsch, D.; Tromp, J.

    2004-12-01

    In 1857, an earthquake of magnitude 7.9 occurred on the San Andreas fault, starting at Parkfield and rupturing in a southeasterly direction for more than 300 km. Such a unilateral rupture produces significant directivity toward the San Fernando and Los Angeles basins. The strong shaking in the basins due to this earthquake would have had a significant long-period content (2-8 s). If such motions were to happen today, they could have a serious impact on tall buildings in Southern California. In order to study the effects of large San Andreas fault earthquakes on tall buildings in Southern California, we use the finite source of the magnitude 7.9 2001 Denali fault earthquake in Alaska and map it onto the San Andreas fault with the rupture originating at Parkfield and proceeding southward over a distance of 290 km. Using the SPECFEM3D spectral element seismic wave propagation code, we simulate a Denali-like earthquake on the San Andreas fault and compute ground motions at sites located on a grid with a 2.5-5.0 km spacing in the greater Southern California region. We subsequently analyze 3D structural models of an existing tall steel building designed in 1984 as well as one designed according to the current building code (Uniform Building Code, 1997) subjected to the computed ground motion. We use a sophisticated nonlinear building analysis program, FRAME3D, that has the ability to simulate damage in buildings due to three-component ground motion. We summarize the performance of these structural models on contour maps of carefully selected structural performance indices. This study could benefit the city in laying out emergency response strategies in the event of an earthquake on the San Andreas fault, in undertaking appropriate retrofit measures for tall buildings, and in formulating zoning regulations for new construction.
In addition, the study would provide risk data associated with existing and new construction to insurance companies, real estate developers, and

  5. Analysis of turbulent boundary layers

    CERN Document Server

    Cebeci, Tuncer

    1974-01-01

    Analysis of Turbulent Boundary Layers focuses on turbulent flows meeting the requirements for the boundary-layer or thin-shear-layer approximations. Its approach is devising relatively fundamental, and often subtle, empirical engineering correlations, which are then introduced into various forms of describing equations for final solution. After introducing the topic on turbulence, the book examines the conservation equations for compressible turbulent flows, boundary-layer equations, and general behavior of turbulent boundary layers. The latter chapters describe the CS method for calculati

  6. Turbulent buoyant jets and plumes

    CERN Document Server

    Rodi, Wolfgang

    The Science & Applications of Heat and Mass Transfer: Reports, Reviews, & Computer Programs, Volume 6: Turbulent Buoyant Jets and Plumes focuses on the formation, properties, characteristics, and reactions of turbulent jets and plumes. The selection first offers information on the mechanics of turbulent buoyant jets and plumes and turbulent buoyant jets in shallow fluid layers. Discussions focus on submerged buoyant jets into shallow fluid, horizontal surface or interface jets into shallow layers, fundamental considerations, and turbulent buoyant jets (forced plumes). The manuscript then exami

  7. Turbulence measurements in fusion plasmas

    International Nuclear Information System (INIS)

    Conway, G D

    2008-01-01

    Turbulence measurements in magnetically confined toroidal plasmas have a long history and relevance due to the detrimental role of turbulence-induced transport on particle, energy, impurity and momentum confinement. The turbulence, i.e. the microscopic random fluctuations in particle density, temperature, potential and magnetic field, is generally driven by radial gradients in the plasma density and temperature. The correlation between the turbulence properties and global confinement, via enhanced diffusion, convection and direct conduction, is now well documented. Theory, together with recent measurements, also indicates that non-linear interactions within the turbulence generate large-scale zonal flows and geodesic oscillations, which can feed back onto the turbulence and equilibrium profiles, creating a complex interdependence. An overview of the current status and understanding of plasma turbulence measurements in the closed-flux-surface region of magnetic confinement fusion devices is presented, highlighting some recent developments and outstanding problems.

  8. Organizational changes at Earthquakes & Volcanoes

    Science.gov (United States)

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  9. Earthquake effect on the geological environment

    International Nuclear Information System (INIS)

    Kawamura, Makoto

    1999-01-01

    Acceleration caused by earthquakes, changes in the water pressure, and the rock-mass strain were monitored for a series of 344 earthquakes from 1990 to 1998 at the Kamaishi In Situ Test Site. The largest acceleration registered was 57.14 gal, for the earthquake named 'North Coast of Iwate Earthquake' (M4.4) which occurred in June 1996. Changes of the water pressure were recorded for 27 earthquakes; the largest change was -0.35 kgf/cm². The water-pressure change caused by an earthquake was, however, usually smaller than that caused by rainfall in this area. No change in the electric conductivity or pH of ground water was detected before or after the earthquakes throughout the entire period of monitoring. The rock-mass strain was measured with an extensometer whose detection limit was of the order of 10^-8 to 10^-9 degrees, and a remaining strain of about 2.5×10^-9 degrees was detected following the 'Offshore Miyagi Earthquake' (M5.1) in October 1997. (H. Baba)

  10. Earthquake predictions using seismic velocity ratios

    Science.gov (United States)

    Sherburne, R. W.

    1979-01-01

    Since the beginning of modern seismology, seismologists have contemplated predicting earthquakes. The usefulness of earthquake predictions to the reduction of human and economic losses, and the value of long-range earthquake prediction to planning, is obvious. Not as clear are the long-range economic and social impacts of earthquake prediction on a specific area. The general consensus among scientists and government officials, however, is that the quest for earthquake prediction is a worthwhile goal and should be pursued with a sense of urgency.

  11. Disability in post-earthquake Haiti: prevalence and inequality in access to services.

    Science.gov (United States)

    Danquah, Lisa; Polack, Sarah; Brus, Aude; Mactaggart, Islay; Houdon, Claire Perrin; Senia, Patrick; Gallien, Pierre; Kuper, Hannah

    2015-01-01

    To assess the prevalence of disability and service needs in post-earthquake Haiti, and to compare the inclusion and living conditions of people with disabilities to those without disabilities. A population-based prevalence survey of disability was undertaken in 2012 in the Port-au-Prince region, which was at the centre of the earthquake in 2010. Sixty clusters of 50 people aged 5+ years were selected with probability-proportionate-to-size sampling and screened for disability (Washington Group short set questionnaire). A case-control study was undertaken, nested within the survey, matching cases to controls by age, gender and cluster. There was additional case finding to identify further children with disabilities. Information was collected on: socioeconomic status, education, livelihood, health, activities, participation and barriers. The prevalence of disability was 4.1% (3.4-4.7%) across 3132 eligible individuals. The earthquake was the second leading cause of disability. Disability was more common with increasing age, but unrelated to poverty. Large gaps existed in access to services for people with disabilities. Adults with disabilities were less likely to be literate or work and more likely to visit health services than adults without disabilities. Children with disabilities were less likely to be currently enrolled at school compared to controls. Children and adults with disabilities reported more activity limitations and participation restriction. Further focus is needed to improve inclusion of people with disabilities in post-earthquake Haiti to ensure that their rights are fulfilled. Almost one in six households in this region of Haiti included a person with a disability, and the earthquake was the second leading cause of disability. Fewer than half of people who reported needing medical rehabilitation had received this service. The leading reported barriers to the uptake of health services included financial constraints (50%) and difficulties with

  12. Earthquakes and Schools

    Science.gov (United States)

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  13. Smoking prevalence increases following Canterbury earthquakes.

    Science.gov (United States)

    Erskine, Nick; Daley, Vivien; Stevenson, Sue; Rhodes, Bronwen; Beckert, Lutz

    2013-01-01

    A magnitude 7.1 earthquake hit Canterbury in September 2010. This earthquake and associated aftershocks took the lives of 185 people and drastically changed residents' living, working, and social conditions. To explore the impact of the earthquakes on smoking status and levels of tobacco consumption in the residents of Christchurch, semistructured interviews were carried out in two city malls and the central bus exchange 15 months after the first earthquake. A total of 1001 people were interviewed. In August 2010, prior to any earthquake, 409 (41%) participants had never smoked, 273 (27%) were currently smoking, and 316 (32%) were ex-smokers. Since the September 2010 earthquake, 76 (24%) of the 316 ex-smokers had smoked at least one cigarette and 29 (38.2%) of these had smoked more than 100 cigarettes. Of the 273 participants who were current smokers in August 2010, 93 (34.1%) had increased consumption following the earthquake, 94 (34.4%) had not changed, and 86 (31.5%) had decreased their consumption. 53 (57%) of the 93 people whose consumption increased cited the earthquake and subsequent lifestyle changes as a reason for increased smoking. 24% of ex-smokers resumed smoking following the earthquake, resulting in increased smoking prevalence. Tobacco consumption levels increased in around one-third of current smokers.

  14. Cyclic migration of weak earthquakes between Lunigiana earthquake of October 10, 1995 and Reggio Emilia earthquake of October 15, 1996 (Northern Italy)

    Science.gov (United States)

    di Giovambattista, R.; Tyupkin, Yu

    The cyclic migration of weak earthquakes (M 2.2) which occurred during the year prior to the October 15, 1996 (M = 4.9) Reggio Emilia earthquake is discussed in this paper. The onset of this migration was associated with the occurrence of the October 10, 1995 (M = 4.8) Lunigiana earthquake about 90 km southwest from the epicenter of the Reggio Emilia earthquake. At least three series of earthquakes migrating from the epicentral area of the Lunigiana earthquake in the northeast direction were observed. The migration of earthquakes of the first series terminated at a distance of about 30 km from the epicenter of the Reggio Emilia earthquake. The earthquake migration of the other two series halted at about 10 km from the Reggio Emilia epicenter. The average rate of earthquake migration was about 200-300 km/year, while the time of recurrence of the observed cycles varied from 68 to 178 days. Weak earthquakes migrated along the transversal fault zones and sometimes jumped from one fault to another. A correlation between the migrating earthquakes and tidal variations is analysed. We discuss the hypothesis that the analyzed area is in a state of stress approaching the limit of the long-term durability of crustal rocks and that the observed cyclic migration is a result of a combination of a more or less regular evolution of tectonic and tidal variations.

  15. The 2012 Mw5.6 earthquake in Sofia seismogenic zone - is it a slow earthquake

    Science.gov (United States)

    Raykova, Plamena; Solakov, Dimcho; Slavcheva, Krasimira; Simeonova, Stela; Aleksandrova, Irena

    2017-04-01

    Recently our understanding of tectonic faulting has been shaken by the discoveries of seismic tremor, low-frequency earthquakes, slow slip events, and other modes of fault slip. These phenomena represent modes of failure that were thought to be non-existent and theoretically impossible only a few years ago. Slow earthquakes are seismic phenomena in which the rupture of geological faults in the earth's crust occurs gradually without creating strong tremors. Despite the growing number of observations of slow earthquakes, their origin remains unresolved. Studies show that the duration of slow earthquakes ranges from a few seconds to a few hundred seconds. While the regular earthquakes with which most people are familiar release a burst of built-up stress in seconds, slow earthquakes release energy in ways that do little damage. This study focuses on the characteristics of the Mw5.6 earthquake that occurred in the Sofia seismic zone on May 22nd, 2012. The Sofia area is the most populated, industrial and cultural region of Bulgaria and faces considerable earthquake risk. The Sofia seismic zone is located in south-western Bulgaria, an area with pronounced tectonic activity and proven crustal movement. In the 19th century the city of Sofia (situated in the centre of the Sofia seismic zone) experienced two strong earthquakes with epicentral intensity of 10 MSK. During the 20th century the strongest event to occur in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK64). The 2012 quake occurred in an area characterized by a long quiescence (of 95 years) for moderate events. Moreover, a reduced number of small earthquakes have also been registered in the recent past. The Mw5.6 earthquake was largely felt on the territory of Bulgaria and in neighbouring countries. No casualties or severe injuries have been reported. Mostly moderate damage was observed in the cities of Pernik and Sofia and their surroundings. These observations could be assumed indicative for a

  16. Plasma Turbulence General Topics

    Energy Technology Data Exchange (ETDEWEB)

    Kadomtsev, B. B. [Nuclear Energy Institute, Academy of Sciences of the USSR, Moscow, USSR (Russian Federation)

    1965-06-15

    It is known that under experimental conditions plasma often shows chaotic motion. Such motion, when many degrees of freedom are excited to levels considerably above the thermal level, will be called turbulent. The properties of turbulent plasma in many respects differ from the properties of laminar plasma. It can be said that the appearance of various anomalies in plasma behaviour indicates the presence of turbulence in plasma. In order to verify directly the presence of turbulent motion in plasma we must, however, measure the fluctuation of some microscopic parameters in plasma.

  17. Global Turbulence Decision Support for Aviation

    Science.gov (United States)

    Williams, J.; Sharman, R.; Kessinger, C.; Feltz, W.; Wimmers, A.

    2009-09-01

    Turbulence is widely recognized as the leading cause of injuries to flight attendants and passengers on commercial air carriers, yet legacy decision support products such as SIGMETs and SIGWX charts provide relatively low spatial- and temporal-resolution assessments and forecasts of turbulence, with limited usefulness for strategic planning and tactical turbulence avoidance. A new effort is underway to develop an automated, rapid-update, gridded global turbulence diagnosis and forecast system that addresses upper-level clear-air turbulence, mountain-wave turbulence, and convectively induced turbulence. This NASA-funded effort, modeled on the U.S. Federal Aviation Administration's Graphical Turbulence Guidance (GTG) and GTG Nowcast systems, employs NCEP Global Forecast System (GFS) model output and data from NASA and operational satellites to produce quantitative turbulence nowcasts and forecasts. A convective nowcast element based on GFS forecasts and satellite data provides a basis for diagnosing convective turbulence. An operational prototype "Global GTG" system has been running in real time at the U.S. National Center for Atmospheric Research since the spring of 2009. Initial verification results based on data from TRMM, CloudSat and MODIS (for the convection nowcasting) and AIREPs and AMDAR data (for turbulence) are presented. This product aims to provide the "single authoritative source" for global turbulence information for the U.S. Next Generation Air Transportation System.

  18. PROTOSTELLAR OUTFLOW EVOLUTION IN TURBULENT ENVIRONMENTS

    International Nuclear Information System (INIS)

    Cunningham, Andrew J.; Frank, Adam; Carroll, Jonathan; Blackman, Eric G.; Quillen, Alice C.

    2009-01-01

    The link between turbulence in star-forming environments and protostellar jets remains controversial. To explore issues of turbulence and fossil cavities driven by young stellar outflows, we present a series of numerical simulations tracking the evolution of transient protostellar jets driven into a turbulent medium. Our simulations show both the effect of turbulence on outflow structures and, conversely, the effect of outflows on the ambient turbulence. We demonstrate how turbulence will lead to strong modifications in jet morphology. More importantly, we demonstrate that individual transient outflows have the capacity to re-energize decaying turbulence. Our simulations support a scenario in which the directed energy/momentum associated with cavities is randomized as the cavities are disrupted by dynamical instabilities seeded by the ambient turbulence. Consideration of the energy power spectra of the simulations reveals that the disruption of the cavities powers an energy cascade consistent with Burgers'-type turbulence and produces a driving scale length associated with the cavity propagation length. We conclude that fossil cavities interacting either with a turbulent medium or with other cavities have the capacity to sustain or create turbulent flows in star-forming environments. In the last section, we contrast our work and its conclusions with previous studies which claim that jets cannot be the source of turbulence.

  19. Momentum and scalar transport at the turbulent/non-turbulent interface of a jet

    DEFF Research Database (Denmark)

    Westerweel, J.; Fukushima, C.; Pedersen, Jakob Martin

    2009-01-01

    and well-defined bounding interface between the turbulent and non-turbulent regions of flow. The jet carries a fluorescent dye measured with planar laser-induced fluorescence (LIF), and the surface discontinuity in the scalar concentration is identified as the fluctuating turbulent jet interface. Thence...... velocity and mean scalar and a tendency towards a singularity in mean vorticity. These actual or asymptotic discontinuities are consistent with the conditional mean momentum and scalar transport equations integrated across the interface. Measurements of the fluxes of turbulent kinetic energy and enstrophy...

  20. On the turbulent flow in piston engines: Coupling of statistical theory quantities and instantaneous turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Zentgraf, Florian; Baum, Elias; Dreizler, Andreas [Fachgebiet Reaktive Strömungen und Messtechnik (RSM), Center of Smart Interfaces (CSI), Technische Universität Darmstadt, Jovanka-Bontschits-Straße 2, 64287 Darmstadt (Germany); Böhm, Benjamin [Fachgebiet Energie und Kraftwerkstechnik (EKT), Technische Universität Darmstadt, Jovanka-Bontschits-Straße 2, 64287 Darmstadt (Germany); Peterson, Brian, E-mail: brian.peterson@ed.ac.uk [Department of Mechanical Engineering, School of Engineering, Institute for Energy Systems, University of Edinburgh, The King’s Buildings, Mayfield Road, Edinburgh EH9 3JL, Scotland (United Kingdom)

    2016-04-15

    Planar particle image velocimetry (PIV) and tomographic PIV (TPIV) measurements are utilized to analyze turbulent statistical theory quantities and the instantaneous turbulence within a single-cylinder optical engine. Measurements are performed during the intake and mid-compression stroke at 800 and 1500 RPM. TPIV facilitates the evaluation of spatially resolved Reynolds stress tensor (RST) distributions, anisotropic Reynolds stress invariants, and instantaneous turbulent vortical structures. The RST analysis describes distributions of individual velocity fluctuation components that arise from unsteady turbulent flow behavior as well as cycle-to-cycle variability (CCV). A conditional analysis, for which instantaneous PIV images are sampled by their tumble center location, reveals that CCV and turbulence have similar contributions to RST distributions at the mean tumble center, but turbulence is dominant in regions peripheral to the tumble center. Analysis of the anisotropic Reynolds stress invariants reveals the spatial distribution of axisymmetric expansion, axisymmetric contraction, and 3D isotropy within the cylinder. Findings indicate that the mid-compression flow exhibits a higher tendency toward 3D isotropy than the intake flow. A novel post-processing algorithm is utilized to classify the geometry of instantaneous turbulent vortical structures and evaluate their frequency of occurrence within the cylinder. Findings are coupled with statistical theory quantities to provide a comprehensive understanding of the distribution of turbulent velocity components, the distribution of anisotropic states of turbulence, and compare the turbulent vortical flow distribution that is theoretically expected to what is experimentally observed. The analyses reveal requisites of important turbulent flow quantities and discern their sensitivity to the local flow topography and engine operation.
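    The anisotropy-invariant analysis described in this record can be sketched numerically: from a Reynolds stress tensor one forms the normalized anisotropy tensor b_ij and its invariants, which locate the turbulence state between isotropy and the axisymmetric limits. The sample stress tensor below is an arbitrary illustration, not data from the engine measurements.

    ```python
    import numpy as np

    def anisotropy_invariants(R):
        """Return the anisotropy tensor b and its invariants for a 3x3
        Reynolds stress tensor R, using
        b_ij = R_ij / R_kk - delta_ij / 3,  II = b_ij b_ji,  III = b_ij b_jk b_ki."""
        b = R / np.trace(R) - np.eye(3) / 3.0
        II = np.trace(b @ b)
        III = np.trace(b @ b @ b)
        return b, II, III

    # Illustrative axisymmetric stress state: one dominant fluctuation component.
    R = np.diag([2.0, 1.0, 1.0])
    b, II, III = anisotropy_invariants(R)
    # Isotropic turbulence (R proportional to the identity) gives II = III = 0;
    # nonzero invariants measure the departure from 3D isotropy.
    ```

    Plotting III against II for every measurement point yields the classical anisotropy-invariant (Lumley triangle) map used to distinguish axisymmetric expansion, axisymmetric contraction, and 3D isotropy.
    
    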

  1. The severity of an earthquake

    Science.gov (United States)

    ,

    1997-01-01

    The severity of an earthquake can be expressed in terms of both intensity and magnitude. However, the two terms are quite different, and they are often confused. Intensity is based on the observed effects of ground shaking on people, buildings, and natural features. It varies from place to place within the disturbed region depending on the location of the observer with respect to the earthquake epicenter. Magnitude is related to the amount of seismic energy released at the hypocenter of the earthquake. It is based on the amplitude of the earthquake waves recorded on instruments.
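    The magnitude-energy connection mentioned here is commonly quantified by the Gutenberg-Richter relation log10(E) = 1.5*Ms + 4.8 (E in joules). The short sketch below uses that standard textbook relation, which is not stated in this record, to show why one magnitude unit corresponds to roughly a 32-fold jump in radiated energy.

    ```python
    def seismic_energy_joules(magnitude):
        """Radiated seismic energy from surface-wave magnitude via the
        Gutenberg-Richter relation log10(E) = 1.5*Ms + 4.8 (E in joules)."""
        return 10.0 ** (1.5 * magnitude + 4.8)

    # Each whole-magnitude step multiplies radiated energy by 10**1.5 ~ 31.6.
    ratio_per_unit = seismic_energy_joules(6.0) / seismic_energy_joules(5.0)
    ```

    This is one reason magnitude and intensity diverge: a unit change in the logarithmic magnitude scale implies an enormous change in released energy, while felt intensity also depends on distance, depth, and local site conditions.
    
    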

  2. Thoracic Injuries in earthquake-related versus non-earthquake-related trauma patients: differentiation via Multi-detector Computed Tomography

    Directory of Open Access Journals (Sweden)

    Zhi-hui Dong

    2011-01-01

    PURPOSE: Massive earthquakes are harmful to humankind. This study of a historical cohort aimed to investigate the difference between earthquake-related crush thoracic traumas and thoracic traumas unrelated to earthquakes using multi-detector computed tomography (CT). METHODS: We retrospectively compared an earthquake-exposed cohort of 215 thoracic trauma crush victims of the Sichuan earthquake to a cohort of 215 non-earthquake-related thoracic trauma patients, focusing on the lesions and coexisting injuries to the thoracic cage and the pulmonary parenchyma and pleura using multi-detector CT. RESULTS: The incidence of rib fracture was elevated in the earthquake-exposed cohort (143 vs. 66 patients in the non-earthquake-exposed cohort; risk ratio (RR) = 2.2; p<0.001). Among these patients, those with more than 3 fractured ribs (106/143 vs. 41/66 patients; RR = 1.2; p<0.05) or flail chest (45/143 vs. 11/66 patients; RR = 1.9; p<0.05) were more frequently seen in the earthquake cohort. Earthquake-related crush injuries more frequently resulted in bilateral rib fractures (66/143 vs. 18/66 patients; RR = 1.7; p<0.01). Additionally, the incidence of non-rib fracture was higher in the earthquake cohort (85 vs. 60 patients; RR = 1.4; p<0.01). Pulmonary parenchymal and pleural injuries were more frequently seen in earthquake-related crush injuries (117 vs. 80 patients, RR = 1.5 for parenchymal injuries, and 146 vs. 74 patients, RR = 2.0 for pleural injuries; p<0.001). Non-rib fractures and pulmonary parenchymal and pleural injuries had a significant positive correlation with rib fractures in these two cohorts. CONCLUSIONS: Thoracic crush traumas resulting from the earthquake were life-threatening, with a high incidence of bony thoracic fractures. The ribs were frequently involved in bilateral and severe types of fractures, which were accompanied by non-rib fractures and pulmonary parenchymal and pleural injuries.
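    The risk ratios reported in this abstract can be reproduced directly from its stated counts. The short check below recomputes two of them (rib fracture and flail chest) as the incidence in the earthquake cohort divided by the incidence in the control cohort.

    ```python
    def risk_ratio(exposed_cases, exposed_total, control_cases, control_total):
        """Risk ratio: incidence proportion in the exposed cohort divided by
        the incidence proportion in the control (unexposed) cohort."""
        return (exposed_cases / exposed_total) / (control_cases / control_total)

    # Rib fracture: 143 of 215 earthquake patients vs. 66 of 215 controls.
    rr_rib = risk_ratio(143, 215, 66, 215)     # ~2.2, as reported
    # Flail chest among rib-fracture patients: 45/143 vs. 11/66.
    rr_flail = risk_ratio(45, 143, 11, 66)     # ~1.9, as reported
    ```

    Because both cohorts have 215 patients, the rib-fracture RR reduces to 143/66; the flail-chest comparison uses the rib-fracture subgroups as denominators, matching the abstract.
    
    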

  3. Earthquake prediction by Kiana Method

    International Nuclear Information System (INIS)

    Kianoosh, H.; Keypour, H.; Naderzadeh, A.; Motlagh, H.F.

    2005-01-01

    Earthquake prediction has been one of mankind's earliest desires, and scientists have worked hard to predict earthquakes for a long time. The results of these efforts can generally be divided into two methods of prediction: 1) the statistical method, and 2) the empirical method. In the first method, earthquakes are predicted using statistics and probabilities, while the second method utilizes a variety of precursors. The latter method is time consuming and more costly; however, neither method has fully satisfied this desire up to now. In this paper a new method, entitled the 'Kiana Method', is introduced for earthquake prediction. This method offers more accurate results at lower cost compared to other conventional methods. In the Kiana method, the electrical and magnetic precursors are measured in an area. The time and magnitude of a future earthquake are then calculated using electrical formulas, in particular those for electrical capacitors. In this method, daily measurement of electrical resistance in an area indicates whether the area is prone to a future earthquake; if the result is positive, the occurrence time and magnitude can be estimated from the measured quantities. This paper explains the procedure and details of this prediction method. (authors)

  4. Towards CFD modeling of turbulent pipeline material transportation

    Science.gov (United States)

    Shahirpour, Amir; Herzog, Nicoleta; Egbers, Cristoph

    2013-04-01

    Safe and financially efficient pipeline transportation of carbon dioxide is a critical issue in the developing field of CCS technology. In this part of the process, carbon dioxide is transported as a dense fluid via pipes with a diameter of 1.5 m and an entry pressure of 150 bar, at a Reynolds number of 10⁷ and a viscosity of 8×10⁻⁵ Pa·s [1]. The presence of large- and small-scale structures in the pipeline, the high Reynolds numbers at which CO2 must be transported, and the three-dimensional turbulence caused by local geometrical modifications all increase the importance of simulating turbulent material transport through the individual components of the CO2 chain process. In this study, incompressible turbulent channel flow and pipe flow have been modeled using OpenFOAM, an open-source CFD software package. In the first step, a turbulent channel flow has been simulated using LES at a shear Reynolds number of 395. A simple geometry has been chosen with cyclic inlet and outlet boundary conditions to simulate a fully developed flow. The mesh is gradually refined towards the wall to provide sufficiently small values of the wall coordinate (y+) close to the wall. A grid resolution study has been conducted for the One-Equation model, and the accuracy of the results is analyzed with respect to grid smoothness in order to reach an optimized resolution for the subsequent simulations. Furthermore, three LES models, One-Equation, Smagorinsky and Dynamic Smagorinsky, are applied on a grid resolution of (60 × 100 × 80) in the (x, y, z) directions. The results are validated against the DNS carried out by Moser et al. [2] for a similar geometry, using the logarithmic velocity profile (U+) and the Reynolds stress tensor components. In the second step the same flow is modeled using Reynolds-averaged methods: several RANS models, such as k-epsilon and Launder-Reece-Rodi, are applied and validated against the DNS and LES results in a similar fashion. In the most recent step, it has been intended
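
The near-wall refinement described above can be sized from the wall coordinate; a minimal sketch assuming the standard definitions y+ = y·u_τ/ν and Re_τ = u_τ·h/ν (the function is illustrative, not from the study):

```python
# Estimating the wall-normal height of the first grid cell needed to reach a
# target y+ in channel-flow LES. With the friction Reynolds number
# Re_tau = u_tau * h / nu and the identity y+ = y * u_tau / nu, it follows
# that y / h = y+ / Re_tau.

def first_cell_height(y_plus_target, re_tau):
    """First-cell height as a fraction of the channel half-height h."""
    return y_plus_target / re_tau

# Re_tau = 395 as in the simulations described above; y+ ~ 1 resolves the wall
print(first_cell_height(1.0, 395.0))  # -> ~0.00253 (y/h)
```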

  5. Sense of Community and Depressive Symptoms among Older Earthquake Survivors Following the 2008 Earthquake in Chengdu China

    Science.gov (United States)

    Li, Yawen; Sun, Fei; He, Xusong; Chan, Kin Sun

    2011-01-01

    This study examined the impact of an earthquake as well as the role of sense of community as a protective factor against depressive symptoms among older Chinese adults who survived an 8.0 magnitude earthquake in 2008. A household survey of a random sample was conducted 3 months after the earthquake and 298 older earthquake survivors participated…

  6. Precisely locating the Klamath Falls, Oregon, earthquakes

    Science.gov (United States)

    Qamar, A.; Meagher, K.L.

    1993-01-01

    The Klamath Falls earthquakes on September 20, 1993, were the largest earthquakes centered in Oregon in more than 50 yrs. Only the magnitude 5.75 Milton-Freewater earthquake in 1936, which was centered near the Oregon-Washington border and felt in an area of about 190,000 sq km, compares in size with the recent Klamath Falls earthquakes. Although the 1993 earthquakes surprised many local residents, geologists have long recognized that strong earthquakes may occur along potentially active faults that pass through the Klamath Falls area. These faults are geologically related to similar faults in Oregon, Idaho, and Nevada that occasionally spawn strong earthquakes

  7. The mechanism of earthquake

    Science.gov (United States)

    Lu, Kunquan; Cao, Zexian; Hou, Meiying; Jiang, Zehui; Shen, Rong; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    The physical mechanism of earthquakes remains a challenging issue to be clarified. Seismologists used to attribute shallow earthquakes to the elastic rebound of crustal rocks. The seismic energy calculated following elastic rebound theory and the data of experimental results on rocks, however, shows a large discrepancy with measurement, a fact that has been dubbed "the heat flow paradox". For intermediate-focus and deep-focus earthquakes, both occurring in the region of the mantle, there is no reasonable explanation either. This paper discusses the physical mechanism of earthquakes from a new perspective, starting from the fact that both the crust and the mantle are discrete collective systems of matter with slow dynamics, as well as from the basic principles of physics, especially some new concepts of condensed matter physics that have emerged in recent years. (1) Stress distribution in the earth's crust: Without taking the tectonic force into account, according to the rheological principle that "everything flows", the normal stress and transverse stress must balance under the effect of gravitational pressure over a long period of time, so no differential stress is to be expected in the original crustal rocks. The tectonic force is successively transferred and accumulated via stick-slip motions of rock blocks, which squeeze the fault gouge and in turn load other rock blocks. The superposition of this additional lateral tectonic force and the original stress gives rise to the real-time stress in crustal rocks. The mechanical characteristics of fault gouge differ from those of rocks, as it consists of granular matter: the elastic moduli of fault gouges are much smaller than those of rocks, and they become larger with increasing pressure. This peculiarity of the fault gouge leads to a tectonic force that increases with depth in a nonlinear fashion. The distribution and variation of the tectonic stress in the crust are specified. (2) The

  8. Turbulence-Free Double-slit Interferometer

    Science.gov (United States)

    Smith, Thomas A.; Shih, Yanhua

    2018-02-01

    Optical turbulence can be detrimental for optical observations. For instance, atmospheric turbulence may reduce the visibility or completely blur out the interference produced by an interferometer in open air. However, a simple two-photon interference theory based on Einstein's granularity picture of light makes a turbulence-free interferometer possible; i.e., any refraction index, length, or phase variations along the optical paths of the interferometer do not have any effect on its interference. Applying this mechanism, the reported experiment demonstrates a two-photon double-slit interference that is insensitive to atmospheric turbulence. The turbulence-free mechanism and especially the turbulence-free interferometer would be helpful in optical observations that require high sensitivity and stability such as for gravitational-wave detection.

  9. Turkish Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquake and arrange their life styles in compliance with this, damage they will suffer will reduce to that extent. In particular, a good training regarding earthquake to be received in primary schools is considered…

  10. Large earthquakes and creeping faults

    Science.gov (United States)

    Harris, Ruth A.

    2017-01-01

    Faults are ubiquitous throughout the Earth's crust. The majority are silent for decades to centuries, until they suddenly rupture and produce earthquakes. With a focus on shallow continental active-tectonic regions, this paper reviews a subset of faults that have a different behavior. These unusual faults slowly creep for long periods of time and produce many small earthquakes. The presence of fault creep and the related microseismicity helps illuminate faults that might not otherwise be located in fine detail, but there is also the question of how creeping faults contribute to seismic hazard. It appears that well-recorded creeping fault earthquakes of up to magnitude 6.6 that have occurred in shallow continental regions produce similar fault-surface rupture areas and similar peak ground shaking as their locked fault counterparts of the same earthquake magnitude. The behavior of much larger earthquakes on shallow creeping continental faults is less well known, because there is a dearth of comprehensive observations. Computational simulations provide an opportunity to fill the gaps in our understanding, particularly of the dynamic processes that occur during large earthquake rupture and arrest.

  11. Global earthquake fatalities and population

    Science.gov (United States)

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
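
The nonstationary Poisson model above, with rate proportional to world population, can be sketched as follows; the rate constant k and the linear population trajectory are illustrative assumptions, not values from the paper:

```python
# Toy sketch of a nonstationary Poisson model for catastrophic earthquakes
# (>100,000 deaths): the event rate is taken proportional to world population,
# so the expected century count is the time integral of the rate. The rate
# constant k below is an assumption chosen for illustration only.

def expected_events(populations_by_year, k):
    """Expected event count = integral of rate(t) dt ~ k * sum of yearly population."""
    return k * sum(populations_by_year)

# Linear population growth from 6.1 to 10.1 billion over the 21st century
pops = [6.1 + (10.1 - 6.1) * t / 100.0 for t in range(101)]
k = 0.01  # assumed events per billion-person-year (illustrative)
print(round(expected_events(pops, k), 1))  # -> 8.2 expected events (illustrative)
```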

  12. Instruction system upon occurrence of earthquakes

    International Nuclear Information System (INIS)

    Inagaki, Masakatsu; Morikawa, Matsuo; Suzuki, Satoshi; Fukushi, Naomi.

    1987-01-01

    Purpose: To enable rapid re-starting of a nuclear reactor after earthquakes by informing various properties of encountered earthquake to operators and properly displaying the state of damages in comparison with designed standard values of facilities. Constitution: Even in a case where the maximum accelerations due to the movements of earthquakes encountered exceed designed standard values, it may be considered such a case that equipments still remain intact depending on the wave components of the seismic movements and the vibration properties inherent to the equipments. Taking notice of the fact, the instruction device comprises a system that indicates the relationship between the seismic waveforms of earthquakes being encountered and the scram setting values, a system for indicating the comparison between the floor response spectrum of the seismic waveforms of the encountered earthquakes and the designed floor response spectrum used for the design of the equipments and a system for indicating those equipments requiring inspection after the earthquakes. Accordingly, it is possible to improve the operationability upon scram of a nuclear power plant undergoing earthquakes and improve the power saving and safety by clearly defining the inspection portion after the earthquakes. (Kawakami, Y.)

  13. How fault geometry controls earthquake magnitude

    Science.gov (United States)

    Bletery, Q.; Thomas, A.; Karlstrom, L.; Rempel, A. W.; Sladen, A.; De Barros, L.

    2016-12-01

    Recent large megathrust earthquakes, such as the Mw9.3 Sumatra-Andaman earthquake in 2004 and the Mw9.0 Tohoku-Oki earthquake in 2011, astonished the scientific community. The first event occurred in a relatively low-convergence-rate subduction zone where events of its size were unexpected. The second event involved 60 m of shallow slip in a region thought to be aseismically creeping and hence incapable of hosting very large magnitude earthquakes. These earthquakes highlight gaps in our understanding of mega-earthquake rupture processes and the factors controlling their global distribution. Here we show that gradients in dip angle exert a primary control on mega-earthquake occurrence. We calculate the curvature along the major subduction zones of the world and show that past mega-earthquakes occurred on flat (low-curvature) interfaces. A simplified analytic model demonstrates that shear strength heterogeneity increases with curvature. Stress loading on flat megathrusts is more homogeneous and hence more likely to be released simultaneously over large areas than on highly curved faults. Therefore, the absence of asperities on large faults might counter-intuitively be a source of higher hazard.
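
The curvature calculation mentioned above can be sketched for a one-dimensional depth profile using the standard formula κ = z''/(1 + z'²)^(3/2); the profile below is a synthetic circular arc used only as a sanity check, not real slab geometry:

```python
# Illustrative curvature calculation for a 1-D depth profile z(x) of a
# megathrust interface, using finite differences via numpy.gradient.
import numpy as np

def curvature(x, z):
    """Signed curvature kappa = z'' / (1 + z'^2)^(3/2) along the profile."""
    dz = np.gradient(z, x)
    d2z = np.gradient(dz, x)
    return d2z / (1.0 + dz**2) ** 1.5

# Sanity check on a circular arc of radius R: curvature should be ~1/R
R = 50.0  # km (synthetic, not real slab geometry)
x = np.linspace(-10.0, 10.0, 2001)
z = R - np.sqrt(R**2 - x**2)
kappa = curvature(x, z)
print(kappa[1000])  # value at the arc's center; expect ~0.02 = 1/R
```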

  14. Destabilizing turbulence in pipe flow

    Science.gov (United States)

    Kühnen, Jakob; Song, Baofang; Scarselli, Davide; Budanur, Nazmi Burak; Riedl, Michael; Willis, Ashley P.; Avila, Marc; Hof, Björn

    2018-04-01

    Turbulence is the major cause of friction losses in transport processes and it is responsible for a drastic drag increase in flows over bounding surfaces. While much effort is invested into developing ways to control and reduce turbulence intensities1-3, so far no methods exist to altogether eliminate turbulence if velocities are sufficiently large. We demonstrate for pipe flow that appropriate distortions to the velocity profile lead to a complete collapse of turbulence and subsequently friction losses are reduced by as much as 90%. Counterintuitively, the return to laminar motion is accomplished by initially increasing turbulence intensities or by transiently amplifying wall shear. Since neither the Reynolds number nor the shear stresses decrease (the latter often increase), these measures are not indicative of turbulence collapse. Instead, an amplification mechanism4,5 measuring the interaction between eddies and the mean shear is found to set a threshold below which turbulence is suppressed beyond recovery.
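
For context on the reported ~90% reduction in friction losses, a sketch comparing standard laminar and turbulent (Blasius) friction-factor correlations; these are textbook results, not the paper's measurements:

```python
# Why relaminarization cuts friction: compare the Darcy friction factor for
# laminar pipe flow (64/Re) with the turbulent Blasius correlation
# (0.316 * Re^-0.25). Both are standard correlations, not data from the paper.

def friction_laminar(re):
    return 64.0 / re

def friction_blasius(re):
    return 0.316 * re ** -0.25

re = 1.0e5
reduction = 1.0 - friction_laminar(re) / friction_blasius(re)
print(f"friction reduction on relaminarization: {reduction:.0%}")  # -> ~96%
```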

  15. Financial history and financial economics

    OpenAIRE

    Turner, John D.

    2014-01-01

    This essay looks at the bidirectional relationship between financial history and financial economics. It begins by giving a brief history of financial economics by outlining the main topics of interest to financial economists. It then documents and explains the increasing influence of financial economics upon financial history, and warns of the dangers of applying financial economics unthinkingly to the study of financial history. The essay proceeds to highlight the many insights that financi...

  16. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.
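
A common functional form for an empirical, country-calibrated fatality rate is a two-parameter lognormal function of shaking intensity; a hedged sketch in that spirit (the θ and β values below are hypothetical placeholders, not calibrated PAGER parameters):

```python
# Sketch of a lognormal fatality-rate curve as a function of shaking
# intensity S: rate(S) = Phi(ln(S / theta) / beta), with Phi the standard
# normal CDF. theta and beta here are hypothetical, for illustration only.
import math

def fatality_rate(intensity, theta, beta):
    """Fraction of exposed population killed at a given shaking intensity."""
    return 0.5 * (1.0 + math.erf(math.log(intensity / theta) / (beta * math.sqrt(2.0))))

theta, beta = 12.0, 0.2  # hypothetical calibration for some country
# At S = theta the rate is exactly 0.5 by construction of the CDF
print(fatality_rate(12.0, theta, beta))  # -> 0.5
```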

  17. PDF modelling and particle-turbulence interaction of turbulent spray flames

    NARCIS (Netherlands)

    Beishuizen, N.A.

    2008-01-01

    Turbulent spray flames can be found in many applications, such as Diesel engines, rocket engines and power plants. The many practical applications are a motivation to investigate the physical phenomena occurring in turbulent spray flames in detail in order to be able to understand, predict and

  18. Special issue of selected papers from the second UK-Japan bilateral Workshop and First ERCOFTAC Workshop on Turbulent Flows Generated/Designed in Multiscale/Fractal Ways, London, March 2012

    Science.gov (United States)

    Laizet, Sylvain; Sakai, Yasuhiko; Christos Vassilicos, J.

    2013-12-01

    This special issue of Fluid Dynamics Research includes nine papers which are based on nine of the presentations at the Second UK-Japan bilateral Workshop and First ERCOFTAC Workshop on 'Turbulent flows generated/designed in multiscale/fractal ways: fundamentals and applications' held from 26 to 27 March 2012 at Imperial College London, UK. The research area of fractal-generated turbulent flows started with a chapter published in 2001 in one of the conference proceedings which came out of the 1999 Isaac Newton Institute 6 month Programme on Turbulence in Cambridge (UK). However, the first results which formed the basis of much of the work reported in this special issue started appearing from 2007 onwards and progress since then could perhaps be described as not insignificant. Research in this area has resulted in the following six notable advances: (a) the definition of two new length-scales characterizing grid-generated turbulence; (b) enhanced and energy-efficient stirring and scalar transfer by fractal grid and fractal openings/flanges with applications, in particular, to improved turbulence generation for combustion; (c) the non-equilibrium turbulent dissipation law; (d) non-equilibrium axisymmetric wake laws; (e) insights into the dependence of drag forces and vortex shedding on the fractal geometry of fractal objects and simulation methods for the calculation of drag of fractal trees; and (f) the invention and successful proof of concept of fractal spoilers and fractal fences. The present special issue contains papers directly related to these advances and can be seen as a reflection of the current research in the field of fractal-generated turbulent flows and their differences and commonalities with other turbulent flows. The financial support from the Japan Society for the Promotion of Science has been decisive for the organization and success of this workshop. 
We are also grateful to ERCOFTAC who put in place the EU-wide Special Interest Group on multiscale

  19. Measuring the size of an earthquake

    Science.gov (United States)

    Spence, W.; Sipkin, S.A.; Choy, G.L.

    1989-01-01

    Earthquakes range broadly in size. A rock-burst in an Idaho silver mine may involve the fracture of 1 meter of rock; the 1965 Rat Island earthquake in the Aleutian arc involved a 650-kilometer length of the Earth's crust. Earthquakes can be even smaller and even larger. If an earthquake is felt or causes perceptible surface damage, then its intensity of shaking can be subjectively estimated. But many large earthquakes occur in oceanic areas or at great focal depths and are either simply not felt or their felt pattern does not really indicate their true size.
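
As background (standard seismology, not from this article), the moment magnitude scale provides a size measure that does not rely on felt intensity; a minimal sketch with the seismic moment M0 in dyne·cm:

```python
# Moment magnitude Mw from seismic moment M0 (dyne·cm), and the inverse:
#   Mw = (2/3) * log10(M0) - 10.7
# This standard relation sidesteps the subjectivity of felt-intensity
# estimates discussed above.
import math

def moment_magnitude(m0_dyne_cm):
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

def seismic_moment(mw):
    return 10.0 ** (1.5 * (mw + 10.7))

# Round-trip check for a Mw 9.0 event
print(round(moment_magnitude(seismic_moment(9.0)), 6))  # -> 9.0
```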

  20. Earthquakes-Rattling the Earth's Plumbing System

    Science.gov (United States)

    Sneed, Michelle; Galloway, Devin L.; Cunningham, William L.

    2003-01-01

    Hydrogeologic responses to earthquakes have been known for decades, and have occurred both close to, and thousands of miles from earthquake epicenters. Water wells have become turbid, dry or begun flowing, discharge of springs and ground water to streams has increased and new springs have formed, and well and surface-water quality have become degraded as a result of earthquakes. Earthquakes affect our Earth’s intricate plumbing system—whether you live near the notoriously active San Andreas Fault in California, or far from active faults in Florida, an earthquake near or far can affect you and the water resources you depend on.

  1. Real-time earthquake source imaging: An offline test for the 2011 Tohoku earthquake

    Science.gov (United States)

    Zhang, Yong; Wang, Rongjiang; Zschau, Jochen; Parolai, Stefano; Dahm, Torsten

    2014-05-01

    In recent decades, great efforts have been expended in real-time seismology aimed at earthquake and tsunami early warning. One of the most important issues is the real-time assessment of earthquake rupture processes using near-field seismogeodetic networks. Currently, earthquake early warning systems are mostly based on a rapid estimate of the P-wave magnitude, which generally carries large uncertainties and a known saturation problem. In the case of the 2011 Mw9.0 Tohoku earthquake, JMA (Japan Meteorological Agency) released the first warning of the event with M7.2 after 25 s. Subsequent magnitude updates even decreased to M6.3-6.6, and the estimate stabilized at M8.1 only after about two minutes. This consequently led to underestimated tsunami heights. Using the newly developed Iterative Deconvolution and Stacking (IDS) method for automatic source imaging, we demonstrate an offline test of the real-time analysis of the strong-motion and GPS seismograms of the 2011 Tohoku earthquake. The results show that it would theoretically have been possible to image the complex rupture process of the 2011 Tohoku earthquake automatically soon after, or even during, the rupture. In general, what had happened on the fault could be robustly imaged with a time delay of about 30 s using either the strong-motion (KiK-net) or the GPS (GEONET) real-time data. This implies that the new real-time source imaging technique can help reduce false and missing warnings, and should therefore play an important role in future tsunami early warning and earthquake rapid response systems.

  2. Electron acceleration by turbulent plasmoid reconnection

    Science.gov (United States)

    Zhou, X.; Büchner, J.; Widmer, F.; Muñoz, P. A.

    2018-04-01

    In space and astrophysical plasmas, like in planetary magnetospheres, as that of Mercury, energetic electrons are often found near current sheets, which hint at electron acceleration by magnetic reconnection. Unfortunately, electron acceleration by reconnection is not well understood yet, in particular, acceleration by turbulent plasmoid reconnection. We have investigated electron acceleration by turbulent plasmoid reconnection, described by MHD simulations, via test particle calculations. In order to avoid resolving all relevant turbulence scales down to the dissipation scales, a mean-field turbulence model is used to describe the turbulence of sub-grid scales and their effects via a turbulent electromotive force (EMF). The mean-field model describes the turbulent EMF as a function of the mean values of current density, vorticity, magnetic field as well as of the energy, cross-helicity, and residual helicity of the turbulence. We found that, mainly around X-points of turbulent reconnection, strongly enhanced localized EMFs most efficiently accelerated electrons and caused the formation of power-law spectra. Magnetic-field-aligned EMFs, caused by the turbulence, dominate the electron acceleration process. Scaling the acceleration processes to parameters of the Hermean magnetotail, electron energies up to 60 keV can be reached by turbulent plasmoid reconnection through the thermal plasma.

  3. Earthquake forewarning in the Cascadia region

    Science.gov (United States)

    Gomberg, Joan S.; Atwater, Brian F.; Beeler, Nicholas M.; Bodin, Paul; Davis, Earl; Frankel, Arthur; Hayes, Gavin P.; McConnell, Laura; Melbourne, Tim; Oppenheimer, David H.; Parrish, John G.; Roeloffs, Evelyn A.; Rogers, Gary D.; Sherrod, Brian; Vidale, John; Walsh, Timothy J.; Weaver, Craig S.; Whitmore, Paul M.

    2015-08-10

    This report, prepared for the National Earthquake Prediction Evaluation Council (NEPEC), is intended as a step toward improving communications about earthquake hazards between information providers and users who coordinate emergency-response activities in the Cascadia region of the Pacific Northwest. NEPEC charged a subcommittee of scientists with writing this report about forewarnings of increased probabilities of a damaging earthquake. We begin by clarifying some terminology; a “prediction” refers to a deterministic statement that a particular future earthquake will or will not occur. In contrast to the 0- or 100-percent likelihood of a deterministic prediction, a “forecast” describes the probability of an earthquake occurring, which may range from greater than 0 to less than 100 percent. Forecasts may be based on observed processes or conditions, which may include increased rates of M>4 earthquakes on the plate interface north of the Mendocino region

  4. Links Between Earthquake Characteristics and Subducting Plate Heterogeneity in the 2016 Pedernales Ecuador Earthquake Rupture Zone

    Science.gov (United States)

    Bai, L.; Mori, J. J.

    2016-12-01

    The collision between the Indian and Eurasian plates formed the Himalayas, the largest orogenic belt on the Earth. The entire region accommodates shallow earthquakes, while intermediate-depth earthquakes are concentrated at the eastern and western Himalayan syntaxes. Here we investigate the focal depths, fault plane solutions, and source rupture processes for three earthquake sequences located in the western, central and eastern regions of the Himalayan orogenic belt. The Pamir-Hindu Kush region is located at the western Himalayan syntaxis and is characterized by extreme shortening of the upper crust and strong interaction of various layers of the lithosphere. Many shallow earthquakes occur on the Main Pamir Thrust at focal depths shallower than 20 km, while intermediate-depth earthquakes are mostly located below 75 km. Large intermediate-depth earthquakes occur frequently at the western Himalayan syntaxis, about every 10 years on average. The 2015 Nepal earthquake is located in the central Himalayas. It is a typical megathrust earthquake that occurred on the shallow portion of the Main Himalayan Thrust (MHT). Many of the aftershocks are located above the MHT and illuminate faulting structures in the hanging wall with dip angles steeper than the MHT. These observations provide new constraints on the collision and uplift processes of the Himalayan orogenic belt. The Indo-Burma region is located south of the eastern Himalayan syntaxis, where the strike of the plate boundary changes abruptly from nearly east-west at the Himalayas to nearly north-south at the Burma Arc. The Burma arc subduction zone is a typical oblique plate convergence zone. Its eastern boundary is the north-south striking dextral Sagaing fault, which hosts many shallow earthquakes with focal depths of less than 25 km. In contrast, intermediate-depth earthquakes along the subduction zone reflect east-west trending reverse faulting.

  5. Turbulence new approaches

    CERN Document Server

    Belotserkovskii, OM; Chechetkin, VM

    2005-01-01

    The authors present the results of numerical experiments carried out to examine the problem of the development of turbulence and convection. On the basis of the results, they propose a physical model of the development of turbulence. Numerical algorithms and difference schemes for carrying out numerical experiments in hydrodynamics are proposed. Original algorithms suitable for calculating the development of turbulence and convection under different conditions, even in astrophysical objects, are presented. The results of numerical modelling of several important phenomena of both fundamental and applied importance are described.

  6. Non-gaussian turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Hoejstrup, J [NEG Micon Project Development A/S, Randers (Denmark); Hansen, K S [Denmarks Technical Univ., Dept. of Energy Engineering, Lyngby (Denmark); Pedersen, B J [VESTAS Wind Systems A/S, Lem (Denmark); Nielsen, M [Risoe National Lab., Wind Energy and Atmospheric Physics, Roskilde (Denmark)

    1999-03-01

    The pdfs of atmospheric turbulence have somewhat wider tails than a Gaussian, especially for accelerations, whereas velocities are close to Gaussian. This behaviour is investigated using data from a large WEB-database in order to quantify the amount of non-Gaussianity. Models for non-Gaussian turbulence have been developed, by which artificial turbulence can be generated with specified distributions, spectra and cross-correlations. The artificial time series will then be used in load models, and the resulting loads in the Gaussian and non-Gaussian cases will be compared. (au)
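
One way to generate artificial turbulence with a specified spectrum and a specified non-Gaussian marginal, in the spirit of the models described above, is spectral synthesis followed by rank remapping; a sketch under those assumptions (the Laplace target and the -5/3-like spectrum are illustrative choices, not the models from the paper):

```python
# Synthesize a Gaussian series with a target spectrum by inverse FFT, then
# rank-remap its values onto a heavy-tailed target sample (Laplace), so the
# output has the target marginal while keeping the series' rank ordering.
import numpy as np

rng = np.random.default_rng(0)
n = 4096

# Gaussian series with a -5/3-like one-sided spectrum (random phases)
freqs = np.fft.rfftfreq(n, d=1.0)
amps = np.zeros_like(freqs)
amps[1:] = freqs[1:] ** (-5.0 / 6.0)        # amplitude ~ sqrt(spectrum)
phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)
series = np.fft.irfft(amps * np.exp(1j * phases), n)

# Rank-remap onto the heavy-tailed target marginal
target = np.sort(rng.laplace(size=n))
remapped = np.empty(n)
remapped[np.argsort(series)] = target       # k-th smallest -> k-th smallest

# Marginal now matches the target exactly; rank structure is preserved
print(np.allclose(np.sort(remapped), target))  # -> True
```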

  7. Earthquakes; May-June 1982

    Science.gov (United States)

    Person, W.J.

    1982-01-01

    There were four major earthquakes (7.0-7.9) during this reporting period: two struck in Mexico, one in El Salvador, and one in the Kuril Islands. Mexico, El Salvador, and China experienced fatalities from earthquakes.

  8. A correlation for single phase turbulent mixing in square rod arrays under highly turbulent conditions

    International Nuclear Information System (INIS)

    Jeong, Hae Yong; Ha, Kwi Seok; Kwon, Young Min; Chang, Won Pyo; Lee, Yong Bum

    2006-01-01

    The existing experimental data related to the turbulent mixing factor in rod arrays is examined, and a new definition of the turbulent mixing factor is introduced to take into account the turbulent mixing of fluids with various Prandtl numbers. The new definition of the mixing factor is based on the eddy diffusivity of energy. With this definition, it was found that the geometrical parameter δ_ij/D_h correlates the turbulent mixing data better than S/d, which has been used frequently in existing correlations. Based on the experimental data for highly turbulent conditions in square rod arrays, a correlation describing turbulent mixing as dependent on the parameter δ_ij/D_h has been developed. The correlation is insensitive to the Reynolds number, and it takes into account the effect of the turbulent Prandtl number. The proposed correlation predicts reasonable mixing even at a lower S/d ratio

  9. Financial sector taxation: Financial activities tax or financial transaction tax?

    Directory of Open Access Journals (Sweden)

    Danuše Nerudová

    2011-01-01

    Full Text Available The recent financial crisis has revealed the need to improve and ensure the stability of the financial sector, to reduce negative externalities, to ensure a fair and substantial contribution of the financial sector to public finances, and to consolidate public finances. All of those needs represent substantial arguments for a discussion about the introduction of financial sector taxation. The paper discusses two possible schemes of financial sector taxation – the financial transaction tax and the financial activities tax. The aim of the paper is to research the possibility of introducing financial sector taxation, to discuss the pros and cons of the two major candidates – the financial transaction tax and the financial activities tax – and to suggest the candidate suitable for implementation at the EU level. The financial transaction tax is a tool suitable mainly at the global level, for only in that case can it generate sufficient financial resources. From the EU point of view it is considered less suitable, for it bears the risk of reallocation. Therefore the introduction of a financial activities tax at the EU level is considered the better solution for financial sector taxation in the EU, for the financial sector is exempted from value added tax. With respect to the fact that the implementation would represent an innovative approach to financial sector taxation, there is no empirical evidence, and therefore this could be the subject of further research.

  10. Numerical investigation of kinetic turbulence in relativistic pair plasmas - I. Turbulence statistics

    Science.gov (United States)

    Zhdankin, Vladimir; Uzdensky, Dmitri A.; Werner, Gregory R.; Begelman, Mitchell C.

    2018-02-01

    We describe results from particle-in-cell simulations of driven turbulence in collisionless, magnetized, relativistic pair plasma. This physical regime provides a simple setting for investigating the basic properties of kinetic turbulence and is relevant for high-energy astrophysical systems such as pulsar wind nebulae and astrophysical jets. In this paper, we investigate the statistics of turbulent fluctuations in simulations on lattices of up to 1024³ cells and containing up to 2 × 10¹¹ particles. Due to the absence of a cooling mechanism in our simulations, turbulent energy dissipation reduces the magnetization parameter to order unity within a few dynamical times, causing turbulent motions to become sub-relativistic. In the developed stage, our results agree with predictions from magnetohydrodynamic turbulence phenomenology at inertial-range scales, including a power-law magnetic energy spectrum with index near -5/3, scale-dependent anisotropy of fluctuations described by critical balance, lognormal distributions for particle density and internal energy density (related by a 4/3 adiabatic index, as predicted for an ultra-relativistic ideal gas), and the presence of intermittency. We also present possible signatures of a kinetic cascade by measuring power-law spectra for the magnetic, electric and density fluctuations at sub-Larmor scales.

  11. Radon observation for earthquake prediction

    Energy Technology Data Exchange (ETDEWEB)

    Wakita, Hiroshi [Tokyo Univ. (Japan)]

    1998-12-31

    Systematic observation of groundwater radon for the purpose of earthquake prediction began in Japan in late 1973. Continuous observations are conducted at fixed stations using deep wells and springs. During the observation period, significant precursory changes, including those preceding the 1978 Izu-Oshima-kinkai (M7.0) earthquake, as well as numerous coseismic changes were observed. At the time of the 1995 Kobe (M7.2) earthquake, significant changes in chemical components, including radon dissolved in groundwater, were observed near the epicentral region. Precursory changes are presumably caused by permeability changes due to micro-fracturing in basement rock or migration of water from different sources during the preparation stage of earthquakes. Coseismic changes may be caused by seismic shaking and by changes in regional stress. Significant drops of radon concentration in groundwater have been observed after earthquakes at the KSM site. The occurrence of such drops appears to be time-dependent, and possibly reflects changes in the regional stress state of the observation area. The absence of radon drops seems to be correlated with periods of reduced regional seismic activity. Experience accumulated over the past two decades allows us to reach some conclusions: 1) changes in groundwater radon do occur prior to large earthquakes; 2) some sites are particularly sensitive to earthquake occurrence; and 3) the sensitivity changes over time. (author)

  12. Plasma Soliton Turbulence and Statistical Mechanics

    International Nuclear Information System (INIS)

    Treumann, R.A.; Pottelette, R.

    1999-01-01

    Collisionless kinetic plasma turbulence is described approximately in terms of a superposition of non-interacting solitary waves. We discuss the relevance of such a description under astrophysical conditions. Several types of solitary waves may be of interest in this relation as generators of turbulence and turbulent transport. A consistent theory of turbulence can be given only in a few particular cases when the description can be reduced to the Korteweg-de Vries equation or some other simple equation like the Kadomtsev-Petviashvili equation. It turns out that the soliton turbulence is usually energetically harder than the ordinary weakly turbulent plasma description. This implies that interaction of particles with such kinds of turbulence can lead to stronger acceleration than in ordinary turbulence. However, the description in our model is only classical and non-relativistic. Transport in solitary turbulence is most important for drift wave turbulence. Such waves form solitary drift wave vortices which may provide cross-field transport. A more general discussion is given on transport. In a model of Levy flight trapping of particles in solitons (or solitary turbulence) one finds that the residence time of particles in the region of turbulence may be described by a generalized Lorentzian probability distribution. It is shown that under collisionless equilibrium conditions far away from thermal equilibrium such distributions are natural equilibrium distributions. A consistent thermodynamic description of such media can be given in terms of a generalized Lorentzian statistical mechanics and thermodynamics. (author)

  13. Earthquake number forecasts testing

    Science.gov (United States)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for the catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues. The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness
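The overdispersion argument can be illustrated with synthetic clustered counts; the NBD parameters and sample size below are invented, not fitted to the GCMT or PDE catalogues:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical earthquake counts per time interval; a real test would use
# the catalogue numbers discussed in the abstract.
counts = rng.negative_binomial(n=4, p=0.25, size=5000)

mean, var = counts.mean(), counts.var()
overdispersion = var / mean   # equals 1 for a Poisson process

def skewness(x):
    m = x.mean()
    return ((x - m) ** 3).mean() / x.std() ** 3

# A Poisson law with the same mean predicts skewness 1/sqrt(mean);
# clustered, NBD-like counts systematically exceed it.
print(overdispersion > 1.5, skewness(counts) > 1 / np.sqrt(mean))
```

The variance-to-mean ratio and the moment comparison are the simplest versions of the checks the abstract describes; the CSEP number test itself involves the full count distribution, not just its moments.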

  14. DIFFICULTIES RELATED TO THE FINANCIAL POSITION REPORTING INTO THE PUBLIC SECTOR IN ROMANIA

    OpenAIRE

    Aurelia ŞTEFĂNESCU

    2014-01-01

    Within the context of a turbulent economic environment with an impact on the vulnerability of public sector entities, the stakeholders' information needs are focused on the assessment of liquidity and solvency, on the sustainability of service offering, as well as on the capacity of the entities to respond to a dynamic environment in terms of cost, quality and continuity. In this respect, the current study has as its objective to identify the difficulties of reporting the financial posit...

  15. 33 CFR 222.4 - Reporting earthquake effects.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Reporting earthquake effects. 222..., DEPARTMENT OF DEFENSE ENGINEERING AND DESIGN § 222.4 Reporting earthquake effects. (a) Purpose. This... significant earthquakes. It primarily concerns damage surveys following the occurrences of earthquakes. (b...

  16. Earthquakes - a danger to deep-lying repositories?

    International Nuclear Information System (INIS)

    2012-03-01

    This booklet issued by the Swiss National Cooperative for the Disposal of Radioactive Waste NAGRA takes a look at geological factors concerning earthquakes and the safety of deep-lying repositories for nuclear waste. The geological processes involved in the occurrence of earthquakes are briefly looked at and the definitions for magnitude and intensity of earthquakes are discussed. Examples of damage caused by earthquakes are given. The earthquake situation in Switzerland is looked at and the effects of earthquakes on sub-surface structures and deep-lying repositories are discussed. Finally, the ideas proposed for deep-lying geological repositories for nuclear wastes are discussed

  17. Evidence for Ancient Mesoamerican Earthquakes

    Science.gov (United States)

    Kovach, R. L.; Garcia, B.

    2001-12-01

    Evidence for past earthquake damage at Mesoamerican ruins is often overlooked because of the invasive effects of tropical vegetation and is usually not considered as a causal factor when restoration and reconstruction of many archaeological sites are undertaken. Yet the proximity of many ruins to zones of seismic activity would argue otherwise. Clues as to the types of damage which should be sought were offered in September 1999 when the M = 7.5 Oaxaca earthquake struck the ruins of Monte Alban, Mexico, where archaeological renovations were underway. More than 20 structures were damaged, 5 of them seriously. Damage features noted were walls out of plumb; fractures in walls, floors, basal platforms and tableros; toppling of columns; and deformation, settling and tumbling of walls. A Modified Mercalli Intensity of VII (ground accelerations 18-34%g) occurred at the site. Within the diffuse landward extension of the Caribbean plate boundary zone, M = 7+ earthquakes occur with repeat times of hundreds of years, arguing that many Maya sites were subjected to earthquakes. Damage to re-erected and reinforced stelae, walls, and buildings was witnessed at Quirigua, Guatemala, during an expedition underway when the 1976 M = 7.5 Guatemala earthquake on the Motagua fault struck. Excavations also revealed evidence (domestic pottery vessels and the skeleton of a child crushed under fallen walls) of an ancient earthquake occurring about the time of the demise and abandonment of Quirigua in the late 9th century. Striking evidence for sudden earthquake building collapse at the end of the Mayan Classic Period ~A.D. 889 was found at Benque Viejo (Xunantunich), Belize, located 210 km north of Quirigua. It is argued that a M = 7.5 to 7.9 earthquake at the end of the Maya Classic period centered in the vicinity of the Chixoy-Polochic and Motagua fault zones could have produced the contemporaneous earthquake damage to the above sites. As a consequence this earthquake may have accelerated the

  18. Financialization impedes climate change mitigation: Evidence from the early American solar industry

    Science.gov (United States)

    Jerneck, Max

    2017-01-01

    The article investigates how financialization impedes climate change mitigation by examining its effects on the early history of one low-carbon industry, solar photovoltaics in the United States. The industry grew rapidly in the 1970s, as large financial conglomerates acquired independent firms. While providing needed financial support, conglomerates changed the focus from existing markets in consumer applications toward a future utility market that never materialized. Concentration of the industry also left it vulnerable to the corporate restructuring of the 1980s, when the conglomerates were dismantled and solar divisions were pared back or sold off to foreign firms. Both the move toward conglomeration, when corporations became managed as stock portfolios, and its subsequent reversal were the result of increased financial dominance over corporate governance. The American case is contrasted with the more successful case of Japan, where these changes to corporate governance did not occur. Insulated from shareholder pressure and financial turbulence, Japanese photovoltaics manufacturers continued to expand investment throughout the 1980s when their American rivals were cutting back. The study is informed by Joseph Schumpeter’s theory of creative destruction and Hyman Minsky’s theory of financialization, along with economic sociology. By highlighting the tenuous and conflicting relation between finance and production that shaped the early history of the photovoltaics industry, the article raises doubts about the prevailing approach to mitigate climate change through carbon pricing. Given the uncertainty of innovation and the ease of speculation, it will do little to spur low-carbon technology development without financial structures supporting patient capital. PMID:28435862

  20. Current-driven turbulence in plasmas

    International Nuclear Information System (INIS)

    Kluiver, H. de.

    1977-10-01

    Research on plasma heating in linear and toroidal systems using current-driven turbulence is reviewed. The motivation for this research is presented. Relations between parameters describing the turbulent plasma state and macroscopic observables are given. Several linear and toroidal devices used in current-driven turbulence studies are described, followed by a discussion of special diagnostic methods used. Experimental results on the measurement of electron and ion heating, anomalous plasma conductivity and associated turbulent fluctuation spectra are reviewed. Theories on current-driven turbulence are discussed and compared with experiments. It is demonstrated from the experimental results that current-driven turbulence occurs not only for extreme values of the electric field but also for an experimentally much more accessible and wide range of parameters. This forms a basis for a discussion on possible future applications in fusion-oriented plasma research

  1. Earthquake data base for Romania

    International Nuclear Information System (INIS)

    Rizescu, M.; Ghica, D.; Grecu, B.; Popa, M.; Borcia, I. S.

    2002-01-01

    A new earthquake database for Romania is being constructed, comprising complete earthquake information; it is up-to-date, user-friendly and rapidly accessible. One main component of the database consists of the catalog of earthquakes that occurred in Romania from 984 up to the present. The catalog contains information related to locations and other source parameters, when available, and links to waveforms of important earthquakes. The other very important component is the 'strong motion database', developed for strong intermediate-depth Vrancea earthquakes for which instrumental data were recorded. Different parameters characterizing strong motion properties, such as effective peak acceleration, effective peak velocity, corner periods Tc and Td, and global response-spectrum-based intensities, were computed and recorded in this database. Also included is information on the recording seismic stations, such as maps giving their positions, photographs of the instruments, and site conditions ('free-field' or on buildings). By the huge volume and quality of the gathered data, and also by its user-friendly interface, the Romanian earthquake database provides a very useful tool for geosciences and civil engineering in their effort towards reducing seismic risk in Romania. (authors)

  2. Mapping Tectonic Stress Using Earthquakes

    International Nuclear Information System (INIS)

    Arnold, Richard; Townend, John; Vignaux, Tony

    2005-01-01

    An earthquake occurs when the forces acting on a fault overcome its intrinsic strength and cause it to slip abruptly. Understanding more specifically why earthquakes occur at particular locations and times is complicated because in many cases we do not know what these forces actually are, or indeed what processes ultimately trigger slip. The goal of this study is to develop, test, and implement a Bayesian method of reliably determining tectonic stresses using the most abundant stress gauges available - earthquakes themselves. Existing algorithms produce reasonable estimates of the principal stress directions, but yield unreliable error bounds as a consequence of the generally weak constraint on stress imposed by any single earthquake, observational errors, and an unavoidable ambiguity between the fault normal and the slip vector. A statistical treatment of the problem can take into account observational errors, combine data from multiple earthquakes in a consistent manner, and provide realistic error bounds on the estimated principal stress directions. We have developed a realistic physical framework for modelling multiple earthquakes and show how the strong physical and geometrical constraints present in this problem allow inference to be made about the orientation of the principal axes of stress in the Earth's crust.

  3. Financialization and financial profit

    Directory of Open Access Journals (Sweden)

    Arturo Guillén

    2014-09-01

    Full Text Available This article starts from a critical review of the concept of financial capital. I consider it necessary not to confuse this category with that of financialization, which has acquired a certificate of naturalization since the rise of neoliberalism. Although monopoly-financial capital is the hegemonic segment of the bourgeoisie in the major capitalist countries, its dominance does not imply, a fortiori, the financialization of economic activity, since that depends on the conditions of the process of reproduction of capital. The emergence of joint stock companies modified the formation of the average rate of profit. The "promoter's profit" becomes one of the main forms of income of monopoly-financial capital. It is postulated that financial profit is a kind of "extraordinary surplus-value" which is appropriated by monopoly-financial capital by means of the monopolistic control it exerts over the issue and circulation of fictitious capital.

  4. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  5. Particle Settling in Low Energy Turbulence

    Science.gov (United States)

    Allen, Rachel; MacVean, Lissa; Tse, Ian; Mazzaro, Laura; Stacey, Mark; Variano, Evan

    2014-11-01

    Particle settling velocities can be altered by turbulence. In turbulence, dense particles may get trapped in convergent flow regions, and falling particles may be swept towards the downward side of turbulent eddies, resulting in enhanced settling velocities. The degree of velocity enhancement may depend on the Stokes number, the Rouse number, and the turbulent Reynolds number. In a homogeneous, isotropic turbulence tank, we tested the effects of particle size and type, suspended sediment concentration, and level of turbulence on the settling velocities of particles typically found in muddy estuaries. Two Acoustic Doppler Velocimeters (ADVs), separated vertically, measured turbulent velocities and suspended sediment concentrations, which yield condition-dependent settling velocities via ∂⟨C⟩/∂t = -∂/∂z (ws⟨C⟩ + ⟨w'C'⟩). These results are pertinent to fine sediment transport in estuaries, where high concentrations of suspended material are transported and impacted by low energy turbulence.
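The balance equation above can be rearranged into a two-sensor estimator for ws by integrating between the sensor heights. The discretization and variable names below are a sketch, not the authors' processing code:

```python
def settling_velocity(dCdt, dz, C, wC):
    """Estimate w_s from the 1-D sediment balance
        d<C>/dt = -d/dz (w_s <C> + <w'C'>)
    using two vertically separated measurement points (e.g. two ADVs).

    dCdt : rate of change of <C>, averaged between the sensors
    dz   : sensor separation (m)
    C    : (C_lower, C_upper) mean concentrations
    wC   : (wC_lower, wC_upper) turbulent fluxes <w'C'>
    All names here are illustrative, not from the paper.
    """
    dC = C[1] - C[0]
    dwC = wC[1] - wC[0]
    # Integrate the balance over dz and solve for w_s:
    #   dCdt * dz = -(w_s * dC + dwC)
    return -(dCdt * dz + dwC) / dC

# Synthetic check: steady state, upward turbulent flux at the upper sensor
# balanced by settling through a concentration difference of 1 unit.
ws = settling_velocity(dCdt=0.0, dz=0.1, C=(2.0, 1.0), wC=(0.0, 0.001))
print(ws)  # 0.001 m/s
```

In steady state the estimator reduces to ws = -Δ⟨w'C'⟩/Δ⟨C⟩, i.e. the flux divergence is carried entirely by settling, which is the limit the synthetic check exercises.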

  6. Earthquake cycles and physical modeling of the process leading up to a large earthquake

    Science.gov (United States)

    Ohnaka, Mitiyasu

    2004-08-01

    A thorough discussion is made on what the rational constitutive law for earthquake ruptures ought to be from the standpoint of the physics of rock friction and fracture on the basis of solid facts observed in the laboratory. From this standpoint, it is concluded that the constitutive law should be a slip-dependent law with parameters that may depend on slip rate or time. With the long-term goal of establishing a rational methodology of forecasting large earthquakes, the entire process of one cycle for a typical, large earthquake is modeled, and a comprehensive scenario that unifies individual models for intermediate- and short-term (immediate) forecasts is presented within the framework based on the slip-dependent constitutive law and the earthquake cycle model. The earthquake cycle includes the phase of accumulation of elastic strain energy with tectonic loading (phase II), and the phase of rupture nucleation at the critical stage where an adequate amount of the elastic strain energy has been stored (phase III). Phase II plays a critical role in physical modeling of intermediate-term forecasting, and phase III in physical modeling of short-term (immediate) forecasting. The seismogenic layer and individual faults therein are inhomogeneous, and some of the physical quantities inherent in earthquake ruptures exhibit scale-dependence. It is therefore critically important to incorporate the properties of inhomogeneity and physical scaling, in order to construct realistic, unified scenarios with predictive capability. The scenario presented may be significant and useful as a necessary first step for establishing the methodology for forecasting large earthquakes.

  7. Turbulence Intensity Scaling: A Fugue

    OpenAIRE

    Basse, Nils T.

    2018-01-01

    We study streamwise turbulence intensity definitions using smooth- and rough-wall pipe flow measurements made in the Princeton Superpipe. Scaling of turbulence intensity with the bulk (and friction) Reynolds number is provided for the definitions. The turbulence intensity is proportional to the square root of the friction factor with the same proportionality constant for smooth- and rough-wall pipe flow. Turbulence intensity definitions providing the best description of the measurements are i...
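A minimal sketch of the stated scaling, turbulence intensity proportional to the square root of the friction factor. The Blasius smooth-pipe friction law stands in for the Superpipe data, and the proportionality constant c is illustrative; the paper's fitted constant is not reproduced here:

```python
def friction_factor_blasius(Re):
    """Blasius smooth-pipe friction factor, valid roughly for 4e3 < Re < 1e5."""
    return 0.316 * Re ** -0.25

def turbulence_intensity(Re, c=1.0):
    """I = c * sqrt(f): the scaling described in the abstract, with an
    assumed proportionality constant c (set to 1 for illustration)."""
    return c * friction_factor_blasius(Re) ** 0.5

for Re in (1e4, 1e5):
    print(f"Re={Re:.0e}  I/c={turbulence_intensity(Re):.3f}")
```

Because f decreases slowly with Re in smooth pipes, the sketch reproduces the qualitative behaviour the abstract reports: turbulence intensity falls gently with the bulk Reynolds number.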

  8. The USGS Earthquake Notification Service (ENS): Customizable notifications of earthquakes around the globe

    Science.gov (United States)

    Wald, Lisa A.; Wald, David J.; Schwarz, Stan; Presgrave, Bruce; Earle, Paul S.; Martinez, Eric; Oppenheimer, David

    2008-01-01

    At the beginning of 2006, the U.S. Geological Survey (USGS) Earthquake Hazards Program (EHP) introduced a new automated Earthquake Notification Service (ENS) to take the place of the National Earthquake Information Center (NEIC) "Bigquake" system and the various other individual EHP e-mail list-servers for separate regions in the United States. These included northern California, southern California, and the central and eastern United States. ENS is a "one-stop shopping" system that allows Internet users to subscribe to flexible and customizable notifications for earthquakes anywhere in the world. The customization capability allows users to define the what (magnitude threshold), the when (day and night thresholds), and the where (specific regions) for their notifications. Customization is achieved by employing a per-user based request profile, allowing the notifications to be tailored for each individual's requirements. Such earthquake-parameter-specific custom delivery was not possible with simple e-mail list-servers. Now that event and user profiles are in a structured query language (SQL) database, additional flexibility is possible. At the time of this writing, ENS had more than 114,000 subscribers, with more than 200,000 separate user profiles. On a typical day, more than 188,000 messages get sent to a variety of widely distributed users for a wide range of earthquake locations and magnitudes. The purpose of this article is to describe how ENS works, highlight the features it offers, and summarize plans for future developments.
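The per-user profile idea, with separate day and night magnitude thresholds per region, can be sketched with an in-memory SQL table. The schema, column names, and data below are invented for illustration; the actual ENS database structure is not public in this article:

```python
import sqlite3

# Hypothetical sketch of per-user notification profiles: each row holds the
# "what" (magnitude thresholds), "when" (day vs night), and "where" (region).
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE profiles (
    user_email TEXT, region TEXT,
    mag_day REAL, mag_night REAL)""")
db.execute("INSERT INTO profiles VALUES ('a@example.org', 'N. California', 4.0, 5.5)")

def subscribers(region, magnitude, is_night):
    """Return users whose threshold for the current time of day is met."""
    col = "mag_night" if is_night else "mag_day"
    rows = db.execute(
        f"SELECT user_email FROM profiles WHERE region = ? AND {col} <= ?",
        (region, magnitude))
    return [r[0] for r in rows]

print(subscribers("N. California", 4.5, is_night=False))  # ['a@example.org']
print(subscribers("N. California", 4.5, is_night=True))   # []
```

A structured query like this is what a plain e-mail list-server cannot express: the same M 4.5 event notifies the user by day but not at night, because the thresholds are stored per user rather than per list.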

  9. Cryogenic turbulence

    CERN Document Server

    CERN. Geneva. Audiovisual Unit

    2005-01-01

    Understanding turbulence is vital in astrophysics, geophysics and many engineering applications, with thermal convection playing a central role. I shall describe progress that has recently been made in understanding this ubiquitous phenomenon by making controlled experiments using low-temperature helium, and a brief account of the frontier topic of superfluid turbulence will also be given. CERN might be able to play a unique role in experiments to probe these two problems.

  10. On the correlation of heat transfer in turbulent boundary layers subjected to free-stream turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, M.J.; Hollingsworth, D.K.

    1999-07-01

    The turbulent flow of a fluid bounded by a heated surface is a wonderfully complex yet derisively mundane phenomenon. Despite its commonness in natural and man-made environments, the authors struggle to accurately predict its behavior in many simple situations. A complexity encountered in a number of flows is the presence of free-stream turbulence. A turbulent free-stream typically yields increased surface friction and heat transfer. Turbulent boundary layers with turbulent free-streams are encountered in gas-turbine engines, rocket nozzles, electronic-cooling passages, geophysical flows, and numerous other dynamic systems. Here, turbulent boundary layers were subjected to grid-generated free-stream turbulence to study the effects of length scale and intensity on heat transfer. The research focused on correlating heat transfer without the use of conventional boundary-layer Reynolds numbers. The boundary layers studied ranged from 400 to 2,700 in momentum-thickness Reynolds number and from 450 to 1,900 in enthalpy-thickness Reynolds number. Free-stream turbulence intensities varied from 0.1 to 8.0%. The turbulent-to-viscous length-scale ratios presented are the smallest found in the heat-transfer literature; the ratios spanned from 100 to 1000. The turbulent-to-thermal ratios (using enthalpy thickness as the thermal scale) are also the smallest reported; the ratios ranged from 3.2 to 12.3. A length-scale dependence was identified in a Stanton number based on a near-wall streamwise velocity fluctuation. A new near-wall Stanton number was introduced; this parameter was regarded as a constant in a two-region boundary-layer model. The new model correlated heat transfer to within 7%.

  11. Stigma in science: the case of earthquake prediction.

    Science.gov (United States)

    Joffe, Helene; Rossetto, Tiziana; Bradley, Caroline; O'Connor, Cliodhna

    2018-01-01

    This paper explores how earthquake scientists conceptualise earthquake prediction, particularly given the conviction of six earthquake scientists for manslaughter (subsequently overturned) on 22 October 2012 for having given inappropriate advice to the public prior to the L'Aquila earthquake of 6 April 2009. In the first study of its kind, semi-structured interviews were conducted with 17 earthquake scientists and the transcribed interviews were analysed thematically. The scientists primarily denigrated earthquake prediction, showing strong emotive responses and distancing themselves from earthquake 'prediction' in favour of 'forecasting'. Earthquake prediction was regarded as impossible and harmful. The stigmatisation of the subject is discussed in the light of research on boundary work and stigma in science. The evaluation reveals how mitigation becomes the more favoured endeavour, creating a normative environment that disadvantages those who continue to pursue earthquake prediction research. Recommendations are made for communication with the public on earthquake risk, with a focus on how scientists portray uncertainty. © 2018 The Author(s). Disasters © Overseas Development Institute, 2018.

  12. Turbulent flux and the diffusion of passive tracers in electrostatic turbulence

    DEFF Research Database (Denmark)

    Basu, R.; Jessen, T.; Naulin, V.

    2003-01-01

    The connection between the diffusion of passive tracer particles and the anomalous turbulent flux in electrostatic drift-wave turbulence is investigated by direct numerical solutions of the 2D Hasegawa-Wakatani equations. The probability density functions for the point-wise and flux surface...

  13. Physics of Earthquake Rupture Propagation

    Science.gov (United States)

    Xu, Shiqing; Fukuyama, Eiichi; Sagy, Amir; Doan, Mai-Linh

    2018-05-01

    A comprehensive understanding of earthquake rupture propagation requires the study of not only the sudden release of elastic strain energy during co-seismic slip, but also of other processes that operate at a variety of spatiotemporal scales. For example, the accumulation of the elastic strain energy usually takes decades to hundreds of years, and rupture propagation and termination modify the bulk properties of the surrounding medium that can influence the behavior of future earthquakes. To share recent findings in the multiscale investigation of earthquake rupture propagation, we held a session entitled "Physics of Earthquake Rupture Propagation" during the 2016 American Geophysical Union (AGU) Fall Meeting in San Francisco. The session included 46 poster and 32 oral presentations, reporting observations of natural earthquakes, numerical and experimental simulations of earthquake ruptures, and studies of earthquake fault friction. These presentations and discussions during and after the session suggested a need to document more formally the research findings, particularly new observations and views different from conventional ones, complexities in fault zone properties and loading conditions, the diversity of fault slip modes and their interactions, the evaluation of observational and model uncertainties, and comparison between empirical and physics-based models. Therefore, we organize this Special Issue (SI) of Tectonophysics under the same title as our AGU session, hoping to inspire future investigations. Eighteen articles (marked with "this issue") are included in this SI and grouped into the following six categories.

  14. Real Time Earthquake Information System in Japan

    Science.gov (United States)

    Doi, K.; Kato, T.

    2003-12-01

    An early earthquake notification system in Japan was developed by the Japan Meteorological Agency (JMA), the governmental organization responsible for issuing earthquake information and tsunami forecasts. The system was primarily developed for prompt provision of a tsunami forecast to the public, locating an earthquake and estimating its magnitude as quickly as possible. Years later, a system for prompt provision of seismic intensity information, as an index of the degree of disaster caused by strong ground motion, was also developed so that the governmental organizations concerned can decide whether they need to launch an emergency response. At present, JMA issues the following kinds of information successively when a large earthquake occurs. 1) Prompt report of the occurrence of a large earthquake and the major seismic intensities it caused, about two minutes after the earthquake occurrence. 2) Tsunami forecast in around three minutes. 3) Information on expected arrival times and maximum heights of tsunami waves in around five minutes. 4) Information on the hypocenter and magnitude of the earthquake, the seismic intensity at each observation station, and the times of high tides in addition to the expected tsunami arrival times, in 5-7 minutes. To issue the information above, JMA has established: - an advanced nationwide seismic network with about 180 stations for seismic wave observation and about 3,400 stations for instrumental seismic intensity observation, including about 2,800 seismic intensity stations maintained by local governments; - data telemetry networks via landlines and partly via a satellite communication link; - real-time data processing techniques, for example the automatic calculation of earthquake location and magnitude and the database-driven method for quantitative tsunami estimation; and - dissemination networks via computer-to-computer communications and facsimile through dedicated telephone lines. JMA operationally

  15. Turbulence in extended synchrotron radio sources. I. Polarization of turbulent sources. II. Power-spectral analysis

    International Nuclear Information System (INIS)

    Eilek, J.A.

    1989-01-01

    Recent theories of magnetohydrodynamic turbulence are used to construct microphysical turbulence models, with emphasis on models of anisotropic turbulence. These models have been applied to the determination of the emergent polarization from a resolved uniform source. It is found that depolarization alone is not a unique measure of the turbulence, and that the turbulence will also affect the total-intensity distributions. Fluctuations in the intensity image can thus be employed to measure turbulence strength. In the second part, it is demonstrated that a power-spectral analysis of the total and polarized intensity images can be used to obtain the power spectra of the synchrotron emission. 81 refs
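    The power-spectral analysis of intensity images described in the second part can be illustrated with a minimal sketch: compute the 2D FFT of an image and azimuthally average the power into annuli of constant wavenumber. This is not the paper's code; the synthetic white-noise image and the binning choices are assumptions for illustration only.

```python
import numpy as np

def radial_power_spectrum(image):
    """Azimuthally averaged power spectrum of a 2D intensity image."""
    n = image.shape[0]
    # 2D FFT, shifted so that k = 0 sits at the array center
    power = np.abs(np.fft.fftshift(np.fft.fft2(image)))**2
    freqs = np.fft.fftshift(np.fft.fftfreq(n))
    kx, ky = np.meshgrid(freqs, freqs)
    k = np.hypot(kx, ky).ravel()
    p = power.ravel()
    # Average power within annuli of constant |k|
    bins = np.linspace(0.0, k.max(), n // 2)
    idx = np.digitize(k, bins)
    spectrum = np.array([p[idx == i].mean() if np.any(idx == i) else 0.0
                         for i in range(1, len(bins))])
    return bins[1:], spectrum

# Synthetic "total-intensity image": white noise standing in for
# turbulence-induced fluctuations
rng = np.random.default_rng(0)
img = rng.normal(size=(128, 128))
k_bins, spec = radial_power_spectrum(img)
```

For white noise the averaged spectrum is roughly flat; an image of a genuinely turbulent source would instead show a power-law range whose slope constrains the turbulence.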

  16. Turbulent wakes of fractal objects

    NARCIS (Netherlands)

    Staicu, A.D.; Mazzi, B.; Vassilicos, J.C.; Water, van de W.

    2003-01-01

    Turbulence of a windtunnel flow is stirred using objects that have a fractal structure. The strong turbulent wakes resulting from three such objects which have different fractal dimensions are probed using multiprobe hot-wire anemometry in various configurations. Statistical turbulent quantities are

  17. Earthquake precursory events around epicenters and local active faults; the cases of two inland earthquakes in Iran

    Science.gov (United States)

    Valizadeh Alvan, H.; Mansor, S.; Haydari Azad, F.

    2012-12-01

    The possibility of earthquake prediction in the frame of several days to a few minutes before occurrence has recently stirred interest among researchers. Scientists believe that the new theories and explanations of the mechanism of this natural phenomenon are trustworthy and can be the basis of future prediction efforts. During the last thirty years, experimental research has identified pre-earthquake events which are now recognized as confirmed warning signs (precursors) of past known earthquakes. With the advances in in-situ measurement devices and data analysis capabilities and the emergence of satellite-based data collectors, monitoring the earth's surface is now routine work. Data providers are supplying researchers from all over the world with high-quality, validated imagery and non-imagery data. Surface Latent Heat Flux (SLHF), the amount of energy exchange in the form of water vapor between the earth's surface and atmosphere, has been frequently reported as an earthquake precursor during the past years. The accumulated stress in the earth's crust during the preparation phase of earthquakes is said to be the main cause of temperature anomalies weeks to days before the main event and subsequent shakes. Chemical and physical interactions in the presence of underground water lead to higher water evaporation prior to inland earthquakes. On the other hand, radon gas leaking as rocks break during earthquake preparation causes the formation of airborne ions and higher Air Temperature (AT) prior to the main event. Although co-analysis of direct and indirect observations of precursory events is considered a promising method for future successful earthquake prediction, without proper and thorough knowledge of the geological setting, atmospheric factors, and geodynamics of earthquake-prone regions we will not be able to identify anomalies due to seismic activity in the earth's crust. 
Active faulting is a key factor in identification of the

  18. Turbulence in the solar wind

    CERN Document Server

    Bruno, Roberto

    2016-01-01

    This book provides an overview of solar wind turbulence from both the theoretical and observational perspective. It argues that the interplanetary medium offers the best opportunity to directly study turbulent fluctuations in collisionless plasmas. In fact, during expansion, the solar wind evolves towards a state characterized by large-amplitude fluctuations in all observed parameters, which resembles, at least at large scales, the well-known hydrodynamic turbulence. This text starts with historical references to past observations and experiments on turbulent flows. It then introduces the Navier-Stokes equations for a magnetized plasma whose low-frequency turbulence evolution is described within the framework of the MHD approximation. It also considers the scaling of plasma and magnetic field fluctuations and the study of nonlinear energy cascades within the same framework. It reports observations of turbulence in the ecliptic and at high latitude, treating Alfvénic and compressive fluctuations separately in...

  19. Turbulent entrainment across turbulent-nonturbulent interfaces in stably stratified mixing layers

    Science.gov (United States)

    Watanabe, T.; Riley, J. J.; Nagata, K.

    2017-10-01

    The entrainment process in stably stratified mixing layers is studied in relation to the turbulent-nonturbulent interface (TNTI) using direct numerical simulations. The statistics are calculated with the interface coordinate in an Eulerian frame as well as with the Lagrangian fluid particles entrained from the nonturbulent to the turbulent regions. The characteristics of entrainment change as the buoyancy Reynolds number Reb decreases and the flow begins to layer. The baroclinic torque delays the enstrophy growth of the entrained fluids at small Reb, while this effect is less efficient for large Reb. The entrained particle movement within the TNTI layer is dominated by the small dissipative scales, and the rapid decay of the kinetic energy dissipation rate due to buoyancy causes the entrained particle movement relative to the interface location to become slower. Although the Eulerian statistics confirm that there exists turbulent fluid with strong vorticity or with large buoyancy frequency near the TNTI, the entrained fluid particles circumvent these regions by passing through the TNTI in strain-dominant regions or in regions with small buoyancy frequency. The multiparticle statistics show that once the nonturbulent fluid volumes are entrained, they are deformed into flattened shapes in the vertical direction and diffuse in the horizontal direction. When Reb is large enough for small-scale turbulence to exist, the entrained fluid is able to penetrate into the turbulent core region. Once the flow begins to layer with decreasing Reb, however, the entrained fluid volume remains near the outer edge of the turbulent region and forms a stably stratified layer without vertical overturning.

  20. Napa Earthquake impact on water systems

    Science.gov (United States)

    Wang, J.

    2014-12-01

    The South Napa earthquake occurred in Napa, California, on August 24 at 3 a.m. local time, with a magnitude of 6.0. It was the largest earthquake in the San Francisco Bay Area since the 1989 Loma Prieta earthquake. Economic losses topped $1 billion. Wine makers cleaned up and estimated the damage to tourism; around 15,000 cases of cabernet poured into the garden at the Hess Collection. Earthquakes can raise water pollution risks and could cause a water crisis. California has suffered water shortages in recent years, so understanding how to prevent underground and surface water pollution caused by earthquakes could be valuable. This research gives a clear view of the drinking water system in California and pollution of river systems, as well as an estimation of earthquake impact on water supply. The Sacramento-San Joaquin River Delta (close to Napa) is the center of the state's water distribution system, delivering fresh water to more than 25 million residents and 3 million acres of farmland. Delta water conveyed through a network of levees is crucial to Southern California. The drought has significantly curtailed water export, and salt water intrusion has reduced fresh water outflows. Strong shaking from a nearby earthquake can cause liquefaction of saturated, loose, sandy soils and could potentially damage major Delta levee systems near Napa. The Napa earthquake is a wake-up call for Southern California: it could potentially damage the freshwater supply system.

  1. Earthquake Hazard Analysis Methods: A Review

    Science.gov (United States)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    Earthquakes are among the natural disasters with the most significant impacts in terms of risk and damage. Countries such as China, Japan, and Indonesia are located on active continental plate boundaries and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analyzing seismic zones and earthquake hazard micro-zonation, using the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and using remote sensing. In application, it is necessary to review the effectiveness of each technique in advance. Considering time efficiency and data accuracy, remote sensing is used as a reference for assessing earthquake hazard accurately and quickly, as only limited time is available for sound decision-making shortly after a disaster. Exposed areas and areas possibly vulnerable to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, provide added value and advantages in the use of remote sensing as a method for assessing earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing disaster management policies and can reduce the risk of natural disasters such as earthquakes in Indonesia.

  2. Saturation of the turbulent dynamo.

    Science.gov (United States)

    Schober, J; Schleicher, D R G; Federrath, C; Bovino, S; Klessen, R S

    2015-08-01

    The origin of strong magnetic fields in the Universe can be explained by amplifying weak seed fields via turbulent motions on small spatial scales and subsequently transporting the magnetic energy to larger scales. This process is known as the turbulent dynamo and depends on the properties of turbulence, i.e., on the hydrodynamical Reynolds number and the compressibility of the gas, and on the magnetic diffusivity. While we know the growth rate of the magnetic energy in the linear regime, the saturation level, i.e., the ratio of magnetic energy to turbulent kinetic energy that can be reached, is not known from analytical calculations. In this paper we present a scale-dependent saturation model based on an effective turbulent resistivity which is determined by the turnover time scale of turbulent eddies and the magnetic energy density. The magnetic resistivity increases compared to the Spitzer value and the effective scale on which the magnetic energy spectrum is at its maximum moves to larger spatial scales. This process ends when the peak reaches a characteristic wave number k☆ which is determined by the critical magnetic Reynolds number. The saturation level of the dynamo also depends on the type of turbulence and differs for the limits of large and small magnetic Prandtl numbers Pm. With our model we find saturation levels between 43.8% and 1.3% for Pm≫1 and between 2.43% and 0.135% for Pm≪1, where the higher values refer to incompressible turbulence and the lower ones to highly compressible turbulence.

  3. Earthquake Drill using the Earthquake Early Warning System at an Elementary School

    Science.gov (United States)

    Oki, Satoko; Yazaki, Yoshiaki; Koketsu, Kazuki

    2010-05-01

    Japan frequently suffers from many kinds of disasters such as earthquakes, typhoons, floods, volcanic eruptions, and landslides. On average, about 120 people a year have been lost to natural hazards in this decade. Above all, earthquakes are noteworthy, since one may kill thousands of people in a moment, as in Kobe in 1995. People know that we may have "a big one" some day as long as we live on this land, and they know what to do: retrofit houses, anchor heavy furniture to walls, add latches to kitchen cabinets, and prepare emergency packs. Yet most of them do not take action, which results in the loss of many lives. Only the victims learn something from an earthquake, and the lessons have never become common lore. One of the most essential ways to reduce the damage is to educate the general public to be able to make sound decisions on what to do at the moment an earthquake hits. This requires knowledge of the background of the on-going phenomenon. The Ministry of Education, Culture, Sports, Science and Technology (MEXT) therefore invited public applications to choose several model areas in which to introduce scientific education into local elementary schools. This presentation reports on the year-and-a-half course that we held at a model elementary school in the Tokyo Metropolitan Area. The tectonic setting of this area is very complicated: the Pacific and Philippine Sea plates subduct beneath the North America and Eurasia plates. The subduction of the Philippine Sea plate causes mega-thrust earthquakes such as the 1923 Kanto earthquake (M 7.9), which caused 105,000 fatalities. A magnitude 7 or greater earthquake beneath this area has recently been evaluated to occur with a probability of 70 % in 30 years. 
This is of immediate concern for the devastating loss of life and property because the Tokyo urban region now has a population of 42 million and is the center of approximately 40 % of the nation's activities, which may cause great global

  4. Book review: Earthquakes and water

    Science.gov (United States)

    Bekins, Barbara A.

    2012-01-01

    It is really nice to see assembled in one place a discussion of the documented and hypothesized hydrologic effects of earthquakes. The book is divided into chapters focusing on particular hydrologic phenomena including liquefaction, mud volcanism, stream discharge increases, groundwater level, temperature and chemical changes, and geyser period changes. These hydrologic effects are inherently fascinating, and the large number of relevant publications in the past decade makes this summary a useful milepost. The book also covers hydrologic precursors and earthquake triggering by pore pressure. A natural need to limit the topics covered resulted in the omission of tsunamis and the vast literature on the role of fluids and pore pressure in frictional strength of faults. Regardless of whether research on earthquake-triggered hydrologic effects ultimately provides insight into the physics of earthquakes, the text provides welcome common ground for interdisciplinary collaborations between hydrologists and seismologists. Such collaborations continue to be crucial for investigating hypotheses about the role of fluids in earthquakes and slow slip. 

  5. Analysing news media coverage of the 2015 Nepal earthquake using a community capitals lens: implications for disaster resilience.

    Science.gov (United States)

    Dhakal, Subas P

    2018-04-01

    South Asia is one of the regions of the world most vulnerable to natural disasters. Although news media analyses of disasters have been conducted frequently in various settings globally, there is little research on populous South Asia. This paper begins to fill this gap by evaluating local and foreign news media coverage of the earthquake in Nepal on 25 April 2015. It broadens the examination of news media coverage of disaster response beyond traditional framing theory, utilising community capitals (built, cultural, financial, human, natural, political, and social) lens to perform a thematic content analysis of 405 news items. Overall, financial and natural capital received the most and the least emphasis respectively. Statistically significant differences between local and foreign news media were detected vis-à-vis built, financial, and political capital. The paper concludes with a discussion of the social utility of news media analysis using the community capitals framework to inform disaster resilience. © 2018 The Author(s). Disasters © Overseas Development Institute, 2018.

  6. Earthquake resistant design of structures

    International Nuclear Information System (INIS)

    Choi, Chang Geun; Kim, Gyu Seok; Lee, Dong Geun

    1990-02-01

    This book covers the occurrence of earthquakes and earthquake damage analysis; the equivalent static analysis method and its application; dynamic analysis methods such as time-history analysis by mode superposition and by direct integration; and design spectrum analysis for earthquake-resistant design in Korea, including the analysis model and vibration modes, calculation of base shear, calculation of story seismic loads, and combination of analysis results.
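    The equivalent static procedure named above can be sketched generically: compute a base shear from a seismic coefficient and the building weight, then distribute it over the stories. The coefficient, weights, and the linear w·h distribution below are illustrative assumptions, not values or formulas taken from the Korean code.

```python
def base_shear(weight, cs):
    """Equivalent static base shear, generic form V = Cs * W."""
    return cs * weight

def story_forces(story_weights, story_heights, v):
    """Distribute the base shear over stories in proportion to w_i * h_i
    (a common linear distribution; code-specific exponents vary)."""
    wh = [w * h for w, h in zip(story_weights, story_heights)]
    total = sum(wh)
    return [v * x / total for x in wh]

# Assumed three-story example: total weight 3000 kN, Cs = 0.1
v = base_shear(weight=3000.0, cs=0.1)
forces = story_forces([1000.0] * 3, [3.0, 6.0, 9.0], v)
```

With equal story weights, the forces grow linearly with height (50, 100, 150 kN here), reproducing the familiar inverted-triangle load pattern.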

  7. Exploring Earthquakes in Real-Time

    Science.gov (United States)

    Bravo, T. K.; Kafka, A. L.; Coleman, B.; Taber, J. J.

    2013-12-01

    Earthquakes capture the attention of students and inspire them to explore the Earth. Adding the ability to view and explore recordings of significant and newsworthy earthquakes in real-time makes the subject even more compelling. To address this opportunity, the Incorporated Research Institutions for Seismology (IRIS), in collaboration with Moravian College, developed 'jAmaSeis', a cross-platform application that enables students to access real-time earthquake waveform data. Students can watch as the seismic waves are recorded on their computer, and can be among the first to analyze the data from an earthquake. jAmaSeis facilitates student centered investigations of seismological concepts using either a low-cost educational seismograph or streamed data from other educational seismographs or from any seismic station that sends data to the IRIS Data Management System. After an earthquake, students can analyze the seismograms to determine characteristics of earthquakes such as time of occurrence, distance from the epicenter to the station, magnitude, and location. The software has been designed to provide graphical clues to guide students in the analysis and assist in their interpretations. Since jAmaSeis can simultaneously record up to three stations from anywhere on the planet, there are numerous opportunities for student driven investigations. For example, students can explore differences in the seismograms from different distances from an earthquake and compare waveforms from different azimuthal directions. Students can simultaneously monitor seismicity at a tectonic plate boundary and in the middle of the plate regardless of their school location. This can help students discover for themselves the ideas underlying seismic wave propagation, regional earthquake hazards, magnitude-frequency relationships, and the details of plate tectonics. The real-time nature of the data keeps the investigations dynamic, and offers students countless opportunities to explore.

  8. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against current observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
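    A toy version of the power-law smoothing step might look as follows. The kernel form, smoothing distance d, exponent q, and the synthetic fault geometry are assumptions for illustration, not the paper's calibrated values.

```python
import numpy as np

def powerlaw_rate_map(epicenters, grid_x, grid_y, d=0.05, q=1.5):
    """Spread each simulated event over the whole grid with a rate that
    decays as a power law of epicentral distance, ETAS-style:
    kernel ~ (r^2 + d^2)^(-q), normalized so each event adds unit rate."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    rate = np.zeros_like(gx)
    for (ex, ey) in epicenters:
        r2 = (gx - ex)**2 + (gy - ey)**2
        kernel = (r2 + d**2)**(-q)
        rate += kernel / kernel.sum()
    return rate

# Toy "simulated seismicity" confined to a fault line along y = 0
events = [(x, 0.0) for x in np.linspace(-1.0, 1.0, 20)]
xs = np.linspace(-2.0, 2.0, 80)
ys = np.linspace(-2.0, 2.0, 80)
rates = powerlaw_rate_map(events, xs, ys)
```

The resulting map is nonzero everywhere, so off-fault observed epicenters can be scored against it, while the highest rates stay concentrated near the modeled fault.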

  9. Parallel Earthquake Simulations on Large-Scale Multicore Supercomputers

    KAUST Repository

    Wu, Xingfu

    2011-01-01

    Earthquakes are one of the most destructive natural hazards on our planet Earth. Huge earthquakes striking offshore may cause devastating tsunamis, as evidenced by the 11 March 2011 Japan (moment magnitude Mw9.0) and the 26 December 2004 Sumatra (Mw9.1) earthquakes. Earthquake prediction (in terms of the precise time, place, and magnitude of a coming earthquake) is arguably unfeasible in the foreseeable future. To mitigate seismic hazards from future earthquakes in earthquake-prone areas, such as California and Japan, scientists have been using numerical simulations to study earthquake rupture propagation along faults and seismic wave propagation in the surrounding media on ever-advancing modern computers over the past several decades. In particular, ground motion simulations for past and future (possible) significant earthquakes have been performed to understand factors that affect ground shaking in populated areas, and to provide ground shaking characteristics and synthetic seismograms for emergency preparation and design of earthquake-resistant structures. These simulation results can guide the development of more rational seismic provisions, leading to safer, more efficient, and economical structures in earthquake-prone regions.

  10. A FRAMEWORK FOR THE TREATMENT OF FINANCIAL CONTAGION EFFECTS IN THE CONTEXT OF THE ACTUAL EUROPEAN TURBULENCES

    Directory of Open Access Journals (Sweden)

    Boscoianu Mircea

    2010-12-01

    Full Text Available There is still debate regarding a possible restoration of confidence in European financial markets, because underlying problems of super-sized finance remain and have actually worsened. The efficiency of anti-crisis strategies and the future costs of real reform make analysts more prudent in their forecasts. In addition, a possible reduction in risk appetite and a loss of confidence will fuel a negative outlook for the recovery of emerging economies, which are extremely fragile to regional or global contagion effects. In modern financial crises, events spiral out of control, and panic and contagion come very fast. The Greek debt crisis is the most serious extreme financial event in the Eurozone, with severe contagion features. An analysis of Euro-contagion effects in the context of the Greek crisis, using a dynamic version of the Hawkes jump-diffusion model, is suggested.

  11. Children's emotional experience two years after an earthquake: An exploration of knowledge of earthquakes and associated emotions.

    Science.gov (United States)

    Raccanello, Daniela; Burro, Roberto; Hall, Rob

    2017-01-01

    We explored whether and how the exposure to a natural disaster such as the 2012 Emilia Romagna earthquake affected the development of children's emotional competence in terms of understanding, regulating, and expressing emotions, after two years, when compared with a control group not exposed to the earthquake. We also examined the role of class level and gender. The sample included two groups of children (n = 127) attending primary school: the experimental group (n = 65) experienced the 2012 Emilia Romagna earthquake, while the control group (n = 62) did not. The data collection took place two years after the earthquake, when the children were seven or ten years old. Beyond assessing the children's understanding of emotions and regulating abilities with standardized instruments, we employed semi-structured interviews to explore their knowledge of earthquakes and associated emotions, and a structured task on the intensity of some target emotions. We applied Generalized Linear Mixed Models. Exposure to the earthquake did not influence the understanding and regulation of emotions. The understanding of emotions varied according to class level and gender. Knowledge of earthquakes, emotional language, and emotions associated with earthquakes were, respectively, more complex, frequent, and intense for children who had experienced the earthquake, and at increasing ages. Our data extend the generalizability of theoretical models on children's psychological functioning following disasters, such as the dose-response model and the organizational-developmental model for child resilience, and provide further knowledge on children's emotional resources related to natural disasters, as a basis for planning educational prevention programs.

  12. 13 CFR 120.174 - Earthquake hazards.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  13. Earthquake magnitude estimation using the τ c and P d method for earthquake early warning systems

    Science.gov (United States)

    Jin, Xing; Zhang, Hongcai; Li, Jun; Wei, Yongxiang; Ma, Qiang

    2013-10-01

    Earthquake early warning (EEW) systems are one of the most effective ways to reduce earthquake disaster. Earthquake magnitude estimation is one of the most important and also the most difficult parts of the entire EEW system. In this paper, based on 142 earthquake events and 253 seismic records recorded by the KiK-net in Japan, and aftershocks of the large Wenchuan earthquake in Sichuan, we obtained earthquake magnitude estimation relationships using the τ c and P d methods. The standard variances of magnitude calculation for these two formulas are ±0.65 and ±0.56, respectively. The P d value can also be used to estimate the peak ground velocity, so that warning information can be released to the public rapidly according to the estimation results. In order to ensure the stability and reliability of the magnitude estimation results, we propose a compatibility test based on the nature of these two parameters. The reliability of the early warning information is significantly improved through this test.
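    A minimal sketch of the τ c computation may clarify the approach: τ c is an average-period parameter derived from the first few seconds of P-wave displacement, and magnitude is then read off a fitted log-linear relation. The regression coefficients below are hypothetical placeholders, since the abstract does not give the fitted constants.

```python
import numpy as np

def tau_c(displacement, dt, window=3.0):
    """Average-period parameter from the initial seconds of P-wave
    displacement: tau_c = 2*pi / sqrt( sum(v^2) / sum(u^2) ), v = du/dt."""
    n = int(window / dt)
    u = np.asarray(displacement[:n])
    v = np.gradient(u, dt)
    r = np.sum(v**2) / np.sum(u**2)   # the dt factors cancel in the ratio
    return 2.0 * np.pi / np.sqrt(r)

def magnitude_from_tau_c(tc, a=3.37, b=5.79):
    """Log-linear relation M = a*log10(tau_c) + b.  The coefficients here
    are hypothetical placeholders, not the values fitted in the paper."""
    return a * np.log10(tc) + b

# Synthetic monochromatic "P wave" at 1 Hz: tau_c recovers its 1 s period
dt = 0.01
t = np.arange(0.0, 3.0, dt)
u = np.sin(2.0 * np.pi * 1.0 * t)
tc = tau_c(u, dt)
```

For a pure sinusoid the velocity-to-displacement energy ratio equals the angular frequency squared, so τ c reduces to the signal period, which is why it tracks the dominant period (and hence size) of the rupture's early radiation.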

  14. Family functioning and its predictors among disaster bereaved individuals in China: eighteen months after the Wenchuan Earthquake.

    Directory of Open Access Journals (Sweden)

    Xiaoyi Cao

    Full Text Available BACKGROUND: The 2008 Wenchuan earthquake in China resulted in great loss of life and property, and previous studies have focused on psychopathological symptoms in survivors after disasters. This study examined perceived family functioning and its predictors in disaster-bereaved individuals eighteen months after the 2008 Wenchuan earthquake. METHODOLOGY/FINDINGS: This was a cross-sectional study of a convenience sample of 264 bereaved individuals. The instruments used in the study included the Family APGAR Index, the Family Adaptability and Cohesion Evaluation Scale II, the Emotional and Social Loneliness Scale, and a range of items eliciting demographic characteristics and disaster-related variables. The results indicated that the rates of moderate family dysfunction and severe family dysfunction in bereaved individuals were 37.1% and 12.9%, respectively. Less financial loss during the earthquake was a significant predictor of positive family function. Better self-rated health status after the earthquake was significantly related to positive family function, cohesion, and adaptability. Scores on family cohesion and adaptability in bereaved individuals from extended or nuclear families were significantly higher than those from single-parent families. Bereaved parents' ability to have another child was a significant predictor of positive family function and cohesion. Poorer family function, cohesion, and adaptability were significantly related to greater loneliness. CONCLUSIONS/SIGNIFICANCE: This study found a high prevalence of family dysfunction in bereaved individuals eighteen months after the 2008 Wenchuan earthquake. Strategies can be designed to facilitate post-disaster recovery, particularly for the bereaved at high risk of family dysfunction. The study provides useful information for post-disaster rebuilding and relief work.

  15. Earthquake evaluation of a substation network

    International Nuclear Information System (INIS)

    Matsuda, E.N.; Savage, W.U.; Williams, K.K.; Laguens, G.C.

    1991-01-01

The impact of the occurrence of a large, damaging earthquake on a regional electric power system is a function of the geographical distribution of strong shaking, the vulnerability of various types of electric equipment located within the affected region, and the operational resources available to maintain or restore electric system functionality. Experience from numerous worldwide earthquake occurrences has shown that seismic damage to high-voltage substation equipment is typically the reason for post-earthquake loss of electric service. In this paper, the authors develop and apply a methodology to analyze earthquake impacts on Pacific Gas and Electric Company's (PG&E's) high-voltage electric substation network in central and northern California. The authors' objectives are to identify and prioritize ways to reduce the potential impact of future earthquakes on the electric system, refine PG&E's earthquake preparedness and response plans to be more realistic, and optimize seismic criteria for future equipment purchases for the electric system.

  16. Hydrodynamic study of the turbulent fluidized beds; Etude hydrodynamique des lits fluidises turbulents

    Energy Technology Data Exchange (ETDEWEB)

    Taxil, I.

    1996-12-20

Gas-solid turbulent fluidization has already been widely studied in the literature. However, its definition and specificities remain controversial and confused. Most of the studies focussed on the turbulent transition velocities are based on studies of wall pressure drop fluctuations. In this work, we first characterize the turbulent regime with the classical study of pressure drop signals by standard deviation analysis, completed with a more specific frequency analysis and also by a stochastic analysis. Then, we evaluate bubble flow properties. Experimental results have been obtained in a 0.2 m I.D. fluidized bed expanding to 0.4 m I.D. in the freeboard in order to limit entrainment at high fluidization velocities. The solid used was FCC catalyst. It was fluidized by air at ambient conditions. The superficial fluidization velocity ranged from 0.2 to 2 m/s. Fast-response transducers recorded the pressure drop at the wall, and bubble flow properties (bubble size, bubble velocity and bubble frequency) could be deduced from a light signal reflected at various bed locations with optical fibers. It has been shown that the turbulent regime is delimited by two velocities: Uc (onset of the turbulent regime) and Utr (onset of the transport regime), which can be determined based on the standard deviations, dominant frequencies and width of the wave band of the pressure signals. The stochastic analysis confirms that the signal becomes richer in frequencies in the turbulent regime. Bubble size and bubble velocity could be correlated to the main superficial gas velocity. The main change in bubble flow in the turbulent regime was shown to be the stagnation of the bubble frequency at its maximum value. It was also shown that the bubble flow properties in the turbulent regime imply a strong aeration of the emulsion phase. (authors) 76 refs.
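As a hedged illustration of the standard-deviation criterion described in this record (a sketch, not the authors' processing chain): the fluctuation amplitude of the wall pressure signal is computed at each superficial velocity, and Uc is taken as the velocity where it peaks. The peak at 0.9 m/s below is synthetic, chosen only for the demonstration.

```python
import math
import random

def std_dev(signal):
    """Standard deviation of a pressure-fluctuation record."""
    mean = sum(signal) / len(signal)
    return math.sqrt(sum((x - mean) ** 2 for x in signal) / len(signal))

def locate_uc(records):
    """records: list of (superficial_velocity_m_s, pressure_samples).
    Returns the velocity at which the fluctuation amplitude peaks."""
    sigmas = [(u, std_dev(p)) for u, p in records]
    return max(sigmas, key=lambda t: t[1])[0]

# Synthetic demonstration: fluctuation amplitude rising to a peak near 0.9 m/s.
random.seed(0)
records = []
for u in [0.2, 0.4, 0.6, 0.8, 0.9, 1.0, 1.2, 1.5, 2.0]:
    amplitude = math.exp(-((u - 0.9) ** 2) / 0.02)  # peak at u = 0.9 m/s
    samples = [amplitude * random.gauss(0, 1) for _ in range(500)]
    records.append((u, samples))

print(locate_uc(records))
```

With real data the same peak-picking step would be applied to the measured pressure-drop records at each velocity.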

  17. Update earthquake risk assessment in Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

The Cairo earthquake (12 October 1992; mb = 5.8) is still, 25 years later, one of the most painful events etched into Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damage (561 dead, 10,000 injured, and 3000 families who lost their homes). Nowadays, the most important question that arises is "what if this earthquake were repeated today?" In this study, we simulate the ground motion shaking of an earthquake of the same size (12 October 1992) and the consequent socio-economic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Generally, the earthquake risk assessment clearly indicates that the losses and damage may increase two- or threefold in Cairo compared to the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie at high seismic risk, and three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimates indicate that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk. Deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb). Moreover, about 75 % of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management.
Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  18. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave-propagation computations, the relations between the seismic strength of the earthquake, focal depth, distance, and ground acceleration are calculated. We found that most Swedish earthquakes are shallow; the strongest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get an epicentral acceleration of 5-15 % g for this earthquake. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)
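The proposed annual probability of 10^-5 can be put in perspective with a short, hedged calculation (an illustration under an assumed independent-years occurrence model, not part of the original study): the chance of at least one exceedance over a facility lifetime is 1 - (1 - p)^years.

```python
# Convert an annual exceedance probability into a lifetime exceedance
# probability, assuming statistically independent years (an assumption
# made here for illustration; the record itself does not state a model).
annual_p = 1e-5

def lifetime_exceedance(annual_p, years):
    """P(at least one exceedance in `years` years) = 1 - (1 - p)^years."""
    return 1.0 - (1.0 - annual_p) ** years

for years in (40, 100):
    print(years, lifetime_exceedance(annual_p, years))
```

For small p the result is close to p multiplied by the number of years, e.g. roughly 4e-4 over a 40-year plant life.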

  19. Earthquake damage to underground facilities

    International Nuclear Information System (INIS)

    Pratt, H.R.; Hustrulid, W.A.; Stephenson, D.E.

    1978-11-01

The potential seismic risk for an underground nuclear waste repository will be one of the considerations in evaluating its ultimate location. However, the risk to subsurface facilities cannot be judged by applying intensity ratings derived from the surface effects of an earthquake. A literature review and analysis were performed to document the damage and non-damage due to earthquakes to underground facilities. Damage from earthquakes to tunnels, shafts, and wells and damage (rock bursts) from mining operations were investigated. Damage from documented nuclear events was also included in the study where applicable. There are very few data on damage in the subsurface due to earthquakes. This fact itself attests to the lessened effect of earthquakes in the subsurface, because mines exist in areas where strong earthquakes have done extensive surface damage. More damage is reported in shallow tunnels near the surface than in deep mines. In mines and tunnels, large displacements occur primarily along pre-existing faults and fractures or at the surface entrance to these facilities. Data indicate vertical structures such as wells and shafts are less susceptible to damage than surface facilities. More analysis is required before seismic criteria can be formulated for the siting of a nuclear waste repository.

  20. 4th European Turbulence Conference

    CERN Document Server

    1993-01-01

    The European Turbulence Conferences have been organized under the auspices of the European Mechanics Committee (Euromech) to provide a forum for discussion and exchange of recent and new results in the field of turbulence. The first conference was organized in Lyon in 1986 with 152 participants. The second and third conferences were held in Berlin (1988) and Stockholm (1990) with 165 and 172 participants respectively. The fourth was organized in Delft from 30 June to 3 July 1992 by the J.M. Burgers Centre. There were 214 participants from 22 countries. This steadily growing number of participants demonstrates both the success and need for this type of conference. The main topics of the Fourth European Turbulence Conference were: Dynamical Systems and Transition; Statistical Physics and Turbulence; Experiments and Novel Experimental Techniques; Particles and Bubbles in Turbulence; Simulation Methods; Coherent Structures; Turbulence Modelling and Compressibility Effects. In addition a special session was held o...

  1. Statistical properties of turbulence: An overview

    Indian Academy of Sciences (India)

the turbulent advection of passive scalars, turbulence in the one-dimensional Burgers equation, and fluid turbulence in the presence of polymer ... However, it is not easy to state what would constitute a solution of the turbulence ... flow with Lagrangian tracers and use a cubic spline interpolation method to calculate their ...

  2. Thermal infrared anomalies of several strong earthquakes.

    Science.gov (United States)

    Wei, Congxin; Zhang, Yuansheng; Guo, Xiao; Hui, Shaoxing; Qin, Manzhong; Zhang, Ying

    2013-01-01

In the history of earthquake thermal infrared research, it is undeniable that before and after strong earthquakes there are significant thermal infrared anomalies, which have been interpreted as preseismic precursors in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes with magnitudes of Ms 7.0 and above, using satellite infrared remote sensing information. We used new types of data and methods to extract the useful anomaly information. Based on the analyses of the 8 earthquakes, we obtained the following results. (1) There are significant thermal radiation anomalies before and after earthquakes for all cases. The overall performance of the anomalies includes two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies by the method of "time-frequency relative power spectrum." (2) There exist evident and distinct characteristic periods and magnitudes of anomalous thermal radiation for each case. (3) Thermal radiation anomalies are closely related to the geological structure. (4) Thermal radiation has obvious characteristics in anomaly duration, range, and morphology. In summary, earthquake thermal infrared anomalies can serve as a useful precursor in earthquake prediction and forecasting.
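A minimal sketch of a relative-power-spectrum style analysis (the paper's exact "time-frequency relative power spectrum" algorithm is not reproduced here; the window length, background choice, and synthetic signal below are illustrative assumptions): power spectra are computed in sliding windows and divided by a quiet-time background spectrum, so anomalous radiation stands out as relative power much greater than 1.

```python
import math

def power_spectrum(window):
    """Naive DFT power spectrum (fine for short illustration windows)."""
    n = len(window)
    powers = []
    for k in range(n // 2):
        re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(window))
        im = -sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(window))
        powers.append((re * re + im * im) / n)
    return powers

def relative_power(series, window_len, background):
    """Windowed spectra expressed relative to a quiet-time background spectrum."""
    out = []
    for start in range(0, len(series) - window_len + 1, window_len):
        spec = power_spectrum(series[start:start + window_len])
        out.append([p / b if b > 0 else 0.0 for p, b in zip(spec, background)])
    return out

# Synthetic demo: a quiet background plus one window with an injected oscillation.
quiet = [0.1] * 32
background = [p + 1e-6 for p in power_spectrum(quiet)]  # small floor avoids 0-division
series = quiet * 2 + [0.1 + math.sin(2 * math.pi * 4 * i / 32) for i in range(32)]
rel = relative_power(series, 32, background)
print(max(rel[-1]))  # the anomalous window shows strongly elevated relative power
```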

  3. ADIABATIC HEATING OF CONTRACTING TURBULENT FLUIDS

    International Nuclear Information System (INIS)

    Robertson, Brant; Goldreich, Peter

    2012-01-01

    Turbulence influences the behavior of many astrophysical systems, frequently by providing non-thermal pressure support through random bulk motions. Although turbulence is commonly studied in systems with constant volume and mean density, turbulent astrophysical gases often expand or contract under the influence of pressure or gravity. Here, we examine the behavior of turbulence in contracting volumes using idealized models of compressed gases. Employing numerical simulations and an analytical model, we identify a simple mechanism by which the turbulent motions of contracting gases 'adiabatically heat', experiencing an increase in their random bulk velocities until the largest eddies in the gas circulate over a Hubble time of the contraction. Adiabatic heating provides a mechanism for sustaining turbulence in gases where no large-scale driving exists. We describe this mechanism in detail and discuss some potential applications to turbulence in astrophysical settings.

  4. Exploiting similarity in turbulent shear flows for turbulence modeling

    Science.gov (United States)

    Robinson, David F.; Harris, Julius E.; Hassan, H. A.

    1992-12-01

It is well known that current k-epsilon models cannot predict the flow over a flat plate and its wake. In an effort to address this issue and other issues associated with turbulence closure, a new approach for turbulence modeling is proposed which exploits similarities in the flow field. Thus, if we consider the flow over a flat plate and its wake, then in addition to taking advantage of the log-law region, we can exploit the fact that the flow becomes self-similar in the far wake. This latter behavior makes it possible to cast the governing equations as a set of total differential equations. Solutions of this set and comparison with measured shear stress and velocity profiles yield the desired set of model constants. Such a set is, in general, different from other sets of model constants. The rationale for such an approach is that if we can correctly model the flow over a flat plate and its far wake, then we have a better chance of predicting the behavior in between. It is to be noted that the approach does not appeal, in any way, to the decay of homogeneous turbulence. This is because the asymptotic behavior of the flow under consideration is not representative of the decay of homogeneous turbulence.

  5. Exploiting similarity in turbulent shear flows for turbulence modeling

    Science.gov (United States)

    Robinson, David F.; Harris, Julius E.; Hassan, H. A.

    1992-01-01

It is well known that current k-epsilon models cannot predict the flow over a flat plate and its wake. In an effort to address this issue and other issues associated with turbulence closure, a new approach for turbulence modeling is proposed which exploits similarities in the flow field. Thus, if we consider the flow over a flat plate and its wake, then in addition to taking advantage of the log-law region, we can exploit the fact that the flow becomes self-similar in the far wake. This latter behavior makes it possible to cast the governing equations as a set of total differential equations. Solutions of this set and comparison with measured shear stress and velocity profiles yield the desired set of model constants. Such a set is, in general, different from other sets of model constants. The rationale for such an approach is that if we can correctly model the flow over a flat plate and its far wake, then we have a better chance of predicting the behavior in between. It is to be noted that the approach does not appeal, in any way, to the decay of homogeneous turbulence. This is because the asymptotic behavior of the flow under consideration is not representative of the decay of homogeneous turbulence.

  6. Earthquake Warning Performance in Vallejo for the South Napa Earthquake

    Science.gov (United States)

    Wurman, G.; Price, M.

    2014-12-01

In 2002 and 2003, Seismic Warning Systems, Inc. installed first-generation QuakeGuard™ earthquake warning devices at all eight fire stations in Vallejo, CA. These devices are designed to detect the P-wave of an earthquake and initiate predetermined protective actions if the impending shaking is estimated at approximately Modified Mercalli Intensity V or greater. At the Vallejo fire stations the devices were set up to sound an audio alert over the public address system and to command the equipment bay doors to open. In August 2014, after more than 11 years of operating in the fire stations with no false alarms, the five units that were still in use triggered correctly on the MW 6.0 South Napa earthquake, less than 16 km away. The audio alert sounded in all five stations, providing fire fighters with 1.5 to 2.5 seconds of warning before the arrival of the S-wave, and the equipment bay doors opened in three of the stations. In one station the doors were disconnected from the QuakeGuard device, and another station lost power before the doors opened completely. These problems highlight just a small portion of the complexity associated with realizing actionable earthquake warnings. The issues experienced in this earthquake have already been addressed in subsequent QuakeGuard product generations, with downstream connection monitoring and backup power for critical systems. The fact that the fire fighters in Vallejo were afforded even two seconds of warning at these epicentral distances results from the design of the QuakeGuard devices, which focuses on rapid false-positive rejection and ground motion estimates. We discuss the performance of the ground motion estimation algorithms, with an emphasis on the accuracy and timeliness of the estimates at close epicentral distances.
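The reported 1.5 to 2.5 seconds of warning at under 16 km is consistent with a back-of-the-envelope S-minus-P travel-time estimate (the wave velocities below are typical crustal values assumed for illustration, not figures from the abstract):

```python
# An on-site P-wave trigger can lead the S-wave arrival by roughly
# d * (1/vs - 1/vp); any detection/processing latency subtracts from this.
def s_minus_p_lead(distance_km, vp_km_s=6.0, vs_km_s=3.5):
    """Time between P and S arrivals at a site `distance_km` from the source."""
    return distance_km * (1.0 / vs_km_s - 1.0 / vp_km_s)

lead = s_minus_p_lead(16.0)
print(round(lead, 2))  # ~1.9 s, before any detection/processing latency
```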

  7. Automatic Earthquake Detection by Active Learning

    Science.gov (United States)

    Bergen, K.; Beroza, G. C.

    2017-12-01

    In recent years, advances in machine learning have transformed fields such as image recognition, natural language processing and recommender systems. Many of these performance gains have relied on the availability of large, labeled data sets to train high-accuracy models; labeled data sets are those for which each sample includes a target class label, such as waveforms tagged as either earthquakes or noise. Earthquake seismologists are increasingly leveraging machine learning and data mining techniques to detect and analyze weak earthquake signals in large seismic data sets. One of the challenges in applying machine learning to seismic data sets is the limited labeled data problem; learning algorithms need to be given examples of earthquake waveforms, but the number of known events, taken from earthquake catalogs, may be insufficient to build an accurate detector. Furthermore, earthquake catalogs are known to be incomplete, resulting in training data that may be biased towards larger events and contain inaccurate labels. This challenge is compounded by the class imbalance problem; the events of interest, earthquakes, are infrequent relative to noise in continuous data sets, and many learning algorithms perform poorly on rare classes. In this work, we investigate the use of active learning for automatic earthquake detection. Active learning is a type of semi-supervised machine learning that uses a human-in-the-loop approach to strategically supplement a small initial training set. The learning algorithm incorporates domain expertise through interaction between a human expert and the algorithm, with the algorithm actively posing queries to the user to improve detection performance. We demonstrate the potential of active machine learning to improve earthquake detection performance with limited available training data.
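The pool-based uncertainty-sampling loop described in this record can be sketched as follows (a toy illustration of the idea, with a simple nearest-centroid classifier standing in for a real detector; none of this is the authors' code):

```python
# Toy pool-based active learning with uncertainty sampling: the classifier
# is retrained as an "expert" (the oracle) labels the unlabeled feature
# vector about which the model is least certain.
def centroid(points):
    return [sum(c) / len(points) for c in zip(*points)]

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def uncertainty(x, c0, c1):
    """Least-confidence score: a small margin between class distances = uncertain."""
    return -abs(dist(x, c0) - dist(x, c1))

def active_learning(labeled, pool, oracle, queries):
    """labeled: {0: [...], 1: [...]}; pool: unlabeled feature vectors;
    oracle: callable giving the true label (the human expert in the loop)."""
    pool = list(pool)
    for _ in range(queries):
        c0, c1 = centroid(labeled[0]), centroid(labeled[1])
        x = max(pool, key=lambda p: uncertainty(p, c0, c1))  # query most uncertain
        pool.remove(x)
        labeled[oracle(x)].append(x)
    c0, c1 = centroid(labeled[0]), centroid(labeled[1])
    return lambda p: 0 if dist(p, c0) < dist(p, c1) else 1

# Demo: 1-D "features" where noise clusters near 0 and earthquakes near 1.
labeled = {0: [[0.0]], 1: [[1.0]]}
pool = [[0.1], [0.45], [0.55], [0.9]]
classify = active_learning(labeled, pool, oracle=lambda p: int(p[0] > 0.5), queries=2)
print(classify([0.2]), classify([0.8]))
```

In practice the feature vectors would be waveform attributes and the oracle a seismologist reviewing candidate detections; the point of the sketch is only the query-by-uncertainty loop.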

  8. Measurements of Turbulence Attenuation by a Dilute Dispersion of Solid Particles in Homogeneous Isotropic Turbulence

    Science.gov (United States)

    Eaton, John; Hwang, Wontae; Cabral, Patrick

    2002-11-01

    This research addresses turbulent gas flows laden with fine solid particles at sufficiently large mass loading that strong two-way coupling occurs. By two-way coupling we mean that the particle motion is governed largely by the flow, while the particles affect the gas-phase mean flow and the turbulence properties. Our main interest is in understanding how the particles affect the turbulence. Computational techniques have been developed which can accurately predict flows carrying particles that are much smaller than the smallest scales of turbulence. Also, advanced computational techniques and burgeoning computer resources make it feasible to fully resolve very large particles moving through turbulent flows. However, flows with particle diameters of the same order as the Kolmogorov scale of the turbulence are notoriously difficult to predict. Some simple flows show strong turbulence attenuation with reductions in the turbulent kinetic energy by up to a factor of five. On the other hand, some seemingly similar flows show almost no modification. No model has been proposed that allows prediction of when the strong attenuation will occur. Unfortunately, many technological and natural two-phase flows fall into this regime, so there is a strong need for new physical understanding and modeling capability. Our objective is to study the simplest possible turbulent particle-laden flow, namely homogeneous, isotropic turbulence with a uniform dispersion of monodisperse particles. We chose such a simple flow for two reasons. First, the simplicity allows us to probe the interaction in more detail and offers analytical simplicity in interpreting the results. Secondly, this flow can be addressed by numerical simulation, and many research groups are already working on calculating the flow. Our detailed data can help guide some of these efforts. By using microgravity, we can further simplify the flow to the case of no mean velocity for either the turbulence or the particles. 
In fact

  9. Indoor radon and earthquake

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

For the first time, on the basis of experience from the Spitak earthquake (Armenia, December 1988), it is found that an earthquake causes intensive and prolonged radon splashes which, rapidly dispersing in the open near-ground atmosphere, are contrastingly displayed in covered premises (dwellings, schools, kindergartens) even if they are at a considerable distance from the earthquake epicenter, and this multiplies the radiation influence on the population. The interval of splashes covers the period from the first foreshock to the last aftershock, i.e. several months. The area affected by radiation is larger than Armenia's territory. The scale of this impact on the population is 12 times higher than the number of people injured in Spitak, Leninakan and other settlements (toll of injured: 25 000 people; radiation-induced diseases: over 300 000 people). The influence of radiation directly correlates with the earthquake force. Such a conclusion is underpinned by indoor radon monitoring data for Yerevan since 1987 (120 km from the epicenter; 5450 measurements) and by multivariate analysis with identification of cause-and-effect linkages between the geodynamics of indoor radon under stable and unstable conditions of the Earth's crust, the behavior of radon in different geological media during earthquakes, levels of room radon concentrations and effective equivalent radiation dose, the impact of radiation dose on health, and statistical data on public health provided by the Ministry of Health. The following hitherto unexplained facts can be considered consequences of prolonged radiation influence on the human organism: the long-lasting state of apathy and indifference typical of the population of Armenia during the period of more than a year after the earthquake, the prevalence of malignant cancer forms in disaster zones, dominated by lung cancer, and so on.
All urban territories of seismically active regions are exposed to the threat of natural earthquake-provoked radiation influence

  10. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

    Science.gov (United States)

    Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

    2015-06-01

The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of the epicenter of an earthquake is one difficult challenge, the others being the timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity; the reported FDL method belongs to this category. The basis of the reported FDL method is the creation of FDL future dates based on the onset dates of significant earthquakes, the assumption being that each earthquake discontinuity can be thought of as a generating source of an FDL time series. The connection between past and future earthquakes based on FDL numbers has also been reported with sample earthquakes since 1900. Using clustering methods it has been shown that significant earthquakes cluster around planetary trigger dates (conjunct Sun, Moon opposite Sun, Moon conjunct or opposite the North or South Nodes). In order to test the improvement of the method we used all +8R earthquakes recorded since 1900 (86 earthquakes from USGS data). We developed the FDL numbers for each of those seeds, and examined the earthquake hit rates (for a window of 3 days, i.e. ±1 day of the target date) and for <6.5R. The successes are counted for each one of the 86 earthquake seeds and we compare the MFDL method with the FDL method. In every case we find improvement when the starting seed date is the planetary trigger date prior to the earthquake. We observe no improvement only when a planetary trigger coincided with the earthquake date, in which case the FDL method coincides with the MFDL. Based on the MFDL method we present a prediction method capable of predicting global events or localized earthquakes, and we discuss the accuracy of the method as far as its prediction and location parts are concerned. We show example calendar-style predictions for global events as well as for the Greek region using
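A hypothetical reconstruction of the FDL date generation (the paper's exact recipe is not given in this record; treating Fibonacci and Lucas numbers as day offsets from a seed date, with a ±1 day hit window, is an assumption made purely for illustration):

```python
from datetime import date, timedelta

def fib_lucas_numbers(limit):
    """Fibonacci (1, 2, 3, 5, ...) and Lucas (1, 3, 4, 7, ...) numbers up to `limit`."""
    out = set()
    for a, b in ((1, 2), (1, 3)):  # Fibonacci and Lucas starting pairs
        while a <= limit:
            out.add(a)
            a, b = b, a + b
    return sorted(out)

def fdl_dates(seed, horizon_days):
    """Candidate future dates: the seed date plus each FDL number of days."""
    return [seed + timedelta(days=n) for n in fib_lucas_numbers(horizon_days)]

def is_hit(target, candidates, window_days=1):
    """A 'hit' is a target event within +-window_days of any candidate date."""
    return any(abs((target - c).days) <= window_days for c in candidates)

seed = date(1992, 10, 12)  # an arbitrary example seed date
candidates = fdl_dates(seed, 400)
print(is_hit(seed + timedelta(days=233), candidates))  # 233 is a Fibonacci number
```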

  11. Tearing instabilities in turbulence

    International Nuclear Information System (INIS)

    Ishizawa, A.; Nakajima, N.

    2009-01-01

Full text: Effects of micro-turbulence on tearing instabilities are investigated by numerically solving a reduced set of two-fluid equations. Micro-turbulence excites both large-scale and small-scale Fourier modes through energy transfer due to nonlinear mode coupling. The energy transfer to large-scale modes does not directly excite the tearing instability, but it provides an initiation of the tearing instability. When the tearing instability starts to grow, the excited small-scale modes play an important role: the mixing of magnetic flux by micro-turbulence is the dominant non-ideal MHD effect at the resonant surface, and it gives rise to the magnetic reconnection which causes the tearing instability. Tearing instabilities have so far been investigated against static or flowing equilibria. On the other hand, the recent progress of computer power allows us to investigate interactions between turbulence and coherent modes such as tearing instabilities in magnetically confined plasmas by means of direct numerical simulations. In order to investigate effects of turbulence on tearing instabilities, we consider a situation in which the tearing mode is destabilized in a quasi-equilibrium including micro-turbulence. We choose an initial equilibrium that is unstable against kinetic ballooning modes and tearing instabilities. Tearing instabilities are current-driven modes and thus are unstable for large-scale Fourier modes. On the other hand, kinetic ballooning modes are unstable for poloidal Fourier modes that are characterized by the ion Larmor radius. The energy of kinetic ballooning modes spreads over wave-number space through nonlinear Fourier mode coupling. We show, by three-dimensional numerical simulation of a reduced set of two-fluid equations, that micro-turbulence affects tearing instabilities in two different ways: one is caused by energy transfer to large-scale modes, the other by energy transfer to small-scale modes. The former is the excitation of initial

  12. Fault geometry and earthquake mechanics

    Directory of Open Access Journals (Sweden)

    D. J. Andrews

    1994-06-01

Full Text Available Earthquake mechanics may be determined by the geometry of a fault system. Slip on a fractal branching fault surface can explain: (1) regeneration of stress irregularities in an earthquake; (2) the concentration of stress drop in an earthquake into asperities; (3) starting and stopping of earthquake slip at fault junctions; and (4) self-similar scaling of earthquakes. Slip at fault junctions provides a natural realization of barrier and asperity models without appealing to variations of fault strength. Fault systems are observed to have a branching fractal structure, and slip may occur at many fault junctions in an earthquake. Consider the mechanics of slip at one fault junction. In order to avoid a stress singularity of order 1/r, an intersection of faults must be a triple junction, and the Burgers vectors on the three fault segments at the junction must sum to zero. In other words, to lowest order the deformation consists of rigid block displacement, which ensures that the local stress due to the dislocations is zero. The elastic dislocation solution, however, ignores the fact that the configuration of the blocks changes at the scale of the displacement. A volume change occurs at the junction; either a void opens or intense local deformation is required to avoid material overlap. The volume change is proportional to the product of the slip increment and the total slip since the formation of the junction. Energy absorbed at the junction, equal to the confining pressure times the volume change, is not large enough to prevent slip at a new junction. The ratio of energy absorbed at a new junction to elastic energy released in an earthquake is no larger than P/µ, where P is confining pressure and µ is the shear modulus. At a depth of 10 km this dimensionless ratio has the value P/µ = 0.01. As slip accumulates at a fault junction in a number of earthquakes, the fault segments are displaced such that they no longer meet at a single point. For this reason the
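The quoted ratio P/µ = 0.01 at 10 km depth can be checked numerically (the crustal density and shear modulus below are typical values assumed for the check, not taken from the abstract):

```python
# Numerical check of the dimensionless ratio P/mu at 10 km depth,
# with P taken as the lithostatic confining pressure rho * g * depth.
rho = 2700.0      # crustal density, kg/m^3 (assumed typical value)
g = 9.8           # gravitational acceleration, m/s^2
depth = 10_000.0  # 10 km, in metres
mu = 3.0e10       # shear modulus, Pa (assumed typical crustal value)

P = rho * g * depth      # confining pressure, Pa (~2.6e8)
print(round(P / mu, 3))  # ~0.009, consistent with the quoted P/mu = 0.01
```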

  13. Group-kinetic theory and modeling of atmospheric turbulence

    Science.gov (United States)

    Tchen, C. M.

    1989-01-01

A group kinetic method is developed for analyzing eddy transport properties and relaxation to equilibrium. The purpose is to derive the spectral structure of turbulence in incompressible and compressible media. Of particular interest are: direct and inverse cascades, boundary-layer turbulence, Rossby wave turbulence, two-phase turbulence, compressible turbulence, and soliton turbulence. Soliton turbulence can be found in large-scale turbulence, turbulence connected with surface gravity waves, and nonlinear propagation of acoustical and optical waves. By letting the pressure gradient represent the elementary interaction among fluid elements and by raising the Navier-Stokes equation to higher dimensionality, the master equation was obtained for the description of the microdynamical state of turbulence.

  14. Strong Turbulence in Low-beta Plasmas

    DEFF Research Database (Denmark)

    Tchen, C. M.; Pécseli, Hans; Larsen, Søren Ejling

    1980-01-01

An investigation of the spectral structure of turbulence in a plasma confined by a strong homogeneous magnetic field was made by means of a fluid description. The turbulent spectrum is divided into subranges. Mean gradients of velocity and density excite turbulent motions, and govern the production ... -cathode reflex arc, Stellarator, Zeta discharge, ionospheric plasmas, and auroral plasma turbulence.

  15. Advances in compressible turbulent mixing

    International Nuclear Information System (INIS)

    Dannevik, W.P.; Buckingham, A.C.; Leith, C.E.

    1992-01-01

    This volume includes some recent additions to original material prepared for the Princeton International Workshop on the Physics of Compressible Turbulent Mixing, held in 1988. Workshop participants were asked to emphasize the physics of the compressible mixing process rather than measurement techniques or computational methods. Actual experimental results and their meaning were given precedence over discussions of new diagnostic developments. Theoretical interpretations and understanding were stressed rather than the exposition of new analytical model developments or advances in numerical procedures. By design, compressibility influences on turbulent mixing were discussed--almost exclusively--from the perspective of supersonic flow field studies. The papers are arranged in three topical categories: Foundations, Vortical Domination, and Strongly Coupled Compressibility. The Foundations category is a collection of seminal studies that connect current study in compressible turbulent mixing with compressible, high-speed turbulent flow research that almost vanished about two decades ago. A number of contributions are included on flow instability initiation, evolution, and transition between the states of unstable flow onset through those descriptive of fully developed turbulence. The Vortical Domination category includes theoretical and experimental studies of coherent structures, vortex pairing, vortex-dynamics-influenced pressure focusing. In the Strongly Coupled Compressibility category the organizers included the high-speed turbulent flow investigations in which the interaction of shock waves could be considered an important source for production of new turbulence or for the enhancement of pre-existing turbulence. Individual papers are processed separately

  16. Advances in compressible turbulent mixing

    Energy Technology Data Exchange (ETDEWEB)

    Dannevik, W.P.; Buckingham, A.C.; Leith, C.E. [eds.

    1992-01-01

    This volume includes some recent additions to original material prepared for the Princeton International Workshop on the Physics of Compressible Turbulent Mixing, held in 1988. Workshop participants were asked to emphasize the physics of the compressible mixing process rather than measurement techniques or computational methods. Actual experimental results and their meaning were given precedence over discussions of new diagnostic developments. Theoretical interpretations and understanding were stressed rather than the exposition of new analytical model developments or advances in numerical procedures. By design, compressibility influences on turbulent mixing were discussed--almost exclusively--from the perspective of supersonic flow field studies. The papers are arranged in three topical categories: Foundations, Vortical Domination, and Strongly Coupled Compressibility. The Foundations category is a collection of seminal studies that connect current study in compressible turbulent mixing with compressible, high-speed turbulent flow research that almost vanished about two decades ago. A number of contributions are included on flow instability initiation, evolution, and transition between the states of unstable flow onset through those descriptive of fully developed turbulence. The Vortical Domination category includes theoretical and experimental studies of coherent structures, vortex pairing, vortex-dynamics-influenced pressure focusing. In the Strongly Coupled Compressibility category the organizers included the high-speed turbulent flow investigations in which the interaction of shock waves could be considered an important source for production of new turbulence or for the enhancement of pre-existing turbulence. Individual papers are processed separately.

  17. Determination of Design Basis Earthquake ground motion

    Energy Technology Data Exchange (ETDEWEB)

    Kato, Muneaki [Japan Atomic Power Co., Tokyo (Japan)

    1997-03-01

    This paper describes the principles of determining the Design Basis Earthquake following the Examination Guide, with some examples from actual sites, including earthquake sources to be considered, earthquake response spectra and simulated seismic waves. In the appendix, furthermore, the seismic safety review of N.P.P.s designed before publication of the Examination Guide is summarized with the Check Basis Earthquake. (J.P.N.)

  18. Determination of Design Basis Earthquake ground motion

    International Nuclear Information System (INIS)

    Kato, Muneaki

    1997-01-01

    This paper describes the principles of determining the Design Basis Earthquake following the Examination Guide, with some examples from actual sites, including earthquake sources to be considered, earthquake response spectra and simulated seismic waves. In the appendix, furthermore, the seismic safety review of N.P.P.s designed before publication of the Examination Guide is summarized with the Check Basis Earthquake. (J.P.N.)

  19. Impact- and earthquake- proof roof structure

    International Nuclear Information System (INIS)

    Shohara, Ryoichi.

    1990-01-01

    Building roofs are constituted of roof slabs, an earthquake-proof layer on their upper surface, and an impact-proof layer of iron-reinforced concrete disposed further above. Since the roofs form an earthquake-proof structure, with building dampers loaded on the upper surface of the slabs beneath the concrete layer, the seismic input of earthquakes to the building can be moderated, and the impact-proof layer ensures safety against external events such as earthquakes or falling accidents of airplanes at important facilities such as reactor buildings. (T.M.)

  20. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

    Science.gov (United States)

    Earle, Paul S.; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

    2009-01-01

    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.
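The core of a PAGER-style exposure estimate is an overlay of a gridded population layer on a gridded shaking-intensity layer, summing population per intensity bin. The grids and values below are synthetic stand-ins; the operational system uses LandScan population data and ShakeMap output:

```python
import numpy as np

# Toy sketch of a PAGER-style exposure calculation: overlay a gridded
# population layer on a co-registered shaking-intensity grid and sum the
# population in each Modified Mercalli Intensity (MMI) bin.
population = np.array([[1000,  500,  200],
                       [3000, 1500,  100],
                       [ 800,  400,   50]], dtype=float)
mmi = np.array([[4.2, 5.1, 6.3],
                [5.8, 7.4, 8.1],
                [4.9, 6.7, 7.2]])

exposure = {}
for lo in range(4, 9):                      # MMI bins IV..VIII
    mask = (mmi >= lo) & (mmi < lo + 1)
    exposure[lo] = population[mask].sum()

for lo, pop in exposure.items():
    print(f"MMI {lo}: {pop:.0f} people exposed")
```

The same masking pattern extends directly to loss estimation once an intensity-dependent fatality rate is attached to each bin.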

  1. (Multi)fractality of Earthquakes by use of Wavelet Analysis

    Science.gov (United States)

    Enescu, B.; Ito, K.; Struzik, Z. R.

    2002-12-01

    The fractal character of earthquakes' occurrence, in time, space or energy, has by now been established beyond doubt and is in agreement with modern models of seismicity. Moreover, the cascade-like generation process of earthquakes -with one "main" shock followed by many aftershocks, having their own aftershocks- may well be described through multifractal analysis, well suited for dealing with such multiplicative processes. The (multi)fractal character of seismicity has been analysed so far by using traditional techniques, like the box-counting and correlation function algorithms. This work introduces a new approach for characterising the multifractal patterns of seismicity. The use of wavelet analysis, in particular of the wavelet transform modulus maxima, for multifractal analysis was pioneered by Arneodo et al. (1991, 1995) and applied successfully in diverse fields, such as the study of turbulence, DNA sequences and heart-rate dynamics. The wavelets act like a microscope, revealing details about the analysed data at different times and scales. We introduce and perform such an analysis on the occurrence time of earthquakes and show its advantages. In particular, we analyse shallow seismicity, characterised by a high aftershock "productivity", as well as intermediate and deep seismic activity, known for its scarcity of aftershocks. We also examine declustered (aftershocks removed) versions of seismic catalogues. Our preliminary results show some degree of multifractality for the undeclustered, shallow seismicity. On the other hand, at large scales, we detect a monofractal scaling behaviour, clearly evidenced for the declustered, shallow seismic activity. Moreover, some of the declustered sequences show long-range dependent (LRD) behaviour, characterised by a Hurst exponent H > 0.5, in contrast with the memory-less, Poissonian model. We demonstrate that the LRD is a genuine characteristic and is not an effect of the time series probability
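The Hurst-exponent comparison against the memory-less Poissonian model can be illustrated with a simple aggregated-variance estimator (this is a stand-in sketch, not the authors' wavelet-modulus-maxima method): for an LRD series the variance of block means scales as m^(2H-2), so independent exponential waiting times should give H near 0.5.

```python
import numpy as np

# Aggregated-variance sketch of Hurst-exponent estimation on inter-event
# times. For memoryless (Poissonian) waiting times H ~ 0.5 is expected;
# H > 0.5 would indicate long-range dependence.
rng = np.random.default_rng(0)
x = rng.exponential(scale=100.0, size=4096)   # synthetic memoryless waiting times

sizes = [4, 8, 16, 32, 64, 128]
variances = []
for m in sizes:
    blocks = x[: len(x) // m * m].reshape(-1, m).mean(axis=1)
    variances.append(blocks.var())

# Var(block mean) ~ m^(2H-2), so the log-log slope gives 2H - 2.
slope, _ = np.polyfit(np.log(sizes), np.log(variances), 1)
H = 1.0 + slope / 2.0
print(f"estimated H = {H:.2f}")
```

Running the same estimator on real inter-event times, before and after declustering, is one quick way to reproduce the H > 0.5 signature the abstract describes.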

  2. Hospital compliance with a state unfunded mandate: the case of California's Earthquake Safety Law.

    Science.gov (United States)

    McCue, Michael J; Thompson, Jon M

    2012-01-01

    In recent years, community hospitals have experienced heightened regulation, with many unfunded mandates. The authors assessed the market, organizational, operational, and financial characteristics of general acute care hospitals in California that have a main acute care hospital building that is noncompliant with state requirements and at risk of major structural collapse from earthquakes. Using California hospital data from 2007 to 2009, and employing logistic regression analysis, the authors found that hospitals having buildings that are at the highest risk of collapse are located in larger population markets, possess smaller market share, have a higher percentage of Medicaid patients, and have less liquidity.

  3. Application of τc*Pd for identifying damaging earthquakes for earthquake early warning

    Science.gov (United States)

    Huang, P. L.; Lin, T. L.; Wu, Y. M.

    2014-12-01

    An Earthquake Early Warning System (EEWS) is an effective approach to mitigating earthquake damage. In this study, we used seismic records from the Kiban Kyoshin network (KiK-net), because it has dense station coverage and co-located borehole strong-motion seismometers along with the free-surface strong-motion seismometers. We used inland earthquakes with moment magnitude (Mw) from 5.0 to 7.3 between 1998 and 2012. We chose 135 events and 10950 strong ground accelerograms recorded by 696 strong ground accelerographs. Both the free-surface and the borehole data are used to calculate τc and Pd. The results show that τc*Pd has a good correlation with PGV and is a robust parameter for assessing the potential of a damaging earthquake. We propose that the value of τc*Pd determined within seconds after the arrival of the P wave could serve as a threshold for the on-site type of EEW.
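The two parameters combined here are the Kanamori-style period parameter τc = 2π/√r, with r the ratio of integrated squared velocity to integrated squared displacement over the initial P-wave window, and Pd, the peak displacement in that window. The sketch below uses a synthetic sinusoidal displacement and an assumed 3 s window purely for illustration:

```python
import numpy as np

# Hedged sketch of the tau_c and Pd parameters from the first seconds of the
# P wave. u is displacement, v its time derivative; the 3 s window and the
# synthetic waveform are illustrative assumptions, not KiK-net data.
dt = 0.01
t = np.arange(0.0, 3.0, dt)                   # 3 s window after the P arrival
u = 0.002 * np.sin(2 * np.pi * 1.5 * t)       # synthetic displacement (m)
v = np.gradient(u, dt)                        # velocity (m/s)

r = (v**2).sum() / (u**2).sum()               # dt cancels in the ratio
tau_c = 2 * np.pi / np.sqrt(r)                # period parameter (s)
Pd = np.abs(u).max()                          # peak displacement (m)

print(f"tau_c = {tau_c:.2f} s, Pd = {Pd*100:.2f} cm, tau_c*Pd = {tau_c*Pd:.4f}")
```

For a pure 1.5 Hz sinusoid, τc recovers the dominant period 1/1.5 ≈ 0.67 s, which is a useful sanity check before applying the same two-line computation to real accelerogram-derived displacement.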

  4. De-trending of turbulence measurements

    DEFF Research Database (Denmark)

    Hansen, Kurt Schaldemose; Larsen, Gunner Chr.

    2006-01-01

    contribution to the wind speed turbulence intensity for a number of representative locations. A linear de-trending process has been implemented during indexing of the time-series. The observed de-trended turbulence intensities are reduced 3 – 15 % compared to the raw turbulence intensity. This reduction...... depends primarily on site characteristics and local mean wind speed variations. Reduced turbulence intensity will result in lower design fatigue loads. This aspect of de-trending is discussed by use of a simple heuristic load model. Finally an empirical model for de-trending wind resource data...
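The de-trending operation described above is simple to state: fit a linear trend to each 10-minute wind speed record, subtract it, and recompute the turbulence intensity TI = σ_u / U_mean from the residual fluctuations. The synthetic ramp and noise levels below are assumptions for illustration, not site data:

```python
import numpy as np

# Illustrative sketch of linear de-trending of a 10-min wind record and its
# effect on turbulence intensity TI = sigma_u / U_mean.
rng = np.random.default_rng(1)
dt = 1.0                                    # 1 Hz sampling
t = np.arange(0.0, 600.0, dt)               # 10-minute record
trend = 8.0 + 0.004 * t                     # slow mean-speed ramp, m/s
u = trend + rng.normal(0.0, 0.8, t.size)    # wind speed with fluctuations

U = u.mean()
ti_raw = u.std() / U                        # raw turbulence intensity

coeffs = np.polyfit(t, u, 1)                # fit and remove the linear trend
fluct = u - np.polyval(coeffs, t)
ti_detrended = fluct.std() / U

print(f"TI raw = {ti_raw:.3f}, TI de-trended = {ti_detrended:.3f}")
```

Because the slow ramp inflates the raw standard deviation, the de-trended TI comes out noticeably lower, which is the same qualitative 3-15 % reduction effect reported in the abstract.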

  5. The Road to Total Earthquake Safety

    Science.gov (United States)

    Frohlich, Cliff

    Cinna Lomnitz is possibly the most distinguished earthquake seismologist in all of Central and South America. Among many other credentials, Lomnitz has personally experienced the shaking and devastation that accompanied no fewer than five major earthquakes—Chile, 1939; Kern County, California, 1952; Chile, 1960; Caracas, Venezuela, 1967; and Mexico City, 1985. Thus he clearly has much to teach someone like myself, who has never even actually felt a real earthquake. What is this slim book? The Road to Total Earthquake Safety summarizes Lomnitz's May 1999 presentation at the Seventh Mallet-Milne Lecture, sponsored by the Society for Earthquake and Civil Engineering Dynamics. His arguments are motivated by the damage that occurred in three earthquakes—Mexico City, 1985; Loma Prieta, California, 1989; and Kobe, Japan, 1995. All three quakes occurred in regions where earthquakes are common. Yet in all three some of the worst damage occurred in structures located a significant distance from the epicenter and engineered specifically to resist earthquakes. Some of the damage also indicated that the structures failed because they had experienced considerable rotational or twisting motion. Clearly, Lomnitz argues, there must be fundamental flaws in the usually accepted models explaining how earthquakes generate strong motions, and how we should design resistant structures.

  6. Earthquake forecasting test for Kanto district to reduce vulnerability of urban mega earthquake disasters

    Science.gov (United States)

    Yokoi, S.; Tsuruoka, H.; Nanjo, K.; Hirata, N.

    2012-12-01

    The Collaboratory for the Study of Earthquake Predictability (CSEP) is a global project on earthquake predictability research. The final goal of this project is to search for the intrinsic predictability of the earthquake rupture process through forecast testing experiments. The Earthquake Research Institute of the University of Tokyo joined CSEP and started the Japanese testing center, called CSEP-Japan. This testing center provides open access to researchers contributing earthquake forecast models applied to Japan. More than 100 earthquake forecast models have now been submitted to the prospective experiment. The models are separated into 4 testing classes (1 day, 3 months, 1 year and 3 years) and 3 testing regions covering an area of Japan including the sea area, the Japanese mainland and the Kanto district. We evaluate the performance of the models in the official suite of tests defined by CSEP. Approximately 300 rounds of experiments have been implemented. These results provide new knowledge concerning statistical forecasting models. We have started a study on constructing a 3-dimensional earthquake forecasting model for the Kanto district in Japan based on CSEP experiments under the Special Project for Reducing Vulnerability for Urban Mega Earthquake Disasters. Because seismicity of the area ranges from the shallower part to a depth of 80 km due to the subducting Philippine Sea plate and Pacific plate, we need to study the effect of the depth distribution. We will develop models for forecasting based on the results of 2-D modeling. We defined the 3-D forecasting area in the Kanto region with test classes of 1 day, 3 months, 1 year and 3 years, and magnitudes from 4.0 to 9.0 as in CSEP-Japan. In the first step of the study, we will install the RI10K model (Nanjo, 2011) and the HISTETAS models (Ogata, 2011) to see whether those models perform as well as in the 3-month 2-D CSEP-Japan experiments in the Kanto region before the 2011 Tohoku event (Yokoi et al., in preparation). 
We use CSEP
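One of the simplest tests in the CSEP suite is the number test ("N-test"), which asks whether the observed event count is consistent with a model's total forecast rate under a Poisson assumption. The rates below are made-up illustrative numbers, and the stdlib-only CDF is a minimal stand-in for the testing center's implementation:

```python
import math

# Sketch of a CSEP-style N-test: compare the observed earthquake count with
# a model's forecast total rate under a Poisson assumption.
def poisson_cdf(k, lam):
    """P(N <= k) for a Poisson(lam) variable."""
    return sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k + 1))

forecast_rate = 12.0      # total expected events in the testing period (assumed)
observed = 7              # events actually observed (assumed)

p_under = poisson_cdf(observed, forecast_rate)           # prob. of <= observed
p_over = 1.0 - poisson_cdf(observed - 1, forecast_rate)  # prob. of >= observed

consistent = min(p_under, p_over) > 0.025                # two-sided 5% test
print(f"P(N<=7)={p_under:.3f}, consistent={consistent}")
```

Here observing 7 events against a forecast of 12 is still within the two-sided 95% range, so the model would not be rejected on count alone; spatial and magnitude tests probe the rest of the forecast.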

  7. Comparison of turbulence mitigation algorithms

    Science.gov (United States)

    Kozacik, Stephen T.; Paolini, Aaron; Sherman, Ariel; Bonnett, James; Kelmelis, Eric

    2017-07-01

    When capturing imagery over long distances, atmospheric turbulence often degrades the data, especially when observation paths are close to the ground or in hot environments. These issues manifest as time-varying scintillation and warping effects that decrease the effective resolution of the sensor and reduce actionable intelligence. In recent years, several image processing approaches to turbulence mitigation have shown promise. Each of these algorithms has different computational requirements, usability demands, and degrees of independence from camera sensors. They also produce different degrees of enhancement when applied to turbulent imagery. Additionally, some of these algorithms are applicable to real-time operational scenarios while others may only be suitable for postprocessing workflows. EM Photonics has been developing image-processing-based turbulence mitigation technology since 2005. We will compare techniques from the literature with our commercially available, real-time, GPU-accelerated turbulence mitigation software. These comparisons will be made using real (not synthetic), experimentally obtained data for a variety of conditions, including varying optical hardware, imaging range, subjects, and turbulence conditions. Comparison metrics will include image quality, video latency, computational complexity, and potential for real-time operation. Additionally, we will present a technique for quantitatively comparing turbulence mitigation algorithms using real images of radial resolution targets.

  8. Turbulence in two-phase flows

    International Nuclear Information System (INIS)

    Sullivan, J.P.; Houze, R.N.; Buenger, D.E.; Theofanous, T.G.

    1981-01-01

    Hot-film anemometry and laser Doppler velocimetry have been employed in this work to study the turbulence characteristics of bubbly and stratified two-phase flows, respectively. Extensive consistency checks were made to establish the reliability, and hence the utility, of these experimental techniques for the measurement of turbulence in two-phase flows. Buoyancy-driven turbulence in vertical bubbly flows has been identified experimentally and correlated in terms of a shear velocity superposition approach. This approach provides a criterion for the demarcation of the buoyancy-driven turbulence region from the wall shear-generated turbulence region. Our data confirm the roughly isotropic behavior expected for buoyancy-driven turbulence. Upgrading of our experimental system will permit investigations of the wall-shear dominated regime (i.e., isotropy, superposition approach, etc.). The stratified flow data demonstrate clearly that the maximum in the mean velocity profile does not coincide with the zero-shear plane, indicating the existence of a negative eddy viscosity region. Previous studies do not take this difference into account and thus yield incorrect friction factor data, in addition to certain puzzling behavior in the upper wall region. The conditioned turbulence data in the wavy region indicate interesting trends and that an appropriate normalization of intensities must take into account the shear velocity at the interfacial (wavy) region

  9. Temporal stress changes caused by earthquakes: A review

    Science.gov (United States)

    Hardebeck, Jeanne L.; Okada, Tomomi

    2018-01-01

    Earthquakes can change the stress field in the Earth’s lithosphere as they relieve and redistribute stress. Earthquake-induced stress changes have been observed as temporal rotations of the principal stress axes following major earthquakes in a variety of tectonic settings. The stress changes due to the 2011 Mw9.0 Tohoku-Oki, Japan, earthquake were particularly well documented. Earthquake stress rotations can inform our understanding of earthquake physics, most notably addressing the long-standing problem of whether the Earth’s crust at plate boundaries is “strong” or “weak.” Many of the observed stress rotations, including that due to the Tohoku-Oki earthquake, indicate near-complete stress drop in the mainshock. This implies low background differential stress, on the order of earthquake stress drop, supporting the weak crust model. Earthquake stress rotations can also be used to address other important geophysical questions, such as the level of crustal stress heterogeneity and the mechanisms of postseismic stress reloading. The quantitative interpretation of stress rotations is evolving from those based on simple analytical methods to those based on more sophisticated numerical modeling that can capture the spatial-temporal complexity of the earthquake stress changes.
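The stress rotations discussed above are quantified by comparing the principal axes of the stress tensor before and after the mainshock: add a coseismic stress change to a background tensor and measure the angle between the pre- and post-event σ1 eigenvectors. The tensor values below are illustrative (MPa, compression positive), not from any study:

```python
import numpy as np

# Sketch of quantifying a principal-stress-axis rotation from two stress
# tensors via their eigenvectors.
def sigma1_axis(stress):
    """Unit eigenvector of the most compressive principal stress
    (compression taken positive, so the largest eigenvalue)."""
    vals, vecs = np.linalg.eigh(stress)     # eigenvalues in ascending order
    return vecs[:, -1]

background = np.array([[10.0, 0.0, 0.0],
                       [ 0.0, 4.0, 0.0],
                       [ 0.0, 0.0, 2.0]])   # sigma_1 along x
coseismic = np.array([[-6.0, 3.0, 0.0],     # near-complete stress drop + shear
                      [ 3.0, 1.0, 0.0],
                      [ 0.0, 0.0, 0.0]])

a1 = sigma1_axis(background)
a2 = sigma1_axis(background + coseismic)
rotation = np.degrees(np.arccos(np.clip(abs(a1 @ a2), -1.0, 1.0)))
print(f"sigma_1 rotated by {rotation:.1f} degrees")
```

Because the coseismic change here is comparable in size to the background differential stress, the σ1 axis swings by tens of degrees; a "strong crust" (background stress much larger than the stress drop) would show almost no rotation, which is the diagnostic logic the abstract describes.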

  10. Remote Triggering of the Mw 6.9 Hokkaido Earthquake as a Result of the Mw 6.6 Indonesian Earthquake on September 11, 2008

    Directory of Open Access Journals (Sweden)

    Cheng-Horng Lin

    2012-01-01

    Full Text Available Only recently has the phenomenon of earthquakes being triggered by a distant earthquake been well established. Yet most of the triggered earthquakes have been limited to small earthquakes (M < 3). Also, the exact triggering mechanism for earthquakes is still not clear. Here I show how one strong earthquake (Mw = 6.6) is capable of triggering another (Mw = 6.9) at a remote distance (~4750 km). On September 11, 2008, two strong earthquakes with magnitudes (Mw) of 6.6 and 6.9 hit Indonesia and Japan, respectively, within a short interval of ~21 minutes. Careful examination of broadband seismograms recorded in Japan shows that the Hokkaido earthquake occurred just as the surface waves generated by the Indonesia earthquake arrived. Although the peak dynamic stress estimated at the focus of the Hokkaido earthquake was only just reaching the lower bound for the capability of triggering earthquakes in general, a more plausible mechanism for triggering an earthquake might be attributed to the change of a fault property by fluid infiltration. These observations suggest that the Hokkaido earthquake was likely triggered from a remote distance by the surface waves generated by the Indonesia earthquake. If more such cases can be observed, temporary warnings of possible interactions between strong earthquakes might be considered in the future.
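The peak dynamic stress carried by a passing surface wave is commonly estimated as σ ≈ µ·v_peak/c, the shear modulus times peak ground velocity divided by phase velocity. The numbers below are generic assumptions for a teleseismic surface wave, not values from the article:

```python
# Back-of-the-envelope estimate of peak dynamic stress from a passing
# surface wave: sigma ~ mu * v_peak / c. All values are assumed typical
# figures, not measurements from this study.
MU = 30e9          # shear modulus, Pa
V_PEAK = 0.001     # peak ground velocity, m/s (1 mm/s)
C = 3500.0         # surface-wave phase velocity, m/s

sigma = MU * V_PEAK / C          # dynamic stress, Pa
print(f"peak dynamic stress ~ {sigma/1e3:.1f} kPa")
```

A millimeter-per-second ground velocity thus corresponds to a dynamic stress of order 10 kPa, around the lower bound usually cited for remote triggering, which is why the abstract turns to fluid infiltration as the more plausible mechanism.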

  11. Bam Earthquake in Iran

    CERN Multimedia

    2004-01-01

    Following their request for help from members of international organisations, the permanent Mission of the Islamic Republic of Iran has given the following bank account number, where you can donate money to help the victims of the Bam earthquake. Re: Bam earthquake 235 - UBS 311264.35L Bubenberg Platz 3001 BERN

  12. Modelling high Reynolds number wall-turbulence interactions in laboratory experiments using large-scale free-stream turbulence.

    Science.gov (United States)

    Dogan, Eda; Hearst, R Jason; Ganapathisubramani, Bharathram

    2017-03-13

    A turbulent boundary layer subjected to free-stream turbulence is investigated in order to ascertain the scale interactions that dominate the near-wall region. The results are discussed in relation to a canonical high Reynolds number turbulent boundary layer because previous studies have reported considerable similarities between these two flows. Measurements were acquired simultaneously from four hot wires mounted to a rake which was traversed through the boundary layer. Particular focus is given to two main features of both canonical high Reynolds number boundary layers and boundary layers subjected to free-stream turbulence: (i) the footprint of the large scales in the logarithmic region on the near-wall small scales, specifically the modulating interaction between these scales, and (ii) the phase difference in amplitude modulation. The potential for a turbulent boundary layer subjected to free-stream turbulence to 'simulate' high Reynolds number wall-turbulence interactions is discussed. The results of this study have encouraging implications for future investigations of the fundamental scale interactions that take place in high Reynolds number flows as it demonstrates that these can be achieved at typical laboratory scales.This article is part of the themed issue 'Toward the development of high-fidelity models of wall turbulence at large Reynolds number'. © 2017 The Author(s).
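The modulating interaction in point (i) is usually diagnosed by low-pass filtering the velocity signal to isolate the large scales, taking the envelope of the residual small scales, and correlating the two. The sketch below uses a moving-average filter and a rectify-then-smooth envelope as a rough stand-in for the spectral filters and Hilbert envelope used in the literature, on a synthetic signal with built-in modulation:

```python
import numpy as np

# Minimal numpy sketch of the amplitude-modulation diagnostic for a
# velocity signal whose small scales are modulated by the large scales.
rng = np.random.default_rng(2)
n = 20000
t = np.arange(n) * 1e-3
large = np.sin(2 * np.pi * 2.0 * t)                      # large-scale motion
small = (1.0 + 0.5 * large) * rng.normal(0.0, 1.0, n)    # modulated small scales
u = large + 0.3 * small                                  # combined "velocity"

def lowpass(x, w):
    """Simple moving-average low-pass filter of width w samples."""
    return np.convolve(x, np.ones(w) / w, mode="same")

u_large = lowpass(u, 201)                 # large-scale component
u_small = u - u_large                     # small-scale residual
envelope = lowpass(np.abs(u_small), 201)  # crude small-scale envelope

# Amplitude-modulation coefficient: correlation of large scales and envelope.
R = np.corrcoef(u_large, envelope)[0, 1]
print(f"AM correlation R = {R:.2f}")
```

A strongly positive R recovers the built-in modulation; the phase difference mentioned in point (ii) can be probed the same way by cross-correlating at a range of time lags instead of at zero lag only.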

  13. Assessment of earthquake-induced landslides hazard in El Salvador after the 2001 earthquakes using macroseismic analysis

    Science.gov (United States)

    Esposito, Eliana; Violante, Crescenzo; Giunta, Giuseppe; Ángel Hernández, Miguel

    2016-04-01

    Two strong earthquakes and a number of smaller aftershocks struck El Salvador in the year 2001. The January 13 2001 earthquake, Mw 7.7, occurred along the Cocos plate, 40 km off El Salvador southern coast. It resulted in about 1300 deaths and widespread damage, mainly due to massive landsliding. Two of the largest earthquake-induced landslides, Las Barioleras and Las Colinas (about 2×10^5 m^3) produced major damage to buildings and infrastructures and 500 fatalities. A neighborhood in Santa Tecla, west of San Salvador, was destroyed. The February 13 2001 earthquake, Mw 6.5, occurred 40 km east-southeast of San Salvador. This earthquake caused over 300 fatalities and triggered several landslides over an area of 2,500 km^2, mostly in poorly consolidated volcaniclastic deposits. The La Leona landslide (5-7×10^5 m^3) caused 12 fatalities and extensive damage to the Panamerican Highway. Two very large landslides of 1.5 km^3 and 12 km^3 produced hazardous barrier lakes at Rio El Desague and Rio Jiboa, respectively. More than 16,000 landslides occurred throughout the country after both quakes; most of them occurred in pyroclastic deposits, with a volume less than 1×10^3 m^3. The present work aims to define the relationship between the above described earthquake intensity, size and areal distribution of induced landslides, as well as to refine the earthquake intensity in sparsely populated zones by using landslide effects. Landslides triggered by the 2001 seismic sequences provided useful indications for a realistic seismic hazard assessment, providing a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides.

  14. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.; Mai, Paul Martin; Schorlemmer, Danijel

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data

  15. Inter-Disciplinary Validation of Pre Earthquake Signals. Case Study for Major Earthquakes in Asia (2004-2010) and for 2011 Tohoku Earthquake

    Science.gov (United States)

    Ouzounov, D.; Pulinets, S.; Hattori, K.; Liu, J.-Y.; Yang. T. Y.; Parrot, M.; Kafatos, M.; Taylor, P.

    2012-01-01

    We carried out multi-sensor observations in our investigation of phenomena preceding major earthquakes. Our approach is based on a systematic analysis of several physical and environmental parameters which we found to be associated with the earthquake process: thermal infrared radiation, temperature and concentration of electrons in the ionosphere, radon/ion activities, and air temperature/humidity in the atmosphere. We used satellite and ground observations and interpreted them with the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model, one of the possible paradigms we study and support. We made two independent continuous hind-cast investigations in Taiwan and Japan for a total of 102 earthquakes (M>6) occurring from 2004-2011. We analyzed: (1) ionospheric electromagnetic radiation, plasma and energetic electron measurements from DEMETER; (2) emitted long-wavelength radiation (OLR) from NOAA/AVHRR and NASA/EOS; (3) radon/ion variations (in situ data); and (4) GPS Total Electron Content (TEC) measurements collected from space and ground based observations. This joint analysis of ground and satellite data has shown that one to six (or more) days prior to the largest earthquakes there were anomalies in all of the analyzed physical observations. For the March 11, 2011 Tohoku earthquake, our analysis shows again the same relationship between several independent observations characterizing the lithosphere/atmosphere coupling. On March 7th we found a rapid increase of emitted infrared radiation observed from satellite data, and subsequently an anomaly developed near the epicenter. The GPS/TEC data indicated an increase and variation in electron density reaching a maximum value on March 8. Beginning from this day we confirmed an abnormal TEC variation over the epicenter in the lower ionosphere. These findings revealed the existence of atmospheric and ionospheric phenomena occurring prior to the 2011 Tohoku earthquake, which indicated new evidence of a distinct
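Anomaly screening of a TEC time series of the kind described above typically flags days that exceed a sliding baseline, e.g. a trailing median plus a multiple of the interquartile range. The window length, threshold factor and synthetic series below are assumptions for illustration, not the authors' exact procedure:

```python
import numpy as np

# Sketch of sliding median + IQR anomaly screening on a daily TEC series:
# flag day i when its value exceeds median + 1.5*IQR of the previous window.
rng = np.random.default_rng(3)
tec = 20.0 + 2.0 * rng.normal(size=60)       # 60 days of quiet TEC (TECU)
tec[50] += 12.0                              # inject a pre-seismic-like spike

window = 15
anomalous = []
for i in range(window, tec.size):
    past = tec[i - window:i]                 # trailing window, day i excluded
    q1, med, q3 = np.percentile(past, [25, 50, 75])
    if tec[i] > med + 1.5 * (q3 - q1):
        anomalous.append(i)

print("anomalous days:", anomalous)
```

Using a trailing window (rather than one centered on the test day) keeps the detector causal, which matters if the same screening is to run prospectively rather than only in hind-cast mode.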

  16. POST Earthquake Debris Management - AN Overview

    Science.gov (United States)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunami, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during the reconstruction works, can place significant challenges on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated from earthquakes is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also subsequently impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as materials for construction or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach that takes into account the different criteria related to the operation execution is proposed by highlighting the key issues concerning the handling of the construction

  17. Simulating Earthquakes for Science and Society: Earthquake Visualizations Ideal for use in Science Communication and Education

    Science.gov (United States)

    de Groot, R.

    2008-12-01

    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). For example, The Great Southern California ShakeOut was based on a potential magnitude 7.8 earthquake on the southern San Andreas fault. The visualization created for the ShakeOut was a key scientific and communication tool for the earthquake drill. This presentation will also feature SCEC Virtual Display of Objects visualization software developed by SCEC Undergraduate Studies in Earthquake Information Technology interns. According to Gordin and Pea (1995), theoretically visualization should make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  18. PRECURSORS OF EARTHQUAKES: VLF SIGNALS-IONOSPHERE RELATION

    Directory of Open Access Journals (Sweden)

    Mustafa ULAS

    2013-01-01

    Full Text Available A lot of people die because of earthquakes every year. It is therefore crucial to predict the time of an earthquake a reasonable time before it happens. This paper presents recent information published in the literature about precursors of earthquakes. The relationships between earthquakes and the ionosphere are targeted in order to guide new research toward novel prediction methods.

  19. Building with Earthquakes in Mind

    Science.gov (United States)

    Mangieri, Nicholas

    2016-04-01

    Earthquakes are some of the most elusive and destructive disasters humans interact with on this planet. Engineering structures to withstand earthquake shaking is critical to ensure minimal loss of life and property. However, the majority of buildings today in non-traditional earthquake prone areas are not built to withstand this devastating force. Understanding basic earthquake engineering principles and the effect of limited resources helps students grasp the challenge that lies ahead. The solution can be found in retrofitting existing buildings with proper reinforcements and designs to deal with this deadly disaster. The students were challenged in this project to construct a basic structure, using limited resources, that could withstand a simulated tremor through the use of an earthquake shake table. Groups of students had to work together to creatively manage their resources and ideas to design the most feasible and realistic type of building. This activity provided a wealth of opportunities for the students to learn more about a type of disaster they do not experience in this part of the country. Due to the fact that most buildings in New York City were not designed to withstand earthquake shaking, the students were able to gain an appreciation for how difficult it would be to prepare every structure in the city for this type of event.

  20. Meeting the Challenge of Earthquake Risk Globalisation: Towards the Global Earthquake Model GEM (Sergey Soloviev Medal Lecture)

    Science.gov (United States)

    Zschau, J.

    2009-04-01

    Earthquake risk, like natural risks in general, has become a highly dynamic and globally interdependent phenomenon. Due to the "urban explosion" in the Third World, an increasingly complex cross-linking of critical infrastructure and lifelines in the industrial nations, and the growing globalisation of the world's economies, we are presently facing a dramatic increase in our society's vulnerability to earthquakes in practically all seismic regions of the globe. Such fast and global changes cannot be captured with conventional earthquake risk models anymore. The sciences in this field are therefore asked to come up with new solutions that no longer aim exclusively at the best possible quantification of present risks, but also keep an eye on their changes with time and allow these to be projected into the future. This applies not only to the vulnerability component of earthquake risk, but also to its hazard component, which has been realized to be time-dependent, too. The challenges of earthquake risk dynamics and globalisation have recently been accepted by the Global Science Forum of the Organisation for Economic Co-operation and Development (OECD-GSF), which initiated the "Global Earthquake Model (GEM)", a public-private partnership for establishing an independent standard to calculate, monitor and communicate earthquake risk globally, raise awareness and promote mitigation.

  1. Turbulent black holes.

    Science.gov (United States)

    Yang, Huan; Zimmerman, Aaron; Lehner, Luis

    2015-02-27

    We demonstrate that rapidly spinning black holes can display a new type of nonlinear parametric instability-which is triggered above a certain perturbation amplitude threshold-akin to the onset of turbulence, with possibly observable consequences. This instability transfers from higher temporal and azimuthal spatial frequencies to lower frequencies-a phenomenon reminiscent of the inverse cascade displayed by (2+1)-dimensional fluids. Our finding provides evidence for the onset of transitory turbulence in astrophysical black holes and predicts observable signatures in black hole binaries with high spins. Furthermore, it gives a gravitational description of this behavior which, through the fluid-gravity duality, can potentially shed new light on the remarkable phenomena of turbulence in fluids.

  2. Turbulence-chemistry interactions in reacting flows

    Energy Technology Data Exchange (ETDEWEB)

    Barlow, R.S.; Carter, C.D. [Sandia National Laboratories, Livermore, CA (United States)

    1993-12-01

    Interactions between turbulence and chemistry in nonpremixed flames are investigated through multiscalar measurements. Simultaneous point measurements of major species, NO, OH, temperature, and mixture fraction are obtained by combining spontaneous Raman scattering, Rayleigh scattering, and laser-induced fluorescence (LIF). NO and OH fluorescence signals are converted to quantitative concentrations by applying shot-to-shot corrections for local variations of the Boltzmann fraction and collisional quenching rate. These measurements of instantaneous thermochemical states in turbulent flames provide insights into the fundamental nature of turbulence-chemistry interactions. The measurements also constitute a unique data base for evaluation and refinement of turbulent combustion models. Experimental work during the past year has focused on three areas: (1) investigation of the effects of differential molecular diffusion in turbulent combustion; (2) experiments on the effects of Halon CF{sub 3}Br, a fire retardant, on the structure of turbulent flames of CH{sub 4} and CO/H{sub 2}/N{sub 2}; and (3) experiments on NO formation in turbulent hydrogen jet flames.

  3. Turbulence, Magnetic Reconnection in Turbulent Fluids and Energetic Particle Acceleration

    Science.gov (United States)

    Lazarian, A.; Vlahos, L.; Kowal, G.; Yan, H.; Beresnyak, A.; de Gouveia Dal Pino, E. M.

    2012-11-01

    Turbulence is ubiquitous in astrophysics. It radically changes many astrophysical phenomena, in particular the propagation and acceleration of cosmic rays. We present the modern understanding of compressible magnetohydrodynamic (MHD) turbulence, in particular its decomposition into Alfvén, slow and fast modes, and discuss the density structure of turbulent subsonic and supersonic media, as well as other relevant regimes of astrophysical turbulence. All this information is essential for understanding the energetic particle acceleration that we discuss further in the review. For instance, we show how fast and slow modes accelerate energetic particles through second-order Fermi acceleration, while density fluctuations generate magnetic fields in pre-shock regions, enabling the first-order Fermi acceleration of high-energy cosmic rays. Very importantly, however, first-order Fermi cosmic ray acceleration is also possible at sites of magnetic reconnection. In the presence of turbulence this reconnection becomes fast, and we present numerical evidence supporting the predictions of the Lazarian and Vishniac (Astrophys. J. 517:700-718, 1999) model of fast reconnection. The efficiency of this process suggests that magnetic reconnection can release substantial amounts of energy in short periods of time. As particle-tracing numerical simulations show that particles can be efficiently accelerated during reconnection, we argue that the process of magnetic reconnection may be much more important for particle acceleration than is currently accepted. In particular, we discuss the acceleration arising from reconnection as a possible origin of the anomalous cosmic rays measured by the Voyagers, as well as of the cosmic ray excess in the direction of the Heliotail.

  4. Possible scenarios for occurrence of M ~ 7 interplate earthquakes prior to and following the 2011 Tohoku-Oki earthquake based on numerical simulation.

    Science.gov (United States)

    Nakata, Ryoko; Hori, Takane; Hyodo, Mamoru; Ariyoshi, Keisuke

    2016-05-10

    We show possible scenarios for the occurrence of M ~ 7 interplate earthquakes prior to and following the M ~ 9 earthquake along the Japan Trench, such as the 2011 Tohoku-Oki earthquake. One such M ~ 7 earthquake is the so-called Miyagi-ken-Oki earthquake, for which we conducted numerical simulations of earthquake generation cycles using realistic three-dimensional (3D) geometry of the subducting Pacific Plate. In a number of scenarios, the time interval between the M ~ 9 earthquake and the subsequent Miyagi-ken-Oki earthquake was equal to or shorter than the average recurrence interval during the later stage of the M ~ 9 earthquake cycle. The scenarios successfully reproduced important characteristics such as the recurrence of M ~ 7 earthquakes, coseismic slip distribution, afterslip distribution, the largest foreshock, and the largest aftershock of the 2011 earthquake. Thus, these results suggest that we should prepare for future M ~ 7 earthquakes in the Miyagi-ken-Oki segment even though this segment recently experienced large coseismic slip in 2011.
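
The simulations described in this record use realistic 3D plate geometry and full earthquake-cycle physics. As a far simpler schematic of the recurrence-interval logic only (all parameters hypothetical, not from the paper), a single-segment cycle can be caricatured as stress loading at a constant rate until a jittered failure threshold is reached, then dropping to a residual level:

```python
import random

def simulate_cycle(n_events, loading_rate=1.0, threshold=100.0,
                   residual=20.0, jitter=0.15, seed=1):
    """Toy single-fault earthquake cycle: stress accumulates linearly,
    fails at a randomly jittered threshold, and resets to a residual.
    Returns a list of (event time, stress drop) tuples."""
    rng = random.Random(seed)
    stress = 0.0
    t = 0.0
    events = []
    for _ in range(n_events):
        # failure threshold varies by +/- jitter around its mean
        thr = threshold * (1.0 + jitter * (2.0 * rng.random() - 1.0))
        t += (thr - stress) / loading_rate  # time to reload to failure
        events.append((t, thr - residual))
        stress = residual
    return events
```

In this caricature the mean recurrence interval is simply (mean threshold - residual) / loading rate, which is the intuition behind "equal to or shorter than the average recurrence interval" statements: any extra stress released by a nearby great earthquake shortens the reload time.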

  6. Cosmic turbulence

    International Nuclear Information System (INIS)

    Drury, L.O.; Stewart, J.M.

    1976-01-01

    A generalization of a transformation due to Kurskov and Ozernoi is used to rewrite the usual equations governing subsonic turbulence in Robertson-Walker cosmological models as Navier-Stokes equations with a time-dependent viscosity. This paper first rederives some well-known results in a very simple way by means of this transformation. The main result however is that the establishment of a Kolmogorov spectrum at recombination appears to be incompatible with subsonic turbulence. The conditions after recombination are also discussed briefly. (author)

  7. Laboratory generated M -6 earthquakes

    Science.gov (United States)

    McLaskey, Gregory C.; Kilgore, Brian D.; Lockner, David A.; Beeler, Nicholas M.

    2014-01-01

    We consider whether mm-scale earthquake-like seismic events generated in laboratory experiments are consistent with our understanding of the physics of larger earthquakes. This work focuses on a population of 48 very small shocks that are foreshocks and aftershocks of stick–slip events occurring on a 2.0 m by 0.4 m simulated strike-slip fault cut through a large granite sample. Unlike the larger stick–slip events that rupture the entirety of the simulated fault, the small foreshocks and aftershocks are contained events whose properties are controlled by the rigidity of the surrounding granite blocks rather than characteristics of the experimental apparatus. The large size of the experimental apparatus, high fidelity sensors, rigorous treatment of wave propagation effects, and in situ system calibration separates this study from traditional acoustic emission analyses and allows these sources to be studied with as much rigor as larger natural earthquakes. The tiny events have short (3–6 μs) rise times and are well modeled by simple double couple focal mechanisms that are consistent with left-lateral slip occurring on a mm-scale patch of the precut fault surface. The repeatability of the experiments indicates that they are the result of frictional processes on the simulated fault surface rather than grain crushing or fracture of fresh rock. Our waveform analysis shows no significant differences (other than size) between the M -7 to M -5.5 earthquakes reported here and larger natural earthquakes. Their source characteristics such as stress drop (1–10 MPa) appear to be entirely consistent with earthquake scaling laws derived for larger earthquakes.

  8. Multi-time, multi-scale correlation functions in turbulence and in turbulent models

    NARCIS (Netherlands)

    Biferale, L.; Boffetta, G.; Celani, A.; Toschi, F.

    1999-01-01

    A multifractal-like representation for multi-time, multi-scale velocity correlation in turbulence and dynamical turbulent models is proposed. The importance of subleading contributions to time correlations is highlighted. The fulfillment of the dynamical constraints due to the equations of motion is

  9. Dissipation range turbulent cascades in plasmas

    International Nuclear Information System (INIS)

    Terry, P. W.; Almagri, A. F.; Forest, C. B.; Nornberg, M. D.; Rahbarnia, K.; Sarff, J. S.; Fiksel, G.; Hatch, D. R.; Jenko, F.; Prager, S. C.; Ren, Y.

    2012-01-01

    Dissipation range cascades in plasma turbulence are described and spectra are formulated from the scaled attenuation in wavenumber space of the spectral energy transfer rate. This yields spectra characterized by the product of a power law and exponential fall-off, applicable to all scales. Spectral indices of the power law and exponential fall-off depend on the scaling of the dissipation, the strength of the nonlinearity, and nonlocal effects when dissipation rates of multiple fluctuation fields are different. The theory is used to derive spectra for MHD turbulence with magnetic Prandtl number greater than unity, extending previous work. The theory is also applied to generic plasma turbulence by considering the spectrum from damping with arbitrary wavenumber scaling. The latter is relevant to ion temperature gradient turbulence modeled by gyrokinetics. The spectrum in this case has an exponential component that becomes weaker at small scale, giving a power law asymptotically. Results from the theory are compared to three very different types of turbulence. These include the magnetic plasma turbulence of the Madison Symmetric Torus, the MHD turbulence of liquid metal in the Madison Dynamo Experiment, and gyrokinetic simulation of ion temperature gradient turbulence.
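
The spectral form described above, the product of a power law and an exponential fall-off, can be written E(k) = C k^(-alpha) exp(-(k/k_d)^gamma). A small numerical sketch (the symbols and parameter values are illustrative, not the indices derived in the paper) shows how the local log-log slope transitions from the power-law exponent at small k to exponential-dominated steepening at large k:

```python
import math

def model_spectrum(k, C=1.0, alpha=5.0 / 3.0, k_d=100.0, gamma=1.0):
    """Dissipation-range model spectrum: power law times exponential."""
    return C * k ** (-alpha) * math.exp(-((k / k_d) ** gamma))

def local_slope(k, **kw):
    """Logarithmic derivative d(ln E)/d(ln k), estimated numerically."""
    h = 1e-4
    num = (math.log(model_spectrum(k * (1 + h), **kw)) -
           math.log(model_spectrum(k * (1 - h), **kw)))
    return num / (math.log(1 + h) - math.log(1 - h))
```

Analytically the slope is -alpha - gamma (k/k_d)^gamma, so for k much smaller than the dissipation wavenumber k_d the spectrum looks like a pure power law, while beyond k_d the exponential term dominates.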

  10. Physical experiments and analysis on the generation and evolution of tsunami-induced turbulent coherent structures

    Science.gov (United States)

    Kalligeris, Nikos; Lynett, Patrick

    2017-11-01

    Numerous historical accounts describe the formation of ``whirlpools'' inside ports and harbors during tsunami events, causing port operation disruptions. Videos from the Japan 2011 tsunami revealed complex nearshore flow patterns, resulting from the interaction of tsunami-induced currents with the man-made coastline, and the generation of large eddies (or turbulent coherent structures) in numerous ports and harbors near the earthquake epicenter. The aim of this work is to study the generation and evolution of tsunami-induced turbulent coherent structures (TCS) in a well-controlled environment using realistic scaling. A physical configuration is created in the image of a port entrance at a scale of 1:27 and a small-amplitude, long period wave creates a transient flow through the asymmetric harbor channel. A separated region forms, which coupled with the transient flow, leads to the formation of a stable monopolar TCS. The surface flow is examined through mono- and stereo-PTV techniques to extract surface velocity vectors. Surface velocity maps and vortex flow profiles are used to study the experimental TCS generation and evolution, and characterize the TCS structure. Analytical tools are used to describe the TCS growth rate and kinetic energy decay. This work was funded by the National Science Foundation NEES Research program, with Award Number 1135026.

  11. High Reynolds Number Turbulence

    National Research Council Canada - National Science Library

    Smits, Alexander J

    2007-01-01

    The objectives of the grant were to provide a systematic study to fill the gap between existing research on low Reynolds number turbulent flows and the kinds of turbulent flows encountered on full-scale vehicles...

  12. Radon, gas geochemistry, groundwater, and earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    King, Chi-Yu [Power Reactor and Nuclear Fuel Development Corp., Tono Geoscience Center, Toki, Gifu (Japan)

    1998-12-31

    Radon monitoring in groundwater, soil air, and the atmosphere has been continued in many seismic areas of the world for earthquake-prediction and active-fault studies. Some recent measurements of radon and other geochemical and hydrological parameters have been made for sufficiently long periods, with reliable instruments, and together with measurements of meteorological variables and solid-earth tides. The resultant data are useful in better distinguishing earthquake-related changes from various background noises. Some measurements have been carried out in areas where other geophysical measurements are also being made. Comparative studies of the various kinds of geophysical data are helpful in ascertaining the reality of earthquake-related and fault-related anomalies and in understanding the underlying mechanisms. Spatial anomalies of radon and other terrestrial gases have been observed for many active faults. Such observations indicate that gas concentrations are very much site dependent, particularly on fault zones where terrestrial fluids may move vertically. Temporal anomalies have been reliably observed before and after some recent earthquakes, including the 1995 Kobe earthquake, and the general pattern of anomaly occurrence remains the same as observed before: they are recorded at only relatively few sensitive sites, which can be at much larger distances than expected from existing earthquake-source models. The sensitivity of a sensitive site is also found to change with time. These results clearly show the inadequacy of the existing dilatancy-fluid diffusion and elastic-dislocation models for earthquake sources in explaining earthquake-related geochemical and geophysical changes recorded at large distances. (J.P.N.)

  13. The Christchurch earthquake stroke incidence study.

    Science.gov (United States)

    Wu, Teddy Y; Cheung, Jeanette; Cole, David; Fink, John N

    2014-03-01

    We examined the impact of major earthquakes on acute stroke admissions by a retrospective review of stroke admissions in the 6 weeks following the 4 September 2010 and 22 February 2011 earthquakes. The control period was the corresponding 6 weeks in the previous year. In the 6 weeks following the September 2010 earthquake there were 97 acute stroke admissions, with 79 (81.4%) ischaemic infarctions. This was similar to the 2009 control period which had 104 acute stroke admissions, of whom 80 (76.9%) had ischaemic infarction. In the 6 weeks following the February 2011 earthquake, there were 71 stroke admissions, and 61 (79.2%) were ischaemic infarction. This was less than the 96 strokes (72 [75%] ischaemic infarction) in the corresponding control period. None of the comparisons were statistically significant. There was also no difference in the rate of cardioembolic infarction from atrial fibrillation between the study periods. Patients admitted during the February 2011 earthquake period were less likely to be discharged directly home when compared to the control period (31.2% versus 46.9%, p=0.036). There was no observable trend in the number of weekly stroke admissions between the 2 weeks leading to and 6 weeks following the earthquakes. Our results suggest that severe psychological stress from earthquakes did not influence the subsequent short term risk of acute stroke, but the severity of the earthquake in February 2011 and associated civil structural damages may have influenced the pattern of discharge for stroke patients. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Source modeling of the 2015 Mw 7.8 Nepal (Gorkha) earthquake sequence: Implications for geodynamics and earthquake hazards

    Science.gov (United States)

    McNamara, D. E.; Yeck, W. L.; Barnhart, W. D.; Schulte-Pelkum, V.; Bergman, E.; Adhikari, L. B.; Dixit, A.; Hough, S. E.; Benz, H. M.; Earle, P. S.

    2017-09-01

    The Gorkha earthquake on April 25th, 2015 was a long-anticipated, low-angle thrust-faulting event on the shallow décollement between the India and Eurasia plates. We present a detailed multiple-event hypocenter relocation analysis of the Mw 7.8 Gorkha Nepal earthquake sequence, constrained by local seismic stations, and a geodetic rupture model based on InSAR and GPS data. We integrate these observations to place the Gorkha earthquake sequence into a seismotectonic context and evaluate potential earthquake hazard. Major results from this study include (1) a comprehensive catalog of calibrated hypocenters for the Gorkha earthquake sequence; (2) the Gorkha earthquake ruptured a 150 × 60 km patch of the Main Himalayan Thrust (MHT), the décollement defining the plate boundary at depth, over an area surrounding but predominantly north of the capital city of Kathmandu; (3) the distribution of aftershock seismicity surrounds the mainshock maximum slip patch; (4) aftershocks occur at or below the mainshock rupture plane with depths generally increasing to the north beneath the higher Himalaya, possibly outlining a 10-15 km thick subduction channel between the overriding Eurasian and subducting Indian plates; (5) the largest Mw 7.3 aftershock and the highest concentration of aftershocks occurred to the southeast of the mainshock rupture, on a segment of the MHT décollement that was positively stressed towards failure; (6) the near-surface portion of the MHT south of Kathmandu shows no aftershocks or slip during the mainshock. Results from this study characterize the details of the Gorkha earthquake sequence and provide constraints on where earthquake hazard remains high, and thus where future, damaging earthquakes may occur in this densely populated region. Up-dip segments of the MHT should be considered high hazard for future damaging earthquakes.

  15. Plant state display device after occurrence of earthquake

    International Nuclear Information System (INIS)

    Kitada, Yoshio; Yonekura, Kazuyoshi.

    1992-01-01

    If a nuclear power plant encounters an earthquake, a previously stored earthquake response analysis value is compared with the observed earthquake in order to judge the magnitude of the shaking. From the result of this judgement, the possibility that an abnormality has occurred in the plant equipment systems after the earthquake is evaluated by comparison with a previously stored earthquake fragility database for each equipment/system. The result of the evaluation is displayed in the central control room. For any plant equipment system judged to have a high probability of abnormality, the influence of the abnormality on plant safety is evaluated with a previously stored earthquake PSA method, and the result is also displayed in the central control room. (I.S.)

  16. A minimalist model of characteristic earthquakes

    DEFF Research Database (Denmark)

    Vázquez-Prada, M.; González, Á.; Gómez, J.B.

    2002-01-01

    In a spirit akin to the sandpile model of self-organized criticality, we present a simple statistical model of the cellular-automaton type which simulates the role of an asperity in the dynamics of a one-dimensional fault. This model produces an earthquake spectrum similar to the characteristic-earthquake behaviour of some seismic faults. This model, which has no parameter, is amenable to an algebraic description as a Markov chain. This possibility illuminates some important results, obtained by Monte Carlo simulations, such as the earthquake size-frequency relation and the recurrence time of the characteristic earthquake.
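
The exact update rules are given in the paper; as a loose illustration of the cellular-automaton idea only (the rules below are a plausible paraphrase, not the published model), a toy asperity automaton can be written in a few lines: stress particles land on random sites of a one-dimensional fault, and a hit on the asperity triggers an "earthquake" whose size is the contiguous loaded block next to it:

```python
import random

def minimalist_model(n_sites=20, n_steps=200000, seed=7):
    """Toy asperity automaton (hypothetical rules, for illustration):
    each step loads one randomly chosen site; choosing site 0 (the
    asperity) triggers an event that unloads the contiguous occupied
    block adjacent to it. Returns the list of event sizes."""
    rng = random.Random(seed)
    occupied = [False] * n_sites  # site 0 is the asperity
    sizes = []
    for _ in range(n_steps):
        i = rng.randrange(n_sites)
        if i == 0:
            k = 1
            while k < n_sites and occupied[k]:
                occupied[k] = False  # unload the block next to the asperity
                k += 1
            sizes.append(k)  # event size: asperity plus unloaded block
        elif not occupied[i]:
            occupied[i] = True
    return sizes
```

Counting the resulting sizes gives a size-frequency histogram; in automata of this family, system-spanning ("characteristic") events coexist with a background of small events, which is the behaviour the abstract refers to.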

  17. EARTHQUAKE RESEARCH PROBLEMS OF NUCLEAR POWER GENERATORS

    Energy Technology Data Exchange (ETDEWEB)

    Housner, G. W.; Hudson, D. E.

    1963-10-15

    Earthquake problems associated with the construction of nuclear power generators require a more extensive and a more precise knowledge of earthquake characteristics and the dynamic behavior of structures than was considered necessary for ordinary buildings. Economic considerations indicate the desirability of additional research on the problems of earthquakes and nuclear reactors. The nature of these earthquake-resistant design problems is discussed and programs of research are recommended. (auth)

  18. Seismic-resistant design of nuclear power stations in Japan, earthquake country. Lessons learned from Chuetsu-oki earthquake

    International Nuclear Information System (INIS)

    Irikura, Kojiro

    2008-01-01

    The new assessment (back-check) of seismic safety was being conducted at the Kashiwazaki-Kariwa Nuclear Power Plants, Tokyo Electric Co., in response to a request based on the guideline for reactor evaluation for seismic-resistant design revised in 2006, when the 2007 Chuetsu-oki Earthquake occurred and brought an unexpectedly strong tremor to the area: although the magnitude of the earthquake was only 6.8, the intensity of the earthquake motion exceeded the design assumption by more than a factor of 2.5. This paper introduces how and why the guideline for seismic-resistant design of nuclear facilities was revised in 2006, outlines the Chuetsu-oki Earthquake, and presents preliminary findings and lessons learned from the earthquake. The paper specifically discusses (1) how geologic active faults such as the one overlooked this time may be identified in advance, (2) how adequate models of the seismic source can be built so that its characteristics can be extracted, and (3) how strong ground motion may be estimated for a possibly overlooked fault. (S. Ohno)

  19. Turbulent equipartitions in two dimensional drift convection

    International Nuclear Information System (INIS)

    Isichenko, M.B.; Yankov, V.V.

    1995-01-01

    Unlike the thermodynamic equipartition of energy in conservative systems, turbulent equipartitions (TEP) describe strongly non-equilibrium systems such as turbulent plasmas. In turbulent systems, energy is no longer a good invariant, but one can utilize the conservation of other quantities, such as adiabatic invariants, frozen-in magnetic flux, entropy, or combinations thereof, in order to derive new, turbulent quasi-equilibria. These TEP equilibria assume various forms, but in general they sustain spatially inhomogeneous distributions of the usual thermodynamic quantities such as density or temperature. This mechanism explains the effects of particle and energy pinch in tokamaks. The analysis of the relaxed states caused by turbulent mixing is based on the existence of Lagrangian invariants (quantities constant along fluid-particle or other orbits). A turbulent equipartition corresponds to a spatially uniform distribution of the relevant Lagrangian invariants. The existence of such turbulent equilibria is demonstrated in a simple model of a two-dimensional electrostatically turbulent plasma in an inhomogeneous magnetic field. The turbulence is prescribed, and the turbulent transport is assumed to be much stronger than the classical collisional transport. The simplicity of the model makes it possible to derive the equations describing the relaxation to the TEP state in several limits.

  20. Wave turbulence in magnetized plasmas

    Directory of Open Access Journals (Sweden)

    S. Galtier

    2009-02-01

    Full Text Available The paper reviews the recent progress on wave turbulence for magnetized plasmas (MHD, Hall MHD and electron MHD in the incompressible and compressible cases. The emphasis is made on homogeneous and anisotropic turbulence which usually provides the best theoretical framework to investigate space and laboratory plasmas. The solar wind and the coronal heating problems are presented as two examples of application of anisotropic wave turbulence. The most important results of wave turbulence are reported and discussed in the context of natural and simulated magnetized plasmas. Important issues and possible spurious interpretations are also discussed.

  1. Antioptimization of earthquake excitation and response

    Directory of Open Access Journals (Sweden)

    G. Zuccaro

    1998-01-01

    Full Text Available The paper presents a novel approach to predicting the response of earthquake-excited structures. The earthquake excitation is expanded in terms of a series of deterministic functions. The coefficients of the series are represented as a point in N-dimensional space. Each available accelerogram at a certain site is then represented as a point in the above space, modeling the available fragmentary historical data. The minimum-volume ellipsoid containing all points is constructed. Ellipsoidal models of uncertainty, pertinent to earthquake excitation, are thus developed. The maximum response of a structure, subjected to earthquake excitation within the ellipsoidal model of the latter, is determined. This procedure of determining the least favorable response was termed in the literature (Elishakoff, 1991) an antioptimization. It appears that, under the inherent uncertainty of earthquake excitation, antioptimization analysis is a viable alternative to the stochastic approach.
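
A hedged sketch of the ellipsoidal-uncertainty idea (illustrative, not the paper's implementation): the minimum-volume enclosing ellipsoid of the data points can be computed with Khachiyan's algorithm, and the worst case of a linear response functional over that ellipsoid then has a closed form:

```python
import numpy as np

def mvee(points, tol=1e-7):
    """Minimum-volume enclosing ellipsoid (Khachiyan's algorithm).
    Returns (A, c) such that (x - c)^T A (x - c) <= 1 for all points."""
    P = np.asarray(points, dtype=float)
    n, d = P.shape
    Q = np.column_stack([P, np.ones(n)]).T  # lifted points, (d+1) x n
    u = np.full(n, 1.0 / n)
    err = tol + 1.0
    while err > tol:
        X = Q @ np.diag(u) @ Q.T
        M = np.einsum('ij,jk,ki->i', Q.T, np.linalg.inv(X), Q)
        j = int(np.argmax(M))
        step = (M[j] - d - 1.0) / ((d + 1.0) * (M[j] - 1.0))
        new_u = (1.0 - step) * u
        new_u[j] += step
        err = np.linalg.norm(new_u - u)
        u = new_u
    c = P.T @ u
    A = np.linalg.inv(P.T @ np.diag(u) @ P - np.outer(c, c)) / d
    return A, c

def worst_case_linear(A, c, g):
    """Antioptimization step: max of g . x over the ellipsoid
    {(x - c)^T A (x - c) <= 1}, in closed form."""
    g = np.asarray(g, dtype=float)
    return g @ c + np.sqrt(g @ np.linalg.solve(A, g))
```

Here g would stand for the sensitivity of some response quantity to the excitation coefficients; the worst case over the ellipsoid is never smaller than the worst case over the recorded points, which is the "least favorable response" logic of the abstract.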

  2. Contribution to the study of turbulence spectra

    Science.gov (United States)

    Dumas, R.

    1979-01-01

    An apparatus suitable for turbulence measurement in the ranges of 1 to 5000 cps and 6 to 16,000 cps was developed and is described. Turbulence spectra downstream of the grills were examined with reference to their general characteristics, their LF qualities, and the effects of periodic turbulence. Medium and HF are discussed. Turbulence spectra in the boundary layers are similarly examined, with reference to their fluctuations at right angles to the wall and to lateral fluctuations. Turbulence spectra in a boundary layer with suction at the wall are discussed, as are induced turbulence and turbulence spectra at high Reynolds numbers. Calculations are presented relating to the effect of filtering on the value of the correlations in time and space.

  3. Mathematical and physical theory of turbulence

    CERN Document Server

    Cannon, John

    2006-01-01

    Although the current dynamical system approach offers several important insights into the turbulence problem, issues still remain that present challenges to conventional methodologies and concepts. These challenges call for the advancement and application of new physical concepts, mathematical modeling, and analysis techniques. Bringing together experts from physics, applied mathematics, and engineering, Mathematical and Physical Theory of Turbulence discusses recent progress and some of the major unresolved issues in two- and three-dimensional turbulence as well as scalar compressible turbulence. Containing introductory overviews as well as more specialized sections, this book examines a variety of turbulence-related topics. The authors concentrate on theory, experiments, computational, and mathematical aspects of Navier-Stokes turbulence; geophysical flows; modeling; laboratory experiments; and compressible/magnetohydrodynamic effects. The topics discussed in these areas include finite-time singularities a...

  4. Modified Mercalli intensities for some recent California earthquakes and historic San Francisco Bay Region earthquakes

    Science.gov (United States)

    Bakun, William H.

    1998-01-01

    Modified Mercalli Intensity (MMI) data for recent California earthquakes were used by Bakun and Wentworth (1997) to develop a strategy for bounding the location and moment magnitude M of earthquakes from MMI observations only. Bakun (Bull. Seismol. Soc. Amer., submitted) used the Bakun and Wentworth (1997) strategy to analyze 19th century and early 20th century San Francisco Bay Region earthquakes. The MMI data and site corrections used in these studies are listed in this Open-file Report. 

  5. "New" earthquakes: a growing contribution to the Catalogue of Strong Italian Earthquakes

    Directory of Open Access Journals (Sweden)

    E. Guidoboni

    2000-06-01

    Full Text Available The particular structure of the research into historical seismology found in this catalogue has allowed a lot of information about unknown seismic events to be traced. This new contribution to seismological knowledge mainly consists in: (i) the retrieval and organisation within a coherent framework of documentary evidence of earthquakes that took place between the Middle Ages and the sixteenth century; (ii) the improved knowledge of seismic events, even destructive events, which in the past had been "obscured" by large earthquakes; (iii) the identification of earthquakes in "silent" seismic areas. The complex elements to be taken into account when dealing with unknown seismic events have been outlined; much "new" information often falls into one of the following categories: simple chronological errors relative to other well-known events; descriptions of other natural phenomena (landslides, hurricanes, tornadoes, etc.) defined in texts as "earthquakes"; unknown tremors belonging to known seismic periods; tremors that may be connected with events which have been catalogued under incorrect dates and with very approximate estimates of location and intensity. This proves that there was not a real seismic "silence" but a research vacuum.

  6. Smartphone MEMS accelerometers and earthquake early warning

    Science.gov (United States)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

    The low-cost MEMS accelerometers in smartphones are attracting more and more attention from the science community because of their vast numbers and potential applications in various areas. We are using the accelerometers inside smartphones to detect earthquakes. We performed shake table tests to show that these accelerometers are also suitable for recording large shaking caused by earthquakes. We developed an Android app, MyShake, which can distinguish earthquake movements from daily human activities in the recordings made by the accelerometers in personal smartphones, and upload trigger information/waveforms to our server for further analysis. The data from these smartphones form a unique dataset for seismological applications, such as earthquake early warning. In this talk I will lay out the method we used to recognize earthquake-like movement from a single smartphone, and give an overview of the whole system that harnesses the information from a network of smartphones for rapid earthquake detection. This type of system can be easily deployed and scaled up around the globe, and it provides additional insights into earthquake hazards.

  7. Ionospheric precursors for crustal earthquakes in Italy

    Directory of Open Access Journals (Sweden)

    L. Perrone

    2010-04-01

    Full Text Available Crustal earthquakes with magnitude 6.0>M≥5.5 observed in Italy for the period 1979–2009, including the last one at L'Aquila on 6 April 2009, were considered to check whether the relationships obtained earlier for ionospheric precursors of strong Japanese earthquakes are valid for these moderate Italian earthquakes. The ionospheric precursors are based on the observed variations of the sporadic E-layer parameters (h'Es, fbEs) and foF2 at the ionospheric station Rome. Empirical dependencies for the seismo-ionospheric disturbances relating the earthquake magnitude and the epicenter distance are obtained, and they have been shown to be similar to those obtained earlier for Japanese earthquakes. The dependencies indicate that the disturbance spreads from the epicenter towards the periphery during the earthquake preparation process. Large lead times for the precursor occurrence (up to 34 days for M=5.8–5.9) indicate a prolonged preparation period. A possibility of using the obtained relationships for earthquake prediction is discussed.

  8. Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks

    Science.gov (United States)

    Michael, Andrew J.

    2012-01-01

    Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long‐term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large‐event nucleations is disproportional to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long‐term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long‐term models and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic‐earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first‐order effect on the probabilities obtained from short‐term clustering models for these large events.
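The contrast between the Gutenberg–Richter and characteristic-earthquake probabilities quoted above can be illustrated with a minimal sketch. The function, its `b`-value, and the `p_next` triggering probability are illustrative assumptions for a generic clustering calculation, not the actual models of Reasenberg and Jones (1989) or Agnew and Jones (1991):

```python
def gr_mainshock_probability(m_f, m_target, b=1.0, p_next=0.05):
    """Probability that a magnitude m_f event is followed by an event of
    magnitude >= m_target, assuming triggered events follow a
    Gutenberg-Richter magnitude distribution with b-value b.

    p_next: probability that the event triggers any earthquake of
            magnitude >= m_f (illustrative value, not from the paper).
    """
    # Under G-R, the fraction of triggered events reaching m_target
    # falls off as 10**(-b * (m_target - m_f)).
    return p_next * 10.0 ** (-b * (m_target - m_f))

# An ML 4.8 event followed by an M >= 7 mainshock:
p = gr_mainshock_probability(4.8, 7.0)
print(f"P(M>=7 | M4.8) ~ {p:.2e}")
```

A characteristic-earthquake model replaces the Gutenberg–Richter factor with the (much larger) conditional rate of the characteristic event, which is why the two approaches can differ by more than an order of magnitude.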

  9. The Pocatello Valley, Idaho, earthquake

    Science.gov (United States)

    Rogers, A. M.; Langer, C.J.; Bucknam, R.C.

    1975-01-01

    A Richter magnitude 6.3 earthquake occurred at 8:31 p.m. mountain daylight time on March 27, 1975, near the Utah-Idaho border in Pocatello Valley. The epicenter of the main shock was located at 42.094° N, 112.478° W, and had a focal depth of 5.5 km. This earthquake was the largest in the continental United States since the destructive San Fernando earthquake of February 1971. The main shock was preceded by a magnitude 4.5 foreshock on March 26.

  10. MULTIFLUID MAGNETOHYDRODYNAMIC TURBULENT DECAY

    International Nuclear Information System (INIS)

    Downes, T. P.; O'Sullivan, S.

    2011-01-01

    It is generally believed that turbulence has a significant impact on the dynamics and evolution of molecular clouds and the star formation that occurs within them. Non-ideal magnetohydrodynamic (MHD) effects are known to influence the nature of this turbulence. We present the results of a suite of 512³ resolution simulations of the decay of initially super-Alfvénic and supersonic fully multifluid MHD turbulence. We find that ambipolar diffusion increases the rate of decay of the turbulence while the Hall effect has virtually no impact. The decay of the kinetic energy can be fitted as a power law in time, and the exponent is found to be -1.34 for fully multifluid MHD turbulence. The power spectra of density, velocity, and magnetic field are all steepened significantly by the inclusion of non-ideal terms. The dominant reason for this steepening is ambipolar diffusion, with the Hall effect again playing a minimal role except at short length scales where it creates extra structure in the magnetic field. Interestingly, we find that, at least at these resolutions, the majority of the physics of multifluid turbulence can be captured by simply introducing fixed (in time and space) resistive terms into the induction equation, without the need for a full multifluid MHD treatment. The velocity dispersion is also examined and, in common with previously published results, is found not to be power-law in nature.
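The power-law fit of the kinetic-energy decay described above amounts to a linear regression in log-log space. A minimal sketch on synthetic data, with the exponent -1.34 the abstract reports built in (the data and prefactor are illustrative, not the simulation output):

```python
import numpy as np

# Synthetic kinetic-energy decay E(t) ~ t**(-1.34); the prefactor 3.0
# and the time grid are arbitrary illustrative choices.
t = np.linspace(1.0, 100.0, 200)
E = 3.0 * t ** (-1.34)

# A power law is a straight line in log-log space; the fitted slope
# is the decay exponent.
slope, intercept = np.polyfit(np.log(t), np.log(E), 1)
print(f"fitted decay exponent: {slope:.2f}")  # -1.34
```

On real simulation data the fit would be restricted to the self-similar decay phase, after the initial transient.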

  11. Two-dimensional turbulent convection

    Science.gov (United States)

    Mazzino, Andrea

    2017-11-01

    We present an overview of the most relevant, and sometimes contrasting, theoretical approaches to Rayleigh-Taylor and mean-gradient-forced Rayleigh-Bénard two-dimensional turbulence together with numerical and experimental evidences for their support. The main aim of this overview is to emphasize that, despite the different character of these two systems, especially in relation to their steadiness/unsteadiness, turbulent fluctuations are well described by the same scaling relationships originated from the Bolgiano balance. The latter states that inertial terms and buoyancy terms balance at small scales giving rise to an inverse kinetic energy cascade. The main difference with respect to the inverse energy cascade in hydrodynamic turbulence [R. H. Kraichnan, "Inertial ranges in two-dimensional turbulence," Phys. Fluids 10, 1417 (1967)] is that the rate of cascade of kinetic energy here is not constant along the inertial range of scales. Thanks to the absence of physical boundaries, the two systems here investigated turned out to be a natural physical realization of the Kraichnan scaling regime hitherto associated with the elusive "ultimate state of thermal convection" [R. H. Kraichnan, "Turbulent thermal convection at arbitrary Prandtl number," Phys. Fluids 5, 1374-1389 (1962)].

  12. Evaluating spatial and temporal relationships between an earthquake cluster near Entiat, central Washington, and the large December 1872 Entiat earthquake

    Science.gov (United States)

    Brocher, Thomas M.; Blakely, Richard J.; Sherrod, Brian

    2017-01-01

    We investigate spatial and temporal relations between an ongoing and prolific seismicity cluster in central Washington, near Entiat, and the 14 December 1872 Entiat earthquake, the largest historic crustal earthquake in Washington. A fault scarp produced by the 1872 earthquake lies within the Entiat cluster; the locations and areas of both the cluster and the estimated 1872 rupture surface are comparable. Seismic intensities and the 1–2 m of coseismic displacement suggest a magnitude range between 6.5 and 7.0 for the 1872 earthquake. Aftershock forecast models for (1) the first several hours following the 1872 earthquake, (2) the largest felt earthquakes from 1900 to 1974, and (3) the seismicity within the Entiat cluster from 1976 through 2016 are also consistent with this magnitude range. Based on this aftershock modeling, most of the current seismicity in the Entiat cluster could represent aftershocks of the 1872 earthquake. Other earthquakes, especially those with long recurrence intervals, have long‐lived aftershock sequences, including the Mw 7.5 1891 Nobi earthquake in Japan, with aftershocks continuing 100 yrs after the mainshock. Although we do not rule out ongoing tectonic deformation in this region, a long‐lived aftershock sequence can account for these observations.
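Long-lived aftershock sequences of the kind invoked above are usually described with the modified Omori law, whose rate decays only slowly when the exponent p is near 1. A minimal sketch with illustrative parameters, not values fitted to the Entiat or Nobi sequences:

```python
def omori_rate(t, K=10.0, c=0.1, p=1.0):
    """Modified Omori law: aftershock rate (events/day) t days after the
    mainshock. K, c, p are illustrative placeholders, not fitted values."""
    return K / (c + t) ** p

# With p ~ 1 the rate falls off so slowly that a sequence can remain
# detectable for a century, as with the 1891 Nobi earthquake.
for years in (1, 10, 100):
    t = 365.25 * years
    print(f"{years:3d} yr after mainshock: {omori_rate(t):.5f} events/day")
```

Because the p=1 rate integrates to a logarithm, the cumulative number of aftershocks keeps growing at all times, which is what allows present-day seismicity to be tested as a 140-year-old aftershock sequence.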

  13. Financial Literacy, Financial Education, and Economic Outcomes

    Science.gov (United States)

    Hastings, Justine S.; Madrian, Brigitte C.; Skimmyhorn, William L.

    2013-01-01

    In this article, we review the literature on financial literacy, financial education, and consumer financial outcomes. We consider how financial literacy is measured in the current literature and examine how well the existing literature addresses whether financial education improves financial literacy or personal financial outcomes. We discuss the…

  14. Earthquake Hazard and Risk in New Zealand

    Science.gov (United States)

    Apel, E. V.; Nyst, M.; Fitzenz, D. D.; Molas, G.

    2014-12-01

    To quantify risk in New Zealand we examine the impact of updating the seismic hazard model. The previous RMS New Zealand hazard model is based on the 2002 probabilistic seismic hazard maps for New Zealand (Stirling et al., 2002). The 2015 RMS model, based on Stirling et al. (2012), updates several key source parameters. These updates include: implementation of a new set of crustal faults including multi-segment ruptures, updating the subduction zone geometry and recurrence rate, and implementing new background rates and a robust methodology for modeling background earthquake sources. The number of crustal faults has increased by over 200 from the 2002 model to the 2012 model, which now includes over 500 individual fault sources. This includes the addition of many offshore faults in the northern, east-central, and southwest regions. We also use recent data to update the source geometry of the Hikurangi subduction zone (Wallace, 2009; Williams et al., 2013). We compare hazard changes in our updated model with those from the previous version. Changes between the two maps are discussed, as well as the drivers for these changes. We examine the impact the hazard model changes have on New Zealand earthquake risk. Considered risk metrics include average annual loss, an annualized expected loss level used by insurers to determine the costs of earthquake insurance (and premium levels), and the loss exceedance probability curve used by insurers to address their solvency and manage their portfolio risk. We analyze risk profile changes in areas with large population density and for structures of economic and financial importance. New Zealand is interesting in that the city with the majority of the country's risk exposure (Auckland) lies in the region of lowest hazard, where little is known about the location of faults and distributed seismicity is modeled by averaged Mw-frequency relationships on area sources. Thus small changes to the background rates

  15. Prospective testing of Coulomb short-term earthquake forecasts

    Science.gov (United States)

    Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.

    2009-12-01

    Earthquake-induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models to estimate stress and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary response. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing, in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each “perpetrator” earthquake but before the triggered earthquakes, or “victims”. The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short-term models daily and allows daily updates of the models. However, lots can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough, and to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns, and earthquake probabilities would have to be produced by computer without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. To solve this problem, testing methods need to allow for “censoring” of early aftershock data, and a quantitative model for detection threshold as a function of

  16. Surface latent heat flux as an earthquake precursor

    Directory of Open Access Journals (Sweden)

    S. Dey

    2003-01-01

    Full Text Available The analysis of surface latent heat flux (SLHF) from the epicentral regions of five recent earthquakes that occurred in close proximity to the oceans has been found to show anomalous behavior. The maximum increase of SLHF is found 2–7 days prior to the main earthquake event. This increase is likely due to an ocean-land-atmosphere interaction. The increase of SLHF prior to the main earthquake event is attributed to the increase in infrared thermal (IR) temperature in the epicentral and surrounding region. The anomalous increase in SLHF shows great potential for providing early warning of a disastrous earthquake, provided that there is a better understanding of the background noise in surface latent heat flux due to the tides and monsoon. Efforts have been made to understand the level of background noise in the epicentral regions of the five earthquakes considered in the present paper. A comparison of SLHF from the epicentral regions of the coastal earthquakes and the earthquakes that occurred far away from the coast has been made, and it has been found that the anomalous behavior of SLHF prior to the main earthquake event is only associated with the coastal earthquakes.
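A simple way to flag the kind of SLHF anomaly described above is to threshold the series against its background mean and standard deviation. This is a hypothetical sketch of such a detector, not the authors' actual criterion, and the flux values are invented:

```python
import numpy as np

def slhf_anomalies(series, n_sigma=2.0):
    """Return indices of days whose surface latent heat flux exceeds the
    background mean by n_sigma standard deviations (a simple threshold
    sketch; the paper's exact criterion may differ)."""
    series = np.asarray(series, dtype=float)
    mu, sigma = series.mean(), series.std()
    return np.flatnonzero(series > mu + n_sigma * sigma)

# Illustrative daily SLHF values (W/m^2) with a spike a few days
# before a hypothetical event at the end of the window.
flux = [110, 115, 108, 112, 109, 111, 170, 113, 110, 112, 114]
print(slhf_anomalies(flux))  # index of the anomalous spike: [6]
```

A production detector would estimate the background from a multi-year climatology for the same calendar window, precisely to suppress the tidal and monsoonal noise the abstract warns about.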

  17. Method to Determine Appropriate Source Models of Large Earthquakes Including Tsunami Earthquakes for Tsunami Early Warning in Central America

    Science.gov (United States)

    Tanioka, Yuichiro; Miranda, Greyving Jose Arguello; Gusman, Aditya Riadi; Fujii, Yushiro

    2017-08-01

    Large earthquakes, such as the Mw 7.7 1992 Nicaragua earthquake, have occurred off the Pacific coasts of El Salvador and Nicaragua in Central America and have generated destructive tsunamis along these coasts. It is necessary to determine appropriate fault models before large tsunamis hit the coast. In this study, fault parameters were first estimated from the W-phase inversion, and then an appropriate fault model was determined from the fault parameters and scaling relationships with a depth-dependent rigidity. The method was tested on four large earthquakes that occurred off El Salvador and Nicaragua in Central America: the 1992 Nicaragua tsunami earthquake (Mw 7.7), the 2001 El Salvador earthquake (Mw 7.7), the 2004 El Astillero earthquake (Mw 7.0), and the 2012 El Salvador-Nicaragua earthquake (Mw 7.3). Tsunami numerical simulations were carried out from the determined fault models. We found that the observed tsunami heights, run-up heights, and inundation areas were reasonably well explained by the computed ones. Therefore, for tsunami early warning purposes, our method should work to estimate a fault model that reproduces tsunami heights near the coasts of El Salvador and Nicaragua due to large earthquakes in the subduction zone.

  18. Building financial and insurance resilience in the context of climate change

    Directory of Open Access Journals (Sweden)

    Miškić Miroslav

    2017-01-01

    Full Text Available A key challenge for individuals, businesses and governments is building financial and insurance resilience in a changing climate. It has become an important issue for financial management to create financial protection and insurance means to manage financial losses, reduce the economic impact of disaster events, and support better recovery. Accordingly, the paper provides an overview of field and desk research on the potential income implications of climate change for the financial management of disaster risks and losses. The desk research is based on the Serbian case and its experience with the 2014 floods. Key findings are also provided from the field research conducted in Serbia in 2016 on managing the risk of natural disasters (floods, fires, earthquakes) as a part of organizational risk in 92 manufacturing firms, banks and insurance companies. The methods used are statistical description, the χ2 test and linear regression models. The results of both strands of research on flood risk management showed that companies assess the impact of this risk on their annual revenues as small, that the Serbian government takes a non-strategic approach, and that there is a financial gap of 65% in covering the losses. The results also pointed to low awareness of the problem at the corporate and national levels. The contribution of the paper is to support further development of country and local plans for more effectively reducing the economic disruption of disaster events, policy approaches to supporting the penetration of disaster finance and insurance coverage and the capacity of insurance markets to absorb these risks, and the improvement of the business sector's risk management culture in this field.

  19. Aperture averaging in strong oceanic turbulence

    Science.gov (United States)

    Gökçe, Muhsin Caner; Baykal, Yahya

    2018-04-01

    Receiver aperture averaging is employed in underwater wireless optical communication (UWOC) systems to mitigate the effects of oceanic turbulence and thus improve system performance. The irradiance flux variance is a measure of the intensity fluctuations over a lens of the receiver aperture. Using the modified Rytov theory, which employs small-scale and large-scale spatial filters, and our previously presented expression for the atmospheric structure constant in terms of oceanic turbulence parameters, we evaluate the irradiance flux variance and the aperture averaging factor of a spherical wave in strong oceanic turbulence. Variations of the irradiance flux variance are examined versus the oceanic turbulence parameters and the receiver aperture diameter. The effect of the receiver aperture diameter on the aperture averaging factor in strong oceanic turbulence is also presented.

  20. Earthquake simulation, actual earthquake monitoring and analytical methods for soil-structure interaction investigation

    Energy Technology Data Exchange (ETDEWEB)

    Tang, H T [Seismic Center, Electric Power Research Institute, Palo Alto, CA (United States)

    1988-07-01

    Approaches for conducting in-situ soil-structure interaction experiments are discussed. High explosives detonated under the ground can generate strong ground motion to induce soil-structure interaction (SSI). The explosive-induced data are useful in studying the dynamic characteristics of the soil-structure system associated with the inertial aspect of the SSI problem. The plane waves generated by the explosives cannot adequately address the kinematic interaction associated with actual earthquakes because of the difference in wave fields and their effects. Earthquake monitoring is ideal for obtaining SSI data that can address all aspects of the SSI problem. The only limitation is the level of excitation that can be obtained. Neither the simulated earthquake experiments nor the earthquake monitoring experiments can have exact similitude if reduced-scale test structures are used. If gravity effects are small, reasonable correlations between the scaled model and the prototype can be obtained, provided that the input motion can be scaled appropriately. The key product of the in-situ experiments is the data base that can be used to qualify analytical methods for prototypical applications. (author)

  1. Real-time earthquake data feasible

    Science.gov (United States)

    Bush, Susan

    Scientists agree that early warning devices and monitoring of both Hurricane Hugo and the Mt. Pinatubo volcanic eruption saved thousands of lives. What would it take to develop this sort of early warning and monitoring system for earthquake activity? Not all that much, claims a panel assigned to study the feasibility, costs, and technology needed to establish a real-time earthquake monitoring (RTEM) system. The panel, drafted by the National Academy of Science's Committee on Seismology, has presented its findings in Real-Time Earthquake Monitoring. The recently released report states that “present technology is entirely capable of recording and processing data so as to provide real-time information, enabling people to mitigate somewhat the earthquake disaster.” RTEM systems would consist of two parts—an early warning system that would give a few seconds' warning before severe shaking, and immediate postquake information within minutes of the quake that would give actual measurements of the magnitude. At this time, however, this type of warning system has not been addressed at the national level for the United States and is not included in the National Earthquake Hazard Reduction Program, according to the report.

  2. Critical behavior in earthquake energy dissipation

    Science.gov (United States)

    Wanliss, James; Muñoz, Víctor; Pastén, Denisse; Toledo, Benjamín; Valdivia, Juan Alejandro

    2017-09-01

    We explore bursty multiscale energy dissipation from earthquakes between latitudes 29° S and 35.5° S and longitudes 69.501° W and 73.944° W (the Chilean central zone). Our work compares the predictions of a theory of nonequilibrium phase transitions with nonstandard statistical signatures of earthquake complex scaling behaviors. For temporal scales less than 84 hours, the time development of earthquake radiated-energy activity follows an algebraic arrangement consistent with estimates from the theory of nonequilibrium phase transitions. There are no characteristic scales for the probability distributions of sizes and lifetimes of the activity bursts in the scaling region. The power-law exponents describing the probability distributions suggest that the main energy dissipation takes place in the largest bursts of activity, such as major earthquakes, as opposed to smaller activations, which contribute less significantly though they occur far more often. The results obtained provide statistical evidence that earthquake energy dissipation mechanisms are essentially "scale-free", displaying statistical and dynamical self-similarity. Our results provide some evidence that earthquake radiated energy and directed percolation belong to a similar universality class.
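Power-law exponents of burst-size distributions like those discussed above are commonly estimated by maximum likelihood. A sketch using the standard continuous (Hill-type) estimator on synthetic scale-free data; this is a textbook tool, not the paper's code, and the sample generation is illustrative:

```python
import math
import random

def powerlaw_exponent_mle(values, x_min):
    """Maximum-likelihood estimate of alpha for a density
    p(x) ~ x**(-alpha), x >= x_min (continuous Hill estimator)."""
    tail = [x for x in values if x >= x_min]
    n = len(tail)
    return 1.0 + n / sum(math.log(x / x_min) for x in tail)

# Synthetic burst sizes drawn by inverse-transform sampling from
# p(x) ~ x**(-2.5), x >= 1: x = u**(-1/(alpha-1)) for uniform u.
random.seed(0)
alpha_true = 2.5
samples = [(1.0 - random.random()) ** (-1.0 / (alpha_true - 1.0))
           for _ in range(50_000)]
print(f"estimated alpha: {powerlaw_exponent_mle(samples, 1.0):.2f}")
```

The MLE is preferred over fitting a straight line to a log-log histogram, whose slope estimate is biased by binning and by the noisy tail.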

  3. Centrality in earthquake multiplex networks

    Science.gov (United States)

    Lotfi, Nastaran; Darooneh, Amir Hossein; Rodrigues, Francisco A.

    2018-06-01

    Seismic time series have been mapped as complex networks, where a geographical region is divided into square cells that represent the nodes and connections are defined according to the sequence of earthquakes. In this paper, we map a seismic time series to a temporal network, described by a multiplex network, and characterize the evolution of the network structure in terms of the eigenvector centrality measure. We generalize previous works that considered the single-layer representation of earthquake networks. Our results suggest that the multiplex representation captures earthquake activity better than methods based on single-layer networks. We also verify that the regions with the highest seismological activity in Iran and California can be identified from the network centrality analysis. The temporal modeling of seismic data provided here may open new possibilities for a better comprehension of the physics of earthquakes.

  4. Parallelization of the Coupled Earthquake Model

    Science.gov (United States)

    Block, Gary; Li, P. Peggy; Song, Yuhe T.

    2007-01-01

    This Web-based tsunami simulation system allows users to remotely run a model on JPL's supercomputers for a given undersea earthquake. At the time of this reporting, tsunami prediction over the Internet had never been done before. The new code directly couples the earthquake model and the ocean model on parallel computers, improving simulation speed. Seismometers can only detect information from earthquakes; they cannot detect whether or not a tsunami may occur as a result of the earthquake. When earthquake-tsunami models are coupled with the improved computational speed of modern high-performance computers and constrained by remotely sensed data, they are able to provide early warnings for those coastal regions at risk. The software is capable of testing NASA's satellite observations of tsunamis. It has been successfully tested for several historical tsunamis, has passed all alpha and beta testing, and is well documented for users.

  5. Ionospheric Anomaly before the Kyushu, Japan Earthquake

    Directory of Open Access Journals (Sweden)

    YANG Li

    2017-05-01

    Full Text Available GIM data released by IGS are used in this article, and a new method combining the sliding time window method with correlation analysis of ionospheric TEC at adjacent grid points is proposed to study the relationship between pre-earthquake ionospheric anomalies and earthquakes. By analyzing the abnormal change of TEC at the 5 grid points around the seismic region, abnormal changes of ionospheric TEC are found before the earthquake, and the correlation between the TEC sequences of grid points is significantly affected by the earthquake. Based on analysis of the spatial distribution of the TEC anomaly, anomalies of 6 h, 12 h and 6 h were found near the epicenter three days before the earthquake. Finally, ionospheric tomography is used to perform a tomographic inversion of the electron density, and the distribution of the electron density in the ionospheric anomaly is further analyzed.

  6. Quantify the complexity of turbulence

    Science.gov (United States)

    Tao, Xingtian; Wu, Huixuan

    2017-11-01

    Many researchers have used Reynolds stress, power spectra and Shannon entropy to characterize a turbulent flow, but few have measured the complexity of turbulence. Yet as this study shows, conventional turbulence statistics and Shannon entropy have limits when quantifying flow complexity. Thus, it is necessary to introduce new complexity measures, such as topological complexity and excess information, to describe turbulence. Our test flow is a classic turbulent cylinder wake at Reynolds number 8100. Along the streamwise direction, the flow becomes more isotropic and the magnitudes of the normal Reynolds stresses decrease monotonically. These seem to indicate that the flow dynamics becomes simpler downstream. However, the Shannon entropy keeps increasing along the flow direction, and the dynamics seems to become more complex, because the large-scale vortices cascade to small eddies and the flow is less correlated and more unpredictable. In fact, these two contradictory observations each partially describe the complexity of a turbulent wake. Our measurements (up to 40 diameters downstream of the cylinder) show that the flow's degree of complexity actually increases at first and then becomes constant (or drops slightly) along the streamwise direction. Funded by the University of Kansas General Research Fund.
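The Shannon entropy the abstract contrasts with complexity measures can be estimated from a histogram of the signal. A minimal sketch on synthetic velocity samples (the distributions and bin layout are illustrative, not the study's data):

```python
import numpy as np

def shannon_entropy(samples, bin_edges):
    """Shannon entropy (bits) of a signal, estimated from a histogram on
    fixed bin edges; a minimal version of the entropy measure discussed
    in the abstract."""
    counts, _ = np.histogram(samples, bins=bin_edges)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

edges = np.linspace(-4.0, 4.0, 33)  # 32 common bins for both signals
rng = np.random.default_rng(1)
ordered = rng.normal(0.0, 0.2, 100_000)  # narrowly distributed velocities
mixed = rng.normal(0.0, 1.5, 100_000)    # broad, less predictable field

# The better-mixed field has the higher histogram entropy, even though
# entropy alone says nothing about the flow's structural complexity.
print(shannon_entropy(ordered, edges) < shannon_entropy(mixed, edges))  # True
```

This illustrates the abstract's point: the downstream wake is less predictable (higher entropy) even where its dynamics, by other measures, are no longer becoming more complex.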

  7. Retrospective analysis of the Spitak earthquake

    Directory of Open Access Journals (Sweden)

    A. K. Tovmassian

    1995-06-01

    Full Text Available Based on the retrospective analysis of numerous data and studies of the Spitak earthquake, the present work attempts to shed light on different aspects of that catastrophic seismic event, which occurred in Northern Armenia on December 7, 1988. The authors follow a chronological order of presentation, namely: changes in geosphere, atmosphere and biosphere during the preparation of the Spitak earthquake; foreshocks; main shock; aftershocks; focal mechanisms; historical seismicity; seismotectonic position of the source; strong motion records; site effects; the macroseismic effect; collapse of buildings and structures; rescue activities; earthquake consequences; and the lessons of the Spitak earthquake.

  8. Relation of astrophysical turbulence and magnetic reconnection

    Energy Technology Data Exchange (ETDEWEB)

    Lazarian, A. [Department of Astronomy, University of Wisconsin, 475 North Charter Street, Madison, Wisconsin 53706 (United States); Eyink, Gregory L. [Department of Applied Mathematics and Statistics, Johns Hopkins University, Baltimore, Maryland 21218 (United States); Vishniac, E. T. [Department of Physics and Astronomy, McMaster University, 1280 Main Street West, Hamilton, Ontario L8S 4M1 (Canada)

    2012-01-15

    Astrophysical fluids are generically turbulent, and this must be taken into account for most transport processes. We discuss how preexisting turbulence modifies magnetic reconnection and how magnetic reconnection affects the MHD turbulent cascade. We show the intrinsic interdependence and interrelation of magnetic turbulence and magnetic reconnection: in particular, strong magnetic turbulence in 3D requires reconnection, and 3D magnetic turbulence entails fast reconnection. We follow the approach of Eyink et al. [Astrophys. J. 743, 51 (2011)] to show that the expressions for fast magnetic reconnection in A. Lazarian and E. T. Vishniac [Astrophys. J. 517, 700 (1999)] can be recovered if Richardson diffusion of turbulent flows is used instead of ordinary Ohmic diffusion. This does not, however, revive the concept of magnetic turbulent diffusion, which assumes that magnetic fields can be mixed up passively down to very small dissipation scales. On the contrary, we are dealing with the reconnection of dynamically important magnetic field bundles which strongly resist bending and have a well-defined mean direction weakly perturbed by turbulence. We argue that in the presence of turbulence the very concept of flux freezing requires modification. The diffusion that arises from magnetic turbulence can be called reconnection diffusion, as it is based on reconnection of magnetic field lines. Reconnection diffusion has important implications for continuous transport processes in magnetized plasmas and for star formation. In addition, fast magnetic reconnection in turbulent media induces first-order Fermi acceleration of energetic particles and can explain solar flares and gamma-ray bursts. However, the most dramatic consequence of these developments is that the standard flux-freezing concept must be radically modified in the presence of turbulence.

  9. Financial flow as a part of business logistics

    Directory of Open Access Journals (Sweden)

    Štangová Nora

    2003-09-01

    Full Text Available In recent years, companies have found themselves in an entirely new situation connected with the transition to a market economy. Their prosperity stems from the ability of management to adapt to variable market conditions. The basic aim of a company is not to reach maximal profit but to strive for long-term existence, growth and global optimization. Logistics deals with the philosophy of optimal control of material, information and financial flows. In this contribution we create a comprehensive logistics model of a company which determines the interconnection of these flows. Special attention is given to the information and financial flows. In particular, we highlight the need for correct, timely and reliable information, because it is among a company's most precious resources in these "turbulent" times. We also note the importance of information in obtaining and allocating funds, as well as the need to select the optimal way and method of paying for purchased production factors, sold products or provided services. This optimization process provides for the stability and solvency of the company and its reputation.

  10. Validity of the assumption of Gaussian turbulence; Gyldighed af antagelsen om Gaussisk turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, M.; Hansen, K.S.; Juul Pedersen, B.

    2000-07-01

    Wind turbines are designed to withstand the impact of turbulent winds, whose fluctuations are usually assumed to follow a Gaussian probability distribution. Based on a large number of measurements from many sites, this seems a reasonable assumption in flat, homogeneous terrain, whereas it may fail in complex terrain. At such sites the wind speed often has a skewed distribution with more frequent lulls than gusts. In order to simulate aerodynamic loads, a numerical turbulence simulation method was developed and implemented. This method can simulate multiple time series with an arbitrary, not necessarily Gaussian, distribution without distortion of the spectral distribution or spatial coherence. The simulated time series were used as input to the dynamic-response simulation program Vestas Turbine Simulator (VTS). In this way we simulated the dynamic response of systems exposed to turbulence of either Gaussian or extreme, yet realistic, non-Gaussian probability distribution. Certain loads on turbines with active pitch regulation were enhanced by up to 15% compared to pure Gaussian turbulence. It should, however, be noted that the undesired effect depends on the dynamic system, and it might be mitigated by optimising the wind turbine regulation system for local turbulence characteristics. (au)
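
    One common way to realize such non-Gaussian simulation is a translation process: synthesize a Gaussian series with the target spectrum, then push it through a memoryless skewing map. The sketch below is a generic illustration under that assumption, not the report's actual method; the memoryless map only approximately preserves the spectrum:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8192

# Step 1: Gaussian-like series with a prescribed one-sided spectrum
# S(f) ~ f^(-5/3), built by the random-phase spectral method.
freqs = np.fft.rfftfreq(n, d=1.0)
amp = np.zeros_like(freqs)
amp[1:] = freqs[1:] ** (-5.0 / 6.0)              # |X(f)| ~ sqrt(S(f))
phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)
x = np.fft.irfft(amp * np.exp(1j * phases), n)
x = (x - x.mean()) / x.std()                      # standardize

# Step 2: memoryless transform to a negatively skewed marginal
# ("more frequent lulls than gusts"); a -> 0 recovers the Gaussian case.
a = 0.5
y = (1.0 - np.exp(-a * x)) / a

sample_skew = float(np.mean(((y - y.mean()) / y.std()) ** 3))
```

    Such skewed, spectrum-preserving series are what would then be fed to a load-simulation code in place of the usual Gaussian input.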

  11. Nonlinear acoustic/seismic waves in earthquake processes

    International Nuclear Information System (INIS)

    Johnson, Paul A.

    2012-01-01

    Nonlinear dynamics induced by seismic sources and seismic waves are common in the Earth. Observations range from seismic strong ground motion (the most damaging aspect of earthquakes) and intense near-source effects to distant nonlinear effects from the source that have important consequences. The distant effects include dynamic earthquake triggering—one of the most fascinating topics in seismology today—which may be elastically nonlinearly driven. Dynamic earthquake triggering is the phenomenon whereby seismic waves generated by one earthquake trigger slip events on a nearby or distant fault. Dynamic triggering may take place at distances of thousands of kilometers from the triggering earthquake, and includes triggering of the entire spectrum of slip behaviors currently identified. These include triggered earthquakes and triggered slow, silent slip, during which little seismic energy is radiated. It appears that the elasticity of the fault gouge—the granular material located between the fault blocks—is key to the triggering phenomenon.

  12. On the plant operators performance during earthquake

    International Nuclear Information System (INIS)

    Kitada, Y.; Yoshimura, S.; Abe, M.; Niwa, H.; Yoneda, T.; Matsunaga, M.; Suzuki, T.

    1994-01-01

    There is little data on which to judge the performance of plant operators during and after strong earthquakes. In order to obtain such data and enhance the reliability of plant operation, a Japanese utility and a power plant manufacturer carried out a vibration test using a shaking table. The purpose of the test was to investigate operator performance, i.e., the quickness and correctness of switch handling and panel meter read-out. The movement of chairs during an earthquake was also of interest, because if chairs moved significantly or turned over during a strong earthquake, some arresting mechanism would be required. Although there were differences between the simulated earthquake motions and actual earthquakes, mainly due to the specifications of the shaking table, the earthquake motions had almost no influence on the operators' capability (performance) in operating the simulated console and the personal computers.

  13. Data base pertinent to earthquake design basis

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1988-01-01

    Mitigation of earthquake risk from impending strong earthquakes is possible provided the hazard can be assessed and translated into appropriate design inputs. This requires defining the seismic risk problem, isolating the risk factors and quantifying risk in terms of physical parameters which are suitable for application in design. Like all other geological phenomena, past earthquakes hold the key to the understanding of future ones. Quantification of seismic risk at a site calls for investigating the earthquake aspects of the site region and building a data base. The scope of such investigations is illustrated in Figures 1 and 2. A more detailed definition of the earthquake problem in engineering design is given elsewhere (Sharma, 1987). The present document discusses the earthquake data base which is required to support a seismic risk evaluation programme in the context of the existing state of the art. (author). 8 tables, 10 figs., 54 refs

  14. Seismic-electromagnetic precursors of Romania's Vrancea earthquakes

    International Nuclear Information System (INIS)

    Enescu, B.D.; Enescu, C.; Constantin, A. P.

    1999-01-01

    Diagrams were plotted from electromagnetic data recorded at Muntele Rosu Observatory during December 1996 to January 1997 and December 1997 to September 1998. The times when Vrancea earthquakes of magnitudes M ≥ 3.9 occurred within these periods are marked on the diagrams. The parameters of the earthquakes are given in a table which also includes information on the magnetic and electric anomalies (perturbations) preceding these earthquakes. The magnetic data prove that Vrancea earthquakes are preceded by magnetic perturbations that may be regarded as their short-term precursors. Perturbations which could likewise be seen as short-term precursors of Vrancea earthquakes are also noticed in the electric records. Still, a number of the electric data cast doubt on their precursory nature. Some suggestions are made at the end of the paper on how electromagnetic research should proceed to be of use for Vrancea earthquake prediction. (authors)

  15. Wall roughness induces asymptotic ultimate turbulence

    NARCIS (Netherlands)

    Zhu, Xiaojue; Verschoof, Ruben Adriaan; Bakhuis, Dennis; Huisman, Sander Gerard; Verzicco, Roberto; Sun, Chao; Lohse, Detlef

    2018-01-01

    Turbulence governs the transport of heat, mass and momentum on multiple scales. In real-world applications, wall-bounded turbulence typically involves surfaces that are rough; however, characterizing and understanding the effects of wall roughness on turbulence remains a challenge. Here, by

  16. Earthquake activity along the Himalayan orogenic belt

    Science.gov (United States)

    Bai, L.; Mori, J. J.

    2017-12-01

    The collision between the Indian and Eurasian plates formed the Himalayas, the largest orogenic belt on the Earth. The entire region accommodates shallow earthquakes, while intermediate-depth earthquakes are concentrated at the eastern and western Himalayan syntaxes. Here we investigate the focal depths, fault plane solutions, and source rupture process for three earthquake sequences, which are located at the western, central and eastern regions of the Himalayan orogenic belt. The Pamir-Hindu Kush region is located at the western Himalayan syntaxis and is characterized by extreme shortening of the upper crust and strong interaction of various layers of the lithosphere. Many shallow earthquakes occur on the Main Pamir Thrust at focal depths shallower than 20 km, while intermediate-depth earthquakes are mostly located below 75 km. Large intermediate-depth earthquakes occur frequently at the western Himalayan syntaxis, about every 10 years on average. The 2015 Nepal earthquake is located in the central Himalayas. It is a typical megathrust earthquake that occurred on the shallow portion of the Main Himalayan Thrust (MHT). Many of the aftershocks are located above the MHT and illuminate faulting structures in the hanging wall with dip angles that are steeper than the MHT. These observations provide new constraints on the collision and uplift processes for the Himalayan orogenic belt. The Indo-Burma region is located south of the eastern Himalayan syntaxis, where the strike of the plate boundary suddenly changes from nearly east-west at the Himalayas to nearly north-south at the Burma Arc. The Burma arc subduction zone is a typical oblique plate convergence zone. The eastern boundary is the north-south striking dextral Sagaing fault, which hosts many shallow earthquakes with focal depths less than 25 km. In contrast, intermediate-depth earthquakes along the subduction zone reflect east-west trending reverse faulting.

  17. Prediction of site specific ground motion for large earthquake

    International Nuclear Information System (INIS)

    Kamae, Katsuhiro; Irikura, Kojiro; Fukuchi, Yasunaga.

    1990-01-01

    In this paper, we apply the semi-empirical synthesis method of IRIKURA (1983, 1986) to the estimation of site-specific ground motion using accelerograms observed at Kumatori in Osaka prefecture. The target earthquakes used here are a comparatively distant earthquake (Δ=95 km, M=5.6) caused by the YAMASAKI fault and a near earthquake (Δ=27 km, M=5.6). The results obtained are as follows. 1) The accelerograms from the distant earthquake (M=5.6) are synthesized using aftershock records (M=4.3) of the 1983 YAMASAKI fault earthquake, whose source parameters have been obtained by other authors from the hypocentral distribution of the aftershocks. The resultant synthetic motions show good agreement with the observed ones. 2) The synthesis for the near earthquake (M=5.6; we call this the target earthquake) is made using a small earthquake which occurred in the neighborhood of the target earthquake. Here, we apply two methods for setting the synthesis parameters. One is to use the parameters of the YAMASAKI fault earthquake, which has the same magnitude as the target earthquake; the other is to use parameters obtained from several existing empirical formulas. The resultant synthetic motion with the former parameters shows good agreement with the observed one, but that with the latter does not. 3) We estimate the source parameters from the source spectra of several earthquakes observed at this site. Consequently, we find that small earthquakes (M<4) should be used carefully as Green's functions because their stress drops are not constant. 4) We propose specifying not only the magnitudes but also the seismic moments of the target earthquake and the small earthquake. (J.P.N.)
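
    At its core, semi-empirical (empirical Green's function) synthesis sums delayed copies of a small-event record over a grid of subfaults representing the large event. The toy sketch below uses an invented wavelet and illustrative delays, and omits the moment-scaling and slip-rate correction functions of the actual Irikura formulation:

```python
import numpy as np

# Small-event record serving as the empirical Green's function (a toy
# wavelet here; in practice an observed aftershock accelerogram).
dt = 0.01                                        # sampling interval, s
t = np.arange(0.0, 2.0, dt)
g = np.exp(-3.0 * t) * np.sin(2.0 * np.pi * 5.0 * t)

# Large event idealized as an N x N grid of subfaults, each radiating a
# delayed copy of g. The delays (rupture propagation plus travel-time
# differences) and geometry are purely illustrative.
N = 4
rupture_vel = 2.5                                # km/s
subfault_len = 2.0                               # km
delays = [i * subfault_len / rupture_vel + 0.1 * j
          for i in range(N) for j in range(N)]

n_out = len(t) + int(round(max(delays) / dt)) + 1
synth = np.zeros(n_out)
for d in delays:
    k = int(round(d / dt))
    synth[k:k + len(g)] += g                     # superpose subevent records
```

    The abstract's caution about Green's-function events with non-constant stress drop matters precisely here: the amplitude and corner frequency of g set the character of the whole synthetic.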

  18. Turbulent deflagrations, autoignitions, and detonations

    KAUST Repository

    Bradley, Derek

    2012-09-01

    Measurements of turbulent burning velocities in fan-stirred explosion bombs show an initial linear increase with the fan speed and RMS turbulent velocity. The line then bends over to form a plateau of high values around the maximum attainable burning velocity. A further increase in fan speed leads to the eventual complete quenching of the flame due to increasing localised extinctions because of the flame stretch rate. The greater the Markstein number, the more readily does flame quenching occur. Flame propagation along a duct closed at one end, with and without baffles to increase the turbulence, is subjected to a one-dimensional analysis. The flame, initiated at the closed end of the long duct, accelerates by the turbulent feedback mechanism, creating a shock wave ahead of it, until the maximum turbulent burning velocity for the mixture is attained. With the confining walls, the mixture is compressed between the flame and the shock plane up to the point where it might autoignite. This can be followed by a deflagration to detonation transition. The maximum shock intensity occurs with the maximum attainable turbulent burning velocity, and this defines the limit for autoignition of the mixture. For more reactive mixtures, autoignition can occur at turbulent burning velocities that are less than the maximum attainable one. Autoignition can be followed by quasi-detonation or fully developed detonation. The stability of ensuing detonations is discussed, along with the conditions that may lead to their extinction. © 2012 by Pleiades Publishing, Ltd.

  19. Inflow Turbulence Generation Methods

    Science.gov (United States)

    Wu, Xiaohua

    2017-01-01

    Research activities on inflow turbulence generation methods have been vigorous over the past quarter century, accompanying advances in eddy-resolving computations of spatially developing turbulent flows with direct numerical simulation, large-eddy simulation (LES), and hybrid Reynolds-averaged Navier-Stokes-LES. The weak recycling method, rooted in scaling arguments on the canonical incompressible boundary layer, has been applied to supersonic boundary layer, rough surface boundary layer, and microscale urban canopy LES coupled with mesoscale numerical weather forecasting. Synthetic methods, originating from analytical approximation to homogeneous isotropic turbulence, have branched out into several robust methods, including the synthetic random Fourier method, synthetic digital filtering method, synthetic coherent eddy method, and synthetic volume forcing method. This article reviews major progress in inflow turbulence generation methods with an emphasis on fundamental ideas, key milestones, representative applications, and critical issues. Directions for future research in the field are also highlighted.
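
    The synthetic random Fourier idea mentioned above can be illustrated by superposing random modes whose amplitudes follow a model energy spectrum. All parameters below (mode count, spectrum shape, wavenumber range) are invented for illustration; a real inflow generator would build a divergence-free 3D vector field:

```python
import numpy as np

rng = np.random.default_rng(3)

n_modes = 200
x = np.linspace(0.0, 10.0, 512)

# Model energy spectrum with a low-wavenumber peak (von Karman-like shape).
k = rng.uniform(0.5, 20.0, n_modes)              # sampled wavenumbers
E = k ** 4 * np.exp(-2.0 * (k / 4.0) ** 2)
amps = np.sqrt(E / E.sum())                      # mode amplitudes from E(k)
phases = rng.uniform(0.0, 2.0 * np.pi, n_modes)

# Superpose random Fourier modes into a synthetic turbulent signal u(x).
u = np.sum(amps[:, None] * np.cos(k[:, None] * x[None, :] + phases[:, None]),
           axis=0)
```

    This is the basic construction that the synthetic digital-filtering, coherent-eddy, and volume-forcing variants refine to impose realistic coherence and anisotropy.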

  20. Clustered and transient earthquake sequences in mid-continents

    Science.gov (United States)

    Liu, M.; Stein, S. A.; Wang, H.; Luo, G.

    2012-12-01

    Earthquakes result from sudden release of strain energy on faults. On plate boundary faults, strain energy is constantly accumulating from steady and relatively rapid relative plate motion, so large earthquakes continue to occur so long as motion continues on the boundary. In contrast, such steady accumulation of strain energy does not occur on faults in mid-continents, because the far-field tectonic loading is not steadily distributed between faults, and because stress perturbations from complex fault interactions and other stress triggers can be significant relative to the slow tectonic stressing. Consequently, mid-continental earthquakes are often temporally clustered and transient, and spatially migrating. This behavior is well illustrated by large earthquakes in North China in the past two millennia, during which no single large earthquake repeated on the same fault segments, but moment release between large fault systems was complementary. Slow tectonic loading in mid-continents also causes long aftershock sequences. We show that the recent small earthquakes in the Tangshan region of North China are aftershocks of the 1976 Tangshan earthquake (M 7.5), rather than indicators of a new phase of seismic activity in North China, as many fear. Understanding the transient behavior of mid-continental earthquakes has important implications for assessing earthquake hazards. The sequence of large earthquakes in the New Madrid Seismic Zone (NMSZ) in the central US, which includes a cluster of M~7 events in 1811-1812 and perhaps a few similar ones in the past millennium, is likely a transient process, releasing previously accumulated elastic strain on recently activated faults. If so, this earthquake sequence will eventually end. Using simple analysis and numerical modeling, we show that the large NMSZ earthquakes may be ending now or in the near future.

  1. Investigating landslides caused by earthquakes - A historical review

    Science.gov (United States)

    Keefer, D.K.

    2002-01-01

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  2. Post-Earthquake Reconstruction — in Context of Housing

    Science.gov (United States)

    Sarkar, Raju

    Comprehensive rescue and relief operations are launched without loss of time, with the active participation of the army, governmental agencies, donor agencies, NGOs, and other voluntary organizations, after each natural disaster. Several types of natural disaster occur throughout the world year-round, and one of them is the earthquake. More than any other natural catastrophe, an earthquake represents the undoing of our most basic preconception of the earth as a source of stability, and its first distressing consequence is the collapse of our dwelling units. Earthquakes have affected buildings since people began constructing them, so after each earthquake a housing reconstruction program is essential, since housing is a shelter satisfying one of the so-called basic needs next to food and clothing. It is a well-known fact that resettlement (after an earthquake) is often accompanied by the creation of ghettos and ensuing problems in the provision of infrastructure and employment. In fact, a housing project after the Bhuj earthquake in Gujarat, India, illustrates all the negative aspects of resettlement in the context of reconstruction. The main theme of this paper is to consider a few issues associated with post-earthquake reconstruction in the context of housing, all of which are significant to communities that have had to rebuild after catastrophe or that will face such a need in the future. A few of them are as follows: (1) Why are rebuilding opportunities time consuming? (2) What are the causes of failure in post-earthquake resettlement? (3) How can holistic planning after an earthquake be carried out? (4) What criteria should sustainable building materials meet? (5) What are the criteria for success in post-earthquake resettlement? (6) How can mitigation in post-earthquake housing be achieved using appropriate repair, restoration, and strengthening concepts?

  3. Modeling financial disaster risk management in developing countries

    Science.gov (United States)

    Mechler, R.; Hochrainer, S.; Pflug, G.; Linnerooth-Bayer, J.

    2005-12-01

    The public sector plays a major role in reducing the long-term economic repercussions of disasters by repairing damaged infrastructure and providing financial assistance to households and businesses. If critical infrastructure is not repaired in a timely manner, there can be serious effects on the economy and the livelihoods of the population. The repair of public infrastructure, however, can be a significant drain on public budgets, especially in developing and transition countries. Developing country governments frequently lack the liquidity, even including international aid and loans, to fully repair damaged critical public infrastructure or provide sufficient support to households and businesses for their recovery. The earthquake in Gujarat, and other recent cases of government post-disaster liquidity crises, have sounded an alarm, prompting financial development organizations, such as the World Bank, among others, to call for greater attention to reducing financial vulnerability and increasing the resilience of the public sector. This talk reports on a model designed to illustrate the tradeoffs and choices a developing country must make in financially managing the economic risks due to natural disasters. Budgetary resources allocated to pre-disaster risk management strategies, such as loss mitigation measures, a catastrophe reserve fund, insurance and contingent credit arrangements for public assets, reduce the probability of financing gaps - the inability of governments to meet their full obligations in providing relief to private victims and restoring public infrastructure - or prevent the deterioration of the ability to undertake additional borrowing without incurring a debt crisis. The model - which is equipped with a graphical interface - can be a helpful tool for building the capacity of policy makers to develop and assess public financing strategies for disaster risk by indicating the respective costs and consequences of financing alternatives.
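
    The tradeoff such a model explores can be mimicked with a toy Monte Carlo: draw heavy-tailed annual losses and count how often they exceed the pre-arranged reserve fund plus contingent credit. All figures below are hypothetical and the loss distribution is illustrative, not calibrated to any country:

```python
import numpy as np

rng = np.random.default_rng(4)

years, trials = 10, 20_000
reserve_fund = 50.0        # pre-arranged catastrophe reserve (monetary units)
contingent_credit = 100.0  # pre-negotiated post-disaster credit line

# Heavy-tailed annual losses: most years small, occasional catastrophe.
losses = rng.pareto(a=1.5, size=(trials, years)) * 10.0

# A financing gap occurs in a year whose loss exceeds available resources.
gap = losses > (reserve_fund + contingent_credit)
p_gap_per_decade = float(np.mean(gap.any(axis=1)))
```

    Re-running with a larger `reserve_fund` or `contingent_credit` lowers `p_gap_per_decade`, which is exactly the cost-versus-resilience tradeoff the abstract describes.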

  4. Recent results on analytical plasma turbulence theory: Realizability, intermittency, submarginal turbulence, and self-organized criticality

    International Nuclear Information System (INIS)

    Krommes, J.A.

    2000-01-01

    Recent results and future challenges in the systematic analytical description of plasma turbulence are described. First, the importance of statistical realizability is stressed, and the development and successes of the Realizable Markovian Closure are briefly reviewed. Next, submarginal turbulence (linearly stable but nonlinearly self-sustained fluctuations) is considered and the relevance of nonlinear instability in neutral-fluid shear flows to submarginal turbulence in magnetized plasmas is discussed. For the Hasegawa-Wakatani equations, a self-consistency loop that leads to steady-state vortex regeneration in the presence of dissipation is demonstrated and a partial unification of recent work of Drake (for plasmas) and of Waleffe (for neutral fluids) is given. Brief remarks are made on the difficulties facing a quantitatively accurate statistical description of submarginal turbulence. Finally, possible connections between intermittency, submarginal turbulence, and self-organized criticality (SOC) are considered and outstanding questions are identified

  5. Measures for groundwater security during and after the Hanshin-Awaji earthquake (1995) and the Great East Japan earthquake (2011), Japan

    Science.gov (United States)

    Tanaka, Tadashi

    2016-03-01

    Many big earthquakes have occurred in the tectonic regions of the world, especially in Japan. Earthquakes often cause damage to crucial life services such as water, gas and electricity supply systems, and even the sewage system, in urban and rural areas. The most severe problem for people affected by earthquakes is access to water for drinking/cooking and toilet flushing. Securing safe water for daily life in an earthquake emergency requires the establishment of countermeasures, especially in a mega city like Tokyo. This paper describes some examples of groundwater use in earthquake emergencies, with reference to reports, books and newspapers published in Japan. The consensus is that groundwater, as a source of water, plays a major role in earthquake emergencies, especially where the accessibility of wells coincides with the emergency need. It is also important to introduce a registration system for citizen-owned and company wells that can form the basis of a cooperative during a disaster; such a registration system was implemented by many Japanese local governments after the Hanshin-Awaji Earthquake in 1995 and the Great East Japan Earthquake in 2011, and is one of the most effective countermeasures for groundwater use in an earthquake emergency. Emphasis is also placed on the importance of establishing a continuous monitoring system of groundwater conditions, for both quantity and quality, during non-emergency periods.

  6. Compressibility, turbulence and high speed flow

    CERN Document Server

    Gatski, Thomas B

    2009-01-01

    This book introduces the reader to the field of compressible turbulence and compressible turbulent flows across a broad speed range through a unique complementary treatment of both the theoretical foundations and the measurement and analysis tools currently used. For the computation of turbulent compressible flows, current methods of averaging and filtering are presented so that the reader is exposed to a consistent development of applicable equation sets for both the mean or resolved fields as well as the transport equations for the turbulent stress field. For the measurement of turbulent compressible flows, current techniques ranging from hot-wire anemometry to PIV are evaluated and limitations assessed. Characterizing dynamic features of free shear flows, including jets, mixing layers and wakes, and wall-bounded flows, including shock-turbulence and shock boundary-layer interactions, obtained from computations, experiments and simulations are discussed. Key features: * Describes prediction methodologies in...

  7. Compressibility, turbulence and high speed flow

    CERN Document Server

    Gatski, Thomas B

    2013-01-01

    Compressibility, Turbulence and High Speed Flow introduces the reader to the field of compressible turbulence and compressible turbulent flows across a broad speed range, through a unique complementary treatment of both the theoretical foundations and the measurement and analysis tools currently used. The book provides the reader with the necessary background and current trends in the theoretical and experimental aspects of compressible turbulent flows and compressible turbulence. Detailed derivations of the pertinent equations describing the motion of such turbulent flows are provided, and an extensive discussion of the various approaches used in predicting both free shear and wall-bounded flows is presented. Experimental measurement techniques common to the compressible flow regime are introduced, with particular emphasis on the unique challenges presented by high speed flows. Both experimental and numerical simulation work is supplied throughout to provide the reader with an overall perspective of current tre...

  8. New geological perspectives on earthquake recurrence models

    International Nuclear Information System (INIS)

    Schwartz, D.P.

    1997-01-01

    In most areas of the world the record of historical seismicity is too short or uncertain to accurately characterize the future distribution of earthquakes of different sizes in time and space. Most faults have not ruptured once, let alone repeatedly. Ultimately, the ability to correctly forecast the magnitude, location, and probability of future earthquakes depends on how well one can quantify the past behavior of earthquake sources. Paleoseismological trenching of active faults, historical surface ruptures, liquefaction features, and shaking-induced ground deformation structures provides fundamental information on the past behavior of earthquake sources. These studies quantify (a) the timing of individual past earthquakes and fault slip rates, which lead to estimates of recurrence intervals and the development of recurrence models and (b) the amount of displacement during individual events, which allows estimates of the sizes of past earthquakes on a fault. When timing and slip per event are combined with information on fault zone geometry and structure, models that define individual rupture segments can be developed. Paleoseismicity data, in the form of timing and size of past events, provide a window into the driving mechanism of the earthquake engine--the cycle of stress build-up and release
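
The recurrence-interval and slip-rate estimates described above reduce to simple arithmetic on trench-derived event chronologies. A minimal sketch, with entirely hypothetical event dates and per-event displacements:

```python
# Sketch: recurrence statistics from a paleoseismic event chronology.
# Event dates and slip values are hypothetical, for illustration only.
event_dates = [-5600, -4100, -2900, -1500, -200, 1100]  # years CE (negative = BCE)
slip_per_event_m = [3.2, 2.8, 3.5, 3.0, 2.9, 3.3]       # metres, from trenching

# Recurrence intervals between successive dated events.
intervals = [b - a for a, b in zip(event_dates, event_dates[1:])]
mean_interval = sum(intervals) / len(intervals)

# Slip rate: cumulative slip after the first event over the elapsed time.
slip_rate_mm_yr = 1000.0 * sum(slip_per_event_m[1:]) / (event_dates[-1] - event_dates[0])

print(f"mean recurrence interval: {mean_interval:.0f} yr")
print(f"approximate slip rate: {slip_rate_mm_yr:.1f} mm/yr")
```

Combined with fault geometry, such interval and slip-per-event estimates are the inputs to the recurrence models the abstract describes.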

  9. Earthquake Education in Prime Time

    Science.gov (United States)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. 
We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  10. Turbulent viscosity and scale laws in turbulent jets with variable density; Viscosité turbulente et lois d'échelles dans les jets turbulents à masse volumique variable

    Energy Technology Data Exchange (ETDEWEB)

    Pietri, L.; Amielh, M.; Anselmet, F.; Fulachier, L. [Institut de Recherche sur les Phénomènes Hors Équilibre, Équipe Turbulence, 13 - Marseille (France)

    1997-12-31

    Turbulent flows with strong density variations, like helium jets in ambient air, have specific properties linked to the difference in gas densities. This paper presents some experimental results on the turbulence properties of such flows: the Reynolds stresses and the associated turbulent viscosity, and some characteristics linked to the statistical properties of the different turbulence scales. These latter results show the complexity of such flows, characterized by the influence of external parameters (Reynolds number, initial density ratio, initial momentum flux) that govern their evolution inside the jet from the nozzle up to the regions where similarity properties are reached. (J.S.) 12 refs.

  11. Earthquakes of Garhwal Himalaya region of NW Himalaya, India: A study of relocated earthquakes and their seismogenic source and stress

    Science.gov (United States)

    R, A. P.; Paul, A.; Singh, S.

    2017-12-01

    Since the continent-continent collision 55 Ma, the Himalaya has accommodated 2000 km of convergence along its arc. Strain energy is accumulated at a rate of 37-44 mm/yr and is released at times as earthquakes. The Garhwal Himalaya is located on the western side of a seismic gap, where a great earthquake has been overdue for at least 200 years. This seismic gap (Central Seismic Gap: CSG), with a 52% probability of a future great earthquake, is located between the rupture zones of two significant/great earthquakes, viz. the 1905 Kangra earthquake of M 7.8 and the 1934 Bihar-Nepal earthquake of M 8.0; the most recent one, the 2015 Gorkha earthquake of M 7.8, lies on the eastern side of this seismic gap (CSG). The Garhwal Himalaya is one of the ideal locations in the Himalaya where all the major Himalayan structures and the Himalayan Seismicity Belt (HSB) can be described and studied. In the present study, we present the spatio-temporal analysis of relocated local micro- to moderate earthquakes, recorded by a seismicity monitoring network operational since 2007. The earthquake locations are relocated using the HypoDD (double-difference hypocenter method for earthquake relocations) program. The dataset from July 2007 to September 2015 has been used in this study to estimate the spatio-temporal relationships, moment tensor (MT) solutions for the earthquakes of M>3.0, stress tensors and their interactions. We have also used composite focal mechanism solutions for small earthquakes. The majority of the MT solutions show a thrust-type mechanism and are located near the mid-crustal-ramp (MCR) structure of the detachment surface at 8-15 km depth beneath the outer lesser Himalaya and higher Himalaya regions. The prevailing stress has been identified as compressional towards NNE-SSW, which is the direction of relative plate motion between the India and Eurasia continental plates. 
The low friction coefficient estimated along with the stress inversions

  12. Mexican Earthquakes and Tsunamis Catalog Reviewed

    Science.gov (United States)

    Ramirez-Herrera, M. T.; Castillo-Aja, R.

    2015-12-01

    Today the availability of information on the internet makes online catalogs very easy to access by both scholars and the public in general. The catalog in the "Significant Earthquake Database", managed by the National Center for Environmental Information (NCEI, formerly NCDC), NOAA, provides access to tabular and cartographic data related to the earthquakes and tsunamis contained in the database. The NCEI catalog is the product of compiling previously existing catalogs, historical sources, newspapers, and scientific articles. Because the NCEI catalog has global coverage, the information is not homogeneous. The existence of historical information depends on the presence of people in the places where a disaster occurred, and on the description being preserved in documents and oral tradition. In the case of instrumental data, availability depends on the distribution and quality of seismic stations. Therefore, the availability of information for the first half of the 20th century can be improved by careful analysis of the available information and by searching for and resolving inconsistencies. This study shows the advances we made in upgrading and refining data for the earthquake and tsunami catalog of Mexico from 1500 CE until today, presented in the format of a table and map. Data analysis allowed us to identify the following sources of error in the location of the epicenters in existing catalogs:
    • Incorrect coordinate entry
    • Erroneous or mistaken place names
    • Data too general to locate the epicenter, mainly for older earthquakes
    • Inconsistency between earthquake and tsunami occurrence: an earthquake epicenter located too far inland reported as tsunamigenic
    The process of completing the catalogs directly depends on the availability of information; as new archives are opened for inspection, there are more opportunities to complete the history of large earthquakes and tsunamis in Mexico. Here, we also present new earthquake and

  13. Earthquake risk assessment of Alexandria, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Gaber, Hanan; Ibrahim, Hamza

    2015-01-01

    Throughout historical and recent times, Alexandria has suffered great damage due to earthquakes from both near- and far-field sources. Sometimes, the sources of such damage are not well known. During the twentieth century, the city was shaken by several earthquakes generated from inland dislocations (e.g., 29 Apr. 1974, 12 Oct. 1992, and 28 Dec. 1999) and from the African continental margin (e.g., 12 Sept. 1955 and 28 May 1998). Therefore, this study estimates the earthquake ground shaking and the consequent impacts in Alexandria on the basis of two earthquake scenarios. The simulation results show that Alexandria is affected by both earthquake scenarios in roughly the same manner, although the number of casualties in the first scenario (inland dislocation) is twice that of the second (African continental margin). Running the first scenario, an expected 2.27 % of Alexandria's total constructions (12.9 million, 2006 census) will be affected, with injuries to 0.19 % and deaths of 0.01 % of the total population (4.1 million, 2006 census). The earthquake risk profile reveals that three districts (Al-Montazah, Al-Amriya, and Shark) lie at high seismic risk, two districts (Gharb and Wasat) at moderate risk, and two districts (Al-Gomrok and Burg El-Arab) at low risk. Moreover, the building damage estimates show that Al-Montazah is the most vulnerable district, as 73 % of the expected damage is concentrated there. The undertaken analysis shows that the Alexandria urban area faces high risk. Informal areas and deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated earthquake risks (building damages) are concentrated in the most densely populated districts (Al-Montazah, Al-Amriya, and Shark). Moreover, about 75 % of casualties are in the same districts.

  14. Towards the Future "Earthquake" School in the Cloud: Near-real Time Earthquake Games Competition in Taiwan

    Science.gov (United States)

    Chen, K. H.; Liang, W. T.; Wu, Y. F.; Yen, E.

    2014-12-01

    To prevent future threats from natural disasters, it is important to understand how a disaster happened, why lives were lost, and what lessons have been learned. In this way, the attitude of society toward natural disasters can be transformed from training to learning. The citizen-seismologists-in-Taiwan project is designed to elevate the quality of earthquake science education by incorporating earthquake/tsunami stories and a near-real-time earthquake games competition into the traditional curricula in schools. Through pilot courses and professional development workshops, we have worked closely with teachers from elementary, junior high, and senior high schools to design workable teaching plans through the practical operation of seismic monitoring at home or school. We will introduce how 9-year-olds do P- and S-wave picking and measure seismic intensity through an interactive learning platform, how scientists and school teachers work together, and how we create an environment to facilitate continuous learning (i.e., a near-real-time earthquake games competition), to make earthquake science fun.

  15. Rapid earthquake characterization using MEMS accelerometers and volunteer hosts following the M 7.2 Darfield, New Zealand, Earthquake

    Science.gov (United States)

    Lawrence, J. F.; Cochran, E.S.; Chung, A.; Kaiser, A.; Christensen, C. M.; Allen, R.; Baker, J.W.; Fry, B.; Heaton, T.; Kilb, Debi; Kohler, M.D.; Taufer, M.

    2014-01-01

    We test the feasibility of rapidly detecting and characterizing earthquakes with the Quake‐Catcher Network (QCN) that connects low‐cost microelectromechanical systems accelerometers to a network of volunteer‐owned, Internet‐connected computers. Following the 3 September 2010 M 7.2 Darfield, New Zealand, earthquake we installed over 180 QCN sensors in the Christchurch region to record the aftershock sequence. The sensors are monitored continuously by the host computer and send trigger reports to the central server. The central server correlates incoming triggers to detect when an earthquake has occurred. The location and magnitude are then rapidly estimated from a minimal set of received ground‐motion parameters. Full seismic time series are typically not retrieved for tens of minutes or even hours after an event. We benchmark the QCN real‐time detection performance against the GNS Science GeoNet earthquake catalog. Under normal network operations, QCN detects and characterizes earthquakes within 9.1 s of the earthquake rupture and determines the magnitude within 1 magnitude unit of that reported in the GNS catalog for 90% of the detections.
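
The trigger-association step described above (correlating incoming trigger reports from volunteer hosts to declare a detection) can be sketched as follows. Host names, times, and thresholds are invented for illustration, not QCN's actual parameters:

```python
# Sketch of QCN-style trigger association: declare a detection when enough
# distinct hosts trigger within a short time window. Values are hypothetical.
from collections import namedtuple

Trigger = namedtuple("Trigger", "host time pga")  # time in s, pga in g

def detect(triggers, window_s=5.0, min_hosts=4):
    """Return clusters of triggers from distinct hosts within window_s."""
    triggers = sorted(triggers, key=lambda t: t.time)
    detections = []
    i = 0
    while i < len(triggers):
        cluster = [t for t in triggers if 0 <= t.time - triggers[i].time <= window_s]
        if len({t.host for t in cluster}) >= min_hosts:
            detections.append(cluster)
            i += len(cluster)          # skip past this cluster
        else:
            i += 1                     # lone trigger: likely local noise
    return detections

triggers = [Trigger("h1", 10.0, 0.02), Trigger("h2", 10.4, 0.05),
            Trigger("h3", 11.1, 0.03), Trigger("h4", 12.2, 0.04),
            Trigger("h5", 300.0, 0.01)]
print(len(detect(triggers)))  # → 1
```

A real server would additionally weight triggers by station location before estimating origin, location, and magnitude, as the abstract describes.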

  16. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful, but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, to assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and to provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
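
A common way to score forecasts in the binned-rate format described above is a joint Poisson log-likelihood over bins. The sketch below uses invented rates and counts and is not the RELM implementation itself:

```python
# Sketch of a per-bin likelihood score: each model gives an expected
# earthquake rate per space-magnitude bin; the joint log-likelihood of the
# observed counts (assumed Poisson) scores the model. Numbers are invented.
from math import log, lgamma

def poisson_loglik(rates, counts):
    """Joint Poisson log-likelihood of observed counts given forecast rates."""
    return sum(-r + n * log(r) - lgamma(n + 1) for r, n in zip(rates, counts))

model_a = [0.1, 0.5, 2.0, 0.2]   # forecast rates per bin
model_b = [0.4, 0.4, 0.4, 0.4]
observed = [0, 1, 2, 0]          # earthquakes observed in each bin

# Higher (less negative) log-likelihood = better agreement with observations.
print(poisson_loglik(model_a, observed) > poisson_loglik(model_b, observed))  # → True
```

The pairwise relative-consistency comparison the abstract envisions would compare such scores between competing models over the same prospective catalog.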

  17. The use of radon as an earthquake precursor

    International Nuclear Information System (INIS)

    Ramola, R.C.; Singh, M.; Sandhu, A.S.; Singh, S.; Virk, H.S.

    1990-01-01

    Radon monitoring for earthquake prediction is part of an integral approach since the discovery of coherent and time anomalous radon concentrations prior to, during and after the 1966 Tashkent earthquake. In this paper some studies of groundwater and soil gas radon content in relation to earthquake activities are reviewed. Laboratory experiments and the development of groundwater and soil gas radon monitoring systems are described. In addition, radon monitoring studies conducted at the Guru Nanak Dev University Campus since 1986 are presented in detail. During these studies some anomalous changes in radon concentration were recorded before earthquakes occurred in the region. The anomalous radon increases are independent of meteorological conditions and appear to be caused by strain changes, which precede the earthquake. Anomalous changes in radon concentration before an earthquake suggest that radon monitoring can serve as an additional technique in the earthquake prediction programme in India. (author)
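
A minimal illustration of the kind of anomaly flagging such monitoring implies: a reading counts as anomalous when it exceeds the quiet-period mean by some multiple of the standard deviation. The readings and the 2-sigma rule below are hypothetical, not from the cited studies:

```python
# Illustrative only: flag a radon anomaly as an excursion beyond the
# quiet-period mean plus 2 standard deviations. Readings are hypothetical
# concentrations (Bq/m^3).
from statistics import mean, stdev

baseline = [42, 45, 40, 44, 43, 41, 46, 44]   # quiet-period readings
threshold = mean(baseline) + 2 * stdev(baseline)

new_readings = [44, 47, 61, 58]
anomalies = [x for x in new_readings if x > threshold]
print(anomalies)  # → [61, 58]
```

As the abstract notes, a practical system must also rule out meteorological drivers before attributing such excursions to pre-seismic strain changes.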

  18. Modeling fast and slow earthquakes at various scales.

    Science.gov (United States)

    Ide, Satoshi

    2014-01-01

    Earthquake sources represent dynamic rupture within rocky materials at depth and often can be modeled as propagating shear slip controlled by friction laws. These laws provide boundary conditions on fault planes embedded in elastic media. Recent developments in observation networks, laboratory experiments, and methods of data analysis have expanded our knowledge of the physics of earthquakes. Newly discovered slow earthquakes are qualitatively different phenomena from ordinary fast earthquakes and provide independent information on slow deformation at depth. Many numerical simulations have been carried out to model both fast and slow earthquakes, but problems remain, especially with scaling laws. Some mechanisms are required to explain the power-law nature of earthquake rupture and the lack of characteristic length. Conceptual models that include a hierarchical structure over a wide range of scales would be helpful for characterizing diverse behavior in different seismic regions and for improving probabilistic forecasts of earthquakes.

  19. Turbulent Flame Speed Scaling for Positive Markstein Number Expanding Flames in Near Isotropic Turbulence

    Science.gov (United States)

    Chaudhuri, Swetaprovo; Wu, Fujia; Law, Chung

    2012-11-01

    In this work we clarify the role of Markstein diffusivity in turbulent flame speed and its scaling, from analysis and experimental measurements of constant-pressure expanding flames propagating in near-isotropic turbulence. For all C0-C4 hydrocarbon-air mixtures presented in this work, and for recently published C8 data from Leeds, the normalized turbulent flame speed data of individual mixtures approximately follow the recent theoretical and experimental Re_T,f^0.5 scaling, where the average radius is the length scale and thermal diffusivity is the transport property. We observe that, for a constant Re_T,f^0.5, the normalized turbulent flame speed decreases with increasing Mk. This can be explained by considering Markstein diffusivity as the large-wavenumber dissipation mechanism for flame surface fluctuations. As originally suggested by the theory, replacing thermal diffusivity with Markstein diffusivity in the turbulence Reynolds number definition above, the present and Leeds datasets can be scaled by the new Re_T,f^0.5 irrespective of the fuel considered, equivalence ratio, pressure and turbulence intensity for positive-Mk flames. This work was supported by the Combustion Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Basic Energy Sciences under Award Number DE-SC0001198 and by the Air Force Office of Scientific Research.
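
The Re_T,f^0.5 scaling described above can be sketched numerically: build the turbulence Reynolds number from the rms velocity u', the mean flame radius R, and a diffusivity D (thermal, or Markstein as proposed). All numbers below are hypothetical, for illustration only:

```python
# Sketch of the Re_T,f^0.5 scaling: a larger (Markstein) diffusivity lowers
# Re_T,f and hence the predicted normalized flame speed, as observed for
# positive-Mk flames. All parameter values are hypothetical.
def re_tf(u_rms, radius, diffusivity):
    return u_rms * radius / diffusivity

def normalized_flame_speed(u_rms, radius, diffusivity, c=1.0):
    # S_T / S_L ~ c * Re_T,f^0.5 (prefactor c absorbs mixture constants)
    return c * re_tf(u_rms, radius, diffusivity) ** 0.5

s_thermal = normalized_flame_speed(1.0, 0.02, 2.0e-5)    # thermal diffusivity
s_markstein = normalized_flame_speed(1.0, 0.02, 6.0e-5)  # larger Markstein diffusivity
print(s_thermal > s_markstein)  # → True
```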

  20. Geodetic Finite-Fault-based Earthquake Early Warning Performance for Great Earthquakes Worldwide

    Science.gov (United States)

    Ruhl, C. J.; Melgar, D.; Grapenthin, R.; Allen, R. M.

    2017-12-01

    GNSS-based earthquake early warning (EEW) algorithms estimate fault finiteness and unsaturated moment magnitude for the largest, most damaging earthquakes. Because large events are infrequent, the algorithms are not regularly exercised and are insufficiently tested on the few available datasets. The Geodetic Alarm System (G-larmS) is a GNSS-based finite-fault algorithm developed as part of the ShakeAlert EEW system in the western US. Performance evaluations using synthetic earthquakes offshore Cascadia showed that G-larmS satisfactorily recovers magnitude and fault length, providing useful alerts 30-40 s after origin time and timely warnings of ground motion for onshore urban areas. An end-to-end test of the ShakeAlert system demonstrated the need for GNSS data to accurately estimate ground motions in real time. We replay real data from several subduction-zone earthquakes worldwide to demonstrate the value of GNSS-based EEW for the largest, most damaging events. We compare predicted peak ground acceleration (PGA) from first-alert solutions with that recorded in major urban areas. In addition, where applicable, we compare observed tsunami heights to those predicted from the G-larmS solutions. We show that finite-fault inversion based on GNSS data is essential to achieving the goals of EEW.

  1. Safety and survival in an earthquake

    Science.gov (United States)

    ,

    1969-01-01

    Many earth scientists in this country and abroad are focusing their studies on the search for means of predicting impending earthquakes, but, as yet, an accurate prediction of the time and place of such an event cannot be made. From past experience, however, one can assume that earthquakes will continue to harass mankind and that they will occur most frequently in the areas where they have been relatively common in the past. In the United States, earthquakes can be expected to occur most frequently in the western states, particularly in Alaska, California, Washington, Oregon, Nevada, Utah, and Montana. The danger, however, is not confined to any one part of the country; major earthquakes have occurred at widely scattered locations.

  2. Optimizing Stellarators for Turbulent Transport

    International Nuclear Information System (INIS)

    Mynick, H.E.; Pomphrey, N.; Xanthopoulos, P.

    2010-01-01

    Up to now, the term 'transport-optimized' stellarators has meant optimized to minimize neoclassical transport, while the task of also mitigating turbulent transport, usually the dominant transport channel in such designs, has not been addressed, due to the complexity of plasma turbulence in stellarators. Here, we demonstrate that stellarators can also be designed to mitigate their turbulent transport, by making use of two powerful numerical tools not available until recently, namely gyrokinetic codes valid for 3D nonlinear simulations, and stellarator optimization codes. A first proof-of-principle configuration is obtained, reducing the level of ion temperature gradient turbulent transport from the NCSX baseline design by a factor of about 2.5.

  3. Stochastic Subspace Modelling of Turbulence

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Pedersen, B. J.; Nielsen, Søren R.K.

    2009-01-01

    Turbulence of the incoming wind field is of paramount importance to the dynamic response of civil engineering structures. Hence reliable stochastic models of the turbulence should be available from which time series can be generated for dynamic response and structural safety analysis. In the paper, from a positive definite cross-spectral density matrix a frequency response matrix is constructed which determines the turbulence vector as a linear filtration of Gaussian white noise. Finally, an accurate state space modelling method is proposed which allows selection of an appropriate model order and estimation of a state space model for the vector turbulence process, incorporating its phase spectrum, in one stage; its results are compared with a conventional ARMA modelling method.
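
The linear-filtration idea mentioned in this record (shaping Gaussian white noise by a target spectrum) can be sketched in the frequency domain. This is a simplified scalar spectral method, not the paper's state-space approach, and the PSD shape is invented:

```python
# Sketch: generate a Gaussian turbulence time series whose spectrum matches
# a target PSD by filtering white noise in the frequency domain.
# The low-pass PSD shape below is hypothetical (loosely Kaimal-like).
import numpy as np

rng = np.random.default_rng(0)
n, dt = 4096, 0.1                       # samples, time step [s]
freqs = np.fft.rfftfreq(n, dt)

psd = 1.0 / (1.0 + (freqs / 0.05) ** (5.0 / 3.0))  # target one-sided PSD
psd[0] = 0.0                            # no mean component

# Complex white noise shaped by sqrt(PSD): a linear filtration of white noise.
noise = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)
spectrum = noise * np.sqrt(psd * n / (2 * dt))
series = np.fft.irfft(spectrum, n)

print(series.shape)  # (4096,) Gaussian series with the prescribed spectrum
```

The state-space (subspace) method the paper proposes fits a low-order recursive model to the same target spectra, which is better suited to long simulations than block FFT synthesis.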

  4. Scalar transport across the turbulent/non-turbulent interface in jets: Schmidt number effects

    Science.gov (United States)

    Silva, Tiago S.; B. da Silva, Carlos; Idmec Team

    2016-11-01

    The dynamics of a passive scalar field near a turbulent/non-turbulent interface (TNTI) is analysed through direct numerical simulations (DNS) of turbulent planar jets, with Reynolds numbers in the range 142 ≤ Re_λ ≤ 246 and Schmidt numbers in the range 0.07 ≤ Sc ≤ 7. The steepness of the scalar gradient, as observed from conditional profiles near the TNTI, increases with the Schmidt number. Conditional scalar gradient budgets show that for low and moderate Schmidt numbers a diffusive superlayer emerges at the TNTI, where scalar gradient diffusion dominates while production is negligible. For low Schmidt numbers the growth of the turbulent front is governed by molecular diffusion, whereas the scalar gradient convection is negligible. The authors acknowledge the Laboratory for Advanced Computing at the University of Coimbra for providing the HPC, computing, and consulting resources that have contributed to the research results reported within this paper. URL http://www.lca.uc.pt.

  5. Financial integration and financial development in transition economies: What happens during financial crises?

    Directory of Open Access Journals (Sweden)

    Igor Masten

    2011-12-01

    Full Text Available

    This paper provides an empirical analysis of the role of financial development and financial integration in the growth dynamics of transition countries. We focus on the role of financial integration in determining the impact of financial development on growth, distinguishing “normal times” from periods of financial crises. In addition to confirming the significant positive effect on growth exerted by financial development and financial integration, our estimates show that a higher degree of financial openness tends to reduce the contractionary effect of financial crises, by cushioning the effect on the domestic supply of credit. Consequently, the high reliance on international capital flows by transition countries does not necessarily increase their financial fragility. This implies that financial protectionism is a self-defeating policy, at least for transition countries.

  6. Turbulence and turbulence-generated structural loading in wind turbine clusters

    DEFF Research Database (Denmark)

    Frandsen, Sten Tronæs

    2007-01-01

    Turbulence - in terms of standard deviation of wind speed fluctuations - and other flow characteristics are different in the interior of wind farms relative to the free flow, and action must be taken to ensure sufficient structural sustainability of the wind turbines exposed to “wind farm flow”. The standard deviation of wind speed fluctuations is a known key parameter for both extreme and fatigue loading, and it is argued and found to be justified that a model for change in turbulence intensity alone may account for increased fatigue loading in wind farms. Changes in scale of turbulence and horizontal flow-shear also influence the dynamic response and thus fatigue loading. The status of the model is that it became part of the Danish standard for wind turbine design DS 472 (2001) in August 2001 and it is part of the corresponding international standard, IEC61400-1 (2005). Also, extreme loading under normal operation for wake conditions and the efficiency of very large wind farms are discussed.

  7. Low cost earthquake resistant ferrocement small house

    International Nuclear Information System (INIS)

    Saleem, M.A.; Ashraf, M.; Ashraf, M.

    2008-01-01

    The greatest humanitarian challenge faced even today, one year after the Kashmir-Hazara earthquake, is that of providing shelter. Currently, one in seven people on the globe lives in a slum or refugee camp. The earthquake of October 2005 resulted in a great loss of life and property. This research work is mainly focused on developing a design for a small, low-cost and earthquake-resistant house. Ferrocement panels are recommended as the main structural elements, with a lightweight truss roofing system. Earthquake resistance is ensured by analyzing the structure in ETABS for seismic activity of zone 4. The behavior of the structure is found satisfactory under earthquake loading. An estimate of cost is also presented, which shows that it is an economical solution. (author)

  8. Geological and historical evidence of irregular recurrent earthquakes in Japan.

    Science.gov (United States)

    Satake, Kenji

    2015-10-28

    Great (M∼8) earthquakes repeatedly occur along the subduction zones around Japan and cause fault slip of a few to several metres releasing strains accumulated from decades to centuries of plate motions. Assuming a simple 'characteristic earthquake' model that similar earthquakes repeat at regular intervals, probabilities of future earthquake occurrence have been calculated by a government committee. However, recent studies on past earthquakes including geological traces from giant (M∼9) earthquakes indicate a variety of size and recurrence interval of interplate earthquakes. Along the Kuril Trench off Hokkaido, limited historical records indicate that average recurrence interval of great earthquakes is approximately 100 years, but the tsunami deposits show that giant earthquakes occurred at a much longer interval of approximately 400 years. Along the Japan Trench off northern Honshu, recurrence of giant earthquakes similar to the 2011 Tohoku earthquake with an interval of approximately 600 years is inferred from historical records and tsunami deposits. Along the Sagami Trough near Tokyo, two types of Kanto earthquakes with recurrence interval of a few hundred years and a few thousand years had been recognized, but studies show that the recent three Kanto earthquakes had different source extents. Along the Nankai Trough off western Japan, recurrence of great earthquakes with an interval of approximately 100 years has been identified from historical literature, but tsunami deposits indicate that the sizes of the recurrent earthquakes are variable. Such variability makes it difficult to apply a simple 'characteristic earthquake' model for the long-term forecast, and several attempts such as use of geological data for the evaluation of future earthquake probabilities or the estimation of maximum earthquake size in each subduction zone are being conducted by government committees. © 2015 The Author(s).
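
The 'characteristic earthquake' probability calculation questioned above can be sketched with a lognormal renewal model: given a mean recurrence interval and the time elapsed since the last event, compute the conditional probability of rupture within a forecast horizon. All parameter values are hypothetical, and the variability the abstract documents is exactly what such a single-interval model fails to capture:

```python
# Sketch: conditional rupture probability under a lognormal renewal model.
# Parameter values (mean interval, elapsed time, COV) are hypothetical.
from math import erf, exp, log, sqrt

def lognorm_cdf(t, mu, sigma):
    return 0.5 * (1.0 + erf((log(t) - mu) / (sigma * sqrt(2.0))))

def cond_prob(elapsed, horizon, mean_interval, cov=0.5):
    # lognormal parameters from the mean and coefficient of variation
    sigma = sqrt(log(1.0 + cov ** 2))
    mu = log(mean_interval) - 0.5 * sigma ** 2
    survived = 1.0 - lognorm_cdf(elapsed, mu, sigma)
    window = lognorm_cdf(elapsed + horizon, mu, sigma) - lognorm_cdf(elapsed, mu, sigma)
    return window / survived

# e.g. 100-yr mean recurrence, 90 yr elapsed: probability in the next 30 yr
p = cond_prob(elapsed=90.0, horizon=30.0, mean_interval=100.0)
print(f"{p:.2f}")
```

When the geological record shows mixed populations of great and giant events, as along the Kuril and Japan Trenches, a single mean interval and COV are poorly defined, which is the difficulty the abstract raises.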

  9. Turbulence and turbulence-generated structural loading in wind turbine clusters

    Energy Technology Data Exchange (ETDEWEB)

    Frandsen, Sten

    2007-01-15

    Turbulence, in terms of standard deviation of wind speed fluctuations, and other flow characteristics are different in the interior of wind farms relative to the free flow and action must be taken to ensure sufficient structural sustainability of the wind turbines exposed to 'wind farm flow'. The standard deviation of wind speed fluctuations is a known key parameter for both extreme- and fatigue loading, and it is argued and found to be justified that a model for change in turbulence intensity alone may account for increased fatigue loading in wind farms. Changes in scale of turbulence and horizontal flow-shear also influence the dynamic response and thus fatigue loading. However, these parameters are typically negatively or positively correlated with the standard deviation of wind speed fluctuations, which therefore can, if need be, represent these other variables. Thus, models for spatially averaged turbulence intensity inside the wind farm and direct-wake turbulence intensity are being devised and a method to combine the different load situations is proposed. The combination of the load cases implies a weighting method involving the slope of the considered material's Woehler curve. In the context, this is novel and necessary to avoid excessive safety for fatigue estimation of the structure's steel components, and non-conservatism for fibreglass components. The proposed model offers significant reductions in computational efforts in the design process. The status for the implementation of the model is that it became part of the Danish standard for wind turbine design DS 472 (2001) in August 2001 and it is part of the corresponding international standard, IEC61400-1 (2005). Also, extreme loading under normal operation for wake conditions and the efficiency of very large wind farms are discussed. (au)
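
The Woehler-curve weighting described above can be sketched as an 'effective' turbulence intensity: a power mean of the per-case intensities with the material's S-N slope m as exponent, in the spirit of Frandsen's model in IEC 61400-1. The intensities and occurrence probabilities below are invented:

```python
# Sketch of effective turbulence intensity with Woehler-exponent weighting.
# Intensities and probabilities are hypothetical.
def effective_turbulence(intensities, probabilities, m):
    """m is the slope of the considered material's Woehler (S-N) curve."""
    assert abs(sum(probabilities) - 1.0) < 1e-9
    return sum(p * i ** m for i, p in zip(intensities, probabilities)) ** (1.0 / m)

cases = [0.10, 0.18]          # free-flow vs direct-wake turbulence intensity
probs = [0.9, 0.1]            # fraction of time in each case

# Steel (m ~ 4) vs fibreglass (m ~ 10): a larger m weights the wake case
# more heavily, which is why one weighting cannot serve both materials.
i_steel = effective_turbulence(cases, probs, m=4)
i_glass = effective_turbulence(cases, probs, m=10)
print(i_steel < i_glass)  # → True
```

This is the sense in which the combination avoids excessive safety for steel components and non-conservatism for fibreglass ones.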

  10. Laser beam propagation in atmospheric turbulence

    Science.gov (United States)

    Murty, S. S. R.

    1979-01-01

    The optical effects of atmospheric turbulence on the propagation of low power laser beams are reviewed in this paper. The optical effects are produced by the temperature fluctuations which result in fluctuations of the refractive index of air. The commonly-used models of index-of-refraction fluctuations are presented. Laser beams experience fluctuations of beam size, beam position, and intensity distribution within the beam due to refractive turbulence. Some of the observed effects are qualitatively explained by treating the turbulent atmosphere as a collection of moving gaseous lenses of various sizes. Analytical results and experimental verifications of the variance, covariance and probability distribution of intensity fluctuations in weak turbulence are presented. For stronger turbulence, a saturation of the optical scintillations is observed. The saturation of scintillations involves a progressive break-up of the beam into multiple patches; the beam loses some of its lateral coherence. Heterodyne systems operating in a turbulent atmosphere experience a loss of heterodyne signal due to the destruction of coherence.
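
    The weak-fluctuation regime and the saturation threshold mentioned above can be made concrete with the standard plane-wave Rytov variance; the turbulence strength and path length below are illustrative values, not taken from the paper:

```python
import math

def rytov_variance(cn2, wavelength_m, path_m):
    """Plane-wave Rytov variance: sigma_R^2 = 1.23 Cn^2 k^(7/6) L^(11/6).

    Valid in the weak-fluctuation regime; values approaching ~1 mark the
    onset of the scintillation saturation discussed above.
    """
    k = 2.0 * math.pi / wavelength_m  # optical wavenumber
    return 1.23 * cn2 * k ** (7.0 / 6.0) * path_m ** (11.0 / 6.0)

# Moderate daytime turbulence (Cn^2 ~ 1e-14 m^(-2/3)), HeNe laser, 1 km path:
sigma2 = rytov_variance(1e-14, 633e-9, 1000.0)
```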

  11. Designing an Earthquake-Resistant Building

    Science.gov (United States)

    English, Lyn D.; King, Donna T.

    2016-01-01

    How do cross-bracing, geometry, and base isolation help buildings withstand earthquakes? These important structural design features involve fundamental geometry that elementary school students can readily model and understand. The problem activity, Designing an Earthquake-Resistant Building, was undertaken by several classes of sixth- grade…

  12. The HayWired Earthquake Scenario

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    ForewordThe 1906 Great San Francisco earthquake (magnitude 7.8) and the 1989 Loma Prieta earthquake (magnitude 6.9) each motivated residents of the San Francisco Bay region to build countermeasures to earthquakes into the fabric of the region. Since Loma Prieta, bay-region communities, governments, and utilities have invested tens of billions of dollars in seismic upgrades and retrofits and replacements of older buildings and infrastructure. Innovation and state-of-the-art engineering, informed by science, including novel seismic-hazard assessments, have been applied to the challenge of increasing seismic resilience throughout the bay region. However, as long as people live and work in seismically vulnerable buildings or rely on seismically vulnerable transportation and utilities, more work remains to be done.With that in mind, the U.S. Geological Survey (USGS) and its partners developed the HayWired scenario as a tool to enable further actions that can change the outcome when the next major earthquake strikes. By illuminating the likely impacts to the present-day built environment, well-constructed scenarios can and have spurred officials and citizens to take steps that change the outcomes the scenario describes, whether used to guide more realistic response and recovery exercises or to launch mitigation measures that will reduce future risk.The HayWired scenario is the latest in a series of like-minded efforts to bring a special focus onto the impacts that could occur when the Hayward Fault again ruptures through the east side of the San Francisco Bay region as it last did in 1868. Cities in the east bay along the Richmond, Oakland, and Fremont corridor would be hit hardest by earthquake ground shaking, surface fault rupture, aftershocks, and fault afterslip, but the impacts would reach throughout the bay region and far beyond. The HayWired scenario name reflects our increased reliance on the Internet and telecommunications and also alludes to the

  13. Dynamic paradigm of turbulence

    International Nuclear Information System (INIS)

    Mukhamedov, Alfred M.

    2006-01-01

    In this paper a dynamic paradigm of turbulence is proposed. The basic idea consists in a novel definition of chaotic structure, given with the help of a Pfaff system of PDEs associated with the turbulent dynamics. A methodological analysis of the new and the former paradigms is presented.

  14. Design basis earthquakes for critical industrial facilities and their characteristics, and the Southern Hyogo prefecture earthquake, 17 January 1995

    Energy Technology Data Exchange (ETDEWEB)

    Shibata, Heki

    1998-12-01

    This paper deals with how to establish the concept of the design basis earthquake (DBE) for critical industrial facilities such as nuclear power plants in consideration of disasters such as the Southern Hyogo prefecture earthquake, the so-called Kobe earthquake, in 1995. The author once discussed various DBEs at the 7th World Conference on Earthquake Engineering. At that time, the author assumed that the strongest effective PGA would be 0.7 G, and compared the values of accelerations of a structure obtained by various codes in Japan and other countries. The maximum PGA observed by an instrument during the Southern Hyogo prefecture earthquake in 1995 exceeded the author's previous assumption, even though the results of the previous paper had been pessimistic. Based on the experience of the Kobe event, the author points out the necessity of a third design basis earthquake, S_s, in addition to the S_1 and S_2 of previous DBEs.

  15. The threat of silent earthquakes

    Science.gov (United States)

    Cervelli, Peter

    2004-01-01

    Not all earthquakes shake the ground. The so-called silent types are forcing scientists to rethink their understanding of the way quake-prone faults behave. In rare instances, silent earthquakes that occur along the flanks of seaside volcanoes may cascade into monstrous landslides that crash into the sea and trigger towering tsunamis. Silent earthquakes that take place within fault zones created by one tectonic plate diving under another may increase the chance of ground-shaking shocks. In other locations, however, silent slip may decrease the likelihood of destructive quakes, because it releases stress along faults that might otherwise seem ready to snap.

  16. Napa earthquake: An earthquake in a highly connected world

    Science.gov (United States)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.

    2014-12-01

    The Napa earthquake recently occurred close to Silicon Valley. This makes it a good candidate for studying what social networks, wearable objects, and website traffic analysis (flashsourcing) can tell us about the way eyewitnesses react to ground shaking. In the first part, we compare the ratio of people publishing tweets and the ratio of people visiting the EMSC (European-Mediterranean Seismological Centre) real-time information website in the first minutes following the earthquake occurrence with the results published by Jawbone, which show that the proportion of people waking up depends (naturally) on the epicentral distance. The key question to evaluate is whether the proportions of inhabitants tweeting or visiting the EMSC website are similar to the proportion of people waking up as shown by the Jawbone data. If so, this supports the premise that all methods provide a reliable image of the relative ratio of people waking up. The second part of the study focuses on the reaction time for both Twitter and EMSC website access. We show, similarly to what was demonstrated for the Mineral, Virginia, earthquake (Bossu et al., 2014), that hit times on the EMSC website follow the propagation of the P waves and that 2 minutes of website traffic is sufficient to determine the epicentral location of an earthquake on the other side of the Atlantic. We also compare these with the publication times of messages on Twitter. Finally, we check whether the numbers of tweets and of visitors relative to the number of inhabitants are correlated with the local level of shaking. Together these results will tell us whether the reaction of eyewitnesses to ground shaking as observed through Twitter and the EMSC website analysis is tool-specific (i.e., specific to Twitter or the EMSC website) or whether they reflect people's actual reactions.

  17. Electromagnetic Manifestation of Earthquakes

    OpenAIRE

    Uvarov Vladimir

    2017-01-01

    In a joint analysis of recordings of the electrical component of the Earth's natural electromagnetic field and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified, whose activity is closely correlated with the energy of the electromagnetic field. To explain this, a hypothesis about the cooperative character of these impulses is proposed.

  18. Improved model of quasi-particle turbulence (with applications to Alfven and drift wave turbulence)

    International Nuclear Information System (INIS)

    Mendonca, J. T.; Hizanidis, K.

    2011-01-01

    We consider the classical problem of wave stability and dispersion in a turbulent plasma background. We adopt a kinetic description for the quasi-particle turbulence. We describe an improved theoretical approach, which goes beyond the geometric optics approximation and retains the recoil effects associated with the emission and absorption of low frequency waves by nearly resonant quasi-particles. We illustrate the present approach by considering two particular examples. One is the excitation of zonal flows by drift wave turbulence or driftons. The other is the coupling between ion acoustic waves and Alfven wave turbulence, eventually leading to saturation of Alfven wave growth. Both examples are relevant to anomalous transport in magnetic fusion devices. Connection with previous results is established. We show that these results are recovered in the geometric optics approximation.

  19. Electromotive force in strongly compressible magnetohydrodynamic turbulence

    Science.gov (United States)

    Yokoi, N.

    2017-12-01

    Variable density fluid turbulence is ubiquitous in geo-fluids, not to mention in astrophysics. Depending on the source of density variation, variable density fluid turbulence may be divided into two categories: the weak compressible (entropy mode) turbulence for slow flow and the strong compressible (acoustic mode) turbulence for fast flow. In the strong compressible turbulence, the pressure fluctuation induces a strong density fluctuation ρ', which is represented by the density variance ⟨ρ'²⟩ (⟨·⟩ denotes the ensemble average). The turbulent effect on the large-scale magnetic-field B induction is represented by the turbulent electromotive force (EMF) ⟨u'×b'⟩ (u': velocity fluctuation, b': magnetic-field fluctuation). In the usual treatment in the dynamo theory, the expression for the EMF has been obtained in the framework of incompressible or weak compressible turbulence, where only the variation of the mean density ⟨ρ⟩, if any, is taken into account. We see from the equation of the density fluctuation ρ' that the density variance is generated by the large mean density variation ∂⟨ρ⟩ coupled with the turbulent mass flux ⟨ρ'u'⟩. This means that in the region where the mean density steeply changes, the density variance effect becomes relevant for the magnetic field evolution. This situation is typically the case for phenomena associated with shocks and compositional discontinuities. With the aid of the analytical theory of inhomogeneous compressible magnetohydrodynamic (MHD) turbulence, the expression for the turbulent electromotive force is investigated. It is shown that, among others, an obliqueness (misalignment) between the mean density gradient ∂⟨ρ⟩ and the mean magnetic field B may contribute to the EMF as ⟨u'×b'⟩ ≈ χ B×∂⟨ρ⟩, with the turbulent transport coefficient χ proportional to the density variance (χ ∝ ⟨ρ'²⟩). This density variance effect is expected to strongly affect the EMF near the interface, and changes the transport properties of turbulence. In the case of an interface under the MHD slow

  20. PDF Modeling of Turbulent Combustion

    National Research Council Canada - National Science Library

    Pope, Stephen B

    2006-01-01

    .... The PDF approach to turbulent combustion has the advantages of fully representing the turbulent fluctuations of species and temperature, and of allowing realistic combustion chemistry to be implemented...

  1. An introduction to turbulence and its measurement

    CERN Document Server

    Bradshaw, P

    1971-01-01

    An Introduction to Turbulence and Its Measurement is an introductory text on turbulence and its measurement. It combines the physics of turbulence with measurement techniques and covers topics ranging from measurable quantities and their physical significance to the analysis of fluctuating signals, temperature and concentration measurements, and the hot-wire anemometer. Examples of turbulent flows are presented. This book is comprised of eight chapters and begins with an overview of the physics of turbulence, paying particular attention to Newton's second law of motion, the Newtonian viscous f

  2. Tidal controls on earthquake size-frequency statistics

    Science.gov (United States)

    Ide, S.; Yabe, S.; Tanaka, Y.

    2016-12-01

    The possibility that tidal stresses can trigger earthquakes is a long-standing issue in seismology. Except in some special cases, a causal relationship between seismicity and the phase of tidal stress has been rejected on the basis of studies using many small events. However, recently discovered deep tectonic tremors are highly sensitive to tidal stress levels, with the relationship being governed by a nonlinear law according to which the tremor rate increases exponentially with increasing stress; thus, slow deformation (and the probability of earthquakes) may be enhanced during periods of large tidal stress. Here, we show the influence of tidal stress on seismicity by calculating histories of tidal shear stress during the 2-week period before earthquakes. Very large earthquakes tend to occur near the time of maximum tidal stress, but this tendency is not obvious for small earthquakes. Rather, we found that tidal stress controls the earthquake size-frequency statistics; i.e., the fraction of large events increases (i.e. the b-value of the Gutenberg-Richter relation decreases) as the tidal shear stress increases. This correlation is apparent in data from the global catalog and in relatively homogeneous regional catalogues of earthquakes in Japan. The relationship is also reasonable, considering the well-known relationship between stress and the b-value. Our findings indicate that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. This finding has clear implications for probabilistic earthquake forecasting.
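
    The b-value dependence described above can be sketched with the classical Aki/Utsu maximum-likelihood estimator; the two toy catalogs below are fabricated purely to illustrate that a heavier tail of large events lowers b:

```python
import math

def b_value(magnitudes, mc, dm=0.1):
    """Aki/Utsu maximum-likelihood b-value of the Gutenberg-Richter relation.

    magnitudes: catalog magnitudes (events below the completeness
    magnitude mc are discarded); dm: magnitude binning width.
    """
    m = [x for x in magnitudes if x >= mc]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))

# A catalog with relatively more large events yields a lower b-value,
# the signature associated above with high tidal shear stress:
low_stress = [2.0, 2.1, 2.3, 2.2, 2.5, 2.0, 2.4, 2.1]
high_stress = [2.0, 2.8, 3.5, 2.4, 4.0, 2.2, 3.1, 2.6]
b_low, b_high = b_value(low_stress, 2.0), b_value(high_stress, 2.0)
```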

  3. Earthquake-triggered landslides in southwest China

    OpenAIRE

    X. L. Chen; Q. Zhou; H. Ran; R. Dong

    2012-01-01

    Southwest China is located on the southeastern margin of the Tibetan Plateau and is a region of high seismic activity. Historically, strong earthquakes occurring here have usually triggered numerous landslides and caused destructive damage. This paper introduces several earthquake-triggered landslide events in this region and describes their characteristics. Also, the historical data of earthquakes with a magnitude of 7.0 or greater, having occurred in this region, is col...

  4. The results of the pilot project in Georgia to install a network of electromagnetic radiation before the earthquake

    Science.gov (United States)

    Machavariani, Kakhaber; Khazaradze, Giorgi; Turazashvili, Ioseb; Kachakhidze, Nino; Kachakhidze, Manana; Gogoberidze, Vitali

    2016-04-01

    The world's scientific literature has recently published many important and interesting works on the VLF/LF electromagnetic emissions observed during earthquake preparation. These works are promising from the standpoint of reliable earthquake prediction. Because Georgia is located in the Trans-Asian earthquake zone, a VLF/LF electromagnetic emission network is essential there, and in this regard it has been possible to take the first steps. Our university holds Shota Rustaveli National Science Foundation grant № DI/21/9-140/13, which included the installation of a receiver in Georgia, but the purchase of the device fell through due to lack of funds. However, our European colleagues (Prof. Dr. P. F. Biagi and Prof. Dr. Aydın BÜYÜKSARAÇ) helped us and made the installation of a receiver possible. An expedition of Turkish scientists to Georgia was organized in August 2015. They brought a VLF/LF electromagnetic emission receiver with them and, together with Georgian scientists, installed it near Tbilisi. The station was named GEO-TUR. It should be noted that Georgia has thereby joined the European network. It is now possible to fully monitor earthquakes in Georgia in terms of electromagnetic radiation. This enables scientists to obtain relevant information not only for the territory of our country but also for seismically active European countries. In order to maintain and develop this new direction in our country, it is necessary to support an independent group of scientists studying pre-earthquake electromagnetic radiation in Georgia. At this stage, to remedy the remaining shortcomings, it is necessary and appropriate for specialists to engage with Georgia in joint international research. The work was carried out in the frame of grant DI/21/9-140/13 („Pilot project of before earthquake detected Very Low Frequency/Low Frequency electromagnetic emission network installation in Georgia") with the financial support of the Shota Rustaveli National Science Foundation.

  5. Analysis of chaos in plasma turbulence

    DEFF Research Database (Denmark)

    Pedersen, T.S.; Michelsen, Poul; Juul Rasmussen, J.

    1996-01-01

    -stationary turbulent state is reached in a finite time, independent of the initial conditions. Different regimes of the turbulent state can be obtained by varying the coupling parameter C, related to the parallel electron dynamics. The turbulence is described by using particle tracking and tools from chaos analysis...

  6. The PDF method for turbulent combustion

    Science.gov (United States)

    Pope, S. B.

    1991-01-01

    Probability Density Function (PDF) methods provide a means of calculating the properties of turbulent reacting flows. They have been successfully applied to many turbulent flames, including some with finite rate kinetic effects. Here the methods are reviewed with an emphasis on computational issues and their application to turbulent combustion.

  7. Statistics of the turbulent/non-turbulent interface in a spatially evolving mixing layer

    KAUST Repository

    Cristancho, Juan

    2012-12-01

    The thin interface separating the inner turbulent region from the outer irrotational fluid is analyzed in a direct numerical simulation of a spatially developing turbulent mixing layer. A vorticity threshold is defined to detect the interface separating the turbulent from the non-turbulent regions of the flow, and to calculate statistics conditioned on the distance from this interface. Velocity and passive scalar statistics are computed and compared to the results of studies addressing other shear flows, such as turbulent jets and wakes. The conditional statistics for velocity are in remarkable agreement with the results for other types of free shear flow available in the literature. In addition, a detailed analysis of the passive scalar field (with Sc ≈ 1) in the vicinity of the interface is presented. The scalar has a jump at the interface, even stronger than that observed for velocity. The strong jump for the scalar has been observed before in the case of high Schmidt number, but it is a new result for Schmidt number of order one. Finally, the dissipation for the kinetic energy and the scalar are presented. While the kinetic energy dissipation has its maximum far from the interface, the scalar dissipation is characterized by a strong peak very close to the interface.
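
    The interface-conditioned statistics described above can be sketched in one dimension; the vorticity threshold, synthetic fields, and binning below are illustrative stand-ins for the DNS procedure, not the paper's actual data:

```python
import numpy as np

def conditional_profile(y, vorticity, scalar, threshold, bin_edges):
    """Statistics conditioned on signed distance from the turbulent/
    non-turbulent interface, for a single wall-normal profile.

    The interface is placed where |vorticity| first drops below the
    threshold moving outward; scalar values are then bin-averaged by
    distance from that position (negative = turbulent side).
    """
    idx = int(np.argmax(np.abs(vorticity) < threshold))
    dist = y - y[idx]
    means = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (dist >= lo) & (dist < hi)
        means.append(scalar[mask].mean() if mask.any() else np.nan)
    return np.array(means)

# Synthetic profile with a sharp scalar jump at the interface (y = 0.5):
y = np.linspace(0.0, 1.0, 101)
vort = np.where(y < 0.5, 1.0, 0.0)
scal = np.where(y < 0.5, 1.0, 0.0)
profile = conditional_profile(y, vort, scal, 0.5, np.array([-0.5, 0.0, 0.5]))
```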

  8. Nonexistence of two forms of turbulent bremsstrahlung

    International Nuclear Information System (INIS)

    Kuijpers, J.; Melrose, D.B.

    1985-01-01

    It is shown that the forms of turbulent bremsstrahlung proposed by Tsytovich, Stenflo, and Wilhelmsson and by Nambu do not exist. The proposed mechanisms involve upconversion of ion sound turbulence into Langmuir turbulence, with the ion sound waves being emitted and absorbed resonantly and the Langmuir waves being emitted and absorbed nonresonantly. It is pointed out that a symmetry implicit in a standard QED treatment implies that there is another contribution to turbulent bremsstrahlung in addition to that calculated by Tsytovich, Stenflo, and Wilhelmsson and that the two contributions cancel exactly, leading to the null result. (Our arguments on this point have proved controversial.) Nambu made an approximation inconsistently, and when this approximation is not made, two terms in his analytic treatment cancel exactly. We argue that turbulent bremsstrahlung is related to a radiative correction in which the resonant emission of ion sound turbulence is modified by the nonresonant emission and absorption of Langmuir waves. Physically we interpret the nonexistence of turbulent bremsstrahlung as being due to each emission of a Langmuir quantum being associated with an absorption of an identical Langmuir quantum so that the Langmuir turbulence is unchanged. Proposed astrophysical applications of turbulent bremsstrahlung need to be reconsidered

  9. Predator-prey encounters in turbulent waters

    DEFF Research Database (Denmark)

    Mann, J.; Ott, Søren; Pécseli, H.L.

    2002-01-01

    With reference to studies of predator-prey encounters in turbulent waters, we demonstrate the feasibility of an experimental method for investigations of particle fluxes to an absorbing surface in turbulent flows. A laboratory experiment is carried out, where an approximately homogeneous and isotropic...

  10. Marmara Island earthquakes, of 1265 and 1935; Turkey

    Directory of Open Access Journals (Sweden)

    Y. Altınok

    2006-01-01

    Full Text Available The long-term seismicity of the Marmara Sea region in northwestern Turkey is relatively well recorded. Some large and some of the smaller events are clearly associated with fault zones known to be seismically active, which have distinct morphological expressions and have generated damaging earthquakes both before and since. Some less common, moderate-size earthquakes have occurred in the vicinity of the Marmara Islands in the western Marmara Sea. This paper presents an extended summary of the most important such earthquakes, which occurred in 1265 and 1935 and have since been known as the Marmara Island earthquakes. The data and approaches used therefore have the potential to document earthquake ruptures of fault segments and may extend the earthquake record far back beyond known history, including the rock falls and abnormal sea waves observed during these events, thus improving hazard evaluations and the fundamental understanding of the earthquake process.

  11. Dynamic structure in self-sustained turbulence

    International Nuclear Information System (INIS)

    Itoh, K.; Itoh, S.; Yagi, M.; Fukuyama, A.

    1995-06-01

    The dynamical equation for self-sustained, pressure-driven turbulence in toroidal plasmas is derived. The growth rate of the dressed test mode, which belongs to the subcritical turbulence, is obtained as a function of the turbulent transport coefficient. In the limit of low fluctuation level, the mode has the features of a nonlinear instability and shows explosive growth. The growth rate vanishes when the driven transport reaches the stationary turbulence level. The stationary solution is thermodynamically stable. The characteristic time by which the stationary, self-sustained turbulence is established scales with the ion-sound transit time and is accelerated by the bad magnetic curvature. The influences of the pressure gradient as well as of radial electric field inhomogeneity are quantified. (author)

  12. Pebble Accretion in Turbulent Protoplanetary Disks

    Science.gov (United States)

    Xu, Ziyan; Bai, Xue-Ning; Murray-Clay, Ruth A.

    2017-09-01

    It has been realized in recent years that the accretion of pebble-sized dust particles onto planetary cores is an important mode of core growth, which enables the formation of giant planets at large distances and assists planet formation in general. The pebble accretion theory is built upon the orbit theory of dust particles in a laminar protoplanetary disk (PPD). For sufficiently large core mass (in the “Hill regime”), essentially all particles of appropriate sizes entering the Hill sphere can be captured. However, the outer regions of PPDs are expected to be weakly turbulent due to the magnetorotational instability (MRI), where turbulent stirring of particle orbits may affect the efficiency of pebble accretion. We conduct shearing-box simulations of pebble accretion with different levels of MRI turbulence (strongly turbulent assuming ideal magnetohydrodynamics, weakly turbulent in the presence of ambipolar diffusion, and laminar) and different core masses to test the efficiency of pebble accretion at a microphysical level. We find that accretion remains efficient for marginally coupled particles (dimensionless stopping time τ_s ~ 0.1–1) even in the presence of strong MRI turbulence. Though more dust particles are brought toward the core by the turbulence, this effect is largely canceled by a reduction in accretion probability. As a result, the overall effect of turbulence on the accretion rate is mainly reflected in the changes in the thickness of the dust layer. On the other hand, we find that the efficiency of pebble accretion for strongly coupled particles (down to τ_s ~ 0.01) can be modestly reduced by strong turbulence for low-mass cores.
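
    The dimensionless stopping time used above can be estimated for midplane pebbles with the standard Epstein-drag expression; the solid density and gas surface density defaults are assumed, illustrative outer-disk values, not the paper's parameters:

```python
import math

def stopping_time_epstein(radius_cm, rho_solid=1.0, sigma_gas=10.0):
    """Midplane dimensionless stopping time in the Epstein drag regime:
    tau_s ~ pi * rho_s * a / (2 * Sigma_g).

    rho_solid [g/cm^3] and gas surface density sigma_gas [g/cm^2] are
    illustrative values for outer-disk conditions.
    """
    return math.pi * rho_solid * radius_cm / (2.0 * sigma_gas)

# mm-scale grains stay strongly coupled; cm-scale pebbles reach tau_s ~ 0.1-1:
tau_grain = stopping_time_epstein(0.1)   # 1 mm radius
tau_pebble = stopping_time_epstein(2.0)  # 2 cm radius
```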

  13. Global Earthquake Hazard Frequency and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  14. Fixed recurrence and slip models better predict earthquake behavior than the time- and slip-predictable models 1: repeating earthquakes

    Science.gov (United States)

    Rubinstein, Justin L.; Ellsworth, William L.; Chen, Kate Huihsuan; Uchida, Naoki

    2012-01-01

    The behavior of individual events in repeating earthquake sequences in California, Taiwan and Japan is better predicted by a model with fixed inter-event time or fixed slip than it is by the time- and slip-predictable models for earthquake occurrence. Given that repeating earthquakes are highly regular in both inter-event time and seismic moment, the time- and slip-predictable models seem ideally suited to explain their behavior. Taken together with evidence from the companion manuscript that shows similar results for laboratory experiments, we conclude that the short-term predictions of the time- and slip-predictable models should be rejected in favor of earthquake models that assume either fixed slip or fixed recurrence interval. This implies that the elastic rebound model underlying the time- and slip-predictable models offers no additional value in describing earthquake behavior in an event-to-event sense, but its value in a long-term sense cannot be determined. These models likely fail because they rely on assumptions that oversimplify the earthquake cycle. We note that the times and slips of these events are predicted quite well by fixed slip and fixed recurrence models, so in some sense they are time- and slip-predictable. While fixed recurrence and slip models better predict repeating earthquake behavior than the time- and slip-predictable models, we observe a correlation between slip and the preceding recurrence time for many repeating earthquake sequences in Parkfield, California. This correlation is not found in other regions, and the sequences with the correlative slip-predictable behavior are not distinguishable from nearby earthquake sequences that do not exhibit this behavior.
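
    The contrast between the fixed-recurrence and time-predictable models compared above can be sketched as follows; the catalog times, slips, and slip rate are fabricated for illustration only:

```python
def predict_next_time(times, slips, slip_rate, model="fixed"):
    """Next-event time under two simple recurrence models.

    "fixed":            next interval = mean of past inter-event times.
    "time_predictable": next interval = preceding slip / long-term slip rate,
                        i.e. the elastic-rebound idea tested above.
    """
    intervals = [t2 - t1 for t1, t2 in zip(times[:-1], times[1:])]
    if model == "fixed":
        return times[-1] + sum(intervals) / len(intervals)
    if model == "time_predictable":
        return times[-1] + slips[-1] / slip_rate
    raise ValueError("unknown model: " + model)

# A regular sequence whose last event had oversized slip (toy numbers):
times, slips = [0.0, 10.0, 20.0, 30.0], [1.0, 1.0, 1.0, 1.5]
t_fixed = predict_next_time(times, slips, slip_rate=0.1, model="fixed")
t_tp = predict_next_time(times, slips, slip_rate=0.1, model="time_predictable")
```

    The two models diverge exactly when slip is irregular, which is why repeating sequences with near-constant slip cannot discriminate them event-to-event.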

  15. Distribution of incremental static stress caused by earthquakes

    Directory of Open Access Journals (Sweden)

    Y. Y. Kagan

    1994-01-01

    Full Text Available Theoretical calculations, simulations and measurements of rotation of earthquake focal mechanisms suggest that the stress in earthquake focal zones follows the Cauchy distribution, which is one of the stable probability distributions (with the value of the exponent α equal to 1). We review the properties of the stable distributions and show that the Cauchy distribution is expected to approximate the stress caused by earthquakes occurring over geologically long intervals of a fault zone development. However, the stress caused by recent earthquakes recorded in instrumental catalogues should follow symmetric stable distributions with the value of α significantly less than one. This is explained by a fractal distribution of earthquake hypocentres: the dimension of a hypocentre set, δ, is close to zero for short-term earthquake catalogues and asymptotically approaches 2¼ for long time intervals. We use the Harvard catalogue of seismic moment tensor solutions to investigate the distribution of incremental static stress caused by earthquakes. The stress measured in the focal zone of each event is approximated by stable distributions. In agreement with theoretical considerations, the exponent value of the distribution approaches zero as the time span of the earthquake catalogue (ΔT) decreases. For large stress values α increases. We surmise that this is caused by the increase of δ for small inter-earthquake distances due to location errors.
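
    The stability property with α = 1 invoked above can be checked numerically: averaging Cauchy variables does not narrow their distribution. The sample sizes and seed below are arbitrary choices for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

# A Cauchy variable is stable with exponent alpha = 1: the average of n
# independent Cauchy samples is again Cauchy with the same scale, so
# averaging does not narrow the spread (no law of large numbers).
x = rng.standard_cauchy(size=(1000, 100))
iqr_single = np.subtract(*np.percentile(x[:, 0], [75, 25]))
iqr_of_means = np.subtract(*np.percentile(x.mean(axis=1), [75, 25]))
# Both interquartile ranges sit near the theoretical value of 2,
# the IQR of the standard Cauchy distribution.
```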

  16. Summary of earthquake experience database

    International Nuclear Information System (INIS)

    1999-01-01

    Strong-motion earthquakes frequently occur throughout the Pacific Basin, where power plants or industrial facilities are included in the affected areas. By studying the performance of these earthquake-affected (or database) facilities, a large inventory of various types of equipment installations can be compiled that have experienced substantial seismic motion. The primary purposes of the seismic experience database are summarized as follows: to determine the most common sources of seismic damage, or adverse effects, on equipment installations typical of industrial facilities; to determine the thresholds of seismic motion corresponding to various types of seismic damage; to determine the general performance of equipment during earthquakes, regardless of the levels of seismic motion; to determine minimum standards in equipment construction and installation, based on past experience, to assure the ability to withstand anticipated seismic loads. To summarize, the primary assumption in compiling an experience database is that the actual seismic hazard to industrial installations is best demonstrated by the performance of similar installations in past earthquakes

  17. A new way of telling earthquake stories: MOBEE - the MOBile Earthquake Exhibition

    Science.gov (United States)

    Tataru, Dragos; Toma-Danila, Dragos; Nastase, Eduard

    2016-04-01

    In the last decades, the demand for and acknowledged importance of science outreach, in general and geophysics in particular, have grown, as demonstrated by many international and national projects and other activities performed by research institutes. The National Institute for Earth Physics (NIEP) from Romania is the leading national institution on earthquake monitoring and research, having at the same time a declared focus on informing and educating a wide audience about geosciences and especially seismology. This is more than welcome, since Romania is a very active country from a seismological point of view, but not too reactive when it comes to diminishing the possible effects of a major earthquake. Over the last few decades, the country has experienced several major earthquakes which have claimed thousands of lives and millions in property damage (the 1940, 1977, 1986 and 1990 Vrancea earthquakes). In this context, during a partnership started in 2014 together with the National Art University and the Siveco IT company, a group of researchers from NIEP initiated the MOBile Earthquake Exhibition (MOBEE) project. The main goal was to design a portable museum to bring on the road educational activities focused on seismology, seismic hazard and Earth science. The exhibition is mainly focused on school students of all ages, as it explains the main topics of geophysics through a unique combination of posters, digital animations and apps, large markets and exciting hands-on experiments, 3D printed models and posters. This project is unique in Romania and aims to convey properly reviewed, up-to-date information regarding the definition of earthquakes, the ways natural hazards can affect people, buildings and the environment, and the measures to be taken to mitigate the aftermath. Many of the presented concepts can be used by teachers as a complementary way of demonstrating physics facts and concepts and explaining processes that shape the dynamic Earth features. It also involves

  18. Clinical characteristics of patients with seizure following the 2016 Kumamoto earthquake.

    Science.gov (United States)

    Inatomi, Yuichiro; Nakajima, Makoto; Yonehara, Toshiro; Ando, Yukio

    2017-06-01

    To investigate the clinical characteristics of patients with seizure following the 2016 Kumamoto earthquake, we retrospectively studied patients with seizure admitted to our hospital for 12 weeks following the earthquake. We compared the clinical backgrounds and characteristics of the patients before the earthquake (the same period from the previous 3 years) and after it, and between the early (first 2 weeks) and late (subsequent 10 weeks) phases. A total of 60 patients with seizure were admitted to the emergency room after the earthquake, compared with 175 (58.3/year) before it. Of these, 35 patients with seizure were hospitalized in the Department of Neurology after the earthquake, compared with 96 (32/year) before it. Among patients admitted after the earthquake, males and non-cerebrovascular epileptogenic diseases were seen more frequently than before the earthquake. During the early phase after the earthquake, female, first-attack, and non-focal-type patients were seen more frequently than during the late phase. These characteristics of patients with seizure during the early phase after the earthquake suggest that many patients had non-epileptic seizures. To prevent seizures following earthquakes, the mental stress and physical status of evacuees must be assessed. Copyright © 2017. Published by Elsevier Ltd.
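    The per-year rates quoted in the abstract follow from simple arithmetic over the 3-year comparison period; a quick check (counts taken from the abstract, variable names are mine):

    ```python
    # Admission counts over the 3-year pre-earthquake comparison period,
    # as quoted in the abstract.
    er_admissions_before, neuro_admissions_before, years = 175, 96, 3

    er_rate = er_admissions_before / years        # quoted as 58.3/year
    neuro_rate = neuro_admissions_before / years  # quoted as 32/year

    print(round(er_rate, 1), round(neuro_rate, 1))  # → 58.3 32.0
    ```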

  19. Role of the Internet in Anticipating and Mitigating Earthquake Catastrophes, and the Emergence of Personal Risk Management (Invited)

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Donnellan, A.; Graves, W.; Tiampo, K. F.; Klein, W.

    2009-12-01

    Risks from natural and financial catastrophes are currently managed by a combination of large public and private institutions. Public institutions usually comprise government agencies that conduct studies, formulate policies and guidelines, enforce regulations, and make “official” forecasts. Private institutions include insurance and reinsurance companies, and financial service companies that underwrite catastrophe (“cat”) bonds and make private forecasts. Although decisions about allocating resources and developing solutions are made by large institutions, the costs of dealing with catastrophes fall for the most part on businesses and the general public. Information on potential risks is generally available to the public for some hazards but not others. In the case of weather, for example, private forecast services are provided by www.weather.com and www.wunderground.com. For earthquakes in California (only), the official forecast is the WGCEP-USGS forecast, but it is provided in a format that is difficult for the public to use. Other private forecasts are currently available, for example from the JPL QuakeSim and Russian groups, but these efforts are limited. As more of the world’s population moves into major seismic zones, new strategies are needed to allow individuals to manage their personal risk from large and damaging earthquakes. Examples include individual mitigation measures such as retrofitting, microinsurance in both developing and developed countries, and other financial strategies. We argue that the “long tail” of the internet offers an ideal and greatly underutilized mechanism to reach out to consumers and to provide them with the information and tools they need to confront and manage seismic hazard and risk on an individual, personalized basis. Information of this type includes not only global hazard forecasts, which are now possible, but also global risk estimation. Additionally

  20. Flames in fractal grid generated turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Goh, K H H; Hampp, F; Lindstedt, R P [Department of Mechanical Engineering, Imperial College, London SW7 2AZ (United Kingdom); Geipel, P, E-mail: p.lindstedt@imperial.ac.uk [Siemens Industrial Turbomachinery AB, SE-612 83 Finspong (Sweden)

    2013-12-15

    Twin premixed turbulent opposed jet flames were stabilized for lean mixtures of air with methane and propane in fractal grid generated turbulence. A density segregation method was applied alongside particle image velocimetry to obtain velocity and scalar statistics. It is shown that the current fractal grids increase the turbulence levels by around a factor of 2. Proper orthogonal decomposition (POD) was applied to show that the fractal grids produce slightly larger turbulent structures that decay at a slower rate as compared to conventional perforated plates. Conditional POD (CPOD) was also implemented using the density segregation technique and the results show that CPOD is essential to segregate the relative structures and turbulent kinetic energy distributions in each stream. The Kolmogorov length scales were also estimated providing values ~0.1 and ~0.5 mm in the reactants and products, respectively. Resolved profiles of flame surface density indicate that a thin flame assumption leading to bimodal statistics is not perfectly valid under the current conditions and it is expected that the data obtained will be of significant value to the development of computational methods that can provide information on the conditional structure of turbulence. It is concluded that the increase in the turbulent Reynolds number is without any negative impact on other parameters and that fractal grids provide a route towards removing the classical problem of a relatively low ratio of turbulent to bulk strain associated with the opposed jet configuration. (paper)
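    The snapshot POD applied in the abstract can be sketched via an SVD of mean-subtracted snapshots; the following is a minimal illustration with synthetic random data standing in for PIV fields, not the authors' pipeline (array shapes and names are assumptions):

    ```python
    import numpy as np

    # Snapshot POD sketch: rows are spatial points, columns are velocity
    # snapshots (synthetic random data standing in for PIV measurements).
    rng = np.random.default_rng(0)
    n_points, n_snaps = 200, 50
    snapshots = rng.standard_normal((n_points, n_snaps))

    # Remove the temporal mean so the modes describe fluctuations only.
    fluct = snapshots - snapshots.mean(axis=1, keepdims=True)

    # Economy SVD: columns of U are spatial POD modes; s**2 ranks each
    # mode's share of the fluctuating kinetic energy.
    U, s, _ = np.linalg.svd(fluct, full_matrices=False)
    energy = s**2 / np.sum(s**2)

    print(round(float(energy.sum()), 6))             # energy fractions sum to 1
    print(bool(np.allclose(U.T @ U, np.eye(U.shape[1]))))  # modes are orthonormal
    ```

    Ranking `energy` in descending order then shows how many modes are needed to capture a given fraction of the turbulent kinetic energy, which is how the dominant structures behind the fractal grids would be compared with those behind perforated plates.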