WorldWideScience

Sample records for adaptively smoothed seismicity

  1. Adaptively smoothed seismicity earthquake forecasts for Italy

    Directory of Open Access Journals (Sweden)

    Yan Y. Kagan

    2010-11-01

    Full Text Available We present a model for estimation of the probabilities of future earthquakes of magnitudes m ≥ 4.95 in Italy. This model is a modified version of that proposed for California, USA, by Helmstetter et al. [2007] and Werner et al. [2010a], and it approximates seismicity using a spatially heterogeneous, temporally homogeneous Poisson point process. The temporal, spatial and magnitude dimensions are entirely decoupled. Magnitudes are independently and identically distributed according to a tapered Gutenberg-Richter magnitude distribution. We have estimated the spatial distribution of future seismicity by smoothing the locations of past earthquakes listed in two Italian catalogs: a short instrumental catalog, and a longer instrumental and historic catalog. The bandwidth of the adaptive spatial kernel is estimated by optimizing the predictive power of the kernel estimate of the spatial earthquake density in retrospective forecasts. When available and reliable, we used small earthquakes of m ≥ 2.95 to reveal active fault structures and 29 probable future epicenters. By calibrating the model with these two catalogs of different durations to create two forecasts, we intend to quantify the loss (or gain) of predictability incurred when only a short, but recent, data record is available. Both forecasts were scaled to five and ten years, and have been submitted to the Italian prospective forecasting experiment of the global Collaboratory for the Study of Earthquake Predictability (CSEP). An earlier forecast from the model was submitted by Helmstetter et al. [2007] to the Regional Earthquake Likelihood Model (RELM) experiment in California, and with more than half of the five-year experimental period over, the forecast has performed better than the others.
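
    The abstract's core ingredient, an adaptive-bandwidth kernel estimate of the spatial earthquake density, can be sketched as follows. This is an illustrative implementation, not the authors' code: the Gaussian kernel and the nearest-neighbour bandwidth rule (the `k` parameter) are assumptions in the spirit of Helmstetter-style smoothing.

```python
import numpy as np

def adaptive_kernel_density(epicenters, grid, k=2):
    """Estimate a spatial earthquake density on `grid` by smoothing past
    `epicenters` with Gaussian kernels whose bandwidth at each epicenter
    is its distance to the k-th nearest other epicenter (a common
    adaptive-bandwidth choice; parameter names are illustrative)."""
    epicenters = np.asarray(epicenters, float)   # (n, 2) x/y positions
    grid = np.asarray(grid, float)               # (m, 2) evaluation points
    # pairwise distances between epicenters
    d = np.linalg.norm(epicenters[:, None] - epicenters[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    h = np.sort(d, axis=1)[:, k - 1]             # adaptive bandwidth per event
    h = np.maximum(h, 1e-6)
    # sum of 2-D Gaussian kernels, one per epicenter
    diff = grid[:, None] - epicenters[None, :]   # (m, n, 2)
    r2 = (diff ** 2).sum(-1)
    dens = (np.exp(-r2 / (2 * h ** 2)) / (2 * np.pi * h ** 2)).sum(axis=1)
    return dens / len(epicenters)                # normalised density
```

    In the papers the bandwidth parameter is chosen by optimizing the likelihood of retrospective forecasts; here it is simply an input.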

  2. Did you smooth your well logs the right way for seismic interpretation?

    International Nuclear Information System (INIS)

    Duchesne, Mathieu J; Gaillot, Philippe

    2011-01-01

    Correlations between physical properties and seismic reflection data are useful to determine the geological nature of seismic reflections and the lateral extent of geological strata. The difference in resolution between well logs and seismic data is a major hurdle faced by seismic interpreters when tying both data sets. In general, log data have a resolution at least two orders of magnitude greater than that of seismic data. Smoothing physical property logs improves correlation at the seismic scale. Three different approaches were used and compared to smooth a density log: binomial filtering, seismic wavelet filtering and discrete wavelet transform (DWT) filtering. Regression plots between the density logs and the acoustic impedance show that DWT smoothing is the only method that preserves the original relationship between the raw density data and the acoustic impedance. Smoothed logs were then used to generate synthetic seismograms that were tied to seismic data at the borehole site. The best ties were achieved using the synthetic seismogram computed with the density log processed with the DWT. The good performance of the DWT is explained by its adaptive multi-scale character, which preserves significant local changes of density in the high-resolution data series that are also pictured at the seismic scale. Since synthetic seismograms are generated from smoothed logs, the choice of smoothing method impacts the quality of seismic-to-well ties. This ultimately can have economic implications during hydrocarbon exploration or exploitation phases.
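
    A minimal numpy-only sketch of DWT-style log smoothing (using a hand-rolled Haar transform rather than the wavelet used in the study, and an illustrative `frac` threshold): small detail coefficients are zeroed, so noise is removed while sharp density contrasts survive — the property that preserved the density-impedance relationship.

```python
import numpy as np

def haar_dwt_smooth(x, levels=3, frac=0.2):
    """Haar-transform `x`, keep only the largest `frac` of detail
    coefficients at each level (so sharp contrasts survive), and invert.
    Assumes len(x) is divisible by 2**levels."""
    approx, details = np.asarray(x, float).copy(), []
    for _ in range(levels):
        a = (approx[0::2] + approx[1::2]) / np.sqrt(2)   # running averages
        d = (approx[0::2] - approx[1::2]) / np.sqrt(2)   # running details
        thr = np.quantile(np.abs(d), 1 - frac)
        d = np.where(np.abs(d) >= thr, d, 0.0)           # kill small details
        details.append(d)
        approx = a
    for d in reversed(details):                          # inverse transform
        up = np.empty(2 * len(approx))
        up[0::2] = (approx + d) / np.sqrt(2)
        up[1::2] = (approx - d) / np.sqrt(2)
        approx = up
    return approx
```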

  3. An adaptive method for γ spectra smoothing

    International Nuclear Information System (INIS)

    Xiao Gang; Zhou Chunlin; Li Tiantuo; Han Feng; Di Yuming

    2001-01-01

    Adaptive wavelet method and multinomial fitting gliding method are used for smoothing γ spectra, respectively, and then the FWHM of the 1332 keV peak of 60Co and the activities of a 238U standard specimen are calculated. The calculated results show that the adaptive wavelet method is better than the multinomial fitting gliding method.

  4. Adaptive prediction applied to seismic event detection

    International Nuclear Information System (INIS)

    Clark, G.A.; Rodgers, P.W.

    1981-01-01

    Adaptive prediction was applied to the problem of detecting small seismic events in microseismic background noise. The Widrow-Hoff LMS adaptive filter used in a prediction configuration is compared with two standard seismic filters as an onset indicator. Examples demonstrate the technique's usefulness with both synthetic and actual seismic data.
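
    The detector described here can be sketched as a Widrow-Hoff LMS filter in a one-step-ahead prediction configuration: the filter learns the predictable microseismic background, so the prediction error jumps at an unpredictable event onset. The filter order and step size below are illustrative choices, not values from the report.

```python
import numpy as np

def lms_prediction_error(x, order=8, mu=0.01):
    """Run a Widrow-Hoff (LMS) adaptive predictor over `x` and return
    the one-step prediction error, usable as an onset indicator."""
    w = np.zeros(order)
    err = np.zeros(len(x))
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]      # most recent samples first
        y = w @ u                     # predicted next sample
        e = x[n] - y                  # prediction error (onset indicator)
        w += 2 * mu * e * u           # Widrow-Hoff weight update
        err[n] = e
    return err
```

    On a predictable background the error decays toward zero; a transient that the filter has not learned produces a clear spike in the error sequence.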

  5. Adaptive prediction applied to seismic event detection

    Energy Technology Data Exchange (ETDEWEB)

    Clark, G.A.; Rodgers, P.W.

    1981-09-01

    Adaptive prediction was applied to the problem of detecting small seismic events in microseismic background noise. The Widrow-Hoff LMS adaptive filter used in a prediction configuration is compared with two standard seismic filters as an onset indicator. Examples demonstrate the technique's usefulness with both synthetic and actual seismic data.

  6. Time-Independent Annual Seismic Rates, Based on Faults and Smoothed Seismicity, Computed for Seismic Hazard Assessment in Italy

    Science.gov (United States)

    Murru, M.; Falcone, G.; Taroni, M.; Console, R.

    2017-12-01

    In 2015 the Italian Department of Civil Protection started a project to upgrade the official Italian seismic hazard map (MPS04), inviting the Italian scientific community to participate in a joint effort for its realization. We participated by providing spatially variable time-independent (Poisson) long-term annual occurrence rates of seismic events over the entire Italian territory, on cells of 0.1°×0.1°, from M4.5 up to M8.1, in magnitude bins of 0.1 units. Our final model is an equal-weight ensemble of two models: the first realized with a smoothed-seismicity approach, the second using seismogenic faults. The spatial smoothed seismicity was obtained using the smoothing method introduced by Frankel (1995), applied to the historical and instrumental seismicity. In this approach we adopted a tapered Gutenberg-Richter relation with the b-value fixed to 1 and a corner magnitude estimated from the largest events in the catalogs. For each seismogenic fault provided by the Database of Individual Seismogenic Sources (DISS), we computed the annual rate (for each 0.1°×0.1° cell) in magnitude bins of 0.1 units, assuming that the seismic moments of the earthquakes generated by each fault are distributed according to the same tapered Gutenberg-Richter relation as the smoothed-seismicity model. The annual rate for the final model was determined as follows: if a cell falls within one of the seismic sources, we merge, with equal weight, the rate determined from the seismic moments of the earthquakes generated by the fault and the rate from the smoothed-seismicity model; if instead the cell falls outside any seismic source, we keep the rate obtained from the spatial smoothed seismicity. Here we present the final results of our study, to be used for the new Italian seismic hazard map.
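
    The two building blocks — Frankel-style Gaussian smoothing of gridded counts and the equal-weight merge with fault-based rates — can be sketched as below. Grid size, correlation distance and rate values are illustrative; the real model works on 0.1°×0.1° cells and 0.1-unit magnitude bins.

```python
import numpy as np

def frankel_smooth(counts, c=1.0):
    """Gaussian smoothing of gridded earthquake counts in the spirit of
    Frankel (1995): each cell's count is redistributed over the grid with
    weight exp(-d^2/c^2), normalised so the total count is preserved.
    The correlation distance `c` is in cell units here (illustrative)."""
    ny, nx = counts.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    smoothed = np.zeros_like(counts, dtype=float)
    for j in range(ny):
        for i in range(nx):
            if counts[j, i] == 0:
                continue
            d2 = (yy - j) ** 2 + (xx - i) ** 2
            w = np.exp(-d2 / c ** 2)
            smoothed += counts[j, i] * w / w.sum()
    return smoothed

def merge_rates(smoothed_rate, fault_rate, in_fault_zone):
    """Ensemble rate per the described recipe: inside a seismogenic
    source, average the fault-based and smoothed-seismicity rates with
    equal weight; outside, keep the smoothed-seismicity rate."""
    return np.where(in_fault_zone,
                    0.5 * (smoothed_rate + fault_rate),
                    smoothed_rate)
```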

  7. An Adaptable Seismic Data Format

    Science.gov (United States)

    Krischer, Lion; Smith, James; Lei, Wenjie; Lefebvre, Matthieu; Ruan, Youyi; de Andrade, Elliott Sales; Podhorszki, Norbert; Bozdağ, Ebru; Tromp, Jeroen

    2016-11-01

    We present ASDF, the Adaptable Seismic Data Format, a modern and practical data format for all branches of seismology and beyond. The growing volume of freely available data coupled with ever-expanding computational power opens avenues to tackle larger and more complex problems. Current bottlenecks include inefficient resource usage and insufficient data organization. Properly scaling a problem requires the resolution of both these challenges, and existing data formats are no longer up to the task. ASDF stores any number of synthetic, processed or unaltered waveforms in a single file. A key improvement compared to existing formats is the inclusion of comprehensive meta information, such as event or station information, in the same file. Additionally, it is usable for any non-waveform data, for example cross-correlations, adjoint sources or receiver functions. Last but not least, full provenance information can be stored alongside each item of data, thereby enhancing reproducibility and accountability. Any data set in our proposed format is self-describing and can be readily exchanged with others, facilitating collaboration. The utilization of the HDF5 container format grants efficient and parallel I/O operations, integrated compression algorithms and checksums to guard against data corruption. To not reinvent the wheel and to build upon past developments, we use existing standards like QuakeML, StationXML, W3C PROV and HDF5 wherever feasible. Usability and tool support are crucial for any new format to gain acceptance. We developed mature C/Fortran and Python based APIs coupling ASDF to the widely used SPECFEM3D_GLOBE and ObsPy toolkits.

  8. Optimal Smoothing in Adaptive Location Estimation

    OpenAIRE

    Mammen, Enno; Park, Byeong U.

    1997-01-01

    In this paper the higher-order performance of kernel-based adaptive location estimators is considered. The optimal choice of smoothing parameters is discussed, and it is shown how much efficiency is lost by not knowing the underlying translation density.

  9. Efficient computation of smoothing splines via adaptive basis sampling

    KAUST Repository

    Ma, Ping

    2015-06-24

    © 2015 Biometrika Trust. Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions and its computational complexity is generally O(n³). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full basis smoothing splines. Using simulation studies and a large-scale deep Earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.
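
    A toy version of the adaptive-basis-sampling idea (not the paper's algorithm): basis knots are sampled with probability driven by the response values, and a penalised fit is computed over that small basis instead of all n basis functions. The cubic radial basis, the gradient-based sampling rule, and the ridge penalty are illustrative assumptions.

```python
import numpy as np

def sampled_basis_spline(x, y, n_basis=20, lam=1e-3, seed=0):
    """Fit a penalised spline-like expansion using only `n_basis`
    response-adaptively sampled knots instead of n basis functions.
    Returns a callable predictor."""
    rng = np.random.default_rng(seed)
    # sample knots preferentially where the response changes fastest
    p = np.abs(np.gradient(y))
    p = p / p.sum()
    knots = np.sort(rng.choice(x, size=n_basis, replace=False, p=p))

    def design(xn):
        # intercept + linear term + cubic radial basis at sampled knots
        return np.hstack([np.ones((len(xn), 1)), xn[:, None],
                          np.abs(xn[:, None] - knots[None, :]) ** 3])

    B = design(x)
    coef = np.linalg.solve(B.T @ B + lam * np.eye(B.shape[1]), B.T @ y)
    return lambda xn: design(np.asarray(xn, float)) @ coef
```

    The full-basis spline would use one basis function per data point; sampling shrinks the linear system from n×n to roughly n_basis×n_basis.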

  10. Efficient computation of smoothing splines via adaptive basis sampling

    KAUST Repository

    Ma, Ping; Huang, Jianhua Z.; Zhang, Nan

    2015-01-01

    © 2015 Biometrika Trust. Smoothing splines provide flexible nonparametric regression estimators. However, the high computational cost of smoothing splines for large datasets has hindered their wide application. In this article, we develop a new method, named adaptive basis sampling, for efficient computation of smoothing splines in super-large samples. Except for the univariate case where the Reinsch algorithm is applicable, a smoothing spline for a regression problem with sample size n can be expressed as a linear combination of n basis functions and its computational complexity is generally O(n³). We achieve a more scalable computation in the multivariate case by evaluating the smoothing spline using a smaller set of basis functions, obtained by an adaptive sampling scheme that uses values of the response variable. Our asymptotic analysis shows that smoothing splines computed via adaptive basis sampling converge to the true function at the same rate as full basis smoothing splines. Using simulation studies and a large-scale deep Earth core-mantle boundary imaging study, we show that the proposed method outperforms a sampling method that does not use the values of response variables.

  11. Adaptive Smoothed Finite Elements (ASFEM) for history dependent material models

    International Nuclear Information System (INIS)

    Quak, W.; Boogaard, A. H. van den

    2011-01-01

    A successful simulation of a bulk forming process with finite elements can be difficult due to distortion of the finite elements. Nodally smoothed finite elements (NSFEM) are an interesting option for such a process since they show good distortion insensitivity, locking-free behavior and good computational efficiency. In this paper a method is proposed which takes advantage of the nodally smoothed field. This method, named adaptive smoothed finite elements (ASFEM), revises the mesh for every step of a simulation without mapping the history-dependent material parameters. In this paper an updated-Lagrangian implementation is presented. Several examples are given to illustrate the method and to show its properties.

  12. Adaptive Smoothing in fMRI Data Processing Neural Networks

    DEFF Research Database (Denmark)

    Vilamala, Albert; Madsen, Kristoffer Hougaard; Hansen, Lars Kai

    2017-01-01

    in isolation. With the advent of new tools for deep learning, recent work has proposed to turn these pipelines into end-to-end learning networks. This change of paradigm offers new avenues to improvement as it allows for a global optimisation. The current work aims at benefitting from this paradigm shift...... by defining a smoothing step as a layer in these networks able to adaptively modulate the degree of smoothing required by each brain volume to better accomplish a given data analysis task. The viability is evaluated on real fMRI data where subjects alternated between left and right finger tapping tasks....

  13. Smooth muscle adaptation after intestinal transection and resection.

    Science.gov (United States)

    Thompson, J S; Quigley, E M; Adrian, T E

    1996-09-01

    Changes in motor function occur in the intestinal remnant after intestinal resection. Smooth muscle adaptation also occurs, particularly after extensive resection. The time course of these changes and their interrelationship are unclear. Our aim was to evaluate changes in canine smooth muscle structure and function during intestinal adaptation after transection and resection. Twenty-five dogs underwent either transection (N = 10), 50% distal resection (N = 10), or 50% proximal resection (N = 5). Thickness and length of the circular (CM) and longitudinal (LM) muscle layers were measured four and 12 weeks after resection. In vitro length-tension properties and response to a cholinergic agonist were studied in mid-jejunum and mid-ileum. Transection alone caused increased CM length in the jejunum proximal to the transection but did not affect LM length or muscle thickness. A 50% resection resulted in increased length of CM throughout the intestine and thickening of CM and LM near the anastomosis. Active tension of jejunal CM increased transiently four weeks after resection. Active tension in jejunal LM was decreased 12 weeks after transection and resection. Sensitivity of CM to carbachol was similar after transection and resection. It is concluded that: (1) Structural adaptation of both circular and longitudinal muscle occurs after intestinal resection. (2) This process is influenced by the site of the intestinal remnant. (3) Only minor and transient changes occur in smooth muscle function after resection. (4) Factors other than muscle adaptation are likely involved in the changes in motor function seen following massive bowel resection.

  14. Adapting standards to the site. Example of Seismic Base Isolation

    International Nuclear Information System (INIS)

    Viallet, Emmanuel

    2014-01-01

    Emmanuel Viallet, Civil Design Manager at EDF engineering center SEPTEN, concluded the morning's lectures with a presentation on how to adapt a standard design to site characteristics. He presented the example of the seismic isolation of the Cruas NPP, for which the standard 900 MW design was indeed built on 'anti-seismic pads' to withstand local seismic loads.

  15. Adaptive control of energy storage systems for power smoothing applications

    DEFF Research Database (Denmark)

    Meng, Lexuan; Dragicevic, Tomislav; Guerrero, Josep M.

    2017-01-01

    Energy storage systems (ESSs) are desired and widely applied for power smoothing, especially in systems with renewable generation and pulsed loads. A high-pass filter (HPF) is commonly applied in those applications, in which the HPF extracts the high-frequency fluctuating power and uses...... that as the power reference for the ESS. The cut-off frequency, as the critical parameter, effectively decides the power/energy compensated by the ESS. Practically, the state-of-charge (SoC) of the ESS has to be limited for safety and life-cycle considerations. In this paper an adaptive cut-off frequency design is proposed...
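
    The HPF power split and an SoC-driven cut-off adjustment can be sketched as follows. The first-order discrete filter and the SoC guard rule are illustrative assumptions, not the controller from the paper.

```python
import numpy as np

def hpf_power_split(p, fc, dt=1.0):
    """First-order RC high-pass split of a power profile: the HPF output
    is the fluctuating component assigned to the ESS; the remainder is
    the smoothed power left for the grid. `fc` in Hz, `dt` in s."""
    rc = 1.0 / (2 * np.pi * fc)
    a = rc / (rc + dt)
    ess = np.zeros_like(p)
    for n in range(1, len(p)):
        ess[n] = a * (ess[n - 1] + p[n] - p[n - 1])
    return ess, p - ess                # (ESS reference, grid power)

def adapt_cutoff(fc, soc, soc_lo=0.2, soc_hi=0.8, gain=0.5):
    """Illustrative SoC guard: raise the cut-off frequency (so the ESS
    absorbs less energy) as the state-of-charge nears its limits."""
    if soc > soc_hi:
        return fc * (1 + gain * (soc - soc_hi) / (1 - soc_hi))
    if soc < soc_lo:
        return fc * (1 + gain * (soc_lo - soc) / soc_lo)
    return fc
```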

  16. A local cubic smoothing in an adaptation mode

    International Nuclear Information System (INIS)

    Dikoussar, N.D.

    2001-01-01

    A new approach to local curve approximation and smoothing is proposed. The relation between curve points is defined using special cross-ratio weight functions. The coordinates of three curve points are used as parameters for both the weight functions and the three-point cubic model (TPS). A cubic smoother in an adaptation mode (LOCUS) is constructed that is computationally simple and stable against random errors. The free parameter of the TPS is estimated independently of the fixed parameters by recursion with effective error suppression, and can be controlled by the cross-ratio parameters. The efficiency and noise stability of the algorithm are confirmed by examples and by comparison with other known non-parametric smoothers

  17. Adaptive Outlier-tolerant Exponential Smoothing Prediction Algorithms with Applications to Predict the Temperature in Spacecraft

    OpenAIRE

    Hu Shaolin; Zhang Wei; Li Ye; Fan Shunxi

    2011-01-01

    The exponential smoothing prediction algorithm is widely used in spaceflight control and in process monitoring, as well as in economic prediction. Two key problems remain open: one is the rule for selecting the parameter in exponential smoothing prediction, and the other is how to reduce the adverse influence of outliers on the prediction. In this paper a new practical outlier-tolerant algorithm is built to adaptively select a proper parameter, and the exponential smoothing pr...
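
    One simple way to make exponential smoothing outlier-tolerant — clipping each innovation against a running scale estimate — can be sketched as below. This is an illustrative stand-in for the paper's algorithm; `alpha`, `k`, the initial scale and the scale update are all assumptions.

```python
def robust_exp_smooth(xs, alpha=0.3, k=3.0, scale0=1.0):
    """Exponential smoothing where an innovation larger than k times the
    running absolute-innovation scale is clipped before it enters the
    update, so an isolated outlier barely moves the forecast."""
    s, scale = xs[0], scale0
    out = [s]
    for x in xs[1:]:
        e = x - s
        lim = k * max(scale, 1e-9)
        e = max(-lim, min(lim, e))           # clip outlier innovations
        s += alpha * e                       # standard smoothing update
        scale = 0.9 * scale + 0.1 * abs(e)   # running innovation scale
        out.append(s)
    return out
```

    Plain exponential smoothing would be pulled hundreds of units off course by a single spike; the clipped version barely reacts and then relaxes back to the level.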

  18. ASDF: An Adaptable Seismic Data Format with Full Provenance

    Science.gov (United States)

    Smith, J. A.; Krischer, L.; Tromp, J.; Lefebvre, M. P.

    2015-12-01

    In order for seismologists to maximize their knowledge of how the Earth works, they must extract the maximum amount of useful information from all recorded seismic data available for their research. This requires assimilating large sets of waveform data, keeping track of vast amounts of metadata, using validated standards for quality control, and automating the workflow in a careful and efficient manner. In addition, there is a growing gap between CPU/GPU speeds and disk access speeds that leads to an I/O bottleneck in seismic workflows. This is made even worse by existing seismic data formats that were not designed for performance and are limited to a few fixed headers for storing metadata. The Adaptable Seismic Data Format (ASDF) is a new data format for seismology that solves the problems with existing seismic data formats and integrates full provenance into the definition. ASDF is a self-describing format that features parallel I/O using the parallel HDF5 library. This makes it a great choice for use on HPC clusters. The format integrates the standards QuakeML for seismic sources and StationXML for receivers. ASDF is suitable for storing earthquake data sets, where all waveforms for a single earthquake are stored in one file, as well as ambient noise cross-correlations and adjoint sources. The format comes with a user-friendly Python reader and writer that gives seismologists access to a full set of Python tools for seismology. There is also a faster C/Fortran library for integrating ASDF into performance-focused numerical wave solvers, such as SPECFEM3D_GLOBE. Finally, a GUI tool for visually exploring the format provides a flexible interface for both research and educational applications. ASDF is a new seismic data format that offers seismologists high-performance parallel processing, organized and validated contents, and full provenance tracking for automated seismological workflows.

  19. An adaptive segment method for smoothing lidar signal based on noise estimation

    Science.gov (United States)

    Wang, Yuzhao; Luo, Pingping

    2014-10-01

    An adaptive segmentation smoothing method (ASSM) is introduced in this paper to smooth the signal and suppress the noise. In the ASSM, the noise level is defined as 3σ of the background signal. An integer N is defined for finding the changing positions in the signal curve: if the difference between two adjacent points is greater than 3Nσ, the position is recorded as an end point of a smoothing segment. All end points detected in this way are recorded, and the curves between them are smoothed separately. In the traditional method, the end points of the smoothing windows are fixed; the ASSM instead derives changing end points from each signal, so the smoothing windows are set adaptively. The windows are always set to half of the segment length, and an average smoothing method is then applied within each segment. An iterative process is required to reduce the end-point aberration effect of the average smoothing, and two or three iterations are enough. In the ASSM, the signals are smoothed in the spatial domain rather than the frequency domain, so frequency-domain disturbances are avoided. A lidar echo was simulated in the experimental work, supposed to be created by a space-borne lidar (e.g. CALIOP), and white Gaussian noise was added to the echo to represent the random noise from the environment and the detector. The novel method, ASSM, was applied to the noisy echo to filter the noise. In the test, N was set to 3 and two iterations were used. The results show that the signal can be smoothed adaptively by the ASSM, but N and the number of iterations may need to be optimized when the ASSM is applied to a different lidar.
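
    The described procedure — 3Nσ break-point detection, per-segment averaging with a window of half the segment length, and a couple of iterations — can be sketched as below. The edge-aware average (dividing by the number of samples actually in the window) is an assumption added so segment ends are not dragged toward zero.

```python
import numpy as np

def _edge_aware_average(seg, win):
    """Moving average that divides by the number of samples actually
    inside the window, so segment edges stay unbiased."""
    s = np.convolve(seg, np.ones(win), mode='same')
    n = np.convolve(np.ones(len(seg)), np.ones(win), mode='same')
    return s / n

def assm(signal, sigma, N=3, iterations=2):
    """Sketch of the adaptive segmentation smoothing method: cut the
    signal wherever two adjacent points differ by more than 3*N*sigma,
    then smooth each segment separately with a window of half the
    segment length, iterating to reduce end-point aberration."""
    x = np.asarray(signal, float)
    breaks = np.nonzero(np.abs(np.diff(x)) > 3 * N * sigma)[0] + 1
    ends = [0, *breaks, len(x)]
    out = x.copy()
    for a, b in zip(ends[:-1], ends[1:]):
        seg = out[a:b]
        win = max(1, (b - a) // 2)       # window = half the segment
        for _ in range(iterations):
            seg = _edge_aware_average(seg, win)
        out[a:b] = seg
    return out
```

    Because segments are cut at large jumps, the smoothing suppresses noise within each layer of the echo while preserving sharp transitions such as cloud boundaries.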

  20. An Adaptable Seismic Data Format for Modern Scientific Workflows

    Science.gov (United States)

    Smith, J. A.; Bozdag, E.; Krischer, L.; Lefebvre, M.; Lei, W.; Podhorszki, N.; Tromp, J.

    2013-12-01

    Data storage, exchange, and access play a critical role in modern seismology. Current seismic data formats, such as SEED, SAC, and SEG-Y, were designed with specific applications in mind and are frequently a major bottleneck in implementing efficient workflows. We propose a new modern parallel format that can be adapted for a variety of seismic workflows. The Adaptable Seismic Data Format (ASDF) features high-performance parallel read and write support and the ability to store an arbitrary number of traces of varying sizes. Provenance information is stored inside the file so that users know the origin of the data as well as the precise operations that have been applied to the waveforms. The design of the new format is based on several real-world use cases, including earthquake seismology and seismic interferometry. The metadata is based on the proven XML schemas StationXML and QuakeML. Existing time-series analysis toolkits are easily interfaced with this new format so that seismologists can use robust, previously developed software packages, such as ObsPy and the SAC library. ADIOS, netCDF4, and HDF5 can be used as the underlying container format. At Princeton University, we have chosen to use ADIOS as the container format because it has shown superior scalability for certain applications, such as dealing with big data on HPC systems. In the context of high-performance computing, we have implemented ASDF into the global adjoint tomography workflow on Oak Ridge National Laboratory's supercomputer Titan.

  1. Adaptive quantization-parameter clip scheme for smooth quality in H.264/AVC.

    Science.gov (United States)

    Hu, Sudeng; Wang, Hanli; Kwong, Sam

    2012-04-01

    In this paper, we investigate the issues of quality smoothness and bit-rate smoothness during rate control (RC) in H.264/AVC. An adaptive quantization-parameter (Qp) clip scheme is proposed to optimize quality smoothness while keeping the bit-rate fluctuation at an acceptable level. First, the frame complexity variation is studied by defining a complexity ratio between two nearby frames. Second, the range of the generated bits is analyzed to prevent the encoder buffer from overflowing and underflowing. Third, based on the safe range of the generated bits, an optimal Qp clip range is developed to reduce the quality fluctuation. Experimental results demonstrate that the proposed Qp clip scheme can achieve excellent performance in quality smoothness and buffer regulation.
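
    The flavour of a Qp clip can be illustrated with a toy rule. The actual scheme derives the clip range from the buffer-safe range of generated bits; the constants and the widening rule below are assumptions, not values from the paper.

```python
def clip_qp(qp, prev_qp, complexity_ratio, base_range=3):
    """Clip the rate controller's requested QP to a range around the
    previous frame's QP. The allowed step widens when the complexity
    ratio between nearby frames departs from 1 (e.g. a scene change),
    and stays tight for stable content, trading a little bit-rate
    fluctuation for quality smoothness."""
    r = base_range + round(abs(complexity_ratio - 1.0) * base_range)
    return max(prev_qp - r, min(prev_qp + r, qp))
```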

  2. Modern Adaptive Analytics Approach to Lowering Seismic Network Detection Thresholds

    Science.gov (United States)

    Johnson, C. E.

    2017-12-01

    Modern seismic networks present a number of challenges, but perhaps most notable are those related to 1) extreme variation in station density, 2) temporal variation in station availability, and 3) the need to achieve detectability for much smaller events of strategic importance. The first of these has been reasonably addressed in the development of modern seismic associators, such as GLASS 3.0 by the USGS/NEIC, though some work remains to be done in this area. However, the latter two challenges demand special attention. Station availability is impacted by weather, equipment failure, and the addition or removal of stations, and while thresholds have been pushed to increasingly smaller magnitudes, new algorithms are needed to achieve even lower thresholds. Station availability can be addressed by a modern, adaptive architecture that maintains specified performance envelopes using adaptive analytics coupled with complexity theory. Finally, detection thresholds can be lowered using a novel approach that tightly couples waveform analytics with the event detection and association processes, based on a principled repicking algorithm that uses particle realignment for enhanced phase discrimination.

  3. A simple smoothness indicator for the WENO scheme with adaptive order

    Science.gov (United States)

    Huang, Cong; Chen, Li Li

    2018-01-01

    The fifth-order WENO scheme with adaptive order is competent for solving hyperbolic conservation laws; its reconstruction is a convex combination of a fifth-order linear reconstruction and three third-order linear reconstructions. Note that, on a uniform mesh, the computational cost of the smoothness indicator for the fifth-order linear reconstruction is comparable with the sum of those for the three third-order linear reconstructions, and is thus too heavy; on a non-uniform mesh, the explicit form of the smoothness indicator for the fifth-order linear reconstruction is difficult to obtain, and its computational cost is much heavier than on a uniform mesh. To overcome these problems, a simple smoothness indicator for the fifth-order linear reconstruction is proposed in this paper.

  4. Smooth Adaptive Internal Model Control Based on U Model for Nonlinear Systems with Dynamic Uncertainties

    Directory of Open Access Journals (Sweden)

    Li Zhao

    2016-01-01

    Full Text Available An improved smooth adaptive internal model control based on the U-model control method is presented to simplify the modeling structure and parameter identification for a class of uncertain dynamic systems with unknown model parameters and bounded external disturbances. Differing from traditional adaptive methods, the proposed controller simplifies the identification of time-varying parameters in the presence of bounded external disturbances. Combining the small gain theorem and virtual equivalent system theory, the learning rate of the smooth adaptive internal model controller is analyzed, and the closed-loop virtual equivalent system based on the discrete U model is constructed as well. The convergence of this virtual equivalent system is proved, which further shows the convergence of the complex closed-loop discrete U-model system. Finally, simulation and experimental results on a typical nonlinear dynamic system verified the feasibility of the proposed algorithm. The proposed method is shown to have a lighter identification burden and higher control accuracy than the traditional adaptive controller.

  5. Stability of bumps in piecewise smooth neural fields with nonlinear adaptation

    KAUST Repository

    Kilpatrick, Zachary P.

    2010-06-01

    We study the linear stability of stationary bumps in piecewise smooth neural fields with local negative feedback in the form of synaptic depression or spike frequency adaptation. The continuum dynamics is described in terms of a nonlocal integrodifferential equation, in which the integral kernel represents the spatial distribution of synaptic weights between populations of neurons whose mean firing rate is taken to be a Heaviside function of local activity. Discontinuities in the adaptation variable associated with a bump solution means that bump stability cannot be analyzed by constructing the Evans function for a network with a sigmoidal gain function and then taking the high-gain limit. In the case of synaptic depression, we show that linear stability can be formulated in terms of solutions to a system of pseudo-linear equations. We thus establish that sufficiently strong synaptic depression can destabilize a bump that is stable in the absence of depression. These instabilities are dominated by shift perturbations that evolve into traveling pulses. In the case of spike frequency adaptation, we show that for a wide class of perturbations the activity and adaptation variables decouple in the linear regime, thus allowing us to explicitly determine stability in terms of the spectrum of a smooth linear operator. We find that bumps are always unstable with respect to this class of perturbations, and destabilization of a bump can result in either a traveling pulse or a spatially localized breather. © 2010 Elsevier B.V. All rights reserved.

  6. Development and characterization of a magnetorheological elastomer based adaptive seismic isolator

    International Nuclear Information System (INIS)

    Li, Yancheng; Li, Jianchun; Samali, Bijan; Li, Weihua

    2013-01-01

    One of the main shortcomings in current base isolation design/practice is lack of adaptability. As a result, a base isolation system that is effective for one type of earthquake may become ineffective, or may have adverse effects, for other earthquakes. The vulnerability of traditional base isolation systems can be exacerbated by two types of earthquakes, i.e. near-field earthquakes and far-field earthquakes. This paper addresses the challenge facing current base isolation design/practice by proposing a new type of seismic isolator for the base isolation system, namely an adaptive seismic isolator. The novel adaptive seismic isolator utilizes magnetorheological elastomer (MRE) for its field-sensitive material properties. The traditional seismic isolator design, with a laminated structure of steel and MRE layers, has been adopted in the novel MRE seismic isolator. To evaluate and characterize the behavior of the MRE seismic isolator, experimental testing was conducted on a shake table facility under harmonic cyclic loading. Experimental results show that the proposed adaptive seismic isolator can successfully alter the lateral stiffness and damping force in real time by up to 37% and 45%, respectively. Based on the successful development of the novel adaptive seismic isolator, a discussion is also extended to the impact and potential applications of such a device in structural control applications in civil engineering. (paper)

  7. A DAFT DL_POLY distributed memory adaptation of the Smoothed Particle Mesh Ewald method

    Science.gov (United States)

    Bush, I. J.; Todorov, I. T.; Smith, W.

    2006-09-01

    The Smoothed Particle Mesh Ewald method [U. Essmann, L. Perera, M.L. Berkowitz, T. Darden, H. Lee, L.G. Pedersen, J. Chem. Phys. 103 (1995) 8577] for calculating long-ranged forces in molecular simulation has been adapted for the parallel molecular dynamics code DL_POLY_3 [I.T. Todorov, W. Smith, Philos. Trans. Roy. Soc. London 362 (2004) 1835], making use of a novel 3D Fast Fourier Transform (DAFT) [I.J. Bush, The Daresbury Advanced Fourier transform, Daresbury Laboratory, 1999] that perfectly matches the Domain Decomposition (DD) parallelisation strategy [W. Smith, Comput. Phys. Comm. 62 (1991) 229; M.R.S. Pinches, D. Tildesley, W. Smith, Mol. Sim. 6 (1991) 51; D. Rapaport, Comput. Phys. Comm. 62 (1991) 217] of the DL_POLY_3 code. In this article we describe the software adaptations undertaken to import this functionality and provide a review of its performance.

  8. Total variation regularization for seismic waveform inversion using an adaptive primal dual hybrid gradient method

    Science.gov (United States)

    Yong, Peng; Liao, Wenyuan; Huang, Jianping; Li, Zhenchuan

    2018-04-01

    Full waveform inversion is an effective tool for recovering the properties of the Earth from seismograms. However, it suffers from local minima caused mainly by the limited accuracy of the starting model and the lack of a low-frequency component in the seismic data. Because of the high velocity contrast between salt and sediment, the relation between the waveform and the velocity perturbation is strongly nonlinear. Therefore, salt inversion can easily get trapped in local minima. Since the velocity of salt is nearly constant, we can exploit this characteristic with total variation regularization to mitigate the local minima. In this paper, we develop an adaptive primal dual hybrid gradient method to implement total variation regularization by projecting the solution onto a total variation norm constrained convex set, through which the total variation norm constraint is satisfied at every model iteration. The smooth background velocities are first inverted, and the perturbations are gradually obtained by successively relaxing the total variation norm constraints. A numerical experiment on projecting the BP model onto the intersection of the total variation norm and box constraints demonstrates the accuracy and efficiency of our adaptive primal dual hybrid gradient method. A workflow is designed to recover complex salt structures in the BP 2004 model and the 2D SEG/EAGE salt model, starting from a linear gradient model without using low-frequency data below 3 Hz. The salt inversion processes demonstrate that wavefield reconstruction inversion with a total variation norm and box constraints is able to overcome local minima and invert the complex salt velocity layer by layer.
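The constraint machinery described above can be illustrated on the closely related 1D total-variation proximal problem, solved with a primal dual hybrid gradient (Chambolle-Pock) iteration; the denoising setting, step sizes, and parameters are illustrative assumptions, not the paper's full FWI workflow:

```python
import numpy as np

def tv_denoise_pdhg(f, lam=1.0, n_iter=300):
    """1D TV-regularized denoising  min_x 0.5*||x - f||^2 + lam*||Dx||_1,
    solved with the primal dual hybrid gradient (Chambolle-Pock) method."""
    n = len(f)
    D = lambda x: np.diff(x)                        # forward difference operator
    Dt = lambda y: np.concatenate(([-y[0]], -np.diff(y), [y[-1]]))  # its adjoint
    sigma = tau = 0.25                              # sigma*tau*||D||^2 <= 1 (||D||^2 <= 4)
    x = f.copy(); x_bar = f.copy(); y = np.zeros(n - 1)
    for _ in range(n_iter):
        # dual ascent, then projection onto the lam-ball (conjugate of the l1 norm)
        y = np.clip(y + sigma * D(x_bar), -lam, lam)
        x_old = x
        # primal descent with the closed-form prox of the quadratic data term
        x = (x - tau * Dt(y) + tau * f) / (1.0 + tau)
        x_bar = 2 * x - x_old                       # over-relaxation step
    return x
```

On a noisy piecewise-constant signal, the iteration drives the total variation down while staying close to the data, which is the behavior the salt-inversion workflow relies on.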

  9. A local adaptive method for the numerical approximation in seismic wave modelling

    Directory of Open Access Journals (Sweden)

    Galuzzi Bruno G.

    2017-12-01

    We propose a new numerical approach for the solution of the 2D acoustic wave equation to model the predicted data in the field of active-source seismic inverse problems. This method consists in using an explicit finite difference technique with an adaptive order of approximation of the spatial derivatives that takes into account the local velocity at the grid nodes. Testing our method by simulating the recorded seismograms in a marine seismic acquisition, we found that the low computational time and the low approximation error of the proposed approach make it suitable in the context of seismic inversion problems.
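A minimal 1D analogue of the velocity-adaptive stencil idea can be sketched as follows; the switching rule (`c > median(c)`) and all parameters are illustrative assumptions, not the paper's criterion:

```python
import numpy as np

def acoustic_1d(c, dx, dt, nt, src_pos, src):
    """Explicit FD solver for u_tt = c(x)^2 u_xx where the spatial stencil
    order at each node is chosen from the local velocity (illustrative rule)."""
    n = len(c)
    u_prev, u = np.zeros(n), np.zeros(n)
    r2 = (dt * c / dx) ** 2
    hi = c > np.median(c)                  # nodes where the 4th-order stencil is used
    for it in range(nt):
        lap = np.zeros(n)
        lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]            # 2nd-order Laplacian
        lap4 = (-u[4:] + 16*u[3:-1] - 30*u[2:-2] + 16*u[1:-3] - u[:-4]) / 12.0
        view = lap[2:-2]
        view[hi[2:-2]] = lap4[hi[2:-2]]                       # 4th order where c is high
        u_next = 2.0 * u - u_prev + r2 * lap                  # leapfrog time update
        u_next[src_pos] += dt ** 2 * src[it]                  # point source injection
        u_prev, u = u, u_next
    return u
```

With a CFL number of 0.4 the scheme is stable for both stencils (the 4th-order limit in 1D is about 0.87), so mixing orders node by node is safe.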

  10. Evaluation of non-linear adaptive smoothing filter by digital phantom

    International Nuclear Information System (INIS)

    Sato, Kazuhiro; Ishiya, Hiroki; Oshita, Ryosuke; Yanagawa, Isao; Goto, Mitsunori; Mori, Issei

    2008-01-01

    As a result of the development of multi-slice CT, diagnoses based on three-dimensional reconstruction images and multi-planar reconstruction have spread. For these applications, which require high z-resolution, thin slice imaging is essential. However, because z-resolution always involves a trade-off with image noise, thin slice imaging is necessarily accompanied by an increase in noise level. To improve the quality of thin slice images, a non-linear adaptive smoothing filter has been developed and is being widely applied in clinical use. We developed a digital bar pattern phantom for the purpose of evaluating the effect of this filter, and attempted an evaluation using an addition image of the bar pattern phantom and an image of a water phantom. The effect of this filter varied in a complex manner with the contrast and spatial frequency of the original image. We confirmed that the filter reduces image noise in the low-frequency component of the image, but decreases contrast or increases noise in the high-frequency component. This result reflects how the filter's adaptation changes with image content. The digital phantom was useful for this evaluation, but to understand the total effect of filtering, considerable improvement of the shape of the digital phantom is required. (author)

  11. A new fuzzy adaptive particle swarm optimization for non-smooth economic dispatch

    Energy Technology Data Exchange (ETDEWEB)

    Niknam, Taher; Mojarrad, Hassan Doagou; Nayeripour, Majid [Electrical and Electronic Engineering Department, Shiraz University of Technology, Shiraz (Iran)

    2010-04-15

    This paper proposes a novel method for solving Non-convex Economic Dispatch (NED) problems: Fuzzy Adaptive Modified Particle Swarm Optimization (FAMPSO). Practical ED problems have non-smooth cost functions with equality and inequality constraints when generator valve-point loading effects are taken into account. Modern heuristic optimization techniques have received much attention from researchers due to their ability to find an almost global optimal solution for ED problems. PSO is one such heuristic algorithm, in which particles move to get close to the best position and find the global minimum point. However, classic PSO may converge to a local optimum, and its performance depends strongly on its internal parameters. To overcome these drawbacks, in this paper a new mutation is proposed to improve the global searching capability and prevent convergence to local minima. Also, a fuzzy system is used to tune parameters such as the inertia weight and learning factors. In order to evaluate the performance of the proposed algorithm, it is applied to systems consisting of 13 and 40 thermal units whose fuel cost functions account for the effect of valve-point loading. Simulation results demonstrate the superiority of the proposed algorithm over other optimization algorithms presented in the literature. (author)
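The PSO core that such methods build on can be sketched with a valve-point cost and a penalty for the power-balance constraint; the fixed `w`, `c1`, `c2` and the penalty weight are illustrative stand-ins for the paper's fuzzy tuning and mutation operator:

```python
import numpy as np

def valve_point_cost(p, a, b, c, e, f, pmin):
    """Fuel cost with the rectified-sine valve-point loading term."""
    return np.sum(a + b*p + c*p**2 + np.abs(e * np.sin(f * (pmin - p))), axis=-1)

def pso_dispatch(cost, lo, hi, demand, n_part=40, iters=200, seed=0):
    """Plain PSO with a penalty enforcing the power balance sum(p) = demand."""
    rng = np.random.default_rng(seed)
    d = len(lo)
    x = rng.uniform(lo, hi, (n_part, d))
    v = np.zeros((n_part, d))
    pen = lambda x: cost(x) + 1e3 * np.abs(x.sum(axis=-1) - demand)
    pbest, pcost = x.copy(), pen(x)
    g = pbest[pcost.argmin()].copy()
    w, c1, c2 = 0.7, 1.5, 1.5          # fixed here; the paper tunes these fuzzily
    for _ in range(iters):
        r1, r2 = rng.random((2, n_part, d))
        v = w*v + c1*r1*(pbest - x) + c2*r2*(g - x)   # velocity update
        x = np.clip(x + v, lo, hi)                    # keep units within limits
        cst = pen(x)
        better = cst < pcost
        pbest[better], pcost[better] = x[better], cst[better]
        g = pbest[pcost.argmin()].copy()              # global best position
    return g, pcost.min()
```

The rectified-sine term is what makes the cost non-smooth and multi-modal, which is why the paper adds mutation and fuzzy parameter tuning on top of this loop.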

  12. Adaptive smoothing based on Gaussian processes regression increases the sensitivity and specificity of fMRI data.

    Science.gov (United States)

    Strappini, Francesca; Gilboa, Elad; Pitzalis, Sabrina; Kay, Kendrick; McAvoy, Mark; Nehorai, Arye; Snyder, Abraham Z

    2017-03-01

    Temporal and spatial filtering of fMRI data is often used to improve statistical power. However, conventional methods, such as smoothing with fixed-width Gaussian filters, remove fine-scale structure in the data, necessitating a tradeoff between sensitivity and specificity. Specifically, smoothing may increase sensitivity (reduce noise and increase statistical power) but at the cost of specificity, in that fine-scale structure in neural activity patterns is lost. Here, we propose an alternative smoothing method based on Gaussian process (GP) regression for single-subject fMRI experiments. This method adapts the level of smoothing on a voxel-by-voxel basis according to the characteristics of the local neural activity patterns. GP-based fMRI analysis has heretofore been impractical owing to computational demands. Here, we demonstrate a new implementation of GP that makes it possible to handle the massive data dimensionality of the typical fMRI experiment. We demonstrate how GP can be used as a drop-in replacement for conventional preprocessing steps for temporal and spatial smoothing in a standard fMRI pipeline. We present simulated and experimental results that show increased sensitivity and specificity compared to conventional smoothing strategies. Hum Brain Mapp 38:1438-1459, 2017. © 2016 Wiley Periodicals, Inc.
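The basic smoothing operation, the posterior mean of GP regression with an RBF kernel, can be sketched as follows; fixed hyperparameters stand in for the paper's per-voxel adaptation:

```python
import numpy as np

def gp_smooth(t, y, length=1.0, sig_f=1.0, sig_n=0.3):
    """Posterior mean of GP regression with an RBF kernel, used as a smoother.
    Hyperparameters are fixed by hand here for illustration."""
    d2 = (t[:, None] - t[None, :]) ** 2
    K = sig_f**2 * np.exp(-0.5 * d2 / length**2)       # RBF covariance matrix
    # posterior mean = K (K + sig_n^2 I)^{-1} y
    alpha = np.linalg.solve(K + sig_n**2 * np.eye(len(t)), y)
    return K @ alpha
```

The noise variance `sig_n**2` controls how aggressively the data are smoothed; adapting it (and the length scale) locally is the essence of the paper's method.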

  13. Hybrid Adaptive Multilevel Monte Carlo Algorithm for Non-Smooth Observables of Itô Stochastic Differential Equations

    KAUST Repository

    Rached, Nadhir B.

    2014-01-06

    A new hybrid adaptive MC forward Euler algorithm for SDEs with singular coefficients and non-smooth observables is developed. This adaptive method is based on the derivation of a new error expansion with computable leading-order terms. When a non-smooth binary payoff is considered, the new adaptive method achieves the same complexity as the uniform discretization does for smooth problems. Moreover, the new algorithm is extended to the multilevel Monte Carlo (MLMC) forward Euler setting, which reduces the complexity from O(TOL^-3) to O(TOL^-2 (log TOL)^2). For the binary option case, it recovers the standard multilevel computational cost O(TOL^-2 (log TOL)^2). When considering a higher order Milstein scheme, a similar complexity result was obtained by Giles using uniform time stepping for one-dimensional SDEs, see [2]. The difficulty of extending Giles' Milstein MLMC method to the multidimensional case is an argument for the flexibility of our new adaptive MLMC forward Euler method, which can be easily adapted to this setting. Similarly, the expected complexity O(TOL^-2 (log TOL)^2) is reached for the multidimensional case and verified numerically.

  14. Hybrid Adaptive Multilevel Monte Carlo Algorithm for Non-Smooth Observables of Itô Stochastic Differential Equations

    KAUST Repository

    Rached, Nadhir B.; Hoel, Haakon; Tempone, Raul

    2014-01-01

    A new hybrid adaptive MC forward Euler algorithm for SDEs with singular coefficients and non-smooth observables is developed. This adaptive method is based on the derivation of a new error expansion with computable leading-order terms. When a non-smooth binary payoff is considered, the new adaptive method achieves the same complexity as the uniform discretization does for smooth problems. Moreover, the new algorithm is extended to the multilevel Monte Carlo (MLMC) forward Euler setting, which reduces the complexity from O(TOL^-3) to O(TOL^-2 (log TOL)^2). For the binary option case, it recovers the standard multilevel computational cost O(TOL^-2 (log TOL)^2). When considering a higher order Milstein scheme, a similar complexity result was obtained by Giles using uniform time stepping for one-dimensional SDEs, see [2]. The difficulty of extending Giles' Milstein MLMC method to the multidimensional case is an argument for the flexibility of our new adaptive MLMC forward Euler method, which can be easily adapted to this setting. Similarly, the expected complexity O(TOL^-2 (log TOL)^2) is reached for the multidimensional case and verified numerically.

  15. Adaptive semi-active control of buildings under seismic solicitations

    International Nuclear Information System (INIS)

    Roberti, V.; Jezequel, L.

    1993-01-01

    This paper describes an adaptive semi-active control method whereby nonlinear distributed systems are identified by their dynamical response. Approximate procedures are proposed which take into account the nonlinear behavior of the dynamic system considered. It is shown that only slight knowledge of the nonlinearities is needed to apply feedback and feedforward control laws. The method is applied to a simple example of a building with three degrees of freedom, and the numerical results are analyzed.

  16. A Multimode Adaptive Pushover Procedure for Seismic Assessment of Integral Bridges

    Directory of Open Access Journals (Sweden)

    Ehsan Mohtashami

    2013-01-01

    This paper presents a new adaptive pushover procedure to account for the effect of higher modes in order to accurately estimate the seismic response of bridges. The effect of higher modes is considered by introducing a minimum value for the total effective modal mass. The proposed method employs a sufficient number of modes to ensure that the defined total effective modal mass participates in all increments of the pushover loading. An adaptive demand curve is also developed for assessment of the seismic demand. The efficiency and robustness of the proposed method are demonstrated by conducting a parametric study. The analysis includes 18 four-span integral bridges with various pier heights. The inelastic response history analysis is employed as the reference solution in this study. Numerical results indicate excellent accuracy of the proposed method in assessment of the seismic response. For most bridges investigated in this study, the difference between the estimated response of the proposed method and the inelastic response history analysis is less than 25% for displacements and 10% for internal forces. This indicates very good accuracy compared to available pushover procedures in the literature. The proposed method is therefore recommended for the seismic performance evaluation of integral bridges in engineering applications.

  17. Smooth pursuit adaptation (SPA) exhibits features useful to compensate for changes in the properties of the smooth pursuit eye movement system due to usage.

    Directory of Open Access Journals (Sweden)

    Suryadeep Dash

    2013-10-01

    Smooth-pursuit adaptation (SPA) refers to the fact that pursuit gain in the early, still open-loop response phase of the pursuit eye movement can be adjusted based on experience. For instance, if the target moves initially at a constant velocity for approximately 100-200 ms and then steps to a higher velocity, subjects learn to up-regulate the pursuit gain associated with the initial target velocity (gain-increase SPA) in order to reduce the retinal error resulting from the velocity step. Correspondingly, a step to a lower target velocity leads to a decrease in gain (gain-decrease SPA). In this study we demonstrate that the increase in peak eye velocity during gain-increase SPA is a consequence of expanding the duration of the eye acceleration profile, while the decrease in peak velocity during gain-decrease SPA results from reduced peak eye acceleration but unaltered duration. Furthermore, we show that carrying out stereotypical smooth pursuit eye movements elicited by constant velocity target ramps for several hundred trials (= test of pursuit resilience) leads to a clear drop in initial peak acceleration, a reflection of oculomotor and/or cognitive fatigue. However, this drop in acceleration gets compensated by an increase in the duration of the acceleration profile, thereby keeping initial pursuit gain constant. The compensatory expansion of the acceleration profile in the pursuit resilience experiment is reminiscent of the one leading to gain-increase SPA, suggesting that both processes tap one and the same neuronal mechanism warranting a precise acceleration/duration trade-off. Finally, we show that the ability to adjust acceleration duration during pursuit resilience depends on the integrity of the oculomotor vermis (OMV), as indicated by the complete loss of the duration adjustment following a surgical lesion of the OMV in one rhesus monkey we could study.

  18. Seismic communication in a blind subterranean mammal: a major somatosensory mechanism in adaptive evolution underground.

    OpenAIRE

    Nevo, E; Heth, G; Pratt, H

    1991-01-01

    Seismic communication, through low-frequency and patterned substrate-borne vibrations that are generated by head thumping, and which travel long distances underground, is important in the nonvisual communication of subterranean mole rats of the Spalax ehrenbergi superspecies (2n = 52, 54, 58, and 60) in Israel. This importance pertains both intraspecifically in adaptation and interspecifically in speciation. Neurophysiologic, behavioral, and anatomic findings in this study suggest that the me...

  19. An adaptive spatio-temporal smoothing model for estimating trends and step changes in disease risk

    OpenAIRE

    Rushworth, Alastair; Lee, Duncan; Sarran, Christophe

    2014-01-01

    Statistical models used to estimate the spatio-temporal pattern in disease risk from areal unit data represent the risk surface for each time period with known covariates and a set of spatially smooth random effects. The latter act as a proxy for unmeasured spatial confounding, whose spatial structure is often characterised by a spatially smooth evolution between some pairs of adjacent areal units while other pairs exhibit large step changes. This spatial heterogeneity is not c...

  20. Hybrid Adaptive Multilevel Monte Carlo Algorithm for Non-Smooth Observables of Itô Stochastic Differential Equations

    KAUST Repository

    Rached, Nadhir B.

    2013-12-01

    The Monte Carlo forward Euler method with uniform time stepping is the standard technique to compute an approximation of the expected payoff of a solution of an Itô SDE. For a given accuracy requirement TOL, the complexity of this technique for well-behaved problems, that is, the amount of computational work to solve the problem, is O(TOL^-3). A new hybrid adaptive Monte Carlo forward Euler algorithm for SDEs with non-smooth coefficients and observables of low regularity is developed in this thesis. This adaptive method is based on the derivation of a new error expansion with computable leading-order terms. The basic idea of the new expansion is the use of a mixture of prior information to determine the weight functions and posterior information to compute the local error. In a number of numerical examples the superior efficiency of the hybrid adaptive algorithm over the standard uniform time stepping technique is verified. When a non-smooth binary payoff with either GBM or drift-singularity type of SDEs is considered, the new adaptive method achieves the same complexity as the uniform discretization does for smooth problems. Moreover, the new algorithm is extended to the MLMC forward Euler setting, which reduces the complexity from O(TOL^-3) to O(TOL^-2 (log TOL)^2). For the binary option case with the same type of Itô SDEs, the hybrid adaptive MLMC forward Euler recovers the standard multilevel computational cost O(TOL^-2 (log TOL)^2). When considering a higher order Milstein scheme, a similar complexity result was obtained by Giles using uniform time stepping for one-dimensional SDEs. The difficulty of extending Giles' Milstein MLMC method to the multidimensional case is an argument for the flexibility of our new adaptive MLMC forward Euler method, which can be easily adapted to this setting. Similarly, the expected complexity O(TOL^-2 (log TOL)^2) is reached for the multidimensional case and verified numerically.
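The MLMC telescoping estimator with uniform Euler steps, the baseline whose complexity is quoted above, can be sketched for a geometric Brownian motion with a smooth payoff (the thesis's adaptive stepping and binary payoffs are not reproduced here; levels and sample counts are illustrative):

```python
import numpy as np

def euler_endpoint(s0, mu, sig, T, z):
    """Vectorized Euler endpoint of dS = mu*S dt + sig*S dW.
    z has shape (n_steps, n_paths) of standard-normal increments."""
    n = z.shape[0]
    dt = T / n
    s = np.full(z.shape[1], s0)
    for k in range(n):
        s = s + mu * s * dt + sig * s * np.sqrt(dt) * z[k]
    return s

def mlmc_estimate(payoff, s0, mu, sig, T, L=5, m0=4, n_samp=20000, seed=0):
    """MLMC telescoping sum E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}],
    with fine/coarse paths coupled through shared Brownian increments."""
    rng = np.random.default_rng(seed)
    est = 0.0
    for l in range(L + 1):
        z = rng.standard_normal((m0 * 2**l, n_samp))
        fine = payoff(euler_endpoint(s0, mu, sig, T, z))
        if l == 0:
            est += fine.mean()
        else:
            # coarse path reuses the same Brownian increments, pairwise summed
            zc = (z[0::2] + z[1::2]) / np.sqrt(2.0)
            est += (fine - payoff(euler_endpoint(s0, mu, sig, T, zc))).mean()
    return est
```

The pairwise coupling makes the level differences small in variance, which is what drives the O(TOL^-2 (log TOL)^2) complexity for Euler.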

  1. Adaptive Multilevel Methods with Local Smoothing for $H^1$- and $H^{\\mathrm{curl}}$-Conforming High Order Finite Element Methods

    KAUST Repository

    Janssen, Bärbel; Kanschat, Guido

    2011-01-01

    A multilevel method on adaptive meshes with hanging nodes is presented, and the additional matrices appearing in the implementation are derived. Smoothers of overlapping Schwarz type are discussed; smoothing is restricted to the interior of the subdomains refined to the current level; thus it has optimal computational complexity. When applied to conforming finite element discretizations of elliptic problems and Maxwell equations, the method's convergence rates are very close to those for the nonadaptive version. Furthermore, the smoothers remain efficient for high order finite elements. We discuss the implementation in a general finite element code using the example of the deal.II library. © 2011 Society for Industrial and Applied Mathematics.

  2. Examining Potential Boundary Bias Effects in Kernel Smoothing on Equating: An Introduction for the Adaptive and Epanechnikov Kernels.

    Science.gov (United States)

    Cid, Jaime A; von Davier, Alina A

    2015-05-01

    Test equating is a method of making the test scores from different test forms of the same assessment comparable. In the equating process, an important step involves continuizing the discrete score distributions. In traditional observed-score equating, this step is achieved using linear interpolation (or an unscaled uniform kernel). In the kernel equating (KE) process, this continuization process involves Gaussian kernel smoothing. It has been suggested that the choice of bandwidth in kernel smoothing controls the trade-off between variance and bias. In the literature on estimating density functions using kernels, it has also been suggested that the weight of the kernel depends on the sample size, and therefore, the resulting continuous distribution exhibits bias at the endpoints, where the samples are usually smaller. The purpose of this article is (a) to explore the potential effects of atypical scores (spikes) at the extreme ends (high and low) on the KE method in distributions with different degrees of asymmetry using the randomly equivalent groups equating design (Study I), and (b) to introduce the Epanechnikov and adaptive kernels as potential alternative approaches to reducing boundary bias in smoothing (Study II). The beta-binomial model is used to simulate observed scores reflecting a range of different skewed shapes.
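The continuization step can be sketched for both kernels; this is the plain kernel-smoothing idea only, without KE's moment-preserving rescaling (bandwidth and grid are illustrative):

```python
import numpy as np

def kernel_continuize(scores, probs, x, h, kernel="gauss"):
    """Continuize a discrete score distribution by kernel smoothing.
    `scores`/`probs` define the discrete distribution, `x` the evaluation grid,
    `h` the bandwidth."""
    u = (x[:, None] - scores[None, :]) / h
    if kernel == "gauss":
        k = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    else:
        # Epanechnikov: compact support, so less mass leaks past the score range,
        # which is the boundary-bias motivation discussed in the record
        k = np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)
    return (k * probs[None, :]).sum(axis=1) / h
```

Because the Epanechnikov kernel has support [-h, h], it places no density far below the minimum or above the maximum score, unlike the Gaussian kernel's infinite tails.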

  3. Adaptive endpoint detection of seismic signal based on auto-correlated function

    International Nuclear Information System (INIS)

    Fan Wanchun; Shi Ren

    2001-01-01

    Based on an analysis of the auto-correlation function, the notion of a distance between auto-correlation functions was introduced, and the characteristics of noise and of signal with noise were discussed using this distance. A method for adaptive endpoint detection of seismic signals based on auto-correlated similarity was then developed. The implementation steps and the determination of the thresholds are presented in detail. Experimental results, compared with methods based on manual picking, show that this method has higher sensitivity even at low signal-to-noise ratios.
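For comparison, the conventional energy-ratio (STA/LTA) detector that such adaptive methods are typically measured against takes only a few lines; window lengths here are illustrative:

```python
import numpy as np

def sta_lta(x, n_sta=50, n_lta=400):
    """Classic short-term-average / long-term-average energy-ratio detector,
    computed with cumulative sums for O(n) cost."""
    e = np.asarray(x, float) ** 2
    cs = np.concatenate(([0.0], np.cumsum(e)))
    sta = (cs[n_sta:] - cs[:-n_sta]) / n_sta      # short-window mean energy
    lta = (cs[n_lta:] - cs[:-n_lta]) / n_lta      # long-window mean energy
    m = min(len(sta), len(lta))
    return sta[-m:] / (lta[-m:] + 1e-12)          # windows end on the same sample
```

A detection is declared where the ratio exceeds a threshold; the endpoint-detection method in the record aims to improve on exactly this kind of trigger at low signal-to-noise ratios.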

  4. Local Adaptive Calibration of the GLASS Surface Incident Shortwave Radiation Product Using Smoothing Spline

    Science.gov (United States)

    Zhang, X.; Liang, S.; Wang, G.

    2015-12-01

    Incident solar radiation (ISR) over the Earth's surface plays an important role in determining the Earth's climate and environment. Generally, ISR can be obtained from direct measurements, remotely sensed data, or reanalysis and general circulation model (GCM) data. Each type of product has advantages and limitations: direct surface measurements are accurate but provide sparse spatial coverage, whereas the global products may have large uncertainties. Ground measurements have normally been used for validation and occasionally calibration, but transforming their "true values" spatially to improve the satellite products is still a new and challenging topic. In this study, an improved thin-plate smoothing spline approach is presented to locally "calibrate" the Global LAnd Surface Satellite (GLASS) ISR product using ISR data reconstructed from surface meteorological measurements. The influence of surface elevation on ISR estimation was also considered in the proposed method. The point-based surface-reconstructed ISR was used as the response variable, with the GLASS ISR product and the surface elevation data at the corresponding locations as explanatory variables, to train the thin-plate spline model. We evaluated the performance of the approach using cross-validation at both daily and monthly time scales over China. We also evaluated the estimated ISR based on the thin-plate spline method using independent ground measurements at 10 sites from the Coordinated Enhanced Observation Network (CEON). These validation results indicated that the thin-plate smoothing spline method can be effectively used to calibrate satellite-derived ISR products with ground measurements to achieve better accuracy.
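A drastically simplified stand-in for the calibration idea, ordinary least squares of ground-reconstructed ISR on the satellite product and elevation instead of a thin-plate spline, looks like this (variable names, units, and coefficients are synthetic):

```python
import numpy as np

def fit_calibration(sat_isr, elev, ground_isr):
    """Least-squares fit of ground-reconstructed ISR on the satellite ISR
    product and surface elevation (a linear stand-in; the record uses a
    thin-plate smoothing spline, which adds a smooth spatial term)."""
    X = np.column_stack([np.ones_like(sat_isr), sat_isr, elev])
    beta, *_ = np.linalg.lstsq(X, ground_isr, rcond=None)
    return beta

def apply_calibration(beta, sat_isr, elev):
    """Apply the fitted calibration to new satellite/elevation values."""
    return beta[0] + beta[1] * sat_isr + beta[2] * elev
```

The same response/explanatory structure (ground ISR regressed on satellite ISR plus elevation) carries over to the spline model; the spline simply lets the correction vary smoothly in space.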

  5. A 'Good' muscle in a 'Bad' environment: the importance of airway smooth muscle force adaptation to airway hyperresponsiveness.

    Science.gov (United States)

    Bossé, Ynuk; Chapman, David G; Paré, Peter D; King, Gregory G; Salome, Cheryl M

    2011-12-15

    Asthma is characterized by airway inflammation, with a consequent increase in spasmogens, and exaggerated airway narrowing in response to stimuli, termed airway hyperresponsiveness (AHR). The nature of any relationship between inflammation and AHR is less clear. Recent ex vivo data has suggested a novel mechanism by which inflammation may lead to AHR, in which increased basal ASM-tone, due to the presence of spasmogens in the airways, may "strengthen" the ASM and ultimately lead to exaggerated airway narrowing. This phenomenon was termed "force adaptation" [Bossé, Y., Chin, L.Y., Paré, P.D., Seow, C.Y., 2009. Adaptation of airway smooth muscle to basal tone: relevance to airway hyperresponsiveness. Am. J. Respir. Cell Mol. Biol. 40, 13-18]. However, it is unknown whether the magnitude of the effect of force adaptation ex vivo could contribute to exaggerated airway narrowing in vivo. Our aim was to utilize a computational model of ASM shortening in order to quantify the potential effect of force adaptation on airway narrowing when all other mechanical factors were kept constant. The shortening in the model is dictated by a balance between physiological loads and ASM force-generating capacity at different lengths. The results suggest that the magnitude of the effect of force adaptation on ASM shortening would lead to substantially more airway narrowing during bronchial challenge at any given airway generation. We speculate that the increased basal ASM-tone in asthma, due to the presence of inflammation-derived spasmogens, produces an increase in the force-generating capacity of ASM, predisposing to AHR during subsequent challenge. Copyright © 2011 Elsevier B.V. All rights reserved.

  6. Adaptive Multilevel Methods with Local Smoothing for $H^1$- and $H^{\\mathrm{curl}}$-Conforming High Order Finite Element Methods

    KAUST Repository

    Janssen, Bärbel

    2011-01-01

    A multilevel method on adaptive meshes with hanging nodes is presented, and the additional matrices appearing in the implementation are derived. Smoothers of overlapping Schwarz type are discussed; smoothing is restricted to the interior of the subdomains refined to the current level; thus it has optimal computational complexity. When applied to conforming finite element discretizations of elliptic problems and Maxwell equations, the method's convergence rates are very close to those for the nonadaptive version. Furthermore, the smoothers remain efficient for high order finite elements. We discuss the implementation in a general finite element code using the example of the deal.II library. © 2011 Society for Industrial and Applied Mathematics.

  7. Effect of acetylcysteine on adaptation of intestinal smooth muscle after small bowel bypass

    International Nuclear Information System (INIS)

    Weisbrodt, N.W.; Belloso, R.M.; Biskin, L.C.; Dudrick, P.S.; Dudrick, S.J.

    1986-01-01

    The authors have postulated that the adaptive changes in function and structure of bypassed segments of small bowel are due in part to the change in intestinal contents following the operation. The purpose of these experiments was to determine if a mucolytic agent could alter this adaptation. Rats were anesthetized and a 70% jejunoileal bypass was performed. The bypassed segments were then perfused with either saline or acetylcysteine for 3-12 days. Then, either intestinal transit was determined using Cr-51, or segments were taken for morphometric analysis. Transit, as assessed by the geometric center, was increased 32% by acetylcysteine treatment. Treatment also caused a decrease in hypertrophy of the muscularis. Muscle wet weight, muscle cross-sectional area, and muscle layer thickness were all significantly lower in animals infused with acetylcysteine. No decreases in hypertrophy were seen in the in-continuity segments. These data indicate that alterations in intestinal content can affect the course of adaptation of intestinal muscle in response to small bowel bypass.

  8. Adaptive Sensor Tuning for Seismic Event Detection in Environment with Electromagnetic Noise

    Science.gov (United States)

    Ziegler, Abra E.

    The goal of this research is to detect possible microseismic events at a carbon sequestration site. Data recorded on a continuous downhole microseismic array in the Farnsworth Field, an oil field in Northern Texas that hosts an ongoing carbon capture, utilization, and storage project, were evaluated using machine learning and reinforcement learning techniques to determine their effectiveness at seismic event detection on a dataset with electromagnetic noise. The data were recorded from a passive vertical monitoring array consisting of 16 levels of 3-component 15 Hz geophones installed in the field and continuously recording since January 2014. Electromagnetic and other noise recorded on the array has significantly impacted the utility of the data and it was necessary to characterize and filter the noise in order to attempt event detection. Traditional detection methods using short-term average/long-term average (STA/LTA) algorithms were evaluated and determined to be ineffective because of changing noise levels. To improve the performance of event detection and automatically and dynamically detect seismic events using effective data processing parameters, an adaptive sensor tuning (AST) algorithm developed by Sandia National Laboratories was utilized. AST exploits neuro-dynamic programming (reinforcement learning) trained with historic event data to automatically self-tune and determine optimal detection parameter settings. The key metric that guides the AST algorithm is consistency of each sensor with its nearest neighbors: parameters are automatically adjusted on a per station basis to be more or less sensitive to produce consistent agreement of detections in its neighborhood. The effects that changes in neighborhood configuration have on signal detection were explored, as it was determined that neighborhood-based detections significantly reduce the number of both missed and false detections in ground-truthed data. The performance of the AST algorithm was
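The neighborhood-consistency idea described above can be sketched as a simple post-filter on per-station detection times; the window and agreement threshold are hypothetical, not AST's learned parameters:

```python
import numpy as np

def neighborhood_consistent(det_times, neighbors, window=0.5, min_agree=2):
    """Keep a station's detections only when at least `min_agree` neighboring
    stations report a detection within `window` seconds. This is the
    consistency metric described for AST, reduced to a fixed-threshold filter."""
    kept = []
    for t in det_times:
        agree = sum(
            1 for nb in neighbors
            if np.any(np.abs(np.asarray(nb, float) - t) <= window)
        )
        if agree >= min_agree:
            kept.append(t)
    return kept
```

In AST proper, the per-station detection parameters are then adjusted (via reinforcement learning) to maximize this kind of neighborhood agreement, rather than simply discarding inconsistent picks.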

  9. Adaptive estimation of a time-varying phase with coherent states: Smoothing can give an unbounded improvement over filtering

    Science.gov (United States)

    Laverick, Kiarn T.; Wiseman, Howard M.; Dinani, Hossein T.; Berry, Dominic W.

    2018-04-01

    The problem of measuring a time-varying phase, even when the statistics of the variation is known, is considerably harder than that of measuring a constant phase. In particular, the usual bounds on accuracy, such as the 1/(4n̄) standard quantum limit with coherent states, do not apply. Here, by restricting to coherent states, we are able to analytically obtain the achievable accuracy, the equivalent of the standard quantum limit, for a wide class of phase variation. In particular, we consider the case where the phase has Gaussian statistics and a power-law spectrum equal to κ^(p-1)/|ω|^p for large ω, for some p > 1. For coherent states with mean photon flux N, we give the quantum Cramér-Rao bound on the mean-square phase error as [p sin(π/p)]^(-1) (4N/κ)^(-(p-1)/p). Next, we consider whether the bound can be achieved by an adaptive homodyne measurement in the limit N/κ ≫ 1, which allows the photocurrent to be linearized. Applying the optimal filtering for the resultant linear Gaussian system, we find the same scaling with N, but with a prefactor larger by a factor of p. By contrast, if we employ optimal smoothing we can exactly obtain the quantum Cramér-Rao bound. That is, contrary to previously considered (p = 2) cases of phase estimation, here the improvement offered by smoothing over filtering is not limited to a factor of 2 but rather can be unbounded by a factor of p. We also study numerically the performance of these estimators for an adaptive measurement in the limit where N/κ is not large and find a more complicated picture.

  10. Adaptive Control Design for Autonomous Operation of Multiple Energy Storage Systems in Power Smoothing Applications

    DEFF Research Database (Denmark)

    Meng, Lexuan; Dragicevic, Tomislav; Guerrero, Josep M.

    2018-01-01

…high-pass-filter (HPF) structure. It generates the power reference according to the fluctuating power and provides a stabilization effect. The power and energy supplied by the ESS are mainly determined by the cut-off frequency and gain of the HPF. Considering the operational limits on ESS state-of-charge (SoC), this paper proposes an adaptive cut-off frequency design method to realize communication-less and autonomous operation of a system with multiple distributed ESS. The experimental results demonstrate that the SoCs of all ESS units are kept within safe margins, while the SoC level and power of the paralleled units converge to the final state, providing a natural plug-and-play function.
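A minimal discrete sketch of the HPF power-reference idea follows. The first-order filter discretization and the SoC-to-cutoff adaptation rule are illustrative assumptions for demonstration, not the paper's actual design.

```python
import math

def hpf_power_reference(p_fluct, dt, f_cut):
    """First-order high-pass filter: the ESS absorbs only the fast component
    of the fluctuating power (standard RC high-pass difference equation)."""
    rc = 1.0 / (2 * math.pi * f_cut)
    alpha = rc / (rc + dt)
    out, prev_in, prev_out = [], p_fluct[0], 0.0
    for x in p_fluct[1:]:
        y = alpha * (prev_out + x - prev_in)
        out.append(y)
        prev_in, prev_out = x, y
    return out

def adapt_cutoff(soc, f_min=0.01, f_max=0.5):
    # Hypothetical rule: near the SoC limits, raise the cutoff so the ESS
    # absorbs less energy and its state of charge drifts back toward mid-range.
    stress = abs(soc - 0.5) * 2  # 0 at 50% SoC, 1 at the limits
    return f_min + (f_max - f_min) * stress
```

The key behavior is that a sustained (DC) power component is rejected, so the ESS only supplies the fluctuation, while the cutoff choice trades smoothing strength against SoC excursion.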

  11. Adaptive multiscale MCMC algorithm for uncertainty quantification in seismic parameter estimation

    KAUST Repository

    Tan, Xiaosi

    2014-08-05

Formulating an inverse problem in a Bayesian framework has several major advantages (Sen and Stoffa, 1996). It allows finding multiple solutions subject to flexible a priori information and performing uncertainty quantification in the inverse problem. In this paper, we consider Bayesian inversion for parameter estimation in seismic wave propagation. Bayes' theorem allows writing the posterior distribution via the likelihood function and the prior distribution, where the latter represents our prior knowledge about physical properties. One of the popular algorithms for sampling this posterior distribution is Markov chain Monte Carlo (MCMC), which involves making proposals and calculating their acceptance probabilities. However, for large-scale problems, MCMC is prohibitively expensive as it requires many forward runs. In this paper, we propose a multilevel MCMC algorithm that employs multilevel forward simulations. Multilevel forward simulations are derived using Generalized Multiscale Finite Element Methods that we have proposed earlier (Efendiev et al., 2013a; Chung et al., 2013). Our overall Bayesian inversion approach provides a substantial speed-up both in the sampling process, via preconditioning using approximate posteriors, and in the computation of the forward problems for different proposals, by using the adaptive nature of multiscale methods. These aspects of the method are discussed in the paper. This paper is motivated by earlier work of M. Sen and his collaborators (Hong and Sen, 2007; Hong, 2008), who proposed the development of efficient MCMC techniques for seismic applications. In the paper, we present some preliminary numerical results.
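The idea of preconditioning MCMC with an approximate (coarse) posterior can be sketched as a two-stage (delayed-acceptance) Metropolis-Hastings sampler: a cheap coarse density screens proposals before the expensive fine density is evaluated. The Gaussian toy densities and random-walk proposal below are illustrative stand-ins, not the paper's multiscale forward models.

```python
import math
import random

def two_stage_mh(log_post_fine, log_post_coarse, x0, n_steps, step=0.5, seed=1):
    """Two-stage Metropolis-Hastings with a coarse-model screening stage.
    The stage-2 correction keeps the fine posterior invariant."""
    rng = random.Random(seed)
    x, chain = x0, []
    lc, lf = log_post_coarse(x0), log_post_fine(x0)
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, step)
        lcy = log_post_coarse(y)
        # Stage 1: accept/reject against the cheap coarse model only.
        if math.log(rng.random()) < lcy - lc:
            lfy = log_post_fine(y)  # fine model evaluated only for survivors
            # Stage 2: correct with the fine model.
            if math.log(rng.random()) < (lfy - lf) - (lcy - lc):
                x, lc, lf = y, lcy, lfy
        chain.append(x)
    return chain

# Toy example: fine posterior N(0,1); coarse posterior a slightly shifted N(0.2,1).
fine = lambda x: -0.5 * x * x
coarse = lambda x: -0.5 * (x - 0.2) ** 2
chain = two_stage_mh(fine, coarse, x0=0.0, n_steps=20000)
mean = sum(chain) / len(chain)
```

Proposals rejected at stage 1 never trigger a fine (expensive forward-model) evaluation, which is where the speed-up comes from.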

  12. Improvement of Detection of Hypoattenuation in Acute Ischemic Stroke in Unenhanced Computed Tomography Using an Adaptive Smoothing Filter

    International Nuclear Information System (INIS)

Takahashi, N.; Lee, Y.; Tsai, D. Y.; Ishii, K.; Kinoshita, T.; Tamura, H.; Kimura, M.

    2008-01-01

Background: Much attention has been directed toward identifying early signs of cerebral ischemia on computed tomography (CT) images. Hypoattenuation of ischemic brain parenchyma has been found to be the most frequent early sign. Purpose: To evaluate the effect of a previously proposed adaptive smoothing filter for improving detection of parenchymal hypoattenuation of acute ischemic stroke on unenhanced CT images. Material and Methods: Twenty-six patients with parenchymal hypoattenuation and 49 control subjects without hypoattenuation were retrospectively selected for this study. The adaptive partial median filter (APMF), designed to improve the detectability of hypoattenuation areas on unenhanced CT images, was applied. Seven radiologists, including four certified radiologists and three radiology residents, indicated their confidence level regarding the presence (or absence) of hypoattenuation on CT images, first without and then with the APMF-processed images. Their performances without and with the APMF-processed images were evaluated by receiver operating characteristic (ROC) analysis. Results: The mean area under the ROC curve (AUC) for all observers increased from 0.875 to 0.929 (P=0.002) when the radiologists observed the APMF-processed images. The mean sensitivity in the detection of hypoattenuation improved significantly, from 69% (126 of 182 observations) to 89% (151 of 182 observations), when employing the APMF (P=0.012). The specificity, however, was unaffected by the APMF (P=0.41). Conclusion: The APMF has the potential to improve the detection of parenchymal hypoattenuation of acute ischemic stroke on unenhanced CT images.

  13. Adaptive endpoint detection of seismic signal based on auto-correlated function

    International Nuclear Information System (INIS)

    Fan Wanchun; Shi Ren

    2000-01-01

Endpoint detection by time-waveform envelope and/or by checking the travel-time table (both labelled here as artificial detection methods) has certain shortcomings. Based on an analysis of the auto-correlation function, the notion of the distance between auto-correlation functions was introduced, and the characteristics of noise and of signal with noise were discussed using this distance. On this basis, a method for adaptive endpoint detection of seismic signals based on auto-correlation similarity was developed. The implementation steps and the determination of the thresholds are presented in detail. Experimental results, compared with those of the artificial detection methods, show that this method has higher sensitivity even in low-SNR conditions.
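The idea of detecting an onset by the distance between auto-correlation functions can be sketched as follows. The RMS distance, window length, lag count, and threshold rule are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

def autocorr(x, max_lag):
    """Normalized sample auto-correlation function up to max_lag lags."""
    x = x - x.mean()
    denom = float(np.dot(x, x)) or 1.0
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom
                     for k in range(max_lag)])

def acf_distance(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))  # RMS distance between ACFs

def detect_onset(trace, win=200, max_lag=50, factor=5.0):
    """Return the start index of the first window whose ACF departs from a
    leading noise reference by more than factor times the noise-only level."""
    ref = autocorr(trace[:win], max_lag)
    # Calibrate the threshold on early (assumed noise-only) windows.
    noise_d = [acf_distance(autocorr(trace[i:i + win], max_lag), ref)
               for i in range(0, 3 * win, win // 4)]
    thresh = factor * (np.mean(noise_d) + 1e-9)
    for i in range(0, len(trace) - win, win // 4):
        if acf_distance(autocorr(trace[i:i + win], max_lag), ref) > thresh:
            return i
    return None
```

White noise has a nearly delta-shaped ACF, while a correlated seismic arrival does not, so the distance jumps at the onset even when raw amplitudes are comparable.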

  14. Compression of seismic data: filter banks and extended transforms, synthesis and adaptation; Compression de donnees sismiques: bancs de filtres et transformees etendues, synthese et adaptation

    Energy Technology Data Exchange (ETDEWEB)

    Duval, L.

    2000-11-01

Wavelet and wavelet packet transforms are the most commonly used algorithms for seismic data compression. Wavelet coefficients are generally quantized and encoded by classical entropy coding techniques. We first propose in this work a compression algorithm based on the wavelet transform, used together with a zero-tree type coding applied here for the first time to seismic data. Classical wavelet transforms nevertheless yield a rather rigid approach, since it is often desirable to adapt the transform stage to the properties of each type of signal. We thus propose a second algorithm using, instead of wavelets, a set of so-called 'extended transforms'. These transforms, originating from filter bank theory, are parameterized. Classical examples are Malvar's Lapped Orthogonal Transforms (LOT) and de Queiroz et al.'s Generalized Lapped Orthogonal Transforms (GenLOT). We propose several optimization criteria to build 'extended transforms' adapted to the properties of seismic signals. We further show that these transforms can be used with the same zero-tree type coding technique as used with wavelets. Both proposed algorithms allow exact choice of the compression rate, block-wise compression (in the case of extended transforms), and partial decompression for quality control or visualization. Performance is tested on a set of actual seismic data and evaluated with several quality measures. We also compare the algorithms to other seismic compression algorithms. (author)
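The transform-quantize-encode pipeline can be illustrated with a single-level orthonormal Haar wavelet, with uniform quantization standing in for the entropy/zero-tree coding stage. This is a generic sketch under those simplifying assumptions, not the author's codec.

```python
import numpy as np

def haar_1d(x):
    """One level of the orthonormal Haar wavelet transform."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)  # approximation (low-pass)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail (high-pass)
    return a, d

def ihaar_1d(a, d):
    """Exact inverse of haar_1d."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def compress(x, q):
    """Transform, uniformly quantize with step q, reconstruct. Quantized
    coefficients (mostly zero details for smooth traces) are what an
    entropy or zero-tree coder would actually store."""
    a, d = haar_1d(x)
    aq, dq = np.round(a / q), np.round(d / q)
    return ihaar_1d(aq * q, dq * q)
```

The reconstruction error is bounded by the quantization step, while smooth signal segments produce near-zero detail coefficients, which is exactly the sparsity zero-tree coding exploits.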

  15. Seismic source characterization for the 2014 update of the U.S. National Seismic Hazard Model

    Science.gov (United States)

    Moschetti, Morgan P.; Powers, Peter; Petersen, Mark D.; Boyd, Oliver; Chen, Rui; Field, Edward H.; Frankel, Arthur; Haller, Kathleen; Harmsen, Stephen; Mueller, Charles S.; Wheeler, Russell; Zeng, Yuehua

    2015-01-01

    We present the updated seismic source characterization (SSC) for the 2014 update of the National Seismic Hazard Model (NSHM) for the conterminous United States. Construction of the seismic source models employs the methodology that was developed for the 1996 NSHM but includes new and updated data, data types, source models, and source parameters that reflect the current state of knowledge of earthquake occurrence and state of practice for seismic hazard analyses. We review the SSC parameterization and describe the methods used to estimate earthquake rates, magnitudes, locations, and geometries for all seismic source models, with an emphasis on new source model components. We highlight the effects that two new model components—incorporation of slip rates from combined geodetic-geologic inversions and the incorporation of adaptively smoothed seismicity models—have on probabilistic ground motions, because these sources span multiple regions of the conterminous United States and provide important additional epistemic uncertainty for the 2014 NSHM.
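The adaptively smoothed seismicity component mentioned above ties each kernel's bandwidth to the local density of earthquakes. A minimal sketch of that idea follows; the isotropic Gaussian kernel, planar coordinates, and the choice of k are illustrative assumptions, not the NSHM implementation.

```python
import numpy as np

def adaptive_smoothed_rate(epicenters, grid, k=2):
    """Adaptive kernel smoothing: each earthquake is spread with a Gaussian
    whose bandwidth is the distance to its k-th nearest neighboring event,
    so dense clusters get sharp kernels and sparse regions broad ones."""
    eps = np.asarray(epicenters, dtype=float)
    # Per-event bandwidth: distance to the k-th nearest other event.
    d = np.sqrt(((eps[:, None, :] - eps[None, :, :]) ** 2).sum(-1))
    bw = np.sort(d, axis=1)[:, k]  # column 0 is the self-distance 0
    bw = np.maximum(bw, 1e-6)
    rate = np.zeros(len(grid))
    for (x, y), h in zip(eps, bw):
        r2 = ((grid[:, 0] - x) ** 2 + (grid[:, 1] - y) ** 2) / h ** 2
        rate += np.exp(-0.5 * r2) / (2 * np.pi * h ** 2)
    return rate

# Tight cluster near the origin plus one isolated event at (10, 10):
epicenters = [[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [0.1, 0.1], [10.0, 10.0]]
grid = np.array([[0.0, 0.0], [5.0, 5.0], [10.0, 10.0]])
rate = adaptive_smoothed_rate(epicenters, grid, k=2)
```

The cluster produces a sharp, high peak at the origin, while the lone event is smeared broadly, contributing a low rate even far from its epicenter.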

  16. Adaptive neuro-fuzzy inference systems for semi-automatic discrimination between seismic events: a study in Tehran region

    Science.gov (United States)

    Vasheghani Farahani, Jamileh; Zare, Mehdi; Lucas, Caro

    2012-04-01

This article presents an adaptive neuro-fuzzy inference system (ANFIS) for classification of low-magnitude seismic events reported in Iran by the network of the Tehran Disaster Mitigation and Management Organization (TDMMO). ANFIS classifiers were used to detect seismic events using six inputs that defined the events. Neuro-fuzzy coding was applied using the six extracted features as ANFIS inputs. Two types of events were defined: weak earthquakes and mining blasts. The data comprised 748 events (6289 signals) ranging from magnitude 1.1 to 4.6 recorded at 13 seismic stations between 2004 and 2009, of which some 223 earthquakes with M ≤ 2.2 are included in the database. Data sets from the south, east, and southeast of the city of Tehran were used to evaluate the best short-period seismic discriminants, and features such as the origin time of the event, source-to-station distance, epicenter latitude, epicenter longitude, magnitude, and spectral content (fc of the Pg wave) were used as inputs, increasing the rate of correct classification and decreasing the confusion rate between weak earthquakes and quarry blasts. The performance of the ANFIS model was evaluated for training and classification accuracy. The results confirmed that the proposed ANFIS model has good potential for classifying seismic events.

  17. The smart cluster method. Adaptive earthquake cluster identification and analysis in strong seismic regions

    Science.gov (United States)

    Schaefer, Andreas M.; Daniell, James E.; Wenzel, Friedemann

    2017-07-01

Earthquake clustering is an essential part of almost any statistical analysis of spatial and temporal properties of seismic activity. The nature of earthquake clusters and the subsequent declustering of earthquake catalogues play a crucial role in determining the magnitude-dependent earthquake return period and its spatial variation for probabilistic seismic hazard assessment. This study introduces the Smart Cluster Method (SCM), a new methodology to identify earthquake clusters using an adaptive point process for spatio-temporal cluster identification. It utilises the magnitude-dependent spatio-temporal earthquake density to adjust the search properties, then analyses the identified clusters to determine directional variation and adjusts its search space with respect to directional properties. In the case of rapid subsequent ruptures like the 1992 Landers sequence or the 2010-2011 Darfield-Christchurch sequence, a reclassification procedure is applied to disassemble subsequent ruptures using near-field searches, nearest-neighbour classification and temporal splitting. The method is capable of identifying and classifying earthquake clusters in space and time. It has been tested and validated using earthquake data from California and New Zealand. A total of more than 1500 clusters have been found in both regions since 1980 with Mmin = 2.0. Utilising the knowledge of cluster classification, the method has been adjusted to provide an earthquake declustering algorithm, whose performance is comparable to established methodologies. The analysis of earthquake clustering statistics led to various new and updated correlation functions, e.g. for the ratio between mainshock and strongest aftershock and for general aftershock activity metrics.

  18. Multicomponent ensemble models to forecast induced seismicity

    Science.gov (United States)

    Király-Proag, E.; Gischig, V.; Zechar, J. D.; Wiemer, S.

    2018-01-01

In recent years, human-induced seismicity has become an increasingly relevant topic due to its economic and social implications. Several models and approaches have been developed to explain the underlying physical processes or to forecast induced seismicity. They range from simple statistical models to coupled numerical models incorporating complex physics. We advocate the need for forecast testing as currently the best method for ascertaining whether models are capable of reasonably accounting for the key governing physical processes. Moreover, operational forecast models are of great interest for on-site decision-making in projects entailing induced earthquakes. We previously introduced a standardized framework following the guidelines of the Collaboratory for the Study of Earthquake Predictability, the Induced Seismicity Test Bench, to test, validate, and rank induced seismicity models. In this study, we describe how to construct multicomponent ensemble models based on Bayesian weightings that deliver more accurate forecasts than individual models for the Basel 2006 and Soultz-sous-Forêts 2004 enhanced geothermal stimulation projects. For this, we examine five calibrated variants of two significantly different model groups: (1) Shapiro and Smoothed Seismicity, based on the seismogenic index, simple modified Omori-law-type seismicity decay, and temporally weighted smoothed seismicity; (2) Hydraulics and Seismicity, based on numerically modelled pore pressure evolution that triggers seismicity using the Mohr-Coulomb failure criterion. We also demonstrate how the individual and ensemble models would perform as part of an operational Adaptive Traffic Light System. Investigating seismicity forecasts based on a range of potential injection scenarios, we use forecast periods of different durations to compute the occurrence probabilities of seismic events M ≥ 3. We show that in the case of the Basel 2006 geothermal stimulation the models forecast hazardous levels
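Bayesian weighting of forecast models can be sketched generically: each model's weight is its posterior probability given past performance (log-likelihood on a training period), and the ensemble is the weighted average of the rate forecasts. This is a plain Bayesian-model-averaging sketch in the spirit of the study; the actual Induced Seismicity Test Bench weighting may differ.

```python
import math

def ensemble_forecast(forecasts, log_likelihoods, prior=None):
    """Combine per-bin rate forecasts with posterior model weights
    proportional to prior * exp(log-likelihood)."""
    n = len(forecasts)
    prior = prior or [1.0 / n] * n
    m = max(log_likelihoods)  # subtract the max for numerical stability
    w = [p * math.exp(ll - m) for p, ll in zip(prior, log_likelihoods)]
    z = sum(w)
    w = [wi / z for wi in w]
    combined = [sum(wi * f[j] for wi, f in zip(w, forecasts))
                for j in range(len(forecasts[0]))]
    return combined, w

# Two toy rate forecasts over three spatial bins; model A explained the
# training period far better, so it dominates the ensemble.
rates, weights = ensemble_forecast(
    forecasts=[[1.0, 2.0, 0.5], [3.0, 0.5, 0.5]],
    log_likelihoods=[-10.0, -14.0])
```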

  19. Adaptive multiscale MCMC algorithm for uncertainty quantification in seismic parameter estimation

    KAUST Repository

    Tan, Xiaosi; Gibson, Richard L.; Leung, Wing Tat; Efendiev, Yalchin R.

    2014-01-01

    problem. In this paper, we consider Bayesian inversion for the parameter estimation in seismic wave propagation. The Bayes' theorem allows writing the posterior distribution via the likelihood function and the prior distribution where the latter represents

  20. Ring-Shaped Seismicity Structures in the Areas of Sarez and Nurek Water Reservoirs (Tajikistan): Lithosphere Adaptation to Additional Loading

    Science.gov (United States)

    Kopnichev, Yu. F.; Sokolova, I. N.

    2017-12-01

Seismicity characteristics in the areas of Sarez Lake and the Nurek water reservoir are studied. Ring-shaped seismicity structures in two depth ranges (0-33 and 34-70 km) formed prior to the Pamir earthquake of December 7, 2015 (Mw = 7.2). Seismicity rings cross each other near the Usoi Dam, which formed after the strong earthquake in 1911 and led to the formation of Sarez Lake, and near the epicenter of the Pamir earthquake. In addition, three out of the four strongest events (M ≥ 6.0) recorded in the Pamir region at depths of more than 70 km since 1950 have occurred near Sarez Lake. An aggregate of the data allows us to conclude that the Pamir earthquake, despite its very large energy, refers to events related to induced seismicity. Ring-shaped seismicity structures in two depth ranges also formed in the Nurek water reservoir area. It is supposed that the formation of ring-shaped structures is related to the self-organization processes of a geological system, which result in the ascent of deep-seated fluids. In this respect, the lithosphere is gradually adapting to the additional load related to the filling of the water reservoir. The difference between Nurek Dam (and many other hydroelectric power stations as well) and Usoi Dam is the permanent vibration in the former case due to water falling from a height of more than 200 m. Such an effect can lead to gradual stress dissipation, resulting in the occurrence of much weaker events when compared to the Pamir earthquake of December 7, 2015, in the areas of artificial water reservoirs.

  1. Highly-optimized TWSM software package for seismic diffraction modeling adapted for GPU-cluster

    Science.gov (United States)

    Zyatkov, Nikolay; Ayzenberg, Alena; Aizenberg, Arkady

    2015-04-01

Oil-producing companies need to increase the resolution of seismic data for complex oil- and gas-bearing deposits associated with salt domes, basalt traps, reefs, lenses, etc. Known methods of seismic wave theory define the shape of hydrocarbon accumulations with insufficient resolution, since they do not account for multiple diffractions explicitly. We elaborate an alternative seismic wave theory in terms of operators of propagation in layers and reflection-transmission at curved interfaces. An approximation of this theory is realized in the seismic frequency range as the Tip-Wave Superposition Method (TWSM). TWSM, based on the operator theory, allows evaluation of the wavefield in bounded domains/layers with geometrical shadow zones (in nature: salt domes, basalt traps, reefs, lenses, etc.), accounting for so-called cascade diffraction. Cascade diffraction includes edge waves from sharp edges, creeping waves near concave parts of interfaces, waves of the whispering galleries near convex parts of interfaces, etc. The basic algorithm of the TWSM package is based on multiplication of large matrices (which can reach hundreds of terabytes in size). We use advanced information technologies for effective realization of the numerical procedures of TWSM. In particular, we actively use NVIDIA CUDA technology and GPU accelerators, which significantly improve the performance of the TWSM software package; this is important when using it for direct and inverse problems. The accuracy, stability and efficiency of the algorithm are justified by numerical examples with curved interfaces. The TWSM package and its separate components can be used in modeling tasks such as planning of acquisition systems, physical interpretation of laboratory modeling, and modeling of individual waves of different types, and in inverse tasks such as imaging beneath a laterally inhomogeneous overburden and AVO inversion.

  2. Seismic hazard in the Intermountain West

    Science.gov (United States)

    Haller, Kathleen; Moschetti, Morgan P.; Mueller, Charles; Rezaeian, Sanaz; Petersen, Mark D.; Zeng, Yuehua

    2015-01-01

    The 2014 national seismic-hazard model for the conterminous United States incorporates new scientific results and important model adjustments. The current model includes updates to the historical catalog, which is spatially smoothed using both fixed-length and adaptive-length smoothing kernels. Fault-source characterization improved by adding faults, revising rates of activity, and incorporating new results from combined inversions of geologic and geodetic data. The update also includes a new suite of published ground motion models. Changes in probabilistic ground motion are generally less than 10% in most of the Intermountain West compared to the prior assessment, and ground-motion hazard in four Intermountain West cities illustrates the range and magnitude of change in the region. Seismic hazard at reference sites in Boise and Reno increased as much as 10%, whereas hazard in Salt Lake City decreased 5–6%. The largest change was in Las Vegas, where hazard increased 32–35%.

  3. Power-Smoothing Scheme of a DFIG Using the Adaptive Gain Depending on the Rotor Speed and Frequency Deviation

    DEFF Research Database (Denmark)

    Lee, Hyewon; Hwang, Min; Muljadi, Eduard

    2017-01-01

In an electric power grid that has a high penetration level of wind, the power fluctuation of a large-scale wind power plant (WPP) caused by varying wind speeds deteriorates the system frequency regulation. This paper proposes a power-smoothing scheme of a doubly-fed induction generator (DFIG) that significantly mitigates the system frequency fluctuation while preventing over-deceleration of the rotor speed. The proposed scheme employs an additional control loop relying on the system frequency deviation that operates in combination with the maximum power point tracking control loop. To improve the power … The results demonstrate that the proposed scheme significantly lessens the output power fluctuation of a WPP under various scenarios by modifying the gain with the rotor speed and frequency deviation, and thereby it can regulate the frequency deviation within a narrow range.

  4. Improving mouse controlling and movement for people with Parkinson's disease and involuntary tremor using adaptive path smoothing technique via B-spline.

    Science.gov (United States)

    Hashem, Seyed Yashar Bani; Zin, Nor Azan Mat; Yatim, Noor Faezah Mohd; Ibrahim, Norlinah Mohamed

    2014-01-01

Many input devices are available for interacting with computers, but the computer mouse is still the most popular device for interaction. People who suffer from involuntary tremor have difficulty using the mouse in the normal way. The target participants of this research were individuals who suffer from Parkinson's disease. Tremor in the limbs makes accurate mouse movements difficult or impossible without assistive technologies. This study explores a new assistive technique, adaptive path smoothing via B-spline (APSS), to enhance mouse control based on the user's tremor level and type. APSS uses mean filtering and B-splines to provide a smoothed mouse trajectory. Seven participants with unwanted tremor evaluated APSS. Results show that APSS is very promising and greatly increases their control of the computer mouse. Results of the user acceptance test also show that users perceived APSS as easy to use. They also believe it to be a useful tool and intend to use it once it is available. Future studies could explore the possibility of integrating APSS with an assistive pointing technique, such as the Bubble cursor or the Sticky target technique, to provide an all-in-one solution for motor-disabled users.
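The mean-filter-plus-B-spline pipeline can be sketched as follows. The fixed window size and sampling density stand in for APSS's tremor-adaptive parameters, so this is only an illustrative sketch of the idea, not the published implementation.

```python
def mean_filter(points, w=3):
    """Moving-average prefilter over 2D cursor samples."""
    out = []
    for i in range(len(points)):
        lo, hi = max(0, i - w // 2), min(len(points), i + w // 2 + 1)
        win = points[lo:hi]
        out.append((sum(p[0] for p in win) / len(win),
                    sum(p[1] for p in win) / len(win)))
    return out

def bspline_point(p0, p1, p2, p3, t):
    """Uniform cubic B-spline basis evaluated at t in [0, 1]; the four
    basis weights sum to 1, so the curve stays near the control points."""
    b0 = (1 - t) ** 3 / 6
    b1 = (3 * t ** 3 - 6 * t ** 2 + 4) / 6
    b2 = (-3 * t ** 3 + 3 * t ** 2 + 3 * t + 1) / 6
    b3 = t ** 3 / 6
    return tuple(b0 * a + b1 * b + b2 * c + b3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))

def smooth_path(points, samples_per_seg=4):
    """Mean-filter the raw samples, then trace a uniform cubic B-spline
    through the filtered control points."""
    pts = mean_filter(points)
    path = []
    for i in range(len(pts) - 3):
        for s in range(samples_per_seg):
            path.append(bspline_point(*pts[i:i + 4], s / samples_per_seg))
    return path
```

Because the B-spline curve is a convex combination of control points, jitter that survives the mean filter is further attenuated rather than amplified.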

  5. Seismic hazard in the Nation's breadbasket

    Science.gov (United States)

    Boyd, Oliver; Haller, Kathleen; Luco, Nicolas; Moschetti, Morgan P.; Mueller, Charles; Petersen, Mark D.; Rezaeian, Sanaz; Rubinstein, Justin L.

    2015-01-01

    The USGS National Seismic Hazard Maps were updated in 2014 and included several important changes for the central United States (CUS). Background seismicity sources were improved using a new moment-magnitude-based catalog; a new adaptive, nearest-neighbor smoothing kernel was implemented; and maximum magnitudes for background sources were updated. Areal source zones developed by the Central and Eastern United States Seismic Source Characterization for Nuclear Facilities project were simplified and adopted. The weighting scheme for ground motion models was updated, giving more weight to models with a faster attenuation with distance compared to the previous maps. Overall, hazard changes (2% probability of exceedance in 50 years, across a range of ground-motion frequencies) were smaller than 10% in most of the CUS relative to the 2008 USGS maps despite new ground motion models and their assigned logic tree weights that reduced the probabilistic ground motions by 5–20%.

  6. Smooth manifolds

    CERN Document Server

    Sinha, Rajnikant

    2014-01-01

This book offers an introduction to the theory of smooth manifolds, helping students to familiarize themselves with the tools they will need for mathematical research on smooth manifolds and differential geometry. The book primarily focuses on topics concerning differential manifolds, tangent spaces, multivariable differential calculus, topological properties of smooth manifolds, embedded submanifolds, Sard's theorem and the Whitney embedding theorem. It is clearly structured, amply illustrated and includes solved examples for all concepts discussed. Several difficult theorems have been broken into many lemmas and notes (equivalent to sub-lemmas) to enhance the readability of the book. Further, once a concept has been introduced, it recurs throughout the book to ensure comprehension. The rank theorem, a vital aspect of smooth manifolds theory, occurs in many manifestations, including the rank theorem for Euclidean space and the global rank theorem. Though primarily intended for graduate students of mathematics, the book ...

  7. Adaptive elimination of optical fiber transmission noise in fiber ocean bottom seismic system

    Science.gov (United States)

    Zhong, Qiuwen; Hu, Zhengliang; Cao, Chunyan; Dong, Hongsheng

    2017-10-01

In this paper, a pressure- and acceleration-insensitive reference interferometer (RI) is used to obtain the laser and common noise introduced by the transmission fiber and the laser. Using direct subtraction and adaptive filtering, this paper attempts to eliminate and estimate the transmission noise of the sensing interferometer (SI) probe. Four noise-suppression methods are compared: direct subtraction (DS), least-mean-square adaptive cancellation (LMS), normalized least-mean-square adaptive cancellation (NLMS), and recursive-least-squares (RLS) adaptive filtering. The experimental results show that the noise reduction of RLS and NLMS is almost the same, better than that of LMS and DS, and can reach 8 dB (@100 Hz). However, considering its computational load, RLS is not well suited to a real-time operating system; for the same performance, NLMS is more practical than RLS. The noise reduction of LMS is slightly worse than that of RLS and NLMS, about 6 dB (@100 Hz), but its computational complexity is small, which is beneficial for real-time implementation. The DS method has the lowest computational complexity, but its noise suppression is worse than that of the adaptive filters because of the difference in noise amplitude between the RI and the SI; only 4 dB (@100 Hz) can be reached. The adaptive filter essentially eliminates the influence of the transmission noise while keeping the simulated sensor signal intact.
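A generic NLMS noise-cancellation sketch in the spirit described above: the reference channel carries the transmission noise, and an adaptive FIR filter matches it to the noise component of the sensing channel before subtraction. The tap count, step size, and toy FIR noise path are illustrative assumptions, not the paper's configuration.

```python
import math
import random

def nlms_cancel(primary, reference, n_taps=8, mu=0.5, eps=1e-8):
    """Normalized LMS canceller: returns the primary channel minus the
    adaptively filtered reference (i.e. the cleaned signal estimate)."""
    w = [0.0] * n_taps
    cleaned = []
    for i in range(len(primary)):
        x = [reference[i - k] if i - k >= 0 else 0.0 for k in range(n_taps)]
        y = sum(wk * xk for wk, xk in zip(w, x))  # noise estimate
        e = primary[i] - y                        # error = cleaned sample
        norm = sum(xk * xk for xk in x) + eps     # normalization of the step
        w = [wk + mu * e * xk / norm for wk, xk in zip(w, x)]
        cleaned.append(e)
    return cleaned

# Toy setup: a weak sinusoidal "sensor" signal buried in fiber noise that
# reaches the sensing channel through a short FIR path.
rng = random.Random(0)
noise = [rng.gauss(0.0, 1.0) for _ in range(5000)]
signal = [0.1 * math.sin(0.05 * i) for i in range(5000)]
primary = [s + 0.8 * n + 0.3 * (noise[i - 1] if i > 0 else 0.0)
           for i, (s, n) in enumerate(zip(signal, noise))]
cleaned = nlms_cancel(primary, noise)
```

Normalizing the update by the input power is what makes NLMS robust to varying reference amplitude, which matches the paper's finding that it performs like RLS at far lower cost.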

  8. Surface smoothness

    DEFF Research Database (Denmark)

    Tummala, Sudhakar; Dam, Erik B.

    2010-01-01

… accuracy; such novel markers must therefore be validated against clinically meaningful end-goals such as the ability to allow correct diagnosis. We present a method for automatic cartilage surface smoothness quantification in the knee joint. The quantification is based on a curvature flow method … We demonstrate that the fully automatic markers eliminate the time required for radiologist annotations, and in addition provide a diagnostic marker superior to the evaluated semi-manual markers.

  9. Seismic testing

    International Nuclear Information System (INIS)

    Sollogoub, Pierre

    2001-01-01

    This lecture deals with: qualification methods for seismic testing; objectives of seismic testing; seismic testing standards including examples; main content of standard; testing means; and some important elements of seismic testing

  10. Seismic Symphonies

    Science.gov (United States)

    Strinna, Elisa; Ferrari, Graziano

    2015-04-01

The project started in 2008 as a sound installation, a collaboration between an artist, a barrel organ builder and a seismologist. The work differs from other attempts at sound transposition of seismic records: in this case, seismic frequencies are not converted automatically into the "sound of the earthquake." Instead, a musical translation system was devised that, based on the organ's tonal scale, generates a totally unexpected sequence of sounds intended to evoke the emotions aroused by the earthquake. The symphonies proposed in the project have somewhat peculiar origins: they come to life from the translation of graphic tracks into a sound track. The graphic tracks in question are copies of seismograms recorded during earthquakes that have taken place around the world. Seismograms are translated into music by a sculpture-instrument, half seismograph and half barrel organ. The organ plays through holes punched in paper; to adapt the documents to the instrument's score, the holes were punched at the peaks of the waves. The organ covers about three tonal scales; starting from heavy, deep sounds, it reaches up to high, jarring notes. The translation of the seismic records is based on a criterion that matches the highest sounds to the larger amplitudes and the lower ones to the smaller. In the organ score, the larger the amplitude of the recorded waves, the more the seismogram covers the full tonal scale played by the barrel organ, and the notes arouse a more intense emotional response in the listener. Elisa Strinna's Seismic Symphonies installation becomes an unprecedented tool for emotional involvement, through which the memory of the greatest disasters of more than a century of the Earth's seismic history can be revived. A bridge between art and science. Seismic Symphonies is also a symbolic inversion: the organ is most commonly used in churches, and its sounds are derived from the heavens and

  11. Seismic Technology Adapted to Analyzing and Developing Geothermal Systems Below Surface-Exposed High-Velocity Rocks Final Report

    Energy Technology Data Exchange (ETDEWEB)

Hardage, Bob A. [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; DeAngelo, Michael V. [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Ermolaeva, Elena [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Remington, Randy [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Sava, Diana [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Wagner, Donald [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology; Wei, Shuijion [Univ. of Texas, Austin, TX (United States). Bureau of Economic Geology

    2013-02-01

    The objective of our research was to develop and demonstrate seismic data-acquisition and data-processing technologies that allow geothermal prospects below high-velocity rock outcrops to be evaluated. To do this, we acquired a 3-component seismic test line across an area of exposed high-velocity rocks in Brewster County, Texas, where there is high heat flow and surface conditions mimic those found at numerous geothermal prospects. Seismic contractors have not succeeded in creating good-quality seismic data in this area for companies who have acquired data for oil and gas exploitation purposes. Our test profile traversed an area where high-velocity rocks and low-velocity sediment were exposed on the surface in alternating patterns that repeated along the test line. We verified that these surface conditions cause non-ending reverberations of Love waves, Rayleigh waves, and shallow critical refractions to travel across the earth surface between the boundaries of the fast-velocity and slow-velocity material exposed on the surface. These reverberating surface waves form the high level of noise in this area that does not allow reflections from deep interfaces to be seen and utilized. Our data-acquisition method of deploying a box array of closely spaced geophones allowed us to recognize and evaluate these surface-wave noise modes regardless of the azimuth direction to the surface anomaly that backscattered the waves and caused them to return to the test-line profile. With this knowledge of the surface-wave noise, we were able to process these test-line data to create P-P and SH-SH images that were superior to those produced by a skilled seismic data-processing contractor. Compared to the P-P data acquired along the test line, the SH-SH data provided a better detection of faults and could be used to trace these faults upward to the boundaries of exposed surface rocks. We expanded our comparison of the relative value of S-wave and P-wave seismic data for geothermal

  12. Seismic hazard in the eastern United States

    Science.gov (United States)

    Mueller, Charles; Boyd, Oliver; Petersen, Mark D.; Moschetti, Morgan P.; Rezaeian, Sanaz; Shumway, Allison

    2015-01-01

    The U.S. Geological Survey seismic hazard maps for the central and eastern United States were updated in 2014. We analyze results and changes for the eastern part of the region. Ratio maps are presented, along with tables of ground motions and deaggregations for selected cities. The Charleston fault model was revised, and a new fault source for Charlevoix was added. Background seismicity sources utilized an updated catalog, revised completeness and recurrence models, and a new adaptive smoothing procedure. Maximum-magnitude models and ground motion models were also updated. Broad, regional hazard reductions of 5%–20% are mostly attributed to new ground motion models with stronger near-source attenuation. The revised Charleston fault geometry redistributes local hazard, and the new Charlevoix source increases hazard in northern New England. Strong increases in mid- to high-frequency hazard at some locations—for example, southern New Hampshire, central Virginia, and eastern Tennessee—are attributed to updated catalogs and/or smoothing.

  13. The 2014 update to the National Seismic Hazard Model in California

    Science.gov (United States)

    Powers, Peter; Field, Edward H.

    2015-01-01

    The 2014 update to the U. S. Geological Survey National Seismic Hazard Model in California introduces a new earthquake rate model and new ground motion models (GMMs) that give rise to numerous changes to seismic hazard throughout the state. The updated earthquake rate model is the third version of the Uniform California Earthquake Rupture Forecast (UCERF3), wherein the rates of all ruptures are determined via a self-consistent inverse methodology. This approach accommodates multifault ruptures and reduces the overprediction of moderate earthquake rates exhibited by the previous model (UCERF2). UCERF3 introduces new faults, changes to slip or moment rates on existing faults, and adaptively smoothed gridded seismicity source models, all of which contribute to significant changes in hazard. New GMMs increase ground motion near large strike-slip faults and reduce hazard over dip-slip faults. The addition of very large strike-slip ruptures and decreased reverse fault rupture rates in UCERF3 further enhances these effects.

  14. Adaptation.

    Science.gov (United States)

    Broom, Donald M

    2006-01-01

    The term adaptation is used in biology in three different ways. It may refer to changes which occur at the cell and organ level, or at the individual level, or at the level of gene action and evolutionary processes. Adaptation by cells, especially nerve cells, helps in communication within the body, the distinguishing of stimuli, the avoidance of overload and the conservation of energy. The time course and complexity of these mechanisms vary. Adaptive characters of organisms, including adaptive behaviours, increase fitness, so this adaptation is evolutionary. The major part of this paper concerns adaptation by individuals and its relationship to welfare. In complex animals, feed-forward control is widely used. Individuals predict problems and adapt by acting before the environmental effect is substantial. Much of adaptation involves brain control, and animals have a set of needs, located in the brain and acting largely via motivational mechanisms, to regulate life. Needs may be for resources but are also for actions and stimuli which are part of the mechanism which has evolved to obtain the resources. Hence pigs do not just need food but need to be able to carry out actions like rooting in earth or manipulating materials which are part of foraging behaviour. The welfare of an individual is its state as regards its attempts to cope with its environment. This state includes various adaptive mechanisms, including feelings and those which cope with disease. The part of welfare which is concerned with coping with pathology is health. Disease, which implies some significant effect of pathology, always results in poor welfare. Welfare varies over a range from very good, when adaptation is effective and there are feelings of pleasure or contentment, to very poor. A key point concerning the concept of individual adaptation in relation to welfare is that welfare may be good or poor while adaptation is occurring. Some adaptation is very easy and energetically cheap and

  15. Adaptation

    International Development Research Centre (IDRC) Digital Library (Canada)

    building skills, knowledge or networks on adaptation, ... the African partners leading the AfricaAdapt network, together with the UK-based Institute of Development Studies; and ... UNCCD Secretariat, Regional Coordination Unit for Africa, Tunis, Tunisia .... 26 Rural–urban Cooperation on Water Management in the Context of.

  16. Adapt

    Science.gov (United States)

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored by using the Common Data File (CDF) format served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. Then, the CDAWEB and SPDF data repositories were queried subsequently on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted

  17. Bayesian Exponential Smoothing.

    OpenAIRE

    Forbes, C.S.; Snyder, R.D.; Shami, R.S.

    2000-01-01

    In this paper, a Bayesian version of the exponential smoothing method of forecasting is proposed. The approach is based on a state space model containing only a single source of error for each time interval. This model allows us to improve current practices surrounding exponential smoothing by providing both point predictions and measures of the uncertainty surrounding them.
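    The single-source-of-error idea can be illustrated with ordinary (non-Bayesian) simple exponential smoothing, where one error term drives both the observation and the level update. A minimal sketch, assuming a fixed smoothing parameter `alpha` (the Bayesian approach in the paper would instead treat such quantities probabilistically):

```python
# Simple exponential smoothing in its single-source-of-error state-space form:
#   y_t = l_{t-1} + e_t,   l_t = l_{t-1} + alpha * e_t
# One error term e_t drives both the observation and the level update.

def exponential_smoothing(y, alpha=0.3):
    """Return the one-step-ahead forecasts for the series y."""
    level = y[0]                        # initialize the level at the first observation
    forecasts = []
    for obs in y:
        forecasts.append(level)         # forecast made before seeing obs
        level += alpha * (obs - level)  # update the level with the single error term
    return forecasts

series = [10.0, 12.0, 11.0, 13.0, 12.5]
print(exponential_smoothing(series))
```

The series values and `alpha` here are illustrative only; the paper's contribution is to supplement such point predictions with measures of uncertainty.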

  18. Smooth polyhedral surfaces

    KAUST Repository

    Günther, Felix; Jiang, Caigui; Pottmann, Helmut

    2017-01-01

    Polyhedral surfaces are fundamental objects in architectural geometry and industrial design. Whereas closeness of a given mesh to a smooth reference surface and its suitability for numerical simulations were already studied extensively, the aim of our work is to find and to discuss suitable assessments of smoothness of polyhedral surfaces that only take the geometry of the polyhedral surface itself into account. Motivated by analogies to classical differential geometry, we propose a theory of smoothness of polyhedral surfaces including suitable notions of normal vectors, tangent planes, asymptotic directions, and parabolic curves that are invariant under projective transformations. It is remarkable that seemingly mild conditions significantly limit the shapes of faces of a smooth polyhedral surface. Besides being of theoretical interest, we believe that smoothness of polyhedral surfaces is of interest in the architectural context, where vertices and edges of polyhedral surfaces are highly visible.

  19. Smooth polyhedral surfaces

    KAUST Repository

    Günther, Felix

    2017-03-15

    Polyhedral surfaces are fundamental objects in architectural geometry and industrial design. Whereas closeness of a given mesh to a smooth reference surface and its suitability for numerical simulations were already studied extensively, the aim of our work is to find and to discuss suitable assessments of smoothness of polyhedral surfaces that only take the geometry of the polyhedral surface itself into account. Motivated by analogies to classical differential geometry, we propose a theory of smoothness of polyhedral surfaces including suitable notions of normal vectors, tangent planes, asymptotic directions, and parabolic curves that are invariant under projective transformations. It is remarkable that seemingly mild conditions significantly limit the shapes of faces of a smooth polyhedral surface. Besides being of theoretical interest, we believe that smoothness of polyhedral surfaces is of interest in the architectural context, where vertices and edges of polyhedral surfaces are highly visible.

  20. Seismic Ecology

    Science.gov (United States)

    Seleznev, V. S.; Soloviev, V. M.; Emanov, A. F.

    The paper is devoted to research on the influence of seismic actions on industrial and civil buildings and on people. Seismic actions influence people either directly (vibration, force shocks during earthquakes) or indirectly through various buildings and constructions, and can be strong (felt by people) or weak (registered only by sensing devices). A great number of works are devoted to the influence of violent seismic actions (first of all, earthquakes) on people and various constructions. This work is devoted to the study of weak but prolonged seismic actions on various buildings and people. Seismic oscillations acting on a territory need to be taken into account when constructing buildings in urbanized territories. Besides violent earthquakes, man-caused seismic actions can exert an essential influence: explosions, seismic noise emitted by plant facilities and moving transport, radiation from high-rise buildings and constructions under the action of wind, etc. Materials on the increase of man-caused seismicity in a number of regions of Russia that earlier were not seismic are presented in the paper. Along with maps of seismic microzoning, maps should be built indicating the variation of amplitude spectra of seismic noise within days, months, and years. Information about the amplitudes and frequencies of oscillations from possible earthquakes and man-caused sources in concrete regions allows sound design and construction of industrial and civil housing projects. The construction of buildings, even in regions that are not seismically dangerous, can end in failure of these buildings and the heaviest consequences for people if one of their resonance frequencies coincides in magnitude with the frequency of oscillations emitted in that place by man-caused objects. Practical examples of detailed engineering-seismological investigation of large industrial and civil housing projects on the territory of Siberia (hydro power

  1. Smoothness of limit functors

    Indian Academy of Sciences (India)

    Abstract. Let S be a scheme. Assume that we are given an action of the one-dimensional split torus Gm,S on a smooth affine S-scheme X. We consider the limit (also called attractor) subfunctor Xλ consisting of points whose orbit under the given action 'admits a limit at 0'. We show that Xλ is representable by a smooth ...

  2. Deterministic seismic hazard macrozonation of India

    Indian Academy of Sciences (India)

    The seismotectonic map of the study area was prepared by considering the faults, lineaments and shear zones that are associated with earthquakes of magnitude 4 and above. A new program was developed in MATLAB for smoothing of the point sources. For assessing the seismic hazard, the study area was divided ...

  3. Revealed smooth nontransitive preferences

    DEFF Research Database (Denmark)

    Keiding, Hans; Tvede, Mich

    2013-01-01

    In the present paper, we are concerned with the behavioural consequences of consumers having nontransitive preference relations. Data sets consist of finitely many observations of price vectors and consumption bundles. A preference relation rationalizes a data set provided that for every observed consumption bundle, all strictly preferred bundles are more expensive than the observed bundle. Our main result is that data sets can be rationalized by a smooth nontransitive preference relation if and only if prices can be normalized such that the law of demand is satisfied. Market data sets consist of finitely many observations of price vectors, lists of individual incomes and aggregate demands. We apply our main result to characterize market data sets consistent with equilibrium behaviour of pure-exchange economies with smooth nontransitive consumers.
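    The law-of-demand condition in the main result can be stated pairwise: for suitably normalized prices, (p_s - p_t) · (x_s - x_t) ≤ 0 for every pair of observations (s, t). A minimal sketch of such a check, with hypothetical two-good data (the price-normalization step itself is omitted):

```python
def satisfies_law_of_demand(prices, bundles):
    """Check (p_s - p_t) . (x_s - x_t) <= 0 for all pairs of observations."""
    n = len(prices)
    for s in range(n):
        for t in range(s + 1, n):
            dp = [a - b for a, b in zip(prices[s], prices[t])]
            dx = [a - b for a, b in zip(bundles[s], bundles[t])]
            if sum(p * x for p, x in zip(dp, dx)) > 0:
                return False
    return True

# Hypothetical data: the price of good 1 rises and its demand falls.
prices  = [(1.0, 1.0), (2.0, 1.0)]
bundles = [(3.0, 2.0), (1.0, 2.5)]
print(satisfies_law_of_demand(prices, bundles))  # True for this data
```

By the paper's theorem, data passing this check (after normalization) admit a smooth nontransitive rationalization.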

  4. Generalizing smooth transition autoregressions

    DEFF Research Database (Denmark)

    Chini, Emilio Zanetti

    We introduce a variant of the smooth transition autoregression - the GSTAR model - capable of parametrizing the asymmetry in the tails of the transition equation by using a particular generalization of the logistic function. A General-to-Specific modelling strategy is discussed in detail, with particular emphasis on two different LM-type tests for the null of symmetric adjustment towards a new regime and three diagnostic tests, whose power properties are explored via Monte Carlo experiments. Four classical real datasets illustrate the empirical properties of the GSTAR, jointly to a rolling ...

  5. Seismic Studies

    Energy Technology Data Exchange (ETDEWEB)

    R. Quittmeyer

    2006-09-25

    This technical work plan (TWP) describes the efforts to develop and confirm seismic ground motion inputs used for preclosure design and probabilistic safety analyses and to assess the postclosure performance of a repository at Yucca Mountain, Nevada. As part of the effort to develop seismic inputs, the TWP covers testing and analyses that provide the technical basis for inputs to the seismic ground-motion site-response model. The TWP also addresses preparation of a seismic methodology report for submission to the U.S. Nuclear Regulatory Commission (NRC). The activities discussed in this TWP are planned for fiscal years (FY) 2006 through 2008. Some of the work enhances the technical basis for previously developed seismic inputs and reduces uncertainties and conservatism used in previous analyses and modeling. These activities support the defense of a license application. Other activities provide new results that will support development of the preclosure safety case; these results directly support and will be included in the license application. Table 1 indicates which activities support the license application and which support licensing defense. The activities are listed in Section 1.2; the methods and approaches used to implement them are discussed in more detail in Section 2.2. Technical and performance objectives of this work scope are: (1) For annual ground motion exceedance probabilities appropriate for preclosure design analyses, provide site-specific seismic design acceleration response spectra for a range of damping values; strain-compatible soil properties; peak motions, strains, and curvatures as a function of depth; and time histories (acceleration, velocity, and displacement). Provide seismic design inputs for the waste emplacement level and for surface sites. Results should be consistent with the probabilistic seismic hazard analysis (PSHA) for Yucca Mountain and reflect, as appropriate, available knowledge on the limits to extreme ground

  6. Seismic Studies

    International Nuclear Information System (INIS)

    R. Quittmeyer

    2006-01-01

    This technical work plan (TWP) describes the efforts to develop and confirm seismic ground motion inputs used for preclosure design and probabilistic safety analyses and to assess the postclosure performance of a repository at Yucca Mountain, Nevada. As part of the effort to develop seismic inputs, the TWP covers testing and analyses that provide the technical basis for inputs to the seismic ground-motion site-response model. The TWP also addresses preparation of a seismic methodology report for submission to the U.S. Nuclear Regulatory Commission (NRC). The activities discussed in this TWP are planned for fiscal years (FY) 2006 through 2008. Some of the work enhances the technical basis for previously developed seismic inputs and reduces uncertainties and conservatism used in previous analyses and modeling. These activities support the defense of a license application. Other activities provide new results that will support development of the preclosure safety case; these results directly support and will be included in the license application. Table 1 indicates which activities support the license application and which support licensing defense. The activities are listed in Section 1.2; the methods and approaches used to implement them are discussed in more detail in Section 2.2. Technical and performance objectives of this work scope are: (1) For annual ground motion exceedance probabilities appropriate for preclosure design analyses, provide site-specific seismic design acceleration response spectra for a range of damping values; strain-compatible soil properties; peak motions, strains, and curvatures as a function of depth; and time histories (acceleration, velocity, and displacement). Provide seismic design inputs for the waste emplacement level and for surface sites.
Results should be consistent with the probabilistic seismic hazard analysis (PSHA) for Yucca Mountain and reflect, as appropriate, available knowledge on the limits to extreme ground motion at

  7. Seismic protection

    International Nuclear Information System (INIS)

    Herbert, R.

    1988-01-01

    To ensure that a nuclear reactor or other damage-susceptible installation is, so far as possible, tripped and already shut down before an earthquake shock arrives at its location, a ring of monitoring seismic sensors is provided around it. Each sensor is spaced from the installation by a distance (possibly several kilometres) such that, taking into account the seismic-shock propagation velocity through the intervening ground, a shock monitored by the sensor and then advancing to the installation site will arrive there later than a warning signal emitted by the sensor and received at the installation, by an interval sufficient to allow the installation to trip and shut down, or otherwise assume an optimum anti-seismic mode, in response to the warning signal. Extra sensors located in boreholes may define effectively a three-dimensional (hemispherical) sensing boundary rather than a mere two-dimensional ring. (author)

  8. Induced Seismicity

    Science.gov (United States)

    Keranen, Katie M.; Weingarten, Matthew

    2018-05-01

    The ability of fluid-generated subsurface stress changes to trigger earthquakes has long been recognized. However, the dramatic rise in the rate of human-induced earthquakes in the past decade has created abundant opportunities to study induced earthquakes and triggering processes. This review briefly summarizes early studies but focuses on results from induced earthquakes during the past 10 years related to fluid injection in petroleum fields. Study of these earthquakes has resulted in insights into physical processes and has identified knowledge gaps and future research directions. Induced earthquakes are challenging to identify using seismological methods, and faults and reefs strongly modulate spatial and temporal patterns of induced seismicity. However, the similarity of induced and natural seismicity provides an effective tool for studying earthquake processes. With continuing development of energy resources, increased interest in carbon sequestration, and construction of large dams, induced seismicity will continue to pose a hazard in coming years.

  9. Smooth Phase Interpolated Keying

    Science.gov (United States)

    Borah, Deva K.

    2007-01-01

    Smooth phase interpolated keying (SPIK) is an improved method of computing smooth phase-modulation waveforms for radio communication systems that convey digital information. SPIK is applicable to a variety of phase-shift-keying (PSK) modulation schemes, including quaternary PSK (QPSK), octonary PSK (8PSK), and 16PSK. In comparison with a related prior method, SPIK offers advantages of better performance and less complexity of implementation. In a PSK scheme, the underlying information waveform that one seeks to convey consists of discrete rectangular steps, but the spectral width of such a waveform is excessive for practical radio communication. Therefore, the problem is to smooth the step phase waveform in such a manner as to maintain power and bandwidth efficiency without incurring an unacceptably large error rate and without introducing undesired variations in the amplitude of the affected radio signal. Although the ideal constellation of PSK phasor points does not cause amplitude variations, filtering of the modulation waveform (in which, typically, a rectangular pulse is converted to a square-root raised cosine pulse) causes amplitude fluctuations. If a power-efficient nonlinear amplifier is used in the radio communication system, the fluctuating-amplitude signal can undergo significant spectral regrowth, thus compromising the bandwidth efficiency of the system. In the related prior method, one seeks to solve the problem in a procedure that comprises two major steps: phase-value generation and phase interpolation. SPIK follows the two-step approach of the related prior method, but the details of the steps are different. In the phase-value-generation step, the phase values of symbols in the PSK constellation are determined by a phase function that is said to be maximally smooth and that is chosen to minimize the spectral spread of the modulated signal. 
In this step, the constellation is divided into two groups by assigning, to information symbols, phase values

  10. Anti-smooth muscle antibody

    Science.gov (United States)

    Anti-smooth muscle antibody is a blood test that detects the presence ...

  11. Smooth functors vs. differential forms

    NARCIS (Netherlands)

    Schreiber, U.; Waldorf, K.

    2011-01-01

    We establish a relation between smooth 2-functors defined on the path 2-groupoid of a smooth manifold and differential forms on this manifold. This relation can be understood as a part of a dictionary between fundamental notions from category theory and differential geometry. We show that smooth

  12. Seismic hazard estimation based on the distributed seismicity in northern China

    Science.gov (United States)

    Yang, Yong; Shi, Bao-Ping; Sun, Liang

    2008-03-01

    In this paper, we propose an alternative seismic hazard model based on distributed seismicity. The distributed seismicity model does not need delineation of seismic source zones, which simplifies the methodology of probabilistic seismic hazard analysis. Based on the catalogue of devastating earthquakes, we established three seismicity models, derived the distribution of a-values in northern China by using a Gaussian smoothing function, and calculated peak ground acceleration distributions for this area with 2%, 5% and 10% probability of exceedance in a 50-year period using three attenuation models, respectively. In general, the peak ground motion distribution patterns are consistent with the current seismic hazard map of China, but in some specific seismic zones, including Shanxi Province and the Shijiazhuang area, our results indicate somewhat higher peak ground motions and zonation characteristics that agree with the seismicity distribution patterns in these areas. Hazard curves have been developed for Beijing, Tianjin, Taiyuan, Tangshan, and Ji'nan, the metropolitan cities of northern China. The results show that Tangshan, Taiyuan, and Beijing have a higher seismic hazard than the other cities mentioned above.
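    The Gaussian smoothing step described above can be sketched as a kernel-weighted average of gridded earthquake counts, in the spirit of distributed-seismicity hazard models. A minimal illustration, assuming counts already binned on a grid and a correlation distance `c` measured in cells (the paper's actual grid and parameters are not specified here):

```python
import math

def gaussian_smooth(counts, c=1.0):
    """Smooth gridded earthquake counts with a normalized Gaussian kernel.
    counts: 2D list of event counts per cell; c: correlation distance in cells."""
    ny, nx = len(counts), len(counts[0])
    smoothed = [[0.0] * nx for _ in range(ny)]
    for i in range(ny):
        for j in range(nx):
            # Gaussian weight of every cell (k, l) relative to cell (i, j)
            weights = [[math.exp(-((i - k) ** 2 + (j - l) ** 2) / c ** 2)
                        for l in range(nx)] for k in range(ny)]
            norm = sum(sum(row) for row in weights)
            smoothed[i][j] = sum(weights[k][l] * counts[k][l]
                                 for k in range(ny) for l in range(nx)) / norm
    return smoothed
```

The smoothed counts would then feed the a-value estimate in each cell; a uniform field is left unchanged by this kernel, while isolated clusters are spread over their neighbours.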

  13. Exponential smoothing weighted correlations

    Science.gov (United States)

    Pozzi, F.; Di Matteo, T.; Aste, T.

    2012-06-01

    In many practical applications, correlation matrices might be affected by the "curse of dimensionality" and by an excessive sensitiveness to outliers and remote observations. These shortcomings can cause problems of statistical robustness especially accentuated when a system of dynamic correlations over a running window is concerned. These drawbacks can be partially mitigated by assigning a structure of weights to observational events. In this paper, we discuss Pearson's ρ and Kendall's τ correlation matrices, weighted with an exponential smoothing, computed on moving windows using a data-set of daily returns for 300 NYSE highly capitalized companies in the period between 2001 and 2003. Criteria for jointly determining optimal weights together with the optimal length of the running window are proposed. We find that the exponential smoothing can provide more robust and reliable dynamic measures and we discuss that a careful choice of the parameters can reduce the autocorrelation of dynamic correlations whilst keeping significance and robustness of the measure. Weighted correlations are found to be smoother and recovering faster from market turbulence than their unweighted counterparts, helping also to discriminate more effectively genuine from spurious correlations.
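    An exponentially weighted Pearson correlation of the kind discussed can be sketched as follows. The weight decay parameter `theta` and the toy series are assumptions for illustration, not the paper's calibrated values:

```python
import math

def exp_weights(n, theta):
    """Exponential weights w_t proportional to exp((t - n)/theta), summing to 1."""
    w = [math.exp((t - n) / theta) for t in range(1, n + 1)]
    s = sum(w)
    return [x / s for x in w]

def weighted_corr(x, y, w):
    """Pearson correlation with observation weights w (summing to 1)."""
    mx = sum(wi * xi for wi, xi in zip(w, x))
    my = sum(wi * yi for wi, yi in zip(w, y))
    cov = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y))
    vx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x))
    vy = sum(wi * (yi - my) ** 2 for wi, yi in zip(w, y))
    return cov / math.sqrt(vx * vy)

w = exp_weights(5, theta=2.0)
# Perfectly linear data: the weighted correlation is 1 regardless of the weights.
print(weighted_corr([1, 2, 3, 4, 5], [2, 4, 6, 8, 10], w))
```

Recent observations receive the largest weights, which is what makes the dynamic correlation recover faster from market turbulence than the unweighted version.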

  14. Evaluation of induced seismicity forecast models in the Induced Seismicity Test Bench

    Science.gov (United States)

    Király, Eszter; Gischig, Valentin; Zechar, Jeremy; Doetsch, Joseph; Karvounis, Dimitrios; Wiemer, Stefan

    2016-04-01

    Induced earthquakes often accompany fluid injection, and the seismic hazard they pose threatens various underground engineering projects. Models to monitor and control induced seismic hazard with traffic light systems should be probabilistic, forward-looking, and updated as new data arrive. Here, we propose an Induced Seismicity Test Bench to test and rank such models. We apply the test bench to data from the Basel 2006 and Soultz-sous-Forêts 2004 geothermal stimulation projects, and we assess forecasts from two models that incorporate a different mix of physical understanding and stochastic representation of the induced sequences: Shapiro in Space (SiS) and Hydraulics and Seismics (HySei). SiS is based on three pillars: the seismicity rate is computed with help of the seismogenic index and a simple exponential decay of the seismicity; the magnitude distribution follows the Gutenberg-Richter relation; and seismicity is distributed in space based on smoothing seismicity during the learning period with 3D Gaussian kernels. The HySei model describes seismicity triggered by pressure diffusion with irreversible permeability enhancement. Our results show that neither model is fully superior to the other. HySei forecasts the seismicity rate well, but is only mediocre at forecasting the spatial distribution. On the other hand, SiS forecasts the spatial distribution well but not the seismicity rate. The shut-in phase is a difficult moment for both models in both reservoirs: the models tend to underpredict the seismicity rate around, and shortly after, shut-in. Ensemble models that combine HySei's rate forecast with SiS's spatial forecast outperform each individual model.
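    The ensemble construction in the final sentence — one model's seismicity-rate forecast distributed according to another model's spatial forecast — can be sketched as follows (hypothetical inputs; the HySei and SiS models themselves are far richer):

```python
def ensemble_forecast(total_rate, spatial_pmf):
    """Combine a scalar rate forecast (HySei-style) with a spatial probability
    map (SiS-style) into expected event counts per grid cell."""
    norm = sum(spatial_pmf)                  # guard against an unnormalized map
    return [total_rate * p / norm for p in spatial_pmf]

# Hypothetical: 40 expected events distributed over four cells.
print(ensemble_forecast(40.0, [0.1, 0.2, 0.3, 0.4]))
```

The combined forecast inherits its total from the rate model and its shape from the spatial model, which is why it can outperform either model alone.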

  15. Seismic Discrimination

    Science.gov (United States)

    1979-09-30

    were presumed nuclear explosions announced by ERDA. Of the last, 11 were at the Semipalatinsk test site, 2 at the Western Kazakh test site, 2 in Novaya ... which will fulfill U.S. obligations that may be incurred under a possible future Comprehensive Test Ban Treaty. This report includes 9 contributions ... which could assume U.S. seismic-data-management responsibilities in the event that international agreement is reached on a Comprehensive Test Ban

  16. Martian seismicity

    International Nuclear Information System (INIS)

    Goins, N.R.; Lazarewicz, A.R.

    1979-01-01

    During the Viking mission to Mars, the seismometer on Lander II collected approximately 0.24 Earth years of observational data, excluding periods of time dominated by wind-induced Lander vibration. The "quiet-time" data set contains no confirmed seismic events. A proper assessment of the significance of this fact requires quantitative estimates of the expected detection rate of the Viking seismometer. The first step is to calculate the minimum magnitude event detectable at a given distance, including the effects of geometric spreading, anelastic attenuation, seismic signal duration, seismometer frequency response, and possible poor ground coupling. Assuming various numerical quantities and a Martian seismic activity comparable to that of intraplate earthquakes, the appropriate integral gives an expected annual detection rate of 10 events, nearly all of which are local. Thus only two to three events would be expected in the observational period presently on hand, and the lack of observed events is not in gross contradiction to reasonable expectations. Given the same assumptions, a seismometer 20 times more sensitive than the present instrument would be expected to detect about 120 events annually

  17. Smooth functions statistics

    International Nuclear Information System (INIS)

    Arnold, V.I.

    2006-03-01

    To describe the topological structure of a real smooth function one associates to it the graph, formed by the topological variety whose points are the connected components of the level hypersurfaces of the function. For a Morse function, such a graph is a tree. Generically, it has T triple vertices, T + 2 endpoints, 2T + 2 vertices and 2T + 1 arrows. The main goal of the present paper is to study the statistics of the graphs corresponding to T triple points: what is the growth rate of the number φ(T) of different graphs? Which part of these graphs is representable by polynomial functions of corresponding degree? A generic polynomial of degree n has at most (n - 1)^2 critical points on R^2, corresponding to 2T + 2 = (n - 1)^2 + 1, that is to T = 2k(k - 1) saddle points for degree n = 2k

  18. Viscoplastic augmentation of the smooth cap model

    International Nuclear Information System (INIS)

    Schwer, Leonard E.

    1994-01-01

    The most common numerical viscoplastic implementations are formulations attributed to Perzyna. Although Perzyna-type algorithms are popular, they have several disadvantages relating to the lack of enforcement of the consistency condition in plasticity. The present work adapts a relatively unknown viscoplastic formulation attributed to Duvaut and Lions and generalized to multi-surface plasticity by Simo et al. The attraction of the Duvaut-Lions formulation is its ease of numerical implementation in existing elastoplastic algorithms. The present work provides a motivation for the Duvaut-Lions viscoplastic formulation, derivation of the algorithm and comparison with the Perzyna algorithm. A simple uniaxial strain numerical simulation is used to compare the results of the Duvaut-Lions algorithm, as adapted to the DYNA3D smooth cap model, with results from a Perzyna algorithm adapted by Katona and Muleret to an implicit code. (orig.)

  19. Classification of smooth Fano polytopes

    DEFF Research Database (Denmark)

    Øbro, Mikkel

A simplicial lattice polytope containing the origin in the interior is called a smooth Fano polytope if the vertices of every facet form a basis of the lattice. The study of smooth Fano polytopes is motivated by their connection to toric varieties. The thesis concerns the classification of smooth Fano polytopes up to isomorphism. The number of vertices of a smooth Fano d-polytope is bounded in terms of the dimension d; in the extremal case an explicit classification is known, and the thesis contains the classification in the near-extremal case. Complete classifications of smooth Fano d-polytopes previously existed only for small fixed d. In the thesis an algorithm for the classification of smooth Fano d-polytopes for any given d is presented. The algorithm has been implemented and used to obtain the complete classification in low dimensions.

  20. Innovations in seismic tomography, their applications and induced seismic events in carbon sequestration

    Science.gov (United States)

    Li, Peng

This dissertation presents two innovations in seismic tomography and a new discovery of induced seismic events associated with CO2 injection at an Enhanced Oil Recovery (EOR) site. The following are brief introductions to these three works. The first innovative work is adaptive ambient seismic noise tomography (AANT). Traditional ambient noise tomography methods using regular grid nodes are often ill-posed because the inversion grids do not always represent the distribution of ray paths. Large grid spacing is usually used to reduce the number of inversion parameters, which may not be able to resolve small-scale velocity structure. We present a new adaptive tomography method with irregular grids that provides a few advantages over the traditional methods. First, irregular grids with different sizes and shapes can fit the ray distribution better, and the traditionally ill-posed problem can become more stable owing to the different parameterization. Second, the data in areas with dense ray sampling are utilized sufficiently, so that the model resolution can be greatly improved. Both synthetic and real data are used to test the newly developed tomography algorithm. In synthetic data tests, we compare the resolution and stability of the traditional and adaptive methods. The results show that adaptive tomography is more stable and performs better in improving the resolution in areas with dense ray sampling. For real data, we extract the ambient noise signals of the seismic data near the Garlock Fault region, obtained from the Southern California Earthquake Data Center. The resulting group velocity of Rayleigh waves is well correlated with the geological structures. High velocity anomalies are shown in the cold southern Sierra Nevada, the Tehachapi Mountains and the Western San Gabriel Mountains. The second innovative work is local earthquake tomography with full topography (LETFT). In this work, we develop a new three-dimensional local earthquake tomography
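The irregular-grid idea can be illustrated with a simple quadtree refinement driven by ray sampling density. This is a generic sketch under assumed names and thresholds, not the dissertation's AANT parameterization:

```python
import numpy as np

def adapt_cells(ray_xy, xmin, xmax, ymin, ymax, max_rays=50, min_size=1.0):
    """Recursively split a cell while it contains many ray samples.

    Returns a list of (xmin, xmax, ymin, ymax) inversion cells whose
    sizes follow the ray-path sampling density: small cells where
    sampling is dense, large cells where it is sparse.
    """
    inside = ray_xy[(ray_xy[:, 0] >= xmin) & (ray_xy[:, 0] < xmax) &
                    (ray_xy[:, 1] >= ymin) & (ray_xy[:, 1] < ymax)]
    if len(inside) <= max_rays or (xmax - xmin) <= min_size:
        return [(xmin, xmax, ymin, ymax)]
    xm, ym = 0.5 * (xmin + xmax), 0.5 * (ymin + ymax)
    cells = []
    for x0, x1 in ((xmin, xm), (xm, xmax)):
        for y0, y1 in ((ymin, ym), (ym, ymax)):
            cells += adapt_cells(inside, x0, x1, y0, y1, max_rays, min_size)
    return cells

# Dense sampling near the center of a 100 x 100 km area, sparse elsewhere.
rng = np.random.default_rng(0)
rays = np.vstack([rng.normal(50, 5, (2000, 2)),      # dense cluster
                  rng.uniform(0, 100, (200, 2))])    # sparse background
cells = adapt_cells(rays, 0.0, 100.0, 0.0, 100.0)
print(len(cells), "inversion cells")
```

The resulting cell list replaces a regular grid in the inversion: densely sampled regions get many small cells (higher resolution), sparse regions a few large ones (better conditioning).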

  1. SmoothMoves : Smooth pursuits head movements for augmented reality

    NARCIS (Netherlands)

    Esteves, Augusto; Verweij, David; Suraiya, Liza; Islam, Rasel; Lee, Youryang; Oakley, Ian

    2017-01-01

    SmoothMoves is an interaction technique for augmented reality (AR) based on smooth pursuits head movements. It works by computing correlations between the movements of on-screen targets and the user's head while tracking those targets. The paper presents three studies. The first suggests that head
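The underlying selection principle, matching head motion to target motion by correlation, can be sketched as follows. Helper names and the threshold are hypothetical; the abstract does not give SmoothMoves' implementation details:

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation of two equal-length 1-D signals."""
    a = np.array(a, dtype=float)
    b = np.array(b, dtype=float)
    a -= a.mean()
    b -= b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def select_target(head_traj, target_trajs, threshold=0.8):
    """Pick the on-screen target whose motion best correlates with the
    head trajectory (x and y separately); None if nothing passes."""
    scores = [min(pearson(head_traj[:, 0], t[:, 0]),
                  pearson(head_traj[:, 1], t[:, 1])) for t in target_trajs]
    best = int(np.argmax(scores))
    return best if scores[best] >= threshold else None

# Two circular targets at different speeds; the head follows the slow one.
t = np.linspace(0, 2 * np.pi, 200)
targets = [np.c_[np.cos(3 * t), np.sin(3 * t)],   # fast target
           np.c_[np.cos(t), np.sin(t)]]           # slow target
rng = np.random.default_rng(0)
head = targets[1] + rng.normal(0, 0.1, (200, 2))  # noisy pursuit of target 1
print(select_target(head, targets))
```

Because targets move at distinct frequencies or phases, their trajectories are nearly uncorrelated with each other, so the pursued target stands out even with noisy head tracking.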

2. Acquisition Footprint Attenuation Driven by Seismic Attributes

    Directory of Open Access Journals (Sweden)

    Cuellar-Urbano Mayra

    2014-04-01

    Full Text Available Acquisition footprint, one of the major problems that PEMEX faces in seismic imaging, is noise highly correlated to the geometric array of sources and receivers used for onshore and offshore seismic acquisitions. It prevails in spite of measures taken during acquisition and data processing. This pattern, throughout the image, is easily confused with geological features and misguides seismic attribute computation. In this work, we use seismic data from PEMEX Exploración y Producción to show the conditioning process for removing random and coherent noise using linear filters. Geometric attributes used in a workflow were computed for obtaining an acquisition footprint noise model and adaptively subtract it from the seismic data.
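Adaptive subtraction of a footprint noise model can be sketched, in its simplest per-trace least-squares form, as below. This is a generic illustration of the idea, not the workflow used on the PEMEX data:

```python
import numpy as np

def adaptive_subtract(data, noise_model, eps=1e-12):
    """Scale the footprint noise model trace-by-trace (least squares)
    and subtract it, so only the part of the model actually present
    in each trace is removed."""
    num = np.sum(data * noise_model, axis=-1, keepdims=True)
    den = np.sum(noise_model * noise_model, axis=-1, keepdims=True) + eps
    return data - (num / den) * noise_model

# Demo: a 3 Hz signal contaminated by a scaled 10 Hz "footprint" model.
t = np.arange(100) / 100.0
signal = np.sin(2 * np.pi * 3 * t)
footprint = np.cos(2 * np.pi * 10 * t)
cleaned = adaptive_subtract(signal + 0.7 * footprint, footprint)
print(np.max(np.abs(cleaned - signal)))
```

Production implementations typically replace the single scale factor with a short matching filter estimated in windows, but the principle of fitting the model to each trace before subtracting is the same.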

  3. Seismic instrumentation

    International Nuclear Information System (INIS)

    1984-06-01

RFS or Regles Fondamentales de Surete (Basic Safety Rules) applicable to certain types of nuclear facilities lay down requirements with which compliance, for the type of facilities and within the scope of application covered by the RFS, is considered to be equivalent to compliance with technical French regulatory practice. The object of the RFS is to take advantage of standardization in the field of safety, while allowing for technical progress in that field. They are designed to enable the operating utility and contractors to know the rules pertaining to various subjects which are considered to be acceptable by the Service Central de Surete des Installations Nucleaires, or the SCSIN (Central Department for the Safety of Nuclear Facilities). These RFS should make safety analysis easier and lead to better understanding between experts and individuals concerned with the problems of nuclear safety. The SCSIN reserves the right to modify, when considered necessary, any RFS and specify, if need be, the terms under which a modification is deemed retroactive. The aim of this RFS is to define the type, location and operating conditions for seismic instrumentation needed to promptly determine the seismic response of those features of nuclear power plants that are important to safety, so as to permit comparison of such response with that used as the design basis

  4. Smoothness in Binomial Edge Ideals

    Directory of Open Access Journals (Sweden)

    Hamid Damadi

    2016-06-01

Full Text Available In this paper we study some geometric properties, namely singularity and smoothness, of the algebraic set associated to the binomial edge ideal of a graph. Some of these algebraic sets are irreducible and some of them are reducible. If every irreducible component of the algebraic set is smooth we call the graph an edge smooth graph; otherwise it is called an edge singular graph. We show that complete graphs are edge smooth and introduce two conditions such that the graph G is edge singular if and only if it satisfies these conditions. Then, it is shown that cycles and most trees are edge singular. In addition, it is proved that complete bipartite graphs are edge smooth.

  5. Smooth quantile normalization.

    Science.gov (United States)

    Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada

    2018-04-01

Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and is unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced by only technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, but allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
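The core idea of qsmooth, a compromise between full quantile normalization and within-group quantile normalization, can be sketched as follows. This simplified version uses a fixed weight w, whereas the actual qsmooth package estimates a weight per quantile from a variance decomposition:

```python
import numpy as np

def qsmooth(X, groups, w=0.5):
    """Simplified smooth quantile normalization (sketch, fixed weight w).

    Each sample's sorted values are replaced by a weighted average of
    the overall mean quantiles (weight w) and its group's mean quantiles
    (weight 1-w), then put back in the sample's original rank order.
    w=1 recovers ordinary quantile normalization; w=0 normalizes
    within groups only.
    """
    X = np.asarray(X, float)                    # features x samples
    order = np.argsort(X, axis=0)               # rank structure per sample
    Xs = np.sort(X, axis=0)                     # per-sample quantiles
    q_all = Xs.mean(axis=1, keepdims=True)      # overall reference
    out = np.empty_like(X)
    for g in np.unique(groups):
        cols = np.where(np.asarray(groups) == g)[0]
        q_grp = Xs[:, cols].mean(axis=1, keepdims=True)
        target = w * q_all + (1 - w) * q_grp    # smoothed reference
        for c in cols:
            out[order[:, c], c] = target[:, 0]
    return out

# Four samples (columns) in two groups, three features (rows).
X = np.array([[1., 2., 3., 4.],
              [5., 1., 2., 8.],
              [3., 7., 4., 2.]])
print(qsmooth(X, ["a", "a", "b", "b"], w=0.5))
```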

  6. Seismic studies for nuclear installations sites

    International Nuclear Information System (INIS)

    Mohammadioun, B.; Faure, J.

    1988-01-01

The French experience in seismic risk assessment for French nuclear installations makes it possible to set out the objectives, the phases, and the geographic extent of the work to be carried out for installation safety. The data to be collected for the safety analysis are specified: they concern the regional seismotectonics, the essential seismic data for determining the earthquake level to be taken into account, and the definition of ground-motion spectra adapted to the site. Seismic surveillance must be maintained during the construction and throughout the life of the installation. 7 refs. (F.M.)

  7. Seismic qualification of equipment

    International Nuclear Information System (INIS)

    Heidebrecht, A.C.; Tso, W.K.

    1983-03-01

    This report describes the results of an investigation into the seismic qualification of equipment located in CANDU nuclear power plants. It is particularly concerned with the evaluation of current seismic qualification requirements, the development of a suitable methodology for the seismic qualification of safety systems, and the evaluation of seismic qualification analysis and testing procedures

  8. Seismic exploration for water on Mars

    International Nuclear Information System (INIS)

    Page, T.

    1987-01-01

It is proposed to soft-land three seismometers in the Utopia-Elysium region and three or more radio-controlled explosive charges at nearby sites that can be accurately located by an orbiter. Seismic signatures of timed explosions, to be telemetered to the orbiter, will be used to detect present subsurface layers, including those saturated by volatiles such as water and/or ice. The Viking Landers included seismometers that showed that at present Mars is seismically quiet, and that the mean crustal thickness at the site is about 14 to 18 km. The new seismic landers must be designed to minimize wind vibration noise, and the landing sites selected so that each rests firmly on the regolith, not on rock outcrops or in craters. The explosive charges might be mounted on penetrators aimed at nearby smooth areas. They must be equipped with radio emitters for accurate location and radio receivers for timed detonation

  9. German seismic regulations

    International Nuclear Information System (INIS)

    Danisch, Ruediger

    2002-01-01

    Rules and regulations for seismic design in Germany cover the following: seismic design of conventional buildings; and seismic design of nuclear facilities. Safety criteria for NPPs, accident guidelines, and guidelines for PWRs as well as safety standards are cited. Safety standards concerned with NPPs seismic design include basic principles, soil analysis, design of building structures, design of mechanical and electrical components, seismic instrumentation, and measures to be undertaken after the earthquake

  10. Modeling of seismic data in the downward continuation approach

    NARCIS (Netherlands)

    Stolk, C.C.; de Hoop, Maarten V.

    2005-01-01

    Seismic data are commonly modeled by a high-frequency single scattering approximation. This amounts to a linearization in the medium coefficient about a smooth background. The discontinuities are contained in the medium perturbation. The high-frequency part of the wavefield in the background medium

  11. Seismic inverse scattering in the downward continuation approach

    NARCIS (Netherlands)

    Stolk, C.C.; de Hoop, M.V.

    Seismic data are commonly modeled by a linearization around a smooth background medium in combination with a high frequency approximation. The perturbation of the medium coefficient is assumed to contain the discontinuities. This leads to two inverse problems, first the linearized inverse problem

  12. AxiSEM3D: broadband seismic wavefields in 3-D aspherical Earth models

    Science.gov (United States)

    Leng, K.; Nissen-Meyer, T.; Zad, K. H.; van Driel, M.; Al-Attar, D.

    2017-12-01

Seismology is the primary tool for data-informed inference of Earth structure and dynamics. Simulating seismic wave propagation at a global scale is fundamental to seismology, but remains one of the most challenging problems in scientific computing, because of both the multiscale nature of Earth's interior and the observable frequency band of seismic data. We present a novel numerical method to simulate global seismic wave propagation in realistic 3-D Earth models. Our method, named AxiSEM3D, is a hybrid of spectral element method and pseudospectral method. It reduces the azimuthal dimension of wavefields by means of a global Fourier series parameterization, of which the number of terms can be locally adapted to the inherent azimuthal smoothness of the wavefields. AxiSEM3D allows not only for material heterogeneities, such as velocity, density, anisotropy and attenuation, but also for finite undulations on radial discontinuities, both solid-solid and solid-fluid, and thereby a variety of aspherical Earth features such as ellipticity, topography, variable crustal thickness, and core-mantle boundary topography. Such interface undulations are equivalently interpreted as material perturbations of the contiguous media, based on the "particle relabelling transformation". Efficiency comparisons show that AxiSEM3D can be 1 to 3 orders of magnitude faster than conventional 3-D methods, with the speedup increasing with simulation frequency and decreasing with model complexity, but for all realistic structures the speedup remains at least one order of magnitude. The observable frequency range of global seismic data (up to 1 Hz) has been covered for wavefield modelling upon a 3-D Earth model with reasonable computing resources. We show an application of surface wave modelling within a state-of-the-art global crustal model (Crust1.0), with the synthetics compared to real data. The high-performance C++ code is released at github.com/AxiSEM3D/AxiSEM3D.

  13. Aging may negatively impact movement smoothness during stair negotiation.

    Science.gov (United States)

    Dixon, P C; Stirling, L; Xu, X; Chang, C C; Dennerlein, J T; Schiffman, J M

    2018-05-26

    Stairs represent a barrier to safe locomotion for some older adults, potentially leading to the adoption of a cautious gait strategy that may lack fluidity. This strategy may be characterized as unsmooth; however, stair negotiation smoothness has yet to be quantified. The aims of this study were to assess age- and task-related differences in head and body center of mass (COM) acceleration patterns and smoothness during stair negotiation and to determine if smoothness was associated with the timed "Up and Go" (TUG) test of functional movement. Motion data from nineteen older and twenty young adults performing stair ascent, stair descent, and overground straight walking trials were analyzed and used to compute smoothness based on the log-normalized dimensionless jerk (LDJ) and the velocity spectral arc length (SPARC) metrics. The associations between TUG and smoothness measures were evaluated using Pearson's correlation coefficient (r). Stair tasks increased head and body COM acceleration pattern differences across groups, compared to walking (p < 0.05). LDJ smoothness for the head and body COM decreased in older adults during stair descent, compared to young adults (p ≤ 0.015) and worsened with increasing TUG for all tasks (-0.60 ≤ r ≤ -0.43). SPARC smoothness of the head and body COM increased in older adults, regardless of task (p < 0.001), while correlations showed improved SPARC smoothness with increasing TUG for some tasks (0.33 ≤ r ≤ 0.40). The LDJ outperforms SPARC in identifying age-related stair negotiation adaptations and is associated with performance on a clinical test of gait. Copyright © 2018 Elsevier B.V. All rights reserved.
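The velocity-based log dimensionless jerk can be computed as below. This is a sketch following the general form of the metric (after Balasubramanian and colleagues); the study's exact preprocessing and head/COM signal extraction are not reproduced, and the sample rate and test signals are made up:

```python
import numpy as np

def ldlj(velocity, fs):
    """Log dimensionless jerk, a movement smoothness metric
    (velocity-based form; less negative = smoother).

        LDLJ = -ln( (T^3 / v_peak^2) * integral( jerk(t)^2 dt ) )

    where T is movement duration and jerk is the second derivative
    of the velocity signal.
    """
    v = np.asarray(velocity, float)
    dt = 1.0 / fs
    T = (len(v) - 1) * dt
    jerk = np.gradient(np.gradient(v, dt), dt)
    dj = (T**3 / np.max(np.abs(v))**2) * np.sum(jerk**2) * dt
    return -np.log(dj)

# A bell-shaped speed profile vs. the same profile with tremor-like jitter.
fs = 100
t = np.linspace(0, 1, 101)
v_smooth = np.sin(np.pi * t)
v_jerky = v_smooth + 0.05 * np.sin(2 * np.pi * 15 * t)
print(ldlj(v_smooth, fs), ldlj(v_jerky, fs))
```

High-frequency fluctuations inflate the squared-jerk integral, so the jittery profile scores a more negative LDLJ, matching the metric's use as a smoothness measure.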

  14. Seismic intrusion detector system

    Science.gov (United States)

    Hawk, Hervey L.; Hawley, James G.; Portlock, John M.; Scheibner, James E.

    1976-01-01

    A system for monitoring man-associated seismic movements within a control area including a geophone for generating an electrical signal in response to seismic movement, a bandpass amplifier and threshold detector for eliminating unwanted signals, pulse counting system for counting and storing the number of seismic movements within the area, and a monitoring system operable on command having a variable frequency oscillator generating an audio frequency signal proportional to the number of said seismic movements.

  15. National Seismic Station

    International Nuclear Information System (INIS)

    Stokes, P.A.

    1982-06-01

    The National Seismic Station was developed to meet the needs of regional or worldwide seismic monitoring of underground nuclear explosions to verify compliance with a nuclear test ban treaty. The Station acquires broadband seismic data and transmits it via satellite to a data center. It is capable of unattended operation for periods of at least a year, and will detect any tampering that could result in the transmission of unauthentic seismic data

  16. Quantitative Seismic Amplitude Analysis

    NARCIS (Netherlands)

    Dey, A.K.

    2011-01-01

    The Seismic Value Chain quantifies the cyclic interaction between seismic acquisition, imaging and reservoir characterization. Modern seismic innovation to address the global imbalance in hydrocarbon supply and demand requires such cyclic interaction of both feed-forward and feed-back processes.

  17. Non-smooth dynamical systems

    CERN Document Server

    2000-01-01

    The book provides a self-contained introduction to the mathematical theory of non-smooth dynamical problems, as they frequently arise from mechanical systems with friction and/or impacts. It is aimed at applied mathematicians, engineers, and applied scientists in general who wish to learn the subject.

  18. Panel Smooth Transition Regression Models

    DEFF Research Database (Denmark)

    González, Andrés; Terasvirta, Timo; Dijk, Dick van

    We introduce the panel smooth transition regression model. This new model is intended for characterizing heterogeneous panels, allowing the regression coefficients to vary both across individuals and over time. Specifically, heterogeneity is allowed for by assuming that these coefficients are bou...

  19. Smoothing type buffer memory device

    International Nuclear Information System (INIS)

    Podorozhnyj, D.M.; Yashin, I.V.

    1990-01-01

The layout of the micropower 4-bit smoothing type buffer memory device allowing one to record without counting the sequence of input randomly distributed pulses in multi-channel devices with serial poll, is given. The power spent by a memory cell for one binary digit recording is not greater than 0.15 mW, the device dead time is 10 μs

  20. Covariances of smoothed observational data

    Czech Academy of Sciences Publication Activity Database

    Vondrák, Jan; Čepek, A.

    2000-01-01

    Roč. 40, 5-6 (2000), s. 42-44 ISSN 1210-2709 R&D Projects: GA ČR GA205/98/1104 Institutional research plan: CEZ:AV0Z1003909 Keywords : digital filter * smoothing * estimation of uncertainties Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics

  1. Income smoothing by Dutch hospitals

    NARCIS (Netherlands)

    Boterenbrood, D.R.

    2014-01-01

    Research indicates that hospitals manage their earnings. However, these findings might be influenced by methodological issues. In this study, I exploit specific features of Dutch hospitals to study income smoothing while limiting these methodological issues. The managers of Dutch hospitals have the

  2. France's seismic zoning

    International Nuclear Information System (INIS)

    Mohammadioun, B.

    1997-01-01

    In order to assess the seismic hazard in France in relation to nuclear plant siting, the CEA, EDF and the BRGM (Mine and Geology Bureau) have carried out a collaboration which resulted in a seismic-tectonic map of France and a data base on seismic history (SIRENE). These studies were completed with a seismic-tectonic zoning, taking into account a very long period of time, that enabled a probabilistic evaluation of the seismic hazard in France, and that may be related to adjacent country hazard maps

  3. Seismic changes industry

    International Nuclear Information System (INIS)

    Taylor, G.

    1992-01-01

    This paper discusses the growth in the seismic industry as a result of the recent increases in the foreign market. With the decline of communism and the opening of Latin America to exploration, seismic teams have moved out into these areas in support of the oil and gas industry. The paper goes on to discuss the improved technology available for seismic resolution and the subsequent use of computers to field-proof the data while the seismic team is still on-site. It also discusses the effects of new computer technology on reducing the amount of support staff that is required to both conduct and interpret seismic information

  4. Smoothing the payoff for efficient computation of Basket option prices

    KAUST Repository

    Bayer, Christian

    2017-07-22

    We consider the problem of pricing basket options in a multivariate Black–Scholes or Variance-Gamma model. From a numerical point of view, pricing such options corresponds to moderate and high-dimensional numerical integration problems with non-smooth integrands. Due to this lack of regularity, higher order numerical integration techniques may not be directly available, requiring the use of methods like Monte Carlo specifically designed to work for non-regular problems. We propose to use the inherent smoothing property of the density of the underlying in the above models to mollify the payoff function by means of an exact conditional expectation. The resulting conditional expectation is unbiased and yields a smooth integrand, which is amenable to the efficient use of adaptive sparse-grid cubature. Numerical examples indicate that the high-order method may perform orders of magnitude faster than Monte Carlo or Quasi Monte Carlo methods in dimensions up to 35.
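The conditioning trick can be demonstrated on a two-asset basket call with independent lognormal assets (a simplified sketch; the paper treats correlated multivariate Black–Scholes and Variance-Gamma models). Conditioning on one asset turns the inner expectation into a closed-form Black–Scholes-type value, leaving a smooth outer integrand that Gaussian quadrature handles with few nodes:

```python
import numpy as np
from math import erf, exp, log, sqrt, pi

def ncdf(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def inner_call(F, Kp, sigma, T):
    """E[max(S - Kp, 0)] (undiscounted) for lognormal S with forward F."""
    if Kp <= 0.0:                      # payoff is linear: always in the money
        return F - Kp
    d1 = (log(F / Kp) + 0.5 * sigma**2 * T) / (sigma * sqrt(T))
    return F * ncdf(d1) - Kp * ncdf(d1 - sigma * sqrt(T))

def basket_call_conditional(S1, S2, K, r, sig1, sig2, T, n=32):
    """Two-asset basket call, independent lognormals (sketch).

    Conditioning on asset 2 replaces the kinked payoff max(S1+S2-K, 0)
    with a smooth function of the remaining Gaussian factor, so
    low-order Gauss-Hermite quadrature converges rapidly.
    """
    z, w = np.polynomial.hermite_e.hermegauss(n)   # nodes/weights for N(0,1)
    F1 = S1 * exp(r * T)
    total = 0.0
    for zi, wi in zip(z, w):
        s2 = S2 * exp((r - 0.5 * sig2**2) * T + sig2 * sqrt(T) * zi)
        total += wi * inner_call(F1, K - s2, sig1, T)
    return exp(-r * T) * total / sqrt(2.0 * pi)

print(basket_call_conditional(50.0, 50.0, 100.0, 0.05, 0.2, 0.2, 1.0))
```

Without the conditioning step, the kink in the payoff would force Monte Carlo-style integration; with it, the 32-node quadrature above already gives a smooth, rapidly converging approximation.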

  5. The 2014 United States National Seismic Hazard Model

    Science.gov (United States)

    Petersen, Mark D.; Moschetti, Morgan P.; Powers, Peter; Mueller, Charles; Haller, Kathleen; Frankel, Arthur; Zeng, Yuehua; Rezaeian, Sanaz; Harmsen, Stephen; Boyd, Oliver; Field, Edward; Chen, Rui; Rukstales, Kenneth S.; Luco, Nicolas; Wheeler, Russell; Williams, Robert; Olsen, Anna H.

    2015-01-01

    New seismic hazard maps have been developed for the conterminous United States using the latest data, models, and methods available for assessing earthquake hazard. The hazard models incorporate new information on earthquake rupture behavior observed in recent earthquakes; fault studies that use both geologic and geodetic strain rate data; earthquake catalogs through 2012 that include new assessments of locations and magnitudes; earthquake adaptive smoothing models that more fully account for the spatial clustering of earthquakes; and 22 ground motion models, some of which consider more than double the shaking data applied previously. Alternative input models account for larger earthquakes, more complicated ruptures, and more varied ground shaking estimates than assumed in earlier models. The ground motions, for levels applied in building codes, differ from the previous version by less than ±10% over 60% of the country, but can differ by ±50% in localized areas. The models are incorporated in insurance rates, risk assessments, and as input into the U.S. building code provisions for earthquake ground shaking.
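Adaptive smoothing of an earthquake catalog, as used in such hazard models, can be sketched with per-event nearest-neighbour bandwidths. This is a generic flat-geometry illustration, not the USGS implementation:

```python
import numpy as np

def adaptive_rate_map(epicenters, grid_xy, k=3):
    """Adaptively smoothed seismicity rate (sketch, flat 2-D geometry).

    Each earthquake i gets its own bandwidth d_i = distance to its
    k-th nearest neighbour, so kernels are narrow inside dense clusters
    and broad where seismicity is sparse; the rate at a grid point is
    the sum of normalized Gaussian kernels over all events.
    """
    eq = np.asarray(epicenters, float)
    dists = np.linalg.norm(eq[:, None, :] - eq[None, :, :], axis=-1)
    d = np.sort(dists, axis=1)[:, k]            # k-th NN distance per event
    rates = np.zeros(len(grid_xy))
    for (x, y), di in zip(eq, d):
        r2 = (grid_xy[:, 0] - x)**2 + (grid_xy[:, 1] - y)**2
        rates += np.exp(-r2 / (2 * di**2)) / (2 * np.pi * di**2)
    return rates

# Two tight clusters of events; rates concentrate around them.
eq = np.array([[0, 0], [1, 0], [0, 1], [1, 1],
               [20, 20], [21, 20], [20, 21], [21, 21]], dtype=float)
grid = np.array([[0.5, 0.5], [10.0, 10.0]])
print(adaptive_rate_map(eq, grid, k=3))
```

Because the bandwidth shrinks where events are dense, clusters are sharply resolved without over-smoothing sparse background seismicity, which is the point of the adaptive kernels mentioned above.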

  6. Seismic stops for nuclear power plants

    International Nuclear Information System (INIS)

    Cloud, R.L.; Leung, J.S.M.; Anderson, P.H.

    1989-01-01

In the regulated world of nuclear power, the need to have analytical proof of performance in hypothetical design-basis events such as earthquakes has placed a premium on design configurations that are mathematically tractable and easily analyzed. This is particularly true for the piping design. Depending on how the piping analyses are organized and on how old the plant is, there may be from 200 to 1000 separate piping runs to be designed, analyzed, and qualified. In this situation, the development of snubbers seemed like the answer to a piping engineer's prayer. At any place where seismic support was required but thermal motion had to be accommodated, a snubber could be specified. But, as experience has now shown, the problem was solved only on paper. This article presents an alternative to conventional snubbers. These new devices, termed Seismic Stops, are designed to replace snubbers directly and look like snubbers on the outside. But their design is based on a completely different principle. The original concept was adapted from early seismic-resistant pipe support designs used on fossil power plants in California. The fundamental idea is to provide a space envelope in which the pipe can expand freely between the hot and cold positions, but cannot move outside the envelope. Seismic Stops are designed to transmit any possible impact load, as would occur in an earthquake, away from the pipe itself to the Seismic Stop. The Seismic Stop pipe support is shown

  7. Angola Seismicity MAP

    Science.gov (United States)

    Neto, F. A. P.; Franca, G.

    2014-12-01

The purpose of this work was to study and document the natural seismicity of Angola and to establish the first seismic database, in order to facilitate consultation and searching of information on seismic activity in the country. The study was conducted from reports produced by the National Institute of Meteorology and Geophysics (INAMET) from 1968 to 2014, with emphasis on the work presented by Moreira (1968), which defined six seismogenic zones from macroseismic data; the most notable is the zone of Sá da Bandeira (Lubango)-Chibemba-Oncócua-Iona. This is the most important seismic zone of Angola, covering the epicentral Quihita and Iona regions. It is geologically characterized by a transcontinental structure of Mesozoic tectono-magmatic activation, with the emplacement of a wide variety of intrusive rocks of ultrabasic-alkaline, basic and alkaline composition, kimberlites and carbonatites, strongly marked by intense tectonism and presenting several faults and fractures (locally called the corredor de Lucapa). The earthquake of May 9, 1948 reached intensity VI on the Mercalli-Sieberg scale (MCS) in the locality of Quihita, and in the seismic activity at Iona of January 15, 1964 the main shock reached grade VI-VII. The other five zones, which cannot be neglected despite their lower seismicity rates, are: Cassongue-Ganda-Massano de Amorim; Lola-Quilengues-Caluquembe; the Gago Coutinho zone; Cuima-Cachingues-Cambândua; and the Upper Zambezi zone. We also analyzed technical reports on the seismicity of the middle Kwanza region produced by Hidroproekt (GAMEK), as well as international seismic bulletins of the International Seismological Centre (ISC) and the United States Geological Survey (USGS); these data served for instrumental location of the epicenters. All the compiled information made possible the creation of the first database of seismic data for Angola and the preparation of the seismicity map, with reconfirmation of the main seismic zones defined by Moreira (1968) and the identification of a new seismic

  8. Exchange rate smoothing in Hungary

    OpenAIRE

    Karádi, Péter

    2005-01-01

    The paper proposes a structural empirical model capable of examining exchange rate smoothing in the small, open economy of Hungary. The framework assumes the existence of an unobserved and changing implicit exchange rate target. The central bank is assumed to use interest rate policy to obtain this preferred rate in the medium term, while market participants are assumed to form rational expectations about this target and influence exchange rates accordingly. The paper applies unobserved varia...

  9. Geomorphology and seismic risk

    Science.gov (United States)

    Panizza, Mario

    1991-07-01

The author analyses the contributions provided by geomorphology in studies suited to the assessment of seismic risk, defined as a function of the seismic hazard, the seismic susceptibility, and the vulnerability. The geomorphological studies applicable to seismic risk assessment can be divided into two sectors: (a) morpho-neotectonic investigations conducted to identify active tectonic structures; (b) geomorphological and morphometric analyses aimed at identifying the particular situations that amplify or reduce seismic susceptibility. The morpho-neotectonic studies lead to the identification, selection and classification of the lineaments that can be linked with active tectonic structures. The most important geomorphological situations that can condition seismic susceptibility are: slope angle, debris, morphology, degradational slopes, paleo-landslides and underground cavities.

  10. Smooth surfaces from rational bilinear patches

    KAUST Repository

    Shi, Ling; Wang, Jun; Pottmann, Helmut

    2014-01-01

    Smooth freeform skins from simple panels constitute a challenging topic arising in contemporary architecture. We contribute to this problem area by showing how to approximate a negatively curved surface by smoothly joined rational bilinear patches

  11. Calcium dynamics in vascular smooth muscle

    OpenAIRE

    Amberg, Gregory C.; Navedo, Manuel F.

    2013-01-01

    Smooth muscle cells are ultimately responsible for determining vascular luminal diameter and blood flow. Dynamic changes in intracellular calcium are a critical mechanism regulating vascular smooth muscle contractility. Processes influencing intracellular calcium are therefore important regulators of vascular function with physiological and pathophysiological consequences. In this review we discuss the major dynamic calcium signals identified and characterized in vascular smooth muscle cells....

  12. multiscale smoothing in supervised statistical learning

    Indian Academy of Sciences (India)

Optimum level of smoothing is chosen based on the entire training sample, while a good choice of smoothing parameter may also depend on the observation to be classified. One may like to assess the strength of evidence in favor of different competing classes at different scales of smoothing. In allows only one single ...

  13. A SAS IML Macro for Loglinear Smoothing

    Science.gov (United States)

    Moses, Tim; von Davier, Alina

    2011-01-01

    Polynomial loglinear models for one-, two-, and higher-way contingency tables have important applications to measurement and assessment. They are essentially regarded as a smoothing technique, which is commonly referred to as loglinear smoothing. A SAS IML (SAS Institute, 2002a) macro was created to implement loglinear smoothing according to…

  14. Burar seismic station: evaluation of seismic performance

    International Nuclear Information System (INIS)

    Ghica, Daniela; Popa, Mihaela

    2005-01-01

    A new seismic monitoring system, the Bucovina Seismic Array (BURAR), has been in operation since July 2002 in the northern part of Romania, as a joint effort of the Air Force Technical Applications Center, USA, and the National Institute for Earth Physics (NIEP), Romania. The small-aperture array consists of 10 seismic sensors (9 vertical short-period and one three-component broadband) located in boreholes and distributed over a 5 x 5 km² area. At present, the seismic data are continuously recorded by the BURAR and transmitted in real time to the Romanian National Data Center in Bucharest and to the National Data Center of the USA, in Florida. Based on the BURAR seismic information gathered at the National Data Center, NIEP (ROM N DC), in the August 2002 - December 2004 time interval, analysis and statistical assessments were performed. Following the preliminary processing of the data, several observations on the global performance of the BURAR system were made. Data investigation showed excellent efficiency of the BURAR system, particularly in detecting teleseismic and regional events. A statistical analysis of the BURAR detection capability for local Vrancea events was also performed in terms of depth and magnitude for the year 2004. The high signal detection capability of the BURAR generally improved the location solutions for the Vrancea seismic events. The location solution accuracy is enhanced when BURAR recordings are added, especially for low-magnitude events (recorded by few stations), both in terms of constraining hypocenter depth and epicentral coordinates. Our analysis certifies the importance of the BURAR system in NIEP efforts to elaborate seismic bulletins.
Furthermore, the specific procedures for array data processing (beamforming, f-k analysis) significantly increase the signal-to-noise ratio by summing the coherent signals from the array components, and ensure better accuracy

  15. Comparison of some nonlinear smoothing methods

    International Nuclear Information System (INIS)

    Bell, P.R.; Dillon, R.S.

    1977-01-01

    Due to the poor quality of many nuclear medicine images, computer-driven smoothing procedures are frequently employed to enhance the diagnostic utility of these images. While linear methods were first tried, it was discovered that nonlinear techniques produced superior smoothing with little detail suppression. We have compared four methods: Gaussian smoothing (linear), two-dimensional least-squares smoothing (linear), two-dimensional least-squares bounding (nonlinear), and two-dimensional median smoothing (nonlinear). The two dimensional least-squares procedures have yielded the most satisfactorily enhanced images, with the median smoothers providing quite good images, even in the presence of widely aberrant points
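
As a minimal illustration of one of the nonlinear methods compared above, here is a two-dimensional median smoother in NumPy (the function name and window size are illustrative, not the authors' implementation):

```python
import numpy as np

def median_smooth2d(img, size=3):
    """Replace each pixel by the median of its size-by-size neighbourhood."""
    r = size // 2
    padded = np.pad(img, r, mode="edge")            # replicate border pixels
    windows = np.lib.stride_tricks.sliding_window_view(padded, (size, size))
    return np.median(windows, axis=(-2, -1))

# a flat image with one widely aberrant point (e.g. a noise spike)
img = np.full((5, 5), 10.0)
img[2, 2] = 1000.0
smoothed = median_smooth2d(img)
```

Unlike a Gaussian (linear) smoother, which would spread the aberrant count into its neighbours, the median simply discards it, which is the behaviour the abstract reports for widely aberrant points.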

  16. Smooth random change point models.

    Science.gov (United States)

    van den Hout, Ardo; Muniz-Terrera, Graciela; Matthews, Fiona E

    2011-03-15

    Change point models are used to describe processes over time that show a change in direction. An example of such a process is cognitive ability, where a decline a few years before death is sometimes observed. A broken-stick model consists of two linear parts and a breakpoint where the two lines intersect. Alternatively, models can be formulated that imply a smooth change between the two linear parts. Change point models can be extended by adding random effects to account for variability between subjects. A new smooth change point model is introduced and examples are presented that show how change point models can be estimated using functions in R for mixed-effects models. The Bayesian inference using WinBUGS is also discussed. The methods are illustrated using data from a population-based longitudinal study of ageing, the Cambridge City over 75 Cohort Study. The aim is to identify how many years before death individuals experience a change in the rate of decline of their cognitive ability. Copyright © 2010 John Wiley & Sons, Ltd.
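
The broken-stick mean and a smooth alternative can be sketched as follows; the hyperbolic (Bacon-Watts-style) transition used here is one common smooth formulation, not necessarily the authors' exact model, and all names are illustrative:

```python
import math

def broken_stick(t, b0, slope1, slope2, cp):
    # two linear parts meeting at the change point cp (value b0 at t = cp)
    return b0 + slope1 * min(t - cp, 0.0) + slope2 * max(t - cp, 0.0)

def smooth_change(t, b0, slope1, slope2, cp, eps=1.0):
    # same asymptotic lines, but sqrt((t-cp)^2 + eps^2) replaces |t-cp|,
    # rounding the kink over a transition of width ~eps
    avg = (slope1 + slope2) / 2.0
    diff = (slope2 - slope1) / 2.0
    return b0 + avg * (t - cp) + diff * math.sqrt((t - cp) ** 2 + eps ** 2)

# e.g. slow cognitive decline before the change point, fast decline after
b0, slow, fast, cp = 25.0, -0.1, -0.9, 10.0
```

Far from cp the two functions agree to O(eps² / |t − cp|), so the smooth model keeps the interpretable pre- and post-change slopes while remaining differentiable, which is convenient for mixed-effects estimation.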

  17. Calcium signaling in smooth muscle.

    Science.gov (United States)

    Hill-Eubanks, David C; Werner, Matthias E; Heppner, Thomas J; Nelson, Mark T

    2011-09-01

    Changes in intracellular Ca(2+) are central to the function of smooth muscle, which lines the walls of all hollow organs. These changes take a variety of forms, from sustained, cell-wide increases to temporally varying, localized changes. The nature of the Ca(2+) signal is a reflection of the source of Ca(2+) (extracellular or intracellular) and the molecular entity responsible for generating it. Depending on the specific channel involved and the detection technology employed, extracellular Ca(2+) entry may be detected optically as graded elevations in intracellular Ca(2+), junctional Ca(2+) transients, Ca(2+) flashes, or Ca(2+) sparklets, whereas release of Ca(2+) from intracellular stores may manifest as Ca(2+) sparks, Ca(2+) puffs, or Ca(2+) waves. These diverse Ca(2+) signals collectively regulate a variety of functions. Some functions, such as contractility, are unique to smooth muscle; others are common to other excitable cells (e.g., modulation of membrane potential) and nonexcitable cells (e.g., regulation of gene expression).

  18. Seismic Wave Propagation in Icy Ocean Worlds

    Science.gov (United States)

    Stähler, Simon C.; Panning, Mark P.; Vance, Steven D.; Lorenz, Ralph D.; van Driel, Martin; Nissen-Meyer, Tarje; Kedar, Sharon

    2018-01-01

    Seismology was developed on Earth and shaped our model of the Earth's interior over the twentieth century. With the exception of the Philae lander, all in situ extraterrestrial seismological effort to date was limited to other terrestrial planets. All have in common a rigid crust above a solid mantle. The coming years may see the installation of seismometers on Europa, Titan, and Enceladus, so it is necessary to adapt seismological concepts to the setting of worlds with global oceans covered in ice. Here we use waveform analyses to identify and classify wave types, developing a lexicon for icy ocean world seismology intended to be useful to both seismologists and planetary scientists. We use results from spectral-element simulations of broadband seismic wavefields to adapt seismological concepts to icy ocean worlds. We present a concise naming scheme for seismic waves and an overview of the features of the seismic wavefield on Europa, Titan, Ganymede, and Enceladus. In close connection with geophysical interior models, we analyze simulated seismic measurements of Europa and Titan that might be used to constrain geochemical parameters governing the habitability of a sub-ice ocean.

  19. Seismic texture classification. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Vinther, R.

    1997-12-31

    The seismic texture classification method is a seismic attribute that can both recognize general reflectivity styles and locate variations from these. The seismic texture classification performs a statistical analysis of the seismic section (or volume) aiming at describing the reflectivity. Based on a set of reference reflectivities, the seismic textures are classified. The result of the seismic texture classification is a display of seismic texture categories showing both the styles of reflectivity from the reference set and interpolations and extrapolations from these. The display is interpreted as statistical variations in the seismic data. The seismic texture classification is applied to seismic sections and volumes from the Danish North Sea representing both horizontal stratifications and salt diapirs. The attribute succeeded in recognizing both the general structure of successions and variations from these. Also, the seismic texture classification is not only able to display variations in prospective areas (1-7 sec. TWT) but can also be applied to deep seismic sections. The seismic texture classification is tested on a deep reflection seismic section (13-18 sec. TWT) from the Baltic Sea. Applied to this section, the seismic texture classification succeeded in locating the Moho, which could not be located using conventional interpretation tools. The seismic texture classification is a seismic attribute which can display general reflectivity styles and deviations from these and enhance variations not found by conventional interpretation tools. (LN)

  20. Smoothing dynamic positron emission tomography time courses using functional principal components

    OpenAIRE

    Jiang, Ci-Ren; Aston, John A. D.; Wang, Jane-Ling

    2009-01-01

    A functional smoothing approach to the analysis of PET time course data is presented. By borrowing information across space and accounting for this pooling through the use of a non-parametric covariate adjustment, it is possible to smooth the PET time course data thus reducing the noise. A new model for functional data analysis, the Multiplicative Nonparametric Random Effects Model, is introduced to more accurately account for the variation in the data. A locally adaptive bandwidth choice hel...

  1. Lensing smoothing of BAO wiggles

    Energy Technology Data Exchange (ETDEWEB)

    Dio, Enea Di, E-mail: enea.didio@oats.inaf.it [INAF—Osservatorio Astronomico di Trieste, Via G.B. Tiepolo 11, I-34143 Trieste (Italy)

    2017-03-01

    We study non-perturbatively the effect of the deflection angle on the BAO wiggles of the matter power spectrum in real space. We show that from redshift z ∼ 2 this introduces a dispersion of roughly 1 Mpc at the BAO scale, which corresponds approximately to a 1% effect. The lensing effect induced by the deflection angle, which is completely geometrical and survey independent, smears out the BAO wiggles. The effect on the power spectrum amplitude at the BAO scale is about 0.1% for z ∼ 2 and 0.2% for z ∼ 4. We compare the smoothing effects induced by the lensing potential and by non-linear structure formation, showing that the two effects become comparable at z ∼ 4, while the lensing effect dominates for sources at higher redshifts. We note that this effect is not accounted for by BAO reconstruction techniques.

  2. Radial smoothing and closed orbit

    International Nuclear Information System (INIS)

    Burnod, L.; Cornacchia, M.; Wilson, E.

    1983-11-01

    A complete simulation leading to a description of one of the error curves must involve four phases: (a) random drawing of the six set-up points within a normal population having a standard deviation of 1.3 mm; (b) random drawing of the six vertices of the curve in the sextant mode within a normal population having a standard deviation of 1.2 mm; these vertices are to be set with respect to the axis of the error lunes, while this axis has as its origin the positions defined by the preceding drawing; (c) mathematical definition of six parabolic curves and their junctions; the latter may be curves with very slight curvatures, or segments of a straight line passing through the set-up point and having lengths no longer than one LSS. Thus one gets a mean curve for the absolute errors; (d) plotting of the actually observed radial positions with respect to the mean curve (results of smoothing)

  3. Using Seismic Interferometry to Investigate Seismic Swarms

    Science.gov (United States)

    Matzel, E.; Morency, C.; Templeton, D. C.

    2017-12-01

    Seismicity provides a direct means of measuring the physical characteristics of active tectonic features such as fault zones. Hundreds of small earthquakes often occur along a fault during a seismic swarm. This seismicity helps define the tectonically active region. When processed using novel geophysical techniques, we can isolate the energy sensitive to the fault itself. Here we focus on two methods of seismic interferometry: ambient noise correlation (ANC) and the virtual seismometer method (VSM). ANC is based on the observation that the Earth's background noise includes coherent energy, which can be recovered by observing over long time periods and allowing the incoherent energy to cancel out. The cross-correlation of ambient noise between a pair of stations results in a waveform identical to the seismogram that would result if an impulsive source located at one of the stations were recorded at the other: the Green's function (GF). The calculation of the GF is often stable after a few weeks of continuous data correlation; any perturbations to the GF after that point are directly related to changes in the subsurface and can be used for 4D monitoring. VSM is a style of seismic interferometry that provides fast, precise, high-frequency estimates of the GF between earthquakes. VSM illuminates the subsurface precisely where the pressures are changing and has the potential to image the evolution of seismicity over time, including changes in the style of faulting. With hundreds of earthquakes, we can calculate thousands of waveforms. At the same time, VSM collapses the computational domain, often by 2-3 orders of magnitude, which allows us to do high-frequency 3D modeling in the fault region. 
Using data from a swarm of earthquakes near the Salton Sea, we demonstrate the power of these techniques, illustrating our ability to scale from the far field, where sources are well separated, to the near field where their locations fall within each other
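
The ANC principle above can be demonstrated with a toy NumPy computation: windowed cross-correlation of two records that share a delayed coherent component recovers a peak at the propagation delay, while uncorrelated noise stacks down. All names and parameters are illustrative:

```python
import numpy as np

def noise_cross_correlation(u1, u2, win, nlag):
    """Stack windowed cross-correlations of two continuous records."""
    nwin = len(u1) // win
    lags = np.arange(-nlag, nlag + 1)
    stack = np.zeros(len(lags))
    for k in range(nwin):
        a = u1[k * win:(k + 1) * win]
        b = u2[k * win:(k + 1) * win]
        a = (a - a.mean()) / (a.std() + 1e-12)      # per-window normalization
        b = (b - b.mean()) / (b.std() + 1e-12)
        full = np.correlate(b, a, mode="full")      # lag 0 sits at index win-1
        stack += full[win - 1 - nlag: win - 1 + nlag + 1]
    return lags, stack / nwin

rng = np.random.default_rng(0)
N, delay = 20000, 5
coherent = rng.standard_normal(N)                   # stands in for ambient noise
u1 = coherent + 0.5 * rng.standard_normal(N)        # station 1
u2 = np.roll(coherent, delay) + 0.5 * rng.standard_normal(N)  # station 2, delayed
lags, gf = noise_cross_correlation(u1, u2, win=500, nlag=50)
# the stacked correlation peaks at the inter-station delay
```

The incoherent noise terms average toward zero as more windows are stacked, which is why, as the abstract notes, the estimate typically stabilizes after enough continuous data have been correlated.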

  4. The Seismic Analyzer: Interpreting and Illustrating 2D Seismic Data

    OpenAIRE

    Patel, Daniel; Giertsen, Christopher; Thurmond, John; Gjelberg, John; Gröller, Eduard

    2008-01-01

    We present a toolbox for quickly interpreting and illustrating 2D slices of seismic volumetric reflection data. Searching for oil and gas involves creating a structural overview of seismic reflection data to identify hydrocarbon reservoirs. We improve the search of seismic structures by precalculating the horizon structures of the seismic data prior to interpretation. We improve the annotation of seismic structures by applying novel illustrative rendering algorithms tailored to seism...

  5. Robust estimation of seismic coda shape

    Science.gov (United States)

    Nikkilä, Mikko; Polishchuk, Valentin; Krasnoshchekov, Dmitry

    2014-04-01

    We present a new method for estimation of seismic coda shape. It falls into the same class of methods as non-parametric shape reconstruction with the use of neural network techniques, where data are split into training and validation sets. We particularly pursue the well-known problem of image reconstruction, formulated in this case as shape isolation in the presence of broadly defined noise. This combined approach is enabled by an intrinsic feature of the seismogram, which can be divided objectively into pre-signal seismic noise lacking the target shape, and the remainder, which contains the scattered waveforms compounding the coda shape. In short, we separately apply the shape restoration procedure to the pre-signal seismic noise and to the event record, which provides successful delineation of the coda shape in the form of a smooth, almost non-oscillating function of time. The new algorithm uses a recently developed generalization of the classical computational-geometry tool of α-shape. The generalization essentially yields robust shape estimation by locally ignoring a number of points treated as extreme values, noise or non-relevant data. Our algorithm is conceptually simple and enables the desired or pre-determined level of shape detail, constrainable by arbitrary data-fit criteria. The proposed tool for coda shape delineation provides an alternative to moving averaging and/or other smoothing techniques frequently used for this purpose. The new algorithm is illustrated with an application to the problem of estimating the coda duration after a local event. The obtained relation coefficient between coda duration and epicentral distance is consistent with earlier findings in the region of interest.

  6. Modelling free surface flows with smoothed particle hydrodynamics

    Directory of Open Access Journals (Sweden)

    L.Di G.Sigalotti

    2006-01-01

    In this paper the method of Smoothed Particle Hydrodynamics (SPH) is extended to include an adaptive density kernel estimation (ADKE) procedure. It is shown that for a van der Waals (vdW) fluid, this method can be used to deal with free-surface phenomena without difficulties. In particular, arbitrary moving boundaries can be easily handled because surface tension is effectively simulated by the cohesive pressure forces. Moreover, the ADKE method is seen to increase both the accuracy and stability of SPH, since it allows the width of the kernel interpolant to vary locally in a way that only the minimum necessary smoothing is applied at and near free surfaces and sharp fluid-fluid interfaces. The method is robust and easy to implement. Examples of its resolving power are given for both the formation of a circular liquid drop under surface tension and the nonlinear oscillation of excited drops.
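
The ADKE idea, locally shrinking or widening the smoothing length according to a pilot density estimate, can be sketched in 1D; the Gaussian kernel, the exponent, and all names below are illustrative, not the paper's exact formulation:

```python
import numpy as np

def gaussian_kernel(dx, h):
    # normalized 1D Gaussian kernel of width h
    return np.exp(-(dx / h) ** 2) / (np.sqrt(np.pi) * h)

def adke_density(xp, m, h0, eps=0.5):
    """Two-pass SPH density: fixed-h pilot estimate, then locally adapted h."""
    dx = xp[:, None] - xp[None, :]
    pilot = (m * gaussian_kernel(dx, h0)).sum(axis=1)
    # adapt: h_i = h0 * (pilot_i / g)^(-eps), g = geometric mean of the pilot,
    # so sparse (low-density) regions get wider kernels and dense ones narrower
    g = np.exp(np.log(pilot).mean())
    h = h0 * (pilot / g) ** (-eps)
    # "scatter" form: each contribution uses the neighbour's smoothing length
    rho = (m * gaussian_kernel(dx, h[None, :])).sum(axis=1)
    return rho, h

xp = np.arange(41, dtype=float)        # unit-spaced particles of unit mass
m = np.ones_like(xp)
rho, h = adke_density(xp, m, h0=2.0)
```

The scatter form above evaluates each contribution with the neighbour's smoothing length; gather and symmetrized forms are also common in SPH, and the choice matters for conservation properties.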

  7. Doing smooth pursuit paradigms in Windows 7

    DEFF Research Database (Denmark)

    Wilms, Inge Linda

    predict strengths or deficits in perception and attention. However, smooth pursuit movements have been difficult to study and very little normative data is available for smooth pursuit performance in children and adults. This poster describes the challenges in setting up a smooth pursuit paradigm...... in Windows 7 with live capturing of eye movements using a Tobii TX300 eye tracker. In particular, the poster describes the challenges and limitations created by the hardware and the software...

  8. Bessel smoothing filter for spectral-element mesh

    Science.gov (United States)

    Trinh, P. T.; Brossier, R.; Métivier, L.; Virieux, J.; Wellington, P.

    2017-06-01

    Smoothing filters are extremely important tools in seismic imaging and inversion, such as for traveltime tomography, migration and waveform inversion. For efficiency, and as they can be used a number of times during inversion, it is important that these filters can easily incorporate prior information on the geological structure of the investigated medium, through variable coherent lengths and orientation. In this study, we promote the use of the Bessel filter to achieve these purposes. Instead of considering the direct application of the filter, we demonstrate that we can rely on the equation associated with its inverse filter, which amounts to the solution of an elliptic partial differential equation. This enhances the efficiency of the filter application, and also its flexibility. We apply this strategy within a spectral-element-based elastic full waveform inversion framework. Taking advantage of this formulation, we apply the Bessel filter by solving the associated partial differential equation directly on the spectral-element mesh through the standard weak formulation. This avoids cumbersome projection operators between the spectral-element mesh and a regular Cartesian grid, or expensive explicit windowed convolution on the finite-element mesh, which is often used for applying smoothing operators. The associated linear system is solved efficiently through a parallel conjugate gradient algorithm, in which the matrix vector product is factorized and highly optimized with vectorized computation. Significant scaling behaviour is obtained when comparing this strategy with the explicit convolution method. The theoretical numerical complexity of this approach increases linearly with the coherent length, whereas a sublinear relationship is observed practically. Numerical illustrations are provided here for schematic examples, and for a more realistic elastic full waveform inversion gradient smoothing on the SEAM II benchmark model. 
These examples illustrate well the
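
The inverse-filter strategy described above can be sketched in 1D: rather than convolving with a smoothing kernel, one solves the elliptic equation whose Green's function is that kernel. The toy below uses a tridiagonal (I − L²∂²ₓ) operator and SciPy's sparse solver; the paper's filter is a Bessel filter applied on a spectral-element mesh via the weak form, which this sketch does not reproduce, and the function name is illustrative:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

def inverse_filter_smooth_1d(f, coherent_len, dx=1.0):
    """Smooth f by solving (I - L^2 d2/dx2) s = f instead of convolving."""
    n = len(f)
    a = (coherent_len / dx) ** 2
    main = np.full(n, 1.0 + 2.0 * a)
    main[0] = main[-1] = 1.0 + a                    # reflecting (Neumann-like) ends
    A = diags([np.full(n - 1, -a), main, np.full(n - 1, -a)],
              offsets=[-1, 0, 1], format="csc")
    return spsolve(A, f)

f = np.zeros(101)
f[50] = 1.0                                         # unit spike
s = inverse_filter_smooth_1d(f, coherent_len=5.0)
# s is a smooth, positive bump centred on the spike, with the total conserved
```

One solve of a sparse, well-conditioned system replaces an explicit windowed convolution, and the coherent length can be made spatially variable simply by letting the coefficient a vary per row, which is the flexibility the abstract emphasizes.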

  9. Seismic sequences in the Sombrero Seismic Zone

    Science.gov (United States)

    Pulliam, J.; Huerfano, V. A.; ten Brink, U.; von Hillebrandt, C.

    2007-05-01

    The northeastern Caribbean, in the vicinity of Puerto Rico and the Virgin Islands, has a long and well-documented history of devastating earthquakes and tsunamis, including major events in 1670, 1787, 1867, 1916, 1918, and 1943. Recently, seismicity has been concentrated to the north and west of the British Virgin Islands, in the region referred to as the Sombrero Seismic Zone by the Puerto Rico Seismic Network (PRSN). In the combined seismicity catalog maintained by the PRSN, several hundred small to moderate magnitude events can be found in this region prior to 2006. However, beginning in 2006 and continuing to the present, the rate of seismicity in the Sombrero suddenly increased, and a new locus of activity developed to the east of the previous location. Accurate estimates of seismic hazard, and the tsunamigenic potential of seismic events, depend on an accurate and comprehensive understanding of how strain is being accommodated in this corner region. Are faults locked and accumulating strain for release in a major event? Or is strain being released via slip over a diffuse system of faults? A careful analysis of seismicity patterns in the Sombrero region has the potential to both identify faults and modes of failure, provided the aggregation scheme is tuned to properly identify related events. To this end, we experimented with a scheme to identify seismic sequences based on physical and temporal proximity, under the assumptions that (a) events occur on related fault systems as stress is refocused by immediately previous events and (b) such 'stress waves' die out with time, so that two events that occur on the same system within a relatively short time window can be said to have a similar 'trigger' in ways that two nearby events that occurred years apart cannot. Patterns that emerge from the identification, temporal sequence, and refined locations of such sequences of events carry information about stress accommodation that is obscured by large clouds of
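
The proximity-based aggregation sketched in the abstract (events linked when close in both space and time, with linkage lapsing as events separate in time) can be prototyped as single-linkage clustering via union-find; the thresholds and names below are hypothetical, not the scheme actually used:

```python
import numpy as np

def link_sequences(t, x, y, t_max, d_max):
    """Group events into sequences by spatio-temporal proximity (union-find)."""
    n = len(t)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]           # path compression
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            close_in_time = abs(t[i] - t[j]) <= t_max
            close_in_space = np.hypot(x[i] - x[j], y[i] - y[j]) <= d_max
            if close_in_time and close_in_space:
                parent[find(i)] = find(j)           # merge the two sequences

    labels = np.array([find(i) for i in range(n)])
    return np.unique(labels, return_inverse=True)[1]  # renumber 0..k-1

# three tightly clustered events plus one distant, much later event
t = [0.0, 1.0, 2.0, 100.0]; x = [0.0, 1.0, 0.0, 50.0]; y = [0.0, 0.0, 1.0, 50.0]
labels = link_sequences(t, x, y, t_max=5.0, d_max=5.0)
```

Because linkage is transitive, an event joins a sequence through intermediaries even if it is not within the thresholds of the first event, which matches the idea of stress being refocused by immediately previous events rather than by the sequence's origin.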

  10. Income and Consumption Smoothing among US States

    DEFF Research Database (Denmark)

    Sørensen, Bent; Yosha, Oved

    within regions but not between regions. This suggests that capital markets transcend regional barriers while credit markets are regional in their nature. Smoothing within the club of rich states is accomplished mainly via capital markets whereas consumption smoothing is dominant within the club of poor...... states. The fraction of a shock to gross state products smoothed by the federal tax-transfer system is the same for various regions and other clubs of states. We calculate the scope for consumption smoothing within various regions and clubs, finding that most gains from risk sharing can be achieved...

  11. Seismic Creep, USA Images

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Seismic creep is the constant or periodic movement on a fault as contrasted with the sudden rupture associated with an earthquake. It is a usually slow deformation...

  12. BUILDING 341 Seismic Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Halle, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-06-15

    The Seismic Evaluation of Building 341 located at Lawrence Livermore National Laboratory in Livermore, California has been completed. The subject building consists of a main building, Increment 1, and two smaller additions; Increments 2 and 3.

  13. Seismic data acquisition systems

    International Nuclear Information System (INIS)

    Kolvankar, V.G.; Nadre, V.N.; Rao, D.S.

    1989-01-01

    Details of seismic data acquisition systems developed at the Bhabha Atomic Research Centre, Bombay are reported. The seismic signals acquired belong to different signal bandwidths in the band from 0.02 Hz to 250 Hz. All these acquisition systems are built around a unique technique of recording multichannel data on to a single track of an audio tape and in digital form. Techniques of how these signals in different bands of frequencies were acquired and recorded are described. Method of detecting seismic signals and its performance is also discussed. Seismic signals acquired in different set-ups are illustrated. Time indexing systems for different set-ups and multichannel waveform display systems which form essential part of the data acquisition systems are also discussed. (author). 13 refs., 6 figs., 1 tab

  14. PSMG switchgear seismic analysis

    International Nuclear Information System (INIS)

    Kuehster, C.J.

    1977-01-01

    LOFT primary coolant system motor generator (PSMG) switchgear boxes were analyzed for sliding and overturning during a seismic event. Boxes are located in TAN-650, Room B-239, with the PSMG generators. Both boxes are sufficiently anchored to the floor

  15. Seismic facies; Facies sismicas

    Energy Technology Data Exchange (ETDEWEB)

    Johann, Paulo Roberto Schroeder [PETROBRAS, Rio de Janeiro, RJ (Brazil). Exploracao e Producao Corporativo. Gerencia de Reservas e Reservatorios]. E-mail: johann@petrobras.com.br

    2004-11-01

    The method presented herein describes seismic facies as representations of curves and vertical matrixes of lithotype proportions. Seismic facies are of great interest for capturing the spatial (3D) distributions of regionalized variables, such as lithotypes, sedimentary facies groups, and/or porosity and/or other reservoir properties, and for integrating them into 3D geological modeling (Johann, 1997). Thus, when interpreted as curves or vertical matrixes of proportions, seismic facies provide a very important tool for the structural analysis of regionalized variables. The matrixes have an important application in geostatistical modeling. In addition, this approach provides results at the depth and scale of the well profiles; that is, seismic data are integrated into the characterization of reservoirs in depth maps and in high-resolution maps. The classification of seismic trace segments into groups of predefined traces is described herein following two approaches: (a) unsupervised and (b) supervised by the geological knowledge available on the studied reservoir. The multivariate statistical methods used to obtain the maps of the seismic facies units are interesting tools for providing a lithostratigraphic and petrophysical understanding of a petroleum reservoir. In the case studied, these seismic facies units are interpreted as representative of the depositional system as part of the Namorado Turbiditic System, Namorado Field, Campos Basin. Within the scope of PRAVAP 19 (Programa Estrategico de Recuperacao Avancada de Petroleo - Strategic Program of Advanced Petroleum Recovery), research work on algorithms is underway to select new optimized attributes for seismic facies classification. One example is the extraction of attributes based on the wavelet transform and on time-frequency analysis methodology. PRAVAP is also carrying out research work on an

  16. Seismic migration in generalized coordinates

    Science.gov (United States)

    Arias, C.; Duque, L. F.

    2017-06-01

    Reverse time migration (RTM) is a technique widely used nowadays to obtain images of the earth’s sub-surface, using artificially produced seismic waves. This technique has been developed for zones with a flat surface, and when applied to zones with rugged topography some corrections must be introduced in order to adapt it. This can produce defects in the final image, called artifacts. We introduce a simple mathematical map that transforms a scenario with rugged topography into a flat one. The three steps of the RTM can then be applied in a way similar to the conventional ones, just by changing the Laplacian in the acoustic wave equation for a generalized one. We present a test of this technique using the Canadian foothills SEG velocity model.

  17. Seismic migration in generalized coordinates

    International Nuclear Information System (INIS)

    Arias, C.; Duque, L. F.

    2017-01-01

    Reverse time migration (RTM) is a technique widely used nowadays to obtain images of the earth’s sub-surface, using artificially produced seismic waves. This technique has been developed for zones with a flat surface, and when applied to zones with rugged topography some corrections must be introduced in order to adapt it. This can produce defects in the final image, called artifacts. We introduce a simple mathematical map that transforms a scenario with rugged topography into a flat one. The three steps of the RTM can then be applied in a way similar to the conventional ones, just by changing the Laplacian in the acoustic wave equation for a generalized one. We present a test of this technique using the Canadian foothills SEG velocity model. (paper)

  18. Seismic Consequence Abstraction

    International Nuclear Information System (INIS)

    Gross, M.

    2004-01-01

    The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in ''Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274])

  19. Seismic Consequence Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    M. Gross

    2004-10-25

    The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in ''Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274]).

  20. Smooth horizons and quantum ripples

    International Nuclear Information System (INIS)

    Golovnev, Alexey

    2015-01-01

    Black holes are unique objects which allow for meaningful theoretical studies of strong gravity and even quantum gravity effects. An infalling and a distant observer would have very different views on the structure of the world. However, a careful analysis has shown that it entails no genuine contradictions for physics, and the paradigm of observer complementarity has been coined. Recently this picture was put into doubt. In particular, it was argued that in old black holes a firewall must form in order to protect the basic principles of quantum mechanics. This AMPS paradox has already been discussed in a vast number of papers with different attitudes and conclusions. Here we want to argue that a possible source of confusion is the neglect of quantum gravity effects. Contrary to widespread perception, it does not necessarily mean that effective field theory is inapplicable in rather smooth neighbourhoods of large black hole horizons. The real offender might be an attempt to consistently use it over the huge distances from the near-horizon zone of old black holes to the early radiation. We give simple estimates to support this viewpoint and show how the Page time and (somewhat more speculative) scrambling time do appear. (orig.)

  1. Smooth horizons and quantum ripples

    Energy Technology Data Exchange (ETDEWEB)

    Golovnev, Alexey [Saint Petersburg State University, High Energy Physics Department, Saint-Petersburg (Russian Federation)

    2015-05-15

    Black holes are unique objects which allow for meaningful theoretical studies of strong gravity and even quantum gravity effects. An infalling and a distant observer would have very different views on the structure of the world. However, a careful analysis has shown that it entails no genuine contradictions for physics, and the paradigm of observer complementarity has been coined. Recently this picture was put into doubt. In particular, it was argued that in old black holes a firewall must form in order to protect the basic principles of quantum mechanics. This AMPS paradox has already been discussed in a vast number of papers with different attitudes and conclusions. Here we want to argue that a possible source of confusion is the neglect of quantum gravity effects. Contrary to widespread perception, it does not necessarily mean that effective field theory is inapplicable in rather smooth neighbourhoods of large black hole horizons. The real offender might be an attempt to consistently use it over the huge distances from the near-horizon zone of old black holes to the early radiation. We give simple estimates to support this viewpoint and show how the Page time and (somewhat more speculative) scrambling time do appear. (orig.)

  2. Local Transfer Coefficient, Smooth Channel

    Directory of Open Access Journals (Sweden)

    R. T. Kukreja

    1998-01-01

    Naphthalene sublimation technique and the heat/mass transfer analogy are used to determine the detailed local heat/mass transfer distributions on the leading and trailing walls of a two-pass square channel with smooth walls that rotates about a perpendicular axis. Since the variation of density is small in the flow through the channel, the buoyancy effect is negligible. Results show that, in both the stationary and rotating channel cases, very large spanwise variations of the mass transfer exist in the turn and in the region immediately downstream of the turn in the second straight pass. In the first straight pass, the rotation-induced Coriolis forces reduce the mass transfer on the leading wall and increase the mass transfer on the trailing wall. In the turn, rotation significantly increases the mass transfer on the leading wall, especially in the upstream half of the turn. Rotation also increases the mass transfer on the trailing wall, more in the downstream half of the turn than in the upstream half. Immediately downstream of the turn, rotation causes the mass transfer to be much higher on the trailing wall near the downstream corner of the tip of the inner wall than on the opposite leading wall. The mass transfer in the second pass is higher on the leading wall than on the trailing wall. A slower flow causes higher mass transfer enhancement in the turn on both the leading and trailing walls.

  3. Seismic scrammability of HTTR control rods

    International Nuclear Information System (INIS)

    Nishiguchi, I.; Iyoku, T.; Ito, N.; Watanabe, Y.; Araki, T.; Katagiri, S.

    1990-01-01

    Scrammability tests on HTTR (High-Temperature Engineering Test Reactor) control rods under seismic conditions have been carried out, and the influence of seismic conditions on scram time as well as on functional integrity was examined. A control rod drive, located in a stand-pipe at the top of the reactor vessel, raises and lowers a pair of control rods by suspension cables. Each flexible control rod consists of 10 neutron absorber sections held together by a metal spine passing through the center. It falls into a hole in the graphite blocks under gravity at scram. In the tests, a full-scale control rod drive and a pair of control rods were employed with a column of graphite blocks in which holes for the rods were formed. Block misalignment and contact with the hole surface during earthquakes were considered the major causes of disturbance in scram time. Therefore, the following parameters were varied in the tests: excitation direction, combination of horizontal and vertical excitation, acceleration, frequency and block-to-block gaps. The main results obtained from the tests are as follows. 1) Every scram time obtained under the design conditions was within 6 seconds, compared with 5.2 seconds when there was no vibration. Therefore, it was concluded that the seismic effects on scram time were not significant. 2) Scram time became longer with increases in both acceleration and horizontal excitation frequency, and the control rods fell very smoothly without any jerkiness. This suggests that collision between the control rods and the hole surface is the main factor disturbing the falling motion. 3) The mechanical and functional integrity of the control rod drive mechanism, control rods and graphite blocks was confirmed after 140 seismic scrammability tests. (author). 10 figs, 1 tab

  4. Next-generation probabilistic seismicity forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Hiemer, S.

    2014-07-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at the present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to prevent catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, the reliable estimation of the spatial and size distribution of seismicity is of crucial importance when constructing such probabilistic forecasts. We therefore focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of these current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults, using Californian and European data. Our model is independent of biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events. 
We propose a
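The two core ingredients described above, kernel smoothing of epicenter locations into a continuous rate density and a Gutenberg-Richter b-value estimated from abundant small events, can be sketched as follows. This is a minimal illustration with a fixed (non-adaptive) Gaussian bandwidth and Aki's maximum-likelihood b-value estimator; the bandwidth, grid, and variable names are hypothetical, not those of the thesis.

```python
import numpy as np

def kernel_rate_density(epicenters, grid_x, grid_y, bandwidth_km=10.0):
    """Smooth discrete epicenters (x, y in km) into a continuous spatial
    density with an isotropic Gaussian kernel of fixed bandwidth.
    (Adaptive methods instead choose the bandwidth from the data.)"""
    gx, gy = np.meshgrid(grid_x, grid_y, indexing="ij")
    density = np.zeros_like(gx, dtype=float)
    for ex, ey in epicenters:
        r2 = (gx - ex) ** 2 + (gy - ey) ** 2
        density += np.exp(-r2 / (2.0 * bandwidth_km ** 2))
    # normalize each kernel so it integrates to one event
    return density / (2.0 * np.pi * bandwidth_km ** 2)

def b_value_mle(magnitudes, m_c):
    """Aki's maximum-likelihood b-value for magnitudes above completeness m_c."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - m_c)
```

The density returned is in events per km² per catalog duration; dividing by the catalog length and multiplying by the forecast horizon converts it to an expected-rate map.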

  5. Next-generation probabilistic seismicity forecasting

    International Nuclear Information System (INIS)

    Hiemer, S.

    2014-01-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at the present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to prevent catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, the reliable estimation of the spatial and size distribution of seismicity is of crucial importance when constructing such probabilistic forecasts. We therefore focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of these current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults, using Californian and European data. Our model is independent of biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events. 
We propose a

  6. Smoothed Analysis of Local Search Algorithms

    NARCIS (Netherlands)

    Manthey, Bodo; Dehne, Frank; Sack, Jörg-Rüdiger; Stege, Ulrike

    2015-01-01

    Smoothed analysis is a method for analyzing the performance of algorithms for which classical worst-case analysis fails to explain the performance observed in practice. Smoothed analysis has been applied to explain the performance of a variety of algorithms in recent years. One particular class of

  7. Assessment of smoothed spectra using autocorrelation function

    International Nuclear Information System (INIS)

    Urbanski, P.; Kowalska, E.

    2006-01-01

    Recently, data and signal smoothing have become almost standard procedures in spectrometric and chromatographic methods. In radiometry, the main purpose of applying smoothing is to minimise statistical fluctuations while avoiding distortion. The aim of this work was to find a qualitative parameter that could be used as a figure of merit for detecting distortion of smoothed spectra, based on the linear model. It is assumed that as long as the part of the raw spectrum removed by the smoothing procedure (v_s) is of random nature, the smoothed spectrum can be considered undistorted. Thanks to this feature of the autocorrelation function, drifts of the mean value in the removed noise v_s as well as its periodicity can be more easily detected from the autocorrelogram than from the original data.
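The distortion check described above can be illustrated by computing the autocorrelogram of the removed part v_s = raw − smoothed. The boxcar smoother and lag range below are hypothetical choices made only to keep the sketch runnable; they are not the authors' procedure.

```python
import numpy as np

def moving_average(y, window=5):
    """Simple boxcar smoother (an illustrative stand-in for any smoother)."""
    return np.convolve(y, np.ones(window) / window, mode="same")

def residual_autocorr(raw, smoothed, max_lag=20):
    """Normalized autocorrelogram of the removed part v_s = raw - smoothed.
    If v_s is purely random, the autocorrelogram stays near zero for
    lags > 0; persistent structure there hints at distortion."""
    v = raw - smoothed
    v = v - v.mean()
    denom = float(np.dot(v, v))
    return np.array([float(np.dot(v[: len(v) - k], v[k:])) / denom
                     for k in range(max_lag + 1)])
```

Note that even an ideal smoother leaves slightly correlated residuals near the window scale, so in practice the autocorrelogram is judged against a tolerance rather than exact zero.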

  8. Mediators on human airway smooth muscle.

    Science.gov (United States)

    Armour, C; Johnson, P; Anticevich, S; Ammit, A; McKay, K; Hughes, M; Black, J

    1997-01-01

    1. Bronchial hyperresponsiveness in asthma may be due to several abnormalities, but must include alterations in the airway smooth muscle responsiveness and/or volume. 2. Increased responsiveness of airway smooth muscle in vitro can be induced by certain inflammatory cell products and by induction of sensitization (atopy). 3. Increased airway smooth muscle growth can also be induced by inflammatory cell products and atopic serum. 4. Mast cell numbers are increased in the airways of asthmatics and, in our studies, in airway smooth muscle that is sensitized and hyperresponsive. 5. We propose that there is a relationship between mast cells and airway smooth muscle cells which, once an allergic process has been initiated, results in the development of critical features in the lungs in asthma.

  9. Seismic isolation - efficient procedure for seismic response assessement

    International Nuclear Information System (INIS)

    Zamfir, M. A.; Androne, M.

    2016-01-01

    The aim of this analysis is to reduce the dynamic response of a structure. The seismic isolation solution must take into consideration the specific site ground motion. This paper presents results obtained by applying the seismic isolation method. Based on the obtained results, important conclusions can be outlined: the seismic isolation device has the ability to reduce the seismic acceleration of the isolated structure to values that no longer present a danger to people and the environment; the seismic isolation solution limits device deformations to safe values, ensuring the structural integrity and stability of the entire system; and effective seismic energy dissipation, with no side effects for either the seismically isolated building or the devices used, and return to the initial position occupied before the earthquake are obtained with acceptable permanent displacement. (authors)

  10. Delineation of seismic source zones based on seismicity parameters ...

    Indian Academy of Sciences (India)

    these source zones were evaluated and were used in the hazard evaluation. ... seismic sources, linear and areal, were considered in the present study to model the seismic sources in the ..... taken as an authentic reference manual for iden-.

  11. Seismic fragility analyses

    International Nuclear Information System (INIS)

    Kostov, Marin

    2000-01-01

    In the last two decades there has been an increasing number of probabilistic seismic risk assessments performed. The basic ideas of the procedure for performing a Probabilistic Safety Analysis (PSA) of critical structures (NUREG/CR-2300, 1983) can also be used for normal industrial and residential buildings, dams or other structures. The general formulation of the risk assessment procedure applied in this investigation is presented in Franzini, et al., 1984. The probability of failure of a structure for an expected lifetime (for example 50 years) can be obtained from the annual frequency of failure, β_E, determined by the relation: β_E = ∫ |dβ(x)/dx| P(f|x) dx. Here β(x) is the annual frequency of exceedance of load level x (for example, the variable x may be peak ground acceleration), and P(f|x) is the conditional probability of structure failure at a given seismic load level x. The problem leads to the assessment of the seismic hazard β(x) and the fragility P(f|x). The seismic hazard curves are obtained by probabilistic seismic hazard analysis. The fragility curves are obtained after the response of the structure is defined probabilistically and its capacity and the associated uncertainties are assessed. Finally, the fragility curves are combined with the seismic loading to estimate the frequency of failure for each critical scenario. The frequency of failure due to a seismic event is represented by the scenario with the highest frequency. The tools usually applied for probabilistic safety analyses of critical structures could relatively easily be adapted to ordinary structures. The key problems are the seismic hazard definition and the fragility analyses. The fragility can be derived either on the basis of scaling procedures or by direct generation. Both approaches are presented in the paper. After the seismic risk (in terms of failure probability) is assessed, there are several approaches for risk reduction. Generally the methods could be classified in two groups. The
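The risk convolution β_E = ∫ |dβ(x)/dx| P(f|x) dx can be evaluated numerically once a hazard curve and a fragility curve are specified. In the sketch below the power-law hazard and the lognormal fragility parameters are hypothetical, chosen only to make the example runnable; they are not taken from the paper.

```python
import numpy as np
from math import erf, sqrt

def lognormal_cdf(x, median, beta):
    """P(f|x): lognormal fragility with median capacity and log-std beta."""
    z = np.log(np.atleast_1d(x).astype(float) / median) / beta
    return np.array([0.5 * (1.0 + erf(v / sqrt(2.0))) for v in z])

def annual_failure_frequency(x, hazard, median, beta):
    """beta_E = integral of |d beta(x)/dx| * P(f|x) dx (trapezoid rule).
    `hazard` is the annual frequency of exceedance beta(x) on the grid x."""
    slope = np.abs(np.gradient(hazard, x))   # hazard decreases with x
    integrand = slope * lognormal_cdf(x, median, beta)
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(x)))

# hypothetical power-law hazard curve (x = peak ground acceleration, g)
x = np.linspace(0.01, 2.0, 2000)
hazard = 1e-4 * (x / 0.1) ** -2.0
beta_E = annual_failure_frequency(x, hazard, median=0.6, beta=0.4)
```

The resulting β_E is small because failures require the rare overlap of high load (low hazard frequency) and low capacity (low fragility probability).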

  12. The persistent signature of tropical cyclones in ambient seismic noise

    KAUST Repository

    Gualtieri, Lucia; Camargo, Suzana J.; Pascale, Salvatore; Pons, Flavio M.E.; Ekström, Göran

    2017-01-01

    The spectrum of ambient seismic noise shows strong signals associated with tropical cyclones, yet a detailed understanding of these signals and the relationship between them and the storms is currently lacking. Through the analysis of more than a decade of seismic data recorded at several stations located in and adjacent to the northwest Pacific Ocean, here we show that there is a persistent and frequency-dependent signature of tropical cyclones in ambient seismic noise that depends on characteristics of the storm and on the detailed location of the station relative to the storm. An adaptive statistical model shows that the spectral amplitude of ambient seismic noise, and notably of the short-period secondary microseisms, has a strong relationship with tropical cyclone intensity and can be employed to extract information on the tropical cyclones.

  13. The persistent signature of tropical cyclones in ambient seismic noise

    KAUST Repository

    Gualtieri, Lucia

    2017-12-28

    The spectrum of ambient seismic noise shows strong signals associated with tropical cyclones, yet a detailed understanding of these signals and the relationship between them and the storms is currently lacking. Through the analysis of more than a decade of seismic data recorded at several stations located in and adjacent to the northwest Pacific Ocean, here we show that there is a persistent and frequency-dependent signature of tropical cyclones in ambient seismic noise that depends on characteristics of the storm and on the detailed location of the station relative to the storm. An adaptive statistical model shows that the spectral amplitude of ambient seismic noise, and notably of the short-period secondary microseisms, has a strong relationship with tropical cyclone intensity and can be employed to extract information on the tropical cyclones.

  14. Seismic forecast using geostatistics

    International Nuclear Information System (INIS)

    Grecu, Valeriu; Mateiciuc, Doru

    2007-01-01

    The main idea of this research direction consists in a special way of constructing a new type of mathematical function as a correlation between a computed statistical quantity and another physical quantity. This type of function, called a 'position function', was taken up by the authors of this study in the field of seismology with the hope of solving, at least partially, the difficult problem of seismic forecasting. The geostatistical method of analysis focuses on the process of energy accumulation in a given seismic area, completing this analysis by a so-called loading function. This function, in fact a temporal function, describes the process of energy accumulation during a seismic cycle in a given seismic area. It was possible to discover a law of evolution of the seismic cycles that was materialized in a so-called characteristic function. This special function helps forecast the magnitude and the occurrence moment of the largest earthquake in the analysed area. Since 2000, the authors have moved to a new stage of testing, real-time analysis, in order to verify the quality of the method. Five forecasts of large earthquakes were made. (authors)

  15. Pickering seismic safety margin

    International Nuclear Information System (INIS)

    Ghobarah, A.; Heidebrecht, A.C.; Tso, W.K.

    1992-06-01

    A study was conducted to recommend a methodology for the seismic safety margin review of existing Canadian CANDU nuclear generating stations such as Pickering A. The purpose of the seismic safety margin review is to determine whether the nuclear plant has sufficient seismic safety margin over its design basis to assure plant safety. In this review process, it is possible to identify the weak links which might limit the seismic performance of critical structures, systems and components. The proposed methodology is a modification of the EPRI (Electric Power Research Institute) approach. The methodology includes: the characterization of the site margin earthquake, the definition of the performance criteria for the elements of a success path, and the determination of the seismic withstand capacity. It is proposed that the margin earthquake be established on the basis of historical records and regional seismo-tectonic and site-specific evaluations. The ability of the components and systems to withstand the margin earthquake is determined by database comparisons, inspection, analysis or testing. An implementation plan for the application of the methodology to the Pickering A NGS is prepared.

  16. Seismicity and seismic monitoring in the Asse salt mine

    International Nuclear Information System (INIS)

    Flach, D.; Gommlich, G.; Hente, B.

    1987-01-01

    Seismicity analyses are made in order to assess the safety of candidate sites for ultimate disposal of hazardous wastes. This report reviews the seismicity history of the Asse salt mine and presents recent results of a measuring campaign made in the area. The monitoring network installed at the site supplies data and information on the regional seismicity, on seismic amplitudes under ground and above ground, and on microseismic activities. (DG)

  17. Smooth halos in the cosmic web

    International Nuclear Information System (INIS)

    Gaite, José

    2015-01-01

    Dark matter halos can be defined as smooth distributions of dark matter placed in a non-smooth cosmic web structure. This definition of halos demands a precise definition of smoothness and a characterization of the manner in which the transition from smooth halos to the cosmic web takes place. We introduce entropic measures of smoothness, related to measures of inequality previously used in economy and with the advantage of being connected with standard methods of multifractal analysis already used for characterizing the cosmic web structure in cold dark matter N-body simulations. These entropic measures provide us with a quantitative description of the transition from the small scales portrayed as a distribution of halos to the larger scales portrayed as a cosmic web and, therefore, allow us to assign definite sizes to halos. However, these ''smoothness sizes'' have no direct relation to the virial radii. Finally, we discuss the influence of N-body discreteness parameters on smoothness.

  18. Smooth halos in the cosmic web

    Energy Technology Data Exchange (ETDEWEB)

    Gaite, José, E-mail: jose.gaite@upm.es [Physics Dept., ETSIAE, IDR, Universidad Politécnica de Madrid, Pza. Cardenal Cisneros 3, E-28040 Madrid (Spain)

    2015-04-01

    Dark matter halos can be defined as smooth distributions of dark matter placed in a non-smooth cosmic web structure. This definition of halos demands a precise definition of smoothness and a characterization of the manner in which the transition from smooth halos to the cosmic web takes place. We introduce entropic measures of smoothness, related to measures of inequality previously used in economy and with the advantage of being connected with standard methods of multifractal analysis already used for characterizing the cosmic web structure in cold dark matter N-body simulations. These entropic measures provide us with a quantitative description of the transition from the small scales portrayed as a distribution of halos to the larger scales portrayed as a cosmic web and, therefore, allow us to assign definite sizes to halos. However, these ''smoothness sizes'' have no direct relation to the virial radii. Finally, we discuss the influence of N-body discreteness parameters on smoothness.
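One simple inequality measure of the kind the abstract alludes to is the Gini coefficient of a density field: 0 for a perfectly smooth (uniform) field and approaching 1 for total concentration. This is only an illustrative stand-in for the paper's entropic measures, not the measures the authors actually introduce.

```python
import numpy as np

def gini(density):
    """Gini coefficient of a non-negative density field: 0 for a perfectly
    smooth (uniform) field, approaching 1 for total concentration."""
    x = np.sort(np.ravel(density).astype(float))  # sort cell masses ascending
    n = x.size
    cum = np.cumsum(x)                            # cumulative (Lorenz) mass
    return 1.0 + 1.0 / n - 2.0 * np.sum(cum) / (n * cum[-1])
```

Applied at increasing smoothing scales, such a statistic drops toward zero once the field looks like a smooth halo rather than a web of concentrations.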

  19. Delineation of seismic source zones based on seismicity parameters ...

    Indian Academy of Sciences (India)

    In the present study, an attempt has been made to delineate seismic source zones in the study area (south India) based on the seismicity parameters. Seismicity parameters and the maximum probable earthquake for these source zones were evaluated and were used in the hazard evaluation. The probabilistic evaluation of ...

  20. Seismic Microzonation for Refinement of Seismic Load Parameters

    Energy Technology Data Exchange (ETDEWEB)

    Savich, A. I.; Bugaevskii, A. G., E-mail: office@geodyn.ru, E-mail: bugaevskiy@geodyn.ru [Center of the Office of Geodynamic Observations in the Power Sector, an affiliate of JSC “Institut Gidroproekt” (Russian Federation)

    2016-05-15

    Functional dependencies are established for the characteristics of seismic transients recorded at various points of a studied site, which are used to propose a new approach to seismic microzonation (SMZ) that enables the creation of new SMZ maps of strong seismic motion, with due regard for dynamic parameters of recorded transients during weak earthquakes.

  1. Seismic acceleration map expected for Japanese central region

    International Nuclear Information System (INIS)

    Sugiyama, Takeshi; Maeda, Kouji; Ishii, Kiyoshi; Suzuki, Makoto.

    1990-01-01

    Since electric generating and supplying facilities are scattered over large areas, a seismic acceleration map, which defines the anticipated earthquake ground motions in a broad region, is very useful information for the design of those facilities against large earthquakes. This paper describes the development of a seismic acceleration map for the Central Japanese Region by incorporating analytical results based on historical earthquake records and active fault data using probability and statistics. In the region, several destructive earthquakes have occurred: the Anseitokai (1854, M = 8.4) and Tohnankai (1944, M = 7.9) earthquakes along the Nankai trough, and the Nohbi (1891, M = 8.0) and Fukui (1948, M = 7.1) earthquakes in inland areas. Some of the historical earthquake data were obtained by instruments over the last one hundred years, whereas others come from literary descriptions covering nearly 1,000 years. The active fault data have been collected mainly from surveys of fault topography and geology, and are considered to indicate the average seismic activity over the past million years. A proposed seismic acceleration map for a return period of 75 years, calculated on the free surface of the base stratum, was estimated in the following way. The analytical result based on the historical earthquake records was adopted as the main basis, because the Japanese seismic design criteria have been developed from such records. The proposed seismic acceleration map was then revised by including the result based on the active fault data for areas where historical earthquake records are lacking, and the result was smoothed to evaluate the final seismic acceleration map. (author)
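For a map keyed to a 75-year return period, the corresponding probability of exceedance during a facility's exposure time follows from a Poisson occurrence assumption (an assumption of this sketch, not stated in the abstract):

```python
import math

def exceedance_probability(return_period_yr, exposure_yr):
    """Probability of at least one exceedance in exposure_yr, assuming
    Poisson occurrence with mean rate 1 / return_period_yr."""
    return 1.0 - math.exp(-exposure_yr / return_period_yr)

p = exceedance_probability(75.0, 50.0)
```

For example, ground motion with a 75-year return period has roughly a 49% chance of being exceeded at least once during a 50-year exposure.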

  2. Induced seismicity. Final report

    International Nuclear Information System (INIS)

    Segall, P.

    1997-01-01

    The objective of this project has been to develop a fundamental understanding of seismicity associated with energy production. Earthquakes are known to be associated with oil, gas, and geothermal energy production. The intent is to develop physical models that predict when seismicity is likely to occur, and to determine to what extent these earthquakes can be used to infer conditions within energy reservoirs. Early work focused on earthquakes induced by oil and gas extraction. Recently completed research has addressed earthquakes within geothermal fields, such as The Geysers in northern California, as well as the interactions of dilatancy, friction, and shear heating in the generation of earthquakes. The former has involved modeling thermo- and poro-elastic effects of geothermal production and water injection. Global Positioning System (GPS) receivers are used to measure deformation associated with geothermal activity, and these measurements, along with seismic data, are used to test and constrain thermo-mechanical models.

  3. Experimental investigation of smoothing by spectral dispersion

    International Nuclear Information System (INIS)

    Regan, Sean P.; Marozas, John A.; Kelly, John H.; Boehly, Thomas R.; Donaldson, William R.; Jaanimagi, Paul A.; Keck, Robert L.; Kessler, Terrance J.; Meyerhofer, David D.; Seka, Wolf

    2000-01-01

    Measurements of smoothing rates for smoothing by spectral dispersion (SSD) of high-power, solid-state laser beams used for inertial confinement fusion (ICF) research are reported. Smoothing rates were obtained from the intensity distributions of equivalent target plane images for laser pulses of varying duration. Simulations of the experimental data with the known properties of the phase plates and the frequency modulators are in good agreement with the experimental data. These results inspire confidence in extrapolating to higher bandwidths and other SSD configurations that may be suitable for ICF experiments and ultimately for direct-drive laser-fusion ignition. (c) 2000 Optical Society of America

  4. Bifurcations of non-smooth systems

    Science.gov (United States)

    Angulo, Fabiola; Olivar, Gerard; Osorio, Gustavo A.; Escobar, Carlos M.; Ferreira, Jocirei D.; Redondo, Johan M.

    2012-12-01

    Non-smooth systems (namely piecewise-smooth systems) have received much attention in the last decade. Many contributions in this area show that theory and applications (to electronic circuits, mechanical systems, …) are relevant to problems in science and engineering. In particular, new bifurcations have been reported in the literature, and this was the topic of this minisymposium. Thus both bifurcation theory and its applications were included. Several contributions from different fields show that non-smooth bifurcations are a hot topic in research; in this paper the reader can find contributions from electronics, energy markets and population dynamics. Also, a carefully written, special-purpose algebraic software tool is presented.

  5. Quake warnings, seismic culture

    Science.gov (United States)

    Allen, Richard M.; Cochran, Elizabeth S.; Huggins, Tom; Miles, Scott; Otegui, Diego

    2017-01-01

    Since 1990, nearly one million people have died from the impacts of earthquakes. Reducing those impacts requires building a local seismic culture in which residents are aware of earthquake risks and value efforts to mitigate harm. Such efforts include earthquake early warning (EEW) systems that provide seconds to minutes of notice of impending shaking. Recent events in Mexico provide an opportunity to assess the performance and perception of an EEW system and highlight areas for further improvement. We have learned that EEW systems, even imperfect ones, can help people prepare for earthquakes and build local seismic culture, both beneficial in reducing earthquake-related losses.

  6. Induced Seismicity Monitoring System

    Science.gov (United States)

    Taylor, S. R.; Jarpe, S.; Harben, P.

    2014-12-01

    There are many seismological aspects associated with monitoring of permanent storage of carbon dioxide (CO2) in geologic formations. Many of these include monitoring underground gas migration through detailed tomographic studies of rock properties, integrity of the cap rock and microseismicity with time. These types of studies require expensive deployments of surface and borehole sensors in the vicinity of the CO2 injection wells. Another problem that may exist in CO2 sequestration fields is the potential for damaging induced seismicity associated with fluid injection into the geologic reservoir. Seismic hazard monitoring in CO2 sequestration fields requires a seismic network over a spatially larger region, possibly having stations in remote settings. Expensive observatory-grade seismic systems are not necessary for seismic hazard deployments or small-scale tomographic studies. Hazard monitoring requires accurate location of induced seismicity to magnitude levels only slightly below those that can be felt at the surface (e.g. magnitude 1), and the frequencies of interest for tomographic analysis are ~1 Hz and greater. We have developed a seismo/acoustic smart sensor system that can achieve the goals necessary for induced seismicity monitoring in CO2 sequestration fields. The unit is inexpensive, lightweight, easy to deploy, can operate remotely under harsh conditions and features 9 channels of recording (currently a 3C 4.5 Hz geophone, MEMS accelerometer and microphone). An on-board processor allows for satellite transmission of parameter data to a processing center. Continuous or event-detected data are kept on two removable flash SD cards of up to 64+ Gbytes each. If available, data can be transmitted via cell phone modem or picked up via site visits. Low power consumption allows for autonomous operation using only a 10 watt solar panel and a gel-cell battery. 
The system has been successfully tested for long-term (> 6 months) remote operations over a wide range
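The record mentions event-detected recording. A common way to detect events on a continuous stream is a short-term/long-term average (STA/LTA) energy trigger; the sketch below is a generic illustration in Python, not the algorithm used in this particular system, and the window lengths, threshold, and test signal are made-up values:

```python
import math

def sta_lta(samples, n_sta, n_lta):
    """Short-term/long-term average ratio of signal energy.
    Returns a list the same length as `samples`; entries before the
    first full LTA window are 0.0."""
    ratios = [0.0] * len(samples)
    for i in range(n_lta - 1, len(samples)):
        sta = sum(s * s for s in samples[i - n_sta + 1 : i + 1]) / n_sta
        lta = sum(s * s for s in samples[i - n_lta + 1 : i + 1]) / n_lta
        ratios[i] = sta / lta if lta > 0 else 0.0
    return ratios

# Quiet background with a burst: the ratio spikes at the burst.
noise = [0.01 * math.sin(0.7 * i) for i in range(200)]
noise[150:160] = [1.0] * 10          # simulated event
r = sta_lta(noise, n_sta=5, n_lta=50)
triggered = max(r[150:165]) > 3.0
```

In practice a trigger like this decides which windows get written to the SD cards or parameterized for satellite transmission.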

  7. Probabilistic seismic vulnerability and risk assessment of stone masonry structures

    Science.gov (United States)

    Abo El Ezz, Ahmad

conducting rapid vulnerability assessment of stone masonry buildings. With modification of input structural parameters, it can be adapted and applied to any other building class. A sensitivity analysis of the seismic vulnerability modelling is conducted to quantify the uncertainties associated with each of the input parameters. The proposed methodology was validated for a scenario-based seismic risk assessment of existing buildings in Old Quebec City. The procedure for hazard-compatible vulnerability modelling was used to develop seismic fragility functions in terms of spectral acceleration representative of the inventoried buildings. A total of 1220 buildings were considered. The assessment was performed for a scenario event of magnitude 6.2 at a distance of 15 km with a probability of exceedance of 2% in 50 years. The study showed that most of the expected damage is concentrated in the old brick and stone masonry buildings.
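Fragility functions of the kind mentioned above are commonly modelled as lognormal curves in spectral acceleration. A minimal sketch, with an illustrative, made-up median capacity and dispersion rather than the study's fitted values:

```python
import math

def fragility(sa, theta, beta):
    """Lognormal fragility: probability of reaching or exceeding a damage
    state given spectral acceleration `sa` (g), median capacity `theta` (g)
    and lognormal dispersion `beta`."""
    return 0.5 * (1.0 + math.erf(math.log(sa / theta) / (beta * math.sqrt(2.0))))

# Hypothetical parameters for one masonry damage state:
# at the median capacity the exceedance probability is exactly 0.5.
p = fragility(sa=0.30, theta=0.30, beta=0.6)
```

Evaluating the curve at the scenario's spectral acceleration for each building class and multiplying by the building count gives the expected damage distribution.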

  8. Seismic microzonation of Bangalore, India

    Indian Academy of Sciences (India)

Evaluation of seismic hazards and microzonation of cities enable us to characterize the potential seismic areas which have similar exposures to hazards of earthquakes, and these results can be used for designing new structures or retrofitting the existing ones. Study of seismic hazard and preparation of microzonation ...

  9. Seismic and dynamic qualification methods

    International Nuclear Information System (INIS)

    Lin, C.W.

    1985-01-01

    This book presents the papers given at a conference on seismic effects on nuclear power plants. Topics considered at the conference included seismic qualification of equipment, multifrequency test methodologies, damping in piping systems, the amplification factor, thermal insulation, welded joints, and response factors for seismic risk analysis of piping

  10. Smooth surfaces from rational bilinear patches

    KAUST Repository

    Shi, Ling

    2014-01-01

Smooth freeform skins from simple panels constitute a challenging topic arising in contemporary architecture. We contribute to this problem area by showing how to approximate a negatively curved surface by smoothly joined rational bilinear patches. The approximation problem is solved with the help of a new computational approach to the hyperbolic nets of Huhnen-Venedey and Rörig and optimization algorithms based on it. We also discuss its limits, which lie in the topology of the input surface. Finally, freeform deformations based on Darboux transformations are used to generate smooth surfaces from smoothly joined Darboux cyclide patches; in this way we eliminate the restriction to surfaces with negative Gaussian curvature. © 2013 Elsevier B.V.

  11. Smooth embeddings with Stein surface images

    OpenAIRE

    Gompf, Robert E.

    2011-01-01

    A simple characterization is given of open subsets of a complex surface that smoothly perturb to Stein open subsets. As applications, complex 2-space C^2 contains domains of holomorphy (Stein open subsets) that are exotic R^4's, and others homotopy equivalent to the 2-sphere but cut out by smooth, compact 3-manifolds. Pseudoconvex embeddings of Brieskorn spheres and other 3-manifolds into complex surfaces are constructed, as are pseudoconcave holomorphic fillings (with disagreeing contact and...

  12. Some splines produced by smooth interpolation

    Czech Academy of Sciences Publication Activity Database

    Segeth, Karel

    2018-01-01

    Roč. 319, 15 February (2018), s. 387-394 ISSN 0096-3003 R&D Projects: GA ČR GA14-02067S Institutional support: RVO:67985840 Keywords : smooth data approximation * smooth data interpolation * cubic spline Subject RIV: BA - General Mathematics OBOR OECD: Applied mathematics Impact factor: 1.738, year: 2016 http://www.sciencedirect.com/science/article/pii/S0096300317302746?via%3Dihub

  14. Optimal Smooth Consumption and Annuity Design

    DEFF Research Database (Denmark)

    Bruhn, Kenneth; Steffensen, Mogens

    2013-01-01

We propose an optimization criterion that yields extraordinary consumption smoothing compared to the well-known results of the life-cycle model. Under this criterion we solve the related consumption and investment optimization problem faced by individuals with preferences for intertemporal stability in consumption. We find that the consumption and investment patterns demanded under the optimization criterion are in general offered as annuity benefits from products in the class of ‘Formula Based Smoothed Investment-Linked Annuities’.

  15. The seismic analyzer: interpreting and illustrating 2D seismic data.

    Science.gov (United States)

    Patel, Daniel; Giertsen, Christopher; Thurmond, John; Gjelberg, John; Gröller, M Eduard

    2008-01-01

We present a toolbox for quickly interpreting and illustrating 2D slices of seismic volumetric reflection data. Searching for oil and gas involves creating a structural overview of seismic reflection data to identify hydrocarbon reservoirs. We improve the search for seismic structures by precalculating the horizon structures of the seismic data prior to interpretation. We improve the annotation of seismic structures by applying novel illustrative rendering algorithms tailored to seismic data, such as deformed texturing and line and texture transfer functions. The illustrative rendering results in multi-attribute and scale-invariant visualizations where features are represented clearly in both highly zoomed-in and zoomed-out views. Thumbnail views in combination with interactive appearance control allow for a quick overview of the data before detailed interpretation takes place. These techniques help reduce the work of seismic illustrators and interpreters.

  16. Adaptive metric kernel regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    2000-01-01

Kernel smoothing is a widely used non-parametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this contribution, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms...

  17. Adaptive Metric Kernel Regression

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

Kernel smoothing is a widely used nonparametric pattern recognition technique. By nature, it suffers from the curse of dimensionality and is usually difficult to apply to high input dimensions. In this paper, we propose an algorithm that adapts the input metric used in multivariate regression by minimising a cross-validation estimate of the generalisation error. This allows one to automatically adjust the importance of different dimensions. The improvement in terms of modelling performance is illustrated on a variable selection task where the adaptive metric kernel clearly outperforms the standard...
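The kernel regression being adapted in these two records is, in its basic form, the Nadaraya-Watson estimator. The sketch below shows a Gaussian kernel with a fixed diagonal input metric (per-dimension scale factors); the papers tune such a metric by cross-validation, which is omitted here, and the data are synthetic:

```python
import math

def nw_predict(x, X, y, scales):
    """Nadaraya-Watson kernel regression with a diagonal input metric:
    each dimension d is scaled by scales[d] before the Gaussian kernel,
    so irrelevant dimensions can be down-weighted (a hand-set stand-in
    for adapting the metric by cross-validation)."""
    weights = []
    for xi in X:
        d2 = sum((scales[d] * (x[d] - xi[d])) ** 2 for d in range(len(x)))
        weights.append(math.exp(-0.5 * d2))
    s = sum(weights)
    return sum(w * yi for w, yi in zip(weights, y)) / s

# Target depends only on the first input; the second is irrelevant.
X = [(i / 10.0, (7 * i % 10) / 10.0) for i in range(11)]
y = [math.sin(x1) for x1, _ in X]
# Zeroing out the irrelevant dimension's scale gives a good local fit.
good = nw_predict((0.55, 0.5), X, y, scales=(8.0, 0.0))
```

In the papers the scale factors themselves are learned by minimising a cross-validation estimate of the generalisation error rather than being set by hand.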

  18. Relays undergo seismic tests

    International Nuclear Information System (INIS)

    Burton, J.C.

    1977-01-01

Utilities are required by the Nuclear Regulatory Commission to document that seismic vibration will not adversely affect critical electrical equipment. Seismic testing should be designed to determine the malfunction level (fragility testing). Input possibilities include a continuous sine, a decaying sine, a sine beat, random vibrations, and combinations of random vibrations and sine beat. The sine beat most accurately simulates a seismic event. Test frequencies have a broad range in order to accommodate a variety of relay types and cabinet mountings. Simulation of motion along three axes offers several options, but is best achieved by three in-phase single-axis vibration machines, which are less likely to induce testing fatigue failure. Consensus on what constitutes relay failure favors a maximum two-microsecond discontinuity. Performance tests should be conducted for at least two of the following: (1) nonoperating modes, (2) operating modes, or (3) the transition between the two modes, with the monitoring mode documented for all three. Results should specify a capability curve of maximum safe seismic acceleration and a graph plotting acceleration against sine-beat frequency.
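A sine beat, as described above, is a carrier sine amplitude-modulated by a half-sine envelope spanning a fixed number of carrier cycles. A generic waveform generator (the parameter values below are illustrative, not test-standard values):

```python
import math

def sine_beat(freq_hz, cycles_per_beat, fs, n_beats=1, pause_cycles=0):
    """Sine-beat test waveform: a carrier at freq_hz amplitude-modulated
    by a half-sine envelope spanning `cycles_per_beat` carrier cycles,
    optionally repeated with a pause between beats."""
    beat_len = int(fs * cycles_per_beat / freq_hz)
    samples = []
    for _ in range(n_beats):
        for n in range(beat_len):
            t = n / fs
            env = math.sin(math.pi * t * freq_hz / cycles_per_beat)
            samples.append(env * math.sin(2.0 * math.pi * freq_hz * t))
        samples.extend([0.0] * int(fs * pause_cycles / freq_hz))
    return samples

# One beat: 5 Hz carrier, 10 cycles per beat, sampled at 1 kHz.
w = sine_beat(freq_hz=5.0, cycles_per_beat=10, fs=1000)
```

Sweeping `freq_hz` over the test band and recording the maximum safe acceleration at each frequency yields the capability curve the record describes.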

  19. Mobile seismic exploration

    Energy Technology Data Exchange (ETDEWEB)

    Dräbenstedt, A., E-mail: a.draebenstedt@polytec.de, E-mail: rembe@iei.tu-clausthal.de, E-mail: ulrich.polom@liag-hannover.de; Seyfried, V. [Research & Development, Polytec GmbH, Waldbronn (Germany); Cao, X.; Rembe, C., E-mail: a.draebenstedt@polytec.de, E-mail: rembe@iei.tu-clausthal.de, E-mail: ulrich.polom@liag-hannover.de [Institute of Electrical Information Technology, TU Clausthal, Clausthal-Zellerfeld (Germany); Polom, U., E-mail: a.draebenstedt@polytec.de, E-mail: rembe@iei.tu-clausthal.de, E-mail: ulrich.polom@liag-hannover.de [Leibniz Institute of Applied Geophysics, Hannover (Germany); Pätzold, F.; Hecker, P. [Institute of Flight Guidance, TU Braunschweig, Braunschweig (Germany); Zeller, T. [Clausthaler Umwelttechnik Institut CUTEC, Clausthal-Zellerfeld (Germany)

    2016-06-28

Laser-Doppler-Vibrometry (LDV) is an established technique to measure vibrations in technical systems with picometer vibration-amplitude resolution. Especially good sensitivity and resolution can be achieved at an infrared wavelength of 1550 nm. High-resolution vibration measurements are possible over more than 100 m distance. This advancement of the LDV technique enables new applications. The detection of seismic waves is an application which has not been investigated so far, because seismic waves outside laboratory scales are usually analyzed at low frequencies between approximately 1 Hz and 250 Hz and require velocity resolutions in the range below 1 nm/s/√Hz. Thermal displacements and air turbulence have critical influences on LDV measurements in this low-frequency range, leading to noise levels of several 100 nm/√Hz. Commonly, seismic waves are measured with highly sensitive inertial sensors (geophones or Micro-Electro-Mechanical Sensors (MEMS)). Developing a laser geophone based on the LDV technique is the topic of this paper. We have assembled an actively vibration-isolated optical table in a minivan which provides a hole in its underbody. The laser beam of an infrared LDV assembled on the optical table impinges on the ground below the car through the hole. A reference geophone detected remaining vibrations on the table. We present the results from the first successful experimental demonstration of contactless detection of seismic waves from a movable vehicle with an LDV as laser geophone.

  20. Understanding induced seismicity

    NARCIS (Netherlands)

    Elsworth, Derek; Spiers, Christopher J.|info:eu-repo/dai/nl/304829323; Niemeijer, Andre R.|info:eu-repo/dai/nl/370832132

    2016-01-01

    Fluid injection–induced seismicity has become increasingly widespread in oil- and gas-producing areas of the United States (1–3) and western Canada. It has shelved deep geothermal energy projects in Switzerland and the United States (4), and its effects are especially acute in Oklahoma, where

  1. Ground motion input in seismic evaluation studies

    International Nuclear Information System (INIS)

    Sewell, R.T.; Wu, S.C.

    1996-07-01

This report documents research pertaining to conservatism and variability in seismic risk estimates. Specifically, it examines whether or not artificial motions produce unrealistic evaluation demands, i.e., demands significantly inconsistent with those expected from real earthquake motions. To study these issues, two types of artificial motions are considered: (a) motions with smooth response spectra, and (b) motions with realistic variations in spectral amplitude across vibration frequency. For both types of artificial motion, time histories are generated to match target spectral shapes. For comparison, empirical motions representative of those that might result from strong earthquakes in the Eastern U.S. are also considered. The study findings suggest that artificial motions resulting from typical simulation approaches (aimed at matching a given target spectrum) are generally adequate and appropriate in representing the peak-response demands that may be induced in linear structures and equipment responding to real earthquake motions. Also, given similar input Fourier energies at high frequencies, levels of input Fourier energy at low frequencies observed for artificial motions are substantially similar to those levels noted in real earthquake motions. In addition, the study reveals specific problems resulting from the application of Western U.S. type motions for seismic evaluation of Eastern U.S. nuclear power plants.
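Matching a target spectrum presupposes computing response spectra: the peak response of damped single-degree-of-freedom oscillators over a range of natural frequencies. A minimal sketch of one spectral ordinate via explicit time stepping (a generic textbook scheme on a synthetic motion, not the report's code):

```python
import math

def sdof_peak_disp(accel, dt, freq_hz, damping=0.05):
    """Peak relative displacement of a damped SDOF oscillator driven at
    its base, by central-difference time stepping of
    u'' + 2*z*wn*u' + wn^2*u = -ag(t). Repeating this over a range of
    frequencies traces out a displacement response spectrum."""
    wn = 2.0 * math.pi * freq_hz
    u_prev, u = 0.0, 0.0
    peak = 0.0
    for ag in accel:
        v = (u - u_prev) / dt                       # backward-difference velocity
        a = -ag - 2.0 * damping * wn * v - wn * wn * u
        u_next = 2.0 * u - u_prev + a * dt * dt
        u_prev, u = u, u_next
        peak = max(peak, abs(u))
    return peak

# Resonance check: a 2 Hz sine base motion excites a 2 Hz oscillator
# far more than a 10 Hz one.
dt = 0.001
motion = [math.sin(2.0 * math.pi * 2.0 * i * dt) for i in range(5000)]
at_resonance = sdof_peak_disp(motion, dt, 2.0)
off_resonance = sdof_peak_disp(motion, dt, 10.0)
```

Artificial motions are iteratively adjusted until the spectrum so computed envelopes the smooth (or realistically variable) target.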

  2. Effect of Phase Transformations on Seismic Velocities

    Science.gov (United States)

    Weidner, D. J.; Li, L.; Whitaker, M.; Triplett, R.

    2017-12-01

The radial velocity structure of the Earth consists of smooth variations of velocities with depth punctuated by abrupt changes of velocity, which are typically due to multivariant phase transformations, where high- and low-pressure phases can coexist. In this mixed-phase region, both the effective shear and bulk moduli will be significantly reduced by the dynamic interaction of the propagating wave and the phase transition if the period of the wave is long enough relative to the kinetic time that some of the transition can take place. In this presentation, we will give examples from both laboratory studies of phase transitions of Earth minerals and the calculated velocity profile based on our models. We focus on understanding the time-limiting factor of the phase transformation in order to extrapolate laboratory results to Earth observations. Both the olivine to ringwoodite transition and KLB-1 partial melting are explored. We find that when the transformation requires diffusion, the kinetics are often slowed down considerably, and as a result the diffusivity of atoms becomes the limiting factor of the characteristic time. Specifically, the Fe-Mg exchange rate in the olivine-ringwoodite phase transition becomes the limiting factor that seismic waves are likely to sample. On the other hand, partial melting is an extremely fast phase transformation at seismic wave periods. We present evidence that ultrasonic waves, with a period of a few tens of nanoseconds, are slowed by the reduction of the effective elastic moduli in this case.
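The frequency dependence described here is qualitatively like that of a standard linear solid: waves with periods short relative to the characteristic (kinetic) time see the unrelaxed modulus, while long-period waves see the relaxed one. A toy stand-in for the authors' model, with made-up modulus values:

```python
import math

def effective_modulus(m_unrelaxed, m_relaxed, tau, period):
    """Magnitude of the complex modulus of a standard-linear-solid
    (Zener) model with a single relaxation time tau: waves much shorter
    than tau see the unrelaxed (fast) modulus, much longer waves the
    relaxed one, with dissipation in between."""
    w = 2.0 * math.pi / period
    wt = w * tau
    re = (m_relaxed + m_unrelaxed * wt * wt) / (1.0 + wt * wt)
    im = (m_unrelaxed - m_relaxed) * wt / (1.0 + wt * wt)
    return math.hypot(re, im)

# Illustrative numbers (GPa): long-period waves sample the reduced modulus.
slow_wave = effective_modulus(100.0, 60.0, tau=1.0, period=1000.0)
fast_wave = effective_modulus(100.0, 60.0, tau=1.0, period=0.001)
```

With a diffusion-limited kinetic time, seismic periods fall on the relaxed side for fast transitions (partial melting) and on the unrelaxed side for slow, Fe-Mg-exchange-limited ones.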

  3. High Voltage Seismic Generator

    Science.gov (United States)

    Bogacz, Adrian; Pala, Damian; Knafel, Marcin

    2015-04-01

This contribution describes the preliminary result of annual cooperation of three student research groups from AGH UST in Krakow, Poland. The aim of this cooperation was to develop and construct a high voltage seismic wave generator. The constructed device uses a high-energy electrical discharge to generate a seismic wave in the ground. This type of device can be applied in several different methods of seismic measurement, but because of its limited power it is mainly dedicated to engineering geophysics. The source operates on basic physical principles. The energy is stored in a capacitor bank, which is charged by a two-stage low-to-high voltage converter. The stored energy is then released in a very short time through a high voltage thyristor into a spark gap. The whole appliance is powered from a li-ion battery and controlled by an ATmega microcontroller. It is possible to construct a larger and more powerful device. In this contribution the structure of the device with technical specifications is presented. As part of the investigation a prototype was built and a series of experiments conducted. System parameters were measured, and on this basis the specification of elements for the final device was chosen. The first stage of the project was successful: it was possible to efficiently generate seismic waves with the constructed device. Then the field test was conducted. The spark gap was placed in a shallow borehole (0.5 m) filled with salt water. Geophones were placed on the ground in a straight line. A comparison of the signal registered with a hammer source and the sparker source was made. The results of the test measurements are presented and discussed. Analysis of the collected data shows that the characteristic of the generated seismic signal is very promising, thus confirming the possibility of practical application of the new high voltage generator. The biggest advantage of the presented device, after its signal characteristics, is its size, which is 0.5 x 0.25 x 0.2 m, and its weight of approximately 7 kg. These features, together with the small li-ion battery, makes
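The energy available per shot from the capacitor bank follows E = ½CV². A trivial check with hypothetical numbers, since the record does not state the actual capacitance or charging voltage:

```python
def stored_energy_j(capacitance_f, voltage_v):
    """Energy stored in a capacitor bank, E = 1/2 * C * V^2, in joules."""
    return 0.5 * capacitance_f * voltage_v ** 2

# Hypothetical sparker bank: 100 uF charged to 4 kV -> 800 J per discharge.
e = stored_energy_j(100e-6, 4000.0)
```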

  4. Romanian seismic network

    International Nuclear Information System (INIS)

    Ionescu, Constantin; Rizescu, Mihaela; Popa, Mihaela; Grigore, Adrian

    2000-01-01

The research in the field of seismology in Romania is mainly carried out by the National Institute for Earth Physics (NIEP). The NIEP activities are mainly concerned with fundamental research financed by research contracts from public sources and with the maintenance and operation of the Romanian seismic network. A three-stage seismic network is now operating under NIEP, designed mainly to monitor the Vrancea seismic region in a magnitude range from microearthquakes to strong events: - a network of 18 short-period seismometers (S13, Teledyne Geotech Instruments, Texas); - a network of 7 stations with local digital recording (PCM-5000) on magnetic tape, made up of an S13 geophone (T = 2 s) on the vertical component and SH1 geophones (T = 5 s) on the horizontal components; - a network of 28 SMA-1 accelerometers and 30 digital accelerometers (Kinemetrics K2) installed in free-field conditions in the framework of the joint German-Romanian cooperation program (CRC); the K2 instruments cover a magnitude range from 1.4 to 8.0. Since 1994, MLR (Muntele Rosu) station has been part of the GEOFON network and has been provided with high-performance broadband instruments. At the Bucharest and Timisoara data centers, an automated and networked seismological system performs on-line digital acquisition and processing of the telemetered data. Automatic processing includes discrimination between local and distant seismic events, earthquake location and magnitude computation, and source parameter determination for local earthquakes. The results are rapidly distributed via the Internet to several seismological services in Europe and the USA, to be used in the association/confirmation procedures. Plans for new developments of the network include the upgrade from analog to digital telemetry and new stations for monitoring local seismicity. (authors)

  5. Bayesian inversion of refraction seismic traveltime data

    Science.gov (United States)

    Ryberg, T.; Haberland, Ch

    2018-03-01

We apply a Bayesian Markov chain Monte Carlo (McMC) formalism to the inversion of refraction seismic traveltime data sets to derive 2-D velocity models below linear arrays (i.e. profiles) of sources and seismic receivers. Typical refraction data sets, especially when using the far-offset observations, are known to have experimental geometries which are very poor, highly ill-posed and far from ideal. As a consequence, the structural resolution quickly degrades with depth. Conventional inversion techniques, based on regularization, potentially suffer from the choice of appropriate inversion parameters (i.e. number and distribution of cells, starting velocity models, damping and smoothing constraints, data noise level, etc.) and only local model space exploration. McMC techniques are used for exhaustive sampling of the model space without the need for prior knowledge (or assumptions) of inversion parameters, resulting in a large number of models fitting the observations. Statistical analysis of these models allows us to derive an average (reference) solution and its standard deviation, thus providing uncertainty estimates of the inversion result. The highly non-linear character of the inversion problem, mainly caused by the experiment geometry, does not allow us to derive a reference solution and error map by a simple averaging procedure. We present a modified averaging technique, which excludes parts of the prior distribution in the posterior values due to poor ray coverage, thus providing reliable estimates of inversion model properties even in those parts of the models. The model is discretized by a set of Voronoi polygons (with constant slowness cells) or a triangulated mesh (with interpolation within the triangles). Forward traveltime calculations are performed by a fast, finite-difference-based eikonal solver. The method is applied to a data set from a refraction seismic survey from Northern Namibia and compared to conventional tomography. An inversion test
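The McMC machinery can be illustrated on a drastically reduced problem: a single-layer slowness s with traveltimes t = s·x and assumed Gaussian noise, sampled with a plain Metropolis rule. All numbers below are synthetic; the real inversion uses Voronoi/triangulated 2-D parameterizations and an eikonal forward solver:

```python
import math
import random

def metropolis_slowness(offsets, times, n_iter=20000, sigma=0.01, seed=1):
    """Toy Metropolis sampler for a single slowness s in t = s * x with
    Gaussian data noise of std `sigma`. Returns the chain of samples;
    its mean and std after burn-in give the estimate and uncertainty."""
    rng = random.Random(seed)
    def log_like(s):
        return -0.5 * sum((t - s * x) ** 2 for x, t in zip(offsets, times)) / sigma ** 2
    s = 0.5                               # deliberately poor start, s/km
    ll = log_like(s)
    samples = []
    for _ in range(n_iter):
        s_new = s + rng.gauss(0.0, 0.02)  # random-walk proposal
        ll_new = log_like(s_new) if s_new > 0 else -1e18
        if math.log(rng.random()) < ll_new - ll:
            s, ll = s_new, ll_new
        samples.append(s)
    return samples

offsets = [1.0, 2.0, 3.0, 4.0, 5.0]       # km
true_s = 0.25                             # s/km, i.e. 4 km/s
times = [true_s * x for x in offsets]
chain = metropolis_slowness(offsets, times)
est = sum(chain[5000:]) / len(chain[5000:])
```

The standard deviation of the post-burn-in chain plays the role of the error map in the 2-D case, with the caveat the record raises: where rays do not constrain a cell, that spread merely echoes the prior.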

  6. A new seismic station in Romania the Bucovina seismic array

    International Nuclear Information System (INIS)

    Grigore, Adrian; Grecu, Bogdan; Ionescu, Constantin; Ghica, Daniela; Popa, Mihaela; Rizescu, Mihaela

    2002-01-01

    Recently, a new seismic monitoring station, the Bucovina Seismic Array, has been established in the northern part of Romania, in a joint effort of the Air Force Technical Applications Center, USA, and the National Institute for Earth Physics, Romania. The array consists of 10 seismic sensors (9 short-period and one broad band) located in boreholes and distributed in a 5 x 5 km area. On July 24, 2002 the official Opening Ceremony of Bucovina Seismic Array took place in the area near the city of Campulung Moldovenesc in the presence of Romanian Prime Minister, Adrian Nastase. Starting with this date, the new seismic monitoring system became fully operational by continuous recording and transmitting data in real-time to the National Data Center of Romania, in Bucharest and to the National Data Center of USA, in Florida. Bucovina Seismic Array, added to the present Seismic Network, will provide much better seismic monitoring coverage of Romania's territory, on-scale recording for weak-to-strong events, and will contribute to advanced seismological studies on seismic hazard and risk, local effects and microzonation, seismic source physics, Earth structure. (authors)

  7. Non-parametric smoothing of experimental data

    International Nuclear Information System (INIS)

    Kuketayev, A.T.; Pen'kov, F.M.

    2007-01-01

Rapid processing of experimental data samples in nuclear physics often requires differentiation in order to find extrema. Therefore, even at the preliminary stage of data analysis, a range of noise reduction methods are used to smooth experimental data. There are many non-parametric smoothing techniques: interval averages, moving averages, exponential smoothing, etc. Nevertheless, it is more common to use a priori information about the behavior of the experimental curve in order to construct smoothing schemes based on least squares techniques. The latter methodology's advantage is that the area under the curve can be preserved, which is equivalent to conservation of the total counting rate. The disadvantages of this approach include the lack of a priori information. For example, very often the sums of peaks unresolved by a detector are replaced with one peak during data processing, introducing uncontrolled errors in the determination of the physical quantities. The problem is solvable only by having experienced personnel, whose skills must be much greater than the challenge. We propose a set of non-parametric techniques which allows the use of any additional information on the nature of the experimental dependence. The method is based on the construction of a functional which includes both experimental data and a priori information. The minimum of this functional is reached on a non-parametrically smoothed curve. Euler (Lagrange) differential equations are constructed for these curves; then their solutions are obtained analytically or numerically. The proposed approach allows for automated processing of nuclear physics data, eliminating the need for highly skilled laboratory personnel. The proposed approach also makes it possible to obtain smoothing curves within a given confidence interval, e.g. according to the χ² distribution. This approach is applicable when constructing smooth solutions of ill-posed problems, in particular when solving
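Two of the elementary non-parametric techniques named above, moving averages and exponential smoothing, can be sketched in a few lines; the data here are a synthetic spike, not experimental counts:

```python
def moving_average(xs, window):
    """Centered moving average with the window clamped at the ends."""
    half = window // 2
    out = []
    for i in range(len(xs)):
        lo, hi = max(0, i - half), min(len(xs), i + half + 1)
        out.append(sum(xs[lo:hi]) / (hi - lo))
    return out

def exp_smooth(xs, alpha):
    """Simple exponential smoothing: y[i] = alpha*x[i] + (1-alpha)*y[i-1]."""
    out = [xs[0]]
    for x in xs[1:]:
        out.append(alpha * x + (1.0 - alpha) * out[-1])
    return out

data = [0, 0, 0, 10, 0, 0, 0]   # a single noisy spike
ma = moving_average(data, 3)
es = exp_smooth(data, 0.3)
```

Both flatten the spike, which is exactly the failure mode the record warns about: without a priori information about peak shapes, smoothing can merge or suppress real features.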

  8. Dense-body aggregates as plastic structures supporting tension in smooth muscle cells.

    Science.gov (United States)

    Zhang, Jie; Herrera, Ana M; Paré, Peter D; Seow, Chun Y

    2010-11-01

    The wall of hollow organs of vertebrates is a unique structure able to generate active tension and maintain a nearly constant passive stiffness over a large volume range. These properties are predominantly attributable to the smooth muscle cells that line the organ wall. Although smooth muscle is known to possess plasticity (i.e., the ability to adapt to large changes in cell length through structural remodeling of contractile apparatus and cytoskeleton), the detailed structural basis for the plasticity is largely unknown. Dense bodies, one of the most prominent structures in smooth muscle cells, have been regarded as the anchoring sites for actin filaments, similar to the Z-disks in striated muscle. Here, we show that the dense bodies and intermediate filaments formed cable-like structures inside airway smooth muscle cells and were able to adjust the cable length according to cell length and tension. Stretching the muscle cell bundle in the relaxed state caused the cables to straighten, indicating that these intracellular structures were connected to the extracellular matrix and could support passive tension. These plastic structures may be responsible for the ability of smooth muscle to maintain a nearly constant tensile stiffness over a large length range. The finding suggests that the structural plasticity of hollow organs may originate from the dense-body cables within the smooth muscle cells.

  9. Effect of smoothing on robust chaos.

    Science.gov (United States)

    Deshpande, Amogh; Chen, Qingfei; Wang, Yan; Lai, Ying-Cheng; Do, Younghae

    2010-08-01

    In piecewise-smooth dynamical systems, situations can arise where the asymptotic attractors of the system in an open parameter interval are all chaotic (e.g., no periodic windows). This is the phenomenon of robust chaos. Previous works have established that robust chaos can occur through the mechanism of border-collision bifurcation, where border is the phase-space region where discontinuities in the derivatives of the dynamical equations occur. We investigate the effect of smoothing on robust chaos and find that periodic windows can arise when a small amount of smoothness is present. We introduce a parameter of smoothing and find that the measure of the periodic windows in the parameter space scales linearly with the parameter, regardless of the details of the smoothing function. Numerical support and a heuristic theory are provided to establish the scaling relation. Experimental evidence of periodic windows in a supposedly piecewise linear dynamical system, which has been implemented as an electronic circuit, is also provided.
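The smoothing-parameter idea can be illustrated on the tent map, a standard piecewise-linear example (not necessarily the circuit map studied in the paper): replacing the absolute value by a smooth approximation rounds off the corner, with eps controlling the amount of smoothing:

```python
import math

def tent_smooth(x, eps):
    """Tent map 1 - |2x - 1| with the corner at x = 0.5 rounded off:
    |u| is replaced by sqrt(u*u + eps*eps), so eps is the amount of
    smoothing and eps = 0 recovers the piecewise-linear map."""
    return 1.0 - math.sqrt((2.0 * x - 1.0) ** 2 + eps * eps)

# eps = 0: the original tent map; eps > 0: the peak is lowered to
# 1 - eps and the derivative is now continuous through x = 0.5.
peak_sharp = tent_smooth(0.5, 0.0)
peak_round = tent_smooth(0.5, 0.05)
```

In the paper's setting, it is this kind of small eps that lets periodic windows open up, with their measure in parameter space scaling linearly in eps.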

  10. TAX SMOOTHING: TESTS ON INDONESIAN DATA

    Directory of Open Access Journals (Sweden)

    Rudi Kurniawan

    2011-01-01

This paper contributes to the literature on public debt management by testing for tax smoothing behaviour in Indonesia. Tax smoothing means that the government smooths the tax rate across all future time periods to minimize the distortionary costs of taxation over time for a given path of government spending. In a stochastic economy with an incomplete bond market, tax smoothing implies that the tax rate approximates a random walk and changes in the tax rate are nearly unpredictable. For that purpose, two tests were performed. First, random walk behaviour of the tax rate was examined by undertaking unit root tests. The null hypothesis of a unit root cannot be rejected, indicating that the tax rate is nonstationary and, hence, follows a random walk. Second, the predictability of the tax rate was examined by regressing changes in the tax rate on its own lagged values and also on lagged values of changes in the government expenditure ratio and growth of real output. These are found to be not significant in predicting changes in the tax rate. Taken together, the present evidence seems to be consistent with tax smoothing, and therefore provides support for this theory.
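The second test described above amounts to regressing tax-rate changes on their own lags and checking that the slope is insignificant. A sketch on synthetic random-walk data (illustrative only, not the Indonesian series):

```python
import random

def ols_slope(x, y):
    """Slope of y on x by ordinary least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

# Under tax smoothing the tax rate is a random walk, so its changes are
# serially uncorrelated: regressing the change on its lag gives ~0 slope.
rng = random.Random(42)
tax = [0.2]
for _ in range(2000):
    tax.append(tax[-1] + rng.gauss(0.0, 0.005))
d = [b - a for a, b in zip(tax, tax[1:])]
slope = ols_slope(d[:-1], d[1:])
```

A full test would add standard errors (and the expenditure-ratio and output-growth regressors) to judge significance; the point here is only that a random walk yields a near-zero slope.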

  11. Comparison of seismic sources for shallow seismic: sledgehammer and pyrotechnics

    Directory of Open Access Journals (Sweden)

    Brom Aleksander

    2015-10-01

Pyrotechnic materials are a type of explosive material which produces thermal, luminous or sound effects, gas, smoke and combinations thereof as a result of a self-sustaining chemical reaction. Therefore, pyrotechnics can be used as a seismic source designed to release accumulated energy in the form of a seismic wave recorded by tremor sensors (geophones) after its passage through the rock mass. The aim of this paper was to determine the utility of pyrotechnics for shallow seismic engineering. The work compares a conventional method of seismic wave excitation for the seismic refraction method, a plate and hammer, with the activation of firecrackers on the surface. The energy released by the various sources and their frequency spectra were compared for the two types of sources. The results obtained did not determine which source gave better results, but they showed very interesting aspects of using pyrotechnics in seismic measurements, for example the use of pyrotechnic materials in MASW.
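Comparing the frequency spectra of two sources, as done here, starts from the amplitude spectrum of each record. A direct-DFT sketch on a synthetic signal (the field data themselves are not reproduced):

```python
import math

def amplitude_spectrum(samples, fs):
    """Single-sided DFT amplitude spectrum; returns (freqs, amps).
    Direct O(n^2) DFT - fine for short records."""
    n = len(samples)
    freqs, amps = [], []
    for k in range(n // 2 + 1):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = -sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        scale = (1 if k in (0, n // 2) else 2) / n
        freqs.append(k * fs / n)
        amps.append(math.hypot(re, im) * scale)
    return freqs, amps

# A pure 50 Hz tone sampled at 1 kHz shows a single spectral peak.
fs = 1000.0
sig = [math.sin(2 * math.pi * 50.0 * i / fs) for i in range(1000)]
freqs, amps = amplitude_spectrum(sig, fs)
peak_freq = freqs[amps.index(max(amps))]
```

Running the same analysis on a hammer record and a firecracker record and overlaying the two spectra is what reveals differences in bandwidth and dominant frequency between the sources.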

  12. Seismic detection of tornadoes

    Science.gov (United States)

    Tatom, F. B.

    1993-01-01

Tornadoes represent the most violent of all forms of atmospheric storms, each year resulting in hundreds of millions of dollars in property damage and approximately one hundred fatalities. In recent years, considerable success has been achieved in detecting tornadic storms by means of Doppler radar. However, radar systems cannot determine when a tornado is actually in contact with the ground, except possibly at extremely close range. At the present time, human observation is the only truly reliable way of knowing that a tornado is actually on the ground. However, considerable evidence exists indicating that a tornado in contact with the ground produces a significant seismic signal. If such signals are generated, the seismic detection and warning of an imminent tornado become a distinct possibility. 

  13. Seismic Safety Guide

    International Nuclear Information System (INIS)

    Eagling, D.G.

    1985-01-01

The Seismic Safety Guide provides facilities managers with practical guidelines for administering a comprehensive earthquake safety program. Most facilities managers, unfamiliar with earthquake engineering, tend to look for answers in techniques more sophisticated than required to solve the actual problems in earthquake safety. Often the approach to solving these problems is so academic, legalistic, and financially overwhelming that mitigation of actual seismic hazards simply does not get done in a timely, cost-effective way. The objective of the Guide is to provide practical advice about earthquake safety so that managers and engineers can get the job done without falling into common pitfalls, prolonged diagnosis, and unnecessary costs. It is comprehensive with respect to earthquakes in that it covers the most important aspects of natural hazards, site planning, rehabilitation of existing buildings, design of new facilities, operational safety, emergency planning, non-structural elements, lifelines, and risk management. 5 references

  14. Seismic analysis - what goal

    International Nuclear Information System (INIS)

    Tagart, S.W.

    1978-01-01

    The seismic analysis of nuclear components is today characterized by extensive engineering computer calculations performed to satisfy both component standard codes such as ASME III and federal regulations and guides. The current nuclear seismic design procedure has evolved in a fragmented fashion and continues to change its elements as improved technology leads to changing standards and guides. The dominant trend is a monotonic increase in overall conservatism with time, causing a similar trend in the costs of nuclear power plants. Ironically, the improvements in the state of the art are feeding a process which is eroding the very incentives that attracted us to nuclear power in the first place. This paper examines the cause of this process and suggests that what is needed is a realistic goal which appropriately addresses the overall uncertainty of the seismic design process. (Auth.)

  15. Seismic capacity of switchgear

    International Nuclear Information System (INIS)

    Bandyopadhyay, K.; Hofmayer, C.; Kassir, M.; Pepper, S.

    1989-01-01

    As part of a component fragility program sponsored by the USNRC, BNL has collected existing information on the seismic capacity of switchgear assemblies from major manufacturers. Existing seismic test data for both low- and medium-voltage switchgear assemblies have been evaluated and the generic results are presented in this paper. The failure modes are identified and the corresponding generic lower-bound capacity levels are established. The test response spectra have been used as a measure of the test vibration input. The results indicate that relays chatter at a very low input level at the base of the switchgear cabinet; this change of state of devices, including relays, has been observed. Breaker tripping occurs at a higher vibration level. Although structural failure of internal elements has been noticed, the overall switchgear cabinet structure withstands a high vibration level. 5 refs., 2 figs., 2 tabs

  16. Source of seismic signals

    Energy Technology Data Exchange (ETDEWEB)

    Frankovskii, B.A.; Khor' yakov, K.A.

    1980-08-30

    Patented is a source of seismic signals consisting of a shock generator with a basic low-voltage and an auxiliary high-voltage stator coil, a capacitive transformer and control switches. To increase the amplitude of signal excitation, a capacitor bank and an auxiliary commutator are introduced into the device, connected in parallel and in series into the circuit of the main low-voltage stator coil.

  17. Stutter seismic source

    Energy Technology Data Exchange (ETDEWEB)

    Gumma, W. H.; Hughes, D. R.; Zimmerman, N. S.

    1980-08-12

    An improved seismic prospecting system comprising the use of a closely spaced sequence of source initiations at essentially the same location to provide shorter objective-level wavelets than are obtainable with a single pulse. In a preferred form, three dynamite charges are detonated in the same or three closely spaced shot holes to generate a downward traveling wavelet having increased high frequency content and reduced content at a peak frequency determined by initial testing.
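
    The comb-filtering effect such a shot sequence produces is easy to reproduce numerically. The sketch below is purely illustrative (wavelet, spacing and sampling are invented, not taken from the patent): summing three copies of a 40 Hz Ricker wavelet delayed by 8 ms notches the spectrum near the wavelet's peak frequency while tripling the content near 1/tau = 125 Hz.

```python
import numpy as np

dt = 0.001                       # 1 ms sampling
t = (np.arange(200) - 100) * dt  # 0.2 s time axis
f0 = 40.0                        # hypothetical wavelet peak frequency (Hz)
single = (1 - 2 * (np.pi * f0 * t) ** 2) * np.exp(-((np.pi * f0 * t) ** 2))

# "Stutter" source: three identical initiations spaced by tau seconds.
tau = 0.008                      # 8 ms shot spacing (illustrative)
k = int(round(tau / dt))
stutter = np.zeros(len(single) + 2 * k)
for j in range(3):
    stutter[j * k:j * k + len(single)] += single

# The delayed sum acts as a comb filter on the amplitude spectrum:
# a notch at f = 1/(3*tau) ~ 41.7 Hz, a factor-3 boost at f = 1/tau = 125 Hz.
f = np.fft.rfftfreq(len(stutter), dt)
S1 = np.abs(np.fft.rfft(single, len(stutter)))
S3 = np.abs(np.fft.rfft(stutter))
```

    Choosing the spacing so that the notch falls on the single-pulse peak frequency is one way to read the patent's "reduced content at a peak frequency determined by initial testing".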

  18. Long Period Seismic Waves

    Science.gov (United States)

    1976-08-01

    Geofísica, TPHM, No. 5, p. 161. Vargas, Freddy (to be published in 1976): 1. Discriminación de eventos naturales y artificiales [Discrimination of natural and artificial events]; 2. Cálculo del riesgo sísmico [Calculation of seismic risk]. A measure of seismic risk, defined by the relative weight of maximum MM intensity at a given distance, population density, area geology, and attenuation of intensity with distance, is presented together with a map applying the theory to Bolivia.

  19. Oklahoma seismic network

    International Nuclear Information System (INIS)

    Luza, K.V.; Lawson, J.E. Jr.; Univ. of Oklahoma, Norman, OK

    1993-07-01

    The US Nuclear Regulatory Commission has established rigorous guidelines that must be adhered to before a permit to construct a nuclear-power plant is granted to an applicant. Local as well as regional seismicity and structural relationships play an integral role in the final design criteria for nuclear power plants. The existing historical record of seismicity is inadequate in a number of areas of the Midcontinent region because of the lack of instrumentation and (or) the sensitivity of the instruments deployed to monitor earthquake events. The Nemaha Uplift/Midcontinent Geophysical Anomaly is one of five principal areas east of the Rocky Mountain front that has a moderately high seismic-risk classification. The Nemaha uplift, which is common to the states of Oklahoma, Kansas, and Nebraska, is approximately 415 miles long and 12-14 miles wide. The Midcontinent Geophysical Anomaly extends southward from Minnesota across Iowa and the southeastern corner of Nebraska and probably terminates in central Kansas. A number of moderate-sized earthquakes--magnitude 5 or greater--have occurred along or west of the Nemaha uplift. The Oklahoma Geological Survey, in cooperation with the geological surveys of Kansas, Nebraska, and Iowa, conducted a 5-year investigation of the seismicity and tectonic relationships of the Nemaha uplift and associated geologic features in the Midcontinent. This investigation was intended to provide data to be used to design nuclear-power plants. However, the information is also being used to design better large-scale structures, such as dams and high-use buildings, and to provide the necessary data to evaluate earthquake-insurance rates in the Midcontinent

  20. Lyapunov exponents and smooth ergodic theory

    CERN Document Server

    Barreira, Luis

    2001-01-01

    This book is a systematic introduction to smooth ergodic theory. The topics discussed include the general (abstract) theory of Lyapunov exponents and its applications to the stability theory of differential equations, stable manifold theory, absolute continuity, and the ergodic theory of dynamical systems with nonzero Lyapunov exponents (including geodesic flows). The authors consider several non-trivial examples of dynamical systems with nonzero Lyapunov exponents to illustrate some basic methods and ideas of the theory. This book is self-contained. The reader needs a basic knowledge of real analysis, measure theory, differential equations, and topology. The authors present basic concepts of smooth ergodic theory and provide complete proofs of the main results. They also state some more advanced results to give readers a broader view of smooth ergodic theory. This volume may be used by those nonexperts who wish to become familiar with the field.

  1. Multiple predictor smoothing methods for sensitivity analysis

    International Nuclear Information System (INIS)

    Helton, Jon Craig; Storlie, Curtis B.

    2006-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
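
    The idea of scoring an input by how much output variance a nonparametric smooth explains can be sketched in a few lines. The code below is a minimal illustration, not the authors' implementation: a tricube-weighted local linear fit stands in for full LOESS, and the toy model is invented.

```python
import numpy as np

def loess(x, y, frac=0.3):
    """Local linear regression with tricube weights (a bare-bones LOESS)."""
    n = len(x)
    k = max(2, int(frac * n))
    yhat = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]                    # k nearest neighbours
        w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3
        sw = np.sqrt(w)                            # weighted least squares
        A = np.vstack([np.ones(k), x[idx]]).T
        beta, *_ = np.linalg.lstsq(A * sw[:, None], y[idx] * sw, rcond=None)
        yhat[i] = beta[0] + beta[1] * x[i]
    return yhat

def sensitivity(x, y):
    """Fraction of the output variance explained by the smooth of y on x."""
    return 1.0 - np.var(y - loess(x, y)) / np.var(y)

# Toy model: y depends nonlinearly on x1 and not at all on x2. A linear
# regression sees almost nothing (cos is even); the smoother sees x1 clearly.
rng = np.random.default_rng(0)
x1 = rng.uniform(-3, 3, 200)
x2 = rng.uniform(-3, 3, 200)
y = np.cos(x1) + 0.1 * rng.normal(size=200)
s1, s2 = sensitivity(x1, y), sensitivity(x2, y)  # s1 near 1, s2 near 0
```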

  2. Adsorption on smooth electrodes: A radiotracer study

    International Nuclear Information System (INIS)

    Rice-Jackson, L.M.

    1990-01-01

    Adsorption on solids is a complicated process and in most cases occurs as the early stage of other, more complicated processes, i.e. chemical reactions, electrooxidation, electroreduction. The research reported here combines an electroanalytical method, cyclic voltammetry, with the use of radio-labeled isotopes (soft beta emitters) to study adsorption processes at smooth electrodes. The in-situ radiotracer method is highly anion (molecule) specific and provides information on the structure and composition of the electric double layer. The emphasis of this research was on studying adsorption processes at smooth electrodes of copper, gold, and platinum. The application of the radiotracer method to these smooth surfaces has led to direct in-situ measurements from which surface coverage was determined, anions and molecules were identified, and weak interactions of adsorbates with the surface of the electrodes were readily monitored. 179 refs

  3. Multiple predictor smoothing methods for sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Helton, Jon Craig; Storlie, Curtis B.

    2006-08-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.

  4. Seismic contracts and agreements

    International Nuclear Information System (INIS)

    Cooper, N.M.; Krause, V.

    1999-01-01

    Some points to consider regarding management of seismic projects within the Canadian petroleum industry were reviewed. Seismic projects involve the integration of many services. This paper focused on user-provider relationships, the project planning process, competitive bid considerations, the types of agreement used for seismic work and their implications, and the impact that certain points of control may have on a company: (1) initial estimate versus actual cost, (2) liability, (3) safety and operational performance, and (4) quality of deliverables. The objective is to drive home the point that in today's environment, where companies are forming, merging, or collapsing on a weekly basis, chain of command and accountability are issues that can no longer be dealt with casually. Companies must form business relationships with service providers with a full knowledge of the benefits and liabilities of the style of relationship they choose. Diligent and proactive management tends to optimize cost, safety and liability issues, all of which have a bearing on the points of control available to the company

  5. Establishing seismic design criteria to achieve an acceptable seismic margin

    International Nuclear Information System (INIS)

    Kennedy, R.P.

    1997-01-01

    In order to develop a risk based seismic design criteria the following four issues must be addressed: (1) What target annual probability of seismic induced unacceptable performance is acceptable? (2). What minimum seismic margin is acceptable? (3) Given the decisions made under Issues 1 and 2, at what annual frequency of exceedance should the Safe Shutdown Earthquake ground motion be defined? (4) What seismic design criteria should be established to reasonably achieve the seismic margin defined under Issue 2? The first issue is purely a policy decision and is not addressed in this paper. Each of the other three issues are addressed. Issues 2 and 3 are integrally tied together so that a very large number of possible combinations of responses to these two issues can be used to achieve the target goal defined under Issue 1. Section 2 lays out a combined approach to these two issues and presents three potentially attractive combined resolutions of these two issues which reasonably achieves the target goal. The remainder of the paper discusses an approach which can be used to develop seismic design criteria aimed at achieving the desired seismic margin defined in resolution of Issue 2. Suggestions for revising existing seismic design criteria to more consistently achieve the desired seismic margin are presented

  6. Polarization beam smoothing for inertial confinement fusion

    International Nuclear Information System (INIS)

    Rothenberg, Joshua E.

    2000-01-01

    For both direct and indirect drive approaches to inertial confinement fusion (ICF) it is imperative to obtain the best possible drive beam uniformity. The approach chosen for the National Ignition Facility uses a random-phase plate to generate a speckle pattern with a precisely controlled envelope on target. A number of temporal smoothing techniques can then be employed to utilize bandwidth to rapidly change the speckle pattern, and thus average out the small-scale speckle structure. One technique which generally can supplement other smoothing methods is polarization smoothing (PS): the illumination of the target with two distinct and orthogonally polarized speckle patterns. Since these two polarizations do not interfere, the intensity patterns add incoherently, and the rms nonuniformity can be reduced by a factor of √2. A number of PS schemes are described and compared on the basis of the aggregate rms and the spatial spectrum of the focused illumination distribution. The √2 rms nonuniformity reduction of PS is present on an instantaneous basis and is, therefore, of particular interest for the suppression of laser plasma instabilities, which have a very rapid response time. When combining PS and temporal methods, such as smoothing by spectral dispersion (SSD), PS can reduce the rms of the temporally smoothed illumination by an additional factor of √2. However, it has generally been thought that in order to achieve this reduction of √2, the increased divergence of the beam from PS must exceed the divergence of SSD. It is also shown here that, over the time scales of interest to direct or indirect drive ICF, under some conditions PS can reduce the smoothed illumination rms by nearly √2 even when the PS divergence is much smaller than that of SSD. (c) 2000 American Institute of Physics
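
    The instantaneous √2 contrast reduction from adding two orthogonally polarized, mutually incoherent speckle patterns can be checked with a short Monte Carlo sketch (an illustration of the statistics only, not of any NIF optics):

```python
import numpy as np

rng = np.random.default_rng(1)

def speckle(n):
    """Fully developed speckle: intensity of a circular complex Gaussian field."""
    field = rng.normal(size=n) + 1j * rng.normal(size=n)
    return np.abs(field) ** 2

n = 200_000
i1, i2 = speckle(n), speckle(n)            # two orthogonal polarizations
c_single = i1.std() / i1.mean()            # speckle contrast of one pattern ~ 1
c_ps = (i1 + i2).std() / (i1 + i2).mean()  # incoherent sum ~ 1/sqrt(2)
```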

  7. Automated seismic waveform location using Multichannel Coherency Migration (MCM)-I. Theory

    Science.gov (United States)

    Shi, Peidong; Angus, Doug; Rost, Sebastian; Nowacki, Andy; Yuan, Sanyi

    2018-03-01

    With the proliferation of dense seismic networks sampling the full seismic wavefield, recorded seismic data volumes are getting bigger and automated analysis tools to locate seismic events are essential. Here, we propose a novel Multichannel Coherency Migration (MCM) method to locate earthquakes in continuous seismic data and reveal the location and origin time of seismic events directly from recorded waveforms. By continuously calculating the coherency between waveforms from different receiver pairs, MCM greatly expands the available information which can be used for event location. MCM does not require phase picking or phase identification, which allows fully automated waveform analysis. By migrating the coherency between waveforms, MCM leads to improved source energy focusing. We have tested and compared MCM to other migration-based methods in noise-free and noisy synthetic data. The tests and analysis show that MCM is noise resistant and can achieve more accurate results compared with other migration-based methods. MCM is able to suppress strong interference from other seismic sources occurring at a similar time and location. It can be used with arbitrary 3D velocity models and is able to obtain reasonable location results with smooth but inaccurate velocity models. MCM exhibits excellent location performance and can be easily parallelized giving it large potential to be developed as a real-time location method for very large datasets.
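
    The core of the approach, migrating inter-receiver waveform coherency over a grid of candidate sources, can be sketched in one dimension. Everything below (geometry, velocity, wavelet, noise level) is invented for illustration, and the origin time is held fixed for brevity, whereas the real MCM also scans origin time over a 3-D grid.

```python
import numpy as np

def mcm_locate(data, rec_x, cand_x, t0, v, dt, win):
    """For each candidate source location, window every trace at the
    predicted arrival time and stack the correlation coefficients of
    all receiver pairs; the true location maximizes the stack."""
    nrec = len(rec_x)
    scores = []
    for gx in cand_x:
        i0 = np.rint((t0 + np.abs(rec_x - gx) / v) / dt).astype(int)
        segs = np.stack([data[r, i:i + win] for r, i in enumerate(i0)])
        segs = segs - segs.mean(axis=1, keepdims=True)
        nrm = np.linalg.norm(segs, axis=1) + 1e-12
        scores.append(sum(segs[a] @ segs[b] / (nrm[a] * nrm[b])
                          for a in range(nrec) for b in range(a + 1, nrec)))
    return cand_x[int(np.argmax(scores))]

# Synthetic example: source at x = 3 km, origin time 0.2 s, v = 2 km/s.
dt, win = 0.01, 21
rec_x = np.array([0.0, 1.0, 2.0, 4.0, 5.0])
wavelet = np.hanning(win) * np.sin(np.linspace(0, 4 * np.pi, win))
rng = np.random.default_rng(2)
data = 0.1 * rng.normal(size=(5, 400))
for r, x in enumerate(rec_x):
    i = int(round((0.2 + abs(x - 3.0) / 2.0) / dt))
    data[r, i:i + win] += wavelet
best_x = mcm_locate(data, rec_x, np.arange(0.0, 5.5, 0.5),
                    t0=0.2, v=2.0, dt=dt, win=win)  # -> 3.0
```

    No phase picking is involved: a wrong candidate location misaligns the windows between receivers, so the pairwise correlations collapse toward zero.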

  8. Some properties of the smoothed Wigner function

    International Nuclear Information System (INIS)

    Soto, F.; Claverie, P.

    1981-01-01

    A modification of the Wigner function has recently been proposed which consists in smoothing it by convolution with a phase-space Gaussian function; this smoothed Wigner function is non-negative if the Gaussian parameters Δ and δ satisfy the condition Δδ > h/2π. We analyze in this paper the predictions of this modified Wigner function for the harmonic oscillator, for the anharmonic oscillator and finally for the hydrogen atom. We find agreement with experiment in the linear case, but for strongly nonlinear systems, such as the hydrogen atom, the results obtained are completely wrong. (orig.)
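
    The effect described above can be reproduced numerically for the first excited harmonic-oscillator state, whose Wigner function dips to -1/π at the origin. In the sketch below (units ħ = 1; the grid spacing is an illustrative choice), smoothing with the coherent-state widths σx = σp = √(1/2) yields the non-negative Husimi distribution.

```python
import numpy as np

def gauss_smooth(F, sigma, dx):
    """Separable convolution of a 2-D array with a normalized Gaussian."""
    m = int(4 * sigma / dx)
    t = np.arange(-m, m + 1) * dx
    g = np.exp(-t**2 / (2 * sigma**2))
    g /= g.sum()
    F = np.apply_along_axis(lambda r: np.convolve(r, g, mode="same"), 1, F)
    F = np.apply_along_axis(lambda c: np.convolve(c, g, mode="same"), 0, F)
    return F

dx = 0.05
x = np.arange(-6.0, 6.0 + dx / 2, dx)
X, P = np.meshgrid(x, x)

# Wigner function of the n = 1 harmonic-oscillator state (hbar = 1):
# normalized to 1, but negative near the origin, with minimum -1/pi.
W = (2 * (X**2 + P**2) - 1) * np.exp(-(X**2 + P**2)) / np.pi

# Smoothing with sigma_x = sigma_p = sqrt(1/2), the non-negativity
# threshold, gives (up to normalization) the Husimi distribution.
Ws = gauss_smooth(W, sigma=np.sqrt(0.5), dx=dx)
```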

  9. Cardiac, Skeletal, and smooth muscle mitochondrial respiration

    DEFF Research Database (Denmark)

    Park, Song-Young; Gifford, Jayson R; Andtbacka, Robert H I

    2014-01-01

    Cardiac, skeletal, and smooth muscle was harvested from a total of 22 subjects (53±6 yrs) and mitochondrial respiration assessed in permeabilized fibers. Complex I+II, state 3 respiration, an index of oxidative phosphorylation capacity, fell progressively from cardiac, skeletal, to smooth muscle (54±1; 39±4; 15±1 pmol•s(-1)•mg(-1)). When respiration rates were normalized by CS (respiration per mitochondrial content), oxidative phosphorylation capacity was no longer different between the three muscle types. Interestingly, Complex I state 2 normalized for CS activity, an index of non-phosphorylating respiration per mitochondrial content, increased progressively from cardiac, skeletal...

  10. Smooth massless limit of field theories

    International Nuclear Information System (INIS)

    Fronsdal, C.

    1980-01-01

    The massless limit of Fierz-Pauli field theories, describing fields with fixed mass and spin interacting with external sources, is examined. Results are obtained for spins, 1, 3/2, 2 and 3 using conventional models, and then for all half-integral spins in a relatively model-independent manner. It is found that the massless limit is smooth provided that the sources satisfy certain conditions. In the massless limit these conditions reduce to the conservation laws required by internal consistency of massless field theory. Smoothness simply requires that quantities that vanish in the massless case approach zero in a certain well-defined manner. (orig.)

  11. Seismic fragility capacity of equipment

    International Nuclear Information System (INIS)

    Iijima, Toru; Abe, Hiroshi; Suzuki, Kenichi

    2006-01-01

    Seismic probabilistic safety assessment (PSA) is an available method to evaluate residual risks of nuclear plants that are designed on definitive seismic conditions. From our preliminary seismic PSA analysis, horizontal shaft pumps are important components that have significant influences on the core damage frequency (CDF). An actual horizontal shaft pump and some kinds of elements were tested to evaluate realistic fragility capacities. Our test results showed that the realistic fragility capacity of a horizontal shaft pump would be at least four times as high as the current value, 1.6 × 9.8 m/s², used for our seismic PSA. We are going to incorporate the fragility capacity data that were obtained from those tests into our seismic PSA analysis, and we expect that the reliability of seismic PSA should increase. (author)

  12. Seismic hazard assessment of Iran

    Directory of Open Access Journals (Sweden)

    M. Ghafory-Ashtiany

    1999-06-01

    The development of the new seismic hazard map of Iran is based on probabilistic seismic hazard computation using historical earthquake data, geology, tectonics, fault activity and seismic source models in Iran. These maps have been prepared to indicate the earthquake hazard of Iran in the form of iso-acceleration contour lines and seismic hazard zoning, using current probabilistic procedures. They display the probabilistic estimates of Peak Ground Acceleration (PGA) for the return periods of 75 and 475 years. The maps have been divided into intervals of 0.25 degrees in both latitudinal and longitudinal directions to calculate the peak ground acceleration values at each grid point and draw the seismic hazard curves. The results presented in this study will provide the basis for the preparation of seismic risk maps, the estimation of earthquake insurance premiums, and the preliminary site evaluation of critical facilities.
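
    A single-source toy version of the probabilistic computation behind such maps can be sketched as follows. All numbers (activity rate, median ground motion, lognormal scatter) are invented for illustration; the actual maps integrate many sources and calibrated attenuation relations.

```python
import numpy as np
from math import erf, sqrt

def hazard_curve(pga, nu, median, beta):
    """Annual rate of exceeding each PGA level for one Poissonian source
    with lognormal ground-motion scatter (rate nu, log-std beta)."""
    z = (np.log(pga) - np.log(median)) / beta
    p_exceed = np.array([0.5 * (1.0 - erf(v / sqrt(2.0))) for v in z])
    return nu * p_exceed

def pga_for_return_period(pga, rates, T):
    """Invert the hazard curve: PGA exceeded on average once per T years."""
    return np.exp(np.interp(np.log(T), -np.log(rates), np.log(pga)))

pga = np.logspace(-2, 0.3, 200)   # 0.01 g to 2 g
rates = hazard_curve(pga, nu=0.2, median=0.1, beta=0.6)
a75 = pga_for_return_period(pga, rates, 75.0)
a475 = pga_for_return_period(pga, rates, 475.0)
```

    Evaluating the two return-period values on every 0.25-degree grid point, with the rate and attenuation parameters taken from the local source model, is what produces the iso-acceleration contours.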

  13. Seismic Prediction While Drilling (SPWD): Seismic exploration ahead of the drill bit using phased array sources

    Science.gov (United States)

    Jaksch, Katrin; Giese, Rüdiger; Kopf, Matthias

    2010-05-01

    In the case of drilling for deep reservoirs, prior exploration is indispensable. In recent years the focus has shifted to geological structures like thin layers or hydrothermal fault systems. Despite 2D or 3D surface seismics and borehole measurements like Vertical Seismic Profile (VSP) or Seismic While Drilling (SWD), these structures cannot always be resolved; the resolution worsens the deeper and smaller the sought-after structures are. Potential horizons like thin layers in oil exploration or fault zones usable for geothermal energy production could therefore be missed or not identified while drilling. A device to explore the geology with high resolution ahead of the drill bit, in the direction of drilling, would be of high importance. Such a device would allow the drilling path to be adjusted according to the actual geology and would minimize the exploration risk and hence the costs of drilling. Within the project SPWD a device for seismic exploration ahead of the drill bit is being developed. This device should allow seismic prediction of areas about 50 to 100 meters ahead of the drill bit with a resolution of one meter. At the GFZ a first prototype consisting of different units for seismic sources, receivers and data loggers has been designed and manufactured. Four standard magnetostrictive actuators serve as seismic sources and four 3-component geophones as receivers. Every unit, actuator or geophone, can be rotated in steps of 15° around the longitudinal axis of the prototype to test different measurement configurations. The SPWD prototype emits signal frequencies of about 500 up to 5000 Hz, significantly higher than in VSP and SWD. An increased radiation of seismic wave energy in the direction of the borehole axis allows imaging of the areas to be drilled. Therefore, every actuator must be controlled independently with regard to amplitude and phase of the source signal to

  14. Seismic Imager Space Telescope

    Science.gov (United States)

    Sidick, Erkin; Coste, Keith; Cunningham, J.; Sievers, Michael W.; Agnes, Gregory S.; Polanco, Otto R.; Green, Joseph J.; Cameron, Bruce A.; Redding, David C.; Avouac, Jean Philippe

    2012-01-01

    A concept has been developed for a geostationary seismic imager (GSI), a space telescope in geostationary orbit above the Pacific coast of the Americas that would provide movies of many large earthquakes occurring in the area from Southern Chile to Southern Alaska. The GSI movies would cover a field of view as long as 300 km, at a spatial resolution of 3 to 15 m and a temporal resolution of 1 to 2 Hz, which is sufficient for accurate measurement of surface displacements and photometric changes induced by seismic waves. Computer processing of the movie images would exploit these dynamic changes to accurately measure the rapidly evolving surface waves and surface ruptures as they happen. These measurements would provide key information to advance the understanding of the mechanisms governing earthquake ruptures, and the propagation and arrest of damaging seismic waves. GSI operational strategy is to react to earthquakes detected by ground seismometers, slewing the satellite to point at the epicenters of earthquakes above a certain magnitude. Some of these earthquakes will be foreshocks of larger earthquakes; these will be observed, as the spacecraft would have been pointed in the right direction. This strategy was tested against the historical record for the Pacific coast of the Americas, from 1973 until the present. Based on the seismicity recorded during this time period, a GSI mission with a lifetime of 10 years could have been in position to observe at least 13 (22 on average) earthquakes of magnitude larger than 6, and at least one (2 on average) earthquake of magnitude larger than 7. A GSI would provide data unprecedented in its extent and temporal and spatial resolution. It would provide this data for some of the world's most seismically active regions, and do so better and at a lower cost than could be done with ground-based instrumentation. A GSI would revolutionize the understanding of earthquake dynamics, perhaps leading ultimately to effective warning

  15. Smoothing-Based Relative Navigation and Coded Aperture Imaging

    Science.gov (United States)

    Saenz-Otero, Alvar; Liebe, Carl Christian; Hunter, Roger C.; Baker, Christopher

    2017-01-01

    This project will develop an efficient smoothing software for incremental estimation of the relative poses and velocities between multiple, small spacecraft in a formation, and a small, long-range depth sensor based on coded aperture imaging that is capable of identifying other spacecraft in the formation. The smoothing algorithm will obtain the maximum a posteriori estimate of the relative poses between the spacecraft by using all available sensor information in the spacecraft formation. This algorithm will be portable between different satellite platforms that possess different sensor suites and computational capabilities, and will be adaptable in the case that one or more satellites in the formation become inoperable. It will obtain a solution that will approach an exact solution, as opposed to one with the linearization approximation that is typical of filtering algorithms. Thus, the algorithms developed and demonstrated as part of this program will enhance the applicability of small spacecraft to multi-platform operations, such as precisely aligned constellations and fractionated satellite systems.

  16. Directional asymmetries in human smooth pursuit eye movements.

    Science.gov (United States)

    Ke, Sally R; Lam, Jessica; Pai, Dinesh K; Spering, Miriam

    2013-06-27

    Humans make smooth pursuit eye movements to bring the image of a moving object onto the fovea. Although pursuit accuracy is critical to prevent motion blur, the eye often falls behind the target. Previous studies suggest that pursuit accuracy differs between motion directions. Here, we systematically assess asymmetries in smooth pursuit. In experiment 1, binocular eye movements were recorded while observers (n = 20) tracked a small spot of light moving along one of four cardinal or diagonal axes across a featureless background. We analyzed pursuit latency, acceleration, peak velocity, gain, and catch-up saccade latency, number, and amplitude. In experiment 2 (n = 22), we examined the effects of spatial location and constrained stimulus motion within the upper or lower visual field. Pursuit was significantly faster (higher acceleration, peak velocity, and gain) and smoother (fewer and later catch-up saccades) in response to downward versus upward motion in both the upper and the lower visual fields. Pursuit was also more accurate and smoother in response to horizontal versus vertical motion. Our study is the first to report a consistent up-down asymmetry in human adults, regardless of visual field. Our findings suggest that pursuit asymmetries are adaptive responses to the requirements of the visual context: preferred motion directions (horizontal and downward) are more critical to our survival than nonpreferred ones.

  17. Seismic capacity of a reinforced concrete frame structure without seismic detailing and limited ductility seismic design in moderate seismicity

    International Nuclear Information System (INIS)

    Kim, J. K.; Kim, I. H.

    1999-01-01

    A four-story reinforced concrete frame building model is designed for gravity loads only. Static nonlinear pushover analyses are performed in two orthogonal horizontal directions. The overall capacity curves are converted into ADRS spectra and compared with demand spectra. At several points the deformed shape and the moment and shear distributions are calculated. Based on these results, a limited ductility seismic design concept is proposed as an alternative seismic design approach in moderate seismicity regions

  18. Seismic safety research program plan

    International Nuclear Information System (INIS)

    1987-05-01

    This document presents a plan for seismic research to be performed by the Structural and Seismic Engineering Branch in the Office of Nuclear Regulatory Research. The plan describes the regulatory needs and related research necessary to address the following issues: uncertainties in seismic hazard, earthquakes larger than the design basis, seismic vulnerabilities, shifts in building frequency, piping design, and the adequacy of current criteria and methods. In addition to presenting current and proposed research within the NRC, the plan discusses research sponsored by other domestic and foreign sources

  19. Seismic modelling of shallow coalfields

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, D.C. (University of Calgary, Calgary, Alberta (Canada). Dept. of Geology and Geophysics.)

    1987-01-01

    This study was undertaken in order to determine whether reflection seismic surveys can be used to map stratigraphic and structural detail of shallow Plains-type coal deposits. Two coalfields in central Alberta were used to examine and determine optimum acquisition parameters for reflection seismic surveys in such settings. The study was based on 1-D and 2-D numerical seismic modelling using sonic and density well logs to formulate a layered earth model. Additional objectives were to interpret the reflection seismic data in terms of geologic features in the study area, and to investigate the relationship between vertical resolution and field acquisition geometry. 27 refs., 41 figs.

  20. Risk based seismic design criteria

    International Nuclear Information System (INIS)

    Kennedy, R.P.

    1999-01-01

    In order to develop a risk based seismic design criteria the following four issues must be addressed: (1) What target annual probability of seismic induced unacceptable performance is acceptable? (2) What minimum seismic margin is acceptable? (3) Given the decisions made under Issues 1 and 2, at what annual frequency of exceedance should the safe-shutdown-earthquake (SSE) ground motion be defined? (4) What seismic design criteria should be established to reasonably achieve the seismic margin defined under Issue 2? The first issue is purely a policy decision and is not addressed in this paper. Each of the other three issues are addressed. Issues 2 and 3 are integrally tied together so that a very large number of possible combinations of responses to these two issues can be used to achieve the target goal defined under Issue 1. Section 2 lays out a combined approach to these two issues and presents three potentially attractive combined resolutions of these two issues which reasonably achieves the target goal. The remainder of the paper discusses an approach which can be used to develop seismic design criteria aimed at achieving the desired seismic margin defined in resolution of Issue 2. Suggestions for revising existing seismic design criteria to more consistently achieve the desired seismic margin are presented. (orig.)

  1. 16-dimensional smooth projective planes with large collineation groups

    OpenAIRE

    Bödi, Richard

    1998-01-01

    Acquired under the Swiss National Licences (http://www.nationallizenzen.ch). Smooth projective planes are projective planes defined on smooth manifolds (i.e. the set of points and the set of lines are smooth manifolds) such that the geometric operations of join and intersection are smooth. A systematic study of such planes and of their collineation groups can be found in previous works of the author. We prove in this paper that a 16-dimensional smooth projective plane which admits a ...

  2. Hear it, See it, Explore it: Visualizations and Sonifications of Seismic Signals

    Science.gov (United States)

    Fisher, M.; Peng, Z.; Simpson, D. W.; Kilb, D. L.

    2010-12-01

    Sonification of seismic data is an innovative way to represent seismic data in the audible range (Simpson, 2005). Seismic waves with different frequency and temporal characteristics, such as those from teleseismic earthquakes, deep “non-volcanic” tremor and local earthquakes, can be easily discriminated when time-compressed to the audio range. Hence, sonification is particularly useful for presenting complicated seismic signals with multiple sources, such as aftershocks within the coda of large earthquakes, and remote triggering of earthquakes and tremor by large teleseismic earthquakes. Previous studies mostly focused on converting seismic data into audible files by simple time compression or frequency modulation (Simpson et al., 2009). Here we generate animations of the seismic data together with the sounds. We first read seismic data in the SAC format into Matlab and generate a sequence of image files and an associated WAV sound file. Next, we use a third-party video editor, such as QuickTime Pro, to combine the image sequence and the sound file into an animation. We have applied this simple procedure to generate animations of remotely triggered earthquakes, tremor and low-frequency earthquakes in California, and mainshock-aftershock sequences in Japan and California. These animations clearly demonstrate the interactions of earthquake sequences and the richness of the seismic data. The tool developed in this study can be easily adapted for other research applications and to create sonifications/animations of seismic data for education and outreach purposes.
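The time-compression step at the heart of sonification can be sketched in a few lines. The record's own pipeline uses Matlab and SAC input; the Python sketch below substitutes a synthetic 2 Hz trace and the standard-library `wave` module, so the sampling rate, speed-up factor, and output file name are illustrative assumptions, not values from the study:

```python
import math
import struct
import wave

# Synthetic "seismogram": a decaying 2 Hz wavelet sampled at 100 Hz.
fs = 100.0                      # field sampling rate (Hz)
duration = 60.0                 # seconds of data
n = int(fs * duration)
trace = [math.exp(-t / 20.0) * math.sin(2 * math.pi * 2.0 * t)
         for t in (i / fs for i in range(n))]

# Time compression: replay the samples at a much higher rate, which
# multiplies every frequency by the same factor (2 Hz -> 800 Hz).
speedup = 400
peak = max(abs(s) for s in trace) or 1.0
frames = b"".join(struct.pack("<h", int(32767 * s / peak)) for s in trace)

with wave.open("sonified.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)                   # 16-bit PCM
    w.setframerate(int(fs * speedup))   # 40 kHz playback rate
    w.writeframes(frames)
```

Replaying 100 Hz field samples at 40 kHz shifts a 2 Hz signal to an audible 800 Hz; rendering a cursor image per block of the same samples and muxing the images with the WAV file in a video editor reproduces the animation workflow described above.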

  3. The influence of backfill on seismicity

    CSIR Research Space (South Africa)

    Hemp, DA

    1990-09-01

    Full Text Available ... that the seismicity has been reduced in areas where backfill had been placed. A factor complicating the evaluation of the effect of backfill on seismicity is the effect of geological structures on seismicity....

  4. Smoothness in Banach spaces. Selected problems

    Czech Academy of Sciences Publication Activity Database

    Fabian, Marián; Montesinos, V.; Zizler, Václav

    2006-01-01

    Roč. 100, č. 2 (2006), s. 101-125 ISSN 1578-7303 R&D Projects: GA ČR(CZ) GA201/04/0090; GA AV ČR(CZ) IAA100190610 Institutional research plan: CEZ:AV0Z10190503 Keywords : smooth norm * renorming * weakly compactly generated space Subject RIV: BA - General Mathematics

  5. The Koch curve as a smooth manifold

    International Nuclear Information System (INIS)

    Epstein, Marcelo; Sniatycki, Jedrzej

    2008-01-01

    We show that there exists a homeomorphism between the closed interval [0,1] ⊂ R and the Koch curve endowed with the subset topology of R². We use this homeomorphism to endow the Koch curve with the structure of a smooth manifold with boundary.

  6. on Isolated Smooth Muscle Preparation in Rats

    African Journals Online (AJOL)

    Samuel Olaleye

    ABSTRACT. This study investigated the receptor effects of methanolic root extract of ... Phytochemical Analysis: Phytochemistry of the methanolic extract was ... mounted with resting tension 0.5g in an organ bath containing .... Effects of extra cellular free Ca2+ and 0.5mM ... isolated smooth muscle by high K+ on the other.

  7. PHANTOM: Smoothed particle hydrodynamics and magnetohydrodynamics code

    Science.gov (United States)

    Price, Daniel J.; Wurster, James; Nixon, Chris; Tricco, Terrence S.; Toupin, Stéven; Pettitt, Alex; Chan, Conrad; Laibe, Guillaume; Glover, Simon; Dobbs, Clare; Nealon, Rebecca; Liptai, David; Worpel, Hauke; Bonnerot, Clément; Dipierro, Giovanni; Ragusa, Enrico; Federrath, Christoph; Iaconi, Roberto; Reichardt, Thomas; Forgan, Duncan; Hutchison, Mark; Constantino, Thomas; Ayliffe, Ben; Mentiplay, Daniel; Hirsh, Kieran; Lodato, Giuseppe

    2017-09-01

    Phantom is a smoothed particle hydrodynamics and magnetohydrodynamics code focused on stellar, galactic, planetary, and high energy astrophysics. It is modular, and handles sink particles, self-gravity, two fluid and one fluid dust, ISM chemistry and cooling, physical viscosity, non-ideal MHD, and more. Its modular structure makes it easy to add new physics to the code.

  8. Data driven smooth tests for composite hypotheses

    NARCIS (Netherlands)

    Inglot, Tadeusz; Kallenberg, Wilbert C.M.; Ledwina, Teresa

    1997-01-01

    The classical problem of testing goodness-of-fit of a parametric family is reconsidered. A new test for this problem is proposed and investigated. The new test statistic is a combination of the smooth test statistic and Schwarz's selection rule. More precisely, as the sample size increases, an

  9. Full Waveform Inversion Using Nonlinearly Smoothed Wavefields

    KAUST Repository

    Li, Y.; Choi, Yun Seok; Alkhalifah, Tariq Ali; Li, Z.

    2017-01-01

    The lack of low-frequency information in the acquired data means that full waveform inversion (FWI) converges to an accurate solution only conditionally: an initial velocity model that produces data with events within a half cycle of their location in the observed data is required for convergence. The multiplication of wavefields with slightly different frequencies generates artificial low-frequency components. This can be exploited by multiplying the wavefield with itself, which is a nonlinear operation, followed by a smoothing operator to extract the artificially produced low-frequency information. We construct the objective function using the nonlinearly smoothed wavefields with a global-correlation norm to properly handle the energy imbalance in the nonlinearly smoothed wavefield. Similar to the multi-scale strategy, we progressively reduce the smoothing width applied to the multiplied wavefield to admit higher resolution. We calculate the gradient of the objective function using the adjoint-state technique, which is similar to conventional FWI except for the adjoint source. Examples on the Marmousi 2 model demonstrate the feasibility of the proposed FWI method to mitigate the cycle-skipping problem when low-frequency information is lacking.
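The central trick of the record, multiplying the wavefield with itself and then smoothing to expose an artificial low-frequency component, can be illustrated on a toy trace. This is a sketch under stated assumptions: the 30 Hz and 33 Hz components, the 50 ms moving-average smoother, and the zero-crossing frequency estimate are all illustrative, not from the paper.

```python
import math

fs = 1000                       # samples per second
T = 2.0                         # trace length in seconds
n = int(fs * T)
t = [i / fs for i in range(n)]

# Band-limited "wavefield": two nearby frequencies, no energy below 30 Hz.
s = [math.sin(2 * math.pi * 30 * ti) + math.sin(2 * math.pi * 33 * ti)
     for ti in t]

# Nonlinear step: multiply the wavefield with itself.  The cross term of
# the 30 Hz and 33 Hz components contains a 3 Hz difference ("beat").
sq = [x * x for x in s]

def movavg(x, w):
    """Moving-average smoothing operator (window of w samples)."""
    return [sum(x[i:i + w]) / w for i in range(len(x) - w + 1)]

# Smoothing: two passes of a 50 ms moving average suppress the 60-66 Hz
# sum-frequency terms and keep the artificial low-frequency signal.
w = 50
smooth = movavg(movavg(sq, w), w)

# Remove the DC term and estimate the surviving frequency by zero crossings.
mean = sum(smooth) / len(smooth)
ac = [x - mean for x in smooth]
crossings = sum(1 for a, b in zip(ac, ac[1:]) if a * b < 0)
freq = crossings / (2 * (len(ac) / fs))   # zero-crossing rate -> Hz
print(freq)                               # close to the 3 Hz beat frequency
```

The original trace has no energy below 30 Hz, yet the smoothed product is dominated by the 3 Hz difference frequency, which is the kind of component an objective function prone to cycle skipping can exploit.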

  10. On the theory of smooth structures. 2

    International Nuclear Information System (INIS)

    Shafei Deh Abad, A.

    1992-09-01

    In this paper we continue the study by introducing the concepts of substructures, quotient structures and tensor products, and we examine some of their properties. By using the concept of tensor product, in the next paper we will give another product for smooth structures which gives a characterization of integral domains that are not fields. (author). 2 refs

  11. Full Waveform Inversion Using Nonlinearly Smoothed Wavefields

    KAUST Repository

    Li, Y.

    2017-05-26

    The lack of low-frequency information in the acquired data means that full waveform inversion (FWI) converges to an accurate solution only conditionally: an initial velocity model that produces data with events within a half cycle of their location in the observed data is required for convergence. The multiplication of wavefields with slightly different frequencies generates artificial low-frequency components. This can be exploited by multiplying the wavefield with itself, which is a nonlinear operation, followed by a smoothing operator to extract the artificially produced low-frequency information. We construct the objective function using the nonlinearly smoothed wavefields with a global-correlation norm to properly handle the energy imbalance in the nonlinearly smoothed wavefield. Similar to the multi-scale strategy, we progressively reduce the smoothing width applied to the multiplied wavefield to admit higher resolution. We calculate the gradient of the objective function using the adjoint-state technique, which is similar to conventional FWI except for the adjoint source. Examples on the Marmousi 2 model demonstrate the feasibility of the proposed FWI method to mitigate the cycle-skipping problem when low-frequency information is lacking.

  12. Local smoothness for global optical flow

    DEFF Research Database (Denmark)

    Rakêt, Lars Lau

    2012-01-01

    by this technique and work on local-global optical flow we propose a simple method for fusing optical flow estimates of different smoothness by evaluating interpolation quality locally by means of L1 block match on the corresponding set of gradient images. We illustrate the method in a setting where optical flows...

  13. Interval Forecast for Smooth Transition Autoregressive Model ...

    African Journals Online (AJOL)

    In this paper, we propose a simple method for constructing an interval forecast for the smooth transition autoregressive (STAR) model. This interval forecast is based on bootstrapping the residual error of the estimated STAR model for each forecast horizon and computing various Akaike information criterion (AIC) functions. This new ...

  14. Supplementary speed control for wind power smoothing

    NARCIS (Netherlands)

    Haan, de J.E.S.; Frunt, J.; Kechroud, A.; Kling, W.L.

    2010-01-01

    Wind fluctuations result in even larger wind power fluctuations because the power of wind is proportional to the cube of the wind speed. This report analyzes wind power fluctuations to investigate inertial power smoothing, in particular for the frequency range of 0.08 - 0.5 Hz. Due to the growing

  15. A Smoothed Finite Element-Based Elasticity Model for Soft Bodies

    Directory of Open Access Journals (Sweden)

    Juan Zhang

    2017-01-01

    Full Text Available One of the major challenges in mesh-based deformation simulation in computer graphics is dealing with mesh distortion. In this paper, we present a novel mesh-insensitive and softer method for simulating deformable solid bodies under the assumptions of linear elastic mechanics. A face-based strain smoothing method is adopted to alleviate mesh distortion instead of the traditional spatial adaptive smoothing method. Then, we propose a way to combine the strain smoothing method and the corotational method. With this approach, the amplitude and frequency of transient displacements are only slightly affected by the distorted mesh. Realistic simulation results are generated under large rotation using a linear elasticity model without adding significant complexity or computational cost to the standard corotational FEM. Meanwhile, a softening effect is a by-product of our method.

  16. Classification algorithms using adaptive partitioning

    KAUST Repository

    Binev, Peter; Cohen, Albert; Dahmen, Wolfgang; DeVore, Ronald

    2014-01-01

    © 2014 Institute of Mathematical Statistics. Algorithms for binary classification based on adaptive tree partitioning are formulated and analyzed for both their risk performance and their friendliness to numerical implementation. The algorithms can be viewed as generating a set approximation to the Bayes set and thus fall into the general category of set estimators. In contrast with the most studied tree-based algorithms, which utilize piecewise constant approximation on the generated partition [IEEE Trans. Inform. Theory 52 (2006) 1335–1353; Mach. Learn. 66 (2007) 209–242], we consider decorated trees, which allow us to derive higher order methods. Convergence rates for these methods are derived in terms of a parameter of the margin conditions and a rate s of best approximation of the Bayes set by decorated adaptive partitions. They can also be expressed in terms of the Besov smoothness β of the regression function, which governs its approximability by piecewise polynomials on adaptive partitions. The execution of the algorithms does not require knowledge of the smoothness or margin conditions. Besov smoothness conditions are weaker than the commonly used Hölder conditions, which govern approximation by nonadaptive partitions, and therefore for a given regression function can result in a higher rate of convergence. This in turn mitigates the compatibility conflict between smoothness and margin parameters.

  17. Classification algorithms using adaptive partitioning

    KAUST Repository

    Binev, Peter

    2014-12-01

    © 2014 Institute of Mathematical Statistics. Algorithms for binary classification based on adaptive tree partitioning are formulated and analyzed for both their risk performance and their friendliness to numerical implementation. The algorithms can be viewed as generating a set approximation to the Bayes set and thus fall into the general category of set estimators. In contrast with the most studied tree-based algorithms, which utilize piecewise constant approximation on the generated partition [IEEE Trans. Inform. Theory 52 (2006) 1335–1353; Mach. Learn. 66 (2007) 209–242], we consider decorated trees, which allow us to derive higher order methods. Convergence rates for these methods are derived in terms of a parameter of the margin conditions and a rate s of best approximation of the Bayes set by decorated adaptive partitions. They can also be expressed in terms of the Besov smoothness β of the regression function, which governs its approximability by piecewise polynomials on adaptive partitions. The execution of the algorithms does not require knowledge of the smoothness or margin conditions. Besov smoothness conditions are weaker than the commonly used Hölder conditions, which govern approximation by nonadaptive partitions, and therefore for a given regression function can result in a higher rate of convergence. This in turn mitigates the compatibility conflict between smoothness and margin parameters.

  18. Role of Smooth Muscle in Intestinal Inflammation

    Directory of Open Access Journals (Sweden)

    Stephen M Collins

    1996-01-01

    Full Text Available The notion that smooth muscle function is altered in inflammation is prompted by clinical observations of altered motility in patients with inflammatory bowel disease (IBD. While altered motility may reflect inflammation-induced changes in intrinsic or extrinsic nerves to the gut, changes in gut hormone release and changes in muscle function, recent studies have provided in vitro evidence of altered muscle contractility in muscle resected from patients with ulcerative colitis or Crohn’s disease. In addition, the observation that smooth muscle cells are more numerous and prominent in the strictured bowel of IBD patients compared with controls suggests that inflammation may alter the growth of intestinal smooth muscle. Thus, inflammation is associated with changes in smooth muscle growth and contractility that, in turn, contribute to important symptoms of IBD including diarrhea (from altered motility and pain (via either altered motility or stricture formation. The involvement of smooth muscle in this context may be as an innocent bystander, where cells and products of the inflammatory process induce alterations in muscle contractility and growth. However, it is likely that intestinal muscle cells play a more active role in the inflammatory process via the elaboration of mediators and trophic factors, including cytokines, and via the production of collagen. The concept of muscle cells as active participants in the intestinal inflammatory process is a new concept that is under intense study. This report summarizes current knowledge as it relates to these two aspects of altered muscle function (growth and contractility in the inflamed intestine, and will focus on mechanisms underlying these changes, based on data obtained from animal models of intestinal inflammation.

  19. Smoothing a Piecewise-Smooth: An Example from Plankton Population Dynamics

    DEFF Research Database (Denmark)

    Piltz, Sofia Helena

    2016-01-01

    In this work we discuss a piecewise-smooth dynamical system inspired by plankton observations and constructed for one predator switching its diet between two different types of prey. We then discuss two smooth formulations of the piecewise-smooth model, obtained by using a hyperbolic tangent function and by adding a dimension to the system. We compare the behaviour of the three systems and show an example case where the steepness of the switch is determined from a comparison with data on freshwater plankton.
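The hyperbolic-tangent regularisation mentioned above replaces a discontinuous switch with a smooth sigmoid whose steepness is tunable. A minimal sketch, in which the steepness values and test points are illustrative assumptions (the record determines the steepness from plankton data):

```python
import math

def switch_sharp(x):
    """Piecewise-smooth diet switch: feed on prey type 1 iff x > 0."""
    return 1.0 if x > 0 else 0.0

def switch_tanh(x, k):
    """Smooth surrogate 0.5*(1 + tanh(k*x)); approaches the sharp
    switch as the steepness parameter k grows."""
    return 0.5 * (1.0 + math.tanh(k * x))

# Increasing k recovers the discontinuous switch away from x = 0.
for k in (1, 10, 100):
    err = max(abs(switch_tanh(x, k) - switch_sharp(x))
              for x in (-0.5, -0.1, 0.1, 0.5))
    print(k, err)
```

The smooth system remains amenable to standard ODE solvers and bifurcation tools, at the cost of introducing the extra parameter k that must be fitted or sent to a large value.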

  20. Bayesian seismic AVO inversion

    Energy Technology Data Exchange (ETDEWEB)

    Buland, Arild

    2002-07-01

    A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
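The closed-form Gaussian posterior that makes this inversion fast can be shown on a toy linear-Gaussian problem, d = Gm + e with prior m ~ N(mu, Sigma_m) and noise e ~ N(0, Sigma_e). The operator, sizes, and numbers below are illustrative stand-ins, not the linearized Zoeppritz operator of the record:

```python
import numpy as np

# Toy linear-Gaussian inversion mirroring the structure of a linearized
# AVO problem (all values here are illustrative assumptions).
rng = np.random.default_rng(0)
G = np.array([[1.0, 0.5],
              [0.3, 1.2],
              [0.8, 0.1]])           # forward operator: 3 data, 2 parameters
mu_m = np.zeros(2)                   # prior mean of the model parameters
Sigma_m = np.eye(2)                  # prior covariance
Sigma_e = 0.01 * np.eye(3)           # noise covariance (std 0.1)

m_true = np.array([0.4, -0.2])
d = G @ m_true + rng.normal(0.0, 0.1, size=3)

# Gaussian posterior in closed form: explicit expectation and covariance,
# so exact prediction intervals need no sampling.
K = Sigma_m @ G.T @ np.linalg.inv(G @ Sigma_m @ G.T + Sigma_e)
mu_post = mu_m + K @ (d - G @ mu_m)
Sigma_post = Sigma_m - K @ G @ Sigma_m

print(mu_post)               # posterior expectation of the parameters
print(np.diag(Sigma_post))   # posterior variances
```

Because the posterior is Gaussian with explicit mean and covariance, exact prediction intervals follow directly; the record's spatially coupled 3-D extension instead explores its posterior with Markov chain Monte Carlo using the Gibbs sampler.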

  1. Seismic Level 2 PSA

    International Nuclear Information System (INIS)

    Dirksen, Gerben; Pellissetti, Manuel; Duncan-Whiteman, Paul

    2014-01-01

    For most external events, the calculation of the core damage frequency (CDF) in Level 1 PSA is sufficient to show that the contribution of the event to the plant risk is negligible. However, it is not sufficient to compare the CDF due to the external event to the total plant CDF; instead, the Level 1 PSA result for the event should be compared to the large early release frequency (LERF), or alternatively arguments should be given why the CDF from the external event will not contribute mostly to LERF. For seismic events in particular, it often cannot be easily excluded that sequences leading to core damage would also result in LERF. Since the confinement function is one of the most essential functions for Level 2 PSA, special care must be taken with the containment penetrations. For example, systems with containment penetrations that are normally closed during operation or are designed to withstand more than the maximum containment pressure are normally screened out in the Level 2 PSA for the containment isolation function; however, the possibility of a LOCA in such systems due to an earthquake may nevertheless lead to containment bypass. Additionally, the functionality of passive features may be compromised in case of a beyond-design earthquake. In the present paper, we present crucial ingredients of a methodology for a Level 2 seismic PSA. This methodology consists of the following steps: extension of the seismic equipment list (SEL) to include Level 2 PSA relevant systems (e.g. containment isolation system, features for core melt stabilization, hydrogen mitigation systems); determination of the systems within the existing SEL with increased demands in case of severe accidents; determination of essential components for which a dedicated fragility analysis needs to be performed. (author)

  2. Statistical physics, seismogenesis, and seismic hazard

    Science.gov (United States)

    Main, Ian

    1996-11-01

    generic statistical properties similar to the "universal" behavior seen in a wide variety of critical phenomena, with significant implications for practical problems in probabilistic seismic hazard evaluation. In particular, the notion of self-organized criticality (or near-criticality) gives a scientific rationale for the a priori assumption of "stationarity" used as a first step in the prediction of the future level of hazard. The Gutenberg-Richter law (a power law in energy or seismic moment) is found to apply only within a finite scale range, both in model and natural seismicity. Accordingly, the frequency-magnitude distribution can be generalized to a gamma distribution in energy or seismic moment (a power law, with an exponential tail). This allows extrapolations of the frequency-magnitude distribution and the maximum credible magnitude to be constrained by observed seismic or tectonic moment release rates. The answers to other questions raised are less clear, for example, the effect of the a priori assumption of a Poisson process in a system with strong local interactions, and the impact of zoning a potentially multifractal distribution of epicentres with smooth polygons. The results of some models show premonitory patterns of seismicity which could in principle be used as mainshock precursors. However, there remains no consensus, on both theoretical and practical grounds, on the possibility or otherwise of reliable intermediate-term earthquake prediction.
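The generalization described here, a power law with an exponential tail, can be written down directly as a survivor function in seismic moment. Below is a sketch of such a tapered (gamma-type) frequency-moment law; the exponent, threshold moment, and corner moment are assumptions chosen for the demonstration, not values from the record:

```python
import math

def tapered_gr(M, Mt=1e17, beta=0.67, Mc=1e21):
    """Survivor function of a tapered Gutenberg-Richter law in seismic
    moment M (N*m): a pure power law rolled off by an exponential taper
    at the corner moment Mc.  Parameter values are illustrative."""
    return (Mt / M) ** beta * math.exp((Mt - M) / Mc)

def pure_gr(M, Mt=1e17, beta=0.67):
    """Untapered power-law survivor function for comparison."""
    return (Mt / M) ** beta

# The taper only matters near and beyond the corner moment: extrapolating
# the pure power law overstates the rate of the largest events.
for M in (1e18, 1e20, 1e21, 1e22):
    print(M, pure_gr(M), tapered_gr(M))
```

Constraining the corner moment Mc by an observed seismic or tectonic moment release rate is what bounds the extrapolated maximum credible magnitude.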

  3. Seismic wave generator

    International Nuclear Information System (INIS)

    Devaure, Bernard.

    1982-01-01

    This invention concerns a device for simulating earth tremors. This device includes a seismic wave generator formed of a cylinder, one end of which is closed by one of the walls of a cell containing a soil, the other end being closed by a wall on which are fixed pyrotechnic devices generating shock waves inside the cylinder. These waves are transmitted from the cylinder to the cell through openings made in the cell wall. This device also includes a mechanical device acting as low-pass filter, located inside the cylinder and close to the cell wall [fr

  4. Seismic risk perception test

    Science.gov (United States)

    Crescimbene, Massimo; La Longa, Federica; Camassi, Romano; Pino, Nicola Alessandro

    2013-04-01

    The perception of risks involves the process of collecting, selecting and interpreting signals about uncertain impacts of events, activities or technologies. In the natural sciences the term risk seems to be clearly defined as the probability distribution of adverse effects, but the everyday use of risk has different connotations (Renn, 2008). The two terms, hazard and risk, are often used interchangeably by the public. Knowledge, experience, values, attitudes and feelings all influence the thinking and judgement of people about the seriousness and acceptability of risks. Within the social sciences, however, the terminology of 'risk perception' has become the conventional standard (Slovic, 1987). The mental models and other psychological mechanisms which people use to judge risks (such as cognitive heuristics and risk images) are internalized through social and cultural learning and constantly moderated (reinforced, modified, amplified or attenuated) by media reports, peer influences and other communication processes (Morgan et al., 2001). Yet a theory of risk perception that offers an integrative, as well as empirically valid, approach to understanding and explaining risk perception is still missing. To understand the perception of risk it is necessary to consider several areas: social, psychological and cultural, and their interactions. Among the various international research efforts on the perception of natural hazards, the semantic differential method (Osgood, C.E., Suci, G., & Tannenbaum, P., 1957, The Measurement of Meaning. Urbana, IL: University of Illinois Press) seemed promising. The test on seismic risk perception has been constructed by the method of the semantic differential. To compare opposite adjectives or terms, a seven-point Likert scale has been used.
The test consists of an informative part and six sections respectively dedicated to: hazard; vulnerability (home and workplace); exposed value (with reference to

  5. Mine-induced seismicity at East-Rand proprietary mines

    CSIR Research Space (South Africa)

    Milev, AM

    1995-09-01

    Full Text Available Mining results in seismic activity of varying intensity, from small microseismic events to larger seismic events often associated with significant seismically induced damage. This work deals with the understanding of the present seismicity...

  6. Seismic risk assessment of Navarre (Northern Spain)

    Science.gov (United States)

    Gaspar-Escribano, J. M.; Rivas-Medina, A.; García Rodríguez, M. J.; Benito, B.; Tsige, M.; Martínez-Díaz, J. J.; Murphy, P.

    2009-04-01

    The RISNA project, financed by the Emergency Agency of Navarre (Northern Spain), aims at assessing the seismic risk of the entire region. The final goal of the project is the definition of emergency plans for future earthquakes. With this purpose, four main topics are covered: seismic hazard characterization, geotechnical classification, vulnerability assessment and damage estimation to structures and exposed population. A geographic information system is used to integrate, analyze and represent all information collected in the different phases of the study. Expected ground motions on rock conditions with a 90% probability of non-exceedance in an exposure time of 50 years are determined following a Probabilistic Seismic Hazard Assessment (PSHA) methodology that includes a logic tree with different ground motion and source zoning models. As the region under study is located in the boundary between Spain and France, an effort is required to collect and homogenise seismological data from different national and regional agencies. A new homogenised seismic catalogue, merging data from Spanish, French, Catalonian and international agencies and establishing correlations between different magnitude scales, is developed. In addition, a new seismic zoning model focused on the study area is proposed. Results show that the highest ground motions on rock conditions are expected in the northeastern part of the region, decreasing southwards. Seismic hazard can be expressed as low-to-moderate. A geotechnical classification of the entire region is developed based on surface geology, available borehole data and morphotectonic constraints. Frequency-dependent amplification factors, consistent with code values, are proposed. The northern and southern parts of the region are characterized by stiff and soft soils respectively, with the softest soils located along river valleys.
Seismic hazard maps including soil effects are obtained by applying these factors to the seismic hazard maps

  7. Quantifying uncertainties of seismic Bayesian inversion of Northern Great Plains

    Science.gov (United States)

    Gao, C.; Lekic, V.

    2017-12-01

    Elastic waves excited by earthquakes are the fundamental observations of seismological studies. Seismologists measure information such as travel time, amplitude, and polarization to infer the properties of the earthquake source, seismic wave propagation, and subsurface structure. Across numerous applications, seismic imaging has been able to take advantage of complementary seismic observables to constrain profiles and lateral variations of Earth's elastic properties. Moreover, seismic imaging plays a unique role in multidisciplinary studies of geoscience by providing direct constraints on the unreachable interior of the Earth. Accurate quantification of the uncertainties of inferences made from seismic observations is of paramount importance for interpreting seismic images and testing geological hypotheses. However, such quantification remains challenging and subjective due to the non-linearity and non-uniqueness of the geophysical inverse problem. In this project, we apply a reversible jump Markov chain Monte Carlo (rjMcMC) algorithm for a transdimensional Bayesian inversion of continental lithosphere structure. Such an inversion allows us to quantify the uncertainties of inversion results by inverting for an ensemble solution. It also yields an adaptive parameterization that enables simultaneous inversion of different elastic properties without imposing strong prior information on the relationship between them. We present retrieved profiles of shear velocity (Vs) and radial anisotropy in the northern Great Plains using measurements from USArray stations. We use both seismic surface wave dispersion and receiver function data due to their complementary constraints on lithosphere structure. Furthermore, we analyze the uncertainties of both individual and joint inversion of these two data types to quantify the benefit of joint inversion. As an application, we infer the variation of Moho depths and crustal layering across the northern Great Plains.

  8. Weak localization of seismic waves

    International Nuclear Information System (INIS)

    Larose, E.; Margerin, L.; Tiggelen, B.A. van; Campillo, M.

    2004-01-01

    We report the observation of weak localization of seismic waves in a natural environment. It emerges as a doubling of the seismic energy around the source within a spot of the width of a wavelength, which is several tens of meters in our case. The characteristic time for its onset is the scattering mean-free time that quantifies the internal heterogeneity

  9. DRY TRANSFER FACILITY SEISMIC ANALYSIS

    International Nuclear Information System (INIS)

    EARNEST, S.; KO, H.; DOCKERY, W.; PERNISI, R.

    2004-01-01

    The purpose of this calculation is to perform a dynamic and static analysis of the Dry Transfer Facility and to determine the response-spectrum seismic forces for the design basis ground motions. The resulting seismic forces and accelerations will be used in a subsequent calculation to complete the preliminary design of the concrete shear walls, diaphragms, and basemat.

  10. Seismic Data Gathering and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-02-01

    Three recent earthquakes in the last seven years have exceeded their design basis earthquake values (so it is implied that damage to SSCs should have occurred). These seismic events were recorded at North Anna (August 2011, detailed information provided in [Virginia Electric and Power Company Memo]), Fukushima Daiichi and Daini (March 2011 [TEPCO 1]), and Kashiwazaki-Kariwa (2007 [TEPCO 2]). However, seismic walkdowns at some of these plants indicate that very little damage occurred to safety-class systems and components due to the seismic motion. This report presents seismic data gathered for two of the three events mentioned above and recommends a path for using those data for two purposes. One purpose is to determine what margins exist in current industry-standard seismic soil-structure interaction (SSI) tools. The second purpose is to use the data to validate seismic site response tools and SSI tools. The gathered data represent free-field soil and in-structure acceleration time histories. Gathered data also include elastic and dynamic soil properties and structural drawings. Gathering data and comparing them with existing models has the potential to identify areas of uncertainty that should be removed from current seismic analysis and SPRA approaches. Removing uncertainty (to the extent possible) from SPRAs will allow NPP owners to make decisions on where to reduce risk. Once a realistic understanding of seismic response is established for a nuclear power plant (NPP), decisions on needed protective measures, such as SI, can be made.

  11. Random noise suppression of seismic data using non-local Bayes algorithm

    Science.gov (United States)

    Chang, De-Kuan; Yang, Wu-Yang; Wang, Yi-Hui; Yang, Qing; Wei, Xin-Jian; Feng, Xiao-Ying

    2018-02-01

    For random noise suppression of seismic data, we present a non-local Bayes (NL-Bayes) filtering algorithm. The NL-Bayes algorithm uses a Gaussian model instead of the weighted average of all similar patches used in the NL-means algorithm to reduce the blurring of structural details, thereby improving denoising performance. In the denoising of seismic data, the size and the number of patches in the Gaussian model are adaptively calculated according to the standard deviation of the noise. The NL-Bayes algorithm requires two iterations to complete seismic data denoising; the second iteration uses the denoised data from the first iteration to calculate better estimates of the mean and covariance of the patch Gaussian model, improving the similarity grouping of patches and the final denoising result. Tests with synthetic and real data sets demonstrate that the NL-Bayes algorithm can effectively improve the SNR and preserve the fidelity of seismic data.
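    As a rough illustration of the patch-Gaussian idea behind NL-Bayes, the sketch below runs one heavily simplified denoising pass: it fits a single global Gaussian model over all patches (real NL-Bayes fits one per similarity group) and applies a stabilized Wiener-type shrinkage rather than the paper's exact two-iteration scheme. All names and parameter values are assumptions for illustration, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "clean" section: smooth oscillatory events (stand-in for seismic data)
x = np.linspace(0, 4 * np.pi, 64)
clean = np.outer(np.sin(x), np.cos(0.5 * x))
sigma = 0.3                              # assumed known noise std
noisy = clean + rng.normal(0, sigma, clean.shape)

p = 4                                    # patch size
H, W = noisy.shape
# Extract all overlapping p x p patches, flattened to vectors
patches = np.array([noisy[i:i + p, j:j + p].ravel()
                    for i in range(H - p + 1) for j in range(W - p + 1)])

# Fit one Gaussian model over the whole patch set (NL-Bayes would fit one
# per group of similar patches; a single global group is used for brevity)
mu = patches.mean(axis=0)
C = np.cov(patches, rowvar=False)

# Wiener-type shrinkage toward the patch mean:
#   restored = mu + C (C + sigma^2 I)^-1 (patch - mu)
shrink = C @ np.linalg.inv(C + sigma ** 2 * np.eye(p * p))
restored = mu + (patches - mu) @ shrink.T

# Reassemble the image by averaging overlapping patch estimates
den = np.zeros_like(noisy)
cnt = np.zeros_like(noisy)
k = 0
for i in range(H - p + 1):
    for j in range(W - p + 1):
        den[i:i + p, j:j + p] += restored[k].reshape(p, p)
        cnt[i:i + p, j:j + p] += 1
        k += 1
den /= cnt

print(np.mean((noisy - clean) ** 2), np.mean((den - clean) ** 2))
```

    Even this crude one-group version reduces the mean squared error against the clean section; grouping patches by similarity, as NL-Bayes does, sharpens the Gaussian model and preserves dipping events better.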

  12. Processing Approaches for DAS-Enabled Continuous Seismic Monitoring

    Science.gov (United States)

    Dou, S.; Wood, T.; Freifeld, B. M.; Robertson, M.; McDonald, S.; Pevzner, R.; Lindsey, N.; Gelvin, A.; Saari, S.; Morales, A.; Ekblaw, I.; Wagner, A. M.; Ulrich, C.; Daley, T. M.; Ajo Franklin, J. B.

    2017-12-01

    Distributed Acoustic Sensing (DAS) is creating a "field as laboratory" capability for seismic monitoring of subsurface changes. By providing unprecedented spatial and temporal sampling at a relatively low cost, DAS enables field-scale seismic monitoring to have durations and temporal resolutions that are comparable to those of laboratory experiments. Here we report on seismic processing approaches developed during data analyses of three case studies, all using DAS-enabled seismic monitoring, with applications ranging from shallow permafrost to deep reservoirs: (1) 10-hour downhole monitoring of cement curing at Otway, Australia; (2) 2-month surface monitoring of controlled permafrost thaw at Fairbanks, Alaska; (3) multi-month downhole and surface monitoring of carbon sequestration at Decatur, Illinois. We emphasize the data management and processing components relevant to DAS-based seismic monitoring, which include scalable approaches to data management, pre-processing, denoising, filtering, and wavefield decomposition. DAS has dramatically increased the data volume to the extent that terabyte-per-day data loads are now typical, straining conventional approaches to data storage and processing. To achieve more efficient use of disk space and network bandwidth, we explore improved file structures and data compression schemes. Because the noise floor of DAS measurements is higher than that of conventional sensors, optimal processing workflows involving advanced denoising, deconvolution (of the source signatures), and stacking are being established to maximize the signal content of DAS data. The resulting data management and processing workflow could accelerate the broader adoption of DAS for continuous monitoring of critical processes.
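    At terabyte-per-day rates, a common starting point for the compression schemes the abstract mentions is quantize-then-entropy-code: trade a bounded amplitude error for a guaranteed size reduction. The sketch below is a generic illustration only; the 16-bit step choice and zlib back end are assumptions, not the workflow the authors adopted.

```python
import numpy as np
import zlib

rng = np.random.default_rng(1)
# Mock DAS gather: 1000 channels x 2000 time samples of correlated noise
raw = rng.normal(size=(1000, 2000)).cumsum(axis=1).astype(np.float64)

# Lossy step: quantize to 16-bit integers with a step that keeps the
# full dynamic range inside the int16 range
step = np.abs(raw).max() / 30000
q = np.round(raw / step).astype(np.int16)

# Lossless step: entropy-code the integer stream
packed = zlib.compress(q.tobytes(), level=6)

recon = q.astype(np.float64) * step      # reconstruction, error <= step/2
ratio = raw.nbytes / len(packed)
print(ratio, np.abs(recon - raw).max())
```

    The quantization error is bounded by half the step size, while the int16 representation alone already quarters the footprint of float64 traces before zlib squeezes out residual redundancy.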

  13. Seismic isolation of small modular reactors using metamaterials

    Directory of Open Access Journals (Sweden)

    Witarto Witarto

    2018-04-01

    Adaptation of metamaterials at micro- to nanometer scales to metastructures at much larger scales offers a new alternative for seismic isolation systems. These new isolation systems, known as periodic foundations, function both as a structural foundation to support the gravitational weight of the superstructure and also as a seismic isolator to isolate the superstructure from incoming seismic waves. Here we describe the application of periodic foundations for the seismic protection of nuclear power plants, in particular small modular reactors (SMR). For this purpose, a large-scale shake table test on a one-dimensional (1D) periodic foundation supporting an SMR building model was conducted. The 1D periodic foundation was designed and fabricated using reinforced concrete and synthetic rubber (polyurethane) materials. The 1D periodic foundation structural system was tested under various input waves, which include white noise, stepped sine and seismic waves in the horizontal and vertical directions as well as in the torsional mode. The shake table test results show that the 1D periodic foundation can reduce the acceleration response (transmissibility) of the SMR building by up to 90%. In addition, the periodic foundation-isolated structure also exhibited smaller displacement than the non-isolated SMR building. This study indicates that the challenge faced in developing metastructures can be overcome and that periodic foundations can be applied to isolating the vibration response of engineering structures.

  14. Seismic isolation of small modular reactors using metamaterials

    Science.gov (United States)

    Witarto, Witarto; Wang, S. J.; Yang, C. Y.; Nie, Xin; Mo, Y. L.; Chang, K. C.; Tang, Yu; Kassawara, Robert

    2018-04-01

    Adaptation of metamaterials at micro- to nanometer scales to metastructures at much larger scales offers a new alternative for seismic isolation systems. These new isolation systems, known as periodic foundations, function both as a structural foundation to support gravitational weight of the superstructure and also as a seismic isolator to isolate the superstructure from incoming seismic waves. Here we describe the application of periodic foundations for the seismic protection of nuclear power plants, in particular small modular reactors (SMR). For this purpose, a large-scale shake table test on a one-dimensional (1D) periodic foundation supporting an SMR building model was conducted. The 1D periodic foundation was designed and fabricated using reinforced concrete and synthetic rubber (polyurethane) materials. The 1D periodic foundation structural system was tested under various input waves, which include white noise, stepped sine and seismic waves in the horizontal and vertical directions as well as in the torsional mode. The shake table test results show that the 1D periodic foundation can reduce the acceleration response (transmissibility) of the SMR building up to 90%. In addition, the periodic foundation-isolated structure also exhibited smaller displacement than the non-isolated SMR building. This study indicates that the challenge faced in developing metastructures can be overcome and the periodic foundations can be applied to isolating vibration response of engineering structures.

  15. Advances in Rotational Seismic Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Pierson, Robert [Applied Technology Associates, Albuquerque, NM (United States); Laughlin, Darren [Applied Technology Associates, Albuquerque, NM (United States); Brune, Robert [Applied Technology Associates, Albuquerque, NM (United States)

    2016-10-19

    Rotational motion is increasingly understood to be a significant part of seismic wave motion. Rotations can be important in earthquake strong motion and in Induced Seismicity Monitoring. Rotational seismic data can also enable shear selectivity and improve wavefield sampling for vertical geophones in 3D surveys, among other applications. However, sensor technology has been a limiting factor to date. The US Department of Energy (DOE) and Applied Technology Associates (ATA) are funding a multi-year project that is now entering Phase 2 to develop and deploy a new generation of rotational sensors for validation of rotational seismic applications. Initial focus is on induced seismicity monitoring, particularly for Enhanced Geothermal Systems (EGS) with fracturing. The sensors employ Magnetohydrodynamic (MHD) principles with broadband response, improved noise floors, robustness, and repeatability. This paper presents a summary of Phase 1 results and Phase 2 status.

  16. Seismic isolation in New Zealand

    International Nuclear Information System (INIS)

    Skinner, R.I.; Robinson, W.H.; McVerry, G.H.

    1989-01-01

    Bridges, buildings, and industrial equipment can be given increased protection from earthquake damage by limiting the earthquake attack through seismic isolation. A broad summary of the seismic responses of base-isolated structures is of considerable assistance for their preliminary design. Seismic isolation as already used in New Zealand consists of a flexible base or support combined with some form of energy-dissipating device, usually involving the hysteretic working of steel or lead. This paper presents examples of the New Zealand experience, where seismic isolation has been used for 42 bridges, 3 buildings, a tall chimney, and high-voltage capacitor banks. Additional seismic response factors, which may be important for nuclear power plants, are also discussed briefly.

  17. Integrated system for seismic evaluations

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulos, A.J.; Miller, C.A.; Costantino, C.J.; Graves, H.

    1989-01-01

    This paper describes the various features of the seismic module of the CARES system (Computer Analysis for Rapid Evaluation of Structures). This system was developed to perform rapid evaluations of the structural behavior and capability of nuclear power plant facilities. CARES is structured in a modular format. Each module performs a specific type of analysis, i.e., static or dynamic, linear or nonlinear, etc. This paper describes the features of the seismic module in particular. The development of the seismic module of the CARES system is based on an approach which incorporates major aspects of seismic analysis currently employed by the industry into an integrated system that allows computations of structural response to seismic motions to be carried out interactively. The code operates on a PC and has multi-graphics capabilities.

  18. Civil Works Seismic Designs

    International Nuclear Information System (INIS)

    1985-12-01

    RFS, or Regles Fondamentales de Surete (Basic Safety Rules), applicable to certain types of nuclear facilities lay down requirements with which compliance, for the types of facilities and within the scope of application covered by the RFS, is considered equivalent to compliance with French technical regulatory practice. The object of the RFS is to take advantage of standardization in the field of safety, while allowing for technical progress in that field. They are designed to enable the operating utility and contractors to know the rules pertaining to various subjects which are considered acceptable by the Service Central de Surete des Installations Nucleaires, or SCSIN (Central Department for the Safety of Nuclear Facilities). These RFS should make safety analysis easier and lead to better understanding between experts and individuals concerned with the problems of nuclear safety. The SCSIN reserves the right to modify, when considered necessary, any RFS and to specify, if need be, the terms under which a modification is deemed retroactive. This rule defines: the parameters characterizing the design seismic motions; the calculation methods; the mathematical schematization principles on which calculations are based; the use of the seismic response for checking the structures; and the content of the documents to be presented.

  19. A seismic recording device

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, R; Kind, A G; Thompson, S R

    1983-06-08

    A method and a device are proposed for marking the moment of an explosion on a seismic recording, in which the moment of the explosion is recorded as a result of a break in an electrical circuit caused by the explosive charge used to excite the seismic waves. The circuit being broken is connected to the same energy source as the electric detonator which initiates the explosion, which is attached to a high-frequency alternating-current source; the circuit being broken is either the primary or the secondary winding of a transformer through which the electric detonator is switched in to the source. The moment the circuit is broken is determined from the cessation of current in the circuit or from the sharp rise in voltage across the broken section. The method makes it possible to fix the moment of the break more precisely than existing methods. When insulated copper wires are used, the time is recorded 100 microseconds after the explosion.

  20. Overview of seismic margin insights gained from seismic PRA results

    International Nuclear Information System (INIS)

    Kennedy, R.P.; Sues, R.H.; Campbell, R.D.

    1986-01-01

    This paper presents the findings of a study conducted under NRC and EPRI sponsorship in which published seismic PRAs were reviewed in order to gain insight into the seismic margins inherent in existing nuclear plants. The approach taken was to examine the fragilities of those components which have been found to be dominant contributors to seismic risk at plants in low-to-moderate seismic regions (SSE levels between 0.12g and 0.25g). It is concluded that there is significant margin inherent in the capacity of most critical components above the plant design basis. For ground motions less than about 0.3g, the predominant sources of seismic risk are loss of offsite power coupled with random failure of the emergency diesels, non-recoverable circuit breaker trip due to relay chatter, unanchored equipment, unreinforced non-load-bearing block walls, vertical water storage tanks, systems interactions, and possibly soil liquefaction. Recommendations as to which components should be reviewed in seismic margin studies for margin earthquakes less than 0.3g, between 0.3g and 0.5g, and greater than 0.5g, developed by the NRC expert panel on the quantification of seismic margins (based on the review of past PRA data, earthquake experience data, and their own personal experience), are presented.

  1. On smoothness-asymmetric null infinities

    International Nuclear Information System (INIS)

    Valiente Kroon, Juan Antonio

    2006-01-01

    We discuss the existence of asymptotically Euclidean initial data sets for the vacuum Einstein field equations which would give rise (modulo an existence result for the evolution equations near spatial infinity) to developments with a past and a future null infinity of different smoothness. For simplicity, the analysis is restricted to the class of conformally flat, axially symmetric initial data sets. It is shown how the free parameters in the second fundamental form of the data can be used to satisfy certain obstructions to the smoothness of null infinity. The resulting initial data sets could be interpreted as those of some sort of (nonlinearly) distorted Schwarzschild black hole. Their developments would admit a peeling future null infinity while at the same time having a polyhomogeneous (non-peeling) past null infinity.

  2. Smooth homogeneous structures in operator theory

    CERN Document Server

    Beltita, Daniel

    2005-01-01

    Geometric ideas and techniques play an important role in operator theory and the theory of operator algebras. Smooth Homogeneous Structures in Operator Theory builds the background needed to understand this circle of ideas and reports on recent developments in this fruitful field of research. Requiring only a moderate familiarity with functional analysis and general topology, the author begins with an introduction to infinite dimensional Lie theory with emphasis on the relationship between Lie groups and Lie algebras. A detailed examination of smooth homogeneous spaces follows. This study is illustrated by familiar examples from operator theory and develops methods that allow endowing such spaces with structures of complex manifolds. The final section of the book explores equivariant monotone operators and Kähler structures. It examines certain symmetry properties of abstract reproducing kernels and arrives at a very general version of the construction of restricted Grassmann manifolds from the theory of loo...

  3. Does responsive pricing smooth demand shocks?

    OpenAIRE

    Pascal, Courty; Mario, Pagliero

    2011-01-01

    Using data from a unique pricing experiment, we investigate Vickrey’s conjecture that responsive pricing can be used to smooth both predictable and unpredictable demand shocks. Our evidence shows that increasing the responsiveness of price to demand conditions reduces the magnitude of deviations in capacity utilization rates from a pre-determined target level. A 10 percent increase in price variability leads to a decrease in the variability of capacity utilization rates between...

  4. The Smooth Muscle of the Artery

    Science.gov (United States)

    1975-01-01

    of vascular smooth muscle are contraction, thereby mediating vasoconstriction, and the synthesis of the extracellular proteins and polysaccharides ...of the monosaccharides turned out to be different, for instance, from cornea to aorta (229, 283). In the conditions employed (4 hours incubation at 37 degrees... polysaccharides only. This glycoprotein is not very rich in sugar components (~5%) (228, 284), but is a very acidic protein (286). Fig. 66 shows

  5. seismic-py: Reading seismic data with Python

    Directory of Open Access Journals (Sweden)

    2008-08-01

    The field of seismic exploration of the Earth has changed dramatically over the last half century. The Society of Exploration Geophysicists (SEG) has worked to create standards to store the vast amounts of seismic data in a way that will be portable across computer architectures. However, it has been impossible to predict the needs of the immense range of seismic data acquisition systems. As a result, vendors have had to bend the rules to accommodate the needs of new instruments and experiment types. For low-level access to seismic data, there is a need for a standard open source library to allow access to a wide range of vendor data files that can handle all of the variations. A new seismic software package, seismic-py, provides an infrastructure for creating and managing drivers for each particular format. Drivers can be derived from one of the known formats and altered to handle any slight variations. Alternatively, drivers can be developed from scratch for formats that are very different from any previously defined format. Python has been the key to making driver development easy and efficient to implement. The goal of seismic-py is to be the base system that will power a wide range of experimentation with seismic data and at the same time provide clear documentation for the historical record of seismic data formats.
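    The driver infrastructure described above, derive from a known format and override only what a vendor changed, is essentially a registry of format handlers. The sketch below shows one way such a design can look in Python; every class, method, and format name here is invented for illustration and is not seismic-py's actual API.

```python
# Hypothetical driver registry in the spirit of the abstract (not seismic-py).
class Driver:
    registry: dict = {}

    def __init_subclass__(cls, fmt=None, **kw):
        super().__init_subclass__(**kw)
        if fmt:                       # register each concrete driver by name
            Driver.registry[fmt] = cls

    def read_trace_header(self, raw: bytes) -> dict:
        raise NotImplementedError


class SegYDriver(Driver, fmt="segy"):
    def read_trace_header(self, raw):
        # Illustrative field: a big-endian trace sequence number in bytes 0-3
        return {"trace_seq": int.from_bytes(raw[0:4], "big")}


class VendorXDriver(SegYDriver, fmt="vendor-x"):
    # A vendor variant that "bends the rules": same field, little-endian
    def read_trace_header(self, raw):
        return {"trace_seq": int.from_bytes(raw[0:4], "little")}


def open_driver(fmt: str) -> Driver:
    return Driver.registry[fmt]()


print(open_driver("segy").read_trace_header(b"\x00\x00\x00\x07rest"))
# → {'trace_seq': 7}
```

    New vendor quirks then cost one small subclass rather than a fork of the reader, which is the maintainability argument the abstract makes.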

  6. Log canonical thresholds of smooth Fano threefolds

    International Nuclear Information System (INIS)

    Cheltsov, Ivan A; Shramov, Konstantin A

    2008-01-01

    The complex singularity exponent is a local invariant of a holomorphic function determined by the integrability of fractional powers of the function. The log canonical thresholds of effective Q-divisors on normal algebraic varieties are algebraic counterparts of complex singularity exponents. For a Fano variety, these invariants have global analogues. In the former case, it is the so-called α-invariant of Tian; in the latter case, it is the global log canonical threshold of the Fano variety, which is the infimum of log canonical thresholds of all effective Q-divisors numerically equivalent to the anticanonical divisor. An appendix to this paper contains a proof that the global log canonical threshold of a smooth Fano variety coincides with its α-invariant of Tian. The purpose of the paper is to compute the global log canonical thresholds of smooth Fano threefolds (altogether, there are 105 deformation families of such threefolds). The global log canonical thresholds are computed for every smooth threefold in 64 deformation families, and the global log canonical thresholds are computed for a general threefold in 20 deformation families. Some bounds for the global log canonical thresholds are computed for 14 deformation families. Appendix A is due to J.-P. Demailly.

  7. Smooth Nb surfaces fabricated by buffered electropolishing

    International Nuclear Information System (INIS)

    Wu, Andy T.; Mammosser, John; Phillips, Larry; Delayen, Jean; Reece, Charles; Wilkerson, Amy; Smith, David; Ike, Robert

    2007-01-01

    It was demonstrated that smooth Nb surfaces could be obtained through buffered electropolishing (BEP) employing an electrolyte consisting of lactic, sulfuric, and hydrofluoric acids. Parameters that control the polishing process were optimized to achieve a smooth surface finish. The polishing rate of BEP was determined to be 0.646 μm/min, much higher than the 0.381 μm/min achieved by the conventional electropolishing (EP) process widely used in the superconducting radio frequency (SRF) community. Root-mean-square measurements using a 3D profilometer revealed that Nb surfaces treated by BEP were an order of magnitude smoother than those treated by the optimized EP process. The chemical composition of the Nb surfaces after BEP was analyzed by static and dynamic secondary ion mass spectrometry (SIMS) systems. SIMS results implied that the surface oxide structure of Nb might be more complicated than is usually believed and could be inhomogeneous. Preliminary results of BEP on Nb SRF single cell cavities and half-cells were reported. It was shown that smooth and bright surfaces could be obtained in 1800 s when the electric field inside an SRF cavity was uniform during a BEP process. This study showed that BEP is a promising technique for surface treatment of Nb SRF cavities to be used in particle accelerators.

  8. Scenario for a Short-Term Probabilistic Seismic Hazard Assessment (PSHA) in Chiayi, Taiwan

    Directory of Open Access Journals (Sweden)

    Chung-Han Chan

    2013-01-01

    Using seismic activity and the Meishan earthquake sequence that occurred from 1904 to 1906, a scenario for short-term probabilistic seismic hazard in the Chiayi region of Taiwan is assessed. The long-term earthquake occurrence rate in Taiwan was evaluated using a smoothing kernel, and the highest seismicity rate was calculated around the Chiayi region. To consider earthquake interactions, the rate-and-state friction model was introduced to estimate the evolution of the seismicity rate due to the Coulomb stress change. As imparted by the 1904 Touliu earthquake, the stress changes near the 1906 Meishan and Yangshuigang epicenters were larger than the magnitude of tidal triggering. With regard to the impact of the Meishan earthquake, the region close to the Yangshuigang earthquake epicenter experienced a stress increase of +0.75 bar. The results indicate significant interaction between the three damaging events. Considering path and site effects using ground motion prediction equations, a probabilistic seismic hazard was assessed in the form of a hazard evolution and a hazard map. A significant elevation in hazard following the three earthquakes in the sequence was determined. The results illustrate a possible scenario for seismic hazards in the Chiayi region which may take place repeatedly in the future. Such a scenario provides essential information for earthquake preparation, devastation estimates, emergency sheltering, utility restoration, and structure reconstruction.
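    The seismicity-rate evolution after a Coulomb stress step that the abstract invokes is commonly computed with the closed-form rate-and-state response of Dieterich (1994): R(t) = r / [(e^(-ΔCFS/Aσ) − 1) e^(−t/t_a) + 1], where r is the background rate and t_a the aftershock duration. A minimal sketch follows; the parameter values (Aσ, t_a) are illustrative assumptions, not those of the study, though the +0.75 bar step is taken from the abstract.

```python
import numpy as np


def dieterich_rate(t, dcfs, a_sigma, t_a, r=1.0):
    """Seismicity rate at time t after a Coulomb stress step dcfs
    (Dieterich, 1994).  t and t_a share units; dcfs and a_sigma share
    units; r is the background rate."""
    return r / ((np.exp(-dcfs / a_sigma) - 1.0) * np.exp(-t / t_a) + 1.0)


# Illustrative (assumed) parameters: +0.75 bar step, A*sigma = 0.1 bar,
# aftershock duration t_a = 5 years
t = np.linspace(0.0, 50.0, 6)
print(dieterich_rate(t, dcfs=0.75, a_sigma=0.1, t_a=5.0))
```

    Immediately after the step the rate jumps by a factor exp(ΔCFS/Aσ) and then relaxes back toward the background rate over a few aftershock durations, which is the "hazard evolution" behavior the abstract maps in time.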

  9. Seasonal variations in shallow Alaska seismicity and stress modulation from GRACE derived hydrological loading

    Science.gov (United States)

    Johnson, C. W.; Fu, Y.; Burgmann, R.

    2017-12-01

    Shallow (≤50 km), low magnitude (M≥2.0) seismicity in southern Alaska is examined for seasonal variations during the annual hydrological cycle. The seismicity is declustered with a spatio-temporal epidemic type aftershock sequence (ETAS) model. The removal of aftershock sequences allows detailed investigation of seismicity rate changes, as water and ice loads modulate crustal stresses throughout the year. The GRACE surface loads are obtained from the JPL mass concentration blocks (mascons) global land and ocean solutions. The data product is smoothed with a 9˚ Gaussian filter and interpolated on a 25 km grid. To inform the surface loading model, the global solutions are limited to the region from -160˚ to -120˚ and 50˚ to 70˚. The stress changes are calculated using a 1D spherical layered earth model at depth intervals of 10 km from 10 - 50 km in the study region. To evaluate the induced seasonal stresses, we use >30 years of earthquake focal mechanisms to constrain the background stress field orientation and assess the stress change with respect to the principal stress orientation. The background stress field is assumed to control the preferred orientation of faulting, and stress field perturbations are expected to increase or decrease seismicity. The number of excess earthquakes is calculated with respect to the background seismicity rates. Here, we present preliminary results for the shallow seismicity variations and quantify the seasonal stresses associated with changes in hydrological loading.
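    Smoothing a gridded mascon product with a Gaussian filter, as described above, is in essence a separable kernel convolution over the grid. The sketch below is a generic illustration with pure NumPy (grid size, sigma in grid cells, and function names are assumptions, not the authors' processing code), applied to a unit point load:

```python
import numpy as np


def gaussian_kernel(sigma_cells, radius=None):
    """Normalized 1-D Gaussian kernel sampled on grid cells."""
    r = radius or int(3 * sigma_cells)
    x = np.arange(-r, r + 1)
    k = np.exp(-0.5 * (x / sigma_cells) ** 2)
    return k / k.sum()


def smooth_grid(field, sigma_cells):
    """Separable 2-D Gaussian smoothing (convolve rows, then columns)."""
    k = gaussian_kernel(sigma_cells)
    out = np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 0, field)
    out = np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 1, out)
    return out


load = np.zeros((41, 41))
load[20, 20] = 1.0                     # unit point load on the grid
sm = smooth_grid(load, sigma_cells=3.0)
print(sm.max(), sm.sum())              # peak is spread out; total mass kept
```

    Because the kernel is normalized and separable, the total load is conserved (away from the grid edges) while short-wavelength structure, which GRACE cannot resolve, is suppressed.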

  10. Assessment of finite element and smoothed particles hydrodynamics methods for modeling serrated chip formation in hardened steel

    Directory of Open Access Journals (Sweden)

    Usama Umer

    2016-05-01

    This study performs comparative analyses of modeling serrated chip morphologies using traditional finite element and smoothed particles hydrodynamics methods. Although finite element models have been employed to predict machining performance variables for the last two decades, many drawbacks and limitations remain in current finite element models. Problems such as excessive mesh distortion, the high numerical cost of adaptive meshing techniques, and the need for geometric chip separation criteria hinder their practical implementation in the metal cutting industry. In this study, a mesh-free method, namely smoothed particles hydrodynamics, is implemented for modeling serrated chip morphology while machining AISI H13 hardened tool steel. The smoothed particles hydrodynamics models are compared with the traditional finite element models, and it has been found that the smoothed particles hydrodynamics models handle large distortions well and do not need any geometric or mesh-based chip separation criterion.
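    The mesh-free method compared above replaces mesh elements with kernel-weighted sums over neighboring particles, which is why it tolerates large distortions. The sketch below is a generic 1-D SPH illustration, not the authors' machining model: it evaluates the standard cubic-spline kernel and uses it to estimate density on a uniform particle line.

```python
import numpy as np


def cubic_spline_w(r, h):
    """Standard 1-D cubic-spline SPH kernel (normalization 2/(3h))."""
    q = np.abs(r) / h
    w = np.where(q < 1.0, 1.0 - 1.5 * q ** 2 + 0.75 * q ** 3,
                 np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0))
    return (2.0 / (3.0 * h)) * w


# Particles on a uniform line; SPH density: rho_i = sum_j m_j W(x_i - x_j, h)
dx, h = 0.1, 0.15                      # particle spacing and smoothing length
x = np.arange(0.0, 10.0 + dx / 2, dx)
m = 1.0 * dx                           # mass per particle for unit density
rho = np.array([np.sum(m * cubic_spline_w(xi - x, h)) for xi in x])

print(rho[len(x) // 2])                # interior density, close to 1
```

    No connectivity between particles is ever stored, so a "chip" of particles can shear and separate freely, which is exactly the property that removes the need for a geometric chip separation criterion.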

  11. Seismicity and tectonics of Bangladesh

    International Nuclear Information System (INIS)

    Hossain, K.M.

    1989-05-01

    Northern and eastern Bangladesh and the surrounding areas belong to a seismically active zone associated with the subduction of the Indian plate. The seismicity and tectonics have been studied in detail and the observations have been correlated to understand the earthquake phenomenon in the region. The morphotectonic behaviour of northern Bangladesh shows that it is closely related to the movement of the Dauki fault system and the relative uplift of the Shillong plateau. Contemporary seismicity in the Dauki fault system is relatively quiet compared with that in the Naga-Disang-Haflong thrust belt, raising the probability of a sudden release of the energy being accumulated in the vicinity of the Dauki fault system. This observation corresponds with the predicted average return period of a large earthquake (1897 type), and the possibility of an M > 8 earthquake in the vicinity of the Dauki fault within this century should not be ruled out. The seismicity in the folded belt in the east follows the general trend of the Arakan-Yoma anticlinorium and represents shallow, low-angle thrust movements, in conformity with field observation. Seismotectonic behaviour in the deep basin part of Bangladesh demonstrates that intraplate movement in the basement rock has been taking place along deep-seated faults, causing relative uplift and subsidence in the basin. Bangladesh has been divided into three seismic zones on the basis of morphotectonic and seismic behaviour, and Zone I has been identified as the zone of high seismic risk. (author). 43 refs, 5 figs, 3 tabs

  12. Seismic hazard assessment: Issues and alternatives

    Science.gov (United States)

    Wang, Z.

    2011-01-01

    Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different, and seismic risk is the more important of the two in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implications for society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences for society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.

  13. Reassessment of probabilistic seismic hazard in the Marmara region

    Science.gov (United States)

    Kalkan, Erol; Gulkan, Polat; Yilmaz, Nazan; Çelebi, Mehmet

    2009-01-01

    In 1999, the eastern coastline of the Marmara region (Turkey) witnessed increased seismic activity on the North Anatolian fault (NAF) system with two damaging earthquakes (M 7.4 Kocaeli and M 7.2 Düzce) that occurred almost three months apart. These events reduced stress on the western segment of the NAF where it continues under the Marmara Sea. The undersea fault segments have recently been explored using bathymetric and reflection surveys. These recent findings have helped scientists to understand the seismotectonic environment of the Marmara basin, which has remained a perplexing tectonic domain. On the basis of the newly collected data, the seismic hazard of the Marmara region is reassessed using a probabilistic approach. Two different earthquake source models, (1) the smoothed-gridded seismicity model and (2) the fault model, and alternate magnitude-frequency relations, Gutenberg-Richter and characteristic, were used with local and imported ground-motion-prediction equations. Regional exposure is computed and quantified on a set of hazard maps that provide peak horizontal ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 sec on a uniform firm-rock site condition (760 m/sec average shear wave velocity in the upper 30 m). These acceleration levels were computed for ground motions having 2% and 10% probabilities of exceedance in 50 yr, corresponding to return periods of about 2475 and 475 yr, respectively. The maximum PGA computed (at rock sites) is 1.5g along the fault segments of the NAF zone extending into the Marmara Sea. The new maps generally show a 10% to 15% increase in PGA and in 0.2 and 1.0 sec spectral acceleration values across much of Marmara compared to previous regional hazard maps. Hazard curves and smooth design spectra for three site conditions, rock, soil, and soft soil, are provided for the Istanbul metropolitan area as possible tools for future risk estimates.
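    The link between the exceedance probabilities and return periods quoted above follows from the Poisson assumption: if an event has probability p of occurring at least once in t years, its return period is T = −t / ln(1 − p). A quick check reproduces the paper's 2475 yr and 475 yr values:

```python
import math


def return_period(p_exceed, t_years):
    """Return period of a Poissonian event with probability p_exceed
    of at least one occurrence in t_years."""
    return -t_years / math.log(1.0 - p_exceed)


# The 2%- and 10%-in-50-yr hazard levels used in the study
print(round(return_period(0.02, 50)), round(return_period(0.10, 50)))
# → 2475 475
```

    The same relation is used across probabilistic hazard maps generally, which is why the 2%/50 yr and 10%/50 yr levels recur in building-code spectra.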

  14. Integrated system for seismic evaluations

    International Nuclear Information System (INIS)

    Xu, J.; Philippacopoulos, A.J.; Miller, C.A.; Costantino, C.J.; Graves, H.

    1989-01-01

    This paper describes the various features of the Seismic Module of the CARES system (Computer Analysis for Rapid Evaluation of Structures). This system was developed by Brookhaven National Laboratory (BNL) for the US Nuclear Regulatory Commission to perform rapid evaluations of the structural behavior and capability of nuclear power plant facilities. CARES is structured in a modular format; each module performs a specific type of analysis, i.e., static or dynamic, linear or nonlinear, etc. This paper describes the features of the Seismic Module in particular. The development of the Seismic Module is based on an approach that incorporates all major aspects of seismic analysis currently employed by the industry into an integrated system that allows computations of structural response to seismic motions to be carried out interactively. The code operates on a PC and has multi-graphics capabilities. It has been designed with user-friendly features and allows interactive manipulation of the various analysis phases during the seismic design process. The capabilities of the Seismic Module include (a) generation of artificial time histories compatible with given design ground response spectra, (b) development of power spectral density (PSD) functions associated with the seismic input, (c) deconvolution analysis using vertically propagating shear waves through a given soil profile, and (d) development of in-structure response spectra or corresponding PSDs. It should be pointed out that these types of analyses can also be performed individually using available computer codes such as FLUSH, SAP, etc. The uniqueness of CARES, however, lies in its ability to perform all required phases of the seismic analysis in an integrated manner. 5 refs., 6 figs
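    Item (a), spectrum-compatible artificial time histories, is commonly produced by superposing sinusoids with random phases and then iteratively rescaling their amplitudes until the motion's response spectrum matches the target. A minimal sketch of the synthesis step only (a generic illustration of the technique, not the CARES implementation; all names and parameter values are hypothetical):

```python
import math
import random

random.seed(1)

def synth_history(amplitudes, freqs_hz, dt=0.01, n=1000):
    """Superpose sinusoids with random phases. In a full spectrum-matching
    scheme the amplitudes would be rescaled iteratively until the response
    spectrum of the motion matches the target design spectrum."""
    phases = [random.uniform(0.0, 2.0 * math.pi) for _ in freqs_hz]
    return [
        sum(a * math.sin(2.0 * math.pi * f * k * dt + ph)
            for a, f, ph in zip(amplitudes, freqs_hz, phases))
        for k in range(n)
    ]

# 10 s synthetic accelerogram built from three illustrative frequency components
accel = synth_history(amplitudes=[1.0, 0.5, 0.25], freqs_hz=[1.0, 3.0, 8.0])
print(len(accel))  # 1000
```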

  15. Canadian seismic agreement

    International Nuclear Information System (INIS)

    Wetmiller, R.J.; Lyons, J.A.; Shannon, W.E.; Munro, P.S.; Thomas, J.T.; Andrew, M.D.; Lamontagne, M.; Wong, C.; Anglin, F.M.; Plouffe, M.; Lapointe, S.P.; Adams, J.; Drysdale, J.A.

    1990-04-01

    This is the twenty-first progress report under the agreement entitled Canadian Seismic Agreement between the US Nuclear Regulatory Commission (NRC) and the Canadian Commercial Corporation. Activities undertaken by the Geophysics Division of the Geological Survey of Canada (GD/GSC) during the period from July 01, 1988 to June 30, 1989, and supported in part by the NRC agreement, are described below under four headings: Eastern Canada Telemetered Network and local network developments, Datalab developments, strong-motion network developments, and earthquake activity. In this time period eastern Canada experienced its largest earthquake in over 50 years. This earthquake, which has been christened the Saguenay earthquake, has provided a wealth of new data pertinent to earthquake engineering studies in eastern North America and is the subject of many continuing studies presently being carried out at GD and elsewhere. 41 refs., 21 figs., 7 tabs

  16. Artificial seismic acceleration

    Science.gov (United States)

    Felzer, Karen R.; Page, Morgan T.; Michael, Andrew J.

    2015-01-01

    In their 2013 paper, Bouchon, Durand, Marsan, Karabulut, and Schmittbuhl (BDMKS) claim to see significant accelerating seismicity before M 6.5 interplate mainshocks, but not before intraplate mainshocks, reflecting a preparatory process before large events. We concur with the finding of BDMKS that their interplate dataset has significantly more foreshocks than their intraplate dataset; however, we disagree that the foreshocks are predictive of large events in particular. Acceleration in stacked foreshock sequences has been seen before and has been explained by the cascade model, in which earthquakes occasionally trigger aftershocks larger than themselves. In this model, the time lags between the smaller mainshocks and larger aftershocks follow the inverse power law common to all aftershock sequences, creating an apparent acceleration when stacked (see Supplementary Information).
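    The apparent acceleration from stacking can be reproduced with a toy cascade-model simulation (all parameter values here are illustrative, not those of BDMKS): draw foreshock-to-mainshock time lags from an Omori-type inverse power law and bin the stacked lags; event counts then rise toward the larger event with no preparatory process involved.

```python
import random

random.seed(0)

def omori_lag(c=0.1, p=1.2, t_max=100.0):
    """Inverse-transform sample of a lag from a density ~ (t + c)**(-p),
    truncated to [0, t_max] (an Omori-type inverse power law)."""
    u = random.random()
    a, b = c ** (1.0 - p), (t_max + c) ** (1.0 - p)
    t = (a - u * (a - b)) ** (1.0 / (1.0 - p)) - c
    return max(t, 0.0)  # guard against tiny negative rounding error

# Stack lags from many simulated small-mainshock / larger-aftershock pairs and
# bin them by time before the larger event: bin 0 is closest to it.
lags = [omori_lag() for _ in range(10_000)]
bins = [0] * 10
for lag in lags:
    bins[int(lag // 10.0)] += 1

print(bins)  # counts concentrate in bin 0, i.e. just before the larger event
```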

  17. Seismics - Yesterday and today

    International Nuclear Information System (INIS)

    Frei, W.

    2014-01-01

    This article, published in the Swiss Bulletin for Applied Geology, reviews technical developments in the field of seismic exploration over the past 25 years. In particular, developments in information technology are discussed; increased data-storage capacities and the miniaturization of data-capture systems and sensors are examined. In spite of these developments, the quality of the acquired seismic data is said not to have increased significantly. Alternatives to vibration-based seismic exploration are discussed, and the challenges faced by near-surface seismics are examined. Computer-based statistical correction of data and improved resolution are discussed, as is hybrid seismics. Examples are quoted and graphically illustrated. A list of relevant literature completes the article

  18. Seismic and Infrasound Location

    Energy Technology Data Exchange (ETDEWEB)

    Arrowsmith, Stephen J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Begnaud, Michael L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-03-19

    This presentation includes slides on Signal Propagation Through the Earth/Atmosphere Varies at Different Scales; 3D Seismic Models: RSTT; Ray Coverage (Pn); Source-Specific Station Corrections (SSSCs); RSTT Conclusions; SALSA3D (SAndia LoS Alamos) Global 3D Earth Model for Travel Time; Comparison of IDC SSSCs to RSTT Predictions; SALSA3D; Validation and Model Comparison; DSS Lines in the Siberian Platform; DSS Line CRA-4 Comparison; Travel Time Δak135; Travel Time Prediction Uncertainty; SALSA3D Conclusions; Infrasound Data Processing: An example event; Infrasound Data Processing: An example event; Infrasound Location; How does BISL work?; BISL: Application to the 2013 DPRK Test; and BISL: Ongoing Research.

  19. Seismic Fracture Characterization Methodologies for Enhanced Geothermal Systems

    Energy Technology Data Exchange (ETDEWEB)

    Queen, John H. [Hi-Geophysical, Inc., Ponca, OK (United States)

    2016-05-09

    Executive Summary: The overall objective of this work was the development of surface and borehole seismic methodologies, using both compressional and shear waves, for characterizing faults and fractures in Enhanced Geothermal Systems. We used both surface seismic and vertical seismic profile (VSP) methods, and adapted them to the unique conditions encountered in Enhanced Geothermal System (EGS) creation. These conditions include geological environments with volcanic cover, highly altered rocks, severe structure, extreme near-surface velocity contrasts, and a lack of distinct velocity contrasts at depth. One of the objectives was the development of methods for identifying more appropriate seismic acquisition parameters for overcoming problems associated with these geological factors. Because temperatures up to 300 °C are often encountered in these systems, another objective was the testing of VSP borehole tools capable of operating at depths in excess of 1,000 m and at temperatures in excess of 200 °C. A final objective was the development of new processing and interpretation techniques based on scattering and time-frequency analysis, as well as the application of modern seismic migration imaging algorithms to seismic data acquired over geothermal areas. The use of surface seismic reflection data at Brady's Hot Springs was found useful in building a geological model, but only when combined with other extensive geological and geophysical data. The use of fine source and geophone spacing was critical in producing useful images. The surface seismic reflection data gave no information about the internal structure (extent, thickness and filling) of faults and fractures, and modeling suggests that they are unlikely to do so. Time-frequency analysis was applied to these data, but was not found to be significantly useful in their interpretation. Modeling does indicate that VSP and other seismic methods with sensors located at depth in wells will be the most

  20. Seismic retrofitting of Apsara reactor building

    International Nuclear Information System (INIS)

    Reddy, G.R.; Parulekar, Y.M.; Sharma, A.; Rao, K.N.; Narasimhan, Rajiv; Srinivas, K.; Basha, S.M.; Thomas, V.S.; Soma Kumar, K.

    2006-01-01

    A seismic analysis of the Apsara reactor building was carried out, and the building was found not to meet current seismic requirements. Since the building did not qualify for seismic loads, a retrofit scheme using elasto-plastic dampers was proposed. The following activities have been performed in this direction: a detailed seismic analysis of the Apsara reactor building structure incorporating the proposed seismic retrofit; demonstration, by analysis and by model studies, of the capability of the retrofitted structure to withstand the earthquake level for the Trombay site as per current standards; and implementation of the seismic retrofit program. This paper presents the details of the above aspects related to the seismic analysis and retrofitting of the Apsara reactor building. (author)

  1. Comparison of seismic isolation concepts for FBR

    International Nuclear Information System (INIS)

    Shiojiri, H.; Mazda, T.; Kasai, H.; Kanda, J.N.; Kubo, T.; Madokoro, M.; Shimomura, T.; Nojima, O.

    1989-01-01

    This paper seeks to verify the reliability and effectiveness of seismic isolation for FBRs. Some results of the preliminary study of the program are described. Seismic isolation concepts and corresponding seismic isolation devices were selected. Three kinds of seismically isolated FBR plant concepts were developed by applying promising seismic isolation concepts to the non-isolated FBR plant and by developing plant component layout plans and building structural designs. Each plant was subjected to seismic response analysis, and the reduction in the amount of material in components and buildings was estimated for each seismic isolation concept. Research and development items were evaluated

  2. Seismic efficiency of meteor airbursts

    Science.gov (United States)

    Svetsov, V. V.; Artemieva, N. A.; Shuvalov, V. V.

    2017-08-01

    We present the results of numerical simulations of impacts of relatively small asteroids and ice bodies, 30-100 m in size, that decelerate in the atmosphere and explode before they reach the surface, but still produce seismic effects due to the shock wave reaching the ground. The calculated magnitudes fall within the range of 4 to 6, and the average seismic efficiency of these events is 2.5 × 10⁻⁵. The results obtained allow the seismic hazard from impacts of cosmic bodies to be estimated.
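    A seismic efficiency of this kind can be turned into a magnitude estimate by combining the impactor's kinetic energy with the Gutenberg-Richter energy-magnitude relation, log10 E_s = 1.5 M + 4.8 (E_s in joules). The impactor parameters below are illustrative assumptions, not values from the paper:

```python
import math

def seismic_magnitude(diameter_m, density, velocity, efficiency=2.5e-5):
    """Magnitude from impactor kinetic energy times seismic efficiency,
    via the Gutenberg-Richter energy relation log10(Es) = 1.5*M + 4.8."""
    radius = diameter_m / 2.0
    mass = density * (4.0 / 3.0) * math.pi * radius ** 3
    e_kinetic = 0.5 * mass * velocity ** 2
    e_seismic = efficiency * e_kinetic
    return (math.log10(e_seismic) - 4.8) / 1.5

# A hypothetical 50 m stony body (3000 kg/m^3) entering at 20 km/s:
print(round(seismic_magnitude(50, 3000, 20e3), 1))  # 4.8, within the 4-6 range
```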

  3. SEISMIC ANALYSIS FOR PRECLOSURE SAFETY

    Energy Technology Data Exchange (ETDEWEB)

    E.N. Lindner

    2004-12-03

    The purpose of this seismic preclosure safety analysis is to identify the potential seismically-initiated event sequences associated with preclosure operations of the repository at Yucca Mountain and assign appropriate design bases to provide assurance of achieving the performance objectives specified in the Code of Federal Regulations (CFR) 10 CFR Part 63 for radiological consequences. This seismic preclosure safety analysis is performed in support of the License Application for the Yucca Mountain Project. In more detail, this analysis identifies the systems, structures, and components (SSCs) that are subject to seismic design bases. This analysis assigns one of two design basis ground motion (DBGM) levels, DBGM-1 or DBGM-2, to SSCs important to safety (ITS) that are credited in the prevention or mitigation of seismically-initiated event sequences. An application of seismic margins approach is also demonstrated for SSCs assigned to DBGM-2 by showing a high confidence of a low probability of failure at a higher ground acceleration value, termed a beyond-design basis ground motion (BDBGM) level. The objective of this analysis is to meet the performance requirements of 10 CFR 63.111(a) and 10 CFR 63.111(b) for offsite and worker doses. The results of this calculation are used as inputs to the following: (1) A classification analysis of SSCs ITS by identifying potential seismically-initiated failures (loss of safety function) that could lead to undesired consequences; (2) An assignment of either DBGM-1 or DBGM-2 to each SSC ITS credited in the prevention or mitigation of a seismically-initiated event sequence; and (3) A nuclear safety design basis report that will state the seismic design requirements that are credited in this analysis. The present analysis reflects the design information available as of October 2004 and is considered preliminary. The evolving design of the repository will be re-evaluated periodically to ensure that seismic hazards are properly

  4. SEISMIC ANALYSIS FOR PRECLOSURE SAFETY

    International Nuclear Information System (INIS)

    E.N. Lindner

    2004-01-01

    The purpose of this seismic preclosure safety analysis is to identify the potential seismically-initiated event sequences associated with preclosure operations of the repository at Yucca Mountain and assign appropriate design bases to provide assurance of achieving the performance objectives specified in the Code of Federal Regulations (CFR) 10 CFR Part 63 for radiological consequences. This seismic preclosure safety analysis is performed in support of the License Application for the Yucca Mountain Project. In more detail, this analysis identifies the systems, structures, and components (SSCs) that are subject to seismic design bases. This analysis assigns one of two design basis ground motion (DBGM) levels, DBGM-1 or DBGM-2, to SSCs important to safety (ITS) that are credited in the prevention or mitigation of seismically-initiated event sequences. An application of seismic margins approach is also demonstrated for SSCs assigned to DBGM-2 by showing a high confidence of a low probability of failure at a higher ground acceleration value, termed a beyond-design basis ground motion (BDBGM) level. The objective of this analysis is to meet the performance requirements of 10 CFR 63.111(a) and 10 CFR 63.111(b) for offsite and worker doses. The results of this calculation are used as inputs to the following: (1) A classification analysis of SSCs ITS by identifying potential seismically-initiated failures (loss of safety function) that could lead to undesired consequences; (2) An assignment of either DBGM-1 or DBGM-2 to each SSC ITS credited in the prevention or mitigation of a seismically-initiated event sequence; and (3) A nuclear safety design basis report that will state the seismic design requirements that are credited in this analysis. The present analysis reflects the design information available as of October 2004 and is considered preliminary. The evolving design of the repository will be re-evaluated periodically to ensure that seismic hazards are properly

  5. Adaptive capture of expert knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, C.L.; Jones, R.D. [Los Alamos National Lab., NM (United States); Hand, Un Kyong [Los Alamos National Lab., NM (United States)]|[US Navy (United States)

    1995-05-01

    A method is introduced that can directly acquire knowledge-engineered, rule-based logic in an adaptive network. This adaptive representation of the rule system can then replace the rule system in simulated intelligent agents and thereby permit further performance-based adaptation of the rule system. The approach described provides both weight-fitting network adaptation and potentially powerful rule mutation and selection mechanisms. Nonlinear terms are generated implicitly in the mutation process through the emergent interaction of multiple linear terms. By this method it is possible to acquire nonlinear relations that exist in the training data without adding hidden layers or imposing explicit nonlinear terms in the network. We smoothed and captured a set of expert rules with an adaptive network. The motivation for this was to (1) realize a speed advantage over traditional rule-based simulations; (2) obtain variability in the intelligent objects that is not possible with rule-based systems but is provided by adaptive systems; and (3) maintain the understandability of rule-based simulations. A set of binary rules was smoothed and converted into a simple set of arithmetic statements in which continuous, non-binary rules are permitted. A neural network, called the expert network, was developed to capture this rule set, which it was able to do with zero error. The expert network is also capable of learning a nonmonotonic term without a hidden layer. The trained network in feedforward operation is fast running, compact, and traceable to the rule base.
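    The conversion of binary rules into arithmetic statements can be sketched generically (a minimal illustration of rule smoothing, not the paper's expert-network formulation): replace Boolean connectives with continuous arithmetic that agrees with them at 0 and 1, so rule activations become graded and differentiable.

```python
def smooth_and(a, b):
    """Continuous AND: agrees with Boolean AND at {0, 1}, smooth in between."""
    return a * b

def smooth_or(a, b):
    """Continuous OR (probabilistic sum): agrees with Boolean OR at {0, 1}."""
    return a + b - a * b

def smooth_not(a):
    return 1.0 - a

# A binary rule "IF hot AND NOT safe THEN alarm" becomes an arithmetic
# statement whose output varies continuously with non-binary inputs:
def alarm(hot, safe):
    return smooth_and(hot, smooth_not(safe))

print(alarm(1.0, 0.0))            # 1.0: matches the crisp rule
print(round(alarm(0.7, 0.2), 2))  # 0.56: graded activation a crisp rule cannot express
```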

  6. Nodular smooth muscle metaplasia in multiple peritoneal endometriosis

    OpenAIRE

    Kim, Hyun-Soo; Yoon, Gun; Ha, Sang Yun; Song, Sang Yong

    2015-01-01

    We report here an unusual presentation of peritoneal endometriosis with smooth muscle metaplasia as multiple protruding masses on the lateral pelvic wall. Smooth muscle metaplasia is a common finding in rectovaginal endometriosis, whereas in peritoneal endometriosis, smooth muscle metaplasia is uncommon and its nodular presentation on the pelvic wall is even rarer. To the best of our knowledge, this is the first case of nodular smooth muscle metaplasia occurring in peritoneal endometriosis. A...

  7. Seismic gaps and plate tectonics: seismic potential for major boundaries

    Energy Technology Data Exchange (ETDEWEB)

    McCann, W R; Nishenko, S P; Sykes, L R; Krause, J

    1979-01-01

    The theory of plate tectonics provides a basic framework for evaluating the potential for future great earthquakes to occur along major plate boundaries. Along most of the transform and convergent plate boundaries considered in this paper, the majority of seismic slip occurs during large earthquakes, i.e., those of magnitude 7 or greater. The concepts that rupture zones, as delineated by aftershocks, tend to abut rather than overlap, and that large events occur in regions with histories of both long- and short-term seismic quiescence, are used in this paper to delineate major seismic gaps. The term seismic gap refers to any region along an active plate boundary that has not experienced a large thrust or strike-slip earthquake for more than 30 years. A region of high seismic potential is a seismic gap that, for historic or tectonic reasons, is considered likely to produce a large shock during the next few decades. The seismic gap technique provides estimates of the location and size of future events, and of their origin time to within a few tens of years at best. The accompanying map summarizes six categories of seismic potential for major plate boundaries in and around the margins of the Pacific Ocean and the Caribbean, South Sandwich and Sunda (Indonesia) regions for the next few decades. These six categories are meant to be interpreted as forecasts of the location and size of future large shocks and should not be considered predictions in which a precise estimate of the time of occurrence is specified. The categories of potential assigned here provide a rationale for setting priorities for instrumentation, for future studies aimed at predicting large earthquakes, and for making estimates of tsunami potential.

  8. Radial Basis Function Based Quadrature over Smooth Surfaces

    Science.gov (United States)

    2016-03-24

    Radial basis functions φ(r) include piecewise smooth, conditionally positive definite kernels such as the monomial |r|^(2m+1) (MN) and the thin plate spline |r|^(2m) ln|r| (TPS), as well as infinitely smooth kernels. … smooth surfaces using polynomial interpolants, while [27] couples thin plate spline interpolation (see Table 1) with Green's integral formula [29]

  9. Smoothing-Norm Preconditioning for Regularizing Minimum-Residual Methods

    DEFF Research Database (Denmark)

    Hansen, Per Christian; Jensen, Toke Koldborg

    2006-01-01

    take into account a smoothing norm for the solution. This technique is well established for CGLS, but it does not immediately carry over to minimum-residual methods when the smoothing norm is a seminorm or a Sobolev norm. We develop a new technique which works for any smoothing norm of the form $\\|L...

  10. Neurophysiology and Neuroanatomy of Smooth Pursuit in Humans

    Science.gov (United States)

    Lencer, Rebekka; Trillenberg, Peter

    2008-01-01

    Smooth pursuit eye movements enable us to focus our eyes on moving objects by utilizing well-established mechanisms of visual motion processing, sensorimotor transformation and cognition. Novel smooth pursuit tasks and quantitative measurement techniques can help unravel the different smooth pursuit components and complex neural systems involved…

  11. Progressive Seismic Failure, Seismic Gap, and Great Seismic Risk across the Densely Populated North China Basin

    Science.gov (United States)

    Yin, A.; Yu, X.; Shen, Z.

    2014-12-01

    Although the seismically active North China basin has the most complete written records of pre-instrumentation earthquakes in the world, this information has not been fully utilized for assessing potential earthquake hazards of this densely populated region that hosts ~200 million people. In this study, we use the historical records to document the earthquake migration pattern and the existence of a 180-km seismic gap along the 600-km long right-slip Tangshan-Hejian-Cixian (THC) fault zone that cuts across the North China basin. The newly recognized seismic gap, which is centered at Tianjin with a population of 11 million people and ~120 km from Beijing (22 million people) and Tangshan (7 million people), has not been ruptured in the past 1000 years by M≥6 earthquakes. The seismic migration pattern in the past millennium suggests that the epicenters of major earthquakes have shifted towards this seismic gap along the THC fault, which implies that the 180-km gap could be the site of the next great earthquake with M≈7.6 if it is ruptured by a single event. Alternatively, the seismic gap may be explained by aseismic creeping or seismic strain transfer between active faults.
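    An M ≈ 7.6 estimate for a single 180-km rupture is consistent with empirical rupture-length scaling; for instance, the Wells and Coppersmith (1994) regression for strike-slip surface rupture length, M = 5.16 + 1.12 log10 L (L in km), gives a comparable value (a consistency check on the quoted number, not a calculation from the paper):

```python
import math

def wells_coppersmith_ss(rupture_length_km):
    """Moment magnitude from strike-slip surface rupture length, using the
    Wells & Coppersmith (1994) regression M = 5.16 + 1.12 * log10(L)."""
    return 5.16 + 1.12 * math.log10(rupture_length_km)

print(round(wells_coppersmith_ss(180.0), 1))  # 7.7, close to the quoted M ~ 7.6
```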

  12. Post-seismic relaxation from geodetic and seismic data

    Directory of Open Access Journals (Sweden)

    Mikhail V. Rodkin

    2017-01-01

    We have examined the aftershock sequence and the post-seismic deformation process of the source area of the Parkfield earthquake (2004, M = 6, California, USA) using GPS data. This event was chosen because of the possibility of jointly analyzing data from the rather dense local GPS network (from the SOPAC Internet archive) and the rather detailed aftershock sequence data (http://www.ncedc.org/ncedc/catalog-search.html). The relaxation process of post-seismic deformation lasts about the same 400 days as the seismic aftershock process does; thus, the aftershock process and the relaxation process in deformation could be different sides of the same process. It should be noted that the ratio of the released seismic energy to the GPS-derived deformation is quite different for the main shock and for the aftershock stage: the ratio decreases essentially for the post-shock process. A similar change in the seismic energy/deformation ratio is found for a few other strong earthquakes. Thus, this decrease seems typical of aftershock sequences, testifying to a decrease in the ratio of elastic to inelastic deformation during post-shock relaxation, when the source area is mostly fractured after the main shock but the healing process has not yet had sufficient time to develop.

  13. The Lithosphere in Italy: Structure and Seismicity

    International Nuclear Information System (INIS)

    Brandmayr, Enrico; Blagoeva Raykova, Reneta; Zuri, Marco; Romanelli, Fabio; Doglioni, Carlo; Panza, Giuliano Francesco

    2010-07-01

    We propose a structural model for the lithosphere-asthenosphere system of the Italic region by means of the S-wave velocity (VS) distribution with depth. To obtain the velocity structure, the following methods are used in sequence: frequency-time analysis (FTAN); 2D tomography (plotted on a 1° × 1° grid); non-linear inversion; and a smoothing optimization method. The 3D VS structure (and its uncertainties) of the study region is assembled as a juxtaposition of the selected representative cellular models. The distribution of seismicity and heat flow is used as an independent constraint for the definition of the crustal and lithospheric thickness. The moment tensor inversion of recent damaging earthquakes which occurred in the Italic region is performed through a powerful non-linear technique and is related to the different rheological-mechanical properties of the crust and uppermost mantle. The obtained picture of the lithosphere-asthenosphere system for the Italic region confirms a mantle that is extremely stratified vertically and strongly heterogeneous laterally. The lateral variability in the mantle is interpreted in terms of subduction zones, slab dehydration, inherited mantle chemical anisotropies, asthenospheric upwellings, and so on. The western Alps and the Dinarides have slabs with low dip, whereas the Apennines show a steeper subduction. No evidence for any type of mantle plume is observed. The asymmetric expansion of the Tyrrhenian Sea, which may be interpreted as related to a relative eastward mantle flow with respect to the overlying lithosphere, is confirmed. (author)

  14. Smooth and non-smooth travelling waves in a nonlinearly dispersive Boussinesq equation

    International Nuclear Information System (INIS)

    Shen Jianwei; Xu Wei; Lei Youming

    2005-01-01

    The dynamical behavior and special exact solutions of the nonlinearly dispersive Boussinesq equation (B(m,n) equation), u_{tt} - u_{xx} - a(u^n)_{xx} + b(u^m)_{xxxx} = 0, are studied using the bifurcation theory of dynamical systems. As a result, all possible phase portraits in the parametric space for the travelling wave system are obtained, together with solitary wave, kink and anti-kink wave solutions and uncountably infinitely many smooth and non-smooth periodic wave solutions. It can be shown that the existence of a singular straight line in the travelling wave system is the reason why smooth waves finally converge to cusp waves. Various sufficient conditions guaranteeing the existence of the above solutions under different parametric conditions are given

  15. Seismic link at plate boundary

    Indian Academy of Sciences (India)

    process constrain the seismic hazard assessment. Some frequent issues … to obtain information on the causality between … 2004), and low frequency deep triggering (Miyazawa … can trigger shallow thrust fault earthquakes; Science 306.

  16. Worldwide Marine Seismic Reflection Profiles

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — NGDC maintains a large volume of both Analog and Digital seismic reflection data. Currently only a limited number of lines are available online. Digital data include...

  17. Shear wave profiles from surface wave inversion: the impact of uncertainty on seismic site response analysis

    International Nuclear Information System (INIS)

    Boaga, J; Vignoli, G; Cassiani, G

    2011-01-01

    Inversion is a critical step in all geophysical techniques, and is generally fraught with ill-posedness. In the case of seismic surface wave studies, the inverse problem can lead to different equivalent subsoil models and consequently to different local seismic response analyses. This can have a large impact on earthquake engineering design. In this paper, we discuss the consequences of the non-uniqueness of surface wave inversion for seismic responses, with both numerical and experimental data. Our goal is to evaluate the consequences for common seismic response analysis in the case of different impedance contrast conditions. We verify the implications of inversion uncertainty, and consequently of data information content, for realistic local site responses. A stochastic process is used to generate a set of 1D shear wave velocity profiles from several specific subsurface models. All these profiles are characterized as equivalent, i.e. their responses, in terms of a dispersion curve, are compatible with the uncertainty in the same surface wave data. The generated 1D shear velocity models are then subjected to a conventional one-dimensional seismic ground response analysis using a realistic input motion. While recent analyses claim that the consequences of surface wave inversion uncertainties are very limited, our tests show that a relationship exists between inversion confidence and seismic responses in different subsoils. In the case of a regular and relatively smooth increase of shear wave velocities with depth, as is usual in sedimentary plains, our results show that the choice of a specific model among equivalent solutions strongly influences the seismic response. On the other hand, when the shallow subsoil is characterized by a strong impedance contrast (thus revealing a characteristic soil resonance period), as is common in the presence of shallow bedrock, equivalent solutions provide practically the same seismic amplification, especially in the
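    The characteristic soil resonance period that appears with a strong impedance contrast is, for the simplest case of a uniform soft layer over rigid bedrock, the quarter-wavelength period T0 = 4H/Vs. A minimal illustration with assumed layer values (not data from the study):

```python
def fundamental_period(thickness_m, vs_m_s):
    """Fundamental site period of a uniform soil layer over stiff bedrock,
    from the quarter-wavelength rule T0 = 4*H/Vs."""
    return 4.0 * thickness_m / vs_m_s

# A hypothetical 30 m soft layer (Vs = 200 m/s) over shallow bedrock:
print(fundamental_period(30.0, 200.0))  # 0.6 (seconds)
```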

  18. ASIC PROTEINS REGULATE SMOOTH MUSCLE CELL MIGRATION

    OpenAIRE

    Grifoni, Samira C.; Jernigan, Nikki L.; Hamilton, Gina; Drummond, Heather A.

    2007-01-01

    The purpose of the present study was to investigate Acid Sensing Ion Channel (ASIC) protein expression and its importance in cellular migration. We recently demonstrated that Epithelial Na+ Channel (ENaC) proteins are required for vascular smooth muscle cell (VSMC) migration; however, the role of the closely related ASIC proteins has not been addressed. We used RT-PCR and immunolabeling to determine the expression of ASIC1, ASIC2, ASIC3 and ASIC4 in A10 cells. We used small interfering RNA to silence indi...

  19. A smooth exit from eternal inflation?

    Science.gov (United States)

    Hawking, S. W.; Hertog, Thomas

    2018-04-01

    The usual theory of inflation breaks down in eternal inflation. We derive a dual description of eternal inflation in terms of a deformed Euclidean CFT located at the threshold of eternal inflation. The partition function gives the amplitude of different geometries of the threshold surface in the no-boundary state. Its local and global behavior in dual toy models shows that the amplitude is low for surfaces which are not nearly conformal to the round three-sphere and essentially zero for surfaces with negative curvature. Based on this we conjecture that the exit from eternal inflation does not produce an infinite fractal-like multiverse, but is finite and reasonably smooth.

  20. On spaces of functions of smoothness zero

    International Nuclear Information System (INIS)

    Besov, Oleg V

    2012-01-01

    The paper is concerned with the new spaces B̄^0_{p,q} of functions of smoothness zero defined on the n-dimensional Euclidean space R^n or on a subdomain G of R^n. These spaces are compared with the spaces B^0_{p,q}(R^n) and bmo(R^n). The embedding theorems for Sobolev spaces are refined in terms of the space B̄^0_{p,q} with the limiting exponent. Bibliography: 8 titles.

  1. Smooth Nanowire/Polymer Composite Transparent Electrodes

    KAUST Repository

    Gaynor, Whitney; Burkhard, George F.; McGehee, Michael D.; Peumans, Peter

    2011-01-01

    Smooth composite transparent electrodes are fabricated via lamination of silver nanowires into the polymer poly(3,4-ethylenedioxythiophene):poly(styrenesulfonate) (PEDOT:PSS). The surface roughness is dramatically reduced compared to bare nanowires. High-efficiency P3HT:PCBM organic photovoltaic cells can be fabricated using these composites, reproducing the performance of cells on indium tin oxide (ITO) on glass and improving the performance of cells on ITO on plastic. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Workshop on advances in smooth particle hydrodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Wingate, C.A.; Miller, W.A.

    1993-12-31

    These proceedings contain viewgraphs presented at the 1993 workshop held at Los Alamos National Laboratory. Discussed topics include: negative stress, reactive flow calculations, interface problems, boundaries and interfaces, energy conservation in viscous flows, linked penetration calculations, stability and consistency of the SPH method, instabilities, wall heating and conservative smoothing, tensors, tidal disruption of stars, breaking the 10,000,000 particle limit, modelling relativistic collapse, SPH without H, relativistic KSPH avoidance of velocity based kernels, tidal compression and disruption of stars near a supermassive rotating black hole, and finally relativistic SPH viscosity and energy.

  3. Smooth Nanowire/Polymer Composite Transparent Electrodes

    KAUST Repository

    Gaynor, Whitney

    2011-04-29

    Smooth composite transparent electrodes are fabricated via lamination of silver nanowires into the polymer poly(3,4-ethylenedioxythiophene):poly(styrenesulfonate) (PEDOT:PSS). The surface roughness is dramatically reduced compared to bare nanowires. High-efficiency P3HT:PCBM organic photovoltaic cells can be fabricated using these composites, reproducing the performance of cells on indium tin oxide (ITO) on glass and improving the performance of cells on ITO on plastic. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Seismic Applications of Energy Dampers

    OpenAIRE

    Shambhu Sinha

    2004-01-01

    Damping devices based on the operating principle of high-velocity fluid flow through orifices have found numerous applications in the shock and vibration isolation of aerospace and defence systems. The study aims to investigate the feasibility of using energy-dissipating fluid viscous dampers in structures to protect against seismic loads, and to prove analytically and experimentally that fluid viscous dampers can improve the seismic capacity of a structure by reducing damage and displacement...

  5. Position paper: Seismic design criteria

    International Nuclear Information System (INIS)

    Farnworth, S.K.

    1995-01-01

    The purpose of this paper is to document the seismic design criteria to be used in the Title II design of the underground double-shell waste storage tanks and appurtenant facilities of the Multi-Function Waste Tank Facility (MWTF) project, and to provide the history and methodologies for determining the recommended Design Basis Earthquake (DBE) Peak Ground Acceleration (PGA) anchors for site-specific seismic response spectra curves. Response spectra curves for use in design are provided in Appendix A.

  6. Visualization of volumetric seismic data

    Science.gov (United States)

    Spickermann, Dela; Böttinger, Michael; Ashfaq Ahmed, Khawar; Gajewski, Dirk

    2015-04-01

    Mostly driven by demands of high quality subsurface imaging, highly specialized tools and methods have been developed to support the processing, visualization and interpretation of seismic data. 3D seismic data acquisition and 4D time-lapse seismic monitoring are well-established techniques in academia and industry, producing large amounts of data to be processed, visualized and interpreted. In this context, interactive 3D visualization methods proved to be valuable for the analysis of 3D seismic data cubes - especially for sedimentary environments with continuous horizons. In crystalline and hard rock environments, where hydraulic stimulation techniques may be applied to produce geothermal energy, interpretation of the seismic data is a more challenging problem. Instead of continuous reflection horizons, the imaging targets are often steeply dipping faults, causing many diffractions. Without further preprocessing, these geological structures are often hidden behind the noise in the data. In this PICO presentation we will present a workflow consisting of data processing steps, which enhance the signal-to-noise ratio, followed by a visualization step based on the use of the commercially available general-purpose 3D visualization system Avizo. Specifically, we have used Avizo Earth, an extension to Avizo, which supports the import of seismic data in SEG-Y format and offers easy access to state-of-the-art 3D visualization methods at interactive frame rates, even for large seismic data cubes. In seismic interpretation using visualization, interactivity is a key requirement for understanding complex 3D structures. In order to enable an easy communication of the insights gained during the interactive visualization process, animations of the visualized data were created which support the spatial understanding of the data.

  7. Quantum key distribution with finite resources: Smooth Min entropy vs. Smooth Renyi entropy

    Energy Technology Data Exchange (ETDEWEB)

    Mertz, Markus; Abruzzo, Silvestre; Bratzik, Sylvia; Kampermann, Hermann; Bruss, Dagmar [Institut fuer Theoretische Physik III, Duesseldorf (Germany)

    2010-07-01

    We consider different entropy measures that play an important role in the analysis of the security of QKD with finite resources. The smooth min-entropy leads to an optimal bound for the length of a secure key. Another bound on the secure key length was derived using Rényi entropies. Unfortunately, these entropies are very hard or even impossible to calculate for realistic QKD scenarios. To estimate the security rate it therefore becomes important to find computable bounds on these entropies. Here, we compare a lower bound for the smooth min-entropy with a bound using Rényi entropies. We compare these entropies for the six-state protocol with symmetric attacks.
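
    As a numerical illustration of how such entropy measures compare (a generic sketch, not the bounds derived in the work above), the min-entropy and Rényi entropies of a discrete distribution can be computed directly. Rényi entropies are non-increasing in the order α, and the min-entropy is their α → ∞ limit:

```python
import math

def min_entropy(p):
    # H_min(p) = -log2(max_i p_i): the most conservative entropy measure,
    # and the alpha -> infinity limit of the Renyi family.
    return -math.log2(max(p))

def renyi_entropy(p, alpha):
    # Renyi entropy of order alpha (alpha > 0, alpha != 1):
    # H_alpha(p) = log2(sum_i p_i^alpha) / (1 - alpha)
    return math.log2(sum(q ** alpha for q in p)) / (1.0 - alpha)

p = [0.7, 0.1, 0.1, 0.1]
h_min = min_entropy(p)          # ~0.515 bits
h2 = renyi_entropy(p, 2)        # collision entropy, ~0.943 bits
h_half = renyi_entropy(p, 0.5)  # ~1.673 bits

# Monotonicity in alpha: H_min <= H_2 <= H_{1/2} for every distribution
assert h_min <= h2 <= h_half
```

    The ordering explains why a min-entropy bound is the tightest (most pessimistic) key-length estimate among the family.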

  8. Russian regulatory approaches to seismic design and seismic analysis of NPP piping

    International Nuclear Information System (INIS)

    Kaliberda, Y.V.

    2003-01-01

    The paper presents an overview of Russian regulatory approaches to seismic design and seismic analysis of NPP piping. The paper is focused on categorization and seismic analysis of nuclear power plant items (piping, equipment, supports, valves, but not building structures). The paper outlines the current seismic recommendations and corresponding methods, with examples of calculation models. The paper considers calculation results of the mechanisms of dynamic behavior and the problems of developing rational and economical approaches to seismic design and seismic protection. (author)

  9. Recent Vs. Historical Seismicity Analysis For Banat Seismic Region (Western Part Of Romania)

    OpenAIRE

    Oros Eugen; Diaconescu Mihai

    2015-01-01

    The present-day seismic activity of a region reflects the active tectonics and can confirm the seismic potential of the seismogenic sources as they are modelled using the historical seismicity. This paper makes a comparative analysis of the last decade's seismicity recorded in the Banat Seismic Region (western part of Romania) and the historical seismicity of the region (Mw≥4.0). Four significant earthquake sequences have been recently localized in the region, three of them nearby the city of...

  10. Cooperative New Madrid seismic network

    International Nuclear Information System (INIS)

    Herrmann, R.B.; Johnston, A.C.

    1990-01-01

    The development and installation of components of a U.S. National Seismic Network (USNSN) in the eastern United States provides the basis for long-term monitoring of eastern earthquakes. The broad geographical extent of this network provides a uniform monitoring threshold for identifying and locating earthquakes, and it will provide excellent data for defining some seismic source parameters of larger earthquakes, such as depth and focal mechanism, through waveform modeling techniques. By itself, however, it will not be able to define the scaling of high-frequency ground motions, since it does not focus on any of the major seismic zones in the eastern U.S. Realizing this need and making use of a one-time availability of funds for studying New Madrid earthquakes, Saint Louis University and Memphis State University successfully competed for funding in a special USGS RFP for New Madrid studies. The purpose of the proposal is to upgrade the present seismic networks run by these institutions in order to focus on defining the seismotectonics and ground motion scaling in the New Madrid Seismic Zone. The proposed network is designed both to complement the U.S. National Seismic Network and to make use of the capabilities of the communication links of that network.

  11. A Bayesian approach to infer the radial distribution of temperature and anisotropy in the transition zone from seismic data

    Science.gov (United States)

    Drilleau, M.; Beucler, E.; Mocquet, A.; Verhoeven, O.; Moebs, G.; Burgos, G.; Montagner, J.

    2013-12-01

    Mineralogical transformations and matter transfers within the Earth's mantle make the 350-1000 km depth range (considered here as the mantle transition zone) highly heterogeneous and anisotropic. Most of the 3-D global tomographic models are anchored on small perturbations from 1-D models such as PREM, and are then interpreted in terms of temperature and composition distributions. However, the degree of heterogeneity in the transition zone can be strong enough that the concept of a 1-D reference seismic model may be called into question. To avoid the use of any seismic reference model, we developed a Markov chain Monte Carlo algorithm to directly interpret surface wave dispersion curves in terms of temperature and radial anisotropy distributions, considering a given composition of the mantle. These interpretations are based on laboratory measurements of elastic moduli and the Birch-Murnaghan equation of state. An originality of the algorithm is its ability to explore both smoothly varying models and first-order discontinuities, using C1 Bézier curves, which interpolate the randomly chosen values for depth, temperature and radial anisotropy. This parameterization is able to generate a self-adapting parameter space exploration while reducing the computing time. Using a Bayesian exploration, the probability distributions on temperature and anisotropy are governed by uncertainties on the data set. The method was successfully applied to both synthetic data and real dispersion curves. Surface wave measurements along the Vanuatu-California path suggest a strong anisotropy above 400 km depth which decreases below, and a monotonous temperature distribution between 350 and 1000 km depth. On the contrary, a negative shear wave anisotropy of about 2% is found at the top of the transition zone below Eurasia. Considering compositions ranging from piclogite to pyrolite, the overall temperature profile and temperature gradient are higher for the continental path than for the oceanic
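
    The core of any Markov chain Monte Carlo exploration of this kind is a Metropolis-style accept/reject walk through parameter space. A minimal random-walk Metropolis sampler on a toy one-parameter posterior is sketched below; it is purely illustrative, the study's actual sampler explores C1 Bézier parameterizations of depth profiles:

```python
import math
import random

def metropolis(log_post, x0, n_steps, step=0.5, seed=1):
    # Random-walk Metropolis: propose x' = x + N(0, step^2), accept with
    # probability min(1, exp(log_post(x') - log_post(x))).
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        xp = x + rng.gauss(0.0, step)
        lpp = log_post(xp)
        if math.log(rng.random()) < lpp - lp:
            x, lp = xp, lpp
        samples.append(x)
    return samples

# Toy posterior: a standard-normal "temperature anomaly" parameter
samples = metropolis(lambda t: -0.5 * t * t, x0=3.0, n_steps=20000)
post = samples[5000:]                 # discard burn-in
mean = sum(post) / len(post)          # should be near 0
```

    The Bayesian output is the whole histogram of `post`, not a single best-fit value, which is how the method propagates data uncertainty into the temperature and anisotropy distributions.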

  12. Isotropic Growth of Graphene toward Smoothing Stitching.

    Science.gov (United States)

    Zeng, Mengqi; Tan, Lifang; Wang, Lingxiang; Mendes, Rafael G; Qin, Zhihui; Huang, Yaxin; Zhang, Tao; Fang, Liwen; Zhang, Yanfeng; Yue, Shuanglin; Rümmeli, Mark H; Peng, Lianmao; Liu, Zhongfan; Chen, Shengli; Fu, Lei

    2016-07-26

    The quality of graphene grown via chemical vapor deposition still shows a great disparity with its theoretical properties due to the inevitable formation of grain boundaries. Designing a single-crystal substrate with an anisotropic twofold symmetry for the unidirectional alignment of graphene seeds would be a promising way to eliminate grain boundaries at the wafer scale. However, such a delicate process is easily disrupted by the obstruction of defects or impurities. Here we investigated the isotropic growth behavior of graphene single crystals by melting the growth substrate to obtain an amorphous isotropic surface, which does not offer any specific grain-orientation induction or preponderant growth rate toward a certain direction during graphene growth. The as-obtained graphene grains are isotropically round with mixed edges that exhibit high activity. The orientation of adjacent grains can easily self-adjust to smoothly match each other over a liquid catalyst with facile atom delocalization, owing to the low rotational steric hindrance of the isotropic grains, thus achieving the smooth stitching of adjacent graphene grains. The adverse effects of grain boundaries are therefore eliminated and the excellent transport performance of graphene is better guaranteed. Moreover, such an isotropic growth mode can be extended to other types of layered nanomaterials, such as hexagonal boron nitride and transition metal chalcogenides, for obtaining large-size intrinsic films with low defect density.

  13. Smooth Tubercle Bacilli: Neglected Opportunistic Tropical Pathogens

    Directory of Open Access Journals (Sweden)

    Djaltou eAboubaker

    2016-01-01

    Smooth tubercle bacilli (STB), including "Mycobacterium canettii", are members of the Mycobacterium tuberculosis complex (MTBC) which cause non-contagious tuberculosis in humans. This group comprises fewer than one hundred isolates characterized by smooth colonies and cordless organisms. Most STB isolates have been obtained from patients exposed to the Republic of Djibouti, but seven isolates, including the three seminal ones obtained by Georges Canetti between 1968 and 1970, were recovered from patients in France, Madagascar, sub-Saharan East Africa and French Polynesia. STB form a genetically heterogeneous group of MTBC organisms with large 4.48 ± 0.05 Mb genomes which may link Mycobacterium kansasii to MTBC organisms. Lack of inter-human transmission suggests a yet unknown environmental reservoir. Clinical data indicate a respiratory tract route of contamination, with the digestive tract as an alternative route. Further epidemiological and clinical studies are warranted to elucidate areas of uncertainty regarding these unusual mycobacteria and the tuberculosis they cause.

  14. Snap evaporation of droplets on smooth topographies.

    Science.gov (United States)

    Wells, Gary G; Ruiz-Gutiérrez, Élfego; Le Lirzin, Youen; Nourry, Anthony; Orme, Bethany V; Pradas, Marc; Ledesma-Aguilar, Rodrigo

    2018-04-11

    Droplet evaporation on solid surfaces is important in many applications including printing, micro-patterning and cooling. While seemingly simple, the configuration of evaporating droplets on solids is difficult to predict and control. This is because evaporation typically proceeds as a "stick-slip" sequence-a combination of pinning and de-pinning events dominated by static friction or "pinning", caused by microscopic surface roughness. Here we show how smooth, pinning-free, solid surfaces of non-planar topography promote a different process called snap evaporation. During snap evaporation a droplet follows a reproducible sequence of configurations, consisting of a quasi-static phase-change controlled by mass diffusion interrupted by out-of-equilibrium snaps. Snaps are triggered by bifurcations of the equilibrium droplet shape mediated by the underlying non-planar solid. Because the evolution of droplets during snap evaporation is controlled by a smooth topography, and not by surface roughness, our ideas can inspire programmable surfaces that manage liquids in heat- and mass-transfer applications.

  15. A high-resolution ambient seismic noise model for Europe

    Science.gov (United States)

    Kraft, Toni

    2014-05-01

    measurement precision (i.e. earthquake location), while considering this extremely complex boundary condition. To solve this problem I have developed a high-resolution ambient seismic noise model for Europe. The model is based on land-use data derived from satellite imagery by the EU project CORINE at a resolution of 100x100 m. The CORINE data consist of several land-use classes, which, among others, contain: industrial areas, mines, urban fabric, agricultural areas, permanent crops, forests and open spaces. Additionally, open GIS data for highways, major and minor roads, and railway lines were included from the OpenStreetMap project (www.openstreetmap.org). This data was divided into three classes that represent good, intermediate and bad ambient conditions of the corresponding land-use class, based on expert judgment. To account for noise propagation away from its source, a smoothing operator was applied to the individual land-use noise fields. Finally, the noise fields were stacked to obtain a European map of ambient noise conditions. A calibration of this map with data from existing seismic stations in Europe allowed me to estimate the expected noise level in actual ground-motion units for the three ambient noise condition classes of the map. The result is a high-resolution ambient seismic noise map that allows the network designer to make educated predictions of the expected noise level for arbitrary locations in Europe. The ambient noise model was successfully tested in several network optimization projects in Switzerland and surrounding countries and will hopefully be a valuable contribution to improving the data quality of microseismic monitoring networks in Europe.
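
    The smooth-and-stack step described above can be sketched in a few lines. The class labels, noise levels, and the simple neighbour-averaging propagation operator below are all hypothetical stand-ins for the CORINE-based fields and the actual smoothing used in the model:

```python
import numpy as np

# Hypothetical noise levels (dB) for three land-use classes:
# 0 = open space (good), 1 = agriculture (intermediate), 2 = urban (bad)
NOISE_DB = {0: -140.0, 1: -130.0, 2: -110.0}
FLOOR = min(NOISE_DB.values())

def smooth(field, passes=2):
    # Crude propagation operator: repeated 4-neighbour averaging spreads
    # high noise away from its source cells.
    f = field.astype(float)
    for _ in range(passes):
        p = np.pad(f, 1, mode="edge")
        f = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
             + p[1:-1, 1:-1]) / 5.0
    return f

def noise_map(landuse, passes=2):
    # One noise field per land-use class, smoothed, then stacked by taking
    # the loudest (maximum) contribution in each cell.
    fields = [smooth(np.where(landuse == cls, level, FLOOR), passes)
              for cls, level in NOISE_DB.items()]
    return np.max(fields, axis=0)

landuse = np.zeros((8, 8), dtype=int)
landuse[3:5, 3:5] = 2          # an urban patch surrounded by open space
m = noise_map(landuse)         # cells near the patch are noisier
```

    Calibration against real stations would then map each dB level to measured ground-motion units.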

  16. Multicomponent seismic applications in coalbed methane development

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, D.; Trend, S. [Calgary Univ., AB (Canada). Dept. of Geology and Geophysics

    2004-07-01

    Seismic applications for coalbed methane (CBM) development are used to address the following challenges: lateral continuity of coal zones; vertical continuity of coal seams; permeability of cleats and fractures; coal quality and gas content; wet versus dry coal zones; and, monitoring storage of greenhouse gases. This paper presented a brief description of existing seismic programs, including 2-D and 3-D surface seismic surveys; multicomponent seismic surveys; vertical seismic profiles; cross-well seismic surveys; and, time-lapse seismic surveys. A comparative evaluation of their use in the Horseshoe Canyon Formation and the Ardley Formation was presented. The study showed that variations in reservoir properties resulting from gas production and dewatering can be effectively imaged using seismic surveys. Seismic surveys are useful in reservoir management, monitoring sweep efficiency during enhanced natural gas from coal (NGC) production, monitoring disposal of produced water and verifying storage of carbon dioxide for carbon credits. tabs., figs.

  17. Romanian Educational Seismic Network Project

    Science.gov (United States)

    Tataru, Dragos; Ionescu, Constantin; Zaharia, Bogdan; Grecu, Bogdan; Tibu, Speranta; Popa, Mihaela; Borleanu, Felix; Toma, Dragos; Brisan, Nicoleta; Georgescu, Emil-Sever; Dobre, Daniela; Dragomir, Claudiu-Sorin

    2013-04-01

    Romania is one of the most seismically active countries in Europe, with more than 500 earthquakes occurring every year. The seismic hazard of Romania is relatively high, and understanding earthquake phenomena and their effects at the Earth's surface thus represents an important step toward the education of the population in earthquake-affected regions of the country; it also aims to raise awareness of earthquake risk and possible mitigation actions. In this direction, the first national educational project in the field of seismology has recently started in Romania: the ROmanian EDUcational SEISmic NETwork (ROEDUSEIS-NET) project. It involves four partners: the National Institute for Earth Physics as coordinator, the National Institute for Research and Development in Construction, Urban Planning and Sustainable Spatial Development "URBAN - INCERC" Bucharest, the Babeş-Bolyai University (Faculty of Environmental Sciences and Engineering) and the software firm "BETA Software". The project has many educational, scientific and social goals. The main educational objectives are: training students and teachers in the analysis and interpretation of seismological data, preparing several comprehensive educational materials, and designing and testing didactic activities using informatics and web-oriented tools. The scientific objective is to introduce into schools the use of advanced instruments and experimental methods that are usually restricted to research laboratories, with the main product being the creation of an earthquake waveform archive. A large amount of such data will thus be used by students and teachers for educational purposes. As for the social objectives, the project represents an effective instrument for informing and creating an awareness of the seismic risk, for experimenting with the efficacy of scientific communication, and for increasing the direct involvement of schools and the general public.
A network of nine seismic stations with SEP seismometers

  18. Smoothness without smoothing: why Gaussian naive Bayes is not naive for multi-subject searchlight studies.

    Directory of Open Access Journals (Sweden)

    Rajeev D S Raizada

    Spatial smoothness is helpful when averaging fMRI signals across multiple subjects, as it allows different subjects' corresponding brain areas to be pooled together even if they are slightly misaligned. However, smoothing is usually not applied when performing multivoxel pattern-based analyses (MVPA), as it runs the risk of blurring away the information that fine-grained spatial patterns contain. It would therefore be desirable, if possible, to carry out pattern-based analyses which take unsmoothed data as their input but which produce smooth images as output. We show here that the Gaussian Naive Bayes (GNB) classifier does precisely this when it is used in "searchlight" pattern-based analyses. We explain why this occurs, and illustrate the effect in real fMRI data. Moreover, we show that analyses using GNBs produce results at the multi-subject level which are statistically robust, neurally plausible, and which replicate across two independent data sets. By contrast, SVM classifiers applied to the same data do not generate a replication, even if the SVM-derived searchlight maps have smoothing applied to them. An additional advantage of GNB classifiers for searchlight analyses is that they are orders of magnitude faster to compute than more complex alternatives such as SVMs. Collectively, these results suggest that Gaussian Naive Bayes classifiers may be a highly non-naive choice for multi-subject pattern-based fMRI studies.

  19. Seismic risk analysis for the Westinghouse Electric facility, Cheswick, Pennsylvania

    International Nuclear Information System (INIS)

    1977-01-01

    This report presents the results of a detailed seismic risk analysis of the Westinghouse Electric plutonium fuel development facility at Cheswick, Pennsylvania. This report focuses on earthquakes. The historical seismic record was established after a review of available literature, consultation with operators of local seismic arrays, and examination of appropriate seismic data bases. Because of the aseismicity of the region around the site, an analysis different from the conventional closest approach in a tectonic province was adopted. Earthquakes as far from the site as 1,000 km were included, as was the possibility of earthquakes at the site. In addition, various uncertainties in the input were explicitly considered in the analysis. For example, allowance was made both for the uncertainty in predicting maximum possible earthquakes in the region and for the effect of the dispersion of data about the best-fit attenuation relation. The attenuation relationship is derived from two of the most recent, advanced studies relating earthquake intensity reports and acceleration. Results of the risk analysis, which include a Bayesian estimate of the uncertainties, are presented as return-period accelerations. The best estimate curve indicates that the Westinghouse facility will experience 0.05 g every 220 years and 0.10 g every 1400 years. The accelerations are very insensitive to the details of the source region geometries or the historical earthquake statistics in each region, and each of the source regions contributes almost equally to the cumulative risk at the site.
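
    Return periods like those quoted above translate into exceedance probabilities under the usual Poisson assumption, P(at least one exceedance in t years) = 1 - exp(-t/T). A quick check against the report's figures for a hypothetical 50-year facility life (the exposure time is an assumption for illustration):

```python
import math

def exceedance_prob(return_period, exposure_years):
    # Poisson model: P(>= 1 exceedance in t years) = 1 - exp(-t / T)
    return 1.0 - math.exp(-exposure_years / return_period)

# Report figures: 0.05 g every 220 years, 0.10 g every 1400 years
p_005 = exceedance_prob(220.0, 50.0)    # ~0.20 over 50 years
p_010 = exceedance_prob(1400.0, 50.0)   # ~0.035 over 50 years
```

    For rare events (t much smaller than T) the probability is close to the simple ratio t/T, which is why return periods are often read directly as annual rates.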

  20. Seismic reevaluation of nuclear facilities worldwide: Overview and status

    International Nuclear Information System (INIS)

    Campbell, R.D.; Hardy, G.S.; Ravindra, M.K.; Johnson, J.J.; Hoy, A.J.

    1995-01-01

    Existing nuclear facilities throughout the world are being subjected to severe scrutiny of their safety in the event of an earthquake. In the United States, there have been several licensing and safety review issues for which industry and regulatory agencies have cooperated to develop rational and economically feasible criteria for resolving the issues. Currently, all operating nuclear power plants in the United States are conducting an Individual Plant Examination of External Events, including earthquakes beyond the design basis. About two-thirds of the operating plants are conducting parallel programs for verifying the seismic adequacy of equipment for the design basis earthquake. The U.S. Department of Energy is also beginning to perform detailed evaluations of their facilities, many of which had little or no seismic design. Western European countries also have been reevaluating their older nuclear power plants for seismic events, often adapting the criteria developed in the United States. With the change in the political systems in Eastern Europe, there is a strong emphasis from their Western European neighbors to evaluate and upgrade the safety of their operating nuclear power plants. Finally, nuclear facilities in Asia are also being evaluated for seismic vulnerabilities. This paper focuses on the methodologies that have been developed for reevaluation of existing nuclear power plants and presents examples of the application of these methodologies to nuclear facilities worldwide. (author)

  1. Seismic reevaluation of nuclear facilities worldwide: Overview and status

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, R D; Hardy, G S; Ravindra, M K [EQE International, Irvine, CA (United States); Johnson, J J [EQE International, San Francisco, CA (United States); Hoy, A J [EQE International Ltd., Birchwood, Warrington (United Kingdom)

    1995-07-01

    Existing nuclear facilities throughout the world are being subjected to severe scrutiny of their safety in the event of an earthquake. In the United States, there have been several licensing and safety review issues for which industry and regulatory agencies have cooperated to develop rational and economically feasible criteria for resolving the issues. Currently, all operating nuclear power plants in the United States are conducting an Individual Plant Examination of External Events, including earthquakes beyond the design basis. About two-thirds of the operating plants are conducting parallel programs for verifying the seismic adequacy of equipment for the design basis earthquake. The U.S. Department of Energy is also beginning to perform detailed evaluations of their facilities, many of which had little or no seismic design. Western European countries also have been reevaluating their older nuclear power plants for seismic events, often adapting the criteria developed in the United States. With the change in the political systems in Eastern Europe, there is a strong emphasis from their Western European neighbors to evaluate and upgrade the safety of their operating nuclear power plants. Finally, nuclear facilities in Asia are also being evaluated for seismic vulnerabilities. This paper focuses on the methodologies that have been developed for reevaluation of existing nuclear power plants and presents examples of the application of these methodologies to nuclear facilities worldwide. (author)

  2. Seismic failure modes and seismic safety of Hardfill dam

    Directory of Open Access Journals (Sweden)

    Kun Xiong

    2013-04-01

    Based on microscopic damage theory and the finite element method, and using the Weibull distribution to characterize the random distribution of the mechanical properties of materials, the seismic response of a typical Hardfill dam was analyzed through numerical simulation for earthquakes with intensities of 8 degrees and greater. The seismic failure modes and failure mechanism of the dam were explored as well. Numerical results show that the Hardfill dam remains at a low stress level and is undamaged or only slightly damaged during an earthquake with an intensity of 8 degrees. During overload earthquakes, tensile cracks occur at the dam surfaces and extend into the dam body, and the upstream dam body experiences more serious damage than the downstream dam body. Under seismic conditions, the failure pattern of the Hardfill dam is therefore tensile fracture of the upstream regions and the dam toe. Compared with traditional gravity dams, Hardfill dams have better seismic performance and greater seismic safety.
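
    The Weibull characterization of random material properties mentioned above is straightforward to reproduce. A sketch with hypothetical shape and scale parameters (the abstract does not state the values used in the study):

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(42)

# Weibull-distributed tensile strengths for finite elements: the shape m
# controls heterogeneity (larger m = more homogeneous material), the scale
# s sets the overall strength level. Both values below are hypothetical.
m, s = 5.0, 2.5e6  # shape (dimensionless), scale (Pa)
strengths = s * rng.weibull(m, size=10000)

# The mean of Weibull(m, s) is s * Gamma(1 + 1/m); the sample mean of the
# per-element strengths should match it closely.
expected_mean = s * gamma(1.0 + 1.0 / m)
```

    In a simulation like the one described, each finite element would be assigned one such draw, so weaker elements damage first and cracks localize realistically.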

  3. Seismic analysis for the ALMR

    International Nuclear Information System (INIS)

    Tajirian, F.F.

    1992-01-01

    The Advanced Liquid Metal Reactor (ALMR) design uses seismic isolation as a cost-effective approach for simplifying seismic design of the reactor module, and for enhancing margins to handle beyond design basis earthquakes (BDBE). A comprehensive seismic analysis plan has been developed to confirm the adequacy of the design and to support regulatory licensing activities. In this plan, state-of-the-art computer programs are used to evaluate the system response of the ALMR. Several factors that affect seismic response will be investigated. These include variability in the input earthquake mechanism, soil-structure interaction effects, and nonlinear response of the isolators. This paper reviews the types of analyses that are planned, and discusses the approach that will be used for validating the specific features of computer programs that are required in the analysis of isolated structures. To date, different linear and nonlinear seismic analyses have been completed. The results of recently completed linear analyses have been summarized elsewhere. The findings of three-dimensional seismic nonlinear analyses are presented in this paper. These analyses were performed to evaluate the effect of changes of isolator horizontal stiffness with horizontal displacement on overall response, to develop an approach for representing BDBE events with return periods exceeding 10,000 years, and to assess margins in the design for BDBEs. From the results of these analyses and bearing test data, it can be concluded that a properly designed and constructed seismic isolation system can accommodate displacements several times the design safe shutdown earthquake (SSE) for the ALMR. (author)

  4. Uniform flow in smooth circular channels. Part I: adaptation and validation of the Kazemipour method

    Directory of Open Access Journals (Sweden)

    Maurício C. Goldfarb

    2004-12-01

    Considering the von Karman-Prandtl equation for pressurized tubes, Kazemipour & Apelt (1980) developed a methodology for flow calculation in smooth circular channels, known as the Kazemipour method. In spite of its good results, the Kazemipour method requires graphic tools in its application, which makes solution through computational methods, and comparison with other existing methodologies, difficult. In this work, the results of the analytic investigation that validates the Kazemipour method are shown, as well as the adjustment, according to the procedure proposed by Silva & Figueiredo (1993), performed in such a way as to make the procedure fully expressible in equations without the need for graphic tools. The result obtained is satisfactory and its use is demonstrated in an example of practical application.
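
    The starting point of the method, the von Kármán-Prandtl resistance law for smooth pressurized pipes, 1/√f = 2 log10(Re √f) - 0.8, is implicit in the friction factor f but solves readily by fixed-point iteration. This sketch covers the pipe law only, not Kazemipour's adaptation to open channels:

```python
import math

def smooth_pipe_friction(re, tol=1e-12):
    # von Karman-Prandtl smooth-pipe law: 1/sqrt(f) = 2*log10(Re*sqrt(f)) - 0.8.
    # Iterate on x = 1/sqrt(f), so Re*sqrt(f) = Re/x; the map is a contraction
    # for turbulent Re, so the iteration converges from a rough initial guess.
    x = 5.0  # initial guess, corresponding to f ~ 0.04
    while True:
        x_new = 2.0 * math.log10(re / x) - 0.8
        if abs(x_new - x) < tol:
            return 1.0 / (x_new * x_new)
        x = x_new

f = smooth_pipe_friction(1e5)  # ~0.018 at Re = 10^5
```

    A channel-flow version would then apply Kazemipour's shape-correction factors on top of this smooth-pipe friction factor.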

  5. Smooth function approximation using neural networks.

    Science.gov (United States)

    Ferrari, Silvia; Stengel, Robert F

    2005-01-01

    An algebraic approach for representing multidimensional nonlinear functions by feedforward neural networks is presented. In this paper, the approach is implemented for the approximation of smooth batch data containing the function's input, output, and, possibly, gradient information. The training set is associated with the network's adjustable parameters by nonlinear weight equations. The cascade structure of these equations reveals that they can be treated as sets of linear systems. Hence, the training process and the network approximation properties can be investigated via linear algebra. Four algorithms are developed to achieve exact or approximate matching of input-output and/or gradient-based training sets. Their application to the design of forward and feedback neurocontrollers shows that algebraic training is characterized by faster execution speeds and better generalization properties than contemporary optimization techniques.
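The central observation, that the weight equations become linear once the nonlinear hidden-layer parameters are fixed, can be sketched as follows. This is a simplified, hypothetical illustration (random fixed hidden weights, least-squares output weights), not the authors' four algorithms:

```python
import numpy as np

# Training data: samples of a smooth scalar function (hypothetical).
rng = np.random.default_rng(1)
x = np.linspace(-2.0, 2.0, 40).reshape(-1, 1)
y = np.tanh(2.0 * x) + 0.5 * x

# Fix the hidden-layer weights and biases; the output weights then satisfy
# a *linear* system S @ v = y that plain linear algebra can solve.
n_hidden = 20
W = rng.normal(size=(1, n_hidden))   # input-to-hidden weights (random, fixed)
b = rng.normal(size=n_hidden)        # hidden biases (random, fixed)
S = np.tanh(x @ W + b)               # hidden-layer output matrix, 40 x 20
v, *_ = np.linalg.lstsq(S, y, rcond=None)

y_hat = S @ v                        # network output on the training set
max_err = np.max(np.abs(y_hat - y))
```

Because the solve is a single least-squares step rather than an iterative optimization, the speed advantage the abstract mentions follows directly.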

  6. Smooth driving of Moessbauer electromechanical transducers

    Energy Technology Data Exchange (ETDEWEB)

    Veiga, A., E-mail: veiga@fisica.unlp.edu.ar; Mayosky, M. A. [Universidad Nacional de La Plata, Facultad de Ingenieria (Argentina); Martinez, N.; Mendoza Zelis, P.; Pasquevich, G. A.; Sanchez, F. H. [Instituto de Fisica La Plata, CONICET (Argentina)

    2011-11-15

    The quality of Moessbauer spectra is strongly related to the performance of the source velocity modulator. Traditional electromechanical driving techniques demand hard-edged square or triangular velocity waveforms that introduce long settling times and demand careful driver tuning. For this work, the behavior of commercial velocity transducers and drive units was studied under different working conditions. Different velocity reference waveforms in constant-acceleration, constant-velocity and programmable-velocity techniques were tested. Significant improvement in spectrometer efficiency and accuracy was achieved by replacing triangular and square hard edges with continuous smooth-shaped transitions. A criterion for best waveform selection and synchronization is presented, and attainable enhancements are evaluated. In order to fully exploit this driving technique, a compact microprocessor-based architecture is proposed and a suitable data acquisition system implementation is presented. System linearity and efficiency characterization are also shown.

  7. Smooth muscle cell phenotypic switching in stroke.

    Science.gov (United States)

    Poittevin, Marine; Lozeron, Pierre; Hilal, Rose; Levy, Bernard I; Merkulova-Rainon, Tatiana; Kubis, Nathalie

    2014-06-01

    Disruption of cerebral blood flow after stroke induces cerebral tissue injury through multiple mechanisms that are not yet fully understood. Smooth muscle cells (SMCs) in blood vessel walls play a key role in cerebral blood flow control. Cerebral ischemia triggers these cells to switch to a phenotype that will be either detrimental or beneficial to brain repair. Moreover, SMC can be primarily affected genetically or by toxic metabolic molecules. After stroke, this pathological phenotype has an impact on the incidence, pattern, severity, and outcome of the cerebral ischemic disease. Although little research has been conducted on the pathological role and molecular mechanisms of SMC in cerebrovascular ischemic diseases, some therapeutic targets have already been identified and could be considered for further pharmacological development. We examine these different aspects in this review.

  8. Smoothed Particle Hydrodynamics Coupled with Radiation Transfer

    Science.gov (United States)

    Susa, Hajime

    2006-04-01

    We have constructed a brand-new radiation hydrodynamics solver based upon Smoothed Particle Hydrodynamics, which works on a parallel computer system. The code is designed to investigate the formation and evolution of first-generation objects at z ≳ 10, where radiative feedback from various sources plays important roles. The code can compute the fractions of the chemical species e, H+, H, H-, H2, and H2+ by fully implicit time integration. It can also deal with multiple sources of ionizing radiation, as well as radiation in the Lyman-Werner band. We compare the results of a few test calculations with the results of one-dimensional simulations and find good agreement between them. We also evaluate the speedup from parallelization, which is found to be almost ideal as long as the number of sources is comparable to the number of processors.

  9. Construction of a smoothed DEA frontier

    Directory of Open Access Journals (Sweden)

    Mello João Carlos Correia Baptista Soares de

    2002-01-01

    Full Text Available It is known that the DEA multipliers model does not allow a unique solution for the weights. This is due to the absence of unique derivatives at the extreme-efficient points, which is a consequence of the piecewise linear nature of the frontier. In this paper we propose a method to solve this problem, consisting of replacing the original DEA frontier with a new one that is smooth (with continuous derivatives at every point) and closest to the original frontier. We present the theoretical development for the general case, exemplified with the particular case of the BCC model with one input and one output. The 3-dimensional problem is briefly discussed. Some uses of the model are summarised, and one of them, a new Cross-Evaluation model, is presented.

  10. Fluid injection and induced seismicity

    Science.gov (United States)

    Kendall, Michael; Verdon, James

    2016-04-01

    The link between fluid injection, or extraction, and induced seismicity has been observed in reservoirs for many decades. In fact, spatial mapping of low-magnitude events is routinely used to estimate a stimulated reservoir volume. However, the link between subsurface fluid injection and larger felt seismicity is less clear, and has attracted recent interest with a dramatic increase in earthquakes associated with the disposal of oilfield waste fluids. In a few cases, hydraulic fracturing has also been linked to induced seismicity. Much can be learned from past case studies of induced seismicity so that we can better understand the risks posed. Here we examine 12 case examples and consider in particular the controls on maximum event size, lateral event distributions, and event depths. Our results suggest that injection volume is a better control on maximum magnitude than past natural seismicity in a region. This might, however, simply reflect the lack of baseline monitoring and/or long-term seismic records in certain regions. To address this in the UK, the British Geological Survey is leading the deployment of monitoring arrays in prospective shale gas areas in Lancashire and Yorkshire. In most cases, seismicity is located in close vicinity to the injection site. However, in some cases the nearest events are up to 5 km from the injection point. This gives an indication of the minimum radius of influence of such fluid injection projects. The most distant events are never more than 20 km from the injection point, perhaps implying a maximum radius of influence. Some events are located in the target reservoir, but most occur below the injection depth. In fact, most events lie in the crystalline basement underlying the sedimentary rocks. This suggests that induced seismicity may not pose a leakage risk for fluid migration back to the surface, as it does not impact caprock integrity.
A useful application for microseismic data is to try and forecast induced seismicity

  11. Diffusion tensor smoothing through weighted Karcher means

    Science.gov (United States)

    Carmichael, Owen; Chen, Jun; Paul, Debashis; Peng, Jie

    2014-01-01

    Diffusion tensor magnetic resonance imaging (DTI) quantifies the spatial distribution of water diffusion at each voxel on a regular grid of locations in a biological specimen by diffusion tensors: 3 × 3 positive definite matrices. Removal of noise from DTI is an important problem due to the high scientific relevance of DTI and the relatively low signal-to-noise ratio it provides. Leading approaches to this problem amount to estimation of weighted Karcher means of diffusion tensors within spatial neighborhoods, under various metrics imposed on the space of tensors. However, it is unclear how the behavior of these estimators varies with the magnitude of DTI sensor noise (the noise resulting from the thermal effects of MRI scanning) as well as the geometric structure of the underlying diffusion tensor neighborhoods. In this paper, we combine theoretical analysis, empirical analysis of simulated DTI data, and empirical analysis of real DTI scans to compare the noise removal performance of three kernel-based DTI smoothers that are based on Euclidean, log-Euclidean, and affine-invariant metrics. The results suggest, contrary to conventional wisdom, that imposing a simplistic Euclidean metric may in fact provide comparable or superior noise removal, especially in relatively unstructured regions and/or in the presence of moderate to high levels of sensor noise. On the contrary, log-Euclidean and affine-invariant metrics may lead to better noise removal in highly structured anatomical regions, especially when the sensor noise is of low magnitude. These findings emphasize the importance of considering the interplay of sensor noise magnitude and tensor field geometric structure when assessing diffusion tensor smoothing options. They also point to the necessity for continued development of smoothing methods that perform well across a large range of scenarios. PMID:25419264
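As a rough sketch of the estimators being compared, a weighted Karcher mean under the log-Euclidean metric reduces to the matrix exponential of the weighted average of matrix logarithms, while the Euclidean version is a plain weighted average. The neighborhood, kernel weights, and noise model below are invented for illustration:

```python
import numpy as np
from scipy.linalg import expm, logm

def log_euclidean_mean(tensors, weights):
    """Weighted Karcher (Frechet) mean of SPD matrices under the
    log-Euclidean metric: exp(sum_i w_i * log(T_i))."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                                # normalize kernel weights
    return expm(sum(wi * logm(T) for wi, T in zip(w, tensors)))

def euclidean_mean(tensors, weights):
    """Weighted mean under the flat Euclidean metric (simple average)."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * T for wi, T in zip(w, tensors))

# Smooth a voxel using its neighborhood with distance-decaying weights.
rng = np.random.default_rng(0)
base = np.diag([3.0, 1.0, 1.0])                    # anisotropic "true" tensor
neighborhood = [base + 0.05 * (E + E.T)            # small symmetric noise
                for E in rng.normal(size=(5, 3, 3))]
weights = [1.0, 0.8, 0.8, 0.5, 0.5]                # closer voxels weigh more
smoothed = log_euclidean_mean(neighborhood, weights)
averaged = euclidean_mean(neighborhood, weights)   # Euclidean smoother for comparison
```

Both estimators return a symmetric positive definite matrix here; the paper's comparison concerns how their accuracy differs as noise level and neighborhood structure vary.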

  12. Generalized seismic analysis

    Science.gov (United States)

    Butler, Thomas G.

    1993-09-01

    There is a constant need to be able to solve for enforced motion of structures. Spacecraft need to be qualified for acceleration inputs. Truck cargoes need to be safeguarded from road mishaps. Office buildings need to withstand earthquake shocks. Marine machinery needs to be able to withstand hull shocks. All of these kinds of enforced motions are grouped together here under the heading of seismic inputs. Attempts have been made to cope with this problem over the years, and they usually have ended up with some limiting or compromise conditions. The crudest approach was to limit the problem to acceleration occurring only at the base of a structure, constrained to be rigid. The analyst would assign arbitrarily outsized masses to base points. He would then calculate the magnitude of force to apply to the base mass (or masses) in order to produce the specified acceleration. He would of necessity have to sacrifice the determination of stresses in the vicinity of the base, because of the artificial nature of the input forces. The author followed the lead of John M. Biggs by using relative coordinates for a rigid base in a 1975 paper, and again in a 1981 paper. This method of relative coordinates was extended and made operational as DMAP ALTER packets to rigid formats 9, 10, 11, and 12 under contract N60921-82-C-0128. This method was presented at the twelfth NASTRAN Colloquium. Another analyst in the field developed a method that computed the forces from enforced motion, then applied them as a forcing to the remaining unknowns after the knowns were partitioned off. The method was translated into DMAP ALTERs but was never made operational. All of this activity jelled into the current effort. Much thought was invested in working out ways to unshackle the analysis of enforced motions from the limitations that persisted.

  13. Seismic behaviour of geotechnical structures

    Directory of Open Access Journals (Sweden)

    F. Vinale

    2002-06-01

    Full Text Available This paper deals with some fundamental considerations regarding the behaviour of geotechnical structures under seismic loading. First a complete definition of the earthquake disaster risk is provided, followed by the importance of performing site-specific hazard analysis. Then some suggestions are provided in regard to adequate assessment of soil parameters, a crucial point to properly analyze the seismic behaviour of geotechnical structures. The core of the paper is centered on a critical review of the analysis methods available for studying geotechnical structures under seismic loadings. All of the available methods can be classified into three main classes, including the pseudo-static, pseudo-dynamic and dynamic approaches, each of which is reviewed for applicability. A more advanced analysis procedure, suitable for a so-called performance-based design approach, is also described in the paper. Finally, the seismic behaviour of the El Infiernillo Dam was investigated. It was shown that coupled elastoplastic dynamic analyses disclose some of the important features of dam behaviour under seismic loading, confirmed by comparing analytical computation and experimental measurements on the dam body during and after a past earthquake.

  14. Study on structural seismic margin and probabilistic seismic risk. Development of a structural capacity-seismic risk diagram

    International Nuclear Information System (INIS)

    Nakajima, Masato; Ohtori, Yasuki; Hirata, Kazuta

    2010-01-01

    The seismic margin is an extremely important index when the seismic safety of critical structures, systems and components is evaluated and reported quantitatively. Electric power companies are therefore required to evaluate the seismic margin of each plant in the back-checks of nuclear power plants in Japan. The seismic margin of a structure is usually defined as the structural capacity margin corresponding to the design earthquake ground motion. However, there is little agreement as to the definition of the seismic margin, and little is known about the relationship between the seismic margin and the seismic risk (annual failure probability) obtained in PSA (Probabilistic Safety Assessment). The purpose of this report is to discuss a definition of the structural seismic margin and to develop a diagram that identifies the relation between seismic margin and seismic risk. The main results of this paper are as follows: (1) We develop a seismic margin defined in terms of the intensity of the earthquake ground motion, which is more appropriate than the conventional response-based definition for the following reasons: a seismic margin based on earthquake ground motion is invariant when different types of structures are considered, and stakeholders can understand it better than the response-based one. (2) The developed seismic margin-risk diagram makes it easy to judge whether detailed probabilistic risk analysis or only deterministic analysis is needed for a given reference risk level, even when information on the uncertainty parameter beta is not available. (3) We have performed numerical simulations based on the developed method for four sites in Japan. The structural capacity-risk diagram differs from location to location because it is strongly influenced by the seismic hazard information for the target site. Furthermore, the required structural capacity

  15. Bifurcation theory for finitely smooth planar autonomous differential systems

    Science.gov (United States)

    Han, Maoan; Sheng, Lijuan; Zhang, Xiang

    2018-03-01

    In this paper we establish a bifurcation theory of limit cycles for planar Ck smooth autonomous differential systems, with k ∈ N. The key point is to study the smoothness of bifurcation functions, which are a basic and important tool in the study of Hopf bifurcation at a fine focus or a center, and of Poincaré bifurcation in a period annulus. We especially study the smoothness of the first order Melnikov function in degenerate Hopf bifurcation at an elementary center. The smoothness problem was solved for analytic and C∞ differential systems, but it had not been tackled for finitely smooth differential systems. Here we present the optimal regularity of these bifurcation functions and their asymptotic expressions in the finitely smooth case.

  16. Impact of spectral smoothing on gamma radiation portal alarm probabilities

    International Nuclear Information System (INIS)

    Burr, T.; Hamada, M.; Hengartner, N.

    2011-01-01

    Gamma detector counts are included in radiation portal monitors (RPM) to screen for illicit nuclear material. Gamma counts are sometimes smoothed to reduce variance in the estimated underlying true mean count rate, which is the 'signal' in our context. Smoothing reduces total error variance in the estimated signal if the bias that smoothing introduces is more than offset by the variance reduction. An empirical RPM study for vehicle screening applications is presented for unsmoothed and smoothed gamma counts in low-resolution plastic scintillator detectors and in medium-resolution NaI detectors. - Highlights: → We evaluate options for smoothing counts from gamma detectors deployed for portal monitoring. → A new multiplicative bias correction (MBC) is shown to reduce bias in peak and valley regions. → Performance is measured using mean squared error and detection probabilities for sources. → Smoothing with the MBC improves detection probabilities and the mean squared error.
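The bias-variance tradeoff described above can be illustrated with a plain boxcar smoother applied to synthetic Poisson counts; the spectrum shape and window size are hypothetical, and the paper's multiplicative bias correction is not reproduced here:

```python
import numpy as np

def moving_average(counts, window=5):
    """Boxcar smoother; window should be odd so the output is centered."""
    kernel = np.ones(window) / window
    return np.convolve(counts, kernel, mode="same")

rng = np.random.default_rng(7)
channels = np.arange(256)
# Hypothetical true mean count rate: flat continuum plus a Gaussian photopeak.
signal = 50.0 + 40.0 * np.exp(-0.5 * ((channels - 128) / 6.0) ** 2)
counts = rng.poisson(signal)          # observed gamma counts (Poisson noise)

smoothed = moving_average(counts, window=5)

# Smoothing helps when the variance reduction outweighs the bias it
# introduces (flattened peaks, filled valleys, edge effects).
mse_raw = np.mean((counts - signal) ** 2)
mse_smooth = np.mean((smoothed - signal) ** 2)
```

In this synthetic setup the smoothed estimate has lower mean squared error against the true rate, at the cost of a slight negative bias at the photopeak, which is the kind of peak/valley bias the paper's correction targets.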

  17. Pre-failure behaviour of an unstable limestone cliff from displacement and seismic data

    Directory of Open Access Journals (Sweden)

    J.-L. Got

    2010-04-01

    Full Text Available We monitored the displacement and seismic activity of an unstable vertical rock slice in a natural limestone cliff of the southeast Vercors massif, southeast France, during the months preceding its collapse. Displacement measurements showed an average acceleration of the movement of its top, with clear increases in the displacement velocity and in the discrete seismic event production rate during periods when temperature falls, with more activity when rainfall or frost occurs. Crises of discrete seismic events produce high amplitudes in periodograms, but do not change the high-frequency base noise level rate. We infer that these crises express critical crack growth induced by water weakening (from water vapor condensation or rain) of the rock strength, rather than a rapid change in applied stresses. Seismic noise analysis showed a steady increase in the high-frequency base noise level and the emergence of spectral modes in the signal recorded by the sensor installed on the unstable rock slice during the weeks preceding the collapse. The high-frequency seismic noise base level seems to represent subcritical crack growth. It is a smooth and robust parameter whose variations are related to generalized changes in the rupture process. A drop in the seismic noise amplitude was concomitant with the emergence of spectral modes – compatible with high-order eigenmodes of the unstable rock slice – during the later stages of its instability. Seismic noise analysis, especially high-frequency base noise level analysis, may complement that of inverse displacement velocity in early-warning approaches when strong displacement fluctuations occur.

  18. Calibration of Seismic Attributes for Reservoir Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Pennington, Wayne D.

    2002-05-29

    This project is intended to enhance the ability to use seismic data for the determination of rock and fluid properties through an improved understanding of the physics underlying the relationships between seismic attributes and formation.

  19. SEG Advances in Rotational Seismic Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Pierson, Robert; Laughlin, Darren; Brune, Bob

    2016-10-17

    Significant advancements in the development of sensors to enable rotational seismic measurements have been achieved. Prototypes are available now to support experiments that help validate the utility of rotational seismic measurements.

  20. Seismic risks posed by mine flooding

    CSIR Research Space (South Africa)

    Goldbach, OD

    2009-09-01

    Full Text Available are allowed to flood. Such flooding-induced seismicity can have significant environmental, social and economic consequences, and may endanger neighbouring mines and surface communities. While fluid-induced seismicity has been observed in other settings (e...

  1. Annual Hanford seismic report - fiscal year 1996

    International Nuclear Information System (INIS)

    Hartshorn, D.C.; Reidel, S.P.

    1996-12-01

    Seismic monitoring (SM) at the Hanford Site was established in 1969 by the US Geological Survey (USGS) under a contract with the US Atomic Energy Commission. Since 1980, the program has been managed by several contractors under the US Department of Energy (USDOE). Effective October 1, 1996, the Seismic Monitoring workscope, personnel, and associated contracts were transferred to the USDOE Pacific Northwest National Laboratory (PNNL). SM is tasked to provide an uninterrupted collection and archive of high-quality raw and processed seismic data from the Hanford Seismic Network (HSN) located on and encircling the Hanford Site. SM is also tasked to locate and identify sources of seismic activity and monitor changes in the historical pattern of seismic activity at the Hanford Site. The data compiled are used by SM, Waste Management, and engineering activities at the Hanford Site to evaluate seismic hazards and seismic design for the Site.

  2. SEISMIC DESIGN CRITERIA FOR NUCLEAR POWER REACTORS

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, R. A.

    1963-10-15

    The nature of nuclear power reactors demands an exceptionally high degree of seismic integrity. Considerations involved in defining earthquake resistance requirements are discussed. Examples of seismic design criteria and applications of the spectrum technique are described. (auth)

  3. Seismic analysis and testing of nuclear power plants

    International Nuclear Information System (INIS)

    1979-01-01

    The following subjects are discussed in this guide: general recommendations for seismic classification, loading combinations and allowable limits; seismic analysis methods; implications for seismic design; seismic testing and qualification; seismic instrumentation; modelling techniques; material property characterization; seismic response of soil deposits and earth structures; liquefaction and ground failure; slope stability; sloshing effects in water pools; and qualification testing by means of the transport vehicle.

  4. 3D Seismic Imaging using Marchenko Methods

    Science.gov (United States)

    Lomas, A.; Curtis, A.

    2017-12-01

    Marchenko methods are novel, data-driven techniques that allow seismic wavefields from sources and receivers on the Earth's surface to be redatumed to construct wavefields with sources in the subsurface - including complex multiply-reflected waves, and without the need for a complex reference model. In turn, this allows subsurface images to be constructed at any such subsurface redatuming points (image or virtual receiver points). Such images are then free of artefacts from multiply-scattered waves that usually contaminate migrated seismic images. Marchenko algorithms require as input the same information as standard migration methods: the full reflection response from sources and receivers at the Earth's surface, and an estimate of the first arriving wave between the chosen image point and the surface. The latter can be calculated using a smooth velocity model estimated using standard methods. The algorithm iteratively calculates a signal that focuses at the image point to create a virtual source at that point, and this can be used to retrieve the signal between the virtual source and the surface. A feature of these methods is that the retrieved signals are naturally decomposed into up- and down-going components. That is, we obtain the signal that initially propagated upwards from the virtual source and arrived at the surface, separated from the signal that initially propagated downwards. Figure (a) shows a 3D subsurface model with variable density but constant velocity (3000 m/s). Along the surface of this model (z=0), in both the x and y directions, are co-located sources and receivers at 20-meter intervals. The redatumed signal in figure (b) has been calculated using Marchenko methods from a virtual source at (1200 m, 500 m, 400 m) to the surface. For comparison, the true solution is given in figure (c), and shows a good match with figure (b). 
While these 2D redatuming and imaging methods are still in their infancy having first been developed in

  5. The Apollo passive seismic experiment

    Science.gov (United States)

    Latham, G. V.; Dorman, H. J.; Horvath, P.; Ibrahim, A. K.; Koyama, J.; Nakamura, Y.

    1979-01-01

    The completed data set obtained from the four-station Apollo seismic network includes signals from approximately 11,800 events of various types. Four data sets for use by other investigators, through the NSSDC, are in preparation. Some refinement of the lunar model based on seismic data can be expected, but its gross features remain as presented two years ago. The existence of a small, molten core remains dependent upon the analysis of signals from a single far-side impact. Analysis of secondary arrivals from other sources may eventually resolve this issue, as may continued refinement of the magnetic field measurements. Evidence of considerable lateral heterogeneity within the moon continues to build. The discrepancy between the meteoroid flux estimate derived from lunar seismic measurements and earth-based estimates remains, although significant correlations between terrestrial and lunar observations are beginning to emerge.

  6. Seismic scanning tunneling macroscope - Theory

    KAUST Repository

    Schuster, Gerard T.

    2012-09-01

    We propose a seismic scanning tunneling macroscope (SSTM) that can detect the presence of sub-wavelength scatterers in the near-field of either the source or the receivers. Analytic formulas for the time reverse mirror (TRM) profile associated with a single scatterer model show the spatial resolution limit to be, unlike the Abbe limit of λ/2, independent of wavelength and linearly proportional to the source-scatterer separation as long as the point scatterer is in the near-field region; if the sub-wavelength scatterer is a spherical impedance discontinuity then the resolution will also be limited by the radius of the sphere. Therefore, superresolution imaging can be achieved as the scatterer approaches the source. This is analogous to an optical scanning tunneling microscope that has sub-wavelength resolution. Scaled to seismic frequencies, it is theoretically possible to extract 100 Hz information from 20 Hz data by imaging of near-field seismic energy.

  7. Seismic scanning tunneling macroscope - Theory

    KAUST Repository

    Schuster, Gerard T.; Hanafy, Sherif M.; Huang, Yunsong

    2012-01-01

    We propose a seismic scanning tunneling macroscope (SSTM) that can detect the presence of sub-wavelength scatterers in the near-field of either the source or the receivers. Analytic formulas for the time reverse mirror (TRM) profile associated with a single scatterer model show the spatial resolution limit to be, unlike the Abbe limit of λ/2, independent of wavelength and linearly proportional to the source-scatterer separation as long as the point scatterer is in the near-field region; if the sub-wavelength scatterer is a spherical impedance discontinuity then the resolution will also be limited by the radius of the sphere. Therefore, superresolution imaging can be achieved as the scatterer approaches the source. This is analogous to an optical scanning tunneling microscope that has sub-wavelength resolution. Scaled to seismic frequencies, it is theoretically possible to extract 100 Hz information from 20 Hz data by imaging of near-field seismic energy.

  8. Adaptive Education.

    Science.gov (United States)

    Anderson, Lorin W.

    1979-01-01

    Schools have devised several ways to adapt instruction to a wide variety of student abilities and needs. Judged by criteria for what adaptive education should be, most learning for mastery programs look good. (Author/JM)

  9. Six-term exact sequences for smooth generalized crossed products

    DEFF Research Database (Denmark)

    Gabriel, Olivier; Grensing, Martin

    2013-01-01

    We define smooth generalized crossed products and prove six-term exact sequences of Pimsner–Voiculescu type. This sequence may, in particular, be applied to smooth subalgebras of the quantum Heisenberg manifolds in order to compute the generators of their cyclic cohomology. Further, our results include the known results for smooth crossed products. Our proof is based on a combination of arguments from the setting of (Cuntz–)Pimsner algebras and the Toeplitz proof of Bott periodicity.

  10. Star Products with Separation of Variables Admitting a Smooth Extension

    Science.gov (United States)

    Karabegov, Alexander

    2012-08-01

    Given a complex manifold M with an open dense subset Ω endowed with a pseudo-Kähler form ω which cannot be smoothly extended to a larger open subset, we consider various examples where the corresponding Kähler-Poisson structure and a star product with separation of variables on (Ω, ω) admit smooth extensions to M. We give a simple criterion of the existence of a smooth extension of a star product and apply it to these examples.

  11. Star products with separation of variables admitting a smooth extension

    OpenAIRE

    Karabegov, Alexander

    2010-01-01

    Given a complex manifold $M$ with an open dense subset $\Omega$ endowed with a pseudo-Kaehler form $\omega$ which cannot be smoothly extended to a larger open subset, we consider various examples where the corresponding Kaehler-Poisson structure and a star product with separation of variables on $(\Omega, \omega)$ admit smooth extensions to $M$. We suggest a simple criterion for the existence of a smooth extension of a star product and apply it to these examples.

  12. Fast compact algorithms and software for spline smoothing

    CERN Document Server

    Weinert, Howard L

    2012-01-01

    Fast Compact Algorithms and Software for Spline Smoothing investigates algorithmic alternatives for computing cubic smoothing splines when the amount of smoothing is determined automatically by minimizing the generalized cross-validation score. These algorithms are based on Cholesky factorization, QR factorization, or the fast Fourier transform. All algorithms are implemented in MATLAB and are compared based on speed, memory use, and accuracy. An overall best algorithm is identified, which allows very large data sets to be processed quickly on a personal computer.
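A comparable GCV-driven smoothing spline is available in SciPy: make_smoothing_spline selects the penalty by generalized cross-validation when no smoothing parameter is supplied. The sketch below applies it to synthetic data and stands in for, but is not, the book's MATLAB implementations:

```python
import numpy as np
from scipy.interpolate import make_smoothing_spline

# Noisy samples of a smooth signal (synthetic).
rng = np.random.default_rng(42)
x = np.linspace(0.0, 2.0 * np.pi, 200)
y_true = np.sin(x)
y = y_true + 0.1 * rng.normal(size=x.size)

# With no lam argument, SciPy chooses the smoothing parameter by
# minimizing the generalized cross-validation (GCV) score.
spline = make_smoothing_spline(x, y)
y_smooth = spline(x)

rmse_noisy = np.sqrt(np.mean((y - y_true) ** 2))
rmse_smooth = np.sqrt(np.mean((y_smooth - y_true) ** 2))
```

Automatic GCV selection is exactly the step whose cost dominates for large data sets, which is what the book's Cholesky, QR, and FFT-based algorithms are designed to accelerate.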

  13. Seismic Signal Compression Using Nonparametric Bayesian Dictionary Learning via Clustering

    Directory of Open Access Journals (Sweden)

    Xin Tian

    2017-06-01

    Full Text Available We introduce a seismic signal compression method based on nonparametric Bayesian dictionary learning via clustering. The seismic data is compressed patch by patch, and the dictionary is learned online. Clustering is introduced for dictionary learning: a set of dictionaries is generated, and each dictionary is used for the sparse coding of one cluster. In this way, the signals in one cluster can be well represented by their corresponding dictionary. A nonparametric Bayesian dictionary learning method is used to learn the dictionaries, which naturally infers an appropriate dictionary size for each cluster. A uniform quantizer and an adaptive arithmetic coding algorithm are adopted to code the sparse coefficients. Comparisons with other state-of-the-art approaches in the experiments validate the effectiveness of the proposed method.

  14. Seismic evaluation of the Mors Dome

    International Nuclear Information System (INIS)

    Kreitz, E.

    1982-01-01

    The ''Seismic Case History'' of the Mors saltdome was already published in detail by ELSAM/ELKRAFT so only a few important points need to be mentioned here: (a) Processing and interpretation of the seismic material. (b) Stratigraphic classification of the most important seismic reflection horizons. (c) Construction of the depth sections and description of the saltdome model. (d) Investigations of the problematic salt overhang using interactive seismic modelling. (EG)

  15. Seismic re-evaluation of Mochovce nuclear power plant. Seismic reevaluation of civil structures

    International Nuclear Information System (INIS)

    Podrouzek, P.

    1997-01-01

    In this contribution, an overview of the seismic design procedures used for reassessment of seismic safety of civil structures at the Mochovce NPP in the Slovak Republic is presented. As an introduction, the objectives, history, and current status of seismic design of the NPP are explained. The general philosophy of design methods, seismic classification of buildings, seismic data, calculation methods, assumptions on structural behavior under seismic loading and reliability assessment are described in detail in the subsequent section. Examples of calculation models used for dynamic calculations of seismic response are given in the last section. (author)

  16. Core seismic methods verification report

    International Nuclear Information System (INIS)

    Olsen, B.E.; Shatoff, H.D.; Rakowski, J.E.; Rickard, N.D.; Thompson, R.W.; Tow, D.; Lee, T.H.

    1979-12-01

    This report presents the description and validation of the analytical methods for calculation of the seismic loads on an HTGR core and the core support structures. Analytical modeling, integration schemes, parameter assignment, parameter sensitivity, and correlation with test data are key topics which have been covered in detail. Much of the text concerns the description and the results of a series of scale model tests performed to obtain data for code correlation. A discussion of scaling laws, model properties, seismic excitation, instrumentation, and data reduction methods is also presented, including a section on the identification and calculation of statistical errors in the test data.

  17. Advanced Seismic While Drilling System

    Energy Technology Data Exchange (ETDEWEB)

    Robert Radtke; John Fontenot; David Glowka; Robert Stokes; Jeffery Sutherland; Ron Evans; Jim Musser

    2008-06-30

    A breakthrough has been discovered for controlling seismic sources to generate selectable low frequencies. Conventional seismic sources, including sparkers, rotary mechanical, hydraulic, air guns, and explosives, by their very nature produce high frequencies. This is counter to the need for long signal transmission through rock. The patent-pending SeismicPULSER™ methodology has been developed for controlling otherwise high-frequency seismic sources to generate selectable low-frequency peak spectra applicable to many seismic applications. Specifically, we have demonstrated the application of a low-frequency sparker source which can be incorporated into a drill bit for Drill Bit Seismic While Drilling (SWD). To create the methodology of a controllable low-frequency sparker seismic source, it was necessary to learn how to maximize sparker efficiencies to couple to, and transmit through, rock, with a study of sparker designs and mechanisms for (a) coupling the sparker-generated gas bubble expansion and contraction to the rock, (b) the effects of fluid properties and dynamics, (c) linear and non-linear acoustics, and (d) imparted force directionality. After extensive seismic modeling, the design of high-efficiency sparkers, laboratory high-frequency sparker testing, and field tests were performed at the University of Texas Devine seismic test site. The conclusion of the field test was that extremely high power levels would be required to achieve the range needed for deep (15,000+ ft), high-temperature, high-pressure (HTHP) wells. Thereafter, more modeling and laboratory testing led to the discovery of a method to control a sparker so that it could generate the low frequencies required for deep wells. The low-frequency sparker was successfully tested at the Department of Energy Rocky Mountain Oilfield Test Center (DOE RMOTC) field test site in Casper, Wyoming. An 8-in diameter by 26-ft long SeismicPULSER™ drill string tool was designed and manufactured by TII

  18. Seismic design of piping systems

    International Nuclear Information System (INIS)

    Anglaret, G.; Beguin, J.L.

    1986-01-01

    This paper deals with the method used in France for PWR nuclear plants to derive the locations and types of supports of auxiliary and secondary piping systems, taking earthquakes into account. The successive steps of design are described; then the seismic computation method and its particular conditions of application to piping are presented. The different types of support (especially seismic ones) are described, along with their conditions of installation. The method used to compare functional test results with computation results in order to check the models is mentioned. Some experiments realised on site or in the laboratory, in order to validate models and methods, are presented [fr]

  19. Implementation guidelines for seismic PSA

    International Nuclear Information System (INIS)

    Coman, Ovidiu; Samaddar, Sujit; Hibino, Kenta

    2014-01-01

    The presentation was devoted to development of guidelines for implementation of a seismic PSA. If successful, these guidelines can close an important gap. ASME/ANS PRA standards and the related IAEA Safety Guide (IAEA NS-G-2.13) describe capability requirements for seismic PSA in order to support risk-informed applications. However, practical guidance on how to meet these requirements is limited. Such guidelines could significantly contribute to improving risk-informed safety demonstration, safety management and decision making. Extensions of this effort to further PSA areas, particularly to PSA for other external hazards, can enhance risk-informed applications.

  20. Seismic characterization of fracture properties

    International Nuclear Information System (INIS)

    Myer, L.R.; Hopkins, D.; Cook, N.G.W.; Pyrak-Nolte, L.J.

    1990-01-01

    The purpose of this paper is to show that there is a relationship, both empirical and theoretical, between the measured seismic response, the mechanical stiffness (also referred to as specific stiffness) of fractures and their hydraulic conductivity. Laboratory measurements of the mechanical stiffness, hydraulic conductivity and seismic properties of natural fractures are summarized. A theoretical model for the amplitude and group time delay for compressional and shear waves transmitted across a single fracture is presented. Predictions based on this model are compared with laboratory measurements. Finally, the results for a single fracture are extended to multiple parallel fractures. 13 refs., 6 figs
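The single-fracture transmission model described here is commonly written, in the displacement-discontinuity formulation associated with Pyrak-Nolte and co-workers, as a frequency-dependent transmission coefficient controlled by the ratio of fracture specific stiffness to seismic impedance. The sketch below assumes normal incidence between identical half-spaces and is a generic statement of that model, not necessarily the paper's exact parametrization.

```python
import math

def fracture_transmission(freq_hz, stiffness, impedance):
    """Displacement-discontinuity model, normal incidence, identical
    half-spaces: |T| = [1 + (omega * Z / (2 * kappa))**2]**-0.5, where
    kappa is fracture specific stiffness (Pa/m) and Z seismic impedance.
    A very stiff fracture transmits fully; a compliant one attenuates
    and delays the transmitted wave."""
    omega = 2.0 * math.pi * freq_hz
    return (1.0 + (omega * impedance / (2.0 * stiffness)) ** 2) ** -0.5
```

The same ratio omega*Z/(2*kappa) that lowers transmission also sets the group time delay, which is why seismic amplitude and delay jointly constrain stiffness, and through it, hydraulic conductivity.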

  1. The seismic reassessment Mochovce NPP

    International Nuclear Information System (INIS)

    Baumeister, P.

    2004-01-01

    The design of Mochovce NPP was based on the Novo-Voronezh type WWER-440/213 twin-unit reactor. The seismicity of this region is very low. The Mochovce NPP site is located on rock with a volcanic (andesite) layer. Seismic reassessment of Mochovce NPP was done in two steps: a deterministic approach up to commissioning confirmed a horizontal peak ground acceleration of HPGA = 0.1 g, while activities after commissioning, undertaken as a consequence of the IAEA mission, indicate higher hazard values. (author)

  2. Seismic Holography of Solar Activity

    Science.gov (United States)

    Lindsey, Charles

    2000-01-01

    The basic goal of the project was to extend holographic seismic imaging techniques developed under a previous NASA contract, and to incorporate phase diagnostics. Phase-sensitive imaging gives us a powerful probe of local thermal and Doppler perturbations in active region subphotospheres, allowing us to map thermal structure and flows associated with "acoustic moats" and "acoustic glories". These remarkable features were discovered during our work, by applying simple acoustic power holography to active regions. Included in the original project statement was an effort to obtain the first seismic images of active regions on the Sun's far surface.

  3. Community Seismic Network (CSN)

    Science.gov (United States)

    Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Cheng, M.; Guy, R.; Chandy, M.; Krause, A.; Bunn, J.; Olson, M.; Faulkner, M.; Liu, A.; Strand, L.

    2012-12-01

    We report on developments in sensor connectivity, architecture, and data fusion algorithms executed in Cloud computing systems in the Community Seismic Network (CSN), a network of low-cost sensors housed in homes and offices by volunteers in the Pasadena, CA area. The network has over 200 sensors continuously reporting anomalies in local acceleration through the Internet to a Cloud computing service (the Google App Engine) that continually fuses sensor data to rapidly detect shaking from earthquakes. The Cloud computing system consists of data centers geographically distributed across the continent and is likely to be resilient even during earthquakes and other local disasters. The region of Southern California is partitioned in a multi-grid style into sets of telescoping cells called geocells. Data streams from sensors within a geocell are fused to detect anomalous shaking across the geocell. Temporal and spatial patterns across geocells are used to detect anomalies across regions. The challenge is to detect earthquakes rapidly with an extremely low false positive rate. We report on two data fusion algorithms, one that tessellates the surface so as to fuse data from a large region around Pasadena and the other, which uses a standard tessellation of equal-sized cells. Since September 2011, the network has successfully detected earthquakes of magnitude 2.5 or higher within 40 km of Pasadena. In addition to the standard USB device, which connects to the host's computer, we have developed a stand-alone sensor that directly connects to the internet via Ethernet or Wi-Fi. This bypasses security concerns that some companies have with the USB-connected devices, and allows for 24/7 monitoring at sites that would otherwise shut down their computers after working hours. In buildings we use the sensors to model the behavior of the structures during weak events in order to understand how they will perform during strong events. 
Visualization models of instrumented buildings ranging
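The telescoping-geocell idea can be sketched as a nested integer grid in which every cell at one level sits exactly inside a parent cell at the coarser level. The cell-size rule and the function below are hypothetical, since the abstract does not specify CSN's actual indexing scheme.

```python
def geocell(lat, lon, level):
    """Hypothetical telescoping geocell index: each level halves the cell
    edge, so a level-L cell nests exactly inside its level-(L-1) parent.
    The real CSN indexing scheme is not specified in the abstract."""
    size = 1.0 / (2 ** level)                 # cell edge length in degrees
    return (level, int(lat // size), int(lon // size))
```

Fusing detections per geocell and then looking for coherent patterns across neighboring and parent cells is what lets a network of noisy consumer sensors keep the false-positive rate low.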

  4. Green's function representations for seismic interferometry

    NARCIS (Netherlands)

    Wapenaar, C.P.A.; Fokkema, J.T.

    2006-01-01

    The term seismic interferometry refers to the principle of generating new seismic responses by crosscorrelating seismic observations at different receiver locations. The first version of this principle was derived by Claerbout (1968), who showed that the reflection response of a horizontally layered

  5. Redatuming of sparse 3D seismic data

    NARCIS (Netherlands)

    Tegtmeier, S.

    2007-01-01

    The purpose of a seismic survey is to produce an image of the subsurface providing an overview of the earth's discontinuities. The aim of seismic processing is to recreate this image. The seismic method is especially well suited for the exploration and the monitoring of hydrocarbon reservoirs. A

  6. Seismic risk map for Southeastern Brazil

    International Nuclear Information System (INIS)

    Mioto, J.A.

    1984-01-01

    During the last few years, some studies regarding seismic risk were prepared for three regions of Brazil. They were carried out on account of two basic interests: first, toward the seismic history and recurrence of Brazilian seismic events; second, to provide seismic parameters for the design and construction of hydro and nuclear power plants. The first seismic risk map prepared for the southeastern region was elaborated in 1979 by the Universidade de Brasilia (UnB-Brasilia Seismological Station). In 1981 another seismic risk map was completed on the basis of seismotectonic studies carried out for the design and construction of the nuclear power plants of Itaorna Beach (Angra dos Reis, Rio de Janeiro) by IPT (Mining and Applied Geology Division). In Brazil, until 1984, seismic studies concerning hydro and nuclear power plants and other civil constructions of larger size did not take into account seismic events from the point of view of probabilities of seismic recurrence. Such analysis in design is more important than the choice of a level of intensity or magnitude, or the adoption of a seismicity level based on deterministic methods. In this way, some considerations are made concerning the use of earthquakes in Brazilian designs of hydro and nuclear power plants, as far as seismic analysis is concerned, which has recently altered the current seismic risk panorama. (D.J.M.) [pt]
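A recurrence-based analysis of the kind the author advocates typically combines a Gutenberg-Richter rate law with a Poisson occurrence model. The parameter values in this sketch are purely illustrative, not estimates for southeastern Brazil.

```python
import math

def prob_exceedance(a, b, m, years):
    """Poisson recurrence probability from a Gutenberg-Richter law,
    log10 N = a - b*m, where N is the annual rate of events with
    magnitude >= m: P(at least one in T years) = 1 - exp(-N*T)."""
    rate = 10.0 ** (a - b * m)
    return 1.0 - math.exp(-rate * years)
```

For design, this converts a choice of exposure period and acceptable exceedance probability into a design magnitude, rather than fixing a single deterministic intensity level.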

  7. A linear motor as seismic horizontal vibrator

    NARCIS (Netherlands)

    Drijkoningen, G.; Veltman, A.; Hendrix, W.H.A.; Brouwer, J.; Hemstede, A.

    2006-01-01

    In this paper we propose to use the concept of linear synchronous motors to act as a seismic shear-wave vibratory source. We show that a linear motor, even with a design that is not focussed on application of seismic surveying, gives seismic records that are convincing and comparable with an

  8. seismic refraction investigation of the subsurface structure

    African Journals Online (AJOL)

    DR. AMINU

    employed for exploration include magnetic, electrical and gravitational methods, which depend on the earth's natural fields. Others are seismic and electromagnetic methods, which depend on the introduction of artificial energy into the earth. The seismic refraction method uses the seismic energy that returns to the surface of ...

  9. Seismic activity maps for the Armenian Highlands

    Energy Technology Data Exchange (ETDEWEB)

    Karapetyan, N.K.; Manukyan, Zh.O.

    1976-01-01

    Seismic activity maps for the periods 1952 to 1967 and 1952 to 1968 were compiled for the Armenian Highlands in order to study the spatial distribution of earthquake recurrence and to construct maps in isolines of seismic activity. Diagrams are presented illustrating such seismic activity maps for the indicated periods. 4 references, 3 figures, 1 table.

  10. Constraints on mantle convection from seismic tomography

    NARCIS (Netherlands)

    Kárason, H.; Hilst, R.D. van der

    2000-01-01

    Since the advent of global seismic tomography some 25 years ago, advances in technology, seismological theory, and data acquisition have allowed spectacular progress in our ability to image seismic heterogeneity in Earth's mantle. We briefly review some concepts of seismic tomography, such as

  11. Advancing New 3D Seismic Interpretation Methods for Exploration and Development of Fractured Tight Gas Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    James Reeves

    2005-01-31

    In a study funded by the U.S. Department of Energy and GeoSpectrum, Inc., new P-wave 3D seismic interpretation methods to characterize fractured gas reservoirs are developed. A data driven exploratory approach is used to determine empirical relationships for reservoir properties. Fractures are predicted using seismic lineament mapping through a series of horizon and time slices in the reservoir zone. A seismic lineament is a linear feature seen in a slice through the seismic volume that has negligible vertical offset. We interpret that in regions of high seismic lineament density there is a greater likelihood of fractured reservoir. Seismic AVO attributes are developed to map brittle reservoir rock (low clay) and gas content. Brittle rocks are interpreted to be more fractured when seismic lineaments are present. The most important attribute developed in this study is the gas sensitive phase gradient (a new AVO attribute), as reservoir fractures may provide a plumbing system for both water and gas. Success is obtained when economic gas and oil discoveries are found. In a gas field previously plagued with poor drilling results, four new wells were spotted using the new methodology and recently drilled. The wells have estimated best of 12-months production indicators of 2106, 1652, 941, and 227 MCFGPD. The latter well was drilled in a region of swarming seismic lineaments but has poor gas sensitive phase gradient (AVO) and clay volume attributes. GeoSpectrum advised the unit operators that this location did not appear to have significant Lower Dakota gas before the well was drilled. The other three wells are considered good wells in this part of the basin and among the best wells in the area. These new drilling results have nearly doubled the gas production and the value of the field. The interpretation method is ready for commercialization and gas exploration and development. The new technology is adaptable to conventional lower cost 3D seismic surveys.

  12. Adaptive Lighting

    DEFF Research Database (Denmark)

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    Adaptive Lighting Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people’s needs and desires. IT support is key to the technical developments that afford adaptive control systems. The possibilities ... offered by adaptive lighting control are created by the ways that the system components, the network and data flow can be coordinated through software so that the dynamic variations are controlled in ways that meaningfully adapt according to people’s situations and design intentions. This book discusses ... differently into an architectural body. We also examine what might occur when light is dynamic and able to change colour, intensity and direction, and when it is adaptive and can be brought into interaction with its surroundings. In short, what happens to an architectural space when artificial lighting ceases ...

  13. Induced seismicity and carbon storage: Risk assessment and mitigation strategies

    Energy Technology Data Exchange (ETDEWEB)

    White, Joshua A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Foxall, William [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bachmann, Corinne [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chiaramonte, Laura [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Daley, Thomas M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-01-28

    assessment and mitigation approach. A phased approach to risk management is then introduced. The basic goal of the phased approach is to constantly adapt site operations to current conditions and available characterization data. The remainder of the report then focuses in detail on different components of the monitoring, risk assessment, and mitigation strategies. Issues in current seismic risk assessment methods that must be modified to address induced seismicity are highlighted. The report then concludes with several specific recommendations for operators and regulatory authorities to consider when selecting, permitting, and operating a storage project.

  14. Nuclear fusion-independent smooth muscle differentiation of human adipose-derived stem cells induced by a smooth muscle environment.

    Science.gov (United States)

    Zhang, Rong; Jack, Gregory S; Rao, Nagesh; Zuk, Patricia; Ignarro, Louis J; Wu, Benjamin; Rodríguez, Larissa V

    2012-03-01

    Human adipose-derived stem cells hASC have been isolated and were shown to have multilineage differentiation capacity. Although both plasticity and cell fusion have been suggested as mechanisms for cell differentiation in vivo, the effect of the local in vivo environment on the differentiation of adipose-derived stem cells has not been evaluated. We previously reported the in vitro capacity of smooth muscle differentiation of these cells. In this study, we evaluate the effect of an in vivo smooth muscle environment in the differentiation of hASC. We studied this by two experimental designs: (a) in vivo evaluation of smooth muscle differentiation of hASC injected into a smooth muscle environment and (b) in vitro evaluation of smooth muscle differentiation capacity of hASC exposed to bladder smooth muscle cells. Our results indicate a time-dependent differentiation of hASC into mature smooth muscle cells when these cells are injected into the smooth musculature of the urinary bladder. Similar findings were seen when the cells were cocultured in vitro with primary bladder smooth muscle cells. Chromosomal analysis demonstrated that microenvironment cues rather than nuclear fusion are responsible for this differentiation. We conclude that cell plasticity is present in hASCs, and their differentiation is accomplished in the absence of nuclear fusion. Copyright © 2011 AlphaMed Press.

  15. Seismic tomography with the reversible jump algorithm

    Science.gov (United States)

    Bodin, Thomas; Sambridge, Malcolm

    2009-09-01

    The reversible jump algorithm is a statistical method for Bayesian inference with a variable number of unknowns. Here, we apply this method to the seismic tomography problem. The approach lets us consider the issue of model parametrization (i.e. the way of discretizing the velocity field) as part of the inversion process. The model is parametrized using Voronoi cells with mobile geometry and number. The size, position and shape of the cells defining the velocity model are directly determined by the data. The inverse problem is tackled within a Bayesian framework and explicit regularization of model parameters is not required. The mobile position and number of cells means that global damping procedures, controlled by an optimal regularization parameter, are avoided. Many velocity models with variable numbers of cells are generated via a transdimensional Markov chain and information is extracted from the ensemble as a whole. As an aid to interpretation we visualize the expected earth model that is obtained via Monte Carlo integration in a straightforward manner. The procedure is particularly adept at imaging rapid changes or discontinuities in wave speed. While each velocity model in the final ensemble consists of many discontinuities at cell boundaries, these are smoothed out in the averaged ensemble solution while those required by the data are reinforced. The ensemble of models can also be used to produce uncertainty estimates and experiments with synthetic data suggest that they represent actual uncertainty surprisingly well. We use the fast marching method in order to iteratively update the ray geometry and account for the non-linearity of the problem. The method is tested here with synthetic data in a 2-D application and compared with a subspace method that is a more standard matrix-based inversion scheme. Preliminary results illustrate the advantages of the reversible jump algorithm. 
A real data example is also shown where a tomographic image of Rayleigh wave
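The transdimensional sampler can be caricatured in one dimension: Voronoi nuclei on a line define a piecewise-constant velocity profile, and birth/death/move/value proposals let the Markov chain change the number of cells as it runs. The acceptance rule below is deliberately simplified to a bare likelihood ratio (priors and proposal densities are chosen so that the remaining ratios cancel, and the full reversible-jump Jacobian bookkeeping is omitted), so this is a didactic sketch rather than Bodin and Sambridge's algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 1-D problem: recover a piecewise-constant velocity profile.
xs = np.linspace(0.0, 1.0, 60)
truth = np.where(xs < 0.5, 1.0, 2.0)
data = truth + 0.1 * rng.standard_normal(xs.size)

def predict(nuclei, values):
    """1-D Voronoi model: each x takes the value of its nearest nucleus."""
    return values[np.argmin(np.abs(xs[:, None] - nuclei[None, :]), axis=1)]

def loglike(nuclei, values):
    r = data - predict(nuclei, values)
    return -0.5 * np.sum(r ** 2) / 0.1 ** 2

nuclei, values = np.array([0.3, 0.7]), np.array([1.5, 1.5])
ll = loglike(nuclei, values)
dims = []                                  # number of cells per iteration
for _ in range(3000):
    step = rng.choice(["birth", "death", "move", "value"])
    n2, v2 = nuclei.copy(), values.copy()
    if step == "birth" and len(n2) < 20:   # add a cell drawn from the prior
        n2 = np.append(n2, rng.uniform(0.0, 1.0))
        v2 = np.append(v2, rng.uniform(0.0, 3.0))
    elif step == "death" and len(n2) > 1:  # delete a random cell
        i = rng.integers(len(n2))
        n2, v2 = np.delete(n2, i), np.delete(v2, i)
    elif step == "move":                   # perturb a nucleus position
        i = rng.integers(len(n2))
        n2[i] = np.clip(n2[i] + 0.05 * rng.standard_normal(), 0.0, 1.0)
    else:                                  # perturb a cell's velocity
        i = rng.integers(len(v2))
        v2[i] += 0.1 * rng.standard_normal()
    ll2 = loglike(n2, v2)
    # Didactic acceptance rule: reduced to a likelihood ratio only.
    if np.log(rng.uniform()) < ll2 - ll:
        nuclei, values, ll = n2, v2, ll2
    dims.append(len(nuclei))
```

Averaging predictions over the whole chain, rather than keeping only the last state, gives the smoothed ensemble solution and uncertainty estimates the abstract describes.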

  16. ASIC proteins regulate smooth muscle cell migration.

    Science.gov (United States)

    Grifoni, Samira C; Jernigan, Nikki L; Hamilton, Gina; Drummond, Heather A

    2008-03-01

    The purpose of the present study was to investigate Acid Sensing Ion Channel (ASIC) protein expression and importance in cellular migration. We recently demonstrated that Epithelial Na(+)Channel (ENaC) proteins are required for vascular smooth muscle cell (VSMC) migration; however, the role of the closely related ASIC proteins has not been addressed. We used RT-PCR and immunolabeling to determine expression of ASIC1, ASIC2, ASIC3 and ASIC4 in A10 cells. We used small interference RNA to silence individual ASIC expression and determine the importance of ASIC proteins in wound healing and chemotaxis (PDGF-bb)-initiated migration. We found ASIC1, ASIC2, and ASIC3, but not ASIC4, expression in A10 cells. ASIC1, ASIC2, and ASIC3 siRNA molecules significantly suppressed expression of their respective proteins compared to non-targeting siRNA (RISC) transfected controls by 63%, 44%, and 55%, respectively. Wound healing was inhibited by 10, 20, and 26% compared to RISC controls following suppression of ASIC1, ASIC2, and ASIC3, respectively. Chemotactic migration was inhibited by 30% and 45%, respectively, following suppression of ASIC1 and ASIC3. ASIC2 suppression produced a small, but significant, increase in chemotactic migration (4%). Our data indicate that ASIC expression is required for normal migration and may suggest a novel role for ASIC proteins in cellular migration.

  17. An implicit Smooth Particle Hydrodynamic code

    Energy Technology Data Exchange (ETDEWEB)

    Knapp, Charles E. [Univ. of New Mexico, Albuquerque, NM (United States)

    2000-05-01

    An implicit version of the Smooth Particle Hydrodynamic (SPH) code SPHINX has been written and is working. In conjunction with the SPHINX code, the new implicit code models fluids and solids under a wide range of conditions. SPH codes are Lagrangian, meshless, and use particles to model the fluids and solids. The implicit code makes use of Krylov iterative techniques for solving large linear systems and a Newton-Raphson method for non-linear corrections. It uses numerical derivatives to construct the Jacobian matrix. It uses sparse techniques to save on memory storage and to reduce the amount of computation. It is believed that this is the first implicit SPH code to use Newton-Krylov techniques, and also the first implicit SPH code to model solids. A description of SPH and the techniques used in the implicit code are presented. Then, the results of a number of test cases are discussed, which include a shock tube problem, a Rayleigh-Taylor problem, a breaking dam problem, and a single jet of gas problem. The results are shown to be in very good agreement with analytic solutions, experimental results, and the explicit SPHINX code. In the single-jet-of-gas case, it has been demonstrated that the implicit code can do a problem in much shorter time than the explicit code. The problem was admittedly very unphysical, but it does demonstrate the potential of the implicit code. It is a first step toward a useful implicit SPH code.
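The pairing of Newton-Raphson with numerically differenced Jacobians can be illustrated on a toy implicit time step. A dense linear solve stands in for the Krylov iterations used in the actual code, and the PDE here is illustrative, not an SPH system.

```python
import numpy as np

# One backward-Euler step of a toy diffusion-reaction equation
# u_t = u_xx + u^2 on a periodic grid, solved by Newton-Raphson with a
# finite-difference Jacobian (a dense solve stands in for the Krylov
# iterations of the actual implicit SPH code).
n, dt, dx = 32, 1e-3, 1.0 / 32
u0 = np.sin(2 * np.pi * np.linspace(0.0, 1.0, n, endpoint=False))

def residual(u):
    lap = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx ** 2
    return u - u0 - dt * (lap + u ** 2)        # implicit-step residual

def numerical_jacobian(f, u, eps=1e-7):
    """Build the Jacobian column by column from forward differences."""
    J = np.empty((u.size, u.size))
    f0 = f(u)
    for j in range(u.size):
        du = np.zeros_like(u)
        du[j] = eps
        J[:, j] = (f(u + du) - f0) / eps
    return J

u = u0.copy()
for _ in range(10):                            # Newton-Raphson iterations
    r = residual(u)
    if np.max(np.abs(r)) < 1e-10:
        break
    u -= np.linalg.solve(numerical_jacobian(residual, u), r)
```

A Newton-Krylov method avoids ever forming J explicitly, needing only Jacobian-vector products, which is what makes the approach affordable at SPH problem sizes.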

  18. Continuous distribution of elastic parameters of the shallow quaternary layers along the 3C seismic profile of east Bucharest

    International Nuclear Information System (INIS)

    Bala, A.; Raileanu, V.; Cristea, P.; Nitica, C.

    2008-01-01

    Processing techniques applied to seismic data acquired by reflection methods. Seismic methods are efficient research tools for civil engineering and environmental geology, which invites the development of specific methodologies. Therefore, programs were developed for processing data collected with refraction seismic techniques (based on head and transmitted waves) and by transmission tomography for velocity. The visual programming environment Borland Delphi was used to create the program MEDCONT, which is commanded and controlled through menus and dialog windows. The accuracy and the adaptability of the program to field cases are validated by data resulting from forward models and also collected in applications on field objectives. (authors)

  19. Seismic maps foster landmark legislation

    Science.gov (United States)

    Borcherdt, Roger D.; Brown, Robert B.; Page, Robert A.; Wentworth, Carl M.; Hendley, James W.

    1995-01-01

    When a powerful earthquake strikes an urban region, damage concentrates not only near the quake's source. Damage can also occur many miles from the source in areas of soft ground. In recent years, scientists have developed ways to identify and map these areas of high seismic hazard. This advance has spurred pioneering legislation to reduce earthquake losses in areas of greatest hazard.

  20. Micromachined silicon seismic accelerometer development

    Energy Technology Data Exchange (ETDEWEB)

    Barron, C.C.; Fleming, J.G.; Montague, S. [and others]

    1996-08-01

    Batch-fabricated silicon seismic transducers could revolutionize the discipline of seismic monitoring by providing inexpensive, easily deployable sensor arrays. Our ultimate goal is to fabricate seismic sensors with sensitivity and noise performance comparable to short-period seismometers in common use. We expect several phases of development will be required to accomplish that level of performance. Traditional silicon micromachining techniques are not ideally suited to the simultaneous fabrication of a large proof mass and soft suspension, such as one needs to achieve the extreme sensitivities required for seismic measurements. We have therefore developed a novel "mold" micromachining technology that promises to make larger proof masses (in the 1-10 mg range) possible. We have successfully integrated this micromolding capability with our surface-micromachining process, which enables the formation of soft suspension springs. Our calculations indicate that devices made in this new integrated technology will resolve down to at least sub-µG signals, and may even approach the 10⁻¹⁰ G/√Hz acceleration levels found in the low-earth-noise model.

  1. Southern Appalachian Regional Seismic Network

    Energy Technology Data Exchange (ETDEWEB)

    Chiu, S.C.C.; Johnston, A.C.; Chiu, J.M. [Memphis State Univ., TN (United States). Center for Earthquake Research and Information

    1994-08-01

    The seismic activity in the southern Appalachian area was monitored by the Southern Appalachian Regional Seismic Network (SARSN) since late 1979 by the Center for Earthquake Research and Information (CERI) at Memphis State University. This network provides good spatial coverage for earthquake locations especially in east Tennessee. The level of activity concentrates more heavily in the Valley and Ridge province of eastern Tennessee, as opposed to the Blue Ridge or Inner Piedmont. The large majority of these events lie between New York - Alabama lineament and the Clingman/Ocoee lineament, magnetic anomalies produced by deep-seated basement structures. Therefore SARSN, even with its wide station spacing, has been able to define the essential first-order seismological characteristics of the Southern Appalachian seismic zone. The focal depths of the southeastern U.S. earthquakes concentrate between 8 and 16 km, occurring principally beneath the Appalachian overthrust. In cross-sectional views, the average seismicity is shallower to the east beneath the Blue Ridge and Piedmont provinces and deeper to the west beneath the Valley and Ridge and the North American craton. Results of recent focal mechanism studies by using the CERI digital earthquake catalog between October, 1986 and December, 1991, indicate that the basement of the Valley and Ridge province is under a horizontal, NE-SW compressive stress. Right-lateral strike-slip faulting on nearly north-south fault planes is preferred because it agrees with the trend of the regional magnetic anomaly pattern.

  3. Evaluating Seismic Activity in Ethiopia

    African Journals Online (AJOL)

    map is constructed from which seismic risks in a given sector ... troyed (10, 11) and the people of Eritrea remember these years ... terms of damage caused to man-made structures; they refer to .... walls of a well designed modern building were deta- ched from ... Although, at present, no theory is satisfactory, the fact remains.

  4. Seismic motions from project Rulison

    Energy Technology Data Exchange (ETDEWEB)

    Loux, P C [Environmental Research Corp., Alexandria, VA (United States)

    1970-05-15

    In the range from a few to a few hundred km, seismic measurements from the Rulison event are shown and compared with experimentally and analytically derived pre-event estimates. Seismograms, peak accelerations, and response spectra are given along with a description of the associated geologic environment. Techniques used for the pre-event estimates are identified with emphasis on supportive data and on Rulison results. Of particular interest is the close-in seismic frequency content which is expected to contain stronger high frequency components. This higher frequency content translates into stronger accelerations within the first tens of km, which in turn affect safety preparations. Additionally, the local geologic structure at nearby population centers must be considered. Pre-event reverse profile refraction surveys are used to delineate the geology at Rifle, Rulison, Grand Valley, and other sites. The geologic parameters are then used as input to seismic amplification models which deliver estimates of local resonant frequencies. Prediction of such resonances allows improved safety assurance against seismic effects hazards. (author)

  5. SEISMIC ATTENUATION FOR RESERVOIR CHARACTERIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Joel Walls; M.T. Taner; Naum Derzhi; Gary Mavko; Jack Dvorkin

    2003-12-01

    We have developed and tested technology for a new type of direct hydrocarbon detection. The method uses inelastic rock properties to greatly enhance the sensitivity of surface seismic methods to the presence of oil and gas saturation. These methods include use of energy absorption, dispersion, and attenuation (Q) along with traditional seismic attributes like velocity, impedance, and AVO. Our approach is to combine three elements: (1) a synthesis of the latest rock physics understanding of how rock inelasticity is related to rock type, pore fluid types, and pore microstructure, (2) synthetic seismic modeling that will help identify the relative contributions of scattering and intrinsic inelasticity to apparent Q attributes, and (3) robust algorithms that extract relative wave attenuation attributes from seismic data. This project provides: (1) additional petrophysical insight from acquired data; (2) increased understanding of rock and fluid properties; (3) new techniques to measure reservoir properties that are not currently available; and (4) tools to more accurately describe the reservoir and predict oil location and volumes. These methodologies will improve the industry's ability to predict and quantify oil and gas saturation distribution, and to apply this information through geologic models to enhance reservoir simulation. We have applied for two separate patents relating to work that was completed as part of this project.

  6. Seismic component fragility data base for IPEEE

    International Nuclear Information System (INIS)

    Bandyopadhyay, K.; Hofmayer, C.

    1990-01-01

    Seismic probabilistic risk assessment or a seismic margin study will require a reliable data base of seismic fragility of various equipment classes. Brookhaven National Laboratory (BNL) has selected a group of equipment and generically evaluated the seismic fragility of each equipment class by use of existing test data. This paper briefly discusses the evaluation methodology and the fragility results. The fragility analysis results when used in the Individual Plant Examination for External Events (IPEEE) Program for nuclear power plants are expected to provide insights into seismic vulnerabilities of equipment for earthquakes beyond the design basis. 3 refs., 1 fig., 1 tab

  7. Seismic excitation by space shuttles

    Science.gov (United States)

    Kanamori, H.; Mori, J.; Sturtevant, B.; Anderson, D.L.; Heaton, T.

    1992-01-01

    Shock waves generated by the space shuttles Columbia (August 13, 1989), Atlantis (April 11, 1991) and Discovery (September 18, 1991) on their return to Edwards Air Force Base, California, were recorded by TERRAscope (Caltech's broadband seismic network), the Caltech-U.S.G.S. Southern California Seismic Network (SCSN), and the University of Southern California (USC) Los Angeles Basin Seismic Network. The spatial pattern of the arrival times exhibits hyperbolic shock fronts from which the path, velocity and altitude of the space shuttle could be determined. The shock wave was acoustically coupled to the ground, converted to a seismic wave, and recorded clearly at the broadband TERRAscope stations. The acoustic coupling occurred very differently depending on the conditions of the Earth's surface surrounding the station. For a seismic station located on hard bedrock, the shock wave (N wave) was clearly recorded with little distortion. Aside from the N wave, very little acoustic coupling of the shock wave energy to the ground occurred at these sites. The observed N wave record was used to estimate the overpressure of the shock wave accurately; a pressure change of 0.5 to 2.2 mbars was obtained. For a seismic station located close to the ocean or soft sedimentary basins, a significant amount of shock wave energy was transferred to the ground through acoustic coupling of the shock wave and the oceanic Rayleigh wave. Distinct topography such as a mountain range was found to be effective in coupling the shock wave energy to the ground. Shock wave energy was also coupled to the ground very effectively through large man-made structures such as high rise buildings and offshore oil drilling platforms. For the space shuttle Columbia, in particular, a distinct pulse having a period of about 2 to 3 seconds was observed, 12.5 s before the shock wave, with a broadband seismograph in Pasadena. This pulse was probably excited by the high rise buildings in downtown Los Angeles which were

  8. Statistical Seismology and Induced Seismicity

    Science.gov (United States)

    Tiampo, K. F.; González, P. J.; Kazemian, J.

    2014-12-01

    While seismicity triggered or induced by natural resources production such as mining or water impoundment in large dams has long been recognized, the recent increase in the unconventional production of oil and gas has been linked to a rapid rise in seismicity in many places, including central North America (Ellsworth et al., 2012; Ellsworth, 2013). Worldwide, induced events of M~5 have occurred and, although rare, have resulted in both damage and public concern (Horton, 2012; Keranen et al., 2013). In addition, over the past twenty years, the increase in both number and coverage of seismic stations has resulted in an unprecedented ability to precisely record the magnitude and location of large numbers of small magnitude events. The increase in the number and type of seismic sequences available for detailed study has revealed differences in their statistics that were previously difficult to quantify. For example, seismic swarms that produce significant numbers of foreshocks as well as aftershocks have been observed in different tectonic settings, including California, Iceland, and the East Pacific Rise (McGuire et al., 2005; Shearer, 2012; Kazemian et al., 2014). Similarly, smaller events have been observed prior to larger induced events in several cases of energy production. The field of statistical seismology has long focused on the question of triggering and the mechanisms responsible (Stein et al., 1992; Hill et al., 1993; Steacy et al., 2005; Parsons, 2005; Main et al., 2006). For example, in most cases the associated stress perturbations are much smaller than the earthquake stress drop, suggesting an inherent sensitivity to relatively small stress changes (Nalbant et al., 2005). Induced seismicity provides the opportunity to investigate triggering and, in particular, the differences between long- and short-range triggering. 
Here we investigate the statistics of induced seismicity sequences from around the world, including central North America and Spain, and

  9. Combined GPS and seismic monitoring of a 12-story structure in a region of induced seismicity in Oklahoma

    Science.gov (United States)

    Haase, J. S.; Soliman, M.; Kim, H.; Jaiswal, P.; Saunders, J. K.; Vernon, F.; Zhang, W.

    2017-12-01

    This work focuses on quantifying ground motions and their effects in Oklahoma near the location of the 2016 Mw 5.8 Pawnee earthquake, where seismicity has been increasing due to wastewater injection related to oil and natural gas production. Much of the building inventory in Oklahoma was constructed before the increase in seismicity and before the implementation of earthquake design and detailing provisions for reinforced concrete (RC) structures. We will use combined GPS/seismic monitoring techniques to measure ground motion in the field and the response of structures to this ground motion. Several Oklahoma State University buildings experienced damage due to the Pawnee earthquake. The USGS Shake Map product estimated peak ground acceleration (PGA) ranging from 0.12g to 0.15g at campus locations. We are deploying a high-rate GPS sensor and accelerometer on the roof and another accelerometer at ground level of a 12-story RC structure and at selected field sites in order to collect ambient noise data and nearby seismicity. The longer period recording characteristics of the GPS/seismic system are particularly well adapted to monitoring these large structures in the event of a significant earthquake. Gross characteristics of the structural system are described, which consists of RC columns and RC slabs in all stories. We conducted a preliminary structural analysis including modal analysis and response spectrum analysis based on a finite element (FE) simulation, which indicated that the period associated with the first X-axis bending, first torsional, and first Y-axis bending modes are 2.2 s, 2.1 s, and 1.8 s, respectively. Next, a preliminary analysis was conducted to estimate the range of expected deformation at the roof level for various earthquake excitations. The earthquake analysis shows a maximum roof displacement of 5 and 7 cm in the horizontal directions resulting from earthquake loads with PGA of 0.2g, well above the noise level of the combined GPS/seismic

  10. Bandwidth selection in smoothing functions | Kibua | East African ...

    African Journals Online (AJOL)

    ... inexpensive and, hence, worth adopting. We argue that the bandwidth parameter is determined by two factors: the kernel function and the length of the smoothing region. We give an illustrative example of its application using real data. Keywords: Kernel, Smoothing functions, Bandwidth > East African Journal of Statistics ...
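
The record's point that the bandwidth, together with the kernel, governs the quality of a smooth can be illustrated with a minimal Nadaraya-Watson kernel smoother. This is a generic sketch, not code from the paper; the function name, test signal, and bandwidth values are all invented for illustration:

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, bandwidth):
    """Gaussian-kernel regression estimate at the points x_eval."""
    # Pairwise scaled distances: rows = eval points, cols = training points
    u = (x_eval[:, None] - x_train[None, :]) / bandwidth
    w = np.exp(-0.5 * u**2)                # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)   # weighted average per eval point

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))
y = np.sin(x) + rng.normal(0, 0.3, x.size)
grid = np.linspace(0.5, 2 * np.pi - 0.5, 50)

# Too small a bandwidth chases the noise; too large flattens the signal.
for h in (0.05, 0.3, 1.5):
    fit = nadaraya_watson(x, y, grid, h)
    rmse = np.sqrt(np.mean((fit - np.sin(grid))**2))
    print(f"h={h:4.2f}  RMSE vs true sin(x): {rmse:.3f}")
```

The smoothing-region length matters because the effective number of points under the kernel scales with the bandwidth relative to the data span.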

  11. Three-phase electric drive with modified electronic smoothing inductor

    DEFF Research Database (Denmark)

    Singh, Yash Veer; Rasmussen, Peter Omand; Andersen, Torben Ole

    2010-01-01

    This paper presents a three-phase electric drive with a modified electronic smoothing inductor (MESI) having reduced size of passive components. The classical electronic smoothing inductor (ESI) is able to control a diode bridge output current and also reduce not only mains current harmonics...

  12. Smooth Maps of a Foliated Manifold in a Symplectic Manifold

    Indian Academy of Sciences (India)

    Let M be a smooth manifold with a regular foliation F and ω a 2-form which induces closed forms on the leaves of F in the leaf topology. A smooth map f : (M, F) ⟶ (N, ω) in a symplectic manifold (N, ω) is called a foliated symplectic immersion if f restricts to an immersion on each leaf of the foliation and further, the ...

  13. Classification of smooth structures on a homotopy complex ...

    Indian Academy of Sciences (India)

    Abstract. We classify, up to diffeomorphism, all closed smooth manifolds homeomorphic to the complex projective n-space CPn, where n = 3 and 4. Let M2n be a closed smooth 2n-manifold homotopy equivalent to CPn. We show that, up to diffeomorphism, M6 has a unique differentiable structure and M8 has at most two ...

  14. Classification of smooth structures on a homotopy complex ...

    Indian Academy of Sciences (India)

    We classify, up to diffeomorphism, all closed smooth manifolds homeomorphic to the complex projective n-space CPn, where n = 3 and 4. Let M2n be a closed smooth 2n-manifold homotopy equivalent to CPn. We show that, up to diffeomorphism, M6 has a unique differentiable structure and M8 has at most two ...

  15. Some asymptotic theory for variance function smoothing | Kibua ...

    African Journals Online (AJOL)

    Simple selection of the smoothing parameter is suggested. Both homoscedastic and heteroscedastic regression models are considered. Keywords: Asymptotic, Smoothing, Kernel, Bandwidth, Bias, Variance, Mean squared error, Homoscedastic, Heteroscedastic. > East African Journal of Statistics Vol. 1 (1) 2005: pp. 9-22 ...

  16. On smoothed analysis of quicksort and Hoare's find

    NARCIS (Netherlands)

    Fouz, Mahmoud; Kufleitner, Manfred; Manthey, Bodo; Zeini Jahromi, Nima; Ngo, H.Q.

    2009-01-01

    We provide a smoothed analysis of Hoare’s find algorithm and we revisit the smoothed analysis of quicksort. Hoare’s find algorithm – often called quickselect – is an easy-to-implement algorithm for finding the $k$-th smallest element of a sequence. While the worst-case number of comparisons that

  17. Investigation of angular and axial smoothing of PET data

    International Nuclear Information System (INIS)

    Daube-Witherspoon, M.E.; Carson, R.E.

    1996-01-01

    Radial filtering of emission and transmission data is routinely performed in PET during reconstruction in order to reduce image noise. Angular smoothing is not typically done, due to the introduction of a non-uniform resolution loss; axial filtering is also not usually performed on data acquired in 2D mode. The goal of this paper was to assess the effects of angular and axial smoothing on noise and resolution. Angular and axial smoothing was incorporated into the reconstruction process on the Scanditronix PC2048-15B brain PET scanner. In-plane spatial resolution and noise reduction were measured for different amounts of radial and angular smoothing. For radial positions away from the center of the scanner, noise reduction and degraded tangential resolution with no loss of radial resolution were seen. Near the center, no resolution loss was observed, but there was also no reduction in noise for angular filters up to a 7 degrees FWHM. These results can be understood by considering the combined effects of smoothing projections across rows (angles) and then summing (backprojecting). Thus, angular smoothing is not optimal due to its anisotropic noise reduction and resolution degradation properties. However, uniform noise reduction comparable to that seen with radial filtering can be achieved with axial smoothing of transmission data. The axial results suggest that combined radial and axial transmission smoothing could lead to improved noise characteristics with more isotropic resolution degradation
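
The trade-off described above — radial filtering reduces projection noise uniformly, while angular smoothing across projection rows blurs tangentially after backprojection — can be sketched on a synthetic sinogram. The array sizes and filter width below are arbitrary assumptions, not the scanner's actual parameters:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

# Synthetic sinogram: n_angles projection rows x n_bins radial samples.
rng = np.random.default_rng(1)
n_angles, n_bins = 180, 128
sinogram = rng.normal(0.0, 1.0, (n_angles, n_bins))  # pure noise, for variance bookkeeping

radial = gaussian_filter1d(sinogram, sigma=2.0, axis=1)   # routine radial filter
angular = gaussian_filter1d(sinogram, sigma=2.0, axis=0)  # smoothing across angles

# Both reduce noise variance in the projection domain ...
print(radial.std(), angular.std())
# ... but after backprojection the angular filter mixes rays that diverge
# with distance from the scanner center, so the tangential resolution loss
# grows with radius while the noise reduction near the center vanishes.
```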

  18. A Note on the Definition of a Smooth Curve

    Science.gov (United States)

    Euler, Russell; Sadek, Jawad

    2005-01-01

    In many elementary calculus textbooks in use today, the definition of a "smooth curve" is slightly ambiguous from the students' perspective. Even when smoothness is defined carefully, there is a shortage of relevant exercises that would serve to elaborate on related subtle points which many students may find confusing. In this article, the authors…

  19. Smooth surfaces from bilinear patches: Discrete affine minimal surfaces

    KAUST Repository

    Käferböck, Florian

    2013-06-01

    Motivated by applications in freeform architecture, we study surfaces which are composed of smoothly joined bilinear patches. These surfaces turn out to be discrete versions of negatively curved affine minimal surfaces and share many properties with their classical smooth counterparts. We present computational design approaches and study special cases which should be interesting for the architectural application.

  20. Dynamics of wetting on smooth and rough surfaces.

    NARCIS (Netherlands)

    Cazabat, A.M.; Cohen Stuart, M.A.

    1987-01-01

    The rate of spreading of non-volatile liquids on smooth and on rough surfaces was investigated. The radius of the wetted spot was found to agree with recently proposed scaling laws (t^(1/10) for capillarity-driven and t^(1/8) for gravity-driven spreading) when the surface was smooth. However, the

  1. Neurophysiology and Neuroanatomy of Smooth Pursuit: Lesion Studies

    Science.gov (United States)

    Sharpe, James A.

    2008-01-01

    Smooth pursuit impairment is recognized clinically by the presence of saccadic tracking of a small object and quantified by reduction in pursuit gain, the ratio of smooth eye movement velocity to the velocity of a foveal target. Correlation of the site of brain lesions, identified by imaging or neuropathological examination, with defective smooth…

  2. Mechanisms of mechanical strain memory in airway smooth muscle.

    Science.gov (United States)

    Kim, Hak Rim; Hai, Chi-Ming

    2005-10-01

    We evaluated the hypothesis that mechanical deformation of airway smooth muscle induces structural remodeling of airway smooth muscle cells, thereby modulating mechanical performance in subsequent contractions. This hypothesis implied that past experience of mechanical deformation was retained (or "memorized") as structural changes in airway smooth muscle cells, which modulated the cell's subsequent contractile responses. We termed this phenomenon mechanical strain memory. Preshortening has been found to induce attenuation of both force and isotonic shortening velocity in cholinergic receptor-activated airway smooth muscle. Rapid stretching of cholinergic receptor-activated airway smooth muscle from an initial length to a final length resulted in post-stretch force and myosin light chain phosphorylation that correlated significantly with initial length. Thus post-stretch muscle strips appeared to retain memory of the initial length prior to rapid stretch (mechanical strain memory). Cytoskeletal recruitment of actin- and integrin-binding proteins and Erk 1/2 MAPK appeared to be important mechanisms of mechanical strain memory. Sinusoidal length oscillation led to force attenuation during oscillation and in subsequent contractions in intact airway smooth muscle, and p38 MAPK appeared to be an important mechanism. In contrast, application of local mechanical strain to cultured airway smooth muscle cells induced local actin polymerization and cytoskeletal stiffening. It is conceivable that deep inspiration-induced bronchoprotection may be a manifestation of mechanical strain memory such that mechanical deformation from past breathing cycles modulated the mechanical performance of airway smooth muscle in subsequent cycles in a continuous and dynamic manner.

  3. Enhanced seismic criteria for piping

    International Nuclear Information System (INIS)

    Touboul, F. . E-mail francoise.touboul@cea.fr; Blay, N.; Sollogoub, P.; Chapuliot, S.

    2006-01-01

    In situ and laboratory experiments have shown that piping systems exhibit satisfactory seismic behavior. Seismic motion is not severe enough to significantly damage piping systems unless large differential motions of anchorage are imposed. Nevertheless, present design criteria for piping are very severe and require a large number of supports, which creates overly rigid piping systems. CEA, in collaboration with EDF, FRAMATOME and IRSN, has launched a large R and D program on enhanced design methods which will be less severe, but still conservative, and compatible with defect justification during operation. This paper presents the background of the R and D work on this matter and the equations proposed by CEA. Our approach is based on the difference between the real behavior (or the best-estimate computed one) and the one assumed by codified methods. Codified criteria are applied to an elastically calculated behavior that can be significantly different from the real one: the effect of plasticity may be very significant, even with low incursion into the plastic domain. Moreover, and particularly in piping systems, the elastic follow-up effect affects stress distribution for both seismic and thermal loads. For seismic loads, we have proposed to modify the elastic moment limitation, based on the interpretation of experimental results on piping systems. The methods have been validated on more industrial cases, and some of the consequences of the changes have been studied: modification of the drawings and of the number of supports, global displacements, forces in the supports, stability of potential defects, etc. The basic aim of the studies undertaken is to reach a decision on the stress classification problem, one that is not limited to seismically induced stresses, and to propose simplified methods for its solution.

  4. Time-dependent seismic tomography

    Science.gov (United States)

    Julian, B.R.; Foulger, G.R.

    2010-01-01

    Of methods for measuring temporal changes in seismic-wave speeds in the Earth, seismic tomography is among those that offer the highest spatial resolution. 3-D tomographic methods are commonly applied in this context by inverting seismic wave arrival time data sets from different epochs independently and assuming that differences in the derived structures represent real temporal variations. This assumption is dangerous because the results of independent inversions would differ even if the structure in the Earth did not change, due to observational errors and differences in the seismic ray distributions. The latter effect may be especially severe when data sets include earthquake swarms or aftershock sequences, and may produce the appearance of correlation between structural changes and seismicity when the wave speeds are actually temporally invariant. A better approach, which makes it possible to assess what changes are truly required by the data, is to invert multiple data sets simultaneously, minimizing the difference between models for different epochs as well as the rms arrival-time residuals. This problem leads, in the case of two epochs, to a system of normal equations whose order is twice as great as for a single epoch. The direct solution of this system would require twice as much memory and four times as much computational effort as would independent inversions. We present an algorithm, tomo4d, that takes advantage of the structure and sparseness of the system to obtain the solution with essentially no more effort than independent inversions require.
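
The simultaneous two-epoch inversion described above can be sketched as a damped least-squares problem that couples the two models through a penalty λ on their difference. This toy version is illustrative only — the names, dimensions, and dense solver below are assumptions, not the actual tomo4d algorithm, which exploits the system's structure and sparseness:

```python
import numpy as np

def joint_invert(A1, d1, A2, d2, lam):
    """Invert both epochs at once, penalizing the model difference m1 - m2.

    Minimizes ||A1 m1 - d1||^2 + ||A2 m2 - d2||^2 + lam^2 ||m1 - m2||^2.
    """
    n = A1.shape[1]
    Z1 = np.zeros((A1.shape[0], n))
    Z2 = np.zeros((A2.shape[0], n))
    I = np.eye(n)
    # Stacked system: epoch-1 rays, epoch-2 rays, and the coupling rows.
    G = np.block([[A1, Z1], [Z2, A2], [lam * I, -lam * I]])
    rhs = np.concatenate([d1, d2, np.zeros(n)])
    m, *_ = np.linalg.lstsq(G, rhs, rcond=None)
    return m[:n], m[n:]

# Toy example: two epochs with different ray coverage, same true structure.
rng = np.random.default_rng(2)
A1 = rng.normal(size=(30, 5))
A2 = rng.normal(size=(40, 5))
m_true = np.array([1.0, -0.5, 0.2, 0.0, 0.8])
d1 = A1 @ m_true + rng.normal(0, 0.05, 30)
d2 = A2 @ m_true + rng.normal(0, 0.05, 40)

m1, m2 = joint_invert(A1, d1, A2, d2, lam=10.0)
print(np.abs(m1 - m2).max())  # strong coupling: only data-required differences remain
```

Setting λ near zero reproduces the independent inversions; increasing it suppresses model differences that the arrival-time data do not actually require.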

  5. [Investigation of fast filter of ECG signals with lifting wavelet and smooth filter].

    Science.gov (United States)

    Li, Xuefei; Mao, Yuxing; He, Wei; Yang, Fan; Zhou, Liang

    2008-02-01

    The lifting wavelet transform is used to decompose the original ECG signals into approximation signals with low frequency and detail signals with high frequency. Part of the detail signals is discarded according to its frequency characteristics. To avoid distortion of the QRS complexes, the approximation signals are filtered by an adaptive smoothing filter with a proper threshold value. The retained approximation signals are then reconstructed through the inverse lifting wavelet transform, and the three primary kinds of noise are limited effectively. In addition, the method is fast, with no time delay between input and output.
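
The pipeline — lifting decomposition, suppression of the high-frequency detail band, smoothing of the approximation, inverse transform — can be sketched with one level of Haar lifting. The paper's actual wavelet, threshold rule, and adaptive filter are not specified here; the moving-average stand-in and all names are assumptions:

```python
import numpy as np

def haar_lift(x):
    """One level of the Haar lifting scheme: split, predict, update."""
    even, odd = x[0::2], x[1::2]
    detail = odd - even              # predict step: high-frequency detail
    approx = even + 0.5 * detail     # update step: low-frequency approximation
    return approx, detail

def haar_unlift(approx, detail):
    """Inverse lifting: undo update, undo predict, merge."""
    even = approx - 0.5 * detail
    odd = detail + even
    x = np.empty(even.size + odd.size)
    x[0::2], x[1::2] = even, odd
    return x

def denoise(signal, window=5):
    """Drop the detail band, smooth the approximation, reconstruct."""
    approx, detail = haar_lift(signal)
    kernel = np.ones(window) / window                  # moving-average smoother
    smooth = np.convolve(approx, kernel, mode="same")  # stand-in for the adaptive filter
    return haar_unlift(smooth, np.zeros_like(detail))

t = np.linspace(0, 1, 512)
ecg_like = np.sin(2 * np.pi * 3 * t) + 0.2 * np.random.default_rng(3).normal(size=512)
clean = denoise(ecg_like)
print(clean.shape)  # (512,)
```

Because lifting is computed in place with two short steps per sample, the forward and inverse transforms are fast, which is the property the abstract emphasizes.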

  6. Seismic risk assessment of a BWR

    International Nuclear Information System (INIS)

    Wells, J.E.; Bernreuter, D.L.; Chen, J.C.; Lappa, D.A.; Chuang, T.Y.; Murray, R.C.; Johnson, J.J.

    1987-01-01

    The simplified seismic risk methodology developed in the USNRC Seismic Safety Margins Research Program (SSMRP) was demonstrated by its application to the Zion nuclear power plant (PWR). The simplified methodology was developed to reduce the costs associated with a seismic risk analysis while providing adequate results. A detailed model of Zion, including systems analysis models (initiating events, event trees, and fault trees), SSI and structure models, and piping models, was developed and used in assessing the seismic risk of the Zion nuclear power plant (FSAR). The simplified seismic risk methodology was then applied to the LaSalle County Station nuclear power plant, a BWR, to further demonstrate its applicability and, if possible, to provide a basis for comparing the seismic risk of PWRs and BWRs. (orig./HP)

  7. Methodology for seismic PSA of NPPs

    International Nuclear Information System (INIS)

    Jirsa, P.

    1999-09-01

    A general methodology is outlined for seismic PSA (probabilistic safety assessment). The main objectives of seismic PSA include: description of the course of an event; understanding the most probable failure sequences; gaining insight into the overall probability of reactor core damage; identification of the main seismic risk contributors; identification of the range of peak ground accelerations contributing significantly to the plant risk; and comparison of the seismic risk with risks from other events. The results of seismic PSA are typically compared with those of internal PSA and of PSA of other external events. If the results of internal and external PSA are available, sensitivity studies and cost benefit analyses are performed prior to any decision regarding corrective actions. If the seismic PSA involves analysis of the containment, useful information can be gained regarding potential seismic damage of the containment. (P.A.)

  8. Seismic Risk Perception compared with seismic Risk Factors

    Science.gov (United States)

    Crescimbene, Massimo; La Longa, Federica; Pessina, Vera; Pino, Nicola Alessandro; Peruzza, Laura

    2016-04-01

    The communication of natural hazards and their consequences is one of the more relevant ethical issues faced by scientists. In recent years, social studies have provided evidence that risk communication is strongly influenced by the risk perception of people. In order to develop effective information and risk communication strategies, the perception of risks and its influencing factors should be known. A theory that offers an integrative approach to understanding and explaining risk perception is still missing. To explain risk perception, several perspectives and their interactions must be considered: social, psychological and cultural. This paper presents the results of a CATI survey on seismic risk perception in Italy, conducted by INGV researchers with funding from the DPC. We built a questionnaire to assess seismic risk perception, with particular attention to comparing hazard, vulnerability and exposure perception with real data on the same factors. The Seismic Risk Perception Questionnaire (SRP-Q) was designed by the semantic differential method, using opposite terms on a seven-point Likert scale. The questionnaire yields scores for five risk indicators: Hazard, Exposure, Vulnerability, People and Community, and Earthquake Phenomenon. It was administered by telephone interview (C.A.T.I.) to a statistical sample of over 4,000 people at the national level, in January and February 2015. Results show that risk perception seems to be underestimated for all the indicators considered. In particular, scores for the seismic Vulnerability factor are extremely low compared with the housing data provided by the respondents. Other data collected by the questionnaire concern earthquake information level, sources of information, earthquake occurrence with respect to other natural hazards, participation in risk reduction activities, and level of involvement. Research on risk perception aims to aid risk analysis and policy-making by

  9. ON THE DERIVATIVE OF SMOOTH MEANINGFUL FUNCTIONS

    Directory of Open Access Journals (Sweden)

    Sanjo Zlobec

    2011-02-01

    Full Text Available The derivative of a function f in n variables at a point x* is one of the most important tools in mathematical modelling. If this object exists, it is represented by the row n-tuple ∇f(x*) = [∂f/∂xi(x*)], called the gradient of f at x*, abbreviated: "the gradient". The evaluation of ∇f(x*) is usually done in two stages, first by calculating the n partials and then their values at x = x*. In this talk we give an alternative approach. We show that one can characterize the gradient without differentiation! The idea is to fix an arbitrary row n-tuple G and answer the following question: What is a necessary and sufficient condition such that G is the gradient of a given f at a given x*? The answer is given after adjusting the quadratic envelope property introduced in [3]. We work with smooth, i.e., continuously differentiable, functions with a Lipschitz derivative on a compact convex set with a non-empty interior. Working with this class of functions is not a serious restriction. In fact, loosely speaking, "almost all" smooth meaningful functions used in modelling of real life situations are expected to have a bounded "acceleration", hence they belong to this class. In particular, the class contains all twice differentiable functions [1]. An important property of the functions from this class is that every f can be represented as the difference of some convex function and a convex quadratic function. This decomposition was used in [3] to characterize the zero derivative points. There we obtained reformulations and augmentations of some well known classic results on optimality such as Fermat's extreme value theorem (known from high school) and the Lagrange multiplier theorem from calculus [2, 3]. In this talk we extend the results on zero derivative points to characterize the relation G = ∇f(x*), where G is an arbitrary n-tuple. Some special cases: If G = O, we recover the results on zero derivative points. For functions of a single
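
The two facts the abstract leans on — the gradient as a row n-tuple of partials, and the convex-minus-quadratic decomposition — can be written out explicitly. The decomposition shown is a standard consequence of the Lipschitz-derivative assumption, not a formula quoted from the paper:

```latex
% Gradient of f at x^* as a row n-tuple of partial derivatives:
\nabla f(x^*) = \Bigl[\,\tfrac{\partial f}{\partial x_1}(x^*),\;\dots,\;\tfrac{\partial f}{\partial x_n}(x^*)\Bigr]
% If \nabla f is Lipschitz with constant L on a compact convex set, then
% g(x) = f(x) + (L/2)\|x\|^2 is convex there, which gives the decomposition
f(x) \;=\; \underbrace{\bigl(f(x) + \tfrac{L}{2}\|x\|^2\bigr)}_{\text{convex}}
\;-\; \underbrace{\tfrac{L}{2}\|x\|^2}_{\text{convex quadratic}}
```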

  10. Site response assessment using borehole seismic records

    Energy Technology Data Exchange (ETDEWEB)

    Park, Donghee; Chang, Chunjoong; Choi, Weonhack [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    In regions with high seismic activity, such as Japan, the western United States and Taiwan, borehole seismometers installed deep underground are used to monitor seismic activity during seismic wave propagation at various depths, to study stress changes due to earthquakes, and to analyze the connection to fault movements. The Korea Meteorological Administration (KMA) and the Korea Institute of Geology and Mining (KIGAM) have installed and operate borehole seismometers at depths of 70∼100 meters for the precise determination of epicenters. Korea Hydro and Nuclear Power Co., Ltd. (KHNP) has also installed and operates 2 borehole seismic stations near the Weolseong area to observe, at a depth of 140 meters, seismic activity connected to fault activity. KHNP plans to operate, in the second half of 2014, a borehole seismic station at depths of less than 300 and 600 meters in order to study the seismic response characteristics of deep strata. As a basic study for analyzing ground-motion response characteristics at depths of about 300 to 600 meters in connection with the deep geological disposal of spent nuclear fuel, the present study examined the background noise response characteristics of the borehole seismic station operated by KHNP. In order to analyze the depth-dependent behavior of seismic waves at greater depths than are instrumented in Korea, seismic data collected by Japan's KiK-net seismic stations were used, and the seismic wave characteristics were analyzed by event size and depth. To analyze the borehole seismic observation data from the station operated by KHNP, this study characterized the background noise by using a probability density function.
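
The probability-density-function treatment of background noise mentioned above can be sketched as follows: estimate a PSD for many time windows, then histogram the PSD values at each frequency. This is a generic sketch in the spirit of PSD-PDF noise studies, not KHNP's actual processing; the sampling rate, window lengths, and dB range are invented for illustration:

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(4)
fs = 100.0                               # sampling rate in Hz (assumed)
day = rng.normal(size=int(fs) * 3600)    # one hour of synthetic noise

# Split the record into 100-s windows and estimate a PSD for each.
win = int(100 * fs)
psds = []
for start in range(0, day.size - win + 1, win):
    f, pxx = welch(day[start:start + win], fs=fs, nperseg=4096)
    psds.append(10 * np.log10(pxx))      # PSD in dB
psds = np.array(psds)

# Probability density of PSD values at each frequency: histogram per column.
edges = np.arange(-60, 0, 1.0)           # 1-dB bins over an assumed range
pdf = np.stack([np.histogram(psds[:, i], bins=edges, density=True)[0]
                for i in range(psds.shape[1])], axis=1)
print(pdf.shape)   # (n_db_bins, n_frequencies)
```

The resulting surface shows, for every frequency, how the station's noise level is distributed over time, which is what makes the method robust to transients and gaps.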

  11. Site response assessment using borehole seismic records

    International Nuclear Information System (INIS)

    Park, Donghee; Chang, Chunjoong; Choi, Weonhack

    2014-01-01

    In regions with high seismic activity, such as Japan, the Western United States and Taiwan, borehole seismometers installed deep underground are used to monitor seismic activity during the course of seismic wave propagation at various depths, to study the stress changes due to earthquakes, and to analyze the connection to fault movements. The Korea Meteorological Administration (KMA) and the Korea Institute of Geoscience and Mineral Resources (KIGAM) have installed and are operating borehole seismometers at depths of 70∼100 meters for the precise determination of epicenters. Korea Hydro and Nuclear Power Co., Ltd. (KHNP) has also installed and is operating two borehole seismic stations near the Weolseong area to observe, at a depth of 140 meters, seismic activity connected to fault movement. In the second half of 2014, KHNP plans to operate a borehole seismic station for depths of less than 300 and 600 meters in order to study the seismic response characteristics of deep strata. As a basic study for analyzing ground-motion response characteristics at depths of about 300 to 600 meters, in connection with the deep geological disposal of spent nuclear fuel, the present study examined the background noise response characteristics of the borehole seismic station operated by KHNP. To analyze the depth-dependent behavior of seismic waves at greater depths than are instrumented in Korea, seismic data collected by Japan's KiK-net seismic stations were used, and the seismic wave characteristics were analyzed by magnitude and depth. The background noise characteristics of the KHNP borehole station were analyzed using a probability density function.

  12. The utility of petroleum seismic exploration data in delineating structural features within salt anticlines

    Science.gov (United States)

    Stockton, S.L.; Balch, Alfred H.

    1978-01-01

    The Salt Valley anticline, in the Paradox Basin of southeastern Utah, is under investigation for use as a location for storage of solid nuclear waste. Delineation of thin, nonsalt interbeds within the upper reaches of the salt body is extremely important because the nature and character of any such fluid- or gas-saturated horizons would be critical to the mode of emplacement of wastes into the structure. Analysis of 50 km of conventional seismic-reflection data, in the vicinity of the anticline, indicates that mapping of thin beds at shallow depths may well be possible using a specially designed adaptation of state-of-the-art seismic oil-exploration procedures. Computer ray-trace modeling of thin beds in salt reveals that the frequency and spatial resolution required to map the details of interbeds at shallow depths (less than 750 m) may be on the order of 500 Hz, with surface-spread lengths of less than 350 m. Consideration should be given to the burial of sources and receivers in order to attenuate surface noise and to record the desired high frequencies. Correlation of the seismic-reflection data with available well data and surface geology reveals the complex, structurally initiated diapir, whose upward flow was maintained by rapid contemporaneous deposition of continental clastic sediments on its flanks. Severe collapse faulting near the crests of these structures has distorted the seismic response. Evidence exists, however, that intrasalt thin beds of anhydrite, dolomite, and black shale are mappable on seismic record sections either as short, discontinuous reflected events or as amplitude anomalies that result from focusing of the reflected seismic energy by the thin beds; computer modeling of the folded interbeds confirms both of these as possible causes of seismic response from within the salt diapir. Prediction of the seismic signatures of the interbeds can be made from computer-model studies. Petroleum seismic-reflection data are unsatisfactory for
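The 500 Hz requirement follows from the standard quarter-wavelength (tuning) criterion for vertical resolution. A minimal sketch, assuming an illustrative salt interval velocity of about 4500 m/s (a typical textbook value, not a figure taken from this study):

```python
def vertical_resolution(velocity_m_s, dominant_freq_hz):
    """Quarter-wavelength (tuning) limit: the thinnest bed resolvable on a
    reflection section is roughly lambda/4 = v / (4 f)."""
    return velocity_m_s / (4.0 * dominant_freq_hz)

# Assumed salt interval velocity (~4500 m/s) is illustrative only.
for f in (50.0, 100.0, 500.0):
    print(f"{f:5.0f} Hz -> {vertical_resolution(4500.0, f):6.2f} m")
```

At conventional exploration frequencies (tens of Hz) the resolvable thickness is tens of meters, which is why meter-scale interbeds in salt call for recording on the order of 500 Hz.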

  13. Research on high level radioactive waste repository seismic design criteria

    International Nuclear Information System (INIS)

    Jing Xu

    2012-01-01

    This paper reviews the seismic hazard analysis principles and methods used in the site suitability assessment of the Yucca Mountain Project, together with the seismic design criteria and seismic design basis used in the primary design process. The spatial character of the seismic hazard is demonstrated with a calculated regional seismic hazard map, and seismic design bases at different levels are contrasted to show their differences and relations. Seismic design criteria for the preclosure phase of a high-level waste repository, and the performance goal under beyond-design-basis ground motion, are discussed. (author)

  14. ADAPT Dataset

    Data.gov (United States)

    National Aeronautics and Space Administration — Advanced Diagnostics and Prognostics Testbed (ADAPT) Project Lead: Scott Poll Subject Fault diagnosis in electrical power systems Description The Advanced...

  15. Identifying Reflectors in Seismic Images via Statistic and Syntactic Methods

    Directory of Open Access Journals (Sweden)

    Carlos A. Perez

    2010-04-01

    Full Text Available In geologic interpretation of seismic reflection data, accurate identification of reflectors is the foremost step to ensure proper subsurface structural definition. Reflector information, along with other data sets, is a key factor to predict the presence of hydrocarbons. In this work, mathematical and pattern-recognition theory were adapted to design two statistical and two syntactic algorithms that constitute a tool for semiautomatic reflector identification. The interpretive power of these four schemes was evaluated in terms of prediction accuracy and computational speed. Among these, the semblance method was confirmed to render the greatest accuracy and speed. Syntactic methods offer an interesting alternative due to their inherently structural search method.
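Of the four schemes, only the semblance measure is standard enough to sketch here. Semblance compares the energy of the stacked trace to the total energy across traces in a window, equaling 1 for perfectly coherent reflectors; the wavelet and noise level below are illustrative, not the paper's test data.

```python
import numpy as np

def semblance(window):
    """Semblance coherence of an (n_traces, n_samples) window:
    stacked energy over total energy; 1.0 for identical traces."""
    n_traces = window.shape[0]
    stacked = window.sum(axis=0)
    num = (stacked ** 2).sum()
    den = n_traces * (window ** 2).sum()
    return num / den

t = np.linspace(0, 1, 200)
wavelet = np.sin(2 * np.pi * 25 * t) * np.exp(-((t - 0.5) ** 2) / 0.01)
coherent = np.tile(wavelet, (8, 1))   # eight identical traces: a perfect reflector
noisy = coherent + 0.5 * np.random.default_rng(1).standard_normal(coherent.shape)
print(semblance(coherent), semblance(noisy))
```

In a semiautomatic picker, windows whose semblance exceeds a threshold would be flagged as candidate reflectors.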

  16. High-resolution seismic wave propagation using local time stepping

    KAUST Repository

    Peter, Daniel

    2017-03-13

    High-resolution seismic wave simulations often require local refinements in numerical meshes to accurately capture, e.g., steep topography or complex fault geometry. Together with explicit time schemes, this dramatically reduces the global time step size for ground-motion simulations due to numerical stability conditions. To alleviate this problem, local time stepping (LTS) algorithms allow an explicit time stepping scheme to adapt the time step to the element size, allowing near-optimal time steps everywhere in the mesh. This can potentially lead to significantly faster simulation runtimes.
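The idea of adapting the step to the element size can be sketched as follows: compute each element's stable (CFL) step, then group elements into power-of-two levels relative to the coarsest step, so that level-k elements advance with dt_max / 2^k. The grouping rule and constants below are illustrative assumptions, not the specific LTS scheme of this work.

```python
import numpy as np

def lts_levels(element_sizes, wave_speed, cfl=0.5):
    """Assign each element to a power-of-two local-time-stepping level:
    the smallest k such that dt_max / 2**k satisfies that element's CFL limit."""
    dt_elem = cfl * np.asarray(element_sizes) / wave_speed   # stable step per element
    dt_max = dt_elem.max()
    levels = np.ceil(np.log2(dt_max / dt_elem)).astype(int)
    return levels, dt_max

sizes = np.array([100.0, 90.0, 30.0, 12.0])   # meters; small elements near a refinement
levels, dt_max = lts_levels(sizes, wave_speed=3000.0)
print(levels)   # larger level -> smaller local step
```

With a single global step, every element would be forced to the smallest dt; here only the refined elements take extra substeps.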

  17. Promoting seismic retrofit implementation through "nudge": using warranty as a driver.

    Science.gov (United States)

    Fujimi, Toshio; Tatano, Hirokazu

    2013-10-01

    This article proposes a new type of warranty policy that applies the "nudge" concept developed by Thaler and Sunstein to encourage homeowners in Japan to implement seismic retrofitting. Homeowner adaptation to natural disasters through loss reduction measures is known to be inadequate. To encourage proactive risk management, the "nudge" approach capitalizes on how choice architecture can influence human decision-making tendencies. For example, people tend to place more value on a warranty for consumer goods than on actuarial value. This article proposes a "warranty for seismic retrofitting" as a "nudge" policy that gives homeowners the incentive to adopt loss reduction measures. Under such a contract, the government guarantees all repair costs in the event of earthquake damage to the house if the homeowner implements seismic retrofitting. To estimate the degree to which a warranty will increase the perceived value of seismic retrofitting, we use field survey data from 1,200 homeowners. Our results show that a warranty increases the perceived value of seismic retrofitting by an average of 33%, and an approximate cost-benefit analysis indicates that such a warranty can be more economically efficient than an ex ante subsidy. Furthermore, we address the failure of the standard expected utility model to explain homeowners' decisions based on warranty evaluation, and explore the significant influence of ambiguity aversion on the efficacy of seismic retrofitting and nonanalytical factors such as feelings or trust. © 2013 Society for Risk Analysis.

  18. StreamMap: Smooth Dynamic Visualization of High-Density Streaming Points.

    Science.gov (United States)

    Li, Chenhui; Baciu, George; Han, Yu

    2018-03-01

    Interactive visualization of streaming points for real-time scatterplots and linear blending of correlation patterns is increasingly becoming the dominant mode of visual analytics for both big data and streaming data from active sensors and broadcasting media. To better visualize and interact with inter-stream patterns, it is generally necessary to smooth out gaps or distortions in the streaming data. Previous approaches either animate the points directly or present a sampled static heat-map. We propose a new approach, called StreamMap, to smoothly blend high-density streaming points and create a visual flow that emphasizes the density pattern distributions. In essence, we present three new contributions for the visualization of high-density streaming points. The first contribution is a density-based method called super kernel density estimation that aggregates streaming points using an adaptive kernel to solve the overlapping problem. The second contribution is a robust density morphing algorithm that generates several smooth intermediate frames for a given pair of frames. The third contribution is a trend representation design that can help convey the flow directions of the streaming points. The experimental results on three datasets demonstrate the effectiveness of StreamMap when dynamic visualization and visual analysis of trend patterns on streaming points are required.
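The first contribution, aggregating points with an adaptive kernel, is in the spirit of classical variable-bandwidth kernel density estimation. The 1-D sketch below uses the Abramson square-root law as a stand-in: a fixed-bandwidth pilot density sets a local bandwidth per point, narrow in dense regions and wide in sparse ones. It is not the paper's "super kernel density estimation"; the bandwidths and the exponent alpha are assumptions.

```python
import numpy as np

def adaptive_kde(points, grid, pilot_bw=0.3, alpha=0.5):
    """Variable-bandwidth (Abramson-style) kernel density estimate."""
    def gauss(x, mu, h):
        return np.exp(-0.5 * ((x - mu) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    # Pilot estimate evaluated at the data points themselves.
    pilot = np.array([gauss(p, points, pilot_bw).mean() for p in points])
    g = np.exp(np.log(pilot).mean())              # geometric mean of pilot values
    local_bw = pilot_bw * (pilot / g) ** (-alpha)  # dense -> narrow, sparse -> wide
    dens = np.zeros_like(grid)
    for p, h in zip(points, local_bw):
        dens += gauss(grid, p, h)
    return dens / len(points)

rng = np.random.default_rng(2)
pts = np.concatenate([rng.normal(0, 0.1, 200), rng.normal(3, 1.0, 50)])
grid = np.linspace(-2, 7, 600)
dens = adaptive_kde(pts, grid)
```

The adaptive bandwidth keeps the tight cluster sharp while still smoothing the sparse one, which is the overlap-resolution behavior the paper targets.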

  19. Input for seismic hazard assessment using Vrancea seismic source region

    International Nuclear Information System (INIS)

    Ivan, Iren-Adelina; Enescu, B.D.; Pantea, A.

    1998-01-01

    We use an extended and combined database including historical and modern, qualitative and quantitative data, i.e., more than 25 events during the period 1790-1990 with epicentral/maximum intensities ranging from X to V (MSK scale), and isoseismal curves ranging from degree IX to degree III. The data set was analysed using both the phasor-sum technique of Rydelek and Sacks (1984), for different magnitude and depth intervals, and Stepp's method. For the assessment of seismic hazard we need a pattern of seismic source regions, including an estimate of the maximum expected magnitude and the return period for the studied regions. Another necessary step in seismic hazard assessment is to develop attenuation relationships specific to a seismogenic zone, particularly to the sub-crustal earthquakes of the Vrancea region. The conceptual frame involves the use of appropriate decay models and consideration of the randomness in the attenuation, taking into account the azimuthal variation of the isoseismal shapes. (authors)

  20. A comparison of long-term changes in seismicity at The Geysers, Salton Sea, and Coso geothermal fields

    Science.gov (United States)

    Trugman, Daniel T.; Shearer, Peter M.; Borsa, Adrian A.; Fialko, Yuri

    2016-01-01

    Geothermal energy is an important source of renewable energy, yet its production is known to induce seismicity. Here we analyze seismicity at the three largest geothermal fields in California: The Geysers, Salton Sea, and Coso. We focus on resolving the temporal evolution of seismicity rates, which provides important observational constraints on how geothermal fields respond to natural and anthropogenic loading. We develop an iterative, regularized inversion procedure to partition the observed seismicity rate into two components: (1) the interaction rate due to earthquake-earthquake triggering and (2) the smoothly varying background rate controlled by other time-dependent stresses, including anthropogenic forcing. We apply our methodology to compare long-term changes in seismicity to monthly records of fluid injection and withdrawal. At The Geysers, we find that the background seismicity rate is highly correlated with fluid injection, with the mean rate increasing by approximately 50% and exhibiting strong seasonal fluctuations following construction of the Santa Rosa pipeline in 2003. In contrast, at both Salton Sea and Coso, the background seismicity rate has remained relatively stable since 1990, though both experience short-term rate fluctuations that are not obviously modulated by geothermal plant operation. We also observe significant temporal variations in Gutenberg-Richter b value, earthquake magnitude distribution, and earthquake depth distribution, providing further evidence for the dynamic evolution of stresses within these fields. The differing field-wide responses to fluid injection and withdrawal may reflect differences in in situ reservoir conditions and local tectonics, suggesting that a complex interplay of natural and anthropogenic stressing controls seismicity within California's geothermal fields.
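One of the quantities tracked above, the Gutenberg-Richter b value, is commonly estimated with the Aki/Utsu maximum-likelihood formula. A self-contained sketch on a synthetic catalogue; the completeness magnitude, bin width, and catalogue size are illustrative, not values from this study.

```python
import numpy as np

def b_value_mle(mags, m_c, dm=0.1):
    """Aki/Utsu maximum-likelihood Gutenberg-Richter b-value for magnitudes
    at or above the completeness magnitude m_c, binned at width dm."""
    m = np.asarray(mags)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

# Synthetic Gutenberg-Richter catalogue with true b = 1.0 (illustrative only).
rng = np.random.default_rng(3)
mags = (2.0 - 0.05) + rng.exponential(scale=np.log10(np.e) / 1.0, size=5000)
mags = np.round(mags / 0.1) * 0.1            # bin to 0.1 magnitude units
b_hat = b_value_mle(mags, m_c=2.0)
print(round(b_hat, 2))
```

Temporal variations in b, as reported for these geothermal fields, would be tracked by applying such an estimator in sliding time windows.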

  1. Methodology and main results of seismic source characterization for the PEGASOS Project, Switzerland

    International Nuclear Information System (INIS)

    Coppersmith, K. J.; Youngs, R. R.; Sprecher, Ch.

    2009-01-01

    Under the direction of the National Cooperative for the Disposal of Radioactive Waste (NAGRA), a probabilistic seismic hazard analysis was conducted for the Swiss nuclear power plant sites. The study has become known under the name 'PEGASOS Project'. This is the first of a group of papers in this volume that describes the seismic source characterization methodology and the main results of the project. A formal expert elicitation process was used, including dissemination of a comprehensive database, multiple workshops for identification and discussion of alternative models and interpretations, elicitation interviews, feedback to provide the experts with the implications of their preliminary assessments, and full documentation of the assessments. A number of innovative approaches to the seismic source characterization methodology were developed by four expert groups and implemented in the study. The identification of epistemic uncertainties and treatment using logic trees were important elements of the assessments. Relative to the assessment of the seismotectonic framework, the four expert teams identified similar main seismotectonic elements: the Rhine Graben, the Jura / Molasse regions, Helvetic and crystalline subdivisions of the Alps, and the southern Germany region. In defining seismic sources, the expert teams used a variety of approaches. These range from large regional source zones having spatially-smoothed seismicity to smaller local zones, to account for spatial variations in observed seismicity. All of the teams discussed the issue of identification of feature-specific seismic sources (i.e. individual mapped faults) as well as the potential reactivation of the boundary faults of the Permo-Carboniferous grabens. Other important seismic source definition elements are the specification of earthquake rupture dimensions and the earthquake depth distribution. Maximum earthquake magnitudes were assessed for each seismic source using approaches that consider the

  2. Adaptive and non-adaptive data hiding methods for grayscale images based on modulus function

    Directory of Open Access Journals (Sweden)

    Najme Maleki

    2014-07-01

    Full Text Available This paper presents two data hiding methods for grayscale images based on the modulus function, one adaptive and one non-adaptive. Our adaptive scheme is based on the concept of human visual sensitivity: pixels in edge areas can tolerate much larger changes than those in smooth areas without causing visible distortion to human eyes. In the adaptive scheme, the average difference value of the four neighborhood pixels in a block, compared against a threshold secret key, determines whether the current block is located in an edge or a smooth area. Pixels in edge areas are embedded with Q bits of secret data, with a larger value of Q than for pixels placed in smooth areas. We also present a non-adaptive data hiding algorithm, which produces high visual quality for the stego-image via an error reduction procedure. The proposed schemes present several advantages: (1) the embedding capacity and the visual quality of the stego-image are scalable; in other words, the embedding rate as well as the image quality can be scaled for practical applications; (2) high embedding capacity with minimal visual distortion can be achieved; (3) our methods require little memory space for the secret data embedding and extracting phases; (4) secret keys are used to protect the embedded secret data, so the level of security is high; (5) the problem of overflow or underflow does not occur. Experimental results indicate that the proposed adaptive scheme is significantly superior to currently existing schemes in terms of stego-image visual quality, embedding capacity and level of security, and that our non-adaptive method is better than other non-adaptive methods in terms of stego-image quality. Results also show that our adaptive algorithm can resist the RS steganalysis attack.
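A minimal, non-adaptive sketch of modulus-function embedding conveys the core idea: choose the stego pixel closest to the cover pixel whose remainder modulo m equals the secret digit. This is a generic illustration, not the authors' scheme; the choice m = 8 (3 bits per pixel) and the clipping rule are my assumptions.

```python
import numpy as np

def embed_digit(pixel, digit, m=8):
    """Embed one base-m secret digit into a pixel so that
    (stego_pixel % m) == digit, changing the pixel as little as possible."""
    r = int(pixel) % m
    candidates = [int(pixel) + (digit - r + k * m) for k in (-1, 0, 1)]
    valid = [c for c in candidates if 0 <= c <= 255]   # avoid overflow/underflow
    return min(valid, key=lambda c: abs(c - int(pixel)))

def extract_digit(pixel, m=8):
    return int(pixel) % m

cover = np.array([[12, 250, 3], [128, 0, 255]], dtype=np.uint8)
secret = [5, 1, 7, 0, 6, 2]                 # base-8 digits (3 bits per pixel)
stego = np.array([embed_digit(p, d) for p, d in zip(cover.flat, secret)],
                 dtype=np.uint8).reshape(cover.shape)
recovered = [extract_digit(p) for p in stego.flat]
```

Because an out-of-range candidate is simply replaced by the next-nearest valid one, the overflow/underflow problem mentioned in the abstract never arises.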

  3. Fully probabilistic seismic source inversion – Part 1: Efficient parameterisation

    Directory of Open Access Journals (Sweden)

    S. C. Stähler

    2014-11-01

    Full Text Available Seismic source inversion is a non-linear problem in seismology where not just the earthquake parameters themselves but also estimates of their uncertainties are of great practical importance. Probabilistic source inversion (Bayesian inference) is well suited to this challenge, provided that the parameter space can be chosen small enough to make Bayesian sampling computationally feasible. We propose a framework for PRobabilistic Inference of Seismic source Mechanisms (PRISM) that parameterises and samples earthquake depth, moment tensor, and source time function efficiently by using information from previous non-Bayesian inversions. The source time function is expressed as a weighted sum of a small number of empirical orthogonal functions, which were derived from a catalogue of >1000 source time functions (STFs) by principal component analysis. We use a likelihood model based on the cross-correlation misfit between observed and predicted waveforms. The resulting ensemble of solutions provides full uncertainty and covariance information for the source parameters, and permits propagating these source uncertainties into travel time estimates used for seismic tomography. The computational effort is such that routine, global estimation of earthquake mechanisms and source time functions from teleseismic broadband waveforms is feasible.
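The STF parameterisation can be sketched as follows: stack a catalogue of source time functions, remove the mean, and take the leading right singular vectors as empirical orthogonal functions, so that any STF is described by a handful of weights. The synthetic triangle-pulse "catalogue" below is a stand-in for the >1000 real STFs used by PRISM.

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0, 20, 100)                       # seconds

def pulse(dur, amp):
    """Randomized triangular pulse: a crude stand-in for a real STF."""
    return amp * np.clip(1 - np.abs(t - dur / 2) / (dur / 2), 0, None)

catalogue = np.array([pulse(rng.uniform(4, 16), rng.uniform(0.5, 2))
                      for _ in range(300)])

# Empirical orthogonal functions via SVD of the mean-removed catalogue.
mean_stf = catalogue.mean(axis=0)
U, S, Vt = np.linalg.svd(catalogue - mean_stf, full_matrices=False)
eofs = Vt[:5]                                     # first 5 EOFs

# An STF is then parameterised by 5 weights instead of 100 samples.
stf = catalogue[0]
weights = eofs @ (stf - mean_stf)
reconstruction = mean_stf + weights @ eofs
```

Sampling over a few EOF weights instead of the full waveform is what keeps the Bayesian parameter space small enough for routine inversion.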

  4. Seismic and tsunami safety margin assessment

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-08-15

    Nuclear Regulation Authority is going to establish new seismic and tsunami safety guidelines to increase the safety of NPPs. The main purpose of this research is testing structures/components important to safety and tsunami-resistant structures/components, and evaluating their capacity against earthquake and tsunami. These capacity data will be utilized for the seismic and tsunami back-fit review based on the new seismic and tsunami safety guidelines. The summary of the program in 2012 is as follows. 1. Component seismic capacity test and quantitative seismic capacity evaluation: PWR emergency diesel generator partial-model seismic capacity tests have been conducted and quantitative seismic capacities have been evaluated. 2. Seismic capacity evaluation of switching-station electric equipment: existing seismic test data investigation, specification survey and seismic response analyses have been conducted. 3. Tsunami capacity evaluation of anti-inundation measure facilities: tsunami pressure tests have been conducted using a small breakwater model, and basic characteristics of tsunami pressure against seawall structures have been evaluated. (author)

  5. Enhancement of seismic resistance of buildings

    Directory of Open Access Journals (Sweden)

    Claudiu-Sorin Dragomir

    2014-03-01

    Full Text Available The objectives of the paper are both seismic instrumentation for damage assessment and enhancing of seismic resistance of buildings. In according with seismic design codes in force the buildings are designed to resist at seismic actions. Due to the time evolution of these design provisions, there are buildings that were designed decades ago, under the less stringent provisions. The conceptual conformation is nowadays provided in all Codes of seismic design. According to the Code of seismic design P100-1:2006 the asymmetric structures do not have an appropriate seismic configuration; they have disadvantageous distribution of volumes, mass and stiffness. Using results of temporary seismic instrumentation the safety condition of the building may be assessed in different phases of work. Based on this method, the strengthening solutions may be identified and the need of seismic joints may be emphasised. All the aforementioned ideas are illustrated through a case study. Therefore it will be analysed the dynamic parameter evolution of an educational building obtained in different periods. Also, structural intervention scenarios to enhance seismic resistance will be presented.

  6. Seismic and tsunami safety margin assessment

    International Nuclear Information System (INIS)

    2013-01-01

    Nuclear Regulation Authority is going to establish new seismic and tsunami safety guidelines to increase the safety of NPPs. The main purpose of this research is testing structures/components important to safety and tsunami-resistant structures/components, and evaluating their capacity against earthquake and tsunami. These capacity data will be utilized for the seismic and tsunami back-fit review based on the new seismic and tsunami safety guidelines. The summary of the program in 2012 is as follows. 1. Component seismic capacity test and quantitative seismic capacity evaluation: PWR emergency diesel generator partial-model seismic capacity tests have been conducted and quantitative seismic capacities have been evaluated. 2. Seismic capacity evaluation of switching-station electric equipment: existing seismic test data investigation, specification survey and seismic response analyses have been conducted. 3. Tsunami capacity evaluation of anti-inundation measure facilities: tsunami pressure tests have been conducted using a small breakwater model, and basic characteristics of tsunami pressure against seawall structures have been evaluated. (author)

  7. Seismic Isolation Working Meeting Gap Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Sabharwall, Piyush [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-09-01

    The ultimate goal in nuclear facility and nuclear power plant operations is operating safety during normal operations and maintaining core cooling capabilities during off-normal events, including external hazards. Understanding the impact that external hazards, such as flooding and earthquakes, have on nuclear facilities and NPPs is critical to deciding how to manage these hazards to acceptable levels of risk. From a seismic perspective, the goal is to manage seismic risk. Seismic risk is determined by convolving the seismic hazard with seismic fragilities (the capacity of systems, structures, and components (SSCs)). There are large uncertainties associated with the evolving nature of seismic hazard curves. Additionally, there are requirements within DOE, and potential requirements within NRC, to reconsider updated seismic hazard curves every 10 years. Therefore, opportunity exists for engineered solutions to manage this seismic uncertainty. One engineered solution is seismic isolation. Current seismic isolation (SI) designs (used in commercial industry) reduce horizontal earthquake loads and protect critical infrastructure from the potentially destructive effects of large earthquakes. The benefit of SI application in the nuclear industry is being recognized, and SI systems have been proposed, in the American Society of Civil Engineers (ASCE) 4 standard to be released in 2014, for Light Water Reactor (LWR) facilities using commercially available technology. However, there is a lack of application in the nuclear industry and uncertainty in implementing the procedures outlined in ASCE-4. Opportunity exists to determine the barriers associated with implementation of the current ASCE-4 standard language.

  8. Smooth pursuit eye movements and schizophrenia: literature review.

    Science.gov (United States)

    Franco, J G; de Pablo, J; Gaviria, A M; Sepúlveda, E; Vilella, E

    2014-09-01

    To review the scientific literature about the relationship between impairment of smooth pursuit eye movements and schizophrenia. Narrative review that includes historical articles, reports of basic and clinical investigations, systematic reviews, and meta-analyses on the topic. Up to 80% of schizophrenic patients have impairment of smooth pursuit eye movements. Despite the diversity of test protocols, 65% of patients and controls are correctly classified by their overall performance during this pursuit. Smooth pursuit eye movements depend on the ability to anticipate the target's velocity and on visual feedback, as well as on learning and attention. The neuroanatomy implicated in smooth pursuit overlaps to some extent with certain frontal cortex zones associated with some clinical and neuropsychological characteristics of schizophrenia; therefore some specific components of smooth pursuit anomalies could serve as biomarkers of the disease. Due to their sedative effect, antipsychotics have a deleterious effect on smooth pursuit eye movements, so these movements cannot be used to evaluate the efficacy of currently available treatments. Standardized evaluation of smooth pursuit eye movements in schizophrenia will allow specific aspects of that pursuit to be used as biomarkers for the study of its genetics, psychopathology, or neuropsychology. Copyright © 2013 Sociedad Española de Oftalmología. Published by Elsevier Espana. All rights reserved.

  9. Nodular smooth muscle metaplasia in multiple peritoneal endometriosis.

    Science.gov (United States)

    Kim, Hyun-Soo; Yoon, Gun; Ha, Sang Yun; Song, Sang Yong

    2015-01-01

    We report here an unusual presentation of peritoneal endometriosis with smooth muscle metaplasia as multiple protruding masses on the lateral pelvic wall. Smooth muscle metaplasia is a common finding in rectovaginal endometriosis, whereas in peritoneal endometriosis, smooth muscle metaplasia is uncommon and its nodular presentation on the pelvic wall is even rarer. To the best of our knowledge, this is the first case of nodular smooth muscle metaplasia occurring in peritoneal endometriosis. As observed in this case, when performing laparoscopic surgery in order to excise malignant tumors of intra-abdominal or pelvic organs, it can be difficult for surgeons to distinguish the metastatic tumors from benign nodular pelvic wall lesions, including endometriosis, based on the gross findings only. Therefore, an intraoperative frozen section biopsy of the pelvic wall nodules should be performed to evaluate the peritoneal involvement by malignant tumors. Moreover, this report implies that peritoneal endometriosis, as well as rectovaginal endometriosis, can clinically present as nodular lesions if obvious smooth muscle metaplasia is present. The pathological investigation of smooth muscle cells in peritoneal lesions can contribute not only to the precise diagnosis but also to the structure and function of smooth muscle cells and related cells involved in the histogenesis of peritoneal endometriosis.

  10. The Smoothing Hypothesis, Stock Returns and Risk in Brazil

    Directory of Open Access Journals (Sweden)

    Antonio Lopo Martinez

    2011-01-01

    Full Text Available Income smoothing is defined as the deliberate normalization of income in order to reach a desired trend. If smoothing causes more information to be reflected in the stock price, it is likely to improve the allocation of resources and can be a critical factor in investment decisions. This study aims to build metrics to determine the degree of smoothing in Brazilian public companies, to classify them as smoothing and non-smoothing companies, and additionally to present evidence on the long-term relationship between the smoothing hypothesis and stock return and risk. Using the Economatica and CVM databases, this study focuses on 145 companies in the period 1998-2007. We find that Brazilian smoothers have a smaller degree of systematic risk than non-smoothers: on average, the beta of smoothers is significantly lower than that of non-smoothers. Regarding return, we find that the abnormal annualized returns of smoothers are significantly higher. We confirm the differences between the groups by nonparametric and parametric tests, in cross-section and as time series, indicating that there is a statistically significant difference in performance in the Brazilian market between firms that do and do not engage in smoothing.
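One common way to operationalize such a smoothing metric is Eckel's (1981) index, which compares the variability of income changes to that of sales changes; a ratio below 1 suggests deliberate smoothing. The paper builds its own metrics, so the sketch below is illustrative only, with made-up numbers.

```python
import numpy as np

def eckel_index(income, sales):
    """Eckel (1981) smoothing index: CV of one-period income changes divided
    by CV of sales changes; values below 1 suggest income smoothing."""
    def cv(x):
        d = np.diff(x)
        return np.std(d, ddof=1) / abs(np.mean(d))
    return cv(income) / cv(sales)

sales = np.array([100, 112, 95, 130, 125, 140.0])
smooth_income = np.array([10, 10.5, 10.2, 11, 11.2, 11.5])   # deliberately steady
volatile_income = np.array([10, 14, 6, 16, 12, 18.0])
print(eckel_index(smooth_income, sales), eckel_index(volatile_income, sales))
```

A study like this one would compute such an index per firm over the sample window, then split the sample into smoother and non-smoother groups for the return and beta comparisons.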

  11. Modeling the dispersion effects of contractile fibers in smooth muscles

    Science.gov (United States)

    Murtada, Sae-Il; Kroon, Martin; Holzapfel, Gerhard A.

    2010-12-01

    Micro-structurally based models for smooth muscle contraction are crucial for a better understanding of pathological conditions such as atherosclerosis, incontinence and asthma. It is important that such models consider the underlying mechanical structure and the biochemical activation. Hence, a simple mechanochemical model is proposed that includes the dispersion of the orientation of smooth muscle myofilaments and that is capable of capturing available experimental data on smooth muscle contraction. This allows a refined study of the effects of myofilament dispersion on smooth muscle contraction. A classical biochemical model is used to describe the cross-bridge interactions with the thin filament in smooth muscles, in which calcium-dependent myosin phosphorylation is the only regulatory mechanism. A novel mechanical model considers the dispersion of the contractile fiber orientations in smooth muscle cells by means of a strain-energy function in terms of one dispersion parameter. All model parameters have a biophysical meaning and may be estimated through comparisons with experimental data. The contraction of the middle layer of a carotid artery is studied numerically. Using a tube, the relationships between the internal pressure and the stretches are investigated as functions of the dispersion parameter, which reveals a strong influence of the orientation of smooth muscle myofilaments on the contraction response. It is straightforward to implement this model in a finite element code to analyze more complex boundary-value problems.

  12. Seismic signal simulation and study of underground nuclear sources by moment inversion

    International Nuclear Information System (INIS)

    Crusem, R.

    1986-09-01

    Some problems of underground nuclear explosions are examined from the seismological point of view. In the first part, a model is developed for mean seismic propagation through the lagoon of Mururoa atoll and for the calculation of synthetic seismograms (in the intermediate field: 5 to 20 km) by summation of discrete wave numbers. In the second part, this ground model is used with a linear inversion method of seismic moments for the estimation of elastic source terms equivalent to the nuclear source. Only the isotropic part is investigated; solution stability is increased by using spectral smoothing and a minimal-phase hypothesis. Some examples of applications are presented: total energy estimation of a nuclear explosion, and simulation of mechanical effects induced by an underground explosion.

  13. A semi-supervised method to detect seismic random noise with fuzzy GK clustering

    International Nuclear Information System (INIS)

    Hashemi, Hosein; Javaherian, Abdolrahim; Babuska, Robert

    2008-01-01

We present a new method to detect random noise in seismic data using fuzzy Gustafson–Kessel (GK) clustering. First, using an adaptive distance norm, a matrix is constructed from the observed seismic amplitudes. The next step is to find the centres of ellipsoidal clusters and construct a partition matrix that determines the soft decision boundaries between seismic events and random noise. The GK algorithm updates the cluster centres in order to iteratively minimize the cluster variance. Multiplication of the fuzzy membership function with the values of each sample yields new sections; we name them 'clustered sections'. The seismic amplitude values of the clustered sections are assigned in such a way as to decrease the level of noise in the original noisy seismic input. In pre-stack data, it is essential to study the clustered sections in the f–k domain; finding the quantitative index for weighting the post-stack data needs a similar approach. Using the knowledge of a human specialist together with the fuzzy unsupervised clustering, the method is a semi-supervised random noise detector. The efficiency of this method is investigated on synthetic and real seismic data for both pre- and post-stack data. The results show a significant improvement of the input noisy sections without harming the important amplitude and phase information of the original data. The procedure for finding the final weights of each clustered section should be done carefully in order to keep almost all of the evident seismic amplitudes in the output section. The method interactively uses the knowledge of the seismic specialist in detecting the noise.
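The clustering step can be sketched in a simplified form. The GK algorithm in the abstract adapts a Mahalanobis norm per cluster; the following is only its Euclidean special case (plain fuzzy c-means) on hypothetical 1-D amplitude samples, to illustrate how soft memberships separate low-amplitude noise from events:

```python
def fuzzy_cmeans_1d(data, m=2.0, iters=30):
    """Two-cluster fuzzy c-means on 1-D samples: the Euclidean special case
    of Gustafson-Kessel clustering (GK additionally adapts a Mahalanobis
    norm per cluster). Returns cluster centres and the partition matrix."""
    centres = [min(data), max(data)]                 # deterministic start
    u = [[0.0] * len(data) for _ in range(2)]
    for _ in range(iters):
        # soft memberships from relative distances to the centres
        for j, x in enumerate(data):
            d = [abs(x - v) + 1e-12 for v in centres]
            for i in range(2):
                u[i][j] = 1.0 / sum((d[i] / dk) ** (2.0 / (m - 1.0)) for dk in d)
        # centres as membership-weighted means
        for i in range(2):
            w = [uij ** m for uij in u[i]]
            centres[i] = sum(wj * x for wj, x in zip(w, data)) / sum(w)
    return centres, u

# hypothetical amplitudes: low-level "random noise" vs. stronger "events"
samples = [0.10, 0.12, 0.09, 0.11, 5.00, 5.10, 4.90, 5.05]
centres, u = fuzzy_cmeans_1d(samples)
```

The partition matrix `u` plays the role of the fuzzy membership function that is multiplied with the sample values to form the clustered sections.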

  14. Ambiguous Adaptation

    DEFF Research Database (Denmark)

    Møller Larsen, Marcus; Lyngsie, Jacob

    2017-01-01

    We investigate the connection between contract duration, relational mechanisms, and premature relationship termination. Based on an analysis of a large sample of exchange relationships in the global service-provider industry, we argue that investments in either longer contract duration or more in...... ambiguous reference points for adaption and thus increase the likelihood of premature termination by restricting the parties' set of adaptive actions....

  15. Climate adaptation

    Science.gov (United States)

    Kinzig, Ann P.

    2015-03-01

This paper is intended as a brief introduction to climate adaptation in a conference devoted otherwise to the physics of sustainable energy. Whereas mitigation involves measures to reduce the probability of a potential event, such as climate change, adaptation refers to actions that lessen the impact of climate change. Mitigation and adaptation differ in other ways as well. Adaptation does not necessarily have to be implemented immediately to be effective; it only needs to be in place before the threat arrives. Also, adaptation does not necessarily require global, coordinated action; many effective adaptation actions can be local. Some urban communities, because of land-use change and the urban heat-island effect, currently face changes similar to some expected under climate change, such as changes in water availability, heat-related morbidity, or changes in disease patterns. Concern over those impacts might motivate the implementation of measures that would also help in climate adaptation, despite skepticism among some policy makers about anthropogenic global warming. Studies of ancient civilizations in the southwestern US lend some insight into factors that may or may not be important to successful adaptation.

  16. Nonlinear time series modeling and forecasting the seismic data of the Hindu Kush region

    Science.gov (United States)

    Khan, Muhammad Yousaf; Mittnik, Stefan

    2018-01-01

In this study, we extended the application of linear and nonlinear time series models in the field of earthquake seismology and examined the out-of-sample forecast accuracy of linear Autoregressive (AR), Autoregressive Conditional Duration (ACD), Self-Exciting Threshold Autoregressive (SETAR), Threshold Autoregressive (TAR), Logistic Smooth Transition Autoregressive (LSTAR), Additive Autoregressive (AAR), and Artificial Neural Network (ANN) models for seismic data of the Hindu Kush region. We also extended previous studies by using Vector Autoregressive (VAR) and Threshold Vector Autoregressive (TVAR) models and compared their forecasting accuracy with the linear AR model. Unlike previous studies, which typically specify threshold models using an internal threshold variable, we specified these models with external transition variables and compared their out-of-sample forecasting performance with the linear benchmark AR model. The modeling results show that the time series models used in the present study are capable of capturing the dynamic structure present in the seismic data. The point forecast results indicate that the AR model generally outperforms the nonlinear models. However, in some cases, threshold models with an external threshold variable specification produce more accurate forecasts, indicating that the specification of threshold time series models is of crucial importance. For raw seismic data, the ACD model does not show an improved out-of-sample forecasting performance over the linear AR model. The results indicate that the AR model is the best device to model and forecast the raw seismic data of the Hindu Kush region.
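The benchmark AR model can be illustrated with a minimal sketch: an ordinary least-squares AR(1) fit and a one-step out-of-sample forecast on synthetic data (the series below is invented for illustration, not the authors' Hindu Kush catalog):

```python
import random

def fit_ar1(x):
    """Ordinary least-squares fit of x[t] = c + phi*x[t-1] + e[t]."""
    y, z = x[1:], x[:-1]
    n = len(y)
    mz, my = sum(z) / n, sum(y) / n
    phi = (sum((a - mz) * (b - my) for a, b in zip(z, y))
           / sum((a - mz) ** 2 for a in z))
    return my - phi * mz, phi

# synthetic series with known dynamics x[t] = 2 + 0.5*x[t-1] + noise
random.seed(1)
x = [4.0]
for _ in range(300):
    x.append(2.0 + 0.5 * x[-1] + random.gauss(0.0, 0.1))

c, phi = fit_ar1(x)
one_step = c + phi * x[-1]   # one-step-ahead out-of-sample forecast
```

Out-of-sample evaluation as in the study would refit on a training window and compare such forecasts against held-out observations.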

  17. Probabilistic Seismic Hazard Assessment for Himalayan-Tibetan Region from Historical and Instrumental Earthquake Catalogs

    Science.gov (United States)

    Rahman, M. Moklesur; Bai, Ling; Khan, Nangyal Ghani; Li, Guohui

    2018-02-01

The Himalayan-Tibetan region has a long history of devastating earthquakes with widespread casualties and socio-economic damage. Here, we conduct a probabilistic seismic hazard analysis by incorporating the incomplete historical earthquake records along with the instrumental earthquake catalogs for the Himalayan-Tibetan region. Historical earthquake records reaching back more than 1000 years and an updated, homogenized and declustered instrumental earthquake catalog since 1906 are utilized. The essential seismicity parameters, namely the mean seismicity rate γ, the Gutenberg-Richter b value, and the maximum expected magnitude M max, are estimated using the maximum likelihood algorithm, accounting for the incompleteness of the catalog. To compute the hazard values, three seismogenic source models (smoothed gridded, linear, and areal sources) and two sets of ground motion prediction equations are combined by means of a logic tree to account for the epistemic uncertainties. The peak ground acceleration (PGA) and spectral acceleration (SA) at 0.2 and 1.0 s are predicted for 2 and 10% probabilities of exceedance over 50 years, assuming bedrock conditions. The resulting PGA and SA maps show a significant spatio-temporal variation in the hazard values. In general, hazard values are found to be much higher than in previous studies for regions where great earthquakes have actually occurred. The use of the historical and instrumental earthquake catalogs in combination with multiple seismogenic source models provides better seismic hazard constraints for the Himalayan-Tibetan region.
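One of the seismicity parameters mentioned, the Gutenberg-Richter b value, has a well-known closed-form maximum-likelihood estimator (Aki 1965, with Utsu's binning correction). A sketch on synthetic magnitudes with b = 1 by construction, not on the authors' catalog:

```python
import math

def b_value_mle(mags, m_c, dm=0.0):
    """Aki (1965) maximum-likelihood b-value above completeness magnitude
    m_c; dm/2 is Utsu's correction for magnitudes binned to width dm."""
    m = [x for x in mags if x >= m_c]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))

# synthetic Gutenberg-Richter sample above m_c = 3.0 with b = 1:
# inverse-CDF sampling M = m_c - log10(u)/b over a uniform grid of u
N = 1000
mags = [3.0 - math.log10((i + 0.5) / N) for i in range(N)]
b = b_value_mle(mags, m_c=3.0)
```

The full analysis in the study additionally handles catalog incompleteness, which this sketch omits.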

  18. Lunar seismicity, structure, and tectonics

    Science.gov (United States)

    Lammlein, D. R.; Latham, G. V.; Dorman, J.; Nakamura, Y.; Ewing, M.

    1974-01-01

Natural seismic events have been detected by the long-period seismometers at Apollo stations 16, 14, 15, and 12 at annual rates of 3300, 1700, 800, and 700, respectively, with peak activity at 13- to 14-day intervals. The data are used to describe magnitudes, source characteristics, and periodic features of lunar seismicity. In the present model, the rigid lithosphere overlies an asthenosphere of reduced rigidity in which present-day partial melting is probable. Tidal deformation presumably leads to critical stress concentrations at the base of the lithosphere, where moonquakes are found to occur. The striking tidal periodicities in the pattern of moonquake occurrence and energy release suggest that tidal energy is the dominant source of energy released as moonquakes. Thus, tidal energy is dissipated by moonquakes in the lithosphere and probably by inelastic processes in the asthenosphere.

  19. SEISMIC ATTENUATION FOR RESERVOIR CHARACTERIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Joel Walls; M.T. Taner; Gary Mavko; Jack Dvorkin

    2002-01-01

In Section 1 of this first report we will describe the work we are doing to collect and analyze rock physics data for the purpose of modeling seismic attenuation from other measurable quantities such as porosity, water saturation, clay content and net stress. This work, and other empirical methods to be presented later, will form the basis for ''Q pseudo-well modeling'', which is a key part of this project. In Section 2 of this report, we will show the fundamentals of a new method to extract Q, dispersion, and attenuation from field seismic data. The method is called Gabor-Morlet time-frequency decomposition. This technique has a number of advantages, including greater stability and better time resolution than spectral ratio methods.
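The spectral-ratio approach that Gabor-Morlet decomposition is compared against rests on the relation ln(A2/A1) = const − πfΔt/Q, so Q follows from the slope of a line fit over frequency. A toy recovery on synthetic, noise-free spectra (illustrative only; the report's own method differs):

```python
import math

def q_from_spectral_ratio(freqs, ratio_ln, dt):
    """Classical spectral-ratio Q estimate: ln(A2/A1) = const - pi*f*dt/Q,
    so Q = -pi*dt / slope of a least-squares line through (f, ln ratio)."""
    n = len(freqs)
    mf = sum(freqs) / n
    mr = sum(ratio_ln) / n
    slope = (sum((f - mf) * (r - mr) for f, r in zip(freqs, ratio_ln))
             / sum((f - mf) ** 2 for f in freqs))
    return -math.pi * dt / slope

# synthetic: two receivers dt = 0.5 s apart in a medium with Q = 80
Q_true, dt = 80.0, 0.5
freqs = [5.0 + i for i in range(40)]                  # 5-44 Hz
ratio_ln = [-math.pi * f * dt / Q_true for f in freqs]
Q_est = q_from_spectral_ratio(freqs, ratio_ln, dt)    # recovers ~80
```

On real data the ratio is noisy and band-limited, which is where the stability advantage claimed for the time-frequency method matters.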

  20. Advances in experimental seismic engineering

    International Nuclear Information System (INIS)

    Muthumani, K.; Gopalakrishnan, N.; Sathish Kumar, K.; Iyer, Nagesh R.

    2011-01-01

Seismic testing plays a key role in better understanding physical phenomena, validating and improving analysis and design methods, and qualifying sensitive equipment. There are several different experimental techniques that can be used to test the response of structures and verify their seismic performance. These include (i) quasi-static testing, (ii) shake table testing, (iii) effective force testing, (iv) pseudodynamic testing, and (v) real-time dynamic hybrid testing. Sophisticated shaking table facilities and modern data acquisition and processing methods using high-speed computers have made it possible to improve the accuracy and reliability of the experimental data and to increase the number of gauge points, thus yielding a more detailed picture of the structural behavior. Lifeline structures like nuclear power plants and thermal power

  1. An economical educational seismic system

    Science.gov (United States)

    Lehman, J. D.

    1980-01-01

There is considerable interest in seismology from the nonprofessional or amateur standpoint. The operation of a seismic system can be satisfying and educational, especially when you have built and operated the system yourself. A long-period indoor-type sensor and recording system that works extremely well has been developed in the James Madison University Physics Department. The system can be built quite economically, and any educational institution that cannot commit itself to a professional installation need not be without first-hand seismic information. The system design approach has been selected by college students working on a project or senior thesis, several elementary and secondary science teachers, as well as the more ambitious tinkerer or hobbyist at home.

  2. Displacement Based Seismic Design Criteria

    International Nuclear Information System (INIS)

    Costello, J.F.; Hofmayer, C.; Park, Y.J.

    1999-01-01

    The USNRC has initiated a project to determine if any of the likely revisions to traditional earthquake engineering practice are relevant to seismic design of the specialized structures, systems and components of nuclear power plants and of such significance to suggest that a change in design practice might be warranted. As part of the initial phase of this study, a literature survey was conducted on the recent changes in seismic design codes/standards, on-going activities of code-writing organizations/communities, and published documents on displacement-based design methods. This paper provides a summary of recent changes in building codes and on-going activities for future codes. It also discusses some technical issues for further consideration

  3. Adaptive steganography

    Science.gov (United States)

    Chandramouli, Rajarathnam; Li, Grace; Memon, Nasir D.

    2002-04-01

    Steganalysis techniques attempt to differentiate between stego-objects and cover-objects. In recent work we developed an explicit analytic upper bound for the steganographic capacity of LSB based steganographic techniques for a given false probability of detection. In this paper we look at adaptive steganographic techniques. Adaptive steganographic techniques take explicit steps to escape detection. We explore different techniques that can be used to adapt message embedding to the image content or to a known steganalysis technique. We investigate the advantages of adaptive steganography within an analytical framework. We also give experimental results with a state-of-the-art steganalysis technique demonstrating that adaptive embedding results in a significant number of bits embedded without detection.
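For context, the LSB baseline whose capacity the authors bound, and which adaptive schemes improve on, can be sketched as plain, non-adaptive embedding on hypothetical 8-bit cover samples:

```python
def embed_lsb(samples, bits):
    """Plain (non-adaptive) LSB embedding: overwrite the least significant
    bit of each 8-bit cover sample with one message bit."""
    out = list(samples)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b
    return out

def extract_lsb(samples, n):
    """Read the message back from the first n stego samples."""
    return [s & 1 for s in samples[:n]]

cover = [128, 64, 200, 33, 17, 255, 0, 90]   # hypothetical pixel values
message = [1, 0, 1, 1, 0, 1, 0, 0]
stego = embed_lsb(cover, message)
```

An adaptive variant, as discussed in the paper, would instead choose embedding positions based on image content or a known steganalysis statistic rather than writing sequentially.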

  4. Adaptive Lighting

    DEFF Research Database (Denmark)

    Petersen, Kjell Yngve; Søndergaard, Karin; Kongshaug, Jesper

    2015-01-01

    the investigations of lighting scenarios carried out in two test installations: White Cube and White Box. The test installations are discussed as large-scale experiential instruments. In these test installations we examine what could potentially occur when light using LED technology is integrated and distributed......Adaptive Lighting Adaptive lighting is based on a partial automation of the possibilities to adjust the colour tone and brightness levels of light in order to adapt to people’s needs and desires. IT support is key to the technical developments that afford adaptive control systems. The possibilities...... differently into an architectural body. We also examine what might occur when light is dynamic and able to change colour, intensity and direction, and when it is adaptive and can be brought into interaction with its surroundings. In short, what happens to an architectural space when artificial lighting ceases...

  5. Seismic Imaging of Mantle Plumes

    Science.gov (United States)

    Nataf, Henri-Claude

    The mantle plume hypothesis was proposed thirty years ago by Jason Morgan to explain hotspot volcanoes such as Hawaii. A thermal diapir (or plume) rises from the thermal boundary layer at the base of the mantle and produces a chain of volcanoes as a plate moves on top of it. The idea is very attractive, but direct evidence for actual plumes is weak, and many questions remain unanswered. With the great improvement of seismic imagery in the past ten years, new prospects have arisen. Mantle plumes are expected to be rather narrow, and their detection by seismic techniques requires specific developments as well as dedicated field experiments. Regional travel-time tomography has provided good evidence for plumes in the upper mantle beneath a few hotspots (Yellowstone, Massif Central, Iceland). Beneath Hawaii and Iceland, the plume can be detected in the transition zone because it deflects the seismic discontinuities at 410 and 660 km depths. In the lower mantle, plumes are very difficult to detect, so specific methods have been worked out for this purpose. There are hints of a plume beneath the weak Bowie hotspot, as well as intriguing observations for Hawaii. Beneath Iceland, high-resolution tomography has just revealed a wide and meandering plume-like structure extending from the core-mantle boundary up to the surface. Among the many phenomena that seem to take place in the lowermost mantle (or D''), there are also signs there of the presence of plumes. In this article I review the main results obtained so far from these studies and discuss their implications for plume dynamics. Seismic imaging of mantle plumes is still in its infancy but should soon become a turbulent teenager.

  6. Seismic evaluation of nuclear installations

    International Nuclear Information System (INIS)

    Mattar Neto, Miguel

    1997-01-01

Some considerations regarding extreme external events, natural or man-induced, such as earthquakes, floods, air crashes, etc., shall be made for nuclear facilities to minimize the potential impact of the installation on the public and the environment. In this paper the main aspects of the seismic evaluation of nuclear facilities (except nuclear power reactors) are presented, based on different codes and standards. (author). 7 refs., 2 tabs

  7. Seismic hazard studies in Egypt

    Directory of Open Access Journals (Sweden)

    Abuo El-Ela A. Mohamed

    2012-12-01

Full Text Available The study of earthquake activity and seismic hazard assessment of Egypt is very important due to the great and rapid spread of large investments in national projects, especially the nuclear power plant that will be built in the northern part of Egypt. Although Egypt is characterized by low seismicity, it has experienced damaging earthquakes throughout its history. The seismotectonic setting of Egypt suggests that large earthquakes are possible, particularly along the Gulf of Aqaba–Dead Sea transform, the subduction zone along the Hellenic and Cyprean Arcs, and the northern Red Sea triple junction point. In addition, some significant inland sources at Aswan, Dahshour, and the Cairo-Suez District should be considered. The seismic hazard for Egypt is calculated utilizing a probabilistic approach (for a grid of 0.5° × 0.5°) within a logic-tree framework. Alternative seismogenic models and ground motion scaling relationships are selected to account for the epistemic uncertainty. Seismic hazard values on rock were calculated to create contour maps for four ground motion spectral periods and for different return periods. In addition, the uniform hazard spectra for rock sites for 25 different periods, and the probabilistic hazard curves for the cities of Cairo and Alexandria, are graphed. The highest peak ground acceleration (PGA) values were found close to the Gulf of Aqaba, about 220 gal for a 475-year return period, while the lowest PGA values, less than 25 gal, were detected in the western part of the Western Desert.
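The return periods quoted in such studies (e.g. 475 years) follow from the Poisson exceedance model that underlies probabilistic hazard maps; a minimal sketch of the standard relation:

```python
import math

def exceedance_prob(return_period_yr, exposure_yr):
    """Poisson probability that a ground motion with the given return
    period is exceeded at least once during the exposure time."""
    return 1.0 - math.exp(-exposure_yr / return_period_yr)

def return_period(prob, exposure_yr):
    """Inverse relation: return period for a target exceedance probability."""
    return -exposure_yr / math.log(1.0 - prob)

rp = return_period(0.10, 50.0)      # ~475 yr, the standard design level
p = exceedance_prob(475.0, 50.0)    # ~10% in 50 years
```

This is why "10% probability of exceedance in 50 years" and "475-year return period" describe the same hazard level.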

  8. The ISC Seismic Event Bibliography

    Science.gov (United States)

    Di Giacomo, Domenico; Storchak, Dmitry

    2015-04-01

The International Seismological Centre (ISC) is a not-for-profit organization operating in the UK for the last 50 years and producing the ISC Bulletin - the definitive worldwide summary of seismic events, both natural and anthropogenic - starting from the beginning of the 20th century. Researchers often need to gather information related to specific seismic events for various reasons. To facilitate this task, in 2012 we set up a new database linking earthquakes and other seismic events in the ISC Bulletin to bibliographic records of scientific articles (mostly peer-reviewed journals) that describe those events. This association allows users of the ISC Event Bibliography (www.isc.ac.uk/event_bibliography/index.php) to run searches for publications via a map-based web interface and, optionally, to select scientific publications related to either specific events or events in an area of interest. Some of the greatest earthquakes were described in several hundred articles published over a period of a few years. The journals included in our database are not limited to seismology but bring together a variety of fields in geosciences (e.g., engineering seismology, geodesy and remote sensing, tectonophysics, monitoring research, tsunami, geology, geochemistry, hydrogeology, atmospheric sciences, etc.), making this service useful in multidisciplinary studies. Papers dealing with large data sets are usually not included (e.g., papers describing a seismic catalogue). Currently the ISC Event Bibliography includes over 17,000 individual publications from about 500 titles related to over 14,000 events that occurred in the last 100+ years. The bibliographic records in the Event Bibliography start in the 1950s, and the database is updated as new publications become available.

  9. Seismic Shot Processing on GPU

    OpenAIRE

    Johansen, Owe

    2009-01-01

Today's petroleum industry demands an ever increasing amount of computational resources. Seismic processing applications in use by these types of companies have generally been using large clusters of compute nodes, whose only computing resource has been the CPU. However, using Graphics Processing Units (GPU) for general purpose programming is these days becoming increasingly more popular in the high performance computing area. In 2007, NVIDIA corporation launched their framework for develo...

  10. Seismic analysis of axisymmetric shells

    International Nuclear Information System (INIS)

    Jospin, R.J.; Toledo, E.M.; Feijoo, R.A.

    1984-01-01

Axisymmetric shells subjected to multiple support excitation are studied. The shells are spatially discretized by the finite element method, and in order to obtain estimates for the maximum values of displacements and stresses the response spectrum technique is used. Finally, some numerical results are presented and discussed for the case of a shell of revolution with a vertical symmetry axis, subjected to seismic ground motions in the horizontal, vertical and rocking directions. (Author) [pt

  11. Smooth solutions of the Navier-Stokes equations

    International Nuclear Information System (INIS)

    Pokhozhaev, S I

    2014-01-01

We consider smooth solutions of the Cauchy problem for the Navier-Stokes equations on the scale of smooth functions which are periodic with respect to x ∈ ℝ³. We obtain existence theorems for global (with respect to t > 0) and local solutions of the Cauchy problem. The statements of these theorems depend on the smoothness and the norm of the initial vector function. Upper bounds for the behaviour of solutions in both classes, which depend on t, are also obtained. Bibliography: 10 titles
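For reference, the Cauchy problem discussed in the abstract is commonly written as follows (a standard formulation; the paper's exact normalization may differ):

```latex
\begin{align*}
  \partial_t u + (u \cdot \nabla)u &= \nu \, \Delta u - \nabla p,
      & x &\in \mathbb{T}^3 = \mathbb{R}^3 / 2\pi\mathbb{Z}^3,\ t > 0,\\
  \nabla \cdot u &= 0, \\
  u(x, 0) &= u_0(x),
\end{align*}
```

where $u$ is the velocity field, $p$ the pressure, $\nu > 0$ the viscosity, and $u_0$ the smooth, periodic, divergence-free initial vector function on whose smoothness and norm the existence statements depend.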

  12. EXCHANGE-RATES FORECASTING: EXPONENTIAL SMOOTHING TECHNIQUES AND ARIMA MODELS

    Directory of Open Access Journals (Sweden)

    Dezsi Eva

    2011-07-01

Full Text Available Exchange rates forecasting is, and has been, a challenging task in finance. Statistical and econometric models are widely used in the analysis and forecasting of foreign exchange rates. This paper investigates the behavior of daily exchange rates of the Romanian Leu against the Euro, United States Dollar, British Pound, Japanese Yen, Chinese Renminbi and the Russian Ruble. Smoothing techniques are generated and compared with each other. These models include the Simple Exponential Smoothing technique, the Double Exponential Smoothing technique, the Simple Holt-Winters and the Additive Holt-Winters techniques, as well as the Autoregressive Integrated Moving Average model.
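The simplest of the techniques compared, Simple Exponential Smoothing, can be sketched in a few lines; the recursion s[t] = α·x[t] + (1−α)·s[t−1] yields the one-step-ahead forecast. The rates below are invented values, not the paper's data:

```python
def simple_exp_smoothing(x, alpha):
    """Simple exponential smoothing: s[t] = alpha*x[t] + (1-alpha)*s[t-1];
    the final smoothed value is the one-step-ahead forecast."""
    s = x[0]
    for xt in x[1:]:
        s = alpha * xt + (1 - alpha) * s
    return s

rates = [4.42, 4.45, 4.41, 4.48, 4.50, 4.47, 4.52]  # hypothetical daily fixings
forecast = simple_exp_smoothing(rates, alpha=0.3)
```

Double exponential smoothing and the Holt-Winters variants add trend (and, in the seasonal case, seasonal) components on top of this same recursion.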

  13. Electrochemically replicated smooth aluminum foils for anodic alumina nanochannel arrays

    International Nuclear Information System (INIS)

    Biring, Sajal; Tsai, K-T; Sur, Ujjal Kumar; Wang, Y-L

    2008-01-01

    A fast electrochemical replication technique has been developed to fabricate large-scale ultra-smooth aluminum foils by exploiting readily available large-scale smooth silicon wafers as the masters. Since the adhesion of aluminum on silicon depends on the time of surface pretreatment in water, it is possible to either detach the replicated aluminum from the silicon master without damaging the replicated aluminum and master or integrate the aluminum film to the silicon substrate. Replicated ultra-smooth aluminum foils are used for the growth of both self-organized and lithographically guided long-range ordered arrays of anodic alumina nanochannels without any polishing pretreatment

  14. Additional Smooth and Rough Water Trials of SKI-CAT.

    Science.gov (United States)

    1981-08-01

Further tests of SKI-CAT were made in smooth and rough water. Smooth water results confirmed the performance results of earlier trials; rough water tests showed reductions in the accelerations and motions of SKI-CAT in head seas.

  15. Smooth invariant densities for random switching on the torus

    Science.gov (United States)

    Bakhtin, Yuri; Hurth, Tobias; Lawley, Sean D.; Mattingly, Jonathan C.

    2018-04-01

    We consider a random dynamical system obtained by switching between the flows generated by two smooth vector fields on the 2d-torus, with the random switchings happening according to a Poisson process. Assuming that the driving vector fields are transversal to each other at all points of the torus and that each of them allows for a smooth invariant density and no periodic orbits, we prove that the switched system also has a smooth invariant density, for every switching rate. Our approach is based on an integration by parts formula inspired by techniques from Malliavin calculus.

  16. Smoothing optimization of supporting quadratic surfaces with Zernike polynomials

    Science.gov (United States)

    Zhang, Hang; Lu, Jiandong; Liu, Rui; Ma, Peifu

    2018-03-01

A new optimization method to obtain a smooth freeform optical surface from an initial surface generated by the supporting quadratic method (SQM) is proposed. To smooth the initial surface, a 9-vertex system from the neighboring quadratic surfaces and the Zernike polynomials are employed to establish a linear equation system. A locally optimized surface for the 9-vertex system can be built by solving the equations. Finally, a continuous smooth optimized surface is constructed by stitching the above algorithm over the whole initial surface. The spot corresponding to the optimized surface is no longer discrete pixels but a continuous distribution.

  17. A locally adaptive normal distribution

    DEFF Research Database (Denmark)

    Arvanitidis, Georgios; Hansen, Lars Kai; Hauberg, Søren

    2016-01-01

    entropy distribution under the given metric. The underlying metric is, however, non-parametric. We develop a maximum likelihood algorithm to infer the distribution parameters that relies on a combination of gradient descent and Monte Carlo integration. We further extend the LAND to mixture models......The multivariate normal density is a monotonic function of the distance to the mean, and its ellipsoidal shape is due to the underlying Euclidean metric. We suggest to replace this metric with a locally adaptive, smoothly changing (Riemannian) metric that favors regions of high local density...

  18. Design of adaptive switching control for hypersonic aircraft

    Directory of Open Access Journals (Sweden)

    Xin Jiao

    2015-10-01

    Full Text Available This article proposes a novel adaptive switching control of hypersonic aircraft based on type-2 Takagi–Sugeno–Kang fuzzy sliding mode control and focuses on the problem of stability and smoothness in the switching process. This method uses full-state feedback to linearize the nonlinear model of hypersonic aircraft. Combining the interval type-2 Takagi–Sugeno–Kang fuzzy approach with sliding mode control keeps the adaptive switching process stable and smooth. For rapid stabilization of the system, the adaptive laws use a direct constructive Lyapunov analysis together with an established type-2 Takagi–Sugeno–Kang fuzzy logic system. Simulation results indicate that the proposed control scheme can maintain the stability and smoothness of switching process for the hypersonic aircraft.

  19. Seismic risk map of Korea

    International Nuclear Information System (INIS)

    Lee, S.H.; Lee, Y.K.; Eum, S.H.; Yang, S.J.; Chun, M.S.

    1983-01-01

A study of the seismic hazard level in Korea has been performed, and the main results are summarized as follows: 1. Historians suggest that the quality of the historical earthquake data may be accurate to some degree and that the data should be used in seismic risk analysis. 2. The historical damage events are confirmed in the historical literature and their intensities re-evaluated by joint researchers. The maximum MM intensity, evaluated for 17 events, is VIII. 3. The relation of earthquakes to surface faults is not clear. It seems reasonable to relate them to tectonic provinces. 4. Statistical seismic risk analysis shows that the acceleration expected within a 500-year return period is less than 0.25G when only instrumental earthquakes are used, and less than 0.10G if all instrumental and historical earthquakes are used. The acceleration in the Western Coast and Kyungsang areas is higher than in the other regions of Korea. 5. The maximum horizontal acceleration determined by the conservative method is 0.26G when historical earthquake data are used and less than 0.20G if only instrumental earthquakes are used. The return period of 0.26G is 240 years in Kyungsang province and longer in the other provinces. (Author)

  20. The seismic reflection inverse problem

    International Nuclear Information System (INIS)

    Symes, W W

    2009-01-01

    The seismic reflection method seeks to extract maps of the Earth's sedimentary crust from transient near-surface recording of echoes, stimulated by explosions or other controlled sound sources positioned near the surface. Reasonably accurate models of seismic energy propagation take the form of hyperbolic systems of partial differential equations, in which the coefficients represent the spatial distribution of various mechanical characteristics of rock (density, stiffness, etc). Thus the fundamental problem of reflection seismology is an inverse problem in partial differential equations: to find the coefficients (or at least some of their properties) of a linear hyperbolic system, given the values of a family of solutions in some part of their domains. The exploration geophysics community has developed various methods for estimating the Earth's structure from seismic data and is also well aware of the inverse point of view. This article reviews mathematical developments in this subject over the last 25 years, to show how the mathematics has both illuminated innovations of practitioners and led to new directions in practice. Two themes naturally emerge: the importance of single scattering dominance and compensation for spectral incompleteness by spatial redundancy. (topical review)

  1. Building a Smartphone Seismic Network

    Science.gov (United States)

    Kong, Q.; Allen, R. M.

    2013-12-01

We are exploring building a new type of seismic network using smartphones. The accelerometers in smartphones can be used to record earthquakes, the GPS unit can give an accurate location, and the built-in communication unit makes communication easier for this network. In the future, these smartphones may work as a supplementary network to the current traditional networks for scientific research and real-time applications. In order to build this network, we developed an application for Android phones and a server to record the acceleration in real time. These records can be sent back to the server in real time and analyzed there. We evaluated the performance of the smartphone as a seismic recording instrument by comparing it with a high-quality accelerometer while mounted on controlled shake tables for a variety of tests, including a noise-floor test. Based on the daily human activity data recorded by volunteers and the shake table test data, we also developed an algorithm for the smartphones to detect earthquakes amid daily human activities. These efforts form the basis for setting up a new prototype smartphone seismic network in the near future.
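The abstract does not specify the detection algorithm; a classic STA/LTA trigger is one common choice for separating impulsive shaking from background activity and might be sketched as follows (window lengths and threshold are illustrative):

```python
def sta_lta_trigger(samples, sta_n=5, lta_n=20, threshold=3.0):
    """Classic STA/LTA detector: flag sample indices where the short-term
    average of |amplitude| exceeds `threshold` times the long-term average."""
    triggers = []
    for i in range(lta_n, len(samples)):
        sta = sum(abs(s) for s in samples[i - sta_n:i]) / sta_n
        lta = sum(abs(s) for s in samples[i - lta_n:i]) / lta_n
        if lta > 0 and sta / lta > threshold:
            triggers.append(i)
    return triggers

# quiet background with a burst ("event") starting at sample 30
trace = [0.1] * 30 + [2.0] * 10 + [0.1] * 10
onsets = sta_lta_trigger(trace)
```

Discriminating earthquakes from daily human activity, as in the study, requires more than amplitude ratios (e.g. waveform features learned from the volunteer data), which this sketch does not attempt.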

  2. Simulations of seismic acquisition footprint

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, J.; Margrave, G.; Lawton, D. [Calgary Univ., AB (Canada)

    2008-07-01

    Numerical simulations were performed to investigate the causes of commonly observed artefacts in seismic field data. These seismic acquisition footprints typically consist of modulations in recorded amplitudes that are spatially correlated to the surface locations of sources and receivers used in a survey. Two broad classes of footprint were considered, notably amplitude variations related to the edges of the survey and amplitude variations in the interior of the survey. The variations in amplitude obscure the true reflection response of the subsurface. MATLAB numerical modelling code was used to produce the synthetic seismic data and create a thorough dataset using a survey design incorporating dense grids of sources and receivers. The footprint consisting of periodic amplitude variations in the interior of the surveys, similar to that observed in field data and likely produced by poor sampling, was observed in the decimated dataset. This type of footprint varied in strength between images produced with different processing algorithms. The observed footprint in these simulations was most organized in the unmigrated stack and was somewhat randomized after poststack migration. 2 refs., 1 tab., 3 figs.
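The connection between acquisition geometry and footprint can be sketched with a toy fold calculation (the geometry is invented, not taken from the paper): counting how many source-receiver midpoints fall in each subsurface bin shows how the sampling pattern imprints itself on coverage, and hence on stacked amplitudes, with the edge taper illustrating the first class of footprint discussed above.

```python
import numpy as np

# Fold (midpoint multiplicity per bin) for a coarsely sampled 2-D line.
# Source/receiver intervals and bin size are illustrative only.
sources = np.arange(0, 1000, 100.0)      # source interval: 100 m
receivers = np.arange(0, 1000, 25.0)     # receiver interval: 25 m
mid = (sources[:, None] + receivers[None, :]) / 2.0   # all midpoints

bins = np.arange(0, 1000, 12.5)          # 12.5 m subsurface bins
fold, _ = np.histogram(mid.ravel(), bins=bins)
# Fold tapers at the survey edges and varies with the sampling pattern;
# such uneven coverage modulates stacked amplitudes as a footprint.
```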

  3. Forecasting Italian seismicity through a spatio-temporal physical model: importance of considering time-dependency and reliability of the forecast

    Directory of Open Access Journals (Sweden)

    Amir Hakimhashemi

    2010-11-01

    Full Text Available We apply a forecasting model for the spatio-temporal distribution of seismicity to the Italian region, based on a smoothing kernel function, Coulomb stress variations, and a rate-and-state friction law. We tested the feasibility of this approach and analyzed the importance of introducing time-dependency when forecasting future events. The change in seismicity rate as a function of time was estimated by calculating the Coulomb stress change imparted by large earthquakes. We applied our approach to the region of Italy, and used all of the cataloged earthquakes that occurred up to 2006 to generate the reference seismicity rate. For calculation of the time-dependent seismicity rate changes, we estimated the rate-and-state stress transfer imparted by all of the ML≥4.0 earthquakes that occurred during 2007 and 2008. To validate the results, we first compared the reference seismicity rate with the distribution of ML≥1.8 earthquakes since 2007, using both a non-declustered and a declustered catalog. A positive correlation was found, and all of the forecast earthquakes had locations within 82% and 87% of the study area with the highest seismicity rate, respectively. Furthermore, 95% of the forecast earthquakes had locations within 27% and 47% of the study area with the highest seismicity rate, respectively. For the time-dependent seismicity rate changes, the number of events with locations in the regions with a seismicity rate increase was 11% more than in the regions with a seismicity rate decrease.
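The smoothing-kernel ingredient of such a forecast can be sketched as follows. This is a simplified fixed-bandwidth Gaussian version with invented coordinates; the actual model also includes Coulomb-stress and rate-and-state terms, and adaptive-bandwidth variants let the kernel width shrink where seismicity is dense.

```python
import numpy as np

# Smooth past epicenters into a normalized spatial rate map.
rng = np.random.default_rng(1)
epicenters = rng.uniform(0, 100, size=(200, 2))   # past events, km coords

# Evaluation grid over the study region
xs, ys = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
grid = np.column_stack([xs.ravel(), ys.ravel()])

sigma = 10.0                                      # smoothing bandwidth, km
d2 = ((grid[:, None, :] - epicenters[None, :, :]) ** 2).sum(axis=2)
density = np.exp(-d2 / (2 * sigma**2)).sum(axis=1) / (2 * np.pi * sigma**2)

rate = density / density.sum()   # forecast: probability per grid cell
```

The normalized map gives the spatial probability that the next event falls in each cell; magnitude and time components are modeled separately.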

  4. xQuake: A Modern Approach to Seismic Network Analytics

    Science.gov (United States)

    Johnson, C. E.; Aikin, K. E.

    2017-12-01

    While seismic networks have expanded over the past few decades, and social needs for accurate and timely information have increased dramatically, approaches to the operational needs of both global and regional seismic observatories have been slow to adopt new technologies. This presentation describes the xQuake system, which provides a fresh approach to seismic network analytics based on complexity theory and an adaptive architecture of streaming connected microservices, as diverse data (picks, beams, and other data) flow into a final, curated catalog of events. The foundation for xQuake is the xGraph (executable graph) framework, which is essentially a self-organizing graph database. An xGraph instance provides both the analytics and the data storage capabilities at the same time. Much of the analytics, such as simulated annealing in the detection process and an evolutionary programming approach to event evolution, draws from the recent GLASS 3.0 seismic associator developed by and for the USGS National Earthquake Information Center (NEIC). In some respects xQuake is reminiscent of the Earthworm system, in that it comprises processes interacting through store-and-forward rings; not surprising, as the first author was the lead architect of the original Earthworm project when it was known as "Rings and Things". While Earthworm components can easily be integrated into the xGraph processing framework, the architecture and analytics are more current (e.g., using a Kafka broker for store-and-forward rings). The xQuake system is being released under an unrestricted open-source license to encourage and enable seismic community support in further development of its capabilities.

  5. Two applications of time reversal mirrors: Seismic radio and seismic radar

    KAUST Repository

    Hanafy, Sherif M.; Schuster, Gerard T.

    2011-01-01

    Two seismic applications of time reversal mirrors (TRMs) are introduced and tested with field experiments. The first one is sending, receiving, and decoding coded messages similar to a radio except seismic waves are used. The second one is, similar

  6. Role of seismic PRA in seismic safety decisions of nuclear power plants

    International Nuclear Information System (INIS)

    Ravindra, M.K.; Kennedy, R.P.; Sues, R.H.

    1985-01-01

    This paper highlights the important roles that seismic probabilistic risk assessments (PRAs) can play in the seismic safety decisions of nuclear power plants. If a seismic PRA has been performed for a plant, its results can be utilized to evaluate the seismic capability beyond the safe shutdown earthquake (SSE). Seismic fragilities of key structures and equipment, fragilities of dominant plant damage states and the frequencies of occurrence of these plant damage states are reviewed to establish the seismic safety of the plant beyond the SSE level. Guidelines for seismic margin reviews and upgrading may be developed by first identifying the generic classes of structures and equipment that have been shown to be dominant risk contributors in the completed seismic PRAs, studying the underlying causes for their contribution, and examining why certain other items (e.g., piping) have not proved to be high-risk contributors.

  7. Seismic safety programme at NPP Paks. Propositions for coordinated international activity in seismic safety of the WWER-440 V-213

    International Nuclear Information System (INIS)

    Katona, T.

    1995-01-01

    This paper presents the Paks NPP seismic safety program, highlighting the specifics of the WWER-440/213 type in operation, and the results of the work obtained so far. It covers the following scope: establishment of the seismic safety program (original seismic design, current requirements, principles and structure of the seismic safety program); implementation of the seismic safety program (assessing the seismic hazard of the site, development of the new concept of seismic safety for the NPP, assessing the seismic resistance of the building and the technology); realization of a higher level of seismic safety (technical solutions, drawings, realization); and ideas and propositions for coordinated international activity.

  8. NRC systematic evaluation program: seismic review

    International Nuclear Information System (INIS)

    Levin, H.A.

    1980-01-01

    The NRC Systematic Evaluation Program is currently making an assessment of the seismic design safety of 11 older nuclear power plant facilities. The general review philosophy and review criteria relative to seismic input, structural response, and equipment functionability are presented, including the rationale for the development of these guidelines considering the significant evolution of seismic design criteria since these plants were originally licensed. Technical approaches considered more realistic in light of current knowledge are utilized. Initial findings for plants designed to early seismic design procedures suggest that, with minor exceptions, these plants possess adequate seismic design margins when evaluated against the intent of current criteria. However, seismic qualification of electrical equipment has been identified as a subject that requires more in-depth evaluation.

  9. Review of nuclear piping seismic design requirements

    International Nuclear Information System (INIS)

    Slagis, G.C.; Moore, S.E.

    1994-01-01

    Modern-day nuclear plant piping systems are designed with a large number of seismic supports and snubbers that may be detrimental to plant reliability. Experimental tests have demonstrated the inherent ruggedness of ductile steel piping for seismic loading. Present methods to predict seismic loads on piping are based on linear-elastic analysis methods with low damping. These methods overpredict the seismic response of ductile steel pipe. The ASME Boiler and Pressure Vessel Code, Section III, stress limits for piping systems are based on considerations of static loads and hence are overly conservative. Appropriate stress limits for seismic loads on piping should be incorporated into the code to allow more flexible piping designs. The existing requirements and methods for seismic design of piping systems, including inherent conservatisms, are explained to provide a technical foundation for modifications to those requirements. 30 refs., 5 figs., 3 tabs

  10. Seismic design practices for power systems

    International Nuclear Information System (INIS)

    Schiff, A.J.

    1991-01-01

    In this paper, the evolution of seismic design practices in electric power systems is reviewed. In California, this evolution has led to many installation practices that are directed at improving the seismic ruggedness of power system facilities, particularly high-voltage substation equipment. The primary means for substantiating the seismic ruggedness of important, hard-to-analyze substation equipment is through vibration testing. Current activities include system evaluations, development of emergency response plans and their exercise, and review of elements that impact the entire system, such as energy control centers and communication systems. From a national perspective, there is a need to standardize seismic specifications, identify a seismic specialist within each utility, and enhance communications among these specialists. There is a general need to incorporate good seismic design practices on a national basis, emphasizing new construction.

  11. Seismic hazard in Hawaii: High rate of large earthquakes and probabilistic ground-motion maps

    Science.gov (United States)

    Klein, F.W.; Frankel, A.D.; Mueller, C.S.; Wesson, R.L.; Okubo, P.G.

    2001-01-01

    The seismic hazard and earthquake occurrence rates in Hawaii are locally as high as those near the most hazardous faults elsewhere in the United States. We have generated maps of peak ground acceleration (PGA) and spectral acceleration (SA) (at 0.2, 0.3 and 1.0 sec, 5% critical damping) at 2% and 10% exceedance probabilities in 50 years. The highest hazard is on the south side of Hawaii Island, as indicated by the MI 7.0, MS 7.2, and MI 7.9 earthquakes that have occurred there since 1868. Probabilistic values of horizontal PGA (2% in 50 years) on Hawaii's south coast exceed 1.75g. Because some large earthquake aftershock zones and the geometry of flank blocks slipping on subhorizontal decollement faults are known, we use a combination of spatially uniform sources in active flank blocks and smoothed seismicity in other areas to model seismicity. Rates of earthquakes are derived from magnitude distributions of the modern (1959-1997) catalog of the Hawaiian Volcano Observatory's seismic network, supplemented by the historic (1868-1959) catalog. Modern magnitudes are ML measured on a Wood-Anderson seismograph or MS. Historic magnitudes may add ML measured on a Milne-Shaw or Bosch-Omori seismograph or MI derived from calibrated areas of MM intensities. Active flank areas, which by far account for the highest hazard, are characterized by distributions with b slopes of about 1.0 below M 5.0 and about 0.6 above M 5.0. The kinked distribution means that large-earthquake rates would be grossly underestimated by extrapolating small-earthquake rates, and that longer catalogs are essential for estimating or verifying the rates of large earthquakes. Flank earthquakes thus follow a semicharacteristic model, which is a combination of background seismicity and an excess number of large earthquakes. Flank earthquakes are geometrically confined to rupture zones on the volcano flanks by barriers such as rift zones and the seaward edge of the volcano, which may be expressed by a magnitude
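The arithmetic behind the "kinked" magnitude-frequency distribution can be sketched directly: with a cumulative slope b = 1.0 below M 5.0 and b = 0.6 above it, the rate of large events sits well above a single-slope extrapolation from small earthquakes. The anchor rate below is an invented illustration, not a value from the study.

```python
# Bilinear (kinked) Gutenberg-Richter cumulative rates.
b1, b2, m_kink = 1.0, 0.6, 5.0
n_kink = 1.0                       # assumed: one M>=5 event per year

def annual_rate(m):
    """Cumulative events per year with magnitude >= m."""
    if m <= m_kink:
        return n_kink * 10 ** (b1 * (m_kink - m))
    return n_kink * 10 ** (-b2 * (m - m_kink))

# A single-slope (b = 1.0) extrapolation from small earthquakes would
# predict 10**-2 M>=7 events/yr; the kinked model predicts more:
print(annual_rate(7.0))                     # 10**-1.2 ~ 0.063 events/yr
print(annual_rate(7.0) / 10 ** -2.0)        # ~6.3x the extrapolated rate
```

This is why the abstract stresses that long catalogs are essential: the excess of large events over the small-earthquake trend is invisible in a short record.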

  12. Seismic microzoning in the metropolitan area of Port - au-Prince - complexity of the subsoil

    Science.gov (United States)

    Gilles, R.; Bertil, D.; Belvaux, M.; Roulle, A.; Noury, G.; Prepetit, C.; Jean-Philippe, J.

    2013-12-01

    The magnitude-7.3 earthquake that struck Haiti on January 12, 2010, caused extensive damage in the areas surrounding the epicenter. The extent of the damage partly reflects a lack of knowledge of the Haitian subsoil. To address this problem, the LNBTP, the BME and the BRGM agreed to implement a seismic microzonation project for the metropolitan area of Port-au-Prince, financed by the fund for the reconstruction of the country. Seismic microzonation is an important tool for understanding seismic risk. It is based on the collection of geological, geotechnical and geophysical data and measurements, and on a reconnaissance campaign at numerous sites. It describes classes of specific soils with their associated spectral responses. The objective of the microzonation is to identify and map zones that are homogeneous in lithology, topography, liquefaction potential and ground movement. The zoning of lithological site effects identifies and maps areas with consistent geology and geomechanics and a homogeneous seismic response; the objective is to provide, in each area, seismic motions adapted to the ground. This zoning is done in about five steps: 1- cross-analysis of geological, geotechnical and geophysical information; 2- such information comprises the existing data collected and the data acquired during the project; 3- identification of homogeneous areas; 4- definition of one or more representative soil columns associated with each zone; 5- possible consolidation of areas to obtain the final seismic zoning. After analysis of all the geological, geotechnical and geophysical data, 27 zone types were considered for the study of site effects. For example, for the Delmas formation, there are 5 areas with soil classes ranging from D to C. The soil columns described in the metropolitan area of Port-au-Prince are processed with the CyberQuake software, developed at the BRGM by Modaressi et al. in 1997, to calculate their response to seismic excitation at rock level. The seismic motion is determined by 4
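The per-zone soil-column response can be illustrated with the simplest linear-elastic analogue (all parameter values below are invented; CyberQuake performs a far more complete computation): a single soil layer over bedrock amplifies ground motion most strongly at its fundamental resonance f0 = Vs/4H, with peak amplification set by the soil/rock impedance contrast.

```python
import numpy as np

# Linear-elastic SH-wave amplification of one soil layer over a halfspace.
vs_soil, vs_rock = 250.0, 1500.0    # shear-wave velocities (m/s)
rho_soil, rho_rock = 1800.0, 2400.0  # densities (kg/m^3)
h = 30.0                             # layer thickness (m)

chi = (rho_soil * vs_soil) / (rho_rock * vs_rock)   # impedance ratio
f = np.linspace(0.1, 10, 500)
kh = 2 * np.pi * f * h / vs_soil
amp = 1.0 / np.sqrt(np.cos(kh) ** 2 + chi ** 2 * np.sin(kh) ** 2)

f0 = vs_soil / (4 * h)               # fundamental resonance
print(round(f0, 2), round(1 / chi, 1))   # ~2.08 Hz, peak amplification ~8
```

Soft, thick columns (low Vs, large H) resonate at low frequencies and amplify strongly, which is why mapping homogeneous soil columns per zone matters for the microzonation.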

  13. Early estimation of epicenter seismic intensities according to co-seismic deformation

    OpenAIRE

    Weidong, Li; Chaojun, Zhang; Dahui, Li; Jiayong, He; Huizhong, Chen; Lomnitz, Cinna

    2010-01-01

    The absolute fault displacement in co-seismic deformation is derived assuming that the location, depth, faulting mechanism and magnitude of the earthquake are known. The 2008 Wenchuan earthquake (M8.0) is used as an example to determine the distribution of seismic intensities using absolute displacement and a crustal model. We find that an early prediction of the distribution of seismic intensities after a large earthquake may be performed from the estimated absolute co-seismic displacements using...

  14. LANL seismic screening method for existing buildings

    International Nuclear Information System (INIS)

    Dickson, S.L.; Feller, K.C.; Fritz de la Orta, G.O.

    1997-01-01

    The purpose of the Los Alamos National Laboratory (LANL) Seismic Screening Method is to provide a comprehensive, rational, and inexpensive method for evaluating the relative seismic integrity of a large building inventory using substantial life-safety as the minimum goal. The substantial life-safety goal is deemed to be satisfied if the extent of structural damage or nonstructural component damage does not pose a significant risk to human life. The screening is limited to Performance Category (PC) -0, -1, and -2 buildings and structures. Because of their higher performance objectives, PC-3 and PC-4 buildings automatically fail the LANL Seismic Screening Method and will be subject to a more detailed seismic analysis. The Laboratory has also designated that PC-0, PC-1, and PC-2 unreinforced masonry bearing wall and masonry infill shear wall buildings fail the LANL Seismic Screening Method because of their historically poor seismic performance or complex behavior. These building types are also recommended for a more detailed seismic analysis. The results of the LANL Seismic Screening Method are expressed in terms of separate scores for potential configuration or physical hazards (Phase One) and calculated capacity/demand ratios (Phase Two). This two-phase method allows the user to quickly identify buildings that have adequate seismic characteristics and structural capacity and screen them out from further evaluation. The resulting scores also provide a ranking of those buildings found to be inadequate. Thus, buildings not passing the screening can be rationally prioritized for further evaluation. For the purpose of complying with Executive Order 12941, the buildings failing the LANL Seismic Screening Method are deemed to have seismic deficiencies, and cost estimates for mitigation must be prepared. Mitigation techniques and cost-estimate guidelines are not included in the LANL Seismic Screening Method
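The two-phase screening logic described above can be sketched as follows. All scores, thresholds and building data here are invented for illustration; the actual LANL method defines its own scoring rules.

```python
# Phase One: configuration/physical hazard score; Phase Two: capacity/demand
# ratio. A building passes only if both phases are acceptable; failures are
# ranked for further evaluation, worst capacity/demand first.

buildings = [
    {"id": "B-101", "hazard_score": 2, "capacity_demand": 1.4},
    {"id": "B-102", "hazard_score": 7, "capacity_demand": 0.8},
    {"id": "B-103", "hazard_score": 1, "capacity_demand": 0.6},
]

HAZARD_LIMIT = 5        # hypothetical Phase One threshold
CD_LIMIT = 1.0          # hypothetical minimum capacity/demand ratio

def passes_screening(b):
    return (b["hazard_score"] <= HAZARD_LIMIT
            and b["capacity_demand"] >= CD_LIMIT)

screened_out = [b for b in buildings if passes_screening(b)]
flagged = sorted((b for b in buildings if not passes_screening(b)),
                 key=lambda b: b["capacity_demand"])

print([b["id"] for b in flagged])   # ['B-103', 'B-102']
```

The separate scores let adequate buildings drop out quickly while the remainder are prioritized rationally, which is the stated purpose of the two-phase design.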

  15. Derivatives of Multivariate Bernstein Operators and Smoothness with Jacobi Weights

    Directory of Open Access Journals (Sweden)

    Jianjun Wang

    2012-01-01

    Full Text Available Using the modulus of smoothness, directional derivatives of multivariate Bernstein operators with weights are characterized. The obtained results partly generalize the corresponding ones for multivariate Bernstein operators without weights.

  16. Smooth surfaces from bilinear patches: Discrete affine minimal surfaces

    KAUST Repository

    Käferböck, Florian; Pottmann, Helmut

    2013-01-01

    Motivated by applications in freeform architecture, we study surfaces which are composed of smoothly joined bilinear patches. These surfaces turn out to be discrete versions of negatively curved affine minimal surfaces and share many properties

  17. Ensemble Kalman filtering with one-step-ahead smoothing

    KAUST Repository

    Raboudi, Naila F.; Ait-El-Fquih, Boujemaa; Hoteit, Ibrahim

    2018-01-01

    error statistics. This limits their representativeness of the background error covariances and, thus, their performance. This work explores the efficiency of the one-step-ahead (OSA) smoothing formulation of the Bayesian filtering problem to enhance

  18. Estimate of K-functionals and modulus of smoothness constructed ...

    Indian Academy of Sciences (India)

    2016-08-26

    functional and a modulus of smoothness for the Dunkl transform on Rd. Author Affiliations. M El Hamma1 R Daher1. Department of Mathematics, Faculty of Sciences Aïn Chock, University of Hassan II, Casablanca, Morocco. Dates.

  19. Small Smooth Units ('Young' Lavas?) Abutting Lobate Scarps on Mercury

    Science.gov (United States)

    Malliband, C. C.; Rothery, D. A.; Balme, M. R.; Conway, S. J.

    2018-05-01

    We have identified small units abutting, and so stratigraphically younger than, lobate scarps. This postdates the end of large-scale smooth plains formation at the onset of global contraction. This elaborates the history of volcanism on Mercury.

  20. Carrier tracking by smoothing filter improves symbol SNR

    Science.gov (United States)

    Pomalaza-Raez, Carlos A.; Hurd, William J.

    1986-01-01

    The potential benefit of using a smoothing filter to estimate carrier phase, rather than a phase-locked loop (PLL), is determined. Numerical results are presented for the performance of three possible configurations of the Deep Space Network advanced receiver: a residual carrier PLL, a sideband-aided residual carrier PLL, and sideband aiding combined with a Kalman smoother. The average symbol signal-to-noise ratio (SNR) after losses due to carrier phase estimation error is computed for different total power SNRs, symbol rates and symbol SNRs. It is found that smoothing is most beneficial for low symbol SNRs and low symbol rates. Smoothing gains up to 0.4 dB over a sideband-aided residual carrier PLL, and the combined benefit of smoothing and sideband aiding relative to a residual carrier loop is often in excess of 1 dB.
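The mechanism behind the smoothing gain can be illustrated with a toy model (not the DSN receiver; dynamics and noise levels are invented): a random-walk carrier phase is tracked with a scalar Kalman filter, then refined with a Rauch-Tung-Striebel (RTS) fixed-interval smoother. Because the smoother also uses future measurements, its phase-error variance is lower than the filter's, which is what reduces the loss in symbol SNR.

```python
import numpy as np

rng = np.random.default_rng(0)
n, q, r = 500, 1e-4, 1e-2            # steps, process var, measurement var
phase = np.cumsum(np.sqrt(q) * rng.standard_normal(n))   # true phase walk
z = phase + np.sqrt(r) * rng.standard_normal(n)          # noisy phase obs

# Forward Kalman filter (state = phase, random-walk model, F = 1)
xf = np.zeros(n); pf = np.zeros(n)
x, p = 0.0, 1.0
for k in range(n):
    p += q                           # predict
    K = p / (p + r)                  # Kalman gain
    x += K * (z[k] - x)              # measurement update
    p *= (1 - K)
    xf[k], pf[k] = x, p

# Backward RTS smoothing pass (one-step prediction equals xf[k] for F = 1)
xs = xf.copy()
for k in range(n - 2, -1, -1):
    g = pf[k] / (pf[k] + q)          # smoother gain
    xs[k] = xf[k] + g * (xs[k + 1] - xf[k])

mse_f = np.mean((xf - phase) ** 2)
mse_s = np.mean((xs - phase) ** 2)
# Smoothing reduces the phase-error MSE relative to filtering alone.
```

The gain is largest when the phase wanders slowly relative to the measurement noise, consistent with the finding that smoothing helps most at low symbol SNRs and low symbol rates.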