WorldWideScience

Sample records for resolution frequency analysis

  1. The frequency analysis particle resolution technique of 6LiI(Eu) scintillation detector

    International Nuclear Information System (INIS)

    Duan Shaojie

    1995-01-01

    To measure the distribution and rate of tritium production by neutrons in a 6LiD sphere, a 6LiI(Eu) scintillation detector was used. In the measurement, the frequency analysis particle resolution technique was applied. The experiment was completed successfully.

  2. Preliminary frequency-domain analysis for the reconstructed spatial resolution of muon tomography

    Science.gov (United States)

    Yu, B.; Zhao, Z.; Wang, X.; Wang, Y.; Wu, D.; Zeng, Z.; Zeng, M.; Yi, H.; Luo, Z.; Yue, X.; Cheng, J.

    2014-11-01

    Muon tomography is an advanced technology for non-destructively detecting high atomic number materials. It exploits the multiple Coulomb scattering information of muons to reconstruct the scattering density image of the traversed object. Because of the statistics of muon scattering, the measurement error of the system and the incompleteness of the data, the reconstruction is always accompanied by a certain level of interference, which influences the reconstructed spatial resolution. While statistical noise can be reduced by extending the measuring time, system parameters determine the ultimate spatial resolution that a system can reach. In this paper, an effective frequency-domain model is proposed to analyze the reconstructed spatial resolution of muon tomography. The proposed method modifies the resolution analysis of conventional computed tomography (CT) to fit the different imaging mechanism of muon scattering tomography. The measured scattering information is described in the frequency domain, and a relationship between the measurements and the original image is derived in the Fourier domain, which is named the "Muon Central Slice Theorem". Furthermore, a preliminary analytical expression of the ultimate reconstructed spatial resolution is derived, and simulations are performed for validation. While the method is able to predict the ultimate spatial resolution of a given system, it can also be utilized for the optimization of system design and construction.

  3. Preliminary frequency-domain analysis for the reconstructed spatial resolution of muon tomography

    International Nuclear Information System (INIS)

    Yu, B.; Zhao, Z.; Wang, X.; Wang, Y.; Wu, D.; Zeng, Z.; Zeng, M.; Yi, H.; Luo, Z.; Yue, X.; Cheng, J.

    2014-01-01

    Muon tomography is an advanced technology for non-destructively detecting high atomic number materials. It exploits the multiple Coulomb scattering information of muons to reconstruct the scattering density image of the traversed object. Because of the statistics of muon scattering, the measurement error of the system and the incompleteness of the data, the reconstruction is always accompanied by a certain level of interference, which influences the reconstructed spatial resolution. While statistical noise can be reduced by extending the measuring time, system parameters determine the ultimate spatial resolution that a system can reach. In this paper, an effective frequency-domain model is proposed to analyze the reconstructed spatial resolution of muon tomography. The proposed method modifies the resolution analysis of conventional computed tomography (CT) to fit the different imaging mechanism of muon scattering tomography. The measured scattering information is described in the frequency domain, and a relationship between the measurements and the original image is derived in the Fourier domain, which is named the "Muon Central Slice Theorem". Furthermore, a preliminary analytical expression of the ultimate reconstructed spatial resolution is derived, and simulations are performed for validation. While the method is able to predict the ultimate spatial resolution of a given system, it can also be utilized for the optimization of system design and construction.

  4. Time-frequency analysis with temporal and spectral resolution as the human auditory system

    DEFF Research Database (Denmark)

    Agerkvist, Finn T.

    1992-01-01

    The human perception of sound is a suitable area for the application of a simultaneous time-frequency analysis, since the ear is selective in both domains. A perfect reconstruction filter bank with bandwidths approximating the critical bands is presented. The orthogonality of the filter makes it possible to examine the masking effect with realistic signals. The tree structure of the filter bank makes it difficult to obtain well-attenuated stop-bands. The use of filters of different length solves this problem...

  5. High-resolution numerical model of the middle and inner ear for a detailed analysis of radio frequency absorption

    International Nuclear Information System (INIS)

    Schmid, Gernot; Ueberbacher, Richard; Samaras, Theodoros; Jappel, Alexandra; Baumgartner, Wolf-Dieter; Tschabitscher, Manfred; Mazal, Peter R

    2007-01-01

    In order to enable a detailed analysis of radio frequency (RF) absorption in the human middle and inner ear organs, a numerical model of these organs was developed at a spatial resolution of 0.1 mm, based on a real human tissue sample. The dielectric properties of the liquids (perilymph and endolymph) inside the bony labyrinth were measured on samples of ten freshly deceased humans. After inserting this model into a commercially available numerical head model, FDTD-based computations for exposure scenarios with generic models of handheld devices operated close to the head in the frequency range 400-3700 MHz were carried out. For typical output power values of real handheld mobile communication devices, the obtained results showed only very small amounts of absorbed RF power in the middle and inner ear organs. The highest absorption in the middle and inner ear was found for the 400 MHz irradiation. In this case, the RF power absorbed inside the labyrinth and the vestibulocochlear nerve was as low as 166 μW and 12 μW, respectively, when considering a device of 500 mW output power operated close to the ear. For typical mobile phone frequencies (900 MHz and 1850 MHz) and output power values (250 mW and 125 mW), the corresponding values of absorbed RF power were found to be more than one order of magnitude lower than the values given above. These results indicate that temperature-related biologically relevant effects on the middle and inner ear, induced by the RF emissions of typical handheld mobile communication devices, are unlikely.

  6. Time-Frequency Feature Representation Using Multi-Resolution Texture Analysis and Acoustic Activity Detector for Real-Life Speech Emotion Recognition

    Directory of Open Access Journals (Sweden)

    Kun-Ching Wang

    2015-01-01

    Full Text Available The classification of emotional speech is mostly considered in speech-related research on human-computer interaction (HCI). In this paper, the purpose is to present a novel feature extraction based on multi-resolution texture image information (MRTII). The MRTII feature set is derived from multi-resolution texture analysis for the characterization and classification of different emotions in a speech signal. The motivation is that emotions have different intensity values in different frequency bands. In terms of human visual perception, the texture property of the multi-resolution spectrogram of emotional speech should be a good feature set for emotion classification in speech. Furthermore, multi-resolution analysis of texture can give a clearer discrimination between emotions than uniform-resolution analysis. In order to provide high accuracy of emotional discrimination, especially in real life, an acoustic activity detection (AAD) algorithm is applied in the MRTII-based feature extraction. Considering the presence of many blended emotions in real life, this paper makes use of two corpora of naturally-occurring dialogs recorded in real-life call centers. Compared with the traditional Mel-scale Frequency Cepstral Coefficients (MFCC) and state-of-the-art features, the MRTII features also improve the correct classification rates of the proposed systems across different language databases. Experimental results show that the proposed MRTII-based feature information, inspired by human visual perception of the spectrogram image, can provide significant classification performance for real-life emotion recognition in speech.

  7. High resolution mid-infrared spectroscopy based on frequency upconversion

    DEFF Research Database (Denmark)

    Dam, Jeppe Seidelin; Hu, Qi; Tidemand-Lichtenberg, Peter

    2013-01-01

    ... signals can be analyzed. The obtainable frequency resolution is usually in the nm range, whereas sub-nm resolution is preferred in many applications, like gas spectroscopy. In this work we demonstrate how to obtain sub-nm resolution when using upconversion. In the presented realization one object point ... high resolution spectral performance by observing emission from hot water vapor in a butane gas burner...

  8. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.

  9. Resonance frequency analysis

    Directory of Open Access Journals (Sweden)

    Rajiv K Gupta

    2011-01-01

    Full Text Available Initial stability at placement and the development of osseointegration are two major issues for implant survival. Implant stability is a mechanical phenomenon which is related to the local bone quality and quantity, the type of implant, and the placement technique used. The application of a simple, clinically applicable, non-invasive test to assess implant stability and osseointegration is considered highly desirable. Resonance frequency analysis (RFA) is one such technique which is most frequently used nowadays. The aim of this paper was to review and critically analyze the currently available literature in the field of RFA, and to discuss, based on scientific evidence, the prognostic value of RFA to detect implants at risk of failure. A search was made using the PubMed database to find all the literature published on "Resonance frequency analysis for implant stability" to date. Articles discussing in vivo or in vitro studies comparing RFA with other methods of implant stability measurement and articles discussing its reliability were thoroughly reviewed and discussed. A limited number of clinical reports were found. Various studies have demonstrated the feasibility and predictability of the technique. However, most of these articles are based on retrospective data or uncontrolled cases. Randomized, prospective, parallel-armed longitudinal human trials are based on short-term results, and long-term follow-up data are still scarce in this field. Nonetheless, from the available literature, it may be concluded that the RFA technique evaluates implant stability as a function of the stiffness of the implant-bone interface and is influenced by factors such as bone type and exposed implant height above the alveolar crest. Resonance frequency analysis could serve as a non-invasive diagnostic tool for assessing the stability of dental implants during the healing stages and in subsequent routine follow-up care after treatment. Future studies, preferably randomized

  10. Collective Thomson scattering measurements with high frequency resolution at TEXTOR

    DEFF Research Database (Denmark)

    Stejner Pedersen, Morten; Nielsen, Stefan Kragh; Korsholm, Søren Bang

    2010-01-01

    We discuss the development and first results of a receiver system for the collective Thomson scattering (CTS) diagnostic at TEXTOR with frequency resolution in the megahertz range or better. The improved frequency resolution expands the diagnostic range and utility of CTS measurements in general ... and is a prerequisite for measurements of ion Bernstein wave signatures in CTS spectra. The first results from the new acquisition system are shown to be consistent with theory and with simultaneous measurements by the standard receiver system. © 2010 EURATOM...

  11. Resolution analysis by random probing

    NARCIS (Netherlands)

    Fichtner, Andreas; van Leeuwen, T.

    2015-01-01

    We develop and apply methods for resolution analysis in tomography, based on stochastic probing of the Hessian or resolution operators. Key properties of our methods are (i) low algorithmic complexity and easy implementation, (ii) applicability to any tomographic technique, including full‐waveform
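
    The record above mentions stochastic probing of the Hessian or resolution operators. As a rough illustration of the idea (not the authors' exact algorithm), the sketch below uses Hutchinson-style random probes to estimate the diagonal of a resolution operator that is only available as a matrix-vector product; the toy operator, probe count and damping value are illustrative assumptions.

```python
import numpy as np

def probe_resolution_diagonal(apply_R, n, n_probes=64, rng=None):
    """Estimate diag(R) of a resolution operator R by Hutchinson-style random
    probing: E[z * (R z)] = diag(R) for independent +/-1 (Rademacher) probes."""
    rng = np.random.default_rng(rng)
    acc = np.zeros(n)
    for _ in range(n_probes):
        z = rng.choice([-1.0, 1.0], size=n)   # random probe vector
        acc += z * apply_R(z)                 # element-wise product accumulates diag estimates
    return acc / n_probes

# Toy example: here R is an explicit damped resolution matrix; in tomography
# apply_R would instead wrap a matrix-free forward/adjoint solve.
A = np.random.default_rng(0).normal(size=(80, 50))
R = np.linalg.solve(A.T @ A + 0.1 * np.eye(50), A.T @ A)
est = probe_resolution_diagonal(lambda v: R @ v, n=50, n_probes=200)
print(np.max(np.abs(est - np.diag(R))))       # probing error shrinks with more probes
```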

  12. On temporal correlations in high-resolution frequency counting

    OpenAIRE

    Dunker, Tim; Hauglin, Harald; Rønningen, Ole Petter

    2016-01-01

    We analyze noise properties of time series of frequency data from different counting modes of a Keysight 53230A frequency counter. We use a 10 MHz reference signal from a passive hydrogen maser connected via phase-stable Huber+Suhner Sucoflex 104 cables to the reference and input connectors of the counter. We find that the high resolution gap-free (CONT) frequency counting process imposes long-term correlations in the output data, resulting in a modified Allan deviation that is characteristic...
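
    The record above characterizes counter noise through the (modified) Allan deviation. As a hedged sketch of the underlying computation, the code below implements the standard overlapping Allan deviation on simulated fractional-frequency samples; the modified Allan deviation used in the paper additionally averages the phase within each window, and the sampling interval, noise level and averaging factors here are illustrative assumptions.

```python
import numpy as np

def overlapping_adev(y, tau0, m_list):
    """Overlapping Allan deviation from fractional-frequency samples y taken at
    interval tau0, for the averaging factors in m_list."""
    y = np.asarray(y, dtype=float)
    out = []
    for m in m_list:
        ybar = np.convolve(y, np.ones(m) / m, mode="valid")  # m-sample averages at every offset
        d = ybar[m:] - ybar[:-m]                             # differences of adjacent averages
        out.append((m * tau0, np.sqrt(0.5 * np.mean(d ** 2))))
    return out

# White frequency noise should give ADEV ~ tau**(-1/2); correlations imposed by
# a gap-free counting mode would bend this slope.
rng = np.random.default_rng(1)
y = rng.normal(scale=1e-12, size=100_000)       # simulated fractional-frequency data
for tau, adev in overlapping_adev(y, tau0=1.0, m_list=[1, 10, 100, 1000]):
    print(f"tau = {tau:7.0f} s   ADEV = {adev:.2e}")
```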

  13. High-resolution melting analysis, a simple and effective method for reliable mutation scanning and frequency studies in the ACADVL gene

    DEFF Research Database (Denmark)

    Olsen, Rikke Katrine Jentoft; Dobrowolski, Steven F; Kjeldsen, Margrethe

    2010-01-01

    ... very-long-chain acyl-CoA dehydrogenase deficiency (VLCADD), the second most common fatty acid oxidation disorder detected by expanded newborn screening, to demonstrate accurate and fast diagnostic evaluation of the ACADVL gene utilizing DNA extracted from the newborn screening dried blood spot and high resolution melt...

  14. Extended Opacity Tables with Higher Temperature-Density-Frequency Resolution

    Science.gov (United States)

    Schillaci, Mark; Orban, Chris; Delahaye, Franck; Pinsonneault, Marc; Nahar, Sultana; Pradhan, Anil

    2015-05-01

    Theoretical models for plasma opacities underpin our understanding of radiation transport in many different astrophysical objects. These opacity models are also relevant to HEDP experiments such as ignition scale experiments on NIF. We present a significantly expanded set of opacity data from the widely utilized Opacity Project, and make these higher resolution data publicly available through OSU's portal with dropbox.com. This expanded data set is used to assess how accurate the interpolation of opacity data in temperature-density-frequency dimensions must be in order to adequately model the properties of most stellar types. These efforts are the beginning of a larger project to improve the theoretical opacity models in light of experimental results at the Sandia Z-pinch showing that the measured opacity of iron disagrees strongly with all current models.

  15. Quantitative subsurface analysis using frequency modulated thermal wave imaging

    Science.gov (United States)

    Subhani, S. K.; Suresh, B.; Ghali, V. S.

    2018-01-01

    Quantitative depth analysis of a subsurface anomaly with enhanced depth resolution is a challenging task in thermographic depth estimation. Frequency modulated thermal wave imaging, introduced earlier, provides complete depth scanning of the object by stimulating it with a suitable band of frequencies and then analyzing the subsequent thermal response with a suitable post-processing approach to resolve subsurface details. However, the conventional Fourier transform based methods used for post-processing unscramble the frequencies with a limited frequency resolution and therefore yield only a finite depth resolution. The spectral zooming provided by the chirp z-transform affords enhanced frequency resolution, which can further improve the depth resolution and axially resolve the finest subsurface features. Quantitative depth analysis with this augmented depth resolution is proposed to provide a close estimate of the actual depth of the subsurface anomaly. This manuscript experimentally validates the enhanced depth resolution using non-stationary thermal wave imaging and offers a first, unique solution for quantitative depth estimation in frequency modulated thermal wave imaging.
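
    The abstract above attributes the improved depth resolution to the spectral zooming of the chirp z-transform. The sketch below shows the generic zoom idea on a synthetic single-tone signal, assuming scipy.signal.czt (available in SciPy 1.8 or later); the band limits, sampling rate and tone frequency are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.signal import czt   # chirp z-transform, available in SciPy >= 1.8

fs = 1000.0                                      # sampling rate (Hz), illustrative
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2 * np.pi * 50.3 * t)                 # tone that falls between 1 Hz FFT bins

# Zoom the spectrum into the 48-53 Hz band, evaluated at M finely spaced points.
f1, f2, M = 48.0, 53.0, 500
w = np.exp(-2j * np.pi * (f2 - f1) / (M * fs))   # ratio between adjacent evaluation points
a = np.exp(2j * np.pi * f1 / fs)                 # starting point on the unit circle
X = czt(x, m=M, w=w, a=a)
freqs = f1 + (f2 - f1) * np.arange(M) / M

print(freqs[np.argmax(np.abs(X))])               # ~50.3 Hz on the 0.01 Hz zoomed grid
coarse = np.fft.rfftfreq(t.size, 1 / fs)
print(coarse[np.argmax(np.abs(np.fft.rfft(x)))]) # 50.0 Hz on the plain 1 Hz FFT grid
```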

  16. a Thz Photomixing Synthesizer Based on a Fiber Frequency Comb for High Resolution Rotational Spectroscopy

    Science.gov (United States)

    Hindle, Francis; Mouret, Gael; Cuisset, Arnaud; Yang, Chun; Eliet, Sophie; Bocquet, Robin

    2010-06-01

    To date the principal application of photomixing sources has been high resolution spectroscopy of gases, due to their large tuning range and spectral purity. New developments of the opto-electronic THz spectrometer have been carried out in order to obtain a powerful tool for high-resolution spectroscopy. The combination of two extended cavity laser diodes and fast charge-carrier-lifetime semiconductor materials has allowed a continuous-wave THz spectrometer to be constructed based on optical heterodyning. Unlike many THz sources, this instrument gives access to all frequencies in the range 0.3 to 3.5 THz with a resolution of 1 MHz. The main spectroscopic applications of this spectrometer were dedicated to line profile analysis of rotational transitions referenced in the spectroscopic databases. One limitation of the THz spectrometer was the accuracy with which the generated frequency is known. Recently, this obstacle has been circumvented with the construction of a photomixing spectrometer in which the two pump lasers are phase locked to two modes of a repetition-rate-stabilized, frequency-doubled fiber laser frequency comb. In order to achieve a tuning range in excess of 100 MHz, a third cw laser was required in the new configuration of the THz spectrometer. To assess the performance of this instrument, the frequencies of the pure rotational transitions of OCS molecules have been measured between 0.8 and 1.2 THz. An rms deviation below 100 kHz, deduced from the measured frequencies, demonstrates that the THz photomixing synthesizer is now competitive with microwave and submillimeter techniques. S. Matton, F. Rohart, R. Bocquet, D. Bigourd, A. Cuisset, F. Hindle, G. Mouret, J. Mol. Spectrosc., 2006, 239: 182. C. Yang, J. Buldyreva, I. E. Gordon, F. Rohart, A. Cuisset, G. Mouret, R. Bocquet, F. Hindle, J. Quant. Spectrosc. Radiat. Transfer, 2008, 109: 2857. G. Mouret, F. Hindle, A. Cuisset, C. Yang, R. Bocquet, M. Lours, D. Rovera, Opt. Express, 2009, 17: 22031.

  17. Stepped-frequency radar sensors theory, analysis and design

    CERN Document Server

    Nguyen, Cam

    2016-01-01

    This book presents the theory, analysis and design of microwave stepped-frequency radar sensors. Stepped-frequency radar sensors are attractive for various sensing applications that require fine resolution. The book consists of five chapters. The first chapter describes the fundamentals of radar sensors including applications followed by a review of ultra-wideband pulsed, frequency-modulated continuous-wave (FMCW), and stepped-frequency radar sensors. The second chapter discusses a general analysis of radar sensors including wave propagation in media and scattering on targets, as well as the radar equation. The third chapter addresses the analysis of stepped-frequency radar sensors including their principles and design parameters. Chapter 4 presents the development of two stepped-frequency radar sensors at microwave and millimeter-wave frequencies based on microwave integrated circuits (MICs), microwave monolithic integrated circuits (MMICs) and printed-circuit antennas, and discusses their signal processing....

  18. Comparative study between ultrahigh spatial frequency algorithm and high spatial frequency algorithm in high-resolution CT of the lungs

    International Nuclear Information System (INIS)

    Oh, Yu Whan; Kim, Jung Kyuk; Suh, Won Hyuck

    1994-01-01

    To date, the high spatial frequency algorithm (HSFA), which reduces image smoothing and increases spatial resolution, has been used for the evaluation of parenchymal lung diseases in thin-section high-resolution CT. In this study, we compared the ultrahigh spatial frequency algorithm (UHSFA) with the high spatial frequency algorithm in the assessment of thin-section images of the lung parenchyma. Three radiologists compared the UHSFA and HSFA on identical CT images in a line-pair resolution phantom, one lung specimen, 2 patients with normal lungs and 18 patients with abnormal lung parenchyma. Scanning of a line-pair resolution phantom demonstrated no difference in resolution between the two techniques, but it showed that the outer lines of the line pairs with maximal resolution looked thicker on UHSFA than on HSFA. Lung parenchymal detail with UHSFA was judged equal or superior to HSFA in 95% of images. Lung parenchymal sharpness was improved with UHSFA in all images. Although UHSFA resulted in an increase in visible noise, observers did not find that image noise interfered with image interpretation. The visual CT attenuation of normal lung parenchyma is minimally increased in images with HSFA. The overall visual preference of the images reconstructed with UHSFA was considered equal to or greater than that of those reconstructed with HSFA in 78% of images. The ultrahigh spatial frequency algorithm improved the overall visual quality of the images in pulmonary parenchymal high-resolution CT.

  19. Triple-Frequency GPS Precise Point Positioning Ambiguity Resolution Using Dual-Frequency Based IGS Precise Clock Products

    Directory of Open Access Journals (Sweden)

    Fei Liu

    2017-01-01

    Full Text Available With the availability of the third civil signal in the Global Positioning System, triple-frequency Precise Point Positioning (PPP) ambiguity resolution methods have drawn increasing attention due to their significantly reduced convergence time. However, the corresponding triple-frequency based precise clock products are not widely available or adopted by applications. Currently, most precise products are generated based on the ionosphere-free combination of dual-frequency L1/L2 signals, which are not consistent with the triple-frequency ionosphere-free carrier-phase measurements, resulting in inaccurate positioning and unstable float ambiguities. In this study, a GPS triple-frequency PPP ambiguity resolution method is developed using the widely used dual-frequency based clock products. In this method, the inter-frequency clock biases between the triple-frequency and dual-frequency ionosphere-free carrier-phase measurements are first estimated and then applied to the triple-frequency ionosphere-free carrier-phase measurements to obtain stable float ambiguities. After this, the integer properties of the wide-lane L2/L5 and wide-lane L1/L2 ambiguities are recovered by estimating the satellite fractional cycle biases. A test using a sparse network is conducted to verify the effectiveness of the method. The results show that ambiguity resolution can be achieved in minutes, or even tens of seconds, and that the positioning accuracy is at the decimeter level.

  20. Terahertz Harmonic Operation of Microwave Fresnel Zone Plate Lens and Antenna: Frequency Filtering and Space Resolution Properties

    Directory of Open Access Journals (Sweden)

    Hristo D. Hristov

    2011-01-01

    Full Text Available This paper examines the binary Fresnel zone plate (FZP) lens frequency-harmonic and space-resolution focusing, and its application as a FZP lens antenna. A microwave FZP lens antenna (FZPA) radiates both at the design frequency (90 GHz) and at terahertz (THz) odd harmonic frequencies. Frequency- and space-domain antenna operation are studied analytically by use of the vector diffraction integral applied to a realistic printed FZPA. It is found that all harmonic gain peaks are roughly identical in form, bandwidth, and top values. At each harmonic frequency, the FZPA has a beamwidth that closely follows the Rayleigh resolution criterion. If the lens/antenna resolution is of prime importance and the small aperture efficiency is a secondary problem, the microwave-design FZP lens antenna can be of great use at much higher terahertz frequencies. An important feature of the microwave FZP lens is its broader-zone construction compared to a terahertz-design FZP lens of equal resolution. Thus, unique and expensive microtechnology for the microwave FZP lens fabrication is not required. High-order harmonic operation of the FZP lens or lens antenna could find space-resolution and frequency-filtering applications in terahertz and optical metrology, imaging tomography, short-range communications, spectral analysis, synchrotron facilities, and so on.

  1. Two high-frequency mutual inductance bridges with high resolution

    NARCIS (Netherlands)

    Flokstra, Jakob; Gerritsma, G.J.; Kreuwel, H.J.M.; van der Marel, L.C.

    1980-01-01

    Two mutual inductance bridges are described for operation up to about 100 kHz. Special attention is paid to the sensitivity and resolution of the bridges. Both bridges can be used to measure variations of about 10 pH in the mutual inductance. The first bridge consists of passive elements only

  2. Optical frequency comb for high resolution hydrogen spectroscopy

    International Nuclear Information System (INIS)

    Arnoult, O.

    2006-11-01

    In this work, we perform an absolute frequency measurement of the 1S-3S transition in atomic hydrogen, in order to improve the uncertainties on both the Rydberg constant and the Lamb shift L1S. In the experiment, a CW stabilized Ti:Sa laser is doubled twice in LBO (LiB3O5) and BBO (β-BaB2O4) crystals. The 1S-3S transition is excited by two photons at 205 nm in an optical cavity collinear with the atomic beam, at room temperature. The remaining second-order Doppler effect is compensated by a quadratic Stark effect resulting from an applied static magnetic field. An optical frequency comb is used to compare the Ti:Sa frequency directly with the microwave frequency standard. We detect fluorescence at 656 nm with a CCD camera. Fitting the experimental data with our calculated line shapes leads to a value of the second-order Doppler effect in disagreement with approximate predictions for the 1S-3S frequency. We suggest the existence of stray electric fields as a possible systematic effect. The slides of the thesis defence have been added at the end of the document. (author)

  3. Small displacement measurements with subatomic resolution by beat frequency measurements

    Czech Academy of Sciences Publication Activity Database

    Číp, Ondřej; Petrů, František; Buchta, Zdeněk; Lazar, Josef

    2007-01-01

    Roč. 18, č. 7 (2007), s. 2005-2013 ISSN 0957-0233 R&D Projects: GA AV ČR KJB200650503; GA MŠk(CZ) LC06007; GA ČR GA102/07/1179; GA MPO FT-TA3/133 Institutional research plan: CEZ:AV0Z20650511 Keywords : high-resolution interferometry * nanometrology Subject RIV: BH - Optics, Masers, Lasers Impact factor: 1.297, year: 2007

  4. Time and Frequency Localized Pulse Shape for Resolution Enhancement in STFT-BOTDR

    Directory of Open Access Journals (Sweden)

    Linqing Luo

    2016-01-01

    Full Text Available Short-Time Fourier Transform Brillouin Optical Time-Domain Reflectometry (STFT-BOTDR) implements the STFT over the full frequency spectrum to measure the distributed temperature and strain along the optic fiber, providing new research advances in dynamic distributed sensing. The spatial and frequency resolution of the dynamic sensing are limited by the Signal to Noise Ratio (SNR) and the Time-Frequency (T-F) localization of the input pulse shape. T-F localization is fundamentally important for communication systems, where it suppresses inter-channel interference (ICI) and inter-symbol interference (ISI) to improve the transmission quality in multi-carrier modulation (MCM). This paper demonstrates that a T-F localized input pulse shape can enhance the SNR and the spatial and frequency resolution in STFT-BOTDR. Simulations and experiments with different T-F localized pulse shapes are conducted to compare the limits of the system resolution. The results indicate that a rectangular pulse should be selected to optimize the spatial resolution and a Lorentzian pulse could be chosen to optimize the frequency resolution, while a Gaussian-shaped pulse can be used in general applications for its balanced performance in both spatial and frequency resolution. Meanwhile, T-F localization is shown to be useful in pulse shape selection for system resolution optimization.

  5. Edge Detection from High Resolution Remote Sensing Images using Two-Dimensional log Gabor Filter in Frequency Domain

    International Nuclear Information System (INIS)

    Wang, K; Yu, T; Meng, Q Y; Wang, G K; Li, S P; Liu, S H

    2014-01-01

    Edges are vital features for describing the structural information of images, especially high spatial resolution remote sensing images. Edge features can be used to define the boundaries between different ground objects in high spatial resolution remote sensing images, so edge detection is important in remote sensing image processing. Even though many different edge detection algorithms have been proposed, it is difficult to extract edge features from high spatial resolution remote sensing images containing complex ground objects. This paper introduces a novel method to detect edges in high spatial resolution remote sensing images based on the frequency domain. Firstly, the high spatial resolution remote sensing image is Fourier transformed by FFT to obtain the magnitude spectrum image (frequency image). Then, the frequency spectrum is analyzed by sampling in radius and angle. Finally, a two-dimensional log Gabor filter with optimal parameters is designed according to the result of the spectrum analysis, and the dot product of the Fourier transform of the image with the log Gabor filter is inverse Fourier transformed to obtain the edge detections. The experimental result shows that the proposed algorithm can effectively detect edge features in high resolution remote sensing images.
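
    As a hedged illustration of the frequency-domain filtering step described above (not the authors' parameter choices), the sketch below constructs a 2-D log-Gabor filter and applies it to an image via the FFT; the centre frequency, bandwidth ratio, orientation and the random stand-in image are assumptions for demonstration only.

```python
import numpy as np

def log_gabor_filter(shape, f0=0.15, sigma_ratio=0.65, theta0=0.0, sigma_theta=np.pi / 6):
    """Frequency-domain 2-D log-Gabor filter: a log-Gaussian radial profile
    centered on normalized frequency f0, times a Gaussian angular profile
    centered on orientation theta0."""
    rows, cols = shape
    fy = np.fft.fftshift(np.fft.fftfreq(rows))
    fx = np.fft.fftshift(np.fft.fftfreq(cols))
    FX, FY = np.meshgrid(fx, fy)
    radius = np.hypot(FX, FY)
    radius[rows // 2, cols // 2] = 1.0          # avoid log(0) at DC
    radial = np.exp(-(np.log(radius / f0) ** 2) / (2 * np.log(sigma_ratio) ** 2))
    radial[rows // 2, cols // 2] = 0.0          # zero response at DC
    theta = np.arctan2(FY, FX)
    dtheta = np.arctan2(np.sin(theta - theta0), np.cos(theta - theta0))
    angular = np.exp(-(dtheta ** 2) / (2 * sigma_theta ** 2))
    return radial * angular

# Apply the filter to an image entirely in the frequency domain.
img = np.random.default_rng(2).random((256, 256))        # stand-in for an image tile
H = log_gabor_filter(img.shape)
spectrum = np.fft.fftshift(np.fft.fft2(img))
edges = np.abs(np.fft.ifft2(np.fft.ifftshift(spectrum * H)))
```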

  6. Digital timing: sampling frequency, anti-aliasing filter and signal interpolation filter dependence on timing resolution

    International Nuclear Information System (INIS)

    Cho, Sanghee; Grazioso, Ron; Zhang Nan; Aykac, Mehmet; Schmand, Matthias

    2011-01-01

    The main focus of our study is to investigate how the performance of digital timing methods is affected by the sampling rate, anti-aliasing and signal interpolation filters. We used the Nyquist sampling theorem to address some basic questions, such as: what is the minimum sampling frequency? How accurate will the signal interpolation be? How do we validate the timing measurements? The preferred sampling rate would be as low as possible, considering the high cost and power consumption of high-speed analog-to-digital converters. However, when the sampling rate is too low, the aliasing effect produces artifacts in the timing resolution estimations: the shape of the timing profile is distorted and the FWHM values of the profile fluctuate as the source location changes. Anti-aliasing filters are required in this case to avoid the artifacts, but the timing is degraded as a result. When the sampling rate is only marginally above the Nyquist rate, proper signal interpolation is important. A sharp roll-off (higher order) filter is required to separate the baseband signal from its replicates to avoid aliasing, but in return the computation cost is higher. We demonstrate the analysis through a digital timing study using fast LSO scintillation crystals as used in time-of-flight PET scanners. From the study, we observed that there is no significant timing resolution degradation down to a 1.3 GHz sampling frequency, and the computation requirement for the signal interpolation is reasonably low. A so-called sliding test is proposed as a validation tool for checking the constant timing resolution behavior of a given timing pick-off method regardless of source location changes. Lastly, a performance comparison of several digital timing methods is also shown.
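
    To make the interpolation step concrete, the sketch below band-limit-interpolates a coarsely sampled pulse with scipy.signal.resample and reads off a sub-sample timing estimate; the Gaussian pulse shape, 1.3 GHz rate and interpolation factor are illustrative assumptions, not the study's actual processing chain.

```python
import numpy as np
from scipy.signal import resample

fs = 1.3e9                                    # sampling rate close to the value quoted above
t = np.arange(0, 200e-9, 1 / fs)
t0, sigma = 63.7e-9, 5e-9                     # assumed true pulse position and width
pulse = np.exp(-0.5 * ((t - t0) / sigma) ** 2)

up = 16                                       # interpolation factor
pulse_up = resample(pulse, up * pulse.size)   # FFT-based band-limited interpolation
t_up = np.arange(pulse_up.size) / (up * fs)

peak = t_up[np.argmax(pulse_up)]              # sub-sample estimate of the pulse time
print((peak - t0) * 1e12, "ps")               # residual error, well below one raw sample (~770 ps)
```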

  7. Advanced Time-Frequency Representation in Voice Signal Analysis

    Directory of Open Access Journals (Sweden)

    Dariusz Mika

    2018-03-01

    Full Text Available The most commonly used time-frequency representation in voice signal analysis is the spectrogram. This representation belongs to Cohen's class, the class of time-frequency energy distributions. From the standpoint of resolution properties, the spectrogram is not optimal; representations with better resolution properties are known within Cohen's class. All of them are created by smoothing the Wigner-Ville distribution (WVD), which is characterized by the best resolution but also the strongest harmful interference. The smoothing functions used determine the compromise between resolution properties and the elimination of the harmful interference terms. Another class of time-frequency energy distributions is the affine class. From the point of view of the readability of the analysis, the best properties are obtained by the redistribution of energy achieved with a general methodology referred to as reassignment, which can be applied to any time-frequency representation. Reassigned distributions efficiently combine a reduction of the interference terms, provided by a well adapted smoothing kernel, with an increased concentration of the signal components.

  8. Recent developments in time-frequency analysis

    CERN Document Server

    Loughlin, Patrick

    1998-01-01

    Recent Developments in Time-Frequency Analysis brings together in one place important contributions and up-to-date research results in this fast moving area. Recent Developments in Time-Frequency Analysis serves as an excellent reference, providing insight into some of the most challenging research issues in the field.

  9. Resolution analysis in full waveform inversion

    NARCIS (Netherlands)

    Fichtner, A.; Trampert, J.

    2011-01-01

    We propose a new method for the quantitative resolution analysis in full seismic waveform inversion that overcomes the limitations of classical synthetic inversions while being computationally more efficient and applicable to any misfit measure. The method rests on (1) the local quadratic

  10. Impact of the displacement current on low-frequency electromagnetic fields computed using high-resolution anatomy models

    International Nuclear Information System (INIS)

    Barchanski, A; Gersem, H de; Gjonaj, E; Weiland, T

    2005-01-01

    We present a comparison of simulated low-frequency electromagnetic fields in the human body, calculated by means of the electro-quasistatic formulation. The geometrical data in these simulations were provided by an anatomically realistic, high-resolution human body model, while the dielectric properties of the various body tissues were modelled by the parametric Cole-Cole equation. The model was examined under two different excitation sources and various spatial resolutions in a frequency range from 10 Hz to 1 MHz. An analysis of the differences in the computed fields resulting from a neglect of the permittivity was carried out. On this basis, an estimation of the impact of the displacement current on the simulated low-frequency electromagnetic fields in the human body is obtained. (note)

  11. Estimation of red-light running frequency using high-resolution traffic and signal data.

    Science.gov (United States)

    Chen, Peng; Yu, Guizhen; Wu, Xinkai; Ren, Yilong; Li, Yueguang

    2017-05-01

    Red-light-running (RLR) emerges as a major cause that may lead to intersection-related crashes and endanger intersection safety. To reduce RLR violations, it is critical to identify the influential factors associated with RLR and to estimate RLR frequency. Without resorting to video camera recordings, this study investigates this important issue by utilizing high-resolution traffic and signal event data collected from loop detectors at five intersections on Trunk Highway 55, Minneapolis, MN. First, a simple method is proposed to identify RLR by fully utilizing the information obtained from stop bar detectors, downstream entrance detectors and advance detectors. Using 12 months of event data, a total of 6550 RLR cases were identified. Defining RLR frequency as the conditional probability of RLR under a certain traffic or signal condition (veh/1000veh), the relationships between RLR frequency and influential factors including arrival time at the advance detector, approaching speed, headway, gap to the preceding vehicle on the adjacent lane, cycle length, geometric characteristics and even snowy weather were empirically investigated. Statistical analysis shows good agreement with traffic engineering practice, e.g., RLR is most likely to occur on weekdays during peak periods under large traffic demands and longer signal cycles, and a total of 95.24% of RLR events occurred within the first 1.5 s after the onset of the red phase. The findings confirmed that vehicles tend to run the red light when they are close to the intersection during the phase transition, and that vehicles following a leading vehicle with short headways are also likely to run the red light. Last, a simplified nonlinear regression model is proposed to estimate RLR frequency based on the data from the advance detector. The study is expected to help better understand RLR occurrence and further contribute to the future improvement of intersection safety. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Far Infrared High Resolution Synchrotron FTIR Spectroscopy of the Low Frequency Bending Modes of Dmso

    Science.gov (United States)

    Cuisset, Arnaud; Smirnova, Irina; Bocquet, Robin; Hindle, Francis; Mouret, Gael; Sadovskii, Dmitrii A.; Pirali, Olivier; Roy, Pascale

    2010-06-01

    In addition to its importance for industrial and environmental studies, the monitoring of DiMethylSulfOxide (DMSO, (CH_3)_2SO) concentrations is of considerable interest for civil protection. The existing high resolution gas phase spectroscopic data for DMSO concerned only the pure rotational transitions in the ground state. In the far-IR domain, the low-frequency rovibrational transitions had never previously been resolved. The high brightness of the AILES beamline of the synchrotron SOLEIL and the instrumental sensitivity provided by the multipass cell allowed these transitions to be measured for the first time. 1581 A-type and C-type transitions in the ν11 band have been assigned, and 25 molecular constants of Watson's s-form Hamiltonian developed to degree 8 have been fitted within the experimental accuracy. The use of synchrotron radiation has opened many possibilities for new spectroscopic studies. Together with several other recent studies, our successful measurement and analysis of DMSO convincingly demonstrates the potential of the AILES beamline for high resolution FIR spectroscopy. Thus our present work is just the beginning of unraveling the rovibrational structure of the low frequency bending and torsional vibrational states of DMSO and yielding important comprehensive structural and spectroscopic information on this molecule. L. Margules, R. A. Motienko, E. A. Alekseev, J. Demaison, J. Molec. Spectrosc., 260(23), 2009. V. Typke, M. Dakkouri, J. Molec. Struct., 599(177), 2001. A. Cuisset, L. Nanobashvili, I. Smirnova, R. Bocquet, F. Hindle, G. Mouret, O. Pirali, P. Roy, D. Sadovskii, Chem. Phys. Lett., accepted for publication.

  13. Joint Time-Frequency And Wavelet Analysis - An Introduction

    Directory of Open Access Journals (Sweden)

    Majkowski Andrzej

    2014-12-01

    Full Text Available A traditional frequency analysis is not appropriate for observing the properties of non-stationary signals. This stems from the fact that time resolution is not defined in the Fourier spectrum. Thus, there is a need for methods implementing joint time-frequency (t/f) analysis algorithms. Practical aspects of some representative methods of time-frequency analysis, including the Short Time Fourier Transform, Gabor Transform, Wigner-Ville Transform and Cone-Shaped Transform, are described in this paper. Unfortunately, in t/f analysis there is no relation between the width of the time-frequency window and its frequency content. This limitation does not hold for the wavelet transform. A wavelet is a wave-like oscillation which forms its own "wavelet window": compression of the wavelet narrows the window, and vice versa. Individual wavelet functions are well localized in time and simultaneously in scale (the equivalent of frequency). The wavelet analysis owes its effectiveness to the pyramid algorithm described by Mallat, which enables fast decomposition of a signal into wavelet components.
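
    A minimal sketch of the contrast drawn above, assuming NumPy, SciPy and the third-party PyWavelets (pywt) package: the STFT fixes its time-frequency window once via the segment length, while the CWT scales its wavelet window with frequency. The chirp test signal and analysis parameters are illustrative.

```python
import numpy as np
from scipy.signal import stft
import pywt  # PyWavelets, a third-party package (assumed available)

fs = 1024.0
t = np.arange(0, 2.0, 1 / fs)
# Chirp whose frequency rises from ~20 Hz to ~200 Hz: a simple non-stationary test signal.
x = np.sin(2 * np.pi * (20 * t + 45 * t ** 2))

# Fixed-window analysis: the STFT trades time against frequency resolution once,
# through the window length (nperseg).
f, tt, Zxx = stft(x, fs=fs, nperseg=256, noverlap=192)

# Scale-dependent analysis: the CWT narrows its "wavelet window" at small scales
# (high frequencies) and widens it at large scales (low frequencies).
scales = np.arange(1, 128)
coefs, freqs = pywt.cwt(x, scales, "morl", sampling_period=1 / fs)
print(Zxx.shape, coefs.shape)
```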

  14. Cluster analysis of word frequency dynamics

    Science.gov (United States)

    Maslennikova, Yu S.; Bochkarev, V. V.; Belashova, I. A.

    2015-01-01

    This paper describes the analysis and modelling of word usage frequency time series. In one of our previous studies, an assumption was put forward that all word usage frequencies have uniform dynamics approaching the shape of a Gaussian function. This assumption can be checked using the frequency dictionaries of the Google Books Ngram database. This database includes 5.2 million books published between 1500 and 2008. The corpus contains over 500 billion words in American English, British English, French, German, Spanish, Russian, Hebrew, and Chinese. We clustered time series of word usage frequencies using a Kohonen neural network. The similarity between input vectors was estimated using several algorithms. As a result of the neural network training procedure, more than ten different forms of time series were found. They describe the dynamics of word usage frequencies from birth to death of individual words. Different groups of word forms were found to have different dynamics of word usage frequency variations.
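
    As a hedged sketch of the clustering idea described above (not the authors' network configuration or data), the code below trains a small Kohonen self-organizing map on synthetic Gaussian-shaped word-frequency trajectories, assuming the third-party MiniSom package; the map size, training length and curve parameters are illustrative assumptions.

```python
import numpy as np
from minisom import MiniSom   # third-party Kohonen SOM package (assumed available)

# Synthetic word-frequency trajectories: Gaussian-shaped "birth to death" curves
# with varying peak year and width, normalized so clustering sees only the shape.
rng = np.random.default_rng(5)
years = np.linspace(1500, 2008, 128)
curves = []
for _ in range(500):
    peak = rng.uniform(1600, 1950)
    width = rng.uniform(20, 120)
    y = np.exp(-0.5 * ((years - peak) / width) ** 2) + rng.normal(scale=0.02, size=years.size)
    curves.append(y / np.max(y))
data = np.array(curves)

# A small Kohonen map; each neuron's weight vector becomes a prototype time-series shape.
som = MiniSom(4, 4, input_len=data.shape[1], sigma=1.0, learning_rate=0.5, random_seed=0)
som.random_weights_init(data)
som.train_random(data, num_iteration=5000)
labels = [som.winner(v) for v in data]   # map each trajectory to its best-matching unit
```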

  15. Cluster analysis of word frequency dynamics

    International Nuclear Information System (INIS)

    Maslennikova, Yu S; Bochkarev, V V; Belashova, I A

    2015-01-01

    This paper describes the analysis and modelling of word usage frequency time series. In one of our previous studies, an assumption was put forward that all word usage frequencies have uniform dynamics approaching the shape of a Gaussian function. This assumption can be checked using the frequency dictionaries of the Google Books Ngram database. This database includes 5.2 million books published between 1500 and 2008. The corpus contains over 500 billion words in American English, British English, French, German, Spanish, Russian, Hebrew, and Chinese. We clustered time series of word usage frequencies using a Kohonen neural network. The similarity between input vectors was estimated using several algorithms. As a result of the neural network training procedure, more than ten different forms of time series were found. They describe the dynamics of word usage frequencies from birth to death of individual words. Different groups of word forms were found to have different dynamics of word usage frequency variations.

  16. Spatial resolution dependence on spectral frequency in human speech cortex electrocorticography

    Science.gov (United States)

    Muller, Leah; Hamilton, Liberty S.; Edwards, Erik; Bouchard, Kristofer E.; Chang, Edward F.

    2016-10-01

    Objective. Electrocorticography (ECoG) has become an important tool in human neuroscience and has tremendous potential for emerging applications in neural interface technology. Electrode array design parameters are outstanding issues for both research and clinical applications, and these parameters depend critically on the nature of the neural signals to be recorded. Here, we investigate the functional spatial resolution of neural signals recorded at the human cortical surface. We empirically derive spatial spread functions to quantify the shared neural activity for each frequency band of the electrocorticogram. Approach. Five subjects with high-density (4 mm center-to-center spacing) ECoG grid implants participated in speech perception and production tasks while neural activity was recorded from the speech cortex, including superior temporal gyrus, precentral gyrus, and postcentral gyrus. The cortical surface field potential was decomposed into traditional EEG frequency bands. Signal similarity between electrode pairs for each frequency band was quantified using a Pearson correlation coefficient. Main results. The correlation of neural activity between electrode pairs was inversely related to the distance between the electrodes; this relationship was used to quantify spatial falloff functions for cortical subdomains. As expected, lower frequencies remained correlated over larger distances than higher frequencies. However, both the envelope and phase of gamma and high gamma frequencies (30-150 Hz) are largely uncorrelated (<90%) at 4 mm, the smallest spacing of the high-density arrays. Thus, ECoG arrays smaller than 4 mm have significant promise for increasing signal resolution at high frequencies, whereas less additional gain is achieved for lower frequencies. Significance. Our findings quantitatively demonstrate the dependence of ECoG spatial resolution on the neural frequency of interest. We demonstrate that this relationship is consistent across patients and
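
    As a rough sketch of the analysis described above (synthetic data, not patient recordings), the code below extracts a high-gamma band envelope from a simulated electrode grid, computes pairwise Pearson correlations, and pairs them with inter-electrode distances to expose a spatial falloff; the filter design, band edges and 4 mm grid geometry are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def band_envelope(data, fs, lo, hi, order=4):
    """Band-pass each channel and return the analytic amplitude envelope."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, data, axis=1)
    return np.abs(hilbert(filtered, axis=1))

def correlation_vs_distance(data, positions):
    """Pearson correlation of every electrode pair against their separation."""
    n = data.shape[0]
    corr = np.corrcoef(data)
    dist, r = [], []
    for i in range(n):
        for j in range(i + 1, n):
            dist.append(np.linalg.norm(positions[i] - positions[j]))
            r.append(corr[i, j])
    return np.array(dist), np.array(r)

# Synthetic stand-in for an ECoG grid: 16 channels on a 4x4 grid with 4 mm pitch.
# White noise only exercises the pipeline; real recordings show the distance falloff.
rng = np.random.default_rng(3)
fs, n_ch, n_samp = 1000.0, 16, 10_000
data = rng.normal(size=(n_ch, n_samp))
positions = np.array([(4.0 * (k % 4), 4.0 * (k // 4)) for k in range(n_ch)])
env = band_envelope(data, fs, 70.0, 150.0)        # high-gamma envelope
d, r = correlation_vs_distance(env, positions)    # shared activity vs electrode separation
```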

  17. High-resolution mid-IR spectrometer based on frequency upconversion

    DEFF Research Database (Denmark)

    Hu, Qi; Dam, Jeppe Seidelin; Pedersen, Christian

    2012-01-01

    We demonstrate a novel approach to high-resolution spectroscopy based on frequency upconversion and post-filtering by means of a scanning Fabry-Perot interferometer. The system is based on sum-frequency mixing, shifting the spectral content from the mid-infrared to the near-visible region al...-frequency 1064 nm laser. We investigate water vapor emission lines from a butane burner and compare the measured results to model data. We suggest that the presented method be used for real-time monitoring of specific gas lines and reference signals...

  18. High resolution synchrotron light analysis at ELSA

    Energy Technology Data Exchange (ETDEWEB)

    Switka, Michael; Zander, Sven; Hillert, Wolfgang [Bonn Univ. (Germany). Elektronen-Stretcher Anlage ELSA-Facility (ELSA)

    2013-07-01

    The pulse stretcher ring ELSA provides polarized electrons with energies up to 3.5 GeV for external hadron experiments. In order to satisfy the need for stored beam intensities towards 200 mA, advanced beam instability studies need to be carried out. An external diagnostic beamline for synchrotron light analysis has been set up; it provides the space for multiple diagnostic tools, including a streak camera with a time resolution of <1 ps. Beam profile measurements are expected to identify instabilities and reveal their thresholds. The effect of adequate countermeasures is subject to analysis. The current status of the beamline development is presented.

  19. Three-frequency BDS precise point positioning ambiguity resolution based on raw observables

    Science.gov (United States)

    Li, Pan; Zhang, Xiaohong; Ge, Maorong; Schuh, Harald

    2018-02-01

    All BeiDou navigation satellite system (BDS) satellites transmit signals on three frequencies, which brings new opportunities and challenges for high-accuracy precise point positioning (PPP) with ambiguity resolution (AR). This paper proposes an effective uncalibrated phase delay (UPD) estimation and AR strategy based on a raw PPP model. First, triple-frequency raw PPP models are developed: the observation model and stochastic model are designed and extended to accommodate the third frequency. Second, the UPDs are parameterized in raw frequency form and estimated from the high-precision, low-noise integer linear combinations of float ambiguities derived by ambiguity decorrelation. Third, with the UPDs corrected, the LAMBDA method is used to resolve the full or partial set of ambiguities that can be fixed. This method can be easily and flexibly extended to dual-, triple- or even more frequencies. To verify the effectiveness and performance of triple-frequency PPP AR, tests with real BDS data from 90 stations over 21 days were performed in static mode. Data were processed with three strategies: BDS triple-frequency ambiguity-float PPP, BDS triple-frequency PPP with dual-frequency (B1/B2) AR, and BDS triple-frequency PPP with three-frequency AR. Numerous experimental results showed that, compared with the ambiguity-float solution, the performance in terms of convergence time and positioning biases can be significantly improved by AR. Among the three groups of solutions, triple-frequency PPP AR achieved the best performance. Compared with dual-frequency AR, the additional third frequency appreciably improves the position estimates during the initialization phase and in constrained environments, where dual-frequency PPP AR is limited by the small number of available satellites.

  20. Triple-frequency GPS precise point positioning with rapid ambiguity resolution

    Science.gov (United States)

    Geng, Jianghui; Bock, Yehuda

    2013-05-01

    At present, reliable ambiguity resolution in real-time GPS precise point positioning (PPP) can only be achieved after an initial observation period of a few tens of minutes. In this study, we propose a method where the incoming triple-frequency GPS signals are exploited to enable rapid convergences to ambiguity-fixed solutions in real-time PPP. Specifically, extra-wide-lane ambiguity resolution can be first achieved almost instantaneously with the Melbourne-Wübbena combination observable on L2 and L5. Then the resultant unambiguous extra-wide-lane carrier-phase is combined with the wide-lane carrier-phase on L1 and L2 to form an ionosphere-free observable with a wavelength of about 3.4 m. Although the noise of this observable is around 100 times the raw carrier-phase noise, its wide-lane ambiguity can still be resolved very efficiently, and the resultant ambiguity-fixed observable can assist much better than pseudorange in speeding up succeeding narrow-lane ambiguity resolution. To validate this method, we use an advanced hardware simulator to generate triple-frequency signals and a high-grade receiver to collect 1-Hz data. When the carrier-phase precisions on L1, L2 and L5 are as poor as 1.5, 6.3 and 1.5 mm, respectively, wide-lane ambiguity resolution can still reach a correctness rate of over 99 % within 20 s. As a result, the correctness rate of narrow-lane ambiguity resolution achieves 99 % within 65 s, in contrast to only 64 % within 150 s in dual-frequency PPP. In addition, we also simulate a multipath-contaminated data set and introduce new ambiguities for all satellites every 120 s. We find that when multipath effects are strong, ambiguity-fixed solutions are achieved at 78 % of all epochs in triple-frequency PPP whilst almost no ambiguities are resolved in dual-frequency PPP. Therefore, we demonstrate that triple-frequency PPP has the potential to achieve ambiguity-fixed solutions within a few minutes, or even shorter if raw carrier-phase precisions are
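
    The Melbourne-Wübbena combination mentioned above can be written down compactly. The sketch below forms it from simulated GPS L2/L5 phase and code observables (in meters) and rounds the averaged value to recover the extra-wide-lane integer; the simulated noise levels and ambiguities are assumptions, and atmospheric and hardware delays are ignored for brevity.

```python
import numpy as np

C = 299_792_458.0                       # speed of light (m/s)
F2, F5 = 1227.60e6, 1176.45e6           # GPS L2 and L5 carrier frequencies (Hz)

def mw_widelane(phi_a, phi_b, code_a, code_b, fa, fb):
    """Melbourne-Wübbena combination (all inputs in meters): wide-lane phase
    minus narrow-lane code, expressed in wide-lane cycles so that it clusters
    around the integer wide-lane ambiguity."""
    widelane_phase = (fa * phi_a - fb * phi_b) / (fa - fb)
    narrowlane_code = (fa * code_a + fb * code_b) / (fa + fb)
    lam_wl = C / (fa - fb)              # ~5.86 m extra-wide-lane wavelength for L2/L5
    return (widelane_phase - narrowlane_code) / lam_wl

# Simulated observables for one satellite over 100 epochs (no atmosphere or biases).
rng = np.random.default_rng(4)
rho = 2.2e7                                                  # geometric range (m)
lam2, lam5 = C / F2, C / F5
N2, N5 = 123456, 123400                                      # simulated integer ambiguities (cycles)
phi2 = rho + lam2 * N2 + rng.normal(scale=0.003, size=100)   # carrier phase (m)
phi5 = rho + lam5 * N5 + rng.normal(scale=0.003, size=100)
p2 = rho + rng.normal(scale=0.3, size=100)                   # pseudorange (m)
p5 = rho + rng.normal(scale=0.3, size=100)

mw = mw_widelane(phi2, phi5, p2, p5, F2, F5)
print(round(np.mean(mw)), N2 - N5)        # averaged-and-rounded estimate vs true value (56)
```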

  1. Development of nanometer resolution C-Band radio frequency beam position monitors in the Final Focus Test Beam

    International Nuclear Information System (INIS)

    Slaton, T.; Mazaheri, G.

    1998-08-01

    Using a 47 GeV electron beam, the Final Focus Test Beam (FFTB) produces vertical spot sizes around 70 nm. These small beam sizes introduce an excellent opportunity to develop and test high resolution Radio Frequency Beam Position Monitors (RF-BPMs). These BPMs are designed to measure pulse to pulse beam motion (jitter) at a theoretical resolution of approximately 1 nm. The beam induces a TM110 mode with an amplitude linearly proportional to its charge and displacement from the BPM's (cylindrical cavity) axis. The C-band (5,712 MHz) TM110 signal is processed and converted into beam position for use by the Stanford Linear Collider (SLC) control system. Presented are the experimental procedures, acquisition, and analysis of data demonstrating resolution of jitter near 25 nm. With the design of future e+e- linear colliders requiring spot sizes close to 3 nm, understanding and developing RF-BPMs will be essential in resolving and controlling jitter

  2. Swept-frequency feedback interferometry using terahertz frequency QCLs: a method for imaging and materials analysis.

    Science.gov (United States)

    Rakić, Aleksandar D; Taimre, Thomas; Bertling, Karl; Lim, Yah Leng; Dean, Paul; Indjin, Dragan; Ikonić, Zoran; Harrison, Paul; Valavanis, Alexander; Khanna, Suraj P; Lachab, Mohammad; Wilson, Stephen J; Linfield, Edmund H; Davies, A Giles

    2013-09-23

    The terahertz (THz) frequency quantum cascade laser (QCL) is a compact source of high-power radiation with a narrow intrinsic linewidth. As such, THz QCLs are extremely promising sources for applications including high-resolution spectroscopy, heterodyne detection, and coherent imaging. We exploit the remarkable phase-stability of THz QCLs to create a coherent swept-frequency delayed self-homodyning method for both imaging and materials analysis, using laser feedback interferometry. Using our scheme we obtain amplitude-like and phase-like images with minimal signal processing. We determine the physical relationship between the operating parameters of the laser under feedback and the complex refractive index of the target and demonstrate that this coherent detection method enables extraction of complex refractive indices with high accuracy. This establishes an ultimately compact and easy-to-implement THz imaging and materials analysis system, in which the local oscillator, mixer, and detector are all combined into a single laser.

  3. Relation between frequency of seismic wave and resolution of tomography; Danseiha tomography kaiseki ni okeru shuhasu to bunkaino no kankei

    Energy Technology Data Exchange (ETDEWEB)

    Fujimoto, M; Watanabe, T; Ashida, Y; Sassa, K [Kyoto University, Kyoto (Japan). Faculty of Engineering

    1997-05-27

    With regard to elastic wave exploration, the relationship between frequency and resolution in P-wave velocity tomography using the initial travel time is discussed. The discussion is carried out using a new analysis method which incorporates the concept of the Fresnel volume into tomography analysis. The following two arrangements were used in the calculation: a cross-hole arrangement, in which seismic source and receiving points were arranged so as to surround three sides of a region extending 250 m in the horizontal direction and 500 m in the vertical direction and observation is performed between two wells, and a permeation VSP arrangement in which the seismic source is installed on the ground surface and the receiving points in wells. The velocity structure was reconstructed using a total of 819 observed travel times, and reconstructions were obtained for different frequencies of the seismic source used for the exploration. The resolution of the reconstruction becomes higher as elastic waves with higher frequency are used, and the size of the structure identifiable from the reconstruction decreases. This fact reveals that sufficient consideration must be given to the frequencies of the elastic waves used, according to the size of the objects to be explored. 4 refs., 4 figs.

  4. High resolution terahertz spectroscopy of a whispering gallery mode bubble resonator using Hilbert analysis.

    Science.gov (United States)

    Vogt, Dominik Walter; Leonhardt, Rainer

    2017-07-10

    We report on data processing for continuous wave (CW) terahertz (THz) spectroscopy measurements based on a Hilbert spectral analysis to achieve MHz resolution. As an example we investigate the spectral properties of a whispering gallery mode (WGM) THz bubble resonator at critical coupling. The experimental verification clearly demonstrates the significant advantages in relative frequency resolution and required acquisition time of the proposed method over the traditional data analysis. An effective frequency resolution, only limited by the precision and stability of the laser beat signal, can be achieved without complex extensions to a standard commercially available CW THz spectrometer.
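
    As an illustration of the Hilbert-analysis step (not the spectrometer's actual processing chain), the sketch below extracts the instantaneous frequency of a synthetic swept beat note via the analytic signal; the sample rate, sweep parameters and noise level are assumed.

```python
import numpy as np
from scipy.signal import hilbert

# Minimal sketch of a Hilbert spectral analysis step: recover the instantaneous
# frequency of a synthetic swept beat note from its analytic signal. The sample
# rate, sweep parameters and noise level are assumed and are not those of the
# CW THz spectrometer described above.

rng = np.random.default_rng(0)
fs = 1.0e6                                  # sample rate, Hz
t = np.arange(0, 0.05, 1 / fs)              # 50 ms record
f0, sweep = 20e3, 200e3                     # start frequency (Hz), sweep rate (Hz/s)
beat = np.cos(2 * np.pi * (f0 * t + 0.5 * sweep * t**2))
beat *= 1.0 + 0.02 * rng.standard_normal(t.size)    # small amplitude noise

analytic = hilbert(beat)                    # analytic signal
phase = np.unwrap(np.angle(analytic))       # instantaneous phase
inst_freq = np.gradient(phase, 1 / fs) / (2 * np.pi)

print(f"frequency near start: {np.median(inst_freq[:1000]):.0f} Hz")
print(f"frequency near end  : {np.median(inst_freq[-1000:]):.0f} Hz")
```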

  5. High-Resolution Time-Frequency Spectrum-Based Lung Function Test from a Smartphone Microphone

    Directory of Open Access Journals (Sweden)

    Tharoeun Thap

    2016-08-01

    In this paper, a smartphone-based lung function test, developed to estimate lung function parameters using a high-resolution time-frequency spectrum from a smartphone built-in microphone, is presented. A method of estimating the forced expiratory volume in 1 s divided by forced vital capacity (FEV1/FVC) based on the variable frequency complex demodulation method (VFCDM) is first proposed. We evaluated our proposed method on 26 subjects, including 13 healthy subjects and 13 chronic obstructive pulmonary disease (COPD) patients, by comparing with the parameters clinically obtained from pulmonary function tests (PFTs). For the healthy subjects, we found that the absolute error (AE) and root mean squared error (RMSE) of the FEV1/FVC ratio were 4.49% ± 3.38% and 5.54%, respectively. For the COPD patients, we found that the AE and RMSE were 10.30% ± 10.59% and 14.48%, respectively. For both groups, we compared the results using the continuous wavelet transform (CWT) and short-time Fourier transform (STFT), and found that VFCDM was superior to CWT and STFT. Further, to estimate other parameters, including forced vital capacity (FVC), forced expiratory volume in 1 s (FEV1), and peak expiratory flow (PEF), regression analysis was conducted to establish a linear transformation. However, the parameters FVC, FEV1, and PEF had correlation factor r values of 0.323, 0.275, and −0.257, respectively, while FEV1/FVC had an r value of 0.814. The results obtained suggest that only the FEV1/FVC ratio can be accurately estimated from a smartphone built-in microphone. The other parameters, including FVC, FEV1, and PEF, were subjective and dependent on the subject's familiarization with the test and performance of forced exhalation toward the microphone.

  6. High-resolution time-frequency representation of EEG data using multi-scale wavelets

    Science.gov (United States)

    Li, Yang; Cui, Wei-Gang; Luo, Mei-Lin; Li, Ke; Wang, Lina

    2017-09-01

    An efficient time-varying autoregressive (TVAR) modelling scheme that expands the time-varying parameters onto multi-scale wavelet basis functions is presented for modelling nonstationary signals, with application to the time-frequency analysis (TFA) of electroencephalogram (EEG) signals. In the new parametric modelling framework, the time-dependent parameters of the TVAR model are locally represented using a novel multi-scale wavelet decomposition scheme, which can capture the smooth trends as well as track the abrupt changes of the time-varying parameters simultaneously. A forward orthogonal least squares (FOLS) algorithm, aided by mutual information criteria, is then applied for sparse model term selection and parameter estimation. Two simulation examples illustrate that the proposed multi-scale wavelet basis functions outperform single-scale wavelet basis functions or the Kalman filter algorithm for many nonstationary processes. Furthermore, an application of the proposed method to a real EEG signal demonstrates that the new approach can provide highly time-dependent spectral resolution capability. A basis-expansion sketch is given below.
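
    The basis-expansion idea behind TVAR modelling can be sketched as follows; a Legendre polynomial basis and ordinary least squares stand in here for the paper's multi-scale wavelet basis and FOLS term selection, and the synthetic AR(2) signal is purely illustrative.

```python
import numpy as np

# Minimal sketch of basis-expansion TVAR fitting: each time-varying AR
# coefficient a_i(n) is expanded on a small set of basis functions and the
# expansion weights are estimated by least squares. A Legendre polynomial basis
# and plain least squares replace the paper's multi-scale wavelets and FOLS.

rng = np.random.default_rng(3)
N, p, J = 2000, 2, 4                        # samples, AR order, basis functions
n = np.linspace(-1.0, 1.0, N)
basis = np.polynomial.legendre.legvander(n, J - 1)   # shape (N, J)

# Synthetic nonstationary AR(2): resonance frequency drifts slowly with time
f_inst = 0.10 + 0.15 * (n + 1.0) / 2.0      # cycles/sample
r = 0.95
a1_true = 2.0 * r * np.cos(2.0 * np.pi * f_inst)
a2_true = -(r ** 2) * np.ones(N)
y = np.zeros(N)
for k in range(p, N):
    y[k] = a1_true[k] * y[k - 1] + a2_true[k] * y[k - 2] + 0.1 * rng.standard_normal()

# Regression: y(n) = sum_i sum_j c_ij * B_j(n) * y(n - i) + e(n)
rows = np.arange(p, N)
X = np.column_stack([basis[rows, j] * y[rows - i]
                     for i in range(1, p + 1) for j in range(J)])
coef, *_ = np.linalg.lstsq(X, y[rows], rcond=None)
coef = coef.reshape(p, J)
a_hat = basis @ coef.T                      # recovered a_i(n), shape (N, p)

print("true a1(n) at start/end:", round(a1_true[0], 3), round(a1_true[-1], 3))
print("est. a1(n) at start/end:", round(float(a_hat[0, 0]), 3), round(float(a_hat[-1, 0]), 3))
```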

  7. Challenges in Resolution for IC Failure Analysis

    Science.gov (United States)

    Martinez, Nick

    1999-10-01

    Resolution is becoming more and more of a challenge in the world of Failure Analysis in integrated circuits. This is a result of the ongoing size reduction in microelectronics. Determining the cause of a failure depends upon being able to find the responsible defect. The time it takes to locate a given defect is extremely important so that proper corrective actions can be taken. The limits of current microscopy tools are being pushed. With sub-micron feature sizes and even smaller killing defects, optical microscopes are becoming obsolete. With scanning electron microscopy (SEM), the resolution is high but the voltage involved can make these small defects transparent due to the large mean-free path of incident electrons. In this presentation, I will give an overview of the use of inspection methods in Failure Analysis and show example studies of my work as an Intern student at Texas Instruments. 1. Work at Texas Instruments, Stafford, TX, was supported by TI. 2. Work at Texas Tech University, was supported by NSF Grant DMR9705498.

  8. The Linear Time Frequency Analysis Toolbox

    DEFF Research Database (Denmark)

    Søndergaard, Peter Lempel; Torrésani, Bruno; Balazs, Peter

    2011-01-01

    The Linear Time Frequency Analysis Toolbox is a Matlab/Octave toolbox for computational time-frequency analysis. It is intended both as an educational and a computational tool. The toolbox provides the basic Gabor, Wilson and MDCT transforms along with routines for constructing windows (filter prototypes) and routines for manipulating coefficients. It also provides a number of demo scripts devoted either to demonstrating the main functions of the toolbox, or to exemplifying their use in specific signal processing applications. In this paper we describe the used algorithms, their mathematical background...

  9. Frequency modulation television analysis: Distortion analysis

    Science.gov (United States)

    Hodge, W. H.; Wong, W. H.

    1973-01-01

    Computer simulation is used to calculate the time-domain waveform of standard T-pulse-and-bar test signal distorted in passing through an FM television system. The simulator includes flat or preemphasized systems and requires specification of the RF predetection filter characteristics. The predetection filters are modeled with frequency-symmetric Chebyshev (0.1-db ripple) and Butterworth filters. The computer was used to calculate distorted output signals for sixty-four different specified systems, and the output waveforms are plotted for all sixty-four. Comparison of the plotted graphs indicates that a Chebyshev predetection filter of four poles causes slightly more signal distortion than a corresponding Butterworth filter and the signal distortion increases as the number of poles increases. An increase in the peak deviation also increases signal distortion. Distortion also increases with the addition of preemphasis.
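
    The filter comparison can be reproduced qualitatively with standard filter-design routines; the sketch below contrasts the step-response overshoot of a 4-pole, 0.1 dB ripple Chebyshev low-pass filter with a 4-pole Butterworth filter. The cutoff frequency and sample rate are assumed values, not the parameters of the simulated FM television system.

```python
import numpy as np
from scipy import signal

# Minimal sketch comparing 4-pole Chebyshev (0.1 dB ripple) and Butterworth
# low-pass filters as stand-ins for the predetection filters discussed above.
# Cutoff and sample rate are assumed for illustration only.

fs = 100e6                  # sample rate, Hz (assumed)
fc = 15e6                   # cutoff frequency, Hz (assumed)
b_cheb, a_cheb = signal.cheby1(4, 0.1, fc, fs=fs)   # 0.1 dB ripple, 4 poles
b_butt, a_butt = signal.butter(4, fc, fs=fs)

# Step-response overshoot/ringing is one proxy for waveform distortion of a
# T-pulse-and-bar test signal.
t = np.arange(0, 2e-6, 1 / fs)
step = np.ones_like(t)
y_cheb = signal.lfilter(b_cheb, a_cheb, step)
y_butt = signal.lfilter(b_butt, a_butt, step)
print(f"Chebyshev overshoot  : {y_cheb.max() - 1:.3f}")
print(f"Butterworth overshoot: {y_butt.max() - 1:.3f}")
```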

  10. Multiple-image hiding using super resolution reconstruction in high-frequency domains

    Science.gov (United States)

    Li, Xiao-Wei; Zhao, Wu-Xiang; Wang, Jun; Wang, Qiong-Hua

    2017-12-01

    In this paper, a robust multiple-image hiding method using computer-generated integral imaging and a modified super-resolution reconstruction algorithm is proposed. In our work, the host image is first transformed into frequency domains by cellular automata (CA); to assure the quality of the stego-image, the secret images are embedded into the CA high-frequency domains. The proposed method has the following advantages: (1) robustness to geometric attacks because of the memory-distributed property of elemental images, (2) increased quality of the reconstructed secret images, as the scheme utilizes the modified super-resolution reconstruction algorithm. The simulation results show that the proposed multiple-image hiding method outperforms other similar hiding methods and is robust to some attacks, e.g., Gaussian noise and JPEG compression attacks.

  11. High-resolution broadband terahertz spectroscopy via electronic heterodyne detection of photonically generated terahertz frequency comb.

    Science.gov (United States)

    Pavelyev, D G; Skryl, A S; Bakunov, M I

    2014-10-01

    We report an alternative approach to terahertz frequency-comb spectroscopy (TFCS) based on nonlinear mixing of a photonically generated terahertz pulse train with a continuous-wave signal from an electronic synthesizer. A superlattice is used as the nonlinear mixer. Unlike the standard TFCS technique, this approach does not require a complex double-laser system but retains the advantages of TFCS: high spectral resolution and wide bandwidth.

  12. High-resolution geophysical profiling using a stepped-frequency ground penetrating radar

    Energy Technology Data Exchange (ETDEWEB)

    Noon, D; Longstaff, D [The University of Queensland, (Australia)

    1996-05-01

    This paper describes the results of a ground penetrating radar (GPR) system which uses stepped-frequency waveforms to obtain high-resolution geophysical profiles. The main application for this system is the high-resolution mapping of thin coal seam structures, in order to assist surface mining operations in open-cut coal mines. The required depth of penetration is one meter which represents the maximum thickness of coal seams that are designated `thin`. A resolution of five centimeters is required to resolve the minimum thickness of coal (or shale partings) which can be economically recovered in an open-cut coal mine. For this application, a stepped-frequency GPR system has been developed, because of its ultrawide bandwidth (1 to 2 GHz) and high external loop sensitivity (155 dB). The field test results of the stepped-frequency GPR system on a concrete pavement and at two Australian open-cut coal mines are also presented. 7 refs., 5 figs.
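
    The range processing of a stepped-frequency radar can be illustrated with a short numerical sketch: the complex response measured at stepped frequencies is inverse-FFT'd into a range profile whose resolution is v/(2B). The 1-2 GHz band follows the paper; the coal permittivity and the two interface depths are assumed purely for illustration.

```python
import numpy as np
from scipy.signal import find_peaks

# Minimal sketch of stepped-frequency GPR range processing: the reflection
# response sampled at stepped frequencies is inverse-FFT'd into a range profile
# with resolution v/(2B). Permittivity and depths below are assumed values.

c = 3.0e8
f = np.linspace(1.0e9, 2.0e9, 201)          # stepped frequencies, Hz
df = f[1] - f[0]
eps_r = 4.0                                 # assumed relative permittivity
v = c / np.sqrt(eps_r)                      # wave speed in the medium

depths = (0.40, 0.55)                       # assumed interface depths, m
resp = sum(np.exp(-1j * 2 * np.pi * f * 2 * d / v) for d in depths)

M = 2048                                    # zero-padded IFFT length
profile = np.abs(np.fft.ifft(resp, n=M))
range_axis = np.arange(M) * v / (2 * M * df)

print(f"range resolution ~ v/(2B) = {100 * v / (2 * (f[-1] - f[0])):.1f} cm")
peaks, _ = find_peaks(profile[: M // 2], height=0.5 * profile.max())
print("recovered interface depths (m):", np.round(range_axis[peaks], 3))
```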

  13. High resolution electromagnetic methods and low frequency dispersion of rock conductivity

    Directory of Open Access Journals (Sweden)

    V. V. Ageev

    1999-06-01

    The influence of frequency dispersion of conductivity (induced polarization) of rocks on the results of electromagnetic (EM) sounding was studied on the basis of calculation of electric field of vertical magnetic dipole above horizontally layered polarizable sections. Frequency dispersion was approximated by the Debye formula. Polarizable homogeneous half-space, two, three and multilayered sections were analyzed in frequency and time domains. The calculations for different values of chargeability and time constants of polarization were performed. In the far zone of a source, the IP of rocks led to quasi-wave phenomena. They produced rapid fluctuations of frequency and transient sounding curves (interference phenomena, multireflections in polarizable layers. In the case of transient sounding in the near zone of a source quasistatic distortions prevailed, caused by the counter electromotive force arising in polarizable layers which may lead to strong changes in transient curves. In some cases quasi-wave and quasistatic phenomena made EM sounding curves non-interpretable in the class of quasistationary curves over non-dispersive sections. On the other hand, they could increase the resolution and depth of investigation of EM sounding. This was confirmed by an experience of "high-resolution" electroprospecting in Russia. The problem of interpretation of EM sounding data in polarizable sections is nonunique. To achieve uniqueness it is probably necessary to complement them by soundings of other type.
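
    For reference, one common Debye-type parameterization of the dispersion (the Pelton resistivity form with unit exponent, which may differ in detail from the formula used in the paper) is evaluated in the sketch below for assumed values of DC resistivity, chargeability and time constant.

```python
import numpy as np

# Minimal sketch of a Debye-type dispersion of rock resistivity, written here in
# the Pelton form with unit exponent: rho(w) = rho0 * (1 - m*(1 - 1/(1 + i*w*tau))).
# rho0, chargeability m and time constant tau are assumed example values.

rho0 = 100.0        # DC resistivity, ohm*m (assumed)
m = 0.2             # chargeability (assumed)
tau = 0.1           # time constant, s (assumed)

def rho_debye(freq_hz):
    w = 2.0 * np.pi * np.asarray(freq_hz, dtype=float)
    return rho0 * (1.0 - m * (1.0 - 1.0 / (1.0 + 1j * w * tau)))

for f in (0.01, 0.1, 1.0, 10.0, 100.0):
    r = rho_debye(f)
    print(f"f = {f:7.2f} Hz   |rho| = {abs(r):6.2f} ohm*m   phase = {np.degrees(np.angle(r)):6.2f} deg")
```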

  14. High resolution electromagnetic methods and low frequency dispersion of rock conductivity

    International Nuclear Information System (INIS)

    Svetov, B.S.; Ageev, V.V.

    1999-01-01

    The influence of frequency dispersion of conductivity (induced polarization) of rocks on the results of electromagnetic (EM) sounding was studied on the basis of calculation of the electric field of a vertical magnetic dipole above horizontally layered polarizable sections. Frequency dispersion was approximated by the Debye formula. A polarizable homogeneous half-space and two-, three- and multilayered sections were analyzed in the frequency and time domains. The calculations for different values of chargeability and time constants of polarization were performed. In the far zone of a source, the IP of rocks led to quasi-wave phenomena. They produced rapid fluctuations of frequency and transient sounding curves (interference phenomena, multireflections in polarizable layers). In the case of transient sounding in the near zone of a source, quasistatic distortions prevailed, caused by the counter electromotive force arising in polarizable layers, which may lead to strong changes in transient curves. In some cases quasi-wave and quasistatic phenomena made EM sounding curves non-interpretable in the class of quasistationary curves over non-dispersive sections. On the other hand, they could increase the resolution and depth of investigation of EM sounding. This was confirmed by experience with 'high-resolution' electroprospecting in Russia. The problem of interpretation of EM sounding data in polarizable sections is non-unique. To achieve uniqueness it is probably necessary to complement them by soundings of another type

  15. High resolution electromagnetic methods and low frequency dispersion of rock conductivity

    Energy Technology Data Exchange (ETDEWEB)

    Svetov, B.S.; Ageev, V.V. [Geoelectromagnetic Research Institute, Institute of Physics of the Earth, RAS, Moscow (Russian Federation)

    1999-08-01

    The influence of frequency dispersion of conductivity (induced polarization) of rocks on the results of electromagnetic (EM) sounding was studied on the basis of calculation of the electric field of a vertical magnetic dipole above horizontally layered polarizable sections. Frequency dispersion was approximated by the Debye formula. A polarizable homogeneous half-space and two-, three- and multilayered sections were analyzed in the frequency and time domains. The calculations for different values of chargeability and time constants of polarization were performed. In the far zone of a source, the IP of rocks led to quasi-wave phenomena. They produced rapid fluctuations of frequency and transient sounding curves (interference phenomena, multireflections in polarizable layers). In the case of transient sounding in the near zone of a source, quasistatic distortions prevailed, caused by the counter electromotive force arising in polarizable layers, which may lead to strong changes in transient curves. In some cases quasi-wave and quasistatic phenomena made EM sounding curves non-interpretable in the class of quasistationary curves over non-dispersive sections. On the other hand, they could increase the resolution and depth of investigation of EM sounding. This was confirmed by experience with 'high-resolution' electroprospecting in Russia. The problem of interpretation of EM sounding data in polarizable sections is non-unique. To achieve uniqueness it is probably necessary to complement them by soundings of another type.

  16. Analysis of the impact of spatial resolution on land/water classifications using high-resolution aerial imagery

    Science.gov (United States)

    Enwright, Nicholas M.; Jones, William R.; Garber, Adrienne L.; Keller, Matthew J.

    2014-01-01

    Long-term monitoring efforts often use remote sensing to track trends in habitat or landscape conditions over time. To most appropriately compare observations over time, long-term monitoring efforts strive for consistency in methods. Thus, advances and changes in technology over time can present a challenge. For instance, modern camera technology has led to an increasing availability of very high-resolution imagery (i.e. submetre and metre) and a shift from analogue to digital photography. While numerous studies have shown that image resolution can impact the accuracy of classifications, most of these studies have focused on the impacts of comparing spatial resolution changes greater than 2 m. Thus, a knowledge gap exists on the impacts of minor changes in spatial resolution (i.e. submetre to about 1.5 m) in very high-resolution aerial imagery (i.e. 2 m resolution or less). This study compared the impact of spatial resolution on land/water classifications of an area dominated by coastal marsh vegetation in Louisiana, USA, using 1:12,000 scale colour-infrared analogue aerial photography (AAP) scanned at four different dot-per-inch resolutions simulating ground sample distances (GSDs) of 0.33, 0.54, 1, and 2 m. Analysis of the impact of spatial resolution on land/water classifications was conducted by exploring various spatial aspects of the classifications including density of waterbodies and frequency distributions in waterbody sizes. This study found that a small-magnitude change (1–1.5 m) in spatial resolution had little to no impact on the amount of water classified (i.e. percentage mapped was less than 1.5%), but had a significant impact on the mapping of very small waterbodies (i.e. waterbodies ≤ 250 m2). These findings should interest those using temporal image classifications derived from very high-resolution aerial photography as a component of long-term monitoring programs.

  17. A high resolution jet analysis for LEP

    International Nuclear Information System (INIS)

    Hariri, S.

    1992-11-01

    A high resolution multijet analysis of hadronic events produced in e+e− annihilation at a C.M.S. energy of 91.2 GeV is described. Hadronic events produced in e+e− annihilations are generated using the Monte Carlo program JETSET 7.3 with its two options: Matrix Element (M.E.) and Parton Showers (P.S.). The shower option is used with its default parameter values, while the M.E. option is used with an invariant mass cut Y_CUT = 0.01 instead of 0.02. This choice ensures a better continuity in the evolution of the event shape variables. (K.A.) 3 refs.; 26 figs.; 1 tab

  18. Time-frequency analysis of pediatric murmurs

    Science.gov (United States)

    Lombardo, Joseph S.; Blodgett, Lisa A.; Rosen, Ron S.; Najmi, Amir-Homayoon; Thompson, W. Reid

    1998-05-01

    Technology has provided many new tools to assist in the diagnosis of pathologic conditions of the heart. Echocardiography, Ultrafast CT, and MRI are just a few. While these tools are a valuable resource, they are typically too expensive, large and complex in operation for use in rural, homecare, and physician's office settings. Recent advances in computer performance, miniaturization, and acoustic signal processing have yielded new technologies that, when applied to heart sounds, can provide low-cost screening for pathologic conditions. The short duration and transient nature of these signals requires processing techniques that provide high resolution in both time and frequency. Short-time Fourier transforms, Wigner distributions, and wavelet transforms have been applied to signals from hearts with various pathologic conditions. While no single technique provides the ideal solution, the combination of tools provides a good representation of the acoustic features of the pathologies selected.

  19. Radio frequency phototube and optical clock: High resolution, high rate and highly stable single photon timing technique

    Energy Technology Data Exchange (ETDEWEB)

    Margaryan, Amur

    2011-10-01

    A new timing technique for single photons based on the radio frequency phototube and an optical clock or femtosecond optical frequency comb generator is proposed. The technique has a 20 ps resolution for single photons, is capable of operating at MHz rates, and achieves a 10 fs instability level.

  20. TBV 361 RESOLUTION ANALYSIS: EMPLACEMENT DRIFT ORIENTATION

    International Nuclear Information System (INIS)

    Lin, M.; Kicker, D.C.; Sellers, M.D.

    1999-01-01

    The purpose of this To Be Verified/To Be Determined (TBX) resolution analysis is to release "To Be Verified" (TBV)-361 related to the emplacement drift orientation. The system design criterion in "Subsurface Facility System Description Document" (CRWMS M&O 1998a, p. 9) specifies that the emplacement drift orientation relative to the dominant joint orientations should be at least 30 degrees. The specific objectives for this analysis include the following: (1) Collect and evaluate key block data developed for the repository host horizon rock mass. (2) Assess the dominant joint orientations based on available fracture data. (3) Document the maximum block size as a function of drift orientation. (4) Assess the applicability of the drift orientation/joint orientation offset criterion in the "Subsurface Facility System Description Document" (CRWMS M&O 1998a, p. 9). (5) Consider the effects of seepage on drift orientation. (6) Verify that the viability assessment (VA) drift orientation complies with the drift orientation/joint orientation offset criterion, or provide justifications and make recommendations for modifying the VA emplacement drift layout. In addition to providing direct support to the System Description Document (SDD), the release of TBV-361 will provide support to the Repository Subsurface Design Department. The results from this activity may also provide data and information needs to support the MGR Requirements Department, the MGR Safety Assurance Department, and the Performance Assessment Organization.

  1. The frequency and the degree of fusion of the lung on high-resolution CT

    International Nuclear Information System (INIS)

    Shin, Hwan Sik; Kim, Sung Jin; Bae, Il Hun; Song, Kyung Sup; Kim, Joo Chang; Han, Ki Suk; Cha, Sang Hoon; Park, Kil Sun

    2000-01-01

    To evaluate the frequency and degree of fusion of the lung as seen on high-resolution CT (HRCT). In 210 patients high-resolution CT scans from the apex to the diaphragm were obtained at 1 mm collimation and 7 mm intervals. We retrospectively analysed the frequency and degree of fusion of the lung bordering each interlobar fissure. Fusion of the lung was defined when a fissure appeared without complete lobar separation. The degree of lung fusion was classified as mild (less than 1/3 of the fissure), moderate (greater than 1/3 and less than 2/3 of the fissure), or severe (greater than 2/3 of the fissure). In 90 of 210 patients, all fissures were identified. In 73 of these 90 (81.1%), lung fusion was noted, the most frequent site being between the right upper and right middle lobe (53.3%). The least frequent site was between the upper portion of the left upper and left lower lobe (32.2%). A mild degree of fusion was most frequently found between the right middle and right lower lobe (83.9%), while a severe degree was most frequent between the right middle and right upper lobe (50.0%), followed by the lingular division and the left lower lobe (41.9%). HRCT can be used to evaluate the frequency and degree of interlobar lung fusion. (author)

  2. Improving Ambiguity Resolution for Medium Baselines Using Combined GPS and BDS Dual/Triple-Frequency Observations.

    Science.gov (United States)

    Gao, Wang; Gao, Chengfa; Pan, Shuguo; Wang, Denghui; Deng, Jiadong

    2015-10-30

    The regional constellation of the BeiDou navigation satellite system (BDS) has been providing continuous positioning, navigation and timing services since 27 December 2012, covering China and the surrounding area. Real-time kinematic (RTK) positioning with combined BDS and GPS observations is feasible. Moreover, all BDS satellites transmit triple-frequency signals. Using the advantages of multiple pseudorange and carrier observations from multiple systems and frequencies is expected to be of much benefit for ambiguity resolution (AR). We propose an integrated AR strategy for medium baselines using combined GPS and BDS dual/triple-frequency observations. In this method, the extra-wide-lane (EWL) ambiguities of the triple-frequency system, i.e., BDS, are determined first. Then the dual-frequency WL ambiguities of BDS and GPS are resolved with the geometry-based model by using the BDS ambiguity-fixed EWL observations. After that, basic (i.e., L1/L2 or B1/B2) ambiguities of BDS and GPS are estimated together with the so-called ionosphere-constrained model, where the ambiguity-fixed WL observations are added to enhance the model strength. During both the WL and basic AR, a partial ambiguity fixing (PAF) strategy is adopted to weaken the negative influence of newly risen or low-elevation satellites. Experiments were conducted and presented, in which the GPS/BDS dual/triple-frequency data were collected in Nanjing and Zhengzhou, China, with baseline lengths varying from about 28.6 to 51.9 km. The results indicate that, compared to the single triple-frequency BDS system, the combined system can significantly enhance the AR model strength, and thus improve AR performance for medium baselines, with a 75.7% reduction of initialization time on average. Besides, more accurate and stable positioning results can also be derived by using the combined GPS/BDS system.
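
    The long wavelengths that make the EWL/WL cascade attractive follow directly from the carrier frequencies; the snippet below computes them from the published GPS L1/L2 and BDS B1/B2/B3 frequencies, taking the B2-B3 combination as the EWL (a common choice, which may differ from the exact combination used in the paper).

```python
# Minimal sketch: wavelengths of the wide-lane (WL) and extra-wide-lane (EWL)
# combinations behind a cascaded ambiguity-resolution strategy, lambda = c / |f_i - f_j|.
# Carrier frequencies are the published GPS L1/L2 and BDS B1/B2/B3 values.

C = 299_792_458.0  # speed of light, m/s

GPS_L1, GPS_L2 = 1575.42e6, 1227.60e6
BDS_B1, BDS_B2, BDS_B3 = 1561.098e6, 1207.14e6, 1268.52e6

print(f"GPS WL  (L1-L2): {C / abs(GPS_L1 - GPS_L2):.3f} m")
print(f"BDS WL  (B1-B2): {C / abs(BDS_B1 - BDS_B2):.3f} m")
print(f"BDS EWL (B2-B3): {C / abs(BDS_B2 - BDS_B3):.3f} m")
```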

  3. Resolution improvement of low frequency AC magnetic field detection for modulated MR sensors.

    Science.gov (United States)

    Hu, Jinghua; Pan, Mengchun; Hu, Jiafei; Li, Sizhong; Chen, Dixiang; Tian, Wugang; Sun, Kun; Du, Qingfa; Wang, Yuan; Pan, Long; Zhou, Weihong; Zhang, Qi; Li, Peisen; Peng, Junping; Qiu, Weicheng; Zhou, Jikun

    2017-09-01

    Magnetic modulation methods, especially Micro-Electro-Mechanical System (MEMS) modulation, can improve the sensitivity of magnetoresistive (MR) sensors dramatically, and pT-level detection of Direct Current (DC) magnetic fields can be realized. In a Low-Frequency Alternating Current (LFAC) magnetic field measurement, however, frequency measurement is limited by a serious spectrum aliasing problem caused by the remanence in the sensors and the geomagnetic field, leading to a loss of target information, because the frequency indicates the magnetic target characteristics. In this paper, a compensation field produced with integrated coils is applied to the MR sensor to remove the DC magnetic field distortion, and an LFAC magnetic field frequency estimation algorithm is proposed based on a search of a database, which is derived from a numerical model revealing the relationship between the LFAC frequency and a determination factor [defined by the ratio of Discrete Fourier Transform (DFT) coefficients]. In this algorithm, an inverse modulation of the sensor signals is performed to detect the jumping-off point of the LFAC in the time domain; this step is exploited to determine the sampling points to be processed. The determination factor is calculated and matched against the database to determine the frequency with a binary search algorithm. Experimental results demonstrate that the frequency measurement resolution of the LFAC magnetic field is improved from 12.2 Hz to 0.8 Hz by the presented method, which, within the signal band of a magnetic anomaly (0.04-2 Hz), indicates that the proposed method may expand the applications of magnetoresistive (MR) sensors to human healthcare and magnetic anomaly detection (MAD).

  4. Experimental demonstration of producing high resolution zone plates by spatial-frequency multiplication

    International Nuclear Information System (INIS)

    Yun, W.B.; Howells, M.R.

    1987-01-01

    In an earlier publication, the possibility of producing high resolution zone plates for x-ray applications by spatial-frequency multiplication was analyzed theoretically. The theory predicted that for a daughter zone plate generated from the interference of the mth and nth diffraction orders of a parent zone plate, its primary focal spot size and focal length are one (m + n)th of their counterparts of the parent zone plate, respectively. It was also shown that a zone plate with an outermost zone width as small as 13.8 nm might be produced by this technique. In this paper, we report an experiment which we carried out with laser light (λ = 4166 Å) to demonstrate this technique. In addition, an outlook for producing high resolution zone plates for x-ray applications is briefly discussed
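
    The (m + n)-fold scaling can be illustrated with a two-line calculation; the parent zone plate parameters and the order pair below are assumed example values, chosen only so that the daughter outermost zone width comes out near the 13.8 nm figure quoted above.

```python
# Minimal sketch of the (m+n)-fold spatial-frequency multiplication scaling:
# the daughter zone plate's focal length and outermost zone width are 1/(m+n)
# of the parent's. All parameter values below are assumed for illustration.

m, n = 2, 1                      # interfering diffraction orders (assumed)
parent_dr = 41.4e-9              # parent outermost zone width, m (assumed)
parent_f = 1.0e-3                # parent focal length at the working wavelength, m (assumed)

daughter_dr = parent_dr / (m + n)
daughter_f = parent_f / (m + n)
print(f"daughter outermost zone width: {daughter_dr * 1e9:.1f} nm")
print(f"daughter focal length        : {daughter_f * 1e3:.3f} mm")
```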

  5. Imaging Optical Frequencies with 100 μHz Precision and 1.1 μm Resolution

    Science.gov (United States)

    Marti, G. Edward; Hutson, Ross B.; Goban, Akihisa; Campbell, Sara L.; Poli, Nicola; Ye, Jun

    2018-03-01

    We implement imaging spectroscopy of the optical clock transition of lattice-trapped degenerate fermionic Sr in the Mott-insulating regime, combining micron spatial resolution with submillihertz spectral precision. We use these tools to demonstrate atomic coherence for up to 15 s on the clock transition and reach a record frequency precision of 2.5 × 10⁻¹⁹. We perform the most rapid evaluation of trapping light shifts and record a 150 mHz linewidth, the narrowest Rabi line shape observed on a coherent optical transition. The important emerging capability of combining high-resolution imaging and spectroscopy will improve the clock precision, and provide a path towards measuring many-body interactions and testing fundamental physics.

  6. Resolution enhancement of robust Bayesian pre-stack inversion in the frequency domain

    Science.gov (United States)

    Yin, Xingyao; Li, Kun; Zong, Zhaoyun

    2016-10-01

    AVO/AVA (amplitude variation with an offset or angle) inversion is one of the most practical and useful approaches to estimating model parameters. So far, publications on AVO inversion in the Fourier domain have been quite limited in view of its poor stability and sensitivity to noise compared with time-domain inversion. For the resolution and stability of AVO inversion in the Fourier domain, a novel robust Bayesian pre-stack AVO inversion based on the mixed domain formulation of stationary convolution is proposed which could solve the instability and achieve superior resolution. The Fourier operator will be integrated into the objective equation and it avoids the Fourier inverse transform in our inversion process. Furthermore, the background constraints of model parameters are taken into consideration to improve the stability and reliability of inversion which could compensate for the low-frequency components of seismic signals. Besides, the different frequency components of seismic signals can realize decoupling automatically. This will help us to solve the inverse problem by means of multi-component successive iterations and the convergence precision of the inverse problem could be improved. So, superior resolution compared with the conventional time-domain pre-stack inversion could be achieved easily. Synthetic tests illustrate that the proposed method could achieve high-resolution results with a high degree of agreement with the theoretical model and verify the quality of anti-noise. Finally, applications on a field data case demonstrate that the proposed method could obtain stable inversion results of elastic parameters from pre-stack seismic data in conformity with the real logging data.

  7. Dynamic frequency-domain interferometer for absolute distance measurements with high resolution

    International Nuclear Information System (INIS)

    Weng, Jidong; Liu, Shenggang; Ma, Heli; Tao, Tianjiong; Wang, Xiang; Liu, Cangli; Tan, Hua

    2014-01-01

    A unique dynamic frequency-domain interferometer for absolute distance measurement has been developed recently. This paper presents the working principle of the new interferometric system, which uses a photonic crystal fiber to transmit the wide-spectrum light beams and a high-speed streak camera or frame camera to record the interference stripes. Preliminary measurements of harmonic vibrations of a speaker, driven by a radio, and the changes in the tip clearance of a rotating gear wheel show that this new type of interferometer has the ability to perform absolute distance measurements both with high time- and distance-resolution

  8. Dynamic frequency-domain interferometer for absolute distance measurements with high resolution

    Science.gov (United States)

    Weng, Jidong; Liu, Shenggang; Ma, Heli; Tao, Tianjiong; Wang, Xiang; Liu, Cangli; Tan, Hua

    2014-11-01

    A unique dynamic frequency-domain interferometer for absolute distance measurement has been developed recently. This paper presents the working principle of the new interferometric system, which uses a photonic crystal fiber to transmit the wide-spectrum light beams and a high-speed streak camera or frame camera to record the interference stripes. Preliminary measurements of harmonic vibrations of a speaker, driven by a radio, and the changes in the tip clearance of a rotating gear wheel show that this new type of interferometer has the ability to perform absolute distance measurements both with high time- and distance-resolution.

  9. High resolution kilometric range optical telemetry in air by radio frequency phase measurement

    Energy Technology Data Exchange (ETDEWEB)

    Guillory, Joffray; García-Márquez, Jorge; Truong, Daniel; Wallerand, Jean-Pierre [Laboratoire Commun de Métrologie LNE-Cnam (LCM), LNE, 1 rue Gaston Boissier, 75015 Paris (France); Šmíd, Radek [Laboratoire Commun de Métrologie LNE-Cnam (LCM), LNE, 1 rue Gaston Boissier, 75015 Paris (France); Institute of Scientific Instruments of the CAS, Kralovopolska 147, 612 64 Brno (Czech Republic); Alexandre, Christophe [Centre d’Études et de Recherche en Informatique et Communications (CEDRIC), Cnam, 292 rue St-Martin, 75003 Paris (France)

    2016-07-15

    We have developed an optical Absolute Distance Meter (ADM) based on the measurement of the phase accumulated by a Radio Frequency wave during its propagation through the air on a laser beam. In this article, the ADM principle will be described and the main results will be presented. In particular, we will emphasize how the choice of an appropriate photodetector can significantly improve the telemeter performance by minimizing amplitude-to-phase conversion. Our prototype, tested in the field, has proven its efficiency with a resolution better than 15 μm for a measurement time of 10 ms and distances up to 1.2 km.

  10. The first full-resolution measurements of Auroral Medium Frequency Burst Emissions

    Science.gov (United States)

    Bunch, N. L.; Labelle, J.; Weatherwax, A.; Hughes, J.

    2008-12-01

    Auroral MF burst is a naturally occurring auroral radio emission which appears unstructured at the resolution of previous measurements, is observed in the frequency range of 0.8-4.5 MHz, and has typical amplitudes of around 10⁻¹⁴ V²/m²/Hz and durations of a few minutes. The emission occurs at substorm onset. Since September 2006, Dartmouth has operated a broadband (0-5 MHz) interferometer at Toolik Lake, Alaska (68° 38' N, 149° 36' W, 68.51° magnetic latitude), designed for the study of auroral MF burst emissions. Normal operation involves taking snapshots of waveforms from four spaced antennas, from which wave spectral and directional information is obtained. However, the experiment can also be run in "continuous mode" whereby the signal from a selected antenna is sampled continuously at 10 Msamples/second. A "continuous mode" campaign was run 0800-1200 UT (~2200-0200 MLT) daily from March 21 to April 19, 2008. During this campaign more than twenty auroral MF burst emissions were observed, including three extraordinarily intense examples lasting approximately two minutes each. These observations represent the highest time and frequency resolution data ever collected on MF burst emissions. These data allow us to better characterize the null near twice the electron gyrofrequency identified in previous experiments, since examples of this feature observed during this campaign display a strong null ~50 kHz in bandwidth, with sharp boundaries and occasionally coincident with 2f_ce auroral roar. These data also allow us to search for frequency-time structures embedded in MF burst. One prominent feature appears to be a strong single-frequency emission which broadens down to lower frequencies over time, spreading to approximately 500 kHz in bandwidth over ~10 ms. Among other features observed are a diffuse and unstructured emission, as well as what could potentially be several separate emission sources, with multiple emissions occurring simultaneously, appearing as weaker

  11. Effects of lateral boundary condition resolution and update frequency on regional climate model predictions

    Science.gov (United States)

    Pankatz, Klaus; Kerkweg, Astrid

    2015-04-01

    The work presented is part of the joint project "DecReg" ("Regional decadal predictability") which is in turn part of the project "MiKlip" ("Decadal predictions"), an effort funded by the German Federal Ministry of Education and Research to improve decadal predictions on a global and regional scale. In MiKlip, one big question is if regional climate modeling shows "added value", i.e. to evaluate, if regional climate models (RCM) produce better results than the driving models. However, the scope of this study is to look more closely at the setup specific details of regional climate modeling. As regional models only simulate a small domain, they have to inherit information about the state of the atmosphere at their lateral boundaries from external data sets. There are many unresolved questions concerning the setup of lateral boundary conditions (LBC). External data sets come from global models or from global reanalysis data-sets. A temporal resolution of six hours is common for this kind of data. This is mainly due to the fact, that storage space is a limiting factor, especially for climate simulations. However, theoretically, the coupling frequency could be as high as the time step of the driving model. Meanwhile, it is unclear if a more frequent update of the LBCs has a significant effect on the climate in the domain of the RCM. The first study examines how the RCM reacts to a higher update frequency. The study is based on a 30 year time slice experiment for three update frequencies of the LBC, namely six hours, one hour and six minutes. The evaluation of means, standard deviations and statistics of the climate in the regional domain shows only small deviations, some statistically significant though, of 2m temperature, sea level pressure and precipitation. The second part of the first study assesses parameters linked to cyclone activity, which is affected by the LBC update frequency. Differences in track density and strength are found when comparing the simulations

  12. High resolution switching mode inductance-to-frequency converter with temperature compensation.

    Science.gov (United States)

    Matko, Vojko; Milanović, Miro

    2014-10-16

    This article proposes a novel method for the temperature-compensated inductance-to-frequency converter with a single quartz crystal oscillating in the switching oscillating circuit to achieve better temperature stability of the converter. The novelty of this method lies in the switching-mode converter, the use of additionally connected impedances in parallel to the shunt capacitances of the quartz crystal, and two inductances in series to the quartz crystal. This brings a considerable reduction of the temperature influence of AT-cut crystal frequency change in the temperature range between 10 and 40 °C. The oscillator switching method and the switching impedances connected to the quartz crystal do not only compensate for the crystal's natural temperature characteristics but also any other influences on the crystal such as ageing as well as from other oscillating circuit elements. In addition, the method also improves frequency sensitivity in inductance measurements. The experimental results show that through high temperature compensation improvement of the quartz crystal characteristics, this switching method theoretically enables a 2 pH resolution. It converts inductance to frequency in the range of 85-100 µH to 2-560 kHz.

  13. High Resolution Switching Mode Inductance-to-Frequency Converter with Temperature Compensation

    Directory of Open Access Journals (Sweden)

    Vojko Matko

    2014-10-01

    This article proposes a novel method for the temperature-compensated inductance-to-frequency converter with a single quartz crystal oscillating in the switching oscillating circuit to achieve better temperature stability of the converter. The novelty of this method lies in the switching-mode converter, the use of additionally connected impedances in parallel to the shunt capacitances of the quartz crystal, and two inductances in series to the quartz crystal. This brings a considerable reduction of the temperature influence of AT-cut crystal frequency change in the temperature range between 10 and 40 °C. The oscillator switching method and the switching impedances connected to the quartz crystal do not only compensate for the crystal's natural temperature characteristics but also any other influences on the crystal such as ageing as well as from other oscillating circuit elements. In addition, the method also improves frequency sensitivity in inductance measurements. The experimental results show that through high temperature compensation improvement of the quartz crystal characteristics, this switching method theoretically enables a 2 pH resolution. It converts inductance to frequency in the range of 85–100 µH to 2–560 kHz.

  14. Extending electronic length frequency analysis in R

    DEFF Research Database (Denmark)

    Taylor, M. H.; Mildenberger, Tobias K.

    2017-01-01

    Electronic length frequency analysis (ELEFAN) is a system of stock assessment methods using length-frequency (LFQ) data. One step is the estimation of growth from the progression of LFQ modes through time using the von Bertalanffy growth function (VBGF). The option to fit a seasonally oscillating VBGF (soVBGF) requires a more intensive search due to two additional parameters. This work describes the implementation of two optimisation approaches ("simulated annealing" and "genetic algorithm") for growth function fitting using the open-source software "R." Using a generated LFQ data set ... of the asymptotic length parameter (L-infinity) are found to have significant effects on parameter estimation error. An outlook provides context as to the significance of the R-based implementation for further testing and development, as well as the general relevance of the method for data-limited stock assessment.
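
    For context, a common form of the seasonally oscillating VBGF (a Somers-type parameterization, which may differ in detail from the one used in ELEFAN) is sketched below with illustrative parameter values; the two extra parameters C and ts are what make the soVBGF search more demanding. The sketch is in Python for illustration, while the toolbox itself is implemented in R.

```python
import numpy as np

# Minimal sketch of a seasonally oscillating von Bertalanffy growth function
# (Somers-type form). All parameter values are assumed for illustration only.

def soVBGF(t, Linf, K, t0, C, ts):
    """Length at age t (years) with a seasonal growth oscillation."""
    S = lambda x: (C * K / (2 * np.pi)) * np.sin(2 * np.pi * (x - ts))
    return Linf * (1.0 - np.exp(-K * (t - t0) - S(t) + S(t0)))

ages = np.linspace(0, 5, 11)
print(np.round(soVBGF(ages, Linf=80.0, K=0.5, t0=-0.2, C=0.75, ts=0.5), 1))
```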

  15. Advances in Computational High-Resolution Mechanical Spectroscopy HRMS Part II: Resonant Frequency – Young's Modulus

    International Nuclear Information System (INIS)

    Majewski, M; Magalas, L B

    2012-01-01

    In this paper, we compare the values of the resonant frequency f0 of free decaying oscillations computed according to the parametric OMI method (Optimization in Multiple Intervals) and nonparametric DFT-based (discrete Fourier transform) methods as a function of the sampling frequency. The analysis is carried out for free decaying signals embedded in experimental noise recorded for metallic samples in a low-frequency resonant mechanical spectrometer. The Yoshida method (Y), the Agrez method (A), and new interpolated discrete Fourier transform (IpDFT) methods, that is, the Yoshida-Magalas (YM) and (YMC) methods developed by the authors, are carefully compared for the resonant frequency f0 = 1.12345 Hz and the logarithmic decrement δ = 0.0005. Precise estimation of the resonant frequency (Young's modulus ∝ f0²) for real experimental conditions, i.e., for exponentially damped harmonic signals embedded in experimental noise, is a complex task. In this work, various computing methods are analyzed as a function of the sampling frequency used to digitize free decaying oscillations. The importance of computing techniques to obtain reliable and precise values of the resonant frequency (i.e. Young's modulus) in materials science is emphasized.
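
    A generic interpolated-DFT estimate (quadratic interpolation of the log-magnitude around the peak bin, not the specific YM/YMC estimators developed in the paper) is sketched below for a synthetic damped free decay using the f0 and δ quoted above; the sampling frequency and record length are assumed.

```python
import numpy as np

# Minimal sketch of a nonparametric, interpolated-DFT estimate of the resonant
# frequency of an exponentially damped free decay. Generic parabolic-peak
# interpolation is used; sampling frequency and record length are assumed.

fs = 50.0                       # sampling frequency, Hz (assumed)
f0, delta = 1.12345, 0.0005     # resonance and logarithmic decrement from the paper
t = np.arange(0, 200, 1 / fs)   # 200 s free decay (assumed length)
y = np.exp(-delta * f0 * t) * np.sin(2 * np.pi * f0 * t)

win = np.hanning(t.size)
Y = np.abs(np.fft.rfft(y * win))
k = np.argmax(Y)
# parabolic interpolation of the log-magnitude around the peak bin
a, b, c = np.log(Y[k - 1]), np.log(Y[k]), np.log(Y[k + 1])
shift = 0.5 * (a - c) / (a - 2 * b + c)
f_est = (k + shift) * fs / t.size
print(f"estimated f0 = {f_est:.5f} Hz (true {f0} Hz)")
```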

  16. High Frequency High Spectral Resolution Focal Plane Arrays for AtLAST

    Science.gov (United States)

    Baryshev, Andrey

    2018-01-01

    Large collecting area single dish telescopes such as AtLAST will be especially effective for medium (R ≈ 1000) and high (R ≈ 50000) spectral resolution observations. A large focal plane array is a natural solution to increase mapping speed. For medium resolution, direct detectors with filter banks (KIDs) and/or heterodyne technology can be employed. We will analyze performance limits of comparable KID and SIS focal plane arrays, taking into account the quantum limit and the high background conditions of a terrestrial observing site. For large heterodyne focal plane arrays, high current density AlN junctions open the possibility of a large instantaneous bandwidth >40%. This, and possible multi-frequency-band FPAs, presents a practical challenge for spatial sampling and scanning strategies. We will discuss phased array feeds as a possible solution, including a modular back-end system which can be shared between KID- and SIS-based FPAs. Finally we will discuss achievable sensitivities and pixel counts for high frequency (>500 GHz) FPAs and address the main technical challenges: LO distribution, wire counts, bias line multiplexing, and monolithic vs. discrete mixer component integration.

  17. A fiber-optic interferometer with subpicometer resolution for dc and low-frequency displacement measurement

    International Nuclear Information System (INIS)

    Smith, D. T.; Pratt, J. R.; Howard, L. P.

    2009-01-01

    We have developed a fiber-optic interferometer optimized for best performance in the frequency range from dc to 1 kHz, with displacement linearity of 1% over a range of ±25 nm, and noise-limited resolution of 2 pm. The interferometer uses a tunable infrared laser source (nominal 1550 nm wavelength) with high amplitude and wavelength stability, low spontaneous self-emission noise, high sideband suppression, and a coherence control feature that broadens the laser linewidth and dramatically lowers the low-frequency noise in the system. The amplitude stability of the source, combined with the use of specially manufactured "bend-insensitive" fiber and all-spliced fiber construction, results in a robust homodyne interferometer system, which achieves a resolution of 40 fm/√Hz above 20 Hz and approaches the shot-noise limit of 20 fm/√Hz at 1 kHz for an optical power of 10 μW, without the need for differential detection. Here we describe the design and construction of the interferometer, as well as modes of operation, and demonstrate its performance.

  18. Calibration of GLONASS Inter-Frequency Code Bias for PPP Ambiguity Resolution with Heterogeneous Rover Receivers

    Directory of Open Access Journals (Sweden)

    Yanyan Liu

    2018-03-01

    Integer ambiguity resolution (IAR) is important for rapid initialization of precise point positioning (PPP). Whereas many studies have been limited to the Global Positioning System (GPS) alone, there is a strong need to add the Globalnaya Navigatsionnaya Sputnikovaya Sistema (GLONASS) to the PPP-IAR solution. However, the frequency-division multiplexing of GLONASS signals causes inter-frequency code bias (IFCB) in the receiving equipment. The IFCB causes GLONASS wide-lane uncalibrated phase delay (UPD) estimation with heterogeneous receiver types to fail, so GLONASS ambiguities are traditionally estimated as float values in PPP. A two-step method of calibrating the GLONASS IFCB is proposed in this paper, such that GLONASS PPP-IAR can be performed with heterogeneous receivers. Experimental results demonstrate that with the proposed method, GLONASS PPP ambiguity resolution can be achieved across a variety of receiver types. For kinematic PPP with mixed receiver types, the fixing percentage within 10 min is only 33.5% for GPS-only. Upon adding GLONASS, the percentage improves substantially, to 84.9%.

  19. Development of nanometer resolution C-Band radio frequency beam position monitors in the Final Focus Test Beam

    Energy Technology Data Exchange (ETDEWEB)

    Slaton, T.; Mazaheri, G. [Stanford Univ., CA (US). Stanford Linear Accelerator Center; Shintake, T. [National Lab. for High Energy Physics, Tsukuba, Ibaraki (Japan)

    1998-08-01

    Using a 47 GeV electron beam, the Final Focus Test Beam (FFTB) produces vertical spot sizes around 70 nm. These small beam sizes introduce an excellent opportunity to develop and test high resolution Radio Frequency Beam Position Monitors (RF-BPMs). These BPMs are designed to measure pulse-to-pulse beam motion (jitter) at a theoretical resolution of approximately 1 nm. The beam induces a TM110 mode with an amplitude linearly proportional to its charge and displacement from the BPM's (cylindrical cavity) axis. The C-band (5,712 MHz) TM110 signal is processed and converted into beam position for use by the Stanford Linear Collider (SLC) control system. Presented are the experimental procedures, acquisition, and analysis of data demonstrating resolution of jitter near 25 nm. With the design of future e+e− linear colliders requiring spot sizes close to 3 nm, understanding and developing RF-BPMs will be essential in resolving and controlling jitter.

  20. Frequency domain analysis of knock images

    Science.gov (United States)

    Qi, Yunliang; He, Xin; Wang, Zhi; Wang, Jianxin

    2014-12-01

    High speed imaging-based knock analysis has mainly focused on time domain information, e.g. the spark triggered flame speed, the time when end gas auto-ignition occurs and the end gas flame speed after auto-ignition. This study presents a frequency domain analysis on the knock images recorded using a high speed camera with direct photography in a rapid compression machine (RCM). To clearly visualize the pressure wave oscillation in the combustion chamber, the images were high-pass-filtered to extract the luminosity oscillation. The luminosity spectrum was then obtained by applying fast Fourier transform (FFT) to three basic colour components (red, green and blue) of the high-pass-filtered images. Compared to the pressure spectrum, the luminosity spectra better identify the resonant modes of pressure wave oscillation. More importantly, the resonant mode shapes can be clearly visualized by reconstructing the images based on the amplitudes of luminosity spectra at the corresponding resonant frequencies, which agree well with the analytical solutions for mode shapes of gas vibration in a cylindrical cavity.
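
    The frequency-domain step can be sketched as follows: the per-frame mean luminosity of each colour channel is high-pass filtered and Fourier transformed to locate the oscillation frequency. The synthetic image stack, frame rate and resonance frequency below are assumed stand-ins for the high-speed knock video.

```python
import numpy as np

# Minimal sketch: high-pass filter the per-frame mean luminosity of each colour
# channel, then FFT to find the pressure-wave oscillation frequency. The frame
# rate, resonance frequency and synthetic image stack are assumed values.

rng = np.random.default_rng(1)
fps = 100_000                                # assumed frame rate, frames/s
n_frames, h, w = 2048, 32, 32
t = np.arange(n_frames) / fps
f_osc = 6_000.0                              # assumed acoustic resonance, Hz

frames = rng.normal(100.0, 1.0, (n_frames, h, w, 3))          # background + noise
osc = np.sin(2 * np.pi * f_osc * t)[:, None, None]
for ch, amp in enumerate((5.0, 2.0, 1.0)):                     # R, G, B amplitudes
    frames[..., ch] += amp * osc

freqs = np.fft.rfftfreq(n_frames, 1 / fps)
for ch, name in enumerate(("red", "green", "blue")):
    lum = frames[..., ch].mean(axis=(1, 2))                    # mean luminosity per frame
    lum = lum - np.convolve(lum, np.ones(64) / 64, mode="same")   # crude high-pass
    spec = np.abs(np.fft.rfft(lum * np.hanning(n_frames)))
    peak = np.argmax(spec[1:]) + 1                             # skip the DC bin
    print(f"{name:5s}: spectral peak at {freqs[peak]:.0f} Hz")
```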

  1. Length-extension resonator as a force sensor for high-resolution frequency-modulation atomic force microscopy in air.

    Science.gov (United States)

    Beyer, Hannes; Wagner, Tino; Stemmer, Andreas

    2016-01-01

    Frequency-modulation atomic force microscopy has turned into a well-established method to obtain atomic resolution on flat surfaces, but is often limited to ultra-high vacuum conditions and cryogenic temperatures. Measurements under ambient conditions are influenced by variations of the dew point and thin water layers present on practically every surface, complicating stable imaging with high resolution. We demonstrate high-resolution imaging in air using a length-extension resonator operating at small amplitudes. An additional slow feedback compensates for changes in the free resonance frequency, allowing stable imaging over a long period of time with changing environmental conditions.

  2. Analysis of Time Resolution in HGCAL Testbeam

    CERN Document Server

    Steentoft, Jonas

    2017-01-01

    Using data from a 250 GeV electron run during the November 2016 HGCAL testbeam, the time resolution of the High Granularity hadronic endcap Calorimeter (HGCAL) was investigated, looking at the seven innermost Si cells and using them as reference timers for each other. Cuts on the data were applied based on signal amplitude (0.05 V < A < 0.45 V) and the position of the incoming beam particle (0 mm < TDCx < 22 mm and −7 mm ...). A time resolution of 15-50 ps was obtained, depending on which two cells were compared and how the low-statistics cut was placed. We also confirmed a slight correlation between time resolution and distanc...

  3. HIGH-RESOLUTION RADIO OBSERVATIONS OF THE REMNANT OF SN 1987A AT HIGH FREQUENCIES

    International Nuclear Information System (INIS)

    Zanardo, Giovanna; Staveley-Smith, L.; Potter, T. M.; Ng, C.-Y.; Gaensler, B. M.; Manchester, R. N.; Tzioumis, A. K.

    2013-01-01

    We present new imaging observations of the remnant of Supernova (SN) 1987A at 44 GHz, performed in 2011 with the Australia Telescope Compact Array (ATCA). The 0.35″ × 0.23″ resolution of the diffraction-limited image is the highest achieved to date in high dynamic range. We also present a new ATCA image at 18 GHz derived from 2011 observations, which is super-resolved to 0.25″. The flux density is 40 ± 2 mJy at 44 GHz and 81 ± 6 mJy at 18 GHz. At both frequencies, the remnant exhibits a ring-like emission with two prominent lobes, and an east-west brightness asymmetry that peaks on the eastern lobe. A central feature of fainter emission appears at 44 GHz. A comparison with previous ATCA observations at 18 and 36 GHz highlights higher expansion velocities of the remnant's eastern side. The 18-44 GHz spectral index is α = −0.80 (S_ν ∝ ν^α). The spectral index map suggests slightly steeper values at the brightest sites on the eastern lobe, whereas flatter values are associated with the inner regions. The remnant morphology at 44 GHz generally matches the structure seen with contemporaneous X-ray and Hα observations. Unlike the Hα emission, both the radio and X-ray emission peaks on the eastern lobe. The regions of flatter spectral index align and partially overlap with the optically visible ejecta. Simple free-free absorption models suggest that emission from a pulsar wind nebula or a compact source inside the remnant may now be detectable at high frequencies, or at low frequencies if there are holes in the ionized component of the ejecta.

  4. Optimal depth-based regional frequency analysis

    Directory of Open Access Journals (Sweden)

    H. Wazneh

    2013-06-01

    Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region, which can lead to a loss of some information, and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function φ (e.g., φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of φ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.

  5. Optimal depth-based regional frequency analysis

    Science.gov (United States)

    Wazneh, H.; Chebana, F.; Ouarda, T. B. M. J.

    2013-06-01

    Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region which can lead to a loss of some information and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function ϕ (e.g., φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of ϕ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.

  6. High resolution or optimum resolution? Spatial analysis of the Federmesser site at Andernach, Germany

    NARCIS (Netherlands)

    Stapert, D; Street, M

    1997-01-01

    This paper discusses spatial analysis at site level. It is suggested that spatial analysis has to proceed at several levels, from global to more detailed questions, and that an optimum resolution should be established when applying any quantitative method in this field. As an example, the ring and

  7. NOAA High-Resolution Sea Surface Temperature (SST) Analysis Products

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This archive covers two high resolution sea surface temperature (SST) analysis products developed using an optimum interpolation (OI) technique. The analyses have a...

  8. High-resolution vertical velocities and their power spectrum observed with the MAARSY radar - Part 1: frequency spectrum

    Science.gov (United States)

    Li, Qiang; Rapp, Markus; Stober, Gunter; Latteck, Ralph

    2018-04-01

    The Middle Atmosphere Alomar Radar System (MAARSY) installed on the island of Andøya has been run for continuous probing of atmospheric winds in the upper troposphere and lower stratosphere (UTLS) region. In the current study, we present high-resolution wind measurements during the period between 2010 and 2013 with MAARSY. A spectral analysis applying the Lomb-Scargle periodogram method has been carried out to determine the frequency spectra of vertical wind velocity. From a total of 522 days of observations, the statistics of the spectral slope have been derived and show a dependence on the background wind conditions. It is a general feature that the observed spectra of vertical velocity during active periods (with wind velocity > 10 m s-1) are much steeper than during quiet periods (with wind velocity < 10 m s-1). With all wind conditions considered together, the general spectra are obtained and their slopes are compared with the background horizontal winds. The comparisons show that the observed spectra become steeper with increasing wind velocities under quiet conditions, approach a spectral slope of -5/3 at a wind velocity of 10 m s-1 and then roughly maintain this slope (-5/3) for even stronger winds. Our findings show an overall agreement with previous studies; furthermore, they provide a more complete climatology of frequency spectra of vertical wind velocities under different wind conditions.
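    As a rough, hedged sketch of the kind of spectral-slope estimate described above: the Lomb-Scargle periodogram of an (unevenly sampled) vertical-velocity series can be computed with SciPy and its log-log slope fitted by least squares. The sample times, velocities, and frequency grid below are synthetic placeholders, not MAARSY data or the authors' exact processing chain.

        import numpy as np
        from scipy.signal import lombscargle

        rng = np.random.default_rng(0)
        t = np.sort(rng.uniform(0.0, 6 * 3600.0, 2000))   # irregular sample times [s]
        w = rng.standard_normal(t.size)                    # placeholder vertical velocities [m/s]

        freqs = np.logspace(-4, -2, 200)                   # 0.1 mHz .. 10 mHz
        power = lombscargle(t, w - w.mean(), 2 * np.pi * freqs, normalize=True)

        # least-squares slope of the periodogram in log-log space (compare with -5/3)
        slope, _ = np.polyfit(np.log10(freqs), np.log10(power), 1)
        print(f"fitted spectral slope: {slope:.2f}")

    For white noise the fitted slope should come out near zero; a slope approaching -5/3 would indicate the steeper spectra reported for stronger background winds.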

  9. Surface electromyography based muscle fatigue detection using high-resolution time-frequency methods and machine learning algorithms.

    Science.gov (United States)

    Karthick, P A; Ghosh, Diptasree Maitra; Ramakrishnan, S

    2018-02-01

    Surface electromyography (sEMG) based muscle fatigue research is widely preferred in sports science and occupational/rehabilitation studies due to its noninvasiveness. However, these signals are complex, multicomponent and highly nonstationary with large inter-subject variations, particularly during dynamic contractions. Hence, time-frequency based machine learning methodologies can improve the design of automated systems for these signals. In this work, analyses based on high-resolution time-frequency methods, namely, the Stockwell transform (S-transform), B-distribution (BD) and extended modified B-distribution (EMBD), are proposed to differentiate the dynamic muscle nonfatigue and fatigue conditions. The nonfatigue and fatigue segments of sEMG signals recorded from the biceps brachii of 52 healthy volunteers are preprocessed and subjected to S-transform, BD and EMBD. Twelve features are extracted from each method and prominent features are selected using a genetic algorithm (GA) and binary particle swarm optimization (BPSO). Five machine learning algorithms, namely, naïve Bayes, support vector machine (SVM) with polynomial and radial basis kernels, random forest and rotation forest, are used for the classification. The results show that all the proposed time-frequency distributions (TFDs) are able to show the nonstationary variations of sEMG signals. Most of the features exhibit a statistically significant difference between the muscle fatigue and nonfatigue conditions. The largest reduction in the number of features (66%) is achieved by GA for the EMBD and by BPSO for the BD TFD, respectively. The combination of EMBD features and a polynomial-kernel SVM is found to be the most accurate (91% accuracy) in classifying the conditions with the features selected using GA. The proposed methods are found to be capable of handling the nonstationary and multicomponent variations of sEMG signals recorded in dynamic fatiguing contractions. In particular, the combination of EMBD features and a polynomial-kernel SVM could be used to
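    As a hedged illustration of the final classification stage only (not the authors' complete GA/BPSO feature-selection pipeline), a polynomial-kernel SVM can be cross-validated on a matrix of time-frequency features. The feature matrix and labels below are synthetic placeholders standing in for descriptors extracted from S-transform, BD, or EMBD representations.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X = rng.standard_normal((200, 12))   # 200 sEMG segments x 12 TFD features (placeholder)
        y = rng.integers(0, 2, 200)          # 0 = non-fatigue, 1 = fatigue (placeholder labels)

        clf = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=3, C=1.0))
        scores = cross_val_score(clf, X, y, cv=5)
        print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))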

  10. High frequency, high time resolution time-to-digital converter employing passive resonating circuits.

    Science.gov (United States)

    Ripamonti, Giancarlo; Abba, Andrea; Geraci, Angelo

    2010-05-01

    A method for measuring time intervals accurate to the picosecond range is based on phase measurements of oscillating waveforms synchronous with their beginning and/or end. The oscillation is generated by triggering an LC resonant circuit, whose capacitance is precharged. By using high Q resonators and a final active quenching of the oscillation, it is possible to combine high time resolution with a short measurement time, which allows a high measurement rate. Methods for fast analysis of the data are considered and discussed with reference to computing resource requirements, speed, and accuracy. Experimental tests show the feasibility of the method and a time accuracy better than 4 ps rms. Methods aimed at further reducing hardware resources are finally discussed.
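    A generic, hedged sketch of the underlying idea — recovering a sub-sample time offset from the phase of a sampled oscillation by a linear least-squares fit at a known resonator frequency. The sampling rate, resonator frequency, and noise level are illustrative assumptions and do not describe the converter's actual electronics.

        import numpy as np

        fs = 2.0e9            # assumed digitizer rate [Hz]
        f0 = 100.0e6          # assumed resonator frequency [Hz]
        t0_true = 37.3e-12    # time offset to be recovered [s]

        rng = np.random.default_rng(0)
        t = np.arange(64) / fs
        y = np.sin(2 * np.pi * f0 * (t + t0_true)) + 0.01 * rng.standard_normal(t.size)

        # fit y ~ a*sin(2*pi*f0*t) + b*cos(2*pi*f0*t); the fitted phase gives the offset
        A = np.column_stack([np.sin(2 * np.pi * f0 * t), np.cos(2 * np.pi * f0 * t)])
        a, b = np.linalg.lstsq(A, y, rcond=None)[0]
        t0_est = np.arctan2(b, a) / (2 * np.pi * f0)
        print(f"true offset {t0_true * 1e12:.1f} ps, estimate {t0_est * 1e12:.1f} ps")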

  11. High frequency, high time resolution time-to-digital converter employing passive resonating circuits

    International Nuclear Information System (INIS)

    Ripamonti, Giancarlo; Abba, Andrea; Geraci, Angelo

    2010-01-01

    A method for measuring time intervals accurate to the picosecond range is based on phase measurements of oscillating waveforms synchronous with their beginning and/or end. The oscillation is generated by triggering an LC resonant circuit, whose capacitance is precharged. By using high Q resonators and a final active quenching of the oscillation, it is possible to combine high time resolution with a short measurement time, which allows a high measurement rate. Methods for fast analysis of the data are considered and discussed with reference to computing resource requirements, speed, and accuracy. Experimental tests show the feasibility of the method and a time accuracy better than 4 ps rms. Methods aimed at further reducing hardware resources are finally discussed.

  12. Development of a procedure to model high-resolution wind profiles from smoothed or low-frequency data

    Science.gov (United States)

    Camp, D. W.

    1977-01-01

    The derivation of simulated Jimsphere wind profiles from low-frequency rawinsonde data and a generated set of white noise data are presented. A computer program is developed to model high-resolution wind profiles based on the statistical properties of data from the Kennedy Space Center, Florida. Comparison of the measured Jimsphere data, rawinsonde data, and the simulated profiles shows excellent agreement.
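    A hedged sketch of the general idea: build a high-resolution profile by interpolating a coarse (rawinsonde-like) profile onto a fine altitude grid and superimposing band-limited white noise to stand in for small-scale structure. The grid spacings, noise level, and smoothing window are illustrative assumptions, not the statistics actually fitted to the Kennedy Space Center data.

        import numpy as np

        rng = np.random.default_rng(2)
        z_coarse = np.arange(0.0, 15000.0, 1000.0)          # rawinsonde altitudes [m]
        u_coarse = 10.0 + 0.002 * z_coarse                  # placeholder wind speeds [m/s]

        z_fine = np.arange(0.0, 14000.0, 25.0)              # Jimsphere-like resolution [m]
        u_smooth = np.interp(z_fine, z_coarse, u_coarse)    # low-frequency component

        kernel = np.ones(9) / 9.0                           # crude band-limiting filter
        fluct = 0.5 * np.convolve(rng.standard_normal(z_fine.size), kernel, mode="same")

        u_sim = u_smooth + fluct                            # simulated high-resolution profile
        print(u_sim[:5])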

  13. Geographic resolution issues in RAM transportation risk analysis

    International Nuclear Information System (INIS)

    Mills, G. Scott; Neuhauser, Sieglinde

    2000-01-01

    Over the years that radioactive material (RAM) transportation risk estimates have been calculated using the RADTRAN code, demand for improved geographic resolution of route characteristics, especially the density of population neighboring route segments, has led to code improvements that provide more specific route definition. With the advent of geographic information systems (GISs), the achievable resolution of route characteristics is theoretically very high. The authors have compiled population-density data in 1-kilometer increments for routes extending over hundreds of kilometers without impractical expenditures of time. Achievable resolution of analysis is limited, however, by the resolution of available data. U.S. Census data typically have 1-km or better resolution within densely-populated portions of metropolitan areas, but census blocks are much larger in rural areas. Accident-rate data, especially for heavy/combination trucks, are typically tabulated only on a statewide basis. These practical realities cause one to ask what level(s) of resolution may be necessary for meaningful risk analysis of transportation actions on a state or interstate scale.

  14. Assessment of modern spectral analysis methods to improve wavenumber resolution of F-K spectra

    International Nuclear Information System (INIS)

    Shirley, T.E.; Laster, S.J.; Meek, R.A.

    1987-01-01

    The improvement in wavenumber spectra obtained by using high resolution spectral estimators is examined. Three modern spectral estimators were tested, namely the Autoregressive/Maximum Entropy (AR/ME) method, the Extended Prony method, and an eigenstructure method. They were combined with the conventional Fourier method by first transforming each trace with a Fast Fourier Transform (FFT). A high resolution spectral estimator was applied to the resulting complex spatial sequence for each frequency. The collection of wavenumber spectra thus computed comprises a hybrid f-k spectrum with high wavenumber resolution and less spectral ringing. Synthetic and real data records containing 25 traces were analyzed by using the hybrid f-k method. The results show that an FFT-AR/ME f-k spectrum has noticeably better wavenumber resolution and more spectral dynamic range than conventional spectra when the number of channels is small. The observed improvement suggests that the hybrid technique is potentially valuable in seismic data analysis.

  15. A population frequency analysis of the FABP2 gene polymorphism

    African Journals Online (AJOL)

    salah

    DNA was extracted from blood samples for genotype analysis. A PCR-RFLP ... Thr54 genotype. The frequencies of the allele Ala54 and the allele Thr54 of the ... Table 2 of the paper reports genotype percentages and allele frequencies of the FABP2 polymorphism in various ethnic groups.

  16. Audio Frequency Analysis in Mobile Phones

    Science.gov (United States)

    Aguilar, Horacio Munguía

    2016-01-01

    A new experiment using mobile phones is proposed in which the phone's audio frequency response is analyzed by using the audio port to input an external signal and obtain a measurable output. This experiment shows how the limited audio bandwidth used in mobile telephony is the main cause of the poor speech quality in this service. A brief discussion is…

  17. Joint time frequency analysis in digital signal processing

    DEFF Research Database (Denmark)

    Pedersen, Flemming

    with this technique is that the resolution is limited because of distortion. To overcome the resolution limitations of the Fourier Spectrogram, many new distributions have been developed. In spite of this, the Fourier Spectrogram is by far the prime method for the analysis of signals whose spectral content is varying...

  18. Space-frequency analysis and reduction of potential field ambiguity

    Directory of Open Access Journals (Sweden)

    A. Rapolla

    1997-06-01

    Full Text Available Ambiguity in the depth estimation of magnetic sources via spectral analysis can be reduced by representing the field as a set of space-frequency atoms. This is obtained through a continuous wavelet transform using a Morlet analyzing wavelet. In the phase-plane representation, even a weak contribution related to deep-seated sources is clearly distinguished with respect to the more intense effect of a shallow source, also in the presence of strong noise. Furthermore, a new concept of local power spectrum allows the depths of both sources to be correctly interpreted. Neither result can be provided by standard Fourier analysis. Another method is proposed to reduce ambiguity by inversion of potential field data lying along the vertical axis. This method provides depth resolution for the gravity and magnetic methods and, under some conditions, helps to reduce their inherent ambiguity. Unlike the case of monopoles, inversion of a vertical profile of gravity data above a cubic source gives correct results for the cube side and density.
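    A small, hedged sketch of the space-scale idea using PyWavelets' Morlet continuous wavelet transform on a toy profile containing one narrow (shallow-source-like) and one broad (deep-source-like) anomaly; it only illustrates how the two contributions separate across scales and is not the authors' processing.

        import numpy as np
        import pywt

        x = np.linspace(-50.0, 50.0, 1024)                        # profile coordinate [km]
        shallow = 1.0 / (1.0 + ((x - 10.0) / 2.0) ** 2)           # narrow anomaly
        deep = 0.3 / (1.0 + ((x + 15.0) / 10.0) ** 2)             # broad anomaly
        field = shallow + deep

        scales = np.arange(1, 128)
        coefs, freqs = pywt.cwt(field, scales, "morl")
        energy = np.sum(np.abs(coefs) ** 2, axis=1)               # energy per scale
        print("scale of maximum energy:", int(scales[np.argmax(energy)]))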

  19. Frequency modulation television analysis: Threshold impulse analysis. [with computer program

    Science.gov (United States)

    Hodge, W. H.

    1973-01-01

    A computer program is developed to calculate the FM threshold impulse rates as a function of the carrier-to-noise ratio for a specified FM system. The system parameters and a vector of 1024 integers, representing the probability density of the modulating voltage, are required as input parameters. The computer program is utilized to calculate threshold impulse rates for twenty-four sets of measured probability data supplied by NASA and for sinusoidal and Gaussian modulating waveforms. As a result of the analysis several conclusions are drawn: (1) The use of preemphasis in an FM television system improves the threshold by reducing the impulse rate. (2) Sinusoidal modulation produces a total impulse rate which is a practical upper bound for the impulse rates of TV signals providing the same peak deviations. (3) As the moment of the FM spectrum about the center frequency of the predetection filter increases, the impulse rate tends to increase. (4) A spectrum having an expected frequency above (below) the center frequency of the predetection filter produces a higher negative (positive) than positive (negative) impulse rate.

  20. Frequency analysis of DC tolerant current transformers

    International Nuclear Information System (INIS)

    Mlejnek, P; Kaspar, P

    2013-01-01

    This article deals with the wide-frequency-range behaviour of DC tolerant current transformers that are usually used in modern static energy meters. In this application current transformers must comply with European and International Standards in their accuracy and DC tolerance. Therefore, linear DC tolerant current transformers and double-core current transformers are used in this field. More details about the problems of these particular types of transformers can be found in our previous works. Although these transformers are designed mainly for the power distribution network frequency (50/60 Hz), it is useful to understand their behaviour over a wider frequency range. Based on this knowledge, new generations of energy meters capable of measuring the quality of electric energy can be produced. Such a solution provides better measurement of the consumption of nonlinear loads and of non-sinusoidal voltage and current sources such as solar cells or fuel cells. The determination of actual power consumption in such energy meters is done using particular harmonic components of current and voltage. We measured the phase and ratio errors, which are the most important parameters of current transformers, to characterize several samples of current transformers of both types.

  1. Analysis of smear in high-resolution remote sensing satellites

    Science.gov (United States)

    Wahballah, Walid A.; Bazan, Taher M.; El-Tohamy, Fawzy; Fathy, Mahmoud

    2016-10-01

    High-resolution remote sensing satellites (HRRSS) that use time delay and integration (TDI) CCDs have the potential to introduce large amounts of image smear. Clocking and velocity mismatch smear are two of the key factors in inducing image smear. Clocking smear is caused by the discrete manner in which the charge is clocked in the TDI-CCDs. The relative motion between the HRRSS and the observed object requires that the image motion velocity be strictly synchronized with the velocity of the charge packet transfer (line rate) throughout the integration time. When imaging an object off-nadir, the image motion velocity changes, resulting in asynchronization between the image velocity and the CCD's line rate. A model for estimating the image motion velocity in HRRSS is derived. The influence of this velocity mismatch combined with clocking smear on the modulation transfer function (MTF) is investigated by using Matlab simulation. The analysis is performed for cross-track and along-track imaging with different satellite attitude angles and TDI steps. The results reveal that the velocity mismatch ratio and the number of TDI steps have a serious impact on the smear MTF; a velocity mismatch ratio of 2% degrades the MTFsmear by 32% at Nyquist frequency when the TDI steps change from 32 to 96. In addition, the results show that to achieve the requirement of MTFsmear >= 0.95, for TDI steps of 16 and 64, the allowable roll angles are 13.7° and 6.85° and the permissible pitch angles are no more than 9.6° and 4.8°, respectively.
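    A hedged sketch of a commonly used linear-smear model, MTF(f) = |sinc(d f)| with smear length d (in pixels) taken as the TDI stage count times the velocity mismatch ratio. It reproduces the qualitative trend described above (more TDI stages and larger mismatch lower the MTF at Nyquist) but is not the authors' exact model, so the numbers will differ from those quoted.

        import numpy as np

        def smear_mtf(mismatch_ratio, tdi_stages, f_cycles_per_pixel):
            d = tdi_stages * mismatch_ratio                   # smear length in pixels
            return np.abs(np.sinc(d * f_cycles_per_pixel))    # np.sinc(x) = sin(pi x)/(pi x)

        f_nyquist = 0.5
        for n_tdi in (16, 32, 64, 96):
            print(n_tdi, round(smear_mtf(0.02, n_tdi, f_nyquist), 3))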

  2. Precise and fast spatial-frequency analysis using the iterative local Fourier transform.

    Science.gov (United States)

    Lee, Sukmock; Choi, Heejoo; Kim, Dae Wook

    2016-09-19

    The use of the discrete Fourier transform has decreased since the introduction of the fast Fourier transform (fFT), which is a numerically efficient computing process. This paper presents the iterative local Fourier transform (ilFT), a set of new processing algorithms that iteratively apply the discrete Fourier transform within a local and optimal frequency domain. The new technique achieves 2^10 times higher frequency resolution than the fFT within a comparable computation time. The method's superb computing efficiency, high resolution, spectrum zoom-in capability, and overall performance are evaluated and compared to other advanced high-resolution Fourier transform techniques, such as the fFT combined with several fitting methods. The effectiveness of the ilFT is demonstrated through the data analysis of a set of Talbot self-images (1280 × 1024 pixels) obtained with an experimental setup using a grating in a diverging beam produced by a coherent point source.
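    A hedged sketch of the zoom-in principle behind a local Fourier analysis: evaluate the discrete Fourier sum on a coarse local frequency grid, then iteratively narrow the grid around the strongest bin. This is a generic illustration of local-spectrum refinement, not the published ilFT algorithm itself; the signal, grid sizes, and number of passes are arbitrary choices.

        import numpy as np

        def local_spectrum(x, fs, freqs):
            n = np.arange(x.size)
            basis = np.exp(-2j * np.pi * np.outer(freqs, n) / fs)   # DFT only at requested freqs
            return basis @ x

        fs = 1000.0
        t = np.arange(4096) / fs
        x = np.sin(2 * np.pi * 123.4567 * t)

        f_lo, f_hi = 0.0, fs / 2
        for _ in range(6):                                   # each pass zooms in around the peak
            grid = np.linspace(f_lo, f_hi, 64)
            k = int(np.argmax(np.abs(local_spectrum(x, fs, grid))))
            step = grid[1] - grid[0]
            f_lo, f_hi = grid[k] - step, grid[k] + step
        print("refined frequency estimate [Hz]:", 0.5 * (f_lo + f_hi))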

  3. Comparison of high-resolution Scheimpflug and high-frequency ultrasound biomicroscopy to anterior-segment OCT corneal thickness measurements

    Directory of Open Access Journals (Sweden)

    Kanellopoulos AJ

    2013-11-01

    Full Text Available Anastasios John Kanellopoulos,1,2 George Asimellis1 1Laservision.gr Eye Institute, Athens, Greece; 2New York University Medical School, New York, NY, USA Background: The purpose of this study was to compare and correlate central corneal thickness in healthy, nonoperated eyes with three advanced anterior-segment imaging systems: a high-resolution Scheimpflug tomography camera (Oculyzer II), a spectral-domain anterior-segment optical coherence tomography (AS-OCT) system, and a high-frequency ultrasound biomicroscopy (HF-UBM) system. Methods: Fifty eyes randomly selected from 50 patients were included in the study. Inclusion criteria were healthy, nonoperated eyes examined consecutively by the same examiner. Corneal imaging was performed by three different methods, ie, Oculyzer II, spectral-domain AS-OCT, and HF-UBM. Central corneal thickness measurements were compared using scatter diagrams, Bland-Altman plots (with bias and 95% confidence intervals), and two-paired analysis. Results: The coefficient of determination (r2) between the Oculyzer II and AS-OCT measurements was 0.895. Likewise, the coefficient was 0.893 between the Oculyzer II and HF-UBM and 0.830 between the AS-OCT and HF-UBM. The trend line coefficients of linearity were 0.925 between the Oculyzer II and the AS-OCT, 1.006 between the Oculyzer II and HF-UBM, and 0.841 between the AS-OCT and HF-UBM. The differences in average corneal thickness between the three pairs of CCT measurements were –6.86 µm between the Oculyzer II and HF-UBM, –12.20 µm between the AS-OCT and Oculyzer II, and +19.06 µm between the HF-UBM and AS-OCT. Conclusion: The three methods used for corneal thickness measurement are highly correlated. Compared with the Scheimpflug and ultrasound devices, the AS-OCT appears to report a more accurate, but overall thinner, corneal pachymetry. Keywords: anterior eye segment, high-frequency ultrasound biomicroscopy, optical coherence tomography, high-resolution Pentacam
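    A hedged sketch of the agreement statistics reported above (coefficient of determination, Bland-Altman bias and 95% limits of agreement) for two paired sets of central corneal thickness readings; the arrays are placeholders, not the study's measurements.

        import numpy as np

        rng = np.random.default_rng(3)
        cct_a = rng.normal(540.0, 30.0, 50)            # instrument A CCT [um] (placeholder)
        cct_b = cct_a + rng.normal(-7.0, 8.0, 50)      # instrument B CCT [um] (placeholder)

        r = np.corrcoef(cct_a, cct_b)[0, 1]
        diff = cct_a - cct_b
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)                  # 95% limits of agreement

        print(f"r^2 = {r**2:.3f}")
        print(f"bias = {bias:.2f} um, limits of agreement = [{bias - loa:.2f}, {bias + loa:.2f}] um")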

  4. High resolution electromagnetic methods and low frequency dispersion of rock conductivity

    OpenAIRE

    Svetov, B. S.; Ageev, V. V.

    1999-01-01

    The influence of frequency dispersion of conductivity (induced polarization) of rocks on the results of electromagnetic (EM) sounding was studied on the basis of calculation of electric field of vertical magnetic dipole above horizontally layered polarizable sections. Frequency dispersion was approximated by the Debye formula. Polarizable homogeneous halfspace, two, three and multilayered sections were analyzed in frequency and time domains. The calculations for different values of chargeabil...

  5. Super-resolution imaging based on the temperature-dependent electron-phonon collision frequency effect of metal thin films

    Science.gov (United States)

    Ding, Chenliang; Wei, Jingsong; Xiao, Mufei

    2018-05-01

    We herein propose a far-field super-resolution imaging method using metal thin films, based on the temperature-dependent electron-phonon collision frequency effect. In the proposed method, neither fluorescence labeling nor any special properties are required for the samples. The 100 nm lands and 200 nm grooves on Blu-ray disk substrates were clearly resolved and imaged through a laser scanning microscope at a wavelength of 405 nm. The spot size was approximately 0.80 μm, and an imaging resolution of 1/8 of the laser spot size was obtained experimentally. This work can be applied to the far-field super-resolution imaging of samples with neither fluorescence labeling nor any special properties.

  6. Sub-nanometer-resolution imaging of peptide nanotubes in water using frequency modulation atomic force microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Sugihara, Tomoki; Hayashi, Itsuho; Onishi, Hiroshi [Department of Chemistry, Graduate School of Science, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe 657-8501 (Japan); Kimura, Kenjiro, E-mail: kimura@gold.kobe-u.ac.jp [Department of Chemistry, Graduate School of Science, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe 657-8501 (Japan); Tamura, Atsuo [Department of Chemistry, Graduate School of Science, Kobe University, 1-1 Rokkodai-cho, Nada-ku, Kobe 657-8501 (Japan)

    2013-06-20

    Highlights: ► Peptide nanotubes were aligned on a highly oriented pyrolytic graphite surface. ► We visualized sub-nanometer-scale structure on the peptide nanotube surface in water. ► We observed the hydration structure at a peptide nanotube/water interface. - Abstract: Peptide nanotubes are self-assembled fibrous materials composed of cyclic polypeptides. Recently, various aspects of peptide nanotubes have been studied, in particular the utility of different methods for making peptide nanotubes with diverse designed functions. In order to investigate the relationship between formation, function and stability, it is essential to analyze the precise structure of peptide nanotubes. Atomic-scale surface imaging in liquids was recently achieved using frequency modulation atomic force microscopy with improved force sensing. Here we provide a precise surface structural analysis of peptide nanotubes, obtained without crystallizing them, by imaging the nanotubes at the sub-nanometer scale in water. In addition, the local hydration structure around the peptide nanotubes was observed at the nanotube/water interface.

  7. Analysis of the resolution processes of three modeling tasks

    Directory of Open Access Journals (Sweden)

    Cèsar Gallart Palau

    2017-08-01

    Full Text Available In this paper we present a comparative analysis of the resolution processes of three modeling tasks performed by secondary education students (13-14 years), designed from three different points of view: the Modelling-eliciting Activities, the LEMA project, and the Realistic Mathematical Problems. The purpose of this analysis is to obtain a methodological characterization of them in order to provide secondary education teachers with a proper selection and sequencing of tasks for their implementation in the classroom.

  8. Resolution about the second strategic analysis on energy policy

    International Nuclear Information System (INIS)

    2009-01-01

    The French National Assembly has adopted a resolution concerning the European Commission's second strategic analysis on energy policy. The resolution acknowledges the strategic orientations proposed by the European Commission; however, it emphasizes the necessity of taking into consideration the experience of the January 2009 gas crisis and therefore of ensuring a better diversification of the gas supply to Western Europe. It also considers that a stronger impulse (indeed with constraints) should be given to renewable energy development in order to be able to meet the 2020 target of a 20 percent increase in energy efficiency in the EU.

  9. A Bayesian Analysis of the Flood Frequency Hydrology Concept

    Science.gov (United States)

    2016-02-01

    A Bayesian Analysis of the Flood Frequency Hydrology ... flood frequency hydrology concept as a formal probabilistic-based means by which to coherently combine and also evaluate the worth of different types ... and development. INTRODUCTION: Merz and Blöschl (2008a,b) proposed the concept of flood frequency hydrology, which emphasizes the importance of

  10. Time-frequency analysis of human motion during rhythmic exercises.

    Science.gov (United States)

    Omkar, S N; Vyas, Khushi; Vikranth, H N

    2011-01-01

    Biomechanical signals due to human movements during exercise are represented in the time-frequency domain using the Wigner Distribution Function (WDF). Analysis based on the WDF reveals instantaneous spectral and power changes during a rhythmic exercise. Investigations were carried out on 11 healthy subjects who performed 5 cycles of sun salutation, with a body-mounted Inertial Measurement Unit (IMU) as a motion sensor. The variance of Instantaneous Frequency (I.F) and Instantaneous Power (I.P) for performance analysis of the subject is estimated using a one-way ANOVA model. Results reveal that joint time-frequency analysis of biomechanical signals during motion facilitates a better understanding of grace and consistency during rhythmic exercise.

  11. Frequency analysis of a tower-cable coupled system

    Energy Technology Data Exchange (ETDEWEB)

    Park, Moo Yeol [Young Sin Precision Engineering Ltd., Gyungju (Korea, Republic of); Kim, Seock Hyun; Park, In Su [Kangwon National University, Chuncheon (Korea, Republic of); Cui, Chengxun [Yanbian University, Yangji (China)

    2013-06-15

    This study considers the prediction of natural frequency to avoid resonance in a wind turbine tower-cable coupled system. An analytical model based on the Rayleigh-Ritz method is proposed to predict the resonance frequency of a wind turbine tower structure supported by four guy cables. To verify the validity of the analytical model, a small tower-cable model is manufactured and tested. The frequency and mode data of the tower model are obtained by modal testing and finite element analysis. The validity of the proposed method is verified through the comparison of the frequency analysis results. Finally, using a parametric study with the analytical model, we identified how the cable tension and cable angle affect the resonance frequency of the wind turbine tower structure. From the analysis results, the tension limit and optimal angle of the cable are identified.

  12. Assessing the copula selection for bivariate frequency analysis ...

    Indian Academy of Sciences (India)


    Copulas are applied to overcome the restriction of traditional bivariate frequency ... frequency analysis methods cannot describe the random variable properties that ... In order to overcome the limitation of multivariate distributions, a copula is a ..... The Mann-Kendall (M-K) test is a non-parametric statistical test which is used ...

  13. A frequency domain radar interferometric imaging (FII) technique based on high-resolution methods

    Science.gov (United States)

    Luce, H.; Yamamoto, M.; Fukao, S.; Helal, D.; Crochet, M.

    2001-01-01

    In the present work, we propose a frequency-domain interferometric imaging (FII) technique for a better knowledge of the vertical distribution of the atmospheric scatterers detected by MST radars. This is an extension of the dual frequency-domain interferometry (FDI) technique to multiple frequencies. Its objective is to reduce the ambiguity (resulting from the use of only two adjacent frequencies) inherent in the FDI technique. Different methods, commonly used in antenna array processing, are first described within the context of application to the FII technique. These methods are Fourier-based imaging, the Capon method, and the singular value decomposition method used with the MUSIC algorithm. Some preliminary simulations and tests performed on data collected with the middle and upper atmosphere (MU) radar (Shigaraki, Japan) are also presented. This work is a first step in the development of the FII technique, which seems to be very promising.

  14. Frequency domain analysis of piping systems under short duration loading

    International Nuclear Information System (INIS)

    Sachs, K.; Sand, H.; Lockau, J.

    1981-01-01

    In piping analysis two procedures are used almost exclusively: the modal superposition method for relatively long input time histories (e.g., earthquake) and direct integration of the equations of motion for short input time histories. A third possibility, frequency domain analysis, has only rarely been applied to piping systems to date. This paper suggests the use of frequency domain analysis for specific piping problems for which only direct integration could be used in the past. Direct integration and frequency domain analysis are compared, and it is shown that the frequency domain method is less costly if more than four or five load cases are considered. In addition, this method offers technical advantages, such as more accurate representation of modal damping and greater insight into the structural behavior of the system. (orig.)

  15. Frequency domain performance analysis of nonlinearly controlled motion systems

    NARCIS (Netherlands)

    Pavlov, A.V.; Wouw, van de N.; Pogromski, A.Y.; Heertjes, M.F.; Nijmeijer, H.

    2007-01-01

    At the heart of the performance analysis of linear motion control systems lie essential frequency domain characteristics such as sensitivity and complementary sensitivity functions. For a class of nonlinear motion control systems called convergent systems, generalized versions of these sensitivity

  16. Advance in ERG Analysis: From Peak Time and Amplitude to Frequency, Power, and Energy

    Directory of Open Access Journals (Sweden)

    Mathieu Gauvin

    2014-01-01

    Full Text Available Purpose. To compare time domain (TD: peak time and amplitude) analysis of the human photopic electroretinogram (ERG) with measures obtained in the frequency domain (Fourier analysis: FA) and in the time-frequency domain (continuous (CWT) and discrete (DWT) wavelet transforms). Methods. Normal ERGs (n = 40) were analyzed using traditional peak time and amplitude measurements of the a- and b-waves in the TD and descriptors extracted from FA, CWT, and DWT. Selected descriptors were also compared in their ability to monitor the long-term consequences of disease process. Results. Each method extracted relevant information but had distinct limitations (i.e., temporal and frequency resolutions). The DWT offered the best compromise by allowing us to extract more relevant descriptors of the ERG signal at the cost of lesser temporal and frequency resolutions. Follow-ups of disease progression were more prolonged with the DWT (max 29 years compared to 13 with TD). Conclusions. Standardized time domain analysis of retinal function should be complemented with advanced DWT descriptors of the ERG. This method should allow more sensitive/specific quantifications of ERG responses, facilitate follow-up of disease progression, and identify diagnostically significant changes of ERG waveforms that are not resolved when the analysis is only limited to time domain measurements.
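    A hedged sketch of how DWT descriptors of the kind discussed above can be extracted with PyWavelets: a toy ERG-like waveform is decomposed and the energy of each coefficient band is used as a descriptor. The waveform, the 'db4' wavelet, and the decomposition depth are illustrative assumptions, not the study's protocol.

        import numpy as np
        import pywt

        fs = 1000.0                                   # assumed sampling rate [Hz]
        t = np.arange(0, 0.25, 1 / fs)                # 250 ms sweep
        erg = -np.exp(-((t - 0.015) / 0.005) ** 2) + 1.5 * np.exp(-((t - 0.030) / 0.010) ** 2)

        coeffs = pywt.wavedec(erg, "db4", level=5)    # [cA5, cD5, cD4, cD3, cD2, cD1]
        for name, c in zip(["A5", "D5", "D4", "D3", "D2", "D1"], coeffs):
            print(name, round(float(np.sum(c ** 2)), 4))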

  17. Time-frequency analysis and harmonic Gaussian functions

    International Nuclear Information System (INIS)

    Ranaivoson, R.T.R; Raoelina Andriambololona; Hanitriarivo, R.

    2013-01-01

    A method for time-frequency analysis is given. The approach utilizes properties of the Gaussian distribution, properties of Hermite polynomials and Fourier analysis. We begin with the definition of a set of functions called Harmonic Gaussian Functions. These functions are then used to define a set of transformations, denoted Τn, which associate to a function ψ of the time variable t a set of functions Ψn which depend on time, frequency and the frequency (or time) standard deviation. Some properties of the transformations Τn and the functions Ψn are given. It is proved in particular that the square of the modulus of each function Ψn can be interpreted as a representation of the energy distribution of the signal, represented by the function ψ, in the time-frequency plane for a given value of the frequency (or time) standard deviation. It is also shown that the function ψ can be recovered from the functions Ψn.

  18. ANALYSIS OF FREQUENCY OF PHENYLKETONURIA AMONG INSTITUTIONALIZED

    Directory of Open Access Journals (Sweden)

    S VALIAN

    2003-09-01

    Full Text Available Introduction: Phenylketonuria (PKU) is a genetic disease caused by a deficiency in the phenylalanine hydroxylase (PAH) enzyme. Untreated patients develop severe mental retardation, which is irreversible. In this study, the incidence of PKU among isolated mentally retarded residents of institutions in Isfahan was investigated. Methods: A total number of 1541 patients were involved in the study. Of the patients studied, 611 with no known reason for their mental retardation were chosen for blood sampling. Blood samples were collected on filter papers and examined by the Guthrie bacterial inhibition assay (GBIA), which is specific for PKU. In patients with a positive test, the serum phenylalanine was quantitatively analyzed using high pressure liquid chromatography (HPLC). Results: Among the patients examined, 33 were found positive. Quantitative analysis of phenylalanine allowed classification of the patients, indicating 60% with the classical, 36% with the moderate, and 3% with the mild type of PKU. Furthermore, it was found that in 68% of the cases the parents are third-grade relatives. Discussion: The results obtained in this screening study indicated that 2.1% of the patients in the institutions for the mentally retarded in Isfahan suffered from PKU. The incidence of the disease is relatively high compared to reports from other countries. Since a large number of patients (68%) are the result of consanguineous marriages, this kind of marriage could be considered one of the important factors involved in the prevalence of PKU in Isfahan.

  19. Diagnosis of industrial gearboxes condition by vibration and time-frequency, scale-frequency, frequency-frequency analysis

    Directory of Open Access Journals (Sweden)

    P. Czech

    2012-10-01

    Full Text Available In the article, methods of vibroacoustic diagnostics of high-power toothed gears are described. It is shown that a properly registered and processed acoustic or vibration signal may serve as an explicitly interpretable source of diagnostic symptoms. The presented analyses were based on vibration signals registered during the operation of the gear of a rolling stand working in the Katowice Steel Plant (presently one of the branches of Mittal Steel Poland JSC).

  20. Single-cell resolution of intracellular T cell Ca2+ dynamics in response to frequency-based H2O2 stimulation.

    Science.gov (United States)

    Kniss-James, Ariel S; Rivet, Catherine A; Chingozha, Loice; Lu, Hang; Kemp, Melissa L

    2017-03-01

    Adaptive immune cells, such as T cells, integrate information from their extracellular environment through complex signaling networks with exquisite sensitivity in order to direct decisions on proliferation, apoptosis, and cytokine production. These signaling networks are reliant on the interplay between finely tuned secondary messengers, such as Ca2+ and H2O2. Frequency response analysis, originally developed in control engineering, is a tool used for discerning complex networks. This analytical technique has been shown to be useful for understanding biological systems and facilitates identification of the dominant behaviour of the system. We probed intracellular Ca2+ dynamics in the frequency domain to investigate the complex relationship between two second messenger signaling molecules, H2O2 and Ca2+, during T cell activation with single cell resolution. Single-cell analysis provides a unique platform for interrogating and monitoring cellular processes of interest. We utilized a previously developed microfluidic device to monitor individual T cells through time while applying a dynamic input to reveal a natural frequency of the system at approximately 2.78 mHz stimulation. Although our network was much larger with more unknown connections than previous applications, we are able to derive features from our data, observe forced oscillations associated with specific amplitudes and frequencies of stimuli, and arrive at conclusions about potential transfer function fits as well as the underlying population dynamics.

  1. Broadband high-resolution two-photon spectroscopy with laser frequency combs

    OpenAIRE

    Hipke, Arthur; Meek, Samuel A.; Ideguchi, Takuro; Hänsch, Theodor W.; Picqué, Nathalie

    2013-01-01

    Two-photon excitation spectroscopy with broad spectral span is demonstrated at Doppler-limited resolution. We describe the first Fourier transform two-photon spectroscopy of an atomic sample, performed with two mode-locked laser oscillators in a dual-comb technique. Each transition is uniquely identified by the modulation imparted by the interfering comb excitations. The temporal modulation of the spontaneous two-photon fluorescence is monitored with a single photodetector, and the spectrum is revealed by a...

  2. Dual-Frequency, Dual-Polarization Microstrip Antenna Development for High-Resolution, Airborne SAR

    DEFF Research Database (Denmark)

    Granholm, Johan; Skou, N.

    2000-01-01

    synthetic aperture radar (SAR) system. The dual-frequency array concept adopted relies on the use of probe-fed perforated, stacked patches for L-band (1.2-1.3 GHz). Inside these perforations probe-fed, wideband stacked microstrip patches for C-band (4.9-5.7 GHz) are placed. Measured impedance and radiation...

  3. Frequency and Angular Resolution for Measuring, Presenting and Predicting Loudspeaker Polar Data

    DEFF Research Database (Denmark)

    Staffeldt, Henrik; Seidel, Felicity

    1996-01-01

    and approved by the AES Standards Committee. The work is a continuation of AES project AES-X07 and is presented to provide background for Call for Comment on Draft AES-5id-xxxx. AES information document for Room acoustics and sound reinforcement systems - Loudspeaker modeling and measurement - Frequency...

  4. Fundamentals of convex analysis duality, separation, representation, and resolution

    CERN Document Server

    Panik, Michael J

    1993-01-01

    Fundamentals of Convex Analysis offers an in-depth look at some of the fundamental themes covered within an area of mathematical analysis called convex analysis. In particular, it explores the topics of duality, separation, representation, and resolution. The work is intended for students of economics, management science, engineering, and mathematics who need exposure to the mathematical foundations of matrix games, optimization, and general equilibrium analysis. It is written at the advanced undergraduate to beginning graduate level and the only formal preparation required is some familiarity with set operations and with linear algebra and matrix theory. Fundamentals of Convex Analysis is self-contained in that a brief review of the essentials of these tool areas is provided in Chapter 1. Chapter exercises are also provided. Topics covered include: convex sets and their properties; separation and support theorems; theorems of the alternative; convex cones; dual homogeneous systems; basic solutions and comple...

  5. Proposing New Methods to Enhance the Low-Resolution Simulated GPR Responses in the Frequency and Wavelet Domains

    Directory of Open Access Journals (Sweden)

    Reza Ahmadi

    2014-12-01

    Full Text Available To date, a number of numerical methods, including the popular Finite-Difference Time Domain (FDTD technique, have been proposed to simulate Ground-Penetrating Radar (GPR responses. Despite having a number of advantages, the finite-difference method also has pitfalls such as being very time consuming in simulating the most common case of media with high dielectric permittivity, causing the forward modelling process to be very long lasting, even with modern high-speed computers. In the present study the well-known hyperbolic pattern response of horizontal cylinders, usually found in GPR B-Scan images, is used as a basic model to examine the possibility of reducing the forward modelling execution time. In general, the simulated GPR traces of common reflected objects are time shifted, as with the Normal Moveout (NMO traces encountered in seismic reflection responses. This suggests the application of Fourier transform to the GPR traces, employing the time-shifting property of the transformation to interpolate the traces between the adjusted traces in the frequency domain (FD. Therefore, in the present study two post-processing algorithms have been adopted to increase the speed of forward modelling while maintaining the required precision. The first approach is based on linear interpolation in the Fourier domain, resulting in increasing lateral trace-to-trace interval of appropriate sampling frequency of the signal, preventing any aliasing. In the second approach, a super-resolution algorithm based on 2D-wavelet transform is developed to increase both vertical and horizontal resolution of the GPR B-Scan images through preserving scale and shape of hidden hyperbola features. Through comparing outputs from both methods with the corresponding actual high-resolution forward response, it is shown that both approaches can perform satisfactorily, although the wavelet-based approach outperforms the frequency-domain approach noticeably, both in amplitude and

  6. Super-resolution reconstruction in frequency, image, and wavelet domains to reduce through-plane partial voluming in MRI

    Energy Technology Data Exchange (ETDEWEB)

    Gholipour, Ali, E-mail: ali.gholipour@childrens.harvard.edu; Afacan, Onur; Scherrer, Benoit; Prabhu, Sanjay P.; Warfield, Simon K. [Department of Radiology, Boston Children’s Hospital, Boston, Massachusetts 02115 and Harvard Medical School, Boston, Massachusetts 02115 (United States); Aganj, Iman [Radiology Department, Martinos Center for Biomedical Imaging, Massachusetts General Hospital, Boston, Massachusetts 02129 and Harvard Medical School, Boston, Massachusetts 02115 (United States); Sahin, Mustafa [Department of Neurology, Boston Children’s Hospital, Boston, Massachusetts 02115 and Harvard Medical School, Boston, Massachusetts 02115 (United States)

    2015-12-15

    Purpose: To compare and evaluate the use of super-resolution reconstruction (SRR), in frequency, image, and wavelet domains, to reduce through-plane partial voluming effects in magnetic resonance imaging. Methods: The reconstruction of an isotropic high-resolution image from multiple thick-slice scans has been investigated through techniques in frequency, image, and wavelet domains. Experiments were carried out with a thick-slice T2-weighted fast spin echo sequence on the American College of Radiology MRI phantom, where the reconstructed images were compared to a reference high-resolution scan using peak signal-to-noise ratio (PSNR), structural similarity image metric (SSIM), mutual information (MI), and the mean absolute error (MAE) of image intensity profiles. The application of super-resolution reconstruction was then examined in retrospective processing of clinical neuroimages of ten pediatric patients with tuberous sclerosis complex (TSC) to reduce through-plane partial voluming for improved 3D delineation and visualization of thin radial bands of white matter abnormalities. Results: Quantitative evaluation results show improvements in all evaluation metrics through super-resolution reconstruction in the frequency, image, and wavelet domains, with the highest values obtained from SRR in the image domain. The metric values for image-domain SRR versus the original axial, coronal, and sagittal images were PSNR = 32.26 vs 32.22, 32.16, 30.65; SSIM = 0.931 vs 0.922, 0.924, 0.918; MI = 0.871 vs 0.842, 0.844, 0.831; and MAE = 5.38 vs 7.34, 7.06, 6.19. All similarity metrics showed high correlations with expert ranking of image resolution with MI showing the highest correlation at 0.943. Qualitative assessment of the neuroimages of ten TSC patients through in-plane and out-of-plane visualization of structures showed the extent of partial voluming effect in a real clinical scenario and its reduction using SRR. Blinded expert evaluation of image resolution in
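    A hedged sketch of the evaluation metrics named above (PSNR, SSIM, MAE, and a histogram-based mutual information estimate) computed with scikit-image and NumPy on synthetic placeholder images; it illustrates the comparison protocol, not the study's data or its exact MI estimator.

        import numpy as np
        from skimage.metrics import peak_signal_noise_ratio, structural_similarity

        rng = np.random.default_rng(4)
        reference = rng.random((128, 128))
        recon = np.clip(reference + 0.05 * rng.standard_normal((128, 128)), 0.0, 1.0)

        psnr = peak_signal_noise_ratio(reference, recon, data_range=1.0)
        ssim = structural_similarity(reference, recon, data_range=1.0)
        mae = float(np.mean(np.abs(reference - recon)))

        hist, _, _ = np.histogram2d(reference.ravel(), recon.ravel(), bins=64)
        pxy = hist / hist.sum()
        px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0
        mi = float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

        print(f"PSNR={psnr:.2f} dB, SSIM={ssim:.3f}, MAE={mae:.4f}, MI={mi:.3f}")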

  7. Super-resolution reconstruction in frequency, image, and wavelet domains to reduce through-plane partial voluming in MRI

    International Nuclear Information System (INIS)

    Gholipour, Ali; Afacan, Onur; Scherrer, Benoit; Prabhu, Sanjay P.; Warfield, Simon K.; Aganj, Iman; Sahin, Mustafa

    2015-01-01

    Purpose: To compare and evaluate the use of super-resolution reconstruction (SRR), in frequency, image, and wavelet domains, to reduce through-plane partial voluming effects in magnetic resonance imaging. Methods: The reconstruction of an isotropic high-resolution image from multiple thick-slice scans has been investigated through techniques in frequency, image, and wavelet domains. Experiments were carried out with thick-slice T2-weighted fast spin echo sequence on the Academic College of Radiology MRI phantom, where the reconstructed images were compared to a reference high-resolution scan using peak signal-to-noise ratio (PSNR), structural similarity image metric (SSIM), mutual information (MI), and the mean absolute error (MAE) of image intensity profiles. The application of super-resolution reconstruction was then examined in retrospective processing of clinical neuroimages of ten pediatric patients with tuberous sclerosis complex (TSC) to reduce through-plane partial voluming for improved 3D delineation and visualization of thin radial bands of white matter abnormalities. Results: Quantitative evaluation results show improvements in all evaluation metrics through super-resolution reconstruction in the frequency, image, and wavelet domains, with the highest values obtained from SRR in the image domain. The metric values for image-domain SRR versus the original axial, coronal, and sagittal images were PSNR = 32.26 vs 32.22, 32.16, 30.65; SSIM = 0.931 vs 0.922, 0.924, 0.918; MI = 0.871 vs 0.842, 0.844, 0.831; and MAE = 5.38 vs 7.34, 7.06, 6.19. All similarity metrics showed high correlations with expert ranking of image resolution with MI showing the highest correlation at 0.943. Qualitative assessment of the neuroimages of ten TSC patients through in-plane and out-of-plane visualization of structures showed the extent of partial voluming effect in a real clinical scenario and its reduction using SRR. Blinded expert evaluation of image resolution in

  8. Statistical Analysis of Solar PV Power Frequency Spectrum for Optimal Employment of Building Loads

    Energy Technology Data Exchange (ETDEWEB)

    Olama, Mohammed M [ORNL; Sharma, Isha [ORNL; Kuruganti, Teja [ORNL; Fugate, David L [ORNL

    2017-01-01

    In this paper, a statistical analysis of the frequency spectrum of solar photovoltaic (PV) power output is conducted. This analysis quantifies the frequency content that can be used for purposes such as developing optimal employment of building loads and distributed energy resources. One year of solar PV power output data was collected and analyzed using one-second resolution to find ideal bounds and levels for the different frequency components. The annual, seasonal, and monthly statistics of the PV frequency content are computed and illustrated in boxplot format. To examine the compatibility of building loads for PV consumption, a spectral analysis of building loads such as Heating, Ventilation and Air-Conditioning (HVAC) units and water heaters was performed. This defined the bandwidth over which these devices can operate. Results show that nearly all of the PV output (about 98%) is contained within frequencies lower than 1 mHz (equivalent to ~15 min), which is compatible for consumption with local building loads such as HVAC units and water heaters. Medium frequencies in the range of ~15 min to ~1 min are likely to be suitable for consumption by fan equipment of variable air volume HVAC systems that have time constants in the range of few seconds to few minutes. This study indicates that most of the PV generation can be consumed by building loads with the help of proper control strategies, thereby reducing impact on the grid and the size of storage systems.
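    A hedged sketch of the cumulative-spectrum statistic reported above — the fraction of variability in a 1 Hz-sampled PV power series that lies below 1 mHz — computed with a plain FFT on a synthetic one-day series; the toy diurnal profile and noise level are placeholders for the measured data.

        import numpy as np

        fs = 1.0                                       # one-second resolution [Hz]
        n = 24 * 3600                                  # one day of samples
        t = np.arange(n)
        rng = np.random.default_rng(5)
        p = np.clip(np.sin(np.pi * t / n), 0.0, None) + 0.02 * rng.standard_normal(n)

        spec = np.abs(np.fft.rfft(p - p.mean())) ** 2  # one-sided power spectrum
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)

        frac = spec[freqs <= 1e-3].sum() / spec.sum()
        print(f"fraction of spectral power below 1 mHz: {frac:.3f}")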

  9. Effects of cations and cholesterol with sphingomyelin membranes investigated by high-resolution broadband sum frequency vibrational spectroscopy

    Science.gov (United States)

    Zhang, Zhen; Feng, Rong-juan; Li, Yi-yi; Liu, Ming-hua; Guo, Yuan

    2017-08-01

    Sphingomyelin (SM) is specifically enriched in the plasma membrane of mammalian cells. Its molecular structure is composed of N-acyl-D-erythro-sphingosylphosphorylcholine. The functions of SM related to membrane signaling and protein trafficking rely on the interactions between SM, cations, cholesterol and proteins. In this report, the interactions of three different natural SMs with cations and cholesterol at air/aqueous interfaces are studied by high-resolution broadband sum frequency vibrational spectroscopy. Our results shed light on the relationship between SM monolayers, cholesterol and cations.

  10. Atomic resolution ultrafast scanning tunneling microscope with scan rate breaking the resonant frequency of a quartz tuning fork resonator.

    Science.gov (United States)

    Li, Quanfeng; Lu, Qingyou

    2011-05-01

    We present an ultra-fast scanning tunneling microscope with atomic resolution at a 26 kHz scan rate, which surpasses the resonant frequency of the quartz tuning fork resonator used as the fast scan actuator. The main improvements employed in achieving this new record are (1) a fully low-voltage design and (2) independent scan control and data acquisition, where the tuning fork (carrying a tip) is blindly driven to scan by a function generator with the scan voltage and tunneling current (I(T)) being measured as image data (this is unlike the traditional point-by-point move and measure method where data acquisition and scan control are switched many times).

  11. SRXRF analysis with spatial resolution of dental calculus

    International Nuclear Information System (INIS)

    Sanchez, Hector Jorge; Perez, Carlos Alberto; Grenon, Miriam

    2000-01-01

    This work presents elemental-composition studies of dental calculus by X-ray fluorescence analysis using synchrotron radiation. The intrinsic characteristics of synchrotron light allow for a semi-quantitative analysis with spatial resolution. The experiments were carried out in the high-vacuum station of the XRF beamline at the Synchrotron Light National Laboratory (Campinas, Brazil). All the measurements were performed in conventional geometry (45 deg. + 45 deg.) and the micro-collimation was attained via a pair of orthogonal slits mounted in the beamline. In this way, pixels of 50 μmx50 μm were obtained keeping a high flux of photons on the sample. Samples of human dental calculus were measured in different positions along their growing axis, in order to determine variations of the compositions in the pattern of deposit. Intensity ratios of minor elements and traces were obtained, and linear profiles and surface distributions were determined. As a general summary, we can conclude that μXRF experiments with spatial resolution on dental calculus are feasible with simple collimation and adequate positioning systems, keeping a high flux of photon. These results open interesting perspectives for the future station of the line, devoted to μXRF, which will reach resolutions of the order of 10 μm

  12. SRXRF analysis with spatial resolution of dental calculus

    Science.gov (United States)

    Sánchez, Héctor Jorge; Pérez, Carlos Alberto; Grenón, Miriam

    2000-09-01

    This work presents elemental-composition studies of dental calculus by X-ray fluorescence analysis using synchrotron radiation. The intrinsic characteristics of synchrotron light allow for a semi-quantitative analysis with spatial resolution. The experiments were carried out in the high-vacuum station of the XRF beamline at the Synchrotron Light National Laboratory (Campinas, Brazil). All the measurements were performed in conventional geometry (45°+45°) and the micro-collimation was attained via a pair of orthogonal slits mounted in the beamline. In this way, pixels of 50 μm×50 μm were obtained keeping a high flux of photons on the sample. Samples of human dental calculus were measured in different positions along their growing axis, in order to determine variations of the compositions in the pattern of deposit. Intensity ratios of minor elements and traces were obtained, and linear profiles and surface distributions were determined. As a general summary, we can conclude that μXRF experiments with spatial resolution on dental calculus are feasible with simple collimation and adequate positioning systems, keeping a high flux of photon. These results open interesting perspectives for the future station of the line, devoted to μXRF, which will reach resolutions of the order of 10 μm.

  13. Bivariable analysis of ventricular late potentials in high resolution ECG records

    International Nuclear Information System (INIS)

    Orosco, L; Laciar, E

    2007-01-01

    In this study, a bivariable analysis for the detection of ventricular late potentials in high-resolution electrocardiographic records is proposed. The standard time-domain analysis and the application of the time-frequency technique to high-resolution ECG records are briefly described, as well as their corresponding results. In the proposed technique, the time-domain parameter QRSD and the most significant time-frequency index, EN QRS, are used as variables. A bivariable index that combines the previous parameters is defined. The proposed technique allows evaluating the risk of ventricular tachycardia in post-myocardial infarction patients. The results show that the bivariable index allows discriminating between the population of patients with ventricular tachycardia and the subjects of the control group. Also, it was found that the bivariable technique performs well as a diagnostic test. It is concluded that, comparatively, the value of the bivariable technique as a diagnostic test is superior to that of the time-domain method and the time-frequency technique evaluated individually.

  14. High-resolution acoustic imaging at low frequencies using 3D-printed metamaterials

    Directory of Open Access Journals (Sweden)

    S. Laureti

    2016-12-01

    Full Text Available An acoustic metamaterial has been constructed using 3D printing. It contained an array of air-filled channels, whose size and shape could be varied within the design and manufacture process. In this paper we analyze both numerically and experimentally the properties of this polymer metamaterial structure, and demonstrate its use for the imaging of a sample with sub-wavelength dimensions in the audible frequency range.

  15. A time-frequency analysis of wave packet fractional revivals

    International Nuclear Information System (INIS)

    Ghosh, Suranjana; Banerji, J

    2007-01-01

    We show that the time-frequency analysis of the autocorrelation function is, in many ways, a more appropriate tool to resolve fractional revivals of a wave packet than the usual time-domain analysis. This advantage is crucial in reconstructing the initial state of the wave packet when its coherent structure is short-lived and decays before it is fully revived. Our calculations are based on the model example of fractional revivals in a Rydberg wave packet of circular states. We end by providing an analytical investigation which fully agrees with our numerical observations on the utility of time-frequency analysis in the study of wave packet fractional revivals
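
    The core idea, resolving which frequency component of the autocorrelation signal is present at which time, can be illustrated with a short-time Fourier analysis of a toy signal; the sketch below uses a synthetic stand-in for |A(t)|, not the Rydberg wave-packet model of the paper.

import numpy as np
from scipy.signal import spectrogram

# Toy stand-in for |A(t)|: two beat components that appear and decay at
# different times, loosely mimicking short-lived fractional revivals.
fs = 200.0                       # samples per unit time
t = np.arange(0, 60, 1 / fs)
a = (np.cos(2 * np.pi * 1.0 * t) * np.exp(-((t - 15) / 8) ** 2)
     + np.cos(2 * np.pi * 2.0 * t) * np.exp(-((t - 40) / 8) ** 2))

# Short-time Fourier analysis resolves which component is present when,
# which a plain time-domain plot of |A(t)| does not show directly.
f, tau, Sxx = spectrogram(a, fs=fs, nperseg=512, noverlap=448)
dominant = f[Sxx.argmax(axis=0)]     # ridge of the time-frequency distribution
print(dominant[:5], dominant[-5:])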

  16. High Order Differential Frequency Hopping: Design and Analysis

    Directory of Open Access Journals (Sweden)

    Yong Li

    2015-01-01

    Full Text Available This paper considers spectrally efficient differential frequency hopping (DFH) system design. Relying on time-frequency diversity over a large spectrum and high-speed frequency hopping, DFH systems are robust against hostile jamming interference. However, the spectral efficiency of conventional DFH systems is very low because only the frequency of each hop carries information. To improve the system capacity, in this paper, we propose an innovative high order differential frequency hopping (HODFH) scheme. Unlike in traditional DFH, where the message is carried by the frequency relationship between adjacent hops using first-order differential coding, in HODFH the message is carried by the frequency and phase relationship using second- or higher-order differential coding. As a result, system efficiency is increased significantly, since the additional information transmission is achieved by the higher-order differential coding at no extra cost in either bandwidth or power. Quantitative performance analysis of the proposed scheme demonstrates that transmission through the frequency and phase relationship using second- or higher-order differential coding essentially introduces another dimension to the signal space, and the corresponding coding gain can increase the system efficiency.
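
    A minimal sketch of the coding idea follows, assuming a made-up hop set and bit mapping (3 bits on the frequency transition and 2 bits on the phase transition per hop); it only illustrates how a second-order differential code carries extra bits per hop, and is not the authors' scheme.

import numpy as np

N_CHANNELS = 8                      # hypothetical number of hop channels

def encode(symbols, f0=0, phi0=0.0):
    """Illustrative second-order differential mapping: each 5-bit symbol
    selects a frequency step (low 3 bits) and a phase step (next 2 bits)
    relative to the previous hop."""
    hops = []
    f, phi = f0, phi0
    for s in symbols:
        df = s & 0b111                           # frequency transition
        dphi = ((s >> 3) & 0b11) * np.pi / 2     # phase transition
        f = (f + df) % N_CHANNELS
        phi = (phi + dphi) % (2 * np.pi)
        hops.append((f, phi))
    return hops

# Three hypothetical 5-bit data symbols -> three (channel, phase) hops.
print(encode([0b01010, 0b11001, 0b00111]))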

  17. Optical Frequency Comb Fourier Transform Spectroscopy with Resolution Exceeding the Limit Set by the Optical Path Difference

    Science.gov (United States)

    Foltynowicz, Aleksandra; Rutkowski, Lucile; Johansson, Alexandra C.; Khodabakhsh, Amir; Maslowski, Piotr; Kowzan, Grzegorz; Lee, Kevin; Fermann, Martin

    2015-06-01

    Fourier transform spectrometers (FTS) based on optical frequency combs (OFC) allow detection of broadband molecular spectra with high signal-to-noise ratios within acquisition times orders of magnitude shorter than traditional FTIRs based on thermal sources. Due to the pulsed nature of OFCs the interferogram consists of a series of bursts rather than a single burst at zero optical path difference (OPD). The comb mode structure can be resolved by acquiring multiple bursts, in both mechanical FTS systems and dual-comb spectroscopy. However, in all existing demonstrations the resolution was ultimately limited either by the maximum available OPD between the interferometer arms or by the total acquisition time enabled by the storage memory. We present a method that provides spectral resolution exceeding the limit set by the maximum OPD using an interferogram containing only a single burst. The method allows measurements of absorption lines narrower than the OPD-limited resolution without any influence of the instrumental lineshape function. We demonstrate this by measuring undistorted CO2 and CO absorption lines with linewidth narrower than the OPD-limited resolution using OFC-based mechanical FTS in the near- and mid-infrared wavelength ranges. The near-infrared system is based on an Er:fiber femtosecond laser locked to a high finesse cavity, while the mid-infrared system is based on a Tm:fiber-laser-pumped optical parametric oscillator coupled to a multi-pass cell. We show that the method allows acquisition of high-resolution molecular spectra with interferometer length orders of magnitude shorter than traditional FTIR. Mandon, J., G. Guelachvili, and N. Picque, Nat. Phot., 2009. 3(2): p. 99-102. Zeitouny, M., et al., Ann. Phys., 2013. 525(6): p. 437-442. Zolot, A.M., et al., Opt. Lett., 2012. 37(4): p. 638-640.

  18. Gear fault diagnosis based on the structured sparsity time-frequency analysis

    Science.gov (United States)

    Sun, Ruobin; Yang, Zhibo; Chen, Xuefeng; Tian, Shaohua; Xie, Yong

    2018-03-01

    Over the last decade, sparse representation has become a powerful paradigm in mechanical fault diagnosis due to its excellent capability and the high flexibility for complex signal description. The structured sparsity time-frequency analysis (SSTFA) is a novel signal processing method, which utilizes mixed-norm priors on time-frequency coefficients to obtain a fine match for the structure of signals. In order to extract the transient feature from gear vibration signals, a gear fault diagnosis method based on SSTFA is proposed in this work. The steady modulation components and impulsive components of the defective gear vibration signals can be extracted simultaneously by choosing different time-frequency neighborhood and generalized thresholding operators. Besides, the time-frequency distribution with high resolution is obtained by piling different components in the same diagram. The diagnostic conclusion can be made according to the envelope spectrum of the impulsive components or by the periodicity of impulses. The effectiveness of the method is verified by numerical simulations, and the vibration signals registered from a gearbox fault simulator and a wind turbine. To validate the efficiency of the presented methodology, comparisons are made among some state-of-the-art vibration separation methods and the traditional time-frequency analysis methods. The comparisons show that the proposed method possesses advantages in separating feature signals under strong noise and accounting for the inner time-frequency structure of the gear vibration signals.
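
    The diagnostic step mentioned at the end, reading the fault periodicity from the envelope spectrum of the impulsive component, can be sketched as follows; the resonance frequency, fault rate and noise level in this example are hypothetical, and the SSTFA separation itself is not reproduced.

import numpy as np
from scipy.signal import hilbert

def envelope_spectrum(x, fs):
    """Amplitude envelope spectrum: demodulate with the Hilbert transform,
    then take the FFT of the mean-removed envelope."""
    env = np.abs(hilbert(x))
    env = env - env.mean()
    spec = np.abs(np.fft.rfft(env)) / len(env)
    freqs = np.fft.rfftfreq(len(env), d=1.0 / fs)
    return freqs, spec

# Synthetic gear-fault-like signal: a 2 kHz resonance rung by impacts
# repeating at roughly 30 Hz (a hypothetical fault rate), buried in noise.
fs = 20000
t = np.arange(0, 2.0, 1.0 / fs)
impulses = np.zeros_like(t)
impulses[::fs // 30] = 1.0
tau = t[:200]
h = np.exp(-400 * tau) * np.sin(2 * np.pi * 2000 * tau)   # ring-down response
x = np.convolve(impulses, h)[:len(t)] + 0.05 * np.random.randn(len(t))

f, s = envelope_spectrum(x, fs)
peak = f[5:][np.argmax(s[5:])]        # skip the lowest bins near DC
print("strongest envelope line near", peak, "Hz")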

  19. SPREAD: a high-resolution daily gridded precipitation dataset for Spain – an extreme events frequency and intensity overview

    Directory of Open Access Journals (Sweden)

    R. Serrano-Notivoli

    2017-09-01

    Full Text Available A high-resolution daily gridded precipitation dataset was built from raw data of 12 858 observatories covering the period from 1950 to 2012 in peninsular Spain and from 1971 to 2012 in the Balearic and Canary Islands. The original data were quality-controlled and gaps were filled on each day and location independently. Using the serially complete dataset, a grid with a 5 × 5 km spatial resolution was constructed by estimating daily precipitation amounts and their corresponding uncertainty at each grid node. Daily precipitation estimations were compared to original observations to assess the quality of the gridded dataset. Four daily precipitation indices were computed to characterise the spatial distribution of daily precipitation and nine extreme precipitation indices were used to describe the frequency and intensity of extreme precipitation events. The Mediterranean coast and the Central Range showed the highest frequency and intensity of extreme events, while the number of wet days and dry and wet spells followed a north-west to south-east gradient in peninsular Spain, from high to low values in the number of wet days and wet spells, and the reverse in dry spells. The use of the total available data in Spain, the independent estimation of precipitation for each day and the high spatial resolution of the grid allowed for a precise spatial and temporal assessment of daily precipitation that is difficult to achieve when using other methods, pre-selected long-term stations or global gridded datasets. The SPREAD dataset is publicly available at https://doi.org/10.20350/digitalCSIC/7393.

  20. Noise Measurement and Frequency Analysis of Commercially Available Noisy Toys

    Directory of Open Access Journals (Sweden)

    Shohreh Jalaie

    2005-06-01

    Full Text Available Objective: Noise measurement and frequency analysis of commercially available noisy toys were the main purposes of the study. Materials and Methods: 181 noisy toys commonly found in toy stores in different zones of Tehran were selected and categorized into 10 groups. Noise measurements were made at 2, 25, and 50 cm from the toys in dBA. The noisiest toy of each group was frequency analyzed in octave bands. Results: The highest and the lowest intensity levels belonged to the gun (mean=112 dBA, range 100-127 dBA) and to the rattle-box (mean=84 dBA, range 74-95 dBA), respectively. Noise intensity levels decreased significantly with increasing distance, except for two toys. Frequency analysis indicated energy within the effective hearing frequencies, with most of the toys' energy in the middle- and high-frequency region. Conclusion: As the intensity level of the toys is considerable, mostly more than 90 dBA, and their energy lies in the middle- and high-frequency region, toys should be considered a cause of hearing impairment.

  1. An operational modal analysis method in frequency and spatial domain

    Science.gov (United States)

    Wang, Tong; Zhang, Lingmi; Tamura, Yukio

    2005-12-01

    A frequency and spatial domain decomposition method (FSDD) for operational modal analysis (OMA) is presented in this paper, which is an extension of the complex mode indicator function (CMIF) method for experimental modal analysis (EMA). The theoretical background of the FSDD method is clarified. Singular value decomposition is adopted to separate the signal space from the noise space. Finally, an enhanced power spectrum density (PSD) is proposed to obtain more accurate modal parameters by curve fitting in the frequency domain. Moreover, a simulation case and an application case are used to validate this method.
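
    A minimal sketch of the CMIF-style front end of such a method is given below: build the output cross-spectral density matrix with Welch averaging and take its singular values at every frequency line, so that peaks of the first singular value indicate candidate modes. The three-channel ambient data are synthetic, and the enhanced-PSD curve fitting of FSDD is not included.

import numpy as np
from scipy.signal import csd

def cmif(Y, fs, nperseg=1024):
    """Singular values of the n_ch x n_ch output cross-PSD matrix at every
    frequency line (SVD separates the signal subspace from the noise
    subspace, as in the FSDD/CMIF approach)."""
    n_ch = Y.shape[0]
    f, _ = csd(Y[0], Y[0], fs=fs, nperseg=nperseg)
    G = np.zeros((len(f), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, G[:, i, j] = csd(Y[i], Y[j], fs=fs, nperseg=nperseg)
    sv = np.linalg.svd(G, compute_uv=False)      # shape (n_freq, n_ch)
    return f, sv

# Hypothetical 3-channel ambient response with modes near 2.0 and 3.5 Hz.
fs = 50.0
t = np.arange(0, 600, 1 / fs)
rng = np.random.default_rng(0)
modes = np.vstack([np.sin(2 * np.pi * 2.0 * t), np.sin(2 * np.pi * 3.5 * t)])
shapes = np.array([[1.0, 0.6], [0.4, -0.9], [-0.8, 0.3]])   # made-up mode shapes
Y = shapes @ modes + 0.3 * rng.standard_normal((3, len(t)))

f, sv = cmif(Y, fs)
print(f[np.argmax(sv[:, 0])])   # frequency of the strongest singular-value peak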

  2. Gravitational torque frequency analysis for the Einstein elevator experiment

    Energy Technology Data Exchange (ETDEWEB)

    Ashenberg, Joshua [Harvard-Smithsonian Center for Astrophysics (CfA), Cambridge, MA (United States); Lorenzini, Enrico C [University of Padova, Padua (Italy)

    2007-09-07

    Testing the principle of equivalence with a differential acceleration detector that spins while free falling requires a reliable extraction of a very small violation signal from the noise in the output signal frequency spectrum. The experiment is designed such that the violation signal is modulated by the spin of the test bodies. The possible violation signal is mixed with the intrinsic white noise of the detector and the colored noise associated with the modulation of gravitational perturbations, through the spin, and inertial-motion-related noise. In order to avoid false alarms the frequencies of the gravitational disturbances and the violation signal must be separate. This paper presents a model for the perturbative gravitational torque that affects the measurement. The torque is expanded in an asymptotic series to the fourth order and then expressed as a frequency spectrum. A spectral analysis shows the design conditions for frequency separation between the perturbing torque and the violation signal.

  3. The influence of sense-contingent argument structure frequencies on ambiguity resolution in aphasia.

    Science.gov (United States)

    Huck, Anneline; Thompson, Robin L; Cruice, Madeline; Marshall, Jane

    2017-06-01

    Verbs with multiple senses can show varying argument structure frequencies, depending on the underlying sense. When acknowledge is used to mean 'recognise', it takes a direct object (DO), but when it is used to mean 'admit' it prefers a sentence complement (SC). The purpose of this study was to investigate whether people with aphasia (PWA) can exploit such meaning-structure probabilities during the reading of temporarily ambiguous sentences, as demonstrated for neurologically healthy individuals (NHI) in a self-paced reading study (Hare et al., 2003). Eleven people with mild or moderate aphasia and eleven neurologically healthy control participants read sentences while their eyes were tracked. Using adapted materials from the study by Hare et al., target sentences containing an SC structure (e.g. He acknowledged (that) his friends would probably help him a lot) were presented following a context prime that biased either a direct object (DO-bias) or sentence complement (SC-bias) reading of the verbs. Half of the stimulus sentences did not contain that, making the post-verbal noun phrase (his friends) structurally ambiguous. Both groups of participants were influenced by structural ambiguity as well as by the context bias, indicating that PWA can, like NHI, use their knowledge of a verb's sense-based argument structure frequency during online sentence reading. However, the individuals with aphasia showed delayed reading patterns and some individual differences in their sensitivity to context and ambiguity cues. These differences compared to the NHI may contribute to difficulties in sentence comprehension in aphasia. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Time-frequency analysis of phonocardiogram signals using wavelet transform: a comparative study.

    Science.gov (United States)

    Ergen, Burhan; Tatar, Yetkin; Gulcur, Halil Ozcan

    2012-01-01

    Analysis of phonocardiogram (PCG) signals provides a non-invasive means to determine the abnormalities caused by cardiovascular system pathology. In general, time-frequency representation (TFR) methods are used to study the PCG signal because it is one of the non-stationary bio-signals. The continuous wavelet transform (CWT) is especially suitable for the analysis of non-stationary signals and for obtaining the TFR, due to its high resolution in both time and frequency, and it has recently become a favourite tool. It decomposes a signal in terms of elementary contributions called wavelets, which are shifted and dilated copies of a fixed mother wavelet function, and yields a joint TFR. Although the basic characteristics of the wavelets are similar, each type of wavelet produces a different TFR. In this study, eight of the best-known real wavelet types are examined on typical PCG signals indicating heart abnormalities in order to determine the best wavelet to obtain a reliable TFR. For this purpose, the wavelet energy and frequency spectrum estimations based on the CWT and the spectra of the chosen wavelets were compared with the energy distribution and the autoregressive frequency spectra in order to determine the most suitable wavelet. The results show that the Morlet wavelet is the most reliable wavelet for the time-frequency analysis of PCG signals.
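
    A minimal scalogram computation in the spirit of the study is sketched below, assuming the PyWavelets package (pywt) and its Morlet mother wavelet 'morl'; the PCG-like test signal is synthetic and the scale range is an arbitrary choice.

import numpy as np
import pywt   # PyWavelets, assumed available

# Synthetic stand-in for a PCG segment: two short bursts at different centre
# frequencies plus noise; real PCG recordings would be used in practice.
fs = 2000.0
t = np.arange(0, 1.0, 1 / fs)
def burst(f0, t0):
    return np.sin(2 * np.pi * f0 * t) * np.exp(-((t - t0) / 0.02) ** 2)
x = burst(60, 0.25) + burst(120, 0.65) + 0.05 * np.random.randn(len(t))

# Continuous wavelet transform with a Morlet mother wavelet, the wavelet the
# study found most reliable for PCG time-frequency representation.
scales = np.arange(4, 128)
coefs, freqs = pywt.cwt(x, scales, "morl", sampling_period=1 / fs)
tfr = np.abs(coefs) ** 2                   # scalogram (time-frequency energy)
print(freqs[tfr.max(axis=1).argmax()])     # frequency row with the most energy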

  5. Regulatory analysis for resolution of USI A-17

    International Nuclear Information System (INIS)

    Thatcher, D.F.

    1989-08-01

    This report presents a summary of the regulatory analysis conducted by the NRC staff to evaluate the value and impact of potential alternatives for the resolution of Unresolved Safety Issue (USI) A-17, ''Systems Interactions in Nuclear Power Plants.'' The NRC staff's proposed resolution offered in this report is based on this analysis. The staff's technical findings regarding interactions can be found in NUREG-1174. Adverse systems interactions (ASIs) involve subtle and often very complicated plant-specific dependencies between components and systems, possibly compounded by erroneous human intervention. The staff has identified actions to be taken by licensees and the NRC to resolve USI A-17; the staff has also made the judgment that these actions, together with other ongoing activities, would reduce the risk from adverse systems interactions. As discussed further in this report, the staff judgment that the actions are sufficient is not based on the assertion that all systems interactions have been identified, but rather that the A-17 actions, plus other activities by the licensees and staff, will identify precursors to potentially risk-significant interactions so that action can be taken if deemed necessary

  6. Objective high Resolution Analysis over Complex Terrain with VERA

    Science.gov (United States)

    Mayer, D.; Steinacker, R.; Steiner, A.

    2012-04-01

    VERA (Vienna Enhanced Resolution Analysis) is a model-independent, high-resolution objective analysis of meteorological fields over complex terrain. This system consists of a specially developed quality-control procedure and a combination of an interpolation and a downscaling technique. Whereas the so-called VERA-QC is presented at this conference in the contribution titled "VERA-QC, an approved Data Quality Control based on Self-Consistency" by Andrea Steiner, this presentation will focus on the method and the characteristics of the VERA interpolation scheme, which enables one to compute grid point values of a meteorological field based on irregularly distributed observations and topography-related a priori knowledge. Over complex topography meteorological fields are not smooth in general. The roughness which is induced by the topography can be explained physically. The knowledge about this behavior is used to define the so-called Fingerprints (e.g. a thermal Fingerprint reproducing heating or cooling over mountainous terrain or a dynamical Fingerprint reproducing positive pressure perturbations on the windward side of a ridge) under idealized conditions. If the VERA algorithm recognizes patterns of one or more Fingerprints at a few observation points, the corresponding patterns are used to downscale the meteorological information over a larger surrounding area. This technique makes it possible to achieve an analysis with a resolution much higher than that of the observational network. The interpolation of irregularly distributed stations to a regular grid (in space and time) is based on a variational principle applied to first- and second-order spatial and temporal derivatives. Mathematically, this can be formulated as a cost function that is equivalent to the penalty function of a thin-plate smoothing spline. After the analysis field has been divided into the Fingerprint components and the unexplained part respectively, the requirement of a smooth distribution is applied to the
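
    The interpolation step alone (irregular stations to a regular grid with a thin-plate-spline-type smoothness penalty) can be sketched as below; the station coordinates and values are made up, and VERA's fingerprint decomposition and quality control are not reproduced.

import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical irregular station network and a synthetic temperature field.
rng = np.random.default_rng(1)
stations = rng.uniform(0, 100, size=(40, 2))                 # x, y in km
temps = 15 - 0.05 * stations[:, 0] + rng.normal(0, 0.3, 40)  # synthetic values

# Thin-plate spline with a smoothing term, the penalty a variational
# first/second-derivative formulation is equivalent to.
interp = RBFInterpolator(stations, temps,
                         kernel="thin_plate_spline", smoothing=1.0)

# Evaluate on a regular 1 km analysis grid.
gx, gy = np.meshgrid(np.arange(0, 100, 1.0), np.arange(0, 100, 1.0))
grid_pts = np.column_stack([gx.ravel(), gy.ravel()])
analysis = interp(grid_pts).reshape(gx.shape)
print(analysis.shape, analysis.mean())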

  7. CEST ANALYSIS: AUTOMATED CHANGE DETECTION FROM VERY-HIGH-RESOLUTION REMOTE SENSING IMAGES

    Directory of Open Access Journals (Sweden)

    M. Ehlers

    2012-08-01

    Full Text Available A fast detection, visualization and assessment of change in areas of crisis or catastrophes are important requirements for the coordination and planning of help. Through the availability of new satellites and/or airborne sensors with very high spatial resolutions (e.g., WorldView, GeoEye), new remote sensing data are available for a better detection, delineation and visualization of change. For automated change detection, a large number of algorithms have been proposed and developed. From previous studies, however, it is evident that to date no single algorithm has the potential for being a reliable change detector for all possible scenarios. This paper introduces the Combined Edge Segment Texture (CEST) analysis, a decision-tree based cooperative suite of algorithms for automated change detection that is especially designed for the new generation of satellites with very high spatial resolution. The method incorporates frequency-based filtering, texture analysis, and image segmentation techniques. For the frequency analysis, different band-pass filters can be applied to identify the relevant frequency information for change detection. After transforming the multitemporal images via a fast Fourier transform (FFT) and applying the most suitable band-pass filter, different methods are available to extract changed structures: differencing and correlation in the frequency domain and correlation and edge detection in the spatial domain. Best results are obtained using edge extraction. For the texture analysis, different 'Haralick' parameters can be calculated (e.g., energy, correlation, contrast, inverse distance moment), with 'energy' so far providing the most accurate results. These algorithms are combined with a prior segmentation of the image data as well as with morphological operations for a final binary change result. A rule-based combination (CEST) of the change algorithms is applied to calculate the probability of change for a particular location. CEST
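
    One branch of such a pipeline (band-pass filtering in the Fourier domain followed by edge extraction and differencing) can be sketched as follows; the cut-off frequencies, threshold and test images are illustrative and do not correspond to the CEST decision tree.

import numpy as np
from scipy import ndimage

def bandpass_fft(img, low, high):
    # Keep only spatial frequencies between `low` and `high` (cycles/pixel).
    F = np.fft.fftshift(np.fft.fft2(img))
    fy = np.fft.fftshift(np.fft.fftfreq(img.shape[0]))
    fx = np.fft.fftshift(np.fft.fftfreq(img.shape[1]))
    FX, FY = np.meshgrid(fx, fy)
    r = np.hypot(FX, FY)
    F[(r < low) | (r > high)] = 0.0
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

def edge_change(img1, img2, low=0.02, high=0.25, thr=0.5):
    # Band-pass both dates, take the edge magnitude, and flag pixels where
    # edge strength differs markedly between the two dates.
    def edges(a):
        b = bandpass_fft(a, low, high)
        return np.hypot(ndimage.sobel(b, axis=0), ndimage.sobel(b, axis=1))
    diff = np.abs(edges(img2) - edges(img1))
    return diff > thr * diff.max()

rng = np.random.default_rng(0)
before = rng.normal(0.0, 0.05, (128, 128))     # hypothetical "before" image
after = before.copy()
after[40:70, 50:90] += 1.0                     # a new bright structure appears
print(edge_change(before, after).sum(), "pixels flagged as changed")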

  8. Quantitative high-resolution genomic analysis of single cancer cells.

    Science.gov (United States)

    Hannemann, Juliane; Meyer-Staeckling, Sönke; Kemming, Dirk; Alpers, Iris; Joosse, Simon A; Pospisil, Heike; Kurtz, Stefan; Görndt, Jennifer; Püschel, Klaus; Riethdorf, Sabine; Pantel, Klaus; Brandt, Burkhard

    2011-01-01

    During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  9. Quantitative high-resolution genomic analysis of single cancer cells.

    Directory of Open Access Journals (Sweden)

    Juliane Hannemann

    Full Text Available During cancer progression, specific genomic aberrations arise that can determine the scope of the disease and can be used as predictive or prognostic markers. The detection of specific gene amplifications or deletions in single blood-borne or disseminated tumour cells that may give rise to the development of metastases is of great clinical interest but technically challenging. In this study, we present a method for quantitative high-resolution genomic analysis of single cells. Cells were isolated under permanent microscopic control followed by high-fidelity whole genome amplification and subsequent analyses by fine tiling array-CGH and qPCR. The assay was applied to single breast cancer cells to analyze the chromosomal region centred on the therapeutically relevant EGFR gene. This method allows precise quantitative analysis of copy number variations in single cell diagnostics.

  10. Frequency analysis of the ECG before and during ventricular fibrillation

    NARCIS (Netherlands)

    Herbschleb, J.N.; Heethaar, R.M.; Tweel, I. van der; Meijler, F.L.

    1980-01-01

    Frequency analysis of cardiac electrograms of dogs with ventricular fibrillation (VF) during complete cardiopulmonary bypass and coronary perfusion showed a power spectrum with a peak around 12 Hz and its higher harmonics, suggesting more organization than generally assumed. As a next step to see

  11. Robust detection of discordant sites in regional frequency analysis

    NARCIS (Netherlands)

    Neykov, N.M.; Neytchev, P.N.; Van Gelder, P.H.A.J.M.; Todorov, V.K.

    2007-01-01

    The discordancy measure in terms of the sample L-moment ratios (L-CV, L-skewness, L-kurtosis) of the at-site data is widely recommended in the screening process of atypical sites in the regional frequency analysis (RFA). The sample mean and the covariance matrix of the L-moment ratios, on which the

  12. Application of frequency spectrum analysis in the rotator moving equilibrium

    International Nuclear Information System (INIS)

    Liu Ruilan; Su Guanghui; Shang Zhi; Jia Dounan

    2001-01-01

    Experimental equipment was developed to simulate rotator vibration, and the running state of the machine was reproduced by using different operating conditions. The vibration caused by the unbalanced mass is analyzed and discussed for the first order with a focused load, and an effective balancing method is identified by using frequency spectrum analysis

  13. Climate Informed Low Flow Frequency Analysis Using Nonstationary Modeling

    Science.gov (United States)

    Liu, D.; Guo, S.; Lian, Y.

    2014-12-01

    Stationarity is often assumed for frequency analysis of low flows in water resources management and planning. However, many studies have shown that flow characteristics, particularly the frequency spectrum of extreme hydrologic events, were modified by climate change and human activities, and that conventional frequency analysis without considering the non-stationary characteristics may lead to costly designs. The analysis presented in this paper was based on more than 100 years of daily flow data from the Yichang gaging station, 44 kilometers downstream of the Three Gorges Dam. The Mann-Kendall trend test under the scaling hypothesis showed that the annual low flows had a significant monotonic trend, whereas an abrupt change point was identified in 1936 by the Pettitt test. The climate-informed low flow frequency analysis and the divided and combined method are employed to account for the impacts from related climate variables and the nonstationarities in annual low flows. Without prior knowledge of the probability density function for the gaging station, six distribution functions including the Generalized Extreme Value (GEV), Pearson Type III, Gumbel, Gamma, Lognormal, and Weibull distributions have been tested to find the best fit, in which the local likelihood method is used to estimate the parameters. Analyses show that the GEV had the best fit for the observed low flows. This study has also shown that the climate-informed low flow frequency analysis is able to exploit the link between climate indices and low flows, which would account for the dynamic feature for reservoir management and provide more accurate and reliable designs for infrastructure and water supply.
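
    The two basic ingredients, a trend test on the annual low-flow series and a candidate distribution fit, can be sketched as follows; the series is synthetic, the Mann-Kendall version shown is the classical one (not the scaling-hypothesis variant used in the paper), and only a stationary GEV fit is shown.

import numpy as np
from scipy import stats

def mann_kendall(x):
    """Classical Mann-Kendall S statistic and normal-approximation p-value
    (no tie or autocorrelation corrections)."""
    x = np.asarray(x, float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, 2 * (1 - stats.norm.cdf(abs(z)))

# Hypothetical annual-minimum flow series with a weak downward trend.
rng = np.random.default_rng(2)
years = np.arange(1900, 2010)
low_flows = 3000 - 4.0 * (years - 1900) + rng.gamma(20, 30, len(years))

print("Mann-Kendall (S, p):", mann_kendall(low_flows))

# Stationary GEV fit (scipy's genextreme) as one of the candidate models.
c, loc, scale = stats.genextreme.fit(low_flows)
print("10% nonexceedance low-flow quantile:",
      stats.genextreme.ppf(0.1, c, loc=loc, scale=scale))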

  14. High-resolution inelastic X-ray scattering to study the high-frequency atomic dynamics of disordered systems

    International Nuclear Information System (INIS)

    Monaco, G.

    2008-01-01

    The use of momentum-resolved inelastic X-ray scattering with meV energy resolution to study the high-frequency atomic dynamics in disordered systems is here reviewed. The typical realization of this experiment is described together with some common models used to interpret the measured spectra and to extract parameters of interest for the investigation of disordered systems. With the help of some selected examples, the present status of the field is discussed. Particular attention is given to those results which are still open for discussion or controversial, and which will require further development of the technique to be fully solved. Such an instrumental development seems nowadays possible at the light of recently proposed schemes for advanced inelastic X-ray scattering spectrometers. (author)

  15. DATA ACQUISITION AND ANALYSIS OF LOW FREQUENCY ELECTROMAGNETIC FIELD

    Directory of Open Access Journals (Sweden)

    PETRICA POPOV

    2016-06-01

    Full Text Available In recent years more and more studies have shown that low-frequency fields (particularly magnetic fields at 50/60 Hz) are a major risk factor; according to some specialists, even more important than the radiation field. As a result, the personnel operating equipment and facilities such as electric generators, synchronous motors, inverters or power transformers are continually subjected to intense fields in their vicinity, with possible long-term harmful effects through the alteration of cell metabolism and of the underlying biological mechanisms. Therefore, finding new methods and tools for the measurement and analysis of low-frequency electromagnetic fields may lead to improved standards for exposure limits of the human body.

  16. Time-Frequency Analysis of Signals Generated by Rotating Machines

    Directory of Open Access Journals (Sweden)

    R. Zetik

    1999-06-01

    Full Text Available This contribution is devoted to higher-order time-frequency analyses of signals. Firstly, time-frequency representations of higher order (TFRHO) are defined. Then the L-Wigner distribution (LWD) is given as a special case of TFRHO. Basic properties of the LWD are illustrated based on the analysis of mono-component and multi-component synthetic signals and of acoustical signals generated by a rotating machine. The obtained results confirm the usefulness of the LWD for rotating machine condition monitoring.

  17. High frequency vibration analysis by the complex envelope vectorization.

    Science.gov (United States)

    Giannini, O; Carcaterra, A; Sestieri, A

    2007-06-01

    The complex envelope displacement analysis (CEDA) is a procedure to solve high frequency vibration and vibro-acoustic problems, providing the envelope of the physical solution. CEDA is based on a variable transformation mapping the high frequency oscillations into signals of low frequency content and has been successfully applied to one-dimensional systems. However, the extension to plates and vibro-acoustic fields met serious difficulties, so that a general revision of the theory was carried out, leading finally to a new method, the complex envelope vectorization (CEV). In this paper the CEV method is described, underlining the merits and limits of the procedure, and a set of applications to vibration and vibro-acoustic problems of increasing complexity is presented.

  18. Fast focus estimation using frequency analysis in digital holography.

    Science.gov (United States)

    Oh, Seungtaik; Hwang, Chi-Young; Jeong, Il Kwon; Lee, Sung-Keun; Park, Jae-Hyeung

    2014-11-17

    A novel, fast frequency-based method to estimate the focus distance of a digital hologram of a single object is proposed. The focus distance is computed by analyzing the distribution of intersections of smoothed rays. The smoothed rays are determined by the directions of energy flow, which are computed from the local spatial frequency spectrum based on the windowed Fourier transform. Our method therefore uses only the intrinsic frequency information of the optical field on the hologram and does not require any sequential numerical reconstructions or the focus detection techniques of conventional photography, both of which are essential parts of previous methods. To show the effectiveness of our method, numerical results and analysis are presented as well.

  19. Reconstruction of human brain spontaneous activity based on frequency-pattern analysis of magnetoencephalography data

    Science.gov (United States)

    Llinás, Rodolfo R.; Ustinin, Mikhail N.; Rykunov, Stanislav D.; Boyko, Anna I.; Sychev, Vyacheslav V.; Walton, Kerry D.; Rabello, Guilherme M.; Garcia, John

    2015-01-01

    A new method for the analysis and localization of brain activity has been developed, based on multichannel magnetic field recordings, over minutes, superimposed on the MRI of the individual. Here, a high resolution Fourier Transform is obtained over the entire recording period, leading to a detailed multi-frequency spectrum. Further analysis implements a total decomposition of the frequency components into functionally invariant entities, each having an invariant field pattern localizable in recording space. The method, referred to as functional tomography, makes it possible to find the distribution of magnetic field sources in space. Here, the method is applied to the analysis of simulated data, to oscillating signals activating a physical current-dipole phantom, and to recordings of spontaneous brain activity in 10 healthy adults. In the analysis of simulated data, 61 dipoles are localized with 0.7 mm precision. For the physical phantom, the method is able to localize three simultaneously activated current dipoles with 1 mm precision. A spatial resolution of 3 mm was attained when localizing spontaneous alpha rhythm activity in 10 healthy adults, where the alpha peak was specified for each subject individually. Co-registration of the functional tomograms with each subject's head MRI localized alpha range activity to the occipital and/or posterior parietal brain region. This is the first application of this new functional tomography to human brain activity. The method successfully provides an overall view of brain electrical activity, a detailed spectral description and, combined with MRI, the localization of sources in anatomical brain space. PMID:26528119

  20. Analysis of Automated Aircraft Conflict Resolution and Weather Avoidance

    Science.gov (United States)

    Love, John F.; Chan, William N.; Lee, Chu Han

    2009-01-01

    This paper describes an analysis of using trajectory-based automation to resolve both aircraft and weather constraints for near-term air traffic management decision making. The auto-resolution algorithm developed and tested at NASA-Ames to resolve aircraft-to-aircraft conflicts has been modified to mitigate convective weather constraints. Modifications include adding information about the size of a gap between weather constraints to the routing solution. Routes that traverse gaps that are smaller than a specific size are not used. An evaluation of the performance of the modified autoresolver in resolving both aircraft conflicts and weather constraints was performed. Integration with the Center-TRACON Traffic Management System was completed to evaluate the effect of weather routing on schedule delays.

  1. Analysis of modal frequency optimization of railway vehicle car body

    Directory of Open Access Journals (Sweden)

    Wenjing Sun

    2016-04-01

    Full Text Available High structural modal frequencies of the car body are beneficial, as they ensure better vibration control and enhance the ride quality of railway vehicles. Modal sensitivity optimization and the elastic suspension parameters used in the design of equipment beneath the chassis of the car body are proposed in order to improve the modal frequencies of car bodies under service conditions. Modal sensitivity optimization is based on sensitivity analysis theory, which considers the thickness of the body frame at various positions as variables in order to achieve optimization. The equipment suspension design analyzes the influence of suspension parameters on the modal frequencies of the car body through the use of an equipment-car body coupled model. Results indicate that both methods can effectively improve the modal parameters of the car body. Modal sensitivity optimization increases the vertical bending frequency from 9.70 to 10.60 Hz, while optimization of the elastic suspension parameters increases the vertical bending frequency to 10.51 Hz. The suspension design can be used without alteration to the structure of the car body while ensuring better ride quality.

  2. FREQUENCY ANALYSIS OF VIBRATIONS OF THE ROUND PARACHUTE EDGE

    Directory of Open Access Journals (Sweden)

    2016-01-01

    Full Text Available The article addresses the analysis of videos obtained during a flight experiment at the launch of the meteorological rocket MMP-06, in order to determine the main characteristics of the oscillatory process of the canopy edge at subsonic speeds at altitudes from 42.2 km to 34.2 km. Data analysis demonstrated that the oscillations of the canopy edge have a random character. A structural frequency of 2.4 Hz was identified from the analysis and was found to be determined by the nylon sling stiffness.

  3. Nonstationary Hydrological Frequency Analysis: Theoretical Methods and Application Challenges

    Science.gov (United States)

    Xiong, L.

    2014-12-01

    Because of its great implications in the design and operation of hydraulic structures under changing environments (either climate change or anthropogenic changes), nonstationary hydrological frequency analysis has become increasingly important and essential. Two important methodological achievements have been made. Without adhering to the consistency assumption of traditional hydrological frequency analysis, the time-varying probability distribution of any hydrological variable can be established by linking the distribution parameters to some covariates such as time or physical variables with the help of powerful tools like the Generalized Additive Model for Location, Scale and Shape (GAMLSS). With the help of copulas, multivariate nonstationary hydrological frequency analysis has also become feasible. However, applications of nonstationary hydrological frequency formulae to the design and operation of hydraulic structures for coping with the impacts of changing environments in practice are still faced with many challenges. First, nonstationary hydrological frequency formulae with time as a covariate can only be extrapolated for a very short period beyond the latest observation time, because such formulae are not physically constrained and the extrapolated outcomes could be unrealistic. There are two physically reasonable methods that can be used for changing environments: one is to directly link the quantiles or the distribution parameters to some measurable physical factors, and the other is to use derived probability distributions based on hydrological processes. However, both methods carry a certain degree of uncertainty. For the design and operation of hydraulic structures under changing environments, it is recommended that design results of both stationary and nonstationary methods be presented together and compared with each other, to help us understand the potential risks of each method.
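
    As a concrete illustration of the first achievement, a GEV distribution whose location parameter varies linearly with a covariate can be fitted by maximum likelihood as sketched below; this is a simplified stand-in for GAMLSS, with synthetic data and an assumed linear dependence.

import numpy as np
from scipy import stats, optimize

def fit_nonstationary_gev(x, covariate):
    """Time-varying GEV sketch: location mu(t) = b0 + b1 * covariate(t),
    constant scale and shape, fitted by maximum likelihood."""
    def nll(p):
        b0, b1, log_sigma, xi = p
        mu = b0 + b1 * covariate
        return -np.sum(stats.genextreme.logpdf(x, c=-xi, loc=mu,
                                               scale=np.exp(log_sigma)))
    p0 = [np.mean(x), 0.0, np.log(np.std(x)), 0.1]
    res = optimize.minimize(nll, p0, method="Nelder-Mead")
    return res.x                    # [b0, b1, log_sigma, xi]

# Hypothetical annual maxima whose mean rises with a climate covariate.
rng = np.random.default_rng(3)
cov = np.linspace(-1, 1, 80)        # e.g. a standardized climate index
x = stats.genextreme.rvs(c=-0.1, loc=50 + 8 * cov, scale=10, random_state=rng)
print(fit_nonstationary_gev(x, cov))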

  4. Time-variant random interval natural frequency analysis of structures

    Science.gov (United States)

    Wu, Binhua; Wu, Di; Gao, Wei; Song, Chongmin

    2018-02-01

    This paper presents a new robust method, namely the unified interval Chebyshev-based random perturbation method, to tackle the hybrid random-interval structural natural frequency problem. In the proposed approach, the random perturbation method is implemented to furnish the statistical features (i.e., mean and standard deviation), and a Chebyshev surrogate model strategy is incorporated to formulate the statistical information of the natural frequency with regard to the interval inputs. The comprehensive analysis framework combines the superiority of both methods in a way that the computational cost is dramatically reduced. The presented method is thus capable of investigating the day-to-day time-variant natural frequency of structures accurately and efficiently under the intrinsic creep effect of concrete with probabilistic and interval uncertain variables. The extreme bounds of the mean and standard deviation of the natural frequency are captured through the optimization strategy embedded within the analysis procedure. Three numerical examples, with a progressive relationship in terms of both structure type and uncertain variables, are presented to demonstrate the applicability, accuracy and efficiency of the proposed method.

  5. Model validity and frequency band selection in operational modal analysis

    Science.gov (United States)

    Au, Siu-Kui

    2016-12-01

    Experimental modal analysis aims at identifying the modal properties (e.g., natural frequencies, damping ratios, mode shapes) of a structure using vibration measurements. Two basic questions are encountered when operating in the frequency domain: Is there a mode near a particular frequency? If so, how much spectral data near the frequency can be included for modal identification without incurring significant modeling error? For data with high signal-to-noise (s/n) ratios these questions can be addressed using empirical tools such as singular value spectrum. Otherwise they are generally open and can be challenging, e.g., for modes with low s/n ratios or close modes. In this work these questions are addressed using a Bayesian approach. The focus is on operational modal analysis, i.e., with 'output-only' ambient data, where identification uncertainty and modeling error can be significant and their control is most demanding. The approach leads to 'evidence ratios' quantifying the relative plausibility of competing sets of modeling assumptions. The latter involves modeling the 'what-if-not' situation, which is non-trivial but is resolved by systematic consideration of alternative models and using maximum entropy principle. Synthetic and field data are considered to investigate the behavior of evidence ratios and how they should be interpreted in practical applications.

  6. Analysis of core damage frequency: Surry, Unit 1 internal events

    International Nuclear Information System (INIS)

    Bertucio, R.C.; Julius, J.A.; Cramond, W.R.

    1990-04-01

    This document contains the accident sequence analysis of internally initiated events for the Surry Nuclear Station, Unit 1. This is one of the five plant analyses conducted as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 documents the risk of a selected group of nuclear power plants. The work performed and described here is an extension of that published in November 1986 as NUREG/CR-4450, Volume 3. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved. The context and detail of this report are directed toward PRA practitioners who need to know how the work was performed and the details for use in further studies. The mean core damage frequency at Surry was calculated to be 4.05E-5 per year, with a 95% upper bound of 1.34E-4 and 5% lower bound of 6.8E-6 per year. Station blackout type accidents (loss of all AC power) were the largest contributors to the core damage frequency, accounting for approximately 68% of the total. The next type of dominant contributors were Loss of Coolant Accidents (LOCAs). These sequences account for 15% of core damage frequency. No other type of sequence accounts for more than 10% of core damage frequency. 49 refs., 52 figs., 70 tabs

  7. Fast computation of molecular random phase approximation correlation energies using resolution of the identity and imaginary frequency integration

    Science.gov (United States)

    Eshuis, Henk; Yarkony, Julian; Furche, Filipp

    2010-06-01

    The random phase approximation (RPA) is an increasingly popular post-Kohn-Sham correlation method, but its high computational cost has limited molecular applications to systems with few atoms. Here we present an efficient implementation of RPA correlation energies based on a combination of resolution of the identity (RI) and imaginary frequency integration techniques. We show that the RI approximation to four-index electron repulsion integrals leads to a variational upper bound to the exact RPA correlation energy if the Coulomb metric is used. Auxiliary basis sets optimized for second-order Møller-Plesset (MP2) calculations are well suited for RPA, as is demonstrated for the HEAT [A. Tajti et al., J. Chem. Phys. 121, 11599 (2004)] and MOLEKEL [F. Weigend et al., Chem. Phys. Lett. 294, 143 (1998)] benchmark sets. Using imaginary frequency integration rather than diagonalization to compute the matrix square root necessary for RPA, evaluation of the RPA correlation energy requires O(N^4 log N) operations and O(N^3) storage only; the price for this dramatic improvement over existing algorithms is a numerical quadrature. We propose a numerical integration scheme that is exact in the two-orbital case and converges exponentially with the number of grid points. For most systems, 30-40 grid points yield μH accuracy in triple zeta basis sets, but much larger grids are necessary for small gap systems. The lowest-order approximation to the present method is a post-Kohn-Sham frequency-domain version of opposite-spin Laplace-transform RI-MP2 [J. Jung et al., Phys. Rev. B 70, 205107 (2004)]. Timings for polyacenes with up to 30 atoms show speed-ups of two orders of magnitude over previous implementations. The present approach makes it possible to routinely compute RPA correlation energies of systems well beyond 100 atoms, as is demonstrated for the octapeptide angiotensin II.
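
    The imaginary-frequency integration idea, mapping Gauss-Legendre nodes onto the semi-infinite frequency axis so that a smooth integrand is captured by a small quadrature, can be illustrated on a scalar toy integrand as below; the actual RPA integrand involves the response matrices and is not constructed here, and the mapping shown is a common generic choice rather than the authors' optimized scheme.

import numpy as np

def gauss_legendre_semiinfinite(n, w0=1.0):
    """Map n-point Gauss-Legendre nodes from (-1, 1) onto (0, inf) via
    x = w0 * (1 + t) / (1 - t); the scaling w0 is tuned to the system."""
    t, w = np.polynomial.legendre.leggauss(n)
    x = w0 * (1 + t) / (1 - t)
    wx = w * 2 * w0 / (1 - t) ** 2          # Jacobian of the mapping
    return x, wx

# Toy integrand with Lorentzian-like decay in imaginary frequency,
# f(w) = 1 / (w^2 + a^2), whose exact integral over (0, inf) is pi / (2a).
a = 2.0
for n in (8, 16, 32, 40):
    x, w = gauss_legendre_semiinfinite(n, w0=a)
    approx = np.sum(w / (x ** 2 + a ** 2))
    print(n, approx, np.pi / (2 * a))       # rapid convergence with grid size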

  8. The Peltier driven frequency domain approach in thermal analysis.

    Science.gov (United States)

    De Marchi, Andrea; Giaretto, Valter

    2014-10-01

    The merits of Frequency Domain analysis as a tool for thermal system characterization are discussed, and the complex thermal impedance approach is illustrated. Pure AC thermal flux generation with negligible DC component is possible with a Peltier device, differently from other existing methods in which a significant DC component is intrinsically attached to the generated AC flux. Such technique is named here Peltier Driven Frequency Domain (PDFD). As a necessary prerequisite, a novel one-dimensional analytical model for an asymmetrically loaded Peltier device is developed, which is general enough to be useful in most practical situations as a design tool for measurement systems and as a key for the interpretation of experimental results. Impedance analysis is possible with Peltier devices by the inbuilt Seebeck effect differential thermometer, and is used in the paper for an experimental validation of the analytical model. Suggestions are then given for possible applications of PDFD, including the determination of thermal properties of materials.

  9. Time-Frequency Analysis of the Dispersion of Lamb Modes

    Science.gov (United States)

    Prosser, W. H.; Seale, Michael D.; Smith, Barry T.

    1999-01-01

    Accurate knowledge of the velocity dispersion of Lamb modes is important for ultrasonic nondestructive evaluation methods used in detecting and locating flaws in thin plates and in determining their elastic stiffness coefficients. Lamb mode dispersion is also important in the acoustic emission technique for accurately triangulating the location of emissions in thin plates. In this research, the ability to characterize Lamb mode dispersion through a time-frequency analysis (the pseudo Wigner-Ville distribution) was demonstrated. A major advantage of time-frequency methods is the ability to analyze acoustic signals containing multiple propagation modes, which overlap and superimpose in the time domain signal. By combining time-frequency analysis with a broadband acoustic excitation source, the dispersion of multiple Lamb modes over a wide frequency range can be determined from as little as a single measurement. In addition, the technique provides a direct measurement of the group velocity dispersion. The technique was first demonstrated in the analysis of a simulated waveform in an aluminum plate in which the Lamb mode dispersion was well known. Portions of the dispersion curves of the A0, A1, S0, and S2 Lamb modes were obtained from this one waveform. The technique was also applied for the analysis of experimental waveforms from a unidirectional graphite/epoxy composite plate. Measurements were made both along, and perpendicular to the fiber direction. In this case, the signals contained only the lowest order symmetric and antisymmetric modes. A least squares fit of the results from several source to detector distances was used. Theoretical dispersion curves were calculated and are shown to be in good agreement with experimental results.

  10. Frequency spectrum analysis of finger photoplethysmographic waveform variability during haemodialysis.

    Science.gov (United States)

    Javed, Faizan; Middleton, Paul M; Malouf, Philip; Chan, Gregory S H; Savkin, Andrey V; Lovell, Nigel H; Steel, Elizabeth; Mackie, James

    2010-09-01

    This study investigates the peripheral circulatory and autonomic response to volume withdrawal in haemodialysis based on spectral analysis of photoplethysmographic waveform variability (PPGV). Frequency spectrum analysis was performed on the baseline and pulse amplitude variabilities of the finger infrared photoplethysmographic (PPG) waveform and on heart rate variability extracted from the ECG signal collected from 18 kidney failure patients undergoing haemodialysis. Spectral powers were calculated from the low frequency (LF, 0.04-0.145 Hz) and high frequency (HF, 0.145-0.45 Hz) bands. In eight stable fluid overloaded patients (fluid removal of >2 L) not on alpha blockers, progressive reduction in relative blood volume during haemodialysis resulted in significant increase in LF and HF powers of PPG baseline and amplitude variability (P analysis of finger PPGV may provide valuable information on the autonomic vascular response to blood volume reduction in haemodialysis, and can be potentially utilized as a non-invasive tool for assessing peripheral circulatory control during routine dialysis procedure.
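
    The band-power computation used in such analyses can be sketched as follows, assuming a beat-to-beat PPG amplitude variability series already resampled to a uniform 2 Hz grid; the series itself is synthetic and the band edges are those quoted in the abstract.

import numpy as np
from scipy.signal import welch

def band_powers(x, fs, lf=(0.04, 0.145), hf=(0.145, 0.45)):
    # Welch PSD of the variability series, integrated over the LF and HF
    # bands used in the study (0.04-0.145 Hz and 0.145-0.45 Hz).
    f, pxx = welch(x, fs=fs, nperseg=min(len(x), 256))
    df = f[1] - f[0]
    lf_power = pxx[(f >= lf[0]) & (f < lf[1])].sum() * df
    hf_power = pxx[(f >= hf[0]) & (f < hf[1])].sum() * df
    return lf_power, hf_power

# Hypothetical pulse-amplitude variability series at 2 Hz, with a 0.1 Hz (LF)
# and a 0.25 Hz (HF, respiratory) component plus noise.
fs = 2.0
t = np.arange(0, 300, 1 / fs)
x = 0.05 * np.sin(2 * np.pi * 0.10 * t) + 0.03 * np.sin(2 * np.pi * 0.25 * t)
x = x + 0.01 * np.random.randn(len(t))
lf_p, hf_p = band_powers(x, fs)
print("LF power:", lf_p, "HF power:", hf_p, "LF/HF:", lf_p / hf_p)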

  11. Regulatory analysis for the resolution of Unresolved Safety Issue A-44, Station Blackout. Draft report

    International Nuclear Information System (INIS)

    Rubin, A.M.

    1986-01-01

    ''Station Blackout'' is the complete loss of alternating current (ac) electric power to the essential and nonessential buses in a nuclear power plant; it results when both offsite power and the onsite emergency ac power systems are unavailable. Because many safety systems required for reactor core decay heat removal and containment heat removal depend on ac power, the consequences of a station blackout could be severe. Because of the concern about the frequency of loss of offsite power, the number of failures of emergency diesel generators, and the potentially severe consequences of a loss of all ac power, ''Station Blackout'' was designated as Unresolved Safety Issue (USI) A-44. This report presents the regulatory analysis for USI A-44. It includes: (1) a summary of the issue, (2) the proposed technical resolution, (3) alternative resolutions considered by the Nuclear Regulatory Commission (NRC) staff, (4) an assessment of the benefits and costs of the recommended resolution, (5) the decision rationale, and (6) the relationship between USI A-44 and other NRC programs and requirements

  12. Time-frequency analysis : mathematical analysis of the empirical mode decomposition.

    Science.gov (United States)

    2009-01-01

    Invented over 10 years ago, empirical mode decomposition (EMD) provides a nonlinear time-frequency analysis with the ability to successfully analyze nonstationary signals. Mathematical Analysis of the Empirical Mode Decomposition is a...

  13. Multi-resolution analysis for region of interest extraction in thermographic nondestructive evaluation

    Science.gov (United States)

    Ortiz-Jaramillo, B.; Fandiño Toro, H. A.; Benitez-Restrepo, H. D.; Orjuela-Vargas, S. A.; Castellanos-Domínguez, G.; Philips, W.

    2012-03-01

    Infrared Non-Destructive Testing (INDT) is known as an effective and rapid method for nondestructive inspection. It can detect a broad range of near-surface structural flaws in metallic and composite components. Those flaws are modeled as smooth contours centered at peaks of stored thermal energy, termed Regions of Interest (ROI). Dedicated methodologies must detect the presence of those ROIs. In this paper, we present a methodology for ROI extraction in INDT tasks. The methodology deals with the difficulties due to non-uniform heating, which affects the low spatial frequencies and hinders the detection of relevant points in the image. We propose a methodology for ROI extraction in INDT using multi-resolution analysis, which is robust to low ROI contrast and non-uniform heating. The methodology includes local correlation, Gaussian scale analysis and local edge detection. Local correlation between the image and a Gaussian window provides interest points related to ROIs; a Gaussian window is used because the thermal behavior is well modeled by smooth Gaussian contours. The Gaussian scale is used to analyze details in the image through multi-resolution analysis, avoiding problems of low contrast and non-uniform heating as well as the selection of the Gaussian window size. Finally, local edge detection provides a good estimation of the boundaries of the ROI. Thus, we provide a methodology for ROI extraction based on multi-resolution analysis that performs as well as or better than other dedicated algorithms in the state of the art.
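
    A toy version of the detection idea, correlating the thermogram with Gaussian kernels at several scales while suppressing the slowly varying non-uniform heating, is sketched below; the scales, threshold and synthetic frame are illustrative and do not reproduce the paper's exact pipeline.

import numpy as np
from scipy import ndimage

def roi_candidates(frame, scales=(2, 4, 8), k=3.0):
    """Multi-scale ROI candidates: Gaussian (matched-filter-like) smoothing at
    several widths, minus a much wider Gaussian to remove slow non-uniform
    heating, then thresholding at k standard deviations (an illustrative rule)."""
    mask = np.zeros(frame.shape, dtype=bool)
    for s in scales:
        smoothed = ndimage.gaussian_filter(frame, sigma=s)       # local correlation
        background = ndimage.gaussian_filter(frame, sigma=4 * s) # low-frequency trend
        resp = smoothed - background
        mask |= resp > resp.mean() + k * resp.std()
    return mask

# Hypothetical thermogram: a smooth heating gradient plus two hot spots and noise.
y, x = np.mgrid[0:200, 0:200]
frame = 0.002 * x
frame = frame + 0.3 * np.exp(-((x - 60) ** 2 + (y - 80) ** 2) / 50.0)
frame = frame + 0.2 * np.exp(-((x - 150) ** 2 + (y - 40) ** 2) / 200.0)
frame = frame + 0.01 * np.random.randn(200, 200)

labels, n = ndimage.label(roi_candidates(frame))
print(n, "candidate regions found")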

  14. Mars Global Surveyor Ka-Band Frequency Data Analysis

    Science.gov (United States)

    Morabito, D.; Butman, S.; Shambayati, S.

    2000-01-01

    The Mars Global Surveyor (MGS) spacecraft, launched on November 7, 1996, carries an experimental space-to-ground telecommunications link at Ka-band (32 GHz) along with the primary X-band (8.4 GHz) downlink. The signals are simultaneously transmitted from a 1.5-m diameter parabolic high gain antenna (HGA) on MGS and received by a beam-waveguide (BWG) R&D 34-meter antenna located in NASA's Goldstone Deep Space Network (DSN) complex near Barstow, California. The projected 5-dB link advantage of Ka-band relative to X-band was confirmed in previous reports using measurements of MGS signal strength data acquired during the first two years of the link experiment from December 1996 to December 1998. Analysis of X-band and Ka-band frequency data and difference frequency (fx-fka)/3.8 data will be presented here. On board the spacecraft, a low-power sample of the X-band downlink from the transponder is upconverted to 32 GHz, the Ka-band frequency, amplified to 1 W using a Solid State Power Amplifier, and radiated from the dual X/Ka HGA. The X-band signal is amplified by one of two 25 W TWTAs. An upconverter first downconverts the 8.42 GHz X-band signal to 8 GHz and then multiplies it using a X4 multiplier, producing the 32 GHz Ka-band frequency. The frequency source selection is performed by an RF switch which can be commanded to select a VCO (Voltage Controlled Oscillator) or USO (Ultra-Stable Oscillator) reference. The Ka-band frequency can be either coherent with the X-band downlink reference or a hybrid combination of the USO and VCO derived frequencies. The data in this study were chosen such that the Ka-band signal is purely coherent with the X-band signal (that is, the downconverter is driven by the same frequency source as the X-band downlink). The ground station used to acquire the data is DSS-13, a 34-meter BWG antenna which incorporates a series of mirrors inside beam waveguide tubes which guide the energy to a subterranean pedestal room, providing a stable environment

  15. Frequency prediction by linear stability analysis around mean flow

    Science.gov (United States)

    Bengana, Yacine; Tuckerman, Laurette

    2017-11-01

    The frequency of certain limit cycles resulting from a Hopf bifurcation, such as the von Karman vortex street, can be predicted by linear stability analysis around their mean flows. Barkley (2006) has shown this to yield an eigenvalue whose real part is zero and whose imaginary part matches the nonlinear frequency. This property was named RZIF by Turton et al. (2015); moreover they found that the traveling waves (TW) of thermosolutal convection have the RZIF property. They explained this as a consequence of the fact that the temporal Fourier spectrum is dominated by the mean flow and first harmonic. We could therefore consider that only the first mode is important in the saturation of the mean flow as presented in the Self-Consistent Model (SCM) of Mantic-Lugo et al. (2014). We have implemented a full Newton's method to solve the SCM for thermosolutal convection. We show that while the RZIF property is satisfied far from the threshold, the SCM model reproduces the exact frequency only very close to the threshold. Thus, the nonlinear interaction of only the first mode with itself is insufficiently accurate to estimate the mean flow. Our next step will be to take into account higher harmonics and to apply this analysis to the standing waves, for which RZIF does not hold.

  16. Signal Adaptive System for Space/Spatial-Frequency Analysis

    Directory of Open Access Journals (Sweden)

    Veselin N. Ivanović

    2009-01-01

    Full Text Available This paper outlines the development of a multiple-clock-cycle implementation (MCI) of a signal-adaptive two-dimensional (2D) system for space/spatial-frequency (S/SF) signal analysis. The design is based on a method for improved S/SF representation of the analyzed 2D signals, also proposed here. The proposed MCI design optimizes critical design performances related to hardware complexity, making it a suitable system for real-time implementation on an integrated chip. Additionally, the design allows the implemented system to take a variable number of clock cycles (CLKs), only the ones necessary for the desired (2D Wigner distribution) presentation of autoterms at the different frequency-frequency points, during the execution. This ability represents a major advantage of the proposed design, which helps to optimize the execution time and produce an improved, cross-terms-free S/SF signal representation. The design has been verified by a field-programmable gate array (FPGA) circuit design, capable of performing S/SF analysis of 2D signals in real time.

  17. The dependence of bar frequency on galaxy mass, colour, and gas content - and angular resolution - in the local universe

    Science.gov (United States)

    Erwin, Peter

    2018-03-01

    I use distance- and mass-limited subsamples of the Spitzer Survey of Stellar Structure in Galaxies (S4G) to investigate how the presence of bars in spiral galaxies depends on mass, colour, and gas content and whether large, Sloan Digital Sky Survey (SDSS)-based investigations of bar frequencies agree with local data. Bar frequency reaches a maximum of f_bar ≈ 0.70 at M⋆ ≈ 10^9.7 M⊙, declining to both lower and higher masses. It is roughly constant over a wide range of colours (g - r ≈ 0.1-0.8) and atomic gas fractions (log(M_HI/M⋆) ≈ -2.5 to 1). Bars are thus as common in blue, gas-rich galaxies as they are in red, gas-poor galaxies. This is in sharp contrast to many SDSS-based studies of z ≈ 0.01-0.1 galaxies, which report f_bar increasing strongly to higher masses (from M⋆ ≈ 10^10 to 10^11 M⊙), redder colours, and lower gas fractions. The contradiction can be explained if SDSS-based studies preferentially miss bars in, and underestimate the bar fraction for, lower mass (bluer, gas-rich) galaxies due to poor spatial resolution and the correlation between bar size and stellar mass. Simulations of SDSS-style observations using the S4G galaxies as a parent sample, and assuming that bars below a threshold angular size of twice the point spread function full width at half-maximum cannot be identified, successfully reproduce typical SDSS f_bar trends for stellar mass and gas mass ratio. Similar considerations may affect high-redshift studies, especially if bars grow in length over cosmic time; simulations suggest that high-redshift bar fractions may thus be systematically underestimated.
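
    The selection effect invoked above can be illustrated with a small Monte Carlo sketch in Python. All numbers below (the bar-size scaling with mass, the survey distance range, the PSF width and the intrinsic bar fraction) are illustrative assumptions rather than values taken from the paper; the point is only that an angular-size detection threshold of twice the PSF FWHM turns a flat intrinsic bar fraction into an apparently rising trend with stellar mass.

      # Monte Carlo caricature of the SDSS-style detection bias: intrinsically constant
      # bar fraction, bar size correlated with stellar mass, and bars below twice the
      # PSF FWHM counted as undetected. All numbers are illustrative assumptions.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 200_000
      logM = rng.uniform(9.0, 11.0, n)              # stellar masses, 10^9 - 10^11 Msun
      has_bar = rng.random(n) < 0.70                # assumed mass-independent bar fraction

      # Assumed bar-size / mass relation (~1.5 kpc at 10^9.5 Msun, ~5 kpc at 10^11 Msun)
      bar_kpc = 10 ** (-3.15 + 0.35 * logM) * np.exp(0.3 * rng.normal(size=n))

      d_mpc = rng.uniform(40, 400, n)               # distances spanning z ~ 0.01-0.1, roughly
      bar_arcsec = bar_kpc * 206.265 / d_mpc        # small-angle conversion, ignores cosmology

      psf_fwhm = 1.4                                # assumed SDSS-like seeing (arcsec)
      detected = has_bar & (bar_arcsec > 2 * psf_fwhm)

      for lo in np.arange(9.0, 11.0, 0.5):
          sel = (logM >= lo) & (logM < lo + 0.5)
          print(f"logM {lo:.1f}-{lo + 0.5:.1f}: intrinsic f_bar = {has_bar[sel].mean():.2f}, "
                f"observed f_bar = {detected[sel].mean():.2f}")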

  18. Regional Frequency and Uncertainty Analysis of Extreme Precipitation in Bangladesh

    Science.gov (United States)

    Mortuza, M. R.; Demissie, Y.; Li, H. Y.

    2014-12-01

    Increased frequency of extreme precipitation events, especially those with multiday durations, is responsible for recent urban floods and the associated significant losses of lives and infrastructure in Bangladesh. Reliable and routinely updated estimation of the frequency of occurrence of such extreme precipitation events is thus important for developing up-to-date hydraulic structures and stormwater drainage systems that can effectively minimize future risk from similar events. In this study, we have updated the intensity-duration-frequency (IDF) curves for Bangladesh using daily precipitation data from 1961 to 2010 and quantified the associated uncertainties. Regional frequency analysis based on L-moments is applied on 1-day, 2-day and 5-day annual maximum precipitation series due to its advantages over at-site estimation. The regional frequency approach pools the information from climatologically similar sites to make reliable estimates of quantiles, given that the pooling group is homogeneous and of reasonable size. We have used the Region of Influence (ROI) approach along with a homogeneity measure based on L-moments to identify the homogeneous pooling groups for each site. Five 3-parameter distributions (i.e., Generalized Logistic, Generalized Extreme Value, Generalized Normal, Pearson Type III, and Generalized Pareto) are used for a thorough selection of appropriate models that fit the sample data. Uncertainties related to the selection of the distributions and historical data are quantified using the Bayesian Model Averaging and Balanced Bootstrap approaches, respectively. The results from this study can be used to update the current design and management of hydraulic structures as well as in exploring spatio-temporal variations of extreme precipitation and associated risk.
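
    As a minimal, self-contained illustration of the L-moment machinery referred to above, the sketch below fits a GEV distribution to a synthetic annual-maximum series using sample L-moments (Hosking-style estimators) and evaluates return levels. It is an at-site example only; the ROI pooling, homogeneity testing, Bayesian model averaging and bootstrap steps of the study are not reproduced.

      # At-site L-moment fit of a GEV distribution to an annual-maximum series
      # (Hosking-style estimators). The series below is synthetic.
      import numpy as np
      from math import gamma, log

      def sample_lmoments(x):
          """First two sample L-moments and the L-skewness via probability-weighted moments."""
          x = np.sort(np.asarray(x, dtype=float))
          n = len(x)
          i = np.arange(1, n + 1)
          b0 = x.mean()
          b1 = np.sum((i - 1) / (n - 1) * x) / n
          b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
          l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
          return l1, l2, l3 / l2

      def gev_from_lmoments(l1, l2, t3):
          """GEV location xi, scale alpha and shape k (Hosking's sign convention)."""
          c = 2.0 / (3.0 + t3) - log(2) / log(3)
          k = 7.8590 * c + 2.9554 * c ** 2
          alpha = l2 * k / ((1 - 2 ** (-k)) * gamma(1 + k))
          xi = l1 - alpha * (1 - gamma(1 + k)) / k
          return xi, alpha, k

      def gev_quantile(T, xi, alpha, k):
          """T-year return level: x(F) = xi + alpha*(1 - (-ln F)^k)/k with F = 1 - 1/T."""
          F = 1 - 1.0 / T
          return xi + alpha * (1 - (-log(F)) ** k) / k

      rng = np.random.default_rng(1)
      annual_max = 80 + 30 * rng.gumbel(size=50)        # synthetic 1-day annual maxima (mm)
      l1, l2, t3 = sample_lmoments(annual_max)
      xi, alpha, k = gev_from_lmoments(l1, l2, t3)
      for T in (10, 50, 100):
          print(f"{T:>4}-yr daily rainfall ~ {gev_quantile(T, xi, alpha, k):.1f} mm")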

  19. Time-frequency analysis of submerged synthetic jet

    Science.gov (United States)

    Kumar, Abhay; Saha, Arun K.; Panigrahi, P. K.

    2017-12-01

    The coherent structures transport the finite body of fluid mass through rolling, which plays an important role in heat transfer, boundary layer control, mixing, cooling, propulsion and other engineering applications. A synthetic jet in the form of a train of vortex rings having coherent structures of different length scales is expected to be useful in these applications. The propagation and sustainability of these coherent structures (vortex rings) in the downstream direction characterize the performance of the synthetic jet. In the present study, the velocity signal acquired using an S-type hot-film probe along the synthetic jet centerline has been taken for the spectral analysis. One circular and three rectangular orifices of aspect ratio 1, 2 and 4, actuated at 1, 6 and 18 Hz, have been used for creating different synthetic jets. Laser-induced fluorescence images are used to study the flow structures qualitatively and help in explaining the velocity signal for detection of coherent structures. The study identifies four regions: vortex rollup and suction (X/D_h ≤ 3), steady translation (X/D_h ≈ 3-8), vortex breakup (X/D_h ≈ 4-8) and dissipation of small-scale vortices (X/D_h ≈ 8-15). The presence of coherent structures localized in the physical and temporal domains is analyzed for the characterization of the synthetic jet. Due to the pulsatile nature of the synthetic jet, analysis of the velocity signal in the time, frequency and combined time-frequency domains assists in characterizing the signatures of coherent structures. It has been observed that the maximum energy is in the first harmonic of the actuation frequency, which decreases slowly in the downstream direction at 6 Hz compared to 1 and 18 Hz actuation.

  20. A parallel solution for high resolution histological image analysis.

    Science.gov (United States)

    Bueno, G; González, R; Déniz, O; García-Rojo, M; González-García, J; Fernández-Carrobles, M M; Vállez, N; Salido, J

    2012-10-01

    This paper describes a general methodology for developing parallel image processing algorithms based on message passing for high resolution images (on the order of several gigabytes). These algorithms have been applied to histological images and must be executed on massively parallel processing architectures. Advances in new technologies for complete slide digitalization in pathology have been combined with developments in biomedical informatics. However, the efficient use of these digital slide systems is still a challenge. The image processing that these slides are subject to is still limited both in terms of data processed and processing methods. The work presented here focuses on the need to design and develop parallel image processing tools capable of obtaining and analyzing the entire gamut of information included in digital slides. Tools have been developed to assist pathologists in image analysis and diagnosis, and they cover low- and high-level image processing methods applied to histological images. Code portability, reusability and scalability have been tested by using the following parallel computing architectures: distributed memory with massively parallel processors and two networks, InfiniBand and Myrinet, composed of 17 and 1024 nodes respectively. The proposed parallel framework is a flexible, high-performance solution, and it shows that the efficient processing of digital microscopic images is possible and may offer important benefits to pathology laboratories. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
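
    A minimal sketch of the strip-wise message-passing pattern for very large images is given below, using mpi4py as an assumed stand-in for the authors' own MPI-based framework. The root rank splits a (placeholder) slide image into horizontal strips, each rank performs a toy per-strip measurement, and the partial results are gathered and combined.

      # Strip-wise message-passing pattern using mpi4py (an assumed stand-in for the
      # authors' framework). Run with, e.g.:  mpiexec -n 4 python strip_analysis.py
      from mpi4py import MPI
      import numpy as np

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      if rank == 0:
          # Placeholder for a huge histological slide (real slides are read in tiles).
          image = np.random.default_rng(0).integers(0, 256, size=(4096, 4096), dtype=np.uint8)
          strips = np.array_split(image, size, axis=0)    # one horizontal strip per rank
      else:
          strips = None

      strip = comm.scatter(strips, root=0)                # distribute the strips

      # Toy per-strip analysis: fraction of "stained" pixels below an assumed threshold.
      local = (int((strip < 100).sum()), strip.size)

      results = comm.gather(local, root=0)
      if rank == 0:
          stained = sum(c for c, _ in results) / sum(p for _, p in results)
          print(f"stained-pixel fraction over the whole slide: {stained:.3f}")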

  1. LOFT PSMG Speed Control System frequency response analysis

    International Nuclear Information System (INIS)

    Hansen, H.R.

    1977-01-01

    An analysis was done to gain insight into the shape of the open-loop frequency response of the PSMG Speed Control System. The results of the analysis were used as a guide to groom the proportional band and reset time settings of the two-mode controller in the speed control system. The analysis shows that when an actuator with a timing of 90 degrees per 60 seconds is installed in the system, the proportional band and reset time should be 316% and 1 minute, whereas when grooming the system a proportional band and reset time of 150% and 1.5 minutes were found to be appropriate. The closeness of the settings shows that even though a linear model was used to describe the non-linear PSMG Speed Control System, it was accurate enough to be used as a guide to groom the proportional band and reset time settings
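
    A rough sketch of how the quoted controller settings translate into a frequency response is given below, assuming the usual mapping of proportional band PB (%) to gain Kp = 100/PB and reset time Tr (minutes) to the integral term of a PI controller, C(s) = Kp(1 + 1/(Tr s)). The actuator and plant models of the report are not reproduced, so only the controller's own Bode magnitude is compared for the two settings.

      # Bode magnitude of the two-mode (PI) controller for the two quoted settings,
      # assuming Kp = 100/PB and C(s) = Kp*(1 + 1/(Tr*s)); plant model not included.
      import numpy as np
      from scipy import signal

      def pi_controller(pb_percent, reset_min):
          kp = 100.0 / pb_percent
          tr = reset_min * 60.0                            # reset time in seconds
          return signal.TransferFunction([kp * tr, kp], [tr, 0.0])   # Kp*(Tr*s + 1)/(Tr*s)

      w = np.logspace(-4, 0, 400)                          # rad/s
      for pb, tr in [(316.0, 1.0), (150.0, 1.5)]:          # analysis vs. grooming settings
          _, mag, _ = signal.bode(pi_controller(pb, tr), w=w)
          i = np.argmin(np.abs(w - 0.01))
          print(f"PB = {pb:3.0f}%, Tr = {tr:.1f} min: |C(j0.01)| = {mag[i]:.1f} dB")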

  2. Ultra-high resolution water window x ray microscope optics design and analysis

    Science.gov (United States)

    Shealy, David L.; Wang, C.

    1993-01-01

    This project has been focused on the design and analysis of an ultra-high resolution water window soft-x-ray microscope. These activities have been accomplished by completing the two tasks contained in the statement of work of this contract. The new results from this work confirm: (1) that in order to achieve resolutions better than three times the wavelength of the incident radiation, it will be necessary to use aspherical mirror surfaces and to use graded multilayer coatings on the secondary in order to accommodate the large variations of the angle of incidence over the secondary when operating the microscope at numerical apertures of 0.35 or greater; (2) that surface contour errors will have a significant effect on the optical performance of the microscope and must be controlled to a peak-to-valley variation of 50-100 A and a frequency of 8 periods over the surface of a mirror; and (3) that tolerance analysis of the spherical Schwarzschild microscope has shown that water window operations will require 2-3 times tighter tolerances to achieve a performance similar to that of operations with 130 A radiation. These results are documented in a manuscript included in the appendix.

  3. Automatic analysis of ciliary beat frequency using optical flow

    Science.gov (United States)

    Figl, Michael; Lechner, Manuel; Werther, Tobias; Horak, Fritz; Hummel, Johann; Birkfellner, Wolfgang

    2012-02-01

    Ciliary beat frequency (CBF) can be a useful parameter for the diagnosis of several diseases, e.g. primary ciliary dyskinesia (PCD). CBF computation is usually done by manual evaluation of high-speed video sequences, a tedious, observer-dependent, and not very accurate procedure. We used OpenCV's pyramidal implementation of the Lucas-Kanade algorithm for optical flow computation and applied it to selected objects to follow their movements. The objects were chosen by their contrast, applying the corner detection of Shi and Tomasi. Discrimination between background/noise and cilia by a frequency histogram allowed the CBF to be computed. Frequency analysis was done using the Fourier transform in MATLAB. The correct number of Fourier summands was found from the slope of an approximation curve. The method proved usable to distinguish between healthy and diseased samples. However, difficulties remain in automatically identifying the cilia and in finding enough high-contrast cilia in the image. Furthermore, some of the higher-contrast cilia are lost (and sometimes found again) by the method; an easy way to determine the correct sub-path of a point's path has yet to be found for the cases where the slope method does not work.
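
    The pipeline described above can be sketched with OpenCV as follows (an illustration, not the authors' code): Shi-Tomasi corners are tracked with the pyramidal Lucas-Kanade optical flow, and the beat frequency is read off the FFT peak of each tracked point's vertical displacement. The video file name and the fallback frame rate are placeholders.

      # Shi-Tomasi corners tracked with pyramidal Lucas-Kanade optical flow; the beat
      # frequency is the FFT peak of each point's vertical displacement. The file name
      # and the fallback frame rate are placeholders.
      import cv2
      import numpy as np

      cap = cv2.VideoCapture("cilia_video.avi")            # high-speed video (placeholder)
      fps = cap.get(cv2.CAP_PROP_FPS) or 500.0             # assumed frame rate if missing

      ok, frame = cap.read()
      prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
      p0 = cv2.goodFeaturesToTrack(prev, maxCorners=200, qualityLevel=0.01, minDistance=5)

      tracks = []                                          # per-frame point positions
      while True:
          ok, frame = cap.read()
          if not ok:
              break
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          p1, st, err = cv2.calcOpticalFlowPyrLK(prev, gray, p0, None,
                                                 winSize=(15, 15), maxLevel=3)
          tracks.append(p1.reshape(-1, 2))
          prev, p0 = gray, p1

      y = np.stack(tracks)[:, :, 1]                        # vertical coordinate, shape (T, N)
      y = y - y.mean(axis=0)                               # remove each point's static offset
      spec = np.abs(np.fft.rfft(y, axis=0))
      freqs = np.fft.rfftfreq(y.shape[0], d=1.0 / fps)
      cbf = freqs[np.argmax(spec[1:], axis=0) + 1]         # per-point peak, skipping DC
      print(f"median ciliary beat frequency ~ {np.median(cbf):.1f} Hz")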

  4. Nonlinear analysis for dual-frequency concurrent energy harvesting

    Science.gov (United States)

    Yan, Zhimiao; Lei, Hong; Tan, Ting; Sun, Weipeng; Huang, Wenhu

    2018-05-01

    The dual-frequency responses of the hybrid energy harvester undergoing base excitation and galloping were analyzed numerically. In this work, an approximate dual-frequency analytical method is proposed for the nonlinear analysis of such a system. To obtain the approximate analytical solutions of the full coupled distributed-parameter model, the forcing interactions are first neglected. Then, the electromechanically decoupled governing equation is developed using the equivalent structure method. The hybrid mechanical response is finally separated into the self-excited and forced responses to derive the analytical solutions, which are confirmed by the numerical simulations of the full coupled model. The forced response has a great impact on the self-excited response. The boundary of the Hopf bifurcation is analytically determined by the onset wind speed for galloping, which is linearly increased by the electrical damping. A quenching phenomenon appears when the increasing base excitation suppresses the galloping. The theoretical quenching boundary depends on the forced mode velocity. The quenching region increases with the base acceleration and electrical damping, but decreases with the wind speed. In contrast to the base-excitation-alone case, the existence of the aerodynamic force protects the hybrid energy harvester at resonance from damage caused by excessively large displacement. From the point of view of harvested power, the hybrid system surpasses the base-excitation-alone system or the galloping-alone system. This study advances our knowledge of the intrinsic nonlinear dynamics of the dual-frequency energy harvesting system by taking advantage of the analytical solutions.

  5. Frequency band analysis of muscle activation during cycling to exhaustion

    Directory of Open Access Journals (Sweden)

    Fernando Diefenthaeler

    2012-04-01

    Full Text Available DOI: http://dx.doi.org/10.5007/1980-0037.2012v14n3p243 Lower limb muscle activation was assessed during cycling to exhaustion using frequency band analysis. Nine cyclists were evaluated on two days. On the first day, cyclists performed a maximal incremental cycling exercise to measure peak power output, which was used on the second day to define the workload for a constant-load time-to-exhaustion cycling exercise (maximal aerobic power output from day 1). Muscle activation of the vastus lateralis (VL), long head of biceps femoris (BF), lateral head of gastrocnemius (GL), and tibialis anterior (TA) from the right lower limb was recorded during the time-to-exhaustion cycling exercise. A series of nine band-pass Butterworth digital filters was used to analyze the muscle activity amplitude in each band. The overall amplitude of activation and the high and low frequency components were defined to assess the magnitude of fatigue effects on muscle activity via effect sizes. The profile of the overall muscle activation during the test was analyzed using a second-order polynomial, and the variability of the overall bands was analyzed by the coefficient of variation for each muscle at each instant of the test. A substantial reduction in the high frequency components of VL and BF activation was observed. The overall and low frequency bands presented trivial to small changes for all muscles. A strong relationship between the second-order polynomial fit and muscle activity was found (R² > 0.89) for all muscles. High variability (~25%) was found for muscle activation at the four instants of the fatigue test. Changes in the spectral properties of the EMG signal were only substantial when extreme changes in fatigue state were induced.
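
    A minimal sketch of the band-wise amplitude analysis described above is given below: a bank of Butterworth band-pass filters splits an EMG-like signal into contiguous bands and the RMS amplitude of each band is reported. The band edges, sampling rate and the synthetic signal are assumptions for illustration only.

      # Bank of Butterworth band-pass filters; RMS amplitude reported per band.
      # Band edges, sampling rate and the synthetic signal are illustrative.
      import numpy as np
      from scipy.signal import butter, filtfilt

      fs = 2000.0                                           # assumed EMG sampling rate (Hz)
      t = np.arange(0, 10, 1 / fs)
      emg = np.random.default_rng(2).normal(size=t.size)    # placeholder for a recorded signal

      edges = np.linspace(20, 400, 10)                      # nine contiguous bands (assumed)
      for lo, hi in zip(edges[:-1], edges[1:]):
          b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
          band = filtfilt(b, a, emg)
          print(f"{lo:5.1f}-{hi:5.1f} Hz   RMS = {np.sqrt(np.mean(band ** 2)):.3f}")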

  6. Frequency-dependent springs in the seismic analysis of structures

    International Nuclear Information System (INIS)

    Tyapin, A.G.

    2005-01-01

    This paper presents a two-step algorithm for the seismic analysis of a structure resting on a rigid embedded basement. The frequency-domain analysis of SSI is carried out in the second step for a platform model with a special 'soil spring', which is complex, frequency-dependent, wave-dependent and non-balanced. Theory is presented to obtain the parameters of the soil spring in the first step of the analysis, performed without the structure (only the geometry of the basement is used), using the well-known SASSI code (Lysmer et al., 1981) or in some other way. In the second step of the SASSI analysis the soil spring is included in the model as a special finite element. Thus, the first step saves computer resources on the structure, and the second step saves resources on the soil. The soil spring is the most general form for a linear SSI analysis: conventional springs and dashpots can easily be represented in such a format. The presented approach therefore enables the study of the impact of various factors (such as the embedment depth and soil-structure separation, the off-diagonal stiffness, various formulas for stiffness and damping, etc.) on the soil spring parameters. These parameters can be studied separately from the structure itself. As an example, a study of the horizontal soil mesh size is presented. A lumped soil spring may be used in the second step to obtain structural response spectra. To obtain stresses, the complex stiffness may be distributed over the basement slab and embedded walls. The proposed approach may be considered an alternative to the impedance method (see ASCE 4-98). (authors)

  7. Protein crystal structure analysis using synchrotron radiation at atomic resolution

    International Nuclear Information System (INIS)

    Nonaka, Takamasa

    1999-01-01

    We can now obtain a detailed picture of protein, allowing the identification of individual atoms, by interpreting the diffraction of X-rays from a protein crystal at atomic resolution, 1.2 A or better. As of this writing, about 45 unique protein structures beyond 1.2 A resolution have been deposited in the Protein Data Bank. This review provides a simplified overview of how protein crystallographers use such diffraction data to solve, refine, and validate protein structures. (author)

  8. Analysis of the position resolution in centroid measurements in MWPC

    International Nuclear Information System (INIS)

    Gatti, E.; Longoni, A.

    1981-01-01

    Resolution limits in avalanche localization along the anode wires of an MWPC with cathodes connected by resistors and equally spaced amplifiers are evaluated. A simple weighted-centroid method and a highly linear method based on a linear centroid-finding filter are considered. The contributions to the variance of the estimator of the avalanche position, due to the series noise of the amplifiers and to the thermal noise of the resistive line, are separately calculated and compared. A comparison is made with the resolution of an MWPC with isolated cathodes. The calculations are performed with a distributed model of the diffusive line formed by the cathodes and the resistors. A comparison is also made with the results obtained with a simple lumped model of the diffusive line. A number of graphs useful in determining the best parameters of an MWPC, with a specified position and time resolution, are given. It has been found that, for short resolution times, an MWPC with cathodes connected by resistors presents better resolution (lower variance of the estimator of the avalanche position) than an MWPC with isolated cathodes. Conversely, for long resolution times, the variance of the estimator of the avalanche position is lower in an MWPC with isolated cathodes. (orig.)

  9. Provider attributes correlation analysis to their referral frequency and awards.

    Science.gov (United States)

    Wiley, Matthew T; Rivas, Ryan L; Hristidis, Vagelis

    2016-03-14

    There has been a recent growth in health provider search portals, where patients specify filters-such as specialty or insurance-and providers are ranked by patient ratings or other attributes. Previous work has identified attributes associated with a provider's quality through user surveys. Other work supports that intuitive quality-indicating attributes are associated with a provider's quality. We adopt a data-driven approach to study how quality indicators of providers are associated with a rich set of attributes including medical school, graduation year, procedures, fellowships, patient reviews, location, and technology usage. In this work, we only consider providers as individuals (e.g., general practitioners) and not organizations (e.g., hospitals). As quality indicators, we consider the referral frequency of a provider and a peer-nominated quality designation. We combined data from the Centers for Medicare and Medicaid Services (CMS) and several provider rating web sites to perform our analysis. Our data-driven analysis identified several attributes that correlate with and discriminate against referral volume and peer-nominated awards. In particular, our results consistently demonstrate that these attributes vary by locality and that the frequency of an attribute is more important than its value (e.g., the number of patient reviews or hospital affiliations are more important than the average review rating or the ranking of the hospital affiliations, respectively). We demonstrate that it is possible to build accurate classifiers for referral frequency and quality designation, with accuracies over 85 %. Our findings show that a one-size-fits-all approach to ranking providers is inadequate and that provider search portals should calibrate their ranking function based on location and specialty. Further, traditional filters of provider search portals should be reconsidered, and patients should be aware of existing pitfalls with these filters and educated on local

  10. Low energy booster radio frequency cavity structural analysis

    International Nuclear Information System (INIS)

    Jones, K.

    1994-01-01

    The structural design of the Superconducting Super Collider Low Energy Booster (LEB) Radio Frequency (RF) Cavity is very unique. The cavity is made of three different materials which all contribute to its structural strength while at the same time providing a good medium for magnetic properties. Its outer conductor is made of thin walled stainless steel which is later copper plated to reduce the electrical losses. Its tuner housing is made of a fiber reinforced composite laminate, similar to G10, glued to stainless steel plating. The stainless steel of the tuner is slotted to significantly diminish the magnetically-induced eddy currents. The composite laminate is bonded to the stainless steel to restore the structural strength that was lost in slotting. The composite laminate is also a barrier against leakage of the pressurized internal ferrite coolant fluid. The cavity's inner conductor, made of copper and stainless steel, is subjected to high heat loads and must be liquid cooled. The requirements of the Cavity are very stringent and driven primarily by deflection, natural frequency and temperature. Therefore, very intricate finite element analysis was used to complement conventional hand analysis in the design of the cavity. Structural testing of the assembled prototype cavity is planned to demonstrate the compliance of the cavity design to all of its requirements

  11. Low energy booster radio frequency cavity structural analysis

    International Nuclear Information System (INIS)

    Jones, K.

    1993-04-01

    The structural design of the Superconducting Super Collider Low Energy Booster (LEB) Radio Frequency (RF) Cavity is very unique. The cavity is made of three different materials which all contribute to its structural strength while at the same time providing a good medium for magnetic properties. Its outer conductor is made of thin walled stainless steel which is later copper plated to reduce the electrical losses. Its tuner housing is made of a fiber reinforced composite laminate, similar to G10, glued to stainless steel plating. The stainless steel of the tuner is slotted to significantly diminish the magnetically-induced eddy currents. The composite laminate is bonded to the stainless steel to restore the structural strength that was lost in slotting. The composite laminate is also a barrier against leakage of the pressurized internal ferrite coolant fluid. The cavity's inner conductor, made of copper and stainless steel, is subjected to high heat loads and must be liquid cooled. The requirements of the Cavity are very stringent and driven primarily by deflection, natural frequency and temperature. Therefore, very intricate finite element analysis was used to complement conventional hand analysis in the design of the cavity. Structural testing of the assembled prototype cavity is planned to demonstrate the compliance of the cavity design to all of its requirements

  12. Time Frequency Analysis of Spacecraft Propellant Tank Spinning Slosh

    Science.gov (United States)

    Green, Steven T.; Burkey, Russell C.; Sudermann, James

    2010-01-01

    Many spacecraft are designed to spin about an axis along the flight path as a means of stabilizing the attitude of the spacecraft via gyroscopic stiffness. Because of the assembly requirements of the spacecraft and the launch vehicle, these spacecraft often spin about an axis corresponding to a minor moment of inertia. In such a case, any perturbation of the spin axis will cause sloshing motions in the liquid propellant tanks that will eventually dissipate enough kinetic energy to cause the spin axis nutation (wobble) to grow further. This spinning slosh and resultant nutation growth is a primary design problem of spinning spacecraft and one that is not easily solved by analysis or simulation only. Testing remains the surest way to address spacecraft nutation growth. This paper describes a test method and data analysis technique that reveal the resonant frequency and damping behavior of liquid motions in a spinning tank. Slosh resonant frequency and damping characteristics are necessary inputs to any accurate numerical dynamic simulation of the spacecraft.
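
    As a minimal illustration of extracting a slosh resonant frequency and damping from a free-decay record (synthetic data, not the authors' spinning-tank measurements or their time-frequency technique), the sketch below reads the frequency off the FFT peak and the damping ratio off the logarithmic decrement of successive peaks.

      # Resonant frequency from the FFT peak and damping ratio from the logarithmic
      # decrement of successive peaks of a synthetic free-decay record.
      import numpy as np

      fs = 200.0                                            # assumed sample rate (Hz)
      t = np.arange(0, 30, 1 / fs)
      f0, zeta = 0.8, 0.02                                  # "true" slosh frequency and damping
      x = np.exp(-zeta * 2 * np.pi * f0 * t) * np.sin(2 * np.pi * f0 * t)

      spec = np.abs(np.fft.rfft(x))
      freqs = np.fft.rfftfreq(x.size, 1 / fs)
      f_est = freqs[np.argmax(spec)]                        # frequency from the spectral peak

      peaks = [i for i in range(1, x.size - 1)
               if x[i] > x[i - 1] and x[i] > x[i + 1] and x[i] > 0]
      delta = np.mean(np.log(x[peaks[:-1]] / x[peaks[1:]]))          # log decrement
      zeta_est = delta / np.sqrt(4 * np.pi ** 2 + delta ** 2)

      print(f"estimated frequency {f_est:.2f} Hz, damping ratio {zeta_est:.3f}")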

  13. Vibro-Shock Dynamics Analysis of a Tandem Low Frequency Resonator—High Frequency Piezoelectric Energy Harvester

    Directory of Open Access Journals (Sweden)

    Darius Žižys

    2017-04-01

    Full Text Available Frequency up-conversion is a promising technique for energy harvesting in low frequency environments. In this approach, abundantly available environmental motion energy is absorbed by a Low Frequency Resonator (LFR which transfers it to a high frequency Piezoelectric Vibration Energy Harvester (PVEH via impact or magnetic coupling. As a result, a decaying alternating output signal is produced, that can later be collected using a battery or be transferred directly to the electric load. The paper reports an impact-coupled frequency up-converting tandem setup with different LFR to PVEH natural frequency ratios and varying contact point location along the length of the harvester. RMS power output of different frequency up-converting tandems with optimal resistive values was found from the transient analysis revealing a strong relation between power output and LFR-PVEH natural frequency ratio as well as impact point location. Simulations revealed that higher power output is obtained from a higher natural frequency ratio between LFR and PVEH, an increase of power output by one order of magnitude for a doubled natural frequency ratio and up to 150% difference in power output from different impact point locations. The theoretical results were experimentally verified.

  14. Reconstruction of human brain spontaneous activity based on frequency-pattern analysis of magnetoencephalography data

    Directory of Open Access Journals (Sweden)

    Rodolfo R Llinas

    2015-10-01

    Full Text Available A new method for the analysis and localization of brain activity has been developed, based on multichannel magnetic field recordings, over minutes, superimposed on the MRI of the individual. Here, a high resolution Fourier transform is obtained over the entire recording period, leading to a detailed multi-frequency spectrum. Further analysis implements a total decomposition of the frequency components into functionally invariant entities, each having an invariant field pattern localizable in recording space. The method, referred to as functional tomography, makes it possible to find the distribution of magnetic field sources in space. Here, the method is applied to the analysis of simulated data, to oscillating signals activating a physical current-dipole phantom, and to recordings of spontaneous brain activity in ten healthy adults. In the analysis of simulated data, 61 dipoles are localized with 0.7 mm precision. For the physical phantom, the method is able to localize three simultaneously activated current dipoles with 1 mm precision. A spatial resolution of 3 mm was attained when localizing spontaneous alpha rhythm activity in ten healthy adults, where the alpha peak was specified for each subject individually. Co-registration of the functional tomograms with each subject's head MRI localized alpha-range activity to the occipital and/or posterior parietal brain region. This is the first application of this new functional tomography to human brain activity. The method successfully provides an overall view of brain electrical activity, a detailed spectral description and, combined with MRI, the localization of sources in anatomical brain space.

  15. Geographic Resolution Issues in RAM Transport Risk Analysis

    International Nuclear Information System (INIS)

    Mills, G.S.; Neuhauser, K.S.

    2000-01-01

    Transport risk analyses based on the RADTRAN code have been met with continual demands for increased spatial resolution of variations in population densities and other parameters employed in the calculation of risk estimates for transport of radioactive material (RAM). With the advent of geographic information systems (GISs), the large quantities of data required to describe transport routes, which may extend to hundreds of kilometers, with high resolution (e.g. 1 km segments) can be handled without inordinate expense. This capability has raised a question concerning the maximum resolution of available input data and its compatibility with RADTRAN computational models. Quantitative examinations of spatial resolution issues in the calculation of incident-free doses and accident dose risks are presented. For incident-free calculations, the effect of decreasing route-segment length on accuracy, in view of the model employed, is examined, and means of reducing total data input to the RADTRAN calculations, without loss of meaningful resolution of population concentrations, are presented. In the case of accident-risk calculations, the ability to detail population density under very large dispersal plumes permits comparison of plume modelling to actual data. In both types of calculations, meaningful limits to geographic extent are suggested. (author)

  16. The geometric phase analysis method based on the local high resolution discrete Fourier transform for deformation measurement

    International Nuclear Information System (INIS)

    Dai, Xianglu; Xie, Huimin; Wang, Huaixi; Li, Chuanwei; Wu, Lifu; Liu, Zhanwei

    2014-01-01

    The geometric phase analysis (GPA) method based on the local high resolution discrete Fourier transform (LHR-DFT) for deformation measurement, defined as LHR-DFT GPA, is proposed to improve the measurement accuracy. In the general GPA method, the fundamental frequency of the image plays a crucial role. However, the fast Fourier transform, which is generally employed in the general GPA method, can make it difficult to locate the fundamental frequency accurately when the fundamental frequency is not located at an integer pixel position in the Fourier spectrum. This study focuses on this issue and presents an LHR-DFT algorithm that can locate the fundamental frequency with sub-pixel precision in a specific frequency region for the GPA method. An error analysis is offered and a simulation is conducted to verify the effectiveness of the proposed method; both results show that the LHR-DFT algorithm can accurately locate the fundamental frequency and improve the measurement accuracy of the GPA method. Furthermore, typical tensile and bending tests are carried out and the experimental results verify the effectiveness of the proposed method. (paper)
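
    The idea of refining a coarse FFT peak by evaluating a local high-resolution DFT can be illustrated with a 1D toy example (not the authors' 2D implementation): the DTFT is evaluated on a fine grid of fractional bins around the integer bin found by the ordinary FFT.

      # 1D toy example: refine the FFT peak by evaluating the DTFT on a fine grid of
      # fractional bins around the coarse integer bin.
      import numpy as np

      N = 256
      n = np.arange(N)
      f_true = 37.3 / N                                     # fundamental not on an integer bin
      x = np.cos(2 * np.pi * f_true * n)

      X = np.fft.rfft(x)
      k0 = int(np.argmax(np.abs(X)))                        # coarse localization (bin 37)

      fine_k = np.linspace(k0 - 1, k0 + 1, 2001)            # local high-resolution grid
      kernel = np.exp(-2j * np.pi * np.outer(fine_k, n) / N)
      k_refined = fine_k[np.argmax(np.abs(kernel @ x))]

      print(f"coarse bin {k0}, refined bin {k_refined:.3f}, true {f_true * N:.3f}")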

  17. Optical Analysis of an Ultra-High resolution Two-Mirror Soft X-Ray Microscope

    Science.gov (United States)

    Shealy, David L.; Wang, Cheng; Hoover, Richard B.

    1994-01-01

    This work has summarized, for a Schwarzschild microscope, the relationships between numerical aperture (NA), magnification, diameter of the primary mirror, radius of curvature of the secondary mirror, and the total length of the microscope. To achieve resolutions better than the 3.3 Lambda of a spherical Schwarzschild microscope for a perfectly aligned and fabricated system, it is necessary to use aspherical surfaces to control higher-order aberrations. For an NA of 0.35, the aspherical Head microscope provides diffraction-limited resolution of 1.4 Lambda, where the aspherical surfaces differ from the best-fit spherical surface by approximately 1 micrometer. However, the angle of incidence varies significantly over the primary and secondary mirrors, which will require graded multilayer coatings to operate near peak reflectivities. For higher numerical apertures, the variation of the angle of incidence over the secondary mirror surface becomes a serious problem which must be solved before multilayer coatings can be used for this application. Tolerance analysis of the spherical Schwarzschild microscope has shown that water window operations will require 2-3 times tighter tolerances to achieve a similar performance to operations with 130 A radiation. Surface contour errors have been shown to have a significant impact on the MTF and must be controlled to a peak-to-valley variation of 50-100 A and a frequency of 8 periods over the surface of a mirror.

  18. High resolution radar satellite imagery analysis for safeguards applications

    Energy Technology Data Exchange (ETDEWEB)

    Minet, Christian; Eineder, Michael [German Aerospace Center, Remote Sensing Technology Institute, Department of SAR Signal Processing, Wessling, (Germany); Rezniczek, Arnold [UBA GmbH, Herzogenrath, (Germany); Niemeyer, Irmgard [Forschungszentrum Juelich, Institute of Energy and Climate Research, IEK-6: Nuclear Waste Management and Reactor Safety, Juelich, (Germany)

    2011-12-15

    For monitoring nuclear sites, the use of Synthetic Aperture Radar (SAR) imagery shows great promise. Unlike optical remote sensing instruments, radar sensors operate under almost all weather conditions and independently of sunlight, i.e. the time of day. Such technical specifications are required both for continuous and for ad-hoc, timed surveillance tasks. With COSMO-SkyMed, TerraSAR-X and Radarsat-2, high-resolution SAR imagery with a spatial resolution of up to 1 m has recently become available. Our work therefore aims to investigate the potential of high-resolution TerraSAR-X data for nuclear monitoring. This paper focuses on exploiting the amplitude of a single acquisition, assessing amplitude changes and phase differences between two acquisitions, and PS-InSAR processing of an image stack.

  19. DOI resolution measurement and error analysis with LYSO and APDs

    International Nuclear Information System (INIS)

    Lee, Chae-hun; Cho, Gyuseong

    2008-01-01

    Spatial resolution degradation in PET occurs at the edge of the Field Of View (FOV) due to parallax error. To improve spatial resolution at the edge of the FOV, Depth-Of-Interaction (DOI) PET has been investigated and several methods for DOI positioning have been proposed. In this paper, a DOI-PET detector module using two 8x4 array avalanche photodiodes (APDs) (Hamamatsu, S8550) and a 2 cm long LYSO scintillation crystal was proposed and its DOI characteristics were investigated experimentally. In order to measure DOI positions, the signals from the two APDs were compared. The energy resolution was obtained from the sum of the two APDs' signals and the DOI positioning error was calculated. Finally, an optimum DOI step size for a 2 cm long LYSO crystal is suggested to help in the design of a DOI-PET system

  20. Time-frequency analysis of fusion plasma signals beyond the short-time Fourier transform paradigm: An overview

    International Nuclear Information System (INIS)

    Bizarro, Joao P.S.; Figueiredo, Antonio C.A.

    2008-01-01

    Performing a time-frequency (t-f) analysis on actual magnetic pick-up coil data from the JET tokamak, a comparison is presented between the spectrogram and the Wigner and Choi-Williams distributions. Whereas the former, which stems from the short-time Fourier transform and has been the work-horse of t-f signal processing, implies an unavoidable trade-off between time and frequency resolutions, the latter two belong to a later generation of distributions that yield better, if not optimal, joint t-f localization. Topics addressed include signal representation in the t-f plane, frequency identification and evolution, instantaneous-frequency estimation, and amplitude tracking
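
    The time-frequency trade-off that motivates moving beyond the spectrogram can be illustrated with a short sketch: the same test signal (two closely spaced tones plus a brief burst) is analysed with a short and a long analysis window, showing that improving the frequency resolution df necessarily coarsens the time resolution dt. The test signal and window lengths are illustrative.

      # Same test signal analysed with a short and a long STFT window: frequency
      # resolution df and time resolution dt cannot be improved simultaneously.
      import numpy as np
      from scipy.signal import spectrogram

      fs = 1000.0
      t = np.arange(0, 2, 1 / fs)
      x = np.sin(2 * np.pi * 100 * t) + np.sin(2 * np.pi * 104 * t)   # two close tones
      x[900:950] += 2 * np.sin(2 * np.pi * 300 * t[900:950])          # short burst

      for nperseg in (64, 1024):
          f, tt, Sxx = spectrogram(x, fs=fs, nperseg=nperseg)
          print(f"window {nperseg:4d} samples: df = {f[1] - f[0]:6.2f} Hz, "
                f"dt = {tt[1] - tt[0]:.3f} s")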

  1. Comparative analysis of time efficiency and spatial resolution between different EIT reconstruction algorithms

    International Nuclear Information System (INIS)

    Kacarska, Marija; Loskovska, Suzana

    2002-01-01

    In this paper, a comparative analysis between different EIT reconstruction algorithms is presented. The analysis addresses the spatial and temporal resolution of the images obtained by several different algorithms. The discussion also considers how spatial resolution depends on the data acquisition method. The obtained results show that conventional applied-current EIT is more powerful than induced-current EIT. (Author)

  2. Oil price and financial markets: Multivariate dynamic frequency analysis

    International Nuclear Information System (INIS)

    Creti, Anna; Ftiti, Zied; Guesmi, Khaled

    2014-01-01

    The aim of this paper is to study the degree of interdependence between oil price and stock market index into two groups of countries: oil-importers and oil-exporters. To this end, we propose a new empirical methodology allowing a time-varying dynamic correlation measure between the stock market index and the oil price series. We use the frequency approach proposed by Priestley and Tong (1973), that is the evolutionary co-spectral analysis. This method allows us to distinguish between short-run and medium-run dependence. In order to complete our study by analysing long-run dependence, we use the cointegration procedure developed by Engle and Granger (1987). We find that interdependence between the oil price and the stock market is stronger in exporters' markets than in the importers' ones. - Highlights: • A new time-varying measure for the stock markets and oil price relationship in different horizons. • We propose a new empirical methodology: multivariate frequency approach. • We propose a comparison between oil importing and exporting countries. • We show that oil is not always countercyclical with respect to stock markets. • When high oil prices originate from supply shocks, oil is countercyclical with stock markets

  3. Frequency Analysis of Aircraft hazards for License Application

    International Nuclear Information System (INIS)

    K. Ashley

    2006-01-01

    The preclosure safety analysis for the monitored geologic repository at Yucca Mountain must consider the hazard that aircraft may pose to surface structures. Relevant surface structures are located beneath the restricted airspace of the Nevada Test Site (NTS) on the eastern slope of Yucca Mountain, near the North Portal of the Exploratory Studies Facility Tunnel (Figure 1). The North Portal is located several miles from the Nevada Test and Training Range (NTTR), which is used extensively by the U.S. Air Force (USAF) for training and test flights (Figure 1). The NTS airspace, which is controlled by the U.S. Department of Energy (DOE) for NTS activities, is not part of the NTTR. Agreements with the DOE allow USAF aircraft specific use of the airspace above the NTS (Reference 2.1.1 [DIRS 103472], Section 3.1.1 and Appendix A, Section 2.1; and Reference 2.1.2 [DIRS 157987], Sections 1.26 through 1.29). Commercial, military, and general aviation aircraft fly within several miles to the southwest of the repository site in the Beatty Corridor, which is a broad air corridor that runs approximately parallel to U.S. Highway 95 and the Nevada-California border (Figure 2). These aircraft and other aircraft operations are identified and described in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Sections 6 and 8). The purpose of this analysis is to estimate crash frequencies for aircraft hazards identified for detailed analysis in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Section 8). Reference 2.1.3, Section 8, also identifies a potential hazard associated with electronic jamming, which will be addressed in this analysis. This analysis will address only the repository and not the transportation routes to the site. The analysis is intended to provide the basis for: (1) Categorizing event sequences related to aircraft hazards; (2) Identifying design or operational requirements related to aircraft hazards

  4. Stochastic resolution analysis of co-prime arrays in radar

    NARCIS (Netherlands)

    Pribic, R; Coutiño Minguez, M.A.; Leus, G.J.T.

    2016-01-01

    Resolution from co-prime arrays and from a full ULA of the size equal to the virtual size of co-prime arrays is investigated. We take into account not only the resulting beam width but also the fact that fewer measurements are acquired by co-prime arrays. This fact is relevant in compressive

  5. Correlation analysis of the physiological factors controlling fundamental voice frequency.

    Science.gov (United States)

    Atkinson, J E

    1978-01-01

    A technique has been developed to obtain a quantitative measure of correlation between electromyographic (EMG) activity of various laryngeal muscles, subglottal air pressure, and the fundamental frequency of vibration of the vocal folds (Fo). Data were collected and analyzed on one subject, a native speaker of American English. The results show that an analysis of this type can provide a useful measure of correlation between the physiological and acoustical events in speech and, furthermore, can yield detailed insights into the organization and nature of the speech production process. In particular, based on these results, a model is suggested of Fo control involving laryngeal state functions that seems to agree with present knowledge of laryngeal control and experimental evidence.

  6. Analysis and modelling of engineering structures in frequency domain

    International Nuclear Information System (INIS)

    Ishtev, K.; Bonev, Z.; Petrov, P.; Philipov, P.

    1987-01-01

    This paper deals with some possible applications for the modelling and analysis of engineering structures, based on the technique mentioned above. The governing system of equations is written using a frequency-domain approach, since the elimination technique has computational significance in this field. Modelling is based on the well-known relationship Y(jw) = W(jw) * X(jw). Here X(jw) is the complex Fourier spectrum associated with the input signals, defined as earthquake, wind, hydrodynamic, control or other types of action. W(jw) is a complex matrix transfer function which reveals the correlation between the input X and output Y spectra. Y(jw) represents the complex Fourier spectrum of the output signals. Input and output signals are both associated with master degrees of freedom; thus the matrix transfer function is composed of elements in such a manner that the slave unknown parameters are implemented implicitly. An integration algorithm for the 'condensed' system of equations is available. (orig./GL)
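
    A minimal numerical illustration of the relationship Y(jw) = W(jw) * X(jw) is given below for a single-degree-of-freedom oscillator (an illustration of the modelling idea, not the paper's condensed multi-degree-of-freedom formulation): the input spectrum is multiplied by the transfer function and inverse-transformed to obtain the response.

      # Frequency-domain response of a single-degree-of-freedom oscillator:
      # Y(jw) = W(jw) * X(jw), with W(jw) = 1 / (wn^2 - w^2 + 2j*zeta*wn*w).
      # Natural frequency, damping and the white-noise input are illustrative.
      import numpy as np

      fs = 100.0
      t = np.arange(0, 60, 1 / fs)
      x = np.random.default_rng(3).normal(size=t.size)    # input (e.g. ground acceleration)

      wn, zeta = 2 * np.pi * 2.0, 0.05                    # 2 Hz natural frequency, 5% damping
      w = 2 * np.pi * np.fft.rfftfreq(t.size, 1 / fs)
      W = 1.0 / (wn ** 2 - w ** 2 + 2j * zeta * wn * w)   # transfer function W(jw)

      X = np.fft.rfft(x)                                  # input Fourier spectrum X(jw)
      Y = W * X                                           # output spectrum Y(jw) = W(jw) X(jw)
      y = np.fft.irfft(Y, n=t.size)                       # response in the time domain

      print(f"response RMS = {y.std():.4f}, dominant response frequency = "
            f"{w[np.argmax(np.abs(Y))] / (2 * np.pi):.2f} Hz")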

  7. Brain Network Analysis from High-Resolution EEG Signals

    Science.gov (United States)

    de Vico Fallani, Fabrizio; Babiloni, Fabio

    lattice and a random structure. Such a model has been designated as "small-world" network in analogy with the concept of the small-world phenomenon observed more than 30 years ago in social systems. In a similar way, many types of functional brain networks have been analyzed according to this mathematical approach. In particular, several studies based on different imaging techniques (fMRI, MEG and EEG) have found that the estimated functional networks showed small-world characteristics. In the functional brain connectivity context, these properties have been demonstrated to reflect an optimal architecture for the information processing and propagation among the involved cerebral structures. However, the performance of cognitive and motor tasks as well as the presence of neural diseases has been demonstrated to affect such a small-world topology, as revealed by the significant changes of L and C. Moreover, some functional brain networks have been mostly found to be very unlike the random graphs in their degree-distribution, which gives information about the allocation of the functional links within the connectivity pattern. It was demonstrated that the degree distributions of these networks follow a power-law trend. For this reason those networks are called "scale-free". They still exhibit the small-world phenomenon but tend to contain few nodes that act as highly connected "hubs". Scale-free networks are known to show resistance to failure, facility of synchronization and fast signal processing. Hence, it would be important to see whether the scaling properties of the functional brain networks are altered under various pathologies or experimental tasks. The present Chapter proposes a theoretical graph approach in order to evaluate the functional connectivity patterns obtained from high-resolution EEG signals. In this way, the "Brain Network Analysis" (in analogy with the Social Network Analysis that has emerged as a key technique in modern sociology) represents an

  8. High resolution melting (HRM) analysis of DNA--its role and potential in food analysis.

    Science.gov (United States)

    Druml, Barbara; Cichna-Markl, Margit

    2014-09-01

    DNA-based methods play an increasing role in food safety control and food adulteration detection. Recent papers show that high resolution melting (HRM) analysis is an interesting approach. It involves amplification of the target of interest in the presence of a saturation dye by the polymerase chain reaction (PCR) and subsequent melting of the amplicons by gradually increasing the temperature. Since the melting profile depends on the GC content, length, sequence and strand complementarity of the product, HRM analysis is highly suitable for the detection of single-base variants and small insertions or deletions. The review gives an introduction to HRM analysis, covers important aspects in the development of an HRM analysis method and describes how HRM data are analysed and interpreted. We then discuss the potential of HRM-analysis-based methods in food analysis, i.e. for the identification of closely related species and cultivars and the identification of pathogenic microorganisms. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. A space-time hybrid hourly rainfall model for derived flood frequency analysis

    Directory of Open Access Journals (Sweden)

    U. Haberlandt

    2008-12-01

    Full Text Available For derived flood frequency analysis based on hydrological modelling long continuous precipitation time series with high temporal resolution are needed. Often, the observation network with recording rainfall gauges is poor, especially regarding the limited length of the available rainfall time series. Stochastic precipitation synthesis is a good alternative either to extend or to regionalise rainfall series to provide adequate input for long-term rainfall-runoff modelling with subsequent estimation of design floods. Here, a new two step procedure for stochastic synthesis of continuous hourly space-time rainfall is proposed and tested for the extension of short observed precipitation time series.

    First, a single-site alternating renewal model is presented to simulate independent hourly precipitation time series for several locations. The alternating renewal model describes wet spell durations, dry spell durations and wet spell intensities using univariate frequency distributions separately for two seasons. The dependence between wet spell intensity and duration is accounted for by 2-copulas. For disaggregation of the wet spells into hourly intensities a predefined profile is used. In the second step a multi-site resampling procedure is applied on the synthetic point rainfall event series to reproduce the spatial dependence structure of rainfall. Resampling is carried out successively on all synthetic event series using simulated annealing with an objective function considering three bivariate spatial rainfall characteristics. In a case study synthetic precipitation is generated for some locations with short observation records in two mesoscale catchments of the Bode river basin located in northern Germany. The synthetic rainfall data are then applied for derived flood frequency analysis using the hydrological model HEC-HMS. The results show good performance in reproducing average and extreme rainfall characteristics as well as in
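
    A minimal sketch of a single-site alternating renewal generator in the spirit of the model described above is given below: dry and wet spell durations are drawn at random, and each wet-spell depth is disaggregated over the spell with a fixed single-peak profile. All distributions and parameter values are illustrative assumptions, and the copula-based dependence between wet-spell intensity and duration used in the paper is not reproduced.

      # Toy single-site alternating renewal rainfall generator (hourly resolution).
      # Spell-duration and depth distributions and the disaggregation profile are
      # illustrative assumptions, not the model parameters of the paper.
      import numpy as np

      rng = np.random.default_rng(4)
      n_hours, hour = 24 * 365, 0
      rain = np.zeros(n_hours)

      while hour < n_hours:
          dry = int(rng.exponential(40)) + 1              # dry spell duration (h), assumed
          hour += dry
          wet = int(rng.exponential(5)) + 1               # wet spell duration (h), assumed
          depth = rng.gamma(shape=1.2, scale=2.0) * wet   # total wet-spell depth (mm), assumed
          profile = np.sin(np.linspace(0, np.pi, wet + 2))[1:-1]   # single-peak profile
          profile /= profile.sum()
          end = min(hour + wet, n_hours)
          rain[hour:end] = depth * profile[: max(end - hour, 0)]
          hour = end

      print(f"annual total {rain.sum():.0f} mm, wet-hour fraction {(rain > 0).mean():.2f}, "
            f"max hourly intensity {rain.max():.1f} mm/h")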

  10. On time-frequency analysis of heart rate variability

    NARCIS (Netherlands)

    H.G. van Steenis (Hugo)

    2002-01-01

    The aim of this research is to develop a time-frequency method suitable to study HRV in greater detail. The following approach was used: • two known time-frequency representations were applied to HRV to understand their advantages and disadvantages in describing HRV in frequency and in

  11. Trade-off Analysis of Virtual Inertia and Fast Primary Frequency Control During Frequency Transients in a Converter Dominated Network

    DEFF Research Database (Denmark)

    Rezkalla, Michel M.N.; Marinelli, Mattia; Pertl, Michael

    2016-01-01

    Traditionally, electricity generation is based on rotating synchronous machines which provide inertia to the power system. The increasing share of converter-connected energy sources reduces the available rotational inertia in the power system, leading to faster frequency dynamics, which may cause more critical frequency excursions. Both virtual inertia and fast primary control could serve as solutions to improve frequency stability; however, their respective impacts on the system have different consequences, so that the trade-off is not straightforward. This study presents a comparative analysis of virtual inertia and fast primary control algorithms with respect to the rate of change of frequency (ROCOF), frequency nadir and steady-state value, considering the effect of the dead time, which is carried out by a sensitivity analysis. The investigation shows that the virtual inertia controller
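
    The comparison described above can be caricatured with a one-bus swing-equation sketch: a load step is applied once with extra (virtual) inertia and once with a fast primary controller acting after a dead time, and the maximum ROCOF, frequency nadir and steady-state deviation are compared. All parameters, gains and the dead time are illustrative assumptions, not the study's values.

      # One-bus swing-equation caricature of the frequency transient after a load step.
      # All numbers are illustrative assumptions (per-unit system), not the study's values.
      import numpy as np

      def simulate(H, droop_gain, dead_time, dt=0.01, t_end=20.0, dP=-0.1, D=1.0):
          steps = int(t_end / dt)
          f = np.zeros(steps)                  # frequency deviation (p.u.)
          rocof_max, t = 0.0, 0.0
          for k in range(1, steps):
              t += dt
              # fast primary control: droop-like response released after the dead time
              p_fast = -droop_gain * f[k - 1] if t > dead_time else 0.0
              dfdt = (dP + p_fast - D * f[k - 1]) / (2.0 * H)   # swing equation
              f[k] = f[k - 1] + dfdt * dt
              rocof_max = max(rocof_max, abs(dfdt))
          return rocof_max, f.min(), f[-1]

      cases = [("base case        ", 3.0, 0.0),     # inertia constant H, droop gain
               ("virtual inertia  ", 6.0, 0.0),     # doubled effective inertia
               ("fast primary ctrl", 3.0, 10.0)]    # added fast droop control
      for label, H, kf in cases:
          rocof, nadir, ss = simulate(H=H, droop_gain=kf, dead_time=0.1)
          print(f"{label}: max ROCOF {rocof:.4f} pu/s, nadir {nadir:.4f} pu, "
                f"steady state {ss:.4f} pu")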

  12. Unraveling the Thousand Word Picture: An Introduction to Super-Resolution Data Analysis.

    Science.gov (United States)

    Lee, Antony; Tsekouras, Konstantinos; Calderon, Christopher; Bustamante, Carlos; Pressé, Steve

    2017-06-14

    Super-resolution microscopy provides direct insight into fundamental biological processes occurring at length scales smaller than light's diffraction limit. The analysis of data at such scales has brought statistical and machine learning methods into the mainstream. Here we provide a survey of data analysis methods starting from an overview of basic statistical techniques underlying the analysis of super-resolution and, more broadly, imaging data. We subsequently break down the analysis of super-resolution data into four problems: the localization problem, the counting problem, the linking problem, and what we've termed the interpretation problem.
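
    As a minimal illustration of the localization problem mentioned above, the sketch below fits a 2D Gaussian approximation of the point spread function to a noisy synthetic single-emitter image and recovers the emitter position with sub-pixel precision. The image size, PSF width and noise model are illustrative assumptions.

      # Fit a 2D Gaussian PSF model to a noisy synthetic single-emitter image to
      # recover its position with sub-pixel precision (illustrative parameters).
      import numpy as np
      from scipy.optimize import curve_fit

      def gauss2d(coords, x0, y0, sigma, amp, bg):
          x, y = coords
          g = amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2)) + bg
          return g.ravel()

      rng = np.random.default_rng(7)
      x, y = np.meshgrid(np.arange(15), np.arange(15))
      true = dict(x0=7.3, y0=6.8, sigma=1.5, amp=200.0, bg=10.0)
      img = rng.poisson(gauss2d((x, y), **true).reshape(15, 15)).astype(float)  # shot noise

      p0 = (7.0, 7.0, 2.0, float(img.max()), float(img.min()))   # rough initial guess
      popt, _ = curve_fit(gauss2d, (x, y), img.ravel(), p0=p0)
      print(f"recovered position ({popt[0]:.2f}, {popt[1]:.2f}), true ({true['x0']}, {true['y0']})")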

  13. Orthology prediction at scalable resolution by phylogenetic tree analysis

    Directory of Open Access Journals (Sweden)

    Huynen Martijn A

    2007-03-01

    Full Text Available Abstract Background Orthology is one of the cornerstones of gene function prediction. Dividing the phylogenetic relations between genes into either orthologs or paralogs is however an oversimplification. Already in two-species gene phylogenies, the complicated, non-transitive nature of phylogenetic relations results in inparalogs and outparalogs. For situations with more than two species we lack semantics to specifically describe the phylogenetic relations, let alone to exploit them. Published procedures to extract orthologous groups from phylogenetic trees do not allow identification of orthology at various levels of resolution, nor do they document the relations between the orthologous groups. Results We introduce "levels of orthology" to describe the multi-level nature of gene relations. This is implemented in a program LOFT (Levels of Orthology From Trees) that assigns hierarchical orthology numbers to genes based on a phylogenetic tree. To decide upon speciation and gene duplication events in a tree, LOFT can be instructed either to perform classical species-tree reconciliation or to use the species overlap between partitions in the tree. The hierarchical orthology numbers assigned by LOFT effectively summarize the phylogenetic relations between genes. The resulting high-resolution orthologous groups are depicted in colour, facilitating visual inspection of (large) trees. A benchmark for orthology prediction, which takes into account the varying levels of orthology between genes, shows that the phylogeny-based high-resolution orthology assignments made by LOFT are reliable. Conclusion The "levels of orthology" concept offers high-resolution, reliable orthology, while preserving the relations between orthologous groups. A Windows as well as a preliminary Java version of LOFT is available from the LOFT website http://www.cmbi.ru.nl/LOFT.

  14. Time-frequency analysis of railway bridge response in forced vibration

    Science.gov (United States)

    Cantero, Daniel; Ülker-Kaustell, Mahir; Karoumi, Raid

    2016-08-01

    This paper suggests the use of the Continuous Wavelet Transform in combination with the Modified Littlewood-Paley basis to analyse bridge responses excited by traversing trains. The analysis provides an energy distribution map in the time-frequency domain that offers a better resolution compared to previously published studies. This is demonstrated with recorded responses of the Skidträsk Bridge, a 36 m long composite bridge located in Sweden. It is shown to be particularly useful for understanding the evolution of the energy content during a vehicle crossing event. With this information it is possible to distinguish the effect of several of the governing factors involved in the dynamic response, including the vehicle's speed and axle configuration as well as non-linear behaviour of the structure.

  15. High-resolution analysis of the mechanical behavior of tissue

    Science.gov (United States)

    Hudnut, Alexa W.; Armani, Andrea M.

    2017-06-01

    The mechanical behavior and properties of biomaterials, such as tissue, have been directly and indirectly connected to numerous malignant physiological states. For example, an increase in the Young's Modulus of tissue can be indicative of cancer. Due to the heterogeneity of biomaterials, it is extremely important to perform these measurements using whole or unprocessed tissue because the tissue matrix contains important information about the intercellular interactions and the structure. Thus, developing high-resolution approaches that can accurately measure the elasticity of unprocessed tissue samples is of great interest. Unfortunately, conventional elastography methods such as atomic force microscopy, compression testing, and ultrasound elastography either require sample processing or have poor resolution. In the present work, we demonstrate the characterization of unprocessed salmon muscle using an optical polarimetric elastography system. We compare the results of compression testing within different samples of salmon skeletal muscle with different numbers of collagen membranes to characterize differences in heterogeneity. Using the intrinsic collagen membranes as markers, we determine the resolution of the system when testing biomaterials. The device reproducibly measures the stiffness of the tissues at variable strains. By analyzing the amount of energy lost by the sample during compression, collagen membranes that are 500 μm in size are detected.
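
    A minimal sketch of extracting an effective (tangent) Young's modulus from a compression stress-strain record such as those discussed above is given below; the data are synthetic and the optical polarimetric instrument itself is not modelled.

      # Tangent Young's modulus from a synthetic compression stress-strain record.
      # The stiffening response and noise level are illustrative assumptions.
      import numpy as np

      strain = np.linspace(0, 0.15, 300)
      rng = np.random.default_rng(5)
      stress = 8e3 * strain + 2e5 * strain ** 3 + rng.normal(0, 50, strain.size)   # Pa

      small = strain < 0.05                     # small-strain (toe) region
      E_small = np.polyfit(strain[small], stress[small], 1)[0]

      large = strain > 0.10                     # stiffened, large-strain region
      E_large = np.polyfit(strain[large], stress[large], 1)[0]

      print(f"tangent modulus: {E_small / 1e3:.1f} kPa below 5% strain, "
            f"{E_large / 1e3:.1f} kPa above 10% strain")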

  16. High-resolution analysis of Y-chromosomal polymorphisms reveals ...

    Indian Academy of Sciences (India)

    Unknown

    2001-12-12

    Dec 12, 2001 ... from out of Africa, through West Asia, was into India. (Cann 2001). .... frequencies in all the Middle Eastern, Central Asian and ..... the south and the west. Some of these .... Israeli Ministry of Science, Culture and Sport. We wish ...

  17. A general theory on frequency and time-frequency analysis of irregularly sampled time series based on projection methods - Part 1: Frequency analysis

    Science.gov (United States)

    Lenoir, Guillaume; Crucifix, Michel

    2018-03-01

    We develop a general framework for the frequency analysis of irregularly sampled time series. It is based on the Lomb-Scargle periodogram, but extended to algebraic operators accounting for the presence of a polynomial trend in the model for the data, in addition to a periodic component and a background noise. Special care is devoted to the correlation between the trend and the periodic component. This new periodogram is then cast into the Welch overlapping segment averaging (WOSA) method in order to reduce its variance. We also design a test of significance for the WOSA periodogram, against the background noise. The model for the background noise is a stationary Gaussian continuous autoregressive-moving-average (CARMA) process, more general than the classical Gaussian white or red noise processes. CARMA parameters are estimated following a Bayesian framework. We provide algorithms that compute the confidence levels for the WOSA periodogram and fully take into account the uncertainty in the CARMA noise parameters. Alternatively, a theory using point estimates of CARMA parameters provides analytical confidence levels for the WOSA periodogram, which are more accurate than Markov chain Monte Carlo (MCMC) confidence levels and, below some threshold for the number of data points, less costly in computing time. We then estimate the amplitude of the periodic component with least-squares methods, and derive an approximate proportionality between the squared amplitude and the periodogram. This proportionality leads to a new extension for the periodogram: the weighted WOSA periodogram, which we recommend for most frequency analyses with irregularly sampled data. The estimated signal amplitude also permits filtering in a frequency band. Our results generalise and unify methods developed in the fields of geosciences, engineering, astronomy and astrophysics. They also constitute the starting point for an extension to the continuous wavelet transform developed in a companion
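
    The building block of the framework described above, the Lomb-Scargle periodogram for irregularly sampled data, can be computed directly with SciPy. The sketch below covers only that basic step; the trend handling, WOSA averaging and CARMA significance testing of the paper are not reproduced, and the test signal is an assumption.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 100.0, 300))                   # irregular sampling times
y = 2.0 * np.sin(2 * np.pi * 0.1 * t) + rng.normal(0, 1, t.size)
y -= y.mean()                                               # remove the mean (no trend handling here)

freqs = np.linspace(0.01, 0.5, 1000)                        # cyclic frequencies to scan
pgram = lombscargle(t, y, 2 * np.pi * freqs)                # SciPy expects angular frequencies

print("periodogram peak near %.3f cycles per unit time" % freqs[np.argmax(pgram)])
```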

  18. Frequency Analysis of Aircraft hazards for License Application

    Energy Technology Data Exchange (ETDEWEB)

    K. Ashley

    2006-10-24

    The preclosure safety analysis for the monitored geologic repository at Yucca Mountain must consider the hazard that aircraft may pose to surface structures. Relevant surface structures are located beneath the restricted airspace of the Nevada Test Site (NTS) on the eastern slope of Yucca Mountain, near the North Portal of the Exploratory Studies Facility Tunnel (Figure 1). The North Portal is located several miles from the Nevada Test and Training Range (NTTR), which is used extensively by the U.S. Air Force (USAF) for training and test flights (Figure 1). The NTS airspace, which is controlled by the U.S. Department of Energy (DOE) for NTS activities, is not part of the NTTR. Agreements with the DOE allow USAF aircraft specific use of the airspace above the NTS (Reference 2.1.1 [DIRS 103472], Section 3.1.1 and Appendix A, Section 2.1; and Reference 2.1.2 [DIRS 157987], Sections 1.26 through 1.29). Commercial, military, and general aviation aircraft fly within several miles to the southwest of the repository site in the Beatty Corridor, which is a broad air corridor that runs approximately parallel to U.S. Highway 95 and the Nevada-California border (Figure 2). These aircraft and other aircraft operations are identified and described in "Identification of Aircraft Hazards" (Reference 2.1.3, Sections 6 and 8). The purpose of this analysis is to estimate crash frequencies for aircraft hazards identified for detailed analysis in "Identification of Aircraft Hazards" (Reference 2.1.3, Section 8). Reference 2.1.3, Section 8, also identifies a potential hazard associated with electronic jamming, which will be addressed in this analysis. This analysis will address only the repository and not the transportation routes to the site. The analysis is intended to provide the basis for: (1) Categorizing event sequences related to aircraft hazards; (2) Identifying design or operational requirements related to aircraft hazards.

  19. Grid Frequency Extreme Event Analysis and Modeling: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Florita, Anthony R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clark, Kara [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gevorgian, Vahan [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Folgueras, Maria [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wenger, Erin [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-11-01

    Sudden losses of generation or load can lead to instantaneous changes in electric grid frequency and voltage. Extreme frequency events pose a major threat to grid stability. As renewable energy sources supply power to grids in increasing proportions, it becomes increasingly important to examine when and why extreme events occur to prevent destabilization of the grid. To better understand frequency events, including extrema, historic data were analyzed to fit probability distribution functions to various frequency metrics. Results showed that a standard Cauchy distribution fit the difference between the frequency nadir and prefault frequency (f_(C-A)) metric well, a standard Cauchy distribution fit the settling frequency (f_B) metric well, and a standard normal distribution fit the difference between the settling frequency and frequency nadir (f_(B-C)) metric very well. Results were inconclusive for the frequency nadir (f_C) metric, meaning it likely has a more complex distribution than those tested. This probabilistic modeling should facilitate more realistic modeling of grid faults.
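
    The kind of distribution fitting described above can be sketched as follows: fit candidate Cauchy and normal distributions to a frequency-event metric and compare their goodness of fit. The synthetic data below stand in for the historic event records used in the study.

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for a historic record of one frequency-event metric,
# e.g. the nadir-minus-prefault difference f_(C-A) in Hz (an assumption).
rng = np.random.default_rng(1)
metric = stats.cauchy.rvs(loc=0.0, scale=0.02, size=500, random_state=rng)

cauchy_params = stats.cauchy.fit(metric)
norm_params = stats.norm.fit(metric)

# Kolmogorov-Smirnov statistics as a rough goodness-of-fit comparison.
ks_cauchy = stats.kstest(metric, "cauchy", args=cauchy_params).statistic
ks_norm = stats.kstest(metric, "norm", args=norm_params).statistic
print(f"KS statistic - Cauchy: {ks_cauchy:.3f}, normal: {ks_norm:.3f}")
```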

  20. Helper T lymphocyte precursor frequency analysis in alloreactivity detection

    International Nuclear Information System (INIS)

    Cukrova, V.; Dolezalova, L.; Loudova, M.; Vitek, A.

    1998-01-01

    The utility of frequency testing of IL-2-secreting helper T lymphocyte precursors (HTLp) has been evaluated for detecting alloreactivity. The frequency of HTLp was approached by limiting dilution assay. High HTLp frequency was detected in 20 out of 30 HLA-matched unrelated pairs (67%). The comparison of HTLp and CTLp (cytotoxic T lymphocyte precursor) frequencies in HLA-matched unrelated pairs showed that the two examinations are not fully interchangeable in detecting alloreactivity. This could suggest the utility of combined testing of both HTLp and CTLp frequencies for alloreactivity assessment. In contrast, only five positive HTLp values were found among 28 HLA genotypically identical siblings (18%). Previous CTLp limiting dilution studies showed very low or undetectable CTLp frequencies in that group. Therefore, the HTLp assay remains the only cellular in vitro technique detecting alloreactivity in these combinations. (authors)

  1. Comparison of frequency-distance relationship and Gaussian-diffusion-based methods of compensation for distance-dependent spatial resolution in SPECT imaging

    International Nuclear Information System (INIS)

    Kohli, Vandana; King, Michael A.; Glick, Stephen J.; Pan, Tin-Su

    1998-01-01

    The goal of this investigation was to compare resolution recovery versus noise level of two methods for compensation of distance-dependent resolution (DDR) in SPECT imaging. The two methods of compensation were restoration filtering based on the frequency-distance relationship (FDR) prior to iterative reconstruction, and modelling DDR in the projector/backprojector pair employed in iterative reconstruction. FDR restoration filtering was computationally faster than modelling the detector response in iterative reconstruction. Using Gaussian diffusion to model the detector response in iterative reconstruction sped up the process by a factor of 2.5 over frequency domain filtering in the projector/backprojector pair. Gaussian diffusion modelling resulted in a better resolution versus noise tradeoff than either FDR restoration filtering or solely modelling attenuation in the projector/backprojector pair of iterative reconstruction. For the pixel size investigated herein (0.317 cm), accounting for DDR in the projector/backprojector pair by Gaussian diffusion, or by applying a blurring function based on the distance from the face of the collimator at each distance, resulted in very similar resolution recovery and slice noise level. (author)

  2. A general theory on frequency and time-frequency analysis of irregularly sampled time series based on projection methods - Part 2: Extension to time-frequency analysis

    Science.gov (United States)

    Lenoir, Guillaume; Crucifix, Michel

    2018-03-01

    Geophysical time series are sometimes sampled irregularly along the time axis. The situation is particularly frequent in palaeoclimatology. Yet, there is so far no general framework for handling the continuous wavelet transform when the time sampling is irregular. Here we provide such a framework. To this end, we define the scalogram as the continuous-wavelet-transform equivalent of the extended Lomb-Scargle periodogram defined in Part 1 of this study (Lenoir and Crucifix, 2018). The signal being analysed is modelled as the sum of a locally periodic component in the time-frequency plane, a polynomial trend, and a background noise. The mother wavelet adopted here is the Morlet wavelet classically used in geophysical applications. The background noise model is a stationary Gaussian continuous autoregressive-moving-average (CARMA) process, which is more general than the traditional Gaussian white and red noise processes. The scalogram is smoothed by averaging over neighbouring times in order to reduce its variance. The Shannon-Nyquist exclusion zone is however defined as the area corrupted by local aliasing issues. The local amplitude in the time-frequency plane is then estimated with least-squares methods. We also derive an approximate formula linking the squared amplitude and the scalogram. Based on this property, we define a new analysis tool: the weighted smoothed scalogram, which we recommend for most analyses. The estimated signal amplitude also gives access to band and ridge filtering. Finally, we design a test of significance for the weighted smoothed scalogram against the stationary Gaussian CARMA background noise, and provide algorithms for computing confidence levels, either analytically or with Monte Carlo Markov chain methods. All the analysis tools presented in this article are available to the reader in the Python package WAVEPAL.

  3. Time-frequency analysis of stimulus frequency otoacoustic emissions and their changes with efferent stimulation in guinea pigs

    Science.gov (United States)

    Berezina-Greene, Maria A.; Guinan, John J.

    2015-12-01

    To aid in understanding their origin, stimulus frequency otoacoustic emissions (SFOAEs) were measured at a series of tone frequencies using the suppression method, both with and without stimulation of medial olivocochlear (MOC) efferents, in anesthetized guinea pigs. Time-frequency analysis showed SFOAE energy peaks in 1-3 delay components throughout the measured frequency range (0.5-12 kHz). One component's delay usually coincided with the phase-gradient delay. When multiple delay components were present, they were usually near SFOAE dips. Below 2 kHz, SFOAE delays were shorter than predicted from mechanical measurements. With MOC stimulation, SFOAE amplitude was decreased at most frequencies, but was sometimes enhanced, and all SFOAE delay components were affected. The MOC effects and an analysis of model data suggest that the multiple SFOAE delay components arise at the edges of the traveling-wave peak, not far basal of the peak. Comparisons with published guinea-pig neural data suggest that the short latencies of low-frequency SFOAEs may arise from coherent reflection from an organ-of-Corti motion that has a shorter group delay than the traveling wave.

  4. High-resolution SNP array analysis of patients with developmental disorder and normal array CGH results

    Directory of Open Access Journals (Sweden)

    Siggberg Linda

    2012-09-01

    Full Text Available Abstract Background Diagnostic analysis of patients with developmental disorders has improved over recent years largely due to the use of microarray technology. Array methods that facilitate copy number analysis have enabled the diagnosis of up to 20% more patients with previously normal karyotyping results. A substantial number of patients remain undiagnosed, however. Methods and Results Using the Genome-Wide Human SNP array 6.0, we analyzed 35 patients with a developmental disorder of unknown cause and normal array comparative genomic hybridization (array CGH) results, in order to characterize previously undefined genomic aberrations. We detected no seemingly pathogenic copy number aberrations. Most of the vast amount of data produced by the array was polymorphic and non-informative. Filtering of these data, based on copy number variant (CNV) population frequencies as well as phenotypically relevant genes, enabled pinpointing regions of allelic homozygosity that included candidate genes correlating to the phenotypic features in four patients, but results could not be confirmed. Conclusions In this study, the use of an ultra high-resolution SNP array did not contribute to the further diagnosis of patients with developmental disorders of unknown cause. The statistical power of these results is limited by the small size of the patient cohort, and interpretation of these negative results can only be applied to the patients studied here. We present the results of our study and the recurrence of clustered allelic homozygosity present in this material, as detected by the SNP 6.0 array.

  5. Quantitative analysis of cholesteatoma using high resolution computed tomography

    International Nuclear Information System (INIS)

    Kikuchi, Shigeru; Yamasoba, Tatsuya; Iinuma, Toshitaka.

    1992-01-01

    Seventy-three cases of adult cholesteatoma, including 52 cases of pars flaccida type cholesteatoma and 21 of pars tensa type cholesteatoma, were examined using high resolution computed tomography, in both axial (lateral semicircular canal plane) and coronal sections (cochlear, vestibular and antral plane). These cases were classified into two subtypes according to the presence of extension of cholesteatoma into the antrum. Sixty cases of chronic otitis media with central perforation (COM) were also examined as controls. Various locations of the middle ear cavity were measured in terms of size in comparison with pars flaccida type cholesteatoma, pars tensa type cholesteatoma and COM. The width of the attic was significantly larger in both pars flaccida type and pars tensa type cholesteatoma than in COM. With pars flaccida type cholesteatoma there was a significantly larger distance between the malleus and lateral wall of the attic than with COM. In contrast, the distance between the malleus and medial wall of the attic was significantly larger with pars tensa type cholesteatoma than with COM. With cholesteatoma extending into the antrum, regardless of the type of cholesteatoma, there were significantly larger distances than with COM at the following sites: the width and height of the aditus ad antrum, and the width, height and anterior-posterior diameter of the antrum. However, these distances were not significantly different between cholesteatoma without extension into the antrum and COM. The hitherto demonstrated qualitative impressions of bone destruction in cholesteatoma were quantitatively verified in detail using high resolution computed tomography. (author)

  6. Montecarlo simulation for a new high resolution elemental analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Figueroa S, Rodolfo; Brusa, Daniel; Riveros, Alberto [Universidad de La Frontera, Temuco (Chile). Facultad de Ingenieria y Administracion]

    1996-12-31

    Full text. Spectra generated by binary, ternary and multielement matrices when irradiated by a variable energy photon beam are simulated by means of a Monte Carlo code. Significant jumps in the counting rate are shown when the photon energy is just over the edge associated with each element, because of the emission of characteristic X rays. For a given associated energy, the net height of these jumps depends mainly on the concentration and on the sample absorption coefficient. The spectra were obtained by a monochromatic energy scan considering all the radiation emitted by the sample in a 2π solid angle, associating a single multichannel spectrometer channel to each incident energy (Multichannel Scaling (MCS) mode). The simulated spectra were produced with an adaptation of the Monte Carlo simulation package PENELOPE (Penetration and Energy Loss of Positrons and Electrons in matter). The results show that it is possible to implement a new high resolution spectroscopy methodology, where a synchrotron would be an ideal source, due to the high intensity and the ability to control the energy of the incident beam. The high energy resolution would be determined by the monochromating system and not by the detection system, which would basically be a photon counter. (author)

  7. Montecarlo simulation for a new high resolution elemental analysis methodology

    International Nuclear Information System (INIS)

    Figueroa S, Rodolfo; Brusa, Daniel; Riveros, Alberto

    1996-01-01

    Full text. Spectra generated by binary, ternary and multielement matrices when irradiated by a variable energy photon beam are simulated by means of a Monte Carlo code. Significant jumps in the counting rate are shown when the photon energy is just over the edge associated with each element, because of the emission of characteristic X rays. For a given associated energy, the net height of these jumps depends mainly on the concentration and on the sample absorption coefficient. The spectra were obtained by a monochromatic energy scan considering all the radiation emitted by the sample in a 2π solid angle, associating a single multichannel spectrometer channel to each incident energy (Multichannel Scaling (MCS) mode). The simulated spectra were produced with an adaptation of the Monte Carlo simulation package PENELOPE (Penetration and Energy Loss of Positrons and Electrons in matter). The results show that it is possible to implement a new high resolution spectroscopy methodology, where a synchrotron would be an ideal source, due to the high intensity and the ability to control the energy of the incident beam. The high energy resolution would be determined by the monochromating system and not by the detection system, which would basically be a photon counter. (author)

  8. Uncertainty Assessment of Hydrological Frequency Analysis Using Bootstrap Method

    Directory of Open Access Journals (Sweden)

    Yi-Ming Hu

    2013-01-01

    Full Text Available The hydrological frequency analysis (HFA is the foundation for the hydraulic engineering design and water resources management. Hydrological extreme observations or samples are the basis for HFA; the representativeness of a sample series to the population distribution is extremely important for the estimation reliability of the hydrological design value or quantile. However, for most of hydrological extreme data obtained in practical application, the size of the samples is usually small, for example, in China about 40~50 years. Generally, samples with small size cannot completely display the statistical properties of the population distribution, thus leading to uncertainties in the estimation of hydrological design values. In this paper, a new method based on bootstrap is put forward to analyze the impact of sampling uncertainty on the design value. By bootstrap resampling technique, a large number of bootstrap samples are constructed from the original flood extreme observations; the corresponding design value or quantile is estimated for each bootstrap sample, so that the sampling distribution of design value is constructed; based on the sampling distribution, the uncertainty of quantile estimation can be quantified. Compared with the conventional approach, this method provides not only the point estimation of a design value but also quantitative evaluation on uncertainties of the estimation.
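
    A minimal sketch of the bootstrap idea described above: resample an annual-maximum series with replacement, re-estimate the design quantile for each bootstrap sample, and summarise the resulting sampling distribution. The record length, the Gumbel model and all numbers below are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Stand-in for ~45 years of observed annual maximum floods (assumed values).
annual_max = stats.gumbel_r.rvs(loc=1000.0, scale=300.0, size=45, random_state=rng)

def quantile_100yr(sample):
    """Fit a Gumbel distribution and return the 100-year (p = 0.99) quantile."""
    loc, scale = stats.gumbel_r.fit(sample)
    return stats.gumbel_r.ppf(0.99, loc, scale)

# Resample with replacement and re-estimate the design value for each bootstrap sample.
boot = np.array([
    quantile_100yr(rng.choice(annual_max, size=annual_max.size, replace=True))
    for _ in range(2000)
])

print("point estimate:", round(quantile_100yr(annual_max), 1))
print("bootstrap 95% interval:", np.percentile(boot, [2.5, 97.5]).round(1))
```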

  9. Quantification of Uncertainty in the Flood Frequency Analysis

    Science.gov (United States)

    Kasiapillai Sudalaimuthu, K.; He, J.; Swami, D.

    2017-12-01

    Flood frequency analysis (FFA) is usually carried out for planning and designing of water resources and hydraulic structures. Owing to the existence of variability in sample representation, selection of distribution and estimation of distribution parameters, the estimation of flood quantile has been always uncertain. Hence, suitable approaches must be developed to quantify the uncertainty in the form of prediction interval as an alternate to deterministic approach. The developed framework in the present study to include uncertainty in the FFA discusses a multi-objective optimization approach to construct the prediction interval using ensemble of flood quantile. Through this approach, an optimal variability of distribution parameters is identified to carry out FFA. To demonstrate the proposed approach, annual maximum flow data from two gauge stations (Bow river at Calgary and Banff, Canada) are used. The major focus of the present study was to evaluate the changes in magnitude of flood quantiles due to the recent extreme flood event occurred during the year 2013. In addition, the efficacy of the proposed method was further verified using standard bootstrap based sampling approaches and found that the proposed method is reliable in modeling extreme floods as compared to the bootstrap methods.

  10. Assessment of homogeneity of regions for regional flood frequency analysis

    Science.gov (United States)

    Lee, Jeong Eun; Kim, Nam Won

    2016-04-01

    This paper analyzed the effect of rainfall on hydrological similarity, which is an important step for regional flood frequency analysis (RFFA). For the RFFA, storage function method (SFM) using spatial extension technique was applied for the 22 sub-catchments that are partitioned from Chungju dam watershed in Republic of Korea. We used the SFM to generate the annual maximum floods for 22 sub-catchments using annual maximum storm events (1986~2010) as input data. Then the quantiles of rainfall and flood were estimated using the annual maximum series for the 22 sub-catchments. Finally, spatial variations in terms of two quantiles were analyzed. As a result, there were significant correlation between spatial variations of the two quantiles. This result demonstrates that spatial variation of rainfall is an important factor to explain the homogeneity of regions when applying RFFA. Acknowledgements: This research was supported by a grant (11-TI-C06) from Advanced Water Management Research Program funded by Ministry of Land, Infrastructure and Transport of Korean government.

  11. Numerical electromagnetic frequency domain analysis with discrete exterior calculus

    Science.gov (United States)

    Chen, Shu C.; Chew, Weng Cho

    2017-12-01

    In this paper, we perform a numerical analysis in frequency domain for various electromagnetic problems based on discrete exterior calculus (DEC) with an arbitrary 2-D triangular or 3-D tetrahedral mesh. We formulate the governing equations in terms of DEC for 3-D and 2-D inhomogeneous structures, and also show that the charge continuity relation is naturally satisfied. Then we introduce a general construction for signed dual volume to incorporate material information and take into account the case when circumcenters fall outside triangles or tetrahedrons, which may lead to negative dual volume without Delaunay triangulation. Then we examine the boundary terms induced by the dual mesh and provide a systematical treatment of various boundary conditions, including perfect magnetic conductor (PMC), perfect electric conductor (PEC), Dirichlet, periodic, and absorbing boundary conditions (ABC) within this method. An excellent agreement is achieved through the numerical calculation of several problems, including homogeneous waveguides, microstructured fibers, photonic crystals, scattering by a 2-D PEC, and resonant cavities.

  12. Rainfall frequency analysis for ungauged sites using satellite precipitation products

    Science.gov (United States)

    Gado, Tamer A.; Hsu, Kuolin; Sorooshian, Soroosh

    2017-11-01

    The occurrence of extreme rainfall events and their impacts on hydrologic systems and society are critical considerations in the design and management of a large number of water resources projects. As precipitation records are often limited or unavailable at many sites, it is essential to develop better methods for regional estimation of extreme rainfall at these partially-gauged or ungauged sites. In this study, an innovative method for regional rainfall frequency analysis for ungauged sites is presented. The new method (hereafter, this is called the RRFA-S) is based on corrected annual maximum series obtained from a satellite precipitation product (e.g., PERSIANN-CDR). The probability matching method (PMM) is used here for bias correction to match the CDF of satellite-based precipitation data with the gauged data. The RRFA-S method was assessed through a comparative study with the traditional index flood method using the available annual maximum series of daily rainfall in two different regions in USA (11 sites in Colorado and 18 sites in California). The leave-one-out cross-validation technique was used to represent the ungauged site condition. Results of this numerical application have found that the quantile estimates obtained from the new approach are more accurate and more robust than those given by the traditional index flood method.
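
    A hedged sketch of CDF matching in the spirit of the probability matching method (PMM) mentioned above: each satellite value is mapped to the gauge value with the same empirical non-exceedance probability. The data below are synthetic, not PERSIANN-CDR or gauge records.

```python
import numpy as np

def pmm_correct(satellite, gauge, values):
    """Map satellite values onto the gauge distribution by matching empirical CDFs."""
    probs = np.linspace(0.01, 0.99, 99)
    sat_q = np.quantile(satellite, probs)   # satellite quantiles
    gau_q = np.quantile(gauge, probs)       # gauge quantiles
    p = np.interp(values, sat_q, probs)     # value -> probability on the satellite CDF
    return np.interp(p, probs, gau_q)       # probability -> value on the gauge CDF

rng = np.random.default_rng(7)
gauge = rng.gamma(shape=2.0, scale=15.0, size=3000)                        # gauged daily rainfall (assumed)
satellite = np.clip(0.7 * gauge + rng.normal(0, 5, gauge.size), 0, None)  # biased satellite estimate (assumed)

corrected = pmm_correct(satellite, gauge, satellite)
print("means - satellite: %.1f, corrected: %.1f, gauge: %.1f"
      % (satellite.mean(), corrected.mean(), gauge.mean()))
```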

  13. Use of historical information in extreme storm surges frequency analysis

    Science.gov (United States)

    Hamdi, Yasser; Duluc, Claire-Marie; Deville, Yves; Bardet, Lise; Rebour, Vincent

    2013-04-01

    The prevention of storm surge flood risks is critical for protection and design of coastal facilities to very low probabilities of failure. Effective protection requires the use of a statistical analysis approach having a solid theoretical motivation. Relating extreme storm surges to their frequency of occurrence using probability distributions has been a common issue since the 1950s. The engineer needs to determine the storm surge of a given return period, i.e., the storm surge quantile or design storm surge. Traditional methods for determining such a quantile have generally been based on data from the systematic record alone. However, the statistical extrapolation, to estimate storm surges corresponding to high return periods, is seriously contaminated by sampling and model uncertainty if data are available for a relatively limited period. This has motivated the development of approaches to enlarge the sample of extreme values beyond the systematic period. Nonsystematic data that occurred before the systematic period are called historical information. During the last three decades, the value of using historical information as nonsystematic data in frequency analysis has been recognized by several authors. The basic hypothesis in statistical modeling of historical information is that a perception threshold exists and that during a given historical period preceding the period of tide gauging, all exceedances of this threshold have been recorded. Historical information prior to the systematic records may arise from high-sea water marks left by extreme surges on the coastal areas. It can also be retrieved from archives, old books, earliest newspapers, damage reports, unpublished written records and interviews with local residents. A plotting position formula, to compute empirical probabilities based on systematic and historical data, is used in this communication paper. The objective of the present work is to examine the potential gain in estimation accuracy with the

  14. Elasticity analysis by MR elastography using the instantaneous frequency method

    International Nuclear Information System (INIS)

    Oshiro, Osamu; Suga, Mikio; Minato, Kotaro; Okamoto, Jun; Takizawa, Osamu; Matsuda, Tetsuya; Komori, Masaru; Takahashi, Takashi

    2000-01-01

    This paper describes a calculation method for estimating the elasticity of human organs using magnetic resonance elastography (MRE) images. The method is based on the instantaneous frequency method, which is very sensitive to noise. Therefore, the proposed method also incorporates a noise-reduction function. In the instantaneous frequency method, Fourier transform is applied to the measurement signal. Then, inverse Fourier transform is performed after the negative frequency component is set to zero. In the proposed method, noise is reduced by processing in which the positive higher frequency component is also set to zero before inverse Fourier transform is performed. First, we conducted a simulation study and confirmed the applicability of this method and the noise reduction function. Next, we carried out a phantom experiment and demonstrated that elasticity images could be generated, with the gray level corresponding to the local frequency in MRE images. (author)
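
    The procedure described above can be sketched as follows: build an analytic signal by zeroing the negative frequencies (and, for noise reduction, the high positive frequencies) before the inverse Fourier transform, then obtain the local frequency from the phase derivative. The test signal and the cut-off below are illustrative assumptions, not the MRE data of the study.

```python
import numpy as np

fs = 1000.0                                   # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2 * np.pi * (50 + 20 * t) * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)

X = np.fft.fft(x)
freqs = np.fft.fftfreq(t.size, 1 / fs)
X[freqs < 0] = 0.0                            # analytic signal: drop negative frequencies
X[freqs > 200.0] = 0.0                        # noise reduction: drop high positive frequencies as well
analytic = np.fft.ifft(X)

phase = np.unwrap(np.angle(analytic))
inst_freq = np.gradient(phase, 1 / fs) / (2 * np.pi)   # local (instantaneous) frequency, Hz
print("median instantaneous frequency: %.1f Hz" % np.median(inst_freq))
```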

  15. Time-frequency analysis of time-varying modulated signals based on improved energy separation by iterative generalized demodulation

    Science.gov (United States)

    Feng, Zhipeng; Chu, Fulei; Zuo, Ming J.

    2011-03-01

    Energy separation algorithm is good at tracking instantaneous changes in frequency and amplitude of modulated signals, but it is subject to the constraints of mono-component and narrow band. In most cases, time-varying modulated vibration signals of machinery consist of multiple components, and have so complicated instantaneous frequency trajectories on time-frequency plane that they overlap in frequency domain. For such signals, conventional filters fail to obtain mono-components of narrow band, and their rectangular decomposition of time-frequency plane may split instantaneous frequency trajectories thus resulting in information loss. Regarding the advantage of generalized demodulation method in decomposing multi-component signals into mono-components, an iterative generalized demodulation method is used as a preprocessing tool to separate signals into mono-components, so as to satisfy the requirements by energy separation algorithm. By this improvement, energy separation algorithm can be generalized to a broad range of signals, as long as the instantaneous frequency trajectories of signal components do not intersect on time-frequency plane. Due to the good adaptability of energy separation algorithm to instantaneous changes in signals and the mono-component decomposition nature of generalized demodulation, the derived time-frequency energy distribution has fine resolution and is free from cross term interferences. The good performance of the proposed time-frequency analysis is illustrated by analyses of a simulated signal and the on-site recorded nonstationary vibration signal of a hydroturbine rotor during a shut-down transient process, showing that it has potential to analyze time-varying modulated signals of multi-components.
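
    As a rough illustration of energy separation, the sketch below applies the Teager-Kaiser energy operator in the DESA-2 form to a synthetic mono-component AM-FM signal; the iterative generalized demodulation preprocessing proposed in the paper is not reproduced here, and all signal parameters are assumptions.

```python
import numpy as np

def tkeo(x):
    """Teager-Kaiser energy operator: psi[x](n) = x(n)^2 - x(n-1) * x(n+1)."""
    return x[1:-1] ** 2 - x[:-2] * x[2:]

fs = 2000.0
n = np.arange(0, 2 * int(fs))
a = 1.0 + 0.3 * np.cos(2 * np.pi * 2.0 * n / fs)              # slow amplitude modulation
f_inst = 100.0 + 30.0 * np.sin(2 * np.pi * 1.0 * n / fs)      # slow frequency modulation, Hz
x = a * np.cos(2 * np.pi * np.cumsum(f_inst) / fs)

psi_x = tkeo(x)
z = x[2:] - x[:-2]                                            # symmetric difference signal
psi_z = tkeo(z)

# Align lengths (tkeo trims one sample at each end) and avoid division by zero.
psi_x_c = psi_x[1:-1]
eps = 1e-12
omega = 0.5 * np.arccos(np.clip(1.0 - psi_z / (2.0 * psi_x_c + eps), -1.0, 1.0))  # rad/sample
amp = 2.0 * psi_x_c / np.sqrt(psi_z + eps)                    # instantaneous amplitude estimate
freq_hz = omega * fs / (2 * np.pi)
print("estimated instantaneous frequency range: %.1f-%.1f Hz" % (freq_hz.min(), freq_hz.max()))
```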

  16. Design of a high speed, high resolution thermometry system for 1.5 GHz superconducting radio frequency cavities

    Science.gov (United States)

    Knobloch, Jens; Muller, Henry; Padamsee, Hasan

    1994-11-01

    Presented in this paper are the description and the test results of a new stationary thermometry system used to map the temperature of the outer surface of 1.5 GHz superconducting single-cell cavities during operation at 1.6 K. The system comprises 764 removable carbon thermometers whose signals are multiplexed and scanned by a Macintosh computer. A complete temperature map can be obtained in as little as 0.1 s at a temperature resolution of about 0.2 mK. Alternatively, it has been demonstrated that if the acquisition time is increased to several seconds, then a temperature resolution on the order of 30 μK is possible. To our knowledge, these are the fastest acquisition times so far achieved with L-band cavities at these resolutions.

  17. Application on technique of joint time-frequency analysis of seismic signal's first arrival estimation

    International Nuclear Information System (INIS)

    Xu Chaoyang; Liu Junmin; Fan Yanfang; Ji Guohua

    2008-01-01

    Joint time-frequency analysis is conducted to construct a joint density function of time and frequency. It can reveal a signal's frequency components and their evolution, and is a further development of Fourier analysis. In this paper, taking into account the noise characteristics of seismic signals, an estimation method for a seismic signal's first arrival based on the triple correlation of the joint time-frequency spectrum is introduced, and the results of experiments and conclusions are presented. (authors)
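
    As a simplified illustration (not the triple-correlation method of the paper), the sketch below estimates a first arrival as the time at which band-limited spectrogram energy first rises well above the pre-event noise level. The synthetic trace and all thresholds are assumptions.

```python
import numpy as np
from scipy import signal

fs = 100.0                                   # sampling rate, Hz
t = np.arange(0, 60.0, 1 / fs)
rng = np.random.default_rng(2)
x = rng.normal(0, 1, t.size)
x[t >= 30.0] += 5.0 * np.sin(2 * np.pi * 5.0 * t[t >= 30.0])   # synthetic arrival at 30 s

f, tt, Z = signal.stft(x, fs=fs, nperseg=256)
band = (f >= 3.0) & (f <= 8.0)
energy = np.sum(np.abs(Z[band]) ** 2, axis=0)                  # band-limited energy vs time

noise_level = np.median(energy[tt < 20.0])                     # pre-event noise estimate
onset = tt[np.argmax(energy > 10.0 * noise_level)]             # first window well above noise
print("estimated first arrival: %.1f s" % onset)
```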

  18. Coastal Change Analysis Program (C-CAP) High Resolution Land Cover and Change Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Coastal Change Analysis Program (C-CAP) produces national standardized high resolution land cover and change products for the coastal regions of the U.S....

  19. Flood Frequency Analysis of Future Climate Projections in the Cache Creek Watershed

    Science.gov (United States)

    Fischer, I.; Trihn, T.; Ishida, K.; Jang, S.; Kavvas, E.; Kavvas, M. L.

    2014-12-01

    Effects of climate change on hydrologic flow regimes, particularly extreme events, necessitate modeling of future flows to best inform water resources management. Future flow projections may be modeled through the joint use of carbon emission scenarios, general circulation models and watershed models. This research effort ran 13 simulations for carbon emission scenarios (taken from the A1, A2 and B1 families) over the 21st century (2001-2100) for the Cache Creek watershed in Northern California. Atmospheric data from general circulation models, CCSM3 and ECHAM5, were dynamically downscaled to a 9 km resolution using MM5, a regional mesoscale model, before being input into the physically based watershed environmental hydrology (WEHY) model. Ensemble mean and standard deviation of simulated flows describe the expected hydrologic system response. Frequency histograms and cumulative distribution functions characterize the range of hydrologic responses that may occur. The modeled flow results comprise a dataset suitable for time series and frequency analysis allowing for more robust system characterization, including indices such as the 100 year flood return period. These results are significant for water quality management as the Cache Creek watershed is severely impacted by mercury pollution from historic mining activities. Extreme flow events control mercury fate and transport affecting the downstream water bodies of the Sacramento River and Sacramento- San Joaquin Delta which provide drinking water to over 25 million people.
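
    As an illustration of the frequency analysis mentioned above, the sketch below turns an ensemble of simulated annual-maximum flows into an empirical CDF and return periods using Weibull plotting positions; the flow values are synthetic placeholders, not WEHY model output.

```python
import numpy as np

rng = np.random.default_rng(3)
annual_max = rng.gumbel(loc=250.0, scale=80.0, size=100)   # 100 simulated years, m^3/s (assumed)

flows = np.sort(annual_max)
n = flows.size
cdf = np.arange(1, n + 1) / (n + 1.0)            # Weibull plotting position F(x)
return_period = 1.0 / (1.0 - cdf)                # T = 1 / P(exceedance)

# Estimating e.g. the 100-year flood beyond the simulated record length would
# require a fitted distribution rather than the empirical CDF alone.
print("largest simulated event: T ~ %.0f years, flow %.0f m^3/s" % (return_period[-1], flows[-1]))
```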

  20. The dynamics of cyclone clustering in re-analysis and a high-resolution climate model

    Science.gov (United States)

    Priestley, Matthew; Pinto, Joaquim; Dacre, Helen; Shaffrey, Len

    2017-04-01

    Extratropical cyclones have a tendency to occur in groups (clusters) in the exit of the North Atlantic storm track during wintertime, potentially leading to widespread socioeconomic impacts. The Winter of 2013/14 was the stormiest on record for the UK and was characterised by the recurrent clustering of intense extratropical cyclones. This clustering was associated with a strong, straight and persistent North Atlantic 250 hPa jet with Rossby wave-breaking (RWB) on both flanks, pinning the jet in place. Here, we provide for the first time an analysis of all clustered events in 36 years of the ERA-Interim Re-analysis at three latitudes (45˚ N, 55˚ N, 65˚ N) encompassing various regions of Western Europe. The relationship between the occurrence of RWB and cyclone clustering is studied in detail. Clustering at 55˚ N is associated with an extended and anomalously strong jet flanked on both sides by RWB. However, clustering at 65(45)˚ N is associated with RWB to the south (north) of the jet, deflecting the jet northwards (southwards). A positive correlation was found between the intensity of the clustering and RWB occurrence to the north and south of the jet. However, there is considerable spread in these relationships. Finally, analysis has shown that the relationships identified in the re-analysis are also present in a high-resolution coupled global climate model (HiGEM). In particular, clustering is associated with the same dynamical conditions at each of our three latitudes in spite of the identified biases in frequency and intensity of RWB.

  1. An analysis of low frequency noise from large wind turbines

    DEFF Research Database (Denmark)

    Pedersen, Christian Sejer; Møller, Henrik

    2010-01-01

    As wind turbines get larger, worries have emerged, that the noise emitted by the turbines would move down in frequency, and that the contents of low-frequency noise would be enough to cause significant annoyance for the neighbors. The sound emission from 48 wind turbines with nominal electric power......-third-octave-band spectra shows that the relative noise emission is higher in the 63-250 Hz frequency range from turbines above 2 MW than from smaller turbines. The observations confirm a downward shift of the spectrum....

  2. Towards high resolution polarisation analysis using double polarisation and ellipsoidal analysers

    CERN Document Server

    Martin-Y-Marero, D

    2002-01-01

    Classical polarisation analysis methods lack the combination of high resolution and high count rate necessary to cope with the demands of modern condensed-matter experiments. In this work, we present a method to achieve high resolution polarisation analysis based on a double polarisation system. Coupling this method with an ellipsoidal wavelength analyser, a high count rate can be achieved whilst delivering a resolution of around 10 μeV. This method is ideally suited to pulsed sources, although it can be adapted to continuous sources as well. (orig.)

  3. Multi-component time, spatial and frequency analysis of Paleoclimatic Data

    Science.gov (United States)

    Cristiano, Luigia; Stampa, Johannes; Feeser, Ingo; Dörfler, Walter; Meier, Thomas

    2017-04-01

    The investigation of paleoclimatic data offers a powerful tool for understanding the impact of extreme climatic events as well as gradual climatic variations on human development and cultural changes. The current global record of paleoclimatic data is relatively rich but is generally not uniformly structured or regionally distributed. Reconstructed paleoclimatic time series are typically characterized by a non-constant sampling interval and variable data resolution, together with the presence of gaps in the record. Our database consists of pollen concentrations from annually laminated lake sediments at two sites in Northern Germany. Such data offer the possibility of high-resolution palynological and sedimentological analyses on a well constrained time scale. Specifically, we are interested in investigating the time dependence of proxies and the temporal and spatial correlation of the different observables with respect to each other. We present here a quantitative analysis of the pollen data in the frequency and time domains. In particular, we want to understand the complexity of the system and the causes of both sudden and slow changes in the time dependence of the observables. We also show our approach for handling the non-uniform sampling interval and the broad frequency content characterizing paleoclimatic databases. In particular, we worked on developing a robust data analysis to answer the key questions about the correlation between rapid climatic changes and changes in human habits, and to quantitatively elaborate a model for the processed data. Here we present preliminary results, on synthetic as well as real data, for data visualization, for trend identification with a smoothing procedure, and for the identification of sharp changes in the data as a function of time with an autoregressive approach. In addition, we use the cross-correlation and cross-spectrum by applying the Multiple Filtering Technique

  4. Time frequency analysis of olfactory induced EEG-power change.

    Directory of Open Access Journals (Sweden)

    Valentin Alexander Schriever

    Full Text Available The objective of the present study was to investigate the usefulness of time-frequency analysis (TFA) of olfactory-induced EEG change with a low-cost, portable olfactometer in the clinical investigation of smell function. A total of 78 volunteers participated. The study was composed of three parts where olfactory stimuli were presented using a custom-built olfactometer. Part I was designed to optimize the stimulus as well as the recording conditions. In part II EEG-power changes after olfactory/trigeminal stimulation were compared between healthy participants and patients with olfactory impairment. In Part III the test-retest reliability of the method was evaluated in healthy subjects. Part I indicated that the most effective paradigm for stimulus presentation was a cued stimulus, with an interstimulus interval of 18-20 s at a stimulus duration of 1000 ms, with each stimulus quality presented 60 times in blocks of 20 stimuli each. In Part II we found that central processing of olfactory stimuli analyzed by TFA differed significantly between healthy controls and patients even when controlling for age. It was possible to reliably distinguish patients with olfactory impairment from healthy individuals at a high degree of accuracy (healthy controls vs anosmic patients: sensitivity 75%; specificity 89%). In addition we could show a good test-retest reliability of TFA of chemosensory induced EEG-power changes in Part III. Central processing of olfactory stimuli analyzed by TFA reliably distinguishes patients with olfactory impairment from healthy individuals at a high degree of accuracy. Importantly this can be achieved with a simple olfactometer.

  5. Insertion torque, resonance frequency, and removal torque analysis of microimplants.

    Science.gov (United States)

    Tseng, Yu-Chuan; Ting, Chun-Chan; Du, Je-Kang; Chen, Chun-Ming; Wu, Ju-Hui; Chen, Hong-Sen

    2016-09-01

    This study aimed to compare the insertion torque (IT), resonance frequency (RF), and removal torque (RT) among three microimplant brands. Thirty microimplants of the three brands were used as follows: Type A (titanium alloy, 1.5-mm × 8-mm), Type B (stainless steel, 1.5-mm × 8-mm), and Type C (titanium alloy, 1.5-mm × 9-mm). A synthetic bone with a 2-mm cortical bone and bone marrow was used. Each microimplant was inserted into the synthetic bone, without predrilling, to a 7 mm depth. The IT, RF, and RT were measured in both vertical and horizontal directions. One-way analysis of variance and Spearman's rank correlation coefficient tests were used for intergroup and intragroup comparisons, respectively. In the vertical test, the ITs of Type C (7.8 Ncm) and Type B (7.5 Ncm) were significantly higher than that of Type A (4.4 Ncm). The RFs of Type C (11.5 kHz) and Type A (10.2 kHz) were significantly higher than that of Type B (7.5 kHz). Type C (7.4 Ncm) and Type B (7.3 Ncm) had significantly higher RTs than did Type A (4.1 Ncm). In the horizontal test, both the ITs and RTs were significantly higher for Type C, compared with Type A. No significant differences were found among the groups, and the study hypothesis was accepted. Type A had the lowest inner/outer diameter ratio and widest apical facing angle, engendering the lowest IT and highest RF values. However, no significant correlations in the IT, RF, and RT were observed among the three groups. Copyright © 2016. Published by Elsevier Taiwan.

  6. Frequency of Testing for Dyslipidemia: An Evidence-Based Analysis

    Science.gov (United States)

    2014-01-01

    Background Dyslipidemias include high levels of total cholesterol, low-density lipoprotein (LDL) cholesterol, and triglycerides and low levels of high-density lipoprotein (HDL) cholesterol. Dyslipidemia is a risk factor for cardiovascular disease, which is a major contributor to mortality in Canada. Approximately 23% of the 2009/11 Canadian Health Measures Survey (CHMS) participants had a high level of LDL cholesterol, with prevalence increasing with age, and approximately 15% had a total cholesterol to HDL ratio above the threshold. Objectives To evaluate the frequency of lipid testing in adults not diagnosed with dyslipidemia and in adults on treatment for dyslipidemia. Research Methods A systematic review of the literature set out to identify randomized controlled trials (RCTs), systematic reviews, health technology assessments (HTAs), and observational studies published between January 1, 2000, and November 29, 2012, that evaluated the frequency of testing for dyslipidemia in the 2 populations. Results Two observational studies assessed the frequency of lipid testing, 1 in individuals not on lipid-lowering medications and 1 in treated individuals. Both studies were based on previously collected data intended for a different objective and, therefore, no conclusions could be reached about the frequency of testing at intervals other than the ones used in the original studies. Given this limitation and generalizability issues, the quality of evidence was considered very low. No evidence for the frequency of lipid testing was identified in the 2 HTAs included. Canadian and international guidelines recommend testing for dyslipidemia in individuals at an increased risk for cardiovascular disease. The frequency of testing recommended is based on expert consensus. Conclusions Conclusions on the frequency of lipid testing could not be made based on the 2 observational studies. Current guidelines recommend lipid testing in adults with increased cardiovascular risk, with

  7. High-resolution analysis of cytosine methylation in ancient DNA.

    Directory of Open Access Journals (Sweden)

    Bastien Llamas

    Full Text Available Epigenetic changes to gene expression can result in heritable phenotypic characteristics that are not encoded in the DNA itself, but rather by biochemical modifications to the DNA or associated chromatin proteins. Interposed between genes and environment, these epigenetic modifications can be influenced by environmental factors to affect phenotype for multiple generations. This raises the possibility that epigenetic states provide a substrate for natural selection, with the potential to participate in the rapid adaptation of species to changes in environment. Any direct test of this hypothesis would require the ability to measure epigenetic states over evolutionary timescales. Here we describe the first single-base resolution of cytosine methylation patterns in an ancient mammalian genome, by bisulphite allelic sequencing of loci from late Pleistocene Bison priscus remains. Retrotransposons and the differentially methylated regions of imprinted loci displayed methylation patterns identical to those derived from fresh bovine tissue, indicating that methylation patterns are preserved in the ancient DNA. Our findings establish the biochemical stability of methylated cytosines over extensive time frames, and provide the first direct evidence that cytosine methylation patterns are retained in DNA from ancient specimens. The ability to resolve cytosine methylation in ancient DNA provides a powerful means to study the role of epigenetics in evolution.

  8. Analysis of the frequency components of X-ray images

    International Nuclear Information System (INIS)

    Matsuo, Satoru; Komizu, Mitsuru; Kida, Tetsuo; Noma, Kazuo; Hashimoto, Keiji; Onishi, Hideo; Masuda, Kazutaka

    1997-01-01

    We examined the relation between the frequency components of x-ray images of the chest and phalanges and their read sizes for digitizing. Images of the chest and phalanges were radiographed using three types of screens and films, and the noise images in background density were digitized with a drum scanner, changing the read sizes. The frequency components of these images were evaluated by applying a two-dimensional Fourier transform to obtain the power spectrum and signal to noise ratio (SNR). After changing the cut-off frequency on the power spectrum to apply a low-pass filter, we also examined the frequency components of the images in terms of the normalized mean square error (NMSE) between the inverse-Fourier-transformed image and the original image. Results showed that the frequency components were 2.0 cycles/mm for the chest image and 6.0 cycles/mm for the phalanges. Therefore, it is necessary to collect data applying read sizes of 200 μm and 50 μm for the chest and phalangeal images, respectively, in order to digitize these images without loss of their frequency components. (author)
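
    The procedure described above can be sketched as follows: take the two-dimensional Fourier transform of an image, apply an ideal low-pass filter at a chosen cut-off frequency, invert, and compute the NMSE against the original image. The random "image", pixel size and cut-off values below are assumptions, not the radiographs of the study.

```python
import numpy as np

def lowpass_nmse(img, cutoff_cycles_per_mm, pixel_mm):
    """Low-pass filter an image in the Fourier domain and return the NMSE."""
    F = np.fft.fft2(img)
    fy = np.fft.fftfreq(img.shape[0], d=pixel_mm)   # spatial frequencies, cycles/mm
    fx = np.fft.fftfreq(img.shape[1], d=pixel_mm)
    radius = np.sqrt(fy[:, None] ** 2 + fx[None, :] ** 2)
    F_filtered = np.where(radius <= cutoff_cycles_per_mm, F, 0.0)
    rec = np.real(np.fft.ifft2(F_filtered))
    return np.sum((img - rec) ** 2) / np.sum(img ** 2)

rng = np.random.default_rng(5)
image = rng.normal(size=(512, 512))                 # stand-in for a digitized noise image
for cutoff in (1.0, 2.0, 4.0, 6.0):                 # cut-off frequencies in cycles/mm
    print("cutoff %.1f cycles/mm -> NMSE %.4f" % (cutoff, lowpass_nmse(image, cutoff, pixel_mm=0.05)))
```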

  9. High-frequency and time resolution rocket observations of structured low- and medium-frequency whistler mode emissions in the auroral ionosphere

    Science.gov (United States)

    LaBelle, J.; McAdams, K. L.; Trimpi, M. L.

    High bandwidth electric field waveform measurements on a recent auroral sounding rocket reveal structured whistler mode signals at 400-800 kHz. These are observed intermittently between 300 and 500 km with spectral densities 0-10 dB above the detection threshold of 1.5×10⁻¹¹ V²/(m²·Hz). The lack of correlation with local particle measurements suggests a remote source. The signals are composed of discrete structures, in one case having bandwidths of about 10 kHz and exhibiting rapid frequency variations of the order of 200 kHz per 100 ms. In one case, emissions near the harmonic of the whistler mode signals are detected simultaneously. Current theories of auroral zone whistler mode emissions have not been applied to explain quantitatively the fine structure of these signals, which resemble auroral kilometric radiation (AKR) rather than auroral hiss.

  10. Time-frequency analysis of band-limited EEG with BMFLC and Kalman filter for BCI applications

    Science.gov (United States)

    2013-01-01

    Background Time-Frequency analysis of electroencephalogram (EEG) during different mental tasks received significant attention. As EEG is non-stationary, time-frequency analysis is essential to analyze brain states during different mental tasks. Further, the time-frequency information of EEG signal can be used as a feature for classification in brain-computer interface (BCI) applications. Methods To accurately model the EEG, band-limited multiple Fourier linear combiner (BMFLC), a linear combination of truncated multiple Fourier series models is employed. A state-space model for BMFLC in combination with Kalman filter/smoother is developed to obtain accurate adaptive estimation. By virtue of construction, BMFLC with Kalman filter/smoother provides accurate time-frequency decomposition of the bandlimited signal. Results The proposed method is computationally fast and is suitable for real-time BCI applications. To evaluate the proposed algorithm, a comparison with short-time Fourier transform (STFT) and continuous wavelet transform (CWT) for both synthesized and real EEG data is performed in this paper. The proposed method is applied to BCI Competition data IV for ERD detection in comparison with existing methods. Conclusions Results show that the proposed algorithm can provide optimal time-frequency resolution as compared to STFT and CWT. For ERD detection, BMFLC-KF outperforms STFT and BMFLC-KS in real-time applicability with low computational requirement. PMID:24274109
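
    A minimal sketch of the BMFLC-with-Kalman-filter idea summarised above: the band-limited signal is modelled as a weighted sum of sines and cosines on a fixed frequency grid, a Kalman filter with a random-walk state model tracks the weights, and the weight magnitudes give a time-frequency amplitude map. The grid spacing, noise covariances and test signal are assumptions, not the authors' settings.

```python
import numpy as np

fs = 256.0                                     # sampling rate, Hz
t = np.arange(0, 4.0, 1 / fs)
# Band-limited test signal: 10 Hz for the first half, 20 Hz for the second half.
x = np.sin(2 * np.pi * 10 * t) * (t < 2) + np.sin(2 * np.pi * 20 * t) * (t >= 2)

freqs = np.arange(8.0, 24.5, 0.5)              # fixed frequency grid covering the band
K = freqs.size
w = np.zeros(2 * K)                            # state: [a_1..a_K, b_1..b_K]
P = np.eye(2 * K)                              # state covariance
Q = np.eye(2 * K) * 1e-4                       # random-walk process noise covariance
R = 0.1                                        # measurement noise variance

amplitude = np.zeros((t.size, K))              # time-frequency amplitude map
for n, tn in enumerate(t):
    H = np.concatenate([np.sin(2 * np.pi * freqs * tn), np.cos(2 * np.pi * freqs * tn)])
    P = P + Q                                  # predict (random-walk model)
    S = H @ P @ H + R                          # innovation variance (scalar measurement)
    Kg = (P @ H) / S                           # Kalman gain
    w = w + Kg * (x[n] - H @ w)                # update weights with the new sample
    P = P - np.outer(Kg, H) @ P
    amplitude[n] = np.hypot(w[:K], w[K:])      # per-frequency amplitude at this time step

print("dominant frequency at t = 1 s: %.1f Hz" % freqs[np.argmax(amplitude[int(1.0 * fs)])])
print("dominant frequency at t = 3 s: %.1f Hz" % freqs[np.argmax(amplitude[int(3.0 * fs)])])
```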

  11. High resolution analysis of northern Patagonia lake sediments

    Science.gov (United States)

    Jarvis, S. W.; Croudace, I. W.; Langdon, P. G.; Rindby, A.

    2009-04-01

    Sediment cores covering the period from the last glacial maximum through the Holocene to the present have been collected from sites in the Chacubuco valley, southern Chile (around 47°08'S, 72°25'W, to the east of the North Patagonian Icecap). Cores were taken from five lakes and one recently dried lake bed. Short cores (0.2 to 0.5m), covering approximately the last two hundred years, were taken from all the lakes. Additionally, long sequences were obtained from one of the lakes and from the dried lake bed, the latter sequence extending back to the last glacial maximum as indicated by thick clay at the base. Each of the lakes are small-medium sized and are open systems situated at 300-1000m above sea level. The shorter cores comprise predominantly clastic gyttja but show a number of distinct changes in colour and chemical composition that suggest major environmental changes over the period of sediment accumulation. This is also reflected in variations in the loss on ignition of samples from the cores and in elemental profiles produced by scanning the cores with the Itrax micro-XRF corescanner at 200μm resolution. The long sequence from the dried lake bed has very low organic content glacial clay at the base, interpreted as last glacial maximum basal clay following determination in the field that this layer exceeded 2m in thickness. Similar sediments occur within a stratigraphically discrete section of approximately 14cm and may relate to a stadial event. The latter section also shows a drop in organic content and appears to be glacial clay incorporating some coarse sandy components indicative of detrital input from the catchment. The second long sequence, from a carbonate lake, includes two mineral layers indicating increased detrital input from the catchment. The deeper and thicker of these layers appears similar to the 14cm layer in the first long sequence, while the upper layer comprises a fine grain size indicative of rock flour and hence also of glacial

  12. IMPROVED DETERMINATION OF THE 1₀-0₀ ROTATIONAL FREQUENCY OF NH₃D⁺ FROM THE HIGH-RESOLUTION SPECTRUM OF THE ν₄ INFRARED BAND

    Energy Technology Data Exchange (ETDEWEB)

    Domenech, J. L.; Cueto, M.; Herrero, V. J.; Tanarro, I. [Molecular Physics Department, Instituto de Estructura de la Materia (IEM-CSIC), Serrano 123, E-28006 Madrid (Spain); Tercero, B.; Cernicharo, J. [Department of Astrophysics, CAB, INTA-CSIC, Crta Torrejon-Ajalvir Km 4, E-28850 Torrejon de Ardoz, Madrid (Spain); Fuente, A., E-mail: jl.domenech@csic.es [Observatorio Astronomico Nacional, Apdo. 112, E-28803 Alcala de Henares (Spain)

    2013-07-01

    The high-resolution spectrum of the ν₄ band of NH₃D⁺ has been measured by difference frequency IR laser spectroscopy in a multipass hollow cathode discharge cell. From the set of molecular constants obtained from the analysis of the spectrum, a value of 262817 ± 6 MHz (±3σ) has been derived for the frequency of the 1₀-0₀ rotational transition. This value supports the assignment to NH₃D⁺ of lines at 262816.7 MHz recorded in radio astronomy observations in Orion-IRc2 and the cold prestellar core B1-bS.

  13. Blind Time-Frequency Analysis for Source Discrimination in Multisensor Array Processing

    National Research Council Canada - National Science Library

    Amin, Moeness

    1999-01-01

    .... We have clearly demonstrated, through analysis and simulations, the offerings of time-frequency distributions in solving key problems in sensor array processing, including direction finding, source...

  14. Detailed Analysis of Torque Ripple in High Frequency Signal Injection based Sensor less PMSM Drives

    Directory of Open Access Journals (Sweden)

    Ravikumar Setty A.

    2017-01-01

    Full Text Available High-frequency signal injection based techniques are robust and well proven for estimating the rotor position from standstill to low speed. However, the injected high-frequency signal introduces high-frequency harmonics in the motor phase currents and results in significant output torque ripple. No detailed analysis exists in the literature of the effect of the injected signal frequency on torque ripple. The objective of this work is to study the torque ripple resulting from high-frequency signal injection in PMSM motor drives. Detailed MATLAB/Simulink simulations are carried out to quantify the torque ripple at different signal frequencies.

  15. Analysis of East Tank Farms Contamination Survey Frequency

    International Nuclear Information System (INIS)

    ELDER, R.E.

    2000-01-01

    This document provides the justification for the change in survey frequency in East Tank Farms occupied contamination areas from weekly to monthly. The Tank Farms Radiological Control Organization has performed radiological surveys of its Contamination Area (CA) Double Shell Tank (DST) farms in 200 East Area on a weekly basis for several years. The task package (DST-W012) controlling these routines designates specific components, at a minimum, that must be surveyed whenever the task is performed. This document presents the evaluation of these survey requirements and provides the recommendation and basis for moving DST tank farms in the 200 East Area from a weekly to a monthly contamination survey. The contamination surveys for occupied contamination areas in West Tank Farms (WTF) were changed from a weekly frequency to a monthly frequency in 1997. Review of contamination survey data in WTF indicates a monthly interval remains satisfactory.

  16. Analysis of factors correlating with medical radiological examination frequencies

    International Nuclear Information System (INIS)

    Jahnen, A.; Jaervinen, H.; Bly, R.; Olerud, H.; Vassilieva, J.; Vogiatzi, S.; Shannoun, F.

    2015-01-01

    The European Commission (EC) funded project Dose Datamed 2 (DDM2) had two objectives: to collect available data on patient doses from radiodiagnostic procedures (X-ray and nuclear medicine) in Europe, and to facilitate the implementation of the Radiation Protection 154 Guidelines (RP154). Besides the collection of frequency and dose data, two questionnaires were issued to gather information about medical radiological imaging. This article analyses a possible correlation between the collected frequency data, selected variables from the results of the detailed questionnaire and national economic data. Based on a dataset of 35 countries, there is no correlation between the gross domestic product (GDP) and the total number of X-ray examinations in a country. However, there is a significant correlation (p < 0.01) between the GDP and the overall CT examination frequency. High-income countries perform more CT examinations per inhabitant. This suggests that planar X-ray examinations are being replaced by CT examinations. (authors)
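
    The type of correlation test reported above can be reproduced with a few lines of code. The sketch below computes a Pearson correlation and its p-value; the numbers are invented for illustration and do not reproduce the DDM2 dataset.

        # Hypothetical illustration of the correlation analysis described above;
        # the values are invented and are not the DDM2 data.
        import numpy as np
        from scipy import stats

        gdp_per_capita = np.array([18.2, 25.4, 31.0, 36.5, 42.1, 47.8, 55.3, 63.0])  # kUSD, invented
        ct_exams_per_1000 = np.array([55, 70, 90, 105, 120, 140, 160, 185])           # invented

        r, p = stats.pearsonr(gdp_per_capita, ct_exams_per_1000)
        print(f"Pearson r = {r:.2f}, p = {p:.4f}")  # p < 0.01 would indicate a significant correlation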

  17. Simulation Analysis of SPWM Variable Frequency Speed Based on Simulink

    Directory of Open Access Journals (Sweden)

    Min-Yan DI

    2014-01-01

    Full Text Available This article studies the sinusoidal pulse width modulation (SPWM) variable frequency speed control system, currently a very active field of research, and focuses on a simulation model of the speed control system built with the MATLAB/Simulink/Power System simulation tools in order to find the best simulation approach. The model is applied to an actual conveyor belt driven by a variable frequency motor; when the simulation results are compared with the measured data, the method is shown to be practical and effective. The results of this research provide guidance for engineering and technical personnel in the CAD design of SPWM variable-voltage variable-frequency (VVVF) drives for asynchronous motors.

  18. Joint time-frequency analysis of ultrasonic signal

    International Nuclear Information System (INIS)

    Oh, Sae Kyu; Nam, Ki Woo; Oh, Jung Hwan; Lee, Keun Chan; Jang, Hong Keun

    1998-01-01

    This paper examines the propagation of Lamb (or plate) waves in anisotropic laminated composite plates. The dispersion relations are explicitly derived using the classical plate theory (CLT), the first-order shear deformation theory (FSDT) and the exact solution (ES). Attention is paid to the lowest antisymmetric (flexural) and lowest symmetric (extensional) modes in the low-frequency, long-wavelength limit. Different values of the shear correction factor were tested in FSDT, and comparisons between flexural wave dispersion curves and exact results were made to assess the range of validity of approximate plate theories in the frequency domain.
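
    As a rough illustration of the low-frequency, long-wavelength limit discussed above, classical plate theory for an isotropic plate gives a dispersive flexural (A0-like) phase velocity c = sqrt(omega) * (D/(rho*h))**(1/4) and a non-dispersive extensional (S0-like) plate velocity c = sqrt(E/(rho*(1 - nu**2))). The sketch below evaluates both for an assumed aluminium plate; it is not the laminated composite case treated in the paper.

        # Minimal sketch of low-frequency Lamb wave phase velocities from classical
        # plate theory (CLT), assuming an isotropic aluminium plate (not the
        # composite laminate analysed in the paper).
        import numpy as np

        E, nu, rho, h = 70e9, 0.33, 2700.0, 2e-3   # assumed plate properties (Pa, -, kg/m^3, m)
        D = E * h**3 / (12 * (1 - nu**2))          # flexural rigidity

        f = np.linspace(1e3, 100e3, 5)             # frequency range, Hz
        omega = 2 * np.pi * f

        c_flexural = np.sqrt(omega) * (D / (rho * h))**0.25   # A0-like, dispersive
        c_extensional = np.sqrt(E / (rho * (1 - nu**2)))      # S0-like, non-dispersive limit

        for fi, cf in zip(f, c_flexural):
            print(f"f = {fi/1e3:5.1f} kHz  c_A0 ~ {cf:7.1f} m/s  c_S0 ~ {c_extensional:7.1f} m/s")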

  19. Analysis of Various Frequency Selective Shielding Glass by FDTD method

    OpenAIRE

    笠嶋, 善憲; Kasashima, Yoshinori

    2012-01-01

    A frequency selective shielding (FSS) glass is a sheet of glass printed with many identical antenna elements, and it has high shielding properties at one specific frequency. In this work, the author analyzed the characteristics of various FSSs with different antenna types by the FDTD method. The antenna types are cross dipole, circular loop, square loop, circular patch, and square patch. As a result, the FSSs can be composed of the various types of antennas, and they have broad-band shielding characteristics.

  20. On the use of the term 'frequency' in applied behavior analysis.

    Science.gov (United States)

    Carr, James E; Nosik, Melissa R; Luke, Molli M

    2018-04-01

    There exists a terminological problem in applied behavior analysis: the term frequency has been used as a synonym for both rate (the number of responses per time) and count (the number of responses). To guide decisions about the use and meaning of frequency, we surveyed the usage of frequency in contemporary behavior-analytic journals and textbooks and found that the predominant usage of frequency was as count, not rate. Thus, we encourage behavior analysts to use frequency as a synonym for count. © 2018 Society for the Experimental Analysis of Behavior.

  1. Resolution of potential ambiguities through farside angular structure: Semiclassical analysis

    International Nuclear Information System (INIS)

    Fricke, S.H.; Brandan, M.E.; McVoy, K.W.

    1988-01-01

    The optical potential fits summarized in the preceding paper are subjected to a semiclassical analysis of the Ford-Wheeler--Knoll-Schaeffer type. The important broad dips in their farside cross sections, which are essential in greatly reducing potential ambiguities, are found (in partial agreement with a suggestion of Goldberg's) to be mainly weak ''Airy'' or rainbow minima, that serve to identify deeply penetrating trajectories. The semiclassical analysis also permits the identification and understanding of a new category of discrete and continuous potential ambiguities, and suggests the manner in which specific features of the angular distributions (such as spacings and depths of various angular minima) determine the Woods-Saxon parameters found by a chi-squared search

  2. Lake-level frequency analysis for Devils Lake, North Dakota

    Science.gov (United States)

    Wiche, Gregg J.; Vecchia, Aldo V.

    1996-01-01

    Two approaches were used to estimate future lake-level probabilities for Devils Lake. The first approach is based on an annual lake-volume model, and the second approach is based on a statistical water mass-balance model that generates seasonal lake volumes on the basis of seasonal precipitation, evaporation, and inflow. Autoregressive moving average models were used to model the annual mean lake volume and the difference between the annual maximum lake volume and the annual mean lake volume. Residuals from both models were determined to be uncorrelated with zero mean and constant variance. However, a nonlinear relation between the residuals of the two models was included in the final annual lake-volume model. Because of high autocorrelation in the annual lake levels of Devils Lake, the annual lake-volume model was verified using annual lake-level changes. The annual lake-volume model closely reproduced the statistics of the recorded lake-level changes for 1901-93 except for the skewness coefficient. However, the model output is less skewed than the data indicate because of some unrealistically large lake-level declines. The statistical water mass-balance model requires as inputs seasonal precipitation, evaporation, and inflow data for Devils Lake. Analysis of annual precipitation, evaporation, and inflow data for 1950-93 revealed no significant trends or long-range dependence so the input time series were assumed to be stationary and short-range dependent. Normality transformations were used to approximately maintain the marginal probability distributions; and a multivariate, periodic autoregressive model was used to reproduce the correlation structure. Each of the coefficients in the model is significantly different from zero at the 5-percent significance level. Coefficients relating spring inflow from one year to spring and fall inflows from the previous year had the largest effect on the lake-level frequency analysis. Inclusion of parameter uncertainty in the model

  3. On the resolution of ECG acquisition systems for the reliable analysis of the P-wave

    International Nuclear Information System (INIS)

    Censi, Federica; Calcagnini, Giovanni; Mattei, Eugenio; Triventi, Michele; Bartolini, Pietro; Corazza, Ivan; Boriani, Giuseppe

    2012-01-01

    The analysis of the P-wave on surface ECG is widely used to assess the risk of atrial arrhythmias. In order to provide reliable results, the automatic analysis of the P-wave must be precise and reliable and must take into account technical aspects, one of those being the resolution of the acquisition system. The aim of this note is to investigate the effects of the amplitude resolution of ECG acquisition systems on the P-wave analysis. Starting from ECGs recorded by an acquisition system with a least significant bit (LSB) of 31 nV (24 bit on an input range of 524 mVpp), we reproduced an ECG signal as acquired by systems with lower resolution (16, 15, 14, 13 and 12 bit). We found that, when the LSB is of the order of 128 µV (12 bit), a single P-wave is not recognizable on the ECG. However, when averaging is applied, a P-wave template can be extracted, apparently suitable for the P-wave analysis. Results obtained in terms of P-wave duration and morphology revealed that the analysis of ECG at the lowest resolutions (from 12 to 14 bit, LSB higher than 30 µV) could lead to misleading results. However, the resolution used nowadays in modern electrocardiographs (15 and 16 bit, LSB <10 µV) is sufficient for the reliable analysis of the P-wave. (note)
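
    The amplitude resolution figures quoted above follow directly from the input range and the bit depth (LSB = range / 2^bits). A small sketch of that arithmetic, assuming the 524 mVpp input range stated in the abstract:

        # Quick check of the least-significant-bit (LSB) values quoted above,
        # assuming a 524 mVpp input range as stated in the abstract.
        input_range_v = 0.524  # 524 mVpp

        for bits in (24, 16, 15, 14, 13, 12):
            lsb = input_range_v / 2**bits
            print(f"{bits:2d} bit -> LSB = {lsb*1e6:9.3f} uV")

        # 24 bit gives ~0.031 uV (31 nV); 12 bit gives ~128 uV, at which level a
        # single P-wave is no longer recognizable without averaging.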

  4. Analysis of nonlinear behavior of loudspeakers using the instantaneous frequency

    DEFF Research Database (Denmark)

    Huang, Hai; Jacobsen, Finn

    2003-01-01

    on the Fourier transform. In this work, a new method using the instantaneous frequency is introduced for describing and characterizing loudspeaker nonlinearities. First, numerical integration is applied to simulate the nonlinearities of loudspeakers caused by two nonlinear parameters, force factor and stiffness...
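
    The abstract is truncated, but the instantaneous frequency it refers to is commonly estimated from the analytic signal obtained with the Hilbert transform. A minimal sketch under that assumption (it is not necessarily the exact procedure used by the authors):

        # Minimal sketch: instantaneous frequency of a signal via the analytic
        # signal (Hilbert transform). Generic estimator, not the paper's code.
        import numpy as np
        from scipy.signal import hilbert

        fs = 8000.0
        t = np.arange(0, 1.0, 1/fs)
        # toy "loudspeaker" signal: a 100 Hz tone with a weak 2nd harmonic (nonlinearity)
        x = np.sin(2*np.pi*100*t) + 0.1*np.sin(2*np.pi*200*t)

        analytic = hilbert(x)
        phase = np.unwrap(np.angle(analytic))
        inst_freq = np.diff(phase) / (2*np.pi) * fs   # Hz

        print(f"mean instantaneous frequency: {inst_freq.mean():.1f} Hz")
        print(f"peak-to-peak ripple caused by the harmonic: {np.ptp(inst_freq):.1f} Hz")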

  5. Rectifier analysis for radio frequency energy harvesting and power transport

    NARCIS (Netherlands)

    Keyrouz, S.; Visser, H.J.; Tijhuis, A.G.

    2012-01-01

    Wireless Power Transmission (WPT) is an attractive powering method for wireless sensor nodes, battery-less sensors, and Radio-Frequency Identification (RFID) tags. The key element on the receiving side of a WPT system is the rectifying antenna (rectenna) which captures the electromagnetic power and

  6. Evaluation and Analysis of Frequency of Transformer Failures ...

    African Journals Online (AJOL)

    Abstract. The frequency of failed distribution transformers in Power Holding Company of Nigeria Plc, Akpakpava Business Unit network, Benin City, for a period of two years has been investigated in this work. The frequent power outages recorded in our communities result in customer dissatisfaction, economic losses, ...

  7. Alcohol marketing in televised international football: frequency analysis.

    Science.gov (United States)

    Adams, Jean; Coleman, James; White, Martin

    2014-05-20

    Alcohol marketing includes sponsorship of individuals, organisations and sporting events. Football (soccer) is one of the most popular spectator sports worldwide. No previous studies have quantified the frequency of alcohol marketing in a high profile international football tournament. The aims were to determine: the frequency and nature of visual references to alcohol in a representative sample of EURO2012 matches broadcast in the UK; and if frequency or nature varied between matches broadcast on public service and commercial channels, or between matches that did and did not feature England. Eight matches selected by stratified random sampling were recorded. All visual references to alcohol were identified using a tool with high inter-rater reliability. 1846 visual references to alcohol were identified over 1487 minutes of broadcast--an average of 1.24 references per minute. The mean number of references per minute was higher in matches that did vs did not feature England (p = 0.004), but did not differ between matches broadcast on public service vs commercial channels (p = 0.92). The frequency of visual references to alcohol was universally high and higher in matches featuring the only UK home team--England--suggesting that there may be targeting of particularly highly viewed matches. References were embedded in broadcasts, and not particular to commercial channels including paid-for advertising. New UK codes-of-conduct on alcohol marketing at sporting events will not reduce the level of marketing reported here.

  8. Allele frequency analysis of Chinese chestnut ( Castanea mollissima ...

    African Journals Online (AJOL)

    The aim of this study was to establish a method for allele frequency detection in bulk samples. The abundance of polymerase chain reaction (PCR) products in bulk leaf samples was detected using fluorescent labeled Simple sequence repeat (SSR) primers and an Applied biosystems (AB) automatic DNA analyzer.

  9. Word Inventory and Frequency Analysis of French Conversations.

    Science.gov (United States)

    Malecot, Andre

    This word frequency list was extracted from a corpus of fifty half-hour conversations recorded in Paris during the academic year 1967-68. The speakers, who did not know that they were being recorded, were all well-educated professionals and all speakers of the most standard dialect of French. The list is made up of all phonetically discrete words…

  10. The analysis of cable forces based on natural frequency

    Science.gov (United States)

    Suangga, Made; Hidayat, Irpan; Juliastuti; Bontan, Darwin Julius

    2017-12-01

    A cable is a flexible structural member that is effective at resisting tensile forces. Cables are used in a variety of structures that exploit their unique characteristics to create efficient tension members. The state of the cable forces in a cable-supported structure is an important indicator of whether the structure is in good condition. Several methods have been developed to measure cable forces on site. The vibration technique, which uses the correlation between natural frequency and cable force, is a simple method to determine in situ cable forces; however, it needs accurate information on the boundary conditions, cable mass, and cable length. The natural frequency of the cable is determined by applying the FFT (Fast Fourier Transform) to the acceleration record of the cable. Based on the natural frequency obtained, the cable forces can then be determined analytically or with a finite element program. This research focuses on the vibration technique for determining cable forces, on understanding the effect of the cable's physical parameters, and on modelling techniques relating the natural frequency to the cable forces.
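
    For a taut cable with negligible bending stiffness and sag and pinned ends, the analytical relation referred to above reduces to f_n = (n/2L)*sqrt(T/m), so the tension can be back-calculated as T = 4*m*L^2*(f_n/n)^2. A sketch of that estimate with invented cable parameters (a real assessment would also account for sag and bending stiffness):

        # Taut-string estimate of cable tension from a measured natural frequency,
        # ignoring bending stiffness and sag; the numbers are invented examples.
        L = 50.0          # cable length, m (assumed)
        m = 60.0          # mass per unit length, kg/m (assumed)
        f_measured = 1.8  # measured natural frequency of mode n, Hz (e.g. from an FFT peak)
        n = 1             # mode order associated with that frequency

        T = 4.0 * m * L**2 * (f_measured / n)**2   # tension, N
        print(f"estimated cable force: {T/1e3:.0f} kN")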

  11. Deconvolution-based resolution enhancement of chemical ice core records obtained by continuous flow analysis

    DEFF Research Database (Denmark)

    Rasmussen, Sune Olander; Andersen, Katrine K.; Johnsen, Sigfus Johann

    2005-01-01

    Continuous flow analysis (CFA) has become a popular measuring technique for obtaining high-resolution chemical ice core records due to an attractive combination of measuring speed and resolution. However, when analyzing the deeper sections of ice cores or cores from low-accumulation areas...... of the data for high-resolution studies such as annual layer counting. The presented method uses deconvolution techniques and is robust to the presence of noise in the measurements. If integrated into the data processing, it requires no additional data collection. The method is applied to selected ice core...
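
    The record does not give the algorithm in detail, but a noise-robust deconvolution of a smoothed CFA-like signal can be sketched with a Wiener filter, assuming the mixing in the tubing acts approximately as a Gaussian impulse response. This is only an illustration of the general idea, not the authors' exact implementation.

        # Illustrative Wiener deconvolution of a smoothed (CFA-like) record,
        # assuming a Gaussian impulse response; not the authors' exact method.
        import numpy as np

        n = 1024
        depth = np.arange(n)
        true_signal = (np.sin(2*np.pi*depth/40) > 0).astype(float)   # toy annual layers

        # assumed Gaussian smoothing kernel (mixing in the CFA tubing)
        sigma = 6.0
        kernel = np.exp(-0.5*((np.arange(n) - n//2)/sigma)**2)
        kernel /= kernel.sum()

        H = np.fft.fft(np.fft.ifftshift(kernel))                     # zero-phase transfer function
        measured = np.real(np.fft.ifft(np.fft.fft(true_signal) * H))
        measured += np.random.default_rng(0).normal(0, 0.02, n)      # measurement noise

        snr = 100.0                                                  # assumed signal-to-noise power ratio
        G = np.conj(H) / (np.abs(H)**2 + 1.0/snr)                    # Wiener deconvolution filter
        restored = np.real(np.fft.ifft(np.fft.fft(measured) * G))

        print("correlation with true signal, measured :", np.corrcoef(true_signal, measured)[0, 1].round(3))
        print("correlation with true signal, restored :", np.corrcoef(true_signal, restored)[0, 1].round(3))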

  12. Q resolution calculation of small angle neutron scattering spectrometer and analysis of form factor

    International Nuclear Information System (INIS)

    Chen Liang; Peng Mei; Wang Yan; Sun Liangwei; Chen Bo

    2011-01-01

    The methods for calculating the Q resolution function of a Small Angle Neutron Scattering (SANS) spectrometer and the corresponding Q standard deviation were introduced. The effects on the Q standard deviation of the spectrometer geometry and of the spread of neutron wavelength were analysed. The one-dimensional Gaussian Q resolution function was analysed. The form factor curve of an ideal solid sphere was convolved with the resolution function for two different sets of instrument arrangement parameters, and the corresponding smeared form factor curves were obtained. Using the Q resolution function allows SANS data to be analysed more accurately. (authors)
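
    A common way to express the smearing described above is to convolve the ideal form factor I(Q) with a Gaussian resolution function of width sigma_Q at each nominal Q. The sketch below does this for an ideal solid sphere with an assumed constant relative resolution; the actual sigma_Q(Q) of a spectrometer combines the geometric terms and the wavelength spread.

        # Gaussian Q-resolution smearing of the ideal solid-sphere form factor.
        # A constant relative width sigma_Q = 0.05*Q is assumed for illustration;
        # a real instrument combines geometry and wavelength-spread contributions.
        import numpy as np

        def sphere_form_factor(q, radius):
            qr = q * radius
            amp = 3.0 * (np.sin(qr) - qr * np.cos(qr)) / qr**3
            return amp**2

        R = 50.0                                   # sphere radius, angstrom (assumed)
        q = np.linspace(1e-3, 0.3, 600)            # 1/angstrom
        ideal = sphere_form_factor(q, R)

        smeared = np.empty_like(ideal)
        for i, q0 in enumerate(q):
            sigma = 0.05 * q0                      # assumed resolution width
            w = np.exp(-0.5 * ((q - q0) / sigma)**2)
            smeared[i] = np.sum(w * ideal) / np.sum(w)

        # the sharp minima of the ideal curve are filled in by the resolution function
        print("ideal form factor at its first minimum :", ideal.min())
        print("same region after resolution smearing  :", smeared[np.argmin(ideal)])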

  13. Interpretation of measured data and the resolution analysis of the RTP 4-channel pulsed radar

    International Nuclear Information System (INIS)

    Pavlo, P.

    1993-01-01

    The resolution of a 4-channel pulsed radar being built at Rijnhuisen for the RTP tokamak is analyzed. The achievable resolution mainly depends on the accuracy of the time-of-flight measurements and the number of sampling frequencies; since the technological solution and the configuration have already been set, emphasis is put on interpretation of the measured data (the inversion problem) and minimization of the overall error. For this purpose, a specific neural network - the Multi Layer Perceptron (MLP) - has successfully been applied. Central density in the range of 0.2-0.6 x 10^20 m^-3 was considered, i.e., one above the critical density for all four frequencies but not so high as to restrict the measurements to just the edge of the plasma. By balancing the inversion error and the time measurement error, for a wide class of density profiles the overall error in estimating the reflection point position of between 0.72 cm (for the lowest frequency) and 0.52 cm (for the highest frequency) root mean square was obtained, assuming an RMS error of 70 ps in the time of flight measurements. This is probably much better than what could be obtained by the Abel transform. Moreover, mapping with the MLP is considerably faster, and it should be considered for routine multichannel pulsed radar data processing. (author) 2 tabs., 4 figs., 6 refs

  14. Resolution analysis of archive films for the purpose of their optimal digitization and distribution

    Science.gov (United States)

    Fliegel, Karel; Vítek, Stanislav; Páta, Petr; Myslík, Jiří; Pecák, Josef; Jícha, Marek

    2017-09-01

    With recent high demand for ultra-high-definition (UHD) content to be screened in high-end digital movie theaters but also in the home environment, film archives full of movies in high-definition and above are in the scope of UHD content providers. Movies captured with the traditional film technology represent a virtually unlimited source of UHD content. The goal to maintain complete image information is also related to the choice of scanning resolution and spatial resolution for further distribution. It might seem that scanning the film material in the highest possible resolution using state-of-the-art film scanners and also its distribution in this resolution is the right choice. The information content of the digitized images is however limited, and various degradations moreover lead to its further reduction. Digital distribution of the content in the highest image resolution might be therefore unnecessary or uneconomical. In other cases, the highest possible resolution is inevitable if we want to preserve fine scene details or film grain structure for archiving purposes. This paper deals with the image detail content analysis of archive film records. The resolution limit in captured scene image and factors which lower the final resolution are discussed. Methods are proposed to determine the spatial details of the film picture based on the analysis of its digitized image data. These procedures allow determining recommendations for optimal distribution of digitized video content intended for various display devices with lower resolutions. Obtained results are illustrated on spatial downsampling use case scenario, and performance evaluation of the proposed techniques is presented.

  15. Lattice and strain analysis of atomic resolution Z-contrast images based on template matching

    Energy Technology Data Exchange (ETDEWEB)

    Zuo, Jian-Min, E-mail: jianzuo@uiuc.edu [Department of Materials Science and Engineering, University of Illinois, Urbana, IL 61801 (United States); Seitz Materials Research Laboratory, University of Illinois, Urbana, IL 61801 (United States); Shah, Amish B. [Center for Microanalysis of Materials, Materials Research Laboratory, University of Illinois at Urbana-Champaign, Urbana, IL 61801 (United States); Kim, Honggyu; Meng, Yifei; Gao, Wenpei [Department of Materials Science and Engineering, University of Illinois, Urbana, IL 61801 (United States); Seitz Materials Research Laboratory, University of Illinois, Urbana, IL 61801 (United States); Rouviére, Jean-Luc [CEA-INAC/UJF-Grenoble UMR-E, SP2M, LEMMA, Minatec, Grenoble 38054 (France)

    2014-01-15

    A real space approach is developed based on template matching for quantitative lattice analysis using atomic resolution Z-contrast images. The method, called TeMA, uses the template of an atomic column, or a group of atomic columns, to transform the image into a lattice of correlation peaks. This is helped by using a local intensity adjusted correlation and by the design of templates. Lattice analysis is performed on the correlation peaks. A reference lattice is used to correct for scan noise and scan distortions in the recorded images. Using these methods, we demonstrate that a precision of a few picometers is achievable in lattice measurement using aberration corrected Z-contrast images. For application, we apply the methods to strain analysis of a molecular beam epitaxy (MBE) grown LaMnO{sub 3} and SrMnO{sub 3} superlattice. The results show alternating epitaxial strain inside the superlattice and its variations across interfaces at the spatial resolution of a single perovskite unit cell. Our methods are general, model free and provide high spatial resolution for lattice analysis. - Highlights: • A real space approach is developed for strain analysis using atomic resolution Z-contrast images and template matching. • A precision of a few picometers is achievable in the measurement of lattice displacements. • The spatial resolution of a single perovskite unit cell is demonstrated for a LaMnO{sub 3} and SrMnO{sub 3} superlattice grown by MBE.
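
    The core of the approach described above, locating atomic columns as correlation peaks between the image and a column template, can be sketched generically with zero-mean cross-correlation and local peak picking. This illustration uses a synthetic lattice image and is not the TeMA code itself.

        # Generic template-matching sketch: find atomic-column positions as peaks
        # of the zero-mean cross-correlation between an image and a column template.
        # Synthetic data; this is not the TeMA implementation described above.
        import numpy as np
        from scipy.signal import correlate2d
        from scipy.ndimage import maximum_filter

        # synthetic "Z-contrast" image: Gaussian columns on a square lattice
        size, pitch, width = 128, 16, 2.5
        y, x = np.mgrid[0:size, 0:size]
        image = np.zeros((size, size))
        for cy in range(pitch//2, size, pitch):
            for cx in range(pitch//2, size, pitch):
                image += np.exp(-((x-cx)**2 + (y-cy)**2) / (2*width**2))
        image += np.random.default_rng(1).normal(0, 0.05, image.shape)

        # template: one atomic column
        t = 8
        ty, tx = np.mgrid[-t:t+1, -t:t+1]
        template = np.exp(-(tx**2 + ty**2) / (2*width**2))

        # zero-mean (intensity-adjusted) cross-correlation
        corr = correlate2d(image - image.mean(), template - template.mean(), mode="same")

        # local maxima above a threshold are taken as lattice sites
        peaks = (corr == maximum_filter(corr, size=pitch//2)) & (corr > 0.5*corr.max())
        rows, cols = np.nonzero(peaks)
        print(f"found {rows.size} correlation peaks (expected about {(size//pitch)**2} columns)")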

  16. Econometric analysis of realised covariation: high frequency covariance, regression and correlation in financial economics

    OpenAIRE

    Ole E. Barndorff-Nielsen; Neil Shephard

    2002-01-01

    This paper analyses multivariate high frequency financial data using realised covariation. We provide a new asymptotic distribution theory for standard methods such as regression, correlation analysis and covariance. It will be based on a fixed interval of time (e.g. a day or week), allowing the number of high frequency returns during this period to go to infinity. Our analysis allows us to study how high frequency correlations, regressions and covariances change through time. In particular w...
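
    The basic realised measures discussed above are simple sums over intraday returns: realised variance is the sum of squared returns, realised covariance the sum of cross-products, and realised correlation and regression coefficients follow from those. A toy sketch with simulated intraday returns (the paper's asymptotic distribution theory is not reproduced here):

        # Toy sketch of realised covariance, correlation and regression from
        # simulated high-frequency returns over one fixed interval (e.g. a day).
        import numpy as np

        rng = np.random.default_rng(42)
        n = 288                                    # e.g. 5-minute returns over 24 hours
        common = rng.normal(0, 1e-3, n)
        r_a = common + rng.normal(0, 5e-4, n)      # asset A intraday returns
        r_b = 0.8*common + rng.normal(0, 5e-4, n)  # asset B intraday returns

        rv_a = np.sum(r_a**2)                      # realised variance of A
        rv_b = np.sum(r_b**2)                      # realised variance of B
        rcov = np.sum(r_a * r_b)                   # realised covariance

        realised_corr = rcov / np.sqrt(rv_a * rv_b)
        realised_beta = rcov / rv_b                # realised regression of A on B

        print(f"realised covariance : {rcov:.3e}")
        print(f"realised correlation: {realised_corr:.3f}")
        print(f"realised beta (A~B) : {realised_beta:.3f}")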

  17. Low-frequency computational electromagnetics for antenna analysis

    Energy Technology Data Exchange (ETDEWEB)

    Miller, E.K. (Los Alamos National Lab., NM (USA)); Burke, G.J. (Lawrence Livermore National Lab., CA (USA))

    1991-01-01

    An overview of low-frequency, computational methods for modeling the electromagnetic characteristics of antennas is presented here. The article presents a brief analytical background, and summarizes the essential ingredients of the method of moments, for numerically solving low-frequency antenna problems. Some extensions to the basic models of perfectly conducting objects in free space are also summarized, followed by a consideration of some of the same computational issues that affect model accuracy, efficiency and utility. A variety of representative computations are then presented to illustrate various modeling aspects and capabilities that are currently available. A fairly extensive bibliography is included to suggest further reference material to the reader. 90 refs., 27 figs.

  18. LSD-based analysis of high-resolution stellar spectra

    Science.gov (United States)

    Tsymbal, V.; Tkachenko, A.; Van Reeth, T.

    2014-11-01

    We present a generalization of the method of least-squares deconvolution (LSD), a powerful tool for extracting high S/N average line profiles from stellar spectra. The generalization of the method is effected by extending it towards the multiprofile LSD and by introducing the possibility of correcting the line strengths from the initial mask. We illustrate the new approach by two examples: (a) the detection of asteroseismic signatures from low S/N spectra of single stars, and (b) disentangling spectra of multiple stellar objects. The analysis is applied to spectra obtained with 2-m class telescopes in the course of spectroscopic ground-based support for space missions such as CoRoT and Kepler. Usually, rather high S/N is required, so smaller telescopes can only compete successfully with more advanced ones when one can apply a technique that enables a remarkable increase in the S/N of the spectra which they observe. Since the LSD profiles have a potential for reconstructing what is common to all the spectral profiles, they should have a particular practical application to faint stars observed with 2-m class telescopes whose spectra show remarkable LPVs.

  19. Is analysis of biological materials with nm spatial resolution possible?

    International Nuclear Information System (INIS)

    Warley, Alice

    2006-01-01

    Cells are bounded by a membrane, the plasma membrane, subcompartments within cells are also delineated by membranes, these membranes contain transporters that regulate the flow of ions across them. Fluxes of ions across the membranes underlie many of the basic properties of living material such as excitability and movement. Breakdown of membrane function ultimately leads to cell death. EM microanalysis has been instrumental in gaining understanding of how changes in element distribution affect cell behaviour and cell survival. The main problem that biologists face in undertaking such studies is that of specimen preparation. Cells consist mainly of water that needs to be either removed or stabilised before analysis can take place. Cryotechniques, fixation by rapid freezing followed by sectioning at low temperatures and freeze-drying of the sections have proved to be a reliable method for the study of intracellular element concentrations. These techniques have been used to show that elements are confined in different compartments within cells and produced results to support a new theory on the mechanism by which neutrophils kill bacteria. They have also shown that disturbance of the ionic content of mitochondria is one of the first signs in the pathway to cell death

  20. Patellofemoral pain syndrome: electromyography in a frequency domain analysis

    Science.gov (United States)

    Catelli, D. S.; Kuriki, H. U.; Polito, L. F.; Azevedo, F. M.; Negrão Filho, R. F.; Alves, N.

    2011-09-01

    Patellofemoral Pain Syndrome (PFPS) has a multifactorial etiology and affects approximately 7 to 15% of the population, mostly women, youth, adults and active persons. PFPS causes anterior or retropatellar pain that is exacerbated during functional motor gestures, such as going up and down stairs or spending long periods of time sitting, squatting or kneeling. As the diagnostic evaluation of this syndrome is still indirect, different mechanisms and methodologies try to establish a classification that distinguishes patients with PFPS from asymptomatic individuals. Therefore, the purpose of this investigation was to determine the characteristics of the electromyographic (EMG) signal in the frequency domain of the vastus medialis oblique (VMO) and vastus lateralis (VL) in patients with PFPS, during the ascent of stairs. 33 young women (22 control group and 11 PFPS group) were evaluated by EMG during ascent of stairs. The VMO mean power frequency (MPF) and the VL frequency 95% (F95) were lower in symptomatic individuals. This may be related to the difference in muscle recruitment strategy exerted by each muscle in the PFPS group compared to the control group.
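
    The two spectral descriptors used above, the mean power frequency (MPF) and the frequency below which 95% of the signal power lies (F95), can both be computed from a power spectral density estimate. A generic sketch with a synthetic EMG-like signal (not the study's data or processing chain):

        # Mean power frequency (MPF) and 95% power frequency (F95) from a Welch
        # power spectral density; synthetic EMG-like signal used for illustration.
        import numpy as np
        from scipy.signal import welch

        fs = 1000.0
        t = np.arange(0, 5, 1/fs)
        rng = np.random.default_rng(3)
        # band-limited noise as a crude stand-in for a surface EMG burst
        emg = np.convolve(rng.normal(size=t.size), np.ones(8)/8, mode="same")

        f, pxx = welch(emg, fs=fs, nperseg=1024)

        mpf = np.sum(f * pxx) / np.sum(pxx)                 # mean power frequency
        cum = np.cumsum(pxx) / np.sum(pxx)
        f95 = f[np.searchsorted(cum, 0.95)]                 # 95% power frequency

        print(f"MPF = {mpf:.1f} Hz, F95 = {f95:.1f} Hz")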

  1. High-resolution quantization based on soliton self-frequency shift and spectral compression in a bi-directional comb-fiber architecture

    Science.gov (United States)

    Zhang, Xuyan; Zhang, Zhiyao; Wang, Shubing; Liang, Dong; Li, Heping; Liu, Yong

    2018-03-01

    We propose and demonstrate an approach that can achieve high-resolution quantization by employing soliton self-frequency shift and spectral compression. Our approach is based on a bi-directional comb-fiber architecture which is composed of a Sagnac-loop-based mirror and a comb-like combination of N sections of interleaved single-mode fibers and highly nonlinear fibers. The Sagnac-loop-based mirror placed at the terminal of a bus line reflects the optical pulses back to the bus line to achieve additional N-stage spectral compression; thus single-stage soliton self-frequency shift (SSFS) and (2N - 1)-stage spectral compression are realized in the bi-directional scheme. The fiber length in the architecture is numerically optimized, and the proposed quantization scheme is evaluated by both simulation and experiment in the case of N = 2. In the experiment, a quantization resolution of 6.2 bits is obtained, which is 1.2 bits higher than that of its uni-directional counterpart.

  2. Analysis strategies for high-resolution UHF-fMRI data.

    Science.gov (United States)

    Polimeni, Jonathan R; Renvall, Ville; Zaretskaya, Natalia; Fischl, Bruce

    2018-03-01

    Functional MRI (fMRI) benefits from both increased sensitivity and specificity with increasing magnetic field strength, making it a key application for Ultra-High Field (UHF) MRI scanners. Most UHF-fMRI studies utilize the dramatic increases in sensitivity and specificity to acquire high-resolution data reaching sub-millimeter scales, which enable new classes of experiments to probe the functional organization of the human brain. This review article surveys advanced data analysis strategies developed for high-resolution fMRI at UHF. These include strategies designed to mitigate distortion and artifacts associated with higher fields in ways that attempt to preserve spatial resolution of the fMRI data, as well as recently introduced analysis techniques that are enabled by these extremely high-resolution data. Particular focus is placed on anatomically-informed analyses, including cortical surface-based analysis, which are powerful techniques that can guide each step of the analysis from preprocessing to statistical analysis to interpretation and visualization. New intracortical analysis techniques for laminar and columnar fMRI are also reviewed and discussed. Prospects for single-subject individualized analyses are also presented and discussed. Altogether, there are both specific challenges and opportunities presented by UHF-fMRI, and the use of proper analysis strategies can help these valuable data reach their full potential. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Do lateral boundary condition update frequency and the resolution of the boundary data affect the regional model COSMO-CLM? A sensitivity study.

    Science.gov (United States)

    Pankatz, K.; Kerkweg, A.

    2014-12-01

    The work presented is part of the joint project "DecReg" ("Regional decadal predictability") which is in turn part of the project "MiKlip" ("Decadal predictions"), an effort funded by the German Federal Ministry of Education and Research to improve decadal predictions on a global and regional scale. In regional climate modeling it is common to update the lateral boundary conditions (LBC) of the regional model every six hours. This is mainly due to the fact that reference data sets like ERA are only available every six hours. Additionally, for offline coupling procedures it would be too costly to store LBC data in higher temporal resolution for climate simulations. However, theoretically, the coupling frequency could be as high as the time step of the driving model. Meanwhile, it is unclear if a more frequent update of the LBC has a significant effect on the climate in the domain of the regional model (RCM). This study uses the RCM COSMO-CLM/MESSy (Kerkweg and Jöckel, 2012) to couple COSMO-CLM offline to the GCM ECHAM5. One study examines a 30-year time slice experiment for three update frequencies of the LBC, namely six hours, one hour and six minutes. The evaluation of means, standard deviations and statistics of the climate in the regional domain shows only small deviations, some statistically significant though, of 2m temperature, sea level pressure and precipitation. The second scope of the study assesses parameters linked to cyclone activity, which is affected by the LBC update frequency. Differences in track density and strength are found when comparing the simulations. The second study examines the quality of decadal hindcasts of the decade 2001-2010 when the horizontal resolution of the driving model, namely T42, T63, T85, T106, from which the LBC are calculated, is altered. Two sets of simulations are evaluated. For the first set of simulations, the GCM simulations are performed at different resolutions using the same boundary conditions for GHGs and SSTs, thus

  4. Bilinear Time-frequency Analysis for Lamb Wave Signal Detected by Electromagnetic Acoustic Transducer

    Science.gov (United States)

    Sun, Wenxiu; Liu, Guoqiang; Xia, Hui; Xia, Zhengwu

    2018-03-01

    Accurate acquisition of the detection signal travel time plays a very important role in cross-hole tomography. An experimental platform with an aluminum plate under a perpendicular magnetic field is established, and the bilinear time-frequency analysis methods, the Wigner-Ville distribution (WVD) and the pseudo-Wigner-Ville distribution (PWVD), are applied to analyse the Lamb wave signals detected by an electromagnetic acoustic transducer (EMAT). By extracting the component of the time-frequency spectrum at the same frequency as the excitation, the travel time information can be obtained. In comparison with traditional linear time-frequency analysis methods such as the short-time Fourier transform (STFT), the bilinear method PWVD is more appropriate for extracting travel time and recognizing patterns of Lamb waves.
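
    The discrete Wigner-Ville distribution used above can be written as the Fourier transform, over the lag variable, of the instantaneous autocorrelation of the analytic signal; the pseudo-WVD simply windows the lag before transforming. A compact sketch of that definition (illustrative, not the authors' processing chain):

        # Compact discrete (pseudo-)Wigner-Ville distribution sketch: FFT over the
        # lag of the instantaneous autocorrelation of the analytic signal, with an
        # optional lag window (the "pseudo" part). Illustrative only.
        import numpy as np
        from scipy.signal import hilbert

        def pwvd(x, lag_window=None):
            z = hilbert(x)
            n = len(z)
            m = n // 2                                  # number of lag / frequency bins
            w = np.ones(m) if lag_window is None else lag_window
            tfr = np.zeros((m, n), dtype=complex)
            for t in range(n):
                tau_max = min(t, n - 1 - t, m // 2 - 1)
                taus = np.arange(-tau_max, tau_max + 1)
                tfr[taus % m, t] = z[t + taus] * np.conj(z[t - taus]) * w[np.abs(taus)]
            return np.real(np.fft.fft(tfr, axis=0))     # rows: frequency bins, columns: time

        fs = 1000.0
        t = np.arange(0, 0.5, 1/fs)
        x = np.sin(2*np.pi*(50*t + 100*t**2))           # linear chirp, 50 Hz -> 150 Hz

        m = len(x) // 2
        tfr = pwvd(x, lag_window=np.hamming(2*m)[m:])   # decaying half-window over |tau|
        freqs = np.arange(m) * fs / (2*m)               # WVD frequency axis, Hz
        ridge = freqs[tfr.argmax(axis=0)]               # peak frequency at each time
        print("estimated instantaneous frequency near start/middle/end (Hz):",
              ridge[25].round(1), ridge[len(x)//2].round(1), ridge[-25].round(1))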

  5. Large-scale derived flood frequency analysis based on continuous simulation

    Science.gov (United States)

    Dung Nguyen, Viet; Hundecha, Yeshewatesfa; Guse, Björn; Vorogushyn, Sergiy; Merz, Bruno

    2016-04-01

    There is an increasing need for spatially consistent flood risk assessments at the regional scale (several 100,000 km2), in particular in the insurance industry and for national risk reduction strategies. However, most large-scale flood risk assessments are composed of smaller-scale assessments and show spatial inconsistencies. To overcome this deficit, a large-scale flood model composed of a weather generator and catchment models was developed, reflecting the inherent spatial heterogeneity. The weather generator is a multisite and multivariate stochastic model capable of generating synthetic meteorological fields (precipitation, temperature, etc.) at daily resolution for the regional scale. These fields respect the observed autocorrelation, spatial correlation and co-variance between the variables. They are used as input into catchment models. A long-term simulation of this combined system makes it possible to derive very long discharge series at many catchment locations, serving as a basis for spatially consistent flood risk estimates at the regional scale. This combined model was set up and validated for major river catchments in Germany. The weather generator was trained on 53 years of observational data at 528 stations covering not only the whole of Germany but also parts of France, Switzerland, the Czech Republic and Austria, with an aggregated spatial scale of 443,931 km2. 10,000 years of daily meteorological fields for the study area were generated. Likewise, rainfall-runoff simulations with SWIM were performed for the entire Elbe, Rhine, Weser, Donau and Ems catchments. The validation results illustrate a good performance of the combined system, as the simulated flood magnitudes and frequencies agree well with the observed flood data. Based on continuous simulation this model chain is then used to estimate flood quantiles for the whole of Germany including upstream headwater catchments in neighbouring countries. This continuous large scale approach overcomes the several

  6. Test beam & time resolution analysis for UFSD and CVD diamond detectors

    CERN Document Server

    Scali, Stefano

    2017-01-01

    The ever-increasing luminosity in particle physics, aimed at seeking new phenomena, has led to the need for radiation-hard detectors with a remarkable time resolution. To reach this goal, several tests and data analyses have been performed, but further development is still required. During my internship I participated in the testing of new sensors. After an introduction to the theoretical framework, this report describes the data taking procedure using the SPS beam at the H8 site in Prevessin. The second part describes the data analysis and extrapolation of the time resolution for many boards.

  7. High-Resolution Physical Properties Logging of the AND-1B Sediment Core - Opportunity for Detecting High-Frequency Signals of Paleoenvironmental Changes

    Science.gov (United States)

    Niessen, F.; Magens, D.; Kuhn, G.; Helling, D.

    2008-12-01

    Within the ANDRILL-MIS Project, a more than 1200 m long sediment core, dating back to about 13 Ma, was drilled beneath McMurdo Ice Shelf near Ross Island (Antarctica) in austral summer 2006/07 with the purpose of contributing to a better understanding of the Late Cenozoic history of the Antarctic Ice Sheet. One way to approach past ice dynamics and changes in the paleoenvironment quantitatively is the analysis of high-resolution physical properties obtained from whole-core multi-sensor core logger measurements in which lithologic changes are expressed numerically. This is especially applicable for the repeating sequences of diatomites and diamictites in the upper half of the core with a prominent cyclicity between 140-300 mbsf. Rather abrupt high-amplitude variations in wet-bulk density (WBD) and magnetic susceptibility (MS) reflect a highly dynamic depositional system, oscillating between two main end-member types: a grounded ice sheet and open marine conditions. For the whole core, the WBD signal, ranging from 1.4 g/cu.cm in the diatomites to 2.3 g/cu.cm in diamictites from the lower part of the core, represents the influence of three variables: (i) the degree of compaction seen as reduction of porosities with depth of about 30 % from top to bottom, (ii) the clast content with clasts being almost absent in diatomite deposits and (iii) the individual grain density (GD). GD itself strongly reflects the variety of lithologies as well as the influence of cement (mainly pyrite and carbonate) on the matrix grain density. The calculation of residual porosities demonstrates the strong imprint of glacial loading especially for diamictites from the upper 150 m, pointing to a significant thickness of the overriding Pleistocene ice sheet. MS on the other hand mainly documents a marine vs. terrestrial source of sediments where the latter can be divided into younger local material from the McMurdo Volcanic Province and basement clasts from the Transantarctic Mountains

  8. Carbon financial markets: A time-frequency analysis of CO2 prices

    Science.gov (United States)

    Sousa, Rita; Aguiar-Conraria, Luís; Soares, Maria Joana

    2014-11-01

    We characterize the interrelation of CO2 prices with energy prices (electricity, gas and coal), and with economic activity. Previous studies have relied on time-domain techniques, such as Vector Auto-Regressions. In this study, we use multivariate wavelet analysis, which operates in the time-frequency domain. Wavelet analysis provides convenient tools to distinguish relations at particular frequencies and at particular time horizons. Our empirical approach has the potential to identify relations getting stronger and then disappearing over specific time intervals and frequencies. We are able to examine the coherency of these variables and lead-lag relations at different frequencies for the time periods in focus.

  9. The impact of the microphone position on the frequency analysis of snoring sounds.

    Science.gov (United States)

    Herzog, Michael; Kühnel, Thomas; Bremert, Thomas; Herzog, Beatrice; Hosemann, Werner; Kaftan, Holger

    2009-08-01

    Frequency analysis of snoring sounds has been reported as a diagnostic tool to differentiate between different sources of snoring. Several studies have been published presenting diverging results of the frequency analyses of snoring sounds. Depending on the position of the microphones used, the results of the frequency analysis of snoring sounds vary. The present study investigated the influence of different microphone positions on the outcome of the frequency analysis of snoring sounds. Nocturnal snoring was recorded simultaneously at six positions (air-coupled: 30 cm middle, 100 cm middle, 30 cm lateral to both sides of the patients' head; body contact: neck and parasternal) in five patients. The microphones used had a flat frequency response and a similar frequency range (10/40 Hz-18 kHz). Frequency analysis was performed by fast Fourier transformation and frequency bands as well as peak intensities (Peaks 1-5) were detected. Air-coupled microphones presented a wider frequency range (60 Hz-10 kHz) compared to contact microphones. The contact microphone at the cervical position presented a cut-off at frequencies above 300 Hz, whereas the contact microphone at the parasternal position revealed a cut-off above 100 Hz. By way of example, the study demonstrates that frequencies above 1,000 Hz do appear in complex snoring patterns, and it is emphasised that high frequencies are important for the interpretation of snoring sounds with respect to the identification of the source of snoring. Contact microphones might be used in screening devices, but for a natural analysis of snoring sounds the use of air-coupled microphones is indispensable.

  10. Damage detection in multi-span beams based on the analysis of frequency changes

    International Nuclear Information System (INIS)

    Gillich, G R; Ntakpe, J L; Praisach, Z I; Mimis, M C; Abdel Wahab, M

    2017-01-01

    Crack identification in multi-span beams is performed to determine whether the structure is healthy or not. Among all crack identification methods, those based on measured natural frequency changes present the advantages of simplicity and ease of use in practical engineering. To accurately identify the crack characteristics of a multi-span beam structure, a mathematical model is established which can predict frequency changes for any boundary conditions, the intermediate supports being hinges. This relation is based on the modal strain energy concept. Since frequency changes are relatively small, to obtain natural frequencies with high resolution, a signal processing algorithm based on the superposition of numerous spectra is also proposed, which overcomes the limited frequency resolution of the Fast Fourier Transform. Based on the above-mentioned mathematical model and signal processing algorithm, the method of identifying cracks on multi-span beams is presented. To verify the accuracy of this identification method, experimental examples are conducted on a two-span structure. The results demonstrate that the method proposed in this paper can accurately identify the crack position and depth. (paper)

  11. Time-Frequency Analysis and Hermite Projection Method Applied to Swallowing Accelerometry Signals

    Directory of Open Access Journals (Sweden)

    Ervin Sejdić

    2010-01-01

    Full Text Available Fast Hermite projections have often been used in image-processing procedures such as image database retrieval, projection filtering, and texture analysis. In this paper, we propose an innovative approach for the analysis of one-dimensional biomedical signals that combines the Hermite projection method with time-frequency analysis. In particular, we propose a two-step approach to characterize vibrations of various origins in swallowing accelerometry signals. First, by using time-frequency analysis we obtain the energy distribution of signal frequency content in time. Second, by using fast Hermite projections we characterize whether the analyzed time-frequency regions are associated with swallowing or other phenomena (vocalization, noise, bursts, etc.). The numerical analysis of the proposed scheme clearly shows that by using a few Hermite functions, vibrations of various origins are distinguishable. These results will be the basis for further analysis of swallowing accelerometry to detect swallowing difficulties.

  12. Analysis on characteristic and application of THz frequency comb and THz sub-comb

    International Nuclear Information System (INIS)

    Liu Pengxiang; Xu Degang; Yao Jianquan

    2011-01-01

    In this paper, we proposed a method for THz sub-comb generation based on spectral interference. The result of our calculation indicated that the THz pulse train, generated by surface-emitted optical rectification of femtosecond (fs) laser pulse in periodically poled lithium niobate (PPLN), has a comb-like spectrum. The characteristic of this THz sub-comb was analyzed both in frequency and time domain. Compared with the THz frequency comb emitted by a photoconductive antenna (PCA), THz sub-comb has a lower spectral resolution and wider free spectral range. Thus it could be an ideal source for wavelength division multiplexing (WDM) in THz wireless communication system.

  13. Mycoplasma pneumonia in children: radiographic pattern analysis and difference in resolution

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Myeong Ja; Jeong, Sung Eun; Kim, Joung Sook; Hur, Gham; Park, Jeung Uk [Inje Univ. College of Medicine, Seoul (Korea, Republic of)

    1997-11-01

    By analysing frequency and disease progression, this study aimed to investigate and predict the prognosis of mycoplasma pneumonia according to radiographic pattern. We retrospectively reviewed plain chest radiographs of 230 patients in whom mycoplasma pneumonia had been serologically confirmed. Their age ranged from two months to 14 years and two months, and 203 (88.3%) were younger than eight years. Radiographic patterns were classified as air space consolidation, bronchopneumonic, interstitial pneumonic or diffuse mixed infiltrating type. The radiologic resolution period for each type was analysed by the resolution of symptoms and normalization of radiologic findings. The bronchopneumonic type, which was the most common, was seen in 82 patients (35.6%), airspace consolidation in 58 (25.2%), interstitial in 55 (23.9%), and diffuse mixed in 22 (9.57%). In thirteen patients (5.7%), chest radiographs were normal, though the clinical and radiologic resolution period for each type was variable. The mean resolution period of the air space consolidation type was 14.5 days; bronchopneumonic, 7.6 days; interstitial, 10.5 days; and diffuse mixed, 15.6 days. The airspace consolidation type needed the second longest recovery period, exceeded only by the diffuse mixed type. The bronchopneumonic type was the most common radiographic pattern of mycoplasma pneumonia. The prognosis of the airspace consolidation type seems to be poorest, since this required one of the longest recovery periods.

  14. On Space-Time Resolution of Inflow Representations for Wind Turbine Loads Analysis

    Directory of Open Access Journals (Sweden)

    Lance Manuel

    2012-06-01

    Full Text Available Efficient spatial and temporal resolution of simulated inflow wind fields is important in order to represent wind turbine dynamics and derive load statistics for design. Using Fourier-based stochastic simulation of inflow turbulence, we first investigate loads for a utility-scale turbine in the neutral atmospheric boundary layer. Load statistics, spectra, and wavelet analysis representations for different space and time resolutions are compared. Next, large-eddy simulation (LES is employed with space-time resolutions, justified on the basis of the earlier stochastic simulations, to again derive turbine loads. Extreme and fatigue loads from the two approaches used in inflow field generation are compared. On the basis of simulation studies carried out for three different wind speeds in the turbine’s operating range, it is shown that inflow turbulence described using 10-meter spatial resolution and 1 Hz temporal resolution is adequate for assessing turbine loads. Such studies on the investigation of adequate filtering or resolution of inflow wind fields help to establish efficient strategies for LES and other physical or stochastic simulation needed in turbine loads studies.

  15. Evaluation of the Effects of Conflict Resolution, Peace Education and Peer Mediation: A Meta-Analysis Study

    Science.gov (United States)

    Turk, Fulya

    2018-01-01

    The purpose of this study was to examine the effects of conflict resolution, peace education and peer mediation on the conflict resolution skills of students via meta-analysis method. 23 studies were determined to be in accordance with the study criteria. According to research findings conflict resolution, peace education and peer mediation…

  16. A new framework for estimating return levels using regional frequency analysis

    Science.gov (United States)

    Winter, Hugo; Bernardara, Pietro; Clegg, Georgina

    2017-04-01

    We propose a new framework for incorporating more spatial and temporal information into the estimation of extreme return levels. Currently, most studies use extreme value models applied to data from a single site; an approach which is inefficient statistically and leads to return level estimates that are less physically realistic. We aim to highlight the benefits that could be obtained by using methodology based upon regional frequency analysis as opposed to classic single site extreme value analysis. This motivates a shift in thinking, which permits the evaluation of local and regional effects and makes use of the wide variety of data that are now available on high temporal and spatial resolutions. The recent winter storms over the UK during the winters of 2013-14 and 2015-16, which have caused wide-ranging disruption and damaged important infrastructure, provide the main motivation for the current work. One of the most impactful natural hazards is flooding, which is often initiated by extreme precipitation. In this presentation, we focus on extreme rainfall, but shall discuss other meteorological variables alongside potentially damaging hazard combinations. To understand the risks posed by extreme precipitation, we need reliable statistical models which can be used to estimate quantities such as the T-year return level, i.e. the level which is expected to be exceeded once every T-years. Extreme value theory provides the main collection of statistical models that can be used to estimate the risks posed by extreme precipitation events. Broadly, at a single site, a statistical model is fitted to exceedances of a high threshold and the model is used to extrapolate to levels beyond the range of the observed data. However, when we have data at many sites over a spatial domain, fitting a separate model for each separate site makes little sense and it would be better if we could incorporate all this information to improve the reliability of return level estimates. Here
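
    For the single-site baseline described above, the T-year return level follows from the generalized Pareto model of threshold exceedances: z_T = u + (sigma/xi)*((T*n_y*p_u)**xi - 1), where p_u is the probability of exceeding the threshold u and n_y is the number of observations per year. A generic sketch with simulated daily rainfall (regional frequency analysis would instead pool standardized information across sites before fitting):

        # Single-site peaks-over-threshold sketch: fit a generalized Pareto
        # distribution to exceedances of simulated daily rainfall and report the
        # 100-year return level. All inputs are invented for illustration.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        daily_rain = rng.gamma(shape=0.6, scale=8.0, size=50*365)   # 50 years, invented

        u = np.quantile(daily_rain, 0.98)                           # threshold
        exceed = daily_rain[daily_rain > u] - u
        xi, _, sigma = stats.genpareto.fit(exceed, floc=0)          # shape, loc, scale

        p_u = exceed.size / daily_rain.size                         # exceedance probability
        n_y = 365.25                                                # observations per year
        T = 100.0                                                   # return period, years

        z_T = u + (sigma/xi) * ((T * n_y * p_u)**xi - 1.0)
        print(f"threshold u = {u:.1f} mm, 100-year return level = {z_T:.1f} mm")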

  17. Three-dimensional reconstruction and modeling of middle ear biomechanics by high-resolution computed tomography and finite element analysis.

    Science.gov (United States)

    Lee, Chia-Fone; Chen, Peir-Rong; Lee, Wen-Jeng; Chen, Jyh-Horng; Liu, Tien-Chen

    2006-05-01

    To present a systematic and practical approach that uses high-resolution computed tomography to derive models of the middle ear for finite element analysis. This prospective study included 31 subjects with normal hearing and no previous otologic disorders. Temporal bone images obtained from 15 right ears and 16 left ears were used for evaluation and reconstruction. High-resolution computed tomography of temporal bone was performed using simultaneous acquisition of 16 sections with a collimated slice thickness of 0.625 mm. All images were transferred to an Amira visualization system for three-dimensional reconstruction. The created three-dimensional model was translated into two commercial modeling packages, Patran and ANSYS, for finite element analysis. The characteristic dimensions of the model were measured and compared with previously published histologic section data. This result confirms that the geometric model created by the proposed method is accurate except that the tympanic membrane is thicker than when measured by the histologic section method. No obvious difference in the geometrical dimension between right and left ossicles was found (P > .05). The three-dimensional model created by finite element method and predicted umbo and stapes displacements are close to the bounds of the experimental curves of Nishihara's, Huber's, Gan's, and Sun's data across the frequency range of 100 to 8000 Hz. The model includes a description of the geometry of the middle ear components and dynamic equations of vibration. The proposed method is quick, practical, low-cost, and, most importantly, noninvasive as compared with histologic section methods.

  18. Research and Analysis of MEMS Switches in Different Frequency Bands

    Directory of Open Access Journals (Sweden)

    Wenchao Tian

    2018-04-01

    Full Text Available Due to their high isolation, low insertion loss, high linearity, and low power consumption, microelectromechanical systems (MEMS) switches have drawn much attention from researchers in recent years. In this paper, we introduce the research status of MEMS switches in different bands and several reliability issues, such as dielectric charging, contact failure, and temperature instability. We then summarize some of the following methods to improve the performance of MEMS switches at high frequency: (1) utilizing combinations of several switches in series; (2) covering a floating metal layer on the dielectric layer; (3) using dielectric layer materials with high dielectric constants and conductor materials with low resistance; (4) developing MEMS switches using T-match and π-match; (5) designing MEMS switches based on bipolar complementary metal–oxide–semiconductor (BiCMOS) technology and reconfigurable MEMS surfaces; (6) employing thermal compensation structures, circularly symmetric structures, thermal buckle-beam actuators, molybdenum membranes, and thin-film packaging; (7) selecting Ultra-NanoCrystalline diamond or aluminum nitride dielectric materials and applying a bipolar driving voltage, stoppers, and a double-dielectric-layer structure; and (8) adopting gold alloying with carbon nanotubes (CNTs), hermetic and reliable packaging, and mN-level contact.

  19. Alcohol marketing in televised English professional football: a frequency analysis.

    Science.gov (United States)

    Graham, Andrew; Adams, Jean

    2014-01-01

    The aim of the study was to explore the frequency of alcohol marketing (both formal commercials and otherwise) in televised top-class English professional football matches. A purposive sample of six broadcasts (total = 1101 min) of televised top-class English club football matches were identified and recorded in full. A customized coding framework was used to identify and categorize all verbal and visual alcohol references in non-commercial broadcasting. The number and the duration of all formal alcohol commercials were also noted. A mean of 111 visual references and 2 verbal references to alcohol per hour of broadcast were identified. Nearly all visual references were to beer products and were primarily simple logos or branding. The majority of verbal alcohol references were related to title-sponsorship of competitions. A total of 17 formal alcohol commercials were identified, accounting for <1% of total broadcast time. Visual alcohol references in televised top-class English football matches are common with an average of nearly two per minute. Verbal references are rare and formal alcohol commercials account for <1% of broadcast time. Restriction of all alcohol sports sponsorship, as seen for tobacco, may be justified.

  20. Accessory bones of the feet: Radiological analysis of frequency

    Directory of Open Access Journals (Sweden)

    Vasiljević Vladica

    2010-01-01

    Full Text Available Background/Aim. Accessory bones are most commonly found on the feet and represent an anatomic variant. They occur when there is a failure in the formation of a unique bone from separate centres of ossification. The aim of this study was to establish their frequency and medical significance. Methods. Anteroposterior and lateral foot radiography was performed in 270 patients aged 20-80 years with a history of trauma (180) or rheumatological disease (90). The presence and distribution of accessory bones were analysed in relation to the total number of patients and their gender. The results are expressed in numeric values and in terms of percentage. Results. Accessory bones were identified in 62 (22.96%) patients: 29 (10.74%) of them were found in female patients and 33 (12.22%) in males. The most common accessory bones were as follows: os tibiale externum 50%, os peroneum 29.03%, os trigonum 11.29%, os vesalianum 9.68%. Conclusion. Accessory bones were found in about 23% of patients with trauma and some rheumatological diseases. Their significance is demonstrated in the differential diagnosis among degenerative diseases, avulsion fractures, muscle and tendon trauma and other types of injuries which can cause painful affection of the foot, as well as in forensic practice.

  1. Prospects for higher spatial resolution quantitative X-ray analysis using transition element L-lines

    Science.gov (United States)

    Statham, P.; Holland, J.

    2014-03-01

    Lowering electron beam kV reduces electron scattering and improves the spatial resolution of X-ray analysis. However, a previous round robin analysis of steels at 5-6 kV using Lα-lines for the first-row transition elements gave poor accuracies. Our experiments on SS63 steel using Lα-lines show similar biases in Cr and Ni that cannot be corrected with changes to self-absorption coefficients or carbon coating. The inaccuracy may be caused by different probabilities for emission and anomalous self-absorption for the Lα-line between specimen and pure element standard. Analysis using Ll(L3-M1)-lines gives more accurate results for SS63, plausibly because the M1-shell is not so vulnerable to the atomic environment as the unfilled M4,5-shell. However, Ll-intensities are very weak and WDS analysis may be impractical for some applications. EDS with a large-area SDD offers orders of magnitude faster analysis and achieves similar results to WDS analysis with Lα-lines, but poorer energy resolution precludes the use of Ll-lines in most situations. EDS analysis of K-lines at low overvoltage is an alternative strategy for improving spatial resolution that could give higher accuracy. The trade-off between low kV and low overvoltage is explored in terms of sensitivity for element detection for different elements.

  2. Frequency analysis of the visual steady-state response measured with the fast optical signal in younger and older adults

    OpenAIRE

    Tse, Chun-Yu; Gordon, Brian A.; Fabiani, Monica; Gratton, Gabriele

    2010-01-01

    Relatively high frequency activity (>4 Hz) carries important information about the state of the brain or its response to high frequency events. The electroencephalogram (EEG) is commonly used to study these changes because it possesses high temporal resolution and a good signal-to-noise ratio. However, it provides limited spatial information. Non-invasive fast optical signals (FOS) have been proposed as a neuroimaging tool combining spatial and temporal resolution. Yet, this technique has not...

  3. Frequency analysis for modulation-enhanced powder diffraction.

    Science.gov (United States)

    Chernyshov, Dmitry; Dyadkin, Vadim; van Beek, Wouter; Urakawa, Atsushi

    2016-07-01

    Periodic modulation of external conditions on a crystalline sample with a consequent analysis of periodic diffraction response has been recently proposed as a tool to enhance experimental sensitivity for minor structural changes. Here the intensity distributions for both a linear and nonlinear structural response induced by a symmetric and periodic stimulus are analysed. The analysis is further extended for powder diffraction when an external perturbation changes not only the intensity of Bragg lines but also their positions. The derived results should serve as a basis for a quantitative modelling of modulation-enhanced diffraction data measured in real conditions.

  4. Spurious results from Fourier analysis of data with closely spaced frequencies

    International Nuclear Information System (INIS)

    Loumos, G.L.; Deeming, T.J.

    1978-01-01

    It is shown how erroneous results can occur using some period-finding methods, such as Fourier analysis, on data containing closely spaced frequencies. The frequency spacing accurately resolvable with data of length T is increased from the standard value of about 1/T quoted in the literature to approximately 1.5/T. (Auth.)
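
    A minimal numerical sketch of this resolution limit (not from the paper; the record length, sampling rate, and frequencies are assumed for illustration): two sinusoids separated by about 1.5/T are generated and their periodogram is computed, so the spacing can be varied to see when the two peaks merge into one.

      import numpy as np

      T = 100.0                        # record length (arbitrary time units)
      fs = 10.0                        # sampling rate
      t = np.arange(0.0, T, 1.0 / fs)

      f1 = 1.00                        # first frequency
      df = 1.5 / T                     # spacing near the claimed resolvable limit
      f2 = f1 + df

      x = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

      # Periodogram via the FFT; zero-padding only interpolates the spectrum,
      # it does not improve the intrinsic resolution set by the record length T.
      n_pad = 8 * len(x)
      spec = np.abs(np.fft.rfft(x, n=n_pad)) ** 2
      freqs = np.fft.rfftfreq(n_pad, d=1.0 / fs)

      band = (freqs > 0.5) & (freqs < 1.5)
      print("dominant peak near", freqs[band][np.argmax(spec[band])], "Hz")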

  5. Analysis of frequency effect on variegated RAM styles and other parameters using 40 nm FPGA

    DEFF Research Database (Denmark)

    Sharma, Rashmi; Pandey, Bishwajeet; Sharma, Vaashu

    2018-01-01

    ... This analysis has been performed using the Xilinx 12.1 and IBM SPSS Statistics 21 software and the VHDL language. The pipe_distributed style consumes the least power at comparatively lower frequency values. Therefore, lower frequency values should be maintained while observing the power. This would bloom up...

  6. Wavelet analysis of frequency chaos game signal: a time-frequency signature of the C. elegans DNA.

    Science.gov (United States)

    Messaoudi, Imen; Oueslati, Afef Elloumi; Lachiri, Zied

    2014-12-01

    Challenging tasks are encountered in the field of bioinformatics. The choice of the genomic sequence's mapping technique is one of the most fastidious tasks. A judicious choice serves in examining the distribution of periodic patterns that concords with the underlying structure of genomes. Despite that, searching for a coding technique that can highlight all the information contained in the DNA has not yet attracted the attention it deserves. In this paper, we propose a new mapping technique based on the chaos game theory that we call the frequency chaos game signal (FCGS). The particularity of the FCGS coding resides in exploiting the statistical properties of the genomic sequence itself. This may reflect important structural and organizational features of DNA. To prove the usefulness of the FCGS approach in the detection of different local periodic patterns, we use wavelet analysis, because it provides access to information that can be obscured by other time-frequency methods such as Fourier analysis. Thus, we apply the continuous wavelet transform (CWT) with the complex Morlet wavelet as a mother wavelet function. Scalograms that relate to the organism Caenorhabditis elegans (C. elegans) exhibit a multitude of periodic organizations of specific DNA sequences.
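
    The FCGS coding itself is the contribution of the paper and is not reproduced here; the sketch below uses a hypothetical numeric base mapping purely as a stand-in, and applies a continuous wavelet transform with a complex Morlet mother wavelet (via PyWavelets) to obtain a scalogram of a DNA-like signal.

      import numpy as np
      import pywt

      # Placeholder mapping (NOT the FCGS coding of the paper): each base is
      # assigned an arbitrary numeric value just to obtain a 1-D signal.
      mapping = {'A': 0.25, 'C': 0.50, 'G': 0.75, 'T': 1.00}
      sequence = "ACGTTTGCA" * 120      # toy sequence standing in for C. elegans DNA
      signal = np.array([mapping[b] for b in sequence])

      # Continuous wavelet transform with a complex Morlet mother wavelet.
      scales = np.arange(1, 64)
      coeffs, freqs = pywt.cwt(signal, scales, 'cmor1.5-1.0', sampling_period=1.0)

      # The scalogram (|coefficients|) exposes local periodicities such as the
      # 3-base coding periodicity or helical repeats.
      scalogram = np.abs(coeffs)
      print(scalogram.shape)            # (len(scales), len(signal))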

  7. Accounting for trip frequency in importance-performance analysis

    Science.gov (United States)

    Joshua K. Gill; J.M. Bowker; John C. Bergstrom; Stanley J. Zarnoch

    2010-01-01

    Understanding customer satisfaction is critical to the successful operation of both privately and publicly managed recreation venues. A popular tool for assessing recreation visitor satisfaction is Importance- Performance Analysis (IPA). IPA provides resource managers, government officials, and private businesses with easy-to-understand and -use information about...

  8. Frequency Analysis of Gradient Estimators in Volume Rendering

    NARCIS (Netherlands)

    Bentum, Marinus Jan; Lichtenbelt, Barthold B.A.; Malzbender, Tom

    1996-01-01

    Gradient information is used in volume rendering to classify and color samples along a ray. In this paper, we present an analysis of the theoretically ideal gradient estimator and compare it to some commonly used gradient estimators. A new method is presented to calculate the gradient at arbitrary

  9. Adjoint sensitivity analysis of high frequency structures with Matlab

    CERN Document Server

    Bakr, Mohamed; Demir, Veysel

    2017-01-01

    This book covers the theory of adjoint sensitivity analysis and uses the popular FDTD (finite-difference time-domain) method to show how wideband sensitivities can be efficiently estimated for different types of materials and structures. It includes a variety of MATLAB® examples to help readers absorb the content more easily.

  10. Capturing inhomogeneous broadening of the -CN stretch vibration in a Langmuir monolayer with high-resolution spectra and ultrafast vibrational dynamics in sum-frequency generation vibrational spectroscopy (SFG-VS)

    Science.gov (United States)

    Velarde, Luis; Wang, Hong-fei

    2013-08-01

    While in principle the frequency-domain and time-domain spectroscopic measurements should generate identical information for a given molecular system, the inhomogeneous character of surface vibrations in sum-frequency generation vibrational spectroscopy (SFG-VS) studies has so far been examined only with time-domain SFG-VS, by mapping the decay of the vibrational polarization using ultrafast lasers, owing to the lack of SFG vibrational spectra with high enough spectral resolution and an accurate enough lineshape. Here, with the recently developed high-resolution broadband SFG-VS (HR-BB-SFG-VS) technique, we show that the inhomogeneous lineshape can be obtained in the frequency domain for the anchoring CN stretch of the 4-n-octyl-4'-cyanobiphenyl (8CB) Langmuir monolayer at the air-water interface, and that excellent agreement with the time-domain SFG free-induction decay can be established. We found that the 8CB CN stretch spectrum consists of a single peak centered at 2234.00 ± 0.01 cm-1 with a total linewidth of 10.9 ± 0.3 cm-1 at half maximum. The Lorentzian contribution accounts for only 4.7 ± 0.4 cm-1 of this width and the Gaussian (inhomogeneous) broadening for as much as 8.1 ± 0.2 cm-1. Polarization analysis of the -CN spectra showed that the -CN group is tilted 57° ± 2° from the surface normal. The large heterogeneity in the -CN spectrum is tentatively attributed to the -CN group interactions with the interfacial water molecules penetrated/accommodated into the 8CB monolayer, a unique phenomenon for the nCB Langmuir monolayers reported previously.

  11. Sensitive rapid analysis of iodine-labelled protein mixture on flat substrates with high spatial resolution

    International Nuclear Information System (INIS)

    Zanevskij, Yu.V.; Ivanov, A.B.; Movchan, S.A.; Peshekhonov, V.D.; Chan Dyk Tkhan'; Chernenko, S.P.; Kaminir, L.B.; Krejndlin, Eh.Ya.; Chernyj, A.A.

    1983-01-01

    The usability of rapid electrophoretic analysis of mixtures of 125I-labelled proteins on flat samples by means of the URAN-type installation, developed using a multiwire proportional chamber, is studied. The sensitivity of the method is better than 200 cpm/cm2 and the spatial resolution is approximately 1 mm. The rapid analysis procedure takes no longer than several tens of minutes

  12. High resolution transmission electron microscopy and microdiffraction for radiation damage analysis

    International Nuclear Information System (INIS)

    Sinclair, R.

    1982-01-01

    High resolution TEM techniques have developed to quite a sophisticated level over the past few years. In addition TEM instruments with a scanning capability have become available commercially which permit in particular the formation of a small electron probe at the specimen. Thus direct resolution and microdiffraction investigations of thin specimens are now possible, neither of which have been employed to any great extent in the analysis of radiation damage. Some recent advances which are thought to be relevant to this specific area of research are highlighted

  13. Conflict resolution and its context from the analysis of behavioural patterns to efficient decision-making

    CERN Document Server

    Carneiro, Davide; Neves, José

    2014-01-01

    This book studies how technological solutions can be used to alleviate the current state of legal systems, with their clogged up courtrooms and inefficient conflict resolution methods. It reviews the shortcomings and disadvantages of traditional and alternative conflict resolution methods and turns to Artificial Intelligence for problem-solving techniques and solutions. The book is divided into four parts. The first part presents a general and systematic analysis of the current state of the legal systems, identifying the main problems and their causes. It then moves on to present UM Court: a f...

  14. Optimization of High-Resolution Continuous Flow Analysis for Transient Climate Signals in Ice Cores

    DEFF Research Database (Denmark)

    Bigler, Matthias; Svensson, Anders; Kettner, Ernesto

    2011-01-01

    Over the past two decades, continuous flow analysis (CFA) systems have been refined and widely used to measure aerosol constituents in polar and alpine ice cores in very high depth resolution. Here we present a newly designed system consisting of sodium, ammonium, dust particle, and electrolytic meltwater conductivity detection modules. The system is optimized for high-resolution determination of transient signals in thin layers of deep polar ice cores. Based on standard measurements and by comparing sections of early Holocene and glacial ice from Greenland, we find that the new system features ...

  15. Limitations to depth resolution in high-energy, heavy-ion elastic recoil detection analysis

    International Nuclear Information System (INIS)

    Elliman, R.G.; Palmer, G.R.; Ophel, T.R.; Timmers, H.

    1998-01-01

    The depth resolution of heavy-ion elastic recoil detection analysis was examined for Al and Co thin films ranging in thickness from 100 to 400 nm. Measurements were performed with 154 MeV Au ions as the incident beam, and recoils were detected using a gas ionisation detector. Energy spectra were extracted for the Al and Co recoils and the depth resolution determined as a function of film thickness from the width of the high- and low-energy edges. These results were compared with theoretical estimates calculated using the computer program DEPTH. (authors)

  16. To Fill or Not to Fill: Sensitivity Analysis of the Influence of Resolution and Hole Filling on Point Cloud Surface Modeling and Individual Rockfall Event Detection

    Directory of Open Access Journals (Sweden)

    Michael J. Olsen

    2015-09-01

    Full Text Available Monitoring unstable slopes with terrestrial laser scanning (TLS) has been proven effective. However, end users still struggle immensely with the efficient processing, analysis, and interpretation of the massive and complex TLS datasets. Two recent advances described in this paper now improve the ability to work with TLS data acquired on steep slopes. The first is the improved processing of TLS data to model complex topography and fill holes. This processing step results in a continuous topographic surface model that seamlessly characterizes the rock and soil surface. The second is an advance in the automated interpretation of the surface model in such a way that a magnitude and frequency relationship of rockfall events can be quantified, which can be used to assess maintenance strategies and forecast costs. The approach is applied to unstable highway slopes in the state of Alaska, U.S.A., to evaluate its effectiveness. Further, the influence of the selected model resolution and degree of hole filling on the derived slope metrics was analyzed. In general, model resolution plays a pivotal role in the ability to detect smaller rockfall events when developing magnitude-frequency relationships. The total volume estimates are also influenced by model resolution, but were comparatively less sensitive. In contrast, hole filling had a noticeable effect on magnitude-frequency relationships but to a lesser extent than modeling resolution. However, hole filling yielded a modest increase in overall volumetric quantity estimates. Optimal analysis results occur when appropriately balancing high modeling resolution with an appropriate level of hole filling.

  17. Frequency characteristic measurement of a fiber optic gyroscope using a correlation spectrum analysis method based on a pseudo-random sequence

    International Nuclear Information System (INIS)

    Li, Yang; Chen, Xingfan; Liu, Cheng

    2015-01-01

    The frequency characteristic is an important indicator of a system’s dynamic performance. The identification of a fiber optic gyroscope (FOG)’s frequency characteristic using a correlation spectrum analysis method based on a pseudo-random sequence is proposed. Taking the angle vibrator as the source of the test rotation stimulation and a pseudo-random sequence as the test signal, the frequency characteristic of a FOG is calculated according to the power spectral density of the rotation rate signal and the cross-power spectral density of the FOG’s output signal and rotation rate signal. A theoretical simulation is done to confirm the validity of this method. An experiment system is built and the test results indicate that the measurement error of the normalized amplitude–frequency response is less than 0.01, that the error of the phase–frequency response is less than 0.3 rad, and the overall measurement accuracy is superior to the traditional frequency-sweep method. By using this method, the FOG’s amplitude–frequency response and phase–frequency response can be measured simultaneously, quickly, accurately, and with a high frequency resolution. The described method meets the requirements of engineering applications. (paper)
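
    A hedged sketch of the underlying estimator (not the authors' implementation): with a pseudo-random binary stimulus as the rotation-rate input, the frequency characteristic follows from the cross-power spectral density of output and input divided by the power spectral density of the input. The sample rate, the filter standing in for the FOG, and the noise level are assumptions.

      import numpy as np
      from scipy import signal

      fs = 1000.0                                    # sample rate, Hz (assumed)
      prbs, _ = signal.max_len_seq(14)               # pseudo-random binary sequence
      rate_in = 2.0 * prbs.astype(float) - 1.0       # rotation-rate stimulus, +/-1

      # Stand-in for the gyro: a first-order low-pass plus noise, only so the
      # estimator has something to identify.
      b, a = signal.butter(1, 50.0 / (fs / 2.0))
      fog_out = signal.lfilter(b, a, rate_in) + 0.01 * np.random.randn(rate_in.size)

      # Frequency response estimate H(f) = Pxy(f) / Pxx(f)
      f, Pxx = signal.welch(rate_in, fs=fs, nperseg=1024)
      _, Pxy = signal.csd(rate_in, fog_out, fs=fs, nperseg=1024)
      H = Pxy / Pxx

      amplitude = np.abs(H)                          # amplitude-frequency response
      phase = np.unwrap(np.angle(H))                 # phase-frequency response
      print(amplitude[:5], phase[:5])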

  18. Letter Frequency Analysis of Lithuanian and Other Languages Using the Latin Alphabet

    Directory of Open Access Journals (Sweden)

    Gintautas Grigas

    2015-12-01

    Full Text Available It is important to evaluate the specificities of alphabets, particularly the letter frequencies, when designing keyboards, analyzing texts, designing games based on alphabets, and doing text mining. In order to adequately compare the letter frequencies of the Lithuanian language to those of other languages in the Internet space, Wikipedia was selected as the source, since its content is common to different languages. The method of letter frequency jumps is used. The main attention is paid to the analysis of letter frequencies at the boundary between native letters and foreign letters used in Lithuanian and other languages.
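
    A minimal sketch of the counting step, assuming a toy text sample; the actual study used Wikipedia content per language, and its method of letter frequency jumps is not reproduced here.

      from collections import Counter

      def letter_frequencies(text, alphabet):
          """Relative frequency of each letter of `alphabet` in `text` (case-insensitive)."""
          counts = Counter(ch for ch in text.lower() if ch in alphabet)
          total = sum(counts.values())
          return {ch: counts[ch] / total for ch in sorted(alphabet)}

      # Toy example; a real comparison would use Wikipedia dumps per language.
      lithuanian_letters = set("aąbcčdeęėfghiįyjklmnoprsštuųūvzž")
      sample = "lietuvių kalba yra baltų kalbų grupės kalba"
      print(letter_frequencies(sample, lithuanian_letters))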

  19. Estimation of Internal Flooding Frequency for Screening Analysis of Flooding PSA

    International Nuclear Information System (INIS)

    Choi, Sun Yeong; Yang, Jun Eon

    2005-01-01

    The purpose of this paper is to estimate the internal flooding frequency for the quantitative screening analysis of the flooding PSA (Probabilistic Safety Assessment) with appropriate data and estimation methods. In the existing flood PSAs for domestic NPPs (Nuclear Power Plants), the screening analysis was performed first and then a detailed analysis was performed for the areas not screened out. For the quantitative screening analysis, the plant-area-based flood frequency estimated by the MLE (Maximum Likelihood Estimation) method was used, while the component-based flood frequency is used for the detailed analysis. The existing quantitative screening analyses for domestic NPPs have used data from all LWRs (Light Water Reactors), namely PWRs (Pressurized Water Reactors) and BWRs (Boiling Water Reactors), for the internal flood frequency of the auxiliary building and turbine building. However, in the case of the primary auxiliary building, the applicability of the data from all LWRs needs to be examined carefully because of the significant difference in equipment between PWR and BWR structures. NUREG/CR-5750 suggested the Bayesian update method with the Jeffreys noninformative prior to estimate the initiating event frequency for floods. It did not, however, describe any procedure for the flood PSA. Recently, Fleming and Lydell suggested internal flooding frequencies in units of plant operation year and pipe length (in meters) by pipe size for each specific system susceptible to flooding, such as the service water system and the circulating water system. They used the failure rate and the conditional probability of rupture given failure to estimate the internal flooding frequency, and the Bayesian update to reduce uncertainties. To perform the quantitative screening analysis with this method, the pipe length by pipe size of each specific system per divided area is required, changing the concept of the component-based frequency to the concept of the plant area
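
    The Bayesian update mentioned above is commonly implemented, for a Poisson event count n observed over exposure time T, with the Jeffreys noninformative prior, which yields a Gamma(n + 0.5, T) posterior for the flooding frequency. The sketch below assumes that form; the event count and exposure are invented numbers for illustration only.

      from scipy import stats

      n_events = 2          # observed flooding events (hypothetical)
      T_years = 350.0       # reactor-years of applicable experience (hypothetical)

      # Poisson likelihood with Jeffreys prior -> Gamma posterior for the frequency
      posterior = stats.gamma(a=n_events + 0.5, scale=1.0 / T_years)

      print("posterior mean  :", posterior.mean())      # (n + 0.5) / T per year
      print("5th percentile  :", posterior.ppf(0.05))
      print("95th percentile :", posterior.ppf(0.95))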

  20. Design of multi-frequency CW radars

    CERN Document Server

    Jankiraman, Mohinder

    2007-01-01

    This book deals with the basic theory for design and analysis of Low Probability of Intercept (LPI) radar systems. The design of one such multi-frequency high resolution LPI radar, PANDORA, is covered.

  1. Analysis and experimental results of frequency splitting of underwater wireless power transfer

    Directory of Open Access Journals (Sweden)

    Wangqiang Niu

    2017-06-01

    Full Text Available Underwater wireless power transfer (UWPT) is an important technique to power underwater devices, but its frequency splitting phenomena are not fully elucidated. In this study, frequency splitting phenomena of a symmetrical planar two-coil wireless power transfer (WPT) system resonating at 90 kHz are investigated in seawater and freshwater. A concise frequency splitting analysis of this WPT system in air, based on a circuit model, is given first, and then experimental data are reported to show that there is little difference between power transfer in air, freshwater and seawater in the range of 40–140 kHz for this WPT system. Consequently, the frequency splitting analysis and observations in air are also applicable in freshwater and seawater. It is found that a V-type frequency splitting pattern exists in this WPT system under seawater and freshwater. Frequency shift is observed in this UWPT system in the overcoupled region, and no frequency shift is observed in the undercoupled region. In the undercoupled region, in the low-frequency zone of 40–90 kHz, the load voltage characteristics in the three media are identical; in the high-frequency zone of 90–140 kHz, the load voltage in air is slightly larger than those in freshwater and seawater.
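
    A hedged circuit-model sketch of the frequency splitting effect (component values are assumed, not those of the experimental system): a series-series compensated two-coil link tuned near 90 kHz is swept from 40 to 140 kHz for several coupling coefficients; in the overcoupled regime the load-voltage response shows two peaks.

      import numpy as np

      # Assumed circuit parameters (not the values of the experimental system)
      L1 = L2 = 24e-6          # coil inductances, H
      C1 = C2 = 130e-9         # series compensation capacitors, F (f0 ~ 90 kHz)
      R1 = R2 = 0.3            # coil resistances, ohm
      RL = 5.0                 # load resistance, ohm
      Vs = 1.0                 # source voltage, V

      f = np.linspace(40e3, 140e3, 2000)
      w = 2 * np.pi * f

      def load_voltage(k):
          """Load voltage magnitude of a series-series compensated two-coil WPT link."""
          M = k * np.sqrt(L1 * L2)
          Z1 = R1 + 1j * w * L1 + 1.0 / (1j * w * C1)
          Z2 = R2 + RL + 1j * w * L2 + 1.0 / (1j * w * C2)
          I2 = 1j * w * M * Vs / (Z1 * Z2 + (w * M) ** 2)   # secondary current
          return np.abs(I2) * RL

      for k in (0.05, 0.15, 0.35):          # under- to over-coupled
          V = load_voltage(k)
          peaks = np.where((V[1:-1] > V[:-2]) & (V[1:-1] > V[2:]))[0] + 1
          print(f"k={k}: peak frequencies (kHz): {np.round(f[peaks] / 1e3, 1)}")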

  2. Frequency Domain Computer Programs for Prediction and Analysis of Rail Vehicle Dynamics : Volume 1. Technical Report

    Science.gov (United States)

    1975-12-01

    Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume I defines the general analytical capabilities required for computer programs applicable to single rail vehi...

  3. Frequency analysis for the thermal hydraulic characterization of a natural circulation circuit

    International Nuclear Information System (INIS)

    Torres, Walmir M.; Macedo, Luiz A.; Sabundjian, Gaiane; Andrade, Delvonei A.; Umbehaun, Pedro E.; Conti, Thadeu N.; Mesquita, Roberto N.; Masotti, Paulo H.; Angelo, Gabriel

    2011-01-01

    This paper presents the frequency analysis studies of the pressure signals from an experimental natural circulation circuit during a heating process. The main objective is to identify the characteristic frequencies of this process using the fast Fourier transform. Video images are used to associate these frequencies with the phenomenology observed in the circuit during the process. Sub-cooled and saturated flow boiling, heater vibrations, overall circuit vibrations, chugging and geysering were observed. Each phenomenon has its specific associated frequency. Some phenomena and their frequencies must be avoided or attenuated since they can cause damage to the natural circulation circuit and its components. Special operation procedures and devices can be developed to avoid these undesirable frequencies. (author)
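
    A minimal sketch of the spectral step, assuming a synthetic pressure record in place of the experimental data: Welch's FFT-based power spectral density is computed and its dominant peaks are read off as the characteristic frequencies.

      import numpy as np
      from scipy import signal

      fs = 100.0                           # sampling rate of the pressure transducer, Hz (assumed)
      t = np.arange(0.0, 600.0, 1.0 / fs)  # 10 minutes of record

      # Synthetic pressure signal: a slow circulation oscillation plus a
      # boiling-related component and broadband noise, standing in for the data.
      p = (0.5 * np.sin(2 * np.pi * 0.08 * t)    # ~0.08 Hz circulation oscillation
           + 0.1 * np.sin(2 * np.pi * 2.5 * t)   # ~2.5 Hz boiling / vibration component
           + 0.05 * np.random.randn(t.size))

      f, Pxx = signal.welch(p, fs=fs, nperseg=8192)
      peaks, _ = signal.find_peaks(Pxx, prominence=0.01 * Pxx.max())
      print("characteristic frequencies (Hz):", f[peaks])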

  5. Influence of Type of Frequency Weighting Function On VDV Analysis

    Science.gov (United States)

    Kowalska-Koczwara, Alicja; Stypuła, Krzysztof

    2017-10-01

    Transport vibrations are the subject of much research, which mostly investigates their influence on the structural elements of buildings. However, nowadays, especially in the centres of large cities where apartments and residential buildings are closer to transport vibration sources, increasing attention is given to providing vibrational comfort to humans in buildings. Currently, in most countries, two main methods of evaluation are used: the root mean square (RMS) method and the vibration dose value (VDV). In this article, the VDV method is presented and the influence of the weighting function selection on the value of VDV is analysed. Measurements required for the analysis were made in Krakow, on a masonry, residential, two-storey building located in the city centre. The building is subjected to two transport vibration sources: tram passages and vehicle passages on a very closely located road. Measurement points were located on the basement wall at ground level, to control the excitation, and in the middle of the floor on the highest storey (the place where people perceive vibration). The room chosen for measurements is the one located closest to the transport excitation sources. During the measurements, 25 vibration events were recorded and analysed. VDV values were calculated for three different weighting functions according to the standards ISO 2631-1, ISO 2631-2 and BS 6841. Differences in VDV values are shown, and the influence of the weighting function selection on the result of the evaluation is presented. VDV analysis was performed not only for individual vibration events; the all-day and all-night vibration exposures were also calculated using formulas contained in the annex to BS 6841. It is demonstrated that, although there are differences in the values of VDV, the influence on the all-day and all-night exposure is not as significant.
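
    A minimal sketch of the VDV computation, assuming the acceleration record has already been frequency-weighted with the chosen weighting function (e.g., the ISO 2631 or BS 6841 weightings, which are not implemented here): the fourth-power time integral gives the event VDV, and individual events combine by fourth-power summation.

      import numpy as np

      def vibration_dose_value(a_weighted, fs):
          """VDV = (integral of a_w(t)^4 dt)^(1/4); a_weighted in m/s^2, fs in Hz."""
          dt = 1.0 / fs
          return (np.sum(a_weighted ** 4) * dt) ** 0.25

      def total_vdv(event_vdvs):
          """Combine VDVs of individual events by fourth-power summation."""
          v = np.asarray(event_vdvs, dtype=float)
          return (np.sum(v ** 4)) ** 0.25

      # Toy example: one recorded event, assumed already frequency-weighted.
      fs = 256.0
      t = np.arange(0.0, 30.0, 1.0 / fs)
      a_w = 0.05 * np.sin(2 * np.pi * 8.0 * t) * np.exp(-t / 10.0)
      event = vibration_dose_value(a_w, fs)
      print("event VDV:", event, "m/s^1.75")
      print("exposure for 25 identical events:", total_vdv([event] * 25))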

  6. Super-Resolution Reconstruction of Remote Sensing Images Using Multifractal Analysis

    Directory of Open Access Journals (Sweden)

    Mao-Gui Hu

    2009-10-01

    Full Text Available Satellite remote sensing (RS) is an important contributor to Earth observation, providing various kinds of imagery every day, but low spatial resolution remains a critical bottleneck in a lot of applications, restricting higher spatial resolution analysis (e.g., intraurban). In this study, a multifractal-based super-resolution reconstruction method is proposed to alleviate this problem. The multifractal characteristic is common in Nature. The self-similarity or self-affinity presented in the image is useful to estimate details at larger and smaller scales than the original. We first look for the presence of multifractal characteristics in the images. Then we estimate parameters of the information transfer function and noise of the low resolution image. Finally, a noise-free, spatial-resolution-enhanced image is generated by a fractal coding-based denoising and downscaling method. The empirical case shows that the reconstructed super-resolution image performs well in detail enhancement. This method is not only useful for remote sensing in investigating Earth, but also for other images with multifractal characteristics.

  7. Performance analysis of different PSF shapes for the quad-HIDAC PET submillimetre resolution recovery

    Energy Technology Data Exchange (ETDEWEB)

    Ortega Maynez, Leticia, E-mail: lortega@uacj.mx [Departamento de Ingenieria Electrica y Computacion, Universidad Autonoma de Ciudad Juarez, Avenida del Charro 450 Norte, C.P. 32310 Ciudad Juarez, Chihuahua (Mexico); Dominguez de Jesus Ochoa, Humberto; Villegas Osiris Vergara, Osslan; Gordillo, Nelly; Guadalupe Cruz Sanchez, Vianey; Gutierrez Casas, Efren David [Departamento de Ingenieria Electrica y Computacion, Universidad Autonoma de Ciudad Juarez, Avenida del Charro 450 Norte, C.P. 32310 Ciudad Juarez, Chihuahua (Mexico)

    2011-10-01

    In pre-clinical applications, it is quite important to preserve the image resolution because it is necessary to show the details of the structures of small animals. Therefore, small animal PET scanners require high spatial resolution and good sensitivity. For the quad-HIDAC PET scanner, which has virtually continuous spatial sampling, improvements in resolution, noise and contrast are obtained as a result of avoiding artifacts introduced by binning the data into sampled projections during the reconstruction process. In order to reconstruct high-resolution images in 3D-PET, background correction and resolution recovery are included within the Maximum Likelihood list-mode Expectation Maximization reconstruction model. This paper introduces the performance analysis of the Gaussian, Laplacian and triangular kernels. The Full-Width at Half-Maximum used for each kernel was varied from 0.8 to 1.6 mm. For each quality compartment within the phantom, transaxial middle slices from the 3D reconstructed images are shown. Results show that, according to the quantitative measures, the triangular kernel has the best performance.
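
    A sketch of the three kernel shapes compared in the study, parameterized by their full width at half maximum (FWHM) and normalized to unit sum; this illustrates the shapes only and is not the scanner's actual PSF model.

      import numpy as np

      def gaussian_kernel(x, fwhm):
          sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))   # FWHM -> sigma
          k = np.exp(-x**2 / (2.0 * sigma**2))
          return k / k.sum()

      def laplacian_kernel(x, fwhm):
          b = fwhm / (2.0 * np.log(2.0))                       # FWHM of exp(-|x|/b) is 2 b ln 2
          k = np.exp(-np.abs(x) / b)
          return k / k.sum()

      def triangular_kernel(x, fwhm):
          k = np.clip(1.0 - np.abs(x) / fwhm, 0.0, None)       # half maximum at |x| = FWHM / 2
          return k / k.sum()

      # 1-D spatial axis in mm; FWHM swept over the range used in the study.
      x = np.linspace(-4.0, 4.0, 161)
      for fwhm in (0.8, 1.2, 1.6):
          g = gaussian_kernel(x, fwhm)
          l = laplacian_kernel(x, fwhm)
          t = triangular_kernel(x, fwhm)
          print(fwhm, g.max(), l.max(), t.max())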

  8. Modal spectral analysis of piping: Determination of the significant frequency range

    International Nuclear Information System (INIS)

    Geraets, L.H.

    1981-01-01

    This paper investigates the influence of the number of modes on the response of a piping system in a dynamic modal spectral analysis. It shows how the analysis can be limited to a specific frequency range of the pipe (independent of the frequency range of the response spectrum), allowing cost reduction without loss in accuracy. The 'missing mass' is taken into account through an original technique. (orig./HP)

  9. A High-Resolution Continuous Flow Analysis System for Polar Ice Cores

    DEFF Research Database (Denmark)

    Dallmayr, Remi; Goto-Azuma, Kumiko; Kjær, Helle Astrid

    2016-01-01

    In recent decades, the development of continuous flow analysis (CFA) technology for ice core analysis has enabled greater sample throughput and greater depth resolution compared with the classic discrete sampling technique. We developed the first Japanese CFA system at the National Institute of Polar Research (NIPR) in Tokyo. The system allows the continuous analysis of stable water isotopes and electrical conductivity, as well as the collection of discrete samples from both inner and outer parts of the core. This CFA system was designed to have sufficiently high temporal resolution to detect signals of abrupt climate change in deep polar ice cores. To test its performance, we used the system to analyze different climate intervals in ice drilled at the NEEM (North Greenland Eemian Ice Drilling) site, Greenland. The quality of our continuous measurement of stable water isotopes has been ...

  10. Method for high resolution magnetic resonance analysis using magic angle technique

    Science.gov (United States)

    Wind, Robert A.; Hu, Jian Zhi

    2003-12-30

    A method of performing a magnetic resonance analysis of a biological object that includes placing the object in a main magnetic field (that has a static field direction) and in a radio frequency field; rotating the object at a frequency of less than about 100 Hz around an axis positioned at an angle of about 54°44' relative to the main magnetic static field direction; pulsing the radio frequency to provide a sequence that includes a phase-corrected magic angle turning pulse segment; and collecting data generated by the pulsed radio frequency. The object may be reoriented about the magic angle axis between three predetermined positions that are related to each other by 120°. The main magnetic field may be rotated mechanically or electronically. Methods for magnetic resonance imaging of the object are also described.

  11. AUTOMATED WETLAND DELINEATION FROM MULTI-FREQUENCY AND MULTI-POLARIZED SAR IMAGES IN HIGH TEMPORAL AND SPATIAL RESOLUTION

    Directory of Open Access Journals (Sweden)

    L. Moser

    2016-06-01

    Full Text Available Water scarcity is one of the main challenges posed by the changing climate. Especially in semi-arid regions, where water reservoirs are filled during the very short rainy season but have to store enough water for the extremely long dry season, the intelligent handling of water resources is vital. This study focusses on Lac Bam in Burkina Faso, which is the largest natural lake of the country and of high importance for the local inhabitants for irrigated farming, animal watering, and extraction of water for drinking and sanitation. With respect to the competition for water resources, an independent area-wide monitoring system is essential for the acceptance of any decision maker. The following contribution introduces a weather- and illumination-independent monitoring system for automated wetland delineation with a high temporal (about two weeks) and a high spatial (about five meters) sampling. The similarities of the multi-frequency and multi-polarized SAR acquisitions by RADARSAT-2 and TerraSAR-X are studied as well as the differences. The results indicate that even basic approaches without pre-classification time series analysis or post-classification filtering are already enough to establish a monitoring system of prime importance for a whole region.

  12. Estimating uncertainty in resolution tests

    CSIR Research Space (South Africa)

    Goncalves, DP

    2006-05-01

    Full Text Available ... frequencies yields a biased estimate, and we provide an improved estimator. An application illustrates how the results derived can be incorporated into a larger uncertainty analysis. © 2006 Society of Photo-Optical Instrumentation Engineers. DOI: 10.1117/1.2202914. Subject terms: resolution testing; USAF 1951 test target; resolution uncertainty.

  13. Extreme Precipitation Estimation with Typhoon Morakot Using Frequency and Spatial Analysis

    Directory of Open Access Journals (Sweden)

    Hone-Jay Chu

    2011-01-01

    Full Text Available Typhoon Morakot lashed Taiwan and produced copious amounts of precipitation in 2009. From the point of view of hydrological statistics, the impact of the precipitation from Typhoon Morakot can be analyzed and discussed using frequency analysis. The frequency curve, which is fitted mathematically to historical observed data, can be used to estimate the probability of exceedance for runoff events of a certain magnitude. The study integrates frequency analysis and spatial analysis to assess the effect of the Typhoon Morakot event on rainfall frequency in the Gaoping River basin of southern Taiwan. First, extreme rainfall data were collected at sixteen stations for durations of 1, 3, 6, 12, and 24 hours, and then an appropriate probability distribution was selected to analyze the impact of the extreme hydrological event. Spatial rainfall patterns for a return period of 200-yr with 24-hr duration, with and without Typhoon Morakot, are estimated. Results show that, for long durations, the rainfall amounts estimated by the frequency analysis with and without the event differ significantly. Furthermore, spatial analysis shows that extreme rainfall for a return period of 200-yr is highly dependent on topography and is smaller in the southwest than in the east. The results not only demonstrate the distinct effect of Typhoon Morakot on frequency analysis, but also could provide a reference for future planning of hydrological engineering.
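
    A hedged sketch of the frequency-analysis step, assuming a GEV distribution for annual maxima and a synthetic rainfall series in place of the gauge records: the fitted distribution gives the 200-yr, 24-h return level as the quantile exceeded with probability 1/200 in any year.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      # Synthetic 24-h annual maximum rainfall series (mm), standing in for a gauge record.
      annual_max = stats.genextreme.rvs(c=-0.1, loc=300.0, scale=80.0, size=60,
                                        random_state=rng)

      # Fit a GEV distribution to the annual maxima.
      c, loc, scale = stats.genextreme.fit(annual_max)

      # 200-year return level = quantile exceeded with probability 1/200 per year.
      return_level_200 = stats.genextreme.isf(1.0 / 200.0, c, loc=loc, scale=scale)
      print("estimated 200-yr, 24-h rainfall:", round(return_level_200, 1), "mm")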

  14. Soil-structure interaction analysis of NPP containments: substructure and frequency domain methods

    International Nuclear Information System (INIS)

    Venancio-Filho, F.; Almeida, M.C.F.; Ferreira, W.G.; De Barros, F.C.P.

    1997-01-01

    Substructure and frequency domain methods for soil-structure interaction are addressed in this paper. After a brief description of mathematical models for the soil and of excitation, the equations for dynamic soil-structure interaction are developed for a rigid surface foundation and for an embedded foundation. The equations for the frequency domain analysis of MDOF systems are provided. An example of soil-structure interaction analysis with frequency-dependent soil properties is given and examples of identification of foundation impedance functions and soil properties are presented. (orig.)

  15. Econometric analysis of realized covariation: high frequency based covariance, regression, and correlation in financial economics

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2004-01-01

    This paper analyses multivariate high frequency financial data using realized covariation. We provide a new asymptotic distribution theory for standard methods such as regression, correlation analysis, and covariance. It will be based on a fixed interval of time (e.g., a day or week), allowing the number of high frequency returns during this period to go to infinity. Our analysis allows us to study how high frequency correlations, regressions, and covariances change through time. In particular we provide confidence intervals for each of these quantities.
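
    A minimal sketch of the realized covariation estimator on simulated intraday returns (the asymptotic theory of the paper is not reproduced): the realized covariance is the sum over intraday intervals of the outer products of the return vectors, from which realized correlation and regression coefficients follow.

      import numpy as np

      rng = np.random.default_rng(0)
      # Simulated 5-minute returns for two assets over one trading day (78 intervals).
      true_cov = np.array([[1.0, 0.6], [0.6, 1.5]]) * 1e-6
      returns = rng.multivariate_normal(mean=[0.0, 0.0], cov=true_cov, size=78)

      # Realized covariance: sum over intraday intervals of r_i r_i'
      realized_cov = returns.T @ returns

      realized_corr = realized_cov[0, 1] / np.sqrt(realized_cov[0, 0] * realized_cov[1, 1])
      realized_beta = realized_cov[0, 1] / realized_cov[1, 1]   # regression of asset 1 on asset 2
      print(realized_cov, realized_corr, realized_beta)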

  16. The Real-time Frequency Spectrum Analysis of Neutron Pulse Signal Series

    International Nuclear Information System (INIS)

    Tang Yuelin; Ren Yong; Wei Biao; Feng Peng; Mi Deling; Pan Yingjun; Li Jiansheng; Ye Cenming

    2009-01-01

    The frequency spectrum analysis of neutron pulse signals is a very important method in nuclear stochastic signal processing. Focused on the special '0' and '1' structure of neutron pulse signal series, this paper proposes a new rotation-table and realizes a real-time frequency spectrum algorithm at a 1 GHz sample rate on a PC using add, address and SSE operations. The numerical experimental results show that, at a count rate of 3×10⁶ s⁻¹, this algorithm is superior to FFTW in time consumption and can meet the real-time requirement of frequency spectrum analysis. (authors)

  17. Study of interhemispheric asymmetries in electroencephalographic signals by frequency analysis

    International Nuclear Information System (INIS)

    Zapata, J F; Garzon, J

    2011-01-01

    This study provides a new method for the detection of interhemispheric asymmetries in patients undergoing continuous video-electroencephalography (EEG) monitoring at the Intensive Care Unit (ICU), using wavelet energy. We obtained EEG recordings from 42 patients with different pathologies and then performed signal processing using the Matlab program; we compared the abnormalities recorded in the report by the neurophysiologist, the images of each patient, and the result of the signal analysis with the Discrete Wavelet Transform (DWT). Conclusions: there is correspondence between the abnormalities found in the signal processing and the clinical reports of findings in the patients; accordingly, the methodology used can be a useful tool for the diagnosis and early quantitative detection of interhemispheric asymmetries.
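
    A hedged sketch (in Python rather than the Matlab pipeline used by the authors) of a wavelet-energy comparison between homologous left and right channels: the relative energy per DWT level is computed for each channel and combined into a simple asymmetry index. The signals, wavelet choice, and index definition are assumptions made for illustration.

      import numpy as np
      import pywt

      def relative_wavelet_energy(x, wavelet='db4', level=5):
          """Energy per DWT decomposition level, normalized to the total energy."""
          coeffs = pywt.wavedec(x, wavelet, level=level)
          energies = np.array([np.sum(c ** 2) for c in coeffs])
          return energies / energies.sum()

      # Toy surrogate signals for homologous left/right channels (e.g. C3 vs C4).
      fs = 256
      t = np.arange(0, 10, 1 / fs)
      left = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)
      right = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)

      e_left = relative_wavelet_energy(left)
      e_right = relative_wavelet_energy(right)

      # Simple interhemispheric asymmetry index per level.
      asymmetry = (e_left - e_right) / (e_left + e_right)
      print(np.round(asymmetry, 3))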

  18. Space Shuttle and Space Station Radio Frequency (RF) Exposure Analysis

    Science.gov (United States)

    Hwu, Shian U.; Loh, Yin-Chung; Sham, Catherine C.; Kroll, Quin D.

    2005-01-01

    This paper outlines the modeling techniques and important parameters to define a rigorous but practical procedure that can verify the compliance of RF exposure to the NASA standards for astronauts and electronic equipment. The electromagnetic modeling techniques are applied to analyze RF exposure in Space Shuttle and Space Station environments with reasonable computing time and resources. The modeling techniques are capable of taking into account the field interactions with Space Shuttle and Space Station structures. The obtained results illustrate the multipath effects due to the presence of the space vehicle structures. It's necessary to include the field interactions with the space vehicle in the analysis for an accurate assessment of the RF exposure. Based on the obtained results, the RF keep out zones are identified for appropriate operational scenarios, flight rules and necessary RF transmitter constraints to ensure a safe operating environment and mission success.

  19. Statistical analysis of corn yields responding to climate variability at various spatio-temporal resolutions

    Science.gov (United States)

    Jiang, H.; Lin, T.

    2017-12-01

    Rain-fed corn production systems are subject to sub-seasonal variations of precipitation and temperature during the growing season. As each growth phase has its own inherent physiological processes, plants require different optimal environmental conditions during each phase. However, this temporal heterogeneity in the response to climate variability along the crop lifecycle is often simplified and fixed as a constant response in large-scale statistical modeling analyses. To capture the time-variant growing requirements in large-scale statistical analysis, we develop and compare statistical models at various spatial and temporal resolutions to quantify the relationship between corn yield and weather factors for 12 corn belt states from 1981 to 2016. The study compares three spatial resolutions (county, agricultural district, and state scale) and three temporal resolutions (crop growth phase, monthly, and growing season) to characterize the effects of spatial and temporal variability. Our results show that the agricultural district model together with growth phase resolution can explain 52% of the variation in corn yield caused by temperature and precipitation variability. It provides a practical model structure balancing the overfitting problem of the county-specific model and the weak explanatory power of the state-specific model. In the US corn belt, precipitation has a positive impact on corn yield throughout the growing season except for the vegetative stage, while sensitivity to extreme heat is highest from the silking to the dough phase. The results show the northern counties in the corn belt area are less affected by extreme heat but are more vulnerable to water deficiency.

  20. A compact high resolution ion mobility spectrometer for fast trace gas analysis.

    Science.gov (United States)

    Kirk, Ansgar T; Allers, Maria; Cochems, Philipp; Langejuergen, Jens; Zimmermann, Stefan

    2013-09-21

    Drift tube ion mobility spectrometers (IMS) are widely used for fast trace gas detection in air, but portable compact systems are typically very limited in their resolving power. Decreasing the initial ion packet width improves the resolution, but is generally associated with a reduced signal-to-noise-ratio (SNR) due to the lower number of ions injected into the drift region. In this paper, we present a refined theory of IMS operation which employs a combined approach for the analysis of the ion drift and the subsequent amplification to predict both the resolution and the SNR of the measured ion current peak. This theoretical analysis shows that the SNR is not a function of the initial ion packet width, meaning that compact drift tube IMS with both very high resolution and extremely low limits of detection can be designed. Based on these implications, an optimized combination of a compact drift tube with a length of just 10 cm and a transimpedance amplifier has been constructed with a resolution of 183 measured for the positive reactant ion peak (RIP(+)), which is sufficient to e.g. separate the RIP(+) from the protonated acetone monomer, even though their drift times only differ by a factor of 1.007. Furthermore, the limits of detection (LODs) for acetone are 180 pptv within 1 s of averaging time and 580 pptv within only 100 ms.

  1. Large Scale Flood Risk Analysis using a New Hyper-resolution Population Dataset

    Science.gov (United States)

    Smith, A.; Neal, J. C.; Bates, P. D.; Quinn, N.; Wing, O.

    2017-12-01

    Here we present the first national scale flood risk analyses, using high-resolution Facebook Connectivity Lab population data and data from a hyper-resolution flood hazard model. In recent years the field of large scale hydraulic modelling has been transformed by new remotely sensed datasets, improved process representation, highly efficient flow algorithms and increases in computational power. These developments have allowed flood risk analysis to be undertaken in previously unmodeled territories and from continental to global scales. Flood risk analyses are typically conducted via the integration of modelled water depths with an exposure dataset. Over large scales and in data poor areas, these exposure data typically take the form of a gridded population dataset, estimating population density using remotely sensed data and/or locally available census data. The local nature of flooding dictates that for robust flood risk analysis to be undertaken both hazard and exposure data should sufficiently resolve local scale features. Global flood frameworks are enabling flood hazard data to be produced at 90m resolution, resulting in a mismatch with available population datasets which are typically more coarsely resolved. Moreover, these exposure data are typically focused on urban areas and struggle to represent rural populations. In this study we integrate a new population dataset with a global flood hazard model. The population dataset was produced by the Connectivity Lab at Facebook, providing gridded population data at 5m resolution, representing a resolution increase over previous countrywide data sets of multiple orders of magnitude. Flood risk analyses undertaken over a number of developing countries are presented, along with a comparison of flood risk analyses undertaken using pre-existing population datasets.

  2. Eulerian frequency analysis of structural vibrations from high-speed video

    International Nuclear Information System (INIS)

    Venanzoni, Andrea; De Ryck, Laurent; Cuenca, Jacques

    2016-01-01

    An approach for the analysis of the frequency content of structural vibrations from high-speed video recordings is proposed. The techniques and tools proposed rely on an Eulerian approach, that is, using the time history of pixels independently to analyse structural motion, as opposed to Lagrangian approaches, where the motion of the structure is tracked in time. The starting point is an existing Eulerian motion magnification method, which consists in decomposing the video frames into a set of spatial scales through a so-called Laplacian pyramid [1]. Each scale — or level — can be amplified independently to reconstruct a magnified motion of the observed structure. The approach proposed here provides two analysis tools or pre-amplification steps. The first tool provides a representation of the global frequency content of a video per pyramid level. This may be further enhanced by applying an angular filter in the spatial frequency domain to each frame of the video before the Laplacian pyramid decomposition, which allows for the identification of the frequency content of the structural vibrations in a particular direction of space. This proposed tool complements the existing Eulerian magnification method by amplifying selectively the levels containing relevant motion information with respect to their frequency content. This magnifies the displacement while limiting the noise contribution. The second tool is a holographic representation of the frequency content of a vibrating structure, yielding a map of the predominant frequency components across the structure. In contrast to the global frequency content representation of the video, this tool provides a local analysis of the periodic gray scale intensity changes of the frame in order to identify the vibrating parts of the structure and their main frequencies. Validation cases are provided and the advantages and limits of the approaches are discussed. The first validation case consists of the frequency content
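
    A minimal sketch of the Eulerian step on a synthetic frame stack (the full method first decomposes each frame into a Laplacian pyramid, which is omitted here): the FFT of every pixel's intensity time history yields both a global frequency content and a map of the dominant frequency across the image. The frame rate and the oscillating patch are assumptions.

      import numpy as np

      fps = 1000.0                      # high-speed camera frame rate (assumed)
      n_frames, h, w = 512, 64, 64

      # Synthetic grayscale frame stack: a patch of the "structure" oscillates at 120 Hz.
      t = np.arange(n_frames) / fps
      frames = 0.01 * np.random.rand(n_frames, h, w)
      frames[:, 20:40, 20:40] += 0.5 * np.sin(2 * np.pi * 120.0 * t)[:, None, None]

      # Eulerian analysis: FFT of every pixel's intensity history (mean removed).
      spectra = np.abs(np.fft.rfft(frames - frames.mean(axis=0), axis=0))
      freqs = np.fft.rfftfreq(n_frames, d=1.0 / fps)

      global_content = spectra.mean(axis=(1, 2))                 # global frequency content
      dominant_map = freqs[np.argmax(spectra[1:], axis=0) + 1]   # per-pixel dominant frequency

      print("global peak at", freqs[np.argmax(global_content[1:]) + 1], "Hz")
      print("dominant frequency in the oscillating patch:", dominant_map[30, 30], "Hz")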

  3. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    Science.gov (United States)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

    Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they provoke high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events when they are damaged or the discharge exceeds the recordable level. In this case, no systematic data allows improvement of the understanding of the spatial and temporal occurrence of the process. Since historic documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt change of the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  4. Analysis of Clinical Predictors of Resolution of Sleep Disturbance Related to Frequent Nighttime Heartburn and Acid Regurgitation Symptoms in Individuals Taking Esomeprazole 20 mg or Placebo.

    Science.gov (United States)

    Johnson, David A; Le Moigne, Anne; Li, Jing; Pollack, Charles; Nagy, Peter

    2016-07-01

    Sleep disturbances related to reflux symptoms have a significant impact on the daily lives of affected individuals. These analyses identified clinical factors related to resolution of reflux-related sleep disturbance in subjects treated with esomeprazole 20 mg for 14 days. Data from the first 14 days of 2 similar randomized, double-blind studies were pooled for subjects experiencing frequent heartburn and related sleep disturbances receiving esomeprazole 20 mg (n = 357) or placebo (n = 346). A stepwise logistic regression analysis was performed with pooled and individual study data to evaluate relationships between clinical factors [treatment (esomeprazole vs. placebo), run-in sleep disturbance frequency, occurrence (yes/no) of 24-h, daytime, and nighttime heartburn (yes: ≥1 episode in 14-day treatment period)] and complete sleep disturbance resolution (no disturbances for 7 consecutive days). Absence of daytime (p = 0.0018) or nighttime (p heartburn during treatment was a significant predictor of complete sleep disturbance resolution at 14 days for the total population, while higher run-in sleep disturbance frequency (p heartburn during therapy, and esomeprazole treatment predicted complete reflux-related sleep disturbance resolution. The magnitude of therapeutic benefit for esomeprazole 20 mg over placebo increased with increasing baseline sleep disturbance.

  5. Time-Frequency-Wavenumber Analysis of Surface Waves Using the Continuous Wavelet Transform

    Science.gov (United States)

    Poggi, V.; Fäh, D.; Giardini, D.

    2013-03-01

    A modified approach to surface wave dispersion analysis using active sources is proposed. The method is based on continuous recordings, and uses the continuous wavelet transform to analyze the phase velocity dispersion of surface waves. This gives the possibility to accurately localize the phase information in time, and to isolate the most significant contribution of the surface waves. To extract the dispersion information, then, a hybrid technique is applied to the narrowband filtered seismic recordings. The technique combines the flexibility of the slant stack method in identifying waves that propagate in space and time, with the resolution of f-k approaches. This is particularly beneficial for higher mode identification in cases of high noise levels. To process the continuous wavelet transform, a new mother wavelet is presented and compared to the classical and widely used Morlet type. The proposed wavelet is obtained from a raised-cosine envelope function (Hanning type). The proposed approach is particularly suitable when using continuous recordings (e.g., from seismological-like equipment) since it does not require any hardware-based source triggering. This can be subsequently done with the proposed method. Estimation of the surface wave phase delay is performed in the frequency domain by means of a covariance matrix averaging procedure over successive wave field excitations. Thus, no record stacking is necessary in the time domain and a large number of consecutive shots can be used. This leads to a certain simplification of the field procedures. To demonstrate the effectiveness of the method, we tested it on synthetics as well as on real field data. For the real case we also combine dispersion curves from ambient vibrations and active measurements.

  6. Time-frequency analysis to a particular type of scattering problems involving metallic-polymer tubing structures.

    Science.gov (United States)

    Elhanaoui, Abdelkader; Aassif, Elhoucein; Maze, Gérard; Décultot, Dominique

    2018-01-01

    In this paper, recent studies of backscattered acoustic signals in thinner steel-polymer tubing structures are presented. Reassigned smoothed pseudo Wigner-Ville (rspWV) analysis has been adopted in order to diminish the cross-term effect and achieve high spectral resolution. Vibration modes, which are associated with the resonances of circumferential waves, have been determined by using the modal isolation plane representation. At normalized frequencies below 140, an appreciable influence of the polymer coating thickness on the A0+ and S0 modes has been noticed. Furthermore, the trajectory of the A0- wave is modified in the normalized frequency band 40-42. Group velocity curves of the A0- wave have then been graphically illustrated. The findings show a particular curvature change at reduced frequency 41 in the case of a two-layer tube immersed in water. Studies of acoustic backscattering involving steel-polymer tubing structures have confirmed the significant coupling of the A0+ and S0 waves. Besides, the disappearance of the A0+ resonance trajectory has been observed, which is a very important phenomenon to understand. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Study on The Extended Range Weather Forecast of Low Frequency Signal Based on Period Analysis Method

    Science.gov (United States)

    Li, X.

    2016-12-01

    Although many studies have explored the MJO and its application to weather forecasting, low-frequency oscillation has been insufficiently studied for extended-range weather forecasting over the middle and high latitudes. In China, the low-frequency synoptic map is a useful tool used by operational meteorological departments to forecast extended-range weather. It is therefore necessary to develop objective methods for finding the low-frequency signal and for interpreting and applying this signal in extended-range weather forecasting. In this paper, a Butterworth band-pass filter was applied to obtain the low-frequency 500 hPa height field from 1980 to 2014 using NCEP/NCAR daily gridded data. Period analysis and optimal subset regression methods were then used to process the low-frequency data of the 150 days before the first forecast day and to extend the low-frequency signal of the 500 hPa height field 30 days into the future, globally, from June to August during 2011-2014. Finally, the results were tested. The main results are as follows: (1) In general, the fitting of the low-frequency signals of the 500 hPa height field by period analysis was better in the northern hemisphere than in the southern hemisphere, and better at low latitudes than at high latitudes. The fitting accuracy gradually decreased with increasing forecast length and tended to be stable during the late forecasting period. (2) The fitting over the 6 key regions in China showed that, except for the filtering result over the Xinjiang area in the first 10 and 30 days, the filtering results over the other 5 key regions passed the reliability test at a level above 95% throughout the whole period. (3) The center and extent of the low-frequency low- and high-pressure systems can be fitted well using the methods mentioned above, which is consistent with the corresponding use of the low-frequency synoptic map for extended-range prediction. Application of the
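
    A minimal sketch of the filtering step is given below: a zero-phase Butterworth band-pass applied to a daily 500 hPa height series. The 10-30-day band, filter order, and synthetic data are illustrative assumptions; the study's exact cutoff periods are not stated in the abstract.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_lowfreq(series, low_period=30.0, high_period=10.0, fs=1.0, order=4):
    """Zero-phase Butterworth band-pass of a daily series (fs = 1 sample/day).

    low_period/high_period are in days; the 10-30-day band is an illustrative
    choice for intraseasonal low-frequency signals, not necessarily the band
    used in the study.
    """
    nyq = 0.5 * fs
    low = (1.0 / low_period) / nyq     # lower cutoff frequency (cycles/day)
    high = (1.0 / high_period) / nyq   # upper cutoff frequency
    b, a = butter(order, [low, high], btype="bandpass")
    return filtfilt(b, a, series)      # filtfilt avoids phase distortion

# Example: 150 days of synthetic daily 500 hPa height anomalies
days = np.arange(150)
z500 = 30 * np.sin(2 * np.pi * days / 20) + 10 * np.random.randn(days.size)
lf_signal = bandpass_lowfreq(z500)
print(lf_signal[:5])
```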

  8. High Resolution Melting Analysis for fast and cheap polymorphism screening of marine populations

    OpenAIRE

    sprotocols

    2015-01-01

    Authors: Anne-Leila Meistertzheim, Isabelle Calves, Sébastien Artigaud, Carolyn S. Friedman, Christine Paillard, Jean Laroche & Claude Ferec ### Abstract This protocol permits the mutation scanning of PCR products by high-resolution DNA melting analysis, requiring only the inclusion of a saturating intercalating dye in the PCR mix, without a labelled probe. During the scanning process, fluorescent melting curves of the PCR amplicons are analyzed. Mutations modifying the melting curve shapes are allowed...

  9. Report on Ultra-high Resolution Gamma-/X-ray Analysis of Uranium Skull Oxide

    International Nuclear Information System (INIS)

    Friedrich, S.; Velazquez, M.; Drury, O.; Salaymeh, S.

    2009-01-01

    We have utilized the high energy resolution and high peak-to-background ratio of superconducting TES γ-detectors at very low energies for non-destructive analysis of a skull oxide derived from reprocessed nuclear fuel. Specifically, we demonstrate that superconducting detectors can separate and analyze the strong actinide emission lines in the spectral region below 60 keV that are often obscured in γ-measurements with conventional Ge detectors.

  10. Frequency Analysis Using Bootstrap Method and SIR Algorithm for Prevention of Natural Disasters

    Science.gov (United States)

    Kim, T.; Kim, Y. S.

    2017-12-01

    The frequency analysis of hydrometeorological data is one of the most important factors in responding to natural disaster damage and in setting design standards for disaster prevention facilities. Frequency analysis of hydrometeorological data usually assumes that the observations are statistically stationary, and a parametric method based on the parameters of a probability distribution is applied. A parametric method requires a sufficiently large set of reliable data; however, snowfall observations in Korea need to be supplemented because the number of days with snowfall observations and the mean maximum daily snowfall depth are decreasing due to climate change. In this study, we conducted a frequency analysis for snowfall using the Bootstrap method and the SIR algorithm, resampling methods that can overcome the problem of insufficient data. For 58 meteorological stations distributed evenly across Korea, probabilistic snowfall depths were estimated by non-parametric frequency analysis using the maximum daily snowfall depth data. The results show that the probabilistic daily snowfall depth obtained by frequency analysis decreases at most stations, and the rates of change at most stations were found to be consistent between the parametric and non-parametric frequency analyses. This study shows that resampling methods can be used for frequency analysis of snowfall depth with insufficient observed samples, and the approach can be applied to the interpretation of other natural disasters with seasonal characteristics, such as summer typhoons. Acknowledgment: This research was supported by a grant (MPSS-NH-2015-79) from the Disaster Prediction and Mitigation Technology Development Program funded by the Korean Ministry of Public Safety and Security (MPSS).
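
    The sketch below shows a plain non-parametric bootstrap estimate of a probabilistic (T-year) snowfall depth from an annual-maximum series. It only illustrates the resampling idea; the study's combination of the Bootstrap method with the SIR algorithm is not reproduced, and the data and return period are placeholders.

```python
import numpy as np

def bootstrap_return_level(annual_maxima, return_period=50, n_boot=5000, seed=0):
    """Non-parametric bootstrap estimate of the T-year snowfall depth.

    Resamples the annual-maximum series with replacement and takes the
    empirical (1 - 1/T) quantile of each resample; returns the mean estimate
    and a 90% bootstrap interval.  A sketch only: the study additionally uses
    an SIR algorithm, which is not reproduced here.
    """
    rng = np.random.default_rng(seed)
    data = np.asarray(annual_maxima, dtype=float)
    prob = 1.0 - 1.0 / return_period
    estimates = np.empty(n_boot)
    for i in range(n_boot):
        resample = rng.choice(data, size=data.size, replace=True)
        estimates[i] = np.quantile(resample, prob)
    return estimates.mean(), np.percentile(estimates, [5, 95])

# Example with a short synthetic record of annual maximum daily snowfall (cm)
maxima = np.array([12.0, 8.5, 22.3, 15.1, 5.2, 30.4, 18.0, 9.9, 14.7, 25.6])
level, ci = bootstrap_return_level(maxima)
print(level, ci)
```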

  11. Analysis of Fatigue Life of PMMA at Different Frequencies Based on a New Damage Mechanics Model

    Directory of Open Access Journals (Sweden)

    Aifeng Huang

    2014-01-01

    Low-cycle fatigue tests at different frequencies and creep tests under different stress levels were conducted on Plexiglas Resist 45. Correspondingly, the creep fracture time, S-N curves, cyclic creep, and hysteresis loops were obtained. These results showed that the fatigue life increases with frequency in the low-frequency domain. After analysis, it was found that fatigue life depends on the load rate and is affected by creep damage. In addition, a new continuum damage mechanics (CDM) model was established to analyze creep-fatigue life, in which a nonlinear damage-increment summation rule was proposed and a frequency modification was made to the fatigue damage evolution equation. A differential evolution (DE) algorithm was employed to determine the parameters within the model. The proposed model described fatigue life under different frequencies, and the calculated results agreed well with the experimental results.
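
    The sketch below illustrates the parameter-identification step with differential evolution applied to a deliberately simplified frequency-dependent life model. The model form N(f) = a * f**b, the bounds, and the data are assumptions for illustration only, not the paper's CDM equations.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Measured fatigue lives (cycles) at several loading frequencies (Hz) -- synthetic placeholder data
freqs = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
lives = np.array([1.2e4, 2.0e4, 3.1e4, 5.5e4, 7.8e4])

def life_model(params, f):
    """Illustrative frequency-dependent life model N(f) = a * f**b (not the paper's CDM model)."""
    a, b = params
    return a * f ** b

def objective(params):
    """Sum of squared log-residuals between modeled and measured lives."""
    pred = life_model(params, freqs)
    return np.sum((np.log(pred) - np.log(lives)) ** 2)

# Differential evolution performs a bounded global search of the parameter space,
# the same role it plays in determining the CDM model parameters in the paper.
result = differential_evolution(objective, bounds=[(1e3, 1e6), (0.0, 2.0)], seed=1)
a_fit, b_fit = result.x
print(a_fit, b_fit, result.fun)
```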

  12. High Resolution Frequency Swept Imaging.

    Science.gov (United States)

    1983-09-30

    Preliminary experimental results obtained using ultrasound are extremely encouraging. The signals are of broad-band nature: they are either in the form of relatively long chirps (whistles) or impulse-like pings or clicks of less than 1 msec duration. Reference: "Bottlenose Porpoise: A Study of Whistles and Clicks", Zoologica, Vol. 47, 1962, pp. 121-128. (Appendix II: Holography, Wavelength Diversity.)

  13. High-frequency and meso-scale winter sea-ice variability in the Southern Ocean in a high-resolution global ocean model

    Science.gov (United States)

    Stössel, Achim; von Storch, Jin-Song; Notz, Dirk; Haak, Helmuth; Gerdes, Rüdiger

    2018-03-01

    This study is on high-frequency temporal variability (HFV) and meso-scale spatial variability (MSV) of winter sea-ice drift in the Southern Ocean simulated with a global high-resolution (0.1°) sea ice-ocean model. Hourly model output is used to distinguish MSV characteristics via patterns of mean kinetic energy (MKE) and turbulent kinetic energy (TKE) of ice drift, surface currents, and wind stress, and HFV characteristics via time series of raw variables and correlations. We find that (1) along the ice edge, the MSV of ice drift coincides with that of surface currents, in particular such due to ocean eddies; (2) along the coast, the MKE of ice drift is substantially larger than its TKE and coincides with the MKE of wind stress; (3) in the interior of the ice pack, the TKE of ice drift is larger than its MKE, mostly following the TKE pattern of wind stress; (4) the HFV of ice drift is dominated by weather events, and, in the absence of tidal currents, locally and to a much smaller degree by inertial oscillations; (5) along the ice edge, the curl of the ice drift is highly correlated with that of surface currents, mostly reflecting the impact of ocean eddies. Where ocean eddies occur and the ice is relatively thin, ice velocity is characterized by enhanced relative vorticity, largely matching that of surface currents. Along the ice edge, ocean eddies produce distinct ice filaments, the realism of which is largely confirmed by high-resolution satellite passive-microwave data.

  14. Design and analysis of planar spiral resonator bandstop filter for microwave frequency

    Science.gov (United States)

    Motakabber, S. M. A.; Shaifudin Suharsono, Muhammad

    2017-11-01

    At microwave frequencies, a spiral resonator can act as either a frequency-rejecting or a frequency-accepting circuit. A planar logarithmic spiral resonator bandstop filter has been developed based on this property. This project focuses on the rejection property of the spiral resonator. The performance of the proposed filter circuit has been analyzed using the scattering parameter (S-parameter) technique over the ultra-wideband microwave frequency range. The filter was built and simulated, and the S-parameter analysis was carried out using the electromagnetic simulation software CST Microwave Studio. The commercial microwave substrate Taconic TLX-8 was used to build this filter. Experimental results showed that the -10 dB rejection bandwidth of the filter is 2.32 GHz and the central frequency is 5.72 GHz, which is suitable for ultra-wideband applications. The simulated and experimental results are in good agreement.
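
    As a small illustration of how such figures of merit can be read off an S-parameter sweep, the sketch below finds the -10 dB rejection band and its center frequency from a synthetic S21 curve. The data stand in for a CST-exported sweep; the threshold handling and curve shape are assumptions.

```python
import numpy as np

def rejection_band(freq_ghz, s21_db, threshold_db=-10.0):
    """Return (center frequency, bandwidth) of the band where |S21| is at or below the threshold."""
    below = np.where(s21_db <= threshold_db)[0]
    if below.size == 0:
        return None
    f_lo, f_hi = freq_ghz[below[0]], freq_ghz[below[-1]]
    return 0.5 * (f_lo + f_hi), f_hi - f_lo

# Synthetic stand-in for a simulated S21 sweep of a bandstop filter
freq = np.linspace(1.0, 10.0, 1801)                       # GHz
s21 = -30.0 * np.exp(-((freq - 5.72) / 1.1) ** 2)         # notch centered near 5.72 GHz
center, bw = rejection_band(freq, s21)
print(f"center = {center:.2f} GHz, -10 dB bandwidth = {bw:.2f} GHz")
```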

  15. Frequency-domain analysis of resonant-type ring magnet power supplies

    International Nuclear Information System (INIS)

    Kim, J.M.S.; Reiniger, K.W.

    1993-01-01

    For fast-cycling synchrotrons, resonant-type ring magnet power supplies are commonly used to provide a dc-biased ac excitation for the ring magnets. Up to the present, this power supply system has been analyzed using a simplified analytical approximation, namely assuming that the resonant frequency of the ring magnet network is fixed and equal to the accelerator frequency. This paper presents a frequency-domain analysis technique for a more accurate analysis of resonant-type ring magnet power supplies. This approach identifies that, with the variation of the resonant frequency, the operating conditions of the power supply change quite dramatically because of the high Q value of the resonant network. The analytical results are verified using both experimental and simulation results

  16. Analysis of Power System Low Frequency Oscillation Based on Energy Shift Theory

    Science.gov (United States)

    Zhang, Junfeng; Zhang, Chunwang; Ma, Daqing

    2018-01-01

    In this paper, a new method for analyzing low-frequency oscillation between areas based on an energy coefficient is proposed. The concept of the energy coefficient is introduced by constructing an energy function, and the low-frequency oscillation is analyzed according to the energy coefficient under the current operating conditions; meanwhile, the concept of model energy is introduced to analyze the energy exchange behavior between two generators. Not only does this method provide an explanation of low-frequency oscillation from the energy point of view, but it also helps to further reveal the dynamic behavior of complex power systems. Case studies of a four-machine two-area system and of the Jilin Power Grid demonstrate the correctness and effectiveness of the proposed method for low-frequency oscillation analysis of power systems.

  17. Fast simulation of wind generation for frequency stability analysis in island power systems

    Energy Technology Data Exchange (ETDEWEB)

    Conroy, James [EirGrid, Dublin (Ireland)

    2010-07-01

    Frequency stability is a major issue for power system planning and operation in an island power system such as Ireland's. As increasing amounts of variable-speed wind generation are added to the system, this issue becomes more prominent, since variable-speed wind generation does not provide an inherent inertial response. This lack of an inertial response means that simplified models of variable-speed wind farms can be used for investigating frequency stability. EirGrid uses DIgSILENT PowerFactory (as well as other software tools) to investigate frequency stability. In PowerFactory, an automation program has been created to convert detailed wind farm representations (as necessary for other types of analysis) to negative-load models for frequency stability analysis. The advantage of this approach is much-improved simulation speed without loss of accuracy. This approach can also be used to study future wind energy targets and the long-term simulation of voltage stability. (orig.)

  18. Validation of virtual instrument for data analysis in metrology of time and frequency

    International Nuclear Information System (INIS)

    Jordao, Bruno; Quaresma, Daniel; Rocha, Pedro; Carvalho, Ricardo; Peixoto, Jose Guilherme

    2016-01-01

    Commercial software (CS) for collecting, analyzing, and plotting time and frequency data is being increasingly used in reference laboratories worldwide. This has greatly improved the results of the uncertainty calculations for these values. We propose a set of software for data collection and analysis using Virtual Instruments (VI), developed at the Primary Time and Frequency Laboratory of the National Observatory (ON), and the validation of this instrument. To validate the developed instrument, a comparative analysis was made between the results obtained with the VI and the results obtained with CS widely used in many metrology laboratories. From these results we can conclude that the analyzed data were equivalent. (author)

  19. Multi-scale Analysis of High Resolution Topography: Feature Extraction and Identification of Landscape Characteristic Scales

    Science.gov (United States)

    Passalacqua, P.; Sangireddy, H.; Stark, C. P.

    2015-12-01

    With the advent of digital terrain data, detailed information on terrain characteristics and on scale and location of geomorphic features is available over extended areas. Our ability to observe landscapes and quantify topographic patterns has greatly improved, including the estimation of fluxes of mass and energy across landscapes. Challenges still remain in the analysis of high resolution topography data; the presence of features such as roads, for example, challenges classic methods for feature extraction and large data volumes require computationally efficient extraction and analysis methods. Moreover, opportunities exist to define new robust metrics of landscape characterization for landscape comparison and model validation. In this presentation we cover recent research in multi-scale and objective analysis of high resolution topography data. We show how the analysis of the probability density function of topographic attributes such as slope, curvature, and topographic index contains useful information for feature localization and extraction. The analysis of how the distributions change across scales, quantified by the behavior of modal values and interquartile range, allows the identification of landscape characteristic scales, such as terrain roughness. The methods are introduced on synthetic signals in one and two dimensions and then applied to a variety of landscapes of different characteristics. Validation of the methods includes the analysis of modeled landscapes where the noise distribution is known and features of interest easily measured.

  20. Analysis of Natural Frequencies in the Universal Programs for Dynamic Processes Analysis

    Directory of Open Access Journals (Sweden)

    V. A. Trudonoshin

    2016-01-01

    Finding the natural frequencies of complex technical objects is an important design procedure. This type of analysis allows the resonant frequencies to be determined and, as a consequence, their adverse impact on the dynamics of the designed object, or of the object under study, to be avoided. This applies both to objects with distributed parameters and to objects with lumped parameters. For the first type of object, this analysis is available in almost every package that implements the finite element method. The situation is different for objects with lumped parameters. The methods used to form mathematical models of such objects are oriented toward implicit numerical integration of ordinary differential equations: the component equations of the reactive branches are discretized by numerical integration formulas, and the derivatives of the state variables disappear from the vector of unknowns of the mathematical model. In this case, the procedure of finding natural frequencies through eigenvalues cannot be applied directly. When the mathematical model of the object is given in the normal Cauchy form, obtaining the natural frequencies reduces to finding the eigenvalues of the coefficient matrix. There are methods of forming mathematical models in which the derivatives of the state variables form a sub-vector of the vector of unknowns; these are the generalized and advanced nodal methods, and an advanced nodal method for mechanical systems. One can attempt to reduce the mathematical models obtained by these methods to the normal Cauchy form. The article discusses such a procedure for the generalized and advanced nodal methods; the extended nodal method for mechanical systems has specifics that the article does not cover. For the model obtained by the generalized method, the vector of unknown variables is permuted so that a sub-vector of the derivatives of the state variables was in
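
    The normal Cauchy form route mentioned above can be illustrated with a small lumped-parameter example: once the model is written as x' = Ax, the natural frequencies follow from the eigenvalues of A. The two-mass, three-spring system below is an assumed example, not one taken from the article.

```python
import numpy as np

# Two-mass, three-spring chain as an illustrative lumped-parameter object.
m1, m2 = 1.0, 2.0                     # masses, kg
k1, k2, k3 = 400.0, 200.0, 300.0      # spring stiffnesses, N/m

# Normal Cauchy form x' = A x with state x = [q1, q2, v1, v2]
A = np.array([
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
    [-(k1 + k2) / m1, k2 / m1, 0.0, 0.0],
    [k2 / m2, -(k2 + k3) / m2, 0.0, 0.0],
])

eigvals = np.linalg.eigvals(A)
# For an undamped system the eigenvalues are purely imaginary, +/- i*omega.
natural_freqs_hz = np.unique(np.round(np.abs(eigvals.imag) / (2 * np.pi), 6))
print(natural_freqs_hz[natural_freqs_hz > 0])
```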

  1. Time-Frequency Data Reduction for Event Related Potentials: Combining Principal Component Analysis and Matching Pursuit

    Directory of Open Access Journals (Sweden)

    Selin Aviyente

    2010-01-01

    Joint time-frequency representations offer a rich representation of event related potentials (ERPs) that cannot be obtained through individual time or frequency domain analysis. This representation, however, comes at the expense of increased data volume and the difficulty of interpreting the resulting representations. Therefore, methods that can reduce the large amount of time-frequency data to experimentally relevant components are essential. In this paper, we present a method that reduces the large volume of ERP time-frequency data into a few significant time-frequency parameters. The proposed method is based on applying the widely used matching pursuit (MP) approach, with a Gabor dictionary, to principal components extracted from the time-frequency domain. The proposed PCA-Gabor decomposition is compared with other time-frequency data reduction methods such as the time-frequency PCA approach alone and standard matching pursuit methods using a Gabor dictionary for both simulated and biological data. The results show that the proposed PCA-Gabor approach performs better than either the PCA alone or the standard MP data reduction methods, by using the smallest amount of ERP data variance to produce the strongest statistical separation between experimental conditions.

  2. English word frequency and recognition in bilinguals: Inter-corpus comparison and error analysis.

    Science.gov (United States)

    Shi, Lu-Feng

    2015-01-01

    This study is the second of a two-part investigation on lexical effects on bilinguals' performance on a clinical English word recognition test. Focus is on word-frequency effects using counts provided by four corpora. Frequency of occurrence was obtained for 200 NU-6 words from the Hoosier mental lexicon (HML) and three contemporary corpora, American National Corpora, Hyperspace analogue to language (HAL), and SUBTLEX(US). Correlation analysis was performed between word frequency and error rate. Ten monolinguals and 30 bilinguals participated. Bilinguals were further grouped according to their age of English acquisition and length of schooling/working in English. Word frequency significantly affected word recognition in bilinguals who acquired English late and had limited schooling/working in English. When making errors, bilinguals tended to replace the target word with a word of a higher frequency. Overall, the newer corpora outperformed the HML in predicting error rate. Frequency counts provided by contemporary corpora predict bilinguals' recognition of English monosyllabic words. Word frequency also helps explain top replacement words for misrecognized targets. Word-frequency effects are especially prominent for bilinguals foreign born and educated.

  3. Complex Signal Kurtosis and Independent Component Analysis for Wideband Radio Frequency Interference Detection

    Science.gov (United States)

    Schoenwald, Adam; Mohammed, Priscilla; Bradley, Damon; Piepmeier, Jeffrey; Wong, Englin; Gholian, Armen

    2016-01-01

    Radio-frequency interference (RFI) has negatively affected scientific measurements across a wide variety of passive remote sensing satellites. This has been observed in the L-band radiometers SMOS, Aquarius and, more recently, SMAP [1, 2]. RFI has also been observed at higher frequencies such as K band [3]. Improvements in technology have allowed wider-bandwidth digital back ends for passive microwave radiometry. A complex signal kurtosis radio frequency interference detector was developed to help identify corrupted measurements [4]. This work explores the use of ICA (Independent Component Analysis) as a blind source separation technique to pre-process radiometric signals for use with the previously developed real and complex signal kurtosis detectors.
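
    A minimal sketch of the complex-signal-kurtosis idea is given below: for RFI-free, circularly symmetric complex Gaussian noise the ratio E[|z|^4]/E[|z|^2]^2 is 2, and departures from 2 flag interference. The block length, threshold, and test signal are illustrative assumptions, not the detector's operational settings.

```python
import numpy as np

def complex_kurtosis(z):
    """Kurtosis of a complex voltage time series: E[|z|^4] / E[|z|^2]^2.

    For RFI-free, circularly symmetric complex Gaussian noise this ratio is 2;
    constant-modulus (CW-like) interference pulls it below 2, pulsed RFI pushes it above.
    """
    p = np.abs(z) ** 2
    return np.mean(p ** 2) / np.mean(p) ** 2

def flag_rfi(z, block=4096, tol=0.15):
    """Flag blocks whose kurtosis deviates from 2 by more than `tol` (illustrative threshold)."""
    n_blocks = len(z) // block
    flags = np.zeros(n_blocks, dtype=bool)
    for i in range(n_blocks):
        k = complex_kurtosis(z[i * block:(i + 1) * block])
        flags[i] = abs(k - 2.0) > tol
    return flags

# Example: complex Gaussian noise with a CW interferer added to the second half
rng = np.random.default_rng(0)
n = 8 * 4096
noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
cw = np.exp(2j * np.pi * 0.01 * np.arange(n))
signal = noise.copy()
signal[n // 2:] += cw[n // 2:]
print(flag_rfi(signal))
```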

  4. Field Experience with Sweep Frequency Response Analysis for Power Transformer Diagnosis

    OpenAIRE

    Ambuj Kumar; Sunil Kumar Singh; Shrikant Singh

    2015-01-01

    Sweep frequency response analysis (SFRA) has turned out to be a powerful tool for investigating the mechanical as well as electrical integrity of transformers. In this paper various aspects of the practical application of SFRA have been studied. Open-circuit and short-circuit measurements were made on different phases of the high-voltage and low-voltage windings. A case study was presented for a 31.5 MVA transformer over various frequency ranges. A clear picture was presented for...

  5. HAWCStab2 with super element foundations: A new tool for frequency analysis of offshore wind turbines

    DEFF Research Database (Denmark)

    Henriksen, Lars Christian; Hansen, Anders Melchior; Kragh, Knud Abildgaard

    2013-01-01

    HAWCStab2 is a linear frequency domain aero-elastic tool, developed by DTU Wind Energy, suitable for frequency and stability analysis of horizontal axis 3-bladed wind turbines [1]. This tool has now been extended to also handle complex offshore foundation types, such as jacket structures and floating structures with mooring lines, using super elements calculated by the nonlinear time domain aero-elastic code HAWC2 [2,3].

  6. Flaw-size measurement in a weld samples by ultrasonic frequency analysis

    International Nuclear Information System (INIS)

    Adler, L.; Cook, K.V.; Whaley, H.L. Jr.; McClung, R.W.

    1975-01-01

    An ultrasonic frequency-analysis technique was developed and applied to characterize flaws in an 8-in. (203-mm) thick heavy-section steel weld specimen. The technique uses a multitransducer system. The spectrum of the received broad-band signal is frequency analyzed at two different receivers for each of the flaws. From the two spectra, the size and orientation of the flaw are determined by the use of an analytic model proposed earlier. (auth)

  7. High-Resolution Melt Analysis for Rapid Comparison of Bacterial Community Compositions

    DEFF Research Database (Denmark)

    Hjelmsø, Mathis Hjort; Hansen, Lars Hestbjerg; Bælum, Jacob

    2014-01-01

    In the study of bacterial community composition, 16S rRNA gene amplicon sequencing is today among the preferred methods of analysis. The cost of nucleotide sequence analysis, including requisite computational and bioinformatic steps, however, takes up a large part of many research budgets. High-resolution melt (HRM) analysis is the study of the melt behavior of specific PCR products. Here we describe a novel high-throughput approach in which we used HRM analysis targeting the 16S rRNA gene to rapidly screen multiple complex samples for differences in bacterial community composition. We hypothesized that HRM analysis of amplified 16S rRNA genes from a soil ecosystem could be used as a screening tool to identify changes in bacterial community structure. This hypothesis was tested using a soil microcosm setup exposed to a total of six treatments representing different combinations of pesticide

  8. Genotyping of Listeria monocytogenes isolates from poultry carcasses using high resolution melting (HRM) analysis.

    Science.gov (United States)

    Sakaridis, Ioannis; Ganopoulos, Ioannis; Madesis, Panagiotis; Tsaftaris, Athanasios; Argiriou, Anagnostis

    2014-01-02

    An outbreak situation of human listeriosis requires a fast and accurate protocol for typing Listeria monocytogenes. Existing techniques are either characterized by low discriminatory power or are laborious and require several days to give a final result. Polymerase chain reaction (PCR) coupled with high resolution melting (HRM) analysis was investigated in this study as an alternative tool for rapid and precise genotyping of L. monocytogenes isolates. Fifty-five L. monocytogenes isolates from poultry carcasses and the environment of four slaughterhouses were typed by HRM analysis using two specific markers, the internalin B and ssrA genes. The analysis of the genotype confidence percentage of L. monocytogenes isolates produced by HRM analysis generated dendrograms with two major groups and several subgroups. Furthermore, the analysis of the HRM curves revealed that all L. monocytogenes isolates could easily be distinguished. In conclusion, HRM was proven to be a fast and powerful tool for genotyping isolates of L. monocytogenes.

  9. High resolution gamma-ray spectroscopy applied to bulk sample analysis

    International Nuclear Information System (INIS)

    Kosanke, K.L.; Koch, C.D.; Wilson, R.D.

    1980-01-01

    A high resolution Ge(Li) gamma-ray spectrometer has been installed and made operational for use in routine bulk sample analysis by the Bendix Field Engineering Corporation (BFEC) geochemical analysis department. The Ge(Li) spectrometer provides bulk sample analyses for potassium, uranium, and thorium that are superior to those obtained by the BFEC sodium iodide spectrometer. The near term analysis scheme permits a direct assay for uranium that corrects for bulk sample self-absorption effects and is independent of the uranium/radium disequilibrium condition of the sample. A more complete analysis scheme has been developed that fully utilizes the gamma-ray data provided by the Ge(Li) spectrometer and that more properly accounts for the sample self-absorption effect. This new analysis scheme should be implemented on the BFEC Ge(Li) spectrometer at the earliest date

  10. High frequency analysis of cough sounds in pediatric patients with respiratory diseases.

    Science.gov (United States)

    Kosasih, K; Abeyratne, U R; Swarnkar, V

    2012-01-01

    Cough is a common symptom in a range of respiratory diseases and is considered a natural defense mechanism of the body. Despite its critical importance in the diagnosis of illness, there are no gold-standard methods to objectively assess cough. In a typical consultation session, a physician may briefly listen to the cough sounds using a stethoscope placed against the chest. The physician may also listen to spontaneous cough sounds via naked ears, as they naturally propagate through air. Cough sounds carry vital information on the state of the respiratory system, but the field of cough analysis in clinical medicine is in its infancy. All existing cough analysis approaches are severely handicapped by the limitations of the human hearing range and simplified analysis techniques. In this paper, we address these problems and explore the use of frequencies covering a range well beyond human perception (up to 90 kHz), using wavelet analysis to extract diagnostically important information from coughs. Our data set comes from a pediatric respiratory ward in Indonesia, from subjects diagnosed with asthma, pneumonia and rhinopharyngitis. We analyzed over 90 cough samples from 4 patients and explored whether high frequencies carried useful information in separating these disease groups. Multiple regression analysis resulted in coefficients of determination (R²) of 77-82% at high frequencies (15 kHz-90 kHz), indicating that they carry useful information. When the high frequencies were combined with frequencies below 15 kHz, the R² performance increased to 85-90%.

  11. Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis

    Science.gov (United States)

    Chen, Lu; Singh, Vijay P.

    2018-02-01

    Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as for water resources management. A multitude of distributions have been employed for frequency analysis of these extremes. However, no single distribution has been accepted as a global standard. Employing the entropy theory, this study derived five generalized distributions for frequency analysis that used different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A distribution (Hal-A), Halphen type B distribution (Hal-B) and Halphen type inverse B distribution (Hal-IB), among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012) and the Halphen family was first derived using entropy theory in this paper. The entropy theory allowed the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, which indicated that the five generalized distributions fitted the extreme rainfall data well. Among them, according to the Akaike information criterion (AIC) values, the GB2 and the Halphen family generally gave a better fit. Therefore, these generalized distributions are among the best choices for frequency analysis. The entropy-based derivation led to a new way of performing frequency analysis of hydrometeorological extremes.
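
    The sketch below fits one of the named distributions, the generalized gamma, to a synthetic extreme-rainfall sample and scores the fit with a quantile RMSE. Maximum likelihood is used here purely for convenience; the paper estimates parameters from entropy constraints, which is not reproduced.

```python
import numpy as np
from scipy import stats

# Synthetic annual-maximum hourly rainfall sample (mm) -- placeholder data
rainfall = stats.gengamma.rvs(a=2.0, c=1.3, scale=15.0, size=60, random_state=42)

# Fit the generalized gamma (GG) distribution by maximum likelihood with loc fixed at 0.
a, c, loc, scale = stats.gengamma.fit(rainfall, floc=0.0)
gg = stats.gengamma(a, c, loc=loc, scale=scale)

# Goodness of fit: RMSE between empirical and fitted quantiles
probs = (np.arange(1, rainfall.size + 1) - 0.5) / rainfall.size   # plotting positions
empirical = np.sort(rainfall)
fitted = gg.ppf(probs)
rmse = np.sqrt(np.mean((empirical - fitted) ** 2))
print(f"GG parameters: a={a:.2f}, c={c:.2f}, scale={scale:.2f}; quantile RMSE = {rmse:.2f} mm")
```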

  12. Characterization of Deficiencies in the Frequency Domain Forced Response Analysis Technique for Turbine Bladed Disks

    Science.gov (United States)

    Brown, Andrew M.; Schmauch, Preston

    2012-01-01

    Turbine blades in rocket and jet engine turbomachinery experience enormous harmonic loading conditions. These loads result from the integer number of upstream and downstream stator vanes as well as the other turbine stages. The standard technique for forced response analysis to assess structural integrity is to decompose a CFD generated flow field into its harmonic components, and to then perform a frequency response analysis at the problematic natural frequencies. Recent CFD analysis and water-flow testing at NASA/MSFC, though, indicates that this technique may miss substantial harmonic and non-harmonic excitation sources that become present in complex flows. These complications suggest the question of whether frequency domain analysis is capable of capturing the excitation content sufficiently. Two studies comparing frequency response analysis with transient response analysis, therefore, have been performed. The first is of a bladed disk with each blade modeled by simple beam elements. It was hypothesized that the randomness and other variation from the standard harmonic excitation would reduce the blade structural response, but the results showed little reduction. The second study was of a realistic model of a bladed-disk excited by the same CFD used in the J2X engine program. The results showed that the transient analysis results were up to 10% higher for "clean" nodal diameter excitations and six times larger for "messy" excitations, where substantial Fourier content around the main harmonic exists.

  13. Characterization of Deficiencies in the Frequency Domain Forced Response Analysis Technique for Supersonic Turbine Bladed Disks

    Science.gov (United States)

    Brown, Andrew M.; Schmauch, Preston

    2011-01-01

    Turbine blades in rocket and jet engine turbomachinery experience enormous harmonic loading conditions. These loads result from the integer number of upstream and downstream stator vanes as well as the other turbine stages. Assessing the blade structural integrity is a complex task requiring an initial characterization of whether resonance is possible and then performing a forced response analysis if that condition is met. The standard technique for forced response analysis in rocket engines is to decompose a CFD-generated flow field into its harmonic components, and to then perform a frequency response analysis at the problematic natural frequencies. Recent CFD analysis and water-flow testing at NASA/MSFC, though, indicates that this technique may miss substantial harmonic and non-harmonic excitation sources that become present in complex flows. A substantial effort has been made to account for this denser spatial Fourier content in frequency response analysis (described in another paper by the author), but the question still remains whether the frequency response analysis itself is capable of capturing the excitation content sufficiently. Two studies comparing frequency response analysis with transient response analysis, therefore, of bladed-disks undergoing this complex flow environment have been performed. The first is of a bladed disk with each blade modeled by simple beam elements. Six loading cases were generated by varying a baseline harmonic excitation in different ways based upon cold-flow testing from Heritage Fuel Air Turbine Test. It was hypothesized that the randomness and other variation from the standard harmonic excitation would reduce the blade structural response, but the results showed little reduction. The second study was of a realistic model of a bladed-disk excited by the same CFD used in the J2X engine program. It was hypothesized that enforcing periodicity in the CFD (inherent in the frequency response technique) would overestimate the

  14. Twitter data analysis: temporal and term frequency analysis with real-time event

    Science.gov (United States)

    Yadav, Garima; Joshi, Mansi; Sasikala, R.

    2017-11-01

    Over the past few years, the World Wide Web (WWW) has become a prominent and huge source of user-generated content and opinionated data. Among various social media, Twitter has gained popularity as it offers a fast and effective way of sharing users' perspectives on various critical and other issues in different domains. As data is generated in huge volumes in the cloud, it has opened doors for researchers in the field of data science and analysis. There are various domains, such as the 'Political', 'Entertainment', and 'Business' domains. Twitter also provides various APIs for developers: 1) the Search API, which focuses on old tweets; 2) the REST API, which focuses on user details and allows collection of the user profile, friends, and followers; and 3) the Streaming API, which collects details like tweets, hashtags, and geo-locations. In our work we access the Streaming API in order to fetch real-time tweets for a dynamically unfolding event. For this we focus on the 'Entertainment' domain, especially 'Sports', as IPL-T20 is currently the trending ongoing event. We collect these large numbers of tweets and store them in a MongoDB database, where the tweets are stored in JSON document format. On these documents we perform time-series analysis and term-frequency analysis using techniques such as filtering and information extraction for text mining, which fulfils our objective of finding interesting moments in the temporal data of the event and of ranking players or teams by popularity, helping people understand key influencers on the social media platform.

  15. A novel femtosecond-gated, high-resolution, frequency-shifted shearing interferometry technique for probing pre-plasma expansion in ultra-intense laser experiments

    Energy Technology Data Exchange (ETDEWEB)

    Feister, S., E-mail: feister.7@osu.edu; Orban, C. [Department of Physics, The Ohio State University, Columbus, Ohio 43210 (United States); Innovative Scientific Solutions, Inc., Dayton, Ohio 45459 (United States); Nees, J. A. [Innovative Scientific Solutions, Inc., Dayton, Ohio 45459 (United States); Center for Ultra-Fast Optical Science, University of Michigan, Ann Arbor, Michigan 48109 (United States); Morrison, J. T. [Fellow, National Research Council, Washington, D.C. 20001 (United States); Frische, K. D. [Innovative Scientific Solutions, Inc., Dayton, Ohio 45459 (United States); Chowdhury, E. A. [Department of Physics, The Ohio State University, Columbus, Ohio 43210 (United States); Intense Energy Solutions, LLC., Plain City, Ohio 43064 (United States); Roquemore, W. M. [Air Force Research Laboratory, Dayton, Ohio 45433 (United States)

    2014-11-15

    Ultra-intense laser-matter interaction experiments (>10{sup 18} W/cm{sup 2}) with dense targets are highly sensitive to the effect of laser “noise” (in the form of pre-pulses) preceding the main ultra-intense pulse. These system-dependent pre-pulses in the nanosecond and/or picosecond regimes are often intense enough to modify the target significantly by ionizing and forming a plasma layer in front of the target before the arrival of the main pulse. Time resolved interferometry offers a robust way to characterize the expanding plasma during this period. We have developed a novel pump-probe interferometry system for an ultra-intense laser experiment that uses two short-pulse amplifiers synchronized by one ultra-fast seed oscillator to achieve 40-fs time resolution over hundreds of nanoseconds, using a variable delay line and other techniques. The first of these amplifiers acts as the pump and delivers maximal energy to the interaction region. The second amplifier is frequency shifted and then frequency doubled to generate the femtosecond probe pulse. After passing through the laser-target interaction region, the probe pulse is split and recombined in a laterally sheared Michelson interferometer. Importantly, the frequency shift in the probe allows strong plasma self-emission at the second harmonic of the pump to be filtered out, allowing plasma expansion near the critical surface and elsewhere to be clearly visible in the interferograms. To aid in the reconstruction of phase dependent imagery from fringe shifts, three separate 120° phase-shifted (temporally sheared) interferograms are acquired for each probe delay. Three-phase reconstructions of the electron densities are then inferred by Abel inversion. This interferometric system delivers precise measurements of pre-plasma expansion that can identify the condition of the target at the moment that the ultra-intense pulse arrives. Such measurements are indispensable for correlating laser pre-pulse measurements

  16. If Frisch is true - impacts of varying beam width, resolution, frequency combinations and beam overlap when retrieving liquid water content profiles

    Science.gov (United States)

    Küchler, N.; Kneifel, S.; Kollias, P.; Loehnert, U.

    2017-12-01

    Cumulus and stratocumulus clouds strongly affect the Earth's radiation budget and are a major uncertainty source in weather and climate prediction models. To improve and evaluate models, a comprehensive understanding of cloud processes is necessary and references are needed. Active and passive microwave remote sensing of clouds can therefore be used to derive cloud properties such as liquid water path and liquid water content (LWC), which can serve as a reference for model evaluation. However, both the measurements and the assumptions made when retrieving physical quantities from the measurements involve uncertainty sources. Frisch et al. (1998) combined radar and radiometer observations to derive LWC profiles. Even if their assumptions are correct, there will still be uncertainties arising from the measurement setup. We investigate how varying beam width, temporal and vertical resolutions, frequency combinations, and beam overlap of and between the two instruments influence the retrieval of LWC profiles. In particular, we discuss the benefit of combining vertically highly resolved radar and radiometer measurements using the same antenna, i.e. having ideal beam overlap. Frisch, A. S., G. Feingold, C. W. Fairall, T. Uttal, and J. B. Snider, 1998: On cloud radar and microwave radiometer measurements of stratus cloud liquid water profiles. J. Geophys. Res. Atmos., 103(18), 23,195-23,197, doi:10.1029/98JD01827.
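
    A minimal sketch of the Frisch-style retrieval referenced above: the liquid water content profile is taken proportional to the square root of the radar reflectivity and scaled so that its vertical integral matches the radiometer liquid water path. The gate spacing, reflectivities, and LWP below are placeholder values.

```python
import numpy as np

def frisch_lwc_profile(z_linear, lwp_gm2, dz_m):
    """Distribute the radiometer liquid water path over the radar range gates.

    Following the Frisch et al. (1998) approach cited in the abstract,
    LWC is assumed proportional to sqrt(Z) within the cloud, and the profile
    is scaled so that its vertical integral equals the measured LWP.
    Inputs: linear reflectivity per gate, LWP in g/m^2, gate spacing in m.
    """
    w = np.sqrt(np.clip(z_linear, 0.0, None))        # per-gate weights ~ sqrt(Z)
    norm = np.sum(w) * dz_m
    if norm == 0.0:
        return np.zeros_like(w)
    return lwp_gm2 * w / norm                         # LWC in g/m^3 per gate

# Example: 10 in-cloud gates, 30 m spacing, reflectivities in dBZ
dbz = np.array([-35, -32, -30, -28, -27, -27, -28, -30, -33, -36], dtype=float)
z_lin = 10.0 ** (dbz / 10.0)
lwc = frisch_lwc_profile(z_lin, lwp_gm2=80.0, dz_m=30.0)
print(lwc.sum() * 30.0)   # ~80 g/m^2, i.e. the input LWP is recovered
```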

  17. Significance of Bias Correction in Drought Frequency and Scenario Analysis Based on Climate Models

    Science.gov (United States)

    Aryal, Y.; Zhu, J.

    2015-12-01

    Assessment of future drought characteristics is difficult as climate models usually have bias in simulating precipitation frequency and intensity. To overcome this limitation, output from climate models needs to be bias corrected based on the specific purpose of application. In this study, we examine the significance of bias correction in the context of drought frequency and scenario analysis using output from climate models. In particular, we investigate the performance of three widely used bias correction techniques: (1) monthly bias correction (MBC), (2) nested bias correction (NBC), and (3) equidistant quantile mapping (EQM). The effect of bias correction on the future scenario of drought frequency is also analyzed. The characteristics of drought are investigated in terms of frequency and severity at nine representative locations in different climatic regions across the United States using regional climate model (RCM) output from the North American Regional Climate Change Assessment Program (NARCCAP). The Standardized Precipitation Index (SPI) is used as the means to compare and forecast drought characteristics at different timescales. Systematic biases in the RCM precipitation output are corrected against the National Centers for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) data. The results demonstrate that bias correction significantly decreases the RCM errors in reproducing drought frequency derived from the NARR data. Preserving the mean and standard deviation is essential for climate models in drought frequency analysis. RCM biases have both regional and timescale dependence. Different timescales of input precipitation in the bias corrections show similar results. Drought frequency obtained from the RCM future (2040-2070) scenarios is compared with that from the historical simulations. The changes in drought characteristics occur in all climatic regions. The relative changes in drought frequency in the future scenario in relation to
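
    The sketch below illustrates one of the three techniques, equidistant quantile mapping, in a simplified empirical form: each future model value is shifted by the observed-minus-modeled historical difference at its own quantile. The quantile grid, non-negativity clipping, and synthetic gamma-distributed precipitation are assumptions for illustration only.

```python
import numpy as np

def equidistant_quantile_mapping(model_future, model_hist, obs_hist):
    """Simplified empirical equidistant quantile mapping (EQM) for precipitation.

    For each future value, find its quantile within the future model
    distribution and add the difference between the observed and modeled
    historical values at that same quantile.  A sketch of the EQM idea named
    in the abstract, using empirical quantiles only.
    """
    probs = np.linspace(0.01, 0.99, 99)
    q_mf = np.quantile(model_future, probs)
    q_mh = np.quantile(model_hist, probs)
    q_oh = np.quantile(obs_hist, probs)
    # quantile of each future value within the future model distribution
    p = np.interp(model_future, q_mf, probs, left=probs[0], right=probs[-1])
    correction = np.interp(p, probs, q_oh) - np.interp(p, probs, q_mh)
    return np.clip(model_future + correction, 0.0, None)   # keep precipitation non-negative

# Example with synthetic monthly precipitation (mm)
rng = np.random.default_rng(3)
obs_hist = rng.gamma(2.0, 40.0, 360)
model_hist = rng.gamma(2.0, 55.0, 360)          # wet-biased model
model_future = rng.gamma(2.0, 60.0, 360)
corrected = equidistant_quantile_mapping(model_future, model_hist, obs_hist)
print(model_future.mean(), corrected.mean())
```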

  18. Analysis on optical heterodyne frequency error of full-field heterodyne interferometer

    Science.gov (United States)

    Li, Yang; Zhang, Wenxi; Wu, Zhou; Lv, Xiaoyu; Kong, Xinxin; Guo, Xiaoli

    2017-06-01

    The full-field heterodyne interferometric measurement technique is becoming more widely applicable through the use of low-frequency heterodyne acousto-optic modulators instead of complex electro-mechanical scanning devices. The optical element surface can be acquired directly by synchronously detecting the received signal phase at each pixel, because standard matrix detectors such as CCD and CMOS cameras can be used in the heterodyne interferometer. Instead of the traditional four-step phase-shifting calculation, a Fourier spectral analysis method is used for phase extraction, which brings lower sensitivity to sources of uncertainty and higher measurement accuracy. In this paper, two types of full-field heterodyne interferometer are described and their advantages and disadvantages are specified. A heterodyne interferometer has to combine two beams of different frequency to produce interference, which introduces a variety of optical heterodyne frequency errors. Frequency mixing error and beat frequency error are two inescapable kinds of heterodyne frequency error. In this paper, the effects of frequency mixing error on surface measurement are derived, and the relationship between the phase extraction accuracy and the errors is calculated. The tolerance of the extinction ratio of the polarization splitting prism and the signal-to-noise ratio of stray light is given. The error of phase extraction by Fourier analysis caused by beat frequency shifting is derived and calculated. We also propose an improved phase extraction method based on spectrum correction. An amplitude-ratio spectrum correction algorithm using a Hanning window is used to correct the heterodyne signal phase extraction. The simulation results show that this method can effectively suppress the degradation of phase extraction caused by beat frequency error and reduce the measurement uncertainty of the full-field heterodyne interferometer.
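
    A minimal sketch of per-pixel phase extraction by Fourier analysis is shown below: the phase of each pixel is read from the spectral bin at the beat frequency of its windowed intensity record. The simple bin-picking and Hanning window stand in for the amplitude-ratio spectrum-correction algorithm proposed in the paper; the frame rate, beat frequency, and phase map are synthetic.

```python
import numpy as np

def extract_phase(frames, f_beat, fs):
    """Per-pixel phase of a heterodyne beat signal via Fourier analysis.

    `frames` has shape (n_frames, ny, nx): intensity samples of each pixel.
    The phase is read from the spectral bin nearest the beat frequency.
    A Hanning window reduces leakage; this simple bin-picking stands in for
    the spectrum-correction algorithm proposed in the paper.
    """
    n = frames.shape[0]
    window = np.hanning(n)[:, None, None]
    spectrum = np.fft.rfft(frames * window, axis=0)
    k = int(round(f_beat * n / fs))              # index of the beat-frequency bin
    return np.angle(spectrum[k])                 # phase map, shape (ny, nx)

# Example: synthetic 16x16-pixel interferogram stack with a tilted phase map
fs, f_beat, n = 1024.0, 64.0, 256
t = np.arange(n) / fs
yy, xx = np.mgrid[0:16, 0:16]
true_phase = 0.2 * xx + 0.1 * yy
frames = 1.0 + 0.5 * np.cos(2 * np.pi * f_beat * t[:, None, None] + true_phase)
phase_map = extract_phase(frames, f_beat, fs)
print(np.allclose(phase_map[0], true_phase[0], atol=0.05))
```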

  19. Detonation mode and frequency analysis under high loss conditions for stoichiometric propane-oxygen

    KAUST Repository

    Jackson, Scott

    2016-03-24

    The propagation characteristics of galloping detonations were quantified with a high-time-resolution velocity diagnostic. Combustion waves were initiated in 30-m lengths of 4.1-mm inner diameter transparent tubing filled with stoichiometric propane-oxygen mixtures. Chemiluminescence from the resulting waves was imaged to determine the luminous wave front position and velocity every 83.3 μs. As the mixture initial pressure was decreased from 20 to 7 kPa, the wave was observed to become increasingly unsteady and transition from steady detonation to a galloping detonation. While wave velocities averaged over the full tube length smoothly decreased with initial pressure down to half of the Chapman-Jouguet detonation velocity (DCJ) at the quenching limit, the actual propagation mechanism was seen to be a galloping wave with a cycle period of approximately 1.0 ms, corresponding to a cycle length of 1.3-2.0 m or 317-488 tube diameters depending on the average wave speed. The long test section length of 7300 tube diameters allowed observation of up to 20 galloping cycles, allowing for statistical analysis of the wave dynamics. In the galloping regime, a bimodal velocity distribution was observed with peaks centered near 0.4 DCJ and 0.95 DCJ. Decreasing initial pressure increasingly favored the low velocity mode. Galloping frequencies ranged from 0.8 to 1.0 kHz and were insensitive to initial mixture pressure. Wave deflagration-to-detonation transition and detonation failure trajectories were found to be repeatable in a given test and also across different initial mixture pressures. The temporal duration of wave dwell at the low and high velocity modes during galloping was also quantified. It was found that the mean wave dwell duration in the low velocity mode was a weak function of initial mixture pressure, while the mean dwell time in the high velocity mode depended exponentially on initial mixture pressure. Analysis of the velocity histories using dynamical systems ideas

  20. Real time analysis of electromagnetic radiation in a very wide frequency range

    International Nuclear Information System (INIS)

    Peralta, J.A.; Reyes L, P.; Yepez, E.

    2001-01-01

    In this work, we present an electronic apparatus that facilitates the monitoring and analysis of electromagnetic radiation in a very wide frequency range. The device is a combination of real and virtual instruments, taking advantage of new hardware and software; the measurable range of frequencies depends on the speed of an analog/digital converter, reaching tens of megahertz. The device has been successfully used to monitor the environmental electromagnetic radiation at very low frequency, a very useful parameter in the research of electromagnetic precursors of earthquakes. The apparatus is a new configuration and has advantages with respect to those previously used: when the attached computer is fast, Fourier analysis can be done in real time; several bands can be displayed simultaneously; the digitized data allow a variety of analysis methods; and the apparatus is very cheap. (Author)

  1. Resonance Analysis of High-Frequency Electrohydraulic Exciter Controlled by 2D Valve

    Directory of Open Access Journals (Sweden)

    Guojun Pan

    2015-01-01

    The resonant characteristics of hydraulic systems have not yet been described because they are necessarily restricted by the linear assumptions of classical fluid theory. A resonance analysis approach is presented for an electrohydraulic exciter controlled by a 2D valve. The block diagram of this excitation system is established by extracting the nonlinear parts from the traditional linearized analysis; as a result the resonant frequency is obtained. By equating the input energy from the oil source to the energy returned to the oil source, the load pressure and load flow are solved analytically as the working frequency reaches the natural frequency. The analytical expression of the resonant peak is also derived for the undamped case. Finally, an experimental system is built to verify the theoretical analysis. This initial research on resonant characteristics lays a theoretical foundation and provides a useful complement to classical fluid theory for resonance phenomena in hydraulic systems.

  2. Real time analysis of electromagnetic radiation in a very wide frequency range

    Energy Technology Data Exchange (ETDEWEB)

    Peralta, J.A.; Reyes L, P.; Yepez, E. [Escuela Superior de Fisica y Matematicas, Instituto Politecnico Nacional, Edificio 9, U.P. Adolfo Lopez Mateos, Zacatenco, 07738 Mexico D.F. (Mexico)

    2001-07-01

    In this work, we present an electronic apparatus that facilitates the monitoring and analysis of electromagnetic radiation in a very wide frequency range. The device is a combination of real and virtual instruments, taking advantage of new hardware and software; the measurable range of frequencies depends on the speed of an analog/digital converter, reaching tens of megahertz. The device has been successfully used to monitor the environmental electromagnetic radiation at very low frequency, a very useful parameter in the research of electromagnetic precursors of earthquakes. The apparatus is a new configuration and has advantages with respect to those previously used: when the attached computer is fast, Fourier analysis can be done in real time; several bands can be displayed simultaneously; the digitized data allow a variety of analysis methods; and the apparatus is very cheap. (Author)

  3. The DECMU: a digital device for delayed analysis of multi-frequency eddy current signals

    International Nuclear Information System (INIS)

    Pigeon, Michel.

    1981-08-01

    A delayed data analysis system has been realized by the CEA and Intercontrole for in-service inspection of steam generators of nuclear plants by multifrequency eddy current testing. This device allows, away from the plant, adjustment during probe switching, graph recording, and analysis for defect signal qualification. The equipment contains an analog mixing device, like the IC3FA multi-frequency apparatus, but has in addition a memory allowing data cycling and signal isolation for adjustment or analysis

  4. Mode shape and natural frequency identification for seismic analysis from background vibration

    International Nuclear Information System (INIS)

    Bhan, S.; Wozniak, Z.

    1986-02-01

    The feasibility of calculating natural frequencies and mode shapes of major equipment in a CANDU reactor from measurements of their response to background excitation has been studied. A review of vibration data measured at various locations in CANDU plants shows that structures responded to a combination of random and harmonic background excitation. The amplitude of the measured vibration is sufficient to allow meaningful data analysis. Frequency content in the 0 to 50-Hz range, which is of interest for earthquake response, is present in some of the vibration measurements studied. Spectral techniques have been developed for determining the response function of structures from measured vibration response to background excitation. The natural frequencies and mode shapes are then evaluated graphically from the frequency function plots. The methodology has been tested on a simple cantilever beam with known natural frequencies and mode shapes. The comparison between the theoretical and the computed natural frequencies and mode shapes is good for the lower modes. However, better curve-fitting techniques will be required in future, especially for higher modes. Readily available commercial equipment necessary for the measurement of background vibration in a CANDU plant has been identified. An experimental program has been proposed to verify the methodology developed in this study. Recommendations are also made to study methods to improve the accuracy of the mode shape and natural frequency prediction
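
    As a rough stand-in for the spectral techniques described above, the sketch below picks candidate natural frequencies in the 0-50 Hz band from the Welch power spectral density of an ambient-vibration record. In practice the response function and mode shapes would be derived from cross-spectra between several measurement points; the sampling rate, peak-prominence threshold, and synthetic record are assumptions.

```python
import numpy as np
from scipy.signal import welch, find_peaks

def natural_freqs_from_ambient(accel, fs, fmax=50.0, prominence=5.0):
    """Estimate candidate natural frequencies (0-50 Hz) from an ambient-vibration record.

    Peaks of the Welch PSD are taken as candidate modes; the prominence
    threshold (in dB) is an illustrative assumption.
    """
    f, pxx = welch(accel, fs=fs, nperseg=4096)
    keep = f <= fmax
    f, pxx = f[keep], pxx[keep]
    psd_db = 10 * np.log10(pxx)
    peaks, _ = find_peaks(psd_db, prominence=prominence)
    return f[peaks]

# Example: synthetic response of two lightly damped modes to random excitation
fs = 200.0
t = np.arange(0, 120, 1 / fs)
rng = np.random.default_rng(7)
accel = (np.sin(2 * np.pi * 6.3 * t) + 0.6 * np.sin(2 * np.pi * 17.8 * t)
         + 0.8 * rng.standard_normal(t.size))
print(natural_freqs_from_ambient(accel, fs))
```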

  5. Analysis of Chinese women with primary ovarian insufficiency by high resolution array-comparative genomic hybridization.

    Science.gov (United States)

    Liao, Can; Fu, Fang; Yang, Xin; Sun, Yi-Min; Li, Dong-Zhi

    2011-06-01

    Primary ovarian insufficiency (POI) is defined as a primary ovarian defect characterized by absent menarche (primary amenorrhea) or premature depletion of ovarian follicles before the age of 40 years. The etiology of primary ovarian insufficiency in human female patients is still unclear. The purpose of this study is to investigate the potential genetic causes in primary amenorrhea patients by high resolution array-based comparative genomic hybridization (array-CGH) analysis. Following standard karyotyping analysis, genomic DNA from whole blood of 15 primary amenorrhea patients and 15 normal control women was hybridized with Affymetrix cytogenetic 2.7M arrays following the standard protocol. Copy number variations identified by array-CGH were confirmed by real-time polymerase chain reaction. All 30 samples were negative by conventional karyotyping analysis. Microdeletions on chromosome 17q21.31-q21.32 spanning approximately 1.3 Mb were identified in four patients by high resolution array-CGH analysis. This region included the female reproductive secretory pathway related factor N-ethylmaleimide-sensitive factor (NSF) gene. The results of the present study suggest that there may be critical regions regulating primary ovarian insufficiency in women with a 17q21.31-q21.32 microdeletion. This effect might be due to the loss of function of the NSF gene/genes within the deleted region or to effects on contiguous genes.

  6. Interactive desktop analysis of high resolution simulations: application to turbulent plume dynamics and current sheet formation

    International Nuclear Information System (INIS)

    Clyne, John; Mininni, Pablo; Norton, Alan; Rast, Mark

    2007-01-01

    The ever increasing processing capabilities of the supercomputers available to computational scientists today, combined with the need for higher and higher resolution computational grids, have resulted in deluges of simulation data. Yet the computational resources and tools required to make sense of these vast numerical outputs through subsequent analysis are often far from adequate, making such analysis of the data a painstaking, if not a hopeless, task. In this paper, we describe a new tool for the scientific investigation of massive computational datasets. This tool (VAPOR) employs data reduction, advanced visualization, and quantitative analysis operations to permit the interactive exploration of vast datasets using only a desktop PC equipped with a commodity graphics card. We describe VAPOR's use in the study of two problems. The first, motivated by stellar envelope convection, investigates the hydrodynamic stability of compressible thermal starting plumes as they descend through a stratified layer of increasing density with depth. The second looks at current sheet formation in an incompressible helical magnetohydrodynamic flow to understand the early spontaneous development of quasi two-dimensional (2D) structures embedded within the 3D solution. Both of the problems were studied at sufficiently high spatial resolution, a grid of 504² by 2048 points for the first and 1536³ points for the second, to overwhelm the interactive capabilities of typically available analysis resources

  7. The determination of the frequency response function of the RSG-GAS by Laplace transform analysis

    International Nuclear Information System (INIS)

    Tukiran, S.; Surian, P.; Jujuratisbela, U.

    1997-01-01

    The response of the RSG-GAS reactor system to reactivity perturbations needs to be analysed because of its bearing on the reliability and safety of reactor operation. The response depends on the power frequency response function H(s), while H(s) depends on the zero-power frequency response function Z(s) and the dynamic power coefficient of reactivity Kp(s). Determination of the frequency response function of the RSG-GAS reactor was done by the Fourier transform analysis method. Z(s) was obtained by transforming P(t) and Cj(t) into P(s) and Cj(s) in the point kinetics equations. A second-order Simpson rule was used for the numerical integration. The LTMPR (Laplace Transform for Multipurpose Reactor) code was then written in FORTRAN 77 on a VAX 8550 system. The LTMPR code is able to determine the frequency response function and the period-reactivity relation of the RSG-GAS reactor by the rod drop method. The power profile during rod drop at zero power (without reactivity feedback) was used to determine the frequency response of the RSG-GAS reactor. The results of the calculations are in good agreement with the experimental results, so the LTMPR code can be used for frequency response analysis of the RSG-GAS reactor.
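
    A worked sketch of the quantity discussed above may help. This is not the LTMPR code; it simply evaluates the zero-power frequency response that follows from the linearized point-kinetics equations, Z(s) = 1 / ( s ( Λ + Σ_j β_j/(s + λ_j) ) ) with s = jω, using illustrative six-group delayed-neutron constants that are assumptions, not RSG-GAS data.

    # A minimal sketch (not the LTMPR code) of the zero-power frequency response
    # function from linearized point kinetics. The constants below are illustrative.
    import numpy as np

    Lambda = 5.0e-5                                        # prompt generation time [s]
    beta = np.array([2.1e-4, 1.4e-3, 1.3e-3, 2.6e-3, 7.5e-4, 2.7e-4])
    lam = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])   # decay constants [1/s]

    def zero_power_frf(freq_hz):
        """Zero-power frequency response Z(j*omega) per unit reactivity."""
        s = 1j * 2 * np.pi * np.asarray(freq_hz, dtype=float)
        return 1.0 / (s * (Lambda + np.sum(beta[:, None] / (s + lam[:, None]), axis=0)))

    freqs = np.logspace(-3, 3, 7)          # 1 mHz ... 1 kHz
    for f, z in zip(freqs, zero_power_frf(freqs)):
        print(f"{f:8.3f} Hz  |Z| = {abs(z):10.3e}  phase = {np.degrees(np.angle(z)):7.2f} deg")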

  8. A study on thermal characteristics analysis model of high frequency switching transformer

    Science.gov (United States)

    Yoo, Jin-Hyung; Jung, Tae-Uk

    2015-05-01

    Recently, interest has been shown in research on the module-integrated converter (MIC) in small-scale photovoltaic (PV) generation. In an MIC, the voltage boosting high frequency transformer should be designed to be compact in size and have high efficiency. In response to the need to satisfy these requirements, this paper presents a coupled electromagnetic analysis model of a transformer connected with a high frequency switching DC-DC converter circuit while considering thermal characteristics due to the copper and core losses. A design optimization procedure for high efficiency is also presented using this design analysis method, and it is verified by the experimental result.

  9. Frequency and time domain analysis of an external cavity laser with strong filtered optical feedback

    DEFF Research Database (Denmark)

    Detoma, Enrico; Tromborg, Bjarne; Montrosset, Ivo

    The stability properties of an external cavity laser with strong grating-filtered optical feedback to an anti-reflection coated facet are studied with a general frequency domain model. The model takes into account non-linear effects like four-wave mixing and gain compression. A small-signal analysis in the frequency domain allows a calculation of the range of operation without mode hopping around the grating reflectivity peak. This region should be as large as possible for proper operation of the tunable laser source. The analysis shows this stabilizing effect of mode coupling and gain...

  10. Review of single particle dynamics for third generation light sources through frequency map analysis

    Directory of Open Access Journals (Sweden)

    L. Nadolski

    2003-11-01

    Full Text Available Frequency map analysis [J. Laskar, Icarus 88, 266 (1990)] is used here to analyze the transverse dynamics of four third generation synchrotron light sources: the ALS, the ESRF, the SOLEIL project, and Super-ACO. Time variations of the betatron tunes give additional information on the global dynamics of the beam. The main resonances are revealed; a one-to-one correspondence between the configuration space and the frequency space can be performed. We stress that the frequency maps, and therefore the dynamics optimization, are highly sensitive to sextupolar strengths and vary substantially from one machine to another. The frequency maps can thus be used to characterize the different machines.

  11. Time-frequency analysis of the restricted three-body problem: transport and resonance transitions

    International Nuclear Information System (INIS)

    Vela-Arevalo, Luz V; Marsden, Jerrold E

    2004-01-01

    A method of time-frequency analysis based on wavelets is applied to the problem of transport between different regions of the solar system, using the model of the circular restricted three-body problem in both the planar and the spatial versions of the problem. The method is based on the extraction of instantaneous frequencies from the wavelet transform of numerical solutions. Time-varying frequencies provide a good diagnostic tool to discern chaotic trajectories from regular ones, and we can identify resonance islands that greatly affect the dynamics. Good accuracy in the calculation of time-varying frequencies allows us to determine resonance trappings of chaotic trajectories and resonance transitions. We show the relation between resonance transitions and transport in different regions of the phase space
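
    The core operation described above, extracting a time-varying frequency as the ridge of a wavelet transform, can be sketched compactly. The example below is an assumption-laden illustration (plain NumPy, a hand-rolled complex Morlet wavelet built in the frequency domain, a synthetic drifting-frequency signal) rather than the authors' implementation.

    # A minimal sketch: ridge of a Morlet continuous wavelet transform as an
    # estimate of instantaneous frequency. All parameters are illustrative.
    import numpy as np

    fs = 100.0
    t = np.arange(0, 20, 1 / fs)
    f_true = 2.0 + 0.2 * t                                 # frequency drifting from 2 Hz to 6 Hz
    x = np.cos(2 * np.pi * np.cumsum(f_true) / fs)

    def morlet_cwt(x, fs, freqs, w0=6.0):
        """CWT with an analytic Morlet wavelet, evaluated via the FFT."""
        n = x.size
        X = np.fft.fft(x)
        omega = 2 * np.pi * np.fft.fftfreq(n, d=1 / fs)    # angular frequency of each FFT bin
        out = np.empty((freqs.size, n), dtype=complex)
        for k, f in enumerate(freqs):
            s = w0 / (2 * np.pi * f)                       # scale whose centre frequency is ~f
            psi_hat = np.sqrt(s) * np.pi ** -0.25 * np.exp(-0.5 * (s * omega - w0) ** 2) * (omega > 0)
            out[k] = np.fft.ifft(X * psi_hat)
        return out

    freqs = np.linspace(1.0, 8.0, 120)
    W = np.abs(morlet_cwt(x, fs, freqs))
    inst_freq = freqs[np.argmax(W, axis=0)]                # ridge = instantaneous frequency

    interior = slice(200, 1800)                            # ignore edge effects
    err = np.max(np.abs(inst_freq[interior] - f_true[interior]))
    print(f"max ridge error in the interior: {err:.2f} Hz")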

  12. Frequency spectrum analysis of 252Cf neutron source based on LabVIEW

    International Nuclear Information System (INIS)

    Mi Deling; Li Pengcheng

    2011-01-01

    The frequency spectrum analysis of a 252Cf neutron source is an extremely important method in nuclear stochastic signal processing. Focusing on the special '0' and '1' structure of the neutron pulse series, this paper proposes a fast-correlation algorithm to improve the computational rate of the spectrum analysis system. Multi-core processor technology is employed, together with the multi-threaded programming techniques of LabVIEW, to construct a frequency spectrum analysis system for the 252Cf neutron source based on LabVIEW. It obtains not only the auto-correlation and cross-correlation results, but also the auto-power spectrum, cross-power spectrum and ratio of spectral densities. The results show that analysis tools based on LabVIEW improve the operating efficiency of the fast auto-correlation and cross-correlation code by about 25% to 35%, and also verify the feasibility of using LabVIEW for spectrum analysis. (authors)
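
    The kind of processing described above can be sketched outside LabVIEW. The example below assumes NumPy/SciPy and a synthetic Poisson-like '0'/'1' pulse series; it shows a fast FFT-based auto-correlation (Wiener-Khinchin) plus Welch auto- and cross-power spectra, not the authors' actual system.

    # A minimal sketch of fast correlation and power-spectrum estimation for a
    # binary pulse series. The count rate and record length are assumptions.
    import numpy as np
    from scipy import signal

    fs = 1.0e6                       # sampling rate of the pulse series [Hz]
    n = 2 ** 20
    rate = 2.0e4                     # mean count rate [counts/s]
    rng = np.random.default_rng(1)
    x = (rng.random(n) < rate / fs).astype(float)     # '0'/'1' pulse series, channel 1
    y = (rng.random(n) < rate / fs).astype(float)     # second, independent channel

    # Fast auto-correlation via FFT (zero-padded to avoid circular wrap-around).
    x0 = x - x.mean()
    X = np.fft.rfft(x0, 2 * n)
    acf = np.fft.irfft(np.abs(X) ** 2)[:n] / n

    # Auto-power and cross-power spectral densities with Welch averaging.
    f, Pxx = signal.welch(x0, fs=fs, nperseg=8192)
    _, Pxy = signal.csd(x0, y - y.mean(), fs=fs, nperseg=8192)

    print("acf[0] (variance estimate):", round(float(acf[0]), 5))
    print("auto-power plateau:", float(Pxx[len(Pxx) // 2]))
    print("cross-power magnitude at low frequency:", float(abs(Pxy[1])))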

  13. Dipeptide frequency/bias analysis identifies conserved sites of nonrandomness shared by cysteine-rich motifs.

    Science.gov (United States)

    Campion, S R; Ameen, A S; Lai, L; King, J M; Munzenmaier, T N

    2001-08-15

    This report describes the application of a simple computational tool, AAPAIR.TAB, for the systematic analysis of the cysteine-rich EGF, Sushi, and Laminin motif/sequence families at the two-amino acid level. Automated dipeptide frequency/bias analysis detects preferences in the distribution of amino acids in established protein families, by determining which "ordered dipeptides" occur most frequently in comprehensive motif-specific sequence data sets. Graphic display of the dipeptide frequency/bias data revealed family-specific preferences for certain dipeptides, but more importantly detected a shared preference for employment of the ordered dipeptides Gly-Tyr (GY) and Gly-Phe (GF) in all three protein families. The dipeptide Asn-Gly (NG) also exhibited high-frequency and bias in the EGF and Sushi motif families, whereas Asn-Thr (NT) was distinguished in the Laminin family. Evaluation of the distribution of dipeptides identified by frequency/bias analysis subsequently revealed the highly restricted localization of the G(F/Y) and N(G/T) sequence elements at two separate sites of extreme conservation in the consensus sequence of all three sequence families. The similar employment of the high-frequency/bias dipeptides in three distinct protein sequence families was further correlated with the concurrence of these shared molecular determinants at similar positions within the distinctive scaffolds of three structurally divergent, but similarly employed, motif modules.
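
    The counting underlying such an analysis is simple to reproduce. The sketch below is not the AAPAIR.TAB tool; it uses made-up placeholder sequences and plain Python to tally every ordered two-amino-acid pair and compare the observed dipeptide frequency with the frequency expected from the single-residue composition (the "bias").

    # A minimal sketch of dipeptide frequency/bias analysis on hypothetical sequences.
    from collections import Counter
    from itertools import product

    sequences = [
        "CGYPNGNKCT",      # placeholder motif sequences, for illustration only
        "CNTGFACSGY",
        "CGFQNGYTCR",
    ]

    aa_counts = Counter()
    dipep_counts = Counter()
    total_aa = 0
    total_dipep = 0
    for seq in sequences:
        aa_counts.update(seq)
        total_aa += len(seq)
        pairs = [seq[i:i + 2] for i in range(len(seq) - 1)]
        dipep_counts.update(pairs)
        total_dipep += len(pairs)

    # Bias = observed dipeptide frequency / frequency expected from residue composition.
    bias = {}
    for a, b in product(aa_counts, repeat=2):
        expected = (aa_counts[a] / total_aa) * (aa_counts[b] / total_aa)
        observed = dipep_counts[a + b] / total_dipep
        if observed > 0:
            bias[a + b] = observed / expected

    for dipep, value in sorted(bias.items(), key=lambda kv: kv[1], reverse=True)[:5]:
        print(dipep, round(value, 2))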

  14. Effects of meal frequency on weight loss and body composition: a meta-analysis.

    Science.gov (United States)

    Schoenfeld, Brad Jon; Aragon, Alan Albert; Krieger, James W

    2015-02-01

    It has been hypothesized that eating small, frequent meals enhances fat loss and helps to achieve better weight maintenance. Several observational studies lend support to this hypothesis, with an inverse relationship noted between the frequency of eating and adiposity. The purpose of this narrative review is to present and discuss a meta-analysis with regression that evaluated experimental research on meal frequency with respect to changes in fat mass and lean mass. A total of 15 studies were identified that investigated meal frequency in accordance with the criteria outlined. Feeding frequency was positively associated with reductions in fat mass and body fat percentage as well as an increase in fat-free mass. However, sensitivity analysis of the data showed that the positive findings were the product of a single study, casting doubt as to whether more frequent meals confer beneficial effects on body composition. In conclusion, although the initial results of this meta-analysis suggest a potential benefit of increased feeding frequencies for enhancing body composition, these findings need to be interpreted with circumspection. © The Author(s) 2015. Published by Oxford University Press on behalf of the International Life Sciences Institute. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  15. Asymmetric resonance frequency analysis of in-plane electrothermal silicon cantilevers for nanoparticle sensors

    Science.gov (United States)

    Bertke, Maik; Hamdana, Gerry; Wu, Wenze; Marks, Markus; Suryo Wasisto, Hutomo; Peiner, Erwin

    2016-10-01

    The asymmetric resonance frequency analysis of silicon cantilevers for a low-cost wearable airborne nanoparticle detector (Cantor) is described in this paper. The cantilevers, which are operated in the fundamental in-plane resonance mode, are used as a mass-sensitive microbalance. They are manufactured out of bulk silicon, containing a full piezoresistive Wheatstone bridge and an integrated thermal heater for reading the measurement output signal and stimulating the in-plane excitation, respectively. To optimize the sensor performance, cantilevers with different geometries are designed, fabricated and characterized. Besides the resonance frequency, the quality factor (Q) of the resonance curve has a strong influence on the sensor sensitivity. Because of the asymmetric resonance behaviour, a novel fitting function and method to extract Q is created, different from that of the simple harmonic oscillator (SHO). For testing the sensor in a long-term frequency analysis, a phase-locked loop (PLL) circuit is employed, yielding a frequency stability of up to 0.753 Hz at an Allan variance of 3.77 × 10⁻⁶. This proposed asymmetric resonance frequency analysis method is expected to be further used in the process development of the next-generation Cantor.

  16. Asymmetric resonance frequency analysis of in-plane electrothermal silicon cantilevers for nanoparticle sensors

    International Nuclear Information System (INIS)

    Bertke, Maik; Hamdana, Gerry; Wu, Wenze; Marks, Markus; Wasisto, Hutomo Suryo; Peiner, Erwin

    2016-01-01

    The asymmetric resonance frequency analysis of silicon cantilevers for a low-cost wearable airborne nanoparticle detector (Cantor) is described in this paper. The cantilevers, which are operated in the fundamental in-plane resonance mode, are used as a mass-sensitive microbalance. They are manufactured out of bulk silicon, containing a full piezoresistive Wheatstone bridge and an integrated thermal heater for reading the measurement output signal and stimulating the in-plane excitation, respectively. To optimize the sensor performance, cantilevers with different geometries are designed, fabricated and characterized. Besides the resonance frequency, the quality factor (Q) of the resonance curve has a strong influence on the sensor sensitivity. Because of the asymmetric resonance behaviour, a novel fitting function and method to extract Q is created, different from that of the simple harmonic oscillator (SHO). For testing the sensor in a long-term frequency analysis, a phase-locked loop (PLL) circuit is employed, yielding a frequency stability of up to 0.753 Hz at an Allan variance of 3.77 × 10⁻⁶. This proposed asymmetric resonance frequency analysis method is expected to be further used in the process development of the next-generation Cantor. (paper)
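
    The abstracts above do not give the authors' fitting function, so the sketch below substitutes a generic Fano-type asymmetric lineshape purely for illustration: it fits a noisy synthetic resonance curve with scipy and reads off the resonance frequency and quality factor Q. All numbers are assumptions.

    # A minimal sketch of extracting f0 and Q from an asymmetric resonance curve.
    import numpy as np
    from scipy.optimize import curve_fit

    def fano(f, f0, Q, q, a, c):
        """Fano-type asymmetric resonance: amplitude a, asymmetry q, offset c."""
        eps = 2 * Q * (f - f0) / f0          # reduced detuning
        return a * (q + eps) ** 2 / (1 + eps ** 2) + c

    rng = np.random.default_rng(2)
    f = np.linspace(199e3, 201e3, 400)       # sweep around a 200 kHz in-plane mode
    true = dict(f0=200e3, Q=1200.0, q=2.5, a=1.0, c=0.1)
    y = fano(f, **true) + 0.05 * rng.standard_normal(f.size)

    p0 = [200.2e3, 800.0, 1.0, 0.5, 0.0]     # rough initial guesses
    popt, _ = curve_fit(fano, f, y, p0=p0)
    f0_fit, Q_fit = popt[0], popt[1]
    print(f"f0 = {f0_fit:.1f} Hz, Q = {Q_fit:.0f} (true Q = {true['Q']:.0f})")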

  17. Frequency scanning-based stability analysis method for grid-connected inverter system

    DEFF Research Database (Denmark)

    Wang, Yanbo; Wang, Xiongfei; Blaabjerg, Frede

    2017-01-01

    This paper proposes a frequency scanning-based impedance analysis for stability assessment of grid-connected inverter systems, which is able to perform stability assessment without using system mathematical models and inherits the superior feature of the impedance-based stability criterion with consideration of the inverter nonlinearities. A small current disturbance is injected into the grid-connected inverter system in a particular frequency range, the impedance is computed from the harmonic-frequency response using Fourier analysis, and the stability is then predicted on the basis of the impedance stability criterion. The stability issues of grid-connected inverters with grid-current feedback and converter-current feedback are addressed using the proposed method. The results obtained from simulation and experiments validate the effectiveness of the method.
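
    The frequency-scanning idea itself is easy to demonstrate on a toy system. The sketch below assumes NumPy and an analytically known R-L branch standing in for the real inverter/grid interface: a small sinusoidal current is "injected" at each scan frequency, the voltage response is recorded, and the impedance is recovered from the Fourier components at that frequency.

    # A minimal sketch of frequency-scanning impedance identification. R, L, fs and
    # the scan frequencies are illustrative assumptions.
    import numpy as np

    R, L = 0.5, 2.0e-3            # the "unknown" impedance to be identified
    fs = 20e3                     # sampling rate [Hz]
    T = 0.2                       # record length per scan point [s]
    t = np.arange(0, T, 1 / fs)

    def measure_impedance(f_inj, i_amp=0.1):
        w = 2 * np.pi * f_inj
        i = i_amp * np.sin(w * t)                          # injected current
        v = R * i + L * i_amp * w * np.cos(w * t)          # steady-state R-L voltage response
        k = int(round(f_inj * T))                          # FFT bin of the injection frequency
        V, I = np.fft.rfft(v)[k], np.fft.rfft(i)[k]
        return V / I                                       # impedance at f_inj

    for f_inj in [50, 250, 550, 1050]:                     # scan frequencies [Hz]
        Z = measure_impedance(f_inj)
        Z_true = R + 1j * 2 * np.pi * f_inj * L
        print(f"{f_inj:5d} Hz  measured {Z:.4f}  analytic {Z_true:.4f}")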

  18. CCTV Coverage Index Based on Surveillance Resolution and Its Evaluation Using 3D Spatial Analysis

    Directory of Open Access Journals (Sweden)

    Kyoungah Choi

    2015-09-01

    Full Text Available We propose a novel approach to evaluating how effectively a closed circuit television (CCTV) system can monitor a targeted area. With 3D models of the target area and the camera parameters of the CCTV system, the approach produces a surveillance coverage index, which is newly defined in this study as a quantitative measure of surveillance performance. This index indicates the proportion of the space being monitored with a sufficient resolution to the entire space of the target area. It is determined by computing the surveillance resolution at every position and orientation, which indicates how closely a specific object can be monitored with a CCTV system. We present a full mathematical derivation for the resolution, which depends on the location and orientation of the object as well as the geometric model of a camera. With the proposed approach, we quantitatively evaluated the surveillance coverage of a CCTV system in an underground parking area. Our evaluation process provided various quantitative analysis results, making it possible to examine the design of a CCTV system prior to its installation and to understand the surveillance capability of an existing CCTV system.
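
    A stripped-down version of the notion of surveillance resolution can be written in a few lines. The sketch below is not the paper's full 3D derivation: it assumes a simple pinhole camera model, computes how many pixels land on one metre of a target at a given distance and obliquity, and compares that with an assumed identification threshold (the 250 px/m figure is only a commonly quoted rule of thumb, not a value from the paper).

    # A minimal sketch of a pixels-per-metre surveillance resolution check.
    import math

    def pixels_per_metre(image_width_px, horizontal_fov_deg, distance_m, obliquity_deg=0.0):
        """Pixels per metre on a target at the given distance and viewing obliquity."""
        width_at_distance = 2.0 * distance_m * math.tan(math.radians(horizontal_fov_deg) / 2.0)
        ppm = image_width_px / width_at_distance
        return ppm * math.cos(math.radians(obliquity_deg))   # foreshortening of an oblique target

    required_ppm = 250.0          # assumed identification threshold
    for d in (2, 5, 10, 20):
        ppm = pixels_per_metre(image_width_px=1920, horizontal_fov_deg=90.0, distance_m=d)
        status = "OK" if ppm >= required_ppm else "below threshold"
        print(f"d = {d:2d} m: {ppm:6.1f} px/m  {status}")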

  19. Statistical Projections for Multi-resolution, Multi-dimensional Visual Data Exploration and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Hoa T. [Univ. of Utah, Salt Lake City, UT (United States); Stone, Daithi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-01-01

    An ongoing challenge in visual exploration and analysis of large, multi-dimensional datasets is how to present useful, concise information to a user for specific visualization tasks. Typical approaches to this problem have proposed either reduced-resolution versions of the data, or projections of the data, or both. These approaches still have limitations, such as high computational cost or loss of accuracy. In this work, we explore the use of a statistical metric as the basis for both projections and reduced-resolution versions of data, with a particular focus on preserving one key trait in the data, namely variation. We use two different case studies to explore this idea: one that uses a synthetic dataset, and another that uses a large ensemble collection produced by an atmospheric modeling code to study long-term changes in global precipitation. The primary finding of our work is that, in terms of preserving the variation signal inherent in the data, a statistical measure more faithfully preserves this key characteristic across both multi-dimensional projections and multi-resolution representations than a methodology based upon averaging.
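
    The contrast between the two reduction strategies can be shown on a toy field. The sketch below (NumPy only, a synthetic 2D field whose right half is noisy) builds one reduced-resolution version from block means and another from a variation-preserving statistic, block standard deviation; the specific field and block size are assumptions.

    # A minimal sketch: block-mean versus block-std down-sampling of a 2D field.
    import numpy as np

    rng = np.random.default_rng(3)
    n, block = 512, 8
    yy, xx = np.mgrid[0:n, 0:n]
    field = np.sin(xx / 25.0) + 0.5 * rng.standard_normal((n, n)) * (xx > n // 2)

    blocks = field.reshape(n // block, block, n // block, block)
    mean_reduced = blocks.mean(axis=(1, 3))     # conventional down-sampling (loses variation)
    std_reduced = blocks.std(axis=(1, 3))       # variation-preserving down-sampling

    half = n // (2 * block)
    left, right = std_reduced[:, :half], std_reduced[:, half:]
    # mean_reduced reproduces the large-scale sine wave but hides where the noise lives;
    # std_reduced makes the noisy right half stand out.
    print("mean of block-std, left (smooth) half :", round(float(left.mean()), 3))
    print("mean of block-std, right (noisy) half :", round(float(right.mean()), 3))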

  20. Regulatory analysis for resolution of USI [Unresolved Safety Issue] A-47

    International Nuclear Information System (INIS)

    Szukiewicz, A.J.

    1989-07-01

    This report presents a summary of the regulatory analysis conducted by the US Nuclear Regulatory Commission staff to evaluate the value/impact of alternatives for the resolution of Unresolved Safety Issue A-47, ''Safety Implications of Control Systems.'' The NRC staff's resolution presented herein is based on these analyses and on the technical findings and conclusions presented in NUREG-1217, the companion document to this report. The staff has concluded that certain actions should be taken to improve safety in light-water reactor plants. The staff recommended that certain plants improve their control systems to preclude reactor vessel/steam generator overfill events and to prevent steam generator dryout, modify their technical specifications to verify operability of such systems, and modify selected emergency procedures to ensure safe shutdown of the plant following a small-break loss-of-coolant accident. This report was issued as a draft for public comment on May 27, 1988. As a result of the public comments received, this report was revised. The NRC staff's responses to and resolution of the public comments are included as Appendix C to the final report, NUREG-1217

  1. Clinical usefulness and feasibility of time-frequency analysis of chemosensory event-related potentials.

    Science.gov (United States)

    Huart, C; Rombaux, Ph; Hummel, T; Mouraux, A

    2013-09-01

    The clinical usefulness of olfactory event-related brain potentials (OERPs) to assess olfactory function is limited by the relatively low signal-to-noise ratio of the responses identified using conventional time-domain averaging. Recently, it was shown that time-frequency analysis of the obtained EEG signals can markedly improve the signal-to-noise ratio of OERPs in healthy controls, because it enhances both phase-locked and non-phase-locked EEG responses. The aim of the present study was to investigate the clinical usefulness of this approach and evaluate its feasibility in a clinical setting. We retrospectively analysed EEG recordings obtained from 45 patients (15 anosmic, 15 hyposmic and 15 normosmic). The responses to olfactory stimulation were analysed using conventional time-domain analysis and joint time-frequency analysis. The ability of the two methods to discriminate between anosmic, hyposmic and normosmic patients was assessed using a Receiver Operating Characteristic analysis. The discrimination performance of OERPs identified using conventional time-domain averaging was poor. In contrast, the discrimination performance of the EEG response identified in the time-frequency domain was relatively high. Furthermore, we found a significant correlation between the magnitude of this response and the psychophysical olfactory score. Time-frequency analysis of the EEG responses to olfactory stimulation could be used as an effective and reliable diagnostic tool for the objective clinical evaluation of olfactory function in patients.
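
    The reason time-frequency averaging helps, as the abstract notes, is that it retains responses whose latency or phase jitters across trials, which cancel out in a conventional time-domain average. The sketch below illustrates this with synthetic single-trial "EEG" epochs and scipy's spectrogram; it is not the authors' analysis pipeline, and all signal parameters are assumptions.

    # A minimal sketch: time-domain averaging versus trial-averaged time-frequency power.
    import numpy as np
    from scipy import signal

    fs, n_trials = 250, 40
    t = np.arange(-0.5, 1.5, 1 / fs)
    rng = np.random.default_rng(4)

    trials = []
    for _ in range(n_trials):
        jitter = rng.uniform(0.25, 0.55)                   # latency jitter -> non-phase-locked
        phase = rng.uniform(0, 2 * np.pi)
        burst = np.exp(-((t - jitter) ** 2) / 0.02) * np.cos(2 * np.pi * 7 * t + phase)
        trials.append(burst + 0.8 * rng.standard_normal(t.size))
    trials = np.array(trials)

    erp = trials.mean(axis=0)                              # conventional average: burst cancels
    f, tt, S = signal.spectrogram(trials, fs=fs, nperseg=64, noverlap=56, axis=-1)
    tf_power = S.mean(axis=0)                              # average the single-trial power maps
    band = (f >= 5) & (f <= 9)

    print("time-domain average, peak amplitude :", round(float(np.max(np.abs(erp))), 3))
    print("trial-averaged 5-9 Hz power, peak   :", round(float(tf_power[band].max()), 3))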

  2. Sub-metric Resolution FWI of Ultra-High-Frequency Marine Reflection Seismograms. A Remote Sensing Tool for the Characterisation of Shallow Marine Geohazard

    Science.gov (United States)

    Provenzano, G.; Vardy, M. E.; Henstock, T.; Zervos, A.

    2017-12-01

    A quantitative high-resolution physical model of the top 100 meters of the sub-seabed is of key importance for a wide range of shallow geohazard scenarios: identification of potential shallow landsliding, monitoring of gas storage sites, and assessment of offshore structure stability. Currently, engineering-scale sediment characterisation relies heavily on direct sampling of the seabed and in-situ measurements. Such an approach is expensive and time-consuming, as well as liable to alter the sediment properties during the coring process. As opposed to reservoir-scale seismic exploration, ultra-high-frequency (UHF, 0.2-4.0 kHz) multi-channel marine reflection seismic data are most often limited to a semi-quantitative interpretation of the reflection amplitudes and facies geometries, leaving their intrinsic value as a remote characterisation tool largely unexploited. In this work, we develop a seismic inversion methodology to obtain a robust sub-metric resolution elastic model from limited-offset, limited-bandwidth UHF seismic reflection data, with minimal pre-processing and limited a priori information. The Full Waveform Inversion is implemented as a stochastic optimiser based upon a Genetic Algorithm, modified in order to improve the robustness against inaccurate starting model populations. Multiple independent runs are used to create a robust posterior model distribution and quantify the uncertainties on the solution. The methodology has been applied to complex synthetic examples and to real datasets acquired in areas prone to shallow landsliding. The inverted elastic models show a satisfactory match with the ground truths and a good sensitivity to relevant variations in the sediment texture and saturation state. We apply the methodology to a range of synthetic consolidating slopes under different loading conditions and sediment property distributions. Our work demonstrates that the seismic inversion of UHF data has the potential to become an important
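
    The stochastic-optimiser idea can be sketched with a toy problem. The code below is not the authors' FWI code: it assumes a real-coded genetic algorithm (tournament selection, blend crossover, Gaussian mutation, elitism) fitting just two parameters, the two-way time and amplitude of a single reflection, to a noisy synthetic trace, with several independent runs giving a crude spread of estimates in the spirit of the paper's posterior distribution. Every parameter choice is an assumption.

    # A minimal sketch of a genetic-algorithm waveform fit.
    import numpy as np

    rng = np.random.default_rng(5)
    fs = 4000.0
    t = np.arange(0, 0.25, 1 / fs)

    def ricker(t0, f0=400.0):
        """Ricker wavelet centred at t0 (f0 chosen only for illustration)."""
        a = (np.pi * f0 * (t - t0)) ** 2
        return (1 - 2 * a) * np.exp(-a)

    def model(params):
        t0, amp = params
        return amp * ricker(t0)

    observed = model([0.121, 0.8]) + 0.05 * rng.standard_normal(t.size)
    bounds = np.array([[0.05, 0.20], [0.1, 2.0]])           # search ranges for t0 and amp

    def misfit(params):
        return np.mean((model(params) - observed) ** 2)

    def ga_run(seed, pop_size=40, n_gen=60, elite=2, sigma=0.05):
        r = np.random.default_rng(seed)
        pop = r.uniform(bounds[:, 0], bounds[:, 1], size=(pop_size, 2))
        for _ in range(n_gen):
            fit = np.array([misfit(p) for p in pop])
            order = np.argsort(fit)
            new_pop = [pop[i] for i in order[:elite]]        # elitism
            while len(new_pop) < pop_size:
                i, j = r.integers(0, pop_size, 2)
                a = pop[i] if fit[i] < fit[j] else pop[j]     # tournament selection
                i, j = r.integers(0, pop_size, 2)
                b = pop[i] if fit[i] < fit[j] else pop[j]
                w = r.random()
                child = w * a + (1 - w) * b                   # blend crossover
                child += sigma * (bounds[:, 1] - bounds[:, 0]) * r.standard_normal(2)
                new_pop.append(np.clip(child, bounds[:, 0], bounds[:, 1]))
            pop = np.array(new_pop)
        fit = np.array([misfit(p) for p in pop])
        return pop[np.argmin(fit)]

    best = np.array([ga_run(seed) for seed in range(5)])      # independent runs -> spread
    print("t0  estimates:", np.round(best[:, 0], 4), " true 0.121")
    print("amp estimates:", np.round(best[:, 1], 3), " true 0.8")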

  3. Detection of somatic mutations by high-resolution DNA melting (HRM) analysis in multiple cancers.

    Science.gov (United States)

    Gonzalez-Bosquet, Jesus; Calcei, Jacob; Wei, Jun S; Garcia-Closas, Montserrat; Sherman, Mark E; Hewitt, Stephen; Vockley, Joseph; Lissowska, Jolanta; Yang, Hannah P; Khan, Javed; Chanock, Stephen

    2011-01-17

    Identification of somatic mutations in cancer is a major goal for understanding and monitoring the events related to cancer initiation and progression. High resolution melting (HRM) curve analysis represents a fast, post-PCR high-throughput method for scanning somatic sequence alterations in target genes. The aim of this study was to assess the sensitivity and specificity of HRM analysis for tumor mutation screening in a range of tumor samples, which included 216 frozen pediatric small rounded blue-cell tumors as well as 180 paraffin-embedded tumors from breast, endometrial and ovarian cancers (60 of each). HRM analysis was performed in exons of the following candidate genes known to harbor established commonly observed mutations: PIK3CA, ERBB2, KRAS, TP53, EGFR, BRAF, GATA3, and FGFR3. Bi-directional sequencing analysis was used to determine the accuracy of the HRM analysis. For the 39 mutations observed in frozen samples, the sensitivity and specificity of HRM analysis were 97% and 87%, respectively. There were 67 mutation/variants in the paraffin-embedded samples, and the sensitivity and specificity for the HRM analysis were 88% and 80%, respectively. Paraffin-embedded samples require higher quantity of purified DNA for high performance. In summary, HRM analysis is a promising moderate-throughput screening test for mutations among known candidate genomic regions. Although the overall accuracy appears to be better in frozen specimens, somatic alterations were detected in DNA extracted from paraffin-embedded samples.

  4. Detection of somatic mutations by high-resolution DNA melting (HRM) analysis in multiple cancers.

    Directory of Open Access Journals (Sweden)

    Jesus Gonzalez-Bosquet

    Full Text Available Identification of somatic mutations in cancer is a major goal for understanding and monitoring the events related to cancer initiation and progression. High resolution melting (HRM) curve analysis represents a fast, post-PCR high-throughput method for scanning somatic sequence alterations in target genes. The aim of this study was to assess the sensitivity and specificity of HRM analysis for tumor mutation screening in a range of tumor samples, which included 216 frozen pediatric small rounded blue-cell tumors as well as 180 paraffin-embedded tumors from breast, endometrial and ovarian cancers (60 of each). HRM analysis was performed in exons of the following candidate genes known to harbor established commonly observed mutations: PIK3CA, ERBB2, KRAS, TP53, EGFR, BRAF, GATA3, and FGFR3. Bi-directional sequencing analysis was used to determine the accuracy of the HRM analysis. For the 39 mutations observed in frozen samples, the sensitivity and specificity of HRM analysis were 97% and 87%, respectively. There were 67 mutation/variants in the paraffin-embedded samples, and the sensitivity and specificity for the HRM analysis were 88% and 80%, respectively. Paraffin-embedded samples require higher quantity of purified DNA for high performance. In summary, HRM analysis is a promising moderate-throughput screening test for mutations among known candidate genomic regions. Although the overall accuracy appears to be better in frozen specimens, somatic alterations were detected in DNA extracted from paraffin-embedded samples.

  5. Analysis of Interactive Conflict Resolution Tool Usage in a Mixed Equipage Environment

    Science.gov (United States)

    Homola, Jeffrey; Morey, Susan; Cabrall, Christopher; Martin, Lynne; Mercer, Joey; Prevot, Thomas

    2013-01-01

    A human-in-the-loop simulation was conducted that examined separation assurance concepts in varying levels of traffic density with mixtures of aircraft equipage and automation. This paper's analysis focuses on one of the experimental conditions in which traffic levels were approximately fifty percent higher than today, and approximately fifty percent of the traffic within the test area were equipped with data communications (data comm) capabilities. The other fifty percent of the aircraft required control by voice much like today. Within this environment, the air traffic controller participants were provided access to tools and automation designed to support the primary task of separation assurance that are currently unavailable. Two tools were selected for analysis in this paper: 1) a pre-probed altitude fly-out menu that provided instant feedback of conflict probe results for a range of altitudes, and 2) an interactive auto resolver that provided on-demand access to an automation-generated conflict resolution trajectory. Although encouraged, use of the support tools was not required; the participants were free to use the tools as they saw fit, and they were also free to accept, reject, or modify the resolutions offered by the automation. This mode of interaction provided a unique opportunity to examine exactly when and how these tools were used, as well as how acceptable the resolutions were. Results showed that the participants used the pre-probed altitude fly-out menu in 14% of conflict cases and preferred to use it in a strategic timeframe on data comm equipped and level flight aircraft. The interactive auto resolver was also used in a primarily strategic timeframe, on 22% of conflicts, and the participants likewise preferred to use it on conflicts involving data comm equipped aircraft. Of the 258 resolutions displayed, 46% were implemented and 54% were not. The auto resolver was rated highly by participants in terms of confidence and preference. Factors such as

  6. The incident of repetitive demands resolution in consumer affairs: empirical analysis of legal feasibility

    Directory of Open Access Journals (Sweden)

    Lucas do Monte Silva

    2017-05-01

    Full Text Available Faced with the scenario of massification of lawsuits, this article analyzes the main arguments and questions raised in demands related to moral damages and health plans before Santa Catarina's Court of Justice, in order to assess the possible application of the incident of repetitive demands resolution of the new Civil Procedure Code. To do so, it first analyzes the current context of the Brazilian judiciary, presenting the background of repetitive demands and massification of contracts and introductory aspects of the incident of repetitive demands resolution. It then presents an empirical judicial analysis, quantitative and qualitative, through a case study of Santa Catarina's Court of Justice, conducting a cross-sectional descriptive analysis of the demands related to the issue highlighted above, in order to present an 'argumentative radiography' of the judgments of that Court. The results confirmed the possibility of applying the IRDR to repetitive demands relating to the subjects of this study, with due legal caution, taking into account the high number of 'issues of fact' involved in lawsuits that include, among their claims, compensation for moral damages.

  7. Isotopomer analysis of lipid biosynthesis by high resolution mass spectrometry and NMR

    Energy Technology Data Exchange (ETDEWEB)

    Lane, Andrew N., E-mail: anlane01@louisville.edu [JG Brown Cancer Center, 529 S. Jackson Street, Louisville, KY 40202 (United States); Center for Regulatory and Environmental Analytical Metabolomics (CREAM), University of Louisville, Louisville, KY (United States); Fan, Teresa W.-M. [JG Brown Cancer Center, 529 S. Jackson Street, Louisville, KY 40202 (United States); Center for Regulatory and Environmental Analytical Metabolomics (CREAM), University of Louisville, Louisville, KY (United States); Department of Chemistry, University of Louisville, Louisville, KY 40292 (United States); Xie, Zhengzhi; Moseley, Hunter N.B.; Higashi, Richard M. [Center for Regulatory and Environmental Analytical Metabolomics (CREAM), University of Louisville, Louisville, KY (United States); Department of Chemistry, University of Louisville, Louisville, KY 40292 (United States)

    2009-10-05

    We have coupled 2D-NMR and infusion FT-ICR-MS with computer-assisted assignment to profile 13C-isotopologues of glycerophospholipids (GPL) directly in crude cell extracts, resulting in very high information throughput of >3000 isobaric molecules in a few minutes. A mass accuracy of better than 1 ppm combined with a resolution of 100,000 at the measured m/z was required to distinguish isotopomers from other GPL structures. Isotopologue analysis of GPLs extracted from LCC2 breast cancer cells grown on [U-13C]-glucose provided a rich trove of information about the biosynthesis and turnover of the GPLs. The isotopologue intensity ratios from the FT-ICR-MS were accurate to ~1% or better based on natural abundance background, and depended on the signal-to-noise ratio. The time course of incorporation of 13C from [U-13C]-glucose into a particular phosphatidylcholine was analyzed in detail, to provide a quantitative measure of the sizes of the glycerol, acetyl CoA and total GPL pools in growing LCC2 cells. Independent and complementary analysis of the positional 13C enrichment in the glycerol and fatty acyl chains obtained from high resolution 2D NMR was used to verify key aspects of the model. This technology enables simple and rapid sample preparation, has rapid analysis, and is generally applicable to unfractionated GPLs of almost any head group, and to mixtures of other classes of metabolites.

  8. Isotopomer analysis of lipid biosynthesis by high resolution mass spectrometry and NMR

    International Nuclear Information System (INIS)

    Lane, Andrew N.; Fan, Teresa W.-M.; Xie, Zhengzhi; Moseley, Hunter N.B.; Higashi, Richard M.

    2009-01-01

    We have coupled 2D-NMR and infusion FT-ICR-MS with computer-assisted assignment to profile 13C-isotopologues of glycerophospholipids (GPL) directly in crude cell extracts, resulting in very high information throughput of >3000 isobaric molecules in a few minutes. A mass accuracy of better than 1 ppm combined with a resolution of 100,000 at the measured m/z was required to distinguish isotopomers from other GPL structures. Isotopologue analysis of GPLs extracted from LCC2 breast cancer cells grown on [U-13C]-glucose provided a rich trove of information about the biosynthesis and turnover of the GPLs. The isotopologue intensity ratios from the FT-ICR-MS were accurate to ~1% or better based on natural abundance background, and depended on the signal-to-noise ratio. The time course of incorporation of 13C from [U-13C]-glucose into a particular phosphatidylcholine was analyzed in detail, to provide a quantitative measure of the sizes of the glycerol, acetyl CoA and total GPL pools in growing LCC2 cells. Independent and complementary analysis of the positional 13C enrichment in the glycerol and fatty acyl chains obtained from high resolution 2D NMR was used to verify key aspects of the model. This technology enables simple and rapid sample preparation, has rapid analysis, and is generally applicable to unfractionated GPLs of almost any head group, and to mixtures of other classes of metabolites.

  9. Frequency-Zooming ARMA Modeling for Analysis of Noisy String Instrument Tones

    Directory of Open Access Journals (Sweden)

    Paulo A. A. Esquef

    2003-09-01

    Full Text Available This paper addresses model-based analysis of string instrument sounds. In particular, it reviews the application of autoregressive (AR modeling to sound analysis/synthesis purposes. Moreover, a frequency-zooming autoregressive moving average (FZ-ARMA modeling scheme is described. The performance of the FZ-ARMA method on modeling the modal behavior of isolated groups of resonance frequencies is evaluated for both synthetic and real string instrument tones immersed in background noise. We demonstrate that the FZ-ARMA modeling is a robust tool to estimate the decay time and frequency of partials of noisy tones. Finally, we discuss the use of the method in synthesis of string instrument sounds.
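
    The core of AR-based partial analysis, as described above, is that the pole of a low-order autoregressive fit encodes both the frequency and the decay time of a resonant partial. The sketch below (NumPy only, a single noisy decaying sinusoid) is an illustration of that idea, not the FZ-ARMA implementation.

    # A minimal sketch: AR(2) fit of a noisy decaying partial, then frequency and
    # decay time from the pole. All signal parameters are assumptions.
    import numpy as np

    fs = 44100.0
    t = np.arange(0, 1.0, 1 / fs)
    f0, tau = 440.0, 0.35                                   # true frequency [Hz], decay time [s]
    rng = np.random.default_rng(6)
    x = np.exp(-t / tau) * np.sin(2 * np.pi * f0 * t) + 1e-3 * rng.standard_normal(t.size)

    # Least-squares fit of x[n] ~ a1*x[n-1] + a2*x[n-2]
    A = np.column_stack([x[1:-1], x[:-2]])
    a1, a2 = np.linalg.lstsq(A, x[2:], rcond=None)[0]

    poles = np.roots([1.0, -a1, -a2])                       # poles of 1/(1 - a1 z^-1 - a2 z^-2)
    p = poles[np.argmax(poles.imag)]                        # upper-half-plane pole
    f_est = np.angle(p) * fs / (2 * np.pi)
    tau_est = -1.0 / (fs * np.log(np.abs(p)))

    print(f"frequency: {f_est:.1f} Hz (true {f0} Hz)")
    print(f"decay time: {tau_est:.3f} s (true {tau} s)")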

  10. Image enhancement of x-ray microscope using frequency spectrum analysis

    International Nuclear Information System (INIS)

    Li Wenjie; Chen Jie; Tian Jinping; Zhang Xiaobo; Liu Gang; Tian Yangchao; Liu Yijin; Wu Ziyu

    2009-01-01

    We demonstrate a new method for x-ray microscope image enhancement using frequency spectrum analysis. Fine sample characteristics are well enhanced, with homogeneous visibility and better contrast, from a single image. This method is easy to implement and helps to improve the quality of images taken by our imaging system.

  11. Image enhancement of x-ray microscope using frequency spectrum analysis

    Energy Technology Data Exchange (ETDEWEB)

    Li Wenjie; Chen Jie; Tian Jinping; Zhang Xiaobo; Liu Gang; Tian Yangchao [National Synchrotron Radiation Laboratory, University of Science and Technology of China, Hefei, Anhui 230029 (China); Liu Yijin; Wu Ziyu, E-mail: wuzy@ihep.ac.c, E-mail: ychtian@ustc.edu.c [Institute of High Energy Physics, Chinese Academy of Science, Beijing 100049 (China)

    2009-09-01

    We demonstrate a new method for x-ray microscope image enhancement using frequency spectrum analysis. Fine sample characteristics are well enhanced, with homogeneous visibility and better contrast, from a single image. This method is easy to implement and helps to improve the quality of images taken by our imaging system.
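
    The abstracts do not specify the exact filter, so the sketch below substitutes a generic frequency-domain enhancement for illustration: a mild high-frequency emphasis applied in the Fourier domain of a synthetic test image, which boosts fine detail relative to a slowly varying background illumination. NumPy only; all filter parameters are assumptions.

    # A minimal sketch of image enhancement by frequency-spectrum manipulation.
    import numpy as np

    n = 256
    yy, xx = np.mgrid[0:n, 0:n]
    background = np.exp(-((xx - n / 2) ** 2 + (yy - n / 2) ** 2) / (2 * 90.0 ** 2))  # uneven illumination
    detail = 0.05 * np.sin(2 * np.pi * xx / 8.0)                                     # fine structure
    img = background * (1.0 + detail)

    # High-frequency emphasis: H(u,v) = a + b * (1 - exp(-D^2 / (2 D0^2)))
    a, b, D0 = 0.4, 1.6, 12.0
    u = np.fft.fftfreq(n)[:, None] * n
    v = np.fft.fftfreq(n)[None, :] * n
    H = a + b * (1.0 - np.exp(-(u ** 2 + v ** 2) / (2.0 * D0 ** 2)))

    enhanced = np.real(np.fft.ifft2(np.fft.fft2(img) * H))

    # Compare the fine-detail bin (8-pixel ripple) against a low-frequency background bin.
    F_before = np.abs(np.fft.fft2(img))
    F_after = np.abs(np.fft.fft2(enhanced))
    k_detail, k_bg = (0, n // 8), (0, 1)
    print("detail/background spectral ratio before:", round(float(F_before[k_detail] / F_before[k_bg]), 3))
    print("detail/background spectral ratio after :", round(float(F_after[k_detail] / F_after[k_bg]), 3))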

  12. Feedback control of laser welding based on frequency analysis of light emissions and adaptive beam shaping

    Czech Academy of Sciences Publication Activity Database

    Mrňa, Libor; Šarbort, Martin; Řeřucha, Šimon; Jedlička, Petr

    2012-01-01

    Roč. 39, NOV (2012), s. 784-791 ISSN 1875-3892. [LANE 2012. Laser Assisted Net Shape Engineering /7./ International Conference on Photonic Technologies. Fürth, 12.11.2012-15.12.2012] Institutional support: RVO:68081731 Keywords : laser welding * feedback control * frequency analysis * adaptive beam shaping Subject RIV: BH - Optics, Masers, Lasers

  13. Experimental Demonstration and Theoretical Analysis of Slow Light in a Semiconductor Waveguide at GHz Frequencies

    DEFF Research Database (Denmark)

    Mørk, Jesper; Kjær, Rasmus; Poel, Mike van der

    2005-01-01

    Experimental demonstration and theoretical analysis of slow light in a semiconductor waveguide at GHz frequencies: a slow-down of light by a factor of two is demonstrated in a semiconductor waveguide at room temperature, with a bandwidth of 16.7 GHz, using the effect of coherent pulsations of the carrier density...

  14. Operation States Analysis of the Series-Parallel resonant Converter Working Above Resonance Frequency

    Directory of Open Access Journals (Sweden)

    Peter Dzurko

    2007-01-01

    Full Text Available Operation states analysis of a series-parallel converter working above resonance frequency is described in the paper. Principal equations are derived for the individual operation states. On the basis of these, diagrams are constructed that give a complete picture of the converter behaviour for the individual circuit parameters. The waveforms may be utilised when designing the individual parts of the inverter.

  15. Dynamic factor analysis in the frequency domain: causal modeling of multivariate psychophysiological time series

    NARCIS (Netherlands)

    Molenaar, P.C.M.

    1987-01-01

    Outlines a frequency domain analysis of the dynamic factor model and proposes a solution to the problem of constructing a causal filter of lagged factor loadings. The method is illustrated with applications to simulated and real multivariate time series. The latter applications involve topographic

  16. The evaluation of voiding patterns. An analysis of frequency-volume charts and symptom scores

    NARCIS (Netherlands)

    Haarst, E.P. van

    2015-01-01

    This thesis is an analysis of frequency-volume charts (FVCs) and International Prostate Symptom Scores (IPSS) and their relations, based on 2 large databases: one with 24-hour FVCs of 1152 volunteers of all adult age groups without urological complaints, and one with 7-day FVCs of 378 urological

  17. High frequency analysis of lead-lag relationships between financial markets

    NARCIS (Netherlands)

    de Jong, F.C.J.M.; Nijman, T.E.

    1995-01-01

    High frequency data are often observed at irregular intervals, which complicates the analysis of lead-lag relationships between financial markets. Frequently, estimators have been used that are based on observations at regular intervals, which are adapted to the irregular observations case by

  18. Frequency domain performance analysis of marginally stable LTI systems with saturation

    NARCIS (Netherlands)

    Berg, van den R.A.; Pogromski, A.Y.; Rooda, J.E.; Leonov, G.; Nijmeijer, H.; Pogromsky, A.; Fradkov, A.

    2009-01-01

    In this paper we discuss the frequency domain performance analysis of a marginally stable linear time-invariant (LTI) system with saturation in the feedback loop. We present two methods, both based on the notion of convergent systems, that allow to evaluate the performance of this type of systems in

  19. Operation Analysis of the Series-Parallel Resonant Converter Working above Resonance Frequency

    Directory of Open Access Journals (Sweden)

    Peter Dzurko

    2006-01-01

    Full Text Available The present article deals with a theoretical analysis of the operation of a series-parallel converter working above resonance frequency. Principal equations are derived for the individual operation intervals. Based on these, waveforms of the individual quantities are constructed for inverter operation both at load and at no load. The waveforms may be utilised when designing the individual parts of the inverter.

  20. Impedance-Based High Frequency Resonance Analysis of DFIG System in Weak Grids

    DEFF Research Database (Denmark)

    Song, Yipeng; Wang, Xiongfei; Blaabjerg, Frede

    2017-01-01

    Resonance (SSR). However, the High Frequency Resonance (HFR) of DFIG systems due to the impedance interaction between DFIG system and parallel compensated weak network is often overlooked. This paper thus investigates the impedance characteristics of DFIG systems for the analysis of HFR. The influences...

  1. TropFishR: an R package for fisheries analysis with length-frequency data

    DEFF Research Database (Denmark)

    Mildenberger, Tobias; Taylor, M. H.; Wolff, A.M.

    2017-01-01

    1. The R package TropFishR is a new analysis toolbox compiling single-species stock assessment methods specifically designed for data-limited fisheries analysis using length-frequency data. 2. It includes methods for (i) estimating biological stock characteristics such as growth and mortality parameters ... introduces the package and demonstrates the functionality of a selection of its core methods. 4. TropFishR modernises traditional stock assessment methods by easing application and development and by combining them with advanced statistical approaches...

  2. Analysis of the Emitted Wavelet of High-Resolution Bowtie GPR Antennas

    Directory of Open Access Journals (Sweden)

    Manuel Pereira

    2009-06-01

    Full Text Available Most Ground Penetrating Radars (GPR) cover a wide frequency range by emitting wavelets of very short duration. In this work, we study in detail the wavelet emitted by two bowtie GPR antennas with nominal frequencies of 800 MHz and 1 GHz. Knowledge of this emitted wavelet allows us to extract as much information as possible from recorded signals, using advanced processing techniques and computer simulations. Following the previously published methodology of Rial et al. [1], which ensures system stability and reliability in data acquisition, a thorough analysis of the wavelet in both the time and frequency domains is performed. Most of the tests were carried out with air as the propagation medium, allowing a proper analysis of the geometrical attenuation factor. Furthermore, we attempt to determine, for each antenna, a time zero in the records that allows us to correctly assign a position to the reflectors detected by the radar. The obtained results indicate that the time zero is not a constant value for the evaluated antennas, but instead depends on the characteristics of the material in contact with the antenna.

  3. Study on time-frequency analysis method of very fast transient overvoltage

    Science.gov (United States)

    Li, Shuai; Liu, Shiming; Huang, Qiyan; Fu, Chuanshun

    2018-04-01

    The operation of a disconnector in a gas insulated substation (GIS) may produce very fast transient overvoltage (VFTO), which is characterised by short rise time, short duration, high amplitude and rich frequency content. VFTO can damage the GIS and secondary equipment, and the frequency components contained in the VFTO can cause resonance overvoltage inside transformers, so it is necessary to study the spectral characteristics of the VFTO. From a signal-processing perspective, VFTO is a non-stationary signal; the traditional Fourier transform cannot describe its time-varying frequency content, so time-frequency analysis is required to analyse the VFTO spectral characteristics. In this paper, we analyse the performance of the short-time Fourier transform (STFT), the Wigner-Ville distribution (WVD), the pseudo Wigner-Ville distribution (PWVD) and the smoothed pseudo Wigner-Ville distribution (SPWVD). The results show that the SPWVD performs best: its time-frequency concentration is higher than that of the STFT and it does not suffer from cross-interference terms, so it can meet the requirements of VFTO spectrum analysis.
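
    A time-frequency map of a fast transient is easy to produce with the simplest of the methods named above, the STFT. The sketch below uses scipy and a synthetic two-component damped burst standing in for a measured VFTO record (the amplitudes, frequencies and damping constants are assumptions); the more concentrated SPWVD would follow the same analysis pattern.

    # A minimal sketch: STFT of a synthetic VFTO-like transient.
    import numpy as np
    from scipy import signal

    fs = 200e6                                   # 200 MS/s record
    t = np.arange(0, 20e-6, 1 / fs)
    # Quickly damped high-frequency burst followed by a slower oscillation.
    vfto = (np.exp(-t / 2e-6) * np.sin(2 * np.pi * 25e6 * t)
            + 0.6 * np.exp(-t / 8e-6) * np.sin(2 * np.pi * 5e6 * t) * (t > 3e-6))

    f, tt, Z = signal.stft(vfto, fs=fs, nperseg=256, noverlap=224)
    power = np.abs(Z) ** 2

    # Dominant frequency in an early and a late time slice.
    for label, ti in (("early", 2e-6), ("late", 12e-6)):
        col = np.argmin(np.abs(tt - ti))
        fmax = f[np.argmax(power[:, col])]
        print(f"{label:5s} slice ({tt[col] * 1e6:4.1f} us): dominant {fmax / 1e6:5.1f} MHz")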

  4. Plastid: nucleotide-resolution analysis of next-generation sequencing and genomics data.

    Science.gov (United States)

    Dunn, Joshua G; Weissman, Jonathan S

    2016-11-22

    Next-generation sequencing (NGS) informs many biological questions with unprecedented depth and nucleotide resolution. These assays have created a need for analytical tools that enable users to manipulate data nucleotide-by-nucleotide robustly and easily. Furthermore, because many NGS assays encode information jointly within multiple properties of read alignments - for example, in ribosome profiling, the locations of ribosomes are jointly encoded in alignment coordinates and length - analytical tools are often required to extract the biological meaning from the alignments before analysis. Many assay-specific pipelines exist for this purpose, but there remains a need for user-friendly, generalized, nucleotide-resolution tools that are not limited to specific experimental regimes or analytical workflows. Plastid is a Python library designed specifically for nucleotide-resolution analysis of genomics and NGS data. As such, Plastid is designed to extract assay-specific information from read alignments while retaining generality and extensibility to novel NGS assays. Plastid represents NGS and other biological data as arrays of values associated with genomic or transcriptomic positions, and contains configurable tools to convert data from a variety of sources to such arrays. Plastid also includes numerous tools to manipulate even discontinuous genomic features, such as spliced transcripts, with nucleotide precision. Plastid automatically handles conversion between genomic and feature-centric coordinates, accounting for splicing and strand, freeing users of burdensome accounting. Finally, Plastid's data models use consistent and familiar biological idioms, enabling even beginners to develop sophisticated analytical workflows with minimal effort. Plastid is a versatile toolkit that has been used to analyze data from multiple NGS assays, including RNA-seq, ribosome profiling, and DMS-seq. It forms the genomic engine of our ORF annotation tool, ORF-RATER, and is readily
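
    The representation the abstract describes, values indexed by nucleotide position, can be illustrated without the library itself. The sketch below is NOT Plastid's API: it uses plain Python/NumPy, made-up alignment tuples, and an assumed fixed P-site offset to build a simple per-nucleotide coverage vector and a ribosome-profiling-style single-position count.

    # A minimal, hypothetical sketch of nucleotide-resolution count arrays.
    import numpy as np

    chrom_length = 50
    # (start, end) alignment coordinates, 0-based half-open, all on the + strand (made up).
    alignments = [(2, 30), (5, 33), (5, 34), (12, 40), (20, 48)]

    coverage = np.zeros(chrom_length, dtype=int)
    psite = np.zeros(chrom_length, dtype=int)
    P_SITE_OFFSET = 12                         # assumed fixed offset from the 5' end

    for start, end in alignments:
        coverage[start:end] += 1               # every covered nucleotide
        p = start + P_SITE_OFFSET              # single assigned position per read
        if p < chrom_length:
            psite[p] += 1

    print("coverage[0:20]:", coverage[:20])
    print("p-site positions:", np.flatnonzero(psite), "counts:", psite[np.flatnonzero(psite)])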

  5. A multivariate analysis of age-related differences in functional networks supporting conflict resolution.

    Science.gov (United States)

    Salami, Alireza; Rieckmann, Anna; Fischer, Håkan; Bäckman, Lars

    2014-02-01

    Functional neuroimaging studies demonstrate age-related differences in recruitment of a large-scale attentional network during interference resolution, especially within dorsolateral prefrontal cortex (DLPFC) and anterior cingulate cortex (ACC). These alterations in functional responses have been frequently observed despite equivalent task performance, suggesting age-related reallocation of neural resources, although direct evidence for a facilitating effect in aging is sparse. We used the multi-source interference task and multivariate partial-least-squares to investigate age-related differences in the neuronal signature of conflict resolution, and their behavioral implications in younger and older adults. There were interference-related increases in activity, involving fronto-parietal and basal ganglia networks that generalized across age. In addition an age-by-task interaction was observed within a distributed network, including DLPFC and ACC, with greater activity during interference in the old. Next, we combined brain-behavior and functional connectivity analyses to investigate whether compensatory brain changes were present in older adults, using DLPFC and ACC as regions of interest (i.e. seed regions). This analysis revealed two networks differentially related to performance across age groups. A structural analysis revealed age-related gray-matter losses in regions facilitating performance in the young, suggesting that functional reorganization may partly reflect structural alterations in aging. Collectively, these findings suggest that age-related structural changes contribute to reductions in the efficient recruitment of a youth-like interference network, which cascades into instantiation of a different network facilitating conflict resolution in elderly people. © 2013. Published by Elsevier Inc. All rights reserved.

  6. Identification of the GST-T1 and GST-M1 null genotypes using high resolution melting analysis.

    Science.gov (United States)

    Drobná, Zuzana; Del Razo, Luz Maria; Garcia-Vargas, Gonzalo; Sánchez-Ramírez, Blanca; González-Horta, Carmen; Ballinas-Casarrubias, Lourdes; Loomis, Dana; Stýblo, Miroslav

    2012-01-13

    Glutathione S-transferases, including GST-T1 and GST-M1, are known to be involved in the phase II detoxification pathways for xenobiotics as well as in the metabolism of endogenous compounds. Polymorphisms in these genes have been linked to an increased susceptibility to carcinogenesis and associated with risk factors that predispose to certain inflammatory diseases. In addition, GST-T1 and GST-M1 null genotypes have been shown to be responsible for interindividual variations in the metabolism of arsenic, a known human carcinogen. To assess the specific GST genotypes in the Mexican population chronically exposed to arsenic, we have developed a multiplex High Resolution Melting PCR (HRM-PCR) analysis using a LightCycler480 instrument. This method is based on analysis of the PCR product melting curve that discriminates PCR products according to their lengths and base sequences. Three pairs of primers that specifically recognize GST-T1, GST-M1, and β-globin, an internal control, to produce amplicons of different length were designed and combined with LightCycler480 High Resolution Melting Master Mix containing ResoLight, a completely saturating DNA dye. Data collected from melting curve analysis were evaluated using LightCycler480 software to determine specific melting temperatures of individual melting curves representing target genes. Using this newly developed multiplex HRM-PCR analysis, we evaluated GST-T1 and GST-M1 genotypes in 504 DNA samples isolated from the blood of individuals residing in Zimapan, Lagunera, and Chihuahua regions in Mexico. We found that the Zimapan and Lagunera populations have similar GST-T1 and GST-M1 genotype frequencies which differ from those of the Chihuahua population. In addition, 14 individuals have been identified as carriers of the double null genotype, i.e., null genotypes in both GST-T1 and GST-M1 genes. Although this procedure does not distinguish between biallelic (+/+) and monoallelic (+/-) genotypes, it can be used in an

  7. Correlation analysis between team communication characteristics and frequency of inappropriate communications

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Lee, Seung Woo; Park, Jinkyun; Kang, Hyun Gook; Seong, Poong Hyun

    2013-01-01

    Highlights: • We proposed a method to evaluate team communication characteristics based on social network analysis. • We compare team communication characteristics with the frequency of inappropriate communications. • The frequency of inappropriate communications decreased when more operators performed the same types of role as others. • The frequency of inappropriate communications decreased for teams that provided a larger number of acknowledgments. - Abstract: The characteristics of team communications are important since large process systems such as nuclear power plants, airlines, and railways are operated by operating teams. In such situations, inappropriate communications can cause a lack of situational information and lead to serious consequences for the systems. As a result, the communication characteristics of operating teams should be understood in order to extract meaningful insights into the nature of inappropriate communications. The purpose of this study was to develop a method to evaluate the characteristics of team communications based on social network analysis and compare them with the frequency of inappropriate communications. In order to perform the analysis, verbal protocol data, which were audio-visually recorded during training sessions of operating teams, were used, and interfacing system loss-of-coolant accident scenarios were selected. As a result of the study, it was found that the frequency of inappropriate communications decreased when more operators performed the same types of role as other operators, since they can easily and effectively back each other up. Also, the frequency of inappropriate communications decreased for teams whose communications more often acknowledged or confirmed another communication.

  8. Quantitative analysis of 39 polybrominated diphenyl ethers by isotope dilution GC/low-resolution MS.

    Science.gov (United States)

    Ackerman, Luke K; Wilson, Glenn R; Simonich, Staci L

    2005-04-01

    A GC/low-resolution MS method for the quantitative isotope dilution analysis of 39 mono- to heptabrominated diphenyl ethers was developed. The effects of two different ionization sources, electron impact (EI) and electron capture negative ionization (ECNI), and the effects of their parameters on production of high-mass fragment ions [M - xH - yBr](-) specific to PBDEs were investigated. Electron energy, emission current, source temperature, ECNI system pressure, and choice of ECNI reagent gases were optimized. Previously unidentified enhancement of PBDE high-mass fragment ion [M - xH - yBr](-) abundance was achieved. Electron energy had the largest impact on PBDE high-mass fragment ion abundance for both the ECNI and EI sources. By monitoring high-mass fragment ions of PBDEs under optimized ECNI source conditions, quantitative isotope dilution analysis of 39 PBDEs was conducted using nine (13)C(12) labeled PBDEs on a low-resolution MS with low picogram to femtogram instrument detection limits.

  9. Analysis of the Behavior of Undamped and Unstable High-Frequency Resonance in DFIG System

    DEFF Research Database (Denmark)

    Song, Yipeng; Blaabjerg, Frede

    2017-01-01

    As wind power generation develops, the Doubly Fed Induction Generator (DFIG) based wind power system may suffer Sub-Synchronous Resonance (SSR) and High Frequency Resonance (HFR) in series and parallel compensated weak networks. The principle and frequency of HFR have been discussed using the Bode diagram as an analysis tool. However, the HFR can be categorized into two different types: undamped HFR (which persists in steady state) and unstable HFR (which eventually results in complete instability and divergence), neither of which has been investigated before. Since both the undamped HFR...

  10. Frequency distribution analysis of the long-lived beta-activity of air dust

    International Nuclear Information System (INIS)

    Bunzl, K.; Hoetzl, H.; Winkler, R.

    1977-01-01

    In order to compare the average annual beta activities of air dust, a frequency distribution analysis of the data has been carried out to select a representative quantity for the average value of each data group. It was found that the data to be analysed were consistent with a log-normal frequency distribution; the representative average of the beta activity of each year was therefore calculated as the median, i.e. the antilog of the arithmetic mean of the logarithms, log x, of the analytical values x. The 95% confidence limits were also obtained. The quantities thus calculated are summarized in tabular form. (U.K.)
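
    The calculation described above is short enough to write out. The sketch below assumes NumPy/SciPy and synthetic log-normally distributed activities; it takes the median as the antilog of the mean of the logarithms and attaches 95% confidence limits based on the t distribution of the log-transformed values.

    # A minimal sketch of the log-normal median and its 95% confidence limits.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    activity = rng.lognormal(mean=np.log(0.8), sigma=0.6, size=52)   # one year of weekly values

    logs = np.log10(activity)
    m, s, n = logs.mean(), logs.std(ddof=1), logs.size
    median = 10 ** m                                                 # antilog of the mean log

    tcrit = stats.t.ppf(0.975, df=n - 1)
    half_width = tcrit * s / np.sqrt(n)
    ci = (10 ** (m - half_width), 10 ** (m + half_width))            # 95% limits for the median

    print(f"median activity: {median:.3f}  (95% CI {ci[0]:.3f} - {ci[1]:.3f})")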

  11. The spa typing of methicillin-resistant Staphylococcus aureus isolates by High Resolution Melting (HRM) analysis.

    Science.gov (United States)

    Fasihi, Yasser; Fooladi, Saba; Mohammadi, Mohammad Ali; Emaneini, Mohammad; Kalantar-Neyestanaki, Davood

    2017-09-06

    Molecular typing is an important tool for the control and prevention of infection. A suitable molecular typing method for epidemiological investigation must be easy to perform, highly reproducible, inexpensive, rapid and easy to interpret. In this study, two molecular typing methods, the conventional PCR-sequencing method and high resolution melting (HRM) analysis, were used for staphylococcal protein A (spa) typing of 30 methicillin-resistant Staphylococcus aureus (MRSA) isolates recovered from clinical samples. Based on the PCR-sequencing results, 16 different spa types were identified among the 30 MRSA isolates. Of these 16 spa types, 14 were separated by the HRM method; two spa types, t4718 and t2894, were not separated from each other. According to our results, spa typing based on the HRM analysis method is very rapid, easy to perform and cost-effective, but the method must be standardized for different regions, spa types and real-time instruments.

  12. The application of computer technique in routine neutron activation analysis using high resolution gamma ray spectrometry

    International Nuclear Information System (INIS)

    Szopa, Z.; Plejewska, M.; Staszelis, J.

    1982-01-01

    A full system of four computer programs for routine qualitative and quantitative neutron activation analysis (NAA) using high resolution gamma-ray spectrometry has been developed. The structure and capabilities of the ''data flow'' programs, i.e. programs DIDPDP and DIDCDC, designed for fast and reliable ''off line'' data transfer between the buffer memory of the spectrometric line (9-track magnetic tape) and the fast-access memory (disc) of the computers used, PDP-11/45 and CYBER-73, are presented. The structure and organization of the ''data processing'' programs, i.e. programs SAWAPS and MAZYG, are presented as well. The utility and reliability of these programs for large-scale routine NAA, exemplified by the analysis of filters with air pollutants, have been tested and discussed. The programs are written mainly in FORTRAN. (author)

  13. The investigation of Martian dune fields using very high resolution photogrammetric measurements and time series analysis

    Science.gov (United States)

    Kim, J.; Park, M.; Baik, H. S.; Choi, Y.

    2016-12-01

    At the present time, arguments continue regarding the migration speeds of Martian dune fields and their correlation with atmospheric circulation. However, precise measurement of the spatial translation of Martian dunes has been conducted only a very few times. We therefore developed a generic procedure to precisely measure the migration of dune fields with the recently introduced 25-cm-resolution High Resolution Imaging Science Experiment (HiRISE), employing a high-accuracy photogrammetric processor and a sub-pixel image correlator. The processor was designed to trace estimated dune migration, albeit slight, over the Martian surface by 1) the introduction of very high resolution ortho images and stereo analysis based on hierarchical geodetic control for better initial point settings; 2) positioning-error removal through sensor-model refinement with a non-rigorous bundle block adjustment, which makes possible the co-alignment of all images in a time series; and 3) improved sub-pixel co-registration algorithms using optical flow with a refinement stage conducted on a pyramidal grid processor and a blunder classifier. Moreover, volumetric changes of Martian dunes were additionally traced by means of stereo analysis and photoclinometry. The established algorithms have been tested using high-resolution HiRISE images over a large number of Martian dune fields covering the whole Mars Global Dune Database. Migrations over well-known crater dune fields appeared to be almost static for considerable temporal periods and were weakly correlated with wind directions estimated by the Mars Climate Database (Millour et al. 2015). Only over a few Martian dune fields, such as Kaiser crater, have meaningful migration speeds (>1 m/year) compared with the photogrammetric error residual been measured. Currently a technically improved processor to compensate for the error residual using time-series observations is under development and is expected to produce long-term migration speeds over Martian dune fields.

  14. Analysis of measured data of human body based on error correcting frequency

    Science.gov (United States)

    Jin, Aiyan; Peipei, Gao; Shang, Xiaomei

    2014-04-01

    Anthropometry is the measurement of all parts of the human body surface; the measured data are the basis for analysis and study of the human body, for the establishment and modification of garment sizes, and for the formulation and implementation of online clothing stores. In this paper, several groups of measured data are obtained, and the data error is analysed by examining the error frequency and by using the analysis-of-variance method of mathematical statistics. Determination of the accuracy of the measured data and of the difficulty of measuring particular parts of the human body, further study of the causes of data errors, and a summary of the key points for minimizing errors are also presented. This paper analyses the measured data based on error frequency and, in a way, provides reference elements to promote the development of the garment industry.

  15. Historical and future drought in Bangladesh using copula-based bivariate regional frequency analysis

    Science.gov (United States)

    Mortuza, Md Rubayet; Moges, Edom; Demissie, Yonas; Li, Hong-Yi

    2018-02-01

    The study aims at regional and probabilistic evaluation of bivariate drought characteristics to assess both the past and future drought duration and severity in Bangladesh. The procedures involve applying (1) the standardized precipitation index to identify drought duration and severity, (2) regional frequency analysis to determine the appropriate marginal distributions for both duration and severity, (3) a copula model to estimate the joint probability distribution of drought duration and severity, and (4) precipitation projections from multiple climate models to assess future drought trends. Since drought duration and severity in Bangladesh are often strongly correlated and do not follow the same marginal distributions, the joint and conditional return periods of droughts are characterized using the copula-based joint distribution. The country is divided into three homogeneous regions using Fuzzy clustering and multivariate discordancy and homogeneity measures. For given severity and duration values, the joint return periods for a drought to exceed both values are on average 45% larger, while those to exceed either value are 40% smaller, than the return periods from the univariate frequency analysis, which treats drought duration and severity independently. These results suggest that, compared to the bivariate drought frequency analysis, the standard univariate frequency analysis under- or overestimates the frequency and severity of droughts depending on how their duration and severity are related. Overall, more frequent and severe droughts are observed on the west side of the country. Future drought trends based on four climate models and two scenarios show the possibility of less frequent drought in the future (2020-2100) than in the past (1961-2010).
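
    A hedged sketch of the copula step only: assuming a Gumbel copula with an illustrative dependence parameter and marginal non-exceedance probabilities, the joint "AND"/"OR" return periods follow from the usual bivariate return-period relations (the study's fitted marginals, copula family and parameters are not reproduced here):

        import numpy as np

        def gumbel_copula(u, v, theta):
            # Gumbel copula C(u, v) with dependence parameter theta >= 1.
            return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

        def joint_return_periods(F_d, F_s, theta, mu=1.0):
            # F_d, F_s : marginal non-exceedance probabilities of duration and severity
            # mu       : mean inter-arrival time of drought events (years)
            C = gumbel_copula(F_d, F_s, theta)
            T_and = mu / (1.0 - F_d - F_s + C)   # both thresholds exceeded
            T_or = mu / (1.0 - C)                # either threshold exceeded
            return T_and, T_or

        # hypothetical example: 90th-percentile duration and severity, moderate dependence
        print(joint_return_periods(F_d=0.90, F_s=0.90, theta=2.0, mu=1.5))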

  16. Key Concept Identification: A Comprehensive Analysis of Frequency and Topical Graph-Based Approaches

    Directory of Open Access Journals (Sweden)

    Muhammad Aman

    2018-05-01

    Full Text Available Automatic key concept extraction from text is the main challenging task in information extraction, information retrieval and digital libraries, ontology learning, and text analysis. The statistical frequency and topical graph-based ranking are the two kinds of potentially powerful and leading unsupervised approaches in this area, devised to address the problem. To utilize the potential of these approaches and improve key concept identification, a comprehensive performance analysis of these approaches on datasets from different domains is needed. The objective of the study presented in this paper is to perform a comprehensive empirical analysis of selected frequency and topical graph-based algorithms for key concept extraction on three different datasets, to identify the major sources of error in these approaches. For experimental analysis, we have selected TF-IDF, KP-Miner and TopicRank. Three major sources of error, i.e., frequency errors, syntactical errors and semantical errors, and the factors that contribute to these errors are identified. Analysis of the results reveals that performance of the selected approaches is significantly degraded by these errors. These findings can help us develop an intelligent solution for key concept extraction in the future.
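
    As an illustration of the frequency-based branch only (not KP-Miner or TopicRank), a short TF-IDF sketch using scikit-learn on a made-up three-document corpus; the documents and the cut-off of five candidates per document are assumptions:

        from sklearn.feature_extraction.text import TfidfVectorizer

        # hypothetical mini-corpus; the paper's experiments use full documents from three datasets
        docs = [
            "Key concept extraction supports information retrieval and digital libraries.",
            "Topical graph based ranking builds a graph of candidate phrases and ranks them.",
            "Frequency based methods such as TF-IDF score candidates by corpus statistics.",
        ]

        # score unigrams and bigrams by TF-IDF and keep the top candidates per document
        vectorizer = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
        X = vectorizer.fit_transform(docs)
        terms = vectorizer.get_feature_names_out()

        for i in range(len(docs)):
            row = X[i].toarray().ravel()
            top = row.argsort()[::-1][:5]
            print(f"doc {i}:", [terms[j] for j in top if row[j] > 0])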

  17. A Study on Regional Frequency Analysis using Artificial Neural Network - the Sumjin River Basin

    Science.gov (United States)

    Jeong, C.; Ahn, J.; Ahn, H.; Heo, J. H.

    2017-12-01

    Regional frequency analysis compensates for the main shortcoming of at-site frequency analysis, namely the lack of sample size, through the regional concept. The regional rainfall quantile depends on the identification of hydrologically homogeneous regions, hence regional classification based on the assumption of hydrological homogeneity is very important. For regional clustering of rainfall, multidimensional variables and factors related to geographical features and meteorological figures are considered, such as mean annual precipitation, number of days with precipitation in a year, and average maximum daily precipitation in a month. The Self-Organizing Feature Map (SOM) method, an artificial neural network algorithm among the unsupervised learning techniques, solves N-dimensional and nonlinear problems and presents the results simply as a data visualization technique. In this study, for the Sumjin river basin in South Korea, cluster analysis was performed based on the SOM method using high-dimensional geographical features and meteorological factors as input data. Then, to evaluate the homogeneity of the resulting regions, the L-moment-based discordancy and heterogeneity measures were used. Rainfall quantiles were estimated with the index flood method, one of the regional rainfall frequency analysis approaches. Clustering analysis using the SOM method and the consequent variation in rainfall quantiles were analyzed. This research was supported by a grant (2017-MPSS31-001) from the Supporting Technology Development Program for Disaster Management funded by the Ministry of Public Safety and Security (MPSS) of the Korean government.

  18. Application of time–frequency wavelet analysis in the reflectometry of thin films

    Energy Technology Data Exchange (ETDEWEB)

    Astaf’ev, S. B., E-mail: bard@crys.ras.ru [Russian Academy of Sciences, Shubnikov Institute of Crystallography, Federal Scientific Research Centre “Crystallography and Photonics” (Russian Federation); Shchedrin, B. M. [Moscow State University, Faculty of Computational Mathematics and Cybernetics (Russian Federation); Yanusova, L. G. [Russian Academy of Sciences, Shubnikov Institute of Crystallography, Federal Scientific Research Centre “Crystallography and Photonics” (Russian Federation)

    2017-03-15

    The application of time–frequency wavelet analysis for solving the reflectometry inverse problem is considered. It is shown that a simultaneous transform of specular intensity curve, depending on the grazing angle and spatial frequency, allows one to determine not only the thickness but also the alteration order of individual regions (layers) with characteristic behavior of electron density. This information makes it possible to reconstruct the electron density profile in the film cross section as a whole (i.e., to solve the inverse reflectometry problem). The application of the time–frequency transform is illustrated by examples of reconstructing (based on X-ray reflectivity data) the layer alternation order in models of two-layer films with inverted arrangement of layers and a four-layer film on a solid substrate.
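
    The paper applies a time–frequency (wavelet) transform; as a much simpler illustration of why spatial frequencies in the reflectivity curve encode layer thicknesses, the sketch below Fourier-transforms a toy single-layer reflectivity curve with Kiessig fringes and reads the film thickness off the dominant frequency (a toy model, not a solution of the full inverse problem and not the wavelet method itself):

        import numpy as np

        # synthetic specular reflectivity with Kiessig fringes from a single layer of thickness d (Å)
        d_true = 400.0                       # Å
        q = np.linspace(0.02, 0.5, 4096)     # momentum transfer, 1/Å
        reflectivity = q ** -4 * (1.0 + 0.5 * np.cos(q * d_true))   # toy model, not a Parratt calculation

        # remove the overall q^-4 decay so the oscillatory part dominates
        signal = reflectivity * q ** 4 - np.mean(reflectivity * q ** 4)

        # Fourier analysis along q: a film of thickness d oscillates with d/(2*pi) cycles per unit q
        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(signal.size, d=q[1] - q[0])
        d_est = 2.0 * np.pi * freqs[np.argmax(spectrum[1:]) + 1]    # skip the zero-frequency bin
        print(f"estimated thickness: {d_est:.1f} Å (true {d_true} Å)")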

  19. Frequency-resolved interferometric measurement of local density fluctuations for turbulent combustion analysis

    International Nuclear Information System (INIS)

    Köberl, S; Giuliani, F; Woisetschläger, J; Fontaneto, F

    2010-01-01

    A validation of a novel interferometric measurement technique for the frequency-resolved detection of local density fluctuation in turbulent combustion analysis was performed in this work. Two laser vibrometer systems together with a signal analyser were used to obtain frequency spectra of density fluctuations across a methane-jet flame. Since laser vibrometry is based on interferometric techniques, the derived signals are path-integrals along the measurement beam. To obtain local frequency spectra of density fluctuations, long-time-averaged measurements from each of the two systems were performed using correlation functions and cross spectra. Results were compared to data recorded by standard interferometric techniques for validation purposes. Additionally, Raman scattering and laser Doppler velocimetry were used for flame characterization
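
    A minimal sketch of the long-time-averaged cross-spectrum idea, assuming two synthetic vibrometer signals that share a 1.2 kHz fluctuation and carry independent path-integrated noise; scipy's csd estimator stands in for the signal analyser used in the experiment:

        import numpy as np
        from scipy.signal import csd

        fs = 50_000.0                          # assumed sampling rate, Hz
        t = np.arange(0, 1.0, 1.0 / fs)
        rng = np.random.default_rng(0)

        # two path-integrated signals sharing a 1.2 kHz fluctuation at the beam crossing point
        common = np.sin(2 * np.pi * 1200 * t)
        sig_a = common + rng.normal(scale=1.0, size=t.size)   # vibrometer A plus uncorrelated noise
        sig_b = common + rng.normal(scale=1.0, size=t.size)   # vibrometer B plus uncorrelated noise

        # long-time-averaged cross spectrum: uncorrelated contributions average out,
        # leaving the spectrum of the fluctuation common to both measurement beams
        f, Pab = csd(sig_a, sig_b, fs=fs, nperseg=4096)
        print(f"cross-spectrum peak at {f[np.argmax(np.abs(Pab))]:.0f} Hz")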

  20. Fatigue crack propagation of super duplex stainless steel and time-frequency analysis of acoustic emission

    International Nuclear Information System (INIS)

    Lee, Sang Kee; Nam, Ki Woo; Kang, Chang Yong; Do, Jae Yoon

    2000-01-01

    In this study, the fatigue crack propagation of super duplex stainless steel is investigated for various volume fractions of the austenite phase obtained by changing the heat treatment temperature, and the acoustic emission signals recorded during the fatigue tests are analysed by time-frequency analysis methods. As the heat treatment temperature increased, the volume fraction of austenite decreased and a coarser grain was obtained. The specimen heat treated at 1200 deg. C had a longer fatigue life and a slower rate of crack growth. Time-frequency analysis of the acoustic emission signals during the fatigue tests showed a main frequency of 200∼300 kHz, with no correlation with heat treatment or crack length, while a component around 500 kHz was attributed to dimple formation and separation of inclusions

  1. Evaluation of PCR and high-resolution melt curve analysis for differentiation of Salmonella isolates.

    Science.gov (United States)

    Saeidabadi, Mohammad Sadegh; Nili, Hassan; Dadras, Habibollah; Sharifiyazdi, Hassan; Connolly, Joanne; Valcanis, Mary; Raidal, Shane; Ghorashi, Seyed Ali

    2017-06-01

    Consumption of poultry products contaminated with Salmonella is one of the major causes of foodborne diseases worldwide and therefore detection and differentiation of Salmonella spp. in poultry is important. In this study, oligonucleotide primers were designed from hemD gene and a PCR followed by high-resolution melt (HRM) curve analysis was developed for rapid differentiation of Salmonella isolates. Amplicons of 228 bp were generated from 16 different Salmonella reference strains and from 65 clinical field isolates mainly from poultry farms. HRM curve analysis of the amplicons differentiated Salmonella isolates and analysis of the nucleotide sequence of the amplicons from selected isolates revealed that each melting curve profile was related to a unique DNA sequence. The relationship between reference strains and tested specimens was also evaluated using a mathematical model without visual interpretation of HRM curves. In addition, the potential of the PCR-HRM curve analysis was evaluated for genotyping of additional Salmonella isolates from different avian species. The findings indicate that PCR followed by HRM curve analysis provides a rapid and robust technique for genotyping of Salmonella isolates to determine the serovar/serotype.
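
    A small sketch of the melt-curve processing that underlies HRM genotyping: normalize the fluorescence curve and locate the peak of -dF/dT as the melting temperature. The sigmoid test curve and the Tm value are hypothetical; real differentiation compares whole normalized curve shapes, not just Tm:

        import numpy as np

        def melt_curve_profile(temperature, fluorescence):
            # Normalize an HRM fluorescence curve and return its negative derivative (-dF/dT);
            # the melting temperature (Tm) is taken as the temperature of the derivative peak.
            F = np.asarray(fluorescence, dtype=float)
            F = (F - F.min()) / (F.max() - F.min())        # 0-1 normalization
            dFdT = np.gradient(F, temperature)
            return -dFdT, temperature[np.argmax(-dFdT)]

        # hypothetical melt curve: sigmoid decay of fluorescence around Tm = 84.2 C
        T = np.linspace(75, 95, 401)
        tm_true = 84.2
        fluo = 1.0 / (1.0 + np.exp((T - tm_true) / 0.4))
        profile, tm_est = melt_curve_profile(T, fluo)
        print(f"estimated Tm: {tm_est:.2f} C")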

  2. Study on Frequency content in seismic hazard analysis in West Azarbayjan and East Azarbayjan provinces (Iran)

    Science.gov (United States)

    Behzadafshar, K.; Abbaszadeh Shahri, A.; Isfandiari, K.

    2012-12-01

    ABSTRACT: The Iranian plate is prone to earthquakes, as certified by the occurrence of destructive earthquakes approximately every 5 years. Because of the great earthquakes that have occurred and the large number of potential seismic sources (active faults), some of which are responsible for great earthquakes, the north-west of Iran, located at the junction of the Alborz and Zagros seismotectonic provinces (Mirzaii et al., 1998), is an interesting area for seismologists. Considering the population and the existence of large cities such as Tabriz, Ardabil and Orumiyeh, which play a crucial role in the industry and economy of Iran, the authors decided to focus on seismic hazard assessment in these two provinces, to obtain ground accelerations for different frequency contents and to indicate the critical frequencies in the studied area. It is important to note that, although many studies have been carried out in the north-west of Iran, building-code modifications also need frequency-content analysis to assess the seismic hazard more precisely, which has been done in the present study. Furthermore, previous studies applied freely downloadable software developed before 2000, whereas the most important advantage of this study is the application of professional industrial software written in 2009 and provided by the authors. This software addresses the weak points of earlier packages, such as gridding of potential sources, attention to the seismogenic zone, and direct application of attenuation relationships. The resulting hazard maps illustrate that maximum accelerations will be experienced in the north-west to south-east direction, increasing as the frequency content is reduced from 100 Hz to 10 Hz and then decreasing with further frequency reduction (to 0.25 Hz). The maximum acceleration will occur in the basement at a 10 Hz frequency content. Keywords: hazard map, Frequency content, seismogenic zone, Iran

  3. Prospects of Frequency-Time Correlation Analysis for Detecting Pipeline Leaks by Acoustic Emission Method

    International Nuclear Information System (INIS)

    Faerman, V A; Cheremnov, A G; Avramchuk, V V; Luneva, E E

    2014-01-01

    In the current work the relevance of developing nondestructive test methods for pipeline leak detection is considered. Acoustic emission testing is currently one of the most widespread leak detection methods; its main disadvantage is that it cannot be applied to monitoring long pipeline sections, which in turn complicates and slows down the inspection of the line pipe sections of main pipelines. The prospects of developing alternative techniques based on the spectral analysis of signals were considered and their possible application to leak detection on the basis of the correlation method was outlined. As an alternative, the calculation of a time-frequency correlation function is proposed. This function represents the correlation between the spectral components of the analyzed signals. In this work, the technique for calculating the time-frequency correlation function is described. Experimental data demonstrating the clear advantage of the time-frequency correlation function over the simple correlation function are presented: the time-frequency correlation function is more effective in suppressing noise components within the frequency range of the useful signal, which makes the maximum of the function more pronounced. The main drawback of applying time-frequency correlation analysis to leak detection problems is the large number of calculations, which may further increase pipeline inspection time. However, this drawback can be partially reduced by the development and implementation of efficient (including parallel) algorithms for computing the fast Fourier transform on the central processing unit and the graphics processing unit
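
    The record proposes a time-frequency correlation function; as a loose, simplified illustration of the underlying idea of correlating spectral components rather than raw waveforms, the sketch below estimates the leak-induced time delay between two sensors by cross-correlating only the frequency band assumed to carry the leak noise (sampling rate, band limits and delay are all hypothetical):

        import numpy as np

        fs = 10_000.0
        n = 2 ** 15
        rng = np.random.default_rng(1)

        # simulated leak noise arriving at two sensors 12 ms apart, buried in broadband noise
        leak = rng.normal(size=n)
        delay = int(0.012 * fs)
        a = leak + rng.normal(size=n)
        b = np.roll(leak, delay) + rng.normal(size=n)

        # cross-correlation computed through the frequency domain so that spectral
        # components outside the expected leak-noise band can be suppressed
        A, B = np.fft.rfft(a), np.fft.rfft(b)
        freqs = np.fft.rfftfreq(n, 1.0 / fs)
        band = (freqs > 500) & (freqs < 3000)         # assumed useful band of the leak signal
        cross = np.zeros_like(A)
        cross[band] = B[band] * np.conj(A[band])
        corr = np.fft.irfft(cross)
        lag = np.argmax(corr)
        print(f"estimated delay: {lag / fs * 1e3:.1f} ms (true 12.0 ms)")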

  4. High-resolution three-dimensional imaging and analysis of rock falls in Yosemite valley, California

    Science.gov (United States)

    Stock, Gregory M.; Bawden, G.W.; Green, J.K.; Hanson, E.; Downing, G.; Collins, B.D.; Bond, S.; Leslar, M.

    2011-01-01

    We present quantitative analyses of recent large rock falls in Yosemite Valley, California, using integrated high-resolution imaging techniques. Rock falls commonly occur from the glacially sculpted granitic walls of Yosemite Valley, modifying this iconic landscape but also posing significant potential hazards and risks. Two large rock falls occurred from the cliff beneath Glacier Point in eastern Yosemite Valley on 7 and 8 October 2008, causing minor injuries and damaging structures in a developed area. We used a combination of gigapixel photography, airborne laser scanning (ALS) data, and ground-based terrestrial laser scanning (TLS) data to characterize the rock-fall detachment surface and adjacent cliff area, quantify the rock-fall volume, evaluate the geologic structure that contributed to failure, and assess the likely failure mode. We merged the ALS and TLS data to resolve the complex, vertical to overhanging topography of the Glacier Point area in three dimensions, and integrated these data with gigapixel photographs to fully image the cliff face in high resolution. Three-dimensional analysis of repeat TLS data reveals that the cumulative failure consisted of a near-planar rock slab with a maximum length of 69.0 m, a mean thickness of 2.1 m, a detachment surface area of 2750 m², and a volume of 5663 ± 36 m³. Failure occurred along a surface-parallel, vertically oriented sheeting joint in a clear example of granitic exfoliation. Stress concentration at crack tips likely propagated fractures through the partially attached slab, leading to failure. Our results demonstrate the utility of high-resolution imaging techniques for quantifying far-range (>1 km) rock falls occurring from the largely inaccessible, vertical rock faces of Yosemite Valley, and for providing highly accurate and precise data needed for rock-fall hazard assessment. © 2011 Geological Society of America.

  5. Auto-identification of engine fault acoustic signal through inverse trigonometric instantaneous frequency analysis

    Directory of Open Access Journals (Sweden)

    Dayong Ning

    2016-03-01

    Full Text Available The acoustic signals of internal combustion engines contain valuable information about the condition of the engines and can be used to detect incipient faults. However, these signals are complex, being composed of a faulty component and background noise, so the characteristics of engine conditions are difficult to extract through wavelet transformation and acoustic emission techniques. In this study, an instantaneous frequency analysis method is proposed. A new time–frequency model is constructed using a fixed-amplitude, variable-cycle sine function fitted gradually to adjacent points of the time-domain signal, so that the instantaneous frequency takes a single value at any time. The study also introduces an instantaneous frequency calculation based on an inverse trigonometric fitting method. The mean value of all local maxima is then used to identify the engine condition automatically. Results reveal that the mean of the local maxima under faulty conditions differs from the normal mean. An experimental case is also presented to illustrate the applicability of the proposed method. Using the proposed time–frequency model, we can identify the engine condition and detect the abnormal sound produced by faulty engines.
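
    One simple way to realize an inverse-trigonometric instantaneous-frequency estimate (not necessarily the authors' exact fitting scheme) uses the three-point identity x[n-1] + x[n+1] = 2*cos(w)*x[n], which holds for a locally sinusoidal signal, so that w follows from an arccos at every sample:

        import numpy as np

        def instantaneous_frequency(x, fs):
            # Three-point inverse-trigonometric estimate of instantaneous frequency:
            # for x[n] = A*sin(w*n + phi), x[n-1] + x[n+1] = 2*cos(w)*x[n].
            x = np.asarray(x, dtype=float)
            ratio = (x[:-2] + x[2:]) / (2.0 * x[1:-1])
            ratio = np.clip(ratio, -1.0, 1.0)            # guard against noise pushing |ratio| > 1
            return np.arccos(ratio) * fs / (2.0 * np.pi)

        fs = 20_000.0
        t = np.arange(0, 0.05, 1.0 / fs)
        x = np.sin(2 * np.pi * 850.0 * t + 0.3)          # hypothetical 850 Hz acoustic component
        f_inst = instantaneous_frequency(x, fs)
        print(f"median instantaneous frequency: {np.median(f_inst):.1f} Hz")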

  6. Warped frequency transform analysis of ultrasonic guided waves in long bones

    Science.gov (United States)

    De Marchi, L.; Baravelli, E.; Xu, K.; Ta, D.; Speciale, N.; Marzani, A.; Viola, E.

    2010-03-01

    Long bones can be seen as irregular hollow tubes, in which, for a given excitation frequency, many ultrasonic Guided Waves (GWs) can propagate. The analysis of GWs has the potential to reflect more information on both the geometry and the material properties of the bone than any other method (such as dual-energy X-ray absorptiometry, or quantitative computed tomography), and can be used in the assessment of osteoporosis and in the evaluation of fracture healing. In this study, time-frequency representations (TFRs) were used to gain insights into the expected behavior of GWs in bones. To this aim, we implemented a dedicated Warped Frequency Transform (WFT) which decomposes the spectrotemporal components of the different propagating modes by selecting an appropriate warping map to reshape the frequency axis. The map can be designed once the GWs group velocity dispersion curves can be predicted. To this purpose, the bone is considered as a hollow cylinder with inner and outer diameters of 16.6 and 24.7 mm, respectively, and linear poroelastic material properties in agreement with the low level of stresses induced by the waves. Time-transient events obtained experimentally, via a piezoelectric ultrasonic set-up applied to bovine tibiae, are analyzed. The results show that the WFT limits the interference patterns which appear with other TFRs (such as scalograms or warpograms) and produces a sparse representation suitable for characterization purposes. In particular, the mode-frequency combinations propagating with minimal losses are identified.

  7. Analysis of muscle fatigue conditions using time-frequency images and GLCM features

    Directory of Open Access Journals (Sweden)

    Karthick P.A.

    2016-09-01

    Full Text Available In this work, an attempt has been made to differentiate muscle non-fatigue and fatigue conditions using sEMG signals and texture representation of the time-frequency images. The sEMG signals are recorded from the biceps brachii muscle of 25 healthy adult volunteers during dynamic fatiguing contraction. The first and last curls of these signals are considered as the non-fatigue and fatigue zones, respectively. These signals are preprocessed and the time-frequency spectrum is computed using the short-time Fourier transform (STFT). The Gray-Level Co-occurrence Matrix (GLCM) is extracted from the low (15–45 Hz), medium (46–95 Hz) and high (96–150 Hz) frequency bands of the time-frequency images. Further, features such as contrast, correlation, energy and homogeneity are calculated from the resultant matrices. The results show that the high frequency band based features are able to differentiate non-fatigue and fatigue conditions. The features such as correlation, contrast and homogeneity extracted at angles 0°, 45°, 90°, and 135° are found to be distinct with high statistical significance (p < 0.0001). Hence, this framework can be used for analysis of neuromuscular disorders.
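
    A hedged sketch of the feature pipeline: STFT of a stand-in signal, selection of the high-frequency band, quantization of the band image, and GLCM texture features at the four angles; the sampling rate, window length and test signal are assumptions, and scikit-image's graycomatrix/graycoprops are used for the texture step:

        import numpy as np
        from scipy.signal import stft
        from skimage.feature import graycomatrix, graycoprops

        fs = 1000.0                                      # assumed sEMG sampling rate, Hz
        t = np.arange(0, 2.0, 1.0 / fs)
        rng = np.random.default_rng(0)
        emg = rng.normal(size=t.size) * np.hanning(t.size)   # stand-in for a preprocessed sEMG burst

        # time-frequency image of the signal
        f, tau, Z = stft(emg, fs=fs, nperseg=128, noverlap=96)
        band = (f >= 96) & (f <= 150)                    # high-frequency band used in the study
        image = np.abs(Z[band])

        # quantize the band image to 8-bit gray levels and compute GLCM texture features
        gray = np.uint8(255 * (image - image.min()) / (image.max() - image.min() + 1e-12))
        glcm = graycomatrix(gray, distances=[1],
                            angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                            levels=256, symmetric=True, normed=True)
        for prop in ("contrast", "correlation", "energy", "homogeneity"):
            print(prop, graycoprops(glcm, prop).ravel())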

  8. Dimensional analysis and prediction of dielectrophoretic crossover frequency of spherical particles

    Directory of Open Access Journals (Sweden)

    Che-Kai Yeh

    2017-06-01

    Full Text Available The manipulation of biological cells and micrometer-scale particles using dielectrophoresis (DEP) is an indispensable technique for lab-on-a-chip systems for many biological and colloidal science applications. However, existing models, including the dipole model and numerical simulations based on the Maxwell stress tensor (MST), cannot achieve high accuracy and high computation efficiency at the same time. The dipole model is widely used and provides adequate predictions on the crossover frequency of submicron particles, but cannot predict the crossover frequency for larger particles accurately; on the other hand, the MST method offers high accuracy for a wide variety of particle sizes and shapes, but is time-consuming and may lack predictive understanding of the interplay between key parameters. Here we present a mathematical model, using dimensional analysis and the Buckingham pi theorem, that permits high accuracy and efficiency in predicting the crossover frequency of spherical particles. The curve fitting and calculation are performed using the commercial packages OriginLab and MATLAB, respectively. In addition, through this model we also can predict the conditions in which no crossover frequency exists. Also, we propose a pair of dimensionless parameters, forming a functional relation, that provide physical insights into the dependency of the crossover frequency on five key parameters. The model is verified under several scenarios using comprehensive MST simulations by COMSOL Multiphysics software (COMSOL, Inc.) and some published experimental data.
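
    The dimensional-analysis model itself is not reproduced here; as a baseline, the sketch below computes the classical dipole-model crossover frequency from the zero of the real part of the Clausius-Mossotti factor, returning None when no crossover exists. The particle and medium properties are illustrative values only:

        import numpy as np

        EPS0 = 8.854e-12   # vacuum permittivity, F/m

        def crossover_frequency(eps_p, sig_p, eps_m, sig_m):
            # Dipole-model DEP crossover: Re[K(w)] = 0 gives
            # w^2 = -(sig_p - sig_m)(sig_p + 2*sig_m) / (eps0^2 (eps_p - eps_m)(eps_p + 2*eps_m)).
            num = -(sig_p - sig_m) * (sig_p + 2.0 * sig_m)
            den = (eps_p - eps_m) * (eps_p + 2.0 * eps_m) * EPS0 ** 2
            if den == 0 or num / den <= 0:
                return None                      # Re[K] does not change sign: no crossover
            return np.sqrt(num / den) / (2.0 * np.pi)

        # hypothetical polystyrene bead in a low-conductivity buffer
        # (surface conduction folded into an effective particle conductivity sig_p)
        f_x = crossover_frequency(eps_p=2.55, sig_p=2.0e-3, eps_m=78.0, sig_m=1.0e-4)
        print(f"crossover frequency = {f_x / 1e3:.0f} kHz" if f_x else "no crossover")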

  9. [The value of spectral frequency analysis by Doppler examination (author's transl)].

    Science.gov (United States)

    Boccalon, H; Reggi, M; Lozes, A; Canal, C; Jausseran, J M; Courbier, R; Puel, P; Enjalbert, A

    1981-01-01

    Arterial stenoses of moderate extent may involve modifications of the blood flow. The arterial segment is not always examined at the best incident angle to assess the extent of the stenosis. Spectral frequency analysis by Doppler examination is a good means of evaluating the effect of moderate arterial lesions. The present study was carried out with a Doppler device whose acoustic spectrum is displayed as a histogram of 16 frequency bands. The values were recorded on the two femoral arteries. A study was also made of 49 normal subjects so as to establish a normal envelope histogram, taking into account the following parameters: maximum peak (800 Hz), low cut-off frequency (420 Hz), high cut-off frequency (2,600 Hz); the first peak was found to be present in 81% of the subjects (at 375 Hz) and the second peak in 75% of the subjects (2,020 Hz). Thirteen patients with iliac lesions of different extent were included in the study; details of these lesions were established in all cases by aortography. None of the recorded frequency histograms were located within the normal envelope. Two cases of moderate iliac stenoses were noted (less than 50% of the diameter) which interfered with the histogram, even though the femoral velocity signal was normal.

  10. The frequency of family meals and nutritional health in children: a meta-analysis.

    Science.gov (United States)

    Dallacker, M; Hertwig, R; Mata, J

    2018-05-01

    Findings on the relationship between family meal frequency and children's nutritional health are inconsistent. The reasons for these mixed results have to date remained largely unexplored. This systematic review and meta-analysis of 57 studies (203,706 participants) examines (i) the relationship between family meal frequency and various nutritional health outcomes and (ii) two potential explanations for the inconsistent findings: sociodemographic characteristics and mealtime characteristics. Separate meta-analyses revealed significant associations between higher family meal frequency and better overall diet quality (r = 0.13), more healthy diet (r = 0.10), less unhealthy diet (r = -0.04) and lower body mass index, BMI (r = -0.05). Child's age, country, number of family members present at meals and meal type (i.e. breakfast, lunch or dinner) did not moderate the relationship of meal frequency with healthy diet, unhealthy diet or BMI. Socioeconomic status only moderated the relationship with BMI. The findings show a significant relationship between frequent family meals and better nutritional health - in younger and older children, across countries and socioeconomic groups, and for meals taken with the whole family vs. one parent. Building on these findings, research can now target the causal direction of the relationship between family meal frequency and nutritional health. © 2018 World Obesity Federation.

  11. Viscoelastic characterization of compacted pharmaceutical excipient materials by analysis of frequency-dependent mechanical relaxation processes

    Science.gov (United States)

    Welch, K.; Mousavi, S.; Lundberg, B.; Strømme, M.

    2005-09-01

    A newly developed method for determining the frequency-dependent complex Young's modulus was employed to analyze the mechanical response of compacted microcrystalline cellulose, sorbitol, ethyl cellulose and starch for frequencies up to 20 kHz. A Debye-like relaxation was observed in all the studied pharmaceutical excipient materials and a comparison with corresponding dielectric spectroscopy data was made. The location in frequency of the relaxation peak was shown to correlate to the measured tensile strength of the tablets, and the relaxation was interpreted as the vibrational response of the interparticle hydrogen and van der Waals bindings in the tablets. Further, the measured relaxation strength, holding information about the energy loss involved in the relaxation processes, showed that the weakest material in terms of tensile strength, starch, is the material among the four tested ones that is able to absorb the most energy within its structure when exposed to external perturbations inducing vibrations in the studied frequency range. The results indicate that mechanical relaxation analysis performed over relatively broad frequency ranges should be useful for predicting material properties of importance for the functionality of a material in applications such as, e.g., drug delivery, drug storage and handling, and also for clarifying the origin of hitherto unexplained molecular processes.
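
    A minimal sketch of how the relaxation strength and peak frequency might be extracted by fitting a single Debye loss term to loss-modulus data; the synthetic data, the 3 kHz peak position and the modulus values are hypothetical and only illustrate the fitting step:

        import numpy as np
        from scipy.optimize import curve_fit

        def debye_loss(f, dE, tau):
            # Imaginary (loss) part of a single Debye relaxation: dE * w*tau / (1 + (w*tau)^2).
            w = 2.0 * np.pi * f
            return dE * w * tau / (1.0 + (w * tau) ** 2)

        # hypothetical loss-modulus data for a compacted excipient, peak near 3 kHz
        rng = np.random.default_rng(2)
        f = np.logspace(1, 4.3, 60)                      # 10 Hz .. 20 kHz
        tau_true, dE_true = 1.0 / (2 * np.pi * 3000.0), 0.8e9
        E_loss = debye_loss(f, dE_true, tau_true) * (1 + 0.03 * rng.normal(size=f.size))

        (dE_fit, tau_fit), _ = curve_fit(debye_loss, f, E_loss, p0=(1e9, 1e-4))
        print(f"relaxation peak near {1 / (2 * np.pi * tau_fit):.0f} Hz, "
              f"strength about {dE_fit / 1e9:.2f} GPa")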

  12. Qualitative and quantitative analysis of complex temperature-programmed desorption data by multivariate curve resolution

    Science.gov (United States)

    Rodríguez-Reyes, Juan Carlos F.; Teplyakov, Andrew V.; Brown, Steven D.

    2010-10-01

    The substantial amount of information carried in temperature-programmed desorption (TPD) experiments is often difficult to mine due to the occurrence of competing reaction pathways that produce compounds with similar mass spectrometric features. Multivariate curve resolution (MCR) is introduced as a tool capable of overcoming this problem by mathematically detecting spectral variations and correlations between several m/z traces, which is later translated into the extraction of the cracking pattern and the desorption profile for each desorbate. Different from the elegant (though complex) methods currently available to analyze TPD data, MCR analysis is applicable even when no information regarding the specific surface reaction/desorption process or the nature of the desorbing species is available. However, when available, any information can be used as constraints that guide the outcome, increasing the accuracy of the resolution. This approach is especially valuable when the compounds desorbing are different from what would be expected based on a chemical intuition, when the cracking pattern of the model test compound is difficult or impossible to obtain (because it could be unstable or very rare), and when knowing major components desorbing from the surface could in more traditional methods actually bias the quantification of minor components. The enhanced level of understanding of thermal processes achieved through MCR analysis is demonstrated by analyzing three phenomena: i) the cryogenic desorption of vinyltrimethylsilane from silicon, an introductory system where the known multilayer and monolayer components are resolved; ii) acrolein hydrogenation on a bimetallic Pt-Ni-Pt catalyst, where a rapid identification of hydrogenated products as well as other desorbing species is achieved, and iii) the thermal reaction of Ti[N(CH3)2]4 on Si(100), where the products of surface decomposition are identified and an estimation of the surface composition after the
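
    As a stand-in for MCR (which in practice is usually solved with alternating least squares under chemically motivated constraints), the sketch below uses plain non-negative matrix factorization to split a synthetic TPD data matrix of m/z traces into desorption profiles and cracking patterns; the temperatures, profiles and patterns are invented:

        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(3)

        # synthetic TPD data: rows = temperature points, columns = m/z channels
        T = np.linspace(300, 800, 200)
        profile_1 = np.exp(-0.5 * ((T - 450) / 30) ** 2)     # desorption of species 1
        profile_2 = np.exp(-0.5 * ((T - 600) / 40) ** 2)     # desorption of species 2
        spec_1 = np.array([0.1, 0.8, 0.3, 0.0, 0.05])        # cracking pattern of species 1
        spec_2 = np.array([0.4, 0.1, 0.0, 0.7, 0.2])         # cracking pattern of species 2
        X = np.outer(profile_1, spec_1) + np.outer(profile_2, spec_2)
        X += 0.01 * rng.random(X.shape)                      # small positive noise

        # non-negative factorization: X ~ W @ H, W = desorption profiles, H = cracking patterns
        model = NMF(n_components=2, init="nndsvda", max_iter=2000)
        W = model.fit_transform(X)
        H = model.components_
        print("recovered peak temperatures:", T[np.argmax(W, axis=0)])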

  13. Feasibility of high-resolution quantitative perfusion analysis in patients with heart failure.

    Science.gov (United States)

    Sammut, Eva; Zarinabad, Niloufar; Wesolowski, Roman; Morton, Geraint; Chen, Zhong; Sohal, Manav; Carr-White, Gerry; Razavi, Reza; Chiribiri, Amedeo

    2015-02-12

    Cardiac magnetic resonance (CMR) is playing an expanding role in the assessment of patients with heart failure (HF). The assessment of myocardial perfusion status in HF can be challenging due to left ventricular (LV) remodelling and wall thinning, coexistent scar and respiratory artefacts. The aim of this study was to assess the feasibility of quantitative CMR myocardial perfusion analysis in patients with HF. A group of 58 patients with heart failure (HF; left ventricular ejection fraction, LVEF ≤ 50%) and 33 patients with normal LVEF (LVEF >50%), referred for suspected coronary artery disease, were studied. All subjects underwent quantitative first-pass stress perfusion imaging using adenosine according to standard acquisition protocols. The feasibility of quantitative perfusion analysis was then assessed using high-resolution, 3 T kt perfusion and voxel-wise Fermi deconvolution. 30/58 (52%) subjects in the HF group had underlying ischaemic aetiology. Perfusion abnormalities were seen amongst patients with ischaemic HF and patients with normal LV function. No regional perfusion defect was observed in the non-ischaemic HF group. Good agreement was found between visual and quantitative analysis across all groups. Absolute stress perfusion rate, myocardial perfusion reserve (MPR) and endocardial-epicardial MPR ratio identified areas with abnormal perfusion in the ischaemic HF group (p = 0.02; p = 0.04; p = 0.02, respectively). In the Normal LV group, MPR and endocardial-epicardial MPR ratio were able to distinguish between normal and abnormal segments (p = 0.04; p = 0.02 respectively). No significant differences of absolute stress perfusion rate or MPR were observed comparing visually normal segments amongst groups. Our results demonstrate the feasibility of high-resolution voxel-wise perfusion assessment in patients with HF.

  14. Feasibility analysis of high resolution tissue image registration using 3-D synthetic data

    Directory of Open Access Journals (Sweden)

    Yachna Sharma

    2011-01-01

    Full Text Available Background: Registration of high-resolution tissue images is a critical step in the 3D analysis of protein expression. Because the distance between images (~4-5 μm, the thickness of a tissue section) is nearly the size of the objects of interest (~10-20 μm, a cancer cell nucleus), a given object is often not present in both of two adjacent images. Without consistent correspondence of objects between images, registration becomes a difficult task. This work assesses the feasibility of current registration techniques for such images. Methods: We generated high resolution synthetic 3-D image data sets emulating the constraints in real data. We applied multiple registration methods to the synthetic image data sets and assessed the registration performance of three techniques (i.e., mutual information (MI), kernel density estimate (KDE) method [1], and principal component analysis (PCA)) at various slice thicknesses (with increments of 1 μm) in order to quantify the limitations of each method. Results: Our analysis shows that PCA, when combined with the KDE method based on nuclei centers, aligns images corresponding to 5 μm thick sections with acceptable accuracy. We also note that registration error increases rapidly with increasing distance between images, and that the choice of feature points which are conserved between slices improves performance. Conclusions: We used simulation to help select appropriate features and methods for image registration by estimating best-case-scenario errors for given data constraints in histological images. The results of this study suggest that much of the difficulty of stained tissue registration can be reduced to the problem of accurately identifying feature points, such as the center of nuclei.
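
    A small sketch of the mutual-information score that the MI technique maximizes during registration, computed from a joint intensity histogram; the toy images are random and only illustrate that a degraded copy of an image scores far higher than an unrelated one:

        import numpy as np

        def mutual_information(img_a, img_b, bins=64):
            # Mutual information between two images from their joint intensity histogram.
            hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
            pxy = hist / hist.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

        # toy check: an image shares far more information with a noisy copy of itself
        # than with an unrelated image; registration searches for the transform
        # that maximizes this score between the fixed and the moving image
        rng = np.random.default_rng(4)
        img = rng.random((128, 128))
        noisy_copy = np.clip(img + 0.1 * rng.normal(size=img.shape), 0, 1)
        unrelated = rng.random((128, 128))
        print(mutual_information(img, noisy_copy), mutual_information(img, unrelated))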

  15. Flood frequency analysis of historical flood data under stationary and non-stationary modelling

    Science.gov (United States)

    Machado, M. J.; Botero, B. A.; López, J.; Francés, F.; Díez-Herrero, A.; Benito, G.

    2015-06-01

    Historical records are an important source of information on extreme and rare floods and fundamental to establish a reliable flood return frequency. The use of long historical records for flood frequency analysis brings in the question of flood stationarity, since climatic and land-use conditions can affect the relevance of past flooding as a predictor of future flooding. In this paper, a detailed 400 yr flood record from the Tagus River in Aranjuez (central Spain) was analysed under stationary and non-stationary flood frequency approaches, to assess their contribution within hazard studies. Historical flood records in Aranjuez were obtained from documents (Proceedings of the City Council, diaries, chronicles, memoirs, etc.), epigraphic marks, and indirect historical sources and reports. The water levels associated with different floods (derived from descriptions or epigraphic marks) were computed into discharge values using a one-dimensional hydraulic model. Secular variations in flood magnitude and frequency, found to respond to climate and environmental drivers, showed a good correlation between high values of historical flood discharges and a negative mode of the North Atlantic Oscillation (NAO) index. Over the systematic gauge record (1913-2008), an abrupt change on flood magnitude was produced in 1957 due to constructions of three major reservoirs in the Tagus headwaters (Bolarque, Entrepeñas and Buendia) controlling 80% of the watershed surface draining to Aranjuez. Two different models were used for the flood frequency analysis: (a) a stationary model estimating statistical distributions incorporating imprecise and categorical data based on maximum likelihood estimators, and (b) a time-varying model based on "generalized additive models for location, scale and shape" (GAMLSS) modelling, which incorporates external covariates related to climate variability (NAO index) and catchment hydrology factors (in this paper a reservoir index; RI). Flood frequency
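
    A hedged sketch of the stationary building block only (scipy's GEV fit on a synthetic annual-maximum series); it ignores the historical, categorical and non-systematic information that the paper's maximum-likelihood framework incorporates:

        import numpy as np
        from scipy import stats

        # hypothetical annual-maximum discharge series (m^3/s); a real analysis would also
        # fold in the historical (non-systematic) floods via censored-data likelihoods
        annual_max = stats.genextreme.rvs(c=-0.1, loc=300.0, scale=120.0, size=96, random_state=5)

        # stationary GEV fit by maximum likelihood and return-level estimation
        c, loc, scale = stats.genextreme.fit(annual_max)
        for T in (10, 50, 100, 500):
            q = stats.genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)
            print(f"{T:>4d}-yr return level = {q:7.1f} m^3/s")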

  16. Analysis of core damage frequency from internal events: Peach Bottom, Unit 2

    International Nuclear Information System (INIS)

    Kolaczkowski, A.M.; Lambright, J.A.; Ferrell, W.L.; Cathey, N.G.; Najafi, B.; Harper, F.T.

    1986-10-01

    This document contains the internal event initiated accident sequence analyses for Peach Bottom, Unit 2; one of the reference plants being examined as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 will document the risk of a selected group of nuclear power plants. As part of that work, this report contains the overall core damage frequency estimate for Peach Bottom, Unit 2, and the accompanying plant damage state frequencies. Sensitivity and uncertainty analyses provided additional insights regarding the dominant contributors to the Peach Bottom core damage frequency estimate. The mean core damage frequency at Peach Bottom was calculated to be 8.2E-6. Station blackout type accidents (loss of all ac power) were found to dominate the overall results. Anticipated Transient Without Scram accidents were also found to be non-negligible contributors. The numerical results are largely driven by common mode failure probability estimates and to some extent, human error. Because of significant data and analysis uncertainties in these two areas (important, for instance, to the most dominant scenario in this study), it is recommended that the results of the uncertainty and sensitivity analyses be considered before any actions are taken based on this analysis

  17. IP Controller Design for Uncertain Two-Mass Torsional System Using Time-Frequency Analysis

    Directory of Open Access Journals (Sweden)

    Jing Cui

    2018-01-01

    Full Text Available With the development of industrial production, drive systems are required to handle larger motor and load-machine inertias, whereas shafts should remain lightweight. In this situation, mechanical vibrations are excited on the load side, which is harmful to industrial production when the motor operates. Because of the complexity of the flexible shaft, it is often difficult to calculate its stiffness coefficient. Furthermore, only the velocity of the driving side can be measured, whereas the driving torque, the load torque, and the velocity of the load side are immeasurable. Therefore, it is inconvenient to design a controller for the uncertain system. In this paper, a low-order IP controller is designed for an uncertain two-mass torsional system based on the polynomial method and time-frequency analysis (TFA). The IP controller parameters are calculated from the inertias of the driving side and the load side as well as the resonant frequency, using the polynomial method. The resonant frequency is identified using the TFA of the velocity step response of the driving side in the open-loop state, which not only avoids the harmful persistent start-stop excitation signal of the traditional method, but also achieves high recognition accuracy when the weak vibration signal is submerged in noise. The effectiveness of the designed IP controller is verified by groups of experiments. Experimental results show that good vibration-suppression performance is obtained for the uncertain two-mass torsional system under medium-low shaft stiffness conditions.

  18. Marvel Analysis of the Measured High-resolution Rovibronic Spectra of TiO

    Science.gov (United States)

    McKemmish, Laura K.; Masseron, Thomas; Sheppard, Samuel; Sandeman, Elizabeth; Schofield, Zak; Furtenbacher, Tibor; Császár, Attila G.; Tennyson, Jonathan; Sousa-Silva, Clara

    2017-02-01

    Accurate, experimental rovibronic energy levels, with associated labels and uncertainties, are reported for 11 low-lying electronic states of the diatomic ^48Ti^16O molecule, determined using the Marvel (Measured Active Rotational-Vibrational Energy Levels) algorithm. All levels are based on lines corresponding to critically reviewed and validated high-resolution experimental spectra taken from 24 literature sources. The transition data are in the 2-22,160 cm^-1 region. Out of the 49,679 measured transitions, 43,885 are triplet-triplet, 5710 are singlet-singlet, and 84 are triplet-singlet transitions. A careful analysis of the resulting experimental spectroscopic network (SN) allows 48,590 transitions to be validated. The transitions determine 93 vibrational band origins of ^48Ti^16O, including 71 triplet and 22 singlet ones. There are 276 (73) triplet-triplet (singlet-singlet) band-heads derived from Marvel experimental energies, 123 (38) of which have never been assigned in low- or high-resolution experiments. The highest J value, where J stands for the total angular momentum, for which an energy level is validated is 163. The number of experimentally derived triplet and singlet ^48Ti^16O rovibrational energy levels is 8682 and 1882, respectively. The lists of validated lines and levels for ^48Ti^16O are deposited in the supporting information to this paper.

  19. Automated Microfluidic Platform for Serial Polymerase Chain Reaction and High-Resolution Melting Analysis.

    Science.gov (United States)

    Cao, Weidong; Bean, Brian; Corey, Scott; Coursey, Johnathan S; Hasson, Kenton C; Inoue, Hiroshi; Isano, Taisuke; Kanderian, Sami; Lane, Ben; Liang, Hongye; Murphy, Brian; Owen, Greg; Shinoda, Nobuhiko; Zeng, Shulin; Knight, Ivor T

    2016-06-01

    We report the development of an automated genetic analyzer for human sample testing based on microfluidic rapid polymerase chain reaction (PCR) with high-resolution melting analysis (HRMA). The integrated DNA microfluidic cartridge was used on a platform designed with a robotic pipettor system that works by sequentially picking up different test solutions from a 384-well plate, mixing them in the tips, and delivering mixed fluids to the DNA cartridge. A novel image feedback flow control system based on a Canon 5D Mark II digital camera was developed for controlling fluid movement through a complex microfluidic branching network without the use of valves. The same camera was used for measuring the high-resolution melt curve of DNA amplicons that were generated in the microfluidic chip. Owing to fast heating and cooling as well as sensitive temperature measurement in the microfluidic channels, the time frame for PCR and HRMA was dramatically reduced from hours to minutes. Preliminary testing results demonstrated that rapid serial PCR and HRMA are possible while still achieving high data quality that is suitable for human sample testing. © 2015 Society for Laboratory Automation and Screening.

  20. Non-stationary hydrologic frequency analysis using B-spline quantile regression

    Science.gov (United States)

    Nasri, B.; Bouezmarni, T.; St-Hilaire, A.; Ouarda, T. B. M. J.

    2017-11-01

    Hydrologic frequency analysis is commonly used by engineers and hydrologists to provide the basic information on planning, design and management of hydraulic and water resources systems under the assumption of stationarity. However, with increasing evidence of climate change, it is possible that the assumption of stationarity, which is a prerequisite for traditional frequency analysis, and hence the results of conventional analysis, would become questionable. In this study, we consider a framework for frequency analysis of extremes based on B-spline quantile regression, which allows data to be modelled in the presence of non-stationarity and/or linear and non-linear dependence on covariates. A Markov Chain Monte Carlo (MCMC) algorithm was used to estimate quantiles and their posterior distributions. A coefficient of determination and the Bayesian information criterion (BIC) for quantile regression are used in order to select the best model, i.e. for each quantile, we choose the degree and number of knots of the adequate B-spline quantile regression model. The method is applied to annual maximum and minimum streamflow records in Ontario, Canada. Climate indices are considered to describe the non-stationarity in the variable of interest and to estimate the quantiles in this case. The results show large differences between the non-stationary quantiles and their stationary equivalents for annual maximum and minimum discharges with high annual non-exceedance probabilities.

  1. Multiscale Thermo-Mechanical Design and Analysis of High Frequency and High Power Vacuum Electron Devices

    Science.gov (United States)

    Gamzina, Diana

    A methodology for performing thermo-mechanical design and analysis of high frequency and high average power vacuum electron devices is presented. This methodology results in a "first-pass" engineering design directly ready for manufacturing. The methodology includes establishment of thermal and mechanical boundary conditions, evaluation of convective film heat transfer coefficients, identification of material options, evaluation of temperature and stress field distributions, assessment of microscale effects on the stress state of the material, and fatigue analysis. The feature size of vacuum electron devices operating in the high frequency regime of 100 GHz to 1 THz is comparable to the microstructure of the materials employed for their fabrication. As a result, the thermo-mechanical performance of a device is affected by the local material microstructure. Such multiscale effects on the stress state are considered in the range of scales from about 10 microns up to a few millimeters. The design and analysis methodology is demonstrated on three separate microwave devices: a 95 GHz 10 kW cw sheet beam klystron, a 263 GHz 50 W long pulse wide-bandwidth sheet beam travelling wave tube, and a 346 GHz 1 W cw backward wave oscillator.

  2. Regional frequency analysis of extreme rainfalls using partial L moments method

    Science.gov (United States)

    Zakaria, Zahrahtul Amani; Shabri, Ani

    2013-07-01

    An approach based on regional frequency analysis using L moments and LH moments is revisited in this study. Subsequently, an alternative regional frequency analysis using the partial L moments (PL moments) method is employed, and a new relationship for homogeneity analysis is developed. The results were then compared with those obtained using the method of L moments and LH moments of order two. The Selangor catchment, consisting of 37 sites and located on the west coast of Peninsular Malaysia, is chosen as a case study. PL moments for the generalized extreme value (GEV), generalized logistic (GLO), and generalized Pareto distributions were derived and used to develop the regional frequency analysis procedure. The PL moment ratio diagram and the Z test were employed in determining the best-fit distribution. Comparison between the three approaches showed that the GLO and GEV distributions were identified as the suitable distributions for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation used for performance evaluation shows that the method of PL moments would outperform the L and LH moments methods for estimation of large return period events.
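
    For orientation, a sketch of ordinary sample L-moments (Hosking's probability-weighted-moment estimators) on a hypothetical annual-maximum rainfall sample; PL moments differ in that the lower part of the sample is censored before the same kind of ratios are formed, which is not implemented here:

        import numpy as np

        def sample_l_moments(x):
            # First four sample L-moments from unbiased probability-weighted moments.
            x = np.sort(np.asarray(x, dtype=float))
            n = x.size
            i = np.arange(1, n + 1)
            b0 = x.mean()
            b1 = np.sum((i - 1) / (n - 1) * x) / n
            b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
            b3 = np.sum((i - 1) * (i - 2) * (i - 3) / ((n - 1) * (n - 2) * (n - 3)) * x) / n
            l1 = b0
            l2 = 2 * b1 - b0
            l3 = 6 * b2 - 6 * b1 + b0
            l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
            return l1, l2, l3 / l2, l4 / l2          # mean, L-scale, L-skewness, L-kurtosis

        # hypothetical annual-maximum rainfall sample for one site (mm)
        rain = np.array([78, 95, 60, 132, 88, 104, 71, 150, 93, 84, 119, 66, 101, 140, 76])
        print(sample_l_moments(rain))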

  3. Aurally-adequate time-frequency analysis for scattered sound in auditoria

    Science.gov (United States)

    Norris, Molly K.; Xiang, Ning; Kleiner, Mendel

    2005-04-01

    The goal of this work was to apply an aurally-adequate time-frequency analysis technique to the analysis of sound scattering effects in auditoria. Time-frequency representations were developed as a motivated effort that takes into account binaural hearing, with a specific implementation of interaural cross-correlation process. A model of the human auditory system was implemented in the MATLAB platform based on two previous models [A. Härmä and K. Palomäki, HUTear, Espoo, Finland; and M. A. Akeroyd, A. Binaural Cross-correlogram Toolbox for MATLAB (2001), University of Sussex, Brighton]. These stages include proper frequency selectivity, the conversion of the mechanical motion of the basilar membrane to neural impulses, and binaural hearing effects. The model was then used in the analysis of room impulse responses with varying scattering characteristics. This paper discusses the analysis results using simulated and measured room impulse responses. [Work supported by the Frank H. and Eva B. Buck Foundation.
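
    A minimal sketch of an interaural cross-correlation coefficient (IACC) computed directly on two toy ear signals over lags of about +/-1 ms; the full method in the paper applies this after auditory-filterbank and neural-transduction stages, which are omitted here:

        import numpy as np

        def iacc(left, right, fs, max_lag_ms=1.0):
            # Peak of the normalized cross-correlation of the two ear signals
            # over interaural lags of +/- max_lag_ms.
            max_lag = int(fs * max_lag_ms / 1000.0)
            norm = np.sqrt(np.sum(left ** 2) * np.sum(right ** 2))
            full = np.correlate(left, right, mode="full") / norm
            center = left.size - 1
            return np.max(np.abs(full[center - max_lag:center + max_lag + 1]))

        fs = 44100
        t = np.arange(0, 0.1, 1 / fs)
        rng = np.random.default_rng(6)
        direct = rng.normal(size=t.size)
        left = direct + 0.2 * rng.normal(size=t.size)    # diffuse scattered energy decorrelates the ears
        right = np.roll(direct, 10) + 0.2 * rng.normal(size=t.size)
        print(f"IACC = {iacc(left, right, fs):.2f}")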

  4. Application of the Frequency Map Analysis to the Study of the Beam Dynamics of Light Sources

    International Nuclear Information System (INIS)

    Nadolski, Laurent

    2001-01-01

    The topic of this thesis is the study of beam dynamics in storage rings with a restriction to single particle transverse dynamics. In a first part, tools (Frequency Map Analysis, Hamiltonian, Integrator) are presented for studying and exploring the dynamics. Numerical simulations of four synchrotron radiation sources (the ALS, the ESRF, SOLEIL and Super-ACO) are performed. We construct a tracking code based on a new class of symplectic integrators (Laskar and Robutel, 2000). These integrators with only positive steps are more precise by an order of magnitude than the standard Forest and Ruth scheme. Comparisons with the BETA, DESPOT and MAD codes are carried out. Frequency Map Analysis (Laskar, 1990) is our main analysis tool. This is a numerical method for analysing a conservative dynamical system. Based on a refined Fourier technique, it enables us to compute frequency maps which are real footprints of the beam dynamics of an accelerator. We stress the high sensitivity of the dynamics to magnetic errors and sextupolar strengths. The second part of this work is dedicated to the analysis of experimental results from two light sources. Together with the ALS accelerator team (Berkeley), we succeeded in obtaining the first experimental frequency map of an accelerator. The agreement with the machine model is very impressive. At the Super-ACO ring, the study of the tune shift with amplitude enabled us to highlight a strong octupolar-like component related to the quadrupole fringe field. The consequences for the beam dynamics are important and give us a better understanding of the measured ring performance. All these results are based on turn-by-turn measurements. Many closely related phenomena are treated, such as response matrix analysis or beam decoherence. (author)

  5. Multi-Resolution Wavelet-Transformed Image Analysis of Histological Sections of Breast Carcinomas

    Directory of Open Access Journals (Sweden)

    Hae-Gil Hwang

    2005-01-01

    Full Text Available Multi-resolution images of histological sections of breast cancer tissue were analyzed using texture features of Haar and Daubechies transform wavelets. Tissue samples analyzed were from ductal regions of the breast and included benign ductal hyperplasia, ductal carcinoma in situ (DCIS), and invasive ductal carcinoma (CA). To assess the correlation between computerized image analysis and visual analysis by a pathologist, we created a two-step classification system based on feature extraction and classification. In the feature extraction step, we extracted texture features from wavelet-transformed images at 10× magnification. In the classification step, we applied two types of classifiers to the extracted features, namely a statistics-based multivariate (discriminant) analysis and a neural network. Using features from second-level Haar transform wavelet images in combination with discriminant analysis, we obtained classification accuracies of 96.67 and 87.78% for the training and testing set (90 images each), respectively. We conclude that the best classifiers of carcinomas in histological sections of breast tissue are the texture features from the second-level Haar transform wavelet images used in a discriminant function.
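
    A short sketch of the feature-extraction step: a two-level Haar wavelet decomposition of a stand-in gray-level tile followed by simple per-subband descriptors (PyWavelets' wavedec2 is assumed to be available); the classifier stage (discriminant analysis or a neural network) is not shown:

        import numpy as np
        import pywt

        rng = np.random.default_rng(7)
        tile = rng.random((256, 256))                    # stand-in for a gray-level histology tile at 10x

        # two-level Haar wavelet decomposition; the study used second-level subbands
        coeffs = pywt.wavedec2(tile, wavelet="haar", level=2)
        cA2, (cH2, cV2, cD2) = coeffs[0], coeffs[1]

        def texture_features(band):
            # Simple per-subband texture descriptors: energy, mean and standard deviation.
            return float(np.mean(band ** 2)), float(band.mean()), float(band.std())

        for name, band in (("LL2", cA2), ("LH2", cH2), ("HL2", cV2), ("HH2", cD2)):
            print(name, texture_features(band))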

  6. High-resolution melting analysis for prenatal diagnosis of beta-thalassemia in northern Thailand.

    Science.gov (United States)

    Charoenkwan, Pimlak; Sirichotiyakul, Supatra; Phusua, Arunee; Suanta, Sudjai; Fanhchaksai, Kanda; Sae-Tung, Rattika; Sanguansermsri, Torpong

    2017-12-01

    High-resolution melting (HRM) analysis is a rapid mutation analysis method which assesses the pattern of reduction of the fluorescence signal when the amplified PCR product, saturated with fluorescent dye, is subjected to increasing temperature. We used HRM analysis for prenatal diagnosis of beta-thalassemia disease in northern Thailand. Five PCR-HRM protocols were used to detect point mutations in five different segments of the beta-globin gene, and one protocol to detect the 3.4 kb beta-globin deletion. We sought to characterize the mutations in carriers and to enable prenatal diagnosis in 126 couples at risk of having a fetus with beta-thalassemia disease. The protocols identified 18 common mutations causing beta-thalassemia, including the rare codon 132 (A-T) mutation. Each mutation showed a specific HRM pattern and all results were in concordance with those from direct DNA sequencing or gap-PCR methods. In cases of beta-thalassemia disease resulting from homozygosity for a mutation or compound heterozygosity for two mutations on the same amplified segment, the HRM patterns were different from those of a single mutation and were specific for each combination. HRM analysis is a simple and useful method for mutation identification in beta-thalassemia carriers and for prenatal diagnosis of beta-thalassemia in northern Thailand.

  7. Coupled analysis of multi-impact energy harvesting from low-frequency wind induced vibrations

    Science.gov (United States)

    Zhu, Jin; Zhang, Wei

    2015-04-01

    The need for energy at off-grid locations is critical for effective real-time monitoring and control to ensure structural safety and reliability. To harvest energy from ambient environments, piezoelectric-based energy-harvesting systems have proven very efficient at converting high-frequency vibrations into usable electrical energy. However, due to the low-frequency nature of the vibrations of civil infrastructure, such as those induced by vehicle impacts, wind, and waves, the application of a traditional piezoelectric-based energy-harvesting system is greatly restrained, since the output power drops dramatically as the vibration frequency decreases. This paper focuses on the coupled analysis of a proposed piezoelectric multi-impact wind-energy-harvesting device that can effectively up-convert low-frequency wind-induced vibrations into high-frequency ones. The device consists of an H-shape beam and four bimorph piezoelectric cantilever beams. The H-shape beam, which can easily be triggered to vibrate at a low wind speed, is inspired by the first Tacoma Narrows Bridge, which failed at a wind speed of 18.8 m s-1 in 1940. The multi-impact mechanism between the H-shape beam and the bimorph piezoelectric cantilever beams is incorporated to improve the harvesting performance at lower frequencies. During the multi-impact process, a series of sequential impacts between the H-shape beam and the cantilever beams can trigger high-frequency vibrations of the cantilever beams and result in high output power with considerably high efficiency. In the coupled analysis, the coupled structural, aerodynamic, and electrical equations are solved to obtain the dynamic response and the power output of the proposed harvesting device. A parametric study of several parameters in the coupled analysis framework is carried out, including the external resistance, wind speed, and the configuration of the H-shape beam. The average harvested power for the piezoelectric cantilever

  8. Time-frequency feature analysis and recognition of fission neutrons signal based on support vector machine

    International Nuclear Information System (INIS)

    Jin Jing; Wei Biao; Feng Peng; Tang Yuelin; Zhou Mi

    2010-01-01

    Based on the interdependent relationship between fission neutrons (252Cf) and the fission chain (235U system), this paper presents time-frequency feature analysis and recognition of fission neutron signals based on a support vector machine (SVM), through analysis of the signal characteristics and the measuring principle of the 252Cf fission neutron signal. The time-frequency characteristics and energy features of the fission neutron signal are extracted using wavelet decomposition and de-noising wavelet packet decomposition, and then used for training and classification with a support vector machine based on statistical learning theory. The results show that it is effective to obtain features of the nuclear signal via wavelet decomposition and de-noising wavelet packet decomposition, and that the latter reflects the internal characteristics of the fission neutron system better. With training accomplished, the SVM classifier achieves an accuracy rate above 70%, overcoming the lack of training samples and verifying the effectiveness of the algorithm. (authors)
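
    A minimal sketch of the general pipeline described (wavelet-packet energy features feeding an SVM classifier), assuming PyWavelets and scikit-learn and using synthetic stand-in signals rather than real detector data:

```python
import numpy as np
import pywt
from sklearn.svm import SVC

def wavelet_packet_energy(signal, level=3, wavelet="db4"):
    """Relative energy of each terminal wavelet-packet node as a feature vector."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="freq")
    energies = np.array([np.sum(np.asarray(n.data) ** 2) for n in nodes])
    return energies / energies.sum()

# Synthetic stand-ins: two classes of noisy signals with different spectral content.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 512, endpoint=False)
make = lambda f: np.sin(2 * np.pi * f * t) + 0.5 * rng.standard_normal(t.size)
X = np.array([wavelet_packet_energy(make(f)) for f in ([20] * 30 + [60] * 30)])
y = np.array([0] * 30 + [1] * 30)
clf = SVC(kernel="rbf").fit(X, y)
print(clf.score(X, y))
```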

  9. Natural frequency and vibration analysis of jacket type foundation for offshore wind power

    Science.gov (United States)

    Hung, Y.-C.; Chang, Y.-Y.; Chen, S.-Y.

    2017-12-01

    There are various types of foundation structures for offshore wind power. Engineers must assess the ocean conditions at the wind farm, plan the transportation and installation of the structural members, and take the capability to manufacture the steel structure into account in order to reach an optimum design. Unlike onshore structures, jacket offshore structures must also account for wave excitation effects. The aim of this paper is to study how the natural frequency differs between different levels of structural stiffness, to discuss the effect of different boundary-condition settings in the analysis, and to compare these values with the natural frequency of the sea waves in order to avoid resonance. In this paper, the finite element analysis software ABAQUS is used to model and analyze the natural vibration behavior of the jacket structure.
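
    The natural-frequency extraction ABAQUS performs in such an analysis amounts to a generalized eigenvalue problem on the structure's stiffness and mass matrices. A toy-scale, hedged sketch (a two-degree-of-freedom system with made-up values, not the jacket model):

```python
import numpy as np
from scipy.linalg import eigh

# Toy 2-DOF system standing in for an assembled FE model:
# solve K x = omega^2 M x  (generalized eigenvalue problem).
K = np.array([[4.0e6, -2.0e6],
              [-2.0e6, 2.0e6]])     # stiffness matrix, N/m (illustrative values)
M = np.array([[2.0e3, 0.0],
              [0.0, 1.0e3]])        # mass matrix, kg (illustrative values)

omega_sq, modes = eigh(K, M)        # eigenvalues are omega^2 in (rad/s)^2
freqs_hz = np.sqrt(omega_sq) / (2 * np.pi)
print(freqs_hz)                     # natural frequencies to compare with wave frequencies
```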

  10. High resolution respirometry analysis of polyethylenimine-mediated mitochondrial energy crisis and cellular stress

    DEFF Research Database (Denmark)

    Hall, Arnaldur; Larsen, Anna Karina; Parhamifar, Ladan

    2013-01-01

    ...and spectrophotometry analysis of cytochrome c oxidase activity, we were able to identify complex IV (cytochrome c oxidase) as a likely specific site of PEI-mediated inhibition within the electron transport system. Unraveling the mechanisms of PEI-mediated mitochondrial energy crisis is central for combinatorial design... of PEI-mediated plasma membrane damage and subsequent ATP leakage to the extracellular medium. Studies with freshly isolated mouse liver mitochondria corroborated the bioenergetic findings and demonstrated parallel polycation concentration- and time-dependent changes in state 2 and state 4o oxygen flux... as well as lowered ADP phosphorylation (state 3) and mitochondrial ATP synthesis. Polycation-mediated reduction of electron transport system activity was further demonstrated in 'broken mitochondria' (freeze-thawed mitochondrial preparations). Moreover, by using both high-resolution respirometry...

  11. Differentiation of minute virus of mice and mouse parvovirus by high resolution melting curve analysis.

    Science.gov (United States)

    Rao, Dan; Wu, Miaoli; Wang, Jing; Yuan, Wen; Zhu, Yujun; Cong, Feng; Xu, Fengjiao; Lian, Yuexiao; Huang, Bihong; Wu, Qiwen; Chen, Meili; Zhang, Yu; Huang, Ren; Guo, Pengju

    2017-12-01

    Murine parvovirus is one of the most prevalent infectious pathogens in mouse colonies. A specific primer pair targeting the VP2 gene of minute virus of mice (MVM) and mouse parvovirus (MPV) was utilized for high resolution melting (HRM) analysis. The resulting melting curves could distinguish these two virus strains and there was no detectable amplification of the other mouse pathogens which included rat parvovirus (KRV), ectromelia virus (ECT), mouse adenovirus (MAD), mouse cytomegalovirus (MCMV), polyoma virus (Poly), Helicobactor hepaticus (H. hepaticus) and Salmonella typhimurium (S. typhimurium). The detection limit of the standard was 10 copies/μL. This study showed that the PCR-HRM assay could be an alternative useful method with high specificity and sensitivity for differentiating murine parvovirus strains MVM and MPV.

  12. Stain Deconvolution Using Statistical Analysis of Multi-Resolution Stain Colour Representation.

    Directory of Open Access Journals (Sweden)

    Najah Alsubaie

    Stain colour estimation is a prominent factor in the analysis pipeline of most histology image processing algorithms. Providing a reliable and efficient stain colour deconvolution approach is fundamental for a robust algorithm. In this paper, we propose a novel method for stain colour deconvolution of histology images. This approach statistically analyses the multi-resolution representation of the image to separate the independent observations from the correlated ones. We then estimate the stain mixing matrix using the filtered uncorrelated data. We conducted an extensive set of experiments to compare the proposed method to recent state-of-the-art methods and demonstrate the robustness of this approach using three different datasets of scanned slides, prepared in different labs using different scanners.
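
    For readers unfamiliar with stain deconvolution, the sketch below shows the classical optical-density view that such estimators build on (a simplified illustration, not the statistical multi-resolution method proposed in the paper; the H&E stain vectors are approximate textbook values):

```python
import numpy as np

def stain_deconvolve(rgb, stain_matrix):
    """Separate stain concentrations from an RGB image via Beer-Lambert optical density.

    rgb : float image in [0, 1], shape (H, W, 3)
    stain_matrix : rows are unit stain colour vectors in OD space, shape (n_stains, 3)
    """
    od = -np.log(np.clip(rgb, 1e-6, 1.0))             # optical density per channel
    pinv = np.linalg.pinv(stain_matrix)               # least-squares unmixing
    concentrations = od.reshape(-1, 3) @ pinv
    return concentrations.reshape(rgb.shape[0], rgb.shape[1], -1)

# Illustrative (approximate) haematoxylin and eosin OD vectors, normalized.
he = np.array([[0.65, 0.70, 0.29],
               [0.07, 0.99, 0.11]])
he = he / np.linalg.norm(he, axis=1, keepdims=True)

img = np.random.default_rng(2).random((32, 32, 3))    # stand-in for a slide tile
conc = stain_deconvolve(img, he)
print(conc.shape)                                      # (32, 32, 2) stain concentration maps
```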

  13. High Spatial Resolution Analysis of Fungal Cell Biochemistry: Bridging the Analytical Gap using Synchrotron FTIR Spectromicroscopy

    International Nuclear Information System (INIS)

    Kaminskyj, S.; Konstantin, J.; Szeghalmi, A.; Gough, K.

    2008-01-01

    Fungi impact humans and the environment in many ways, for good and ill. Some fungi support the growth of terrestrial plants or are used in biotechnology, and yet others are established or emerging pathogens. In some cases, the same organism may play different roles depending on the context or the circumstance. A better understanding of the relationship between fungal biochemical composition as related to the fungal growth environment is essential if we are to support or control their activities. Synchrotron FTIR (sFTIR) spectromicroscopy of fungal hyphae is a major new tool for exploring cell composition at a high spatial resolution. Brilliant synchrotron light is essential for this analysis due to the small size of fungal hyphae. sFTIR biochemical characterization of subcellular variation in hyphal composition will allow detailed exploration of fungal responses to experimental treatments and to environmental factors.

  14. Region-of-influence approach to a frequency analysis of heavy precipitation in Slovakia

    Czech Academy of Sciences Publication Activity Database

    Gaál, L.; Kyselý, Jan; Szolgay, J.

    2007-01-01

    Vol. 4, No. 4 (2007), pp. 2361-2401. ISSN 1812-2108 R&D Projects: GA AV ČR KJB300420601 Institutional research plan: CEZ:AV0Z30420517 Keywords: regional frequency analysis * region-of-influence approach * pooling groups * extreme precipitation events * L-moments * Slovakia Subject RIV: DG - Atmosphere Sciences, Meteorology http://www.hydrol-earth-syst-sci-discuss.net/4/2361/2007/

  15. Region-of-influence approach to a frequency analysis of heavy precipitation in Slovakia

    Czech Academy of Sciences Publication Activity Database

    Gaál, L.; Kyselý, Jan; Szolgay, J.

    2008-01-01

    Vol. 12, No. 3 (2008), pp. 825-839. ISSN 1027-5606 R&D Projects: GA AV ČR KJB300420601 Institutional research plan: CEZ:AV0Z30420517 Keywords: regional frequency analysis * region-of-influence approach * pooling groups * extreme precipitation events * L-moments * Slovakia Subject RIV: DG - Atmosphere Sciences, Meteorology Impact factor: 2.167, year: 2008 http://www.hydrol-earth-syst-sci.net/12/825/2008/

  16. Computerized analysis of diagnostic level in breast diseases and frequency of their clinicoroentgenological symptoms

    International Nuclear Information System (INIS)

    Babij, Ya.S.; Krakhmaleva, L.P.; Shumakov, A.G.

    1988-01-01

    A standardized chart with a list of clinicoroentgenological symptoms of the most frequent mammary gland diseases was developed. The data of a computer analysis of the diagnostic level and the frequency of clinicoroentgenological symptoms of different diseases of the mammary gland are given. The data obtained can be used for conventional and computer diagnostics of mammary gland diseases, as well as a basis for the further study of the semiotics of different pathological processes in the mammary gland. 22 refs

  17. A two-component generalized extreme value distribution for precipitation frequency analysis

    Czech Academy of Sciences Publication Activity Database

    Rulfová, Zuzana; Buishand, A.; Roth, M.; Kyselý, Jan

    2016-01-01

    Vol. 534, March (2016), pp. 659-668. ISSN 0022-1694 R&D Projects: GA ČR(CZ) GA14-18675S Institutional support: RVO:68378289 Keywords: precipitation extremes * two-component extreme value distribution * regional frequency analysis * convective precipitation * stratiform precipitation * Central Europe Subject RIV: DG - Atmosphere Sciences, Meteorology Impact factor: 3.483, year: 2016 http://www.sciencedirect.com/science/article/pii/S0022169416000500

  18. Text in social networking Web sites: A word frequency analysis of Live Spaces

    OpenAIRE

    Thelwall, Mike

    2008-01-01

    Social networking sites are owned by a wide section of society and seem to dominate Web usage. Despite much research into this phenomenon, little systematic data is available. This article partially fills this gap with a pilot text analysis of one social networking site, Live Spaces. The text in 3,071 English language Live Spaces sites was monitored daily for six months and word frequency statistics calculated and compared with those from the British National Corpus. The results confirmed the...

  19. Experimental measure of arm stiffness during single reaching movements with a time-frequency analysis

    OpenAIRE

    Piovesan, Davide; Pierobon, Alberto; DiZio, Paul; Lackner, James R.

    2013-01-01

    We tested an innovative method to estimate joint stiffness and damping during multijoint unfettered arm movements. The technique employs impulsive perturbations and a time-frequency analysis to estimate the arm's mechanical properties along a reaching trajectory. Each single impulsive perturbation provides a continuous estimation on a single-reach basis, making our method ideal to investigate motor adaptation in the presence of force fields and to study the control of movement in impaired ind...

  20. Measurement of flaw size in a weld sample by ultrasonic frequency analysis

    International Nuclear Information System (INIS)

    Whaley, H.L. Jr.; Adler, L.; Cook, K.V.; McClung, R.W.

    1975-05-01

    An ultrasonic frequency analysis technique has been developed and applied to the measurement of flaws in an 8-in.-thick heavy-section steel specimen belonging to the Pressure Vessel Research Committee program. Using this technique, the flaws occurring in the weld area were characterized in quantitative terms of both dimension and orientation. Several modifications of the technique were made during the study, including the application of several transducers and consideration of ultrasonic mode conversion. (U.S.)

  1. An automatized frequency analysis for vine plot detection and delineation in remote sensing

    OpenAIRE

    Delenne , Carole; Rabatel , G.; Deshayes , M.

    2008-01-01

    The availability of an automatic tool for vine plot detection, delineation, and characterization would be very useful for management purposes. An automatic and recursive process using frequency analysis (with the Fourier transform and Gabor filters) has been developed to meet this need. This results in the determination of vine plot boundaries and accurate estimation of interrow width and row orientation. To foster large-scale applications, tests and validation have been carried out on standard ver...
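
    As a hedged, toy illustration of the frequency-analysis idea (not the authors' Fourier/Gabor pipeline), the code below estimates the dominant spacing of a periodic row pattern from the peak of a 1D spatial spectrum:

```python
import numpy as np

def dominant_row_spacing(profile, pixel_size_m):
    """Estimate the dominant spatial period (e.g. interrow width) of a 1D image profile."""
    profile = profile - profile.mean()
    spec = np.abs(np.fft.rfft(profile))
    freqs = np.fft.rfftfreq(profile.size, d=pixel_size_m)   # cycles per metre
    k = np.argmax(spec[1:]) + 1                             # skip the DC bin
    return 1.0 / freqs[k]                                   # spacing in metres

# Synthetic profile: rows every 2.5 m sampled at 0.5 m/pixel, plus noise.
pixel = 0.5
x = np.arange(0, 200) * pixel
profile = np.cos(2 * np.pi * x / 2.5) + 0.3 * np.random.default_rng(3).standard_normal(x.size)
print(dominant_row_spacing(profile, pixel))                 # ~2.5
```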

  2. Multi-group transport methods for high-resolution neutron activation analysis

    International Nuclear Information System (INIS)

    Burns, K. A.; Smith, L. E.; Gesh, C. J.; Shaver, M. W.

    2009-01-01

    The accurate and efficient simulation of coupled neutron-photon problems is necessary for several important radiation detection applications. Examples include the detection of nuclear threats concealed in cargo containers and prompt gamma neutron activation analysis for nondestructive determination of elemental composition of unknown samples. In these applications, high-resolution gamma-ray spectrometers are used to preserve as much information as possible about the emitted photon flux, which consists of both continuum and characteristic gamma rays with discrete energies. Monte Carlo transport is the most commonly used modeling tool for this type of problem, but computational times for many problems can be prohibitive. This work explores the use of multi-group deterministic methods for the simulation of neutron activation problems. Central to this work is the development of a method for generating multi-group neutron-photon cross-sections in a way that separates the discrete and continuum photon emissions so that the key signatures in neutron activation analysis (i.e., the characteristic line energies) are preserved. The mechanics of the cross-section preparation method are described and contrasted with standard neutron-gamma cross-section sets. These custom cross-sections are then applied to several benchmark problems. Multi-group results for neutron and photon flux are compared to MCNP results. Finally, calculated responses of high-resolution spectrometers are compared. Preliminary findings show promising results when compared to MCNP. A detailed discussion of the potential benefits and shortcomings of the multi-group-based approach, in terms of accuracy and computational efficiency, is provided. (authors)

  3. Genome-wide SNP discovery in tetraploid alfalfa using 454 sequencing and high resolution melting analysis

    Directory of Open Access Journals (Sweden)

    Zhao Patrick X

    2011-07-01

    Background: Single nucleotide polymorphisms (SNPs) are the most common type of sequence variation among plants and are often functionally important. We describe the use of 454 technology and high resolution melting (HRM) analysis for high-throughput SNP discovery in tetraploid alfalfa (Medicago sativa L.), a species with high economic value but limited genomic resources. Results: The alfalfa genotypes selected from M. sativa subsp. sativa var. 'Chilean' and M. sativa subsp. falcata var. 'Wisfal', which differ in water stress sensitivity, were used to prepare cDNA from tissue of clonally propagated plants grown under either well-watered or water-stressed conditions, and then pooled for 454 sequencing. Based on 125.2 Mb of raw sequence, a total of 54,216 unique sequences were obtained, including 24,144 tentative consensus (TC) sequences and 30,072 singletons, ranging from 100 bp to 6,662 bp in length, with an average length of 541 bp. We identified 40,661 candidate SNPs distributed throughout the genome. A sample of candidate SNPs was evaluated and validated using high resolution melting (HRM) analysis. A total of 3,491 TCs harboring 20,270 candidate SNPs were located on the M. truncatula (MT 3.5.1) chromosomes. Gene Ontology assignments indicate that the sequences obtained cover a broad range of GO categories. Conclusions: We describe an efficient method to identify thousands of SNPs distributed throughout the alfalfa genome covering a broad range of GO categories. Validated SNPs represent valuable molecular marker resources that can be used to enhance marker density in linkage maps, identify potential factors involved in heterosis and genetic variation, and serve as tools for association mapping and genomic selection in alfalfa.

  4. Time–frequency analysis of nonstationary complex magneto-hydro-dynamics in fusion plasma signals using the Choi–Williams distribution

    International Nuclear Information System (INIS)

    Xu, L.Q.; Hu, L.Q.; Chen, K.Y.; Li, E.Z.

    2013-01-01

    Highlights: • The Choi–Williams distribution yields excellent time–frequency resolution for discrete signals. • The CWD method provides clear time–frequency pictures of fast MHD events in EAST and HT-7. • The CWD method has advantages over the wavelet transform scalogram and the short-time Fourier transform spectrogram. • We discuss how to choose the windows and the free parameter of the CWD method. -- Abstract: The Choi–Williams distribution is applied to the time–frequency analysis of signals describing rapid magneto-hydro-dynamic (MHD) modes and events in tokamak plasmas. A comparison with soft X-ray (SXR) signals as well as Mirnov signals shows the advantages of the Choi–Williams distribution over both the continuous wavelet transform scalogram and the short-time Fourier transform spectrogram. Examples of MHD activity in the HT-7 and EAST tokamaks are shown, namely the onset of coupled tearing modes, high-frequency sawtooth precursors, and low-frequency MHD instabilities in edge localized mode (ELM)-free H-mode discharges.
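
    For orientation, the following is a simplified, hedged sketch of a discrete Choi–Williams distribution (normalization and edge handling are glossed over; it is not the authors' implementation): the instantaneous autocorrelation is taken to the ambiguity domain, multiplied by the exponential Choi–Williams kernel, and transformed to the time–frequency plane.

```python
import numpy as np
from scipy.signal import hilbert

def choi_williams(x, sigma=1.0, n_lags=None):
    """Simplified discrete Choi-Williams distribution (illustrative sketch)."""
    z = hilbert(np.asarray(x, dtype=float))            # analytic signal
    N = z.size
    M = n_lags or N // 4                               # maximum lag
    R = np.zeros((N, 2 * M), dtype=complex)            # instantaneous autocorrelation
    for m in range(-M, M):
        a = abs(m)
        n = np.arange(a, N - a)
        R[n, m + M] = z[n + m] * np.conj(z[n - m])
    A = np.fft.fft(R, axis=0)                          # ambiguity function (Doppler x lag)
    theta = 2 * np.pi * np.fft.fftfreq(N)              # Doppler axis
    lags = np.arange(-M, M)
    kernel = np.exp(-np.outer(theta ** 2, lags ** 2) / sigma)   # Choi-Williams kernel
    R_smoothed = np.fft.ifft(A * kernel, axis=0)
    return np.real(np.fft.fft(R_smoothed, axis=1))     # frequency along the lag axis

# Example: a linear chirp; each row of `cwd` is a spectrum at one time sample.
fs = 1000
t = np.arange(0, 1, 1 / fs)
chirp = np.cos(2 * np.pi * (50 + 100 * t) * t)
cwd = choi_williams(chirp, sigma=1.0)
print(cwd.shape)
```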

  5. An Object-Based Image Analysis Approach for Detecting Penguin Guano in very High Spatial Resolution Satellite Images

    OpenAIRE

    Chandi Witharana; Heather J. Lynch

    2016-01-01

    The logistical challenges of Antarctic field work and the increasing availability of very high resolution commercial imagery have driven an interest in more efficient search and classification of remotely sensed imagery. This exploratory study employed geographic object-based image analysis (GEOBIA) methods to classify guano stains, indicative of chinstrap and Adélie penguin breeding areas, from very high spatial resolution (VHSR) satellite imagery and closely examined the transferability of knowle...

  6. SAMPO 90 - High resolution interactive gamma spectrum analysis including automation with macros

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Nikkinen, M.T.; Routti, J.T.

    1991-01-01

    SAMPO 90 is a high performance gamma spectrum analysis program for personal computers. It uses high resolution color graphics to display calibrations, spectra, fitting results as multiplet components, and analysis results. All the analysis phases can be done either under full interactive user control or by using macros for automated measurement and analysis sequences including the control of MCAs and sample changers. Semi-automated calibrations for peak shapes (Gaussian with exponential tails), detector efficiency, and energy are available with a possibility for user intervention through interactive graphics. Accurate peak area determination of even the most complex multiplets, of up to 32 components, is accomplished using linear, non-linear and mixed mode fitting, where the component energies and areas can be either frozen or allowed to float in arbitrary combinations. Nuclide identification is done using associated lines techniques which allow interference correction for fully overlapping peaks. Peaked Background Subtraction can be performed and Minimum Detectable Activities calculated. Attenuation corrections can be taken into account in detector efficiency calculation. The most common PC-based MCA spectrum formats (Canberra S100, Ortec ACE, Nucleus PCA, ND AccuSpec) are supported as well as ASCII spectrum files. A gamma-line library is included together with an editor for user configurable libraries. The analysis reports and program parameters are fully customizable. Function key macros can be used to automate the most common analysis procedures. Small batch type modules are additionally available for routine work. SAMPO 90 is a result of over twenty man years of programming and contains 25,000 lines of Fortran, 10,000 lines of C, and 12,000 lines of assembler
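
    To give a flavour of the peak-shape model mentioned (a Gaussian with an exponential low-energy tail), here is a hedged, self-contained sketch of fitting a single peak of that general form with scipy; the parameterization is one common choice and not necessarily the one used in SAMPO 90.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss_with_tail(x, area, centroid, sigma, tail, frac):
    """Gaussian peak plus an exponential low-energy tail (one common parameterization)."""
    gauss = np.exp(-0.5 * ((x - centroid) / sigma) ** 2)
    tail_part = np.exp((x - centroid) / tail) * (x < centroid)   # low-energy side only
    shape = (1 - frac) * gauss + frac * tail_part
    return area * shape / (shape.sum() * (x[1] - x[0]))          # area-normalized shape

# Synthetic peak on a small constant background.
x = np.linspace(640, 680, 400)
true = gauss_with_tail(x, 5000, 661.7, 1.2, 3.0, 0.15) + 20
counts = np.random.default_rng(4).poisson(true).astype(float)

p0 = [4000, 662, 1.0, 2.0, 0.1]
popt, _ = curve_fit(gauss_with_tail, x, counts - 20, p0=p0, bounds=(0, np.inf))
print(popt[:3])   # fitted area, centroid, sigma
```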

  7. Research on Distributed PV Storage Virtual Synchronous Generator System and Its Static Frequency Characteristic Analysis

    Directory of Open Access Journals (Sweden)

    Xiangwu Yan

    2018-03-01

    The increasing penetration rate of grid-connected renewable energy power generation reduces the primary frequency regulation capability of the system and poses a challenge to the security and stability of the power grid. In this paper, a distributed photovoltaic (PV) storage virtual synchronous generator system is constructed, which realizes the external characteristics of a synchronous generator/motor. For this kind of input/output bidirectional device (e.g., renewable power generation/storage combined systems, pumped storage power stations, battery energy storage systems, and vehicle-to-grid electric vehicles), a synthesis analysis method for system power-frequency behaviour considering source-load static frequency characteristics (the S-L analysis method) is proposed in order to depict the system's dynamic power-balance adjustment process visually. Simultaneously, an inertia matching method is proposed to solve the problem of inertia matching in the power grid. Through simulation experiments in MATLAB, the feasibility of the distributed PV storage virtual synchronous generator system is verified, as well as the effectiveness of the S-L analysis method and the inertia matching method.

  8. A direct method for soil-structure interaction analysis based on frequency-dependent soil masses

    International Nuclear Information System (INIS)

    Danisch, R.; Delinic, K.; Marti, J.; Trbojevic, V.M.

    1993-01-01

    In a soil-structure interaction analysis, the soil, as a subsystem of the global vibrating system, exerts a strong influence on the response of the nuclear reactor building to the earthquake excitation. The volume of resources required for dealing with the soil has led to a number of different types of frequency-domain solutions, most of them based on the impedance function approach. These procedures require coupling the soil to the lumped-mass finite-element model of the reactor building. In most practical cases, the global vibrating system is analysed in the time domain (i.e. modal time history, linear or non-linear direct time integration). Hence, it follows that the frequency-domain solution for the soil must be converted to an 'equivalent' soil model in the time domain. Over the past three decades, different approaches have been developed and used for earthquake analysis of nuclear power plants. In some cases, difficulties experienced in modelling the soil have affected the methods of global analysis, thus leading to approaches like the substructuring technique, e.g. the 3-step method. In practical applications, the limitations of each specific method must be taken into account in order to avoid unrealistic results. The aim of this paper is to present recent developments on an equivalent SDOF system for soil including frequency-dependent soil masses. The method is compared with the classical 3-step method. (author)

  9. Analysis of Radio Frequency Blackout for a Blunt-Body Capsule in Atmospheric Reentry Missions

    Directory of Open Access Journals (Sweden)

    Yusuke Takahashi

    2016-01-01

    A numerical analysis of electromagnetic waves around the atmospheric reentry demonstrator (ARD) of the European Space Agency (ESA) in an atmospheric reentry mission was conducted. During the ARD mission, which involved a 70% scaled-down configuration capsule of the Apollo command module, radio frequency blackout and strong plasma attenuation of radio waves in communications with data relay satellites and aircraft were observed. The electromagnetic interference was caused by the highly dense plasma derived from the strong shock wave generated in front of the capsule because of its orbital speed during reentry. In this study, the physical properties of the plasma flow in the shock layer and wake region of the ESA ARD were obtained using a computational fluid dynamics technique. Then, electromagnetic waves were modeled using a frequency-dependent finite-difference time-domain method using the plasma properties. The analysis model was validated against experimental flight data. A comparison of the measured and predicted results showed good agreement. The distribution of charged particles around the ESA ARD and the complicated behavior of electromagnetic waves, with attenuation and reflection, are clarified in detail. It is suggested that the analysis model could be an effective tool for investigating radio frequency blackout and plasma attenuation in radio wave communication.
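
    The blackout condition itself reduces to comparing the link frequency with the local electron plasma frequency in the shock layer: a wave cannot propagate where the plasma frequency exceeds the signal frequency. A small illustrative calculation using the standard textbook formula (the density value is illustrative, not taken from the paper):

```python
import numpy as np

E_CHARGE = 1.602176634e-19      # C
E_MASS = 9.1093837015e-31       # kg
EPS0 = 8.8541878128e-12         # F/m

def plasma_frequency_hz(n_e_per_m3):
    """Electron plasma frequency f_p = (1/2pi) * sqrt(n_e e^2 / (eps0 m_e))."""
    return np.sqrt(n_e_per_m3 * E_CHARGE ** 2 / (EPS0 * E_MASS)) / (2 * np.pi)

# Example: electron density of the order found in reentry shock layers (illustrative).
n_e = 1e18                       # electrons per cubic metre
f_p = plasma_frequency_hz(n_e)
print(f"plasma frequency: {f_p / 1e9:.2f} GHz")
# An S-band link at ~2.3 GHz is cut off wherever f_p exceeds the link frequency.
print("S-band blackout:", f_p > 2.3e9)
```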

  10. Backscattering analysis of high frequency ultrasonic imaging for ultrasound-guided breast biopsy

    Science.gov (United States)

    Cummins, Thomas; Akiyama, Takahiro; Lee, Changyang; Martin, Sue E.; Shung, K. Kirk

    2017-03-01

    A new ultrasound-guided breast biopsy technique is proposed. The technique utilizes conventional ultrasound guidance coupled with a high-frequency embedded ultrasound array located within the biopsy needle to improve the accuracy of breast cancer diagnosis [1]. The array within the needle is intended to be used to detect microcalcifications indicative of early breast cancers such as ductal carcinoma in situ (DCIS). Backscattering analysis has the potential to characterize tissues and thereby improve localization of lesions. This paper describes initial results of the application of backscattering analysis to breast biopsy tissue specimens and shows the usefulness of high-frequency ultrasound for the new biopsy-related technique. Ultrasound echoes of ex vivo breast biopsy tissue specimens were acquired using a single-element transducer with a bandwidth from 41 MHz to 88 MHz and a UBM methodology, and the backscattering coefficients were calculated. These values, as well as B-mode image data, were mapped in 2D and matched with the pathology image corresponding to each plane for identification of tissue type. Microcalcifications were significantly distinguished from normal tissue. Adenocarcinoma was also successfully differentiated from adipose tissue. These results indicate that backscattering analysis is able to quantitatively distinguish normal and abnormal tissue, which should help radiologists locate abnormal areas during the proposed ultrasound-guided breast biopsy with high-frequency ultrasound.

  11. Single Frequency Impedance Analysis on Reduced Graphene Oxide Screen-Printed Electrode for Biomolecular Detection.

    Science.gov (United States)

    Rajesh; Singal, Shobhita; Kotnala, Ravinder K

    2017-10-01

    A biofunctionalized reduced graphene oxide (rGO)-modified screen-printed carbon electrode (SPCE) was constructed as an immunosensor for detection of C-reactive protein (CRP), a biomarker released in early-stage acute myocardial infarction. A single frequency analysis (SFA) approach was utilized for the biomolecular sensing, by monitoring the change in phase angle at an optimized frequency resulting from antigen-antibody interactions. A set of measurements was carried out to find the frequency at which the maximum change in phase angle was observed, in this case around 10 Hz. The bioelectrode was characterized by contact angle measurements, scanning electron microscopy, and electrochemical techniques. A concentration-dependent response of the immunosensor to CRP, measured as the change in phase angle at a fixed frequency of 10 Hz, was found in the range of 10 ng mL-1 to 10 μg mL-1 in PBS and fit quantitatively well with the Hill-Langmuir equation. Based on the concentration-response data, the dissociation constant (Kd) was found to be 3.5 nM (with a Hill coefficient n = 0.57), which indicated negative cooperativity with high anti-CRP (antibody)-CRP (antigen) binding at the electrode surface. Low-frequency sensing with ease of measurement on a disposable electroactive rGO-modified electrode, with high selectivity and sensitivity, makes this a potential tool for biological sensors.
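
    To illustrate the concentration-response treatment mentioned (a hedged sketch on synthetic data, not the paper's measurements), the Hill-Langmuir equation can be fitted to phase-angle changes with scipy as follows:

```python
import numpy as np
from scipy.optimize import curve_fit

def hill_langmuir(c, theta_max, kd, n):
    """Hill-Langmuir binding curve: response = theta_max * c^n / (kd^n + c^n)."""
    return theta_max * c ** n / (kd ** n + c ** n)

# Synthetic concentration-response data (arbitrary units standing in for
# phase-angle change versus CRP concentration).
conc = np.logspace(-2, 2, 9)                       # e.g. ug/mL
rng = np.random.default_rng(5)
resp = hill_langmuir(conc, 12.0, 0.8, 0.6) + 0.2 * rng.standard_normal(conc.size)

popt, pcov = curve_fit(hill_langmuir, conc, resp, p0=[10.0, 1.0, 1.0])
theta_max, kd, n = popt
print(f"Kd = {kd:.2f}, Hill coefficient n = {n:.2f}")
```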

  12. Multifield analysis of a piezoelectric valveless micropump: effects of actuation frequency and electric potential

    International Nuclear Information System (INIS)

    Sayar, Ersin; Farouk, Bakhtier

    2012-01-01

    Coupled multifield analysis of a piezoelectrically actuated valveless micropump device is carried out for liquid (water) transport applications. The valveless micropump consists of two diffuser/nozzle elements, the pump chamber, a thin structural layer (silicon), and a piezoelectric layer (PZT-5A) as the actuator. We consider two-way coupling of forces between solid and liquid domains in systems where actuator deflection causes fluid flow and vice versa. Flow contraction and expansion (through the nozzle and the diffuser respectively) generate net fluid flow. Both structural and flow field analysis of the microfluidic device are considered. The effect of the driving power (voltage) and actuation frequency on silicon-PZT-5A bi-layer membrane deflection and flow rate is investigated. For the compressible flow formulation, an isothermal equation of state for the working fluid is employed. The governing equations for the flow fields and the silicon-PZT-5A bi-layer membrane motions are solved numerically. At frequencies below 5000 Hz, the predicted flow rate increases with actuation frequency. The fluid–solid system shows a resonance at 5000 Hz due to the combined effect of mechanical and fluidic capacitances, inductances, and damping. Time-averaged flow rate starts to drop with increase of actuation frequency above 5000 Hz. The velocity profile in the pump chamber becomes relatively flat or plug-like if the frequency of pulsations is sufficiently large (high Womersley number). The pressure, velocity, and flow rate prediction models developed in the present study can be utilized to optimize the design of MEMS-based micropumps. (paper)

  13. Multifield analysis of a piezoelectric valveless micropump: effects of actuation frequency and electric potential

    Science.gov (United States)

    Sayar, Ersin; Farouk, Bakhtier

    2012-07-01

    Coupled multifield analysis of a piezoelectrically actuated valveless micropump device is carried out for liquid (water) transport applications. The valveless micropump consists of two diffuser/nozzle elements, the pump chamber, a thin structural layer (silicon), and a piezoelectric layer (PZT-5A) as the actuator. We consider two-way coupling of forces between solid and liquid domains in systems where actuator deflection causes fluid flow and vice versa. Flow contraction and expansion (through the nozzle and the diffuser respectively) generate net fluid flow. Both structural and flow field analysis of the microfluidic device are considered. The effect of the driving power (voltage) and actuation frequency on silicon-PZT-5A bi-layer membrane deflection and flow rate is investigated. For the compressible flow formulation, an isothermal equation of state for the working fluid is employed. The governing equations for the flow fields and the silicon-PZT-5A bi-layer membrane motions are solved numerically. At frequencies below 5000 Hz, the predicted flow rate increases with actuation frequency. The fluid-solid system shows a resonance at 5000 Hz due to the combined effect of mechanical and fluidic capacitances, inductances, and damping. Time-averaged flow rate starts to drop with increase of actuation frequency above 5000 Hz. The velocity profile in the pump chamber becomes relatively flat or plug-like if the frequency of pulsations is sufficiently large (high Womersley number). The pressure, velocity, and flow rate prediction models developed in the present study can be utilized to optimize the design of MEMS-based micropumps.

  14. High-frequency autonomic modulation: a new model for analysis of autonomic cardiac control.

    Science.gov (United States)

    Champéroux, Pascal; Fesler, Pierre; Judé, Sebastien; Richard, Serge; Le Guennec, Jean-Yves; Thireau, Jérôme

    2018-05-03

    Increases in high-frequency beat-to-beat heart rate oscillations caused by torsadogenic hERG blockers appear to be associated with signs of parasympathetic and sympathetic co-activation, which cannot be assessed directly using classic methods of heart rate variability analysis. The present work aimed to find a translational model that would allow this particular state of the autonomic control of heart rate to be assessed. High-frequency heart rate and heart period oscillations were analysed within discrete 10 s intervals in a cohort of 200 healthy human subjects. Results were compared to data collected in non-human primates and beagle dogs during pharmacological challenges and torsadogenic hERG blocker exposure, in 127 genotyped LQT1 patients on/off β-blocker treatment, and in subgroups of smoking and non-smoking subjects. Three states of autonomic modulation, S1 (parasympathetic predominance) to S3 (reciprocal parasympathetic withdrawal/sympathetic activation), were differentiated to build a new model of heart rate variability referred to as high-frequency autonomic modulation. The S2 state corresponded to a specific state during which the parasympathetic and sympathetic systems coexist or are co-activated. S2 oscillations were proportionally increased by torsadogenic hERG-blocking drugs, whereas smoking caused an increase in S3 oscillations. The combined analysis of the magnitude of high-frequency heart rate and high-frequency heart period oscillations allows a refined assessment of heart rate autonomic modulation applicable to long-term ECG recordings, and offers new approaches to assessing the risk of sudden death, both in terms of underlying mechanisms and sensitivity.

  15. Flood frequency analysis for nonstationary annual peak records in an urban drainage basin

    Science.gov (United States)

    Villarini, G.; Smith, J.A.; Serinaldi, F.; Bales, J.; Bates, P.D.; Krajewski, W.F.

    2009-01-01

    Flood frequency analysis in urban watersheds is complicated by nonstationarities of annual peak records associated with land use change and evolving urban stormwater infrastructure. In this study, a framework for flood frequency analysis is developed based on the Generalized Additive Models for Location, Scale and Shape parameters (GAMLSS), a tool for modeling time series under nonstationary conditions. GAMLSS is applied to annual maximum peak discharge records for Little Sugar Creek, a highly urbanized watershed which drains the urban core of Charlotte, North Carolina. It is shown that GAMLSS is able to describe the variability in the mean and variance of the annual maximum peak discharge by modeling the parameters of the selected parametric distribution as a smooth function of time via cubic splines. Flood frequency analyses for Little Sugar Creek (at a drainage area of 110 km2) show that the maximum flow with a 0.01 annual probability (corresponding to the 100-year flood peak under stationary conditions) over the 83-year record has ranged from a minimum unit discharge of 2.1 m3 s-1 km-2 to a maximum of 5.1 m3 s-1 km-2. An alternative characterization can be made by examining the estimated return interval of the peak discharge that would have an annual exceedance probability of 0.01 under the assumption of stationarity (3.2 m3 s-1 km-2). Under nonstationary conditions, alternative definitions of return period should be adopted. Under the GAMLSS model, the return interval of an annual peak discharge of 3.2 m3 s-1 km-2 ranges from a maximum value of more than 5000 years in 1957 to a minimum value of almost 8 years for the present time (2007). The GAMLSS framework is also used to examine the links between population trends and flood frequency, as well as trends in annual maximum rainfall. These analyses are used to examine evolving flood frequency over future decades.
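
    For the stationary baseline against which the nonstationary estimates are compared (a fixed 0.01 annual exceedance probability quantile), the following is a hedged sketch of fitting a GEV distribution to annual peaks and reading off the 100-year return level with scipy (synthetic record, not the Little Sugar Creek data):

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic annual maximum unit discharges (m3 s-1 km-2) standing in for a gauge record.
annual_peaks = genextreme.rvs(c=-0.1, loc=2.0, scale=0.6, size=83, random_state=42)

# Fit a stationary GEV and compute the 100-year return level (AEP = 0.01).
c, loc, scale = genextreme.fit(annual_peaks)
q100 = genextreme.ppf(1 - 0.01, c, loc=loc, scale=scale)
print(f"100-year flood estimate: {q100:.2f} m3 s-1 km-2")

# The return period of a given discharge under the fitted stationary model:
x = 3.2
aep = 1 - genextreme.cdf(x, c, loc=loc, scale=scale)
print(f"return period of {x}: {1 / aep:.0f} years")
```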

  16. Flood frequency analysis for nonstationary annual peak records in an urban drainage basin

    Science.gov (United States)

    Villarini, Gabriele; Smith, James A.; Serinaldi, Francesco; Bales, Jerad; Bates, Paul D.; Krajewski, Witold F.

    2009-08-01

    Flood frequency analysis in urban watersheds is complicated by nonstationarities of annual peak records associated with land use change and evolving urban stormwater infrastructure. In this study, a framework for flood frequency analysis is developed based on the Generalized Additive Models for Location, Scale and Shape parameters (GAMLSS), a tool for modeling time series under nonstationary conditions. GAMLSS is applied to annual maximum peak discharge records for Little Sugar Creek, a highly urbanized watershed which drains the urban core of Charlotte, North Carolina. It is shown that GAMLSS is able to describe the variability in the mean and variance of the annual maximum peak discharge by modeling the parameters of the selected parametric distribution as a smooth function of time via cubic splines. Flood frequency analyses for Little Sugar Creek (at a drainage area of 110 km2) show that the maximum flow with a 0.01 annual probability (corresponding to the 100-year flood peak under stationary conditions) over the 83-year record has ranged from a minimum unit discharge of 2.1 m3 s-1 km-2 to a maximum of 5.1 m3 s-1 km-2. An alternative characterization can be made by examining the estimated return interval of the peak discharge that would have an annual exceedance probability of 0.01 under the assumption of stationarity (3.2 m3 s-1 km-2). Under nonstationary conditions, alternative definitions of return period should be adopted. Under the GAMLSS model, the return interval of an annual peak discharge of 3.2 m3 s-1 km-2 ranges from a maximum value of more than 5000 years in 1957 to a minimum value of almost 8 years for the present time (2007). The GAMLSS framework is also used to examine the links between population trends and flood frequency, as well as trends in annual maximum rainfall. These analyses are used to examine evolving flood frequency over future decades.

  17. Frequency analysis of cytotoxic T lymphocyte precursors in search for donors in bone marrow transplantation

    International Nuclear Information System (INIS)

    Cukrova, V.; Dolezalova, L.; Loudova, M.; Matejkova, E.; Korinkova, P.; Lukasova, M.; Stary, J.

    1995-01-01

    The usefulness of cytotoxic T lymphocyte precursor (CTLp) frequency analysis in the search for donors in bone marrow transplantation was studied. The frequency of anti-recipient CTLp was assessed by a limiting dilution assay in HLA-matched unrelated, HLA partially matched related, and HLA genotypically identical donors. The majority of the patients examined were affected with different hematological malignancies. Allo-reactive CTLp recognizing non-HLA gene products were not detected in the pre-transplant examination of two pairs of HLA-identical siblings. However, an increased incidence of allo-specific CTLp was identified in HLA-matched, mixed lymphocyte culture (MLC)-negative unrelated pairs. Thus, the CTLp assay revealed residual Class I incompatibilities that remained hidden in standard serotyping. In two matched unrelated pairs with high pre-transplant CTLp frequency, severe acute graft-versus-host disease developed after bone marrow transplantation. Examination of other relatives of patients lacking an HLA-identical sibling showed the importance of Class I incompatibility for CTLp generation as well. The lack of correlation between CTLp frequency and HLA-D disparity suggests that Class II antigens do not participate in CTLp induction. With one exception, we had good correlation between MLC and DNA analysis of Class II antigens, demonstrating that MLC gives interpretable results even in unrelated pairs. Our results demonstrate the significance of the CTLp frequency assay in detecting residual Class I incompatibilities in matched unrelated pairs and in assessing Class I compatibility in related pairs. Therefore, it should be used in the final selection of bone marrow transplantation donors. (author)

  18. Identification of Uvaria sp by barcoding coupled with high-resolution melting analysis (Bar-HRM).

    Science.gov (United States)

    Osathanunkul, M; Madesis, P; Ounjai, S; Pumiputavon, K; Somboonchai, R; Lithanatudom, P; Chaowasku, T; Wipasa, J; Suwannapoom, C

    2016-01-13

    DNA barcoding, which was developed about a decade ago, relies on short, standardized regions of the genome to identify plant and animal species. This method can be used not only to identify known species but also to discover novel ones. Numerous sequences are stored in online databases worldwide. One way to save cost and time (by omitting the sequencing step) in species identification is to use available barcode data to design optimized primers for further analysis, such as high-resolution melting analysis (HRM). This study aimed to determine the effectiveness of the hybrid method Bar-HRM (DNA barcoding combined with HRM) in identifying species that share similar external morphological features, rather than conducting traditional taxonomic identification, which requires major parts (leaf, flower, fruit) of the specimens. The specimens used for testing were those which could not be identified at the species level and could be either Uvaria longipes or Uvaria wrayi, as indicated by morphological identification. Primer pairs derived from chloroplast regions (matK, psbA-trnH, rbcL, and trnL) were used in the Bar-HRM analysis. The results obtained from the psbA-trnH primers were good enough to help in identifying the specimen, while the rest were not. Bar-HRM analysis was shown to be a fast and cost-effective method for plant species identification.

  19. Study of resolution enhancement methods for impurities quantitative analysis in uranium compounds by XRF

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Clayton P.; Salvador, Vera L.R.; Cotrim, Marycel E.B.; Pires, Maria Ap. F.; Scapin, Marcos A., E-mail: clayton.pereira.silva@usp.b [Instituto de Pesquisas Energeticas e Nucleares (CQMA/IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro de Quimica e Meio Ambiente

    2011-07-01

    X-ray fluorescence analysis is a technique widely used for the determination of both major and trace elements; it relies on the interaction between the sample and the radiation and allows direct and nondestructive analysis. However, in uranium matrices such analyses are inefficient because the characteristic emission lines of elements like S, Cl, Zn, Zr, Mo and others overlap with the characteristic emission lines of uranium. Thus, chemical procedures for the separation of uranium are needed to perform this sort of analysis. In this paper the deconvolution method was used to increase spectral resolution and correct the overlaps. The methodology was tested according to NBR ISO 17025 using a set of seven certified reference materials for impurities present in U3O8 (New Brunswick Laboratory - NBL). The results showed that this methodology allows quantitative determination of impurities such as Zn, Zr, Mo and others in uranium compounds. The detection limits were below 50 μg g-1 and the uncertainties were below 10% for the determined elements. (author)

  20. Improving Accuracy and Temporal Resolution of Learning Curve Estimation for within- and across-Session Analysis

    Science.gov (United States)

    Tabelow, Karsten; König, Reinhard; Polzehl, Jörg

    2016-01-01

    Estimation of learning curves is ubiquitously based on proportions of correct responses within moving trial windows. Thereby, it is tacitly assumed that learning performance is constant within the moving windows, which, however, is often not the case. In the present study we demonstrate that violations of this assumption lead to systematic errors in the analysis of learning curves, and we explore the dependency of these errors on window size, the statistical model, and the learning phase. To reduce these errors in the analysis of single-subject data as well as on the population level, we propose adequate statistical methods for the estimation of learning curves and the construction of confidence intervals, trial by trial. Applied to data from an avoidance learning experiment with rodents, these methods revealed performance changes occurring at multiple time scales within and across training sessions which were otherwise obscured in the conventional analysis. Our work shows that proper assessment of the behavioral dynamics of learning at high temporal resolution can shed new light on specific learning processes and thus allows existing learning concepts to be refined. It further disambiguates the interpretation of neurophysiological signal changes recorded during training in relation to learning. PMID:27303809
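
    As a hedged sketch of the conventional moving-window estimate that the study critiques, with a per-window binomial (Jeffreys) confidence interval added (illustrative only; the trial-by-trial methods proposed in the paper are more involved):

```python
import numpy as np
from scipy.stats import beta

def moving_window_learning_curve(correct, window=20, alpha=0.05):
    """Proportion correct in a sliding trial window with Jeffreys binomial CIs."""
    correct = np.asarray(correct, dtype=float)
    n_trials = correct.size
    centers, p_hat, lo, hi = [], [], [], []
    for start in range(0, n_trials - window + 1):
        k = correct[start:start + window].sum()
        centers.append(start + window / 2)
        p_hat.append(k / window)
        # Jeffreys interval: quantiles of Beta(k + 0.5, n - k + 0.5)
        lo.append(beta.ppf(alpha / 2, k + 0.5, window - k + 0.5))
        hi.append(beta.ppf(1 - alpha / 2, k + 0.5, window - k + 0.5))
    return np.array(centers), np.array(p_hat), np.array(lo), np.array(hi)

# Synthetic learner: probability of a correct response rises across 200 trials.
rng = np.random.default_rng(7)
p_true = 1 / (1 + np.exp(-(np.arange(200) - 80) / 25))
responses = rng.random(200) < p_true
centers, p_hat, lo, hi = moving_window_learning_curve(responses)
print(p_hat[:3], p_hat[-3:])
```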