WorldWideScience

Sample records for filter based image

  1. Wiener discrete cosine transform-based image filtering

    Science.gov (United States)

    Pogrebnyak, Oleksiy; Lukin, Vladimir V.

    2012-10-01

    A classical problem of additive white (spatially uncorrelated) Gaussian noise suppression in grayscale images is considered. The main attention is paid to discrete cosine transform (DCT)-based denoising, in particular, to image processing in blocks of limited size. The efficiency of DCT-based image filtering with hard thresholding is studied for different sizes of overlapped blocks. A multiscale approach that aggregates the outputs of DCT filters with different overlapped block sizes is proposed. A two-stage denoising procedure is then proposed and tested, which applies multiscale DCT-based filtering with hard thresholding at the first stage and multiscale Wiener DCT-based filtering at the second stage. The efficiency of the proposed multiscale DCT-based filtering is compared to that of the state-of-the-art block-matching and three-dimensional (BM3D) filter. Next, the potentially achievable multiscale filtering efficiency in terms of output mean square error (MSE) is studied. The obtained results are of the same order as those of Chatterjee's approach based on nonlocal patch processing. It is shown that the potential of the ideal Wiener DCT-based filter is usually higher when the noise variance is high.
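
    The first-stage hard-thresholding step can be sketched as follows. This is a minimal single-block-size version (no multiscale aggregation and no Wiener stage), and the threshold factor of 2.7σ is a common rule of thumb for AWGN, not a parameter taken from the paper:

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_denoise_block(block, sigma, beta=2.7):
    """Hard-threshold the DCT coefficients of one block: coefficients
    smaller in magnitude than beta*sigma are zeroed (DC term kept)."""
    coeffs = dctn(block, norm="ortho")
    dc = coeffs[0, 0]
    coeffs[np.abs(coeffs) < beta * sigma] = 0.0
    coeffs[0, 0] = dc  # never threshold the block mean
    return idctn(coeffs, norm="ortho")

def dct_denoise(image, sigma, block=8):
    """Denoise with fully overlapped blocks, averaging the estimates."""
    h, w = image.shape
    acc = np.zeros((h, w), dtype=float)
    cnt = np.zeros((h, w), dtype=float)
    for i in range(h - block + 1):
        for j in range(w - block + 1):
            acc[i:i+block, j:j+block] += dct_denoise_block(
                image[i:i+block, j:j+block].astype(float), sigma)
            cnt[i:i+block, j:j+block] += 1.0
    return acc / cnt
```

Averaging over all overlapped block positions is what makes sliding-window DCT denoising competitive; the paper's multiscale variant additionally mixes several block sizes.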

  2. A Digital Image Denoising Algorithm Based on Gaussian Filtering and Bilateral Filtering

    Directory of Open Access Journals (Sweden)

    Piao Weiying

    2018-01-01

    Full Text Available Bilateral filtering has been widely applied in digital image processing, but in high-gradient regions of an image it may produce a staircase effect. Bilateral filtering can be regarded as a particular form of local mode filtering. Based on this analysis, a mixed image denoising algorithm combining Gaussian filtering and bilateral filtering is proposed. First, a Gaussian filter is applied to the noisy image to obtain a reference image; then both the reference image and the noisy image are taken as inputs to the range kernel of the bilateral filter. The reference image provides the image's low-frequency information, while the noisy image provides its high-frequency information. Comparative experiments between the proposed method and traditional bilateral filtering show that the mixed denoising algorithm effectively overcomes the staircase effect: the filtered image is smoother, its textural features are closer to those of the original image, and it achieves a higher PSNR value, while the computational cost of the two algorithms is essentially the same.
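
    A minimal sketch of the idea, assuming grayscale images and naive loops (the paper works on color images with an optimized implementation): the range weights of the bilateral filter are computed from a Gaussian-smoothed reference image rather than from the noisy image itself.

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian smoothing with reflective borders."""
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    k /= k.sum()
    pad = np.pad(img.astype(float), r, mode="reflect")
    tmp = np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 0, pad)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 1, tmp)

def joint_bilateral(noisy, reference, radius=3, sigma_s=2.0, sigma_r=20.0):
    """Bilateral filter whose range weights come from `reference`
    (here intended to be a Gaussian-smoothed copy of the noisy image)."""
    h, w = noisy.shape
    out = np.zeros((h, w), dtype=float)
    ys, xs = np.mgrid[-radius:radius+1, -radius:radius+1]
    spatial = np.exp(-(xs**2 + ys**2) / (2.0 * sigma_s**2))
    pad_n = np.pad(noisy.astype(float), radius, mode="reflect")
    pad_r = np.pad(reference.astype(float), radius, mode="reflect")
    for i in range(h):
        for j in range(w):
            win_n = pad_n[i:i+2*radius+1, j:j+2*radius+1]
            win_r = pad_r[i:i+2*radius+1, j:j+2*radius+1]
            rng_w = np.exp(-(win_r - reference[i, j])**2 / (2.0 * sigma_r**2))
            wgt = spatial * rng_w
            out[i, j] = (wgt * win_n).sum() / wgt.sum()
    return out
```

Because the reference is pre-smoothed, the range kernel no longer reacts to per-pixel noise spikes, which is what suppresses the staircase effect described above.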

  3. Independent component analysis based filtering for penumbral imaging

    International Nuclear Information System (INIS)

    Chen Yenwei; Han Xianhua; Nozaki, Shinya

    2004-01-01

    We propose a filter based on independent component analysis (ICA) for Poisson noise reduction. In the proposed filtering, the image is first transformed to the ICA domain, and the noise components are then removed by soft thresholding (shrinkage). The proposed filter, used as a preprocessing step for reconstruction, has been successfully applied to penumbral imaging. Both simulation and experimental results show that the reconstructed image is dramatically improved in comparison to that obtained without the noise-removing filter.
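
    The shrinkage step can be illustrated as follows; the orthonormal matrix here is a random stand-in for a learned ICA basis, since the actual basis estimation is outside the scope of this sketch.

```python
import numpy as np

def soft_threshold(c, t):
    """Shrinkage: zero coefficients with |c| < t, pull the rest toward 0 by t."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

# Stand-in for a learned ICA basis: any orthonormal matrix works for the demo.
rng = np.random.default_rng(1)
W, _ = np.linalg.qr(rng.normal(size=(16, 16)))

def transform_domain_denoise(patch_vec, t):
    """Transform to the (stand-in) ICA domain, shrink, transform back."""
    return W @ soft_threshold(W.T @ patch_vec, t)
```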

  4. Image Recommendation Algorithm Using Feature-Based Collaborative Filtering

    Science.gov (United States)

    Kim, Deok-Hwan

    As the multimedia contents market continues its rapid expansion, the amount of image content used in mobile phone services, digital libraries, and catalog services is increasing remarkably. In spite of this rapid growth, users experience high levels of frustration when searching for a desired image. Even though new images are profitable to service providers, traditional collaborative filtering methods cannot recommend them. To solve this problem, we propose a feature-based collaborative filtering (FBCF) method that reflects the user's most recent preference by representing his purchase sequence in the visual feature space. The proposed approach represents previously purchased images as feature clusters in a multi-dimensional feature space and then selects neighbors by using an inter-cluster distance function between their feature clusters. Various experiments using real image data demonstrate that the proposed approach provides higher-quality recommendations and better performance than typical collaborative filtering and content-based filtering techniques.

  5. Detail Enhancement for Infrared Images Based on Propagated Image Filter

    Directory of Open Access Journals (Sweden)

    Yishu Peng

    2016-01-01

    Full Text Available For displaying high-dynamic-range images acquired by thermal camera systems, 14-bit raw infrared data must be mapped into 8-bit gray values. This paper presents a new method for detail enhancement of infrared images that displays the image with satisfactory contrast and brightness, rich detail information, and no processing artifacts. We first adopt a propagated image filter to smooth the input image and separate it into a base layer and a detail layer. Then, we refine the base layer by using modified histogram projection for compression. Meanwhile, adaptive weights derived from the layer decomposition are used as strict gain control for the detail layer. The final display result is obtained by recombining the two modified layers. Experimental results on both cooled and uncooled infrared data verify that the proposed method outperforms methods based on log-power histogram modification and bilateral-filter-based detail enhancement in both detail enhancement and visual effect.

  6. Gabor filter based fingerprint image enhancement

    Science.gov (United States)

    Wang, Jin-Xiang

    2013-03-01

    Fingerprint recognition has become the most reliable biometric technology owing to the uniqueness and invariance of fingerprints, making it one of the most convenient and reliable techniques for personal authentication. The development of automated fingerprint identification systems is an urgent need for modern information security, and the fingerprint preprocessing algorithm plays an important part in such systems. This article introduces the general steps in fingerprint recognition, namely image input, preprocessing, feature recognition, and fingerprint image enhancement. As the key to fingerprint identification, image enhancement affects the accuracy of the system. The article focuses on the characteristics of the fingerprint image, the Gabor filter algorithm for fingerprint image enhancement, the theoretical basis of Gabor filters, and a demonstration of the filter. The enhancement algorithm is demonstrated on the Windows XP platform with MATLAB 6.5 as the development tool. The result shows that the Gabor filter is effective for fingerprint image enhancement.
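
    An even-symmetric Gabor kernel of the kind used for ridge enhancement might be generated as below; the parameter names and defaults are illustrative, not taken from the article.

```python
import numpy as np

def gabor_kernel(ksize, theta, freq, sigma):
    """Real even-symmetric Gabor kernel tuned to ridge orientation
    theta (radians) and ridge frequency freq (cycles/pixel)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half+1, -half:half+1]
    # rotate coordinates so x' runs across the ridges
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * freq * xr)
    k = envelope * carrier
    return k - k.mean()  # zero-mean so flat regions map to zero response
```

In fingerprint enhancement, one such kernel is typically built per local orientation/frequency estimate and convolved with the corresponding image block.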

  7. Image defog algorithm based on open close filter and gradient domain recursive bilateral filter

    Science.gov (United States)

    Liu, Daqian; Liu, Wanjun; Zhao, Qingguo; Fei, Bowen

    2017-11-01

    To solve the problems of fuzzy details, color distortion, and low brightness in images produced by the dark channel prior defog algorithm, an image defog algorithm based on an open-close filter and a gradient domain recursive bilateral filter, referred to as OCRBF, is put forward. The OCRBF algorithm first uses a weighted quadtree to obtain a more accurate global atmospheric value, then applies multiple-structure-element morphological open and close filters to the minimum channel map to obtain a rough scattering map via the dark channel prior. It uses a variogram to correct the transmittance map and a gradient domain recursive bilateral filter for smoothing, then recovers the image through the image degradation model, and finally applies contrast adjustment to obtain a bright, clear, fog-free image. Extensive experimental results show that the proposed defog method removes fog well and recovers the color and definition of foggy images containing close-range objects, perspective views, and bright areas; compared with other defog algorithms, it obtains clearer and more natural fog-free images with more visible details. Moreover, the time complexity of the SIDA algorithm is linear in the number of image pixels.
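
    A minimal sketch of the dark-channel and coarse transmission computation this algorithm builds on, using a plain square min-filter in place of the paper's multiple-structure-element morphological open-close filters (function names and defaults are illustrative):

```python
import numpy as np

def dark_channel(img, patch=15):
    """Per-pixel minimum over RGB, then a local minimum filter.
    img: float array (H, W, 3), values in [0, 1]."""
    mins = img.min(axis=2)
    r = patch // 2
    padded = np.pad(mins, r, mode="edge")
    h, w = mins.shape
    out = np.empty_like(mins)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i+patch, j:j+patch].min()
    return out

def transmission(img, airlight, omega=0.95, patch=15):
    """Coarse transmission map t = 1 - omega * darkchannel(I / A)."""
    return 1.0 - omega * dark_channel(img / airlight, patch)
```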

  8. Pleasant/Unpleasant Filtering for Affective Image Retrieval Based on Cross-Correlation of EEG Features

    Directory of Open Access Journals (Sweden)

    Keranmu Xielifuguli

    2014-01-01

    Full Text Available People often make decisions based on sensitivity rather than rationality. In the field of biological information processing, methods are available for analyzing biological information directly from the electroencephalogram (EEG) to determine the pleasant/unpleasant reactions of users. In this study, we propose a sensitivity filtering technique for discriminating preferences (pleasant/unpleasant) for images using a sensitivity image filtering system based on EEG. Using a set of images retrieved by similarity retrieval, we perform sensitivity-based pleasant/unpleasant classification of images based on affective features extracted from the images with the maximum entropy method (MEM). In the present study, the affective features comprised cross-correlation features obtained from EEGs recorded while an individual observed an image. However, it is difficult to measure the EEG when a subject views an unknown image. Thus, we propose a solution in which a linear regression method based on canonical correlation is used to estimate the cross-correlation features from image features. Experiments were conducted to evaluate the validity of sensitivity filtering compared with image similarity retrieval methods based on image features. We found that sensitivity filtering using color correlograms was suitable for classifying preferred images, while sensitivity filtering using local binary patterns was suitable for classifying unpleasant images. Moreover, sensitivity filtering using local binary patterns for unpleasant images had a 90% success rate. Thus, we conclude that the proposed method is efficient for filtering unpleasant images.

  9. Infrared image background modeling based on improved Susan filtering

    Science.gov (United States)

    Yuehua, Xia

    2018-02-01

    When the SUSAN filter is used to model the infrared image background, its Gaussian kernel lacks directional filtering ability: after filtering, edge information is poorly preserved, leaving many edge singular points in the difference image and increasing the difficulty of target detection. To solve these problems, anisotropy is introduced in this paper, and an anisotropic Gaussian filter replaces the Gaussian kernel in the SUSAN filter operator. First, an anisotropic gradient operator is used to compute the horizontal and vertical gradients at each pixel, determining the direction of the filter's long axis. Second, the smoothness of the pixel's local area and neighborhood is used to compute the variances along the filter's long and short axes. Then, the first-order norm of the difference between the local gray levels and their mean determines the threshold of the SUSAN filter. Finally, the constructed SUSAN filter is convolved with the image to obtain the background image, and the difference between the background image and the original image is computed. Background modeling performance on infrared images is evaluated by mean squared error (MSE), structural similarity (SSIM), and local signal-to-noise ratio gain (GSNR). Compared with the traditional filtering algorithm, the improved SUSAN filter achieves better background modeling: it effectively preserves edge information in the image, and dim small targets are effectively enhanced in the difference image, greatly reducing the false alarm rate.

  10. The singular value filter: a general filter design strategy for PCA-based signal separation in medical ultrasound imaging.

    Science.gov (United States)

    Mauldin, F William; Lin, Dan; Hossack, John A

    2011-11-01

    A general filtering method, called the singular value filter (SVF), is presented as a framework for principal component analysis (PCA)-based filter design in medical ultrasound imaging. The SVF approach operates by projecting the original data onto a new set of bases determined from PCA using singular value decomposition (SVD). The shape of the SVF weighting function, which relates the singular value spectrum of the input data to the filtering coefficients assigned to each basis function, is designed in accordance with a signal model and statistical assumptions regarding the underlying source signals. In this paper, we applied SVF to the specific application of clutter artifact rejection in diagnostic ultrasound imaging. SVF was compared to a conventional PCA-based filtering technique, which we refer to as the blind source separation (BSS) method, as well as a simple frequency-based finite impulse response (FIR) filter used as a baseline for comparison. The performance of each filter was quantified in simulated lesion images as well as experimental cardiac ultrasound data. SVF was demonstrated in both simulation and experimental results, over a wide range of imaging conditions, to outperform the BSS and FIR filtering methods in terms of contrast-to-noise ratio (CNR) and motion tracking performance. In experimental mouse heart data, SVF provided excellent artifact suppression, with an average CNR improvement of 1.8 dB and an over 40% reduction in displacement tracking error. It was further demonstrated from simulation and experimental results that SVF provided superior clutter rejection, as reflected in larger CNR values, when filtering was performed using complex pulse-echo received data and non-binary filter coefficients.
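
    The core SVF operation, projecting onto the SVD basis and reweighting each component, might look like this; the binary example weighting is a crude stand-in for the model-derived weighting function the paper designs.

```python
import numpy as np

def svd_filter(data, weights):
    """Project data onto its SVD basis, scale each singular value by the
    corresponding filter weight, and reconstruct."""
    U, s, Vt = np.linalg.svd(data, full_matrices=False)
    return (U * (weights(s) * s)) @ Vt

# Crude example weighting: drop the dominant component, which in clutter
# rejection often corresponds to slowly varying, high-energy tissue signal.
drop_first = lambda s: np.concatenate(([0.0], np.ones(s.size - 1)))
```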

  11. Convergent Filter Bases

    Directory of Open Access Journals (Sweden)

    Coghetto Roland

    2015-09-01

    Full Text Available We are inspired by the work of Henri Cartan [16], Bourbaki [10] (TG. I Filtres), and Claude Wagschal [34]. We define the base of a filter, the image filter, convergent filter bases, the limit filter, and the filter base of tails (fr: filtre des sections).

  12. Fuzzy Logic-Based Filter for Removing Additive and Impulsive Noise from Color Images

    Science.gov (United States)

    Zhu, Yuhong; Li, Hongyang; Jiang, Huageng

    2017-12-01

    This paper presents an efficient filtering method based on fuzzy logic for adaptively removing additive and impulsive noise from color images. The proposed filter comprises two parts: noise detection and noise removal. In the detection part, the fuzzy peer group concept is applied to determine what type of noise affects each pixel of the corrupted image. In the filtering part, impulse noise is removed by a vector median filter in the CIELAB color space, and an optimal fuzzy filter is introduced to reduce Gaussian noise; together they remove mixed Gaussian-impulse noise from color images. Experimental results on several color images prove the efficacy of the proposed fuzzy filter.
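
    The vector median step can be sketched as follows; for simplicity this operates on raw color vectors rather than in the CIELAB space the paper uses.

```python
import numpy as np

def vector_median(window_pixels):
    """Vector median: the pixel minimizing the summed Euclidean distance
    to all other pixels in the window. window_pixels: array (N, 3)."""
    d = np.linalg.norm(window_pixels[:, None, :] - window_pixels[None, :, :],
                       axis=2)
    return window_pixels[d.sum(axis=1).argmin()]
```

Unlike a channel-wise median, the vector median always returns one of the actual input colors, which avoids introducing colors not present in the window.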

  13. Detail-enhanced multimodality medical image fusion based on gradient minimization smoothing filter and shearing filter.

    Science.gov (United States)

    Liu, Xingbin; Mei, Wenbo; Du, Huiqian

    2018-02-13

    In this paper, a detail-enhanced multimodality medical image fusion algorithm is proposed using a proposed multi-scale joint decomposition framework (MJDF) and a shearing filter (SF). The MJDF, constructed with a gradient minimization smoothing filter (GMSF) and a Gaussian low-pass filter (GLF), decomposes the source images into low-pass layers, edge layers, and detail layers at multiple scales. To highlight detail information in the fused image, the edge layer and the detail layer at each scale are combined with weights into a detail-enhanced layer. As a directional filter is effective in capturing salient information, the SF is applied to the detail-enhanced layer to extract geometrical features and obtain directional coefficients. A visual-saliency-map-based fusion rule is designed for fusing the low-pass layers, and the sum of standard deviations is used as the activity level measurement for fusing the directional coefficients. The final fusion result is obtained by synthesizing the fused low-pass layers and directional coefficients. Experimental results show that the proposed method, with shift invariance, directional selectivity, and a detail-enhancing property, is efficient in preserving and enhancing the detail information of multimodality medical images. Graphical abstract: the detailed implementation of the proposed medical image fusion algorithm.

  14. Slice image pretreatment for cone-beam computed tomography based on adaptive filter

    International Nuclear Information System (INIS)

    Huang Kuidong; Zhang Dinghua; Jin Yanfang

    2009-01-01

    According to the noise properties and serial slice image characteristics of a cone-beam computed tomography (CBCT) system, a slice image pretreatment for CBCT based on adaptive filtering is proposed. A criterion for judging the noise type is established first, and all pixels are classified into two classes: an adaptive center-weighted modified trimmed mean (ACWMTM) filter is used for pixels corrupted by Gaussian noise, and an adaptive median (AM) filter is used for pixels corrupted by impulse noise. In the ACWMTM filtering algorithm, the Gaussian noise standard deviation estimated in the current slice image with an offset window is replaced by the standard deviation estimated in the corresponding window of the adjacent slice, improving the filtering accuracy for serial images. A pretreatment experiment on CBCT slice images of a wax model of a hollow turbine blade shows that the method performs well both in eliminating noise and in protecting details. (authors)
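
    A simplified sketch of the classify-then-filter idea, with a plain median/mean pair standing in for the paper's AM and ACWMTM filters and a fixed deviation threshold standing in for its judging criterion (all parameters illustrative):

```python
import numpy as np

def classify_and_filter(img, impulse_thresh=50.0, win=3):
    """Label each pixel impulse-like vs. Gaussian-like by its deviation
    from the local median, then apply a median filter to impulse pixels
    and a mean filter to the rest."""
    r = win // 2
    pad = np.pad(img.astype(float), r, mode="reflect")
    h, w = img.shape
    out = np.empty((h, w), dtype=float)
    for i in range(h):
        for j in range(w):
            window = pad[i:i+win, j:j+win]
            med = np.median(window)
            if abs(img[i, j] - med) > impulse_thresh:  # impulse-like
                out[i, j] = med
            else:                                      # Gaussian-like
                out[i, j] = window.mean()
    return out
```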

  15. Exploring an optimal wavelet-based filter for cryo-ET imaging.

    Science.gov (United States)

    Huang, Xinrui; Li, Sha; Gao, Song

    2018-02-07

    Cryo-electron tomography (cryo-ET) is one of the most advanced technologies for the in situ visualization of molecular machines by producing three-dimensional (3D) biological structures. However, cryo-ET imaging has two serious disadvantages, low dose and low image contrast, which result in high-resolution information being obscured by noise and image quality being degraded, causing errors in biological interpretation. The purpose of this research is to explore an optimal wavelet denoising technique to reduce noise in cryo-ET images. We perform tests using simulation data and design a filter using the optimum selected wavelet parameters (three-level decomposition, level-1 coefficients zeroed out, subband-dependent thresholds, soft thresholding, and a spline-based discrete dyadic wavelet transform (DDWT)), which we call a modified wavelet shrinkage filter; this filter is suitable for noisy cryo-ET data. When tested on real cryo-ET experimental data, the modified wavelet shrinkage filter yields higher-quality images and more accurate measures of biological structure than conventional processing. Because the proposed method provides an inherent advantage when dealing with cryo-ET images, it can extend the current state of the art in assisting all aspects of cryo-ET studies: visualization, reconstruction, structural analysis, and interpretation.

  16. DESIGN OF DYADIC-INTEGER-COEFFICIENTS BASED BI-ORTHOGONAL WAVELET FILTERS FOR IMAGE SUPER-RESOLUTION USING SUB-PIXEL IMAGE REGISTRATION

    Directory of Open Access Journals (Sweden)

    P.B. Chopade

    2014-05-01

    Full Text Available This paper presents an image super-resolution scheme based on sub-pixel image registration using a specific class of dyadic-integer-coefficient wavelet filters derived from the construction of a half-band polynomial. First, the integer-coefficient half-band polynomial is designed by the splitting approach. Next, this half-band polynomial is factorized and assigned a specific number of vanishing moments and roots to obtain the dyadic-integer-coefficient low-pass analysis and synthesis filters. The applicability of these wavelet filters is explored in image super-resolution using sub-pixel image registration. Two low-resolution frames, registered at a specific shift from one another, are used to restore the resolution lost by the camera's CCD array. The discrete wavelet transform (DWT) obtained from the designed coefficients is applied to these two low-resolution images to obtain the high-resolution image. The developed approach is validated by comparing its quality metrics with existing filter banks.

  17. MR image reconstruction via guided filter.

    Science.gov (United States)

    Huang, Heyan; Yang, Hang; Wang, Kang

    2018-04-01

    Magnetic resonance imaging (MRI) reconstruction from the smallest possible set of Fourier samples has been a difficult problem in the medical imaging field. In this paper, we present a new approach to efficient MRI recovery based on a guided filter. The guided filter is an edge-preserving smoothing operator that behaves better near edges than the bilateral filter. Our reconstruction method consists of two steps. First, we propose two cost functions that can be computed efficiently, yielding two different images. Second, the guided filter is applied to these two images for efficient edge-preserving filtering, with one image used as the guidance image and the other as the filtering input. By introducing the guided filter, our reconstruction algorithm recovers more detail. We compare our algorithm with several competitive MRI reconstruction techniques in terms of PSNR and visual quality. Simulation results are given to show the performance of the new method.
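
    A standard guided filter (in the style of He et al.) can be implemented in a few lines; this is a generic grayscale sketch, not the paper's reconstruction pipeline.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(I, p, r=4, eps=1e-3):
    """Guided filter: the output is locally a linear transform of the
    guidance image I; p is the image being filtered."""
    size = 2 * r + 1
    mean_I = uniform_filter(I, size)
    mean_p = uniform_filter(p, size)
    corr_Ip = uniform_filter(I * p, size)
    corr_II = uniform_filter(I * I, size)
    var_I = corr_II - mean_I**2
    cov_Ip = corr_Ip - mean_I * mean_p
    a = cov_Ip / (var_I + eps)          # per-window linear coefficients
    b = mean_p - a * mean_I
    mean_a = uniform_filter(a, size)    # average overlapping windows
    mean_b = uniform_filter(b, size)
    return mean_a * I + mean_b
```

The regularizer `eps` controls how strongly flat regions are smoothed: large `eps` pushes `a` toward zero (more smoothing), while high-variance edges in `I` keep `a` near one and are preserved.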

  18. Efficient OCT Image Enhancement Based on Collaborative Shock Filtering.

    Science.gov (United States)

    Liu, Guohua; Wang, Ziyu; Mu, Guoying; Li, Peijin

    2018-01-01

    Efficient enhancement of noisy optical coherence tomography (OCT) images is a key task for interpreting them correctly. In this paper, to better enhance the details and layered structures of human retina images, we propose collaborative shock filtering for OCT image denoising and enhancement. The noisy OCT image is first denoised by a collaborative filtering method with a new similarity measure, and the denoised image is then sharpened by shock-type filtering for edge and detail enhancement. For dim OCT images, a gamma transformation is first used to bring the images into a proper gray-level range, improving contrast for the detection of tiny lesions. The proposed method, integrating image smoothing and sharpening, obtains better visual results in experiments.

  19. Iris image recognition wavelet filter-banks based iris feature extraction schemes

    CERN Document Server

    Rahulkar, Amol D

    2014-01-01

    This book provides new results on wavelet-filter-bank-based feature extraction and classification in the field of iris image recognition. It gives a broad treatment of the design of separable and non-separable wavelet filter banks and of the classifier. The design techniques presented in the book are applied to iris image analysis for person authentication. The book also brings together three strands of research (wavelets, iris image analysis, and classifiers) and compares the performance of the presented techniques with state-of-the-art schemes. It contains a compilation of basic material on the design of wavelets that avoids the need to read many different books, providing an easier path for newcomers and researchers to master the contents. In addition, the designed filter banks and classifier can be used effectively in many signal processing applications, such as pattern classification, data compression, watermarking, and denoising.

  20. General filtering method for electronic speckle pattern interferometry fringe images with various densities based on variational image decomposition.

    Science.gov (United States)

    Li, Biyuan; Tang, Chen; Gao, Guannan; Chen, Mingming; Tang, Shuwei; Lei, Zhenkun

    2017-06-01

    Filtering speckle noise from a fringe image is one of the key tasks in electronic speckle pattern interferometry (ESPI). In general, ESPI fringe images can be divided into three categories: low-density, high-density, and variable-density fringe images. In this paper, we present a general filtering method based on variational image decomposition that can filter speckle noise from ESPI fringe images of any of these densities. In our method, a variable-density fringe image is decomposed into low-density fringes, high-density fringes, and noise; a low-density fringe image into low-density fringes and noise; and a high-density fringe image into high-density fringes and noise. We give suitable function spaces to describe low-density fringes, high-density fringes, and noise, respectively, then construct several models and numerical algorithms for ESPI fringe images of the various densities and investigate their performance through extensive experiments. Finally, we compare the proposed models with the windowed Fourier transform method and the coherence-enhancing diffusion partial differential equation filter, which may be the most effective filtering methods at present. Furthermore, we use the proposed method to filter a collection of experimentally obtained ESPI fringe images of poor quality. The experimental results demonstrate the performance of the proposed method.

  1. Imaging reconstruction based on improved wavelet denoising combined with parallel-beam filtered back-projection algorithm

    Science.gov (United States)

    Ren, Zhong; Liu, Guodong; Huang, Zhen

    2012-11-01

    Image reconstruction is a key step in medical imaging (MI), and the performance of the reconstruction algorithm determines the quality and resolution of the reconstructed image. Although other algorithms exist, filtered back-projection (FBP) is still the classical and most commonly used algorithm in clinical MI. In FBP, filtering the original projection data is a key step for suppressing artifacts in the reconstructed image. Since the simple use of classical filters such as the Shepp-Logan (SL) and Ram-Lak (RL) filters has drawbacks and limitations in practice, especially for projection data polluted by non-stationary random noise, an improved wavelet denoising method combined with the parallel-beam FBP algorithm is used in this paper to enhance the quality of the reconstructed image. In the experiments, the reconstruction results of the improved wavelet denoising approach were compared with those of direct FBP, mean filtering combined with FBP, and median filtering combined with FBP. To determine the optimum reconstruction, different algorithms and different wavelet bases combined with three filters were tested. Experimental results show that the reconstruction quality of the improved FBP algorithm is better than that of the others. Comparing the results of the different algorithms using two evaluation criteria, mean squared error (MSE) and peak signal-to-noise ratio (PSNR), the improved FBP based on the db2 wavelet and the Hanning filter at decomposition scale 2 performed best: its MSE was lower and its PSNR higher than the others. Therefore, this improved FBP algorithm has potential value in medical imaging.
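
    The frequency-domain filtering step of FBP with the Ram-Lak and Shepp-Logan filters can be sketched for a single parallel-beam projection as follows (the wavelet-denoising stage and the back-projection itself are omitted):

```python
import numpy as np

def ramlak_filter_projection(proj):
    """Apply the Ram-Lak (ramp) filter to one projection in the
    frequency domain: multiply the spectrum by |f|."""
    freqs = np.fft.fftfreq(proj.shape[-1])
    return np.real(np.fft.ifft(np.fft.fft(proj) * np.abs(freqs)))

def shepp_logan_filter_projection(proj):
    """Shepp-Logan variant: the ramp is smoothed by a sinc window,
    which damps high-frequency noise relative to Ram-Lak."""
    freqs = np.fft.fftfreq(proj.shape[-1])
    return np.real(np.fft.ifft(np.fft.fft(proj) * np.abs(freqs)
                               * np.sinc(freqs)))
```

The ramp's amplification of high frequencies is exactly why noisy projections benefit from denoising (wavelet or otherwise) before this step.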

  2. Passive ranging using a filter-based non-imaging method based on oxygen absorption.

    Science.gov (United States)

    Yu, Hao; Liu, Bingqi; Yan, Zongqun; Zhang, Yu

    2017-10-01

    To solve the problem of poor real-time performance caused by a hyperspectral imaging system, and to simplify the design of passive ranging based on the oxygen absorption spectrum, a filter-based non-imaging ranging method is proposed. In this method, three bandpass filters are used to obtain the source radiation intensities in the oxygen absorption band near 762 nm and in the band's left and right non-absorption shoulders, and a photomultiplier tube is used as the non-imaging sensor of the passive ranging system. Range is estimated by comparing the calculated values of the band-average transmission due to oxygen absorption, τO2, against the predicted curve of τO2 versus range. The method is tested under short-range conditions: an accuracy of 6.5% is achieved with the designed experimental ranging system at a range of 400 m.

  3. Color image guided depth image super resolution using fusion filter

    Science.gov (United States)

    He, Jin; Liang, Bin; He, Ying; Yang, Jun

    2018-04-01

    Depth cameras currently play an important role in many areas. However, most of them can only obtain low-resolution (LR) depth images, while color cameras can easily provide high-resolution (HR) color images. Using a color image as a guide is an efficient way to obtain an HR depth image. In this paper, we propose a depth image super-resolution (SR) algorithm that takes an HR color image as the guide and an LR depth image as input. We use a fusion of the guided filter and an edge-based joint bilateral filter to obtain the HR depth image. Our experimental results on the Middlebury 2005 datasets show that our method provides better-quality HR depth images both numerically and visually.

  4. HDR Pathological Image Enhancement Based on Improved Bias Field Correction and Guided Image Filter

    Directory of Open Access Journals (Sweden)

    Qingjiao Sun

    2016-01-01

    Full Text Available Pathological image enhancement is a significant topic in the field of pathological image processing. This paper proposes a high-dynamic-range (HDR) pathological image enhancement method based on improved bias field correction and the guided image filter (GIF). First, preprocessing including stain normalization and wavelet denoising is performed on the haematoxylin and eosin (H and E) stained pathological image. Then, an improved bias field correction model is developed to enhance the influence of light on the high-frequency part of the image and to correct its intensity inhomogeneity and detail discontinuity. Next, the HDR pathological image is generated from the low-dynamic-range (LDR) image and the H and E channel images using a least squares method. Finally, the fine enhanced image is acquired after a detail enhancement process. Experiments with 140 pathological images demonstrate the performance advantages of the proposed method compared with related work.

  5. Evaluation of an image-based tracking workflow with Kalman filtering for automatic image plane alignment in interventional MRI.

    Science.gov (United States)

    Neumann, M; Cuvillon, L; Breton, E; de Matheli, M

    2013-01-01

    Recently, a workflow for magnetic resonance (MR) image plane alignment based on tracking in real-time MR images was introduced. The workflow is based on a tracking device composed of two resonant micro-coils and a passive marker, and allows for tracking of the passive marker in clinical real-time images and automatic (re-)initialization using the micro-coils. As the Kalman filter has proven its benefit as an estimator and predictor, it is well suited for use in tracking applications. In this paper, a Kalman filter is integrated into the previously developed workflow in order to predict the position and orientation of the tracking device. The measurement noise covariances of the Kalman filter are changed dynamically to account for the fact that, depending on the image plane orientation, only a subset of the 3D pose components is available. The improved tracking performance of the Kalman-extended workflow was quantified in simulations. A first experiment in the MRI scanner was also performed, though without quantitative results yet.
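
A minimal 1-D constant-velocity Kalman filter illustrates the predict/update cycle such tracking relies on (a sketch, not the authors' implementation; the state model and noise settings are illustrative):

```python
import numpy as np

def kalman_track(measurements, dt=1.0, q=1e-3, r=0.25):
    """Track a 1-D position with state x = [position, velocity]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    H = np.array([[1.0, 0.0]])              # only position is measured
    Q = q * np.eye(2)                       # process noise covariance
    R = np.array([[r]])                     # measurement noise covariance
    x = np.array([[measurements[0]], [0.0]])
    P = np.eye(2)
    estimates = []
    for z in measurements:
        # Predict: propagate state and covariance through the motion model.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update: correct the prediction with the new measurement.
        y = np.array([[z]]) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        estimates.append(float(x[0, 0]))
    return estimates
```

The dynamic measurement-noise adaptation described in the abstract would correspond to changing `R` per step, inflating the entries of the pose components that the current image plane cannot observe.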

  6. Defogging of road images using gain coefficient-based trilateral filter

    Science.gov (United States)

    Singh, Dilbag; Kumar, Vijay

    2018-01-01

    Poor weather conditions are responsible for most road accidents year after year. Conditions such as fog degrade the visibility of objects, making it difficult for drivers to identify vehicles in a foggy environment. Dark channel prior (DCP)-based defogging techniques have been found to be an efficient way to remove fog from road images; however, they produce poor results when image objects are inherently similar to the airlight and no shadow is cast on them. To eliminate this problem, a modified restoration model based on the DCP is developed to remove fog from road images. The transmission map is further refined by a newly developed gain coefficient-based trilateral filter. The proposed technique thus removes fog from road images effectively. It is compared with seven well-known defogging techniques on two benchmark foggy image datasets and five real foggy images. The experimental results demonstrate that the proposed approach removes different types of fog from roadside images and significantly improves image visibility, and that the restored images have little or no artifacts.
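
The dark channel prior underlying such methods states that haze-free outdoor patches usually contain some pixel that is dark in at least one color channel, so transmission is commonly estimated as t(x) = 1 - omega * min_patch min_c (I_c / A_c). A NumPy sketch of this baseline estimate (illustrative; not the authors' modified restoration model):

```python
import numpy as np

def dark_channel(img, patch=3):
    """Per-pixel minimum over the color channels, followed by a
    patch x patch minimum filter."""
    h, w, _ = img.shape
    mins = img.min(axis=2)
    pad = patch // 2
    mp = np.pad(mins, pad, mode="edge")
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = mp[i:i + patch, j:j + patch].min()
    return out

def transmission(img, airlight, omega=0.95, patch=3):
    """DCP transmission estimate: t = 1 - omega * dark_channel(I / A)."""
    return 1.0 - omega * dark_channel(img / airlight, patch)
```

The trilateral refinement proposed in the paper would then smooth this raw transmission map while keeping it aligned with scene edges.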

  7. Brain MR Image Restoration Using an Automatic Trilateral Filter With GPU-Based Acceleration.

    Science.gov (United States)

    Chang, Herng-Hua; Li, Cheng-Yuan; Gallogly, Audrey Haihong

    2018-02-01

    Noise reduction in brain magnetic resonance (MR) images has been a challenging and demanding task. This study develops a new trilateral filter that aims to achieve robust and efficient image restoration. Extended from the bilateral filter, the proposed algorithm contains one additional intensity similarity function, which compensates for the unique characteristics of noise in brain MR images. An entropy function adaptive to intensity variations is introduced to regulate the contributions of the weighting components. To hasten the computation, parallel computing based on the graphics processing unit (GPU) strategy is explored with emphasis on memory allocations and thread distributions. To automate the filtration, image texture feature analysis associated with machine learning is investigated. Among the 98 candidate features, the sequential forward floating selection scheme is employed to acquire the optimal texture features for regularization. Subsequently, a two-stage classifier that consists of support vector machines and artificial neural networks is established to predict the filter parameters for automation. A speedup gain of 757 was reached in processing an entire MR image volume of 256 × 256 × 256 pixels, which completed within 0.5 s. Automatic restoration results revealed high accuracy with an ensemble average relative error of 0.53 ± 0.85% in terms of the peak signal-to-noise ratio. This self-regulating trilateral filter outperformed many state-of-the-art noise reduction methods both qualitatively and quantitatively. We believe that this new image restoration algorithm has potential in many brain MR image processing applications that require expedition and automation.
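
The bilateral filter that such methods extend combines a spatial closeness weight with an intensity similarity weight; a trilateral variant like the one described would multiply one further similarity term into the weights. A plain NumPy sketch of the bilateral base case (illustrative parameters, no GPU acceleration):

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Bilateral filter: weight = spatial closeness * intensity similarity.
    A trilateral extension would multiply one more similarity term
    into `wgt` below."""
    h, w = img.shape
    p = np.pad(img, radius, mode="edge")
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    out = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            win = p[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Pixels with intensities far from the center get tiny weights,
            # which is what preserves edges.
            rng_w = np.exp(-(win - img[i, j]) ** 2 / (2 * sigma_r**2))
            wgt = spatial * rng_w
            out[i, j] = (wgt * win).sum() / wgt.sum()
    return out
```

The per-pixel independence of this loop is also why a GPU implementation maps so naturally onto one thread per output pixel.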

  8. A vector Wiener filter for dual-radionuclide imaging

    International Nuclear Information System (INIS)

    Links, J.M.; Prince, J.L.; Gupta, S.N.

    1996-01-01

    The routine use of a single radionuclide for patient imaging in nuclear medicine can be complemented by studies employing two tracers to examine two different processes in a single organ, most frequently by simultaneous imaging of both radionuclides in two different energy windows. In addition, simultaneous transmission/emission imaging with dual radionuclides has been described, with one radionuclide used for the transmission study and a second for the emission study. There is thus currently considerable interest in dual-radionuclide imaging. A major problem with all dual-radionuclide imaging is the crosstalk between the two radionuclides. Such crosstalk frequently occurs because scattered radiation from the higher energy radionuclide is detected in the lower energy window, and because the lower energy radionuclide may have higher energy emissions which are detected in the higher energy window. The authors have previously described the use of Fourier-based restoration filtering in single photon emission computed tomography (SPECT) and positron emission tomography (PET) to improve quantitative accuracy by designing a Wiener or other Fourier filter to partially restore the loss of contrast due to scatter and finite spatial resolution effects. The authors describe here the derivation and initial validation of an extension of such filtering for dual-radionuclide imaging that simultaneously (1) improves contrast in each radionuclide's direct image, (2) reduces image noise, and (3) reduces the crosstalk contribution from the other radionuclide. This filter is based on a vector version of the Wiener filter, which is shown to be superior [in the minimum mean square error (MMSE) sense] to the sequential application of separate crosstalk and restoration filters.
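
In the scalar (single-radionuclide) case, frequency-domain Wiener-style restoration reduces to multiplying the image spectrum by H*/(|H|^2 + NSR); the vector filter described in the abstract generalizes this to a matrix acting on the stacked two-window spectra. A scalar NumPy sketch (illustrative, assuming a known PSF and a constant noise-to-signal power ratio):

```python
import numpy as np

def wiener_deconvolve(blurred, psf, nsr=0.01):
    """Frequency-domain Wiener restoration: G = H* / (|H|^2 + NSR),
    where H is the transfer function of the (assumed known) PSF and
    NSR is a constant noise-to-signal power ratio."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) * G))
```

As NSR grows, the filter smoothly trades restoration sharpness for noise suppression, which is the same balance the vector version strikes jointly across both energy windows.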

  9. Computationally efficient video restoration for Nyquist sampled imaging sensors combining an affine-motion-based temporal Kalman filter and adaptive Wiener filter.

    Science.gov (United States)

    Rucci, Michael; Hardie, Russell C; Barnard, Kenneth J

    2014-05-01

    In this paper, we present a computationally efficient video restoration algorithm to address both blur and noise for a Nyquist sampled imaging system. The proposed method utilizes a temporal Kalman filter followed by a correlation-model based spatial adaptive Wiener filter (AWF). The Kalman filter employs an affine background motion model and novel process-noise variance estimate. We also propose and demonstrate a new multidelay temporal Kalman filter designed to more robustly treat local motion. The AWF is a spatial operation that performs deconvolution and adapts to the spatially varying residual noise left in the Kalman filter stage. In image areas where the temporal Kalman filter is able to provide significant noise reduction, the AWF can be aggressive in its deconvolution. In other areas, where less noise reduction is achieved with the Kalman filter, the AWF balances the deconvolution with spatial noise reduction. In this way, the Kalman filter and AWF work together effectively, but without the computational burden of full joint spatiotemporal processing. We also propose a novel hybrid system that combines a temporal Kalman filter and BM3D processing. To illustrate the efficacy of the proposed methods, we test the algorithms on both simulated imagery and video collected with a visible camera.

  10. Fusion of multispectral and panchromatic images using multirate filter banks

    Institute of Scientific and Technical Information of China (English)

    Wang Hong; Jing Zhongliang; Li Jianxun

    2005-01-01

    In this paper, an image fusion method based on filter banks is proposed for merging a high-resolution panchromatic image and a low-resolution multispectral image. Firstly, the filter banks are designed to merge different signals with minimum distortion by using cosine modulation. Then, the filter-bank-based image fusion is applied to obtain a high-resolution multispectral image that combines the spectral characteristics of the low-resolution data with the spatial resolution of the panchromatic image. Finally, two different experiments and the corresponding performance analysis are presented. Experimental results indicate that the proposed approach outperforms the IHS transform, the discrete wavelet transform and the discrete wavelet frame.

  11. Cluster Based Vector Attribute Filtering

    NARCIS (Netherlands)

    Kiwanuka, Fred N.; Wilkinson, Michael H.F.

    2016-01-01

    Morphological attribute filters operate on images based on properties or attributes of connected components. Until recently, attribute filtering was based on a single global threshold on a scalar property to remove or retain objects. A single threshold struggles in case no single property or

  12. Image denoising using new pixon representation based on fuzzy filtering and partial differential equations

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Nikpour, Mohsen

    2012-01-01

    In this paper, we have proposed two extensions to pixon-based image modeling. The first one is using bicubic interpolation instead of bilinear interpolation and the second one is using fuzzy filtering method, aiming to improve the quality of the pixonal image. Finally, partial differential...

  13. Hyper-spectral modulation fluorescent imaging using double acousto-optical tunable filter based on TeO2-crystals

    International Nuclear Information System (INIS)

    Zaytsev, Kirill I; Perchik, Alexey V; Chernomyrdin, Nikita V; Yurchenko, Stanislav O; Kudrin, Konstantin G; Reshetov, Igor V

    2015-01-01

    We have proposed a method for hyper-spectral fluorescent imaging based on acousto-optical filtering. The object of interest was pumped using ultraviolet radiation from a mercury lamp equipped with a monochromatic excitation filter whose window of transparency is centered at 365 nm. A double TeO2-based acousto-optical filter, tunable from 430 to 780 nm and having a 2 nm bandwidth of spectral transparency, was used to detect quasi-monochromatic images of the object fluorescence. Modulation of the ultraviolet pump intensity was used to reduce the impact of non-fluorescent background on fluorescent imaging of the sample. A technique for signal-to-noise ratio improvement, based on estimating the fluorescence intensity via digital processing of the modulated video sequence of the fluorescent object, is introduced. We have applied the proposed technique to a test sample and discussed its possible applications.

  14. Directional Joint Bilateral Filter for Depth Images

    Directory of Open Access Journals (Sweden)

    Anh Vu Le

    2014-06-01

    Full Text Available Depth maps taken by the low cost Kinect sensor are often noisy and incomplete. Thus, post-processing for obtaining reliable depth maps is necessary for advanced image and video applications such as object recognition and multi-view rendering. In this paper, we propose adaptive directional filters that fill the holes and suppress the noise in depth maps. Specifically, novel filters whose window shapes are adaptively adjusted based on the edge direction of the color image are presented. Experimental results show that our method yields higher quality filtered depth maps than other existing methods, especially at the edge boundaries.

  15. An Improved Filtering Method for Quantum Color Image in Frequency Domain

    Science.gov (United States)

    Li, Panchi; Xiao, Hong

    2018-01-01

    In this paper we investigate the use of the quantum Fourier transform (QFT) in the field of image processing. We consider QFT-based color image filtering operations and their applications in image smoothing, sharpening, and selective filtering using quantum frequency domain filters. The underlying principle used for constructing the proposed quantum filters is to use the quantum Oracle to implement the filter function. Compared with existing methods, our method is not only suitable for color images, but also allows flexible design of notch filters. We provide the quantum circuit that implements the filtering task and present the results of several simulation experiments on color images. The major advantages of quantum frequency filtering lie in the exploitation of the efficient implementation of the quantum Fourier transform.

  16. Imaging spectrometer using a liquid crystal tunable filter

    Science.gov (United States)

    Chrien, Thomas G.; Chovit, Christopher; Miller, Peter J.

    1993-09-01

    A demonstration imaging spectrometer using a liquid crystal tunable filter (LCTF) was built and tested on a hot air balloon platform. The LCTF is a tunable polarization interference or Lyot filter. The LCTF enables a small, lightweight, low-power, band-sequential imaging spectrometer design. An overview of the prototype system is given along with a description of the balloon experiment results. System model performance predictions are given for a future LCTF-based imaging spectrometer design. System design considerations of LCTF imaging spectrometers are discussed.

  17. Restoration of nuclear medicine images using adaptive Wiener filters

    International Nuclear Information System (INIS)

    Meinel, G.

    1989-01-01

    An adaptive Wiener filter implementation for the restoration of nuclear medicine images is described. These images are considerably degraded both deterministically (loss of definition) and stochastically (Poisson quantum noise). After introduction of an image model, description of the necessary parameter approximations, and information on optimum design methods, the implementation is described. The filter operates adaptively with respect to the local signal-to-noise ratio and is based on a filter bank concept. To quantify the restoration effect, figures of merit are introduced and the filter is evaluated against them. (author)

  18. Two-dimensional filtering of SPECT images using the Metz and Wiener filters

    International Nuclear Information System (INIS)

    King, M.A.; Schwinger, R.B.; Penney, B.C.; Doherty, P.W.

    1984-01-01

    Presently, single photon emission computed tomographic (SPECT) images are usually reconstructed by arbitrarily selecting a one-dimensional "window" function for use in reconstruction. A better method would be to automatically choose among a family of two-dimensional image restoration filters in such a way as to produce "optimum" image quality. Two-dimensional image processing techniques offer the advantages of a larger statistical sampling of the data for better noise reduction, and two-dimensional image deconvolution to correct for blurring during acquisition. An investigation of two such "optimal" digital image restoration techniques (the count-dependent Metz filter and the Wiener filter) was made. They were applied both as two-dimensional "window" functions for preprocessing SPECT images, and for filtering reconstructed images. Their performance was compared by measuring image contrast and percent fractional standard deviation (% FSD) in multiple acquisitions of the Jaszczak SPECT phantom at two different count levels. A statistically significant increase in image contrast and decrease in % FSD was observed with these techniques when compared to the results of reconstruction with a ramp filter. The adaptability of the techniques was manifested in a smaller reduction in % FSD at the high count level coupled with a greater enhancement in image contrast. Using an array processor, processing time was 0.2 s per image for the Metz filter and 3 s for the Wiener filter. It is concluded that two-dimensional digital image restoration with these techniques can produce a significant increase in SPECT image quality.
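
The count-dependent Metz filter has the closed form M(f) = [1 - (1 - MTF(f)^2)^x] / MTF(f): at low frequencies it approximates inverse filtering (1/MTF), while at high frequencies it rolls off to suppress noise, with the order x tuned to the count level. A sketch using an assumed Gaussian MTF (illustrative values, not the study's system response):

```python
import numpy as np

def metz_filter(mtf, order=3.0):
    """Metz filter: amplifies mid frequencies where the MTF is moderate
    (restoration) and rolls off toward zero where the MTF vanishes
    (noise suppression). Higher `order` pushes restoration further."""
    mtf = np.asarray(mtf, dtype=float)
    return (1.0 - (1.0 - mtf**2) ** order) / mtf

f = np.linspace(0.01, 0.5, 50)      # spatial frequency (cycles/pixel)
mtf = np.exp(-(f / 0.2) ** 2)       # assumed Gaussian system MTF
m = metz_filter(mtf, order=3.0)
```

In a count-dependent scheme the order would be raised for high-count (low-noise) acquisitions and lowered for noisy ones, which matches the adaptability reported in the abstract.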

  19. Frequency Domain Image Filtering Using CUDA

    Directory of Open Access Journals (Sweden)

    Muhammad Awais Rajput

    2014-10-01

    Full Text Available In this paper, we investigate the implementation of image filtering in frequency domain using NVIDIA's CUDA (Compute Unified Device Architecture). In contrast to signal and image filtering in spatial domain, which uses convolution operations and hence is more compute-intensive for filters having larger spatial extent, the frequency domain filtering uses the FFT (Fast Fourier Transform), which is much faster and significantly reduces the computational complexity of the filtering. We implement the frequency domain filtering on CPU and GPU respectively and analyze the speed-up obtained from CUDA's parallel processing paradigm. In order to demonstrate the efficiency of frequency domain filtering on CUDA, we implement three frequency domain filters, i.e., Butterworth, low-pass and Gaussian, for processing different sizes of images on CPU and GPU respectively and perform the GPU vs. CPU benchmarks. The results presented in this paper show that the frequency domain filtering with CUDA achieves significant speed-up over the CPU processing in frequency domain with the same level of (output) image quality on both the processing architectures.
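
The frequency-domain pipeline in question is simply FFT, pointwise multiplication by the filter's transfer function, then inverse FFT, which is what makes it cheap for large kernels and easy to parallelize. A CPU-side NumPy sketch of the Gaussian case (illustrative cutoff; a CUDA version would perform the same steps with a GPU FFT library such as cuFFT):

```python
import numpy as np

def gaussian_lowpass(img, cutoff=0.1):
    """Frequency-domain Gaussian low-pass filtering:
    FFT -> multiply by H(u, v) = exp(-(u^2 + v^2) / (2 cutoff^2)) -> IFFT."""
    fy = np.fft.fftfreq(img.shape[0])[:, None]   # cycles/pixel, vertical
    fx = np.fft.fftfreq(img.shape[1])[None, :]   # cycles/pixel, horizontal
    H = np.exp(-(fx**2 + fy**2) / (2 * cutoff**2))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * H))
```

Note the cost is dominated by the two FFTs regardless of how large the equivalent spatial kernel is, which is the complexity advantage the paper benchmarks.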

  20. Frequency domain image filtering using cuda

    International Nuclear Information System (INIS)

    Rajput, M.A.; Khan, U.A.

    2014-01-01

    In this paper, we investigate the implementation of image filtering in frequency domain using NVIDIA's CUDA (Compute Unified Device Architecture). In contrast to signal and image filtering in spatial domain which uses convolution operations and hence is more compute-intensive for filters having larger spatial extent, the frequency domain filtering uses FFT (Fast Fourier Transform) which is much faster and significantly reduces the computational complexity of the filtering. We implement the frequency domain filtering on CPU and GPU respectively and analyze the speed-up obtained from CUDA's parallel processing paradigm. In order to demonstrate the efficiency of frequency domain filtering on CUDA, we implement three frequency domain filters, i.e., Butterworth, low-pass and Gaussian for processing different sizes of images on CPU and GPU respectively and perform the GPU vs. CPU benchmarks. The results presented in this paper show that the frequency domain filtering with CUDA achieves significant speed-up over the CPU processing in frequency domain with the same level of (output) image quality on both the processing architectures. (author)

  1. Vector Directional Distance Rational Hybrid Filters for Color Image Restoration

    Directory of Open Access Journals (Sweden)

    L. Khriji

    2005-12-01

    Full Text Available A new class of nonlinear filters, called vector-directional distance rational hybrid filters (VDDRHF), for multispectral image processing is introduced and applied to color image-filtering problems. These filters are based on rational functions (RF). The VDDRHF filter is a two-stage filter, which exploits the features of the vector directional distance filter (VDDF), the center weighted vector directional distance filter (CWVDDF) and those of the rational operator. The filter output is a result of a vector rational function (VRF) operating on the output of three sub-functions. Two vector directional distance (VDDF) filters and one center weighted vector directional distance filter (CWVDDF) are proposed to be used in the first stage due to their desirable properties, such as noise attenuation, chromaticity retention, and edge and detail preservation. Experimental results show that the new VDDRHF outperforms a number of widely known nonlinear filters for multispectral image processing, such as the vector median filter (VMF), the generalized vector directional filters (GVDF) and the distance directional filters (DDF), with respect to all criteria used.
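
The vector median filter used as a baseline here picks, for each window, the sample vector minimizing the summed distances to all other samples, so no new (false) colors are ever introduced. A NumPy sketch (illustrative; L2 distances, 3x3 windows):

```python
import numpy as np

def vector_median(window_vectors):
    """Return the sample whose summed L2 distance to all other samples
    in the window is smallest."""
    v = np.asarray(window_vectors, dtype=float)          # (n, channels)
    d = np.linalg.norm(v[:, None, :] - v[None, :, :], axis=2).sum(axis=1)
    return v[np.argmin(d)]

def vmf_filter(img, radius=1):
    """Apply the vector median over a sliding window of a color image."""
    h, w, c = img.shape
    p = np.pad(img, ((radius, radius), (radius, radius), (0, 0)),
               mode="edge")
    out = np.empty_like(img, dtype=float)
    for i in range(h):
        for j in range(w):
            win = p[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            out[i, j] = vector_median(win.reshape(-1, c))
    return out
```

Because the output is always one of the input vectors, an isolated color impulse is replaced by a neighboring sample rather than blended away.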

  2. Spatial filters for focusing ultrasound images

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Gori, Paola

    2001-01-01

    A new method for making spatial matched filter focusing of RF ultrasound data is proposed based on the spatial impulse response description of the imaging. The response from a scatterer at any given point in space relative to the transducer can be calculated, and this gives the spatial matched filter for synthetic aperture imaging for single element transducers. It is evaluated using the Field II program. Data from a single 3 MHz transducer focused at a distance of 80 mm is processed. Far from the transducer focal region, the processing greatly improves the image resolution: the lateral slice ..., but the approach always yields point spread functions better than or equal to a traditional dynamically focused image. Finally, the process was applied to in-vivo clinical images of the liver and right kidney from a 28-year-old male. The data was obtained with a single element transducer focused at 100 mm.

  3. Multiscale infrared and visible image fusion using gradient domain guided image filtering

    Science.gov (United States)

    Zhu, Jin; Jin, Weiqi; Li, Li; Han, Zhenghao; Wang, Xia

    2018-03-01

    For better surveillance with infrared and visible imaging, a novel hybrid multiscale decomposition fusion method using gradient domain guided image filtering (HMSD-GDGF) is proposed in this study. In this method, hybrid multiscale decomposition with guided image filtering and gradient domain guided image filtering of source images are first applied before the weight maps of each scale are obtained using a saliency detection technology and filtering means with three different fusion rules at different scales. The three types of fusion rules are for small-scale detail level, large-scale detail level, and base level. Finally, the target becomes more salient and can be more easily detected in the fusion result, with the detail information of the scene being fully displayed. After analyzing the experimental comparisons with state-of-the-art fusion methods, the HMSD-GDGF method has obvious advantages in fidelity of salient information (including structural similarity, brightness, and contrast), preservation of edge features, and human visual perception. Therefore, visual effects can be improved by using the proposed HMSD-GDGF method.
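
The base/detail logic of multiscale decomposition fusion can be illustrated at its smallest scale count: split each source into a base (low-pass) and detail (high-pass) layer, fuse the layers with different rules, and recombine. The sketch below substitutes a box filter and simple average/max-absolute rules for the paper's guided-filter decomposition and saliency weights (illustrative only):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def two_scale_fusion(a, b, size=9):
    """Two-scale fusion of registered images a and b:
    average the base layers, keep the stronger detail per pixel."""
    base_a, base_b = uniform_filter(a, size), uniform_filter(b, size)
    det_a, det_b = a - base_a, b - base_b
    base = 0.5 * (base_a + base_b)                       # base-level rule
    # Detail-level rule: pick whichever source has the stronger response.
    detail = np.where(np.abs(det_a) >= np.abs(det_b), det_a, det_b)
    return base + detail
```

The full hybrid method applies this idea recursively at several scales with different rules per scale, which is what lets small-scale texture and large-scale targets be weighted differently.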

  4. Image-Based Multi-Target Tracking through Multi-Bernoulli Filtering with Interactive Likelihoods.

    Science.gov (United States)

    Hoak, Anthony; Medeiros, Henry; Povinelli, Richard J

    2017-03-03

    We develop an interactive likelihood (ILH) for sequential Monte Carlo (SMC) methods for image-based multiple target tracking applications. The purpose of the ILH is to improve tracking accuracy by reducing the need for data association. In addition, we integrate a recently developed deep neural network for pedestrian detection along with the ILH with a multi-Bernoulli filter. We evaluate the performance of the multi-Bernoulli filter with the ILH and the pedestrian detector in a number of publicly available datasets (2003 PETS INMOVE, Australian Rules Football League (AFL) and TUD-Stadtmitte) using standard, well-known multi-target tracking metrics (optimal sub-pattern assignment (OSPA) and classification of events, activities and relationships for multi-object trackers (CLEAR MOT)). In all datasets, the ILH term increases the tracking accuracy of the multi-Bernoulli filter.

  5. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications

    Directory of Open Access Journals (Sweden)

    Byeong Hak Kim

    2017-12-01

    Full Text Available Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC.

  6. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications

    Science.gov (United States)

    Kim, Byeong Hak; Kim, Min Young; Chae, You Seong

    2017-01-01

    Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC. PMID:29280970

  7. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications.

    Science.gov (United States)

    Kim, Byeong Hak; Kim, Min Young; Chae, You Seong

    2017-12-27

    Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC.

  8. A robust nonlinear filter for image restoration.

    Science.gov (United States)

    Koivunen, V

    1995-01-01

    A class of nonlinear regression filters based on robust estimation theory is introduced. The goal of the filtering is to recover a high-quality image from degraded observations. Models for desired image structures and contaminating processes are employed, but deviations from strict assumptions are allowed since the assumptions on signal and noise are typically only approximately true. The robustness of filters is usually addressed only in a distributional sense, i.e., the actual error distribution deviates from the nominal one. In this paper, the robustness is considered in a broad sense since the outliers may also be due to inappropriate signal model, or there may be more than one statistical population present in the processing window, causing biased estimates. Two filtering algorithms minimizing a least trimmed squares criterion are provided. The design of the filters is simple since no scale parameters or context-dependent threshold values are required. Experimental results using both real and simulated data are presented. The filters effectively attenuate both impulsive and nonimpulsive noise while recovering the signal structure and preserving interesting details.
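
The least trimmed squares criterion minimized by these filters fits the majority of the window and ignores the worst-fitting samples, which is what makes the output robust to both impulses and mixed populations in the window. For a location (locally constant signal) model, the estimate can be computed exactly by scanning contiguous subsets of the sorted window (a sketch; the trimming size h defaults to roughly half the window):

```python
import numpy as np

def lts_location(window, h=None):
    """Least-trimmed-squares location estimate of a 1-D sample:
    the mean of the h contiguous sorted values with the smallest
    sum of squared deviations from their own mean."""
    x = np.sort(np.ravel(window))
    n = x.size
    if h is None:
        h = n // 2 + 1          # majority subset
    best_cost, best_mean = np.inf, x.mean()
    for start in range(n - h + 1):
        sub = x[start:start + h]
        cost = ((sub - sub.mean()) ** 2).sum()
        if cost < best_cost:
            best_cost, best_mean = cost, sub.mean()
    return best_mean
```

Sliding this estimator over image windows gives a filter that, unlike a plain mean, is unaffected by up to n - h arbitrary outliers per window.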

  9. Imaging spin filter for electrons based on specular reflection from iridium (001)

    Energy Technology Data Exchange (ETDEWEB)

    Kutnyakhov, D.; Lushchyk, P. [Johannes Gutenberg-Universität, Institut für Physik, 55099 Mainz (Germany); Fognini, A.; Perriard, D. [Laboratorium für Festkörperphysik, ETH Zürich, 8093 Zürich (Switzerland); Kolbe, M.; Medjanik, K.; Fedchenko, E.; Nepijko, S.A.; Elmers, H.J. [Johannes Gutenberg-Universität, Institut für Physik, 55099 Mainz (Germany); Salvatella, G.; Stieger, C.; Gort, R.; Bähler, T.; Michlmayer, T.; Acremann, Y.; Vaterlaus, A. [Laboratorium für Festkörperphysik, ETH Zürich, 8093 Zürich (Switzerland); Giebels, F.; Gollisch, H.; Feder, R. [Universität Duisburg-Essen, Theoretische Festkörperphysik, 47057 Duisburg (Germany); Tusche, C. [Max Planck-Institut für Mikrostrukturphysik, 06120 Halle (Germany); and others

    2013-07-15

    As Stern–Gerlach type spin filters do not work with electrons, spin analysis of electron beams is accomplished by spin-dependent scattering processes based on spin–orbit or exchange interaction. Existing polarimeters are single-channel devices characterized by an inherently low figure of merit (FoM) of typically 10⁻⁴–10⁻³. This single-channel approach is not compatible with parallel imaging microscopes and also not with modern electron spectrometers that acquire a certain energy and angular interval simultaneously. We present a novel type of polarimeter that can transport a full image by making use of k-parallel conservation in low-energy electron diffraction. We studied specular reflection from Ir (001) because this spin-filter crystal provides a high analyzing power combined with a “lifetime” in UHV of a full day. One good working point is centered at 39 eV scattering energy with a broad maximum of 5 eV usable width. A second one at about 10 eV shows a narrower profile but much higher FoM. A relativistic layer-KKR SPLEED calculation shows good agreement with measurements. - Highlights: • Novel type of spin polarimeter can transport a full image by making use of k∥ conservation in LEED. • When combined with a hemispherical analyzer, it acquires a certain energy and angular interval simultaneously. • Ir (001) based spin-filter provides a high analyzing power combined with a “lifetime” in UHV of a full day. • Parallel spin detection improves spin polarimeter efficiency by orders of magnitude. • A relativistic layer-KKR SPLEED calculation shows good agreement with measurements.

  10. Selected annotated bibliographies for adaptive filtering of digital image data

    Science.gov (United States)

    Mayers, Margaret; Wood, Lynnette

    1988-01-01

    Digital spatial filtering is an important tool both for enhancing the information content of satellite image data and for implementing cosmetic effects which make the imagery more interpretable and appealing to the eye. Spatial filtering is a context-dependent operation that alters the gray level of a pixel by computing a weighted average formed from the gray level values of other pixels in the immediate vicinity. Traditional spatial filtering involves passing a particular filter or set of filters over an entire image. This assumes that the filter parameter values are appropriate for the entire image, which in turn is based on the assumption that the statistics of the image are constant over the image. However, the statistics of an image may vary widely over the image, requiring an adaptive or "smart" filter whose parameters change as a function of the local statistical properties of the image. Then a pixel would be averaged only with more typical members of the same population. This annotated bibliography cites some of the work done in the area of adaptive filtering. The methods usually fall into two categories: (a) those that segment the image into subregions, each assumed to have stationary statistics, and use a different filter on each subregion, and (b) those that use a two-dimensional "sliding window" to continuously estimate the filter in either the spatial or frequency domain, or both. They may be used to deal with images degraded by space-variant noise, to suppress undesirable local radiometric statistics while enforcing desirable (user-defined) statistics, to treat problems where space-variant point spread functions are involved, to segment images into regions of constant value for classification, or to "tune" images in order to remove (nonstationary) variations in illumination, noise, contrast, shadows, or haze. Since adaptive filtering, like nonadaptive filtering, is used in image processing to accomplish various goals, this bibliography
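Category (b) above can be sketched concretely. The following is a minimal illustration, not taken from any of the cited works: a sliding-window filter in the spirit of the Lee filter, where the local mean and variance decide how strongly each pixel is smoothed (`win` and `noise_var` are illustrative parameters):

```python
import numpy as np

def adaptive_local_stats_filter(img, win=5, noise_var=0.01):
    """Sliding-window adaptive filter: each pixel is pulled toward the
    local mean by an amount governed by the local variance, so flat
    regions are smoothed strongly and detailed regions are preserved."""
    pad = win // 2
    padded = np.pad(np.asarray(img, dtype=float), pad, mode="reflect")
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            block = padded[i:i + win, j:j + win]
            mean, var = block.mean(), block.var()
            # k -> 0 in flat regions (output = local mean);
            # k -> 1 where local variance dominates the noise (keep the pixel).
            k = max(var - noise_var, 0.0) / (var + 1e-12)
            out[i, j] = mean + k * (img[i, j] - mean)
    return out
```

A pixel in a statistically flat neighbourhood is replaced by the local mean; a pixel on a high-variance structure is left nearly untouched, which is exactly the "averaged only with more typical members of the same population" behaviour described above.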

  11. On-Line Multi-Damage Scanning Spatial-Wavenumber Filter Based Imaging Method for Aircraft Composite Structure

    Directory of Open Access Journals (Sweden)

    Yuanqiang Ren

    2017-05-01

    Full Text Available Structural health monitoring (SHM) of aircraft composite structures is helpful to increase reliability and reduce maintenance costs. Due to its great effectiveness in distinguishing particular guided wave modes and identifying the propagation direction, the spatial-wavenumber filter technique has emerged as an interesting SHM topic. In this paper, a new scanning spatial-wavenumber filter (SSWF) based imaging method for multiple damages is proposed to conduct on-line monitoring of aircraft composite structures. Firstly, an on-line multi-damage SSWF is established, including the fundamental principle of SSWF for multiple damages based on a linear piezoelectric (PZT) sensor array, and a corresponding wavenumber-time imaging mechanism using the multi-damage scattering signal. Secondly, by combining the on-line multi-damage SSWF with a PZT 2D cross-shaped array, an image-mapping method is proposed to conduct wavenumber synthesis and convert the two wavenumber-time images obtained by the PZT 2D cross-shaped array into an angle-distance image, from which the multiple damages can be directly recognized and located. In the experimental validation, both simulated damage and real damage introduced by repeated impacts are evaluated on a composite plate structure. The maximum localization error is less than 2 cm, which shows the good performance of the multi-damage imaging method. Compared with existing spatial-wavenumber filter based damage evaluation methods, the proposed method requires only the multi-damage scattering signal and does not depend on any wavenumber modeling or measurement. Besides, this method locates multiple damages by imaging instead of by geometric methods, which helps to improve the signal-to-noise ratio. Thus, it can be easily applied to on-line multi-damage monitoring of aircraft composite structures.

  12. Feature-Based Nonlocal Polarimetric SAR Filtering

    Directory of Open Access Journals (Sweden)

    Xiaoli Xing

    2017-10-01

    Full Text Available Polarimetric synthetic aperture radar (PolSAR) images are inherently contaminated by multiplicative speckle noise, which complicates image interpretation and analysis. To reduce the speckle effect, several adaptive speckle filters have been developed based on the weighted average of similarity measures, commonly depending on a model or probability distribution, which are often affected by the distribution parameters and modeled texture components. In this paper, a novel filtering method introduces the coefficient of variation (CV) and the Pauli basis (PB) to measure similarity, and the two features are combined within the framework of nonlocal mean filtering. The CV is used to describe the complexity of various scenes and distinguish scene heterogeneity; moreover, the Pauli basis is able to express the polarimetric information in PolSAR image processing. The proposed filtering combines the CV and Pauli basis to improve the estimation accuracy of the similarity weights. Then, the similarity of the features is deduced according to the test statistic. Subsequently, the filtering proceeds using nonlocal weighted estimation. The performance of the proposed filter is tested with simulated images and real PolSAR images acquired by the AIRSAR and ESAR systems. The qualitative and quantitative experiments indicate the validity of the proposed method in comparison with widely used despeckling methods.
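For orientation, the nonlocal-mean framework the filter builds on can be sketched as follows. This is plain nonlocal means with patch-difference weights on a single-channel image, not the authors' CV/Pauli similarity measure; `patch`, `search` and `h` are illustrative parameters:

```python
import numpy as np

def nlm_filter(img, patch=3, search=7, h=0.1):
    """Plain nonlocal means: every pixel in a search window contributes
    a weight based on how similar its surrounding patch is to the patch
    around the pixel being estimated."""
    p, s = patch // 2, search // 2
    padded = np.pad(np.asarray(img, dtype=float), p + s, mode="reflect")
    out = np.zeros(img.shape, dtype=float)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + p + s, j + p + s
            ref = padded[ci - p:ci + p + 1, cj - p:cj + p + 1]
            acc, wsum = 0.0, 0.0
            for di in range(-s, s + 1):
                for dj in range(-s, s + 1):
                    cand = padded[ci + di - p:ci + di + p + 1,
                                  cj + dj - p:cj + dj + p + 1]
                    # weight from mean squared patch difference
                    w = np.exp(-np.mean((ref - cand) ** 2) / h ** 2)
                    acc += w * padded[ci + di, cj + dj]
                    wsum += w
            out[i, j] = acc / wsum
    return out
```

The paper's contribution sits precisely in the weight computation, replacing the patch difference with CV- and Pauli-basis-derived test statistics suited to PolSAR covariance data.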

  13. Cat Swarm Optimization Based Functional Link Artificial Neural Network Filter for Gaussian Noise Removal from Computed Tomography Images

    Directory of Open Access Journals (Sweden)

    M. Kumar

    2016-01-01

    Full Text Available Gaussian noise is one of the dominant noises, which degrades the quality of acquired Computed Tomography (CT) image data. It creates difficulties in pathological identification or diagnosis of any disease. Gaussian noise elimination is desirable to improve the clarity of a CT image for clinical, diagnostic, and postprocessing applications. This paper proposes an evolutionary nonlinear adaptive filter approach, using a Cat Swarm Functional Link Artificial Neural Network (CS-FLANN) to remove the unwanted noise. The structure of the proposed filter is based on the Functional Link Artificial Neural Network (FLANN), and Cat Swarm Optimization (CSO) is utilized for the selection of the optimum weights of the neural network filter. The applied filter has been compared with existing linear filters, such as the mean filter and the adaptive Wiener filter. Performance indices, such as peak signal-to-noise ratio (PSNR), have been computed for the quantitative analysis of the proposed filter. The experimental evaluation established the superiority of the proposed filtering technique over existing methods.

  14. Image-Based Multi-Target Tracking through Multi-Bernoulli Filtering with Interactive Likelihoods

    Directory of Open Access Journals (Sweden)

    Anthony Hoak

    2017-03-01

    Full Text Available We develop an interactive likelihood (ILH) for sequential Monte Carlo (SMC) methods for image-based multiple target tracking applications. The purpose of the ILH is to improve tracking accuracy by reducing the need for data association. In addition, we integrate a recently developed deep neural network for pedestrian detection along with the ILH with a multi-Bernoulli filter. We evaluate the performance of the multi-Bernoulli filter with the ILH and the pedestrian detector on a number of publicly available datasets (2003 PETS INMOVE, Australian Rules Football League (AFL), and TUD-Stadtmitte) using standard, well-known multi-target tracking metrics: optimal sub-pattern assignment (OSPA) and classification of events, activities and relationships for multi-object trackers (CLEAR MOT). In all datasets, the ILH term increases the tracking accuracy of the multi-Bernoulli filter.

  15. Combination of Wiener filtering and singular value decomposition filtering for volume imaging PET

    International Nuclear Information System (INIS)

    Shao, L.; Lewitt, R.M.; Karp, J.S.

    1995-01-01

    Although the three-dimensional (3D) multi-slice rebinning (MSRB) algorithm in PET is fast and practical, and provides an accurate reconstruction, the MSRB image, in general, suffers from noise amplified by its singular value decomposition (SVD) filtering operation in the axial direction. Their aim in this study is to combine the use of the Wiener filter (WF) with the SVD to decrease the noise and improve the image quality. The SVD filtering ''deconvolves'' the spatially variant axial response function, while the WF suppresses the noise and reduces the blurring not modeled by the axial SVD filter but included in the system modulation transfer function. Therefore, the synthesis of these two techniques combines the advantages of both filters. The authors applied this approach to the volume imaging HEAD PENN-PET brain scanner with an axial extent of 256 mm. This combined filter was evaluated in terms of spatial resolution, image contrast, and signal-to-noise ratio with several phantoms, such as a cold sphere phantom and a 3D brain phantom. Specifically, the authors studied both the SVD filter with an axial Wiener filter and the SVD filter with a 3D Wiener filter, and compared the filtered images to those from the 3D reprojection (3DRP) reconstruction algorithm. Their results indicate that the Wiener filter increases the signal-to-noise ratio and also improves the contrast. For the MSRB images of the 3D brain phantom, after 3D WF, the Gray/White and Gray/Ventricle ratios were improved from 1.8 to 2.8 and from 2.1 to 4.1, respectively. In addition, the image quality with the MSRB algorithm is close to that of the 3DRP algorithm with 3D WF applied to both image reconstructions.
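The Wiener-filter half of the combination can be illustrated in its simplest frequency-domain form. This sketch assumes additive white noise and a crude per-frequency signal-power estimate; it is not the authors' axial/3D formulation for PET:

```python
import numpy as np

def wiener_freq_denoise(img, noise_var):
    """Frequency-domain Wiener filter for additive white noise: each
    frequency is attenuated by S / (S + N), where S is a crude estimate
    of the signal power and N the expected noise power per coefficient."""
    F = np.fft.fft2(img)
    noise_power = noise_var * img.size        # E|FFT coefficient of the noise|^2
    signal_power = np.maximum(np.abs(F) ** 2 - noise_power, 0.0)
    gain = signal_power / (signal_power + noise_power + 1e-12)
    return np.real(np.fft.ifft2(gain * F))
```

Frequencies dominated by signal pass with gain near 1; frequencies dominated by noise are suppressed, which is the mechanism by which the WF tames the noise amplified by the SVD stage.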

  16. Implementational Aspects of the Contourlet Filter Bank and Application in Image Coding

    Directory of Open Access Journals (Sweden)

    Truong T. Nguyen

    2009-02-01

    Full Text Available This paper analyzes the implementational aspects of the contourlet filter bank, or pyramidal directional filter bank (PDFB), and considers its application in image coding. First, details of the binary tree-structured directional filter bank (DFB) are presented, including a modification to minimize the phase delay factor and the necessary steps for handling rectangular images. The PDFB is viewed as an overcomplete filter bank, and the directional filters are expressed in terms of polyphase components of the pyramidal filter bank and the conventional DFB. The aliasing effect of the conventional DFB and the Laplacian pyramid on the directional filters is then considered, and conditions for reducing this effect are presented. The new filters obtained by redesigning the PDFBs to satisfy these requirements have much better frequency responses. A hybrid multiscale filter bank consisting of the PDFB at higher scales and the traditional maximally decimated wavelet filter bank at lower scales is constructed to provide a sparse image representation. A novel embedded image coding system based on this image decomposition and a morphological dilation algorithm is then presented. The coding algorithm efficiently clusters the significant coefficients using progressive morphological operations. Context models for arithmetic coding are designed to exploit the intraband dependency and the correlation existing among neighboring directional subbands. Experimental results show that the proposed coding algorithm outperforms current state-of-the-art wavelet-based coders, such as JPEG2000, for images with directional features.

  17. Image super-resolution reconstruction based on regularization technique and guided filter

    Science.gov (United States)

    Huang, De-tian; Huang, Wei-qin; Gu, Pei-ting; Liu, Pei-zhong; Luo, Yan-min

    2017-06-01

    In order to improve the accuracy of sparse representation coefficients and the quality of reconstructed images, an improved image super-resolution algorithm based on sparse representation is presented. In the sparse coding stage, autoregressive (AR) regularization and non-local (NL) similarity regularization are introduced to improve the sparse coding objective function. A group of AR models which describe the local image structures are pre-learned from the training samples, and one or several suitable AR models can be adaptively selected for each image patch to regularize the solution space. Then, the image non-local redundancy is captured by the NL similarity regularization to preserve edges. In the process of computing the sparse representation coefficients, the feature-sign search algorithm is utilized instead of the conventional orthogonal matching pursuit algorithm to improve the accuracy of the sparse coefficients. To further restore image details, a global error compensation model based on a weighted guided filter is proposed to realize error compensation for the reconstructed images. Experimental results demonstrate that, compared with the Bicubic, L1SR, SISR, GR, ANR, NE + LS, NE + NNLS, NE + LLE and A+ (16 atoms) methods, the proposed approach achieves remarkable improvements in peak signal-to-noise ratio, structural similarity and subjective visual perception.
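The guided filter used for error compensation has a compact closed form. The sketch below is the standard (unweighted) guided filter with a naive box filter for clarity; the paper's weighted variant and the global compensation model built on top of it are not reproduced:

```python
import numpy as np

def box_mean(img, r):
    """Naive mean filter over a (2r+1) x (2r+1) window with edge padding."""
    k = 2 * r + 1
    padded = np.pad(img, r, mode="edge")
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

def guided_filter(I, p, r=2, eps=1e-4):
    """Standard guided filter: inside each window the output is an
    affine function a*I + b of the guide I, fitted to the input p."""
    mI, mp = box_mean(I, r), box_mean(p, r)
    a = (box_mean(I * p, r) - mI * mp) / (box_mean(I * I, r) - mI * mI + eps)
    b = mp - a * mI
    return box_mean(a, r) * I + box_mean(b, r)
```

Because the output is locally affine in the guide, edges present in the guide survive the filtering; `eps` controls how flat regions are smoothed. In practice the box filter is computed with integral images for O(1) cost per pixel.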

  18. SD LMS L-Filters for Filtration of Gray Level Images in Timespatial Domain Based on GLCM Features

    Directory of Open Access Journals (Sweden)

    Robert Hudec

    2008-01-01

    Full Text Available In this paper, a new kind of adaptive signal-dependent LMS L-filter for the suppression of mixed noise in greyscale images is developed. It is based on texture parameter measurement as a modification of the spatial impulse detector structure. Moreover, one of the GLCM (Gray Level Co-occurrence Matrix) features, namely the contrast (inertia), thresholded to act as a switch between partial filters, is utilised. Finally, at the positions of the partial filters, adaptive LMS versions of the L-filters are chosen.

  19. Image enhancement by spatial frequency post-processing of images obtained with pupil filters

    Science.gov (United States)

    Estévez, Irene; Escalera, Juan C.; Stefano, Quimey Pears; Iemmi, Claudio; Ledesma, Silvia; Yzuel, María J.; Campos, Juan

    2016-12-01

    The use of apodizing or superresolving filters improves the performance of an optical system in different frequency bands. This improvement can be seen as an increase in the OTF value compared to the OTF of the clear aperture. In this paper we propose a method to enhance the contrast of an image in both its low and its high frequencies. The method is based on the generation of a synthetic optical transfer function, obtained by multiplexing the OTFs given by different non-uniform transmission filters on the pupil. We propose to capture three images: one obtained with a clear pupil, one obtained with an apodizing filter that enhances the low frequencies, and another taken with a superresolving filter that improves the high frequencies. In the Fourier domain the three spectra are combined using smoothed passband filters, and then the inverse transform is performed. We show that the resulting synthetic image is better than the image obtained with the clear aperture. To evaluate the performance of the method, bar tests (sinusoidal tests) with different frequency content are used. The results show that a contrast improvement in both the high and low frequencies is obtained.

  20. 3D Wavelet-Based Filter and Method

    Science.gov (United States)

    Moss, William C.; Haase, Sebastian; Sedat, John W.

    2008-08-12

    A 3D wavelet-based filter for visualizing and locating structural features of a user-specified linear size in 2D or 3D image data. The only input parameter is a characteristic linear size of the feature of interest, and the filter output contains only those regions that are correlated with the characteristic size, thus denoising the image.

  1. Correlation Filters for Detection of Cellular Nuclei in Histopathology Images.

    Science.gov (United States)

    Ahmad, Asif; Asif, Amina; Rajpoot, Nasir; Arif, Muhammad; Minhas, Fayyaz Ul Amir Afsar

    2017-11-21

    Nuclei detection in histology images is an essential part of computer-aided diagnosis of cancers and tumors. It is a challenging task due to the diverse and complicated structures of cells. In this work, we present an automated technique for the detection of cellular nuclei in hematoxylin and eosin stained histopathology images. Our proposed approach is based on kernelized correlation filters. Correlation filters have been widely used in object detection and tracking applications, but their strength has not been explored in the medical imaging domain until now. Our experimental results show that the proposed scheme gives state-of-the-art accuracy and can learn complex nuclear morphologies. Like deep learning approaches, the proposed filters do not require engineering of image features, as they can operate directly on histopathology images without significant preprocessing. However, unlike deep learning methods, the large-margin correlation filters developed in this work are interpretable, computationally efficient, and do not require specialized or expensive computing hardware. A cloud-based web server of the proposed method and its Python implementation can be accessed at the following URL: http://faculty.pieas.edu.pk/fayyaz/software.html#corehist.
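The underlying correlation-filter idea admits a short closed-form sketch. Below is a plain MOSSE-style Fourier-domain filter, not the kernelized large-margin formulation of the paper; `g` stands in for a desired response map with peaks at nucleus locations:

```python
import numpy as np

def train_correlation_filter(x, g, lam=1e-4):
    """Closed-form Fourier-domain correlation filter: find H so that the
    response ifft2(fft2(x) * H) approximates the desired output g;
    lam is a ridge (regularization) term for numerical stability."""
    X, G = np.fft.fft2(x), np.fft.fft2(g)
    return np.conj(X) * G / (np.abs(X) ** 2 + lam)

def filter_response(x, H):
    """Apply a trained filter to an image; peaks mark detections."""
    return np.real(np.fft.ifft2(np.fft.fft2(x) * H))
```

Training and detection are both a handful of FFTs, which is why correlation filters are so much cheaper than sliding-window classifiers or deep networks.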

  2. An adaptive Kalman filter for speckle reductions in ultrasound images

    International Nuclear Information System (INIS)

    Castellini, G.; Labate, D.; Masotti, L.; Mannini, E.; Rocchi, S.

    1988-01-01

    Speckle is the term used to describe the granular appearance found in ultrasound images. The presence of speckle reduces the diagnostic potential of the echographic technique because it tends to mask small inhomogeneities of the investigated tissue. We developed a new method of speckle reduction that utilizes an adaptive one-dimensional Kalman filter, based on the assumption that the observed image can be considered a superimposition of speckle on a ''true image''. The filter adaptivity, necessary to avoid loss of resolution, has been obtained through statistical considerations on the local signal variations. The results of applying this particular Kalman filter, both on A-mode and B-mode images, show a significant speckle reduction.
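The scalar Kalman recursion behind such a filter can be sketched per scan line. This is the non-adaptive textbook form; the paper's adaptivity would modulate the measurement variance from local signal statistics, and the parameter values here are purely illustrative:

```python
import numpy as np

def kalman_scanline(signal, process_var=1e-4, meas_var=1e-2):
    """Scalar Kalman filter along one scan line: the 'true image' is
    tracked as a slowly varying state observed through noisy samples."""
    x, P = float(signal[0]), 1.0
    out = np.empty(len(signal), dtype=float)
    out[0] = x
    for t in range(1, len(signal)):
        P = P + process_var              # predict: state uncertainty grows
        K = P / (P + meas_var)           # Kalman gain
        x = x + K * (signal[t] - x)      # update with the new sample
        P = (1.0 - K) * P
        out[t] = x
    return out
```

A small gain smooths heavily (good in homogeneous tissue); raising `meas_var` locally where the signal truly changes, as the adaptive scheme does, prevents loss of resolution at boundaries.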

  3. Convex blind image deconvolution with inverse filtering

    Science.gov (United States)

    Lv, Xiao-Guang; Li, Fang; Zeng, Tieyong

    2018-03-01

    Blind image deconvolution is the process of estimating both the original image and the blur kernel from the degraded image with only partial or no information about degradation and the imaging system. It is a bilinear ill-posed inverse problem corresponding to the direct problem of convolution. Regularization methods are used to handle the ill-posedness of blind deconvolution and get meaningful solutions. In this paper, we investigate a convex regularized inverse filtering method for blind deconvolution of images. We assume that the support region of the blur object is known, as has been done in a few existing works. By studying the inverse filters of signal and image restoration problems, we observe the oscillation structure of the inverse filters. Inspired by the oscillation structure of the inverse filters, we propose to use the star norm to regularize the inverse filter. Meanwhile, we use the total variation to regularize the resulting image obtained by convolving the inverse filter with the degraded image. The proposed minimization model is shown to be convex. We employ the first-order primal-dual method for the solution of the proposed minimization model. Numerical examples for blind image restoration are given to show that the proposed method outperforms some existing methods in terms of peak signal-to-noise ratio (PSNR), structural similarity (SSIM), visual quality and time consumption.

  4. Complex noise suppression using a sparse representation and 3D filtering of images

    Science.gov (United States)

    Kravchenko, V. F.; Ponomaryov, V. I.; Pustovoit, V. I.; Palacios-Enriquez, A.

    2017-08-01

    A novel method for the filtering of images corrupted by complex noise composed of randomly distributed impulses and additive Gaussian noise has been substantiated for the first time. The method consists of three main stages: the detection and filtering of pixels corrupted by impulsive noise, the subsequent image processing to suppress the additive noise based on 3D filtering and a sparse representation of signals in a basis of wavelets, and the concluding image processing procedure to clean the final image of the errors emerged at the previous stages. A physical interpretation of the filtering method under complex noise conditions is given. A filtering block diagram has been developed in accordance with the novel approach. Simulations of the novel image filtering method have shown an advantage of the proposed filtering scheme in terms of generally recognized criteria, such as the structural similarity index measure and the peak signal-to-noise ratio, and when visually comparing the filtered images.
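The first stage, impulse detection and replacement, can be sketched with a simple median-based detector (an assumed form for illustration; the paper's detector and the subsequent 3D/wavelet stages are more elaborate):

```python
import numpy as np

def impulse_stage(img, win=3, thresh=0.2):
    """Stage 1 of such a pipeline: flag pixels that deviate strongly
    from their local median as impulses and replace only those, leaving
    the remaining (Gaussian) noise to a later denoising stage."""
    pad = win // 2
    padded = np.pad(np.asarray(img, dtype=float), pad, mode="reflect")
    out = np.asarray(img, dtype=float).copy()
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            med = np.median(padded[i:i + win, j:j + win])
            if abs(out[i, j] - med) > thresh:   # impulse detector
                out[i, j] = med                  # replace detected impulse
    return out
```

Only flagged pixels are modified, so image detail is untouched where no impulse is detected; the residual additive noise is then handled by the 3D/sparse-wavelet stage described above.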

  5. Efficient Filtering of Noisy Fingerprint Images

    Directory of Open Access Journals (Sweden)

    Maria Liliana Costin

    2016-01-01

    Full Text Available Fingerprint identification is an important field in the wide domain of biometrics with many applications, in different areas such: judicial, mobile phones, access systems, airports. There are many elaborated algorithms for fingerprint identification, but none of them can guarantee that the results of identification are always 100 % accurate. A first step in a fingerprint image analysing process consists in the pre-processing or filtering. If the result after this step is not by a good quality the upcoming identification process can fail. A major difficulty can appear in case of fingerprint identification if the images that should be identified from a fingerprint image database are noisy with different type of noise. The objectives of the paper are: the successful completion of the noisy digital image filtering, a novel more robust algorithm of identifying the best filtering algorithm and the classification and ranking of the images. The choice about the best filtered images of a set of 9 algorithms is made with a dual method of fuzzy and aggregation model. We are proposing through this paper a set of 9 filters with different novelty designed for processing the digital images using the following methods: quartiles, medians, average, thresholds and histogram equalization, applied all over the image or locally on small areas. Finally the statistics reveal the classification and ranking of the best algorithms.

  6. Development of an adaptive bilateral filter for evaluating color image difference

    Science.gov (United States)

    Wang, Zhaohui; Hardeberg, Jon Yngve

    2012-04-01

    Spatial filtering, which aims to mimic the contrast sensitivity function (CSF) of the human visual system (HVS), has previously been combined with color difference formulae for measuring color image reproduction errors. These spatial filters attenuate imperceptible information in images, unfortunately including high-frequency edges, which are believed to be crucial in the process of scene analysis by the HVS. The adaptive bilateral filter represents a novel approach that avoids the undesirable loss of edge information introduced by CSF-based filtering. The bilateral filter employs two Gaussian smoothing filters in different domains, i.e., the spatial domain and the intensity domain. We propose a method to determine the parameters, which are designed to be adaptive to the corresponding viewing conditions and to the quantity and homogeneity of information contained in an image. Experiments and discussions are given to support the proposal. A series of perceptual experiments was conducted to evaluate the performance of our approach. The experimental sample images were reproduced with variations in six image attributes: lightness, chroma, hue, compression, noise, and sharpness/blurriness. Pearson correlation values between the model-predicted image difference and the observed difference were employed to evaluate the performance and compare it with that of spatial CIELAB and an image appearance model.
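The core bilateral filter, before the paper's adaptive parameter selection, looks as follows; `sigma_s` and `sigma_r` are the spatial (domain) and intensity (range) bandwidths, with illustrative default values:

```python
import numpy as np

def bilateral(img, r=3, sigma_s=2.0, sigma_r=0.1):
    """Bilateral filter: the weight of each neighbour is the product of
    a spatial Gaussian and an intensity Gaussian, so averaging does not
    cross strong edges."""
    padded = np.pad(np.asarray(img, dtype=float), r, mode="reflect")
    ys, xs = np.mgrid[-r:r + 1, -r:r + 1]
    spatial = np.exp(-(xs ** 2 + ys ** 2) / (2 * sigma_s ** 2))
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            block = padded[i:i + 2 * r + 1, j:j + 2 * r + 1]
            rangew = np.exp(-(block - img[i, j]) ** 2 / (2 * sigma_r ** 2))
            w = spatial * rangew
            out[i, j] = (w * block).sum() / w.sum()
    return out
```

Making `sigma_s` and `sigma_r` functions of viewing conditions and local image content, as the paper proposes, turns this into the adaptive variant used for color-difference evaluation.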

  7. Fast bilateral filtering of CT-images

    Energy Technology Data Exchange (ETDEWEB)

    Steckmann, Sven; Baer, Matthias; Kachelriess, Marc [Erlangen-Nuernberg Univ., Erlangen (Germany). Inst. of Medical Physics (IMP)

    2011-07-01

    The bilateral filter is able to achieve a lower noise level while retaining edges in images. The downside of a bilateral filter is the high order of the problem itself: for a volume of size N in d dimensions with a filter window of size r, the problem is of size N^d · r^d. In the literature there are some proposals for speeding this up by reducing the order through approximating a component of the filter. This leads to inaccurate results, which often implies unacceptable artifacts for medical imaging. A better way for medical imaging is to speed up the filter itself while leaving the basic structure intact. This is the approach our implementation takes. We solve the problem of calculating the function e^(-x) in an efficient way on modern architectures, and the problem of vectorizing the filtering process. As a result we implemented a filter which is 2.5 times faster than the highly optimized basic approach. Comparing the basic analytical approach with the final algorithm, the difference in the quality of the computed result is negligible to the human eye. We are able to process a volume with 512^3 voxels with a filter of 25 x 25 x 1 in 21 s on a modern Intel Xeon platform with two X5590 processors running at 3.33 GHz. (orig.)

  8. Quantum Image Filtering in the Frequency Domain

    Directory of Open Access Journals (Sweden)

    MANTA, V. I.

    2013-08-01

    Full Text Available In this paper we address the emerging field of Quantum Image Processing. We investigate the use of quantum computing systems to represent and manipulate images. In particular, we consider the basic task of image filtering. We prove that a quantum version for this operation can be achieved, even though the quantum convolution of two sequences is physically impossible. In our approach we use the principle of the quantum oracle to implement the filter function. We provide the quantum circuit that implements the filtering task and present the results of several simulation experiments on grayscale images. There are important differences between the classical and the quantum implementations for image filtering. We analyze these differences and show that the major advantage of the quantum approach lies in the exploitation of the efficient implementation of the quantum Fourier transform.

  9. Still Image Compression Algorithm Based on Directional Filter Banks

    OpenAIRE

    Chunling Yang; Duanwu Cao; Li Ma

    2010-01-01

    Hybrid wavelet and directional filter banks (HWD) is an effective multi-scale geometrical analysis method. Compared to wavelet transform, it can better capture the directional information of images. But the ringing artifact, which is caused by the coefficient quantization in transform domain, is the biggest drawback of image compression algorithms in HWD domain. In this paper, by researching on the relationship between directional decomposition and ringing artifact, an improved decomposition ...

  10. Image classification using multiscale information fusion based on saliency driven nonlinear diffusion filtering.

    Science.gov (United States)

    Hu, Weiming; Hu, Ruiguang; Xie, Nianhua; Ling, Haibin; Maybank, Stephen

    2014-04-01

    In this paper, we propose saliency-driven image multiscale nonlinear diffusion filtering. The resulting scale space in general preserves or even enhances semantically important structures such as edges, lines, or flow-like structures in the foreground, and inhibits and smoothes clutter in the background. The image is classified using multiscale information fusion based on the original image, the image at the final scale at which the diffusion process converges, and the image at a midscale. Our algorithm emphasizes the foreground features, which are important for image classification. The background image regions, whether considered as context for the foreground or noise to the foreground, can be globally handled by fusing information from different scales. Experimental tests of the effectiveness of the multiscale space for image classification are conducted on the following publicly available datasets: 1) the PASCAL 2005 dataset; 2) the Oxford 102 flowers dataset; and 3) the Oxford 17 flowers dataset, with high classification rates.
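A minimal nonlinear diffusion step in the Perona-Malik style (with a plain gradient-based conductance rather than the paper's saliency-driven one, and periodic borders for brevity) can be written as:

```python
import numpy as np

def nonlinear_diffusion(img, iters=20, k=0.1, dt=0.2):
    """Perona-Malik style diffusion: the conductance g falls off with
    the local gradient, so smoothing slows down at edges. np.roll gives
    periodic borders, which keeps the sketch short."""
    u = np.asarray(img, dtype=float).copy()
    g = lambda d: np.exp(-(d / k) ** 2)      # edge-stopping conductance
    for _ in range(iters):
        dn = np.roll(u, 1, axis=0) - u       # differences to 4 neighbours
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        u = u + dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u
```

Driving `g` from a saliency map instead of the raw gradient, as the paper does, is what makes the scale space smooth background clutter while preserving or enhancing foreground structure.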

  11. AER image filtering

    Science.gov (United States)

    Gómez-Rodríguez, F.; Linares-Barranco, A.; Paz, R.; Miró-Amarante, L.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity among a huge number of neurons located on different chips.[1] By exploiting high-speed digital communication circuits (with nanosecond timing), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timing) are sampled at low frequencies. Neurons generate "events" according to their activity levels. That is, more active neurons generate more events per unit time and access the interchip communication channel more frequently than neurons with low activity. In neuromorphic system development, AER brings some advantages for developing real-time image processing systems: (1) AER represents the information as a continuous stream in time, not as frames; (2) AER sends the most important information first (although this depends on the sender); (3) AER allows information to be processed as soon as it is received. When AER is used in the artificial vision field, each pixel is considered a neuron, so a pixel's intensity is represented as a sequence of events; by modifying the number and frequency of these events, it is possible to perform image filtering. In this paper we present four image filters using AER: (a) noise addition and suppression, (b) brightness modification, (c) single moving object tracking, and (d) geometrical transformations (rotation, translation, reduction and magnification). For testing and debugging, we use the USB-AER board developed by the Robotic and Technology of Computers Applied to Rehabilitation (RTCAR) research group. This board is based on an FPGA devoted to managing the AER functionality, and also includes a micro-controller for USB communication, 2 MB of RAM, and 2 AER ports (one for input and one for output).

  12. Edge-based correlation image registration for multispectral imaging

    Science.gov (United States)

    Nandy, Prabal [Albuquerque, NM

    2009-11-17

    Registration information for images of a common target obtained from a plurality of different spectral bands can be obtained by combining edge detection and phase correlation. The images are edge-filtered, and pairs of the edge-filtered images are then phase correlated to produce phase correlation images. The registration information can be determined based on these phase correlation images.
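The combination described, edge filtering followed by phase correlation, can be sketched as follows. The circular-difference edge filter is a stand-in for whatever edge detector the patent uses, chosen here because it commutes exactly with circular shifts:

```python
import numpy as np

def edge_map(img):
    """Crude circular high-pass as a stand-in edge filter; circular
    differences make it exactly shift-equivariant."""
    return 2 * img - np.roll(img, 1, axis=0) - np.roll(img, 1, axis=1)

def phase_correlate(a, b):
    """Translation of b relative to a from the peak of the inverse FFT
    of the normalized cross-power spectrum."""
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = np.conj(Fa) * Fb
    cross = cross / (np.abs(cross) + 1e-12)   # keep phase, discard magnitude
    corr = np.real(np.fft.ifft2(cross))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # fold peaks in the upper half of each axis to negative shifts
    return tuple(int(p) - s if p > s // 2 else int(p)
                 for p, s in zip(peak, corr.shape))
```

Correlating edge maps rather than raw intensities makes the shift estimate robust to the band-to-band radiometric differences that motivate the patent.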

  13. Multi-look polarimetric SAR image filtering using simulated annealing

    DEFF Research Database (Denmark)

    Schou, Jesper

    2000-01-01

    Based on a previously published algorithm capable of estimating the radar cross-section in synthetic aperture radar (SAR) intensity images, a new filter is presented utilizing multi-look polarimetric SAR images. The underlying mean covariance matrix is estimated from the observed sample covariance...

  14. Filter and Filter Bank Design for Image Texture Recognition

    Energy Technology Data Exchange (ETDEWEB)

    Randen, Trygve

    1997-12-31

    The relevance of this thesis to energy and the environment lies in its application to remote sensing, for instance sea-floor mapping and seismic pattern recognition. The focus is on the design of two-dimensional filters for feature extraction, segmentation, and classification of digital images with textural content. The features are extracted by filtering with a linear filter and estimating the local energy in the filter response. The thesis gives a review broadly covering most previous approaches to texture feature extraction and continues with proposals of some new techniques. 143 refs., 59 figs., 7 tabs.

  15. Real-time single image dehazing based on dark channel prior theory and guided filtering

    Science.gov (United States)

    Zhang, Zan

    2017-10-01

    Images and videos captured outdoors on foggy days are seriously degraded. To restore such images and overcome the traditional dark channel prior algorithm's problem of residual fog at edges, we propose a new dehazing method. We first locate the fog region in the dark channel map using a quadtree search to obtain an estimate of the transmittance. Then we treat the gray-scale image after guided filtering as the atmospheric light map and remove haze based on it. Box filtering and image downsampling are also used to improve the processing speed. Finally, the atmospheric light scattering model is used to restore the image. Extensive experiments show that the algorithm is effective, efficient and has a wide range of application.
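
    The dark channel map at the heart of this method can be computed as below, following He et al.'s prior; the quadtree search and guided-filtering stages of the paper are omitted, and the patch size is an illustrative assumption.

```python
import numpy as np

def dark_channel(img, patch=15):
    """Per-pixel minimum over the RGB channels, then a minimum filter
    over a patch x patch window (the dark channel prior)."""
    mins = img.min(axis=2)
    pad = patch // 2
    padded = np.pad(mins, pad, mode='edge')
    out = np.empty_like(mins)
    for i in range(mins.shape[0]):
        for j in range(mins.shape[1]):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return out

def transmission(img, A, omega=0.95, patch=15):
    # t(x) = 1 - omega * dark_channel(I / A): haze-free regions give t near 1
    return 1.0 - omega * dark_channel(img / A, patch)
```

    With the transmission t and atmospheric light A in hand, the scattering model J = (I - A) / t + A recovers the haze-free image.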

  16. Image denoising by sparse 3-D transform-domain collaborative filtering.

    Science.gov (United States)

    Dabov, Kostadin; Foi, Alessandro; Katkovnik, Vladimir; Egiazarian, Karen

    2007-08-01

    We propose a novel image denoising strategy based on an enhanced sparse representation in transform domain. The enhancement of the sparsity is achieved by grouping similar 2-D image fragments (e.g., blocks) into 3-D data arrays which we call "groups." Collaborative filtering is a special procedure developed to deal with these 3-D groups. We realize it using three successive steps: 3-D transformation of a group, shrinkage of the transform spectrum, and inverse 3-D transformation. The result is a 3-D estimate that consists of the jointly filtered grouped image blocks. By attenuating the noise, the collaborative filtering reveals even the finest details shared by grouped blocks and, at the same time, it preserves the essential unique features of each individual block. The filtered blocks are then returned to their original positions. Because these blocks are overlapping, for each pixel, we obtain many different estimates which need to be combined. Aggregation is a particular averaging procedure which is exploited to take advantage of this redundancy. A significant improvement is obtained by a specially developed collaborative Wiener filtering. An algorithm based on this novel denoising strategy and its efficient implementation are presented in full detail; an extension to color-image denoising is also developed. The experimental results demonstrate that this computationally scalable algorithm achieves state-of-the-art denoising performance in terms of both peak signal-to-noise ratio and subjective visual quality.
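
    A heavily simplified, single-group sketch of the grouping and collaborative-filtering steps is shown below. An FFT stands in for the paper's 3-D transforms, only one reference block is processed, and the filtered blocks are simply averaged rather than returned to their positions and aggregated; the threshold value is an illustrative assumption.

```python
import numpy as np

def collaborative_filter(img, ref=(0, 0), bsize=8, n_blocks=8, thresh=50.0):
    h, w = img.shape
    r0, c0 = ref
    ref_blk = img[r0:r0 + bsize, c0:c0 + bsize]
    # block matching: rank every candidate block by distance to the reference
    cands = []
    for r in range(h - bsize + 1):
        for c in range(w - bsize + 1):
            blk = img[r:r + bsize, c:c + bsize]
            cands.append((float(np.sum((blk - ref_blk) ** 2)), r, c))
    cands.sort()
    matched = cands[:n_blocks]
    group = np.stack([img[r:r + bsize, c:c + bsize] for _, r, c in matched])
    # collaborative filtering: 3-D transform, hard thresholding, inverse
    spec = np.fft.fftn(group)
    spec[np.abs(spec) < thresh] = 0.0
    filt = np.real(np.fft.ifftn(spec))
    # crude aggregation: average the jointly filtered blocks
    return filt.mean(axis=0)
```

    Because the stacked blocks are similar, the true signal concentrates in a few large 3-D coefficients while noise spreads thinly, so hard thresholding removes noise that 2-D block filtering alone could not.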

  17. Aircraft Detection from VHR Images Based on Circle-Frequency Filter and Multilevel Features

    Directory of Open Access Journals (Sweden)

    Feng Gao

    2013-01-01

    Full Text Available Automatic aircraft detection from very high-resolution (VHR) images plays an important role in a wide variety of applications. This paper proposes a novel detector for aircraft detection from VHR remote sensing images. To accurately distinguish aircraft from the background, a circle-frequency filter (CF-filter) is used to extract candidate aircraft locations from a large image. A multi-level feature model is then employed to represent both the local appearance and the spatial layout of aircraft by means of the Robust Hue Descriptor and the Histogram of Oriented Gradients. The experimental results demonstrate the superior performance of the proposed method.

  18. Ghost suppression in image restoration filtering

    Science.gov (United States)

    Riemer, T. E.; Mcgillem, C. D.

    1975-01-01

    An optimum image restoration filter is described in which provision is made to constrain the spatial extent of the restoration function, the noise level of the filter output, and the rate of falloff of the composite system point-spread function away from the origin. Experimental results show that sidelobes on the composite system point-spread function produce ghosts in the restored image near discontinuities in intensity level. By redetermining the filter using a penalty function that is zero over the main lobe of the composite point-spread function of the optimum filter and nonzero where the point-spread function departs from a smoothly decaying function in the sidelobe region, a great reduction in sidelobe level is obtained, with almost no loss in resolving power of the composite system. By iteratively carrying out the same procedure, even further reductions in sidelobe level are obtained. Examples of original and iterated restoration functions are shown along with their effects on a test image.

  19. High performance 3D adaptive filtering for DSP based portable medical imaging systems

    Science.gov (United States)

    Bockenbach, Olivier; Ali, Murtaza; Wainwright, Ian; Nadeski, Mark

    2015-03-01

    Portable medical imaging devices have proven valuable for emergency medical services both in the field and hospital environments and are becoming more prevalent in clinical settings where the use of larger imaging machines is impractical. Despite their constraints on power, size and cost, portable imaging devices must still deliver high quality images. 3D adaptive filtering is one of the most advanced techniques aimed at noise reduction and feature enhancement, but is computationally very demanding and hence often cannot be run with sufficient performance on a portable platform. In recent years, advanced multicore digital signal processors (DSP) have been developed that attain high processing performance while maintaining low levels of power dissipation. These processors enable the implementation of complex algorithms on a portable platform. In this study, the performance of a 3D adaptive filtering algorithm on a DSP is investigated. The performance is assessed by filtering a volume of size 512x256x128 voxels sampled at a rate of 10 MVoxels/sec with a 3D ultrasound probe. Relative performance and power are compared between a reference PC (quad-core CPU) and a TMS320C6678 DSP from Texas Instruments.

  20. Metal artefact reduction for a dental cone beam CT image using image segmentation and backprojection filters

    International Nuclear Information System (INIS)

    Mohammadi, Mahdi; Khotanlou, Hassan; Mohammadi, Mohammad

    2011-01-01

    Full text: Due to low dose delivery and fast scanning, dental Cone Beam CT (CBCT) is the latest technology being implemented for a range of dental imaging. The presence of metallic objects such as amalgam or gold fillings in the mouth, however, produces artefacts in images of the human jaw. The feasibility of a fast and accurate approach to metal artefact reduction for dental CBCT is investigated. The current study investigates metal artefact reduction using image segmentation and modification of several sinograms. To reduce metal effects such as beam hardening, streak artefacts and intense noise, the application of several algorithms is evaluated. The proposed method includes three stages: pre-processing, reconstruction and post-processing. In the pre-processing stage, several phase and frequency filters were applied to reduce the noise level. In the second stage, based on the specific sinogram achieved for each segment, spline interpolation and weighted backprojection filters were applied to reconstruct the original image. A three-dimensional filter was then applied to the reconstructed images to improve image quality. Results showed that, compared to other available filters, standard frequency filters have a significant influence in the pre-processing stage (ΔHU = 48 ± 6). In addition, with the streak artefact, the probability of beam hardening artefact increases. In the post-processing stage, the application of three-dimensional filters improves the quality of reconstructed images (see Fig. I). Conclusion: The proposed method reduces metal artefacts, especially where more than one metal implant is present in the region of interest.
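
    The sinogram-modification step can be illustrated as below. Linear interpolation along the detector axis stands in for the paper's spline interpolation, and the metal mask is assumed to come from a prior segmentation step.

```python
import numpy as np

def inpaint_sinogram(sino, metal_mask):
    """Replace the metal trace in a sinogram by interpolating along the
    detector axis, one projection angle (row) at a time.  Values flagged
    by metal_mask are re-estimated from the unflagged detector bins."""
    out = sino.astype(float).copy()
    det = np.arange(sino.shape[1])
    for a in range(sino.shape[0]):
        bad = metal_mask[a]
        if bad.any() and not bad.all():
            out[a, bad] = np.interp(det[bad], det[~bad], sino[a, ~bad])
    return out
```

    Reconstructing from the inpainted sinogram avoids backprojecting the inconsistent metal measurements that cause streaks.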

  1. Guided Image Filtering-Based Pan-Sharpening Method: A Case Study of GaoFen-2 Imagery

    Directory of Open Access Journals (Sweden)

    Yalan Zheng

    2017-12-01

    Full Text Available GaoFen-2 (GF-2) is a civilian optical satellite self-developed by China, equipped with both multispectral and panchromatic sensors, and is the first satellite in China with a resolution below 1 m. Because pan-sharpening methods for GF-2 imagery have not been a focus of previous works, we propose a novel pan-sharpening method based on guided image filtering and compare its performance to state-of-the-art methods on GF-2 images. Guided image filtering was introduced to decompose and transfer the details and structures from the original panchromatic and multispectral bands. Thereafter, an adaptive model that considers the local spectral relationship was designed to properly inject spatial information back into the original spectral bands. Four pairs of GF-2 images acquired over urban, water body, cropland, and forest areas were selected for the experiments. Both quantitative and visual inspections were used for the assessment. The experimental results demonstrated that for GF-2 imagery acquired over different scenes, the proposed approach consistently achieves high spectral fidelity and enhances spatial details, thereby benefitting potential classification procedures.
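
    The detail-injection idea can be sketched with a basic guided filter (He et al.). The brute-force box filter and the plain, gain-free injection below are simplifications; the paper adds an adaptive local-spectral model when injecting the details.

```python
import numpy as np

def box(img, r):
    # mean over a (2r+1)x(2r+1) window, clipped at the borders
    h, w = img.shape
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = img[max(0, i - r):i + r + 1,
                            max(0, j - r):j + r + 1].mean()
    return out

def guided_filter(I, p, r=2, eps=1e-3):
    # output q is locally a linear transform of the guide: q = a*I + b
    mI, mp = box(I, r), box(p, r)
    varI = box(I * I, r) - mI * mI
    covIp = box(I * p, r) - mI * mp
    a = covIp / (varI + eps)
    b = mp - a * mI
    return box(a, r) * I + box(b, r)

def pan_sharpen(pan, ms_band, r=2, eps=1e-3):
    # detail layer = pan minus its edge-preserving smoothing
    detail = pan - guided_filter(pan, pan, r, eps)
    return ms_band + detail
```

    A flat panchromatic band contributes no detail, so the spectral band passes through unchanged, which is the spectral-fidelity property the assessment measures.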

  2. Filter's importance in nuclear cardiology imaging

    International Nuclear Information System (INIS)

    Jesus, Maria C. de; Lima, Ana L.S.; Santos, Joyra A. dos; Megueriam, Berdj A.

    2008-01-01

    Full text: Nuclear Medicine is a medical specialty which employs tomographic procedures for the diagnosis, treatment and prevention of diseases. One of the most commonly used devices is the Single Photon Emission Computed Tomography (SPECT) system. To perform an exam, a very small amount of a radiopharmaceutical is given to the patient. A gamma camera is then placed in convenient positions to perform the photon counting, which is used to reconstruct the full 3-dimensional distribution of the radionuclide inside the body or organ. This reconstruction provides a 3-dimensional image, in spatial coordinates, of the body or organ under study, allowing the physician to make the diagnosis. Image reconstruction is usually carried out in the frequency domain, due to the great simplification introduced by the Fourier decomposition of image spectra. After the reconstruction, an inverse Fourier transform must be applied to bring the image back into spatial coordinates. To optimize this reconstruction procedure, digital filters are used to remove undesirable frequency components, which can 'shadow' relevant physical signatures of diseases. Unfortunately, the efficiency of the applied filter is strongly dependent on its mathematical parameters. In this work we demonstrate how filters interfere with image quality in SPECT cardiology examinations concerning perfusion and myocardial viability, and the importance of the medical physicist in choosing the right filters, avoiding serious problems that could occur through inadequate processing of an image, damaging the medical diagnosis. (author)

  3. Fractional order integration and fuzzy logic based filter for denoising of echocardiographic image.

    Science.gov (United States)

    Saadia, Ayesha; Rashdi, Adnan

    2016-12-01

    Ultrasound is widely used for imaging due to its cost effectiveness and safety. However, ultrasound images are inherently corrupted with speckle noise, which severely affects their quality and creates difficulty for physicians in diagnosis. To get maximum benefit from ultrasound imaging, image denoising is an essential requirement. To perform image denoising, a two-stage methodology using a fuzzy weighted mean and a fractional integration filter is proposed in this research work. In stage 1, image pixels are processed by applying a 3 × 3 window around each pixel, and fuzzy logic is used to assign weights to the pixels in each window, replacing the central pixel of the window with the weighted mean of all neighboring pixels in the same window. Noise suppression is achieved by assigning weights to the pixels while preserving edges and other important features of the image. In stage 2, the resultant image is further improved by a fractional order integration filter. The effectiveness of the proposed methodology has been analyzed for standard test images artificially corrupted with speckle noise and for real ultrasound B-mode images. Results of the proposed technique have been compared with different state-of-the-art techniques including Lsmv, Wiener, Geometric filter, Bilateral, Non-local means, Wavelet, Perona et al., Total Variation (TV), Global Adaptive Fractional Integral Algorithm (GAFIA) and Improved Fractional Order Differential (IFD) models. Comparison has been done on a quantitative and qualitative basis. For quantitative analysis, metrics such as Peak Signal to Noise Ratio (PSNR), Speckle Suppression Index (SSI), Structural Similarity (SSIM), Edge Preservation Index (β) and Correlation Coefficient (ρ) have been used. Simulations have been done using Matlab. Simulation results on artificially corrupted standard test images and two real echocardiographic images reveal that the proposed method outperforms existing image denoising techniques
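
    Stage 1 can be sketched as below; the Gaussian-shaped membership function is an assumption standing in for the paper's fuzzy rules, which it does not spell out here.

```python
import numpy as np

def fuzzy_weighted_mean(img, sigma=0.1):
    """Replace each pixel by a weighted mean of its 3x3 neighbourhood,
    with membership weights that decay with intensity difference from
    the centre pixel, so cross-edge neighbours get near-zero weight and
    edges are preserved."""
    h, w = img.shape
    pad = np.pad(img.astype(float), 1, mode='edge')
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 3, j:j + 3]
            centre = pad[i + 1, j + 1]
            wgt = np.exp(-((win - centre) / sigma) ** 2)   # fuzzy membership
            out[i, j] = np.sum(wgt * win) / np.sum(wgt)
    return out
```

    With a small sigma an intensity step survives untouched, while in flat noisy regions the weights approach a plain mean filter and suppress the noise.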

  4. Efficient Scalable Median Filtering Using Histogram-Based Operations.

    Science.gov (United States)

    Green, Oded

    2018-05-01

    Median filtering is a smoothing technique for noise removal in images. While there are various implementations of median filtering for a single-core CPU, there are few implementations for accelerators and multi-core systems. Many parallel implementations of median filtering use a sorting algorithm for rearranging the values within a filtering window and taking the median of the sorted values. While using sorting algorithms allows for simple parallel implementations, the cost of the sorting becomes prohibitive as the filtering windows grow. This makes such algorithms, sequential and parallel alike, inefficient. In this work, we introduce the first software parallel median filtering that is not sorting-based. The new algorithm uses efficient histogram-based operations. These reduce the computational requirements of the new algorithm while also accessing the image fewer times. We show an implementation of our algorithm for both the CPU and NVIDIA's CUDA-supported graphics processing unit (GPU). The new algorithm is compared with several other leading CPU and GPU implementations. The CPU implementation shows near-perfect linear scaling on a quad-core system. The GPU implementation is several orders of magnitude faster than the other GPU implementations for mid-size median filters. For small kernels, comparison-based approaches are preferable as fewer operations are required. Lastly, the new algorithm is open-source and can be found in the OpenCV library.
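
    The core idea, maintaining a running histogram instead of sorting each window, can be sketched for 8-bit images as follows. This is a sequential sketch of the principle only; the paper's algorithm is parallel and considerably more refined.

```python
import numpy as np

def median_from_hist(hist, target):
    # smallest value whose cumulative count reaches the median rank
    acc = 0
    for v, n in enumerate(hist):
        acc += n
        if acc >= target:
            return v
    raise ValueError("empty histogram")

def median_filter_hist(img, k=3):
    """Median filter via a running histogram per row sweep.  Moving the
    window one column right removes the outgoing column from the
    histogram and adds the incoming one, so each pixel costs O(k + 256)
    instead of sorting k*k values."""
    r = k // 2
    pad = np.pad(img, r, mode='edge')
    h, w = img.shape
    out = np.empty((h, w), dtype=img.dtype)
    target = (k * k) // 2 + 1                  # rank of the median
    for i in range(h):
        hist = np.zeros(256, dtype=int)
        for v in pad[i:i + k, 0:k].ravel():    # histogram of first window
            hist[v] += 1
        out[i, 0] = median_from_hist(hist, target)
        for j in range(1, w):
            for v in pad[i:i + k, j - 1]:      # drop outgoing column
                hist[v] -= 1
            for v in pad[i:i + k, j + k - 1]:  # add incoming column
                hist[v] += 1
            out[i, j] = median_from_hist(hist, target)
    return out
```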

  5. Preliminary energy-filtering neutron imaging with time-of-flight method on PKUNIFTY: A compact accelerator based neutron imaging facility at Peking University

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Hu; Zou, Yubin, E-mail: zouyubin@pku.edu.cn; Wen, Weiwei; Lu, Yuanrong; Guo, Zhiyu

    2016-07-01

    Peking University Neutron Imaging Facility (PKUNIFTY) operates on an accelerator-based neutron source with a repetition period of 10 ms and pulse duration of 0.4 ms, which has a rather low Cd ratio. To improve the effective Cd ratio and thus the detection capability of the facility, energy-filtering neutron imaging was realized with an intensified CCD camera and the time-of-flight (TOF) method. The time structure of the pulsed neutron source was first simulated with Geant4, and the simulation result was evaluated experimentally. Both simulation and experiment indicated that fast and epithermal neutrons were concentrated in the first 0.8 ms of each pulse period, while in the interval 0.8–2.0 ms only thermal neutrons were present. Based on this result, neutron images with and without energy filtering were acquired, and the detection capability of PKUNIFTY was shown to improve when the exposure interval was set to 0.8–2.0 ms, especially for materials with strong moderating capability.

  6. Collaborating Filtering Community Image Recommendation System Based on Scene

    Directory of Open Access Journals (Sweden)

    He Tao

    2017-01-01

    Full Text Available With the advancement of the smart city and the development of intelligent mobile terminals and wireless networks, traditional text information services no longer meet the needs of community residents; community image services have appeared as a new media service. "Pictures tell the truth": images have become a new way for community residents to understand and keep up with community developments, and image information services have become a new kind of information service. However, there are two major problems in image information services. Firstly, the low-level feature values extracted by current image feature extraction techniques are difficult for users to understand, and there is a semantic gap between the image content itself and the user's understanding. Secondly, as community image data grows rapidly, it is difficult for users to find the images they are interested in. Aiming at these two problems, this paper proposes a unified image semantic scene model to express image content. On this basis, a collaborative filtering recommendation model fusing scene semantics is proposed. In the recommendation model, a comprehensive and accurate user interest model is proposed to improve recommendation quality. The approach has achieved good results in the pilot cities of Wenzhou and Yan'an, where it is in regular use.

  7. Digital Path Approach Despeckle Filter for Ultrasound Imaging and Video

    Directory of Open Access Journals (Sweden)

    Marek Szczepański

    2017-01-01

    Full Text Available We propose a novel filtering technique capable of reducing the multiplicative noise in ultrasound images, an extension of the denoising algorithms based on the concept of digital paths. In this approach, the filter weights are calculated taking into account the similarity between pixel intensities that belong to the local neighborhood of the processed pixel, called a path. The output of the filter is estimated as the weighted average of pixels connected by the paths. The way paths are created is pivotal and determines the effectiveness and computational complexity of the proposed filtering design. Such a procedure can be effective for different types of noise but fails in the presence of multiplicative noise. To increase the filtering efficiency for this type of disturbance, we introduce some improvements of the basic concept and new classes of similarity functions, and finally extend our techniques to the spatiotemporal domain. The experimental results show that the proposed algorithm provides results comparable with state-of-the-art techniques for multiplicative noise removal in ultrasound images and can be applied for real-time enhancement of video streams.

  8. A filtering approach to image reconstruction in 3D SPECT

    International Nuclear Information System (INIS)

    Bronnikov, Andrei V.

    2000-01-01

    We present a new approach to three-dimensional (3D) image reconstruction using analytical inversion of the exponential divergent beam transform, which can serve as a mathematical model for cone-beam 3D SPECT imaging. We apply a circular cone-beam scan and assume constant attenuation inside a convex area with a known boundary, which is satisfactory in brain imaging. The reconstruction problem is reduced to an image restoration problem characterized by a shift-variant point spread function which is given analytically. The method requires two computation steps: backprojection and filtering. The modulation transfer function (MTF) of the filter is derived by means of an original methodology using the 2D Laplace transform. The filter is implemented in the frequency domain and requires 2D Fourier transform of transverse slices. In order to obtain a shift-invariant cone-beam projection-backprojection operator we resort to an approximation, assuming that the collimator has a relatively large focal length. Nevertheless, numerical experiments demonstrate surprisingly good results for detectors with relatively short focal lengths. The use of a wavelet-based filtering algorithm greatly improves the stability to Poisson noise. (author)

  9. Image pre-filtering for measurement error reduction in digital image correlation

    Science.gov (United States)

    Zhou, Yihao; Sun, Chen; Song, Yuntao; Chen, Jubing

    2015-02-01

    In digital image correlation, the sub-pixel intensity interpolation causes a systematic error in the measured displacements. The error increases toward high-frequency component of the speckle pattern. In practice, a captured image is usually corrupted by additive white noise. The noise introduces additional energy in the high frequencies and therefore raises the systematic error. Meanwhile, the noise also elevates the random error which increases with the noise power. In order to reduce the systematic error and the random error of the measurements, we apply a pre-filtering to the images prior to the correlation so that the high-frequency contents are suppressed. Two spatial-domain filters (binomial and Gaussian) and two frequency-domain filters (Butterworth and Wiener) are tested on speckle images undergoing both simulated and real-world translations. By evaluating the errors of the various combinations of speckle patterns, interpolators, noise levels, and filter configurations, we come to the following conclusions. All the four filters are able to reduce the systematic error. Meanwhile, the random error can also be reduced if the signal power is mainly distributed around DC. For high-frequency speckle patterns, the low-pass filters (binomial, Gaussian and Butterworth) slightly increase the random error and Butterworth filter produces the lowest random error among them. By using Wiener filter with over-estimated noise power, the random error can be reduced but the resultant systematic error is higher than that of low-pass filters. In general, Butterworth filter is recommended for error reduction due to its flexibility of passband selection and maximal preservation of the allowed frequencies. Binomial filter enables efficient implementation and thus becomes a good option if computational cost is a critical issue. While used together with pre-filtering, B-spline interpolator produces lower systematic error than bicubic interpolator and similar level of the random
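
    A separable binomial pre-filter of the kind tested above can be sketched as follows; the kernel order and border handling are illustrative choices.

```python
import numpy as np

def binomial_kernel(order=2):
    """1-D binomial low-pass kernel from Pascal's triangle, normalised.
    order=2 gives [1, 2, 1] / 4."""
    k = np.array([1.0])
    for _ in range(order):
        k = np.convolve(k, [1.0, 1.0])
    return k / k.sum()

def prefilter(img, order=2):
    """Separable binomial smoothing along both axes, applied before the
    correlation so that the high frequencies where interpolation bias
    and noise concentrate are suppressed."""
    k = binomial_kernel(order)
    out = np.apply_along_axis(lambda row: np.convolve(row, k, mode='same'),
                              1, img.astype(float))
    return np.apply_along_axis(lambda col: np.convolve(col, k, mode='same'),
                               0, out)
```

    Binomial kernels approximate a Gaussian while needing only integer weights, which is why the passage singles them out when computational cost matters.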

  10. Supervised Filter Learning for Representation Based Face Recognition.

    Directory of Open Access Journals (Sweden)

    Chao Bi

    Full Text Available Representation based classification methods, such as Sparse Representation Classification (SRC) and Linear Regression Classification (LRC), have been developed successfully for the face recognition problem. However, most of these methods use the original face images without any preprocessing for recognition. Thus, their performance may be affected by problematic factors (such as illumination and expression variances) in the face images. In order to overcome this limitation, a novel supervised filter learning algorithm is proposed for representation based face recognition in this paper. The underlying idea of our algorithm is to learn a filter so that the within-class representation residuals of the faces' Local Binary Pattern (LBP) features are minimized and the between-class representation residuals of the faces' LBP features are maximized. Therefore, the LBP features of filtered face images are more discriminative for representation based classifiers. Furthermore, we also extend our algorithm to the heterogeneous face recognition problem. Extensive experiments are carried out on five databases and the experimental results verify the efficacy of the proposed algorithm.

  11. Energy Based Clutter Filtering for Vector Flow Imaging

    DEFF Research Database (Denmark)

    Villagómez Hoyos, Carlos Armando; Jensen, Jonas; Ewertsen, Caroline

    2017-01-01

    for obtaining vector flow measurements, since the spectra overlap at high beam-to-flow angles. In this work a distinct approach is proposed, where the energy of the velocity spectrum is used to differentiate between the two signals. The energy based method is applied by limiting the amplitude of the velocity spectrum function to a predetermined threshold. The effect of the clutter filtering is evaluated on a plane wave (PW) scan sequence in combination with transverse oscillation (TO) and directional beamforming (DB) for velocity estimation. The performance of the filter is assessed by comparison...

  12. Superresolution restoration of an image sequence: adaptive filtering approach.

    Science.gov (United States)

    Elad, M; Feuer, A

    1999-01-01

    This paper presents a new method, based on adaptive filtering theory, for superresolution restoration of continuous image sequences. The proposed methodology uses least squares (LS) estimators which adapt in time, based on the least mean squares (LMS) or recursive least squares (RLS) adaptive filters. The adaptation enables the treatment of linear space- and time-variant blurring and arbitrary motion, both of them assumed known. The proposed new approach is shown to have relatively low computational requirements. Simulations demonstrating the superresolution restoration algorithms are presented.
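
    The LMS recursion underlying such adaptive estimators can be sketched in one dimension; the superresolution filter operates on image sequences, but the gradient-descent update on the instantaneous squared error is the same.

```python
import numpy as np

def lms(x, d, n_taps=4, mu=0.05):
    """Least-mean-squares adaptive filter: at each step the tap weights
    move along the negative gradient of the instantaneous squared error
    e = d - w.x."""
    w = np.zeros(n_taps)
    errs = []
    for n in range(n_taps, len(x)):
        xv = x[n - n_taps:n][::-1]          # most recent sample first
        e = d[n] - w @ xv
        w += mu * e * xv                     # gradient-descent update
        errs.append(e)
    return w, np.array(errs)
```

    Identifying an unknown FIR system is the classic sanity check: with a noise-free desired signal, the weights converge to the true taps and the error decays toward zero.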

  13. Improving the quality of brain CT image from Wavelet filters

    International Nuclear Information System (INIS)

    Pita Machado, Reinaldo; Perez Diaz, Marlen; Bravo Pino, Rolando

    2012-01-01

    An algorithm to reduce Poisson noise using Wavelet filters is described. Five tomographic images of patients and a head anthropomorphic phantom were used, acquired with two different CT machines. Because the original images already contain acquisition noise, simulated noise-free lesions were added to the images, after which the whole images were contaminated with noise. Contaminated images were filtered with 9 Wavelet filters at different decomposition levels and thresholds. The image quality of filtered and unfiltered images was graded using the Signal to Noise Ratio, Normalized Mean Square Error and the Structural Similarity Index, as well as by the subjective JAFROC method with 5 observers. Some filters, such as Bior 3.7 and dB45, improved head CT image quality in a significant way (p<0.05), producing an increment in SNR without visible structural distortions
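
    The threshold-in-wavelet-domain idea can be sketched with a single-level Haar transform; the paper uses Bior 3.7 and dB45 filters, but Haar keeps the sketch dependency-free, and the threshold value below is an illustrative assumption.

```python
import numpy as np

def haar2(img):
    """One level of the 2-D Haar wavelet transform: approximation (ll)
    and three detail sub-bands (lh, hl, hh)."""
    a = (img[0::2] + img[1::2]) / 2.0        # rows: average
    d = (img[0::2] - img[1::2]) / 2.0        # rows: difference
    ll, lh = (a[:, 0::2] + a[:, 1::2]) / 2.0, (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl, hh = (d[:, 0::2] + d[:, 1::2]) / 2.0, (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def ihaar2(ll, lh, hl, hh):
    a = np.empty((ll.shape[0], 2 * ll.shape[1]))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    out = np.empty((2 * a.shape[0], a.shape[1]))
    out[0::2], out[1::2] = a + d, a - d
    return out

def denoise_haar(img, thresh):
    # zero small detail coefficients (hard thresholding) and invert
    ll, lh, hl, hh = haar2(img)
    bands = [np.where(np.abs(b) > thresh, b, 0.0) for b in (lh, hl, hh)]
    return ihaar2(ll, *bands)
```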

  14. Spectral characterization in deep UV of an improved imaging KDP acousto-optic tunable filter

    International Nuclear Information System (INIS)

    Gupta, Neelam; Voloshinov, Vitaly

    2014-01-01

    Recently, we developed a number of high quality noncollinear acousto-optic tunable filter (AOTF) cells in different birefringent materials with UV imaging capability. Cells based on a single crystal of KDP (potassium dihydrogen phosphate) had the best transmission efficiency and the optical throughput needed to acquire high quality spectral images at wavelengths above 220 nm. One of the main limitations of these imaging filters was their small angular aperture in air, limited to about 1.0°. In this paper, we describe an improved imaging KDP AOTF operating from the deep UV to the visible region of the spectrum. The linear and angular apertures of the new filter are 10 × 10 mm² and 1.8°, respectively. The spectral tuning range is 205–430 nm with a 60 cm⁻¹ spectral resolution. We describe the filter and present experimental results on imaging using both a broadband source and a number of light emitting diodes (LEDs) in the UV, and include the measured spectra of these LEDs obtained with a collinear SiO₂ filter-based spectrometer operating above 255 nm. (paper)

  15. Optimization of Butterworth filter for brain SPECT imaging

    International Nuclear Information System (INIS)

    Minoshima, Satoshi; Maruno, Hirotaka; Yui, Nobuharu

    1993-01-01

    A method is described to optimize the cutoff frequency of the Butterworth filter for brain SPECT imaging. Since a computer simulation study demonstrated that the separation between the object signal and the random noise in projection images in the spatial-frequency domain is influenced by the total number of counts, the cutoff frequency of the Butterworth filter should be optimized for each subject according to the total counts in a study. To reveal the relationship between the optimal cutoff frequency and total counts in brain SPECT studies, we used a normal volunteer and 99mTc hexamethyl-propyleneamine oxime (HMPAO) to obtain projection sets with different total counts. High quality images were created from a projection set with an acquisition time of 300 seconds per projection. The filter was optimized by calculating mean square errors with respect to the high quality images and by visually inspecting filtered reconstructed images. The dependence of the optimal cutoff frequency on total counts was clearly demonstrated in a nomogram. Using this nomogram, the optimal cutoff frequency for each study can be estimated from the total counts, maximizing visual image quality. The results suggest that the cutoff frequency of the Butterworth filter should be determined by referring to the total counts in each study. (author)
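
    The Butterworth transfer function whose cutoff is being optimized has the standard form H(f) = 1 / (1 + (f/f_c)^(2n)); a minimal frequency-domain sketch, with the filter order as an illustrative choice:

```python
import numpy as np

def butterworth_lowpass(shape, cutoff, order=5):
    """2-D Butterworth low-pass transfer function over radial spatial
    frequency f (cycles/pixel): H = 1 / (1 + (f / cutoff)^(2*order))."""
    fy = np.fft.fftfreq(shape[0])
    fx = np.fft.fftfreq(shape[1])
    f = np.hypot(*np.meshgrid(fy, fx, indexing='ij'))
    return 1.0 / (1.0 + (f / cutoff) ** (2 * order))

def filter_image(img, cutoff, order=5):
    # multiply the image spectrum by H and transform back
    H = butterworth_lowpass(img.shape, cutoff, order)
    return np.real(np.fft.ifft2(np.fft.fft2(img) * H))
```

    Lowering the cutoff suppresses more noise but also more signal detail, which is exactly the trade-off the count-dependent nomogram resolves.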

  16. The value of filtered planar images in pediatric DMSA scans

    International Nuclear Information System (INIS)

    Mohammed, A.M.; Naddaf, S.Y.; Elgazzar, A.H.; Al-Abdul Salam, A.A.; Omar, A.A.

    2006-01-01

    The study was designed to demonstrate the value of filtered planar images in paediatric DMSA scanning. One hundred and seventy three patients ranged in age from 15 days to 12 years (mean: 4.3 years) with urinary tract infection (UTI) and clinical and/or laboratory suspicion of acute pyelonephritis (APN) were retrospectively studied. Planar images were filtered using Butterworth filter. The scan findings were reported as positive, negative or equivocal for cortical defects. Each scan was read in a double-blind fashion by two nuclear medicine physicians to evaluate inter-observer variations. Each kidney was divided into three zones, upper, middle and lower, and each zone was graded as positive, negative or equivocal for the presence of renal defects. Renal cortical defects were found in 66 patients (91 kidneys and 186 zones) with filtered images, 58 patients (81 kidneys and 175 zones) with planar images, and 69 patients (87 kidneys and 180 zones) with SPECT images. McNemar's test revealed statistically significant difference between filtered and planar images (p=0.038 for patients, 0.021 for kidneys and 0.034 for number of zones). Inter-observer agreement was 0.877 for filtered images, 0.915 for planar images and 0.915 for SPECT images. It was concluded that filtered planar images of renal cortex are comparable to SPECT images and can be used effectively in place of SPECT, when required, to shorten imaging time and eliminate motion artifacts, especially in the paediatric population. (author)

  17. Gaussian particle filter based pose and motion estimation

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Determination of the relative three-dimensional (3D) position, orientation, and relative motion between two reference frames is an important problem in robotic guidance, manipulation, and assembly as well as in other fields such as photogrammetry. A solution to the pose and motion estimation problem that uses two-dimensional (2D) intensity images from a single camera is desirable for real-time applications. The difficulty in performing this measurement is that the process of projecting 3D object features to 2D images is a nonlinear transformation. In this paper, the 3D transformation is modeled as a nonlinear stochastic system with the state estimation providing six degrees-of-freedom motion and position values, using line features in the image plane as measurement inputs and dual quaternions to represent both rotation and translation in a unified notation. A filtering method called the Gaussian particle filter (GPF), based on the particle filtering concept, is presented for 3D pose and motion estimation of a moving target from monocular image sequences. The method has been implemented with simulated data, and simulation results are provided along with comparisons to the extended Kalman filter (EKF) and the unscented Kalman filter (UKF) to show the relative advantages of the GPF. Simulation results showed that the GPF is a superior alternative to the EKF and UKF.
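
The GPF idea, weighting particles by the measurement likelihood, fitting a Gaussian to the weighted posterior, and resampling from that Gaussian, can be sketched for a toy scalar state. This is a minimal illustration under assumed noise levels, not the paper's six-degrees-of-freedom dual-quaternion model:

```python
import math
import random

def gpf_scalar(measurements, meas_std=0.5, proc_std=0.05,
               n_particles=500, seed=1):
    """Minimal Gaussian particle filter for a (nearly) static scalar state.

    Predict: add process noise to each particle.  Update: weight particles
    by a Gaussian measurement likelihood, fit a Gaussian (weighted mean and
    variance) to the posterior, and redraw the particles from it.
    """
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 2.0) for _ in range(n_particles)]
    mean = 0.0
    for z in measurements:
        particles = [p + rng.gauss(0.0, proc_std) for p in particles]
        w = [math.exp(-0.5 * ((z - p) / meas_std) ** 2) for p in particles]
        s = sum(w)
        w = [wi / s for wi in w]
        mean = sum(wi * p for wi, p in zip(w, particles))
        var = sum(wi * (p - mean) ** 2 for wi, p in zip(w, particles))
        particles = [rng.gauss(mean, math.sqrt(max(var, 1e-12)))
                     for _ in range(n_particles)]
    return mean

rng = random.Random(7)
true_state = 2.0
zs = [true_state + rng.gauss(0.0, 0.5) for _ in range(40)]
estimate = gpf_scalar(zs)
```

Resampling from the fitted Gaussian (rather than from the discrete particle set) is what distinguishes the GPF from a bootstrap particle filter.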

  18. Three-Dimensional Terahertz Coded-Aperture Imaging Based on Matched Filtering and Convolutional Neural Network.

    Science.gov (United States)

    Chen, Shuo; Luo, Chenggao; Wang, Hongqiang; Deng, Bin; Cheng, Yongqiang; Zhuang, Zhaowen

    2018-04-26

    As a promising radar imaging technique, terahertz coded-aperture imaging (TCAI) can achieve high-resolution, forward-looking, and staring imaging by producing spatiotemporally independent signals with coded apertures. However, there are still two problems in three-dimensional (3D) TCAI. Firstly, the large-scale reference-signal matrix based on meshing the 3D imaging area creates a heavy computational burden, thus leading to unsatisfactory efficiency. Secondly, it is difficult to resolve the target under low signal-to-noise ratio (SNR). In this paper, we propose a 3D imaging method based on matched filtering (MF) and a convolutional neural network (CNN), which can reduce the computational burden and achieve high-resolution imaging of low-SNR targets. For the frequency-hopping (FH) signal, the original echo is processed with MF. By extracting the processed echo at different spike pulses separately, targets in different imaging planes are reconstructed simultaneously to decompose the global computational complexity, and then synthesized together to reconstruct the 3D target. Based on the conventional TCAI model, we derive and build a new TCAI model based on MF. Furthermore, a CNN is designed to teach the MF-TCAI how to better reconstruct low-SNR targets. The experimental results demonstrate that MF-TCAI achieves impressive imaging ability and efficiency under low SNR, and has learned to better resolve low-SNR 3D targets with the help of the CNN. In summary, the proposed 3D TCAI can achieve: (1) low-SNR high-resolution imaging by using MF; (2) efficient 3D imaging by downsizing the large-scale reference-signal matrix; and (3) intelligent imaging with the CNN. Therefore, TCAI based on MF and CNN has great potential in applications such as security screening, nondestructive detection, medical diagnosis, etc.
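
The matched-filtering step itself can be illustrated with a minimal sketch: cross-correlating a signal with a known template produces a peak where the template is embedded. The template and clutter below are invented for illustration and are not the paper's FH waveform:

```python
import math

def matched_filter(signal, template):
    """Cross-correlate the signal with the template; the output peaks at
    the offset where the template best matches the signal."""
    n, m = len(signal), len(template)
    return [sum(signal[i + j] * template[j] for j in range(m))
            for i in range(n - m + 1)]

# Embed a short binary-phase template in weak deterministic clutter.
template = [1.0, -1.0, 1.0, 1.0, -1.0, 1.0, -1.0, -1.0]
offset = 20
signal = [0.05 * math.sin(0.9 * i) for i in range(64)]   # low-level clutter
for j, t in enumerate(template):
    signal[offset + j] += t

response = matched_filter(signal, template)
peak = max(range(len(response)), key=response.__getitem__)
```

The peak value approaches the template's energy, which is why matched filtering is the optimal linear detector for a known signal in additive noise.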

  19. An approach of point cloud denoising based on improved bilateral filtering

    Science.gov (United States)

    Zheng, Zeling; Jia, Songmin; Zhang, Guoliang; Li, Xiuzhi; Zhang, Xiangyin

    2018-04-01

    An omnidirectional mobile platform is designed for building point clouds based on an improved filtering algorithm that is employed to handle the depth image. First, the mobile platform can move flexibly and its control interface is convenient. Then, because the traditional bilateral filtering algorithm is time-consuming and inefficient, a novel method called local bilateral filtering (LBF) is proposed. LBF is applied to the depth images obtained by the Kinect sensor. The results show that noise removal is improved compared with standard bilateral filtering. In the off-line condition, the color images and processed depth images are used to build point clouds. Finally, experimental results demonstrate that our method reduces the processing time of the depth image and improves the quality of the resulting point cloud.
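
For reference, the classic bilateral filter that LBF builds on can be sketched as follows: each output pixel is a weighted mean of its neighbours, with weights that fall off with both spatial distance and intensity difference, so strong depth edges survive. This is a plain-Python toy on a small step image with illustrative parameters:

```python
import math

def bilateral(img, sigma_s=1.0, sigma_r=1.0, radius=1):
    """Bilateral filter: spatial Gaussian weight times range (intensity)
    Gaussian weight, normalized per pixel."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = norm = 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        ws = math.exp(-(dx * dx + dy * dy)
                                      / (2 * sigma_s ** 2))
                        diff = img[ny][nx] - img[y][x]
                        wr = math.exp(-(diff * diff) / (2 * sigma_r ** 2))
                        acc += ws * wr * img[ny][nx]
                        norm += ws * wr
            out[y][x] = acc / norm
    return out

# A vertical step edge: left half 0, right half 10.
step = [[0.0, 0.0, 10.0, 10.0] for _ in range(4)]
filtered = bilateral(step)
```

Because the range weight across the step is essentially zero, the edge is preserved, which is exactly the property a Gaussian blur lacks.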

  20. Dynamic positron emission tomography image restoration via a kinetics-induced bilateral filter.

    Directory of Open Access Journals (Sweden)

    Zhaoying Bian

    Full Text Available Dynamic positron emission tomography (PET) imaging is a powerful tool that provides useful quantitative information on physiological and biochemical processes. However, the low signal-to-noise ratio in short dynamic frames makes accurate kinetic parameter estimation from noisy voxel-wise time activity curves (TAC) a challenging task. To address this problem, several spatial filters have been investigated to reduce the noise of each frame with noticeable gains, including the Gaussian filter, bilateral filter, and wavelet-based filter. These filters usually consider only the local properties of each frame without exploring potential kinetic information from the entire set of frames. Thus, in this work, to improve PET parametric imaging accuracy, we present a kinetics-induced bilateral filter (KIBF) to reduce the noise of dynamic image frames by incorporating the similarity between the voxel-wise TACs into the framework of the bilateral filter. The aim of the proposed KIBF algorithm is to reduce the noise in homogeneous areas while preserving the distinct kinetics of regions of interest. Experimental results on a digital brain phantom and an in vivo rat study with typical 18F-FDG kinetics have shown that the present KIBF algorithm can achieve notable gains over other existing algorithms in terms of quantitative accuracy measures and visual inspection.
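
The core KIBF idea, replacing the bilateral filter's single-frame intensity difference with a distance between whole time-activity curves, can be sketched on a 1-D line of voxels. This is a simplified illustration with assumed parameters, not the authors' implementation:

```python
import math

def kibf_1d(tacs, sigma_s=1.0, sigma_t=1.0, radius=1):
    """Kinetics-induced bilateral filter on a 1-D voxel line: the range
    weight compares whole time-activity curves (TACs), so voxels with
    similar kinetics are averaged while distinct kinetics survive."""
    n, frames = len(tacs), len(tacs[0])
    out = []
    for i in range(n):
        acc = [0.0] * frames
        norm = 0.0
        for j in range(max(0, i - radius), min(n, i + radius + 1)):
            d2 = sum((a - b) ** 2 for a, b in zip(tacs[i], tacs[j]))
            w = (math.exp(-((i - j) ** 2) / (2 * sigma_s ** 2))
                 * math.exp(-d2 / (2 * sigma_t ** 2)))
            for f in range(frames):
                acc[f] += w * tacs[j][f]
            norm += w
        out.append([a / norm for a in acc])
    return out

# Three voxels share one kinetic pattern; the fourth has distinct kinetics.
tacs = [[1.0, 2.0, 3.0], [1.1, 2.1, 3.1], [0.9, 1.9, 2.9], [8.0, 6.0, 4.0]]
smoothed = kibf_1d(tacs)
```

The similar voxels are averaged (noise reduction), while the kinetically distinct voxel is left essentially untouched.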

  1. SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering

    International Nuclear Information System (INIS)

    Iliopoulos, AS; Sun, X; Floros, D; Zhang, Y; Yin, FF; Ren, L; Pitsianis, N

    2016-01-01

    Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or un-supervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented in parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to spatial variations in noise and signal.

  2. SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering

    Energy Technology Data Exchange (ETDEWEB)

    Iliopoulos, AS; Sun, X [Duke University, Durham, NC (United States); Floros, D [Aristotle University of Thessaloniki (Greece); Zhang, Y; Yin, FF; Ren, L [Duke University Medical Center, Durham, NC (United States); Pitsianis, N [Aristotle University of Thessaloniki (Greece); Duke University, Durham, NC (United States)

    2016-06-15

    Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or un-supervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented in parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to spatial variations in noise and signal.
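
The local-statistics pyramid described in the Methods section can be approximated with a minimal sketch: windowed mean and standard deviation computed at two radii, whose comparison across scales reveals where image structure changes scope. This is a toy illustration, not the authors' parallel CPU/GPU implementation:

```python
def local_stats(img, radius):
    """Windowed mean and standard deviation at one spatial scale."""
    h, w = len(img), len(img[0])
    means = [[0.0] * w for _ in range(h)]
    stds = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[ny][nx]
                    for ny in range(max(0, y - radius), min(h, y + radius + 1))
                    for nx in range(max(0, x - radius), min(w, x + radius + 1))]
            m = sum(vals) / len(vals)
            means[y][x] = m
            stds[y][x] = (sum((v - m) ** 2 for v in vals) / len(vals)) ** 0.5
    return means, stds

# Flat region on the left, a step edge in the middle of each row.
img = [[0.0] * 4 + [10.0] * 4 for _ in range(8)]
pyramid = {r: local_stats(img, r) for r in (1, 2)}   # two scales
```

Comparing the standard-deviation maps across the two radii shows which pixels look "flat" at one scale but not the other, which is the inter-scale difference the method exploits.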

  3. A diagnostic imaging approach for online characterization of multi-impact in aircraft composite structures based on a scanning spatial-wavenumber filter of guided wave

    Science.gov (United States)

    Ren, Yuanqiang; Qiu, Lei; Yuan, Shenfang; Su, Zhongqing

    2017-06-01

    Monitoring of impact, and multi-impact in particular, in aircraft composite structures has been an intensive research topic in the field of guided-wave-based structural health monitoring (SHM). Compared with the majority of existing methods, such as those using signal features in the time, frequency or joint time-frequency domain, the approach based on a spatial-wavenumber filter of guided waves shows a superb advantage in effectively distinguishing particular wave modes and identifying their propagation direction relative to the sensor array. However, two major issues arise when conducting online characterization of a multi-impact event. First, the spatial-wavenumber filter must be realized even though the high-spatial-resolution wavenumber of the complicated multi-impact signal cannot be measured or modeled. Second, it is difficult to identify the multiple impacts and realize multi-impact localization due to the overlapping of wavenumbers. To address these issues, a scanning spatial-wavenumber filter based diagnostic imaging method for online characterization of multi-impact events is proposed in this paper to conduct multi-impact imaging and localization. The principle of the scanning filter for multi-impact is developed first to conduct spatial-wavenumber filtering and to achieve wavenumber-time imaging of the multiple impacts. Then, a feature identification method for multi-impact based on eigenvalue decomposition and wavenumber searching is presented to estimate the number of impacts and calculate the wavenumber of the multi-impact signal, and an image mapping method is proposed as well to convert the wavenumber-time image to an angle-distance image to distinguish and locate the multiple impacts. A series of multi-impact events were applied to a carbon fiber laminate plate to validate the proposed methods. The validation results show that localization of the multiple impacts is well achieved.
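
The underlying spatial-wavenumber filtering operation can be illustrated on a 1-D sensor array: transform the array samples to the wavenumber domain, keep one wavenumber bin (and its conjugate), and transform back. This is a toy DFT-based sketch with invented mode numbers, not the scanning filter of the paper:

```python
import cmath
import math

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def wavenumber_filter(samples, keep):
    """Zero every wavenumber bin except `keep` (and its conjugate bin),
    isolating a single spatial mode from the array snapshot."""
    n = len(samples)
    X = dft(samples)
    for k in range(n):
        if k != keep and k != (n - keep) % n:
            X[k] = 0
    return [v.real for v in idft(X)]

# Array snapshot containing two spatial modes (wavenumbers 3 and 10).
n = 32
array = [math.sin(2 * math.pi * 3 * t / n) + math.sin(2 * math.pi * 10 * t / n)
         for t in range(n)]
mode = wavenumber_filter(array, keep=3)
expected = [math.sin(2 * math.pi * 3 * t / n) for t in range(n)]
```

Selecting one wavenumber is what lets the method separate wave modes and, with a 2-D array, their propagation direction.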

  4. Implementation of non-linear filters for iterative penalized maximum likelihood image reconstruction

    International Nuclear Information System (INIS)

    Liang, Z.; Gilland, D.; Jaszczak, R.; Coleman, R.

    1990-01-01

    In this paper, the authors report on the implementation of six edge-preserving, noise-smoothing, non-linear filters applied in image space for iterative penalized maximum-likelihood (ML) SPECT image reconstruction. The non-linear smoothing filters implemented were the median filter, the E6 filter, the sigma filter, the edge-line filter, the gradient-inverse filter, and the 3-point edge filter with gradient-inverse weight. A 3 x 3 window was used for all these filters. The best image obtained, judged by viewing profiles through the image in terms of noise-smoothing, edge-sharpening, and contrast, was the one smoothed with the 3-point edge filter. The computation time for the smoothing was less than 1% of one iteration, and the memory space for the smoothing was negligible. These images were compared with the results obtained using Bayesian analysis
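
Of the filters listed, the median filter is the easiest to sketch; the toy below shows its defining property of removing an isolated impulse while leaving a uniform region untouched, using a 3 x 3 window as in the paper:

```python
import statistics

def median3x3(img):
    """3 x 3 median filter: each pixel becomes the median of its
    neighbourhood, which suppresses impulse noise but keeps edges sharp."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = statistics.median(vals)
    return out

noisy = [[5.0] * 5 for _ in range(5)]
noisy[2][2] = 100.0          # isolated impulse ("salt" pixel)
clean = median3x3(noisy)
```

A linear smoothing filter would smear the impulse into its neighbours; the median discards it entirely, which is why non-linear filters suit the edge-preserving requirement here.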

  5. Multi-dimensional medical images compressed and filtered with wavelets

    International Nuclear Information System (INIS)

    Boyen, H.; Reeth, F. van; Flerackers, E.

    2002-01-01

    Full text: Using the standard wavelet decomposition methods, multi-dimensional medical images can be compressed and filtered by repeating the wavelet algorithm on 1D signals in an extra loop per extra dimension. In the non-standard decomposition for multi-dimensional images, the areas that must be zero-filled in the case of band- or notch-filters are more complex than geometric areas such as rectangles or cubes. Adding additional dimensions to this algorithm up to 4D (e.g. a 3D beating heart) increases the geometric complexity of those areas even more. The aim of our study was to calculate the boundaries of the resulting complex geometric areas, so we can use the faster non-standard decomposition to compress and filter multi-dimensional medical images. Because many 3D medical images taken by PET or SPECT cameras have only a few layers in the Z-dimension, and compressing images in a dimension with only a few voxels is usually not worthwhile, we provided a solution in which one can choose which dimensions will be compressed or filtered. With the proposal of the non-standard decomposition on Daubechies' wavelets D2 to D20 by Steven Gollmer in 1992, 1D data can be compressed and filtered. Each additional level works only on the smoothed data, so the transformation time halves per extra level. Zero-filling a well-defined area after the wavelet transform and then performing the inverse transform will do the filtering. To be capable of compressing and filtering up to 4D images with the faster non-standard wavelet decomposition method, we have investigated a new method for calculating the boundaries of the areas which must be zero-filled in the case of filtering. This is especially true for band- and notch filtering. Contrary to the standard decomposition method, the areas are no longer rectangles in 2D or cubes in 3D or a row of cubes in 4D: they are rectangles expanded with a half-sized rectangle in the other direction for 2D, cubes expanded with half cubes in one and quarter cubes in the
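
The "extra loop per extra dimension" idea of the standard decomposition can be sketched with the simplest wavelet, an (unnormalized) Haar transform, applied first along rows and then along columns. This is an illustrative sketch; the record itself works with Daubechies wavelets D2 to D20:

```python
def haar_step(v):
    """One level of the Haar wavelet: pairwise averages then differences."""
    avg = [(v[2 * i] + v[2 * i + 1]) / 2 for i in range(len(v) // 2)]
    det = [(v[2 * i] - v[2 * i + 1]) / 2 for i in range(len(v) // 2)]
    return avg + det

def inverse_haar_step(v):
    """Invert haar_step: recover each pair from its average and difference."""
    half = len(v) // 2
    out = []
    for a, d in zip(v[:half], v[half:]):
        out += [a + d, a - d]
    return out

def haar_2d(img):
    """Standard decomposition on a 2-D image: run the 1-D transform on
    every row, then on every column - one extra loop per extra dimension."""
    rows = [haar_step(r) for r in img]
    cols = [haar_step(list(c)) for c in zip(*rows)]
    return [list(r) for r in zip(*cols)]

img = [[1.0, 2.0, 3.0, 4.0],
       [5.0, 6.0, 7.0, 8.0],
       [9.0, 8.0, 7.0, 6.0],
       [5.0, 4.0, 3.0, 2.0]]
coeffs = haar_2d(img)
```

Zeroing selected detail coefficients in `coeffs` before inverting both passes is the band/notch filtering operation whose zero-fill regions the record analyses.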

  6. Comparison of various filtering methods for digital X-ray image processing

    International Nuclear Information System (INIS)

    Pfluger, T.; Reinfelder, H.E.; Dorschky, K.; Oppelt, A.; Siemens A.G., Erlangen

    1987-01-01

    Three filtering methods used for edge enhancement of digitally processed X-ray images are explained and compared. The filters are compared on two examples, a radiograph of the chest and one of the knee joint. Unsharp masking is found to yield the best compromise between edge enhancement and noise amplification, whereas the results obtained with the high-pass filter or the Wallis filter are less suitable for diagnostic evaluation. The filtered images display narrow lines, structural borders and edges, and finely spotted areas better than the original radiograph, so that diagnostic evaluation is easier after image filtering. (orig.) [de
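
Unsharp masking, the method favored above, amounts to adding back the difference between the image and a blurred copy. A 1-D sketch follows; the box blur and unit amount are illustrative choices:

```python
def box_blur(v, radius=1):
    """Simple moving-average blur with edge clamping."""
    n = len(v)
    return [sum(v[max(0, min(n - 1, i + d))] for d in range(-radius, radius + 1))
            / (2 * radius + 1) for i in range(n)]

def unsharp_mask(v, amount=1.0):
    """Unsharp masking: out = v + amount * (v - blur(v)).

    Flat regions are unchanged; edges gain over- and undershoot,
    which is perceived as increased sharpness."""
    blurred = box_blur(v)
    return [p + amount * (p - b) for p, b in zip(v, blurred)]

edge = [0.0, 0.0, 0.0, 10.0, 10.0, 10.0]
sharpened = unsharp_mask(edge)
```

The overshoot on the bright side and undershoot on the dark side of the step are exactly the halo that boosts edge contrast, and also the mechanism that amplifies noise if `amount` is set too high.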

  7. [Restoration filtering based on projection power spectrum for single-photon emission computed tomography].

    Science.gov (United States)

    Kubo, N

    1995-04-01

    To improve the quality of single-photon emission computed tomographic (SPECT) images, a restoration filter has been developed. This filter was designed according to practical "least squares filter" theory. It is necessary to know the object power spectrum and the noise power spectrum. The power spectrum is estimated from the power spectrum of a projection, when the high-frequency power spectrum of a projection is adequately approximated as a polynomial exponential expression. A study of the restoration with the filter based on a projection power spectrum was conducted, and compared with that of the "Butterworth" filtering method (cut-off frequency of 0.15 cycles/pixel), and "Wiener" filtering (signal-to-noise power spectrum ratio was a constant). Normalized mean-squared errors (NMSE) of the phantom, two line sources located in a 99mTc filled cylinder, were used. NMSE of the "Butterworth" filter, "Wiener" filter, and filtering based on a power spectrum were 0.77, 0.83, and 0.76 respectively. Clinically, brain SPECT images utilizing this new restoration filter improved the contrast. Thus, this filter may be useful in diagnosis of SPECT images.

  8. Restoration filtering based on projection power spectrum for single-photon emission computed tomography

    International Nuclear Information System (INIS)

    Kubo, Naoki

    1995-01-01

    To improve the quality of single-photon emission computed tomographic (SPECT) images, a restoration filter has been developed. This filter was designed according to practical 'least squares filter' theory. It is necessary to know the object power spectrum and the noise power spectrum. The power spectrum is estimated from the power spectrum of a projection, when the high-frequency power spectrum of a projection is adequately approximated as a polynomial exponential expression. A study of the restoration with the filter based on a projection power spectrum was conducted, and compared with that of the 'Butterworth' filtering method (cut-off frequency of 0.15 cycles/pixel), and 'Wiener' filtering (signal-to-noise power spectrum ratio was a constant). Normalized mean-squared errors (NMSE) of the phantom, two line sources located in a 99m Tc filled cylinder, were used. NMSE of the 'Butterworth' filter, 'Wiener' filter, and filtering based on a power spectrum were 0.77, 0.83, and 0.76 respectively. Clinically, brain SPECT images utilizing this new restoration filter improved the contrast. Thus, this filter may be useful in diagnosis of SPECT images. (author)
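
The per-frequency behaviour of such least-squares ('Wiener') restoration is governed by the gain S/(S+N). The sketch below uses an assumed exponential object spectrum and a flat noise floor purely for illustration; the record estimates these spectra from the projections themselves:

```python
import math

def wiener_gain(signal_power, noise_power):
    """Per-frequency Wiener gain S/(S+N): close to 1 where the estimated
    object power dominates, close to 0 where noise dominates."""
    return signal_power / (signal_power + noise_power)

# Assumed object power falling off with frequency, flat noise floor.
freqs = [0.05 * k for k in range(11)]            # cycles/pixel
S = [math.exp(-8.0 * f) for f in freqs]
N = [0.01] * len(freqs)
gains = [wiener_gain(s, n) for s, n in zip(S, N)]
```

Unlike a fixed-cutoff Butterworth filter, this gain adapts continuously: it rolls off exactly where the measured spectrum says noise takes over, which is the motivation for estimating the power spectrum from each projection set.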

  9. Iodine filter imaging system for subtraction angiography using synchrotron radiation

    Science.gov (United States)

    Umetani, K.; Ueda, K.; Takeda, T.; Itai, Y.; Akisada, M.; Nakajima, T.

    1993-11-01

    A new type of real-time imaging system was developed for transvenous coronary angiography. A combination of an iodine filter and a single energy broad-bandwidth X-ray produces two-energy images for the iodine K-edge subtraction technique. X-ray images are sequentially converted to visible images by an X-ray image intensifier. By synchronizing the timing of the movement of the iodine filter into and out of the X-ray beam, two output images of the image intensifier are focused side by side on the photoconductive layer of a camera tube by an oscillating mirror. Both images are read out by electron beam scanning of a 1050-scanning-line video camera within a camera frame time of 66.7 ms. One hundred ninety two pairs of iodine-filtered and non-iodine-filtered images are stored in the frame memory at a rate of 15 pairs/s. In vivo subtracted images of coronary arteries in dogs were obtained in the form of motion pictures.
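
The iodine K-edge subtraction principle can be sketched numerically: iodine attenuation jumps across the K-edge while tissue attenuation barely changes, so subtracting the two log-transmission images cancels the tissue and leaves an iodine map. The attenuation coefficients below are invented toy values:

```python
import math

def log_subtraction(i_below, i_above):
    """K-edge subtraction: difference of the two log images cancels
    materials whose attenuation is the same at both energies."""
    return [[math.log(a) - math.log(b) for a, b in zip(ra, rb)]
            for ra, rb in zip(i_below, i_above)]

# Toy transmission images: tissue thickness varies, iodine in one pixel only.
mu_tissue = 0.2                        # assumed equal on both sides of the edge
mu_iodine_below, mu_iodine_above = 1.0, 4.0   # assumed jump at the K-edge
tissue = [[1.0, 2.0], [3.0, 1.0]]
iodine = [[0.0, 0.0], [0.0, 0.5]]
below = [[math.exp(-(mu_tissue * t + mu_iodine_below * c))
          for t, c in zip(rt, rc)] for rt, rc in zip(tissue, iodine)]
above = [[math.exp(-(mu_tissue * t + mu_iodine_above * c))
          for t, c in zip(rt, rc)] for rt, rc in zip(tissue, iodine)]
iodine_map = log_subtraction(below, above)
```

The varying tissue thickness disappears from `iodine_map`, while the iodine pixel remains, proportional to the attenuation jump times the iodine amount.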

  10. Chip-scale fluorescence microscope based on a silo-filter complementary metal-oxide semiconductor image sensor.

    Science.gov (United States)

    Ah Lee, Seung; Ou, Xiaoze; Lee, J Eugene; Yang, Changhuei

    2013-06-01

    We demonstrate a silo-filter (SF) complementary metal-oxide semiconductor (CMOS) image sensor for a chip-scale fluorescence microscope. The extruded pixel design with metal walls between neighboring pixels guides fluorescence emission through the thick absorptive filter to the photodiode of a pixel. Our prototype device achieves 13 μm resolution over a wide field of view (4.8 mm × 4.4 mm). We demonstrate bright-field and fluorescence longitudinal imaging of living cells in a compact, low-cost configuration.

  11. Methods of filtering the graph images of the functions

    Directory of Open Access Journals (Sweden)

    Олександр Григорович Бурса

    2017-06-01

    Full Text Available The theoretical aspects of cleaning raster images of scanned graphs of functions from digital, chromatic and luminance distortions by using computer graphics techniques have been considered. The basic types of distortions characteristic of graph images of functions have been stated. To suppress the distortions, several methods providing high quality of the resulting images while preserving their topological features were suggested. The paper describes the techniques developed and improved by the authors: the method of cleaning the image of distortions by means of iterative contrasting, based on a step-by-step increase of the image contrast in the graph by 1%; the method of restoring distorted small entities, based on thinning of the known contrast-increase filter matrix (the allowable dilution radius of the convolution-matrix kernel that preserves the graph lines has been established); and the technique of integrating the contrast-based noise reduction method and the small-entity restoration method with the known σ-filter. Each method in the complex has been theoretically substantiated. The developed methods treat graph images both as a whole (global processing) and as fragments (local processing). Metrics assessing the quality of the resulting image under global and local processing have been chosen, and the choice as well as the formulas have been substantiated. The proposed complex of methods for cleaning graph images of functions from grayscale distortions is adaptive to the form of the image carrier, the distortion level in the image and its distribution. The presented results of testing the developed complex of methods on a representative sample of images confirm its effectiveness.
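
The iterative contrasting step, a repeated small contrast increase about mid-grey, can be sketched as follows. The grey levels and iteration count are illustrative, and the toy omits the authors' small-entity restoration stage:

```python
def contrast_step(img, gain=1.01):
    """One iteration: stretch pixel values about mid-grey by 1 percent,
    clamping to the valid [0, 255] range."""
    return [[min(255.0, max(0.0, (p - 127.5) * gain + 127.5)) for p in row]
            for row in img]

def iterative_contrast(img, iterations=250):
    """Repeated stretching drives slightly dark pixels to black and
    slightly bright pixels to white, separating graph lines from paper."""
    for _ in range(iterations):
        img = contrast_step(img)
    return img

# Faint graph line (slightly dark pixel) on a slightly bright background.
scan = [[140.0, 140.0, 110.0, 140.0] for _ in range(3)]
cleaned = iterative_contrast(scan)
```

Because each step is only a 1 percent stretch, pixels cross the clamping limits gradually, which is gentler on small features than a single hard threshold.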

  12. Optical supervised filtering technique based on Hopfield neural network

    Science.gov (United States)

    Bal, Abdullah

    2004-11-01

    Hopfield neural networks are commonly preferred for optimization problems. In image segmentation, conventional Hopfield neural networks (HNN) are formulated as a cost-function-minimization problem to perform gray level thresholding on the image histogram or the pixels' gray levels arranged in a one-dimensional array [R. Sammouda, N. Niki, H. Nishitani, Pattern Rec. 30 (1997) 921-927; K.S. Cheng, J.S. Lin, C.W. Mao, IEEE Trans. Med. Imag. 15 (1996) 560-567; C. Chang, P. Chung, Image and Vision Comp. 19 (2001) 669-678]. In this paper, a new high speed supervised filtering technique is proposed for image feature extraction and enhancement problems by modifying the conventional HNN. The essential improvement in this technique is to use a 2D convolution operation instead of weight-matrix multiplication. Thereby, a new neural-network-based filtering technique has been obtained that requires just a 3 × 3 filter mask instead of a large weight-coefficient matrix. Optical implementation of the proposed filtering technique is executed easily using the joint transform correlator. The requirement of non-negative data for optical implementation is met by a bias technique that converts the bipolar data to non-negative data. Simulation results of the proposed optical supervised filtering technique are reported for various feature extraction problems such as edge detection, corner detection, horizontal and vertical line extraction, and fingerprint enhancement.
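
The key modification, replacing weight-matrix multiplication with a 2D convolution over a 3 × 3 mask, can be sketched directly. The Laplacian mask below is a standard edge-detection example, not necessarily the mask the authors trained:

```python
def conv3x3(img, kernel):
    """2-D convolution with a 3 x 3 mask - the operation that replaces the
    large weight-matrix multiplication in the modified network."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for ky in range(3):
                for kx in range(3):
                    ny, nx = y + ky - 1, x + kx - 1
                    if 0 <= ny < h and 0 <= nx < w:
                        acc += kernel[ky][kx] * img[ny][nx]
            out[y][x] = acc
    return out

# A Laplacian-style mask responds at edges and corners, not in flat areas.
laplacian = [[0.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 0.0]]
patch = [[0.0, 0.0, 0.0, 0.0],
         [0.0, 9.0, 9.0, 0.0],
         [0.0, 9.0, 9.0, 0.0],
         [0.0, 0.0, 0.0, 0.0]]
edges = conv3x3(patch, laplacian)
```

A 3 × 3 mask has 9 coefficients regardless of image size, which is the memory and speed advantage the abstract points to over a full weight matrix.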

  13. Swarm Intelligence for Optimizing Hybridized Smoothing Filter in Image Edge Enhancement

    Science.gov (United States)

    Rao, B. Tirumala; Dehuri, S.; Dileep, M.; Vindhya, A.

    In this modern era, image transmission and processing play a major role. It would be impossible to retrieve information from satellite and medical images without the help of image processing techniques. Edge enhancement is an image processing step that enhances the edge contrast of an image or video in an attempt to improve its acutance. Edges are representations of the discontinuities of image intensity functions. For processing these discontinuities in an image, a good edge enhancement technique is essential. The proposed work uses a new idea for edge enhancement based on hybridized smoothing filters, and introduces a promising technique for obtaining the best hybrid filter using swarm algorithms (Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO)) to search for an optimal sequence of filters from among a set of rather simple, representative image processing filters. This paper deals with the analysis of the swarm intelligence techniques through the combination of hybrid filters generated by these algorithms for image edge enhancement.

  14. Filters in 2D and 3D Cardiac SPECT Image Processing

    Directory of Open Access Journals (Sweden)

    Maria Lyra

    2014-01-01

    Full Text Available Nuclear cardiac imaging is a noninvasive, sensitive method providing information on cardiac structure and physiology. Single photon emission tomography (SPECT) evaluates myocardial perfusion, viability, and function and is widely used in clinical routine. The quality of the tomographic image is a key for accurate diagnosis. Image filtering, a mathematical processing step, compensates for loss of detail in an image while reducing image noise, and it can improve the image resolution and limit the degradation of the image. SPECT images are then reconstructed, either by the filtered back projection (FBP) analytical technique or iteratively, by algebraic methods. The aim of this study is to review filters in cardiac 2D, 3D, and 4D SPECT applications and how these affect the image quality, mirroring the diagnostic accuracy of SPECT images. Several filters, including the Hanning, Butterworth, and Parzen filters, were evaluated in combination with the two reconstruction methods as well as with a specified MatLab program. Results showed that for both 3D and 4D cardiac SPECT the Butterworth filter, for different critical frequencies and orders, produced the best results. Between the two reconstruction methods, the iterative one might be more appropriate for cardiac SPECT, since it improves lesion detectability due to the significant improvement of image contrast.

  15. Document image binarization using "multi-scale" predefined filters

    Science.gov (United States)

    Saabni, Raid M.

    2018-04-01

    Reading text or searching for key words within a historical document is a very challenging task. One of the first steps of the complete task is binarization, where we separate foreground such as text, figures and drawings from the background. Successful results at this important step often determine whether the subsequent steps succeed or fail, so it is vital to the complete task of reading and analyzing the content of a document image. Generally, historical document images are of poor quality due to their storage conditions and degradation over time, which mostly cause varying contrasts, stains, dirt and ink seeping through from the reverse side. In this paper, we use banks of anisotropic predefined filters at different scales and orientations to develop a binarization method for degraded documents and manuscripts. Using the fact that handwritten strokes may follow different scales and orientations, we use predefined sets of filter banks having various scales, weights, and orientations to seek a compact set of filters and weights in order to generate different layers of foreground and background. The results of convolving these filters locally on the gray-level image are weighted and accumulated to enhance the original image. Based on the different layers, seeds of components in the gray-level image and a learning process, we present an improved binarization algorithm to separate the background from layers of foreground. Different layers of foreground, which may be caused by seeping ink, degradation or other factors, are also separated from the real foreground in a second phase. Promising experimental results were obtained on the DIBCO2011, DIBCO2013 and H-DIBCO2016 data sets and a collection of images taken from real historical documents.
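
As a point of comparison for the filter-bank approach, the standard global baseline for document binarization is Otsu's threshold; the sketch below is this well-known baseline, not the paper's multi-scale method:

```python
def otsu_threshold(pixels):
    """Otsu's method: choose the grey level that maximises the between-class
    variance of the resulting foreground/background split."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w_b = sum_b = 0
    for t in range(256):
        w_b += hist[t]              # background weight (pixels <= t)
        if w_b == 0:
            continue
        w_f = total - w_b           # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b
        m_f = (total_sum - sum_b) / w_f
        var = w_b * w_f * (m_b - m_f) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Dark ink (values near 40) on bright, stained paper (values near 200).
pixels = [40, 42, 38, 41, 39] * 20 + [200, 205, 195, 210, 190] * 80
t = otsu_threshold(pixels)
binary = [1 if p <= t else 0 for p in pixels]   # 1 = foreground ink
```

A single global threshold like this fails on the uneven contrast of degraded manuscripts, which is precisely what motivates the layered, multi-scale filter-bank method above.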

  16. A Low-Noise CMOS THz Imager Based on Source Modulation and an In-Pixel High-Q Passive Switched-Capacitor N-Path Filter

    Science.gov (United States)

    Boukhayma, Assim; Dupret, Antoine; Rostaing, Jean-Pierre; Enz, Christian

    2016-01-01

    This paper presents the first low noise complementary metal oxide semiconductor (CMOS) terahertz (THz) imager based on source modulation and in-pixel high-Q filtering. The 31×31 focal plane array has been fully integrated in a 0.13μm standard CMOS process. The sensitivity has been improved significantly by modulating the active THz source that lights the scene and performing on-chip high-Q filtering. Each pixel encompasses a broadband bow tie antenna coupled to an N-type metal-oxide-semiconductor (NMOS) detector that shifts the THz radiation, a low noise adjustable gain amplifier and a high-Q filter centered at the modulation frequency. The filter is based on a passive switched-capacitor (SC) N-path filter combined with a continuous-time broad-band Gm-C filter. A simplified analysis that helps in designing and tuning the passive SC N-path filter is provided. The characterization of the readout chain shows that a Q factor of 100 has been achieved for the filter with a good matching between the analytical calculation and the measurement results. An input-referred noise of 0.2μV RMS has been measured. Characterization of the chip with different THz wavelengths confirms the broadband feature of the antenna and shows that this THz imager reaches a total noise equivalent power of 0.6 nW at 270 GHz and 0.8 nW at 600 GHz. PMID:26950131
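
The high-Q behaviour of the passive SC N-path filter can be reproduced with an idealized behavioural simulation: capacitors commutated at the modulation rate act as lossy integrators, so only inputs near the switching frequency accumulate charge. The sample rate, leakage factor, and test frequencies below are invented toy values, not the chip's parameters:

```python
import math

def npath_rms(f_in, f_lo=1.0, n_paths=4, samples_per_period=64,
              periods=600, alpha=0.005):
    """Idealized passive switched-capacitor N-path filter: n_paths
    capacitors connect to the input in rotation at the LO rate; each is a
    lossy integrator, so only near-LO inputs build up.  Returns the RMS of
    the settled output."""
    caps = [0.0] * n_paths
    out = []
    total = periods * samples_per_period
    for i in range(total):
        t = i / (samples_per_period * f_lo)          # time in LO periods
        x = math.sin(2 * math.pi * f_in * t)
        p = (i * n_paths // samples_per_period) % n_paths
        caps[p] += alpha * (x - caps[p])             # lossy integration
        out.append(caps[p])
    tail = out[total // 2:]                          # settled portion only
    return math.sqrt(sum(v * v for v in tail) / len(tail))

rms_on = npath_rms(1.0)      # input at the modulation (LO) frequency
rms_off = npath_rms(1.37)    # detuned input
```

The on-frequency input settles to a large commutated waveform while the detuned input averages toward zero, which is the narrow passband centered on the modulation frequency that the pixel exploits.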

  17. A Low-Noise CMOS THz Imager Based on Source Modulation and an In-Pixel High-Q Passive Switched-Capacitor N-Path Filter.

    Science.gov (United States)

    Boukhayma, Assim; Dupret, Antoine; Rostaing, Jean-Pierre; Enz, Christian

    2016-03-03

    This paper presents the first low-noise complementary metal oxide semiconductor (CMOS) terahertz (THz) imager based on source modulation and in-pixel high-Q filtering. The 31 × 31 focal plane array has been fully integrated in a 0.13 μm standard CMOS process. The sensitivity has been improved significantly by modulating the active THz source that illuminates the scene and performing on-chip high-Q filtering. Each pixel encompasses a broadband bow-tie antenna coupled to an N-type metal-oxide-semiconductor (NMOS) detector that shifts the THz radiation, a low-noise adjustable-gain amplifier and a high-Q filter centered at the modulation frequency. The filter is based on a passive switched-capacitor (SC) N-path filter combined with a continuous-time broadband Gm-C filter. A simplified analysis that helps in designing and tuning the passive SC N-path filter is provided. The characterization of the readout chain shows that a Q factor of 100 has been achieved for the filter, with good matching between the analytical calculation and the measurement results. An input-referred noise of 0.2 μV RMS has been measured. Characterization of the chip with different THz wavelengths confirms the broadband feature of the antenna and shows that this THz imager reaches a total noise equivalent power of 0.6 nW at 270 GHz and 0.8 nW at 600 GHz.

  18. Fan-beam and cone-beam image reconstruction via filtering the backprojection image of differentiated projection data

    International Nuclear Information System (INIS)

    Zhuang Tingliang; Leng Shuai; Nett, Brian E; Chen Guanghong

    2004-01-01

    In this paper, a new image reconstruction scheme is presented based on Tuy's cone-beam inversion scheme and its fan-beam counterpart. It is demonstrated that Tuy's inversion scheme may be used to derive a new framework for fan-beam and cone-beam image reconstruction. In this new framework, images are reconstructed via filtering the backprojection image of differentiated projection data. The new framework is mathematically exact and is applicable to a general source trajectory provided the Tuy data sufficiency condition is satisfied. By choosing a piecewise constant function for one of the components in the factorized weighting function, the filtering kernel is one dimensional, viz. the filtering process is along a straight line. Thus, the derived image reconstruction algorithm is mathematically exact and efficient. In the cone-beam case, the derived reconstruction algorithm is applicable to a large class of source trajectories where the pi-lines or the generalized pi-lines exist. In addition, the new reconstruction scheme survives the super-short scan mode in both the fan-beam and cone-beam cases provided the data are not transversely truncated. Numerical simulations were conducted to validate the new reconstruction scheme for the fan-beam case.

  19. Detection of pulmonary nodules on lung X-ray images. Studies on multi-resolutional filter and energy subtraction images

    International Nuclear Information System (INIS)

    Sawada, Akira; Sato, Yoshinobu; Kido, Shoji; Tamura, Shinichi

    1999-01-01

    The purpose of this work is to prove the effectiveness of an energy subtraction image for the detection of pulmonary nodules, and the effectiveness of a multi-resolutional filter applied to an energy subtraction image to detect pulmonary nodules. We also study factors influencing the accuracy of detection of pulmonary nodules from the viewpoints of types of images, types of digital filters and types of evaluation methods. As one type of image, we select the energy subtraction image, which removes bones such as ribs from the conventional X-ray image by utilizing the difference in X-ray absorption ratios between bones and soft tissue at different energies. Ribs and vessels are major causes of CAD errors in the detection of pulmonary nodules, and much research has tried to solve this problem. We therefore select conventional X-ray images and energy subtraction X-ray images as the types of images, and at the same time select the ∇²G (Laplacian of Gaussian) filter, the Min-DD (minimum directional difference) filter and our multi-resolutional filter as the types of digital filters. We also select two evaluation methods and prove the effectiveness of the energy subtraction image, the effectiveness of the Min-DD filter on a conventional X-ray image and the effectiveness of the multi-resolutional filter on an energy subtraction image. (author)

  20. Efficient Hardware Implementation For Fingerprint Image Enhancement Using Anisotropic Gaussian Filter.

    Science.gov (United States)

    Khan, Tariq Mahmood; Bailey, Donald G; Khan, Mohammad A U; Kong, Yinan

    2017-05-01

    A real-time image filtering technique is proposed which could result in a faster implementation for fingerprint image enhancement. One major hurdle associated with fingerprint filtering techniques is the expensive nature of their hardware implementations. To circumvent this, a modified anisotropic Gaussian filter is efficiently adopted in hardware by decomposing the filter into two orthogonal Gaussians and an oriented line Gaussian. An architecture is developed for dynamically controlling the orientation of the line Gaussian filter. To further improve the performance of the filter, the input image is homogenized by a local image normalization. With the proposed structure, both the parallel compute-intensive and the real-time demands were met on a mid-range reconfigurable FPGA. We efficiently speed up the image-processing time and improve the resource utilization of the FPGA. Test results show an improved speed for the hardware architecture while maintaining reasonable enhancement benchmarks.
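    The cost advantage of such a decomposition can be illustrated with a separable sketch: a 2D Gaussian is the outer product of two 1D kernels, so a row pass plus a column pass replaces the full 2D convolution. The oriented line-Gaussian stage of the paper's filter is omitted here and the kernel sizes are arbitrary.

```python
import numpy as np

def gauss1d(sigma, radius):
    """Normalized 1D Gaussian kernel of length 2*radius + 1."""
    x = np.arange(-radius, radius + 1)
    g = np.exp(-x**2 / (2 * sigma**2))
    return g / g.sum()

def filter_separable(img, g):
    """Row pass then column pass: O(2k) multiplies per pixel instead of
    O(k^2) for the equivalent 2D kernel np.outer(g, g)."""
    rows = np.apply_along_axis(np.convolve, 1, img, g, mode='same')
    return np.apply_along_axis(np.convolve, 0, rows, g, mode='same')
```

Filtering an impulse reproduces the full 2D kernel, confirming the two passes are equivalent to the single 2D convolution.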

  1. A METHOD FOR RECORDING AND VIEWING STEREOSCOPIC IMAGES IN COLOUR USING MULTICHROME FILTERS

    DEFF Research Database (Denmark)

    2000-01-01

    The aim of the invention is to create techniques for the encoding, production and viewing of stereograms, supplemented by methods for selecting certain optical filters needed in these novel techniques, thus providing a human observer with stereograms each of which consist of a single image... in a conventional stereogram recorded of the scene. The invention makes use of a colour-based encoding technique and viewing filters selected so that the human observer receives, in one eye, an image of nearly full colour information, in the other eye, an essentially monochrome image supplying the parallactic...

  2. Oriented diffusion filtering for enhancing low-quality fingerprint images

    KAUST Repository

    Gottschlich, C.; Schönlieb, C.-B.

    2012-01-01

    To enhance low-quality fingerprint images, we present a novel method that first estimates the local orientation of the fingerprint ridge and valley flow and next performs oriented diffusion filtering, followed by a locally adaptive contrast enhancement step. By applying the authors' new approach to low-quality images of the FVC2004 fingerprint databases, the authors are able to show its competitiveness with other state-of-the-art enhancement methods for fingerprints like curved Gabor filtering. A major advantage of oriented diffusion filtering over those is its computational efficiency. Combining oriented diffusion filtering with curved Gabor filters led to additional improvements and, to the best of the authors' knowledge, the lowest equal error rates achieved so far using MINDTCT and BOZORTH3 on the FVC2004 databases. The recognition performance and the computational efficiency of the method suggest including oriented diffusion filtering as a standard image enhancement add-on module for real-time fingerprint recognition systems. In order to facilitate the reproduction of these results, an implementation of the oriented diffusion filtering for Matlab and GNU Octave is made available for download. © 2012 The Institution of Engineering and Technology.

  3. Oriented diffusion filtering for enhancing low-quality fingerprint images

    KAUST Repository

    Gottschlich, C.

    2012-01-01

    To enhance low-quality fingerprint images, we present a novel method that first estimates the local orientation of the fingerprint ridge and valley flow and next performs oriented diffusion filtering, followed by a locally adaptive contrast enhancement step. By applying the authors' new approach to low-quality images of the FVC2004 fingerprint databases, the authors are able to show its competitiveness with other state-of-the-art enhancement methods for fingerprints like curved Gabor filtering. A major advantage of oriented diffusion filtering over those is its computational efficiency. Combining oriented diffusion filtering with curved Gabor filters led to additional improvements and, to the best of the authors' knowledge, the lowest equal error rates achieved so far using MINDTCT and BOZORTH3 on the FVC2004 databases. The recognition performance and the computational efficiency of the method suggest including oriented diffusion filtering as a standard image enhancement add-on module for real-time fingerprint recognition systems. In order to facilitate the reproduction of these results, an implementation of the oriented diffusion filtering for Matlab and GNU Octave is made available for download. © 2012 The Institution of Engineering and Technology.

  4. Eigenimage filtering of nuclear medicine image sequences

    International Nuclear Information System (INIS)

    Windham, J.P.; Froelich, J.W.; Abd-Allah, M.

    1985-01-01

    In many nuclear medicine imaging sequences, the localization of radioactivity in organs other than the target organ interferes with imaging of the desired anatomical structure or physiological process. A filtering technique has been developed which suppresses the interfering process while enhancing the desired process. This technique requires the identification of temporal sequential signatures for both the interfering and desired processes. These signatures are placed in the form of signature vectors. Signature matrices, M_D and M_U, are formed by taking the outer product expansion of the temporal signature vectors for the desired and interfering processes, respectively. By using the transformation from the simultaneous diagonalization of these two signature matrices, a weighting vector is obtained. The technique is shown to maximize the projection of the desired process while minimizing the interfering process, based upon an extension of Rayleigh's principle. The technique is demonstrated for first-pass renal and cardiac flow studies. This filter offers a potential for simplifying and extending the accuracy of diagnostic nuclear medicine procedures.
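    A minimal numerical sketch of the simultaneous-diagonalization step (a generalized Rayleigh-quotient problem): the signature vectors and the small regularization term below are invented for illustration, not taken from the paper.

```python
import numpy as np

def eigenimage_weights(sig_d, sig_u, eps=1e-6):
    """Weight vector maximizing the desired-to-undesired energy ratio,
    i.e. the Rayleigh quotient of the two outer-product signature
    matrices M_D and M_U (regularized so M_U is invertible)."""
    M_d = np.outer(sig_d, sig_d)
    M_u = np.outer(sig_u, sig_u) + eps * np.eye(len(sig_u))
    # Solve the generalized problem M_d w = lambda * M_u w by whitening
    # with M_u^{-1/2}, built from M_u's eigendecomposition.
    vals, vecs = np.linalg.eigh(M_u)
    W = vecs / np.sqrt(vals)            # columns form M_u^{-1/2}
    lam, V = np.linalg.eigh(W.T @ M_d @ W)
    w = W @ V[:, -1]                    # top eigenvector, back-transformed
    return w / np.linalg.norm(w)
```

With orthogonal desired/interfering signatures the optimal weights align with the desired signature (up to sign) while staying orthogonal to the interference.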

  5. A Kalman Filter-Based Method to Generate Continuous Time Series of Medium-Resolution NDVI Images

    Directory of Open Access Journals (Sweden)

    Fernando Sedano

    2014-12-01

    Full Text Available A data assimilation method to produce complete temporal sequences of synthetic medium-resolution images is presented. The method implements a Kalman filter recursive algorithm that integrates medium and moderate resolution imagery. To demonstrate the approach, time series of 30-m spatial resolution NDVI images at 16-day time steps were generated using Landsat NDVI images and MODIS NDVI products at four sites with different ecosystems and land cover-land use dynamics. The results show that the time series of synthetic NDVI images captured seasonal land surface dynamics and maintained the spatial structure of the landscape at the higher spatial resolution. The time series of synthetic medium-resolution NDVI images were validated within a Monte Carlo simulation framework. Normalized residuals decreased as the number of available observations increased, ranging from 0.2 to below 0.1. Residuals were also significantly lower for time series of synthetic NDVI images generated at combined recursion (smoothing) than individually at forward and backward recursions (filtering). Similarly, the uncertainties of the synthetic images decreased when the number of available observations increased and combined recursions were implemented.
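    A per-pixel sketch of such a Kalman recursion, assuming a simple random-walk state model rather than the paper's MODIS-driven prediction; time steps lacking a Landsat observation run the predict step only, so the uncertainty grows until the next observation arrives.

```python
import numpy as np

def kalman_series(obs, q=0.01, r=0.005, x0=0.5, p0=1.0):
    """Forward (filtering) Kalman recursion for one pixel's NDVI series.
    obs: array with np.nan at time steps lacking an observation.
    q, r, x0, p0: illustrative process/measurement variances and prior."""
    x, p = x0, p0
    out = np.empty(len(obs))
    for t, z in enumerate(obs):
        p = p + q                      # predict (random-walk state model)
        if not np.isnan(z):            # update only when an image exists
            k = p / (p + r)            # Kalman gain
            x = x + k * (z - x)
            p = (1 - k) * p
        out[t] = x
    return out
```

Running the same recursion backward and merging the two passes gives the smoothing estimate that the paper found to have lower residuals than filtering alone.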

  6. Computer processing of the scintigraphic image using digital filtering techniques

    International Nuclear Information System (INIS)

    Matsuo, Michimasa

    1976-01-01

    The theory of digital filtering was studied as a method for the computer processing of scintigraphic images. The characteristics and design techniques of finite impulse response (FIR) digital filters with linear phase were examined using the z-transform. The conventional data processing method, smoothing, can be recognized as one kind of linear-phase FIR low-pass digital filtering. Ten representative FIR low-pass digital filters with various cut-off frequencies were scrutinized in the frequency domain in one and two dimensions. These filters were applied to phantom studies with cold targets, using a Scinticamera-Minicomputer on-line System. These studies revealed that the resultant images had a direct connection with the magnitude response of the filter, that is, they could be estimated fairly well from the frequency response of the digital filter used. The filter, which was estimated from phantom studies as optimal for liver scintigrams using 198Au-colloid, was successfully applied in clinical use for detecting true cold lesions and, at the same time, for eliminating spurious images. (J.P.N.)
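    That smoothing is one kind of linear-phase FIR low-pass filtering can be checked directly from a kernel's frequency response; the binomial kernel below is a generic example, not one of the ten filters examined in the paper.

```python
import numpy as np

# A symmetric kernel has linear phase; [1, 2, 1]/4 is the 1D binomial
# smoother (the separable factor of the common 3x3 smoothing mask).
h = np.array([1.0, 2.0, 1.0]) / 4.0
H = np.abs(np.fft.rfft(h, 256))   # magnitude response over [0, pi]

# DC gain is 1 (means are preserved) and the response falls
# monotonically to 0 at the Nyquist frequency: a low-pass filter.
print(float(round(H[0], 6)), float(round(H[-1], 6)))   # → 1.0 0.0
```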

  7. ANALYSIS OF SST IMAGES BY WEIGHTED ENSEMBLE TRANSFORM KALMAN FILTER

    OpenAIRE

    Sai, Gorthi; Beyou, Sébastien; Memin, Etienne

    2011-01-01

    International audience; This paper presents a novel, efficient scheme for the analysis of Sea Surface Temperature (SST) ocean images. We consider the estimation of the velocity fields and vorticity values from a sequence of oceanic images. The contribution of this paper lies in proposing a novel, robust and simple approach based on the Weighted Ensemble Transform Kalman filter (WETKF) data assimilation technique for the analysis of real SST images, that may contain coast regions or large areas of ...

  8. Human visual modeling and image deconvolution by linear filtering

    International Nuclear Information System (INIS)

    Larminat, P. de; Barba, D.; Gerber, R.; Ronsin, J.

    1978-01-01

    The problem is the numerical restoration of images degraded by passage through a known, spatially invariant linear system and by the addition of stationary noise. We propose an improvement of the Wiener filter to allow the restoration of such images. This improvement reduces the important drawbacks of the classical Wiener filter: the voluminous data processing, and the lack of consideration of the visual characteristics that condition the perception of the restored image by the observer. In the first paragraph, we describe the structure of the visual detection system and a method of modelling this system. In the second paragraph we explain a restoration method by Wiener filtering that takes the visual properties into account and that can be adapted to the local properties of the image. Then the results obtained on TV images or scintigrams (images obtained by a gamma camera) are commented upon. [fr]
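    The classical frequency-domain Wiener deconvolution that such methods improve upon can be sketched as follows, assuming a constant noise-to-signal ratio and no visual-model weighting (both are illustrative simplifications):

```python
import numpy as np

def wiener_deconv(blurred, psf, nsr=0.01):
    """Frequency-domain Wiener deconvolution: G = H* / (|H|^2 + NSR).
    nsr is an assumed constant noise-to-signal power ratio; convolutions
    are circular, as implied by the FFT."""
    H = np.fft.fft2(psf, blurred.shape)
    G = np.conj(H) / (np.abs(H)**2 + nsr)
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) * G))
```

With a well-conditioned PSF and a tiny NSR this inverts a noiseless blur almost exactly; raising the NSR trades sharpness for noise suppression.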

  9. Chromatic aberrations correction for imaging spectrometer based on acousto-optic tunable filter with two transducers.

    Science.gov (United States)

    Zhao, Huijie; Wang, Ziye; Jia, Guorui; Zhang, Ying; Xu, Zefu

    2017-10-02

    The acousto-optic tunable filter (AOTF) with wide wavelength range and high spectral resolution has a long crystal and two transducers. A longer crystal length leads to a bigger chromatic focal shift, and the double-transducer arrangement induces an angular mutation in the diffracted beam, which increase the difficulty of longitudinal and lateral chromatic aberration correction, respectively. In this study, the two chromatic aberrations are analyzed quantitatively based on an AOTF optical model, and a novel catadioptric dual-path configuration is proposed to correct both chromatic aberrations. The test results exhibit the effectiveness of the optical configuration for this type of AOTF-based imaging spectrometer.

  10. Neutron Imaging of Diesel Particulate Filters

    International Nuclear Information System (INIS)

    Strzelec, Andrea; Bilheux, Hassina Z.; Finney, Charles E.A.; Daw, C. Stuart; Foster, Dave; Rutland, Christopher J.; Schillinger, Burkhard; Schulz, Michael

    2009-01-01

    This article presents nondestructive neutron computed tomography (nCT) measurements of Diesel Particulate Filters (DPFs) as a method to measure ash and soot loading in the filters. Uncatalyzed and unwashcoated 200 cpsi cordierite DPFs exposed to 100% biodiesel (B100) exhaust and conventional ultra-low-sulfur 2007 certification diesel (ULSD) exhaust at one speed-load point (1500 rpm, 2.6 bar BMEP) are compared to a brand-new (never exposed) filter. Precise structural information about the substrate, as well as an attempt to quantify soot and ash loading in the channels of the DPF, illustrates the potential strength of the neutron imaging technique.

  11. A Tentative Application Of Morphological Filters To Time-Varying Images

    Science.gov (United States)

    Billard, D.; Poquillon, B.

    1989-03-01

    In this paper, morphological filters, which are commonly used to process either 2D or multidimensional static images, are generalized to the analysis of time-varying image sequences. The introduction of the time dimension then induces interesting properties when designing such spatio-temporal morphological filters. In particular, the specification of spatio-temporal structuring elements (equivalent to time-varying spatial structuring elements) can be adjusted according to the temporal variations of the image sequences to be processed: this allows specific morphological transforms to be derived for noise filtering or moving-object discrimination on dynamic images viewed by a non-stationary sensor. First, a brief introduction to the basic principles underlying morphological filters is given. Then, a straightforward generalization of these principles to time-varying images is proposed. This leads us to define spatio-temporal opening and closing and to introduce some of their possible applications to processing dynamic images. At last, preliminary results obtained using a natural forward-looking infrared (FLIR) image sequence are presented.

  12. Active filtering applied to radiographic images unfolded by the Richardson-Lucy algorithm

    International Nuclear Information System (INIS)

    Almeida, Gevaldo L. de; Silvani, Maria Ines; Lopes, Ricardo T.

    2011-01-01

    Degradation of images caused by systematic uncertainties can be reduced when one knows the features of the spoiling agent. Typical uncertainties of this kind arise in radiographic images due to the non-zero resolution of the detector used to acquire them, and from the non-punctual character of the source employed in the acquisition, or from the beam divergence when extended sources are used. Both features blur the image, which, instead of a single point, exhibits a spot with a vanishing edge, reproducing hence the point spread function (PSF) of the system. Once this spoiling function is known, an inverse-problem approach, involving inversion of matrices, can then be used to retrieve the original image. As these matrices are generally ill-conditioned, due to statistical fluctuation and truncation errors, iterative procedures should be applied, such as the Richardson-Lucy algorithm. This algorithm has been applied in this work to unfold radiographic images acquired by transmission of thermal neutrons and gamma rays. After this procedure, the resulting images undergo an active filtering which fairly improves their final quality at a negligible cost in terms of processing time. The filter ruling the process is based on the matrix of the correction factors for the last iteration of the deconvolution procedure. Synthetic images degraded with a known PSF and subjected to the same treatment have been used as a benchmark to evaluate the soundness of the developed active filtering procedure. The deconvolution and filtering algorithms have been incorporated into a Fortran program, written to deal with real images, generate the synthetic ones and display both. (author)
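    The Richardson-Lucy iteration itself can be sketched with circular (FFT) convolutions; the correction-factor image that the described active filter is built from appears as an intermediate here. PSF handling and the flat initialization are illustrative choices, not the paper's Fortran implementation.

```python
import numpy as np

def richardson_lucy(observed, psf, iters=30):
    """Richardson-Lucy deconvolution sketch with circular convolutions."""
    Hf = np.fft.fft2(psf, observed.shape)
    conv = lambda x, H: np.real(np.fft.ifft2(np.fft.fft2(x) * H))
    est = np.full_like(observed, observed.mean())   # flat positive start
    for _ in range(iters):
        relative = observed / np.maximum(conv(est, Hf), 1e-12)
        correction = conv(relative, np.conj(Hf))    # correlate with PSF
        # 'correction' is the correction-factor image; at the last
        # iteration it is the basis of the described active filter
        est *= correction
    return est
```

For noiseless data the estimate sharpens toward the true image while staying non-negative, which is the property that makes RL attractive for count-limited radiographs.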

  13. Adaptive multiresolution Hermite-Binomial filters for image edge and texture analysis

    NARCIS (Netherlands)

    Gu, Y.H.; Katsaggelos, A.K.

    1994-01-01

    A new multiresolution image analysis approach using adaptive Hermite-Binomial filters is presented in this paper. According to the local image structural and textural properties, the analysis filter kernels are made adaptive both in their scales and orders. Applications of such an adaptive filtering

  14. Static Hyperspectral Fluorescence Imaging of Viscous Materials Based on a Linear Variable Filter Spectrometer

    Directory of Open Access Journals (Sweden)

    Alexander W. Koch

    2013-09-01

    Full Text Available This paper presents a low-cost hyperspectral measurement setup in a new application based on fluorescence detection in the visible (Vis) wavelength range. The aim of the setup is to take hyperspectral fluorescence images of viscous materials. Based on these images, fluorescent and non-fluorescent impurities in the viscous materials can be detected. For the illumination of the measurement object, a narrow-band high-power light-emitting diode (LED) with a center wavelength of 370 nm was used. The low-cost acquisition unit for the imaging consists of a linear variable filter (LVF) and a complementary metal oxide semiconductor (CMOS) 2D sensor array. The translucent wavelength range of the LVF is from 400 nm to 700 nm. To confirm the concept, static measurements of fluorescent viscous materials with a non-fluorescent impurity have been performed and analyzed. With the presented setup, measurement surfaces in the micrometer range can be provided. The measurable minimum particle size of the impurities is in the nanometer range. The recording rate for the measurements depends on the exposure time of the CMOS 2D sensor array and has been found to be in the microsecond range.

  15. Energy-filtered Photoelectron Emission Microscopy (EF-PEEM) for imaging nanoelectronic materials

    International Nuclear Information System (INIS)

    Renault, Olivier; Chabli, Amal

    2007-01-01

    Photoelectron-Emission Microscopy (PEEM) is the most promising approach to photoemission-based (XPS, UPS) imaging techniques with high lateral resolution, typically below 100 nm. It has now reached maturity with a new generation of instruments with energy-filtering capabilities. Therefore, UPS and XPS imaging with energy-filtered PEEM (EF-PEEM) can be applied to technologically relevant samples. UPS images with contrast in the local work function are obtained with laboratory UV sources in an ultra-high-vacuum environment, with lateral resolutions better than 50 nm and sensitivities of 20 meV. XPS images with elemental and bonding-state contrast can show lateral resolution better than 200 nm with synchrotron excitation. In this paper, we present the principles and capabilities of EF-PEEM and nanospectroscopy. Then, we focus on an example of application to non-destructive work-function imaging of polycrystalline copper for advanced interconnects, where it is shown that EF-PEEM is an alternative to Kelvin probes.

  16. An Image Filter Based on Shearlet Transformation and Particle Swarm Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Kai Hu

    2015-01-01

    Full Text Available Digital images are always polluted by noise, which makes data postprocessing difficult. To remove noise and preserve image detail as much as possible, this paper proposes an image filter algorithm that combines the merits of the Shearlet transform and the particle swarm optimization (PSO) algorithm. Firstly, we use the classical Shearlet transform to decompose the noised image into many subwavelets over multiple scales and orientations. Secondly, we assign weighting factors to the subwavelets obtained. Then, using the classical inverse Shearlet transform, we obtain a composite image composed of the weighted subwavelets. After that, we design a fast, rough method to evaluate the noise level of the new image; using this measure as the fitness function, we adopt PSO to find the optimal weighting factors; after many iterations, the optimal factors and the inverse Shearlet transform yield the best denoised image. Experimental results have shown that the proposed algorithm eliminates noise effectively and yields a good peak signal-to-noise ratio (PSNR).

  17. Use of morphologic filters in the computerized detection of lung nodules in digital chest images

    International Nuclear Information System (INIS)

    Yoshimura, H.; Giger, M.L.; Doi, K.; Ahn, N.; MacMahon, H.

    1989-01-01

    The authors have previously described a computerized scheme for the detection of lung nodules based on a difference-image approach, which had a detection accuracy of 70% with 7-8 false positives per image. Currently, they are investigating morphologic filters for the further enhancement/suppression of nodule signals and the removal of false positives. Gray-level morphologic filtering is performed on clinical chest radiographs digitized with an optical drum scanner. Various shapes and sequences of erosion and dilation filters (i.e., determination of the minimum and maximum gray levels, respectively) were examined for signal enhancement and suppression for use in the difference-image approach.
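    Gray-level erosion and dilation are exactly the local minimum and maximum operations the abstract mentions; a pure-NumPy sketch, with an opening that removes bright structures smaller than the structuring element (the 3x3 square element is an illustrative choice, not the paper's):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def erode(img, size=3):
    """Gray-level erosion: local minimum over a size x size neighbourhood."""
    p = size // 2
    w = sliding_window_view(np.pad(img, p, mode='edge'), (size, size))
    return w.min(axis=(2, 3))

def dilate(img, size=3):
    """Gray-level dilation: local maximum over the same neighbourhood."""
    p = size // 2
    w = sliding_window_view(np.pad(img, p, mode='edge'), (size, size))
    return w.max(axis=(2, 3))

def opening(img, size=3):
    """Opening (erode then dilate) suppresses bright structures smaller
    than the structuring element, e.g. speck-like false positives."""
    return dilate(erode(img, size), size)
```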

  18. A filtering approach to edge preserving MAP estimation of images.

    Science.gov (United States)

    Humphrey, David; Taubman, David

    2011-05-01

    The authors present a computationally efficient technique for maximum a posteriori (MAP) estimation of images in the presence of both blur and noise. The image is divided into statistically independent regions. Each region is modelled with a wide-sense stationary (WSS) Gaussian prior. Classical Wiener filter theory is used to generate a set of convex sets in the solution space, with the solution to the MAP estimation problem lying at the intersection of these sets. The proposed algorithm uses an underlying segmentation of the image, and a means of determining and refining the segmentation is described. The algorithm is suitable for a range of image restoration problems, as it provides a computationally efficient means to deal with the shortcomings of Wiener filtering without sacrificing the computational simplicity of the filtering approach. The algorithm is also of interest from a theoretical viewpoint as it provides a continuum of solutions between Wiener filtering and inverse filtering depending upon the segmentation used. We do not attempt to show here that the proposed method is the best general approach to the image reconstruction problem. However, related work referenced herein shows excellent performance on the specific problem of demosaicing.

  19. Filter and slice thickness selection in SPECT image reconstruction

    International Nuclear Information System (INIS)

    Ivanovic, M.; Weber, D.A.; Wilson, G.A.; O'Mara, R.E.

    1985-01-01

    The choice of filter and slice thickness in SPECT image reconstruction as a function of activity and of linear and angular sampling was investigated in phantom and patient imaging studies. Reconstructed transverse and longitudinal spatial resolution of the system were measured using a line source in a water-filled phantom. Phantom studies included measurements of the Data Spectrum phantom; clinical studies included tomographic procedures in 40 patients undergoing imaging of the temporomandibular joint. Slices of the phantom and patient images were evaluated for spatial resolution, noise, and image quality. Major findings include: spatial resolution and image quality improve with increasing linear sampling frequencies over the range of 4-8 mm/p in the phantom images; the best spatial resolution and image quality in clinical images were observed at a linear sampling frequency of 6 mm/p; the Shepp and Logan filter gives the best spatial resolution for phantom studies at the lowest linear sampling frequency; the smoothed Shepp and Logan filter provides the best quality images without loss of resolution at higher frequencies; and spatial resolution and image quality improve with increased angular sampling frequency in the phantom at 40 c/p but appear to be independent of angular sampling frequency at 400 c/p.

  20. Improvement of natural image search engines results by emotional filtering

    Directory of Open Access Journals (Sweden)

    Patrice Denis

    2016-04-01

    Full Text Available With the Internet 2.0 era, managing user emotions is a problem that more and more actors are interested in. Historically, the first notions of emotion sharing were expressed with emoticons, which allowed users to show their emotional status to others in an impersonal and emotionless digital world. Now, in the Internet of social media, users share lots of content with each other every day on Facebook, Twitter, Google+ and so on. Several popular new web sites like FlickR, Picassa, Pinterest, Instagram or DeviantArt are now specifically based on sharing image content as well as personal emotional status. This kind of information is economically very valuable, as it can for instance help commercial companies sell more efficiently. With this kind of emotional information, companies can better target their customers' needs and even sell them more products. Research has been, and still is, interested in mining emotional information from user data. In this paper, we focus on the impact of emotions in images collected from image search engines. More specifically, we propose a filtering layer applied to the results of such image search engines. To our knowledge, this is the first attempt to filter image search engine results with an emotional filtering approach.

  1. Photonics-Based Microwave Image-Reject Mixer

    Directory of Open Access Journals (Sweden)

    Dan Zhu

    2018-03-01

    Full Text Available Recent developments in photonics-based microwave image-reject mixers (IRMs are reviewed with an emphasis on the pre-filtering method, which applies an optical or electrical filter to remove the undesired image, and the phase cancellation method, which is realized by introducing an additional phase to the converted image and cancelling it through coherent combination without phase shift. Applications of photonics-based microwave IRM in electronic warfare, radar systems and satellite payloads are described. The inherent challenges of implementing photonics-based microwave IRM to meet specific requirements of the radio frequency (RF system are discussed. Developmental trends of the photonics-based microwave IRM are also discussed.
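    The phase-cancellation method can be demonstrated numerically: quadrature downconversion places the desired band at +f_IF and the image at -f_IF, and combining the I channel with a 90-degree-shifted Q channel cancels the image. All frequencies and amplitudes below are arbitrary test values, not from any cited system.

```python
import numpy as np

fs, n = 1000.0, 1000
t = np.arange(n) / fs
f_lo, f_if = 200.0, 50.0
rf = np.cos(2 * np.pi * (f_lo + f_if) * t)          # desired band
rf += 0.8 * np.cos(2 * np.pi * (f_lo - f_if) * t)   # image band

# Quadrature downconversion (complex LO), then an ideal low-pass
# to drop the 2*f_lo mixing products.
bb = rf * np.exp(-2j * np.pi * f_lo * t)
BB = np.fft.fft(bb)
BB[np.abs(np.fft.fftfreq(n, 1 / fs)) > 100.0] = 0.0
bb = np.fft.ifft(BB)
i_ch, q_ch = bb.real, bb.imag

def shift90(x):
    """90-degree phase shift (Hilbert transform) via the FFT."""
    X = np.fft.fft(x)
    return np.real(np.fft.ifft(-1j * np.sign(np.fft.fftfreq(len(x))) * X))

# Phase-cancellation combining: the image's Q component is anti-phased,
# so it cancels, while the desired signal adds coherently.
out = i_ch - shift90(q_ch)
# out is (numerically) a pure cosine at f_if: the image is rejected.
```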

  2. Superharmonic imaging with chirp coded excitation: filtering spectrally overlapped harmonics.

    Science.gov (United States)

    Harput, Sevan; McLaughlan, James; Cowell, David M J; Freear, Steven

    2014-11-01

    Superharmonic imaging improves the spatial resolution by using the higher-order harmonics generated in tissue. The superharmonic component is formed by combining the third, fourth, and fifth harmonics, which have low energy content and therefore poor SNR. This study uses coded excitation to increase the excitation energy. The SNR improvement is achieved on the receiver side by performing pulse compression with harmonic matched filters. The use of coded signals also introduces new filtering capabilities that are not possible with pulsed excitation. This is especially important when using wideband signals. For narrowband signals, the spectral boundaries of the harmonics are clearly separated and thus easy to filter; however, the available imaging bandwidth is underused. Wideband excitation is preferable for harmonic imaging applications to preserve axial resolution, but it generates spectrally overlapping harmonics that are not possible to filter in the time and frequency domains. After pulse compression, this overlap increases the range side lobes, which appear as imaging artifacts and reduce the B-mode image quality. In this study, the isolation of higher-order harmonics was achieved in another domain by using the fan chirp transform (FChT). To show the effect of excitation bandwidth in superharmonic imaging, measurements were performed by using linear frequency-modulated chirp excitation with varying bandwidths of 10% to 50%. Superharmonic imaging was performed on a wire phantom using a wideband chirp excitation. Results were presented with and without applying the FChT filtering technique by comparing the spatial resolution and side lobe levels. Wideband excitation signals achieved a better resolution as expected; however, range side lobes as high as -23 dB were observed for the superharmonic component of chirp excitation with 50% fractional bandwidth. The proposed filtering technique achieved >50 dB range side lobe suppression and improved the image quality without
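    Pulse compression with a matched filter, which underlies the SNR gain from coded excitation, can be sketched for a basic linear FM chirp. The parameters are arbitrary demo values; the harmonic matched filters and FChT filtering of the paper are beyond this sketch.

```python
import numpy as np

fs, T, f0, f1 = 10_000.0, 0.05, 500.0, 1_500.0   # arbitrary demo values
t = np.arange(int(T * fs)) / fs
k = (f1 - f0) / T                                # chirp rate in Hz/s
chirp = np.cos(2 * np.pi * (f0 * t + 0.5 * k * t**2))

# An echo embedded in a longer trace; the matched filter is correlation
# with the transmitted chirp (pulse compression).
echo = np.concatenate([np.zeros(200), chirp, np.zeros(300)])
compressed = np.correlate(echo, chirp, mode='full')

# In 'full' correlation the peak sits at delay + len(chirp) - 1, and the
# mainlobe narrows to roughly fs/bandwidth samples, far shorter than the
# transmitted chirp itself.
peak = int(np.argmax(np.abs(compressed)))
```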

  3. Large-Scale Query-by-Image Video Retrieval Using Bloom Filters

    OpenAIRE

    Araujo, Andre; Chaves, Jason; Lakshman, Haricharan; Angst, Roland; Girod, Bernd

    2016-01-01

    We consider the problem of using image queries to retrieve videos from a database. Our focus is on large-scale applications, where it is infeasible to index each database video frame independently. Our main contribution is a framework based on Bloom filters, which can be used to index long video segments, enabling efficient image-to-video comparisons. Using this framework, we investigate several retrieval architectures, by considering different types of aggregation and different functions to ...
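    For readers unfamiliar with the data structure, a minimal Bloom filter looks like the following. This is a generic sketch only: the paper's frame-descriptor aggregation and segment-level indexing are not reproduced, and the hash scheme, sizes, and frame identifiers are illustrative.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash positions per item in an m-bit array.
    Membership tests may give false positives but never false negatives."""

    def __init__(self, m=1024, k=3):
        self.m, self.k = m, k
        self.bits = bytearray(m)

    def _positions(self, item):
        # Derive k positions by salting one cryptographic hash
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = 1

    def __contains__(self, item):
        return all(self.bits[pos] for pos in self._positions(item))

# Index descriptors from one video segment, then test a query against it
bf = BloomFilter()
for frame_id in ["vid1:f001", "vid1:f002", "vid1:f003"]:
    bf.add(frame_id)
```

    The appeal for query-by-image retrieval is that a whole segment's worth of descriptors collapses into one fixed-size bit array, so a query needs one cheap membership test per segment rather than one comparison per frame.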

  4. Denoising of MR images using FREBAS collaborative filtering

    International Nuclear Information System (INIS)

    Ito, Satoshi; Hizume, Masayuki; Yamada, Yoshifumi

    2011-01-01

    We propose a novel image denoising strategy based on the correlation in the FREBAS transformed domain. FREBAS transform is a kind of multi-resolution image analysis which consists of two different Fresnel transforms. It can decompose images into down-scaled images of the same size with a different frequency bandwidth. Since these decomposed images have similar distributions for the same directions from the center of the FREBAS domain, even when the FREBAS signal is hidden by noise in the case of a low-signal-to-noise ratio (SNR) image, the signal distribution can be estimated using the distribution of the FREBAS signal located near the position of interest. We have developed a collaborative Wiener filter in the FREBAS transformed domain which implements collaboration of the standard deviation of the position of interest and that of analogous positions. The experimental results demonstrated that the proposed algorithm improves the SNR in terms of both the total SNR and the SNR at the edges of images. (author)

  5. M2 FILTER FOR SPECKLE NOISE SUPPRESSION IN BREAST ULTRASOUND IMAGES

    Directory of Open Access Journals (Sweden)

    E.S. Samundeeswari

    2016-11-01

    Full Text Available Breast cancer, commonly found in women, is a serious life-threatening disease due to its invasive nature. Ultrasound (US) imaging plays an effective role in screening, early detection, and diagnosis of breast cancer. Speckle noise generally affects medical ultrasound images and also causes a number of difficulties in identifying the Region of Interest. Suppressing speckle noise is a challenging task as it can destroy fine edge details. No specific filter has yet been designed to produce a noise-free BUS image from one contaminated by speckle noise. In this paper the M2 filter, a novel hybrid of linear and nonlinear filters, is proposed and compared to other spatial filters with a 3×3 kernel size. The performance of the proposed M2 filter is measured by statistical quality parameters such as MSE, PSNR and SSI. The experimental analysis clearly shows that the proposed M2 filter outperforms the other spatial filters, with about 2% higher PSNR values with regard to speckle suppression.
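    The abstract does not define the M2 filter itself, so the sketch below shows one plausible hybrid of a linear (mean) and nonlinear (median) 3×3 filter, together with the PSNR metric used for evaluation. The 50/50 blend of mean and median is an assumption for illustration, not the paper's definition.

```python
import numpy as np

def hybrid_mean_median(img, size=3):
    """Hypothetical hybrid filter: average of the window mean (linear part)
    and the window median (nonlinear, edge-respecting part)."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            win = padded[i:i + size, j:j + size]
            out[i, j] = 0.5 * win.mean() + 0.5 * np.median(win)
    return out

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio in dB."""
    mse = np.mean((np.asarray(ref, float) - np.asarray(test, float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

# Constant patch with multiplicative (speckle-like) noise
rng = np.random.default_rng(0)
clean = np.full((32, 32), 100.0)
noisy = clean * (1.0 + 0.2 * rng.normal(size=clean.shape))
denoised = hybrid_mean_median(noisy)
```

    On this synthetic patch the hybrid raises the PSNR over the noisy input, which is the same kind of comparison (MSE/PSNR against a reference) the paper uses to rank filters.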

  6. KALMAN FILTER BASED FEATURE ANALYSIS FOR TRACKING PEOPLE FROM AIRBORNE IMAGES

    Directory of Open Access Journals (Sweden)

    B. Sirmacek

    2012-09-01

    Full Text Available Recently, analysis of mass events in real time using computer vision techniques has become a very important research field. In particular, understanding the motion of people can help to prevent unpleasant conditions. Understanding the behavioral dynamics of people can also help to estimate future states of underground passages, public entrances such as shopping centers, or streets. In order to bring an automated solution to this problem, we propose a novel approach using airborne image sequences. Although airborne image resolutions are not sufficient to see each person in detail, we can still notice a change of color components in the place where a person exists. Therefore, we propose a color-feature-based probabilistic framework in order to detect people automatically. Extracted local features behave as observations of the probability density function (pdf) of the people locations to be estimated. Using an adaptive kernel density estimation method, we estimate the corresponding pdf. First, we use the estimated pdf to detect boundaries of dense crowds. After that, using background information of dense crowds and previously extracted local features, we detect other people in non-crowd regions automatically for each image in the sequence. We benefit from Kalman filtering to track the motion of detected people. To test our algorithm, we use a stadium entrance image data set taken from an airborne camera system. Our experimental results indicate possible usage of the algorithm in real-life mass events. We believe that the proposed approach can also provide crucial information to police departments and crisis management teams to achieve more detailed observations of people in large open-area events to prevent possible accidents or unpleasant conditions.
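    The Kalman-filter tracking step can be illustrated with a standard constant-velocity model. This is a generic sketch, not the authors' implementation: the motion model, noise covariances, and simulated detections are all invented for the example.

```python
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], float)     # constant-velocity motion model
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], float)     # only position is observed
Q = 0.01 * np.eye(4)                    # process noise covariance
R = 0.5 * np.eye(2)                     # measurement noise covariance

def kalman_step(x, P, z):
    # Predict with the motion model
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the detection z
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Track a person moving diagonally at ~1 pixel/frame with noisy detections
rng = np.random.default_rng(1)
x, P = np.zeros(4), np.eye(4)
for step in range(30):
    z = np.array([step, step], float) + rng.normal(0.0, 0.3, 2)
    x, P = kalman_step(x, P, z)
```

    The state [px, py, vx, vy] smooths the noisy per-frame detections and carries a velocity estimate, which is what lets a tracker bridge frames where a person's color feature is momentarily lost.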

  7. HIGH-PRECISION ATTITUDE ESTIMATION METHOD OF STAR SENSORS AND GYRO BASED ON COMPLEMENTARY FILTER AND UNSCENTED KALMAN FILTER

    Directory of Open Access Journals (Sweden)

    C. Guo

    2017-07-01

    Full Text Available Determining the attitude of a satellite at the time of imaging and then establishing the mathematical relationship between image points and ground points is essential in high-resolution remote sensing image mapping. A star tracker is insensitive to high-frequency attitude variation due to measurement noise and satellite jitter, but the low-frequency attitude motion can be determined with high accuracy. A gyro, as a short-term reference for the satellite’s attitude, is sensitive to high-frequency attitude change, but due to the existence of gyro drift and integration error, the attitude determination error increases with time. Based on the opposite noise frequency characteristics of the two kinds of attitude sensors, this paper proposes an on-orbit attitude estimation method for star sensors and gyro based on the Complementary Filter (CF) and Unscented Kalman Filter (UKF). In this study, the principle and implementation of the proposed method are described. First, gyro attitude quaternions are acquired based on the attitude kinematics equation. An attitude information fusion method is then introduced, which applies high-pass filtering and low-pass filtering to the gyro and star tracker, respectively. Second, the attitude fusion data based on the CF are introduced as the observed values of the UKF system in the process of measurement updating. The accuracy and effectiveness of the method are validated based on simulated sensor attitude data. The obtained results indicate that the proposed method can suppress the gyro drift and measurement noise of the attitude sensors, improving the accuracy of the attitude determination significantly compared with the simulated on-orbit attitude and the attitude estimation results of a UKF defined by the same simulation parameters.
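    The complementary-filter idea, low-pass filtering the star tracker and high-pass filtering the integrated gyro, can be sketched in one dimension. This is a toy scalar-angle example with an invented gyro bias; the paper works with quaternions and adds a UKF stage on top.

```python
import numpy as np

def complementary_filter(gyro_rate, star_angle, dt, tau=2.0):
    """First-order complementary filter: the integrated gyro is effectively
    high-pass filtered (its drift is bled off), while the noisy star tracker
    angle is low-pass filtered. tau sets the crossover time constant."""
    alpha = tau / (tau + dt)
    est = np.empty_like(star_angle)
    est[0] = star_angle[0]
    for k in range(1, len(star_angle)):
        est[k] = alpha * (est[k - 1] + gyro_rate[k] * dt) \
                 + (1 - alpha) * star_angle[k]
    return est

rng = np.random.default_rng(0)
dt = 0.1
t = np.arange(0.0, 20.0, dt)
true_angle = 0.5 * t                                # slow rotation [deg]
gyro = 0.5 + 0.05 + rng.normal(0.0, 0.01, t.size)   # rate plus constant drift bias
star = true_angle + rng.normal(0.0, 0.2, t.size)    # noisy but unbiased
fused = complementary_filter(gyro, star, dt)
gyro_only = np.cumsum(gyro) * dt                    # raw integration drifts away
```

    The fused estimate stays close to the true angle even though the raw gyro integral drifts by roughly `bias * time`; this mirrors the abstract's claim that the CF stage suppresses gyro drift while filtering star tracker noise.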

  8. Classification of Textures Using Filter Based Local Feature Extraction

    Directory of Open Access Journals (Sweden)

    Bocekci Veysel Gokhan

    2016-01-01

    Full Text Available In this work, local features are used in the feature extraction process in image processing for textures. The local binary pattern feature extraction method for textures is introduced. Filtering is also used during the feature extraction process to obtain discriminative features. To show the effectiveness of the algorithm, before the extraction process three different types of noise are added to both the train and test images. A Wiener filter and a median filter are used to remove the noise from the images. We evaluate the performance of the method with a Naïve Bayesian classifier. We conduct a comparative analysis on a benchmark dataset with different filters and kernel sizes. Our experiments demonstrate that the feature extraction process combined with filtering gives promising results on noisy images.
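    A basic 8-neighbour local binary pattern feature, as referenced above, can be computed as follows. This is a minimal sketch of the standard LBP; the paper's exact variant, filtering pipeline, and classifier setup are not reproduced.

```python
import numpy as np

def lbp_histogram(img):
    """Basic 8-neighbour local binary pattern: each interior pixel gets an
    8-bit code from thresholding its neighbours against it; the normalised
    256-bin code histogram serves as the texture feature vector."""
    img = img.astype(np.int32)
    c = img[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]   # clockwise from top-left
    codes = np.zeros_like(c)
    h, w = img.shape
    for bit, (di, dj) in enumerate(offsets):
        nb = img[1 + di:h - 1 + di, 1 + dj:w - 1 + dj]
        codes |= (nb >= c).astype(np.int32) << bit
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()

# On a flat patch every neighbour equals the centre, so every code is 255
flat = np.full((8, 8), 7, dtype=np.uint8)
h = lbp_histogram(flat)
```

    Because the codes depend only on intensity ordering, the histogram is robust to monotonic illumination changes, which is one reason LBP pairs well with the denoising filters the paper compares.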

  9. ROV Based Underwater Blurred Image Restoration

    Institute of Scientific and Technical Information of China (English)

    LIU Zhishen; DING Tianfu; WANG Gang

    2003-01-01

    In this paper, we present a method of ROV-based image processing to restore blurred underwater images, starting from the theory of light and image transmission in the sea. A computer is used to simulate the maximum detection range of the ROV under different water body conditions. The receiving irradiance of the video camera at different detection ranges is also calculated. The ROV's detection performance under different water body conditions is given by simulation. We restore the blurred underwater images using the Wiener filter, based on the simulation. The Wiener filter is shown to be a simple, useful method for underwater image restoration in the ROV underwater experiments. We also present examples of restored images of an underwater standard target taken by the video camera in these experiments.
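    Frequency-domain Wiener restoration of a blurred image can be sketched as below. This is a generic Wiener deconvolution with an assumed constant noise-to-signal ratio; the paper's underwater PSF, derived from its light-transmission simulation, is replaced here by a simple box blur for illustration.

```python
import numpy as np

def wiener_deconvolve(blurred, psf, nsr=1e-3):
    """Frequency-domain Wiener filter W = H* / (|H|^2 + NSR), where NSR is
    an assumed constant noise-to-signal power ratio."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(W * G))

# Blur a random test image with a 5x5 box PSF (circular convolution), restore it
rng = np.random.default_rng(0)
img = rng.random((64, 64))
psf = np.ones((5, 5)) / 25.0
H = np.fft.fft2(psf, s=img.shape)
blurred = np.real(np.fft.ifft2(H * np.fft.fft2(img)))
restored = wiener_deconvolve(blurred, psf)
```

    The NSR term regularises the division at frequencies where the blur transfer function is small, which is what keeps the Wiener filter stable where naive inverse filtering would amplify noise.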

  10. 3D early embryogenesis image filtering by nonlinear partial differential equations.

    Science.gov (United States)

    Krivá, Z; Mikula, K; Peyriéras, N; Rizzi, B; Sarti, A; Stasová, O

    2010-08-01

    We present nonlinear diffusion equations, numerical schemes to solve them and their application for filtering 3D images obtained from laser scanning microscopy (LSM) of living zebrafish embryos, with the goal to identify the optimal filtering method and its parameters. In large-scale applications dealing with analysis of 3D+time embryogenesis images, an important objective is the correct detection of the number and position of cell nuclei yielding the spatio-temporal cell lineage tree of embryogenesis. The filtering is the first and necessary step of the image analysis chain and must lead to correct results, removing the noise, sharpening the nuclei edges and correcting the acquisition errors related to spuriously connected subregions. In this paper we study such properties for the regularized Perona-Malik model and for the generalized mean curvature flow equations in the level-set formulation. A comparison with other nonlinear diffusion filters, like tensor anisotropic diffusion and Beltrami flow, is also included. All numerical schemes are based on the same discretization principles, i.e. finite volume method in space and semi-implicit scheme in time, for solving nonlinear partial differential equations. These numerical schemes are unconditionally stable, fast and naturally parallelizable. The filtering results are evaluated and compared first using the mean Hausdorff distance between a gold standard and different isosurfaces of original and filtered data. Then, the number of isosurface connected components in a region of interest (ROI) detected in the original data and after the filtering is compared with the corresponding correct number of nuclei in the gold standard. Such analysis proves the robustness and reliability of the edge-preserving nonlinear diffusion filtering for this type of data and leads to finding the optimal filtering parameters for the studied models and numerical schemes. Further comparisons consist in the ability to split very close objects, which
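    The Perona-Malik model mentioned above can be illustrated with a simple explicit scheme. Note the paper itself uses unconditionally stable finite-volume semi-implicit schemes; the explicit 2-D version below is only a sketch (with a periodic boundary for brevity) and needs a small time step for stability.

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.2, dt=0.2):
    """Explicit Perona-Malik diffusion with the edge-stopping function
    g(s) = exp(-(s/kappa)^2); dt <= 0.25 keeps the explicit scheme stable."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)
    for _ in range(n_iter):
        # Differences toward the four nearest neighbours (periodic boundary)
        dN = np.roll(u, -1, axis=0) - u
        dS = np.roll(u, 1, axis=0) - u
        dE = np.roll(u, -1, axis=1) - u
        dW = np.roll(u, 1, axis=1) - u
        u = u + dt * (g(dN) * dN + g(dS) * dS + g(dE) * dE + g(dW) * dW)
    return u

# Noisy step edge: diffusion smooths the flat regions but preserves the edge,
# which is exactly the property needed to keep nuclei boundaries sharp
rng = np.random.default_rng(0)
clean = np.zeros((32, 32))
clean[:, 16:] = 1.0
noisy = clean + rng.normal(0.0, 0.1, clean.shape)
smoothed = perona_malik(noisy)
```

    Small intensity differences (noise) see g close to 1 and diffuse away, while the large jump at the edge sees g close to 0 and is preserved; kappa is the contrast threshold separating the two regimes.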

  11. [Design Method Analysis and Performance Comparison of Wall Filter for Ultrasound Color Flow Imaging].

    Science.gov (United States)

    Wang, Lutao; Xiao, Jun; Chai, Hua

    2015-08-01

    The successful suppression of clutter arising from stationary or slowly moving tissue is one of the key issues in medical ultrasound color blood-flow imaging. Remaining clutter may cause bias in the mean blood frequency estimation and result in a potentially misleading description of blood flow. In this paper, based on the principle of the general wall filter, the design process of three classes of filters, infinite impulse response with projection initialization (Prj-IIR), polynomial regression (Pol-Reg), and eigen-based filters, is reviewed and analyzed. The performance of the filters was assessed by calculating the bias and variance of the mean blood velocity using a standard autocorrelation estimator. Simulation results show that the performance of the Pol-Reg filter is similar to that of Prj-IIR filters. Both can offer accurate estimation of mean blood-flow speed under steady clutter conditions, and the clutter rejection ability can be enhanced by increasing the ensemble size of the Doppler vector. Eigen-based filters can effectively remove the non-stationary clutter component and further improve the estimation accuracy for low-speed blood-flow signals. There is also no significant increase in computational complexity for eigen-based filters when the ensemble size is less than 10.
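    A polynomial-regression (Pol-Reg) wall filter can be sketched as a projection of each slow-time ensemble onto the orthogonal complement of a low-order polynomial subspace. This is a generic sketch: the polynomial order, ensemble size, and the toy Doppler signal model are illustrative, not the paper's simulation settings.

```python
import numpy as np

def polyreg_wall_filter(ensemble, order=2):
    """Polynomial-regression clutter filter: project each slow-time ensemble
    onto the orthogonal complement of a low-order polynomial subspace, which
    removes the large, slowly varying tissue clutter."""
    n = ensemble.shape[-1]
    x = np.linspace(-1.0, 1.0, n)
    A = np.polynomial.legendre.legvander(x, order)   # basis columns up to 'order'
    Q, _ = np.linalg.qr(A)                           # orthonormalise the basis
    P = np.eye(n) - Q @ Q.T                          # projector onto the complement
    return ensemble @ P

# Slow, strong clutter vs. a faster, weak blood echo (illustrative Doppler model)
n = 16
t = np.arange(n)
clutter = 100.0 * np.exp(2j * np.pi * 0.005 * t)   # near-DC tissue clutter
blood = np.exp(2j * np.pi * 0.25 * t)              # blood at a higher Doppler shift
filtered_clutter = polyreg_wall_filter(clutter[None, :])[0]
filtered_blood = polyreg_wall_filter(blood[None, :])[0]
```

    The slow clutter fits the polynomial subspace almost exactly and is removed, while the faster blood signal is nearly orthogonal to it and passes through; raising the order trades clutter rejection against loss of low-velocity blood signal.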

  12. Sci-Thur AM: YIS – 07: Optimizing dual-energy x-ray parameters using a single filter for both high and low-energy images to enhance soft-tissue imaging

    International Nuclear Information System (INIS)

    Bowman, Wesley; Sattarivand, Mike

    2016-01-01

    Objective: To optimize dual-energy parameters of the ExacTrac stereoscopic x-ray imaging system for lung SBRT patients. Methods: Simulated spectra and a lung phantom were used to optimize filter material, thickness, kVps, and weighting factors to obtain bone-subtracted dual-energy images. Spektr simulations were used to identify materials in the atomic number (Z) range [3–83] based on a metric defined to separate the spectra of high and low energies. Both energies used the same filter due to the time constraints of image acquisition in lung SBRT imaging. A lung phantom containing bone, soft tissue, and a tumor-mimicking material was imaged with filter thicknesses in the range [0–1] mm and kVp in the range [60–140]. A cost function based on the contrast-to-noise ratio of bone, soft tissue, and tumor, as well as image noise content, was defined to optimize filter thickness and kVp. Using the optimized parameters, dual-energy images of an anthropomorphic Rando phantom were acquired and evaluated for bone subtraction. Imaging dose was measured with the dual-energy technique using tin filtering. Results: Tin was the material of choice, providing the best energy separation, non-toxicity, and non-reactiveness. The best soft-tissue-only image in the lung phantom was obtained using 0.3 mm tin and a [140, 80] kVp pair. Dual-energy images of the Rando phantom had noticeable bone elimination when compared to no filtration. Dose was lower with tin filtering compared to no filtration. Conclusions: Dual-energy soft-tissue imaging is feasible using the ExacTrac stereoscopic imaging system utilizing a single tin filter for both high and low energies and optimized acquisition parameters.
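    The bone-cancelling weighted subtraction behind dual-energy imaging can be illustrated with a toy monoenergetic attenuation model. The attenuation coefficients and weighting factor below are invented for illustration only and are not the study's optimized parameters.

```python
import numpy as np

def dual_energy_soft_tissue(i_high, i_low, w):
    """Weighted log subtraction: ln(I_high) - w * ln(I_low).
    Choosing w as the ratio of bone attenuation at the two energies
    cancels bone, leaving a soft-tissue-only image."""
    return np.log(i_high) - w * np.log(i_low)

# Toy monoenergetic model: I = exp(-(mu_bone * t_bone + mu_soft * t_soft))
mu_bone = {"high": 0.5, "low": 1.0}   # bone attenuates far more at low kVp
mu_soft = {"high": 0.2, "low": 0.25}
t_bone = np.array([0.0, 1.0, 1.0])    # three pixels: soft only, bone only, both
t_soft = np.array([1.0, 0.0, 1.0])
I = {e: np.exp(-(mu_bone[e] * t_bone + mu_soft[e] * t_soft))
     for e in ("high", "low")}

# w = mu_bone_high / mu_bone_low removes the bone term exactly in this model
soft_only = dual_energy_soft_tissue(I["high"], I["low"], w=0.5)
```

    In the sketch the bone-only pixel maps exactly to zero while soft tissue keeps a nonzero signal; with real polychromatic spectra the cancellation is only approximate, which is why the study optimizes the filter and kVp pair.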

  13. Sci-Thur AM: YIS – 07: Optimizing dual-energy x-ray parameters using a single filter for both high and low-energy images to enhance soft-tissue imaging

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, Wesley; Sattarivand, Mike [Department of Radiation Oncology, Dalhousie University at Nova Scotia Health Authority, Department of Radiation Oncology, Dalhousie University at Nova Scotia Health Authority (Canada)

    2016-08-15

    Objective: To optimize dual-energy parameters of the ExacTrac stereoscopic x-ray imaging system for lung SBRT patients. Methods: Simulated spectra and a lung phantom were used to optimize filter material, thickness, kVps, and weighting factors to obtain bone-subtracted dual-energy images. Spektr simulations were used to identify materials in the atomic number (Z) range [3–83] based on a metric defined to separate the spectra of high and low energies. Both energies used the same filter due to the time constraints of image acquisition in lung SBRT imaging. A lung phantom containing bone, soft tissue, and a tumor-mimicking material was imaged with filter thicknesses in the range [0–1] mm and kVp in the range [60–140]. A cost function based on the contrast-to-noise ratio of bone, soft tissue, and tumor, as well as image noise content, was defined to optimize filter thickness and kVp. Using the optimized parameters, dual-energy images of an anthropomorphic Rando phantom were acquired and evaluated for bone subtraction. Imaging dose was measured with the dual-energy technique using tin filtering. Results: Tin was the material of choice, providing the best energy separation, non-toxicity, and non-reactiveness. The best soft-tissue-only image in the lung phantom was obtained using 0.3 mm tin and a [140, 80] kVp pair. Dual-energy images of the Rando phantom had noticeable bone elimination when compared to no filtration. Dose was lower with tin filtering compared to no filtration. Conclusions: Dual-energy soft-tissue imaging is feasible using the ExacTrac stereoscopic imaging system utilizing a single tin filter for both high and low energies and optimized acquisition parameters.

  14. Perfect blind restoration of images blurred by multiple filters: theory and efficient algorithms.

    Science.gov (United States)

    Harikumar, G; Bresler, Y

    1999-01-01

    We address the problem of restoring an image from its noisy convolutions with two or more unknown finite impulse response (FIR) filters. We develop theoretical results about the existence and uniqueness of solutions, and show that under some generically true assumptions, both the filters and the image can be determined exactly in the absence of noise, and stably estimated in its presence. We present efficient algorithms to estimate the blur functions and their sizes. These algorithms are of two types, subspace-based and likelihood-based, and are extensions of techniques proposed for the solution of the multichannel blind deconvolution problem in one dimension. We present memory and computation-efficient techniques to handle the very large matrices arising in the two-dimensional (2-D) case. Once the blur functions are determined, they are used in a multichannel deconvolution step to reconstruct the unknown image. The theoretical and practical implications of edge effects, and "weakly exciting" images are examined. Finally, the algorithms are demonstrated on synthetic and real data.

  15. First magnetic resonance imaging-guided aortic stenting and cava filter placement using a polyetheretherketone-based magnetic resonance imaging-compatible guidewire in swine: proof of concept.

    Science.gov (United States)

    Kos, Sebastian; Huegli, Rolf; Hofmann, Eugen; Quick, Harald H; Kuehl, Hilmar; Aker, Stephanie; Kaiser, Gernot M; Borm, Paul J A; Jacob, Augustinus L; Bilecen, Deniz

    2009-05-01

    The purpose of this study was to demonstrate the feasibility of percutaneous transluminal aortic stenting and cava filter placement under magnetic resonance imaging (MRI) guidance exclusively using a polyetheretherketone (PEEK)-based MRI-compatible guidewire. Percutaneous transluminal aortic stenting and cava filter placement were performed in 3 domestic swine. Procedures were performed under MRI guidance in an open-bore 1.5-T scanner. The applied 0.035-inch guidewire has a PEEK core reinforced by fibres, a floppy tip, hydrophilic coating, and paramagnetic markings for passive visualization. Through an 11F sheath, the guidewire was advanced into the abdominal (swine 1) or thoracic aorta (swine 2), and the stents were deployed. The guidewire was advanced into the inferior vena cava (swine 3), and the cava filter was deployed. Postmortem autopsy was performed. Procedural success, guidewire visibility, pushability, and stent support were qualitatively assessed by consensus. Procedure times were documented. Guidewire guidance into the abdominal and thoracic aortas and the inferior vena cava was successful. Stent deployments were successful in the abdominal (swine 1) and thoracic (swine 2) segments of the descending aorta. Cava filter positioning and deployment were successful. Autopsy documented good stent and filter positioning. Guidewire visibility through the applied markers was rated acceptable for aortic stenting and good for venous filter placement. Steerability, pushability, and device support were good. The PEEK-based guidewire allows both percutaneous MRI-guided aortic stenting in the thoracic and abdominal segments of the descending aorta and filter placement in the inferior vena cava with acceptable to good device visibility, and offers good steerability, pushability, and device support.

  16. First Magnetic Resonance Imaging-Guided Aortic Stenting and Cava Filter Placement Using a Polyetheretherketone-Based Magnetic Resonance Imaging-Compatible Guidewire in Swine: Proof of Concept

    International Nuclear Information System (INIS)

    Kos, Sebastian; Huegli, Rolf; Hofmann, Eugen; Quick, Harald H.; Kuehl, Hilmar; Aker, Stephanie; Kaiser, Gernot M.; Borm, Paul J. A.; Jacob, Augustinus L.; Bilecen, Deniz

    2009-01-01

    The purpose of this study was to demonstrate the feasibility of percutaneous transluminal aortic stenting and cava filter placement under magnetic resonance imaging (MRI) guidance exclusively using a polyetheretherketone (PEEK)-based MRI-compatible guidewire. Percutaneous transluminal aortic stenting and cava filter placement were performed in 3 domestic swine. Procedures were performed under MRI guidance in an open-bore 1.5-T scanner. The applied 0.035-inch guidewire has a PEEK core reinforced by fibres, a floppy tip, hydrophilic coating, and paramagnetic markings for passive visualization. Through an 11F sheath, the guidewire was advanced into the abdominal (swine 1) or thoracic aorta (swine 2), and the stents were deployed. The guidewire was advanced into the inferior vena cava (swine 3), and the cava filter was deployed. Postmortem autopsy was performed. Procedural success, guidewire visibility, pushability, and stent support were qualitatively assessed by consensus. Procedure times were documented. Guidewire guidance into the abdominal and thoracic aortas and the inferior vena cava was successful. Stent deployments were successful in the abdominal (swine 1) and thoracic (swine 2) segments of the descending aorta. Cava filter positioning and deployment were successful. Autopsy documented good stent and filter positioning. Guidewire visibility through the applied markers was rated acceptable for aortic stenting and good for venous filter placement. Steerability, pushability, and device support were good. The PEEK-based guidewire allows both percutaneous MRI-guided aortic stenting in the thoracic and abdominal segments of the descending aorta and filter placement in the inferior vena cava with acceptable to good device visibility, and offers good steerability, pushability, and device support.

  17. Two-dimensional restoration of single photon emission computed tomography images using the Kalman filter

    International Nuclear Information System (INIS)

    Boulfelfel, D.; Rangayyan, R.M.; Kuduvalli, G.R.; Hahn, L.J.; Kloiber, R.

    1994-01-01

    The discrete filtered backprojection (DFBP) algorithm used for the reconstruction of single photon emission computed tomography (SPECT) images affects image quality because of the operations of filtering and discretization. The discretization of the filtered backprojection process can cause the modulation transfer function (MTF) of the SPECT imaging system to be anisotropic and nonstationary, especially near the edges of the camera's field of view. The use of shift-invariant restoration techniques fails to restore large images because these techniques do not account for such variations in the MTF. This study presents the application of a two-dimensional (2-D) shift-variant Kalman filter for post-reconstruction restoration of SPECT slices. This filter was applied to SPECT images of a hollow cylinder phantom; a resolution phantom; and a large, truncated cone phantom containing two types of cold spots, a sphere, and a triangular prism. The images were acquired on an ADAC GENESYS camera. A comparison was performed between results obtained by the Kalman filter and those obtained by shift-invariant filters. Quantitative analysis of the restored images performed through measurement of root mean squared errors shows a considerable reduction in error of Kalman-filtered images over images restored using shift-invariant methods

  18. Automated microaneurysm detection method based on double ring filter in retinal fundus images

    Science.gov (United States)

    Mizutani, Atsushi; Muramatsu, Chisako; Hatanaka, Yuji; Suemori, Shinsuke; Hara, Takeshi; Fujita, Hiroshi

    2009-02-01

    The presence of microaneurysms in the eye is one of the early signs of diabetic retinopathy, which is one of the leading causes of vision loss. We have been investigating a computerized method for the detection of microaneurysms on retinal fundus images, which were obtained from the Retinopathy Online Challenge (ROC) database. The ROC provides 50 training cases, in which "gold standard" locations of microaneurysms are provided, and 50 test cases without the gold standard locations. In this study, the computerized scheme was developed by using the training cases. Although the results for the test cases are also included, this paper mainly discusses the results for the training cases because the "gold standard" for the test cases is not known. After image preprocessing, candidate regions for microaneurysms were detected using a double-ring filter. Any potential false positives located in the regions corresponding to blood vessels were removed by automatic extraction of blood vessels from the images. Twelve image features were determined, and the candidate lesions were classified into microaneurysms or false positives using the rule-based method and an artificial neural network. The true positive fraction of the proposed method was 0.45 at 27 false positives per image. Forty-two percent of microaneurysms in the 50 training cases were considered invisible by the consensus of two co-investigators. When the method was evaluated for visible microaneurysms, the sensitivity for detecting microaneurysms was 65% at 27 false positives per image. Our computerized detection scheme could be improved for helping ophthalmologists in the early diagnosis of diabetic retinopathy.
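    The double-ring candidate detector can be sketched as follows. The radii and the synthetic dark-spot patch are illustrative; the paper's preprocessing, blood-vessel removal, and rule-based/ANN classification stages are not reproduced.

```python
import numpy as np

def double_ring_response(img, r_in=2, r_out=5):
    """Double-ring filter sketch: response = mean over the outer ring minus
    mean over the inner disc, so a small dark blob (candidate microaneurysm)
    on a brighter background scores highly."""
    h, w = img.shape
    yy, xx = np.mgrid[-r_out:r_out + 1, -r_out:r_out + 1]
    d2 = xx ** 2 + yy ** 2
    inner = d2 <= r_in ** 2
    ring = (d2 > r_in ** 2) & (d2 <= r_out ** 2)
    resp = np.zeros((h, w))
    for i in range(r_out, h - r_out):
        for j in range(r_out, w - r_out):
            win = img[i - r_out:i + r_out + 1, j - r_out:j + r_out + 1]
            resp[i, j] = win[ring].mean() - win[inner].mean()
    return resp

# Synthetic fundus-like patch: bright background with one 3x3 dark spot
img = np.full((21, 21), 200.0)
img[9:12, 9:12] = 80.0
resp = double_ring_response(img)
peak = np.unravel_index(np.argmax(resp), resp.shape)   # best candidate location
```

    Thresholding this response gives the candidate regions; the paper then removes candidates lying on vessels and classifies the remainder with twelve image features.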

  19. Sealing Clay Text Segmentation Based on Radon-Like Features and Adaptive Enhancement Filters

    Directory of Open Access Journals (Sweden)

    Xia Zheng

    2015-01-01

    Full Text Available Text extraction is a key issue in sealing clay research. The traditional method based on rubbings increases the risk of damage to the sealing clay and is unfavorable to its protection. Therefore, using digital images of sealing clay, a new method for text segmentation based on Radon-like features and adaptive enhancement filters is proposed in this paper. First, an adaptive enhancement LM filter bank is used to get the maximum energy image; second, the edge image of the maximum energy image is calculated; finally, Radon-like feature images are generated by combining the maximum energy image and its edge image. The average of the Radon-like feature images is segmented by the image thresholding method. Compared with 2D Otsu, GA, and FastFCM, the experimental results show that this method performs better in terms of accuracy and completeness of the text.

  20. US images encoding envelope amplitude following narrow band filtering

    International Nuclear Information System (INIS)

    Sommer, F.G.; Stern, R.A.; Chen, H.S.

    1986-01-01

    Ultrasonic waveform data from phantoms having differing scattering characteristics and from normal and cirrhotic human liver in vivo were recorded within a standardized dynamic range and filtered with narrow band filters either above or below the mean recorded ultrasonic center frequency. Images created by mapping the amplitudes of received ultrasound following such filtration permitted dramatic differentiation, not discernible in conventional US images, of phantoms having differing scattering characteristics, and of normal and cirrhotic human livers

  1. Nonlinear image filtering within IDP++

    Energy Technology Data Exchange (ETDEWEB)

    Lehman, S.K.; Wieting, M.G.; Brase, J.M.

    1995-02-09

    IDP++, image and data processing in C++, is a set of signal processing libraries written in C++. It is a multi-dimensional (up to four dimensions), multi-data-type (implemented through templates) signal processing extension to C++. IDP++ takes advantage of object-oriented compiler technology to provide "information hiding." Users need only know C, not C++. Signals or data sets are treated like any other variable, with a defined set of operators and functions. We present here some examples from the nonlinear filter library within IDP++: specifically, the results of MIN, MAX, median, α-trimmed mean, and edge-trimmed mean filters as applied to a real aperture radar (RAR) and synthetic aperture radar (SAR) data set.
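    Of the filters listed, the α-trimmed mean is perhaps the least familiar; the sketch below gives its generic definition (in Python rather than the IDP++ C++ library, and with an invented window of values).

```python
import numpy as np

def alpha_trimmed_mean(window, alpha=0.25):
    """Alpha-trimmed mean: sort the window, discard the alpha fraction of
    samples at each end, and average the rest. alpha=0 gives the plain mean;
    alpha close to 0.5 approaches the median."""
    flat = np.sort(np.asarray(window, dtype=float).ravel())
    k = int(alpha * flat.size)
    return flat[k:flat.size - k].mean()

# A 3x3 window with one strong speckle outlier (250)
window = [10, 12, 11, 13, 250, 9, 11, 12, 10]
trimmed = alpha_trimmed_mean(window)   # the outlier is discarded before averaging
```

    This interpolation between mean and median is why the filter suits radar speckle: it rejects impulsive outliers like the median while retaining some of the mean's noise averaging.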

  2. Imaging through scattering media by Fourier filtering and single-pixel detection

    Science.gov (United States)

    Jauregui-Sánchez, Y.; Clemente, P.; Lancis, J.; Tajahuerce, E.

    2018-02-01

    We present a novel imaging system that combines the principles of Fourier spatial filtering and single-pixel imaging in order to recover images of an object hidden behind a turbid medium by transillumination. We compare the performance of our single-pixel imaging setup with that of a conventional system. We conclude that the introduction of Fourier gating improves the contrast of images in both cases. Furthermore, we show that the combination of single-pixel imaging and Fourier spatial filtering techniques is particularly well adapted to provide images of objects transmitted through scattering media.

  3. Preprocessing of PHERMEX flash radiographic images with Haar and adaptive filtering

    International Nuclear Information System (INIS)

    Brolley, J.E.

    1978-11-01

    Work on image preparation has continued with the application of high-sequency boosting via Haar filtering. This is useful in developing line or edge structures. Widrow LMS adaptive filtering has also been shown to be useful in developing edge structure in special problems. Shadow effects can be obtained with the latter which may be useful for some problems. Combined Haar and adaptive filtering is illustrated for a PHERMEX image

  4. Artifact reduction of compressed images and video combining adaptive fuzzy filtering and directional anisotropic diffusion

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Forchhammer, Søren; Korhonen, Jari

    2011-01-01

    Fuzzy filtering is one of the recently developed methods for reducing distortion in compressed images and video. In this paper, we combine the powerful anisotropic diffusion equations with fuzzy filtering in order to reduce the impact of artifacts. Based on the directional nature of the blocking and ringing artifacts, we have applied directional anisotropic diffusion. Besides that, the selection of the adaptive threshold parameter for the diffusion coefficient has also improved the performance of the algorithm. Experimental results on JPEG compressed images as well as MJPEG and H.264 compressed videos show improvement in artifact reduction of the proposed algorithm over other directional and spatial fuzzy filters.

  5. Information Recovery Algorithm for Ground Objects in Thin Cloud Images by Fusing Guide Filter and Transfer Learning

    Directory of Open Access Journals (Sweden)

    HU Gensheng

    2018-03-01

    Full Text Available Ground object information in remote sensing images covered with thin clouds is obscured. An information recovery algorithm for ground objects in thin cloud images is proposed by fusing a guide filter and transfer learning. Firstly, multi-resolution decomposition of the thin cloud target images and cloud-free guidance images is performed by using the multi-directional nonsubsampled dual-tree complex wavelet transform. Then the decomposed low-frequency subbands are processed by using a support vector guided filter and transfer learning, respectively. The decomposed high-frequency subbands are enhanced by using a modified Laine enhancement function. The low-frequency subbands output by the guided filter and those predicted by the transfer learning model are fused by the method of selection and weighting based on regional energy. Finally, the enhanced high-frequency subbands and the fused low-frequency subbands are reconstructed by using the inverse multi-directional nonsubsampled dual-tree complex wavelet transform to obtain the ground object information recovery images. Experimental results on Landsat-8 OLI multispectral images show that the support vector guided filter can effectively preserve the detail information of the target images, domain-adaptive transfer learning can effectively extend the range of available multi-source and multi-temporal remote sensing images, and good ground object information recovery is obtained by fusing the guide filter and transfer learning to remove thin cloud from the remote sensing images.

  6. An Extension to a Filter Implementation of Local Quadratic Surface for Image Noise Estimation

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    1999-01-01

    Based on regression analysis, this paper gives a description of simple image filter design. Specifically, 3x3 filter implementations of a quadratic surface, residuals from this surface, gradients and the Laplacian are given. For the residual, a 5x5 filter is also given. It is shown that the 3x3......) it is concluded that if striping is to be considered as a part of the noise, the residual from a 3x3 median filter seems best. If we are interested in a salt-and-pepper noise estimator the proposed extension to the 3x3 filter for the residual from a quadratic surface seems best. Simple statistics...

  7. An image-space parallel convolution filtering algorithm based on shadow map

    Science.gov (United States)

    Li, Hua; Yang, Huamin; Zhao, Jianping

    2017-07-01

    Shadow mapping is commonly used in real-time rendering. In this paper, we present an accurate and efficient method for generating soft shadows from planar area lights. The method first generates a depth map from the light's view and analyzes the depth-discontinuity areas as well as shadow boundaries. These areas are then described as binary values in a texture map called the binary light-visibility map, and a GPU-based parallel convolution filtering algorithm is applied to smooth out the boundaries with a box filter. Experiments show that our algorithm is an effective shadow-map-based method that produces perceptually accurate soft shadows in real time, with more detail at shadow boundaries than previous works.

  8. A Kalman filter technique applied for medical image reconstruction

    International Nuclear Information System (INIS)

    Goliaei, S.; Ghorshi, S.; Manzuri, M. T.; Mortazavi, M.

    2011-01-01

    Medical images contain information about vital organic tissues inside the human body and are widely used for diagnosis of disease or for surgical purposes. Image reconstruction is essential for medical imaging in applications such as noise suppression or de-blurring, in order to provide images with better quality and contrast. Due to the vital role of image reconstruction in the medical sciences, algorithms with better efficiency and higher speed are desirable. Most algorithms in image reconstruction operate in the frequency domain, the most popular being filtered back projection. In this paper we introduce a Kalman filter technique which operates in the time domain for medical image reconstruction. Results indicated that, as the number of projections increases, for both the normal collected ray sums and the collected ray sums corrupted by noise, the quality of the reconstructed image improves in terms of contrast and transparency. It is also seen that as the number of projections increases, the error index decreases.
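
The record's time-domain reconstruction is specific to tomographic ray sums, but the predict/update recursion a Kalman filter relies on can be illustrated on a scalar signal. A minimal sketch, assuming a nearly constant state; the process-noise variance `q` and measurement-noise variance `r` are illustrative values, not from the paper:

```python
import numpy as np

def kalman_1d(measurements, q=1e-5, r=0.1 ** 2):
    """Scalar Kalman filter tracking a (nearly) constant value.
    q: process noise variance, r: measurement noise variance."""
    x, p = 0.0, 1.0          # initial state estimate and its variance
    estimates = []
    for z in measurements:
        p += q               # predict: the state is assumed (almost) constant
        k = p / (p + r)      # Kalman gain
        x += k * (z - x)     # update with the innovation
        p *= 1.0 - k
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(1)
true_value = 0.5
z = true_value + 0.1 * rng.standard_normal(200)
est = kalman_1d(z)
```

The estimate converges toward the true value while the per-sample measurement noise is averaged down, which is the same mechanism the paper exploits across successive projections.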

  9. Visible Wavelength Color Filters Using Dielectric Subwavelength Gratings for Backside-Illuminated CMOS Image Sensor Technologies.

    Science.gov (United States)

    Horie, Yu; Han, Seunghoon; Lee, Jeong-Yub; Kim, Jaekwan; Kim, Yongsung; Arbabi, Amir; Shin, Changgyun; Shi, Lilong; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Lee, Hong-Seok; Hwang, Sungwoo; Faraon, Andrei

    2017-05-10

    We report transmissive color filters based on subwavelength dielectric gratings that can replace conventional dye-based color filters used in backside-illuminated CMOS image sensor (BSI CIS) technologies. The filters are patterned in an 80 nm-thick polysilicon film on a 115 nm-thick SiO2 spacer layer. They are optimized for operating at the primary RGB colors, exhibit peak transmittance of 60-80%, and have an almost angle-insensitive response over a ±20° range. This technology enables shrinking of pixel sizes down to near a micrometer.

  10. Weighted ensemble transform Kalman filter for image assimilation

    Directory of Open Access Journals (Sweden)

    Sebastien Beyou

    2013-01-01

    Full Text Available This study proposes an extension of the Weighted Ensemble Kalman filter (WEnKF) proposed by Papadakis et al. (2010) for the assimilation of image observations. The main focus of this study is a novel formulation of the weighted filter with the Ensemble Transform Kalman filter (WETKF), incorporating directly as a measurement model a non-linear image reconstruction criterion. This technique has been compared to the original WEnKF on numerical and real-world data of 2-D turbulence observed through the transport of a passive scalar. In particular, it has been applied to the reconstruction of oceanic surface current vorticity fields from sea surface temperature (SST) satellite data. The latter technique enables a consistent recovery along time of oceanic surface currents and vorticity maps in the presence of large missing-data areas and strong noise.

  11. GPU-based parallel algorithm for blind image restoration using midfrequency-based methods

    Science.gov (United States)

    Xie, Lang; Luo, Yi-han; Bao, Qi-liang

    2013-08-01

    GPU-based general-purpose computing is a new branch of modern parallel computing, so the study of parallel algorithms specially designed for the GPU hardware architecture is of great significance. In order to solve the problems of high computational complexity and poor real-time performance in blind image restoration, the midfrequency-based algorithm for blind image restoration is analyzed and improved in this paper. Furthermore, a midfrequency-based filtering method is used to restore the image with hardly any recursion or iteration. Combining the algorithm with data intensiveness, data-parallel computing, and the GPU execution model of single instruction, multiple threads, a new parallel midfrequency-based algorithm for blind image restoration is proposed in this paper, which is suitable for GPU stream computing. In this algorithm, the GPU is utilized to accelerate the estimation of class-G point spread functions and midfrequency-based filtering. For better management of the GPU threads, the threads in a grid are scheduled according to the decomposition of the filtering data in the frequency domain, after optimization of data access and of communication between the host and the device. The kernel parallelism structure is determined by the decomposition of the filtering data, so that the transmission rate works around the memory bandwidth limitation. The results show that, with the new algorithm, the operational speed is significantly increased and the real-time performance of image restoration is effectively improved, especially for high-resolution images.

  12. Apodized RFI filtering of synthetic aperture radar images

    Energy Technology Data Exchange (ETDEWEB)

    Doerry, Armin Walter

    2014-02-01

    Fine resolution Synthetic Aperture Radar (SAR) systems necessarily require wide bandwidths that often overlap spectrum utilized by other wireless services. These other emitters pose a source of Radio Frequency Interference (RFI) to the SAR echo signals that degrades SAR image quality. Filtering, or excising, the offending spectral contaminants will mitigate the interference, but at a cost of often degrading the SAR image in other ways, notably by raising offensive sidelobe levels. This report proposes borrowing an idea from nonlinear sidelobe apodization techniques to suppress interference without the attendant increase in sidelobe levels. The simple post-processing technique is termed Apodized RFI Filtering (ARF).

  13. Wavelet Filter Banks for Super-Resolution SAR Imaging

    Science.gov (United States)

    Sheybani, Ehsan O.; Deshpande, Manohar; Memarsadeghi, Nargess

    2011-01-01

    This paper discusses innovative wavelet-based filter banks designed to enhance the analysis of super-resolution Synthetic Aperture Radar (SAR) images using parametric spectral methods and signal classification algorithms. SAR finds applications in many of NASA's earth science fields, such as deformation, ecosystem structure, dynamics of ice, snow and cold land processes, and surface water and ocean topography. Traditionally, standard methods such as the Fast Fourier Transform (FFT) and Inverse Fast Fourier Transform (IFFT) have been used to extract images from SAR radar data. Due to the non-parametric features of these methods, their resolution limitations, and their observation time dependence, the use of spectral estimation and wavelet-based signal pre- and post-processing techniques for SAR radar data has been proposed. Multi-resolution wavelet transforms and advanced spectral estimation techniques have proven to offer efficient solutions to this problem.

  14. Filtering and deconvolution for bioluminescence imaging of small animals

    International Nuclear Information System (INIS)

    Akkoul, S.

    2010-01-01

    This thesis is devoted to the analysis of bioluminescence images applied to the small animal. This imaging modality is used in cancerology studies. Nevertheless, the light of internal bioluminescent sources is diffused and absorbed by the tissues, and system noise and cosmic ray noise are also present. This degrades the quality of the images and makes them difficult to analyze. The purpose of this thesis is to overcome these disturbing effects. We first propose an image formation model for bioluminescence images. The processing chain consists of a filtering stage followed by a deconvolution stage. We propose a new median filter to suppress the random-valued impulsive noise which corrupts the acquired images; this filter represents the first block of the proposed chain. For the deconvolution stage, we performed a comparative study of various deconvolution algorithms. It led us to choose a blind deconvolution algorithm initialized with the estimated point spread function of the acquisition system. We first validated our global approach by comparing our results with the ground truth. Through various clinical tests, we have shown that the processing chain allows a significant improvement in spatial resolution and a better distinction of very close tumor sources, which represents a considerable contribution for users of bioluminescence images. (author)

  15. Learning based particle filtering object tracking for visible-light systems.

    Science.gov (United States)

    Sun, Wei

    2015-10-01

    We propose a novel object tracking framework based on an online learning scheme that can work robustly in challenging scenarios. Firstly, a learning-based particle filter is proposed with color and edge-based features. We train a support vector machine (SVM) classifier with object and background information and map the outputs into probabilities; the weight of particles in a particle filter can then be calculated from the probabilistic outputs to estimate the state of the object. Secondly, the tracking loop starts with Lucas-Kanade (LK) affine template matching and is followed by learning-based particle filter tracking. The Lucas-Kanade method estimates errors and updates the object template in the positive-samples dataset, and the learning-based particle filter tracker starts if the LK tracker loses the object. Finally, the SVM classifier evaluates every tracked appearance to update the training set or restart the tracking loop if necessary. Experimental results show that our method is robust to challenging light, scale, and pose changes, and tests on the eButton image sequence also achieve satisfactory tracking performance.
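
The SVM-weighted particle filter is the authors' design; the generic predict-weight-resample loop it builds on can be sketched with a stand-in likelihood in place of the SVM's probabilistic output. Everything below (1-D state, Gaussian stand-in likelihood, parameter values) is illustrative, not from the paper:

```python
import numpy as np

def particle_filter_step(particles, weights, likelihood, motion_std=1.0, rng=None):
    """One predict-weight-resample step; `likelihood` stands in for the
    classifier's probabilistic output in the paper's scheme."""
    rng = rng or np.random.default_rng()
    # predict: diffuse particles with a random-walk motion model
    particles = particles + rng.normal(0.0, motion_std, size=particles.shape)
    # update: reweight each particle by its observation likelihood
    weights = weights * likelihood(particles)
    weights = weights / weights.sum()
    # resample to avoid weight degeneracy
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

rng = np.random.default_rng(4)
target = 10.0
likelihood = lambda p: np.exp(-0.5 * ((p - target) / 2.0) ** 2)
parts = rng.uniform(0.0, 20.0, size=200)
w = np.full(200, 1.0 / 200)
for _ in range(10):
    parts, w = particle_filter_step(parts, w, likelihood, rng=rng)
estimate = parts.mean()
```

After a few iterations the particle cloud concentrates around the likelihood peak, and the weighted (here, post-resampling) mean serves as the state estimate.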

  16. Image restoration technique using median filter combined with decision tree algorithm

    International Nuclear Information System (INIS)

    Sethu, D.; Assadi, H.M.; Hasson, F.N.; Hasson, N.N.

    2007-01-01

    Images are usually corrupted during transmission, principally due to interference in the channel used for transmission. Images can also be impaired by the addition of various forms of noise; salt-and-pepper noise is a common impairment. Salt-and-pepper noise can be caused by errors in data transmission, malfunctioning pixel elements in camera sensors, and timing errors in the digitization process. During the filtering of a noisy image, important features such as edges, lines, and other fine details embedded in the image tend to blur because of the filtering operation. The enhancement of noisy data, however, is a very critical process because the sharpening operation can significantly increase the noise. In this respect, contrast enhancement is often necessary in order to highlight details that have been blurred. In this proposed approach we aim to develop an image processing technique that can meet these new requirements of high quality and high speed, prevent noise accretion during the sharpening of image details, and compare the images restored via the proposed method with those of other kinds of filters. (author)
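
As a baseline for the kind of restoration discussed above, a plain 3x3 median filter already removes most salt-and-pepper corruption (the paper's decision-tree refinement is not reproduced here). A minimal sketch, using wrap-around borders and a synthetic flat image for simplicity:

```python
import numpy as np

def median3x3(img):
    """3x3 median filter built from shifted copies (wrap-around borders)."""
    shifts = [np.roll(np.roll(img, i, 0), j, 1)
              for i in (-1, 0, 1) for j in (-1, 0, 1)]
    return np.median(np.stack(shifts), axis=0)

rng = np.random.default_rng(0)
clean = np.full((32, 32), 0.5)
noisy = clean.copy()
mask = rng.random(clean.shape) < 0.10          # ~10% salt-and-pepper density
noisy[mask] = rng.choice([0.0, 1.0], size=mask.sum())
restored = median3x3(noisy)
mse_noisy = np.mean((noisy - clean) ** 2)
mse_restored = np.mean((restored - clean) ** 2)
```

At this noise density nearly every 3x3 window still holds a majority of uncorrupted pixels, so the median recovers the true value almost everywhere; at higher densities the switching and multi-stage schemes described in the surrounding records become necessary.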

  17. A multi-stage noise adaptive switching filter for extremely corrupted images

    Science.gov (United States)

    Dinh, Hai; Adhami, Reza; Wang, Yi

    2015-07-01

    A multi-stage noise adaptive switching filter (MSNASF) is proposed for the restoration of images extremely corrupted by impulse and impulse-like noise. The filter consists of two steps: noise detection and noise removal. The proposed extrema-based noise detection scheme utilizes the false contouring effect to achieve a better over-detection rate at low noise density. It is adaptive and detects not only impulse but also impulse-like noise. In the noise removal step, a novel multi-stage filtering scheme is proposed. It replaces each corrupted pixel with the nearest uncorrupted median to preserve details. When compared with other methods, MSNASF provides a better peak signal-to-noise ratio (PSNR) and structural similarity index (SSIM). A subjective evaluation carried out online also demonstrates that MSNASF yields higher fidelity.
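
The full MSNASF pipeline is multi-stage, but the switching idea — filter only the pixels flagged as impulses and leave everything else untouched — can be shown compactly. This simplified sketch flags only exact extremes (0/255) and replaces them with the local median, which is cruder than the paper's extrema-based detector and nearest-uncorrupted-median replacement:

```python
import numpy as np

def switching_median(img):
    """Replace only extreme-valued pixels (0 or 255) with the 3x3 median;
    all other pixels pass through unchanged."""
    noisy_mask = (img == 0) | (img == 255)
    shifts = [np.roll(np.roll(img, i, 0), j, 1)
              for i in (-1, 0, 1) for j in (-1, 0, 1)]
    med = np.median(np.stack(shifts), axis=0)
    out = img.astype(float).copy()
    out[noisy_mask] = med[noisy_mask]
    return out

img = np.full((16, 16), 120.0)
img[4, 4], img[8, 8] = 255.0, 0.0        # two impulses
img[2, :] = 200.0                        # a legitimate bright line
filtered = switching_median(img)
```

Because the bright line never reaches the extreme values, it survives intact — the detail-preservation property that distinguishes switching filters from a blanket median.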

  18. CHANGE DETECTION VIA SELECTIVE GUIDED CONTRASTING FILTERS

    Directory of Open Access Journals (Sweden)

    Y. V. Vizilter

    2017-05-01

    Full Text Available Change detection scheme based on guided contrasting was previously proposed. Guided contrasting filter takes two images (test and sample as input and forms the output as filtered version of test image. Such filter preserves the similar details and smooths the non-similar details of test image with respect to sample image. Due to this the difference between test image and its filtered version (difference map could be a basis for robust change detection. Guided contrasting is performed in two steps: at the first step some smoothing operator (SO is applied for elimination of test image details; at the second step all matched details are restored with local contrast proportional to the value of some local similarity coefficient (LSC. The guided contrasting filter was proposed based on local average smoothing as SO and local linear correlation as LSC. In this paper we propose and implement new set of selective guided contrasting filters based on different combinations of various SO and thresholded LSC. Linear average and Gaussian smoothing, nonlinear median filtering, morphological opening and closing are considered as SO. Local linear correlation coefficient, morphological correlation coefficient (MCC, mutual information, mean square MCC and geometrical correlation coefficients are applied as LSC. Thresholding of LSC allows operating with non-normalized LSC and enhancing the selective properties of guided contrasting filters: details are either totally recovered or not recovered at all after the smoothing. These different guided contrasting filters are tested as a part of previously proposed change detection pipeline, which contains following stages: guided contrasting filtering on image pyramid, calculation of difference map, binarization, extraction of change proposals and testing change proposals using local MCC. Experiments on real and simulated image bases demonstrate the applicability of all proposed selective guided contrasting filters. 
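
The two-step guided contrasting scheme described above (smooth, then restore only the details whose local similarity coefficient exceeds a threshold) can be sketched with box-mean smoothing as the SO and thresholded local linear correlation as the LSC. The window radius and threshold below are illustrative choices, not values from the paper:

```python
import numpy as np

def box_mean(a, r=2):
    """(2r+1)^2 box average via shifted copies (wrap-around borders)."""
    out = np.zeros_like(a, dtype=float)
    for i in range(-r, r + 1):
        for j in range(-r, r + 1):
            out += np.roll(np.roll(a, i, 0), j, 1)
    return out / (2 * r + 1) ** 2

def guided_contrasting(test, sample, r=2, thresh=0.7):
    """Smooth the test image, then restore only the details whose local
    correlation with the sample image exceeds `thresh` (all-or-nothing)."""
    mt, ms = box_mean(test, r), box_mean(sample, r)
    cov = box_mean(test * sample, r) - mt * ms
    var_t = box_mean(test * test, r) - mt ** 2
    var_s = box_mean(sample * sample, r) - ms ** 2
    corr = cov / np.sqrt(np.maximum(var_t * var_s, 1e-12))
    keep = (corr > thresh).astype(float)   # thresholded LSC: all or nothing
    return mt + keep * (test - mt)         # restore matched details only

x = np.indices((16, 16))[1]
sample = (x % 4 < 2).astype(float)         # vertical stripes
test = sample.copy()
test[8, 8] += 5.0                          # a detail absent from the sample
out = guided_contrasting(test, sample)
```

The stripes, present in both images, are fully restored; the spike, present only in the test image, is smoothed away — and the difference map `test - out` then highlights exactly the changed region.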

  19. Time Domain Filtering of Resolved Images of Sgr A*

    Energy Technology Data Exchange (ETDEWEB)

    Shiokawa, Hotaka; Doeleman, Sheperd S. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Gammie, Charles F. [Department of Physics, University of Illinois, 1110 West Green Street, Urbana, IL 61801 (United States)

    2017-09-01

    The goal of the Event Horizon Telescope (EHT) is to provide spatially resolved images of Sgr A*, the source associated with the Galactic Center black hole. Because Sgr A* varies on timescales that are short compared to an EHT observing campaign, it is interesting to ask whether variability contains information about the structure and dynamics of the accretion flow. In this paper, we introduce “time-domain filtering,” a technique to filter time-fluctuating images within specific temporal frequency ranges, and demonstrate the power and usage of the technique by applying it to mock millimeter-wavelength images of Sgr A*. The mock image data are generated from a General Relativistic Magnetohydrodynamic (GRMHD) simulation and the general relativistic ray-tracing method. We show that the variability on each line of sight is tightly correlated with a typical radius of emission, because disk emissivity fluctuates on a timescale of the order of the local orbital period. Time-domain filtered images therefore reflect the model-dependent emission radius distribution, which is not accessible in time-averaged images. We show that, in principle, filtered data have the power to distinguish between models with different black-hole spins, different disk viewing angles, and different disk orientations in the sky.
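
Independent of the GRMHD specifics, the time-domain filtering operation itself — isolating a temporal frequency band at every pixel of an image sequence — can be sketched with an FFT along the time axis. A minimal version, assuming uniformly sampled frames (the toy frequencies and frame rate are illustrative):

```python
import numpy as np

def temporal_bandpass(stack, fps, f_lo, f_hi):
    """Keep only temporal-frequency components in [f_lo, f_hi) at each pixel.
    stack: (T, H, W) time series of images."""
    spec = np.fft.rfft(stack, axis=0)
    freqs = np.fft.rfftfreq(stack.shape[0], d=1.0 / fps)
    keep = (freqs >= f_lo) & (freqs < f_hi)
    spec[~keep] = 0                       # zero out DC and out-of-band bins
    return np.fft.irfft(spec, n=stack.shape[0], axis=0)

t = np.arange(64) / 32.0                  # 64 frames at 32 fps
slow = np.sin(2 * np.pi * 1.0 * t)        # 1 Hz component
fast = np.sin(2 * np.pi * 8.0 * t)        # 8 Hz component
stack = (slow + fast)[:, None, None] * np.ones((1, 4, 4))
only_fast = temporal_bandpass(stack, fps=32, f_lo=4.0, f_hi=12.0)
```

The filtered stack retains only the fast oscillation, which is the sense in which such filtering separates variability on different temporal (and hence, per the paper, radial) scales.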

  20. Kalman filter techniques for accelerated Cartesian dynamic cardiac imaging.

    Science.gov (United States)

    Feng, Xue; Salerno, Michael; Kramer, Christopher M; Meyer, Craig H

    2013-05-01

    In dynamic MRI, spatial and temporal parallel imaging can be exploited to reduce scan time. Real-time reconstruction enables immediate visualization during the scan. Commonly used view-sharing techniques suffer from limited temporal resolution, and many of the more advanced reconstruction methods are either retrospective, time-consuming, or both. A Kalman filter model capable of real-time reconstruction can be used to increase the spatial and temporal resolution in dynamic MRI reconstruction. The original study describing the use of the Kalman filter in dynamic MRI was limited to non-Cartesian trajectories because of a limitation intrinsic to the dynamic model used in that study. Here the limitation is overcome, and the model is applied to the more commonly used Cartesian trajectory with fast reconstruction. Furthermore, a combination of the Kalman filter model with Cartesian parallel imaging is presented to further increase the spatial and temporal resolution and signal-to-noise ratio. Simulations and experiments were conducted to demonstrate that the Kalman filter model can increase the temporal resolution of the image series compared with view-sharing techniques and decrease the spatial aliasing compared with TGRAPPA. The method requires relatively little computation, and thus is suitable for real-time reconstruction. Copyright © 2012 Wiley Periodicals, Inc.

  1. Predicting perceptual quality of images in realistic scenario using deep filter banks

    Science.gov (United States)

    Zhang, Weixia; Yan, Jia; Hu, Shiyong; Ma, Yang; Deng, Dexiang

    2018-03-01

    Classical image perceptual quality assessment models usually resort to natural scene statistics methods, which are based on the assumption that certain reliable statistical regularities hold for undistorted images and are corrupted by introduced distortions. However, these models usually fail to accurately predict the degradation severity of images in realistic scenarios, since complex, multiple, and interacting authentic distortions usually appear in them. We propose a quality prediction model based on a convolutional neural network. Quality-aware features extracted from the filter banks of multiple convolutional layers are aggregated into the image representation. Furthermore, an easy-to-implement and effective feature selection strategy is used to further refine the image representation, and finally a linear support vector regression model is trained to map the image representation to subjective perceptual quality scores. The experimental results on benchmark databases demonstrate the effectiveness and generalizability of the proposed model.

  2. Filters involving derivatives with application to reconstruction from scanned halftone images

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Jensen, Kim S.

    1995-01-01

    This paper presents a method for designing finite impulse response (FIR) filters for samples of a 2-D signal, e.g., an image, and its gradient. The filters, which are called blended filters, are decomposable in three filters, each separable in 1-D filters on subsets of the data set. Optimality...... in the minimum mean square error sense (MMSE) of blended filtering is shown for signals with separable autocorrelation function. Relations between correlation functions for signals and their gradients are derived. Blended filters may be composed from FIR Wiener filters using these relations. Simple blended...... is achievable with blended filters...

  3. A New Switching-Based Median Filtering Scheme and Algorithm for Removal of High-Density Salt and Pepper Noise in Images

    Directory of Open Access Journals (Sweden)

    Jayaraj V

    2010-01-01

    Full Text Available A new switching-based median filtering scheme for restoration of images that are highly corrupted by salt and pepper noise is proposed. An algorithm based on the scheme is developed. The new scheme introduces the concept of substitution of noisy pixels by linear prediction prior to estimation. A novel simplified linear predictor is developed for this purpose. The objective of the scheme and algorithm is the removal of high-density salt and pepper noise in images. The new algorithm shows significantly better image quality with good PSNR, reduced MSE, good edge preservation, and reduced streaking. The good performance is achieved with reduced computational complexity. A comparison of the performance is made with several existing algorithms in terms of visual and quantitative results. The performance of the proposed scheme and algorithm is demonstrated.

  4. Magnetic resonance image enhancement using V-filter

    International Nuclear Information System (INIS)

    Yamamoto, H.; Sugita, K.; Kanzaki, N.; Johja, I.; Hiraki, Y.

    1990-01-01

    The purpose of this study is to present a method of boundary enhancement for magnetic resonance images using a V-filter. The boundary of the brain tumor was precisely extracted by region segmentation techniques.

  5. Raman imaging using fixed bandpass filter

    Science.gov (United States)

    Landström, L.; Kullander, F.; Lundén, H.; Wästerby, P.

    2017-05-01

    By using fixed narrow band pass optical filtering and scanning the laser excitation wavelength, hyperspectral Raman imaging could be achieved. Experimental, proof-of-principle results from the Chemical Warfare Agent (CWA) tabun (GA) as well as the common CWA simulant tributyl phosphate (TBP) on different surfaces/substrates are presented and discussed.

  6. Advanced microlens and color filter process technology for the high-efficiency CMOS and CCD image sensors

    Science.gov (United States)

    Fan, Yang-Tung; Peng, Chiou-Shian; Chu, Cheng-Yu

    2000-12-01

    New markets are emerging for digital electronic image devices, especially in visual communications, PC cameras, mobile/cell phones, security systems, toys, vehicle image systems, and computer peripherals for document capture. A one-chip image system, in which the image sensor has a full digital interface, can bring image capture devices into our daily lives. Adding a color filter to such an image sensor, in a pattern of mosaic pixels or wide stripes, can make images more real and colorful. We can say that color filters make life more colorful. What is a color filter? A color filter blocks the image light source except for the color with the specific wavelength and transmittance of the filter itself. The color filter process consists of coating and patterning green, red, and blue (or cyan, magenta, and yellow) mosaic resists onto matched pixels in the image sensing array. From the signal captured at each pixel, we can reconstruct the image of the environment. The wide use of digital electronic cameras and multimedia applications today makes the future of the color filter bright. Although it is challenging, the color filter process is well worth developing. We provide the best service: shorter cycle time, excellent color quality, and high, stable yield. The key issues of the advanced color process that have to be solved and implemented are planarization and micro-lens technology. Many key points of color filter process technology that have to be considered are also described in this paper.

  7. Two-dimensional real-time imaging system for subtraction angiography using an iodine filter

    Science.gov (United States)

    Umetani, Keiji; Ueda, Ken; Takeda, Tohoru; Anno, Izumi; Itai, Yuji; Akisada, Masayoshi; Nakajima, Teiichi

    1992-01-01

    A new type of subtraction imaging system was developed using an iodine filter and a single-energy, broad-bandwidth monochromatized x ray. The x-ray images of coronary arteries made after intravenous injection of a contrast agent are enhanced by an energy-subtraction technique. Filter chopping of the x-ray beam switches energies rapidly, so that a nearly simultaneous pair of filtered and nonfiltered images can be made. By using a high-speed video camera, a pair of 512 × 512 pixel images can be obtained within 9 ms. Three hundred eighty-four images (raw data) are stored in a 144-Mbyte frame memory. After phantom studies, in vivo subtracted images of coronary arteries in dogs were obtained at a rate of 15 images/s.

  8. Evidence-Based Evaluation of Inferior Vena Cava Filter Complications Based on Filter Type

    Science.gov (United States)

    Deso, Steven E.; Idakoji, Ibrahim A.; Kuo, William T.

    2016-01-01

    Many inferior vena cava (IVC) filter types, along with their specific risks and complications, are not recognized. The purpose of this study was to evaluate the various FDA-approved IVC filter types to determine device-specific risks, as a way to help identify patients who may benefit from ongoing follow-up versus prompt filter retrieval. An evidence-based electronic search (FDA Premarket Notification, MEDLINE, FDA MAUDE) was performed to identify all IVC filter types and device-specific complications from 1980 to 2014. Twenty-three IVC filter types (14 retrievable, 9 permanent) were identified. The devices were categorized as follows: conical (n = 14), conical with umbrella (n = 1), conical with cylindrical element (n = 2), biconical with cylindrical element (n = 2), helical (n = 1), spiral (n = 1), and complex (n = 1). Purely conical filters were associated with the highest reported risks of penetration (90–100%). Filters with cylindrical or umbrella elements were associated with the highest reported risk of IVC thrombosis (30–50%). Conical Bard filters were associated with the highest reported risks of fracture (40%). The various FDA-approved IVC filter types were evaluated for device-specific complications based on best current evidence. This information can be used to guide and optimize clinical management in patients with indwelling IVC filters. PMID:27247477

  9. A motion-compensated image filter for low-dose fluoroscopy in a real-time tumor-tracking radiotherapy system

    International Nuclear Information System (INIS)

    Miyamoto, Naoki; Ishikawa, Masayori; Sutherland, Kenneth

    2015-01-01

    In the real-time tumor-tracking radiotherapy system, a surrogate fiducial marker inserted in or near the tumor is detected by fluoroscopy to realize respiratory-gated radiotherapy. The imaging dose caused by fluoroscopy should be minimized. In this work, an image processing technique is proposed for tracing a moving marker in low-dose imaging. The proposed tracking technique is a combination of a motion-compensated recursive filter and template pattern matching. The proposed image filter can reduce motion artifacts resulting from the recursive process based on the determination of the region of interest for the next frame according to the current marker position in the fluoroscopic images. The effectiveness of the proposed technique and the expected clinical benefit were examined by phantom experimental studies with actual tumor trajectories generated from clinical patient data. It was demonstrated that the marker motion could be traced in low-dose imaging by applying the proposed algorithm with acceptable registration error and high pattern recognition score in all trajectories, although some trajectories were not able to be tracked with the conventional spatial filters or without image filters. The positional accuracy is expected to be kept within ±2 mm. The total computation time required to determine the marker position is a few milliseconds. The proposed image processing technique is applicable for imaging dose reduction. (author)
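
The paper's filter combines a motion-compensated recursive filter with template matching; the recursive half can be sketched on its own. This simplified version assumes the per-frame marker motion is already known (in the real system it comes from the pattern-matching step) and uses a wrap-around shift; all sizes and values are illustrative:

```python
import numpy as np

def recursive_filter(frames, shifts, alpha=0.25):
    """Recursive temporal noise filter: shift the running estimate by the known
    per-frame motion before blending, so a moving target is not smeared."""
    acc = frames[0].astype(float)
    out = [acc.copy()]
    for frame, (dy, dx) in zip(frames[1:], shifts[1:]):
        acc = np.roll(np.roll(acc, dy, axis=0), dx, axis=1)  # motion compensation
        acc = alpha * frame + (1.0 - alpha) * acc            # recursive blend
        out.append(acc.copy())
    return out

rng = np.random.default_rng(2)
frames, shifts = [], []
for i in range(8):
    f = 0.05 * rng.standard_normal((16, 16))
    f[2 + i, 2 + i] = 1.0                # marker moving one pixel per frame
    frames.append(f)
    shifts.append((1, 1) if i else (0, 0))
filtered = recursive_filter(frames, shifts)
```

Because the accumulator is re-registered to the marker before each blend, the marker keeps full contrast while the uncorrelated background noise is averaged down — the same reason the paper's filter tolerates lower imaging dose.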

  10. Least median of squares filtering of locally optimal point matches for compressible flow image registration

    International Nuclear Information System (INIS)

    Castillo, Edward; Guerrero, Thomas; Castillo, Richard; White, Benjamin; Rojo, Javier

    2012-01-01

    Compressible flow based image registration operates under the assumption that the mass of the imaged material is conserved from one image to the next. Depending on how the mass conservation assumption is modeled, the performance of existing compressible flow methods is limited by factors such as image quality, noise, large magnitude voxel displacements, and computational requirements. The Least Median of Squares Filtered Compressible Flow (LFC) method introduced here is based on a localized, nonlinear least squares, compressible flow model that describes the displacement of a single voxel that lends itself to a simple grid search (block matching) optimization strategy. Spatially inaccurate grid search point matches, corresponding to erroneous local minimizers of the nonlinear compressible flow model, are removed by a novel filtering approach based on least median of squares fitting and the forward search outlier detection method. The spatial accuracy of the method is measured using ten thoracic CT image sets and large samples of expert determined landmarks (available at www.dir-lab.com). The LFC method produces an average error within the intra-observer error on eight of the ten cases, indicating that the method is capable of achieving a high spatial accuracy for thoracic CT registration. (paper)

  11. Guided filtering for solar image/video processing

    Directory of Open Access Journals (Sweden)

    Long Xu

    2017-06-01

    Full Text Available A new image enhancement algorithm employing guided filtering is proposed in this work for the enhancement of solar images and videos, so that users can easily make out important fine structures embedded in the recorded images/movies of solar observation. The proposed algorithm can efficiently remove image noise, including Gaussian and impulse noise. Meanwhile, it can further highlight fibrous structures on/beyond the solar disk. These fibrous structures can clearly demonstrate the progress of solar flares, prominence/coronal mass ejections, magnetic fields, and so on. The experimental results prove that the proposed algorithm gives a significant enhancement of the visual quality of solar images beyond the original input and several classical image enhancement algorithms, thus facilitating easier determination of interesting solar burst activities from recorded images/movies.
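
The guided filter underlying this enhancement scheme fits a local linear model of the output on a guidance image and then averages the coefficients, which smooths noise while keeping edges present in the guidance. A minimal self-guided sketch (the solar-specific enhancement steps are not reproduced; `r` and `eps` are illustrative, and borders wrap for brevity):

```python
import numpy as np

def box(a, r):
    """(2r+1)^2 box average via shifted copies (wrap-around borders)."""
    out = np.zeros_like(a, dtype=float)
    for i in range(-r, r + 1):
        for j in range(-r, r + 1):
            out += np.roll(np.roll(a, i, 0), j, 1)
    return out / (2 * r + 1) ** 2

def guided_filter(p, I, r=2, eps=0.04):
    """Guided filter: local linear model p ~ a*I + b, coefficients box-averaged.
    Larger eps -> stronger smoothing; structure in I survives in the output."""
    mI, mp = box(I, r), box(p, r)
    cov_Ip = box(I * p, r) - mI * mp
    var_I = box(I * I, r) - mI ** 2
    a = cov_Ip / (var_I + eps)   # ~1 where the guidance has strong structure
    b = mp - a * mI
    return box(a, r) * I + box(b, r)

rng = np.random.default_rng(3)
step = np.zeros((16, 16))
step[:, 8:] = 1.0                        # a sharp vertical edge
noisy = step + 0.05 * rng.standard_normal(step.shape)
out = guided_filter(noisy, noisy)        # self-guided edge-preserving denoising
```

The flat regions come out smooth while the step edge is retained, which is why guided filtering suits the fibrous solar structures described above; detail boosting then amounts to amplifying `noisy - out` and adding it back.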

  12. Biomedical bandpass filter for fluorescence microscopy imaging based on TiO2/SiO2 and TiO2/MgF2 dielectric multilayers

    International Nuclear Information System (INIS)

    Butt, M A; Fomchenkov, S A; Verma, P; Khonina, S N; Ullah, A

    2016-01-01

    We report a design for multilayer dielectric optical filters based on TiO2 and SiO2/MgF2 alternating layers. Titanium dioxide (TiO2) was selected as the high-refractive-index layer (2.5), with silicon dioxide (SiO2) and magnesium fluoride (MgF2) as low-refractive-index layers (1.45 and 1.37, respectively). Miniaturized visible spectrometers are useful for quick and mobile characterization of biological samples. Such devices can be fabricated using Fabry-Perot (FP) filters consisting of two highly reflecting mirrors with a central cavity in between. Distributed Bragg reflectors (DBRs), consisting of alternating high- and low-refractive-index material pairs, are the most commonly used mirrors in FP filters due to their high reflectivity. However, DBRs are highly reflective only over a selected range of wavelengths, known as the stopband of the DBR. This range is usually much smaller than the sensitivity range of the spectrometer. Therefore, a bandpass filter is required to reject wavelengths outside the stopband of the FP DBRs. The proposed filter shows high quality, with an average transmission of 97.4% within the passbands and a transmission outside the passband of around 4%. Special attention has been given to keeping the thickness of the filters within economic limits. These filters are an excellent choice for fluorescence imaging and endoscopic narrow-band imaging. (paper)
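
    The high stopband reflectivity of such a DBR can be checked numerically with the standard characteristic (transfer) matrix method. The sketch below uses the refractive indices quoted in the abstract, quarter-wave layer thicknesses, and a hypothetical glass substrate of index 1.52; it is a simplified normal-incidence model, not the authors' design code.

    ```python
    import numpy as np

    def stack_reflectance(n_layers, d_layers, wavelength, n_in=1.0, n_sub=1.52):
        """Normal-incidence reflectance of a dielectric multilayer
        via the characteristic-matrix method."""
        M = np.eye(2, dtype=complex)
        for n, d in zip(n_layers, d_layers):
            phi = 2 * np.pi * n * d / wavelength
            layer = np.array([[np.cos(phi), 1j * np.sin(phi) / n],
                              [1j * n * np.sin(phi), np.cos(phi)]])
            M = M @ layer
        B, C = M @ np.array([1.0, n_sub])
        r = (n_in * B - C) / (n_in * B + C)
        return abs(r) ** 2
    ```

    For eight TiO2/SiO2 quarter-wave pairs at 550 nm this model already predicts a stopband reflectance above 99%, consistent with the use of such stacks as FP mirrors.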

  13. Decentralized Social Filtering based on Trust

    OpenAIRE

    Olsson, Tomas

    1998-01-01

    This paper describes a decentralised approach to social filtering based on trust between agents in a multiagent system. The social filtering in the proposed approach is built on the interactions between collaborative software agents performing content-based filtering. This means that it uses a mixture of content-based and social filtering and thereby, it takes advantage of both methods.

  14. On Applicability of Tunable Filter Bank Based Feature for Ear Biometrics: A Study from Constrained to Unconstrained.

    Science.gov (United States)

    Chowdhury, Debbrota Paul; Bakshi, Sambit; Guo, Guodong; Sa, Pankaj Kumar

    2017-11-27

    In this paper, an overall framework is presented for person verification using ear biometrics, with a tunable filter bank as the local feature extractor. The tunable filter bank, based on a half-band polynomial of 14th order, extracts distinct features from ear images while maintaining its frequency selectivity property. To advocate the applicability of the tunable filter bank to ear biometrics, recognition tests have been performed on the available constrained databases AMI, WPUT, and IITD and on the unconstrained database UERC. Experiments have been conducted applying the tunable filter based feature extractor to subparts of the ear, with four and six subdivisions of the ear image. Analyzing the experimental results, it has been found that the tunable filter moderately succeeds in distinguishing ear features on par with the state-of-the-art features used for ear recognition. Accuracies of 70.58%, 67.01%, 81.98%, and 57.75% have been achieved on the AMI, WPUT, IITD, and UERC databases, using the Canberra distance as the underlying measure of separation. The performances indicate that the tunable filter is a candidate for recognizing humans from ear images.
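
    The Canberra distance used here as the separation measure has a simple closed form, the sum of |x_i - y_i| / (|x_i| + |y_i|) over feature components. A minimal sketch (the zero-denominator guard is an implementation choice):

    ```python
    import numpy as np

    def canberra(x, y, eps=1e-12):
        """Canberra distance between two feature vectors."""
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        denom = np.abs(x) + np.abs(y)
        # Guard against 0/0 terms, which contribute zero by convention.
        return float(np.sum(np.abs(x - y) / np.maximum(denom, eps)))
    ```

    Because each term is normalized by the component magnitudes, the measure is sensitive to small differences near zero, which can help separate subtle filter-bank responses.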

  15. A Divergence Median-based Geometric Detector with A Weighted Averaging Filter

    Science.gov (United States)

    Hua, Xiaoqiang; Cheng, Yongqiang; Li, Yubo; Wang, Hongqiang; Qin, Yuliang

    2018-01-01

    To overcome the performance degradation of the classical fast Fourier transform (FFT)-based constant false alarm rate detector with limited sample data, a divergence median-based geometric detector on the Riemannian manifold of Hermitian positive definite matrices is proposed in this paper. In particular, an autocorrelation matrix is used to model the correlation of the sample data. This modeling avoids the poor Doppler resolution and the energy spread of the Doppler filter banks that result from the FFT. Moreover, a weighted averaging filter, inspired by the philosophy of bilateral filtering in image denoising, is proposed and combined with the geometric detection framework. Since the weighted averaging filter performs clutter suppression, the performance of the geometric detector is improved. Numerical experiments are given to validate the effectiveness of the proposed method.

  16. Image restoration by Wiener filtering in the presence of signal-dependent noise.

    Science.gov (United States)

    Kondo, K; Ichioka, Y; Suzuki, T

    1977-09-01

    An optimum filter to restore an image degraded by blurring and signal-dependent noise is obtained on the basis of the theory of Wiener filtering. Computer simulations of image restoration using signal-dependent noise models are carried out. It becomes clear that the optimum filter, which makes use of a priori information on the signal-dependent nature of the noise and on the spectral densities of the signal and the noise showing significant spatial correlation, is potentially advantageous.
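
    A frequency-domain sketch of classical Wiener restoration is given below; for brevity it assumes a constant noise-to-signal power ratio in place of the full spectral densities the paper exploits.

    ```python
    import numpy as np

    def wiener_deconvolve(blurred, psf, nsr):
        """Classical Wiener restoration in the frequency domain.
        `nsr` is an assumed constant noise-to-signal power ratio
        (nsr > 0 regularizes frequencies where the blur kernel vanishes)."""
        H = np.fft.fft2(psf, s=blurred.shape)
        G = np.fft.fft2(blurred)
        W = np.conj(H) / (np.abs(H) ** 2 + nsr)
        return np.real(np.fft.ifft2(W * G))
    ```

    With a signal-dependent noise model, the constant `nsr` would be replaced by the ratio of the estimated noise and signal power spectra, which is exactly the extra prior information the optimum filter benefits from.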

  17. Improved Kalman Filter-Based Speech Enhancement with Perceptual Post-Filtering

    Institute of Scientific and Technical Information of China (English)

    WEI Jianqiang; DU Limin; YAN Zhaoli; ZENG Hui

    2004-01-01

    In this paper, a Kalman filter-based speech enhancement algorithm with several improvements over previous work is presented. A new technique based on spectral subtraction is used to separate speech and noise characteristics from noisy speech and to compute the speech and noise autoregressive (AR) parameters. In order to obtain a Kalman filter output with high audible quality, a perceptual post-filter is placed at the output of the Kalman filter to smooth the enhanced speech spectra. Extensive experiments indicate that the newly proposed method works well.

  18. The influence of software filtering in digital mammography image quality

    Science.gov (United States)

    Michail, C.; Spyropoulou, V.; Kalyvas, N.; Valais, I.; Dimitropoulos, N.; Fountos, G.; Kandarakis, I.; Panayiotakis, G.

    2009-05-01

    Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in the early detection of breast cancer, such as conventional and digital x-ray mammography, positron and single-photon emission mammography, etc. A key advantage in digital mammography is that images can be manipulated as simple computer image files. Thus non-dedicated commercially available image manipulation software can be employed to process and store the images. The image processing tools of the Photoshop (CS 2) software usually incorporate digital filters which may be used to reduce image noise, enhance contrast and increase spatial resolution. However, improving one image quality parameter may result in degradation of another. The aim of this work was to investigate the influence of three sharpening filters, named hereafter sharpen, sharpen more and sharpen edges, on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion, it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.

  19. The influence of software filtering in digital mammography image quality

    International Nuclear Information System (INIS)

    Michail, C; Spyropoulou, V; Valais, I; Panayiotakis, G; Kalyvas, N; Fountos, G; Kandarakis, I; Dimitropoulos, N

    2009-01-01

    Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in the early detection of breast cancer, such as conventional and digital x-ray mammography, positron and single-photon emission mammography, etc. A key advantage in digital mammography is that images can be manipulated as simple computer image files. Thus non-dedicated commercially available image manipulation software can be employed to process and store the images. The image processing tools of the Photoshop (CS 2) software usually incorporate digital filters which may be used to reduce image noise, enhance contrast and increase spatial resolution. However, improving one image quality parameter may result in degradation of another. The aim of this work was to investigate the influence of three sharpening filters, named hereafter sharpen, sharpen more and sharpen edges, on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion, it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.

  20. Supervised retinal vessel segmentation from color fundus images based on matched filtering and AdaBoost classifier.

    Directory of Open Access Journals (Sweden)

    Nogol Memari

    Full Text Available The structure and appearance of the blood vessel network in retinal fundus images is an essential part of diagnosing various problems associated with the eyes, such as diabetes and hypertension. In this paper, an automatic retinal vessel segmentation method utilizing matched filter techniques coupled with an AdaBoost classifier is proposed. The fundus image is enhanced using morphological operations, the contrast is increased using the contrast limited adaptive histogram equalization (CLAHE) method, and the inhomogeneity is corrected using the Retinex approach. Then, the blood vessels are enhanced using a combination of B-COSFIRE and Frangi matched filters. From this preprocessed image, different statistical features are computed on a pixel-wise basis and used in an AdaBoost classifier to extract the blood vessel network inside the image. Finally, the segmented images are postprocessed to remove misclassified pixels and regions. The proposed method was validated using the publicly accessible Digital Retinal Images for Vessel Extraction (DRIVE), Structured Analysis of the Retina (STARE), and Child Heart and Health Study in England (CHASE_DB1) datasets, commonly used for determining the accuracy of retinal vessel segmentation methods. The accuracy of the proposed segmentation method was comparable to other state-of-the-art methods while being very close to the manual segmentation provided by the second human observer, with an average accuracy of 0.972, 0.951, and 0.948 in the DRIVE, STARE, and CHASE_DB1 datasets, respectively.
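
    In the spirit of the matched filtering step, though not the exact B-COSFIRE/Frangi filters used in the paper, a classical vessel kernel with an inverted Gaussian cross-profile can be built as follows; sigma and length are illustrative values.

    ```python
    import numpy as np

    def gaussian_matched_kernel(sigma=2.0, length=9):
        """Zero-mean 2D matched-filter kernel modeling a dark vessel
        on a brighter background (classical matched filtering idea)."""
        half = int(3 * sigma)
        x = np.arange(-half, half + 1, dtype=float)
        profile = -np.exp(-x ** 2 / (2 * sigma ** 2))
        kernel = np.tile(profile, (length, 1))
        kernel -= kernel.mean()  # zero mean: no response to flat regions
        return kernel
    ```

    In practice such a kernel is convolved with the image at several orientations and the maximum response per pixel is kept, giving a vessel-enhancement map that a classifier can then threshold.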

  1. Field programmable gate array based hardware implementation of a gradient filter for edge detection in colour images with subpixel precision

    International Nuclear Information System (INIS)

    Schellhorn, M; Rosenberger, M; Correns, M; Blau, M; Goepfert, A; Rueckwardt, M; Linss, G

    2010-01-01

    Within the field of industrial image processing the use of colour cameras is becoming ever more common. Increasingly, the established black-and-white cameras are replaced by economical single-chip colour cameras with a Bayer pattern. The additional colour information is particularly important for recognition and inspection, but it also becomes interesting for geometric metrology if measuring tasks can thereby be solved more robustly or more accurately. However, only few suitable algorithms are available to detect edges with the necessary precision, and all of them require additional computational expenditure. On the basis of a new filter for edge detection in colour images with subpixel precision, its implementation on a pre-processing hardware platform is presented. Hardware-implemented filters offer the advantage that they can easily be used with existing measuring software, since after the filtering a single-channel image is present which unites the information of all colour channels. Advanced field programmable gate arrays represent an ideal platform for the parallel processing of multiple channels, but an effective implementation demands considerable programming effort. Using the example of the colour filter implementation, the arising problems are analyzed and the chosen solution method is presented.

  2. Colorization-Based RGB-White Color Interpolation using Color Filter Array with Randomly Sampled Pattern.

    Science.gov (United States)

    Oh, Paul; Lee, Sukho; Kang, Moon Gi

    2017-06-28

    Recently, several RGB-White (RGBW) color filter arrays (CFAs) have been proposed, which have extra white (W) pixels in the filter array that are highly sensitive. Due to the high sensitivity, the W pixels have better SNR (Signal to Noise Ratio) characteristics than the other color pixels in the filter array, especially in low light conditions. However, most RGBW CFAs are designed so that the acquired RGBW pattern image can be converted into the conventional Bayer pattern image, which is then again converted into the final color image by conventional demosaicing methods, i.e., color interpolation techniques. In this paper, we propose a new RGBW color filter array based on a totally different color interpolation technique, the colorization algorithm. The colorization algorithm was initially proposed for colorizing a gray image into a color image using a small number of color seeds. Here, we adopt this algorithm as a color interpolation technique, so that the RGBW color filter array can be designed with a very large number of W pixels to make the most of the highly sensitive characteristics of the W channel. The resulting RGBW color filter array has a pattern with a large proportion of W pixels, while the small number of RGB pixels is randomly distributed over the array. The colorization algorithm makes it possible to reconstruct the colors from such a small number of RGB values. Due to the large proportion of W pixels, the reconstructed color image has a high SNR value, notably higher than those of conventional CFAs in low light conditions. Experimental results show that much important information that is not perceived in color images reconstructed with conventional CFAs is perceived in the images reconstructed with the proposed method.

  3. Improvement of QR Code Recognition Based on Pillbox Filter Analysis

    Directory of Open Access Journals (Sweden)

    Jia-Shing Sheu

    2013-04-01

    Full Text Available The objective of this paper is to improve the recognition of captured QR code images blurred by defocus, through pillbox filter analysis. QR code images can be captured by digital video cameras. Many factors contribute to QR code decoding failure, such as low image quality, and focus is an important factor that affects the quality of the image. This study discusses out-of-focus QR code images and aims to improve the recognition of the contents in the QR code image. Many studies have used the pillbox filter (circular averaging filter) method to simulate an out-of-focus image, and this method is also used in this investigation. A blurred QR code image is separated into nine levels. In the experiment, four different quantitative approaches are used to reconstruct and decode an out-of-focus QR code image, and the reconstructed QR code images from these approaches are then compared. The final experimental results indicate improvements in identification.
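
    The pillbox (circular averaging) kernel used to model defocus blur can be generated as below; the simple binary thresholding of the disk is an illustrative discretization, not necessarily the exact kernel used in the study.

    ```python
    import numpy as np

    def pillbox_kernel(radius):
        """Normalized circular averaging (pillbox) kernel modeling defocus."""
        r = int(np.ceil(radius))
        y, x = np.mgrid[-r:r + 1, -r:r + 1]
        mask = (x ** 2 + y ** 2) <= radius ** 2
        kernel = mask.astype(float)
        return kernel / kernel.sum()
    ```

    Convolving a sharp QR code image with kernels of increasing radius produces the successive blur levels against which the reconstruction approaches can be tested.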

  4. Hardware design of the median filter based on window structure and Batcher's odd-even sort network

    Directory of Open Access Journals (Sweden)

    SUN Kaimin

    2013-06-01

    Full Text Available Area and speed are two important factors to be considered in designing a median filter with digital circuits. Area considerations require the use of as few logical resources as possible, while speed considerations require the system to work at higher clock frequencies, with as few clock cycles as possible, to complete frame filtering or real-time filtering. This paper gives a new design of a median filter whose hardware structure is a 3×3 window with two buffers. The filter function module is based on Batcher's odd-even sort network. The structural design is implemented in an FPGA, verified with ModelSim software, and realizes video image filtering. The experimental analysis shows that this new median filter structure effectively decreases logical resource usage (merely using 741 logic elements) and accelerates the pixel processing speed up to 27 MHz, achieving real-time processing of video at 30 frames/s. This design is not only practical, but also provides a reference for hardware structure design in digital image processing.
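
    The median-of-9 computation performed by such a 3×3 window filter maps naturally onto a fixed sequence of compare-exchange elements. The software sketch below uses the well-known 19-comparator median network, a close relative of Batcher's odd-even approach, though not necessarily the paper's exact wiring.

    ```python
    def median9(v):
        """Median of 9 values via a fixed 19-comparator compare-exchange
        network; each cs(i, j) maps directly to a hardware comparator."""
        p = list(v)

        def cs(i, j):  # compare-exchange: ensures p[i] <= p[j]
            if p[i] > p[j]:
                p[i], p[j] = p[j], p[i]

        for i, j in [(1, 2), (4, 5), (7, 8), (0, 1), (3, 4), (6, 7),
                     (1, 2), (4, 5), (7, 8), (0, 3), (5, 8), (4, 7),
                     (3, 6), (1, 4), (2, 5), (4, 7), (4, 2), (6, 4),
                     (4, 2)]:
            cs(i, j)
        return p[4]
    ```

    Because the comparison sequence is fixed and data-independent, the network pipelines well in an FPGA: each comparator stage can run every clock cycle regardless of the pixel values.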

  5. A novel spatiotemporal muscle activity imaging approach based on the Extended Kalman Filter.

    Science.gov (United States)

    Wang, Jing; Zhang, Yingchun; Zhu, Xiangjun; Zhou, Ping; Liu, Chenguang; Rymer, William Z

    2012-01-01

    A novel spatiotemporal muscle activity imaging (sMAI) approach has been developed using the Extended Kalman Filter (EKF) to reconstruct internal muscle activities from non-invasive multi-channel surface electromyogram (sEMG) recordings. A distributed bioelectric dipole source model is employed to describe the internal muscle activity space, and a linear relationship between the muscle activity space and the sEMG measurement space is then established. The EKF is employed to recursively solve the ill-posed inverse problem in the sMAI approach, in which the weighted minimum norm (WMN) method is utilized to calculate the initial state and a new nonlinear method is developed, based on the propagating features of muscle activities, to predict the recursive state. A series of computer simulations was conducted to test the performance of the proposed sMAI approach. Results show that the localization error decreases by over 35% and the overlap ratio increases by over 45% compared to the results achieved using the WMN method alone. These promising results demonstrate the feasibility of utilizing the proposed EKF-based sMAI approach to accurately reconstruct internal muscle activities from non-invasive sEMG recordings.
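
    One recursive EKF predict/update cycle of the kind used to solve such an inverse problem can be sketched generically as below; the state-transition and measurement functions, their Jacobians, and the covariances are placeholders, not the paper's dipole-source model.

    ```python
    import numpy as np

    def ekf_step(x, P, z, f, F, h, H, Q, R):
        """One Extended Kalman Filter cycle. f, h are the (possibly
        nonlinear) transition and measurement functions; F, H their
        Jacobians at the current estimate; Q, R the noise covariances."""
        # Predict.
        x_pred = f(x)
        P_pred = F @ P @ F.T + Q
        # Update.
        y = z - h(x_pred)                    # innovation
        S = H @ P_pred @ H.T + R             # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
        x_new = x_pred + K @ y
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new
    ```

    In the sMAI setting, the WMN solution would supply the initial state `x`, while the propagation-based nonlinear model plays the role of `f`.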

  6. Kalman Filtered MR Temperature Imaging for Laser Induced Thermal Therapies

    OpenAIRE

    Fuentes, D.; Yung, J.; Hazle, J. D.; Weinberg, J. S.; Stafford, R. J.

    2011-01-01

    The feasibility of using a stochastic form of Pennes bioheat model within a 3D finite element based Kalman filter (KF) algorithm is critically evaluated for the ability to provide temperature field estimates in the event of magnetic resonance temperature imaging (MRTI) data loss during laser induced thermal therapy (LITT). The ability to recover missing MRTI data was analyzed by systematically removing spatiotemporal information from a clinical MR-guided LITT procedure in human brain and comp...

  7. COMPARISON OF ULTRASOUND IMAGE FILTERING METHODS BY MEANS OF MULTIVARIABLE KURTOSIS

    Directory of Open Access Journals (Sweden)

    Mariusz Nieniewski

    2017-06-01

    Full Text Available Comparison of the quality of despeckled US medical images is complicated because there is no image of a human body that would be free of speckles and could serve as a reference. A number of image metrics are currently used for comparing filtering methods; however, they do not satisfactorily represent the visual quality of images or the medical expert's satisfaction with them. This paper proposes an innovative use of relative multivariate kurtosis for the evaluation of the most important edges in an image. Multivariate kurtosis allows one to introduce an order among the filtered images and can be used as one of the metrics for image quality evaluation. At present there is no method which would jointly consider individual metrics. Furthermore, these metrics are typically defined by comparing the noisy original and filtered images, which is incorrect since the noisy original cannot serve as a gold standard. In contrast, the proposed kurtosis is an absolute measure, calculated independently of any reference image, and it agrees with the medical expert's satisfaction to a large extent. The paper presents a numerical procedure for calculating kurtosis and describes results of such calculations for a computer-generated noisy image, images of a general purpose phantom and a cyst phantom, as well as real-life images of thyroid and carotid artery obtained with a SonixTouch ultrasound machine. 16 different despeckling methods are compared via kurtosis. The paper shows that visually more satisfactory despeckling results are associated with higher kurtosis, and to a certain degree kurtosis can be used as a single metric for evaluation of image quality.
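
    One standard formalization of multivariate kurtosis is Mardia's measure, the mean squared Mahalanobis distance from the sample mean; the paper's relative variant may differ in detail, so the sketch below is illustrative only.

    ```python
    import numpy as np

    def mardia_kurtosis(X):
        """Mardia's multivariate kurtosis of an n-by-p sample matrix:
        the mean of squared Mahalanobis distances from the sample mean.
        For Gaussian data it tends to p * (p + 2)."""
        X = np.asarray(X, dtype=float)
        n, p = X.shape
        centered = X - X.mean(axis=0)
        S = centered.T @ centered / n  # biased sample covariance
        Sinv = np.linalg.inv(S)
        d2 = np.einsum('ij,jk,ik->i', centered, Sinv, centered)
        return float(np.mean(d2 ** 2))
    ```

    Applied to multichannel edge descriptors of a despeckled image, higher values of such a statistic indicate heavier-tailed, better-preserved edge responses.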

  8. Acousto-Optic Tunable Filter Hyperspectral Microscope Imaging Method for Characterizing Spectra from Foodborne Pathogens.

    Science.gov (United States)

    Hyperspectral microscope imaging (HMI) method, which provides both spatial and spectral characteristics of samples, can be effective for foodborne pathogen detection. The acousto-optic tunable filter (AOTF)-based HMI method can be used to characterize spectral properties of biofilms formed by Salmon...

  9. Evaluation of multichannel Wiener filters applied to fine resolution passive microwave images of first-year sea ice

    Science.gov (United States)

    Full, William E.; Eppler, Duane T.

    1993-01-01

    The effectiveness of multichannel Wiener filters in improving images obtained with passive microwave systems was investigated by applying Wiener filters to passive microwave images of first-year sea ice. Four major parameters which define the filter were varied: the lag or pixel offset between the original and the desired scenes, the filter length, the number of lines in the filter, and the weight applied to the empirical correlation functions. The effect of each variable on the image quality was assessed by visually comparing the results. It was found that the application of multichannel Wiener theory to passive microwave images of first-year sea ice resulted in visually sharper images with enhanced textural features and less high-frequency noise. However, Wiener filters induced a slight blocky grain to the image and could produce a type of ringing along scan lines traversing sharp intensity contrasts.

  10. MRT letter: Guided filtering of image focus volume for 3D shape recovery of microscopic objects.

    Science.gov (United States)

    Mahmood, Muhammad Tariq

    2014-12-01

    In this letter, a shape from focus (SFF) method is proposed that utilizes the guided image filtering to enhance the image focus volume efficiently. First, image focus volume is computed using a conventional focus measure. Then each layer of image focus volume is filtered using guided filtering. In this work, the all-in-focus image, which can be obtained from the initial focus volume, is used as guidance image. Finally, improved depth map is obtained from the filtered image focus volume by maximizing the focus measure along the optical axis. The proposed SFF method is efficient and provides better depth maps. The improved performance is highlighted by conducting several experiments using image sequences of simulated and real microscopic objects. The comparative analysis demonstrates the effectiveness of the proposed SFF method. © 2014 Wiley Periodicals, Inc.

  11. UV Fluorescence Photography of Works of Art: Replacing the Traditional UV Cut Filters with Interference Filters

    Directory of Open Access Journals (Sweden)

    Luís BRAVO PEREIRA

    2010-09-01

    Full Text Available For many years, filters like the Kodak Wratten E series, or the equivalent Schneider B+W 415, were used as standard UV cut filters, necessary to obtain good quality in UV fluorescence photography. The only problem with these filters is that, when they receive the UV radiation they are meant to remove, they exhibit internal fluorescence as a side effect, which usually reduces contrast and quality in the final image. This article presents the results of our experiments with some innovative filters that have become available on the market in recent years, designed to absorb UV radiation even more efficiently than the pigment-based standard filters mentioned above: interference filters for UV rejection (and, usually, for IR rejection too), manufactured using interference layers, which give better results than pigment-based filters. The only problem with interference-type filters is that they are sensitive to the direction of the rays and are therefore not suitable for wide-angle lenses. The internal fluorescence of three filters was tested and compared: the B+W 415 UV cut (equivalent to the Kodak Wratten 2E, pigment based), the B+W 486 UV IR cut (an interference-type filter, frequently used on digital cameras to remove IR or UV) and the Baader UVIR rejection filter (two versions of this interference filter were used). The final quality of the UV fluorescence images seems to be superior when compared to the images obtained with classic filters.

  12. An Efficient FPGA Implementation of Optimized Anisotropic Diffusion Filtering of Images

    Directory of Open Access Journals (Sweden)

    Chandrajit Pal

    2016-01-01

    Full Text Available Digital image processing is an exciting area of research with a variety of applications including medical imaging, surveillance and security systems, defence, and space applications. Noise removal as a preprocessing step helps to improve the performance of subsequent signal processing algorithms, thereby enhancing image quality. Anisotropic diffusion filtering, proposed by Perona and Malik, can be used as an edge-preserving smoother, removing high-frequency components of images without blurring their edges. In this paper, we present an FPGA implementation of an edge-preserving anisotropic diffusion filter for digital images. The designed architecture replaces the convolution operation with simple arithmetic subtraction of the neighboring intensities within a kernel, preceded by multiple operations performed in parallel within the kernel. To improve the image reconstruction quality, the diffusion coefficient parameter, responsible for controlling the filtering process, has been properly analyzed. Its signal behavior has been studied by subsequently scaling and differentiating the signal. The hardware implementation of the proposed design shows better reconstruction quality and accelerated performance with respect to its software implementation. It also reduces computation, power consumption, and resource utilization with respect to other related works.
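
    A reference software version of the Perona-Malik scheme that such hardware accelerates is sketched below, using the exponential diffusion coefficient g(|∇I|) = exp(-(|∇I|/kappa)²); the periodic borders via np.roll are a simplification of the usual replicated borders.

    ```python
    import numpy as np

    def anisotropic_diffusion(img, n_iter=20, kappa=30.0, lam=0.2):
        """Perona-Malik edge-preserving smoothing (explicit scheme;
        stable for lam <= 0.25 with 4-neighbour differences)."""
        u = img.astype(float).copy()
        for _ in range(n_iter):
            # Nearest-neighbour differences (periodic borders for brevity).
            dn = np.roll(u, 1, 0) - u
            ds = np.roll(u, -1, 0) - u
            de = np.roll(u, -1, 1) - u
            dw = np.roll(u, 1, 1) - u
            u = u + lam * sum(np.exp(-(d / kappa) ** 2) * d
                              for d in (dn, ds, de, dw))
        return u
    ```

    Note how each update needs only neighbour subtractions, an exponential lookup, and multiply-accumulates, which is what makes the convolution-free hardware mapping described in the abstract attractive.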

  13. Image reconstruction for digital breast tomosynthesis (DBT) by using projection-angle-dependent filter functions

    Energy Technology Data Exchange (ETDEWEB)

    Park, Yeonok; Park, Chulkyu; Cho, Hyosung; Je, Uikyu; Hong, Daeki; Lee, Minsik; Cho, Heemoon; Choi, Sungil; Koo, Yangseo [Yonsei University, Wonju (Korea, Republic of)

    2014-09-15

    Digital breast tomosynthesis (DBT) is considered in clinics as a standard three-dimensional imaging modality, allowing the earlier detection of cancer. It typically acquires only 10-30 projections over a limited angle range of 15-60° with a stationary detector and typically uses a computationally efficient filtered-backprojection (FBP) algorithm for image reconstruction. However, a common FBP algorithm yields poor image quality, resulting from the loss of the average image value and the presence of severe image artifacts, due to the elimination of the dc component of the image by the ramp filter and to the incomplete data, respectively. As an alternative, iterative reconstruction methods are often used in DBT to overcome these difficulties, even though they are computationally expensive. In this study, as a compromise, we considered a projection-angle-dependent filtering method in which one-dimensional geometry-adapted filter kernels are computed with the aid of a conjugate-gradient method and incorporated into the standard FBP framework. We implemented the proposed algorithm and performed systematic simulation studies to investigate its imaging characteristics. Our results indicate that the proposed method is superior to a conventional FBP method for DBT imaging at a comparable computational cost, while preserving good image homogeneity and edge sharpening with no serious image artifacts.
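
    The dc-elimination property of the ramp filter mentioned above is easy to verify with a one-line frequency-domain implementation; this is a plain Ram-Lak kernel, not the authors' geometry-adapted filters.

    ```python
    import numpy as np

    def ramp_filter_projection(proj):
        """Apply the classical ramp (Ram-Lak) filter to one projection
        row in the frequency domain, as in standard FBP."""
        n = proj.shape[-1]
        freqs = np.fft.fftfreq(n)
        # Multiplying by |f| zeroes the dc bin, removing the mean value.
        return np.real(np.fft.ifft(np.fft.fft(proj) * np.abs(freqs)))
    ```

    A constant projection filters to exactly zero, which is precisely the loss of average image value that the geometry-adapted kernels are designed to mitigate.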

  14. Processing of a neutrographic image, using Bosso Filter

    International Nuclear Information System (INIS)

    Pereda, C.; Bustamante, M.; Henriquez, C.

    2006-01-01

    The following paper shows the result of the treatment of a neutron radiographic image, obtained in the RECH-1 experimental reactor, making use of the computational image treatment techniques of the IDL software, complemented with the Bosso filter method already tested to improve quality in medical diagnosis. These techniques possess an undeniable value as an auxiliary to neutrography, as their results show in this first trial with an auxiliary neutrographic image used in PGNAA. The results suggest that the method can bring its full advantages to standard neutrographic analyses: structural images, density variations, etc.

  15. Variation of the count-dependent Metz filter with imaging system modulation transfer function

    International Nuclear Information System (INIS)

    King, M.A.; Schwinger, R.B.; Penney, B.C.

    1986-01-01

    A systematic investigation was conducted of how a number of parameters which alter the system modulation transfer function (MTF) influence the count-dependent Metz filter. Since restoration filters are most effective at those frequencies where the object power spectrum dominates that of the noise, it was observed that parameters which significantly degrade the MTF at low spatial frequencies strongly influence the formation of the Metz filter. Thus the radionuclide imaged and the depth of the source in a scattering medium had the most influence. This is because they alter the relative amount of scattered radiation being imaged. For low-energy photon emitters, the collimator employed and the distance from the collimator were found to have less of an influence but still to be significant. These cause alterations in the MTF which are more gradual, and hence are most pronounced at mid to high spatial frequencies. As long as adequate spatial sampling is employed, the Metz filter was determined to be independent of the exact size of the sampling bin width, to a first approximation. For planar and single photon emission computed tomographic (SPECT) imaging, it is shown that two-dimensional filtering with the Metz filter optimized for the imaging conditions is able to deconvolve scatter and other causes of spatial resolution loss while diminishing noise, all in a balanced manner
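
    The Metz filter itself has a compact closed form in terms of the system MTF; the sketch below is a minimal expression of it, where the power parameter is the count-dependent quantity the study tunes against imaging conditions.

    ```python
    import numpy as np

    def metz_filter(mtf, power):
        """Metz restoration filter M(f) = (1 - (1 - MTF^2)^power) / MTF.
        Larger `power` (higher counts) pushes M toward the inverse filter
        1/MTF; smaller `power` keeps it closer to a smoothing filter."""
        mtf = np.asarray(mtf, dtype=float)
        return (1.0 - (1.0 - mtf ** 2) ** power) / mtf
    ```

    At frequencies where the MTF is near 1 the filter passes the signal unchanged, while at low-MTF (noise-dominated) frequencies the response rolls off, which is the balanced deconvolution-plus-noise-suppression behavior described above.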

  16. Mobile Phone Ratiometric Imaging Enables Highly Sensitive Fluorescence Lateral Flow Immunoassays without External Optical Filters.

    Science.gov (United States)

    Shah, Kamal G; Singh, Vidhi; Kauffman, Peter C; Abe, Koji; Yager, Paul

    2018-05-14

    Paper-based diagnostic tests based on the lateral flow immunoassay concept promise low-cost, point-of-care detection of infectious diseases, but such assays suffer from poor limits of detection. One factor that contributes to poor analytical performance is a reliance on low-contrast chromophoric optical labels such as gold nanoparticles. Previous attempts to improve the sensitivity of paper-based diagnostics include replacing chromophoric labels with enzymes, fluorophores, or phosphors at the expense of increased fluidic complexity or the need for device readers with costly optoelectronics. Several groups, including our own, have proposed mobile phones as suitable point-of-care readers due to their low cost, ease of use, and ubiquity. However, extant mobile phone fluorescence readers require costly optical filters and were typically validated with only one camera sensor module, which is inappropriate for potential point-of-care use. In response, we propose to couple low-cost ultraviolet light-emitting diodes with long Stokes-shift quantum dots to enable ratiometric mobile phone fluorescence measurements without optical filters. Ratiometric imaging with unmodified smartphone cameras improves the contrast and attenuates the impact of excitation intensity variability by 15×. Practical application was shown with a lateral flow immunoassay for influenza A with nucleoproteins spiked into simulated nasal matrix. Limits of detection of 1.5 and 2.6 fmol were attained on two mobile phones, which are comparable to a gel imager (1.9 fmol), 10× better than imaging gold nanoparticles on a scanner (18 fmol), and >2 orders of magnitude better than gold nanoparticle-labeled assays imaged with mobile phones. Use of the proposed filter-free mobile phone imaging scheme is a first step toward enabling a new generation of highly sensitive, point-of-care fluorescence assays.

  17. Tunable thin-film optical filters for hyperspectral microscopy

    Science.gov (United States)

    Favreau, Peter F.; Rich, Thomas C.; Prabhat, Prashant; Leavesley, Silas J.

    2013-02-01

    Hyperspectral imaging was originally developed for use in remote sensing applications. More recently, it has been applied to biological imaging systems, such as fluorescence microscopes. The ability to distinguish molecules based on spectral differences has been especially advantageous for identifying fluorophores in highly autofluorescent tissues. A key component of hyperspectral imaging systems is wavelength filtering. Each filtering technology used for hyperspectral imaging has corresponding advantages and disadvantages. Recently, a new optical filtering technology has been developed that uses multi-layered thin-film optical filters that can be rotated, with respect to incident light, to control the center wavelength of the pass-band. Compared to the majority of tunable filter technologies, these filters have superior optical performance including greater than 90% transmission, steep spectral edges and high out-of-band blocking. Hence, tunable thin-film optical filters present optical characteristics that may make them well-suited for many biological spectral imaging applications. An array of tunable thin-film filters was implemented on an inverted fluorescence microscope (TE 2000, Nikon Instruments) to cover the full visible wavelength range. Images of a previously published model, GFP-expressing endothelial cells in the lung, were acquired using a charge-coupled device camera (Rolera EM-C2, Q-Imaging). This model sample presents fluorescently-labeled cells in a highly autofluorescent environment. Linear unmixing of hyperspectral images indicates that thin-film tunable filters provide equivalent spectral discrimination to our previous acousto-optic tunable filter-based approach, with increased signal-to-noise characteristics. Hence, tunable multi-layered thin film optical filters may provide greatly improved spectral filtering characteristics and therefore enable wider acceptance of hyperspectral widefield microscopy.
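Linear unmixing of the kind referenced above reduces, in its simplest form, to a per-pixel least-squares solve against known endmember spectra. A minimal sketch (the two 4-band spectra and the 30/70 mixture below are made up for illustration):

```python
import numpy as np

def unmix(cube, endmembers):
    # cube: (rows, cols, bands); endmembers: (bands, n_fluorophores).
    # Solves cube[pixel] ~= endmembers @ abundances for every pixel.
    r, c, b = cube.shape
    pixels = cube.reshape(-1, b).T                    # (bands, n_pixels)
    abundances, *_ = np.linalg.lstsq(endmembers, pixels, rcond=None)
    return abundances.T.reshape(r, c, -1)

# Toy cube: every pixel is a 30/70 mix of two known spectra
gfp = np.array([0.1, 0.8, 0.3, 0.05])
autofl = np.array([0.4, 0.3, 0.3, 0.2])
E = np.stack([gfp, autofl], axis=1)                   # (bands, 2)
cube = np.tile(0.3 * gfp + 0.7 * autofl, (8, 8, 1))
maps = unmix(cube, E)                                  # abundance maps
```

The better signal-to-noise and out-of-band blocking of the tunable thin-film filters matter precisely because noise in `cube` propagates directly into these abundance estimates.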

  18. Improving Image Matching by Reducing Surface Reflections Using Polarising Filter Techniques

    Science.gov (United States)

    Conen, N.; Hastedt, H.; Kahmen, O.; Luhmann, T.

    2018-05-01

    In dense stereo matching applications surface reflections may lead to incorrect measurements and blunders in the resulting point cloud. To overcome the problem of disturbing reflections, polarising filters can be mounted on the camera lens and light source. Reflections in the images can be suppressed by crossing the polarising directions of the filters, leading to homogeneously illuminated images and better matching results. However, the filter may influence the camera's orientation parameters as well as the measuring accuracy. To quantify these effects, a calibration and an accuracy analysis are conducted within a spatial test arrangement according to the German guideline VDI/VDE 2634.1 (2002) using a DSLR with and without a polarising filter. In a second test, the interior orientation is analysed in more detail. The results do not show significant changes of the measuring accuracy in object space and only very small changes of the interior orientation (Δc ≤ 4 μm) with the polarising filter in use. Since in medical applications many tiny reflections are present and impede robust surface measurements, a prototypic trinocular endoscope is equipped with the polarising technique. The interior and relative orientation are determined and analysed. The advantage of the polarising technique for medical image matching is shown in an experiment with a moistened pig kidney. The accuracy and completeness of the resulting point cloud can be clearly improved when using polarising filters. Furthermore, an accuracy analysis using a laser triangulation system is performed and the special reflection properties of metallic surfaces are presented.

  19. IMPROVING IMAGE MATCHING BY REDUCING SURFACE REFLECTIONS USING POLARISING FILTER TECHNIQUES

    Directory of Open Access Journals (Sweden)

    N. Conen

    2018-05-01

    Full Text Available In dense stereo matching applications surface reflections may lead to incorrect measurements and blunders in the resulting point cloud. To overcome the problem of disturbing reflections, polarising filters can be mounted on the camera lens and light source. Reflections in the images can be suppressed by crossing the polarising directions of the filters, leading to homogeneously illuminated images and better matching results. However, the filter may influence the camera’s orientation parameters as well as the measuring accuracy. To quantify these effects, a calibration and an accuracy analysis are conducted within a spatial test arrangement according to the German guideline VDI/VDE 2634.1 (2002) using a DSLR with and without a polarising filter. In a second test, the interior orientation is analysed in more detail. The results do not show significant changes of the measuring accuracy in object space and only very small changes of the interior orientation (Δc ≤ 4 μm) with the polarising filter in use. Since in medical applications many tiny reflections are present and impede robust surface measurements, a prototypic trinocular endoscope is equipped with the polarising technique. The interior and relative orientation are determined and analysed. The advantage of the polarising technique for medical image matching is shown in an experiment with a moistened pig kidney. The accuracy and completeness of the resulting point cloud can be clearly improved when using polarising filters. Furthermore, an accuracy analysis using a laser triangulation system is performed and the special reflection properties of metallic surfaces are presented.

  20. Ross filter pairs for metal artefact reduction in x-ray tomography: a case study based on imaging and segmentation of metallic implants

    Science.gov (United States)

    Arhatari, Benedicta D.; Abbey, Brian

    2018-01-01

    Ross filter pairs have recently been demonstrated as a highly effective means of producing quasi-monoenergetic beams from polychromatic X-ray sources. They have found applications in both X-ray spectroscopy and for elemental separation in X-ray computed tomography (XCT). Here we explore whether they could be applied to the problem of metal artefact reduction (MAR) for applications in medical imaging. Metal artefacts are a common problem in X-ray imaging of metal implants embedded in bone and soft tissue. A number of data post-processing approaches to MAR have been proposed in the literature; however, these can be time-consuming and sometimes have limited efficacy. Here we describe and demonstrate an alternative approach based on beam conditioning using Ross filter pairs. This approach obviates the need for any complex post-processing of the data and enables MAR and segmentation from the surrounding tissue by exploiting the absorption-edge contrast of the implant.
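The Ross-pair principle lends itself to a one-line toy model: the two filters transmit almost identically outside the energy band bounded by their K-edges, so subtracting the two filtered images isolates that band. The three-bin spectrum and transmission values below are invented purely to illustrate the idea, not taken from the paper:

```python
import numpy as np

def ross_band_image(img_filter_a, img_filter_b):
    # Difference of two acquisitions through a Ross filter pair:
    # contributions outside the inter-edge band cancel, leaving a
    # quasi-monoenergetic image of that band.
    return img_filter_a - img_filter_b

# Toy spectral model: 3 energy bins, filters differ only in the middle bin
spectrum = np.array([1.0, 2.0, 1.5])        # source counts per bin
trans_a = np.array([0.2, 0.9, 0.3])         # filter A transmission
trans_b = np.array([0.2, 0.1, 0.3])         # filter B transmission
img_a = (spectrum * trans_a).sum()          # detected signal behind A
img_b = (spectrum * trans_b).sum()          # detected signal behind B
band = ross_band_image(img_a, img_b)        # only the middle bin survives
```

Only the middle-bin counts (2.0 × the transmission difference 0.8) remain after subtraction; the outer bins cancel exactly, which is why the pair behaves like a band-pass element.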

  1. An Adaptive Filtering Algorithm Based on Genetic Algorithm-Backpropagation Network

    Directory of Open Access Journals (Sweden)

    Kai Hu

    2013-01-01

    Full Text Available A new image filtering algorithm is proposed. The GA-BPN algorithm uses a genetic algorithm (GA) to decide the weights in a back-propagation neural network (BPN), giving it better global optimization characteristics than traditional optimization algorithms. In this paper, we used the GA-BPN for image noise filtering. Firstly, training samples are used to train the GA-BPN as a noise detector. Then, the well-trained GA-BPN recognizes noise pixels in the target image. Finally, an adaptive weighted average algorithm recovers the noise pixels recognized by the GA-BPN. Experimental data show that this algorithm performs better than other filters.
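The final recovery step (replacing detected noise pixels by a weighted average of clean neighbours) can be sketched as below. Here a precomputed mask stands in for the trained GA-BPN detector, and uniform neighbour weights are an assumption; the paper's adaptive weighting is not specified here:

```python
import numpy as np

def recover_noise_pixels(img, noise_mask):
    # Replace each flagged pixel with the (here: uniformly) weighted
    # average of its non-flagged 8-neighbours.
    out = img.astype(float).copy()
    h, w = img.shape
    for y, x in zip(*np.nonzero(noise_mask)):
        vals, wts = [], []
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if (dy or dx) and 0 <= ny < h and 0 <= nx < w \
                        and not noise_mask[ny, nx]:
                    vals.append(out[ny, nx]); wts.append(1.0)
        if vals:
            out[y, x] = np.average(vals, weights=wts)
    return out

img = np.full((5, 5), 10.0); img[2, 2] = 255.0   # one impulse on a flat patch
mask = img == 255.0                               # stand-in for the detector
clean = recover_noise_pixels(img, mask)
```

Only flagged pixels are modified, so a good detector (the role of the GA-BPN) directly controls how much image detail survives.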

  2. Filter Selection for Optimizing the Spectral Sensitivity of Broadband Multispectral Cameras Based on Maximum Linear Independence.

    Science.gov (United States)

    Li, Sui-Xian

    2018-05-07

    Previous research has shown the effectiveness of selecting filter sets from among a large set of commercial broadband filters by a vector analysis method based on maximum linear independence (MLI). However, the traditional MLI approach is suboptimal because it must predefine the first filter of the selected set to be the one with the maximum ℓ₂ norm among all available filters. An exhaustive imaging simulation with every single filter serving as the first filter is conducted to investigate the features of the most competent filter set. From the simulation, the characteristics of the most competent filter set are discovered. Besides minimization of the condition number, the geometric features of the best-performing filter set comprise a distinct transmittance peak along the wavelength axis for the first filter, a generally uniform distribution of the peaks of the filters, and substantial overlaps of the transmittance curves of adjacent filters. Therefore, the best-performing filter sets can be recognized intuitively by simple vector analysis and just a few experimental verifications. A practical two-step framework for selecting the optimal filter set is recommended, which guarantees a significant enhancement of the performance of the systems. This work should be useful for optimizing the spectral sensitivity of broadband multispectral imaging sensors.
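An MLI-style greedy selection can be prototyped with ordinary linear algebra: pick a first filter (the maximum ℓ₂-norm one in the traditional variant), then repeatedly add the filter with the largest component orthogonal to the span of those already chosen. This is a simplified sketch of the vector-analysis idea, not the paper's full imaging-simulation framework, and the 4×3 transmittance matrix is invented:

```python
import numpy as np

def select_filters_mli(T, k, first=None):
    # T: (n_filters, n_wavelengths) transmittance matrix.
    # Traditional MLI start: the max-l2-norm filter; each later pick
    # maximizes the residual norm after projecting onto the span of
    # the filters chosen so far (maximum linear independence).
    chosen = [int(np.argmax(np.linalg.norm(T, axis=1))) if first is None
              else first]
    for _ in range(k - 1):
        Q, _ = np.linalg.qr(T[chosen].T)        # orthonormal basis of chosen set
        resid = T.T - Q @ (Q.T @ T.T)           # orthogonal residual, all filters
        scores = np.linalg.norm(resid, axis=0)
        scores[chosen] = -1.0                   # never re-pick a chosen filter
        chosen.append(int(np.argmax(scores)))
    return chosen

# Toy transmittances: filters 0 and 1 nearly parallel, 2 and 3 independent
T = np.array([[2.0, 0.0, 0.0],
              [1.9, 0.1, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
picked = select_filters_mli(T, 3)   # skips the redundant filter 1
```

Passing `first=` to try every candidate as the seed mirrors the paper's exhaustive simulation over first filters.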

  3. Filter Selection for Optimizing the Spectral Sensitivity of Broadband Multispectral Cameras Based on Maximum Linear Independence

    Directory of Open Access Journals (Sweden)

    Sui-Xian Li

    2018-05-01

    Full Text Available Previous research has shown the effectiveness of selecting filter sets from among a large set of commercial broadband filters by a vector analysis method based on maximum linear independence (MLI). However, the traditional MLI approach is suboptimal because it must predefine the first filter of the selected set to be the one with the maximum ℓ2 norm among all available filters. An exhaustive imaging simulation with every single filter serving as the first filter is conducted to investigate the features of the most competent filter set. From the simulation, the characteristics of the most competent filter set are discovered. Besides minimization of the condition number, the geometric features of the best-performing filter set comprise a distinct transmittance peak along the wavelength axis for the first filter, a generally uniform distribution of the peaks of the filters, and substantial overlaps of the transmittance curves of adjacent filters. Therefore, the best-performing filter sets can be recognized intuitively by simple vector analysis and just a few experimental verifications. A practical two-step framework for selecting the optimal filter set is recommended, which guarantees a significant enhancement of the performance of the systems. This work should be useful for optimizing the spectral sensitivity of broadband multispectral imaging sensors.

  4. Bowtie filter and water calibration in the improvement of cone beam CT image quality

    International Nuclear Information System (INIS)

    Li Minghui; Dai Jianrong; Zhang Ke

    2010-01-01

    Objective: To evaluate the improvement of cone beam CT (CBCT) image quality by using a bowtie filter (F1) and water calibration. Methods: First, multi-level gain calibration of the detector panel with the Cal2 calibration method was performed, and CT images of CATPHAN503 with the F0 and bowtie filters were collected, respectively. Then the detector panel was calibrated using the water calibration kit, and images were acquired again. Finally, the change of image quality after using F1 and (or) the water calibration method was observed. The observed indexes included low contrast visibility, spatial uniformity, ring artifact, spatial resolution and geometric accuracy. Results: Compared with the traditional combination of the F0 filter and Cal2 calibration, the combination of the bowtie filter F1 and water calibration improves low contrast visibility by 13.71% and spatial uniformity by 54.42%. Water calibration removes ring artifacts effectively. However, neither of them improves spatial resolution or geometric accuracy. Conclusions: The combination of F1 and water calibration improves CBCT image quality effectively. This improvement aids the registration of CBCT images and localization images. (authors)

  5. Quality evaluation of no-reference MR images using multidirectional filters and image statistics.

    Science.gov (United States)

    Jang, Jinseong; Bang, Kihun; Jang, Hanbyol; Hwang, Dosik

    2018-09-01

    This study aimed to develop a fully automatic, no-reference image-quality assessment (IQA) method for MR images. New quality-aware features were obtained by applying multidirectional filters to MR images and examining the feature statistics. A histogram of these features was then fitted to a generalized Gaussian distribution function, for which the shape parameters yielded different values depending on the type of distortion in the MR image. Standard feature statistics were established through a training process based on high-quality MR images without distortion. Subsequently, the feature statistics of a test MR image were calculated and compared with the standards. The quality score was calculated as the difference between the shape parameters of the test image and the undistorted standard images. The proposed IQA method showed a >0.99 correlation with the conventional full-reference assessment methods; accordingly, this proposed method yielded the best performance among no-reference IQA methods for images containing six types of synthetic, MR-specific distortions. In addition, for authentically distorted images, the proposed method yielded the highest correlation with subjective assessments by human observers, thus demonstrating its superior performance over other no-reference IQAs. Our proposed IQA was designed to consider MR-specific features and outperformed other no-reference IQAs designed mainly for photographic images. Magn Reson Med 80:914-924, 2018. © 2018 International Society for Magnetic Resonance in Medicine.
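The core of such an approach (fitting a generalized Gaussian to feature histograms and comparing shape parameters) can be sketched with a standard moment-matching estimator. The bisection solver below uses the classical ratio identity (E|x|)² / E[x²] = Γ(2/β)² / (Γ(1/β)·Γ(3/β)) and is a generic stand-in, not the authors' fitting code:

```python
import math
import numpy as np

def ggd_shape(x, lo=0.1, hi=10.0, iters=60):
    # Moment-matching estimate of the generalized-Gaussian shape
    # parameter: solve ratio(b) = (E|x|)^2 / E[x^2] by bisection.
    x = np.asarray(x, dtype=float)
    target = np.mean(np.abs(x)) ** 2 / np.mean(x ** 2)

    def ratio(b):
        return math.gamma(2.0 / b) ** 2 / (math.gamma(1.0 / b)
                                           * math.gamma(3.0 / b))

    for _ in range(iters):          # ratio(b) increases monotonically in b
        mid = 0.5 * (lo + hi)
        if ratio(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

rng = np.random.default_rng(1)
beta_hat = ggd_shape(rng.normal(size=200_000))   # Gaussian data -> shape near 2
```

Laplacian-like (heavy-tailed) feature responses push the estimate toward 1 and flatter ones toward larger values, which is the property the quality score exploits.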

  6. Digital filtering and reconstruction of coded aperture images

    International Nuclear Information System (INIS)

    Tobin, K.W. Jr.

    1987-01-01

    The real-time neutron radiography facility at the University of Virginia has been used for both transmission radiography and computed tomography. Recently, a coded aperture system has been developed to permit the extraction of three-dimensional information from a low-intensity field of radiation scattered by an extended object. Short-wavelength radiations (e.g. neutrons) are not easily imaged because of the difficulties in achieving diffraction and refraction with a conventional lens imaging system. By using a coded aperture approach, an imaging system has been developed that records and reconstructs an object from an intensity distribution. This system has a signal-to-noise ratio that is proportional to the total open area of the aperture, making it ideal for imaging with a limited-intensity radiation field. The main goal of this research was to develop and implement the digital methods and theory necessary for the reconstruction process. Several real-time video systems, attached to an Intellect-100 image processor, a DEC PDP-11 micro-computer, and a Convex-1 parallel processing mainframe, were employed. This system, coupled with theoretical extensions and improvements, allowed for retrieval of information previously unobtainable by earlier optical methods. The effects of thermal noise, shot noise, and aperture-related artifacts were examined so that new digital filtering techniques could be constructed and implemented. Results of image data filtering prior to and following the reconstruction process are reported. Improvements related to the different signal processing methods are emphasized. The application and advantages of this imaging technique to the field of non-destructive testing are also discussed.
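The encode/decode cycle behind coded-aperture imaging can be demonstrated with cyclic convolution: the recorded intensity is the object convolved with the aperture pattern, and in this idealized noise-free sketch, inverse filtering in the Fourier domain recovers it exactly. Practical decoders correlate with a matched decoding array rather than divide (division amplifies noise and fails where the aperture spectrum has zeros); the 1-D aperture and object here are invented:

```python
import numpy as np

def encode(obj, aperture):
    # Recorded intensity: cyclic convolution of object and aperture
    # (idealized, noise-free model of the shadowgram).
    return np.real(np.fft.ifft(np.fft.fft(obj) * np.fft.fft(aperture)))

def decode(recorded, aperture):
    # Inverse filtering in the Fourier domain -- valid only when the
    # aperture spectrum has no zeros; a stand-in for correlation-based
    # decoding with a matched array.
    return np.real(np.fft.ifft(np.fft.fft(recorded) / np.fft.fft(aperture)))

aperture = np.array([1.0, 1.0, 0.0, 1.0])   # toy 1-D coded aperture
obj = np.array([0.0, 5.0, 0.0, 2.0])        # toy 1-D object
rec = encode(obj, aperture)                  # what the detector records
recovered = decode(rec, aperture)            # matches obj exactly here
```

The digital filtering studied in the paper acts on `rec` (and on the reconstruction) to suppress the thermal-noise, shot-noise, and aperture-artifact terms that this noise-free sketch omits.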

  7. Filtering for distributed mechanical systems using position measurements: perspectives in medical imaging

    International Nuclear Information System (INIS)

    Moireau, Philippe; Chapelle, Dominique; Tallec, Patrick Le

    2009-01-01

    We propose an effective filtering methodology designed to perform estimation in a distributed mechanical system using position measurements. As in a previously introduced method, the filter is inspired by robust control feedback, but here we take full advantage of the estimation specificity to choose a feedback law that can act on displacements instead of velocities and still retain the same kind of dissipativity property which guarantees robustness. This is very valuable in many applications for which positions are more readily available than velocities, as in medical imaging. We provide an in-depth analysis of the proposed procedure, as well as detailed numerical assessments using a test problem inspired by cardiac biomechanics, as medical diagnosis assistance is an important perspective for this approach. The method is formulated first for measurements based on Lagrangian displacements, but we then derive a nonlinear extension allowing us to instead consider segmented images, which of course is even more relevant in medical applications

  8. The HURRA filter: An easy method to eliminate collimator artifacts in high-energy gamma camera images.

    Science.gov (United States)

    Perez-Garcia, H; Barquero, R

    The correct determination and delineation of tumor/organ size is crucial in 2-D imaging in 131I therapy. These images are usually obtained using a system composed of a gamma camera and a high-energy collimator, although the system can produce artifacts in the image. This article analyses these artifacts and describes a correction filter that can eliminate those collimator artifacts. Using free software, ImageJ, a central profile in the image is obtained and analyzed. Two components can be seen in the fluctuation of the profile: one associated with the stochastic nature of the radiation plus electronic noise, and the other periodic across position in space due to the collimator. These frequencies are analytically obtained and compared with the frequencies in the Fourier transform of the profile. A specially developed filter removes the artifacts in the 2D Fourier transform of the DICOM image. This filter is tested using an image of a 15-cm-diameter Petri dish with 131I radioactive water (large object size), an image of a 131I clinical pill (small object size), and images of the remaining lesions of two patients treated with 3.7 GBq (100 mCi) and 4.44 GBq (120 mCi) of 131I, respectively, after thyroidectomy. The artifact is due to the hexagonal periodic structure of the collimator. The use of the filter on large-sized images reduces the fluctuation from 5.8% to 3.5%. In small-sized images, the FWHM can be determined in the filtered image, while this is impossible in the unfiltered image. The definition of the tumor boundary and the visualization of the activity distribution inside patient lesions improve drastically when the filter is applied to the corresponding images obtained with the HE gamma camera. The HURRA filter removes high-energy collimator artifacts in planar images obtained with a gamma camera without reducing the image resolution. It can be applied in any patient quantification study because the number of counts remains invariant.
The filter makes
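The artifact-removal idea described above (suppressing the collimator's periodic peaks in the 2-D Fourier transform while leaving the rest of the spectrum, including the DC term, untouched so that total counts are preserved) can be sketched as follows. The single sinusoidal "collimator" pattern and the notch positions are a toy stand-in for the hexagonal pattern analysed in the paper:

```python
import numpy as np

def notch_filter(img, notch_centers, radius=1):
    # Zero small neighbourhoods around the periodic-pattern peaks in
    # the centred 2-D spectrum; the DC term is never touched, so the
    # total number of counts is preserved.
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    for cy, cx in notch_centers:
        F[(yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2] = 0.0
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

# Toy image: flat activity plus one sinusoidal "collimator" pattern
h = w = 64
y, x = np.mgrid[0:h, 0:w]
img = 100.0 + 10.0 * np.cos(2 * np.pi * 8 * x / w)   # period of 8 pixels
# the cosine appears at (h//2, w//2 +/- 8) in the centred spectrum
filtered = notch_filter(img, [(h // 2, w // 2 - 8), (h // 2, w // 2 + 8)])
```

After notching, the flat 100-count background is recovered and the image total is unchanged, mirroring the count-invariance property claimed for the HURRA filter.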

  9. Adaptive iterated function systems filter for images highly corrupted with fixed - Value impulse noise

    Science.gov (United States)

    Shanmugavadivu, P.; Eliahim Jeevaraj, P. S.

    2014-06-01

    The Adaptive Iterated Function Systems (AIFS) filter presented in this paper has an outstanding potential to attenuate fixed-value impulse noise in images. This filter has two distinct phases, namely noise detection and noise correction, which use Measure of Statistics and Iterated Function Systems (IFS), respectively. The performance of the AIFS filter is assessed by three metrics, namely Peak Signal-to-Noise Ratio (PSNR), Mean Structural Similarity Index Matrix (MSSIM) and Human Visual Perception (HVP). The quantitative measures PSNR and MSSIM endorse the merit of this filter in terms of degree of noise suppression and details/edge preservation respectively, in comparison with the high-performing filters reported in the recent literature. The qualitative measure HVP confirms the noise suppression ability of the devised filter. This computationally simple noise filter broadly finds application wherein images are highly degraded by fixed-value impulse noise.
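Of the three metrics, PSNR is fully determined by a formula and easy to reproduce; a minimal implementation follows (the 255 peak value assumes 8-bit images, and the toy reference/noisy pair is invented for illustration):

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    # Peak signal-to-noise ratio in dB between a clean reference image
    # and a degraded or filtered test image.
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    return float('inf') if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

ref = np.zeros((8, 8))
noisy = ref.copy(); noisy[0, 0] = 255.0   # a single fixed-value impulse
value = psnr(ref, noisy)
```

Higher PSNR after filtering indicates stronger noise suppression, while MSSIM (not shown) additionally scores structural/edge preservation.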

  10. Application of digital tomosynthesis (DTS) of optimal deblurring filters for dental X-ray imaging

    International Nuclear Information System (INIS)

    Oh, J. E.; Cho, H. S.; Kim, D. S.; Choi, S. I.; Je, U. K.

    2012-01-01

    Digital tomosynthesis (DTS) is a limited-angle tomographic technique that provides some of the tomographic benefits of computed tomography (CT) but at reduced dose and cost. Thus, the potential for application of DTS to dental X-ray imaging seems promising. As a continuation of our dental radiography R and D, we developed an effective DTS reconstruction algorithm and implemented it in conjunction with a commercial dental CT system for potential use in dental implant placement. The reconstruction algorithm employed a backprojection filtering (BPF) method based upon optimal deblurring filters to suppress effectively both the blur artifacts originating from the out-focus planes and the high-frequency noise. To verify the usefulness of the reconstruction algorithm, we performed systematic simulation works and evaluated the image characteristics. We also performed experimental works in which DTS images of enhanced anatomical resolution were successfully obtained by using the algorithm and were promising to our ongoing applications to dental X-ray imaging. In this paper, our approach to the development of the DTS reconstruction algorithm and the results are described in detail.

  11. Design and application of finite impulse response digital filters

    International Nuclear Information System (INIS)

    Miller, T.R.; Sampathkumaran, K.S.

    1982-01-01

    The finite impulse response (FIR) digital filter is a spatial domain filter with a frequency domain representation. The theory of the FIR filter is presented and techniques are described for designing FIR filters with known frequency response characteristics. Rational design principles are emphasized, based on characterization of the imaging system using the modulation transfer function and physical properties of the imaged objects. Bandpass, Wiener, and low-pass filters were designed and applied to 201Tl myocardial images. The bandpass filter eliminates low-frequency image components that represent background activity and high-frequency components due to noise. The Wiener, or minimum mean square error, filter 'sharpens' the image while also reducing noise. The Wiener filter illustrates the power of the FIR technique to design filters with any desired frequency response. The low-pass filter, while of relatively limited use, is presented to compare it with a popular elementary 'smoothing' filter. (orig.)
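The spatial-domain/frequency-domain duality that FIR design exploits can be illustrated with a classic windowed-sinc low-pass (cutoff in cycles/sample); bandpass designs of the kind used in the paper follow by modulating or subtracting such kernels. This is a generic textbook design, not the authors' 201Tl filters:

```python
import numpy as np

def fir_lowpass(cutoff, ntaps):
    # Windowed-sinc FIR low-pass, 0 < cutoff < 0.5 cycles/sample:
    # truncate the ideal impulse response, window it to tame ringing,
    # and normalize for unit gain at DC.
    n = np.arange(ntaps) - (ntaps - 1) / 2.0
    h = 2 * cutoff * np.sinc(2 * cutoff * n)   # ideal low-pass response
    h *= np.hamming(ntaps)                      # Hamming window
    return h / h.sum()                          # unit DC gain

h = fir_lowpass(0.2, 21)                        # 21-tap kernel
H = np.abs(np.fft.rfft(h, 512))                 # realized frequency response
```

Inspecting `H` against the design target (near 1 in the passband, near 0 well above cutoff) is exactly the kind of frequency-response check the rational design principles call for.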

  12. Pornographic image recognition and filtering using incremental learning in compressed domain

    Science.gov (United States)

    Zhang, Jing; Wang, Chao; Zhuo, Li; Geng, Wenhao

    2015-11-01

    With the rapid development and popularity of the network, the openness, anonymity, and interactivity of networks have led to the spread and proliferation of pornographic images on the Internet, which have done great harm to adolescents' physical and mental health. With the establishment of image compression standards, pornographic images are mainly stored with compressed formats. Therefore, how to efficiently filter pornographic images is one of the challenging issues for information security. A pornographic image recognition and filtering method in the compressed domain is proposed by using incremental learning, which includes the following steps: (1) low-resolution (LR) images are first reconstructed from the compressed stream of pornographic images, (2) visual words are created from the LR image to represent the pornographic image, and (3) incremental learning is adopted to continuously adjust the classification rules to recognize the new pornographic image samples after the covering algorithm is utilized to train and recognize the visual words in order to build the initial classification model of pornographic images. The experimental results show that the proposed pornographic image recognition method using incremental learning has a higher recognition rate as well as costing less recognition time in the compressed domain.

  13. Blind Source Parameters for Performance Evaluation of Despeckling Filters

    Directory of Open Access Journals (Sweden)

    Nagashettappa Biradar

    2016-01-01

    Full Text Available The speckle noise is inherent to transthoracic echocardiographic images. A standard noise-free reference echocardiographic image does not exist. The evaluation of filters based on traditional parameters such as peak signal-to-noise ratio, mean square error, and structural similarity index may not reflect the true filter performance on echocardiographic images. Therefore, the performance of despeckling can be evaluated using blind assessment metrics like the speckle suppression index, the speckle suppression and mean preservation index (SMPI), and the beta metric. The need for a noise-free reference image is overcome using these three parameters. This paper presents a comprehensive analysis and evaluation of eleven types of despeckling filters for echocardiographic images in terms of blind and traditional performance parameters along with clinical validation. The noise is effectively suppressed using the logarithmic neighborhood shrinkage (NeighShrink) embedded with Stein’s unbiased risk estimation (SURE). The SMPI is three times more effective compared to the wavelet based generalized likelihood estimation approach. The quantitative evaluation and clinical validation reveal that filters such as the nonlocal mean, posterior sampling based Bayesian estimation, hybrid median, and probabilistic patch based filters are acceptable, whereas the median, anisotropic diffusion, fuzzy, and Ripplet nonlinear approximation filters have limited applications for echocardiographic images.
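The speckle suppression index is the simplest of the blind metrics: the ratio of the coefficient of variation after filtering to that before, needing no clean reference. A toy check with gamma-distributed (speckle-like) noise and a crude 4×4 block-mean filter, both invented for illustration:

```python
import numpy as np

def ssi(noisy, filtered):
    # Speckle suppression index: ratio of coefficients of variation
    # (std/mean). Values below 1 indicate speckle suppression; no
    # noise-free reference image is required.
    cv = lambda img: np.std(img) / np.mean(img)
    return cv(filtered) / cv(noisy)

rng = np.random.default_rng(0)
noisy = 100.0 * rng.gamma(shape=4.0, scale=0.25, size=(128, 128))
smoothed = noisy.reshape(32, 4, 32, 4).mean(axis=(1, 3))  # crude 4x4 mean
score = ssi(noisy, smoothed)                               # well below 1
```

A real evaluation would pair this with the mean-preservation term (SMPI) and the beta (edge-preservation) metric, since aggressive smoothing trivially lowers SSI while destroying detail.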

  14. Blind Source Parameters for Performance Evaluation of Despeckling Filters.

    Science.gov (United States)

    Biradar, Nagashettappa; Dewal, M L; Rohit, ManojKumar; Gowre, Sanjaykumar; Gundge, Yogesh

    2016-01-01

    The speckle noise is inherent to transthoracic echocardiographic images. A standard noise-free reference echocardiographic image does not exist. The evaluation of filters based on the traditional parameters such as peak signal-to-noise ratio, mean square error, and structural similarity index may not reflect the true filter performance on echocardiographic images. Therefore, the performance of despeckling can be evaluated using blind assessment metrics like the speckle suppression index, speckle suppression and mean preservation index (SMPI), and beta metric. The need for noise-free reference image is overcome using these three parameters. This paper presents a comprehensive analysis and evaluation of eleven types of despeckling filters for echocardiographic images in terms of blind and traditional performance parameters along with clinical validation. The noise is effectively suppressed using the logarithmic neighborhood shrinkage (NeighShrink) embedded with Stein's unbiased risk estimation (SURE). The SMPI is three times more effective compared to the wavelet based generalized likelihood estimation approach. The quantitative evaluation and clinical validation reveal that the filters such as the nonlocal mean, posterior sampling based Bayesian estimation, hybrid median, and probabilistic patch based filters are acceptable whereas median, anisotropic diffusion, fuzzy, and Ripplet nonlinear approximation filters have limited applications for echocardiographic images.

  15. Performance tuning for CUDA-accelerated neighborhood denoising filters

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, Ziyi; Mueller, Klaus [Stony Brook Univ., NY (United States). Center for Visual Computing, Computer Science; Xu, Wei

    2011-07-01

    Neighborhood denoising filters are powerful techniques in image processing and can effectively enhance image quality in CT reconstructions. In this study, taking the bilateral filter and the non-local mean filter as two examples, we discuss their implementations and perform fine-tuning on the targeted GPU architecture. Experimental results show that the straightforward GPU-based neighborhood filters can be further accelerated by pre-fetching. The optimized GPU-accelerated denoising filters are ready to plug into a reconstruction framework to enable fast denoising without compromising image quality. (orig.)
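For reference, the bilateral filter being accelerated has a straightforward (if slow) CPU form; a CUDA kernel parallelizes the per-pixel loop below, and pre-fetching corresponds to caching each pixel neighbourhood in fast shared memory. Parameter values here are illustrative, not those tuned in the study:

```python
import numpy as np

def bilateral(img, radius=2, sigma_s=1.5, sigma_r=10.0):
    # Each output pixel is a normalized average of its neighbours,
    # weighted by spatial AND intensity proximity: flat regions are
    # smoothed while strong edges (large intensity jumps) survive.
    h, w = img.shape
    pad = np.pad(img.astype(float), radius, mode='reflect')
    out = np.zeros((h, w))
    dy, dx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    w_s = np.exp(-(dy ** 2 + dx ** 2) / (2 * sigma_s ** 2))  # spatial term
    for y in range(h):
        for x in range(w):
            patch = pad[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            w_r = np.exp(-(patch - img[y, x]) ** 2 / (2 * sigma_r ** 2))
            wgt = w_s * w_r                                   # range term
            out[y, x] = (wgt * patch).sum() / wgt.sum()
    return out

step = np.zeros((16, 16)); step[:, 8:] = 100.0       # sharp edge
flat_noise = step + np.random.default_rng(0).normal(0, 2, step.shape)
smoothed = bilateral(flat_noise)                      # noise down, edge kept
```

The double loop makes each output pixel independent of the others, which is exactly what makes the filter embarrassingly parallel on a GPU.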

  16. Image enhancement filters significantly improve reading performance for low vision observers

    Science.gov (United States)

    Lawton, T. B.

    1992-01-01

    As people age, so do their photoreceptors; many photoreceptors in central vision stop functioning when a person reaches their late sixties or early seventies. Low vision observers with losses in central vision, those with age-related maculopathies, were studied. Low vision observers no longer see high spatial frequencies, being unable to resolve fine edge detail. We developed image enhancement filters to compensate for the low vision observer's losses in contrast sensitivity to intermediate and high spatial frequencies. The filters work by boosting the amplitude of the less visible intermediate spatial frequencies relative to the lower spatial frequencies. These image enhancement filters not only reduce the magnification needed for reading by up to 70 percent, but they also increase the observer's reading speed by 2-4 times. A summary of this research is presented.

  17. A SAR IMAGE REGISTRATION METHOD BASED ON SIFT ALGORITHM

    Directory of Open Access Journals (Sweden)

    W. Lu

    2017-09-01

    Full Text Available In order to improve the stability and speed of synthetic aperture radar (SAR) image matching, an effective method was presented. Firstly, adaptive smoothing based on Wallis filtering was employed for image denoising, so that noise would not be amplified in the subsequent processing. Secondly, feature points were extracted by a simplified SIFT algorithm. Finally, exact matching of the images was achieved with these points. Compared with existing methods, it not only maintains the richness of features, but also reduces the noise of the image. The simulation results show that the proposed algorithm can achieve a better matching effect.

  18. Half-Fan-Based Intensity-Weighted Region-of-Interest Imaging for Low-Dose Cone-Beam CT in Image-Guided Radiation Therapy.

    Science.gov (United States)

    Yoo, Boyeol; Son, Kihong; Pua, Rizza; Kim, Jinsung; Solodov, Alexander; Cho, Seungryong

    2016-10-01

    With the increased use of computed tomography (CT) in clinics, dose reduction is the most important feature people seek when considering new CT techniques or applications. We developed an intensity-weighted region-of-interest (IWROI) imaging method in an exact half-fan geometry to reduce the imaging radiation dose to patients in cone-beam CT (CBCT) for image-guided radiation therapy (IGRT). While dose reduction is highly desirable, preserving the high-quality images of the ROI is also important for target localization in IGRT. An intensity-weighting (IW) filter made of copper was mounted in place of a bowtie filter on the X-ray tube unit of an on-board imager (OBI) system such that the filter can substantially reduce radiation exposure to the outer ROI. In addition to mounting the IW filter, the lead-blade collimation of the OBI was adjusted to produce an exact half-fan scanning geometry for a further reduction of the radiation dose. The chord-based rebinned backprojection-filtration (BPF) algorithm in circular CBCT was implemented for image reconstruction, and a humanoid pelvis phantom was used for the IWROI imaging experiment. The IWROI image of the phantom was successfully reconstructed after beam-quality correction, and it was registered to the reference image within an acceptable level of tolerance. Dosimetric measurements revealed that the dose is reduced by approximately 61% in the inner ROI and by 73% in the outer ROI compared to the conventional bowtie filter-based half-fan scan. The IWROI method substantially reduces the imaging radiation dose and provides reconstructed images with an acceptable level of quality for patient setup and target localization. The proposed half-fan-based IWROI imaging technique can add a valuable option to CBCT in IGRT applications.

  19. Design, control, and implementation of LCL-filter-based shunt active power filters

    DEFF Research Database (Denmark)

    Tang, Yi; Loh, Poh Chiang; Wang, Peng

    2011-01-01

This paper concentrates on the design, control, and implementation of an LCL-filter-based shunt active power filter (SAPF), which can effectively compensate harmonic currents produced by nonlinear loads in a three-phase three-wire power system. The use of an LCL filter at the output end of the SAPF offers …-loop control system, and active damping implemented with fewer current sensors are all addressed here. An analytical design example is finally presented, supported with experimental results, to verify its effectiveness and practicality.

  20. Relationship between pre-reconstruction filter and accuracy of registration software based on mutual-information maximization. A study of SPECT-MR brain phantom images

    International Nuclear Information System (INIS)

Mito, Suzuko; Magota, Keiichi; Arai, Hiroshi; Omote, Hidehiko; Katsuura, Hidenori; Suzuki, Kotaro; Kubo, Naoki

    2005-01-01

    Image registration is becoming an increasingly important tool in SPECT. Recently, software based on mutual-information maximization has been developed for automatic multimodality image registration. The accuracy of this software is important for its application to image registration. During SPECT reconstruction, the projection data are pre-filtered in order to reduce Poisson noise, commonly using a Butterworth filter. We have investigated the dependence of the absolute accuracy of MRI-SPECT registration on the cut-off frequencies of a range of Butterworth filters. This study used a 3D Hoffman phantom (Model No. 9000, Data Spectrum Co.). For the reference volume, a magnetization-prepared rapid gradient echo (MPRAGE) sequence was performed on a Vision MRI scanner (Siemens, 1.5 T). For the floating volumes, SPECT data of a phantom containing 99mTc at 85 kBq/mL were acquired with a GCA-9300 (Toshiba Medical Systems Co.). During SPECT, the orbito-meatal (OM) line of the phantom was tilted by 5 deg and 15 deg to mimic the incline of a patient's head. The projection data were pre-filtered with Butterworth filters (cut-off frequency varying from 0.24 to 0.94 cycles/cm in steps of 0.02, order 8). The automated registrations were performed using the iNRT β version software (Nihon Medi. Co.), and the rotation angles of SPECT for registration were noted. In this study, the registrations of all SPECT data were successful. Graphs of registration rotation angle against cut-off frequency were scattered and showed no correlation between the two. With changing cut-off frequency, the registration rotation angles ranged from -0.4 deg to +3.8 deg at a 5 deg tilt and from +12.7 deg to +19.6 deg at a 15 deg tilt. The registration rotation angles varied even for slight differences in cut-off frequency. The absolute errors were a few degrees for any cut-off frequency. Regardless of the cut-off frequency, automatic registration using this software provides similar results. 
(author)

  1. Signal-to-noise ratio enhancement on SEM images using a cubic spline interpolation with Savitzky-Golay filters and weighted least squares error.

    Science.gov (United States)

    Kiani, M A; Sim, K S; Nia, M E; Tso, C P

    2015-05-01

    A new technique based on cubic spline interpolation with Savitzky-Golay smoothing using a weighted least squares error filter is developed for scanning electron microscope (SEM) images. A diversity of sample images is captured, and the performance is found to be better than that of the moving average and standard median filters with respect to noise elimination. This technique can be implemented efficiently on real-time SEM images, with all data required for processing obtained from a single image. Noise in images, and particularly in SEM images, is undesirable. A new noise reduction technique, based on cubic spline interpolation with the Savitzky-Golay and weighted least squares error method, is developed. We apply the combined technique to single-image signal-to-noise ratio estimation and noise reduction for an SEM imaging system. This autocorrelation-based technique requires image details to be correlated over a few pixels, whereas the noise is assumed to be uncorrelated from pixel to pixel. The noise component is derived from the difference between the image autocorrelation at zero offset and the estimate of the corresponding original autocorrelation. In the test cases involving different images, the efficiency of the developed noise reduction filter proved to be significantly better than that of the other methods. Noise can be reduced efficiently from real-time SEM images with an appropriate choice of scan rate, without generating corruption or increasing scanning time. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.
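The autocorrelation-based SNR estimate and the Savitzky-Golay smoothing step described in this record can be sketched in a 1D toy setting. The window length, polynomial order, and synthetic "scan line" below are illustrative assumptions, not the paper's values:

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(6)
# A smooth 1D "scan line" plus white noise (stand-in for one SEM image row).
x = np.linspace(0, 4 * np.pi, 500)
clean = np.sin(x) * 50 + 100
noisy = clean + rng.normal(0, 5, x.size)

# Savitzky-Golay smoothing as the noise-reduction step.
smoothed = savgol_filter(noisy, window_length=31, polyorder=3)

def snr_estimate(sig):
    """Autocorrelation-based SNR sketch: noise power is the drop between the
    zero-offset autocorrelation and the lag-1 value, since white noise is
    uncorrelated from pixel to pixel while image detail is not."""
    d = sig - sig.mean()
    r0 = np.mean(d * d)                 # total power (signal + noise)
    r1 = np.mean(d[:-1] * d[1:])        # ~signal power (noise decorrelates)
    noise_power = max(r0 - r1, 1e-12)
    return r1 / noise_power

print(snr_estimate(noisy), snr_estimate(smoothed))
```

The smoothed signal shows a far higher estimated SNR, matching the qualitative claim of the abstract.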

  2. Segmentation of dermatoscopic images by frequency domain filtering and k-means clustering algorithms.

    Science.gov (United States)

    Rajab, Maher I

    2011-11-01

    Since the introduction of epiluminescence microscopy (ELM), image analysis tools have been extended to the field of dermatology, in an attempt to algorithmically reproduce clinical evaluation. Accurate image segmentation of skin lesions is one of the key steps for useful, early and non-invasive diagnosis of cutaneous melanomas. This paper proposes two image segmentation algorithms based on frequency domain processing and k-means clustering/fuzzy k-means clustering. The two methods are capable of segmenting and extracting the true border that reveals the global structure irregularity (indentations and protrusions), which may suggest excessive cell growth or regression of a melanoma. As a pre-processing step, Fourier low-pass filtering is applied to reduce the surrounding noise in a skin lesion image. A quantitative comparison of the techniques is enabled by the use of synthetic skin lesion images that model lesions covered with hair to which Gaussian noise is added. The proposed techniques are also compared with an established optimal thresholding skin-segmentation method. It is demonstrated that for lesions with a range of different border irregularity properties, the k-means clustering and fuzzy k-means clustering segmentation methods provide the best performance over a range of signal to noise ratios. The proposed segmentation techniques are also demonstrated to have similar performance when tested on real skin lesions representing high-resolution ELM images. This study suggests that the segmentation results obtained using a combination of low-pass frequency filtering and k-means or fuzzy k-means clustering are superior to the result that would be obtained by using k-means or fuzzy k-means clustering segmentation methods alone. © 2011 John Wiley & Sons A/S.
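The pipeline in this record — Fourier low-pass pre-filtering followed by k-means clustering — can be sketched on a synthetic "lesion" image. The ideal circular cutoff radius, image size, and two-class 1D k-means are illustrative assumptions:

```python
import numpy as np

def fourier_lowpass(img, radius):
    """Ideal circular low-pass filter in the Fourier domain (the pre-processing
    step; radius in frequency-pixel units is an assumed parameter)."""
    F = np.fft.fftshift(np.fft.fft2(img))
    rows, cols = img.shape
    y = np.arange(rows) - rows // 2
    x = np.arange(cols) - cols // 2
    mask = (y[:, None] ** 2 + x[None, :] ** 2) <= radius ** 2
    return np.fft.ifft2(np.fft.ifftshift(F * mask)).real

def kmeans_2class(values, iters=20):
    """Tiny 1D two-class k-means on pixel intensities (lesion vs. skin)."""
    c = np.array([values.min(), values.max()], dtype=float)
    for _ in range(iters):
        labels = np.abs(values[:, None] - c[None, :]).argmin(axis=1)
        for k in (0, 1):
            if np.any(labels == k):
                c[k] = values[labels == k].mean()
    return labels, c

rng = np.random.default_rng(4)
img = np.full((40, 40), 200.0)
img[10:30, 10:30] = 80.0                       # dark synthetic "lesion"
img += rng.normal(0, 10, img.shape)            # additive Gaussian noise
smooth = fourier_lowpass(img, radius=10)
labels, centers = kmeans_2class(smooth.ravel())
seg = labels.reshape(img.shape)
print(sorted(centers))
```

The two cluster centers straddle the lesion/skin intensities, and the lesion interior is assigned a different label from the background.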

  3. Median Filter Noise Reduction of Image and Backpropagation Neural Network Model for Cervical Cancer Classification

    Science.gov (United States)

    Wutsqa, D. U.; Marwah, M.

    2017-06-01

    In this paper, we apply a spatial median filter to reduce noise in cervical images produced by a colposcope. The backpropagation neural network (BPNN) model is applied to the colposcopy images to classify cervical cancer. The classification process requires image feature extraction using the gray-level co-occurrence matrix (GLCM) method; the resulting features are used as inputs to the BPNN model. The benefit of noise reduction is evaluated by comparing the performance of BPNN models with and without the spatial median filter. The experimental results show that the spatial median filter can improve the accuracy of the BPNN model for cervical cancer classification.
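The two pre-classification steps named in this record — median filtering and GLCM feature extraction — can be sketched together. The offset (0, 1), 8-level quantization, and the two Haralick features shown are common illustrative choices, not necessarily the paper's:

```python
import numpy as np
from scipy.ndimage import median_filter

def glcm_features(img, levels=8):
    """Gray-level co-occurrence matrix for the (0, 1) offset and two classic
    Haralick features (contrast, energy) -- a minimal feature-extraction sketch."""
    q = np.floor(img / (256 / levels)).astype(int).clip(0, levels - 1)
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1                 # count horizontal neighbor pairs
    p = glcm / glcm.sum()
    i, j = np.indices(p.shape)
    contrast = ((i - j) ** 2 * p).sum()
    energy = (p ** 2).sum()
    return contrast, energy

rng = np.random.default_rng(5)
noisy = rng.integers(0, 256, size=(64, 64)).astype(float)
denoised = median_filter(noisy, size=3)   # the spatial median-filter step

c_noisy, e_noisy = glcm_features(noisy)
c_denoised, e_denoised = glcm_features(denoised)
print(c_denoised < c_noisy)   # median filtering lowers GLCM texture contrast
```

The drop in GLCM contrast after filtering illustrates how denoising changes the feature values that feed the classifier.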

  4. Use of metameric filters for future interference security image structures

    Science.gov (United States)

    Baloukas, Bill; Larouche, Stéphane; Martinu, Ludvik

    2006-02-01

    In the present work, we describe innovative approaches and properties that can be added to the already popular thin film optically variable devices (OVD) used on banknotes. We show two practical examples of OVDs, namely (i) a pair of metameric filters offering a hidden image effect as a function of the angle of observation as well as a specific spectral property permitting automatic note readability, and (ii) multi-material filters offering a side-dependent color shift. We first describe the design approach of these new devices followed by their sensitivity to deposition errors especially in the case of the metameric filters where slight thickness variations have a significant effect on the obtained colors. The performance of prototype filters prepared by dual ion beam sputtering (DIBS) is shown.

  5. A reconstruction algorithm for coherent scatter computed tomography based on filtered back-projection

    International Nuclear Information System (INIS)

    Stevendaal, U. van; Schlomka, J.-P.; Harding, A.; Grass, M.

    2003-01-01

    Coherent scatter computed tomography (CSCT) is a reconstructive x-ray imaging technique that yields the spatially resolved coherent-scatter form factor of the investigated object. Reconstruction from coherently scattered x-rays is commonly done using algebraic reconstruction techniques (ART). In this paper, we propose an alternative approach based on filtered back-projection. For the first time, a three-dimensional (3D) filtered back-projection technique using curved 3D back-projection lines is applied to two-dimensional coherent scatter projection data. The proposed algorithm is tested with simulated projection data as well as with projection data acquired with a demonstrator setup similar to a multi-line CT scanner geometry. While yielding image quality comparable to ART reconstruction, the modified 3D filtered back-projection algorithm is about two orders of magnitude faster. In contrast to iterative reconstruction schemes, it has the advantage that subfield-of-view reconstruction becomes feasible. This allows a selective reconstruction of the coherent-scatter form factor for a region of interest. The proposed modified 3D filtered back-projection algorithm is a powerful reconstruction technique to be implemented in a CSCT scanning system. This method gives coherent scatter CT the potential of becoming a competitive modality for medical imaging or nondestructive testing
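The filtering step at the heart of any filtered back-projection scheme — including the modified 3D variant in this record — is a 1D ramp (Ram-Lak) filter applied to each projection in the frequency domain. A minimal sketch of just that step (the curved 3D back-projection itself is not reproduced here):

```python
import numpy as np

def ramp_filter(projection):
    """Apply the ramp (Ram-Lak) filter to a 1D projection in the frequency
    domain -- the 'filtered' part of filtered back-projection. The filtered
    projections would then be smeared back along (here, curved) back-projection
    lines and summed."""
    n = projection.shape[0]
    freqs = np.fft.fftfreq(n)
    return np.fft.ifft(np.fft.fft(projection) * np.abs(freqs)).real

# A flat projection contains only the zero frequency, which the ramp removes,
# so its filtered version is identically zero.
flat = np.ones(64)
print(np.allclose(ramp_filter(flat), 0))
```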

  6. Adaptive nonlocal means filtering based on local noise level for CT denoising

    International Nuclear Information System (INIS)

    Li, Zhoubo; Trzasko, Joshua D.; Lake, David S.; Blezek, Daniel J.; Manduca, Armando; Yu, Lifeng; Fletcher, Joel G.; McCollough, Cynthia H.

    2014-01-01

    Purpose: To develop and evaluate an image-domain noise reduction method based on a modified nonlocal means (NLM) algorithm that adapts to the local noise level of CT images, and to implement this method in a time frame consistent with clinical workflow. Methods: A computationally efficient technique for local noise estimation directly from CT images was developed. A forward projection, based on a 2D fan-beam approximation, was used to generate the projection data, with a noise model incorporating the effects of the bowtie filter and automatic exposure control. The noise propagation from projection data to images was analytically derived. The analytical noise map was validated using repeated scans of a phantom. A 3D NLM denoising algorithm was modified to adapt its denoising strength locally based on this noise map. The performance of this adaptive NLM filter was evaluated in phantom studies in terms of in-plane and cross-plane high-contrast spatial resolution, noise power spectrum (NPS), subjective low-contrast spatial resolution using the American College of Radiology (ACR) accreditation phantom, and objective low-contrast spatial resolution using a channelized Hotelling model observer (CHO). A graphics processing unit (GPU) implementation of the noise map calculation and the adaptive NLM filtering was developed to meet the demands of clinical workflow. Adaptive NLM was piloted on lower-dose scans in clinical practice. Results: The local noise level estimate matches the noise distribution determined from multiple repeated scans of a phantom, as demonstrated by the small variations in the ratio map between the analytical noise map and the one calculated from repeated scans. The phantom studies demonstrated that the adaptive NLM filter can reduce noise substantially without degrading the high-contrast spatial resolution, as illustrated by modulation transfer function and slice sensitivity profile results. The NPS results show that adaptive NLM denoising preserves the
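The core idea of this record is a spatially varying noise map that drives the local NLM strength. The paper derives its map analytically from forward-projected data; the sketch below uses a much simpler image-domain stand-in (local standard deviation of a high-pass residual) just to illustrate what such a map looks like. All sizes and sigmas are illustrative:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def local_noise_map(img, size=7):
    """Rough image-domain noise estimate: local std of the residual after
    Gaussian smoothing. A simple stand-in for the paper's projection-based
    analytical noise map; the NLM strength h(x) would then be scaled by it."""
    residual = img - gaussian_filter(img, sigma=1.5)
    m = uniform_filter(residual, size)
    m2 = uniform_filter(residual ** 2, size)
    return np.sqrt(np.maximum(m2 - m ** 2, 0))

rng = np.random.default_rng(10)
clean = np.zeros((64, 64))
# Spatially varying noise: stronger on the right half, as it would be behind
# thicker anatomy in a real CT slice.
sigma_map = np.where(np.arange(64)[None, :] < 32, 2.0, 10.0)
noisy = clean + rng.normal(0, 1, clean.shape) * sigma_map

est = local_noise_map(noisy)
print(est[:, :24].mean(), est[:, 40:].mean())
```

The estimated map correctly reports much higher noise on the right half, which is exactly the information an adaptive NLM filter needs.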

  7. Median filters as a tool to determine dark noise thresholds in high resolution smartphone image sensors for scientific imaging

    Science.gov (United States)

    Igoe, Damien P.; Parisi, Alfio V.; Amar, Abdurazaq; Rummenie, Katherine J.

    2018-01-01

    An evaluation of the use of median filters in the reduction of dark noise in smartphone high resolution image sensors is presented. The Sony Xperia Z1 employed has a maximum image sensor resolution of 20.7 Mpixels, with each pixel having a side length of just over 1 μm. Due to the large number of photosites, this provides an image sensor with very high sensitivity but also makes them prone to noise effects such as hot-pixels. Similar to earlier research with older models of smartphone, no appreciable temperature effects were observed in the overall average pixel values for images taken in ambient temperatures between 5 °C and 25 °C. In this research, hot-pixels are defined as pixels with intensities above a specific threshold. The threshold is determined using the distribution of pixel values of a set of images with uniform statistical properties associated with the application of median-filters of increasing size. An image with uniform statistics was employed as a training set from 124 dark images, and the threshold was determined to be 9 digital numbers (DN). The threshold remained constant for multiple resolutions and did not appreciably change even after a year of extensive field use and exposure to solar ultraviolet radiation. Although the temperature effects' uniformity masked an increase in hot-pixel occurrences, the total number of occurrences represented less than 0.1% of the total image. Hot-pixels were removed by applying a median filter, with an optimum filter size of 7 × 7; similar trends were observed for four additional smartphone image sensors used for validation. Hot-pixels were also reduced by decreasing image resolution. The method outlined in this research provides a methodology to characterise the dark noise behavior of high resolution image sensors for use in scientific investigations, especially as pixel sizes decrease.
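The hot-pixel scheme in this record can be sketched directly: threshold a dark frame at the reported 9 DN and replace flagged pixels with the 7 × 7 median, the paper's reported optimum. The synthetic dark frame and its noise parameters are assumptions for illustration:

```python
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(0)

# Synthetic dark frame: low-level read noise plus a few injected hot pixels.
dark = rng.normal(2.0, 0.5, size=(64, 64))
hot_coords = [(5, 7), (20, 33), (50, 11)]
for r, c in hot_coords:
    dark[r, c] = 40.0          # hot pixels sit far above the dark-noise floor

# Threshold in digital numbers (DN); the record reports 9 DN for this sensor.
THRESHOLD_DN = 9.0
hot_mask = dark > THRESHOLD_DN

# Replace hot pixels with the local median (7x7 was the reported optimum).
cleaned = dark.copy()
cleaned[hot_mask] = median_filter(dark, size=7)[hot_mask]

print(hot_mask.sum())                    # number of detected hot pixels
print(float(cleaned.max()) < THRESHOLD_DN)
```

Only the flagged pixels are altered, so the dark-noise statistics of the rest of the frame are preserved.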

  8. Adaptive wiener filter based on Gaussian mixture distribution model for denoising chest X-ray CT image

    International Nuclear Information System (INIS)

    Tabuchi, Motohiro; Yamane, Nobumoto; Morikawa, Yoshitaka

    2008-01-01

    In recent decades, X-ray CT imaging has become more important as a result of its high-resolution performance. However, it is well known that the X-ray dose is insufficient in techniques that use low-dose imaging in health screening or thin-slice imaging in work-up. Therefore, the degradation of CT images caused by the streak artifact frequently becomes problematic. In this study, we applied a Wiener filter (WF) using the universal Gaussian mixture distribution model (UNI-GMM) as a statistical model to remove the streak artifact. In designing the WF, it is necessary to estimate the statistical model and the precise covariances of the original image. In the proposed method, we obtained a variety of chest X-ray CT images using a phantom simulating a chest organ, and we estimated the statistical information using these images for training. The simulation results showed that it is possible to fit the UNI-GMM to chest X-ray CT images and reduce the specific noise. (author)
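For orientation, the basic adaptive Wiener filtering step can be sketched with SciPy's local mean/variance Wiener filter. This is a much simpler estimator than the UNI-GMM-based filter the record describes; the synthetic "organ" image and noise level are assumptions:

```python
import numpy as np
from scipy.signal import wiener

rng = np.random.default_rng(7)
clean = np.outer(np.hanning(64), np.hanning(64)) * 100   # smooth synthetic "organ"
noisy = clean + rng.normal(0, 10, clean.shape)

# scipy's adaptive Wiener filter estimates local mean and variance in a 5x5
# window; the paper's UNI-GMM statistical model is a stand-in for this simple
# local-Gaussian assumption.
denoised = wiener(noisy, mysize=5)

mse_noisy = np.mean((noisy - clean) ** 2)
mse_denoised = np.mean((denoised - clean) ** 2)
print(mse_denoised < mse_noisy)
```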

  9. STRUCTURE TENSOR IMAGE FILTERING USING RIEMANNIAN L1 AND L∞ CENTER-OF-MASS

    Directory of Open Access Journals (Sweden)

    Jesus Angulo

    2014-06-01

    Full Text Available Structure tensor images are obtained by a Gaussian smoothing of the dyadic product of the gradient image. These images give at each pixel an n×n symmetric positive definite matrix SPD(n), representing the local orientation and the edge information. Processing such images requires appropriate algorithms working on the Riemannian manifold of the SPD(n) matrices. This contribution deals with structure tensor image filtering based on Lp geometric averaging. In particular, the L1 center-of-mass (Riemannian median or Fermat-Weber point) and the L∞ center-of-mass (Riemannian circumcenter) can be obtained for structure tensors using recently proposed algorithms. Our contribution in this paper is to study the interest of the L1 and L∞ Riemannian estimators for structure tensor image processing. In particular, we compare both for two image analysis tasks: (i) structure tensor image denoising; (ii) anomaly detection in structure tensor images.
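Averaging SPD matrices on a manifold rather than entry-wise is the key point of this record. As a minimal sketch, the log-Euclidean mean below is a common, cheaper approximation to the Riemannian (L2) center of mass — simpler than the L1 median and L∞ circumcenter the paper studies:

```python
import numpy as np
from scipy.linalg import logm, expm

def log_euclidean_mean(tensors):
    """Log-Euclidean approximation to the Riemannian center of mass of SPD
    structure tensors: average in the matrix-logarithm domain, then map back.
    A stand-in for the L1/Linf estimators discussed in the paper."""
    logs = [logm(t) for t in tensors]
    return expm(np.mean(logs, axis=0))

# Two 2x2 structure tensors with orthogonal dominant orientations.
t1 = np.array([[4.0, 0.0], [0.0, 1.0]])
t2 = np.array([[1.0, 0.0], [0.0, 4.0]])
mean = log_euclidean_mean([t1, t2])
print(np.round(mean, 3))
```

Note the result is diag(2, 2) — the geometric mean of the eigenvalues — whereas a naive entry-wise average would give diag(2.5, 2.5) and systematically inflate tensor magnitudes.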

  10. Enhancement of noisy EDX HRSTEM spectrum-images by combination of filtering and PCA.

    Science.gov (United States)

    Potapov, Pavel; Longo, Paolo; Okunishi, Eiji

    2017-05-01

    STEM spectrum-imaging with EDX signal collection is considered with a view to extracting the maximum information from very noisy data. It is emphasized that spectrum-images with a weak EDX signal often suffer from information loss in the course of PCA treatment. The loss occurs when the level of random noise exceeds a certain threshold. Weighted PCA, though potentially helpful in isolating meaningful variations from noise, might provoke the complete loss of information when the EDX signal is weak. Filtering datasets prior to PCA can improve the situation and recover the lost information. In particular, Gaussian kernel filters are found to be efficient. A new filter useful in the case of sparse atomic-resolution EDX spectrum-images is suggested. Copyright © 2017 Elsevier Ltd. All rights reserved.
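The "filter before PCA" idea in this record can be sketched on a synthetic spectrum-image: spatial Gaussian kernel filtering of each channel concentrates the variance into the leading principal components. The two reference spectra, cube size, and Poisson scaling are all illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(8)
# Synthetic spectrum-image: 32x32 pixels x 50 channels, two spatial phases
# mixing two Gaussian reference spectra, with strong Poisson counting noise.
s1 = np.exp(-0.5 * ((np.arange(50) - 15) / 3.0) ** 2)
s2 = np.exp(-0.5 * ((np.arange(50) - 35) / 3.0) ** 2)
weight = np.zeros((32, 32)); weight[:, 16:] = 1.0
clean = weight[..., None] * s1 + (1 - weight[..., None]) * s2
noisy = rng.poisson(clean * 5).astype(float)

# Spatial Gaussian kernel filtering of each channel before PCA (sigma assumed).
filtered = gaussian_filter(noisy, sigma=(2, 2, 0))

def pca_variance_ratio(cube, n=2):
    """Fraction of total variance captured by the first n principal components."""
    X = cube.reshape(-1, cube.shape[-1])
    X = X - X.mean(axis=0)
    s = np.linalg.svd(X, compute_uv=False)
    var = s ** 2
    return var[:n].sum() / var.sum()

print(pca_variance_ratio(noisy), pca_variance_ratio(filtered))
```

After filtering, the first two components carry nearly all the variance, which is why the meaningful chemical variations are no longer drowned by the noise floor during PCA truncation.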

  11. The use of the Kalman filter in the automated segmentation of EIT lung images

    International Nuclear Information System (INIS)

    Zifan, A; Chapman, B E; Liatsis, P

    2013-01-01

    In this paper, we present a new pipeline for the fast and accurate segmentation of impedance images of the lungs using electrical impedance tomography (EIT). EIT is an emerging, promising, non-invasive imaging modality that produces real-time, low spatial but high temporal resolution images of impedance inside a body. Recovering impedance itself constitutes a nonlinear ill-posed inverse problem, therefore the problem is usually linearized, which produces impedance-change images, rather than static impedance ones. Such images are highly blurry and fuzzy along object boundaries. We provide a mathematical reasoning behind the high suitability of the Kalman filter when it comes to segmenting and tracking conductivity changes in EIT lung images. Next, we use a two-fold approach to tackle the segmentation problem. First, we construct a global lung shape to restrict the search region of the Kalman filter. Next, we proceed with augmenting the Kalman filter by incorporating an adaptive foreground detection system to provide the boundary contours for the Kalman filter to carry out the tracking of the conductivity changes as the lungs undergo deformation in a respiratory cycle. The proposed method has been validated by using performance statistics such as misclassified area, and false positive rate, and compared to previous approaches. The results show that the proposed automated method can be a fast and reliable segmentation tool for EIT imaging. (paper)

  12. The use of the Kalman filter in the automated segmentation of EIT lung images.

    Science.gov (United States)

    Zifan, A; Liatsis, P; Chapman, B E

    2013-06-01

    In this paper, we present a new pipeline for the fast and accurate segmentation of impedance images of the lungs using electrical impedance tomography (EIT). EIT is an emerging, promising, non-invasive imaging modality that produces real-time, low spatial but high temporal resolution images of impedance inside a body. Recovering impedance itself constitutes a nonlinear ill-posed inverse problem, therefore the problem is usually linearized, which produces impedance-change images, rather than static impedance ones. Such images are highly blurry and fuzzy along object boundaries. We provide a mathematical reasoning behind the high suitability of the Kalman filter when it comes to segmenting and tracking conductivity changes in EIT lung images. Next, we use a two-fold approach to tackle the segmentation problem. First, we construct a global lung shape to restrict the search region of the Kalman filter. Next, we proceed with augmenting the Kalman filter by incorporating an adaptive foreground detection system to provide the boundary contours for the Kalman filter to carry out the tracking of the conductivity changes as the lungs undergo deformation in a respiratory cycle. The proposed method has been validated by using performance statistics such as misclassified area, and false positive rate, and compared to previous approaches. The results show that the proposed automated method can be a fast and reliable segmentation tool for EIT imaging.
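The Kalman predict/update cycle used in the two records above to track conductivity-change boundaries can be sketched with a minimal 1D constant-velocity filter. The state model, noise covariances, and breathing-like trajectory are illustrative assumptions (the paper tracks full contour points):

```python
import numpy as np

dt = 1.0
F = np.array([[1, dt], [0, 1]])        # state transition (position, velocity)
H = np.array([[1, 0]])                 # only the position is measured
Q = 1e-3 * np.eye(2)                   # assumed process noise covariance
R = np.array([[0.25]])                 # assumed measurement noise covariance

x = np.array([[0.0], [0.0]])           # initial state estimate
P = np.eye(2)                          # initial state covariance

rng = np.random.default_rng(3)
true_pos = np.sin(np.linspace(0, 2 * np.pi, 50))   # breathing-like boundary motion
measurements = true_pos + rng.normal(0, 0.5, size=50)

estimates = []
for z in measurements:
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = z - (H @ x)[0, 0]              # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T / S[0, 0]              # Kalman gain
    x = x + K * y
    P = (np.eye(2) - K @ H) @ P
    estimates.append(x[0, 0])

estimates = np.array(estimates)
err_raw = np.mean((measurements - true_pos) ** 2)
err_kf = np.mean((estimates - true_pos) ** 2)
print(err_kf < err_raw)
```

The filtered track has a lower mean squared error than the raw measurements, which is the property that makes the Kalman filter a good tracker for blurry, noisy EIT boundaries.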

  13. Gradient based filtering of digital elevation models

    DEFF Research Database (Denmark)

    Knudsen, Thomas; Andersen, Rune Carbuhn

    We present a filtering method for digital terrain models (DTMs). The method is based on mathematical morphological filtering within gradient (slope) defined domains. The intention with the filtering procedure is to improve the cartographic quality of height contours generated from a DTM based...
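The morphological core of this (truncated) record can be illustrated with a grey-scale opening on a synthetic DTM: opening removes spikes narrower than the structuring element. The gradient-domain restriction described in the abstract is not reproduced here, and the sizes are assumptions:

```python
import numpy as np
from scipy.ndimage import grey_opening

# Morphological opening of a synthetic DTM removes spikes narrower than the
# 3x3 structuring element (e.g. a tree hit or an outlier return) while leaving
# the surrounding terrain untouched.
dtm = np.full((32, 32), 10.0)
dtm[16, 16] = 25.0                       # spurious single-cell spike
filtered = grey_opening(dtm, size=(3, 3))
print(filtered.max())
```

Smoother contours follow directly: the spike that would have produced a tiny closed contour ring is gone.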

  14. Image Denoising Using Interquartile Range Filter with Local Averaging

    OpenAIRE

    Jassim, Firas Ajil

    2013-01-01

    Image denoising is one of the fundamental problems in image processing. In this paper, a novel approach to suppressing noise in an image is presented, applying the interquartile range (IQR), one of the statistical methods used to detect outliers in a dataset. A window of size k×k was implemented to support the IQR filter. Each pixel outside the IQR range of the k×k window is treated as a noisy pixel. The estimates of the noisy pixels were obtained by local averaging. The essential...
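The IQR-with-local-averaging idea can be sketched directly. The 1.5 × IQR fence and the choice to average the whole window (including the flagged pixel) are assumptions; the truncated abstract does not fix these details:

```python
import numpy as np

def iqr_denoise(img, k=3):
    """Flag pixels outside the local interquartile-range fence as noisy and
    replace them with the local window mean (the 'local averaging' step)."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode='reflect')
    out = img.astype(float).copy()
    rows, cols = img.shape
    for i in range(rows):
        for j in range(cols):
            win = padded[i:i + k, j:j + k]
            q1, q3 = np.percentile(win, [25, 75])
            iqr = q3 - q1
            lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
            if img[i, j] < lo or img[i, j] > hi:
                out[i, j] = win.mean()   # local averaging for the flagged pixel
    return out

# Flat image with a single impulse-noise pixel.
img = np.full((9, 9), 100.0)
img[4, 4] = 255.0
restored = iqr_denoise(img, k=3)
print(restored[4, 4])
```

Only the outlier pixel is modified; every in-range pixel passes through unchanged, which is the main advantage of outlier-based filters over blanket smoothing.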

  15. Mammographic image enhancement using wavelet transform and homomorphic filter

    Directory of Open Access Journals (Sweden)

    F Majidi

    2015-12-01

    Full Text Available Mammography is the most effective method for the early diagnosis of breast cancer. As mammographic images have a low signal-to-noise ratio and low contrast, it is difficult for radiologists to analyze them. To deal with these problems, it is very important to enhance mammographic images using image processing methods. This paper introduces a new image enhancement approach for mammographic images which uses modified mathematical morphology, the wavelet transform, and a homomorphic filter to suppress image noise. For performance evaluation of the proposed method, the contrast improvement index (CII) and edge preservation index (EPI) are adopted. Experimental results on mammographic images from the Pejvak Digital Imaging Center (PDIC) show that the proposed algorithm improves both indexes, thereby achieving the goal of enhancing mammographic images.
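The homomorphic filtering component of this record can be sketched in isolation: take the log of the image, apply a high-emphasis transfer function in the frequency domain, and exponentiate back. The cutoff and gain parameters are illustrative, not the paper's:

```python
import numpy as np

def homomorphic_filter(img, cutoff=0.1, gamma_low=0.5, gamma_high=2.0):
    """Homomorphic filtering sketch: log -> Gaussian high-emphasis filter in
    the frequency domain -> exp. Parameter names/values are assumptions."""
    img = np.asarray(img, dtype=float)
    log_img = np.log1p(img)                       # multiplicative -> additive
    F = np.fft.fftshift(np.fft.fft2(log_img))
    rows, cols = img.shape
    u = np.arange(rows) - rows / 2
    v = np.arange(cols) - cols / 2
    D2 = u[:, None] ** 2 + v[None, :] ** 2
    D0 = cutoff * max(rows, cols)
    # High-emphasis transfer function: attenuates illumination (low frequency),
    # boosts reflectance/detail (high frequency).
    H = gamma_low + (gamma_high - gamma_low) * (1 - np.exp(-D2 / (2 * D0 ** 2)))
    filtered = np.fft.ifft2(np.fft.ifftshift(F * H)).real
    return np.expm1(filtered)

rng = np.random.default_rng(1)
img = rng.uniform(50, 200, size=(32, 32))
out = homomorphic_filter(img)
print(out.shape)
```

Because the filter acts on the log image, slowly varying illumination (e.g. uneven breast-tissue thickness) is compressed while local detail contrast is boosted.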

  16. Despeckling Polsar Images Based on Relative Total Variation Model

    Science.gov (United States)

    Jiang, C.; He, X. F.; Yang, L. J.; Jiang, J.; Wang, D. Y.; Yuan, Y.

    2018-04-01

    The relative total variation (RTV) algorithm, which can effectively separate structure from texture in an image, is employed to extract the main structures of the image. However, applying RTV directly to polarimetric SAR (PolSAR) image filtering does not preserve polarimetric information. A new RTV approach based on the complex Wishart distribution is proposed that takes the polarimetric properties of PolSAR data into account. The proposed polarimetric RTV (PolRTV) algorithm can be used for PolSAR image filtering. The L-band Airborne SAR (AIRSAR) San Francisco dataset is used to demonstrate the effectiveness of the proposed algorithm in speckle suppression, structural information preservation, and polarimetric property preservation.

  17. Digital notch filter based active damping for LCL filters

    DEFF Research Database (Denmark)

    Yao, Wenli; Yang, Yongheng; Zhang, Xiaobin

    2015-01-01

    … In contrast, active damping does not require any dissipative elements and has thus become of increasing interest. As a result, a wide range of active damping solutions has been reported, many of which require multi-loop control systems and additional sensors, leading to increased cost and complexity. In this paper, a notch-filter-based active damping method that requires no additional sensors is proposed, where the inverter current is employed as the feedback variable. First, a design method for the notch filter is presented. The stability of the entire system is then investigated in the z-domain. Simulations and experiments are carried out to verify the proposed active damping method. Both confirm that the notch-filter-based active damping can ensure stability of the entire system in the presence of resonances, with good system performance.
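The notch filter at the center of this record is tuned to the LCL resonance frequency so that the current loop no longer excites it. A minimal digital design sketch, with an assumed 1 kHz resonance and 10 kHz sampling rate (illustrative values, not taken from the paper):

```python
import numpy as np
from scipy import signal

# Digital notch tuned to an assumed LCL resonance of 1 kHz at a 10 kHz
# sampling rate; Q controls how narrow the rejection band is.
fs = 10_000.0
f_res = 1_000.0
b, a = signal.iirnotch(f_res, Q=30.0, fs=fs)

# Gain is near unity away from the resonance and a deep null at it -- this is
# what lets the notch cancel the resonance peak in the inverter current loop.
test_freqs = np.array([100.0, f_res, 4_000.0])
_, h = signal.freqz(b, a, worN=test_freqs, fs=fs)
gain = np.abs(h)
print(gain.round(3))
```

Placing this filter in series with the inner current controller attenuates only the resonance band, so the fundamental and low-order harmonics needed for compensation pass through essentially unchanged.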

  18. Edge Detection from High Resolution Remote Sensing Images using Two-Dimensional log Gabor Filter in Frequency Domain

    International Nuclear Information System (INIS)

    Wang, K; Yu, T; Meng, Q Y; Wang, G K; Li, S P; Liu, S H

    2014-01-01

    Edges are vital features for describing the structural information of images, especially high-spatial-resolution remote sensing images. Edge features can be used to define the boundaries between different ground objects in such images, so edge detection is important in remote sensing image processing. Although many edge detection algorithms have been proposed, it remains difficult to extract edge features from high-spatial-resolution remote sensing images containing complex ground objects. This paper introduces a novel method for detecting edges in high-spatial-resolution remote sensing images based on the frequency domain. First, the image is Fourier transformed by FFT to obtain the magnitude spectrum (frequency image). The frequency spectrum is then analyzed by radius and angle sampling. Next, a two-dimensional log-Gabor filter with optimal parameters is designed according to the result of the spectrum analysis. Finally, the product of the Fourier transform and the log-Gabor filter is inverse Fourier transformed to obtain the detected edges. The experimental results show that the proposed algorithm can detect edge features from high-resolution remote sensing images effectively
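A 2D log-Gabor transfer function can be built directly on the frequency grid and applied by pointwise multiplication, as this record describes. The center frequency, bandwidth, and orientation spread below are illustrative defaults, not the optimized parameters the paper derives from spectrum analysis:

```python
import numpy as np

def log_gabor_2d(shape, f0=0.15, sigma_ratio=0.65, theta0=0.0, sigma_theta=0.6):
    """2D log-Gabor transfer function in the (shifted) frequency domain.
    Parameter values are assumed defaults, not the paper's optimized ones."""
    rows, cols = shape
    fy = np.fft.fftshift(np.fft.fftfreq(rows))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(cols))[None, :]
    radius = np.hypot(fx, fy)
    radius[rows // 2, cols // 2] = 1.0        # avoid log(0) at DC
    theta = np.arctan2(fy, fx)
    radial = np.exp(-np.log(radius / f0) ** 2 / (2 * np.log(sigma_ratio) ** 2))
    radial[rows // 2, cols // 2] = 0.0        # log-Gabor has no DC component
    dtheta = np.arctan2(np.sin(theta - theta0), np.cos(theta - theta0))
    angular = np.exp(-dtheta ** 2 / (2 * sigma_theta ** 2))
    return radial * angular

def filter_image(img, H):
    F = np.fft.fftshift(np.fft.fft2(img))
    return np.fft.ifft2(np.fft.ifftshift(F * H)).real

# A vertical step edge responds strongly to the horizontally tuned filter.
img = np.zeros((64, 64)); img[:, 32:] = 1.0
H = log_gabor_2d(img.shape, theta0=0.0)
resp = np.abs(filter_image(img, H))
print(resp.mean(axis=0)[30:34].mean() > resp.mean(axis=0)[14:18].mean())
```

The response concentrates around the edge columns, which is the behavior the paper exploits to delineate ground-object boundaries.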

  19. Blurred image restoration using knife-edge function and optimal window Wiener filtering

    Science.gov (United States)

    Zhou, Shudao; Yan, Wei

    2018-01-01

    Motion blur in images is usually modeled as the convolution of a point spread function (PSF) and the original image represented as pixel intensities. The knife-edge function can be used to model various types of motion-blurs, and hence it allows for the construction of a PSF and accurate estimation of the degradation function without knowledge of the specific degradation model. This paper addresses the problem of image restoration using a knife-edge function and optimal window Wiener filtering. In the proposed method, we first calculate the motion-blur parameters and construct the optimal window. Then, we use the detected knife-edge function to obtain the system degradation function. Finally, we perform Wiener filtering to obtain the restored image. Experiments show that the restored image has improved resolution and contrast parameters with clear details and no discernible ringing effects. PMID:29377950
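The final Wiener-filtering step of this record can be sketched in the frequency domain once a PSF is available. The fixed noise-to-signal ratio below stands in for the paper's optimal-window estimation, and the 5-pixel line PSF is an assumed motion-blur model:

```python
import numpy as np

def wiener_deconvolve(blurred, psf, nsr=0.01):
    """Frequency-domain Wiener deconvolution: conj(H) / (|H|^2 + NSR).
    nsr is an assumed constant noise-to-signal power ratio, not an estimate."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.fft.ifft2(G * W).real

# Simulate horizontal motion blur of a point source with a 5-pixel line PSF.
img = np.zeros((32, 32)); img[16, 16] = 1.0
psf = np.zeros((32, 32)); psf[0, :5] = 1 / 5
blurred = np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)).real
restored = wiener_deconvolve(blurred, psf, nsr=1e-4)
print(restored[16, 16])
```

The point source is recovered almost exactly; the NSR term keeps the division stable at frequencies where the PSF response is small, which is what suppresses the ringing the record mentions.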

  20. Impact of Image Filters and Observations Parameters in CBCT for Identification of Mandibular Osteolytic Lesions.

    Science.gov (United States)

    Monteiro, Bruna Moraes; Nobrega Filho, Denys Silveira; Lopes, Patrícia de Medeiros Loureiro; de Sales, Marcelo Augusto Oliveira

    2012-01-01

    The aim of this study was to analyze the influence of image-enhancement filters (algorithms) on Cone Beam Computed Tomography (CBCT) in the diagnosis of osteolytic lesions of the mandible, in order to establish the most suitable image-viewing protocols for CBCT diagnostics. Fifteen dry mandibles, in which perforations simulating lesions had been made, were submitted to CBCT examination. Two examiners analyzed the images on two occasions, using the image-enhancement filters Hard, Normal, and Very Sharp contained in the iCAT Vision software, and the following assessment protocols: axial; sagittal and coronal; and axial, sagittal, and coronal planes simultaneously (MPR). The sensitivity and specificity (validity) of cone beam computed tomography (CBCT) were demonstrated, as the values achieved were above 75% for sensitivity and above 85% for specificity, reaching around 95.5% sensitivity and 99% specificity when the appropriate observation protocol was used. It was concluded that the use of image-enhancement filters (algorithms) influences CBCT diagnosis, since all measured values were higher when the Very Sharp filter was used, which justifies its use in clinical practice, followed by the Hard and Normal filters, in order of decreasing values.

  1. Impact of Image Filters and Observations Parameters in CBCT for Identification of Mandibular Osteolytic Lesions

    Directory of Open Access Journals (Sweden)

    Bruna Moraes Monteiro

    2012-01-01

    Full Text Available The aim of this study was to analyze the influence of image-enhancement filters (algorithms) on Cone Beam Computed Tomography (CBCT) in the diagnosis of osteolytic lesions of the mandible, in order to establish the most suitable image-viewing protocols for CBCT diagnostics. Fifteen dry mandibles, in which perforations simulating lesions had been made, were submitted to CBCT examination. Two examiners analyzed the images on two occasions, using the image-enhancement filters Hard, Normal, and Very Sharp contained in the iCAT Vision software, and the following assessment protocols: axial; sagittal and coronal; and axial, sagittal, and coronal planes simultaneously (MPR). The sensitivity and specificity (validity) of the cone beam computed tomography (CBCT) have been demonstrated, as the values achieved were above 75% for sensitivity and above 85% for specificity, reaching around 95.5% sensitivity and 99% specificity when the appropriate observation protocol was used. It was concluded that the use of image-enhancement filters (algorithms) influences CBCT diagnosis, since all measured values were higher when the Very Sharp filter was used, which justifies its use in clinical practice, followed by the Hard and Normal filters, in order of decreasing values.
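The validity measures reported in the two records above follow directly from a confusion-matrix count. A minimal sketch with a hypothetical reading session (the counts below are invented for illustration, not the study's data):

```python
import numpy as np

def sensitivity_specificity(truth, pred):
    """Validity measures used in the study:
    sensitivity = TP / (TP + FN), specificity = TN / (TN + FP)."""
    truth = np.asarray(truth, bool)
    pred = np.asarray(pred, bool)
    tp = np.sum(truth & pred)
    fn = np.sum(truth & ~pred)
    tn = np.sum(~truth & ~pred)
    fp = np.sum(~truth & pred)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical session: 20 sites with simulated lesions, 20 without;
# the reader misses one lesion and raises one false alarm.
truth = [1] * 20 + [0] * 20
pred = [1] * 19 + [0] + [0] * 19 + [1]
sens, spec = sensitivity_specificity(truth, pred)
print(sens, spec)
```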

  2. Optimization of recommendations for abdomen computerized tomography based on reconstruction filters, voltage and tube current

    International Nuclear Information System (INIS)

    Silveira, Vinicius da Costa

    2015-01-01

The use of computed tomography has increased significantly over the past decades. In Brazil, use more than doubled from 2008 to 2014, while abdominal procedures tripled. The high frequency of this procedure, combined with the increasing collective radiation dose from medical exposures, has driven the development of tools to maximize the benefit of CT images. This work aimed to establish optimized protocols for abdominal CT through acquisition parameters and reconstruction techniques based on filter kernels. A sample of patients undergoing abdominal CT in a diagnostic center in Rio de Janeiro was assessed, and patient information and acquisition parameters were collected. Phantom CT images were acquired using different voltage values, adjusting the tube current (mAs) to obtain the same CTDIvol as for patients with normal BMI. Afterwards, the CTDIvol values were reduced by 30%, 50% and 60%. All images were reconstructed with low-contrast filters (A) and standard filters (B). The CTDIvol values for patients with normal BMI were 7% higher than for patients with underweight BMI, and 30%, 50% and 60% lower than for overweight, obese I and obese III patients, respectively. The image-quality evaluations showed that variation of the current (mA) and of the reconstruction filters did not affect the Hounsfield values. When the contrast-to-noise ratio (CNR) was normalized to CTDIvol, the protocols acquired with a 60% reduction of CTDIvol at 140 kV and 80 kV showed a CNR 6% lower than the routine. Modifications of the acquisition parameters did not affect spatial resolution, but post-processing with B filters reduced the spatial frequency by 16%. With a dose reduction of 30%, lesions in the spleen had a CNR more than 10% higher than the routine protocols acquired at 140 kV and post-processed with filter A. Post-processing with filter A at an 80 kV voltage provided CNR values equal to the routine for liver lesions with a 30
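The CNR comparisons above rest on a simple ROI statistic, lesion-to-background contrast over background noise; a minimal sketch with synthetic HU values (the abstract's normalization to CTDIvol is not reproduced here):

```python
import numpy as np

def cnr(lesion_roi, background_roi):
    """Contrast-to-noise ratio: absolute mean difference over background noise."""
    return abs(lesion_roi.mean() - background_roi.mean()) / background_roi.std()

# Hypothetical ROI samples in Hounsfield units
rng = np.random.default_rng(0)
background = rng.normal(40.0, 5.0, size=1000)
lesion = rng.normal(60.0, 5.0, size=1000)
value = cnr(lesion, background)
```

A 20 HU contrast against 5 HU of noise yields a CNR near 4 in this toy case.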

  3. SU-F-I-08: CT Image Ring Artifact Reduction Based On Prior Image

    Energy Technology Data Exchange (ETDEWEB)

    Yuan, C; Qi, H; Chen, Z; Wu, S; Xu, Y; Zhou, L [Southern Medical University, Guangzhou, Guangdong (China)

    2016-06-15

Purpose: In a computed tomography (CT) system, CT images are reconstructed with ring artifacts when some adjacent detector bins do not work. The ring artifacts severely degrade CT image quality. We present a CT ring artifact reduction method based on projection data correction, aiming at estimating the missing projection data accurately and thus removing the ring artifacts from CT images. Methods: The method consists of ten steps: 1) identification of the abnormal pixel line in the projection sinogram; 2) linear interpolation within the pixel line of the projection sinogram; 3) FBP reconstruction using the interpolated projection data; 4) filtering of the FBP image using a mean filter; 5) forward projection of the filtered FBP image; 6) subtraction of the forward projection from the original projection; 7) linear interpolation of the abnormal pixel line area in the subtraction projection; 8) addition of the interpolated subtraction projection to the forward projection; 9) FBP reconstruction using the corrected projection data; 10) return to step 4 until the pre-set iteration number is reached. The method is validated on simulated and real data in restoring missing projection data and reconstructing ring-artifact-free CT images. Results: We have studied the impact of the number of dead bins of the CT detector on the accuracy of missing-data estimation in the projection sinogram. For the simulated case with a 256 by 256 Shepp-Logan phantom, three iterations are sufficient to restore the projection data and reconstruct ring-artifact-free images when the dead-bin rate is under 30%. The dead-bin-induced artifacts are substantially reduced. A larger iteration number is needed to reconstruct satisfactory images as the dead-bin rate increases. Similar results were found for a real head phantom case. Conclusion: A practical CT image ring artifact correction scheme based on projection data is developed. This method can produce ring-artifact-free CT images feasibly and effectively.
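Step 2 of the pipeline above, linear interpolation across the dead detector bins of the sinogram, can be sketched as follows (the toy sinogram and dead-bin indices are made-up values):

```python
import numpy as np

def interpolate_dead_bins(sinogram, dead_bins):
    """Linearly interpolate dead detector columns of a sinogram
    (rows = projection angles, columns = detector bins)."""
    fixed = sinogram.copy()
    dead = set(dead_bins)
    good = np.array([b for b in range(sinogram.shape[1]) if b not in dead])
    for row in range(sinogram.shape[0]):
        fixed[row, dead_bins] = np.interp(dead_bins, good, sinogram[row, good])
    return fixed

sino = np.tile(np.arange(8, dtype=float), (4, 1))  # toy sinogram with a linear profile
sino[:, [3, 4]] = 0.0                              # simulate two dead bins
restored = interpolate_dead_bins(sino, [3, 4])
```

On this linear toy profile the interpolation recovers the original values exactly; on real data it only seeds the iterative refinement of steps 3-10.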

  4. SU-F-I-08: CT Image Ring Artifact Reduction Based On Prior Image

    International Nuclear Information System (INIS)

    Yuan, C; Qi, H; Chen, Z; Wu, S; Xu, Y; Zhou, L

    2016-01-01

Purpose: In a computed tomography (CT) system, CT images are reconstructed with ring artifacts when some adjacent detector bins do not work. The ring artifacts severely degrade CT image quality. We present a CT ring artifact reduction method based on projection data correction, aiming at estimating the missing projection data accurately and thus removing the ring artifacts from CT images. Methods: The method consists of ten steps: 1) identification of the abnormal pixel line in the projection sinogram; 2) linear interpolation within the pixel line of the projection sinogram; 3) FBP reconstruction using the interpolated projection data; 4) filtering of the FBP image using a mean filter; 5) forward projection of the filtered FBP image; 6) subtraction of the forward projection from the original projection; 7) linear interpolation of the abnormal pixel line area in the subtraction projection; 8) addition of the interpolated subtraction projection to the forward projection; 9) FBP reconstruction using the corrected projection data; 10) return to step 4 until the pre-set iteration number is reached. The method is validated on simulated and real data in restoring missing projection data and reconstructing ring-artifact-free CT images. Results: We have studied the impact of the number of dead bins of the CT detector on the accuracy of missing-data estimation in the projection sinogram. For the simulated case with a 256 by 256 Shepp-Logan phantom, three iterations are sufficient to restore the projection data and reconstruct ring-artifact-free images when the dead-bin rate is under 30%. The dead-bin-induced artifacts are substantially reduced. A larger iteration number is needed to reconstruct satisfactory images as the dead-bin rate increases. Similar results were found for a real head phantom case. Conclusion: A practical CT image ring artifact correction scheme based on projection data is developed. This method can produce ring-artifact-free CT images feasibly and effectively.

  5. Impulse Noise Cancellation of Medical Images Using Wavelet Networks and Median Filters

    Science.gov (United States)

    Sadri, Amir Reza; Zekri, Maryam; Sadri, Saeid; Gheissari, Niloofar

    2012-01-01

This paper presents a new two-stage approach to impulse noise removal for medical images based on a wavelet network (WN). The first stage is noise detection, in which the so-called gray-level difference and average background difference are taken as the inputs of a WN; the WN thus serves as a preprocessor for the second stage, which removes the impulse noise with a median filter. The wavelet network presented here is fixed, without learning. Experimental results show that our method suppresses impulse noise effectively while preserving chromaticity and image details very well. PMID:23493998
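The detect-then-filter structure can be sketched as follows; the wavelet-network detector is replaced here by a simple deviation-from-local-median test, and the threshold and window size are assumptions, not the paper's values:

```python
import numpy as np
from scipy.ndimage import median_filter

def two_stage_impulse_removal(img, threshold=50):
    """Stage 1: flag pixels far from their local median (impulse detection).
    Stage 2: replace only the flagged pixels with the median value."""
    med = median_filter(img, size=3)
    impulses = np.abs(img.astype(float) - med.astype(float)) > threshold
    out = img.copy()
    out[impulses] = med[impulses]
    return out

img = np.full((5, 5), 100, dtype=np.uint8)
img[2, 2] = 255                      # a single salt impulse
clean = two_stage_impulse_removal(img)
```

Because only detected pixels are replaced, uncorrupted detail is left untouched, which is the point of the two-stage design.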

  6. Fast single image dehazing based on image fusion

    Science.gov (United States)

    Liu, Haibo; Yang, Jie; Wu, Zhengping; Zhang, Qingnian

    2015-01-01

Haze in images captured in foggy weather fades the colors and reduces the contrast of the observed objects. An efficient image fusion method is proposed to remove haze from a single input image. First, the initial medium transmission is estimated based on the dark channel prior. Second, the method adopts the assumption that the degradation level caused by haze is the same within each region, which is similar to the Retinex theory, and uses a simple Gaussian filter to obtain the coarse medium transmission. Then, pixel-level fusion is performed between the initial and coarse medium transmissions. The proposed method can recover a high-quality haze-free image based on the physical model, and its complexity is only a linear function of the number of input image pixels. Experimental results demonstrate that the proposed method allows a very fast implementation and achieves better restoration of visibility and color fidelity than some state-of-the-art methods.
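The first step, the dark-channel-prior transmission estimate, might look like this (the patch size, the ω constant, and the crude airlight estimate are conventional choices assumed here, not taken from the paper):

```python
import numpy as np
from scipy.ndimage import minimum_filter

def dark_channel(img, patch=15):
    """Per-pixel minimum over the RGB channels and a local square patch."""
    return minimum_filter(img.min(axis=2), size=patch)

def initial_transmission(img, omega=0.95):
    """Transmission estimate t = 1 - omega * dark_channel(I / A),
    with A a crude per-channel airlight estimate."""
    A = img.reshape(-1, 3).max(axis=0)
    return 1.0 - omega * dark_channel(img / A)

hazy = np.random.default_rng(1).uniform(0.3, 0.9, (32, 32, 3))  # toy hazy image
t = initial_transmission(hazy)
```

The method above then smooths a region-wise version of this map with a Gaussian filter and fuses the two estimates pixel by pixel.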

  7. Effect of Post-Reconstruction Gaussian Filtering on Image Quality and Myocardial Blood Flow Measurement with N-13 Ammonia PET

    Directory of Open Access Journals (Sweden)

    Hyeon Sik Kim

    2014-10-01

Full Text Available Objective(s): In order to evaluate the effect of post-reconstruction Gaussian filtering on image quality and myocardial blood flow (MBF) measurement by dynamic N-13 ammonia positron emission tomography (PET), we compared various reconstruction and filtering methods with image characteristics. Methods: Dynamic PET images of three patients with coronary artery disease (male-female ratio of 2:1; age: 57, 53, and 76 years) were reconstructed using filtered back projection (FBP) and ordered subset expectation maximization (OSEM) methods. OSEM reconstruction consisted of OSEM_2I, OSEM_4I, and OSEM_6I with 2, 4, and 6 iterations, respectively. Images reconstructed and filtered by Gaussian filters of 5, 10, and 15 mm were obtained, as well as non-filtered images. Visual analysis of image quality (IQ) was performed using a 3-grade scoring system by 2 independent readers, blinded to the reconstruction and filtering methods of the stress images. Then, the signal-to-noise ratio (SNR) was calculated from noise and contrast recovery (CR). Stress and rest MBF and coronary flow reserve (CFR) were obtained for each method. IQ scores, stress and rest MBF, and CFR were compared between the methods using Chi-square and Kruskal-Wallis tests. Results: In the visual analysis, IQ was significantly higher with 10 mm Gaussian filtering than with the other filter sizes (P<0.001 for both readers); no significant difference in IQ was found between FBP and the various numbers of OSEM iterations (P=0.923 and 0.855 for readers 1 and 2, respectively). SNR was significantly higher with the 10 mm Gaussian filter. There was a significant difference in stress and rest MBF between several vascular territories; however, CFR was not significantly different between the filtering methods. Conclusion: Post-reconstruction Gaussian filtering with a filter size of 10 mm significantly enhances the IQ of N-13 ammonia PET-CT without changing the results of the CFR calculation.

  8. Effect of Post-Reconstruction Gaussian Filtering on Image Quality and Myocardial Blood Flow Measurement with N-13 Ammonia PET

    International Nuclear Information System (INIS)

    Kim, Hyeon Sik; Cho, Sang-Geon; Kim, Ju Han; Kwon, Seong Young; Lee, Byeong-il; Bom, Hee-Seung

    2014-01-01

In order to evaluate the effect of post-reconstruction Gaussian filtering on image quality and myocardial blood flow (MBF) measurement by dynamic N-13 ammonia positron emission tomography (PET), we compared various reconstruction and filtering methods with image characteristics. Dynamic PET images of three patients with coronary artery disease (male-female ratio of 2:1; age: 57, 53, and 76 years) were reconstructed using filtered back projection (FBP) and ordered subset expectation maximization (OSEM) methods. OSEM reconstruction consisted of OSEM-2I, OSEM-4I, and OSEM-6I with 2, 4, and 6 iterations, respectively. Images reconstructed and filtered by Gaussian filters of 5, 10, and 15 mm were obtained, as well as non-filtered images. Visual analysis of image quality (IQ) was performed using a 3-grade scoring system by 2 independent readers, blinded to the reconstruction and filtering methods of the stress images. Then, the signal-to-noise ratio (SNR) was calculated from noise and contrast recovery (CR). Stress and rest MBF and coronary flow reserve (CFR) were obtained for each method. IQ scores, stress and rest MBF, and CFR were compared between the methods using Chi-square and Kruskal-Wallis tests. In the visual analysis, IQ was significantly higher with 10 mm Gaussian filtering than with the other filter sizes (P<0.001 for both readers). However, no significant difference in IQ was found between FBP and the various numbers of OSEM iterations (P=0.923 and 0.855 for readers 1 and 2, respectively). SNR was significantly higher with the 10 mm Gaussian filter. There was a significant difference in stress and rest MBF between several vascular territories; however, CFR was not significantly different between the filtering methods. Post-reconstruction Gaussian filtering with a filter size of 10 mm significantly enhances the IQ of N-13 ammonia PET-CT without changing the results of the CFR calculation.
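Post-reconstruction Gaussian filtering with a size given as FWHM in millimetres reduces, in code, to a FWHM-to-sigma conversion before smoothing (the voxel size below is a hypothetical value, not from the study):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_fwhm(volume, fwhm_mm, voxel_mm):
    """Post-reconstruction Gaussian smoothing specified by FWHM in mm:
    sigma = FWHM / (2 * sqrt(2 * ln 2)) ~= FWHM / 2.355, converted to voxels."""
    sigma_vox = fwhm_mm / (2.0 * np.sqrt(2.0 * np.log(2.0))) / voxel_mm
    return gaussian_filter(volume, sigma=sigma_vox)

vol = np.zeros((9, 9, 9))
vol[4, 4, 4] = 1.0                                      # toy point source
smoothed = smooth_fwhm(vol, fwhm_mm=10.0, voxel_mm=2.0)  # assumed 2 mm voxels
```

The point source is spread into a blob whose peak drops below the original value, which is exactly the noise/resolution trade-off the study quantifies.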

  9. Detection of retinal nerve fiber layer defects in retinal fundus images using Gabor filtering

    Science.gov (United States)

    Hayashi, Yoshinori; Nakagawa, Toshiaki; Hatanaka, Yuji; Aoyama, Akira; Kakogawa, Masakatsu; Hara, Takeshi; Fujita, Hiroshi; Yamamoto, Tetsuya

    2007-03-01

Retinal nerve fiber layer defect (NFLD) is one of the most important findings for the diagnosis of glaucoma reported by ophthalmologists. However, such changes can be overlooked, especially in mass screenings, because ophthalmologists have limited time to search for a number of different changes in the diagnosis of various diseases such as diabetes, hypertension and glaucoma. Therefore, the use of a computer-aided detection (CAD) system can improve the results of diagnosis. In this work, a technique for the detection of NFLDs in retinal fundus images is proposed. In the preprocessing step, blood vessels are "erased" from the original retinal fundus image by means of morphological filtering. The preprocessed image is then transformed into a rectangular array, in which NFLD regions appear as vertical dark bands. Gabor filtering is then applied to enhance the vertical dark bands. False positives (FPs) are reduced by a rule-based method that uses the location and width of each candidate region, and the detected regions are transformed back into the original configuration. In this preliminary study, 71% of NFLD regions were detected, with an average of 3.2 FPs per image. In conclusion, we have developed a technique for the detection of NFLDs in retinal fundus images, with promising results in this initial study.
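The abstract does not state which morphological operator "erases" the vessels; one plausible choice (an assumption here) is a grayscale closing wider than the vessel caliber, which fills thin dark structures while leaving the bright background intact:

```python
import numpy as np
from scipy.ndimage import grey_closing

# Toy fundus patch: bright background with a thin dark vertical "vessel"
fundus = np.full((16, 16), 180, dtype=np.uint8)
fundus[:, 7] = 60

# Closing with a horizontal window wider than the vessel removes it
erased = grey_closing(fundus, size=(1, 5))
```

Broad dark bands such as NFLDs are wider than the structuring element and would survive this step, which is why the subsequent Gabor filtering can still enhance them.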

  10. An effective coded excitation scheme based on a predistorted FM signal and an optimized digital filter

    DEFF Research Database (Denmark)

    Misaridis, Thanasis; Jensen, Jørgen Arendt

    1999-01-01

    This paper presents a coded excitation imaging system based on a predistorted FM excitation and a digital compression filter designed for medical ultrasonic applications, in order to preserve both axial resolution and contrast. In radars, optimal Chebyshev windows efficiently weight a nearly...... as with pulse excitation (about 1.5 lambda), depending on the filter design criteria. The axial sidelobes are below -40 dB, which is the noise level of the measuring imaging system. The proposed excitation/compression scheme shows good overall performance and stability to the frequency shift due to attenuation...... be removed by weighting. We show that by using a predistorted chirp with amplitude or phase shaping for amplitude ripple reduction and a correlation filter that accounts for the transducer's natural frequency weighting, output sidelobe levels of -35 to -40 dB are directly obtained. When an optimized filter...

11. Pornographic Image Filtering Method Based on Erotogenic-Zone Detection

    Institute of Scientific and Technical Information of China (English)

    陆蓓; 巩玉旺; 姚金良; 周建政

    2011-01-01

Most existing pornographic image filtering methods are prone to false detections on images with large areas of exposed skin or skin-like colors. To reduce the false detection rate on such images, a pornographic image filtering method based on erotogenic-zone detection is proposed in this paper. The method extracts skin-color features, HOG (Histograms of Oriented Gradients) features describing the shape and appearance of local objects, spatial-distribution features, and Haar-like features describing the local grayscale distribution; it then trains a classifier for erotogenic-zone recognition with the AdaBoost learning algorithm and performs pornographic image filtering through this classifier. Experiments confirmed that the method can precisely detect erotogenic zones in an image and can effectively reduce the false detection rate on non-pornographic images.

  12. Performance evaluation of 3-D enhancement filters for detection of lung cancer from 3-D chest X-ray CT images

    International Nuclear Information System (INIS)

    Shimizu, Akinobu; Hagai, Makoto; Toriwaki, Jun-ichiro; Hasegawa, Jun-ichi.

    1995-01-01

This paper evaluates the performance of several three-dimensional enhancement filters used in procedures for detecting lung cancer shadows in three-dimensional (3D) chest X-ray CT images. Two-dimensional enhancement filters such as the Min-DD filter, the Contrast filter and the N-Quoit filter have been proposed for enhancing cancer shadows in conventional 2D X-ray images. In this paper, we extend each of these 2D filters to a 3D filter and evaluate its performance experimentally using CT images with artificial and true lung cancer shadows. As a result, we find that these 3D filters are effective for determining the position of a lung cancer shadow in a 3D chest CT image, compared with simple procedures such as smoothing filters, and that their performance is lower in the hilar area due to the influence of vessel shadows. (author)

  13. Evolutionary Cellular Automata for Image Segmentation and Noise Filtering Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Sihem SLATNIA

    2011-01-01

Full Text Available We use an evolutionary process to seek a specialized set of rules, among a wide range of rules usable by Cellular Automata (CA), for a range of tasks: extracting edges in a given gray or colour image, and noise filtering applied to black-and-white images. This best set of local rules determines the future state of the CA in an asynchronous way. The Genetic Algorithm (GA) is applied to search for the CA rules that realize the best edge detection and noise filtering.

  14. Evolutionary Cellular Automata for Image Segmentation and Noise Filtering Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Okba Kazar

    2011-01-01

Full Text Available We use an evolutionary process to seek a specialized set of rules, among a wide range of rules usable by Cellular Automata (CA), for a range of tasks: extracting edges in a given gray or colour image, and noise filtering applied to black-and-white images. This best set of local rules determines the future state of the CA in an asynchronous way. The Genetic Algorithm (GA) is applied to search for the CA rules that realize the best edge detection and noise filtering.
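For intuition, a single hand-picked CA denoising rule can be sketched; this is a synchronous 3x3 majority vote on a binary image, not one of the GA-evolved asynchronous rules from the papers above:

```python
import numpy as np

def ca_majority_step(grid):
    """One synchronous CA step: each cell takes the majority state of its
    3x3 Moore neighbourhood (including itself)."""
    padded = np.pad(grid, 1, mode='edge')
    h, w = grid.shape
    # Sum the nine shifted copies to count 1-cells in each neighbourhood
    neigh = sum(padded[i:i + h, j:j + w] for i in range(3) for j in range(3))
    return (neigh >= 5).astype(grid.dtype)

img = np.zeros((8, 8), dtype=np.uint8)
img[2, 3] = 1                    # isolated "noise" pixel
denoised = ca_majority_step(img)
```

An isolated pixel is outvoted by its neighbourhood and removed; a GA would instead search the space of such local rules for the best-performing set.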

  15. Thermographic image analysis for classification of ACL rupture disease, bone cancer, and feline hyperthyroid, with Gabor filters

    Science.gov (United States)

    Alvandipour, Mehrdad; Umbaugh, Scott E.; Mishra, Deependra K.; Dahal, Rohini; Lama, Norsang; Marino, Dominic J.; Sackman, Joseph

    2017-05-01

Thermography and pattern classification techniques are used to classify three different pathologies in veterinary images. Thermographic images of both normal and diseased animals were provided by the Long Island Veterinary Specialists (LIVS). The three pathologies are ACL rupture disease, bone cancer, and feline hyperthyroid. The diagnosis of these diseases usually involves radiology and laboratory tests, whereas the method we propose uses thermographic images and image analysis techniques and is intended as a prescreening tool. Images in each pathology category are first filtered by Gabor filters, and then various features are extracted and used for classification into normal and abnormal classes. Gabor filters are linear filters characterized by two parameters, the wavelength λ and the orientation θ. With two different wavelengths and five different orientations, a total of ten different filters were studied. Different combinations of camera views, filters, feature vectors, normalization methods, and classification methods produced different tests, which were examined; the sensitivity, specificity, and success rate were computed for each test. Using the Gabor features alone, sensitivity, specificity, and overall success rates of 85% were achieved for each of the pathologies.
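A bank of ten Gabor kernels with two wavelengths and five orientations, as described above, can be constructed like this (kernel size, sigma, and the specific wavelength values are assumptions, not the study's parameters):

```python
import numpy as np

def gabor_kernel(wavelength, theta, size=21, sigma=4.0):
    """Real part of a Gabor filter with the given wavelength and orientation."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)      # rotate coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier

# 2 wavelengths x 5 orientations = 10 filters, as in the study
bank = [gabor_kernel(lam, th)
        for lam in (4.0, 8.0)
        for th in np.linspace(0, np.pi, 5, endpoint=False)]
```

Each thermographic image would be convolved with every kernel in the bank, and statistics of the responses form the Gabor feature vector.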

  16. Linear Regression Based Real-Time Filtering

    Directory of Open Access Journals (Sweden)

    Misel Batmend

    2013-01-01

Full Text Available This paper introduces a real-time filtering method based on a linear least-squares fitted line. The method can be used when the filtered signal is linear, a constraint that narrows the band of potential applications. Its advantage over the Kalman filter is that it is computationally less expensive. The paper further deals with the application of the introduced method to filtering data used to evaluate the position of engraved material with respect to the engraving machine. The filter was implemented in the CNC engraving machine control system. Experiments showing its performance are included.
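A minimal sketch of the idea, assuming the filter output is the least-squares line evaluated at the newest sample (the window length is an assumption):

```python
import numpy as np

def line_fit_filter(samples, window=5):
    """Filter the newest sample by fitting a least-squares line to the last
    `window` samples and evaluating it at the newest time index."""
    n = min(len(samples), window)
    y = np.asarray(samples[-n:], dtype=float)
    t = np.arange(n)
    slope, intercept = np.polyfit(t, y, 1)
    return slope * (n - 1) + intercept

# A linear position signal with one outlier: the fit pulls the estimate
# back toward the underlying line
stream = [0.0, 1.0, 2.0, 3.0, 7.0]
filtered = line_fit_filter(stream)   # 5.8, between the outlier 7 and the trend 4
```

Each update costs one small least-squares fit, which is cheaper than the matrix updates of a Kalman filter, matching the claim above.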

  17. Comparison of the diagnostic accuracy of direct digital radiography system, filtered images, and subtraction radiography

    Directory of Open Access Journals (Sweden)

    Wilton Mitsunari Takeshita

    2013-01-01

Full Text Available Background: To compare the diagnostic accuracy of three different imaging systems: a direct digital radiography system (DDR-CMOS), four types of filtered images, and a priori and a posteriori registration of digital subtraction radiography (DSR), in the diagnosis of proximal defects. Materials and Methods: The teeth were arranged in pairs in 10 blocks of vinyl polysiloxane, and proximal defects were created with drills of 0.25, 0.5, and 1 mm diameter. A Kodak RVG 6100 sensor was used to capture the images. A posteriori DSR registrations were done with Regeemy 0.2.43 and subtraction with Image Tool 3.0. Filtered images were obtained with Kodak Dental Imaging 6.1 software. The images (n = 360) were evaluated by three raters, all experts in dental radiology. Results: Sensitivity and specificity of the area under the receiver operating characteristic (ROC) curve (Az) were higher for DSR images with all three drills (Az = 0.896, 0.979, and 1.000 for the 0.25, 0.5, and 1 mm drills, respectively). The highest values were found for the 1-mm drills and the lowest for the 0.25-mm drills, with the negative filter having the lowest values of all (Az = 0.631). Conclusion: The best method of diagnosis was DSR. The negative filter obtained the worst results. Larger drills showed the highest sensitivity and specificity values of the area under the ROC curve.

  18. Filtered region of interest cone-beam rotational angiography

    International Nuclear Information System (INIS)

    Schafer, Sebastian; Noeel, Peter B.; Walczak, Alan M.; Hoffmann, Kenneth R.

    2010-01-01

Purpose: Cone-beam rotational angiography (CBRA) is widely used in modern clinical settings. In a number of procedures, the area of interest is considerably smaller than the field of view (FOV) of the detector, subjecting the patient to potentially unnecessary x-ray dose. The authors therefore propose a filter-based method to reduce the dose in the regions of low interest while supplying high image quality in the region of interest (ROI). Methods: For such procedures, the authors propose a method of filtered region of interest (FROI)-CBRA. In the authors' approach, a gadolinium filter with a circular central opening is placed in the x-ray beam during image acquisition. The central region is imaged with high contrast, while peripheral regions are subjected to substantially lower intensity and dose through beam filtering. The resulting images contain a high-contrast/intensity ROI, a low-contrast/intensity peripheral region, and a transition region in between. To equalize the two regions' intensities, the first projection of the acquisition is performed with and without the filter in place. The equalization relationship, based on Beer's law, is established through linear regression using corresponding filtered and nonfiltered data. The transition region is equalized based on radial profiles. Results: Evaluations in 2D and 3D show no visible difference between conventional and FROI-CBRA projection images and reconstructions in the ROI. CNR evaluations show similar image quality in the ROI, with a reduced CNR in the reconstructed peripheral region. In all filtered projection images, the scatter fraction inside the ROI was reduced. Theoretical and experimental dose evaluations show a considerable dose reduction; using an ROI half the original FOV reduces the dose by 60% for a filter thickness of 1.29 mm. Conclusions: These results indicate the potential of FROI-CBRA to reduce the dose to the patient while supplying the physician with the desired
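The Beer's-law equalization step can be sketched as a log-domain linear regression between corresponding filtered and nonfiltered pixels of the first projection (the toy data assume a spatially uniform filter attenuation; the paper's exact model may differ):

```python
import numpy as np

def fit_equalization(filtered_px, unfiltered_px):
    """Fit log(unfiltered) = a * log(filtered) + b on corresponding pixels,
    a Beer's-law-motivated linear relation in the log domain."""
    a, b = np.polyfit(np.log(filtered_px), np.log(unfiltered_px), 1)
    return a, b

def equalize(filtered_px, a, b):
    """Map filtered peripheral intensities onto the unfiltered scale."""
    return np.exp(a * np.log(filtered_px) + b)

# Toy data: the Gd filter transmits a constant fraction of the beam
rng = np.random.default_rng(2)
unfiltered = rng.uniform(500.0, 4000.0, 200)
filtered = 0.25 * unfiltered
a, b = fit_equalization(filtered, unfiltered)
restored = equalize(filtered, a, b)
```

With a uniform attenuation the fit recovers a = 1 and b = log(1/0.25), so the peripheral region is restored onto the same intensity scale as the ROI.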

  19. Filtered region of interest cone-beam rotational angiography

    Energy Technology Data Exchange (ETDEWEB)

    Schafer, Sebastian; Noeel, Peter B.; Walczak, Alan M.; Hoffmann, Kenneth R. [Department of Mechanical Engineering, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States); Department of Neurosurgery, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States) and Toshiba Stroke Research Center, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States); Department of Neurosurgery, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States); Department of Computer Science, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States) and Toshiba Stroke Research Center, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States); Department of Neurosurgery, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 and Toshiba Stroke Research Center, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States); Department of Mechanical Engineering, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States); Department of Neurosurgery, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States); Department of Computer Science, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States) and Toshiba Stroke Research Center, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States)

    2010-02-15

Purpose: Cone-beam rotational angiography (CBRA) is widely used in modern clinical settings. In a number of procedures, the area of interest is considerably smaller than the field of view (FOV) of the detector, subjecting the patient to potentially unnecessary x-ray dose. The authors therefore propose a filter-based method to reduce the dose in the regions of low interest while supplying high image quality in the region of interest (ROI). Methods: For such procedures, the authors propose a method of filtered region of interest (FROI)-CBRA. In the authors' approach, a gadolinium filter with a circular central opening is placed in the x-ray beam during image acquisition. The central region is imaged with high contrast, while peripheral regions are subjected to substantially lower intensity and dose through beam filtering. The resulting images contain a high-contrast/intensity ROI, a low-contrast/intensity peripheral region, and a transition region in between. To equalize the two regions' intensities, the first projection of the acquisition is performed with and without the filter in place. The equalization relationship, based on Beer's law, is established through linear regression using corresponding filtered and nonfiltered data. The transition region is equalized based on radial profiles. Results: Evaluations in 2D and 3D show no visible difference between conventional and FROI-CBRA projection images and reconstructions in the ROI. CNR evaluations show similar image quality in the ROI, with a reduced CNR in the reconstructed peripheral region. In all filtered projection images, the scatter fraction inside the ROI was reduced. Theoretical and experimental dose evaluations show a considerable dose reduction; using an ROI half the original FOV reduces the dose by 60% for a filter thickness of 1.29 mm. Conclusions: These results indicate the potential of FROI-CBRA to reduce the dose to the patient while supplying the physician with

  20. Characteristics of Quoit filter, a digital filter developed for the extraction of circumscribed shadows, and its applications to mammograms

    International Nuclear Information System (INIS)

    Isobe, Yoshiaki; Ohkubo, Natsumi; Yamamoto, Shinji; Toriwaki, Jun-ichiro; Kobatake, Hidefumi.

    1993-01-01

This paper presents a newly developed filter called the Quoit filter, which detects circumscribed shadows (concentric circular isolated images), such as typical cancer regions. The Quoit filter is based on mathematical morphology and has the following interesting properties. (1) The output of this filter can be expressed analytically when the input image is assumed to be a concentric circular model (the output is predictable for typical inputs). (2) The filter can selectively reconstruct the original isolated models mentioned in (1) when it is applied sequentially twice. The filter was tested on the detection of cancer regions in X-ray mammograms; for 12 cancer mammograms, it achieved a true-positive cancer detection rate of 100%. (author)

  1. Despeckle filtering for ultrasound imaging and video II selected applications

    CERN Document Server

    Loizou, Christos P

    2015-01-01

In ultrasound imaging and video, visual perception is hindered by multiplicative speckle noise that degrades quality. Noise reduction is therefore essential for improving the visual observation quality, or as a pre-processing step for further automated analysis such as image/video segmentation, texture analysis and encoding in ultrasound imaging and video. The goal of the first book (book 1 of 2) was to introduce the problem of speckle in ultrasound image and video as well as the theoretical background, algorithmic steps, and the Matlab code for the following group of despeckle filters:

  2. Three-State Locally Adaptive Texture Preserving Filter for Radar and Optical Image Processing

    Directory of Open Access Journals (Sweden)

    Jaakko T. Astola

    2005-05-01

Full Text Available Textural features are one of the most important types of useful information contained in images. In practice, these features are commonly masked by noise, yet relatively little attention has been paid to the texture-preserving properties of noise attenuation methods. This stimulates solving the following tasks: (1) to analyze the texture-preservation properties of various filters; and (2) to design image processing methods capable of preserving texture features well and effectively reducing noise. This paper deals with examining the texture-feature-preserving properties of different filters. The study is performed for a set of texture samples and different noise variances. Locally adaptive three-state schemes are proposed in which texture is considered as a particular class. For the “detection” of texture regions, several classifiers are proposed and analyzed. As shown, an appropriate trade-off of the designed filter properties is provided. This is demonstrated quantitatively for artificial test images and confirmed visually for real-life images.

  3. Compressive Sampling based Image Coding for Resource-deficient Visual Communication.

    Science.gov (United States)

    Liu, Xianming; Zhai, Deming; Zhou, Jiantao; Zhang, Xinfeng; Zhao, Debin; Gao, Wen

    2016-04-14

    In this paper, a new compressive sampling based image coding scheme is developed to achieve competitive coding efficiency at lower encoder computational complexity, while supporting error resilience. This technique is particularly suitable for visual communication with resource-deficient devices. At the encoder, a compact image representation is produced, which is a polyphase down-sampled version of the input image; but the conventional low-pass filter prior to down-sampling is replaced by a local random binary convolution kernel. The pixels of the resulting down-sampled pre-filtered image are local random measurements placed in the original spatial configuration. The advantages of local random measurements are twofold: 1) they preserve high-frequency image features that would otherwise be discarded by low-pass filtering; 2) the result remains a conventional image and can therefore be coded by any standardized codec to remove statistical redundancy at larger scales. Moreover, measurements generated by different kernels can be considered as multiple descriptions of the original image, so the proposed scheme also has the advantage of multiple description coding. At the decoder, a unified sparsity-based soft-decoding technique is developed to recover the original image from the received measurements in a compressive sensing framework. Experimental results demonstrate that the proposed scheme is competitive compared with existing methods, with a unique strength of recovering fine details and sharp edges at low bit-rates.
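
    The measurement step described above can be sketched as follows. This is an illustrative reading of the idea only: the per-patch kernel draw, the 0/1 kernel entries and the normalization are assumptions, not the paper's exact construction.

```python
import random

def random_binary_downsample(img, factor=2, seed=0):
    """Polyphase down-sampling where each output pixel is a local random
    binary measurement of its factor x factor patch, instead of the
    output of a fixed low-pass pre-filter."""
    rng = random.Random(seed)
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h - factor + 1, factor):
        row = []
        for x in range(0, w - factor + 1, factor):
            # Draw a fresh random binary kernel over the patch.
            kernel = [[rng.randint(0, 1) for _ in range(factor)]
                      for _ in range(factor)]
            s = sum(kernel[j][i] for j in range(factor) for i in range(factor))
            acc = sum(kernel[j][i] * img[y + j][x + i]
                      for j in range(factor) for i in range(factor))
            # Normalize so the measurement stays in the image's value range;
            # fall back to plain sub-sampling if the kernel is all zeros.
            row.append(acc / s if s else img[y][x])
        out.append(row)
    return out
```

    Because each measurement is a convex combination of patch pixels, the result is still an ordinary image and can be fed to any standard codec, as the abstract notes.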

  4. Context-based adaptive filtering of interest points in image retrieval

    DEFF Research Database (Denmark)

    Nguyen, Phuong Giang; Andersen, Hans Jørgen

    2009-01-01

    Interest points have been used as local features with success in many computer vision applications such as image/video retrieval and object recognition. However, a major issue when using this approach is the large number of interest points detected from each image, which creates a dense feature space...... a subset of features. Our approach differs from others in that the selected features are based on the context of the given image. Our experimental results show a significant reduction rate of features while preserving the retrieval performance....

  5. Polarization-Insensitive Tunable Optical Filters based on Liquid Crystal Polarization Gratings

    Science.gov (United States)

    Nicolescu, Elena

    Tunable optical filters are widely used for a variety of applications including spectroscopy, optical communication networks, remote sensing, and biomedical imaging and diagnostics. All of these application areas can greatly benefit from improvements in the key characteristics of the tunable optical filters embedded in them. Some of these key parameters include peak transmittance, bandwidth, tuning range, and transition width. In recent years research efforts have also focused on miniaturizing tunable optical filters into physically small packages for compact portable spectroscopy and hyperspectral imaging applications such as real-time medical diagnostics and defense applications. However, it is important that miniaturization not have a detrimental effect on filter performance. The overarching theme of this dissertation is to explore novel configurations of Polarization Gratings (PGs) as simple, low-cost, polarization-insensitive alternatives to conventional optical filtering technologies for applications including hyperspectral imaging and telecommunications. We approach this goal from several directions with a combination of theory and experimental demonstration leading to, in our opinion, a significant contribution to the field. We present three classes of tunable optical filters, the first of which is an angle-filtering scheme where the stop-band wavelengths are redirected off axis and the passband is transmitted on-axis. This is achieved using a stacked configuration of polarization gratings of various thicknesses. To improve this class of filter, we also introduce a novel optical element, the Bilayer Polarization Grating, exhibiting unique optical properties and demonstrating complex anchoring conditions with high quality. The second class of optical filter is analogous to a Lyot filter, utilizing stacks of static or tunable waveplates sandwiched with polarizing elements. However, we introduce a new configuration using PGs and static waveplates to replace

  6. Switching non-local median filter

    Science.gov (United States)

    Matsuoka, Jyohei; Koga, Takanori; Suetake, Noriaki; Uchino, Eiji

    2015-06-01

    This paper describes a novel image filtering method for the removal of random-valued impulse noise superimposed on grayscale images. Generally, it is well known that switching-type median filters are effective for impulse noise removal. In this paper, we propose a more sophisticated switching-type impulse noise removal method in terms of detail-preserving performance. Specifically, the noise detector of the proposed method identifies noise-corrupted pixels by focusing on the difference between the value of a pixel of interest (POI) and the median of its neighboring pixel values, and on the POI's tendency to be isolated from the surrounding pixels. Furthermore, the removal of the detected noise is performed by the newly proposed median filter based on non-local processing, which has superior detail-preservation capability compared to the conventional median filter. The effectiveness and validity of the proposed method are verified by experiments using natural grayscale images.
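
    The classic switching scheme this method builds on can be sketched as follows: a pixel is treated as impulse noise only if it deviates strongly from its 3x3 neighborhood median, and only detected pixels are replaced, so clean detail passes through untouched. The threshold is an illustrative assumption, and the paper's non-local replacement step is not reproduced here.

```python
def switching_median(img, threshold=40):
    """Switching median filter sketch on a 2D list of grayscale values.
    Border pixels are copied unchanged for simplicity."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            win = sorted(img[y + j][x + i]
                         for j in (-1, 0, 1) for i in (-1, 0, 1))
            med = win[4]                              # median of the 3x3 window
            if abs(img[y][x] - med) > threshold:      # impulse noise detector
                out[y][x] = med                       # replace only detected pixels
    return out
```

    Compared with a plain median filter, the detector is what provides the detail preservation the abstract emphasizes: uncorrupted pixels are never modified.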

  7. Unsupervised Retinal Vessel Segmentation Using Combined Filters.

    Directory of Open Access Journals (Sweden)

    Wendeson S Oliveira

    Full Text Available Image segmentation of retinal blood vessels is a process that can help to predict and diagnose cardiovascular-related diseases, such as hypertension and diabetes, which are known to affect the appearance of the retinal blood vessels. This work proposes an unsupervised method for the segmentation of retinal vessel images using a combination of the matched filter, Frangi's filter and the Gabor wavelet filter to enhance the images. Combining these three filters to improve the segmentation is the main motivation of this work. We investigate two approaches to perform the filter combination: weighted mean and median ranking. Segmentation methods are tested after the vessel enhancement. Enhanced images obtained with median ranking are segmented using a simple threshold criterion. Two segmentation procedures are applied to images enhanced with the weighted mean approach: the first is based on deformable models and the second uses fuzzy C-means. The procedure is evaluated using two public image databases, DRIVE and STARE. The experimental results demonstrate that the proposed methods perform well for vessel segmentation in comparison with state-of-the-art methods.
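
    The median-ranking fusion followed by simple thresholding might look like the sketch below. The three enhancement maps (matched, Frangi, Gabor responses) are assumed to be precomputed and registered, and the threshold value is illustrative, not the paper's.

```python
def combine_median(responses):
    """Per-pixel median of several equally sized 2D enhancement maps."""
    h, w = len(responses[0]), len(responses[0][0])
    fused = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = sorted(r[y][x] for r in responses)
            n, m = len(vals), len(vals) // 2
            fused[y][x] = vals[m] if n % 2 else 0.5 * (vals[m - 1] + vals[m])
    return fused

def threshold_segment(img, t):
    """Simple global threshold applied after the median-ranking fusion."""
    return [[1 if v >= t else 0 for v in row] for row in img]
```

    The weighted-mean alternative mentioned in the abstract would simply replace the per-pixel median with a weighted sum of the three responses.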

  8. Characterization of the Edges and Contrasts in a digital image with the variation of the Parameters of the High-pass Filters used in the Estimation of Atmospheric Visibility

    Directory of Open Access Journals (Sweden)

    Martha C. Guzmán-Zapata

    2013-11-01

    Full Text Available This paper considers the edges and contrasts obtained with the high-pass filters used in estimating daytime atmospheric visibility from digital images, and characterizes the behavior of these edges and contrasts as the parameters of high-pass filters such as the ideal, Gaussian, and homomorphic-Gaussian filters are varied. A synthetic image of regions with different contrasts is used to apply the different filters; we then define an index to measure the quality of the edges obtained in the filtered image and use it to analyze the results. The results show that both the filter selection and the selection of its parameters affect the characteristics and quality of the detected edges in the filtered image, determine the amount of noise the filter adds to the image (artifacts that were not present in the original image), and establish whether edge detection is achieved at all. The results also show that the edge quality index reaches maximum values at certain combinations of the filter parameters, meaning that some parameter combinations reduce the situations that distort the edges and hence distort atmospheric visibility measures based on the Fourier transform. The parameter values that provide maximum-quality edges are therefore established as suitable for use in visibility measurement.

  9. Application of a novel Kalman filter based block matching method to ultrasound images for hand tendon displacement estimation.

    Science.gov (United States)

    Lai, Ting-Yu; Chen, Hsiao-I; Shih, Cho-Chiang; Kuo, Li-Chieh; Hsu, Hsiu-Yun; Huang, Chih-Chung

    2016-01-01

    Information about tendon displacement is important for allowing clinicians to not only quantify preoperative tendon injuries but also to identify any adhesive scarring between the tendon and adjacent tissue. The Fisher-Tippett (FT) similarity measure has recently been shown to be more accurate than the Laplacian sum of absolute differences (SAD) and Gaussian sum of squared differences (SSD) similarity measures for tracking tendon displacement in ultrasound B-mode images. However, all of these similarity measures can easily be influenced by the quality of the ultrasound image, particularly its signal-to-noise ratio. Ultrasound images of injured hands are unfortunately often of poor quality due to the presence of adhesive scars. The present study investigated a novel Kalman-filter scheme for overcoming this problem. Three state-of-the-art tracking methods (FT, SAD, and SSD) were used to track the displacements of phantom and cadaver tendons, while FT was used to track human tendons. These three tracking methods were combined individually with the proposed Kalman-filter (K1) scheme and another Kalman-filter scheme used in a previous study to optimize the displacement trajectories of the phantom and cadaver tendons. The motion of the human extensor digitorum communis tendon was measured in the present study using the FT-K1 scheme. The experimental results indicated that SSD exhibited better accuracy in the phantom experiments, whereas FT exhibited better performance for tracking real tendon motion in the cadaver experiments. All three tracking methods were influenced by the signal-to-noise ratio of the images. On the other hand, the K1 scheme was able to optimize the tracking trajectory of displacement in all experiments, even from a location with poor image quality. The human experimental data indicated that the normal tendons were displaced more than the injured tendons, and that the motion ability of the injured tendon was restored after appropriate rehabilitation.

  10. Graphic filter library implemented in CUDA language

    OpenAIRE

    Peroutková, Hedvika

    2009-01-01

    This thesis deals with the problem of reducing the computation time of raster image processing by parallel computing on a graphics processing unit. Raster image processing here refers to the application of graphic filters, which can be applied in sequence with different settings. The thesis evaluates the suitability of GPU parallelization for raster image adjustments based on multicriterial choice. The filters are implemented for the graphics processing unit in the CUDA language. Opacity ...

  11. Cross-correlated imaging of distributed mode filtering rod fiber

    DEFF Research Database (Denmark)

    Laurila, Marko; Barankov, Roman; Jørgensen, Mette Marie

    2013-01-01

    We analyze the modal properties of an 85μm core distributed mode filtering rod fiber using cross-correlated (C2) imaging. We evaluate suppression of higher-order modes (HOMs) under severely misaligned mode excitation and identify a single-mode regime where HOMs are suppressed by more than 20dB....

  12. Kalman filtered MR temperature imaging for laser induced thermal therapies.

    Science.gov (United States)

    Fuentes, D; Yung, J; Hazle, J D; Weinberg, J S; Stafford, R J

    2012-04-01

    The feasibility of using a stochastic form of Pennes bioheat model within a 3-D finite element based Kalman filter (KF) algorithm is critically evaluated for the ability to provide temperature field estimates in the event of magnetic resonance temperature imaging (MRTI) data loss during laser induced thermal therapy (LITT). The ability to recover missing MRTI data was analyzed by systematically removing spatiotemporal information from a clinical MR-guided LITT procedure in human brain and comparing predictions in these regions to the original measurements. Performance was quantitatively evaluated in terms of a dimensionless L(2) (RMS) norm of the temperature error weighted by acquisition uncertainty. During periods of no data corruption, observed error histories demonstrate that the Kalman algorithm does not alter the high quality temperature measurement provided by MR thermal imaging. The KF-MRTI implementation considered is seen to predict the bioheat transfer with RMS error 10 sec.

  13. Applying Enhancement Filters in the Pre-processing of Images of Lymphoma

    International Nuclear Information System (INIS)

    Silva, Sérgio Henrique; Do Nascimento, Marcelo Zanchetta; Neves, Leandro Alves; Batista, Valério Ramos

    2015-01-01

    Lymphoma is a type of cancer that affects the immune system and is classified as Hodgkin or non-Hodgkin. It is one of the ten most common types of cancer worldwide: among all malignant neoplasms diagnosed in the world, lymphoma accounts for three to four percent. Our work presents a study of some filters devoted to enhancing images of lymphoma at the pre-processing step. Here the enhancement is useful for removing noise from the digital images. We have analysed the noise caused by different sources, such as room vibration, scraps and defocusing, in the following classes of lymphoma: follicular, mantle cell and B-cell chronic lymphocytic leukemia. The Gaussian, median and mean-shift filters were applied to different colour models (RGB, Lab and HSV). Afterwards, we performed a quantitative analysis of the images by means of the Structural Similarity Index, in order to evaluate the similarity between the images. In all cases we obtained a certainty of at least 75%, which rises to 99% if one considers only HSV. Namely, we have concluded that HSV is an important choice of colour model for pre-processing histological images of lymphoma, because in this case the resulting image gets the best enhancement.
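
    The Structural Similarity Index used for the quantitative comparison is normally computed over local windows; the whole-image sketch below shows just the SSIM formula itself, with the standard stabilization constants for 8-bit images. It is a simplified illustration, not the evaluation code used in the study.

```python
def ssim(x, y, c1=(0.01 * 255) ** 2, c2=(0.03 * 255) ** 2):
    """Single-window SSIM between two equally sized grayscale images
    given as 2D lists: luminance, contrast and structure in one formula."""
    xs = [v for row in x for v in row]
    ys = [v for row in y for v in row]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    vx = sum((v - mx) ** 2 for v in xs) / (n - 1)
    vy = sum((v - my) ** 2 for v in ys) / (n - 1)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / (n - 1)
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
```

    Identical images score exactly 1; any filtering that shifts brightness or alters structure pulls the score below 1, which is what makes the index usable as a similarity measure between original and enhanced images.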

  14. SU-D-207-07: Implementation of Full/half Bowtie Filter Model in a Commercial Treatment Planning System for Kilovoltage X-Ray Imaging Dose Estimation

    International Nuclear Information System (INIS)

    Kim, S; Alaei, P

    2015-01-01

    Purpose: To implement full/half bowtie filter models in a commercial treatment planning system (TPS) to calculate the kilovoltage (kV) x-ray imaging dose of the Varian On-Board Imager (OBI) cone beam CT (CBCT) system. Methods: Full/half bowtie filters of the Varian OBI were created as compensator models in the Pinnacle TPS (version 9.6) using Matlab software (version 2011a). The profiles of both bowtie filters were acquired from the manufacturer, imported into the Matlab system and hard coded in binary file format. A Pinnacle script was written to import each bowtie filter data set into a Pinnacle treatment plan as a compensator. A kV x-ray beam model without the compensator model was commissioned for each bowtie filter setting based on percent depth dose and lateral profile data acquired from Monte Carlo simulations. To validate the bowtie filter models, a rectangular water phantom was generated in the planning system and an anterior/posterior beam with each bowtie filter was created. Using the Pinnacle script, each bowtie filter compensator was added to the treatment plan. The lateral profile at a depth of 3 cm and the percent depth dose were measured using an ion chamber and compared with the data extracted from the treatment plans. Results: The kV x-ray beams for both the full and half bowtie filters have been modeled in a commercial TPS. The differences in lateral and depth dose profiles between the dose calculations and ion chamber measurements were within 6%. Conclusion: Both full/half bowtie filter models provide reasonable results in kV x-ray dose calculations in the water phantom. This study demonstrates the possibility of using a model-based treatment planning system to calculate the kV imaging dose for both full and half bowtie filter modes. Further study is to be performed to evaluate the models in clinical situations.

  15. Ensemble-based Kalman Filters in Strongly Nonlinear Dynamics

    Institute of Scientific and Technical Information of China (English)

    Zhaoxia PU; Joshua HACKER

    2009-01-01

    This study examines the effectiveness of ensemble Kalman filters in data assimilation with the strongly nonlinear dynamics of the Lorenz-63 model, and in particular their use in predicting the regime transition that occurs when the model jumps from one basin of attraction to the other. Four configurations of the ensemble-based Kalman filtering data assimilation techniques, including the ensemble Kalman filter, ensemble adjustment Kalman filter, ensemble square root filter and ensemble transform Kalman filter, are evaluated with their ability in predicting the regime transition (also called phase transition) and also are compared in terms of their sensitivity to both observational and sampling errors. The sensitivity of each ensemble-based filter to the size of the ensemble is also examined.
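
    All the ensemble variants compared above share the same core analysis step. A minimal sketch of one stochastic EnKF update for a scalar state with a direct observation (H = 1) is shown below; it illustrates the Kalman gain computed from the ensemble spread, using the perturbed-observation scheme, and is not tied to the paper's Lorenz-63 experiments.

```python
import random

def enkf_update(ensemble, obs, obs_var, seed=0):
    """One stochastic EnKF analysis step for a scalar state.
    ensemble: list of prior member values; returns the posterior members."""
    rng = random.Random(seed)
    n = len(ensemble)
    mean = sum(ensemble) / n
    # Sample background variance estimated from the ensemble itself.
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    k = var / (var + obs_var)                      # Kalman gain
    # Each member assimilates an independently perturbed observation.
    return [x + k * (obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]
```

    Square-root variants (EAKF, EnSRF, ETKF) avoid the observation perturbations by deterministically transforming the ensemble, which is one source of the sensitivity differences the study examines.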

  16. FLOWING BILATERAL FILTER: DEFINITION AND IMPLEMENTATIONS

    Directory of Open Access Journals (Sweden)

    Maxime Moreaud

    2015-06-01

    Full Text Available The bilateral filter plays a key role in image processing applications due to its intuitive parameterization and its high-quality filter result, smoothing homogeneous regions while preserving the edges of objects. Considering the image as a topological relief, seeing pixel intensities as peaks and valleys, we introduce a way to control the tonal weighting coefficients, the flowing bilateral filter, reducing the "halo" artifacts typically produced by the regular bilateral filter around a large peak surrounded by two valleys of lower values. In this paper we investigate exact and approximated versions of CPU and parallel GPU (Graphics Processing Unit) based implementations of the regular and flowing bilateral filter using the NVidia CUDA API. Fast implementations of these filters are important for the processing of large 3D volumes of up to several GB acquired by x-ray or electron tomography.
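
    For reference, the regular bilateral filter that the flowing variant modifies weights each neighbor by the product of a spatial Gaussian and a tonal (range) Gaussian. A direct pure-Python sketch, with illustrative sigma values:

```python
import math

def bilateral(img, sigma_s=1.0, sigma_r=25.0, radius=2):
    """Regular bilateral filter on a 2D grayscale list: smooths
    homogeneous regions while preserving strong edges."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = norm = 0.0
            for j in range(-radius, radius + 1):
                for i in range(-radius, radius + 1):
                    yy, xx = y + j, x + i
                    if 0 <= yy < h and 0 <= xx < w:
                        # Spatial closeness times tonal similarity.
                        ws = math.exp(-(i * i + j * j) / (2 * sigma_s ** 2))
                        wr = math.exp(-((img[yy][xx] - img[y][x]) ** 2)
                                      / (2 * sigma_r ** 2))
                        acc += ws * wr * img[yy][xx]
                        norm += ws * wr
            out[y][x] = acc / norm
    return out
```

    Across a strong step edge the tonal weight collapses to nearly zero, so neither side bleeds into the other; the flowing variant described above additionally reshapes these tonal weights to suppress halos around isolated peaks.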

  17. A mobile and web application-based recommendation system using color quantization and collaborative filtering

    OpenAIRE

    KAYA, FİDAN; YILDIZ, GÜREL; KAVAK, ADNAN

    2015-01-01

    In this paper, a recommendation system based on a mobile and web application is proposed for indoor decoration. The main contribution of this work is to apply two-stage filtering using linear matching and collaborative filtering to make recommendations. In the mobile application part, the image of the medium captured by a mobile phone is analyzed using color quantization methods, and these color analysis results along with other user-defined parameters such as height, width, and type of the p...

  18. Image processing based detection of lung cancer on CT scan images

    Science.gov (United States)

    Abdillah, Bariqi; Bustamam, Alhadi; Sarwinda, Devvi

    2017-10-01

    In this paper, we implement and analyze an image processing method for the detection of lung cancer. Image processing techniques are widely used in several medical problems for picture enhancement in the detection phase to support early medical treatment. In this research we propose a detection method for lung cancer based on image segmentation, an intermediate level of image processing. Marker-controlled watershed and region growing approaches are used to segment the CT scan images. The detection phase comprises image enhancement using a Gabor filter, image segmentation, and feature extraction. The experimental results show the effectiveness of our approach: the best approach for main feature detection is watershed with the masking method, which has high accuracy and is robust.
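
    Of the two segmentation approaches mentioned, region growing is the simpler to illustrate: starting from a seed pixel, neighbors are absorbed while their intensity stays within a tolerance of the seed. The breadth-first sketch below is a generic version of that step under an assumed fixed-reference criterion, not the paper's exact pipeline.

```python
from collections import deque

def region_grow(img, seed, tol=10):
    """Grow a binary mask from `seed` (y, x) over 4-connected neighbors
    whose intensity lies within `tol` of the seed intensity."""
    h, w = len(img), len(img[0])
    sy, sx = seed
    ref = img[sy][sx]
    mask = [[0] * w for _ in range(h)]
    mask[sy][sx] = 1
    q = deque([seed])
    while q:
        y, x = q.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not mask[ny][nx] \
                    and abs(img[ny][nx] - ref) <= tol:
                mask[ny][nx] = 1
                q.append((ny, nx))
    return mask
```

    Marker-controlled watershed plays a complementary role: the markers constrain where watershed basins may start, which is what suppresses the over-segmentation a plain watershed would produce.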

  19. DAF: differential ACE filtering image quality assessment by automatic color equalization

    Science.gov (United States)

    Ouni, S.; Chambah, M.; Saint-Jean, C.; Rizzi, A.

    2008-01-01

    Ideally, a quality assessment system would perceive and measure image or video impairments just like a human being. But in reality, objective quality metrics do not necessarily correlate well with perceived quality [1]. Moreover, some measures assume that a reference exists in the form of an "original" to compare to, which prevents their use in the digital restoration field, where often there is no reference to compare against. That is why subjective evaluation has been the most used and most efficient approach up to now. But subjective assessment is expensive and time consuming, and hence does not meet economic requirements [2,3]. Thus, reliable automatic methods for visual quality assessment are needed in the field of digital film restoration. The ACE method, for Automatic Color Equalization [4,6], is an algorithm for unsupervised enhancement of digital images. It is based on a new computational approach that tries to model the perceptual response of our vision system, merging the Gray World and White Patch equalization mechanisms in a global and local way. Like our vision system, ACE is able to adapt to widely varying lighting conditions and to extract visual information from the environment efficaciously. Moreover, ACE can be run in an unsupervised manner, making it very useful as a digital film restoration tool, since no a priori information is available. In this paper we deepen the investigation of using the ACE algorithm as a basis for reference-free image quality evaluation. This new metric, called DAF for Differential ACE Filtering [7], is an objective quality measure that can be used in several image restoration and image quality assessment systems. In this paper, we compare, on different image databases, the results obtained with DAF and with some subjective image quality assessments (Mean Opinion Score, MOS, as a measure of perceived image quality). We also study the correlation between the objective measure and MOS. In our experiments, we have used for the first image
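
    One of the two mechanisms ACE merges, the Gray World assumption, is simple enough to sketch on its own: each channel is rescaled so that its mean matches the global mean intensity. This is an illustration of that single mechanism, not of the full ACE algorithm.

```python
def gray_world(rgb):
    """Gray World correction on a 2D list of (r, g, b) tuples:
    scale each channel so its mean equals the global mean intensity."""
    pixels = [p for row in rgb for p in row]
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(means) / 3.0
    gains = [gray / m if m else 1.0 for m in means]
    return [[tuple(min(255.0, p[c] * gains[c]) for c in range(3))
             for p in row] for row in rgb]
```

    The White Patch mechanism would instead normalize by each channel's maximum; ACE's contribution is blending both behaviors globally and locally.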

  20. Bandwidth Controllable Tunable Filter for Hyper-/Multi-Spectral Imager, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR Phase I proposal introduces a fast speed bandwidth controllable tunable filter for hyper-/multi-spectral (HS/MS) imagers. It dynamically passes a variable...

  1. Noninvasive mapping of water diffusional exchange in the human brain using filter-exchange imaging.

    Science.gov (United States)

    Nilsson, Markus; Lätt, Jimmy; van Westen, Danielle; Brockstedt, Sara; Lasič, Samo; Ståhlberg, Freddy; Topgaard, Daniel

    2013-06-01

    We present the first in vivo application of the filter-exchange imaging protocol for diffusion MRI. The protocol allows noninvasive mapping of the rate of water exchange between microenvironments with different self-diffusivities, such as the intracellular and extracellular spaces in tissue. Since diffusional water exchange across the cell membrane is a fundamental process in human physiology and pathophysiology, clinically feasible and noninvasive imaging of the water exchange rate would offer new means to diagnose disease and monitor treatment response in conditions such as cancer and edema. The in vivo use of filter-exchange imaging was demonstrated by studying the brain of five healthy volunteers and one intracranial tumor (meningioma). Apparent exchange rates in white matter range from 0.8±0.08 s⁻¹ in the internal capsule, to 1.6±0.11 s⁻¹ for frontal white matter, indicating that low values are associated with high myelination. Solid tumor displayed values of up to 2.9±0.8 s⁻¹. In white matter, the apparent exchange rate values suggest intra-axonal exchange times in the order of seconds, confirming the slow exchange assumption in the analysis of diffusion MRI data. We propose that filter-exchange imaging could be used clinically to map the water exchange rate in pathologies. Filter-exchange imaging may also be valuable for evaluating novel therapies targeting the function of aquaporins. Copyright © 2012 Wiley Periodicals, Inc.

  2. Spatio-spectral color filter array design for optimal image recovery.

    Science.gov (United States)

    Hirakawa, Keigo; Wolfe, Patrick J

    2008-10-01

    In digital imaging applications, data are typically obtained via a spatial subsampling procedure implemented as a color filter array-a physical construction whereby only a single color value is measured at each pixel location. Owing to the growing ubiquity of color imaging and display devices, much recent work has focused on the implications of such arrays for subsequent digital processing, including in particular the canonical demosaicking task of reconstructing a full color image from spatially subsampled and incomplete color data acquired under a particular choice of array pattern. In contrast to the majority of the demosaicking literature, we consider here the problem of color filter array design and its implications for spatial reconstruction quality. We pose this problem formally as one of simultaneously maximizing the spectral radii of luminance and chrominance channels subject to perfect reconstruction, and-after proving sub-optimality of a wide class of existing array patterns-provide a constructive method for its solution that yields robust, new panchromatic designs implementable as subtractive colors. Empirical evaluations on multiple color image test sets support our theoretical results, and indicate the potential of these patterns to increase spatial resolution for fixed sensor size, and to contribute to improved reconstruction fidelity as well as significantly reduced hardware complexity.
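
    The conventional baseline against which such optimized designs are measured is the Bayer color filter array, where each pixel keeps exactly one of the three color values in a G-R / B-G tiling. A sketch of that sampling step (one common Bayer phase; other phase conventions exist):

```python
def bayer_mosaic(rgb):
    """Sample an RGB image (2D list of (r, g, b) tuples) through a
    Bayer pattern, keeping a single color value per pixel location."""
    h, w = len(rgb), len(rgb[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            r, g, b = rgb[y][x]
            if (y + x) % 2 == 0:
                out[y][x] = g        # green on the checkerboard diagonal
            elif y % 2 == 0:
                out[y][x] = r        # red on even rows
            else:
                out[y][x] = b        # blue on odd rows
    return out
```

    Demosaicking is the inverse problem of recovering the full three-channel image from this single-channel mosaic; the paper's contribution is to redesign the pattern itself so that the luminance and chrominance spectra overlap as little as possible before that reconstruction.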

  3. Scattering angle-based filtering via extension in velocity

    KAUST Repository

    Kazei, Vladimir; Tessmer, Ekkehart; Alkhalifah, Tariq

    2016-01-01

    The scattering angle between the source and receiver wavefields can be utilized in full-waveform inversion (FWI) and in reverse-time migration (RTM) for regularization and quality control or to remove low frequency artifacts. The access to the scattering angle information is costly as the relation between local image features and scattering angles has non-stationary nature. For the purpose of a more efficient scattering angle information extraction, we develop techniques that utilize the simplicity of the scattering angle based filters for constantvelocity background models. We split the background velocity model into several domains with different velocity ranges, generating an

  4. Scattering angle-based filtering via extension in velocity

    KAUST Repository

    Kazei, Vladimir

    2016-09-06

    The scattering angle between the source and receiver wavefields can be utilized in full-waveform inversion (FWI) and in reverse-time migration (RTM) for regularization and quality control or to remove low frequency artifacts. The access to the scattering angle information is costly as the relation between local image features and scattering angles has non-stationary nature. For the purpose of a more efficient scattering angle information extraction, we develop techniques that utilize the simplicity of the scattering angle based filters for constantvelocity background models. We split the background velocity model into several domains with different velocity ranges, generating an

  5. Real-time MR diffusion tensor and Q-ball imaging using Kalman filtering

    International Nuclear Information System (INIS)

    Poupon, C.; Roche, A.; Dubois, J.; Mangin, J.F.; Poupon, F.

    2008-01-01

    Diffusion magnetic resonance imaging (dMRI) has become an established research tool for the investigation of tissue structure and orientation. In this paper, we present a method for real-time processing of diffusion tensor and Q-ball imaging. The basic idea is to use Kalman filtering framework to fit either the linear tensor or Q-ball model. Because the Kalman filter is designed to be an incremental algorithm, it naturally enables updating the model estimate after the acquisition of any new diffusion-weighted volume. Processing diffusion models and maps during ongoing scans provides a new useful tool for clinicians, especially when it is not possible to predict how long a subject may remain still in the magnet. First, we introduce the general linear models corresponding to the two diffusion tensor and analytical Q-ball models of interest. Then, we present the Kalman filtering framework and we focus on the optimization of the diffusion orientation sets in order to speed up the convergence of the online processing. Last, we give some results on a healthy volunteer for the online tensor and the Q-ball model, and we make some comparisons with the conventional offline techniques used in the literature. We could achieve full real-time for diffusion tensor imaging and deferred time for Q-ball imaging, using a single workstation. (authors)
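
    The incremental fitting idea is a Kalman filter over a static parameter vector, i.e. recursive least squares: each new measurement updates the estimate without refitting from scratch. A minimal pure-Python sketch for a general linear model y = g·x, with an assumed measurement-noise variance and prior covariance (not the authors' diffusion-specific design matrices):

```python
def kalman_linear_fit(samples, dim, r=1.0, p0=1000.0):
    """Incrementally fit y = g . x from (g, y) pairs with a Kalman
    filter whose state is the static parameter vector x."""
    x = [0.0] * dim                                     # current estimate
    P = [[p0 if i == j else 0.0 for j in range(dim)]    # covariance
         for i in range(dim)]
    for g, y in samples:                                # one sample at a time
        Pg = [sum(P[i][j] * g[j] for j in range(dim)) for i in range(dim)]
        s = sum(g[i] * Pg[i] for i in range(dim)) + r   # innovation variance
        k = [Pg[i] / s for i in range(dim)]             # Kalman gain
        e = y - sum(g[i] * x[i] for i in range(dim))    # innovation
        x = [x[i] + k[i] * e for i in range(dim)]
        P = [[P[i][j] - k[i] * Pg[j] for j in range(dim)]
             for i in range(dim)]
    return x
```

    Because each update costs only O(dim²), the model estimate can be refreshed after every newly acquired diffusion-weighted volume, which is exactly what makes the online tensor and Q-ball maps feasible during an ongoing scan.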

  6. Video-based noncooperative iris image segmentation.

    Science.gov (United States)

    Du, Yingzi; Arslanturk, Emrah; Zhou, Zhi; Belcher, Craig

    2011-02-01

    In this paper, we propose a video-based noncooperative iris image segmentation scheme that incorporates a quality filter to quickly eliminate images without an eye, employs a coarse-to-fine segmentation scheme to improve the overall efficiency, uses a direct least squares fitting of ellipses method to model the deformed pupil and limbic boundaries, and develops a window gradient-based method to remove noise in the iris region. A remote iris acquisition system is set up to collect noncooperative iris video images. An objective method is used to quantitatively evaluate the accuracy of the segmentation results. The experimental results demonstrate the effectiveness of this method. The proposed method would make noncooperative iris recognition or iris surveillance possible.

  7. Evaluation of non-linear adaptive smoothing filter by digital phantom

    International Nuclear Information System (INIS)

    Sato, Kazuhiro; Ishiya, Hiroki; Oshita, Ryosuke; Yanagawa, Isao; Goto, Mitsunori; Mori, Issei

    2008-01-01

    As a result of the development of multi-slice CT, diagnoses based on three-dimensional reconstruction images and multi-planar reconstruction have spread. For these applications, which require high z-resolution, thin-slice imaging is essential. However, because z-resolution always trades off against image noise, thin-slice imaging is necessarily accompanied by an increase in noise level. To improve the quality of thin-slice images, a non-linear adaptive smoothing filter has been developed and is now widely applied in clinical use. We developed a digital bar-pattern phantom for the purpose of evaluating the effect of this filter, and attempted an evaluation based on an addition image of the bar-pattern phantom and the image of a water phantom. The effect of this filter changed in a complex manner with the contrast and spatial frequency of the original image. We confirmed the reduction of image noise in the low-frequency component of the image, but observed decreased contrast or an increased quantity of noise in the high-frequency component. This result reflects the adaptive behavior of the filter. The digital phantom was useful for this evaluation, but to understand the total effect of filtering, considerable improvement of the shape of the digital phantom is required. (author)

  8. A Novel Segmentation Approach Combining Region- and Edge-Based Information for Ultrasound Images

    Directory of Open Access Journals (Sweden)

    Yaozhong Luo

    2017-01-01

    Full Text Available Ultrasound imaging has become one of the most popular medical imaging modalities, with numerous diagnostic applications. However, ultrasound (US) image segmentation, the essential process for further analysis, is a challenging task due to poor image quality. In this paper, we propose a new segmentation scheme that combines both region- and edge-based information in the robust graph-based (RGB) segmentation method. The only interaction required is selecting two diagonal points to determine a region of interest (ROI) on the original image. The ROI image is smoothed by a bilateral filter and then contrast-enhanced by histogram equalization. The enhanced image is filtered by pyramid mean shift to improve homogeneity. With optimization by the particle swarm optimization (PSO) algorithm, the RGB segmentation method is performed to segment the filtered image. The segmentation results of our method have been compared with the corresponding results obtained by three existing approaches, and four metrics have been used to measure segmentation performance. The experimental results show that the method achieves the best overall performance, with the lowest ARE (10.77%), the second highest TPVF (85.34%), and the second lowest FPVF (4.48%).

  9. Filtered Rayleigh Scattering Measurements in a Buoyant Flow Field

    National Research Council Canada - National Science Library

    Meents, Steven M

    2008-01-01

    Filtered Rayleigh Scattering (FRS) is a non-intrusive, laser-based flow characterization technique that consists of a narrow linewidth laser, a molecular absorption filter, and a high resolution camera behind the filter to record images...

  10. Identifying city PV roof resource based on Gabor filter

    Science.gov (United States)

    Ruhang, Xu; Zhilin, Liu; Yong, Huang; Xiaoyu, Zhang

    2017-06-01

    To identify a city's PV roof resources, the area and ownership distribution of residential buildings in an urban district should be assessed. Analysis of remote sensing data is a promising approach to this assessment. Urban building roof area estimation is a major topic in remote sensing image information extraction. There are normally three ways to solve this problem: the first is pixel-based analysis, based on mathematical morphology or statistical methods; the second is object-based analysis, which can combine semantic information and expert knowledge; the third treats the problem from a signal-processing perspective. This paper presents a Gabor-filter-based method. The results show that the method is fast and reasonably accurate.
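
    A Gabor filter of the kind used here is a Gaussian envelope modulated by a sinusoidal carrier; convolving an image with a bank of such kernels at several orientations yields the oriented texture responses used to pick out roof-like structures. A minimal numpy sketch (illustrative parameter names following the common convention, not the paper's code):

```python
import numpy as np

def gabor_kernel(ksize=21, sigma=4.0, theta=0.0, lambd=10.0, gamma=0.5, psi=0.0):
    """Real-valued Gabor kernel: a Gaussian envelope modulated by a cosine.
    theta is the orientation, lambd the wavelength, gamma the aspect ratio,
    psi the phase offset."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    # Rotate coordinates to the filter orientation.
    xr = x*np.cos(theta) + y*np.sin(theta)
    yr = -x*np.sin(theta) + y*np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma*yr)**2) / (2.0*sigma**2))
    carrier = np.cos(2.0*np.pi*xr/lambd + psi)
    return envelope * carrier

k = gabor_kernel()
# A filter bank is built by varying theta, e.g. [0, pi/4, pi/2, 3*pi/4].
```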

  11. A Filtering Approach for Image-Guided Surgery With a Highly Articulated Surgical Snake Robot.

    Science.gov (United States)

    Tully, Stephen; Choset, Howie

    2016-02-01

    The objective of this paper is to introduce a probabilistic filtering approach to estimate the pose and internal shape of a highly flexible surgical snake robot during minimally invasive surgery. Our approach renders a depiction of the robot that is registered to preoperatively reconstructed organ models to produce a 3-D visualization that can be used for surgical feedback. Our filtering method estimates the robot shape using an extended Kalman filter that fuses magnetic tracker data with kinematic models that define the motion of the robot. Using Lie derivative analysis, we show that this estimation problem is observable, and thus, the shape and configuration of the robot can be successfully recovered with a sufficient number of magnetic tracker measurements. We validate this study with benchtop and in-vivo image-guidance experiments in which the surgical robot was driven along the epicardial surface of a porcine heart. This paper introduces a filtering approach for shape estimation that can be used for image guidance during minimally invasive surgery. The methods being introduced in this paper enable informative image guidance for highly articulated surgical robots, which benefits the advancement of robotic surgery.

  12. Correlation study of effect of additional filter on radiation dose and image quality in digital mammography

    International Nuclear Information System (INIS)

    Liu Jie; Liu Peifang; Wang Hongbin; Zhang Shuping; Liu Xueou

    2012-01-01

    Objective: To explore the effect of different additional filters on radiation dose and image quality in digital mammography. Methods: A Hologic Selenia digital mammography machine, its post-processing workstations, and a 5 M high-resolution medical monitor were used in this study. Mammography phantoms with thicknesses from 1.6 cm to 8.6 cm were used to simulate human breast tissue. The same exposure conditions, pressure, compression thickness, and anode were employed with the additional filters of Mo and Rh under the automatic and manual exposure modes. The image kV, mAs, pressure, filter, average glandular dose (AGD), entrance surface dose (ESD), signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR) and image score according to ACR criteria were recorded for the two additional filters. A paired-sample t test was performed to compare the indices of the Mo and Rh groups using SPSS 17.0. Results: AGD and ESD of both the Rh and Mo groups increased with phantom thickness. The AGD, ESD and their increments for the Rh filter (1.484 ± 1.041, 7.969 ± 7.633, 0.423 ± 0.190 and 3.057 ± 2.139) were lower than those for the Mo filter (1.915 ± 1.301, 12.516 ± 11.632, 0.539 ± 0.246 and 4.731 ± 3.294) in all the phantoms with different thickness (t values were 4.614, 3.209, 3.396 and 3.605, P<0.05). SNR, CNR, and image score of both the Rh and Mo groups decreased with increasing phantom thickness, with no statistically significant differences between the groups (P>0.05). Conclusions: Compared with the Mo filter, the Rh filter reduces radiation dose, and this advantage is more pronounced for thicker phantoms when the same image quality is required. (authors)

  13. CLASSIFICATION OF HYPERSPECTRAL DATA BASED ON GUIDED FILTERING AND RANDOM FOREST

    Directory of Open Access Journals (Sweden)

    H. Ma

    2017-09-01

    Full Text Available Hyperspectral images usually consist of more than one hundred spectral bands, which have the potential to provide rich spatial and spectral information. However, the application of hyperspectral data is still challenging due to “the curse of dimensionality”. In this context, many techniques that aim to make full use of both the spatial and spectral information are investigated. In order to preserve the geometrical information with fewer spectral bands, we propose a novel method that combines principal components analysis (PCA), guided image filtering and the random forest classifier (RF). In detail, PCA is first employed to reduce the dimension of the spectral bands. Second, the guided image filtering technique is introduced to smooth land objects while preserving their edges. Finally, the features are fed into the RF classifier. To illustrate the effectiveness of the method, we carry out experiments on the popular Indian Pines data set, which was collected by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) sensor. Comparing the proposed method with methods using only PCA or only the guided image filter, we find that the proposed method performs better.
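
    The PCA stage of such a pipeline (reducing a bands-deep hyperspectral cube to a few principal-component images before spatial filtering and classification) can be sketched with plain numpy. This is a generic SVD-based sketch on toy data, not the authors' implementation; the guided filtering and RF stages would follow on the reduced cube:

```python
import numpy as np

def pca_reduce(cube, n_components=3):
    """Reduce the spectral dimension of a (rows, cols, bands) hyperspectral
    cube to n_components principal components via SVD of the centred data."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    X -= X.mean(axis=0)                      # centre each band
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    scores = X @ Vt[:n_components].T         # project onto top components
    return scores.reshape(rows, cols, n_components)

rng = np.random.default_rng(0)
cube = rng.normal(size=(8, 8, 100))          # toy 100-band image
pcs = pca_reduce(cube, n_components=3)       # shape (8, 8, 3)
```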

  14. Kernel-based noise filtering of neutron detector signals

    International Nuclear Information System (INIS)

    Park, Moon Ghu; Shin, Ho Cheol; Lee, Eun Ki

    2007-01-01

    This paper describes recently developed techniques for effective filtering of neutron detector signal noise. Three kinds of noise filters are proposed and their performance is demonstrated for the estimation of reactivity. The tested filters are based on the unilateral kernel filter, the unilateral kernel filter with adaptive bandwidth, and the bilateral filter, chosen to show their effectiveness in edge preservation. Filtering performance is compared with conventional low-pass and wavelet filters. The bilateral filter shows a remarkable improvement compared with the unilateral kernel and wavelet filters. The effectiveness and simplicity of the unilateral kernel filter with adaptive bandwidth are also demonstrated by applying it to the reactivity measurement performed during reactor start-up physics tests.
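
    The edge-preserving behaviour that distinguishes the bilateral filter can be seen on a 1-D signal: each sample is averaged only with neighbours that are close both in time and in value, so a sharp step (e.g. a sudden power change) survives while small-amplitude noise is smoothed. A minimal numpy sketch under assumed parameters, not the paper's implementation:

```python
import numpy as np

def bilateral_1d(signal, radius=5, sigma_t=2.0, sigma_v=0.5):
    """Bilateral filter for a 1-D signal: weights fall off both with
    temporal distance (sigma_t) and with value difference (sigma_v),
    so sharp steps are preserved while noise is averaged out."""
    n = len(signal)
    out = np.empty(n)
    offsets = np.arange(-radius, radius + 1)
    w_time = np.exp(-(offsets**2) / (2.0*sigma_t**2))
    for i in range(n):
        idx = np.clip(i + offsets, 0, n - 1)
        w = w_time * np.exp(-((signal[idx] - signal[i])**2) / (2.0*sigma_v**2))
        out[i] = np.sum(w * signal[idx]) / np.sum(w)
    return out

rng = np.random.default_rng(1)
step = np.concatenate([np.zeros(50), np.ones(50)])   # idealised signal step
noisy = step + 0.1*rng.normal(size=100)
smoothed = bilateral_1d(noisy)                       # step edge stays sharp
```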

  15. Precision of quantitative computed tomography texture analysis using image filtering: A phantom study for scanner variability.

    Science.gov (United States)

    Yasaka, Koichiro; Akai, Hiroyuki; Mackin, Dennis; Court, Laurence; Moros, Eduardo; Ohtomo, Kuni; Kiryu, Shigeru

    2017-05-01

    Quantitative computed tomography (CT) texture analyses of images with and without filtration are gaining attention as a way to capture the heterogeneity of tumors. The aim of this study was to investigate how quantitative texture parameters using image filtering vary among different CT scanners, using a phantom developed for radiomics studies. A phantom consisting of 10 different cartridges with various textures was scanned under 6 different scanning protocols using four CT scanners from four different vendors. CT texture analyses were performed on both unfiltered images and images filtered with a Laplacian of Gaussian spatial band-pass filter featuring fine, medium, and coarse textures. Forty-five regions of interest were placed for each cartridge (x) in a specific scan image set (y), and the average of the texture values (T(x,y)) was calculated. The interquartile range (IQR) of T(x,y) among the 6 scans was calculated for a specific cartridge (IQR(x)), while the IQR of T(x,y) among the 10 cartridges was calculated for a specific scan (IQR(y)), and the median IQR(y) was then calculated over the 6 scans (the control IQR, IQRc). The median of the quotient IQR(x)/IQRc among the 10 cartridges was defined as the variability index (VI). The VI was relatively small for the mean in unfiltered images (0.011) and for standard deviation (0.020-0.044) and entropy (0.040-0.044) in filtered images. Skewness and kurtosis in filtered images featuring medium and coarse textures were relatively variable across different CT scanners, with VIs of 0.638-0.692 and 0.430-0.437, respectively. Quantitative CT texture parameters thus range from robust to highly variable among different scanners, and the behavior of each parameter should be taken into consideration.
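
    The variability index defined above can be computed directly from a (cartridges × scans) table of mean texture values. A numpy sketch following the abstract's definitions, with synthetic toy data rather than the study's measurements:

```python
import numpy as np

def variability_index(T):
    """Variability index (VI) as defined in the abstract.
    T is a (cartridges, scans) array of mean texture values T(x, y)."""
    def iqr(a, axis):
        q75, q25 = np.percentile(a, [75, 25], axis=axis)
        return q75 - q25
    iqr_x = iqr(T, axis=1)            # spread across scans, per cartridge
    iqr_y = iqr(T, axis=0)            # spread across cartridges, per scan
    iqr_c = np.median(iqr_y)          # control IQR: median IQR(y) over scans
    return np.median(iqr_x / iqr_c)   # median quotient over cartridges

rng = np.random.default_rng(2)
# 10 cartridges with well-separated texture values, 6 nearly identical scans:
base = np.linspace(0.0, 9.0, 10)[:, None]
T = base + 0.01*rng.normal(size=(10, 6))
vi = variability_index(T)             # small VI: robust across scanners
```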

  16. Block-Based Compressed Sensing for Neutron Radiation Image Using WDFB

    Directory of Open Access Journals (Sweden)

    Wei Jin

    2015-01-01

    Full Text Available An ideal compression method for neutron radiation images should have a high compression ratio while keeping more of the details of the original image. Compressed sensing (CS), which can break through the restrictions of the sampling theorem, is likely to offer an efficient compression scheme for neutron radiation images. Combining the wavelet transform with directional filter banks, a novel nonredundant multiscale geometry analysis transform named Wavelet Directional Filter Banks (WDFB) is constructed and applied to represent neutron radiation images sparsely. Then, the block-based CS technique is introduced and a high-performance CS scheme for neutron radiation images is proposed. The two-step iterative shrinkage algorithm is used to solve the L1-norm minimization problem and reconstruct the neutron radiation image from random measurements. The experimental results demonstrate that the scheme not only markedly improves the quality of the reconstructed image but also retains more details of the original image.
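
    The L1-minimization core of such a CS reconstruction can be sketched with plain ISTA, a one-step cousin of the two-step iterative shrinkage scheme the abstract cites (same soft-thresholding update, simpler iteration). This is a generic illustration on a synthetic sparse vector, not the paper's WDFB-based scheme:

```python
import numpy as np

def soft_threshold(v, t):
    """Soft-thresholding (shrinkage) operator, the proximal map of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam=0.01, n_iter=1000):
    """Iterative shrinkage-thresholding for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - A.T @ (A @ x - b) / L, lam / L)
    return x

rng = np.random.default_rng(3)
n, m = 40, 100                              # 40 random measurements, 100 unknowns
A = rng.normal(size=(n, m)) / np.sqrt(n)    # random sensing matrix
x_true = np.zeros(m)
x_true[[5, 37, 60]] = [1.0, -1.5, 2.0]      # sparse transform coefficients
b = A @ x_true                              # compressed measurements
x_hat = ista(A, b)                          # recovers the sparse vector
```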

  17. Perception-Based Filtering for MMOGs

    Directory of Open Access Journals (Sweden)

    Souad El Merhebi

    2008-01-01

    Full Text Available Online games have exploded in the last few years. These games face several problems linked to scalability and interactivity. In fact, online games should provide quick feedback on users' interactions as well as a coherent view of the shared world. However, the search for enhanced scalability dramatically increases message exchange. Such an increase consumes processing power and bandwidth, and thus limits interactivity, consistency, and scalability. To reduce the rate of message exchange, distributed virtual environment systems use filtering techniques such as interest management, which filters messages according to users' interests in the world. These interests are influenced by perceptual factors, which we study in this paper in order to build upon them a perception-based filtering technique. This technique satisfies users' needs by providing precise filtering that is more efficient than other techniques.

  18. A robust technique based on VLM and Frangi filter for retinal vessel extraction and denoising.

    Directory of Open Access Journals (Sweden)

    Khan Bahadar Khan

    Full Text Available The exploration of retinal vessel structure is enormously important on account of numerous diseases, including stroke, Diabetic Retinopathy (DR) and coronary heart disease, which can damage the retinal vessel structure. The retinal vascular network is very hard to extract due to its spreading and diminishing geometry and the contrast variation in an image. The proposed technique consists of parallel processes for denoising and extraction of blood vessels in retinal images. In the preprocessing stage, adaptive histogram equalization enhances the dissimilarity between the vessels and the background, and morphological top-hat filters are employed to eliminate structures such as the macula and optic disc. To remove local noise, a difference image is computed from the top-hat filtered image and the high-boost filtered image. The Frangi filter is applied at multiple scales to enhance vessels of diverse widths. Segmentation is performed by applying improved Otsu thresholding to the high-boost filtered image and Frangi's enhanced image separately. In the postprocessing steps, a Vessel Location Map (VLM) is extracted using raster-to-vector transformation, and the postprocessing steps are employed in a novel way to reject misclassified vessel pixels. The final segmented image is obtained by a pixel-by-pixel AND operation between the VLM and the Frangi output image. The method has been rigorously analyzed on the STARE, DRIVE and HRF datasets.

  19. Bayesian target tracking based on particle filter

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Because they are able to deal with nonlinear or non-Gaussian problems, particle filters have been studied by many researchers. Building on the particle filter, the extended Kalman filter (EKF) proposal function is applied to Bayesian target tracking. Novel techniques such as the Markov chain Monte Carlo (MCMC) method and the resampling step are also introduced into Bayesian target tracking. The simulation results confirm that the improved particle filter with these techniques outperforms the basic one.
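
    The basic (bootstrap) particle filter that such improvements start from can be sketched in a few lines for a 1-D tracking toy problem: predict each particle through the motion model, weight by the measurement likelihood, and resample. A generic numpy sketch with assumed noise parameters, not the paper's improved filter:

```python
import numpy as np

def bootstrap_particle_filter(observations, n_particles=500, q=0.1, r=0.5, rng=None):
    """Basic bootstrap particle filter for a 1-D random-walk state
    x_k = x_{k-1} + N(0, q^2), observed as z_k = x_k + N(0, r^2).
    Multinomial resampling is applied at every step."""
    if rng is None:
        rng = np.random.default_rng(0)
    particles = rng.normal(0.0, 1.0, n_particles)
    estimates = []
    for z in observations:
        particles = particles + rng.normal(0.0, q, n_particles)   # predict
        w = np.exp(-0.5*((z - particles)/r)**2)                   # weight
        w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w)           # resample
        particles = particles[idx]
        estimates.append(particles.mean())
    return np.array(estimates)

rng = np.random.default_rng(4)
true_x = np.cumsum(rng.normal(0.0, 0.1, 50))          # hidden random walk
obs = true_x + rng.normal(0.0, 0.5, 50)               # noisy measurements
est = bootstrap_particle_filter(obs, rng=np.random.default_rng(5))
```

    The EKF-proposal and MCMC-move variants described in the abstract replace the simple predict step and post-resampling particle positions, respectively, to place particles in higher-likelihood regions.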

  20. GPU Accelerated Vector Median Filter

    Science.gov (United States)

    Aras, Rifat; Shen, Yuzhong

    2011-01-01

    Noise reduction is an important step for most image processing tasks. For three-channel color images, a widely used technique is the vector median filter, in which the color values of pixels are treated as 3-component vectors. Vector median filters are computationally expensive; for a window size of n x n, each of the n^2 vectors has to be compared, in distance, with the other n^2 - 1 vectors. General-purpose computation on graphics processing units (GPUs) is the paradigm of utilizing high-performance many-core GPU architectures for computation tasks that are normally handled by CPUs. In this work, NVIDIA's Compute Unified Device Architecture (CUDA) paradigm is used to accelerate vector median filtering, which, to the best of our knowledge, has never been done before. The performance of the GPU-accelerated vector median filter is compared to that of the CPU and MPI-based versions for different image and window sizes. Initial findings of the study showed a 100x performance improvement of the vector median filter implementation on GPUs over CPU implementations, and further speed-up is expected after more extensive optimization of the GPU algorithm.
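
    The filter being accelerated can be stated as a small CPU reference implementation: for each window, keep the colour vector whose summed Euclidean distance to all other vectors in the window is smallest. A numpy sketch of that definition (the serial baseline, not the CUDA kernel):

```python
import numpy as np

def vector_median_filter(img, radius=1):
    """Reference (CPU) vector median filter for an RGB image.
    For each pixel, the window's colour vectors are compared pairwise and
    the vector with the smallest sum of Euclidean distances is kept."""
    h, w, _ = img.shape
    pad = np.pad(img, ((radius, radius), (radius, radius), (0, 0)), mode='edge')
    out = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 2*radius + 1, j:j + 2*radius + 1]
            win = win.reshape(-1, 3).astype(float)
            # Pairwise distance matrix between all vectors in the window.
            d = np.linalg.norm(win[:, None, :] - win[None, :, :], axis=2)
            out[i, j] = win[np.argmin(d.sum(axis=1))]
    return out

# A flat grey image with one impulse-noise pixel: the impulse is removed.
img = np.full((5, 5, 3), 128, dtype=np.uint8)
img[2, 2] = (255, 0, 0)
clean = vector_median_filter(img)
```

    The per-pixel pairwise distance matrix is exactly the n^2 x n^2 comparison cost described above, which is what makes the problem a natural fit for GPU parallelisation.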

  1. Influence of different anode/filter combination on radiation dose and image quality in digital mammography

    International Nuclear Information System (INIS)

    Liu Jie; Liu Peifang; Zhang Lianlian; Ma Wenjuan

    2013-01-01

    Objective: To explore the effect of different anode/filter combinations on radiation dose and image quality in digital mammography, so as to choose the optimal anode/filter combination that reduces radiation injury without sacrificing image quality. Methods: Mammography accreditation phantoms with thicknesses from 1.6 cm to 8.6 cm were used to simulate human breast tissue. The same exposure conditions, pressure, and compression thickness, with different anode/filter combinations, were employed under the automatic and manual exposure modes. The image kV, mAs, pressure, filter, average glandular dose (AGD), and contrast-to-noise ratio (CNR) were recorded and the figure of merit (FOM) was calculated. SPSS 17.0 and one-way analysis of variance were used for the statistical analysis. Results: As phantom thickness increased, the AGD values acquired with the Mo/Mo, Mo/Rh, and W/Ag anode/filter combinations increased, while the CNR and FOM values decreased. The AGD, CNR, and FOM values acquired at different phantom thicknesses with the three anode/filter combinations were statistically different (P=0.000 for each). The AGD values of Mo/Mo were lowest. For phantom thicknesses of 1.6 cm-2.6 cm, the FOMs of Mo/Rh were lowest, and for 3.6 cm-8.6 cm, the FOMs of W/Ag were lowest. Conclusion: For phantom thicknesses of 1.6 cm-2.6 cm and 3.6 cm-8.6 cm, the Mo/Rh and W/Ag combinations, respectively, can achieve the highest FOM and provide the best image quality at low radiation dose. (authors)

  2. Automatic detection of solar features in HSOS full-disk solar images using guided filter

    Science.gov (United States)

    Yuan, Fei; Lin, Jiaben; Guo, Jingjing; Wang, Gang; Tong, Liyue; Zhang, Xinwei; Wang, Bingxiang

    2018-02-01

    A procedure is introduced for the automatic detection of solar features in full-disk solar images from Huairou Solar Observing Station (HSOS), National Astronomical Observatories of China. In image preprocessing, a median filter is applied to remove noise. A guided filter is adopted to enhance the edges of solar features and restrain solar limb darkening; this is the first time the guided filter has been introduced into astronomical target detection. Specific features are then detected by the Otsu algorithm and a further threshold processing technique. Compared with other automatic detection procedures, our procedure has advantages such as real-time operation and reliability, as well as no need for local thresholds. It also greatly reduces the amount of computation, benefiting from the efficient guided filter algorithm. The procedure has been tested on a one-month sequence (December 2013) of HSOS full-disk solar images, and the results show that the number of features detected by our procedure is consistent with manual detection.
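
    The Otsu step used for feature detection chooses the grey-level threshold that maximises the between-class variance of the foreground/background split. A self-contained numpy sketch of the standard algorithm (a generic implementation, not the HSOS pipeline code):

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: pick the grey level that maximises the between-class
    variance sigma_b^2(t) = (mu_T*omega(t) - mu(t))^2 / (omega(t)*(1-omega(t)))."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                       # class-0 probability
    mu = np.cumsum(p * np.arange(256))         # class-0 cumulative mean
    mu_t = mu[-1]                              # global mean
    with np.errstate(divide='ignore', invalid='ignore'):
        sigma_b = (mu_t*omega - mu)**2 / (omega*(1.0 - omega))
    sigma_b[np.isnan(sigma_b)] = 0.0           # empty classes contribute nothing
    return int(np.argmax(sigma_b))

rng = np.random.default_rng(6)
# Bimodal test image: dark background around 50, bright features around 200.
img = np.concatenate([
    rng.normal(50, 10, 5000), rng.normal(200, 10, 5000),
]).clip(0, 255).astype(np.uint8).reshape(100, 100)
t = otsu_threshold(img)                        # lands between the two modes
mask = img > t                                 # detected bright features
```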

  3. Demosaicing and Superresolution for Color Filter Array via Residual Image Reconstruction and Sparse Representation

    OpenAIRE

    Sun, Guangling

    2012-01-01

    A framework of demosaicing and superresolution for color filter array (CFA) images via residual image reconstruction and sparse representation is presented. Given the intermediate image produced by a certain demosaicing and interpolation technique, a residual image between the final reconstruction image and the intermediate image is reconstructed using sparse representation. The final reconstruction image has richer edges and details than the intermediate image. Specifically, a generic dictionar...

  4. Single-Phase LLCL-Filter-based Grid-Tied Inverter with Low-Pass Filter Based Capacitor Current Feedback Active damper

    DEFF Research Database (Denmark)

    Liu, Yuan; Wu, Weimin; Li, Yun

    2016-01-01

    The capacitor-current-feedback active damping method is attractive for high-order-filter-based high-power grid-tied inverters when the grid impedance varies within a wide range. In order to improve the system control bandwidth and attenuate the high-order grid background harmonics by using the quasi…. In this paper, a low-pass filter is proposed to be inserted in the capacitor current feedback loop of an LLCL-filter-based grid-tied inverter, together with a digital proportional and differential compensator. A detailed theoretical analysis is given. For verification, simulations on a 2 kW/220 V/10 kHz LLCL…

  5. A nowcasting technique based on application of the particle filter blending algorithm

    Science.gov (United States)

    Chen, Yuanzhao; Lan, Hongping; Chen, Xunlai; Zhang, Wenhai

    2017-10-01

    To improve the accuracy of nowcasting, a new extrapolation technique called particle filter blending was configured in this study and applied to experimental nowcasting. Radar echo extrapolation was performed using the radar mosaic at an altitude of 2.5 km obtained from the images of 12 S-band radars in Guangdong Province, China. First, a bilateral filter was applied for quality control of the radar data; an optical flow method based on the Lucas-Kanade algorithm and the Harris corner detection algorithm were used to track radar echoes and retrieve the echo motion vectors; the motion vectors were then blended with the particle filter blending algorithm to estimate the optimal motion vector of the true echo motion; finally, semi-Lagrangian extrapolation was used for radar echo extrapolation based on the obtained motion vector field. A comparative study of the extrapolated forecasts of four precipitation events in 2016 in Guangdong was conducted. The results indicate that the particle filter blending algorithm can realistically reproduce the spatial pattern, echo intensity, and echo location at 30- and 60-min forecast lead times. The forecasts agreed well with observations, and the results were of operational significance. Quantitative evaluation indicates that the particle filter blending algorithm performed better than the cross-correlation method and the optical flow method. The particle filter blending method is therefore superior to the traditional forecasting methods, and it can be used to enhance nowcasting ability in operational weather forecasts.
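
    The final semi-Lagrangian extrapolation step advects the echo field along the estimated motion vectors by tracing each grid point backwards. A minimal numpy sketch with nearest-neighbour sampling and a uniform motion vector (operational schemes use a full vector field and interpolation; names are illustrative):

```python
import numpy as np

def semi_lagrangian_advect(field, u, v):
    """One semi-Lagrangian extrapolation step: each grid point takes the
    value found by tracing the motion vector (u east, v south, in pixels)
    backwards; nearest-neighbour sampling for simplicity."""
    h, w = field.shape
    yy, xx = np.mgrid[0:h, 0:w]
    src_y = np.clip(np.round(yy - v).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xx - u).astype(int), 0, w - 1)
    return field[src_y, src_x]

echo = np.zeros((10, 10))
echo[2, 3] = 40.0                  # a single reflectivity cell (dBZ)
forecast = semi_lagrangian_advect(echo, u=2.0, v=1.0)  # moves 2 east, 1 south
```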

  6. SU-F-J-28: Development of a New Imaging Filter to Remove the Shadows From the Carbon Fiber Grid Table Top

    Energy Technology Data Exchange (ETDEWEB)

    Maehana, W [Kanagawa Cancer Center, Yokohama, Kanagawa (Japan); Yokohama National University, Yokohama, Kanagawa (Japan)]; Nagao, T [Yokohama National University, Yokohama, Kanagawa (Japan)]

    2016-06-15

    Purpose: For image-guided radiation therapy (IGRT), the shadows caused by the construction of the treatment couch top adversely affect visual evaluation. We therefore developed a new imaging filter to remove the shadows, and evaluated its performance using clinical images. Methods: The new filter was composed of a band-pass filter (BPF) weighted by a factor k and a low-pass filter (LPF). In the frequency domain, the stop bandwidths were 8.3×10³ mm⁻¹ in the u direction and 11.1×10³ mm⁻¹ in the v direction for the BPF, and the pass bandwidths were 8.3×10³ mm⁻¹ in the u direction and 11.1×10³ mm⁻¹ in the v direction for the LPF. After applying the filter, the shadows from the carbon fiber grid table top (CFGTT, Varian) in the kV-image were removed. To check the filter effect, we compared clinical images of the thorax and thoracoabdominal region with and without the filter. A subjective evaluation test using a three-point scale (agree, neither agree nor disagree, disagree) was performed with 15 persons in the department of radiation oncology. Results: We succeeded in removing all shadows of the CFGTT using the new filter. The filter is very useful, as shown by the results of the subjective evaluation, in which 23/30 responses agreed with the filtered clinical images. Conclusion: We conclude that the proposed method is a useful tool for IGRT and that the new filter leads to improved accuracy of radiation therapy.

  7. Tin-filter enhanced dual-energy-CT: image quality and accuracy of CT numbers in virtual noncontrast imaging.

    Science.gov (United States)

    Kaufmann, Sascha; Sauter, Alexander; Spira, Daniel; Gatidis, Sergios; Ketelsen, Dominik; Heuschmid, Martin; Claussen, Claus D; Thomas, Christoph

    2013-05-01

    To measure and compare the objective image quality of true noncontrast (TNC) images with virtual noncontrast (VNC) images acquired by tin-filter-enhanced, dual-source, dual-energy computed tomography (DECT) of the upper abdomen. Sixty-three patients received unenhanced abdominal CT and enhanced abdominal DECT (100/140 kV with tin filter) in the portal-venous phase. VNC images were calculated from the DECT datasets using commercially available software. The mean attenuation of relevant tissues and image quality were compared between the TNC and VNC images. Image quality was rated objectively by measuring image noise and the sharpness of object edges using custom-designed software. Measurements were compared using Student's two-tailed t-test. Correlation coefficients for tissue attenuation measurements between TNC and VNC were calculated and the relative deviations were illustrated using Bland-Altman plots. Mean attenuation differences between TNC and VNC (HU_TNC - HU_VNC) image sets were as follows: right liver lobe -4.94 Hounsfield units (HU), left liver lobe -3.29 HU, vena cava -2.19 HU, spleen -7.46 HU, pancreas 1.29 HU, fat -11.14 HU, aorta 1.29 HU, bone marrow 36.83 HU (all P < .05). Significant correlations between the VNC and TNC series were observed for liver, vena portae, kidneys, pancreas, muscle and bone marrow (Pearson's correlation coefficient ≥0.75). Mean image noise was significantly higher in TNC images (P < .05), while edge sharpness did not differ significantly between VNC and TNC images (P = .19). The Hounsfield units in VNC images closely resemble TNC images in the majority of the organs of the upper abdomen (kidneys, liver, pancreas). In spleen and fat, Hounsfield numbers in VNC images tend to be higher than in TNC images. VNC images show low image noise and satisfactory edge sharpness. Other criteria of image quality and the depiction of certain lesions need to be evaluated additionally. Copyright © 2013 AUR. Published by Elsevier Inc. All rights reserved.

  8. Avoiding the Use of Exhausted Drinking Water Filters: A Filter-Clock Based on Rusting Iron

    Directory of Open Access Journals (Sweden)

    Arnaud Igor Ndé-Tchoupé

    2018-05-01

    Full Text Available Efficient but affordable water treatment technologies are currently sought to solve the prevalent shortage of safe drinking water. Adsorption-based technologies are in the front line of these efforts. Upon proper design, universally applied materials (e.g., activated carbons, bone chars, metal oxides) are able to quantitatively remove inorganic and organic pollutants as well as pathogens from water. Each water filter has a defined removal capacity and must be replaced when this capacity is exhausted. Operational experience has shown that it may be difficult to convince some low-skilled users to buy new filters after the predicted service life. This communication describes the quest to develop a filter-clock to encourage all users to change their filters after the designed service life. A brief discussion of such a filter-clock based on the rusting of metallic iron (Fe⁰) is presented. Integrating such filter-clocks in the design of water filters is regarded as essential for safeguarding public health.

  9. Autonomous Navigation for Autonomous Underwater Vehicles Based on Information Filters and Active Sensing

    Directory of Open Access Journals (Sweden)

    Tianhong Yan

    2011-11-01

    Full Text Available This paper addresses an autonomous navigation method for the autonomous underwater vehicle (AUV) C-Ranger applying information-filter-based simultaneous localization and mapping (SLAM), and its sea trial experiments in Tuandao Bay (Shandong Province, P.R. China). Weak links in the information matrix of an extended information filter (EIF) can be pruned to achieve an efficient approach, the sparse EIF algorithm (SEIF-SLAM). All the basic update formulae can be implemented in constant time irrespective of the size of the map; hence the computational complexity is significantly reduced. A mechanical scanning imaging sonar is chosen as the active sensing device for the underwater vehicle, and a compensation method based on feedback of the AUV pose is presented to overcome distortion of the acoustic images due to vehicle motion. In order to verify the feasibility of the proposed navigation methods for the C-Ranger, a sea trial was conducted in Tuandao Bay. Experimental results and analysis show that the proposed navigation approach based on SEIF-SLAM improves navigation accuracy compared with conventional methods; moreover, the algorithm has a low computational cost when compared with EKF-SLAM.

  10. Autonomous navigation for autonomous underwater vehicles based on information filters and active sensing.

    Science.gov (United States)

    He, Bo; Zhang, Hongjin; Li, Chao; Zhang, Shujing; Liang, Yan; Yan, Tianhong

    2011-01-01

    This paper addresses an autonomous navigation method for the autonomous underwater vehicle (AUV) C-Ranger applying information-filter-based simultaneous localization and mapping (SLAM), and its sea trial experiments in Tuandao Bay (Shandong Province, P.R. China). Weak links in the information matrix of an extended information filter (EIF) can be pruned to achieve an efficient approach, the sparse EIF algorithm (SEIF-SLAM). All the basic update formulae can be implemented in constant time irrespective of the size of the map; hence the computational complexity is significantly reduced. A mechanical scanning imaging sonar is chosen as the active sensing device for the underwater vehicle, and a compensation method based on feedback of the AUV pose is presented to overcome distortion of the acoustic images due to vehicle motion. In order to verify the feasibility of the proposed navigation methods for the C-Ranger, a sea trial was conducted in Tuandao Bay. Experimental results and analysis show that the proposed navigation approach based on SEIF-SLAM improves navigation accuracy compared with conventional methods; moreover, the algorithm has a low computational cost when compared with EKF-SLAM.

  11. Adaptive oriented PDEs filtering methods based on new controlling speed function for discontinuous optical fringe patterns

    Science.gov (United States)

    Zhou, Qiuling; Tang, Chen; Li, Biyuan; Wang, Linlin; Lei, Zhenkun; Tang, Shuwei

    2018-01-01

    The filtering of discontinuous optical fringe patterns is a challenging problem in this area. This paper is concerned with oriented partial differential equation (OPDE)-based image filtering methods for discontinuous optical fringe patterns. We redefine a new controlling speed function to depend on orientation coherence. Orientation coherence can be used to distinguish continuous regions from discontinuous regions, and can be calculated from the fringe orientation. We introduce the new controlling speed function into the previous OPDEs and propose adaptive OPDE filtering models. With the proposed adaptive OPDE filtering models, filtering in the continuous and discontinuous regions can be carried out selectively. We demonstrate the performance of the proposed adaptive OPDEs via application to simulated and experimental fringe patterns, and compare our methods with the previous OPDEs.

  12. Image Vector Quantization codec indexes filtering

    Directory of Open Access Journals (Sweden)

    Lakhdar Moulay Abdelmounaim

    2012-01-01

    Full Text Available Vector Quantisation (VQ) is an efficient coding algorithm that has been widely used in the field of video and image coding, due to its fast decoding efficiency. However, VQ indexes are sometimes lost because of signal interference during transmission. In this paper, we propose an efficient estimation method to conceal and recover the lost indexes on the decoder side and avoid re-transmitting the whole image. If the image or video has a limited period of validity, re-transmitting the data wastes time and network bandwidth. Therefore, using the correctly received data to estimate and recover the lost data is efficient in time-constrained situations, such as network conferencing or mobile transmissions. In natural images, pixels are correlated with their neighbours; since VQ partitions the image into sub-blocks and quantises them to the indexes that are transmitted, the correlation between adjacent indexes is also very strong. The proposed method has two parts: pre-processing and an estimation process. In pre-processing, we modify the order of codevectors in the VQ codebook to increase the correlation among neighbouring vectors. We then use a special filtering method in the estimation process. Using conventional VQ to compress the Lena image and transmitting it without any loss of indexes achieves a PSNR of 30.429 dB at the decoder. The simulation results demonstrate that our method can estimate the lost indexes to achieve PSNR values of 29.084 and 28.327 dB when the loss rate is 0.5% and 1%, respectively.
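The two ingredients above can be sketched in a few lines. This is an illustration under stated assumptions: the codebook is reordered by mean intensity so that numerically close indexes map to visually similar codevectors, and a lost index is estimated from its intact neighbours; the median estimator here is a simple stand-in for the paper's special filtering method.

```python
import numpy as np

def reorder_codebook(codebook):
    """Sort codevectors by mean intensity; return new book and index map."""
    order = np.argsort(codebook.mean(axis=1))
    remap = np.empty_like(order)
    remap[order] = np.arange(len(order))
    return codebook[order], remap       # old index i -> new index remap[i]

def estimate_lost_index(index_map, i, j):
    """Median of the 4-connected neighbours that were received intact
    (lost entries are marked with a negative value)."""
    h, w = index_map.shape
    nbrs = [index_map[r, c]
            for r, c in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
            if 0 <= r < h and 0 <= c < w and index_map[r, c] >= 0]
    return int(np.median(nbrs))
```

After reordering, a median over neighbouring indexes lands on a codevector of similar intensity, which is what makes the concealment visually plausible in smooth regions.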

  13. Photoacoustic imaging in scattering media by combining a correlation matrix filter with a time reversal operator.

    Science.gov (United States)

    Rui, Wei; Tao, Chao; Liu, Xiaojun

    2017-09-18

    Acoustic scattering media are a fundamental challenge for photoacoustic imaging. In this study, we reveal the different coherence properties of the scattered photoacoustic waves and the direct photoacoustic waves in a matrix form. Direct waves show a particular coherence on the antidiagonals of the matrix, whereas scattered waves do not. Based on this property, a correlation matrix filter combined with a time reversal operator is proposed to preserve the direct waves and recover the image behind a scattering layer. Both numerical simulations and photoacoustic imaging experiments demonstrate that the proposed approach effectively increases the image contrast and decreases the background speckles in a scattering medium. This study might improve the quality of photoacoustic imaging in an acoustic scattering environment and extend its applications.

  14. A family of quantization based piecewise linear filter networks

    DEFF Research Database (Denmark)

    Sørensen, John Aasted

    1992-01-01

    A family of quantization-based piecewise linear filter networks is proposed. For stationary signals, a filter network from this family is a generalization of the classical Wiener filter with an input signal and a desired response. The construction of the filter network is based on quantization of the input signal x(n) into quantization classes. With each quantization class is associated a linear filter. The filtering at time n is carried out by the filter belonging to the actual quantization class of x(n) and the filters belonging to the neighbor quantization classes of x(n) (regularization). This construction leads to a three-layer filter network. The first layer consists of the quantization class filters for the input signal. The second layer carries out the regularization between neighbor quantization classes, and the third layer constitutes a decision of quantization class from where the resulting...

  15. Regularized Fractional Power Parameters for Image Denoising Based on Convex Solution of Fractional Heat Equation

    Directory of Open Access Journals (Sweden)

    Hamid A. Jalab

    2014-01-01

    Full Text Available Interest in fractional mask operators based on fractional calculus has grown in image denoising. Denoising is one of the most fundamental image restoration problems in computer vision and image processing. This paper proposes an image denoising algorithm based on a convex solution of the fractional heat equation with regularized fractional power parameters. The performance of the proposed algorithm was evaluated by computing the PSNR for different types of images. Experiments based on visual perception and on peak signal-to-noise ratio values show that the denoising results are competitive with the standard Gaussian filter and Wiener filter.

  16. Switching non-local vector median filter

    Science.gov (United States)

    Matsuoka, Jyohei; Koga, Takanori; Suetake, Noriaki; Uchino, Eiji

    2016-04-01

    This paper describes a novel image filtering method that removes random-valued impulse noise superimposed on a natural color image. In impulse noise removal, it is essential to employ a switching-type filtering method, as used in the well-known switching median filter, to preserve the detail of the original image with good quality. In color image filtering, it is generally preferable to deal with the red (R), green (G), and blue (B) components of each pixel of a color image as elements of a vectorized signal, as in the well-known vector median filter, rather than as component-wise signals, to prevent a color shift after filtering. Taking these fundamentals into consideration, we propose a switching-type vector median filter with non-local processing that mainly consists of a noise detector and a noise removal filter. Concretely, we propose a noise detector that proactively detects noise-corrupted pixels by focusing attention on the isolation tendencies of pixels of interest, not in the input image but in difference images between the RGB components. Furthermore, as the noise removal filter, we propose an extended version of the non-local median filter that we proposed previously for grayscale image processing, named the non-local vector median filter, which is designed for color image processing. The proposed method realizes a superior balance between the preservation of detail and impulse noise removal by proactive noise detection and non-local switching vector median filtering, respectively. The effectiveness and validity of the proposed method are verified in a series of experiments using natural color images.
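The classic vector median filter that serves as the building block here can be sketched directly. This is the textbook VMF, not the paper's non-local extension: the output pixel is the RGB vector in the window that minimises the sum of distances to all other vectors, so the result is always one of the original colours and no colour shift can occur.

```python
import numpy as np

def vector_median(window):
    """window: (n, 3) array of RGB vectors; returns the vector median,
    i.e. the vector minimising the sum of Euclidean distances to the rest."""
    d = np.linalg.norm(window[:, None, :] - window[None, :, :], axis=2)
    return window[np.argmin(d.sum(axis=1))]

# A window of 8 gray pixels plus one red impulse: the impulse has a large
# total distance to the others and is rejected.
w = np.array([[100, 100, 100]] * 8 + [[255, 0, 0]], dtype=float)
vm = vector_median(w)
```

The switching aspect of the proposed method means this filter is applied only at pixels the noise detector flags, leaving clean pixels untouched.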

  17. Mapping accuracy via spectrally and structurally based filtering techniques: comparisons through visual observations

    Science.gov (United States)

    Chockalingam, Letchumanan

    2005-01-01

    LANDSAT data for the Gunung Ledang region of Malaysia are considered for mapping certain hydrogeological features. To map these significant features, image-processing tools such as contrast enhancement and edge detection techniques are employed. The advantages of these techniques over other methods are evaluated, and their validity in properly isolating features of hydrogeological interest is discussed. Because these techniques exploit the spectral aspects of the images, they have several limitations in meeting the objectives. To address these limitations, a morphological transformation, which considers the structural rather than the spectral aspects of the image, is applied, providing comparisons between the results derived from spectrally based and structurally based filtering techniques.

  18. [Testing method research for key performance indicator of imaging acousto-optic tunable filter (AOTF)].

    Science.gov (United States)

    Hu, Shan-Zhou; Chen, Fen-Fei; Zeng, Li-Bo; Wu, Qiong-Shui

    2013-01-01

    The imaging AOTF is an important optical filter component for new spectral imaging instruments developed in recent years. The principle of the imaging AOTF component is demonstrated, and a set of testing methods for key performance indicators is studied, covering diffraction efficiency, wavelength shift with temperature, spatial homogeneity of diffraction efficiency, imaging shift, etc.

  19. OTRA-Based Multi-Function Inverse Filter Configuration

    Directory of Open Access Journals (Sweden)

    Abdhesh Kumar Singh

    2017-01-01

    Full Text Available A new OTRA-based multifunction inverse filter configuration is presented which is capable of realizing low-pass, high-pass and band-pass filters using only two OTRAs and five to six passive elements. To the best of the authors' knowledge, no inverse filter configuration using OTRAs has been reported in the literature earlier. The major parasitics of the OTRAs and their effect on filter performance have been investigated through simulation results and Monte Carlo analysis. The workability of the proposed circuits has been confirmed by SPICE simulations using a CMOS-based OTRA realizable in 0.18 µm CMOS technology. The proposed circuits are the only ones that simultaneously provide the following features: a reasonable number of active elements (only 2), realizability of all three basic filter functions, employment of exclusively virtually grounded resistors and capacitors, and tunability of all filter parameters (except the gain factor H_0 for the inverse high pass). Centre/cut-off frequencies of the various filter circuits in the vicinity of 1 MHz have been found to be realizable, which has been verified through SPICE simulation results in good agreement with the theoretical results.

  20. A center-median filtering method for detection of temporal variation in coronal images

    Directory of Open Access Journals (Sweden)

    Plowman Joseph

    2016-01-01

    Full Text Available Events in the solar corona are often widely separated in their timescales, which can allow them to be identified when they would otherwise be confused with emission from other sources in the corona. Methods for cleanly separating such events based on their timescales are thus desirable for research in the field. This paper develops a technique for identifying time-varying signals in solar coronal image sequences which is based on a per-pixel running median filter and an understanding of photon-counting statistics. Example applications to “EIT waves” (named after EIT, the EUV Imaging Telescope on the Solar and Heliospheric Observatory and small-scale dynamics are shown, both using 193 Å data from the Atmospheric Imaging Assembly (AIA on the Solar Dynamics Observatory. The technique is found to discriminate EIT waves more cleanly than the running and base difference techniques most commonly used. It is also demonstrated that there is more signal in the data than is commonly appreciated, finding that the waves can be traced to the edge of the AIA field of view when the data are rebinned to increase the signal-to-noise ratio.
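The core of the technique can be sketched in a few lines. This is a minimal per-pixel running-median detector with Poisson (photon-counting) noise thresholding, under illustrative assumptions: the window length and threshold below are not the paper's values, and the data are assumed to be in counts so that sigma ≈ sqrt(counts).

```python
import numpy as np

def running_median_detect(cube, win=5, nsigma=3.0):
    """cube: (t, y, x) array of counts.
    Returns a boolean map of pixels whose residual from the per-pixel
    running median exceeds nsigma photon-counting standard deviations."""
    t = cube.shape[0]
    med = np.empty_like(cube, dtype=float)
    for k in range(t):
        lo, hi = max(0, k - win // 2), min(t, k + win // 2 + 1)
        med[k] = np.median(cube[lo:hi], axis=0)   # slowly varying background
    resid = cube - med
    sigma = np.sqrt(np.maximum(med, 1.0))         # Poisson noise estimate
    return np.abs(resid) > nsigma * sigma
```

Because the median is robust to a brief transient inside the window, a short-lived brightening stands out in the residual while the quasi-static corona cancels, which is the behaviour exploited for "EIT wave" detection.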

  1. Bowtie filters for dedicated breast CT: Analysis of bowtie filter material selection

    Energy Technology Data Exchange (ETDEWEB)

    Kontson, Kimberly, E-mail: Kimberly.Kontson@fda.hhs.gov; Jennings, Robert J. [Department of Bioengineering, University of Maryland, College Park, Maryland 20742 and Division of Imaging and Applied Mathematics, Office of Science and Engineering Laboratories, Center for Devices and Radiological Health, U.S. Food and Drug Administration, 10903 New Hampshire Avenue, Silver Spring, Maryland 20993 (United States)

    2015-09-15

    Purpose: For a given bowtie filter design, both the selection of material and the physical design control the energy fluence, and consequently the dose distribution, in the object. Using three previously described bowtie filter designs, the goal of this work is to demonstrate the effect that different materials have on the bowtie filter performance measures. Methods: Three bowtie filter designs that compensate for one or more aspects of the beam-modifying effects due to the differences in path length in a projection have been designed. The nature of the designs allows for their realization using a variety of materials. The designs were based on a phantom, 14 cm in diameter, composed of 40% fibroglandular and 60% adipose tissue. Bowtie design #1 is based on single material spectral matching and produces nearly uniform spectral shape for radiation incident upon the detector. Bowtie design #2 uses the idea of basis-material decomposition to produce the same spectral shape and intensity at the detector, using two different materials. With bowtie design #3, it is possible to eliminate the beam hardening effect in the reconstructed image by adjusting the bowtie filter thickness so that the effective attenuation coefficient for every ray is the same. Seven different materials were chosen to represent a range of chemical compositions and densities. After calculation of construction parameters for each bowtie filter design, a bowtie filter was created using each of these materials (assuming reasonable construction parameters were obtained), resulting in a total of 26 bowtie filters modeled analytically and in the PENELOPE Monte Carlo simulation environment. Using the analytical model of each bowtie filter, design profiles were obtained and energy fluence as a function of fan-angle was calculated. Projection images with and without each bowtie filter design were also generated using PENELOPE and reconstructed using FBP. 
Parameters such as dose distribution, noise uniformity

  2. Bowtie filters for dedicated breast CT: Analysis of bowtie filter material selection

    International Nuclear Information System (INIS)

    Kontson, Kimberly; Jennings, Robert J.

    2015-01-01

    Purpose: For a given bowtie filter design, both the selection of material and the physical design control the energy fluence, and consequently the dose distribution, in the object. Using three previously described bowtie filter designs, the goal of this work is to demonstrate the effect that different materials have on the bowtie filter performance measures. Methods: Three bowtie filter designs that compensate for one or more aspects of the beam-modifying effects due to the differences in path length in a projection have been designed. The nature of the designs allows for their realization using a variety of materials. The designs were based on a phantom, 14 cm in diameter, composed of 40% fibroglandular and 60% adipose tissue. Bowtie design #1 is based on single material spectral matching and produces nearly uniform spectral shape for radiation incident upon the detector. Bowtie design #2 uses the idea of basis-material decomposition to produce the same spectral shape and intensity at the detector, using two different materials. With bowtie design #3, it is possible to eliminate the beam hardening effect in the reconstructed image by adjusting the bowtie filter thickness so that the effective attenuation coefficient for every ray is the same. Seven different materials were chosen to represent a range of chemical compositions and densities. After calculation of construction parameters for each bowtie filter design, a bowtie filter was created using each of these materials (assuming reasonable construction parameters were obtained), resulting in a total of 26 bowtie filters modeled analytically and in the PENELOPE Monte Carlo simulation environment. Using the analytical model of each bowtie filter, design profiles were obtained and energy fluence as a function of fan-angle was calculated. Projection images with and without each bowtie filter design were also generated using PENELOPE and reconstructed using FBP. 
Parameters such as dose distribution, noise uniformity

  3. Pornographic image detection with Gabor filters

    Science.gov (United States)

    Durrell, Kevan; Murray, Daniel J. C.

    2002-04-01

    As Internet-enabled computers become ubiquitous in homes, schools, and other publicly accessible locations, there are more people 'surfing the net' who would prefer not to be exposed to offensive material. There is a lot of material freely available on the Internet that we, as a responsible and caring society, would like to keep our children from viewing. Pornographic image content is one category of material over which we would like some control. We have been conducting experiments to determine the effectiveness of using characteristic feature vectors and neural networks to identify semantic image content. This paper describes our approach to identifying pornographic images using Gabor filters, Principal Component Analysis (PCA), correlograms, and neural networks. In brief, we used a set of 5,000 typical images available from the Internet, 20% of which were judged to be pornographic, to train a neural network. We then applied the trained neural network to feature vectors from images that had not been used in training. We measure our performance as recall (how many of the verification images labeled pornographic were correctly identified) and precision (how many images deemed pornographic by the neural network are in fact pornographic). The images used for the training and validation sets in the experiment described in this paper are freely available on the Internet. Freely available is an important qualifier, since high-resolution, studio-quality pornographic images are often protected by portals that charge members a fee to gain access to their material. Although this is not a hard and fast rule, many of the pornographic images that are easily available without charge on the Internet are of low image quality. Some of these images are collages, contain textual elements, or have had their resolution intentionally lowered to reduce their file size. These are the offensive images that a user, without a credit card, might inadvertently come
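The Gabor feature-extraction stage of such a pipeline can be sketched as follows. This is an illustration, not the authors' implementation: the image is convolved with a small bank of oriented Gabor kernels and each response is summarised by its mean and standard deviation; all kernel parameters are assumptions.

```python
import numpy as np

def gabor_kernel(theta, lam=8.0, sigma=4.0, size=15):
    """Real (cosine) Gabor kernel at orientation theta (illustrative params)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return (np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
            * np.cos(2.0 * np.pi * xr / lam))

def conv2_same(img, k):
    """'same'-size 2-D convolution via zero-padded FFT (numpy only)."""
    H, W = img.shape
    kh, kw = k.shape
    sh, sw = H + kh - 1, W + kw - 1
    full = np.fft.irfft2(np.fft.rfft2(img, s=(sh, sw))
                         * np.fft.rfft2(k, s=(sh, sw)), s=(sh, sw))
    return full[kh // 2:kh // 2 + H, kw // 2:kw // 2 + W]

def gabor_features(img, n_orient=4):
    """Mean and std of each oriented response, concatenated."""
    feats = []
    for k in range(n_orient):
        resp = conv2_same(img, gabor_kernel(np.pi * k / n_orient))
        feats += [resp.mean(), resp.std()]
    return np.array(feats)
```

A feature vector like this (possibly reduced by PCA and concatenated with color correlogram statistics) would then be fed to the neural network classifier.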

  4. Time-resolved magnetic imaging in an aberration-corrected, energy-filtered photoemission electron microscope

    International Nuclear Information System (INIS)

    Nickel, F.; Gottlob, D.M.; Krug, I.P.; Doganay, H.; Cramm, S.; Kaiser, A.M.; Lin, G.; Makarov, D.; Schmidt, O.G.

    2013-01-01

    We report on the implementation and usage of a synchrotron-based time-resolving operation mode in an aberration-corrected, energy-filtered photoemission electron microscope. The setup consists of a new type of sample holder, which enables fast magnetization reversal of the sample by sub-ns pulses of up to 10 mT. Within the sample holder current pulses are generated by a fast avalanche photo diode and transformed into magnetic fields by means of a microstrip line. For more efficient use of the synchrotron time structure, we developed an electrostatic deflection gating mechanism capable of beam blanking within a few nanoseconds. This allows us to operate the setup in the hybrid bunch mode of the storage ring facility, selecting one or several bright singular light pulses which are temporally well-separated from the normal high-intensity multibunch pulse pattern. - Highlights: • A new time-resolving operation mode in photoemission electron microscopy is shown. • Our setup works within an energy-filtered, aberration-corrected PEEM. • A new gating system for bunch selection using synchrotron radiation is developed. • An alternative magnetic excitation system is developed. • First tr-imaging using an energy-filtered, aberration-corrected PEEM is shown

  5. A Low Cost Structurally Optimized Design for Diverse Filter Types

    Science.gov (United States)

    Kazmi, Majida; Aziz, Arshad; Akhtar, Pervez; Ikram, Nassar

    2016-01-01

    A wide range of image processing applications deploys two-dimensional (2D) filters for performing diversified tasks such as image enhancement, edge detection, noise suppression, multiscale decomposition and compression. All of these tasks require multiple types of 2D filters simultaneously to acquire the desired results. The resource-hungry conventional approach is not a viable option for implementing these computationally intensive 2D filters, especially in a resource-constrained environment; this calls for optimized solutions. Mostly, the optimization of these filters is based on exploiting structural properties. A common shortcoming of all previously reported optimized approaches is their restricted applicability to only a specific filter type. These narrowly scoped solutions completely disregard the versatility required by advanced image processing applications and in turn offset their effectiveness when implementing a complete application. This paper presents an efficient framework which exploits the structural properties of 2D filters to effectually reduce their computational cost, with the added advantage of versatility in supporting diverse filter types. A composite symmetric filter structure is introduced which exploits the identities of quadrant and circular T-symmetries in two distinct filter regions simultaneously. These T-symmetries effectually reduce the number of filter coefficients and consequently the multiplier count. The proposed framework at the same time empowers this composite filter structure with the additional capability of realizing all of its Ψ-symmetry based subtypes and also its special asymmetric filter case. The two-fold optimized framework thus reduces filter computational cost by up to 75% compared to the conventional approach, while its versatility attribute not only supports diverse filter types but also offers further cost reduction via resource sharing for sequential implementation of diversified image
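The cost saving from symmetry can be made concrete with quadrant symmetry alone (the composite structure in the paper combines it with circular symmetry for larger savings). For a (2k+1)×(2k+1) kernel satisfying h(x, y) = h(|x|, |y|), only the (k+1)×(k+1) non-negative quadrant carries distinct coefficients, so only that many multipliers are needed; the sketch below is generic, not the paper's architecture.

```python
import numpy as np

def expand_quadrant(q):
    """Build a quadrant-symmetric kernel from its non-negative quadrant q,
    where q[0, 0] is the centre tap."""
    top = np.hstack([q[:, :0:-1], q])        # mirror columns x = -k..-1
    return np.vstack([top[:0:-1, :], top])   # mirror rows    y = -k..-1

k = 7
q = np.arange((k + 1) ** 2, dtype=float).reshape(k + 1, k + 1)
h = expand_quadrant(q)
# 15x15 kernel: 64 distinct coefficients out of 225 taps, i.e. roughly a
# 72% reduction in multipliers from quadrant symmetry alone.
```

Circular (octant) symmetry in the second filter region removes further coefficients, which is how the framework approaches the up-to-75% figure cited above.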

  6. Retina-Inspired Filter.

    Science.gov (United States)

    Doutsi, Effrosyni; Fillatre, Lionel; Antonini, Marc; Gaulmin, Julien

    2018-07-01

    This paper introduces a novel filter, which is inspired by the human retina. The human retina consists of three different layers: the outer plexiform layer (OPL), the inner plexiform layer, and the ganglionic layer. Our inspiration is the linear transform which takes place in the OPL and has been mathematically described by the neuroscientific model "virtual retina." This model is the cornerstone from which we derive the non-separable spatio-temporal OPL retina-inspired filter, briefly called the retina-inspired filter, studied in this paper. This filter is connected to the dynamic behavior of the retina, which enables the retina to increase the sharpness of the visual stimulus during filtering, before its transmission to the brain. We establish that this retina-inspired transform forms a group of spatio-temporal Weighted Difference of Gaussian (WDoG) filters when it is applied to a still image visible for a given time. We analyze the spatial frequency bandwidth of the retina-inspired filter with respect to time. It is shown that the WDoG spectrum varies from a lowpass filter to a bandpass filter. Therefore, as time increases, the retina-inspired filter enables the extraction of different kinds of information from the input image. Finally, we discuss the benefits of using the retina-inspired filter in image processing applications such as edge detection and compression.
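The WDoG family mentioned above is easy to sketch spatially: the kernel is a weighted centre Gaussian minus a weighted surround Gaussian, and as the surround weight grows (with time, in the retina-inspired model) the net kernel moves from lowpass toward bandpass. The weights and sigmas below are illustrative, not the virtual-retina parameters.

```python
import numpy as np

def gaussian_kernel(sigma, size=21):
    """Normalised isotropic 2-D Gaussian (sums to 1)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    g = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
    return g / g.sum()

def wdog_kernel(w_center, w_surround, s_center=1.0, s_surround=3.0, size=21):
    """Weighted Difference of Gaussians: centre minus surround."""
    return (w_center * gaussian_kernel(s_center, size)
            - w_surround * gaussian_kernel(s_surround, size))

low = wdog_kernel(1.0, 0.0)   # early time: pure lowpass, DC gain 1
band = wdog_kernel(1.0, 1.0)  # later time: bandpass, DC gain ~0
```

The DC gain (kernel sum) dropping from 1 toward 0 is the spatial-domain signature of the lowpass-to-bandpass transition described in the abstract.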

  7. Gated myocardial SPECT using spatial and temporal filtering

    International Nuclear Information System (INIS)

    Hatton, R.L.; Hutton, B.F.; Kyme, A.Z.; Larcos, G.

    2002-01-01

    Full text: Standard protocols for examining myocardial perfusion and motion defects involve the use of gated SPECT images and a composite of the gated frames. This study examines the usefulness of extracting one frame, or a combination of frames, from the gated image to assess perfusion, and whether the addition of a temporal filter to the gated image improves signal-to-noise. The choice of the most appropriate frame was also considered. Sixteen- and eight-frame gated SPECT studies were simulated using the dynamic NURBS-based cardiac torso (NCAT) phantom. Variously sized perfusion defects were included in the inferior wall to assess contrast relative to normal tissue. Scatter and attenuation were not included. Butterworth spatial cutoff frequencies were varied to establish the most appropriate combination of temporal/spatial filters to reduce noise and retain contrast in the images. The 16-frame data produced higher ejection fractions across all spatial filter cutoffs, and were generally unaffected by temporal filtering. Temporal filtering reduced the noise in a uniform liver region of the gated images to within 25% of the composite image noise. The lesion extent and contrast were greater in the end-diastolic frames than in the end-systolic and mid-cycle frames. In conclusion, by using a temporally filtered end-diastolic image from the gated sequence, a favourable balance between noise and contrast can be achieved. Work is in progress to confirm these findings in the clinical situation. Copyright (2002) The Australian and New Zealand Society of Nuclear Medicine Inc

  8. Vehicle Sideslip Angle Estimation Based on Hybrid Kalman Filter

    Directory of Open Access Journals (Sweden)

    Jing Li

    2016-01-01

    Full Text Available Vehicle sideslip angle is essential for active safety control systems. This paper presents a new hybrid Kalman filter to estimate vehicle sideslip angle based on a 3-DoF nonlinear vehicle dynamic model combined with the Magic Formula tire model. The hybrid Kalman filter is realized by combining the square-root cubature Kalman filter (SCKF), which has quick convergence and numerical stability, with the square-root cubature based receding-horizon Kalman FIR filter (SCRHKF), which has robustness against model uncertainty and temporary noise. Moreover, the SCKF and SCRHKF work in parallel, and the estimation outputs of the two filters are merged by an interacting multiple model (IMM) approach. Experimental results show the accuracy and robustness of the hybrid Kalman filter.

  9. Task-based detectability in CT image reconstruction by filtered backprojection and penalized likelihood estimation

    Energy Technology Data Exchange (ETDEWEB)

    Gang, Grace J. [Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Ontario M5G 2M9, Canada and Department of Biomedical Engineering, Johns Hopkins University, Baltimore Maryland 21205 (Canada); Stayman, J. Webster; Zbijewski, Wojciech [Department of Biomedical Engineering, Johns Hopkins University, Baltimore Maryland 21205 (United States); Siewerdsen, Jeffrey H., E-mail: jeff.siewerdsen@jhu.edu [Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Ontario M5G 2M9, Canada and Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Maryland 21205 (United States)

    2014-08-15

    Purpose: Nonstationarity is an important aspect of imaging performance in CT and cone-beam CT (CBCT), especially for systems employing iterative reconstruction. This work presents a theoretical framework for both filtered-backprojection (FBP) and penalized-likelihood (PL) reconstruction that includes explicit descriptions of nonstationary noise, spatial resolution, and task-based detectability index. Potential utility of the model was demonstrated in the optimal selection of regularization parameters in PL reconstruction. Methods: Analytical models for the local modulation transfer function (MTF) and noise-power spectrum (NPS) were investigated for both FBP and PL reconstruction, including explicit dependence on the object and spatial location. For FBP, a cascaded systems analysis framework was adapted to account for nonstationarity by separately calculating fluence and system gains for each ray passing through any given voxel. For PL, the point-spread function and covariance were derived using the implicit function theorem and first-order Taylor expansion according to Fessler [“Mean and variance of implicitly defined biased estimators (such as penalized maximum likelihood): Applications to tomography,” IEEE Trans. Image Process. 5(3), 493–506 (1996)]. Detectability index was calculated for a variety of simple tasks. The model for PL was used in selecting the regularization strength parameter to optimize task-based performance, with both a constant and a spatially varying regularization map. Results: Theoretical models of FBP and PL were validated in 2D simulated fan-beam data and found to yield accurate predictions of local MTF and NPS as a function of the object and the spatial location. The NPS for both FBP and PL exhibit a similar anisotropic nature depending on the pathlength (and therefore the object and spatial location within the object) traversed by each ray, with the PL NPS experiencing greater smoothing along directions with higher noise. The MTF of FBP

  10. Modeling of memristor-based chaotic systems using nonlinear Wiener adaptive filters based on backslash operator

    International Nuclear Information System (INIS)

    Zhao, Yibo; Jiang, Yi; Feng, Jiuchao; Wu, Lifu

    2016-01-01

    Highlights: • Novel nonlinear Wiener adaptive filters based on the backslash operator are proposed. • An identification approach to memristor-based chaotic systems using the proposed adaptive filters is presented. • The weight update algorithm and convergence characteristics for the proposed adaptive filters are derived. - Abstract: Memristor-based chaotic systems have complex dynamical behaviors, characterized by nonlinearity and hysteresis. Modeling and identification of their nonlinear model is an important premise for analyzing the dynamical behavior of memristor-based chaotic systems. This paper presents a novel nonlinear Wiener adaptive filtering identification approach to memristor-based chaotic systems. The linear part of the Wiener model consists of linear transversal adaptive filters; the nonlinear part consists of nonlinear adaptive filters based on the backslash operator, which captures the hysteresis characteristics of the memristor. The weight update algorithms for the linear and nonlinear adaptive filters are derived. Computer simulation results show the effectiveness of the approach as well as its fast convergence. Compared with adaptive nonlinear polynomial filters, the proposed nonlinear adaptive filters have a lower identification error.

  11. Contour extraction of echocardiographic images based on pre-processing

    Energy Technology Data Exchange (ETDEWEB)

    Hussein, Zinah Rajab; Rahmat, Rahmita Wirza; Abdullah, Lili Nurliyana [Department of Multimedia, Faculty of Computer Science and Information Technology, Department of Computer and Communication Systems Engineering, Faculty of Engineering University Putra Malaysia 43400 Serdang, Selangor (Malaysia); Zamrin, D M [Department of Surgery, Faculty of Medicine, National University of Malaysia, 56000 Cheras, Kuala Lumpur (Malaysia); Saripan, M Iqbal

    2011-02-15

    In this work we present a technique to extract heart contours from noisy echocardiograph images. Our technique is based on improving the image before applying contour detection, to reduce heavy noise and obtain better image quality. To achieve this, we combine several pre-processing techniques (filtering, morphological operations, and contrast adjustment) to avoid unclear edges and enhance the low contrast of echocardiograph images. After applying these techniques, we obtain a legible detection of the heart boundaries and valve movement with traditional edge detection methods.

  12. Contour extraction of echocardiographic images based on pre-processing

    International Nuclear Information System (INIS)

    Hussein, Zinah Rajab; Rahmat, Rahmita Wirza; Abdullah, Lili Nurliyana; Zamrin, D M; Saripan, M Iqbal

    2011-01-01

    In this work we present a technique to extract heart contours from noisy echocardiograph images. Our technique is based on improving the image before applying contour detection, to reduce heavy noise and obtain better image quality. To achieve this, we combine several pre-processing techniques (filtering, morphological operations, and contrast adjustment) to avoid unclear edges and enhance the low contrast of echocardiograph images. After applying these techniques, we obtain a legible detection of the heart boundaries and valve movement with traditional edge detection methods.

  13. Comparison of Deconvolution Filters for Photoacoustic Tomography.

    Directory of Open Access Journals (Sweden)

    Dominique Van de Sompel

    filter. The performance of the Fourier filter was found to be the poorest of all three methods, based on the reconstructed images' lowest resolution (blurriest appearance), generally lowest contrast-to-noise ratio, and lowest robustness to noise. Overall, the Tikhonov filter was deemed to produce the most desirable image reconstructions.

  14. Error analysis of filtering operations in pixel-duplicated images of diabetic retinopathy

    Science.gov (United States)

    Mehrubeoglu, Mehrube; McLauchlan, Lifford

    2010-08-01

    In this paper, diabetic retinopathy is chosen for a sample target image to demonstrate the effectiveness of image enlargement through pixel duplication in identifying regions of interest. Pixel duplication is presented as a simpler alternative to data interpolation techniques for detecting small structures in the images. A comparative analysis is performed on different image processing schemes applied to both original and pixel-duplicated images. Structures of interest are detected and classification parameters are optimized for minimum false positive detection in the original and enlarged retinal images. The error analysis demonstrates the advantages as well as shortcomings of pixel duplication in image enhancement when spatial averaging operations (smoothing filters) are also applied.
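Pixel duplication as used above amounts to integer nearest-neighbour upscaling; a one-line numpy sketch (illustrative, not the authors' code):

```python
import numpy as np

def duplicate_pixels(img, factor=2):
    """Enlarge an image by repeating each pixel `factor` times along both axes."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)
```

Unlike interpolation, every output pixel keeps an original grey value, which is why smoothing filters behave differently on duplicated images than on interpolated ones.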

  15. Nanophotonic Image Sensors.

    Science.gov (United States)

    Chen, Qin; Hu, Xin; Wen, Long; Yu, Yan; Cumming, David R S

    2016-09-01

    The increasing miniaturization and resolution of image sensors bring challenges to conventional optical elements such as spectral filters and polarizers, the properties of which are determined mainly by the materials used, including dye polymers. Recent developments in spectral filtering and optical manipulation techniques based on nanophotonics have opened up the possibility of an alternative method to control light spectrally and spatially. By integrating these technologies into image sensors, it will become possible to achieve high compactness, improved process compatibility, robust stability and tunable functionality. In this Review, recent representative achievements on nanophotonic image sensors are presented and analyzed, including image sensors with nanophotonic color filters and polarizers, metamaterial-based THz image sensors, filter-free nanowire image sensors and nanostructure-based multispectral image sensors. This novel combination of cutting-edge photonics research and well-developed commercial products may not only lead to an important application of nanophotonics but also offer great potential for next generation image sensors beyond Moore's Law expectations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Multirate Digital Filters Based on FPGA and Its Applications

    International Nuclear Information System (INIS)

    Sharaf El-Din, R.M.A.

    2013-01-01

    Digital Signal Processing (DSP) is one of the fastest growing techniques in the electronics industry. It is used in a wide range of application fields such as telecommunications, data communications, image enhancement and processing, video signals, digital TV broadcasting, and voice synthesis and recognition. Field Programmable Gate Arrays (FPGAs) offer a good solution for addressing the needs of high performance DSP systems. The focus of this thesis is on one of the basic DSP functions, namely filtering signals to remove unwanted frequency bands; Multirate Digital Filters (MDFs) are the main theme here. Theory and implementation of MDFs, as a special class of digital filters, are discussed. Multirate digital filters offer a number of attractive features, such as low coefficient word-length requirements and significant savings in computation and storage, which result in a significant reduction in dynamic power consumption. This thesis introduces an efficient FPGA realization of a multirate decimation filter with a narrow pass-band and narrow transition band to reduce the frequency sample rate by a factor of 64 for noise thermometer applications. The proposed multirate decimation filter is composed of three stages: the first stage is a Cascaded Integrator Comb (CIC) decimation filter, the second stage is a two-coefficient Half-Band (HB) filter, and the last stage is a sharper-transition HB filter. The frequency responses of the individual stages as well as the overall filter response have been demonstrated with full simulation in MATLAB. The design and implementation of the proposed MDF on an FPGA (Xilinx Virtex XCV800 BG432-4), using the VHSIC Hardware Description Language (VHDL), is introduced. The implementation areas of the proposed filter stages are compared: using the CIC-HB technique saves 18% of the design area compared to using six stages of HB decimation filters.
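The first (CIC) stage of such a decimator is easy to prototype in software. The following numpy sketch implements a generic N-stage CIC decimator; the half-band stages are omitted, the R and N values are arbitrary assumptions, and this is of course not the thesis's VHDL design:

```python
import numpy as np

def cic_decimate(x, R=8, N=3):
    """N-stage CIC decimator: N integrators at the input rate, downsample
    by R, then N comb (first-difference) stages at the output rate."""
    y = np.asarray(x, dtype=np.float64)
    for _ in range(N):
        y = np.cumsum(y)                 # integrator stage
    y = y[R - 1::R]                      # decimate by R
    for _ in range(N):
        y = np.diff(y, prepend=0.0)      # comb stage (differential delay M = 1)
    return y / R**N                      # normalize out the DC gain of R**N
```

For a constant input the output settles to the input value after an N-sample transient, confirming unit DC gain after normalization.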

  17. Comments on the paper 'A novel 3D wavelet-based filter for visualizing features in noisy biological data', by Moss et al.

    Energy Technology Data Exchange (ETDEWEB)

    Luengo Hendriks, Cris L.; Knowles, David W.

    2006-02-04

    Moss et al. (2005) describe, in a recent paper, a filter that they use to detect lines. We noticed that the wavelet on which this filter is based is a difference of uniform filters. This filter is an approximation to the second derivative operator, which is commonly implemented as the Laplace of Gaussian (or Marr-Hildreth) operator (Marr & Hildreth, 1980; Jahne, 2002), Figure 1. We have compared Moss' filter with 1) the Laplace of Gaussian operator, 2) an approximation of the Laplace of Gaussian using uniform filters, and 3) a few common noise reduction filters. The Laplace-like operators detect lines by suppressing image features both larger and smaller than the filter size. The noise reduction filters only suppress image features smaller than the filter size. By estimating the signal to noise ratio (SNR) and mean square difference (MSD) of the filtered results, we found that the filter proposed by Moss et al. does not outperform the Laplace of Gaussian operator. We also found that for images with extreme noise content, line detection filters perform better than the noise reduction filters when trying to enhance line structures. In less extreme cases of noise, the standard noise reduction filters perform significantly better than both the Laplace of Gaussian and Moss' filter.
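The two operators being compared can be reproduced directly with scipy; the window sizes and sigma below are arbitrary illustrative choices, not those of the comment or the original paper:

```python
import numpy as np
from scipy import ndimage

def diff_of_uniform(img, small=3, large=9):
    """Difference of uniform filters: the approximation to the second
    derivative operator on which the line-detection wavelet is based."""
    return ndimage.uniform_filter(img, small) - ndimage.uniform_filter(img, large)

def neg_log(img, sigma=2.0):
    """Negative Laplace of Gaussian (Marr-Hildreth); positive on bright ridges."""
    return -ndimage.gaussian_laplace(img, sigma)
```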

  18. Mathematical filtering minimizes metallic halation of titanium implants in MicroCT images.

    Science.gov (United States)

    Ha, Jee; Osher, Stanley J; Nishimura, Ichiro

    2013-01-01

    Microcomputed tomography (MicroCT) images containing titanium implants suffer from x-ray scattering artifacts, and the implant surface is critically affected by metallic halation. To reduce the metallic halation artifact, a nonlinear Total Variation denoising algorithm, the Split Bregman algorithm, was applied to the digital data set of MicroCT images. This study demonstrated that the use of a mathematical filter could successfully reduce metallic halation, facilitating the evaluation of osseointegration at the bone-implant interface in the reconstructed images.
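For readers without a Split Bregman implementation at hand, the total-variation objective being minimized can be illustrated with a plain gradient-descent sketch on the smoothed ROF functional. This is a didactic stand-in, much slower than Split Bregman, and all parameter values are assumptions:

```python
import numpy as np

def tv_denoise(img, weight=0.1, eps=1e-3, n_iter=200, step=0.2):
    """Gradient descent on 0.5*||u - img||^2 + weight * sum sqrt(|grad u|^2 + eps^2),
    a smoothed total-variation (ROF) objective with periodic boundaries."""
    u = img.astype(np.float64).copy()
    for _ in range(n_iter):
        ux = np.roll(u, -1, axis=1) - u          # forward differences
        uy = np.roll(u, -1, axis=0) - u
        mag = np.sqrt(ux**2 + uy**2 + eps**2)
        px, py = ux / mag, uy / mag
        # discrete divergence (adjoint of the forward difference operator)
        div = px - np.roll(px, 1, axis=1) + py - np.roll(py, 1, axis=0)
        u -= step * ((u - img) - weight * div)
    return u
```

Total variation penalizes oscillation while tolerating sharp jumps, which is why it suppresses halation-like noise without blurring the implant edge.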

  19. Quaternionic Spatiotemporal Filtering for Dense Motion Field Estimation in Ultrasound Imaging

    Directory of Open Access Journals (Sweden)

    Marion Adrien

    2010-01-01

    Full Text Available Abstract Blood motion estimation provides fundamental clinical information to prevent and detect pathologies such as cancer. Ultrasound imaging associated with Doppler methods is often used for blood flow evaluation. However, Doppler methods suffer from shortcomings such as limited spatial resolution and the inability to estimate lateral motion. Numerous methods such as block matching and decorrelation-based techniques have been proposed to overcome these limitations. In this paper, we propose an original method to estimate dense fields of vector velocity from ultrasound image sequences. Our proposal is based on a spatiotemporal approach and considers 2D+t data as a 3D volume. Orientation of the texture within this volume is related to velocity. Thus, we designed a bank of 3D quaternionic filters to estimate local orientation and then calculate local velocities. The method was applied to a large set of experimental and simulated flow sequences with low motion (1 mm/s) within small vessels (1 mm). Evaluation was conducted with several quantitative criteria such as the normalized mean error or the estimated mean velocity. The results obtained show the good behaviour of our method, characterizing the flows studied.

  20. STUDIES OF NGC 6720 WITH CALIBRATED HST/WFC3 EMISSION-LINE FILTER IMAGES. I. STRUCTURE AND EVOLUTION

    International Nuclear Information System (INIS)

    O'Dell, C. R.; Ferland, G. J.; Henney, W. J.; Peimbert, M.

    2013-01-01

    We have performed a detailed analysis of the Ring Nebula (NGC 6720) using Hubble Space Telescope WFC3 images and derived a new three-dimensional model. Existing high spectral resolution spectra played an important supplementary role in our modeling. It is shown that the Main Ring of the nebula is an ionization-bounded irregular non-symmetric disk with a central cavity and perpendicular extended lobes pointed almost toward the observer. The faint outer halos are determined to be fossil radiation, i.e., radiation from gas ionized in an earlier stage of the nebula when it was not ionization bounded. The narrowband WFC3 filters that isolate some of the emission lines are affected by broadening on their short wavelength side and all the filters were calibrated using ground-based spectra. The filter calibration results are presented in an appendix.

  1. Human Detection Based on the Generation of a Background Image by Using a Far-Infrared Light Camera

    Directory of Open Access Journals (Sweden)

    Eun Som Jeon

    2015-03-01

    Full Text Available The need for computer vision-based human detection has increased in fields such as security, intelligent surveillance and monitoring systems. However, performance enhancement of human detection based on visible light cameras is limited because of factors such as nonuniform illumination, shadows and low external light in the evening and at night. Consequently, human detection based on thermal (far-infrared) light cameras has been considered as an alternative. However, its performance is influenced by factors such as the low image resolution, low contrast and large noise of thermal images. It is also affected by the high temperature of backgrounds during the day. To solve these problems, we propose a new method for detecting human areas in thermal camera images. Compared to previous works, the proposed research is novel in the following four aspects. First, one background image is generated by median and average filtering, and additional filtering procedures based on maximum gray level, size filtering and region erasing are applied to remove the human areas from the background image. Second, candidate human regions in the input image are located by combining the pixel and edge difference images between the input and background images; the thresholds for the difference images are adaptively determined based on the brightness of the generated background image, and noise components are removed by component labeling, a morphological operation and size filtering. Third, detected areas that may contain more than two human regions are merged or separated based on the information in the horizontal and vertical histograms of the detected area; this procedure is also adaptively operated based on the brightness of the generated background image. Fourth, a further procedure for the separation and removal of the candidate human regions is performed based on the size and the height-to-width ratio of the candidate regions, considering the camera viewing direction.
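The background-generation step (temporal median plus spatial averaging) and the adaptive difference thresholding can be sketched as follows. This is a simplified stand-in for the paper's procedure; the 3x3 window and the k-sigma threshold rule are assumptions:

```python
import numpy as np
from scipy import ndimage

def background_image(frames):
    """Per-pixel temporal median over a frame stack, then 3x3 spatial averaging."""
    med = np.median(np.asarray(frames, dtype=np.float64), axis=0)
    return ndimage.uniform_filter(med, size=3)

def candidate_mask(frame, background, k=3.0):
    """Mark pixels whose absolute difference from the background exceeds an
    adaptive threshold (here: k times the difference image's standard deviation)."""
    diff = np.abs(np.asarray(frame, dtype=np.float64) - background)
    return diff > k * diff.std() + 1e-9
```

The temporal median rejects transient warm objects, so people passing through a few frames do not leak into the background estimate.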

  2. An improved three-dimension reconstruction method based on guided filter and Delaunay

    Science.gov (United States)

    Liu, Yilin; Su, Xiu; Liang, Haitao; Xu, Huaiyuan; Wang, Yi; Chen, Xiaodong

    2018-01-01

    Binocular stereo vision is becoming a research hotspot in the area of image processing. Starting from a traditional adaptive-weight stereo matching algorithm, we construct the cost volume by averaging the absolute difference (AD) of the RGB color channels and adding the x-derivative of the grayscale image. We then use a guided filter in the cost aggregation step and a weighted median filter for post-processing to address the edge problem. To obtain locations in real space, we combine the depth information with the camera calibration to project each pixel of the 2D image into a 3D coordinate matrix. For surface reconstruction, we add the concept of projection to a region-growing algorithm: all points are projected onto a 2D plane through the normals of the point cloud, triangulated there, and the resulting connection relationships among the points are carried back to 3D space. For the triangulation in the 2D plane we use the Delaunay algorithm because it yields optimal mesh quality. We configured OpenCV and PCL in Visual Studio for testing, and the experimental results show that the proposed algorithm has higher computational accuracy of disparity and reproduces the details of the real mesh model.
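The cost-volume construction described above (mean absolute colour difference blended with an x-derivative term) can be sketched per disparity hypothesis. The blending weight and the absence of truncation thresholds are simplifications relative to the paper's cost:

```python
import numpy as np

def ad_grad_cost(left, right, d, alpha=0.5):
    """Matching cost for disparity d: mean per-channel absolute difference (AD)
    blended with the absolute difference of grayscale x-derivatives."""
    shifted = np.roll(right, d, axis=1)              # shift right image by d pixels
    ad = np.abs(left - shifted).mean(axis=-1)        # AD averaged over RGB channels
    gx_l = np.gradient(left.mean(axis=-1), axis=1)   # x-derivative of grayscale
    gx_r = np.gradient(shifted.mean(axis=-1), axis=1)
    return alpha * ad + (1 - alpha) * np.abs(gx_l - gx_r)
```

Stacking the cost slices for all candidate disparities yields the cost volume, which the paper then smooths with a guided filter before selecting the winning disparity.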

  3. An Integrative Object-Based Image Analysis Workflow for Uav Images

    Science.gov (United States)

    Yu, Huai; Yan, Tianheng; Yang, Wen; Zheng, Hong

    2016-06-01

    In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of the geometric and radiometric corrections, subsequent panoramic mosaicking and hierarchical image segmentation for later Object Based Image Analysis (OBIA). More precisely, we first introduce an efficient image stitching algorithm after the geometric calibration and radiometric correction, which employs a fast feature extraction and matching by combining the local difference binary descriptor and the local sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts by the definition of an initial partition obtained by an over-segmentation algorithm, i.e., the simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the super-pixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to some criteria, such as uniform homogeneity and semantic consistency. Experimental results on processing the post-seismic UAV images of the 2013 Ya'an earthquake demonstrate the effectiveness and efficiency of our proposed method.

  4. AN INTEGRATIVE OBJECT-BASED IMAGE ANALYSIS WORKFLOW FOR UAV IMAGES

    Directory of Open Access Journals (Sweden)

    H. Yu

    2016-06-01

    Full Text Available In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of the geometric and radiometric corrections, subsequent panoramic mosaicking and hierarchical image segmentation for later Object Based Image Analysis (OBIA). More precisely, we first introduce an efficient image stitching algorithm after the geometric calibration and radiometric correction, which employs a fast feature extraction and matching by combining the local difference binary descriptor and the local sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts by the definition of an initial partition obtained by an over-segmentation algorithm, i.e., the simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the super-pixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to some criteria, such as uniform homogeneity and semantic consistency. Experimental results on processing the post-seismic UAV images of the 2013 Ya'an earthquake demonstrate the effectiveness and efficiency of our proposed method.

  5. GPU-Based Block-Wise Nonlocal Means Denoising for 3D Ultrasound Images

    Directory of Open Access Journals (Sweden)

    Liu Li

    2013-01-01

    Full Text Available Speckle suppression plays an important role in improving ultrasound (US) image quality. While many algorithms have been proposed for 2D US image denoising with remarkable filtering quality, relatively little work has been done on 3D ultrasound speckle suppression, where the whole volume rather than just one frame needs to be considered. The most crucial problem with 3D US denoising is that the computational complexity increases tremendously. The nonlocal means (NLM) algorithm provides an effective method for speckle suppression in US images. In this paper, a programmable graphics-processing-unit (GPU)-based fast NLM filter is proposed for 3D ultrasound speckle reduction. A Gamma distribution noise model, which is able to reliably capture image statistics for log-compressed ultrasound images, was used for the 3D block-wise NLM filter on the basis of a Bayesian framework. The most significant aspect of our method is the adoption of the powerful data-parallel computing capability of the GPU to improve the overall efficiency. Experimental results demonstrate that the proposed method can enormously accelerate the algorithm.
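As a CPU reference for what the GPU kernel parallelizes, here is a naive pixel-wise NLM for a 2D slice. The paper's filter is 3D, block-wise, and uses a Gamma noise model under a Bayesian framework; the Gaussian patch weighting below is the textbook variant, and all parameter values are assumptions:

```python
import numpy as np

def nlm_pixel(img, t=3, f=1, h=0.3):
    """Naive non-local means: search window (2t+1)^2, patches (2f+1)^2,
    weights exp(-patch_mse / h^2)."""
    pad = np.pad(img, f, mode='reflect')
    H, W = img.shape
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            p = pad[i:i + 2 * f + 1, j:j + 2 * f + 1]   # reference patch
            acc, wsum = 0.0, 0.0
            for a in range(max(i - t, 0), min(i + t + 1, H)):
                for b in range(max(j - t, 0), min(j + t + 1, W)):
                    q = pad[a:a + 2 * f + 1, b:b + 2 * f + 1]
                    w = np.exp(-np.mean((p - q) ** 2) / h ** 2)
                    acc += w * img[a, b]
                    wsum += w
            out[i, j] = acc / wsum
    return out
```

Every output pixel is computed independently of the others, which is exactly the data parallelism the GPU implementation exploits.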

  6. Object-Oriented Semisupervised Classification of VHR Images by Combining MedLDA and a Bilateral Filter

    Directory of Open Access Journals (Sweden)

    Shi He

    2015-01-01

    Full Text Available A Bayesian hierarchical model is presented to classify very high resolution (VHR) images in a semisupervised manner, in which both a maximum entropy discrimination latent Dirichlet allocation (MedLDA) model and a bilateral filter are combined into a novel application framework. The primary contribution of this paper is to nullify the disadvantages of traditional probabilistic topic models on pixel-level supervised information and to achieve the effective classification of VHR remote sensing images. This framework consists of the following two iterative steps. In the training stage, the model utilizes the central labeled pixel and its neighborhood, as a squared labeled image object, to train the classifiers. In the classification stage, each central unlabeled pixel with its neighborhood, as an unlabeled object, is classified as a user-provided geoobject class label with the maximum posterior probability. Gibbs sampling is adopted for model inference. The experimental results demonstrate that the proposed method outperforms two classical SVM-based supervised classification methods and probabilistic-topic-models-based classification methods.

  7. Noise reduction and functional maps image quality improvement in dynamic CT perfusion using a new k-means clustering guided bilateral filter (KMGB).

    Science.gov (United States)

    Pisana, Francesco; Henzler, Thomas; Schönberg, Stefan; Klotz, Ernst; Schmidt, Bernhard; Kachelrieß, Marc

    2017-07-01

    Dynamic CT perfusion (CTP) consists of repeated acquisitions of the same volume at different time steps, shortly before, during, and shortly after the injection of contrast media. Important functional information can be derived for each voxel, reflecting the local hemodynamic properties and hence the metabolism of the tissue. Different approaches are being investigated to exploit data redundancy and prior knowledge for noise reduction of such datasets, ranging from iterative reconstruction schemes to high-dimensional filters. We propose a new spatial bilateral filter which makes use of the k-means clustering algorithm and of an optimally calculated guiding image; we name the proposed filter the k-means clustering guided bilateral filter (KMGB). In this study, the KMGB filter is compared with the partial temporal non-local means filter (PATEN), with the time-intensity profile similarity (TIPS) filter, and with a new version derived from it by introducing the guiding image (GB-TIPS). All the filters were tested on a digital in-house developed brain CTP phantom, where noise was added to simulate 80 kV and 200 mAs (default scanning parameters), 100 mAs and 30 mAs. Moreover, the filters' performances were tested on 7 noisy clinical datasets with different pathologies in different body regions. The original contribution of our work is twofold: first, we propose an efficient algorithm to calculate a guiding image to improve the results of the TIPS filter; second, we propose the introduction of the k-means clustering step and demonstrate how this can potentially replace the TIPS part of the filter, obtaining better results at lower computational effort. As expected, in the GB-TIPS, the introduction of the guiding image limits the over-smoothing of the TIPS filter, improving spatial resolution by more than 50%. Furthermore, replacing the time-intensity profile similarity calculation with a fuzzy k-means clustering strategy (KMGB) allows control of the edge preserving
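The core idea of a bilateral filter whose range term is driven by a separate guiding image (rather than by the noisy input itself) can be sketched as follows. This illustrates the "guided" mechanism only, not the k-means clustering step or the CTP-specific guiding-image construction, and the parameter values are assumptions:

```python
import numpy as np

def guided_bilateral(img, guide, sigma_s=2.0, sigma_r=0.1, radius=4):
    """Bilateral filter computing range weights from a guiding image."""
    H, W = img.shape
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))   # spatial kernel
    gpad = np.pad(guide, radius, mode='reflect')
    ipad = np.pad(img, radius, mode='reflect')
    out = np.empty((H, W))
    for i in range(H):
        for j in range(W):
            gwin = gpad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            iwin = ipad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # range weights from the guide, not from the noisy image
            w = spatial * np.exp(-(gwin - guide[i, j])**2 / (2 * sigma_r**2))
            out[i, j] = (w * iwin).sum() / w.sum()
    return out
```

With a low-noise guide, noise is averaged within homogeneous regions while edges defined by the guide are left intact, which is why a good guiding image limits over-smoothing.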

  8. Tunable electro-optic filter stack

    Science.gov (United States)

    Fontecchio, Adam K.; Shriyan, Sameet K.; Bellingham, Alyssa

    2017-09-05

    A holographic polymer dispersed liquid crystal (HPDLC) tunable filter exhibits switching times of no more than 20 microseconds. The HPDLC tunable filter can be utilized in a variety of applications. An HPDLC tunable filter stack can be utilized in a hyperspectral imaging system capable of spectrally multiplexing hyperspectral imaging data acquired while the hyperspectral imaging system is airborne. HPDLC tunable filter stacks can be utilized in high speed switchable optical shielding systems, for example as a coating for a visor or an aircraft canopy. These HPDLC tunable filter stacks can be fabricated using a spin coating apparatus and associated fabrication methods.

  9. Intensity Variation Normalization for Finger Vein Recognition Using Guided Filter Based Single Scale Retinex.

    Science.gov (United States)

    Xie, Shan Juan; Lu, Yu; Yoon, Sook; Yang, Jucheng; Park, Dong Sun

    2015-07-14

    Finger vein recognition has been considered one of the most promising biometrics for personal authentication. However, the capacities and percentages of finger tissues (e.g., bone, muscle, ligament, water, fat, etc.) vary person by person. This usually causes poor quality of finger vein images, therefore degrading the performance of finger vein recognition systems (FVRSs). In this paper, the intrinsic factors of finger tissue causing poor quality of finger vein images are analyzed, and an intensity variation (IV) normalization method using guided filter based single scale retinex (GFSSR) is proposed for finger vein image enhancement. The experimental results on two public datasets demonstrate the effectiveness of the proposed method in enhancing the image quality and finger vein recognition accuracy.
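For orientation, classic single scale retinex (SSR) estimates the illumination with a Gaussian surround and subtracts it in the log domain; the paper's contribution is to replace that Gaussian with an edge-preserving guided filter. A sketch of the baseline SSR (the sigma and eps values are assumptions):

```python
import numpy as np
from scipy import ndimage

def single_scale_retinex(img, sigma=15.0, eps=1e-6):
    """SSR: log(image) minus log(Gaussian-smoothed illumination estimate).
    GFSSR would swap the Gaussian surround for a guided filter."""
    illumination = ndimage.gaussian_filter(img, sigma)
    return np.log(img + eps) - np.log(illumination + eps)
```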

  10. Choosing and using astronomical filters

    CERN Document Server

    Griffiths, Martin

    2014-01-01

    As a casual read through any of the major amateur astronomical magazines will demonstrate, there are filters available for all aspects of optical astronomy. This book provides a ready resource on the use of the following filters, among others, for observational astronomy or for imaging: light pollution filters, planetary filters, solar filters, neutral density filters for Moon observation, and deep-sky filters for such objects as galaxies, nebulae and more. Deep-sky objects can be imaged in much greater detail than was possible many years ago. Amateur astronomers can take

  11. Active Damping Techniques for LCL-Filtered Inverters-Based Microgrids

    DEFF Research Database (Denmark)

    Lorzadeh, Iman; Firoozabadi, Mehdi Savaghebi; Askarian Abyaneh, Hossein

    2015-01-01

    LCL-type filters are widely used in grid-connected voltage source inverters, since they provide switching-ripple reduction with lower cost and weight than the L-type counterpart. However, the inclusion of LCL filters in voltage source inverters complicates the current control design regarding system stability issues, because an inherent resonance peak appears due to zero impedance at the resonance frequency. Moreover, in grid-interactive low-voltage microgrids, the interactions among the LCL-filtered parallel inverters may result in a more complex multiresonance issue which may compromise stability. This paper presents the different active damping approaches for grid-connected inverters with LCL filters, which are based on high-order filters and additional feedback methods. These techniques are analyzed and discussed in detail.

  12. Binary-space-partitioned images for resolving image-based visibility.

    Science.gov (United States)

    Fu, Chi-Wing; Wong, Tien-Tsin; Tong, Wai-Shun; Tang, Chi-Keung; Hanson, Andrew J

    2004-01-01

    We propose a novel 2D representation for 3D visibility sorting, the Binary-Space-Partitioned Image (BSPI), to accelerate real-time image-based rendering. BSPI is an efficient 2D realization of a 3D BSP tree, which is commonly used in computer graphics for time-critical visibility sorting. Since the overall structure of a BSP tree is encoded in a BSPI, traversing a BSPI is comparable to traversing the corresponding BSP tree. BSPI performs visibility sorting efficiently and accurately in the 2D image space by warping the reference image triangle-by-triangle instead of pixel-by-pixel. Multiple BSPIs can be combined to solve "disocclusion," when an occluded portion of the scene becomes visible at a novel viewpoint. Our method is highly automatic, including a tensor voting preprocessing step that generates candidate image partition lines for BSPIs, filters the noisy input data by rejecting outliers, and interpolates missing information. Our system has been applied to a variety of real data, including stereo, motion, and range images.

  13. The model of illumination-transillumination for image enhancement of X-ray images

    Energy Technology Data Exchange (ETDEWEB)

    Lyu, Kwang Yeul [Shingu College, Sungnam (Korea, Republic of); Rhee, Sang Min [Kangwon National Univ., Chuncheon (Korea, Republic of)

    2001-06-01

    In digital image processing, the homomorphic filtering approach is derived from an illumination-reflectance model of the image. It can also be used with an illumination-transillumination model of X-ray film. Several X-ray images were enhanced with histogram equalization and with a homomorphic filter based on the illumination-transillumination model. The homomorphic filter confirmed the theoretical claims of image density range compression and balanced contrast enhancement, and was also found to be a valuable tool for processing analog X-ray images into digital images.
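The homomorphic filter itself can be sketched in a few lines: take the log (turning the multiplicative illumination-transillumination model into a sum), attenuate low frequencies while boosting high ones in the Fourier domain, and exponentiate back. The Gaussian-shaped transfer function and the gain/cutoff values below are assumptions, not the paper's:

```python
import numpy as np

def homomorphic(img, cutoff=0.1, low_gain=0.5, high_gain=1.5, eps=1e-6):
    """Homomorphic filtering: compress the illumination (low frequencies)
    and emphasize transillumination detail (high frequencies) in log space."""
    F = np.fft.fft2(np.log(img + eps))
    H, W = img.shape
    fy = np.fft.fftfreq(H)[:, None]
    fx = np.fft.fftfreq(W)[None, :]
    r2 = fx**2 + fy**2
    # high-emphasis transfer function: low_gain at DC, high_gain far from DC
    Hf = low_gain + (high_gain - low_gain) * (1.0 - np.exp(-r2 / cutoff**2))
    return np.exp(np.real(np.fft.ifft2(F * Hf))) - eps
```

A DC gain below one compresses the overall density range, while the high-frequency gain above one sharpens local contrast, which is the "balanced contrast enhancement" the abstract refers to.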

  14. Adaptive Image Transmission Scheme over Wavelet-Based OFDM System

    Institute of Scientific and Technical Information of China (English)

    Gao Xinying; Yuan Dongfeng; Zhang Haixia

    2005-01-01

    In this paper an adaptive image transmission scheme is proposed over a Wavelet-based OFDM (WOFDM) system with Unequal error protection (UEP) achieved by the design of a non-uniform signal constellation in MLC. Two different data division schemes, byte-based and bit-based, are analyzed and compared. In the bit-based data division scheme, different bits are protected unequally according to their different contributions to the image quality, which makes UEP combined with this scheme more powerful than UEP with the byte-based scheme. Simulation results demonstrate that image transmission by UEP with the bit-based data division scheme yields much higher PSNR values and noticeably better image quality. Furthermore, considering the tradeoff between complexity and BER performance, the Haar wavelet, with the shortest compactly supported filter length, is the most suitable among the orthogonal Daubechies wavelet series for our proposed system.

  15. Comparison of the image qualities of filtered back-projection, adaptive statistical iterative reconstruction, and model-based iterative reconstruction for CT venography at 80 kVp

    International Nuclear Information System (INIS)

    Kim, Jin Hyeok; Choo, Ki Seok; Moon, Tae Yong; Lee, Jun Woo; Jeon, Ung Bae; Kim, Tae Un; Hwang, Jae Yeon; Yun, Myeong-Ja; Jeong, Dong Wook; Lim, Soo Jin

    2016-01-01

    To evaluate the subjective and objective qualities of computed tomography (CT) venography images at 80 kVp using model-based iterative reconstruction (MBIR) and to compare these with those of filtered back projection (FBP) and adaptive statistical iterative reconstruction (ASIR) using the same CT data sets. Forty-four patients (mean age: 56.1 ± 18.1) who underwent 80 kVp CT venography (CTV) for the evaluation of deep vein thrombosis (DVT) over a 4-month period were enrolled in this retrospective study. The same raw data were reconstructed using FBP, ASIR, and MBIR. Objective and subjective image analyses were performed at the inferior vena cava (IVC), femoral vein, and popliteal vein. The mean CNR of MBIR was significantly greater than those of FBP and ASIR, and images reconstructed using MBIR had significantly lower objective image noise (p < .001). Subjective image quality and confidence in detecting DVT were significantly greater for MBIR than for FBP and ASIR (p < .005), and MBIR had the lowest score for subjective image noise (p < .001). CTV at 80 kVp with MBIR was superior to FBP and ASIR regarding subjective and objective image quality. (orig.)

  16. Flat microwave photonic filter based on hybrid of two filters

    International Nuclear Information System (INIS)

    Qi, Chunhui; Pei, Li; Ning, Tigang; Li, Jing; Gao, Song

    2010-01-01

    A new microwave photonic filter (MPF), a hybrid of two filters, that can realize both multiple taps and a flat bandpass or bandstop response is presented. Based on the phase character of a Mach-Zehnder modulator (MZM), a two-tap finite impulse response (FIR) filter is obtained as the first part. The second part is obtained by taking full advantage of the wavelength selectivity of a fiber Bragg grating (FBG) and the gain of an erbium-doped fiber (EDF). Combining the two filters, a flat bandpass or bandstop response is realized by changing the coupling factor k, the reflectivity R1 of FBG1, or the gain g of the EDF. Optimizing the system parameters, a flat bandpass response with an amplitude depth of more than 45 dB is obtained at k = 0.5, R1 = 0.33, g = 10, and a flat bandstop response is obtained at k = 0.4, R1 = 0.5, g = 2. In addition, the free-spectral range (FSR) can be controlled by changing the length of the EDF and the length difference between the two MZMs. The method is proved feasible by experiments. Such a method offers realistic solutions to support future radio-frequency (RF) optical communication systems.

  17. Improvement of nonlinear diffusion equation using relaxed geometric mean filter for low PSNR images

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan

    2013-01-01

    A new method to improve the performance of low PSNR image denoising is presented. The proposed scheme estimates edge gradient from an image that is regularised with a relaxed geometric mean filter. The proposed method consists of two stages; the first stage consists of a second order nonlinear an...

  18. A flexible new method for 3D measurement based on multi-view image sequences

    Science.gov (United States)

    Cui, Haihua; Zhao, Zhimin; Cheng, Xiaosheng; Guo, Changye; Jia, Huayu

    2016-11-01

    Three-dimensional measurement is the basis of reverse engineering. This paper develops a new, flexible and fast optical measurement method based on multi-view geometry theory. First, feature points are detected and matched with an improved SIFT algorithm; the Hellinger kernel is used to estimate the histogram distance instead of the traditional Euclidean distance, which makes the matching robust to weakly textured images. Then a new three-principle filter for the essential-matrix calculation is designed, and the essential matrix is computed using an improved a contrario RANSAC filter method. A single-view point cloud is constructed accurately from two view images; after this, the overlapping features are used to eliminate the accumulated errors caused by added view images, which improves the precision of the camera positions. Finally, the method is verified in a dental restoration CAD/CAM application; experimental results show that the proposed method is fast, accurate and flexible for 3D tooth measurement.
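
    The Hellinger comparison of descriptor histograms mentioned above can be sketched as follows (a minimal illustration; the function name is ours, and the improved SIFT pipeline itself is not reproduced):

```python
import numpy as np

def hellinger_distance(h1, h2):
    """Hellinger distance between two histograms (normalized internally).
    Ranges from 0 (identical distributions) to 1 (disjoint support)."""
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    h1 = h1 / h1.sum()
    h2 = h2 / h2.sum()
    return np.sqrt(0.5 * np.sum((np.sqrt(h1) - np.sqrt(h2)) ** 2))
```

    Unlike the Euclidean distance on raw bin counts, the square-root mapping damps the influence of a few dominant bins, which is why Hellinger-style matching tends to behave better on weakly textured patches.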

  19. Dual mode operation, highly selective nanohole array-based plasmonic colour filters

    Science.gov (United States)

    Fouladi Mahani, Fatemeh; Mokhtari, Arash; Mehran, Mahdiyeh

    2017-09-01

    Taking advantage of nanostructured metal films as plasmonic colour filters (PCFs) has evolved remarkably as an alternative to conventional chemical colour-filtering technologies. However, most of the proposed PCFs exhibit poor colour purity, focusing on generating either the additive or the subtractive colours. In this paper, we present dual-mode PCFs employing an opaque aluminium film patterned with sub-wavelength holes. Subtractive colours like cyan, magenta, and yellow are the results of the reflection mode of these filters, yielding optical efficiencies as high as 70%-80% and full width at half maximum of the stop-bands up to 40-50 nm. The colour selectivity of the transmission mode for the additive colours is also significant, owing to the enhanced performance obtained by using a relatively thick aluminium film in contact with a modified dielectric environment. These filters provide a simple design with one-step lithography in addition to compatibility with conventional CMOS processes. Moreover, they are polarization insensitive due to their symmetric geometry. A complete palette of pure subtractive and additive colours has been realized, with potential applications such as multispectral imaging, CMOS image sensors, displays, and colour printing.

  20. Fringing in MonoCam Y4 filter images

    International Nuclear Information System (INIS)

    Brooks, J.; Nomerotski, A.; Fisher-Levine, M.

    2017-01-01

    We study the fringing patterns observed in MonoCam, a camera with a single Large Synoptic Survey Telescope (LSST) CCD sensor. Images were taken at the U.S. Naval Observatory in Flagstaff, Arizona (NOFS) using its 1.3 m telescope and an LSST y4 filter. Fringing occurs when infrared light (700 nm or longer) reflects from the bottom surface of the CCD and constructively or destructively interferes with the incident light, producing a net "fringe" pattern superimposed on all images taken. Emission lines from the atmosphere, dominated by hydroxyl (OH) spectra, can change in their relative intensities as the night goes on, producing different fringe patterns in the images taken. We found through several methods that the general shape of the fringe patterns remained constant, though with slight changes in the amplitude and phase of the fringes. We also found that a superposition of fringes from two monochromatic lines taken in the lab offered a reasonable description of the sky data.

  1. TH-CD-202-04: Evaluation of Virtual Non-Contrast Images From a Novel Split-Filter Dual-Energy CT Technique

    International Nuclear Information System (INIS)

    Huang, J; Szczykutowicz, T; Bayouth, J; Miller, J

    2016-01-01

    Purpose: To compare the ability of two dual-energy CT techniques, a novel split-filter single-source technique of superior temporal resolution against an established sequential-scan technique, to remove iodine contrast from images with minimal impact on CT number accuracy. Methods: A phantom containing 8 tissue substitute materials and vials of varying iodine concentrations (1.7–20.1 mg I/mL) was imaged using a Siemens Edge CT scanner. Dual-energy virtual non-contrast (VNC) images were generated using the novel split-filter technique, in which a 120kVp spectrum is filtered by tin and gold to create high- and low-energy spectra with < 1 second temporal separation between the acquisition of low- and high-energy data. Additionally, VNC images were generated with the sequential-scan technique (80 and 140kVp) for comparison. CT number accuracy was evaluated for all materials at 15, 25, and 35mGy CTDIvol. Results: The spectral separation was greater for the sequential-scan technique than the split-filter technique, with dual-energy ratios of 2.18 and 1.26, respectively. Both techniques successfully removed iodine contrast, resulting in mean CT numbers within 60HU of 0HU (split-filter) and 40HU of 0HU (sequential-scan) for all iodine concentrations. Additionally, for iodine vials of varying diameter (2–20 mm) with the same concentration (9.9 mg I/mL), the system accurately detected iodine for all sizes investigated. Both dual-energy techniques resulted in reduced CT numbers for bone materials (by >400HU for the densest bone). Increasing the imaging dose did not improve the CT number accuracy for bone in VNC images. Conclusion: VNC images from the split-filter technique successfully removed iodine contrast. These results demonstrate a potential for improving dose calculation accuracy and reducing patient imaging dose, while achieving superior temporal resolution in comparison to sequential scans. For both techniques, inaccuracies in CT numbers for bone materials
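
    A VNC image is, at heart, a weighted combination of the low- and high-energy images whose weights sum to one and cancel the iodine enhancement. A minimal sketch under that two-material assumption (the function and the toy HU values below are ours, not the study's vendor processing chain):

```python
import numpy as np

def vnc_image(hu_low, hu_high, ratio=2.18):
    """Virtual non-contrast image from a low/high-energy pair.

    ratio is the dual-energy ratio r = iodine enhancement at low energy
    divided by enhancement at high energy. The weights satisfy a + b = 1
    and a*r + b = 0, so iodine cancels while water-like tissue is preserved."""
    a = -1.0 / (ratio - 1.0)        # weight on the low-energy image
    b = ratio / (ratio - 1.0)       # weight on the high-energy image
    return a * np.asarray(hu_low, dtype=float) + b * np.asarray(hu_high, dtype=float)
```

    For example, tissue at 40 HU plus iodine adding 218 HU at low energy and 100 HU at high energy (ratio 2.18) maps back to 40 HU, while pure tissue passes through unchanged. The small dual-energy ratio of the split-filter technique (1.26) makes the weights larger in magnitude, which is why its VNC noise and residual errors are harder to control.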

  2. TH-CD-202-04: Evaluation of Virtual Non-Contrast Images From a Novel Split-Filter Dual-Energy CT Technique

    Energy Technology Data Exchange (ETDEWEB)

    Huang, J; Szczykutowicz, T; Bayouth, J; Miller, J [University of Wisconsin, Madison, WI (United States)

    2016-06-15

    Purpose: To compare the ability of two dual-energy CT techniques, a novel split-filter single-source technique of superior temporal resolution against an established sequential-scan technique, to remove iodine contrast from images with minimal impact on CT number accuracy. Methods: A phantom containing 8 tissue substitute materials and vials of varying iodine concentrations (1.7–20.1 mg I/mL) was imaged using a Siemens Edge CT scanner. Dual-energy virtual non-contrast (VNC) images were generated using the novel split-filter technique, in which a 120kVp spectrum is filtered by tin and gold to create high- and low-energy spectra with < 1 second temporal separation between the acquisition of low- and high-energy data. Additionally, VNC images were generated with the sequential-scan technique (80 and 140kVp) for comparison. CT number accuracy was evaluated for all materials at 15, 25, and 35mGy CTDIvol. Results: The spectral separation was greater for the sequential-scan technique than the split-filter technique, with dual-energy ratios of 2.18 and 1.26, respectively. Both techniques successfully removed iodine contrast, resulting in mean CT numbers within 60HU of 0HU (split-filter) and 40HU of 0HU (sequential-scan) for all iodine concentrations. Additionally, for iodine vials of varying diameter (2–20 mm) with the same concentration (9.9 mg I/mL), the system accurately detected iodine for all sizes investigated. Both dual-energy techniques resulted in reduced CT numbers for bone materials (by >400HU for the densest bone). Increasing the imaging dose did not improve the CT number accuracy for bone in VNC images. Conclusion: VNC images from the split-filter technique successfully removed iodine contrast. These results demonstrate a potential for improving dose calculation accuracy and reducing patient imaging dose, while achieving superior temporal resolution in comparison to sequential scans. For both techniques, inaccuracies in CT numbers for bone materials

  3. Regularization of DT-MR images using a successive Fermat median filtering method.

    Science.gov (United States)

    Kwon, Kiwoon; Kim, Dongyoun; Kim, Sunghee; Park, Insung; Jeong, Jaewon; Kim, Taehwan; Hong, Cheolpyo; Han, Bongsoo

    2008-05-21

    Tractography using diffusion tensor magnetic resonance imaging (DT-MRI) is a method to determine the architecture of axonal fibers in the central nervous system by computing the direction of greatest diffusion in the white matter of the brain. To reduce the noise in DT-MRI measurements, a tensor-valued median filter, which is reported to be denoising and structure preserving in tractography, is applied. In this paper, we propose the successive Fermat (SF) method, which successively applies Fermat point theory for a triangle contained in the two-dimensional plane, as a median filtering method. We discuss the error analysis and present a numerical study of the SF method for phantom and experimental data. By considering the computing time and the image quality aspects of the numerical study simultaneously, we show that the SF method is much more efficient than the simple median (SM) and gradient descent (GD) methods.
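
    The Fermat point of a triangle minimizes the total distance to its three vertices, i.e. it is the geometric median of the vertex set. One standard way to compute it is the Weiszfeld iteration, sketched below; this solver is our assumption for illustration, since the abstract does not spell out the SF method's inner computation:

```python
import numpy as np

def fermat_point(points, iters=100, eps=1e-9):
    """Geometric median (Fermat point) of 2-D points via Weiszfeld iteration."""
    pts = np.asarray(points, dtype=float)
    x = pts.mean(axis=0)                      # centroid as the initial guess
    for _ in range(iters):
        d = np.linalg.norm(pts - x, axis=1)
        d = np.maximum(d, eps)                # guard against division by zero
        w = 1.0 / d
        x_new = (pts * w[:, None]).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < eps:
            break
        x = x_new
    return x
```

    For an equilateral triangle the Fermat point coincides with the centroid, which makes a convenient sanity check.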

  4. Regularization of DT-MR images using a successive Fermat median filtering method

    International Nuclear Information System (INIS)

    Kwon, Kiwoon; Kim, Dongyoun; Kim, Sunghee; Park, Insung; Jeong, Jaewon; Kim, Taehwan; Hong, Cheolpyo; Han, Bongsoo

    2008-01-01

    Tractography using diffusion tensor magnetic resonance imaging (DT-MRI) is a method to determine the architecture of axonal fibers in the central nervous system by computing the direction of greatest diffusion in the white matter of the brain. To reduce the noise in DT-MRI measurements, a tensor-valued median filter, which is reported to be denoising and structure preserving in tractography, is applied. In this paper, we propose the successive Fermat (SF) method, which successively applies Fermat point theory for a triangle contained in the two-dimensional plane, as a median filtering method. We discuss the error analysis and present a numerical study of the SF method for phantom and experimental data. By considering the computing time and the image quality aspects of the numerical study simultaneously, we show that the SF method is much more efficient than the simple median (SM) and gradient descent (GD) methods

  5. Regularization of DT-MR images using a successive Fermat median filtering method

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kiwoon; Kim, Dongyoun; Kim, Sunghee; Park, Insung; Jeong, Jaewon; Kim, Taehwan [Department of Biomedical Engineering, Yonsei University, Wonju, 220-710 (Korea, Republic of); Hong, Cheolpyo; Han, Bongsoo [Department of Radiological Science, Yonsei University, Wonju, 220-710 (Korea, Republic of)], E-mail: bshan@yonsei.ac.kr

    2008-05-21

    Tractography using diffusion tensor magnetic resonance imaging (DT-MRI) is a method to determine the architecture of axonal fibers in the central nervous system by computing the direction of greatest diffusion in the white matter of the brain. To reduce the noise in DT-MRI measurements, a tensor-valued median filter, which is reported to be denoising and structure preserving in tractography, is applied. In this paper, we propose the successive Fermat (SF) method, which successively applies Fermat point theory for a triangle contained in the two-dimensional plane, as a median filtering method. We discuss the error analysis and present a numerical study of the SF method for phantom and experimental data. By considering the computing time and the image quality aspects of the numerical study simultaneously, we show that the SF method is much more efficient than the simple median (SM) and gradient descent (GD) methods.

  6. SU-E-J-243: Possibility of Exposure Dose Reduction of Cone-Beam Computed Tomography in An Image Guided Patient Positioning System by Using Various Noise Suppression Filters

    International Nuclear Information System (INIS)

    Kamezawa, H; Arimura, H; Ohki, M; Shirieda, K; Kameda, N

    2014-01-01

    Purpose: To investigate the possibility of exposure dose reduction of cone-beam computed tomography (CBCT) in an image guided patient positioning system by using 6 noise suppression filters. Methods: First, reference-dose (RD) and low-dose (LD) CBCT (X-ray volume imaging system, Elekta Co.) images were acquired with a reference dose of 86.2 mGy (weighted CT dose index: CTDIw) and various low doses of 1.4 to 43.1 mGy, respectively. Second, an automated rigid registration for three axes was performed to estimate setup errors between a planning CT image and the LD-CBCT images, which were processed by 6 noise suppression filters, i.e., averaging filter (AF), median filter (MF), Gaussian filter (GF), bilateral filter (BF), edge preserving smoothing filter (EPF) and adaptive partial median filter (AMF). Third, residual errors representing the patient positioning accuracy were calculated as the Euclidean distance between the setup error vectors estimated using the LD-CBCT image and the RD-CBCT image. Finally, the relationships between the residual error and CTDIw were obtained for the 6 noise suppression filters, and the CTDIw for LD-CBCT images processed by the noise suppression filters was measured at the same residual error as obtained with the RD-CBCT. This approach was applied to an anthropomorphic pelvic phantom and two cancer patients. Results: For the phantom, the exposure dose could be reduced by 61% (GF) to 78% (AMF) by applying the noise suppression filters to the CBCT images. The exposure dose in a prostate cancer case could be reduced by 8% (AF) to 61% (AMF), and the exposure dose in a lung cancer case could be reduced by 9% (AF) to 37% (AMF). Conclusion: Using noise suppression filters, particularly an adaptive partial median filter, makes it feasible to decrease the additional exposure dose to patients in image guided patient positioning systems

  7. SU-E-J-243: Possibility of Exposure Dose Reduction of Cone-Beam Computed Tomography in An Image Guided Patient Positioning System by Using Various Noise Suppression Filters

    Energy Technology Data Exchange (ETDEWEB)

    Kamezawa, H [Graduate School of Medical Sciences, Kyushu University, Higashi-ku, Fukuoka (Japan); Fujimoto General Hospital, Miyakonojo, Miyazaki (Japan); Arimura, H; Ohki, M [Faculty of Medical Sciences, Kyushu University, Higashi-ku, Fukuoka (Japan); Shirieda, K; Kameda, N [Fujimoto General Hospital, Miyakonojo, Miyazaki (Japan)

    2014-06-01

    Purpose: To investigate the possibility of exposure dose reduction of cone-beam computed tomography (CBCT) in an image guided patient positioning system by using 6 noise suppression filters. Methods: First, reference-dose (RD) and low-dose (LD) CBCT (X-ray volume imaging system, Elekta Co.) images were acquired with a reference dose of 86.2 mGy (weighted CT dose index: CTDIw) and various low doses of 1.4 to 43.1 mGy, respectively. Second, an automated rigid registration for three axes was performed to estimate setup errors between a planning CT image and the LD-CBCT images, which were processed by 6 noise suppression filters, i.e., averaging filter (AF), median filter (MF), Gaussian filter (GF), bilateral filter (BF), edge preserving smoothing filter (EPF) and adaptive partial median filter (AMF). Third, residual errors representing the patient positioning accuracy were calculated as the Euclidean distance between the setup error vectors estimated using the LD-CBCT image and the RD-CBCT image. Finally, the relationships between the residual error and CTDIw were obtained for the 6 noise suppression filters, and the CTDIw for LD-CBCT images processed by the noise suppression filters was measured at the same residual error as obtained with the RD-CBCT. This approach was applied to an anthropomorphic pelvic phantom and two cancer patients. Results: For the phantom, the exposure dose could be reduced by 61% (GF) to 78% (AMF) by applying the noise suppression filters to the CBCT images. The exposure dose in a prostate cancer case could be reduced by 8% (AF) to 61% (AMF), and the exposure dose in a lung cancer case could be reduced by 9% (AF) to 37% (AMF). Conclusion: Using noise suppression filters, particularly an adaptive partial median filter, makes it feasible to decrease the additional exposure dose to patients in image guided patient positioning systems.

  8. Intensity Variation Normalization for Finger Vein Recognition Using Guided Filter Based Single Scale Retinex

    Directory of Open Access Journals (Sweden)

    Shan Juan Xie

    2015-07-01

    Finger vein recognition has been considered one of the most promising biometrics for personal authentication. However, the capacities and percentages of finger tissues (e.g., bone, muscle, ligament, water, fat) vary from person to person. This often causes poor quality of finger vein images, thereby degrading the performance of finger vein recognition systems (FVRSs). In this paper, the intrinsic factors of finger tissue causing poor quality of finger vein images are analyzed, and an intensity variation (IV) normalization method using guided filter based single scale retinex (GFSSR) is proposed for finger vein image enhancement. The experimental results on two public datasets demonstrate the effectiveness of the proposed method in enhancing the image quality and finger vein recognition accuracy.
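
    Single scale retinex normalizes intensity by subtracting a log-domain estimate of the illumination from the log image. A minimal sketch of that core step, using a box-filter illumination estimate as a simple stand-in for the paper's guided filter (the function names and window size are ours):

```python
import numpy as np

def box_blur(img, r):
    """Mean filter with a (2r+1)x(2r+1) window via a padded cumulative sum."""
    pad = np.pad(img, r, mode='edge')
    c = np.cumsum(np.cumsum(pad, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))           # zero row/col so c[i,j] = sum pad[:i,:j]
    k = 2 * r + 1
    s = c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]
    return s / (k * k)

def single_scale_retinex(img, r=15):
    """SSR: log(image) minus log(smoothed illumination estimate)."""
    img = np.asarray(img, dtype=float) + 1.0   # avoid log(0)
    illumination = box_blur(img, r)
    return np.log(img) - np.log(illumination)
```

    On a constant image the output is exactly zero, i.e. the illumination component is removed entirely; an edge-preserving smoother such as the guided filter serves the same role while avoiding halos at vein boundaries.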

  9. A new relative radiometric consistency processing method for change detection based on wavelet transform and a low-pass filter

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The purpose of this paper is to show the limitations of the existing radiometric normalization approaches and their disadvantages for change detection of artificial objects by comparing the existing approaches. On this basis, a preprocessing approach to radiometric consistency, based on wavelet transform and a spatial low-pass filter, has been devised. The approach first separates the high-frequency and low-frequency information by wavelet transform. Then, relative radiometric consistency processing based on a low-pass filter is conducted on the low-frequency parts. After processing, an inverse wavelet transform is conducted to obtain the result image. The experimental results show that this approach can substantially reduce the influence of linear or nonlinear radiometric differences in multi-temporal images on change detection.
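
    The decompose–process–reconstruct pipeline can be illustrated with a one-level Haar wavelet transform. As a simplified stand-in for the paper's low-pass consistency processing, the sketch below swaps the image's approximation band for the reference image's band while keeping the image's own detail bands (this replacement is our simplification, not the paper's exact filter):

```python
import numpy as np

def haar2(img):
    """One-level 2-D orthonormal Haar transform: returns (LL, LH, HL, HH)."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    ll = (a + b + c + d) / 2.0
    lh = (a + b - c - d) / 2.0
    hl = (a - b + c - d) / 2.0
    hh = (a - b - c + d) / 2.0
    return ll, lh, hl, hh

def ihaar2(ll, lh, hl, hh):
    """Exact inverse of haar2."""
    h, w = ll.shape
    out = np.empty((2 * h, 2 * w))
    out[0::2, 0::2] = (ll + lh + hl + hh) / 2.0
    out[0::2, 1::2] = (ll + lh - hl - hh) / 2.0
    out[1::2, 0::2] = (ll - lh + hl - hh) / 2.0
    out[1::2, 1::2] = (ll - lh - hl + hh) / 2.0
    return out

def radiometric_normalize(img, ref):
    """Match the low-frequency band of img to that of ref; keep img's detail."""
    _, lh, hl, hh = haar2(img)
    ll_ref, *_ = haar2(ref)
    return ihaar2(ll_ref, lh, hl, hh)
```

    Because only the approximation band is touched, high-frequency structure such as edges of artificial objects survives the normalization, which is exactly the property the paper exploits for change detection.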

  10. Optimization-based particle filter for state and parameter estimation

    Institute of Scientific and Technical Information of China (English)

    Li Fu; Qi Fei; Shi Guangming; Zhang Li

    2009-01-01

    In recent years, the theory of the particle filter has been developed and widely used for state and parameter estimation in nonlinear/non-Gaussian systems. Choosing a good importance density is a critical issue in particle filter design. In order to improve the approximation of the posterior distribution, this paper provides an optimization-based algorithm (the steepest descent method) to generate the proposal distribution and then sample particles from that distribution. The algorithm is applied to a 1-D case, and the simulation results show that the proposed particle filter performs better than the extended Kalman filter (EKF), the standard particle filter (PF), the extended Kalman particle filter (PF-EKF) and the unscented particle filter (UPF) both in efficiency and in estimation precision.
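
    For reference, the baseline that such proposal optimization improves on is the bootstrap particle filter, which simply samples from the prior (transition) density. A minimal sketch on a toy scalar model (the model, parameters and seed are ours; the paper's steepest-descent proposal is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(ys, n=500, q=1.0, r=1.0):
    """Bootstrap PF for the toy model x_k = 0.5*x_{k-1} + v,  y_k = x_k + w,
    with v ~ N(0, q) and w ~ N(0, r). Returns the posterior-mean estimates."""
    parts = rng.normal(0.0, 1.0, n)
    est = []
    for y in ys:
        parts = 0.5 * parts + rng.normal(0.0, np.sqrt(q), n)   # propagate (prior proposal)
        w = np.exp(-0.5 * (y - parts) ** 2 / r)                # Gaussian likelihood weights
        w /= w.sum()
        est.append(np.sum(w * parts))                          # weighted posterior mean
        parts = parts[rng.choice(n, size=n, p=w)]              # multinomial resampling
    return np.array(est)
```

    Because the prior proposal ignores the current measurement, many particles land in low-likelihood regions; an optimized proposal moves particles toward the posterior mode before weighting, raising the effective sample size.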

  11. An automatic method to determine cutoff frequency based on image power spectrum

    International Nuclear Information System (INIS)

    Beis, J.S.; Vancouver Hospital and Health Sciences Center, British Columbia; Celler, A.; Barney, J.S.

    1995-01-01

    The authors present an algorithm for automatically choosing the filter cutoff frequency (F_c) using the power spectrum of the projections. The method is based on the assumption that the expectation of the image power spectrum is the sum of the expectation of the blurred object power spectrum (dominant at low frequencies) plus a constant value due to Poisson noise. By modeling the discrete components of the noise-dominated high-frequency spectrum as a Gaussian distribution N(μ,σ), the Student t-test determines F_c as the highest frequency for which the image frequency components are unlikely to be drawn from N(μ,σ). The method is general and can be applied to any filter. In this work, the authors tested the approach using the Metz restoration filter on simulated, phantom, and patient data with good results. Quantitative performance of the technique was evaluated by plotting the recovery coefficient (RC) versus the NMSE of reconstructed images
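
    The cutoff selection can be sketched in a few lines. This simplified version fits the noise distribution to the top quarter of the spectrum and uses a z-score threshold in place of the paper's Student t-test; the function name, noise_frac and z_thresh are our assumptions:

```python
import numpy as np

def auto_cutoff(power, noise_frac=0.25, z_thresh=3.0):
    """Cutoff index: the highest frequency bin whose power is improbably large
    under the Gaussian noise model fitted to the highest-frequency bins."""
    p = np.asarray(power, dtype=float)
    n = len(p)
    tail = p[int(n * (1 - noise_frac)):]      # assume the top bins are pure noise
    mu, sigma = tail.mean(), tail.std(ddof=1)
    z = (p - mu) / sigma
    above = np.nonzero(z > z_thresh)[0]
    return above.max() if above.size else 0
```

    On a spectrum that decays into a flat noise floor, the returned index marks where the object signal disappears into the Poisson-noise plateau, which is where a restoration filter such as Metz should roll off.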

  12. Hierarchical detection of red lesions in retinal images by multiscale correlation filtering

    Science.gov (United States)

    Zhang, Bob; Wu, Xiangqian; You, Jane; Li, Qin; Karray, Fakhri

    2009-02-01

    This paper presents an approach to the computer aided diagnosis (CAD) of diabetic retinopathy (DR) -- a common and severe complication of long-term diabetes which damages the retina and causes blindness. Since red lesions are regarded as the first signs of DR, there has been extensive research on effective detection and localization of these abnormalities in retinal images. In contrast to existing algorithms, a new approach based on Multiscale Correlation Filtering (MSCF) and dynamic thresholding is developed. It consists of two levels, red lesion candidate detection (coarse level) and true red lesion detection (fine level). The approach was evaluated using data from the Retinopathy On-line Challenge (ROC) competition website, and we conclude that our method is effective and efficient.
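
    The coarse candidate-detection level can be sketched as correlation with zero-mean Gaussian templates at several scales, taking the pixelwise maximum response. This illustration detects a bright blob for simplicity (red lesions are dark, so in practice the image would first be inverted); the scales and function names are our assumptions:

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    g -= g.mean()                 # zero mean: flat regions give zero response
    return g / np.linalg.norm(g)

def mscf_response(img, sigmas=(1.0, 2.0, 3.0)):
    """Max correlation response over Gaussian templates of several scales."""
    img = np.asarray(img, dtype=float)
    resp = np.full(img.shape, -np.inf)
    for s in sigmas:
        r = int(3 * s)
        k = gaussian_kernel(s, r)
        kp = np.zeros_like(img)
        kp[:k.shape[0], :k.shape[1]] = k
        kp = np.roll(kp, (-r, -r), axis=(0, 1))     # center the kernel at the origin
        # circular cross-correlation via the FFT correlation theorem
        c = np.real(np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(kp))))
        resp = np.maximum(resp, c)
    return resp
```

    Candidate lesions are then the local maxima of this response above a (dynamic) threshold; matching the template scale to the lesion size is what lets a single pass pick up both microaneurysms and larger hemorrhages.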

  13. Rotationally invariant correlation filtering

    International Nuclear Information System (INIS)

    Schils, G.F.; Sweeney, D.W.

    1985-01-01

    A method is presented for analyzing and designing optical correlation filters that have tailored rotational invariance properties. The concept of a correlation of an image with a rotation of itself is introduced. A unified theory of rotation-invariant filtering is then formulated. The unified approach describes matched filters (with no rotation invariance) and circular-harmonic filters (with full rotation invariance) as special cases. The continuum of intermediate cases is described in terms of a cyclic convolution operation over angle. The angular filtering approach allows an exact choice for the continuous trade-off between loss of the correlation energy (or specificity regarding the image) and the amount of rotational invariance desired

  14. Knowledge-based iterative model reconstruction: comparative image quality and radiation dose with a pediatric computed tomography phantom

    International Nuclear Information System (INIS)

    Ryu, Young Jin; Choi, Young Hun; Cheon, Jung-Eun; Kim, Woo Sun; Kim, In-One; Ha, Seongmin

    2016-01-01

    CT of pediatric phantoms can provide useful guidance to the optimization of knowledge-based iterative reconstruction CT. To compare radiation dose and image quality of CT images obtained at different radiation doses reconstructed with knowledge-based iterative reconstruction, hybrid iterative reconstruction and filtered back-projection. We scanned a 5-year anthropomorphic phantom at seven levels of radiation. We then reconstructed CT data with knowledge-based iterative reconstruction (iterative model reconstruction [IMR] levels 1, 2 and 3; Philips Healthcare, Andover, MA), hybrid iterative reconstruction (iDose4, levels 3 and 7; Philips Healthcare, Andover, MA) and filtered back-projection. The noise, signal-to-noise ratio and contrast-to-noise ratio were calculated. We evaluated low-contrast resolution and detectability using low-contrast targets, and subjective and objective spatial resolution using line pairs and a wire. With radiation at 100 peak kVp and 100 mAs (3.64 mSv), the relative doses ranged from 5% (0.19 mSv) to 150% (5.46 mSv). Lower noise and higher signal-to-noise, contrast-to-noise and objective spatial resolution were generally achieved in ascending order of filtered back-projection, iDose4 levels 3 and 7, and IMR levels 1, 2 and 3, at all radiation dose levels. Compared with filtered back-projection at 100% dose, similar noise levels were obtained on IMR level 2 images at 24% dose and iDose4 level 3 images at 50% dose, respectively. Regarding low-contrast resolution, low-contrast detectability and objective spatial resolution, IMR level 2 images at 24% dose showed image quality comparable to filtered back-projection at 100% dose. Subjective spatial resolution was not greatly affected by the reconstruction algorithm. Reduced-dose IMR obtained at 0.92 mSv (24%) showed image quality similar to routine-dose filtered back-projection obtained at 3.64 mSv (100%) and half-dose iDose4 obtained at 1.81 mSv. (orig.)

  15. Knowledge-based iterative model reconstruction: comparative image quality and radiation dose with a pediatric computed tomography phantom.

    Science.gov (United States)

    Ryu, Young Jin; Choi, Young Hun; Cheon, Jung-Eun; Ha, Seongmin; Kim, Woo Sun; Kim, In-One

    2016-03-01

    CT of pediatric phantoms can provide useful guidance to the optimization of knowledge-based iterative reconstruction CT. To compare radiation dose and image quality of CT images obtained at different radiation doses reconstructed with knowledge-based iterative reconstruction, hybrid iterative reconstruction and filtered back-projection. We scanned a 5-year anthropomorphic phantom at seven levels of radiation. We then reconstructed CT data with knowledge-based iterative reconstruction (iterative model reconstruction [IMR] levels 1, 2 and 3; Philips Healthcare, Andover, MA), hybrid iterative reconstruction (iDose(4), levels 3 and 7; Philips Healthcare, Andover, MA) and filtered back-projection. The noise, signal-to-noise ratio and contrast-to-noise ratio were calculated. We evaluated low-contrast resolution and detectability using low-contrast targets, and subjective and objective spatial resolution using line pairs and a wire. With radiation at 100 peak kVp and 100 mAs (3.64 mSv), the relative doses ranged from 5% (0.19 mSv) to 150% (5.46 mSv). Lower noise and higher signal-to-noise, contrast-to-noise and objective spatial resolution were generally achieved in ascending order of filtered back-projection, iDose(4) levels 3 and 7, and IMR levels 1, 2 and 3, at all radiation dose levels. Compared with filtered back-projection at 100% dose, similar noise levels were obtained on IMR level 2 images at 24% dose and iDose(4) level 3 images at 50% dose, respectively. Regarding low-contrast resolution, low-contrast detectability and objective spatial resolution, IMR level 2 images at 24% dose showed image quality comparable to filtered back-projection at 100% dose. Subjective spatial resolution was not greatly affected by the reconstruction algorithm. Reduced-dose IMR obtained at 0.92 mSv (24%) showed image quality similar to routine-dose filtered back-projection obtained at 3.64 mSv (100%) and half-dose iDose(4) obtained at 1.81 mSv.

  16. Text extraction method for historical Tibetan document images based on block projections

    Science.gov (United States)

    Duan, Li-juan; Zhang, Xi-qun; Ma, Long-long; Wu, Jian

    2017-11-01

    Text extraction is an important initial step in digitizing historical documents. In this paper, we present a text extraction method for historical Tibetan document images based on block projections. The task of text extraction is treated as a text area detection and location problem. The images are divided equally into blocks, and the blocks are filtered using the categories of connected components and the corner point density. By analyzing the filtered blocks' projections, the approximate text areas can be located and the text regions extracted. Experiments on a dataset of historical Tibetan documents demonstrate the effectiveness of the proposed method.
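
    The block-projection idea can be sketched compactly. This illustration replaces the connected-component and corner-density criteria with a simple ink-density range (our simplification), then projects the surviving blocks onto rows and columns to locate the text area:

```python
import numpy as np

def text_bbox(binary, block=16, lo=0.05, hi=0.9):
    """Bounding box (row0, row1, col0, col1) of the text area, estimated from
    block-wise ink density followed by row/column projections of the mask."""
    img = np.asarray(binary, dtype=float)
    h, w = img.shape
    bh, bw = h // block, w // block
    dens = img[:bh * block, :bw * block].reshape(bh, block, bw, block).mean(axis=(1, 3))
    mask = (dens > lo) & (dens < hi)          # blocks with plausible text density
    rows = mask.any(axis=1).nonzero()[0]
    cols = mask.any(axis=0).nonzero()[0]
    if rows.size == 0:
        return None
    return (rows.min() * block, rows.max() * block + block,
            cols.min() * block, cols.max() * block + block)
```

    The density window discards both empty background blocks and solid decorative regions, so the projection profiles respond only to text-like blocks.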

  17. Digital Correlation based on Wavelet Transform for Image Detection

    International Nuclear Information System (INIS)

    Barba, L; Vargas, L; Torres, C; Mattos, L

    2011-01-01

    In this work a method is presented for the optimization of digital correlators to improve feature detection in images, using the wavelet transform as well as subband filtering. An approach of wavelet-based image contrast enhancement is proposed in order to increase the performance of digital correlators. The multiresolution representation is employed to improve the high-frequency content of images, taking into account the input contrast measured for the original image. The energy of correlation peaks and the discrimination level of several objects are improved with this technique. To demonstrate the potential of extracting characteristics using the wavelet transform, small objects inside reference images are detected successfully.

  18. CT Image Sequence Restoration Based on Sparse and Low-Rank Decomposition

    Science.gov (United States)

    Gou, Shuiping; Wang, Yueyue; Wang, Zhilong; Peng, Yong; Zhang, Xiaopeng; Jiao, Licheng; Wu, Jianshe

    2013-01-01

    Blurry organ boundaries and soft tissue structures present a major challenge in biomedical image restoration. In this paper, we propose a low-rank decomposition-based method for computed tomography (CT) image sequence restoration, where the CT image sequence is decomposed into a sparse component and a low-rank component. A new point spread function is employed in a Wiener filter to efficiently remove blur in the sparse component; Wiener filtering with a Gaussian PSF is used to recover the average image of the low-rank component. The recovered CT image sequence is then obtained by combining the recovered low-rank image with the recovered sparse image sequence. Our method achieves restoration results with higher contrast, sharper organ boundaries and richer soft tissue structure information, compared with existing CT image restoration methods. The robustness of our method was assessed in numerical experiments using three different low-rank models: Robust Principal Component Analysis (RPCA), Linearized Alternating Direction Method with Adaptive Penalty (LADMAP) and Go Decomposition (GoDec). Experimental results demonstrated that the RPCA model was the most suitable for CT images with small noise, whereas the GoDec model was the best for CT images with large noise. PMID:24023764
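
    The sparse-plus-low-rank split itself can be illustrated with a minimal GoDec-style alternation: the low-rank part is a truncated SVD of the residual, and the sparse part keeps only the largest-magnitude entries. The rank and cardinality parameters below are our assumptions, not the paper's settings:

```python
import numpy as np

def godec(X, rank=2, card=20, iters=30):
    """GoDec-style decomposition X ≈ L (rank <= rank) + S (at most card nonzeros)."""
    S = np.zeros_like(X)
    for _ in range(iters):
        # best rank-r approximation of X - S via truncated SVD
        U, s, Vt = np.linalg.svd(X - S, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # keep the 'card' largest-magnitude residual entries as the sparse part
        R = X - L
        S = np.zeros_like(X)
        idx = np.unravel_index(np.argsort(np.abs(R), axis=None)[-card:], X.shape)
        S[idx] = R[idx]
    return L, S
```

    Applied to a CT sequence flattened to one column per frame, L captures the slowly varying anatomy shared across frames while S isolates frame-specific detail, which is what allows the two components to be deblurred with different PSFs.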

  19. LED induced autofluorescence (LIAF) imager with eight multi-filters for oral cancer diagnosis

    Science.gov (United States)

    Huang, Ting-Wei; Cheng, Nai-Lun; Tsai, Ming-Hsui; Chiou, Jin-Chern; Mang, Ou-Yang

    2016-03-01

    Oral cancer is a serious and growing problem in many developing and developed countries. Simple oral visual screening by a clinician can reduce oral cancer deaths by 37,000 annually worldwide. However, the conventional oral examination, based on visual inspection and palpation of oral lesions, is not an objective and reliable approach for oral cancer diagnosis; it may delay hospital treatment for patients with oral cancer or allow the cancer to progress out of control to a late stage. Therefore, a device for oral cancer detection is developed for early diagnosis and treatment. A portable LED-induced autofluorescence (LIAF) imager was developed by our group. It contains multiple wavelengths of LED excitation light and a rotary filter ring with eight channels to capture ex-vivo oral tissue autofluorescence images. Compared with other devices for oral cancer diagnosis, the LIAF imager has an L-shaped probe that fixes the object distance, blocks the effect of ambient light, and allows observation of blind spots in the deep part between the gums (gingiva) and the lining of the mouth. Besides, the multiple LED excitation wavelengths can induce multiple autofluorescence signals, and the rotary filter ring of eight channels allows the imager to capture spectral images in multiple narrow bands. The prototype of the portable LIAF imager has been applied in clinical trials on several cases in Taiwan, and the clinical images under specific excitation show significant differences between normal and abnormal oral tissue in these cases.

  20. Tunable output-frequency filter algorithm for imaging through scattering media under LED illumination

    Science.gov (United States)

    Zhou, Meiling; Singh, Alok Kumar; Pedrini, Giancarlo; Osten, Wolfgang; Min, Junwei; Yao, Baoli

    2018-03-01

    We present a tunable output-frequency filter (TOF) algorithm to reconstruct an object from noisy experimental data acquired under low-power, partially coherent illumination, such as an LED, when imaging through scattering media. In the iterative algorithm, we employ Gaussian functions with different filter windows at different stages of the iteration process to reduce corruption from experimental noise and to search for a global minimum in the reconstruction. In comparison with the conventional iterative phase retrieval algorithm, we demonstrate that the proposed TOF algorithm achieves consistent and reliable reconstruction in the presence of experimental noise. Moreover, spatial resolution and distinctive features are retained in the reconstruction because the filter is applied only to the region outside the object. The feasibility of the proposed method is demonstrated by experimental results.
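
The filtering idea in this abstract can be sketched in a few lines: apply a Gaussian frequency-domain window only to the region outside the object support, leaving the object itself untouched. The function below is a simplified, hypothetical illustration of one such smoothing step (the paper's actual algorithm tunes the window width across the iterations of a phase-retrieval loop):

```python
import numpy as np

def tof_filter_step(field, support, sigma):
    """One TOF-style smoothing step (hypothetical sketch): low-pass the
    region outside the object support with a Gaussian frequency window
    of width `sigma`, leaving the support region untouched."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n)
    FX, FY = np.meshgrid(fx, fx, indexing="ij")
    window = np.exp(-(FX**2 + FY**2) / (2 * sigma**2))  # Gaussian filter window
    smoothed = np.real(np.fft.ifft2(np.fft.fft2(field) * window))
    # keep the object region intact, smooth only the outside
    return np.where(support, field, smoothed)
```

In the iterative setting described above, `sigma` would be varied between stages of the iteration rather than held fixed.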

  1. Dynamic beam filtering for miscentered patients.

    Science.gov (United States)

    Mao, Andrew; Shyr, William; Gang, Grace J; Stayman, J Webster

    2018-02-01

    Accurate centering of the patient within the bore of a CT scanner takes time and is often difficult to achieve precisely. Patient miscentering can result in significant dose and image noise penalties with the use of traditional bowtie filters. This work describes a system to dynamically position an x-ray beam filter during image acquisition to enable more consistent image performance and potentially lower dose needed for CT imaging. We propose a new approach in which two orthogonal low-dose scout images are used to estimate a parametric model of the object describing its shape, size, and location within the field of view (FOV). This model is then used to compute an optimal filter motion profile by minimizing the variance of the expected detector fluence for each projection. Dynamic filtration was implemented on a cone-beam CT (CBCT) test bench using two different physical filters: 1) an aluminum bowtie and 2) a structured binary filter called a multiple aperture device (MAD). Dynamic filtration performance was compared to a static filter in studies of dose and reconstruction noise as a function of the degree of miscentering of a homogeneous water phantom. Estimated filter trajectories were found to be largely sinusoidal with an amplitude proportional to the amount of miscentering. Dynamic filtration demonstrated an improved ability to keep the spatial distribution of dose and reconstruction noise at baseline levels across varying levels of miscentering, reducing the maximum noise and dose deviation from 53% to 15% and 42% to 14% respectively for the bowtie filter, and 25% to 8% and 24% to 15% respectively for the MAD filter. Dynamic positioning of beam filters during acquisition improves dose utilization and image quality over static filters for miscentered patients. Such dynamic filters relax positioning requirements and have the potential to reduce set-up time and lower dose requirements.
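
The reported sinusoidal filter trajectories follow from simple geometry: for a roughly circular object whose center is offset from the isocenter by (dx, dy), the projection of that offset onto the detector varies sinusoidally with gantry angle. A minimal sketch of such a trajectory (an illustration of the reported behavior, not the authors' variance-minimizing optimization):

```python
import numpy as np

def filter_trajectory(dx, dy, angles):
    """Lateral filter offset that keeps a bowtie centered on a circular
    object displaced by (dx, dy) from the isocenter; the projection of
    the offset onto the detector is sinusoidal in the gantry angle."""
    return dx * np.cos(angles) + dy * np.sin(angles)
```

The amplitude of the resulting profile equals the miscentering distance sqrt(dx² + dy²), matching the observation that amplitude is proportional to the amount of miscentering.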

  2. Fixed-pattern noise correction method based on improved moment matching for a TDI CMOS image sensor.

    Science.gov (United States)

    Xu, Jiangtao; Nie, Huafeng; Nie, Kaiming; Jin, Weimin

    2017-09-01

    In this paper, an improved moment matching method based on a spatial correlation filter (SCF) and a bilateral filter (BF) is proposed to correct the fixed-pattern noise (FPN) of a time-delay-integration CMOS image sensor (TDI-CIS). First, the values of row FPN (RFPN) and column FPN (CFPN) are estimated and compensated in the original image using the SCF and BF, respectively. Then the filtered image is processed by an improved moment matching method with a moving window. Experimental results based on a 128-stage TDI-CIS show that, after correcting the FPN in an image captured under uniform illumination, the standard deviation of the row mean vector (SDRMV) decreases from 5.6761 LSB to 0.1948 LSB, while the standard deviation of the column mean vector (SDCMV) decreases from 15.2005 LSB to 13.1949 LSB. In addition, for different images captured by different TDI-CISs, the average decreases in SDRMV and SDCMV are 5.4922 LSB and 2.0357 LSB, respectively. Comparative experimental results indicate that the proposed method can effectively correct the FPN of different TDI-CISs while maintaining image details, without any auxiliary equipment.
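
For context, plain column moment matching (the baseline that the improved method builds on) can be written compactly: each column's mean and standard deviation are mapped to the global image moments, which flattens column fixed-pattern noise. This is a generic sketch, not the paper's SCF/BF-augmented algorithm:

```python
import numpy as np

def column_moment_matching(img):
    """Plain column moment matching: map every column's mean/std to the
    global image moments, suppressing column fixed-pattern noise."""
    img = img.astype(float)
    g_mean, g_std = img.mean(), img.std()
    c_mean = img.mean(axis=0)
    c_std = img.std(axis=0)
    c_std[c_std == 0] = 1.0  # guard against perfectly flat columns
    return (img - c_mean) / c_std * g_std + g_mean
```

After this correction every column mean equals the global mean, so the SDCMV-style measure used in the paper drops to (numerically) zero for a uniform scene.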

  3. Superpixel-Based Feature for Aerial Image Scene Recognition

    Directory of Open Access Journals (Sweden)

    Hongguang Li

    2018-01-01

    Full Text Available Image scene recognition is a core technology for many aerial remote sensing applications. Different landforms are inputted as different scenes in aerial imaging, and all landform information is regarded as valuable for aerial image scene recognition. However, the conventional features of the Bag-of-Words model are designed using local points or other related information and thus are unable to fully describe landform areas. This limitation cannot be ignored when the aim is to ensure accurate aerial scene recognition. A novel superpixel-based feature is proposed in this study to characterize aerial image scenes. Then, based on the proposed feature, a scene recognition method using the Bag-of-Words model for aerial imaging is designed. The proposed superpixel-based feature exploits landform information, spanning from the top-level task of superpixel extraction of landforms to the bottom-level task of feature vector expression. This characterization technique comprises the following steps: simple linear iterative clustering-based superpixel segmentation, adaptive filter bank construction, Lie group-based feature quantification, and visual saliency model-based feature weighting. Experiments on image scene recognition are carried out using real image data captured by an unmanned aerial vehicle (UAV). The recognition accuracy of the proposed superpixel-based feature is 95.1%, which is higher than those of scene recognition algorithms based on other local features.

  4. Learning-based 3D surface optimization from medical image reconstruction

    Science.gov (United States)

    Wei, Mingqiang; Wang, Jun; Guo, Xianglin; Wu, Huisi; Xie, Haoran; Wang, Fu Lee; Qin, Jing

    2018-04-01

    Mesh optimization has been studied mainly from the graphics point of view: it often focuses on 3D surfaces obtained by optical and laser scanners. Isosurfaced meshes from medical image reconstruction, however, suffer from both staircase artifacts and noise: isotropic filters lead to shape distortion, while anisotropic ones maintain pseudo-features. We present a data-driven method for automatically removing these medical artifacts without introducing additional ones. We treat mesh optimization as a combination of vertex filtering and facet filtering in two stages: offline training and runtime optimization. Specifically, we first detect staircases based on the scanning direction of CT/MRI scanners and design a staircase-sensitive Laplacian filter (vertex-based) to remove them; we then design a unilateral filtered facet normal descriptor (uFND) for measuring the geometry around each facet of a given mesh, and learn regression functions from a set of medical meshes and their high-resolution reference counterparts that map the uFNDs to the facet normals of the reference meshes (facet-based). At runtime, we first apply the staircase-sensitive Laplacian filter to an input MC (Marching Cubes) mesh, then filter the mesh facet normal field using the learned regression functions, and finally deform the mesh to match the new normal field, obtaining a compact approximation of the high-resolution reference model. Tests show that our algorithm achieves higher-quality results than previous approaches regarding surface smoothness and surface accuracy.

  5. Complete filter-based cerebral embolic protection with transcatheter aortic valve replacement.

    Science.gov (United States)

    Van Gils, Lennart; Kroon, Herbert; Daemen, Joost; Ren, Claire; Maugenest, Anne-Marie; Schipper, Marguerite; De Jaegere, Peter P; Van Mieghem, Nicolas M

    2018-03-01

    To evaluate the value of left vertebral artery filter protection, in addition to current filter-based embolic protection technology, in achieving complete cerebral protection during TAVR. The occurrence of cerebrovascular events after transcatheter aortic valve replacement (TAVR) has fueled concern about its potential application in younger patients with longer life expectancy. Transcatheter cerebral embolic protection (TCEP) devices may limit periprocedural cerebrovascular events by preventing macro- and micro-embolization to the brain. Conventional filter-based TCEP devices cover three of the extracranial contributors to the brain but leave the left vertebral artery unprotected. Patients underwent TAVR with complete TCEP. A dual-filter system was deployed in the brachiocephalic trunk and left common carotid artery, with an additional single filter in the left vertebral artery. After TAVR, all filters were retrieved and sent for histopathological evaluation by an experienced pathologist. Eleven patients received a dual-filter system and nine of them received an additional left vertebral filter. In the remaining two patients, the left vertebral filter could not be deployed. No periprocedural strokes occurred. We found debris in all filters, consisting of thrombus, tissue-derived debris, and foreign body material. The left vertebral filter contained debris in as many patients as the Sentinel filters, and the size of the captured particles was similar across all filters. The left vertebral artery is an important entry route for embolic material to the brain during TAVR. Selective filter protection of the left vertebral artery revealed embolic debris in all patients. The clinical value of complete filter-based TCEP during TAVR warrants further research. © 2017 Wiley Periodicals, Inc.

  6. The attitude inversion method of geostationary satellites based on unscented particle filter

    Science.gov (United States)

    Du, Xiaoping; Wang, Yang; Hu, Heng; Gou, Ruixin; Liu, Hao

    2018-04-01

    The attitude information of geostationary satellites is difficult to obtain since they appear as non-resolved images on ground-based observation equipment used in space object surveillance. In this paper, an attitude inversion method for geostationary satellites based on the Unscented Particle Filter (UPF) and ground photometric data is presented. The UPF-based inversion algorithm is proposed to address the strongly nonlinear character of photometric-data inversion for satellite attitude, and it combines the advantages of the Unscented Kalman Filter (UKF) and the Particle Filter (PF). The update method improves particle selection by using the idea of the UKF to redesign the importance density function. Moreover, it uses the RMS-UKF to partially correct the prediction covariance matrix, which improves on the limited applicability of UKF-based inversion and mitigates the particle degradation and dilution of PF-based inversion. This paper describes the main principles and steps of the algorithm in detail; the correctness, accuracy, stability and applicability of the method are verified by a simulation experiment and a scaling experiment. The results show that the proposed method can effectively solve the problems of particle degradation and depletion in PF-based attitude inversion, as well as the unsuitability of the UKF for strongly nonlinear attitude inversion. Moreover, the inversion accuracy is clearly superior to that of the UKF and PF; in addition, in the case of inversion with large attitude error, the method can invert the attitude with few particles and high precision.

  7. An Extended Kalman Filter-Based Attitude Tracking Algorithm for Star Sensors.

    Science.gov (United States)

    Li, Jian; Wei, Xinguo; Zhang, Guangjun

    2017-08-21

    Efficiency and reliability are key issues when a star sensor operates in tracking mode. In the case of high attitude dynamics, the performance of existing attitude tracking algorithms degrades rapidly. In this paper an extended Kalman filter-based attitude tracking algorithm is presented. The star sensor is modeled as a nonlinear stochastic system, with the state estimate providing the three-degree-of-freedom attitude quaternion and angular velocity. The star positions in the star image are predicted and measured to estimate the optimal attitude. Furthermore, all the cataloged stars expected in the sensor field-of-view according to the predicted image motion are accessed using a catalog partition table to speed up the tracking; this is called star mapping. Software simulations and a night-sky experiment were performed to validate the efficiency and reliability of the proposed method.

  8. Image scale measurement with correlation filters in a volume holographic optical correlator

    Science.gov (United States)

    Zheng, Tianxiang; Cao, Liangcai; He, Qingsheng; Jin, Guofan

    2013-08-01

    A search engine containing various target images or different parts of a large scene is of great use for many applications, including object detection, biometric recognition, and image registration. The input image captured in real time is compared with all the template images in the search engine. A volume holographic correlator is one type of such search engine. It performs thousands of comparisons among the images at very high speed, with the correlation task accomplished mainly in optics. However, the input target image always contains some scale variation relative to the template images used for filtering. In that case, the correlation values cannot properly reflect the similarity of the images, so it is essential to estimate and eliminate the scale variation of the input target image. Scale measurement can be performed in three domains: spatial, spectral, and time. Most methods dealing with the scale factor are based on the spatial or spectral domains. In this paper, a time-domain method, called the time-sequential scaled method, is proposed to measure the scale factor of the input image. The method exploits the relationship between the scale variation and the correlation value of two images: a few artificially scaled versions of the input image are sent to be compared with the template images. The correlation value increases with the scale factor over the interval 0.8~1 and decreases over the interval 1~1.2. The original scale of the input image can therefore be measured by locating the largest correlation value obtained when correlating the artificially scaled input images with the template images. The measurement range for the scale factor is 0.8~4.8. A scale factor beyond 1.2 is measured by scaling the input image by a factor of 1/2, 1/3 or 1/4, correlating the rescaled input image with the template images, and estimating the new scale factor within 0.8~1.2.
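
The time-sequential idea, i.e. re-correlating artificially scaled copies of the input and picking the scale with the largest correlation value, can be illustrated with a toy digital sketch (the function names and the analytic test image are ours, not from the paper):

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-size images."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def estimate_scale(render, template, candidates):
    """Time-sequential scaling in miniature: re-render the input at each
    candidate scale, correlate with the template, keep the best match.
    `render(c)` must return the input image artificially rescaled by 1/c."""
    scores = [ncc(render(c), template) for c in candidates]
    return candidates[int(np.argmax(scores))]
```

With an analytic image function, rescaling is exact and the correlation value peaks at the true scale, mirroring the peak-search behavior described above.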

  9. Biogas Filter Based on Local Natural Zeolite Materials

    OpenAIRE

    Krido Wahono, Satriyo; Anggo Rizal, Wahyu

    2014-01-01

    UPT BPPTK LIPI has created a biogas filter to improve the purity of methane in biogas. The device is a cylindrical tube containing absorbent material based on local Indonesian natural zeolite. The absorbent has been activated and modified with other materials, and has multi-adsorption capacity for most of the impurity gases in biogas. The biogas filter increases the methane content of biogas by 5-20% and improves the biogas's performance such as ...

  10. Some practical considerations in finite element-based digital image correlation

    KAUST Repository

    Wang, Bo

    2015-04-20

    As an alternative to subset-based digital image correlation (DIC), the finite element-based (FE-based) DIC method has gained increasing attention in the experimental mechanics community. However, a literature survey reveals that some important issues have not been well addressed in the published literature. This work therefore points out a few important considerations in the practical implementation of the FE-based DIC method, along with simple but effective solutions. First, to better accommodate the intensity variations of deformed images that occur in real experiments, a robust zero-mean normalized sum of squared difference (ZNSSD) criterion, instead of the commonly used sum of squared difference criterion, is introduced to quantify the similarity between reference and deformed elements in FE-based DIC. Second, to reduce the bias error induced by image noise and imperfect intensity interpolation, low-pass filtering of the speckle images with a 5×5 pixel Gaussian filter prior to correlation analysis is presented. Third, to ensure that the iterative calculation of FE-based DIC converges correctly and rapidly, an efficient subset-based DIC method, instead of simple integer-pixel displacement searching, is used to provide an accurate initial guess of deformation for each calculation point. Also, the effects of various convergence criteria on the efficiency and accuracy of FE-based DIC are carefully examined, and a proper convergence criterion is recommended. The efficacy of these solutions is verified by numerical and real experiments. The results reveal that the improved FE-based DIC offers evident advantages over the existing FE-based DIC method in terms of accuracy and efficiency. © 2015 Elsevier Ltd. All rights reserved.
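
The ZNSSD criterion mentioned above is standard in DIC and easy to state: normalize each subset to zero mean and unit norm before taking the sum of squared differences, which makes the measure invariant to linear intensity changes. A minimal sketch:

```python
import numpy as np

def znssd(f, g):
    """Zero-mean normalized sum of squared differences between a
    reference subset f and a deformed subset g; invariant to linear
    intensity changes g -> a*g + b (a > 0)."""
    fz = (f - f.mean()) / np.sqrt(((f - f.mean())**2).sum())
    gz = (g - g.mean()) / np.sqrt(((g - g.mean())**2).sum())
    return float(((fz - gz)**2).sum())
```

The value is 0 for perfectly matching subsets (up to gain and offset) and grows toward 4 for anti-correlated ones, which is why it is robust to the illumination changes the paper discusses.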

  11. Elemental distribution imaging by energy-filtering transmission electron microscopy (EFTEM) and its applications

    International Nuclear Information System (INIS)

    Kurata, Hiroki

    1996-01-01

    EFTEM is a new microscopy technique aimed at visualizing quantitative elemental distributions at high resolution. The measurement principles and the present state of EFTEM studies are explained with examples of measured elemental distributions. EFTEM combines transmission electron microscopy with electron energy-loss spectroscopy (EELS). The EFTEM method places a slit at a specific energy in the loss spectrum and forms the microscopic image from only the electrons passing through the slit. Qualitative elemental analysis is obtained by observing the position of the absorption edge in the core-electron excitation spectrum, and quantitative analysis by determining the core-electron excitation strength of the specific atom selected by filtering with the energy-selecting slit. The binding state and the local structure in the neighborhood of the excited atom are determined from the fine structure of the absorption edge. In the chemical mapping method, the distribution of chemical binding states is visualized by imaging a specific peak of the fine structure, filtered with a narrow energy-selecting slit. Fine powder of lead chromate (PbCrO4) covered with silica glass is shown as a typical example of an elemental distribution image from the core-electron excitation spectrum. The quantitative analysis method for elemental distribution images is explained. The possibility of single-atom analysis at the nanometer scale was shown by the example of a nanotube observed by EFTEM. (S.Y.)

  12. Methods in quantitative image analysis.

    Science.gov (United States)

    Oberholzer, M; Ostreicher, M; Christen, H; Brühlmann, M

    1996-05-01

    histogram of an existing image (input image) into a new grey value histogram (output image) are most quickly handled by a look-up table (LUT). The histogram of an image can be influenced by gain, offset and gamma of the camera. Gain defines the voltage range, offset defines the reference voltage and gamma the slope of the regression line between the light intensity and the voltage of the camera. A very important descriptor of neighbourhood relations in an image is the co-occurrence matrix. The distance between the pixels (original pixel and its neighbouring pixel) can influence the various parameters calculated from the co-occurrence matrix. The main goals of image enhancement are elimination of surface roughness in an image (smoothing), correction of defects (e.g. noise), extraction of edges, identification of points, strengthening texture elements and improving contrast. In enhancement, two types of operations can be distinguished: pixel-based (point operations) and neighbourhood-based (matrix operations). The most important pixel-based operations are linear stretching of grey values, application of pre-stored LUTs and histogram equalisation. The neighbourhood-based operations work with so-called filters. These are organising elements with an original or initial point in their centre. Filters can be used to accentuate or to suppress specific structures within the image. Filters can work either in the spatial or in the frequency domain. The method used for analysing alterations of grey value intensities in the frequency domain is the Hartley transform. Filter operations in the spatial domain can be based on averaging or ranking the grey values occurring in the organising element. The most important filters, which are usually applied, are the Gaussian filter and the Laplace filter (both averaging filters), and the median filter, the top hat filter and the range operator (all ranking filters). 
Segmentation of objects is traditionally based on threshold grey values. (AB
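
The two operation classes described above can be illustrated briefly: a pixel-based LUT for linear grey-value stretching, and a neighbourhood-based ranking filter (the 3×3 median). This is a generic sketch of the textbook operations, not code from the article:

```python
import numpy as np

def stretch_lut(lo, hi):
    """Pre-computed 8-bit look-up table that linearly stretches the
    grey-value range [lo, hi] to [0, 255] (a pixel-based point operation)."""
    lut = (np.arange(256, dtype=float) - lo) * 255.0 / (hi - lo)
    return np.clip(lut, 0, 255).astype(np.uint8)

def median_filter3(img):
    """3x3 median filter (a ranking filter): replace each interior pixel
    by the median grey value within its organising element."""
    h, w = img.shape
    views = [img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
             for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    out = img.copy()
    out[1:-1, 1:-1] = np.median(np.stack(views), axis=0)
    return out
```

Applying the LUT is a single indexing operation (`lut[img]` for a `uint8` image), which is why LUT-based point operations are described as the fastest way to remap a histogram.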

  13. Proposing Wavelet-Based Low-Pass Filter and Input Filter to Improve Transient Response of Grid-Connected Photovoltaic Systems

    Directory of Open Access Journals (Sweden)

    Bijan Rahmani

    2016-08-01

    Full Text Available Available photovoltaic (PV) systems show a prolonged transient response when integrated into the power grid via active filters. On one hand, the conventional low-pass filter, employed within the integrated PV system, operates with a large delay, particularly in the presence of the system's low-order harmonics. On the other hand, the switching of the DC-DC (direct current) converters within PV units also prolongs the transient response of an integrated system, injecting harmonics and distortion through the PV-end current. This paper first develops a wavelet-based low-pass filter to improve the transient response of PV systems interconnected to grid lines. Further, a damped input filter is proposed within the PV system to address the converter switching issue. Finally, Matlab/Simulink simulations validate the effectiveness of the proposed wavelet-based low-pass filter and damped input filter within an integrated PV system.
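
As a rough illustration of the wavelet-based low-pass idea, a single-level Haar decomposition can remove the highest-frequency (switching-like) ripple exactly while passing the slow component. This toy sketch is ours and is far simpler than the filter proposed in the paper:

```python
import numpy as np

def haar_lowpass(x):
    """Single-level Haar wavelet low-pass: decompose, discard the detail
    coefficients, reconstruct (signal length assumed even)."""
    approx = (x[0::2] + x[1::2]) / 2.0   # Haar approximation coefficients
    y = np.empty_like(x, dtype=float)
    y[0::2] = approx                      # reconstruction with details zeroed
    y[1::2] = approx
    return y
```

Because the transform acts on short, local pairs of samples, its delay is bounded by the decomposition block rather than by a long impulse response, which hints at why a wavelet structure can respond faster than a conventional low-pass filter.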

  14. Low-power adaptive filter based on RNS components

    DEFF Research Database (Denmark)

    Bernocchi, Gian Luca; Cardarilli, Gian Carlo; Del Re, Andrea

    2007-01-01

    In this paper a low-power implementation of an adaptive FIR filter is presented. The filter is designed to meet the constraints of channel equalization for fixed wireless communications, which typically requires a large number of taps but allows a serial update of the filter coefficients based on the least mean squares (LMS) algorithm. Previous work showed that the use of the residue number system (RNS) for the variable FIR filter grants advantages in both area and power consumption. On the other hand, the use of a binary serial implementation of the adaptation algorithm eliminates the need for complex scaling circuits in RNS. The advantages in terms of area and speed of the presented filter, with respect to its two's complement counterpart, are evaluated for standard-cell implementations.

  15. Optimum filter-based discrimination of neutrons and gamma rays

    International Nuclear Information System (INIS)

    Amiri, Moslem; Prenosil, Vaclav; Cvachovec, Frantisek

    2015-01-01

    An optimum filter-based method for discrimination of neutrons and gamma rays in a mixed radiation field is presented. Existing filter-based implementations of discriminators require sample pulse responses in advance of the experiment run to build the filter coefficients, which makes them less practical. Our novel technique creates the coefficients during the experiment and improves their quality gradually. Applied to several sets of mixed neutron and photon signals obtained through different digitizers using a stilbene scintillator, this approach is analyzed and its discrimination quality is measured. (authors)
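
A static, simplified version of a filter-based discriminator can be sketched as follows: take the filter coefficients to be the difference of the mean neutron and gamma pulse templates and classify each pulse by its projection onto them. The paper's contribution is building and refining such coefficients on-line during the experiment; the sketch below uses fixed templates and synthetic pulse shapes of our own:

```python
import numpy as np

def build_filter(neutron_pulses, gamma_pulses):
    """Discriminating filter coefficients as the difference of the mean
    class templates (a simplified matched-filter discriminant)."""
    return neutron_pulses.mean(axis=0) - gamma_pulses.mean(axis=0)

def classify(pulse, weights, threshold=0.0):
    """Projection above the threshold -> neutron, otherwise gamma."""
    return "neutron" if float(pulse @ weights) > threshold else "gamma"
```

Neutron pulses in stilbene carry a larger slow scintillation component than gamma pulses, so the difference template weights the pulse tail, which is where the two classes separate.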

  16. Multiscale bilateral filtering for improving image quality in digital breast tomosynthesis

    Science.gov (United States)

    Lu, Yao; Chan, Heang-Ping; Wei, Jun; Hadjiiski, Lubomir M.; Samala, Ravi K.

    2015-01-01

    Purpose: Detection of subtle microcalcifications in digital breast tomosynthesis (DBT) is a challenging task because of the large, noisy DBT volume. It is important to enhance the contrast-to-noise ratio (CNR) of microcalcifications in DBT reconstruction. Most regularization methods depend on the local gradient and may treat the ill-defined margins or subtle spiculations of masses and subtle microcalcifications as noise because of their small gradient. The authors developed a new multiscale bilateral filtering (MSBF) regularization method for the simultaneous algebraic reconstruction technique (SART) to improve the CNR of microcalcifications without compromising the quality of masses. Methods: The MSBF exploits a multiscale structure of DBT images to suppress noise and selectively enhance high frequency structures. At the end of each SART iteration, every DBT slice is decomposed into several frequency bands via Laplacian pyramid decomposition. No regularization is applied to the low frequency bands, so that subtle edges of masses and structured background are preserved. Bilateral filtering is applied to the high frequency bands to enhance microcalcifications while suppressing noise. The regularized DBT images are used for updating in the next SART iteration. The new MSBF method was compared with the nonconvex total p-variation (TpV) method for noise regularization with SART. A GE GEN2 prototype DBT system was used for acquisition of projections at 21 angles in 3° increments over a ±30° range. The reconstructed image quality with no regularization (NR) and with the two regularization methods was compared using DBT scans of a heterogeneous breast phantom and several human subjects with masses and microcalcifications. The CNR and the full width at half maximum (FWHM) of the line profiles of microcalcifications, and of profiles across the spiculations, within their in-focus DBT slices were used as image quality measures. Results: The MSBF method reduced contouring artifacts
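
A miniature, two-band stand-in for the MSBF idea: split the image into a low band (left untouched, preserving subtle mass edges) and a high band (bilaterally filtered, suppressing noise while keeping high-contrast microcalcification-like spots). The code is a generic illustration with parameters of our own choosing, not the authors' Laplacian-pyramid SART regularizer:

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian blur with reflect padding."""
    r = int(3 * sigma)
    k = np.exp(-np.arange(-r, r + 1)**2 / (2 * sigma**2)); k /= k.sum()
    pad = np.pad(img, r, mode="reflect")
    tmp = np.apply_along_axis(lambda m: np.convolve(m, k, "valid"), 0, pad)
    return np.apply_along_axis(lambda m: np.convolve(m, k, "valid"), 1, tmp)

def bilateral(img, radius, sigma_s, sigma_r):
    """Brute-force bilateral filter for a small image."""
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    pad = np.pad(img, radius, mode="reflect")
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            wgt = spatial * np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
            out[i, j] = (wgt * patch).sum() / wgt.sum()
    return out

def two_band_bilateral(img):
    """Two-band stand-in for the multiscale scheme: leave the low band
    untouched, bilaterally filter only the high band."""
    low = gaussian_blur(img, 2.0)
    high = img - low
    return low + bilateral(high, 3, 2.0, high.std())
```

Because the bilateral range kernel down-weights neighbours whose values differ strongly from the centre pixel, a bright isolated spot in the high band is left nearly intact while the surrounding noise is averaged away.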

  17. Impulsive noise suppression in color images based on the geodesic digital paths

    Science.gov (United States)

    Smolka, Bogdan; Cyganek, Boguslaw

    2015-02-01

    In this paper a novel filtering design based on the concept of exploring the pixel neighborhood by digital paths is presented. The paths start from the boundary of a filtering window and reach its center. The cost of transitions between adjacent pixels is defined in a hybrid spatial-color space. Then, an optimal path of minimum total cost, leading from the pixels on the window's boundary to its center, is determined. The cost of an optimal path serves as a degree of similarity of the central pixel to the samples in the local processing window. If a pixel is an outlier, then all the paths starting from the window's boundary will have high costs, and the minimum one will also be high. The filter output is calculated as a weighted mean of the central pixel and an estimate constructed using the minimum cost assigned to each image pixel. So, first the costs of optimal paths are used to build a smoothed image, and in the second step the minimum cost of the central pixel is utilized to construct the weights of a soft-switching scheme. Experiments performed on a set of standard color images revealed that the efficiency of the proposed algorithm is superior to state-of-the-art filtering techniques in terms of objective restoration quality measures, especially for high noise contamination ratios. The proposed filter, due to its low computational complexity, can be applied to real-time image denoising and to the enhancement of video streams.
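
The core computation, the minimum-cost digital path from the window boundary to its center, is a shortest-path problem and can be sketched with Dijkstra's algorithm on the pixel grid. Here the transition cost is simply the absolute grey-value difference (the paper uses a richer hybrid spatial-color cost):

```python
import heapq
import numpy as np

def min_path_cost(window):
    """Minimum total cost of a digital path from the window boundary to
    its centre; each step between 8-neighbours costs the absolute grey
    value difference. A high minimum cost marks the centre as an outlier."""
    h, w = window.shape
    dist = np.full((h, w), np.inf)
    heap = []
    for i in range(h):
        for j in range(w):
            if i in (0, h - 1) or j in (0, w - 1):   # boundary pixels seed at 0
                dist[i, j] = 0.0
                heapq.heappush(heap, (0.0, i, j))
    while heap:
        d, i, j = heapq.heappop(heap)
        if d > dist[i, j]:
            continue                                  # stale queue entry
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if 0 <= ni < h and 0 <= nj < w:
                    nd = d + abs(float(window[ni, nj]) - float(window[i, j]))
                    if nd < dist[ni, nj]:
                        dist[ni, nj] = nd
                        heapq.heappush(heap, (nd, ni, nj))
    return float(dist[h // 2, w // 2])
```

An impulse at the centre yields a large minimum cost, while a centre pixel lying on a genuine edge is reached at zero cost along the edge, which is exactly the edge-preserving property the abstract describes.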

  18. Learnable despeckling framework for optical coherence tomography images

    Science.gov (United States)

    Adabi, Saba; Rashedi, Elaheh; Clayton, Anne; Mohebbi-Kalkhoran, Hamed; Chen, Xue-wen; Conforto, Silvia; Nasiriavanaki, Mohammadreza

    2018-01-01

    Optical coherence tomography (OCT) is a prevalent, interferometric, high-resolution imaging method with broad biomedical applications. Nonetheless, OCT images suffer from an artifact called speckle, which degrades image quality. Digital filters offer an opportunity for image improvement in clinical OCT devices, where hardware modification to enhance images is expensive. To reduce speckle, a wide variety of digital filters have been proposed; selecting the most appropriate filter for a given OCT image or image set is a challenging decision, especially in dermatology applications of OCT, where a wide variety of tissues are imaged. To tackle this challenge, we propose an expandable, learnable despeckling framework, which we call LDF. LDF decides which speckle reduction algorithm is most effective on a given image by learning a figure of merit (FOM) as a single quantitative image assessment measure. LDF is learnable: when implemented on an OCT machine, it is retrained on each given image or image set, and its performance improves. LDF is also expandable, meaning that any despeckling algorithm can easily be added to it. The architecture of LDF includes two main parts: (i) an autoencoder neural network and (ii) a filter classifier. The autoencoder learns the FOM from several quality assessment measures obtained from the OCT image, including signal-to-noise ratio, contrast-to-noise ratio, equivalent number of looks, edge preservation index, and mean structural similarity index. Subsequently, the filter classifier identifies the most efficient filter from the following categories: (a) sliding window filters, including median, mean, and symmetric nearest neighborhood; (b) adaptive statistical filters, including Wiener, homomorphic Lee, and Kuwahara; and (c) edge-preserving patch- or pixel-correlation-based filters, including non-local means, total variation, and block-matching and 3D filtering.

  19. An active damping method based on biquad digital filter for parallel grid-interfacing inverters with LCL filters

    DEFF Research Database (Denmark)

    Lu, Xiaonan; Sun, Kai; Huang, Lipei

    2014-01-01

    around the switching frequency and its multiples. Although the LCL-filters have several advantages compared to single inductance filter, its resonance problem should be noticed. Conventionally, the resonance analysis is mainly focused on the single inverter system, whereas in a renewable energy system...... to the conventional active damping approaches, the biquad filter based active damping method does not require additional sensors and control loops. Meanwhile, the multiple instable closed-loop poles of the parallel inverter system can be moved to the stable region simultaneously. Real-time simulations based on d...

  20. Graphene-based tunable terahertz filter with rectangular ring ...

    Indian Academy of Sciences (India)

    A plasmonic band-pass filter based on graphene rectangular ring resonator with double narrow gaps is proposed and numerically investigated by finite-difference time-domain (FDTD) simulations. For the filter with or without gaps, the resonant frequencies can be effectively adjusted by changing the width of the graphene ...

  1. Graphene-based tunable terahertz filter with rectangular ring ...

    Indian Academy of Sciences (India)

    WEI SU

    2017-08-16

    A plasmonic band-pass filter based on graphene rectangular ring resonator with double narrow gaps is proposed and numerically investigated by finite-difference time-domain (FDTD) simulations. For the filter with or without gaps, the resonant frequencies can be effectively adjusted by changing ...

  2. Particle filter based MAP state estimation: A comparison

    NARCIS (Netherlands)

    Saha, S.; Boers, Y.; Driessen, J.N.; Mandal, Pranab K.; Bagchi, Arunabha

    2009-01-01

    MAP estimation is a good alternative to MMSE for certain applications involving nonlinear, non-Gaussian systems. Recently, a new particle filter based MAP estimator has been derived. This new method extracts the MAP estimate directly from the output of a running particle filter. In the recent past, a Viterbi
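
A heavily simplified sketch of the idea (1-D random-walk state with a direct noisy observation; the estimator in the record is more general, and resampling is omitted here): approximate the posterior density at each particle as (predictive mixture) x (likelihood) and take the argmax for the MAP, while the weighted mean gives the MMSE estimate. All model parameters below are hypothetical.

```python
import math, random

random.seed(0)

def gauss(x, mu, sigma):
    """Gaussian density N(x; mu, sigma^2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def pf_step(particles, weights, y, q=1.0, r=0.5):
    """One bootstrap-filter step for x_t = x_{t-1} + N(0, q^2), y_t = x_t + N(0, r^2).
    Returns updated particles/weights plus the MMSE and MAP estimates."""
    prev = list(particles)
    particles = [x + random.gauss(0.0, q) for x in prev]          # propagate
    weights = [w * gauss(y, x, r) for w, x in zip(weights, particles)]
    s = sum(weights)
    weights = [w / s for w in weights]
    mmse = sum(w * x for w, x in zip(weights, particles))
    # MAP extraction: evaluate the approximate posterior density
    # (predictive mixture from the previous cloud times the likelihood) at each particle
    def post(x):
        prior = sum(gauss(x, xp, q) for xp in prev) / len(prev)
        return prior * gauss(y, x, r)
    x_map = max(particles, key=post)
    return particles, weights, mmse, x_map

particles = [random.gauss(0.0, 1.0) for _ in range(200)]
weights = [1.0 / 200] * 200
for y in [0.2, 0.4, 0.1]:
    particles, weights, mmse, x_map = pf_step(particles, weights, y)
```

For multimodal posteriors, `x_map` and `mmse` can differ substantially, which is exactly when MAP estimation is preferable.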

  3. SU-G-IeP4-15: Ultrasound Imaging of Absorbable Inferior Vena Cava Filters for Proper Placement

    Energy Technology Data Exchange (ETDEWEB)

    Mitcham, T; Bouchard, R; Melancon, A; Melancon, M [University of Texas MD Anderson Cancer Center, Houston, TX (United States); Eggers, M [Adient Medical Technologies, Pearland, TX (United States)

    2016-06-15

    Purpose: Inferior vena cava filters (IVCFs) are used in patients with a high risk of pulmonary embolism in situations when the use of blood thinning drugs would be inappropriate. These filters are implanted under x-ray guidance; however, this provides a dose of ionizing radiation to both patient and physician. B-mode ultrasound (US) imaging allows for localization of certain implanted devices without radiation dose concerns. The goal of this study was to investigate the feasibility of imaging the placement of absorbable IVCFs using US imaging to alleviate the dosage concern inherent to fluoroscopy. Methods: A phantom was constructed to mimic a human IVC using tissue-mimicking material with 0.5 dB/cm/MHz acoustic attenuation, while agar inclusions were used to model acoustic mismatch at the venous interface. Absorbable IVCFs were imaged at 15 cm depth using B-mode US at 2, 3, 5, and 7 MHz transmit frequencies. Then, to determine temporal stability, the IVCF was left in the phantom for 10 weeks; during this time, the IVCF was imaged using the same techniques as above, while the integrity of the filter was analyzed by inspecting for fiber discontinuities. Results: Visualization of the inferior vena cava filter was possible at 5, 7.5, and 15 cm depth at US central frequencies of 2, 3, 5, and 7 MHz. Imaging the IVCF at 5 MHz yielded the clearest images while maintaining acceptable spatial resolution for identifying the IVCFs, while lower frequencies provided noticeably worse image quality. No obvious degradation was observed over the course of the 10 weeks in a static phantom environment. Conclusion: Biodegradable IVCF localization was possible up to 15 cm in depth using conventional B-mode US in a tissue-mimicking phantom. This leads to the potential for using B-mode US to guide the placement of the IVCF upon deployment by the interventional radiologist. Mitch Eggers is an owner of Adient Medical Technologies. There are no other conflicts of interest to disclose.

  4. Cervical spine imaging in trauma: Does the use of grid and filter combination improve visualisation of the cervicothoracic junction?

    International Nuclear Information System (INIS)

    Goyal, Nimit; Rachapalli, Vamsidhar; Burns, Helen; Lloyd, David C.F.

    2011-01-01

    Purpose: To evaluate the usefulness of filter and anti-scatter grid combination in demonstrating the cervicothoracic junction in lateral cervical spine radiographs performed for trauma patients. Methods: Following a change in departmental protocol in our hospital, anti-scatter grid and filter are routinely used for lateral cervical spine radiograph in all trauma patients with immobilised cervical spine. A retrospective study was done to compare the efficacy of lateral cervical spine radiographs in demonstrating the cervicothoracic junction for a period of three months before and after the implementation of the change. All images were independently evaluated by two observers. Results: 253 trauma patients had a lateral cervical spine radiograph done in January to March 2003 without using the anti-scatter grid and filter while 309 patients in January to March 2007, using filter and grid. Inter-observer variability between the two observers was calculated using Cohen's kappa, which showed good and very good agreement for 2003 and 2007 respectively. 126 (49.8%) images adequately demonstrated the cervicothoracic junction without using filter and grid while 189 (61.1%) were adequate following their use. This was statistically significant (Fisher exact test, p value = 0.0081). Conclusion: The use of filter and anti-scatter grids improves the visualisation of cervicothoracic junction in lateral cervical spine imaging and reduces the need to repeat exposure.
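
The reported significance can be sanity-checked with a stdlib-only Fisher exact test (full hypergeometric enumeration over tables with the same margins); this is illustrative and is not guaranteed to reproduce the paper's exact rounding:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]:
    sum the probabilities of all same-margin tables no more likely than the observed one."""
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d
    def p_of(x):  # hypergeometric probability that cell (1,1) equals x
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)
    p_obs = p_of(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(p_of(x) for x in range(lo, hi + 1) if p_of(x) <= p_obs * (1 + 1e-9))

# Adequate cervicothoracic-junction views: 126/253 without grid+filter, 189/309 with
p = fisher_exact_two_sided(126, 253 - 126, 189, 309 - 189)
```

The result lands in the same sub-0.01 range as the p = 0.0081 quoted in the abstract.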

  5. Cervical spine imaging in trauma: Does the use of grid and filter combination improve visualisation of the cervicothoracic junction?

    Energy Technology Data Exchange (ETDEWEB)

    Goyal, Nimit, E-mail: nimitgoyal@doctors.org.u [University Hospital of Wales, Heath Park, Cardiff, CF14 4XW (United Kingdom); Rachapalli, Vamsidhar; Burns, Helen; Lloyd, David C.F. [University Hospital of Wales, Heath Park, Cardiff, CF14 4XW (United Kingdom)

    2011-02-15

    Purpose: To evaluate the usefulness of filter and anti-scatter grid combination in demonstrating the cervicothoracic junction in lateral cervical spine radiographs performed for trauma patients. Methods: Following a change in departmental protocol in our hospital, anti-scatter grid and filter are routinely used for lateral cervical spine radiograph in all trauma patients with immobilised cervical spine. A retrospective study was done to compare the efficacy of lateral cervical spine radiographs in demonstrating the cervicothoracic junction for a period of three months before and after the implementation of the change. All images were independently evaluated by two observers. Results: 253 trauma patients had a lateral cervical spine radiograph done in January to March 2003 without using the anti-scatter grid and filter while 309 patients in January to March 2007, using filter and grid. Inter-observer variability between the two observers was calculated using Cohen's kappa, which showed good and very good agreement for 2003 and 2007 respectively. 126 (49.8%) images adequately demonstrated the cervicothoracic junction without using filter and grid while 189 (61.1%) were adequate following their use. This was statistically significant (Fisher exact test, p value = 0.0081). Conclusion: The use of filter and anti-scatter grids improves the visualisation of cervicothoracic junction in lateral cervical spine imaging and reduces the need to repeat exposure.

  6. Generalized Selection Weighted Vector Filters

    Directory of Open Access Journals (Sweden)

    Rastislav Lukac

    2004-09-01

    Full Text Available This paper introduces a class of nonlinear multichannel filters capable of removing impulsive noise in color images. The proposed generalized selection weighted vector filter class constitutes a powerful filtering framework for multichannel signal processing. Previously defined multichannel filters such as the vector median filter, basic vector directional filter, directional-distance filter, weighted vector median filters, and weighted vector directional filters are treated from a global viewpoint using the proposed framework. Robust order-statistic concepts and an increased degree of freedom in filter design make the proposed method attractive for a variety of applications. The introduced multichannel sigmoidal adaptation of the filter parameters, and its modifications, allow the filter parameters to accommodate varying signal and noise statistics. Simulation studies reported in this paper indicate that the proposed filter class is computationally attractive, yields excellent performance, and is able to preserve fine details and color information while efficiently suppressing impulsive noise. This paper is an extended version of the paper by Lukac et al. presented at the 2003 IEEE-EURASIP Workshop on Nonlinear Signal and Image Processing (NSIP '03) in Grado, Italy.
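
The vector median filter that this framework generalizes picks, in each window, the pixel vector minimizing the summed distance to all other vectors, which is what makes it robust to impulsive color noise. A minimal sketch for RGB images stored as nested lists:

```python
def vector_median(window):
    """Vector median of a list of RGB triples: the sample whose summed
    Euclidean distance to all other samples in the window is smallest."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    return min(window, key=lambda p: sum(dist(p, q) for q in window))

def vmf(img):
    """Apply a 3x3 vector median filter to an RGB image,
    leaving the one-pixel border unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            window = [img[r + dr][c + dc] for dr in (-1, 0, 1) for dc in (-1, 0, 1)]
            out[r][c] = vector_median(window)
    return out
```

The weighted and directional variants named in the abstract change the distance measure or weight each window sample, but keep this selection structure.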

  7. Use of astronomy filters in fluorescence microscopy.

    Science.gov (United States)

    Piper, Jörg

    2012-02-01

    Monochrome astronomy filters are well suited for use as excitation or suppression filters in fluorescence microscopy. Because of their particular optical design, such filters can be combined with standard halogen light sources for excitation in many fluorescent probes. In this "low energy excitation," photobleaching (fading) or other irritations of native specimens are avoided. Photomicrographs can be taken from living motile fluorescent specimens also with a flash so that fluorescence images can be created free from indistinctness caused by movement. Special filter cubes or dichroic mirrors are not needed for our method. By use of suitable astronomy filters, fluorescence microscopy can be carried out with standard laboratory microscopes equipped with condensers for bright-field (BF) and dark-field (DF) illumination in transmitted light. In BF excitation, the background brightness can be modulated in tiny steps up to dark or black. Moreover, standard industry microscopes fitted with a vertical illuminator for examinations of opaque probes in DF or BF illumination based on incident light (wafer inspections, for instance) can also be used for excitation in epi-illumination when adequate astronomy filters are inserted as excitatory and suppression filters in the illuminating and imaging light path. In all variants, transmission bands can be modulated by transmission shift.

  8. MO-FG-CAMPUS-IeP1-01: Alternative K-Edge Filters for Low-Energy Image Acquisition in Contrast Enhanced Spectral Mammography

    Energy Technology Data Exchange (ETDEWEB)

    Shrestha, S; Vedantham, S; Karellas, A [University of Massachusetts Medical School, Worcester, MA (United States)

    2016-06-15

    Purpose: In Contrast Enhanced Spectral Mammography (CESM), an Rh filter is often used during low-energy image acquisition. The potential for using Ag, In and Sn filters, which exhibit K-edges closer to, and just below, that of iodine, instead of the Rh filter, was investigated for the low-energy image acquisition. Methods: Analytical computations of the half-value thickness (HVT) and the photon fluence per mAs (photons/mm2/mAs) for 50µm Rh were compared with other potential K-edge filters (Ag, In and Sn), all with K-absorption edge below that of iodine. Two strategies were investigated: fixed kVp and filter thickness (50µm for all filters) resulting in HVT variation, and fixed kVp and HVT resulting in variation in Ag, In and Sn thickness. Monte Carlo simulations (GEANT4) were conducted to determine if the scatter-to-primary ratio (SPR) and the point spread function of scatter (scatter PSF) differed between Rh and other K-edge filters. Results: Ag, In and Sn filters (50µm thick) increased photon fluence/mAs by 1.3–1.4, 1.8–2, and 1.7–2 at 28-32 kVp compared to 50µm Rh, which could decrease exposure time. Additionally, the fraction of the spectrum closer to and just below iodine's K-edge increased with these filters, which could improve post-subtraction image contrast. For HVT matched to 50µm Rh filtered spectra, the thickness ranges for Ag, In, and Sn were (41,44)µm, (49,55)µm and (45,53)µm, and increased photon fluence/mAs by 1.5–1.7, 1.6–2, and 1.6–2.2, respectively. Monte Carlo simulations showed that neither the SPR nor the scatter PSF of Ag, In and Sn differed from Rh, indicating no additional detriment due to x-ray scatter. Conclusion: The use of Ag, In and Sn filters for low-energy image acquisition in CESM is potentially feasible and could decrease exposure time and may improve post-subtraction image contrast. Effect of these filters on radiation dose, contrast, noise and associated metrics are being investigated. Funding Support: Supported in
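
The HVT used to match the filters relates to the linear attenuation coefficient through Beer-Lambert attenuation, HVT = ln(2)/µ. A small worked example with a hypothetical µ value:

```python
import math

def transmitted_fraction(mu, x):
    """Beer-Lambert attenuation: fraction of photons surviving thickness x
    (x in the same length units as 1/mu)."""
    return math.exp(-mu * x)

def half_value_thickness(mu):
    """Thickness that halves the beam intensity: HVT = ln(2) / mu."""
    return math.log(2) / mu

mu = 0.693  # hypothetical linear attenuation coefficient, 1/mm
hvt = half_value_thickness(mu)
```

Matching HVT across filter materials, as done in the second strategy above, means matching this exponential hardening behavior rather than the physical thickness.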

  9. Mosaicing of single plane illumination microscopy images using groupwise registration and fast content-based image fusion

    Science.gov (United States)

    Preibisch, Stephan; Rohlfing, Torsten; Hasak, Michael P.; Tomancak, Pavel

    2008-03-01

    Single Plane Illumination Microscopy (SPIM; Huisken et al., Science 305(5686):1007-1009, 2004) is an emerging microscopic technique that enables live imaging of large biological specimens in their entirety. By imaging the living biological sample from multiple angles, SPIM has the potential to achieve isotropic resolution throughout even relatively large biological specimens. For every angle, however, only a relatively shallow section of the specimen is imaged with high resolution, whereas deeper regions appear increasingly blurred. In order to produce a single, uniformly high resolution image, we propose here an image mosaicing algorithm that combines state-of-the-art groupwise image registration for alignment with content-based image fusion to prevent degradation of the fused image due to regional blurring of the input images. For the registration stage, we introduce an application-specific groupwise transformation model that incorporates per-image as well as groupwise transformation parameters. We also propose a new fusion algorithm based on Gaussian filters, which is substantially faster than fusion based on local image entropy. We demonstrate the performance of our mosaicing method on data acquired from living embryos of the fruit fly, Drosophila, using four and eight angle acquisitions.
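
One way to realize content-based fusion with Gaussian filters in the spirit described, though not the authors' exact algorithm, is to weight each input by its Gaussian-smoothed local contrast, so sharp regions dominate the fused result:

```python
import math

def gaussian_kernel(sigma, radius):
    """Normalized 1-D Gaussian kernel."""
    ks = [math.exp(-0.5 * (i / sigma) ** 2) for i in range(-radius, radius + 1)]
    s = sum(ks)
    return [k / s for k in ks]

def blur(img, sigma=1.0, radius=2):
    """Separable Gaussian blur of a 2-D list, clamping at the borders."""
    k = gaussian_kernel(sigma, radius)
    h, w = len(img), len(img[0])
    clamp = lambda v, n: max(0, min(n - 1, v))
    tmp = [[sum(k[j + radius] * img[r][clamp(c + j, w)] for j in range(-radius, radius + 1))
            for c in range(w)] for r in range(h)]
    return [[sum(k[j + radius] * tmp[clamp(r + j, h)][c] for j in range(-radius, radius + 1))
             for c in range(w)] for r in range(h)]

def fuse(images):
    """Content-based fusion: weight each input pixel by the Gaussian-smoothed
    squared deviation from its local mean (a cheap local-contrast measure)."""
    blurred = [blur(im) for im in images]
    weights = [blur([[(im[r][c] - b[r][c]) ** 2 for c in range(len(im[0]))]
                     for r in range(len(im))])
               for im, b in zip(images, blurred)]
    h, w = len(images[0]), len(images[0][0])
    out = [[0.0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            tot = sum(wt[r][c] for wt in weights)
            if tot == 0:  # no contrast anywhere: plain average
                out[r][c] = sum(im[r][c] for im in images) / len(images)
            else:
                out[r][c] = sum(wt[r][c] * im[r][c] for wt, im in zip(weights, images)) / tot
    return out
```

Because every step is a Gaussian filter or a pixelwise operation, this is far cheaper than computing local entropy in a sliding window.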

  10. Full-field particle velocimetry with a photorefractive optical novelty filter

    International Nuclear Information System (INIS)

    Woerdemann, Mike; Holtmann, Frank; Denz, Cornelia

    2008-01-01

    We utilize the finite time constant of a photorefractive optical novelty filter microscope to access full-field velocity information of fluid flows on microscopic scales. In contrast to conventional methods such as particle image velocimetry and particle tracking velocimetry, not only image acquisition of the tracer particle field but also evaluation of tracer particle velocities is done all-optically by the novelty filter. We investigate the velocity-dependent parameters of two-beam-coupling-based optical novelty filters and demonstrate calibration and application of a photorefractive velocimetry system. Theoretical and practical limits to the range of accessible velocities are discussed.

  11. Content Based Image Matching for Planetary Science

    Science.gov (United States)

    Deans, M. C.; Meyer, C.

    2006-12-01

    Planetary missions generate large volumes of data. With the MER rovers still functioning on Mars, PDS contains over 7200 released images from the Microscopic Imagers alone. These data products are only searchable by keys such as the Sol, spacecraft clock, or rover motion counter index, with little connection to the semantic content of the images. We have developed a method for matching images based on the visual textures in images. For every image in a database, a series of filters compute the image response to localized frequencies and orientations. Filter responses are turned into a low dimensional descriptor vector, generating a 37 dimensional fingerprint. For images such as the MER MI, this represents a compression ratio of 99.9965% (the fingerprint is approximately 0.0035% the size of the original image). At query time, fingerprints are quickly matched to find images with similar appearance. Image databases containing several thousand images are preprocessed offline in a matter of hours. Image matches from the database are found in a matter of seconds. We have demonstrated this image matching technique using three sources of data. The first database consists of 7200 images from the MER Microscopic Imager. The second database consists of 3500 images from the Narrow Angle Mars Orbital Camera (MOC-NA), which were cropped into 1024×1024 sub-images for consistency. The third database consists of 7500 scanned archival photos from the Apollo Metric Camera. Example query results from all three data sources are shown. We have also carried out user tests to evaluate matching performance by hand labeling results. User tests verify approximately 20% false positive rate for the top 14 results for MOC NA and MER MI data. This means typically 10 to 12 results out of 14 match the query image sufficiently. This represents a powerful search tool for databases of thousands of images where the a priori match probability for an image might be less than 1%. 
Qualitatively, correct
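
A toy version of the fingerprint idea, using a handful of difference-filter statistics in place of the 37-dimensional descriptor, still lets nearest-descriptor queries rank images by texture:

```python
import math

def descriptor(img):
    """Tiny texture fingerprint: global mean/std plus mean absolute response
    to horizontal, vertical and diagonal difference filters."""
    h, w = len(img), len(img[0])
    n = h * w
    mean = sum(map(sum, img)) / n
    std = math.sqrt(sum((img[r][c] - mean) ** 2 for r in range(h) for c in range(w)) / n)
    dx = sum(abs(img[r][c + 1] - img[r][c]) for r in range(h) for c in range(w - 1)) / (h * (w - 1))
    dy = sum(abs(img[r + 1][c] - img[r][c]) for r in range(h - 1) for c in range(w)) / ((h - 1) * w)
    dd = sum(abs(img[r + 1][c + 1] - img[r][c]) for r in range(h - 1) for c in range(w - 1)) / ((h - 1) * (w - 1))
    return (mean, std, dx, dy, dd)

def query(db, img):
    """Return database keys ranked by Euclidean descriptor distance to the query image."""
    q = descriptor(img)
    def dist(key):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(q, db[key])))
    return sorted(db, key=dist)
```

As in the record, the database stores only descriptors (a huge compression of the originals), and queries reduce to fast nearest-neighbor comparisons.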

  12. CMOS image sensor-based implantable glucose sensor using glucose-responsive fluorescent hydrogel.

    Science.gov (United States)

    Tokuda, Takashi; Takahashi, Masayuki; Uejima, Kazuhiro; Masuda, Keita; Kawamura, Toshikazu; Ohta, Yasumi; Motoyama, Mayumi; Noda, Toshihiko; Sasagawa, Kiyotaka; Okitsu, Teru; Takeuchi, Shoji; Ohta, Jun

    2014-11-01

    A CMOS image sensor-based implantable glucose sensor based on an optical-sensing scheme is proposed and experimentally verified. A glucose-responsive fluorescent hydrogel is used as the mediator in the measurement scheme. The wired implantable glucose sensor was realized by integrating a CMOS image sensor, hydrogel, UV light emitting diodes, and an optical filter on a flexible polyimide substrate. Feasibility of the glucose sensor was verified by both in vitro and in vivo experiments.

  13. Deconvolution of Defocused Image with Multivariate Local Polynomial Regression and Iterative Wiener Filtering in DWT Domain

    Directory of Open Access Journals (Sweden)

    Liyun Su

    2010-01-01

    obtaining the point spread function (PSF) parameter, an iterative Wiener filter is adopted to complete the restoration. We experimentally illustrate its performance on simulated data and a real blurred image. Results show that the proposed PSF parameter estimation technique and the image restoration method are effective.
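
For reference, classical (non-iterative) Wiener deconvolution in the frequency domain looks like this for a 1-D signal; the record's method iterates such a step with an estimated PSF, and the noise-to-signal ratio constant here is a hypothetical regularizer:

```python
import cmath

def dft(x):
    """Naive O(n^2) discrete Fourier transform."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * i * k / n) for k in range(n))
            for i in range(n)]

def idft(X):
    """Inverse DFT, returning the real parts."""
    n = len(X)
    return [sum(X[i] * cmath.exp(2j * cmath.pi * i * k / n) for i in range(n)).real / n
            for k in range(n)]

def wiener_deconvolve(blurred, psf, nsr):
    """Wiener deconvolution: X = H* Y / (|H|^2 + NSR), where H is the transfer
    function of the (zero-padded) PSF and NSR the noise-to-signal power ratio."""
    n = len(blurred)
    H = dft(psf + [0.0] * (n - len(psf)))
    Y = dft(blurred)
    X = [Hi.conjugate() * Yi / (abs(Hi) ** 2 + nsr) for Hi, Yi in zip(H, Y)]
    return idft(X)
```

Note this assumes circular convolution; 2-D image restoration applies the same formula per frequency bin of a 2-D FFT.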

  14. Faraday anomalous dispersion optical filters

    Science.gov (United States)

    Shay, T. M.; Yin, B.; Alvarez, L. S.

    1993-01-01

    The effect of Faraday anomalous dispersion optical filters on infrared and blue transitions of some alkali atoms is calculated. A composite system is designed to further increase the background noise rejection. The measured results of the solar background rejection and image quality through the filter are presented. The results show that the filter may provide high transmission and high background noise rejection with excellent image quality.

  15. Face Recognition using Gabor Filters

    Directory of Open Access Journals (Sweden)

    Sajjad MOHSIN

    2011-01-01

    Full Text Available An Elastic Bunch Graph Map (EBGM) algorithm is proposed in this research paper that implements face recognition using Gabor filters. The proposed system applies 40 different Gabor filters to an image, yielding 40 filtered images with different angles and orientations. Next, maximum intensity points in each filtered image are calculated and marked as fiducial points. The system reduces these points according to the distance between them. The next step is calculating the distances between the reduced points using the distance formula. Finally, the distances are compared with the database; if a match occurs, the image is recognized.
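
A bank of 40 Gabor filters (8 orientations x 5 scales; the parameter choices below are hypothetical, not the paper's) can be generated as follows; convolving an image with each kernel yields the 40 response images the abstract mentions:

```python
import math

def gabor_kernel(ksize, sigma, theta, lambd, psi=0.0):
    """Real part of a Gabor filter: a Gaussian envelope times a cosine
    carrier with wavelength lambd, oriented at angle theta."""
    r = ksize // 2
    kernel = []
    for y in range(-r, r + 1):
        row = []
        for x in range(-r, r + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)   # rotated coordinates
            yr = -x * math.sin(theta) + y * math.cos(theta)
            g = math.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2))
            row.append(g * math.cos(2 * math.pi * xr / lambd + psi))
        kernel.append(row)
    return kernel

def gabor_bank(n_orient=8, n_scale=5, ksize=9):
    """A bank of n_orient x n_scale Gabor kernels (40 filters for 8x5)."""
    bank = []
    for s in range(n_scale):
        for o in range(n_orient):
            theta = o * math.pi / n_orient
            bank.append(gabor_kernel(ksize, sigma=2.0 + s, theta=theta, lambd=4.0 + 2 * s))
    return bank

bank = gabor_bank()
```

Fiducial points would then be taken at the maxima of the per-filter response magnitudes, as the abstract describes.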

  16. Extraction of topographic and material contrasts on surfaces from SEM images obtained by energy filtering detection with low-energy primary electrons

    International Nuclear Information System (INIS)

    Nagoshi, Masayasu; Aoyama, Tomohiro; Sato, Kaoru

    2013-01-01

    Scanning electron microscope (SEM) images have been obtained for practical materials using low primary electron energies and an in-lens annular detector, with a varying negative bias voltage supplied to a grid placed in front of the detector. The kinetic-energy distribution of the detected electrons was evaluated from the gradient of the bias-energy dependence of image brightness. The distribution divides into two main parts at about 500 V, with high and low brightness in the low- and high-energy regions, respectively, and shows differences among surface regions of different composition and topography. The combination of negative grid bias and pixel-by-pixel image subtraction provides band-pass filtered images and extracts material and topographic information from the specimen surfaces. -- Highlights: ► Scanning electron (SE) images contain many kind of information on material surfaces. ► We investigate energy-filtered SE images for practical materials. ► The brightness of the images is divided into two parts by the bias voltage. ► Topographic and material contrasts are extracted by subtracting the filtered images.
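
The band-pass step is just a pixel-by-pixel difference of two images taken at different grid biases: each image passes electrons above its bias energy, so the difference keeps only the band between the two energies. A minimal sketch:

```python
def band_pass(img_low_bias, img_high_bias):
    """Pixel-by-pixel subtraction of two energy-filtered SEM images.
    Each input passes electrons above its grid bias, so the difference
    retains only electrons between the two bias energies."""
    return [[lo - hi for lo, hi in zip(row_lo, row_hi)]
            for row_lo, row_hi in zip(img_low_bias, img_high_bias)]
```

Choosing the two biases around the ~500 V split mentioned above separates the topographic and material contrast channels.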

  17. Image processing of vaporizing GDI sprays: a new curvature-based approach

    Science.gov (United States)

    Lazzaro, Maurizio; Ianniello, Roberto

    2018-01-01

    This article introduces an innovative method for the segmentation of Mie-scattering and schlieren images of GDI sprays. The contours of the liquid phase are obtained by segmenting the scattering images of the spray by means of optimal filtering of the image, relying on variational methods, and an original thresholding procedure based on an iterative application of Otsu’s method. The segmentation of schlieren images, to get the contours of the spray vapour phase, is obtained by exploiting the surface curvature of the image to strongly enhance the intensity texture due to the vapour density gradients. This approach allows one to unambiguously discern the whole vapour phase of the spray from the background. Additional information about the spray liquid phase can be obtained by thresholding filtered schlieren images. The potential of this method has been substantiated in the segmentation of schlieren and scattering images of a GDI spray of isooctane. The fuel, heated to 363 K, was injected into nitrogen at a density of 1.12 and 3.5 kg m-3 with temperatures of 333 K and 573 K.
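
Otsu's method, which the thresholding procedure above applies iteratively, chooses the gray level maximizing the between-class variance of the histogram. A self-contained sketch:

```python
def otsu_threshold(hist):
    """Otsu's method on a grayscale histogram (list of counts per level):
    return the threshold t maximizing between-class variance;
    pixels with level <= t are classed as background."""
    total = sum(hist)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = 0       # background pixel count
    sum0 = 0.0   # background intensity sum
    for t in range(len(hist) - 1):
        w0 += hist[t]
        sum0 += t * hist[t]
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        mu0, mu1 = sum0 / w0, (total_sum - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

Iterating this on the sub-histogram of one class, as the article's procedure does, refines the split when the intensity distribution is strongly skewed.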

  18. Spoof surface plasmon polaritons based notch filter for ultra-wideband microwave waveguide

    DEFF Research Database (Denmark)

    Xiao, Binggang; Li, Sheng-Hua; Xiao, Sanshui

    2016-01-01

    Spoof surface plasmon polaritons based notch filter for ultra-wideband microwave waveguide is proposed. Owing to subwavelength confinement, such a filter has an advantage in structure size without sacrificing performance. The spoof SPP based notch is introduced to suppress the WLAN and satel......

  19. Using Convolutional Neural Network Filters to Measure Left-Right Mirror Symmetry in Images

    Directory of Open Access Journals (Sweden)

    Anselm Brachmann

    2016-12-01

    Full Text Available We propose a method for measuring symmetry in images by using filter responses from Convolutional Neural Networks (CNNs). The aim of the method is to model human perception of left/right symmetry as closely as possible. Using the CNN approach has two main advantages: First, CNN filter responses closely match the responses of neurons in the human visual system; they take information on color, edges, and texture into account simultaneously. Second, we can measure higher-order symmetry, which relies not only on color, edges, and texture, but also on the shapes and objects that are depicted in images. We validated our algorithm on a dataset of 300 music album covers, which were rated according to their symmetry by 20 human observers, and compared results with those from a previously proposed method. With our method, human perception of symmetry can be predicted with high accuracy. Moreover, we demonstrate that the inclusion of features from higher CNN layers, which encode more abstract image content, increases the performance further. In conclusion, we introduce a model of left/right symmetry that closely models human perception of symmetry in album covers.
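
A drastically simplified stand-in for the CNN-based measure, using two hand-made edge kernels instead of learned filters, illustrates the mechanics: filter the left half and the mirrored right half of the image and compare the response maps.

```python
FILTERS = [
    [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]],    # vertical-edge (Sobel-x) kernel
    [[-1, -2, -1], [0, 0, 0], [1, 2, 1]],    # horizontal-edge (Sobel-y) kernel
]

def conv2(img, k):
    """Valid-mode 2-D correlation of img with a 3x3 kernel."""
    h, w = len(img), len(img[0])
    return [[sum(k[i][j] * img[r + i][c + j] for i in range(3) for j in range(3))
             for c in range(w - 2)] for r in range(h - 2)]

def symmetry_score(img):
    """Left/right symmetry: filter the left half and the mirrored right half
    and compare response maps. 1.0 means identical maps (perfect mirror symmetry)."""
    w = len(img[0])
    half = w // 2
    left = [row[:half] for row in img]
    right = [row[w - half:][::-1] for row in img]   # mirrored right half
    num = den = 0.0
    for k in FILTERS:
        for row_a, row_b in zip(conv2(left, k), conv2(right, k)):
            for va, vb in zip(row_a, row_b):
                num += abs(va - vb)
                den += abs(va) + abs(vb)
    return 1.0 - num / den if den else 1.0
```

The CNN version replaces the two kernels with hundreds of learned filters across several layers, which is what lets it capture the higher-order, object-level symmetry described above.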

  20. Design of digital trapezoidal shaping filter based on LabVIEW

    International Nuclear Information System (INIS)

    Liu Yujuan; Qin Guoxiu; Yang Zhihui; Zhang Xiaodong

    2013-01-01

    This paper describes the design of a digital trapezoidal shaping filter for nuclear signals based on LabVIEW. A method of optimizing the trapezoidal shaping filter's parameters is presented and tested, and the effect of the shaping algorithm is evaluated on test results. (authors)
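
One standard digital trapezoidal shaper (not necessarily the LabVIEW implementation in this record) is a cascade of two moving sums, whose impulse response is a trapezoid with chosen rise and flat-top lengths:

```python
def moving_sum(x, n):
    """Running sum of the last n samples (a boxcar filter)."""
    out, acc = [], 0.0
    for i, v in enumerate(x):
        acc += v
        if i >= n:
            acc -= x[i - n]
        out.append(acc)
    return out

def trapezoidal_shaper(x, rise, flat):
    """Trapezoidal pulse shaper: cascade of two boxcar sums of lengths
    `rise` and `rise + flat`, normalized so a unit impulse peaks at 1."""
    y = moving_sum(moving_sum(x, rise), rise + flat)
    return [v / rise for v in y]
```

The rise length trades resolution against pile-up, and the flat top absorbs ballistic deficit, which is exactly the parameter optimization the paper describes.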

  1. Low Power Systolic Array Based Digital Filter for DSP Applications

    Directory of Open Access Journals (Sweden)

    S. Karthick

    2015-01-01

    Full Text Available Main concepts in DSP include filtering, averaging, modulating, and correlating signals in digital form to estimate characteristic parameters of a signal and transform it into a desirable form. This paper presents a brief study of the low-power datapath impact for Digital Signal Processing (DSP)-based biomedical applications. A systolic array based digital filter used in the signal processing of electrocardiogram analysis is presented, with datapath architectural innovations from a low-power-consumption perspective. Implementation was done with an ASIC design methodology using the TSMC 65 nm technology library. The proposed systolic array filter reduces leakage power by up to 8.5% compared with existing filter architectures.
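
A cycle-accurate toy model of a transposed-form systolic FIR array, where each cell holds one coefficient and a pipeline register, can be simulated in a few lines (illustrative only; the record's design is an ASIC datapath, not software):

```python
def systolic_fir(samples, coeffs):
    """Simulate a transposed-form systolic FIR array cycle by cycle:
    on each cycle, every cell i multiplies the broadcast input sample by its
    coefficient and adds the register value passed from cell i+1."""
    n = len(coeffs)
    regs = [0.0] * n  # pipeline registers between the MAC cells
    out = []
    for x in samples:
        regs = [coeffs[i] * x + (regs[i + 1] if i + 1 < n else 0.0)
                for i in range(n)]
        out.append(regs[0])  # cell 0 emits the filter output
    return out
```

All multiply-accumulates in a cycle are independent, which is what makes the structure map well onto parallel, low-power hardware.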

  2. Regularization based on steering parameterized Gaussian filters and a Bhattacharyya distance functional

    Science.gov (United States)

    Lopes, Emerson P.

    2001-08-01

    Template regularization embeds the problem of class separability. From a machine vision perspective, this problem is critical when a textural classification procedure is applied to non-stationary pattern mosaic images. Such applications often show low accuracy because the classifiers are disturbed by exogenous or endogenous perturbations of signal regularity. Natural scene imaging, where images present a certain degree of homogeneity in terms of texture element size or shape (primitives), shows a variety of behaviors, especially varying preferential spatial directionality. Space-time image pattern characterization is only solved if classification procedures are designed with the most robust tools within a parallel and hardware perspective. The results compared in this paper are obtained using a framework based on a multi-resolution, frame, and hypothesis approach. Two strategies for applying the bank of Gabor filters are considered: an adaptive strategy using the KL transform, and a fixed-configuration strategy. The regularization under discussion is accomplished in the pyramid-building stage. The filters are steerable Gaussians controlled by free parameters, which are adjusted by a feedback process driven by hints obtained from frame-sequence interaction functionals post-processed during training, including classification of training-set samples as examples. Besides these adjustments, there is continuous input-sensitive adaptation. The experimental assessments focus on two basic issues: the Bhattacharyya distance as a pattern characterization feature, and the combination of the KL transform for feature selection with the regularization of the pattern Bhattacharyya distance functional (BDF) behavior, using BDF state separability and symmetry as the main indicators of an optimal framework parameter configuration.
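
For the univariate Gaussian case, the Bhattacharyya distance used here as a class-separability feature has a closed form with one term for mean separation and one for variance mismatch:

```python
import math

def bhattacharyya_gauss(mu1, var1, mu2, var2):
    """Bhattacharyya distance between two univariate Gaussians:
    D_B = 1/4 ln( 1/4 (v1/v2 + v2/v1 + 2) ) + 1/4 (mu1 - mu2)^2 / (v1 + v2)."""
    term_var = 0.25 * math.log(0.25 * (var1 / var2 + var2 / var1 + 2.0))
    term_mean = 0.25 * (mu1 - mu2) ** 2 / (var1 + var2)
    return term_var + term_mean
```

Larger values indicate better-separated classes, so the framework can tune its free parameters to maximize this functional over the training set.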

  3. Efficient Kernel-Based Ensemble Gaussian Mixture Filtering

    KAUST Repository

    Liu, Bo

    2015-11-11

    We consider the Bayesian filtering problem for data assimilation following the kernel-based ensemble Gaussian-mixture filtering (EnGMF) approach introduced by Anderson and Anderson (1999). In this approach, the posterior distribution of the system state is propagated with the model using the ensemble Monte Carlo method, providing a forecast ensemble that is then used to construct a prior Gaussian-mixture (GM) based on the kernel density estimator. This results in two update steps: a Kalman filter (KF)-like update of the ensemble members and a particle filter (PF)-like update of the weights, followed by a resampling step to start a new forecast cycle. After formulating EnGMF for any observational operator, we analyze the influence of the bandwidth parameter of the kernel function on the covariance of the posterior distribution. We then focus on two aspects: i) the efficient implementation of EnGMF with (relatively) small ensembles, where we propose a new deterministic resampling strategy preserving the first two moments of the posterior GM to limit the sampling error; and ii) the analysis of the effect of the bandwidth parameter on contributions of KF and PF updates and on the weights variance. Numerical results using the Lorenz-96 model are presented to assess the behavior of EnGMF with deterministic resampling, study its sensitivity to different parameters and settings, and evaluate its performance against ensemble KFs. The proposed EnGMF approach with deterministic resampling suggests improved estimates in all tested scenarios, and is shown to require less localization and to be less sensitive to the choice of filtering parameters.
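
A one-dimensional sketch of the PF-like weight update in this approach (direct observation, Silverman rule-of-thumb bandwidth; the paper's formulation is multivariate and also includes the KF-like member update and resampling, omitted here):

```python
import math, random

def silverman_bandwidth(ens):
    """Silverman's rule-of-thumb kernel bandwidth for a 1-D ensemble."""
    n = len(ens)
    mean = sum(ens) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in ens) / (n - 1))
    return std * (4.0 / (3.0 * n)) ** 0.2

def engmf_update(ens, y, obs_var):
    """PF-like weight update of a kernel Gaussian mixture built on the ensemble:
    each member carries a Gaussian kernel of Silverman bandwidth, and its weight
    is proportional to the likelihood of observation y = x + noise under
    kernel variance + observation variance."""
    h = silverman_bandwidth(ens)
    var = h * h + obs_var
    w = [math.exp(-0.5 * (y - x) ** 2 / var) for x in ens]
    s = sum(w)
    w = [wi / s for wi in w]
    post_mean = sum(wi * x for wi, x in zip(w, ens))
    return w, post_mean

random.seed(1)
ensemble = [random.gauss(0.0, 1.0) for _ in range(200)]
weights, post_mean = engmf_update(ensemble, 2.0, 0.5)
```

The bandwidth controls how much of the update is carried by the weights versus the member shifts, which is the sensitivity analyzed in the record.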

  4. Transistor-based filter for inhibiting load noise from entering a power supply

    Science.gov (United States)

    Taubman, Matthew S

    2013-07-02

    A transistor-based filter for inhibiting load noise from entering a power supply is disclosed. The filter includes a first transistor having an emitter coupled to a power supply, a collector coupled to a load, and a base. The filter also includes a first capacitor coupled between the base of the first transistor and a ground terminal. The filter further includes an impedance coupled between the base and a node between the collector and the load, or a second transistor and second capacitor. The impedance can be a resistor or an inductor.
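
A rough way to estimate the filtering action, under the simplifying assumption that the emitter follows the low-pass-filtered base voltage, is to evaluate the base RC corner: noise components above f_c = 1/(2*pi*R*C) are attenuated at about 20 dB/decade. The component values below are hypothetical, not from the patent.

```python
import math

def ripple_attenuation_db(r_ohm, c_farad, freq_hz):
    """First-order RC low-pass attenuation in dB at freq_hz for the base
    network of a capacitance-multiplier-style transistor filter."""
    fc = 1.0 / (2.0 * math.pi * r_ohm * c_farad)
    return 10.0 * math.log10(1.0 + (freq_hz / fc) ** 2)

# hypothetical values: 10 kOhm base impedance, 10 uF base capacitor, 120 Hz ripple
att = ripple_attenuation_db(10e3, 10e-6, 120.0)
```

The transistor's current gain is what lets a small base capacitor act like a much larger one seen from the load, so noise is blocked without a bulky passive filter.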

  5. Be Foil ''Filter Knee Imaging'' NSTX Plasma with Fast Soft X-ray Camera

    International Nuclear Information System (INIS)

    B.C. Stratton; S. von Goeler; D. Stutman; K. Tritz; L.E. Zakharov

    2005-01-01

    A fast soft x-ray (SXR) pinhole camera has been implemented on the National Spherical Torus Experiment (NSTX). This paper presents observations and describes the Be foil Filter Knee Imaging (FKI) technique for reconstructions of a m/n=1/1 mode on NSTX. The SXR camera has a wide-angle (28°) field of view of the plasma. The camera images nearly the entire diameter of the plasma and a comparable region in the vertical direction. SXR photons pass through a beryllium foil and are imaged by a pinhole onto a P47 scintillator deposited on a fiber optic faceplate. An electrostatic image intensifier demagnifies the visible image by 6:1 to match it to the size of the charge-coupled device (CCD) chip. A pair of lenses couples the image to the CCD chip.

  6. Image Fusion Based on the Self-Organizing Feature Map Neural Networks

    Institute of Scientific and Technical Information of China (English)

    ZHANG Zhaoli; SUN Shenghe

    2001-01-01

    This paper presents a new image data fusion scheme based on the self-organizing feature map (SOFM) neural networks. The scheme consists of three steps: (1) pre-processing of the images, where weighted median filtering removes part of the noise components corrupting the image; (2) pixel clustering for each image using two-dimensional self-organizing feature map neural networks; and (3) fusion of the images obtained in Step (2) utilizing fuzzy logic, which suppresses the residual noise components and thus further improves the image quality. It proves that such a three-step combination offers an impressive effectiveness and performance improvement, which is confirmed by simulations involving three image sensors (each of which has a different noise structure).
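
Step (1) of the scheme uses weighted median filtering; a minimal version (uniform 3x3 weights in the demo below, but any integer weight mask works) looks like:

```python
def weighted_median(values, weights):
    """Weighted median: each sample is repeated `weight` times, then the
    ordinary median of the expanded list is taken."""
    expanded = []
    for v, w in zip(values, weights):
        expanded.extend([v] * w)
    expanded.sort()
    return expanded[len(expanded) // 2]

def weighted_median_filter(img, weights3x3):
    """3x3 weighted median filter over a 2-D list; border pixels unchanged."""
    h, w = len(img), len(img[0])
    flat_w = [weights3x3[i][j] for i in range(3) for j in range(3)]
    out = [row[:] for row in img]
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            vals = [img[r + i - 1][c + j - 1] for i in range(3) for j in range(3)]
            out[r][c] = weighted_median(vals, flat_w)
    return out
```

Raising the center weight biases the filter toward preserving the original pixel, trading noise suppression against detail retention.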

  7. HEPA air filter (image)

    Science.gov (United States)

    ... pet dander and other irritating allergens from the air. Along with other methods to reduce allergens, such ... controlling the amount of allergens circulating in the air. HEPA filters can be found in most air ...

  8. Updating the OMERACT filter

    DEFF Research Database (Denmark)

    D'Agostino, Maria-Antonietta; Boers, Maarten; Kirwan, John

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides a framework for the validation of outcome measures for use in rheumatology clinical research. However, imaging and biochemical measures may face additional validation challenges because of their technical nature. The Imaging...... using the original OMERACT Filter and the newly proposed structure. Breakout groups critically reviewed the extent to which the candidate biomarkers complied with the proposed stepwise approach, as a way of examining the utility of the proposed 3-dimensional structure. RESULTS: Although...... was obtained for a proposed tri-axis structure to assess validation of imaging and soluble biomarkers; nevertheless, additional work is required to better evaluate its place within the OMERACT Filter 2.0....

  9. An approach for fixed coefficient RNS-based FIR filter

    Science.gov (United States)

    Srinivasa Reddy, Kotha; Sahoo, Subhendu Kumar

    2017-08-01

In this work, an efficient new modular multiplication method for the {2^k − 1, 2^k, 2^(k+1) − 1} moduli set is proposed to implement a residue number system (RNS)-based fixed coefficient finite impulse response filter. The new multiplication approach reduces the number of partial products by using a pre-loaded product block. The reduction in partial products with the proposed modular multiplication improves the clock frequency and reduces the area and power as compared with conventional modular multiplication. Further, the present approach eliminates the binary-to-residue converter circuit that is usually needed at the front end of an RNS-based system. In this work, two fixed coefficient filter architectures with the new modular multiplication approach are proposed. The filters are implemented using Verilog hardware description language. The United Microelectronics Corporation 90 nm technology library has been used for synthesis, and the area, power and delay results are obtained with the help of the Cadence register transfer level compiler. The power delay product (PDP) is also considered for performance comparison among the proposed filters. One of the proposed architectures is found to improve the PDP gain by 60.83% as compared with the filter implemented with a conventional modular multiplier. The filters' functionality is validated with the help of the Altera DSP Builder.
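The residue arithmetic underlying such a filter can be illustrated in a few lines; the sketch below shows generic per-channel multiplication and CRT reconstruction for this moduli set, not the paper's hardware multiplier, and k = 4 and the operands are invented for illustration.

```python
def to_rns(x, moduli):
    """Forward conversion: the residue of x for each modulus."""
    return [x % m for m in moduli]

def from_rns(residues, moduli):
    """Chinese Remainder Theorem reconstruction (moduli pairwise coprime)."""
    M = 1
    for m in moduli:
        M *= m
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)       # modular inverse, Python 3.8+
    return x % M

k = 4
moduli = [2**k - 1, 2**k, 2**(k + 1) - 1]  # {15, 16, 31}, pairwise coprime
a, b = 123, 217
# Multiplication happens independently in each small residue channel.
prod = [(ra * rb) % m for ra, rb, m in zip(to_rns(a, moduli), to_rns(b, moduli), moduli)]
recovered = from_rns(prod, moduli)         # equals (a * b) mod (15 * 16 * 31)
```

Because each channel works with small word lengths, the multipliers in each channel are short and fast, which is the motivation for RNS-based FIR filters.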

  10. Energy-filtered real- and k-space secondary and energy-loss electron imaging with Dual Emission Electron spectro-Microscope: Cs/Mo(110)

    Energy Technology Data Exchange (ETDEWEB)

    Grzelakowski, Krzysztof P., E-mail: k.grzelakowski@opticon-nanotechnology.com

    2016-05-15

Since its introduction, the importance of complementary k∥-space (LEED) and real-space (LEEM) information in the investigation of surface science phenomena has been widely demonstrated over the last five decades. In this paper we report the application of a novel kind of electron spectromicroscope, the Dual Emission Electron spectroMicroscope (DEEM), with two independent electron optical channels for quasi-simultaneous reciprocal- and real-space imaging, in an investigation of a Cs-covered Mo(110) single crystal using the 800 eV electron beam from an “in-lens” electron gun system developed for sample illumination. With the DEEM spectromicroscope it is possible to observe dynamic, irreversible processes at surfaces in energy-filtered real space and in the corresponding energy-filtered k∥-space quasi-simultaneously in two independent imaging columns. The novel concept of high-energy electron beam sample illumination in cathode-lens-based microscopes allows chemically selective imaging and analysis under laboratory conditions. - Highlights: • A novel concept of electron sample illumination with an “in-lens” e-gun is realized. • Quasi-simultaneous energy-selective observation of real and k-space in EELS mode. • Observation of energy-filtered Auger electron diffraction at Cs atoms on Mo(110). • Energy-loss, Auger and secondary electron momentum microscopy is realized.

  11. Acoustic wave filter based on periodically poled lithium niobate.

    Science.gov (United States)

    Courjon, Emilie; Bassignot, Florent; Ulliac, Gwenn; Benchabane, Sarah; Ballandras, Sylvain

    2012-09-01

Solutions for the development of compact RF passive transducers as an alternative to standard surface or bulk acoustic wave devices are receiving increasing interest. This article presents results on the development of an acoustic band-pass filter based on periodically poled ferroelectric domains in lithium niobate. The fabrication of periodically poled transducers (PPTs) operating in the range of 20 to 650 MHz has been achieved on 3-in (76.2-mm) 500-μm-thick wafers. This kind of transducer is able to excite elliptical as well as longitudinal modes, yielding phase velocities of about 3800 and 6500 m/s, respectively. A new type of acoustic band-pass filter is proposed, based on the use of PPTs instead of the SAWs excited by classical interdigital transducers. The design and the fabrication of such a filter are presented, as well as experimental measurements of its electrical response and transfer function. The feasibility of such a PPT-based filter is thereby demonstrated and the limitations of this method are discussed.

  12. Coarse Alignment Technology on Moving base for SINS Based on the Improved Quaternion Filter Algorithm.

    Science.gov (United States)

    Zhang, Tao; Zhu, Yongyun; Zhou, Feng; Yan, Yaxiong; Tong, Jinwu

    2017-06-17

Initial alignment of the strapdown inertial navigation system (SINS) is intended to determine the initial attitude matrix in a short time with certain accuracy. The alignment accuracy of the quaternion filter algorithm is remarkable, but its convergence rate is slow. To solve this problem, this paper proposes an improved quaternion filter algorithm for faster initial alignment, based on the error model of the quaternion filter algorithm. The improved quaternion filter algorithm constructs the K matrix based on the principle of the optimal quaternion algorithm, and rebuilds the measurement model to include acceleration and velocity errors so as to make the convergence rate faster. A Doppler velocity log (DVL) provides the reference velocity for the improved quaternion filter alignment algorithm. In order to demonstrate the performance of the improved quaternion filter algorithm in the field, a turntable experiment and a vehicle test were carried out. The results of the experiments show that the convergence rate of the proposed improved quaternion filter is faster than that of the traditional quaternion filter algorithm. In addition, the improved quaternion filter algorithm also demonstrates advantages in terms of correctness, effectiveness, and practicability.

  13. Wavelet based Image Registration Technique for Matching Dental x-rays

    OpenAIRE

    P. Ramprasad; H. C. Nagaraj; M. K. Parasuram

    2008-01-01

Image registration plays an important role in the diagnosis of dental pathologies such as dental caries, alveolar bone loss and periapical lesions. This paper presents a new wavelet-based algorithm for registering noisy and poor-contrast dental x-rays. The proposed algorithm has two stages. The first stage is a preprocessing stage that removes noise from the x-ray images; a Gaussian filter has been used. The second stage is a geometric transformation stage. The proposed work uses two l...

  14. Widely Tunable 4th Order Switched Gm -C Band-Pass Filter Based on N-Path Filters

    NARCIS (Netherlands)

    Darvishi, M.; van der Zee, Ronan A.R.; Klumperink, Eric A.M.; Nauta, Bram

    2012-01-01

A widely tunable 4th order BPF based on the subtraction of two 2nd order 4-path passive-mixer filters with slightly different center frequencies is proposed. The center frequency of each 4-path filter is slightly shifted relative to its clock frequency (one upward and the other one

  15. MR angiography with a matched filter

    International Nuclear Information System (INIS)

    De Castro, J.B.; Riederer, S.J.; Lee, J.N.

    1987-01-01

    The technique of matched filtering was applied to a series of cine MR images. The filter was devised to yield a subtraction angiographic image in which direct current components present in the cine series are removed and the signal-to-noise ratio (S/N) of the vascular structures is optimized. The S/N of a matched filter was compared with that of a simple subtraction, in which an image with high flow is subtracted from one with low flow. Experimentally, a range of results from minimal improvement to significant (60%) improvement in S/N was seen in the comparisons of matched filtered subtraction with simple subtraction

  16. A comparison of filters for thoracic diagnosis

    International Nuclear Information System (INIS)

    Oestmann, J.W.; Hendrickx, P.; Rieder, P.; Geerlings, H.; Medizinische Hochschule Hannover

    1986-01-01

The effect of three types of filter on the quality of radiographs of the chest was compared. These filters improve visualization of mediastinal structures without significantly reducing the quality of the pulmonary image. In practice the Du Pont filter proved best; the quality of the central and peripheral portions of the lung image is equal to that of an ordinary radiograph, and visualization of the mediastinum is improved. The Agfa-Gevaert filter showed no significant disadvantages compared with ordinary techniques, but the improvement in mediastinal visualization is not as marked. The 3M filter yields poor images of the central portions of the lung, and its construction prevents the retrocardiac structures from being imaged as well as with the other filters. (orig.) [de]

  17. Biogas Filter Based on Local Natural Zeolite Materials

    Directory of Open Access Journals (Sweden)

    Satriyo Krido Wahono

    2014-02-01

Full Text Available UPT BPPTK LIPI has created a biogas filter tool to improve the purity of methane in biogas. The device is a cylindrical tube containing absorbent material based on local natural zeolite from Indonesia. The absorbent has been activated and modified with other materials, and has multi-adsorption capacity for most of the impurity gases in biogas. The biogas filter increases the methane content of biogas by 5-20% and improves biogas performance by increasing the heating value, reducing odors, reducing the potential for corrosion, and increasing the efficiency and stability of the generator.

  18. MEDOF - MINIMUM EUCLIDEAN DISTANCE OPTIMAL FILTER

    Science.gov (United States)

    Barton, R. S.

    1994-01-01

The Minimum Euclidean Distance Optimal Filter program, MEDOF, generates filters for use in optical correlators. The algorithm implemented in MEDOF follows theory put forth by Richard D. Juday of NASA/JSC. This program analytically optimizes filters on arbitrary spatial light modulators such as coupled, binary, full complex, and fractional 2π phase. MEDOF optimizes these modulators on a number of metrics including: correlation peak intensity at the origin for the centered appearance of the reference image in the input plane, signal to noise ratio including the correlation detector noise as well as the colored additive input noise, peak to correlation energy defined as the fraction of the signal energy passed by the filter that shows up in the correlation spot, and the peak to total energy which is a generalization of PCE that adds the passed colored input noise to the input image's passed energy. The user of MEDOF supplies the functions that describe the following quantities: 1) the reference signal, 2) the realizable complex encodings of both the input and filter SLM, 3) the noise model, possibly colored, as it adds at the reference image and at the correlation detection plane, and 4) the metric to analyze, here taken to be one of the analytical ones like SNR (signal to noise ratio) or PCE (peak to correlation energy) rather than peak to secondary ratio. MEDOF calculates filters for arbitrary modulators and a wide range of metrics as described above. MEDOF examines the statistics of the encoded input image's noise (if SNR or PCE is selected) and the filter SLM's (Spatial Light Modulator) available values. These statistics are used as the basis of a range for searching for the magnitude and phase of k, a pragmatically based complex constant for computing the filter transmittance from the electric field. The filter is produced for the mesh points in those ranges and the value of the metric that results from these points is computed. When the search is concluded, the
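The peak-to-correlation-energy (PCE) metric described above can be computed for a 1-D toy signal as follows; the DFT-based correlation is a minimal sketch for illustration, not MEDOF's actual implementation, and the test signal is invented.

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (O(N^2), fine for tiny examples)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Naive inverse DFT."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def pce(signal, filt):
    """Peak-to-correlation energy: |c(0)|^2 divided by the total energy of
    the correlation plane c = IDFT(S . conj(H))."""
    S, H = dft(signal), dft(filt)
    c = idft([s * h.conjugate() for s, h in zip(S, H)])
    return abs(c[0]) ** 2 / sum(abs(v) ** 2 for v in c)

sig = [1.0, 2.0, 3.0, 2.0]
matched = pce(sig, sig)   # a matched filter concentrates energy at the origin
```

For this signal the matched filter scores a much higher PCE than, say, an all-pass (delta) filter, which simply reproduces the input in the correlation plane.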

  19. Variable Step Size Maximum Correntropy Criteria Based Adaptive Filtering Algorithm

    Directory of Open Access Journals (Sweden)

    S. Radhika

    2016-04-01

Full Text Available Maximum correntropy criterion (MCC) based adaptive filters are found to be robust against impulsive interference. This paper proposes a novel MCC-based adaptive filter with a variable step size in order to obtain improved performance in terms of both convergence rate and steady-state error, with robustness against impulsive interference. The optimal variable step size is obtained by minimizing the Mean Square Deviation (MSD) error from one iteration to the next. Simulation results in the context of a highly impulsive system identification scenario show that the proposed algorithm has faster convergence and a lower steady-state error than conventional MCC-based adaptive filters.
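The MCC weighting at the core of such filters can be sketched as follows. This is a fixed-step MCC-LMS illustration of the correntropy kernel's robustness to impulses; the paper's contribution, the MSD-optimal variable step size, is not reproduced here, and the system, signal length, and parameters are illustrative assumptions.

```python
import math, random

def mcc_lms(x, d, taps, mu, sigma):
    """Adaptive FIR identification with the maximum correntropy criterion:
    the Gaussian kernel exp(-e^2 / 2 sigma^2) scales each update, so huge
    (impulsive) errors produce near-zero weight changes."""
    w = [0.0] * taps
    for n in range(taps - 1, len(x)):
        u = [x[n - i] for i in range(taps)]           # regressor [x[n], x[n-1], ...]
        e = d[n] - sum(wi * ui for wi, ui in zip(w, u))
        g = math.exp(-e * e / (2.0 * sigma * sigma))  # correntropy weight
        for i in range(taps):
            w[i] += mu * g * e * u[i]
    return w

random.seed(1)
h = [0.5, -0.3]                                       # unknown 2-tap system
x = [random.gauss(0.0, 1.0) for _ in range(5000)]
d = [h[0] * x[n] + (h[1] * x[n - 1] if n > 0 else 0.0) for n in range(len(x))]
for n in range(0, len(d), 100):                       # sparse, large impulsive interference
    d[n] += 50.0
w = mcc_lms(x, d, taps=2, mu=0.05, sigma=2.0)
```

With plain LMS the 50-unit impulses would repeatedly perturb the weights; here the kernel weight for such errors is essentially zero, so the estimate still converges to the true taps.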

  20. Noise reduction with complex bilateral filter.

    Science.gov (United States)

    Matsumoto, Mitsuharu

    2017-12-01

    This study introduces a noise reduction technique that uses a complex bilateral filter. A bilateral filter is a nonlinear filter originally developed for images that can reduce noise while preserving edge information. It is an attractive filter and has been used in many applications in image processing. When it is applied to an acoustical signal, small-amplitude noise is reduced while the speech signal is preserved. However, a bilateral filter cannot handle noise with relatively large amplitudes owing to its innate characteristics. In this study, the noisy signal is transformed into the time-frequency domain and the filter is improved to handle complex spectra. The high-amplitude noise is reduced in the time-frequency domain via the proposed filter. The features and the potential of the proposed filter are also confirmed through experiments.
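A real-valued 1-D bilateral filter illustrates the principle of joint spatial/amplitude weighting; the paper's filter operates on complex time-frequency spectra, which this sketch does not attempt, and the window radius and kernel widths are illustrative assumptions.

```python
import math

def bilateral_1d(signal, radius, sigma_s, sigma_r):
    """1-D bilateral filter: each weight combines spatial closeness and
    amplitude similarity, so large jumps (edges) are preserved while
    small-amplitude noise is averaged away."""
    out = []
    for i, v in enumerate(signal):
        num = den = 0.0
        for j in range(max(0, i - radius), min(len(signal), i + radius + 1)):
            w = (math.exp(-((i - j) ** 2) / (2.0 * sigma_s ** 2)) *
                 math.exp(-((v - signal[j]) ** 2) / (2.0 * sigma_r ** 2)))
            num += w * signal[j]
            den += w
        out.append(num / den)
    return out

step = [0.0] * 10 + [10.0] * 10                            # a sharp edge
noisy = [v + 0.3 * (-1) ** i for i, v in enumerate(step)]  # small ripple noise
denoised = bilateral_1d(noisy, radius=3, sigma_s=2.0, sigma_r=1.0)
```

The small alternating ripple is strongly attenuated, while the 10-unit jump survives because samples across the edge receive negligible range weight.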

  1. Depth Images Filtering In Distributed Streaming

    Directory of Open Access Journals (Sweden)

    Dziubich Tomasz

    2016-04-01

Full Text Available In this paper, we propose a distributed system for point cloud processing and for transferring point clouds via a computer network with regard to effectiveness-related requirements. We compare point cloud filters, focusing on their usage for streaming optimization. For the filtering step of the stream pipeline processing we evaluate four filters: Voxel Grid, Radius Outlier Removal, Statistical Outlier Removal and Pass Through. For each of the filters we perform a series of tests evaluating the impact on point cloud size and transmission frequency (analysed for various fps ratios). We present results of the optimization process used for point cloud consolidation in a distributed environment. We describe the processing of the point clouds before and after transmission. Pre- and post-processing allow the user to send the cloud via the network without delays. The proposed pre-processing compression of the cloud and post-processing reconstruction are focused on ensuring that the end-user application obtains the cloud with a given precision.
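The Statistical Outlier Removal filter evaluated above can be sketched as a brute-force computation; the neighbourhood size and threshold below are illustrative defaults in the style of PCL, not the paper's settings.

```python
import math

def statistical_outlier_removal(points, k=4, std_ratio=1.0):
    """Keep points whose mean distance to their k nearest neighbours does not
    exceed the global mean by more than std_ratio standard deviations
    (brute-force O(n^2) sketch of the PCL-style filter)."""
    mean_knn = []
    for i, p in enumerate(points):
        d = sorted(math.dist(p, q) for j, q in enumerate(points) if j != i)
        mean_knn.append(sum(d[:k]) / k)
    mu = sum(mean_knn) / len(mean_knn)
    sd = math.sqrt(sum((m - mu) ** 2 for m in mean_knn) / len(mean_knn))
    thresh = mu + std_ratio * sd
    return [p for p, m in zip(points, mean_knn) if m <= thresh]

# A flat 5x5 grid plus one isolated point far above it.
cloud = [(x * 0.1, y * 0.1, 0.0) for x in range(5) for y in range(5)]
cloud.append((5.0, 5.0, 5.0))
filtered = statistical_outlier_removal(cloud)
```

Production implementations use a k-d tree for the neighbour search; the statistics are the same.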

  2. A fast color image enhancement algorithm based on Max Intensity Channel

    Science.gov (United States)

    Sun, Wei; Han, Long; Guo, Baolong; Jia, Wenyan; Sun, Mingui

    2014-03-01

In this paper, we extend image enhancement techniques based on the retinex theory imitating human visual perception of scenes containing high illumination variations. This extension achieves simultaneous dynamic range modification, color consistency, and lightness rendition without multi-scale Gaussian filtering, which has a certain halo effect. The reflection component is analyzed based on the illumination and reflection imaging model. A new prior named Max Intensity Channel (MIC) is introduced, assuming that the reflections of some points in the scene are very high in at least one color channel. Using this prior, the illumination of the scene is obtained directly by performing a gray-scale closing operation and a fast cross-bilateral filtering on the MIC of the input color image. Consequently, the reflection component of each RGB color channel can be determined from the illumination and reflection imaging model. The proposed algorithm estimates an illumination component that is relatively smooth and maintains the edge details in different regions. A satisfactory color rendition is achieved for a class of images that do not satisfy the gray-world assumption implicit in the theoretical foundation of the retinex. Experiments are carried out to compare the new method with several spatial and transform domain methods. Our results indicate that the new method is superior in enhancement applications, improves computation speed, and performs better for images with high illumination variations than other methods. Further comparisons of images from the National Aeronautics and Space Administration and a wearable camera, eButton, have shown high performance of the new method, with better color restoration and preservation of image details.
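The MIC prior and the gray-scale closing step can be sketched on a toy image as below; the cross-bilateral filtering stage of the actual algorithm is omitted, and the pixel values are invented for illustration.

```python
def max_intensity_channel(rgb):
    """MIC prior: per-pixel maximum over the R, G, B channels."""
    return [[max(px) for px in row] for row in rgb]

def grey_closing(img, size=3):
    """Gray-scale closing: dilation (windowed max) followed by erosion
    (windowed min) with a flat square structuring element."""
    def windowed(m, op):
        h, w, r = len(m), len(m[0]), size // 2
        return [[op(m[i2][j2]
                    for i2 in range(max(0, i - r), min(h, i + r + 1))
                    for j2 in range(max(0, j - r), min(w, j + r + 1)))
                 for j in range(w)] for i in range(h)]
    return windowed(windowed(img, max), min)

# Toy 3x3 image: the dark centre pixel dips in the MIC map, and the
# closing removes the dip, giving a smooth illumination estimate.
rgb = [[(200, 10, 10), (10, 200, 10), (10, 10, 200)],
       [(200, 10, 10), (5, 5, 5), (10, 10, 200)],
       [(200, 10, 10), (10, 200, 10), (10, 10, 200)]]
illum = grey_closing(max_intensity_channel(rgb))
```

Dividing each channel by the estimated illumination would then recover the reflection component of the imaging model.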

  3. Filtering Photogrammetric Point Clouds Using Standard LIDAR Filters Towards DTM Generation

    Science.gov (United States)

    Zhang, Z.; Gerke, M.; Vosselman, G.; Yang, M. Y.

    2018-05-01

Digital Terrain Models (DTMs) can be generated from point clouds acquired by laser scanning or photogrammetric dense matching. During the last two decades, much effort has been devoted to developing robust filtering algorithms for airborne laser scanning (ALS) data. With the quality of point clouds from dense image matching (DIM) getting better and better, the research question that arises is whether standard Lidar filters can be used to filter photogrammetric point clouds as well. Experiments are carried out to filter two dense matching point clouds with different noise levels. Results show that the standard Lidar filter is robust to random noise. However, artefacts and blunders in the DIM points often appear due to low contrast or poor texture in the images, and filtering will be erroneous in these locations. Filtering DIM points pre-processed by a ranking filter brings a higher Type II error (i.e. non-ground points labelled as ground points) but a much lower Type I error (i.e. bare ground points labelled as non-ground points). Finally, the potential DTM accuracy that can be achieved with DIM points is evaluated. Two DIM point clouds derived by Pix4Dmapper and SURE are compared. On grassland, dense matching generates points higher than the true terrain surface, which results in incorrectly elevated DTMs. The application of the ranking filter leads to a reduced bias in the DTM height, but a slightly increased noise level.
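The Type I / Type II error rates used above can be computed as follows, taking label 1 as bare ground and 0 as non-ground (a labelling convention assumed here for illustration).

```python
def filtering_errors(truth, pred):
    """Type I error: fraction of ground points (label 1) rejected as non-ground.
    Type II error: fraction of non-ground points (label 0) accepted as ground."""
    ground = [(t, p) for t, p in zip(truth, pred) if t == 1]
    non_ground = [(t, p) for t, p in zip(truth, pred) if t == 0]
    type1 = sum(1 for _, p in ground if p == 0) / len(ground)
    type2 = sum(1 for _, p in non_ground if p == 1) / len(non_ground)
    return type1, type2

# Toy reference labels vs. filter output.
truth = [1, 1, 1, 1, 0, 0, 0, 0]
pred = [1, 1, 0, 1, 0, 1, 0, 0]
t1, t2 = filtering_errors(truth, pred)
```

The trade-off reported in the abstract is exactly a shift between these two rates: pre-processing with the ranking filter lowers Type I at the cost of Type II.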

  4. A Novel Image Tag Completion Method Based on Convolutional Neural Transformation

    KAUST Repository

    Geng, Yanyan; Zhang, Guohui; Li, Weizhi; Gu, Yi; Liang, Ru-Ze; Liang, Gaoyuan; Wang, Jingbin; Wu, Yanbin; Patil, Nitin; Wang, Jing-Yan

    2017-01-01

In the problems of image retrieval and annotation, complete textual tag lists of images play critical roles. However, in real-world applications, image tags are usually incomplete, so it is important to learn the complete tags for images. In this paper, we study the problem of image tag completion and propose a novel method for this problem based on a popular image representation method, the convolutional neural network (CNN). The method estimates the complete tags from the convolutional filtering outputs of images based on a linear predictor. The CNN parameters, the linear predictor, and the complete tags are learned jointly by our method. We build a minimization problem to encourage consistency between the complete tags and the available incomplete tags, reduce the estimation error, and reduce the model complexity. An iterative algorithm is developed to solve the minimization problem. Experiments over benchmark image data sets show its effectiveness.

  6. Voltage harmonic elimination with RLC based interface smoothing filter

    International Nuclear Information System (INIS)

    Chandrasekaran, K; Ramachandaramurthy, V K

    2015-01-01

A method is proposed for designing a Dynamic Voltage Restorer (DVR) with an RLC interface smoothing filter. The RLC filter, connected to the IGBT-based Voltage Source Inverter (VSI), is intended to eliminate voltage harmonics in the busbar voltage and switching harmonics from the VSI by producing a PWM-controlled harmonic voltage. In this method, the DVR or series active filter produces a PWM voltage that cancels the existing harmonic voltage due to any harmonic voltage source. The proposed method is valid for any distorted busbar voltage. The operating VSI handles no active power, only harmonic power. The DVR is able to suppress the lower order switching harmonics generated by the IGBT-based VSI. Good dynamic and transient results were obtained. The Total Harmonic Distortion (THD) is minimized to zero at the sensitive load end. Digital simulations are carried out using PSCAD/EMTDC to validate the performance of the RLC filter. Simulated results are presented. (paper)

  7. Computational multispectral video imaging [Invited].

    Science.gov (United States)

    Wang, Peng; Menon, Rajesh

    2018-01-01

    Multispectral imagers reveal information unperceivable to humans and conventional cameras. Here, we demonstrate a compact single-shot multispectral video-imaging camera by placing a micro-structured diffractive filter in close proximity to the image sensor. The diffractive filter converts spectral information to a spatial code on the sensor pixels. Following a calibration step, this code can be inverted via regularization-based linear algebra to compute the multispectral image. We experimentally demonstrated spectral resolution of 9.6 nm within the visible band (430-718 nm). We further show that the spatial resolution is enhanced by over 30% compared with the case without the diffractive filter. We also demonstrate Vis-IR imaging with the same sensor. Because no absorptive color filters are utilized, sensitivity is preserved as well. Finally, the diffractive filters can be easily manufactured using optical lithography and replication techniques.
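The regularization-based linear inversion mentioned above can be illustrated with a tiny Tikhonov (ridge) recovery; the 3-pixel, 2-band system matrix below is invented for illustration and is far smaller than a real spectral calibration.

```python
def ridge_recover_2band(A, y, lam):
    """Tikhonov-regularized recovery of a 2-band spectrum s from readings
    y = A s: solves (A^T A + lam I) s = A^T y in closed form (2x2 case)."""
    n = len(A)
    ata = [[sum(A[r][i] * A[r][j] for r in range(n)) + (lam if i == j else 0.0)
            for j in range(2)] for i in range(2)]
    aty = [sum(A[r][i] * y[r] for r in range(n)) for i in range(2)]
    det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
    return [(ata[1][1] * aty[0] - ata[0][1] * aty[1]) / det,
            (ata[0][0] * aty[1] - ata[1][0] * aty[0]) / det]

# Invented calibration: response of 3 sensor pixels to 2 spectral bands.
A = [[1.0, 0.2],
     [0.3, 1.0],
     [0.5, 0.5]]
s_true = [2.0, 1.0]
y = [sum(a * s for a, s in zip(row, s_true)) for row in A]  # noiseless readings
s_est = ridge_recover_2band(A, y, lam=1e-6)
```

In the real system the calibration matrix spans all sensor pixels and many wavelength bins, and the regularization weight trades noise amplification against spectral fidelity.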

  8. Investigation of the influence of image reconstruction filter and scan parameters on operation of automatic tube current modulation systems for different CT scanners

    International Nuclear Information System (INIS)

    Sookpeng, Supawitoo; Martin, Colin J.; Gentle, David J.

    2015-01-01

Variation in the user-selected CT scanning parameters under automatic tube current modulation (ATCM) between hospitals has a substantial influence on radiation doses and image quality for patients. The aim of this study was to investigate the effect of changing the image reconstruction filter and scan parameter settings on tube current, dose and image quality for various CT scanners operating under ATCM. The scan parameters varied were pitch factor, rotation time, collimator configuration, kVp, image thickness and the image filter convolution (FC) used for reconstruction. The Toshiba scanner varies the tube current to achieve a set target noise. Changes in the FC setting and the image thickness of the first reconstruction were the major factors affecting patient dose. A two-step change in FC from smoother to sharper filters doubles the dose, but is counterbalanced by an improvement in spatial resolution. In contrast, Philips and Siemens scanners maintained tube current values similar to those for a reference image and patient, and the tube current varied only slightly with changes in individual CT scan parameters. The selection of a sharp filter increased the image noise, while use of iDose iterative reconstruction reduced the noise. Since the principles used by CT manufacturers for ATCM vary, it is important that the parameters which affect patient dose and image quality for each scanner are made clear to the operator to aid optimisation. (authors)

  9. Caval penetration by retrievable inferior vena cava filters: a retrospective comparison of Option and Günther Tulip filters.

    Science.gov (United States)

    Olorunsola, Olufoladare G; Kohi, Maureen P; Fidelman, Nicholas; Westphalen, Antonio C; Kolli, Pallav K; Taylor, Andrew G; Gordon, Roy L; LaBerge, Jeanne M; Kerlan, Robert K

    2013-04-01

    To compare the frequency of vena caval penetration by the struts of the Option and Günther Tulip cone filters on postplacement computed tomography (CT) imaging. All patients who had an Option or Günther Tulip inferior vena cava (IVC) filter placed between January 2010 and May 2012 were identified retrospectively from medical records. Of the 208 IVC filters placed, the positions of 58 devices (21 Option filters, 37 Günther Tulip filters [GTFs]) were documented on follow-up CT examinations obtained for reasons unrelated to filter placement. In cases when multiple CT studies were obtained after placement, each study was reviewed, for a total of 80 examinations. Images were assessed for evidence of caval wall penetration by filter components, noting the number of penetrating struts and any effect on pericaval tissues. Penetration of at least one strut was observed in 17% of all filters imaged by CT between 1 and 447 days following placement. Although there was no significant difference in the overall prevalence of penetration when comparing the Option filter and GTF (Option, 10%; GTF, 22%), only GTFs showed time-dependent penetration, with penetration becoming more likely after prolonged indwelling times. No patient had damage to pericaval tissues or documented symptoms attributed to penetration. Although the Günther Tulip and Option filters exhibit caval penetration at CT imaging, only the GTF exhibits progressive penetration over time. Copyright © 2013 SIR. Published by Elsevier Inc. All rights reserved.

  10. Decision-Based Marginal Total Variation Diffusion for Impulsive Noise Removal in Color Images

    Directory of Open Access Journals (Sweden)

    Hongyao Deng

    2017-01-01

Full Text Available Impulsive noise removal for color images usually employs vector median filters, switching median filters, the total variation L1 method, and variants. These approaches, however, often introduce excessive smoothing and can result in extensive blurring of visual features, and thus are suitable only for images with low density noise. A marginal method to reduce impulsive noise that overcomes this limitation is proposed in this paper, based on the following facts: (i) each channel in a color image is contaminated independently, and the contaminating components are independent and identically distributed; (ii) in a natural image the gradients of different components of a pixel are similar to one another. This method divides components into different categories based on different noise characteristics. If an image is corrupted by salt-and-pepper noise, the components are divided into corrupted and noise-free components; if the image is corrupted by random-valued impulses, the components are divided into corrupted, noise-free, and possibly corrupted components. Components falling into different categories are processed differently. If a component is corrupted, modified total variation diffusion is applied; if it is possibly corrupted, scaled total variation diffusion is applied; otherwise, the component is left unchanged. Simulation results demonstrate its effectiveness.
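The per-channel classification for the salt-and-pepper case can be sketched as follows; treating the extreme values 0 and 255 as corrupted is the usual 8-bit assumption, and the toy image below is invented.

```python
def classify_channels(pixel, low=0, high=255):
    """Under salt-and-pepper noise a channel stuck at an extreme value is
    treated as corrupted; all other channels are noise-free."""
    return ['corrupted' if c in (low, high) else 'noise-free' for c in pixel]

def corruption_mask(image, low=0, high=255):
    """Per-channel boolean mask for an RGB image
    (True = candidate for total variation diffusion)."""
    return [[[c in (low, high) for c in px] for px in row] for row in image]

img = [[(255, 120, 60), (90, 0, 130)],
       [(40, 50, 60), (255, 255, 10)]]
mask = corruption_mask(img)
```

Only the masked components would then receive the modified total variation diffusion; the rest are left untouched, which is what limits the over-smoothing of vector-based filters.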

  11. Pornographic Web Page Filtering Using Word Matching and Skin Color Detection

    Directory of Open Access Journals (Sweden)

    Yusron Rijal

    2011-05-01

Full Text Available This paper presents an effort to detect pornographic webpages. It has been stated that a positive relationship exists between the percentage of human skin color in an image and the nudity of the image itself (Jones et al., 1998). Based on this observation, rather than using the traditional method of text filtering, this paper proposes a new approach to detect pornographic images using skin color detection. The skin color detection is performed using the RGB, HSI, and YCbCr color models. Using the algorithm stated by Ap-apid (Ap-apid, 2005), the system classifies images as nude or not nude. If one or more nude images are found, the system blocks the webpage. Keywords: Webpage Filtering, Image Processing, Pornography, Nudity, Skin Color, Nude Images
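A concrete skin-color rule can be sketched as follows; the rule shown is a widely used RGB heuristic (uniform daylight rule) and is an assumption for illustration, while the paper combines RGB, HSI, and YCbCr models, which are not reproduced here.

```python
def is_skin_rgb(r, g, b):
    """A common RGB skin heuristic; an assumption, not necessarily
    the paper's exact rule."""
    return (r > 95 and g > 40 and b > 20 and
            max(r, g, b) - min(r, g, b) > 15 and
            abs(r - g) > 15 and r > g and r > b)

def skin_fraction(pixels):
    """Fraction of pixels classified as skin; a page whose images exceed a
    chosen threshold would be flagged as potentially pornographic."""
    return sum(1 for p in pixels if is_skin_rgb(*p)) / len(pixels)

sample = [(220, 170, 140), (30, 60, 200), (200, 120, 90), (10, 10, 10)]
ratio = skin_fraction(sample)
```

A full detector would combine this per-pixel ratio with shape analysis, as in the nudity-classification algorithm cited in the abstract.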

  12. FDTD parallel computational analysis of grid-type scattering filter characteristics for medical X-ray image diagnosis

    International Nuclear Information System (INIS)

    Takahashi, Koichi; Miyazaki, Yasumitsu; Goto, Nobuo

    2007-01-01

X-ray diagnosis depends on the intensity of transmitted and scattered waves in X-ray propagation through biomedical media. X-rays are scattered and absorbed by tissues such as fat, bone and internal organs. However, image processing for medical diagnosis based on the scattering and absorption characteristics of these tissues in the X-ray spectrum has not been studied as extensively. To obtain precise information on tissues in a living body, accurate characteristics of scattering and absorption are required. In this paper, X-ray scattering and absorption in biomedical media are studied using a 2-dimensional finite difference time domain (FDTD) method. In the FDTD method, the size of the analysis space is severely limited by the performance of available computers. To overcome this limitation, a parallel and successive FDTD method is introduced. As a result of computer simulation, the amplitudes of transmitted and scattered waves are presented numerically. The fundamental filtering characteristics of the grid-type filter are also shown numerically. (author)

  13. Enhancement of nerve structure segmentation by a correntropy-based pre-image approach

    Directory of Open Access Journals (Sweden)

    J. Gil-González

    2017-05-01

    Full Text Available Peripheral Nerve Blocking (PNB) is a commonly used technique for performing regional anesthesia and managing pain. PNB comprises the administration of anesthetics in the proximity of a nerve. In this sense, the success of PNB procedures depends on accurate location of the target nerve. Recently, ultrasound images (UI) have been widely used to locate nerve structures for PNB, since they enable noninvasive visualization of the target nerve and the anatomical structures around it. However, UI are affected by speckle noise, which makes it difficult to accurately locate a given nerve. Thus, it is necessary to perform a filtering step to attenuate the speckle noise without eliminating relevant anatomical details that are required for high-level tasks, such as segmentation of nerve structures. In this paper, we propose a UI improvement strategy based on a pre-image filter. In particular, we map the input images by a nonlinear function (kernel). Specifically, we employ a correntropy-based mapping as the kernel functional to code higher-order statistics of the input data under both nonlinear and non-Gaussian conditions. We validate our approach on a UI dataset focused on nerve segmentation for PNB. Likewise, our Correntropy-based Pre-Image Filtering (CPIF) is applied as a pre-processing stage to segment nerve structures in a UI. The segmentation performance is measured in terms of the Dice coefficient. According to the results, we observe that CPIF finds a suitable approximation for UI by highlighting discriminative nerve patterns.
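
    The correntropy measure underlying such a mapping is, for two signals, the sample mean of a Gaussian kernel applied to their difference; it captures higher-order statistics beyond correlation. A minimal sketch (the kernel width `sigma` is an arbitrary choice, not a value from the paper):

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample estimate of correntropy V(x, y) = E[G_sigma(x - y)]
    with a Gaussian kernel G_sigma."""
    d = np.asarray(x, float) - np.asarray(y, float)
    return np.mean(np.exp(-d ** 2 / (2.0 * sigma ** 2)))

a = np.array([0.0, 1.0, 2.0])
print(correntropy(a, a))  # identical signals -> 1.0
```

    Identical signals give the kernel's maximum (1.0), while large pointwise differences are suppressed exponentially, which is why correntropy is robust to impulsive, non-Gaussian disturbances such as speckle.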

  14. Nonlinear filtering for character recognition in low quality document images

    Science.gov (United States)

    Diaz-Escobar, Julia; Kober, Vitaly

    2014-09-01

    Optical character recognition in scanned printed documents is a well-studied task, where the capture conditions such as sheet position, illumination, contrast and resolution are controlled. Nowadays, it is more practical to use mobile devices for document capture than a scanner. As a consequence, the quality of document images is often poor owing to the presence of geometric distortions, nonhomogeneous illumination, low resolution, etc. In this work we propose to use multiple adaptive nonlinear composite filters for detection and classification of characters. Computer simulation results obtained with the proposed system are presented and discussed.

  15. Artificial neural network (ANN)-based prediction of depth filter loading capacity for filter sizing.

    Science.gov (United States)

    Agarwal, Harshit; Rathore, Anurag S; Hadpe, Sandeep Ramesh; Alva, Solomon J

    2016-11-01

    This article presents an application of artificial neural network (ANN) modelling towards prediction of depth filter loading capacity for clarification of a monoclonal antibody (mAb) product during commercial manufacturing. The effect of operating parameters on filter loading capacity was evaluated based on the analysis of change in the differential pressure (DP) as a function of time. The proposed ANN model uses inlet stream properties (feed turbidity, feed cell count, feed cell viability), flux, and time to predict the corresponding DP. The ANN contained a single hidden layer with ten neurons and employed a sigmoidal activation function. This network was trained with 174 training points, 37 validation points, and 37 test points. Further, a pressure cut-off of 1.1 bar was used for sizing the filter area required under each operating condition. The modelling results showed that there was excellent agreement between the predicted and experimental data with a regression coefficient (R²) of 0.98. The developed ANN model was used for performing variable depth filter sizing for different clarification lots. Monte-Carlo simulation was performed to estimate the cost savings by using different filter areas for different clarification lots rather than using the same filter area. A 10% saving in cost of goods was obtained for this operation. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1436-1443, 2016. © 2016 American Institute of Chemical Engineers.
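
    The described topology (five inputs: feed turbidity, cell count, viability, flux, time; ten sigmoidal hidden neurons; one DP output) can be sketched as a forward pass. The weights below are random placeholders rather than the trained model from the article, and inputs are assumed to be pre-scaled:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 5 inputs -> 10 hidden sigmoid neurons -> 1 linear output (DP)
W1 = rng.normal(size=(5, 10)); b1 = np.zeros(10)
W2 = rng.normal(size=(10, 1)); b2 = np.zeros(1)

def predict_dp(x):
    """Forward pass of the 5-10-1 network. Weights here are random
    placeholders, not the trained model reported in the paper."""
    h = sigmoid(x @ W1 + b1)
    return (h @ W2 + b2).ravel()

# hypothetical pre-scaled operating point
x = np.array([[0.12, 0.08, 0.95, 0.5, 0.3]])
dp = predict_dp(x)
```

    In the sizing workflow above, such a forward pass would be evaluated over time until the predicted DP crosses the 1.1 bar cut-off, giving the loading capacity for that operating condition.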

  16. Model-based VQ for image data archival, retrieval and distribution

    Science.gov (United States)

    Manohar, Mareboyana; Tilton, James C.

    1995-01-01

    An ideal image compression technique for image data archival, retrieval and distribution would be one with the asymmetrical computational requirements of Vector Quantization (VQ), but without the complications arising from VQ codebooks. Codebook generation and maintenance are stumbling blocks which have limited the use of VQ as a practical image compression algorithm. Model-based VQ (MVQ), a variant of VQ described here, has the computational properties of VQ but does not require explicit codebooks. The codebooks are internally generated using mean-removed error and Human Visual System (HVS) models. The error model assumed is the Laplacian distribution with parameter lambda computed from a sample of the input image. Laplacian-distributed random numbers with parameter lambda are generated with a uniform random number generator. These random numbers are grouped into vectors, which are further conditioned to make them perceptually meaningful by filtering the DCT coefficients of each vector. The DCT coefficients are filtered by multiplying by a weight matrix that is found to be optimal for human perception, and the inverse DCT is performed to produce the conditioned vectors for the codebook. The only image-dependent parameter used in the generation of the codebook is lambda, which is included in the coded file so that the codebook generation process can be repeated for decoding.
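
    The codebook-generation steps described above (Laplacian random vectors, DCT-domain perceptual weighting, inverse DCT) can be sketched as follows; the frequency weights here are an illustrative stand-in for an HVS weight matrix, not the one used in the paper:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    m = np.cos(np.pi * (2 * i + 1) * k / (2 * n)) * np.sqrt(2.0 / n)
    m[0, :] /= np.sqrt(2.0)
    return m

def mvq_codebook(lam, n_vectors=256, dim=16, seed=0):
    """Generate a codebook from Laplacian noise with scale `lam`,
    perceptually conditioned in the DCT domain (illustrative weights)."""
    rng = np.random.default_rng(seed)
    vecs = rng.laplace(loc=0.0, scale=lam, size=(n_vectors, dim))
    D = dct_matrix(dim)
    # hypothetical HVS-style weighting: attenuate high DCT frequencies
    w = 1.0 / (1.0 + np.arange(dim))
    coeffs = (vecs @ D.T) * w        # forward DCT, then weight
    return coeffs @ D                # inverse DCT (D is orthonormal)

cb = mvq_codebook(lam=4.0)
```

    Because only `lam` and the seed determine the vectors, the decoder can regenerate an identical codebook from the scalar stored in the coded file, which is the key property MVQ exploits.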

  17. The 2D Hotelling filter - a quantitative noise-reducing principal-component filter for dynamic PET data, with applications in patient dose reduction

    International Nuclear Information System (INIS)

    Axelsson, Jan; Sörensen, Jens

    2013-01-01

    In this paper we apply the principal-component analysis filter (Hotelling filter) to reduce noise in dynamic positron-emission tomography (PET) patient data, for a number of different radio-tracer molecules. We furthermore show how preprocessing images with this filter improves parametric images created from such dynamic sequences. We use zero-mean unit-variance normalization prior to performing a Hotelling filter on the slices of a dynamic time series. The scree-plot technique was used to determine which principal components to reject in the filtering process. This filter was applied to [11C]-acetate on heart and head-neck tumors, [18F]-FDG on liver tumors and brain, and [11C]-Raclopride on brain. Simulations of blood and tissue regions with noise properties matched to real PET data were used to analyze how quantitation and resolution are affected by the Hotelling filter. Summing varying parts of a 90-frame [18F]-FDG brain scan, we created 9-frame dynamic scans with image statistics comparable to 20 MBq, 60 MBq and 200 MBq injected activity. Hotelling filtering performed on slices (2D) and on volumes (3D) was compared. The 2D Hotelling filter reduces noise in the tissue uptake drastically, so that it becomes simple to manually pick out regions-of-interest from noisy data. The 2D Hotelling filter introduces less bias than the 3D Hotelling filter in focal Raclopride uptake. Simulations show that the Hotelling filter is sensitive to the typical blood peak in PET before tissue uptake has commenced, introducing a negative bias in early tissue uptake. Quantitation on real dynamic data is reliable. Two examples clearly show that pre-filtering the dynamic sequence with the Hotelling filter prior to Patlak-slope calculations gives clearly improved parametric image quality. We also show that a dramatic dose reduction can be achieved for Patlak slope images without changing image quality or quantitation.
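
    The core of the Hotelling (principal-component) filter, zero-mean unit-variance normalization followed by rejection of minor components, can be sketched for a dynamic series. The scree-plot selection is replaced here by an explicit `n_keep` argument, and the demo data are synthetic:

```python
import numpy as np

def hotelling_filter(frames, n_keep):
    """PCA (Hotelling) filter for a dynamic series.
    frames: (T, H, W) time series; keeps the first n_keep components.
    Each pixel is normalized to zero mean / unit variance over time."""
    t, h, w = frames.shape
    x = frames.reshape(t, -1).astype(float)
    mu = x.mean(axis=0)
    sd = x.std(axis=0) + 1e-12
    xn = (x - mu) / sd
    # principal components over time via SVD
    u, s, vt = np.linalg.svd(xn, full_matrices=False)
    s_f = s.copy()
    s_f[n_keep:] = 0.0            # reject minor (noise) components
    xf = (u * s_f) @ vt
    return (xf * sd + mu).reshape(t, h, w)

# synthetic 9-frame series: two time-activity patterns plus noise
rng = np.random.default_rng(0)
base1, base2 = rng.random((8, 8)), rng.random((8, 8))
tac1 = np.exp(-np.arange(9) / 4.0)
tac2 = np.arange(9) / 8.0
signal = tac1[:, None, None] * base1 + tac2[:, None, None] * base2
noisy = signal + rng.normal(0.0, 0.1, signal.shape)
filtered = hotelling_filter(noisy, n_keep=3)
```

    Because the true dynamics span only a few temporal components, discarding the remaining components removes mostly noise, which is the mechanism behind the drastic noise reduction reported above.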

  18. Kalman Filter Based Tracking in a Video Surveillance System

    Directory of Open Access Journals (Sweden)

    SULIMAN, C.

    2010-05-01

    Full Text Available In this paper we have developed a Matlab/Simulink based model for tracking a contact in a video surveillance sequence. For the segmentation process and correct identification of a contact in a surveillance video, we have used the Horn-Schunck optical flow algorithm. The position and the behavior of the correctly detected contact were monitored with the help of the traditional Kalman filter. After that we compared the results obtained from the optical flow method with the ones obtained from the Kalman filter, and we show the correct functionality of the Kalman filter based tracking. The tests were performed using video data taken with a fixed camera. The tested algorithm has shown promising results.
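
    A minimal constant-velocity Kalman filter for position tracking, of the kind used above to monitor a detected contact, can be sketched as follows; the motion and noise parameters are illustrative, not values from the paper:

```python
import numpy as np

def kalman_track(measurements, dt=1.0, q=1e-3, r=1.0):
    """Constant-velocity Kalman filter for 2-D target tracking.
    State: [x, y, vx, vy]; measurements: (N, 2) noisy positions."""
    F = np.eye(4); F[0, 2] = F[1, 3] = dt        # motion model
    H = np.zeros((2, 4)); H[0, 0] = H[1, 1] = 1  # observe position only
    Q = q * np.eye(4); R = r * np.eye(2)
    x = np.zeros(4); P = np.eye(4) * 10.0
    track = []
    for z in measurements:
        # predict
        x = F @ x; P = F @ P @ F.T + Q
        # update with the new measurement
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(4) - K @ H) @ P
        track.append(x[:2].copy())
    return np.array(track)

# demo: straight-line target with noisy position measurements
rng = np.random.default_rng(0)
true = np.stack([np.arange(50) * 1.0, np.arange(50) * 0.5], axis=1)
meas = true + rng.normal(0.0, 0.5, true.shape)
est = kalman_track(meas, r=0.25)
```

    In a surveillance pipeline the measurement `z` at each frame would come from the segmentation step (e.g. the centroid of the optical-flow blob), and the filter would smooth and predict the contact's trajectory.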

  19. Spatial filtering self-velocimeter for vehicle application using a CMOS linear image sensor

    Science.gov (United States)

    He, Xin; Zhou, Jian; Nie, Xiaoming; Long, Xingwu

    2015-03-01

    The idea of using a spatial filtering velocimeter (SFV) to measure the velocity of a vehicle for an inertial navigation system is put forward. The presented SFV is based on a CMOS linear image sensor with a high-speed data rate, large pixel size, and built-in timing generator. These advantages make the image sensor suitable for measuring vehicle velocity. The power spectrum of the output signal is obtained by fast Fourier transform and is corrected by a frequency-spectrum correction algorithm. The velocimeter was used to measure the velocity of a conveyor belt driven by a rotary table, with a measurement uncertainty of ~0.54%. Furthermore, it was also installed on a vehicle together with a laser Doppler velocimeter (LDV) to measure self-velocity. The measurement result of the designed SFV is compared with that of the LDV, and the two are shown to be coincident. Therefore, the designed SFV is suitable for a vehicle self-contained inertial navigation system.
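
    In spatial filtering velocimetry the velocity follows from the spectral peak of the sensor signal, v = f_peak × p, where p is the spatial period of the filter. A sketch with a simulated sensor signal (sampling rate and pitch are illustrative values, not the paper's hardware parameters):

```python
import numpy as np

def sfv_velocity(signal, fs, pitch):
    """Estimate velocity from a spatial-filter output signal.
    fs: sampling rate (Hz); pitch: spatial period of the filter (m).
    v = f_peak * pitch, where f_peak is the spectral peak frequency."""
    n = len(signal)
    spec = np.abs(np.fft.rfft(signal * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    f_peak = freqs[np.argmax(spec[1:]) + 1]   # skip the DC bin
    return f_peak * pitch

# simulated sensor output: 200 Hz component sampled at 2 kHz
fs, pitch = 2000.0, 1e-3
t = np.arange(2048) / fs
sig = np.sin(2 * np.pi * 200.0 * t)
v = sfv_velocity(sig, fs, pitch)   # ~0.2 m/s
```

    The frequency-spectrum correction algorithm mentioned above refines `f_peak` beyond the FFT bin spacing; the plain arg-max here is the uncorrected estimate.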

  20. Spoof surface plasmon polaritons based notch filter for ultra-wideband microwave waveguide

    DEFF Research Database (Denmark)

    Xiao, Binggang; Li, Sheng-Hua; Xiao, Sanshui

    2016-01-01

    A spoof surface plasmon polariton (SPP) based notch filter for an ultra-wideband microwave waveguide is proposed. Owing to subwavelength confinement, such a filter has an advantage in structure size without sacrificing performance. The spoof SPP based notch is introduced to suppress WLAN and satellite communication signals. Due to the planar structure proposed here, the filter is easy to integrate into microwave integrated systems, where it can play an important role in microwave communication circuits and systems.

  1. Fast estimate of Hartley entropy in image sharpening

    Science.gov (United States)

    Krbcová, Zuzana; Kukal, Jaromír; Svihlik, Jan; Fliegel, Karel

    2016-09-01

    Two classes of linear IIR filters, Laplacian of Gaussian (LoG) and Difference of Gaussians (DoG), are frequently used as high-pass filters for contextual vision and edge detection. They are also used for image sharpening when linearly combined with the original image. The resulting sharpening filters are radially symmetric in the spatial and frequency domains. Our approach is based on a radial approximation of the unknown optimal filter, which is designed as a weighted sum of Gaussian filters with various radii. The novel filter is designed for MRI image enhancement, where the image intensity represents anatomical structure plus additive noise. We prefer the gradient norm of the Hartley entropy of the whole image intensity as the measure to be maximized for the best sharpening. The entropy estimation procedure is as fast as the FFT included in the filter, and this estimate is a continuous function of the enhanced image intensities. A physically motivated heuristic is used for optimum sharpening filter design by tuning its parameters. Our approach is compared with the Wiener filter on MRI images.
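
    Sharpening with a DoG high-pass, as described above, adds a band-pass residue back to the signal: out = in + α(G_σ1∗in − G_σ2∗in). A 1-D sketch with arbitrary parameters (the entropy-driven parameter tuning of the paper is omitted):

```python
import numpy as np

def gaussian_kernel_1d(sigma, radius):
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x ** 2 / (2 * sigma ** 2))
    return g / g.sum()

def dog_sharpen(row, sigma1=1.0, sigma2=2.0, alpha=1.0):
    """1-D illustration: sharpen a signal with a Difference-of-Gaussians
    high-pass, i.e. out = in + alpha * (G_s1 * in - G_s2 * in)."""
    r = 8
    g1 = gaussian_kernel_1d(sigma1, r)
    g2 = gaussian_kernel_1d(sigma2, r)
    lo1 = np.convolve(row, g1, mode='same')
    lo2 = np.convolve(row, g2, mode='same')
    return row + alpha * (lo1 - lo2)

edge = np.r_[np.zeros(32), np.ones(32)]
sharp = dog_sharpen(edge)
```

    Applied to a step edge, the band-pass term produces the characteristic overshoot and undershoot that make the transition look sharper; the heuristic above would tune σ1, σ2 and the Gaussian weights to maximize the entropy-gradient criterion.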

  2. An Integrated Dictionary-Learning Entropy-Based Medical Image Fusion Framework

    Directory of Open Access Journals (Sweden)

    Guanqiu Qi

    2017-10-01

    Full Text Available Image fusion is widely used in different areas and can integrate complementary and relevant information of source images captured by multiple sensors into a unitary synthetic image. Medical image fusion, as an important image fusion application, can extract the details of multiple images from different imaging modalities and combine them into an image that contains complete and non-redundant information for increasing the accuracy of medical diagnosis and assessment. The quality of the fused image directly affects medical diagnosis and assessment. However, existing solutions have some drawbacks in contrast, sharpness, brightness, blur and details. This paper proposes an integrated dictionary-learning and entropy-based medical image-fusion framework that consists of three steps. First, the input image information is decomposed into low-frequency and high-frequency components by using a Gaussian filter. Second, low-frequency components are fused by a weighted-average algorithm and high-frequency components are fused by a dictionary-learning based algorithm. In the dictionary-learning process for high-frequency components, an entropy-based algorithm is used for informative block selection. Third, the fused low-frequency and high-frequency components are combined to obtain the final fusion results. The results and analyses of comparative experiments demonstrate that the proposed medical image fusion framework has better performance than existing solutions.
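
    The three-step pipeline can be sketched in simplified form. Here the dictionary-learning fusion of the high-frequency components is replaced by a plain max-absolute selection, so this is only a structural illustration of the decomposition/fusion/recombination flow, not the paper's method:

```python
import numpy as np

def gaussian_blur(img, sigma=2.0, radius=6):
    """Separable Gaussian low-pass with reflect padding."""
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x ** 2 / (2 * sigma ** 2)); g /= g.sum()
    pad = np.pad(img, radius, mode='reflect')
    tmp = np.apply_along_axis(lambda r: np.convolve(r, g, mode='valid'), 1, pad)
    return np.apply_along_axis(lambda c: np.convolve(c, g, mode='valid'), 0, tmp)

def fuse(img_a, img_b):
    """Two-scale fusion: Gaussian low-pass bands are averaged; the
    high-pass (detail) bands are fused by max-absolute selection, a
    simple stand-in for the dictionary-learning step."""
    lo_a, lo_b = gaussian_blur(img_a), gaussian_blur(img_b)
    hi_a, hi_b = img_a - lo_a, img_b - lo_b
    lo = 0.5 * (lo_a + lo_b)                       # step 2a: weighted average
    hi = np.where(np.abs(hi_a) >= np.abs(hi_b), hi_a, hi_b)  # step 2b
    return lo + hi                                 # step 3: recombine

rng = np.random.default_rng(0)
img = rng.random((16, 16))
fused = fuse(img, img)
```

    A useful sanity check on any such pipeline is that fusing an image with itself reproduces the image, since low- and high-frequency parts sum back to the input.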

  3. Automatic brain MR image denoising based on texture feature-based artificial neural networks.

    Science.gov (United States)

    Chang, Yu-Ning; Chang, Herng-Hua

    2015-01-01

    Noise is one of the main sources of quality deterioration not only for visual inspection but also in computerized processing in brain magnetic resonance (MR) image analysis such as tissue classification, segmentation and registration. Accordingly, noise removal in brain MR images is important for a wide variety of subsequent processing applications. However, most existing denoising algorithms require laborious tuning of parameters that are often sensitive to specific image features and textures. Automation of these parameters through artificial intelligence techniques would be highly beneficial. In the present study, an artificial neural network associated with image texture feature analysis is proposed to establish a predictable parameter model and automate the denoising procedure. In the proposed approach, a total of 83 image attributes were extracted based on four categories: 1) basic image statistics; 2) gray-level co-occurrence matrix (GLCM); 3) gray-level run-length matrix (GLRLM); and 4) Tamura texture features. To obtain the ranking of discrimination in these texture features, a paired-samples t-test was applied to each individual image feature computed in every image. Subsequently, the sequential forward selection (SFS) method was used to select the best texture features according to the ranking of discrimination. The selected optimal features were further incorporated into a back propagation neural network to establish a predictable parameter model. A wide variety of MR images with various scenarios were adopted to evaluate the performance of the proposed framework. Experimental results indicated that this new automation system accurately predicted the bilateral filtering parameters and effectively removed the noise in a number of MR images. Compared to the manually tuned filtering process, our approach not only produced better denoised results but also saved significant processing time.
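
    Of the four feature categories, the GLCM is easy to illustrate: it counts co-occurrences of gray levels at a fixed pixel offset, and statistics such as contrast are derived from the normalized matrix. A minimal sketch for one horizontal offset (not the paper's full 83-feature pipeline):

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=4):
    """Normalized gray-level co-occurrence matrix for one offset."""
    m = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[img[y, x], img[y + dy, x + dx]] += 1
    return m / m.sum()

def glcm_contrast(p):
    """Contrast statistic: sum over p[i, j] * (i - j)^2."""
    i, j = np.indices(p.shape)
    return np.sum(p * (i - j) ** 2)

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
p = glcm(img)
c = glcm_contrast(p)
```

    Features like this contrast value, computed per image, are what the t-test ranking and SFS step would then filter down to the most discriminative subset before training the network.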

  4. Synthesis of Cascadable DDCC-Based Universal Filter Using NAM

    Directory of Open Access Journals (Sweden)

    Huu-Duy Tran

    2015-08-01

    Full Text Available A novel systematic approach for synthesizing DDCC-based voltage-mode biquadratic universal filters is proposed. The DDCCs are described by infinity-variables’ models of nullor-mirror elements which can be used in the nodal admittance matrix expansion process. Applying the proposed method, the obtained 12 equivalent filters offer the following features: multi-input and two outputs, realization of all five standard filter functions, namely lowpass, bandpass, highpass, notch and allpass, high-input impedance, employing only grounded capacitors and resistors, orthogonal controllability between pole frequency and quality factor, and cascadable, low active and passive sensitivities. The workability of some synthesized filters is verified by HSPICE simulations to demonstrate the feasibility of the proposed method.

  5. Wavelet-based de-noising algorithm for images acquired with parallel magnetic resonance imaging (MRI)

    International Nuclear Information System (INIS)

    Delakis, Ioannis; Hammad, Omer; Kitney, Richard I

    2007-01-01

    Wavelet-based de-noising has been shown to improve image signal-to-noise ratio in magnetic resonance imaging (MRI) while maintaining spatial resolution. Wavelet-based de-noising techniques typically implemented in MRI require that noise displays uniform spatial distribution. However, images acquired with parallel MRI have spatially varying noise levels. In this work, a new algorithm for filtering images with parallel MRI is presented. The proposed algorithm extracts the edges from the original image and then generates a noise map from the wavelet coefficients at finer scales. The noise map is zeroed at locations where edges have been detected and directional analysis is also used to calculate noise in regions of low-contrast edges that may not have been detected. The new methodology was applied on phantom and brain images and compared with other applicable de-noising techniques. The performance of the proposed algorithm was shown to be comparable with other techniques in central areas of the images, where noise levels are high. In addition, finer details and edges were maintained in peripheral areas, where noise levels are low. The proposed methodology is fully automated and can be applied on final reconstructed images without requiring sensitivity profiles or noise matrices of the receiver coils, therefore making it suitable for implementation in a clinical MRI setting
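
    The wavelet shrinkage idea, transform, threshold the detail coefficients, invert, can be sketched with a one-level Haar transform; the noise map, edge detection and directional analysis of the algorithm above are omitted:

```python
import numpy as np

def haar_denoise_1d(signal, thresh):
    """One-level orthonormal Haar transform with soft thresholding of
    the detail coefficients. Signal length must be even."""
    s = np.asarray(signal, float)
    a = (s[0::2] + s[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (s[0::2] - s[1::2]) / np.sqrt(2)   # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)  # soft shrink
    out = np.empty_like(s)
    out[0::2] = (a + d) / np.sqrt(2)       # inverse transform
    out[1::2] = (a - d) / np.sqrt(2)
    return out

# demo: piecewise-constant signal with additive Gaussian noise
clean = np.r_[np.zeros(32), np.ones(32)]
rng = np.random.default_rng(1)
noisy = clean + rng.normal(0.0, 0.2, clean.size)
denoised = haar_denoise_1d(noisy, thresh=0.3)
```

    The spatially varying noise of parallel MRI would correspond to making `thresh` a per-coefficient map rather than a single constant, which is essentially what the noise-map step above provides.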

  6. Improved automatic filtering methodology for an optimal pharmacokinetic modelling of DCE-MR images of the prostate

    Energy Technology Data Exchange (ETDEWEB)

    Vazquez Martinez, V.; Bosch Roig, I.; Sanz Requena, R.

    2016-07-01

    In Dynamic Contrast-Enhanced Magnetic Resonance (DCE-MR) studies with high temporal resolution, images are quite noisy due to the complicated trade-off between temporal and spatial resolution. For this reason, the temporal curves extracted from the images present remarkable noise levels, which affects the pharmacokinetic parameters calculated by least-squares fitting from the curves as well as the arterial phase (a useful marker in tumour diagnosis which appears in curves with a high arterial contribution). To address these limitations, an automatic filtering method was previously developed by our group. In this work, an advanced automatic filtering methodology is presented to further improve noise reduction of the temporal curves in order to obtain more accurate kinetic parameters and a proper modelling of the arterial phase. (Author)

  7. Enhancing Perceived Quality of Compressed Images and Video with Anisotropic Diffusion and Fuzzy Filtering

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Korhonen, Jari; Forchhammer, Søren

    2013-01-01

    and subjective results on JPEG compressed images, as well as MJPEG and H.264/AVC compressed video, indicate that the proposed algorithms employing directional and spatial fuzzy filters achieve better artifact reduction than other methods. In particular, robust improvements with H.264/AVC video have been gained...

  8. Comparative study of reconstruction filters for cranial examinations on a Philips system and their influence on tomographic image quality

    International Nuclear Information System (INIS)

    Silveira, V.C.; Kodlulovich, S.; Delduck, R.S.; Oliveira, L.C.G.

    2011-01-01

    The aim of this study was to evaluate different reconstruction algorithms (kernels) applied to head examinations. The research was carried out using a 40-slice MDCT (Philips Brilliance 40 CT scanner) and an ACR phantom to evaluate image quality. The doses were estimated by applying the coefficients obtained by IMPACT. The study showed that, independently of the filter used, the CT number values did not change. The low-contrast results showed that choosing the correct filter can yield a 9% decrease in dose values. For spatial resolution, the sharp filter showed a better response at low mAs. The image noise for certain smooth filters did not change, even when the mAs values were reduced. (author)

  9. Stability Study of Filtering Techniques in Images of the mini-MIAS Database

    Energy Technology Data Exchange (ETDEWEB)

    Parcero, E.; Vidal, V.; Verdu, G.; Mayo, P.

    2014-07-01

    The study of filtering techniques applied to medical imaging is particularly important because it can be decisive for an accurate diagnosis. This work aims to study the stability of the Fuzzy Peer Group Averaging filter when applied to mammographic images of different nature, with respect to the type of tissue, the abnormality found and the diagnosis. The results show that the filter is effective: a PSNR value of 27 was obtained by comparing the filtered image with the original, and a value of 17 by comparing the filtered image with the noise-contaminated one. They also show that the filter behaves properly regardless of the image characteristics. (Author)
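
    The PSNR figures quoted above follow from the usual definition, PSNR = 10·log10(peak² / MSE). A minimal sketch:

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    ref = np.asarray(reference, float)
    tst = np.asarray(test, float)
    mse = np.mean((ref - tst) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

ref = np.full((8, 8), 100.0)
noisy = ref + 10.0            # uniform error of 10 -> MSE = 100
print(round(psnr(ref, noisy), 2))  # prints 28.13
```

    Higher values mean the test image is closer to the reference, which is why the filtered-vs-original PSNR (27) being well above the filtered-vs-noisy PSNR (17) indicates effective noise removal.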

  10. Neural network Hilbert transform based filtered backprojection for fast inline x-ray inspection

    Science.gov (United States)

    Janssens, Eline; De Beenhouwer, Jan; Van Dael, Mattias; De Schryver, Thomas; Van Hoorebeke, Luc; Verboven, Pieter; Nicolai, Bart; Sijbers, Jan

    2018-03-01

    X-ray imaging is an important tool for quality control since it allows inspection of the interior of products in a non-destructive way. Conventional x-ray imaging, however, is slow and expensive. Inline x-ray inspection, on the other hand, can pave the way towards fast and individual quality control, provided that a sufficiently high throughput can be achieved at a minimal cost. To meet these criteria, an inline inspection acquisition geometry is proposed where the object moves and rotates on a conveyor belt while it passes a fixed source and detector. Moreover, for this acquisition geometry, a new neural-network-based reconstruction algorithm is introduced: the neural network Hilbert transform based filtered backprojection. The proposed algorithm is evaluated on both simulated and real inline x-ray data and has been shown to generate high-quality reconstructions of 400 × 400 pixels within 200 ms, thereby meeting the high-throughput criteria.

  11. Exposure reduction in general dental practice using digital x-ray imaging system for intraoral radiography with additional x-ray beam filter

    International Nuclear Information System (INIS)

    Shibuya, Hitoshi; Mori, Toshimichi; Hayakawa, Yoshihiko; Kuroyanagi, Kinya; Ota, Yoshiko

    1997-01-01

    To measure exposure reduction in general dental practice using digital x-ray imaging systems for intraoral radiography with additional x-ray beam filter. Two digital x-ray imaging systems, Pana Digital (Pana-Heraus Dental) and CDR (Schick Technologies), were applied for intraoral radiography in general dental practice. Due to the high sensitivity to x-rays, additional x-ray beam filters for output reduction were used for examination. An Orex W II (Osada Electric Industry) x-ray generator was operated at 60 kVp, 7 mA. X-ray output (air-kerma; Gy) necessary for obtaining clinically acceptable images was measured at 0 to 20 cm in 5 cm steps from the cone tip using an ionizing chamber type 660 (Nuclear Associates) and compared with those for Ektaspeed Plus film (Eastman Kodak). The Pana Digital system was used with the optional filter supplied by Pana-Heraus Dental which reduced the output to 38%. The exposure necessary to obtain clinically acceptable images was only 40% of that for the film. The CDR system was used with the Dental X-ray Beam Filter Kit (Eastman Kodak) which reduced the x-ray output to 30%. The exposure necessary to obtain clinically acceptable images was only 20% of that for the film. The two digital x-ray imaging systems, Pana Digital and CDR, provided large dose savings (60-80%) compared with Ektaspeed Plus film when applied for intraoral radiography in general dental practice. (author)

  12. The use of wavelet filters for reducing noise in posterior fossa Computed Tomography images

    International Nuclear Information System (INIS)

    Pita-Machado, Reinado; Perez-Diaz, Marlen; Lorenzo-Ginori, Juan V.; Bravo-Pino, Rolando

    2014-01-01

    Wavelet transform based de-noising, like wavelet shrinkage, gives good results in CT and affects the spatial resolution very little. Some applications are reconstruction methods, while others are a posteriori de-noising methods. De-noising after reconstruction is very difficult because the noise is non-stationary and has an unknown distribution. Methods which work in sinogram space do not have this problem, because they always work with a known noise distribution at that point. On the other hand, the posterior fossa in a head CT is a very complex region for physicians, because it is commonly affected by artifacts and noise which are not eliminated during the reconstruction procedure. This can lead to false positive evaluations. The purpose of our present work is to compare different wavelet shrinkage de-noising filters for noise reduction, particularly in images of the posterior fossa within CT scans, working in sinogram space. This work describes an experimental search for the best wavelets to reduce Poisson noise in Computed Tomography (CT) scans. Results showed that de-noising with wavelet filters improved the quality of the posterior fossa region in terms of an increased CNR, without noticeable structural distortions.

  13. Tunable double-channel filter based on two-dimensional ferroelectric photonic crystals

    International Nuclear Information System (INIS)

    Jiang, Ping; Ding, Chengyuan; Hu, Xiaoyong; Gong, Qihuang

    2007-01-01

    A tunable double-channel filter is presented, which is based on a two-dimensional nonlinear ferroelectric photonic crystal made of cerium doped barium titanate. The filtering properties of the photonic crystal filter can be tuned by adjusting the defect structure or by a pump light. The influences of the structure disorders caused by the perturbations in the radius or the position of air holes on the filtering properties are also analyzed

  14. Tunable double-channel filter based on two-dimensional ferroelectric photonic crystals

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Ping [State Key Laboratory for Mesoscopic Physics, Department of Physics, Peking University, Beijing 100871 (China); Ding, Chengyuan [State Key Laboratory for Mesoscopic Physics, Department of Physics, Peking University, Beijing 100871 (China); Hu, Xiaoyong [State Key Laboratory for Mesoscopic Physics, Department of Physics, Peking University, Beijing 100871 (China)]. E-mail: xiaoyonghu@pku.edu.cn; Gong, Qihuang [State Key Laboratory for Mesoscopic Physics, Department of Physics, Peking University, Beijing 100871 (China)]. E-mail: qhgong@pku.edu.cn

    2007-04-02

    A tunable double-channel filter is presented, which is based on a two-dimensional nonlinear ferroelectric photonic crystal made of cerium doped barium titanate. The filtering properties of the photonic crystal filter can be tuned by adjusting the defect structure or by a pump light. The influences of the structure disorders caused by the perturbations in the radius or the position of air holes on the filtering properties are also analyzed.

  15. Michelson interferometer based interleaver design using classic IIR filter decomposition.

    Science.gov (United States)

    Cheng, Chi-Hao; Tang, Shasha

    2013-12-16

    An elegant method to design a Michelson interferometer based interleaver using a classic infinite impulse response (IIR) filter, such as a Butterworth, Chebyshev, or elliptic filter, as a starting point is presented. The proposed design method allows engineers to design a Michelson interferometer based interleaver from specifications seamlessly. Simulation results are presented to demonstrate the validity of the proposed design method.

  16. Multi-tap complex-coefficient incoherent microwave photonic filters based on optical single-sideband modulation and narrow band optical filtering.

    Science.gov (United States)

    Sagues, Mikel; García Olcina, Raimundo; Loayssa, Alayn; Sales, Salvador; Capmany, José

    2008-01-07

    We propose a novel scheme to implement tunable multi-tap complex coefficient filters based on optical single sideband modulation and narrow band optical filtering. A four tap filter is experimentally demonstrated to highlight the enhanced tuning performance provided by complex coefficients. Optical processing is performed by the use of a cascade of four phase-shifted fiber Bragg gratings specifically fabricated for this purpose.

  17. Robust non-local median filter

    Science.gov (United States)

    Matsuoka, Jyohei; Koga, Takanori; Suetake, Noriaki; Uchino, Eiji

    2017-04-01

    This paper describes a novel image filter with superior performance on detail-preserving removal of random-valued impulse noise superimposed on natural gray-scale images. The non-local means filter is in the limelight as a way of removing Gaussian noise with superior performance on detail preservation. Referring to the fundamental concept of non-local means, we previously proposed a non-local median filter specialized for random-valued impulse noise removal. In non-local processing, the output of the filter is calculated from pixels in blocks which are similar to the block centered at the pixel of interest. As a result, aggressive noise removal is conducted without destroying the detailed structures of the original image. However, the performance of non-local processing decreases enormously in the case of high noise occurrence probability. A cause of this problem is that the superimposed noise disturbs accurate calculation of the similarity between blocks. To cope with this problem, we propose an improved non-local median filter which is robust to high levels of corruption by introducing a new similarity measure that considers the possibility of each pixel being the original signal. The effectiveness and validity of the proposed method are verified in a series of experiments using natural gray-scale images.
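
    The basic non-local median principle, take the median of the centre pixels of the most similar blocks inside a search window, can be sketched as below. The proposed corruption-aware similarity measure is not reproduced here; this is the plain variant that the paper improves upon:

```python
import numpy as np

def nonlocal_median(img, patch=3, search=7, n_similar=8):
    """Non-local median filter: for each pixel, rank all blocks in a
    search window by squared distance to the pixel's own block and
    take the median of the centre pixels of the n_similar best."""
    r, s = patch // 2, search // 2
    pad = np.pad(img.astype(float), r + s, mode='reflect')
    h, w = img.shape
    out = np.empty((h, w), dtype=float)
    for y in range(h):
        for x in range(w):
            cy, cx = y + r + s, x + r + s
            ref = pad[cy - r:cy + r + 1, cx - r:cx + r + 1]
            dists, centres = [], []
            for dy in range(-s, s + 1):
                for dx in range(-s, s + 1):
                    by, bx = cy + dy, cx + dx
                    blk = pad[by - r:by + r + 1, bx - r:bx + r + 1]
                    dists.append(np.sum((blk - ref) ** 2))
                    centres.append(pad[by, bx])
            idx = np.argsort(dists)[:n_similar]
            out[y, x] = np.median(np.array(centres)[idx])
    return out

# demo: flat image with a single impulse
img = np.full((9, 9), 100.0)
img[4, 4] = 255.0
out = nonlocal_median(img)
```

    On this toy input the impulse is voted out by the medians of the similar-block centres; the paper's contribution is to keep this behavior reliable when a large fraction of the pixels are corrupted and the block distances themselves become unreliable.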

  18. Initial results of a new generation dual source CT system using only an in-plane comb filter for ultra-high resolution temporal bone imaging.

    Science.gov (United States)

    Meyer, Mathias; Haubenreisser, Holger; Raupach, Rainer; Schmidt, Bernhard; Lietzmann, Florian; Leidecker, Christianne; Allmendinger, Thomas; Flohr, Thomas; Schad, Lothar R; Schoenberg, Stefan O; Henzler, Thomas

    2015-01-01

    To prospectively evaluate radiation dose and image quality of a third-generation dual-source CT (DSCT) system without a z-axis filter behind the patient for temporal bone CT. Forty-five patients were examined on a first-, second-, or third-generation DSCT in an ultra-high-resolution (UHR) temporal bone imaging mode. On the third-generation DSCT system, the tighter focal spot of 0.2 mm² removes the need for an additional z-axis filter, improving z-axis radiation dose efficiency. Images of 0.4 mm slice thickness were reconstructed using standard filtered back-projection or iterative reconstruction (IR) for the previous DSCT generations and a novel IR algorithm for the third-generation DSCT. Radiation dose and image quality were compared between the three DSCT systems. Subjective and objective image quality was statistically significantly highest for the third-generation DSCT compared with the first- and second-generation systems (all p values significant), and effective dose was lower for the third-generation examination than for the first- and second-generation DSCT. Temporal bone imaging without a z-axis UHR filter, combined with a novel third-generation IR algorithm, thus allows significantly higher image quality at a lower effective dose than the first two generations of DSCTs. • Omitting the z-axis filter allows a radiation dose reduction of 50% • A smaller focal spot of 0.2 mm² significantly improves spatial resolution • Ultra-high-resolution temporal bone CT helps to gain diagnostic information on the middle/inner ear.

  19. Influence of Respiratory Gating, Image Filtering, and Animal Positioning on High-Resolution Electrocardiography-Gated Murine Cardiac Single-Photon Emission Computed Tomography

    Directory of Open Access Journals (Sweden)

    Chao Wu

    2015-01-01

    Full Text Available Cardiac parameters obtained from single-photon emission computed tomographic (SPECT) images can be affected by respiratory motion, image filtering, and animal positioning. We investigated the influence of these factors on ultra-high-resolution murine myocardial perfusion SPECT. Five mice were injected with technetium-99m (99mTc) tetrofosmin, and each was scanned in supine and prone positions in a U-SPECT-II scanner with respiratory and electrocardiographic (ECG) gating. ECG-gated SPECT images were created without respiratory motion correction or with one of two different respiratory motion correction strategies. The images were filtered with a range of three-dimensional Gaussian kernels, after which end-diastolic volumes (EDVs), end-systolic volumes (ESVs), and left ventricular ejection fractions were calculated. No significant differences in the measured cardiac parameters were detected when any strategy to reduce or correct for respiratory motion was applied, whereas large differences (> 5%) in EDV and ESV were found between different animal positions. A linear relationship (p < .001) was found between the EDV or ESV and the kernel size of the Gaussian filter. In short, respiratory gating did not significantly affect the cardiac parameters of mice obtained with ultra-high-resolution SPECT, whereas the animal position and the image filters should be kept identical across scans in a comparative study to avoid systematic differences in measured cardiac parameters.
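    The dependence of measured volumes on filter kernel size is easy to reproduce on synthetic data: smoothing a volume with progressively wider 3-D Gaussian kernels systematically changes the volume recovered by a fixed-threshold segmentation. A minimal sketch using SciPy (the spherical phantom and the 50% threshold are illustrative assumptions, not the study's processing chain):

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    # Synthetic "ventricle": a solid sphere of activity in a 32^3 volume.
    z, y, x = np.mgrid[:32, :32, :32]
    phantom = (((z - 16) ** 2 + (y - 16) ** 2 + (x - 16) ** 2) <= 8 ** 2).astype(float)

    sigmas = [0.5, 1.0, 1.5, 2.0, 2.5]        # 3-D Gaussian kernel widths (voxels)
    volumes = [int(np.sum(gaussian_filter(phantom, s) >= 0.5)) for s in sigmas]

    # The volume recovered at a fixed 50% threshold changes systematically as
    # the kernel widens, so comparative studies must keep the filter identical
    # across scans.
    print(list(zip(sigmas, volumes)))
    ```

    The same reasoning applies to animal positioning: any factor that reshapes the reconstructed activity distribution shifts the segmented EDV and ESV, which is why the abstract recommends fixing both the filter and the position across scans.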

  20. Hot spot detection for breast cancer in Ki-67 stained slides: image dependent filtering approach

    Science.gov (United States)

    Niazi, M. Khalid Khan; Downs-Kelly, Erinn; Gurcan, Metin N.

    2014-03-01

    We present a new method to detect hot spots in breast cancer slides stained for Ki-67 expression. It is common practice to use the centroid of a nucleus as a surrogate representation of a cell, which requires detecting individual nuclei; once all nuclei are detected, hot spots are found by clustering the centroids. For large images, however, nuclei detection is computationally demanding. Instead of detecting individual nuclei and treating hot spot detection as a clustering problem, we cast hot spot detection as an image filtering problem in which positively stained pixels are used directly to detect hot spots in breast cancer images. The method first segments the Ki-67-positive pixels using the visually meaningful segmentation (VMS) method that we developed earlier. It then automatically generates an image-dependent filter that turns the segmented image into a density map. The smoothness of the density map simplifies the detection of local maxima, whose number corresponds directly to the number of hot spots in the breast cancer image. The method was tested on 23 regions of interest extracted from 10 breast cancer slides stained for Ki-67. To determine intra-reader variability, each image was annotated for hot spots twice by a board-certified pathologist, with a two-week interval between readings. A computer-generated hot spot region was considered a true positive if it agreed with either of the two annotation sets provided by the pathologist. While the intra-reader variability was 57%, the proposed method correctly detects hot spots with 81% precision.
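    The filtering view of hot-spot detection can be sketched with standard tools: smooth the binary mask of positively stained pixels into a density map, then take its local maxima as hot-spot centres. In the sketch below a fixed Gaussian stands in for the paper's image-dependent filter, and the mask, kernel width, and threshold are illustrative assumptions:

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter, maximum_filter

    def detect_hot_spots(positive_mask, sigma=8.0, rel_threshold=0.5):
        """Smooth the mask of positive pixels into a density map, then return
        the coordinates of local maxima dense enough to count as hot spots."""
        density = gaussian_filter(positive_mask.astype(float), sigma)
        window = int(4 * sigma) + 1                    # neighbourhood for maxima
        peaks = density == maximum_filter(density, size=window)
        peaks &= density >= rel_threshold * density.max()
        return np.argwhere(peaks), density

    # Two synthetic clusters of "positive" pixels yield two hot spots.
    mask = np.zeros((100, 100), dtype=bool)
    mask[20:29, 20:29] = True       # cluster centred at (24, 24)
    mask[70:79, 60:69] = True       # cluster centred at (74, 64)
    coords, density = detect_hot_spots(mask)
    print(coords)
    ```

    Because the density map is smooth, each concentration of positive pixels contributes exactly one local maximum, which is what lets the number of maxima stand in for the number of hot spots without ever segmenting individual nuclei.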