WorldWideScience

Sample records for gatan imaging filter

  1. Applications of custom scripting in digital micrograph: general image manipulation and utilities

    International Nuclear Information System (INIS)

    Mitchell, D.R.G.

    2002-01-01

    Full text: The Gatan Imaging Filter (GIF) uses a charge coupled device (CCD) camera to capture images and spectra. Image capture and manipulation is achieved through Gatan's Digital Micrograph software. This has many capabilities built in, and can be further extended through the installation of custom scripts. These are typically short programs written in a powerful scripting language, which permits many aspects of image acquisition and subsequent manipulation to be controlled by the user. Custom scripts can be added to the normal pull-down menus, producing a very flexible and easy-to-use environment. The scripts described here demonstrate how custom scripting can enhance the functionality of a modern analytical TEM equipped with, in this instance, a GIF. However, scripting will enhance any TEM using a CCD camera controlled through Digital Micrograph. The examples shown here include: a) a script to rotationally average a selected area diffraction pattern and produce a calibrated radial intensity profile, b) a utility script which monitors and graphically displays the CCD temperature as a function of time and c) a simple script to propagate image spatial calibrations to uncalibrated images, such as EFTEM images. Other scripts by the author, along with some scripting resources, are also discussed. Copyright (2002) Australian Society for Electron Microscopy Inc
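
    The rotational-averaging step described in example (a) can be sketched outside DigitalMicrograph. The NumPy code below is a generic re-implementation of the idea (bin pixels by integer radius about the pattern centre and average each bin), not the author's script, and the test pattern is illustrative.

```python
import numpy as np

def radial_profile(image, center=None):
    """Rotationally average a 2-D pattern about its centre.

    Returns (radii_in_pixels, mean_intensity_per_radius_bin).
    """
    h, w = image.shape
    if center is None:
        center = ((h - 1) / 2.0, (w - 1) / 2.0)
    y, x = np.indices((h, w))
    r = np.hypot(y - center[0], x - center[1])
    r_int = r.astype(int)                       # bin by integer radius
    total = np.bincount(r_int.ravel(), weights=image.ravel())
    counts = np.bincount(r_int.ravel())
    profile = total / np.maximum(counts, 1)     # mean intensity per bin
    return np.arange(profile.size), profile

# Example: a radially symmetric Gaussian spot averages to a decaying profile.
yy, xx = np.indices((65, 65))
pattern = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / 50.0)
radii, prof = radial_profile(pattern)
```

    A calibrated profile would then only require multiplying the pixel radii by the diffraction camera constant.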

  2. MR image reconstruction via guided filter.

    Science.gov (United States)

    Huang, Heyan; Yang, Hang; Wang, Kang

    2018-04-01

    Magnetic resonance imaging (MRI) reconstruction from the smallest possible set of Fourier samples has been a difficult problem in the medical imaging field. In this paper, we present a new approach based on a guided filter for an efficient MRI recovery algorithm. The guided filter is an edge-preserving smoothing operator and behaves better near edges than the bilateral filter. Our reconstruction method consists of two steps. First, we propose two cost functions that can be computed efficiently, yielding two different images. Second, the guided filter is applied to these two images for efficient edge-preserving filtering: one image serves as the guidance image and the other as the filtering input. Introducing the guided filter allows our reconstruction algorithm to recover more detail. We compare our reconstruction algorithm with some competitive MRI reconstruction techniques in terms of PSNR and visual quality. Simulation results are given to show the performance of the new method.
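
    A minimal sketch of the guided filter at the heart of such methods, in its standard local-linear-model form (box means computed with `scipy.ndimage.uniform_filter`); the radius and regularization `eps` below are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, src, radius=4, eps=1e-3):
    """Guided filter: within each window the output is modelled as
    q = a * guide + b, with (a, b) fitted to `src` by ridge regression;
    eps trades smoothing strength against edge preservation."""
    box = lambda a: uniform_filter(a, size=2 * radius + 1, mode="reflect")
    mean_I, mean_p = box(guide), box(src)
    var_I = box(guide * guide) - mean_I ** 2
    cov_Ip = box(guide * src) - mean_I * mean_p
    a = cov_Ip / (var_I + eps)            # local slope
    b = mean_p - a * mean_I               # local intercept
    return box(a) * guide + box(b)        # average the local models, evaluate

# Toy check: denoise a noisy step edge using the clean edge as guidance.
rng = np.random.default_rng(0)
edge = np.zeros((32, 32)); edge[:, 16:] = 1.0
noisy = edge + 0.05 * rng.standard_normal(edge.shape)
smoothed = guided_filter(edge, noisy)
```

    Because the guidance image carries the edge, noise is averaged away in flat regions while the step survives.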

  3. A Digital Image Denoising Algorithm Based on Gaussian Filtering and Bilateral Filtering

    Directory of Open Access Journals (Sweden)

    Piao Weiying

    2018-01-01

    Full Text Available Bilateral filtering has been widely applied in digital image processing, but in high-gradient regions of an image it may generate a staircase effect. Bilateral filtering can be regarded as one particular form of local mode filtering. Based on this analysis, a mixed image denoising algorithm is proposed that combines Gaussian filtering and bilateral filtering. First, a Gaussian filter is applied to the noisy image to obtain a reference image; then both the reference image and the noisy image are taken as input to the range kernel of the bilateral filter. The reference image provides the image's low-frequency information, while the noisy image provides its high-frequency information. In a comparative experiment between this method and traditional bilateral filtering, the results showed that the mixed denoising algorithm effectively overcomes the staircase effect: the filtered image is smoother, its textural features are closer to the original image, and it achieves a higher PSNR value, while the computational cost of the two algorithms is essentially the same.
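
    The two-step scheme (Gaussian pre-smoothing to form a reference, then a bilateral range kernel driven by that reference) amounts to a joint bilateral filter. The sketch below is a generic illustration; all parameter values are assumptions, not the paper's.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def joint_bilateral(noisy, reference, sigma_s=2.0, sigma_r=0.1, radius=4):
    """Bilateral filter whose range weights are computed on `reference`
    rather than on the noisy image itself."""
    out = np.zeros_like(noisy)
    norm = np.zeros_like(noisy)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            spatial = np.exp(-(dy * dy + dx * dx) / (2.0 * sigma_s ** 2))
            shifted = np.roll(np.roll(noisy, dy, axis=0), dx, axis=1)
            ref_shift = np.roll(np.roll(reference, dy, axis=0), dx, axis=1)
            rng_w = np.exp(-(ref_shift - reference) ** 2 / (2.0 * sigma_r ** 2))
            w = spatial * rng_w
            out += w * shifted
            norm += w
    return out / norm

def mixed_denoise(noisy, sigma_pre=1.5, **kw):
    reference = gaussian_filter(noisy, sigma_pre)   # step 1: reference image
    return joint_bilateral(noisy, reference, **kw)  # step 2: joint bilateral

rng = np.random.default_rng(1)
clean = np.zeros((32, 32)); clean[:, 16:] = 1.0
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
denoised = mixed_denoise(noisy)
```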

  4. Pair distribution functions of carbonaceous solids, determined using energy filtered diffraction

    International Nuclear Information System (INIS)

    Petersen, T.C.; McCulloch, D.G.

    2002-01-01

    Full text: The structures of various carbonaceous solids were investigated using energy filtered diffraction patterns collected in two dimensions using a Gatan Imaging Filter (GIF). In order to reduce multiple scattering and eliminate inelastic scattering effects, the diffraction patterns were filtered using an energy-selecting slit around the zero-loss peak. Software has been developed for the extraction of radially averaged pair distribution functions from the diffraction data. This entails finding the position of the unscattered beam, radially averaging the two-dimensional intensity distributions, calibrating the resulting one-dimensional intensity profiles and finally normalising the data to obtain structure factors. Techniques for improving and assessing data quality, pertaining to the methodology used here, have also been explored. Structure factors and radial distribution functions generated using this analysis will be discussed and, for the commercial V25 glassy carbon samples, compared with previous work by one of the authors. In order to answer questions regarding multiple scattering effects and structural homogeneity of the samples, neutron scattering was performed on the Medium Resolution Powder Diffractometer (MRPD) at the Australian Nuclear Science and Technology Organisation (ANSTO) facility. A critical comparison of the neutron scattering and electron diffraction generated structure factors will be presented. Copyright (2002) Australian Society for Electron Microscopy Inc
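
    The final step from structure factor to pair distribution function is, in outline, a sine Fourier transform of the reduced intensity. The sketch below assumes the common convention G(r) = (2/π) ∫ Q [S(Q) − 1] sin(Qr) dQ on a uniform Q grid, which may differ in detail from the authors' processing; the synthetic S(Q) is purely illustrative.

```python
import numpy as np

def reduced_pdf(q, s_of_q, r):
    """Reduced pair distribution function G(r) from a structure factor S(Q):
        G(r) = (2 / pi) * integral of Q * [S(Q) - 1] * sin(Q r) dQ,
    evaluated by a simple Riemann sum on a uniform Q grid."""
    dq = q[1] - q[0]
    f_of_q = q * (s_of_q - 1.0)            # reduced intensity F(Q)
    return np.array([(2.0 / np.pi) * np.sum(f_of_q * np.sin(q * ri)) * dq
                     for ri in r])

# Synthetic check: an S(Q) whose reduced intensity is sin(2Q) should put
# the dominant G(r) peak at r = 2.
q = np.linspace(0.01, 20.0, 4000)
r = np.linspace(0.5, 5.0, 451)
s_of_q = 1.0 + np.sin(2.0 * q) / q
g_of_r = reduced_pdf(q, s_of_q, r)
```

    In practice the finite Q range truncates this transform, which is one source of the termination ripples that data-quality checks must contend with.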

  5. Two-dimensional filtering of SPECT images using the Metz and Wiener filters

    International Nuclear Information System (INIS)

    King, M.A.; Schwinger, R.B.; Penney, B.C.; Doherty, P.W.

    1984-01-01

    Presently, single photon emission computed tomographic (SPECT) images are usually reconstructed by arbitrarily selecting a one-dimensional "window" function for use in reconstruction. A better method would be to choose automatically among a family of two-dimensional image restoration filters in such a way as to produce "optimum" image quality. Two-dimensional image processing techniques offer the advantages of a larger statistical sampling of the data for better noise reduction, and two-dimensional image deconvolution to correct for blurring during acquisition. An investigation of two such "optimal" digital image restoration techniques (the count-dependent Metz filter and the Wiener filter) was made. They were applied both as two-dimensional "window" functions for preprocessing SPECT images, and for filtering reconstructed images. Their performance was compared by measuring image contrast and percent fractional standard deviation (% FSD) in multiple acquisitions of the Jaszczak SPECT phantom at two different count levels. A statistically significant increase in image contrast and decrease in % FSD was observed with these techniques when compared to the results of reconstruction with a ramp filter. The adaptability of the techniques was manifested in a smaller reduction in % FSD at the high count level, coupled with a greater enhancement in image contrast. Using an array processor, processing time was 0.2 s per image for the Metz filter and 3 s for the Wiener filter. It is concluded that two-dimensional digital image restoration with these techniques can produce a significant increase in SPECT image quality.
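
    The count-dependent Metz filter has the closed form M = (1 − (1 − MTF²)^x) / MTF, where the power x grows with the count level. The sketch below assumes a Gaussian MTF for illustration; a real application would use the measured system response rather than this assumption.

```python
import numpy as np

def metz_filter_2d(shape, fwhm_pix, power):
    """2-D Metz filter sampled on the DFT frequency grid.

    M = (1 - (1 - MTF^2)**power) / MTF restores (amplifies) mid
    frequencies where the MTF is moderate and rolls off to zero where
    the MTF -- and hence the data -- dies out; `power` is the
    count-dependent parameter (higher counts permit a higher power).
    """
    fy = np.fft.fftfreq(shape[0])
    fx = np.fft.fftfreq(shape[1])
    f = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))
    sigma = fwhm_pix / 2.3548                      # PSF sigma in pixels
    mtf = np.exp(-2.0 * (np.pi * sigma * f) ** 2)  # FT of a Gaussian PSF
    return (1.0 - (1.0 - mtf ** 2) ** power) / np.maximum(mtf, 1e-12)

def apply_filter(image, filt):
    """Multiply in the frequency domain and transform back."""
    return np.real(np.fft.ifft2(np.fft.fft2(image) * filt))

filt = metz_filter_2d((64, 64), fwhm_pix=4.0, power=3.0)
```

    Note that M equals 1 at zero frequency, exceeds 1 over the restorable mid-band, and falls to 0 at high frequency, which is exactly the low-pass-plus-restoration behaviour described above.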

  6. Frequency Domain Image Filtering Using CUDA

    Directory of Open Access Journals (Sweden)

    Muhammad Awais Rajput

    2014-10-01

    Full Text Available In this paper, we investigate the implementation of image filtering in the frequency domain using NVIDIA's CUDA (Compute Unified Device Architecture). In contrast to signal and image filtering in the spatial domain, which uses convolution operations and hence is more compute-intensive for filters having larger spatial extent, frequency domain filtering uses the FFT (Fast Fourier Transform), which is much faster and significantly reduces the computational complexity of the filtering. We implement the frequency domain filtering on CPU and GPU respectively and analyze the speed-up obtained from CUDA's parallel processing paradigm. In order to demonstrate the efficiency of frequency domain filtering on CUDA, we implement three frequency domain filters, i.e., Butterworth, low-pass and Gaussian, for processing different sizes of images on CPU and GPU respectively and perform GPU vs. CPU benchmarks. The results presented in this paper show that frequency domain filtering with CUDA achieves significant speed-up over CPU processing in the frequency domain, with the same level of (output) image quality on both processing architectures.
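
    The CPU-side pipeline (forward FFT, pointwise multiply by the filter's frequency response, inverse FFT) can be sketched in NumPy; the CUDA implementation runs the same three steps on the GPU. The Gaussian response and cutoff below are illustrative choices, not the paper's parameters.

```python
import numpy as np

def gaussian_lowpass_fft(image, cutoff):
    """Frequency-domain Gaussian low-pass filtering of a 2-D image.

    One forward FFT, a pointwise multiply by H(f) = exp(-|f|^2 / (2 c^2)),
    one inverse FFT: O(N log N), versus O(N * r^2) for spatial convolution
    with an r x r kernel.  `cutoff` is in cycles/pixel (0 .. 0.5).
    """
    fy = np.fft.fftfreq(image.shape[0])
    fx = np.fft.fftfreq(image.shape[1])
    f2 = np.add.outer(fy ** 2, fx ** 2)      # squared radial frequency
    H = np.exp(-f2 / (2.0 * cutoff ** 2))    # Gaussian transfer function
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))

rng = np.random.default_rng(2)
noise = rng.standard_normal((64, 64))
smooth = gaussian_lowpass_fft(noise, cutoff=0.08)
```

    Swapping H for a Butterworth response 1 / (1 + (f/c)^(2n)) changes only the one line that builds the transfer function, which is why benchmarking several filters in this framework is cheap.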

  7. Frequency domain image filtering using cuda

    International Nuclear Information System (INIS)

    Rajput, M.A.; Khan, U.A.

    2014-01-01

    In this paper, we investigate the implementation of image filtering in the frequency domain using NVIDIA's CUDA (Compute Unified Device Architecture). In contrast to signal and image filtering in the spatial domain, which uses convolution operations and hence is more compute-intensive for filters having larger spatial extent, frequency domain filtering uses the FFT (Fast Fourier Transform), which is much faster and significantly reduces the computational complexity of the filtering. We implement the frequency domain filtering on CPU and GPU respectively and analyze the speed-up obtained from CUDA's parallel processing paradigm. In order to demonstrate the efficiency of frequency domain filtering on CUDA, we implement three frequency domain filters, i.e., Butterworth, low-pass and Gaussian, for processing different sizes of images on CPU and GPU respectively and perform GPU vs. CPU benchmarks. The results presented in this paper show that frequency domain filtering with CUDA achieves significant speed-up over CPU processing in the frequency domain, with the same level of (output) image quality on both processing architectures. (author)

  8. Combination of Wiener filtering and singular value decomposition filtering for volume imaging PET

    International Nuclear Information System (INIS)

    Shao, L.; Lewitt, R.M.; Karp, J.S.

    1995-01-01

    Although the three-dimensional (3D) multi-slice rebinning (MSRB) algorithm in PET is fast and practical, and provides an accurate reconstruction, the MSRB image, in general, suffers from the noise amplified by its singular value decomposition (SVD) filtering operation in the axial direction. The authors' aim in this study is to combine the use of the Wiener filter (WF) with the SVD to decrease the noise and improve the image quality. The SVD filtering "deconvolves" the spatially variant axial response function while the WF suppresses the noise and reduces the blurring not modeled by the axial SVD filter but included in the system modulation transfer function. Therefore, the synthesis of these two techniques combines the advantages of both filters. The authors applied this approach to the volume imaging HEAD PENN-PET brain scanner with an axial extent of 256 mm. This combined filter was evaluated in terms of spatial resolution, image contrast, and signal-to-noise ratio with several phantoms, such as a cold sphere phantom and a 3D brain phantom. Specifically, the authors studied both the SVD filter with an axial Wiener filter and the SVD filter with a 3D Wiener filter, and compared the filtered images to those from the 3D reprojection (3DRP) reconstruction algorithm. Their results indicate that the Wiener filter increases the signal-to-noise ratio and also improves the contrast. For the MSRB images of the 3D brain phantom, after 3D WF, the Gray/White and Gray/Ventricle ratios were improved from 1.8 to 2.8 and from 2.1 to 4.1, respectively. In addition, the image quality with the MSRB algorithm is close to that of the 3DRP algorithm with 3D WF applied to both image reconstructions.

  9. Gabor filter based fingerprint image enhancement

    Science.gov (United States)

    Wang, Jin-Xiang

    2013-03-01

    Fingerprint recognition technology has become the most reliable biometric technology due to its uniqueness and invariance, making it one of the most convenient and reliable techniques for personal authentication. The development of Automated Fingerprint Identification Systems is an urgent need for modern information security. Meanwhile, the fingerprint preprocessing algorithm has played an important part in Automated Fingerprint Identification Systems. This article introduces the general steps in fingerprint recognition technology, namely image input, preprocessing, feature recognition, and fingerprint image enhancement. As the key to fingerprint identification technology, fingerprint image enhancement affects the accuracy of the system. The article focuses on the characteristics of the fingerprint image, the Gabor filter algorithm for fingerprint image enhancement, the theoretical basis of Gabor filters, and a demonstration of the filter. The enhancement algorithm is demonstrated on the Windows XP platform with MATLAB 6.5 as the development tool. The result shows that the Gabor filter is effective in fingerprint image enhancement.
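
    The building block of Gabor-based ridge enhancement is the Gabor kernel itself: a Gaussian envelope modulating a sinusoidal carrier tuned to the local ridge orientation and frequency. The NumPy sketch below constructs the real (even) kernel; the kernel size, orientation, frequency and sigma are illustrative values, not the paper's.

```python
import numpy as np

def gabor_kernel(ksize, theta, freq, sigma):
    """Real (even-symmetric) Gabor kernel: an isotropic Gaussian envelope
    times a cosine carrier of spatial frequency `freq` (cycles/pixel)
    oriented along `theta`.  Matched to ridge orientation/frequency,
    it smooths along ridges while sharpening across them."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotated coordinate
    envelope = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
    return envelope * np.cos(2.0 * np.pi * freq * xr)

k = gabor_kernel(ksize=21, theta=np.pi / 4, freq=0.1, sigma=4.0)
```

    Enhancement then convolves each image block with the kernel whose theta and freq were estimated from that block's orientation and ridge-frequency maps.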

  10. Color image guided depth image super resolution using fusion filter

    Science.gov (United States)

    He, Jin; Liang, Bin; He, Ying; Yang, Jun

    2018-04-01

    Depth cameras are currently playing an important role in many areas. However, most of them can only obtain low-resolution (LR) depth images, whereas color cameras can easily provide high-resolution (HR) color images. Using a color image as a guide is an efficient way to obtain an HR depth image. In this paper, we propose a depth image super resolution (SR) algorithm which uses an HR color image as a guide and an LR depth image as input. We use a fusion of the guided filter and an edge-based joint bilateral filter to obtain the HR depth image. Our experimental results on the Middlebury 2005 datasets show that our method provides better quality HR depth images both numerically and visually.

  11. Convex blind image deconvolution with inverse filtering

    Science.gov (United States)

    Lv, Xiao-Guang; Li, Fang; Zeng, Tieyong

    2018-03-01

    Blind image deconvolution is the process of estimating both the original image and the blur kernel from the degraded image with only partial or no information about the degradation and the imaging system. It is a bilinear ill-posed inverse problem corresponding to the direct problem of convolution. Regularization methods are used to handle the ill-posedness of blind deconvolution and obtain meaningful solutions. In this paper, we investigate a convex regularized inverse filtering method for blind deconvolution of images. We assume that the support region of the blur kernel is known, as has been done in a few existing works. By studying the inverse filters of signal and image restoration problems, we observe the oscillation structure of the inverse filters. Inspired by this oscillation structure, we propose to use the star norm to regularize the inverse filter. Meanwhile, we use the total variation to regularize the resulting image obtained by convolving the inverse filter with the degraded image. The proposed minimization model is shown to be convex. We employ the first-order primal-dual method for the solution of the proposed minimization model. Numerical examples for blind image restoration are given to show that the proposed method outperforms some existing methods in terms of peak signal-to-noise ratio (PSNR), structural similarity (SSIM), visual quality and time consumption.

  12. Efficient Filtering of Noisy Fingerprint Images

    Directory of Open Access Journals (Sweden)

    Maria Liliana Costin

    2016-01-01

    Full Text Available Fingerprint identification is an important field in the wide domain of biometrics, with many applications in areas such as the judiciary, mobile phones, access systems and airports. There are many elaborate algorithms for fingerprint identification, but none of them can guarantee that the results of identification are always 100% accurate. The first step in analysing a fingerprint image is pre-processing, or filtering; if the result of this step is not of good quality, the subsequent identification process can fail. A major difficulty can appear in fingerprint identification if the images to be identified from a fingerprint image database are corrupted by different types of noise. The objectives of the paper are: the successful filtering of noisy digital images, a novel, more robust algorithm for identifying the best filtering algorithm, and the classification and ranking of the images. The choice of the best filtered image from the set of 9 algorithms is made with a dual method combining a fuzzy model and an aggregation model. We propose in this paper a set of 9 novel filters for processing digital images using the following methods: quartiles, medians, averages, thresholds and histogram equalization, applied over the whole image or locally on small areas. Finally, statistics reveal the classification and ranking of the best algorithms.
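
    Of the order-statistic methods listed (quartiles, medians), the median filter is the simplest to sketch. The generic implementation below is an illustration of the family, not one of the paper's 9 filters.

```python
import numpy as np

def median_filter(image, radius=1):
    """Sliding-window median: each output pixel is the median of its
    (2r+1) x (2r+1) neighbourhood.  Order-statistic filters of this kind
    are robust to impulse ("salt and pepper") noise, which a mean filter
    would smear instead of removing."""
    h, w = image.shape
    padded = np.pad(image, radius, mode="edge")
    # Gather every offset of the window, then take the median across offsets.
    windows = [padded[dy:dy + h, dx:dx + w]
               for dy in range(2 * radius + 1)
               for dx in range(2 * radius + 1)]
    return np.median(np.stack(windows), axis=0)

# A single impulse is wiped out: the median of 9 values containing one
# outlier ignores the outlier entirely.
img = np.zeros((8, 8)); img[4, 4] = 1.0
cleaned = median_filter(img, radius=1)
```

    Replacing `np.median` with `np.percentile(..., 25, axis=0)` turns the same scaffold into a quartile filter.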

  13. Infrared image background modeling based on improved Susan filtering

    Science.gov (United States)

    Yuehua, Xia

    2018-02-01

    When the SUSAN filter is used to model the background of an infrared image, its Gaussian component lacks directional filtering ability: after filtering, the edge information of the image is not preserved well, leaving many edge singular points in the difference image and increasing the difficulty of target detection. To solve these problems, anisotropy is introduced in this paper, and an anisotropic Gaussian filter is used in place of the isotropic Gaussian filter in the SUSAN filter operator. First, an anisotropic gradient operator is used to calculate the horizontal and vertical gradients at each image point, which determine the direction of the filter's long axis. Second, the smoothness of the point's local area and neighborhood is used to calculate the variances along the filter's long and short axes. Then the first-order norm of the difference between the local area's gray level and its mean is calculated to determine the threshold of the SUSAN filter. Finally, the constructed SUSAN filter is convolved with the image to obtain the background image, and the difference between the background image and the original image is computed. Background modeling quality is evaluated by Mean Squared Error (MSE), Structural Similarity (SSIM) and local Signal-to-Noise Ratio Gain (GSNR). The experimental results show that, compared with the traditional filtering algorithm, the improved SUSAN filter achieves better background modeling: it effectively preserves edge information in the image, and dim small targets are effectively enhanced in the difference image, which greatly reduces the false alarm rate.

  14. Fast bilateral filtering of CT-images

    Energy Technology Data Exchange (ETDEWEB)

    Steckmann, Sven; Baer, Matthias; Kachelriess, Marc [Erlangen-Nuernberg Univ., Erlangen (Germany). Inst. of Medical Physics (IMP)

    2011-07-01

    The bilateral filter is able to reach a lower noise level while retaining the edges in images. Its downside is the high order of the problem itself: for a volume of size N, dimension d and a filter window of size r, the problem is of size N^d · r^d. In the literature there are proposals for speeding this up by reducing the order through approximating a component of the filter. This leads to inaccurate results, which often implies unacceptable artifacts for medical imaging. A better way for medical imaging is to speed up the filter itself while leaving its basic structure intact; this is the way our implementation goes. We solve the problem of calculating the function e^(-x) efficiently on modern architectures, and the problem of vectorizing the filtering process. As a result, we implemented a filter which is 2.5 times faster than the highly optimized basic approach. Comparing the basic analytical approach with the final algorithm, the differences in quality are negligible to the human eye. We are able to process a volume of 512^3 voxels with a 25 x 25 x 1 filter in 21 s on a modern Intel Xeon platform with two X5590 processors running at 3.33 GHz. (orig.)

  15. Quantum Image Filtering in the Frequency Domain

    Directory of Open Access Journals (Sweden)

    MANTA, V. I.

    2013-08-01

    Full Text Available In this paper we address the emerging field of Quantum Image Processing. We investigate the use of quantum computing systems to represent and manipulate images. In particular, we consider the basic task of image filtering. We prove that a quantum version for this operation can be achieved, even though the quantum convolution of two sequences is physically impossible. In our approach we use the principle of the quantum oracle to implement the filter function. We provide the quantum circuit that implements the filtering task and present the results of several simulation experiments on grayscale images. There are important differences between the classical and the quantum implementations for image filtering. We analyze these differences and show that the major advantage of the quantum approach lies in the exploitation of the efficient implementation of the quantum Fourier transform.

  16. Filter and Filter Bank Design for Image Texture Recognition

    Energy Technology Data Exchange (ETDEWEB)

    Randen, Trygve

    1997-12-31

    The relevance of this thesis to energy and environment lies in its application to remote sensing, for instance sea floor mapping and seismic pattern recognition. The focus is on the design of two-dimensional filters for feature extraction, segmentation, and classification of digital images with textural content. The features are extracted by filtering with a linear filter and estimating the local energy in the filter response. The thesis gives a review covering broadly most previous approaches to texture feature extraction and continues with proposals of some new techniques. 143 refs., 59 figs., 7 tabs.

  17. Independent component analysis based filtering for penumbral imaging

    International Nuclear Information System (INIS)

    Chen Yenwei; Han Xianhua; Nozaki, Shinya

    2004-01-01

    We propose a filtering method based on independent component analysis (ICA) for Poisson noise reduction. In the proposed filtering, the image is first transformed to the ICA domain and then the noise components are removed by soft thresholding (shrinkage). The proposed filter, which is used as a preprocessing step for the reconstruction, has been successfully applied to penumbral imaging. Both simulation and experimental results show that the reconstructed image is dramatically improved in comparison to that obtained without the noise-removing filter.

  18. Wiener discrete cosine transform-based image filtering

    Science.gov (United States)

    Pogrebnyak, Oleksiy; Lukin, Vladimir V.

    2012-10-01

    A classical problem of additive white (spatially uncorrelated) Gaussian noise suppression in grayscale images is considered. The main attention is paid to discrete cosine transform (DCT)-based denoising, in particular, to image processing in blocks of a limited size. The efficiency of DCT-based image filtering with hard thresholding is studied for different sizes of overlapped blocks. A multiscale approach that aggregates the outputs of DCT filters having different overlapped block sizes is proposed. Later, a two-stage denoising procedure that presumes the use of the multiscale DCT-based filtering with hard thresholding at the first stage and a multiscale Wiener DCT-based filtering at the second stage is proposed and tested. The efficiency of the proposed multiscale DCT-based filtering is compared to the state-of-the-art block-matching and three-dimensional filter. Next, the potentially reachable multiscale filtering efficiency in terms of output mean square error (MSE) is studied. The obtained results are of the same order as those obtained by Chatterjee's approach based on nonlocal patch processing. It is shown that the ideal Wiener DCT-based filter potential is usually higher when noise variance is high.
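
    The first-stage operation described above (block DCT, hard thresholding of small coefficients, inverse DCT) can be sketched as follows. For brevity the blocks here do not overlap, whereas the paper aggregates overlapped blocks of several sizes; the block size and threshold are illustrative.

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_denoise(image, block=8, threshold=0.1):
    """Block-DCT denoising with hard thresholding: transform each block,
    zero the small coefficients (mostly noise), keep the DC term, and
    transform back.  Image dimensions are assumed to be multiples of
    `block` for this sketch."""
    out = np.zeros_like(image)
    h, w = image.shape
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            coeffs = dctn(image[i:i + block, j:j + block], norm="ortho")
            dc = coeffs[0, 0]
            coeffs[np.abs(coeffs) < threshold] = 0.0   # hard threshold
            coeffs[0, 0] = dc                          # always keep the mean
            out[i:i + block, j:j + block] = idctn(coeffs, norm="ortho")
    return out

rng = np.random.default_rng(3)
clean = 0.5 * np.ones((16, 16))
noisy = clean + 0.05 * rng.standard_normal(clean.shape)
denoised = dct_denoise(noisy, block=8, threshold=0.1)
```

    The second (Wiener) stage would re-weight the surviving coefficients by an estimated signal-to-noise ratio instead of the hard keep/zero decision.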

  19. Ghost suppression in image restoration filtering

    Science.gov (United States)

    Riemer, T. E.; Mcgillem, C. D.

    1975-01-01

    An optimum image restoration filter is described in which provision is made to constrain the spatial extent of the restoration function, the noise level of the filter output and the rate of falloff of the composite system point-spread function away from the origin. Experimental results show that sidelobes on the composite system point-spread function produce ghosts in the restored image near discontinuities in intensity level. By redetermining the filter using a penalty function that is zero over the main lobe of the composite point-spread function of the optimum filter and nonzero where the point-spread function departs from a smoothly decaying function in the sidelobe region, a great reduction in sidelobe level is obtained. Almost no loss in resolving power of the composite system results from this procedure. By iteratively carrying out the same procedure, even further reductions in sidelobe level are obtained. Examples of original and iterated restoration functions are shown along with their effects on a test image.

  20. Image defog algorithm based on open close filter and gradient domain recursive bilateral filter

    Science.gov (United States)

    Liu, Daqian; Liu, Wanjun; Zhao, Qingguo; Fei, Bowen

    2017-11-01

    To address the fuzzy details, color distortion and low brightness of images produced by the dark channel prior defog algorithm, an image defog algorithm based on an open-close filter and a gradient domain recursive bilateral filter, referred to as OCRBF, is put forward. The OCRBF algorithm first uses a weighted quadtree to obtain a more accurate global atmospheric light value, then applies multiple-structuring-element morphological open and close filters to the minimum channel map to obtain a rough scattering map via the dark channel prior, uses the variogram to correct the transmittance map, applies the gradient domain recursive bilateral filter for smoothing, recovers the image through the image degradation model, and finally adjusts the contrast to obtain a bright, clear, fog-free image. A large number of experimental results show that the proposed defog method removes fog well and recovers the color and definition of foggy images containing close-range content, wide perspectives and bright areas. Compared with other image defog algorithms, it obtains clearer and more natural fog-free images with more visible detail; moreover, the time complexity of the SIDA algorithm is linearly related to the number of image pixels.
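
    The dark channel prior that OCRBF builds on can be sketched as below: a per-pixel minimum over color channels followed by a local minimum filter, from which a coarse transmission map is derived. The patch size and omega are common illustrative values; the morphological open-close refinement and recursive bilateral smoothing of the paper are omitted here.

```python
import numpy as np
from scipy.ndimage import minimum_filter

def dark_channel(image, patch=15):
    """Dark channel of an RGB image (H x W x 3, floats in [0, 1]):
    per-pixel minimum over the color channels, then a minimum filter
    over a local patch.  Haze-free regions have a dark channel near
    zero; haze lifts it toward the atmospheric light."""
    return minimum_filter(image.min(axis=2), size=patch, mode="nearest")

def transmission(image, atmosphere, omega=0.95, patch=15):
    """Coarse transmission map t = 1 - omega * dark_channel(I / A),
    where A is the (3,) global atmospheric light estimate."""
    normed = image / np.maximum(atmosphere, 1e-6)
    return 1.0 - omega * dark_channel(normed, patch)
```

    A haze-free dark scene yields t close to 1 (nothing to remove), while a fully washed-out scene yields t close to 1 − omega, which is what the subsequent degradation-model inversion relies on.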

  1. Filter's importance in nuclear cardiology imaging

    International Nuclear Information System (INIS)

    Jesus, Maria C. de; Lima, Ana L.S.; Santos, Joyra A. dos; Megueriam, Berdj A.

    2008-01-01

    Full text: Nuclear Medicine is a medical specialty which employs tomographic procedures for the diagnosis, treatment and prevention of diseases. One of the most commonly used devices is the Single Photon Emission Computed Tomography (SPECT) system. To perform an exam, a very small amount of a radiopharmaceutical is given to the patient; a gamma camera is then placed in convenient positions to perform the photon counting, which is used to reconstruct the full three-dimensional distribution of the radionuclide inside the body or organ. This reconstruction provides a three-dimensional image, in spatial coordinates, of the body or organ under study, allowing the physician to make the diagnosis. Image reconstruction is usually performed in the frequency domain, due to the great simplification introduced by the Fourier decomposition of image spectra. After the reconstruction, an inverse Fourier transform must be applied to bring the image back into spatial coordinates. To optimize this reconstruction procedure, digital filters are used to remove undesirable frequency components, which can 'shadow' relevant physical signatures of diseases. Unfortunately, the efficiency of the applied filter is strongly dependent on its own mathematical parameters. In this work we demonstrate how filters affect image quality in cardiology examinations with SPECT, concerning perfusion and myocardial viability, and the importance of the medical physicist in choosing the right filters, avoiding serious problems that could arise from inadequate processing of an image and damage the medical diagnosis. (author)

  2. Image pre-filtering for measurement error reduction in digital image correlation

    Science.gov (United States)

    Zhou, Yihao; Sun, Chen; Song, Yuntao; Chen, Jubing

    2015-02-01

    In digital image correlation, the sub-pixel intensity interpolation causes a systematic error in the measured displacements. The error increases toward the high-frequency components of the speckle pattern. In practice, a captured image is usually corrupted by additive white noise. The noise introduces additional energy in the high frequencies and therefore raises the systematic error. Meanwhile, the noise also elevates the random error, which increases with the noise power. In order to reduce the systematic error and the random error of the measurements, we apply a pre-filtering to the images prior to the correlation so that the high-frequency contents are suppressed. Two spatial-domain filters (binomial and Gaussian) and two frequency-domain filters (Butterworth and Wiener) are tested on speckle images undergoing both simulated and real-world translations. By evaluating the errors of the various combinations of speckle patterns, interpolators, noise levels, and filter configurations, we come to the following conclusions. All four filters are able to reduce the systematic error. Meanwhile, the random error can also be reduced if the signal power is mainly distributed around DC. For high-frequency speckle patterns, the low-pass filters (binomial, Gaussian and Butterworth) slightly increase the random error, and the Butterworth filter produces the lowest random error among them. By using the Wiener filter with over-estimated noise power, the random error can be reduced, but the resultant systematic error is higher than that of the low-pass filters. In general, the Butterworth filter is recommended for error reduction due to its flexibility of passband selection and maximal preservation of the allowed frequencies. The binomial filter enables efficient implementation and thus becomes a good option if computational cost is a critical issue. While used together with pre-filtering, the B-spline interpolator produces lower systematic error than the bicubic interpolator and a similar level of random error.

  3. A vector Wiener filter for dual-radionuclide imaging

    International Nuclear Information System (INIS)

    Links, J.M.; Prince, J.L.; Gupta, S.N.

    1996-01-01

    The routine use of a single radionuclide for patient imaging in nuclear medicine can be complemented by studies employing two tracers to examine two different processes in a single organ, most frequently by simultaneous imaging of both radionuclides in two different energy windows. In addition, simultaneous transmission/emission imaging with dual radionuclides has been described, with one radionuclide used for the transmission study and a second for the emission study. There is thus currently considerable interest in dual-radionuclide imaging. A major problem with all dual-radionuclide imaging is the crosstalk between the two radionuclides. Such crosstalk frequently occurs because scattered radiation from the higher energy radionuclide is detected in the lower energy window, and because the lower energy radionuclide may have higher energy emissions which are detected in the higher energy window. The authors have previously described the use of Fourier-based restoration filtering in single photon emission computed tomography (SPECT) and positron emission tomography (PET) to improve quantitative accuracy by designing a Wiener or other Fourier filter to partially restore the loss of contrast due to scatter and finite spatial resolution effects. The authors describe here the derivation and initial validation of an extension of such filtering for dual-radionuclide imaging that simultaneously (1) improves contrast in each radionuclide's direct image, (2) reduces image noise, and (3) reduces the crosstalk contribution from the other radionuclide. This filter is based on a vector version of the Wiener filter, which is shown to be superior [in the minimum mean square error (MMSE) sense] to the sequential application of separate crosstalk and restoration filters.

  4. FEI Titan G2 60-300 HOLO

    Directory of Open Access Journals (Sweden)

    Chris Boothroyd

    2016-02-01

    Full Text Available The FEI Titan G2 60-300 HOLO is a unique fourth generation transmission electron microscope, which has been specifically designed for the investigation of electromagnetic fields of materials using off-axis electron holography. It has a Lorentz lens to allow magnetic field-free imaging plus two electron biprisms, which in combination enable more uniform holographic fringes to be used. The instrument also has an ultra-wide objective lens pole piece gap which is ideal for in situ experiments. For these purposes, the FEI Titan G2 60-300 HOLO is equipped with a Schottky type high-brightness electron gun (FEI X-FEG), an image Cs corrector (CEOS), a post-column energy filter system (Gatan Tridiem 865 ER) as well as a 4 megapixel CCD system (Gatan UltraScan 1000 XP). Typical examples of use and technical specifications for the instrument are given below.

  5. Improving the quality of brain CT image from Wavelet filters

    International Nuclear Information System (INIS)

    Pita Machado, Reinaldo; Perez Diaz, Marlen; Bravo Pino, Rolando

    2012-01-01

    An algorithm to reduce Poisson noise using Wavelet filters is described. Five tomographic images of patients and a head anthropomorphic phantom were used, acquired with two different CT machines. Because the original images already contain acquisition noise, some simulated noise-free lesions were added to the images, and then the whole images were contaminated with noise. Contaminated images were filtered with 9 Wavelet filters at different decomposition levels and thresholds. Image quality of filtered and unfiltered images was graded using the Signal to Noise Ratio, Normalized Mean Square Error and the Structural Similarity Index, as well as by the subjective JAFROC method with 5 observers. Some filters, such as Bior 3.7 and dB45, improved head CT image quality in a significant way (p<0.05), producing an increment in SNR without visible structural distortions
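
    Wavelet filtering of this kind boils down to three steps: transform, shrink the small detail coefficients (treated as noise), and invert. A one-level Haar sketch in plain Python (illustrative only; the paper uses deeper decompositions and filters such as Bior 3.7 rather than Haar):

```python
import math

def haar_denoise(signal, threshold):
    """One-level Haar wavelet soft-threshold denoising of an even-length
    signal: forward transform, shrink detail coefficients toward zero,
    then invert."""
    n = len(signal)
    approx = [(signal[2 * i] + signal[2 * i + 1]) / math.sqrt(2) for i in range(n // 2)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / math.sqrt(2) for i in range(n // 2)]
    # soft-threshold the details; small coefficients are treated as noise
    detail = [math.copysign(max(abs(d) - threshold, 0.0), d) for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / math.sqrt(2))
        out.append((a - d) / math.sqrt(2))
    return out
```

Multi-level versions recurse on the approximation coefficients; the threshold per level is the tuning knob the paper explores.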

  6. Optimization of Butterworth filter for brain SPECT imaging

    International Nuclear Information System (INIS)

    Minoshima, Satoshi; Maruno, Hirotaka; Yui, Nobuharu

    1993-01-01

    A method is described for optimizing the cutoff frequency of the Butterworth filter for brain SPECT imaging. Since a computer simulation study demonstrated that the separation between the object signal and random noise in projection images in the spatial-frequency domain is influenced by the total number of counts, the cutoff frequency of the Butterworth filter should be optimized for individual subjects according to the total counts in a study. To reveal the relationship between optimal cutoff frequencies and total counts in brain SPECT, we used a normal volunteer and 99mTc hexamethyl-propyleneamine oxime (HMPAO) to obtain projection sets with different total counts. High quality images were created from a projection set with an acquisition time of 300 seconds per projection. The filter was optimized by calculating mean square errors against the high quality images and by visually inspecting filtered reconstructed images. The dependence of the optimal cutoff frequency on total counts was clearly demonstrated in a nomogram. Using this nomogram, the optimal cutoff frequency for each study can be estimated from the total counts, maximizing visual image quality. The results suggest that the cutoff frequency of the Butterworth filter should be determined by referring to the total counts in each study. (author)
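
    For reference, the Butterworth magnitude response is |H(f)| = 1 / sqrt(1 + (f/fc)^(2n)), and the optimization described above amounts to choosing fc to minimize the mean square error against a high-count reference. A toy sketch (the count-dependent nomogram itself is the paper's contribution and is not reproduced here):

```python
def butterworth_response(f, cutoff, order):
    """Magnitude response of the Butterworth low-pass filter used in
    SPECT reconstruction: |H(f)| = 1 / sqrt(1 + (f / fc)^(2n))."""
    return 1.0 / (1.0 + (f / cutoff) ** (2 * order)) ** 0.5

def pick_cutoff(spectrum, reference, freqs, cutoffs, order=4):
    """Choose the cutoff whose filtered spectrum is closest (in MSE) to a
    high-count reference spectrum: a toy version of optimizing the cutoff
    against high-quality images."""
    def mse(fc):
        return sum((butterworth_response(f, fc, order) * s - r) ** 2
                   for f, s, r in zip(freqs, spectrum, reference))
    return min(cutoffs, key=mse)
```

At f = fc the response is exactly 1/sqrt(2), the conventional half-power definition of the cutoff.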

  7. Image Recommendation Algorithm Using Feature-Based Collaborative Filtering

    Science.gov (United States)

    Kim, Deok-Hwan

    As the multimedia contents market continues its rapid expansion, the amount of image contents used in mobile phone services, digital libraries, and catalog service is increasing remarkably. In spite of this rapid growth, users experience high levels of frustration when searching for the desired image. Even though new images are profitable to the service providers, traditional collaborative filtering methods cannot recommend them. To solve this problem, in this paper, we propose feature-based collaborative filtering (FBCF) method to reflect the user's most recent preference by representing his purchase sequence in the visual feature space. The proposed approach represents the images that have been purchased in the past as the feature clusters in the multi-dimensional feature space and then selects neighbors by using an inter-cluster distance function between their feature clusters. Various experiments using real image data demonstrate that the proposed approach provides a higher quality recommendation and better performance than do typical collaborative filtering and content-based filtering techniques.

  8. The value of filtered planar images in pediatric DMSA scans

    International Nuclear Information System (INIS)

    Mohammed, A.M.; Naddaf, S.Y.; Elgazzar, A.H.; Al-Abdul Salam, A.A.; Omar, A.A.

    2006-01-01

    The study was designed to demonstrate the value of filtered planar images in paediatric DMSA scanning. One hundred and seventy-three patients, ranging in age from 15 days to 12 years (mean: 4.3 years), with urinary tract infection (UTI) and clinical and/or laboratory suspicion of acute pyelonephritis (APN), were retrospectively studied. Planar images were filtered using a Butterworth filter. The scan findings were reported as positive, negative or equivocal for cortical defects. Each scan was read in a double-blind fashion by two nuclear medicine physicians to evaluate inter-observer variation. Each kidney was divided into three zones (upper, middle and lower), and each zone was graded as positive, negative or equivocal for the presence of renal defects. Renal cortical defects were found in 66 patients (91 kidneys and 186 zones) with filtered images, 58 patients (81 kidneys and 175 zones) with planar images, and 69 patients (87 kidneys and 180 zones) with SPECT images. McNemar's test revealed a statistically significant difference between filtered and planar images (p=0.038 for patients, 0.021 for kidneys and 0.034 for number of zones). Inter-observer agreement was 0.877 for filtered images, 0.915 for planar images and 0.915 for SPECT images. It was concluded that filtered planar images of the renal cortex are comparable to SPECT images and can be used effectively in place of SPECT, when required, to shorten imaging time and eliminate motion artifacts, especially in the paediatric population. (author)
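
    McNemar's test, used above to compare paired readings, depends only on the discordant pairs: cases positive on one image type but not the other. A small sketch of the continuity-corrected statistic (the counts in the test below are hypothetical, not the paper's data):

```python
def mcnemar_statistic(b, c):
    """McNemar chi-square statistic with continuity correction, computed
    from the two discordant counts of a paired 2x2 table:
    b = positive on filtered only, c = positive on planar only.
    Values above 3.84 correspond to p < 0.05 (chi-square, 1 d.f.)."""
    if b + c == 0:
        return 0.0
    return (abs(b - c) - 1) ** 2 / (b + c)
```

The concordant counts (positive on both, negative on both) cancel out of the test entirely, which is why only disagreements matter.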

  9. An Improved Filtering Method for Quantum Color Image in Frequency Domain

    Science.gov (United States)

    Li, Panchi; Xiao, Hong

    2018-01-01

    In this paper we investigate the use of quantum Fourier transform (QFT) in the field of image processing. We consider QFT-based color image filtering operations and their applications in image smoothing, sharpening, and selective filtering using quantum frequency domain filters. The underlying principle used for constructing the proposed quantum filters is to use the principle of the quantum Oracle to implement the filter function. Compared with the existing methods, our method is not only suitable for color images, but also can flexibly design the notch filters. We provide the quantum circuit that implements the filtering task and present the results of several simulation experiments on color images. The major advantages of the quantum frequency filtering lies in the exploitation of the efficient implementation of the quantum Fourier transform.

  10. Efficient OCT Image Enhancement Based on Collaborative Shock Filtering.

    Science.gov (United States)

    Liu, Guohua; Wang, Ziyu; Mu, Guoying; Li, Peijin

    2018-01-01

    Efficient enhancement of noisy optical coherence tomography (OCT) images is a key task for interpreting them correctly. In this paper, to better enhance details and layered structures of a human retina image, we propose a collaborative shock filtering for OCT image denoising and enhancement. Noisy OCT image is first denoised by a collaborative filtering method with new similarity measure, and then the denoised image is sharpened by a shock-type filtering for edge and detail enhancement. For dim OCT images, in order to improve image contrast for the detection of tiny lesions, a gamma transformation is first used to enhance the images within proper gray levels. The proposed method integrating image smoothing and sharpening simultaneously obtains better visual results in experiments.
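
    The gamma step for dim images is a one-line mapping. A sketch on 8-bit gray levels (gamma < 1 brightens; the choice of gamma and the "proper gray levels" windowing in the paper are not reproduced here):

```python
def gamma_transform(pixels, gamma, max_level=255):
    """Classic gamma mapping on 8-bit gray levels:
    out = max * (in / max) ** gamma.
    gamma < 1 brightens dim images, lifting low-contrast regions of a
    retina scan into a visible range before further filtering."""
    return [round(max_level * (p / max_level) ** gamma) for p in pixels]
```

The endpoints 0 and 255 are fixed points of the mapping, so only the mid-tones are redistributed.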

  11. Vector Directional Distance Rational Hybrid Filters for Color Image Restoration

    Directory of Open Access Journals (Sweden)

    L. Khriji

    2005-12-01

    Full Text Available A new class of nonlinear filters, called vector-directional distance rational hybrid filters (VDDRHF), for multispectral image processing is introduced and applied to color image-filtering problems. These filters are based on rational functions (RF). The VDDRHF filter is a two-stage filter, which exploits the features of the vector directional distance filter (VDDF), the center weighted vector directional distance filter (CWVDDF) and those of the rational operator. The filter output is the result of a vector rational function (VRF) operating on the output of three sub-functions. Two vector directional distance filters (VDDF) and one center weighted vector directional distance filter (CWVDDF) are proposed for use in the first stage due to their desirable properties, such as noise attenuation, chromaticity retention, and preservation of edges and details. Experimental results show that the new VDDRHF outperforms a number of widely known nonlinear filters for multispectral image processing, such as the vector median filter (VMF), the generalized vector directional filter (GVDF) and the distance directional filter (DDF), with respect to all criteria used.
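
    The vector median filter (VMF) used above as a baseline picks, within each window, the color vector minimizing the summed distance to all the other samples, so the output is always one of the observed colors (no channel-wise color artifacts). A minimal sketch:

```python
def vector_median(vectors):
    """Vector median filter output for one window: the sample whose summed
    Euclidean distance to all other samples is smallest.  Unlike a
    channel-wise median, the output is always one of the input colors."""
    def cost(v):
        return sum(sum((a - b) ** 2 for a, b in zip(v, w)) ** 0.5
                   for w in vectors)
    return min(vectors, key=cost)
```

Directional filters such as VDDF replace (or combine) this distance criterion with an angular one, which is what preserves chromaticity.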

  12. Selected annotated bibliographies for adaptive filtering of digital image data

    Science.gov (United States)

    Mayers, Margaret; Wood, Lynnette

    1988-01-01

    Digital spatial filtering is an important tool both for enhancing the information content of satellite image data and for implementing cosmetic effects which make the imagery more interpretable and appealing to the eye. Spatial filtering is a context-dependent operation that alters the gray level of a pixel by computing a weighted average formed from the gray level values of other pixels in the immediate vicinity. Traditional spatial filtering involves passing a particular filter or set of filters over an entire image. This assumes that the filter parameter values are appropriate for the entire image, which in turn is based on the assumption that the statistics of the image are constant over the image. However, the statistics of an image may vary widely over the image, requiring an adaptive or "smart" filter whose parameters change as a function of the local statistical properties of the image. Then a pixel would be averaged only with more typical members of the same population. This annotated bibliography cites some of the work done in the area of adaptive filtering. The methods usually fall into two categories: (a) those that segment the image into subregions, each assumed to have stationary statistics, and use a different filter on each subregion, and (b) those that use a two-dimensional "sliding window" to continuously estimate the filter parameters; such filters may operate in either the spatial or frequency domain, or may utilize both domains. They may be used to deal with images degraded by space-variant noise, to suppress undesirable local radiometric statistics while enforcing desirable (user-defined) statistics, to treat problems where space-variant point spread functions are involved, to segment images into regions of constant value for classification, or to "tune" images in order to remove (nonstationary) variations in illumination, noise, contrast, shadows, or haze. Since adaptive filtering, like nonadaptive filtering, is used in image processing to accomplish various goals, this bibliography covers a correspondingly broad range of applications.

  13. Multiscale infrared and visible image fusion using gradient domain guided image filtering

    Science.gov (United States)

    Zhu, Jin; Jin, Weiqi; Li, Li; Han, Zhenghao; Wang, Xia

    2018-03-01

    For better surveillance with infrared and visible imaging, a novel hybrid multiscale decomposition fusion method using gradient domain guided image filtering (HMSD-GDGF) is proposed in this study. In this method, hybrid multiscale decomposition with guided image filtering and gradient domain guided image filtering of source images are first applied before the weight maps of each scale are obtained using a saliency detection technology and filtering means with three different fusion rules at different scales. The three types of fusion rules are for small-scale detail level, large-scale detail level, and base level. Finally, the target becomes more salient and can be more easily detected in the fusion result, with the detail information of the scene being fully displayed. After analyzing the experimental comparisons with state-of-the-art fusion methods, the HMSD-GDGF method has obvious advantages in fidelity of salient information (including structural similarity, brightness, and contrast), preservation of edge features, and human visual perception. Therefore, visual effects can be improved by using the proposed HMSD-GDGF method.
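
    The guided image filter underlying this method fits a local linear model out ≈ a·I + b in each window, with a = cov(I, p) / (var(I) + eps), then averages the per-window fits. A 1-D sketch (the gradient domain variant used in the paper adds an edge-aware weighting of eps, omitted here):

```python
def guided_filter_1d(guide, src, radius, eps):
    """Minimal 1-D guided filter: in each window fit out ~= a * guide + b
    with a = cov(I, p) / (var(I) + eps), b = mean(p) - a * mean(I), then
    average the fits of all windows covering each sample.  Edge-aware
    because 'a' stays large where the guide has real structure."""
    n = len(src)
    a, b = [0.0] * n, [0.0] * n
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        gi, pi = guide[lo:hi], src[lo:hi]
        m = len(gi)
        mg, mp = sum(gi) / m, sum(pi) / m
        cov = sum(g * p for g, p in zip(gi, pi)) / m - mg * mp
        var = sum(g * g for g in gi) / m - mg * mg
        a[i] = cov / (var + eps)
        b[i] = mp - a[i] * mg
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(a[j] * guide[i] + b[j] for j in range(lo, hi)) / (hi - lo))
    return out
```

With the image as its own guide and eps = 0, the filter is an identity on structured signals; raising eps turns it into an edge-preserving smoother.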

  14. Directional Joint Bilateral Filter for Depth Images

    Directory of Open Access Journals (Sweden)

    Anh Vu Le

    2014-06-01

    Full Text Available Depth maps taken by the low cost Kinect sensor are often noisy and incomplete. Thus, post-processing for obtaining reliable depth maps is necessary for advanced image and video applications such as object recognition and multi-view rendering. In this paper, we propose adaptive directional filters that fill the holes and suppress the noise in depth maps. Specifically, novel filters whose window shapes are adaptively adjusted based on the edge direction of the color image are presented. Experimental results show that our method yields higher quality filtered depth maps than other existing methods, especially at the edge boundaries.
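
    A joint bilateral filter weights each neighbor by spatial distance and by intensity difference in a guidance image; the directional variant above additionally shapes the window along the color-edge direction. A plain 1-D bilateral sketch (self-guided, fixed window, purely illustrative):

```python
import math

def bilateral_1d(signal, radius, sigma_s, sigma_r):
    """1-D bilateral filter: each sample becomes a weighted average of its
    neighbors, with weights falling off both with spatial distance and
    with intensity difference, so edges are preserved while noise is
    smoothed.  A joint/cross bilateral filter would take the range
    weights from a second, guidance signal instead."""
    out = []
    n = len(signal)
    for i in range(n):
        num = den = 0.0
        for j in range(max(0, i - radius), min(n, i + radius + 1)):
            w = math.exp(-((i - j) ** 2) / (2 * sigma_s ** 2)
                         - ((signal[i] - signal[j]) ** 2) / (2 * sigma_r ** 2))
            num += w * signal[j]
            den += w
        out.append(num / den)
    return out
```

With a small range sigma, samples across a depth discontinuity get near-zero weight, which is exactly why the edge survives the smoothing.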

  15. Implementation of non-linear filters for iterative penalized maximum likelihood image reconstruction

    International Nuclear Information System (INIS)

    Liang, Z.; Gilland, D.; Jaszczak, R.; Coleman, R.

    1990-01-01

    In this paper, the authors report on the implementation of six edge-preserving, noise-smoothing, non-linear filters applied in image space for iterative penalized maximum-likelihood (ML) SPECT image reconstruction. The non-linear smoothing filters implemented were the median filter, the E6 filter, the sigma filter, the edge-line filter, the gradient-inverse filter, and the 3-point edge filter with gradient-inverse weight. A 3 x 3 window was used for all these filters. The best image obtained, judged by viewing profiles through the image in terms of noise-smoothing, edge-sharpening, and contrast, was the one smoothed with the 3-point edge filter. The computation time for the smoothing was less than 1% of one iteration, and the memory space for the smoothing was negligible. These images were compared with the results obtained using Bayesian analysis.
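
    Of the six filters, the median is the simplest to sketch: replace each pixel with the middle of the sorted 3 x 3 neighborhood, which rejects impulses without blurring edges. Illustrative only (replicated borders assumed):

```python
def median3x3(image):
    """3x3 median smoothing with replicated borders: the classic
    edge-preserving, impulse-rejecting non-linear filter."""
    h, w = len(image), len(image[0])
    def px(r, c):
        # clamp indices to replicate the border pixels
        return image[min(max(r, 0), h - 1)][min(max(c, 0), w - 1)]
    out = [[0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            window = sorted(px(r + dr, c + dc)
                            for dr in (-1, 0, 1) for dc in (-1, 0, 1))
            out[r][c] = window[4]          # middle of 9 sorted values
    return out
```

A single hot pixel never survives, since it can occupy at most one of the nine sorted slots in any window.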

  16. Multi-dimensional medical images compressed and filtered with wavelets

    International Nuclear Information System (INIS)

    Boyen, H.; Reeth, F. van; Flerackers, E.

    2002-01-01

    Full text: Using the standard wavelet decomposition methods, multi-dimensional medical images can be compressed and filtered by repeating the wavelet-algorithm on 1D-signals in an extra loop per extra dimension. In the non-standard decomposition for multi-dimensional images, the areas that must be zero-filled in case of band- or notch-filters are more complex than geometric areas such as rectangles or cubes. Adding an additional dimension in this algorithm up to 4D (e.g. a 3D beating heart) increases the geometric complexity of those areas even more. The aim of our study was to calculate the boundaries of the formed complex geometric areas, so we can use the faster non-standard decomposition to compress and filter multi-dimensional medical images. Because many 3D medical images taken by PET- or SPECT-cameras have only a few layers in the Z-dimension, and compressing images in a dimension with few voxels is usually not worthwhile, we provide a solution in which one can choose which dimensions will be compressed or filtered. With the proposal of non-standard decomposition on Daubechies' wavelets D2 to D20 by Steven Gollmer in 1992, 1D data can be compressed and filtered. Each additional level works only on the smoothed data, so the transformation time halves per extra level. Zero-filling a well-defined area after the wavelet transform and then performing the inverse transform accomplishes the filtering. To be able to compress and filter up to 4D images with the faster non-standard wavelet decomposition method, we have investigated a new method for calculating the boundaries of the areas which must be zero-filled in case of filtering. This is especially true for band- and notch filtering. Contrary to the standard decomposition method, the areas are no longer rectangles in 2D or cubes in 3D or a row of cubes in 4D: they are rectangles expanded with a half-sized rectangle in the other direction for 2D, cubes expanded with half cubes in one and quarter cubes in the

  17. A robust nonlinear filter for image restoration.

    Science.gov (United States)

    Koivunen, V

    1995-01-01

    A class of nonlinear regression filters based on robust estimation theory is introduced. The goal of the filtering is to recover a high-quality image from degraded observations. Models for desired image structures and contaminating processes are employed, but deviations from strict assumptions are allowed since the assumptions on signal and noise are typically only approximately true. The robustness of filters is usually addressed only in a distributional sense, i.e., the actual error distribution deviates from the nominal one. In this paper, the robustness is considered in a broad sense since the outliers may also be due to inappropriate signal model, or there may be more than one statistical population present in the processing window, causing biased estimates. Two filtering algorithms minimizing a least trimmed squares criterion are provided. The design of the filters is simple since no scale parameters or context-dependent threshold values are required. Experimental results using both real and simulated data are presented. The filters effectively attenuate both impulsive and nonimpulsive noise while recovering the signal structure and preserving interesting details.
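
    The least trimmed squares criterion scores a candidate estimate by the sum of only the h smallest squared residuals, so impulses and samples from a second population are simply ignored. A location-estimate sketch (searching candidates over the window values, a simplification of the paper's regression filters):

```python
def lts_location(window, trim_fraction=0.5):
    """Least-trimmed-squares location estimate for one window: pick the
    value minimizing the sum of the h smallest squared residuals, so
    outliers (impulses, samples from another region) never enter the fit.
    Candidate estimates are restricted to the sample values for brevity."""
    n = len(window)
    h = max(1, int(n * trim_fraction))
    def trimmed_cost(center):
        residuals = sorted((x - center) ** 2 for x in window)
        return sum(residuals[:h])
    return min(window, key=trimmed_cost)
```

Compare with a plain least-squares mean, which the single outlier 200 in the test below would drag far from the majority of the data.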

  18. Spatial filters for focusing ultrasound images

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Gori, Paola

    2001-01-01

    A new method for making spatial matched filter focusing of RF ultrasound data is proposed, based on the spatial impulse response description of the imaging. The response from a scatterer at any given point in space relative to the transducer can be calculated, and this gives the spatial matched filter for synthetic aperture imaging for single element transducers. It is evaluated using the Field II program. Data from a single 3 MHz transducer focused at a distance of 80 mm is processed. Far from the transducer focal region, the processing greatly improves the lateral image resolution, and the approach always yields point spread functions better than or equal to a traditional dynamically focused image. Finally, the process was applied to in-vivo clinical images of the liver and right kidney from a 28 year old male, with data obtained using a single element transducer focused at 100 mm.

  19. Comparison of various filtering methods for digital X-ray image processing

    International Nuclear Information System (INIS)

    Pfluger, T.; Reinfelder, H.E.; Dorschky, K.; Oppelt, A.; Siemens A.G., Erlangen

    1987-01-01

    Three filtering methods used for edge enhancement of digitally processed X-ray images are explained and compared. The filters are compared on two examples, a radiograph of the chest and one of the knee joint. The unsharp mask is found to yield the best compromise between edge enhancement and image noise intensification, whereas the results obtained by the high-pass filter or the Wallis filter are less suitable for diagnostic evaluation. The filtered images display narrow lines, structural borders and edges, and finely spotted areas better than the original radiograph, so that diagnostic evaluation is easier after image filtering. (orig.)
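
    Unsharp masking adds back a scaled high-frequency residue: out = in + k·(in − blur(in)). A 1-D sketch with a [1, 2, 1]/4 blur, showing the characteristic overshoot on each side of an edge that makes borders stand out:

```python
def unsharp_mask(signal, amount=1.0):
    """Unsharp masking in 1-D: subtract a [1, 2, 1]/4 blurred copy to get
    the mask (the high-frequency residue), then add 'amount' times the
    mask back, boosting edges and fine detail.  Borders are replicated."""
    padded = [signal[0]] + list(signal) + [signal[-1]]
    blurred = [(padded[i - 1] + 2 * padded[i] + padded[i + 1]) / 4.0
               for i in range(1, len(padded) - 1)]
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]
```

The same overshoot also amplifies noise, which is exactly the compromise the comparison above is about.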

  20. Iodine filter imaging system for subtraction angiography using synchrotron radiation

    Science.gov (United States)

    Umetani, K.; Ueda, K.; Takeda, T.; Itai, Y.; Akisada, M.; Nakajima, T.

    1993-11-01

    A new type of real-time imaging system was developed for transvenous coronary angiography. A combination of an iodine filter and a single energy broad-bandwidth X-ray produces two-energy images for the iodine K-edge subtraction technique. X-ray images are sequentially converted to visible images by an X-ray image intensifier. By synchronizing the timing of the movement of the iodine filter into and out of the X-ray beam, two output images of the image intensifier are focused side by side on the photoconductive layer of a camera tube by an oscillating mirror. Both images are read out by electron beam scanning of a 1050-scanning-line video camera within a camera frame time of 66.7 ms. One hundred ninety two pairs of iodine-filtered and non-iodine-filtered images are stored in the frame memory at a rate of 15 pairs/s. In vivo subtracted images of coronary arteries in dogs were obtained in the form of motion pictures.

  1. Detail Enhancement for Infrared Images Based on Propagated Image Filter

    Directory of Open Access Journals (Sweden)

    Yishu Peng

    2016-01-01

    Full Text Available For displaying high-dynamic-range images acquired by thermal camera systems, 14-bit raw infrared data should map into 8-bit gray values. This paper presents a new method for detail enhancement of infrared images to display the image with a relatively satisfied contrast and brightness, rich detail information, and no artifacts caused by the image processing. We first adopt a propagated image filter to smooth the input image and separate the image into the base layer and the detail layer. Then, we refine the base layer by using modified histogram projection for compressing. Meanwhile, the adaptive weights derived from the layer decomposition processing are used as the strict gain control for the detail layer. The final display result is obtained by recombining the two modified layers. Experimental results on both cooled and uncooled infrared data verify that the proposed method outperforms the method based on log-power histogram modification and bilateral filter-based detail enhancement in both detail enhancement and visual effect.
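
    The base/detail pipeline above can be caricatured in a few lines: smooth to get the base layer, compress the base's range, and add the gain-weighted detail back. The smoothing and compression below are crude stand-ins (a [1, 2, 1]/4 blur and a linear range mapping) for the paper's propagated filter and modified histogram projection:

```python
def base_detail_display(signal, detail_gain):
    """Layer-decomposition display sketch: base = blur(input),
    detail = input - base; linearly compress the base into an 8-bit range
    and recombine with gain-weighted detail.  Stand-in smoothing and
    compression, not the paper's propagated filter or histogram step."""
    padded = [signal[0]] + list(signal) + [signal[-1]]
    base = [(padded[i - 1] + 2 * padded[i] + padded[i + 1]) / 4.0
            for i in range(1, len(padded) - 1)]
    detail = [s - b for s, b in zip(signal, base)]
    lo, hi = min(base), max(base)
    span = (hi - lo) or 1.0
    compressed = [255.0 * (b - lo) / span for b in base]  # map base to 8-bit range
    return [c + detail_gain * d for c, d in zip(compressed, detail)]
```

Because compression acts only on the base layer, a large 14-bit dynamic range can be squeezed without flattening the fine detail carried separately.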

  2. Imaging spectrometer using a liquid crystal tunable filter

    Science.gov (United States)

    Chrien, Thomas G.; Chovit, Christopher; Miller, Peter J.

    1993-09-01

    A demonstration imaging spectrometer using a liquid crystal tunable filter (LCTF) was built and tested on a hot air balloon platform. The LCTF is a tunable polarization interference or Lyot filter. The LCTF enables a small, light weight, low power, band sequential imaging spectrometer design. An overview of the prototype system is given along with a description of balloon experiment results. System model performance predictions are given for a future LCTF based imaging spectrometer design. System design considerations of LCTF imaging spectrometers are discussed.

  3. Swarm Intelligence for Optimizing Hybridized Smoothing Filter in Image Edge Enhancement

    Science.gov (United States)

    Rao, B. Tirumala; Dehuri, S.; Dileep, M.; Vindhya, A.

    In this modern era, image transmission and processing play a major role. It would be impossible to retrieve information from satellite and medical images without the help of image processing techniques. Edge enhancement is an image processing step that enhances the edge contrast of an image or video in an attempt to improve its acutance. Edges are representations of the discontinuities of image intensity functions. For processing these discontinuities in an image, a good edge enhancement technique is essential. The proposed work uses a new idea for edge enhancement using hybridized smoothing filters and introduces a promising technique for obtaining the best hybrid filter using swarm algorithms (Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO)) to search for an optimal sequence of filters from among a set of rather simple, representative image processing filters. This paper deals with the analysis of these swarm intelligence techniques through the combination of hybrid filters generated by the algorithms for image edge enhancement.

  4. Filters in 2D and 3D Cardiac SPECT Image Processing

    Directory of Open Access Journals (Sweden)

    Maria Lyra

    2014-01-01

    Full Text Available Nuclear cardiac imaging is a noninvasive, sensitive method providing information on cardiac structure and physiology. Single photon emission tomography (SPECT) evaluates myocardial perfusion, viability, and function and is widely used in clinical routine. The quality of the tomographic image is key for accurate diagnosis. Image filtering, a mathematical processing step, compensates for loss of detail in an image while reducing image noise; it can improve the image resolution and limit the degradation of the image. SPECT images are then reconstructed, either by the filtered back projection (FBP) analytical technique or iteratively, by algebraic methods. The aim of this study is to review filters in cardiac 2D, 3D, and 4D SPECT applications and how these affect the image quality, mirroring the diagnostic accuracy of SPECT images. Several filters, including the Hanning, Butterworth, and Parzen filters, were evaluated in combination with the two reconstruction methods as well as with a specified MatLab program. Results showed that for both 3D and 4D cardiac SPECT the Butterworth filter, for different critical frequencies and orders, produced the best results. Between the two reconstruction methods, the iterative one might be more appropriate for cardiac SPECT, since it improves lesion detectability due to the significant improvement of image contrast.

  5. Restoration of nuclear medicine images using adaptive Wiener filters

    International Nuclear Information System (INIS)

    Meinel, G.

    1989-01-01

    An adaptive Wiener filter implementation for the restoration of nuclear medicine images is described. These images are considerably disturbed both deterministically (definition) and stochastically (Poisson quantum noise). After introducing an image model, describing the necessary parameter approximations and giving information on optimum design methods, the implementation is described. The filter adapts to the local signal-to-noise ratio and is based on a filter bank concept. To quantify the restoration effect, characteristic numbers are introduced and the filter is tested against these numbers. (author)

  6. Detection of pulmonary nodules on lung X-ray images. Studies on multi-resolutional filter and energy subtraction images

    International Nuclear Information System (INIS)

    Sawada, Akira; Sato, Yoshinobu; Kido, Shoji; Tamura, Shinichi

    1999-01-01

    The purpose of this work is to prove the effectiveness of an energy subtraction image for the detection of pulmonary nodules, and the effectiveness of a multi-resolutional filter on an energy subtraction image for detecting pulmonary nodules. We also study the factors influencing the accuracy of detection of pulmonary nodules from the viewpoints of types of images, types of digital filters and types of evaluation methods. As one type of image, we select the energy subtraction image, which removes bones such as ribs from the conventional X-ray image by utilizing the difference in X-ray absorption ratios between bones and soft tissue at different energies. Ribs and vessels are major causes of CAD errors in the detection of pulmonary nodules, and many studies have tried to solve this problem. We therefore select conventional X-ray images and energy subtraction X-ray images as the types of images, and at the same time select the ∇²G (Laplacian of Gaussian) filter, the Min-DD (minimum directional difference) filter and our multi-resolutional filter as the types of digital filters. We also select two evaluation methods and prove the effectiveness of the energy subtraction image, the effectiveness of the Min-DD filter on a conventional X-ray image, and the effectiveness of the multi-resolutional filter on an energy subtraction image. (author)
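
    The ∇²G filter mentioned above samples the Laplacian-of-Gaussian, ∇²G(x, y) = ((x² + y² − 2σ²)/σ⁴)·exp(−(x² + y²)/(2σ²)), whose center-surround shape responds to nodule-like blobs at the scale set by σ. A sketch of kernel construction:

```python
import math

def log_kernel(size, sigma):
    """Sample the Laplacian-of-Gaussian on a size x size grid:
    LoG(x, y) = ((x^2 + y^2 - 2*sigma^2) / sigma^4)
                * exp(-(x^2 + y^2) / (2*sigma^2)).
    Negative center, positive surround: a blob detector at scale sigma."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            r2 = x * x + y * y
            row.append((r2 - 2.0 * sigma ** 2) / sigma ** 4
                       * math.exp(-r2 / (2.0 * sigma ** 2)))
        kernel.append(row)
    return kernel
```

Convolving with this kernel at several σ values is one way to build the multi-resolutional response the paper compares against.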

  7. Fusion of multispectral and panchromatic images using multirate filter banks

    Institute of Scientific and Technical Information of China (English)

    Wang Hong; Jing Zhongliang; Li Jianxun

    2005-01-01

    In this paper, an image fusion method based on the filter banks is proposed for merging a high-resolution panchromatic image and a low-resolution multispectral image. Firstly, the filter banks are designed to merge different signals with minimum distortion by using cosine modulation. Then, the filter banks-based image fusion is adopted to obtain a high-resolution multispectral image that combines the spectral characteristic of low-resolution data with the spatial resolution of the panchromatic image. Finally, two different experiments and corresponding performance analysis are presented. Experimental results indicate that the proposed approach outperforms the HIS transform, discrete wavelet transform and discrete wavelet frame.

  8. Efficient Hardware Implementation For Fingerprint Image Enhancement Using Anisotropic Gaussian Filter.

    Science.gov (United States)

    Khan, Tariq Mahmood; Bailey, Donald G; Khan, Mohammad A U; Kong, Yinan

    2017-05-01

    A real-time image filtering technique is proposed which could result in faster implementation of fingerprint image enhancement. One major hurdle associated with fingerprint filtering techniques is the expensive nature of their hardware implementations. To circumvent this, a modified anisotropic Gaussian filter is efficiently adopted in hardware by decomposing the filter into two orthogonal Gaussians and an oriented line Gaussian. An architecture is developed for dynamically controlling the orientation of the line Gaussian filter. To further improve the performance of the filter, the input image is homogenized by local image normalization. With the proposed structure, both the parallel compute-intensive and the real-time demands are met on a mid-range reconfigurable FPGA. We efficiently speed up the image-processing time and improve the resource utilization of the FPGA. Test results show an improved speed for the hardware architecture while maintaining reasonable enhancement benchmarks.
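
    The cost saving behind such decompositions is that a separable Gaussian can be applied as two orthogonal 1-D passes, O(k) rather than O(k²) work per pixel. A sketch of an isotropic separable blur (the paper's filter adds a third, oriented line-Gaussian pass, not shown):

```python
import math

def gaussian_kernel(sigma, radius):
    """Normalized 1-D Gaussian taps."""
    w = [math.exp(-t * t / (2.0 * sigma ** 2)) for t in range(-radius, radius + 1)]
    s = sum(w)
    return [v / s for v in w]

def conv1d(row, kernel, radius):
    """Convolve one line with replicated borders."""
    n = len(row)
    return [sum(kernel[t + radius] * row[min(max(i + t, 0), n - 1)]
                for t in range(-radius, radius + 1)) for i in range(n)]

def separable_blur(image, sigma, radius):
    """Isotropic Gaussian blur as a row pass followed by a column pass:
    O(k) work per pixel instead of O(k^2) for the full 2-D kernel."""
    k = gaussian_kernel(sigma, radius)
    rows = [conv1d(r, k, radius) for r in image]
    cols = [conv1d(list(c), k, radius) for c in zip(*rows)]
    return [list(r) for r in zip(*cols)]
```

In hardware, each 1-D pass maps naturally to a short shift-register pipeline, which is what makes the decomposition FPGA-friendly.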

  9. Detail-enhanced multimodality medical image fusion based on gradient minimization smoothing filter and shearing filter.

    Science.gov (United States)

    Liu, Xingbin; Mei, Wenbo; Du, Huiqian

    2018-02-13

    In this paper, a detail-enhanced multimodality medical image fusion algorithm is proposed using the proposed multi-scale joint decomposition framework (MJDF) and shearing filter (SF). The MJDF, constructed with a gradient minimization smoothing filter (GMSF) and a Gaussian low-pass filter (GLF), is used to decompose source images into low-pass layers, edge layers, and detail layers at multiple scales. In order to highlight the detail information in the fused image, the edge layer and the detail layer at each scale are combined by weighting into a detail-enhanced layer. As the directional filter is effective in capturing salient information, the SF is applied to the detail-enhanced layer to extract geometrical features and obtain directional coefficients. A visual saliency map-based fusion rule is designed for fusing the low-pass layers, and the sum of standard deviations is used as the activity level measurement for directional coefficient fusion. The final fusion result is obtained by synthesizing the fused low-pass layers and directional coefficients. Experimental results show that the proposed method, with shift-invariance, directional selectivity, and detail-enhancing properties, is efficient in preserving and enhancing the detail information of multimodality medical images. Graphical abstract: The detailed implementation of the proposed medical image fusion algorithm.

  10. Image enhancement by spatial frequency post-processing of images obtained with pupil filters

    Science.gov (United States)

    Estévez, Irene; Escalera, Juan C.; Stefano, Quimey Pears; Iemmi, Claudio; Ledesma, Silvia; Yzuel, María J.; Campos, Juan

    2016-12-01

    The use of apodizing or superresolving filters improves the performance of an optical system in different frequency bands. This improvement can be seen as an increase in the OTF value compared to the OTF for the clear aperture. In this paper we propose a method to enhance the contrast of an image in both its low and its high frequencies. The method is based on the generation of a synthetic Optical Transfer Function, by multiplexing the OTFs given by the use of different non-uniform transmission filters on the pupil. We propose to capture three images, one obtained with a clear pupil, one obtained with an apodizing filter that enhances the low frequencies and another one taken with a superresolving filter that improves the high frequencies. In the Fourier domain the three spectra are combined by using smoothed passband filters, and then the inverse transform is performed. We show that we can create an enhanced image better than the image obtained with the clear aperture. To evaluate the performance of the method, bar tests (sinusoidal tests) with different frequency content are used. The results show that a contrast improvement in the high and low frequencies is obtained.
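
    The spectral multiplexing step can be roughed out as follows, assuming three registered captures and logistic-smoothed radial passbands; the band edges and smoothing width are illustrative, not the paper's values:

```python
import numpy as np

def synthesize(im_clear, im_apod, im_super, f_lo=0.15, f_hi=0.35, soft=0.05):
    """Fuse three captures in the Fourier domain: the apodized image supplies
    the low frequencies, the clear-pupil image the mid frequencies, and the
    superresolved image the high frequencies."""
    fy = np.fft.fftfreq(im_clear.shape[0])[:, None]
    fx = np.fft.fftfreq(im_clear.shape[1])[None, :]
    r = np.hypot(fy, fx)
    # Smoothed passband masks; by construction lo + mid + hi = 1 everywhere.
    lo = 1 / (1 + np.exp((r - f_lo) / soft))    # low-pass
    hi = 1 / (1 + np.exp((f_hi - r) / soft))    # high-pass
    mid = 1.0 - lo - hi                         # band-pass remainder
    spec = (np.fft.fft2(im_apod) * lo
            + np.fft.fft2(im_clear) * mid
            + np.fft.fft2(im_super) * hi)
    return np.real(np.fft.ifft2(spec))
```

    Because the three masks sum to one, feeding the same image into all three inputs returns that image unchanged, which is a convenient sanity check on the band splitting.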

  11. Oriented diffusion filtering for enhancing low-quality fingerprint images

    KAUST Repository

    Gottschlich, C.; Schönlieb, C.-B.

    2012-01-01

    To enhance low-quality fingerprint images, we present a novel method that first estimates the local orientation of the fingerprint ridge and valley flow and then performs oriented diffusion filtering, followed by a locally adaptive contrast enhancement step. By applying the authors' new approach to low-quality images of the FVC2004 fingerprint databases, the authors are able to show its competitiveness with other state-of-the-art fingerprint enhancement methods such as curved Gabor filtering. A major advantage of oriented diffusion filtering over these is its computational efficiency. Combining oriented diffusion filtering with curved Gabor filters led to additional improvements and, to the best of the authors' knowledge, the lowest equal error rates achieved so far using MINDTCT and BOZORTH3 on the FVC2004 databases. The recognition performance and the computational efficiency of the method suggest including oriented diffusion filtering as a standard image enhancement add-on module for real-time fingerprint recognition systems. In order to facilitate the reproduction of these results, an implementation of the oriented diffusion filtering for Matlab and GNU Octave is made available for download. © 2012 The Institution of Engineering and Technology.
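
    The first stage of the method, estimating the local orientation of the ridge and valley flow, is commonly done with a smoothed structure tensor; the sketch below illustrates that idea and is not the authors' implementation (the smoothing scale is an assumption):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def ridge_orientation(img, sigma=3.0):
    """Local ridge orientation from the smoothed structure tensor."""
    gx = sobel(img.astype(float), axis=1)
    gy = sobel(img.astype(float), axis=0)
    # Tensor components, averaged over a neighbourhood.
    jxx = gaussian_filter(gx * gx, sigma)
    jyy = gaussian_filter(gy * gy, sigma)
    jxy = gaussian_filter(gx * gy, sigma)
    # Dominant gradient angle; ridges run perpendicular to the gradient.
    return 0.5 * np.arctan2(2 * jxy, jxx - jyy) + np.pi / 2
```

    The diffusion step would then smooth along the returned orientation field while leaving the perpendicular (ridge-crossing) direction untouched.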

  13. Computer processing of the scintigraphic image using digital filtering techniques

    International Nuclear Information System (INIS)

    Matsuo, Michimasa

    1976-01-01

    The theory of digital filtering was studied as a method for the computer processing of scintigraphic images. The characteristics and design techniques of finite impulse response (FIR) digital filters with linear phase were examined using the z-transform. The conventional data-processing method, smoothing, can be recognized as one kind of linear-phase FIR low-pass digital filtering. Ten representative FIR low-pass digital filters with various cut-off frequencies were examined in the frequency domain in one and two dimensions. These filters were applied to phantom studies with cold targets, using an on-line scinticamera-minicomputer system. These studies revealed that the resultant images had a direct connection with the magnitude response of the filter; that is, they could be estimated fairly well from the frequency response of the digital filter used. The filter estimated from the phantom studies as optimal for liver scintigrams using 198Au colloid was successfully applied in clinical use for detecting true cold lesions and, at the same time, for eliminating spurious images. (J.P.N.)

  14. Human visual modeling and image deconvolution by linear filtering

    International Nuclear Information System (INIS)

    Larminat, P. de; Barba, D.; Gerber, R.; Ronsin, J.

    1978-01-01

    The problem considered is the numerical restoration of images degraded by passage through a known, spatially invariant linear system and by the addition of stationary noise. We propose an improvement of the Wiener filter to allow the restoration of such images. This improvement reduces the major drawbacks of the classical Wiener filter: the voluminous data processing, and the failure to take account of the characteristics of human vision that condition the observer's perception of the restored image. In the first section, we describe the structure of the visual detection system and a method for modelling it. In the second section we describe a restoration method based on Wiener filtering that takes the visual properties into account and that can be adapted to the local properties of the image. The results obtained on TV images and scintigrams (images obtained with a gamma camera) are then discussed. [fr]
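
    The classical Wiener filter that this work improves upon can be written in the frequency domain as W = H* / (|H|^2 + NSR). A minimal sketch, assuming a known PSF and a constant noise-to-signal power ratio (the paper's contribution, visual modelling and local adaptation, is not shown):

```python
import numpy as np

def wiener_deconvolve(blurred, psf, nsr=0.01):
    """Frequency-domain Wiener deconvolution with a constant NSR term."""
    H = np.fft.fft2(psf, s=blurred.shape)     # transfer function of the blur
    G = np.fft.fft2(blurred)                  # spectrum of the observed image
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)   # Wiener filter
    return np.real(np.fft.ifft2(W * G))
```

    Setting nsr to zero recovers the unstable inverse filter; a larger nsr trades resolution for noise suppression, which is exactly the balance the visual model is meant to tune.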

  15. Neutron Imaging of Diesel Particulate Filters

    International Nuclear Information System (INIS)

    Strzelec, Andrea; Bilheux, Hassina Z.; Finney, Charles E.A.; Daw, C. Stuart; Foster, Dave; Rutland, Christopher J.; Schillinger, Burkhard; Schulz, Michael

    2009-01-01

    This article presents nondestructive neutron computed tomography (nCT) measurements of Diesel Particulate Filters (DPFs) as a method to measure ash and soot loading in the filters. Uncatalyzed and unwashcoated 200 cpsi cordierite DPFs exposed to 100% biodiesel (B100) exhaust and conventional ultra-low-sulfur 2007 certification diesel (ULSD) exhaust at one speed-load point (1500 rpm, 2.6 bar BMEP) are compared to a brand new (never exposed) filter. Precise structural information about the substrate, as well as an attempt to quantify soot and ash loading in the channels of the DPF, illustrates the potential strength of the neutron imaging technique.

  16. A Tentative Application Of Morphological Filters To Time-Varying Images

    Science.gov (United States)

    Billard, D.; Poquillon, B.

    1989-03-01

    In this paper, morphological filters, which are commonly used to process either 2D or multidimensional static images, are generalized to the analysis of time-varying image sequences. The introduction of the time dimension then yields interesting properties when designing such spatio-temporal morphological filters. In particular, the specification of spatio-temporal structuring elements (equivalent to time-varying spatial structuring elements) can be adjusted according to the temporal variations of the image sequences to be processed: this makes it possible to derive specific morphological transforms that perform noise filtering or moving-object discrimination on dynamic images viewed by a non-stationary sensor. First, a brief introduction to the basic principles underlying morphological filters is given. Then, a straightforward generalization of these principles to time-varying images is proposed. This leads us to define spatio-temporal opening and closing and to introduce some of their possible applications to the processing of dynamic images. Finally, preliminary results obtained using a natural forward-looking infrared (FLIR) image sequence are presented.
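
    Spatio-temporal opening and closing amount to grey-scale morphology applied to the (time, row, column) stack with a 3-D structuring element. A minimal sketch, with the structuring-element size an illustrative assumption:

```python
import numpy as np
from scipy.ndimage import grey_opening, grey_closing

def st_open_close(seq, size=(3, 3, 3)):
    """Spatio-temporal opening then closing on a (time, row, col) stack.
    Removes small bright and dark transients that persist for fewer frames
    than the temporal extent of the structuring element."""
    return grey_closing(grey_opening(seq, size=size), size=size)
```

    A single-frame, single-pixel flash is erased by the temporal extent of the element, while structures present across several frames survive, which is the moving-object discrimination property described above.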

  17. Development of an adaptive bilateral filter for evaluating color image difference

    Science.gov (United States)

    Wang, Zhaohui; Hardeberg, Jon Yngve

    2012-04-01

    Spatial filtering, which aims to mimic the contrast sensitivity function (CSF) of the human visual system (HVS), has previously been combined with color difference formulae for measuring color image reproduction errors. These spatial filters attenuate imperceptible information in images, unfortunately including high-frequency edges, which are believed to be crucial in the process of scene analysis by the HVS. The adaptive bilateral filter represents a novel approach that avoids the undesirable loss of edge information introduced by CSF-based filtering. The bilateral filter employs two Gaussian smoothing filters in different domains, i.e., the spatial domain and the intensity domain. We propose a method to determine the filter parameters, which are designed to adapt to the viewing conditions and to the quantity and homogeneity of information contained in an image. Experiments and discussions are given to support the proposal. A series of perceptual experiments was conducted to evaluate the performance of our approach. The experimental sample images were reproduced with variations in six image attributes: lightness, chroma, hue, compression, noise, and sharpness/blurriness. Pearson correlation values between the model-predicted image difference and the observed difference were employed to evaluate the performance and compare it with that of spatial CIELAB and an image appearance model.
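
    The core of the bilateral filter, the product of a spatial-domain and an intensity-domain Gaussian, can be sketched as below. This is a brute-force NumPy version with fixed illustrative parameters, not the adaptive parameter selection proposed in the paper:

```python
import numpy as np

def bilateral(img, sigma_s=2.0, sigma_r=0.1, radius=4):
    """Brute-force bilateral filter: a spatial Gaussian times an intensity
    ("range") Gaussian, so smoothing stops at strong edges."""
    img = img.astype(float)
    out = np.zeros_like(img)
    norm = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            w_s = np.exp(-(dy * dy + dx * dx) / (2 * sigma_s ** 2))
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            # Range weight: penalize neighbours with different intensity.
            w = w_s * np.exp(-((shifted - img) ** 2) / (2 * sigma_r ** 2))
            out += w * shifted
            norm += w
    return out / norm
```

    The adaptive version would set sigma_s and sigma_r per image from the viewing conditions and local information content, which is the contribution of the paper.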

  18. Adaptive multiresolution Hermite-Binomial filters for image edge and texture analysis

    NARCIS (Netherlands)

    Gu, Y.H.; Katsaggelos, A.K.

    1994-01-01

    A new multiresolution image analysis approach using adaptive Hermite-Binomial filters is presented in this paper. According to the local image structural and textural properties, the analysis filter kernels are made adaptive both in their scales and orders. Applications of such an adaptive filtering

  19. Complex noise suppression using a sparse representation and 3D filtering of images

    Science.gov (United States)

    Kravchenko, V. F.; Ponomaryov, V. I.; Pustovoit, V. I.; Palacios-Enriquez, A.

    2017-08-01

    A novel method for the filtering of images corrupted by complex noise composed of randomly distributed impulses and additive Gaussian noise has been substantiated for the first time. The method consists of three main stages: the detection and filtering of pixels corrupted by impulsive noise; the subsequent image processing to suppress the additive noise, based on 3D filtering and a sparse representation of signals in a wavelet basis; and a concluding image-processing procedure to clean the final image of the errors that emerged at the previous stages. A physical interpretation of the filtering method under complex noise conditions is given. A filtering block diagram has been developed in accordance with the novel approach. Simulations of the novel image filtering method have shown the advantage of the proposed filtering scheme in terms of generally recognized criteria, such as the structural similarity index measure and the peak signal-to-noise ratio, and in visual comparison of the filtered images.
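
    The staged structure of such mixed-noise filtering can be illustrated as below. This sketch substitutes a median-based impulse detector and a Gaussian smoother for the paper's detector and 3D/wavelet stage, and the threshold value is an assumption:

```python
import numpy as np
from scipy.ndimage import median_filter, gaussian_filter

def remove_mixed_noise(img, imp_thresh=0.3, sigma=1.0):
    """Two-stage sketch: flag impulses as large deviations from a local
    median and replace only those pixels, then apply a mild smoother
    for the residual additive noise."""
    med = median_filter(img, size=3)
    impulses = np.abs(img - med) > imp_thresh
    cleaned = np.where(impulses, med, img)      # touch only flagged pixels
    return gaussian_filter(cleaned, sigma), impulses
```

    Separating the two stages matters: running the additive-noise smoother first would smear impulses into their neighbourhoods before they can be detected.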

  20. AER image filtering

    Science.gov (United States)

    Gómez-Rodríguez, F.; Linares-Barranco, A.; Paz, R.; Miró-Amarante, L.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity among a huge number of neurons located on different chips [1]. By exploiting high-speed digital communication circuits (with nanosecond timing), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Neurons generate "events" according to their activity levels: more active neurons generate more events per unit time and access the interchip communication channel more frequently than neurons with low activity. In neuromorphic system development, AER brings several advantages for real-time image-processing systems: (1) AER represents the information as a time-continuous stream, not as frames; (2) AER sends the most important information first (although this depends on the sender); (3) AER allows information to be processed as soon as it is received. When AER is used in the artificial vision field, each pixel is treated as a neuron, so a pixel's intensity is represented as a sequence of events; by modifying the number and frequency of these events, it is possible to perform image filtering. In this paper we present four image filters using AER: (a) noise addition and suppression, (b) brightness modification, (c) single moving object tracking and (d) geometrical transformations (rotation, translation, reduction and magnification). For testing and debugging, we use the USB-AER board developed by the Robotic and Technology of Computers Applied to Rehabilitation (RTCAR) research group. This board is based on an FPGA devoted to managing the AER functionality, and also includes a microcontroller for USB communication, 2 Mbytes of RAM and 2 AER ports (one for input and one for output).

  1. 3D spectrum imaging of multi-wall carbon nanotube coupled π-surface modes utilising electron energy-loss spectra acquired using a STEM/Enfina system

    International Nuclear Information System (INIS)

    Seepujak, A.; Bangert, U.; Gutierrez-Sosa, A.; Harvey, A.J.; Blank, V.D.; Kulnitskiy, B.A.; Batov, D.V.

    2005-01-01

    Numerous studies have utilised electron energy-loss (EEL) spectra acquired in the plasmon (2-10 eV) regime in order to probe delocalised π-electronic states of multi-wall carbon nanotubes (MWCNTs) (Seepujak et al., Interpretation of electron energy-loss (EEL) spectra of MWCNTs in the 2-10 eV regime, Carbon, accepted for publication; Blank et al., J. Appl. Phys. 91 (2002) 1657). In the present contribution, EEL spectra were acquired from a 2D raster defined on a bottle-shaped MWCNT, using a Gatan UHV Enfina system attached to a dedicated scanning transmission electron microscope (STEM). The technique utilised to isolate and sequentially filter each of the volume and surface resonances is described in detail. Utilising a scale for the intensity of a filtered mode enables one to 'see' the distribution of each resonance in the raster. This enables striking 3D resonance-filtered spectrum images (SIs) of π-collective modes to be observed. Red-shift of the lower-energy split π-surface resonance provides explicit evidence of the π-surface mode coupling predicted for thin graphitic films (Lucas et al., Phys. Rev. B 49 (1994) 2888). Resonance-filtered SIs are also compared to non-filtered SIs with suppressed surface contributions, acquired utilising a displaced collector aperture. The present filtering technique is seen to isolate surface contributions more effectively, and without the significant loss of statistics associated with the displaced-collector-aperture mode. Isolation of collective modes utilising 3D resonance-filtered spectrum imaging demonstrates a valuable method for 'pinpointing' the location of discrete modes in irregularly shaped nanostructures.

  2. A filtering approach to edge preserving MAP estimation of images.

    Science.gov (United States)

    Humphrey, David; Taubman, David

    2011-05-01

    The authors present a computationally efficient technique for maximum a posteriori (MAP) estimation of images in the presence of both blur and noise. The image is divided into statistically independent regions, each modelled with a wide-sense stationary (WSS) Gaussian prior. Classical Wiener filter theory is used to generate a set of convex sets in the solution space, with the solution to the MAP estimation problem lying at the intersection of these sets. The proposed algorithm uses an underlying segmentation of the image, and a means of determining and refining that segmentation is described. The algorithm is suitable for a range of image restoration problems, as it provides a computationally efficient means to deal with the shortcomings of Wiener filtering without sacrificing the computational simplicity of the filtering approach. The algorithm is also of interest from a theoretical viewpoint, as it provides a continuum of solutions between Wiener filtering and inverse filtering depending upon the segmentation used. We do not attempt to show here that the proposed method is the best general approach to the image reconstruction problem. However, related work referenced herein shows excellent performance in the specific problem of demosaicing.

  3. Filter and slice thickness selection in SPECT image reconstruction

    International Nuclear Information System (INIS)

    Ivanovic, M.; Weber, D.A.; Wilson, G.A.; O'Mara, R.E.

    1985-01-01

    The choice of filter and slice thickness in SPECT image reconstruction as a function of activity and of linear and angular sampling was investigated in phantom and patient imaging studies. The reconstructed transverse and longitudinal spatial resolution of the system was measured using a line source in a water-filled phantom. Phantom studies included measurements of the Data Spectrum phantom; clinical studies included tomographic procedures in 40 patients undergoing imaging of the temporomandibular joint. Slices of the phantom and patient images were evaluated for spatial resolution, noise, and image quality. Major findings include: spatial resolution and image quality improve with increasing linear sampling frequency over the range of 4-8 mm/p in the phantom images; the best spatial resolution and image quality in clinical images were observed at a linear sampling frequency of 6 mm/p; the Shepp and Logan filter gives the best spatial resolution for phantom studies at the lowest linear sampling frequency; the smoothed Shepp and Logan filter provides the best-quality images without loss of resolution at higher frequencies; and spatial resolution and image quality improve with increased angular sampling frequency in the phantom at 40 c/p but appear to be independent of angular sampling frequency at 400 c/p.
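
    The Shepp and Logan filter referred to above is the reconstruction ramp |f| tapered by a sinc window, which is why it trades a little resolution for noise suppression relative to the plain ramp. A minimal frequency-domain sketch of filtering one projection:

```python
import numpy as np

def shepp_logan_filter(n):
    """Frequency response of the Shepp-Logan reconstruction filter:
    the ramp |f| multiplied by a sinc window that tames high frequencies."""
    f = np.fft.fftfreq(n)
    return np.abs(f) * np.sinc(f)   # np.sinc(f) = sin(pi f) / (pi f)

def filter_projection(proj):
    """Apply the filter to a single 1-D projection before backprojection."""
    return np.real(np.fft.ifft(np.fft.fft(proj) * shepp_logan_filter(len(proj))))
```

    Further smoothing of this window gives the "smoothed Shepp and Logan" variant favoured in the clinical results.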

  4. Implementational Aspects of the Contourlet Filter Bank and Application in Image Coding

    Directory of Open Access Journals (Sweden)

    Truong T. Nguyen

    2009-02-01

    This paper analyzed the implementational aspects of the contourlet filter bank (or pyramidal directional filter bank, PDFB) and considered its application in image coding. First, details of the binary tree-structured directional filter bank (DFB) are presented, including a modification to minimize the phase delay factor and the necessary steps for handling rectangular images. The PDFB is viewed as an overcomplete filter bank, and the directional filters are expressed in terms of polyphase components of the pyramidal filter bank and the conventional DFB. The aliasing effect of the conventional DFB and the Laplacian pyramid on the directional filters is then considered, and conditions for reducing this effect are presented. The new filters obtained by redesigning the PDFBs to satisfy these requirements have much better frequency responses. A hybrid multiscale filter bank consisting of the PDFB at higher scales and the traditional maximally decimated wavelet filter bank at lower scales is constructed to provide a sparse image representation. A novel embedded image coding system based on the image decomposition and a morphological dilation algorithm is then presented. The coding algorithm efficiently clusters the significant coefficients using progressive morphological operations. Context models for arithmetic coding are designed to exploit the intraband dependency and the correlation existing among neighboring directional subbands. Experimental results show that the proposed coding algorithm outperforms current state-of-the-art wavelet-based coders, such as JPEG2000, for images with directional features.

  5. Superharmonic imaging with chirp coded excitation: filtering spectrally overlapped harmonics.

    Science.gov (United States)

    Harput, Sevan; McLaughlan, James; Cowell, David M J; Freear, Steven

    2014-11-01

    Superharmonic imaging improves spatial resolution by using the higher-order harmonics generated in tissue. The superharmonic component is formed by combining the third, fourth, and fifth harmonics, which have low energy content and therefore poor SNR. This study uses coded excitation to increase the excitation energy. The SNR improvement is achieved on the receiver side by performing pulse compression with harmonic matched filters. The use of coded signals also introduces new filtering capabilities that are not possible with pulsed excitation. This is especially important when using wideband signals. For narrowband signals, the spectral boundaries of the harmonics are clearly separated and thus easy to filter; however, the available imaging bandwidth is underused. Wideband excitation is preferable for harmonic imaging applications to preserve axial resolution, but it generates spectrally overlapping harmonics that cannot be filtered in either the time or the frequency domain. After pulse compression, this overlap increases the range side lobes, which appear as imaging artifacts and reduce the B-mode image quality. In this study, the isolation of higher-order harmonics was achieved in another domain by using the fan chirp transform (FChT). To show the effect of excitation bandwidth in superharmonic imaging, measurements were performed using linear frequency-modulated chirp excitation with bandwidths varying from 10% to 50%. Superharmonic imaging was performed on a wire phantom using a wideband chirp excitation. Results were presented with and without applying the FChT filtering technique by comparing the spatial resolution and side lobe levels. Wideband excitation signals achieved better resolution, as expected; however, range side lobes as high as -23 dB were observed for the superharmonic component of chirp excitation with 50% fractional bandwidth. The proposed filtering technique achieved >50 dB range side lobe suppression and improved the image quality without
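
    The chirp excitation and matched-filter pulse compression underlying the method can be sketched as follows. The sampling rate and sweep parameters are illustrative, and no harmonic generation or FChT filtering is modelled:

```python
import numpy as np

def lfm_chirp(f0, f1, dur, fs):
    """Linear frequency-modulated chirp sweeping f0 -> f1 over dur seconds."""
    t = np.arange(int(dur * fs)) / fs
    k = (f1 - f0) / dur                      # sweep rate (Hz/s)
    return np.cos(2 * np.pi * (f0 * t + 0.5 * k * t ** 2))

def pulse_compress(rx, tx):
    """Matched-filter pulse compression: correlate the received trace
    with the transmitted chirp (convolution with its time reverse)."""
    return np.convolve(rx, tx[::-1], mode="same")
```

    Compression concentrates the long transmitted pulse into a narrow peak at the echo position; the range side lobes around that peak are exactly the artifacts the FChT-domain filtering is designed to suppress.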

  6. An adaptive Kalman filter for speckle reductions in ultrasound images

    International Nuclear Information System (INIS)

    Castellini, G.; Labate, D.; Masotti, L.; Mannini, E.; Rocchi, S.

    1988-01-01

    Speckle is the term used to describe the granular appearance found in ultrasound images. The presence of speckle reduces the diagnostic potential of the echographic technique because it tends to mask small inhomogeneities of the investigated tissue. We developed a new method of speckle reduction that utilizes an adaptive one-dimensional Kalman filter, based on the assumption that the observed image can be considered as a superimposition of speckle on a 'true image'. The filter adaptivity, necessary to avoid loss of resolution, has been obtained through statistical considerations on the local signal variations. The results of applying this Kalman filter to both A-mode and B-mode images show a significant speckle reduction.
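
    The scalar predict/update recursion of such a filter, with the measurement variance adapted to local signal variation, can be sketched as below. The random-walk state model and all noise levels are illustrative assumptions, not the authors' values:

```python
import numpy as np

def kalman_1d(scanline, q=1e-4, r_base=1e-2):
    """Adaptive 1-D Kalman smoother for one A-line: the state is the
    underlying 'true' echo amplitude and speckle is treated as measurement
    noise whose variance shrinks where the signal changes strongly, so real
    structure is tracked rather than smoothed away."""
    x, p = scanline[0], 1.0
    out = np.empty(len(scanline))
    out[0] = x
    for k in range(1, len(scanline)):
        # Adapt R: large local variation -> trust the measurement more.
        local = abs(scanline[k] - scanline[k - 1])
        r = r_base / (1.0 + 10.0 * local)
        p = p + q                        # predict (random-walk state model)
        g = p / (p + r)                  # Kalman gain
        x = x + g * (scanline[k] - x)    # update
        p = (1.0 - g) * p
        out[k] = x
    return out
```

    Applied line by line across the frame, this reduces the speckle variance in homogeneous regions while the adaptive gain limits blurring at genuine boundaries.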

  7. M2 FILTER FOR SPECKLE NOISE SUPPRESSION IN BREAST ULTRASOUND IMAGES

    Directory of Open Access Journals (Sweden)

    E.S. Samundeeswari

    2016-11-01

    Breast cancer, commonly found in women, is a serious life-threatening disease due to its invasive nature. The ultrasound (US) imaging method plays an effective role in screening, early detection, and diagnosis of breast cancer. Speckle noise generally affects medical ultrasound images and causes a number of difficulties in identifying the region of interest. Suppressing speckle noise is a challenging task as it destroys fine edge details. No specific filter has yet been designed to obtain a noise-free breast ultrasound (BUS) image contaminated by speckle noise. In this paper the M2 filter, a novel hybrid of linear and nonlinear filters, is proposed and compared to other spatial filters with a 3×3 kernel size. The performance of the proposed M2 filter is measured by statistical parameters such as MSE, PSNR and SSI. The experimental analysis clearly shows that the proposed M2 filter outperforms other spatial filters, achieving about 2% higher PSNR values with regard to speckle suppression.

  8. Fuzzy Logic-Based Filter for Removing Additive and Impulsive Noise from Color Images

    Science.gov (United States)

    Zhu, Yuhong; Li, Hongyang; Jiang, Huageng

    2017-12-01

    This paper presents an efficient filter method based on fuzzy logic for adaptively removing additive and impulsive noise from color images. The proposed filter comprises two parts: noise detection and noise-removal filtering. In the detection part, the fuzzy peer group concept is applied to determine what type of noise affects each pixel of the corrupted image. In the filtering part, impulse noise is removed by the vector median filter in the CIELAB color space and an optimal fuzzy filter is introduced to reduce the Gaussian noise; together, the two can remove mixed Gaussian-impulse noise from color images. Experimental results on several color images prove the efficacy of the proposed fuzzy filter.

  9. General filtering method for electronic speckle pattern interferometry fringe images with various densities based on variational image decomposition.

    Science.gov (United States)

    Li, Biyuan; Tang, Chen; Gao, Guannan; Chen, Mingming; Tang, Shuwei; Lei, Zhenkun

    2017-06-01

    Filtering off speckle noise from a fringe image is one of the key tasks in electronic speckle pattern interferometry (ESPI). In general, ESPI fringe images can be divided into three categories: low-density fringe images, high-density fringe images, and variable-density fringe images. In this paper, we first present a general filtering method based on variational image decomposition that can filter speckle noise from ESPI fringe images of all three densities. In our method, a variable-density ESPI fringe image is decomposed into low-density fringes, high-density fringes, and noise; a low-density fringe image into low-density fringes and noise; and a high-density fringe image into high-density fringes and noise. We give suitable function spaces to describe low-density fringes, high-density fringes, and noise, respectively. We then construct several models and numerical algorithms for ESPI fringe images with various densities, and investigate the performance of these models via extensive experiments. Finally, we compare our proposed models with the windowed Fourier transform method and the coherence-enhancing diffusion partial differential equation filter, which may be the most effective filtering methods at present. Furthermore, we use the proposed method to filter a collection of experimentally obtained ESPI fringe images of poor quality. The experimental results demonstrate the performance of our proposed method.

  10. The singular value filter: a general filter design strategy for PCA-based signal separation in medical ultrasound imaging.

    Science.gov (United States)

    Mauldin, F William; Lin, Dan; Hossack, John A

    2011-11-01

    A general filtering method, called the singular value filter (SVF), is presented as a framework for principal component analysis (PCA) based filter design in medical ultrasound imaging. The SVF approach operates by projecting the original data onto a new set of bases determined from PCA using singular value decomposition (SVD). The shape of the SVF weighting function, which relates the singular value spectrum of the input data to the filtering coefficients assigned to each basis function, is designed in accordance with a signal model and statistical assumptions regarding the underlying source signals. In this paper, we applied SVF to the specific application of clutter artifact rejection in diagnostic ultrasound imaging. SVF was compared to a conventional PCA-based filtering technique, which we refer to as the blind source separation (BSS) method, as well as to a simple frequency-based finite impulse response (FIR) filter used as a baseline for comparison. The performance of each filter was quantified in simulated lesion images as well as in experimental cardiac ultrasound data. SVF was demonstrated, in both simulation and experimental results over a wide range of imaging conditions, to outperform the BSS and FIR filtering methods in terms of contrast-to-noise ratio (CNR) and motion-tracking performance. In experimental mouse heart data, SVF provided excellent artifact suppression, with an average CNR improvement of 1.8 dB and an over 40% reduction in displacement tracking error. It was further demonstrated from simulation and experimental results that SVF provided superior clutter rejection, as reflected in larger CNR values, when filtering was performed using complex pulse-echo received data and non-binary filter coefficients.
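
    The common core of SVF and BSS filtering, projection onto PCA bases via the SVD followed by re-weighting of the singular values, can be sketched in a few lines; the choice of weighting function, which is the paper's contribution, is left as an input:

```python
import numpy as np

def svd_filter(data, weights):
    """Project the data onto its principal components and re-weight each
    singular value before reconstruction. Clutter rejection corresponds to
    down-weighting the first (largest, slowly varying) components."""
    u, s, vt = np.linalg.svd(data, full_matrices=False)
    return u @ np.diag(s * weights) @ vt
```

    A binary weight vector reproduces the hard basis selection of the BSS method, while a smooth weighting function over the singular value spectrum gives the SVF behaviour.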

  11. Slice image pretreatment for cone-beam computed tomography based on adaptive filter

    International Nuclear Information System (INIS)

    Huang Kuidong; Zhang Dinghua; Jin Yanfang

    2009-01-01

    According to the noise properties and the characteristics of serial slice images in a Cone-Beam Computed Tomography (CBCT) system, a slice-image pretreatment for CBCT based on adaptive filtering is proposed. A judging criterion for the noise type is established first, and all pixels are classified into two classes: an adaptive center-weighted modified trimmed mean (ACWMTM) filter is used for pixels corrupted by Gaussian noise, and an adaptive median (AM) filter is used for pixels corrupted by impulse noise. In the ACWMTM filtering algorithm, the Gaussian noise standard deviation estimated in the current slice image with an offset window is replaced by the standard deviation estimated in the adjacent slice image with the corresponding window, which improves the filtering accuracy for serial images. A pretreatment experiment on CBCT slice images of a wax model of a hollow turbine blade shows that the method performs well both in eliminating noise and in preserving details. (authors)

  12. Metal artefact reduction for a dental cone beam CT image using image segmentation and backprojection filters

    International Nuclear Information System (INIS)

    Mohammadi, Mahdi; Khotanlou, Hassan; Mohammadi, Mohammad

    2011-01-01

    Due to low dose delivery and fast scanning, dental Cone Beam CT (CBCT) is the latest technology being implemented for a range of dental imaging. The presence of metallic objects such as amalgam or gold fillings in the mouth produces artefacts in images of the human jaw. The feasibility of a fast and accurate approach to metal artefact reduction for dental CBCT is investigated. The current study investigates metal artefact reduction using image segmentation and the modification of several sinograms. In order to reduce metal effects such as beam hardening, streak artefacts and intense noise, the application of several algorithms is evaluated. The proposed method includes three stages: pre-processing, reconstruction and post-processing. In the pre-processing stage, several phase and frequency filters were applied to reduce the noise level. In the second stage, based on the specific sinogram obtained for each segment, spline interpolation and weighted backprojection filters were applied to reconstruct the original image. A three-dimensional filter was then applied to the reconstructed images to improve image quality. Results showed that, compared to other available filters, standard frequency filters have a significant influence in the pre-processing stage (ΔHU = 48 ± 6). In addition, where streak artefacts are present, the probability of beam-hardening artefacts increases. In the post-processing stage, the application of three-dimensional filters improves the quality of the reconstructed images (see Fig. 1). Conclusion: The proposed method reduces metal artefacts, especially where more than one metal object is implanted in the region of interest.
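
    The sinogram-completion step at the heart of interpolation-based metal artefact reduction can be sketched as follows. This sketch uses linear rather than spline interpolation and assumes the metal trace has already been segmented:

```python
import numpy as np

def inpaint_sinogram(sino, metal_mask):
    """Replace metal-corrupted sinogram bins by interpolating along each
    detector row from the surrounding uncorrupted bins."""
    out = sino.astype(float).copy()
    cols = np.arange(sino.shape[1])
    for i in range(sino.shape[0]):          # one projection angle per row
        bad = metal_mask[i]
        if bad.any() and not bad.all():
            out[i, bad] = np.interp(cols[bad], cols[~bad], sino[i, ~bad])
    return out
```

    Reconstruction from the inpainted sinogram then suppresses the streaks radiating from the metal, at the cost of some blurring in the inpainted region.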

  13. Two-dimensional restoration of single photon emission computed tomography images using the Kalman filter

    International Nuclear Information System (INIS)

    Boulfelfel, D.; Rangayyan, R.M.; Kuduvalli, G.R.; Hahn, L.J.; Kloiber, R.

    1994-01-01

    The discrete filtered backprojection (DFBP) algorithm used for the reconstruction of single photon emission computed tomography (SPECT) images affects image quality because of the operations of filtering and discretization. The discretization of the filtered backprojection process can cause the modulation transfer function (MTF) of the SPECT imaging system to be anisotropic and nonstationary, especially near the edges of the camera's field of view. The use of shift-invariant restoration techniques fails to restore large images because these techniques do not account for such variations in the MTF. This study presents the application of a two-dimensional (2-D) shift-variant Kalman filter for post-reconstruction restoration of SPECT slices. This filter was applied to SPECT images of a hollow cylinder phantom; a resolution phantom; and a large, truncated cone phantom containing two types of cold spots, a sphere, and a triangular prism. The images were acquired on an ADAC GENESYS camera. A comparison was performed between results obtained by the Kalman filter and those obtained by shift-invariant filters. Quantitative analysis of the restored images, performed through measurement of root mean squared errors, shows a considerable reduction in the error of Kalman-filtered images compared with images restored using shift-invariant methods.

  14. Exploring an optimal wavelet-based filter for cryo-ET imaging.

    Science.gov (United States)

    Huang, Xinrui; Li, Sha; Gao, Song

    2018-02-07

    Cryo-electron tomography (cryo-ET) is one of the most advanced technologies for the in situ visualization of molecular machines by producing three-dimensional (3D) biological structures. However, cryo-ET imaging has two serious disadvantages-low dose and low image contrast-which result in high-resolution information being obscured by noise and image quality being degraded, and this causes errors in biological interpretation. The purpose of this research is to explore an optimal wavelet denoising technique to reduce noise in cryo-ET images. We perform tests using simulation data and design a filter using the optimum selected wavelet parameters (three-level decomposition, level-1 zeroed out, subband-dependent threshold, a soft-thresholding and spline-based discrete dyadic wavelet transform (DDWT)), which we call a modified wavelet shrinkage filter; this filter is suitable for noisy cryo-ET data. When testing using real cryo-ET experiment data, higher quality images and more accurate measures of a biological structure can be obtained with the modified wavelet shrinkage filter processing compared with conventional processing. Because the proposed method provides an inherent advantage when dealing with cryo-ET images, it can therefore extend the current state-of-the-art technology in assisting all aspects of cryo-ET studies: visualization, reconstruction, structural analysis, and interpretation.
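    A minimal wavelet-shrinkage denoiser in the spirit of this record might look as follows. This is a single-level 2-D Haar transform with soft thresholding, far simpler than the three-level, subband-dependent, spline-based DDWT filter the authors selected; it only illustrates the decompose/threshold/reconstruct pattern:

```python
import numpy as np

def haar2d(x):
    # one-level 2-D Haar transform: approximation + 3 detail subbands
    a = (x[0::2, 0::2] + x[1::2, 0::2] + x[0::2, 1::2] + x[1::2, 1::2]) / 2.0
    h = (x[0::2, 0::2] - x[1::2, 0::2] + x[0::2, 1::2] - x[1::2, 1::2]) / 2.0
    v = (x[0::2, 0::2] + x[1::2, 0::2] - x[0::2, 1::2] - x[1::2, 1::2]) / 2.0
    d = (x[0::2, 0::2] - x[1::2, 0::2] - x[0::2, 1::2] + x[1::2, 1::2]) / 2.0
    return a, h, v, d

def ihaar2d(a, h, v, d):
    # exact inverse of haar2d
    x = np.empty((2 * a.shape[0], 2 * a.shape[1]))
    x[0::2, 0::2] = (a + h + v + d) / 2.0
    x[1::2, 0::2] = (a - h + v - d) / 2.0
    x[0::2, 1::2] = (a + h - v - d) / 2.0
    x[1::2, 1::2] = (a - h - v + d) / 2.0
    return x

def soft(x, t):
    # soft thresholding: shrink magnitudes toward zero by t
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def wavelet_shrink(img, t):
    # keep the approximation, soft-threshold the detail subbands
    a, h, v, d = haar2d(img)
    return ihaar2d(a, soft(h, t), soft(v, t), soft(d, t))
```

With `t = 0` the round trip is exact, which is a convenient sanity check on the transform pair.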

  15. Correlation Filters for Detection of Cellular Nuclei in Histopathology Images.

    Science.gov (United States)

    Ahmad, Asif; Asif, Amina; Rajpoot, Nasir; Arif, Muhammad; Minhas, Fayyaz Ul Amir Afsar

    2017-11-21

    Nuclei detection in histology images is an essential part of computer-aided diagnosis of cancers and tumors. It is a challenging task due to the diverse and complicated structures of cells. In this work, we present an automated technique for detection of cellular nuclei in hematoxylin and eosin stained histopathology images. Our proposed approach is based on kernelized correlation filters. Correlation filters have been widely used in object detection and tracking applications, but their strength had not been explored in the medical imaging domain until now. Our experimental results show that the proposed scheme gives state-of-the-art accuracy and can learn complex nuclear morphologies. Like deep learning approaches, the proposed filters do not require engineering of image features, as they can operate directly on histopathology images without significant preprocessing. However, unlike deep learning methods, the large-margin correlation filters developed in this work are interpretable, computationally efficient and do not require specialized or expensive computing hardware. A cloud-based webserver of the proposed method and its Python implementation can be accessed at the following URL: http://faculty.pieas.edu.pk/fayyaz/software.html#corehist.

  16. US images encoding envelope amplitude following narrow band filtering

    International Nuclear Information System (INIS)

    Sommer, F.G.; Stern, R.A.; Chen, H.S.

    1986-01-01

    Ultrasonic waveform data from phantoms having differing scattering characteristics, and from normal and cirrhotic human liver in vivo, were recorded within a standardized dynamic range and filtered with narrow-band filters either above or below the mean recorded ultrasonic center frequency. Images created by mapping the amplitudes of the received ultrasound after such filtering permitted dramatic differentiation, not discernible in conventional US images, between phantoms with differing scattering characteristics and between normal and cirrhotic human livers.

  17. Nonlinear image filtering within IDP++

    Energy Technology Data Exchange (ETDEWEB)

    Lehman, S.K.; Wieting, M.G.; Brase, J.M.

    1995-02-09

    IDP++ (image and data processing in C++) is a set of signal processing libraries written in C++. It is a multi-dimensional (up to four dimensions), multi-data-type (implemented through templates) signal processing extension to C++. IDP++ takes advantage of object-oriented compiler technology to provide "information hiding"; users need only know C, not C++. Signals or data sets are treated like any other variable, with a defined set of operators and functions. We present here some examples from the nonlinear filter library within IDP++, specifically the results of MIN, MAX, median, α-trimmed mean, and edge-trimmed mean filters as applied to real aperture radar (RAR) and synthetic aperture radar (SAR) data sets.
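    The α-trimmed mean filter named in this record sits between the median (trim everything but the middle sample) and the plain mean (trim nothing). A sketch in NumPy rather than the library's C++, with illustrative window size and trim count:

```python
import numpy as np

def alpha_trimmed_mean(img, win=3, alpha=2):
    """Alpha-trimmed mean filter: sort each window, drop the `alpha`
    lowest and `alpha` highest samples, and average the remainder."""
    pad = win // 2
    padded = np.pad(img.astype(float), pad, mode='edge')
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            w = np.sort(padded[i:i + win, j:j + win].ravel())
            out[i, j] = w[alpha:len(w) - alpha].mean()  # trimmed average
    return out
```

With `alpha = 0` this degenerates to the mean filter; with `alpha = (win*win - 1) // 2` it becomes the median.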

  18. Imaging through scattering media by Fourier filtering and single-pixel detection

    Science.gov (United States)

    Jauregui-Sánchez, Y.; Clemente, P.; Lancis, J.; Tajahuerce, E.

    2018-02-01

    We present a novel imaging system that combines the principles of Fourier spatial filtering and single-pixel imaging in order to recover images of an object hidden behind a turbid medium by transillumination. We compare the performance of our single-pixel imaging setup with that of a conventional system. We conclude that the introduction of Fourier gating improves the contrast of images in both cases. Furthermore, we show that the combination of single-pixel imaging and Fourier spatial filtering techniques is particularly well adapted to provide images of objects transmitted through scattering media.

  19. Preprocessing of PHERMEX flash radiographic images with Haar and adaptive filtering

    International Nuclear Information System (INIS)

    Brolley, J.E.

    1978-11-01

    Work on image preparation has continued with the application of high-sequency boosting via Haar filtering, which is useful in developing line or edge structures. Widrow LMS adaptive filtering has also been shown to be useful in developing edge structure in special problems. Shadow effects can be obtained with the latter, which may be useful for some problems. Combined Haar and adaptive filtering is illustrated for a PHERMEX image.

  20. Pleasant/Unpleasant Filtering for Affective Image Retrieval Based on Cross-Correlation of EEG Features

    Directory of Open Access Journals (Sweden)

    Keranmu Xielifuguli

    2014-01-01

    Full Text Available People often make decisions based on sensitivity rather than rationality. In the field of biological information processing, methods are available for analyzing biological information directly based on electroencephalograms (EEG) to determine the pleasant/unpleasant reactions of users. In this study, we propose a sensitivity filtering technique for discriminating preferences (pleasant/unpleasant) for images using a sensitivity image filtering system based on EEG. Using a set of images retrieved by similarity retrieval, we perform the sensitivity-based pleasant/unpleasant classification of images based on the affective features extracted from images with the maximum entropy method (MEM). In the present study, the affective features comprised cross-correlation features obtained from EEGs produced when an individual observed an image. However, it is difficult to measure the EEG when a subject visualizes an unknown image. Thus, we propose a solution where a linear regression method based on canonical correlation is used to estimate the cross-correlation features from image features. Experiments were conducted to evaluate the validity of sensitivity filtering compared with image similarity retrieval methods based on image features. We found that sensitivity filtering using color correlograms was suitable for the classification of preferred images, while sensitivity filtering using local binary patterns was suitable for the classification of unpleasant images. Moreover, sensitivity filtering using local binary patterns for unpleasant images had a 90% success rate. Thus, we conclude that the proposed method is efficient for filtering unpleasant images.

  1. A Kalman filter technique applied for medical image reconstruction

    International Nuclear Information System (INIS)

    Goliaei, S.; Ghorshi, S.; Manzuri, M. T.; Mortazavi, M.

    2011-01-01

    Medical images contain information about vital organic tissues inside the human body and are widely used for diagnosis of disease or for surgical purposes. Image reconstruction is essential for medical images in applications such as noise suppression or de-blurring, in order to provide images with better quality and contrast. Due to the vital role of image reconstruction in the medical sciences, corresponding algorithms with better efficiency and higher speed are desirable. Most image reconstruction algorithms operate in the frequency domain, the most popular being filtered backprojection. In this paper we introduce a Kalman filter technique which operates in the time domain for medical image reconstruction. Results indicated that, as the number of projections increases, for both the normal collected ray sums and the ray sums corrupted by noise, the quality of the reconstructed image improves in terms of contrast and transparency. It is also seen that as the number of projections increases the error index decreases.
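    The core time-domain Kalman recursion underlying such an approach can be shown in its simplest scalar form. This is a generic random-walk state model, not the paper's reconstruction model; `q`, `r`, and the initial state are assumed values:

```python
import numpy as np

def kalman_1d(measurements, q=1e-4, r=0.5, x0=0.0, p0=1.0):
    """Minimal scalar Kalman filter: predict with a random-walk model
    (process variance q), then update with each noisy measurement
    (measurement variance r) via the innovation and Kalman gain."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                    # predict: variance grows
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)          # update with innovation z - x
        p = (1.0 - k) * p            # posterior variance
        estimates.append(x)
    return np.array(estimates)
```

A small `q` relative to `r` makes the filter average over many samples, strongly smoothing the measurement noise.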

  2. Weighted ensemble transform Kalman filter for image assimilation

    Directory of Open Access Journals (Sweden)

    Sebastien Beyou

    2013-01-01

    Full Text Available This study proposes an extension of the Weighted Ensemble Kalman Filter (WEnKF) proposed by Papadakis et al. (2010) for the assimilation of image observations. The main focus of this study is on a novel formulation of the weighted filter with the Ensemble Transform Kalman Filter (WETKF), incorporating directly as a measurement model a non-linear image reconstruction criterion. This technique has been compared to the original WEnKF on numerical and real-world data of 2-D turbulence observed through the transport of a passive scalar. In particular, it has been applied to the reconstruction of oceanic surface current vorticity fields from sea surface temperature (SST) satellite data. This latter technique enables a consistent recovery along time of oceanic surface currents and vorticity maps in the presence of large missing-data areas and strong noise.

  3. Apodized RFI filtering of synthetic aperture radar images

    Energy Technology Data Exchange (ETDEWEB)

    Doerry, Armin Walter

    2014-02-01

    Fine resolution Synthetic Aperture Radar (SAR) systems necessarily require wide bandwidths that often overlap spectrum utilized by other wireless services. These other emitters pose a source of Radio Frequency Interference (RFI) to the SAR echo signals that degrades SAR image quality. Filtering, or excising, the offending spectral contaminants will mitigate the interference, but at a cost of often degrading the SAR image in other ways, notably by raising offensive sidelobe levels. This report proposes borrowing an idea from nonlinear sidelobe apodization techniques to suppress interference without the attendant increase in sidelobe levels. The simple post-processing technique is termed Apodized RFI Filtering (ARF).
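    Plain spectral excision, the baseline that this report's apodized variant improves upon, is easy to sketch. `bad_bins` is a hypothetical index list of the contaminated frequency bins; in practice those bins would be detected from the spectrum:

```python
import numpy as np

def excise_rfi(signal, bad_bins):
    """Baseline RFI excision: transform to the frequency domain, zero
    the contaminated bins, and inverse-transform. The sidelobe increase
    caused by this notching is what apodization then addresses."""
    spec = np.fft.fft(signal)
    spec[list(bad_bins)] = 0.0       # excise the offending spectrum
    return np.fft.ifft(spec)
```

Note that a real tone occupies a positive- and a negative-frequency bin, so both must be excised.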

  4. SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering

    International Nuclear Information System (INIS)

    Iliopoulos, AS; Sun, X; Floros, D; Zhang, Y; Yin, FF; Ren, L; Pitsianis, N

    2016-01-01

    Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or un-supervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented in parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to spatial variations in noise and signal.

  5. SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering

    Energy Technology Data Exchange (ETDEWEB)

    Iliopoulos, AS; Sun, X [Duke University, Durham, NC (United States); Floros, D [Aristotle University of Thessaloniki (Greece); Zhang, Y; Yin, FF; Ren, L [Duke University Medical Center, Durham, NC (United States); Pitsianis, N [Aristotle University of Thessaloniki (Greece); Duke University, Durham, NC (United States)

    2016-06-15

    Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or un-supervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented in parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to spatial variations in noise and signal.

  6. Filtering and deconvolution for bioluminescence imaging of small animals

    International Nuclear Information System (INIS)

    Akkoul, S.

    2010-01-01

    This thesis is devoted to the analysis of bioluminescence images of small animals, an imaging modality used in cancerology studies. However, light from internal bioluminescent sources is diffused and absorbed by the tissues, and system noise and cosmic-ray noise are also present. This degrades image quality and makes the images difficult to analyze. The purpose of this thesis is to overcome these disturbing effects. We first propose an image formation model for bioluminescence images. The processing chain consists of a filtering stage followed by a deconvolution stage. We propose a new median filter to suppress the random-valued impulsive noise which corrupts the acquired images; this filter forms the first block of the proposed chain. For the deconvolution stage, we performed a comparative study of various deconvolution algorithms, which led us to choose a blind deconvolution algorithm initialized with the estimated point spread function of the acquisition system. We first validated our global approach by comparing our results with the ground truth. Through various clinical tests, we have shown that the processing chain yields a significant improvement in spatial resolution and a better distinction of very close tumor sources, which represents a considerable contribution for users of bioluminescence images. (author)
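    For the deconvolution stage, a non-blind Richardson-Lucy iteration gives the flavor of the approach. The thesis uses blind deconvolution; here the point spread function is assumed known, and the circular FFT convolution model is a simplification for the sake of a short example:

```python
import numpy as np

def richardson_lucy(blurred, psf, iters=50):
    """Non-blind Richardson-Lucy deconvolution on a 1-D signal under a
    circular convolution model. Each iteration multiplies the estimate
    by the correlation of the data/model ratio with the PSF."""
    F, Fi = np.fft.fft, np.fft.ifft
    H = F(psf, len(blurred))                 # PSF transfer function
    est = np.full_like(blurred, float(blurred.mean()))
    for _ in range(iters):
        model = np.real(Fi(F(est) * H)) + 1e-12   # current re-blur
        ratio = blurred / model
        est = est * np.real(Fi(F(ratio) * np.conj(H)))  # RL update
    return est
```

Starting from a flat estimate, the iteration progressively concentrates flux back toward the original source positions.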

  7. Digital Path Approach Despeckle Filter for Ultrasound Imaging and Video

    Directory of Open Access Journals (Sweden)

    Marek Szczepański

    2017-01-01

    Full Text Available We propose a novel filtering technique capable of reducing the multiplicative noise in ultrasound images, which is an extension of the denoising algorithms based on the concept of digital paths. In this approach, the filter weights are calculated taking into account the similarity between pixel intensities that belong to the local neighborhood of the processed pixel, which is called a path. The output of the filter is estimated as the weighted average of pixels connected by the paths. The way of creating paths is pivotal and determines the effectiveness and computational complexity of the proposed filtering design. Such a procedure can be effective for different types of noise but fails in the presence of multiplicative noise. To increase the filtering efficiency for this type of disturbance, we introduce some improvements of the basic concept and new classes of similarity functions, and finally extend our techniques to the spatiotemporal domain. The experimental results prove that the proposed algorithm provides results comparable with the state-of-the-art techniques for multiplicative noise removal in ultrasound images, and it can be applied for real-time image enhancement of video streams.

  8. Image restoration technique using median filter combined with decision tree algorithm

    International Nuclear Information System (INIS)

    Sethu, D.; Assadi, H.M.; Hasson, F.N.; Hasson, N.N.

    2007-01-01

    Images are usually corrupted during transmission, principally due to interference in the channel used for transmission. Images can also be impaired by the addition of various forms of noise; salt-and-pepper noise is a common example. Salt-and-pepper noise can be caused by errors in data transmission, malfunctioning pixel elements in camera sensors, and timing errors in the digitization process. During the filtering of a noisy image, important features such as edges, lines and other fine details embedded in the image tend to blur because of the filtering operation. The enhancement of noisy data, however, is a very critical process, because the sharpening operation can significantly increase the noise. In this respect, contrast enhancement is often necessary in order to highlight details that have been blurred. In this proposed approach we aim to develop an image processing technique that can meet the new requirements of high quality and high speed, prevent noise accretion during the sharpening of image details, and compare the images restored via the proposed method with those produced by other kinds of filters. (author)
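    The median filter at the heart of this approach is compact. A 3 × 3 sketch for removing salt-and-pepper pixels (the record's decision-tree refinement is not reproduced here):

```python
import numpy as np

def median_filter(img, win=3):
    """Sliding-window median filter: replace each pixel by the median
    of its neighborhood. Isolated salt (bright) and pepper (dark)
    pixels cannot survive, while step edges are largely preserved."""
    pad = win // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + win, j:j + win])
    return out
```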

  9. A filtering approach to image reconstruction in 3D SPECT

    International Nuclear Information System (INIS)

    Bronnikov, Andrei V.

    2000-01-01

    We present a new approach to three-dimensional (3D) image reconstruction using analytical inversion of the exponential divergent beam transform, which can serve as a mathematical model for cone-beam 3D SPECT imaging. We apply a circular cone-beam scan and assume constant attenuation inside a convex area with a known boundary, which is satisfactory in brain imaging. The reconstruction problem is reduced to an image restoration problem characterized by a shift-variant point spread function which is given analytically. The method requires two computation steps: backprojection and filtering. The modulation transfer function (MTF) of the filter is derived by means of an original methodology using the 2D Laplace transform. The filter is implemented in the frequency domain and requires 2D Fourier transform of transverse slices. In order to obtain a shift-invariant cone-beam projection-backprojection operator we resort to an approximation, assuming that the collimator has a relatively large focal length. Nevertheless, numerical experiments demonstrate surprisingly good results for detectors with relatively short focal lengths. The use of a wavelet-based filtering algorithm greatly improves the stability to Poisson noise. (author)

  10. Spectrum image analysis tool - A flexible MATLAB solution to analyze EEL and CL spectrum images.

    Science.gov (United States)

    Schmidt, Franz-Philipp; Hofer, Ferdinand; Krenn, Joachim R

    2017-02-01

    Spectrum imaging techniques, gaining simultaneously structural (image) and spectroscopic data, require appropriate and careful processing to extract information from the dataset. In this article we introduce a MATLAB-based software that uses three-dimensional data (EEL/CL spectrum images in dm3 format (Gatan Inc.'s DigitalMicrograph ® )) as input. A graphical user interface enables fast and easy mapping of spectrally dependent images and position-dependent spectra. First, data processing such as background subtraction, deconvolution and denoising, second, multiple display options including an EEL/CL moviemaker and, third, the applicability to a large number of data sets with a small workload make this program an interesting tool to visualize otherwise hidden details. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Time Domain Filtering of Resolved Images of Sgr A{sup ∗}

    Energy Technology Data Exchange (ETDEWEB)

    Shiokawa, Hotaka; Doeleman, Sheperd S. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Gammie, Charles F. [Department of Physics, University of Illinois, 1110 West Green Street, Urbana, IL 61801 (United States)

    2017-09-01

    The goal of the Event Horizon Telescope (EHT) is to provide spatially resolved images of Sgr A*, the source associated with the Galactic Center black hole. Because Sgr A* varies on timescales that are short compared to an EHT observing campaign, it is interesting to ask whether variability contains information about the structure and dynamics of the accretion flow. In this paper, we introduce “time-domain filtering,” a technique to filter time fluctuating images with specific temporal frequency ranges and to demonstrate the power and usage of the technique by applying it to mock millimeter wavelength images of Sgr A*. The mock image data is generated from the General Relativistic Magnetohydrodynamic (GRMHD) simulation and the general relativistic ray-tracing method. We show that the variability on each line of sight is tightly correlated with a typical radius of emission. This is because disk emissivity fluctuates on a timescale of the order of the local orbital period. Time-domain filtered images therefore reflect the model dependent emission radius distribution, which is not accessible in time-averaged images. We show that, in principle, filtered data have the power to distinguish between models with different black-hole spins, different disk viewing angles, and different disk orientations in the sky.
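    Per pixel, time-domain filtering of an image sequence reduces to band-passing the light curve along the time axis. A sketch (the band limits and sampling rate are illustrative, not values from the paper):

```python
import numpy as np

def time_domain_filter(frames, f_lo, f_hi, fs=1.0):
    """Time-domain filtering of an image cube of shape (T, H, W):
    FFT each pixel's light curve along the time axis, keep only
    temporal frequencies with |f| in [f_lo, f_hi), and invert."""
    n = frames.shape[0]
    freqs = np.abs(np.fft.fftfreq(n, d=1.0 / fs))
    mask = (freqs >= f_lo) & (freqs < f_hi)      # band-pass mask
    spec = np.fft.fft(frames, axis=0)
    return np.real(np.fft.ifft(spec * mask[:, None, None], axis=0))
```

Pixels that vary only outside the selected band (for example, static emission when the DC bin is excluded) are suppressed, leaving the variability in the chosen temporal-frequency range.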

  12. Image denoising by sparse 3-D transform-domain collaborative filtering.

    Science.gov (United States)

    Dabov, Kostadin; Foi, Alessandro; Katkovnik, Vladimir; Egiazarian, Karen

    2007-08-01

    We propose a novel image denoising strategy based on an enhanced sparse representation in transform domain. The enhancement of the sparsity is achieved by grouping similar 2-D image fragments (e.g., blocks) into 3-D data arrays which we call "groups." Collaborative filtering is a special procedure developed to deal with these 3-D groups. We realize it using the three successive steps: 3-D transformation of a group, shrinkage of the transform spectrum, and inverse 3-D transformation. The result is a 3-D estimate that consists of the jointly filtered grouped image blocks. By attenuating the noise, the collaborative filtering reveals even the finest details shared by grouped blocks and, at the same time, it preserves the essential unique features of each individual block. The filtered blocks are then returned to their original positions. Because these blocks are overlapping, for each pixel, we obtain many different estimates which need to be combined. Aggregation is a particular averaging procedure which is exploited to take advantage of this redundancy. A significant improvement is obtained by a specially developed collaborative Wiener filtering. An algorithm based on this novel denoising strategy and its efficient implementation are presented in full detail; an extension to color-image denoising is also developed. The experimental results demonstrate that this computationally scalable algorithm achieves state-of-the-art denoising performance in terms of both peak signal-to-noise ratio and subjective visual quality.
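    A heavily simplified sketch of the grouping and collaborative-filtering idea follows. It is not the full BM3D algorithm: there is no second Wiener stage, a plain 3-D FFT stands in for the authors' transforms, and reference patches are taken on a non-overlapping grid; the parameters are illustrative:

```python
import numpy as np

def collaborative_filter(img, patch=4, group_size=8, thresh=1.0):
    """Toy collaborative filter: for each reference patch, stack the
    most similar patches into a 3-D group, hard-threshold the group's
    3-D FFT spectrum, and aggregate the filtered patches by averaging
    the overlapping estimates."""
    h, w = img.shape
    coords = [(i, j) for i in range(0, h - patch + 1, patch)
                     for j in range(0, w - patch + 1, patch)]
    patches = np.array([img[i:i + patch, j:j + patch] for i, j in coords])
    acc = np.zeros_like(img, dtype=float)
    cnt = np.zeros_like(img, dtype=float)
    for r in range(len(coords)):
        # grouping: patches closest (L2) to the reference patch
        d = ((patches - patches[r]) ** 2).sum(axis=(1, 2))
        members = np.argsort(d)[:group_size]
        group = patches[members].astype(float)
        spec = np.fft.fftn(group)
        spec[np.abs(spec) < thresh] = 0.0        # shrinkage (hard threshold)
        filtered = np.real(np.fft.ifftn(spec))
        for m, idx in enumerate(members):         # aggregation
            i, j = coords[idx]
            acc[i:i + patch, j:j + patch] += filtered[m]
            cnt[i:i + patch, j:j + patch] += 1.0
    return acc / np.maximum(cnt, 1.0)
```

Because each patch receives several filtered estimates (one per group it joins), the final averaging step exploits exactly the redundancy the abstract describes.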

  13. Kalman filter techniques for accelerated Cartesian dynamic cardiac imaging.

    Science.gov (United States)

    Feng, Xue; Salerno, Michael; Kramer, Christopher M; Meyer, Craig H

    2013-05-01

    In dynamic MRI, spatial and temporal parallel imaging can be exploited to reduce scan time. Real-time reconstruction enables immediate visualization during the scan. Commonly used view-sharing techniques suffer from limited temporal resolution, and many of the more advanced reconstruction methods are either retrospective, time-consuming, or both. A Kalman filter model capable of real-time reconstruction can be used to increase the spatial and temporal resolution in dynamic MRI reconstruction. The original study describing the use of the Kalman filter in dynamic MRI was limited to non-Cartesian trajectories because of a limitation intrinsic to the dynamic model used in that study. Here the limitation is overcome, and the model is applied to the more commonly used Cartesian trajectory with fast reconstruction. Furthermore, a combination of the Kalman filter model with Cartesian parallel imaging is presented to further increase the spatial and temporal resolution and signal-to-noise ratio. Simulations and experiments were conducted to demonstrate that the Kalman filter model can increase the temporal resolution of the image series compared with view-sharing techniques and decrease the spatial aliasing compared with TGRAPPA. The method requires relatively little computation, and thus is suitable for real-time reconstruction. Copyright © 2012 Wiley Periodicals, Inc.

  14. Filters involving derivatives with application to reconstruction from scanned halftone images

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Jensen, Kim S.

    1995-01-01

    This paper presents a method for designing finite impulse response (FIR) filters for samples of a 2-D signal, e.g., an image, and its gradient. The filters, which are called blended filters, are decomposable into three filters, each separable in 1-D filters on subsets of the data set. Optimality in the minimum mean square error sense (MMSE) of blended filtering is shown for signals with separable autocorrelation function. Relations between correlation functions for signals and their gradients are derived. Blended filters may be composed from FIR Wiener filters using these relations. Simple blended ... is achievable with blended filters ...

  15. Magnetic resonance image enhancement using V-filter

    International Nuclear Information System (INIS)

    Yamamoto, H.; Sugita, K.; Kanzaki, N.; Johja, I.; Hiraki, Y.

    1990-01-01

    The purpose of this study is to present a method of boundary enhancement algorithms for magnetic resonance images using a V-filter. The boundary of the brain tumor was precisely extracted by the region segmentation techniques

  16. Raman imaging using fixed bandpass filter

    Science.gov (United States)

    Landström, L.; Kullander, F.; Lundén, H.; Wästerby, P.

    2017-05-01

    By using fixed narrow band pass optical filtering and scanning the laser excitation wavelength, hyperspectral Raman imaging could be achieved. Experimental, proof-of-principle results from the Chemical Warfare Agent (CWA) tabun (GA) as well as the common CWA simulant tributyl phosphate (TBP) on different surfaces/substrates are presented and discussed.

  17. Advanced microlens and color filter process technology for the high-efficiency CMOS and CCD image sensors

    Science.gov (United States)

    Fan, Yang-Tung; Peng, Chiou-Shian; Chu, Cheng-Yu

    2000-12-01

    New markets are emerging for digital electronic image devices, especially in visual communications, PC cameras, mobile/cell phones, security systems, toys, vehicle imaging systems and computer peripherals for document capture. A one-chip imaging system, in which the image sensor has a fully digital interface, can bring image capture devices into our daily lives. Adding a color filter to such an image sensor, in a pattern of pixel mosaics or wide stripes, makes the image more real and colorful; one could say that the color filter makes life more colorful. A color filter transmits only light of the specific wavelength band matching the filter itself and blocks the rest. The color filter process consists of coating and patterning green, red and blue (or cyan, magenta and yellow) mosaic resists onto the matched pixels of the image sensing array. From the signal captured at each pixel, the image of the scene can then be reconstructed. The wide use of digital electronic cameras and multimedia applications today makes the future of the color filter bright. Although it poses challenges, developing the color filter process is well worthwhile. We provide the best service in terms of shorter cycle time, excellent color quality, and high and stable yield. The key issues of the advanced color process that have to be solved and implemented are planarization and micro-lens technology. Many other key points of color filter process technology that have to be considered are also described in this paper.

  18. Dynamic positron emission tomography image restoration via a kinetics-induced bilateral filter.

    Directory of Open Access Journals (Sweden)

    Zhaoying Bian

    Full Text Available Dynamic positron emission tomography (PET) imaging is a powerful tool that provides useful quantitative information on physiological and biochemical processes. However, the low signal-to-noise ratio in short dynamic frames makes accurate kinetic parameter estimation from noisy voxel-wise time activity curves (TACs) a challenging task. To address this problem, several spatial filters have been investigated to reduce the noise of each frame with noticeable gains, including the Gaussian filter, the bilateral filter, and wavelet-based filters. These filters usually consider only the local properties of each frame without exploring potential kinetic information from the full set of frames. Thus, in this work, to improve PET parametric imaging accuracy, we present a kinetics-induced bilateral filter (KIBF) that reduces the noise of dynamic image frames by incorporating the similarity between the voxel-wise TACs into the framework of the bilateral filter. The aim of the proposed KIBF algorithm is to reduce noise in homogeneous areas while preserving the distinct kinetics of regions of interest. Experimental results on a digital brain phantom and an in vivo rat study with typical (18)F-FDG kinetics have shown that the KIBF algorithm can achieve notable gains over other existing algorithms in terms of quantitative accuracy measures and visual inspection.
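
The core idea, replacing the bilateral filter's single-frame intensity-difference weight with a whole-TAC distance, can be sketched as follows. This is a minimal illustration of the weighting scheme only, with hypothetical parameter values; the published KIBF has further details.

```python
import numpy as np

def kibf(frames, sigma_s=1.0, sigma_t=0.5, radius=1):
    """Bilateral-style filter for dynamic PET frames: the range weight
    compares whole time-activity curves (TACs) rather than single-frame
    intensities. A sketch of the idea only, not the authors' code."""
    T, H, W = frames.shape
    out = np.zeros_like(frames, dtype=float)
    for y in range(H):
        for x in range(W):
            acc, wsum = np.zeros(T), 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < H and 0 <= nx < W:
                        d_s = dy * dy + dx * dx                          # spatial distance
                        d_t = np.sum((frames[:, y, x] - frames[:, ny, nx]) ** 2)  # TAC distance
                        w = np.exp(-d_s / (2 * sigma_s ** 2) - d_t / (2 * sigma_t ** 2))
                        acc += w * frames[:, ny, nx]
                        wsum += w
            out[:, y, x] = acc / wsum
    return out

frames = np.ones((3, 4, 4))           # constant activity everywhere
smoothed = kibf(frames)
print(np.allclose(smoothed, frames))  # True: a flat volume is unchanged
```

Voxels with similar TACs average together; voxels with distinct kinetics receive near-zero weight, which is what preserves region-of-interest kinetics.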

  19. Two-dimensional real-time imaging system for subtraction angiography using an iodine filter

    Science.gov (United States)

    Umetani, Keiji; Ueda, Ken; Takeda, Tohoru; Anno, Izumi; Itai, Yuji; Akisada, Masayoshi; Nakajima, Teiichi

    1992-01-01

    A new type of subtraction imaging system was developed using an iodine filter and single-energy, broad-bandwidth monochromatized x rays. The x-ray images of coronary arteries made after intravenous injection of a contrast agent are enhanced by an energy-subtraction technique. Filter chopping of the x-ray beam switches energies rapidly, so that a nearly simultaneous pair of filtered and nonfiltered images can be made. By using a high-speed video camera, a pair of 512 × 512 pixel images can be obtained within 9 ms. Three hundred eighty-four images (raw data) are stored in a 144-Mbyte frame memory. After phantom studies, in vivo subtracted images of coronary arteries in dogs were obtained at a rate of 15 images/s.

  20. Guided filtering for solar image/video processing

    Directory of Open Access Journals (Sweden)

    Long Xu

    2017-06-01

    Full Text Available A new image enhancement algorithm employing guided filtering is proposed in this work for the enhancement of solar images and videos, so that users can easily make out the important fine structures embedded in the recorded images/movies of solar observations. The proposed algorithm can efficiently remove image noise, including Gaussian and impulse noise. Meanwhile, it further highlights fibrous structures on/beyond the solar disk. These fibrous structures can clearly demonstrate the progress of solar flares, prominences, coronal mass ejections, magnetic fields, and so on. The experimental results show that the proposed algorithm yields a significant enhancement of visual quality over the original input and over several classical image enhancement algorithms, thus facilitating easier identification of interesting solar burst activities in the recorded images/movies.
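
For reference, the guided filter that such work builds on fits a local linear model q = a·I + b in each window (He et al.'s formulation). A minimal single-channel sketch, with illustrative parameter values:

```python
import numpy as np

def box(x, r):
    """Local mean over a (2r+1)x(2r+1) window, edges replicated."""
    p = np.pad(x, r, mode='edge')
    out = np.zeros_like(x, dtype=float)
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            out += p[dy:dy + x.shape[0], dx:dx + x.shape[1]]
    return out / (2 * r + 1) ** 2

def guided_filter(I, p, r=2, eps=1e-3):
    """Guided filter: fit q = a*I + b per window, then average (a, b)."""
    mI, mp = box(I, r), box(p, r)
    cov = box(I * p, r) - mI * mp    # covariance of guide and input
    var = box(I * I, r) - mI * mI    # variance of the guide
    a = cov / (var + eps)            # eps controls edge preservation
    b = mp - a * mI
    return box(a, r) * I + box(b, r)

img = np.full((8, 8), 0.25)
out = guided_filter(img, img)        # self-guided: smooths flat regions
```

Because `a` approaches 1 where the guide has strong structure and 0 in flat regions, the filter smooths noise while keeping edges, the property the abstract exploits for fibrous solar structures.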

  1. DESIGN OF DYADIC-INTEGER-COEFFICIENTS BASED BI-ORTHOGONAL WAVELET FILTERS FOR IMAGE SUPER-RESOLUTION USING SUB-PIXEL IMAGE REGISTRATION

    Directory of Open Access Journals (Sweden)

    P.B. Chopade

    2014-05-01

    Full Text Available This paper presents an image super-resolution scheme based on sub-pixel image registration using a specific class of dyadic-integer-coefficient wavelet filters derived from the construction of a half-band polynomial. First, the integer-coefficient half-band polynomial is designed by the splitting approach. Next, this half-band polynomial is factorized and assigned a specific number of vanishing moments and roots to obtain the dyadic-integer-coefficient low-pass analysis and synthesis filters. The potential of these dyadic-integer-coefficient wavelet filters is explored in the field of image super-resolution using sub-pixel image registration. The two low-resolution frames are registered at a specific shift from one another to restore the resolution lost by the CCD array of the camera. The discrete wavelet transform (DWT) obtained from the designed coefficients is applied to these two low-resolution images to obtain the high-resolution image. The developed approach is validated by comparing its quality metrics with those of existing filter banks.

  2. Iris image recognition wavelet filter-banks based iris feature extraction schemes

    CERN Document Server

    Rahulkar, Amol D

    2014-01-01

    This book provides new results in wavelet filter banks based feature extraction, and in classifiers, in the field of iris image recognition. It gives a broad treatment of the design of separable and non-separable wavelet filter banks and of the classifier. The design techniques presented in the book are applied to iris image analysis for person authentication. The book also brings together three strands of research (wavelets, iris image analysis, and classifiers) and compares the performance of the presented techniques with state-of-the-art schemes. It contains a compilation of basic material on the design of wavelets that saves readers from consulting many different books, and therefore provides an easier path for newcomers and researchers to master the contents. In addition, the designed filter banks and classifier can also be used more effectively than existing filter banks in many signal processing applications like pattern classification, data compression, watermarking, denoising, etc. that will...

  3. Document image binarization using "multi-scale" predefined filters

    Science.gov (United States)

    Saabni, Raid M.

    2018-04-01

    Reading text or searching for keywords within a historical document is a very challenging task. One of the first steps of the complete task is binarization, where foreground such as text, figures and drawings is separated from the background. The success of this important step often determines whether subsequent steps succeed or fail, so it is vital to the complete task of reading and analyzing the content of a document image. Generally, historical document images are of poor quality due to their storage conditions and degradation over time, which cause varying contrast, stains, dirt and ink seeping from the reverse side. In this paper, we use banks of anisotropic predefined filters at different scales and orientations to develop a binarization method for degraded documents and manuscripts. Exploiting the fact that handwritten strokes may follow different scales and orientations, we use predefined sets of filter banks with various scales, weights and orientations to seek a compact set of filters and weights that generates different layers of foreground and background. The results of convolving these filters locally on the gray-level image are weighted and accumulated to enhance the original image. Based on the different layers, on seeds of components in the gray-level image and on a learning process, we present an improved binarization algorithm to separate the background from layers of foreground. Layers of foreground caused by seeping ink, degradation or other factors are then separated from the true foreground in a second phase. Promising experimental results were obtained on the DIBCO2011, DIBCO2013 and H-DIBCO2016 data sets and on a collection of images taken from real historical documents.

  4. Image restoration by Wiener filtering in the presence of signal-dependent noise.

    Science.gov (United States)

    Kondo, K; Ichioka, Y; Suzuki, T

    1977-09-01

    An optimum filter to restore the degraded image due to blurring and the signal-dependent noise is obtained on the basis of the theory of Wiener filtering. Computer simulations of image restoration using signal-dependent noise models are carried out. It becomes clear that the optimum filter, which makes use of a priori information on the signal-dependent nature of the noise and the spectral density of the signal and the noise showing significant spatial correlation, is potentially advantageous.
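
A frequency-domain Wiener deconvolution sketch may help. Note the simplification: the paper's optimum filter uses a priori spectra of the signal and the signal-dependent noise, whereas this sketch collapses them into a single constant noise-to-signal ratio.

```python
import numpy as np

def wiener_deconv(blurred, psf, nsr):
    """Wiener deconvolution with a constant noise-to-signal power ratio
    `nsr` (a common simplification; the paper's optimum filter uses the
    full, signal-dependent spectra). `psf` is an image-sized array whose
    center of mass sits at index (0, 0)."""
    H = np.fft.fft2(psf)
    G = np.fft.fft2(blurred)
    F = np.conj(H) * G / (np.abs(H) ** 2 + nsr)  # Wiener estimate
    return np.real(np.fft.ifft2(F))

rng = np.random.default_rng(0)
img = rng.random((8, 8))
psf = np.zeros((8, 8))
psf[0, 0] = 1.0                                   # identity blur
restored = wiener_deconv(img, psf, nsr=0.0)       # exact inverse here
```

With `nsr = 0` and a delta PSF the filter reduces to the identity; a positive `nsr` damps frequencies where the blur transfer function is small, which is what keeps noise from being amplified.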

  5. Multi-look polarimetric SAR image filtering using simulated annealing

    DEFF Research Database (Denmark)

    Schou, Jesper

    2000-01-01

    Based on a previously published algorithm capable of estimating the radar cross-section in synthetic aperture radar (SAR) intensity images, a new filter is presented utilizing multi-look polarimetric SAR images. The underlying mean covariance matrix is estimated from the observed sample covariance...

  6. The influence of software filtering in digital mammography image quality

    Science.gov (United States)

    Michail, C.; Spyropoulou, V.; Kalyvas, N.; Valais, I.; Dimitropoulos, N.; Fountos, G.; Kandarakis, I.; Panayiotakis, G.

    2009-05-01

    Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in the early detection of breast cancer, such as conventional and digital x-ray mammography, positron and single-photon emission mammography, etc. A key advantage of digital mammography is that images can be manipulated as simple computer image files. Thus non-dedicated, commercially available image manipulation software can be employed to process and store the images. The image processing tools of the Photoshop (CS 2) software incorporate digital filters which may be used to reduce image noise, enhance contrast and increase spatial resolution. However, improving one image quality parameter may degrade another. The aim of this work was to investigate the influence of three sharpening filters, hereafter named sharpen, sharpen more and sharpen edges, on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion, it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.
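
Sharpening filters of the kind discussed are typically small convolution kernels whose weights sum to one, so flat regions pass through unchanged while local contrast at edges is boosted. Photoshop's exact kernels are not published; the 3 × 3 kernel below is purely illustrative.

```python
import numpy as np

# A generic 3x3 sharpening kernel (weights sum to 1); illustrative only,
# not Photoshop's actual "sharpen" kernel, which is undocumented.
kernel = np.array([[ 0., -1.,  0.],
                   [-1.,  5., -1.],
                   [ 0., -1.,  0.]])

def convolve3(img, k):
    """Same-size 3x3 correlation with replicated edges."""
    p = np.pad(img, 1, mode='edge')
    out = np.zeros_like(img, dtype=float)
    for i in range(3):
        for j in range(3):
            out += k[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
    return out

flat = np.full((6, 6), 0.5)
print(np.allclose(convolve3(flat, kernel), flat))  # True: kernel sums to 1
```

The same mechanism explains the resolution/noise trade-off the study measures: the negative surround that raises the MTF at high spatial frequencies also amplifies high-frequency noise.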

  7. The influence of software filtering in digital mammography image quality

    International Nuclear Information System (INIS)

    Michail, C; Spyropoulou, V; Valais, I; Panayiotakis, G; Kalyvas, N; Fountos, G; Kandarakis, I; Dimitropoulos, N

    2009-01-01

    Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in the early detection of breast cancer, such as conventional and digital x-ray mammography, positron and single-photon emission mammography, etc. A key advantage of digital mammography is that images can be manipulated as simple computer image files. Thus non-dedicated, commercially available image manipulation software can be employed to process and store the images. The image processing tools of the Photoshop (CS 2) software incorporate digital filters which may be used to reduce image noise, enhance contrast and increase spatial resolution. However, improving one image quality parameter may degrade another. The aim of this work was to investigate the influence of three sharpening filters, hereafter named sharpen, sharpen more and sharpen edges, on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion, it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.

  8. FEI Titan 80-300 STEM

    Directory of Open Access Journals (Sweden)

    Marc Heggen

    2016-02-01

    Full Text Available The FEI Titan 80-300 STEM is a scanning transmission electron microscope equipped with a field emission electron gun, a three-condenser lens system, a monochromator unit, a Cs probe corrector (CEOS), a post-column energy filter system (Gatan Tridiem 865 ER) as well as a Gatan 2k slow-scan CCD system. Characterised by a STEM resolution of 80 pm at 300 kV, the instrument was one of the first of a small number of sub-ångström resolution scanning transmission electron microscopes in the world when commissioned in 2006.

  9. COMPARISON OF ULTRASOUND IMAGE FILTERING METHODS BY MEANS OF MULTIVARIABLE KURTOSIS

    Directory of Open Access Journals (Sweden)

    Mariusz Nieniewski

    2017-06-01

    Full Text Available Comparison of the quality of despeckled US medical images is complicated because there is no image of a human body that would be free of speckles and could serve as a reference. A number of image metrics are currently used for comparison of filtering methods; however, they do not satisfactorily represent the visual quality of images or the medical expert’s satisfaction with them. This paper proposes an innovative use of relative multivariate kurtosis for the evaluation of the most important edges in an image. Multivariate kurtosis allows one to introduce an order among the filtered images and can be used as one of the metrics for image quality evaluation. At present there is no method which would jointly consider individual metrics. Furthermore, these metrics are typically defined by comparing the noisy original and filtered images, which is incorrect since the noisy original cannot serve as a gold standard. In contrast, the proposed kurtosis is an absolute measure, calculated independently of any reference image, and it agrees with the medical expert’s satisfaction to a large extent. The paper presents a numerical procedure for calculating kurtosis and describes the results of such calculations for a computer-generated noisy image, images of a general-purpose phantom and a cyst phantom, as well as real-life images of the thyroid and carotid artery obtained with a SonixTouch ultrasound machine. Sixteen different methods of image despeckling are compared via kurtosis. The paper shows that visually more satisfactory despeckling results are associated with higher kurtosis, and to a certain degree kurtosis can be used as a single metric for evaluation of image quality.

  10. Fan-beam and cone-beam image reconstruction via filtering the backprojection image of differentiated projection data

    International Nuclear Information System (INIS)

    Zhuang Tingliang; Leng Shuai; Nett, Brian E; Chen Guanghong

    2004-01-01

    In this paper, a new image reconstruction scheme is presented based on Tuy's cone-beam inversion scheme and its fan-beam counterpart. It is demonstrated that Tuy's inversion scheme may be used to derive a new framework for fan-beam and cone-beam image reconstruction. In this new framework, images are reconstructed via filtering the backprojection image of differentiated projection data. The new framework is mathematically exact and is applicable to a general source trajectory provided the Tuy data sufficiency condition is satisfied. By choosing a piece-wise constant function for one of the components in the factorized weighting function, the filtering kernel is one dimensional, viz. the filtering process is along a straight line. Thus, the derived image reconstruction algorithm is mathematically exact and efficient. In the cone-beam case, the derived reconstruction algorithm is applicable to a large class of source trajectories where the pi-lines or the generalized pi-lines exist. In addition, the new reconstruction scheme survives the super-short scan mode in both the fan-beam and cone-beam cases provided the data are not transversely truncated. Numerical simulations were conducted to validate the new reconstruction scheme for the fan-beam case.

  11. Evaluation of multichannel Wiener filters applied to fine resolution passive microwave images of first-year sea ice

    Science.gov (United States)

    Full, William E.; Eppler, Duane T.

    1993-01-01

    The effectiveness of multichannel Wiener filters in improving images obtained with passive microwave systems was investigated by applying Wiener filters to passive microwave images of first-year sea ice. Four major parameters which define the filter were varied: the lag or pixel offset between the original and the desired scenes, the filter length, the number of lines in the filter, and the weight applied to the empirical correlation functions. The effect of each variable on the image quality was assessed by visually comparing the results. It was found that the application of multichannel Wiener theory to passive microwave images of first-year sea ice resulted in visually sharper images with enhanced textural features and less high-frequency noise. However, Wiener filters induced a slight blocky grain in the image and could produce a type of ringing along scan lines traversing sharp intensity contrasts.

  12. MRT letter: Guided filtering of image focus volume for 3D shape recovery of microscopic objects.

    Science.gov (United States)

    Mahmood, Muhammad Tariq

    2014-12-01

    In this letter, a shape from focus (SFF) method is proposed that utilizes the guided image filtering to enhance the image focus volume efficiently. First, image focus volume is computed using a conventional focus measure. Then each layer of image focus volume is filtered using guided filtering. In this work, the all-in-focus image, which can be obtained from the initial focus volume, is used as guidance image. Finally, improved depth map is obtained from the filtered image focus volume by maximizing the focus measure along the optical axis. The proposed SFF method is efficient and provides better depth maps. The improved performance is highlighted by conducting several experiments using image sequences of simulated and real microscopic objects. The comparative analysis demonstrates the effectiveness of the proposed SFF method. © 2014 Wiley Periodicals, Inc.
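
The first and last steps of the pipeline, computing a focus volume with a conventional focus measure and then maximizing along the optical axis, can be sketched as below. The paper's guided filtering of the volume, which sits between these two steps, is omitted, and the modified-Laplacian measure used here is one common choice rather than the letter's prescription.

```python
import numpy as np

def focus_volume(stack):
    """Per-slice modified-Laplacian focus measure (a common conventional
    choice; the letter does not commit to a specific measure)."""
    fm = np.zeros_like(stack, dtype=float)
    for k, img in enumerate(stack):
        p = np.pad(img, 1, mode='edge')
        lx = np.abs(2 * p[1:-1, 1:-1] - p[1:-1, :-2] - p[1:-1, 2:])
        ly = np.abs(2 * p[1:-1, 1:-1] - p[:-2, 1:-1] - p[2:, 1:-1])
        fm[k] = lx + ly
    return fm

def depth_map(stack):
    """Depth = index of the slice maximizing focus along the optical axis."""
    return np.argmax(focus_volume(stack), axis=0)

stack = np.zeros((3, 6, 6))
yy, xx = np.indices((6, 6))
stack[1] = (yy + xx) % 2          # only slice 1 carries sharp texture
print(depth_map(stack))           # every pixel is assigned to slice 1
```

Filtering each layer of `focus_volume` with the guided filter (guided by the all-in-focus image) before the `argmax` is precisely where the proposed method improves the depth map.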

  13. An Efficient FPGA Implementation of Optimized Anisotropic Diffusion Filtering of Images

    Directory of Open Access Journals (Sweden)

    Chandrajit Pal

    2016-01-01

    Full Text Available Digital image processing is an exciting area of research with a variety of applications including medical, surveillance security systems, defence, and space applications. Noise removal as a preprocessing step helps to improve the performance of the signal processing algorithms, thereby enhancing image quality. Anisotropic diffusion filtering proposed by Perona and Malik can be used as an edge-preserving smoother, removing high-frequency components of images without blurring their edges. In this paper, we present the FPGA implementation of an edge-preserving anisotropic diffusion filter for digital images. The designed architecture completely replaced the convolution operation and implemented the same using simple arithmetic subtraction of the neighboring intensities within a kernel, preceded by multiple operations in parallel within the kernel. To improve the image reconstruction quality, the diffusion coefficient parameter, responsible for controlling the filtering process, has been properly analyzed. Its signal behavior has been studied by subsequently scaling and differentiating the signal. The hardware implementation of the proposed design shows better performance in terms of reconstruction quality and accelerated performance with respect to its software implementation. It also reduces computation, power consumption, and resource utilization with respect to other related works.
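
A plain software reference version of Perona-Malik diffusion (not the paper's FPGA architecture) may clarify what the hardware implements: at each step, neighbour differences are scaled by an edge-stopping conductance, so smoothing is suppressed across strong gradients. Parameter values here are arbitrary.

```python
import numpy as np

def perona_malik(img, n_iter=10, kappa=0.5, lam=0.2):
    """Perona-Malik diffusion with the exponential conductance
    g(d) = exp(-(d/kappa)^2); lam <= 0.25 keeps the explicit
    4-neighbour scheme stable. Parameters are illustrative."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        p = np.pad(u, 1, mode='edge')
        diffs = (p[:-2, 1:-1] - u,   # north
                 p[2:, 1:-1] - u,    # south
                 p[1:-1, 2:] - u,    # east
                 p[1:-1, :-2] - u)   # west
        u = u + lam * sum(np.exp(-(d / kappa) ** 2) * d for d in diffs)
    return u

rng = np.random.default_rng(1)
noisy = rng.normal(0.0, 0.1, (16, 16))
smooth = perona_malik(noisy, n_iter=5)
print(smooth.var() < noisy.var())   # True: noise variance is reduced
```

The per-pixel work is exactly what the paper maps to hardware: four neighbour subtractions within the kernel, a conductance evaluation (the diffusion-coefficient function they analyze), and an accumulation, all parallelizable.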

  14. Active filtering applied to radiographic images unfolded by the Richardson-Lucy algorithm

    International Nuclear Information System (INIS)

    Almeida, Gevaldo L. de; Silvani, Maria Ines; Lopes, Ricardo T.

    2011-01-01

    Degradation of images caused by systematic uncertainties can be reduced when one knows the features of the spoiling agent. Typical uncertainties of this kind arise in radiographic images from the non-zero resolution of the detector used to acquire them and from the non-punctual character of the source employed in the acquisition, or from beam divergence when extended sources are used. Both features blur the image, which, instead of a single point, exhibits a spot with a vanishing edge, hence reproducing the point spread function (PSF) of the system. Once this spoiling function is known, an inverse-problem approach involving the inversion of matrices can be used to retrieve the original image. As these matrices are generally ill-conditioned, due to statistical fluctuations and truncation errors, iterative procedures such as the Richardson-Lucy algorithm should be applied. This algorithm has been applied in this work to unfold radiographic images acquired by transmission of thermal neutrons and gamma rays. After this procedure, the resulting images undergo an active filtering which considerably improves their final quality at a negligible cost in terms of processing time. The filter ruling the process is based on the matrix of correction factors from the last iteration of the deconvolution procedure. Synthetic images degraded with a known PSF and subjected to the same treatment have been used as a benchmark to evaluate the soundness of the developed active filtering procedure. The deconvolution and filtering algorithms have been incorporated into a Fortran program written to deal with real images, generate the synthetic ones and display both. (author)
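
The Richardson-Lucy iteration itself is compact: each pass multiplies the current estimate by the back-projected ratio of the measured image to the current blurred estimate. The sketch below assumes a symmetric, normalized PSF (so the kernel flip in the back-projection can be ignored) and does not include the active-filtering stage the paper adds.

```python
import numpy as np

def conv_same(x, k):
    """Same-size correlation with replicated edges; with a symmetric
    kernel this equals convolution."""
    kh, kw = k.shape
    p = np.pad(x, ((kh // 2, kh // 2), (kw // 2, kw // 2)), mode='edge')
    out = np.zeros_like(x, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += k[i, j] * p[i:i + x.shape[0], j:j + x.shape[1]]
    return out

def richardson_lucy(blurred, psf, n_iter=10):
    """RL iteration u <- u * conv(blurred / conv(u, psf), psf),
    assuming a symmetric, normalized PSF; a textbook sketch,
    not the authors' Fortran implementation."""
    u = np.full_like(blurred, blurred.mean(), dtype=float)
    for _ in range(n_iter):
        ratio = blurred / np.maximum(conv_same(u, psf), 1e-12)
        u = u * conv_same(ratio, psf)
    return u

delta = np.zeros((3, 3)); delta[1, 1] = 1.0       # identity PSF
img = np.random.default_rng(2).random((8, 8)) + 0.1
print(np.allclose(richardson_lucy(img, delta, n_iter=1), img))  # True
```

The multiplicative update keeps the estimate non-negative, which is why RL behaves well on the ill-conditioned systems the abstract describes.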

  15. Image reconstruction for digital breast tomosynthesis (DBT) by using projection-angle-dependent filter functions

    Energy Technology Data Exchange (ETDEWEB)

    Park, Yeonok; Park, Chulkyu; Cho, Hyosung; Je, Uikyu; Hong, Daeki; Lee, Minsik; Cho, Heemoon; Choi, Sungil; Koo, Yangseo [Yonsei University, Wonju (Korea, Republic of)

    2014-09-15

    Digital breast tomosynthesis (DBT) is considered in clinics as a standard three-dimensional imaging modality, allowing the earlier detection of cancer. It typically acquires only 10-30 projections over a limited angle range of 15°-60° with a stationary detector and typically uses a computationally-efficient filtered-backprojection (FBP) algorithm for image reconstruction. However, a common FBP algorithm yields poor image quality: the ramp filter eliminates the dc component of the image, causing a loss of the average image value, and the incomplete data produce severe image artifacts. As an alternative, iterative reconstruction methods are often used in DBT to overcome these difficulties, even though they are computationally expensive. In this study, as a compromise, we considered a projection-angle-dependent filtering method in which one-dimensional geometry-adapted filter kernels are computed with the aid of a conjugate-gradient method and are incorporated into the standard FBP framework. We implemented the proposed algorithm and performed systematic simulation work to investigate its imaging characteristics. Our results indicate that the proposed method is superior to a conventional FBP method for DBT imaging at a comparable computational cost, preserving good image homogeneity and edge sharpening with no serious image artifacts.
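
The dc loss mentioned in the abstract is easy to demonstrate: the ramp filter's frequency response |f| is zero at f = 0, so the mean of any ramp-filtered projection vanishes.

```python
import numpy as np

rng = np.random.default_rng(0)
proj = rng.random(64) + 5.0                 # a projection with a large mean
ramp = np.abs(np.fft.fftfreq(proj.size))    # |f|: the ramp-filter response
filtered = np.real(np.fft.ifft(np.fft.fft(proj) * ramp))
print(abs(filtered.mean()))                  # ~0: the DC term is removed
```

Geometry-adapted kernels of the kind the paper derives replace this fixed |f| response, which is how they avoid discarding the average image value.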

  16. Processing of a neutrographic image, using Bosso Filter

    International Nuclear Information System (INIS)

    Pereda, C.; Bustamante, M.; Henriquez, C.

    2006-01-01

    The following paper shows the results of the treatment of a neutron radiographic image, obtained in the RECH-1 experimental reactor, making use of the computational image treatment techniques of the IDL software, complemented with the Bosso filter method already tested to improve quality in medical diagnosis. These techniques possess undeniable value as an auxiliary to neutrography, as the results of this first trial with an auxiliary neutrographic image used in PGNAA show. These results suggest that this method should bring its full advantages to standard neutrographic analyses: structural images, density variations, etc.

  17. Variation of the count-dependent Metz filter with imaging system modulation transfer function

    International Nuclear Information System (INIS)

    King, M.A.; Schwinger, R.B.; Penney, B.C.

    1986-01-01

    A systematic investigation was conducted of how a number of parameters which alter the system modulation transfer function (MTF) influence the count-dependent Metz filter. Since restoration filters are most effective at those frequencies where the object power spectrum dominates that of the noise, it was observed that parameters which significantly degrade the MTF at low spatial frequencies strongly influence the formation of the Metz filter. Thus the radionuclide imaged and the depth of the source in a scattering medium had the most influence, because they alter the relative amount of scattered radiation being imaged. For low-energy photon emitters, the collimator employed and the distance from the collimator were found to have less influence, but still to be significant; these cause alterations in the MTF which are more gradual and hence most pronounced at mid to high spatial frequencies. As long as adequate spatial sampling is employed, the Metz filter was determined to be independent of the exact size of the sampling bin width, to a first approximation. For planar and single photon emission computed tomographic (SPECT) imaging, it is shown that two-dimensional filtering with the Metz filter optimized for the imaging conditions is able to deconvolve scatter and other causes of spatial resolution loss while diminishing noise, all in a balanced manner.
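
The Metz filter has the closed form M(f) = [1 − (1 − MTF(f)²)^N] / MTF(f): it approximates the inverse filter where the MTF is high and rolls off toward zero where noise dominates, with the order N being what the count level tunes. A sketch with a hypothetical exponential MTF (not taken from the paper):

```python
import numpy as np

def metz(mtf, order):
    """Metz restoration filter M = (1 - (1 - MTF^2)^N) / MTF:
    near-inverse where MTF ~ 1, suppressive where MTF is small."""
    return (1.0 - (1.0 - mtf ** 2) ** order) / mtf

f = np.linspace(0.01, 0.5, 50)     # spatial frequency axis
mtf = np.exp(-8.0 * f)             # a hypothetical system MTF
m = metz(mtf, order=10)            # higher order => more restoration
```

This dependence on the MTF is exactly why the abstract's parameters (radionuclide, source depth, collimator, distance) reshape the filter: each of them changes MTF(f), and the filter follows.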

  18. Spectral characterization in deep UV of an improved imaging KDP acousto-optic tunable filter

    International Nuclear Information System (INIS)

    Gupta, Neelam; Voloshinov, Vitaly

    2014-01-01

    Recently, we developed a number of high-quality noncollinear acousto-optic tunable filter (AOTF) cells in different birefringent materials with UV imaging capability. Cells based on a single crystal of KDP (potassium dihydrogen phosphate) had the best transmission efficiency and the optical throughput needed to acquire high-quality spectral images at wavelengths above 220 nm. One of the main limitations of these imaging filters was their small angular aperture in air, limited to about 1.0°. In this paper, we describe an improved imaging KDP AOTF operating from the deep UV to the visible region of the spectrum. The linear and angular apertures of the new filter are 10 × 10 mm² and 1.8°, respectively. The spectral tuning range is 205-430 nm with a 60 cm⁻¹ spectral resolution. We describe the filter and present experimental results on imaging using both a broadband source and a number of light-emitting diodes (LEDs) in the UV, and include the measured spectra of these LEDs obtained with a collinear SiO₂ filter-based spectrometer operating above 255 nm. (paper)

  19. Imaging reconstruction based on improved wavelet denoising combined with parallel-beam filtered back-projection algorithm

    Science.gov (United States)

    Ren, Zhong; Liu, Guodong; Huang, Zhen

    2012-11-01

    The image reconstruction is a key step in medical imaging (MI), and the performance of its algorithm determines the quality and resolution of the reconstructed image. Although other algorithms have been used, filtered back-projection (FBP) is still the classical and commonly-used algorithm in clinical MI. In the FBP algorithm, filtering of the original projection data is a key step to overcome artifacts in the reconstructed image. Since the simple use of classical filters such as the Shepp-Logan (SL) and Ram-Lak (RL) filters has drawbacks and limitations in practice, especially for projection data polluted by non-stationary random noise, an improved wavelet denoising combined with a parallel-beam FBP algorithm is used in this paper to enhance the quality of the reconstructed image. In the experiments, the reconstruction results of the improved wavelet denoising were compared with those of other methods (direct FBP, mean filter combined with FBP, and median filter combined with FBP). To determine the optimum reconstruction effect, different algorithms, and different wavelet bases combined with three filters, were tested. Experimental results show that the reconstruction effect of the improved FBP algorithm is better than that of the others. Comparing the results of the different algorithms with two evaluation standards, mean-square error (MSE) and peak signal-to-noise ratio (PSNR), it was found that the reconstruction of the improved FBP based on db2 and the Hanning filter at decomposition scale 2 was best: its MSE was lower and its PSNR higher than those of the others. Therefore, this improved FBP algorithm has potential value in medical imaging.
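
The wavelet denoising step amounts to transforming the projection data, shrinking the detail coefficients, and inverting. The paper favors db2 at decomposition scale 2; the sketch below uses a single-level Haar transform with soft thresholding purely because it is the shortest self-contained illustration of the same idea.

```python
import numpy as np

def haar_denoise(x, thr):
    """One-level Haar DWT, soft-threshold the detail band, invert.
    A minimal stand-in for the paper's multi-scale db2 scheme;
    len(x) must be even."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)    # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)    # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)  # soft threshold
    y = np.empty_like(x, dtype=float)
    y[0::2] = (a + d) / np.sqrt(2.0)
    y[1::2] = (a - d) / np.sqrt(2.0)
    return y

proj = np.sin(np.linspace(0, np.pi, 64))
print(np.allclose(haar_denoise(proj, thr=0.0), proj))  # True: thr=0 is lossless
```

Each projection row would be denoised this way before the ramp (or Hanning-windowed) filtering and backprojection of the standard FBP pipeline.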

  20. Improving Image Matching by Reducing Surface Reflections Using Polarising Filter Techniques

    Science.gov (United States)

    Conen, N.; Hastedt, H.; Kahmen, O.; Luhmann, T.

    2018-05-01

    In dense stereo matching applications surface reflections may lead to incorrect measurements and blunders in the resulting point cloud. To overcome the problem of disturbing reflections, polarising filters can be mounted on the camera lens and light source. Reflections in the images can be suppressed by crossing the polarising directions of the filters, leading to homogeneously illuminated images and better matching results. However, the filter may influence the camera's orientation parameters as well as the measuring accuracy. To quantify these effects, a calibration and an accuracy analysis were conducted within a spatial test arrangement according to the German guideline VDI/VDE 2634.1 (2002) using a DSLR with and without a polarising filter. In a second test, the interior orientation is analysed in more detail. The results do not show significant changes of the measuring accuracy in object space and only very small changes of the interior orientation (Δc ≤ 4 μm) with the polarising filter in use. Since in medical applications many tiny reflections are present and impede robust surface measurements, a prototypic trinocular endoscope is equipped with the polarising technique. The interior and relative orientation is determined and analysed. The advantage of the polarising technique for medical image matching is shown in an experiment with a moistened pig kidney. The accuracy and completeness of the resulting point cloud can be clearly improved when using polarising filters. Furthermore, an accuracy analysis using a laser triangulation system is performed and the special reflection properties of metallic surfaces are presented.

  1. IMPROVING IMAGE MATCHING BY REDUCING SURFACE REFLECTIONS USING POLARISING FILTER TECHNIQUES

    Directory of Open Access Journals (Sweden)

    N. Conen

    2018-05-01

    Full Text Available In dense stereo matching applications surface reflections may lead to incorrect measurements and blunders in the resulting point cloud. To overcome the problem of disturbing reflections, polarising filters can be mounted on the camera lens and light source. Reflections in the images can be suppressed by crossing the polarising directions of the filters, leading to homogeneously illuminated images and better matching results. However, the filter may influence the camera's orientation parameters as well as the measuring accuracy. To quantify these effects, a calibration and an accuracy analysis were conducted within a spatial test arrangement according to the German guideline VDI/VDE 2634.1 (2002) using a DSLR with and without a polarising filter. In a second test, the interior orientation is analysed in more detail. The results do not show significant changes of the measuring accuracy in object space and only very small changes of the interior orientation (Δc ≤ 4 μm) with the polarising filter in use. Since in medical applications many tiny reflections are present and impede robust surface measurements, a prototypic trinocular endoscope is equipped with the polarising technique. The interior and relative orientation is determined and analysed. The advantage of the polarising technique for medical image matching is shown in an experiment with a moistened pig kidney. The accuracy and completeness of the resulting point cloud can be clearly improved when using polarising filters. Furthermore, an accuracy analysis using a laser triangulation system is performed and the special reflection properties of metallic surfaces are presented.

  2. Computationally efficient video restoration for Nyquist sampled imaging sensors combining an affine-motion-based temporal Kalman filter and adaptive Wiener filter.

    Science.gov (United States)

    Rucci, Michael; Hardie, Russell C; Barnard, Kenneth J

    2014-05-01

    In this paper, we present a computationally efficient video restoration algorithm to address both blur and noise for a Nyquist sampled imaging system. The proposed method utilizes a temporal Kalman filter followed by a correlation-model based spatial adaptive Wiener filter (AWF). The Kalman filter employs an affine background motion model and novel process-noise variance estimate. We also propose and demonstrate a new multidelay temporal Kalman filter designed to more robustly treat local motion. The AWF is a spatial operation that performs deconvolution and adapts to the spatially varying residual noise left in the Kalman filter stage. In image areas where the temporal Kalman filter is able to provide significant noise reduction, the AWF can be aggressive in its deconvolution. In other areas, where less noise reduction is achieved with the Kalman filter, the AWF balances the deconvolution with spatial noise reduction. In this way, the Kalman filter and AWF work together effectively, but without the computational burden of full joint spatiotemporal processing. We also propose a novel hybrid system that combines a temporal Kalman filter and BM3D processing. To illustrate the efficacy of the proposed methods, we test the algorithms on both simulated imagery and video collected with a visible camera.
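    The temporal Kalman stage above can be illustrated with a minimal per-pixel sketch. It assumes a static scene (no affine motion compensation) and hand-picked noise variances `q` and `r`; it shows the recursion only, not the authors' full algorithm:

```python
import numpy as np

def temporal_kalman(frames, q=1e-3, r=1e-1):
    """Per-pixel scalar Kalman filter along the time axis.

    frames : (T, H, W) stack of noisy frames. A static scene and
    identity dynamics are assumed (the paper instead uses an affine
    background motion model); q and r are assumed process- and
    measurement-noise variances.
    """
    x = frames[0].astype(float)      # per-pixel state estimate
    p = np.ones_like(x)              # per-pixel estimate variance
    out = [x.copy()]
    for z in frames[1:]:
        p = p + q                    # predict (state carried over)
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)          # correct with the new frame
        p = (1.0 - k) * p
        out.append(x.copy())
    return np.stack(out)
```

    In the paper this temporal stage is followed by the adaptive Wiener filter, which deconvolves more aggressively in areas where the Kalman stage has already removed most of the noise.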

  3. Bowtie filter and water calibration in the improvement of cone beam CT image quality

    International Nuclear Information System (INIS)

    Li Minghui; Dai Jianrong; Zhang Ke

    2010-01-01

Objective: To evaluate the improvement of cone beam CT (CBCT) image quality by using a bowtie filter (F1) and water calibration. Methods: First, multi-level gain calibration of the detector panel with the Cal2 calibration method was performed, and CT images of CATPHAN503 were collected with the F0 and bowtie filters, respectively. Then the detector panel was calibrated using the water calibration kit, and images were acquired again. Finally, the change of image quality after using F1 and (or) the water calibration method was observed. The observed indexes included low contrast visibility, spatial uniformity, ring artifacts, spatial resolution and geometric accuracy. Results: Compared with the traditional combination of the F0 filter and Cal2 calibration, the combination of the bowtie filter F1 and water calibration improves low contrast visibility by 13.71% and spatial uniformity by 54.42%. Water calibration removes ring artifacts effectively. However, neither of them improves spatial resolution or geometric accuracy. Conclusions: The combination of F1 and water calibration improves CBCT image quality effectively. This improvement aids the registration of CBCT images and localization images. (authors)

  4. Digital filtering and reconstruction of coded aperture images

    International Nuclear Information System (INIS)

    Tobin, K.W. Jr.

    1987-01-01

The real-time neutron radiography facility at the University of Virginia has been used for both transmission radiography and computed tomography. Recently, a coded aperture system has been developed to permit the extraction of three-dimensional information from a low intensity field of radiation scattered by an extended object. Short-wavelength radiations (e.g. neutrons) are not easily imaged because of the difficulties in achieving diffraction and refraction with a conventional lens imaging system. By using a coded aperture approach, an imaging system has been developed that records and reconstructs an object from an intensity distribution. This system has a signal-to-noise ratio that is proportional to the total open area of the aperture, making it ideal for imaging with a limited-intensity radiation field. The main goal of this research was to develop and implement the digital methods and theory necessary for the reconstruction process. Several real-time video systems, attached to an Intellect-100 image processor, a DEC PDP-11 micro-computer, and a Convex-1 parallel processing mainframe were employed. This system, coupled with theoretical extensions and improvements, allowed for retrieval of information previously unobtainable by earlier optical methods. The effects of thermal noise, shot noise, and aperture-related artifacts were examined so that new digital filtering techniques could be constructed and implemented. Results of image data filtering prior to and following the reconstruction process are reported. Improvements related to the different signal processing methods are emphasized. The application and advantages of this imaging technique in the field of non-destructive testing are also discussed.

  5. 3D early embryogenesis image filtering by nonlinear partial differential equations.

    Science.gov (United States)

    Krivá, Z; Mikula, K; Peyriéras, N; Rizzi, B; Sarti, A; Stasová, O

    2010-08-01

We present nonlinear diffusion equations, numerical schemes to solve them and their application for filtering 3D images obtained from laser scanning microscopy (LSM) of living zebrafish embryos, with the goal of identifying the optimal filtering method and its parameters. In large scale applications dealing with the analysis of 3D+time embryogenesis images, an important objective is the correct detection of the number and position of cell nuclei yielding the spatio-temporal cell lineage tree of embryogenesis. The filtering is the first and necessary step of the image analysis chain and must lead to correct results, removing the noise, sharpening the nuclei edges and correcting the acquisition errors related to spuriously connected subregions. In this paper we study such properties for the regularized Perona-Malik model and for the generalized mean curvature flow equations in the level-set formulation. A comparison with other nonlinear diffusion filters, like tensor anisotropic diffusion and Beltrami flow, is also included. All numerical schemes are based on the same discretization principles, i.e. the finite volume method in space and a semi-implicit scheme in time, for solving nonlinear partial differential equations. These numerical schemes are unconditionally stable, fast and naturally parallelizable. The filtering results are evaluated and compared first using the mean Hausdorff distance between a gold standard and different isosurfaces of original and filtered data. Then, the number of isosurface connected components in a region of interest (ROI) detected in the original data and after the filtering is compared with the corresponding correct number of nuclei in the gold standard. Such analysis proves the robustness and reliability of edge preserving nonlinear diffusion filtering for this type of data and leads to finding the optimal filtering parameters for the studied models and numerical schemes.
Further comparisons consist in the ability of splitting the very close objects which
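    For illustration, the Perona-Malik model studied above can be sketched with a simple explicit scheme. The paper uses regularised models with semi-implicit finite-volume schemes in 3D; the explicit 2D form, the periodic boundaries and the parameter values below are simplifications:

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.1, dt=0.2):
    """Explicit-time Perona-Malik diffusion in 2D.

    g(s) = 1 / (1 + (s/kappa)^2) is the edge-stopping function, so
    diffusion is strong in flat (noisy) regions and nearly stops
    across strong edges. Periodic boundaries via np.roll, for brevity.
    """
    g = lambda d: 1.0 / (1.0 + (d / kappa) ** 2)
    u = img.astype(float).copy()
    for _ in range(n_iter):
        dn = np.roll(u, -1, axis=0) - u   # differences to the four
        ds = np.roll(u, 1, axis=0) - u    # nearest neighbours
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        u = u + dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u
```

    With dt ≤ 0.25 and g ≤ 1 the explicit update is stable; the semi-implicit schemes of the paper remove this time-step restriction.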

  6. The HURRA filter: An easy method to eliminate collimator artifacts in high-energy gamma camera images.

    Science.gov (United States)

    Perez-Garcia, H; Barquero, R

The correct determination and delineation of tumor/organ size is crucial in 2-D imaging in 131I therapy. These images are usually obtained using a system composed of a gamma camera and a high-energy collimator, although the system can produce artifacts in the image. This article analyses these artifacts and describes a correction filter that can eliminate them. Using free software, ImageJ, a central profile in the image is obtained and analyzed. Two components can be seen in the fluctuation of the profile: one associated with the stochastic nature of the radiation plus electronic noise, and the other varying periodically with spatial position due to the collimator. These frequencies are analytically obtained and compared with the frequencies in the Fourier transform of the profile. A specially developed filter removes the artifacts in the 2D Fourier transform of the DICOM image. This filter is tested using an image of a 15-cm-diameter Petri dish with 131I radioactive water (large object size), an image of a 131I clinical pill (small object size), and images of the remainder of the lesion of two patients treated with 3.7 GBq (100 mCi) and 4.44 GBq (120 mCi) of 131I, respectively, after thyroidectomy. The artifact is due to the hexagonal periodic structure of the collimator. The use of the filter on large-sized images reduces the fluctuation by 5.8-3.5%. In small-sized images, the FWHM can be determined in the filtered image, while this is impossible in the unfiltered image. The definition of the tumor boundary and the visualization of the activity distribution inside patient lesions improve drastically when the filter is applied to the corresponding images obtained with a HE gamma camera. The HURRA filter removes high-energy collimator artifacts in planar images obtained with a gamma camera without reducing the image resolution. It can be applied in any patient quantification study because the number of counts remains invariant.
The filter makes
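    The core idea, zeroing the analytically known collimator frequencies in the 2D Fourier transform while leaving the DC term (and hence the total counts) untouched, can be sketched as follows. The function name and interface are illustrative, not the published HURRA implementation:

```python
import numpy as np

def notch_filter(img, freqs):
    """Suppress known periodic artifact frequencies in an image.

    freqs : list of (fy, fx) integer frequency indices to zero out.
    Each bin and its conjugate are removed so the output stays real;
    the DC bin is untouched, so the total number of counts is
    preserved, as stated in the abstract.
    """
    F = np.fft.fft2(img)
    n0, n1 = img.shape
    for fy, fx in freqs:
        F[fy % n0, fx % n1] = 0.0
        F[-fy % n0, -fx % n1] = 0.0   # conjugate bin
    return np.real(np.fft.ifft2(F))
```

    In practice a small neighbourhood around each analytic frequency would be notched to tolerate leakage; a single bin suffices for the exact-frequency case shown here.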

  7. Eigenimage filtering of nuclear medicine image sequences

    International Nuclear Information System (INIS)

    Windham, J.P.; Froelich, J.W.; Abd-Allah, M.

    1985-01-01

In many nuclear medicine imaging sequences the localization of radioactivity in organs other than the target organ interferes with imaging of the desired anatomical structure or physiological process. A filtering technique has been developed which suppresses the interfering process while enhancing the desired process. This technique requires the identification of temporal sequential signatures for both the interfering and desired processes. These signatures are placed in the form of signature vectors. Signature matrices, M_D and M_U, are formed by taking the outer product expansion of the temporal signature vectors for the desired and interfering processes, respectively. By using the transformation from the simultaneous diagonalization of these two signature matrices, a weighting vector is obtained. The technique is shown to maximize the projection of the desired process while minimizing the interfering process, based upon an extension of Rayleigh's Principle. The technique is demonstrated for first-pass renal and cardiac flow studies. This filter offers a potential for simplifying and extending the accuracy of diagnostic nuclear medicine procedures.
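    The weighting vector described above is the top generalized eigenvector of the signature-matrix pair, i.e. a Rayleigh-quotient maximizer. A sketch, with a small ridge term added because the rank-one interfering-signature matrix is singular (the regularisation is an assumption of this sketch, not part of the published method):

```python
import numpy as np

def eigenimage_weights(d, u, eps=1e-6):
    """Weight vector maximizing (w^T M_D w) / (w^T M_U w),
    with M_D = d d^T and M_U = u u^T built from the temporal
    signature vectors d (desired) and u (interfering).

    eps : assumed ridge term, since the rank-one M_U is singular.
    """
    M_D = np.outer(d, d)
    M_U = np.outer(u, u) + eps * np.eye(len(u))
    # generalized symmetric eigenproblem via whitening of M_U
    L = np.linalg.cholesky(M_U)
    Linv = np.linalg.inv(L)
    vals, vecs = np.linalg.eigh(Linv @ M_D @ Linv.T)
    w = Linv.T @ vecs[:, -1]          # top generalized eigenvector
    return w / np.linalg.norm(w)
```

    Applying w along the time axis of the image sequence then suppresses the interfering temporal signature while preserving the desired one.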

  8. Improvement of natural image search engines results by emotional filtering

    Directory of Open Access Journals (Sweden)

    Patrice Denis

    2016-04-01

Full Text Available With the Internet 2.0 era, managing user emotions is a problem that more and more actors are interested in. Historically, the first notions of emotion sharing were expressed and defined with emoticons. They allowed users to show their emotional status to others in an impersonal and emotionless digital world. Now, in the Internet of social media, every day users share lots of content with each other on Facebook, Twitter, Google+ and so on. Several new popular web sites like FlickR, Picassa, Pinterest, Instagram or DeviantArt are now specifically based on sharing image content as well as personal emotional status. This kind of information is economically very valuable, as it can for instance help commercial companies sell more efficiently. In fact, with this kind of emotional information, companies can better target their customers' needs and even sell them more products. Research has been and still is interested in mining emotional information from user data. In this paper, we focus on the impact of emotions in images collected from image search engines. More specifically, we propose a filtering layer applied to the results of such image search engines. To our knowledge, this is the first attempt to filter image search engine results with an emotional filtering approach.

  9. Adaptive iterated function systems filter for images highly corrupted with fixed - Value impulse noise

    Science.gov (United States)

    Shanmugavadivu, P.; Eliahim Jeevaraj, P. S.

    2014-06-01

The Adaptive Iterated Function Systems (AIFS) filter presented in this paper has an outstanding potential to attenuate fixed-value impulse noise in images. This filter has two distinct phases, namely noise detection and noise correction, which use measures of statistics and Iterated Function Systems (IFS) respectively. The performance of the AIFS filter is assessed by three metrics, namely Peak Signal-to-Noise Ratio (PSNR), Mean Structural Similarity Index Measure (MSSIM) and Human Visual Perception (HVP). The quantitative measures PSNR and MSSIM endorse the merit of this filter in terms of degree of noise suppression and details/edge preservation respectively, in comparison with the high performing filters reported in the recent literature. The qualitative measure HVP confirms the noise suppression ability of the devised filter. This computationally simple noise filter broadly finds application wherever images are highly degraded by fixed-value impulse noise.
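    The two-phase structure (detect, then correct) can be sketched for the fixed-value case as below. Detection by extreme value and correction by the median of uncorrupted neighbours are generic stand-ins for the paper's statistical detector and IFS-based correction:

```python
import numpy as np

def remove_fixed_impulse(img, low=0, high=255):
    """Two-phase impulse noise filter (illustrative stand-in).

    Phase 1 (detection): pixels equal to the fixed impulse values
    `low` or `high` are flagged as noisy.
    Phase 2 (correction): each flagged pixel is replaced by the
    median of its uncorrupted 3x3 neighbours.
    """
    out = img.astype(float).copy()
    noisy = (img == low) | (img == high)
    H, W = img.shape
    for y, x in zip(*np.nonzero(noisy)):
        y0, y1 = max(0, y - 1), min(H, y + 2)
        x0, x1 = max(0, x - 1), min(W, x + 2)
        win = img[y0:y1, x0:x1]
        good = win[(win != low) & (win != high)]
        if good.size:                 # leave pixel if no clean neighbour
            out[y, x] = np.median(good)
    return out
```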

  10. Pornographic image recognition and filtering using incremental learning in compressed domain

    Science.gov (United States)

    Zhang, Jing; Wang, Chao; Zhuo, Li; Geng, Wenhao

    2015-11-01

    With the rapid development and popularity of the network, the openness, anonymity, and interactivity of networks have led to the spread and proliferation of pornographic images on the Internet, which have done great harm to adolescents' physical and mental health. With the establishment of image compression standards, pornographic images are mainly stored with compressed formats. Therefore, how to efficiently filter pornographic images is one of the challenging issues for information security. A pornographic image recognition and filtering method in the compressed domain is proposed by using incremental learning, which includes the following steps: (1) low-resolution (LR) images are first reconstructed from the compressed stream of pornographic images, (2) visual words are created from the LR image to represent the pornographic image, and (3) incremental learning is adopted to continuously adjust the classification rules to recognize the new pornographic image samples after the covering algorithm is utilized to train and recognize the visual words in order to build the initial classification model of pornographic images. The experimental results show that the proposed pornographic image recognition method using incremental learning has a higher recognition rate as well as costing less recognition time in the compressed domain.

  11. Methods of filtering the graph images of the functions

    Directory of Open Access Journals (Sweden)

    Олександр Григорович Бурса

    2017-06-01

Full Text Available The theoretical aspects of cleaning raster images of scanned graphs of functions from digital, chromatic and luminance distortions by using computer graphics techniques have been considered. The basic types of distortions characteristic of graph images of functions are stated. To suppress the distortions, several methods providing high quality of the resulting images and preserving their topological features are suggested. The paper describes the techniques developed and improved by the authors: a method of cleaning the image of distortions by means of iterative contrasting, based on a step-by-step increase of the image contrast in the graph by 1%; a method of restoring distorted small entities, based on thinning the kernel of the known contrast-enhancement convolution filter (the allowable kernel dilution radii that preserve the graph lines have been established); and a technique integrating the contrast-based noise reduction method and the small-entity restoration method with the known σ-filter. Each method in the complex has been theoretically substantiated. The developed methods treat graph images both as a whole (global processing) and as fragments (local processing). Metrics assessing the quality of the resulting image under global and local processing have been chosen; the choice is substantiated and the formulas are given. The proposed complex of methods for cleaning graph images of functions from grayscale distortions is adaptive to the form of the image carrier, the distortion level in the image and its distribution. The presented results of testing the developed complex of methods on a representative sample of images confirm its effectiveness.

  12. Image enhancement filters significantly improve reading performance for low vision observers

    Science.gov (United States)

    Lawton, T. B.

    1992-01-01

As people age, so do their photoreceptors; many photoreceptors in central vision stop functioning when a person reaches their late sixties or early seventies. Low vision observers with losses in central vision, those with age-related maculopathies, were studied. Low vision observers no longer see high spatial frequencies, being unable to resolve fine edge detail. We developed image enhancement filters to compensate for the low vision observer's losses in contrast sensitivity to intermediate and high spatial frequencies. The filters work by boosting the amplitude of the less visible intermediate spatial frequencies relative to the lower spatial frequencies. These image enhancement filters not only reduce the magnification needed for reading by up to 70 percent, but they also increase the observer's reading speed by 2-4 times. A summary of this research is presented.
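    A frequency-domain sketch of such a filter: amplitudes in an intermediate spatial-frequency band are boosted relative to the low frequencies. The band limits and gain below are illustrative values, not those used in the study:

```python
import numpy as np

def boost_band(img, f_lo=0.1, f_hi=0.4, gain=2.0):
    """Amplify an intermediate spatial-frequency band.

    Frequencies (in cycles/pixel) with radial frequency in
    [f_lo, f_hi] are multiplied by `gain`; all others pass unchanged.
    """
    F = np.fft.fft2(img)
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    r = np.hypot(fy, fx)              # radial spatial frequency
    g = np.where((r >= f_lo) & (r <= f_hi), gain, 1.0)
    return np.real(np.fft.ifft2(F * g))
```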

  13. Median Filter Noise Reduction of Image and Backpropagation Neural Network Model for Cervical Cancer Classification

    Science.gov (United States)

    Wutsqa, D. U.; Marwah, M.

    2017-06-01

    In this paper, we consider spatial operation median filter to reduce the noise in the cervical images yielded by colposcopy tool. The backpropagation neural network (BPNN) model is applied to the colposcopy images to classify cervical cancer. The classification process requires an image extraction by using a gray level co-occurrence matrix (GLCM) method to obtain image features that are used as inputs of BPNN model. The advantage of noise reduction is evaluated by comparing the performances of BPNN models with and without spatial operation median filter. The experimental result shows that the spatial operation median filter can improve the accuracy of the BPNN model for cervical cancer classification.
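    The GLCM feature-extraction step can be sketched as follows, for a single horizontal displacement and two of the commonly used features. The displacement, number of grey levels and feature set are assumptions of this sketch; the paper does not specify them here:

```python
import numpy as np

def glcm_features(img, levels=8):
    """Grey-level co-occurrence matrix for displacement (0, 1),
    normalised to a joint probability, plus the contrast and energy
    features often fed to a classifier such as a BPNN.
    """
    # quantise intensities to `levels` grey levels
    q = np.floor(img.astype(float) / (img.max() + 1) * levels).astype(int)
    a, b = q[:, :-1].ravel(), q[:, 1:].ravel()   # horizontal pairs
    P = np.zeros((levels, levels))
    np.add.at(P, (a, b), 1)
    P /= P.sum()
    i, j = np.indices(P.shape)
    contrast = np.sum(P * (i - j) ** 2)          # local variation
    energy = np.sum(P ** 2)                      # uniformity
    return P, contrast, energy
```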

  14. Use of metameric filters for future interference security image structures

    Science.gov (United States)

    Baloukas, Bill; Larouche, Stéphane; Martinu, Ludvik

    2006-02-01

    In the present work, we describe innovative approaches and properties that can be added to the already popular thin film optically variable devices (OVD) used on banknotes. We show two practical examples of OVDs, namely (i) a pair of metameric filters offering a hidden image effect as a function of the angle of observation as well as a specific spectral property permitting automatic note readability, and (ii) multi-material filters offering a side-dependent color shift. We first describe the design approach of these new devices followed by their sensitivity to deposition errors especially in the case of the metameric filters where slight thickness variations have a significant effect on the obtained colors. The performance of prototype filters prepared by dual ion beam sputtering (DIBS) is shown.

  15. Median filters as a tool to determine dark noise thresholds in high resolution smartphone image sensors for scientific imaging

    Science.gov (United States)

    Igoe, Damien P.; Parisi, Alfio V.; Amar, Abdurazaq; Rummenie, Katherine J.

    2018-01-01

    An evaluation of the use of median filters in the reduction of dark noise in smartphone high resolution image sensors is presented. The Sony Xperia Z1 employed has a maximum image sensor resolution of 20.7 Mpixels, with each pixel having a side length of just over 1 μm. Due to the large number of photosites, this provides an image sensor with very high sensitivity but also makes them prone to noise effects such as hot-pixels. Similar to earlier research with older models of smartphone, no appreciable temperature effects were observed in the overall average pixel values for images taken in ambient temperatures between 5 °C and 25 °C. In this research, hot-pixels are defined as pixels with intensities above a specific threshold. The threshold is determined using the distribution of pixel values of a set of images with uniform statistical properties associated with the application of median-filters of increasing size. An image with uniform statistics was employed as a training set from 124 dark images, and the threshold was determined to be 9 digital numbers (DN). The threshold remained constant for multiple resolutions and did not appreciably change even after a year of extensive field use and exposure to solar ultraviolet radiation. Although the temperature effects' uniformity masked an increase in hot-pixel occurrences, the total number of occurrences represented less than 0.1% of the total image. Hot-pixels were removed by applying a median filter, with an optimum filter size of 7 × 7; similar trends were observed for four additional smartphone image sensors used for validation. Hot-pixels were also reduced by decreasing image resolution. The method outlined in this research provides a methodology to characterise the dark noise behavior of high resolution image sensors for use in scientific investigations, especially as pixel sizes decrease.
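    The detection rule can be sketched directly from the reported values (a 9 DN threshold and a 7 × 7 median window): flag pixels of a dark frame whose value exceeds the local median by more than the threshold. The implementation details below are illustrative:

```python
import numpy as np

def hot_pixels(dark, threshold=9, k=7):
    """Flag hot-pixels in a dark frame.

    A pixel is 'hot' when it exceeds the median of its k x k
    neighbourhood by more than `threshold` digital numbers (DN).
    Defaults follow the 9 DN threshold and 7x7 window of the abstract.
    """
    H, W = dark.shape
    r = k // 2
    pad = np.pad(dark.astype(float), r, mode='edge')
    med = np.empty((H, W))
    for y in range(H):                 # brute-force local median
        for x in range(W):
            med[y, x] = np.median(pad[y:y + k, x:x + k])
    return (dark - med) > threshold
```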

  16. Superresolution restoration of an image sequence: adaptive filtering approach.

    Science.gov (United States)

    Elad, M; Feuer, A

    1999-01-01

    This paper presents a new method based on adaptive filtering theory for superresolution restoration of continuous image sequences. The proposed methodology suggests least squares (LS) estimators which adapt in time, based on adaptive filters, least mean squares (LMS) or recursive least squares (RLS). The adaptation enables the treatment of linear space and time-variant blurring and arbitrary motion, both of them assumed known. The proposed new approach is shown to be of relatively low computational requirements. Simulations demonstrating the superresolution restoration algorithms are presented.
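    The underlying least-mean-squares update is the standard one; the 1-D sketch below shows only the adaptation mechanism. The paper applies it with space- and time-variant blur and arbitrary motion, neither of which is modelled here:

```python
import numpy as np

def lms(d, x, n_taps=4, mu=0.05):
    """Basic LMS adaptive FIR filter.

    The weight vector w is updated each sample so that w . x[n]
    tracks the desired signal d[n]; mu is the step size.
    """
    w = np.zeros(n_taps)
    y = np.zeros(len(d))
    for n in range(n_taps - 1, len(d)):
        xn = x[n - n_taps + 1:n + 1][::-1]   # [x[n], x[n-1], ...]
        y[n] = w @ xn                        # filter output
        w += mu * (d[n] - y[n]) * xn         # gradient-descent update
    return y, w
```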

  17. Energy-filtered Photoelectron Emission Microscopy (EF-PEEM) for imaging nanoelectronic materials

    International Nuclear Information System (INIS)

    Renault, Olivier; Chabli, Amal

    2007-01-01

Photoelectron Emission Microscopy (PEEM) is the most promising approach to photoemission-based (XPS, UPS) imaging techniques with high lateral resolution, typically below 100 nm. It has now reached maturity with a new generation of instruments with energy-filtering capabilities. Therefore, UPS and XPS imaging with energy-filtered PEEM (EF-PEEM) can be applied to technologically relevant samples. UPS images with contrast in local work function, obtained with laboratory UV sources, are acquired in an ultra-high vacuum environment with lateral resolutions better than 50 nm and sensitivities of 20 meV. XPS images with elemental and bonding state contrast can show lateral resolutions better than 200 nm with synchrotron excitation. In this paper, we present the principles and capabilities of EF-PEEM and nanospectroscopy. Then, we focus on an example application: non-destructive work-function imaging of polycrystalline copper for advanced interconnects, where it is shown that EF-PEEM is an alternative to Kelvin probes.

  18. Enhancement of noisy EDX HRSTEM spectrum-images by combination of filtering and PCA.

    Science.gov (United States)

    Potapov, Pavel; Longo, Paolo; Okunishi, Eiji

    2017-05-01

STEM spectrum-imaging with collection of the EDX signal is considered in view of extracting maximum information from very noisy data. It is emphasized that spectrum-images with a weak EDX signal often suffer from information loss in the course of PCA treatment. The loss occurs when the level of random noise exceeds a certain threshold. Weighted PCA, though potentially helpful in isolating meaningful variations from noise, might provoke a complete loss of information in the situation of a weak EDX signal. Filtering datasets prior to PCA can improve the situation and recover the lost information. In particular, Gaussian kernel filters are found to be efficient. A new filter useful in the case of sparse atomic-resolution EDX spectrum-images is suggested. Copyright © 2017 Elsevier Ltd. All rights reserved.
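    The PCA truncation step itself can be sketched as follows. This is plain, unweighted PCA on an unfolded spectrum-image, without the pre-filtering the paper recommends:

```python
import numpy as np

def pca_denoise(data, n_comp):
    """Truncated-PCA reconstruction of a spectrum-image.

    data : (n_pixels, n_channels) matrix (the spectrum-image unfolded
    so each row is one spectrum). Keeping only the first `n_comp`
    principal components discards the noise-dominated remainder.
    """
    mean = data.mean(axis=0)
    X = data - mean
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[n_comp:] = 0.0                  # drop trailing components
    return (U * s) @ Vt + mean
```

    The information loss discussed in the abstract corresponds to meaningful variance falling below the retained-component cut-off, which is why pre-filtering before this step can matter.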

  19. The use of the Kalman filter in the automated segmentation of EIT lung images

    International Nuclear Information System (INIS)

    Zifan, A; Chapman, B E; Liatsis, P

    2013-01-01

    In this paper, we present a new pipeline for the fast and accurate segmentation of impedance images of the lungs using electrical impedance tomography (EIT). EIT is an emerging, promising, non-invasive imaging modality that produces real-time, low spatial but high temporal resolution images of impedance inside a body. Recovering impedance itself constitutes a nonlinear ill-posed inverse problem, therefore the problem is usually linearized, which produces impedance-change images, rather than static impedance ones. Such images are highly blurry and fuzzy along object boundaries. We provide a mathematical reasoning behind the high suitability of the Kalman filter when it comes to segmenting and tracking conductivity changes in EIT lung images. Next, we use a two-fold approach to tackle the segmentation problem. First, we construct a global lung shape to restrict the search region of the Kalman filter. Next, we proceed with augmenting the Kalman filter by incorporating an adaptive foreground detection system to provide the boundary contours for the Kalman filter to carry out the tracking of the conductivity changes as the lungs undergo deformation in a respiratory cycle. The proposed method has been validated by using performance statistics such as misclassified area, and false positive rate, and compared to previous approaches. The results show that the proposed automated method can be a fast and reliable segmentation tool for EIT imaging. (paper)

  20. The use of the Kalman filter in the automated segmentation of EIT lung images.

    Science.gov (United States)

    Zifan, A; Liatsis, P; Chapman, B E

    2013-06-01

    In this paper, we present a new pipeline for the fast and accurate segmentation of impedance images of the lungs using electrical impedance tomography (EIT). EIT is an emerging, promising, non-invasive imaging modality that produces real-time, low spatial but high temporal resolution images of impedance inside a body. Recovering impedance itself constitutes a nonlinear ill-posed inverse problem, therefore the problem is usually linearized, which produces impedance-change images, rather than static impedance ones. Such images are highly blurry and fuzzy along object boundaries. We provide a mathematical reasoning behind the high suitability of the Kalman filter when it comes to segmenting and tracking conductivity changes in EIT lung images. Next, we use a two-fold approach to tackle the segmentation problem. First, we construct a global lung shape to restrict the search region of the Kalman filter. Next, we proceed with augmenting the Kalman filter by incorporating an adaptive foreground detection system to provide the boundary contours for the Kalman filter to carry out the tracking of the conductivity changes as the lungs undergo deformation in a respiratory cycle. The proposed method has been validated by using performance statistics such as misclassified area, and false positive rate, and compared to previous approaches. The results show that the proposed automated method can be a fast and reliable segmentation tool for EIT imaging.
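    The tracking step can be illustrated with a standard constant-velocity Kalman filter for a single contour point. The paper's shape constraints, adaptive foreground detector and measurement-noise switching are not reproduced; all parameters below are assumptions:

```python
import numpy as np

def track(measurements, q=1e-4, r=1e-2):
    """Constant-velocity Kalman tracker for one 2-D point.

    State is (x, y, vx, vy); `measurements` is a (T, 2) array of
    noisy positions (e.g. from a foreground detector).
    """
    F = np.eye(4)
    F[0, 2] = F[1, 3] = 1.0                     # x += vx, y += vy
    H = np.zeros((2, 4))
    H[0, 0] = H[1, 1] = 1.0                     # observe position only
    Q, R = q * np.eye(4), r * np.eye(2)
    x = np.array([*measurements[0], 0.0, 0.0])
    P = np.eye(4)
    out = [x[:2].copy()]
    for z in measurements[1:]:
        x = F @ x                               # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                     # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
        x = x + K @ (z - H @ x)                 # correct
        P = (np.eye(4) - K @ H) @ P
        out.append(x[:2].copy())
    return np.array(out)
```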

  1. Brain MR Image Restoration Using an Automatic Trilateral Filter With GPU-Based Acceleration.

    Science.gov (United States)

    Chang, Herng-Hua; Li, Cheng-Yuan; Gallogly, Audrey Haihong

    2018-02-01

Noise reduction in brain magnetic resonance (MR) images has been a challenging and demanding task. This study develops a new trilateral filter that aims to achieve robust and efficient image restoration. Extended from the bilateral filter, the proposed algorithm contains one additional intensity similarity function, which compensates for the unique characteristics of noise in brain MR images. An entropy function adaptive to intensity variations is introduced to regulate the contributions of the weighting components. To hasten the computation, parallel computing based on the graphics processing unit (GPU) strategy is explored, with emphasis on memory allocations and thread distributions. To automate the filtration, image texture feature analysis associated with machine learning is investigated. Among the 98 candidate features, the sequential forward floating selection scheme is employed to acquire the optimal texture features for regularization. Subsequently, a two-stage classifier that consists of support vector machines and artificial neural networks is established to predict the filter parameters for automation. A speedup gain of 757 was reached in processing an entire MR image volume of 256 × 256 × 256 pixels, which completed within 0.5 s. Automatic restoration results revealed high accuracy with an ensemble average relative error of 0.53 ± 0.85% in terms of the peak signal-to-noise ratio. This self-regulating trilateral filter outperformed many state-of-the-art noise reduction methods both qualitatively and quantitatively. We believe that this new image restoration algorithm has potential in many brain MR image processing applications that require expedition and automation.
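    The bilateral filter that the proposed trilateral filter extends combines a spatial closeness weight with an intensity similarity weight. A plain 2-D sketch, without the paper's additional similarity function, entropy regulation or GPU acceleration:

```python
import numpy as np

def bilateral(img, k=5, sigma_s=2.0, sigma_r=0.1):
    """Plain bilateral filter (the base the trilateral filter extends).

    Each output pixel is a weighted mean over a k x k window; the
    weight is the product of a Gaussian in spatial distance (sigma_s)
    and a Gaussian in intensity difference (sigma_r), so averaging
    stops across strong edges.
    """
    r = k // 2
    pad = np.pad(img.astype(float), r, mode='edge')
    yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
    ws = np.exp(-(yy ** 2 + xx ** 2) / (2 * sigma_s ** 2))  # spatial
    out = np.empty_like(img, dtype=float)
    H, W = img.shape
    for y in range(H):
        for x in range(W):
            win = pad[y:y + k, x:x + k]
            wr = np.exp(-(win - img[y, x]) ** 2 / (2 * sigma_r ** 2))
            w = ws * wr
            out[y, x] = (w * win).sum() / w.sum()
    return out
```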

  2. Image Denoising Using Interquartile Range Filter with Local Averaging

    OpenAIRE

    Jassim, Firas Ajil

    2013-01-01

Image denoising is one of the fundamental problems in image processing. In this paper, a novel approach to suppress noise from the image is conducted by applying the interquartile range (IQR), which is one of the statistical methods used to detect outliers in a dataset. A window of size k×k was implemented to support the IQR filter. Each pixel outside the IQR range of the k×k window is treated as a noisy pixel. The estimates of the noisy pixels were obtained by local averaging. The essential...
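    A sketch of the approach as described: pixels outside the IQR fences of their k×k window are treated as noisy and replaced by the local average of the inlier pixels. The 1.5 × IQR fence constant is the conventional outlier rule and an assumption here, since the abstract does not fix it:

```python
import numpy as np

def iqr_filter(img, k=3):
    """IQR-based denoising with local averaging.

    For each pixel, compute the quartiles of its k x k window; if the
    pixel lies outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR] it is treated as
    noisy and replaced by the mean of the window's inlier pixels.
    """
    r = k // 2
    pad = np.pad(img.astype(float), r, mode='edge')
    out = img.astype(float).copy()
    H, W = img.shape
    for y in range(H):
        for x in range(W):
            win = pad[y:y + k, x:x + k].ravel()
            q1, q3 = np.percentile(win, [25, 75])
            lo, hi = q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1)
            if not (lo <= out[y, x] <= hi):
                inliers = win[(win >= lo) & (win <= hi)]
                if inliers.size:
                    out[y, x] = inliers.mean()   # local averaging
    return out
```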

  3. Mammographic image enhancement using wavelet transform and homomorphic filter

    Directory of Open Access Journals (Sweden)

    F Majidi

    2015-12-01

    Full Text Available Mammography is the most effective method for the early diagnosis of breast cancer diseases. As mammographic images contain low signal to noise ratio and low contrast, it becomes too difficult for radiologists to analyze mammogram. To deal with the above stated problems, it is very important to enhance the mammographic images using image processing methods. This paper introduces a new image enhancement approach for mammographic images which uses the modified mathematical morphology, wavelet transform and homomorphic filter to suppress the noise of images. For performance evaluation of the proposed method, contrast improvement index (CII and edge preservation index (EPI are adopted. Experimental results on mammographic images from Pejvak Digital Imaging Center (PDIC show that the proposed algorithm improves the two indexes, thereby achieving the goal of enhancing mammographic images.
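    The homomorphic filtering component can be sketched as a log transform, high-emphasis filtering in the Fourier domain, and exponentiation. The Gaussian transfer function and the gain values below are illustrative choices, not those of the paper (which additionally combines morphology and wavelets):

```python
import numpy as np

def homomorphic(img, f_c=0.05, gamma_l=0.5, gamma_h=1.5):
    """Homomorphic filter: log -> Fourier high-emphasis -> exp.

    Low frequencies (illumination) are scaled by gamma_l < 1, high
    frequencies (reflectance/detail) by gamma_h > 1, with a Gaussian
    transition at cut-off f_c in cycles/pixel.
    """
    z = np.log1p(img.astype(float))           # multiplicative -> additive
    F = np.fft.fft2(z)
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    r2 = fy ** 2 + fx ** 2
    Hf = gamma_l + (gamma_h - gamma_l) * (1 - np.exp(-r2 / (2 * f_c ** 2)))
    return np.expm1(np.real(np.fft.ifft2(F * Hf)))
```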

  4. A METHOD FOR RECORDING AND VIEWING STEREOSCOPIC IMAGES IN COLOUR USING MULTICHROME FILTERS

    DEFF Research Database (Denmark)

    2000-01-01

    The aim of the invention is to create techniques for the encoding, production and viewing of stereograms, supplemented by methods for selecting certain optical filters needed in these novel techniques, thus providing a human observer with stereograms each of which consist of a single image... in a conventional stereogram recorded of the scene. The invention makes use of a colour-based encoding technique and viewing filters selected so that the human observer receives, in one eye, an image of nearly full colour information, in the other eye, an essentially monochrome image supplying the parallactic...

  5. Evaluation of an image-based tracking workflow with Kalman filtering for automatic image plane alignment in interventional MRI.

    Science.gov (United States)

    Neumann, M; Cuvillon, L; Breton, E; de Matheli, M

    2013-01-01

    Recently, a workflow for magnetic resonance (MR) image plane alignment based on tracking in real-time MR images was introduced. The workflow is based on a tracking device composed of 2 resonant micro-coils and a passive marker, and allows for tracking of the passive marker in clinical real-time images and automatic (re-)initialization using the micro-coils. As the Kalman filter has proven its benefit as an estimator and predictor, it is well suited for use in tracking applications. In this paper, a Kalman filter is integrated into the previously developed workflow in order to predict the position and orientation of the tracking device. The measurement noise covariances of the Kalman filter are changed dynamically in order to take into account that, depending on the image plane orientation, only a subset of the 3D pose components is available. The improved tracking performance of the Kalman-extended workflow was quantified in simulations. A first experiment in the MRI scanner was also performed, though without quantitative results yet.
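The predict/update cycle the workflow relies on is the textbook Kalman filter. The sketch below is a generic constant-velocity filter for a single pose component, not the paper's implementation; the noise values are illustrative assumptions, and passing a large `r` for a time step mimics the paper's trick of inflating the measurement noise when a pose component is not observable in the current image plane.

```python
import numpy as np

class Kalman1D:
    """Constant-velocity Kalman filter for one pose component (sketch)."""
    def __init__(self, q=1e-3, r=1e-1, dt=1.0):
        self.x = np.zeros(2)                        # state: [position, velocity]
        self.P = np.eye(2)                          # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity model
        self.H = np.array([[1.0, 0.0]])             # we only measure position
        self.Q = q * np.eye(2)                      # process noise
        self.R0 = r                                 # default measurement noise

    def step(self, z, r=None):
        # predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # update; pass a large r to effectively ignore an unreliable measurement
        R = np.array([[self.R0 if r is None else r]])
        S = self.H @ self.P @ self.H.T + R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + (K @ (z - self.H @ self.x)).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[0]
```

Fed noiseless constant-velocity positions, the filter locks onto both the position and the (unmeasured) velocity within a few steps.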

  6. Blurred image restoration using knife-edge function and optimal window Wiener filtering

    Science.gov (United States)

    Zhou, Shudao; Yan, Wei

    2018-01-01

    Motion blur in images is usually modeled as the convolution of a point spread function (PSF) and the original image represented as pixel intensities. The knife-edge function can be used to model various types of motion-blurs, and hence it allows for the construction of a PSF and accurate estimation of the degradation function without knowledge of the specific degradation model. This paper addresses the problem of image restoration using a knife-edge function and optimal window Wiener filtering. In the proposed method, we first calculate the motion-blur parameters and construct the optimal window. Then, we use the detected knife-edge function to obtain the system degradation function. Finally, we perform Wiener filtering to obtain the restored image. Experiments show that the restored image has improved resolution and contrast parameters with clear details and no discernible ringing effects. PMID:29377950
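The final Wiener step has the classical frequency-domain form G = H* / (|H|² + NSR). The sketch below assumes the PSF and the noise-to-signal ratio are already known; the paper's knife-edge PSF estimation and optimal-window construction are not reproduced.

```python
import numpy as np

def wiener_deconvolve(blurred, psf, nsr=0.01):
    """Classical Wiener restoration filter.

    `psf` must have the same shape as `blurred` and be centered; `nsr` is
    the assumed noise-to-signal power ratio (regularization constant)."""
    H = np.fft.fft2(np.fft.ifftshift(psf))        # degradation function
    G = np.conj(H) / (np.abs(H) ** 2 + nsr)       # Wiener restoration filter
    return np.real(np.fft.ifft2(G * np.fft.fft2(blurred)))
```

For a horizontal motion-blur PSF, deconvolving the blurred image with a small `nsr` brings it back close to the original without the division-by-zero instability of a plain inverse filter.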

  7. Impact of Image Filters and Observations Parameters in CBCT for Identification of Mandibular Osteolytic Lesions.

    Science.gov (United States)

    Monteiro, Bruna Moraes; Nobrega Filho, Denys Silveira; Lopes, Patrícia de Medeiros Loureiro; de Sales, Marcelo Augusto Oliveira

    2012-01-01

    The aim of this study was to analyze the influence of image-enhancement filters (algorithms) in Cone Beam Computed Tomography (CBCT) on the diagnosis of osteolytic lesions of the mandible, in order to establish the viewing protocols most suitable for CBCT diagnostics. 15 dry mandibles, in which perforations were made to simulate lesions, were submitted to CBCT examination. Two examiners analyzed the images on two occasions, using the enhancement filters Hard, Normal, and Very Sharp contained in the iCAT Vision software, and the following assessment protocols: axial; sagittal and coronal; and axial, sagittal and coronal planes simultaneously (MPR). The sensitivity and specificity (validity) of CBCT were demonstrated, as the values achieved were above 75% for sensitivity and above 85% for specificity, reaching around 95.5% sensitivity and 99% specificity when the appropriate observation protocol was used. It was concluded that the use of enhancement filters influences the diagnosis, since all measured values were correspondingly higher when the Very Sharp filter was used, which justifies its use in clinical practice, followed by the Hard and Normal filters, in order of decreasing values.

  10. Effect of Post-Reconstruction Gaussian Filtering on Image Quality and Myocardial Blood Flow Measurement with N-13 Ammonia PET

    International Nuclear Information System (INIS)

    Kim, Hyeon Sik; Cho, Sang-Geon; Kim, Ju Han; Kwon, Seong Young; Lee, Byeong-il; Bom, Hee-Seung

    2014-01-01

    In order to evaluate the effect of post-reconstruction Gaussian filtering on image quality and myocardial blood flow (MBF) measurement by dynamic N-13 ammonia positron emission tomography (PET), we compared various reconstruction and filtering methods with image characteristics. Dynamic PET images of three patients with coronary artery disease (male-female ratio of 2:1; age: 57, 53, and 76 years) were reconstructed, using filtered back projection (FBP) and ordered subset expectation maximization (OSEM) methods. OSEM reconstruction consisted of OSEM-2I, OSEM-4I, and OSEM-6I with 2, 4, and 6 iterations, respectively. The images, reconstructed and filtered by Gaussian filters of 5, 10, and 15 mm, were obtained, as well as non-filtered images. Visual analysis of image quality (IQ) was performed using a 3-grade scoring system by 2 independent readers, blinded to the reconstruction and filtering methods of stress images. Then, signal-to-noise ratio (SNR) was calculated by noise and contrast recovery (CR). Stress and rest MBF and coronary flow reserve (CFR) were obtained for each method. IQ scores, stress and rest MBF, and CFR were compared between the methods, using Chi-square and Kruskal-Wallis tests. In the visual analysis, IQ was significantly higher by 10 mm Gaussian filtering, compared to other sizes of filter (P<0.001 for both readers). However, no significant difference of IQ was found between FBP and various numbers of iteration in OSEM (P=0.923 and 0.855 for readers 1 and 2, respectively). SNR was significantly higher in 10 mm Gaussian filter. There was a significant difference in stress and rest MBF between several vascular territories. However CFR was not significantly different according to various filtering methods. Post-reconstruction Gaussian filtering with a filter size of 10 mm significantly enhances the IQ of N-13 ammonia PET-CT, without changing the results of CFR calculation
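Post-reconstruction Gaussian filter sizes such as the 5, 10, and 15 mm above are quoted as FWHM in millimetres; applying one requires converting to a standard deviation in voxels via σ = FWHM / (2·√(2·ln 2) · voxel size). A short sketch (the voxel size and the 3σ kernel radius are assumptions for illustration):

```python
import math
import numpy as np

FWHM_TO_SIGMA = 2.0 * math.sqrt(2.0 * math.log(2.0))   # ≈ 2.3548

def fwhm_to_sigma_vox(fwhm_mm, voxel_mm):
    """Convert a Gaussian width given as FWHM in mm (e.g. the study's
    10 mm) to a standard deviation in voxels."""
    return fwhm_mm / FWHM_TO_SIGMA / voxel_mm

def gaussian_kernel_1d(sigma, radius=None):
    """Normalized separable 1-D Gaussian kernel; applying it along each
    axis gives the post-reconstruction smoothing. Generic, not vendor code."""
    if radius is None:
        radius = max(1, int(round(3 * sigma)))   # truncate at ~3 sigma
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()
```

With 2 mm voxels, a 10 mm FWHM filter corresponds to σ ≈ 2.12 voxels.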

  11. Denoising of MR images using FREBAS collaborative filtering

    International Nuclear Information System (INIS)

    Ito, Satoshi; Hizume, Masayuki; Yamada, Yoshifumi

    2011-01-01

    We propose a novel image denoising strategy based on the correlation in the FREBAS transformed domain. FREBAS transform is a kind of multi-resolution image analysis which consists of two different Fresnel transforms. It can decompose images into down-scaled images of the same size with a different frequency bandwidth. Since these decomposed images have similar distributions for the same directions from the center of the FREBAS domain, even when the FREBAS signal is hidden by noise in the case of a low-signal-to-noise ratio (SNR) image, the signal distribution can be estimated using the distribution of the FREBAS signal located near the position of interest. We have developed a collaborative Wiener filter in the FREBAS transformed domain which implements collaboration of the standard deviation of the position of interest and that of analogous positions. The experimental results demonstrated that the proposed algorithm improves the SNR in terms of both the total SNR and the SNR at the edges of images. (author)

  12. Performance evaluation of 3-D enhancement filters for detection of lung cancer from 3-D chest X-ray CT images

    International Nuclear Information System (INIS)

    Shimizu, Akinobu; Hagai, Makoto; Toriwaki, Jun-ichiro; Hasegawa, Jun-ichi.

    1995-01-01

    This paper evaluates the performance of several three-dimensional (3D) enhancement filters used in procedures for detecting lung cancer shadows in 3D chest X-ray CT images. Two-dimensional enhancement filters such as the Min-DD filter, Contrast filter, and N-Quoit filter have been proposed for enhancing cancer shadows in conventional 2D X-ray images. In this paper, we extend each of these 2D filters to a 3D filter and evaluate its performance experimentally using CT images with artificial and true lung cancer shadows. As a result, we find that these 3D filters are more effective for determining the position of a lung cancer shadow in a 3D chest CT image than a simple procedure such as a smoothing filter, and that the performance of these filters becomes lower in the hilar area due to the influence of vessel shadows. (author)

  13. Evolutionary Cellular Automata for Image Segmentation and Noise Filtering Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Sihem SLATNIA

    2011-01-01

    Full Text Available We use an evolutionary process to seek a specialized set of rules, among a wide range of rules, to be used by Cellular Automata (CA) for a range of tasks: extracting edges in a given gray-level or colour image, and noise filtering applied to black-and-white images. The result is the best set of local rules determining the future state of the CA in an asynchronous way. A Genetic Algorithm (GA) is applied to search for the best CA rules that can realize the best edge detection and noise filtering.
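The kind of local update rule the GA searches over can be illustrated with one hand-crafted example: a majority-vote CA for binary-image noise filtering. This fixed rule is only an illustration of the rule space; the paper evolves such rules rather than fixing one.

```python
import numpy as np

def ca_majority_denoise(img, steps=1):
    """Binary-image CA noise filter (illustrative fixed rule):
    each cell takes the majority value of its 3x3 neighbourhood."""
    img = img.astype(int)
    for _ in range(steps):
        p = np.pad(img, 1, mode="edge")
        # sum of the 3x3 neighbourhood, including the cell itself
        s = sum(p[i:i + img.shape[0], j:j + img.shape[1]]
                for i in range(3) for j in range(3))
        img = (s >= 5).astype(int)   # majority of 9 cells
    return img
```

One step removes isolated salt-noise pixels while leaving the interior of solid objects intact.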

  15. Thermographic image analysis for classification of ACL rupture disease, bone cancer, and feline hyperthyroid, with Gabor filters

    Science.gov (United States)

    Alvandipour, Mehrdad; Umbaugh, Scott E.; Mishra, Deependra K.; Dahal, Rohini; Lama, Norsang; Marino, Dominic J.; Sackman, Joseph

    2017-05-01

    Thermography and pattern classification techniques are used to classify three different pathologies in veterinary images. Thermographic images of both normal and diseased animals were provided by the Long Island Veterinary Specialists (LIVS). The three pathologies are ACL rupture disease, bone cancer, and feline hyperthyroid. The diagnosis of these diseases usually involves radiology and laboratory tests, while the method that we propose uses thermographic images and image analysis techniques and is intended for use as a prescreening tool. Images in each category of pathologies are first filtered by Gabor filters, and then various features are extracted and used for classification into normal and abnormal classes. Gabor filters are linear filters that can be characterized by two parameters: wavelength λ and orientation θ. With two different wavelengths and five different orientations, a total of ten different filters were studied. Different combinations of camera views, filters, feature vectors, normalization methods, and classification methods produce different tests, which were examined; the sensitivity, specificity, and success rate for each test were computed. Using the Gabor features alone, sensitivity, specificity, and overall success rates of 85% were achieved for each of the pathologies.
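A ten-filter bank of the kind described (2 wavelengths × 5 orientations) can be built from the standard Gabor kernel. Only the 2×5 bank structure comes from the abstract; the particular wavelengths, kernel size, σ, and aspect ratio γ below are assumptions.

```python
import numpy as np

def gabor_kernel(wavelength, theta, size=21, sigma=4.0, gamma=0.5):
    """Real (even) Gabor kernel: Gaussian envelope times cosine carrier."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotate coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier

# the ten-filter bank of the abstract: 2 wavelengths x 5 orientations
bank = [gabor_kernel(w, t)
        for w in (4.0, 8.0)
        for t in np.linspace(0, np.pi, 5, endpoint=False)]
```

Convolving an image with each kernel in `bank` yields ten response maps, from which texture features can be extracted for classification.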

  16. Comparison of the diagnostic accuracy of direct digital radiography system, filtered images, and subtraction radiography

    Directory of Open Access Journals (Sweden)

    Wilton Mitsunari Takeshita

    2013-01-01

    Full Text Available Background: To compare the diagnostic accuracy of three different imaging systems: a direct digital radiography system (DDR-CMOS), four types of filtered images, and a priori and a posteriori registration of digital subtraction radiography (DSR), in the diagnosis of proximal defects. Materials and Methods: The teeth were arranged in pairs in 10 blocks of vinyl polysiloxane, and proximal defects were created with drills of 0.25, 0.5, and 1 mm diameter. A Kodak RVG 6100 sensor was used to capture the images. A posteriori DSR registrations were done with Regeemy 0.2.43 and subtraction with Image Tool 3.0. Filtered images were obtained with Kodak Dental Imaging 6.1 software. Images (n = 360) were evaluated by three raters, all experts in dental radiology. Results: Sensitivity and specificity of the area under the receiver operator characteristic (ROC) curve (Az) were higher for DSR images with all three drills (Az = 0.896, 0.979, and 1.000 for drills of 0.25, 0.5, and 1 mm, respectively). The highest values were found for the 1-mm drills and the lowest for the 0.25-mm drills, with the negative filter having the lowest values of all (Az = 0.631). Conclusion: The most accurate diagnostic method was DSR. The negative filter obtained the worst results. Larger drills showed the highest sensitivity and specificity values of the area under the ROC curve.

  17. Use of morphologic filters in the computerized detection of lung nodules in digital chest images

    International Nuclear Information System (INIS)

    Yoshimura, H.; Giger, M.L.; Doi, K.; Ahn, N.; MacMahon, H.

    1989-01-01

    The authors have previously described a computerized scheme for the detection of lung nodules based on a difference-image approach, which had a detection accuracy of 70% with 7--8 false positives per image. Currently, they are investigating morphologic filters for the further enhancement/suppression of nodule signals and the removal of false positives. Gray-level morphologic filtering is performed on clinical chest radiographs digitized with an optical drum scanner. Various shapes and sequences of erosion and dilation filters (i.e., determination of the minimum and maximum gray levels, respectively) were examined for signal enhancement and suppression for use in the difference-image approach
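Gray-level erosion and dilation are exactly the windowed minimum and maximum the abstract describes; chaining them gives an opening, which suppresses bright structures smaller than the structuring element. The square 3×3 element and the erode-then-dilate sequence below are one illustrative choice among the "various shapes and sequences" the paper examines.

```python
import numpy as np

def _windows(img, size):
    """All shifted views of a size x size neighbourhood (edge-padded)."""
    pad = size // 2
    p = np.pad(img, pad, mode="edge")
    return [p[i:i + img.shape[0], j:j + img.shape[1]]
            for i in range(size) for j in range(size)]

def grey_erode(img, size=3):
    """Gray-level erosion: minimum over the neighbourhood."""
    return np.min(_windows(img, size), axis=0)

def grey_dilate(img, size=3):
    """Gray-level dilation: maximum over the neighbourhood."""
    return np.max(_windows(img, size), axis=0)

def morphological_open(img, size=3):
    """Opening = erosion then dilation; removes bright details smaller
    than the structuring element (e.g. small nodule-like signals)."""
    return grey_dilate(grey_erode(img, size), size)
```

A single bright pixel is wiped out by the opening, while a 4×4 bright block survives it unchanged.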

  18. ANALYSIS OF SST IMAGES BY WEIGHTED ENSEMBLE TRANSFORM KALMAN FILTER

    OpenAIRE

    Sai , Gorthi; Beyou , Sébastien; Memin , Etienne

    2011-01-01

    International audience; This paper presents a novel, efficient scheme for the analysis of Sea Surface Temperature (SST) ocean images. We consider the estimation of the velocity fields and vorticity values from a sequence of oceanic images. The contribution of this paper lies in proposing a novel, robust and simple approach based onWeighted Ensemble Transform Kalman filter (WETKF) data assimilation technique for the analysis of real SST images, that may contain coast regions or large areas of ...

  19. Information Recovery Algorithm for Ground Objects in Thin Cloud Images by Fusing Guide Filter and Transfer Learning

    Directory of Open Access Journals (Sweden)

    HU Gensheng

    2018-03-01

    Full Text Available Ground-object information in remote sensing images covered with thin clouds is obscured. An information recovery algorithm for ground objects in thin cloud images is proposed by fusing a guided filter and transfer learning. Firstly, multi-resolution decomposition of the thin-cloud target images and cloud-free guidance images is performed using the multi-directional nonsubsampled dual-tree complex wavelet transform. The decomposed low-frequency subbands are then processed using a support vector guided filter and transfer learning, respectively. The decomposed high-frequency subbands are enhanced using a modified Laine enhancement function. The low-frequency subbands output by the guided filter and those predicted by the transfer learning model are fused by selection and weighting based on regional energy. Finally, the enhanced high-frequency subbands and the fused low-frequency subbands are reconstructed using the inverse transform to obtain the ground-object information recovery images. Experimental results on Landsat-8 OLI multispectral images show that the support vector guided filter can effectively preserve the detail information of the target images, that domain-adaptive transfer learning can effectively extend the range of available multi-source and multi-temporal remote sensing images, and that good ground-object information recovery is obtained by fusing the guided filter and transfer learning to remove thin cloud from the remote sensing images.
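The core building block here (and in the guided-filter reconstruction mentioned in the head matter) is the plain guided filter: in each window the output is a local linear function of the guidance image. The sketch below is the standard scalar-guidance form, not the support-vector-guided variant used in the paper; the window radius and ε are illustrative.

```python
import numpy as np

def box_mean(img, r):
    """Mean over a (2r+1)^2 window via edge padding (simple, not O(1))."""
    p = np.pad(img, r, mode="edge")
    k = 2 * r + 1
    stack = [p[i:i + img.shape[0], j:j + img.shape[1]]
             for i in range(k) for j in range(k)]
    return np.mean(stack, axis=0)

def guided_filter(I, p, r=2, eps=1e-4):
    """Guided filter: edge-preserving smoothing of `p` steered by
    guidance image `I` via per-window linear models q = a*I + b."""
    mean_I, mean_p = box_mean(I, r), box_mean(p, r)
    corr_Ip, corr_II = box_mean(I * p, r), box_mean(I * I, r)
    var_I = corr_II - mean_I ** 2
    cov_Ip = corr_Ip - mean_I * mean_p
    a = cov_Ip / (var_I + eps)       # per-window linear coefficient
    b = mean_p - a * mean_I
    # average the coefficients of all windows covering each pixel
    return box_mean(a, r) * I + box_mean(b, r)
```

Self-guided filtering leaves a constant image untouched and transfers sharp guidance edges to the output, which is the edge-preserving property the recovery algorithm exploits.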

  20. Despeckle filtering for ultrasound imaging and video II selected applications

    CERN Document Server

    Loizou, Christos P

    2015-01-01

    In ultrasound imaging and video, visual perception is hindered by speckle, a multiplicative noise that degrades quality. Noise reduction is therefore essential for improving visual observation quality, or as a pre-processing step for further automated analysis such as image/video segmentation, texture analysis, and encoding in ultrasound imaging and video. The goal of the first book (book 1 of 2) was to introduce the problem of speckle in ultrasound image and video, as well as the theoretical background, algorithmic steps, and the Matlab™ code for the following group of despeckle filters:

  1. Three-State Locally Adaptive Texture Preserving Filter for Radar and Optical Image Processing

    Directory of Open Access Journals (Sweden)

    Jaakko T. Astola

    2005-05-01

    Full Text Available Textural features are one of the most important types of useful information contained in images. In practice, these features are commonly masked by noise. Relatively little attention has been paid to the texture-preserving properties of noise attenuation methods. This stimulates solving the following tasks: (1) to analyze the texture preservation properties of various filters; and (2) to design image processing methods capable of preserving texture features well while effectively reducing noise. This paper deals with examining the texture-feature-preserving properties of different filters. The study is performed for a set of texture samples and different noise variances. Locally adaptive three-state schemes are proposed in which texture is considered as a particular class. For “detection” of texture regions, several classifiers are proposed and analyzed. As shown, an appropriate trade-off of the designed filter properties is provided. This is demonstrated quantitatively for artificial test images and confirmed visually for real-life images.

  2. HDR Pathological Image Enhancement Based on Improved Bias Field Correction and Guided Image Filter

    Directory of Open Access Journals (Sweden)

    Qingjiao Sun

    2016-01-01

    Full Text Available Pathological image enhancement is a significant topic in the field of pathological image processing. This paper proposes a high dynamic range (HDR) pathological image enhancement method based on improved bias field correction and the guided image filter (GIF). Firstly, preprocessing including stain normalization and wavelet denoising is performed on the Haematoxylin and Eosin (H and E) stained pathological image. Then, an improved bias field correction model is developed to enhance the influence of light for the high-frequency part of the image and to correct the intensity inhomogeneity and detail discontinuity of the image. Next, the HDR pathological image is generated based on the least square method using the low dynamic range (LDR) image and the H and E channel images. Finally, the fine enhanced image is acquired after the detail enhancement process. Experiments with 140 pathological images demonstrate the performance advantages of our proposed method as compared with related work.

  3. Aircraft Detection from VHR Images Based on Circle-Frequency Filter and Multilevel Features

    Directory of Open Access Journals (Sweden)

    Feng Gao

    2013-01-01

    Full Text Available Aircraft automatic detection from very high-resolution (VHR) images plays an important role in a wide variety of applications. This paper proposes a novel detector for aircraft detection in VHR remote sensing images. To accurately distinguish aircraft from background, a circle-frequency filter (CF-filter) is used to extract the candidate locations of aircraft from a large-size image. A multi-level feature model is then employed to represent both the local appearance and spatial layout of aircraft by means of the Robust Hue Descriptor and Histogram of Oriented Gradients. The experimental results demonstrate the superior performance of the proposed method.

  4. Monochromated scanning transmission electron microscopy

    International Nuclear Information System (INIS)

    Rechberger, W.; Kothleitner, G.; Hofer, F.

    2006-01-01

    Full text: Electron energy-loss spectroscopy (EELS) has developed into an established technique for chemical and structural analysis of thin specimens in the (scanning) transmission electron microscope ((S)TEM). The energy resolution in EELS is largely limited by the stability of the high-voltage supply, the resolution of the spectrometer, and the energy spread of the source. To overcome this limitation, a Wien filter monochromator was recently introduced with commercially available STEMs, offering the advantage of better resolving EELS fine structures, which contain valuable bonding information. The method of atomic-resolution Z-contrast imaging in an STEM, utilizing a high-angle annular dark-field (HAADF) detector, can perfectly complement the excellent energy resolution, since EELS spectra can be collected simultaneously. With a monochromated microscope, not only can high spatial resolution images be recorded, but high energy resolution EELS spectra are also attainable. In this work we investigated the STEM performance of a 200 kV monochromated Tecnai F20 with a high-resolution Gatan Imaging Filter (HR-GIF). (author)

  5. Cross-correlated imaging of distributed mode filtering rod fiber

    DEFF Research Database (Denmark)

    Laurila, Marko; Barankov, Roman; Jørgensen, Mette Marie

    2013-01-01

    We analyze the modal properties of an 85 μm core distributed mode filtering rod fiber using cross-correlated (C2) imaging. We evaluate suppression of higher-order modes (HOMs) under severely misaligned mode excitation and identify a single-mode regime where HOMs are suppressed by more than 20 dB.

  6. Still Image Compression Algorithm Based on Directional Filter Banks

    OpenAIRE

    Chunling Yang; Duanwu Cao; Li Ma

    2010-01-01

    Hybrid wavelet and directional filter banks (HWD) is an effective multi-scale geometrical analysis method. Compared to the wavelet transform, it can better capture the directional information of images. But the ringing artifact, which is caused by coefficient quantization in the transform domain, is the biggest drawback of image compression algorithms in the HWD domain. In this paper, by researching the relationship between directional decomposition and the ringing artifact, an improved decomposition ...

  7. Applying Enhancement Filters in the Pre-processing of Images of Lymphoma

    International Nuclear Information System (INIS)

    Silva, Sérgio Henrique; Do Nascimento, Marcelo Zanchetta; Neves, Leandro Alves; Batista, Valério Ramos

    2015-01-01

    Lymphoma is a type of cancer that affects the immune system and is classified as Hodgkin or non-Hodgkin. It is one of the ten most common cancers worldwide, accounting for three to four percent of all malignant neoplasms diagnosed. Our work presents a study of some filters devoted to enhancing images of lymphoma at the pre-processing step. Here the enhancement is useful for removing noise from the digital images. We have analysed the noise caused by different sources, such as room vibration, scraps, and defocusing, in the following classes of lymphoma: follicular, mantle cell, and B-cell chronic lymphocytic leukemia. The Gaussian, Median, and Mean-Shift filters were applied in different colour models (RGB, Lab and HSV). Afterwards, we performed a quantitative analysis of the images by means of the Structural Similarity Index, in order to evaluate the similarity between the images. In all cases we obtained a certainty of at least 75%, which rises to 99% if one considers only HSV. Namely, we have concluded that HSV is an important choice of colour model for pre-processing histological images of lymphoma, because in this case the resulting image gets the best enhancement

  8. High performance 3D adaptive filtering for DSP based portable medical imaging systems

    Science.gov (United States)

    Bockenbach, Olivier; Ali, Murtaza; Wainwright, Ian; Nadeski, Mark

    2015-03-01

    Portable medical imaging devices have proven valuable for emergency medical services, both in the field and in hospital environments, and are becoming more prevalent in clinical settings where the use of larger imaging machines is impractical. Despite their constraints on power, size and cost, portable imaging devices must still deliver high quality images. 3D adaptive filtering is one of the most advanced techniques aimed at noise reduction and feature enhancement, but it is computationally very demanding and hence often cannot be run with sufficient performance on a portable platform. In recent years, advanced multicore digital signal processors (DSPs) have been developed that attain high processing performance while maintaining low levels of power dissipation. These processors enable the implementation of complex algorithms on a portable platform. In this study, the performance of a 3D adaptive filtering algorithm on a DSP is investigated. The performance is assessed by filtering a volume of 512x256x128 voxels sampled at a pace of 10 MVoxels/sec with a 3D ultrasound probe. Relative performance and power are compared between a reference PC (quad-core CPU) and a TMS320C6678 DSP from Texas Instruments.

  11. Bandwidth Controllable Tunable Filter for Hyper-/Multi-Spectral Imager, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR Phase I proposal introduces a fast speed bandwidth controllable tunable filter for hyper-/multi-spectral (HS/MS) imagers. It dynamically passes a variable...

  12. Noninvasive mapping of water diffusional exchange in the human brain using filter-exchange imaging.

    Science.gov (United States)

    Nilsson, Markus; Lätt, Jimmy; van Westen, Danielle; Brockstedt, Sara; Lasič, Samo; Ståhlberg, Freddy; Topgaard, Daniel

    2013-06-01

    We present the first in vivo application of the filter-exchange imaging protocol for diffusion MRI. The protocol allows noninvasive mapping of the rate of water exchange between microenvironments with different self-diffusivities, such as the intracellular and extracellular spaces in tissue. Since diffusional water exchange across the cell membrane is a fundamental process in human physiology and pathophysiology, clinically feasible and noninvasive imaging of the water exchange rate would offer new means to diagnose disease and monitor treatment response in conditions such as cancer and edema. The in vivo use of filter-exchange imaging was demonstrated by studying the brains of five healthy volunteers and one intracranial tumor (meningioma). Apparent exchange rates in white matter range from 0.8±0.08 s⁻¹ in the internal capsule to 1.6±0.11 s⁻¹ in frontal white matter, indicating that low values are associated with high myelination. Solid tumor displayed values of up to 2.9±0.8 s⁻¹. In white matter, the apparent exchange rate values suggest intra-axonal exchange times on the order of seconds, confirming the slow-exchange assumption in the analysis of diffusion MRI data. We propose that filter-exchange imaging could be used clinically to map the water exchange rate in pathologies. Filter-exchange imaging may also be valuable for evaluating novel therapies targeting the function of aquaporins. Copyright © 2012 Wiley Periodicals, Inc.
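For readers unfamiliar with the filter-exchange (FEXI) analysis, the apparent exchange rate (AXR) is obtained by fitting the recovery of the filtered apparent diffusivity as a function of mixing time; a commonly used form of that model is ADC'(tm) = ADC_eq * (1 - sigma * exp(-AXR * tm)). A minimal sketch with illustrative parameter values (adc_eq and sigma are made up; only the AXR value echoes the study):

```python
import math

def fexi_adc(tm, adc_eq, sigma, axr):
    """Apparent diffusivity at mixing time tm (s) after the diffusion
    filter: recovers toward adc_eq at the apparent exchange rate AXR."""
    return adc_eq * (1.0 - sigma * math.exp(-axr * tm))

# AXR = 1.6 s^-1, roughly the frontal white matter value reported above.
for tm in (0.05, 0.3, 1.0):
    print(tm, fexi_adc(tm, adc_eq=0.9, sigma=0.4, axr=1.6))
```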

  13. Spatio-spectral color filter array design for optimal image recovery.

    Science.gov (United States)

    Hirakawa, Keigo; Wolfe, Patrick J

    2008-10-01

    In digital imaging applications, data are typically obtained via a spatial subsampling procedure implemented as a color filter array: a physical construction whereby only a single color value is measured at each pixel location. Owing to the growing ubiquity of color imaging and display devices, much recent work has focused on the implications of such arrays for subsequent digital processing, including in particular the canonical demosaicking task of reconstructing a full color image from spatially subsampled and incomplete color data acquired under a particular choice of array pattern. In contrast to the majority of the demosaicking literature, we consider here the problem of color filter array design and its implications for spatial reconstruction quality. We pose this problem formally as one of simultaneously maximizing the spectral radii of luminance and chrominance channels subject to perfect reconstruction, and, after proving sub-optimality of a wide class of existing array patterns, provide a constructive method for its solution that yields robust, new panchromatic designs implementable as subtractive colors. Empirical evaluations on multiple color image test sets support our theoretical results, and indicate the potential of these patterns to increase spatial resolution for fixed sensor size, and to contribute to improved reconstruction fidelity as well as significantly reduced hardware complexity.

  14. Real-time MR diffusion tensor and Q-ball imaging using Kalman filtering

    International Nuclear Information System (INIS)

    Poupon, C.; Roche, A.; Dubois, J.; Mangin, J.F.; Poupon, F.

    2008-01-01

    Diffusion magnetic resonance imaging (dMRI) has become an established research tool for the investigation of tissue structure and orientation. In this paper, we present a method for real-time processing of diffusion tensor and Q-ball imaging. The basic idea is to use Kalman filtering framework to fit either the linear tensor or Q-ball model. Because the Kalman filter is designed to be an incremental algorithm, it naturally enables updating the model estimate after the acquisition of any new diffusion-weighted volume. Processing diffusion models and maps during ongoing scans provides a new useful tool for clinicians, especially when it is not possible to predict how long a subject may remain still in the magnet. First, we introduce the general linear models corresponding to the two diffusion tensor and analytical Q-ball models of interest. Then, we present the Kalman filtering framework and we focus on the optimization of the diffusion orientation sets in order to speed up the convergence of the online processing. Last, we give some results on a healthy volunteer for the online tensor and the Q-ball model, and we make some comparisons with the conventional offline techniques used in the literature. We could achieve full real-time for diffusion tensor imaging and deferred time for Q-ball imaging, using a single workstation. (authors)
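The incremental fitting idea is that both the tensor and the analytical Q-ball models are linear in their parameters, so each new diffusion-weighted volume can update the estimate with one recursive least-squares (static-state Kalman) step. A sketch under that assumption; the toy model and dimensions are illustrative, not the authors' code:

```python
import numpy as np

def kalman_update(x, P, h, y, r=1.0):
    """One recursive least-squares update for the linear model
    y = h @ x + noise with measurement variance r."""
    Ph = P @ h
    k = Ph / (h @ Ph + r)        # Kalman gain
    x = x + k * (y - h @ x)      # innovation-weighted correction
    P = P - np.outer(k, Ph)      # covariance shrinks with each volume
    return x, P

# Toy example: recover a 2-parameter linear model one sample at a time.
rng = np.random.default_rng(0)
true_x = np.array([2.0, -1.0])
x, P = np.zeros(2), np.eye(2) * 1e3   # diffuse prior
for _ in range(200):
    h = rng.standard_normal(2)        # stand-in for a gradient design row
    x, P = kalman_update(x, P, h, h @ true_x)
print(x)  # approaches [2, -1]
```

Because each update touches only the current estimate and covariance, the maps can be refreshed after every acquired volume, which is what makes the online processing feasible.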

  15. Fractional order integration and fuzzy logic based filter for denoising of echocardiographic image.

    Science.gov (United States)

    Saadia, Ayesha; Rashdi, Adnan

    2016-12-01

    Ultrasound is widely used for imaging due to its cost effectiveness and safety. However, ultrasound images are inherently corrupted by speckle noise, which severely degrades their quality and creates difficulty for physicians in diagnosis. To get maximum benefit from ultrasound imaging, image denoising is an essential requirement. To perform image denoising, a two-stage methodology using a fuzzy weighted mean and a fractional integration filter has been proposed in this research work. In stage-1, image pixels are processed by applying a 3 × 3 window around each pixel, and fuzzy logic is used to assign weights to the pixels in each window, replacing the central pixel of the window with the weighted mean of all neighboring pixels in the same window. Noise suppression is achieved by assigning weights to the pixels while preserving edges and other important features of the image. In stage-2, the resultant image is further improved by a fractional order integration filter. The effectiveness of the proposed methodology has been analyzed for standard test images artificially corrupted with speckle noise and real ultrasound B-mode images. Results of the proposed technique have been compared with different state-of-the-art techniques including Lsmv, Wiener, Geometric filter, Bilateral, Non-local means, Wavelet, Perona et al., Total variation (TV), Global Adaptive Fractional Integral Algorithm (GAFIA) and Improved Fractional Order Differential (IFD) model. Comparison has been done on a quantitative and qualitative basis. For quantitative analysis, different metrics like Peak Signal to Noise Ratio (PSNR), Speckle Suppression Index (SSI), Structural Similarity (SSIM), Edge Preservation Index (β) and Correlation Coefficient (ρ) have been used. Simulations have been done using Matlab. Simulation results of artificially corrupted standard test images and two real echocardiographic images reveal that the proposed method outperforms existing image denoising techniques.
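Stage-1 of such a scheme can be sketched as a similarity-weighted 3 × 3 mean. The Gaussian membership function and the use of the window median as the reference value are my assumptions here; the paper's exact fuzzy rules are not reproduced.

```python
import numpy as np

def fuzzy_weighted_mean(img, spread=20.0):
    """Replace each pixel by a membership-weighted mean of its 3x3
    window; weights decay with distance from the window median, so
    outliers contribute little (spread is an illustrative constant)."""
    padded = np.pad(img.astype(float), 1, mode='edge')
    out = np.empty(img.shape, dtype=float)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            win = padded[i:i + 3, j:j + 3]
            weights = np.exp(-((win - np.median(win)) / spread) ** 2)
            out[i, j] = (weights * win).sum() / weights.sum()
    return out

noisy = np.array([[10., 10, 10], [10, 200, 10], [10, 10, 10]])
denoised = fuzzy_weighted_mean(noisy)
print(denoised)  # the outlier is pulled back toward its neighbours
```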

  16. Large-Scale Query-by-Image Video Retrieval Using Bloom Filters

    OpenAIRE

    Araujo, Andre; Chaves, Jason; Lakshman, Haricharan; Angst, Roland; Girod, Bernd

    2016-01-01

    We consider the problem of using image queries to retrieve videos from a database. Our focus is on large-scale applications, where it is infeasible to index each database video frame independently. Our main contribution is a framework based on Bloom filters, which can be used to index long video segments, enabling efficient image-to-video comparisons. Using this framework, we investigate several retrieval architectures, by considering different types of aggregation and different functions to ...
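A Bloom filter itself is a compact bit array probed by several hash functions; membership queries can yield false positives but never false negatives, which is what makes it suitable for summarizing long video segments. A minimal sketch (the frame descriptors are placeholders, not the paper's features):

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter with k hash probes over an m-bit array."""
    def __init__(self, m=1024, k=3):
        self.m, self.k, self.bits = m, k, bytearray(m)

    def _positions(self, item):
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = 1

    def __contains__(self, item):
        return all(self.bits[p] for p in self._positions(item))

bf = BloomFilter()
for descriptor in ("frame-0017", "frame-0018"):  # stand-ins for features
    bf.add(descriptor)
print("frame-0017" in bf)  # True
print("frame-9999" in bf)  # False (up to a small false-positive rate)
```

Aggregating many frames' descriptors into one filter is what lets an image query be compared against a whole segment at once instead of frame by frame.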

  17. A Filtering Approach for Image-Guided Surgery With a Highly Articulated Surgical Snake Robot.

    Science.gov (United States)

    Tully, Stephen; Choset, Howie

    2016-02-01

    The objective of this paper is to introduce a probabilistic filtering approach to estimate the pose and internal shape of a highly flexible surgical snake robot during minimally invasive surgery. Our approach renders a depiction of the robot that is registered to preoperatively reconstructed organ models to produce a 3-D visualization that can be used for surgical feedback. Our filtering method estimates the robot shape using an extended Kalman filter that fuses magnetic tracker data with kinematic models that define the motion of the robot. Using Lie derivative analysis, we show that this estimation problem is observable, and thus, the shape and configuration of the robot can be successfully recovered with a sufficient number of magnetic tracker measurements. We validate this study with benchtop and in-vivo image-guidance experiments in which the surgical robot was driven along the epicardial surface of a porcine heart. This paper introduces a filtering approach for shape estimation that can be used for image guidance during minimally invasive surgery. The methods being introduced in this paper enable informative image guidance for highly articulated surgical robots, which benefits the advancement of robotic surgery.

  18. Correlation study of effect of additional filter on radiation dose and image quality in digital mammography

    International Nuclear Information System (INIS)

    Liu Jie; Liu Peifang; Wang Hongbin; Zhang Shuping; Liu Xueou

    2012-01-01

    Objective: To explore the effect of different additional filters on radiation dose and image quality in digital mammography. Methods: A Hologic Selenia digital mammography machine, its post-processing workstations, and a 5 M high-resolution medical monitor were used in this study. Mammography phantoms with thicknesses from 1.6 cm to 8.6 cm were used to simulate human breast tissue. The same exposure conditions, pressure, compression thickness, and anode were employed with Mo and Rh additional filters under the automatic and manual exposure modes. The image kV, mAs, pressure, filter, average glandular dose (AGD), entrance surface dose (ESD), signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), and image score according to ACR criteria were recorded for the two additional filters. A paired-sample t test was performed to compare the indices of the Mo and Rh groups using SPSS 17.0. Results: AGD and ESD of both the Rh and Mo groups increased with the thickness of the phantoms. The AGD, ESD, and their increases for the Rh filter (1.484 ± 1.041, 7.969 ± 7.633, 0.423 ± 0.190 and 3.057 ± 2.139) were lower than those for the Mo filter (1.915 ± 1.301, 12.516 ± 11.632, 0.539 ± 0.246 and 4.731 ± 3.294) across all phantom thicknesses (t values were 4.614, 3.209, 3.396 and 3.605, P<0.05). SNR, CNR, and image score of both the Rh and Mo groups decreased with increasing phantom thickness, with no statistical difference between the groups (P>0.05). Conclusions: Compared with the Mo filter, the Rh filter can reduce the radiation dose, and this advantage is more obvious in thicker phantoms when the same image quality is required. (authors)

  19. Precision of quantitative computed tomography texture analysis using image filtering: A phantom study for scanner variability.

    Science.gov (United States)

    Yasaka, Koichiro; Akai, Hiroyuki; Mackin, Dennis; Court, Laurence; Moros, Eduardo; Ohtomo, Kuni; Kiryu, Shigeru

    2017-05-01

    Quantitative computed tomography (CT) texture analyses of images with and without filtration are gaining attention as a way to capture the heterogeneity of tumors. The aim of this study was to investigate how quantitative texture parameters computed with image filtering vary among different CT scanners, using a phantom developed for radiomics studies. A phantom consisting of 10 different cartridges with various textures was scanned under 6 different scanning protocols using four CT scanners from four different vendors. CT texture analyses were performed for both unfiltered images and images filtered with a Laplacian of Gaussian spatial band-pass filter featuring fine, medium, and coarse textures. Forty-five regions of interest were placed for each cartridge (x) in a specific scan image set (y), and the average of the texture values (T(x,y)) was calculated. The interquartile range (IQR) of T(x,y) among the 6 scans was calculated for each cartridge (IQR(x)), while the IQR of T(x,y) among the 10 cartridges was calculated for each scan (IQR(y)), and the median IQR(y) over the 6 scans was taken as the control IQR (IQRc). The median of the quotient IQR(x)/IQRc among the 10 cartridges was defined as the variability index (VI). The VI was relatively small for the mean in unfiltered images (0.011) and for standard deviation (0.020-0.044) and entropy (0.040-0.044) in filtered images. Skewness and kurtosis in filtered images featuring medium and coarse textures were relatively variable across different CT scanners, with VIs of 0.638-0.692 and 0.430-0.437, respectively. Quantitative CT texture parameters thus range from robust to highly variable among different scanners, and the behavior of each parameter should be taken into consideration.
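The variability index defined above can be computed directly from a cartridge-by-scan matrix of mean texture values. A sketch with synthetic data (the matrix values are made up):

```python
import numpy as np

def variability_index(T):
    """T[x, y]: mean texture value for cartridge x under scan protocol y.
    VI near 0 means the feature varies little across scanners relative
    to its spread across materials."""
    q75, q25 = np.percentile(T, [75, 25], axis=1)
    iqr_x = q75 - q25                        # spread across scans, per cartridge
    q75s, q25s = np.percentile(T, [75, 25], axis=0)
    iqr_c = np.median(q75s - q25s)           # control IQR across cartridges
    return np.median(iqr_x / iqr_c)

rng = np.random.default_rng(1)
base = np.linspace(0, 9, 10)[:, None]            # 10 cartridges
T = base + 0.05 * rng.standard_normal((10, 6))   # 6 protocols, small scanner noise
vi = variability_index(T)
print(vi)  # small value -> robust feature
```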

  20. A motion-compensated image filter for low-dose fluoroscopy in a real-time tumor-tracking radiotherapy system

    International Nuclear Information System (INIS)

    Miyamoto, Naoki; Ishikawa, Masayori; Sutherland, Kenneth

    2015-01-01

    In the real-time tumor-tracking radiotherapy system, a surrogate fiducial marker inserted in or near the tumor is detected by fluoroscopy to realize respiratory-gated radiotherapy. The imaging dose caused by fluoroscopy should be minimized. In this work, an image processing technique is proposed for tracing a moving marker in low-dose imaging. The proposed tracking technique is a combination of a motion-compensated recursive filter and template pattern matching. The proposed image filter can reduce motion artifacts resulting from the recursive process based on the determination of the region of interest for the next frame according to the current marker position in the fluoroscopic images. The effectiveness of the proposed technique and the expected clinical benefit were examined by phantom experimental studies with actual tumor trajectories generated from clinical patient data. It was demonstrated that the marker motion could be traced in low-dose imaging by applying the proposed algorithm with acceptable registration error and high pattern recognition score in all trajectories, although some trajectories were not able to be tracked with the conventional spatial filters or without image filters. The positional accuracy is expected to be kept within ±2 mm. The total computation time required to determine the marker position is a few milliseconds. The proposed image processing technique is applicable for imaging dose reduction. (author)
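The core of a motion-compensated recursive filter can be sketched in a few lines: the running average is shifted by the detected marker displacement before each new low-dose frame is blended in, so noise averages out without smearing the moving marker. The blending weight alpha and the toy data are illustrative, not the published implementation.

```python
import numpy as np

def recursive_filter(frames, positions, alpha=0.3):
    """Temporal recursive filter with motion compensation (a sketch):
    the accumulated image is rolled by the marker displacement before
    each blend, keeping the marker registered."""
    avg = frames[0].astype(float)
    prev = np.array(positions[0])
    for frame, pos in zip(frames[1:], positions[1:]):
        dy, dx = np.array(pos) - prev
        avg = np.roll(avg, (dy, dx), axis=(0, 1))  # compensate marker motion
        avg = alpha * frame + (1 - alpha) * avg    # recursive averaging
        prev = np.array(pos)
    return avg

# Toy marker moving one row per frame:
frames = [np.zeros((8, 8)) for _ in range(3)]
positions = [(2, 2), (3, 2), (4, 2)]
for f, (y, x) in zip(frames, positions):
    f[y, x] = 100.0
out = recursive_filter(frames, positions)
print(out[4, 2])  # the marker stays registered at its latest position
```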

  1. Influence of different anode/filter combination on radiation dose and image quality in digital mammography

    International Nuclear Information System (INIS)

    Liu Jie; Liu Peifang; Zhang Lianlian; Ma Wenjuan

    2013-01-01

    Objective: To explore the effect of different anode/filter combinations on radiation dose and image quality in digital mammography, so as to choose an optimal anode/filter combination that reduces radiation injury without sacrificing image quality. Methods: Mammography accreditation phantoms with thicknesses from 1.6 cm to 8.6 cm were used to simulate human breast tissue. The same exposure conditions, pressure, and compression thickness, and different anode/filter combinations, were employed under the automatic and manual exposure modes. The image kV, mAs, pressure, filter, average glandular dose (AGD), and contrast-to-noise ratio (CNR) were recorded, and the figure of merit (FOM) was calculated. SPSS 17.0 and one-way analysis of variance were used in the statistical analysis. Results: As phantom thickness increased, the AGD values acquired with the Mo/Mo, Mo/Rh, and W/Ag anode/filter combinations increased, while the CNR and FOM values decreased. The AGD, CNR, and FOM values acquired in the phantoms of different thickness with the three anode/filter combinations were statistically different (P=0.000, respectively). The AGD values of Mo/Mo were lowest. For 1.6 cm-2.6 cm phantom thicknesses, the FOMs of Mo/Rh were lowest, and for 3.6 cm-8.6 cm phantom thicknesses, the FOMs of W/Ag were lowest. Conclusion: For phantom thicknesses of 1.6 cm-2.6 cm and 3.6 cm-8.6 cm, the Mo/Rh and W/Ag combinations, respectively, can achieve the highest FOM and provide the best imaging quality with low radiation dose. (authors)
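The figure of merit used for this kind of dose/quality trade-off is commonly defined as CNR squared divided by average glandular dose; the abstract does not spell out its formula, so this standard definition is an assumption, and the readings below are hypothetical.

```python
def figure_of_merit(cnr, agd_mgy):
    """FOM = CNR^2 / AGD: contrast per unit dose (higher is better)."""
    return cnr ** 2 / agd_mgy

# Hypothetical readings for one phantom thickness:
print(figure_of_merit(cnr=8.0, agd_mgy=1.2))   # e.g. Mo/Mo
print(figure_of_merit(cnr=7.5, agd_mgy=0.9))   # e.g. W/Ag
```

Squaring the CNR means a combination with slightly lower contrast can still win the FOM comparison if it delivers a proportionally larger dose reduction.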

  2. Automatic detection of solar features in HSOS full-disk solar images using guided filter

    Science.gov (United States)

    Yuan, Fei; Lin, Jiaben; Guo, Jingjing; Wang, Gang; Tong, Liyue; Zhang, Xinwei; Wang, Bingxiang

    2018-02-01

    A procedure is introduced for the automatic detection of solar features using full-disk solar images from Huairou Solar Observing Station (HSOS), National Astronomical Observatories of China. In image preprocessing, a median filter is applied to remove noise. A guided filter, introduced here for the first time in astronomical target detection, is adopted to enhance the edges of solar features and restrain the solar limb darkening. Specific features are then detected by the Otsu algorithm and a further threshold processing technique. Compared with other automatic detection procedures, our procedure has advantages such as real-time operation and reliability, with no need for a local threshold. It also greatly reduces the amount of computation, owing to the efficiency of the guided filter algorithm. The procedure has been tested on a one-month sequence (December 2013) of HSOS full-disk solar images, and the results show that the number of features detected by our procedure agrees well with manual detection.
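The guided filter itself (He et al.'s gray-scale formulation) reduces to a handful of box-filtered moments; a self-contained sketch, with the radius r and regularizer eps chosen arbitrarily:

```python
import numpy as np

def box(img, r):
    """Mean over a (2r+1)^2 window via an integral image, edge-padded."""
    k = 2 * r + 1
    p = np.pad(img, r, mode='edge')
    c = np.pad(p.cumsum(0).cumsum(1), ((1, 0), (1, 0)))
    s = c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]
    return s / (k * k)

def guided_filter(I, p, r=2, eps=1e-3):
    """Edge-preserving smoothing of p guided by I: the output is locally
    linear in I, so edges present in the guide survive the smoothing."""
    mI, mp = box(I, r), box(p, r)
    a = (box(I * p, r) - mI * mp) / (box(I * I, r) - mI * mI + eps)
    b = mp - a * mI
    return box(a, r) * I + box(b, r)

# Self-guided smoothing of a noisy step edge keeps the edge sharp:
rng = np.random.default_rng(0)
step = np.hstack([np.zeros((16, 8)), np.ones((16, 8))])
noisy = step + 0.1 * rng.standard_normal(step.shape)
out = guided_filter(noisy, noisy, r=2, eps=0.01)
```

This edge-preserving behavior is what lets the procedure enhance feature boundaries while restraining limb darkening, at a cost close to a few box filters per image.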

  3. Artifact reduction of compressed images and video combining adaptive fuzzy filtering and directional anisotropic diffusion

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Forchhammer, Søren; Korhonen, Jari

    2011-01-01

    Fuzzy filtering is one of the recently developed methods for reducing distortion in compressed images and video. In this paper, we combine the powerful anisotropic diffusion equations with fuzzy filtering in order to reduce the impact of artifacts. Based on the directional nature of the blocking and ringing artifacts, we have applied directional anisotropic diffusion. Besides that, the selection of the adaptive threshold parameter for the diffusion coefficient has also improved the performance of the algorithm. Experimental results on JPEG compressed images as well as MJPEG and H.264 compressed videos show improvement in artifact reduction of the proposed algorithm over other directional and spatial fuzzy filters.
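Directional anisotropic diffusion builds on the classic Perona-Malik scheme, in which a conductance term shuts diffusion down across strong gradients. A minimal, non-directional sketch of that underlying idea, with illustrative parameters (not the paper's directional variant):

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, step=0.2):
    """Perona-Malik diffusion: flat regions are smoothed while the
    conductance g = exp(-(|grad|/kappa)^2) blocks flow across edges."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)
    for _ in range(n_iter):
        dn = np.roll(u, 1, 0) - u      # differences toward 4 neighbours
        ds = np.roll(u, -1, 0) - u
        de = np.roll(u, -1, 1) - u
        dw = np.roll(u, 1, 1) - u
        u += step * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

# Noisy step edge: the noise is smoothed, the 0 -> 100 edge survives.
rng = np.random.default_rng(0)
step_img = np.hstack([np.zeros((16, 8)), 100 * np.ones((16, 8))])
noisy = step_img + 2 * rng.standard_normal(step_img.shape)
out = anisotropic_diffusion(noisy)
```

The artifact-reduction method above steers this diffusion along the block and ringing directions instead of applying it isotropically.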

  4. Demosaicing and Superresolution for Color Filter Array via Residual Image Reconstruction and Sparse Representation

    OpenAIRE

    Sun, Guangling

    2012-01-01

    A framework of demosaicing and superresolution for color filter array (CFA) images via residual image reconstruction and sparse representation is presented. Given the intermediate image produced by a certain demosaicing and interpolation technique, a residual image between the final reconstruction image and the intermediate image is reconstructed using sparse representation. The final reconstruction image has richer edges and details than that of the intermediate image. Specifically, a generic dictionar...

  5. An Extension to a Filter Implementation of Local Quadratic Surface for Image Noise Estimation

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    1999-01-01

    Based on regression analysis, this paper gives a description of simple image filter design. Specifically, 3x3 filter implementations of a quadratic surface, residuals from this surface, gradients, and the Laplacian are given. For the residual, a 5x5 filter is also given. It is shown that the 3x3 … it is concluded that if striping is to be considered as a part of the noise, the residual from a 3x3 median filter seems best. If we are interested in a salt-and-pepper noise estimator, the proposed extension to the 3x3 filter for the residual from a quadratic surface seems best. Simple statistics...

  6. SU-F-J-28: Development of a New Imaging Filter to Remove the Shadows From the Carbon Fiber Grid Table Top

    Energy Technology Data Exchange (ETDEWEB)

    Maehana, W [Kanagawa Cancer Center, Yokohama, Kanagawa (Japan); Yokohama National University, Yokohama, Kanagawa (Japan)]; Nagao, T [Yokohama National University, Yokohama, Kanagawa (Japan)

    2016-06-15

    Purpose: In image guided radiation therapy (IGRT), shadows caused by the construction of the treatment couch top adversely affect visual evaluation. We therefore developed a new imaging filter to remove these shadows and evaluated its performance using clinical images. Methods: The new filter is composed of a band-pass filter (BPF) weighted by a factor k and a low-pass filter (LPF). In the frequency domain, the stop bands were 8.3×10³ mm⁻¹ in the u direction and 11.1×10³ mm⁻¹ in the v direction for the BPF, and the pass bands were 8.3×10³ mm⁻¹ in the u direction and 11.1×10³ mm⁻¹ in the v direction for the LPF. After applying the filter, the shadows from the carbon fiber grid table top (CFGTT, Varian) in the kV image were removed. To check the filter's effect, we compared clinical images of the thorax and thoracoabdominal region with and without the filter. A subjective evaluation using a three-point scale (agree, neither agree nor disagree, disagree) was performed with 15 persons in the department of radiation oncology. Results: We succeeded in removing all shadows of the CFGTT using the new filter. The filter's usefulness is supported by the subjective evaluation, in which 23/30 responses agreed with the filtered clinical images. Conclusion: The proposed method is a useful tool for IGRT, and the new filter leads to improved accuracy of radiation therapy.
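The published filter combines a weighted band-pass and a low-pass; the underlying operation of suppressing the grid's signature in the frequency domain can be conveyed with a simpler radial band-stop sketch. The [0.2, 0.3) cycles/pixel band and the test pattern here are illustrative, not the published bandwidths.

```python
import numpy as np

def bandstop_filter(img, lo=0.2, hi=0.3):
    """Zero out a radial band of normalized spatial frequencies."""
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(w))[None, :]
    rad = np.hypot(fy, fx)
    F[(rad >= lo) & (rad < hi)] = 0          # stop band
    return np.fft.ifft2(np.fft.ifftshift(F)).real

# A sinusoidal "grid shadow" at 0.25 cycles/pixel vanishes:
x = np.arange(64)
grid = np.cos(2 * np.pi * 0.25 * x)[None, :] * np.ones((64, 1))
clean = bandstop_filter(grid)
print(np.abs(clean).max())  # ~0: the periodic pattern is removed
```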

  7. Tin-filter enhanced dual-energy-CT: image quality and accuracy of CT numbers in virtual noncontrast imaging.

    Science.gov (United States)

    Kaufmann, Sascha; Sauter, Alexander; Spira, Daniel; Gatidis, Sergios; Ketelsen, Dominik; Heuschmid, Martin; Claussen, Claus D; Thomas, Christoph

    2013-05-01

    To measure and compare the objective image quality of true noncontrast (TNC) images with virtual noncontrast (VNC) images acquired by tin-filter-enhanced, dual-source, dual-energy computed tomography (DECT) of the upper abdomen. Sixty-three patients received unenhanced abdominal CT and enhanced abdominal DECT (100/140 kV with tin filter) in the portal-venous phase. VNC images were calculated from the DECT datasets using commercially available software. The mean attenuation of relevant tissues and image quality were compared between the TNC and VNC images. Image quality was rated objectively by measuring image noise and the sharpness of object edges using custom-designed software. Measurements were compared using Student's two-tailed t-test. Correlation coefficients for tissue attenuation measurements between TNC and VNC were calculated and the relative deviations were illustrated using Bland-Altman plots. Mean attenuation differences between TNC and VNC (HU(TNC) - HU(VNC)) image sets were as follows: right liver lobe -4.94 Hounsfield units (HU), left liver lobe -3.29 HU, vena cava -2.19 HU, spleen -7.46 HU, pancreas 1.29 HU, fat -11.14 HU, aorta 1.29 HU, bone marrow 36.83 HU (all P < .05). Good correlations between the VNC and TNC series were observed for liver, vena portae, kidneys, pancreas, muscle and bone marrow (Pearson's correlation coefficient ≥0.75). Mean image noise was significantly higher in TNC images (P < .05), while edge sharpness did not differ significantly between VNC and TNC images (P = .19). The Hounsfield units in VNC images closely resemble TNC images in the majority of the organs of the upper abdomen (kidneys, liver, pancreas). In spleen and fat, Hounsfield numbers in VNC images tend to be higher than in TNC images. VNC images show low image noise and satisfactory edge sharpness. Other criteria of image quality and the depiction of certain lesions need to be evaluated additionally. Copyright © 2013 AUR. Published by Elsevier Inc. All rights reserved.

  8. Image Vector Quantization codec indexes filtering

    Directory of Open Access Journals (Sweden)

    Lakhdar Moulay Abdelmounaim

    2012-01-01

    Vector Quantisation (VQ) is an efficient coding algorithm that has been widely used in the field of video and image coding due to its fast decoding efficiency. However, the indexes of VQ are sometimes lost because of signal interference during transmission. In this paper, we propose an efficient estimation method to conceal and recover the lost indexes on the decoder side, to avoid re-transmitting the whole image. If the image or video has a limited period of validity, re-transmitting the data wastes time and network bandwidth. Therefore, using the correctly received data to estimate and recover the lost data is efficient in time-constrained situations, such as network conferencing or mobile transmissions. In natural images, pixels are correlated with their neighbours; since VQ partitions the image into sub-blocks and quantises them to the indexes that are transmitted, the correlation between adjacent indexes is very strong. The proposed method has two parts: pre-processing and an estimation process. In pre-processing, we modify the order of codevectors in the VQ codebook to increase the correlation among neighbouring vectors. We then use a special filtering method in the estimation process. Using conventional VQ to compress the Lena image and transmit it without any loss of index can achieve a PSNR of 30.429 dB at the decoder. The simulation results demonstrate that our method can estimate the indexes to achieve PSNR values of 29.084 and 28.327 dB when the loss rate is 0.5% and 1%, respectively.

  9. A multi-stage noise adaptive switching filter for extremely corrupted images

    Science.gov (United States)

    Dinh, Hai; Adhami, Reza; Wang, Yi

    2015-07-01

    A multi-stage noise adaptive switching filter (MSNASF) is proposed for the restoration of images extremely corrupted by impulse and impulse-like noise. The filter consists of two steps: noise detection and noise removal. The proposed extrema-based noise detection scheme utilizes the false contouring effect to achieve a better over-detection rate at low noise density. It is adaptive and detects not only impulse but also impulse-like noise. In the noise removal step, a novel multi-stage filtering scheme is proposed. It replaces each corrupted pixel with the nearest uncorrupted median to preserve details. When compared with other methods, MSNASF provides better peak signal to noise ratio (PSNR) and structural similarity index (SSIM). A subjective evaluation carried out online also demonstrates that MSNASF yields higher fidelity.

  10. Cat Swarm Optimization Based Functional Link Artificial Neural Network Filter for Gaussian Noise Removal from Computed Tomography Images

    Directory of Open Access Journals (Sweden)

    M. Kumar

    2016-01-01

    Gaussian noise is one of the dominant noise sources degrading the quality of acquired Computed Tomography (CT) image data. It creates difficulties in pathological identification and the diagnosis of disease. Gaussian noise elimination is desirable to improve the clarity of a CT image for clinical, diagnostic, and postprocessing applications. This paper proposes an evolutionary nonlinear adaptive filter approach, using a Cat Swarm Functional Link Artificial Neural Network (CS-FLANN), to remove the unwanted noise. The structure of the proposed filter is based on the Functional Link Artificial Neural Network (FLANN), and Cat Swarm Optimization (CSO) is utilized for the selection of the optimum weights of the neural network filter. The applied filter has been compared with existing linear filters, such as the mean filter and the adaptive Wiener filter. Performance indices, such as peak signal to noise ratio (PSNR), have been computed for the quantitative analysis of the proposed filter. The experimental evaluation established the superiority of the proposed filtering technique over existing methods.

  11. Switching non-local vector median filter

    Science.gov (United States)

    Matsuoka, Jyohei; Koga, Takanori; Suetake, Noriaki; Uchino, Eiji

    2016-04-01

    This paper describes a novel image filtering method that removes random-valued impulse noise superimposed on a natural color image. In impulse noise removal, it is essential to employ a switching-type filtering method, as used in the well-known switching median filter, to preserve the detail of the original image with good quality. In color image filtering, it is generally preferable to deal with the red (R), green (G), and blue (B) components of each pixel of a color image as elements of a vectorized signal, as in the well-known vector median filter, rather than as component-wise signals, to prevent a color shift after filtering. By taking these fundamentals into consideration, we propose a switching-type vector median filter with non-local processing that mainly consists of a noise detector and a noise removal filter. Concretely, we propose a noise detector that proactively detects noise-corrupted pixels by focusing attention on the isolation tendencies of pixels of interest, not in the input image itself but in difference images between the RGB components. Furthermore, as the noise removal filter, we propose an extended version of the non-local median filter that we proposed previously for grayscale image processing, named the non-local vector median filter, which is designed for color image processing. The proposed method realizes a superior balance between the preservation of detail and impulse noise removal through proactive noise detection and non-local switching vector median filtering, respectively. The effectiveness and validity of the proposed method are verified in a series of experiments using natural color images.
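The vector median at the heart of such filters picks, from the window, the pixel whose summed distance to all other pixels is smallest, so the output is always one of the observed colors and no color shift is introduced. A sketch (the window values are made up):

```python
import numpy as np

def vector_median(vectors):
    """Return the sample minimizing the summed Euclidean distance to
    all others; unlike channel-wise medians, it never mixes channels."""
    V = np.asarray(vectors, dtype=float)
    d = np.linalg.norm(V[:, None, :] - V[None, :, :], axis=2)
    return V[d.sum(axis=1).argmin()]

window = [(250, 10, 10),                      # impulse-corrupted pixel
          (12, 200, 30), (10, 198, 33), (11, 205, 29),
          (13, 199, 31), (9, 201, 32), (12, 203, 28),
          (10, 200, 30), (11, 202, 31)]
vm = vector_median(window)
print(vm)  # one of the greenish pixels, never the impulse
```

A switching variant applies this replacement only at pixels flagged by the noise detector, leaving uncorrupted pixels untouched.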

  12. [Testing method research for key performance indicator of imaging acousto-optic tunable filter (AOTF)].

    Science.gov (United States)

    Hu, Shan-Zhou; Chen, Fen-Fei; Zeng, Li-Bo; Wu, Qiong-Shui

    2013-01-01

    Imaging AOTF is an important optical filter component for the new spectral imaging instruments developed in recent years. The principle of the imaging AOTF component is demonstrated, and a set of testing methods for key performance indicators is presented, including diffraction efficiency, wavelength shift with temperature, spatial homogeneity of diffraction efficiency, and imaging shift.

  13. Pornographic image detection with Gabor filters

    Science.gov (United States)

    Durrell, Kevan; Murray, Daniel J. C.

    2002-04-01

    As Internet-enabled computers become ubiquitous in homes, schools, and other publicly accessible locations, there are more people 'surfing the net' who would prefer not to be exposed to offensive material. There is a lot of material freely available on the Internet that we, as a responsible and caring society, would like to keep our children from viewing. Pornographic image content is one category of material over which we would like some control. We have been conducting experiments to determine the effectiveness of using characteristic feature vectors and neural networks to identify semantic image content. This paper describes our approach to identifying pornographic images using Gabor filters, Principal Component Analysis (PCA), correlograms, and neural networks. In brief, we used a set of 5,000 typical images available from the Internet, 20% of which were judged to be pornographic, to train a neural network. We then applied the trained neural network to feature vectors from images that had not been used in training. We measure our performance as recall (how many of the verification images labeled pornographic were correctly identified) and precision (how many images deemed pornographic by the neural network are in fact pornographic). The set of images used in the experiment described in this paper for the training and validation sets is freely available on the Internet. Freely available is an important qualifier, since high-resolution, studio-quality pornographic images are often protected by portals that charge members a fee to gain access to their material. Although this is not a hard and fast rule, many of the pornographic images that are available easily and without charge on the Internet are of low image quality. Some of these images are collages or contain textual elements or have had their resolution intentionally lowered to reduce their file size. These are the offensive images that a user, without a credit card, might inadvertently come
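
    A Gabor filter of the kind used for these feature vectors is a sinusoid modulated by a Gaussian envelope; a sketch of a single real-valued kernel (the parameter values are illustrative, not taken from the paper):

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """Real-valued Gabor kernel: a cosine carrier at orientation
    `theta`, modulated by an isotropic Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # rotate coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier
```

    Convolving an image with a bank of such kernels at several orientations and wavelengths yields texture responses, which can then be reduced with PCA before classification.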

  14. Perfect blind restoration of images blurred by multiple filters: theory and efficient algorithms.

    Science.gov (United States)

    Harikumar, G; Bresler, Y

    1999-01-01

    We address the problem of restoring an image from its noisy convolutions with two or more unknown finite impulse response (FIR) filters. We develop theoretical results about the existence and uniqueness of solutions, and show that under some generically true assumptions, both the filters and the image can be determined exactly in the absence of noise, and stably estimated in its presence. We present efficient algorithms to estimate the blur functions and their sizes. These algorithms are of two types, subspace-based and likelihood-based, and are extensions of techniques proposed for the solution of the multichannel blind deconvolution problem in one dimension. We present memory and computation-efficient techniques to handle the very large matrices arising in the two-dimensional (2-D) case. Once the blur functions are determined, they are used in a multichannel deconvolution step to reconstruct the unknown image. The theoretical and practical implications of edge effects, and "weakly exciting" images are examined. Finally, the algorithms are demonstrated on synthetic and real data.

  15. Convergent Filter Bases

    Directory of Open Access Journals (Sweden)

    Coghetto Roland

    2015-09-01

    We are inspired by the work of Henri Cartan [16], Bourbaki [10] (TG. I Filtres) and Claude Wagschal [34]. We define the base of a filter, image filter, convergent filter bases, limit filter and the filter base of tails (fr: filtre des sections).

  16. Retina-Inspired Filter.

    Science.gov (United States)

    Doutsi, Effrosyni; Fillatre, Lionel; Antonini, Marc; Gaulmin, Julien

    2018-07-01

    This paper introduces a novel filter inspired by the human retina. The human retina consists of three different layers: the outer plexiform layer (OPL), the inner plexiform layer, and the ganglionic layer. Our inspiration is the linear transform which takes place in the OPL and has been mathematically described by the neuroscientific model "virtual retina." This model is the cornerstone from which we derive the non-separable spatio-temporal OPL retina-inspired filter, referred to below simply as the retina-inspired filter, studied in this paper. This filter is connected to the dynamic behavior of the retina, which enables the retina to increase the sharpness of the visual stimulus during filtering, before its transmission to the brain. We establish that this retina-inspired transform forms a group of spatio-temporal weighted difference of Gaussian (WDoG) filters when it is applied to a still image visible for a given time. We analyze the spatial frequency bandwidth of the retina-inspired filter with respect to time. It is shown that the WDoG spectrum varies from a lowpass to a bandpass filter. Therefore, as time increases, the retina-inspired filter enables the extraction of different kinds of information from the input image. Finally, we discuss the benefits of using the retina-inspired filter in image processing applications such as edge detection and compression.
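
    At a fixed time instant, a WDoG kernel reduces to a weighted difference of two normalized Gaussians; a spatial sketch (the weights and scales below are illustrative, not the model's time-dependent values):

```python
import numpy as np

def wdog_kernel(size, sigma_c, sigma_s, w_c=1.0, w_s=1.0):
    """Weighted Difference-of-Gaussians: w_c*G(sigma_c) - w_s*G(sigma_s),
    each Gaussian normalized to unit sum (sigma_c < sigma_s)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    r2 = x**2 + y**2

    def g(sigma):
        k = np.exp(-r2 / (2 * sigma**2))
        return k / k.sum()

    return w_c * g(sigma_c) - w_s * g(sigma_s)
```

    With equal weights the kernel has zero DC response (a bandpass filter); unbalancing the weights restores a lowpass component, mirroring the lowpass-to-bandpass transition described above.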

  17. Chip-scale fluorescence microscope based on a silo-filter complementary metal-oxide semiconductor image sensor.

    Science.gov (United States)

    Ah Lee, Seung; Ou, Xiaoze; Lee, J Eugene; Yang, Changhuei

    2013-06-01

    We demonstrate a silo-filter (SF) complementary metal-oxide semiconductor (CMOS) image sensor for a chip-scale fluorescence microscope. The extruded pixel design with metal walls between neighboring pixels guides fluorescence emission through the thick absorptive filter to the photodiode of a pixel. Our prototype device achieves 13 μm resolution over a wide field of view (4.8 mm × 4.4 mm). We demonstrate bright-field and fluorescence longitudinal imaging of living cells in a compact, low-cost configuration.

  18. Error analysis of filtering operations in pixel-duplicated images of diabetic retinopathy

    Science.gov (United States)

    Mehrubeoglu, Mehrube; McLauchlan, Lifford

    2010-08-01

    In this paper, diabetic retinopathy is chosen for a sample target image to demonstrate the effectiveness of image enlargement through pixel duplication in identifying regions of interest. Pixel duplication is presented as a simpler alternative to data interpolation techniques for detecting small structures in the images. A comparative analysis is performed on different image processing schemes applied to both original and pixel-duplicated images. Structures of interest are detected and classification parameters optimized for minimum false positive detection in the original and enlarged retinal pictures. The error analysis demonstrates the advantages as well as shortcomings of pixel duplication in image enhancement when spatial averaging operations (smoothing filters) are also applied.
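
    Pixel duplication itself is just nearest-neighbour replication with no interpolation; a one-line sketch of the enlargement step (not the paper's full pipeline):

```python
import numpy as np

def duplicate_pixels(image, factor):
    """Enlarge `image` by an integer `factor` via pixel duplication:
    each pixel is replicated into a factor x factor block."""
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)
```

    Because every output value is an exact copy of an input value, no new intensities are invented, which is what distinguishes this from bilinear or bicubic interpolation.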

  19. Collaborating Filtering Community Image Recommendation System Based on Scene

    Directory of Open Access Journals (Sweden)

    He Tao

    2017-01-01

    With the advancement of the smart city and the development of intelligent mobile terminals and wireless networks, traditional text information services no longer meet the needs of community residents, and community image services have appeared as a new media service. Since a picture is worth a thousand words, image services have become a new way for community residents to understand and follow community developments. However, there are two major problems in image information services. First, the low-level feature values extracted by current image feature extraction techniques are difficult for users to understand, and there is a semantic gap between the image content itself and the user's understanding. Second, as image data in community life grows rapidly, it is difficult for users to find the images they are interested in. Aiming at these two problems, this paper proposes a unified image semantic scene model to express image content. On this basis, a collaborative filtering recommendation model fusing scene semantics is proposed. In the recommendation model, a comprehensive and accurate user interest model is proposed to improve recommendation quality. The approach has achieved good results in the pilot cities of Wenzhou and Yan'an, where it is in normal operation.

  20. Mathematical filtering minimizes metallic halation of titanium implants in MicroCT images.

    Science.gov (United States)

    Ha, Jee; Osher, Stanley J; Nishimura, Ichiro

    2013-01-01

    Microcomputed tomography (MicroCT) images containing titanium implants suffer from x-ray scattering artifacts, and the implant surface is critically affected by metallic halation. To reduce the metallic halation artifact, a nonlinear total variation denoising algorithm, the Split Bregman algorithm, was applied to the digital data set of MicroCT images. This study demonstrated that the use of a mathematical filter could successfully reduce metallic halation, facilitating the evaluation of osseointegration at the bone-implant interface in the reconstructed images.
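
    Split Bregman itself introduces auxiliary variables and shrinkage steps; as a simpler illustration of the same total-variation (ROF) objective it minimizes, here is a toy gradient-descent denoiser (not the paper's solver, and the parameters are illustrative):

```python
import numpy as np

def tv_denoise(img, weight=0.3, eps=1e-2, n_iter=200, step=0.05):
    """Gradient descent on the smoothed ROF objective
    ||u - img||^2 / 2 + weight * sum sqrt(|grad u|^2 + eps).
    A toy stand-in for the Split Bregman TV solver."""
    img = img.astype(np.float64)
    u = img.copy()
    for _ in range(n_iter):
        # forward differences (zero at the far boundary)
        gx = np.diff(u, axis=1, append=u[:, -1:])
        gy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(gx**2 + gy**2 + eps)
        px, py = gx / mag, gy / mag
        # divergence of the normalized gradient field
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u -= step * ((u - img) - weight * div)
    return u
```

    On a noisy near-constant region the TV term flattens the noise while the fidelity term keeps the mean intensity, which is the behaviour exploited to suppress halation without erasing the implant edge.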

  1. Passive ranging using a filter-based non-imaging method based on oxygen absorption.

    Science.gov (United States)

    Yu, Hao; Liu, Bingqi; Yan, Zongqun; Zhang, Yu

    2017-10-01

    To solve the problem of poor real-time measurement caused by a hyperspectral imaging system and to simplify the design in passive ranging technology based on the oxygen absorption spectrum, a filter-based non-imaging ranging method is proposed. In this method, three bandpass filters are used to obtain the source radiation intensities that are located in the oxygen absorption band near 762 nm and the band's left and right non-absorption shoulders, and a photomultiplier tube is used as the non-imaging sensor of the passive ranging system. Range is estimated by comparing the calculated values of band-average transmission due to oxygen absorption, τ_O2, against the predicted curve of τ_O2 versus range. The method is tested under short-range conditions. Accuracy of 6.5% is achieved with the designed experimental ranging system at the range of 400 m.
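
    The core inversion step can be sketched as follows, assuming (for illustration only) a linear continuum interpolated from the two shoulder bands and a single effective absorption coefficient `alpha`; the actual system compares τ_O2 against a predicted curve rather than inverting a closed form:

```python
import math

def estimate_range(i_band, i_left, i_right, alpha):
    """Toy passive-ranging step: band-average oxygen transmission is
    the ratio of the in-band intensity to the continuum interpolated
    from the shoulder bands; Beer-Lambert inversion with effective
    coefficient `alpha` (1/m) then gives the range in metres."""
    continuum = 0.5 * (i_left + i_right)  # linear continuum estimate
    tau = i_band / continuum              # band-average transmission
    return -math.log(tau) / alpha
```
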

  2. Image denoising using new pixon representation based on fuzzy filtering and partial differential equations

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Nikpour, Mohsen

    2012-01-01

    In this paper, we have proposed two extensions to pixon-based image modeling. The first one is using bicubic interpolation instead of bilinear interpolation and the second one is using fuzzy filtering method, aiming to improve the quality of the pixonal image. Finally, partial differential...

  3. Noise reduction and functional maps image quality improvement in dynamic CT perfusion using a new k-means clustering guided bilateral filter (KMGB).

    Science.gov (United States)

    Pisana, Francesco; Henzler, Thomas; Schönberg, Stefan; Klotz, Ernst; Schmidt, Bernhard; Kachelrieß, Marc

    2017-07-01

    Dynamic CT perfusion (CTP) consists of repeated acquisitions of the same volume at different time steps, slightly before, during, and slightly after the injection of contrast media. Important functional information can be derived for each voxel, reflecting the local hemodynamic properties and hence the metabolism of the tissue. Different approaches are being investigated to exploit data redundancy and prior knowledge for noise reduction of such datasets, ranging from iterative reconstruction schemes to high dimensional filters. We propose a new spatial bilateral filter which makes use of the k-means clustering algorithm and of an optimally calculated guiding image. We named the proposed filter the k-means clustering guided bilateral filter (KMGB). In this study, the KMGB filter is compared with the partial temporal non-local means filter (PATEN), with the time-intensity profile similarity (TIPS) filter, and with a new version derived from it by introducing the guiding image (GB-TIPS). All the filters were tested on a digital in-house developed brain CTP phantom, where noise was added to simulate 80 kV and 200 mAs (default scanning parameters), 100 mAs and 30 mAs. Moreover, the filters' performance was tested on 7 noisy clinical datasets with different pathologies in different body regions. The original contribution of our work is twofold: first, we propose an efficient algorithm to calculate a guiding image to improve the results of the TIPS filter; second, we propose the introduction of the k-means clustering step and demonstrate how this can potentially replace the TIPS part of the filter, obtaining better results at lower computational effort. As expected, in the GB-TIPS, the introduction of the guiding image limits the over-smoothing of the TIPS filter, improving spatial resolution by more than 50%. Furthermore, replacing the time-intensity profile similarity calculation with a fuzzy k-means clustering strategy (KMGB) allows control of the edge preserving
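
    The bilateral filter underlying KMGB weights each neighbour by both spatial and intensity proximity; a brute-force grayscale sketch, without the k-means guidance step (parameters illustrative):

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=10.0):
    """Brute-force bilateral filter: each output pixel is a weighted
    average over a window, with Gaussian weights penalizing both
    spatial distance (sigma_s) and intensity difference (sigma_r),
    which preserves edges while smoothing flat regions."""
    img = img.astype(np.float64)
    pad = np.pad(img, radius, mode='edge')
    out = np.zeros_like(img)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            win = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            range_w = np.exp(-(win - img[i, j])**2 / (2 * sigma_r**2))
            w = spatial * range_w
            out[i, j] = (w * win).sum() / w.sum()
    return out
```

    In KMGB the range weight would be computed on the guiding image (a cluster-label or low-noise surrogate) rather than on the noisy input itself.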

  4. Tunable electro-optic filter stack

    Science.gov (United States)

    Fontecchio, Adam K.; Shriyan, Sameet K.; Bellingham, Alyssa

    2017-09-05

    A holographic polymer dispersed liquid crystal (HPDLC) tunable filter exhibits switching times of no more than 20 microseconds. The HPDLC tunable filter can be utilized in a variety of applications. An HPDLC tunable filter stack can be utilized in a hyperspectral imaging system capable of spectrally multiplexing hyperspectral imaging data acquired while the hyperspectral imaging system is airborne. HPDLC tunable filter stacks can be utilized in high speed switchable optical shielding systems, for example as a coating for a visor or an aircraft canopy. These HPDLC tunable filter stacks can be fabricated using a spin coating apparatus and associated fabrication methods.

  5. Quality evaluation of no-reference MR images using multidirectional filters and image statistics.

    Science.gov (United States)

    Jang, Jinseong; Bang, Kihun; Jang, Hanbyol; Hwang, Dosik

    2018-09-01

    This study aimed to develop a fully automatic, no-reference image-quality assessment (IQA) method for MR images. New quality-aware features were obtained by applying multidirectional filters to MR images and examining the feature statistics. A histogram of these features was then fitted to a generalized Gaussian distribution function for which the shape parameters yielded different values depending on the type of distortion in the MR image. Standard feature statistics were established through a training process based on high-quality MR images without distortion. Subsequently, the feature statistics of a test MR image were calculated and compared with the standards. The quality score was calculated as the difference between the shape parameters of the test image and the undistorted standard images. The proposed IQA method showed a >0.99 correlation with the conventional full-reference assessment methods; accordingly, this proposed method yielded the best performance among no-reference IQA methods for images containing six types of synthetic, MR-specific distortions. In addition, for authentically distorted images, the proposed method yielded the highest correlation with subjective assessments by human observers, thus demonstrating its superior performance over other no-reference IQAs. Our proposed IQA was designed to consider MR-specific features and outperformed other no-reference IQAs designed mainly for photographic images. Magn Reson Med 80:914-924, 2018. © 2018 International Society for Magnetic Resonance in Medicine.
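
    Estimating the shape parameter of a generalized Gaussian from feature histograms, as done here, is commonly handled by moment matching; an illustrative sketch (not the authors' estimator):

```python
import math
import numpy as np

def ggd_shape(x, betas=np.linspace(0.2, 5.0, 4801)):
    """Moment-matching estimate of the generalized Gaussian shape
    parameter beta: match rho = (E|x|)^2 / E[x^2] against the
    theoretical ratio Gamma(2/b)^2 / (Gamma(1/b) * Gamma(3/b))."""
    x = np.asarray(x, float)
    rho = np.mean(np.abs(x))**2 / np.mean(x**2)
    theory = np.array([math.gamma(2 / b)**2 /
                       (math.gamma(1 / b) * math.gamma(3 / b))
                       for b in betas])
    return betas[int(np.argmin(np.abs(theory - rho)))]
```

    A Gaussian histogram yields beta near 2 and a Laplacian near 1; distortion types shift beta away from the statistics of undistorted MR images, which is what drives the quality score.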

  6. Choosing and using astronomical filters

    CERN Document Server

    Griffiths, Martin

    2014-01-01

    As a casual read through any of the major amateur astronomical magazines will demonstrate, there are filters available for all aspects of optical astronomy. This book provides a ready resource on the use of the following filters, among others, for observational astronomy or for imaging: light pollution filters, planetary filters, solar filters, neutral density filters for Moon observation, and deep-sky filters for such objects as galaxies, nebulae and more. Deep-sky objects can be imaged in much greater detail than was possible many years ago. Amateur astronomers can take

  7. A Kalman Filter-Based Method to Generate Continuous Time Series of Medium-Resolution NDVI Images

    Directory of Open Access Journals (Sweden)

    Fernando Sedano

    2014-12-01

    A data assimilation method to produce complete temporal sequences of synthetic medium-resolution images is presented. The method implements a Kalman filter recursive algorithm that integrates medium and moderate resolution imagery. To demonstrate the approach, time series of 30-m spatial resolution NDVI images at 16-day time steps were generated using Landsat NDVI images and MODIS NDVI products at four sites with different ecosystems and land cover-land use dynamics. The results show that the time series of synthetic NDVI images captured seasonal land surface dynamics and maintained the spatial structure of the landscape at higher spatial resolution. The time series of synthetic medium-resolution NDVI images were validated within a Monte Carlo simulation framework. Normalized residuals decreased as the number of available observations increased, ranging from 0.2 to below 0.1. Residuals were also significantly lower for time series of synthetic NDVI images generated at combined recursion (smoothing) than individually at forward and backward recursions (filtering). Conversely, the uncertainties of the synthetic images also decreased when the number of available observations increased and combined recursions were implemented.
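
    The forward (filtering) recursion of such a Kalman filter, reduced to a scalar random-walk NDVI state per pixel with missing observations, can be sketched as follows (the noise variances are illustrative, not the paper's calibrated values):

```python
import numpy as np

def kalman_filter_series(obs, q=0.001, r=0.01, x0=0.5, p0=1.0):
    """Forward Kalman filter for a scalar random-walk state:
    x_t = x_{t-1} + w (var q), observation y_t = x_t + v (var r).
    NaNs mark dates with no usable observation; the filter then
    propagates the prediction only, so uncertainty p grows."""
    x, p = x0, p0
    out = []
    for y in obs:
        p = p + q                  # predict: state unchanged, var grows
        if not np.isnan(y):
            k = p / (p + r)        # Kalman gain
            x = x + k * (y - x)    # update toward the observation
            p = (1 - k) * p
        out.append(x)
    return np.array(out)
```

    Running this forward pass, a backward pass, and combining them (smoothing) is what reduces the residuals further in the study above.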

  8. Wavelet Filter Banks for Super-Resolution SAR Imaging

    Science.gov (United States)

    Sheybani, Ehsan O.; Deshpande, Manohar; Memarsadeghi, Nargess

    2011-01-01

    This paper discusses innovative wavelet-based filter banks designed to enhance the analysis of super-resolution synthetic aperture radar (SAR) images using parametric spectral methods and signal classification algorithms. SAR finds applications in many of NASA's earth science fields, such as deformation, ecosystem structure, and dynamics of ice, snow and cold land processes, and surface water and ocean topography. Traditionally, standard methods such as the fast Fourier transform (FFT) and inverse fast Fourier transform (IFFT) have been used to extract images from SAR radar data. Due to the non-parametric features of these methods, their resolution limitations and observation-time dependence, the use of spectral estimation and signal pre- and post-processing techniques based on wavelets to process SAR radar data has been proposed. Multi-resolution wavelet transforms and advanced spectral estimation techniques have proven to offer efficient solutions to this problem.
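
    A single level of a two-dimensional wavelet filter bank splits an image into one approximation and three detail sub-bands; the Haar pair is used below for brevity (the paper does not specify which wavelet family its filter banks use):

```python
import numpy as np

def haar_2d(img):
    """One level of a 2-D Haar filter bank on an even-sized image:
    returns LL (approximation) and LH, HL, HH (detail) sub-bands,
    each at half resolution."""
    a = (img[:, ::2] + img[:, 1::2]) / 2.0   # lowpass along columns
    d = (img[:, ::2] - img[:, 1::2]) / 2.0   # highpass along columns
    ll = (a[::2] + a[1::2]) / 2.0
    lh = (a[::2] - a[1::2]) / 2.0
    hl = (d[::2] + d[1::2]) / 2.0
    hh = (d[::2] - d[1::2]) / 2.0
    return ll, lh, hl, hh
```

    Recursing on the LL band yields the multi-resolution pyramid on which spectral estimation and classification can then operate band by band.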

  9. Improvement of nonlinear diffusion equation using relaxed geometric mean filter for low PSNR images

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan

    2013-01-01

    A new method to improve the performance of low PSNR image denoising is presented. The proposed scheme estimates edge gradient from an image that is regularised with a relaxed geometric mean filter. The proposed method consists of two stages; the first stage consists of a second order nonlinear an...

  10. Defogging of road images using gain coefficient-based trilateral filter

    Science.gov (United States)

    Singh, Dilbag; Kumar, Vijay

    2018-01-01

    Poor weather conditions are responsible for most of the road accidents year in and year out. Poor weather conditions, such as fog, degrade the visibility of objects. Thus, it becomes difficult for drivers to identify the vehicles in a foggy environment. The dark channel prior (DCP)-based defogging techniques have been found to be an efficient way to remove fog from road images. However, it produces poor results when image objects are inherently similar to airlight and no shadow is cast on them. To eliminate this problem, a modified restoration model-based DCP is developed to remove the fog from road images. The transmission map is also refined by developing a gain coefficient-based trilateral filter. Thus, the proposed technique has an ability to remove fog from road images in an effective manner. The proposed technique is compared with seven well-known defogging techniques on two benchmark foggy images datasets and five real-time foggy images. The experimental results demonstrate that the proposed approach is able to remove the different types of fog from roadside images as well as significantly improve the image's visibility. It also reveals that the restored image has little or no artifacts.
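
    The dark channel prior referred to here is the local minimum, over a patch, of the per-pixel RGB minimum; a direct sketch of that step alone (patch size illustrative; the paper's restoration model and trilateral refinement are not shown):

```python
import numpy as np

def dark_channel(img, patch=3):
    """Dark channel prior (He et al.): per-pixel minimum over the
    RGB channels, followed by a patch x patch local minimum filter.
    Haze-free regions tend toward zero; haze lifts the values."""
    mins = img.min(axis=2)                    # channel-wise minimum
    pad = patch // 2
    padded = np.pad(mins, pad, mode='edge')
    out = np.empty_like(mins)
    for i in range(mins.shape[0]):
        for j in range(mins.shape[1]):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return out
```

    The lifted dark-channel values provide the transmission estimate that defogging then inverts; objects similar to the airlight break this assumption, which motivates the modified model above.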

  11. Fringing in MonoCam Y4 filter images

    International Nuclear Information System (INIS)

    Brooks, J.; Nomerotski, A.; Fisher-Levine, M.

    2017-01-01

    We study the fringing patterns observed in MonoCam, a camera with a single Large Synoptic Survey Telescope (LSST) CCD sensor. Images were taken at the U.S. Naval Observatory in Flagstaff, Arizona (NOFS) employing its 1.3 m telescope and an LSST y4 filter. Fringing occurs due to the reflection of infrared light (700 nm or longer) from the bottom surface of the CCD, which constructively or destructively interferes with the incident light to produce a net "fringe" pattern superimposed on all images taken. Emission lines from the atmosphere, dominated by hydroxyl (OH) spectra, can change in their relative intensities as the night goes on, producing different fringe patterns in the images taken. We found through several methods that the general shape of the fringe patterns remained constant, though with slight changes in the amplitude and phase of the fringes. We also found that a superposition of fringes from two monochromatic lines taken in the lab offered a reasonable description of the sky data.

  12. Image-Based Multi-Target Tracking through Multi-Bernoulli Filtering with Interactive Likelihoods.

    Science.gov (United States)

    Hoak, Anthony; Medeiros, Henry; Povinelli, Richard J

    2017-03-03

    We develop an interactive likelihood (ILH) for sequential Monte Carlo (SMC) methods for image-based multiple target tracking applications. The purpose of the ILH is to improve tracking accuracy by reducing the need for data association. In addition, we integrate a recently developed deep neural network for pedestrian detection along with the ILH with a multi-Bernoulli filter. We evaluate the performance of the multi-Bernoulli filter with the ILH and the pedestrian detector in a number of publicly available datasets (2003 PETS INMOVE, Australian Rules Football League (AFL) and TUD-Stadtmitte) using standard, well-known multi-target tracking metrics (optimal sub-pattern assignment (OSPA) and classification of events, activities and relationships for multi-object trackers (CLEAR MOT)). In all datasets, the ILH term increases the tracking accuracy of the multi-Bernoulli filter.
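
    The OSPA metric used for evaluation combines an optimal assignment cost with a cardinality penalty; a brute-force sketch suitable for small point sets (the cutoff `c` and order `p` below are illustrative defaults, not the paper's settings):

```python
import itertools
import math

def ospa(X, Y, c=10.0, p=2):
    """OSPA distance between 2-D point sets X and Y (Schuhmacher et
    al.): per-point distances cut off at c, minimized over all
    assignments, plus a penalty c per unmatched point."""
    if len(X) > len(Y):
        X, Y = Y, X                      # ensure |X| <= |Y|
    m, n = len(X), len(Y)
    if n == 0:
        return 0.0
    best = min(
        sum(min(c, math.dist(x, Y[j])) ** p for x, j in zip(X, perm))
        for perm in itertools.permutations(range(n), m)
    )
    return ((best + c ** p * (n - m)) / n) ** (1 / p)
```

    For real tracker output, the brute-force minimum over permutations is replaced by a Hungarian-algorithm assignment, but the metric is the same.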

  13. Hyper-spectral modulation fluorescent imaging using double acousto-optical tunable filter based on TeO2-crystals

    International Nuclear Information System (INIS)

    Zaytsev, Kirill I; Perchik, Alexey V; Chernomyrdin, Nikita V; Yurchenko, Stanislav O; Kudrin, Konstantin G; Reshetov, Igor V

    2015-01-01

    We have proposed a method for hyper-spectral fluorescent imaging based on acousto-optical filtering. The object of interest was pumped using the ultraviolet radiation of a mercury lamp equipped with a monochromatic excitation filter with its window of transparency centered at 365 nm. A double TeO2-based acousto-optical filter, tunable in the range from 430 to 780 nm and having a 2 nm bandwidth of spectral transparency, was used to detect quasimonochromatic images of object fluorescence. Modulation of the ultraviolet pump intensity was used to reduce the impact of non-fluorescent background on fluorescent imaging of the sample. A technique for signal-to-noise ratio improvement, based on fluorescence intensity estimation via digital processing of a modulated video sequence of the fluorescent object, is introduced. We have applied the proposed technique to a test sample and discuss its possible applications.

  14. [Design Method Analysis and Performance Comparison of Wall Filter for Ultrasound Color Flow Imaging].

    Science.gov (United States)

    Wang, Lutao; Xiao, Jun; Chai, Hua

    2015-08-01

    The successful suppression of clutter arising from stationary or slowly moving tissue is one of the key issues in medical ultrasound color flow imaging. Remaining clutter may cause bias in the mean blood frequency estimation and result in a potentially misleading description of blood flow. In this paper, based on the principle of the general wall filter, the design processes of three classes of filters are reviewed and analyzed: infinite impulse response with projection initialization (Prj-IIR), polynomial regression (Pol-Reg), and eigen-based filters. The performance of the filters was assessed by calculating the bias and variance of the mean blood velocity estimate obtained with a standard autocorrelation estimator. Simulation results show that the performance of the Pol-Reg filter is similar to that of Prj-IIR filters. Both can offer accurate estimation of mean blood flow speed under steady clutter conditions, and their clutter rejection ability can be enhanced by increasing the ensemble size of the Doppler vector. Eigen-based filters can effectively remove the non-stationary clutter component and further improve the estimation accuracy for low speed blood flow signals. There is also no significant increase in computational complexity for eigen-based filters when the ensemble size is less than 10.
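
    Of the three classes, the Pol-Reg wall filter is the simplest to sketch: fit a low-order polynomial (the slowly varying tissue clutter) to the slow-time ensemble and keep the residual as the blood signal (an illustration, not the paper's code):

```python
import numpy as np

def polyreg_wall_filter(ensemble, order=1):
    """Polynomial-regression wall filter: project the slow-time
    Doppler ensemble onto a polynomial subspace of given order
    (the clutter model) and subtract, leaving the flow signal."""
    n = len(ensemble)
    basis = np.vander(np.arange(n), order + 1)   # polynomial basis
    coef, *_ = np.linalg.lstsq(basis, ensemble, rcond=None)
    return ensemble - basis @ coef
```

    Eigen-based filters replace the fixed polynomial basis with the dominant eigenvectors of the ensemble correlation matrix, which is what lets them track non-stationary clutter.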

  15. TH-CD-202-04: Evaluation of Virtual Non-Contrast Images From a Novel Split-Filter Dual-Energy CT Technique

    International Nuclear Information System (INIS)

    Huang, J; Szczykutowicz, T; Bayouth, J; Miller, J

    2016-01-01

    Purpose: To compare the ability of two dual-energy CT techniques, a novel split-filter single-source technique of superior temporal resolution against an established sequential-scan technique, to remove iodine contrast from images with minimal impact on CT number accuracy. Methods: A phantom containing 8 tissue substitute materials and vials of varying iodine concentrations (1.7–20.1 mg I /mL) was imaged using a Siemens Edge CT scanner. Dual-energy virtual non-contrast (VNC) images were generated using the novel split-filter technique, in which a 120kVp spectrum is filtered by tin and gold to create high- and low-energy spectra with < 1 second temporal separation between the acquisition of low- and high-energy data. Additionally, VNC images were generated with the sequential-scan technique (80 and 140kVp) for comparison. CT number accuracy was evaluated for all materials at 15, 25, and 35mGy CTDIvol. Results: The spectral separation was greater for the sequential-scan technique than the split-filter technique with dual-energy ratios of 2.18 and 1.26, respectively. Both techniques successfully removed iodine contrast, resulting in mean CT numbers within 60HU of 0HU (split-filter) and 40HU of 0HU (sequential-scan) for all iodine concentrations. Additionally, for iodine vials of varying diameter (2–20 mm) with the same concentration (9.9 mg I /mL), the system accurately detected iodine for all sizes investigated. Both dual-energy techniques resulted in reduced CT numbers for bone materials (by >400HU for the densest bone). Increasing the imaging dose did not improve the CT number accuracy for bone in VNC images. Conclusion: VNC images from the split-filter technique successfully removed iodine contrast. These results demonstrate a potential for improving dose calculation accuracy and reducing patient imaging dose, while achieving superior temporal resolution in comparison to sequential scans. For both techniques, inaccuracies in CT numbers for bone materials

  16. TH-CD-202-04: Evaluation of Virtual Non-Contrast Images From a Novel Split-Filter Dual-Energy CT Technique

    Energy Technology Data Exchange (ETDEWEB)

    Huang, J; Szczykutowicz, T; Bayouth, J; Miller, J [University of Wisconsin, Madison, WI (United States)

    2016-06-15

    Purpose: To compare the ability of two dual-energy CT techniques, a novel split-filter single-source technique of superior temporal resolution against an established sequential-scan technique, to remove iodine contrast from images with minimal impact on CT number accuracy. Methods: A phantom containing 8 tissue substitute materials and vials of varying iodine concentrations (1.7–20.1 mg I /mL) was imaged using a Siemens Edge CT scanner. Dual-energy virtual non-contrast (VNC) images were generated using the novel split-filter technique, in which a 120kVp spectrum is filtered by tin and gold to create high- and low-energy spectra with < 1 second temporal separation between the acquisition of low- and high-energy data. Additionally, VNC images were generated with the sequential-scan technique (80 and 140kVp) for comparison. CT number accuracy was evaluated for all materials at 15, 25, and 35mGy CTDIvol. Results: The spectral separation was greater for the sequential-scan technique than the split-filter technique with dual-energy ratios of 2.18 and 1.26, respectively. Both techniques successfully removed iodine contrast, resulting in mean CT numbers within 60HU of 0HU (split-filter) and 40HU of 0HU (sequential-scan) for all iodine concentrations. Additionally, for iodine vials of varying diameter (2–20 mm) with the same concentration (9.9 mg I /mL), the system accurately detected iodine for all sizes investigated. Both dual-energy techniques resulted in reduced CT numbers for bone materials (by >400HU for the densest bone). Increasing the imaging dose did not improve the CT number accuracy for bone in VNC images. Conclusion: VNC images from the split-filter technique successfully removed iodine contrast. These results demonstrate a potential for improving dose calculation accuracy and reducing patient imaging dose, while achieving superior temporal resolution in comparison to sequential scans. For both techniques, inaccuracies in CT numbers for bone materials

  17. Regularization of DT-MR images using a successive Fermat median filtering method.

    Science.gov (United States)

    Kwon, Kiwoon; Kim, Dongyoun; Kim, Sunghee; Park, Insung; Jeong, Jaewon; Kim, Taehwan; Hong, Cheolpyo; Han, Bongsoo

    2008-05-21

    Tractography using diffusion tensor magnetic resonance imaging (DT-MRI) is a method to determine the architecture of axonal fibers in the central nervous system by computing the direction of greatest diffusion in the white matter of the brain. To reduce the noise in DT-MRI measurements, a tensor-valued median filter, which is reported to be denoising and structure preserving in the tractography, is applied. In this paper, we proposed the successive Fermat (SF) method, successively using Fermat point theory for a triangle contained in the two-dimensional plane, as a median filtering method. We discussed the error analysis and numerical study about the SF method for phantom and experimental data. By considering the computing time and the image quality aspects of the numerical study simultaneously, we showed that the SF method is much more efficient than the simple median (SM) and gradient descents (GD) methods.
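
    The Fermat point underlying the SF method is the classical geometric median: the point minimizing the sum of Euclidean distances to a set of points. Weiszfeld's fixed-point iteration computes it; a generic sketch (not the paper's tensor-valued SF scheme):

```python
import numpy as np

def fermat_point(points, n_iter=100):
    """Weiszfeld's algorithm for the Fermat point (geometric median)
    of an (N, d) array: iteratively re-weight points by the inverse
    of their distance to the current estimate."""
    x = points.mean(axis=0)  # start from the centroid
    for _ in range(n_iter):
        d = np.linalg.norm(points - x, axis=1)
        if np.any(d < 1e-12):
            break            # estimate landed on a data point
        w = 1.0 / d
        x = (points * w[:, None]).sum(axis=0) / w.sum()
    return x
```

    For an equilateral triangle the Fermat point coincides with the centroid; for tensor-valued pixels the same idea is applied to the tensors in a filtering window, which is what makes the median structure preserving.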

  18. Regularization of DT-MR images using a successive Fermat median filtering method

    International Nuclear Information System (INIS)

    Kwon, Kiwoon; Kim, Dongyoun; Kim, Sunghee; Park, Insung; Jeong, Jaewon; Kim, Taehwan; Hong, Cheolpyo; Han, Bongsoo

    2008-01-01

    Tractography using diffusion tensor magnetic resonance imaging (DT-MRI) is a method to determine the architecture of axonal fibers in the central nervous system by computing the direction of greatest diffusion in the white matter of the brain. To reduce the noise in DT-MRI measurements, a tensor-valued median filter, which is reported to be denoising and structure preserving in the tractography, is applied. In this paper, we proposed the successive Fermat (SF) method, successively using Fermat point theory for a triangle contained in the two-dimensional plane, as a median filtering method. We discussed the error analysis and numerical study of the SF method for phantom and experimental data. By considering the computing time and the image quality aspects of the numerical study simultaneously, we showed that the SF method is much more efficient than the simple median (SM) and gradient descent (GD) methods.

  19. Regularization of DT-MR images using a successive Fermat median filtering method

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kiwoon; Kim, Dongyoun; Kim, Sunghee; Park, Insung; Jeong, Jaewon; Kim, Taehwan [Department of Biomedical Engineering, Yonsei University, Wonju, 220-710 (Korea, Republic of); Hong, Cheolpyo; Han, Bongsoo [Department of Radiological Science, Yonsei University, Wonju, 220-710 (Korea, Republic of)], E-mail: bshan@yonsei.ac.kr

    2008-05-21

    Tractography using diffusion tensor magnetic resonance imaging (DT-MRI) is a method to determine the architecture of axonal fibers in the central nervous system by computing the direction of greatest diffusion in the white matter of the brain. To reduce the noise in DT-MRI measurements, a tensor-valued median filter, which is reported to be denoising and structure preserving in the tractography, is applied. In this paper, we proposed the successive Fermat (SF) method, successively using Fermat point theory for a triangle contained in the two-dimensional plane, as a median filtering method. We discussed the error analysis and numerical study of the SF method for phantom and experimental data. By considering the computing time and the image quality aspects of the numerical study simultaneously, we showed that the SF method is much more efficient than the simple median (SM) and gradient descent (GD) methods.

  20. SU-E-J-243: Possibility of Exposure Dose Reduction of Cone-Beam Computed Tomography in An Image Guided Patient Positioning System by Using Various Noise Suppression Filters

    International Nuclear Information System (INIS)

    Kamezawa, H; Arimura, H; Ohki, M; Shirieda, K; Kameda, N

    2014-01-01

    Purpose: To investigate the possibility of exposure dose reduction of the cone-beam computed tomography (CBCT) in an image guided patient positioning system by using 6 noise suppression filters. Methods: First, a reference dose (RD) and low-dose (LD)-CBCT (X-ray volume imaging system, Elekta Co.) images were acquired with a reference dose of 86.2 mGy (weighted CT dose index: CTDIw) and various low doses of 1.4 to 43.1 mGy, respectively. Second, an automated rigid registration for three axes was performed for estimating setup errors between a planning CT image and the LD-CBCT images, which were processed by 6 noise suppression filters, i.e., averaging filter (AF), median filter (MF), Gaussian filter (GF), bilateral filter (BF), edge preserving smoothing filter (EPF) and adaptive partial median filter (AMF). Third, residual errors representing the patient positioning accuracy were calculated as a Euclidean distance between the setup error vectors estimated using the LD-CBCT image and RD-CBCT image. Finally, the relationships between the residual error and CTDIw were obtained for 6 noise suppression filters, and then the CTDIw for LD-CBCT images processed by the noise suppression filters were measured at the same residual error, which was obtained with the RD-CBCT. This approach was applied to an anthropomorphic pelvic phantom and two cancer patients. Results: For the phantom, the exposure dose could be reduced from 61% (GF) to 78% (AMF) by applying the noise suppression filters to the CBCT images. The exposure dose in a prostate cancer case could be reduced from 8% (AF) to 61% (AMF), and the exposure dose in a lung cancer case could be reduced from 9% (AF) to 37% (AMF). Conclusion: Using noise suppression filters, particularly an adaptive partial median filter, could make it feasible to decrease the additional exposure dose to patients in image guided patient positioning systems.
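The evaluation step pairs a candidate noise-suppression filter with a residual-error measure. A minimal numpy sketch, assuming a plain 3x3 median filter as a stand-in for the MF branch and hypothetical setup-error vectors in mm (the actual registration is not reproduced):

```python
import numpy as np

def median3(img):
    """3x3 median filter (one of the six candidate noise-suppression
    filters); pure-numpy sliding window with edge replication."""
    p = np.pad(img, 1, mode='edge')
    stack = [p[i:i + img.shape[0], j:j + img.shape[1]]
             for i in range(3) for j in range(3)]
    return np.median(np.stack(stack), axis=0)

def residual_error(setup_ld, setup_rd):
    """Euclidean distance between the setup-error vectors estimated
    from the low-dose and reference-dose CBCT registrations."""
    return float(np.linalg.norm(np.asarray(setup_ld) - np.asarray(setup_rd)))

# Hypothetical translational setup errors (mm) along the three axes.
err = residual_error([1.2, -0.5, 0.3], [1.0, -0.2, 0.3])
```

The median filter suppresses impulse-like noise in the LD-CBCT image before registration; the residual error then quantifies how much positioning accuracy survives the dose reduction.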

  1. SU-E-J-243: Possibility of Exposure Dose Reduction of Cone-Beam Computed Tomography in An Image Guided Patient Positioning System by Using Various Noise Suppression Filters

    Energy Technology Data Exchange (ETDEWEB)

    Kamezawa, H [Graduate School of Medical Sciences, Kyushu University, Higashi-ku, Fukuoka (Japan); Fujimoto General Hospital, Miyakonojo, Miyazaki (Japan); Arimura, H; Ohki, M [Faculty of Medical Sciences, Kyushu University, Higashi-ku, Fukuoka (Japan); Shirieda, K; Kameda, N [Fujimoto General Hospital, Miyakonojo, Miyazaki (Japan)

    2014-06-01

    Purpose: To investigate the possibility of exposure dose reduction of the cone-beam computed tomography (CBCT) in an image guided patient positioning system by using 6 noise suppression filters. Methods: First, a reference dose (RD) and low-dose (LD)-CBCT (X-ray volume imaging system, Elekta Co.) images were acquired with a reference dose of 86.2 mGy (weighted CT dose index: CTDIw) and various low doses of 1.4 to 43.1 mGy, respectively. Second, an automated rigid registration for three axes was performed for estimating setup errors between a planning CT image and the LD-CBCT images, which were processed by 6 noise suppression filters, i.e., averaging filter (AF), median filter (MF), Gaussian filter (GF), bilateral filter (BF), edge preserving smoothing filter (EPF) and adaptive partial median filter (AMF). Third, residual errors representing the patient positioning accuracy were calculated as an Euclidean distance between the setup error vectors estimated using the LD-CBCT image and RD-CBCT image. Finally, the relationships between the residual error and CTDIw were obtained for 6 noise suppression filters, and then the CTDIw for LD-CBCT images processed by the noise suppression filters were measured at the same residual error, which was obtained with the RD-CBCT. This approach was applied to an anthropomorphic pelvic phantom and two cancer patients. Results: For the phantom, the exposure dose could be reduced from 61% (GF) to 78% (AMF) by applying the noise suppression filters to the CBCT images. The exposure dose in a prostate cancer case could be reduced from 8% (AF) to 61% (AMF), and the exposure dose in a lung cancer case could be reduced from 9% (AF) to 37% (AMF). Conclusion: Using noise suppression filters, particularly an adaptive partial median filter, could be feasible to decrease the additional exposure dose to patients in image guided patient positioning systems.

  2. Rotationally invariant correlation filtering

    International Nuclear Information System (INIS)

    Schils, G.F.; Sweeney, D.W.

    1985-01-01

    A method is presented for analyzing and designing optical correlation filters that have tailored rotational invariance properties. The concept of a correlation of an image with a rotation of itself is introduced. A unified theory of rotation-invariant filtering is then formulated. The unified approach describes matched filters (with no rotation invariance) and circular-harmonic filters (with full rotation invariance) as special cases. The continuum of intermediate cases is described in terms of a cyclic convolution operation over angle. The angular filtering approach allows an exact choice for the continuous trade-off between loss of the correlation energy (or specificity regarding the image) and the amount of rotational invariance desired
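The fully rotation-invariant end of this continuum, circular-harmonic filtering, rests on the fact that rotating an image only changes the phase of its angular Fourier coefficients, leaving their magnitudes unchanged. A small numpy illustration on a single sampled ring (a toy angular profile, not an optical correlator):

```python
import numpy as np

# Sample an image ring f(theta) at N angles; rotating the image by one
# sample is a cyclic shift, which changes only the phase of the angular
# Fourier coefficients c_m. Their magnitudes |c_m| (the circular
# harmonics used for fully rotation-invariant filtering) are unchanged.
N = 64
theta = 2 * np.pi * np.arange(N) / N
ring = np.cos(3 * theta) + 0.5 * np.sin(5 * theta)   # toy angular profile

c = np.fft.fft(ring)                # circular-harmonic coefficients
rotated = np.roll(ring, 7)          # rotate the ring by 7 samples
c_rot = np.fft.fft(rotated)
```

An intermediate filter in the paper's sense would retain a band of harmonics rather than a single one, trading correlation energy against the degree of rotational invariance.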

  3. Image-Based Multi-Target Tracking through Multi-Bernoulli Filtering with Interactive Likelihoods

    Directory of Open Access Journals (Sweden)

    Anthony Hoak

    2017-03-01

    Full Text Available We develop an interactive likelihood (ILH) for sequential Monte Carlo (SMC) methods for image-based multiple target tracking applications. The purpose of the ILH is to improve tracking accuracy by reducing the need for data association. In addition, we integrate a recently developed deep neural network for pedestrian detection along with the ILH with a multi-Bernoulli filter. We evaluate the performance of the multi-Bernoulli filter with the ILH and the pedestrian detector in a number of publicly available datasets (2003 PETS INMOVE, Australian Rules Football League (AFL) and TUD-Stadtmitte) using standard, well-known multi-target tracking metrics (optimal sub-pattern assignment (OSPA) and classification of events, activities and relationships for multi-object trackers (CLEAR MOT)). In all datasets, the ILH term increases the tracking accuracy of the multi-Bernoulli filter.

  4. Visible Wavelength Color Filters Using Dielectric Subwavelength Gratings for Backside-Illuminated CMOS Image Sensor Technologies.

    Science.gov (United States)

    Horie, Yu; Han, Seunghoon; Lee, Jeong-Yub; Kim, Jaekwan; Kim, Yongsung; Arbabi, Amir; Shin, Changgyun; Shi, Lilong; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Lee, Hong-Seok; Hwang, Sungwoo; Faraon, Andrei

    2017-05-10

    We report transmissive color filters based on subwavelength dielectric gratings that can replace conventional dye-based color filters used in backside-illuminated CMOS image sensor (BSI CIS) technologies. The filters are patterned in an 80 nm-thick polysilicon film on a 115 nm-thick SiO2 spacer layer. They are optimized for operating at the primary RGB colors, exhibit peak transmittance of 60-80%, and have an almost angle-insensitive response over a ±20° range. This technology enables shrinking of the pixel sizes down to near a micrometer.

  5. Application of digital tomosynthesis (DTS) of optimal deblurring filters for dental X-ray imaging

    International Nuclear Information System (INIS)

    Oh, J. E.; Cho, H. S.; Kim, D. S.; Choi, S. I.; Je, U. K.

    2012-01-01

    Digital tomosynthesis (DTS) is a limited-angle tomographic technique that provides some of the tomographic benefits of computed tomography (CT) but at reduced dose and cost. Thus, the potential for application of DTS to dental X-ray imaging seems promising. As a continuation of our dental radiography R&D, we developed an effective DTS reconstruction algorithm and implemented it in conjunction with a commercial dental CT system for potential use in dental implant placement. The reconstruction algorithm employed a backprojection filtering (BPF) method based upon optimal deblurring filters to suppress effectively both the blur artifacts originating from the out-of-focus planes and the high-frequency noise. To verify the usefulness of the reconstruction algorithm, we performed systematic simulation works and evaluated the image characteristics. We also performed experimental works in which DTS images of enhanced anatomical resolution were successfully obtained by using the algorithm and are promising for our ongoing applications to dental X-ray imaging. In this paper, our approach to the development of the DTS reconstruction algorithm and the results are described in detail.
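The BPF reconstruction builds on tomosynthesis backprojection. A minimal numpy sketch of the unfiltered shift-and-add core, with a hypothetical 1-D geometry; the paper's optimal deblurring filters, which would follow this step, are not reproduced:

```python
import numpy as np

def shift_and_add(projections, shifts_per_plane):
    """Unfiltered shift-and-add backprojection, the core operation of
    DTS: for each reconstruction plane, shift every projection by the
    plane-dependent amount and average, so only in-plane detail adds
    coherently (the BPF deblurring filters would then suppress the
    residual out-of-plane blur and high-frequency noise)."""
    recon = []
    for shifts in shifts_per_plane:
        recon.append(np.mean([np.roll(p, s)
                              for p, s in zip(projections, shifts)], axis=0))
    return np.array(recon)

# Toy 1-D example: a point object imaged from three angles appears
# displaced by -2, 0, +2 detector pixels (hypothetical geometry).
base = np.zeros(32)
base[16] = 1.0
projections = [np.roll(base, s) for s in (-2, 0, 2)]

# Plane 0 uses shifts that realign the point (in focus); plane 1 does not.
recon = shift_and_add(projections, [[2, 0, -2], [0, 0, 0]])
```

The in-focus plane reinforces the point to full amplitude, while the out-of-focus plane spreads it into the characteristic tomosynthesis blur that the deblurring filters target.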

  6. Filtering for distributed mechanical systems using position measurements: perspectives in medical imaging

    International Nuclear Information System (INIS)

    Moireau, Philippe; Chapelle, Dominique; Tallec, Patrick Le

    2009-01-01

    We propose an effective filtering methodology designed to perform estimation in a distributed mechanical system using position measurements. As in a previously introduced method, the filter is inspired by robust control feedback, but here we take full advantage of the estimation specificity to choose a feedback law that can act on displacements instead of velocities and still retain the same kind of dissipativity property which guarantees robustness. This is very valuable in many applications for which positions are more readily available than velocities, as in medical imaging. We provide an in-depth analysis of the proposed procedure, as well as detailed numerical assessments using a test problem inspired by cardiac biomechanics, as medical diagnosis assistance is an important perspective for this approach. The method is formulated first for measurements based on Lagrangian displacements, but we then derive a nonlinear extension allowing us to instead consider segmented images, which of course is even more relevant in medical applications

  7. LED induced autofluorescence (LIAF) imager with eight multi-filters for oral cancer diagnosis

    Science.gov (United States)

    Huang, Ting-Wei; Cheng, Nai-Lun; Tsai, Ming-Hsui; Chiou, Jin-Chern; Mang, Ou-Yang

    2016-03-01

    Oral cancer is one of the serious and growing problems in many developing and developed countries. Simple oral visual screening by a clinician can reduce oral cancer deaths by 37,000 annually worldwide. However, conventional oral examination, with visual inspection and palpation of oral lesions, is not an objective and reliable approach for oral cancer diagnosis; it may delay hospital treatment for patients with oral cancer or allow the cancer to progress out of control to a late stage. Therefore, a device for oral cancer detection is developed for early diagnosis and treatment. A portable LED-induced autofluorescence (LIAF) imager was developed by our group. It contains multiple wavelengths of LED excitation light and a rotary filter ring of eight channels to capture ex-vivo oral tissue autofluorescence images. The advantages of the LIAF imager compared to other devices for oral cancer diagnosis are an L-shaped probe that fixes the object distance, shields against ambient light, and allows observation of blind spots in the deep pocket between the gums (gingiva) and the lining of the mouth. Besides, the multiple LED excitation light sources can induce multiple autofluorescence signals, and the LIAF imager with the eight-channel rotary filter ring can detect spectral images in multiple narrow bands. The prototype of the portable LIAF imager has been applied in clinical trials for some cases in Taiwan, and the clinical-trial images with specific excitation show significant differences between normal tissue and cancerous oral tissue in these cases.

  8. Tunable output-frequency filter algorithm for imaging through scattering media under LED illumination

    Science.gov (United States)

    Zhou, Meiling; Singh, Alok Kumar; Pedrini, Giancarlo; Osten, Wolfgang; Min, Junwei; Yao, Baoli

    2018-03-01

    We present a tunable output-frequency filter (TOF) algorithm to reconstruct the object from noisy experimental data under low-power partially coherent illumination, such as LED, when imaging through scattering media. In the iterative algorithm, we employ Gaussian functions with different filter windows at different stages of iteration process to reduce corruption from experimental noise to search for a global minimum in the reconstruction. In comparison with the conventional iterative phase retrieval algorithm, we demonstrate that the proposed TOF algorithm achieves consistent and reliable reconstruction in the presence of experimental noise. Moreover, the spatial resolution and distinctive features are retained in the reconstruction since the filter is applied only to the region outside the object. The feasibility of the proposed method is proved by experimental results.
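In the TOF scheme the filter is a Gaussian window whose width changes across iterations and which is applied only outside the object region. The sketch below shows just the tunable Gaussian filtering stage on a noise image; the window widths are illustrative, and the support masking and phase-retrieval loop are omitted:

```python
import numpy as np

def gaussian_lowpass(img, sigma_f):
    """One TOF filter stage: multiply the spectrum by a Gaussian window
    of width sigma_f (cycles/pixel) and transform back."""
    f = np.fft.fftfreq(img.shape[0])
    fx, fy = np.meshgrid(f, f)
    window = np.exp(-(fx**2 + fy**2) / (2 * sigma_f**2))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * window))

# Early iterations use a narrow window (heavy smoothing) to suppress
# experimental noise and avoid local minima; later iterations widen it
# to restore spatial resolution.
rng = np.random.default_rng(0)
noisy = rng.standard_normal((64, 64))
early = gaussian_lowpass(noisy, 0.05)   # narrow window, early iterations
late = gaussian_lowpass(noisy, 0.50)    # wide window, late iterations
```

A narrower window removes more noise energy, which is the property the schedule exploits before the window is relaxed to recover detail.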

  9. Dynamic beam filtering for miscentered patients.

    Science.gov (United States)

    Mao, Andrew; Shyr, William; Gang, Grace J; Stayman, J Webster

    2018-02-01

    Accurate centering of the patient within the bore of a CT scanner takes time and is often difficult to achieve precisely. Patient miscentering can result in significant dose and image noise penalties with the use of traditional bowtie filters. This work describes a system to dynamically position an x-ray beam filter during image acquisition to enable more consistent image performance and potentially lower dose needed for CT imaging. We propose a new approach in which two orthogonal low-dose scout images are used to estimate a parametric model of the object describing its shape, size, and location within the field of view (FOV). This model is then used to compute an optimal filter motion profile by minimizing the variance of the expected detector fluence for each projection. Dynamic filtration was implemented on a cone-beam CT (CBCT) test bench using two different physical filters: 1) an aluminum bowtie and 2) a structured binary filter called a multiple aperture device (MAD). Dynamic filtration performance was compared to a static filter in studies of dose and reconstruction noise as a function of the degree of miscentering of a homogeneous water phantom. Estimated filter trajectories were found to be largely sinusoidal with an amplitude proportional to the amount of miscentering. Dynamic filtration demonstrated an improved ability to keep the spatial distribution of dose and reconstruction noise at baseline levels across varying levels of miscentering, reducing the maximum noise and dose deviation from 53% to 15% and 42% to 14% respectively for the bowtie filter, and 25% to 8% and 24% to 15% respectively for the MAD filter. Dynamic positioning of beam filters during acquisition improves dose utilization and image quality over static filters for miscentered patients. Such dynamic filters relax positioning requirements and have the potential to reduce set-up time and lower dose requirements.
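The per-projection offset optimization can be illustrated with a toy 1-D model in which the bowtie gain is chosen to exactly cancel a Gaussian-shaped object attenuation, so the expected fluence variance vanishes when the filter tracks the projected object center. All profiles and the brute-force search below are illustrative assumptions, not the authors' parametric model:

```python
import numpy as np

def optimal_offset(object_center, u, attenuation, filter_gain, offsets):
    """For one projection, pick the filter offset minimizing the
    variance of the expected detector fluence (toy 1-D model)."""
    best, best_var = None, np.inf
    for t in offsets:
        # fluence = filter transmission (shifted by t) through object
        # attenuation (shifted by the projected object center)
        fluence = filter_gain(u - t) * np.exp(-attenuation(u - object_center))
        v = fluence.var()
        if v < best_var:
            best, best_var = t, v
    return best

u = np.linspace(-10, 10, 201)
atten = lambda x: np.exp(-x**2 / 8)          # toy object attenuation profile
gain = lambda x: np.exp(np.exp(-x**2 / 8))   # bowtie tuned to cancel it

offsets = np.arange(-5, 6)
# A miscentered object (projected center +3) is best matched by
# shifting the filter to the same position.
t_star = optimal_offset(3.0, u, atten, gain, offsets)
```

Repeating this over projection angle yields the roughly sinusoidal motion profile the authors observe, with amplitude set by the miscentering.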

  10. Predicting perceptual quality of images in realistic scenario using deep filter banks

    Science.gov (United States)

    Zhang, Weixia; Yan, Jia; Hu, Shiyong; Ma, Yang; Deng, Dexiang

    2018-03-01

    Classical image perceptual quality assessment models usually resort to natural scene statistic methods, which are based on an assumption that certain reliable statistical regularities hold on undistorted images and will be corrupted by introduced distortions. However, these models usually fail to accurately predict degradation severity of images in realistic scenarios since complex, multiple, and interactive authentic distortions usually appear on them. We propose a quality prediction model based on convolutional neural network. Quality-aware features extracted from filter banks of multiple convolutional layers are aggregated into the image representation. Furthermore, an easy-to-implement and effective feature selection strategy is used to further refine the image representation and finally a linear support vector regression model is trained to map image representation into images' subjective perceptual quality scores. The experimental results on benchmark databases present the effectiveness and generalizability of the proposed model.

  11. SD LMS L-Filters for Filtration of Gray Level Images in Timespatial Domain Based on GLCM Features

    Directory of Open Access Journals (Sweden)

    Robert Hudec

    2008-01-01

    Full Text Available In this paper, a new kind of adaptive signal-dependent LMS L-filter for suppression of mixed noise in greyscale images is developed. It is based on texture parameter measurement as a modification of the spatial impulse detector structure. Moreover, one of the GLCM (Gray Level Co-occurrence Matrix) features, namely the contrast (inertia), adjusted by a threshold acting as a switch between partial filters, is utilised. Finally, at the positions of the partial filters, the adaptive LMS versions of L-filters are chosen.
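The switching mechanism can be sketched using the identity that GLCM contrast for a given offset equals the mean squared grey-level difference over pixel pairs at that offset. A hypothetical numpy sketch, with a mean and a median filter standing in for the two partial L-filters and an arbitrary threshold (the adaptive LMS L-filter branches themselves are not reproduced):

```python
import numpy as np

def glcm_contrast(img, dx=1):
    """GLCM contrast (inertia) for a horizontal pixel offset: equal to
    the mean squared grey-level difference of pixel pairs at that offset."""
    a, b = img[:, :-dx].astype(float), img[:, dx:].astype(float)
    return float(np.mean((a - b) ** 2))

def filter_block(block, threshold):
    """Signal-dependent switch: low-texture blocks go to the smoothing
    partial filter (mean), high-texture blocks to the impulse-robust
    partial filter (median, a stand-in for the LMS L-filter branch)."""
    if glcm_contrast(block) < threshold:
        return float(np.mean(block))
    return float(np.median(block))
```

A flat block has zero contrast and is averaged; a block containing an impulse has high contrast and is handled by the robust branch.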

  12. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications

    Directory of Open Access Journals (Sweden)

    Byeong Hak Kim

    2017-12-01

    Full Text Available Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC.

  13. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications

    Science.gov (United States)

    Kim, Byeong Hak; Kim, Min Young; Chae, You Seong

    2017-01-01

    Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC. PMID:29280970

  14. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications.

    Science.gov (United States)

    Kim, Byeong Hak; Kim, Min Young; Chae, You Seong

    2017-12-27

    Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC.
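The RPCA step separates a registered frame stack (one column per frame) into a low-rank background and a sparse term capturing moving targets and unfixed pattern noise. A toy numpy sketch using simple alternation (hard-rank SVD plus soft-thresholding); this is a stand-in illustration, not the BRANF algorithm or a full principal component pursuit solver:

```python
import numpy as np

def rpca(M, rank=1, lam=0.5, iters=50):
    """Toy robust PCA by alternation: L <- best rank-r fit of M - S,
    then S <- soft-threshold(M - L). Separates a low-rank background
    from a sparse foreground/noise component."""
    S = np.zeros_like(M)
    L = np.zeros_like(M)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        s[rank:] = 0.0                       # hard rank-r projection
        L = (U * s) @ Vt
        R = M - L
        S = np.sign(R) * np.maximum(np.abs(R) - lam, 0.0)  # soft threshold
    return L, S

# Synthetic stack: rank-1 static background plus two sparse outliers.
rng = np.random.default_rng(1)
L_true = np.outer(rng.standard_normal(20), rng.standard_normal(30))
S_true = np.zeros((20, 30))
S_true[5, 7] = 10.0
S_true[12, 3] = -8.0
L_hat, S_hat = rpca(L_true + S_true)
```

Note the soft threshold biases recovered sparse entries toward zero by roughly lam; convex solvers such as inexact ALM reduce this bias but are longer to state.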

  15. Imaging spin filter for electrons based on specular reflection from iridium (001)

    Energy Technology Data Exchange (ETDEWEB)

    Kutnyakhov, D.; Lushchyk, P. [Johannes Gutenberg-Universität, Institut für Physik, 55099 Mainz (Germany); Fognini, A.; Perriard, D. [Laboratorium für Festkörperphysik, ETH Zürich, 8093 Zürich (Switzerland); Kolbe, M.; Medjanik, K.; Fedchenko, E.; Nepijko, S.A.; Elmers, H.J. [Johannes Gutenberg-Universität, Institut für Physik, 55099 Mainz (Germany); Salvatella, G.; Stieger, C.; Gort, R.; Bähler, T.; Michlmayer, T.; Acremann, Y.; Vaterlaus, A. [Laboratorium für Festkörperphysik, ETH Zürich, 8093 Zürich (Switzerland); Giebels, F.; Gollisch, H.; Feder, R. [Universität Duisburg-Essen, Theoretische Festkörperphysik, 47057 Duisburg (Germany); Tusche, C. [Max Planck-Institut für Mikrostrukturphysik, 06120 Halle (Germany); and others

    2013-07-15

    As Stern–Gerlach type spin filters do not work with electrons, spin analysis of electron beams is accomplished by spin-dependent scattering processes based on spin–orbit or exchange interaction. Existing polarimeters are single-channel devices characterized by an inherently low figure of merit (FoM) of typically 10⁻⁴–10⁻³. This single-channel approach is not compatible with parallel imaging microscopes and also not with modern electron spectrometers that acquire a certain energy and angular interval simultaneously. We present a novel type of polarimeter that can transport a full image by making use of k-parallel conservation in low-energy electron diffraction. We studied specular reflection from Ir (001) because this spin-filter crystal provides a high analyzing power combined with a “lifetime” in UHV of a full day. One good working point is centered at 39 eV scattering energy with a broad maximum of 5 eV usable width. A second one at about 10 eV shows a narrower profile but much higher FoM. A relativistic layer-KKR SPLEED calculation shows good agreement with measurements. - Highlights: • Novel type of spin polarimeter can transport a full image by making use of k∥ conservation in LEED. • When combined with a hemispherical analyzer, it acquires a certain energy and angular interval simultaneously. • Ir (001) based spin-filter provides a high analyzing power combined with a “lifetime” in UHV of a full day. • Parallel spin detection improves spin polarimeter efficiency by orders of magnitude. • A relativistic layer-KKR SPLEED calculation shows good agreement with measurements.

  16. CHANGE DETECTION VIA SELECTIVE GUIDED CONTRASTING FILTERS

    Directory of Open Access Journals (Sweden)

    Y. V. Vizilter

    2017-05-01

    Full Text Available A change detection scheme based on guided contrasting was previously proposed. A guided contrasting filter takes two images (test and sample) as input and forms the output as a filtered version of the test image. Such a filter preserves the similar details and smooths the non-similar details of the test image with respect to the sample image. Due to this, the difference between the test image and its filtered version (difference map) can be a basis for robust change detection. Guided contrasting is performed in two steps: at the first step some smoothing operator (SO) is applied for elimination of test image details; at the second step all matched details are restored with local contrast proportional to the value of some local similarity coefficient (LSC). The guided contrasting filter was originally proposed based on local average smoothing as the SO and local linear correlation as the LSC. In this paper we propose and implement a new set of selective guided contrasting filters based on different combinations of various SO and thresholded LSC. Linear average and Gaussian smoothing, nonlinear median filtering, morphological opening and closing are considered as SO. Local linear correlation coefficient, morphological correlation coefficient (MCC), mutual information, mean square MCC and geometrical correlation coefficients are applied as LSC. Thresholding of LSC allows operating with non-normalized LSC and enhancing the selective properties of guided contrasting filters: details are either totally recovered or not recovered at all after the smoothing. These different guided contrasting filters are tested as a part of the previously proposed change detection pipeline, which contains the following stages: guided contrasting filtering on an image pyramid, calculation of the difference map, binarization, extraction of change proposals and testing change proposals using local MCC. Experiments on real and simulated image bases demonstrate the applicability of all proposed selective guided contrasting filters.
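The two-step guided contrasting filter can be sketched directly from the description above. A minimal numpy sketch, assuming local averaging as the SO, local linear correlation as the LSC, a 3x3 window, and an arbitrary threshold of 0.5:

```python
import numpy as np

def box_mean(img, r=1):
    """Local average over a (2r+1)^2 window with edge replication."""
    p = np.pad(img, r, mode='edge')
    k = 2 * r + 1
    out = np.zeros(img.shape, dtype=float)
    for i in range(k):
        for j in range(k):
            out += p[i:i + img.shape[0], j:j + img.shape[1]]
    return out / (k * k)

def guided_contrasting(test, sample, r=1, thresh=0.5, eps=1e-9):
    """Two-step guided contrasting: (1) smooth the test image;
    (2) restore detail with local contrast proportional to the local
    linear correlation with the sample, zeroed below the threshold."""
    mt, ms = box_mean(test, r), box_mean(sample, r)
    cov = box_mean(test * sample, r) - mt * ms
    var_t = box_mean(test * test, r) - mt ** 2
    var_s = box_mean(sample * sample, r) - ms ** 2
    lsc = cov / np.sqrt(np.maximum(var_t * var_s, eps))
    lsc = np.where(lsc >= thresh, np.clip(lsc, 0.0, 1.0), 0.0)
    return mt + lsc * (test - mt)
```

When sample equals test, every detail is restored and the output reproduces the test image; for an uncorrelated sample the output stays close to the smoothed image, so the difference map highlights changes.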

  17. Elemental distribution imaging by energy-filtering transmission electron microscopy (EFTEM) and its applications

    International Nuclear Information System (INIS)

    Kurata, Hiroki

    1996-01-01

    EFTEM is a new microscopy technique aimed at visualizing quantitative elemental distributions at high resolution. The measurement principles and the present state of EFTEM studies are explained through examples of measured elemental distributions. EFTEM combines the transmission electron microscope with electron energy-loss spectroscopy (EELS). The EFTEM method places a slit at a specific energy and forms a microscopic image from the electrons passing the slit. Qualitative elemental analysis is obtained by observing the position of the absorption edge of the core-electron excitation spectrum, and quantitative analysis by determining the core-electron excitation strength of the specific atom selected by filtering with the energy-selecting slit. The bonding state and the local structure in the neighborhood of the excited atom are determined from the fine structure of the absorption edge. In the chemical mapping method, the distribution of chemical bonding states is visualized by imaging the chemical map obtained by filtering a specific peak of the fine structure with a narrow energy-selecting slit. Fine powder of lead chromate (PbCrO4) covered with silica glass is shown as a typical example of an elemental distribution image from the core-electron excitation spectrum. The quantitative analysis method for elemental distribution images is explained. The possibility of single-atom analysis at the nanometer scale is shown by the example of a nanotube observed by EFTEM. (S.Y.)

  18. Design and application of finite impulse response digital filters

    International Nuclear Information System (INIS)

    Miller, T.R.; Sampathkumaran, K.S.

    1982-01-01

    The finite impulse response (FIR) digital filter is a spatial domain filter with a frequency domain representation. The theory of the FIR filter is presented and techniques are described for designing FIR filters with known frequency response characteristics. Rational design principles are emphasized, based on characterization of the imaging system using the modulation transfer function and physical properties of the imaged objects. Bandpass, Wiener, and low-pass filters were designed and applied to 201Tl myocardial images. The bandpass filter eliminates low-frequency image components that represent background activity and high-frequency components due to noise. The Wiener, or minimum mean square error, filter 'sharpens' the image while also reducing noise. The Wiener filter illustrates the power of the FIR technique to design filters with any desired frequency response. The low-pass filter, while of relatively limited use, is presented to compare it with a popular elementary 'smoothing' filter. (orig.)
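A small numpy sketch of one such design route: a 1-D bandpass FIR kernel built as the difference of two windowed-sinc low-pass kernels, with its frequency response evaluated directly. The cutoffs and tap count are illustrative assumptions, not the paper's MTF-based 2-D design:

```python
import numpy as np

def lowpass_sinc(cutoff, numtaps):
    """Windowed-sinc low-pass FIR kernel (Hamming window); cutoff in
    cycles/sample, 0 < cutoff < 0.5. Normalized to unit DC gain."""
    n = np.arange(numtaps) - (numtaps - 1) / 2
    h = 2 * cutoff * np.sinc(2 * cutoff * n)
    h *= np.hamming(numtaps)
    return h / h.sum()

def bandpass_sinc(f_lo, f_hi, numtaps):
    """Bandpass FIR as the difference of two low-pass kernels: rejects
    low-frequency background and high-frequency noise, keeping the band
    in between."""
    return lowpass_sinc(f_hi, numtaps) - lowpass_sinc(f_lo, numtaps)

def freq_response(h, f):
    """Magnitude response of kernel h at frequency f (cycles/sample)."""
    n = np.arange(len(h))
    return abs(np.sum(h * np.exp(-2j * np.pi * f * n)))

h = bandpass_sinc(0.05, 0.25, numtaps=101)
```

The response is near zero at DC (background rejection), near unity in the 0.05-0.25 passband, and near zero again at high frequencies (noise rejection), mirroring the bandpass behavior described in the abstract.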

  19. On-Line Multi-Damage Scanning Spatial-Wavenumber Filter Based Imaging Method for Aircraft Composite Structure

    Directory of Open Access Journals (Sweden)

    Yuanqiang Ren

    2017-05-01

    Full Text Available Structural health monitoring (SHM of aircraft composite structure is helpful to increase reliability and reduce maintenance costs. Due to the great effectiveness in distinguishing particular guided wave modes and identifying the propagation direction, the spatial-wavenumber filter technique has emerged as an interesting SHM topic. In this paper, a new scanning spatial-wavenumber filter (SSWF based imaging method for multiple damages is proposed to conduct on-line monitoring of aircraft composite structures. Firstly, an on-line multi-damage SSWF is established, including the fundamental principle of SSWF for multiple damages based on a linear piezoelectric (PZT sensor array, and a corresponding wavenumber-time imaging mechanism by using the multi-damage scattering signal. Secondly, through combining the on-line multi-damage SSWF and a PZT 2D cross-shaped array, an image-mapping method is proposed to conduct wavenumber synthesis and convert the two wavenumber-time images obtained by the PZT 2D cross-shaped array to an angle-distance image, from which the multiple damages can be directly recognized and located. In the experimental validation, both simulated multi-damage and real multi-damage introduced by repeated impacts are performed on a composite plate structure. The maximum localization error is less than 2 cm, which shows good performance of the multi-damage imaging method. Compared with the existing spatial-wavenumber filter based damage evaluation methods, the proposed method requires no more than the multi-damage scattering signal and can be performed without depending on any wavenumber modeling or measuring. Besides, this method locates multiple damages by imaging instead of the geometric method, which helps to improve the signal-to-noise ratio. Thus, it can be easily applied to on-line multi-damage monitoring of aircraft composite structures.
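
    At its core, spatial-wavenumber filtering of signals from a linear sensor array amounts to a 2D Fourier transform over space and time followed by masking along the wavenumber axis. The sketch below is illustrative only; the paper's on-line SSWF and its 2D cross-shaped-array image mapping are considerably more involved, and the band limits here are arbitrary.

```python
import numpy as np

def wavenumber_filter(signals, dx, k_lo, k_hi):
    # signals: (n_sensors, n_samples) array from a linear sensor array
    # keep only spatial-frequency components with |k| in [k_lo, k_hi]
    S = np.fft.fft2(signals)
    k = np.fft.fftfreq(signals.shape[0], d=dx)   # wavenumber axis (cycles per unit length)
    mask = (np.abs(k) >= k_lo) & (np.abs(k) <= k_hi)
    S[~mask, :] = 0.0                            # zero out-of-band wavenumbers
    return np.real(np.fft.ifft2(S))
```

Masking a wavenumber band in this way separates wave components traveling with different spatial frequencies across the array, which is the mechanism the SSWF exploits to distinguish damage-scattered modes.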

  20. Multiscale bilateral filtering for improving image quality in digital breast tomosynthesis

    Science.gov (United States)

    Lu, Yao; Chan, Heang-Ping; Wei, Jun; Hadjiiski, Lubomir M.; Samala, Ravi K.

    2015-01-01

    Purpose: Detection of subtle microcalcifications in digital breast tomosynthesis (DBT) is a challenging task because of the large, noisy DBT volume. It is important to enhance the contrast-to-noise ratio (CNR) of microcalcifications in DBT reconstruction. Most regularization methods depend on local gradient and may treat the ill-defined margins or subtle spiculations of masses and subtle microcalcifications as noise because of their small gradient. The authors developed a new multiscale bilateral filtering (MSBF) regularization method for the simultaneous algebraic reconstruction technique (SART) to improve the CNR of microcalcifications without compromising the quality of masses. Methods: The MSBF exploits a multiscale structure of DBT images to suppress noise and selectively enhance high frequency structures. At the end of each SART iteration, every DBT slice is decomposed into several frequency bands via Laplacian pyramid decomposition. No regularization is applied to the low frequency bands so that subtle edges of masses and structured background are preserved. Bilateral filtering is applied to the high frequency bands to enhance microcalcifications while suppressing noise. The regularized DBT images are used for updating in the next SART iteration. The new MSBF method was compared with the nonconvex total p-variation (TpV) method for noise regularization with SART. A GE GEN2 prototype DBT system was used for acquisition of projections at 21 angles in 3° increments over a ±30° range. The reconstruction image quality with no regularization (NR) and that with the two regularization methods were compared using the DBT scans of a heterogeneous breast phantom and several human subjects with masses and microcalcifications. The CNR and the full width at half maximum (FWHM) of the line profiles of microcalcifications and across the spiculations within their in-focus DBT slices were used as image quality measures. Results: The MSBF method reduced contouring artifacts
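
    The per-iteration regularization step described above can be sketched as follows. This is a simplified two-band illustration with a brute-force bilateral filter; the actual method uses a full Laplacian pyramid decomposition inside SART, and all parameter values here are arbitrary.

```python
import numpy as np

def gaussian_blur(img, sigma=1.5):
    # separable Gaussian blur, used to split low/high frequency bands
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-x**2 / (2 * sigma**2)); k /= k.sum()
    img = np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 0, img)
    return np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 1, img)

def bilateral(img, radius=2, sigma_s=2.0, sigma_r=0.1):
    # brute-force bilateral filter: spatial * range weighting in a small window
    H, W = img.shape
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))
    pad = np.pad(img, radius, mode="reflect")
    out = np.empty_like(img)
    for i in range(H):
        for j in range(W):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            w = spatial * np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
            out[i, j] = (w * patch).sum() / w.sum()
    return out

def msbf_step(slice_img):
    low = gaussian_blur(slice_img)   # low band: left unregularized
    high = slice_img - low           # high band: noise plus fine detail
    return low + bilateral(high)     # denoise the high band, then recombine
```

Leaving the low band untouched preserves mass edges and structured background, while the edge-preserving bilateral weights in the high band suppress noise without blurring point-like microcalcifications.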

  1. SU-G-IeP4-15: Ultrasound Imaging of Absorbable Inferior Vena Cava Filters for Proper Placement

    Energy Technology Data Exchange (ETDEWEB)

    Mitcham, T; Bouchard, R; Melancon, A; Melancon, M [University of Texas MD Anderson Cancer Center, Houston, TX (United States); Eggers, M [Adient Medical Technologies, Pearland, TX (United States)

    2016-06-15

    Purpose: Inferior vena cava filters (IVCFs) are used in patients with a high risk of pulmonary embolism in situations when the use of blood thinning drugs would be inappropriate. These filters are implanted under x-ray guidance; however, this delivers a dose of ionizing radiation to both patient and physician. B-mode ultrasound (US) imaging allows for localization of certain implanted devices without radiation dose concerns. The goal of this study was to investigate the feasibility of imaging the placement of absorbable IVCFs using US imaging to alleviate the dose concern inherent to fluoroscopy. Methods: A phantom was constructed to mimic a human IVC using tissue-mimicking material with 0.5 dB/cm/MHz acoustic attenuation, while agar inclusions were used to model the acoustic mismatch at the venous interface. Absorbable IVCFs were imaged at depths of up to 15 cm using B-mode US at 2, 3, 5, and 7 MHz transmit frequencies. Then, to determine temporal stability, the IVCF was left in the phantom for 10 weeks; during this time, the IVCF was imaged using the same techniques as above, while the integrity of the filter was analyzed by inspecting for fiber discontinuities. Results: Visualization of the inferior vena cava filter was possible at 5, 7.5, and 15 cm depth at US central frequencies of 2, 3, 5, and 7 MHz. Imaging the IVCF at 5 MHz yielded the clearest images while maintaining acceptable spatial resolution for identifying the IVCFs, while lower frequencies provided noticeably worse image quality. No obvious degradation was observed over the course of the 10 weeks in the static phantom environment. Conclusion: Biodegradable IVCF localization was possible up to 15 cm in depth using conventional B-mode US in a tissue-mimicking phantom. This suggests the potential for using B-mode US to guide the placement of the IVCF upon deployment by the interventional radiologist. Mitch Eggers is an owner of Adient Medical Technologies. There are no other conflicts of interest to disclose.

  2. Tunable thin-film optical filters for hyperspectral microscopy

    Science.gov (United States)

    Favreau, Peter F.; Rich, Thomas C.; Prabhat, Prashant; Leavesley, Silas J.

    2013-02-01

    Hyperspectral imaging was originally developed for use in remote sensing applications. More recently, it has been applied to biological imaging systems, such as fluorescence microscopes. The ability to distinguish molecules based on spectral differences has been especially advantageous for identifying fluorophores in highly autofluorescent tissues. A key component of hyperspectral imaging systems is wavelength filtering. Each filtering technology used for hyperspectral imaging has corresponding advantages and disadvantages. Recently, a new optical filtering technology has been developed that uses multi-layered thin-film optical filters that can be rotated, with respect to incident light, to control the center wavelength of the pass-band. Compared to the majority of tunable filter technologies, these filters have superior optical performance including greater than 90% transmission, steep spectral edges and high out-of-band blocking. Hence, tunable thin-film optical filters present optical characteristics that may make them well-suited for many biological spectral imaging applications. An array of tunable thin-film filters was implemented on an inverted fluorescence microscope (TE 2000, Nikon Instruments) to cover the full visible wavelength range. Images of a previously published model, GFP-expressing endothelial cells in the lung, were acquired using a charge-coupled device camera (Rolera EM-C2, Q-Imaging). This model sample presents fluorescently-labeled cells in a highly autofluorescent environment. Linear unmixing of hyperspectral images indicates that thin-film tunable filters provide equivalent spectral discrimination to our previous acousto-optic tunable filter-based approach, with increased signal-to-noise characteristics. Hence, tunable multi-layered thin film optical filters may provide greatly improved spectral filtering characteristics and therefore enable wider acceptance of hyperspectral widefield microscopy.
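
    Linear unmixing, mentioned above, models each pixel's measured spectrum as a combination of known endmember spectra (e.g., GFP emission and tissue autofluorescence) and solves for the abundances, commonly by least squares. A minimal sketch, illustrative only; real pipelines typically enforce non-negativity constraints and use measured endmember libraries.

```python
import numpy as np

def unmix(pixel_spectra, endmembers):
    # pixel_spectra: (n_pixels, n_bands); endmembers: (n_sources, n_bands)
    # solve pixel ~= abundances @ endmembers for each pixel by least squares
    coeffs, *_ = np.linalg.lstsq(endmembers.T, pixel_spectra.T, rcond=None)
    return coeffs.T  # (n_pixels, n_sources) abundance estimates
```

With well-separated endmember spectra, each pixel's fluorophore contribution is recovered even in a highly autofluorescent background, which is the basis of the spectral discrimination comparison reported above.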

  3. Cervical spine imaging in trauma: Does the use of grid and filter combination improve visualisation of the cervicothoracic junction?

    International Nuclear Information System (INIS)

    Goyal, Nimit; Rachapalli, Vamsidhar; Burns, Helen; Lloyd, David C.F.

    2011-01-01

    Purpose: To evaluate the usefulness of a filter and anti-scatter grid combination in demonstrating the cervicothoracic junction in lateral cervical spine radiographs performed for trauma patients. Methods: Following a change in departmental protocol in our hospital, an anti-scatter grid and filter are routinely used for the lateral cervical spine radiograph in all trauma patients with an immobilised cervical spine. A retrospective study was done to compare the efficacy of lateral cervical spine radiographs in demonstrating the cervicothoracic junction for a period of three months before and after the implementation of the change. All images were independently evaluated by two observers. Results: 253 trauma patients had a lateral cervical spine radiograph from January to March 2003 without the anti-scatter grid and filter, while 309 patients were imaged from January to March 2007 using the filter and grid. Inter-observer variability between the two observers was calculated using Cohen's kappa, which showed good and very good agreement for 2003 and 2007, respectively. 126 (49.8%) images adequately demonstrated the cervicothoracic junction without the filter and grid, while 189 (61.1%) were adequate following their use. This was statistically significant (Fisher exact test, p value = 0.0081). Conclusion: The use of filter and anti-scatter grids improves the visualisation of the cervicothoracic junction in lateral cervical spine imaging and reduces the need for repeat exposures.

  4. Cervical spine imaging in trauma: Does the use of grid and filter combination improve visualisation of the cervicothoracic junction?

    Energy Technology Data Exchange (ETDEWEB)

    Goyal, Nimit, E-mail: nimitgoyal@doctors.org.u [University Hospital of Wales, Heath Park, Cardiff, CF14 4XW (United Kingdom); Rachapalli, Vamsidhar; Burns, Helen; Lloyd, David C.F. [University Hospital of Wales, Heath Park, Cardiff, CF14 4XW (United Kingdom)

    2011-02-15

    Purpose: To evaluate the usefulness of a filter and anti-scatter grid combination in demonstrating the cervicothoracic junction in lateral cervical spine radiographs performed for trauma patients. Methods: Following a change in departmental protocol in our hospital, an anti-scatter grid and filter are routinely used for the lateral cervical spine radiograph in all trauma patients with an immobilised cervical spine. A retrospective study was done to compare the efficacy of lateral cervical spine radiographs in demonstrating the cervicothoracic junction for a period of three months before and after the implementation of the change. All images were independently evaluated by two observers. Results: 253 trauma patients had a lateral cervical spine radiograph from January to March 2003 without the anti-scatter grid and filter, while 309 patients were imaged from January to March 2007 using the filter and grid. Inter-observer variability between the two observers was calculated using Cohen's kappa, which showed good and very good agreement for 2003 and 2007, respectively. 126 (49.8%) images adequately demonstrated the cervicothoracic junction without the filter and grid, while 189 (61.1%) were adequate following their use. This was statistically significant (Fisher exact test, p value = 0.0081). Conclusion: The use of filter and anti-scatter grids improves the visualisation of the cervicothoracic junction in lateral cervical spine imaging and reduces the need for repeat exposures.

  5. Generalized Selection Weighted Vector Filters

    Directory of Open Access Journals (Sweden)

    Rastislav Lukac

    2004-09-01

    Full Text Available This paper introduces a class of nonlinear multichannel filters capable of removing impulsive noise in color images. The proposed generalized selection weighted vector filter class constitutes a powerful filtering framework for multichannel signal processing. Previously defined multichannel filters such as the vector median filter, basic vector directional filter, directional-distance filter, weighted vector median filters, and weighted vector directional filters are treated from a global viewpoint using the proposed framework. Robust order-statistic concepts and an increased degree of freedom in filter design make the proposed method attractive for a variety of applications. The introduced multichannel sigmoidal adaptation of the filter parameters, and its modifications, allow the filter parameters to accommodate varying signal and noise statistics. Simulation studies reported in this paper indicate that the proposed filter class is computationally attractive, yields excellent performance, and is able to preserve fine details and color information while efficiently suppressing impulsive noise. This paper is an extended version of the paper by Lukac et al. presented at the 2003 IEEE-EURASIP Workshop on Nonlinear Signal and Image Processing (NSIP '03) in Grado, Italy.
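
    The vector median filter, the simplest member of the family treated above, replaces each pixel by the window sample whose summed distance to all other samples is smallest; because the output is always one of the input vectors, color information is preserved. A minimal brute-force sketch (illustrative; the generalized selection weighted framework adds weights and adaptive parameters on top of this idea):

```python
import numpy as np

def vector_median_filter(img, radius=1):
    # img: H x W x C float array (e.g., RGB)
    H, W, C = img.shape
    pad = np.pad(img, ((radius, radius), (radius, radius), (0, 0)), mode="reflect")
    out = np.empty_like(img)
    for i in range(H):
        for j in range(W):
            win = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1].reshape(-1, C)
            # aggregate L2 distance from each sample to all others
            d = np.linalg.norm(win[:, None, :] - win[None, :, :], axis=2).sum(axis=1)
            out[i, j] = win[np.argmin(d)]   # output is an actual input vector
    return out
```

An isolated impulse in an otherwise uniform window has a large aggregate distance, so it is never selected, which is why this filter removes impulsive noise so effectively.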

  6. DAF: differential ACE filtering image quality assessment by automatic color equalization

    Science.gov (United States)

    Ouni, S.; Chambah, M.; Saint-Jean, C.; Rizzi, A.

    2008-01-01

    Ideally, a quality assessment system would perceive and measure image or video impairments just like a human being. In reality, however, objective quality metrics do not necessarily correlate well with perceived quality [1]. Moreover, some measures assume that a reference exists in the form of an "original" to compare against, which prevents their use in the digital restoration field, where often there is no reference available. That is why subjective evaluation has been the most widely used and most effective approach up to now. But subjective assessment is expensive and time consuming, and hence does not meet economic requirements [2,3]. Reliable automatic methods for visual quality assessment are therefore needed in the field of digital film restoration. The ACE method, for Automatic Color Equalization [4,6], is an algorithm for unsupervised enhancement of digital images. It is based on a new computational approach that tries to model the perceptual response of our visual system, merging the Gray World and White Patch equalization mechanisms in a global and local way. Like our visual system, ACE is able to adapt to widely varying lighting conditions and to extract visual information from the environment efficaciously. Moreover, ACE can be run in an unsupervised manner. Hence it is very useful as a digital film restoration tool, since no a priori information is available. In this paper we deepen the investigation of using the ACE algorithm as a basis for reference-free image quality evaluation. This new metric, called DAF for Differential ACE Filtering [7], is an objective quality measure that can be used in several image restoration and image quality assessment systems. In this paper, we compare, on different image databases, the results obtained with DAF and with some subjective image quality assessments (Mean Opinion Score, MOS, as a measure of perceived image quality). We also study the correlation between the objective measure and MOS.
In our experiments, we have used for the first image

  7. Detection of retinal nerve fiber layer defects in retinal fundus images using Gabor filtering

    Science.gov (United States)

    Hayashi, Yoshinori; Nakagawa, Toshiaki; Hatanaka, Yuji; Aoyama, Akira; Kakogawa, Masakatsu; Hara, Takeshi; Fujita, Hiroshi; Yamamoto, Tetsuya

    2007-03-01

    Retinal nerve fiber layer defect (NFLD) is one of the most important findings for the diagnosis of glaucoma reported by ophthalmologists. However, such changes can be overlooked, especially in mass screenings, because ophthalmologists have limited time to search for the many different changes relevant to the diagnosis of various diseases such as diabetes, hypertension and glaucoma. Therefore, the use of a computer-aided detection (CAD) system can improve the results of diagnosis. In this work, a technique for the detection of NFLDs in retinal fundus images is proposed. In the preprocessing step, blood vessels are "erased" from the original retinal fundus image by using morphological filtering. The preprocessed image is then transformed into a rectangular array. NFLD regions appear as vertical dark bands in the transformed image. Gabor filtering is then applied to enhance the vertical dark bands. False positives (FPs) are reduced by a rule-based method which uses information on the location and width of each candidate region. The detected regions are back-transformed into the original configuration. In this preliminary study, 71% of NFLD regions were detected with an average of 3.2 FPs per image. In conclusion, we have developed a technique for the detection of NFLDs in retinal fundus images. Promising results have been obtained in this initial study.
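
    A Gabor filter of the kind used to enhance oriented bands is a sinusoidal carrier modulated by a Gaussian envelope. A minimal sketch (parameter values are illustrative, not those of the paper):

```python
import numpy as np

def gabor_kernel(ksize=21, sigma=4.0, theta=0.0, lambd=10.0, gamma=0.5, psi=0.0):
    # real Gabor kernel: Gaussian envelope times a cosine carrier
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotate coordinates by theta
    yr = -x * np.sin(theta) + y * np.cos(theta)
    env = np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
    return env * np.cos(2 * np.pi * xr / lambd + psi)
```

Convolving the rectangularly transformed image with kernels oriented so the carrier runs horizontally makes vertical dark bands, such as NFLD candidates, respond strongly.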

  8. MO-FG-CAMPUS-IeP1-01: Alternative K-Edge Filters for Low-Energy Image Acquisition in Contrast Enhanced Spectral Mammography

    Energy Technology Data Exchange (ETDEWEB)

    Shrestha, S; Vedantham, S; Karellas, A [University of Massachusetts Medical School, Worcester, MA (United States)

    2016-06-15

    Purpose: In Contrast Enhanced Spectral Mammography (CESM), Rh filter is often used during low-energy image acquisition. The potential for using Ag, In and Sn filters, which exhibit K-edge closer to, and just below that of Iodine, instead of the Rh filter, was investigated for the low-energy image acquisition. Methods: Analytical computations of the half-value thickness (HVT) and the photon fluence per mAs (photons/mm2/mAs) for 50µm Rh were compared with other potential K-edge filters (Ag, In and Sn), all with K-absorption edge below that of Iodine. Two strategies were investigated: fixed kVp and filter thickness (50µm for all filters) resulting in HVT variation, and fixed kVp and HVT resulting in variation in Ag, In and Sn thickness. Monte Carlo simulations (GEANT4) were conducted to determine if the scatter-to-primary ratio (SPR) and the point spread function of scatter (scatter PSF) differed between Rh and other K-edge filters. Results: Ag, In and Sn filters (50µm thick) increased photon fluence/mAs by 1.3–1.4, 1.8–2, and 1.7–2 at 28-32 kVp compared to 50µm Rh, which could decrease exposure time. Additionally, the fraction of spectra closer to and just below Iodine’s K-edge increased with these filters, which could improve post-subtraction image contrast. For HVT matched to 50µm Rh filtered spectra, the thickness range for Ag, In, and Sn were (41,44)µm, (49,55)µm and (45,53)µm, and increased photon fluence/mAs by 1.5–1.7, 1.6–2, and 1.6–2.2, respectively. Monte Carlo simulations showed that neither the SPR nor the scatter PSF of Ag, In and Sn differed from Rh, indicating no additional detriment due to x-ray scatter. Conclusion: The use of Ag, In and Sn filters for low-energy image acquisition in CESM is potentially feasible and could decrease exposure time and may improve post-subtraction image contrast. Effect of these filters on radiation dose, contrast, noise and associated metrics are being investigated. Funding Support: Supported in

  9. Kalman Filtered MR Temperature Imaging for Laser Induced Thermal Therapies

    OpenAIRE

    Fuentes, D.; Yung, J.; Hazle, J. D.; Weinberg, J. S.; Stafford, R. J.

    2011-01-01

    The feasibility of using a stochastic form of Pennes bioheat model within a 3D finite element based Kalman filter (KF) algorithm is critically evaluated for the ability to provide temperature field estimates in the event of magnetic resonance temperature imaging (MRTI) data loss during laser induced thermal therapy (LITT). The ability to recover missing MRTI data was analyzed by systematically removing spatiotemporal information from a clinical MR-guided LITT procedure in human brain and comp...
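
    The Kalman filter's ability to ride through dropped measurements comes from its predict/update split: the prediction step always runs, while the measurement update is simply skipped when a frame is lost. A minimal scalar sketch, assuming a generic random-walk model rather than the paper's 3D finite-element bioheat formulation; all noise parameters below are arbitrary.

```python
def kalman_1d(z_seq, x0=0.0, P0=1.0, F=1.0, Q=0.01, H=1.0, R=0.25):
    # scalar Kalman filter; None entries in z_seq mark lost MRTI frames
    x, P, est = x0, P0, []
    for z in z_seq:
        # predict step always runs
        x = F * x
        P = F * P * F + Q
        if z is not None:
            # measurement update, skipped when the frame is missing
            K = P * H / (H * P * H + R)
            x = x + K * (z - H * x)
            P = (1 - K * H) * P
        est.append(x)
    return est
```

During a data gap the state estimate evolves on the model alone (here it simply holds its value while uncertainty P grows), then snaps back toward the data once measurements resume, which is exactly the recovery behavior evaluated in the paper.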

  10. Deconvolution of Defocused Image with Multivariate Local Polynomial Regression and Iterative Wiener Filtering in DWT Domain

    Directory of Open Access Journals (Sweden)

    Liyun Su

    2010-01-01

    obtaining the point spread function (PSF) parameter, iterative Wiener filtering is adopted to complete the restoration. We experimentally illustrate its performance on simulated data and a real blurred image. Results show that the proposed PSF parameter estimation technique and the image restoration method are effective.
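
    Wiener filtering inverts a known blur in the frequency domain while damping frequencies where the blur kernel has little energy. A minimal sketch of the classical parametric Wiener deconvolution, not the paper's DWT-domain iterative variant; the constant k stands in for the noise-to-signal power ratio.

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=0.01):
    # frequency-domain Wiener filter: H* / (|H|^2 + k)
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    F_hat = np.conj(H) / (np.abs(H)**2 + k) * G
    return np.real(np.fft.ifft2(F_hat))
```

As k goes to zero this approaches the inverse filter (exact but noise-amplifying); larger k trades residual blur for noise suppression, which is the balance the iterative scheme tunes.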

  11. Edge Detection from High Resolution Remote Sensing Images using Two-Dimensional log Gabor Filter in Frequency Domain

    International Nuclear Information System (INIS)

    Wang, K; Yu, T; Meng, Q Y; Wang, G K; Li, S P; Liu, S H

    2014-01-01

    Edges are vital features for describing the structural information of images, especially high spatial resolution remote sensing images. Edge features can be used to define the boundaries between different ground objects in high spatial resolution remote sensing images; thus edge detection is important in remote sensing image processing. Even though many different edge detection algorithms have been proposed, it is difficult to extract edge features from high spatial resolution remote sensing images containing complex ground objects. This paper introduces a novel method to detect edges in high spatial resolution remote sensing images based on the frequency domain. Firstly, the image is Fourier transformed by FFT to obtain the magnitude spectrum (frequency image). Then, the frequency spectrum is analyzed using radius and angle sampling. Next, a two-dimensional log Gabor filter with optimal parameters is designed according to the result of the spectrum analysis. Finally, the dot product of the Fourier transform and the log Gabor filter is inverse Fourier transformed to obtain the detected edges. The experimental results show that the proposed algorithm detects edge features in high resolution remote sensing images commendably.
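
    The radial component of a 2D log-Gabor filter is a Gaussian on a log-frequency axis and, unlike an ordinary Gabor filter, has no DC response. A minimal frequency-domain sketch (the center frequency and bandwidth ratio are illustrative, not the paper's optimized parameters, and the angular component is omitted):

```python
import numpy as np

def log_gabor_radial(shape, f0=0.1, sigma_ratio=0.55):
    # radial log-Gabor transfer function: Gaussian on a log-frequency axis
    fy = np.fft.fftfreq(shape[0])[:, None]
    fx = np.fft.fftfreq(shape[1])[None, :]
    r = np.hypot(fx, fy)
    r[0, 0] = 1.0                       # placeholder to avoid log(0)
    g = np.exp(-np.log(r / f0)**2 / (2 * np.log(sigma_ratio)**2))
    g[0, 0] = 0.0                       # log-Gabor has no DC component
    return g

def log_gabor_filter(img, f0=0.1):
    # multiply in the frequency domain (the "dot product" step), then invert
    g = log_gabor_radial(img.shape, f0)
    return np.real(np.fft.ifft2(np.fft.fft2(img) * g))
```

Because the DC component is zeroed, the filtered image is zero-mean and responds only to the band-pass structure, such as edges, selected by f0.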

  12. Faraday anomalous dispersion optical filters

    Science.gov (United States)

    Shay, T. M.; Yin, B.; Alvarez, L. S.

    1993-01-01

    The effect of Faraday anomalous dispersion optical filters on infrared and blue transitions of some alkali atoms is calculated. A composite system is designed to further increase the background noise rejection. The measured results of the solar background rejection and image quality through the filter are presented. The results show that the filter may provide high transmission and high background noise rejection with excellent image quality.

  13. Face Recognition using Gabor Filters

    Directory of Open Access Journals (Sweden)

    Sajjad MOHSIN

    2011-01-01

    Full Text Available An Elastic Bunch Graph Map (EBGM) algorithm is proposed in this research paper that implements face recognition using Gabor filters. The proposed system applies 40 different Gabor filters to an image, as a result of which 40 images with different angles and orientations are obtained. Next, the maximum intensity points in each filtered image are calculated and marked as fiducial points. The system reduces these points according to the distance between them. The next step is calculating the distances between the reduced points using the distance formula. Finally, the distances are compared with the database; if a match occurs, the image is recognized.

  14. Extraction of topographic and material contrasts on surfaces from SEM images obtained by energy filtering detection with low-energy primary electrons

    International Nuclear Information System (INIS)

    Nagoshi, Masayasu; Aoyama, Tomohiro; Sato, Kaoru

    2013-01-01

    Scanning electron microscope (SEM) images have been obtained for practical materials using low primary electron energies and an in-lens type annular detector, with a varying negative bias voltage supplied to a grid placed in front of the detector. The kinetic-energy distribution of the detected electrons was evaluated from the gradient of the bias-energy dependence of the image brightness. The distribution divides into two main parts at about 500 V, with high brightness in the low-energy region and low brightness in the high-energy region, and shows differences among surface regions of different composition and topography. The combination of the negative grid bias and pixel-by-pixel image subtraction provides band-pass filtered images and extracts material and topographic information from the specimen surfaces. -- Highlights: ► Scanning electron (SE) images contain many kinds of information on material surfaces. ► We investigate energy-filtered SE images for practical materials. ► The brightness of the images is divided into two parts by the bias voltage. ► Topographic and material contrasts are extracted by subtracting the filtered images.

  15. An Image Filter Based on Shearlet Transformation and Particle Swarm Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Kai Hu

    2015-01-01

    Full Text Available Digital images are often polluted by noise, making data postprocessing difficult. To remove noise while preserving image detail as much as possible, this paper proposes an image filtering algorithm that combines the merits of the Shearlet transform and the particle swarm optimization (PSO) algorithm. Firstly, we use the classical Shearlet transform to decompose the noisy image into many subbands over multiple scales and orientations. Secondly, we assign a weighting factor to each subband obtained. Then, using the classical inverse Shearlet transform, we obtain a composite image composed of those weighted subbands. After that, we design a fast, rough evaluation method to estimate the noise level of the new image; using this measure as the fitness function, we adopt PSO to find the optimal weighting factors. After many iterations, the optimal factors and the inverse Shearlet transform yield the best denoised image. Experimental results show that the proposed algorithm eliminates noise effectively and yields a good peak signal-to-noise ratio (PSNR).
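
    The PSO search over weighting factors can be sketched generically as follows. This is a standard global-best PSO for minimization, illustrative only; in the paper the fitness would be the rough noise-level estimate of the recomposed image, here it is any callable, and all swarm parameters are conventional defaults.

```python
import numpy as np

def pso(fitness, dim, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, bounds=(0.0, 1.0)):
    # global-best particle swarm optimization (minimization)
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions (candidate weight vectors)
    v = np.zeros_like(x)                          # velocities
    pbest = x.copy()                              # per-particle best positions
    pbest_f = np.array([fitness(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()        # swarm-wide best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([fitness(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest
```

Each particle is pulled toward both its own best-known weight vector and the swarm's best, which lets the search escape poor local choices of subband weights without requiring gradients of the fitness.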

  16. Using Convolutional Neural Network Filters to Measure Left-Right Mirror Symmetry in Images

    Directory of Open Access Journals (Sweden)

    Anselm Brachmann

    2016-12-01

    Full Text Available We propose a method for measuring symmetry in images by using filter responses from Convolutional Neural Networks (CNNs). The aim of the method is to model human perception of left/right symmetry as closely as possible. Using the CNN approach has two main advantages: first, CNN filter responses closely match the responses of neurons in the human visual system; they take information on color, edges and texture into account simultaneously. Second, we can measure higher-order symmetry, which relies not only on color, edges and texture, but also on the shapes and objects that are depicted in images. We validated our algorithm on a dataset of 300 music album covers, which were rated according to their symmetry by 20 human observers, and compared the results with those from a previously proposed method. With our method, human perception of symmetry can be predicted with high accuracy. Moreover, we demonstrate that the inclusion of features from higher CNN layers, which encode more abstract image content, increases the performance further. In conclusion, we introduce a model of left/right symmetry that closely models human perception of symmetry in CD album covers.

  17. Least median of squares filtering of locally optimal point matches for compressible flow image registration

    International Nuclear Information System (INIS)

    Castillo, Edward; Guerrero, Thomas; Castillo, Richard; White, Benjamin; Rojo, Javier

    2012-01-01

    Compressible flow based image registration operates under the assumption that the mass of the imaged material is conserved from one image to the next. Depending on how the mass conservation assumption is modeled, the performance of existing compressible flow methods is limited by factors such as image quality, noise, large magnitude voxel displacements, and computational requirements. The Least Median of Squares Filtered Compressible Flow (LFC) method introduced here is based on a localized, nonlinear least squares, compressible flow model that describes the displacement of a single voxel that lends itself to a simple grid search (block matching) optimization strategy. Spatially inaccurate grid search point matches, corresponding to erroneous local minimizers of the nonlinear compressible flow model, are removed by a novel filtering approach based on least median of squares fitting and the forward search outlier detection method. The spatial accuracy of the method is measured using ten thoracic CT image sets and large samples of expert determined landmarks (available at www.dir-lab.com). The LFC method produces an average error within the intra-observer error on eight of the ten cases, indicating that the method is capable of achieving a high spatial accuracy for thoracic CT registration. (paper)
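
    Least-median-of-squares fitting, the robust estimator underlying the filtering step above, minimizes the median rather than the sum of squared residuals and therefore tolerates up to half the points being outliers. A minimal 2D line-fitting sketch, illustrative only; the paper applies the idea to filtering 3D voxel point matches together with a forward-search step.

```python
import numpy as np

def lmeds_line(x, y, n_trials=200, seed=0):
    # least-median-of-squares line fit: among many minimal (2-point) models,
    # keep the one whose squared residuals have the smallest median
    rng = np.random.default_rng(seed)
    best, best_med = None, np.inf
    for _ in range(n_trials):
        i, j = rng.choice(len(x), 2, replace=False)
        if x[i] == x[j]:
            continue                     # degenerate sample, skip
        a = (y[j] - y[i]) / (x[j] - x[i])
        b = y[i] - a * x[i]
        med = np.median((y - (a * x + b)) ** 2)
        if med < best_med:
            best, best_med = (a, b), med
    return best
```

Because the median ignores the largest half of the residuals, grossly wrong point matches (erroneous local minimizers, in the paper's setting) have no influence on the selected model and can then be discarded as outliers.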

  18. STRUCTURE TENSOR IMAGE FILTERING USING RIEMANNIAN L1 AND L∞ CENTER-OF-MASS

    Directory of Open Access Journals (Sweden)

    Jesus Angulo

    2014-06-01

    Full Text Available Structure tensor images are obtained by Gaussian smoothing of the dyadic product of the image gradient. These images give at each pixel an n×n symmetric positive definite matrix, SPD(n), representing the local orientation and edge information. Processing such images requires appropriate algorithms working on the Riemannian manifold of SPD(n) matrices. This contribution deals with structure tensor image filtering based on Lp geometric averaging. In particular, the L1 center-of-mass (Riemannian median, or Fermat-Weber point) and the L∞ center-of-mass (Riemannian circumcenter) can be obtained for structure tensors using recently proposed algorithms. Our contribution in this paper is to study the interest of the L1 and L∞ Riemannian estimators for structure tensor image processing. In particular, we compare both for two image analysis tasks: (i) structure tensor image denoising; (ii) anomaly detection in structure tensor images.
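
    Computing a structure tensor field follows directly from the definition above: form the outer products of the gradient components and smooth each channel with a Gaussian. A minimal 2D sketch (the smoothing scale is illustrative; the Riemannian L1/L∞ averaging studied in the paper operates on the resulting SPD field, not shown here):

```python
import numpy as np

def _gauss1d(sigma):
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def _blur(img, k):
    # separable Gaussian smoothing
    img = np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 0, img)
    return np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 1, img)

def structure_tensor(img, sigma=1.5):
    # returns the 2x2 SPD field (Jxx, Jxy, Jyy) per pixel
    gy, gx = np.gradient(img)
    k = _gauss1d(sigma)
    return _blur(gx * gx, k), _blur(gx * gy, k), _blur(gy * gy, k)
```

Each pixel's tensor is a nonnegatively weighted sum of rank-1 outer products, so the field is positive semidefinite by construction, which is what makes SPD-manifold processing applicable.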

  19. Be Foil ''Filter Knee Imaging'' NSTX Plasma with Fast Soft X-ray Camera

    International Nuclear Information System (INIS)

    B.C. Stratton; S. von Goeler; D. Stutman; K. Tritz; L.E. Zakharov

    2005-01-01

    A fast soft x-ray (SXR) pinhole camera has been implemented on the National Spherical Torus Experiment (NSTX). This paper presents observations and describes the Be foil Filter Knee Imaging (FKI) technique for reconstructions of an m/n=1/1 mode on NSTX. The SXR camera has a wide-angle (28°) field of view of the plasma. The camera images nearly the entire diameter of the plasma and a comparable region in the vertical direction. SXR photons pass through a beryllium foil and are imaged by a pinhole onto a P47 scintillator deposited on a fiber optic faceplate. An electrostatic image intensifier demagnifies the visible image by 6:1 to match it to the size of the charge-coupled device (CCD) chip. A pair of lenses couples the image to the CCD chip.

  20. HEPA air filter (image)

    Science.gov (United States)

    ... pet dander and other irritating allergens from the air. Along with other methods to reduce allergens, such ... controlling the amount of allergens circulating in the air. HEPA filters can be found in most air ...

  1. Updating the OMERACT filter

    DEFF Research Database (Denmark)

    D'Agostino, Maria-Antonietta; Boers, Maarten; Kirwan, John

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides a framework for the validation of outcome measures for use in rheumatology clinical research. However, imaging and biochemical measures may face additional validation challenges because of their technical nature. The Imaging...... using the original OMERACT Filter and the newly proposed structure. Breakout groups critically reviewed the extent to which the candidate biomarkers complied with the proposed stepwise approach, as a way of examining the utility of the proposed 3-dimensional structure. RESULTS: Although...... was obtained for a proposed tri-axis structure to assess validation of imaging and soluble biomarkers; nevertheless, additional work is required to better evaluate its place within the OMERACT Filter 2.0....

  2. Feature-Based Nonlocal Polarimetric SAR Filtering

    Directory of Open Access Journals (Sweden)

    Xiaoli Xing

    2017-10-01

    Full Text Available Polarimetric synthetic aperture radar (PolSAR) images are inherently contaminated by multiplicative speckle noise, which complicates image interpretation and analysis. To reduce the speckle effect, several adaptive speckle filters have been developed based on a weighted average of similarity measures, commonly depending on a model or probability distribution, and these are often affected by the distribution parameters and by modeling of texture components. In this paper, a novel filtering method introduces the coefficient of variation (CV) and the Pauli basis (PB) to measure similarity, and the two features are combined within the framework of nonlocal mean filtering. The CV is used to describe the complexity of various scenes and distinguish scene heterogeneity; moreover, the Pauli basis is able to express the polarimetric information in PolSAR image processing. The proposed filter combines the CV and Pauli basis to improve the estimation accuracy of the similarity weights. The similarity of the features is then deduced according to the test statistic, and filtering proceeds using the nonlocal weighted estimation. The performance of the proposed filter is tested on simulated images and on real PolSAR images acquired by the AIRSAR and ESAR systems. Qualitative and quantitative experiments indicate the validity of the proposed method by comparison with widely used despeckling methods.

  3. MR angiography with a matched filter

    International Nuclear Information System (INIS)

    De Castro, J.B.; Riederer, S.J.; Lee, J.N.

    1987-01-01

    The technique of matched filtering was applied to a series of cine MR images. The filter was devised to yield a subtraction angiographic image in which direct-current components present in the cine series are removed and the signal-to-noise ratio (S/N) of the vascular structures is optimized. The S/N of a matched filter was compared with that of a simple subtraction, in which an image with high flow is subtracted from one with low flow. Experimentally, a range of results from minimal improvement to significant (60%) improvement in S/N was seen in the comparisons of matched-filter subtraction with simple subtraction.

  4. A comparison of filters for thoracic diagnosis

    International Nuclear Information System (INIS)

    Oestmann, J.W.; Hendrickx, P.; Rieder, P.; Geerlings, H.; Medizinische Hochschule Hannover

    1986-01-01

    The effect of three types of filter on the quality of radiographs of the chest was compared. These filters improve visualization of mediastinal structures without significantly reducing the quality of the pulmonary image. In practice the Du Pont filter proved best; the quality of the central and peripheral portions of the lung image is equal to that of an ordinary radiograph and visualization of the mediastinum is improved. The Agfa-Gevaert filter showed no significant disadvantages compared with the ordinary techniques but the improvement in mediastinal visualization is not that marked. The 3M-filter yields poor images of the central portions of the lung and its type of construction prevents the retrocardiac structures from being pictured as well as with the other filters. (orig.) [de

  5. Noise reduction with complex bilateral filter.

    Science.gov (United States)

    Matsumoto, Mitsuharu

    2017-12-01

    This study introduces a noise reduction technique that uses a complex bilateral filter. A bilateral filter is a nonlinear filter originally developed for images that can reduce noise while preserving edge information. It is an attractive filter and has been used in many applications in image processing. When it is applied to an acoustical signal, small-amplitude noise is reduced while the speech signal is preserved. However, a bilateral filter cannot handle noise with relatively large amplitudes owing to its innate characteristics. In this study, the noisy signal is transformed into the time-frequency domain and the filter is improved to handle complex spectra. The high-amplitude noise is reduced in the time-frequency domain via the proposed filter. The features and the potential of the proposed filter are also confirmed through experiments.
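The edge-preserving behaviour described above comes from weighting each neighbour by both its spatial distance and its intensity difference from the centre pixel. A minimal brute-force 2-D sketch (our own illustration with assumed parameter values; the paper's complex, time-frequency variant is not reproduced here):

```python
import numpy as np

def bilateral_filter(img, radius=3, sigma_s=2.0, sigma_r=0.1):
    """Brute-force bilateral filter: each output pixel is a weighted
    mean of its neighbours, with weights that fall off with spatial
    distance (sigma_s) and with intensity difference (sigma_r), so
    strong edges are preserved while small-amplitude noise is smoothed."""
    pad = np.pad(img, radius, mode='reflect')
    out = np.zeros_like(img, dtype=float)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            rng_w = np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
            wgt = spatial * rng_w
            out[i, j] = (wgt * patch).sum() / wgt.sum()
    return out

# Noisy step edge: the noise is smoothed but the edge survives.
rng = np.random.default_rng(0)
step = np.zeros((16, 16)); step[:, 8:] = 1.0
noisy = step + 0.05 * rng.standard_normal(step.shape)
smooth = bilateral_filter(noisy)
```

The complex variant of the paper applies the same weighting idea to short-time spectra rather than to pixel intensities.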

  6. Segmentation of dermatoscopic images by frequency domain filtering and k-means clustering algorithms.

    Science.gov (United States)

    Rajab, Maher I

    2011-11-01

    Since the introduction of epiluminescence microscopy (ELM), image analysis tools have been extended to the field of dermatology, in an attempt to algorithmically reproduce clinical evaluation. Accurate image segmentation of skin lesions is one of the key steps for useful, early and non-invasive diagnosis of cutaneous melanomas. This paper proposes two image segmentation algorithms based on frequency domain processing and k-means clustering/fuzzy k-means clustering. The two methods are capable of segmenting and extracting the true border that reveals the global structure irregularity (indentations and protrusions), which may suggest excessive cell growth or regression of a melanoma. As a pre-processing step, Fourier low-pass filtering is applied to reduce the surrounding noise in a skin lesion image. A quantitative comparison of the techniques is enabled by the use of synthetic skin lesion images that model lesions covered with hair, to which Gaussian noise is added. The proposed techniques are also compared with an established optimal-based thresholding skin-segmentation method. It is demonstrated that for lesions with a range of different border irregularity properties, the k-means clustering and fuzzy k-means clustering segmentation methods provide the best performance over a range of signal-to-noise ratios. The proposed segmentation techniques are also demonstrated to have similar performance when tested on real skin lesions representing high-resolution ELM images. This study suggests that the segmentation results obtained using a combination of low-pass frequency filtering and k-means or fuzzy k-means clustering are superior to the result that would be obtained by using k-means or fuzzy k-means clustering segmentation methods alone. © 2011 John Wiley & Sons A/S.
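The pipeline described (Fourier low-pass pre-filtering followed by k-means clustering of intensities) can be sketched on a synthetic lesion-like image. Everything below is our simplified illustration with made-up parameter values, not the paper's implementation:

```python
import numpy as np

def fft_lowpass(img, cutoff=0.15):
    """Ideal low-pass filter in the frequency domain: zero out all
    frequencies whose normalized radius exceeds `cutoff` before
    segmenting, to suppress surrounding noise."""
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    yy, xx = np.mgrid[:h, :w]
    r = np.hypot((yy - h / 2) / h, (xx - w / 2) / w)
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * (r <= cutoff))))

def kmeans_1d(values, k=2, iters=20, seed=0):
    """Tiny k-means on pixel intensities (enough for a two-class
    lesion-vs-background segmentation)."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(values, k, replace=False)
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        centers = np.array([values[labels == c].mean() if (labels == c).any()
                            else centers[c] for c in range(k)])
    return labels, centers

# Synthetic "lesion": a dark disc on a bright, noisy background.
rng = np.random.default_rng(1)
img = 0.8 + 0.1 * rng.standard_normal((64, 64))
yy, xx = np.mgrid[:64, :64]
img[(yy - 32)**2 + (xx - 32)**2 < 14**2] -= 0.6
labels, _ = kmeans_1d(fft_lowpass(img).ravel(), k=2)
mask = labels.reshape(64, 64)
```

The border of `mask` is what the paper's irregularity analysis would then be applied to.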

  7. Performance tuning for CUDA-accelerated neighborhood denoising filters

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, Ziyi; Mueller, Klaus [Stony Brook Univ., NY (United States). Center for Visual Computing, Computer Science; Xu, Wei

    2011-07-01

    Neighborhood denoising filters are powerful techniques in image processing and can effectively enhance the image quality in CT reconstructions. In this study, taking the bilateral filter and the non-local mean filter as two examples, we discuss their implementations and perform fine-tuning on the targeted GPU architecture. Experimental results show that the straightforward GPU-based neighborhood filters can be further accelerated by pre-fetching. The optimized GPU-accelerated denoising filters are ready to be plugged into the reconstruction framework to enable fast denoising without compromising image quality. (orig.)
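For reference, the non-local mean filter tuned here computes, per output pixel, a patch-similarity-weighted average over a search window; a naive CPU sketch (our illustration, with assumed parameter values) makes explicit the per-pixel independence that the CUDA kernels exploit:

```python
import numpy as np

def nlm_denoise(img, patch=3, search=7, h=0.1):
    """Naive non-local means: each pixel is replaced by a weighted
    average of the pixels in a search window, weighted by how similar
    the patches around them are. Each output pixel is independent,
    which is what makes the filter a good fit for one-thread-per-pixel
    GPU parallelisation."""
    pr, sr = patch // 2, search // 2
    pad = np.pad(img, pr + sr, mode='reflect')
    H, W = img.shape
    out = np.zeros_like(img, dtype=float)
    for i in range(H):
        for j in range(W):
            ci, cj = i + pr + sr, j + pr + sr
            ref = pad[ci - pr:ci + pr + 1, cj - pr:cj + pr + 1]
            wsum = vsum = 0.0
            for di in range(-sr, sr + 1):
                for dj in range(-sr, sr + 1):
                    ni, nj = ci + di, cj + dj
                    cand = pad[ni - pr:ni + pr + 1, nj - pr:nj + pr + 1]
                    w = np.exp(-((ref - cand)**2).mean() / h**2)
                    wsum += w
                    vsum += w * pad[ni, nj]
            out[i, j] = vsum / wsum
    return out

# Noisy step edge: flat regions are averaged, the edge is preserved.
rng = np.random.default_rng(0)
step = np.zeros((16, 16)); step[:, 8:] = 1.0
noisy = step + 0.05 * rng.standard_normal(step.shape)
denoised = nlm_denoise(noisy)
```

The pre-fetching optimisation of the paper amounts to staging the shared `pad` window in fast GPU memory before the inner loops run.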

  8. Depth Images Filtering In Distributed Streaming

    Directory of Open Access Journals (Sweden)

    Dziubich Tomasz

    2016-04-01

    Full Text Available In this paper, we propose a distributed system for processing point clouds and transferring them over a computer network with regard to effectiveness requirements. We discuss a comparison of point cloud filters, focusing on their usage for streaming optimization. For the filtering step of the stream pipeline we evaluate four filters: Voxel Grid, Radius Outlier Removal, Statistical Outlier Removal and Pass Through. For each of the filters we perform a series of tests evaluating the impact on the point cloud size and the transmission frequency (analysed for various fps ratios). We present results of the optimization process used for point cloud consolidation in a distributed environment. We describe the processing of the point clouds before and after transmission. Pre- and post-processing allow the user to send the cloud over the network without delays. The proposed pre-processing compression of the cloud and post-processing reconstruction of it are focused on ensuring that the end-user application obtains the cloud with a given precision.
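Of the four filters evaluated, the Voxel Grid filter is the simplest to sketch: bucket the points into a regular 3-D grid and keep one centroid per occupied voxel. This is our own NumPy illustration with an assumed leaf size (PCL's implementation differs in detail):

```python
import numpy as np

def voxel_grid_filter(points, leaf=0.05):
    """Voxel-grid downsampling: bucket points into a regular 3-D grid
    and keep one centroid per occupied voxel, shrinking the cloud
    before it is streamed over the network."""
    idx = np.floor(points / leaf).astype(np.int64)
    keys, inv = np.unique(idx, axis=0, return_inverse=True)  # one key per voxel
    out = np.zeros((len(keys), 3))
    np.add.at(out, inv, points)                              # sum points per voxel
    counts = np.bincount(inv, minlength=len(keys))[:, None]
    return out / counts                                      # voxel centroids

rng = np.random.default_rng(2)
cloud = rng.random((5000, 3))              # 5000 points in a unit cube
down = voxel_grid_filter(cloud, leaf=0.2)  # at most 5x5x5 = 125 voxels
```

The trade-off studied in the paper is exactly this one: a larger leaf size means fewer points to transmit but a lower reconstruction precision at the receiver.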

  9. UV Fluorescence Photography of Works of Art : Replacing the Traditional UV Cut Filters with Interference Filters

    Directory of Open Access Journals (Sweden)

    Luís BRAVO PEREIRA

    2010-09-01

    Full Text Available For many years filters like the Kodak Wratten E series, or the equivalent Schneider B+W 415, were used as standard UV cut filters, necessary to obtain good quality in UV fluorescence photography. The only problem with these filters is that, when they receive the UV radiation they are meant to remove, they exhibit internal fluorescence themselves as a side effect, which usually reduces contrast and quality in the final image. This article presents the results of our experiences with some innovative filters that have become available on the market in recent years, designed to absorb UV radiation even more efficiently than the pigment-based standard filters mentioned above: interference filters for UV rejection (and usually for IR rejection too), manufactured using interference layers, which give better results than pigment-based filters. The only problem with interference filters is that they are sensitive to the direction of the incident rays and, because of that, are not suitable for wide-angle lenses. The internal fluorescence of three filters was tested and compared: the B+W 415 UV cut (equivalent to the Kodak Wratten 2E, pigment based), the B+W 486 UV IR cut (an interference-type filter, frequently used on digital cameras to remove IR or UV) and the Baader UVIR rejection filter (two versions of this interference filter were used). The final quality of the UV fluorescence images appears superior to that of the images obtained with the classic filters.

  10. Acousto-Optic Tunable Filter Hyperspectral Microscope Imaging Method for Characterizing Spectra from Foodborne Pathogens.

    Science.gov (United States)

    The hyperspectral microscope imaging (HMI) method, which provides both spatial and spectral characteristics of samples, can be effective for foodborne pathogen detection. The acousto-optic tunable filter (AOTF)-based HMI method can be used to characterize spectral properties of biofilms formed by Salmon...

  11. Mobile Phone Ratiometric Imaging Enables Highly Sensitive Fluorescence Lateral Flow Immunoassays without External Optical Filters.

    Science.gov (United States)

    Shah, Kamal G; Singh, Vidhi; Kauffman, Peter C; Abe, Koji; Yager, Paul

    2018-05-14

    Paper-based diagnostic tests based on the lateral flow immunoassay concept promise low-cost, point-of-care detection of infectious diseases, but such assays suffer from poor limits of detection. One factor that contributes to poor analytical performance is a reliance on low-contrast chromophoric optical labels such as gold nanoparticles. Previous attempts to improve the sensitivity of paper-based diagnostics include replacing chromophoric labels with enzymes, fluorophores, or phosphors at the expense of increased fluidic complexity or the need for device readers with costly optoelectronics. Several groups, including our own, have proposed mobile phones as suitable point-of-care readers due to their low cost, ease of use, and ubiquity. However, extant mobile phone fluorescence readers require costly optical filters and were typically validated with only one camera sensor module, which is inappropriate for potential point-of-care use. In response, we propose to couple low-cost ultraviolet light-emitting diodes with long Stokes-shift quantum dots to enable ratiometric mobile phone fluorescence measurements without optical filters. Ratiometric imaging with unmodified smartphone cameras improves the contrast and attenuates the impact of excitation intensity variability by 15×. Practical application was shown with a lateral flow immunoassay for influenza A with nucleoproteins spiked into simulated nasal matrix. Limits of detection of 1.5 and 2.6 fmol were attained on two mobile phones, which are comparable to a gel imager (1.9 fmol), 10× better than imaging gold nanoparticles on a scanner (18 fmol), and >2 orders of magnitude better than gold nanoparticle-labeled assays imaged with mobile phones. Use of the proposed filter-free mobile phone imaging scheme is a first step toward enabling a new generation of highly sensitive, point-of-care fluorescence assays.

  12. Filtering Photogrammetric Point Clouds Using Standard LIDAR Filters Towards DTM Generation

    Science.gov (United States)

    Zhang, Z.; Gerke, M.; Vosselman, G.; Yang, M. Y.

    2018-05-01

    Digital Terrain Models (DTMs) can be generated from point clouds acquired by laser scanning or photogrammetric dense matching. During the last two decades, much effort has been devoted to developing robust filtering algorithms for airborne laser scanning (ALS) data. With the quality of point clouds from dense image matching (DIM) getting better and better, the research question that arises is whether standard Lidar filters can be used to filter photogrammetric point clouds as well. Experiments are implemented to filter two dense matching point clouds with different noise levels. Results show that the standard Lidar filter is robust to random noise. However, artefacts and blunders often appear in the DIM points due to low contrast or poor texture in the images, and filtering is erroneous in these locations. Filtering DIM points pre-processed by a ranking filter brings a higher Type II error (i.e. non-ground points labelled as ground points) but a much lower Type I error (i.e. bare-ground points labelled as non-ground points). Finally, the potential DTM accuracy that can be achieved with DIM points is evaluated. Two DIM point clouds derived by Pix4Dmapper and SURE are compared. On grassland, dense matching generates points higher than the true terrain surface, which results in incorrectly elevated DTMs. The application of the ranking filter leads to a reduced bias in the DTM height, but a slightly increased noise level.

  13. Image classification using multiscale information fusion based on saliency driven nonlinear diffusion filtering.

    Science.gov (United States)

    Hu, Weiming; Hu, Ruiguang; Xie, Nianhua; Ling, Haibin; Maybank, Stephen

    2014-04-01

    In this paper, we propose saliency driven image multiscale nonlinear diffusion filtering. The resulting scale space in general preserves or even enhances semantically important structures such as edges, lines, or flow-like structures in the foreground, and inhibits and smoothes clutter in the background. The image is classified using multiscale information fusion based on the original image, the image at the final scale at which the diffusion process converges, and the image at a midscale. Our algorithm emphasizes the foreground features, which are important for image classification. The background image regions, whether considered as contexts of the foreground or noise to the foreground, can be globally handled by fusing information from different scales. Experimental tests of the effectiveness of the multiscale space for the image classification are conducted on the following publicly available datasets: 1) the PASCAL 2005 dataset; 2) the Oxford 102 flowers dataset; and 3) the Oxford 17 flowers dataset, with high classification rates.
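The scale space referred to above is built on nonlinear diffusion of the Perona-Malik type, in which the conductivity drops with gradient magnitude so that smoothing acts mainly in clutter and stalls at edges. The saliency weighting of the paper is omitted in this minimal sketch (our code, with assumed parameter values):

```python
import numpy as np

def perona_malik(img, iters=20, kappa=0.1, dt=0.2):
    """Classic Perona-Malik nonlinear diffusion, the base process the
    paper's saliency term modulates: the conductivity g falls with the
    local gradient magnitude, so flat/background regions are smoothed
    while strong edges are preserved or even enhanced."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa)**2)   # conductivity function
    for _ in range(iters):
        # one-sided differences to the four neighbours
        dn = np.roll(u, -1, 0) - u
        ds = np.roll(u, 1, 0) - u
        de = np.roll(u, -1, 1) - u
        dw = np.roll(u, 1, 1) - u
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

# Noisy step edge: background noise is diffused away, the edge stays.
rng = np.random.default_rng(0)
step = np.zeros((32, 32)); step[:, 16:] = 1.0
noisy = step + 0.05 * rng.standard_normal(step.shape)
diffused = perona_malik(noisy)
```

Stopping the iteration at different counts yields the midscale and converged-scale images that the paper fuses for classification.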

  14. Characterization of the Edges and Contrasts in a digital image with the variation of the Parameters of the High-pass Filters used in the Estimation of Atmospheric Visibility

    Directory of Open Access Journals (Sweden)

    Martha C. Guzmán-Zapata

    2013-11-01

    Full Text Available This paper considers the edges and contrasts obtained with the high-pass filters used in the estimation of daytime atmospheric visibility from digital images; the behavior of these edges and contrasts is characterized by varying the parameters of high-pass filters such as the Ideal, Gaussian, and Homomorphic-Gaussian. A synthetic image of regions with different contrasts is used to apply the different filters; we then define an index to measure the quality of the edges obtained in the filtered image and use it to analyze the results. The results show that both the filter selection and the selection of its parameters affect the characteristics and quality of the edges detected in the filtered image, determine the amount of noise the filter adds to the image (artifacts that were not present in the original image), and establish whether edge detection succeeds at all. The results also show that the edge quality index reaches maximum values for certain combinations of the filter parameters, which means that some combinations of parameters reduce the situations that distort the edges and distort atmospheric visibility measures based on the Fourier transform. The parameters that provide maximum-quality edges are therefore established as suitable for use in visibility measurement.
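Two of the filter families compared (Ideal and Gaussian high-pass) differ only in the frequency-domain transfer function; a compact sketch of both (our illustration, with an assumed cutoff value) shows why the Ideal filter is the one prone to ringing artifacts:

```python
import numpy as np

def highpass(img, cutoff=0.1, kind='gaussian'):
    """Frequency-domain high-pass filtering. The Ideal filter cuts
    hard at the cutoff radius (and rings in the spatial domain); the
    Gaussian filter rolls off smoothly."""
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    yy, xx = np.mgrid[:h, :w]
    r = np.hypot((yy - h / 2) / h, (xx - w / 2) / w)  # normalized radius
    if kind == 'ideal':
        H = (r > cutoff).astype(float)
    else:
        H = 1.0 - np.exp(-r**2 / (2 * cutoff**2))
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))

# A step edge: the high-pass response concentrates at the edge.
step = np.zeros((64, 64)); step[:, 32:] = 1.0
edges = highpass(step, kind='gaussian')
ringy = highpass(step, kind='ideal')
```

An edge-quality index of the kind the paper defines would then be computed on `edges` (or `ringy`) as a function of `cutoff`.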

  15. Investigation of the influence of image reconstruction filter and scan parameters on operation of automatic tube current modulation systems for different CT scanners

    International Nuclear Information System (INIS)

    Sookpeng, Supawitoo; Martin, Colin J.; Gentle, David J.

    2015-01-01

    Variation between hospitals in the user-selected CT scanning parameters under automatic tube current modulation (ATCM) has a substantial influence on the radiation doses and image quality for patients. The aim of this study was to investigate the effect of changing the image reconstruction filter and scan parameter settings on tube current, dose and image quality for various CT scanners operating under ATCM. The scan parameters varied were pitch factor, rotation time, collimator configuration, kVp, image thickness and the image filter convolution (FC) used for reconstruction. The Toshiba scanner varies the tube current to achieve a set target noise. Changes in the FC setting and image thickness for the first reconstruction were the major factors affecting patient dose. A two-step change in FC from smoother to sharper filters doubles the dose, but is counterbalanced by an improvement in spatial resolution. In contrast, Philips and Siemens scanners maintained tube current values similar to those for a reference image and patient, and the tube current varied only slightly with changes in individual CT scan parameters. The selection of a sharp filter increased the image noise, while use of iDose iterative reconstruction reduced the noise. Since the principles used by CT manufacturers for ATCM vary, it is important that the parameters which affect patient dose and image quality for each scanner are made clear to the operator to aid optimisation. (authors)

  16. Guided Image Filtering-Based Pan-Sharpening Method: A Case Study of GaoFen-2 Imagery

    Directory of Open Access Journals (Sweden)

    Yalan Zheng

    2017-12-01

    Full Text Available GaoFen-2 (GF-2 is a civilian optical satellite self-developed by China equipped with both multispectral and panchromatic sensors, and is the first satellite in China with a resolution below 1 m. Because the pan-sharpening methods on GF-2 imagery have not been a focus of previous works, we propose a novel pan-sharpening method based on guided image filtering and compare the performance to state-of-the-art methods on GF-2 images. Guided image filtering was introduced to decompose and transfer the details and structures from the original panchromatic and multispectral bands. Thereafter, an adaptive model that considers the local spectral relationship was designed to properly inject spatial information back into the original spectral bands. Four pairs of GF-2 images acquired from urban, water body, cropland, and forest areas were selected for the experiments. Both quantitative and visual inspections were used for the assessment. The experimental results demonstrated that for GF-2 imagery acquired over different scenes, the proposed approach consistently achieves high spectral fidelity and enhances spatial details, thereby benefitting the potential classification procedures.
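The decomposition step relies on the guided image filter of He et al., which fits a local linear model of the guidance image to the input. A minimal grayscale NumPy sketch of that filter (our illustration; the paper's adaptive spectral-injection model is not reproduced):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(I, p, radius=4, eps=1e-3):
    """Guided image filter: the output is a locally linear transform
    a*I + b of the guidance image I that best fits the input p in each
    window. Used in pan-sharpening to transfer panchromatic structure
    into a spectral band while respecting its local statistics."""
    size = 2 * radius + 1
    box = lambda x: uniform_filter(x, size)      # local box mean
    mI, mp = box(I), box(p)
    a = (box(I * p) - mI * mp) / (box(I * I) - mI**2 + eps)
    b = mp - a * mI
    return box(a) * I + box(b)

# Clean guidance (stand-in for a pan band) and a noisy input band:
rng = np.random.default_rng(0)
I = np.zeros((64, 64)); I[:, 32:] = 1.0
p = I + 0.05 * rng.standard_normal(I.shape)
out = guided_filter(I, p)
```

The edge in the guidance survives in `out` while the noise in `p` is averaged away, which is the detail-transfer property the pan-sharpening method exploits.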

  17. Caval penetration by retrievable inferior vena cava filters: a retrospective comparison of Option and Günther Tulip filters.

    Science.gov (United States)

    Olorunsola, Olufoladare G; Kohi, Maureen P; Fidelman, Nicholas; Westphalen, Antonio C; Kolli, Pallav K; Taylor, Andrew G; Gordon, Roy L; LaBerge, Jeanne M; Kerlan, Robert K

    2013-04-01

    To compare the frequency of vena caval penetration by the struts of the Option and Günther Tulip cone filters on postplacement computed tomography (CT) imaging. All patients who had an Option or Günther Tulip inferior vena cava (IVC) filter placed between January 2010 and May 2012 were identified retrospectively from medical records. Of the 208 IVC filters placed, the positions of 58 devices (21 Option filters, 37 Günther Tulip filters [GTFs]) were documented on follow-up CT examinations obtained for reasons unrelated to filter placement. In cases when multiple CT studies were obtained after placement, each study was reviewed, for a total of 80 examinations. Images were assessed for evidence of caval wall penetration by filter components, noting the number of penetrating struts and any effect on pericaval tissues. Penetration of at least one strut was observed in 17% of all filters imaged by CT between 1 and 447 days following placement. Although there was no significant difference in the overall prevalence of penetration when comparing the Option filter and GTF (Option, 10%; GTF, 22%), only GTFs showed time-dependent penetration, with penetration becoming more likely after prolonged indwelling times. No patient had damage to pericaval tissues or documented symptoms attributed to penetration. Although the Günther Tulip and Option filters exhibit caval penetration at CT imaging, only the GTF exhibits progressive penetration over time. Copyright © 2013 SIR. Published by Elsevier Inc. All rights reserved.

  18. Unsupervised Retinal Vessel Segmentation Using Combined Filters.

    Directory of Open Access Journals (Sweden)

    Wendeson S Oliveira

    Full Text Available Image segmentation of retinal blood vessels is a process that can help to predict and diagnose cardiovascular-related diseases, such as hypertension and diabetes, which are known to affect the appearance of the retinal blood vessels. This work proposes an unsupervised method for the segmentation of retinal vessel images using a combination of a matched filter, Frangi's filter and a Gabor wavelet filter to enhance the images. The combination of these three filters to improve segmentation is the main motivation of this work. We investigate two approaches to performing the filter combination: weighted mean and median ranking. Segmentation methods are tested after the vessel enhancement. Images enhanced with median ranking are segmented using a simple threshold criterion. Two segmentation procedures are applied to retinal images enhanced with the weighted-mean approach: the first method is based on deformable models and the second uses fuzzy C-means for the image segmentation. The procedure is evaluated using two public image databases, DRIVE and STARE. The experimental results demonstrate that the proposed methods perform well for vessel segmentation in comparison with state-of-the-art methods.
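The two fusion rules investigated (weighted mean and median ranking) are simple to state; a small sketch of both applied to stacked enhancement responses (our illustration with mock data; the actual matched/Frangi/Gabor responses are not reproduced):

```python
import numpy as np

def combine_weighted_mean(responses, weights):
    """Weighted-mean fusion of several vessel-enhancement responses."""
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    return np.tensordot(w, np.stack(responses), axes=1)

def combine_median(responses):
    """Median-ranking fusion: a structure that only one filter responds
    to (e.g. a spurious detection) is suppressed by the per-pixel median."""
    return np.median(np.stack(responses), axis=0)

# Three mock responses: all three agree on a "vessel" column; only the
# first contains a spurious blob at (0, 0).
a = np.zeros((8, 8)); a[:, 4] = 1.0; a[0, 0] = 1.0
b = np.zeros((8, 8)); b[:, 4] = 1.0
c = np.zeros((8, 8)); c[:, 4] = 1.0
fused = combine_median([a, b, c])
```

The median keeps the vessel column, where the three filters agree, and zeros out the blob that only one filter produced; the weighted mean would instead retain it at reduced amplitude.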

  19. Nonlinear filtering for character recognition in low quality document images

    Science.gov (United States)

    Diaz-Escobar, Julia; Kober, Vitaly

    2014-09-01

    Optical character recognition in scanned printed documents is a well-studied task, where the capture conditions such as sheet position, illumination, contrast and resolution are controlled. Nowadays, it is often more practical to use mobile devices for document capture than a scanner. As a consequence, the quality of document images is often poor owing to the presence of geometric distortions, nonhomogeneous illumination, low resolution, etc. In this work we propose to use multiple adaptive nonlinear composite filters for the detection and classification of characters. Computer simulation results obtained with the proposed system are presented and discussed.

  20. Image super-resolution reconstruction based on regularization technique and guided filter

    Science.gov (United States)

    Huang, De-tian; Huang, Wei-qin; Gu, Pei-ting; Liu, Pei-zhong; Luo, Yan-min

    2017-06-01

    In order to improve the accuracy of sparse representation coefficients and the quality of reconstructed images, an improved image super-resolution algorithm based on sparse representation is presented. In the sparse coding stage, the autoregressive (AR) regularization and the non-local (NL) similarity regularization are introduced to improve the sparse coding objective function. A group of AR models which describe the image local structures are pre-learned from the training samples, and one or several suitable AR models can be adaptively selected for each image patch to regularize the solution space. Then, the image non-local redundancy is obtained by the NL similarity regularization to preserve edges. In the process of computing the sparse representation coefficients, the feature-sign search algorithm is utilized instead of the conventional orthogonal matching pursuit algorithm to improve the accuracy of the sparse coefficients. To restore image details further, a global error compensation model based on a weighted guided filter is proposed to realize error compensation for the reconstructed images. Experimental results demonstrate that compared with the Bicubic, L1SR, SISR, GR, ANR, NE + LS, NE + NNLS, NE + LLE and A + (16 atoms) methods, the proposed approach shows remarkable improvements in peak signal-to-noise ratio, structural similarity and subjective visual perception.

  1. The 2D Hotelling filter - a quantitative noise-reducing principal-component filter for dynamic PET data, with applications in patient dose reduction

    International Nuclear Information System (INIS)

    Axelsson, Jan; Sörensen, Jens

    2013-01-01

    In this paper we apply the principal-component analysis filter (Hotelling filter) to reduce noise in dynamic positron-emission tomography (PET) patient data, for a number of different radiotracer molecules. We furthermore show how preprocessing images with this filter improves parametric images created from such dynamic sequences. We use zero-mean, unit-variance normalization prior to performing a Hotelling filter on the slices of a dynamic time series. The Scree-plot technique was used to determine which principal components to reject in the filtering process. This filter was applied to [11C]-acetate on heart and head-neck tumors, [18F]-FDG on liver tumors and brain, and [11C]-Raclopride on brain. Simulations of blood and tissue regions with noise properties matched to real PET data were used to analyze how quantitation and resolution are affected by the Hotelling filter. Summing varying parts of a 90-frame [18F]-FDG brain scan, we created 9-frame dynamic scans with image statistics comparable to 20 MBq, 60 MBq and 200 MBq injected activity. Hotelling filters performed on slices (2D) and on volumes (3D) were compared. The 2D Hotelling filter reduces noise in the tissue uptake drastically, so that it becomes simple to manually pick out regions of interest from noisy data. The 2D Hotelling filter introduces less bias than the 3D Hotelling filter in focal Raclopride uptake. Simulations show that the Hotelling filter is sensitive to the typical blood peak in PET before tissue uptake has commenced, introducing a negative bias in early tissue uptake. Quantitation on real dynamic data is reliable. Two examples clearly show that pre-filtering the dynamic sequence with the Hotelling filter prior to Patlak-slope calculations gives clearly improved parametric image quality. We also show that a dramatic dose reduction can be achieved for Patlak slope images without changing image quality or quantitation. The 2D Hotelling-filtering of dynamic PET data is a computer
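The pipeline described (zero-mean unit-variance normalisation, principal-component truncation, back-transform) can be sketched for a whole dynamic series. The Scree-plot choice of components is replaced here by a fixed count, and the simulated uptake series is our own toy example:

```python
import numpy as np

def hotelling_filter(frames, n_keep=2):
    """PCA (Hotelling) filter for a dynamic series: normalise each
    frame to zero mean / unit variance, keep only the leading principal
    components of the frames-by-pixels matrix, and transform back.
    The rejected components carry mostly noise."""
    T = frames.shape[0]
    X = frames.reshape(T, -1).astype(float)
    mu, sd = X.mean(1, keepdims=True), X.std(1, keepdims=True)
    Z = (X - mu) / sd                      # zero-mean unit-variance frames
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    s[n_keep:] = 0.0                       # Scree-plot-style truncation
    Zf = U @ np.diag(s) @ Vt
    return (Zf * sd + mu).reshape(frames.shape)

# Simulated dynamic series: a rising "uptake" signal plus noise.
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 12)[:, None, None]
signal = t * np.ones((12, 16, 16))
noisy = signal + 0.2 * rng.standard_normal(signal.shape)
filtered = hotelling_filter(noisy, n_keep=1)
```

Feeding `filtered` rather than `noisy` into a Patlak-slope fit is the preprocessing step the paper evaluates.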

  2. Improved automatic filtering methodology for an optimal pharmacokinetic modelling of DCE-MR images of the prostate

    Energy Technology Data Exchange (ETDEWEB)

    Vazquez Martinez, V.; Bosch Roig, I.; Sanz Requena, R.

    2016-07-01

    In Dynamic Contrast-Enhanced Magnetic Resonance (DCE-MR) studies with high temporal resolution, images are quite noisy due to the complicated trade-off between temporal and spatial resolution. For this reason, the temporal curves extracted from the images present remarkable noise levels, which affects the pharmacokinetic parameters calculated by least-squares fitting from the curves and the arterial phase (a useful marker in tumour diagnosis, which appears in curves with a high arterial contribution). In order to overcome these limitations, an automatic filtering method was developed by our group. In this work, an advanced automatic filtering methodology is presented to further improve noise reduction of the temporal curves, in order to obtain more accurate kinetic parameters and a proper modelling of the arterial phase. (Author)

  3. Enhancing Perceived Quality of Compressed Images and Video with Anisotropic Diffusion and Fuzzy Filtering

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Korhonen, Jari; Forchhammer, Søren

    2013-01-01

    and subjective results on JPEG compressed images, as well as MJPEG and H.264/AVC compressed video, indicate that the proposed algorithms employing directional and spatial fuzzy filters achieve better artifact reduction than other methods. In particular, robust improvements with H.264/AVC video have been gained...

  4. Comparative study of reconstruction filters for cranio examinations of the Philips system and its influence on the quality of the tomographic image

    International Nuclear Information System (INIS)

    Silveira, V.C.; Kodlulovich, S.; Delduck, R.S.; Oliveira, L.C.G.

    2011-01-01

    The aim of this study was to evaluate different reconstruction algorithms (kernels) applied to head examinations. The research was carried out using a 40-slice MDCT (Philips Brilliance 40 CT scanner) and an ACR phantom to evaluate image quality. The doses were estimated applying the coefficients obtained by IMPACT. The study showed that the CT number values did not change regardless of the filter used. The low-contrast results showed that the choice of the correct filter can yield a 9% decrease in dose values. For spatial resolution, the sharp filter showed a better response at low mAs. The image noise for certain smooth filters did not change, even when the mAs values were reduced. (author)

  5. Real-time single image dehazing based on dark channel prior theory and guided filtering

    Science.gov (United States)

    Zhang, Zan

    2017-10-01

    Images and videos taken outdoors on foggy days are seriously degraded. In order to restore images degraded by fog and to overcome the residual haze at edges left by traditional dark channel prior algorithms, we propose a new dehazing method. We first locate the fog area in the dark channel map using a quadtree in order to obtain an estimate of the transmittance. Then we regard the gray-scale image after guided filtering as an atmospheric light map and remove haze based on it. Box filtering and image downsampling are also used to improve processing speed. Finally, the atmospheric light scattering model is used to restore the image. Extensive experiments show that the algorithm is effective, efficient, and has a wide range of applications.
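
    As a rough illustration of the pipeline sketched above, the following NumPy snippet implements the classical dark channel prior steps (patch-wise dark channel, atmospheric light taken from the brightest dark-channel pixels, transmission estimate, and inversion of the atmospheric scattering model I = J·t + A·(1 − t)). It omits the paper's quadtree search and guided-filter refinement, and the parameter values (`omega`, `t0`, patch size) are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def dark_channel(img, patch=15):
    # per-pixel minimum over colour channels, then a patch-wise min filter
    mins = img.min(axis=2)
    pad = patch // 2
    padded = np.pad(mins, pad, mode='edge')
    h, w = mins.shape
    out = np.empty_like(mins)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i+patch, j:j+patch].min()
    return out

def dehaze(img, omega=0.95, t0=0.1):
    dark = dark_channel(img)
    # atmospheric light A: mean colour of the brightest dark-channel pixels
    flat = dark.ravel()
    idx = flat.argsort()[-max(1, flat.size // 1000):]
    A = img.reshape(-1, 3)[idx].mean(axis=0)
    # transmission estimate from the dark channel of the normalised image
    t = 1.0 - omega * dark_channel(img / A)
    t = np.clip(t, t0, 1.0)
    # invert the atmospheric scattering model I = J*t + A*(1-t)
    return (img - A) / t[..., None] + A
```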

  6. Exposure reduction in general dental practice using digital x-ray imaging system for intraoral radiography with additional x-ray beam filter

    International Nuclear Information System (INIS)

    Shibuya, Hitoshi; Mori, Toshimichi; Hayakawa, Yoshihiko; Kuroyanagi, Kinya; Ota, Yoshiko

    1997-01-01

    To measure exposure reduction in general dental practice using digital x-ray imaging systems for intraoral radiography with additional x-ray beam filter. Two digital x-ray imaging systems, Pana Digital (Pana-Heraus Dental) and CDR (Schick Technologies), were applied for intraoral radiography in general dental practice. Due to the high sensitivity to x-rays, additional x-ray beam filters for output reduction were used for examination. An Orex W II (Osada Electric Industry) x-ray generator was operated at 60 kVp, 7 mA. X-ray output (air-kerma; Gy) necessary for obtaining clinically acceptable images was measured at 0 to 20 cm in 5 cm steps from the cone tip using an ionizing chamber type 660 (Nuclear Associates) and compared with those for Ektaspeed Plus film (Eastman Kodak). The Pana Digital system was used with the optional filter supplied by Pana-Heraus Dental which reduced the output to 38%. The exposure necessary to obtain clinically acceptable images was only 40% of that for the film. The CDR system was used with the Dental X-ray Beam Filter Kit (Eastman Kodak) which reduced the x-ray output to 30%. The exposure necessary to obtain clinically acceptable images was only 20% of that for the film. The two digital x-ray imaging systems, Pana Digital and CDR, provided large dose savings (60-80%) compared with Ektaspeed Plus film when applied for intraoral radiography in general dental practice. (author)

  7. Impulse Noise Cancellation of Medical Images Using Wavelet Networks and Median Filters

    Science.gov (United States)

    Sadri, Amir Reza; Zekri, Maryam; Sadri, Saeid; Gheissari, Niloofar

    2012-01-01

    This paper presents a new two-stage approach to impulse noise removal for medical images based on a wavelet network (WN). The first step is noise detection, in which the so-called gray-level difference and average background difference are used as the inputs of a WN; the WN acts as a preprocessing step for the second stage. The second step is removing the detected impulse noise with a median filter. The wavelet network presented here is a fixed one without learning. Experimental results show that our method removes impulse noise effectively, and at the same time preserves chromaticity and image details very well. PMID:23493998
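
    The detect-then-filter structure described above can be sketched as follows. Here a simple local-median deviation test stands in for the paper's wavelet-network detector, and the threshold value is an illustrative assumption:

```python
import numpy as np

def remove_impulse_noise(img, thresh=40):
    # stage 1 (detector): flag pixels deviating strongly from the local
    # median -- a simple stand-in for the wavelet-network detector
    pad = np.pad(img, 1, mode='edge')
    h, w = img.shape
    med = np.empty(img.shape)
    for i in range(h):
        for j in range(w):
            med[i, j] = np.median(pad[i:i+3, j:j+3])
    noisy = np.abs(img.astype(float) - med) > thresh
    # stage 2 (filter): replace only the flagged pixels with the median
    out = img.copy()
    out[noisy] = med[noisy].astype(img.dtype)
    return out
```

    Filtering only the flagged pixels is what preserves detail: uncorrupted pixels pass through untouched, unlike in a plain median filter.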

  8. Robust non-local median filter

    Science.gov (United States)

    Matsuoka, Jyohei; Koga, Takanori; Suetake, Noriaki; Uchino, Eiji

    2017-04-01

    This paper describes a novel image filter with superior performance on detail-preserving removal of random-valued impulse noise superimposed on natural gray-scale images. The non-local means filter is in the limelight as a way of removing Gaussian noise with superior detail preservation. Drawing on the fundamental concept of non-local means, we previously proposed a non-local median filter specialized for random-valued impulse noise removal. In non-local processing, the output of a filter is calculated from the pixels in blocks that are similar to the block centered at the pixel of interest. As a result, aggressive noise removal is performed without destroying the detailed structures of the original image. However, the performance of non-local processing degrades severely when the noise occurrence probability is high, because the superimposed noise disturbs accurate calculation of the similarity between blocks. To cope with this problem, we propose an improved non-local median filter that is robust to high levels of corruption, by introducing a new similarity measure that accounts for the possibility of each pixel retaining its original value. The effectiveness and validity of the proposed method are verified in a series of experiments using natural gray-scale images.
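
    The baseline non-local median idea reads, in a minimal NumPy sketch: for each pixel, rank all blocks in a search window by their squared distance to the block around the pixel of interest, then take the median of the centre pixels of the most similar blocks. The block size, search window, and number of retained blocks are illustrative assumptions, and the robust similarity measure of the paper is not included:

```python
import numpy as np

def nonlocal_median(img, patch=3, search=7, n_similar=10):
    # output = median of the centre pixels of the n_similar blocks
    # (inside the search window) most similar to the reference block
    pr, sr = patch // 2, search // 2
    pad = np.pad(img.astype(float), pr + sr, mode='reflect')
    h, w = img.shape
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            ci, cj = i + pr + sr, j + pr + sr
            ref = pad[ci-pr:ci+pr+1, cj-pr:cj+pr+1]
            dists, centres = [], []
            for di in range(-sr, sr + 1):
                for dj in range(-sr, sr + 1):
                    blk = pad[ci+di-pr:ci+di+pr+1, cj+dj-pr:cj+dj+pr+1]
                    dists.append(((blk - ref) ** 2).sum())
                    centres.append(pad[ci + di, cj + dj])
            order = np.argsort(dists)[:n_similar]
            out[i, j] = np.median(np.array(centres)[order])
    return out
```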

  9. Initial results of a new generation dual source CT system using only an in-plane comb filter for ultra-high resolution temporal bone imaging.

    Science.gov (United States)

    Meyer, Mathias; Haubenreisser, Holger; Raupach, Rainer; Schmidt, Bernhard; Lietzmann, Florian; Leidecker, Christianne; Allmendinger, Thomas; Flohr, Thomas; Schad, Lothar R; Schoenberg, Stefan O; Henzler, Thomas

    2015-01-01

    To prospectively evaluate radiation dose and image quality of a third generation dual-source CT (DSCT) without a z-axis filter behind the patient for temporal bone CT. Forty-five patients were examined on either a first, second, or third generation DSCT in an ultra-high-resolution (UHR) temporal bone-imaging mode. On the third generation DSCT system, the tighter focal spot of 0.2 mm(2) removes the necessity for an additional z-axis filter, leading to improved z-axis radiation dose efficiency. Images of 0.4 mm were reconstructed using standard filtered-back-projection or iterative reconstruction (IR) technique for previous generations of DSCT and a novel IR algorithm for the third generation DSCT. Radiation dose and image quality were compared between the three DSCT systems. The statistically significantly highest subjective and objective image quality was found for the third generation DSCT when compared to the first or second generation DSCT systems (all p < 0.05). Total effective dose was 63%/39% lower for the third generation examination as compared to the first and second generation DSCT. Temporal bone imaging without a z-axis UHR filter and a novel third generation IR algorithm allows for significantly higher image quality while lowering effective dose when compared to the first two generations of DSCTs. • Omitting the z-axis filter allows a reduction in radiation dose of 50% • A smaller focal spot of 0.2 mm(2) significantly improves spatial resolution • Ultra-high-resolution temporal bone CT helps to gain diagnostic information of the middle/inner ear.

  10. Influence of Respiratory Gating, Image Filtering, and Animal Positioning on High-Resolution Electrocardiography-Gated Murine Cardiac Single-Photon Emission Computed Tomography

    Directory of Open Access Journals (Sweden)

    Chao Wu

    2015-01-01

    Full Text Available Cardiac parameters obtained from single-photon emission computed tomographic (SPECT images can be affected by respiratory motion, image filtering, and animal positioning. We investigated the influence of these factors on ultra-high-resolution murine myocardial perfusion SPECT. Five mice were injected with 99m technetium (99mTc-tetrofosmin, and each was scanned in supine and prone positions in a U-SPECT-II scanner with respiratory and electrocardiographic (ECG gating. ECG-gated SPECT images were created without applying respiratory motion correction or with two different respiratory motion correction strategies. The images were filtered with a range of three-dimensional gaussian kernels, after which end-diastolic volumes (EDVs, end-systolic volumes (ESVs, and left ventricular ejection fractions were calculated. No significant differences in the measured cardiac parameters were detected when any strategy to reduce or correct for respiratory motion was applied, whereas big differences (> 5% in EDV and ESV were found with regard to different positioning of animals. A linear relationship (p < .001 was found between the EDV or ESV and the kernel size of the gaussian filter. In short, respiratory gating did not significantly affect the cardiac parameters of mice obtained with ultra-high-resolution SPECT, whereas the position of the animals and the image filters should be the same in a comparative study with multiple scans to avoid systematic differences in measured cardiac parameters.

  11. Hot spot detection for breast cancer in Ki-67 stained slides: image dependent filtering approach

    Science.gov (United States)

    Niazi, M. Khalid Khan; Downs-Kelly, Erinn; Gurcan, Metin N.

    2014-03-01

    We present a new method to detect hot spots from breast cancer slides stained for Ki67 expression. It is common practice to use the centroid of a nucleus as a surrogate representation of a cell. This often requires the detection of individual nuclei; once all the nuclei are detected, the hot spots are found by clustering the centroids. For large images, nuclei detection is computationally demanding. Instead of detecting the individual nuclei and treating hot spot detection as a clustering problem, we cast hot spot detection as an image filtering problem in which positively stained pixels are used to detect hot spots in breast cancer images. The method first segments the Ki-67-positive pixels using the visually meaningful segmentation (VMS) method that we developed earlier. Then, it automatically generates an image-dependent filter to produce a density map from the segmented image. The smoothness of the density image simplifies the detection of local maxima, and the number of local maxima directly corresponds to the number of hot spots in the breast cancer image. The method was tested on 23 regions of interest extracted from 10 different breast cancer slides stained with Ki67. To determine the intra-reader variability, each image was annotated twice for hot spots by a board-certified pathologist with a two-week interval between her two readings. A computer-generated hot spot region was considered a true positive if it agreed with either of the two annotation sets provided by the pathologist. While the intra-reader variability was 57%, our proposed method correctly detects hot spots with 81% precision.
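
    The filtering view of hot spot detection can be sketched as follows: blur the binary map of positively stained pixels into a density map, then count the local maxima. The Gaussian kernel standing in for the paper's image-dependent filter, its width, and the density threshold are all illustrative assumptions:

```python
import numpy as np

def count_hot_spots(positive_mask, sigma=2.0, min_density=0.2):
    # density map: blur the binary mask of positively stained pixels
    size = int(6 * sigma) | 1          # odd kernel size
    r = size // 2
    y, x = np.mgrid[-r:r+1, -r:r+1]
    k = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    k /= k.sum()
    pad = np.pad(positive_mask.astype(float), r, mode='constant')
    h, w = positive_mask.shape
    dens = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            dens[i, j] = (pad[i:i+size, j:j+size] * k).sum()
    # local maxima above a density threshold = hot spots
    peaks = 0
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            v = dens[i, j]
            if v >= min_density and v == dens[i-1:i+2, j-1:j+2].max():
                peaks += 1
    return peaks
```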

  12. Characteristics of Quoit filter, a digital filter developed for the extraction of circumscribed shadows, and its applications to mammograms

    International Nuclear Information System (INIS)

    Isobe, Yoshiaki; Ohkubo, Natsumi; Yamamoto, Shinji; Toriwaki, Jun-ichiro; Kobatake, Hidefumi.

    1993-01-01

    This paper presents a newly developed filter called the Quoit filter, which detects circumscribed shadows (concentric circular isolated patterns), such as typical cancer regions. The Quoit filter is based on mathematical morphology and has the following interesting properties. (1) The output of this filter is analytically expressible when the input image is assumed to be a concentric circular model (the output is predictable for typical inputs). (2) The filter can selectively reconstruct the original isolated models mentioned in (1) when it is applied sequentially twice. The filter was tested on the detection of cancer regions in X-ray mammograms, and for 12 cancer mammograms it achieved a true-positive cancer detection rate of 100%. (author)

  13. Switching non-local median filter

    Science.gov (United States)

    Matsuoka, Jyohei; Koga, Takanori; Suetake, Noriaki; Uchino, Eiji

    2015-06-01

    This paper describes a novel image filtering method for the removal of random-valued impulse noise superimposed on grayscale images. It is well known that switching-type median filters are effective for impulse noise removal. In this paper, we propose a more sophisticated switching-type impulse noise removal method in terms of detail-preserving performance. Specifically, the noise detector of the proposed method identifies noise-corrupted pixels from the difference between the value of the pixel of interest (POI) and the median of its neighboring pixel values, and from the POI's tendency to be isolated from the surrounding pixels. The removal of the detected noise is then performed by the newly proposed median filter based on non-local processing, which has superior detail-preservation capability compared to the conventional median filter. The effectiveness and validity of the proposed method are verified by experiments using natural grayscale images.

  14. Filtering and deconvolution for bioluminescence imaging of small animals; Filtrage et deconvolution en imagerie de bioluminescence chez le petit animal

    Energy Technology Data Exchange (ETDEWEB)

    Akkoul, S.

    2010-06-22

    This thesis is devoted to the analysis of bioluminescence images of small animals, an imaging modality used in cancerology studies. However, the light from internal bioluminescent sources is diffused and absorbed by the tissues, and both system noise and cosmic ray noise are present. This degrades image quality and makes the images difficult to analyze; the purpose of this thesis is to overcome these disturbing effects. We first proposed an image formation model for bioluminescence images. The processing chain consists of a filtering stage followed by a deconvolution stage. We proposed a new median filter to suppress the random-valued impulsive noise which corrupts the acquired images; this filter constitutes the first block of the proposed chain. For the deconvolution stage, we performed a comparative study of various deconvolution algorithms, which led us to choose a blind deconvolution algorithm initialized with the estimated point spread function of the acquisition system. We first validated our global approach by comparing our results with the ground truth. Through various clinical tests, we showed that the processing chain provides a significant improvement in spatial resolution and a better distinction of very close tumor sources, which represents a considerable contribution for the users of bioluminescence images. (author)

  15. Field programmable gate array based hardware implementation of a gradient filter for edge detection in colour images with subpixel precision

    International Nuclear Information System (INIS)

    Schellhorn, M; Rosenberger, M; Correns, M; Blau, M; Goepfert, A; Rueckwardt, M; Linss, G

    2010-01-01

    Within the field of industrial image processing, the use of colour cameras is becoming ever more common. Increasingly, the established black-and-white cameras are being replaced by economical single-chip colour cameras with a Bayer pattern. The additional colour information is particularly important for recognition and inspection tasks, and it also becomes interesting for geometric metrology if measuring tasks can be solved more robustly or more accurately. However, only few suitable algorithms are available to detect edges with the necessary precision, and all of them require additional computational effort. Based on a new filter for edge detection in colour images with subpixel precision, an implementation on a pre-processing hardware platform is presented. Hardware-implemented filters offer the advantage that they can easily be used with existing measuring software, since after filtering a single-channel image is present which unites the information of all colour channels. Advanced field programmable gate arrays represent an ideal platform for the parallel processing of multiple channels, although an efficient implementation demands considerable programming effort. Using the colour filter implementation as an example, the problems that arise are analyzed and the chosen solution is presented.

  16. Preliminary energy-filtering neutron imaging with time-of-flight method on PKUNIFTY: A compact accelerator based neutron imaging facility at Peking University

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Hu; Zou, Yubin, E-mail: zouyubin@pku.edu.cn; Wen, Weiwei; Lu, Yuanrong; Guo, Zhiyu

    2016-07-01

    Peking University Neutron Imaging Facility (PKUNIFTY) operates an accelerator-based neutron source with a repetition period of 10 ms and a pulse duration of 0.4 ms, which has a rather low Cd ratio. To improve the effective Cd ratio, and thus the detection capability of the facility, energy-filtering neutron imaging was realized with an intensified CCD camera and the time-of-flight (TOF) method. The time structure of the pulsed neutron source was first simulated with Geant4, and the simulation was evaluated against experiment. Both simulation and experiment indicated that fast and epithermal neutrons were concentrated in the first 0.8 ms of each pulse period, while in the interval 0.8–2.0 ms only thermal neutrons were present. Based on this result, neutron images with and without energy filtering were acquired, and it was shown that the detection capability of PKUNIFTY was improved by setting the exposure interval to 0.8–2.0 ms, especially for materials with strong moderating capability.

  17. Comparison of Deconvolution Filters for Photoacoustic Tomography.

    Directory of Open Access Journals (Sweden)

    Dominique Van de Sompel

    Full Text Available In this work, we compare the merits of three temporal data deconvolution methods for use in the filtered backprojection algorithm for photoacoustic tomography (PAT). We evaluate the standard Fourier division technique, the Wiener deconvolution filter, and a Tikhonov L-2 norm regularized matrix inversion method. Our experiments were carried out on subjects of various appearances, namely a pencil lead, two man-made phantoms, an in vivo subcutaneous mouse tumor model, and a perfused and excised mouse brain. All subjects were scanned using an imaging system with a rotatable hemispherical bowl, into which 128 ultrasound transducer elements were embedded in a spiral pattern. We characterized the frequency response of each deconvolution method, compared the final image quality achieved by each deconvolution technique, and evaluated each method's robustness to noise. The frequency response was quantified by measuring the accuracy with which each filter recovered the ideal flat frequency spectrum of an experimentally measured impulse response. Image quality under the various scenarios was quantified by computing noise versus resolution curves for a point source phantom, as well as the full width at half maximum (FWHM) and contrast-to-noise ratio (CNR) of selected image features such as dots and linear structures in additional imaging subjects. It was found that the Tikhonov filter yielded the most accurate balance of lower and higher frequency content (as measured by comparing the spectra of deconvolved impulse response signals to the ideal flat frequency spectrum), achieved a competitive image resolution and contrast-to-noise ratio, and yielded the greatest robustness to noise. While the Wiener filter achieved a similar image resolution, it tended to underrepresent the lower frequency content of the deconvolved signals, and hence of the reconstructed images after backprojection. In addition, its robustness to noise was poorer than that of the Tikhonov filter.
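
    All three deconvolution approaches compared here operate in the frequency domain. A minimal sketch of the Tikhonov-regularized variant (which behaves like Wiener deconvolution with a constant noise term, and reduces to plain Fourier division as the regularization weight goes to zero) might look like this; the weight `lam` is an illustrative assumption:

```python
import numpy as np

def tikhonov_deconv(signal, kernel, lam=1e-2):
    # X(f) = conj(H) Y / (|H|^2 + lam): plain Fourier division when
    # lam = 0, damping of low-|H| frequencies otherwise
    n = len(signal)
    H = np.fft.fft(kernel, n)
    Y = np.fft.fft(signal)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + lam)
    return np.real(np.fft.ifft(X))
```

    The regularization term is what gives the Tikhonov filter its noise robustness: frequencies where the impulse response has little energy are attenuated instead of amplified.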

  18. Initial results of a new generation dual source CT system using only an in-plane comb filter for ultra-high resolution temporal bone imaging

    International Nuclear Information System (INIS)

    Meyer, Mathias; Haubenreisser, Holger; Schoenberg, Stefan O.; Henzler, Thomas; Raupach, Rainer; Schmidt, Bernhard; Leidecker, Christianne; Allmendinger, Thomas; Flohr, Thomas; Lietzmann, Florian; Schad, Lothar R.

    2015-01-01

    To prospectively evaluate radiation dose and image quality of a third generation dual-source CT (DSCT) without z-axis filter behind the patient for temporal bone CT. Forty-five patients were either examined on a first, second, or third generation DSCT in an ultra-high-resolution (UHR) temporal bone-imaging mode. On the third generation DSCT system, the tighter focal spot of 0.2 mm 2 removes the necessity for an additional z-axis-filter, leading to an improved z-axis radiation dose efficiency. Images of 0.4 mm were reconstructed using standard filtered-back-projection or iterative reconstruction (IR) technique for previous generations of DSCT and a novel IR algorithm for the third generation DSCT. Radiation dose and image quality were compared between the three DSCT systems. The statistically significantly highest subjective and objective image quality was evaluated for the third generation DSCT when compared to the first or second generation DSCT systems (all p < 0.05). Total effective dose was 63 %/39 % lower for the third generation examination as compared to the first and second generation DSCT. Temporal bone imaging without z-axis-UHR-filter and a novel third generation IR algorithm allows for significantly higher image quality while lowering effective dose when compared to the first two generations of DSCTs. (orig.)

  19. Reducing contrast contamination in radial turbo-spin-echo acquisitions by combining a narrow-band KWIC filter with parallel imaging.

    Science.gov (United States)

    Neumann, Daniel; Breuer, Felix A; Völker, Michael; Brandt, Tobias; Griswold, Mark A; Jakob, Peter M; Blaimer, Martin

    2014-12-01

    Cartesian turbo spin-echo (TSE) and radial TSE images are usually reconstructed by assembling data containing different contrast information into a single k-space. This approach results in mixed contrast contributions in the images, which may reduce their diagnostic value. The goal of this work is to improve the image contrast of radial TSE acquisitions by reducing the contribution of signals with undesired contrast information. Radial TSE acquisitions allow the reconstruction of multiple images with different T2 contrasts using the k-space weighted image contrast (KWIC) filter. In this work, the image contrast is improved by reducing the bandwidth of the KWIC filter. Data for the reconstruction of a single image are selected from within a small temporal range around the desired echo time. The resulting dataset is undersampled and, therefore, an iterative parallel imaging algorithm is applied to remove aliasing artifacts. Radial TSE images of the human brain reconstructed with the proposed method show improved contrast when compared with Cartesian TSE images or radial TSE images with conventional KWIC reconstructions. The proposed method provides multi-contrast images from radial TSE data with contrasts similar to multi spin-echo images. Contaminations from unwanted contrast weightings are strongly reduced. © 2014 Wiley Periodicals, Inc.

  20. Determination of optimum filter in inferolateral view of myocardial SPECT

    International Nuclear Information System (INIS)

    Takavar; Eftekhari, M.; Fallahi, B.; Shamsipour, Gh.; Sohrabi, M.; Saghari, M.

    2004-01-01

    Background: In myocardial perfusion SPECT imaging, images are degraded by photon attenuation, the distance-dependent collimator-detector response, and photon scattering. As filters greatly affect the quality of nuclear medicine images, the determination of the optimum filter for the inferolateral view is the prime objective of this study. Materials and Methods: A phantom simulating the heart's left ventricle was built, and about 1 mCi of 99m Tc was injected into it. Images were taken from this phantom, and Parzen, Hamming, Hanning, Butterworth, and Gaussian filters were applied to them. By defining criteria such as contrast, signal-to-noise ratio, and defect-size detectability, the best filter was determined for the ADAC SPECT system at our nuclear medicine center. In addition, 27 patients who had previously undergone coronary angiography were included in the study. All of these patients showed significant stenosis in the left circumflex artery, and their myocardial SPECT images had inferolateral defects. The images of these patients were processed with 12 filters, including the optimum filters obtained from the phantom study and some non-optimum filters. A nuclear medicine physician scored each image from 0 to 4, with 0 for images that did not show the defect properly and 4 for the best image. The patient data were analyzed with the non-parametric Friedman test. Results: Cut-off frequencies of 0.325 and 0.5 (in units of the Nyquist frequency) were found to be optimum for the Hamming and Hanning filters, respectively. An order of 11 with a cut-off frequency of 0.45, and an order of 20 with a cut-off frequency of 0.5, were found to be optimum for the Butterworth and Gaussian filters, respectively. In the patient studies, the Butterworth filter with a cut-off frequency of 0.45 and an order of 11 produced the best quality images. Conclusion: In this study, the Butterworth filter with a cut-off frequency of 0.45 and an order of 11 was the optimum filter for the inferolateral view of myocardial SPECT.
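
    The Butterworth filter referred to above is the standard frequency-domain low-pass used in SPECT post-filtering. A minimal 2-D sketch, with the cut-off expressed as a fraction of the Nyquist frequency as in the abstract, might look like this:

```python
import numpy as np

def butterworth_lowpass(img, cutoff=0.45, order=11):
    # cutoff is expressed as a fraction of the Nyquist frequency
    h, w = img.shape
    fy = np.fft.fftfreq(h)[:, None]    # cycles/pixel, Nyquist = 0.5
    fx = np.fft.fftfreq(w)[None, :]
    f = np.sqrt(fy**2 + fx**2) / 0.5   # normalise so Nyquist -> 1.0
    H = 1.0 / (1.0 + (f / cutoff) ** (2 * order))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * H))
```

    Raising the order steepens the roll-off around the cut-off, which is why the study tunes both parameters jointly.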

  1. Kalman filtered MR temperature imaging for laser induced thermal therapies.

    Science.gov (United States)

    Fuentes, D; Yung, J; Hazle, J D; Weinberg, J S; Stafford, R J

    2012-04-01

    The feasibility of using a stochastic form of Pennes bioheat model within a 3-D finite element based Kalman filter (KF) algorithm is critically evaluated for the ability to provide temperature field estimates in the event of magnetic resonance temperature imaging (MRTI) data loss during laser induced thermal therapy (LITT). The ability to recover missing MRTI data was analyzed by systematically removing spatiotemporal information from a clinical MR-guided LITT procedure in human brain and comparing predictions in these regions to the original measurements. Performance was quantitatively evaluated in terms of a dimensionless L(2) (RMS) norm of the temperature error weighted by acquisition uncertainty. During periods of no data corruption, observed error histories demonstrate that the Kalman algorithm does not alter the high quality temperature measurement provided by MR thermal imaging. The KF-MRTI implementation considered is seen to predict the bioheat transfer with RMS error 10 sec.

  2. Speckle Reduction for Ultrasonic Imaging Using Frequency Compounding and Despeckling Filters along with Coded Excitation and Pulse Compression

    Directory of Open Access Journals (Sweden)

    Joshua S. Ullom

    2012-01-01

    Full Text Available A method for improving the contrast-to-noise ratio (CNR) while maintaining the −6 dB axial resolution of ultrasonic B-mode images is proposed. The technique proposed is known as eREC-FC, which enhances the recently developed REC-FC technique. REC-FC is a combination of the coded excitation technique known as resolution enhancement compression (REC) and the speckle-reduction technique frequency compounding (FC). In REC-FC, image CNR is improved but at the expense of a reduction in axial resolution. However, by compounding various REC-FC images made from various subband widths, the tradeoff between axial resolution and CNR enhancement can be extended. Further improvements in CNR can be obtained by applying postprocessing despeckling filters to the eREC-FC B-mode images. The despeckling filters evaluated were the following: median, Lee, homogeneous mask area, geometric, and speckle-reducing anisotropic diffusion (SRAD). Simulations and experimental measurements were conducted with a single-element transducer (f/2.66) having a center frequency of 2.25 MHz and a −3 dB bandwidth of 50%. In simulations and experiments, the eREC-FC technique resulted in the same axial resolution that would typically be observed with conventional excitation with a pulse. Moreover, increases in CNR of 348% were obtained in experiments when comparing eREC-FC with a Lee filter to conventional pulsing methods.

  3. GPU Accelerated Vector Median Filter

    Science.gov (United States)

    Aras, Rifat; Shen, Yuzhong

    2011-01-01

    Noise reduction is an important step for most image processing tasks. For three-channel color images, a widely used technique is the vector median filter, in which the color values of pixels are treated as 3-component vectors. Vector median filters are computationally expensive: for a window size of n x n, each of the n(sup 2) vectors has to be compared with the other n(sup 2) - 1 vectors in terms of distances. General-purpose computation on graphics processing units (GPUs) is the paradigm of utilizing high-performance many-core GPU architectures for computation tasks that are normally handled by CPUs. In this work, NVIDIA's Compute Unified Device Architecture (CUDA) paradigm is used to accelerate vector median filtering, which to the best of our knowledge has never been done before. The performance of the GPU-accelerated vector median filter is compared to that of CPU and MPI-based versions for different image and window sizes. Initial findings of the study showed a 100x performance improvement of the vector median filter implementation on GPUs over CPU implementations, and further speed-up is expected after more extensive optimizations of the GPU algorithm.
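
    A plain CPU reference of the vector median filter described above (the GPU version parallelizes the same per-window computation across threads) might look like this NumPy sketch; the vector median of a window is the vector minimising the sum of distances to all other vectors in the window:

```python
import numpy as np

def vector_median(window):
    # window: (n, 3) array of RGB vectors; pick the vector with the
    # smallest sum of Euclidean distances to all other vectors
    v = window.astype(float)
    d = np.linalg.norm(v[:, None, :] - v[None, :, :], axis=2)
    return window[d.sum(axis=1).argmin()]

def vector_median_filter(img, k=3):
    r = k // 2
    pad = np.pad(img, ((r, r), (r, r), (0, 0)), mode='edge')
    out = np.empty_like(img)
    h, w, _ = img.shape
    for i in range(h):
        for j in range(w):
            win = pad[i:i+k, j:j+k].reshape(-1, 3)
            out[i, j] = vector_median(win)
    return out
```

    Note the k² × k² distance matrix per window: this is exactly the cost the CUDA implementation distributes across GPU threads.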

  4. Classification of Textures Using Filter Based Local Feature Extraction

    Directory of Open Access Journals (Sweden)

    Bocekci Veysel Gokhan

    2016-01-01

    Full Text Available In this work, local features are used in the feature extraction process for texture images. The local binary pattern (LBP) feature extraction method for textures is introduced. Filtering is also used during feature extraction to obtain discriminative features. To show the effectiveness of the algorithm, three different types of noise are added to both the train and test images before the extraction process. Wiener and median filters are used to remove the noise from the images. We evaluate the performance of the method with a Naïve Bayesian classifier and conduct a comparative analysis on a benchmark dataset with different filters and window sizes. Our experiments demonstrate that the feature extraction process combined with filtering gives promising results on noisy images.
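
    The basic 8-neighbour LBP descriptor mentioned above can be sketched as follows: each neighbour is thresholded against the centre pixel and the resulting bits are packed into a code in 0..255, whose normalised histogram serves as the texture feature vector. The bit ordering is an illustrative convention:

```python
import numpy as np

def lbp_image(img):
    # 8-neighbour local binary pattern: threshold each neighbour
    # against the centre pixel and pack the bits into a code 0..255
    g = img.astype(int)
    c = g[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        nb = g[1+dy:g.shape[0]-1+dy, 1+dx:g.shape[1]-1+dx]
        code |= (nb >= c).astype(int) << bit
    return code

def lbp_histogram(img):
    h, _ = np.histogram(lbp_image(img), bins=256, range=(0, 256))
    return h / h.sum()   # normalised texture descriptor
```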

  5. FLOWING BILATERAL FILTER: DEFINITION AND IMPLEMENTATIONS

    Directory of Open Access Journals (Sweden)

    Maxime Moreaud

    2015-06-01

    Full Text Available The bilateral filter plays a key role in image processing applications due to its intuitive parameterization and high-quality results, smoothing homogeneous regions while preserving the edges of objects. Considering the image as a topological relief, with pixel intensities seen as peaks and valleys, we introduce a way to control the tonal weighting coefficients, the flowing bilateral filter, reducing the "halo" artifacts typically produced by the regular bilateral filter around a large peak surrounded by two valleys of lower values. In this paper we investigate exact and approximated versions of CPU and parallel GPU (Graphics Processing Unit) based implementations of the regular and flowing bilateral filter using the NVidia CUDA API. Fast implementations of these filters are important for the processing of large 3D volumes of up to several GB acquired by X-ray or electron tomography.
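
    The regular bilateral filter that the flowing variant builds on weights each neighbour by both spatial closeness and tonal (intensity) similarity, so flat regions are averaged while pixels across an edge contribute almost nothing. A minimal grayscale sketch, with illustrative sigma values, is:

```python
import numpy as np

def bilateral_filter(img, sigma_s=2.0, sigma_r=25.0):
    # weight = spatial Gaussian * tonal Gaussian; normalise per pixel
    r = int(2 * sigma_s)
    pad = np.pad(img.astype(float), r, mode='edge')
    y, x = np.mgrid[-r:r+1, -r:r+1]
    spatial = np.exp(-(x**2 + y**2) / (2 * sigma_s**2))
    h, w = img.shape
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            win = pad[i:i+2*r+1, j:j+2*r+1]
            tonal = np.exp(-(win - pad[i+r, j+r])**2 / (2 * sigma_r**2))
            wgt = spatial * tonal
            out[i, j] = (wgt * win).sum() / wgt.sum()
    return out
```

    The "halo" artifact the paper targets comes from the tonal term: near a large peak, valley pixels on both sides receive almost-zero weight, which the flowing variant's modified tonal coefficients are designed to mitigate.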

  6. Signal-to-noise ratio enhancement on SEM images using a cubic spline interpolation with Savitzky-Golay filters and weighted least squares error.

    Science.gov (United States)

    Kiani, M A; Sim, K S; Nia, M E; Tso, C P

    2015-05-01

    A new technique based on cubic spline interpolation with Savitzky-Golay smoothing and a weighted least squares error filter is developed for scanning electron microscope (SEM) images. A diversity of sample images is captured, and the performance is found to be better than that of the moving average and standard median filters with respect to eliminating noise. The technique can be implemented efficiently on real-time SEM images, with all the data required for processing obtained from a single image. Noise in images, and particularly in SEM images, is undesirable. We apply the combined technique to single-image signal-to-noise ratio estimation and noise reduction for the SEM imaging system. This autocorrelation-based technique requires image details to be correlated over a few pixels, whereas the noise is assumed to be uncorrelated from pixel to pixel. The noise component is derived from the difference between the image autocorrelation at zero offset and the estimate of the corresponding original autocorrelation. In the test cases involving different images, the efficiency of the developed noise reduction filter proved to be significantly better than that of the other methods. Noise can be reduced efficiently with an appropriate choice of scan rate from real-time SEM images, without generating corruption or increasing scanning time. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.
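
    Savitzky-Golay smoothing fits a low-order polynomial to each sliding window by least squares and evaluates the fit at the window centre, which reduces to a fixed convolution. A self-contained sketch (SciPy users would typically reach for `scipy.signal.savgol_filter` instead; window and order below are illustrative) is:

```python
import numpy as np

def savgol_coeffs(window, polyorder):
    # least-squares fit of a polynomial over a symmetric window; the
    # smoothed value is the fit evaluated at the window centre (x = 0)
    half = window // 2
    x = np.arange(-half, half + 1)
    A = np.vander(x, polyorder + 1, increasing=True)
    # row 0 of the pseudo-inverse gives the convolution coefficients
    return np.linalg.pinv(A)[0]

def savgol_smooth(signal, window=7, polyorder=2):
    c = savgol_coeffs(window, polyorder)
    pad = window // 2
    ext = np.pad(signal, pad, mode='edge')
    # correlate with the coefficients (convolve with reversed kernel)
    return np.convolve(ext, c[::-1], mode='valid')
```

    Unlike a moving average, this filter reproduces polynomials up to the chosen order exactly, which is why it preserves peak shapes while suppressing noise.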

  7. Electrical Resistance Imaging of Bubble Boundary in Annular Two-Phase Flows Using Unscented Kalman Filter

    International Nuclear Information System (INIS)

    Lee, Jeong Seong; Chung, Soon Il; Ljaz, Umer Zeeshan; Khambampati, Anil Kumar; Kim, Kyung Youn; Kim, Sin Kim

    2007-01-01

    For the visualization of the phase boundary in annular two-phase flows, the electrical resistance tomography (ERT) technique is introduced. In ERT, a set of predetermined electrical currents is injected through the electrodes placed on the boundary of the flow passage and the induced electrical potentials are measured on the electrodes. From the relationship between the injected currents and the induced voltages, the electrical conductivity distribution across the flow domain is estimated through the image reconstruction algorithm. Here, the conductivity distribution corresponds to the phase distribution. In the application of ERT to two-phase flows, where there are only two conductivity values, the conductivity distribution estimation problem can be transformed into a boundary estimation problem. This paper considers bubble boundary estimation with ERT in annular two-phase flows. As the image reconstruction algorithm, the unscented Kalman filter (UKF) is adopted, since control theory reports that the UKF performs better than the extended Kalman filter (EKF) that has been commonly used. We formulated the UKF algorithm to be incorporated into the image reconstruction algorithm for the present problem. Phantom experiments have also been conducted to evaluate the improvement achieved by the UKF.
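
The reason the UKF outperforms the EKF is the unscented transform: instead of linearizing the model, it propagates a small set of sigma points through the exact nonlinearity. The scalar toy example below (not the paper's ERT formulation) shows the transform recovering the exact mean and variance of a squared Gaussian, where linearization would miss the variance-induced shift.

```python
import numpy as np

def unscented_transform(mean, var, f, kappa=2.0):
    """Propagate a scalar Gaussian (mean, var) through a nonlinear f
    using symmetric sigma points, the core operation of the UKF. For a
    1D state, kappa = 3 - n = 2 matches the Gaussian fourth moment."""
    n = 1
    spread = np.sqrt((n + kappa) * var)
    sigma = np.array([mean, mean + spread, mean - spread])
    w = np.array([kappa / (n + kappa), 0.5 / (n + kappa), 0.5 / (n + kappa)])
    y = f(sigma)
    y_mean = np.sum(w * y)
    y_var = np.sum(w * (y - y_mean) ** 2)
    return y_mean, y_var

# For x ~ N(1, 0.1) pushed through x**2 the exact mean is
# E[x^2] = 1^2 + 0.1 = 1.1 and the exact variance is
# 4*mu^2*sigma^2 + 2*sigma^4 = 0.42; the sigma points recover both,
# whereas EKF-style linearization would report a mean of just 1.0.
m, v = unscented_transform(1.0, 0.1, lambda x: x ** 2)
```

In the full UKF, this transform is applied to both the state-transition and the measurement model inside the usual predict/update cycle.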

  8. Filtering algorithm for dotted interferences

    Energy Technology Data Exchange (ETDEWEB)

    Osterloh, K., E-mail: kurt.osterloh@bam.de [Federal Institute for Materials Research and Testing (BAM), Division VIII.3, Radiological Methods, Unter den Eichen 87, 12205 Berlin (Germany); Buecherl, T.; Lierse von Gostomski, Ch. [Technische Universitaet Muenchen, Lehrstuhl fuer Radiochemie, Walther-Meissner-Str. 3, 85748 Garching (Germany); Zscherpel, U.; Ewert, U. [Federal Institute for Materials Research and Testing (BAM), Division VIII.3, Radiological Methods, Unter den Eichen 87, 12205 Berlin (Germany); Bock, S. [Technische Universitaet Muenchen, Lehrstuhl fuer Radiochemie, Walther-Meissner-Str. 3, 85748 Garching (Germany)

    2011-09-21

    An algorithm has been developed to reliably remove dotted interferences that impair the perceptibility of objects within a radiographic image. This is a particular challenge for neutron radiographs collected at the NECTAR facility, Forschungs-Neutronenquelle Heinz Maier-Leibnitz (FRM II): the resulting images are dominated by features resembling a snow flurry. These artefacts are caused by scattered neutrons, gamma radiation, cosmic radiation, etc., all hitting the detector CCD directly in spite of sophisticated shielding. This makes such images rather useless for further direct evaluation. One approach to this problem of random effects would be to collect a vast number of single images, combine them appropriately and process them with common image filtering procedures. However, it has been shown that median filtering, for example, depending on the kernel size in the plane and/or the number of single shots combined, is either insufficient or tends to blur sharply delineated structures. This makes visually controlled, image-by-image processing unavoidable. In tomographic studies in particular, it would be far too tedious to treat each single projection in this way. It is therefore not only more convenient but in many cases the only reasonable approach to filter a stack of images in a batch procedure to get rid of the disturbing interferences. The algorithm presented here meets all these requirements. It reliably frees the images from the snowy pattern described above without loss of fine structures and without a general blurring of the image. It consists of an iterative, parameter-free filtering algorithm, suitable for batch processing, that aims to eliminate the often complex interfering artefacts while leaving the original information as untouched as possible.
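
A loose, illustrative analogue of such a filter is an iterative replace-only-outliers scheme: pixels deviating strongly from their local median are replaced by it, everything else is left untouched, so fine structures survive while the "snow" is removed. The fixed threshold below is a simplification of the paper's parameter-free criterion.

```python
import numpy as np
from scipy.ndimage import median_filter

def despeckle_dots(image, threshold=3.0, max_iter=10):
    """Iteratively replace only those pixels deviating strongly from
    their local median; all other pixels are left untouched. A loose
    analogue of the batch filter described above -- the fixed threshold
    is a simplifying assumption, not the paper's criterion."""
    img = image.astype(float).copy()
    for _ in range(max_iter):
        med = median_filter(img, size=3)
        resid = img - med
        sigma = 1.4826 * np.median(np.abs(resid))  # robust noise scale
        if sigma == 0:
            break
        outliers = np.abs(resid) > threshold * sigma
        if not outliers.any():
            break
        img[outliers] = med[outliers]
    return img

# A smooth gradient with mild noise, peppered with bright "snow" dots.
rng = np.random.default_rng(0)
base = np.add.outer(np.linspace(0, 1, 64), np.linspace(0, 1, 64)) / 2
dotted = base + rng.normal(0.0, 0.01, base.shape)
dotted[rng.integers(0, 64, 50), rng.integers(0, 64, 50)] = 5.0
cleaned = despeckle_dots(dotted)
```

Unlike a plain median filter, untouched pixels keep their exact values, so the gradient (standing in for genuine image structure) is not blurred.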

  9. Filtering algorithm for dotted interferences

    International Nuclear Information System (INIS)

    Osterloh, K.; Buecherl, T.; Lierse von Gostomski, Ch.; Zscherpel, U.; Ewert, U.; Bock, S.

    2011-01-01

    An algorithm has been developed to reliably remove dotted interferences that impair the perceptibility of objects within a radiographic image. This is a particular challenge for neutron radiographs collected at the NECTAR facility, Forschungs-Neutronenquelle Heinz Maier-Leibnitz (FRM II): the resulting images are dominated by features resembling a snow flurry. These artefacts are caused by scattered neutrons, gamma radiation, cosmic radiation, etc., all hitting the detector CCD directly in spite of sophisticated shielding. This makes such images rather useless for further direct evaluation. One approach to this problem of random effects would be to collect a vast number of single images, combine them appropriately and process them with common image filtering procedures. However, it has been shown that median filtering, for example, depending on the kernel size in the plane and/or the number of single shots combined, is either insufficient or tends to blur sharply delineated structures. This makes visually controlled, image-by-image processing unavoidable. In tomographic studies in particular, it would be far too tedious to treat each single projection in this way. It is therefore not only more convenient but in many cases the only reasonable approach to filter a stack of images in a batch procedure to get rid of the disturbing interferences. The algorithm presented here meets all these requirements. It reliably frees the images from the snowy pattern described above without loss of fine structures and without a general blurring of the image. It consists of an iterative, parameter-free filtering algorithm, suitable for batch processing, that aims to eliminate the often complex interfering artefacts while leaving the original information as untouched as possible.

  10. Intraindividual evaluation of the influence of iterative reconstruction and filter kernel on subjective and objective image quality in computed tomography of the brain

    Energy Technology Data Exchange (ETDEWEB)

    Buhk, J.H. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Dept. of Neuroradiology; Laqmani, A.; Schultzendorff, H.C. von; Hammerle, D.; Adam, G.; Regier, M. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Dept. of Diagnostic and Interventional Radiology; Sehner, S. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Inst. of Medical Biometry and Epidemiology; Fiehler, J. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Neuroradiology; Nagel, H.D. [Dr. HD Nagel, Science and Technology for Radiology, Buchholz (Germany)

    2013-08-15

    Objectives: To intraindividually evaluate the potential of 4th generation iterative reconstruction (IR) on brain CT with regard to subjective and objective image quality. Methods: 31 consecutive raw data sets of clinical routine native sequential brain CT scans were reconstructed with IR levels 0 (= filtered back projection), 1, 3 and 4; three different brain filter kernels (smooth/standard/sharp) were applied in each case. Five independent radiologists with different levels of experience performed subjective image rating. Detailed ROI analysis of image contrast and noise was performed. Statistical analysis was carried out by applying a random intercept model. Results: Subjective scores for the smooth and the standard kernels were best at low IR levels, but both, in particular the smooth kernel, scored worse with increasing IR level. The sharp kernel scored lowest at IR 0, while its scores increased substantially at high IR levels, reaching the significantly best scores at IR 4. Objective measurements revealed an overall increase in contrast-to-noise ratio at higher IR levels, which was highest when applying the soft filter kernel. The absolute grey-white contrast decreased with increasing IR level and was highest when applying the sharp filter kernel. All subjective effects were independent of the raters' experience and the patients' age and sex. Conclusion: Different combinations of IR level and filter kernel substantially influence subjective and objective image quality of brain CT. (orig.)
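
The objective ROI analysis described above boils down to contrast-to-noise measurements of the kind sketched below, shown here on a synthetic two-level "grey/white" phantom (illustrative NumPy code, not the study's data). Smoothing stands in for a softer kernel or higher IR level: it lowers noise and raises CNR while the underlying contrast is unchanged.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def roi_cnr(image, roi_a, roi_b):
    """Contrast-to-noise ratio between two ROIs: mean difference divided
    by the pooled standard deviation (the noise)."""
    a, b = image[roi_a], image[roi_b]
    noise = np.sqrt(0.5 * (a.var() + b.var()))
    return abs(a.mean() - b.mean()) / noise

# Toy phantom: two tissue levels (1.0 and 1.2) plus Gaussian noise.
rng = np.random.default_rng(0)
img = np.where(np.arange(64)[None, :] < 32, 1.0, 1.2)
img = img + rng.normal(0.0, 0.05, (64, 64))
roi_grey = (slice(None), slice(0, 28))    # ROIs avoid the boundary
roi_white = (slice(None), slice(36, 64))
cnr_sharp = roi_cnr(img, roi_grey, roi_white)
cnr_smooth = roi_cnr(uniform_filter(img, size=3), roi_grey, roi_white)
```

The same trade-off the study reports is visible here: smoothing raises CNR, but only because noise drops; the absolute contrast between the two ROI means is not improved by it.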

  11. Intraindividual evaluation of the influence of iterative reconstruction and filter kernel on subjective and objective image quality in computed tomography of the brain

    International Nuclear Information System (INIS)

    Buhk, J.H.

    2013-01-01

    Objectives: To intraindividually evaluate the potential of 4th generation iterative reconstruction (IR) on brain CT with regard to subjective and objective image quality. Methods: 31 consecutive raw data sets of clinical routine native sequential brain CT scans were reconstructed with IR levels 0 (= filtered back projection), 1, 3 and 4; three different brain filter kernels (smooth/standard/sharp) were applied in each case. Five independent radiologists with different levels of experience performed subjective image rating. Detailed ROI analysis of image contrast and noise was performed. Statistical analysis was carried out by applying a random intercept model. Results: Subjective scores for the smooth and the standard kernels were best at low IR levels, but both, in particular the smooth kernel, scored worse with increasing IR level. The sharp kernel scored lowest at IR 0, while its scores increased substantially at high IR levels, reaching the significantly best scores at IR 4. Objective measurements revealed an overall increase in contrast-to-noise ratio at higher IR levels, which was highest when applying the soft filter kernel. The absolute grey-white contrast decreased with increasing IR level and was highest when applying the sharp filter kernel. All subjective effects were independent of the raters' experience and the patients' age and sex. Conclusion: Different combinations of IR level and filter kernel substantially influence subjective and objective image quality of brain CT. (orig.)

  12. Gated myocardial SPECT using spatial and temporal filtering

    International Nuclear Information System (INIS)

    Hatton, R.L.; Hutton, B.F.; Kyme, A.Z.; Larcos, G.

    2002-01-01

    Full text: Standard protocols for examining myocardial perfusion and motion defects involve the use of gated SPECT images and a composite of the gated frames. This study examines the usefulness of extracting one or a combination of frames from the gated image to assess perfusion, and whether the addition of a temporal filter to the gated image improves the signal-to-noise ratio. The choice of the most appropriate frame was also considered. Sixteen- and eight-frame gated SPECT studies were simulated using the dynamic NURBS-based cardiac torso (NCAT) phantom. Variously sized perfusion defects were included in the inferior wall to assess contrast with normal tissue. Scatter and attenuation were not included. Butterworth spatial cutoff frequencies were varied to establish the most appropriate combination of temporal/spatial filters to reduce noise and retain contrast in the images. The 16-frame data produced higher ejection fractions across all spatial filter cutoffs and were generally unaffected by temporal filtering. Temporal filtering reduced the noise in a uniform liver region in the gated images to within 25% of the composite image noise. The lesion extent and contrast were greater in the end-diastolic frames than in the end-systolic and mid-cycle frames. In conclusion, by using a temporally filtered end-diastolic image from the gated sequence, a favourable balance between noise and contrast can be achieved. Work is in progress to confirm these findings in the clinical situation. Copyright (2002) The Australian and New Zealand Society of Nuclear Medicine Inc
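
A temporal filter over gated frames is simply a weighted average of each frame with its neighbours in the cardiac cycle, with the gate treated as cyclic. The sketch below (illustrative weights, not the study's filter) shows the noise reduction on pure-noise frames.

```python
import numpy as np

def temporal_filter(frames, weights=(0.25, 0.5, 0.25)):
    """Cyclic temporal smoothing of a gated sequence: each frame is
    replaced by a weighted average of itself and its neighbours in the
    cardiac cycle (the gate wraps, so frame 0 and frame N-1 are
    adjacent)."""
    frames = np.asarray(frames, dtype=float)
    w_prev, w_self, w_next = weights
    return (w_prev * np.roll(frames, 1, axis=0)
            + w_self * frames
            + w_next * np.roll(frames, -1, axis=0))

# Eight gated "frames" of pure noise: temporal filtering lowers the
# noise level while preserving the mean counts.
rng = np.random.default_rng(0)
frames = rng.normal(0.0, 1.0, (8, 16, 16))
smoothed = temporal_filter(frames)
```

With independent frame noise, these weights cut the noise variance to 0.25² + 0.5² + 0.25² = 0.375 of its original value, which is why a temporally filtered single frame can approach the composite image's noise level.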

  13. SU-D-207-07: Implementation of Full/half Bowtie Filter Model in a Commercial Treatment Planning System for Kilovoltage X-Ray Imaging Dose Estimation

    International Nuclear Information System (INIS)

    Kim, S; Alaei, P

    2015-01-01

    Purpose: To implement full/half bowtie filter models in a commercial treatment planning system (TPS) to calculate the kilovoltage (kV) x-ray imaging dose of the Varian On-Board Imager (OBI) cone beam CT (CBCT) system. Methods: Full/half bowtie filters of the Varian OBI were created as compensator models in the Pinnacle TPS (version 9.6) using Matlab software (version 2011a). The profiles of both bowtie filters were acquired from the manufacturer, imported into the Matlab system and hard-coded in binary file format. A Pinnacle script was written to import each bowtie filter data set into a Pinnacle treatment plan as a compensator. A kV x-ray beam model without the compensator was commissioned for each bowtie filter setting based on percent depth dose and lateral profile data acquired from Monte Carlo simulations. To validate the bowtie filter models, a rectangular water phantom was generated in the planning system and an anterior/posterior beam with each bowtie filter was created. Using the Pinnacle script, each bowtie filter compensator was added to the treatment plan. The lateral profile at a depth of 3 cm and the percent depth dose were measured using an ion chamber and compared with the data extracted from the treatment plans. Results: The kV x-ray beams for both the full and half bowtie filter have been modeled in a commercial TPS. The differences in lateral and depth dose profiles between the dose calculations and ion chamber measurements were within 6%. Conclusion: Both full/half bowtie filter models provide reasonable results in kV x-ray dose calculations in the water phantom. This study demonstrates the possibility of using a model-based treatment planning system to calculate the kV imaging dose for both full and half bowtie filter modes. Further study is to be performed to evaluate the models in clinical situations.

  14. Supervised retinal vessel segmentation from color fundus images based on matched filtering and AdaBoost classifier.

    Directory of Open Access Journals (Sweden)

    Nogol Memari

    Full Text Available The structure and appearance of the blood vessel network in retinal fundus images is an essential part of diagnosing various problems associated with the eyes, such as diabetes and hypertension. In this paper, an automatic retinal vessel segmentation method utilizing matched filter techniques coupled with an AdaBoost classifier is proposed. The fundus image is enhanced using morphological operations, the contrast is increased using the contrast limited adaptive histogram equalization (CLAHE) method, and the inhomogeneity is corrected using the Retinex approach. Then, the blood vessels are enhanced using a combination of B-COSFIRE and Frangi matched filters. From this preprocessed image, different statistical features are computed on a pixel-wise basis and used in an AdaBoost classifier to extract the blood vessel network inside the image. Finally, the segmented images are postprocessed to remove misclassified pixels and regions. The proposed method was validated using the publicly accessible Digital Retinal Images for Vessel Extraction (DRIVE), Structured Analysis of the Retina (STARE) and Child Heart and Health Study in England (CHASE_DB1) datasets commonly used for determining the accuracy of retinal vessel segmentation methods. The accuracy of the proposed segmentation method was comparable to other state-of-the-art methods while being very close to the manual segmentation provided by the second human observer, with average accuracies of 0.972, 0.951 and 0.948 on the DRIVE, STARE and CHASE_DB1 datasets, respectively.
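
The matched-filtering stage can be illustrated with a minimal Gaussian line-kernel bank: a zero-mean kernel matched to a line profile is rotated over a range of angles and the maximum response is kept. This is a toy stand-in; the paper's B-COSFIRE and Frangi filters, CLAHE and Retinex preprocessing are not reproduced here.

```python
import numpy as np
from scipy.ndimage import correlate, rotate

def gaussian_line_kernel(length=9, sigma=1.5):
    """Kernel matched to a bright line: Gaussian cross-section, constant
    along the line, made zero-mean so flat regions give no response."""
    y = np.arange(length) - length // 2
    profile = np.exp(-y ** 2 / (2.0 * sigma ** 2))
    k = np.tile(profile[:, None], (1, length))
    return k - k.mean()

def matched_filter_response(image, angles=range(0, 180, 15)):
    """Maximum response over a bank of rotated line kernels, the classic
    matched-filter scheme for vessel enhancement."""
    base = gaussian_line_kernel()
    responses = [correlate(image, rotate(base, a, reshape=False, order=1),
                           mode='nearest') for a in angles]
    return np.max(responses, axis=0)

# A single horizontal "vessel" lights up strongly under the kernel bank,
# while empty background gives zero response.
img = np.zeros((32, 32))
img[16, :] = 1.0
resp = matched_filter_response(img)
```

In a full pipeline, this response map (or per-angle responses) would feed the pixel-wise feature vector handed to the AdaBoost classifier.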

  15. Filtered region of interest cone-beam rotational angiography

    International Nuclear Information System (INIS)

    Schafer, Sebastian; Noeel, Peter B.; Walczak, Alan M.; Hoffmann, Kenneth R.

    2010-01-01

    Purpose: Cone-beam rotational angiography (CBRA) is widely used in modern clinical settings. In a number of procedures, the area of interest is considerably smaller than the field of view (FOV) of the detector, subjecting the patient to potentially unnecessary x-ray dose. The authors therefore propose a filter-based method to reduce the dose in the regions of low interest while supplying high image quality in the region of interest (ROI). Methods: For such procedures, the authors propose a method of filtered region of interest (FROI)-CBRA. In the authors' approach, a gadolinium filter with a circular central opening is placed into the x-ray beam during image acquisition. The central region is imaged with high contrast, while peripheral regions are subjected to substantially lower intensity and dose through beam filtering. The resulting images contain a high contrast/intensity ROI, a low contrast/intensity peripheral region, and a transition region in between. To equalize the two regions' intensities, the first projection of the acquisition is performed with and without the filter in place. The equalization relationship, based on Beer's law, is established through linear regression using corresponding filtered and nonfiltered data. The transition region is equalized based on radial profiles. Results: Evaluations in 2D and 3D show no visible difference between conventional and FROI-CBRA projection images and reconstructions in the ROI. CNR evaluations show similar image quality in the ROI, with a reduced CNR in the reconstructed peripheral region. In all filtered projection images, the scatter fraction inside the ROI was reduced. Theoretical and experimental dose evaluations show a considerable dose reduction; using an ROI half the original FOV reduces the dose by 60% for a filter thickness of 1.29 mm. Conclusions: These results indicate the potential of FROI-CBRA to reduce the dose to the patient while supplying the physician with the desired image quality.
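
The Beer's-law equalization step lends itself to a short sketch: regressing log-intensities of the filtered first projection against the unfiltered one recovers the filter's attenuation factor, which then rescales the peripheral region back to ROI intensity levels. The numbers below are synthetic, not clinical data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated first projection acquired twice: once unfiltered and once
# through a uniform attenuator obeying Beer's law, I = I0*exp(-mu*t),
# with 1% measurement noise. All values are illustrative.
i_unfiltered = rng.uniform(500.0, 4000.0, 200)
mu_t = 1.2
i_filtered = i_unfiltered * np.exp(-mu_t) * rng.normal(1.0, 0.01, 200)

# Linear regression in the log domain recovers the attenuation:
# ln(I_filt) = ln(I_unfilt) - mu*t, so slope ~ 1 and intercept ~ -mu*t.
slope, intercept = np.polyfit(np.log(i_unfiltered), np.log(i_filtered), 1)

# The fitted relationship equalizes filtered (peripheral) intensities
# back to the unfiltered (ROI) level.
equalized = np.exp((np.log(i_filtered) - intercept) / slope)
```

A pure Beer's-law filter gives slope 1 exactly; fitting the slope as well absorbs mild beam-hardening or detector nonlinearity in the real data.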

  16. Filtered region of interest cone-beam rotational angiography

    Energy Technology Data Exchange (ETDEWEB)

    Schafer, Sebastian; Noeel, Peter B.; Walczak, Alan M.; Hoffmann, Kenneth R. [Department of Mechanical Engineering, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States); Department of Neurosurgery, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States) and Toshiba Stroke Research Center, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States); Department of Neurosurgery, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States); Department of Computer Science, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States) and Toshiba Stroke Research Center, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States); Department of Neurosurgery, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 and Toshiba Stroke Research Center, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States); Department of Mechanical Engineering, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States); Department of Neurosurgery, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States); Department of Computer Science, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States) and Toshiba Stroke Research Center, SUNY at Buffalo, 3435 Main Street, Buffalo, New York 14214 (United States)

    2010-02-15

    Purpose: Cone-beam rotational angiography (CBRA) is widely used in modern clinical settings. In a number of procedures, the area of interest is considerably smaller than the field of view (FOV) of the detector, subjecting the patient to potentially unnecessary x-ray dose. The authors therefore propose a filter-based method to reduce the dose in the regions of low interest while supplying high image quality in the region of interest (ROI). Methods: For such procedures, the authors propose a method of filtered region of interest (FROI)-CBRA. In the authors' approach, a gadolinium filter with a circular central opening is placed into the x-ray beam during image acquisition. The central region is imaged with high contrast, while peripheral regions are subjected to substantially lower intensity and dose through beam filtering. The resulting images contain a high contrast/intensity ROI, a low contrast/intensity peripheral region, and a transition region in between. To equalize the two regions' intensities, the first projection of the acquisition is performed with and without the filter in place. The equalization relationship, based on Beer's law, is established through linear regression using corresponding filtered and nonfiltered data. The transition region is equalized based on radial profiles. Results: Evaluations in 2D and 3D show no visible difference between conventional and FROI-CBRA projection images and reconstructions in the ROI. CNR evaluations show similar image quality in the ROI, with a reduced CNR in the reconstructed peripheral region. In all filtered projection images, the scatter fraction inside the ROI was reduced. Theoretical and experimental dose evaluations show a considerable dose reduction; using an ROI half the original FOV reduces the dose by 60% for a filter thickness of 1.29 mm. Conclusions: These results indicate the potential of FROI-CBRA to reduce the dose to the patient while supplying the physician with the desired image quality.

  17. Bowtie filters for dedicated breast CT: Analysis of bowtie filter material selection

    Energy Technology Data Exchange (ETDEWEB)

    Kontson, Kimberly, E-mail: Kimberly.Kontson@fda.hhs.gov; Jennings, Robert J. [Department of Bioengineering, University of Maryland, College Park, Maryland 20742 and Division of Imaging and Applied Mathematics, Office of Science and Engineering Laboratories, Center for Devices and Radiological Health, U.S. Food and Drug Administration, 10903 New Hampshire Avenue, Silver Spring, Maryland 20993 (United States)

    2015-09-15

    Purpose: For a given bowtie filter design, both the selection of material and the physical design control the energy fluence, and consequently the dose distribution, in the object. Using three previously described bowtie filter designs, the goal of this work is to demonstrate the effect that different materials have on bowtie filter performance measures. Methods: Three bowtie filter designs that compensate for one or more of the beam-modifying effects due to differences in path length across a projection have been designed. The nature of the designs allows for their realization in a variety of materials. The designs were based on a phantom, 14 cm in diameter, composed of 40% fibroglandular and 60% adipose tissue. Bowtie design #1 is based on single-material spectral matching and produces a nearly uniform spectral shape for radiation incident upon the detector. Bowtie design #2 uses the idea of basis-material decomposition to produce the same spectral shape and intensity at the detector using two different materials. With bowtie design #3, it is possible to eliminate the beam hardening effect in the reconstructed image by adjusting the bowtie filter thickness so that the effective attenuation coefficient for every ray is the same. Seven different materials were chosen to represent a range of chemical compositions and densities. After calculation of construction parameters for each bowtie filter design, a bowtie filter was created from each of these materials (where reasonable construction parameters were obtained), resulting in a total of 26 bowtie filters modeled analytically and in the PENELOPE Monte Carlo simulation environment. Using the analytical model of each bowtie filter, design profiles were obtained and energy fluence as a function of fan angle was calculated. Projection images with and without each bowtie filter design were also generated using PENELOPE and reconstructed using FBP. Parameters such as dose distribution and noise uniformity were then evaluated for each design.
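
The idea behind bowtie design #3 reduces to simple geometry for a cylindrical phantom: choose a filter thickness per ray so that the combined attenuation of phantom chord and filter is constant across the fan. The coefficients below are illustrative placeholders, not the paper's materials.

```python
import numpy as np

# Bowtie design #3 in miniature: pick a filter thickness t per ray so
# that mu_obj*chord + mu_filt*t is the same for every fan position,
# removing the path-length dependence that drives beam hardening.
mu_obj = 0.25    # 1/cm, assumed effective attenuation of the phantom
mu_filt = 0.50   # 1/cm, assumed bowtie filter material
radius = 7.0     # cm, for the 14 cm diameter phantom

fan = np.linspace(-0.9, 0.9, 181) * radius        # ray offset from axis
chord = 2.0 * np.sqrt(radius ** 2 - fan ** 2)     # path length in phantom

# Thickness profile equalizing total attenuation to the central ray's:
t = mu_obj * (chord.max() - chord) / mu_filt
total = mu_obj * chord + mu_filt * t
```

The resulting thickness is zero on the central ray and grows toward the edges, reproducing the characteristic bowtie cross-section; the real design repeats this ray-by-ray with energy-dependent attenuation coefficients.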

  18. Bowtie filters for dedicated breast CT: Analysis of bowtie filter material selection

    International Nuclear Information System (INIS)

    Kontson, Kimberly; Jennings, Robert J.

    2015-01-01

    Purpose: For a given bowtie filter design, both the selection of material and the physical design control the energy fluence, and consequently the dose distribution, in the object. Using three previously described bowtie filter designs, the goal of this work is to demonstrate the effect that different materials have on bowtie filter performance measures. Methods: Three bowtie filter designs that compensate for one or more of the beam-modifying effects due to differences in path length across a projection have been designed. The nature of the designs allows for their realization in a variety of materials. The designs were based on a phantom, 14 cm in diameter, composed of 40% fibroglandular and 60% adipose tissue. Bowtie design #1 is based on single-material spectral matching and produces a nearly uniform spectral shape for radiation incident upon the detector. Bowtie design #2 uses the idea of basis-material decomposition to produce the same spectral shape and intensity at the detector using two different materials. With bowtie design #3, it is possible to eliminate the beam hardening effect in the reconstructed image by adjusting the bowtie filter thickness so that the effective attenuation coefficient for every ray is the same. Seven different materials were chosen to represent a range of chemical compositions and densities. After calculation of construction parameters for each bowtie filter design, a bowtie filter was created from each of these materials (where reasonable construction parameters were obtained), resulting in a total of 26 bowtie filters modeled analytically and in the PENELOPE Monte Carlo simulation environment. Using the analytical model of each bowtie filter, design profiles were obtained and energy fluence as a function of fan angle was calculated. Projection images with and without each bowtie filter design were also generated using PENELOPE and reconstructed using FBP. Parameters such as dose distribution and noise uniformity were then evaluated for each design.

  19. Deposition of Aerosol Particles in Electrically Charged Membrane Filters

    Energy Technology Data Exchange (ETDEWEB)

    Stroem, L

    1972-05-15

    A theory for the influence of electric charge on particle deposition on the surface of charged filters has been developed. It has been tested experimentally on ordinary membrane filters and Nuclepore filters of 8 µm pore size, with a bipolar monodisperse test aerosol of 1 µm particle diameter, and at filter charges up to 20 µC/m². Agreement with theory was obtained for the Coulomb force between filter and particle for both kinds of filters. The image force between a charged filter and neutral particles did not produce the predicted deposition in the ordinary membrane filter, probably due to a lack of correspondence between the filter model employed in the theory and the real filter. For the Nuclepore filter, satisfactory agreement with theory was obtained, including for image interaction.

  20. A 3D Image Filter for Parameter-Free Segmentation of Macromolecular Structures from Electron Tomograms

    Science.gov (United States)

    Ali, Rubbiya A.; Landsberg, Michael J.; Knauth, Emily; Morgan, Garry P.; Marsh, Brad J.; Hankamer, Ben

    2012-01-01

    3D image reconstruction of large cellular volumes by electron tomography (ET) at high (≤5 nm) resolution can now routinely resolve organellar and compartmental membrane structures, protein coats, cytoskeletal filaments, and macromolecules. However, current image analysis methods for identifying in situ macromolecular structures within the crowded 3D ultrastructural landscape of a cell remain labor-intensive, time-consuming, and prone to user-bias and/or error. This paper demonstrates the development and application of a parameter-free, 3D implementation of the bilateral edge-detection (BLE) algorithm for the rapid and accurate segmentation of cellular tomograms. The performance of the 3D BLE filter has been tested on a range of synthetic and real biological data sets and validated against current leading filters—the pseudo 3D recursive and Canny filters. The performance of the 3D BLE filter was found to be comparable to or better than that of both the 3D recursive and Canny filters while offering the significant advantage that it requires no parameter input or optimisation. Edge widths as little as 2 pixels are reproducibly detected with signal intensity and grey scale values as low as 0.72% above the mean of the background noise. The 3D BLE thus provides an efficient method for the automated segmentation of complex cellular structures across multiple scales for further downstream processing, such as cellular annotation and sub-tomogram averaging, and provides a valuable tool for the accurate and high-throughput identification and annotation of 3D structural complexity at the subcellular level, as well as for mapping the spatial and temporal rearrangement of macromolecular assemblies in situ within cellular tomograms. PMID:22479430

  1. Stack filter classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Reid B [Los Alamos National Laboratory; Hush, Don [Los Alamos National Laboratory

    2009-01-01

    Just as linear models generalize the sample mean and weighted average, weighted order statistic models generalize the sample median and weighted median. This analogy can be continued informally to generalized additive models in the case of the mean, and Stack Filters in the case of the median. Both of these model classes have been extensively studied for signal and image processing, but it is surprising to find that for pattern classification their treatment has been significantly one-sided. Generalized additive models are now a major tool in pattern classification, and many different learning algorithms have been developed to fit model parameters to finite data. However, Stack Filters remain largely confined to signal and image processing, and learning algorithms for classification are yet to be seen. This paper is a step towards Stack Filter Classifiers, and it shows that the approach is interesting from both a theoretical and a practical perspective.
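
The weighted median that Stack Filters generalize can be computed directly; a minimal NumPy version using the lower-weighted-median convention is:

```python
import numpy as np

def weighted_median(values, weights):
    """Weighted median: the value at which the cumulative weight first
    reaches half the total weight (lower weighted median convention).
    With unit weights this reduces to the ordinary median."""
    order = np.argsort(values)
    v = np.asarray(values, dtype=float)[order]
    w = np.asarray(weights, dtype=float)[order]
    cum = np.cumsum(w)
    return v[np.searchsorted(cum, 0.5 * cum[-1])]
```

Like the plain median, it ignores extreme outliers entirely, and giving a sample more weight pulls the output toward it, which is exactly the robustness that makes weighted order statistic models attractive where weighted averages fail.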

  2. Low-loss interference filter arrays made by plasma-assisted reactive magnetron sputtering (PARMS) for high-performance multispectral imaging

    Science.gov (United States)

    Broßmann, Jan; Best, Thorsten; Bauer, Thomas; Jakobs, Stefan; Eisenhammer, Thomas

    2016-10-01

    Optical remote sensing of the Earth from air and space typically utilizes several channels in the visible and near-infrared spectrum. Thin-film optical interference filters, mostly of narrow bandpass type, are applied to select these channels. The filters are arranged in filter wheels, arrays of discrete stripe filters mounted in frames, or patterned arrays on a monolithic substrate. Such multi-channel filter assemblies can be mounted close to the detector, which allows a compact and lightweight camera design. Recent progress in image resolution and sensor sensitivity requires improvements of the optical filter performance. Higher demands placed on blocking in the UV and NIR and in between the spectral channels, in-band transmission and filter edge steepness as well as scattering lead to more complex filter coatings with thicknesses in the range of 10-25 µm. Technological limits of the conventionally used ion-assisted evaporation process (IAD) can be overcome only by more precise and higher-energetic coating technologies like plasma-assisted reactive magnetron sputtering (PARMS) in combination with optical broadband monitoring. Optics Balzers has developed a photolithographic patterning process for coating thicknesses up to 15 µm that is fully compatible with the advanced PARMS coating technology. This provides the possibility of depositing multiple complex high-performance filters on a monolithic substrate. We present an overview of the performance of recently developed filters with improved spectral performance designed for both monolithic filter arrays and stripe filters mounted in frames. The pros and cons as well as the resulting limits of the filter designs for both configurations are discussed.
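
The optical behaviour of such multilayer interference coatings is conventionally computed with the characteristic-matrix (transfer matrix) method. The sketch below uses illustrative indices loosely resembling a TiO2/SiO2 pair, not Optics Balzers designs, and shows how a stack of quarter-wave pairs builds up near-total reflectance at the design wavelength.

```python
import numpy as np

def stack_reflectance(n_layers, d_layers, n_in, n_out, wavelength):
    """Normal-incidence reflectance of a thin-film stack via the
    characteristic-matrix method: multiply one 2x2 matrix per layer,
    then convert the stack's optical admittance to reflectance."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        delta = 2.0 * np.pi * n * d / wavelength  # phase thickness
        layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
        M = M @ layer
    B, C = M @ np.array([1.0, n_out])
    r = (n_in * B - C) / (n_in * B + C)
    return float(abs(r) ** 2)

# Ten quarter-wave high/low-index pairs at a 550 nm design wavelength
# on glass (n = 1.52); thicknesses are lambda/(4n) per layer.
n_h, n_l, lam0 = 2.35, 1.46, 550.0
n_stack = [n_h, n_l] * 10
d_stack = [lam0 / (4.0 * n) for n in n_stack]
r_design = stack_reflectance(n_stack, d_stack, 1.0, 1.52, lam0)
```

A narrow bandpass filter is then essentially two such quarter-wave mirrors separated by a spacer layer, which is why the full coatings reach the 10-25 µm thicknesses mentioned above.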

  3. GENERALIZATION OF RAYLEIGH MAXIMUM LIKELIHOOD DESPECKLING FILTER USING QUADRILATERAL KERNELS

    Directory of Open Access Journals (Sweden)

    S. Sridevi

    2013-02-01

    Full Text Available Speckle noise is the most prevalent noise in clinical ultrasound images. It appears as light and dark spots that obscure the pixel intensities. In fetal ultrasound images, edges and local fine details are essential for obstetricians and gynecologists carrying out prenatal diagnosis of congenital heart disease. A robust despeckling filter must therefore suppress speckle noise efficiently while simultaneously preserving these features. The proposed filter generalizes the Rayleigh maximum likelihood filter by exploiting statistical measures as tuning parameters and by using differently shaped quadrilateral kernels to estimate the noise-free pixel from its neighborhood. The performance of several filters, namely the median, Kuwahara, Frost, homogeneous mask and Rayleigh maximum likelihood filters, is compared with that of the proposed filter in terms of PSNR and image profile. The proposed filter surpasses the conventional filters.
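    The Rayleigh maximum likelihood idea above can be sketched in a few lines: under fully developed speckle, amplitudes in a homogeneous neighborhood follow a Rayleigh distribution, whose scale has the closed-form ML estimate sigma^2 = sum(x^2)/(2N). A minimal numpy sketch using a plain square window; the paper's quadrilateral kernel shapes and statistical tuning parameters are not reproduced here:

```python
import numpy as np

def rayleigh_ml_filter(img, size=5):
    """Sketch of a Rayleigh maximum-likelihood despeckling filter.

    For each pixel, the Rayleigh scale is estimated from the local window
    via its ML estimator sigma^2 = sum(x^2) / (2N), and the pixel is
    replaced by the Rayleigh mean sigma * sqrt(pi / 2).
    """
    img = np.asarray(img, dtype=float)
    pad = size // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            win = padded[i:i + size, j:j + size]
            sigma = np.sqrt(np.mean(win ** 2) / 2.0)
            out[i, j] = sigma * np.sqrt(np.pi / 2.0)
    return out
```

    Replacing each pixel by the Rayleigh mean of its window suppresses speckle variance while roughly preserving local brightness in homogeneous regions.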

  4. Time-resolved magnetic imaging in an aberration-corrected, energy-filtered photoemission electron microscope

    International Nuclear Information System (INIS)

    Nickel, F.; Gottlob, D.M.; Krug, I.P.; Doganay, H.; Cramm, S.; Kaiser, A.M.; Lin, G.; Makarov, D.; Schmidt, O.G.

    2013-01-01

    We report on the implementation and usage of a synchrotron-based time-resolving operation mode in an aberration-corrected, energy-filtered photoemission electron microscope. The setup consists of a new type of sample holder, which enables fast magnetization reversal of the sample by sub-ns pulses of up to 10 mT. Within the sample holder current pulses are generated by a fast avalanche photo diode and transformed into magnetic fields by means of a microstrip line. For more efficient use of the synchrotron time structure, we developed an electrostatic deflection gating mechanism capable of beam blanking within a few nanoseconds. This allows us to operate the setup in the hybrid bunch mode of the storage ring facility, selecting one or several bright singular light pulses which are temporally well-separated from the normal high-intensity multibunch pulse pattern. - Highlights: • A new time-resolving operation mode in photoemission electron microscopy is shown. • Our setup works within an energy-filtered, aberration-corrected PEEM. • A new gating system for bunch selection using synchrotron radiation is developed. • An alternative magnetic excitation system is developed. • First tr-imaging using an energy-filtered, aberration-corrected PEEM is shown

  5. Blind Source Parameters for Performance Evaluation of Despeckling Filters

    Directory of Open Access Journals (Sweden)

    Nagashettappa Biradar

    2016-01-01

    Full Text Available The speckle noise is inherent to transthoracic echocardiographic images. A standard noise-free reference echocardiographic image does not exist. The evaluation of filters based on the traditional parameters such as peak signal-to-noise ratio, mean square error, and structural similarity index may not reflect the true filter performance on echocardiographic images. Therefore, the performance of despeckling can be evaluated using blind assessment metrics like the speckle suppression index, speckle suppression and mean preservation index (SMPI), and beta metric. The need for a noise-free reference image is overcome using these three parameters. This paper presents a comprehensive analysis and evaluation of eleven types of despeckling filters for echocardiographic images in terms of blind and traditional performance parameters along with clinical validation. The noise is effectively suppressed using the logarithmic neighborhood shrinkage (NeighShrink) embedded with Stein's unbiased risk estimation (SURE). The SMPI is three times more effective compared to the wavelet based generalized likelihood estimation approach. The quantitative evaluation and clinical validation reveal that the filters such as the nonlocal mean, posterior sampling based Bayesian estimation, hybrid median, and probabilistic patch based filters are acceptable whereas median, anisotropic diffusion, fuzzy, and Ripplet nonlinear approximation filters have limited applications for echocardiographic images.
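    The blind metrics named above need no reference image. A minimal sketch of the speckle suppression index (SSI), which compares the coefficient of variation after filtering with the one before; the SMPI and beta metrics are omitted here:

```python
import numpy as np

def speckle_suppression_index(noisy, filtered):
    """Blind (reference-free) speckle suppression index (SSI).

    Values below 1 indicate that the filter reduced the relative
    speckle fluctuation; an identity filter scores exactly 1.
    """
    noisy = np.asarray(noisy, dtype=float)
    filtered = np.asarray(filtered, dtype=float)
    cv_noisy = noisy.std() / noisy.mean()
    cv_filtered = filtered.std() / filtered.mean()
    return cv_filtered / cv_noisy
```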

  6. Blind Source Parameters for Performance Evaluation of Despeckling Filters.

    Science.gov (United States)

    Biradar, Nagashettappa; Dewal, M L; Rohit, ManojKumar; Gowre, Sanjaykumar; Gundge, Yogesh

    2016-01-01

    The speckle noise is inherent to transthoracic echocardiographic images. A standard noise-free reference echocardiographic image does not exist. The evaluation of filters based on the traditional parameters such as peak signal-to-noise ratio, mean square error, and structural similarity index may not reflect the true filter performance on echocardiographic images. Therefore, the performance of despeckling can be evaluated using blind assessment metrics like the speckle suppression index, speckle suppression and mean preservation index (SMPI), and beta metric. The need for noise-free reference image is overcome using these three parameters. This paper presents a comprehensive analysis and evaluation of eleven types of despeckling filters for echocardiographic images in terms of blind and traditional performance parameters along with clinical validation. The noise is effectively suppressed using the logarithmic neighborhood shrinkage (NeighShrink) embedded with Stein's unbiased risk estimation (SURE). The SMPI is three times more effective compared to the wavelet based generalized likelihood estimation approach. The quantitative evaluation and clinical validation reveal that the filters such as the nonlocal mean, posterior sampling based Bayesian estimation, hybrid median, and probabilistic patch based filters are acceptable whereas median, anisotropic diffusion, fuzzy, and Ripplet nonlinear approximation filters have limited applications for echocardiographic images.

  7. Reversible wavelet filter banks with side informationless spatially adaptive low-pass filters

    Science.gov (United States)

    Abhayaratne, Charith

    2011-07-01

    Wavelet transforms that have an adaptive low-pass filter are useful in applications that require the signal singularities, sharp transitions, and image edges to be left intact in the low-pass signal. In scalable image coding, the spatial resolution scalability is achieved by reconstructing the low-pass signal subband, which corresponds to the desired resolution level, and discarding other high-frequency wavelet subbands. In such applications, it is vital to have low-pass subbands that are not affected by smoothing artifacts associated with low-pass filtering. We present the mathematical framework for achieving 1-D wavelet transforms that have a spatially adaptive low-pass filter (SALP) using the prediction-first lifting scheme. The adaptivity decisions are computed using the wavelet coefficients, and no bookkeeping is required for the perfect reconstruction. Then, 2-D wavelet transforms that have a spatially adaptive low-pass filter are designed by extending the 1-D SALP framework. Because the 2-D polyphase decompositions are used in this case, the 2-D adaptivity decisions are made nonseparable as opposed to the separable 2-D realization using 1-D transforms. We present examples using the 2-D 5/3 wavelet transform and their lossless image coding and scalable decoding performances in terms of quality and resolution scalability. The proposed 2-D-SALP scheme results in better performance compared to the existing adaptive update lifting schemes.
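    The 5/3 transform mentioned above is conveniently expressed with lifting steps. Below is a sketch of the fixed (non-adaptive) reversible integer 5/3 transform in prediction-first form, the base case on which a SALP-style scheme builds; the symmetric border extension used here is an illustrative assumption, not necessarily the paper's choice:

```python
import numpy as np

def lift_53_forward(x):
    """Reversible integer 5/3 wavelet via prediction-first lifting.

    Even-length input; returns (approximation, detail) subbands.
    """
    x = np.asarray(x, dtype=np.int64)
    even, odd = x[0::2], x[1::2]
    # predict: detail = odd sample minus rounded average of its even neighbours
    even_r = np.append(even[1:], even[-1])          # symmetric extension
    d = odd - ((even + even_r) >> 1)
    # update: approximation = even sample plus rounded quarter-sum of details
    d_l = np.insert(d[:-1], 0, d[0])
    s = even + ((d_l + d + 2) >> 2)
    return s, d

def lift_53_inverse(s, d):
    """Exact inverse: undo the lifting steps in reverse order."""
    d_l = np.insert(d[:-1], 0, d[0])
    even = s - ((d_l + d + 2) >> 2)
    even_r = np.append(even[1:], even[-1])
    odd = d + ((even + even_r) >> 1)
    x = np.empty(even.size + odd.size, dtype=np.int64)
    x[0::2], x[1::2] = even, odd
    return x
```

    Because both steps are integer-valued and mirrored exactly in the inverse, reconstruction is lossless, which is the property that lossless and resolution-scalable coding rely on.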

  8. Wiener filter applied to a neutrongraphic system

    International Nuclear Information System (INIS)

    Crispim, V.R.; Lopes, R.T.; Borges, J.C.

    1986-01-01

    The random characteristics of the image formation process influence the spatial image obtained in a neutrongraphy. Several methods can be used to optimize this image, through estimation of the noise added to the original signal. This work deals with the optimal filtering technique, using the Wiener filter. A simulation is made in which the signal (the spatial resolution function) has a Lorentzian form, and ten kinds of random noise with increasing R.M.S. values are generated and individually added to the original signal. The Wiener filter is applied for the different noise amplitudes, and the behaviour of the spatial resolution function of our system is analysed. (Author) [pt
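    The setup described above (a Lorentzian resolution function plus additive noise) can be sketched with the classical frequency-domain Wiener gain W = Pss / (Pss + Pnn). The Lorentzian width and noise level below are illustrative values, not those of the paper:

```python
import numpy as np

def wiener_denoise(y, signal_psd, noise_psd):
    """Frequency-domain Wiener filter with gain W = Pss / (Pss + Pnn)."""
    W = signal_psd / (signal_psd + noise_psd)
    return np.real(np.fft.ifft(W * np.fft.fft(y)))

# Lorentzian spatial resolution function plus additive white noise
n = 256
t = np.arange(n) - n // 2
signal = 1.0 / (1.0 + (t / 5.0) ** 2)          # Lorentzian profile
rng = np.random.default_rng(1)
y = signal + rng.normal(0.0, 0.1, n)

signal_psd = np.abs(np.fft.fft(signal)) ** 2   # assumed known for this sketch
noise_psd = np.full(n, n * 0.1 ** 2)           # white noise: E|N(k)|^2 = n * sigma^2
restored = wiener_denoise(y, signal_psd, noise_psd)
```

    With the signal spectrum concentrated in a few bins, the Wiener gain suppresses the noise-dominated frequencies and the restored profile is substantially closer to the Lorentzian than the noisy input.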

  9. Graphic filter library implemented in CUDA language

    OpenAIRE

    Peroutková, Hedvika

    2009-01-01

    This thesis deals with the problem of reducing the computation time of raster image processing by parallel computing on a graphics processing unit. Raster image processing here refers to the application of graphic filters, which can be applied in sequence with different settings. The thesis evaluates the suitability of parallelization on a graphics card for raster image adjustments based on multicriterial choice. Filters are implemented for the graphics processing unit in the CUDA language. Opacity ...

  10. A Low-Noise CMOS THz Imager Based on Source Modulation and an In-Pixel High-Q Passive Switched-Capacitor N-Path Filter

    Science.gov (United States)

    Boukhayma, Assim; Dupret, Antoine; Rostaing, Jean-Pierre; Enz, Christian

    2016-01-01

    This paper presents the first low noise complementary metal oxide semiconductor (CMOS) terahertz (THz) imager based on source modulation and in-pixel high-Q filtering. The 31×31 focal plane array has been fully integrated in a 0.13 μm standard CMOS process. The sensitivity has been improved significantly by modulating the active THz source that illuminates the scene and performing on-chip high-Q filtering. Each pixel encompasses a broadband bow tie antenna coupled to an N-type metal-oxide-semiconductor (NMOS) detector that shifts the THz radiation, a low noise adjustable gain amplifier and a high-Q filter centered at the modulation frequency. The filter is based on a passive switched-capacitor (SC) N-path filter combined with a continuous-time broad-band Gm-C filter. A simplified analysis that helps in designing and tuning the passive SC N-path filter is provided. The characterization of the readout chain shows that a Q factor of 100 has been achieved for the filter with a good matching between the analytical calculation and the measurement results. An input-referred noise of 0.2 μV RMS has been measured. Characterization of the chip with different THz wavelengths confirms the broadband feature of the antenna and shows that this THz imager reaches a total noise equivalent power of 0.6 nW at 270 GHz and 0.8 nW at 600 GHz. PMID:26950131

  11. A Low-Noise CMOS THz Imager Based on Source Modulation and an In-Pixel High-Q Passive Switched-Capacitor N-Path Filter.

    Science.gov (United States)

    Boukhayma, Assim; Dupret, Antoine; Rostaing, Jean-Pierre; Enz, Christian

    2016-03-03

    This paper presents the first low noise complementary metal oxide semiconductor (CMOS) terahertz (THz) imager based on source modulation and in-pixel high-Q filtering. The 31×31 focal plane array has been fully integrated in a 0.13 μm standard CMOS process. The sensitivity has been improved significantly by modulating the active THz source that illuminates the scene and performing on-chip high-Q filtering. Each pixel encompasses a broadband bow tie antenna coupled to an N-type metal-oxide-semiconductor (NMOS) detector that shifts the THz radiation, a low noise adjustable gain amplifier and a high-Q filter centered at the modulation frequency. The filter is based on a passive switched-capacitor (SC) N-path filter combined with a continuous-time broad-band Gm-C filter. A simplified analysis that helps in designing and tuning the passive SC N-path filter is provided. The characterization of the readout chain shows that a Q factor of 100 has been achieved for the filter with a good matching between the analytical calculation and the measurement results. An input-referred noise of 0.2 μV RMS has been measured. Characterization of the chip with different THz wavelengths confirms the broadband feature of the antenna and shows that this THz imager reaches a total noise equivalent power of 0.6 nW at 270 GHz and 0.8 nW at 600 GHz.
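    The passive SC N-path filter at the heart of both records can be illustrated with an idealized time-domain model: N capacitors are connected one at a time through a source resistance R at clock frequency f0, which produces a high-Q band-pass centred on f0. The component values and clock rate below are arbitrary illustration choices, not the chip's:

```python
import numpy as np

def n_path_response(f_in, f0=1000.0, n_paths=4, R=1e3, C=4e-6, T=0.2):
    """Steady-state output amplitude of an idealized passive
    switched-capacitor N-path filter: n_paths capacitors are commutated
    at clock frequency f0 through a source resistance R."""
    steps_per_phase = 50
    dt = 1.0 / (f0 * n_paths * steps_per_phase)
    steps = int(T / dt)
    v = np.zeros(n_paths)                      # one capacitor per path
    out = np.empty(steps)
    for i in range(steps):
        k = (i // steps_per_phase) % n_paths   # currently connected path
        vin = np.sin(2.0 * np.pi * f_in * i * dt)
        v[k] += dt / (R * C) * (vin - v[k])    # RC charging of the active cap
        out[i] = v[k]
    tail = out[3 * steps // 4:]                # settled portion only
    return 0.5 * (tail.max() - tail.min())
```

    Evaluated at the clock frequency, the commutated network passes the tone almost unattenuated, while a tone 50% above the clock is strongly suppressed, the band-pass behaviour that gives the pixel its high Q.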

  12. Photoacoustic imaging in scattering media by combining a correlation matrix filter with a time reversal operator.

    Science.gov (United States)

    Rui, Wei; Tao, Chao; Liu, Xiaojun

    2017-09-18

    Acoustic scattering media pose a fundamental challenge for photoacoustic imaging. In this study, we reveal the different coherence properties of the scattered and direct photoacoustic waves in matrix form. Direct waves show a particular coherence along the antidiagonals of the matrix, whereas scattered waves do not. Based on this property, a correlation matrix filter combined with a time reversal operator is proposed to preserve the direct waves and recover the image behind a scattering layer. Both numerical simulations and photoacoustic imaging experiments demonstrate that the proposed approach effectively increases the image contrast and decreases the background speckle in a scattering medium. This study might improve the quality of photoacoustic imaging in acoustically scattering environments and extend its applications.
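    A toy version of the antidiagonal-coherence idea: if direct arrivals are coherent (here, simply constant) along the antidiagonals of the response matrix while multiply scattered contributions are not, keeping only each antidiagonal's mean suppresses the scattered part. This is a drastic simplification of the paper's correlation matrix filter, meant only to show the mechanism:

```python
import numpy as np

def antidiagonal_coherence_filter(A):
    """Keep only the component of A that is constant along each
    antidiagonal (index sum i + j), where direct arrivals line up;
    the incoherent remainder is averaged away."""
    A = np.asarray(A, dtype=float)
    n, m = A.shape
    out = np.zeros_like(A)
    idx = np.add.outer(np.arange(n), np.arange(m))   # i + j labels antidiagonals
    for d in range(n + m - 1):
        mask = idx == d
        out[mask] = A[mask].mean()
    return out
```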

  13. Nonlinear spatio-temporal filtering of dynamic PET data using a four-dimensional Gaussian filter and expectation-maximization deconvolution

    International Nuclear Information System (INIS)

    Floberg, J M; Holden, J E

    2013-01-01

    We introduce a method for denoising dynamic PET data, spatio-temporal expectation-maximization (STEM) filtering, that combines four-dimensional Gaussian filtering with EM deconvolution. The initial Gaussian filter suppresses noise at a broad range of spatial and temporal frequencies and EM deconvolution quickly restores the frequencies most important to the signal. We aim to demonstrate that STEM filtering can improve variance in both individual time frames and in parametric images without introducing significant bias. We evaluate STEM filtering with a dynamic phantom study, and with simulated and human dynamic PET studies of a tracer with reversible binding behaviour, [C-11]raclopride, and a tracer with irreversible binding behaviour, [F-18]FDOPA. STEM filtering is compared to a number of established three and four-dimensional denoising methods. STEM filtering provides substantial improvements in variance in both individual time frames and in parametric images generated with a number of kinetic analysis techniques while introducing little bias. STEM filtering does bias early frames, but this does not affect quantitative parameter estimates. STEM filtering is shown to be superior to the other simple denoising methods studied. STEM filtering is a simple and effective denoising method that could be valuable for a wide range of dynamic PET applications. (paper)
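    The two STEM stages can be sketched on a single 2-D frame (the paper works in 4-D): a Gaussian smoothing pass followed by a few Richardson-Lucy iterations, the classical EM deconvolution for Poisson-like data, using the same Gaussian as the PSF. Parameter values are illustrative only:

```python
import numpy as np

def gaussian_kernel(shape, sigma):
    """Unit-sum Gaussian PSF on a periodic grid, centred at index 0."""
    grids = np.meshgrid(*[np.fft.fftfreq(n) * n for n in shape], indexing="ij")
    k = np.exp(-sum(g ** 2 for g in grids) / (2.0 * sigma ** 2))
    return k / k.sum()

def stem_filter(data, sigma=2.0, iterations=10):
    """Gaussian smoothing followed by Richardson-Lucy (EM) deconvolution
    with the same Gaussian as PSF (symmetric, so correlation equals
    convolution), applied here to a single 2-D frame."""
    psf_ft = np.fft.fftn(gaussian_kernel(data.shape, sigma))
    conv = lambda x: np.real(np.fft.ifftn(np.fft.fftn(x) * psf_ft))
    smoothed = conv(data)                        # stage 1: broadband noise suppression
    u = np.maximum(smoothed, 1e-12)
    for _ in range(iterations):                  # stage 2: EM restores signal bands
        u = u * conv(smoothed / np.maximum(conv(u), 1e-12))
    return u
```

    The deconvolution step quickly re-sharpens features that the initial Gaussian flattened, which is the "restore the frequencies most important to the signal" behaviour described above.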

  14. Restoration of Static JPEG Images and RGB Video Frames by Means of Nonlinear Filtering in Conditions of Gaussian and Non-Gaussian Noise

    Science.gov (United States)

    Sokolov, R. I.; Abdullin, R. R.

    2017-11-01

    Nonlinear Markov process filtering makes it possible to restore both video stream frames and static photographs at the preprocessing stage. This paper presents the results of a comparison of the filtering quality achieved on these two image types by a special algorithm under Gaussian and non-Gaussian noise. Examples of filter operation at different signal-to-noise ratios are presented. A comparative analysis has been performed, and the kind of noise that is filtered best has been identified. The developed algorithm delivers much better quality than an adaptive algorithm for RGB signal filtering given the same a priori information about the signal, and it also outperforms the median filter when both fluctuation and pulse noise are filtered.

  15. Determination of optimum filter in myocardial SPECT: A phantom study

    International Nuclear Information System (INIS)

    Takavar, A.; Shamsipour, Gh.; Sohrabi, M.; Eftekhari, M.

    2004-01-01

    Background: In myocardial perfusion SPECT, images are degraded by photon attenuation, the distance-dependent collimator and detector response, and photon scatter. Filters greatly affect the quality of nuclear medicine images. Materials and Methods: A phantom simulating the left ventricle of the heart was built. About 1 mCi of 99m Tc was injected into the phantom, and images were acquired from it. Several filters, including the Parzen, Hamming, Hanning, Butterworth and Gaussian filters, were applied to the phantom images. By defining criteria such as contrast, signal-to-noise ratio and defect size detectability, the best filter can be determined. Results: Cut-off frequencies of 0.325 and 0.5 Nyquist were obtained as optimal for the Hamming and Hanning filters, respectively. Order 11 with cut-off 0.45 Nq and order 20 with cut-off 0.5 Nq were obtained as optimal for the Butterworth and Gaussian filters, respectively. Conclusion: The optimum member of each filter family was obtained.
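    The Butterworth filter quoted above, with its cut-off expressed as a fraction of the Nyquist frequency, can be sketched as follows; order 11 and cut-off 0.45 are the values the study reports as optimal, while the test image is synthetic:

```python
import numpy as np

def butterworth_lowpass(image, cutoff=0.45, order=11):
    """2D Butterworth low-pass filter; `cutoff` is a fraction of the
    Nyquist frequency (0.5 cycles/pixel), as in SPECT practice."""
    image = np.asarray(image, dtype=float)
    fy = np.fft.fftfreq(image.shape[0])            # cycles/pixel
    fx = np.fft.fftfreq(image.shape[1])
    f = np.sqrt(np.add.outer(fy ** 2, fx ** 2))    # radial frequency
    H = 1.0 / (1.0 + (f / (0.5 * cutoff)) ** (2 * order))
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))
```

    Since H equals 1 at zero frequency, the mean counts are preserved exactly while high-frequency noise is rolled off with the steepness set by the order.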

  16. A diagnostic imaging approach for online characterization of multi-impact in aircraft composite structures based on a scanning spatial-wavenumber filter of guided wave

    Science.gov (United States)

    Ren, Yuanqiang; Qiu, Lei; Yuan, Shenfang; Su, Zhongqing

    2017-06-01

    Monitoring of impacts, and of multi-impact events in particular, in aircraft composite structures has been an intensive research topic in the field of guided-wave-based structural health monitoring (SHM). Compared with the majority of existing methods, such as those using signal features in the time, frequency or joint time-frequency domain, the approach based on a spatial-wavenumber filter of guided waves shows a superb advantage in effectively distinguishing particular wave modes and identifying their propagation direction relative to the sensor array. However, two major issues arise when conducting online characterization of a multi-impact event. Firstly, the spatial-wavenumber filter must be realized in a situation where the high-spatial-resolution wavenumber of the complicated multi-impact signal can be neither measured nor modeled. Secondly, it is difficult to identify the individual impacts and localize them because their wavenumbers overlap. To address these issues, a scanning spatial-wavenumber filter based diagnostic imaging method for online characterization of multi-impact events is proposed in this paper. The principle of the scanning filter for multi-impact is developed first to conduct spatial-wavenumber filtering and to achieve wavenumber-time imaging of the multiple impacts. Then, a feature identification method based on eigenvalue decomposition and wavenumber searching is presented to estimate the number of impacts and calculate the wavenumbers of the multi-impact signal, and an image mapping method is proposed to convert the wavenumber-time image to an angle-distance image that distinguishes and locates the multiple impacts. A series of multi-impact events applied to a carbon fiber laminate plate validates the proposed methods, showing that the multiple impacts are localized successfully.

  17. Evaluation of non-linear adaptive smoothing filter by digital phantom

    International Nuclear Information System (INIS)

    Sato, Kazuhiro; Ishiya, Hiroki; Oshita, Ryosuke; Yanagawa, Isao; Goto, Mitsunori; Mori, Issei

    2008-01-01

    As a result of the development of multi-slice CT, diagnosis based on three-dimensional reconstruction and multi-planar reconstruction images has become widespread. These applications require high z-resolution, so thin-slice imaging is essential. However, because z-resolution always trades off against image noise, thin-slice imaging is necessarily accompanied by an increased noise level. To improve the quality of thin-slice images, a non-linear adaptive smoothing filter has been developed and is being widely applied in clinical use. We developed a digital bar-pattern phantom to evaluate the effect of this filter, using images formed by adding the bar-pattern phantom to images of a water phantom. The effect of the filter varied in a complex manner with the contrast and spatial frequency of the original image. We confirmed a reduction of image noise in the low-frequency components, but decreased contrast or an increased amount of noise in the high-frequency components, reflecting the changing adaptation of the filter. The digital phantom was useful for this evaluation, but to understand the total effect of the filtering, considerable improvement of the shape of the digital phantom is required. (author)

  18. Adaptive mean filtering for noise reduction in CT polymer gel dosimetry

    International Nuclear Information System (INIS)

    Hilts, Michelle; Jirasek, Andrew

    2008-01-01

    X-ray computed tomography (CT) as a method of extracting 3D dose information from irradiated polymer gel dosimeters is showing potential as a practical means to implement gel dosimetry in a radiation therapy clinic. However, the response of CT contrast to dose is weak and noise reduction is critical in order to achieve adequate dose resolutions with this method. Phantom design and CT imaging technique have both been shown to decrease image noise. In addition, image postprocessing using noise reduction filtering techniques have been proposed. This work evaluates in detail the use of the adaptive mean filter for reducing noise in CT gel dosimetry. Filter performance is systematically tested using both synthetic patterns mimicking a range of clinical dose distribution features as well as actual clinical dose distributions. Both low and high signal-to-noise ratio (SNR) situations are examined. For all cases, the effects of filter kernel size and the number of iterations are investigated. Results indicate that adaptive mean filtering is a highly effective tool for noise reduction CT gel dosimetry. The optimum filtering strategy depends on characteristics of the dose distributions and image noise level. For low noise images (SNR ∼20), the filtered results are excellent and use of adaptive mean filtering is recommended as a standard processing tool. For high noise images (SNR ∼5) adaptive mean filtering can also produce excellent results, but filtering must be approached with more caution as spatial and dose distortions of the original dose distribution can occur
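    One common form of the adaptive mean filter weights the correction by how much the local variance exceeds the expected noise variance, so flat regions are averaged fully while high-variance (edge) regions are left largely untouched. A minimal sketch of that local-statistics form; the paper's exact kernel-size and iteration strategy is not reproduced:

```python
import numpy as np

def adaptive_mean_filter(img, noise_var, size=5):
    """Adaptive (local-statistics) mean filter: output moves toward the
    local mean in flat regions and toward the original pixel where the
    local variance clearly exceeds the expected noise variance."""
    img = np.asarray(img, dtype=float)
    pad = size // 2
    p = np.pad(img, pad, mode="reflect")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            win = p[i:i + size, j:j + size]
            m, v = win.mean(), win.var()
            gain = max(v - noise_var, 0.0) / v if v > 0 else 0.0
            out[i, j] = m + gain * (img[i, j] - m)
    return out
```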

  19. Cluster Based Vector Attribute Filtering

    NARCIS (Netherlands)

    Kiwanuka, Fred N.; Wilkinson, Michael H.F.

    2016-01-01

    Morphological attribute filters operate on images based on properties or attributes of connected components. Until recently, attribute filtering was based on a single global threshold on a scalar property to remove or retain objects. A single threshold struggles in case no single property or

  20. 3D Wavelet-Based Filter and Method

    Science.gov (United States)

    Moss, William C.; Haase, Sebastian; Sedat, John W.

    2008-08-12

    A 3D wavelet-based filter for visualizing and locating structural features of a user-specified linear size in 2D or 3D image data. The only input parameter is a characteristic linear size of the feature of interest, and the filter output contains only those regions that are correlated with the characteristic size, thus denoising the image.

  1. Generalised Filtering

    Directory of Open Access Journals (Sweden)

    Karl Friston

    2010-01-01

    Full Text Available We describe a Bayesian filtering scheme for nonlinear state-space models in continuous time. This scheme is called Generalised Filtering and furnishes posterior (conditional) densities on hidden states and unknown parameters generating observed data. Crucially, the scheme operates online, assimilating data to optimise the conditional density on time-varying states and time-invariant parameters. In contrast to Kalman and particle smoothing, Generalised Filtering does not require a backwards pass. In contrast to variational schemes, it does not assume conditional independence between the states and parameters. Generalised Filtering optimises the conditional density with respect to a free-energy bound on the model's log-evidence. This optimisation uses the generalised motion of hidden states and parameters, under the prior assumption that the motion of the parameters is small. We describe the scheme, present comparative evaluations with a fixed-form variational version, and conclude with an illustrative application to a nonlinear state-space model of brain imaging time-series.

  2. Image scale measurement with correlation filters in a volume holographic optical correlator

    Science.gov (United States)

    Zheng, Tianxiang; Cao, Liangcai; He, Qingsheng; Jin, Guofan

    2013-08-01

    A search engine containing various target images or different parts of a large scene is of great use for many applications, including object detection, biometric recognition, and image registration. The input image, captured in real time, is compared with all the template images in the search engine. A volume holographic correlator is one such search engine: it performs thousands of image comparisons at very high speed, with the correlation task accomplished mainly in optics. However, the input target image generally exhibits a scale variation relative to the filtering template images, in which case the correlation values no longer properly reflect the similarity of the images. It is therefore essential to estimate and eliminate the scale variation of the input target image. Scale measurement can be performed in three domains: spatial, spectral and time. Most methods dealing with the scale factor are based on the spatial or spectral domains. In this paper, a time-domain method, called the time-sequential scaled method, is proposed to measure the scale factor of the input image. The method exploits the relationship between the scale variation and the correlation value of two images: a few artificially scaled versions of the input image are sent for comparison with the template images. The correlation value increases with the scale factor over the interval 0.8~1 and decreases over 1~1.2, so the original scale of the input image can be measured by locating the largest correlation value obtained when correlating the artificially scaled input images with the templates. The measurement range for the scale is 0.8~4.8; scale factors beyond 1.2 are measured by first scaling the input image by a factor of 1/2, 1/3 or 1/4, correlating the result with the template images, and estimating the corresponding scale factor within 0.8~1.2.
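    A 1-D analogue of the time-sequential scaled method can be sketched directly: rescale the observed signal by a few trial factors, correlate each candidate against the template, and take the factor giving the largest normalized correlation. The signals and the trial-factor grid below are illustrative, not the correlator's actual data:

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-length signals."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))

def estimate_scale(template, observed, factors):
    """Rescale `observed` by each trial factor (undoing an assumed
    stretch), correlate against the template, and return the factor
    with the highest correlation."""
    n = template.size
    x = np.arange(n, dtype=float)
    scores = [ncc(template, np.interp(x * f, x, observed)) for f in factors]
    return factors[int(np.argmax(scores))]
```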

  3. MEDOF - MINIMUM EUCLIDEAN DISTANCE OPTIMAL FILTER

    Science.gov (United States)

    Barton, R. S.

    1994-01-01

    The Minimum Euclidean Distance Optimal Filter program, MEDOF, generates filters for use in optical correlators. The algorithm implemented in MEDOF follows theory put forth by Richard D. Juday of NASA/JSC. This program analytically optimizes filters on arbitrary spatial light modulators such as coupled, binary, full complex, and fractional 2π phase. MEDOF optimizes these modulators on a number of metrics including: correlation peak intensity at the origin for the centered appearance of the reference image in the input plane, signal to noise ratio including the correlation detector noise as well as the colored additive input noise, peak to correlation energy defined as the fraction of the signal energy passed by the filter that shows up in the correlation spot, and the peak to total energy which is a generalization of PCE that adds the passed colored input noise to the input image's passed energy. The user of MEDOF supplies the functions that describe the following quantities: 1) the reference signal, 2) the realizable complex encodings of both the input and filter SLM, 3) the noise model, possibly colored, as it adds at the reference image and at the correlation detection plane, and 4) the metric to analyze, here taken to be one of the analytical ones like SNR (signal to noise ratio) or PCE (peak to correlation energy) rather than peak to secondary ratio. MEDOF calculates filters for arbitrary modulators and a wide range of metrics as described above. MEDOF examines the statistics of the encoded input image's noise (if SNR or PCE is selected) and the filter SLM's (Spatial Light Modulator) available values. These statistics are used as the basis of a range for searching for the magnitude and phase of k, a pragmatically based complex constant for computing the filter transmittance from the electric field. The filter is produced for the mesh points in those ranges and the value of the metric that results from these points is computed. When the search is concluded, the

  4. Scripting-customised microscopy tools for Digital Micrograph™

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, D.R.G. [Materials and Engineering Science, ANSTO Materials PMB 1, Menai, NSW 2234 (Australia)]. E-mail: drm@ansto.gov.au; Schaffer, B. [Research Institute for Electron Microscopy, Graz University of Technology, Steyrergasse 17, A-8010 Graz (Austria)

    2005-07-15

    Software is an integral part of all electron microscopy systems, encompassing hardware control, data acquisition and processing. It is unlikely that any one software system will meet all the requirements of experienced users. However, if the software supports custom scripting, then users are well placed to address any shortcomings by writing their own software. In this paper, we highlight the scripting capability within Gatan Inc.'s Digital Micrograph™ (DM) software, a widely used program for TEM imaging and EELS spectroscopy. We show how scripting can greatly extend the capabilities of the DM software, in tasks ranging in complexity from simple image manipulation through to full-blown microscope/imaging filter control and data acquisition. Scripting enables customised software tools to be developed to meet individual experimental needs, something which no software manufacturer could ever hope to do on a commercial basis. In essence, scripting allows the microscopist to drive the software rather than the software drive the microscopist. To foster an increased awareness and interest in DM scripting we have developed a web-based archive for DM scripts, which is freely accessible via the internet.

  5. Hardware Implementation of a Bilateral Subtraction Filter

    Science.gov (United States)

    Huertas, Andres; Watson, Robert; Villalpando, Carlos; Goldberg, Steven

    2009-01-01

A bilateral subtraction filter has been implemented as a hardware module in the form of a field-programmable gate array (FPGA). In general, a bilateral subtraction filter is a key subsystem of a high-quality stereoscopic machine vision system that utilizes images that are large and/or dense. Bilateral subtraction filters have been implemented in software on general-purpose computers, but the processing speeds attainable in this way, even on computers containing the fastest processors, are insufficient for real-time applications. The present FPGA bilateral subtraction filter is intended to accelerate processing to real-time speed and to be a prototype of a link in a stereoscopic-machine-vision processing chain, now under development, that would process large and/or dense images in real time and would be implemented in an FPGA. In terms that are necessarily oversimplified for the sake of brevity, a bilateral subtraction filter is a smoothing, edge-preserving filter for suppressing low-frequency noise. The filter operation amounts to replacing the value for each pixel with a weighted average of the values of that pixel and the neighboring pixels in a predefined neighborhood or window (e.g., a 9 × 9 window). The filter weights depend partly on pixel values and partly on the window size. The present FPGA implementation of a bilateral subtraction filter utilizes a 9 × 9 window. This implementation was designed to take advantage of the ability to do many of the component computations in parallel pipelines, to enable processing of image data at the rate at which they are generated. The filter can be considered to be divided into the following parts (see figure): a) an image pixel pipeline with a 9 × 9-pixel window generator; b) an array of processing elements; c) an adder tree; d) a smoothing-and-delaying unit; and e) a subtraction unit. After each 9 × 9 window is created, the affected pixel data are fed to the processing elements. Each processing element is fed the pixel value for
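
The windowed operation described above can be made concrete in software. The following is a minimal NumPy sketch (not the FPGA implementation) of a bilateral subtraction filter with the 9 × 9 window the abstract describes; the Gaussian weight parameters `sigma_s` and `sigma_r` are illustrative assumptions, since the abstract does not give the exact weighting function:

```python
import numpy as np

def bilateral_subtraction(img, radius=4, sigma_s=2.0, sigma_r=0.1):
    """Bilateral smoothing over a (2*radius+1)^2 window followed by
    subtraction of the smoothed image: suppresses low-frequency noise
    while preserving edges. radius=4 gives the 9 x 9 window above."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    pad = np.pad(img, radius, mode='reflect')
    # spatial (domain) weights depend only on window geometry
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    w_s = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    smooth = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # range weights depend on pixel-value differences
            w_r = np.exp(-((win - img[i, j])**2) / (2 * sigma_r**2))
            wgt = w_s * w_r
            smooth[i, j] = (wgt * win).sum() / wgt.sum()
    return img - smooth   # the "subtraction unit": edge-preserving high-pass residual
```

The FPGA version parallelizes the inner window loop across the array of processing elements and the adder tree; the Python loops above spell out the same arithmetic serially.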

  6. From Matched Spatial Filtering towards the Fused Statistical Descriptive Regularization Method for Enhanced Radar Imaging

    Directory of Open Access Journals (Sweden)

    Shkvarko Yuriy

    2006-01-01

We address a new approach to solving the ill-posed nonlinear inverse problem of high-resolution numerical reconstruction of the spatial spectrum pattern (SSP) of the backscattered wavefield sources distributed over the remotely sensed scene. An array or synthesized-array radar (SAR) that employs digital data signal processing is considered. By exploiting the idea of combining the statistical minimum-risk estimation paradigm with numerical descriptive regularization techniques, we address a new fused statistical descriptive regularization (SDR) strategy for enhanced radar imaging. Pursuing such an approach, we establish a family of SDR-related SSP estimators that encompass a manifold of existing beamforming techniques, ranging from the traditional matched filter to robust and adaptive spatial filtering and minimum variance methods.

  7. A Low Cost Structurally Optimized Design for Diverse Filter Types

    Science.gov (United States)

    Kazmi, Majida; Aziz, Arshad; Akhtar, Pervez; Ikram, Nassar

    2016-01-01

A wide range of image processing applications deploy two-dimensional (2D) filters for performing diversified tasks such as image enhancement, edge detection, noise suppression, multi-scale decomposition, and compression. All of these tasks require multiple types of 2D filters simultaneously to acquire the desired results. The resource-hungry conventional approach is not a viable option for implementing these computationally intensive 2D filters, especially in a resource-constrained environment, so it calls for optimized solutions. Mostly, the optimization of these filters is based on exploiting structural properties. A common shortcoming of all previously reported optimized approaches is their restricted applicability to only a specific filter type. These narrow-scoped solutions completely disregard the versatility attribute of advanced image processing applications and in turn offset their effectiveness while implementing a complete application. This paper presents an efficient framework which exploits the structural properties of 2D filters for effectually reducing their computational cost, along with the added advantage of versatility for supporting diverse filter types. A composite symmetric filter structure is introduced which exploits the identities of quadrant and circular T-symmetries in two distinct filter regions simultaneously. These T-symmetries effectually reduce the number of filter coefficients and consequently the multiplier count. The proposed framework at the same time empowers this composite filter structure with additional capabilities of realizing all of its Ψ-symmetry-based subtypes and also its special asymmetric filter case. The two-fold optimized framework thus reduces filter computational cost by up to 75% compared to the conventional approach, while its versatility attribute not only supports diverse filter types but also offers further cost reduction via resource sharing for sequential implementation of diversified image

  8. A high-transmission liquid-crystal Fabry-Perot infrared filter for electrically tunable spectral imaging detection

    Science.gov (United States)

    Liu, Zhonglun; Xin, Zhaowei; Long, Huabao; Wei, Dong; Dai, Wanwan; Zhang, Xinyu; Wang, Haiwei; Xie, Changsheng

    2018-02-01

Previous studies have demonstrated the usefulness of typical liquid-crystal Fabry-Perot (LC-FP) infrared filters for spectral imaging detection, yet their infrared transmission performance still leaves room for improvement. In this paper, we propose a new type of electrically tunable LC-FP infrared filter to solve this problem. The key component of the device is an FP resonant cavity composed of two parallel plane mirrors, in which zinc selenide (ZnSe) materials with a very high transmittance in the mid- to long-wavelength infrared regions are used as the electrode substrates, and a layer of nano-aluminum (Al) film, in direct contact with the liquid-crystal materials, is chosen to form the highly reflective mirrors as well as the electrodes. In particular, it should be noted that the polyimide (PI) alignment layer used previously is removed. The experimental results indicate that the filter reduces the absorption of infrared waves remarkably, and thus points the way to effectively improving infrared transmittance.

  9. STUDIES OF NGC 6720 WITH CALIBRATED HST/WFC3 EMISSION-LINE FILTER IMAGES. I. STRUCTURE AND EVOLUTION

    International Nuclear Information System (INIS)

    O'Dell, C. R.; Ferland, G. J.; Henney, W. J.; Peimbert, M.

    2013-01-01

    We have performed a detailed analysis of the Ring Nebula (NGC 6720) using Hubble Space Telescope WFC3 images and derived a new three-dimensional model. Existing high spectral resolution spectra played an important supplementary role in our modeling. It is shown that the Main Ring of the nebula is an ionization-bounded irregular non-symmetric disk with a central cavity and perpendicular extended lobes pointed almost toward the observer. The faint outer halos are determined to be fossil radiation, i.e., radiation from gas ionized in an earlier stage of the nebula when it was not ionization bounded. The narrowband WFC3 filters that isolate some of the emission lines are affected by broadening on their short wavelength side and all the filters were calibrated using ground-based spectra. The filter calibration results are presented in an appendix.

  10. An approach of point cloud denoising based on improved bilateral filtering

    Science.gov (United States)

    Zheng, Zeling; Jia, Songmin; Zhang, Guoliang; Li, Xiuzhi; Zhang, Xiangyin

    2018-04-01

An omnidirectional mobile platform is designed for building point clouds based on an improved filtering algorithm employed to handle the depth image. First, the mobile platform can move flexibly, and the control interface is convenient. Then, because the traditional bilateral filtering algorithm is time-consuming and inefficient, a novel method called local bilateral filtering (LBF) is proposed. LBF is applied to process depth images obtained by the Kinect sensor, and the results show that the noise-removal effect is improved compared with bilateral filtering. In the off-line condition, the color images and processed depth images are used to build point clouds. Finally, experimental results demonstrate that our method improves the processing speed for depth images and the quality of the resulting point cloud.

  11. Gaussian particle filter based pose and motion estimation

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

Determination of relative three-dimensional (3D) position, orientation, and relative motion between two reference frames is an important problem in robotic guidance, manipulation, and assembly, as well as in other fields such as photogrammetry. A solution to the pose and motion estimation problem that uses two-dimensional (2D) intensity images from a single camera is desirable for real-time applications. The difficulty in performing this measurement is that the process of projecting 3D object features onto 2D images is a nonlinear transformation. In this paper, the 3D transformation is modeled as a nonlinear stochastic system, with the state estimation providing six-degrees-of-freedom motion and position values, using line features in the image plane as measurement inputs and dual quaternions to represent both rotation and translation in a unified notation. A filtering method called the Gaussian particle filter (GPF), based on the particle filtering concept, is presented for 3D pose and motion estimation of a moving target from monocular image sequences. The method has been implemented with simulated data, and simulation results are provided along with comparisons to the extended Kalman filter (EKF) and the unscented Kalman filter (UKF) to show the relative advantages of the GPF. Simulation results showed that the GPF is a superior alternative to the EKF and UKF.
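
The GPF cycle (sample particles from the current Gaussian belief, propagate, weight by the measurement likelihood, refit a single Gaussian) can be illustrated with a toy one-step sketch. The dynamics `f`, measurement `h`, and all noise parameters below are illustrative assumptions, not the paper's camera projection model:

```python
import numpy as np

rng = np.random.default_rng(0)

def gpf_step(mean, cov, z, f, h, Q, R, n=2000):
    """One Gaussian-particle-filter cycle: sample particles from the
    current Gaussian belief, propagate through the dynamics f, weight
    by the scalar measurement likelihood, and refit a Gaussian."""
    x = f(rng.multivariate_normal(mean, cov, n).T)                 # propagate particles
    x = x + rng.multivariate_normal(np.zeros(len(mean)), Q, n).T   # add process noise
    w = np.exp(-0.5 * (z - h(x))**2 / R)                           # measurement likelihood
    w /= w.sum()
    new_mean = x @ w                                               # weighted first moment
    d = x - new_mean[:, None]
    new_cov = (d * w) @ d.T                                        # weighted second moment
    return new_mean, new_cov
```

Unlike a generic particle filter, the posterior is collapsed back to a Gaussian each step, which avoids resampling degeneracy; the EKF and UKF instead propagate the Gaussian analytically through linearizations or sigma points.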

  12. Filtered-X Affine Projection Algorithms for Active Noise Control Using Volterra Filters

    Directory of Open Access Journals (Sweden)

    Sicuranza Giovanni L

    2004-01-01

We consider the use of adaptive Volterra filters, implemented in the form of multichannel filter banks, as nonlinear active noise controllers. In particular, we discuss the derivation of filtered-X affine projection (AP) algorithms for homogeneous quadratic filters. According to the multichannel approach, it is then easy to pass from these algorithms to those of a generic Volterra filter. It is shown in the paper that the AP technique offers better convergence and tracking capabilities than the classical LMS and NLMS algorithms usually applied in nonlinear active noise controllers, with a limited complexity increase. This paper extends in two ways the content of a previous contribution published in Proc. IEEE-EURASIP Workshop on Nonlinear Signal and Image Processing (NSIP '03), Grado, Italy, June 2003. First of all, a general adaptation algorithm valid for any order of affine projections is presented. Secondly, a more complete set of experiments is reported. In particular, the effects of using multichannel filter banks with a reduced number of channels are investigated and relevant results are shown.
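
The affine projection family can be illustrated by its simplest member: with projection order one, AP reduces to NLMS. The sketch below identifies an unknown FIR response with NLMS; it is a linear illustration only, since the paper's filtered-X quadratic Volterra case additionally filters the input through the secondary path and uses the multichannel bank structure. Step size and tap count are illustrative:

```python
import numpy as np

def nlms(x, d, taps=8, mu=0.5, eps=1e-6):
    """NLMS adaptation of an FIR filter -- the affine projection
    algorithm with projection order one. Higher-order AP projects
    onto several recent input vectors at once."""
    w = np.zeros(taps)
    e = np.zeros(len(x))
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]       # most recent inputs, newest first
        e[n] = d[n] - w @ u                   # a-priori error
        w += mu * e[n] * u / (u @ u + eps)    # normalized gradient step
    return w, e
```

Feeding it white noise through a known 3-tap system, the weight vector converges to that system's impulse response with the remaining taps near zero.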

  13. Filtered Rayleigh Scattering Measurements in a Buoyant Flow Field

    National Research Council Canada - National Science Library

    Meents, Steven M

    2008-01-01

    Filtered Rayleigh Scattering (FRS) is a non-intrusive, laser-based flow characterization technique that consists of a narrow linewidth laser, a molecular absorption filter, and a high resolution camera behind the filter to record images...

  14. Sub-nanometre resolution imaging of polymer-fullerene photovoltaic blends using energy-filtered scanning electron microscopy.

    Science.gov (United States)

    Masters, Robert C; Pearson, Andrew J; Glen, Tom S; Sasam, Fabian-Cyril; Li, Letian; Dapor, Maurizio; Donald, Athene M; Lidzey, David G; Rodenburg, Cornelia

    2015-04-24

    The resolution capability of the scanning electron microscope has increased immensely in recent years, and is now within the sub-nanometre range, at least for inorganic materials. An equivalent advance has not yet been achieved for imaging the morphologies of nanostructured organic materials, such as organic photovoltaic blends. Here we show that energy-selective secondary electron detection can be used to obtain high-contrast, material-specific images of an organic photovoltaic blend. We also find that we can differentiate mixed phases from pure material phases in our data. The lateral resolution demonstrated is twice that previously reported from secondary electron imaging. Our results suggest that our energy-filtered scanning electron microscopy approach will be able to make major inroads into the understanding of complex, nano-structured organic materials.

  16. Adaptive wiener filter based on Gaussian mixture distribution model for denoising chest X-ray CT image

    International Nuclear Information System (INIS)

    Tabuchi, Motohiro; Yamane, Nobumoto; Morikawa, Yoshitaka

    2008-01-01

In recent decades, X-ray CT imaging has become more important as a result of its high-resolution performance. However, it is well known that the X-ray dose is insufficient in techniques that use low-dose imaging in health screening or thin-slice imaging in work-up. Therefore, the degradation of CT images caused by streak artifacts frequently becomes problematic. In this study, we applied a Wiener filter (WF) using the universal Gaussian mixture distribution model (UNI-GMM) as a statistical model to remove streak artifacts. In designing the WF, it is necessary to estimate the statistical model and the precise covariances of the original image. In the proposed method, we obtained a variety of chest X-ray CT images using a phantom simulating a chest organ, and we estimated the statistical information using these images for training. The simulation results showed that it is possible to fit the UNI-GMM to chest X-ray CT images and reduce the specific noise. (author)
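
The Wiener shrinkage idea is easiest to see in its single-Gaussian special case (the UNI-GMM design above generalizes this by mixing several Gaussian components learned from training images). A minimal locally adaptive sketch, where the window size and the assumed-known noise variance are illustrative:

```python
import numpy as np

def adaptive_wiener(img, noise_var, k=3):
    """Locally adaptive Wiener filter (single-Gaussian case): shrink each
    pixel toward its local mean by the ratio of estimated signal variance
    to total variance, so flat regions are smoothed and edges kept."""
    img = np.asarray(img, dtype=float)
    pad = np.pad(img, k // 2, mode='reflect')
    # local mean and variance over each k x k window
    win = np.lib.stride_tricks.sliding_window_view(pad, (k, k))
    mu = win.mean(axis=(-2, -1))
    var = win.var(axis=(-2, -1))
    sig = np.maximum(var - noise_var, 0.0)          # estimated signal variance
    return mu + sig / (sig + noise_var) * (img - mu)
```

Where the local variance is no larger than the noise variance, the output collapses to the local mean; where structure dominates, the pixel passes through nearly unchanged.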

  17. A comparison of a niobium filter (NIOBI-X) with conventional filters in x-ray radiography

    International Nuclear Information System (INIS)

    Sandborg, M.; Alm Carlsson, G.

    1990-01-01

A 0.05 mm thick x-ray filter of niobium (NIOBI-X) has been tested, and the x-ray image quality and radiation doses have been compared with those of conventional x-ray filters of copper and aluminium. The results show that for x-ray tube voltages higher than 50 kV, or objects thicker than 50 mm, the 0.05 mm thick niobium filter can with benefit be replaced by a 0.11 mm thick copper filter. (25 refs.) (K.A.E.)

  18. Particle image velocimetry (PIV) study of rotating cylindrical filters for animal cell perfusion processes.

    Science.gov (United States)

    Figueredo-Cardero, Alvio; Chico, Ernesto; Castilho, Leda; de Andrade Medronho, Ricardo

    2012-01-01

    In the present work, the main fluid flow features inside a rotating cylindrical filtration (RCF) system used as external cell retention device for animal cell perfusion processes were investigated using particle image velocimetry (PIV). The motivation behind this work was to provide experimental fluid dynamic data for such turbulent flow using a high-permeability filter, given the lack of information about this system in the literature. The results shown herein gave evidence that, at the boundary between the filter mesh and the fluid, a slip velocity condition in the tangential direction does exist, which had not been reported in the literature so far. In the RCF system tested, this accounted for a fluid velocity 10% lower than that of the filter tip, which could be important for the cake formation kinetics during filtration. Evidence confirming the existence of Taylor vortices under conditions of turbulent flow and high permeability, typical of animal cell perfusion RCF systems, was obtained. Second-order turbulence statistics were successfully calculated. The radial behavior of the second-order turbulent moments revealed that turbulence in this system is highly anisotropic, which is relevant for performing numerical simulations of this system. Copyright © 2012 American Institute of Chemical Engineers (AIChE).

  19. First magnetic resonance imaging-guided aortic stenting and cava filter placement using a polyetheretherketone-based magnetic resonance imaging-compatible guidewire in swine: proof of concept.

    Science.gov (United States)

    Kos, Sebastian; Huegli, Rolf; Hofmann, Eugen; Quick, Harald H; Kuehl, Hilmar; Aker, Stephanie; Kaiser, Gernot M; Borm, Paul J A; Jacob, Augustinus L; Bilecen, Deniz

    2009-05-01

The purpose of this study was to demonstrate the feasibility of percutaneous transluminal aortic stenting and cava filter placement under magnetic resonance imaging (MRI) guidance, exclusively using a polyetheretherketone (PEEK)-based MRI-compatible guidewire. Percutaneous transluminal aortic stenting and cava filter placement were performed in 3 domestic swine. Procedures were performed under MRI guidance in an open-bore 1.5-T scanner. The applied 0.035-inch guidewire has a PEEK core reinforced by fibres, a floppy tip, hydrophilic coating, and paramagnetic markings for passive visualization. Through an 11F sheath, the guidewire was advanced into the abdominal (swine 1) or thoracic aorta (swine 2), and the stents were deployed. The guidewire was advanced into the inferior vena cava (swine 3), and the cava filter was deployed. Postmortem autopsy was performed. Procedural success, guidewire visibility, pushability, and stent support were qualitatively assessed by consensus. Procedure times were documented. Guidewire guidance into the abdominal and thoracic aortas and the inferior vena cava was successful. Stent deployments were successful in the abdominal (swine 1) and thoracic (swine 2) segments of the descending aorta. Cava filter positioning and deployment were successful. Autopsy documented good stent and filter positioning. Guidewire visibility through the applied markers was rated acceptable for aortic stenting and good for venous filter placement. Steerability, pushability, and device support were good. The PEEK-based guidewire allows both percutaneous MRI-guided aortic stenting in the thoracic and abdominal segments of the descending aorta and filter placement in the inferior vena cava, with acceptable to good device visibility, and offers good steerability, pushability, and device support.

  1. Matched-Filter Thermography

    Directory of Open Access Journals (Sweden)

    Nima Tabatabaei

    2018-04-01

Conventional infrared thermography techniques, including pulsed and lock-in thermography, have shown great potential for non-destructive evaluation of a broad spectrum of materials, spanning from metals to polymers to biological tissues. However, the performance of these techniques is often limited by the diffuse nature of thermal-wave fields, resulting in an inherent compromise between inspection depth and depth resolution. Recently, matched-filter thermography has been introduced as a means of overcoming this classic limitation, enabling depth-resolved subsurface thermal imaging and improving axial/depth resolution. This paper reviews the basic principles and experimental results of matched-filter thermography: first, mathematical and signal processing concepts related to matched filtering and pulse compression are discussed. Next, theoretical modeling of thermal-wave responses to matched-filter thermography using two categories of pulse compression techniques (linear frequency modulation and binary phase coding) is reviewed. Key experimental results from the literature, demonstrating the maintenance of axial resolution while inspecting deep into opaque and turbid media, are also presented and discussed. Finally, the concept of thermal coherence tomography, for deconvolution of the thermal responses of axially superposed sources and creation of depth-selective images in a diffusion-wave field, is reviewed.
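
The pulse-compression principle reads directly in code: cross-correlating a noisy record with the known linear-frequency-modulated excitation compresses the long chirp into a sharp peak at its delay. A minimal NumPy sketch with illustrative signal parameters (not the thermal-wave experiment itself):

```python
import numpy as np

fs, T = 1000.0, 0.2                      # sample rate (Hz) and chirp duration (s)
t = np.arange(0, T, 1 / fs)
f0, f1 = 10.0, 100.0                     # linear frequency sweep, 10 -> 100 Hz
chirp = np.sin(2 * np.pi * (f0 * t + (f1 - f0) / (2 * T) * t**2))

# noisy record containing one delayed copy of the chirp
delay = 50                               # true delay in samples
rec = np.zeros(600)
rec[delay:delay + len(chirp)] += chirp
rec += 0.2 * np.random.default_rng(1).standard_normal(len(rec))

# matched filtering = correlation with the known excitation;
# the compressed pulse peaks at the true delay
compressed = np.correlate(rec, chirp, mode='valid')
print(np.argmax(compressed))             # recovered delay, expected near 50
```

The peak-to-noise gain of the correlation is what lets matched-filter thermography keep axial resolution at depths where a single short pulse would be buried in noise.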

  2. Edge-based correlation image registration for multispectral imaging

    Science.gov (United States)

    Nandy, Prabal [Albuquerque, NM

    2009-11-17

    Registration information for images of a common target obtained from a plurality of different spectral bands can be obtained by combining edge detection and phase correlation. The images are edge-filtered, and pairs of the edge-filtered images are then phase correlated to produce phase correlation images. The registration information can be determined based on these phase correlation images.
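
A minimal sketch of the combination described: edge-filter each band, then phase-correlate the edge images to recover the translation between them. Sobel magnitude is used here as a stand-in edge filter, an assumption rather than the specific filter of the patent:

```python
import numpy as np

def sobel_edges(img):
    """Sobel gradient magnitude -- an edge filter applied to each band
    before correlation, so structurally similar but radiometrically
    different spectral bands can be matched."""
    img = np.asarray(img, dtype=float)
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    pad = np.pad(img, 1, mode='edge')
    h, w = img.shape
    gx = sum(kx[i, j] * pad[i:i + h, j:j + w] for i in range(3) for j in range(3))
    gy = sum(kx.T[i, j] * pad[i:i + h, j:j + w] for i in range(3) for j in range(3))
    return np.hypot(gx, gy)

def phase_correlate(a, b):
    """Integer (row, col) shift of b relative to a, from the peak of
    the normalized cross-power spectrum (phase correlation)."""
    F = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
    F /= np.abs(F) + 1e-12                      # keep phase only
    peak = np.unravel_index(np.argmax(np.fft.ifft2(F).real), a.shape)
    # peaks past the halfway point correspond to negative shifts
    return tuple(int(p) - s if p > s // 2 else int(p) for p, s in zip(peak, a.shape))
```

Given `b = np.roll(a, (3, -5), axis=(0, 1))`, `phase_correlate(a, b)` recovers `(3, -5)`; running the same correlation on `sobel_edges` outputs makes the match insensitive to per-band brightness differences.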

  3. The efficacy of K-edge filters in diagnostic radiology

    International Nuclear Information System (INIS)

    Williamson, B.D.P.; van Doorn, T.

    1994-01-01

The application of K-edge filters in diagnostic radiology has been investigated by many workers for over twenty years. These investigations have analysed the effects of such filters on image quality and radiation dose, as well as the practicalities of their application. This paper presents a synopsis of the published works and concludes that K-edge filters do not perceptibly improve image quality and make only limited reductions in patient dose. K-edge filters are also costly to purchase and potentially reduce the cost-effectiveness of x-ray examinations by increasing the x-ray tube loading. Equivalent contrast enhancement and dose reductions can be achieved by the assiduous choice of non-selective filters. 51 refs., 2 tab., 6 figs

  4. Distinguishing the Noise and image structures for detecting the correction term and filtering the noise by using fuzzy rules

    OpenAIRE

    Sridevi Ravada; Vani Prasanna Kanakala; Ramya Koilada

    2011-01-01

A fuzzy filter is constructed from a set of fuzzy IF-THEN rules; these fuzzy rules come either from human experts or from matching input-output pairs. In this paper we propose a new fuzzy filter for the noise reduction of images corrupted with additive noise. In this approach, fuzzy derivatives for all eight directions (N, E, W, S, NE, NW, SE, SW) are first calculated using fuzzy IF-THEN rules and membership functions. Further, the fuzzy derivative values obtained are used in the fu...

  5. Optical supervised filtering technique based on Hopfield neural network

    Science.gov (United States)

    Bal, Abdullah

    2004-11-01

The Hopfield neural network is commonly preferred for optimization problems. In image segmentation, conventional Hopfield neural networks (HNN) are formulated as a cost-function-minimization problem to perform gray-level thresholding on the image histogram or on the pixels' gray levels arranged in a one-dimensional array [R. Sammouda, N. Niki, H. Nishitani, Pattern Rec. 30 (1997) 921-927; K.S. Cheng, J.S. Lin, C.W. Mao, IEEE Trans. Med. Imag. 15 (1996) 560-567; C. Chang, P. Chung, Image and Vision Comp. 19 (2001) 669-678]. In this paper, a new high-speed supervised filtering technique is proposed for image feature extraction and enhancement problems by modifying the conventional HNN. The essential improvement in this technique is the use of a 2D convolution operation instead of weight-matrix multiplication. Thereby, a new neural-network-based filtering technique has been obtained that requires just a 3 × 3 filter mask matrix instead of a large weight coefficient matrix. Optical implementation of the proposed filtering technique is executed easily using the joint transform correlator. The requirement of non-negative data for optical implementation is met by a bias technique that converts the bipolar data to non-negative data. Simulation results of the proposed optical supervised filtering technique are reported for various feature extraction problems such as edge detection, corner detection, horizontal and vertical line extraction, and fingerprint enhancement.
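
The core substitution, a 3 × 3 mask convolution in place of a full weight-matrix multiply, is easy to make concrete. The Laplacian mask below is a standard edge-detection example, an illustrative stand-in rather than the mask learned in the paper:

```python
import numpy as np

def conv3x3(img, mask):
    """Filter an image with a 3 x 3 mask -- the lightweight operation the
    modified HNN uses in place of weight-matrix multiplication."""
    img = np.asarray(img, dtype=float)
    pad = np.pad(img, 1, mode='edge')
    out = np.zeros_like(img)
    for i in range(3):
        for j in range(3):
            # correlation form; flip the mask for true convolution
            out += mask[i, j] * pad[i:i + img.shape[0], j:j + img.shape[1]]
    return out

# a standard Laplacian edge-detection mask (illustrative)
laplacian = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)
```

The replacement matters for complexity: an N-pixel image needs only 9N multiplies per pass instead of the N² implied by a dense weight matrix, which is also what makes the joint-transform-correlator optics practical.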

  6. Dose profile measurement using an imaging plate: Evaluation of filters using Monte Carlo simulation of 4 MV x-rays

    Energy Technology Data Exchange (ETDEWEB)

    Hashimoto, Masatoshi [Division of Radiology and Biomedical Engineering, Graduate School of Medicine, University of Tokyo, Bunkyo-ku, Tokyo 113-8655 (Japan); Department of Therapeutic Radiology, Medical Plaza Edogawa, Edogawa-ku, Tokyo 133-0052 (Japan); Tomita, Tetsuya; Sawada, Koichi [Department of Radiology, Chiba University Hospital, Cyuo-ku, Chiba 260-8677 (Japan); Fujibuchi, Toshioh [Department of Radiological Sciences, School of Health Science, Ibaraki Prefectural University, Inashiki-gun, Ibaraki 300-0394, Japan and Graduate School of Comprehensive Human Sciences, University of Tsukuba, Tsukuba-shi, Ibaraki 305-8575 (Japan); Nishio, Teiji [Particle Therapy Division, Research Center for Innovation Oncology, National Cancer Center Hospital East, Kashiwa-shi, Chiba 277-8577 (Japan); Nakagawa, Keiichi [Division of Radiology and Biomedical Engineering, Graduate School of Medicine, The University of Tokyo, Bunkyo-ku, Tokyo 113-8655 (Japan)

    2009-04-15

Computed radiography (CR) is gradually replacing film. The application of CR to two-dimensional profile and off-axis ratio (OAR) measurement using an imaging plate (IP) in a CR system is currently under discussion. However, a well-known problem for IPs in dosimetry is that they use high-atomic-number (Z) materials, such as Ba, which have an energy dependency in photon interactions. Although there are some reports that it is possible to compensate for the energy dependency with metal filters, the appropriate thicknesses of these filters and where they should be located have not been investigated. The purpose of this study is to find the most suitable filter for use with an IP as a dosimetric tool. Monte Carlo simulation (Geant4 8.1) was used to determine the filter that minimizes the measurement error in OAR measurements of 4 MV x-rays. In this simulation, the material and thickness of the filter and the distance between the IP and the filter were varied to determine the filter conditions that gave the best fit to the MC-calculated OAR in water. With regard to the filter material, we found that using a higher-Z, higher-density material increased the effectiveness of the filter. Also, increasing the distance between the filter and the IP reduced the effectiveness, whereas increasing the thickness of the filter increased it. The results of this study showed that the most appropriate filter conditions, consistent with the calculated OAR in water, were the IP sandwiched between two 2 mm thick lead filters at a distance of 5 mm from the IP, or the IP sandwiched directly between two 1 mm lead filters. Using these filters, we measured the OAR at 10 cm depth with 100 cm source-to-surface distance and a 10 × 10 cm² field size at the surface. The measurement results showed errors of less than 2.0% and 2.0% in the field and less than 1.1% and 0.6% out of the field by using 2 and

  7. Improvement of QR Code Recognition Based on Pillbox Filter Analysis

    Directory of Open Access Journals (Sweden)

    Jia-Shing Sheu

    2013-04-01

The objective of this paper is to improve the recognition of captured QR code images degraded by blur, through pillbox filter analysis. QR code images can be captured by digital video cameras. Many factors contribute to QR code decoding failure, such as low image quality; focus is an important factor affecting image quality. This study discusses out-of-focus QR code images and aims to improve the recognition of their contents. Many studies have used the pillbox filter (circular averaging filter) method to simulate an out-of-focus image, and this method is also used in this investigation. A blurred QR code image is separated into nine levels. In the experiment, four different quantitative approaches are used to reconstruct and decode an out-of-focus QR code image, and the reconstructed QR code images are then compared. The final experimental results indicate improvements in identification.
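
The pillbox (circular averaging) blur used to model defocus is straightforward to construct: uniform weight inside a disk, normalized to sum to one. A direct NumPy sketch, where the radius values standing in for the paper's nine blur levels are illustrative:

```python
import numpy as np

def pillbox(radius):
    """Circular averaging (pillbox) kernel: uniform weight inside a
    disk of the given radius, normalized to sum to one."""
    r = int(np.ceil(radius))
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    k = (x**2 + y**2 <= radius**2).astype(float)
    return k / k.sum()

def defocus(img, radius):
    """Simulate an out-of-focus image by convolving with the pillbox kernel."""
    img = np.asarray(img, dtype=float)
    k = pillbox(radius)
    r = k.shape[0] // 2
    pad = np.pad(img, r, mode='edge')
    out = np.zeros_like(img)
    for i in range(k.shape[0]):
        for j in range(k.shape[1]):
            out += k[i, j] * pad[i:i + img.shape[0], j:j + img.shape[1]]
    return out
```

Increasing `radius` gives progressively stronger defocus, which is how a sharp QR code image can be degraded into a graded series of blur levels for reconstruction experiments.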

  8. Use of astronomy filters in fluorescence microscopy.

    Science.gov (United States)

    Piper, Jörg

    2012-02-01

Monochrome astronomy filters are well suited for use as excitation or suppression filters in fluorescence microscopy. Because of their particular optical design, such filters can be combined with standard halogen light sources for excitation of many fluorescent probes. In this "low-energy excitation," photobleaching (fading) and other harm to native specimens are avoided. Photomicrographs of living motile fluorescent specimens can also be taken with a flash, so that fluorescence images can be created free from blur caused by movement. Special filter cubes or dichroic mirrors are not needed for our method. With suitable astronomy filters, fluorescence microscopy can be carried out with standard laboratory microscopes equipped with condensers for bright-field (BF) and dark-field (DF) illumination in transmitted light. In BF excitation, the background brightness can be modulated in tiny steps all the way to dark or black. Moreover, standard industry microscopes fitted with a vertical illuminator for examination of opaque probes in DF or BF incident-light illumination (wafer inspections, for instance) can also be used for excitation in epi-illumination when adequate astronomy filters are inserted as excitation and suppression filters in the illuminating and imaging light paths. In all variants, the transmission bands can be modulated by transmission shift.

  9. Biomedical bandpass filter for fluorescence microscopy imaging based on TiO2/SiO2 and TiO2/MgF2 dielectric multilayers

    International Nuclear Information System (INIS)

    Butt, M A; Fomchenkov, S A; Verma, P; Khonina, S N; Ullah, A

    2016-01-01

    We report a design for multilayer dielectric optical filters based on TiO2/SiO2 and TiO2/MgF2 alternating layers. Titanium dioxide (TiO2) was selected as the high-refractive-index layer (2.5), with silicon dioxide (SiO2) and magnesium fluoride (MgF2) as low-refractive-index layers (1.45 and 1.37, respectively). Miniaturized visible spectrometers are useful for quick and mobile characterization of biological samples. Such devices can be fabricated using Fabry-Perot (FP) filters, which consist of two highly reflecting mirrors with a central cavity in between. Distributed Bragg reflectors (DBRs), consisting of alternating high- and low-refractive-index material pairs, are the most commonly used mirrors in FP filters due to their high reflectivity. However, DBRs are highly reflective only over a selected range of wavelengths, known as the stopband, which is usually much smaller than the sensitivity range of the spectrometer. Therefore, bandpass filters are required to reject wavelengths outside the stopband of the FP DBRs. The proposed filter shows high quality, with an average transmission of 97.4% within the passbands and around 4% outside. Special attention has been given to keeping the thickness of the filters within economic limits. These filters are an excellent choice for fluorescence imaging and endoscopic narrow-band imaging. (paper)

  10. Filtered backprojection for modifying the impulse response of circular tomosynthesis

    International Nuclear Information System (INIS)

    Stevens, Grant M.; Fahrig, Rebecca; Pelc, Norbert J.

    2001-01-01

    A filtering technique has been developed to modify the three-dimensional impulse response of circular motion tomosynthesis, allowing the generation of images whose appearance mimics that of other imaging geometries. In particular, this technique can reconstruct images with a blurring function that is more homogeneous for off-focal-plane objects than that of circular tomosynthesis. In this paper, we describe the filtering process and demonstrate the ability to alter the impulse response in circular motion tomosynthesis from a ring to a disk. This filtering may be desirable because the blurred out-of-plane objects appear less structured.

  11. Reduction of radiation exposure while maintaining high-quality fluoroscopic images during interventional cardiology using novel x-ray tube technology with extra beam filtering.

    Science.gov (United States)

    den Boer, A; de Feyter, P J; Hummel, W A; Keane, D; Roelandt, J R

    1994-06-01

    Radiographic technology plays an integral role in interventional cardiology. The number of interventions continues to increase, and the associated radiation exposure to patients and personnel is of major concern. This study was undertaken to determine whether a newly developed x-ray tube deploying grid-switched pulsed fluoroscopy and extra beam filtering can achieve a reduction in radiation exposure while maintaining fluoroscopic images of high quality. Three fluoroscopic techniques were compared: continuous fluoroscopy, pulsed fluoroscopy, and a newly developed high-output pulsed fluoroscopy with extra filtering. To ascertain differences in the quality of images and to determine differences in patient entrance and investigator radiation exposure, the radiated volume curve was measured to determine the required high voltage levels (kVpeak) for different object sizes for each fluoroscopic mode. The fluoroscopic data of 124 patient procedures were combined. The data were analyzed for radiographic projections, image intensifier field size, and x-ray tube kilovoltage levels (kVpeak). On the basis of this analysis, a reference procedure was constructed. The reference procedure was tested on a phantom or dummy patient by all three fluoroscopic modes. The phantom was so designed that the kilovoltage requirements for each projection were comparable to those needed for the average patient. Radiation exposure of the operator and patient was measured during each mode. The patient entrance dose was measured in air, and the operator dose was measured by 18 dosimeters on a dummy operator. Pulsed compared with continuous fluoroscopy could be performed with improved image quality at lower kilovoltages. The patient entrance dose was reduced by 21% and the operator dose by 54%. 
High-output pulsed fluoroscopy with extra beam filtering compared with continuous fluoroscopy improved the image quality, lowered the kilovoltage requirements, and reduced the patient entrance dose by 55% and

  12. Anti-Aliasing filter for reverse-time migration

    KAUST Repository

    Zhan, Ge

    2012-01-01

    We develop an anti-aliasing filter for reverse-time migration (RTM). It is similar to the traditional anti-aliasing filter used for Kirchhoff migration in that it low-pass filters the migration operator so that the dominant wavelength in the operator is greater than two times the trace sampling interval, except that it is applied to both primary and multiple reflection events. Instead of applying this filter to the data in the traditional RTM operation, we apply the anti-aliasing filter to the generalized diffraction-stack migration operator. This gives the same migration image as computed by anti-aliased RTM.

  13. Ortho-Babinet polarization-interrogating filter: an interferometric approach to polarization measurement.

    Science.gov (United States)

    Van Delden, Jay S

    2003-07-15

    A novel, interferometric, polarization-interrogating filter assembly and method for the simultaneous measurement of all four Stokes parameters across a partially polarized irradiance image in a no-moving-parts, instantaneous, highly sensitive manner is described. In the reported embodiment of the filter, two spatially varying linear retarders and a linear polarizer comprise an ortho-Babinet, polarization-interrogating (OBPI) filter. The OBPI filter uniquely encodes the incident ensemble of electromagnetic wave fronts comprising a partially polarized irradiance image in a controlled, deterministic, spatially varying manner to map the complete state of polarization across the image to local variations in a superposed interference pattern. Experimental interferograms are reported along with a numerical simulation of the method.

  14. SART-Type Half-Threshold Filtering Approach for CT Reconstruction.

    Science.gov (United States)

    Yu, Hengyong; Wang, Ge

    2014-01-01

    The ℓ1 regularization problem has been widely used to solve sparsity-constrained problems. To enhance the sparsity constraint for better imaging performance, a promising direction is to use the ℓp norm (0 < p < 1) and solve the ℓp minimization problem. Very recently, Xu et al. developed an analytic solution for the ℓ1/2 regularization via an iterative thresholding operation, which is also referred to as half-threshold filtering. In this paper, we design a simultaneous algebraic reconstruction technique (SART)-type half-threshold filtering framework to solve the computed tomography (CT) reconstruction problem. In the medical imaging field, the discrete gradient transform (DGT) is widely used to define sparsity. However, the DGT is noninvertible and cannot be applied to half-threshold filtering for CT reconstruction. To demonstrate the utility of the proposed SART-type half-threshold filtering framework, an emphasis of this paper is to construct a pseudoinverse transform for the DGT. The proposed algorithms are evaluated with numerical and physical phantom data sets. Our results show that the SART-type half-threshold filtering algorithms have great potential to improve the reconstructed image quality from few and noisy projections. They are complementary to the counterparts of the state-of-the-art soft-threshold filtering and hard-threshold filtering.
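The closed-form half-threshold operator of Xu et al. is somewhat involved; for orientation, the sketch below implements only the hard- and soft-threshold counterparts mentioned at the end of the abstract (function names and NumPy usage are mine, not the paper's):

```python
import numpy as np

def hard_threshold(x, lam):
    """Keep coefficients whose magnitude exceeds the threshold; zero the rest."""
    return np.where(np.abs(x) > lam, x, 0.0)

def soft_threshold(x, lam):
    """Shrink coefficients toward zero by lam (the l1 proximal operator)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)
```

The half-threshold operator sits between these two: like soft thresholding it zeros small coefficients, but it shrinks the surviving ones less aggressively.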

  15. Quaternionic Spatiotemporal Filtering for Dense Motion Field Estimation in Ultrasound Imaging

    Directory of Open Access Journals (Sweden)

    Marion Adrien

    2010-01-01

    Full Text Available Abstract Blood motion estimation provides fundamental clinical information to prevent and detect pathologies such as cancer. Ultrasound imaging associated with Doppler methods is often used for blood flow evaluation. However, Doppler methods suffer from shortcomings such as limited spatial resolution and the inability to estimate lateral motion. Numerous methods such as block matching and decorrelation-based techniques have been proposed to overcome these limitations. In this paper, we propose an original method to estimate dense fields of vector velocity from ultrasound image sequences. Our proposal is based on a spatiotemporal approach and considers 2D+t data as a 3D volume. The orientation of the texture within this volume is related to velocity. Thus, we designed a bank of 3D quaternionic filters to estimate local orientation and then calculate local velocities. The method was applied to a large set of experimental and simulated flow sequences with low motion (<1 mm/s) within small vessels (<1 mm). Evaluation was conducted with several quantitative criteria such as the normalized mean error or the estimated mean velocity. The results obtained show the good behaviour of our method, characterizing the flows studied.

  16. The model of illumination-transillumination for image enhancement of X-ray images

    Energy Technology Data Exchange (ETDEWEB)

    Lyu, Kwang Yeul [Shingu College, Sungnam (Korea, Republic of); Rhee, Sang Min [Kangwon National Univ., Chuncheon (Korea, Republic of)

    2001-06-01

    In digital image processing, the homomorphic filtering approach is derived from an illumination-reflectance model of the image. It can also be used with an illumination-transillumination model of X-ray film. Several X-ray images were enhanced with histogram equalization and with a homomorphic filter based on the illumination-transillumination model. The homomorphic filter bore out its theoretical claim of image density range compression with balanced contrast enhancement, and was also found to be a valuable tool for processing analog X-ray images into digital images.
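The classical homomorphic filtering pipeline (log transform, high-emphasis filtering in frequency space, exponentiation) can be sketched as follows. This is a generic textbook formulation, not the authors' implementation; the Gaussian high-emphasis shape, the gain values, and the function name are my own choices:

```python
import numpy as np

def homomorphic_filter(image, cutoff=0.1, low_gain=0.5, high_gain=1.5):
    """Illumination-transillumination model: the log separates slowly varying
    illumination (low frequencies) from transillumination detail (high
    frequencies); a Gaussian high-emphasis filter compresses the former
    (gain < 1) and boosts the latter (gain > 1)."""
    log_img = np.log1p(image.astype(float))
    F = np.fft.fft2(log_img)
    fy = np.fft.fftfreq(image.shape[0])[:, None]
    fx = np.fft.fftfreq(image.shape[1])[None, :]
    d2 = fy**2 + fx**2
    H = low_gain + (high_gain - low_gain) * (1.0 - np.exp(-d2 / (2 * cutoff**2)))
    out = np.fft.ifft2(F * H).real
    return np.expm1(out)  # undo the log transform
```

Because the filter acts on the log image, multiplicative illumination variations become additive and can be suppressed linearly.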

  17. Morphological representation of order-statistics filters.

    Science.gov (United States)

    Charif-Chefchaouni, M; Schonfeld, D

    1995-01-01

    We propose a comprehensive theory for the morphological bounds on order-statistics filters (and their repeated iterations). Conditions are derived for morphological openings and closings to serve as bounds (lower and upper, respectively) on order-statistics filters (and their repeated iterations). Under various assumptions, morphological open-closings and close-openings are also shown to serve as (tighter) bounds (lower and upper, respectively) on iterations of order-statistics filters. Simulations of the application of the results presented to image restoration are finally provided.
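The basic ordering result (opening below, closing above the median filter for the same flat, centered window) can be checked numerically. A minimal pure-Python sketch for 1D signals, illustrative only and not the paper's formalism:

```python
def slide(signal, size, op):
    """Apply op over a centered sliding window (edge-replicated borders)."""
    r = size // 2
    padded = [signal[0]] * r + list(signal) + [signal[-1]] * r
    return [op(padded[i:i + size]) for i in range(len(signal))]

def median(window):
    return sorted(window)[len(window) // 2]

def opening(signal, n):
    """Erosion (min) followed by dilation (max) with a flat window."""
    return slide(slide(signal, n, min), n, max)

def closing(signal, n):
    """Dilation (max) followed by erosion (min) with a flat window."""
    return slide(slide(signal, n, max), n, min)
```

For any signal, `opening <= median filter <= closing` holds pointwise with matching window sizes, which is the kind of bound the paper develops for general order-statistics filters.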

  18. Energy-filtered real- and k-space secondary and energy-loss electron imaging with Dual Emission Electron spectro-Microscope: Cs/Mo(110)

    Energy Technology Data Exchange (ETDEWEB)

    Grzelakowski, Krzysztof P., E-mail: k.grzelakowski@opticon-nanotechnology.com

    2016-05-15

    Since its introduction, the importance of complementary k∥-space (LEED) and real-space (LEEM) information in the investigation of surface science phenomena has been widely demonstrated over the last five decades. In this paper we report the application of a novel kind of electron spectromicroscope, the Dual Emission Electron spectroMicroscope (DEEM), with two independent electron optical channels for quasi-simultaneous reciprocal- and real-space imaging, in an investigation of a Cs-covered Mo(110) single crystal using the 800 eV electron beam from an “in-lens” electron gun system developed for sample illumination. With the DEEM spectromicroscope it is possible to observe dynamic, irreversible processes at surfaces in energy-filtered real space and in the corresponding energy-filtered k∥-space quasi-simultaneously in two independent imaging columns. The novel concept of high-energy electron beam sample illumination in cathode-lens-based microscopes allows chemically selective imaging and analysis under laboratory conditions. - Highlights: • A novel concept of electron sample illumination with an “in-lens” e-gun is realized. • Quasi-simultaneous energy-selective observation of real and k-space in EELS mode. • Observation of energy-filtered Auger electron diffraction at Cs atoms on Mo(110). • Energy-loss, Auger and secondary electron momentum microscopy is realized.

  19. Virtual experiment of optical spatial filtering in Matlab environment

    Science.gov (United States)

    Ji, Yunjing; Wang, Chunyong; Song, Yang; Lai, Jiancheng; Wang, Qinghua; Qi, Jing; Shen, Zhonghua

    2017-08-01

    The principle of the spatial filtering experiment is introduced, and a computer simulation platform with a graphical user interface (GUI) has been developed in the Matlab environment. With it, various filtering processes for different input images or filtering purposes can be completed accurately, and the filtering effect can be observed clearly while adjusting the experimental parameters. The physical nature of optical spatial filtering is shown vividly, promoting the effectiveness of experimental teaching.
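The 4f spatial filtering experiment simulated by such a platform reduces to: Fourier-transform the input, apply a mask in the Fourier plane, and transform back. A minimal NumPy sketch of that virtual experiment (the mask radius and function names are my own, and this is not the authors' Matlab code):

```python
import numpy as np

def spatial_filter(image, mask_fn):
    """Simulate 4f optical spatial filtering: the first lens forms the
    Fourier spectrum, a mask in the Fourier plane selects frequencies,
    and the second lens transforms back to the image plane."""
    F = np.fft.fftshift(np.fft.fft2(image))
    fy, fx = np.indices(F.shape)
    cy, cx = F.shape[0] / 2, F.shape[1] / 2
    mask = mask_fn(fy - cy, fx - cx)
    return np.fft.ifft2(np.fft.ifftshift(F * mask)).real

# example mask: circular low-pass aperture of radius 8 in the Fourier plane
lowpass = lambda fy, fx: (fy**2 + fx**2 <= 8**2).astype(float)
```

Swapping `lowpass` for its complement turns the setup into a high-pass (edge-enhancing) experiment, mirroring how a student would swap apertures on the bench.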

  20. [Restoration filtering based on projection power spectrum for single-photon emission computed tomography].

    Science.gov (United States)

    Kubo, N

    1995-04-01

    To improve the quality of single-photon emission computed tomographic (SPECT) images, a restoration filter has been developed. This filter was designed according to practical "least squares filter" theory, which requires knowledge of the object power spectrum and the noise power spectrum. The object power spectrum is estimated from the power spectrum of a projection, when the high-frequency power spectrum of a projection is adequately approximated as a polynomial exponential expression. Restoration with the filter based on a projection power spectrum was studied and compared with "Butterworth" filtering (cut-off frequency of 0.15 cycles/pixel) and "Wiener" filtering (with a constant signal-to-noise power spectrum ratio). Normalized mean-squared errors (NMSE) were computed for a phantom consisting of two line sources located in a 99mTc-filled cylinder. The NMSE of the "Butterworth" filter, the "Wiener" filter, and the filtering based on a projection power spectrum were 0.77, 0.83, and 0.76, respectively. Clinically, brain SPECT images utilizing this new restoration filter showed improved contrast. Thus, this filter may be useful in the diagnosis of SPECT images.
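The Butterworth baseline used for comparison has a simple closed form, H(f) = 1 / (1 + (f/fc)^(2n)). A sketch of applying it in frequency space (the order value and function names are assumptions, not taken from the study):

```python
import numpy as np

def butterworth_lowpass(shape, cutoff, order=4):
    """2D Butterworth low-pass response H(f) = 1 / (1 + (f/fc)^(2n)),
    with cutoff in cycles/pixel (e.g. 0.15 as in the study)."""
    fy = np.fft.fftfreq(shape[0])[:, None]
    fx = np.fft.fftfreq(shape[1])[None, :]
    f = np.sqrt(fy**2 + fx**2)
    return 1.0 / (1.0 + (f / cutoff)**(2 * order))

def apply_filter(projection, H):
    """Multiply the projection's spectrum by H and transform back."""
    return np.fft.ifft2(np.fft.fft2(projection) * H).real
```

The restoration filter of the paper differs in that its response is derived from the estimated projection power spectrum rather than a fixed analytic curve.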


  2. Hierarchical detection of red lesions in retinal images by multiscale correlation filtering

    Science.gov (United States)

    Zhang, Bob; Wu, Xiangqian; You, Jane; Li, Qin; Karray, Fakhri

    2009-02-01

    This paper presents an approach to the computer-aided diagnosis (CAD) of diabetic retinopathy (DR), a common and severe complication of long-term diabetes which damages the retina and causes blindness. Since red lesions are regarded as the first signs of DR, there has been extensive research on effective detection and localization of these abnormalities in retinal images. In contrast to existing algorithms, a new approach based on Multiscale Correlation Filtering (MSCF) and dynamic thresholding is developed. It consists of two levels: red lesion candidate detection (coarse level) and true red lesion detection (fine level). The approach was evaluated using data from the Retinopathy Online Challenge (ROC) competition website, and we conclude that our method is effective and efficient.

  3. Design considerations for a suboptimal Kalman filter

    Science.gov (United States)

    Difilippo, D. J.

    1995-06-01

    In designing a suboptimal Kalman filter, the designer must decide how to simplify the system error model without causing the filter estimation errors to increase to unacceptable levels. Deletion of certain error states and decoupling of error state dynamics are the two principal model simplifications commonly used in suboptimal filter design. For the most part, the decisions as to which error states can be deleted or decoupled are based on the designer's understanding of the physics of the particular system, so the details of a suboptimal design are usually unique to the specific application. In this paper, the process of designing a suboptimal Kalman filter is illustrated for the case of an airborne transfer-of-alignment (TOA) system used for synthetic aperture radar (SAR) motion compensation. In this application, the filter must continuously transfer the alignment of an onboard Doppler-damped master inertial navigation system (INS) to a strapdown navigator that processes information from a less accurate inertial measurement unit (IMU) mounted on the radar antenna. The IMU is used to measure spurious antenna motion during the SAR imaging interval, so that compensating phase corrections can be computed and applied to the radar returns, thereby preventing the image degradation that would otherwise result from such motions. The principles of SAR are described in many references. The primary function of the TOA Kalman filter in a SAR motion compensation system is to control strapdown navigator attitude errors and, to a lesser degree, velocity and heading errors. Unlike a classical navigation application, absolute positional accuracy is not important. The motion compensation requirements for SAR imaging are discussed in some detail. This TOA application is particularly appropriate as a vehicle for discussing suboptimal filter design, because the system contains features that can be exploited to allow both deletion and decoupling of error states.

  4. Energy Based Clutter Filtering for Vector Flow Imaging

    DEFF Research Database (Denmark)

    Villagómez Hoyos, Carlos Armando; Jensen, Jonas; Ewertsen, Caroline

    2017-01-01

    for obtaining vector flow measurements, since the spectra overlaps at high beam-to-flow angles. In this work a distinct approach is proposed, where the energy of the velocity spectrum is used to differentiate among the two signals. The energy based method is applied by limiting the amplitude of the velocity...... spectrum function to a predetermined threshold. The effect of the clutter filtering is evaluated on a plane wave (PW) scan sequence in combination with transverse oscillation (TO) and directional beamforming (DB) for velocity estimation. The performance of the filter is assessed by comparison...

  5. Application of a novel Kalman filter based block matching method to ultrasound images for hand tendon displacement estimation.

    Science.gov (United States)

    Lai, Ting-Yu; Chen, Hsiao-I; Shih, Cho-Chiang; Kuo, Li-Chieh; Hsu, Hsiu-Yun; Huang, Chih-Chung

    2016-01-01

    Information about tendon displacement is important for allowing clinicians to not only quantify preoperative tendon injuries but also to identify any adhesive scaring between tendon and adjacent tissue. The Fisher-Tippett (FT) similarity measure has recently been shown to be more accurate than the Laplacian sum of absolute differences (SAD) and Gaussian sum of squared differences (SSD) similarity measures for tracking tendon displacement in ultrasound B-mode images. However, all of these similarity measures can easily be influenced by the quality of the ultrasound image, particularly its signal-to-noise ratio. Ultrasound images of injured hands are unfortunately often of poor quality due to the presence of adhesive scars. The present study investigated a novel Kalman-filter scheme for overcoming this problem. Three state-of-the-art tracking methods (FT, SAD, and SSD) were used to track the displacements of phantom and cadaver tendons, while FT was used to track human tendons. These three tracking methods were combined individually with the proposed Kalman-filter (K1) scheme and another Kalman-filter scheme used in a previous study to optimize the displacement trajectories of the phantom and cadaver tendons. The motion of the human extensor digitorum communis tendon was measured in the present study using the FT-K1 scheme. The experimental results indicated that SSD exhibited better accuracy in the phantom experiments, whereas FT exhibited better performance for tracking real tendon motion in the cadaver experiments. All three tracking methods were influenced by the signal-to-noise ratio of the images. On the other hand, the K1 scheme was able to optimize the tracking trajectory of displacement in all experiments, even from a location with a poor image quality. The human experimental data indicated that the normal tendons were displaced more than the injured tendons, and that the motion ability of the injured tendon was restored after appropriate rehabilitation
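The paper's K1 scheme is not specified in the abstract; as a generic illustration of how a Kalman filter can optimize a noisy displacement trajectory, here is a standard constant-velocity filter over 1D displacement measurements (all parameter values and names are my own assumptions, not the authors' scheme):

```python
import numpy as np

def kalman_smooth(measurements, dt=1.0, q=1e-3, r=1.0):
    """Constant-velocity Kalman filter: state = [displacement, velocity].
    q scales the process noise, r is the measurement noise variance."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
    H = np.array([[1.0, 0.0]])              # we observe displacement only
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])     # process noise covariance
    x = np.array([measurements[0], 0.0])
    P = np.eye(2)
    out = []
    for z in measurements:
        x = F @ x                           # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + r                 # innovation covariance
        K = (P @ H.T) / S                   # Kalman gain, shape (2, 1)
        x = x + (K * (z - H @ x)).ravel()   # update with the measurement
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return np.array(out)
```

Feeding the per-frame block-matching displacements through such a filter suppresses outliers caused by poor image quality while preserving the overall motion trend.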

  6. Implementation of linear filters for iterative penalized maximum likelihood SPECT reconstruction

    International Nuclear Information System (INIS)

    Liang, Z.

    1991-01-01

    This paper reports on six low-pass linear filters, applied in frequency space, implemented for iterative penalized maximum-likelihood (ML) SPECT image reconstruction. The filters implemented were the Shepp-Logan filter, the Butterworth filter, the Gaussian filter, the Hann filter, the Parzen filter, and the Lagrange filter. The low-pass filtering was applied in frequency space to the projection data for the initial estimate, and to the difference between the projection data and the reprojected data for higher-order approximations. The projection data were acquired experimentally from a chest phantom consisting of non-uniform attenuating media. All the filters could effectively remove the noise and edge artifacts associated with the ML approach if the frequency cutoff was properly chosen. Improved performance of the Parzen and Lagrange filters relative to the others was observed. The best image, judged by its profiles in terms of noise smoothing, edge sharpening, and contrast, was the one obtained with the Parzen filter. However, the Lagrange filter has the potential to account for the characteristics of the detector response function.

  7. Nonrigid registration with tissue-dependent filtering of the deformation field

    International Nuclear Information System (INIS)

    Staring, Marius; Klein, Stefan; Pluim, Josien P W

    2007-01-01

    In present-day medical practice it is often necessary to nonrigidly align image data. Current registration algorithms do not generally take the characteristics of tissue into account. Consequently, rigid tissue, such as bone, can be deformed elastically, growth of tumours may be concealed, and contrast-enhanced structures may be reduced in volume. We propose a method to locally adapt the deformation field at structures that must be kept rigid, using a tissue-dependent filtering technique. This adaptive filtering of the deformation field results in locally linear transformations without scaling or shearing. The degree of filtering is related to tissue stiffness: more filtering is applied at stiff tissue locations, less at parts of the image containing nonrigid tissue. The tissue-dependent filter is incorporated in a commonly used registration algorithm, using mutual information as a similarity measure and cubic B-splines to model the deformation field. The new registration algorithm is compared with this popular method. Evaluation of the proposed tissue-dependent filtering is performed on 3D computed tomography (CT) data of the thorax and on 2D digital subtraction angiography (DSA) images. The results show that tissue-dependent filtering of the deformation field leads to improved registration results: tumour volumes and vessel widths are preserved rather than affected
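The core idea, smoothing the deformation field more strongly where tissue is stiff, can be illustrated with a simple iterative scheme. This is a toy sketch of the concept only (the blending rule, iteration count, and names are mine; the paper uses B-spline deformations and a more principled filter):

```python
import numpy as np

def tissue_dependent_filter(field, stiffness, iterations=10):
    """Blend each deformation value toward its 4-neighbor mean in proportion
    to local stiffness (1 = rigid, e.g. bone; 0 = soft tissue), so rigid
    structures are driven toward a locally smooth/linear transform while
    soft tissue keeps its elastic deformation. Periodic borders for brevity."""
    f = field.astype(float).copy()
    for _ in range(iterations):
        mean = (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 4.0
        f = stiffness * mean + (1.0 - stiffness) * f
    return f
```

With a stiffness map that is 1 over bone and 0 elsewhere, the field is flattened only inside the bone mask, which is the effect the paper aims for.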

  8. An Adaptive Filtering Algorithm Based on Genetic Algorithm-Backpropagation Network

    Directory of Open Access Journals (Sweden)

    Kai Hu

    2013-01-01

    Full Text Available A new image filtering algorithm is proposed. GA-BPN algorithm uses genetic algorithm (GA to decide weights in a back propagation neural network (BPN. It has better global optimal characteristics than traditional optimal algorithm. In this paper, we used GA-BPN to do image noise filter researching work. Firstly, this paper uses training samples to train GA-BPN as the noise detector. Then, we utilize the well-trained GA-BPN to recognize noise pixels in target image. And at last, an adaptive weighted average algorithm is used to recover noise pixels recognized by GA-BPN. Experiment data shows that this algorithm has better performance than other filters.

  9. Two-stage nonlinear filter for processing of scintigrams

    International Nuclear Information System (INIS)

    Pistor, P.; Hoener, J.; Walch, G.

    1973-01-01

    Linear filters which have been successfully used to process scintigrams can be modified in a meaningful manner by a preceding nonlinear point operator, the Anscombe transform. The advantages are: the scintigraphic noise becomes quasi-stationary and thus independent of the image, so the noise can readily be allowed for in the design of the convolutional operators. Transformed images with a stationary signal-to-noise ratio and a non-constant background correspond to untransformed images with a signal-to-noise ratio that varies within certain limits; the filter chain automatically adapts to these changes. Our filter has the advantage over the majority of space-varying filters of being realizable by Fast Fourier Transform techniques. These advantages have to be paid for by reduced signal-amplitude-to-background ratios. If the background is known, this shortcoming can easily be bypassed by processing trend-free scintigrams. If not, the filter chain should be completed by a third operator which reverses the Anscombe transform. The Anscombe transform influences the signal-to-noise ratio of cold spots and of hot spots in different ways. It remains an open question whether this fact can be utilized to directly influence the detectability of the different kinds of spots.
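The Anscombe transform and its algebraic inverse (the "third operator" mentioned above) are compact enough to state directly. A sketch, using the standard 2*sqrt(x + 3/8) form; the function names are mine:

```python
import numpy as np

def anscombe(counts):
    """Anscombe transform: maps Poisson-distributed counts to approximately
    unit-variance Gaussian data, making scintigraphic noise quasi-stationary
    (independent of the local count level)."""
    return 2.0 * np.sqrt(np.asarray(counts, dtype=float) + 3.0 / 8.0)

def inverse_anscombe(y):
    """Algebraic inverse of the forward transform."""
    return (np.asarray(y) / 2.0) ** 2 - 3.0 / 8.0
```

After the forward transform, a single stationary linear filter can be designed for the whole image; the inverse restores the count scale afterwards.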

  10. A local region of interest image reconstruction via filtered backprojection for fan-beam differential phase-contrast computed tomography

    International Nuclear Information System (INIS)

    Qi Zhihua; Chen Guanghong

    2007-01-01

    Recently, x-ray differential phase-contrast computed tomography (DPC-CT) has been experimentally implemented using a conventional source combined with several gratings. Images were reconstructed using a parallel-beam reconstruction formula. However, parallel-beam reconstruction formulae are not directly applicable for a large image object where the parallel-beam approximation fails. In this note, we present a new image reconstruction formula for fan-beam DPC-CT. There are two major features in this algorithm: (1) it enables the reconstruction of a local region of interest (ROI) using data acquired from an angular interval shorter than 180° + fan angle and (2) it still preserves the filtered backprojection structure. Numerical simulations have been conducted to validate the image reconstruction algorithm. (note)

  11. KALMAN FILTER BASED FEATURE ANALYSIS FOR TRACKING PEOPLE FROM AIRBORNE IMAGES

    Directory of Open Access Journals (Sweden)

    B. Sirmacek

    2012-09-01

    Full Text Available Recently, real-time analysis of mass events using computer vision techniques has become a very important research field. In particular, understanding the motion of people can help prevent unpleasant conditions, and understanding the behavioral dynamics of people can help estimate future states of underground passages, public entrances such as shopping centers, or streets. In order to bring an automated solution to this problem, we propose a novel approach using airborne image sequences. Although airborne image resolutions are not sufficient to see each person in detail, we can still notice a change of color components in the place where a person exists. Therefore, we propose a color-feature-detection-based probabilistic framework to detect people automatically. Extracted local features behave as observations of the probability density function (pdf of the people locations to be estimated. Using an adaptive kernel density estimation method, we estimate the corresponding pdf. First, we use the estimated pdf to detect boundaries of dense crowds. After that, using background information of dense crowds and previously extracted local features, we detect other people in non-crowd regions automatically for each image in the sequence. We benefit from Kalman filtering to track the motion of detected people. To test our algorithm, we use a stadium entrance image data set taken from an airborne camera system. Our experimental results indicate possible usage of the algorithm in real-life mass events. We believe that the proposed approach can also provide crucial information to police departments and crisis management teams, allowing more detailed observation of people in large open-area events to prevent possible accidents or unpleasant conditions.
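The kernel density estimation step can be sketched directly: each detected color feature contributes a Gaussian bump, and dense crowds appear as high-density regions that can be segmented by thresholding. This is a fixed-bandwidth illustration only (the paper uses an adaptive bandwidth; names and values are mine):

```python
import numpy as np

def density_map(points, shape, bandwidth=3.0):
    """Gaussian kernel density estimate of people locations over an image
    grid: sum one normalized Gaussian per detected feature point."""
    yy, xx = np.indices(shape)
    pdf = np.zeros(shape, dtype=float)
    for py, px in points:
        pdf += np.exp(-((yy - py)**2 + (xx - px)**2) / (2 * bandwidth**2))
    return pdf / (2 * np.pi * bandwidth**2 * max(len(points), 1))
```

Thresholding `density_map(...)` at a fraction of its maximum yields the dense-crowd boundary; isolated peaks correspond to individual detections that can then be handed to the Kalman tracker.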

  12. A center-median filtering method for detection of temporal variation in coronal images

    Directory of Open Access Journals (Sweden)

    Plowman Joseph

    2016-01-01

    Full Text Available Events in the solar corona are often widely separated in their timescales, which can allow them to be identified when they would otherwise be confused with emission from other sources in the corona. Methods for cleanly separating such events based on their timescales are thus desirable for research in the field. This paper develops a technique for identifying time-varying signals in solar coronal image sequences which is based on a per-pixel running median filter and an understanding of photon-counting statistics. Example applications to “EIT waves” (named after EIT, the EUV Imaging Telescope on the Solar and Heliospheric Observatory) and to small-scale dynamics are shown, both using 193 Å data from the Atmospheric Imaging Assembly (AIA) on the Solar Dynamics Observatory. The technique is found to discriminate EIT waves more cleanly than the running-difference and base-difference techniques most commonly used. It is also demonstrated that there is more signal in the data than is commonly appreciated: the waves can be traced to the edge of the AIA field of view when the data are rebinned to increase the signal-to-noise ratio.
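The two ingredients above, a per-pixel running median over time and a photon-counting noise threshold, can be combined in a few lines. A simplified sketch (window length, the sigma multiplier, and names are my assumptions; the paper's statistical treatment is more careful):

```python
import numpy as np

def temporal_variation(cube, window=5, nsigma=3.0):
    """Flag time-varying signal in an image cube of shape (time, y, x):
    compute a per-pixel running median over a temporal window, then flag
    deviations larger than nsigma times the photon-counting noise, whose
    standard deviation scales as sqrt(counts)."""
    nt = cube.shape[0]
    r = window // 2
    flags = np.zeros_like(cube, dtype=bool)
    for t in range(nt):
        lo, hi = max(0, t - r), min(nt, t + r + 1)   # truncated at the ends
        med = np.median(cube[lo:hi], axis=0)
        noise = np.sqrt(np.maximum(med, 1))          # Poisson noise estimate
        flags[t] = np.abs(cube[t] - med) > nsigma * noise
    return flags
```

Because the median is insensitive to brief transients, a short-lived brightening stands out against it even when slower coronal evolution is present.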

  13. Automated microaneurysm detection method based on double ring filter in retinal fundus images

    Science.gov (United States)

    Mizutani, Atsushi; Muramatsu, Chisako; Hatanaka, Yuji; Suemori, Shinsuke; Hara, Takeshi; Fujita, Hiroshi

    2009-02-01

    The presence of microaneurysms in the eye is one of the early signs of diabetic retinopathy, which is one of the leading causes of vision loss. We have been investigating a computerized method for the detection of microaneurysms on retinal fundus images, which were obtained from the Retinopathy Online Challenge (ROC) database. The ROC provides 50 training cases, in which "gold standard" locations of microaneurysms are provided, and 50 test cases without the gold standard locations. In this study, the computerized scheme was developed by using the training cases. Although the results for the test cases are also included, this paper mainly discusses the results for the training cases because the "gold standard" for the test cases is not known. After image preprocessing, candidate regions for microaneurysms were detected using a double-ring filter. Any potential false positives located in the regions corresponding to blood vessels were removed by automatic extraction of blood vessels from the images. Twelve image features were determined, and the candidate lesions were classified into microaneurysms or false positives using the rule-based method and an artificial neural network. The true positive fraction of the proposed method was 0.45 at 27 false positives per image. Forty-two percent of microaneurysms in the 50 training cases were considered invisible by the consensus of two co-investigators. When the method was evaluated for visible microaneurysms, the sensitivity for detecting microaneurysms was 65% at 27 false positives per image. Our computerized detection scheme could be improved for helping ophthalmologists in the early diagnosis of diabetic retinopathy.
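The double-ring candidate detection above can be illustrated by comparing the mean intensity inside an inner disk with the mean over the surrounding ring; the radii below are illustrative placeholders, not the values used in the paper.

```python
import numpy as np
from scipy.ndimage import convolve

def double_ring_response(image, r_in=2, r_out=5):
    """Double-ring-style detector sketch: dark, roughly circular spots
    (microaneurysm candidates) give a positive response where the inner-disk
    mean falls below the surrounding-ring mean."""
    y, x = np.ogrid[-r_out:r_out + 1, -r_out:r_out + 1]
    dist2 = x**2 + y**2
    inner = (dist2 <= r_in**2).astype(float)
    ring = ((dist2 > r_in**2) & (dist2 <= r_out**2)).astype(float)
    inner /= inner.sum()   # normalize so each kernel computes a mean
    ring /= ring.sum()
    mean_in = convolve(image.astype(float), inner, mode='nearest')
    mean_ring = convolve(image.astype(float), ring, mode='nearest')
    return mean_ring - mean_in  # positive where the centre is darker
```

Thresholding this response map would yield the candidate regions that the paper then screens against extracted blood vessels and classifies with image features.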

  14. Influence of Cone-Beam Computed Tomography filters on diagnosis of simulated endodontic complications.

    Science.gov (United States)

    Verner, F S; D'Addazio, P S; Campos, C N; Devito, K L; Almeida, S M; Junqueira, R B

    2017-11-01

To evaluate the influence of cone-beam computed tomography (CBCT) filters on the diagnosis of simulated endodontic complications. Sixteen human teeth, in three mandibles, were submitted to the following simulated endodontic complications: (G1) fractured file, (G2) perforations in the canal walls, (G3) deviated cast post, and (G4) external root resorption. The mandibles were submitted to CBCT examination (I-Cat® Next Generation). Five oral radiologists evaluated the images independently with and without XoranCat® software filters. Accuracy, sensitivity and specificity were determined. ROC curves were calculated for each group with the filters, and the areas under the curves were compared using a one-way ANOVA test. The McNemar test was applied for pair-wise agreement between all images versus the gold standard and between original images versus images with filters. Images with filters differed significantly from original images (P = 0.00 for all filters) only in the G1 group. There were no differences in the other groups. The filters did not improve the diagnosis of the simulated endodontic complications evaluated. Their diagnosis remains a major challenge in clinical practice. © 2016 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  15. A background subtraction routine for enhancing energy-filtered plasmon images of MgAl2O4 implanted with Al+ and Mg+ ions

    International Nuclear Information System (INIS)

    Evans, N.D.; Kenik, E.A.; Bentley, J.; Zinkle, S.J.

    1995-01-01

MgAl2O4, a candidate fusion reactor material, was irradiated with Al+ or Mg+ ions; electron energy-loss spectra and energy-filtered plasmon images showed that metallic Al colloids are present in the ion-irradiated regions. This paper describes in some detail the subtraction of the spinel plasmon component from images formed with 15-eV-loss electrons

  16. Blood Vessel Extraction in Color Retinal Fundus Images with Enhancement Filtering and Unsupervised Classification

    Directory of Open Access Journals (Sweden)

    Zafer Yavuz

    2017-01-01

Full Text Available Retinal blood vessels have a significant role in the diagnosis and treatment of various retinal diseases such as diabetic retinopathy, glaucoma, arteriosclerosis, and hypertension. For this reason, retinal vasculature extraction is important in order to help specialists in the diagnosis and treatment of systemic diseases. In this paper, a novel approach is developed to extract the retinal blood vessel network. Our method comprises four stages: (1) a preprocessing stage to prepare the dataset for segmentation; (2) an enhancement procedure in which Gabor, Frangi, and Gauss filters are applied separately before a top-hat transform; (3) a hard and soft clustering stage using K-means and Fuzzy C-means (FCM) to obtain a binary vessel map; and (4) a postprocessing step which removes falsely segmented isolated regions. The method is tested on color retinal images obtained from the STARE and DRIVE databases, which are available online. As a result, the Gabor filter followed by K-means clustering achieves 95.94% and 95.71% accuracy for the STARE and DRIVE databases, respectively, which is acceptable for diagnosis systems.
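Stages (2)-(3) can be sketched in miniature with a black top-hat transform (vessels are dark against the background) followed by a two-cluster 1-D K-means on the enhanced intensities; the structuring-element size is an illustrative assumption, and the Gabor/Frangi/Gauss filters of the full pipeline are omitted.

```python
import numpy as np
from scipy.ndimage import black_tophat

def vessel_map(green_channel, size=7, iters=20):
    """Enhance dark elongated structures with a black top-hat, then binarize
    with a simple two-centre K-means on pixel intensities."""
    enhanced = black_tophat(green_channel.astype(float), size=size)
    # two-centre K-means on the 1-D intensity distribution
    c0, c1 = enhanced.min(), enhanced.max()
    labels = np.zeros(enhanced.shape, dtype=bool)
    for _ in range(iters):
        labels = np.abs(enhanced - c0) > np.abs(enhanced - c1)  # True -> bright cluster
        if labels.any() and (~labels).any():
            c0, c1 = enhanced[~labels].mean(), enhanced[labels].mean()
    return labels  # True where the top-hat response is vessel-like
```

On real fundus images the green channel is normally used as input, since it shows the strongest vessel contrast.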

  17. Union operation image processing of data cubes separately processed by different objective filters and its application to void analysis in an all-solid-state lithium-ion battery.

    Science.gov (United States)

    Yamamoto, Yuta; Iriyama, Yasutoshi; Muto, Shunsuke

    2016-04-01

In this article, we propose a smart image-analysis method suitable for extracting target features of hierarchical dimensions from original data. The method was applied to three-dimensional volume data of an all-solid-state lithium-ion battery, obtained by an automated sequential sample milling and imaging process using a focused ion beam/scanning electron microscope, to investigate the spatial configuration of voids inside the battery. To automatically and fully extract the shape and location of the voids, three types of filters were consecutively applied: a median blur filter to extract relatively larger voids, a morphological opening operation filter for small dot-shaped voids and a morphological closing operation filter for small voids with concave contrasts. Three data cubes separately processed by the above-mentioned filters were integrated by a union operation into the final unified volume data, which confirmed the correct extraction of the voids over the entire range of dimensions contained in the original data. © The Author 2015. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
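The three-filter-plus-union idea can be sketched as follows; the filter sizes, the threshold, and the "voids are dark" convention are illustrative assumptions rather than the paper's actual parameters.

```python
import numpy as np
from scipy import ndimage as ndi

def extract_voids(volume, thresh, med_size=5, morph_size=3):
    """Apply three complementary filters to the same volume and merge the
    per-filter void masks with a logical OR (union operation), so voids of
    different sizes and contrasts are all retained."""
    med = ndi.median_filter(volume, size=med_size)       # relatively larger voids
    opened = ndi.grey_opening(volume, size=morph_size)   # small dot-shaped voids
    closed = ndi.grey_closing(volume, size=morph_size)   # small concave-contrast voids
    masks = [(f < thresh) for f in (med, opened, closed)]  # voids assumed dark
    return masks[0] | masks[1] | masks[2]
```

The union guarantees that a void missed by one filter but caught by another still appears in the final unified volume.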

  18. Novel noise reduction filter for improving visibility of early computed tomography signs of hyperacute stroke. Evaluation of the filter's performance. Preliminary clinical experience

    International Nuclear Information System (INIS)

    Takahashi, Noriyuki; Ishii, Kiyoshi; Lee, Y.; Tsai, D.Y.

    2007-01-01

The aim of this study was to evaluate the performance of a novel noise reduction filter for improving the visibility of early computed tomography (CT) signs of hyperacute stroke on nonenhanced CT images. Fourteen patients with a middle cerebral artery occlusion within 4.5 h after onset were evaluated. The signal-to-noise ratio (SNR) of the images processed with the noise reduction filter and that of the original images were measured. Two neuroradiologists visually rated all the processed and original images on the visibility of normal and abnormal gray-white matter interfaces. The SNR value of the processed images was approximately eight times as high as that of the original images, and an 87% reduction of noise was achieved using this technique. For the visual assessment, the results showed that the visibility of the normal gray-white matter interface and that of the loss of the gray-white matter interface were significantly improved using the proposed method (P<0.05). The noise reduction filter proposed in the present study has the potential to improve the visibility of early CT signs of hyperacute stroke on nonenhanced CT images. (author)

  19. Modified-hybrid optical neural network filter for multiple object recognition within cluttered scenes

    Science.gov (United States)

    Kypraios, Ioannis; Young, Rupert C. D.; Chatwin, Chris R.

    2009-08-01

Motivated by the non-linear interpolation and generalization abilities of the hybrid optical neural network filter between the reference and non-reference images of the true-class object, we designed the modified-hybrid optical neural network filter. We applied an optical mask to the hybrid optical neural network filter's input. The mask was built with the constant weight connections of a randomly chosen image included in the training set. The resulting design of the modified-hybrid optical neural network filter is optimized to perform best in cluttered scenes of the true-class object. Due to the shift-invariance properties inherited from its correlator unit, the filter can accommodate multiple objects of the same class to be detected within an input cluttered image. Additionally, the architecture of the neural network unit of the general hybrid optical neural network filter allows the recognition of multiple objects of different classes within the input cluttered image by modifying the output layer of the unit. We test the modified-hybrid optical neural network filter for the recognition of multiple objects of the same and of different classes within cluttered input images and video sequences of cluttered scenes. The filter is shown to exhibit, with a single pass over the input data, simultaneous out-of-plane rotation and shift invariance and good clutter tolerance. It is able to successfully detect and correctly classify the true-class objects within background clutter for which there has been no previous training.

  20. Experimental use of iteratively designed rotation invariant correlation filters

    International Nuclear Information System (INIS)

    Sweeney, D.W.; Ochoa, E.; Schils, G.F.

    1987-01-01

Iteratively designed filters are incorporated into an optical correlator for position-, rotation-, and intensity-invariant recognition of target images. The filters exhibit excellent discrimination because they are designed to contain full information about the target image. Numerical simulations and experiments demonstrate detection of targets that are corrupted with random noise (SNR≅0.5) and also partially obscured by other objects. The complex-valued filters are encoded in a computer-generated hologram and fabricated directly using an electron-beam system. Experimental results using a liquid crystal spatial light modulator for real-time input show excellent agreement with analytical and numerical computations

  1. [Examination of patient dose reduction in cardiovascular X-ray systems with a metal filter].

    Science.gov (United States)

    Yasuda, Mitsuyoshi; Kato, Kyouichi; Tanabe, Nobuaki; Sakiyama, Koushi; Uchiyama, Yushi; Suzuki, Yoshiaki; Suzuki, Hiroshi; Nakazawa, Yasuo

    2012-01-01

In interventional X-ray systems for cardiology with a flat panel digital detector (FPD), we observed a phenomenon in which the exposure dose increased suddenly when the subject thickness increased. At that time, the variable metal filters built into the FPD system were all off. Therefore, we examined whether dose reduction was possible, without affecting the clinical image, using a metal filter (filter) that we have conventionally used for dose reduction. About 45% dose reduction was achieved when we measured the exposure dose at 30 cm of acrylic thickness in the presence of the filter. In addition, we measured the signal-to-noise ratio, contrast-to-noise ratio and resolution limit by visual evaluation, and found no influence of filter usage. In the clinical examination, the image quality of coronary angiography (40 cases) was visually rated by a physician on a 5-point scale. As a result, filter usage did not influence image quality (p=NS). Therefore, the sudden increase of exposure dose was reduced, without influencing image quality, by adding the filter to the FPD.

  2. Object-Oriented Semisupervised Classification of VHR Images by Combining MedLDA and a Bilateral Filter

    Directory of Open Access Journals (Sweden)

    Shi He

    2015-01-01

Full Text Available A Bayesian hierarchical model is presented to classify very high resolution (VHR) images in a semisupervised manner, in which both a maximum entropy discrimination latent Dirichlet allocation (MedLDA) and a bilateral filter are combined into a novel application framework. The primary contribution of this paper is to nullify the disadvantages of traditional probabilistic topic models on pixel-level supervised information and to achieve the effective classification of VHR remote sensing images. This framework consists of the following two iterative steps. In the training stage, the model utilizes the central labeled pixel and its neighborhood, as a square labeled image object, to train the classifiers. In the classification stage, each central unlabeled pixel with its neighborhood, as an unlabeled object, is assigned the user-provided geo-object class label with the maximum posterior probability. Gibbs sampling is adopted for model inference. The experimental results demonstrate that the proposed method outperforms two classical SVM-based supervised classification methods and probabilistic-topic-models-based classification methods.
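The edge-preserving smoother in the framework, the bilateral filter, can be written as a brute-force reference implementation; the radius and the spatial/range sigmas below are illustrative defaults.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=25.0):
    """Brute-force bilateral filter: each output pixel is a mean of its
    neighbours weighted by both spatial and intensity ('range') proximity,
    so homogeneous regions are smoothed while class boundaries survive."""
    img = img.astype(float)
    pad = np.pad(img, radius, mode='edge')
    out = np.zeros_like(img)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # range weight: penalize neighbours with very different intensity
            w = spatial * np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
            out[i, j] = (w * patch).sum() / w.sum()
    return out
```

Across a sharp class boundary the range weight collapses, which is exactly why the filter preserves geo-object edges while denoising the interiors.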

  3. Three-Dimensional Terahertz Coded-Aperture Imaging Based on Matched Filtering and Convolutional Neural Network.

    Science.gov (United States)

    Chen, Shuo; Luo, Chenggao; Wang, Hongqiang; Deng, Bin; Cheng, Yongqiang; Zhuang, Zhaowen

    2018-04-26

    As a promising radar imaging technique, terahertz coded-aperture imaging (TCAI) can achieve high-resolution, forward-looking, and staring imaging by producing spatiotemporal independent signals with coded apertures. However, there are still two problems in three-dimensional (3D) TCAI. Firstly, the large-scale reference-signal matrix based on meshing the 3D imaging area creates a heavy computational burden, thus leading to unsatisfactory efficiency. Secondly, it is difficult to resolve the target under low signal-to-noise ratio (SNR). In this paper, we propose a 3D imaging method based on matched filtering (MF) and convolutional neural network (CNN), which can reduce the computational burden and achieve high-resolution imaging for low SNR targets. In terms of the frequency-hopping (FH) signal, the original echo is processed with MF. By extracting the processed echo in different spike pulses separately, targets in different imaging planes are reconstructed simultaneously to decompose the global computational complexity, and then are synthesized together to reconstruct the 3D target. Based on the conventional TCAI model, we deduce and build a new TCAI model based on MF. Furthermore, the convolutional neural network (CNN) is designed to teach the MF-TCAI how to reconstruct the low SNR target better. The experimental results demonstrate that the MF-TCAI achieves impressive performance on imaging ability and efficiency under low SNR. Moreover, the MF-TCAI has learned to better resolve the low-SNR 3D target with the help of CNN. In summary, the proposed 3D TCAI can achieve: (1) low-SNR high-resolution imaging by using MF; (2) efficient 3D imaging by downsizing the large-scale reference-signal matrix; and (3) intelligent imaging with CNN. Therefore, the TCAI based on MF and CNN has great potential in applications such as security screening, nondestructive detection, medical diagnosis, etc.
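The matched-filtering stage can be illustrated in one dimension: correlating the received echo with the transmitted waveform concentrates target energy into a peak whose lag gives the delay, which is what improves detectability at low SNR before any CNN post-processing. The Barker-coded waveform in the test is an illustrative stand-in for the paper's frequency-hopping signal.

```python
import numpy as np

def matched_filter_delay(rx, template):
    """Correlate the received signal with the known transmitted template and
    return the lag (in samples) of the strongest correlation peak."""
    corr = np.correlate(rx, template, mode='full')
    # in 'full' mode, zero lag sits at index len(template) - 1
    return int(np.argmax(np.abs(corr))) - (len(template) - 1)
```

The same operation, applied per spike pulse, is what lets targets in different imaging planes be reconstructed separately before being synthesized into the 3D image.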

  4. X-ray fluoroscopy spatio-temporal filtering with object detection

    International Nuclear Information System (INIS)

    Aufrichtig, R.; Wilson, D.L.; University Hospitals of Cleveland, OH

    1995-01-01

One potential way to reduce patient and staff x-ray fluoroscopy dose is to reduce the quantum exposure to the detector and compensate for the additional noise with digital filtering. A new filtering method, spatio-temporal filtering with object detection, is described that reduces noise while minimizing motion and spatial blur. As compared to some conventional motion-detection filtering schemes, this object-detection method incorporates additional a priori knowledge of image content; i.e., much of the motion occurs in isolated long thin objects (catheters, guide wires, etc.). The authors create object-likelihood images and use these to control spatial and recursive temporal filtering so as to reduce blurring of the objects of interest. They use automatically computed receiver operating characteristic (ROC) curves to optimize the object-likelihood enhancement method and determine that oriented matched filter kernels with 4 orientations are appropriate. The matched filter kernels are simple projected cylinders. The authors demonstrate the method on several representative x-ray fluoroscopy sequences to which noise is added to simulate very low dose acquisitions. With processing, they find that noise variance is significantly reduced, with slightly less noise reduction near moving objects. They estimate an effective exposure reduction greater than 80%
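The object-likelihood computation can be sketched by rotating a thin line kernel to four orientations, correlating each zero-mean kernel with the image, and keeping the per-pixel maximum response; the kernel length and the bright-line convention are illustrative assumptions (the paper's kernels are projected cylinders).

```python
import numpy as np
from scipy import ndimage as ndi

def object_likelihood(img, length=9):
    """Oriented matched filtering: max response over 4 line orientations,
    used as an object-likelihood map to gate spatio-temporal smoothing
    near thin moving objects such as guide wires and catheters."""
    base = np.zeros((length, length))
    base[length // 2, :] = 1.0              # thin horizontal line template
    responses = []
    for angle in (0, 45, 90, 135):
        k = ndi.rotate(base, angle, reshape=False, order=1)
        k -= k.mean()                       # zero-mean: flat background -> 0
        responses.append(ndi.correlate(img.astype(float), k))
    return np.max(responses, axis=0)
```

Pixels with high likelihood would then receive less temporal recursion (to avoid motion blur) and less spatial smoothing (to avoid blurring the wire).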

  5. Low SWaP multispectral sensors using dichroic filter arrays

    Science.gov (United States)

    Dougherty, John; Varghese, Ron

    2015-06-01

The benefits of multispectral imaging are well established in a variety of applications including remote sensing, authentication, satellite and aerial surveillance, machine vision, biomedical, and other scientific and industrial uses. However, many of the potential solutions require more compact, robust, and cost-effective cameras to realize these benefits. The next generation of multispectral sensors and cameras needs to deliver improvements in size, weight, power, portability, and spectral band customization to support widespread deployment for a variety of purpose-built aerial, unmanned, and scientific applications. A novel implementation uses micro-patterning of dichroic filters into Bayer and custom mosaics, enabling true real-time multispectral imaging with simultaneous multi-band image acquisition. Consistent with color image processing, individual spectral channels are de-mosaiced, with each channel providing an image of the field of view. This approach can be implemented across a variety of wavelength ranges and on a variety of detector types including linear, area, silicon, and InGaAs. This dichroic filter array approach can also reduce payloads and increase range for unmanned systems, with the capability to support both handheld and autonomous systems. Recent examples and results of 4-band RGB + NIR dichroic filter arrays in multispectral cameras are discussed. Benefits and tradeoffs of multispectral sensors using dichroic filter arrays are compared with alternative approaches - including their passivity, spectral range, customization options, and scalable production.
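The de-mosaicing step for a 2x2 filter-array pattern can be sketched as simple sub-sampling, one band image per mosaic site; the band ordering within the 2x2 cell is an illustrative assumption.

```python
import numpy as np

def demosaic_channels(raw, pattern=('R', 'G', 'B', 'N')):
    """Split a raw frame from a 2x2 dichroic filter array (e.g. RGB + NIR)
    into four simultaneous half-resolution band images of the same scene.
    `pattern` names the bands at offsets (0,0), (0,1), (1,0), (1,1)."""
    offsets = {pattern[0]: (0, 0), pattern[1]: (0, 1),
               pattern[2]: (1, 0), pattern[3]: (1, 1)}
    return {band: raw[dy::2, dx::2] for band, (dy, dx) in offsets.items()}
```

In a real camera pipeline each sub-sampled band would then be interpolated back to full resolution, exactly as in Bayer color demosaicing.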

  6. Suitable post processing algorithms for X-ray imaging using oversampled displaced multiple images

    International Nuclear Information System (INIS)

    Thim, J; Reza, S; Nawaz, K; Norlin, B; O'Nils, M; Oelmann, B

    2011-01-01

X-ray imaging systems such as photon counting pixel detectors have a limited spatial resolution of the pixels, based on the complexity and processing technology of the readout electronics. For X-ray imaging situations where the features of interest are smaller than the imaging system pixel size, and the pixel size cannot be made smaller in the hardware, alternative means of resolution enhancement need to be considered. Oversampling with the use of multiple displaced images, where the pixels of all images are mapped to a final resolution-enhanced image, has proven a viable method of reaching a sub-pixel resolution exceeding the original resolution. The effectiveness of the oversampling method declines with the number of images taken: the sub-pixel resolution increases, but relative to a real reduction of imaging pixel sizes yielding a full-resolution image, the perceived resolution of the sub-pixel oversampled image is lower. This is because the oversampling method introduces blurring noise into the mapped final images, and the blurring relative to full-resolution images increases with the oversampling factor. One way of increasing the performance of the oversampling method is to sharpen the images in post processing. This paper focuses on characterizing the performance increase of the oversampling method after the use of suitable post processing filters, specifically for digital X-ray images. The results show that spatial domain filters and frequency domain filters of the same type yield indistinguishable results, which is to be expected. The results also show that the effectiveness of applying sharpening filters to oversampled multiple images increases with the number of images used (oversampling factor), leaving 60-80% of the original blurring noise after filtering a 6 x 6 mapped image (36 images taken), where the percentage depends on the type of filter. This means that the effectiveness of the oversampling itself increases when sharpening filters are used.
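The mapping step, placing each of the n x n displaced low-resolution frames onto an n-times-finer grid, can be sketched as follows; the assumed shift convention (frame k displaced by k // n and k % n detector sub-steps) is an illustrative choice.

```python
import numpy as np

def interleave_oversampled(images, n):
    """Map n*n displaced low-res images onto an n-times-finer grid: each
    low-res pixel value is placed at the sub-pixel site matching its
    frame's (dy, dx) displacement."""
    h, w = images[0].shape
    out = np.zeros((h * n, w * n))
    for k, img in enumerate(images):
        dy, dx = k // n, k % n
        out[dy::n, dx::n] = img
    return out
```

The blurring noise the paper discusses arises because each placed value still integrates over a full-size detector pixel, which is why a sharpening filter applied to `out` recovers part of the lost contrast.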

  7. Unsharp masking technique as a preprocessing filter for improvement of 3D-CT image of bony structure in the maxillofacial region

    International Nuclear Information System (INIS)

    Harada, Takuya; Nishikawa, Keiichi; Kuroyanagi, Kinya

    1998-01-01

    We evaluated the usefulness of the unsharp masking technique as a preprocessing filter to improve 3D-CT images of bony structure in the maxillofacial region. The effect of the unsharp masking technique with several combinations of mask size and weighting factor on image resolution was investigated using a spatial frequency phantom made of bone-equivalent material. The 3D-CT images were obtained with scans perpendicular to and parallel to the phantom plates. The contrast transfer function (CTF) and the full width at half maximum (FWHM) of each spatial frequency component were measured. The FWHM was expressed as a ratio against the actual thickness of phantom plate. The effect on pseudoforamina was assessed using sliced CT images obtained in clinical bony 3D-CT examinations. The effect of the unsharp masking technique on image quality was also visually evaluated using five clinical fracture cases. CTFs did not change. FWHM ratios of original 3D-CT images were smaller than 1.0, regardless of the scanning direction. Those in scans perpendicular to the phantom plates were not changed by the unsharp masking technique. Those in parallel scanning were increased by mask size and weighting factor. The area of pseudoforamina decreased with increases in mask size and weighting factor. The combination of mask size 3 x 3 pixels and weighting factor 5 was optimal. Visual evaluation indicated that preprocessing with the unsharp masking technique improved the image quality of the 3D-CT images. The unsharp masking technique is useful as a preprocessing filter to improve the 3D-CT image of bony structure in the maxillofacial region. (author)
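The unsharp masking operation itself is simple: subtract a local-mean blurred copy from the image and add the weighted difference back. The paper found a 3 x 3 mask with weighting factor 5 optimal; the exact weighting convention below is an assumption about how those parameters combine.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def unsharp_mask(img, mask_size=3, weight=5.0):
    """Unsharp masking as a preprocessing filter: boost detail by adding
    back the difference between the image and its local mean."""
    img = img.astype(float)
    blurred = uniform_filter(img, size=mask_size)   # local mean over the mask
    return img + weight * (img - blurred)
```

Applied to the sliced CT images before 3D surface rendering, this kind of edge boost is what sharpens thin bony walls and shrinks pseudoforamina in the reconstructed surface.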

  8. Papaya Tree Detection with UAV Images Using a GPU-Accelerated Scale-Space Filtering Method

    Directory of Open Access Journals (Sweden)

    Hao Jiang

    2017-07-01

Full Text Available The use of unmanned aerial vehicles (UAVs) can allow individual tree detection for forest inventories in a cost-effective way. The scale-space filtering (SSF) algorithm is commonly used and has the capability of detecting trees of different crown sizes. In this study, we made two improvements with regard to the existing method and implementations. First, we incorporated SSF with a Lab color transformation to reduce over-detection problems associated with the original luminance image. Second, we ported four of the most time-consuming processes to the graphics processing unit (GPU) to improve computational efficiency. The proposed method was implemented using PyCUDA, which enabled access to NVIDIA’s compute unified device architecture (CUDA) through high-level scripting of the Python language. Our experiments were conducted using two images captured by the DJI Phantom 3 Professional and a recent NVIDIA GTX 1080 GPU. The resulting accuracy was high, with an F-measure larger than 0.94. The speedup achieved by our parallel implementation was 44.77 and 28.54 for the first and second test image, respectively. For each 4000 × 3000 image, the total runtime was less than 1 s, which was sufficient for real-time performance and interactive application.
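Scale-space filtering for crown detection can be illustrated with scale-normalized Laplacian-of-Gaussian responses: a bright, roughly circular crown produces its strongest response at the sigma matching its size. The sigma set and single-blob output below are illustrative simplifications of the full SSF pipeline.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def detect_blob(img, sigmas=(2, 4, 8)):
    """Return (y, x, sigma) of the strongest scale-normalized LoG response.
    The -sigma**2 factor normalizes across scales and flips sign so bright
    blobs give positive peaks."""
    stack = np.array([-(s**2) * gaussian_laplace(img.astype(float), s)
                      for s in sigmas])
    s_i, y, x = np.unravel_index(np.argmax(stack), stack.shape)
    return y, x, sigmas[s_i]
```

A full detector would keep all local maxima across the (y, x, sigma) volume rather than just the global one, yielding one detection per crown with its characteristic scale.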

  9. FIONDA (Filtering Images of Niobium Disks Application): Filter application for Eddy Current Scanner data analysis

    International Nuclear Information System (INIS)

    Boffo, C.; Bauer, P.

    2005-01-01

As part of the material QC process, each Niobium disk from which a superconducting RF cavity is built must undergo an eddy current scan [1]. This process allows the discovery of embedded defects in the material that are not visible to the naked eye because they are too small or lie beneath the surface. Moreover, during the production process of SC cavities the outer layer of Nb is removed via chemical or electro-chemical etching, so it is important to evaluate the quality of the subsurface layer (on the order of 100 nm) where superconductivity will occur. The reference eddy current scanning machine is operated at DESY; at Fermilab we are using the SNS eddy current scanner on loan, courtesy of SNS. In the past year, several upgrades were implemented aiming at raising the SNS machine's performance to that of the DESY reference machine [2]. As part of this effort, an algorithm was developed that enables filtering of the scan results and thus improves the resolution of the process. The description of the algorithm and of the software used to filter the scan results is presented in this note. This filter application is a useful tool when the coupling between the signal associated with the long-range probe-distance (or sample-thickness) variation and that associated with inclusions masks the presence of defects. Moreover, instead of using indirect criteria (such as appearance on screen), the filter targets precisely the topology variations of interest. This application is listed in the FermiTools database and is freely available

  10. Filter assessment applied to analytical reconstruction for industrial third-generation tomography

    Energy Technology Data Exchange (ETDEWEB)

    Velo, Alexandre F.; Martins, Joao F.T.; Oliveira, Adriano S.; Carvalho, Diego V.S.; Faria, Fernando S.; Hamada, Margarida M.; Mesquita, Carlos H., E-mail: afvelo@usp.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

Multiphase systems are structures that contain a mixture of solids, liquids and gases inside a chemical reactor or pipes in a dynamic process. These systems are found in the chemical, food, pharmaceutical and petrochemical industries. The gamma-ray computed tomography (CT) system has been applied to visualize the distribution of multiphase systems without interrupting production. CT systems have been used to improve the design, operation and troubleshooting of industrial processes. Computed tomography for multiphase processes is being developed at several laboratories. It is well known that scanning systems require long processing times and work with a limited set of data projections and views to obtain an image. Because of this, the image quality depends on the number of projections, number of detectors, acquisition time and reconstruction time. A phantom containing air, iron and aluminum was used on the third-generation industrial tomograph with a 662 keV (137Cs) radioactive source. The Filtered Back Projection algorithm was applied to reconstruct the images. Efficient tomography depends on image quality; thus, the objective of this research was to apply different types of filters in the analytical algorithm and compare them using the root mean squared error (RMSE) figure of merit, where the filter presenting the lowest RMSE has the best quality. In this research, five types of filters were used: Ram-Lak, Shepp-Logan, Cosine, Hamming and Hann filters. As a result, all filters presented low RMSE values, meaning that the filtered reconstructions have a low standard deviation relative to the mass absorption coefficient; however, the Hann filter presented better RMSE and CNR than the others. (author)
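The five filters compared in the paper are apodizing windows applied to the ramp |f| filter in the projection-filtering step of Filtered Back Projection. A sketch of that step, using one common parameterization of each window (exact definitions vary between implementations):

```python
import numpy as np

def fbp_filter(projection, name='hann'):
    """Filter one sinogram row in the frequency domain: the ramp |f| filter
    apodized by the named window. Any unrecognized name (e.g. 'ram-lak')
    leaves the plain ramp."""
    n = len(projection)
    f = np.fft.rfftfreq(n)                        # 0 .. 0.5 cycles/sample
    ramp = 2 * f                                  # |f|, scaled so max = 1
    w = np.ones_like(f)                           # Ram-Lak: no apodization
    if name == 'shepp-logan':
        w = np.sinc(f)                            # sin(pi f) / (pi f)
    elif name == 'cosine':
        w = np.cos(np.pi * f)
    elif name == 'hamming':
        w = 0.54 + 0.46 * np.cos(2 * np.pi * f)
    elif name == 'hann':
        w = 0.5 * (1 + np.cos(2 * np.pi * f))
    return np.fft.irfft(np.fft.rfft(projection) * ramp * w, n)
```

The stronger the roll-off at high frequencies (Hann strongest, Ram-Lak none), the more the reconstruction suppresses noise at the cost of sharpness, which is the trade-off the RMSE comparison quantifies.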

  11. Filter assessment applied to analytical reconstruction for industrial third-generation tomography

    International Nuclear Information System (INIS)

    Velo, Alexandre F.; Martins, Joao F.T.; Oliveira, Adriano S.; Carvalho, Diego V.S.; Faria, Fernando S.; Hamada, Margarida M.; Mesquita, Carlos H.

    2015-01-01

Multiphase systems are structures that contain a mixture of solids, liquids and gases inside a chemical reactor or pipes in a dynamic process. These systems are found in the chemical, food, pharmaceutical and petrochemical industries. The gamma-ray computed tomography (CT) system has been applied to visualize the distribution of multiphase systems without interrupting production. CT systems have been used to improve the design, operation and troubleshooting of industrial processes. Computed tomography for multiphase processes is being developed at several laboratories. It is well known that scanning systems require long processing times and work with a limited set of data projections and views to obtain an image. Because of this, the image quality depends on the number of projections, number of detectors, acquisition time and reconstruction time. A phantom containing air, iron and aluminum was used on the third-generation industrial tomograph with a 662 keV (137Cs) radioactive source. The Filtered Back Projection algorithm was applied to reconstruct the images. Efficient tomography depends on image quality; thus, the objective of this research was to apply different types of filters in the analytical algorithm and compare them using the root mean squared error (RMSE) figure of merit, where the filter presenting the lowest RMSE has the best quality. In this research, five types of filters were used: Ram-Lak, Shepp-Logan, Cosine, Hamming and Hann filters. As a result, all filters presented low RMSE values, meaning that the filtered reconstructions have a low standard deviation relative to the mass absorption coefficient; however, the Hann filter presented better RMSE and CNR than the others. (author)

  12. Analog Electronic Filters Theory, Design and Synthesis

    CERN Document Server

    Dimopoulos, Hercules G

    2012-01-01

Filters are essential subsystems in a huge variety of electronic systems. Filter applications are innumerable; they are used for noise reduction, demodulation, signal detection, multiplexing, sampling, sound and speech processing, transmission line equalization and image processing, to name just a few. In practice, no electronic system can exist without filters. They can be found in everything from power supplies to mobile phones and hard disk drives and from loudspeakers and MP3 players to home cinema systems and broadband Internet connections. This textbook introduces basic concepts and methods and the associated mathematical and computational tools employed in electronic filter theory, synthesis and design. This book can be used as an integral part of undergraduate courses on analog electronic filters. Includes numerous solved examples, applied examples and exercises for each chapter. Includes detailed coverage of active and passive filters in an independent but correlated manner. Emphasizes real filter...

  13. Enhanced 3D PET OSEM reconstruction using inter-update Metz filtering

    International Nuclear Information System (INIS)

    Jacobson, M.; Levkovitz, R.; Ben-Tal, A.; Thielemans, K.; Spinks, T.; Belluzzo, D.; Pagani, E.; Bettinardi, V.; Gilardi, M.C.; Zverovich, A.; Mitra, G.

    2000-01-01

    We present an enhancement of the OSEM (ordered-subsets expectation maximization) algorithm for 3D PET reconstruction, which we call inter-update Metz filtered OSEM (IMF-OSEM). The IMF-OSEM algorithm incorporates filtering action into the image updating process in order to improve the quality of the reconstruction. With this technique, the multiplicative correction image - ordinarily used to update image estimates in plain OSEM - is applied to a Metz-filtered version of the image estimate at certain intervals. In addition, we present a software implementation that employs several high-speed features to accelerate reconstruction. These features include, firstly, forward- and back-projection functions that make full use of symmetry as well as a fast incremental computation technique. Secondly, the software can run in parallel mode on several processors. The parallelization approach employed yields a significant speed-up that is nearly independent of the amount of data. Together, these features lead to reasonable reconstruction times even when using large image arrays and non-axially compressed projection data. The performance of IMF-OSEM was tested on phantom data acquired on the GE Advance scanner. Our results demonstrate that an appropriate choice of Metz filter parameters can improve the contrast-noise balance of certain regions of interest relative to both plain and post-filtered OSEM, and to the GE commercial reprojection algorithm software. (author)
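    The inter-update idea can be sketched in a few lines of numpy: run multiplicative EM updates and, at certain intervals, filter the current estimate before the update is applied. A simple moving average stands in for the Metz filter here (the real Metz filter has a resolution-recovering transfer function), and the toy system matrix is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy emission model y = A x with an invented system matrix; a moving
# average stands in for the Metz filter, so this only shows *where* the
# inter-update filtering acts, not its actual transfer function.
n_pix, n_bins = 32, 48
A = rng.uniform(0.0, 1.0, size=(n_bins, n_pix))
x_true = rng.uniform(0.5, 2.0, size=n_pix)
y = A @ x_true

def smooth(x, k=3):
    """Moving-average stand-in for the inter-update filter."""
    return np.convolve(x, np.ones(k) / k, mode="same")

x = np.ones(n_pix)                      # initial estimate
sens = A.T @ np.ones(n_bins)            # sensitivity image A^T 1
for it in range(1, 51):
    if it % 10 == 0:                    # filter the estimate at intervals
        x = smooth(x)
    ratio = y / np.maximum(A @ x, 1e-12)
    x = x * (A.T @ ratio) / sens        # multiplicative EM update
```

    A true OSEM variant would additionally cycle the update over ordered subsets of the projection data; the sketch uses all bins at once (i.e., plain MLEM) for brevity.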

  14. Tungsten anode tubes with K-edge filters for mammography

    Energy Technology Data Exchange (ETDEWEB)

    Beaman, S.; Lillicrap, S.C. (Wessex Regional Medical Physics Service, Bath (UK)); Price, J.L. (Jarvis Screening Centre, Guildford (UK))

    1983-10-01

    Optimum X-ray energies for mammography have previously been calculated using the maximum signal to noise ratio (SNR) per unit dose to the breast, or the minimum exposure for constant SNR. Filters having absorption edges at appropriate energy positions have been used to modify the shape of tungsten anode spectra towards the calculated optimum. The suitability of such spectra for practical use has been assessed by comparing the film image quality and the incident breast dose obtained using a K-edge filtered tungsten anode tube with that obtained using a molybdenum anode. Image quality has been assessed by using a 'random' phantom and by comparing mammograms where one breast was radiographed using a filtered tungsten anode tube and the other using a standard molybdenum anode unit. Relative breast doses were estimated from both ionisation chamber measurements with a phantom and thermoluminescent dosimetry measurements on the breast. Film image quality assessment indicated that the filtered tungsten anode tube gave results not significantly different from those obtained with a molybdenum anode tube for a tissue thickness of about 4 cm, and better results for larger breast thicknesses. Doses could be reduced to between one-half and one-third with the filtered tungsten anode tube.

  15. Tungsten anode tubes with K-edge filters for mammography

    International Nuclear Information System (INIS)

    Beaman, S.; Lillicrap, S.C.; Price, J.L.

    1983-01-01

    Optimum X-ray energies for mammography have previously been calculated using the maximum signal to noise ratio (SNR) per unit dose to the breast, or the minimum exposure for constant SNR. Filters having absorption edges at appropriate energy positions have been used to modify the shape of tungsten anode spectra towards the calculated optimum. The suitability of such spectra for practical use has been assessed by comparing the film image quality and the incident breast dose obtained using a K-edge filtered tungsten anode tube with that obtained using a molybdenum anode. Image quality has been assessed by using a 'random' phantom and by comparing mammograms where one breast was radiographed using a filtered tungsten anode tube and the other using a standard molybdenum anode unit. Relative breast doses were estimated from both ionisation chamber measurements with a phantom and thermoluminescent dosimetry measurements on the breast. Film image quality assessment indicated that the filtered tungsten anode tube gave results not significantly different from those obtained with a molybdenum anode tube for a tissue thickness of about 4 cm, and better results for larger breast thicknesses. Doses could be reduced to between one-half and one-third with the filtered tungsten anode tube. (U.K.)

  16. Nanophotonic Image Sensors.

    Science.gov (United States)

    Chen, Qin; Hu, Xin; Wen, Long; Yu, Yan; Cumming, David R S

    2016-09-01

    The increasing miniaturization and resolution of image sensors bring challenges to conventional optical elements such as spectral filters and polarizers, the properties of which are determined mainly by the materials used, including dye polymers. Recent developments in spectral filtering and optical manipulation techniques based on nanophotonics have opened up the possibility of an alternative method to control light spectrally and spatially. By integrating these technologies into image sensors, it will become possible to achieve high compactness, improved process compatibility, robust stability and tunable functionality. In this Review, recent representative achievements on nanophotonic image sensors are presented and analyzed, including image sensors with nanophotonic color filters and polarizers, metamaterial-based THz image sensors, filter-free nanowire image sensors and nanostructure-based multispectral image sensors. This novel combination of cutting-edge photonics research and well-developed commercial products may not only lead to an important application of nanophotonics but also offer great potential for next generation image sensors beyond Moore's Law expectations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Efficient Scalable Median Filtering Using Histogram-Based Operations.

    Science.gov (United States)

    Green, Oded

    2018-05-01

    Median filtering is a smoothing technique for noise removal in images. While there are various implementations of median filtering for a single-core CPU, there are few implementations for accelerators and multi-core systems. Many parallel implementations of median filtering use a sorting algorithm to rearrange the values within a filtering window and take the median of the sorted values. While using sorting algorithms allows for simple parallel implementations, the cost of the sorting becomes prohibitive as the filtering windows grow, making such algorithms, sequential and parallel alike, inefficient. In this work, we introduce the first software parallel median filtering that is not sorting-based. The new algorithm uses efficient histogram-based operations, which reduce the computational requirements of the algorithm while also accessing the image fewer times. We show an implementation of our algorithm for both the CPU and NVIDIA's CUDA-supported graphics processing unit (GPU). The new algorithm is compared with several other leading CPU and GPU implementations. The CPU implementation scales nearly linearly on a quad-core system, and the GPU implementation is several orders of magnitude faster than the other GPU implementations for mid-size median filters. For small kernels, comparison-based approaches remain preferable as fewer operations are required. Lastly, the new algorithm is open-source and can be found in the OpenCV library.
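    The core trick — maintaining a running histogram so the median is read out without sorting each window — can be sketched for the 1-D, 8-bit case. The paper's contribution is the scalable parallel 2-D version; this is just the underlying idea, following the classic running-histogram scheme:

```python
import numpy as np

def running_median_hist(signal, window):
    """1-D running median over 8-bit values using a running histogram
    instead of sorting each window."""
    assert window % 2 == 1
    half = window // 2
    pad = np.pad(signal, half, mode="edge")
    hist = np.zeros(256, dtype=np.int64)
    for v in pad[:window]:              # histogram of the first window
        hist[v] += 1
    target = half + 1                   # rank of the median in the window
    out = np.empty_like(signal)
    for i in range(signal.size):
        # read the median out of the histogram by cumulative count
        c = 0
        for v in range(256):
            c += hist[v]
            if c >= target:
                out[i] = v
                break
        # slide the window: drop the leftmost value, add the next one
        if i + 1 < signal.size:
            hist[pad[i]] -= 1
            hist[pad[i + window]] += 1
    return out

noisy = np.array([10, 10, 200, 10, 10, 10, 9, 250, 9, 9], dtype=np.uint8)
smoothed = running_median_hist(noisy, 3)
```

    Updating the histogram costs O(1) per pixel regardless of window size, which is why histogram-based filtering wins once the window grows past the small-kernel regime where sorting is cheaper.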

  18. Creation of an iOS and Android Mobile Application for Inferior Vena Cava (IVC) Filters: A Powerful Tool to Optimize Care of Patients with IVC Filters.

    Science.gov (United States)

    Deso, Steven E; Idakoji, Ibrahim A; Muelly, Michael C; Kuo, William T

    2016-06-01

    Owing to a myriad of inferior vena cava (IVC) filter types and their potential complications, rapid and correct identification may be challenging when encountered on routine imaging. The authors aimed to develop an interactive mobile application that allows recognition of all IVC filters and related complications, to optimize the care of patients with indwelling IVC filters. The FDA Premarket Notification Database was queried from 1980 to 2014 to identify all IVC filter types in the United States. An electronic search was then performed on MEDLINE and the FDA MAUDE database to identify all reported complications associated with each device. High-resolution photos were taken of each filter type and corresponding computed tomographic and fluoroscopic images were obtained from an institutional review board-approved IVC filter registry. A wireframe and storyboard were created, and software was developed using HTML5/CSS compliant code. The software was deployed using PhoneGap (Adobe, San Jose, CA), and the prototype was tested and refined. Twenty-three IVC filter types were identified for inclusion. Safety data from FDA MAUDE and 72 relevant peer-reviewed studies were acquired, and complication rates for each filter type were highlighted in the application. Digital photos, fluoroscopic images, and CT DICOM files were seamlessly incorporated. All data were succinctly organized electronically, and the software was successfully deployed into Android (Google, Mountain View, CA) and iOS (Apple, Cupertino, CA) platforms. A powerful electronic mobile application was successfully created to allow rapid identification of all IVC filter types and related complications. This application may be used to optimize the care of patients with IVC filters.

  19. Neutron Stimulated Emission Computed Tomography (NSECT) image enhancement using a linear filter in the frequency domain

    Energy Technology Data Exchange (ETDEWEB)

    Viana, Rodrigo S.S.; Tardelli, Tiago C.; Yoriyaz, Helio, E-mail: hyoriyaz@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Jackowski, Marcel P., E-mail: mjack@ime.usp.b [University of Sao Paulo (USP), SP (Brazil). Dept. of Computer Science

    2011-07-01

    In recent years, a new technique for in vivo spectrographic imaging of stable isotopes was presented as Neutron Stimulated Emission Computed Tomography (NSECT). In this technique, a fast neutron beam stimulates stable nuclei in a sample, which emit characteristic gamma radiation. The photon energy is unique and is used to identify the emitting nuclei. The emitted gamma energy spectra can be used for reconstruction of the target tissue image and for determination of the tissue elemental composition. Due to the stochastic nature of the photon emission process in irradiated tissue, one of the most suitable algorithms for tomographic reconstruction is the Expectation-Maximization (E-M) algorithm, since its formulation simultaneously considers the probabilities of photon emission and detection. However, a disadvantage of this algorithm is the introduction of noise into the reconstructed image as the number of iterations increases. This increase can be caused either by features of the algorithm itself or by the low sampling rate of the projections used for tomographic reconstruction. In this work, a linear filter in the frequency domain was used in order to improve the quality of the reconstructed images. (author)

  20. Neutron Stimulated Emission Computed Tomography (NSECT) image enhancement using a linear filter in the frequency domain

    International Nuclear Information System (INIS)

    Viana, Rodrigo S.S.; Tardelli, Tiago C.; Yoriyaz, Helio; Jackowski, Marcel P.

    2011-01-01

    In recent years, a new technique for in vivo spectrographic imaging of stable isotopes was presented as Neutron Stimulated Emission Computed Tomography (NSECT). In this technique, a fast neutron beam stimulates stable nuclei in a sample, which emit characteristic gamma radiation. The photon energy is unique and is used to identify the emitting nuclei. The emitted gamma energy spectra can be used for reconstruction of the target tissue image and for determination of the tissue elemental composition. Due to the stochastic nature of the photon emission process in irradiated tissue, one of the most suitable algorithms for tomographic reconstruction is the Expectation-Maximization (E-M) algorithm, since its formulation simultaneously considers the probabilities of photon emission and detection. However, a disadvantage of this algorithm is the introduction of noise into the reconstructed image as the number of iterations increases. This increase can be caused either by features of the algorithm itself or by the low sampling rate of the projections used for tomographic reconstruction. In this work, a linear filter in the frequency domain was used in order to improve the quality of the reconstructed images. (author)
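    The abstract does not spell out the filter's transfer function; as a generic illustration, a linear frequency-domain low-pass (a Butterworth-type magnitude response with invented parameters) applied to a noisy 2-D image looks like this:

```python
import numpy as np

def butterworth_lowpass(image, cutoff=0.15, order=4):
    """Linear low-pass filtering in the 2-D frequency domain; the
    Butterworth response and its parameters are illustrative choices,
    not taken from the paper."""
    ny, nx = image.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    rho = np.sqrt(fx**2 + fy**2)               # radial digital frequency
    H = 1.0 / (1.0 + (rho / cutoff) ** (2 * order))
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))

rng = np.random.default_rng(1)
clean = np.zeros((64, 64))
clean[24:40, 24:40] = 1.0                      # toy "reconstruction"
noisy = clean + 0.3 * rng.standard_normal(clean.shape)
filtered = butterworth_lowpass(noisy)
```

    High-frequency reconstruction noise is attenuated at the cost of some edge blur, which is the trade-off any linear frequency-domain filter makes on E-M output.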

  1. FILTER-INDUCED BIAS IN Lyα EMITTER SURVEYS: A COMPARISON BETWEEN STANDARD AND TUNABLE FILTERS. GRAN TELESCOPIO CANARIAS PRELIMINARY RESULTS

    Energy Technology Data Exchange (ETDEWEB)

    De Diego, J. A.; De Leo, M. A. [Instituto de Astronomía, Universidad Nacional Autónoma de México Avenida Universidad 3000, Ciudad Universitaria, C.P. 04510, Distrito Federal (Mexico); Cepa, J.; Bongiovanni, A. [Instituto de Astrofísica de Canarias, E-38205 La Laguna, Tenerife (Spain); Verdugo, T. [Centro de Investigaciones de Astronomía (CIDA), Apartado Postal 264, Mérida 5101-A (Venezuela, Bolivarian Republic of); Sánchez-Portal, M. [Herschel Science Centre (HSC), European Space Agency Centre (ESAC)/INSA, Villanueva de la Cañada, Madrid (Spain); González-Serrano, J. I., E-mail: jdo@astro.unam.mx [Instituto de Física de Cantabria (CSIC-Universidad de Cantabria), E-39005 Santander (Spain)

    2013-10-01

    Lyα emitter (LAE) surveys have successfully used the excess in a narrowband filter compared to a nearby broadband image to find candidates. However, the odd spectral energy distribution (SED) of LAEs combined with the instrumental profile has important effects on the properties of the candidate samples extracted from these surveys. We investigate the effect of the bandpass width and the transmission profile of the narrowband filters used for extracting LAE candidates at redshifts z ≅ 6.5 through Monte Carlo simulations, and we present pilot observations to test the performance of tunable filters to find LAEs and other emission-line candidates. We compare the samples obtained using a narrow ideal rectangular filter, the Subaru NB921 narrowband filter, and sweeping across a wavelength range using the ultra-narrow-band tunable filters of the instrument OSIRIS, installed at the 10.4 m Gran Telescopio Canarias. We use this instrument for extracting LAE candidates from a small set of real observations. Broadband data from the Subaru, Hubble Space Telescope, and Spitzer databases were used for fitting SEDs to calculate photometric redshifts and to identify interlopers. Narrowband surveys are very efficient in finding LAEs in large sky areas, but the samples obtained are not evenly distributed in redshift along the filter bandpass, and the number of LAEs with equivalent widths <60 Å can be underestimated. These biased results do not appear in samples obtained using ultra-narrow-band tunable filters. However, the field size of tunable filters is restricted because of the variation of the effective wavelength across the image. Thus, narrowband and ultra-narrow-band surveys are complementary strategies to investigate high-redshift LAEs.

  2. FILTER-INDUCED BIAS IN Lyα EMITTER SURVEYS: A COMPARISON BETWEEN STANDARD AND TUNABLE FILTERS. GRAN TELESCOPIO CANARIAS PRELIMINARY RESULTS

    International Nuclear Information System (INIS)

    De Diego, J. A.; De Leo, M. A.; Cepa, J.; Bongiovanni, A.; Verdugo, T.; Sánchez-Portal, M.; González-Serrano, J. I.

    2013-01-01

    Lyα emitter (LAE) surveys have successfully used the excess in a narrowband filter compared to a nearby broadband image to find candidates. However, the odd spectral energy distribution (SED) of LAEs combined with the instrumental profile has important effects on the properties of the candidate samples extracted from these surveys. We investigate the effect of the bandpass width and the transmission profile of the narrowband filters used for extracting LAE candidates at redshifts z ≅ 6.5 through Monte Carlo simulations, and we present pilot observations to test the performance of tunable filters to find LAEs and other emission-line candidates. We compare the samples obtained using a narrow ideal rectangular filter, the Subaru NB921 narrowband filter, and sweeping across a wavelength range using the ultra-narrow-band tunable filters of the instrument OSIRIS, installed at the 10.4 m Gran Telescopio Canarias. We use this instrument for extracting LAE candidates from a small set of real observations. Broadband data from the Subaru, Hubble Space Telescope, and Spitzer databases were used for fitting SEDs to calculate photometric redshifts and to identify interlopers. Narrowband surveys are very efficient in finding LAEs in large sky areas, but the samples obtained are not evenly distributed in redshift along the filter bandpass, and the number of LAEs with equivalent widths <60 Å can be underestimated. These biased results do not appear in samples obtained using ultra-narrow-band tunable filters. However, the field size of tunable filters is restricted because of the variation of the effective wavelength across the image. Thus, narrowband and ultra-narrow-band surveys are complementary strategies to investigate high-redshift LAEs

  3. Efficient scattering angle filtering for Full waveform inversion

    KAUST Repository

    Alkhalifah, Tariq Ali

    2015-01-01

    Controlling the scattering angles between the state and the adjoint variables for the energy admitted into an inversion gradient or an image can help improve these functions for objectives in full waveform inversion (FWI) or seismic imaging. However, accessing the scattering angle information usually requires an axis extension that can be costly, especially in 3D. For the purpose of a scattering angle filter, I develop techniques that utilize the mapping nature (no domain extension) of the filter for constant-velocity background models to interpolate between such filtered gradients using the actual velocity. The concept has well-known roots in the application of phase-shift-plus-interpolation, commonly utilized in the downward continuation process. If the difference between the minimum and maximum velocity of the background medium is large, we obtain filtered gradients corresponding to more constant-velocity backgrounds and use linear interpolation between such velocities. The accuracy of this approximation for the Marmousi model gradient demonstrates the effectiveness of the approach.

  4. Efficient scattering angle filtering for Full waveform inversion

    KAUST Repository

    Alkhalifah, Tariq Ali

    2015-08-19

    Controlling the scattering angles between the state and the adjoint variables for the energy admitted into an inversion gradient or an image can help improve these functions for objectives in full waveform inversion (FWI) or seismic imaging. However, accessing the scattering angle information usually requires an axis extension that can be costly, especially in 3D. For the purpose of a scattering angle filter, I develop techniques that utilize the mapping nature (no domain extension) of the filter for constant-velocity background models to interpolate between such filtered gradients using the actual velocity. The concept has well-known roots in the application of phase-shift-plus-interpolation, commonly utilized in the downward continuation process. If the difference between the minimum and maximum velocity of the background medium is large, we obtain filtered gradients corresponding to more constant-velocity backgrounds and use linear interpolation between such velocities. The accuracy of this approximation for the Marmousi model gradient demonstrates the effectiveness of the approach.
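    The interpolation step can be sketched in numpy. Everything here is a stand-in: the filtered gradients and the constant-velocity picks are random/assumed, whereas in practice each gradient would come from a scattering-angle filter run with one constant background velocity:

```python
import numpy as np

# Hypothetical set-up: gradients filtered under a few constant background
# velocities v_k are interpolated per grid point using the actual velocity.
v_levels = np.array([1500.0, 2500.0, 3500.0])       # m/s, assumed picks
rng = np.random.default_rng(2)
filtered_grads = rng.standard_normal((3, 50, 50))   # one per v_k (stand-ins)
velocity = np.linspace(1500.0, 3500.0, 50)[None, :] * np.ones((50, 1))

def interpolate_gradient(v_levels, grads, velocity):
    """Per-point linear interpolation between constant-velocity gradients."""
    v = np.clip(velocity, v_levels[0], v_levels[-1])
    idx = np.clip(np.searchsorted(v_levels, v) - 1, 0, len(v_levels) - 2)
    v0, v1 = v_levels[idx], v_levels[idx + 1]
    w = (v - v0) / (v1 - v0)                        # local weight in [0, 1]
    g0 = np.take_along_axis(grads, idx[None], axis=0)[0]
    g1 = np.take_along_axis(grads, idx[None] + 1, axis=0)[0]
    return (1.0 - w) * g0 + w * g1

g = interpolate_gradient(v_levels, filtered_grads, velocity)
```

    Where the actual velocity equals one of the constant backgrounds, the interpolated gradient reduces exactly to the corresponding filtered gradient.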

  5. Full-field particle velocimetry with a photorefractive optical novelty filter

    International Nuclear Information System (INIS)

    Woerdemann, Mike; Holtmann, Frank; Denz, Cornelia

    2008-01-01

    We utilize the finite time constant of a photorefractive optical novelty filter microscope to access full-field velocity information of fluid flows on microscopic scales. In contrast to conventional methods such as particle image velocimetry and particle tracking velocimetry, not only image acquisition of the tracer particle field but also evaluation of tracer particle velocities is done all-optically by the novelty filter. We investigate the velocity-dependent parameters of two-beam-coupling based optical novelty filters and demonstrate calibration and application of a photorefractive velocimetry system. Theoretical and practical limits to the range of accessible velocities are discussed.

  6. Static Hyperspectral Fluorescence Imaging of Viscous Materials Based on a Linear Variable Filter Spectrometer

    Directory of Open Access Journals (Sweden)

    Alexander W. Koch

    2013-09-01

    This paper presents a low-cost hyperspectral measurement setup in a new application based on fluorescence detection in the visible (Vis) wavelength range. The aim of the setup is to take hyperspectral fluorescence images of viscous materials. Based on these images, fluorescent and non-fluorescent impurities in the viscous materials can be detected. For illumination of the measurement object, a narrow-band high-power light-emitting diode (LED) with a center wavelength of 370 nm was used. The low-cost acquisition unit for the imaging consists of a linear variable filter (LVF) and a complementary metal-oxide-semiconductor (CMOS) 2D sensor array. The transmission range of the LVF spans 400 nm to 700 nm. To confirm the concept, static measurements of fluorescent viscous materials with a non-fluorescent impurity were performed and analyzed. The setup resolves measurement surfaces in the micrometer range, and the minimum measurable particle size of the impurities is in the nanometer range. The recording rate depends on the exposure time of the CMOS 2D sensor array and was found to be in the microsecond range.

  7. Spatial filtering self-velocimeter for vehicle application using a CMOS linear image sensor

    Science.gov (United States)

    He, Xin; Zhou, Jian; Nie, Xiaoming; Long, Xingwu

    2015-03-01

    The idea of using a spatial filtering velocimeter (SFV) to measure the velocity of a vehicle for an inertial navigation system is put forward. The presented SFV is based on a CMOS linear image sensor with a high-speed data rate, large pixel size, and built-in timing generator. These advantages make the image sensor suitable for measuring vehicle velocity. The power spectrum of the output signal is obtained by fast Fourier transform and is corrected by a frequency-spectrum correction algorithm. The velocimeter was used to measure the velocity of a conveyor belt driven by a rotary table, with a measurement uncertainty of ~0.54%. Furthermore, it was installed on a vehicle together with a laser Doppler velocimeter (LDV) to measure self-velocity. The measurement result of the designed SFV agrees with that of the LDV, so the designed SFV is suitable for a vehicle's self-contained inertial navigation system.
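    The frequency-analysis step can be sketched as follows. The spatial-filter pitch, sample rate, and velocity are invented numbers, and the paper's actual system additionally applies a frequency-spectrum correction to refine the peak location beyond the FFT bin width:

```python
import numpy as np

# A spatial filter of pitch p turns particle motion at velocity v into a
# burst signal at f = v / p; the FFT power-spectrum peak gives v.
pitch = 40e-6            # m, spatial period of the filter (assumed)
v_true = 2.0             # m/s (assumed)
fs = 200e3               # Hz, sample rate (assumed)
t = np.arange(4096) / fs
f_signal = v_true / pitch                       # 50 kHz burst frequency
rng = np.random.default_rng(3)
signal = np.sin(2 * np.pi * f_signal * t) + 0.5 * rng.standard_normal(t.size)

# Windowed FFT power spectrum; the peak bin yields the velocity estimate
spectrum = np.abs(np.fft.rfft(signal * np.hanning(t.size))) ** 2
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
f_peak = freqs[np.argmax(spectrum[1:]) + 1]     # skip the DC bin
v_est = f_peak * pitch
```

    With 4096 samples the bin width is fs/N ≈ 49 Hz, so the uncorrected estimate is already within about 0.1% of the true velocity here; spectrum-correction (interpolating around the peak) tightens this further.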

  8. The use of wavelet filters for reducing noise in posterior fossa Computed Tomography images

    International Nuclear Information System (INIS)

    Pita-Machado, Reinado; Perez-Diaz, Marlen; Lorenzo-Ginori, Juan V.; Bravo-Pino, Rolando

    2014-01-01

    Wavelet-transform based de-noising, like wavelet shrinkage, gives good results in CT and affects the spatial resolution very little. Some applications are reconstruction methods, while others are a posteriori de-noising methods. De-noising after reconstruction is very difficult because the noise is non-stationary and has an unknown distribution. Methods that work in sinogram space do not have this problem, because they always work over a known noise distribution at that point. On the other hand, the posterior fossa in a head CT is a very complex region for physicians, because it is commonly affected by artifacts and noise that are not eliminated during the reconstruction procedure, which can lead to false-positive evaluations. The purpose of our present work is to compare different wavelet shrinkage de-noising filters applied in sinogram space to reduce noise, particularly in images of the posterior fossa within CT scans. This work describes an experimental search for the best wavelets to reduce Poisson noise in computed tomography (CT) scans. Results showed that de-noising with wavelet filters improved the quality of the posterior fossa region in terms of an increased CNR, without noticeable structural distortions.
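    The shrinkage mechanism itself is simple. A one-level Haar soft-shrinkage on a 1-D profile (a stand-in for one sinogram row; the paper searches over many wavelet families and levels) shows the idea:

```python
import numpy as np

def haar_shrink(signal, threshold):
    """One-level Haar wavelet soft-shrinkage on a 1-D signal: transform,
    soft-threshold the detail band, inverse transform. A minimal
    stand-in for the wavelet-shrinkage de-noising discussed above."""
    n = signal.size - signal.size % 2
    a = (signal[0:n:2] + signal[1:n:2]) / np.sqrt(2)   # approximation band
    d = (signal[0:n:2] - signal[1:n:2]) / np.sqrt(2)   # detail band
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)  # soft threshold
    out = np.empty(n)
    out[0::2] = (a + d) / np.sqrt(2)                   # inverse Haar
    out[1::2] = (a - d) / np.sqrt(2)
    return out

rng = np.random.default_rng(4)
clean = np.repeat([0.0, 4.0, 1.0, 3.0], 64)      # piecewise-constant profile
noisy = clean + 0.4 * rng.standard_normal(clean.size)
denoised = haar_shrink(noisy, threshold=0.6)
```

    The noise energy in the detail band is largely removed while the piecewise-constant structure, carried by the approximation band, is untouched; multi-level transforms extend the same step recursively.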

  9. Fast estimate of Hartley entropy in image sharpening

    Science.gov (United States)

    Krbcová, Zuzana; Kukal, Jaromír.; Svihlik, Jan; Fliegel, Karel

    2016-09-01

    Two classes of linear IIR filters, the Laplacian of Gaussian (LoG) and the Difference of Gaussians (DoG), are frequently used as high-pass filters for contextual vision and edge detection. They are also used for image sharpening when linearly combined with the original image; the resulting sharpening filters are radially symmetric in the spatial and frequency domains. Our approach is based on a radial approximation of the unknown optimal filter, which is designed as a weighted sum of Gaussian filters with various radii. The novel filter is designed for MRI image enhancement, where the image intensity represents anatomical structure plus additive noise. We use the gradient norm of the Hartley entropy of the whole image intensity as the measure to be maximized for the best sharpening. The entropy estimation procedure is as fast as the FFT included in the filter, and the estimate is a continuous function of the enhanced image intensities. A physically motivated heuristic is used to design the optimum sharpening filter by tuning its parameters. Our approach is compared with the Wiener filter on MRI images.
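    The sharpening scheme — add a band-pass (DoG) detail layer back to the image — can be sketched in 1-D. The two Gaussian widths and the gain below are illustrative, whereas the paper tunes such parameters by maximizing its entropy-based criterion:

```python
import numpy as np

def gauss_kernel1d(sigma, radius):
    """Truncated, normalized 1-D Gaussian kernel."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def dog_sharpen(signal, s1=1.0, s2=2.0, alpha=0.8):
    """Sharpen by adding a Difference-of-Gaussians band-pass layer back
    to the signal (1-D for brevity; s1, s2, alpha are assumed values)."""
    r = int(4 * s2)
    g1 = np.convolve(signal, gauss_kernel1d(s1, r), mode="same")
    g2 = np.convolve(signal, gauss_kernel1d(s2, r), mode="same")
    return signal + alpha * (g1 - g2)    # boost the band-pass detail

edge = np.concatenate([np.zeros(32), np.ones(32)])
sharp = dog_sharpen(edge)
```

    On a step edge the result shows the classic overshoot/undershoot pair that makes the edge appear crisper, while flat regions far from the edge are left unchanged.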

  10. Relationship between pre-reconstruction filter and accuracy of registration software based on mutual-information maximization. A study of SPECT-MR brain phantom images

    International Nuclear Information System (INIS)

    Mito, Suzuko; Magota, Keiichi; Arai, Hiroshi; Omote, Hidehiko; Katsuura, Hidenori; Suzuki, Kotaro; Kubo, Naoki

    2005-01-01

    Image registration is becoming an increasingly important tool in SPECT. Recently, software based on mutual-information maximization has been developed for automatic multimodality image registration, and its accuracy is important for its application. During SPECT reconstruction, the projection data are pre-filtered in order to reduce Poisson noise, commonly with a Butterworth filter. We have investigated the dependence of the absolute accuracy of MRI-SPECT registration on the cut-off frequencies of a range of Butterworth filters. This study used a 3D Hoffman phantom (Model No. 9000, Data-spectrum Co.). For the reference volume, a magnetization-prepared rapid gradient-echo (MP-RAGE) sequence was performed on a Vision MRI (Siemens, 1.5 T). For the floating volumes, SPECT data of a phantom containing 99m Tc at 85 kBq/mL were acquired with a GCA-9300 (Toshiba Medical Systems Co.). During SPECT, the orbito-meatal (OM) line of the phantom was tilted by 5 deg and 15 deg to mimic the incline of a patient's head. The projection data were pre-filtered with Butterworth filters (cut-off frequency varying from 0.24 to 0.94 cycles/cm in steps of 0.02, order 8). The automated registrations were performed using iNRT β version software (Nihon Medi. Co.) and the rotation angles of SPECT for registration were noted. In this study, the registrations of all SPECT data were successful. Graphs of registration rotation angle against cut-off frequency were scattered and showed no correlation between the two. The registration rotation angles ranged with changing cut-off frequency from -0.4 deg to +3.8 deg at a 5 deg tilt and from +12.7 deg to +19.6 deg at a 15 deg tilt, and showed variation even for slight differences in cut-off frequency. The absolute errors were a few degrees for any cut-off frequency. Regardless of the cut-off frequency, automatic registration using this software provides similar results. (author)
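    The pre-filtering step under study can be sketched as follows: an order-8 Butterworth low-pass applied to one projection profile, with the cut-off converted from cycles/cm to digital frequency. The detector sampling assumed below is illustrative, not taken from the study:

```python
import numpy as np

# Converting a cut-off given in cycles/cm to digital frequency: with an
# assumed detector sampling of 0.2 cm per projection bin (illustrative),
# 0.44 cycles/cm corresponds to 0.088 cycles/sample.
pixel_cm = 0.2
cutoff_digital = 0.44 * pixel_cm        # cycles/sample

def butterworth(profile, cutoff, order=8):
    """Order-8 Butterworth low-pass of one projection profile,
    applied in the frequency domain."""
    f = np.fft.rfftfreq(profile.size)   # cycles/sample
    H = 1.0 / np.sqrt(1.0 + (f / cutoff) ** (2 * order))
    return np.fft.irfft(np.fft.rfft(profile) * H, n=profile.size)

rng = np.random.default_rng(5)
profile = np.exp(-((np.arange(128) - 64.0) / 20.0) ** 2)   # smooth bump
noisy = profile + 0.1 * rng.standard_normal(128)
smoothed = butterworth(noisy, cutoff_digital)
```

    Sweeping the cut-off, as the study does, trades residual Poisson noise against loss of high-frequency anatomical detail in each projection before reconstruction.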

  11. FDTD parallel computational analysis of grid-type scattering filter characteristics for medical X-ray image diagnosis

    International Nuclear Information System (INIS)

    Takahashi, Koichi; Miyazaki, Yasumitsu; Goto, Nobuo

    2007-01-01

    X-ray diagnosis depends on the intensity of transmitted and scattered waves in X-ray propagation through biomedical media. X-rays are scattered and absorbed by tissues such as fat, bone and internal organs. However, image processing for medical diagnosis based on the scattering and absorption characteristics of these tissues in the X-ray spectrum has received comparatively little study. To obtain precise information about tissues in a living body, accurate characteristics of scattering and absorption are required. In this paper, X-ray scattering and absorption in biomedical media are studied using the 2-dimensional finite-difference time-domain (FDTD) method. In the FDTD method, the size of the analysis space is severely limited by the performance of available computers. To overcome this limitation, a parallel and successive FDTD method is introduced. As a result of computer simulation, the amplitudes of the transmitted and scattered waves are presented numerically. The fundamental filtering characteristics of the grid-type filter are also shown numerically. (author)
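    FDTD itself reduces to a leap-frog update of staggered field arrays. A minimal 1-D vacuum example (normalized units, invented grid sizes — a biomedical X-ray model would add material and dispersion terms, and the paper splits such a loop across processors):

```python
import numpy as np

# Minimal 1-D FDTD leap-frog loop in vacuum, normalized units.
n, steps = 400, 300
ez = np.zeros(n)         # electric field on integer grid points
hy = np.zeros(n - 1)     # magnetic field, staggered half a cell
c = 0.5                  # Courant number (stable in 1-D for c <= 1)

for t in range(steps):
    hy += c * (ez[1:] - ez[:-1])          # H update (half step)
    ez[1:-1] += c * (hy[1:] - hy[:-1])    # E update (fixed boundaries)
    # soft Gaussian source injected at one grid point
    ez[n // 4] += np.exp(-((t - 30.0) / 10.0) ** 2)
```

    Parallelizing this scheme amounts to partitioning the grid and exchanging one layer of boundary field values between neighboring subdomains each step, which is what makes large analysis spaces tractable.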

  12. Multisource least-squares reverse-time migration with structure-oriented filtering

    Science.gov (United States)

    Fan, Jing-Wen; Li, Zhen-Chun; Zhang, Kai; Zhang, Min; Liu, Xue-Tong

    2016-09-01

    Simultaneous-source acquisition, in which several sources excite seismic data at once, can significantly improve data-collection efficiency. However, direct imaging of simultaneous-source (blended) data may introduce crosstalk noise and degrade the imaging quality. To address this problem, we introduce a structure-oriented filtering operator as a preconditioner into multisource least-squares reverse-time migration (LSRTM). The structure-oriented filtering operator is a nonstationary filter along structural trends that suppresses crosstalk noise while preserving structural information. The proposed method uses the conjugate-gradient method to minimize the mismatch between predicted and observed data, while effectively attenuating the interference noise caused by exciting several sources simultaneously. Numerical experiments on synthetic data suggest that the proposed method suppresses the crosstalk noise and produces highly accurate images.
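    The preconditioned inversion loop can be sketched with a generic smoother standing in for the structure-oriented filter. Everything here is a toy stand-in — the real operator smooths along reflector dips estimated from the image, and the real forward operator is a migration/demigration pair:

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy linear problem d = L m; a blended moving average stands in for the
# structure-oriented filter used to precondition conjugate gradients.
n_model, n_data = 60, 90
L = rng.standard_normal((n_data, n_model)) / np.sqrt(n_data)
m_true = np.sin(np.linspace(0.0, 3.0 * np.pi, n_model))
d = L @ m_true

def precondition(r, k=5):
    """Blend the residual with its moving average; the 50/50 blend keeps
    the operator symmetric positive definite, as CG requires."""
    return 0.5 * r + 0.5 * np.convolve(r, np.ones(k) / k, mode="same")

# Preconditioned conjugate gradients on the normal equations.
m = np.zeros(n_model)
r = L.T @ d                       # residual of the normal equations at m=0
z = precondition(r)
p = z.copy()
for _ in range(40):
    Ap = L.T @ (L @ p)
    alpha = (r @ z) / (p @ Ap)
    m += alpha * p
    r_new = r - alpha * Ap
    z_new = precondition(r_new)
    beta = (r_new @ z_new) / (r @ z)
    p = z_new + beta * p
    r, z = r_new, z_new
```

    The preconditioner biases the search directions toward smooth (structurally consistent) model updates, which is the role the structure-oriented filter plays against crosstalk noise in the actual LSRTM.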

  13. Matching rendered and real world images by digital image processing

    Science.gov (United States)

    Mitjà, Carles; Bover, Toni; Bigas, Miquel; Escofet, Jaume

    2010-05-01

    Recent advances in computer-generated imagery (CGI) have been used in commercial and industrial photography, providing broad scope in product advertising. Mixing real-world images with those rendered from virtual-space software shows a more or less visible mismatch between the corresponding image qualities. Rendered images are produced by software whose quality is limited only by the output resolution. Real-world images are taken with cameras subject to image degradation factors such as residual lens aberrations, diffraction, the sensor's low-pass anti-aliasing filter, color-pattern demosaicing, etc. The effect of all these degradation factors can be characterized by the system point spread function (PSF). Because the image is the convolution of the object with the system PSF, its characterization shows the amount of degradation added to any picture taken. This work explores the use of image processing to degrade rendered images according to the parameters indicated by the real system PSF, attempting to match the virtual and real-world image qualities. The system MTF is determined by the slanted-edge method both in laboratory conditions and in the real picture environment in order to compare the influence of the working conditions on device performance; an approximation to the system PSF is derived from the two measurements. The rendered images are filtered through a Gaussian filter obtained from the taking system's PSF. Results with and without filtering are shown and compared by measuring the contrast achieved in different regions of the final image.
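    The degradation step amounts to convolving the rendered image with the measured PSF. A numpy sketch with a separable Gaussian stand-in (the σ below is invented; in practice it would be fitted to the slanted-edge MTF measurement):

```python
import numpy as np

def gaussian_blur(image, sigma):
    """Separable Gaussian blur approximating the camera's system PSF
    (sigma is a stand-in here; it would be derived from the measured
    slanted-edge MTF of the real camera)."""
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    k /= k.sum()
    # filter rows, then columns (Gaussian PSFs are separable)
    blurred = np.apply_along_axis(
        lambda r: np.convolve(r, k, mode="same"), 1, image)
    blurred = np.apply_along_axis(
        lambda c: np.convolve(c, k, mode="same"), 0, blurred)
    return blurred

render = np.zeros((32, 32))
render[:, 16:] = 1.0                 # perfectly crisp rendered edge
matched = gaussian_blur(render, sigma=1.2)
```

    The rendered step edge acquires the gradual transition a real camera would produce, so the composited CGI no longer looks conspicuously sharper than the photographed scene.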

  14. Computational multispectral video imaging [Invited].

    Science.gov (United States)

    Wang, Peng; Menon, Rajesh

    2018-01-01

    Multispectral imagers reveal information unperceivable to humans and conventional cameras. Here, we demonstrate a compact single-shot multispectral video-imaging camera made by placing a micro-structured diffractive filter in close proximity to the image sensor. The diffractive filter converts spectral information into a spatial code on the sensor pixels. Following a calibration step, this code can be inverted via regularization-based linear algebra to compute the multispectral image. We experimentally demonstrate a spectral resolution of 9.6 nm within the visible band (430-718 nm). We further show that the spatial resolution is enhanced by over 30% compared with the case without the diffractive filter. We also demonstrate Vis-IR imaging with the same sensor. Because no absorptive color filters are used, sensitivity is preserved as well. Finally, the diffractive filters can be easily manufactured using optical lithography and replication techniques.
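The calibration-and-inversion step can be illustrated with Tikhonov-regularised least squares: sensor readouts are modelled as a calibration matrix times the spectrum, and the spectrum is recovered by a regularised solve. The calibration matrix below is random, standing in for the measured diffractive response (an assumption for demonstration only):

```python
import numpy as np

rng = np.random.default_rng(2)
n_pix, n_bands = 120, 30                   # sensor pixels and spectral bands (toy sizes)
A = rng.normal(size=(n_pix, n_bands))      # hypothetical per-pixel spectral responses
s_true = np.exp(-0.5 * ((np.arange(n_bands) - 12) / 4.0) ** 2)  # smooth spectrum
y = A @ s_true + rng.normal(0.0, 1e-3, n_pix)   # coded, slightly noisy readout

# Tikhonov-regularised least squares: s = (A^T A + lam*I)^(-1) A^T y
lam = 1e-3
s_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_bands), A.T @ y)
```

The regularisation weight trades noise amplification against bias; in the real system it would be tuned against the calibrated response rather than fixed by hand.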

  15. Primer uticaja filtriranja slike u sistemima za praćenje ciljeva primenom termovizije / An example of image filtering in target tracking systems with thermal imagery

    Directory of Open Access Journals (Sweden)

    Zvonko M. Radosavljević

    2003-07-01

    Full Text Available A case of image filtering in airborne target detection and tracking systems using thermal imaging is described in this paper. Two image-filtering methods are given: the first uses low-pass convolution filtering, and the second uses an averaging filter based on the mean gray-level value. These filters are applied in tracking systems based on infrared (IR) sensors. The filtering threshold level is selected from statistical features of the image. A further important step in the tracking process is the determination of the tracking window, whose dimensions may be fixed or adaptive. A false estimate of a target's presence in the window can result from background noise, the preamplifier, the detector, etc. Filtering is a necessary step in these systems and a significant factor in increasing tracking speed and accuracy.
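The two steps — local averaging and a threshold derived from image statistics — can be sketched as follows. The 3x3 window, the 4-sigma threshold, and the synthetic hot target are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def mean_filter(img, size=3):
    """Low-pass filtering by local averaging (the paper's second method)."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (size * size)

rng = np.random.default_rng(3)
background = rng.normal(20.0, 2.0, (48, 48))   # noisy IR background
background[20:24, 30:34] += 25.0               # hot target blob
smoothed = mean_filter(background)

# Threshold chosen from image statistics, as the paper suggests.
threshold = smoothed.mean() + 4.0 * smoothed.std()
detections = smoothed > threshold
```

Averaging first reduces the single-pixel noise floor, so the statistical threshold can sit closer to the background level without raising the false-alarm rate.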

  16. Comparison analysis between filtered back projection and algebraic reconstruction technique on microwave imaging

    Science.gov (United States)

    Ramadhan, Rifqi; Prabowo, Rian Gilang; Aprilliyani, Ria; Basari

    2018-02-01

    The number of victims of acute cancers and tumors grows each year, and cancer is one of the leading causes of death in the world. Cancer or tumor cells grow abnormally, taking over and damaging the surrounding tissues. In its early stages, a cancer or tumor has no definite symptoms and can attack tissues deep inside the body, so it cannot be identified by visual human observation. Therefore, an early detection system that is cheap, quick, simple, and portable is essential to anticipate the further development of a cancer or tumor. Among the available modalities, microwave imaging is considered a cheaper, simpler, and more portable method. There are at least two simple image reconstruction algorithms, Filtered Back Projection (FBP) and the Algebraic Reconstruction Technique (ART), which have been adopted in some common modalities. In this paper, both algorithms are compared by reconstructing the image of an artificial tissue model (phantom) that has two different dielectric distributions. We address two performance comparisons, namely qualitative and quantitative analysis. Qualitative analysis covers the smoothness of the image and the success in distinguishing dielectric differences by observing the image with the human eye. Quantitative analysis includes histogram, Structural Similarity Index (SSIM), Mean Squared Error (MSE), and Peak Signal-to-Noise Ratio (PSNR) calculations. As a result, the quantitative parameters of FBP show better values than those of ART. However, ART is more capable of distinguishing two different dielectric values than FBP, owing to its higher contrast and wider grayscale distribution.
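Two of the quantitative measures used in the comparison, MSE and PSNR, are short enough to state directly. A numpy sketch on a toy two-dielectric phantom (the phantom and noise level are illustrative, not the paper's data):

```python
import numpy as np

def mse(ref, img):
    """Mean squared error between a reference and a reconstruction."""
    return float(np.mean((ref.astype(float) - img.astype(float)) ** 2))

def psnr(ref, img, peak=255.0):
    """Peak signal-to-noise ratio in dB, relative to the given peak value."""
    m = mse(ref, img)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)

# Toy phantom with two "dielectric" levels, and a noisy reconstruction of it.
phantom = np.zeros((32, 32))
phantom[8:24, 8:24] = 200.0
recon = phantom + np.random.default_rng(4).normal(0.0, 5.0, phantom.shape)
quality_db = psnr(phantom, recon)
```

Higher PSNR means a reconstruction numerically closer to the reference, which is why FBP can score better even when ART separates the two dielectric levels more visibly.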

  17. Evaluation of the effects of an arm artifact reduction filter in computed tomography

    International Nuclear Information System (INIS)

    Nozaki, Fumie; Ohno, Hajime; Hirako, Tetsuya

    2002-01-01

    While performing CT during arterial portography (CTAP) or CT arteriography (CTA), we have instructed patients to raise their arms in order to reduce streak artifacts. However, the repetitive raising and lowering of the arms has made it difficult to keep a clean zone as well as to maintain the position of the target organ. An arm artifact reduction (AAR) filter developed by GE Yokogawa Medical Systems has been reported to be useful for reducing streak artifacts in CTAP and CTA. The purpose of this study was to evaluate the effectiveness of the AAR filter in terms of artifact reduction ratio, CT value, standard deviation, and spatial resolution. The AAR filter reduced streak artifacts on images by 15-22% compared with images obtained without the filter, and limited the deterioration of spatial resolution to within 1%. Moreover, CT values in examinations using the AAR filter showed no significant change compared with those in examinations without the filter. It is concluded that the use of an AAR filter reduces the burden on the patient and increases the accuracy and flexibility of CT examinations while minimizing the reduction in image quality in CTAP and CTA. (author)

  18. Person-Specific Face Detection in a Scene with Optimum Composite Filtering and Colour-Shape Information

    Directory of Open Access Journals (Sweden)

    Seokwon Yeom

    2013-01-01

    Full Text Available Face detection and recognition have wide applications in robot vision and intelligent surveillance. However, face identification at a distance is very challenging because long-distance images are often degraded by low resolution, blurring and noise. This paper introduces a person-specific face detection method that uses a nonlinear optimum composite filter and subsequent verification stages. The filter's optimum criterion minimizes the sum of the output energy generated by the input noise and the input image. The composite filter is trained with several training images under long-distance modelling. The candidate facial regions are provided by the filter's outputs of the input scene. False alarms are eliminated by subsequent testing stages, which comprise skin colour and edge mask filtering tests. In the experiments, images captured by a webcam and a CCTV camera are processed to show the effectiveness of the person-specific face detection system at a long distance.
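The detection step of such a system reduces to locating correlation peaks of a trained filter over the input scene. A toy FFT cross-correlation sketch, in which a random patch stands in for the trained composite filter (an assumption — the paper's filter is additionally optimised against output noise energy, which is omitted here):

```python
import numpy as np

rng = np.random.default_rng(6)
face = rng.random((8, 8)) - 0.5         # roughly zero-mean stand-in "face" template
scene = np.zeros((64, 64))
scene[20:28, 33:41] = face              # target embedded at row 20, col 33

# Cross-correlation via FFT; peaks mark candidate face regions.
corr = np.real(np.fft.ifft2(np.fft.fft2(scene) *
                            np.conj(np.fft.fft2(face, s=scene.shape))))
peak = np.unravel_index(np.argmax(corr), corr.shape)   # candidate location
```

In the full system, each such candidate would then pass through the skin-colour and edge-mask verification stages to remove false alarms.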

  19. Characterizing high energy spectra of NIF ignition Hohlraums using a differentially filtered high energy multipinhole x-ray imager.

    Science.gov (United States)

    Park, Hye-Sook; Dewald, E D; Glenzer, S; Kalantar, D H; Kilkenny, J D; MacGowan, B J; Maddox, B R; Milovich, J L; Prasad, R R; Remington, B A; Robey, H F; Thomas, C A

    2010-10-01

    Understanding hot electron distributions generated inside Hohlraums is important to the national ignition campaign for controlling implosion symmetry and sources of preheat. While direct imaging of hot electrons is difficult, their spatial distribution and spectrum can be deduced by detecting high energy x-rays generated as they interact with target materials. We used an array of 18 pinholes with four independent filter combinations to image entire Hohlraums with a magnification of 0.87× during the Hohlraum energetics campaign on NIF. Comparing our results with Hohlraum simulations indicates that the characteristic 10-40 keV hot electrons are mainly generated from backscattered laser-plasma interactions rather than from Hohlraum hydrodynamics.

  20. A robust technique based on VLM and Frangi filter for retinal vessel extraction and denoising.

    Directory of Open Access Journals (Sweden)

    Khan Bahadar Khan

    Full Text Available The exploration of retinal vessel structure is critically important on account of numerous diseases, including stroke, Diabetic Retinopathy (DR) and coronary heart disease, which can damage the retinal vessel structure. The retinal vascular network is very hard to extract due to its spreading and diminishing geometry and the contrast variation within an image. The proposed technique consists of unique parallel processes for denoising and extraction of blood vessels in retinal images. In the preprocessing section, adaptive histogram equalization enhances the dissimilarity between the vessels and the background, and morphological top-hat filters are employed to eliminate the macula, optic disc, etc. To remove local noise, the difference image is computed from the top-hat filtered image and the high-boost filtered image. The Frangi filter is applied at multiple scales for the enhancement of vessels of diverse widths. Segmentation is performed by using improved Otsu thresholding on the high-boost filtered image and Frangi's enhanced image, separately. In the postprocessing steps, a Vessel Location Map (VLM) is extracted by using raster-to-vector transformation. Postprocessing steps are employed in a novel way to reject misclassified vessel pixels. The final segmented image is obtained by using a pixel-by-pixel AND operation between the VLM and the Frangi output image. The method has been rigorously analyzed on the STARE, DRIVE and HRF datasets.

  1. Q-Least Squares Reverse Time Migration with Viscoacoustic Deblurring Filters

    KAUST Repository

    Chen, Yuqing; Dutta, Gaurav; Dai, Wei; Schuster, Gerard T.

    2017-01-01

    Viscoacoustic least-squares reverse time migration (Q-LSRTM) linearly inverts for the subsurface reflectivity model from lossy data. Compared to the conventional migration methods, it can compensate for the amplitude loss in the migrated images because of the strong subsurface attenuation and can produce reflectors that are accurately positioned in depth. However, the adjoint Q propagators used for backward propagating the residual data are also attenuative. Thus, the inverted images from Q-LSRTM are often observed to have lower resolution when compared to the benchmark acoustic LSRTM images from acoustic data. To increase the resolution and accelerate the convergence of Q-LSRTM, we propose using viscoacoustic deblurring filters as a preconditioner for Q-LSRTM. These filters can be estimated by matching a simulated migration image to its reference reflectivity model. Numerical tests on synthetic and field data demonstrate that Q-LSRTM combined with viscoacoustic deblurring filters can produce images with higher resolution and more balanced amplitudes than images from acoustic RTM, acoustic LSRTM and Q-LSRTM when there is strong attenuation in the background medium. The proposed preconditioning method is also shown to improve the convergence rate of Q-LSRTM by more than 30 percent in some cases and significantly compensate for the lossy artifacts in RTM images.

  2. Q-Least Squares Reverse Time Migration with Viscoacoustic Deblurring Filters

    KAUST Repository

    Chen, Yuqing

    2017-08-02

    Viscoacoustic least-squares reverse time migration (Q-LSRTM) linearly inverts for the subsurface reflectivity model from lossy data. Compared to the conventional migration methods, it can compensate for the amplitude loss in the migrated images because of the strong subsurface attenuation and can produce reflectors that are accurately positioned in depth. However, the adjoint Q propagators used for backward propagating the residual data are also attenuative. Thus, the inverted images from Q-LSRTM are often observed to have lower resolution when compared to the benchmark acoustic LSRTM images from acoustic data. To increase the resolution and accelerate the convergence of Q-LSRTM, we propose using viscoacoustic deblurring filters as a preconditioner for Q-LSRTM. These filters can be estimated by matching a simulated migration image to its reference reflectivity model. Numerical tests on synthetic and field data demonstrate that Q-LSRTM combined with viscoacoustic deblurring filters can produce images with higher resolution and more balanced amplitudes than images from acoustic RTM, acoustic LSRTM and Q-LSRTM when there is strong attenuation in the background medium. The proposed preconditioning method is also shown to improve the convergence rate of Q-LSRTM by more than 30 percent in some cases and significantly compensate for the lossy artifacts in RTM images.

  3. Supervised Filter Learning for Representation Based Face Recognition.

    Directory of Open Access Journals (Sweden)

    Chao Bi

    Full Text Available Representation based classification methods, such as Sparse Representation Classification (SRC) and Linear Regression Classification (LRC), have been developed successfully for the face recognition problem. However, most of these methods use the original face images without any preprocessing for recognition. Thus, their performance may be affected by problematic factors in the face images, such as illumination and expression variations. In order to overcome this limitation, a novel supervised filter learning algorithm is proposed for representation based face recognition in this paper. The underlying idea of our algorithm is to learn a filter such that the within-class representation residuals of the faces' Local Binary Pattern (LBP) features are minimized and the between-class representation residuals of the LBP features are maximized. Therefore, the LBP features of filtered face images are more discriminative for representation based classifiers. Furthermore, we also extend our algorithm to the heterogeneous face recognition problem. Extensive experiments are carried out on five databases, and the experimental results verify the efficacy of the proposed algorithm.
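The learned filter is applied before LBP feature extraction; LBP itself is attractive here because it depends only on local intensity ordering. A numpy sketch of basic 3x3 LBP codes, checking their invariance to a monotonic illumination change (the learning step itself is not reproduced):

```python
import numpy as np

def lbp_3x3(img):
    """Basic 3x3 Local Binary Pattern codes for the interior pixels."""
    c = img[1:-1, 1:-1]
    # Eight neighbours, clockwise from the top-left corner.
    neighbours = [img[:-2, :-2], img[:-2, 1:-1], img[:-2, 2:],
                  img[1:-1, 2:], img[2:, 2:], img[2:, 1:-1],
                  img[2:, :-2], img[1:-1, :-2]]
    code = np.zeros(c.shape, dtype=int)
    for bit, n in enumerate(neighbours):
        code |= (n >= c).astype(int) << bit   # one bit per neighbour comparison
    return code

rng = np.random.default_rng(10)
face = rng.random((16, 16))                   # stand-in face patch
codes = lbp_3x3(face)
# LBP depends only on local intensity ordering, so it is unchanged by any
# monotonically increasing intensity transform (e.g. illumination gain/offset).
codes_relit = lbp_3x3(2.0 * face + 10.0)
```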

  4. High Performance, Three-Dimensional Bilateral Filtering

    International Nuclear Information System (INIS)

    Bethel, E. Wes

    2008-01-01

    Image smoothing is a fundamental operation in computer vision and image processing. This work has two main thrusts: (1) implementation of a bilateral filter suitable for use in smoothing, or denoising, 3D volumetric data; (2) implementation of the 3D bilateral filter in three different parallelization models, along with parallel performance studies on two modern HPC architectures. Our bilateral filter formulation is based upon the work of Tomasi [11], but extended to 3D for use on volumetric data. Our three parallel implementations use POSIX threads, the Message Passing Interface (MPI), and Unified Parallel C (UPC), a Partitioned Global Address Space (PGAS) language. Our parallel performance studies, which were conducted on a Cray XT4 supercomputer and a quad-socket, quad-core Opteron workstation, show our algorithm to have near-perfect scalability up to 120 processors. Parallel algorithms, such as the one we present here, will have an increasingly important role for use in production visual analysis systems as the underlying computational platforms transition from single- to multi-core architectures in the future.
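A serial, brute-force numpy sketch of the 3D bilateral filter (edge-preserving smoothing with a Gaussian in both space and intensity; the paper's contribution — the parallelisation — is omitted, and all parameters below are illustrative):

```python
import numpy as np

def bilateral_3d(vol, sigma_s=1.0, sigma_r=0.15, radius=1):
    """Brute-force 3D bilateral filter: Gaussian in space and in intensity."""
    pad = np.pad(vol, radius, mode="edge")
    num = np.zeros_like(vol, dtype=float)
    den = np.zeros_like(vol, dtype=float)
    sz, sy, sx = vol.shape
    for dz in range(-radius, radius + 1):
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                shifted = pad[radius + dz:radius + dz + sz,
                              radius + dy:radius + dy + sy,
                              radius + dx:radius + dx + sx]
                # Spatial weight x range (intensity-difference) weight.
                w = (np.exp(-(dz**2 + dy**2 + dx**2) / (2 * sigma_s**2))
                     * np.exp(-(shifted - vol) ** 2 / (2 * sigma_r**2)))
                num += w * shifted
                den += w
    return num / den

rng = np.random.default_rng(11)
vol = np.zeros((8, 16, 16))
vol[:, :, 8:] = 1.0                          # two flat regions, one sharp edge
noisy = vol + rng.normal(0.0, 0.05, vol.shape)
denoised = bilateral_3d(noisy)
```

The range weight collapses to zero across the large intensity step, so noise is smoothed within each region while the edge between them is preserved — the property that makes the filter attractive for volumetric denoising.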

  5. High Performance, Three-Dimensional Bilateral Filtering

    Energy Technology Data Exchange (ETDEWEB)

    Bethel, E. Wes

    2008-06-05

    Image smoothing is a fundamental operation in computer vision and image processing. This work has two main thrusts: (1) implementation of a bilateral filter suitable for use in smoothing, or denoising, 3D volumetric data; (2) implementation of the 3D bilateral filter in three different parallelization models, along with parallel performance studies on two modern HPC architectures. Our bilateral filter formulation is based upon the work of Tomasi [11], but extended to 3D for use on volumetric data. Our three parallel implementations use POSIX threads, the Message Passing Interface (MPI), and Unified Parallel C (UPC), a Partitioned Global Address Space (PGAS) language. Our parallel performance studies, which were conducted on a Cray XT4 supercomputer and a quad-socket, quad-core Opteron workstation, show our algorithm to have near-perfect scalability up to 120 processors. Parallel algorithms, such as the one we present here, will have an increasingly important role for use in production visual analysis systems as the underlying computational platforms transition from single- to multi-core architectures in the future.

  6. Chromatic aberrations correction for imaging spectrometer based on acousto-optic tunable filter with two transducers.

    Science.gov (United States)

    Zhao, Huijie; Wang, Ziye; Jia, Guorui; Zhang, Ying; Xu, Zefu

    2017-10-02

    The acousto-optic tunable filter (AOTF) with wide wavelength range and high spectral resolution has a long crystal and two transducers. A longer crystal leads to a larger chromatic focal shift, and the double-transducer arrangement induces an angular jump in the diffracted beam; these increase the difficulty of longitudinal and lateral chromatic aberration correction, respectively. In this study, the two chromatic aberrations are analyzed quantitatively based on an AOTF optical model, and a novel catadioptric dual-path configuration is proposed to correct both chromatic aberrations. The test results demonstrate the effectiveness of the optical configuration for this type of AOTF-based imaging spectrometer.

  7. Filter Halaman Web Pornografi Menggunakan Kecocokan Kata dan Deteksi Warna Kulit / Pornographic Web Page Filtering Using Word Matching and Skin Color Detection

    Directory of Open Access Journals (Sweden)

    Yusron Rijal

    2011-05-01

    Full Text Available This paper presents an effort to detect pornographic webpages. It has been stated that a positive relationship exists between the percentage of human skin color in an image and the nature of the image itself (Jones et al., 1998). Based on this observation, rather than using the traditional method of text filtering, this paper proposes a new approach to detect pornographic images using skin color detection. Skin color detection is performed using the RGB, HSI, and YCbCr color models. Using the algorithm stated by Ap-apid (Ap-apid, 2005), the system classifies images as nude or not nude. If one or more nude images are found, the system blocks the webpage. Keywords: Webpage Filtering, Image Processing, Pornography, Nudity, Skin Color, Nude Images
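The skin-pixel-percentage idea can be sketched with one widely cited RGB skin rule. The rule constants and the 40% block threshold below are illustrative assumptions, not the paper's exact criteria (which combine RGB, HSI, and YCbCr models):

```python
import numpy as np

def skin_mask_rgb(img):
    """Classic RGB skin-colour rule (one of several models the paper combines)."""
    r = img[..., 0].astype(int)
    g = img[..., 1].astype(int)
    b = img[..., 2].astype(int)
    spread = img.max(-1).astype(int) - img.min(-1).astype(int)
    return ((r > 95) & (g > 40) & (b > 20) & (spread > 15)
            & (np.abs(r - g) > 15) & (r > g) & (r > b))

def skin_ratio(img):
    """Fraction of pixels classified as skin."""
    return float(skin_mask_rgb(img).mean())

# Toy frames: one dominated by a skin-like colour, one by blue.
skin_img = np.full((32, 32, 3), (210, 150, 120), dtype=np.uint8)
blue_img = np.full((32, 32, 3), (30, 60, 200), dtype=np.uint8)
THRESHOLD = 0.4          # hypothetical block threshold on the skin-pixel ratio
flag = skin_ratio(skin_img) > THRESHOLD
```

A page-level filter would aggregate such per-image decisions (together with word matching) before deciding to block the page.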

  8. Phase-correcting non-local means filtering for diffusion-weighted imaging of the spinal cord.

    Science.gov (United States)

    Kafali, Sevgi Gokce; Çukur, Tolga; Saritas, Emine Ulku

    2018-02-09

    DWI suffers from low SNR compared with anatomical MRI. To maintain reasonable SNR at relatively high spatial resolution, multiple acquisitions must be averaged. However, subject motion or involuntary physiological motion during the diffusion-sensitizing gradients causes phase offsets among the acquisitions. When the motion is localized to a small region, these phase offsets become particularly problematic. Complex averaging of the acquisitions leads to signal cancellations from these phase offsets, whereas magnitude averaging results in noise amplification. Here, we propose an improved reconstruction for multi-acquisition DWI that effectively corrects for phase offsets while reducing noise. Each acquisition is processed with a refocusing reconstruction for global phase correction and a partial k-space reconstruction via projection-onto-convex-sets (POCS). The proposed reconstruction then employs a new phase-correcting non-local means (PC-NLM) filter. PC-NLM is performed on the complex-valued outputs of the POCS algorithm aggregated across acquisitions. The PC-NLM filter leverages the shared structure among multiple acquisitions to simultaneously alleviate nuisance factors including phase offsets and noise. Extensive simulations and in vivo DWI experiments on the cervical spinal cord are presented. The results demonstrate that the proposed reconstruction improves image quality by mitigating signal loss due to phase offsets and by reducing noise. Importantly, these improvements are achieved while preserving the accuracy of apparent diffusion coefficient maps. An improved reconstruction incorporating a PC-NLM filter for multi-acquisition DWI is presented. This reconstruction can be particularly beneficial for high-resolution or high-b-value DWI acquisitions that suffer from low SNR and phase offsets caused by local motion. © 2018 International Society for Magnetic Resonance in Medicine.
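Why complex averaging fails under phase offsets, and why correcting the phase first recovers the signal, can be shown on a single toy voxel (the phase offsets below are chosen adversarially and the noise level is arbitrary; this is not the PC-NLM filter itself, only the cancellation effect it addresses):

```python
import numpy as np

rng = np.random.default_rng(12)
n_avg = 8
signal = 1.0                                    # true voxel magnitude
# Worst-case motion-induced phase offsets: evenly spread around the circle.
phases = np.linspace(0.0, 2.0 * np.pi, n_avg, endpoint=False)
noise = rng.normal(0.0, 0.1, n_avg) + 1j * rng.normal(0.0, 0.1, n_avg)
acq = signal * np.exp(1j * phases) + noise      # one voxel across 8 acquisitions

naive_complex = np.abs(acq.mean())              # phase offsets cancel the signal
phase_corrected = np.abs((acq * np.exp(-1j * phases)).mean())  # offsets removed
magnitude_avg = np.abs(acq).mean()              # no cancellation, but noise-biased
```

With the offsets removed, complex averaging again reduces noise by roughly the square root of the number of acquisitions, which is the behaviour the proposed reconstruction restores.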

  9. Learnable despeckling framework for optical coherence tomography images

    Science.gov (United States)

    Adabi, Saba; Rashedi, Elaheh; Clayton, Anne; Mohebbi-Kalkhoran, Hamed; Chen, Xue-wen; Conforto, Silvia; Nasiriavanaki, Mohammadreza

    2018-01-01

    Optical coherence tomography (OCT) is a prevalent, interferometric, high-resolution imaging method with broad biomedical applications. Nonetheless, OCT images suffer from an artifact called speckle, which degrades image quality. Digital filters offer an opportunity for image improvement in clinical OCT devices, where hardware modification to enhance images is expensive. A wide variety of digital filters has been proposed to reduce speckle, and selecting the most appropriate filter for a given OCT image or image set is a challenging decision, especially in dermatology applications of OCT, where a wide variety of tissues is imaged. To tackle this challenge, we propose an expandable, learnable despeckling framework, which we call LDF. LDF decides which speckle reduction algorithm is most effective on a given image by learning a figure of merit (FOM) as a single quantitative image assessment measure. LDF is learnable: when implemented on an OCT machine, it is retrained on each given image or image set and its performance improves. LDF is also expandable, meaning that any despeckling algorithm can easily be added to it. The architecture of LDF includes two main parts: (i) an autoencoder neural network and (ii) a filter classifier. The autoencoder learns the FOM from several quality assessment measures obtained from the OCT image, including signal-to-noise ratio, contrast-to-noise ratio, equivalent number of looks, edge preservation index, and mean structural similarity index. Subsequently, the filter classifier identifies the most efficient filter from the following categories: (a) sliding-window filters, including median, mean, and symmetric nearest neighborhood; (b) adaptive statistical filters, including Wiener, homomorphic Lee, and Kuwahara; and (c) edge-preserving patch- or pixel-correlation-based filters, including nonlocal mean, total variation, and block matching three-dimensional filtering.

  10. Polarization-Insensitive Tunable Optical Filters based on Liquid Crystal Polarization Gratings

    Science.gov (United States)

    Nicolescu, Elena

    Tunable optical filters are widely used for a variety of applications including spectroscopy, optical communication networks, remote sensing, and biomedical imaging and diagnostics. All of these application areas can greatly benefit from improvements in the key characteristics of the tunable optical filters embedded in them. Some of these key parameters include peak transmittance, bandwidth, tuning range, and transition width. In recent years research efforts have also focused on miniaturizing tunable optical filters into physically small packages for compact portable spectroscopy and hyperspectral imaging applications such as real-time medical diagnostics and defense applications. However, it is important that miniaturization not have a detrimental effect on filter performance. The overarching theme of this dissertation is to explore novel configurations of Polarization Gratings (PGs) as simple, low-cost, polarization-insensitive alternatives to conventional optical filtering technologies for applications including hyperspectral imaging and telecommunications. We approach this goal from several directions with a combination of theory and experimental demonstration leading to, in our opinion, a significant contribution to the field. We present three classes of tunable optical filters, the first of which is an angle-filtering scheme where the stop-band wavelengths are redirected off axis and the passband is transmitted on-axis. This is achieved using a stacked configuration of polarization gratings of various thicknesses. To improve this class of filter, we also introduce a novel optical element, the Bilayer Polarization Grating, exhibiting unique optical properties and demonstrating complex anchoring conditions with high quality. The second class of optical filter is analogous to a Lyot filter, utilizing stacks of static or tunable waveplates sandwiched with polarizing elements. However, we introduce a new configuration using PGs and static waveplates to replace

  11. Prototype heel effect compensation filter for cone-beam CT

    International Nuclear Information System (INIS)

    Mori, Shinichiro; Endo, Masahiro; Nishizawa, Kanae; Ohno, Mari; Miyazaki, Hiroaki; Tsujita, Kazuhiko; Saito, Yasuo

    2005-01-01

    The prototype cone-beam CT (CBCT) has a larger beam width than conventional multi-detector row CT (MDCT). This causes a non-uniform angular distribution of the x-ray beam intensity known as the heel effect. Scan conditions for the CBCT tube current are adjusted on the anode side to obtain acceptable clinical image quality. However, as the dose is greater on the cathode side than on the anode side, the signal-to-noise ratio on the cathode side is excessively high, resulting in an unnecessarily high dose. To compensate for the heel effect, we developed a heel effect compensation (HEC) filter. The HEC filter rendered the dose distribution uniform and reduced the dose by an average of 25% in free air and by 20% for CTDI phantoms compared with doses obtained with the conventional filter. In addition, its effect in rendering the effective energy uniform resulted in an improvement in image quality. This new HEC filter may be useful in cone-beam CT studies. (note)

  12. TU-H-206-04: An Effective Homomorphic Unsharp Mask Filtering Method to Correct Intensity Inhomogeneity in Daily Treatment MR Images

    International Nuclear Information System (INIS)

    Yang, D; Gach, H; Li, H; Mutic, S

    2016-01-01

    Purpose: The daily treatment MRIs acquired on MR-IGRT systems, like diagnostic MRIs, suffer from intensity inhomogeneity, associated with B1 and B0 inhomogeneities. An improved homomorphic unsharp mask (HUM) filtering method, robust automatic body segmentation, and imaging field-of-view (FOV) detection methods were developed to compute the multiplicative slowly varying correction field and correct the intensity inhomogeneity. The goal is to improve and normalize the voxel intensities so that the images can be processed more accurately by quantitative methods (e.g., segmentation and registration) that require consistent voxel intensity values. Methods: HUM methods have been widely used for years. A body mask is required; otherwise the body surface in the corrected image would be incorrectly bright due to the sudden intensity transition at the body surface. In this study, we developed an improved HUM-based correction method that includes three main components: 1) robust body segmentation on the normalized image gradient map, 2) robust FOV detection (needed for body segmentation) using region growing and morphological filters, and 3) an effective implementation of HUM using repeated Gaussian convolution. Results: The proposed method was successfully tested on patient images of common anatomical sites (H/N, lung, abdomen and pelvis). Initial qualitative comparisons showed that this improved HUM method outperformed three recently published algorithms (FCM, LEMS, MICO) in both computation speed (by 50+ times) and robustness (in intermediate to severe inhomogeneity situations). Currently implemented in MATLAB, it takes 20 to 25 seconds to process a 3D MRI volume. Conclusion: Compared to more sophisticated MRI inhomogeneity correction algorithms, the improved HUM method is simple and effective. The inhomogeneity correction, body mask, and FOV detection methods developed in this study would be useful as preprocessing tools for many MRI-related research and
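The core of HUM correction (component 3 above) divides the image by a heavily smoothed copy of itself and restores the global mean. A numpy-only sketch on a synthetic slice with multiplicative left-right shading; the box-filter width, number of passes, and bias shape are arbitrary choices, not the authors' parameters (and the body-mask/FOV steps are omitted):

```python
import numpy as np

def box_blur(img, size=15, passes=3):
    """Approximate a wide Gaussian by repeated box (mean) filtering."""
    out = img.astype(float)
    pad = size // 2
    for _ in range(passes):              # repeated box passes ~ Gaussian
        padded = np.pad(out, pad, mode="edge")
        acc = np.zeros_like(out)
        for dy in range(size):
            for dx in range(size):
                acc += padded[dy:dy + out.shape[0], dx:dx + out.shape[1]]
        out = acc / (size * size)
    return out

def hum_correct(img, eps=1e-6):
    """Homomorphic unsharp mask: divide by the low-pass field, restore the mean."""
    field = box_blur(img)                # slowly varying multiplicative field
    corrected = img * (img.mean() / (field + eps))
    return corrected, field

# Synthetic MR slice: uniform tissue modulated by a slow multiplicative bias.
x = np.arange(64) / 63.0
img = 100.0 * (1.0 + 0.4 * x)[None, :] * np.ones((64, 64))  # B1-like shading
corrected, field = hum_correct(img)
```

After correction the slowly varying shading is largely flattened while the global intensity level is preserved, which is what downstream segmentation and registration need.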

  13. Pupil filter design by using a Bessel functions basis at the image plane.

    Science.gov (United States)

    Canales, Vidal F; Cagigal, Manuel P

    2006-10-30

    Many applications can benefit from the use of pupil filters for controlling the light intensity distribution near the focus of an optical system. Most of the design methods for such filters are based on a second-order expansion of the Point Spread Function (PSF). Here, we present a new procedure for designing radially-symmetric pupil filters. It is more precise than previous procedures as it considers the exact expression of the PSF, expanded as a function of first-order Bessel functions. Furthermore, this new method presents other advantages: the height of the side lobes can be easily controlled, it allows the design of amplitude-only, phase-only or hybrid filters, and the coefficients of the PSF expansion can be directly related to filter parameters. Finally, our procedure allows the design of filters with very different behaviours and optimal performance.

  14. The AGILE on-board Kalman filter

    International Nuclear Information System (INIS)

    Giuliani, A.; Cocco, V.; Mereghetti, S.; Pittori, C.; Tavani, M.

    2006-01-01

    On-board reduction of particle background is one of the main challenges for space instruments dedicated to gamma-ray astrophysics. We present in this paper a discussion of the method and the main simulation results of the on-board background filter of the Gamma-Ray Imaging Detector (GRID) of the AGILE mission. The GRID is capable of detecting and imaging gamma-ray photons in the range 30 MeV-30 GeV with an optimal point spread function. The AGILE planned orbit is equatorial, with an altitude of 550 km. This is an optimal orbit from the point of view of the expected particle background. For this orbit, electrons and positrons with kinetic energies between 20 MeV and hundreds of MeV dominate the particle background, with significant contributions from high-energy (primary) and low-energy protons, and gamma-ray albedo photons. We present here the main results obtained by extensive simulations of the on-board AGILE-GRID particle/photon background-rejection algorithms, based on a special application of Kalman filter techniques. This filter is applied (Level-2) sequentially after the other data-processing techniques that characterize the Level-1 processing. We show that, in conjunction with the Level-1 processing, the adopted Kalman filtering is expected to reduce the total particle/albedo-photon background rate to a value (≤10-30 Hz) that is compatible with the AGILE telemetry. The AGILE on-board Kalman filter is also effective in reducing the Earth-albedo-photon background rate, and therefore contributes substantially to increasing the AGILE exposure for celestial gamma-ray sources.
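The abstract does not give the filter equations, so the following is only the textbook linear Kalman recursion, shown on a toy constant-velocity track; all matrices and noise levels are invented for illustration and are not AGILE's on-board parameters:

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of the standard linear Kalman filter."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with measurement z
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Track a constant-velocity object from noisy position measurements.
F = np.array([[1.0, 1.0], [0.0, 1.0]])   # position-velocity transition
H = np.array([[1.0, 0.0]])               # we observe position only
Q = 1e-5 * np.eye(2)                     # small process noise
R = np.array([[0.25]])                   # measurement noise variance

rng = np.random.default_rng(7)
true_pos = np.arange(50, dtype=float) * 0.8          # true velocity 0.8/step
meas = true_pos + rng.normal(0.0, 0.5, 50)

x, P = np.array([meas[0], 0.0]), np.eye(2)
for z in meas[1:]:
    x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
```

In a track-fitting application such as the GRID's, the same recursion runs over successive detector planes, and the innovation statistics can be used to reject hits inconsistent with a photon-conversion track.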

  15. Effectual switching filter for removing impulse noise using a SCM detector

    Science.gov (United States)

    Yuan, Jin-xia; Zhang, Hong-juan; Ma, Yi-de

    2012-03-01

    An effectual method is proposed to remove impulse noise from corrupted color images. The spiking cortical model (SCM) is adopted as a noise detector to identify noisy pixels in each channel of color images, and detected noise pixels are saved in three marking matrices. According to the three marking matrices, the detected noisy pixels are divided into two types (type I and type II). They are filtered differently: an adaptive median filter is used for type I and an adaptive vector median for type II. Noise-free pixels are left unchanged. Extensive experiments show that the proposed method outperforms most of the other well-known filters in the aspects of both visual and objective quality measures, and this method can also reduce the possibility of generating color artifacts while preserving image details.
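The switching idea above, detect first and then filter only the flagged pixels, can be illustrated with a single-channel sketch. A simple extreme-value test stands in for the SCM detector here, which is an assumption for illustration only:

```python
import numpy as np

def switching_median(img, win=3, thresh=40):
    """Replace only pixels flagged as impulses with the local median;
    noise-free pixels are left unchanged, as in switching filters."""
    pad = win // 2
    padded = np.pad(img, pad, mode='edge')
    out = img.astype(float).copy()
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            window = padded[i:i + win, j:j + win]
            med = np.median(window)
            v = img[i, j]
            # flag as impulse: extreme in the window and far from the median
            if (v == window.min() or v == window.max()) and abs(v - med) > thresh:
                out[i, j] = med
    return out
```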

  16. Colorization-Based RGB-White Color Interpolation using Color Filter Array with Randomly Sampled Pattern.

    Science.gov (United States)

    Oh, Paul; Lee, Sukho; Kang, Moon Gi

    2017-06-28

    Recently, several RGB-White (RGBW) color filter arrays (CFAs) have been proposed, which have extra white (W) pixels in the filter array that are highly sensitive. Due to the high sensitivity, the W pixels have better SNR (Signal to Noise Ratio) characteristics than the other color pixels in the filter array, especially in low-light conditions. However, most RGBW CFAs are designed so that the acquired RGBW pattern image can be converted into the conventional Bayer pattern image, which is then converted into the final color image using conventional demosaicing methods, i.e., color interpolation techniques. In this paper, we propose a new RGBW color filter array based on a totally different color interpolation technique, the colorization algorithm. The colorization algorithm was initially proposed for colorizing a gray image into a color image using a small number of color seeds. Here, we adopt this algorithm as a color interpolation technique, so that the RGBW color filter array can be designed with a very large number of W pixels to make the most of the highly sensitive characteristics of the W channel. The resulting RGBW color filter array has a pattern with a large proportion of W pixels, while the small number of RGB pixels is randomly distributed over the array. The colorization algorithm makes it possible to reconstruct the colors from such a small number of RGB values. Due to the large proportion of W pixels, the reconstructed color image has a high SNR value, higher than those of conventional CFAs especially in low-light conditions. Experimental results show that much important information that is not perceived in color images reconstructed with conventional CFAs is perceived in the images reconstructed with the proposed method.

  17. Retinal blood vessel extraction using tunable bandpass filter and fuzzy conditional entropy.

    Science.gov (United States)

    Sil Kar, Sudeshna; Maity, Santi P

    2016-09-01

    Extraction of blood vessels from retinal images plays a significant role in screening for different ophthalmologic diseases. However, accurate extraction of the entire and individual types of vessel silhouette from noisy images with a poorly illuminated background is a complicated task. To this aim, an integrated system design platform is suggested in this work for vessel extraction using a sequential bandpass filter followed by fuzzy conditional entropy maximization on the matched filter response. First, noise is eliminated from the image under consideration through curvelet-based denoising. To include the fine details and the relatively less thick vessel structures, the image is passed through a bank of sequential bandpass filter structures optimized for contrast enhancement. Fuzzy conditional entropy on the matched filter response is then maximized to find the set of multiple optimal thresholds to extract the different types of vessel silhouettes from the background. The Differential Evolution algorithm is used to determine the optimal gain in the bandpass filter and the combination of the fuzzy parameters. Using the multiple thresholds, the retinal image is classified into the thick, the medium and the thin vessels, including neovascularization. Performance evaluated on different publicly available retinal image databases shows that the proposed method is very efficient in identifying the diverse types of vessels. The proposed method is also efficient in extracting the abnormal and the thin blood vessels in pathological retinal images. The average values of true positive rate, false positive rate and accuracy offered by the method are 76.32%, 1.99% and 96.28%, respectively, for the DRIVE database and 72.82%, 2.6% and 96.16%, respectively, for the STARE database. Simulation results demonstrate that the proposed method outperforms the existing methods in detecting the various types of vessels and the neovascularization structures. The combination of curvelet transform and tunable bandpass
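The matched filter response that the method thresholds can be illustrated in one dimension: a zero-mean Gaussian template is correlated with an intensity profile, and peaks in the response mark vessel-like cross-sections. A hedged sketch (the Gaussian template and parameters are illustrative, not the paper's tuned filter bank):

```python
import numpy as np

def matched_filter_1d(signal, sigma=2.0, width=9):
    """Correlate a zero-mean Gaussian template with a 1-D intensity profile;
    peaks in the response mark vessel-like structures."""
    x = np.arange(width) - width // 2
    kernel = np.exp(-x**2 / (2.0 * sigma**2))
    kernel -= kernel.mean()                 # zero-mean, as in matched filtering
    return np.convolve(signal, kernel[::-1], mode='same')
```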

  18. Non-Euclidean phasor analysis for quantification of oxidative stress in ex vivo human skin exposed to sun filters using fluorescence lifetime imaging microscopy

    Science.gov (United States)

    Osseiran, Sam; Roider, Elisabeth M.; Wang, Hequn; Suita, Yusuke; Murphy, Michael; Fisher, David E.; Evans, Conor L.

    2017-12-01

    Chemical sun filters are commonly used as active ingredients in sunscreens due to their efficient absorption of ultraviolet (UV) radiation. Yet, it is known that these compounds can photochemically react with UV light and generate reactive oxygen species and oxidative stress in vitro, though this has yet to be validated in vivo. One label-free approach to probe oxidative stress is to measure and compare the relative endogenous fluorescence generated by cellular coenzymes nicotinamide adenine dinucleotides and flavin adenine dinucleotides. However, chemical sun filters are fluorescent, with emissive properties that contaminate endogenous fluorescent signals. To accurately distinguish the source of fluorescence in ex vivo skin samples treated with chemical sun filters, fluorescence lifetime imaging microscopy data were processed on a pixel-by-pixel basis using a non-Euclidean separation algorithm based on Mahalanobis distance and validated on simulated data. Applying this method, ex vivo samples exhibited a small oxidative shift when exposed to sun filters alone, though this shift was much smaller than that imparted by UV irradiation. Given the need for investigative tools to further study the clinical impact of chemical sun filters in patients, the reported methodology may be applied to visualize chemical sun filters and measure oxidative stress in patients' skin.
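The Mahalanobis distance at the core of the reported non-Euclidean separation measures how far a point lies from a reference cloud in units of that cloud's own covariance. A minimal per-point sketch (the function name and two-component phasor layout are illustrative assumptions):

```python
import numpy as np

def mahalanobis(points, ref):
    """Mahalanobis distance of each row in `points` from the cloud `ref`
    (e.g. phasor coordinates of a reference fluorophore)."""
    mu = ref.mean(axis=0)
    cov = np.cov(ref, rowvar=False)
    cov_inv = np.linalg.inv(cov)
    d = points - mu
    # sqrt of the quadratic form d^T Sigma^{-1} d, row by row
    return np.sqrt(np.einsum('ij,jk,ik->i', d, cov_inv, d))
```

Classifying each pixel by its smallest distance to the candidate clouds (endogenous coenzymes vs. sun-filter fluorescence) is the separation step the abstract describes.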

  19. Methods in quantitative image analysis.

    Science.gov (United States)

    Oberholzer, M; Ostreicher, M; Christen, H; Brühlmann, M

    1996-05-01

    histogram of an existing image (input image) into a new grey value histogram (output image) are most quickly handled by a look-up table (LUT). The histogram of an image can be influenced by gain, offset and gamma of the camera. Gain defines the voltage range, offset defines the reference voltage and gamma the slope of the regression line between the light intensity and the voltage of the camera. A very important descriptor of neighbourhood relations in an image is the co-occurrence matrix. The distance between the pixels (original pixel and its neighbouring pixel) can influence the various parameters calculated from the co-occurrence matrix. The main goals of image enhancement are elimination of surface roughness in an image (smoothing), correction of defects (e.g. noise), extraction of edges, identification of points, strengthening texture elements and improving contrast. In enhancement, two types of operations can be distinguished: pixel-based (point operations) and neighbourhood-based (matrix operations). The most important pixel-based operations are linear stretching of grey values, application of pre-stored LUTs and histogram equalisation. The neighbourhood-based operations work with so-called filters. These are organising elements with an original or initial point in their centre. Filters can be used to accentuate or to suppress specific structures within the image. Filters can work either in the spatial or in the frequency domain. The method used for analysing alterations of grey value intensities in the frequency domain is the Hartley transform. Filter operations in the spatial domain can be based on averaging or ranking the grey values occurring in the organising element. The most important filters, which are usually applied, are the Gaussian filter and the Laplace filter (both averaging filters), and the median filter, the top hat filter and the range operator (all ranking filters). 
Segmentation of objects is traditionally based on threshold grey values.
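Two of the ranking filters mentioned, the median filter and the range operator, can be sketched over a square organising element as follows (the window size and edge padding are illustrative choices):

```python
import numpy as np

def rank_filters(img, win=3):
    """Median and range (max - min) ranking filters over a square
    organising element centred on each pixel."""
    pad = win // 2
    p = np.pad(img, pad, mode='edge')
    h, w = img.shape
    med = np.empty_like(img, dtype=float)
    rng = np.empty_like(img, dtype=float)
    for i in range(h):
        for j in range(w):
            window = p[i:i + win, j:j + win]
            med[i, j] = np.median(window)          # ranking: middle value
            rng[i, j] = window.max() - window.min() # ranking: spread
    return med, rng
```

The median output suppresses isolated outliers (noise), while the range output highlights texture and edges, matching the accentuate/suppress roles described above.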

  20. The Science Advantage of a Redder Filter for WFIRST

    Science.gov (United States)

    Bauer, James; Stauffer, John; Milam, Stefanie N.; Holler, Bryan J.

    2018-01-01

    WFIRST will be capable of providing Hubble-quality imaging performance over several thousand square degrees of the sky. The wide-area, high spatial resolution survey data from WFIRST will be unsurpassed for probably many decades into the future. With the current baseline design, the WFIRST filter complement will extend from the bluest wavelength allowed by the optical design to a reddest filter (F184W) that has a red cutoff at 2.0 microns. Extension of the imaging capabilities even slightly beyond the 2.0 micron wavelength cut-off would provide significant advantages over the presently proposed science for objects both near and far. The inclusion of a Ks (2.0-2.3 micron) filter would result in a wider range and more comprehensive set of Solar System investigations. It would also extend the range of higher-redshift population studies. In this poster, we outline some of the science advantages for adding a K filter, similar in bandpass to the 2MASS Ks filter, in order to extend the wavelength range for WFIRST as far to the red as the thermal performance of the spacecraft allows.

  1. Regularization of DT-MRI Using 3D Median Filtering Methods

    Directory of Open Access Journals (Sweden)

    Soondong Kwon

    2014-01-01

    Full Text Available DT-MRI (diffusion tensor magnetic resonance imaging) tractography is a method to determine the architecture of axonal fibers in the central nervous system by computing the direction of the principal eigenvectors obtained from the tensor matrix, which is different from conventional isotropic MRI. Tractography based on DT-MRI is known to require many computations and is highly sensitive to noise. Hence, adequate regularization methods, such as image processing techniques, are in demand. Among many regularization methods we are interested in the median filtering method. In this paper, we extended two-dimensional median filters already developed to three-dimensional median filters. We compared four median filtering methods: the two-dimensional simple median method (SM2D), the two-dimensional successive Fermat method (SF2D), the three-dimensional simple median method (SM3D), and the three-dimensional successive Fermat method (SF3D). Three kinds of synthetic data with different altitude angles from axial slices and one kind of human data from an MR scanner are considered for numerical implementation by the four filtering methods.
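A plain SM3D-style filter, extending the 2-D median to a cubic neighbourhood, might look like the sketch below (window size and padding are assumptions; the successive Fermat variants are not shown):

```python
import numpy as np

def median3d(vol, win=3):
    """Simple 3-D median filter over a cubic neighbourhood (SM3D-style)."""
    pad = win // 2
    p = np.pad(vol, pad, mode='edge')
    out = np.empty_like(vol, dtype=float)
    nx, ny, nz = vol.shape
    for i in range(nx):
        for j in range(ny):
            for k in range(nz):
                # median over the win x win x win cube centred on (i, j, k)
                out[i, j, k] = np.median(p[i:i + win, j:j + win, k:k + win])
    return out
```

Applied component-wise to tensor fields (or to the principal eigenvector directions), such a filter regularizes the data before tractography.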

  2. Human attention filters for single colors

    Science.gov (United States)

    Sun, Peng; Chubb, Charles; Wright, Charles E.; Sperling, George

    2016-01-01

    The visual images in the eyes contain much more information than the brain can process. An important selection mechanism is feature-based attention (FBA). FBA is best described by attention filters that specify precisely the extent to which items containing attended features are selectively processed and the extent to which items that do not contain the attended features are attenuated. The centroid-judgment paradigm enables quick, precise measurements of such human perceptual attention filters, analogous to transmission measurements of photographic color filters. Subjects use a mouse to locate the centroid—the center of gravity—of a briefly displayed cloud of dots and receive precise feedback. A subset of dots is distinguished by some characteristic, such as a different color, and subjects judge the centroid of only the distinguished subset (e.g., dots of a particular color). The analysis efficiently determines the precise weight in the judged centroid of dots of every color in the display (i.e., the attention filter for the particular attended color in that context). We report 32 attention filters for single colors. Attention filters that discriminate one saturated hue from among seven other equiluminant distractor hues are extraordinarily selective, achieving attended/unattended weight ratios >20:1. Attention filters for selecting a color that differs in saturation or lightness from distractors are much less selective than attention filters for hue (given equal discriminability of the colors), and their filter selectivities are proportional to the discriminability distance of neighboring colors, whereas in the same range hue attention-filter selectivity is virtually independent of discriminability. PMID:27791040
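The centroid judgment being modelled is a weighted center of gravity, with the attention filter supplying a weight per color (near 1 for the attended color, near 0 for well-filtered distractors). A minimal sketch (the function name and data layout are assumptions):

```python
import numpy as np

def attention_centroid(positions, colors, weights):
    """Weighted centroid of a dot cloud; `weights` maps color index ->
    attention-filter weight (1 for attended, ~0 for distractors)."""
    w = np.array([weights[c] for c in colors], dtype=float)
    pos = np.asarray(positions, dtype=float)
    # center of gravity with per-dot attentional weights
    return (w[:, None] * pos).sum(axis=0) / w.sum()
```

Fitting the weights that best predict subjects' observed centroids, color by color, is what recovers the attention filter in the paradigm described.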

  3. Ross filter pairs for metal artefact reduction in x-ray tomography: a case study based on imaging and segmentation of metallic implants

    Science.gov (United States)

    Arhatari, Benedicta D.; Abbey, Brian

    2018-01-01

    Ross filter pairs have recently been demonstrated as a highly effective means of producing quasi-monoenergetic beams from polychromatic X-ray sources. They have found applications in both X-ray spectroscopy and for elemental separation in X-ray computed tomography (XCT). Here we explore whether they could be applied to the problem of metal artefact reduction (MAR) for applications in medical imaging. Metal artefacts are a common problem in X-ray imaging of metal implants embedded in bone and soft tissue. A number of data post-processing approaches to MAR have been proposed in the literature, however these can be time-consuming and sometimes have limited efficacy. Here we describe and demonstrate an alternative approach based on beam conditioning using Ross filter pairs. This approach obviates the need for any complex post-processing of the data and enables MAR and segmentation from the surrounding tissue by exploiting the absorption edge contrast of the implant.
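The Ross-pair principle is that two filters are chosen whose transmissions match everywhere except between their absorption edges, so subtracting the two detector signals isolates that narrow band. A numerical sketch of the subtraction (toy spectra and transmissions on a uniform energy grid, not measured data):

```python
import numpy as np

def ross_band_signal(spectrum, energies, trans_a, trans_b):
    """Difference of detector signals behind the two filters of a Ross pair;
    the transmissions agree outside the passband, so the difference
    isolates the quasi-monoenergetic band between the edges."""
    de = energies[1] - energies[0]          # assume a uniform energy grid
    sig_a = np.sum(spectrum * trans_a) * de
    sig_b = np.sum(spectrum * trans_b) * de
    return sig_a - sig_b
```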

  4. Plasma Treatment to Remove Carbon from Indium UV Filters

    Science.gov (United States)

    Greer, Harold F.; Nikzad, Shouleh; Beasley, Matthew; Gantner, Brennan

    2012-01-01

    The sounding rocket experiment FIRE (Far-ultraviolet Imaging Rocket Experiment) will improve the science community's ability to image a spectral region hitherto unexplored astronomically. The imaging band of FIRE (900 to 1,100 Angstroms) will help fill the current wavelength imaging observation hole existing from approximately 620 Angstroms to the GALEX band near 1,350 Angstroms. FIRE is a single-optic prime focus telescope with a 1.75-m focal length. The bandpass of 900 to 1,100 Angstroms is set by a combination of the mirror coating, the indium filter in front of the detector, and the salt coating on the front of the detector's microchannel plates. Critical to this is the indium filter, which must reduce the flux from Lyman-alpha at 1,216 Angstroms by a minimum factor of 10(exp -4). The cost of this Lyman-alpha removal is that the filter is not fully transparent at the desired wavelengths of 900 to 1,100 Angstroms. Recently, in a project to improve the performance of optical and solar blind detectors, JPL developed a plasma process capable of removing carbon contamination from indium metal. In this work, a low-power, low-temperature hydrogen plasma reacts with the carbon contaminants in the indium to form methane, but leaves the indium metal surface undisturbed. This process was recently tested in a proof-of-concept experiment with a filter provided by the University of Colorado. This initial test on a test filter showed an improvement in transmission from 7 to 9 percent near 900 Angstroms with no process optimization applied. Further improvements in this performance were readily achieved, bringing the total transmission to 12% with optimization of JPL's existing process.

  5. Three-dimensional anisotropic adaptive filtering of projection data for noise reduction in cone beam CT

    Energy Technology Data Exchange (ETDEWEB)

    Maier, Andreas; Wigstroem, Lars; Hofmann, Hannes G.; Hornegger, Joachim; Zhu Lei; Strobel, Norbert; Fahrig, Rebecca [Department of Radiology, Stanford University, Stanford, California 94305 (United States); Department of Radiology, Stanford University, Stanford, California 94305 (United States) and Center for Medical Image Science and Visualization, Linkoeping University, Linkoeping (Sweden); Pattern Recognition Laboratory, Department of Computer Science, Friedrich-Alexander University of Erlangen-Nuremberg, 91054, Erlangen (Germany); Nuclear and Radiological Engineering and Medical Physics Programs, George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, Georgia 30332 (United States); Siemens AG Healthcare, Forchheim 91301 (Germany); Department of Radiology, Stanford University, Stanford, California 94305 (United States)

    2011-11-15

    Purpose: The combination of quickly rotating C-arm gantry with digital flat panel has enabled the acquisition of three-dimensional data (3D) in the interventional suite. However, image quality is still somewhat limited since the hardware has not been optimized for CT imaging. Adaptive anisotropic filtering has the ability to improve image quality by reducing the noise level and therewith the radiation dose without introducing noticeable blurring. By applying the filtering prior to 3D reconstruction, noise-induced streak artifacts are reduced as compared to processing in the image domain. Methods: 3D anisotropic adaptive filtering was used to process an ensemble of 2D x-ray views acquired along a circular trajectory around an object. After arranging the input data into a 3D space (2D projections + angle), the orientation of structures was estimated using a set of differently oriented filters. The resulting tensor representation of local orientation was utilized to control the anisotropic filtering. Low-pass filtering is applied only along structures to maintain high spatial frequency components perpendicular to these. The evaluation of the proposed algorithm includes numerical simulations, phantom experiments, and in-vivo data which were acquired using an AXIOM Artis dTA C-arm system (Siemens AG, Healthcare Sector, Forchheim, Germany). Spatial resolution and noise levels were compared with and without adaptive filtering. A human observer study was carried out to evaluate low-contrast detectability. Results: The adaptive anisotropic filtering algorithm was found to significantly improve low-contrast detectability by reducing the noise level by half (reduction of the standard deviation in certain areas from 74 to 30 HU). Virtually no degradation of high contrast spatial resolution was observed in the modulation transfer function (MTF) analysis. Although the algorithm is computationally intensive, hardware acceleration using Nvidia's CUDA Interface provided an 8

  6. Three-dimensional anisotropic adaptive filtering of projection data for noise reduction in cone beam CT

    International Nuclear Information System (INIS)

    Maier, Andreas; Wigstroem, Lars; Hofmann, Hannes G.; Hornegger, Joachim; Zhu Lei; Strobel, Norbert; Fahrig, Rebecca

    2011-01-01

    Purpose: The combination of quickly rotating C-arm gantry with digital flat panel has enabled the acquisition of three-dimensional data (3D) in the interventional suite. However, image quality is still somewhat limited since the hardware has not been optimized for CT imaging. Adaptive anisotropic filtering has the ability to improve image quality by reducing the noise level and therewith the radiation dose without introducing noticeable blurring. By applying the filtering prior to 3D reconstruction, noise-induced streak artifacts are reduced as compared to processing in the image domain. Methods: 3D anisotropic adaptive filtering was used to process an ensemble of 2D x-ray views acquired along a circular trajectory around an object. After arranging the input data into a 3D space (2D projections + angle), the orientation of structures was estimated using a set of differently oriented filters. The resulting tensor representation of local orientation was utilized to control the anisotropic filtering. Low-pass filtering is applied only along structures to maintain high spatial frequency components perpendicular to these. The evaluation of the proposed algorithm includes numerical simulations, phantom experiments, and in-vivo data which were acquired using an AXIOM Artis dTA C-arm system (Siemens AG, Healthcare Sector, Forchheim, Germany). Spatial resolution and noise levels were compared with and without adaptive filtering. A human observer study was carried out to evaluate low-contrast detectability. Results: The adaptive anisotropic filtering algorithm was found to significantly improve low-contrast detectability by reducing the noise level by half (reduction of the standard deviation in certain areas from 74 to 30 HU). Virtually no degradation of high contrast spatial resolution was observed in the modulation transfer function (MTF) analysis. 
Although the algorithm is computationally intensive, hardware acceleration using Nvidia's CUDA Interface provided an 8.9-fold
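The orientation estimation step both records describe starts from a gradient structure tensor, whose dominant eigenvector steers the anisotropic smoothing. A much-reduced sketch, estimating a single dominant orientation for a whole patch rather than the per-voxel tensor field the paper uses:

```python
import numpy as np

def local_orientation(img):
    """Estimate the dominant local orientation of a 2-D patch from the
    averaged 2x2 gradient structure tensor (a crude global estimate)."""
    gy, gx = np.gradient(img.astype(float))
    # structure tensor components, averaged over the patch
    jxx, jxy, jyy = (gx * gx).mean(), (gx * gy).mean(), (gy * gy).mean()
    # orientation of the dominant eigenvector of the tensor
    return 0.5 * np.arctan2(2.0 * jxy, jxx - jyy)
```

In the full algorithm, a field of such tensors controls where low-pass filtering is applied (along structures) and where high frequencies are preserved (across them).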

  7. A study of full width at half maximum (FWHM) according to the filter's cut off level in SPECT camera

    International Nuclear Information System (INIS)

    Park, Soung Ock; Kwon, Soo Il

    2003-01-01

    Filtering is necessary to reduce statistical noise and to increase image quality in SPECT images. Noise is controlled by low-pass filters designed to suppress high spatial frequencies in the SPECT image. Most SPECT filter functions control the degree of high-frequency suppression through the choice of a cut-off frequency, whose location determines the effect on image noise and spatial resolution: a low cut-off frequency provides good noise suppression but insufficient image quality, while a high cut-off frequency increases image resolution but provides insufficient noise suppression. The purpose of this study was to determine the optimum cut-off level by comparing the FWHM obtained at different cut-off levels for each filter - Band-limited, Shepp-Logan, Shepp-Logan Hanning, Generalized Hamming, Low-pass cosine, Parzen and Butterworth - in a SPECT camera. We recorded images along the X, Y and Z axes with a 99mTcO4 point source and measured the FWHM using profile curves. The average FWHM ranged from 9.16 mm to 18.14 mm along the X, Y and Z axes, and the Band-limited and Generalized Hamming filters measured 9.16 mm at a cut-off frequency of 0.7 cycles/pixel.
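Measuring FWHM from a profile curve, as done in the study, amounts to locating the two half-maximum crossings of the point-source profile. A sketch with linear interpolation between samples (the interpolation scheme is an assumption; the study's exact procedure is not specified):

```python
import numpy as np

def fwhm(profile, spacing=1.0):
    """Full width at half maximum of a 1-D profile curve,
    with linear interpolation of the half-maximum crossings."""
    y = np.asarray(profile, dtype=float)
    y = y - y.min()                      # remove background pedestal
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    left, right = above[0], above[-1]

    def cross(i0, i1):
        # linear interpolation of the half-maximum crossing between i0 and i1
        return i0 + (half - y[i0]) / (y[i1] - y[i0]) * (i1 - i0)

    lx = cross(left - 1, left) if left > 0 else float(left)
    rx = cross(right, right + 1) if right < len(y) - 1 else float(right)
    return (rx - lx) * spacing
```

With `spacing` set to the pixel size in mm, this returns the FWHM in the same millimetre units reported above.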

  8. HIGH-PRECISION ATTITUDE ESTIMATION METHOD OF STAR SENSORS AND GYRO BASED ON COMPLEMENTARY FILTER AND UNSCENTED KALMAN FILTER

    Directory of Open Access Journals (Sweden)

    C. Guo

    2017-07-01

    Full Text Available Determining the attitude of a satellite at the time of imaging and then establishing the mathematical relationship between image points and ground points is essential in high-resolution remote sensing image mapping. The star tracker is insensitive to high-frequency attitude variation due to measurement noise and satellite jitter, but the low-frequency attitude motion can be determined with high accuracy. The gyro, as a short-term reference for the satellite's attitude, is sensitive to high-frequency attitude change, but due to the existence of gyro drift and integration error, the attitude determination error increases with time. Based on the opposite noise frequency characteristics of the two kinds of attitude sensors, this paper proposes an on-orbit attitude estimation method for star sensors and gyro based on a Complementary Filter (CF) and an Unscented Kalman Filter (UKF). In this study, the principle and implementation of the proposed method are described. First, gyro attitude quaternions are acquired based on the attitude kinematics equation. An attitude information fusion method is then introduced, which applies high-pass filtering and low-pass filtering to the gyro and star tracker, respectively. Second, the attitude fusion data based on the CF are introduced as the observed values of the UKF system in the process of measurement updating. The accuracy and effectiveness of the method are validated based on simulated sensor attitude data. The obtained results indicate that the proposed method can suppress the gyro drift and measurement noise of the attitude sensors, improving the accuracy of the attitude determination significantly compared with the simulated on-orbit attitude and the attitude estimation results of a UKF defined by the same simulation parameters.
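The complementary-filter fusion, low-pass on the star tracker and high-pass on the integrated gyro, reduces in one dimension to a single blending recursion. A sketch with illustrative gain and timestep (the paper's quaternion-based CF+UKF pipeline is considerably more elaborate):

```python
import numpy as np

def complementary_fuse(star_angles, gyro_rates, dt=0.1, alpha=0.98):
    """1-D complementary filter: integrate the gyro (trusted at high
    frequency) and correct with the star tracker (trusted at low frequency)."""
    est = star_angles[0]
    out = []
    for meas, rate in zip(star_angles, gyro_rates):
        # alpha weights the gyro propagation; (1 - alpha) the absolute reference
        est = alpha * (est + rate * dt) + (1 - alpha) * meas
        out.append(est)
    return np.array(out)
```

Because the star-tracker term continually pulls the estimate back, a constant gyro bias produces only a bounded steady-state offset rather than an unbounded drift.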

  9. Spectral and Wavefront Error Performance of WFIRST/AFTA Prototype Filters

    Science.gov (United States)

    Quijada, Manuel; Seide, Laurie; Marx, Cathy; Pasquale, Bert; McMann, Joseph; Hagopian, John; Dominguez, Margaret; Gong, Qian; Morey, Peter

    2016-01-01

    The Cycle 5 design baseline for the Wide-Field Infrared Survey Telescope/Astrophysics Focused Telescope Assets (WFIRST/AFTA) instrument includes a single wide-field channel (WFC) instrument for both imaging and slit-less spectroscopy. The only routinely moving part during scientific observations for this wide-field channel is the element wheel (EW) assembly. This filter-wheel assembly will have 8 positions that will be populated with 6 bandpass filters, a blank position, and a Grism that will consist of a three-element assembly to disperse the full field with an undeviated central wavelength for galaxy redshift surveys. All filter elements in the EW assembly will be made out of fused silica substrates (110 mm diameter) that will have the appropriate bandpass coatings according to the filter designations (Z087, Y106, J129, H158, F184, W149 and Grism). This paper presents and discusses the performance (including spectral transmission and reflected/transmitted wavefront error measurements) of a subset of bandpass filter coating prototypes that are based on the WFC instrument filter complement. The bandpass coating prototypes tested in this effort correspond to the Z087, W149, and Grism filter elements. These filter coatings have been procured from three different vendors to assess the most challenging aspects in terms of the in-band throughput, out-of-band rejection (including the cut-on and cut-off slopes), and the impact the wavefront error distortions of these filter coatings will have on the imaging performance of the wide-field channel in the WFIRST/AFTA observatory.

  10. WE-G-18A-08: Axial Cone Beam DBPF Reconstruction with Three-Dimensional Weighting and Butterfly Filtering

    Energy Technology Data Exchange (ETDEWEB)

    Tang, S; Wang, W [School of Automation, Xi' an University of Post and Telecommunication, Xi' an, Shaanxi (China); Tang, X [Emory University School of Medicine, Atlanta, GA (United States)

    2014-06-15

    Purpose: With the major benefit of dealing with data truncation for ROI reconstruction, the algorithm of differentiated backprojection followed by Hilbert filtering (DBPF) was originally derived for image reconstruction from parallel- or fan-beam data. To extend its application to axial CB scans, we proposed the integration of the DBPF algorithm with 3-D weighting. In this work, we further propose the incorporation of Butterfly filtering into the 3-D weighted axial CB-DBPF algorithm and conduct an evaluation to verify its performance. Methods: Given an axial scan, tomographic images are reconstructed by the DBPF algorithm with 3-D weighting, in which streak artifacts exist along the direction of Hilbert filtering. Recognizing this orientation-specific behavior, a pair of orthogonal Butterfly filters is applied to the images reconstructed with horizontal and vertical Hilbert filtering, correspondingly. In addition, the Butterfly filtering can also be utilized for streak artifact suppression in scenarios wherein only partial scan data, with an angular range as small as 270°, are available. Results: Preliminary data show that, with the correspondingly applied Butterfly filtering, the streak artifacts existing in the images reconstructed by the 3-D weighted DBPF algorithm can be suppressed to an unnoticeable level. Moreover, the Butterfly filtering also works in partial-scan scenarios, though the 3-D weighting scheme may have to be dropped because insufficient projection data are available. Conclusion: As an algorithmic step, the incorporation of Butterfly filtering enables the DBPF algorithm for CB image reconstruction from data acquired along either a full or partial axial scan.
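The Hilbert filtering step of DBPF applies an odd, sign-of-frequency transfer function along a chosen direction. A 1-D FFT-based sketch (row-wise application and this particular discrete convention are assumptions):

```python
import numpy as np

def hilbert_filter(row):
    """Apply a Hilbert (odd, sign-of-frequency) filter to a 1-D row via the
    FFT, the filtering step that follows differentiated backprojection."""
    n = len(row)
    freqs = np.fft.fftfreq(n)
    h = -1j * np.sign(freqs)           # Hilbert transfer function
    return np.real(np.fft.ifft(np.fft.fft(row) * h))
```

Applying this along horizontal rows versus vertical columns yields the two orientation-specific reconstructions to which the orthogonal Butterfly filters are then matched.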

  11. WE-G-18A-08: Axial Cone Beam DBPF Reconstruction with Three-Dimensional Weighting and Butterfly Filtering

    International Nuclear Information System (INIS)

    Tang, S; Wang, W; Tang, X

    2014-01-01

    Purpose: With the major benefit of dealing with data truncation for ROI reconstruction, the algorithm of differentiated backprojection followed by Hilbert filtering (DBPF) was originally derived for image reconstruction from parallel- or fan-beam data. To extend its application to axial CB scans, we proposed the integration of the DBPF algorithm with 3-D weighting. In this work, we further propose the incorporation of Butterfly filtering into the 3-D weighted axial CB-DBPF algorithm and conduct an evaluation to verify its performance. Methods: Given an axial scan, tomographic images are reconstructed by the DBPF algorithm with 3-D weighting, in which streak artifacts exist along the direction of Hilbert filtering. Recognizing this orientation-specific behavior, a pair of orthogonal Butterfly filters is applied to the images reconstructed with horizontal and vertical Hilbert filtering, correspondingly. In addition, the Butterfly filtering can also be utilized for streak artifact suppression in scenarios wherein only partial scan data, with an angular range as small as 270°, are available. Results: Preliminary data show that, with the correspondingly applied Butterfly filtering, the streak artifacts existing in the images reconstructed by the 3-D weighted DBPF algorithm can be suppressed to an unnoticeable level. Moreover, the Butterfly filtering also works in partial-scan scenarios, though the 3-D weighting scheme may have to be dropped because insufficient projection data are available. Conclusion: As an algorithmic step, the incorporation of Butterfly filtering enables the DBPF algorithm for CB image reconstruction from data acquired along either a full or partial axial scan.

  12. Electron energy loss spectroscopy microanalysis and imaging in the transmission electron microscope: example of biological applications

    International Nuclear Information System (INIS)

    Diociaiuti, Marco

    2005-01-01

    This paper reports original results obtained in our laboratory over the past few years in the application of both electron energy loss spectroscopy (EELS) and electron spectroscopy imaging (ESI) to biological samples, performed in two transmission electron microscopes (TEM) equipped with high-resolution electron filters and spectrometers: a Gatan model 607 single magnetic sector double focusing EEL serial spectrometer attached to a Philips 430 TEM and a Zeiss EM902 Energy Filtering TEM. The primary interest was on the possibility offered by the combined application of these spectroscopic techniques with those offered by the TEM. In particular, the electron beam focusing available in a TEM allowed us to perform EELS and ESI on very small sample volumes, where high-resolution imaging and electron diffraction techniques can provide important structural information. I show that ESI was able to improve TEM performance, due to the reduced chromatic aberration and the possibility of avoiding the sample staining procedure. Finally, the analysis of the oscillating extended energy loss fine structure (EXELFS) beyond the ionization edges characterizing the EELS spectra allowed me, in a manner very similar to the extended X-ray absorption fine structure (EXAFS) analysis of the X-ray absorption spectra, to obtain short-range structural information for such light elements of biological interest as O or Fe. The Philips EM430 (250-300 keV) TEM was used to perform EELS microanalysis on Ca, P, O, Fe, Al and Si. The assessment of the detection limits of this method was obtained working with well-characterized samples containing Ca and P, and mimicking the actual cellular matrix. I applied EELS microanalysis to Ca detection in bone tissue during the mineralization process and to P detection in the cellular membrane of erythrocytes treated with an anti-tumoral drug, demonstrating that the cellular membrane is a drug target. I applied EELS microanalysis and selected area electron

  13. Evaluation of the composition of additional filters of intraoral radiological equipment by energy dispersive X-ray fluorescence (EDXRF)

    International Nuclear Information System (INIS)

    Franca, Alana Caroline; Torres, Catarina A.M.P.; Rocha, Ana S.S.; Deniak, Valeriy; Lara, Alessandro L.; Paschuk, Sergei A.; Fernandes, Angela; Westphalen, Fernando Henrique

    2013-01-01

    The need for high quality standards for radiographic images in order to make an assertive diagnosis, together with the additional filtration required in intraoral X-ray equipment, shows the need to evaluate these filters. This study aims to examine the influence of the elemental composition of the filters of dental intraoral X-ray equipment on radiographic image quality. The filter analyses were performed using the energy dispersive X-ray fluorescence (EDXRF) method. Ten conventional filters were analysed. In this study, 33 radiographic exposures were performed using films: twenty radiographs in the incisor region and ten in the molar region; three exposures were also made in the same regions under the same conditions without using a filter. After the radiographs were developed, optical density was measured and all radiographs were submitted to subjective evaluation by dental radiologists. The data obtained were correlated to evaluate the effects of the elemental composition of the filters on radiographic image quality. The elements found were: aluminum, cobalt, copper, sulfur, iron, manganese, titanium, zinc, and zirconium. The images obtained were grouped as: molars at 0.3 s; incisors at 0.2 s; incisors at 0.3 s; and the group without filters. From the results obtained it was concluded that both unclear and ideal radiographs were produced using filters of common elemental composition. Therefore, the conventional filters evaluated are an acceptable option for producing quality images in dental radiology, despite differences in the composition of their alloys. (author)

  14. Evaluation of the composition of additional filters of intraoral radiological equipment by energy dispersive x-ray fluorescence (EDXRF)

    Energy Technology Data Exchange (ETDEWEB)

    Franca, Alana Caroline; Torres, Catarina A.M.P.; Rocha, Ana S.S.; Deniak, Valeriy; Lara, Alessandro L.; Paschuk, Sergei A., E-mail: alanacarolinef@gmail.com, E-mail: sergei@utfpr.edu.br [Universidade Tecnologica Federal do Parana (CPGEI/UTFPR), Curitiba, PR (Brazil). Programa de Pos-Graduacao em Engenharia Eletrica e Informatica Industrial; Fernandes, Angela; Westphalen, Fernando Henrique, E-mail: angelafernandes@ufpr.br [Universidade Federal do Parana (UFPR), Curitiba, PR (Brazil). Setor de Ciencias da Saude

    2013-07-01

    The need for high quality standards for radiographic images in order to make an assertive diagnosis, together with the additional filtration required in intraoral X-ray equipment, shows the need to evaluate these filters. This study aims to examine the influence of the elemental composition of the filters of dental intraoral X-ray equipment on radiographic image quality. The filter analyses were performed using the energy dispersive X-ray fluorescence (EDXRF) method. Ten conventional filters were analysed. In this study, 33 radiographic exposures were performed using films: twenty radiographs in the incisor region and ten in the molar region; three exposures were also made in the same regions under the same conditions without using a filter. After the radiographs were developed, optical density was measured and all radiographs were submitted to subjective evaluation by dental radiologists. The data obtained were correlated to evaluate the effects of the elemental composition of the filters on radiographic image quality. The elements found were: aluminum, cobalt, copper, sulfur, iron, manganese, titanium, zinc, and zirconium. The images obtained were grouped as: molars at 0.3 s; incisors at 0.2 s; incisors at 0.3 s; and the group without filters. From the results obtained it was concluded that both unclear and ideal radiographs were produced using filters of common elemental composition. Therefore, the conventional filters evaluated are an acceptable option for producing quality images in dental radiology, despite differences in the composition of their alloys. (author)

  15. Asymptomatic Lumbar Vertebral Erosion from Inferior Vena Cava Filter Perforation

    International Nuclear Information System (INIS)

    Fang, Wayne; Hieb, Robert A.; Olson, Eric; Carrera, Guillermo F.

    2007-01-01

    In 2002, a 24-year-old female trauma patient underwent prophylactic inferior vena cava filter placement. Recurrent bouts of renal stones prompted serial CT imaging in 2004. In this brief report, we describe erosion and ossification of the L3 vertebral body by a Greenfield filter strut.

  16. SU-E-I-62: Assessing Radiation Dose Reduction and CT Image Optimization Through the Measurement and Analysis of the Detector Quantum Efficiency (DQE) of CT Images Using Different Beam Hardening Filters

    International Nuclear Information System (INIS)

    Collier, J; Aldoohan, S; Gill, K

    2014-01-01

    Purpose: Reducing patient dose while maintaining (or even improving) image quality is one of the foremost goals in CT imaging. To this end, we consider the feasibility of optimizing CT scan protocols in conjunction with the application of different beam-hardening filtrations and assess this augmentation through noise-power spectrum (NPS) and detector quantum efficiency (DQE) analysis. Methods: American College of Radiology (ACR) and Catphan phantoms (The Phantom Laboratory) were scanned with a 64-slice CT scanner with additional filtration of varying thickness and composition (copper, nickel, tantalum, titanium, and tungsten) applied. A MATLAB-based code was employed to calculate the image noise NPS. The Catphan Image Owl software suite was then used to compute the modulation transfer function (MTF) responses of the scanner. The DQE for each additional filter, including the inherent filtration, was then computed from these values. Finally, CT dose index (CTDIvol) values were obtained for each applied filtration through the use of a 100 mm pencil ionization chamber and a CT dose phantom. Results: NPS, MTF, and DQE values were computed for each applied filtration and compared to the reference case of inherent beam-hardening filtration only. Results showed that the NPS values were reduced by between 5 and 12% compared to the inherent filtration case. Additionally, CTDIvol values were reduced by between 15 and 27% depending on the composition of the filtration applied. However, no noticeable changes in image contrast-to-noise ratios were noted. Conclusion: The reduction in the quantum noise section of the NPS profile found in this phantom-based study is encouraging. The reduction in both noise and dose through the application of beam-hardening filters is reflected in our phantom image quality. However, further investigation is needed to ascertain the applicability of this approach to reducing patient dose while maintaining diagnostically acceptable image qualities in a
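The DQE computation described above combines the measured MTF and NPS. As a hedged illustration of the standard relation DQE(f) = MTF(f)² / (q · NNPS(f)) — not the authors' MATLAB code, and with toy numbers in place of measured data — one might write:

```python
import numpy as np

def dqe(mtf, nnps, fluence):
    """Detective quantum efficiency from the measured MTF and the
    normalized noise power spectrum: DQE(f) = MTF(f)^2 / (q * NNPS(f)),
    with q the incident photon fluence."""
    mtf = np.asarray(mtf, dtype=float)
    nnps = np.asarray(nnps, dtype=float)
    return mtf ** 2 / (fluence * nnps)

# Toy numbers only (not values from the study): an MTF that falls with
# spatial frequency and a roughly flat normalized NPS.
freqs = np.linspace(0.0, 1.0, 5)      # cycles/mm
mtf = np.exp(-2.0 * freqs)
nnps = np.full_like(freqs, 1e-5)      # mm^2
print(dqe(mtf, nnps, fluence=1e5))    # DQE falls from 1.0 at f = 0
```

Lowering the NPS at fixed MTF and fluence, as the added filters did in the study, raises the DQE directly.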

  17. Adaptive bilateral filter for image denoising and its application to in-vitro Time-of-Flight data

    Science.gov (United States)

    Seitel, Alexander; dos Santos, Thiago R.; Mersmann, Sven; Penne, Jochen; Groch, Anja; Yung, Kwong; Tetzlaff, Ralf; Meinzer, Hans-Peter; Maier-Hein, Lena

    2011-03-01

    Image-guided therapy systems generally require registration of pre-operative planning data with the patient's anatomy. One common approach to achieve this is to acquire intra-operative surface data and match it to surfaces extracted from the planning image. Although increasingly popular for surface generation in general, the novel Time-of-Flight (ToF) technology has not yet been applied in this context. This may be attributed to the fact that ToF range images are subject to considerable noise. The contribution of this study is two-fold. Firstly, we present an adaptation of the well-known bilateral filter for denoising ToF range images based on the noise characteristics of the camera. Secondly, we assess the quality of organ surfaces generated from ToF range data with and without bilateral smoothing, using corresponding high-resolution CT data as ground truth. According to an evaluation on five porcine organs, the root mean squared (RMS) distance between the denoised ToF data points and the reference computed tomography (CT) surfaces ranged from 3.0 mm (lung) to 9.0 mm (kidney). This corresponds to an error reduction of up to 36% compared to the error of the original ToF surfaces.
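The study's adaptive variant tunes its parameters to the camera's noise model, which the abstract does not detail; the underlying (non-adaptive) bilateral filter it builds on can be sketched as follows, with illustrative sigma values:

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=1.5, sigma_r=0.1):
    """Brute-force bilateral filter: each pixel becomes a weighted mean
    of its neighbours, with weights falling off with both spatial
    distance (sigma_s) and intensity difference (sigma_r), so edges
    with large intensity jumps are preserved."""
    img = np.asarray(img, dtype=float)
    pad = np.pad(img, radius, mode='edge')
    out = np.zeros_like(img)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2*radius + 1, j:j + 2*radius + 1]
            range_w = np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
            weights = spatial * range_w
            out[i, j] = np.sum(weights * patch) / np.sum(weights)
    return out
```

On a ToF range image, smoothing flat regions while leaving depth discontinuities intact is exactly what keeps the extracted organ surface from being rounded off.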

  18. Hardware design of a median filter based on a window structure and Batcher's odd-even sort network

    Directory of Open Access Journals (Sweden)

    SUN Kaimin

    2013-06-01

    Full Text Available Area and speed are two important factors to be considered in designing a median filter with digital circuits. Area consideration requires the use of as few logical resources as possible, while speed consideration requires the system to be capable of working at higher clock frequencies, with as few clock cycles as possible to complete a frame of filtering or real-time filtering. This paper gives a new design of a median filter, the hardware structure of which is a 3×3 window structure with two buffers. The filter function module is based on Batcher's odd-even sort network. The structural design is implemented in an FPGA, verified with ModelSim software, and realizes video image filtering. The experimental analysis shows that this new median filter structure effectively decreases logical resources (merely using 741 logic elements) and accelerates the pixel processing speed up to 27 MHz. This filter achieves real-time processing of video images at 30 frames/s. This design not only has practical value, but also provides a reference for hardware structure design in digital image processing.
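The filtering module pairs the 3×3 window with a Batcher odd-even merge sort network. A software mirror of that network (padding the 9 window values with +inf sentinels to the power-of-two size the network requires) might look like:

```python
import math

def oddeven_merge(lo, hi, r):
    """Yield the compare-exchange pairs that merge two sorted runs
    (Batcher's odd-even merge); indices lo..hi are inclusive."""
    step = r * 2
    if step < hi - lo:
        yield from oddeven_merge(lo, hi, step)
        yield from oddeven_merge(lo + r, hi, step)
        for i in range(lo + r, hi - r, step):
            yield (i, i + r)
    else:
        yield (lo, lo + r)

def oddeven_merge_sort(lo, hi):
    """Yield the full Batcher odd-even merge sort network for lo..hi
    (the range length must be a power of two)."""
    if (hi - lo) >= 1:
        mid = lo + (hi - lo) // 2
        yield from oddeven_merge_sort(lo, mid)
        yield from oddeven_merge_sort(mid + 1, hi)
        yield from oddeven_merge(lo, hi, 1)

def median3x3(window):
    """Median of a 3x3 window: pad the 9 values with +inf sentinels to
    16 inputs, run the network, and read the middle of the 9 real
    values (the sentinels sort to the top)."""
    vals = list(window) + [math.inf] * 7
    for i, j in oddeven_merge_sort(0, 15):
        if vals[i] > vals[j]:
            vals[i], vals[j] = vals[j], vals[i]
    return vals[4]

print(median3x3([5, 1, 9, 3, 7, 2, 8, 4, 6]))   # → 5
```

In hardware each yielded pair becomes one compare-exchange element, and the fixed, data-independent wiring is what makes the network attractive for an FPGA pipeline.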

  19. Image quality enhancement for skin cancer optical diagnostics

    Science.gov (United States)

    Bliznuks, Dmitrijs; Kuzmina, Ilona; Bolocko, Katrina; Lihachev, Alexey

    2017-12-01

    The research presents image quality analysis and enhancement proposals in the biophotonics area. The sources of image problems are reviewed and analyzed. The problems with the most impact in biophotonics are analyzed in terms of a specific task: skin cancer diagnostics. The results point out that the main problem for skin cancer analysis is uneven skin illumination. Since it is often not possible to prevent illumination problems, the paper proposes an image post-processing algorithm: low-frequency filtering. Practical results show an improvement in diagnostic results after using the proposed filter. Moreover, the filter does not reduce the quality of diagnostic results for images without illumination defects. The current filtering algorithm requires empirical tuning of the filter parameters. Further work is needed to test the algorithm in other biophotonic applications and to propose automatic filter parameter selection.
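The abstract does not specify the low-frequency filtering algorithm; one plausible realization, offered purely as a hedged sketch, is to estimate the illumination field with a heavy Gaussian low-pass and divide it out (the sigma value here is an assumption, and would be one of the empirically tuned parameters):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def correct_illumination(img, sigma=30.0, eps=1e-6):
    """Suppress low-frequency illumination variation by dividing the
    image by its heavily smoothed (low-pass) version, then restoring
    the original brightness scale."""
    img = np.asarray(img, dtype=float)
    background = gaussian_filter(img, sigma)   # slowly varying illumination
    corrected = img / (background + eps)
    return corrected * img.mean()
```

The high-frequency lesion detail passes through almost unchanged, which is consistent with the reported result that images without illumination defects are not degraded.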

  20. Information content of the space-frequency filtering of blood plasma layers laser images in the diagnosis of pathological changes

    Science.gov (United States)

    Ushenko, A. G.; Boychuk, T. M.; Mincer, O. P.; Bodnar, G. B.; Kushnerick, L. Ya.; Savich, V. O.

    2013-12-01

    The bases of the method of space-frequency filtering of the phase distribution of blood plasma pellicles are given here. A model of the optical-anisotropic properties of the albumen network of a blood plasma pellicle, with regard to the linear and circular double refraction of albumen and globulin crystals, is proposed. Comparative research on the effectiveness of the methods of direct polarization mapping of the azimuth images of blood plasma pellicle layers and of space-frequency polarimetry of the laser radiation transformed by divaricate and hole-like optical-anisotropic networks of blood plasma pellicles was held. On the basis of complex statistical, correlation and fractal analysis of the filtered space-frequency polarization azimuth maps of the blood plasma pellicle structure, a set of criteria for the change of the double refraction of the albumen networks caused by prostate cancer was traced and proved.

  1. CLASSIFICATION OF HYPERSPECTRAL DATA BASED ON GUIDED FILTERING AND RANDOM FOREST

    Directory of Open Access Journals (Sweden)

    H. Ma

    2017-09-01

    Full Text Available Hyperspectral images usually consist of more than one hundred spectral bands, which have the potential to provide rich spatial and spectral information. However, the application of hyperspectral data is still challenging due to “the curse of dimensionality”. In this context, many techniques which aim to make full use of both the spatial and spectral information are investigated. In order to preserve the geometrical information with fewer spectral bands, we propose a novel method which combines principal components analysis (PCA), guided image filtering and the random forest classifier (RF). In detail, PCA is first employed to reduce the dimension of the spectral bands. Second, the guided image filtering technique is introduced to smooth land objects while preserving their edges. Finally, the features are fed into the RF classifier. To illustrate the effectiveness of the method, we carry out experiments on the popular Indian Pines data set, which was collected by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) sensor. By comparing the proposed method with methods using only PCA or the guided image filter, we find that the proposed method performs better.
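The guided filtering step can be sketched with the standard guided filter of He et al., implemented here with box filters (the radius and eps values are illustrative, not the paper's parameters). In the pipeline, each PCA band would be filtered this way before the per-pixel features go to the random forest:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, src, radius=4, eps=1e-4):
    """Edge-preserving guided filter (He et al.): smooths `src` while
    keeping the edges present in the guidance image `guide`, via a
    local linear model src ~ a * guide + b estimated with box filters."""
    size = 2 * radius + 1
    mean_I = uniform_filter(guide, size)
    mean_p = uniform_filter(src, size)
    cov_Ip = uniform_filter(guide * src, size) - mean_I * mean_p
    var_I = uniform_filter(guide * guide, size) - mean_I ** 2
    a = cov_Ip / (var_I + eps)          # per-pixel linear coefficients
    b = mean_p - a * mean_I
    # average the coefficients, then apply the local linear model
    return uniform_filter(a, size) * guide + uniform_filter(b, size)

# In the paper's pipeline each principal-component band would be
# smoothed with itself as the guide, then classified pixel-wise with
# a random forest.
```

Using the band itself as the guide smooths homogeneous land-cover regions while field boundaries, which carry large local variance, survive the filtering.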

  2. Backprojection filtering for variable orbit fan-beam tomography

    International Nuclear Information System (INIS)

    Gullberg, G.T.; Zeng, G.L.

    1995-01-01

    Backprojection filtering algorithms are presented for three variable orbit fan-beam geometries. Expressions for the fan-beam projection and backprojection operators are given for a flat-detector fan-beam geometry with fixed focal length, with variable focal length, and with fixed focal length and off-center focusing. Backprojection operators are derived for each geometry using a transformation of coordinates to transform from a parallel-geometry backprojector to a fan-beam backprojector for the appropriate geometry. The backprojection operator includes a factor which is a function of the coordinates of the projection ray and the coordinates of the pixel in the backprojected image. The backprojection filtering algorithm first backprojects the variable orbit fan-beam projection data using the appropriately derived backprojector to obtain a 1/r blurring of the original image, then takes the two-dimensional (2D) fast Fourier transform (FFT) of the backprojected image, multiplies the transformed image by the 2D ramp filter function, and finally takes the inverse 2D FFT to obtain the reconstructed image. Computer simulations verify that backprojectors with appropriate weighting give artifact-free reconstructions of simulated line-integral projections. Also, it is shown that it is not necessary to assume a projection model of line integrals; the projector and backprojector can instead be defined to model the physics of the imaging detection process. A backprojector for variable orbit fan-beam tomography with fixed focal length is derived which includes an additional factor which is a function of the flux density along the flat detector. It is shown that the impulse response of the composite of the projection and backprojection operations is equal to 1/r.
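The filtering step of the backprojection-filtering algorithm — 2D FFT, multiplication by the 2D ramp, inverse 2D FFT — can be sketched generically (this is not the paper's weighted fan-beam backprojector, only the deblurring stage applied to its output):

```python
import numpy as np

def ramp_filter_2d(backprojection):
    """Backprojection-filtering step: the backprojected image carries a
    1/r blur, whose 2D Fourier transform is proportional to 1/|rho|;
    multiplying the spectrum by the 2D ramp |rho| undoes the blur."""
    b = np.asarray(backprojection, dtype=float)
    ny, nx = b.shape
    fy = np.fft.fftfreq(ny)[:, None]      # vertical spatial frequencies
    fx = np.fft.fftfreq(nx)[None, :]      # horizontal spatial frequencies
    ramp = np.sqrt(fx**2 + fy**2)         # radial frequency |rho|
    return np.real(np.fft.ifft2(np.fft.fft2(b) * ramp))
```

Because the ramp is zero at DC, the filtered image has zero mean; in practice the reconstruction is also windowed to control noise amplification at high frequencies.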

  3. Periodic additive noises reduction in 3D images used in building of voxel phantoms through an efficient implementation of the 3D FFT: zipper artifacts filtering

    International Nuclear Information System (INIS)

    Oliveira, Alex C.H. de; Lima, Fernando R.A.; Vieira, Jose W.; Leal Neto, Viriato

    2009-01-01

    The anthropomorphic models used in computational dosimetry are predominantly built from stacks of CT (computed tomography) or MRI (magnetic resonance imaging) images obtained from patients or volunteers. The building of these stacks (usually called voxel phantoms or tomographic phantoms) requires computer processing before they can be used in an exposure computational model. Noise present in these stacks can be confused with significant structures. In a 3D image with periodic additive noise, in the frequency domain the noise is fully added to the central slice. The discrete Fourier transform is the fundamental mathematical tool that allows the switch from the spatial domain to the frequency domain, and vice versa. The FFT (fast Fourier transform) algorithm is an ideal computational tool for performing this domain switch efficiently. This paper presents a new methodology for implementation, in managed C++ (Microsoft Visual Studio .NET), of the fast Fourier transform of 3D digital images (FFT3D) using, essentially, trigonometric recombination. The reduction of periodic additive noise consists in filtering only the central slice of the 3D image in the frequency domain and transforming it back into the spatial domain through the inverse FFT3D. An example of an application of this method is the filtering of zipper artifacts in MRI images. These processes were implemented in the software DIP (Digital Image Processing). (author)
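The central-slice filtering idea can be sketched in a few lines: transform the volume, apply a notch mask only to the zero-frequency slice of the shifted spectrum, and transform back (an illustrative numpy version, not the managed C++ FFT3D implementation described in the paper):

```python
import numpy as np

def filter_central_slice(volume, notch):
    """Remove periodic additive noise whose spectral energy sits in the
    central slice of the 3D spectrum: forward FFT, apply a 2D notch
    mask to that slice only, inverse FFT."""
    spec = np.fft.fftshift(np.fft.fftn(volume))
    center = volume.shape[0] // 2          # index of the central slice
    spec[center] *= notch                  # notch: 2D mask over the slice
    return np.real(np.fft.ifftn(np.fft.ifftshift(spec)))
```

With an all-ones mask the volume passes through unchanged; zeroing the mask at the noise peaks removes the periodic pattern while leaving the rest of the spectrum, and hence the anatomy, untouched.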

  4. Accuracy improvement of CT reconstruction using tree-structured filter bank

    International Nuclear Information System (INIS)

    Ueda, Kazuhiro; Morimoto, Hiroaki; Morikawa, Yoshitaka; Murakami, Junichi

    2009-01-01

    Accuracy improvement of the 'CT reconstruction algorithm using a TSFB (Tree-Structured Filter Bank)', a high-speed CT reconstruction algorithm, is proposed. The TSFB method can largely reduce the amount of computation in comparison with the convolution backprojection (CB) method, but it has the problem that artifacts occur in the reconstructed image, since the stage processing disregards the signal outside the reconstruction domain. Also, the whole-band filter, a component of the two-dimensional synthesis filter, is an IIR filter, so an artifact occurs at the edge of the reconstructed image. In order to suppress these artifacts, the proposed method enlarges the processing range of the TSFB method in the outer domain by controlling the width of the specimen line and adding lines outside the reconstruction domain. Furthermore, to avoid an increase in the amount of computation, an algorithm is proposed that determines the needed processing range depending on the number of TSFB processing stages and the degree of incline of the filter, and then updates the position and width of the specimen line to process only the needed range. Simulations show that this realizes high-speed and highly accurate CT reconstruction: the quality of the reconstructed image is improved in comparison with the plain TSFB method and matches the result of the CB method. (T. Tanaka)

  5. The effect of high-pass filters on the visibility of microcalcifications

    International Nuclear Information System (INIS)

    Lai, Y-M.

    2002-01-01

    Mammographic microcalcifications are significant in diagnosis, as they may indicate early-stage malignancy. Thus mammographic screen-film combinations incorporate high contrast and high spatial resolution in order to detect fine details. However, screen-film combinations, which couple different functions, can pose limitations. Images acquired in digital format provide a new way of using images, as image enhancement can be achieved by manipulation in the spatial domain without additional radiation exposure. Gold disks of differing diameters and thicknesses in the Nijmegen phantom were used to simulate microcalcifications. Microcalcification, as an image feature, consists of high-frequency components in the spatial domain. High-pass filters therefore enable enhancement of microcalcifications. Three high-pass filters were investigated to compare their efficacy. A phantom consisting of polystyrene granules embedded in a sodium iodide solution was used to simulate the breast tissue pattern. A composite radiographic image was produced by combining this phantom with the Nijmegen phantom; the result was then digitised and processed. This was assessed by observer performance in locating the microcalcifications. Also, line-profile image analysis was performed on digital mammograms with microcalcifications. The filter with a central weight of 9 and neighbouring pixel weights of -1 exhibited the greatest effectiveness in enhancing the microcalcifications. Copyright (2002) Australian Institute of Radiography
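The winning filter — central weight 9, eight neighbouring weights of -1 — is an ordinary 3×3 sharpening convolution: the kernel sums to 1, so flat regions pass through unchanged while fine high-frequency detail such as microcalcifications is amplified. A minimal sketch:

```python
import numpy as np
from scipy.ndimage import convolve

# The most effective filter in the study: central weight 9, eight
# neighbouring weights -1 (kernel sum = 1).
kernel = np.array([[-1, -1, -1],
                   [-1,  9, -1],
                   [-1, -1, -1]], dtype=float)

def enhance(img):
    """Apply the high-pass kernel with replicated borders."""
    return convolve(np.asarray(img, dtype=float), kernel, mode='nearest')
```

A uniform background is left at its original value, while an isolated bright pixel is boosted ninefold, which is exactly the behaviour wanted for speck-like features.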

  6. Filter multiplexing by use of spatial Code Division Multiple Access approach.

    Science.gov (United States)

    Solomon, Jonathan; Zalevsky, Zeev; Mendlovic, David; Monreal, Javier Garcia

    2003-02-10

    The increasing popularity of optical communication has also brought a demand for broader bandwidth. The trend, naturally, has been to adapt methods from traditional electronic communication. One of the most effective traditional methods is Code Division Multiple Access (CDMA). In this research, we suggest the use of this approach for spatial coding applied to images. The approach is to multiplex several filters into one plane while keeping their mutual orthogonality. It is shown that if the filters are limited by their bandwidth, the output of all the filters can be sampled at the original image resolution and fully recovered through an all-optical setup. The theoretical analysis of such a setup is verified in an experimental demonstration.
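The CDMA principle behind the multiplexing — spread each filter with a mutually orthogonal code, sum everything into one signal, and recover any one filter by correlation — can be illustrated numerically (a toy numpy demonstration of the coding idea only, not of the all-optical setup):

```python
import numpy as np
from scipy.linalg import hadamard

codes = hadamard(4)                      # rows: orthogonal +/-1 codes
filters = np.random.rand(4, 16)          # toy stand-ins for filter data

# Multiplex: spread every sample of filter i by its 4-chip code,
# then sum all spread signals into a single plane.
multiplexed = sum(np.kron(filters[i], codes[i]) for i in range(4))

# Demultiplex filter j: correlate each 4-chip group with code j.
# Orthogonality (codes[i] . codes[j] = 4 if i == j, else 0) makes the
# other filters vanish.
j = 2
groups = multiplexed.reshape(16, 4)
recovered = groups @ codes[j] / 4.0
print(np.allclose(recovered, filters[j]))   # → True
```

The bandwidth limit mentioned in the abstract plays the role of the chip rate here: each original sample must leave room for its code chips in the shared plane.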

  7. Comparison of optimal performance at 300 keV of three direct electron detectors for use in low dose electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    McMullan, G., E-mail: gm2@mrc-lmb.cam.ac.uk [MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge CB2 0QH (United Kingdom); Faruqi, A.R. [MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge CB2 0QH (United Kingdom); Clare, D. [Crystallography and Institute of Structural and Molecular Biology, Birkbeck College, University of London, Malet Street, London WC1E 7HX (United Kingdom); Henderson, R. [MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge CB2 0QH (United Kingdom)

    2014-12-15

    Low dose electron imaging applications such as electron cryo-microscopy are now benefitting from the improved performance and flexibility of recently introduced electron imaging detectors in which electrons are directly incident on backthinned CMOS sensors. There are currently three commercially available detectors of this type: the Direct Electron DE-20, the FEI Falcon II and the Gatan K2 Summit. These have different characteristics and so it is important to compare their imaging properties carefully with a view to optimise how each is used. Results at 300 keV for both the modulation transfer function (MTF) and the detective quantum efficiency (DQE) are presented. Of these, the DQE is the most important in the study of radiation sensitive samples where detector performance is crucial. We find that all three detectors have a better DQE than film. The K2 Summit has the best DQE at low spatial frequencies but with increasing spatial frequency its DQE falls below that of the Falcon II. - Highlights: • Three direct electron detectors offer better DQE than film at 300 keV. • Recorded 300 keV electron events on the detectors have very similar Landau distributions. • The Gatan K2 Summit detector has the highest DQE at low spatial frequency. • The FEI Falcon II detector has the highest DQE beyond one half the Nyquist frequency. • The Direct Electron DE-20 detector has the fastest data acquisition rate.

  8. Comparison of optimal performance at 300 keV of three direct electron detectors for use in low dose electron microscopy

    International Nuclear Information System (INIS)

    McMullan, G.; Faruqi, A.R.; Clare, D.; Henderson, R.

    2014-01-01

    Low dose electron imaging applications such as electron cryo-microscopy are now benefitting from the improved performance and flexibility of recently introduced electron imaging detectors in which electrons are directly incident on backthinned CMOS sensors. There are currently three commercially available detectors of this type: the Direct Electron DE-20, the FEI Falcon II and the Gatan K2 Summit. These have different characteristics and so it is important to compare their imaging properties carefully with a view to optimise how each is used. Results at 300 keV for both the modulation transfer function (MTF) and the detective quantum efficiency (DQE) are presented. Of these, the DQE is the most important in the study of radiation sensitive samples where detector performance is crucial. We find that all three detectors have a better DQE than film. The K2 Summit has the best DQE at low spatial frequencies but with increasing spatial frequency its DQE falls below that of the Falcon II. - Highlights: • Three direct electron detectors offer better DQE than film at 300 keV. • Recorded 300 keV electron events on the detectors have very similar Landau distributions. • The Gatan K2 Summit detector has the highest DQE at low spatial frequency. • The FEI Falcon II detector has the highest DQE beyond one half the Nyquist frequency. • The Direct Electron DE-20 detector has the fastest data acquisition rate.

  9. Spatial frequency domain imaging using a snap-shot filter mosaic camera with multi-wavelength sensitive pixels

    Science.gov (United States)

    Strömberg, Tomas; Saager, Rolf B.; Kennedy, Gordon T.; Fredriksson, Ingemar; Salerud, Göran; Durkin, Anthony J.; Larsson, Marcus

    2018-02-01

    Spatial frequency domain imaging (SFDI) utilizes a digital light processing (DLP) projector for illuminating turbid media with sinusoidal patterns. The tissue absorption (μa) and reduced scattering coefficient (μs′) are calculated by analyzing the modulation transfer function for at least two spatial frequencies. We evaluated different illumination strategies with red, green and blue light emitting diodes (LEDs) in the DLP, while imaging with a filter mosaic camera, XiSpec, with 16 different multi-wavelength sensitive pixels in the 470-630 nm wavelength range. Data were compared to SFDI by a multispectral camera setup (MSI) consisting of four cameras with bandpass filters centered at 475, 560, 580 and 650 nm. A pointwise system for comprehensive microcirculation analysis (EPOS) was used for comparison. A 5-min arterial occlusion and release protocol on the forearm of a Caucasian male with fair skin was analyzed by fitting the absorption spectra of the chromophores HbO2, Hb and melanin to the estimated μa. The tissue fractions of red blood cells (fRBC) and melanin (fmel) and the Hb oxygenation (SO2) were calculated at baseline, end of occlusion, early after release and late after release. EPOS results showed a decrease in SO2 during the occlusion and hyperemia during release (SO2 = 40%, 5%, 80% and 51%). The fRBC showed an increase during occlusion and release phases. The best MSI resemblance to the EPOS was for green LED illumination (SO2 = 53%, 9%, 82%, 65%). Several illumination and analysis strategies using the XiSpec gave un-physiological results (e.g. negative SO2). XiSpec with green LED illumination gave the expected change in fRBC, while the dynamics in SO2 were less than those for EPOS. These results may be explained by the calculation of modulation using an illumination and detector setup with a broad spectral transmission bandwidth, with considerable variation in μa of the included chromophores. Approaches for either reducing the effective bandwidth of

  10. Speckle reduction in echocardiography by temporal compounding and anisotropic diffusion filtering

    Science.gov (United States)

    Giraldo-Guzmán, Jader; Porto-Solano, Oscar; Cadena-Bonfanti, Alberto; Contreras-Ortiz, Sonia H.

    2015-01-01

    Echocardiography is a medical imaging technique based on ultrasound signals that is used to evaluate heart anatomy and physiology. Echocardiographic images are affected by speckle, a type of multiplicative noise that obscures details of the structures, and reduces the overall image quality. This paper shows an approach to enhance echocardiography using two processing techniques: temporal compounding and anisotropic diffusion filtering. We used twenty echocardiographic videos that include one or three cardiac cycles to test the algorithms. Two images from each cycle were aligned in space and averaged to obtain the compound images. These images were then processed using anisotropic diffusion filters to further improve their quality. Resultant images were evaluated using quality metrics and visual assessment by two medical doctors. The average total improvement on signal-to-noise ratio was up to 100.29% for videos with three cycles, and up to 32.57% for videos with one cycle.
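The anisotropic diffusion stage is commonly realized with a Perona-Malik scheme; the sketch below (illustrative parameters, periodic borders via np.roll for brevity — the paper's exact filter settings are not given) pairs it with the temporal compounding step of averaging aligned frames:

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=0.1, step=0.2):
    """Perona-Malik diffusion: smooths homogeneous (speckled) regions
    while the conduction coefficient g(d) = exp(-(d/kappa)^2) shuts
    diffusion off across strong edges (|gradient| >> kappa)."""
    u = np.asarray(img, dtype=float).copy()
    for _ in range(n_iter):
        # nearest-neighbour differences (periodic borders via np.roll)
        dn = np.roll(u, 1, axis=0) - u
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        u += step * sum(np.exp(-(d / kappa) ** 2) * d
                        for d in (dn, ds, de, dw))
    return u

def temporal_compound(frames):
    """Temporal compounding: average spatially aligned frames so that
    uncorrelated speckle partially cancels before diffusion filtering."""
    return np.mean(np.asarray(frames, dtype=float), axis=0)
```

Averaging first reduces the multiplicative speckle variance, which lets the diffusion step use gentler smoothing and so preserve more cardiac structure.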

  11. MULTISCALE TENSOR ANISOTROPIC FILTERING OF FLUORESCENCE MICROSCOPY FOR DENOISING MICROVASCULATURE.

    Science.gov (United States)

    Prasath, V B S; Pelapur, R; Glinskii, O V; Glinsky, V V; Huxley, V H; Palaniappan, K

    2015-04-01

    Fluorescence microscopy images are contaminated by noise, and improving image quality by filtering without blurring vascular structures is an important step in automatic image analysis. The application of interest here is to automatically and accurately extract the structural components of the microvascular system from images acquired by fluorescence microscopy. A robust denoising process is necessary in order to extract accurate vascular morphology information. For this purpose, we propose a multiscale tensor anisotropic diffusion model which progressively and adaptively updates the amount of smoothing while preserving vessel boundaries accurately. Based on a coherency-enhancing flow with a planar confidence measure and fused 3D structure information, our method integrates multiple scales for microvasculature preservation and noise removal in membrane structures. Experimental results on simulated synthetic images and epifluorescence images show the advantage of our improvement over other related diffusion filters. We further show that the proposed multiscale integration approach improves the denoising accuracy of different tensor diffusion methods to obtain better microvasculature segmentation.

  12. Image Denoising and Segmentation Approach to Detect Tumor from Brain MRI Images

    Directory of Open Access Journals (Sweden)

    Shanta Rangaswamy

    2018-04-01

    Full Text Available The detection of brain tumors is a challenging problem, due to the structure of the tumor cells in the brain. This project presents a systematic method that enhances the detection of brain tumor cells and analyzes functional structures through training and classification of the samples with an SVM and tumor cell segmentation using a DWT algorithm. From the input MRI images collected, noise is first removed by applying a Wiener filtering technique. In the image enhancement phase, all the color components of the MRI images are converted into a gray-scale image and the edges in the image are made clear, to obtain better identification and improved image quality. In the segmentation phase, a DWT is performed on the MRI image to segment the gray-scale image. During post-processing, classification of the tumor is performed using the SVM classifier. Wiener filtering, DWT, and SVM segmentation strategies were used to find and group the tumor position in the filtered MRI picture. An essential observation in this work is that the multi-stage approach utilizes a hierarchical classification strategy, which improves performance significantly. This technique reduces the computational complexity in both time and memory. This classification strategy works accurately on all images and has achieved an accuracy of 93%.
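The Wiener filtering stage can be sketched with SciPy's adaptive Wiener filter (the window size here is an assumption; the paper does not state its parameters):

```python
import numpy as np
from scipy.signal import wiener

def denoise_mri(img, window=5):
    """Stage 1 of the pipeline: adaptive Wiener filtering. The filter
    estimates the local mean and variance in a window x window
    neighbourhood and smooths strongly where the local variance is low
    (flat tissue) and weakly where it is high (edges)."""
    return wiener(np.asarray(img, dtype=float), mysize=window)
```

The denoised image then goes on to gray-scale conversion, DWT-based segmentation and SVM classification as described above.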

  13. Demosaicking algorithm for the Kodak-RGBW color filter array

    Science.gov (United States)

    Rafinazari, M.; Dubois, E.

    2015-01-01

    Digital cameras capture images through different color filter arrays (CFAs) and then reconstruct the full color image. Each CFA pixel captures only one primary color component; the other primary components are estimated using information from neighboring pixels. During demosaicking, the two unknown color components are estimated at each pixel location. Most demosaicking algorithms use the Bayer CFA pattern with red, green and blue filters. The least-squares luma-chroma demultiplexing method is a state-of-the-art demosaicking method for the Bayer CFA. In this paper we develop a new demosaicking algorithm for the Kodak-RGBW CFA. This particular CFA reduces noise and improves the quality of the reconstructed images by adding white pixels. We have applied non-adaptive and adaptive demosaicking methods using the Kodak-RGBW CFA on the standard Kodak image dataset, and the results have been compared with previous work.
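As a baseline against which luma-chroma methods are usually compared, plain bilinear demosaicking of an RGGB Bayer mosaic (not the Kodak-RGBW pattern of this paper) can be sketched; missing samples are filled by averaging the nearest same-color neighbours:

```python
import numpy as np

def convolve2d(img, k):
    """Small 2-D convolution with reflect padding (symmetric kernels here)."""
    pad = k.shape[0] // 2
    p = np.pad(img, pad, mode="reflect")
    out = np.zeros_like(img)
    for i in range(k.shape[0]):
        for j in range(k.shape[1]):
            out += k[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def demosaic_bilinear(cfa):
    """Bilinear demosaicking of an RGGB Bayer mosaic."""
    h, w = cfa.shape
    out = np.zeros((h, w, 3))
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True  # R
    masks[0::2, 1::2, 1] = True  # G
    masks[1::2, 0::2, 1] = True  # G
    masks[1::2, 1::2, 2] = True  # B
    kernel_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    kernel_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
    for c, k in ((0, kernel_rb), (1, kernel_g), (2, kernel_rb)):
        plane = np.where(masks[..., c], cfa, 0.0)
        out[..., c] = convolve2d(plane, k)
    return out
```

A sanity check: a uniform gray mosaic must reconstruct to the same uniform gray in all three channels.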

  14. Super-resolution pupil filtering for visual performance enhancement using adaptive optics

    Science.gov (United States)

    Zhao, Lina; Dai, Yun; Zhao, Junlei; Zhou, Xiaojun

    2018-05-01

    Ocular aberration correction can significantly improve visual function of the human eye. However, even under ideal aberration correction conditions, pupil diffraction restricts the resolution of retinal images. Pupil filtering is a simple super-resolution (SR) method that can overcome this diffraction barrier. In this study, a 145-element piezoelectric deformable mirror was used as a pupil phase filter because of its programmability and high fitting accuracy. Continuous phase-only filters were designed based on Zernike polynomial series and fitted through closed-loop adaptive optics. SR results were validated using double-pass point spread function images. Contrast sensitivity was further assessed to verify the SR effect on visual function. An F-test was conducted for nested models to statistically compare different CSFs. These results indicated that CSFs for the proposed SR filter were significantly higher than those under diffraction-limited correction, demonstrating the potential of SR pupil filtering for visual optical correction of the human eye.
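The pupil-filtering idea rests on standard Fourier optics: the point spread function is the squared magnitude of the Fourier transform of the pupil function, and a phase-only filter reshapes it without absorbing light. A sketch follows; the grid size, pupil radius, and the binary pi-phase mask in the test are illustrative assumptions, not the study's Zernike designs:

```python
import numpy as np

def pupil_psf(n=256, radius=0.3, phase=None):
    """PSF of a circular pupil via Fourier optics. `phase` is a callable
    giving the pupil phase (radians) as a function of normalised radius,
    modelling a phase-only pupil filter."""
    y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2] / (n // 2)
    r = np.hypot(x, y)
    pupil = (r <= radius).astype(complex)
    if phase is not None:
        pupil *= np.exp(1j * phase(r / radius))
    field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
    psf = np.abs(field) ** 2
    return psf / psf.sum()
```

With no phase term this yields the diffraction-limited Airy pattern; a designed phase profile redistributes energy in the focal plane, which is the lever used for super-resolution.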

  15. Microscopy with spatial filtering for sorting particles and monitoring subcellular morphology

    Science.gov (United States)

    Zheng, Jing-Yi; Qian, Zhen; Pasternack, Robert M.; Boustany, Nada N.

    2009-02-01

    Optical scatter imaging (OSI) was developed to non-invasively track real-time changes in particle morphology with submicron sensitivity in situ without exogenous labeling, cell fixing, or organelle isolation. For spherical particles, the intensity ratio of wide-to-narrow angle scatter (OSIR, Optical Scatter Image Ratio) was shown to decrease monotonically with diameter and to agree with Mie theory. In living cells, we recently reported that this technique is able to detect mitochondrial morphological alterations, mediated by the Bcl-xL transmembrane domain, which could not be observed in fluorescence or differential interference contrast images. Here we further extend the morphology assessment capability by adopting a digital micromirror device (DMD) for Fourier filtering. When placed in the Fourier plane, the DMD can be used to select scattering intensities at any desired combination of scattering angles. We designed an optical filter bank consisting of Gabor-like filters at various scales and rotations; Gabor filters have been widely used for localization of spatial and frequency information and for texture analysis in digital images. Using a model system consisting of mixtures of polystyrene spheres and bacteria, we show how this system can be used to sort particles on a microscope slide based on their size, orientation and aspect ratio. We are currently applying this technique to characterize the morphology of subcellular organelles to help understand fundamental biological processes.
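A software analogue of the DMD filter bank is a set of Gabor kernels spanning several scales and orientations; the parameter values below are illustrative, not those of the instrument:

```python
import numpy as np

def gabor_kernel(size=21, wavelength=6.0, theta=0.0, sigma=4.0, gamma=0.5):
    """One Gabor filter: an oriented sinusoid under an elongated Gaussian
    envelope; gamma < 1 stretches the envelope along the stripes."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    env = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2))
    return env * np.cos(2 * np.pi * xr / wavelength)

def gabor_bank(scales=(4.0, 8.0), n_orient=4):
    """Bank of Gabor filters over scales x orientations, similar in spirit
    to the DMD Fourier-plane filter bank described above."""
    return [gabor_kernel(wavelength=s, theta=k * np.pi / n_orient)
            for s in scales for k in range(n_orient)]
```

Filtering an image with each kernel and comparing response magnitudes is what lets elongated objects be separated by orientation and aspect ratio.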

  16. A New Switching-Based Median Filtering Scheme and Algorithm for Removal of High-Density Salt and Pepper Noise in Images

    Directory of Open Access Journals (Sweden)

    Jayaraj V

    2010-01-01

    Full Text Available A new switching-based median filtering scheme for restoration of images that are highly corrupted by salt and pepper noise is proposed. An algorithm based on the scheme is developed. The new scheme introduces the concept of substitution of noisy pixels by linear prediction prior to estimation. A novel simplified linear predictor is developed for this purpose. The objective of the scheme and algorithm is the removal of high-density salt and pepper noise in images. The new algorithm shows significantly better image quality with good PSNR, reduced MSE, good edge preservation, and reduced streaking. The good performance is achieved with reduced computational complexity. A comparison of the performance is made with several existing algorithms in terms of visual and quantitative results. The performance of the proposed scheme and algorithm is demonstrated.
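The switching idea, detecting likely salt/pepper pixels first and replacing only those while passing clean pixels through, can be sketched as follows. This sketch replaces detected pixels with the median of uncorrupted neighbours rather than the paper's linear predictor:

```python
import numpy as np

def switching_median(img, lo=0.0, hi=1.0, k=3):
    """Switching median filter: pixels at the extreme values lo/hi are
    treated as salt-and-pepper noise and replaced by the median of the
    noise-free pixels in their k x k neighbourhood; all others pass through."""
    pad = k // 2
    p = np.pad(img, pad, mode="reflect")
    out = img.copy()
    detected = (img == lo) | (img == hi)
    for i, j in zip(*np.nonzero(detected)):
        win = p[i:i + k, j:j + k].ravel()
        good = win[(win != lo) & (win != hi)]
        out[i, j] = np.median(good) if good.size else np.median(win)
    return out
```

Because untouched pixels are copied verbatim, detail is preserved far better than with a plain median applied everywhere, which is the motivation for switching schemes.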

  17. Regularized iterative integration combined with non-linear diffusion filtering for phase-contrast x-ray computed tomography.

    Science.gov (United States)

    Burger, Karin; Koehler, Thomas; Chabior, Michael; Allner, Sebastian; Marschner, Mathias; Fehringer, Andreas; Willner, Marian; Pfeiffer, Franz; Noël, Peter

    2014-12-29

    Phase-contrast x-ray computed tomography has a high potential to become clinically implemented because of its complementarity to conventional absorption contrast. In this study, we investigate noise-reducing but resolution-preserving analytical reconstruction methods to improve differential phase-contrast imaging. We apply the non-linear Perona-Malik filter to phase-contrast data before or after filtered-backprojection reconstruction. Secondly, the Hilbert kernel is replaced by regularized iterative integration followed by ramp-filtered backprojection as used for absorption-contrast imaging. Combining the Perona-Malik filter with this integration algorithm successfully reveals relevant sample features, quantitatively confirmed by significantly increased structural similarity indices and contrast-to-noise ratios. With this concept, phase-contrast imaging can be performed at considerably lower dose.
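The regularized-integration step can be illustrated in 1-D: differential phase data are noisy derivatives, and the profile is recovered by least squares with a smoothness prior instead of naive cumulative summation. The second-difference prior and its weight below are illustrative assumptions, not the paper's regularizer:

```python
import numpy as np

def integrate_regularized(deriv, lam=1e-2):
    """Recover a 1-D profile f from its noisy forward differences d by
    minimising ||D f - d||^2 + lam * ||L f||^2, where D takes forward
    differences and L second differences (a Tikhonov smoothness prior)."""
    n = deriv.size + 1
    D = np.diff(np.eye(n), axis=0)      # (n-1, n) forward differences
    L = np.diff(np.eye(n), 2, axis=0)   # (n-2, n) second differences
    A = np.vstack([D, np.sqrt(lam) * L])
    b = np.concatenate([deriv, np.zeros(n - 2)])
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    return f - f.mean()                 # the integration constant is free
```

Naive cumulative summation lets detector noise accumulate as a random walk; the prior suppresses that drift while leaving smooth structure essentially unbiased.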

  18. Recent progress in energy-filtered high energy X-ray photoemission electron microscopy using a Wien filter type energy analyzer

    International Nuclear Information System (INIS)

    Niimi, H.; Tsutsumi, T.; Matsudaira, H.; Kawasaki, T.; Suzuki, S.; Chun, W.-J.; Kato, M.; Kitajima, Y.; Iwasawa, Y.; Asakura, K.

    2004-01-01

    Energy-filtered X-ray photoemission electron microscopy (EXPEEM) is a microscopy technique which has the potential to provide surface chemical mapping during surface chemical processes on the nanometer scale. We studied the possibilities of EXPEEM using a Wien filter type energy analyzer in the high energy X-ray region above 1000 eV. We have successfully observed EXPEEM images of Au islands on a Ta sheet using the Au 3d5/2 and Ta 3d5/2 photoelectron peaks, excited by 2380 eV X-rays emitted from an undulator (BL2A) at the Photon Factory. Our recent efforts to improve the sensitivity of the Wien filter energy analyzer will also be discussed

  19. Large window median filtering on Clip7

    Energy Technology Data Exchange (ETDEWEB)

    Mathews, K N

    1983-07-01

    Median filtering has been found to be a useful operation to perform on images in order to reduce random noise while preserving edges of objects. However, in some cases, as the resolution of the image increases, so too does the required window size of the filter. For parallel array processors this leads to problems in handling the large amount of data involved: slow access to pixel data over a large neighbourhood, lack of available storage for this data during the operation, and long computation times for finding the median. An algorithm for finding the median, designed for byte-wide parallel array processors, is presented together with its implementation on Clip7, a scanning array of such processors. 6 references.
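A standard way to keep large-window medians cheap on serial hardware is Huang's running-histogram algorithm, which updates a 256-bin histogram by one window column per step instead of re-sorting the whole window. The serial NumPy sketch below illustrates the idea only; it is not the Clip7 byte-wide parallel algorithm of the abstract:

```python
import numpy as np

def median_filter_histogram(img, k=7):
    """Huang's running-histogram median for 8-bit images: per pixel, O(k)
    histogram updates plus a scan of 256 bins, independent of k*k sorting."""
    assert img.dtype == np.uint8
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    h, w = img.shape
    out = np.empty_like(img)
    target = (k * k) // 2 + 1           # rank of the median
    for i in range(h):
        hist = np.zeros(256, dtype=int)
        for di in range(k):             # build histogram of first window
            for dj in range(k):
                hist[p[i + di, dj]] += 1
        for j in range(w):
            if j > 0:                   # slide right: swap one column
                for di in range(k):
                    hist[p[i + di, j - 1]] -= 1
                    hist[p[i + di, j + k - 1]] += 1
            csum = 0
            for v in range(256):        # find the median rank
                csum += hist[v]
                if csum >= target:
                    out[i, j] = v
                    break
    return out
```

The per-pixel cost grows linearly with k rather than with k² log k, which is exactly what matters as window sizes grow with image resolution.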

  20. Decoupled deblurring filter and its application to elastic migration and inversion

    KAUST Repository

    Feng, Zongcai

    2017-08-17

    We present a decoupled deblurring filter that approximates the multiparameter Hessian inverse by using local filters to approximate its submatrices for the same and different parameter classes. Numerical tests show that the filter not only reduces the footprint noise, balances the amplitudes and increases the resolution of the elastic migration images, but also mitigates the crosstalk artifacts. When used as a preconditioner, it accelerates the convergence rate for elastic inversion.
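In spirit, a deblurring filter approximates the inverse of the Hessian, whose action on a point scatterer is the local point spread function. A single-filter sketch using regularized Fourier inversion of a known PSF follows; it is a stand-in for the idea, not the paper's local multiparameter filter construction, and the Gaussian PSF and regularization weight are illustrative assumptions:

```python
import numpy as np

def deblur_filter(psf, eps=1e-4):
    """Build a deblurring filter from the imaged response of a point
    scatterer via regularised (Wiener-style) Fourier inversion."""
    P = np.fft.fft2(np.fft.ifftshift(psf))
    return np.conj(P) / (np.abs(P) ** 2 + eps * np.max(np.abs(P)) ** 2)

def apply_deblur(img, F):
    """Apply the deblurring filter in the Fourier domain."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * F))
```

Applied to a migration-like image of isolated scatterers blurred by the PSF, the filter sharpens the peaks and reduces the misfit to the true reflectivity, which is the amplitude-balancing and resolution effect described above.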