WorldWideScience

Sample records for gatan imaging filter

  1. AER image filtering

    Science.gov (United States)

    Gómez-Rodríguez, F.; Linares-Barranco, A.; Paz, R.; Miró-Amarante, L.; Jiménez, G.; Civit, A.

    2007-05-01

Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity among a huge number of neurons located on different chips [1]. By exploiting high-speed digital communication circuits (with nanosecond timing), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timing) are sampled at low frequencies. Neurons generate "events" according to their activity levels; that is, more active neurons generate more events per unit time and access the interchip communication channel more frequently than neurons with low activity. In neuromorphic system development, AER brings several advantages for building real-time image processing systems: (1) AER represents the information as a time-continuous stream rather than as frames; (2) AER sends the most important information first (although this depends on the sender); (3) AER allows information to be processed as soon as it is received. When AER is used in the artificial vision field, each pixel is treated as a neuron, so a pixel's intensity is represented as a sequence of events; by modifying the number and frequency of these events, it is possible to perform image filtering. In this paper we present four image filters using AER: (a) noise addition and suppression, (b) brightness modification, (c) single moving object tracking and (d) geometrical transformations (rotation, translation, reduction and magnification). For testing and debugging, we use the USB-AER board developed by the Robotic and Technology of Computers Applied to Rehabilitation (RTCAR) research group. This board is based on an FPGA devoted to managing the AER functionality, and it also includes a microcontroller for USB communication, 2 Mbytes of RAM and 2 AER ports (one for input and one for output).
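
As a rough illustration of the event-rate encoding described above (not the authors' implementation), the following sketch maps normalized pixel intensities to expected event counts per time window and rescales those counts to emulate a brightness-modification filter; the frame size, window length, maximum rate and gain are all assumed values for the example.

```python
import numpy as np

def intensity_to_event_counts(frame, window_ms=10.0, max_rate_hz=1000.0):
    """Map normalized pixel intensities (0..1) to AER-style event counts in
    one time window: brighter pixels emit more events per unit time."""
    rates = frame.clip(0.0, 1.0) * max_rate_hz            # events per second
    return np.random.poisson(rates * window_ms / 1000.0)   # events per window

def brightness_filter(event_counts, gain=1.5):
    """Brightness modification in the event domain: rescale the number of
    events emitted by each pixel."""
    return np.round(event_counts * gain).astype(int)

frame = np.random.rand(64, 64)              # hypothetical normalized image
events = intensity_to_event_counts(frame)
brighter = brightness_filter(events, gain=1.5)
```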

  2. HEPA air filter (image)

    Science.gov (United States)

    ... pet dander and other irritating allergens from the air. Along with other methods to reduce allergens, such ... controlling the amount of allergens circulating in the air. HEPA filters can be found in most air ...

  3. Nonlinear image filtering within IDP++

    Energy Technology Data Exchange (ETDEWEB)

    Lehman, S.K.; Wieting, M.G.; Brase, J.M.

    1995-02-09

IDP++, image and data processing in C++, is a set of signal processing libraries written in C++. It is a multi-dimensional (up to four dimensions), multi-data-type (implemented through templates) signal processing extension to C++. IDP++ takes advantage of object-oriented compiler technology to provide "information hiding"; users need only know C, not C++. Signals or data sets are treated like any other variable with a defined set of operators and functions. We present here some examples of the nonlinear filter library within IDP++, specifically the results of MIN, MAX, median, α-trimmed mean, and edge-trimmed mean filters as applied to a real aperture radar (RAR) and a synthetic aperture radar (SAR) data set.
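
A minimal sketch of one of the filters named above, the α-trimmed mean, in Python rather than IDP++ (window size and trim fraction are assumed values): within each window the samples are sorted, the α fraction of smallest and largest values is discarded, and the remainder is averaged.

```python
import numpy as np

def alpha_trimmed_mean(image, size=3, alpha=0.2):
    """Alpha-trimmed mean filter: in each size x size window, drop the alpha
    fraction of the lowest and highest samples, then average the rest."""
    pad = size // 2
    padded = np.pad(image, pad, mode="reflect")
    out = np.empty_like(image, dtype=float)
    trim = int(alpha * size * size)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            window = np.sort(padded[i:i + size, j:j + size], axis=None)
            kept = window[trim:window.size - trim] if trim > 0 else window
            out[i, j] = kept.mean()
    return out

noisy = np.random.rand(64, 64)          # stand-in for a radar image chip
smoothed = alpha_trimmed_mean(noisy, size=3, alpha=0.2)
```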

  4. Filter and Filter Bank Design for Image Texture Recognition

    Energy Technology Data Exchange (ETDEWEB)

    Randen, Trygve

    1997-12-31

The relevance of this thesis to energy and environment lies in its application to remote sensing, for instance sea floor mapping and seismic pattern recognition. The focus is on the design of two-dimensional filters for feature extraction, segmentation, and classification of digital images with textural content. The features are extracted by filtering with a linear filter and estimating the local energy in the filter response. The thesis gives a review covering broadly most previous approaches to texture feature extraction and continues with proposals of some new techniques. 143 refs., 59 figs., 7 tabs.
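
The feature-extraction pipeline described above (linear filtering followed by local energy estimation) can be sketched as follows; the particular band-pass kernel and the Gaussian smoothing window used for the energy estimate are assumptions for illustration, not the thesis' exact filter designs.

```python
import numpy as np
from scipy import ndimage

def local_energy_feature(image, kernel, smooth_sigma=4.0):
    """Texture feature: filter the image with a linear filter, square the
    response, and smooth it to estimate the local energy around each pixel."""
    response = ndimage.convolve(image, kernel, mode="reflect")
    return ndimage.gaussian_filter(response ** 2, sigma=smooth_sigma)

# A crude oriented band-pass kernel (assumed, for illustration only).
y, x = np.mgrid[-7:8, -7:8]
kernel = np.exp(-(x**2 + y**2) / (2 * 3.0**2)) * np.cos(2 * np.pi * x / 6.0)
kernel -= kernel.mean()                       # remove the DC component

texture_image = np.random.rand(128, 128)
feature_map = local_energy_feature(texture_image, kernel)
```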

  5. Gabor filter based fingerprint image enhancement

    Science.gov (United States)

    Wang, Jin-Xiang

    2013-03-01

Fingerprint recognition has become one of the most reliable biometric technologies because of the uniqueness and invariance of fingerprints, which makes it a convenient and reliable technique for personal authentication. The development of Automated Fingerprint Identification Systems is an urgent need for modern information security, and the fingerprint preprocessing stage plays an important part in such systems. This article introduces the general steps of fingerprint recognition, namely image input, preprocessing, feature recognition, and fingerprint image enhancement. As a key step, fingerprint image enhancement affects the accuracy of the whole system. The article focuses on the characteristics of the fingerprint image, the Gabor filter algorithm for fingerprint image enhancement, the theoretical basis of Gabor filters, and a demonstration of the filter. The enhancement algorithm is demonstrated on the Windows XP platform with MATLAB 6.5 as the development tool. The results show that the Gabor filter is effective for fingerprint image enhancement.
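
A minimal sketch of Gabor-filter-based enhancement in Python (the paper's demonstration is in MATLAB): an even-symmetric Gabor kernel tuned to an assumed local ridge orientation and frequency is convolved with the image. In a full enhancement pipeline the orientation and frequency would be estimated block by block rather than fixed as here.

```python
import numpy as np
from scipy import ndimage

def gabor_kernel(theta, freq, sigma=4.0, half_size=10):
    """Even-symmetric Gabor kernel: a Gaussian envelope modulated by a cosine
    oriented along the ridge direction theta (radians) with spatial frequency freq."""
    y, x = np.mgrid[-half_size:half_size + 1, -half_size:half_size + 1]
    x_t = x * np.cos(theta) + y * np.sin(theta)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t**2 + y_t**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * freq * x_t)

# Assumed ridge parameters for illustration; real pipelines estimate these locally.
kernel = gabor_kernel(theta=np.pi / 4, freq=0.1)
fingerprint = np.random.rand(256, 256)        # stand-in for a fingerprint image
enhanced = ndimage.convolve(fingerprint, kernel, mode="reflect")
```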

  6. MR image reconstruction via guided filter.

    Science.gov (United States)

    Huang, Heyan; Yang, Hang; Wang, Kang

    2018-04-01

Magnetic resonance imaging (MRI) reconstruction from the smallest possible set of Fourier samples has been a difficult problem in the medical imaging field. In this paper, we present a new approach based on a guided filter for efficient MRI recovery. The guided filter is an edge-preserving smoothing operator and behaves better near edges than the bilateral filter. Our reconstruction method consists of two steps. First, we propose two cost functions that can be computed efficiently and thus obtain two different images. Second, the guided filter is applied to these two images for efficient edge-preserving filtering: one image is used as the guidance image and the other as the filtered input. By introducing the guided filter, our reconstruction algorithm recovers more detail. We compare our reconstruction algorithm with some competitive MRI reconstruction techniques in terms of PSNR and visual quality. Simulation results are given to show the performance of our new method.
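
For reference, a compact gray-scale guided filter in the style of He et al. (not the authors' code; the window radius and regularization eps are assumed values): the output is a locally linear transform of the guidance image, which is what gives the edge-preserving behaviour exploited in the reconstruction.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, src, radius=4, eps=1e-3):
    """Gray-scale guided filter: output q = a*guide + b, where a and b are
    fitted in each local window and then averaged over overlapping windows."""
    size = 2 * radius + 1
    mean = lambda x: uniform_filter(x, size=size, mode="reflect")
    mean_I, mean_p = mean(guide), mean(src)
    corr_Ip, corr_II = mean(guide * src), mean(guide * guide)
    cov_Ip = corr_Ip - mean_I * mean_p
    var_I = corr_II - mean_I * mean_I
    a = cov_Ip / (var_I + eps)
    b = mean_p - a * mean_I
    return mean(a) * guide + mean(b)

guide = np.random.rand(128, 128)   # e.g. the sharper of the two intermediate images
src = np.random.rand(128, 128)     # the image to be filtered
out = guided_filter(guide, src, radius=4, eps=1e-3)
```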

  7. Frequency Domain Image Filtering Using CUDA

    Directory of Open Access Journals (Sweden)

    Muhammad Awais Rajput

    2014-10-01

In this paper, we investigate the implementation of image filtering in the frequency domain using NVIDIA's CUDA (Compute Unified Device Architecture). In contrast to signal and image filtering in the spatial domain, which uses convolution operations and hence is more compute-intensive for filters having larger spatial extent, frequency domain filtering uses the FFT (Fast Fourier Transform), which is much faster and significantly reduces the computational complexity of the filtering. We implement the frequency domain filtering on CPU and GPU respectively and analyze the speed-up obtained from CUDA's parallel processing paradigm. In order to demonstrate the efficiency of frequency domain filtering on CUDA, we implement three frequency domain filters, i.e., Butterworth, low-pass and Gaussian, for processing different sizes of images on CPU and GPU respectively and perform GPU vs. CPU benchmarks. The results presented in this paper show that frequency domain filtering with CUDA achieves significant speed-up over CPU processing in the frequency domain, with the same level of (output) image quality on both processing architectures.
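
The frequency-domain pipeline described above, written for CPU in Python rather than CUDA, is sketched below for a Gaussian low-pass transfer function (the cutoff is an assumed value): transform with the FFT, multiply by the filter, and transform back. A GPU version would follow the same structure with cuFFT handling the transforms.

```python
import numpy as np

def gaussian_lowpass(shape, cutoff=0.1):
    """Gaussian low-pass transfer function sampled on the FFT frequency grid."""
    fy = np.fft.fftfreq(shape[0])[:, None]
    fx = np.fft.fftfreq(shape[1])[None, :]
    return np.exp(-(fx**2 + fy**2) / (2 * cutoff**2))

def filter_in_frequency_domain(image, transfer):
    """Filter an image by pointwise multiplication of its spectrum with a
    transfer function, avoiding an explicit spatial convolution."""
    return np.real(np.fft.ifft2(np.fft.fft2(image) * transfer))

image = np.random.rand(512, 512)
H = gaussian_lowpass(image.shape, cutoff=0.1)
smoothed = filter_in_frequency_domain(image, H)
```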

  8. Frequency domain image filtering using cuda

    International Nuclear Information System (INIS)

    Rajput, M.A.; Khan, U.A.

    2014-01-01

In this paper, we investigate the implementation of image filtering in frequency domain using NVIDIA's CUDA (Compute Unified Device Architecture). In contrast to signal and image filtering in spatial domain which uses convolution operations and hence is more compute-intensive for filters having larger spatial extent, the frequency domain filtering uses FFT (Fast Fourier Transform) which is much faster and significantly reduces the computational complexity of the filtering. We implement the frequency domain filtering on CPU and GPU respectively and analyze the speed-up obtained from the CUDA's parallel processing paradigm. In order to demonstrate the efficiency of frequency domain filtering on CUDA, we implement three frequency domain filters, i.e., Butterworth, low-pass and Gaussian for processing different sizes of images on CPU and GPU respectively and perform the GPU vs. CPU benchmarks. The results presented in this paper show that the frequency domain filtering with CUDA achieves significant speed-up over the CPU processing in frequency domain with the same level of (output) image quality on both the processing architectures. (author)

  9. Convex blind image deconvolution with inverse filtering

    Science.gov (United States)

    Lv, Xiao-Guang; Li, Fang; Zeng, Tieyong

    2018-03-01

    Blind image deconvolution is the process of estimating both the original image and the blur kernel from the degraded image with only partial or no information about degradation and the imaging system. It is a bilinear ill-posed inverse problem corresponding to the direct problem of convolution. Regularization methods are used to handle the ill-posedness of blind deconvolution and get meaningful solutions. In this paper, we investigate a convex regularized inverse filtering method for blind deconvolution of images. We assume that the support region of the blur object is known, as has been done in a few existing works. By studying the inverse filters of signal and image restoration problems, we observe the oscillation structure of the inverse filters. Inspired by the oscillation structure of the inverse filters, we propose to use the star norm to regularize the inverse filter. Meanwhile, we use the total variation to regularize the resulting image obtained by convolving the inverse filter with the degraded image. The proposed minimization model is shown to be convex. We employ the first-order primal-dual method for the solution of the proposed minimization model. Numerical examples for blind image restoration are given to show that the proposed method outperforms some existing methods in terms of peak signal-to-noise ratio (PSNR), structural similarity (SSIM), visual quality and time consumption.

  10. Quantum Image Filtering in the Frequency Domain

    Directory of Open Access Journals (Sweden)

    MANTA, V. I.

    2013-08-01

In this paper we address the emerging field of Quantum Image Processing. We investigate the use of quantum computing systems to represent and manipulate images. In particular, we consider the basic task of image filtering. We prove that a quantum version for this operation can be achieved, even though the quantum convolution of two sequences is physically impossible. In our approach we use the principle of the quantum oracle to implement the filter function. We provide the quantum circuit that implements the filtering task and present the results of several simulation experiments on grayscale images. There are important differences between the classical and the quantum implementations for image filtering. We analyze these differences and show that the major advantage of the quantum approach lies in the exploitation of the efficient implementation of the quantum Fourier transform.

  11. Fast bilateral filtering of CT-images

    Energy Technology Data Exchange (ETDEWEB)

    Steckmann, Sven; Baer, Matthias; Kachelriess, Marc [Erlangen-Nuernberg Univ., Erlangen (Germany). Inst. of Medical Physics (IMP)

    2011-07-01

The bilateral filter is able to reach a lower noise level while retaining the edges in images. The downside of the bilateral filter is the high order of the problem itself: for a volume of size N with dimension d and a filter window of size r, the problem is of size N^d · r^d. In the literature there are some proposals for speeding this up by reducing the order through approximating a component of the filter. This leads to inaccurate results, which often implies unacceptable artifacts for medical imaging. A better way for medical imaging is to speed up the filter itself while leaving its basic structure intact, which is the way our implementation uses. We solve the problem of calculating the function e^(-x) in an efficient way on modern architectures, and the problem of vectorizing the filtering process. As a result we implemented a filter which is 2.5 times faster than the highly optimized basic approach. Comparing the basic analytical approach with the final algorithm, the differences in quality of the computing process are negligible to the human eye. We are able to process a volume with 512^3 voxels with a filter of 25 x 25 x 1 in 21 s on a modern Intel Xeon platform with two X5590 processors running at 3.33 GHz. (orig.)
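
For orientation, a straightforward (unoptimized) 2D bilateral filter in Python, looping over window offsets; the spatial and range sigmas are assumed values. The paper's contribution is precisely about making this kind of computation fast (efficient exponential evaluation and vectorization) without changing its basic structure.

```python
import numpy as np

def bilateral_filter(image, radius=5, sigma_s=3.0, sigma_r=0.1):
    """Brute-force bilateral filter: weights combine spatial closeness and
    intensity similarity, so edges are preserved while noise is averaged out."""
    padded = np.pad(image, radius, mode="reflect")
    acc = np.zeros_like(image, dtype=float)
    norm = np.zeros_like(image, dtype=float)
    h, w = image.shape
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            shifted = padded[radius + dy:radius + dy + h,
                             radius + dx:radius + dx + w]
            w_spatial = np.exp(-(dx**2 + dy**2) / (2 * sigma_s**2))
            w_range = np.exp(-((shifted - image) ** 2) / (2 * sigma_r**2))
            weight = w_spatial * w_range
            acc += weight * shifted
            norm += weight
    return acc / norm

slice_img = np.random.rand(128, 128)      # stand-in for one CT slice
denoised = bilateral_filter(slice_img)
```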

  12. Efficient Filtering of Noisy Fingerprint Images

    Directory of Open Access Journals (Sweden)

    Maria Liliana Costin

    2016-01-01

Fingerprint identification is an important field in the wide domain of biometrics, with many applications in different areas such as the judicial system, mobile phones, access systems and airports. There are many elaborate algorithms for fingerprint identification, but none of them can guarantee that the results of identification are always 100% accurate. A first step in analysing a fingerprint image consists of pre-processing or filtering; if the result of this step is not of good quality, the subsequent identification process can fail. A major difficulty can appear in fingerprint identification when the images that should be identified from a fingerprint image database are corrupted by different types of noise. The objectives of the paper are: the successful filtering of noisy digital images, a novel and more robust algorithm for identifying the best filtering algorithm, and the classification and ranking of the images. The choice of the best filtered images from a set of 9 algorithms is made with a dual fuzzy and aggregation model. We propose in this paper a set of 9 filters, designed with different degrees of novelty, for processing digital images using the following methods: quartiles, medians, averages, thresholds and histogram equalization, applied over the whole image or locally on small areas. Finally, statistics reveal the classification and ranking of the best algorithms.

  13. Directional Joint Bilateral Filter for Depth Images

    Directory of Open Access Journals (Sweden)

    Anh Vu Le

    2014-06-01

Depth maps taken by the low-cost Kinect sensor are often noisy and incomplete. Thus, post-processing for obtaining reliable depth maps is necessary for advanced image and video applications such as object recognition and multi-view rendering. In this paper, we propose adaptive directional filters that fill the holes and suppress the noise in depth maps. Specifically, novel filters whose window shapes are adaptively adjusted based on the edge direction of the color image are presented. Experimental results show that our method yields higher quality filtered depth maps than other existing methods, especially at the edge boundaries.

  14. Color image guided depth image super resolution using fusion filter

    Science.gov (United States)

    He, Jin; Liang, Bin; He, Ying; Yang, Jun

    2018-04-01

Depth cameras are currently playing an important role in many areas. However, most of them can only obtain low-resolution (LR) depth images. Color cameras can easily provide high-resolution (HR) color images. Using a color image as a guide image is an efficient way to get an HR depth image. In this paper, we propose a depth image super resolution (SR) algorithm, which uses an HR color image as a guide image and an LR depth image as input. We use a fusion filter of the guided filter and an edge-based joint bilateral filter to get the HR depth image. Our experimental results on the Middlebury 2005 datasets show that our method can provide better quality in HR depth images both numerically and visually.

  15. Neutron Imaging of Diesel Particulate Filters

    International Nuclear Information System (INIS)

    Strzelec, Andrea; Bilheux, Hassina Z.; Finney, Charles E.A.; Daw, C. Stuart; Foster, Dave; Rutland, Christopher J.; Schillinger, Burkhard; Schulz, Michael

    2009-01-01

This article presents nondestructive neutron computed tomography (nCT) measurements of Diesel Particulate Filters (DPFs) as a method to measure ash and soot loading in the filters. Uncatalyzed and unwashcoated 200 cpsi cordierite DPFs exposed to 100% biodiesel (B100) exhaust and conventional ultra-low-sulfur 2007 certification diesel (ULSD) exhaust at one speed-load point (1500 rpm, 2.6 bar BMEP) are compared to a brand new (never exposed) filter. Precise structural information about the substrate as well as an attempt to quantify soot and ash loading in the channel of the DPF illustrates the potential strength of the neutron imaging technique.

  16. Filter's importance in nuclear cardiology imaging

    International Nuclear Information System (INIS)

    Jesus, Maria C. de; Lima, Ana L.S.; Santos, Joyra A. dos; Megueriam, Berdj A.

    2008-01-01

Nuclear Medicine is a medical speciality which employs tomographic procedures for the diagnosis, treatment and prevention of diseases. One of the most commonly used devices is the Single Photon Emission Computed Tomography (SPECT) system. To perform exams, a very small amount of a radiopharmaceutical is given to the patient. A gamma camera is then placed in convenient positions to perform the photon counting, which is used to reconstruct the full 3-dimensional distribution of the radionuclide inside the body or organ. This reconstruction provides a 3-dimensional image, in spatial coordinates, of the body or organ under study, allowing the physician to make the diagnosis. Image reconstruction is usually performed in the frequency domain, due to the great simplification introduced by the Fourier decomposition of image spectra. After reconstruction, an inverse Fourier transform must be applied to bring the image back into spatial coordinates. To optimize this reconstruction procedure, digital filters are used to remove undesirable frequency components, which can 'shadow' relevant physical signatures of diseases. Unfortunately, the efficiency of the applied filter is strongly dependent on its own mathematical parameters. In this work we demonstrate how filters affect image quality in cardiology examinations with SPECT, concerning perfusion and myocardial viability, and the importance of the medical physicist in choosing the right filters, avoiding serious problems that could occur through inadequate processing of an image and damage the medical diagnosis. (author)

  17. Spatial filters for focusing ultrasound images

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Gori, Paola

    2001-01-01

A new method for making spatial matched filter focusing of RF ultrasound data is proposed based on the spatial impulse response description of the imaging. The response from a scatterer at any given point in space relative to the transducer can be calculated, and this gives the spatial matched filter ... for synthetic aperture imaging for single element transducers. It is evaluated using the Field II program. Data from a single 3 MHz transducer focused at a distance of 80 mm is processed. Far from the transducer focal region, the processing greatly improves the image resolution: the lateral slice ..., but the approach always yields point spread functions better or equal to a traditional dynamically focused image. Finally, the process was applied to in-vivo clinical images of the liver and right kidney from a 28-year-old male. The data was obtained with a single element transducer focused at 100 mm.

  18. A robust nonlinear filter for image restoration.

    Science.gov (United States)

    Koivunen, V

    1995-01-01

    A class of nonlinear regression filters based on robust estimation theory is introduced. The goal of the filtering is to recover a high-quality image from degraded observations. Models for desired image structures and contaminating processes are employed, but deviations from strict assumptions are allowed since the assumptions on signal and noise are typically only approximately true. The robustness of filters is usually addressed only in a distributional sense, i.e., the actual error distribution deviates from the nominal one. In this paper, the robustness is considered in a broad sense since the outliers may also be due to inappropriate signal model, or there may be more than one statistical population present in the processing window, causing biased estimates. Two filtering algorithms minimizing a least trimmed squares criterion are provided. The design of the filters is simple since no scale parameters or context-dependent threshold values are required. Experimental results using both real and simulated data are presented. The filters effectively attenuate both impulsive and nonimpulsive noise while recovering the signal structure and preserving interesting details.

  19. Raman imaging using fixed bandpass filter

    Science.gov (United States)

    Landström, L.; Kullander, F.; Lundén, H.; Wästerby, P.

    2017-05-01

    By using fixed narrow band pass optical filtering and scanning the laser excitation wavelength, hyperspectral Raman imaging could be achieved. Experimental, proof-of-principle results from the Chemical Warfare Agent (CWA) tabun (GA) as well as the common CWA simulant tributyl phosphate (TBP) on different surfaces/substrates are presented and discussed.

  20. Ghost suppression in image restoration filtering

    Science.gov (United States)

    Riemer, T. E.; Mcgillem, C. D.

    1975-01-01

An optimum image restoration filter is described in which provision is made to constrain the spatial extent of the restoration function, the noise level of the filter output and the rate of falloff of the composite system point-spread function away from the origin. Experimental results show that sidelobes on the composite system point-spread function produce ghosts in the restored image near discontinuities in intensity level. By redetermining the filter using a penalty function that is zero over the main lobe of the composite point-spread function of the optimum filter and nonzero where the point-spread function departs from a smoothly decaying function in the sidelobe region, a great reduction in sidelobe level is obtained. Almost no loss in resolving power of the composite system results from this procedure. By iteratively carrying out the same procedure, even further reductions in sidelobe level are obtained. Examples of original and iterated restoration functions are shown along with their effects on a test image.

  1. Independent component analysis based filtering for penumbral imaging

    International Nuclear Information System (INIS)

    Chen Yenwei; Han Xianhua; Nozaki, Shinya

    2004-01-01

We propose a filtering method based on independent component analysis (ICA) for Poisson noise reduction. In the proposed filtering, the image is first transformed into the ICA domain and then the noise components are removed by soft thresholding (shrinkage). The proposed filter, which is used as a preprocessing step for the reconstruction, has been successfully applied to penumbral imaging. Both simulation and experimental results show that the reconstructed image is dramatically improved in comparison to that obtained without the noise-removing filter.
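
The shrinkage step mentioned above can be illustrated with a minimal soft-thresholding function applied to transform-domain coefficients; the threshold value is an assumed parameter, and the actual ICA transform and its learned basis are omitted here.

```python
import numpy as np

def soft_threshold(coeffs, thresh):
    """Soft thresholding (shrinkage): coefficients smaller than the threshold
    are set to zero, larger ones are shrunk toward zero by the threshold."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - thresh, 0.0)

# Example: shrink coefficients in some transform domain (ICA, wavelet, ...).
coeffs = np.random.randn(1000)
denoised_coeffs = soft_threshold(coeffs, thresh=0.5)
```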

  2. A Digital Image Denoising Algorithm Based on Gaussian Filtering and Bilateral Filtering

    Directory of Open Access Journals (Sweden)

    Piao Weiying

    2018-01-01

Bilateral filtering has been widely applied in digital image processing, but in high-gradient regions of the image it may generate a staircase effect. Bilateral filtering can be regarded as one particular form of local mode filtering; based on this analysis, a mixed image denoising algorithm based on a Gaussian filter and bilateral filtering is proposed. First, a Gaussian filter is applied to the noisy image to obtain a reference image; then both the reference image and the noisy image are used as inputs to the range kernel of the bilateral filter. The reference image provides the image's low-frequency information, and the noisy image provides the high-frequency information. In a comparative experiment between the method in this paper and traditional bilateral filtering, the results show that the mixed denoising algorithm can effectively overcome the staircase effect; the filtered image is smoother, its textural features are closer to the original image, and it achieves a higher PSNR value, while the computational cost of the two algorithms is basically the same.

  3. Pornographic image detection with Gabor filters

    Science.gov (United States)

    Durrell, Kevan; Murray, Daniel J. C.

    2002-04-01

As Internet-enabled computers become ubiquitous in homes, schools, and other publicly accessible locations, there are more people 'surfing the net' who would prefer not to be exposed to offensive material. There is a lot of material freely available on the Internet that we, as a responsible and caring society, would like to keep our children from viewing. Pornographic image content is one category of material over which we would like some control. We have been conducting experiments to determine the effectiveness of using characteristic feature vectors and neural networks to identify semantic image content. This paper will describe our approach to identifying pornographic images using Gabor filters, Principal Component Analysis (PCA), Correlograms, and Neural Networks. In brief, we used a set of 5,000 typical images available from the Internet, 20% of which were judged to be pornographic, to train a neural network. We then apply the trained neural network to feature vectors from images that had not been used in training. We measure our performance as Recall, how many of the verification images labeled pornographic were correctly identified, and Precision, how many images deemed pornographic by the neural network are in fact pornographic. The set of images used in the experiment described in this paper for its training and validation sets are freely available on the Internet. Freely available is an important qualifier, since high-resolution, studio-quality pornographic images are often protected by portals that charge members a fee to gain access to their material. Although this is not a hard and fast rule, many of the pornographic images that are available easily and without charge on the Internet are of low image quality. Some of these images are collages or contain textual elements or have had their resolution intentionally lowered to reduce their file size. These are the offensive images that a user, without a credit card, might inadvertently come

  4. Two-dimensional filtering of SPECT images using the Metz and Wiener filters

    International Nuclear Information System (INIS)

    King, M.A.; Schwinger, R.B.; Penney, B.C.; Doherty, P.W.

    1984-01-01

Presently, single photon emission computed tomographic (SPECT) images are usually reconstructed by arbitrarily selecting a one-dimensional "window" function for use in reconstruction. A better method would be to automatically choose among a family of two-dimensional image restoration filters in such a way as to produce "optimum" image quality. Two-dimensional image processing techniques offer the advantages of a larger statistical sampling of the data for better noise reduction, and two-dimensional image deconvolution to correct for blurring during acquisition. An investigation of two such "optimal" digital image restoration techniques (the count-dependent Metz filter and the Wiener filter) was made. They were applied both as two-dimensional "window" functions for preprocessing SPECT images, and for filtering reconstructed images. Their performance was compared by measuring image contrast and per cent fractional standard deviation (% FSD) in multiple acquisitions of the Jaszczak SPECT phantom at two different count levels. A statistically significant increase in image contrast and decrease in % FSD was observed with these techniques when compared to the results of reconstruction with a ramp filter. The adaptability of the techniques was manifested in a lesser % reduction in % FSD at the high count level coupled with a greater enhancement in image contrast. Using an array processor, processing time was 0.2 sec per image for the Metz filter and 3 sec for the Wiener filter. It is concluded that two-dimensional digital image restoration with these techniques can produce a significant increase in SPECT image quality.
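
As a sketch of how a Metz filter of the kind studied here is typically built (not necessarily the authors' exact count-dependent parameterization), the filter is defined in the frequency domain from the system MTF, with the power N controlling how aggressively it switches from approximate inverse filtering to suppression; the Gaussian MTF model and the order value below are assumptions.

```python
import numpy as np

def metz_filter(mtf, order=3.0):
    """Metz filter in the frequency domain:
    M(f) = (1 - (1 - MTF(f)^2)^N) / MTF(f),
    which amplifies mid frequencies and rolls off where the MTF is low."""
    mtf = np.clip(mtf, 1e-6, None)            # avoid division by zero
    return (1.0 - (1.0 - mtf**2) ** order) / mtf

# Assumed Gaussian MTF model on a 2D frequency grid (for illustration only).
fy = np.fft.fftfreq(128)[:, None]
fx = np.fft.fftfreq(128)[None, :]
mtf = np.exp(-(fx**2 + fy**2) / (2 * 0.15**2))
H = metz_filter(mtf, order=3.0)

projection = np.random.rand(128, 128)          # stand-in for a SPECT projection
filtered = np.real(np.fft.ifft2(np.fft.fft2(projection) * H))
```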

  5. Vector Directional Distance Rational Hybrid Filters for Color Image Restoration

    Directory of Open Access Journals (Sweden)

    L. Khriji

    2005-12-01

A new class of nonlinear filters, called vector-directional distance rational hybrid filters (VDDRHF), for multispectral image processing is introduced and applied to color image-filtering problems. These filters are based on rational functions (RF). The VDDRHF filter is a two-stage filter, which exploits the features of the vector directional distance filter (VDDF), the center weighted vector directional distance filter (CWVDDF) and those of the rational operator. The filter output is a result of a vector rational function (VRF) operating on the output of three sub-functions. Two vector directional distance (VDDF) filters and one center weighted vector directional distance filter (CWVDDF) are proposed to be used in the first stage due to their desirable properties, such as noise attenuation, chromaticity retention, and edge and detail preservation. Experimental results show that the new VDDRHF outperforms a number of widely known nonlinear filters for multi-spectral image processing, such as the vector median filter (VMF), the generalized vector directional filter (GVDF) and the distance directional filter (DDF), with respect to all criteria used.

  6. Eigenimage filtering of nuclear medicine image sequences

    International Nuclear Information System (INIS)

    Windham, J.P.; Froelich, J.W.; Abd-Allah, M.

    1985-01-01

In many nuclear medicine imaging sequences the localization of radioactivity in organs other than the target organ interferes with imaging of the desired anatomical structure or physiological process. A filtering technique has been developed which suppresses the interfering process while enhancing the desired process. This technique requires the identification of temporal sequential signatures for both the interfering and desired processes. These signatures are placed in the form of signature vectors. Signature matrices, M_D and M_U, are formed by taking the outer product expansion of the temporal signature vectors for the desired and interfering processes respectively. By using the transformation from the simultaneous diagonalization of these two signature matrices, a weighting vector is obtained. The technique is shown to maximize the projection of the desired process while minimizing the interfering process, based upon an extension of Rayleigh's Principle. The technique is demonstrated for first pass renal and cardiac flow studies. This filter offers a potential for simplifying and extending the accuracy of diagnostic nuclear medicine procedures.
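
The simultaneous diagonalization step can be sketched as a generalized eigenvalue problem: the weighting vector that maximizes the projection of the desired process relative to the interfering one is the leading generalized eigenvector of the pair of signature matrices. The code below is a generic illustration with made-up signature vectors, not the authors' clinical data.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical temporal signature vectors (e.g. time-activity curves).
s_desired = np.array([0.1, 0.5, 1.0, 0.7, 0.3])
s_interfering = np.array([0.8, 0.9, 0.6, 0.4, 0.2])

# Signature matrices as outer products of the signature vectors.
M_D = np.outer(s_desired, s_desired)
M_U = np.outer(s_interfering, s_interfering)

# Generalized eigenproblem M_D w = lambda M_U w; regularize M_U so it is
# positive definite, then take the eigenvector with the largest eigenvalue.
eigvals, eigvecs = eigh(M_D, M_U + 1e-6 * np.eye(len(s_desired)))
w = eigvecs[:, -1]      # weighting vector applied to the image sequence
```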

  7. Detail Enhancement for Infrared Images Based on Propagated Image Filter

    Directory of Open Access Journals (Sweden)

    Yishu Peng

    2016-01-01

For displaying high-dynamic-range images acquired by thermal camera systems, 14-bit raw infrared data must be mapped into 8-bit gray values. This paper presents a new method for detail enhancement of infrared images that displays the image with satisfactory contrast and brightness, rich detail information, and no artifacts caused by the image processing. We first adopt a propagated image filter to smooth the input image and separate it into a base layer and a detail layer. Then, we refine the base layer by using modified histogram projection for compression. Meanwhile, the adaptive weights derived from the layer decomposition are used as the strict gain control for the detail layer. The final display result is obtained by recombining the two modified layers. Experimental results on both cooled and uncooled infrared data verify that the proposed method outperforms the methods based on log-power histogram modification and bilateral filter-based detail enhancement, in both detail enhancement and visual effect.

  8. Improving the quality of brain CT image from Wavelet filters

    International Nuclear Information System (INIS)

    Pita Machado, Reinaldo; Perez Diaz, Marlen; Bravo Pino, Rolando

    2012-01-01

An algorithm to reduce Poisson noise using Wavelet filters is described. Five tomographic images of patients and a head anthropomorphic phantom were used, acquired with two different CT machines. Because the original images already contain acquisition noise, some simulated noise-free lesions were added to the images, and after that the whole images were contaminated with noise. Contaminated images were filtered with 9 Wavelet filters at different decomposition levels and thresholds. Image quality of filtered and unfiltered images was graded using the signal-to-noise ratio, normalized mean square error and the structural similarity index, as well as by the subjective JAFROC method with 5 observers. Some filters, such as bior3.7 and db45, improved head CT image quality in a significant way (p<0.05), producing an increment in SNR without visible structural distortions.
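
A minimal sketch of wavelet-domain denoising of the kind evaluated here, using PyWavelets; the wavelet name, decomposition level and threshold are assumed values, whereas the paper tests 9 different filters over several levels and thresholds.

```python
import numpy as np
import pywt

def wavelet_denoise(image, wavelet="bior3.7", level=3, thresh=0.1):
    """Denoise an image by soft-thresholding its detail wavelet coefficients
    and reconstructing; the approximation coefficients are left untouched."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    new_coeffs = [coeffs[0]]                      # keep the approximation band
    for detail_level in coeffs[1:]:
        new_coeffs.append(tuple(
            pywt.threshold(band, thresh, mode="soft") for band in detail_level))
    return pywt.waverec2(new_coeffs, wavelet)

noisy_slice = np.random.rand(256, 256)            # stand-in for a noisy CT slice
denoised_slice = wavelet_denoise(noisy_slice)
```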

  9. Image Vector Quantization codec indexes filtering

    Directory of Open Access Journals (Sweden)

    Lakhdar Moulay Abdelmounaim

    2012-01-01

Vector Quantisation (VQ) is an efficient coding algorithm that has been widely used in the field of video and image coding, due to its fast decoding efficiency. However, the indexes of VQ are sometimes lost because of signal interference during transmission. In this paper, we propose an efficient estimation method to conceal and recover the lost indexes on the decoder side, to avoid re-transmitting the whole image again. If the image or video has the limitation of a period of validity, re-transmitting the data wastes time and network bandwidth. Therefore, using the originally received correct data to estimate and recover the lost data is efficient in time-constrained situations, such as network conferencing or mobile transmissions. In natural images, the pixels are correlated with their neighbours; VQ partitions the image into sub-blocks and quantises them to the indexes that are transmitted, and the correlation between adjacent indexes is very strong. There are two parts to the proposed method. The first is pre-processing and the second is an estimation process. In pre-processing, we modify the order of codevectors in the VQ codebook to increase the correlation among neighbouring vectors. We then use a special filtering method in the estimation process. Using conventional VQ to compress the Lena image and transmit it without any loss of index can achieve a PSNR of 30.429 dB at the decoder. The simulation results demonstrate that our method can estimate the indexes to achieve PSNR values of 29.084 and 28.327 dB when the loss rate is 0.5% and 1%, respectively.

  10. Infrared image background modeling based on improved Susan filtering

    Science.gov (United States)

    Yuehua, Xia

    2018-02-01

When the SUSAN filter is used to model the background of an infrared image, its Gaussian kernel lacks directional filtering ability; after filtering, the edge information of the image is not preserved well, so many edge singular points remain in the difference image, which increases the difficulty of target detection. To solve these problems, anisotropy is introduced in this paper, and an anisotropic Gaussian filter is used instead of the Gaussian kernel in the SUSAN filter operator. First, an anisotropic gradient operator is used to calculate the horizontal and vertical gradients at each pixel, to determine the direction of the filter's long axis. Second, the smoothness of the pixel's local area and neighbourhood is used to calculate the filter length and the short-axis variance. Then the first-order norm of the difference between the local-area gray levels and their mean is calculated to determine the threshold of the SUSAN filter. Finally, the constructed SUSAN filter is convolved with the image to obtain the background image, and the difference between the background image and the original image is computed. The background modeling performance on infrared images is evaluated by mean squared error (MSE), structural similarity (SSIM) and local signal-to-noise ratio gain (GSNR). Compared with the traditional filtering algorithm, the improved SUSAN filter achieves better background modeling, effectively preserving the edge information in the image; the dim small target is effectively enhanced in the difference image, which greatly reduces the false alarm rate.

  11. Combination of Wiener filtering and singular value decomposition filtering for volume imaging PET

    International Nuclear Information System (INIS)

    Shao, L.; Lewitt, R.M.; Karp, J.S.

    1995-01-01

Although the three-dimensional (3D) multi-slice rebinning (MSRB) algorithm in PET is fast and practical, and provides an accurate reconstruction, the MSRB image, in general, suffers from the noise amplified by its singular value decomposition (SVD) filtering operation in the axial direction. Their aim in this study is to combine the use of the Wiener filter (WF) with the SVD to decrease the noise and improve the image quality. The SVD filtering "deconvolves" the spatially variant axial response function while the WF suppresses the noise and reduces the blurring not modeled by the axial SVD filter but included in the system modulation transfer function. Therefore, the synthesis of these two techniques combines the advantages of both filters. The authors applied this approach to the volume imaging HEAD PENN-PET brain scanner with an axial extent of 256 mm. This combined filter was evaluated in terms of spatial resolution, image contrast, and signal-to-noise ratio with several phantoms, such as a cold sphere phantom and 3D brain phantom. Specifically, the authors studied both the SVD filter with an axial Wiener filter and the SVD filter with a 3D Wiener filter, and compared the filtered images to those from the 3D reprojection (3DRP) reconstruction algorithm. Their results indicate that the Wiener filter increases the signal-to-noise ratio and also improves the contrast. For the MSRB images of the 3D brain phantom, after 3D WF, both the Gray/White and Gray/Ventricle ratios were improved from 1.8 to 2.8 and 2.1 to 4.1, respectively. In addition, the image quality with the MSRB algorithm is close to that of the 3DRP algorithm with 3D WF applied to both image reconstructions.

  12. Fusion of multispectral and panchromatic images using multirate filter banks

    Institute of Scientific and Technical Information of China (English)

    Wang Hong; Jing Zhongliang; Li Jianxun

    2005-01-01

In this paper, an image fusion method based on filter banks is proposed for merging a high-resolution panchromatic image and a low-resolution multispectral image. Firstly, the filter banks are designed to merge different signals with minimum distortion by using cosine modulation. Then, the filter-bank-based image fusion is applied to obtain a high-resolution multispectral image that combines the spectral characteristics of the low-resolution data with the spatial resolution of the panchromatic image. Finally, two different experiments and the corresponding performance analysis are presented. Experimental results indicate that the proposed approach outperforms the IHS transform, the discrete wavelet transform and the discrete wavelet frame.

  13. The value of filtered planar images in pediatric DMSA scans

    International Nuclear Information System (INIS)

    Mohammed, A.M.; Naddaf, S.Y.; Elgazzar, A.H.; Al-Abdul Salam, A.A.; Omar, A.A.

    2006-01-01

The study was designed to demonstrate the value of filtered planar images in paediatric DMSA scanning. One hundred and seventy-three patients, ranging in age from 15 days to 12 years (mean: 4.3 years), with urinary tract infection (UTI) and clinical and/or laboratory suspicion of acute pyelonephritis (APN), were retrospectively studied. Planar images were filtered using a Butterworth filter. The scan findings were reported as positive, negative or equivocal for cortical defects. Each scan was read in a double-blind fashion by two nuclear medicine physicians to evaluate inter-observer variation. Each kidney was divided into three zones, upper, middle and lower, and each zone was graded as positive, negative or equivocal for the presence of renal defects. Renal cortical defects were found in 66 patients (91 kidneys and 186 zones) with filtered images, 58 patients (81 kidneys and 175 zones) with planar images, and 69 patients (87 kidneys and 180 zones) with SPECT images. McNemar's test revealed a statistically significant difference between filtered and planar images (p=0.038 for patients, 0.021 for kidneys and 0.034 for number of zones). Inter-observer agreement was 0.877 for filtered images, 0.915 for planar images and 0.915 for SPECT images. It was concluded that filtered planar images of the renal cortex are comparable to SPECT images and can be used effectively in place of SPECT, when required, to shorten imaging time and eliminate motion artifacts, especially in the paediatric population. (author)

  14. Efficient OCT Image Enhancement Based on Collaborative Shock Filtering.

    Science.gov (United States)

    Liu, Guohua; Wang, Ziyu; Mu, Guoying; Li, Peijin

    2018-01-01

Efficient enhancement of noisy optical coherence tomography (OCT) images is a key task for interpreting them correctly. In this paper, to better enhance details and layered structures of a human retina image, we propose a collaborative shock filtering method for OCT image denoising and enhancement. The noisy OCT image is first denoised by a collaborative filtering method with a new similarity measure, and then the denoised image is sharpened by a shock-type filtering for edge and detail enhancement. For dim OCT images, in order to improve image contrast for the detection of tiny lesions, a gamma transformation is first used to enhance the images within proper gray levels. The proposed method, integrating image smoothing and sharpening simultaneously, obtains better visual results in experiments.

  15. Restoration of nuclear medicine images using adaptive Wiener filters

    International Nuclear Information System (INIS)

    Meinel, G.

    1989-01-01

An adaptive Wiener filter implementation for the restoration of nuclear medicine images is described. These images are considerably disturbed both deterministically (definition) and stochastically (Poisson quantum noise). After introducing an image model, describing the necessary parameter approximations and giving information on optimum design methods, the implementation is described. The filter adapts to the local signal-to-noise ratio and is based on a filter band concept. To verify the restoration effect, quality measures are introduced and the filter is tested against them. (author)
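
As a generic point of comparison (not the paper's SNR-dependent, filter-band design), a locally adaptive Wiener filter is available in SciPy; the window size below is an assumed parameter.

```python
import numpy as np
from scipy.signal import wiener

# Stand-in for a noisy planar nuclear medicine image (Poisson counting noise).
noisy = np.random.poisson(50, size=(128, 128)).astype(float)

# scipy.signal.wiener estimates the local mean and variance in each window and
# attenuates the signal more where the local variance is close to the estimated
# noise power, i.e. it adapts to the local signal-to-noise ratio.
restored = wiener(noisy, mysize=(5, 5))
```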

  16. Image defog algorithm based on open close filter and gradient domain recursive bilateral filter

    Science.gov (United States)

    Liu, Daqian; Liu, Wanjun; Zhao, Qingguo; Fei, Bowen

    2017-11-01

To address the fuzzy details, color distortion and low brightness of images obtained by the dark channel prior defog algorithm, an image defog algorithm based on an open-close filter and a gradient domain recursive bilateral filter, referred to as OCRBF, is put forward. The OCRBF algorithm first uses a weighted quadtree to obtain a more accurate global atmospheric light value, then applies multiple-structure-element morphological open and close filters to the minimum channel map to obtain a rough scattering map from the dark channel prior, uses a variogram to correct the transmittance map, and applies a gradient domain recursive bilateral filter for the smoothing operation; finally it recovers the image through the image degradation model and adjusts the contrast to obtain a bright, clear, fog-free image. A large number of experimental results show that the proposed defog method removes fog well and recovers the color and definition of foggy images containing close-range content, strong perspective and bright areas; compared with other image defog algorithms, it obtains clearer and more natural fog-free images with more visible details. Moreover, the time complexity of the SIDA algorithm is linearly correlated with the number of image pixels.
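
For context, the dark channel prior mentioned above can be computed in a couple of lines (the patch size is an assumed parameter): for each pixel, take the minimum over the color channels and then a local minimum filter; haze-free regions tend to have a dark channel close to zero.

```python
import numpy as np
from scipy.ndimage import minimum_filter

def dark_channel(image, patch_size=15):
    """Dark channel prior: per-pixel minimum over the RGB channels followed by
    a local minimum filter over a patch_size x patch_size neighbourhood."""
    min_over_channels = image.min(axis=2)
    return minimum_filter(min_over_channels, size=patch_size, mode="nearest")

hazy = np.random.rand(240, 320, 3)     # stand-in for a hazy RGB image in [0, 1]
dc = dark_channel(hazy, patch_size=15)
```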

  17. Magnetic resonance image enhancement using V-filter

    International Nuclear Information System (INIS)

    Yamamoto, H.; Sugita, K.; Kanzaki, N.; Johja, I.; Hiraki, Y.

    1990-01-01

The purpose of this study is to present a boundary enhancement algorithm for magnetic resonance images using a V-filter. The boundary of the brain tumor was precisely extracted by region segmentation techniques.

  18. US images encoding envelope amplitude following narrow band filtering

    International Nuclear Information System (INIS)

    Sommer, F.G.; Stern, R.A.; Chen, H.S.

    1986-01-01

    Ultrasonic waveform data from phantoms having differing scattering characteristics and from normal and cirrhotic human liver in vivo were recorded within a standardized dynamic range and filtered with narrow band filters either above or below the mean recorded ultrasonic center frequency. Images created by mapping the amplitudes of received ultrasound following such filtration permitted dramatic differentiation, not discernible in conventional US images, of phantoms having differing scattering characteristics, and of normal and cirrhotic human livers

  19. Human visual modeling and image deconvolution by linear filtering

    International Nuclear Information System (INIS)

    Larminat, P. de; Barba, D.; Gerber, R.; Ronsin, J.

    1978-01-01

The problem is the numerical restoration of images degraded by passing through a known and spatially invariant linear system, and by the addition of stationary noise. We propose an improvement of the Wiener filter to allow the restoration of such images. This improvement reduces the important drawbacks of the classical Wiener filter: the voluminous data processing, and the lack of consideration of the characteristics of human vision, which condition the observer's perception of the restored image. In the first paragraph, we describe the structure of the visual detection system and a method for modelling it. In the second paragraph we explain a restoration method by Wiener filtering that takes the visual properties into account and that can be adapted to the local properties of the image. Then the results obtained on TV images and scintigrams (images obtained by a gamma camera) are discussed. [fr]

  20. Wiener discrete cosine transform-based image filtering

    Science.gov (United States)

    Pogrebnyak, Oleksiy; Lukin, Vladimir V.

    2012-10-01

    A classical problem of additive white (spatially uncorrelated) Gaussian noise suppression in grayscale images is considered. The main attention is paid to discrete cosine transform (DCT)-based denoising, in particular, to image processing in blocks of a limited size. The efficiency of DCT-based image filtering with hard thresholding is studied for different sizes of overlapped blocks. A multiscale approach that aggregates the outputs of DCT filters having different overlapped block sizes is proposed. Later, a two-stage denoising procedure that presumes the use of the multiscale DCT-based filtering with hard thresholding at the first stage and a multiscale Wiener DCT-based filtering at the second stage is proposed and tested. The efficiency of the proposed multiscale DCT-based filtering is compared to the state-of-the-art block-matching and three-dimensional filter. Next, the potentially reachable multiscale filtering efficiency in terms of output mean square error (MSE) is studied. The obtained results are of the same order as those obtained by Chatterjee's approach based on nonlocal patch processing. It is shown that the ideal Wiener DCT-based filter potential is usually higher when noise variance is high.
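
A minimal sketch of the first-stage operation described above, DCT-based denoising with hard thresholding in non-overlapping blocks (real implementations, including the multiscale one studied here, use overlapping blocks and aggregate several block sizes; the block size and threshold below are assumed values).

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_hard_threshold_denoise(image, block=8, thresh=0.1):
    """Denoise by zeroing small DCT coefficients in non-overlapping blocks.
    The DC coefficient of each block is kept so the mean intensity is preserved."""
    out = np.zeros_like(image, dtype=float)
    h, w = image.shape
    for i in range(0, h - h % block, block):
        for j in range(0, w - w % block, block):
            patch = image[i:i + block, j:j + block]
            coeffs = dctn(patch, norm="ortho")
            mask = np.abs(coeffs) >= thresh
            mask[0, 0] = True                      # always keep the DC term
            out[i:i + block, j:j + block] = idctn(coeffs * mask, norm="ortho")
    return out

noisy = np.random.rand(256, 256)
denoised = dct_hard_threshold_denoise(noisy, block=8, thresh=0.1)
```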

  1. Computer processing of the scintigraphic image using digital filtering techniques

    International Nuclear Information System (INIS)

    Matsuo, Michimasa

    1976-01-01

The theory of digital filtering was studied as a method for the computer processing of scintigraphic images. The characteristics and design techniques of finite impulse response (FIR) digital filters with linear phase were examined using the z-transform. The conventional data processing method, smoothing, could be recognized as one kind of linear-phase FIR low-pass digital filtering. Ten representative FIR low-pass digital filters with various cut-off frequencies were scrutinized in the frequency domain in one and two dimensions. These filters were applied to phantom studies with cold targets, using a Scinticamera-Minicomputer on-line System. These studies revealed that the resultant images had a direct connection with the magnitude response of the filter; that is, they could be estimated fairly well from the frequency response of the digital filter used. The filter, which was estimated from phantom studies as optimal for liver scintigrams using 198Au colloid, was successfully applied in clinical use for detecting true cold lesions and, at the same time, for eliminating spurious images. (J.P.N.)
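
As a small illustration of designing and applying a linear-phase FIR low-pass filter of the kind discussed (using SciPy rather than the original minicomputer implementation; the tap count and cutoff are assumed values), a 2D smoothing filter is obtained here by applying the 1D kernel separably along rows and columns.

```python
import numpy as np
from scipy.signal import firwin
from scipy.ndimage import convolve1d

# Linear-phase FIR low-pass filter: 15 taps, cutoff at 0.2 of the Nyquist rate
# (both values are assumptions for the example).
taps = firwin(numtaps=15, cutoff=0.2)

def fir_lowpass_2d(image, taps):
    """Apply a 1D linear-phase FIR kernel separably along both image axes,
    giving a 2D low-pass (smoothing) filter analogous to count smoothing."""
    tmp = convolve1d(image, taps, axis=0, mode="reflect")
    return convolve1d(tmp, taps, axis=1, mode="reflect")

scintigram = np.random.poisson(20, size=(64, 64)).astype(float)
smoothed = fir_lowpass_2d(scintigram, taps)
```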

  2. Imaging spectrometer using a liquid crystal tunable filter

    Science.gov (United States)

    Chrien, Thomas G.; Chovit, Christopher; Miller, Peter J.

    1993-09-01

    A demonstration imaging spectrometer using a liquid crystal tunable filter (LCTF) was built and tested on a hot air balloon platform. The LCTF is a tunable polarization interference or Lyot filter. The LCTF enables a small, light weight, low power, band sequential imaging spectrometer design. An overview of the prototype system is given along with a description of balloon experiment results. System model performance predictions are given for a future LCTF based imaging spectrometer design. System design considerations of LCTF imaging spectrometers are discussed.

  3. Iodine filter imaging system for subtraction angiography using synchrotron radiation

    Science.gov (United States)

    Umetani, K.; Ueda, K.; Takeda, T.; Itai, Y.; Akisada, M.; Nakajima, T.

    1993-11-01

    A new type of real-time imaging system was developed for transvenous coronary angiography. A combination of an iodine filter and a single energy broad-bandwidth X-ray produces two-energy images for the iodine K-edge subtraction technique. X-ray images are sequentially converted to visible images by an X-ray image intensifier. By synchronizing the timing of the movement of the iodine filter into and out of the X-ray beam, two output images of the image intensifier are focused side by side on the photoconductive layer of a camera tube by an oscillating mirror. Both images are read out by electron beam scanning of a 1050-scanning-line video camera within a camera frame time of 66.7 ms. One hundred ninety two pairs of iodine-filtered and non-iodine-filtered images are stored in the frame memory at a rate of 15 pairs/s. In vivo subtracted images of coronary arteries in dogs were obtained in the form of motion pictures.

  4. Depth Images Filtering In Distributed Streaming

    Directory of Open Access Journals (Sweden)

    Dziubich Tomasz

    2016-04-01

In this paper, we propose a distributed system for processing point clouds and transferring them via a computer network with regard to effectiveness-related requirements. We discuss a comparison of point cloud filters, focusing on their usage for streaming optimization. For the filtering step of the stream pipeline we evaluate four filters: Voxel Grid, Radius Outlier Removal, Statistical Outlier Removal and Pass Through; a sketch of the first of these is given after this abstract. For each of the filters we perform a series of tests evaluating the impact on the point cloud size and transmission frequency (analysed for various fps ratios). We present results of the optimization process used for point cloud consolidation in a distributed environment, and describe the processing of the point clouds before and after transmission. Pre- and post-processing allow the user to send the cloud via the network without any delays. The proposed pre-processing compression of the cloud and post-processing reconstruction are focused on ensuring that the end-user application obtains the cloud with a given precision.
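
A minimal sketch of the Voxel Grid downsampling filter mentioned above (a generic NumPy version, not the PCL implementation used in the paper; the leaf size is an assumed parameter): points are binned into cubic voxels and each occupied voxel is replaced by the centroid of its points, shrinking the cloud before transmission.

```python
import numpy as np

def voxel_grid_filter(points, leaf_size=0.05):
    """Downsample an (N, 3) point cloud: bucket points into cubic voxels of
    edge length leaf_size and return one centroid per occupied voxel."""
    voxel_idx = np.floor(points / leaf_size).astype(np.int64)
    # Group points that fall into the same voxel.
    _, inverse = np.unique(voxel_idx, axis=0, return_inverse=True)
    n_voxels = inverse.max() + 1
    counts = np.bincount(inverse, minlength=n_voxels).astype(float)
    centroids = np.zeros((n_voxels, 3))
    for dim in range(3):
        centroids[:, dim] = np.bincount(inverse, weights=points[:, dim],
                                        minlength=n_voxels) / counts
    return centroids

cloud = np.random.rand(10000, 3)              # stand-in for a Kinect point cloud
reduced = voxel_grid_filter(cloud, leaf_size=0.05)
```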

  5. Selected annotated bibliographies for adaptive filtering of digital image data

    Science.gov (United States)

    Mayers, Margaret; Wood, Lynnette

    1988-01-01

Digital spatial filtering is an important tool both for enhancing the information content of satellite image data and for implementing cosmetic effects which make the imagery more interpretable and appealing to the eye. Spatial filtering is a context-dependent operation that alters the gray level of a pixel by computing a weighted average formed from the gray level values of other pixels in the immediate vicinity. Traditional spatial filtering involves passing a particular filter or set of filters over an entire image. This assumes that the filter parameter values are appropriate for the entire image, which in turn is based on the assumption that the statistics of the image are constant over the image. However, the statistics of an image may vary widely over the image, requiring an adaptive or "smart" filter whose parameters change as a function of the local statistical properties of the image. Then a pixel would be averaged only with more typical members of the same population. This annotated bibliography cites some of the work done in the area of adaptive filtering. The methods usually fall into two categories: (a) those that segment the image into subregions, each assumed to have stationary statistics, and use a different filter on each subregion, and (b) those that use a two-dimensional "sliding window" to continuously estimate the filter in either the spatial or frequency domain, or may utilize both domains. They may be used to deal with images degraded by space-variant noise, to suppress undesirable local radiometric statistics while enforcing desirable (user-defined) statistics, to treat problems where space-variant point spread functions are involved, to segment images into regions of constant value for classification, or to "tune" images in order to remove (nonstationary) variations in illumination, noise, contrast, shadows, or haze. Since adaptive filtering, like nonadaptive filtering, is used in image processing to accomplish various goals, this bibliography

  6. Oriented diffusion filtering for enhancing low-quality fingerprint images

    KAUST Repository

    Gottschlich, C.; Schönlieb, C.-B.

    2012-01-01

    To enhance low-quality fingerprint images, we present a novel method that first estimates the local orientation of the fingerprint ridge and valley flow and next performs oriented diffusion filtering, followed by a locally adaptive contrast enhancement step. By applying the authors' new approach to low-quality images of the FVC2004 fingerprint databases, the authors are able to show its competitiveness with other state-of-the-art enhancement methods for fingerprints like curved Gabor filtering. A major advantage of oriented diffusion filtering over those is its computational efficiency. Combining oriented diffusion filtering with curved Gabor filters led to additional improvements and, to the best of the authors' knowledge, the lowest equal error rates achieved so far using MINDTCT and BOZORTH3 on the FVC2004 databases. The recognition performance and the computational efficiency of the method suggest to include oriented diffusion filtering as a standard image enhancement add-on module for real-time fingerprint recognition systems. In order to facilitate the reproduction of these results, an implementation of the oriented diffusion filtering for Matlab and GNU Octave is made available for download. © 2012 The Institution of Engineering and Technology.

  8. Filter and slice thickness selection in SPECT image reconstruction

    International Nuclear Information System (INIS)

    Ivanovic, M.; Weber, D.A.; Wilson, G.A.; O'Mara, R.E.

    1985-01-01

    The choice of filter and slice thickness in SPECT image reconstruction as a function of activity and of linear and angular sampling was investigated in phantom and patient imaging studies. Reconstructed transverse and longitudinal spatial resolution of the system were measured using a line source in a water-filled phantom. Phantom studies included measurements of the Data Spectrum phantom; clinical studies included tomographic procedures in 40 patients undergoing imaging of the temporomandibular joint. Slices of the phantom and patient images were evaluated for spatial resolution, noise, and image quality. Major findings include: spatial resolution and image quality improve with increasing linear sampling frequency over the range of 4-8 mm/p in the phantom images; the best spatial resolution and image quality in clinical images were observed at a linear sampling frequency of 6 mm/p; the Shepp and Logan filter gives the best spatial resolution for phantom studies at the lowest linear sampling frequency; the smoothed Shepp and Logan filter provides the best quality images without loss of resolution at higher frequencies; and spatial resolution and image quality improve with increased angular sampling frequency in the phantom at 40 c/p but appear to be independent of angular sampling frequency at 400 c/p.

  9. Optimization of Butterworth filter for brain SPECT imaging

    International Nuclear Information System (INIS)

    Minoshima, Satoshi; Maruno, Hirotaka; Yui, Nobuharu

    1993-01-01

    A method is described to optimize the cutoff frequency of the Butterworth filter for brain SPECT imaging. Since a computer simulation study demonstrated that the separation between the object signal and random noise in projection images in the spatial-frequency domain is influenced by the total number of counts, the cutoff frequency of the Butterworth filter should be optimized for individual subjects according to the total counts in a study. To reveal the relationship between the optimal cutoff frequency and total counts in a brain SPECT study, we used a normal volunteer and 99mTc hexamethyl-propyleneamine oxime (HMPAO) to obtain projection sets with different total counts. High quality images were created from a projection set with an acquisition time of 300 seconds per projection. The filter was optimized by calculating mean square errors from the high quality images and by visually inspecting filtered reconstructed images. The dependence between total counts and optimal cutoff frequencies was clearly demonstrated in a nomogram. Using this nomogram, the optimal cutoff frequency for each study can be estimated from the total counts, maximizing visual image quality. The results suggest that the cutoff frequency of the Butterworth filter should be determined by referring to the total counts in each study. (author)
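
    The nomogram itself is not reproduced here; as a hedged illustration of the underlying operation, the sketch below applies a 2-D Butterworth low-pass filter with a selectable cutoff (in cycles/pixel) to a projection image using NumPy. The order, cutoff value, and test data are assumptions, not values from the study.

    ```python
    import numpy as np

    def butterworth_lowpass(image, cutoff, order=4):
        """Apply a 2-D Butterworth low-pass filter; cutoff is in cycles/pixel."""
        ny, nx = image.shape
        fy = np.fft.fftfreq(ny)[:, None]
        fx = np.fft.fftfreq(nx)[None, :]
        freq = np.sqrt(fx**2 + fy**2)
        transfer = 1.0 / (1.0 + (freq / cutoff)**(2 * order))
        return np.real(np.fft.ifft2(np.fft.fft2(image) * transfer))

    # Lower total counts would call for a lower cutoff (values purely illustrative)
    projection = np.random.poisson(5.0, size=(128, 128)).astype(float)
    smoothed = butterworth_lowpass(projection, cutoff=0.15, order=4)
    ```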

  10. Multi-look polarimetric SAR image filtering using simulated annealing

    DEFF Research Database (Denmark)

    Schou, Jesper

    2000-01-01

    Based on a previously published algorithm capable of estimating the radar cross-section in synthetic aperture radar (SAR) intensity images, a new filter is presented utilizing multi-look polarimetric SAR images. The underlying mean covariance matrix is estimated from the observed sample covariance...

  11. An adaptive Kalman filter for speckle reductions in ultrasound images

    International Nuclear Information System (INIS)

    Castellini, G.; Labate, D.; Masotti, L.; Mannini, E.; Rocchi, S.

    1988-01-01

    Speckle is the term used to describe the granular appearance found in ultrasound images. The presence of speckle reduces the diagnostic potential of the echographic technique because it tends to mask small inhomogeneities of the investigated tissue. We developed a new method of speckle reduction that utilizes an adaptive one-dimensional Kalman filter, based on the assumption that the observed image can be considered as a superimposition of speckle on a ''true image''. The filter adaptivity, necessary to avoid loss of resolution, has been obtained by statistical considerations on the local signal variations. The results of applying this particular Kalman filter, both to A-mode and B-mode images, show a significant speckle reduction
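
    The exact adaptation rule is not given in the abstract; the following sketch shows one plausible form of an adaptive one-dimensional Kalman filter in Python, in which the process noise is driven by the local signal variance so that edges are tracked while homogeneous speckle regions are smoothed. All parameter values are assumptions.

    ```python
    import numpy as np

    def adaptive_kalman_line(line, r=9.0, q_scale=0.1, window=9):
        """Scalar Kalman filter along one scan line with variance-driven process noise."""
        x = np.asarray(line, dtype=float)
        n = x.size
        pad = window // 2
        padded = np.pad(x, pad, mode='reflect')
        local_var = np.array([padded[i:i + window].var() for i in range(n)])
        est, p = x[0], 1.0                 # state estimate and its variance
        out = np.empty(n)
        for k in range(n):
            q = q_scale * local_var[k]     # large where the signal changes (edges)
            p += q                         # predict (random-walk model)
            gain = p / (p + r)             # update with measurement noise r
            est += gain * (x[k] - est)
            p *= (1.0 - gain)
            out[k] = est
        return out

    # Example: piecewise-constant line corrupted by noise
    line = np.repeat([10.0, 40.0, 20.0], 100) + np.random.normal(0, 3, 300)
    smoothed = adaptive_kalman_line(line)
    ```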

  12. A vector Wiener filter for dual-radionuclide imaging

    International Nuclear Information System (INIS)

    Links, J.M.; Prince, J.L.; Gupta, S.N.

    1996-01-01

    The routine use of a single radionuclide for patient imaging in nuclear medicine can be complemented by studies employing two tracers to examine two different processes in a single organ, most frequently by simultaneous imaging of both radionuclides in two different energy windows. In addition, simultaneous transmission/emission imaging with dual-radionuclides has been described, with one radionuclide used for the transmission study and a second for the emission study. There is thus currently considerable interest in dual-radionuclide imaging. A major problem with all dual-radionuclide imaging is the crosstalk between the two radionuclides. Such crosstalk frequently occurs, because scattered radiation from the higher energy radionuclide is detected in the lower energy window, and because the lower energy radionuclide may have higher energy emissions which are detected in the higher energy window. The authors have previously described the use of Fourier-based restoration filtering in single photon emission computed tomography (SPECT) and positron emission tomography (PET) to improve quantitative accuracy by designing a Wiener or other Fourier filter to partially restore the loss of contrast due to scatter and finite spatial resolution effects. The authors describe here the derivation and initial validation of an extension of such filtering for dual-radionuclide imaging that simultaneously (1) improves contrast in each radionuclide's direct image, (2) reduces image noise, and (3) reduces the crosstalk contribution from the other radionuclide. This filter is based on a vector version of the Wiener filter, which is shown to be superior [in the minimum mean square error (MMSE) sense] to the sequential application of separate crosstalk and restoration filters

  13. Superharmonic imaging with chirp coded excitation: filtering spectrally overlapped harmonics.

    Science.gov (United States)

    Harput, Sevan; McLaughlan, James; Cowell, David M J; Freear, Steven

    2014-11-01

    Superharmonic imaging improves the spatial resolution by using the higher order harmonics generated in tissue. The superharmonic component is formed by combining the third, fourth, and fifth harmonics, which have low energy content and therefore poor SNR. This study uses coded excitation to increase the excitation energy. The SNR improvement is achieved on the receiver side by performing pulse compression with harmonic matched filters. The use of coded signals also introduces new filtering capabilities that are not possible with pulsed excitation. This is especially important when using wideband signals. For narrowband signals, the spectral boundaries of the harmonics are clearly separated and thus easy to filter; however, the available imaging bandwidth is underused. Wideband excitation is preferable for harmonic imaging applications to preserve axial resolution, but it generates spectrally overlapping harmonics that are not possible to filter in the time and frequency domains. After pulse compression, this overlap increases the range side lobes, which appear as imaging artifacts and reduce the B-mode image quality. In this study, the isolation of higher order harmonics was achieved in another domain by using the fan chirp transform (FChT). To show the effect of excitation bandwidth in superharmonic imaging, measurements were performed by using linear frequency modulated chirp excitation with varying bandwidths of 10% to 50%. Superharmonic imaging was performed on a wire phantom using a wideband chirp excitation. Results were presented with and without applying the FChT filtering technique by comparing the spatial resolution and side lobe levels. Wideband excitation signals achieved a better resolution as expected; however, range side lobes as high as -23 dB were observed for the superharmonic component of chirp excitation with 50% fractional bandwidth. The proposed filtering technique achieved >50 dB range side lobe suppression and improved the image quality without
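
    The fan chirp transform step is not reproduced here; the sketch below only illustrates the preceding steps — linear FM chirp excitation and pulse compression with a harmonic matched filter — using SciPy. The sampling rate, sweep, and harmonic number are assumptions.

    ```python
    import numpy as np
    from scipy.signal import chirp, correlate

    fs = 100e6                                   # sampling rate (assumed)
    t = np.arange(0, 5e-6, 1 / fs)               # 5 us excitation window
    f0, f1 = 2.0e6, 3.0e6                        # linear FM sweep (assumed)
    window = np.hanning(t.size)
    tx = chirp(t, f0=f0, t1=t[-1], f1=f1) * window

    # Simulated echo: weak third-harmonic chirp buried in noise
    echo = 0.05 * chirp(t, f0=3 * f0, t1=t[-1], f1=3 * f1) * window
    echo += np.random.normal(0, 0.02, t.size)

    # Harmonic matched filter: replica of the third-harmonic chirp
    replica = chirp(t, f0=3 * f0, t1=t[-1], f1=3 * f1) * window
    compressed = correlate(echo, replica, mode='same')   # pulse compression
    ```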

  14. Applications of custom scripting in digital micrograph: general image manipulation and utilities

    International Nuclear Information System (INIS)

    Mitchell, D.R.G.

    2002-01-01

    Full text: The Gatan Imaging Filter (GIF) uses a charge coupled device (CCD) camera to capture images and spectra. Image capture and manipulation is achieved through Gatan's Digital Micrograph software. This has many capabilities built in, and can be further extended through installation of custom scripts. These are typically short programs written in a powerful scripting language, which permits many aspects of image acquisition and subsequent manipulation to be controlled by the user. Custom scripts can be added to the normal pull-down menus, producing a very flexible and easy to use environment. The scripts described here demonstrate how custom scripting can enhance the functionality of a modern analytical TEM equipped with, in this instance, a GIF. However, scripting will enhance any TEM using a CCD camera controlled through Digital Micrograph. The examples shown here include: (a) a script to rotationally average a selected area diffraction pattern and produce a calibrated radial intensity profile, (b) a utility script which monitors and graphically displays the CCD temperature as a function of time, and (c) a simple script to propagate image spatial calibrations to uncalibrated images, such as EFTEM images. Other scripts by the author along with some scripting resources are also discussed. Copyright (2002) Australian Society for Electron Microscopy Inc

  15. A filtering approach to edge preserving MAP estimation of images.

    Science.gov (United States)

    Humphrey, David; Taubman, David

    2011-05-01

    The authors present a computationally efficient technique for maximum a posteriori (MAP) estimation of images in the presence of both blur and noise. The image is divided into statistically independent regions. Each region is modelled with a WSS Gaussian prior. Classical Wiener filter theory is used to generate a set of convex sets in the solution space, with the solution to the MAP estimation problem lying at the intersection of these sets. The proposed algorithm uses an underlying segmentation of the image, and a means of determining the segmentation and refining it is described. The algorithm is suitable for a range of image restoration problems, as it provides a computationally efficient means to deal with the shortcomings of Wiener filtering without sacrificing the computational simplicity of the filtering approach. The algorithm is also of interest from a theoretical viewpoint as it provides a continuum of solutions between Wiener filtering and inverse filtering depending upon the segmentation used. We do not attempt to show here that the proposed method is the best general approach to the image reconstruction problem. However, related work referenced herein shows excellent performance in the specific problem of demosaicing.

  16. Detail-enhanced multimodality medical image fusion based on gradient minimization smoothing filter and shearing filter.

    Science.gov (United States)

    Liu, Xingbin; Mei, Wenbo; Du, Huiqian

    2018-02-13

    In this paper, a detail-enhanced multimodality medical image fusion algorithm is proposed, based on a multi-scale joint decomposition framework (MJDF) and a shearing filter (SF). The MJDF, constructed with a gradient minimization smoothing filter (GMSF) and a Gaussian low-pass filter (GLF), is used to decompose source images into low-pass layers, edge layers, and detail layers at multiple scales. In order to highlight the detail information in the fused image, the edge layer and the detail layer at each scale are combined with weights into a detail-enhanced layer. As the directional filter is effective in capturing salient information, SF is applied to the detail-enhanced layer to extract geometrical features and obtain directional coefficients. A visual saliency map-based fusion rule is designed for fusing the low-pass layers, and the sum of standard deviations is used as the activity level measurement for the fusion of directional coefficients. The final fusion result is obtained by synthesizing the fused low-pass layers and directional coefficients. Experimental results show that the proposed method, with shift-invariance, directional selectivity, and a detail-enhancing property, is efficient in preserving and enhancing the detail information of multimodality medical images. Graphical abstract: the detailed implementation of the proposed medical image fusion algorithm.

  17. ANALYSIS OF SST IMAGES BY WEIGHTED ENSEMBLE TRANSFORM KALMAN FILTER

    OpenAIRE

    Sai , Gorthi; Beyou , Sébastien; Memin , Etienne

    2011-01-01

    This paper presents a novel, efficient scheme for the analysis of Sea Surface Temperature (SST) ocean images. We consider the estimation of the velocity fields and vorticity values from a sequence of oceanic images. The contribution of this paper lies in proposing a novel, robust and simple approach based on the Weighted Ensemble Transform Kalman filter (WETKF) data assimilation technique for the analysis of real SST images, that may contain coast regions or large areas of ...

  18. Multi-dimensional medical images compressed and filtered with wavelets

    International Nuclear Information System (INIS)

    Boyen, H.; Reeth, F. van; Flerackers, E.

    2002-01-01

    Full text: Using the standard wavelet decomposition methods, multi-dimensional medical images can be compressed and filtered by repeating the wavelet algorithm on 1D signals in an extra loop per extra dimension. In the non-standard decomposition for multi-dimensional images, the areas that must be zero-filled in the case of band- or notch-filters are more complex than geometric areas such as rectangles or cubes. Adding an additional dimension in this algorithm up to 4D (e.g. a 3D beating heart) increases the geometric complexity of those areas even more. The aim of our study was to calculate the boundaries of the formed complex geometric areas, so we can use the faster non-standard decomposition to compress and filter multi-dimensional medical images. Because a lot of 3D medical images taken by PET or SPECT cameras have only a few layers in the Z-dimension, and compressing images in a dimension with few voxels is usually not worthwhile, we provided a solution in which one can choose which dimensions will be compressed or filtered. With the proposal of non-standard decomposition on Daubechies' wavelets D2 to D20 by Steven Gollmer in 1992, 1D data can be compressed and filtered. Each additional level works only on the smoothed data, so the transformation time halves per extra level. Zero-filling a well-defined area after the wavelet transform and then performing the inverse transform will do the filtering. To be able to compress and filter up to 4D images with the faster non-standard wavelet decomposition method, we have investigated a new method for calculating the boundaries of the areas which must be zero-filled in the case of filtering. This is especially true for band- and notch-filtering. Contrary to the standard decomposition method, the areas are no longer rectangles in 2D or cubes in 3D or a row of cubes in 4D: they are rectangles expanded with a half-sized rectangle in the other direction for 2D, cubes expanded with half cubes in one and quarter cubes in the

  19. Apodized RFI filtering of synthetic aperture radar images

    Energy Technology Data Exchange (ETDEWEB)

    Doerry, Armin Walter

    2014-02-01

    Fine resolution Synthetic Aperture Radar (SAR) systems necessarily require wide bandwidths that often overlap spectrum utilized by other wireless services. These other emitters pose a source of Radio Frequency Interference (RFI) to the SAR echo signals that degrades SAR image quality. Filtering, or excising, the offending spectral contaminants will mitigate the interference, but at a cost of often degrading the SAR image in other ways, notably by raising offensive sidelobe levels. This report proposes borrowing an idea from nonlinear sidelobe apodization techniques to suppress interference without the attendant increase in sidelobe levels. The simple post-processing technique is termed Apodized RFI Filtering (ARF).

  20. Mammographic image enhancement using wavelet transform and homomorphic filter

    Directory of Open Access Journals (Sweden)

    F Majidi

    2015-12-01

    Full Text Available Mammography is the most effective method for the early diagnosis of breast cancer. As mammographic images have a low signal to noise ratio and low contrast, it becomes difficult for radiologists to analyze mammograms. To deal with the above stated problems, it is very important to enhance the mammographic images using image processing methods. This paper introduces a new image enhancement approach for mammographic images which uses modified mathematical morphology, the wavelet transform and a homomorphic filter to suppress the noise of images. For performance evaluation of the proposed method, the contrast improvement index (CII) and edge preservation index (EPI) are adopted. Experimental results on mammographic images from the Pejvak Digital Imaging Center (PDIC) show that the proposed algorithm improves the two indexes, thereby achieving the goal of enhancing mammographic images.
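
    The morphological and wavelet steps are not reproduced here; the sketch below only illustrates the homomorphic filtering component — log transform, frequency-domain weighting, exponentiation — in Python/NumPy. The transfer-function shape and gain values are illustrative assumptions.

    ```python
    import numpy as np

    def homomorphic_filter(image, cutoff=0.05, gamma_low=0.5, gamma_high=1.5):
        """Suppress low-frequency illumination and boost detail in the log domain."""
        img = image.astype(float) + 1.0                    # avoid log(0)
        log_img = np.log(img)
        ny, nx = img.shape
        fy = np.fft.fftfreq(ny)[:, None]
        fx = np.fft.fftfreq(nx)[None, :]
        d2 = fx**2 + fy**2
        # Gaussian-shaped high-emphasis transfer function between the two gains
        h = gamma_low + (gamma_high - gamma_low) * (1.0 - np.exp(-d2 / (2.0 * cutoff**2)))
        filtered = np.real(np.fft.ifft2(np.fft.fft2(log_img) * h))
        return np.exp(filtered) - 1.0

    # Example on a synthetic low-contrast image with a slow illumination gradient
    x = np.linspace(0, 1, 256)
    img = 50 + 100 * x[None, :] + 10 * np.random.rand(256, 256)
    enhanced = homomorphic_filter(img)
    ```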

  1. Implementation of non-linear filters for iterative penalized maximum likelihood image reconstruction

    International Nuclear Information System (INIS)

    Liang, Z.; Gilland, D.; Jaszczak, R.; Coleman, R.

    1990-01-01

    In this paper, the authors report on the implementation of six edge-preserving, noise-smoothing, non-linear filters applied in image space for iterative penalized maximum-likelihood (ML) SPECT image reconstruction. The non-linear smoothing filters implemented were the median filter, the E6 filter, the sigma filter, the edge-line filter, the gradient-inverse filter, and the 3-point edge filter with gradient-inverse weight. A 3 x 3 window was used for all these filters. The best image obtained, judged by viewing the profiles through the image in terms of noise-smoothing, edge-sharpening, and contrast, was the one smoothed with the 3-point edge filter. The computation time for the smoothing was less than 1% of one iteration, and the memory space for the smoothing was negligible. These images were compared with the results obtained using Bayesian analysis

  2. Image Recommendation Algorithm Using Feature-Based Collaborative Filtering

    Science.gov (United States)

    Kim, Deok-Hwan

    As the multimedia contents market continues its rapid expansion, the amount of image contents used in mobile phone services, digital libraries, and catalog service is increasing remarkably. In spite of this rapid growth, users experience high levels of frustration when searching for the desired image. Even though new images are profitable to the service providers, traditional collaborative filtering methods cannot recommend them. To solve this problem, in this paper, we propose feature-based collaborative filtering (FBCF) method to reflect the user's most recent preference by representing his purchase sequence in the visual feature space. The proposed approach represents the images that have been purchased in the past as the feature clusters in the multi-dimensional feature space and then selects neighbors by using an inter-cluster distance function between their feature clusters. Various experiments using real image data demonstrate that the proposed approach provides a higher quality recommendation and better performance than do typical collaborative filtering and content-based filtering techniques.

  3. Correlation Filters for Detection of Cellular Nuclei in Histopathology Images.

    Science.gov (United States)

    Ahmad, Asif; Asif, Amina; Rajpoot, Nasir; Arif, Muhammad; Minhas, Fayyaz Ul Amir Afsar

    2017-11-21

    Nuclei detection in histology images is an essential part of computer aided diagnosis of cancers and tumors. It is a challenging task due to diverse and complicated structures of cells. In this work, we present an automated technique for detection of cellular nuclei in hematoxylin and eosin stained histopathology images. Our proposed approach is based on kernelized correlation filters. Correlation filters have been widely used in object detection and tracking applications but their strength has not been explored in the medical imaging domain up till now. Our experimental results show that the proposed scheme gives state of the art accuracy and can learn complex nuclear morphologies. Like deep learning approaches, the proposed filters do not require engineering of image features as they can operate directly on histopathology images without significant preprocessing. However, unlike deep learning methods, the large-margin correlation filters developed in this work are interpretable, computationally efficient and do not require specialized or expensive computing hardware. A cloud based webserver of the proposed method and its python implementation can be accessed at the following URL: http://faculty.pieas.edu.pk/fayyaz/software.html#corehist .

  4. Use of metameric filters for future interference security image structures

    Science.gov (United States)

    Baloukas, Bill; Larouche, Stéphane; Martinu, Ludvik

    2006-02-01

    In the present work, we describe innovative approaches and properties that can be added to the already popular thin film optically variable devices (OVD) used on banknotes. We show two practical examples of OVDs, namely (i) a pair of metameric filters offering a hidden image effect as a function of the angle of observation as well as a specific spectral property permitting automatic note readability, and (ii) multi-material filters offering a side-dependent color shift. We first describe the design approach of these new devices followed by their sensitivity to deposition errors especially in the case of the metameric filters where slight thickness variations have a significant effect on the obtained colors. The performance of prototype filters prepared by dual ion beam sputtering (DIBS) is shown.

  5. Digital Path Approach Despeckle Filter for Ultrasound Imaging and Video

    Directory of Open Access Journals (Sweden)

    Marek Szczepański

    2017-01-01

    Full Text Available We propose a novel filtering technique capable of reducing the multiplicative noise in ultrasound images that is an extension of the denoising algorithms based on the concept of digital paths. In this approach, the filter weights are calculated taking into account the similarity between pixel intensities that belong to the local neighborhood of the processed pixel, which is called a path. The output of the filter is estimated as the weighted average of pixels connected by the paths. The way of creating paths is pivotal and determines the effectiveness and computational complexity of the proposed filtering design. Such a procedure can be effective for different types of noise but fails in the presence of multiplicative noise. To increase the filtering efficiency for this type of disturbance, we introduce some improvements of the basic concept and new classes of similarity functions and finally extend our techniques to the spatiotemporal domain. The experimental results prove that the proposed algorithm provides results comparable with state-of-the-art techniques for multiplicative noise removal in ultrasound images and can be applied for real-time image enhancement of video streams.

  6. Filtering and deconvolution for bioluminescence imaging of small animals

    International Nuclear Information System (INIS)

    Akkoul, S.

    2010-01-01

    This thesis is devoted to the analysis of bioluminescence images applied to the small animal. This kind of imaging modality is used in oncology studies. Nevertheless, some problems are related to the diffusion and absorption by the tissues of the light emitted by internal bioluminescent sources. In addition, system noise and cosmic ray noise are present. This influences the quality of the images and makes them difficult to analyze. The purpose of this thesis is to overcome these disturbing effects. We first propose an image formation model for the bioluminescence images. The processing chain consists of a filtering stage followed by a deconvolution stage. We propose a new median filter to suppress the random-valued impulsive noise which corrupts the acquired images; this filter represents the first block of the proposed chain. For the deconvolution stage, we performed a comparative study of various deconvolution algorithms. It allowed us to choose a blind deconvolution algorithm initialized with the estimated point spread function of the acquisition system. We first validated our global approach by comparing our results with the ground truth. Through various clinical tests, we have shown that the processing chain allows a significant improvement of the spatial resolution and a better distinction of very close tumor sources, which represents a considerable contribution for the users of bioluminescence images. (author)

  7. Weighted ensemble transform Kalman filter for image assimilation

    Directory of Open Access Journals (Sweden)

    Sebastien Beyou

    2013-01-01

    Full Text Available This study proposes an extension of the Weighted Ensemble Kalman filter (WEnKF) proposed by Papadakis et al. (2010) for the assimilation of image observations. The main focus of this study is on a novel formulation of the Weighted filter with the Ensemble Transform Kalman filter (WETKF), incorporating directly as a measurement model a non-linear image reconstruction criterion. This technique has been compared to the original WEnKF on numerical and real-world data of 2-D turbulence observed through the transport of a passive scalar. In particular, it has been applied for the reconstruction of oceanic surface current vorticity fields from sea surface temperature (SST) satellite data. This latter technique enables a consistent recovery along time of oceanic surface currents and vorticity maps in the presence of large missing data areas and strong noise.

  8. Processing of a neutrographic image, using Bosso Filter

    International Nuclear Information System (INIS)

    Pereda, C.; Bustamante, M.; Henriquez, C.

    2006-01-01

    This paper shows the results of the treatment of a neutron radiographic image obtained at the RECH-1 experimental reactor, using the computational image processing techniques of the IDL software complemented with the Bosso filter method, already tested for improving quality in medical diagnosis. These techniques possess an undeniable value as an auxiliary to neutrography, whose results can be seen in this first trial with an auxiliary neutrographic image used in PGNAA. These results suggest that this method should bring all its advantages to standard neutrographic analyses: structural images, density variations, etc

  9. Despeckle filtering for ultrasound imaging and video II selected applications

    CERN Document Server

    Loizou, Christos P

    2015-01-01

    In ultrasound imaging and video, visual perception is hindered by multiplicative speckle noise that degrades the quality. Noise reduction is therefore essential for improving the visual observation quality or as a pre-processing step for further automated analysis, such as image/video segmentation, texture analysis and encoding in ultrasound imaging and video. The goal of the first book (book 1 of 2) was to introduce the problem of speckle in ultrasound image and video as well as the theoretical background, algorithmic steps, and the Matlab™ code for the following group of despeckle filters:

  10. The influence of software filtering in digital mammography image quality

    Science.gov (United States)

    Michail, C.; Spyropoulou, V.; Kalyvas, N.; Valais, I.; Dimitropoulos, N.; Fountos, G.; Kandarakis, I.; Panayiotakis, G.

    2009-05-01

    Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in the early detection of breast cancer such as conventional and digital x-ray mammography, positron and single-photon emission mammography, etc. A key advantage in digital mammography is that images can be manipulated as simple computer image files. Thus non-dedicated commercially available image manipulation software can be employed to process and store the images. The image processing tools of the Photoshop (CS 2) software usually incorporate digital filters which may be used to reduce image noise, enhance contrast and increase spatial resolution. However, improving one image quality parameter may result in degradation of another. The aim of this work was to investigate the influence of three sharpening filters, named hereafter sharpen, sharpen more and sharpen edges, on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion, it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.
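
    Photoshop's exact filter kernels are proprietary; as a hedged illustration of the kind of sharpening such tools perform, the sketch below shows a generic 3x3 sharpening convolution and an unsharp mask using SciPy. Kernel weights and parameters are assumptions.

    ```python
    import numpy as np
    from scipy.ndimage import convolve, gaussian_filter

    def sharpen(image):
        """Generic 3x3 sharpening kernel (high-pass boost of the centre pixel)."""
        kernel = np.array([[ 0, -1,  0],
                           [-1,  5, -1],
                           [ 0, -1,  0]], dtype=float)
        return convolve(image.astype(float), kernel, mode='reflect')

    def unsharp_mask(image, sigma=1.0, amount=1.0):
        """Unsharp masking: add a scaled high-frequency residual back to the image."""
        img = image.astype(float)
        return img + amount * (img - gaussian_filter(img, sigma))

    # Both operations trade increased noise for higher apparent resolution
    mammogram = np.random.rand(128, 128) * 255.0
    sharper = unsharp_mask(mammogram, sigma=1.0, amount=0.8)
    ```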

  11. The influence of software filtering in digital mammography image quality

    International Nuclear Information System (INIS)

    Michail, C; Spyropoulou, V; Valais, I; Panayiotakis, G; Kalyvas, N; Fountos, G; Kandarakis, I; Dimitropoulos, N

    2009-01-01

    Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in the early detection of breast cancer such as conventional and digital x-ray mammography, positron and single-photon emission mammography, etc. A key advantage in digital mammography is that images can be manipulated as simple computer image files. Thus non-dedicated commercially available image manipulation software can be employed to process and store the images. The image processing tools of the Photoshop (CS 2) software usually incorporate digital filters which may be used to reduce image noise, enhance contrast and increase spatial resolution. However, improving one image quality parameter may result in degradation of another. The aim of this work was to investigate the influence of three sharpening filters, named hereafter sharpen, sharpen more and sharpen edges, on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion, it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.

  12. Cross-correlated imaging of distributed mode filtering rod fiber

    DEFF Research Database (Denmark)

    Laurila, Marko; Barankov, Roman; Jørgensen, Mette Marie

    2013-01-01

    We analyze the modal properties of an 85 μm core distributed mode filtering rod fiber using cross-correlated (C2) imaging. We evaluate suppression of higher-order modes (HOMs) under severely misaligned mode excitation and identify a single-mode regime where HOMs are suppressed by more than 20 dB.

  13. A filtering approach to image reconstruction in 3D SPECT

    International Nuclear Information System (INIS)

    Bronnikov, Andrei V.

    2000-01-01

    We present a new approach to three-dimensional (3D) image reconstruction using analytical inversion of the exponential divergent beam transform, which can serve as a mathematical model for cone-beam 3D SPECT imaging. We apply a circular cone-beam scan and assume constant attenuation inside a convex area with a known boundary, which is satisfactory in brain imaging. The reconstruction problem is reduced to an image restoration problem characterized by a shift-variant point spread function which is given analytically. The method requires two computation steps: backprojection and filtering. The modulation transfer function (MTF) of the filter is derived by means of an original methodology using the 2D Laplace transform. The filter is implemented in the frequency domain and requires 2D Fourier transform of transverse slices. In order to obtain a shift-invariant cone-beam projection-backprojection operator we resort to an approximation, assuming that the collimator has a relatively large focal length. Nevertheless, numerical experiments demonstrate surprisingly good results for detectors with relatively short focal lengths. The use of a wavelet-based filtering algorithm greatly improves the stability to Poisson noise. (author)

  14. Still Image Compression Algorithm Based on Directional Filter Banks

    OpenAIRE

    Chunling Yang; Duanwu Cao; Li Ma

    2010-01-01

    Hybrid wavelet and directional filter banks (HWD) is an effective multi-scale geometrical analysis method. Compared to the wavelet transform, it can better capture the directional information of images. But the ringing artifact, which is caused by coefficient quantization in the transform domain, is the biggest drawback of image compression algorithms in the HWD domain. In this paper, by investigating the relationship between directional decomposition and the ringing artifact, an improved decomposition ...

  15. Filtering adult image content with topic models

    OpenAIRE

    Lienhart, Rainer (Prof. Dr.); Hauke, Rudolf

    2009-01-01

    Protecting children from exposure to adult content has become a serious problem in the real world. Current statistics show that, for instance, the average age of first Internet exposure to pornography is 11 years, that the largest consumer group of Internet pornography is the age group of 12-to-17-year-olds and that 90% of the 8-to-16-year-olds have viewed porn online. To protect our children, effective algorithms for detecting adult images are needed. In this research we evaluate the use of ...

  16. A Kalman filter technique applied for medical image reconstruction

    International Nuclear Information System (INIS)

    Goliaei, S.; Ghorshi, S.; Manzuri, M. T.; Mortazavi, M.

    2011-01-01

    Medical images contain information about vital organic tissues inside the human body and are widely used for diagnosis of disease or for surgical purposes. Image reconstruction is essential for medical images in some applications, such as suppression of noise or de-blurring the image, in order to provide images with better quality and contrast. Due to the vital role of image reconstruction in the medical sciences, corresponding algorithms with better efficiency and higher speed are desirable. Most algorithms in image reconstruction operate in the frequency domain, such as the most popular one known as filtered back projection. In this paper we introduce a Kalman filter technique which operates in the time domain for medical image reconstruction. Results indicated that as the number of projections increases, for both the normally collected ray sums and the ray sums corrupted by noise, the quality of the reconstructed image becomes better in terms of contrast and transparency. It is also seen that as the number of projections increases, the error index decreases.

  17. Guided filtering for solar image/video processing

    Directory of Open Access Journals (Sweden)

    Long Xu

    2017-06-01

    Full Text Available A new image enhancement algorithm employing guided filtering is proposed in this work for the enhancement of solar images and videos, so that users can easily identify important fine structures embedded in the recorded images/movies for solar observation. The proposed algorithm can efficiently remove image noise, including Gaussian and impulse noise. Meanwhile, it can further highlight fibrous structures on/beyond the solar disk. These fibrous structures can clearly demonstrate the progress of solar flares, prominence coronal mass ejections, magnetic fields, and so on. The experimental results prove that the proposed algorithm gives significant enhancement of the visual quality of solar images beyond the original input and several classical image enhancement algorithms, thus facilitating easier determination of interesting solar burst activities from recorded images/movies.
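
    The solar-specific processing is not reproduced here; the sketch below is a standard gray-scale guided filter (box-filter formulation) in Python/SciPy, of the kind such an algorithm could build on. The radius and regularization values are assumptions.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def guided_filter(guide, src, radius=4, eps=1e-2):
        """Edge-preserving smoothing of `src` steered by the structure of `guide`."""
        size = 2 * radius + 1
        mean_i = uniform_filter(guide, size)
        mean_p = uniform_filter(src, size)
        cov_ip = uniform_filter(guide * src, size) - mean_i * mean_p
        var_i = uniform_filter(guide * guide, size) - mean_i * mean_i
        a = cov_ip / (var_i + eps)          # local linear coefficients
        b = mean_p - a * mean_i
        mean_a = uniform_filter(a, size)
        mean_b = uniform_filter(b, size)
        return mean_a * guide + mean_b

    # Self-guided smoothing of a noisy image (guide == src)
    img = np.random.rand(128, 128)
    smoothed = guided_filter(img, img, radius=4, eps=1e-2)
    ```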

  18. Superresolution restoration of an image sequence: adaptive filtering approach.

    Science.gov (United States)

    Elad, M; Feuer, A

    1999-01-01

    This paper presents a new method based on adaptive filtering theory for superresolution restoration of continuous image sequences. The proposed methodology suggests least squares (LS) estimators which adapt in time, based on adaptive filters, least mean squares (LMS) or recursive least squares (RLS). The adaptation enables the treatment of linear space and time-variant blurring and arbitrary motion, both of them assumed known. The proposed new approach is shown to be of relatively low computational requirements. Simulations demonstrating the superresolution restoration algorithms are presented.
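
    The full space- and time-variant restoration is beyond a short snippet; the sketch below only illustrates the core LMS update on which such a time-adaptive estimator is built, with a toy observation operator standing in for blur and decimation. The dimensions and step size are assumptions.

    ```python
    import numpy as np

    def lms_step(x_est, y_obs, H, mu):
        """One LMS iteration: descend the instantaneous squared error ||y - Hx||^2."""
        return x_est + mu * (H.T @ (y_obs - H @ x_est))

    rng = np.random.default_rng(0)
    x_true = rng.normal(size=16)                 # unknown high-resolution signal
    H = rng.normal(size=(8, 16)) / 4.0           # stand-in for blur + decimation
    x_est = np.zeros(16)
    for _ in range(500):                         # stream of noisy low-resolution frames
        y = H @ x_true + rng.normal(scale=0.01, size=8)
        x_est = lms_step(x_est, y, H, mu=0.05)
    ```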

  19. Kalman filter techniques for accelerated Cartesian dynamic cardiac imaging.

    Science.gov (United States)

    Feng, Xue; Salerno, Michael; Kramer, Christopher M; Meyer, Craig H

    2013-05-01

    In dynamic MRI, spatial and temporal parallel imaging can be exploited to reduce scan time. Real-time reconstruction enables immediate visualization during the scan. Commonly used view-sharing techniques suffer from limited temporal resolution, and many of the more advanced reconstruction methods are either retrospective, time-consuming, or both. A Kalman filter model capable of real-time reconstruction can be used to increase the spatial and temporal resolution in dynamic MRI reconstruction. The original study describing the use of the Kalman filter in dynamic MRI was limited to non-Cartesian trajectories because of a limitation intrinsic to the dynamic model used in that study. Here the limitation is overcome, and the model is applied to the more commonly used Cartesian trajectory with fast reconstruction. Furthermore, a combination of the Kalman filter model with Cartesian parallel imaging is presented to further increase the spatial and temporal resolution and signal-to-noise ratio. Simulations and experiments were conducted to demonstrate that the Kalman filter model can increase the temporal resolution of the image series compared with view-sharing techniques and decrease the spatial aliasing compared with TGRAPPA. The method requires relatively little computation, and thus is suitable for real-time reconstruction. Copyright © 2012 Wiley Periodicals, Inc.

  20. Improvement of natural image search engines results by emotional filtering

    Directory of Open Access Journals (Sweden)

    Patrice Denis

    2016-04-01

    Full Text Available With the Internet 2.0 era, managing user emotions is a problem that more and more actors are interested in. Historically, the first notions of emotion sharing were expressed and defined with emoticons. They allowed users to show their emotional status to others in an impersonal and emotionless digital world. Now, in the Internet of social media, every day users share lots of content with each other on Facebook, Twitter, Google+ and so on. Several new popular web sites like FlickR, Picassa, Pinterest, Instagram or DeviantArt are now specifically based on sharing image content as well as personal emotional status. This kind of information is economically very valuable as it can, for instance, help commercial companies sell more efficiently. In fact, with this kind of emotional information, business can be made where companies better target their customers' needs and even sell them more products. Research has been, and still is, interested in the mining of emotional information from user data. In this paper, we focus on the impact of emotions from images that have been collected from image search engines. More specifically, our proposition is the creation of a filtering layer applied on the results of such image search engines. The peculiarity of our work lies in the fact that, to our knowledge, it is the first attempt to filter image search engine results with an emotional filtering approach.

  1. Methods of filtering the graph images of the functions

    Directory of Open Access Journals (Sweden)

    Олександр Григорович Бурса

    2017-06-01

    Full Text Available The theoretical aspects of cleaning raster images of scanned graphs of functions from digital, chromatic and luminance distortions by using computer graphics techniques have been considered. The basic types of distortions characteristic of graph images of functions have been stated. To suppress the distortions, several methods providing high quality of the resulting images and preserving their topological features were suggested. The paper describes the techniques developed and improved by the authors: the method of cleaning the image of distortions by means of iterative contrasting, based on a step-by-step increase of the image contrast in the graph by 1%; the method of restoring distortions of small entities, based on thinning the known contrast-increase filter matrix (the allowable dimensions of the dilution radius of the convolution kernel which preserve the graph lines have been established); and a technique integrating the contrast-based noise reduction method and the small-entity restoration method with the known σ-filter. Each method in the complex has been theoretically substantiated. The developed methods involve treatment of graph images both as a whole image (global processing) and as fragments (local processing). The metrics assessing the quality of the resulting image under global and local processing have been chosen, and the substantiation of the choice as well as the formulas have been given. The proposed complex of methods for cleaning graph images of functions from grayscale distortions is adaptive to the form of the image carrier, the distortion level in the image and its distribution. The presented results of testing the developed complex of methods on a representative sample of images confirm its effectiveness

  2. Image Denoising Using Interquartile Range Filter with Local Averaging

    OpenAIRE

    Jassim, Firas Ajil

    2013-01-01

    Image denoising is one of the fundamental problems in image processing. In this paper, a novel approach to suppress noise from the image is conducted by applying the interquartile range (IQR), which is one of the statistical methods used to detect outliers in a dataset. A window of size k×k was implemented to support the IQR filter. Each pixel outside the IQR range of the k×k window is treated as a noisy pixel. The estimation of the noisy pixels was obtained by local averaging. The essential...
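
    The abstract leaves the exact outlier rule unspecified; the sketch below assumes the common 1.5x IQR fence and averages only the in-range pixels of the window, which matches the described idea but is not necessarily the authors' exact formulation.

    ```python
    import numpy as np

    def iqr_filter(image, k=5):
        """Flag pixels outside the local IQR fence and replace them by the local
        average of the remaining (in-range) pixels."""
        img = image.astype(float)
        pad = k // 2
        padded = np.pad(img, pad, mode='reflect')
        out = img.copy()
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                patch = padded[i:i + k, j:j + k]
                q1, q3 = np.percentile(patch, [25, 75])
                lo, hi = q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1)
                if not lo <= img[i, j] <= hi:            # treated as a noisy pixel
                    inliers = patch[(patch >= lo) & (patch <= hi)]
                    if inliers.size:
                        out[i, j] = inliers.mean()
        return out

    # Example: remove salt-and-pepper-like outliers from a smooth image
    img = np.full((64, 64), 120.0)
    img[np.random.rand(64, 64) < 0.05] = 255.0
    cleaned = iqr_filter(img, k=5)
    ```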

  3. Filters involving derivatives with application to reconstruction from scanned halftone images

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Jensen, Kim S.

    1995-01-01

    This paper presents a method for designing finite impulse response (FIR) filters for samples of a 2-D signal, e.g., an image, and its gradient. The filters, which are called blended filters, are decomposable into three filters, each separable in 1-D filters on subsets of the data set. Optimality in the minimum mean square error (MMSE) sense of blended filtering is shown for signals with separable autocorrelation function. Relations between correlation functions for signals and their gradients are derived. Blended filters may be composed from FIR Wiener filters using these relations. Simple blended ... is achievable with blended filters ...

  4. Kalman Filtered MR Temperature Imaging for Laser Induced Thermal Therapies

    OpenAIRE

    Fuentes, D.; Yung, J.; Hazle, J. D.; Weinberg, J. S.; Stafford, R. J.

    2011-01-01

    The feasibility of using a stochastic form of Pennes bioheat model within a 3D finite element based Kalman filter (KF) algorithm is critically evaluated for the ability to provide temperature field estimates in the event of magnetic resonance temperature imaging (MRTI) data loss during laser induced thermal therapy (LITT). The ability to recover missing MRTI data was analyzed by systematically removing spatiotemporal information from a clinical MR-guided LITT procedure in human brain and comp...

  5. Document image binarization using "multi-scale" predefined filters

    Science.gov (United States)

    Saabni, Raid M.

    2018-04-01

    Reading text or searching for key words within a historical document is a very challenging task. One of the first steps of the complete task is binarization, where we separate foreground such as text, figures and drawings from the background. Successful results of this important step largely determine the success or failure of the following steps, and it is therefore vital to the complete task of reading and analyzing the content of a document image. Generally, historical document images are of poor quality due to their storage conditions and degradation over time, which mostly cause varying contrast, stains, dirt and ink seeping through from the reverse side. In this paper, we use banks of anisotropic predefined filters at different scales and orientations to develop a binarization method for degraded documents and manuscripts. Using the fact that handwritten strokes may follow different scales and orientations, we use predefined sets of filter banks having various scales, weights, and orientations to seek a compact set of filters and weights in order to generate different layers of foreground and background. The results of convolving these filters locally with the gray level image are weighted and accumulated to enhance the original image. Based on the different layers, seeds of components in the gray level image and a learning process, we present an improved binarization algorithm to separate the background from layers of foreground. Different layers of foreground, which may be caused by seeping ink, degradation or other factors, are also separated from the real foreground in a second phase. Promising experimental results were obtained on the DIBCO2011, DIBCO2013 and H-DIBCO2016 data sets and on a collection of images taken from real historical documents.

  6. Comparison of various filtering methods for digital X-ray image processing

    International Nuclear Information System (INIS)

    Pfluger, T.; Reinfelder, H.E.; Dorschky, K.; Oppelt, A.; Siemens A.G., Erlangen

    1987-01-01

    Three filtering methods used for edge enhancement of digitally processed X-ray images are explained and compared. The filters are compared using two examples, a radiograph of the chest and one of the knee joint. The unsharp mask is found to yield the best compromise between edge enhancement and image noise intensification, whereas the results obtained with the high-pass filter or the Wallis filter are less suitable for diagnostic evaluation. The filtered images display narrow lines, structural borders and edges, and finely spotted areas better than the original radiograph, so that diagnostic evaluation is easier after image filtering. (orig.) [de

  7. Collaborating Filtering Community Image Recommendation System Based on Scene

    Directory of Open Access Journals (Sweden)

    He Tao

    2017-01-01

    Full Text Available With the advancement of the smart city and the development of intelligent mobile terminals and wireless networks, traditional text information services no longer meet the needs of community residents, and the community image service has appeared as a new media service. The notion that "pictures carry the truth" has become a way for community residents to understand and follow what is new in the community, and the image information service has become a new kind of information service. However, there are two major problems in image information services. Firstly, the underlying low-level features extracted by current image feature extraction techniques are difficult for users to understand, and there is a semantic gap between the image content itself and the user's understanding; secondly, as the image data in community life increases quickly, it is difficult for users to find the image data they are interested in. Aiming at these two problems, this paper proposes a unified image semantic scene model to express the image content. On this basis, a collaborative filtering recommendation model fusing scene semantics is proposed. In the recommendation model, a comprehensive and accurate user interest model is proposed to improve the recommendation quality. The approach has achieved good results in the pilot cities of Wenzhou and Yan'an, where it is in normal operation.

  8. Complex noise suppression using a sparse representation and 3D filtering of images

    Science.gov (United States)

    Kravchenko, V. F.; Ponomaryov, V. I.; Pustovoit, V. I.; Palacios-Enriquez, A.

    2017-08-01

    A novel method for the filtering of images corrupted by complex noise composed of randomly distributed impulses and additive Gaussian noise has been substantiated for the first time. The method consists of three main stages: the detection and filtering of pixels corrupted by impulsive noise, the subsequent image processing to suppress the additive noise based on 3D filtering and a sparse representation of signals in a basis of wavelets, and the concluding image processing procedure to clean the final image of the errors that emerged at the previous stages. A physical interpretation of the filtering method under complex noise conditions is given. A filtering block diagram has been developed in accordance with the novel approach. Simulations of the novel image filtering method have shown the advantage of the proposed filtering scheme in terms of generally recognized criteria, such as the structural similarity index measure and the peak signal-to-noise ratio, and in visual comparison of the filtered images.

  9. Digital filtering and reconstruction of coded aperture images

    International Nuclear Information System (INIS)

    Tobin, K.W. Jr.

    1987-01-01

    The real-time neutron radiography facility at the University of Virginia has been used for both transmission radiography and computed tomography. Recently, a coded aperture system has been developed to permit the extraction of three-dimensional information from a low intensity field of radiation scattered by an extended object. Short wavelength radiations (e.g. neutrons) are not easily imaged because of the difficulties in achieving diffraction and refraction with a conventional lens imaging system. By using a coded aperture approach, an imaging system has been developed that records and reconstructs an object from an intensity distribution. This system has a signal-to-noise ratio that is proportional to the total open area of the aperture, making it ideal for imaging with a limited-intensity radiation field. The main goal of this research was to develop and implement the digital methods and theory necessary for the reconstruction process. Several real-time video systems, attached to an Intellect-100 image processor, a DEC PDP-11 micro-computer, and a Convex-1 parallel processing mainframe, were employed. This system, coupled with theoretical extensions and improvements, allowed retrieval of information previously unobtainable by earlier optical methods. The effects of thermal noise, shot noise, and aperture-related artifacts were examined so that new digital filtering techniques could be constructed and implemented. Results of image data filtering prior to and following the reconstruction process are reported. Improvements related to the different signal processing methods are emphasized. The application and advantages of this imaging technique in the field of non-destructive testing are also discussed
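
    The thesis's actual filtering pipeline is not reproduced here; the sketch below illustrates the generic idea of coded-aperture decoding by a frequency-domain Wiener-type inverse filter in NumPy, with a random binary aperture and a point object as stand-ins.

    ```python
    import numpy as np

    def wiener_decode(recorded, aperture, nsr=1e-3):
        """Recover the object from the recorded intensity given the coding pattern."""
        A = np.fft.fft2(aperture, s=recorded.shape)
        R = np.fft.fft2(recorded)
        W = np.conj(A) / (np.abs(A)**2 + nsr)      # Wiener-type inverse, nsr = noise/signal
        return np.real(np.fft.ifft2(R * W))

    rng = np.random.default_rng(1)
    aperture = (rng.random((64, 64)) > 0.5).astype(float)   # random binary coding pattern
    obj = np.zeros((64, 64)); obj[32, 20] = 1.0              # point object
    recorded = np.real(np.fft.ifft2(np.fft.fft2(obj) * np.fft.fft2(aperture)))
    recovered = wiener_decode(recorded, aperture)
    ```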

  10. HDR Pathological Image Enhancement Based on Improved Bias Field Correction and Guided Image Filter

    Directory of Open Access Journals (Sweden)

    Qingjiao Sun

    2016-01-01

    Full Text Available Pathological image enhancement is a significant topic in the field of pathological image processing. This paper proposes a high dynamic range (HDR) pathological image enhancement method based on improved bias field correction and the guided image filter (GIF). Firstly, preprocessing including stain normalization and wavelet denoising is performed on the Haematoxylin and Eosin (H and E) stained pathological image. Then, an improved bias field correction model is developed to enhance the influence of light on the high-frequency part of the image and to correct the intensity inhomogeneity and detail discontinuity of the image. Next, the HDR pathological image is generated based on a least squares method using the low dynamic range (LDR) image and the H and E channel images. Finally, the fine enhanced image is acquired after the detail enhancement process. Experiments with 140 pathological images demonstrate the performance advantages of our proposed method as compared with related work.

  11. Denoising of MR images using FREBAS collaborative filtering

    International Nuclear Information System (INIS)

    Ito, Satoshi; Hizume, Masayuki; Yamada, Yoshifumi

    2011-01-01

    We propose a novel image denoising strategy based on the correlation in the FREBAS transformed domain. FREBAS transform is a kind of multi-resolution image analysis which consists of two different Fresnel transforms. It can decompose images into down-scaled images of the same size with a different frequency bandwidth. Since these decomposed images have similar distributions for the same directions from the center of the FREBAS domain, even when the FREBAS signal is hidden by noise in the case of a low-signal-to-noise ratio (SNR) image, the signal distribution can be estimated using the distribution of the FREBAS signal located near the position of interest. We have developed a collaborative Wiener filter in the FREBAS transformed domain which implements collaboration of the standard deviation of the position of interest and that of analogous positions. The experimental results demonstrated that the proposed algorithm improves the SNR in terms of both the total SNR and the SNR at the edges of images. (author)

  12. An Improved Filtering Method for Quantum Color Image in Frequency Domain

    Science.gov (United States)

    Li, Panchi; Xiao, Hong

    2018-01-01

    In this paper we investigate the use of the quantum Fourier transform (QFT) in the field of image processing. We consider QFT-based color image filtering operations and their applications in image smoothing, sharpening, and selective filtering using quantum frequency domain filters. The underlying principle used for constructing the proposed quantum filters is to use the principle of the quantum Oracle to implement the filter function. Compared with existing methods, our method is not only suitable for color images, but can also flexibly design notch filters. We provide the quantum circuit that implements the filtering task and present the results of several simulation experiments on color images. The major advantage of quantum frequency filtering lies in the exploitation of the efficient implementation of the quantum Fourier transform.

  13. Adaptive multiresolution Hermite-Binomial filters for image edge and texture analysis

    NARCIS (Netherlands)

    Gu, Y.H.; Katsaggelos, A.K.

    1994-01-01

    A new multiresolution image analysis approach using adaptive Hermite-Binomial filters is presented in this paper. According to the local image structural and textural properties, the analysis filter kernels are made adaptive both in their scales and orders. Applications of such an adaptive filtering

  14. Nonlinear filtering for character recognition in low quality document images

    Science.gov (United States)

    Diaz-Escobar, Julia; Kober, Vitaly

    2014-09-01

    Optical character recognition in scanned printed documents is a well-studied task, where the capture conditions, such as sheet position, illumination, contrast, and resolution, are controlled. Nowadays, it is more practical to use mobile devices for document capture than a scanner. As a consequence, the quality of document images is often poor owing to the presence of geometric distortions, nonhomogeneous illumination, low resolution, etc. In this work we propose to use multiple adaptive nonlinear composite filters for detection and classification of characters. Computer simulation results obtained with the proposed system are presented and discussed.

  15. Image pre-filtering for measurement error reduction in digital image correlation

    Science.gov (United States)

    Zhou, Yihao; Sun, Chen; Song, Yuntao; Chen, Jubing

    2015-02-01

    In digital image correlation, the sub-pixel intensity interpolation causes a systematic error in the measured displacements. The error increases toward high-frequency component of the speckle pattern. In practice, a captured image is usually corrupted by additive white noise. The noise introduces additional energy in the high frequencies and therefore raises the systematic error. Meanwhile, the noise also elevates the random error which increases with the noise power. In order to reduce the systematic error and the random error of the measurements, we apply a pre-filtering to the images prior to the correlation so that the high-frequency contents are suppressed. Two spatial-domain filters (binomial and Gaussian) and two frequency-domain filters (Butterworth and Wiener) are tested on speckle images undergoing both simulated and real-world translations. By evaluating the errors of the various combinations of speckle patterns, interpolators, noise levels, and filter configurations, we come to the following conclusions. All the four filters are able to reduce the systematic error. Meanwhile, the random error can also be reduced if the signal power is mainly distributed around DC. For high-frequency speckle patterns, the low-pass filters (binomial, Gaussian and Butterworth) slightly increase the random error and Butterworth filter produces the lowest random error among them. By using Wiener filter with over-estimated noise power, the random error can be reduced but the resultant systematic error is higher than that of low-pass filters. In general, Butterworth filter is recommended for error reduction due to its flexibility of passband selection and maximal preservation of the allowed frequencies. Binomial filter enables efficient implementation and thus becomes a good option if computational cost is a critical issue. While used together with pre-filtering, B-spline interpolator produces lower systematic error than bicubic interpolator and similar level of the random
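
    The binomial and Gaussian pre-filters compared above are straightforward to reproduce; the sketch below shows both in Python, assuming SciPy is available, with kernel size and sigma chosen purely for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, convolve

def binomial_prefilter(img, passes=1):
    """Separable 3x3 binomial low-pass ([1, 2, 1]/4 per axis), applied `passes` times."""
    k = np.array([1.0, 2.0, 1.0]) / 4.0
    out = img.astype(float)
    for _ in range(passes):
        out = convolve(out, k[None, :], mode='nearest')   # horizontal pass
        out = convolve(out, k[:, None], mode='nearest')   # vertical pass
    return out

def gaussian_prefilter(img, sigma=0.8):
    """Gaussian low-pass pre-filter to suppress high-frequency noise before correlation."""
    return gaussian_filter(img.astype(float), sigma, mode='nearest')

# Both the reference and the deformed speckle image should be filtered identically
# before sub-pixel correlation, so that the interpolation bias is reduced consistently.
```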

  16. Metal artefact reduction for a dental cone beam CT image using image segmentation and backprojection filters

    International Nuclear Information System (INIS)

    Mohammadi, Mahdi; Khotanlou, Hassan; Mohammadi, Mohammad

    2011-01-01

    Full text: Due to low dose delivery and fast scanning, the dental Cone Beam CT (CBCT) is the latest technology being implemented for a range of dental imaging. The presence of metallic objects, including amalgam or gold fillings in the mouth, produces artefacts in the image of the human jaw. The feasibility of a fast and accurate approach for metal artefact reduction for dental CBCT is investigated. The current study investigates the metal artefact reduction using image segmentation and modification of several sinograms. In order to reduce metal effects such as beam hardening, streak artefacts and intense noise, the application of several algorithms is evaluated. The proposed method includes three stages: preprocessing, reconstruction and post-processing. In the pre-processing stage, in order to reduce the noise level, several phase and frequency filters were applied. At the second stage, based on the specific sinogram achieved for each segment, spline interpolation and weighting backprojection filters were applied to reconstruct the original image. A three-dimensional filter was then applied on reconstructed images, to improve the image quality. Results showed that compared to other available filters, standard frequency filters have a significant influence in the preprocessing stage (ΔHU = 48 ± 6). In addition, with the streak artefact, the probability of beam hardening artefact increases. In the post-processing stage, the application of three-dimensional filters improves the quality of reconstructed images. In conclusion, the proposed method reduces metal artefacts, especially where more than one metal implant is present in the region of interest.

  17. The singular value filter: a general filter design strategy for PCA-based signal separation in medical ultrasound imaging.

    Science.gov (United States)

    Mauldin, F William; Lin, Dan; Hossack, John A

    2011-11-01

    A general filtering method, called the singular value filter (SVF), is presented as a framework for principal component analysis (PCA) based filter design in medical ultrasound imaging. The SVF approach operates by projecting the original data onto a new set of bases determined from PCA using singular value decomposition (SVD). The shape of the SVF weighting function, which relates the singular value spectrum of the input data to the filtering coefficients assigned to each basis function, is designed in accordance with a signal model and statistical assumptions regarding the underlying source signals. In this paper, we applied SVF for the specific application of clutter artifact rejection in diagnostic ultrasound imaging. SVF was compared to a conventional PCA-based filtering technique, which we refer to as the blind source separation (BSS) method, as well as a simple frequency-based finite impulse response (FIR) filter used as a baseline for comparison. The performance of each filter was quantified in simulated lesion images as well as experimental cardiac ultrasound data. SVF was demonstrated in both simulation and experimental results, over a wide range of imaging conditions, to outperform the BSS and FIR filtering methods in terms of contrast-to-noise ratio (CNR) and motion tracking performance. In experimental mouse heart data, SVF provided excellent artifact suppression with an average CNR improvement of 1.8 dB with over 40% reduction in displacement tracking error. It was further demonstrated from simulation and experimental results that SVF provided superior clutter rejection, as reflected in larger CNR values, when filtering was achieved using complex pulse-echo received data and non-binary filter coefficients.
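
    The record above describes weighting PCA/SVD basis functions according to the singular value spectrum. A hedged sketch of that idea is shown below; the sigmoid weighting and its parameters are illustrative assumptions, not the SVF weighting function of the paper.

```python
import numpy as np

def svd_clutter_filter(data, weight_fn):
    """PCA/SVD clutter filtering of a pulse-echo ensemble.

    data : complex array of shape (n_samples, n_emissions), the slow-time ensemble
           for one image line (Casorati matrix).
    weight_fn : maps normalized singular values in [0, 1] to filter gains in [0, 1];
           a hard cut-off reproduces conventional PCA (BSS-style) filtering, while
           a smooth roll-off mimics the singular value filter idea.
    """
    U, s, Vh = np.linalg.svd(data, full_matrices=False)
    w = weight_fn(s / s.max())                 # per-basis gains from singular values
    return (U * (w * s)) @ Vh                  # re-synthesize with weighted components

# Example weighting: suppress the most energetic (clutter-dominated) components.
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-25 * (0.3 - x)))   # illustrative parameters
ensemble = np.random.randn(64, 16) + 1j * np.random.randn(64, 16)
filtered = svd_clutter_filter(ensemble, sigmoid)
```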

  18. Wavelet Filter Banks for Super-Resolution SAR Imaging

    Science.gov (United States)

    Sheybani, Ehsan O.; Deshpande, Manohar; Memarsadeghi, Nargess

    2011-01-01

    This paper discusses innovative wavelet-based filter banks designed to enhance the analysis of super resolution Synthetic Aperture Radar (SAR) images using parametric spectral methods and signal classification algorithms. SAR finds applications in many of NASA's earth science fields such as deformation, ecosystem structure, and dynamics of ice, snow and cold land processes, and surface water and ocean topography. Traditionally, standard methods such as Fast-Fourier Transform (FFT) and Inverse Fast-Fourier Transform (IFFT) have been used to extract images from SAR radar data. Due to the non-parametric features of these methods and their resolution limitations and observation time dependence, use of spectral estimation and signal pre- and post-processing techniques based on wavelets to process SAR radar data has been proposed. Multi-resolution wavelet transforms and advanced spectral estimation techniques have proven to offer efficient solutions to this problem.

  19. Kalman filtered MR temperature imaging for laser induced thermal therapies.

    Science.gov (United States)

    Fuentes, D; Yung, J; Hazle, J D; Weinberg, J S; Stafford, R J

    2012-04-01

    The feasibility of using a stochastic form of Pennes bioheat model within a 3-D finite element based Kalman filter (KF) algorithm is critically evaluated for the ability to provide temperature field estimates in the event of magnetic resonance temperature imaging (MRTI) data loss during laser induced thermal therapy (LITT). The ability to recover missing MRTI data was analyzed by systematically removing spatiotemporal information from a clinical MR-guided LITT procedure in human brain and comparing predictions in these regions to the original measurements. Performance was quantitatively evaluated in terms of a dimensionless L(2) (RMS) norm of the temperature error weighted by acquisition uncertainty. During periods of no data corruption, observed error histories demonstrate that the Kalman algorithm does not alter the high quality temperature measurement provided by MR thermal imaging. The KF-MRTI implementation considered is seen to predict the bioheat transfer with RMS error 10 sec.
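
    The paper couples a finite element Pennes bioheat model to a Kalman filter; as a much simpler illustration of how a Kalman filter carries an estimate through missing MRTI frames, the sketch below uses a scalar random-walk model for a single voxel. The process and measurement variances are assumed values, not those of the study.

```python
import numpy as np

def kalman_fill(measurements, meas_var, proc_var=0.05):
    """Scalar random-walk Kalman filter for one voxel's temperature history.

    measurements : 1-D array with np.nan where MRTI frames were lost.
    meas_var     : measurement noise variance (from acquisition uncertainty).
    proc_var     : process noise variance of the random-walk model (assumed).
    """
    x, P = measurements[~np.isnan(measurements)][0], 1.0   # crude initialization
    estimates = np.empty_like(measurements)
    for k, z in enumerate(measurements):
        P = P + proc_var                      # predict: state modelled as a random walk
        if not np.isnan(z):                   # update only when a frame exists
            K = P / (P + meas_var)
            x = x + K * (z - x)
            P = (1.0 - K) * P
        estimates[k] = x                       # during dropout, the prediction carries on
    return estimates

temps = np.array([37.0, 38.2, 39.5, np.nan, np.nan, 43.0, 44.1])
print(kalman_fill(temps, meas_var=0.25))
```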

  20. Image enhancement by spatial frequency post-processing of images obtained with pupil filters

    Science.gov (United States)

    Estévez, Irene; Escalera, Juan C.; Stefano, Quimey Pears; Iemmi, Claudio; Ledesma, Silvia; Yzuel, María J.; Campos, Juan

    2016-12-01

    The use of apodizing or superresolving filters improves the performance of an optical system in different frequency bands. This improvement can be seen as an increase in the OTF value compared to the OTF for the clear aperture. In this paper we propose a method to enhance the contrast of an image in both its low and its high frequencies. The method is based on the generation of a synthetic Optical Transfer Function, by multiplexing the OTFs given by the use of different non-uniform transmission filters on the pupil. We propose to capture three images, one obtained with a clear pupil, one obtained with an apodizing filter that enhances the low frequencies and another one taken with a superresolving filter that improves the high frequencies. In the Fourier domain the three spectra are combined by using smoothed passband filters, and then the inverse transform is performed. We show that we can create an enhanced image better than the image obtained with the clear aperture. To evaluate the performance of the method, bar tests (sinusoidal tests) with different frequency content are used. The results show that a contrast improvement in the high and low frequencies is obtained.

  1. Multiscale infrared and visible image fusion using gradient domain guided image filtering

    Science.gov (United States)

    Zhu, Jin; Jin, Weiqi; Li, Li; Han, Zhenghao; Wang, Xia

    2018-03-01

    For better surveillance with infrared and visible imaging, a novel hybrid multiscale decomposition fusion method using gradient domain guided image filtering (HMSD-GDGF) is proposed in this study. In this method, hybrid multiscale decomposition with guided image filtering and gradient domain guided image filtering of source images are first applied before the weight maps of each scale are obtained using a saliency detection technology and filtering means with three different fusion rules at different scales. The three types of fusion rules are for small-scale detail level, large-scale detail level, and base level. Finally, the target becomes more salient and can be more easily detected in the fusion result, with the detail information of the scene being fully displayed. After analyzing the experimental comparisons with state-of-the-art fusion methods, the HMSD-GDGF method has obvious advantages in fidelity of salient information (including structural similarity, brightness, and contrast), preservation of edge features, and human visual perception. Therefore, visual effects can be improved by using the proposed HMSD-GDGF method.

  2. Detection of pulmonary nodules on lung X-ray images. Studies on multi-resolutional filter and energy subtraction images

    International Nuclear Information System (INIS)

    Sawada, Akira; Sato, Yoshinobu; Kido, Shoji; Tamura, Shinichi

    1999-01-01

    The purpose of this work is to prove the effectiveness of an energy subtraction image for the detection of pulmonary nodules and the effectiveness of a multi-resolutional filter on an energy subtraction image to detect pulmonary nodules. We also study factors that influence the accuracy of detection of pulmonary nodules from the viewpoints of types of images, types of digital filters and types of evaluation methods. As one type of images, we select an energy subtraction image, which removes bones such as ribs from the conventional X-ray image by utilizing the difference of X-ray absorption ratios at different energy between bones and soft tissue. Ribs and vessels are major causes of CAD errors in detection of pulmonary nodules and many studies have tried to solve this problem. So we select conventional X-ray images and energy subtraction X-ray images as types of images, and at the same time select the ∇²G (Laplacian of Gaussian) filter, Min-DD (Minimum Directional Difference) filter and our multi-resolutional filter as types of digital filters. Also we select two evaluation methods and prove the effectiveness of an energy subtraction image, the effectiveness of Min-DD filter on a conventional X-ray image and the effectiveness of multi-resolutional filter on an energy subtraction image. (author)
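
    For reference, the ∇²G (Laplacian of Gaussian) filter named above can be applied at several scales with SciPy as sketched below; this generic multi-scale LoG candidate map is not the authors' specific multi-resolutional filter, and the scales are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def multiscale_log(img, sigmas=(2.0, 4.0, 8.0)):
    """Multi-resolution blob enhancement with scale-normalized Laplacian of Gaussian.

    Bright, roughly circular structures (nodule candidates) give strong negative
    LoG responses; taking the per-pixel maximum of -sigma^2 * LoG over scales
    yields a single candidate map covering a range of nodule sizes.
    """
    img = img.astype(float)
    responses = [-(s ** 2) * gaussian_laplace(img, s) for s in sigmas]
    return np.max(np.stack(responses), axis=0)

# candidates = multiscale_log(energy_subtraction_image) > threshold
```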

  3. Fuzzy Logic-Based Filter for Removing Additive and Impulsive Noise from Color Images

    Science.gov (United States)

    Zhu, Yuhong; Li, Hongyang; Jiang, Huageng

    2017-12-01

    This paper presents an efficient filter method based on fuzzy logic for adaptively removing additive and impulsive noise from color images. The proposed filter comprises two parts including noise detection and noise removal filtering. In the detection part, the fuzzy peer group concept is applied to determine what type of noise is added to each pixel of the corrupted image. In the filter part, the impulse noise is reduced by the vector median filter in the CIELAB color space and an optimal fuzzy filter is introduced to reduce the Gaussian noise, while they can work together to remove the mixed Gaussian-impulse noise from color images. Experimental results on several color images prove the efficacy of the proposed fuzzy filter.

  4. Pleasant/Unpleasant Filtering for Affective Image Retrieval Based on Cross-Correlation of EEG Features

    Directory of Open Access Journals (Sweden)

    Keranmu Xielifuguli

    2014-01-01

    Full Text Available People often make decisions based on sensitivity rather than rationality. In the field of biological information processing, methods are available for analyzing biological information directly based on electroencephalogram: EEG to determine the pleasant/unpleasant reactions of users. In this study, we propose a sensitivity filtering technique for discriminating preferences (pleasant/unpleasant for images using a sensitivity image filtering system based on EEG. Using a set of images retrieved by similarity retrieval, we perform the sensitivity-based pleasant/unpleasant classification of images based on the affective features extracted from images with the maximum entropy method: MEM. In the present study, the affective features comprised cross-correlation features obtained from EEGs produced when an individual observed an image. However, it is difficult to measure the EEG when a subject visualizes an unknown image. Thus, we propose a solution where a linear regression method based on canonical correlation is used to estimate the cross-correlation features from image features. Experiments were conducted to evaluate the validity of sensitivity filtering compared with image similarity retrieval methods based on image features. We found that sensitivity filtering using color correlograms was suitable for the classification of preferred images, while sensitivity filtering using local binary patterns was suitable for the classification of unpleasant images. Moreover, sensitivity filtering using local binary patterns for unpleasant images had a 90% success rate. Thus, we conclude that the proposed method is efficient for filtering unpleasant images.

  5. Fringing in MonoCam Y4 filter images

    International Nuclear Information System (INIS)

    Brooks, J.; Nomerotski, A.; Fisher-Levine, M.

    2017-01-01

    We study the fringing patterns observed in MonoCam, a camera with a single Large Synoptic Survey Telescope (LSST) CCD sensor. Images were taken at the U.S. Naval Observatory in Flagstaff, Arizona (NOFS) employing its 1.3 m telescope and an LSST y 4 filter. Fringing occurs due to the reflection of infrared light (700 nm or larger) from the bottom surface of the CCD which constructively or destructively interferes with the incident light to produce a net ''fringe'' pattern which is superimposed on all images taken. Emission lines from the atmosphere, dominated by hydroxyl (OH) spectra, can change in their relative intensities as the night goes on, producing different fringe patterns in the images taken. We found through several methods that the general shape of the fringe patterns remained constant, though with slight changes in the amplitude and phase of the fringes. We also found that a superposition of fringes from two monochromatic lines taken in the lab offered a reasonable description of the sky data.

  6. Efficient Hardware Implementation For Fingerprint Image Enhancement Using Anisotropic Gaussian Filter.

    Science.gov (United States)

    Khan, Tariq Mahmood; Bailey, Donald G; Khan, Mohammad A U; Kong, Yinan

    2017-05-01

    A real-time image filtering technique is proposed which could result in faster implementation for fingerprint image enhancement. One major hurdle associated with fingerprint filtering techniques is the expensive nature of their hardware implementations. To circumvent this, a modified anisotropic Gaussian filter is efficiently adopted in hardware by decomposing the filter into two orthogonal Gaussians and an oriented line Gaussian. An architecture is developed for dynamically controlling the orientation of the line Gaussian filter. To further improve the performance of the filter, the input image is homogenized by a local image normalization. In the proposed structure, for a middle-range reconfigurable FPGA, both parallel compute-intensive and real-time demands were achieved. We manage to efficiently speed up the image-processing time and improve the resource utilization of the FPGA. Test results show an improved speed for its hardware architecture while maintaining reasonable enhancement benchmarks.
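
    As a software reference for the oriented anisotropic Gaussian filtering discussed above (not the FPGA decomposition itself), the sketch below builds an oriented kernel directly; the sigma values and kernel radius are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import convolve

def oriented_gaussian_kernel(sigma_u=4.0, sigma_v=1.5, theta=0.0, radius=10):
    """Anisotropic Gaussian oriented at angle theta (long axis sigma_u, short axis sigma_v)."""
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1].astype(float)
    u = x * np.cos(theta) + y * np.sin(theta)      # rotate coordinates into the filter frame
    v = -x * np.sin(theta) + y * np.cos(theta)
    k = np.exp(-0.5 * ((u / sigma_u) ** 2 + (v / sigma_v) ** 2))
    return k / k.sum()

def ridge_filter(img, theta):
    """Smooth along the local ridge orientation, as in fingerprint enhancement."""
    return convolve(img.astype(float), oriented_gaussian_kernel(theta=theta), mode='nearest')
```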

  7. Implementational Aspects of the Contourlet Filter Bank and Application in Image Coding

    Directory of Open Access Journals (Sweden)

    Truong T. Nguyen

    2009-02-01

    Full Text Available This paper analyzes the implementational aspects of the contourlet filter bank (or pyramidal directional filter bank, PDFB) and considers its application in image coding. First, details of the binary tree-structured directional filter bank (DFB) are presented, including a modification to minimize the phase delay factor and necessary steps for handling rectangular images. The PDFB is viewed as an overcomplete filter bank, and the directional filters are expressed in terms of polyphase components of the pyramidal filter bank and the conventional DFB. The aliasing effect of the conventional DFB and the Laplacian pyramid on the directional filters is then considered, and the conditions for reducing this effect are presented. The new filters obtained by redesigning the PDFBs satisfying these requirements have much better frequency responses. A hybrid multiscale filter bank consisting of the PDFB at higher scales and the traditional maximally decimated wavelet filter bank at lower scales is constructed to provide a sparse image representation. A novel embedded image coding system based on the image decomposition and a morphological dilation algorithm is then presented. The coding algorithm efficiently clusters the significant coefficients using progressive morphological operations. Context models for arithmetic coding are designed to exploit the intraband dependency and the correlation existing among the neighboring directional subbands. Experimental results show that the proposed coding algorithm outperforms the current state-of-the-art wavelet-based coders, such as JPEG2000, for images with directional features.

  8. M2 FILTER FOR SPECKLE NOISE SUPPRESSION IN BREAST ULTRASOUND IMAGES

    Directory of Open Access Journals (Sweden)

    E.S. Samundeeswari

    2016-11-01

    Full Text Available Breast cancer, commonly found in women, is a serious life-threatening disease due to its invasive nature. Ultrasound (US) imaging plays an effective role in screening, early detection, and diagnosis of breast cancer. Speckle noise generally affects medical ultrasound images and also causes a number of difficulties in identifying the Region of Interest. Suppressing speckle noise is a challenging task as it destroys fine edge details. No specific filter is designed yet to get a noise free BUS image that is contaminated by speckle noise. In this paper the M2 filter, a novel hybrid of linear and nonlinear filters, is proposed and compared to other spatial filters with 3×3 kernel size. The performance of the proposed M2 filter is measured by statistical quantity parameters like MSE, PSNR and SSI. The experimental analysis clearly shows that the proposed M2 filter outperforms other spatial filters, with about 2% higher PSNR values with regard to speckle suppression.

  9. Two-dimensional restoration of single photon emission computed tomography images using the Kalman filter

    International Nuclear Information System (INIS)

    Boulfelfel, D.; Rangayyan, R.M.; Kuduvalli, G.R.; Hahn, L.J.; Kloiber, R.

    1994-01-01

    The discrete filtered backprojection (DFBP) algorithm used for the reconstruction of single photon emission computed tomography (SPECT) images affects image quality because of the operations of filtering and discretization. The discretization of the filtered backprojection process can cause the modulation transfer function (MTF) of the SPECT imaging system to be anisotropic and nonstationary, especially near the edges of the camera's field of view. The use of shift-invariant restoration techniques fails to restore large images because these techniques do not account for such variations in the MTF. This study presents the application of a two-dimensional (2-D) shift-variant Kalman filter for post-reconstruction restoration of SPECT slices. This filter was applied to SPECT images of a hollow cylinder phantom; a resolution phantom; and a large, truncated cone phantom containing two types of cold spots, a sphere, and a triangular prism. The images were acquired on an ADAC GENESYS camera. A comparison was performed between results obtained by the Kalman filter and those obtained by shift-invariant filters. Quantitative analysis of the restored images performed through measurement of root mean squared errors shows a considerable reduction in error of Kalman-filtered images over images restored using shift-invariant methods

  10. Slice image pretreatment for cone-beam computed tomography based on adaptive filter

    International Nuclear Information System (INIS)

    Huang Kuidong; Zhang Dinghua; Jin Yanfang

    2009-01-01

    According to the noise properties and the serial slice image characteristics in a Cone-Beam Computed Tomography (CBCT) system, a slice image pretreatment for CBCT based on an adaptive filter is proposed. The judging criterion for the noise is established firstly. All pixels are classified into two classes: an adaptive center weighted modified trimmed mean (ACWMTM) filter is used for the pixels corrupted by Gaussian noise and an adaptive median (AM) filter is used for the pixels corrupted by impulse noise. In the ACWMTM filtering algorithm, the estimated Gaussian noise standard deviation in the current slice image with an offset window is replaced by the estimated standard deviation in the corresponding window of the slice image adjacent to the current one, so the filtering accuracy of the serial images is improved. The pretreatment experiment on CBCT slice images of a wax model of a hollow turbine blade shows that the method performs well both in eliminating noise and in preserving details. (authors)
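
    A much simplified, single-slice version of the noise-type switching described above can be sketched as follows; the impulse threshold and the use of a plain local mean in place of the ACWMTM filter (which draws its statistics from the adjacent slice) are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import median_filter, uniform_filter

def adaptive_slice_denoise(img, impulse_thresh=60, size=3):
    """Per-pixel switch between impulse-noise and Gaussian-noise handling.

    Pixels that deviate strongly from their local median are treated as impulse
    noise and replaced by the median; the remaining pixels are smoothed with a
    local mean (standing in for the trimmed-mean filter of the paper).
    """
    img = img.astype(float)
    med = median_filter(img, size)
    mean = uniform_filter(img, size)
    impulse = np.abs(img - med) > impulse_thresh     # crude impulse detector
    return np.where(impulse, med, mean)
```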

  11. MRT letter: Guided filtering of image focus volume for 3D shape recovery of microscopic objects.

    Science.gov (United States)

    Mahmood, Muhammad Tariq

    2014-12-01

    In this letter, a shape from focus (SFF) method is proposed that utilizes the guided image filtering to enhance the image focus volume efficiently. First, image focus volume is computed using a conventional focus measure. Then each layer of image focus volume is filtered using guided filtering. In this work, the all-in-focus image, which can be obtained from the initial focus volume, is used as guidance image. Finally, improved depth map is obtained from the filtered image focus volume by maximizing the focus measure along the optical axis. The proposed SFF method is efficient and provides better depth maps. The improved performance is highlighted by conducting several experiments using image sequences of simulated and real microscopic objects. The comparative analysis demonstrates the effectiveness of the proposed SFF method. © 2014 Wiley Periodicals, Inc.

  12. Preprocessing of PHERMEX flash radiographic images with Haar and adaptive filtering

    International Nuclear Information System (INIS)

    Brolley, J.E.

    1978-11-01

    Work on image preparation has continued with the application of high-sequency boosting via Haar filtering. This is useful in developing line or edge structures. Widrow LMS adaptive filtering has also been shown to be useful in developing edge structure in special problems. Shadow effects can be obtained with the latter which may be useful for some problems. Combined Haar and adaptive filtering is illustrated for a PHERMEX image

  13. Image restoration by Wiener filtering in the presence of signal-dependent noise.

    Science.gov (United States)

    Kondo, K; Ichioka, Y; Suzuki, T

    1977-09-01

    An optimum filter to restore the degraded image due to blurring and the signal-dependent noise is obtained on the basis of the theory of Wiener filtering. Computer simulations of image restoration using signal-dependent noise models are carried out. It becomes clear that the optimum filter, which makes use of a priori information on the signal-dependent nature of the noise and the spectral density of the signal and the noise showing significant spatial correlation, is potentially advantageous.
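
    The classical frequency-domain Wiener restoration underlying this record can be sketched as below; the constant noise-to-signal ratio is a simplifying assumption, whereas the paper's filter adapts to a signal-dependent noise model with spatially correlated statistics.

```python
import numpy as np

def wiener_deconvolve(blurred, psf, nsr=0.01):
    """Frequency-domain Wiener restoration with a constant noise-to-signal ratio.

    blurred : degraded image; psf : blur kernel (assumed registered with its
    origin at index [0, 0]); nsr : assumed ratio of noise to signal power
    spectra (a scalar here; a signal-dependent noise model would make this vary).
    """
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)          # Wiener transfer function
    return np.real(np.fft.ifft2(W * G))

# Example: restore an image blurred by a 5x5 box PSF.
psf = np.ones((5, 5)) / 25.0
```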

  14. General filtering method for electronic speckle pattern interferometry fringe images with various densities based on variational image decomposition.

    Science.gov (United States)

    Li, Biyuan; Tang, Chen; Gao, Guannan; Chen, Mingming; Tang, Shuwei; Lei, Zhenkun

    2017-06-01

    Filtering off speckle noise from a fringe image is one of the key tasks in electronic speckle pattern interferometry (ESPI). In general, ESPI fringe images can be divided into three categories: low-density fringe images, high-density fringe images, and variable-density fringe images. In this paper, we first present a general filtering method based on variational image decomposition that can filter speckle noise for ESPI fringe images with various densities. In our method, a variable-density ESPI fringe image is decomposed into low-density fringes, high-density fringes, and noise. A low-density fringe image is decomposed into low-density fringes and noise. A high-density fringe image is decomposed into high-density fringes and noise. We give some suitable function spaces to describe low-density fringes, high-density fringes, and noise, respectively. Then we construct several models and numerical algorithms for ESPI fringe images with various densities. And we investigate the performance of these models via our extensive experiments. Finally, we compare our proposed models with the windowed Fourier transform method and coherence enhancing diffusion partial differential equation filter. These two methods may be the most effective filtering methods at present. Furthermore, we use the proposed method to filter a collection of the experimentally obtained ESPI fringe images with poor quality. The experimental results demonstrate the performance of our proposed method.

  15. Adaptive iterated function systems filter for images highly corrupted with fixed - Value impulse noise

    Science.gov (United States)

    Shanmugavadivu, P.; Eliahim Jeevaraj, P. S.

    2014-06-01

    The Adaptive Iterated Function Systems (AIFS) filter presented in this paper has an outstanding potential to attenuate the fixed-value impulse noise in images. This filter has two distinct phases, namely noise detection and noise correction, which use measures of statistics and Iterated Function Systems (IFS), respectively. The performance of the AIFS filter is assessed by three metrics, namely Peak Signal-to-Noise Ratio (PSNR), Mean Structural Similarity Index Matrix (MSSIM) and Human Visual Perception (HVP). The quantitative measures PSNR and MSSIM endorse the merit of this filter in terms of degree of noise suppression and details/edge preservation respectively, in comparison with the high performing filters reported in the recent literature. The qualitative measure HVP confirms the noise suppression ability of the devised filter. This computationally simple noise filter broadly finds application wherein the images are highly degraded by fixed-value impulse noise.

  16. Quality evaluation of no-reference MR images using multidirectional filters and image statistics.

    Science.gov (United States)

    Jang, Jinseong; Bang, Kihun; Jang, Hanbyol; Hwang, Dosik

    2018-09-01

    This study aimed to develop a fully automatic, no-reference image-quality assessment (IQA) method for MR images. New quality-aware features were obtained by applying multidirectional filters to MR images and examining the feature statistics. A histogram of these features was then fitted to a generalized Gaussian distribution function for which the shape parameters yielded different values depending on the type of distortion in the MR image. Standard feature statistics were established through a training process based on high-quality MR images without distortion. Subsequently, the feature statistics of a test MR image were calculated and compared with the standards. The quality score was calculated as the difference between the shape parameters of the test image and the undistorted standard images. The proposed IQA method showed a >0.99 correlation with the conventional full-reference assessment methods; accordingly, this proposed method yielded the best performance among no-reference IQA methods for images containing six types of synthetic, MR-specific distortions. In addition, for authentically distorted images, the proposed method yielded the highest correlation with subjective assessments by human observers, thus demonstrating its superior performance over other no-reference IQAs. Our proposed IQA was designed to consider MR-specific features and outperformed other no-reference IQAs designed mainly for photographic images. Magn Reson Med 80:914-924, 2018. © 2018 International Society for Magnetic Resonance in Medicine.

  17. Pair distribution functions of carbonaceous solids, determined using energy filtered diffraction

    International Nuclear Information System (INIS)

    Petersen, T.C.; McCulloch, D.G.

    2002-01-01

    Full text: The structures of various carbonaceous solids were investigated using energy filtered diffraction patterns collected in two dimensions using a Gatan Imaging Filter (GIF). In order to reduce multiple scattering and eliminate inelastic scattering effects, the diffraction patterns were filtered using an energy-selecting slit around the zero-loss peak. Software has been developed for the extraction of radially averaged pair distribution functions from the diffraction data. This entails finding the position of the un-scattered beam, radially averaging the two dimensional intensity distributions, calibrating the resulting one dimensional intensity profiles and finally normalising the data to obtain structure factors. Techniques for improving and assessing data quality, pertaining to the methodology used here, have also been explored. Structure factors and radial distribution functions generated using this analysis will be discussed and, for the commercial V25 glassy carbon samples, compared to previous work of one of the authors. In order to answer questions regarding multiple scattering effects and structural homogeneity of the samples, neutron scattering was performed on the Medium Resolution Powder Diffractometer (MRPD), at the Australian Nuclear Science and Technology Organisation (ANSTO) facility. A critical comparison of the neutron scattering and electron diffraction generated structure factors will be presented. Copyright (2002) Australian Society for Electron Microscopy Inc

  18. Imaging through scattering media by Fourier filtering and single-pixel detection

    Science.gov (United States)

    Jauregui-Sánchez, Y.; Clemente, P.; Lancis, J.; Tajahuerce, E.

    2018-02-01

    We present a novel imaging system that combines the principles of Fourier spatial filtering and single-pixel imaging in order to recover images of an object hidden behind a turbid medium by transillumination. We compare the performance of our single-pixel imaging setup with that of a conventional system. We conclude that the introduction of Fourier gating improves the contrast of images in both cases. Furthermore, we show that the combination of single-pixel imaging and Fourier spatial filtering techniques is particularly well adapted to provide images of objects transmitted through scattering media.

  19. Filters in 2D and 3D Cardiac SPECT Image Processing

    Directory of Open Access Journals (Sweden)

    Maria Lyra

    2014-01-01

    Full Text Available Nuclear cardiac imaging is a noninvasive, sensitive method providing information on cardiac structure and physiology. Single photon emission tomography (SPECT) evaluates myocardial perfusion, viability, and function and is widely used in clinical routine. The quality of the tomographic image is a key for accurate diagnosis. Image filtering, a mathematical process, compensates for loss of detail in an image while reducing image noise, and it can improve the image resolution and limit the degradation of the image. SPECT images are then reconstructed, either by the filtered back projection (FBP) analytical technique or iteratively, by algebraic methods. The aim of this study is to review filters in cardiac 2D, 3D, and 4D SPECT applications and how these affect the image quality mirroring the diagnostic accuracy of SPECT images. Several filters, including the Hanning, Butterworth, and Parzen filters, were evaluated in combination with the two reconstruction methods as well as with a specified MatLab program. Results showed that for both 3D and 4D cardiac SPECT the Butterworth filter, for different critical frequencies and orders, produced the best results. Between the two reconstruction methods, the iterative one might be more appropriate for cardiac SPECT, since it improves lesion detectability due to the significant improvement of image contrast.
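
    For context, a Butterworth low-pass window of the kind evaluated in the study can be generated and applied to a projection as follows; the cutoff (critical frequency) and order are illustrative values only.

```python
import numpy as np

def butterworth_lowpass(shape, cutoff=0.25, order=5):
    """2-D Butterworth low-pass window, cutoff in cycles/pixel (Nyquist = 0.5)."""
    fy = np.fft.fftfreq(shape[0])[:, None]
    fx = np.fft.fftfreq(shape[1])[None, :]
    f = np.sqrt(fx ** 2 + fy ** 2)
    return 1.0 / (1.0 + (f / cutoff) ** (2 * order))

def filter_projection(proj, cutoff=0.25, order=5):
    """Apply the Butterworth window to one SPECT projection before backprojection."""
    window = butterworth_lowpass(proj.shape, cutoff, order)
    return np.real(np.fft.ifft2(np.fft.fft2(proj) * window))
```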

  20. Median Filter Noise Reduction of Image and Backpropagation Neural Network Model for Cervical Cancer Classification

    Science.gov (United States)

    Wutsqa, D. U.; Marwah, M.

    2017-06-01

    In this paper, we consider spatial operation median filter to reduce the noise in the cervical images yielded by colposcopy tool. The backpropagation neural network (BPNN) model is applied to the colposcopy images to classify cervical cancer. The classification process requires an image extraction by using a gray level co-occurrence matrix (GLCM) method to obtain image features that are used as inputs of BPNN model. The advantage of noise reduction is evaluated by comparing the performances of BPNN models with and without spatial operation median filter. The experimental result shows that the spatial operation median filter can improve the accuracy of the BPNN model for cervical cancer classification.

  1. Demosaicing and Superresolution for Color Filter Array via Residual Image Reconstruction and Sparse Representation

    OpenAIRE

    Sun, Guangling

    2012-01-01

    A framework of demosaicing and superresolution for color filter array (CFA) via residual image reconstruction and sparse representation is presented. Given the intermediate image produced by a certain demosaicing and interpolation technique, a residual image between the final reconstruction image and the intermediate image is reconstructed using sparse representation. The final reconstruction image has richer edges and details than that of the intermediate image. Specifically, a generic dictionar...

  2. An Extension to a Filter Implementation of Local Quadratic Surface for Image Noise Estimation

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    1999-01-01

    Based on regression analysis, this paper gives a description for simple image filter design. Specifically, 3x3 filter implementations of a quadratic surface, residuals from this surface, gradients and the Laplacian are given. For the residual a 5x5 filter is given also. It is shown that the 3x3... It is concluded that if striping is to be considered as a part of the noise, the residual from a 3x3 median filter seems best. If we are interested in a salt-and-pepper noise estimator the proposed extension to the 3x3 filter for the residual from a quadratic surface seems best. Simple statistics...

  3. [Testing method research for key performance indicator of imaging acousto-optic tunable filter (AOTF)].

    Science.gov (United States)

    Hu, Shan-Zhou; Chen, Fen-Fei; Zeng, Li-Bo; Wu, Qiong-Shui

    2013-01-01

    The imaging AOTF is an important optical filter component for new spectral imaging instruments developed in recent years. The principle of the imaging AOTF component is demonstrated, and a set of testing methods for key performance indicators is studied, such as diffraction efficiency, wavelength shift with temperature, spatial homogeneity of the diffraction efficiency, imaging shift, etc.

  4. Use of morphologic filters in the computerized detection of lung nodules in digital chest images

    International Nuclear Information System (INIS)

    Yoshimura, H.; Giger, M.L.; Doi, K.; Ahn, N.; MacMahon, H.

    1989-01-01

    The authors have previously described a computerized scheme for the detection of lung nodules based on a difference-image approach, which had a detection accuracy of 70% with 7-8 false positives per image. Currently, they are investigating morphologic filters for the further enhancement/suppression of nodule signals and the removal of false positives. Gray-level morphologic filtering is performed on clinical chest radiographs digitized with an optical drum scanner. Various shapes and sequences of erosion and dilation filters (i.e., determination of the minimum and maximum gray levels, respectively) were examined for signal enhancement and suppression for use in the difference-image approach.
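
    Gray-level erosion and dilation sequences of the sort examined above are available directly in SciPy; the sketch below shows one common combination (an opening used as a background estimate followed by subtraction), which is an illustration rather than the authors' exact filter sequence, and the structuring-element size is an assumption.

```python
import numpy as np
from scipy.ndimage import grey_erosion, grey_dilation

def morphological_background(img, size=15):
    """Grey-level opening (erosion then dilation): estimates the smooth background
    so that subtracting it leaves small bright structures such as nodule signals."""
    eroded = grey_erosion(img, size=(size, size))
    return grey_dilation(eroded, size=(size, size))

def enhance_nodules(img, size=15):
    """Top-hat style enhancement: original image minus morphological background."""
    img = img.astype(float)
    return img - morphological_background(img, size)
```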

  5. Two-dimensional real-time imaging system for subtraction angiography using an iodine filter

    Science.gov (United States)

    Umetani, Keiji; Ueda, Ken; Takeda, Tohoru; Anno, Izumi; Itai, Yuji; Akisada, Masayoshi; Nakajima, Teiichi

    1992-01-01

    A new type of subtraction imaging system was developed using an iodine filter and a single-energy broad bandwidth monochromatized x ray. The x-ray images of coronary arteries made after intravenous injection of a contrast agent are enhanced by an energy-subtraction technique. Filter chopping of the x-ray beam switches energies rapidly, so that a nearly simultaneous pair of filtered and nonfiltered images can be made. By using a high-speed video camera, a pair of two 512 × 512 pixel images can be obtained within 9 ms. Three hundred eighty-four images (raw data) are stored in a 144-Mbyte frame memory. After phantom studies, in vivo subtracted images of coronary arteries in dogs were obtained at a rate of 15 images/s.

  6. Development of an adaptive bilateral filter for evaluating color image difference

    Science.gov (United States)

    Wang, Zhaohui; Hardeberg, Jon Yngve

    2012-04-01

    Spatial filtering, which aims to mimic the contrast sensitivity function (CSF) of the human visual system (HVS), has previously been combined with color difference formulae for measuring color image reproduction errors. These spatial filters attenuate imperceptible information in images, unfortunately including high frequency edges, which are believed to be crucial in the process of scene analysis by the HVS. The adaptive bilateral filter represents a novel approach, which avoids the undesirable loss of edge information introduced by CSF-based filtering. The bilateral filter employs two Gaussian smoothing filters in different domains, i.e., spatial domain and intensity domain. We propose a method to decide the parameters, which are designed to be adaptive to the corresponding viewing conditions, and the quantity and homogeneity of information contained in an image. Experiments and discussions are given to support the proposal. A series of perceptual experiments were conducted to evaluate the performance of our approach. The experimental sample images were reproduced with variations in six image attributes: lightness, chroma, hue, compression, noise, and sharpness/blurriness. The Pearson's correlation values between the model-predicted image difference and the observed difference were employed to evaluate the performance, and compare it with that of spatial CIELAB and image appearance model.
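
    A plain (non-adaptive) bilateral filter, the starting point for the adaptive variant described above, can be sketched as follows; the fixed sigma values and window radius are assumptions, whereas the paper ties these parameters to viewing conditions and image content.

```python
import numpy as np

def bilateral_filter(img, sigma_s=3.0, sigma_r=0.1, radius=6):
    """Bilateral filter: Gaussian weights in both the spatial and intensity domains.

    sigma_s : spatial scale (pixels); sigma_r : intensity scale (image in [0, 1]).
    """
    img = img.astype(float)
    pad = np.pad(img, radius, mode='edge')
    out = np.zeros_like(img)
    norm = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            shifted = pad[radius + dy: radius + dy + img.shape[0],
                          radius + dx: radius + dx + img.shape[1]]
            # combined spatial and range weight for this neighbour offset
            w = np.exp(-(dx * dx + dy * dy) / (2 * sigma_s ** 2)
                       - (shifted - img) ** 2 / (2 * sigma_r ** 2))
            out += w * shifted
            norm += w
    return out / norm
```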

  7. Bandwidth Controllable Tunable Filter for Hyper-/Multi-Spectral Imager, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR Phase I proposal introduces a fast speed bandwidth controllable tunable filter for hyper-/multi-spectral (HS/MS) imagers. It dynamically passes a variable...

  8. Swarm Intelligence for Optimizing Hybridized Smoothing Filter in Image Edge Enhancement

    Science.gov (United States)

    Rao, B. Tirumala; Dehuri, S.; Dileep, M.; Vindhya, A.

    In this modern era, image transmission and processing plays a major role. It would be impossible to retrieve information from satellite and medical images without the help of image processing techniques. Edge enhancement is an image processing step that enhances the edge contrast of an image or video in an attempt to improve its acutance. Edges are the representations of the discontinuities of image intensity functions. For processing these discontinuities in an image, a good edge enhancement technique is essential. The proposed work uses a new idea for edge enhancement using hybridized smoothening filters and we introduce a promising technique of obtaining best hybrid filter using swarm algorithms (Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO)) to search for an optimal sequence of filters from among a set of rather simple, representative image processing filters. This paper deals with the analysis of the swarm intelligence techniques through the combination of hybrid filters generated by these algorithms for image edge enhancement.

  9. Dynamic positron emission tomography image restoration via a kinetics-induced bilateral filter.

    Directory of Open Access Journals (Sweden)

    Zhaoying Bian

    Full Text Available Dynamic positron emission tomography (PET imaging is a powerful tool that provides useful quantitative information on physiological and biochemical processes. However, low signal-to-noise ratio in short dynamic frames makes accurate kinetic parameter estimation from noisy voxel-wise time activity curves (TAC a challenging task. To address this problem, several spatial filters have been investigated to reduce the noise of each frame with noticeable gains. These filters include the Gaussian filter, bilateral filter, and wavelet-based filter. These filters usually consider only the local properties of each frame without exploring potential kinetic information from entire frames. Thus, in this work, to improve PET parametric imaging accuracy, we present a kinetics-induced bilateral filter (KIBF to reduce the noise of dynamic image frames by incorporating the similarity between the voxel-wise TACs using the framework of bilateral filter. The aim of the proposed KIBF algorithm is to reduce the noise in homogeneous areas while preserving the distinct kinetics of regions of interest. Experimental results on digital brain phantom and in vivo rat study with typical (18F-FDG kinetics have shown that the present KIBF algorithm can achieve notable gains over other existing algorithms in terms of quantitative accuracy measures and visual inspection.

  10. FIONDA (Filtering Images of Niobium Disks Application): Filter application for Eddy Current Scanner data analysis

    International Nuclear Information System (INIS)

    Boffo, C.; Bauer, P.

    2005-01-01

    As part of the material QC process, each Niobium disk from which a superconducting RF cavity is built must undergo an eddy current scan [1]. This process allows to discover embedded defects in the material that are not visible to the naked eye because too small or under the surface. Moreover, during the production process of SC cavities the outer layer of Nb is removed via chemical or electro-chemical etching, thus it is important to evaluate the quality of the subsurface layer (in the order of 100nm) where superconductivity will happen. The reference eddy current scanning machine is operated at DESY; at Fermilab we are using the SNS eddy current scanner on loan, courtesy of SNS. In the past year, several upgrades were implemented aiming at raising the SNS machine performance to that of the DESY reference machine [2]. As part of this effort an algorithm that enables the filtering of the results of the scans and thus improves the resolution of the process was developed. The description of the algorithm and of the software used to filter the scan results is presented in this note. This filter application is a useful tool when the coupling between the signal associated to the long range probe distance (or sample thickness) variation and that associated to inclusions masks the presence of defects. Moreover instead of using indirect criteria (such as appearance on screen), the filter targets precisely the topology variations of interest. This application is listed in the FermiTools database and is freely available

  11. Artifact reduction of compressed images and video combining adaptive fuzzy filtering and directional anisotropic diffusion

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Forchhammer, Søren; Korhonen, Jari

    2011-01-01

    Fuzzy filtering is one of the recently developed methods for reducing distortion in compressed images and video. In this paper, we combine the powerful anisotropic diffusion equations with fuzzy filtering in order to reduce the impact of artifacts. Based on the directional nature of the blocking and ringing artifacts, we have applied directional anisotropic diffusion. Besides that, the selection of the adaptive threshold parameter for the diffusion coefficient has also improved the performance of the algorithm. Experimental results on JPEG compressed images as well as MJPEG and H.264 compressed videos show improvement in artifact reduction of the proposed algorithm over other directional and spatial fuzzy filters.

  12. Energy Based Clutter Filtering for Vector Flow Imaging

    DEFF Research Database (Denmark)

    Villagómez Hoyos, Carlos Armando; Jensen, Jonas; Ewertsen, Caroline

    2017-01-01

    for obtaining vector flow measurements, since the spectra overlaps at high beam-to-flow angles. In this work a distinct approach is proposed, where the energy of the velocity spectrum is used to differentiate among the two signals. The energy based method is applied by limiting the amplitude of the velocity...... spectrum function to a predetermined threshold. The effect of the clutter filtering is evaluated on a plane wave (PW) scan sequence in combination with transverse oscillation (TO) and directional beamforming (DB) for velocity estimation. The performance of the filter is assessed by comparison...

  13. Image reconstruction for digital breast tomosynthesis (DBT) by using projection-angle-dependent filter functions

    Energy Technology Data Exchange (ETDEWEB)

    Park, Yeonok; Park, Chulkyu; Cho, Hyosung; Je, Uikyu; Hong, Daeki; Lee, Minsik; Cho, Heemoon; Choi, Sungil; Koo, Yangseo [Yonsei University, Wonju (Korea, Republic of)

    2014-09-15

    Digital breast tomosynthesis (DBT) is considered in clinics as a standard three-dimensional imaging modality, allowing the earlier detection of cancer. It typically acquires only 10-30 projections over a limited angle range of 15-60° with a stationary detector and typically uses a computationally-efficient filtered-backprojection (FBP) algorithm for image reconstruction. However, a common FBP algorithm yields poor image quality resulting from the loss of average image value and the presence of severe image artifacts due to the elimination of the dc component of the image by the ramp filter and to the incomplete data, respectively. As an alternative, iterative reconstruction methods are often used in DBT to overcome these difficulties, even though they are still computationally expensive. In this study, as a compromise, we considered a projection-angle dependent filtering method in which one-dimensional geometry-adapted filter kernels are computed with the aid of a conjugate-gradient method and are incorporated into the standard FBP framework. We implemented the proposed algorithm and performed systematic simulation work to investigate the imaging characteristics. Our results indicate that the proposed method is superior to a conventional FBP method for DBT imaging and has a comparable computational cost, while preserving good image homogeneity and edge sharpening with no serious image artifacts.
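
    The baseline FBP filtering step that the projection-angle-dependent kernels replace is the standard ramp (Ram-Lak) filter; a row-wise sketch is given below for orientation. The angle-dependent, conjugate-gradient-derived kernels of the paper are not reproduced here.

```python
import numpy as np

def ramp_filter_rows(projection):
    """Apply the standard ramp (Ram-Lak) filter to each detector row of a projection.

    This is the conventional FBP filtering step; the paper replaces this single
    kernel with 1-D kernels that depend on the projection angle, fitted to the
    limited-angle DBT geometry by a conjugate-gradient method.
    """
    n = projection.shape[-1]
    ramp = np.abs(np.fft.fftfreq(n))          # |f| response; removes the DC term
    P = np.fft.fft(projection, axis=-1)
    return np.real(np.fft.ifft(P * ramp, axis=-1))
```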

  14. Iris image recognition wavelet filter-banks based iris feature extraction schemes

    CERN Document Server

    Rahulkar, Amol D

    2014-01-01

    This book provides the new results in wavelet filter banks based feature extraction, and the classifier in the field of iris image recognition. It provides the broad treatment on the design of separable, non-separable wavelets filter banks, and the classifier. The design techniques presented in the book are applied on iris image analysis for person authentication. This book also brings together the three strands of research (wavelets, iris image analysis, and classifier). It compares the performance of the presented techniques with state-of-the-art available schemes. This book contains the compilation of basic material on the design of wavelets that avoids reading many different books. Therefore, it provide an easier path for the new-comers, researchers to master the contents. In addition, the designed filter banks and classifier can also be effectively used than existing filter-banks in many signal processing applications like pattern classification, data-compression, watermarking, denoising etc.  that will...

  15. SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering

    International Nuclear Information System (INIS)

    Iliopoulos, AS; Sun, X; Floros, D; Zhang, Y; Yin, FF; Ren, L; Pitsianis, N

    2016-01-01

    Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or un-supervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented in parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to spatial

  16. SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering

    Energy Technology Data Exchange (ETDEWEB)

    Iliopoulos, AS; Sun, X [Duke University, Durham, NC (United States); Floros, D [Aristotle University of Thessaloniki (Greece); Zhang, Y; Yin, FF; Ren, L [Duke University Medical Center, Durham, NC (United States); Pitsianis, N [Aristotle University of Thessaloniki (Greece); Duke University, Durham, NC (United States)

    2016-06-15

    Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or un-supervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented in parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to spatial
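
    A minimal sketch of the multi-scale local-statistics idea described in the two records above is given below; the choice of window sizes and statistics is an assumption for illustration, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import uniform_filter, median_filter

def local_statistics_pyramid(img, window_sizes=(3, 7, 15, 31)):
    """Local mean, standard deviation, median and MAD at several window scales.

    Differences of a statistic between consecutive scales indicate over what
    spatial scope the noise behaviour changes, which can then drive the window
    size of an otherwise globally parametrized filter.
    """
    img = img.astype(float)
    stats = {}
    for w in window_sizes:
        mean = uniform_filter(img, w)
        var = uniform_filter(img ** 2, w) - mean ** 2
        med = median_filter(img, w)
        mad = median_filter(np.abs(img - med), w)
        stats[w] = {'mean': mean,
                    'std': np.sqrt(np.clip(var, 0, None)),
                    'median': med,
                    'mad': mad}
    return stats
```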

  17. Tunable filter imaging of high-redshift quasar fields

    NARCIS (Netherlands)

    Swinbank, J.; Baker, J.; Barr, J.; Hook, I.; Bland-Hawthorn, J.

    2012-01-01

    We have used the Taurus Tunable Filter to search for Lyα emitters in the fields of three high-redshift quasars: two at z∼ 2.2 (MRC B1256−243 and MRC B2158−206) and one at z∼ 4.5 (BR B0019−1522). Our observations had a field of view of around 35 arcmin2, and reached AB magnitudes of ∼21 (MRC

  18. Advanced microlens and color filter process technology for the high-efficiency CMOS and CCD image sensors

    Science.gov (United States)

    Fan, Yang-Tung; Peng, Chiou-Shian; Chu, Cheng-Yu

    2000-12-01

    New markets are emerging for digital electronic image devices, especially in visual communications, PC cameras, mobile/cell phones, security systems, toys, vehicle imaging systems and computer peripherals for document capture. A one-chip image system, in which the image sensor has a full digital interface, brings image capture devices into our daily lives. Adding a color filter to such an image sensor, in a pattern of pixel mosaics or wide stripes, makes the captured image more realistic and colorful; one might say that the color filter makes life more colorful. A color filter transmits only the portion of the image light source whose wavelength and transmittance match the filter itself and blocks the rest. The color filter process consists of coating and patterning green, red and blue (or cyan, magenta and yellow) mosaic resists onto the matching pixels of the image sensing array. From the signal captured at each pixel, the scene can then be reconstructed. The widespread use of digital electronic cameras and multimedia applications today makes the prospects for color filters bright. Although the process is challenging, it is well worth developing. We provide the best service with short cycle time, excellent color quality, and high, stable yield. The key issues of the advanced color process that have to be solved and implemented are planarization and micro-lens technology. Many key points of color filter process technology that must be considered are also described in this paper.

  19. Variation of the count-dependent Metz filter with imaging system modulation transfer function

    International Nuclear Information System (INIS)

    King, M.A.; Schwinger, R.B.; Penney, B.C.

    1986-01-01

    A systematic investigation was conducted of how a number of parameters which alter the system modulation transfer function (MTF) influence the count-dependent Metz filter. Since restoration filters are most effective at those frequencies where the object power spectrum dominates that of the noise, it was observed that parameters which significantly degrade the MTF at low spatial frequencies strongly influence the formation of the Metz filter. Thus the radionuclide imaged and the depth of the source in a scattering medium had the most influence. This is because they alter the relative amount of scattered radiation being imaged. For low-energy photon emitters, the collimator employed and the distance from the collimator were found to have less of an influence but still to be significant. These cause alterations in the MTF which are more gradual, and hence are most pronounced at mid to high spatial frequencies. As long as adequate spatial sampling is employed, the Metz filter was determined to be independent of the exact size of the sampling bin width, to a first approximation. For planar and single photon emission computed tomographic (SPECT) imaging, it is shown that two-dimensional filtering with the Metz filter optimized for the imaging conditions is able to deconvolve scatter and other causes of spatial resolution loss while diminishing noise, all in a balanced manner
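
    The Metz filter referred to here is conventionally built from the system MTF as M(f) = [1 - (1 - MTF(f)^2)^N] / MTF(f): it restores mid frequencies and rolls off where the MTF is poor. A small sketch under the assumption of an isotropic Gaussian MTF model (the MTF width and filter order are arbitrary, not the count-dependent values discussed in the record):

```python
import numpy as np

def metz_filter(mtf, order):
    """Metz filter M(f) = (1 - (1 - MTF(f)^2)^N) / MTF(f)."""
    mtf = np.clip(mtf, 1e-6, None)          # avoid division by zero
    return (1.0 - (1.0 - mtf ** 2) ** order) / mtf

def apply_metz(image, mtf_sigma=0.15, order=3):
    """Apply a Metz restoration filter in the 2-D frequency domain,
    assuming an isotropic Gaussian MTF exp(-f^2 / (2 sigma^2))."""
    ny, nx = image.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    f = np.sqrt(fx ** 2 + fy ** 2)
    mtf = np.exp(-(f ** 2) / (2.0 * mtf_sigma ** 2))
    filt = metz_filter(mtf, order)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * filt))

# Usage on a toy image
img = np.zeros((64, 64)); img[24:40, 24:40] = 1.0
restored = apply_metz(img, mtf_sigma=0.2, order=5)
print(restored.shape, float(restored.max()))
```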

  20. A METHOD FOR RECORDING AND VIEWING STEREOSCOPIC IMAGES IN COLOUR USING MULTICHROME FILTERS

    DEFF Research Database (Denmark)

    2000-01-01

    The aim of the invention is to create techniques for the encoding, production and viewing of stereograms, supplemented by methods for selecting certain optical filters needed in these novel techniques, thus providing a human observer with stereograms each of which consist of a single image... in a conventional stereogram recorded of the scene. The invention makes use of a colour-based encoding technique and viewing filters selected so that the human observer receives, in one eye, an image of nearly full colour information, in the other eye, an essentially monochrome image supplying the parallactic...

  1. Aircraft Detection from VHR Images Based on Circle-Frequency Filter and Multilevel Features

    Directory of Open Access Journals (Sweden)

    Feng Gao

    2013-01-01

    Full Text Available Aircraft automatic detection from very high-resolution (VHR) images plays an important role in a wide variety of applications. This paper proposes a novel detector for aircraft detection from very high-resolution (VHR) remote sensing images. To accurately distinguish aircraft from the background, a circle-frequency filter (CF-filter) is used to extract the candidate locations of aircraft from a large image. A multi-level feature model is then employed to represent both the local appearance and the spatial layout of aircraft by means of the Robust Hue Descriptor and the Histogram of Oriented Gradients. The experimental results demonstrate the superior performance of the proposed method.

  2. Comparison of the diagnostic accuracy of direct digital radiography system, filtered images, and subtraction radiography

    Directory of Open Access Journals (Sweden)

    Wilton Mitsunari Takeshita

    2013-01-01

    Full Text Available Background: To compare the diagnostic accuracy of three different imaging systems: Direct digital radiography system (DDR-CMOS, four types of filtered images, and a priori and a posteriori registration of digital subtraction radiography (DSR in the diagnosis of proximal defects. Materials and Methods: The teeth were arranged in pairs in 10 blocks of vinyl polysiloxane, and proximal defects were performed with drills of 0.25, 0.5, and 1 mm diameter. Kodak RVG 6100 sensor was used to capture the images. A posteriori DSR registrations were done with Regeemy 0.2.43 and subtraction with Image Tool 3.0. Filtered images were obtained with Kodak Dental Imaging 6.1 software. Images (n = 360 were evaluated by three raters, all experts in dental radiology. Results: Sensitivity and specificity of the area under the receiver operator characteristic (ROC curve (Az were higher for DSR images with all three drills (Az = 0.896, 0.979, and 1.000 for drills 0.25, 0.5, and 1 mm, respectively. The highest values were found for 1-mm drills and the lowest for 0.25-mm drills, with negative filter having the lowest values of all (Az = 0.631. Conclusion: The best method of diagnosis was by using a DSR. The negative filter obtained the worst results. Larger drills showed the highest sensitivity and specificity values of the area under the ROC curve.
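
    The subtraction step at the heart of digital subtraction radiography (DSR) is simply a pixel-wise difference of registered baseline and follow-up radiographs. A toy illustration of that step alone, assuming registration (done in the study with Regeemy) has already been performed; the display offset is an arbitrary choice:

```python
import numpy as np

def subtraction_radiograph(baseline, follow_up, display_offset=128.0):
    """Pixel-wise subtraction of two registered radiographs.
    Unchanged anatomy cancels; a defect appears as a local deviation."""
    diff = follow_up.astype(np.float64) - baseline.astype(np.float64)
    return np.clip(diff + display_offset, 0, 255).astype(np.uint8)

# Simulated example: a small "proximal defect" removes density locally
rng = np.random.default_rng(1)
baseline = 100 + 5 * rng.normal(size=(128, 128))
follow_up = baseline.copy()
follow_up[60:68, 60:68] -= 30          # simulated lesion
dsr = subtraction_radiograph(baseline, follow_up)
print(dsr[62, 62], dsr[10, 10])        # lesion pixel vs background pixel
```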

  3. Improvement of nonlinear diffusion equation using relaxed geometric mean filter for low PSNR images

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan

    2013-01-01

    A new method to improve the performance of low PSNR image denoising is presented. The proposed scheme estimates edge gradient from an image that is regularised with a relaxed geometric mean filter. The proposed method consists of two stages; the first stage consists of a second order nonlinear an...

  4. Deconvolution of Defocused Image with Multivariate Local Polynomial Regression and Iterative Wiener Filtering in DWT Domain

    Directory of Open Access Journals (Sweden)

    Liyun Su

    2010-01-01

    obtaining the point spread function (PSF) parameter, an iterative Wiener filter is adopted to complete the restoration. We experimentally illustrate its performance on simulated data and a real blurred image. Results show that the proposed PSF parameter estimation technique and the image restoration method are effective.

  5. Image denoising using new pixon representation based on fuzzy filtering and partial differential equations

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Nikpour, Mohsen

    2012-01-01

    In this paper, we have proposed two extensions to pixon-based image modeling. The first one is using bicubic interpolation instead of bilinear interpolation and the second one is using fuzzy filtering method, aiming to improve the quality of the pixonal image. Finally, partial differential...

  6. Error analysis of filtering operations in pixel-duplicated images of diabetic retinopathy

    Science.gov (United States)

    Mehrubeoglu, Mehrube; McLauchlan, Lifford

    2010-08-01

    In this paper, diabetic retinopathy is chosen as a sample target image to demonstrate the effectiveness of image enlargement through pixel duplication in identifying regions of interest. Pixel duplication is presented as a simpler alternative to data interpolation techniques for detecting small structures in the images. A comparative analysis is performed on different image processing schemes applied to both original and pixel-duplicated images. Structures of interest are detected and classification parameters optimized for minimum false positive detection in the original and enlarged retinal pictures. The error analysis demonstrates the advantages as well as shortcomings of pixel duplication in image enhancement when spatial averaging operations (smoothing filters) are also applied.
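
    Pixel duplication as used here is integer upscaling without interpolation; the sketch below contrasts it with bilinear interpolation on a toy array (both calls are standard NumPy/SciPy, not the study's processing chain):

```python
import numpy as np
from scipy.ndimage import zoom

def duplicate_pixels(img, factor=2):
    """Enlarge an image by integer pixel duplication (no interpolation)."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

img = np.arange(16, dtype=np.float64).reshape(4, 4)
dup = duplicate_pixels(img, 2)            # 8x8, blocky but exact values
interp = zoom(img, 2, order=1)            # 8x8, bilinear interpolation
print(dup.shape, interp.shape)
print(np.unique(dup).size, np.unique(interp).size)  # duplication keeps only the original values
```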

  7. Impact of Image Filters and Observations Parameters in CBCT for Identification of Mandibular Osteolytic Lesions.

    Science.gov (United States)

    Monteiro, Bruna Moraes; Nobrega Filho, Denys Silveira; Lopes, Patrícia de Medeiros Loureiro; de Sales, Marcelo Augusto Oliveira

    2012-01-01

    The aim of this study was to analyze the influence of filters (algorithms) used to improve the image in Cone Beam Computed Tomography (CBCT) on the diagnosis of osteolytic lesions of the mandible, in order to establish the viewing protocols most suitable for CBCT diagnostics. 15 dry mandibles in which perforations had been made, simulating lesions, were submitted to CBCT examination. Two examiners analyzed the images on two occasions, using the image-enhancement filters Hard, Normal, and Very Sharp contained in the iCAT Vision software, and the following assessment protocols: axial; sagittal and coronal; and axial, sagittal and coronal planes simultaneously (MPR). The sensitivity and specificity (validity) of cone beam computed tomography (CBCT) were demonstrated, as the values achieved were above 75% for sensitivity and above 85% for specificity, reaching around 95.5% sensitivity and 99% specificity when the appropriate observation protocol was used. It was concluded that the use of filters (algorithms) to improve the CBCT image influences the diagnosis, since all measured values were correspondingly higher when the Very Sharp filter was used, which justifies its use in clinical activities, followed by the Hard and Normal filters, in order of decreasing values.

  8. Impact of Image Filters and Observations Parameters in CBCT for Identification of Mandibular Osteolytic Lesions

    Directory of Open Access Journals (Sweden)

    Bruna Moraes Monteiro

    2012-01-01

    Full Text Available The aim of this study was to analyze the influence of filters (algorithms) used to improve the image in Cone Beam Computed Tomography (CBCT) on the diagnosis of osteolytic lesions of the mandible, in order to establish the viewing protocols most suitable for CBCT diagnostics. 15 dry mandibles in which perforations had been made, simulating lesions, were submitted to CBCT examination. Two examiners analyzed the images on two occasions, using the image-enhancement filters Hard, Normal, and Very Sharp contained in the iCAT Vision software, and the following assessment protocols: axial; sagittal and coronal; and axial, sagittal and coronal planes simultaneously (MPR). The sensitivity and specificity (validity) of cone beam computed tomography (CBCT) were demonstrated, as the values achieved were above 75% for sensitivity and above 85% for specificity, reaching around 95.5% sensitivity and 99% specificity when the appropriate observation protocol was used. It was concluded that the use of filters (algorithms) to improve the CBCT image influences the diagnosis, since all measured values were correspondingly higher when the Very Sharp filter was used, which justifies its use in clinical activities, followed by the Hard and Normal filters, in order of decreasing values.

  9. Spectral characterization in deep UV of an improved imaging KDP acousto-optic tunable filter

    International Nuclear Information System (INIS)

    Gupta, Neelam; Voloshinov, Vitaly

    2014-01-01

    Recently, we developed a number of high quality noncollinear acousto-optic tunable filter (AOTF) cells in different birefringent materials with UV imaging capability. Cells based on a single crystal of KDP (potassium dihydrogen phosphate) had the best transmission efficiency and the optical throughput needed to acquire high quality spectral images at wavelengths above 220 nm. One of the main limitations of these imaging filters was their small angular aperture in air, limited to about 1.0°. In this paper, we describe an improved imaging KDP AOTF operating from the deep UV to the visible region of the spectrum. The linear and angular apertures of the new filter are 10 × 10 mm² and 1.8°, respectively. The spectral tuning range is 205–430 nm with a 60 cm⁻¹ spectral resolution. We describe the filter and present experimental results on imaging using both a broadband source and a number of light emitting diodes (LEDs) in the UV, and include the measured spectra of these LEDs obtained with a collinear SiO₂ filter-based spectrometer operating above 255 nm. (paper)

  10. Visible Wavelength Color Filters Using Dielectric Subwavelength Gratings for Backside-Illuminated CMOS Image Sensor Technologies.

    Science.gov (United States)

    Horie, Yu; Han, Seunghoon; Lee, Jeong-Yub; Kim, Jaekwan; Kim, Yongsung; Arbabi, Amir; Shin, Changgyun; Shi, Lilong; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Lee, Hong-Seok; Hwang, Sungwoo; Faraon, Andrei

    2017-05-10

    We report transmissive color filters based on subwavelength dielectric gratings that can replace conventional dye-based color filters used in backside-illuminated CMOS image sensor (BSI CIS) technologies. The filters are patterned in an 80 nm-thick polysilicon film on a 115 nm-thick SiO₂ spacer layer. They are optimized for operating at the primary RGB colors, exhibit peak transmittance of 60-80%, and have an almost angle-insensitive response over a ±20° range. This technology enables shrinking of the pixel sizes down to near a micrometer.

  11. An image filtering technique for SPIDER visible tomography

    Energy Technology Data Exchange (ETDEWEB)

    Fonnesu, N., E-mail: nicola.fonnesu@igi.cnr.it; Agostini, M.; Brombin, M.; Pasqualotto, R.; Serianni, G. [Consorzio RFX, Associazione EURATOM-ENEA sulla Fusione, Corso Stati Uniti 4, I-35127 Padova (Italy)

    2014-02-15

    The tomographic diagnostic developed for the beam generated in the SPIDER facility (100 keV, 50 A prototype negative ion source of ITER neutral beam injector) will characterize the two-dimensional particle density distribution of the beam. The simulations described in the paper show that instrumental noise has a large influence on the maximum achievable resolution of the diagnostic. To reduce its impact on beam pattern reconstruction, a filtering technique has been adapted and implemented in the tomography code. This technique is applied to the simulated tomographic reconstruction of the SPIDER beam, and the main results are reported.

  12. An image filtering technique for SPIDER visible tomography

    International Nuclear Information System (INIS)

    Fonnesu, N.; Agostini, M.; Brombin, M.; Pasqualotto, R.; Serianni, G.

    2014-01-01

    The tomographic diagnostic developed for the beam generated in the SPIDER facility (100 keV, 50 A prototype negative ion source of ITER neutral beam injector) will characterize the two-dimensional particle density distribution of the beam. The simulations described in the paper show that instrumental noise has a large influence on the maximum achievable resolution of the diagnostic. To reduce its impact on beam pattern reconstruction, a filtering technique has been adapted and implemented in the tomography code. This technique is applied to the simulated tomographic reconstruction of the SPIDER beam, and the main results are reported

  13. A Tentative Application Of Morphological Filters To Time-Varying Images

    Science.gov (United States)

    Billard, D.; Poquillon, B.

    1989-03-01

    In this paper, morphological filters, which are commonly used to process either 2D or multidimensional static images, are generalized to the analysis of time-varying image sequences. The introduction of the time dimension induces interesting properties when designing such spatio-temporal morphological filters. In particular, the specification of spatio-temporal structuring elements (equivalent to time-varying spatial structuring elements) can be adjusted according to the temporal variations of the image sequences to be processed: this makes it possible to derive specific morphological transforms to perform noise filtering or moving object discrimination on dynamic images viewed by a non-stationary sensor. First, a brief introduction to the basic principles underlying morphological filters is given. Then, a straightforward generalization of these principles to time-varying images is proposed. This leads us to define spatio-temporal opening and closing and to introduce some of their possible applications to the processing of dynamic images. Finally, preliminary results obtained using a natural forward-looking infrared (FLIR) image sequence are presented.

  14. Bowtie filter and water calibration in the improvement of cone beam CT image quality

    International Nuclear Information System (INIS)

    Li Minghui; Dai Jianrong; Zhang Ke

    2010-01-01

    Objective: To evaluate the improvement of cone beam CT (CBCT) image quality obtained by using a bowtie filter (F1) and water calibration. Methods: First, multi-level gain calibration of the detector panel was performed with the Cal 2 calibration method, and CT images of the CATPHAN503 phantom were collected with the F0 filter and with the bowtie filter, respectively. Then the detector panel was calibrated using the water calibration kit, and images were acquired again. Finally, the change in image quality after using F1 and (or) the water calibration method was observed. The observed indexes included low contrast visibility, spatial uniformity, ring artifact, spatial resolution and geometric accuracy. Results: Compared with the traditional combination of the F0 filter and Cal 2 calibration, the combination of the bowtie filter F1 and water calibration improves low contrast visibility by 13.71% and spatial uniformity by 54.42%. Water calibration removes ring artifacts effectively. However, neither of them improves spatial resolution or geometric accuracy. Conclusions: The combination of F1 and water calibration improves CBCT image quality effectively. This improvement aids the registration of CBCT images with localization images. (authors)

  15. The use of the Kalman filter in the automated segmentation of EIT lung images

    International Nuclear Information System (INIS)

    Zifan, A; Chapman, B E; Liatsis, P

    2013-01-01

    In this paper, we present a new pipeline for the fast and accurate segmentation of impedance images of the lungs using electrical impedance tomography (EIT). EIT is an emerging, promising, non-invasive imaging modality that produces real-time, low spatial but high temporal resolution images of impedance inside a body. Recovering impedance itself constitutes a nonlinear ill-posed inverse problem, therefore the problem is usually linearized, which produces impedance-change images, rather than static impedance ones. Such images are highly blurry and fuzzy along object boundaries. We provide a mathematical reasoning behind the high suitability of the Kalman filter when it comes to segmenting and tracking conductivity changes in EIT lung images. Next, we use a two-fold approach to tackle the segmentation problem. First, we construct a global lung shape to restrict the search region of the Kalman filter. Next, we proceed with augmenting the Kalman filter by incorporating an adaptive foreground detection system to provide the boundary contours for the Kalman filter to carry out the tracking of the conductivity changes as the lungs undergo deformation in a respiratory cycle. The proposed method has been validated by using performance statistics such as misclassified area, and false positive rate, and compared to previous approaches. The results show that the proposed automated method can be a fast and reliable segmentation tool for EIT imaging. (paper)

  16. The use of the Kalman filter in the automated segmentation of EIT lung images.

    Science.gov (United States)

    Zifan, A; Liatsis, P; Chapman, B E

    2013-06-01

    In this paper, we present a new pipeline for the fast and accurate segmentation of impedance images of the lungs using electrical impedance tomography (EIT). EIT is an emerging, promising, non-invasive imaging modality that produces real-time, low spatial but high temporal resolution images of impedance inside a body. Recovering impedance itself constitutes a nonlinear ill-posed inverse problem, therefore the problem is usually linearized, which produces impedance-change images, rather than static impedance ones. Such images are highly blurry and fuzzy along object boundaries. We provide a mathematical reasoning behind the high suitability of the Kalman filter when it comes to segmenting and tracking conductivity changes in EIT lung images. Next, we use a two-fold approach to tackle the segmentation problem. First, we construct a global lung shape to restrict the search region of the Kalman filter. Next, we proceed with augmenting the Kalman filter by incorporating an adaptive foreground detection system to provide the boundary contours for the Kalman filter to carry out the tracking of the conductivity changes as the lungs undergo deformation in a respiratory cycle. The proposed method has been validated by using performance statistics such as misclassified area, and false positive rate, and compared to previous approaches. The results show that the proposed automated method can be a fast and reliable segmentation tool for EIT imaging.
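
    The tracking stage described in this record rests on a standard Kalman predict/update cycle. A minimal constant-velocity Kalman filter for a single boundary coordinate is sketched below; the lung-shape prior and the adaptive foreground detector of the paper are not reproduced, and the noise covariances are arbitrary:

```python
import numpy as np

def kalman_track(measurements, q=1e-3, r=0.5):
    """Constant-velocity Kalman filter for a 1-D boundary coordinate."""
    F = np.array([[1.0, 1.0], [0.0, 1.0]])        # state transition
    H = np.array([[1.0, 0.0]])                    # we observe position only
    Q = q * np.eye(2)                             # process noise covariance
    R = np.array([[r]])                           # measurement noise covariance
    x = np.array([[measurements[0]], [0.0]])      # initial state
    P = np.eye(2)
    track = []
    for z in measurements:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update
        y = np.array([[z]]) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        track.append(float(x[0, 0]))
    return np.array(track)

# Noisy sinusoidal "boundary motion" over a respiratory cycle
t = np.linspace(0, 2 * np.pi, 100)
rng = np.random.default_rng(2)
z = 50 + 10 * np.sin(t) + rng.normal(scale=2.0, size=t.size)
print(kalman_track(z)[:5])
```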

  17. [Design Method Analysis and Performance Comparison of Wall Filter for Ultrasound Color Flow Imaging].

    Science.gov (United States)

    Wang, Lutao; Xiao, Jun; Chai, Hua

    2015-08-01

    The successful suppression of clutter arising from stationary or slowly moving tissue is one of the key issues in medical ultrasound color blood-flow imaging. Remaining clutter may bias the mean blood frequency estimate and result in a potentially misleading description of blood flow. In this paper, based on the principle of the general wall filter, the design process of three classes of filters, infinite impulse response with projection initialization (Prj-IIR), polynomial regression (Pol-Reg), and eigen-based filters, is reviewed and analyzed. The performance of the filters was assessed by calculating the bias and variance of the mean blood velocity obtained with a standard autocorrelation estimator. Simulation results show that the performance of the Pol-Reg filter is similar to that of Prj-IIR filters. Both can offer accurate estimation of the mean blood flow speed under steady clutter conditions, and their clutter rejection ability can be enhanced by increasing the ensemble size of the Doppler vector. Eigen-based filters can effectively remove the non-stationary clutter component and further improve the estimation accuracy for low-speed blood flow signals. There is also no significant increase in computational complexity for eigen-based filters when the ensemble size is less than 10.
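
    The polynomial regression (Pol-Reg) wall filter mentioned above projects each slow-time Doppler ensemble onto a low-order polynomial basis and subtracts the projection, removing slowly varying tissue clutter. A compact sketch assuming one real-valued ensemble per pixel (the polynomial order and signal model are illustrative):

```python
import numpy as np

def polyreg_wall_filter(ensemble, order=2):
    """Remove clutter by subtracting the least-squares polynomial fit
    (the tissue component) from a slow-time Doppler ensemble."""
    n = ensemble.shape[-1]
    t = np.linspace(-1.0, 1.0, n)
    basis = np.vander(t, order + 1, increasing=True)    # [1, t, t^2, ...]
    proj = basis @ np.linalg.pinv(basis)                # projection onto the clutter subspace
    clutter = ensemble @ proj.T
    return ensemble - clutter

# Ensemble = slow tissue drift + fast blood signal + noise
n = 12
t = np.arange(n)
rng = np.random.default_rng(3)
tissue = 100.0 + 0.5 * t                  # near-DC clutter, large amplitude
blood = 2.0 * np.cos(2 * np.pi * 0.35 * t)
signal = tissue + blood + 0.1 * rng.normal(size=n)
filtered = polyreg_wall_filter(signal[None, :], order=2)[0]
print(np.round(filtered, 2))
```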

  18. Improving Image Matching by Reducing Surface Reflections Using Polarising Filter Techniques

    Science.gov (United States)

    Conen, N.; Hastedt, H.; Kahmen, O.; Luhmann, T.

    2018-05-01

    In dense stereo matching applications, surface reflections may lead to incorrect measurements and blunders in the resulting point cloud. To overcome the problem of disturbing reflections, polarising filters can be mounted on the camera lens and light source. Reflections in the images can be suppressed by crossing the polarising directions of the filters, leading to homogeneously illuminated images and better matching results. However, the filter may influence the camera's orientation parameters as well as the measuring accuracy. To quantify these effects, a calibration and an accuracy analysis are conducted within a spatial test arrangement according to the German guideline VDI/VDE 2634.1 (2002) using a DSLR with and without a polarising filter. In a second test, the interior orientation is analysed in more detail. The results do not show significant changes of the measuring accuracy in object space and only very small changes of the interior orientation (Δc ≤ 4 μm) with the polarising filter in use. Since in medical applications many tiny reflections are present and impede robust surface measurements, a prototypic trinocular endoscope is equipped with polarising technique. The interior and relative orientation is determined and analysed. The advantage of the polarising technique for medical image matching is shown in an experiment with a moistened pig kidney. The accuracy and completeness of the resulting point cloud can be improved clearly when using polarising filters. Furthermore, an accuracy analysis using a laser triangulation system is performed and the special reflection properties of metallic surfaces are presented.

  19. Evaluation of multichannel Wiener filters applied to fine resolution passive microwave images of first-year sea ice

    Science.gov (United States)

    Full, William E.; Eppler, Duane T.

    1993-01-01

    The effectiveness of multichannel Wiener filters in improving images obtained with passive microwave systems was investigated by applying Wiener filters to passive microwave images of first-year sea ice. Four major parameters which define the filter were varied: the lag or pixel offset between the original and the desired scenes, the filter length, the number of lines in the filter, and the weight applied to the empirical correlation functions. The effect of each variable on the image quality was assessed by visually comparing the results. It was found that the application of multichannel Wiener theory to passive microwave images of first-year sea ice resulted in visually sharper images with enhanced textural features and less high-frequency noise. However, Wiener filters induced a slight blocky grain to the image and could produce a type of ringing along scan lines traversing sharp intensity contrasts.

  20. Mathematical filtering minimizes metallic halation of titanium implants in MicroCT images.

    Science.gov (United States)

    Ha, Jee; Osher, Stanley J; Nishimura, Ichiro

    2013-01-01

    Microcomputed tomography (MicroCT) images containing titanium implants suffer from X-ray scattering artifacts, and the implant surface is critically affected by metallic halation. To reduce the metallic halation artifact, a nonlinear total variation denoising algorithm, such as the Split Bregman algorithm, was applied to the digital data set of MicroCT images. This study demonstrated that the use of a mathematical filter can successfully reduce metallic halation, facilitating the evaluation of osseointegration at the bone-implant interface in the reconstructed images.
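
    Total-variation denoising of the Split Bregman type used here is available off the shelf; a minimal sketch using scikit-image, where the regularisation weight and the simulated slice are stand-ins for the study's MicroCT data:

```python
import numpy as np
from skimage.restoration import denoise_tv_bregman

# Simulated slice: a bright "implant" disk on a darker background with noise
rng = np.random.default_rng(4)
yy, xx = np.mgrid[:128, :128]
slice_ = np.where((yy - 64) ** 2 + (xx - 64) ** 2 < 15 ** 2, 1.0, 0.2)
noisy = slice_ + 0.15 * rng.normal(size=slice_.shape)

# Split Bregman total-variation denoising; a larger weight means less smoothing
denoised = denoise_tv_bregman(noisy, weight=4.0)
print(float(noisy.std()), float(denoised.std()))
```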

  1. Enhancement of noisy EDX HRSTEM spectrum-images by combination of filtering and PCA.

    Science.gov (United States)

    Potapov, Pavel; Longo, Paolo; Okunishi, Eiji

    2017-05-01

    STEM spectrum-imaging with collection of the EDX signal is considered in view of extracting maximum information from very noisy data. It is emphasized that spectrum-images with a weak EDX signal often suffer from information loss in the course of PCA treatment. The loss occurs when the level of random noise exceeds a certain threshold. Weighted PCA, though potentially helpful in isolating meaningful variations from noise, might provoke the complete loss of information in the case of a weak EDX signal. Filtering datasets prior to PCA can improve the situation and recover the lost information. In particular, Gaussian kernel filters are found to be efficient. A new filter useful in the case of sparse atomic-resolution EDX spectrum-images is suggested. Copyright © 2017 Elsevier Ltd. All rights reserved.
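
    A schematic of the filter-then-PCA idea for a noisy spectrum-image, assuming a (rows, cols, channels) array: each energy channel is smoothed with a Gaussian kernel before an ordinary (unweighted) PCA reconstruction. The kernel width and component count are illustrative, and scikit-learn's PCA stands in for the authors' implementation:

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.decomposition import PCA

def denoise_spectrum_image(si, sigma=1.0, n_components=4):
    """Spatially filter each energy channel, then reconstruct the
    spectrum-image from its leading principal components."""
    rows, cols, channels = si.shape
    # Gaussian kernel filtering in the two spatial dimensions only
    filtered = gaussian_filter(si, sigma=(sigma, sigma, 0))
    X = filtered.reshape(rows * cols, channels)
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(X)
    recon = pca.inverse_transform(scores)
    return recon.reshape(rows, cols, channels)

# Toy spectrum-image: two spatial phases with different spectra plus Poisson noise
rng = np.random.default_rng(5)
spectra = np.stack([np.linspace(1, 3, 32), np.linspace(3, 1, 32)])
phase = (np.arange(64)[:, None] < 32).astype(int)          # left/right halves
clean = spectra[np.broadcast_to(phase, (64, 64))]
noisy = rng.poisson(clean).astype(float)
print(denoise_spectrum_image(noisy).shape)
```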

  2. Automatic detection of solar features in HSOS full-disk solar images using guided filter

    Science.gov (United States)

    Yuan, Fei; Lin, Jiaben; Guo, Jingjing; Wang, Gang; Tong, Liyue; Zhang, Xinwei; Wang, Bingxiang

    2018-02-01

    A procedure is introduced for the automatic detection of solar features using full-disk solar images from Huairou Solar Observing Station (HSOS), National Astronomical Observatories of China. In image preprocessing, a median filter is applied to remove noise. A guided filter, introduced here into astronomical target detection for the first time, is adopted to enhance the edges of solar features and restrain the solar limb darkening. Specific features are then detected by the Otsu algorithm and a further thresholding technique. Compared with other automatic detection procedures, our procedure has some advantages such as real-time operation and reliability, as well as no need for local thresholds. It also greatly reduces the amount of computation, benefiting from the efficient guided filter algorithm. The procedure has been tested on a one-month sequence (December 2013) of HSOS full-disk solar images, and the results show that the number of features detected by our procedure is well consistent with manual detection.
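
    The guided filter used above (in the sense of He et al.) can be written with a handful of box filters; below is a minimal gray-scale sketch in which the image acts as its own guide, followed by a simple detail-boost step. The radius, epsilon and gain are illustrative, not the HSOS pipeline's settings:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, src, radius=8, eps=1e-3):
    """Edge-preserving smoothing: the output is a locally linear function
    of the guide image (He et al. guided filter, gray-scale case)."""
    size = 2 * radius + 1
    mean_g = uniform_filter(guide, size)
    mean_s = uniform_filter(src, size)
    corr_gg = uniform_filter(guide * guide, size)
    corr_gs = uniform_filter(guide * src, size)
    var_g = corr_gg - mean_g * mean_g
    cov_gs = corr_gs - mean_g * mean_s
    a = cov_gs / (var_g + eps)
    b = mean_s - a * mean_g
    return uniform_filter(a, size) * guide + uniform_filter(b, size)

# Simple edge enhancement: subtract the smoothed base layer to get a detail
# layer, then add it back with a gain (an illustrative use, not the paper's).
rng = np.random.default_rng(6)
img = np.tile(np.linspace(0.2, 0.8, 256), (256, 1)) + 0.02 * rng.normal(size=(256, 256))
base = guided_filter(img, img, radius=8, eps=1e-2)
enhanced = base + 2.0 * (img - base)
print(float(enhanced.std()) > float(img.std()))
```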

  3. Image restoration technique using median filter combined with decision tree algorithm

    International Nuclear Information System (INIS)

    Sethu, D.; Assadi, H.M.; Hasson, F.N.; Hasson, N.N.

    2007-01-01

    Images are usually corrupted during transmission, principally due to interference in the channel used for transmission. Images can also be impaired by the addition of various forms of noise. Salt-and-pepper noise is commonly used to model such impairment; it can be caused by errors in data transmission, malfunctioning pixel elements in camera sensors, and timing errors in the digitization process. During the filtering of a noisy image, important features such as edges, lines and other fine details embedded in the image tend to blur because of the filtering operation. The enhancement of noisy data, however, is a very critical process because the sharpening operation can significantly increase the noise. In this respect, contrast enhancement is often necessary in order to highlight details that have been blurred. In the proposed approach we aim to develop an image processing technique that can meet the new requirements of high quality and high speed, prevent noise accretion during the sharpening of image details, and compare the images restored via the proposed method with those of other kinds of filters. (author)

  4. Image denoising by sparse 3-D transform-domain collaborative filtering.

    Science.gov (United States)

    Dabov, Kostadin; Foi, Alessandro; Katkovnik, Vladimir; Egiazarian, Karen

    2007-08-01

    We propose a novel image denoising strategy based on an enhanced sparse representation in transform domain. The enhancement of the sparsity is achieved by grouping similar 2-D image fragments (e.g., blocks) into 3-D data arrays which we call "groups." Collaborative filtering is a special procedure developed to deal with these 3-D groups. We realize it using the three successive steps: 3-D transformation of a group, shrinkage of the transform spectrum, and inverse 3-D transformation. The result is a 3-D estimate that consists of the jointly filtered grouped image blocks. By attenuating the noise, the collaborative filtering reveals even the finest details shared by grouped blocks and, at the same time, it preserves the essential unique features of each individual block. The filtered blocks are then returned to their original positions. Because these blocks are overlapping, for each pixel, we obtain many different estimates which need to be combined. Aggregation is a particular averaging procedure which is exploited to take advantage of this redundancy. A significant improvement is obtained by a specially developed collaborative Wiener filtering. An algorithm based on this novel denoising strategy and its efficient implementation are presented in full detail; an extension to color-image denoising is also developed. The experimental results demonstrate that this computationally scalable algorithm achieves state-of-the-art denoising performance in terms of both peak signal-to-noise ratio and subjective visual quality.

  5. Time Domain Filtering of Resolved Images of Sgr A*

    Energy Technology Data Exchange (ETDEWEB)

    Shiokawa, Hotaka; Doeleman, Sheperd S. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Gammie, Charles F. [Department of Physics, University of Illinois, 1110 West Green Street, Urbana, IL 61801 (United States)

    2017-09-01

    The goal of the Event Horizon Telescope (EHT) is to provide spatially resolved images of Sgr A*, the source associated with the Galactic Center black hole. Because Sgr A* varies on timescales that are short compared to an EHT observing campaign, it is interesting to ask whether variability contains information about the structure and dynamics of the accretion flow. In this paper, we introduce “time-domain filtering,” a technique to filter time fluctuating images with specific temporal frequency ranges and to demonstrate the power and usage of the technique by applying it to mock millimeter wavelength images of Sgr A*. The mock image data is generated from the General Relativistic Magnetohydrodynamic (GRMHD) simulation and the general relativistic ray-tracing method. We show that the variability on each line of sight is tightly correlated with a typical radius of emission. This is because disk emissivity fluctuates on a timescale of the order of the local orbital period. Time-domain filtered images therefore reflect the model dependent emission radius distribution, which is not accessible in time-averaged images. We show that, in principle, filtered data have the power to distinguish between models with different black-hole spins, different disk viewing angles, and different disk orientations in the sky.
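
    The time-domain filtering described here amounts to keeping a chosen band of temporal frequencies in each pixel's light curve. A compact sketch over a stack of frames using an FFT along the time axis; the frame rate and pass band below are placeholders, not EHT parameters:

```python
import numpy as np

def time_domain_filter(frames, fps, f_lo, f_hi):
    """Keep only temporal-frequency components in [f_lo, f_hi] Hz
    for every pixel of an image time series (shape: t, y, x)."""
    nt = frames.shape[0]
    spec = np.fft.rfft(frames, axis=0)
    freqs = np.fft.rfftfreq(nt, d=1.0 / fps)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    spec[~mask] = 0.0
    return np.fft.irfft(spec, n=nt, axis=0)

# Toy movie: a static background plus one fluctuating hot spot
rng = np.random.default_rng(7)
t = np.arange(256)
frames = np.ones((256, 32, 32)) * 0.1
frames[:, 16, 20] += 0.5 * np.sin(2 * np.pi * 0.05 * t)     # 0.05 "Hz" at fps = 1
filtered = time_domain_filter(frames, fps=1.0, f_lo=0.03, f_hi=0.08)
print(float(np.abs(filtered[:, 16, 20]).max()), float(np.abs(filtered[:, 0, 0]).max()))
```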

  6. Exploring an optimal wavelet-based filter for cryo-ET imaging.

    Science.gov (United States)

    Huang, Xinrui; Li, Sha; Gao, Song

    2018-02-07

    Cryo-electron tomography (cryo-ET) is one of the most advanced technologies for the in situ visualization of molecular machines by producing three-dimensional (3D) biological structures. However, cryo-ET imaging has two serious disadvantages-low dose and low image contrast-which result in high-resolution information being obscured by noise and image quality being degraded, and this causes errors in biological interpretation. The purpose of this research is to explore an optimal wavelet denoising technique to reduce noise in cryo-ET images. We perform tests using simulation data and design a filter using the optimum selected wavelet parameters (three-level decomposition, level-1 zeroed out, subband-dependent threshold, a soft-thresholding and spline-based discrete dyadic wavelet transform (DDWT)), which we call a modified wavelet shrinkage filter; this filter is suitable for noisy cryo-ET data. When testing using real cryo-ET experiment data, higher quality images and more accurate measures of a biological structure can be obtained with the modified wavelet shrinkage filter processing compared with conventional processing. Because the proposed method provides an inherent advantage when dealing with cryo-ET images, it can therefore extend the current state-of-the-art technology in assisting all aspects of cryo-ET studies: visualization, reconstruction, structural analysis, and interpretation.
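
    A rough analogue of the modified wavelet shrinkage filter can be put together with PyWavelets: a three-level decomposition, the finest detail level zeroed out, and soft thresholding of the remaining detail subbands. The Daubechies wavelet and the threshold rule below are stand-ins for the spline-based DDWT the authors selected:

```python
import numpy as np
import pywt

def wavelet_shrink(img, wavelet="db4", level=3, k=3.0):
    """Denoise by multilevel 2-D wavelet decomposition, zeroing the finest
    detail level and soft-thresholding the coarser detail subbands."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    # robust noise estimate from the finest diagonal subband
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    new_coeffs = [coeffs[0]]
    for i, (ch, cv, cd) in enumerate(coeffs[1:], start=1):
        if i == level:                       # finest level: zero out
            new_coeffs.append(tuple(np.zeros_like(c) for c in (ch, cv, cd)))
        else:                                # soft threshold the coarser details
            thr = k * sigma
            new_coeffs.append(tuple(pywt.threshold(c, thr, mode="soft")
                                    for c in (ch, cv, cd)))
    rec = pywt.waverec2(new_coeffs, wavelet)
    return rec[:img.shape[0], :img.shape[1]]

rng = np.random.default_rng(8)
clean = np.zeros((128, 128)); clean[48:80, 48:80] = 1.0
noisy = clean + 0.4 * rng.normal(size=clean.shape)
print(float(np.abs(wavelet_shrink(noisy) - clean).mean()),
      float(np.abs(noisy - clean).mean()))
```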

  7. COMPARISON OF ULTRASOUND IMAGE FILTERING METHODS BY MEANS OF MULTIVARIABLE KURTOSIS

    Directory of Open Access Journals (Sweden)

    Mariusz Nieniewski

    2017-06-01

    Full Text Available Comparison of the quality of despeckled US medical images is complicated because there is no image of a human body that would be free of speckles and could serve as a reference. A number of various image metrics are currently used for comparison of filtering methods; however, they do not satisfactorily represent the visual quality of images and medical expert’s satisfaction with images. This paper proposes an innovative use of relative multivariate kurtosis for the evaluation of the most important edges in an image. Multivariate kurtosis allows one to introduce an order among the filtered images and can be used as one of the metrics for image quality evaluation. At present there is no method which would jointly consider individual metrics. Furthermore, these metrics are typically defined by comparing the noisy original and filtered images, which is incorrect since the noisy original cannot serve as a golden standard. In contrast to this, the proposed kurtosis is the absolute measure, which is calculated independently of any reference image and it agrees with the medical expert’s satisfaction to a large extent. The paper presents a numerical procedure for calculating kurtosis and describes results of such calculations for a computer-generated noisy image, images of a general purpose phantom and a cyst phantom, as well as real-life images of thyroid and carotid artery obtained with SonixTouch ultrasound machine. 16 different methods of image despeckling are compared via kurtosis. The paper shows that visually more satisfactory despeckling results are associated with higher kurtosis, and to a certain degree kurtosis can be used as a single metric for evaluation of image quality.
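
    Mardia's multivariate kurtosis, the statistic underlying the comparison above, is the mean of squared Mahalanobis distances of the samples to their mean. A minimal sketch; how the feature vectors are extracted from the most important image edges is not reproduced here and is an assumption of the example:

```python
import numpy as np

def mardia_kurtosis(X):
    """Mardia's multivariate kurtosis b_{2,d} of samples X (n, d):
    the mean of squared Mahalanobis distances to the sample mean."""
    X = np.asarray(X, dtype=np.float64)
    mu = X.mean(axis=0)
    S = np.cov(X, rowvar=False, bias=True)
    Sinv = np.linalg.inv(S)
    diffs = X - mu
    md2 = np.einsum("ij,jk,ik->i", diffs, Sinv, diffs)   # squared Mahalanobis distances
    return float(np.mean(md2 ** 2))

# For multivariate normal data, b_{2,d} is close to d(d+2) (= 8 for d = 2)
rng = np.random.default_rng(9)
X = rng.normal(size=(5000, 2))
print(mardia_kurtosis(X))
```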

  8. Median filters as a tool to determine dark noise thresholds in high resolution smartphone image sensors for scientific imaging

    Science.gov (United States)

    Igoe, Damien P.; Parisi, Alfio V.; Amar, Abdurazaq; Rummenie, Katherine J.

    2018-01-01

    An evaluation of the use of median filters in the reduction of dark noise in smartphone high resolution image sensors is presented. The Sony Xperia Z1 employed has a maximum image sensor resolution of 20.7 Mpixels, with each pixel having a side length of just over 1 μm. Due to the large number of photosites, this provides an image sensor with very high sensitivity but also makes them prone to noise effects such as hot-pixels. Similar to earlier research with older models of smartphone, no appreciable temperature effects were observed in the overall average pixel values for images taken in ambient temperatures between 5 °C and 25 °C. In this research, hot-pixels are defined as pixels with intensities above a specific threshold. The threshold is determined using the distribution of pixel values of a set of images with uniform statistical properties associated with the application of median-filters of increasing size. An image with uniform statistics was employed as a training set from 124 dark images, and the threshold was determined to be 9 digital numbers (DN). The threshold remained constant for multiple resolutions and did not appreciably change even after a year of extensive field use and exposure to solar ultraviolet radiation. Although the temperature effects' uniformity masked an increase in hot-pixel occurrences, the total number of occurrences represented less than 0.1% of the total image. Hot-pixels were removed by applying a median filter, with an optimum filter size of 7 × 7; similar trends were observed for four additional smartphone image sensors used for validation. Hot-pixels were also reduced by decreasing image resolution. The method outlined in this research provides a methodology to characterise the dark noise behavior of high resolution image sensors for use in scientific investigations, especially as pixel sizes decrease.
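
    A simple rendering of the dark-frame workflow described above: pixels exceeding the dark-noise threshold are flagged as hot and replaced by a 7 × 7 median. The 9 DN threshold is taken from the record; the synthetic dark frame is a stand-in for real sensor data:

```python
import numpy as np
from scipy.ndimage import median_filter

def remove_hot_pixels(dark_frame, threshold_dn=9, size=7):
    """Flag pixels above the dark-noise threshold and replace them
    with the local median (size x size window)."""
    hot = dark_frame > threshold_dn
    cleaned = dark_frame.copy()
    cleaned[hot] = median_filter(dark_frame, size=size)[hot]
    return cleaned, hot

# Synthetic dark frame: low-level noise plus a few injected hot pixels
rng = np.random.default_rng(10)
dark = rng.poisson(2.0, size=(512, 512)).astype(np.int32)
idx = rng.integers(0, 512, size=(20, 2))
dark[idx[:, 0], idx[:, 1]] = 200
cleaned, hot = remove_hot_pixels(dark)
print(int(hot.sum()), int(cleaned.max()))
```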

  9. Computationally efficient video restoration for Nyquist sampled imaging sensors combining an affine-motion-based temporal Kalman filter and adaptive Wiener filter.

    Science.gov (United States)

    Rucci, Michael; Hardie, Russell C; Barnard, Kenneth J

    2014-05-01

    In this paper, we present a computationally efficient video restoration algorithm to address both blur and noise for a Nyquist sampled imaging system. The proposed method utilizes a temporal Kalman filter followed by a correlation-model based spatial adaptive Wiener filter (AWF). The Kalman filter employs an affine background motion model and novel process-noise variance estimate. We also propose and demonstrate a new multidelay temporal Kalman filter designed to more robustly treat local motion. The AWF is a spatial operation that performs deconvolution and adapts to the spatially varying residual noise left in the Kalman filter stage. In image areas where the temporal Kalman filter is able to provide significant noise reduction, the AWF can be aggressive in its deconvolution. In other areas, where less noise reduction is achieved with the Kalman filter, the AWF balances the deconvolution with spatial noise reduction. In this way, the Kalman filter and AWF work together effectively, but without the computational burden of full joint spatiotemporal processing. We also propose a novel hybrid system that combines a temporal Kalman filter and BM3D processing. To illustrate the efficacy of the proposed methods, we test the algorithms on both simulated imagery and video collected with a visible camera.

  10. Precision of quantitative computed tomography texture analysis using image filtering: A phantom study for scanner variability.

    Science.gov (United States)

    Yasaka, Koichiro; Akai, Hiroyuki; Mackin, Dennis; Court, Laurence; Moros, Eduardo; Ohtomo, Kuni; Kiryu, Shigeru

    2017-05-01

    Quantitative computed tomography (CT) texture analyses of images with and without filtration are gaining attention as a means to capture the heterogeneity of tumors. The aim of this study was to investigate how quantitative texture parameters using image filtering vary among different CT scanners, using a phantom developed for radiomics studies. A phantom, consisting of 10 different cartridges with various textures, was scanned under 6 different scanning protocols using four CT scanners from four different vendors. CT texture analyses were performed for both unfiltered images and filtered images (using a Laplacian of Gaussian spatial band-pass filter) featuring fine, medium, and coarse textures. Forty-five regions of interest were placed for each cartridge (x) in a specific scan image set (y), and the average of the texture values (T(x,y)) was calculated. The interquartile range (IQR) of T(x,y) among the 6 scans was calculated for a specific cartridge (IQR(x)), while the IQR of T(x,y) among the 10 cartridges was calculated for a specific scan (IQR(y)), and the median IQR(y) was then calculated for the 6 scans (as the control IQR, IQRc). The median of their quotient (IQR(x)/IQRc) among the 10 cartridges was defined as the variability index (VI). The VI was relatively small for the mean in unfiltered images (0.011) and for standard deviation (0.020-0.044) and entropy (0.040-0.044) in filtered images. Skewness and kurtosis in filtered images featuring medium and coarse textures were relatively variable across different CT scanners, with VIs of 0.638-0.692 and 0.430-0.437, respectively. Thus, some quantitative CT texture parameters are robust across different scanners while others are variable, and the behavior of these parameters should be taken into consideration.
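
    The filtration referred to above is a Laplacian-of-Gaussian band-pass at several widths, followed by first-order texture statistics and an IQR-based variability index. A compact sketch with SciPy; the sigma values standing in for fine/medium/coarse textures and the synthetic T(x,y) table are assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_laplace
from scipy.stats import iqr, skew, kurtosis

def log_texture_features(roi, sigmas=(1.0, 2.5, 5.0)):
    """First-order texture statistics of an ROI after Laplacian-of-Gaussian
    band-pass filtering at fine/medium/coarse widths."""
    feats = {}
    for s in sigmas:
        f = gaussian_laplace(roi.astype(np.float64), sigma=s)
        feats[s] = {"mean": f.mean(), "sd": f.std(),
                    "skewness": skew(f.ravel()), "kurtosis": kurtosis(f.ravel())}
    return feats

def variability_index(T):
    """VI = median over cartridges of (IQR across scans / control IQR),
    where the control IQR is the median IQR across cartridges."""
    T = np.asarray(T)                       # shape (n_scans, n_cartridges)
    iqr_x = iqr(T, axis=0)                  # spread across scans, per cartridge
    iqr_c = np.median(iqr(T, axis=1))       # control IQR across cartridges
    return float(np.median(iqr_x / iqr_c))

rng = np.random.default_rng(11)
roi = rng.normal(size=(32, 32))
print(sorted(log_texture_features(roi)[2.5].keys()))
T = rng.normal(loc=np.linspace(0, 9, 10), scale=0.1, size=(6, 10))
print(variability_index(T))
```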

  11. Evolutionary Cellular Automata for Image Segmentation and Noise Filtering Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Sihem SLATNIA

    2011-01-01

    Full Text Available We use an evolutionary process to seek a specialized set of rules, among a wide range of rules, to be used by Cellular Automata (CA) for a range of tasks: extracting edges in a given gray or colour image, and noise filtering applied to black-and-white images. This best set of local rules determines the future state of the CA in an asynchronous way. The Genetic Algorithm (GA) is applied to search for the best CA rules that can realize the best edge detection and noise filtering.

  12. Evolutionary Cellular Automata for Image Segmentation and Noise Filtering Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Okba Kazar

    2011-01-01

    Full Text Available We use an evolutionary process to seek a specialized set of rules, among a wide range of rules, to be used by Cellular Automata (CA) for a range of tasks: extracting edges in a given gray or colour image, and noise filtering applied to black-and-white images. This best set of local rules determines the future state of the CA in an asynchronous way. The Genetic Algorithm (GA) is applied to search for the best CA rules that can realize the best edge detection and noise filtering.

  13. Tin-filter enhanced dual-energy-CT: image quality and accuracy of CT numbers in virtual noncontrast imaging.

    Science.gov (United States)

    Kaufmann, Sascha; Sauter, Alexander; Spira, Daniel; Gatidis, Sergios; Ketelsen, Dominik; Heuschmid, Martin; Claussen, Claus D; Thomas, Christoph

    2013-05-01

    To measure and compare the objective image quality of true noncontrast (TNC) images with virtual noncontrast (VNC) images acquired by tin-filter-enhanced, dual-source, dual-energy computed tomography (DECT) of the upper abdomen. Sixty-three patients received unenhanced abdominal CT and enhanced abdominal DECT (100/140 kV with tin filter) in the portal-venous phase. VNC images were calculated from the DECT datasets using commercially available software. The mean attenuation of relevant tissues and image quality were compared between the TNC and VNC images. Image quality was rated objectively by measuring image noise and the sharpness of object edges using custom-designed software. Measurements were compared using Student's two-tailed t-test. Correlation coefficients for tissue attenuation measurements between TNC and VNC were calculated and the relative deviations were illustrated using Bland-Altman plots. Mean attenuation differences between TNC and VNC (HUTNC - HUVNC) image sets were as follows: right liver lobe -4.94 Hounsfield units (HU), left liver lobe -3.29 HU, vena cava -2.19 HU, spleen -7.46 HU, pancreas 1.29 HU, fat -11.14 HU, aorta 1.29 HU, bone marrow 36.83 HU (all P VNC and TNC series were observed for liver, vena portae, kidneys, pancreas, muscle and bone marrow (Pearson's correlation coefficient ≥0.75). Mean image noise was significantly higher in TNC images (P VNC and TNC images (P = .19). The Hounsfield units in VNC images closely resemble those in TNC images in the majority of the organs of the upper abdomen (kidneys, liver, pancreas). In spleen and fat, Hounsfield numbers in VNC images tend to be higher than in TNC images. VNC images show low image noise and satisfactory edge sharpness. Other criteria of image quality and the depiction of certain lesions need to be evaluated additionally. Copyright © 2013 AUR. Published by Elsevier Inc. All rights reserved.

  14. Blurred image restoration using knife-edge function and optimal window Wiener filtering

    Science.gov (United States)

    Zhou, Shudao; Yan, Wei

    2018-01-01

    Motion blur in images is usually modeled as the convolution of a point spread function (PSF) and the original image represented as pixel intensities. The knife-edge function can be used to model various types of motion-blurs, and hence it allows for the construction of a PSF and accurate estimation of the degradation function without knowledge of the specific degradation model. This paper addresses the problem of image restoration using a knife-edge function and optimal window Wiener filtering. In the proposed method, we first calculate the motion-blur parameters and construct the optimal window. Then, we use the detected knife-edge function to obtain the system degradation function. Finally, we perform Wiener filtering to obtain the restored image. Experiments show that the restored image has improved resolution and contrast parameters with clear details and no discernible ringing effects. PMID:29377950
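
    The restoration step is frequency-domain Wiener filtering with the estimated PSF; a textbook sketch assuming the PSF is already known (the knife-edge estimation and the optimal-window construction from the paper are not reproduced, and the noise-to-signal constant is arbitrary):

```python
import numpy as np

def wiener_deconvolve(blurred, psf, k=0.01):
    """Frequency-domain Wiener filter: F = conj(H) G / (|H|^2 + K),
    with K acting as an assumed-constant noise-to-signal ratio."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    F = np.conj(H) * G / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(F))

def motion_psf(length=9, shape=(9, 9)):
    """Simple horizontal motion-blur PSF of the given length."""
    psf = np.zeros(shape)
    psf[shape[0] // 2, :length] = 1.0 / length
    return psf

rng = np.random.default_rng(12)
img = np.zeros((128, 128)); img[40:90, 40:90] = 1.0
psf = motion_psf()
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf, s=img.shape)))
blurred += 0.01 * rng.normal(size=img.shape)
restored = wiener_deconvolve(blurred, psf, k=0.005)
print(float(np.abs(restored - img).mean()) < float(np.abs(blurred - img).mean()))
```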

  15. Pornographic image recognition and filtering using incremental learning in compressed domain

    Science.gov (United States)

    Zhang, Jing; Wang, Chao; Zhuo, Li; Geng, Wenhao

    2015-11-01

    With the rapid development and popularity of the network, the openness, anonymity, and interactivity of networks have led to the spread and proliferation of pornographic images on the Internet, which have done great harm to adolescents' physical and mental health. With the establishment of image compression standards, pornographic images are mainly stored with compressed formats. Therefore, how to efficiently filter pornographic images is one of the challenging issues for information security. A pornographic image recognition and filtering method in the compressed domain is proposed by using incremental learning, which includes the following steps: (1) low-resolution (LR) images are first reconstructed from the compressed stream of pornographic images, (2) visual words are created from the LR image to represent the pornographic image, and (3) incremental learning is adopted to continuously adjust the classification rules to recognize the new pornographic image samples after the covering algorithm is utilized to train and recognize the visual words in order to build the initial classification model of pornographic images. The experimental results show that the proposed pornographic image recognition method using incremental learning has a higher recognition rate as well as costing less recognition time in the compressed domain.

  16. STRUCTURE TENSOR IMAGE FILTERING USING RIEMANNIAN L1 AND L∞ CENTER-OF-MASS

    Directory of Open Access Journals (Sweden)

    Jesus Angulo

    2014-06-01

    Full Text Available Structure tensor images are obtained by a Gaussian smoothing of the dyadic product of the gradient image. These images give at each pixel an n×n symmetric positive definite matrix SPD(n), representing the local orientation and the edge information. Processing such images requires appropriate algorithms working on the Riemannian manifold of SPD(n) matrices. This contribution deals with structure tensor image filtering based on Lp geometric averaging. In particular, the L1 center-of-mass (Riemannian median or Fermat-Weber point) and the L∞ center-of-mass (Riemannian circumcenter) can be obtained for structure tensors using recently proposed algorithms. Our contribution in this paper is to study the interest of L1 and L∞ Riemannian estimators for structure tensor image processing. In particular, we compare both for two image analysis tasks: (i) structure tensor image denoising; (ii) anomaly detection in structure tensor images.

  17. Image edges detection through B-Spline filters

    International Nuclear Information System (INIS)

    Mastropiero, D.G.

    1997-01-01

    B-Spline signal processing was used to detect the edges of a digital image. This technique is based upon processing the image in the Spline transform domain, instead of doing so in the space domain (classical processing). The transformation to the Spline transform domain means finding the real coefficients that make it possible to interpolate the grey levels of the original image with a B-Spline polynomial. There are basically two methods of carrying out this interpolation, which give rise to two different Spline transforms: an exact interpolation of the grey values (direct Spline transform), and an approximated interpolation (smoothing Spline transform). The latter results in a higher smoothness of the grey distribution function defined by the Spline transform coefficients, and is carried out with the aim of obtaining an edge detection algorithm with higher immunity to noise. Finally, the transformed image was processed in order to detect the edges of the original image (the gradient method was used), and the results of the three methods (classical, direct Spline transform and smoothing Spline transform) were compared. As expected, the smoothing Spline transform technique produced a detection algorithm more immune to external noise. On the other hand, the direct Spline transform technique emphasizes the edges even more than the classical method. As far as computing time is concerned, the classical method is clearly the fastest one, and may be applied whenever the presence of noise is not important and whenever edges with high detail are not required in the final image. (author). 9 refs., 17 figs., 1 tab

  18. Implementing the Frequency Filtering into a Dictionary Image

    International Nuclear Information System (INIS)

    Dudek, G.; Borys, P.

    2011-01-01

    In our previous research we proposed a new method for image compression based on the LZ77 dictionary algorithm. We introduced two modifications: inaccurate color matching and noise acceptance. Experimental results presented in that paper showed that the new method of image compression gives promising results as compared with the original LZ77 dictionary algorithm. In this paper, we propose to supplement the previous algorithm with frequency content reduction. To this end we propose interpolation-based scaling and compression along the vertical axis. The obtained results are compared to those of the previously proposed method. (authors)

  19. Noninvasive mapping of water diffusional exchange in the human brain using filter-exchange imaging.

    Science.gov (United States)

    Nilsson, Markus; Lätt, Jimmy; van Westen, Danielle; Brockstedt, Sara; Lasič, Samo; Ståhlberg, Freddy; Topgaard, Daniel

    2013-06-01

    We present the first in vivo application of the filter-exchange imaging protocol for diffusion MRI. The protocol allows noninvasive mapping of the rate of water exchange between microenvironments with different self-diffusivities, such as the intracellular and extracellular spaces in tissue. Since diffusional water exchange across the cell membrane is a fundamental process in human physiology and pathophysiology, clinically feasible and noninvasive imaging of the water exchange rate would offer new means to diagnose disease and monitor treatment response in conditions such as cancer and edema. The in vivo use of filter-exchange imaging was demonstrated by studying the brain of five healthy volunteers and one intracranial tumor (meningioma). Apparent exchange rates in white matter range from 0.8±0.08 s⁻¹ in the internal capsule, to 1.6±0.11 s⁻¹ for frontal white matter, indicating that low values are associated with high myelination. Solid tumor displayed values of up to 2.9±0.8 s⁻¹. In white matter, the apparent exchange rate values suggest intra-axonal exchange times in the order of seconds, confirming the slow exchange assumption in the analysis of diffusion MRI data. We propose that filter-exchange imaging could be used clinically to map the water exchange rate in pathologies. Filter-exchange imaging may also be valuable for evaluating novel therapies targeting the function of aquaporins. Copyright © 2012 Wiley Periodicals, Inc.

  20. Coupled-spin filtered MR imaging in a low field

    International Nuclear Information System (INIS)

    Baudouin, C.J.; Bryant, D.J.; Coutts, G.A.; Bydder, G.M.; Young, I.R.

    1990-01-01

    This paper investigates the use of an editing method of imaging using spin-echo sequences with differing radio-frequency (RF) pulses for lipid imaging in poor fields and to compare it with solvent-suppression methods. A technique of echo difference imaging (EDI) has been described in which two data sets are acquired: a normal spin-echo sequence (90-180) and a 90-90 spin-echo sequence. The intrinsic signal of uncoupled spins in the EDI method is one-half that of the conventional sequence, so that subtracting twice the EDI signal from the conventional signal should result in signal cancellation. With coupled spins, the application of the second 90 degrees pulse results in coherence transfer, and echo magnitude will not be one-half that of the 90-180 echo. This method of lipid imaging may be less vulnerable to field inhomogeneity than are solvent-suppression methods. Phantom and in vivo studies were performed at 0.15 T (TE = 44 msec and various TRs)

  1. Image enhancement filters significantly improve reading performance for low vision observers

    Science.gov (United States)

    Lawton, T. B.

    1992-01-01

    As people age, so do their photoreceptors; many photoreceptors in central vision stop functioning when a person reaches their late sixties or early seventies. Low vision observers with losses in central vision, those with age-related maculopathies, were studied. Low vision observers no longer see high spatial frequencies, being unable to resolve fine edge detail. We developed image enhancement filters to compensate for the low vision observer's losses in contrast sensitivity to intermediate and high spatial frequencies. The filters work by boosting the amplitude of the less visible intermediate spatial frequencies relative to the lower spatial frequencies. These image enhancement filters not only reduce the magnification needed for reading by up to 70 percent, but they also increase the observer's reading speed by 2-4 times. A summary of this research is presented.

  2. Image classification using multiscale information fusion based on saliency driven nonlinear diffusion filtering.

    Science.gov (United States)

    Hu, Weiming; Hu, Ruiguang; Xie, Nianhua; Ling, Haibin; Maybank, Stephen

    2014-04-01

    In this paper, we propose saliency driven image multiscale nonlinear diffusion filtering. The resulting scale space in general preserves or even enhances semantically important structures such as edges, lines, or flow-like structures in the foreground, and inhibits and smoothes clutter in the background. The image is classified using multiscale information fusion based on the original image, the image at the final scale at which the diffusion process converges, and the image at a midscale. Our algorithm emphasizes the foreground features, which are important for image classification. The background image regions, whether considered as contexts of the foreground or noise to the foreground, can be globally handled by fusing information from different scales. Experimental tests of the effectiveness of the multiscale space for the image classification are conducted on the following publicly available datasets: 1) the PASCAL 2005 dataset; 2) the Oxford 102 flowers dataset; and 3) the Oxford 17 flowers dataset, with high classification rates.

  3. Extended Kalman filtering for continuous volumetric MR-temperature imaging.

    Science.gov (United States)

    Denis de Senneville, Baudouin; Roujol, Sébastien; Hey, Silke; Moonen, Chrit; Ries, Mario

    2013-04-01

    Real time magnetic resonance (MR) thermometry has evolved into the method of choice for the guidance of high-intensity focused ultrasound (HIFU) interventions. For this role, MR-thermometry should preferably have a high temporal and spatial resolution and allow observing the temperature over the entire targeted area and its vicinity with a high accuracy. In addition, the precision of real time MR-thermometry for therapy guidance is generally limited by the available signal-to-noise ratio (SNR) and the influence of physiological noise. MR-guided HIFU would benefit from large-coverage volumetric temperature maps, including characterization of volumetric heating trajectories as well as near- and far-field heating. In this paper, continuous volumetric MR-temperature monitoring was obtained as follows. The targeted area was continuously scanned during the heating process by a multi-slice sequence. Measured data and a priori knowledge of 3-D data derived from a forecast based on a physical model were combined using an extended Kalman filter (EKF). The proposed reconstruction improved the temperature measurement resolution and precision while maintaining guaranteed output accuracy. The method was evaluated experimentally ex vivo on a phantom, and in vivo on a porcine kidney, using HIFU heating. In the in vivo experiment, it allowed the reconstruction from a spatio-temporally under-sampled data set (with an update rate for each voxel of 1.143 s) to a 3-D dataset covering a field of view of 142.5×285×54 mm³ with a voxel size of 3×3×6 mm³ and a temporal resolution of 0.127 s. The method also provided noise reduction, while having a minimal impact on accuracy and latency.
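
    To make the fusion step above concrete, the following sketch applies one linear Kalman correction that blends a model-based temperature forecast with a sparse, noisy slice measurement. It is a minimal illustration in Python with numpy: the state size, observation operator and noise covariances are assumed values, not the physical heating model or the extended (nonlinear) formulation used by the authors.

        import numpy as np

        def kalman_update(x_pred, P_pred, z, H, R):
            """One Kalman correction step: blend forecast (x_pred, P_pred) with measurement z."""
            S = H @ P_pred @ H.T + R                 # innovation covariance
            K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
            x = x_pred + K @ (z - H @ x_pred)        # corrected state (here: a temperature map)
            P = (np.eye(len(x_pred)) - K @ H) @ P_pred
            return x, P

        # Toy example: 6-voxel temperature state, only voxels 0-2 are measured this time step.
        rng = np.random.default_rng(0)
        x_pred = np.full(6, 37.0)                    # forecast from a (hypothetical) heating model
        P_pred = np.eye(6) * 4.0                     # forecast uncertainty
        H = np.zeros((3, 6)); H[0, 0] = H[1, 1] = H[2, 2] = 1.0   # partial observation operator
        z = np.array([41.2, 40.8, 39.5]) + rng.normal(0, 0.5, 3)  # noisy thermometry samples
        R = np.eye(3) * 0.25                         # measurement noise covariance
        x_est, P_est = kalman_update(x_pred, P_pred, z, H, R)
        print(np.round(x_est, 2))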

  4. Influence of different anode/filter combination on radiation dose and image quality in digital mammography

    International Nuclear Information System (INIS)

    Liu Jie; Liu Peifang; Zhang Lianlian; Ma Wenjuan

    2013-01-01

    Objective: To explore the effect of different anode/filter combinations on radiation dose and image quality in digital mammography, so as to choose the optimal anode/filter combination that reduces radiation injury without sacrificing image quality. Methods: Mammography accreditation phantoms with thicknesses from 1.6 cm to 8.6 cm were used to simulate human breast tissue. The same exposure conditions, pressure and compression thickness, and different anode/filter combinations were employed under the automatic and manual exposure modes. The image kV, mAs, pressure, filter, average glandular dose (AGD) and contrast-to-noise ratio (CNR) were recorded, and the figure of merit (FOM) was calculated. SPSS 17.0 and one-way analysis of variance were used for the statistical analysis. Results: As the phantom thickness increased, the AGD values acquired with the three anode/filter combinations Mo/Mo, Mo/Rh and W/Ag increased, while the CNR and FOM values decreased. The AGD, CNR and FOM values acquired for the phantoms of different thickness with the three anode/filter combinations were statistically different (P=0.000, respectively). The AGD values of Mo/Mo were lowest. For phantom thicknesses of 1.6 cm-2.6 cm the FOMs of Mo/Rh were lowest, and for phantom thicknesses of 3.6 cm-8.6 cm the FOMs of W/Ag were lowest. Conclusion: For phantom thicknesses of 1.6 cm-2.6 cm and 3.6 cm-8.6 cm, the Mo/Rh and W/Ag combinations, respectively, can achieve the highest FOM and provide the best image quality at a low radiation dose. (authors)
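
    For readers unfamiliar with the figure of merit, it is commonly defined in mammography dose optimisation as the CNR squared divided by the average glandular dose, so a higher FOM means more contrast-to-noise per unit dose. The short sketch below computes it for a few hypothetical readings; the definition is the common one and the numbers are made up, not values from this study.

        def figure_of_merit(cnr, agd_mGy):
            """FOM commonly used in mammography dose optimisation: CNR^2 per unit average glandular dose."""
            return cnr ** 2 / agd_mGy

        # Hypothetical readings for one phantom thickness: (CNR, AGD in mGy). Illustrative only.
        readings = {"Mo/Mo": (6.1, 1.9), "Mo/Rh": (5.8, 1.5), "W/Ag": (5.5, 1.2)}
        for combo, (cnr, agd) in readings.items():
            print(f"{combo}: FOM = {figure_of_merit(cnr, agd):.1f}")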

  5. 3D early embryogenesis image filtering by nonlinear partial differential equations.

    Science.gov (United States)

    Krivá, Z; Mikula, K; Peyriéras, N; Rizzi, B; Sarti, A; Stasová, O

    2010-08-01

    We present nonlinear diffusion equations, numerical schemes to solve them and their application for filtering 3D images obtained from laser scanning microscopy (LSM) of living zebrafish embryos, with the goal of identifying the optimal filtering method and its parameters. In large scale applications dealing with the analysis of 3D+time embryogenesis images, an important objective is a correct detection of the number and position of cell nuclei, yielding the spatio-temporal cell lineage tree of embryogenesis. The filtering is the first and necessary step of the image analysis chain and must lead to correct results, removing the noise, sharpening the nuclei edges and correcting the acquisition errors related to spuriously connected subregions. In this paper we study such properties for the regularized Perona-Malik model and for the generalized mean curvature flow equations in the level-set formulation. A comparison with other nonlinear diffusion filters, like tensor anisotropic diffusion and Beltrami flow, is also included. All numerical schemes are based on the same discretization principles, i.e. the finite volume method in space and a semi-implicit scheme in time, for solving nonlinear partial differential equations. These numerical schemes are unconditionally stable, fast and naturally parallelizable. The filtering results are evaluated and compared first using the Mean Hausdorff distance between a gold standard and different isosurfaces of the original and filtered data. Then, the number of isosurface connected components in a region of interest (ROI) detected in the original data and after the filtering is compared with the corresponding correct number of nuclei in the gold standard. Such analysis proves the robustness and reliability of the edge preserving nonlinear diffusion filtering for this type of data and leads to finding the optimal filtering parameters for the studied models and numerical schemes. Further comparisons concern the ability to split very close objects which
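
    As a point of reference, a minimal sketch of regularized Perona-Malik diffusion is given below. It uses a simple explicit 2D update with Gaussian pre-smoothing of the image before evaluating the edge-stopping function, whereas the paper relies on semi-implicit finite volume schemes in 3D; the periodic boundary handling and all parameter values here are simplifying assumptions.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def perona_malik(img, n_iter=50, kappa=0.1, dt=0.1, sigma=1.0):
            """Explicit 2D regularized Perona-Malik diffusion (expects an image scaled to [0, 1])."""
            u = img.astype(float).copy()
            for _ in range(n_iter):
                us = gaussian_filter(u, sigma)          # regularization: smooth before taking gradients
                # differences to the four neighbours (np.roll gives periodic boundaries, for brevity)
                dN = np.roll(u, -1, axis=0) - u
                dS = np.roll(u,  1, axis=0) - u
                dE = np.roll(u, -1, axis=1) - u
                dW = np.roll(u,  1, axis=1) - u
                # edge-stopping function evaluated on the pre-smoothed image
                gN = np.exp(-((np.roll(us, -1, axis=0) - us) / kappa) ** 2)
                gS = np.exp(-((np.roll(us,  1, axis=0) - us) / kappa) ** 2)
                gE = np.exp(-((np.roll(us, -1, axis=1) - us) / kappa) ** 2)
                gW = np.exp(-((np.roll(us,  1, axis=1) - us) / kappa) ** 2)
                u += dt * (gN * dN + gS * dS + gE * dE + gW * dW)
            return u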

  6. Effect of Post-Reconstruction Gaussian Filtering on Image Quality and Myocardial Blood Flow Measurement with N-13 Ammonia PET

    Directory of Open Access Journals (Sweden)

    Hyeon Sik Kim

    2014-10-01

    Full Text Available Objective(s): In order to evaluate the effect of post-reconstruction Gaussian filtering on image quality and myocardial blood flow (MBF) measurement by dynamic N-13 ammonia positron emission tomography (PET), we compared various reconstruction and filtering methods with image characteristics. Methods: Dynamic PET images of three patients with coronary artery disease (male-female ratio of 2:1; age: 57, 53, and 76 years) were reconstructed, using filtered back projection (FBP) and ordered subset expectation maximization (OSEM) methods. OSEM reconstruction consisted of OSEM_2I, OSEM_4I, and OSEM_6I with 2, 4, and 6 iterations, respectively. The images, reconstructed and filtered by Gaussian filters of 5, 10, and 15 mm, were obtained, as well as non-filtered images. Visual analysis of image quality (IQ) was performed using a 3-grade scoring system by 2 independent readers, blinded to the reconstruction and filtering methods of stress images. Then, signal-to-noise ratio (SNR) was calculated by noise and contrast recovery (CR). Stress and rest MBF and coronary flow reserve (CFR) were obtained for each method. IQ scores, stress and rest MBF, and CFR were compared between the methods, using Chi-square and Kruskal-Wallis tests. Results: In the visual analysis, IQ was significantly higher by 10 mm Gaussian filtering, compared to other sizes of filter (P<0.001 for both readers). However, no significant difference of IQ was found between FBP and various numbers of iterations in OSEM (P=0.923 and 0.855 for readers 1 and 2, respectively). SNR was significantly higher with the 10 mm Gaussian filter. There was a significant difference in stress and rest MBF between several vascular territories. However, CFR was not significantly different according to the various filtering methods. Conclusion: Post-reconstruction Gaussian filtering with a filter size of 10 mm significantly enhances the IQ of N-13 ammonia PET-CT, without changing the results of CFR calculation.

  7. Effect of Post-Reconstruction Gaussian Filtering on Image Quality and Myocardial Blood Flow Measurement with N-13 Ammonia PET

    International Nuclear Information System (INIS)

    Kim, Hyeon Sik; Cho, Sang-Geon; Kim, Ju Han; Kwon, Seong Young; Lee, Byeong-il; Bom, Hee-Seung

    2014-01-01

    In order to evaluate the effect of post-reconstruction Gaussian filtering on image quality and myocardial blood flow (MBF) measurement by dynamic N-13 ammonia positron emission tomography (PET), we compared various reconstruction and filtering methods with image characteristics. Dynamic PET images of three patients with coronary artery disease (male-female ratio of 2:1; age: 57, 53, and 76 years) were reconstructed, using filtered back projection (FBP) and ordered subset expectation maximization (OSEM) methods. OSEM reconstruction consisted of OSEM-2I, OSEM-4I, and OSEM-6I with 2, 4, and 6 iterations, respectively. The images, reconstructed and filtered by Gaussian filters of 5, 10, and 15 mm, were obtained, as well as non-filtered images. Visual analysis of image quality (IQ) was performed using a 3-grade scoring system by 2 independent readers, blinded to the reconstruction and filtering methods of stress images. Then, signal-to-noise ratio (SNR) was calculated by noise and contrast recovery (CR). Stress and rest MBF and coronary flow reserve (CFR) were obtained for each method. IQ scores, stress and rest MBF, and CFR were compared between the methods, using Chi-square and Kruskal-Wallis tests. In the visual analysis, IQ was significantly higher by 10 mm Gaussian filtering, compared to other sizes of filter (P<0.001 for both readers). However, no significant difference of IQ was found between FBP and various numbers of iterations in OSEM (P=0.923 and 0.855 for readers 1 and 2, respectively). SNR was significantly higher with the 10 mm Gaussian filter. There was a significant difference in stress and rest MBF between several vascular territories. However, CFR was not significantly different according to the various filtering methods. Post-reconstruction Gaussian filtering with a filter size of 10 mm significantly enhances the IQ of N-13 ammonia PET-CT, without changing the results of CFR calculation.
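
    Post-reconstruction Gaussian filtering of this kind is straightforward to reproduce; a possible sketch is shown below, assuming the millimetre filter size is specified as a full width at half maximum (FWHM) that must be converted to a sigma in voxel units. The voxel dimensions in the example are placeholders, not those of the scanner used in the study.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def postfilter(volume, fwhm_mm, voxel_mm):
            """Post-reconstruction Gaussian smoothing; the mm filter size is interpreted as a FWHM."""
            # FWHM = 2*sqrt(2*ln 2)*sigma ~ 2.3548*sigma; convert to sigma per axis in voxel units
            sigma_vox = (fwhm_mm / 2.3548) / np.asarray(voxel_mm, dtype=float)
            return gaussian_filter(volume, sigma=sigma_vox)

        vol = np.random.rand(47, 128, 128)          # stand-in for one reconstructed dynamic PET frame
        smoothed = postfilter(vol, fwhm_mm=10.0, voxel_mm=(3.27, 2.0, 2.0))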

  8. Enhancing Perceived Quality of Compressed Images and Video with Anisotropic Diffusion and Fuzzy Filtering

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Korhonen, Jari; Forchhammer, Søren

    2013-01-01

    and subjective results on JPEG compressed images, as well as MJPEG and H.264/AVC compressed video, indicate that the proposed algorithms employing directional and spatial fuzzy filters achieve better artifact reduction than other methods. In particular, robust improvements with H.264/AVC video have been gained...

  9. Acousto-Optic Tunable Filter Hyperspectral Microscope Imaging Method for Characterizing Spectra from Foodborne Pathogens.

    Science.gov (United States)

    The hyperspectral microscope imaging (HMI) method, which provides both spatial and spectral characteristics of samples, can be effective for foodborne pathogen detection. The acousto-optic tunable filter (AOTF)-based HMI method can be used to characterize the spectral properties of biofilms formed by Salmon...

  10. Large-Scale Query-by-Image Video Retrieval Using Bloom Filters

    OpenAIRE

    Araujo, Andre; Chaves, Jason; Lakshman, Haricharan; Angst, Roland; Girod, Bernd

    2016-01-01

    We consider the problem of using image queries to retrieve videos from a database. Our focus is on large-scale applications, where it is infeasible to index each database video frame independently. Our main contribution is a framework based on Bloom filters, which can be used to index long video segments, enabling efficient image-to-video comparisons. Using this framework, we investigate several retrieval architectures, by considering different types of aggregation and different functions to ...

  11. Active filtering applied to radiographic images unfolded by the Richardson-Lucy algorithm

    International Nuclear Information System (INIS)

    Almeida, Gevaldo L. de; Silvani, Maria Ines; Lopes, Ricardo T.

    2011-01-01

    Degradation of images caused by systematic uncertainties can be reduced when one knows the features of the spoiling agent. Typical uncertainties of this kind arise in radiographic images due to the non-zero resolution of the detector used to acquire them, and from the non-punctual character of the source employed in the acquisition, or from the beam divergence when extended sources are used. Both features blur the image, which, instead of a single point, exhibits a spot with a vanishing edge, thus reproducing the point spread function (PSF) of the system. Once this spoiling function is known, an inverse problem approach, involving inversion of matrices, can be used to retrieve the original image. As these matrices are generally ill-conditioned due to statistical fluctuations and truncation errors, iterative procedures should be applied, such as the Richardson-Lucy algorithm. This algorithm has been applied in this work to unfold radiographic images acquired by transmission of thermal neutrons and gamma-rays. After this procedure, the resulting images undergo an active filtering which noticeably improves their final quality at a negligible cost in terms of processing time. The filter ruling the process is based on the matrix of the correction factors for the last iteration of the deconvolution procedure. Synthetic images degraded with a known PSF and subjected to the same treatment have been used as a benchmark to evaluate the soundness of the developed active filtering procedure. The deconvolution and filtering algorithms have been incorporated into a Fortran program, written to deal with real images, generate the synthetic ones and display both. (author)
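
    For reference, the core Richardson-Lucy iteration in its usual textbook form is sketched below in Python for 2-D images; the authors' implementation is a Fortran program and is followed by the active filtering step based on the last-iteration correction factors, which is not reproduced here.

        import numpy as np
        from scipy.signal import fftconvolve

        def richardson_lucy(blurred, psf, n_iter=30, eps=1e-12):
            """Plain Richardson-Lucy deconvolution of a 2-D image (no active filtering step)."""
            psf = psf / psf.sum()                    # normalise the point spread function
            psf_mirror = psf[::-1, ::-1]             # adjoint of the blur operator
            estimate = np.full_like(blurred, blurred.mean(), dtype=float)
            for _ in range(n_iter):
                reblurred = fftconvolve(estimate, psf, mode="same")
                ratio = blurred / np.maximum(reblurred, eps)   # correction factors
                estimate *= fftconvolve(ratio, psf_mirror, mode="same")
            return estimate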

  12. Correlation study of effect of additional filter on radiation dose and image quality in digital mammography

    International Nuclear Information System (INIS)

    Liu Jie; Liu Peifang; Wang Hongbin; Zhang Shuping; Liu Xueou

    2012-01-01

    Objective: To explore the effect of different additional filters on radiation dose and image quality in digital mammography. Methods: A Hologic Selenia digital mammography machine, its post-processing workstations and a 5 M high resolution medical monitor were used in this study. Mammography phantoms with thicknesses from 1.6 cm to 8.6 cm were used to simulate human breast tissue. The same exposure conditions, pressure, compression thickness and anode were employed with the additional filters of Mo and Rh under the automatic and manual exposure modes. The image kV, mAs, pressure, filter, average glandular dose (AGD), entrance surface dose (ESD), signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR) and image score according to ACR criteria were recorded for the two additional filters. A paired sample t test was performed to compare the indices of the Mo and Rh groups using SPSS 17.0. Results: AGD and ESD of both the Rh and Mo groups increased with the thickness of the phantoms. AGD, ESD and their increments for the Rh filter (1.484 ± 1.041, 7.969 ± 7.633, 0.423 ± 0.190 and 3.057 ± 2.139) were lower than those of the Mo filter (1.915 ± 1.301, 12.516 ± 11.632, 0.539 ± 0.246 and 4.731 ± 3.294) for all the phantoms of different thickness (t values were 4.614, 3.209, 3.396 and 3.605, P<0.05). SNR, CNR and image score of both the Rh and Mo groups decreased with the increase of the thickness of the phantoms, with no statistical difference between the groups (P>0.05). Conclusions: Compared with the Mo filter, the Rh filter could reduce the radiation dose, and this advantage is more obvious in thicker phantoms when the same image quality is required. (authors)

  13. An Image Filter Based on Shearlet Transformation and Particle Swarm Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Kai Hu

    2015-01-01

    Full Text Available Digital images are always polluted by noise, which makes data postprocessing difficult. To remove noise and preserve the detail of the image as much as possible, this paper proposes an image filter algorithm that combines the merits of the Shearlet transform and the particle swarm optimization (PSO) algorithm. Firstly, we use the classical Shearlet transform to decompose the noisy image into many subwavelets over multiple scales and orientations. Secondly, we assign a weighting factor to each of the obtained subwavelets. Then, using the classical inverse Shearlet transform, we obtain a composite image composed of those weighted subwavelets. After that, we design a fast, rough evaluation method to estimate the noise level of the new image; using this measure as the fitness function, we adopt PSO to find the optimal weighting factors; after many iterations, the optimal factors and the inverse Shearlet transform yield the best denoised image. Experimental results have shown that the proposed algorithm eliminates noise effectively and yields a good peak signal-to-noise ratio (PSNR).

  14. Predicting perceptual quality of images in realistic scenario using deep filter banks

    Science.gov (United States)

    Zhang, Weixia; Yan, Jia; Hu, Shiyong; Ma, Yang; Deng, Dexiang

    2018-03-01

    Classical image perceptual quality assessment models usually resort to natural scene statistics methods, which are based on the assumption that certain reliable statistical regularities hold on undistorted images and will be corrupted by introduced distortions. However, these models usually fail to accurately predict the degradation severity of images in realistic scenarios, since complex, multiple, and interactive authentic distortions usually appear in them. We propose a quality prediction model based on a convolutional neural network. Quality-aware features extracted from filter banks of multiple convolutional layers are aggregated into the image representation. Furthermore, an easy-to-implement and effective feature selection strategy is used to further refine the image representation, and finally a linear support vector regression model is trained to map the image representation to images' subjective perceptual quality scores. The experimental results on benchmark databases demonstrate the effectiveness and generalizability of the proposed model.

  15. Brain MR Image Restoration Using an Automatic Trilateral Filter With GPU-Based Acceleration.

    Science.gov (United States)

    Chang, Herng-Hua; Li, Cheng-Yuan; Gallogly, Audrey Haihong

    2018-02-01

    Noise reduction in brain magnetic resonance (MR) images has been a challenging and demanding task. This study develops a new trilateral filter that aims to achieve robust and efficient image restoration. Extended from the bilateral filter, the proposed algorithm contains one additional intensity similarity function, which compensates for the unique characteristics of noise in brain MR images. An entropy function adaptive to intensity variations is introduced to regulate the contributions of the weighting components. To hasten the computation, parallel computing based on a graphics processing unit (GPU) strategy is explored, with emphasis on memory allocations and thread distributions. To automate the filtration, image texture feature analysis associated with machine learning is investigated. Among the 98 candidate features, the sequential forward floating selection scheme is employed to acquire the optimal texture features for regularization. Subsequently, a two-stage classifier that consists of support vector machines and artificial neural networks is established to predict the filter parameters for automation. A speedup gain of 757 was reached when processing an entire MR image volume of 256 × 256 × 256 pixels, which completed within 0.5 s. Automatic restoration results revealed high accuracy with an ensemble average relative error of 0.53 ± 0.85% in terms of the peak signal-to-noise ratio. This self-regulating trilateral filter outperformed many state-of-the-art noise reduction methods both qualitatively and quantitatively. We believe that this new image restoration algorithm has potential in many brain MR image processing applications that require speed and automation.
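
    To make the construction concrete, a brute-force bilateral filter is sketched below; the trilateral filter described above multiplies in one further intensity-similarity weight whose exact form is specific to the paper, so it is only indicated by a comment. The window radius and the two sigmas are illustrative values for an 8-bit intensity range.

        import numpy as np

        def bilateral(img, radius=3, sigma_s=2.0, sigma_r=20.0):
            """Brute-force bilateral filter; a trilateral filter would multiply in one more weight."""
            img = img.astype(float)
            pad = np.pad(img, radius, mode="reflect")
            out = np.zeros_like(img)
            ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
            spatial = np.exp(-(xs ** 2 + ys ** 2) / (2 * sigma_s ** 2))   # domain (distance) weight
            H, W = img.shape
            for i in range(H):
                for j in range(W):
                    patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
                    rng_w = np.exp(-((patch - img[i, j]) ** 2) / (2 * sigma_r ** 2))  # range weight
                    w = spatial * rng_w        # a third, noise-adapted similarity term would enter here
                    out[i, j] = np.sum(w * patch) / np.sum(w)
            return out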

  16. IMPROVING IMAGE MATCHING BY REDUCING SURFACE REFLECTIONS USING POLARISING FILTER TECHNIQUES

    Directory of Open Access Journals (Sweden)

    N. Conen

    2018-05-01

    Full Text Available In dense stereo matching applications surface reflections may lead to incorrect measurements and blunders in the resulting point cloud. To overcome the problem of disturbing reflections, polarising filters can be mounted on the camera lens and light source. Reflections in the images can be suppressed by crossing the polarising direction of the filters, leading to homogeneously illuminated images and better matching results. However, the filter may influence the camera's orientation parameters as well as the measuring accuracy. To quantify these effects, a calibration and an accuracy analysis are conducted within a spatial test arrangement according to the German guideline VDI/VDE 2634.1 (2002) using a DSLR with and without a polarising filter. In a second test, the interior orientation is analysed in more detail. The results do not show significant changes of the measuring accuracy in object space and only very small changes of the interior orientation (Δc ≤ 4 μm) with the polarising filter in use. Since in medical applications many tiny reflections are present and impede robust surface measurements, a prototypic trinocular endoscope is equipped with the polarising technique. The interior and relative orientation is determined and analysed. The advantage of the polarising technique for medical image matching is shown in an experiment with a moistened pig kidney. The accuracy and completeness of the resulting point cloud can be improved clearly when using polarising filters. Furthermore, an accuracy analysis using a laser triangulation system is performed and the special reflection properties of metallic surfaces are presented.

  17. Fan-beam and cone-beam image reconstruction via filtering the backprojection image of differentiated projection data

    International Nuclear Information System (INIS)

    Zhuang Tingliang; Leng Shuai; Nett, Brian E; Chen Guanghong

    2004-01-01

    In this paper, a new image reconstruction scheme is presented based on Tuy's cone-beam inversion scheme and its fan-beam counterpart. It is demonstrated that Tuy's inversion scheme may be used to derive a new framework for fan-beam and cone-beam image reconstruction. In this new framework, images are reconstructed via filtering the backprojection image of differentiated projection data. The new framework is mathematically exact and is applicable to a general source trajectory provided the Tuy data sufficiency condition is satisfied. By choosing a piece-wise constant function for one of the components in the factorized weighting function, the filtering kernel is one dimensional, viz. the filtering process is along a straight line. Thus, the derived image reconstruction algorithm is mathematically exact and efficient. In the cone-beam case, the derived reconstruction algorithm is applicable to a large class of source trajectories where the pi-lines or the generalized pi-lines exist. In addition, the new reconstruction scheme survives the super-short scan mode in both the fan-beam and cone-beam cases provided the data are not transversely truncated. Numerical simulations were conducted to validate the new reconstruction scheme for the fan-beam case.

  18. A Filtering Approach for Image-Guided Surgery With a Highly Articulated Surgical Snake Robot.

    Science.gov (United States)

    Tully, Stephen; Choset, Howie

    2016-02-01

    The objective of this paper is to introduce a probabilistic filtering approach to estimate the pose and internal shape of a highly flexible surgical snake robot during minimally invasive surgery. Our approach renders a depiction of the robot that is registered to preoperatively reconstructed organ models to produce a 3-D visualization that can be used for surgical feedback. Our filtering method estimates the robot shape using an extended Kalman filter that fuses magnetic tracker data with kinematic models that define the motion of the robot. Using Lie derivative analysis, we show that this estimation problem is observable, and thus, the shape and configuration of the robot can be successfully recovered with a sufficient number of magnetic tracker measurements. We validate this study with benchtop and in-vivo image-guidance experiments in which the surgical robot was driven along the epicardial surface of a porcine heart. This paper introduces a filtering approach for shape estimation that can be used for image guidance during minimally invasive surgery. The methods being introduced in this paper enable informative image guidance for highly articulated surgical robots, which benefits the advancement of robotic surgery.

  19. An Efficient FPGA Implementation of Optimized Anisotropic Diffusion Filtering of Images

    Directory of Open Access Journals (Sweden)

    Chandrajit Pal

    2016-01-01

    Full Text Available Digital image processing is an exciting area of research with a variety of applications including medical, surveillance security systems, defence, and space applications. Noise removal as a preprocessing step helps to improve the performance of the signal processing algorithms, thereby enhancing image quality. Anisotropic diffusion filtering proposed by Perona and Malik can be used as an edge-preserving smoother, removing high-frequency components of images without blurring their edges. In this paper, we present the FPGA implementation of an edge-preserving anisotropic diffusion filter for digital images. The designed architecture completely replaced the convolution operation and implemented the same using simple arithmetic subtraction of the neighboring intensities within a kernel, preceded by multiple operations in parallel within the kernel. To improve the image reconstruction quality, the diffusion coefficient parameter, responsible for controlling the filtering process, has been properly analyzed. Its signal behavior has been studied by subsequently scaling and differentiating the signal. The hardware implementation of the proposed design shows better performance in terms of reconstruction quality and accelerated performance with respect to its software implementation. It also reduces computation, power consumption, and resource utilization with respect to other related works.

  20. Passive ranging using a filter-based non-imaging method based on oxygen absorption.

    Science.gov (United States)

    Yu, Hao; Liu, Bingqi; Yan, Zongqun; Zhang, Yu

    2017-10-01

    To solve the problem of poor real-time measurement caused by a hyperspectral imaging system and to simplify the design in passive ranging technology based on oxygen absorption spectrum, a filter-based non-imaging ranging method is proposed. In this method, three bandpass filters are used to obtain the source radiation intensities that are located in the oxygen absorption band near 762 nm and the band's left and right non-absorption shoulders, and a photomultiplier tube is used as the non-imaging sensor of the passive ranging system. Range is estimated by comparing the calculated values of band-average transmission due to oxygen absorption, τ_O2, against the predicted curve of τ_O2 versus range. The method is tested under short-range conditions. Accuracy of 6.5% is achieved with the designed experimental ranging system at the range of 400 m.
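
    The ranging principle reduces to simple arithmetic once the three filter channels have been measured: the in-band intensity is divided by the average of the two shoulder intensities to obtain the band-average transmission, which is then matched against the predicted τ_O2-versus-range curve. The sketch below illustrates this inversion with an entirely artificial transmission curve; a real curve would come from an atmospheric oxygen absorption model, not from the simple exponential assumed here.

        import numpy as np

        def estimate_range(i_band, i_left, i_right, ranges_m, tau_curve):
            """Match the measured O2 band transmission against a predicted tau-vs-range curve."""
            tau_meas = i_band / (0.5 * (i_left + i_right))   # in-band intensity vs. shoulder baseline
            # tau_curve is assumed to decrease monotonically with range, so flip it for np.interp
            return np.interp(tau_meas, tau_curve[::-1], ranges_m[::-1])

        # Illustrative predicted curve (not a real oxygen transmission model).
        ranges_m = np.linspace(50, 1000, 200)
        tau_curve = np.exp(-ranges_m / 2000.0)
        print(estimate_range(8.1, 10.0, 9.8, ranges_m, tau_curve))   # ~400 m for these toy readings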

  1. Fractional order integration and fuzzy logic based filter for denoising of echocardiographic image.

    Science.gov (United States)

    Saadia, Ayesha; Rashdi, Adnan

    2016-12-01

    Ultrasound is widely used for imaging due to its cost effectiveness and safety. However, ultrasound images are inherently corrupted with speckle noise, which severely affects the quality of these images and creates difficulty for physicians in diagnosis. To get maximum benefit from ultrasound imaging, image denoising is an essential requirement. To perform image denoising, a two-stage methodology using a fuzzy weighted mean and a fractional integration filter has been proposed in this research work. In stage 1, image pixels are processed by applying a 3 × 3 window around each pixel and fuzzy logic is used to assign weights to the pixels in each window, replacing the central pixel of the window with the weighted mean of all neighboring pixels present in the same window. Noise suppression is achieved by assigning weights to the pixels while preserving edges and other important features of the image. In stage 2, the resultant image is further improved by a fractional order integration filter. The effectiveness of the proposed methodology has been analyzed for standard test images artificially corrupted with speckle noise and real ultrasound B-mode images. Results of the proposed technique have been compared with different state-of-the-art techniques including Lsmv, Wiener, Geometric filter, Bilateral, Non-local means, Wavelet, Perona et al., Total variation (TV), Global Adaptive Fractional Integral Algorithm (GAFIA) and Improved Fractional Order Differential (IFD) model. The comparison has been done on a quantitative and qualitative basis. For the quantitative analysis, different metrics like Peak Signal to Noise Ratio (PSNR), Speckle Suppression Index (SSI), Structural Similarity (SSIM), Edge Preservation Index (β) and Correlation Coefficient (ρ) have been used. Simulations have been done using Matlab. Simulation results on artificially corrupted standard test images and two real echocardiographic images reveal that the proposed method outperforms existing image denoising techniques.
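
    A plausible reading of the stage-1 filter is sketched below: each pixel is replaced by a weighted mean of its 3 × 3 neighbourhood, with weights that decay as the intensity difference from the centre pixel grows. The exponential membership function and its width are assumptions made here for illustration; the paper's exact fuzzy membership function may differ.

        import numpy as np

        def fuzzy_weighted_mean(img, delta=25.0):
            """Stage-1 style filter: replace each pixel by a fuzzy-weighted mean of its 3x3 window.
            The membership (weight vs. intensity difference) is an illustrative assumption."""
            img = img.astype(float)
            pad = np.pad(img, 1, mode="reflect")
            out = np.zeros_like(img)
            H, W = img.shape
            for i in range(H):
                for j in range(W):
                    win = pad[i:i + 3, j:j + 3]
                    w = np.exp(-np.abs(win - img[i, j]) / delta)   # similar pixels get larger weights
                    out[i, j] = np.sum(w * win) / np.sum(w)
            return out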

  2. Image scale measurement with correlation filters in a volume holographic optical correlator

    Science.gov (United States)

    Zheng, Tianxiang; Cao, Liangcai; He, Qingsheng; Jin, Guofan

    2013-08-01

    A search engine containing various target images or different parts of a large scene is of great use for many applications, including object detection, biometric recognition, and image registration. The input image captured in real time is compared with all the template images in the search engine. A volume holographic correlator is one type of such search engine. It performs thousands of comparisons among images at very high speed, with the correlation task accomplished mainly in optics. However, the input target image always contains a scale variation relative to the filtering template images. In that case, the correlation values cannot properly reflect the similarity of the images. It is essential to estimate and eliminate the scale variation of the input target image. The scale measurement can be performed in three domains: spatial, spectral and time. Most methods dealing with the scale factor are based on the spatial or spectral domains. In this paper, a method in the time domain is proposed to measure the scale factor of the input image, called the time-sequential scaled method. The method utilizes the relationship between the scale variation and the correlation value of two images. It sends a few artificially scaled input images to be compared with the template images. The correlation value increases and decreases with increasing scale factor in the intervals 0.8~1 and 1~1.2, respectively. The original scale of the input image can be measured by estimating the largest correlation value obtained by correlating the artificially scaled input images with the template images. The measurement range for the scale is 0.8~4.8. Scale factors beyond 1.2 are measured by scaling the input image by factors of 1/2, 1/3 and 1/4, correlating the artificially scaled input image with the template images, and estimating the new corresponding scale factor inside 0.8~1.2.

  3. Be Foil ''Filter Knee Imaging'' NSTX Plasma with Fast Soft X-ray Camera

    International Nuclear Information System (INIS)

    B.C. Stratton; S. von Goeler; D. Stutman; K. Tritz; L.E. Zakharov

    2005-01-01

    A fast soft x-ray (SXR) pinhole camera has been implemented on the National Spherical Torus Experiment (NSTX). This paper presents observations and describes the Be foil Filter Knee Imaging (FKI) technique for reconstructions of a m/n=1/1 mode on NSTX. The SXR camera has a wide-angle (28°) field of view of the plasma. The camera images nearly the entire diameter of the plasma and a comparable region in the vertical direction. SXR photons pass through a beryllium foil and are imaged by a pinhole onto a P47 scintillator deposited on a fiber optic faceplate. An electrostatic image intensifier demagnifies the visible image by 6:1 to match it to the size of the charge-coupled device (CCD) chip. A pair of lenses couples the image to the CCD chip.

  4. Image-Based Multi-Target Tracking through Multi-Bernoulli Filtering with Interactive Likelihoods

    Directory of Open Access Journals (Sweden)

    Anthony Hoak

    2017-03-01

    Full Text Available We develop an interactive likelihood (ILH) for sequential Monte Carlo (SMC) methods for image-based multiple target tracking applications. The purpose of the ILH is to improve tracking accuracy by reducing the need for data association. In addition, we integrate a recently developed deep neural network for pedestrian detection along with the ILH with a multi-Bernoulli filter. We evaluate the performance of the multi-Bernoulli filter with the ILH and the pedestrian detector in a number of publicly available datasets (2003 PETS INMOVE, Australian Rules Football League (AFL) and TUD-Stadtmitte) using standard, well-known multi-target tracking metrics (optimal sub-pattern assignment (OSPA) and classification of events, activities and relationships for multi-object trackers (CLEAR MOT)). In all datasets, the ILH term increases the tracking accuracy of the multi-Bernoulli filter.

  5. A multi-stage noise adaptive switching filter for extremely corrupted images

    Science.gov (United States)

    Dinh, Hai; Adhami, Reza; Wang, Yi

    2015-07-01

    A multi-stage noise adaptive switching filter (MSNASF) is proposed for the restoration of images extremely corrupted by impulse and impulse-like noise. The filter consists of two steps: noise detection and noise removal. The proposed extrema-based noise detection scheme utilizes the false contouring effect to achieve a better over-detection rate at low noise density. It is adaptive and will detect not only impulse but also impulse-like noise. In the noise removal step, a novel multi-stage filtering scheme is proposed. It replaces each corrupted pixel with the nearest uncorrupted median to preserve details. When compared with other methods, MSNASF provides a better peak signal to noise ratio (PSNR) and structure similarity index (SSIM). A subjective evaluation carried out online also demonstrates that MSNASF yields higher fidelity.
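
    The switching idea, detect first and filter only what was flagged, can be illustrated with the simplified single-stage sketch below: pixels at the intensity extremes are treated as impulses and replaced by the median of the uncorrupted pixels in their neighbourhood. The extrema detector and the single-stage replacement are deliberate simplifications of the multi-stage scheme described above.

        import numpy as np

        def switching_median(img, lo=0, hi=255, radius=1):
            """Simplified switching filter: only pixels flagged as impulses are modified."""
            img = img.astype(float)
            noisy = (img <= lo) | (img >= hi)                  # crude impulse detector (extrema test)
            pad_img = np.pad(img, radius, mode="reflect")
            pad_mask = np.pad(noisy, radius, mode="reflect")
            out = img.copy()
            for i, j in zip(*np.nonzero(noisy)):
                win = pad_img[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
                ok = ~pad_mask[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
                if ok.any():
                    out[i, j] = np.median(win[ok])             # use only uncorrupted neighbours
            return out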

  6. Image-Based Multi-Target Tracking through Multi-Bernoulli Filtering with Interactive Likelihoods.

    Science.gov (United States)

    Hoak, Anthony; Medeiros, Henry; Povinelli, Richard J

    2017-03-03

    We develop an interactive likelihood (ILH) for sequential Monte Carlo (SMC) methods for image-based multiple target tracking applications. The purpose of the ILH is to improve tracking accuracy by reducing the need for data association. In addition, we integrate a recently developed deep neural network for pedestrian detection along with the ILH with a multi-Bernoulli filter. We evaluate the performance of the multi-Bernoulli filter with the ILH and the pedestrian detector in a number of publicly available datasets (2003 PETS INMOVE, Australian Rules Football League (AFL) and TUD-Stadtmitte) using standard, well-known multi-target tracking metrics (optimal sub-pattern assignment (OSPA) and classification of events, activities and relationships for multi-object trackers (CLEAR MOT)). In all datasets, the ILH term increases the tracking accuracy of the multi-Bernoulli filter.

  7. Tunable output-frequency filter algorithm for imaging through scattering media under LED illumination

    Science.gov (United States)

    Zhou, Meiling; Singh, Alok Kumar; Pedrini, Giancarlo; Osten, Wolfgang; Min, Junwei; Yao, Baoli

    2018-03-01

    We present a tunable output-frequency filter (TOF) algorithm to reconstruct the object from noisy experimental data under low-power partially coherent illumination, such as LED, when imaging through scattering media. In the iterative algorithm, we employ Gaussian functions with different filter windows at different stages of iteration process to reduce corruption from experimental noise to search for a global minimum in the reconstruction. In comparison with the conventional iterative phase retrieval algorithm, we demonstrate that the proposed TOF algorithm achieves consistent and reliable reconstruction in the presence of experimental noise. Moreover, the spatial resolution and distinctive features are retained in the reconstruction since the filter is applied only to the region outside the object. The feasibility of the proposed method is proved by experimental results.

  8. Evaluation of an image-based tracking workflow with Kalman filtering for automatic image plane alignment in interventional MRI.

    Science.gov (United States)

    Neumann, M; Cuvillon, L; Breton, E; de Matheli, M

    2013-01-01

    Recently, a workflow for magnetic resonance (MR) image plane alignment based on tracking in real-time MR images was introduced. The workflow is based on a tracking device composed of 2 resonant micro-coils and a passive marker, and allows for tracking of the passive marker in clinical real-time images and automatic (re-)initialization using the microcoils. As the Kalman filter has proven its benefit as an estimator and predictor, it is well suited for use in tracking applications. In this paper, a Kalman filter is integrated in the previously developed workflow in order to predict position and orientation of the tracking device. Measurement noise covariances of the Kalman filter are dynamically changed in order to take into account that, according to the image plane orientation, only a subset of the 3D pose components is available. The improved tracking performance of the Kalman extended workflow could be quantified in simulation results. Also, a first experiment in the MRI scanner was performed but without quantitative results yet.

  9. Real-time MR diffusion tensor and Q-ball imaging using Kalman filtering

    International Nuclear Information System (INIS)

    Poupon, C.; Roche, A.; Dubois, J.; Mangin, J.F.; Poupon, F.

    2008-01-01

    Diffusion magnetic resonance imaging (dMRI) has become an established research tool for the investigation of tissue structure and orientation. In this paper, we present a method for real-time processing of diffusion tensor and Q-ball imaging. The basic idea is to use Kalman filtering framework to fit either the linear tensor or Q-ball model. Because the Kalman filter is designed to be an incremental algorithm, it naturally enables updating the model estimate after the acquisition of any new diffusion-weighted volume. Processing diffusion models and maps during ongoing scans provides a new useful tool for clinicians, especially when it is not possible to predict how long a subject may remain still in the magnet. First, we introduce the general linear models corresponding to the two diffusion tensor and analytical Q-ball models of interest. Then, we present the Kalman filtering framework and we focus on the optimization of the diffusion orientation sets in order to speed up the convergence of the online processing. Last, we give some results on a healthy volunteer for the online tensor and the Q-ball model, and we make some comparisons with the conventional offline techniques used in the literature. We could achieve full real-time for diffusion tensor imaging and deferred time for Q-ball imaging, using a single workstation. (authors)
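
    For the tensor case, the measurement model is linear in the six unknown tensor elements, so the incremental update reduces to a recursive least squares style Kalman step applied voxel-wise after each new diffusion-weighted volume. The sketch below shows that update; the b-values, gradient directions, noise variance and toy log-signal values are illustrative assumptions, not the authors' acquisition scheme.

        import numpy as np

        def design_row(bvec, bval):
            """Row of the linear DTI model: log(S/S0) = -b * [gx^2, gy^2, gz^2, 2gxgy, 2gxgz, 2gygz] . D."""
            gx, gy, gz = bvec
            return -bval * np.array([gx*gx, gy*gy, gz*gz, 2*gx*gy, 2*gx*gz, 2*gy*gz])

        def kalman_dti_update(theta, P, h, y, r=1e-3):
            """Incremental (recursive least squares) Kalman step for one new measurement y = log(S/S0)."""
            k = P @ h / (h @ P @ h + r)            # gain
            theta = theta + k * (y - h @ theta)    # updated elements [Dxx, Dyy, Dzz, Dxy, Dxz, Dyz]
            P = P - np.outer(k, h) @ P
            return theta, P

        theta = np.zeros(6)                        # initial tensor estimate for one voxel
        P = np.eye(6) * 10.0                       # large initial uncertainty
        # Each newly acquired volume contributes one (bvec, b, log-signal) triple per voxel (toy values,
        # with b in ms/um^2, i.e. b = 1000 s/mm^2, and diffusivities in um^2/ms):
        for bvec, bval, y in [((1, 0, 0), 1.0, -0.7), ((0, 1, 0), 1.0, -0.7), ((0, 0, 1), 1.0, -1.7)]:
            theta, P = kalman_dti_update(theta, P, design_row(np.array(bvec, float), bval), y)
        print(theta)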

  10. High performance 3D adaptive filtering for DSP based portable medical imaging systems

    Science.gov (United States)

    Bockenbach, Olivier; Ali, Murtaza; Wainwright, Ian; Nadeski, Mark

    2015-03-01

    Portable medical imaging devices have proven valuable for emergency medical services both in the field and hospital environments and are becoming more prevalent in clinical settings where the use of larger imaging machines is impractical. Despite their constraints on power, size and cost, portable imaging devices must still deliver high quality images. 3D adaptive filtering is one of the most advanced techniques aimed at noise reduction and feature enhancement, but it is computationally very demanding and hence often cannot be run with sufficient performance on a portable platform. In recent years, advanced multicore digital signal processors (DSPs) have been developed that attain high processing performance while maintaining low levels of power dissipation. These processors enable the implementation of complex algorithms on a portable platform. In this study, the performance of a 3D adaptive filtering algorithm on a DSP is investigated. The performance is assessed by filtering a volume of size 512x256x128 voxels sampled at a pace of 10 MVoxels/sec with a 3D ultrasound probe. Relative performance and power are compared between a reference PC (quad-core CPU) and a TMS320C6678 DSP from Texas Instruments.

  11. Three-State Locally Adaptive Texture Preserving Filter for Radar and Optical Image Processing

    Directory of Open Access Journals (Sweden)

    Jaakko T. Astola

    2005-05-01

    Full Text Available Textural features are one of the most important types of useful information contained in images. In practice, these features are commonly masked by noise. Relatively little attention has been paid to the texture preserving properties of noise attenuation methods. This stimulates solving the following tasks: (1) to analyze the texture preservation properties of various filters; and (2) to design image processing methods capable of preserving texture features well and effectively reducing noise. This paper deals with examining the texture feature preserving properties of different filters. The study is performed for a set of texture samples and different noise variances. Locally adaptive three-state schemes are proposed for which texture is considered as a particular class. For “detection” of texture regions, several classifiers are proposed and analyzed. As shown, an appropriate trade-off of the designed filter properties is provided. This is demonstrated quantitatively for artificial test images and is confirmed visually for real-life images.

  12. Photoacoustic imaging in scattering media by combining a correlation matrix filter with a time reversal operator.

    Science.gov (United States)

    Rui, Wei; Tao, Chao; Liu, Xiaojun

    2017-09-18

    An acoustic scattering medium poses a fundamental challenge for photoacoustic imaging. In this study, we reveal the different coherent properties of the scattering photoacoustic waves and the direct photoacoustic waves in a matrix form. Direct waves show a particular coherence on the antidiagonals of the matrix, whereas scattering waves do not. Based on this property, a correlation matrix filter combined with a time reversal operator is proposed to preserve the direct waves and recover the image behind a scattering layer. Both numerical simulations and photoacoustic imaging experiments demonstrate that the proposed approach effectively increases the image contrast and decreases the background speckles in a scattering medium. This study might improve the quality of photoacoustic imaging in an acoustic scattering environment and extend its applications.

  13. Real-time single image dehazing based on dark channel prior theory and guided filtering

    Science.gov (United States)

    Zhang, Zan

    2017-10-01

    Images and videos taken outdoors on foggy days are seriously degraded. In order to restore images degraded by fog and to overcome the problem that traditional dark channel prior algorithms leave residual fog at edges, we propose a new dehazing method. We first find the fog area in the dark channel map and obtain the estimated value of the transmittance using a quadtree search. Then we regard the gray-scale image after guided filtering as the atmospheric light map and remove the haze based on it. Box filtering and image down-sampling are also used to improve the processing speed. Finally, the atmospheric light scattering model is used to restore the image. Extensive experiments show that the algorithm is effective, efficient and has a wide range of applications.
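
    The dark channel and transmission estimation steps referred to above follow the standard dark channel prior formulation, sketched below; the quadtree search for the atmospheric light and the guided filter refinement of the gray-scale map are omitted, and the patch size and omega are common illustrative values rather than the paper's settings.

        import numpy as np
        from scipy.ndimage import minimum_filter

        def dark_channel(img, patch=15):
            """Dark channel: per-patch minimum over the three colour channels (img in [0, 1], HxWx3)."""
            return minimum_filter(img.min(axis=2), size=patch)

        def estimate_transmission(img, A, omega=0.95, patch=15):
            """Standard dark-channel transmission estimate t = 1 - omega * dark_channel(I / A)."""
            return 1.0 - omega * dark_channel(img / A, patch)

        def recover(img, t, A, t0=0.1):
            """Invert the atmospheric scattering model I = J*t + A*(1 - t)."""
            return (img - A) / np.maximum(t, t0)[..., None] + A

        # A (atmospheric light, one value per colour channel) would normally come from the brightest
        # dark-channel region, e.g. located via the quadtree search mentioned in the abstract.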

  14. Regularization of DT-MR images using a successive Fermat median filtering method.

    Science.gov (United States)

    Kwon, Kiwoon; Kim, Dongyoun; Kim, Sunghee; Park, Insung; Jeong, Jaewon; Kim, Taehwan; Hong, Cheolpyo; Han, Bongsoo

    2008-05-21

    Tractography using diffusion tensor magnetic resonance imaging (DT-MRI) is a method to determine the architecture of axonal fibers in the central nervous system by computing the direction of greatest diffusion in the white matter of the brain. To reduce the noise in DT-MRI measurements, a tensor-valued median filter, which is reported to be denoising and structure preserving in the tractography, is applied. In this paper, we proposed the successive Fermat (SF) method, successively using Fermat point theory for a triangle contained in the two-dimensional plane, as a median filtering method. We discussed the error analysis and numerical study about the SF method for phantom and experimental data. By considering the computing time and the image quality aspects of the numerical study simultaneously, we showed that the SF method is much more efficient than the simple median (SM) and gradient descents (GD) methods.
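
    The Fermat point of a triangle is the geometric median of its three vertices, and it can be computed with the classical Weiszfeld iteration; the sketch below does this for plain 2-D points as a simple illustration, whereas the paper applies the idea to tensor-valued data.

        import numpy as np

        def fermat_point(points, n_iter=100, eps=1e-9):
            """Weiszfeld iteration for the geometric median (Fermat point) of a small point set."""
            pts = np.asarray(points, dtype=float)
            x = pts.mean(axis=0)                        # start from the centroid
            for _ in range(n_iter):
                d = np.linalg.norm(pts - x, axis=1)
                if np.any(d < eps):                     # iterate landed on a vertex
                    break
                w = 1.0 / d                             # inverse-distance weights
                x_new = (w[:, None] * pts).sum(axis=0) / w.sum()
                if np.linalg.norm(x_new - x) < eps:
                    x = x_new
                    break
                x = x_new
            return x

        print(fermat_point([(0, 0), (4, 0), (2, 3)]))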

  15. Regularization of DT-MR images using a successive Fermat median filtering method

    International Nuclear Information System (INIS)

    Kwon, Kiwoon; Kim, Dongyoun; Kim, Sunghee; Park, Insung; Jeong, Jaewon; Kim, Taehwan; Hong, Cheolpyo; Han, Bongsoo

    2008-01-01

    Tractography using diffusion tensor magnetic resonance imaging (DT-MRI) is a method to determine the architecture of axonal fibers in the central nervous system by computing the direction of greatest diffusion in the white matter of the brain. To reduce the noise in DT-MRI measurements, a tensor-valued median filter, which is reported to be denoising and structure preserving in the tractography, is applied. In this paper, we proposed the successive Fermat (SF) method, successively using Fermat point theory for a triangle contained in the two-dimensional plane, as a median filtering method. We discussed the error analysis and numerical study about the SF method for phantom and experimental data. By considering the computing time and the image quality aspects of the numerical study simultaneously, we showed that the SF method is much more efficient than the simple median (SM) and gradient descents (GD) methods

  16. Regularization of DT-MR images using a successive Fermat median filtering method

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Kiwoon; Kim, Dongyoun; Kim, Sunghee; Park, Insung; Jeong, Jaewon; Kim, Taehwan [Department of Biomedical Engineering, Yonsei University, Wonju, 220-710 (Korea, Republic of); Hong, Cheolpyo; Han, Bongsoo [Department of Radiological Science, Yonsei University, Wonju, 220-710 (Korea, Republic of)], E-mail: bshan@yonsei.ac.kr

    2008-05-21

    Tractography using diffusion tensor magnetic resonance imaging (DT-MRI) is a method to determine the architecture of axonal fibers in the central nervous system by computing the direction of greatest diffusion in the white matter of the brain. To reduce the noise in DT-MRI measurements, a tensor-valued median filter, which is reported to be denoising and structure preserving in the tractography, is applied. In this paper, we proposed the successive Fermat (SF) method, successively using Fermat point theory for a triangle contained in the two-dimensional plane, as a median filtering method. We discussed the error analysis and numerical study about the SF method for phantom and experimental data. By considering the computing time and the image quality aspects of the numerical study simultaneously, we showed that the SF method is much more efficient than the simple median (SM) and gradient descents (GD) methods.

  17. Filtering for distributed mechanical systems using position measurements: perspectives in medical imaging

    International Nuclear Information System (INIS)

    Moireau, Philippe; Chapelle, Dominique; Tallec, Patrick Le

    2009-01-01

    We propose an effective filtering methodology designed to perform estimation in a distributed mechanical system using position measurements. As in a previously introduced method, the filter is inspired by robust control feedback, but here we take full advantage of the estimation specificity to choose a feedback law that can act on displacements instead of velocities and still retain the same kind of dissipativity property which guarantees robustness. This is very valuable in many applications for which positions are more readily available than velocities, as in medical imaging. We provide an in-depth analysis of the proposed procedure, as well as detailed numerical assessments using a test problem inspired by cardiac biomechanics, as medical diagnosis assistance is an important perspective for this approach. The method is formulated first for measurements based on Lagrangian displacements, but we then derive a nonlinear extension allowing us to instead consider segmented images, which of course is even more relevant in medical applications

  18. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications

    Directory of Open Access Journals (Sweden)

    Byeong Hak Kim

    2017-12-01

    Full Text Available Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principle component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC.

  19. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications

    Science.gov (United States)

    Kim, Byeong Hak; Kim, Min Young; Chae, You Seong

    2017-01-01

    Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principle component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC. PMID:29280970

  20. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications.

    Science.gov (United States)

    Kim, Byeong Hak; Kim, Min Young; Chae, You Seong

    2017-12-27

    Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principle component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC.

  1. Applying Enhancement Filters in the Pre-processing of Images of Lymphoma

    International Nuclear Information System (INIS)

    Silva, Sérgio Henrique; Do Nascimento, Marcelo Zanchetta; Neves, Leandro Alves; Batista, Valério Ramos

    2015-01-01

    Lymphoma is a type of cancer that affects the immune system and is classified as Hodgkin or non-Hodgkin. It is one of the ten most common cancers worldwide, accounting for three to four percent of all malignant neoplasms diagnosed. Our work presents a study of some filters devoted to enhancing images of lymphoma at the pre-processing step, where enhancement is useful for removing noise from the digital images. We analysed noise caused by different sources, such as room vibration, debris and defocusing, in the following classes of lymphoma: follicular, mantle cell and B-cell chronic lymphocytic leukemia. Gaussian, Median and Mean-Shift filters were applied to different colour models (RGB, Lab and HSV). Afterwards, we performed a quantitative analysis of the images by means of the Structural Similarity Index in order to evaluate the similarity between the images. In all cases we obtained a certainty of at least 75%, which rises to 99% if one considers only HSV. We therefore conclude that HSV is an important choice of colour model for pre-processing histological images of lymphoma, because in this case the resulting image obtains the best enhancement.
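
    A minimal sketch of the kind of comparison described above, assuming scikit-image and SciPy are available: Gaussian and median filters are applied to the HSV channels of an RGB histology patch and scored with the Structural Similarity Index. The file name and filter parameters are illustrative, not the authors' settings.

```python
# Sketch (not the authors' code): compare Gaussian and median pre-filtering of an
# RGB histology image in the HSV colour model, scoring each result with SSIM.
import numpy as np
from skimage import io, color
from skimage.filters import gaussian
from scipy.ndimage import median_filter
from skimage.metrics import structural_similarity as ssim

rgb = io.imread("lymphoma_patch.png")[..., :3] / 255.0   # hypothetical input image
hsv = color.rgb2hsv(rgb)

# Filter each HSV channel independently.
gauss = np.stack([gaussian(hsv[..., c], sigma=1.0) for c in range(3)], axis=-1)
median = np.stack([median_filter(hsv[..., c], size=3) for c in range(3)], axis=-1)

# SSIM between the original and each filtered version (per-channel mean).
for name, filt in [("gaussian", gauss), ("median", median)]:
    score = np.mean([ssim(hsv[..., c], filt[..., c], data_range=1.0) for c in range(3)])
    print(f"{name}: mean SSIM over HSV channels = {score:.3f}")
```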

  2. Detection of retinal nerve fiber layer defects in retinal fundus images using Gabor filtering

    Science.gov (United States)

    Hayashi, Yoshinori; Nakagawa, Toshiaki; Hatanaka, Yuji; Aoyama, Akira; Kakogawa, Masakatsu; Hara, Takeshi; Fujita, Hiroshi; Yamamoto, Tetsuya

    2007-03-01

    Retinal nerve fiber layer defect (NFLD) is one of the most important findings for the diagnosis of glaucoma reported by ophthalmologists. However, such changes could be overlooked, especially in mass screenings, because ophthalmologists have limited time to search for a number of different changes for the diagnosis of various diseases such as diabetes, hypertension and glaucoma. Therefore, the use of a computer-aided detection (CAD) system can improve the results of diagnosis. In this work, a technique for the detection of NFLDs in retinal fundus images is proposed. In the preprocessing step, blood vessels are "erased" from the original retinal fundus image by using morphological filtering. The preprocessed image is then transformed into a rectangular array. NFLD regions are observed as vertical dark bands in the transformed image. Gabor filtering is then applied to enhance the vertical dark bands. False positives (FPs) are reduced by a rule-based method which uses the information of the location and the width of each candidate region. The detected regions are back-transformed into the original configuration. In this preliminary study, 71% of NFLD regions are detected with an average of 3.2 FPs per image. In conclusion, we have developed a technique for the detection of NFLDs in retinal fundus images. Promising results have been obtained in this initial study.
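
    The enhancement step described above can be sketched as follows, assuming scikit-image; the Gabor frequency and the simple column-energy rule standing in for the paper's rule-based FP reduction are illustrative assumptions.

```python
# Sketch under assumptions: enhance vertical dark bands in a pre-processed,
# rectangular fundus image with a Gabor filter, as in the enhancement step above.
import numpy as np
from skimage import io, img_as_float
from skimage.filters import gabor

img = img_as_float(io.imread("fundus_rectangular.png", as_gray=True))  # hypothetical

# theta=0 keeps the filter's oscillation along x, so it responds to vertical bands.
real, imag = gabor(img, frequency=0.1, theta=0.0)
energy = np.sqrt(real ** 2 + imag ** 2)

# Candidate NFLD columns: those whose mean Gabor energy exceeds a simple global
# threshold (illustrative stand-in for the paper's rule-based FP reduction).
column_energy = energy.mean(axis=0)
candidates = np.where(column_energy > column_energy.mean() + 2 * column_energy.std())[0]
print("candidate band columns:", candidates)
```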

  3. Performance evaluation of 3-D enhancement filters for detection of lung cancer from 3-D chest X-ray CT images

    International Nuclear Information System (INIS)

    Shimizu, Akinobu; Hagai, Makoto; Toriwaki, Jun-ichiro; Hasegawa, Jun-ichi.

    1995-01-01

    This paper evaluates the performance of several three-dimensional enhancement filters used in procedures for detecting lung cancer shadows from three-dimensional (3D) chest X-ray CT images. Two-dimensional enhancement filters such as the Min-DD, Contrast and N-Quoit filters have been proposed for enhancing cancer shadows in conventional 2D X-ray images. In this paper, we extend each of these 2D filters to a 3D filter and evaluate its performance experimentally by using CT images with artificial and true lung cancer shadows. As a result, we find that these 3D filters are effective for determining the position of a lung cancer shadow in a 3D chest CT image, compared with simple procedures such as a smoothing filter, and that the performance of these filters becomes lower in the hilar area due to the influence of the vessel shadows. (author)

  4. Impulse Noise Cancellation of Medical Images Using Wavelet Networks and Median Filters

    Science.gov (United States)

    Sadri, Amir Reza; Zekri, Maryam; Sadri, Saeid; Gheissari, Niloofar

    2012-01-01

    This paper presents a new two-stage approach to impulse noise removal for medical images based on a wavelet network (WN). The first step is noise detection, in which the so-called gray-level difference and average background difference are considered as the inputs of a WN. The WN serves as a preprocessing step for the second stage. The second step removes the impulse noise with a median filter. The wavelet network presented here is a fixed one without learning. Experimental results show that our method acts on impulse noise effectively, and at the same time preserves chromaticity and image details very well. PMID:23493998
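
    A minimal two-stage sketch of the detect-then-filter idea, assuming NumPy and SciPy; the paper's fixed wavelet network detector is replaced here by a plain threshold on the gray-level difference, purely for illustration.

```python
# Two-stage sketch (detector + selective median replacement). A threshold on the
# gray-level difference stands in for the wavelet network detector.
import numpy as np
from scipy.ndimage import median_filter

def remove_impulses(img, threshold=40):
    """img: 2-D uint8 array. Replace only pixels flagged as impulses."""
    med = median_filter(img, size=3)
    diff = np.abs(img.astype(np.int16) - med.astype(np.int16))  # gray-level difference
    noisy = diff > threshold                                     # stage 1: detection
    out = img.copy()
    out[noisy] = med[noisy]                                      # stage 2: median filter
    return out
```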

  5. Hot spot detection for breast cancer in Ki-67 stained slides: image dependent filtering approach

    Science.gov (United States)

    Niazi, M. Khalid Khan; Downs-Kelly, Erinn; Gurcan, Metin N.

    2014-03-01

    We present a new method to detect hot spots from breast cancer slides stained for Ki67 expression. It is common practice to use the centroid of a nucleus as a surrogate representation of a cell. This often requires the detection of individual nuclei. Once all the nuclei are detected, the hot spots are detected by clustering the centroids. For large size images, nuclei detection is computationally demanding. Instead of detecting the individual nuclei and treating hot spot detection as a clustering problem, we considered hot spot detection as an image filtering problem where positively stained pixels are used to detect hot spots in breast cancer images. The method first segments the Ki-67 positive pixels using the visually meaningful segmentation (VMS) method that we developed earlier. Then, it automatically generates an image dependent filter to generate a density map from the segmented image. The smoothness of the density image simplifies the detection of local maxima. The number of local maxima directly corresponds to the number of hot spots in the breast cancer image. The method was tested on 23 different region-of-interest images extracted from 10 different breast cancer slides stained with Ki67. To determine the intra-reader variability, each image was annotated twice for hot spots by a board-certified pathologist with a two-week interval in between her two readings. A computer-generated hot spot region was considered a true-positive if it agrees with either one of the two annotation sets provided by the pathologist. While the intra-reader variability was 57%, our proposed method can correctly detect hot spots with 81% precision.
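
    A sketch of the density-map idea under stated assumptions (SciPy and scikit-image available; a fixed Gaussian kernel stands in for the paper's image-dependent filter): positive pixels are smoothed into a density map whose local maxima are counted as hot-spot candidates.

```python
# Sketch, not the authors' pipeline: turn a binary mask of Ki-67-positive pixels
# into a density map and count its local maxima as hot-spot candidates.
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.feature import peak_local_max

def detect_hot_spots(positive_mask, sigma=50, min_distance=100):
    density = gaussian_filter(positive_mask.astype(float), sigma=sigma)
    peaks = peak_local_max(density, min_distance=min_distance,
                           threshold_rel=0.5)      # (row, col) of each hot spot
    return density, peaks

# Example with a synthetic mask containing two clusters of positive pixels.
mask = np.zeros((1000, 1000), dtype=bool)
mask[100:150, 100:150] = True
mask[700:760, 600:660] = True
_, peaks = detect_hot_spots(mask)
print("hot spots found:", len(peaks))
```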

  6. Elemental distribution imaging by energy-filtering transmission electron microscopy (EFTEM) and its applications

    International Nuclear Information System (INIS)

    Kurata, Hiroki

    1996-01-01

    EFTEM is a new microscopy technique aimed at visualizing high-resolution, quantitative elemental distributions. The measurement principles and the present state of EFTEM studies are explained using examples of elemental distribution measurements. EFTEM combines the transmission electron microscope with electron energy-loss spectroscopy (EELS). In EFTEM, an energy-selecting slit is set at a specific energy loss, and only the electrons passing the slit are used to form the microscopic image. Qualitative elemental analysis is obtained by observing the position of the absorption edge of the core-electron excitation spectrum, and quantitative analysis by determining the core-electron excitation strength of the specific atom through filtering with the energy-selecting slit. The bonding state and the local structure in the neighborhood of the excited atom are determined from the fine structure of the absorption edge. With the chemical mapping method, the distribution of chemical bonding states is visualized by filtering a specific peak of the fine structure with a narrow energy-selecting slit. A fine powder of lead chromate (PbCrO4) covered with silica glass is shown as a typical example of an elemental distribution image from the core-electron excitation spectrum. The quantitative analysis method for elemental distribution images is explained. The possibility of single-atom analysis at the nanometer scale is shown using the example of a nanotube observed by EFTEM. (S.Y.)

  7. Energy-filtered Photoelectron Emission Microscopy (EF-PEEM) for imaging nanoelectronic materials

    International Nuclear Information System (INIS)

    Renault, Olivier; Chabli, Amal

    2007-01-01

    Photoelectron-Emission Microscopy (PEEM) is the most promising approach to photoemission-based (XPS, UPS) imaging techniques with high lateral resolution, typically below 100 nm. It has now reached maturity with a new generation of instruments with energy-filtering capabilities. Therefore UPS and XPS imaging with energy-filtered PEEM (EF-PEEM) can be applied to technologically-relevant samples. UPS images with contrast in the local work function, obtained with laboratory UV sources, are acquired in an ultra-high-vacuum environment with lateral resolutions better than 50 nm and sensitivities of 20 meV. XPS images with elemental and bonding-state contrast can reach lateral resolution better than 200 nm with synchrotron excitation. In this paper, we present the principles and capabilities of EF-PEEM and nanospectroscopy. Then, we focus on an example of application to non-destructive work-function imaging of polycrystalline copper for advanced interconnects, where it is shown that EF-PEEM is an alternative to Kelvin probes.

  8. Least median of squares filtering of locally optimal point matches for compressible flow image registration

    International Nuclear Information System (INIS)

    Castillo, Edward; Guerrero, Thomas; Castillo, Richard; White, Benjamin; Rojo, Javier

    2012-01-01

    Compressible flow based image registration operates under the assumption that the mass of the imaged material is conserved from one image to the next. Depending on how the mass conservation assumption is modeled, the performance of existing compressible flow methods is limited by factors such as image quality, noise, large magnitude voxel displacements, and computational requirements. The Least Median of Squares Filtered Compressible Flow (LFC) method introduced here is based on a localized, nonlinear least squares, compressible flow model that describes the displacement of a single voxel that lends itself to a simple grid search (block matching) optimization strategy. Spatially inaccurate grid search point matches, corresponding to erroneous local minimizers of the nonlinear compressible flow model, are removed by a novel filtering approach based on least median of squares fitting and the forward search outlier detection method. The spatial accuracy of the method is measured using ten thoracic CT image sets and large samples of expert determined landmarks (available at www.dir-lab.com). The LFC method produces an average error within the intra-observer error on eight of the ten cases, indicating that the method is capable of achieving a high spatial accuracy for thoracic CT registration. (paper)

  9. Defogging of road images using gain coefficient-based trilateral filter

    Science.gov (United States)

    Singh, Dilbag; Kumar, Vijay

    2018-01-01

    Poor weather conditions are responsible for a large share of road accidents every year. Conditions such as fog degrade the visibility of objects, making it difficult for drivers to identify vehicles in a foggy environment. The dark channel prior (DCP)-based defogging techniques have been found to be an efficient way to remove fog from road images. However, they produce poor results when image objects are inherently similar to the airlight and no shadow is cast on them. To eliminate this problem, a modified restoration model-based DCP is developed to remove fog from road images. The transmission map is also refined by developing a gain coefficient-based trilateral filter. Thus, the proposed technique has the ability to remove fog from road images in an effective manner. The proposed technique is compared with seven well-known defogging techniques on two benchmark foggy image datasets and five real-time foggy images. The experimental results demonstrate that the proposed approach is able to remove different types of fog from roadside images as well as significantly improve the image's visibility. It also reveals that the restored image has little or no artifacts.
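
    For orientation, a sketch of the underlying dark channel prior step only, assuming NumPy and SciPy; the paper's gain coefficient-based trilateral refinement of the transmission map is not reproduced here.

```python
# Sketch of the dark channel prior (DCP) step only; the trilateral refinement of
# the transmission map described above is not reproduced.
import numpy as np
from scipy.ndimage import minimum_filter

def dark_channel(img, patch=15):
    """img: HxWx3 float array in [0,1]. Per-pixel min over channels and a patch."""
    return minimum_filter(img.min(axis=2), size=patch)

def estimate_transmission(img, airlight, omega=0.95, patch=15):
    """Standard DCP transmission estimate t = 1 - omega * dark_channel(I / A).
    airlight: length-3 array of estimated atmospheric light per channel."""
    normalized = img / np.maximum(airlight, 1e-6)
    return 1.0 - omega * dark_channel(normalized, patch)

def recover(img, t, airlight, t0=0.1):
    t = np.clip(t, t0, 1.0)[..., None]
    return np.clip((img - airlight) / t + airlight, 0.0, 1.0)
```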

  10. Mobile Phone Ratiometric Imaging Enables Highly Sensitive Fluorescence Lateral Flow Immunoassays without External Optical Filters.

    Science.gov (United States)

    Shah, Kamal G; Singh, Vidhi; Kauffman, Peter C; Abe, Koji; Yager, Paul

    2018-05-14

    Paper-based diagnostic tests based on the lateral flow immunoassay concept promise low-cost, point-of-care detection of infectious diseases, but such assays suffer from poor limits of detection. One factor that contributes to poor analytical performance is a reliance on low-contrast chromophoric optical labels such as gold nanoparticles. Previous attempts to improve the sensitivity of paper-based diagnostics include replacing chromophoric labels with enzymes, fluorophores, or phosphors at the expense of increased fluidic complexity or the need for device readers with costly optoelectronics. Several groups, including our own, have proposed mobile phones as suitable point-of-care readers due to their low cost, ease of use, and ubiquity. However, extant mobile phone fluorescence readers require costly optical filters and were typically validated with only one camera sensor module, which is inappropriate for potential point-of-care use. In response, we propose to couple low-cost ultraviolet light-emitting diodes with long Stokes-shift quantum dots to enable ratiometric mobile phone fluorescence measurements without optical filters. Ratiometric imaging with unmodified smartphone cameras improves the contrast and attenuates the impact of excitation intensity variability by 15×. Practical application was shown with a lateral flow immunoassay for influenza A with nucleoproteins spiked into simulated nasal matrix. Limits of detection of 1.5 and 2.6 fmol were attained on two mobile phones, which are comparable to a gel imager (1.9 fmol), 10× better than imaging gold nanoparticles on a scanner (18 fmol), and >2 orders of magnitude better than gold nanoparticle-labeled assays imaged with mobile phones. Use of the proposed filter-free mobile phone imaging scheme is a first step toward enabling a new generation of highly sensitive, point-of-care fluorescence assays.
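
    A toy sketch of the ratiometric idea with NumPy; which camera channels carry emission and excitation is an assumption made only for illustration, not a statement of the authors' channel assignment.

```python
# Illustrative ratiometric readout: with a UV LED and a long Stokes-shift label,
# one colour channel of an unmodified phone camera is assumed to carry mostly
# emission and another mostly excitation/background, so their ratio is
# insensitive to excitation intensity. Channel assignment is an assumption.
import numpy as np

def ratiometric_signal(rgb, roi):
    """rgb: HxWx3 float array; roi: (row_slice, col_slice) over the test line."""
    patch = rgb[roi]
    emission = patch[..., 0].mean()            # red channel (assumed emission)
    excitation = patch[..., 2].mean() + 1e-6   # blue channel (assumed excitation leak)
    return emission / excitation

# Doubling the illumination scales both channels, leaving the ratio unchanged.
img = np.random.default_rng(0).random((100, 100, 3))
roi = (slice(40, 60), slice(40, 60))
print(ratiometric_signal(img, roi), ratiometric_signal(2 * img, roi))
```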

  11. LED induced autofluorescence (LIAF) imager with eight multi-filters for oral cancer diagnosis

    Science.gov (United States)

    Huang, Ting-Wei; Cheng, Nai-Lun; Tsai, Ming-Hsui; Chiou, Jin-Chern; Mang, Ou-Yang

    2016-03-01

    Oral cancer is a serious and growing problem in many developing and developed countries. Simple oral visual screening by a clinician can prevent an estimated 37,000 oral cancer deaths annually worldwide. However, conventional oral examination based on visual inspection and palpation of oral lesions is not an objective and reliable approach for oral cancer diagnosis, and it may delay hospital treatment for oral cancer patients or allow the cancer to progress out of control to a late stage. Therefore, a device for oral cancer detection is developed for early diagnosis and treatment. A portable LED-induced autofluorescence (LIAF) imager is developed by our group. It contains multiple wavelengths of LED excitation light and a rotary filter ring with eight channels to capture ex-vivo oral tissue autofluorescence images. The advantages of the LIAF imager compared to other devices for oral cancer diagnosis are that it has an L-shaped probe for fixing the object distance, shielding against ambient light, and observing blind spots in the deep region between the gums (gingiva) and the lining of the mouth. Moreover, the multiple LED excitation wavelengths can induce multiple autofluorescence signals, and the eight-channel rotary filter ring can capture spectral images in multiple narrow bands. The prototype of the portable LIAF imager has been applied in clinical trials for some cases in Taiwan, and the clinical trial images under specific excitation show significant differences between normal and abnormal oral tissue in these cases.

  12. Information Recovery Algorithm for Ground Objects in Thin Cloud Images by Fusing Guide Filter and Transfer Learning

    Directory of Open Access Journals (Sweden)

    HU Gensheng

    2018-03-01

    Full Text Available Ground object information in remote sensing images covered with thin clouds is obscured. An information recovery algorithm for ground objects in thin cloud images is proposed by fusing a guided filter and transfer learning. Firstly, multi-resolution decomposition of the thin cloud target images and cloud-free guidance images is performed by using the multi-directional nonsubsampled dual-tree complex wavelet transform. Then the decomposed low frequency subbands are processed by using the support vector guided filter and transfer learning respectively. The decomposed high frequency subbands are enhanced by using a modified Laine enhancement function. The low frequency subbands output by the guided filter and those predicted by the transfer learning model are fused by a method of selection and weighting based on regional energy. Finally, the enhanced high frequency subbands and the fused low frequency subbands are reconstructed by using the inverse multi-directional nonsubsampled dual-tree complex wavelet transform to obtain the ground object information recovery images. Experimental results on Landsat-8 OLI multispectral images show that the support vector guided filter can effectively preserve the detail information of the target images, domain adaptive transfer learning can effectively extend the range of available multi-source and multi-temporal remote sensing images, and good ground object information recovery is obtained by fusing the guided filter and transfer learning to remove thin cloud from the remote sensing images.

  13. SD LMS L-Filters for Filtration of Gray Level Images in Timespatial Domain Based on GLCM Features

    Directory of Open Access Journals (Sweden)

    Robert Hudec

    2008-01-01

    Full Text Available In this paper, a new kind of adaptive signal-dependent LMS L-filter for suppression of mixed noise in greyscale images is developed. It is based on texture parameter measurement as a modification of the spatial impulse detector structure. Moreover, one of the GLCM (Gray Level Co-occurrence Matrix) features, namely the contrast (inertia), thresholded to act as a switch between partial filters, is utilised. Finally, adaptive LMS versions of L-filters are used as the partial filters.
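
    The switching idea can be sketched as below, assuming a recent scikit-image; the threshold, window size and the median/identity partial filters are illustrative stand-ins, and the adaptive LMS L-filters themselves are not reproduced.

```python
# Sketch of the switching mechanism only: the GLCM contrast of a local window,
# compared against a threshold, chooses between two partial filters.
import numpy as np
from scipy.ndimage import median_filter
from skimage.feature import graycomatrix, graycoprops

def glcm_contrast(window):
    """GLCM contrast (inertia) of a small uint8 window."""
    glcm = graycomatrix(window, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    return graycoprops(glcm, "contrast")[0, 0]

def switched_filter(img, threshold=500.0, win=9):
    """img: 2-D uint8 array. Threshold on local GLCM contrast selects the branch."""
    out = img.copy()
    med = median_filter(img, size=3)
    half = win // 2
    for r in range(half, img.shape[0] - half):
        for c in range(half, img.shape[1] - half):
            w = img[r - half:r + half + 1, c - half:c + half + 1]
            # High local contrast suggests impulses/texture -> use the smoothing branch.
            out[r, c] = med[r, c] if glcm_contrast(w) > threshold else img[r, c]
    return out
```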

  14. Application of digital tomosynthesis (DTS) of optimal deblurring filters for dental X-ray imaging

    International Nuclear Information System (INIS)

    Oh, J. E.; Cho, H. S.; Kim, D. S.; Choi, S. I.; Je, U. K.

    2012-01-01

    Digital tomosynthesis (DTS) is a limited-angle tomographic technique that provides some of the tomographic benefits of computed tomography (CT) but at reduced dose and cost. Thus, the potential for application of DTS to dental X-ray imaging seems promising. As a continuation of our dental radiography R and D, we developed an effective DTS reconstruction algorithm and implemented it in conjunction with a commercial dental CT system for potential use in dental implant placement. The reconstruction algorithm employed a backprojection filtering (BPF) method based upon optimal deblurring filters to suppress effectively both the blur artifacts originating from the out-focus planes and the high-frequency noise. To verify the usefulness of the reconstruction algorithm, we performed systematic simulation works and evaluated the image characteristics. We also performed experimental works in which DTS images of enhanced anatomical resolution were successfully obtained by using the algorithm and were promising to our ongoing applications to dental X-ray imaging. In this paper, our approach to the development of the DTS reconstruction algorithm and the results are described in detail.

  15. Perfect blind restoration of images blurred by multiple filters: theory and efficient algorithms.

    Science.gov (United States)

    Harikumar, G; Bresler, Y

    1999-01-01

    We address the problem of restoring an image from its noisy convolutions with two or more unknown finite impulse response (FIR) filters. We develop theoretical results about the existence and uniqueness of solutions, and show that under some generically true assumptions, both the filters and the image can be determined exactly in the absence of noise, and stably estimated in its presence. We present efficient algorithms to estimate the blur functions and their sizes. These algorithms are of two types, subspace-based and likelihood-based, and are extensions of techniques proposed for the solution of the multichannel blind deconvolution problem in one dimension. We present memory and computation-efficient techniques to handle the very large matrices arising in the two-dimensional (2-D) case. Once the blur functions are determined, they are used in a multichannel deconvolution step to reconstruct the unknown image. The theoretical and practical implications of edge effects, and "weakly exciting" images are examined. Finally, the algorithms are demonstrated on synthetic and real data.

  16. Spatio-spectral color filter array design for optimal image recovery.

    Science.gov (United States)

    Hirakawa, Keigo; Wolfe, Patrick J

    2008-10-01

    In digital imaging applications, data are typically obtained via a spatial subsampling procedure implemented as a color filter array-a physical construction whereby only a single color value is measured at each pixel location. Owing to the growing ubiquity of color imaging and display devices, much recent work has focused on the implications of such arrays for subsequent digital processing, including in particular the canonical demosaicking task of reconstructing a full color image from spatially subsampled and incomplete color data acquired under a particular choice of array pattern. In contrast to the majority of the demosaicking literature, we consider here the problem of color filter array design and its implications for spatial reconstruction quality. We pose this problem formally as one of simultaneously maximizing the spectral radii of luminance and chrominance channels subject to perfect reconstruction, and-after proving sub-optimality of a wide class of existing array patterns-provide a constructive method for its solution that yields robust, new panchromatic designs implementable as subtractive colors. Empirical evaluations on multiple color image test sets support our theoretical results, and indicate the potential of these patterns to increase spatial resolution for fixed sensor size, and to contribute to improved reconstruction fidelity as well as significantly reduced hardware complexity.

  17. Cat Swarm Optimization Based Functional Link Artificial Neural Network Filter for Gaussian Noise Removal from Computed Tomography Images

    Directory of Open Access Journals (Sweden)

    M. Kumar

    2016-01-01

    Full Text Available Gaussian noise is one of the dominant noises, which degrades the quality of acquired Computed Tomography (CT) image data. It creates difficulties in pathological identification or diagnosis of any disease. Gaussian noise elimination is desirable to improve the clarity of a CT image for clinical, diagnostic, and postprocessing applications. This paper proposes an evolutionary nonlinear adaptive filter approach, using Cat Swarm Functional Link Artificial Neural Network (CS-FLANN) to remove the unwanted noise. The structure of the proposed filter is based on the Functional Link Artificial Neural Network (FLANN) and the Cat Swarm Optimization (CSO) is utilized for the selection of optimum weight of the neural network filter. The applied filter has been compared with the existing linear filters, like the mean filter and the adaptive Wiener filter. The performance indices, such as peak signal to noise ratio (PSNR), have been computed for the quantitative analysis of the proposed filter. The experimental evaluation established the superiority of the proposed filtering technique over existing methods.

  18. From Matched Spatial Filtering towards the Fused Statistical Descriptive Regularization Method for Enhanced Radar Imaging

    Directory of Open Access Journals (Sweden)

    Shkvarko Yuriy

    2006-01-01

    Full Text Available We address a new approach to solve the ill-posed nonlinear inverse problem of high-resolution numerical reconstruction of the spatial spectrum pattern (SSP) of the backscattered wavefield sources distributed over the remotely sensed scene. An array or synthesized array radar (SAR) that employs digital data signal processing is considered. By exploiting the idea of combining the statistical minimum risk estimation paradigm with numerical descriptive regularization techniques, we address a new fused statistical descriptive regularization (SDR) strategy for enhanced radar imaging. Pursuing such an approach, we establish a family of the SDR-related SSP estimators that encompass a manifold of existing beamforming techniques ranging from traditional matched filter to robust and adaptive spatial filtering, and minimum variance methods.

  19. A 3D Image Filter for Parameter-Free Segmentation of Macromolecular Structures from Electron Tomograms

    Science.gov (United States)

    Ali, Rubbiya A.; Landsberg, Michael J.; Knauth, Emily; Morgan, Garry P.; Marsh, Brad J.; Hankamer, Ben

    2012-01-01

    3D image reconstruction of large cellular volumes by electron tomography (ET) at high (≤5 nm) resolution can now routinely resolve organellar and compartmental membrane structures, protein coats, cytoskeletal filaments, and macromolecules. However, current image analysis methods for identifying in situ macromolecular structures within the crowded 3D ultrastructural landscape of a cell remain labor-intensive, time-consuming, and prone to user-bias and/or error. This paper demonstrates the development and application of a parameter-free, 3D implementation of the bilateral edge-detection (BLE) algorithm for the rapid and accurate segmentation of cellular tomograms. The performance of the 3D BLE filter has been tested on a range of synthetic and real biological data sets and validated against current leading filters—the pseudo 3D recursive and Canny filters. The performance of the 3D BLE filter was found to be comparable to or better than that of both the 3D recursive and Canny filters while offering the significant advantage that it requires no parameter input or optimisation. Edge widths as little as 2 pixels are reproducibly detected with signal intensity and grey scale values as low as 0.72% above the mean of the background noise. The 3D BLE thus provides an efficient method for the automated segmentation of complex cellular structures across multiple scales for further downstream processing, such as cellular annotation and sub-tomogram averaging, and provides a valuable tool for the accurate and high-throughput identification and annotation of 3D structural complexity at the subcellular level, as well as for mapping the spatial and temporal rearrangement of macromolecular assemblies in situ within cellular tomograms. PMID:22479430

  20. Three-Dimensional Terahertz Coded-Aperture Imaging Based on Matched Filtering and Convolutional Neural Network.

    Science.gov (United States)

    Chen, Shuo; Luo, Chenggao; Wang, Hongqiang; Deng, Bin; Cheng, Yongqiang; Zhuang, Zhaowen

    2018-04-26

    As a promising radar imaging technique, terahertz coded-aperture imaging (TCAI) can achieve high-resolution, forward-looking, and staring imaging by producing spatiotemporal independent signals with coded apertures. However, there are still two problems in three-dimensional (3D) TCAI. Firstly, the large-scale reference-signal matrix based on meshing the 3D imaging area creates a heavy computational burden, thus leading to unsatisfactory efficiency. Secondly, it is difficult to resolve the target under low signal-to-noise ratio (SNR). In this paper, we propose a 3D imaging method based on matched filtering (MF) and convolutional neural network (CNN), which can reduce the computational burden and achieve high-resolution imaging for low SNR targets. In terms of the frequency-hopping (FH) signal, the original echo is processed with MF. By extracting the processed echo in different spike pulses separately, targets in different imaging planes are reconstructed simultaneously to decompose the global computational complexity, and then are synthesized together to reconstruct the 3D target. Based on the conventional TCAI model, we deduce and build a new TCAI model based on MF. Furthermore, the convolutional neural network (CNN) is designed to teach the MF-TCAI how to reconstruct the low SNR target better. The experimental results demonstrate that the MF-TCAI achieves impressive performance on imaging ability and efficiency under low SNR. Moreover, the MF-TCAI has learned to better resolve the low-SNR 3D target with the help of CNN. In summary, the proposed 3D TCAI can achieve: (1) low-SNR high-resolution imaging by using MF; (2) efficient 3D imaging by downsizing the large-scale reference-signal matrix; and (3) intelligent imaging with CNN. Therefore, the TCAI based on MF and CNN has great potential in applications such as security screening, nondestructive detection, medical diagnosis, etc.

  1. Image super-resolution reconstruction based on regularization technique and guided filter

    Science.gov (United States)

    Huang, De-tian; Huang, Wei-qin; Gu, Pei-ting; Liu, Pei-zhong; Luo, Yan-min

    2017-06-01

    In order to improve the accuracy of sparse representation coefficients and the quality of reconstructed images, an improved image super-resolution algorithm based on sparse representation is presented. In the sparse coding stage, the autoregressive (AR) regularization and the non-local (NL) similarity regularization are introduced to improve the sparse coding objective function. A group of AR models which describe the image local structures are pre-learned from the training samples, and one or several suitable AR models can be adaptively selected for each image patch to regularize the solution space. Then, the image non-local redundancy is obtained by the NL similarity regularization to preserve edges. In the process of computing the sparse representation coefficients, the feature-sign search algorithm is utilized instead of the conventional orthogonal matching pursuit algorithm to improve the accuracy of the sparse coefficients. To restore image details further, a global error compensation model based on weighted guided filter is proposed to realize error compensation for the reconstructed images. Experimental results demonstrate that compared with Bicubic, L1SR, SISR, GR, ANR, NE + LS, NE + NNLS, NE + LLE and A + (16 atoms) methods, the proposed approach has remarkable improvement in peak signal-to-noise ratio, structural similarity and subjective visual perception.

  2. The HURRA filter: An easy method to eliminate collimator artifacts in high-energy gamma camera images.

    Science.gov (United States)

    Perez-Garcia, H; Barquero, R

    The correct determination and delineation of tumor/organ size is crucial in 2-D imaging in ¹³¹I therapy. These images are usually obtained using a system composed of a Gamma camera and a high-energy collimator, although the system can produce artifacts in the image. This article analyses these artifacts and describes a correction filter that can eliminate those collimator artifacts. Using free software, ImageJ, a central profile in the image is obtained and analyzed. Two components can be seen in the fluctuation of the profile: one associated with the stochastic nature of the radiation plus electronic noise, and the other varying periodically with spatial position due to the collimator. These frequencies are analytically obtained and compared with the frequencies in the Fourier transform of the profile. A specially developed filter removes the artifacts in the 2D Fourier transform of the DICOM image. This filter is tested using an image of a 15-cm-diameter Petri dish with ¹³¹I radioactive water (large object size), an image of a ¹³¹I clinical pill (small object size), and images of the lesion remnants of two patients treated with 3.7 GBq (100 mCi) and 4.44 GBq (120 mCi) of ¹³¹I, respectively, after thyroidectomy. The artifact is due to the hexagonal periodic structure of the collimator. The use of the filter on large-sized images reduces the fluctuation by 5.8-3.5%. In small-sized images, the FWHM can be determined in the filtered image, while this is impossible in the unfiltered image. The definition of the tumor boundary and the visualization of the activity distribution inside patient lesions improve drastically when the filter is applied to the corresponding images obtained with an HE gamma camera. The HURRA filter removes high-energy collimator artifacts in planar images obtained with a Gamma camera without reducing the image resolution. It can be applied in any patient quantification study because the number of counts remains invariant. The filter makes
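
    A generic sketch of Fourier-domain suppression of a periodic, collimator-like pattern, assuming NumPy; unlike the HURRA filter, which derives the hexagonal-pattern frequencies analytically, the notch positions here are simply the strongest off-centre spectral peaks.

```python
# Generic notch filtering in the 2D Fourier domain. The DC/low-frequency region is
# protected so the total number of counts is essentially preserved.
import numpy as np

def remove_periodic_pattern(img, n_notches=6, notch_radius=3, guard=10):
    F = np.fft.fftshift(np.fft.fft2(img))
    mag = np.abs(F)
    cy, cx = np.array(mag.shape) // 2
    yy, xx = np.ogrid[:mag.shape[0], :mag.shape[1]]
    mag_search = mag.copy()
    mag_search[(yy - cy) ** 2 + (xx - cx) ** 2 < guard ** 2] = 0  # protect low freq.
    for _ in range(n_notches):
        py, px = np.unravel_index(np.argmax(mag_search), mag.shape)
        notch = (yy - py) ** 2 + (xx - px) ** 2 <= notch_radius ** 2
        F[notch] = 0            # zero a small disc around each periodic peak
        mag_search[notch] = 0
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))
```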

  3. DAF: differential ACE filtering image quality assessment by automatic color equalization

    Science.gov (United States)

    Ouni, S.; Chambah, M.; Saint-Jean, C.; Rizzi, A.

    2008-01-01

    Ideally, a quality assessment system would perceive and measure image or video impairments just like a human being. But in reality, objective quality metrics do not necessarily correlate well with perceived quality [1]. Moreover, some measures assume that there exists a reference in the form of an "original" to compare to, which prevents their usage in the digital restoration field, where often there is no reference to compare to. That is why subjective evaluation is the most used and most efficient approach up to now. But subjective assessment is expensive, time consuming and hence does not meet economic requirements [2,3]. Thus, reliable automatic methods for visual quality assessment are needed in the field of digital film restoration. The ACE method, for Automatic Color Equalization [4,6], is an algorithm for unsupervised enhancement of digital images. It is based on a new computational approach that tries to model the perceptual response of our vision system merging the Gray World and White Patch equalization mechanisms in a global and local way. Like our vision system, ACE is able to adapt to widely varying lighting conditions, and to extract visual information from the environment efficaciously. Moreover ACE can be run in an unsupervised manner. Hence it is very useful as a digital film restoration tool since no a priori information is available. In this paper we deepen the investigation of using the ACE algorithm as a basis for a reference-free image quality evaluation. This new metric, called DAF for Differential ACE Filtering [7], is an objective quality measure that can be used in several image restoration and image quality assessment systems. In this paper, we compare, on different image databases, the results obtained with DAF and with some subjective image quality assessments (Mean Opinion Score, MOS, as a measure of perceived image quality). We also study the correlation between the objective measure and MOS. In our experiments, we have used for the first image

  4. Segmentation of dermatoscopic images by frequency domain filtering and k-means clustering algorithms.

    Science.gov (United States)

    Rajab, Maher I

    2011-11-01

    Since the introduction of epiluminescence microscopy (ELM), image analysis tools have been extended to the field of dermatology, in an attempt to algorithmically reproduce clinical evaluation. Accurate image segmentation of skin lesions is one of the key steps for useful, early and non-invasive diagnosis of cutaneous melanomas. This paper proposes two image segmentation algorithms based on frequency domain processing and k-means clustering/fuzzy k-means clustering. The two methods are capable of segmenting and extracting the true border that reveals the global structure irregularity (indentations and protrusions), which may suggest excessive cell growth or regression of a melanoma. As a pre-processing step, Fourier low-pass filtering is applied to reduce the surrounding noise in a skin lesion image. A quantitative comparison of the techniques is enabled by the use of synthetic skin lesion images that model lesions covered with hair to which Gaussian noise is added. The proposed techniques are also compared with an established optimal-based thresholding skin-segmentation method. It is demonstrated that for lesions with a range of different border irregularity properties, the k-means clustering and fuzzy k-means clustering segmentation methods provide the best performance over a range of signal to noise ratios. The proposed segmentation techniques are also demonstrated to have similar performance when tested on real skin lesions representing high-resolution ELM images. This study suggests that the segmentation results obtained using a combination of low-pass frequency filtering and k-means or fuzzy k-means clustering are superior to the result that would be obtained by using k-means or fuzzy k-means clustering segmentation methods alone. © 2011 John Wiley & Sons A/S.
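
    A compact sketch of the two steps named above (Fourier low-pass filtering followed by 2-class k-means on intensities), assuming NumPy and scikit-learn; the cutoff and the rule for picking the lesion cluster are illustrative.

```python
# Fourier low-pass smoothing, then 2-class k-means on pixel intensities.
import numpy as np
from sklearn.cluster import KMeans

def lowpass(img, cutoff=0.1):
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    yy, xx = np.ogrid[:h, :w]
    r = np.sqrt(((yy - h / 2) / h) ** 2 + ((xx - w / 2) / w) ** 2)
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * (r <= cutoff))))

def segment(img):
    smooth = lowpass(img)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
        smooth.reshape(-1, 1)).reshape(img.shape)
    # Call the darker cluster the lesion (illustrative convention).
    lesion_label = int(np.argmin([smooth[labels == k].mean() for k in (0, 1)]))
    return labels == lesion_label
```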

  5. Using Convolutional Neural Network Filters to Measure Left-Right Mirror Symmetry in Images

    Directory of Open Access Journals (Sweden)

    Anselm Brachmann

    2016-12-01

    Full Text Available We propose a method for measuring symmetry in images by using filter responses from Convolutional Neural Networks (CNNs). The aim of the method is to model human perception of left/right symmetry as closely as possible. Using the Convolutional Neural Network (CNN) approach has two main advantages: First, CNN filter responses closely match the responses of neurons in the human visual system; they take information on color, edges and texture into account simultaneously. Second, we can measure higher-order symmetry, which relies not only on color, edges and texture, but also on the shapes and objects that are depicted in images. We validated our algorithm on a dataset of 300 music album covers, which were rated according to their symmetry by 20 human observers, and compared results with those from a previously proposed method. With our method, human perception of symmetry can be predicted with high accuracy. Moreover, we demonstrate that the inclusion of features from higher CNN layers, which encode more abstract image content, increases the performance further. In conclusion, we introduce a model of left/right symmetry that closely models human perception of symmetry in CD album covers.
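
    A sketch of the general idea, not the authors' exact model, assuming PyTorch and torchvision with downloadable VGG16 weights: filter responses of an image and of its mirrored copy are compared, and higher similarity is read as stronger left/right symmetry.

```python
# Compare CNN filter responses of an image and its left/right-mirrored copy; a
# pretrained VGG16 serves as a stand-in feature extractor.
import torch
import torchvision.models as models
import torchvision.transforms.functional as TF
from PIL import Image

vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.eval()

def symmetry_score(path, layer=10):
    img = Image.open(path).convert("RGB").resize((224, 224))
    x = TF.to_tensor(img).unsqueeze(0)
    x_mirror = torch.flip(x, dims=[3])            # left/right flip
    with torch.no_grad():
        feats, feats_m = vgg[:layer](x), vgg[:layer](x_mirror)
    feats_m = torch.flip(feats_m, dims=[3])       # flip responses back for alignment
    # Cosine similarity of the two response maps over all positions and filters.
    return torch.nn.functional.cosine_similarity(
        feats.flatten(1), feats_m.flatten(1)).item()
```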

  6. Electrical Resistance Imaging of Bubble Boundary in Annular Two-Phase Flows Using Unscented Kalman Filter

    International Nuclear Information System (INIS)

    Lee, Jeong Seong; Chung, Soon Il; Ljaz, Umer Zeeshan; Khambampati, Anil Kumar; Kim, Kyung Youn; Kim, Sin Kim

    2007-01-01

    For the visualization of the phase boundary in annular two-phase flows, the electrical resistance tomography (ERT) technique is introduced. In ERT, a set of predetermined electrical currents is injected through the electrodes placed on the boundary of the flow passage and the induced electrical potentials are measured on the electrodes. From the relationship between the injected currents and the induced voltages, the electrical conductivity distribution across the flow domain is estimated through the image reconstruction algorithm. Here, the conductivity distribution corresponds to the phase distribution. In the application of ERT to two-phase flows where there are only two conductivity values, the conductivity distribution estimation problem can be transformed into a boundary estimation problem. This paper considers bubble boundary estimation with ERT in annular two-phase flows. As the image reconstruction algorithm, the unscented Kalman filter (UKF) is adopted since, in control theory, the UKF is reported to perform better than the commonly used extended Kalman filter (EKF). We formulated the UKF algorithm to be incorporated into the image reconstruction algorithm for the present problem. Also, phantom experiments have been conducted to evaluate the improvement obtained with the UKF.

  7. Imaging spin filter for electrons based on specular reflection from iridium (001)

    Energy Technology Data Exchange (ETDEWEB)

    Kutnyakhov, D.; Lushchyk, P. [Johannes Gutenberg-Universität, Institut für Physik, 55099 Mainz (Germany); Fognini, A.; Perriard, D. [Laboratorium für Festkörperphysik, ETH Zürich, 8093 Zürich (Switzerland); Kolbe, M.; Medjanik, K.; Fedchenko, E.; Nepijko, S.A.; Elmers, H.J. [Johannes Gutenberg-Universität, Institut für Physik, 55099 Mainz (Germany); Salvatella, G.; Stieger, C.; Gort, R.; Bähler, T.; Michlmayer, T.; Acremann, Y.; Vaterlaus, A. [Laboratorium für Festkörperphysik, ETH Zürich, 8093 Zürich (Switzerland); Giebels, F.; Gollisch, H.; Feder, R. [Universität Duisburg-Essen, Theoretische Festkörperphysik, 47057 Duisburg (Germany); Tusche, C. [Max Planck-Institut für Mikrostrukturphysik, 06120 Halle (Germany); and others

    2013-07-15

    As Stern–Gerlach type spin filters do not work with electrons, spin analysis of electron beams is accomplished by spin-dependent scattering processes based on spin–orbit or exchange interaction. Existing polarimeters are single-channel devices characterized by an inherently low figure of merit (FoM) of typically 10⁻⁴–10⁻³. This single-channel approach is not compatible with parallel imaging microscopes and also not with modern electron spectrometers that acquire a certain energy and angular interval simultaneously. We present a novel type of polarimeter that can transport a full image by making use of k-parallel conservation in low-energy electron diffraction. We studied specular reflection from Ir (001) because this spin-filter crystal provides a high analyzing power combined with a “lifetime” in UHV of a full day. One good working point is centered at 39 eV scattering energy with a broad maximum of 5 eV usable width. A second one at about 10 eV shows a narrower profile but much higher FoM. A relativistic layer-KKR SPLEED calculation shows good agreement with measurements. - Highlights: • Novel type of spin polarimeter can transport a full image by making use of k∥ conservation in LEED. • When combined with a hemispherical analyzer, it acquires a certain energy and angular interval simultaneously. • Ir (001) based spin-filter provides a high analyzing power combined with a “lifetime” in UHV of a full day. • Parallel spin detection improves spin polarimeter efficiency by orders of magnitude. • A relativistic layer-KKR SPLEED calculation shows good agreement with measurements.

  8. Thermographic image analysis for classification of ACL rupture disease, bone cancer, and feline hyperthyroid, with Gabor filters

    Science.gov (United States)

    Alvandipour, Mehrdad; Umbaugh, Scott E.; Mishra, Deependra K.; Dahal, Rohini; Lama, Norsang; Marino, Dominic J.; Sackman, Joseph

    2017-05-01

    Thermography and pattern classification techniques are used to classify three different pathologies in veterinary images. Thermographic images of both normal and diseased animals were provided by the Long Island Veterinary Specialists (LIVS). The three pathologies are ACL rupture disease, bone cancer, and feline hyperthyroid. The diagnosis of these diseases usually involves radiology and laboratory tests, while the method that we propose uses thermographic images and image analysis techniques and is intended for use as a prescreening tool. Images in each category of pathologies are first filtered by Gabor filters and then various features are extracted and used for classification into normal and abnormal classes. Gabor filters are linear filters that can be characterized by the two parameters wavelength λ and orientation θ. With two different wavelengths and five different orientations, a total of ten different filters were studied. Different combinations of camera views, filters, feature vectors, normalization methods, and classification methods produce different tests that were examined, and the sensitivity, specificity and success rate for each test were computed. Using the Gabor features alone, sensitivity, specificity, and overall success rates of 85% were achieved for each of the pathologies.
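
    The Gabor feature-extraction stage can be sketched as follows with scikit-image; the two wavelengths and five orientations follow the description above, while the specific values and the mean/standard-deviation features are illustrative assumptions.

```python
# Bank of ten Gabor filters (two wavelengths x five orientations) applied to a
# grayscale thermogram; mean and standard deviation of each response are features.
import numpy as np
from skimage.filters import gabor

def gabor_features(img, wavelengths=(4, 8), n_orient=5):
    feats = []
    for lam in wavelengths:
        for k in range(n_orient):
            theta = k * np.pi / n_orient
            real, imag = gabor(img, frequency=1.0 / lam, theta=theta)
            energy = np.sqrt(real ** 2 + imag ** 2)
            feats.extend([energy.mean(), energy.std()])
    return np.array(feats)   # 10 filters x 2 statistics = 20-dimensional vector
```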

  9. Chromatic aberrations correction for imaging spectrometer based on acousto-optic tunable filter with two transducers.

    Science.gov (United States)

    Zhao, Huijie; Wang, Ziye; Jia, Guorui; Zhang, Ying; Xu, Zefu

    2017-10-02

    The acousto-optic tunable filter (AOTF) with wide wavelength range and high spectral resolution has a long crystal and two transducers. A longer crystal length leads to a bigger chromatic focal shift, and the double-transducer arrangement induces an angular mutation in the diffracted beam, which increase the difficulty of longitudinal and lateral chromatic aberration correction respectively. In this study, the two chromatic aberrations are analyzed quantitatively based on an AOTF optical model, and a novel catadioptric dual-path configuration is proposed to correct both chromatic aberrations. The test results demonstrate the effectiveness of the optical configuration for this type of AOTF-based imaging spectrometer.

  10. Removal of jitter noise in 3D shape recovery from image focus by using Kalman filter.

    Science.gov (United States)

    Jang, Hoon-Seok; Muhammad, Mannan Saeed; Choi, Tae-Sun

    2018-02-01

    In regard to Shape from Focus, one critical factor impacting system application is mechanical vibration of the translational stage, which causes jitter noise along the optical axis. This noise is not detectable by simply observing the image. However, when focus measures are applied, inaccuracies in the depth occur. In this article, jitter noise and focus curves are modeled by a Gaussian distribution and a quadratic function, respectively. Then a Kalman filter is designed and applied to eliminate this noise in the focus curves, as a post-processing step after the focus measure application. Experiments are conducted with simulated and real objects to show the usefulness of the proposed algorithm. © 2017 Wiley Periodicals, Inc.
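
    A minimal scalar Kalman filter sketch for smoothing one pixel's focus curve, assuming NumPy; a random-walk state model with Gaussian measurement noise stands in for the paper's quadratic focus-curve model.

```python
# Scalar Kalman filter applied to a noisy focus curve sampled along the optical axis.
import numpy as np

def kalman_smooth(measurements, process_var=1e-3, meas_var=1e-1):
    x, p = measurements[0], 1.0       # initial state estimate and its variance
    out = []
    for z in measurements:
        p = p + process_var           # predict (random-walk model)
        k = p / (p + meas_var)        # Kalman gain
        x = x + k * (z - x)           # update with the noisy focus value
        p = (1.0 - k) * p
        out.append(x)
    return np.array(out)

# Example: a noisy quadratic focus curve along the optical axis.
z_axis = np.linspace(-1, 1, 50)
noisy = 1.0 - z_axis ** 2 + 0.05 * np.random.default_rng(1).normal(size=50)
smoothed = kalman_smooth(noisy)
print("peak position before/after:", z_axis[np.argmax(noisy)], z_axis[np.argmax(smoothed)])
```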

  11. Imaging reconstruction based on improved wavelet denoising combined with parallel-beam filtered back-projection algorithm

    Science.gov (United States)

    Ren, Zhong; Liu, Guodong; Huang, Zhen

    2012-11-01

    Image reconstruction is a key step in medical imaging (MI), and the performance of its algorithm determines the quality and resolution of the reconstructed image. Although various algorithms have been used, the filtered back-projection (FBP) algorithm is still the classical and commonly used algorithm in clinical MI. In the FBP algorithm, filtering of the original projection data is a key step in overcoming artifacts in the reconstructed image. The simple use of classical filters, such as the Shepp-Logan (SL) and Ram-Lak (RL) filters, has drawbacks and limitations in practice, especially for projection data polluted by non-stationary random noise. Therefore, an improved wavelet denoising combined with the parallel-beam FBP algorithm is used in this paper to enhance the quality of the reconstructed image. In the experiments, the reconstruction results of the improved wavelet denoising were compared with others (direct FBP, mean-filter-combined FBP and median-filter-combined FBP). To determine the optimum reconstruction, different algorithms and different wavelet bases combined with three filters were tested. Experimental results show the reconstruction quality of the improved FBP algorithm is better than that of the others. Comparing the results of the different algorithms on two evaluation standards, i.e. mean-square error (MSE) and peak signal-to-noise ratio (PSNR), it was found that the reconstruction of the improved FBP based on db2 and the Hanning filter at decomposition scale 2 was best; its MSE was lower and its PSNR higher than the others. Therefore, this improved FBP algorithm has potential value in medical imaging.
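
    A sketch of the pipeline on a standard phantom, assuming scikit-image and PyWavelets: the sinogram is denoised by db2 wavelet soft-thresholding at level 2 (the combination reported best above) and then reconstructed with filtered back-projection; the noise level and threshold are illustrative.

```python
# Wavelet denoising of a noisy sinogram followed by filtered back-projection.
import numpy as np
import pywt
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

phantom = shepp_logan_phantom()
angles = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(phantom, theta=angles)
sinogram += np.random.default_rng(0).normal(scale=2.0, size=sinogram.shape)

# Soft-threshold the detail coefficients of a 2-level db2 decomposition.
coeffs = pywt.wavedec2(sinogram, "db2", level=2)
thr = 3.0
coeffs = [coeffs[0]] + [tuple(pywt.threshold(d, thr, mode="soft") for d in level)
                        for level in coeffs[1:]]
den_sino = pywt.waverec2(coeffs, "db2")[:sinogram.shape[0], :sinogram.shape[1]]

recon = iradon(den_sino, theta=angles, filter_name="hann")
mse = np.mean((recon - phantom) ** 2)
print(f"MSE vs phantom: {mse:.5f}")
```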

  12. Time-resolved magnetic imaging in an aberration-corrected, energy-filtered photoemission electron microscope

    International Nuclear Information System (INIS)

    Nickel, F.; Gottlob, D.M.; Krug, I.P.; Doganay, H.; Cramm, S.; Kaiser, A.M.; Lin, G.; Makarov, D.; Schmidt, O.G.

    2013-01-01

    We report on the implementation and usage of a synchrotron-based time-resolving operation mode in an aberration-corrected, energy-filtered photoemission electron microscope. The setup consists of a new type of sample holder, which enables fast magnetization reversal of the sample by sub-ns pulses of up to 10 mT. Within the sample holder current pulses are generated by a fast avalanche photo diode and transformed into magnetic fields by means of a microstrip line. For more efficient use of the synchrotron time structure, we developed an electrostatic deflection gating mechanism capable of beam blanking within a few nanoseconds. This allows us to operate the setup in the hybrid bunch mode of the storage ring facility, selecting one or several bright singular light pulses which are temporally well-separated from the normal high-intensity multibunch pulse pattern. - Highlights: • A new time-resolving operation mode in photoemission electron microscopy is shown. • Our setup works within an energy-filtered, aberration-corrected PEEM. • A new gating system for bunch selection using synchrotron radiation is developed. • An alternative magnetic excitation system is developed. • First tr-imaging using an energy-filtered, aberration-corrected PEEM is shown

  13. Particle image velocimetry (PIV) study of rotating cylindrical filters for animal cell perfusion processes.

    Science.gov (United States)

    Figueredo-Cardero, Alvio; Chico, Ernesto; Castilho, Leda; de Andrade Medronho, Ricardo

    2012-01-01

    In the present work, the main fluid flow features inside a rotating cylindrical filtration (RCF) system used as external cell retention device for animal cell perfusion processes were investigated using particle image velocimetry (PIV). The motivation behind this work was to provide experimental fluid dynamic data for such turbulent flow using a high-permeability filter, given the lack of information about this system in the literature. The results shown herein gave evidence that, at the boundary between the filter mesh and the fluid, a slip velocity condition in the tangential direction does exist, which had not been reported in the literature so far. In the RCF system tested, this accounted for a fluid velocity 10% lower than that of the filter tip, which could be important for the cake formation kinetics during filtration. Evidence confirming the existence of Taylor vortices under conditions of turbulent flow and high permeability, typical of animal cell perfusion RCF systems, was obtained. Second-order turbulence statistics were successfully calculated. The radial behavior of the second-order turbulent moments revealed that turbulence in this system is highly anisotropic, which is relevant for performing numerical simulations of this system. Copyright © 2012 American Institute of Chemical Engineers (AIChE).

  14. Multiscale bilateral filtering for improving image quality in digital breast tomosynthesis

    Science.gov (United States)

    Lu, Yao; Chan, Heang-Ping; Wei, Jun; Hadjiiski, Lubomir M.; Samala, Ravi K.

    2015-01-01

    Purpose: Detection of subtle microcalcifications in digital breast tomosynthesis (DBT) is a challenging task because of the large, noisy DBT volume. It is important to enhance the contrast-to-noise ratio (CNR) of microcalcifications in DBT reconstruction. Most regularization methods depend on local gradient and may treat the ill-defined margins or subtle spiculations of masses and subtle microcalcifications as noise because of their small gradient. The authors developed a new multiscale bilateral filtering (MSBF) regularization method for the simultaneous algebraic reconstruction technique (SART) to improve the CNR of microcalcifications without compromising the quality of masses. Methods: The MSBF exploits a multiscale structure of DBT images to suppress noise and selectively enhance high frequency structures. At the end of each SART iteration, every DBT slice is decomposed into several frequency bands via Laplacian pyramid decomposition. No regularization is applied to the low frequency bands so that subtle edges of masses and structured background are preserved. Bilateral filtering is applied to the high frequency bands to enhance microcalcifications while suppressing noise. The regularized DBT images are used for updating in the next SART iteration. The new MSBF method was compared with the nonconvex total p-variation (TpV) method for noise regularization with SART. A GE GEN2 prototype DBT system was used for acquisition of projections at 21 angles in 3° increments over a ±30° range. The reconstruction image quality with no regularization (NR) and that with the two regularization methods were compared using the DBT scans of a heterogeneous breast phantom and several human subjects with masses and microcalcifications. The CNR and the full width at half maximum (FWHM) of the line profiles of microcalcifications and across the spiculations within their in-focus DBT slices were used as image quality measures. Results: The MSBF method reduced contouring artifacts
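
    One MSBF-style regularization pass on a single 2-D slice can be sketched as follows with OpenCV and NumPy; the pyramid depth and bilateral parameters are illustrative, and the surrounding SART iterations are not shown.

```python
# Laplacian pyramid decomposition, bilateral filtering of the high-frequency bands
# only, and reconstruction; the low-frequency residual is left untouched.
import cv2
import numpy as np

def msbf_pass(slice_img, levels=3):
    g = [slice_img.astype(np.float32)]
    for _ in range(levels):
        g.append(cv2.pyrDown(g[-1]))
    # High-frequency (Laplacian) bands: difference between adjacent pyramid levels.
    lap = [g[i] - cv2.pyrUp(g[i + 1], dstsize=g[i].shape[::-1]) for i in range(levels)]
    # Bilateral filtering of the high-frequency bands; g[levels] stays unregularized.
    lap = [cv2.bilateralFilter(b, d=5, sigmaColor=0.05, sigmaSpace=3) for b in lap]
    out = g[levels]
    for i in reversed(range(levels)):
        out = cv2.pyrUp(out, dstsize=g[i].shape[::-1]) + lap[i]
    return out
```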

  15. 3D spectrum imaging of multi-wall carbon nanotube coupled π-surface modes utilising electron energy-loss spectra acquired using a STEM/Enfina system

    International Nuclear Information System (INIS)

    Seepujak, A.; Bangert, U.; Gutierrez-Sosa, A.; Harvey, A.J.; Blank, V.D.; Kulnitskiy, B.A.; Batov, D.V.

    2005-01-01

    Numerous studies have utilised electron energy-loss (EEL) spectra acquired in the plasmon (2-10 eV) regime in order to probe delocalised π-electronic states of multi-wall carbon nanotubes (MWCNTs) (Interpretation of electron energy loss (EEL) spectra of MWCNTs in the 2-10 eV regime, Carbon, accepted for publication; Blank et al., J. Appl. Phys. 91 (2002) 1657). In the present contribution, EEL spectra were acquired from a 2D raster defined on a bottle-shaped MWCNT, using a Gatan UHV Enfina system attached to a dedicated scanning transmission electron microscope (STEM). The technique utilised to isolate and sequentially filter each of the volume and surface resonances is described in detail. Utilising a scale for the intensity of a filtered mode enables one to 'see' the distribution of each resonance in the raster. This enables striking 3D resonance-filtered spectrum images (SIs) of π-collective modes to be observed. Red-shift of the lower-energy split π-surface resonance provides explicit evidence of the π-surface mode coupling predicted for thin graphitic films (Lucas et al., Phys. Rev. B 49 (1994) 2888). Resonance-filtered SIs are also compared to non-filtered SIs with suppressed surface contributions, acquired utilising a displaced collector aperture. The present filtering technique is seen to isolate surface contributions more effectively, and without the significant loss of statistics associated with the displaced collector aperture mode. Isolation of collective modes utilising 3D resonance-filtered spectrum imaging demonstrates a valuable method for 'pinpointing' the location of discrete modes in irregularly shaped nanostructures.

  16. Dose profile measurement using an imaging plate: Evaluation of filters using Monte Carlo simulation of 4 MV x-rays

    Energy Technology Data Exchange (ETDEWEB)

    Hashimoto, Masatoshi [Division of Radiology and Biomedical Engineering, Graduate School of Medicine, University of Tokyo, Bunkyo-ku, Tokyo 113-8655 (Japan); Department of Therapeutic Radiology, Medical Plaza Edogawa, Edogawa-ku, Tokyo 133-0052 (Japan); Tomita, Tetsuya; Sawada, Koichi [Department of Radiology, Chiba University Hospital, Cyuo-ku, Chiba 260-8677 (Japan); Fujibuchi, Toshioh [Department of Radiological Sciences, School of Health Science, Ibaraki Prefectural University, Inashiki-gun, Ibaraki 300-0394, Japan and Graduate School of Comprehensive Human Sciences, University of Tsukuba, Tsukuba-shi, Ibaraki 305-8575 (Japan); Nishio, Teiji [Particle Therapy Division, Research Center for Innovation Oncology, National Cancer Center Hospital East, Kashiwa-shi, Chiba 277-8577 (Japan); Nakagawa, Keiichi [Division of Radiology and Biomedical Engineering, Graduate School of Medicine, The University of Tokyo, Bunkyo-ku, Tokyo 113-8655 (Japan)

    2009-04-15

    Computed radiography (CR) is gradually replacing film. The application of CR for two-dimensional profiles and off-axis ratio (OAR) measurement using an imaging plate (IP) in a CR system is currently under discussion. However, a well-known problem for IPs in dosimetry is that they use high atomic number (Z) materials, such as Ba, which have an energy dependence in photon interactions. Although there are some reports that it is possible to compensate for the energy dependence with metal filters, the appropriate thicknesses of these filters and where they should be located have not been investigated. The purpose of this study is to find the most suitable filter for use with an IP as a dosimetric tool. Monte Carlo simulation (Geant4 8.1) was used to determine the filter that minimizes the measurement error in OAR measurements of 4 MV x-rays. In this simulation, the material and thickness of the filter and the distance between the IP and the filter were varied to determine the filter conditions that gave the best fit to the MC calculated OAR in water. With regard to changing the filter material, we found that using higher Z and higher density material increased the effectiveness of the filter. Also, increasing the distance between the filter and the IP reduced the effectiveness, whereas increasing the thickness of the filter increased the effectiveness. The result of this study showed that the most appropriate filter conditions consistent with the calculated OAR in water were the ones with the IP sandwiched between two 2 mm thick lead filters at a distance of 5 mm from the IP or the IP sandwiched directly between two 1 mm lead filters. Using these filters, we measured the OAR at 10 cm depth with 100 cm source-to-surface distance and a 10×10 cm² surface field size. These measurements showed that agreement to within 2.0% and 2.0% in the field and within 1.1% and 0.6% out of the field can be achieved by using 2 and

  17. Cervical spine imaging in trauma: Does the use of grid and filter combination improve visualisation of the cervicothoracic junction?

    International Nuclear Information System (INIS)

    Goyal, Nimit; Rachapalli, Vamsidhar; Burns, Helen; Lloyd, David C.F.

    2011-01-01

    Purpose: To evaluate the usefulness of filter and anti-scatter grid combination in demonstrating the cervicothoracic junction in lateral cervical spine radiographs performed for trauma patients. Methods: Following a change in departmental protocol in our hospital, anti-scatter grid and filter are routinely used for lateral cervical spine radiographs in all trauma patients with immobilised cervical spine. A retrospective study was done to compare the efficacy of lateral cervical spine radiographs in demonstrating the cervicothoracic junction for a period of three months before and after the implementation of the change. All images were independently evaluated by two observers. Results: 253 trauma patients had a lateral cervical spine radiograph done in January to March 2003 without using the anti-scatter grid and filter, while 309 patients did in January to March 2007 using filter and grid. Inter-observer variability between the two observers was calculated using Cohen's Kappa, which showed good and very good agreement for 2003 and 2007 respectively. 126 (49.8%) images adequately demonstrated the cervicothoracic junction without using filter and grid while 189 (61.1%) were adequate following their use. This was statistically significant (Fisher's exact test, p value = 0.0081). Conclusion: The use of filter and anti-scatter grids improves the visualisation of the cervicothoracic junction in lateral cervical spine imaging and reduces the need for repeat exposures.
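    For reference, the reported comparison of adequate visualisation rates (126/253 versus 189/309) can be reproduced with a standard two-by-two Fisher test; the sketch below uses SciPy and should give a p-value close to the 0.0081 quoted above.

    ```python
    from scipy.stats import fisher_exact

    # 2x2 table: rows = protocol (without / with grid and filter),
    # columns = cervicothoracic junction shown (adequate / inadequate)
    table = [[126, 253 - 126],   # Jan-Mar 2003, no grid/filter
             [189, 309 - 189]]   # Jan-Mar 2007, grid + filter
    odds_ratio, p_value = fisher_exact(table)
    print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
    ```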

  18. Cervical spine imaging in trauma: Does the use of grid and filter combination improve visualisation of the cervicothoracic junction?

    Energy Technology Data Exchange (ETDEWEB)

    Goyal, Nimit, E-mail: nimitgoyal@doctors.org.u [University Hospital of Wales, Heath Park, Cardiff, CF14 4XW (United Kingdom); Rachapalli, Vamsidhar; Burns, Helen; Lloyd, David C.F. [University Hospital of Wales, Heath Park, Cardiff, CF14 4XW (United Kingdom)

    2011-02-15

    Purpose: To evaluate the usefulness of filter and anti-scatter grid combination in demonstrating the cervicothoracic junction in lateral cervical spine radiographs performed for trauma patients. Methods: Following a change in departmental protocol in our hospital, anti-scatter grid and filter are routinely used for lateral cervical spine radiographs in all trauma patients with immobilised cervical spine. A retrospective study was done to compare the efficacy of lateral cervical spine radiographs in demonstrating the cervicothoracic junction for a period of three months before and after the implementation of the change. All images were independently evaluated by two observers. Results: 253 trauma patients had a lateral cervical spine radiograph done in January to March 2003 without using the anti-scatter grid and filter, while 309 patients did in January to March 2007 using filter and grid. Inter-observer variability between the two observers was calculated using Cohen's Kappa, which showed good and very good agreement for 2003 and 2007 respectively. 126 (49.8%) images adequately demonstrated the cervicothoracic junction without using filter and grid while 189 (61.1%) were adequate following their use. This was statistically significant (Fisher's exact test, p value = 0.0081). Conclusion: The use of filter and anti-scatter grids improves the visualisation of the cervicothoracic junction in lateral cervical spine imaging and reduces the need for repeat exposures.

  19. Stimulated Emission Computed Tomography (NSECT) images enhancement using a linear filter in the frequency domain

    Energy Technology Data Exchange (ETDEWEB)

    Viana, Rodrigo S.S.; Tardelli, Tiago C.; Yoriyaz, Helio, E-mail: hyoriyaz@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Jackowski, Marcel P., E-mail: mjack@ime.usp.b [University of Sao Paulo (USP), SP (Brazil). Dept. of Computer Science

    2011-07-01

    In recent years, a new technique for in vivo spectrographic imaging of stable isotopes was presented as Neutron Stimulated Emission Computed Tomography (NSECT). In this technique, a fast neutron beam stimulates stable nuclei in a sample, which emit characteristic gamma radiation. The photon energy is unique and is used to identify the emitting nuclei. The emitted gamma energy spectra can be used for reconstruction of the target tissue image and for determination of the tissue elemental composition. Due to the stochastic nature of the photon emission process in irradiated tissue, one of the most suitable algorithms for tomographic reconstruction is the Expectation-Maximization (E-M) algorithm, since its formulation simultaneously considers the probabilities of photon emission and detection. However, a disadvantage of this algorithm is the introduction of noise in the reconstructed image as the number of iterations increases. This increase can be caused either by features of the algorithm itself or by the low sampling rate of projections used for tomographic reconstruction. In this work, a linear filter in the frequency domain was used in order to improve the quality of the reconstructed images. (author)

  20. Stimulated Emission Computed Tomography (NSECT) images enhancement using a linear filter in the frequency domain

    International Nuclear Information System (INIS)

    Viana, Rodrigo S.S.; Tardelli, Tiago C.; Yoriyaz, Helio; Jackowski, Marcel P.

    2011-01-01

    In recent years, a new technique for in vivo spectrographic imaging of stable isotopes was presented as Neutron Stimulated Emission Computed Tomography (NSECT). In this technique, a fast neutron beam stimulates stable nuclei in a sample, which emit characteristic gamma radiation. The photon energy is unique and is used to identify the emitting nuclei. The emitted gamma energy spectra can be used for reconstruction of the target tissue image and for determination of the tissue elemental composition. Due to the stochastic nature of the photon emission process in irradiated tissue, one of the most suitable algorithms for tomographic reconstruction is the Expectation-Maximization (E-M) algorithm, since its formulation simultaneously considers the probabilities of photon emission and detection. However, a disadvantage of this algorithm is the introduction of noise in the reconstructed image as the number of iterations increases. This increase can be caused either by features of the algorithm itself or by the low sampling rate of projections used for tomographic reconstruction. In this work, a linear filter in the frequency domain was used in order to improve the quality of the reconstructed images. (author)
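    The abstract does not specify which linear frequency-domain filter was used; the sketch below applies a Butterworth-style low-pass via the FFT purely as an illustration of the general idea, with the cutoff and order as placeholder values.

    ```python
    import numpy as np

    def frequency_domain_lowpass(image, cutoff=0.15, order=4):
        """Illustrative linear filtering in the frequency domain: multiply the
        2D spectrum by a radially symmetric Butterworth-style low-pass window.
        `cutoff` is expressed as a fraction of the Nyquist frequency."""
        ny, nx = image.shape
        fy = np.fft.fftfreq(ny)[:, None]
        fx = np.fft.fftfreq(nx)[None, :]
        r = np.sqrt(fx**2 + fy**2) / 0.5          # radial frequency / Nyquist
        H = 1.0 / (1.0 + (r / cutoff)**(2 * order))
        return np.real(np.fft.ifft2(np.fft.fft2(image) * H))
    ```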

  1. Papaya Tree Detection with UAV Images Using a GPU-Accelerated Scale-Space Filtering Method

    Directory of Open Access Journals (Sweden)

    Hao Jiang

    2017-07-01

    Full Text Available The use of unmanned aerial vehicles (UAV) can allow individual tree detection for forest inventories in a cost-effective way. The scale-space filtering (SSF) algorithm is commonly used and has the capability of detecting trees of different crown sizes. In this study, we made two improvements with regard to the existing method and implementations. First, we incorporated SSF with a Lab color transformation to reduce over-detection problems associated with the original luminance image. Second, we ported four of the most time-consuming processes to the graphics processing unit (GPU) to improve computational efficiency. The proposed method was implemented using PyCUDA, which enabled access to NVIDIA’s compute unified device architecture (CUDA) through high-level scripting of the Python language. Our experiments were conducted using two images captured by the DJI Phantom 3 Professional and a recent NVIDIA GTX 1080 GPU. The resulting accuracy was high, with an F-measure larger than 0.94. The speedup achieved by our parallel implementation was 44.77 and 28.54 for the first and second test image, respectively. For each 4000 × 3000 image, the total runtime was less than 1 s, which was sufficient for real-time performance and interactive application.
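    A CPU-only sketch of the scale-space idea (the paper's GPU port and exact parameters are not reproduced here): convert to Lab as described above, then run a multi-scale Laplacian-of-Gaussian blob detector. Using the negated a* channel as a greenness proxy is an assumption made only for this illustration.

    ```python
    import numpy as np
    from skimage.color import rgb2lab
    from skimage.feature import blob_log

    def detect_crowns(rgb_image, min_sigma=5, max_sigma=30, threshold=0.05):
        """Scale-space crown detection sketch: each returned row is
        (row, col, sigma), with crown radius roughly sqrt(2) * sigma."""
        lab = rgb2lab(rgb_image)
        greenness = -lab[..., 1]                            # vegetation: negative a*
        greenness = (greenness - greenness.min()) / np.ptp(greenness)
        return blob_log(greenness, min_sigma=min_sigma, max_sigma=max_sigma,
                        num_sigma=10, threshold=threshold)
    ```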

  2. Automated microaneurysm detection method based on double ring filter in retinal fundus images

    Science.gov (United States)

    Mizutani, Atsushi; Muramatsu, Chisako; Hatanaka, Yuji; Suemori, Shinsuke; Hara, Takeshi; Fujita, Hiroshi

    2009-02-01

    The presence of microaneurysms in the eye is one of the early signs of diabetic retinopathy, which is one of the leading causes of vision loss. We have been investigating a computerized method for the detection of microaneurysms on retinal fundus images, which were obtained from the Retinopathy Online Challenge (ROC) database. The ROC provides 50 training cases, in which "gold standard" locations of microaneurysms are provided, and 50 test cases without the gold standard locations. In this study, the computerized scheme was developed by using the training cases. Although the results for the test cases are also included, this paper mainly discusses the results for the training cases because the "gold standard" for the test cases is not known. After image preprocessing, candidate regions for microaneurysms were detected using a double-ring filter. Any potential false positives located in the regions corresponding to blood vessels were removed by automatic extraction of blood vessels from the images. Twelve image features were determined, and the candidate lesions were classified into microaneurysms or false positives using the rule-based method and an artificial neural network. The true positive fraction of the proposed method was 0.45 at 27 false positives per image. Forty-two percent of microaneurysms in the 50 training cases were considered invisible by the consensus of two co-investigators. When the method was evaluated for visible microaneurysms, the sensitivity for detecting microaneurysms was 65% at 27 false positives per image. Our computerized detection scheme could be improved for helping ophthalmologists in the early diagnosis of diabetic retinopathy.
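    A rough illustration of the double-ring idea (not the authors' exact filter or parameters): compare the mean intensity of a small inner disk with the mean of the surrounding ring, so that dark, roughly circular spots such as microaneurysm candidates give a large positive response.

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    def _disk(radius):
        y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
        return (x**2 + y**2 <= radius**2).astype(float)

    def double_ring_response(green_channel, r_in=3, r_out=9):
        """Mean of the ring minus mean of the inner disk; radii are
        placeholder values chosen only for this sketch."""
        img = green_channel.astype(float)
        d_in, d_out = _disk(r_in), _disk(r_out)
        ring = d_out.copy()
        ring[r_out - r_in:r_out + r_in + 1,
             r_out - r_in:r_out + r_in + 1] -= d_in        # hollow out the centre
        mean_in = convolve(img, d_in / d_in.sum())
        mean_ring = convolve(img, ring / ring.sum())
        return mean_ring - mean_in
    ```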

  3. Hyper-spectral modulation fluorescent imaging using double acousto-optical tunable filter based on TeO2-crystals

    International Nuclear Information System (INIS)

    Zaytsev, Kirill I; Perchik, Alexey V; Chernomyrdin, Nikita V; Yurchenko, Stanislav O; Kudrin, Konstantin G; Reshetov, Igor V

    2015-01-01

    We have proposed a method for hyper-spectral fluorescent imaging based on acousto-optical filtering. The object of interest was pumped using ultraviolet radiation from a mercury lamp equipped with a monochromatic excitation filter whose transparency window is centered at 365 nm. A double TeO2-based acousto-optical filter, tunable in the range from 430 to 780 nm and having a 2 nm bandwidth of spectral transparency, was used to detect quasimonochromatic images of the object fluorescence. Modulation of the ultraviolet pump intensity was used to reduce the impact of the non-fluorescent background on the fluorescent imaging of the sample. A technique for signal-to-noise ratio improvement, based on fluorescence intensity estimation via digital processing of the modulated video sequence of the fluorescent object, was introduced. We have applied the proposed technique to the study of a test sample and discussed its possible applications.

  4. The use of wavelet filters for reducing noise in posterior fossa Computed Tomography images

    International Nuclear Information System (INIS)

    Pita-Machado, Reinado; Perez-Diaz, Marlen; Lorenzo-Ginori, Juan V.; Bravo-Pino, Rolando

    2014-01-01

    Wavelet transform based de-noising, like wavelet shrinkage, gives good results in CT and affects the spatial resolution very little. Some applications are reconstruction methods, while others are a posteriori de-noising methods. De-noising after reconstruction is very difficult because the noise is non-stationary and has an unknown distribution. Therefore, methods which work in sinogram space do not have this problem, because they always work with a known noise distribution at that point. On the other hand, the posterior fossa in a head CT is a very complex region for physicians, because it is commonly affected by artifacts and noise which are not eliminated during the reconstruction procedure. This can lead to some false positive evaluations. The purpose of our present work is to compare different wavelet shrinkage de-noising filters to reduce noise, particularly in images of the posterior fossa within CT scans, in the sinogram space. This work describes an experimental search for the best wavelets to reduce Poisson noise in Computed Tomography (CT) scans. Results showed that de-noising with wavelet filters improved the quality of the posterior fossa region in terms of an increased CNR, without noticeable structural distortions.
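    A minimal wavelet-shrinkage sketch of the kind of sinogram-space de-noising discussed above, assuming soft thresholding with a universal-style threshold; the wavelet, decomposition level, and threshold rule are illustrative choices rather than the specific filters compared in the paper.

    ```python
    import numpy as np
    import pywt

    def wavelet_shrink(sinogram, wavelet="db4", level=3, k=3.0):
        """Soft-threshold the detail coefficients of a 2D wavelet
        decomposition and reconstruct; the noise level is estimated
        from the finest diagonal detail band (median/0.6745 rule)."""
        coeffs = pywt.wavedec2(sinogram, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
        thr = k * sigma
        shrunk = [coeffs[0]] + [
            tuple(pywt.threshold(c, thr, mode="soft") for c in detail)
            for detail in coeffs[1:]
        ]
        return pywt.waverec2(shrunk, wavelet)
    ```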

  5. Blood Vessel Extraction in Color Retinal Fundus Images with Enhancement Filtering and Unsupervised Classification

    Directory of Open Access Journals (Sweden)

    Zafer Yavuz

    2017-01-01

    Full Text Available Retinal blood vessels have a significant role in the diagnosis and treatment of various retinal diseases such as diabetic retinopathy, glaucoma, arteriosclerosis, and hypertension. For this reason, retinal vasculature extraction is important in order to help specialists in the diagnosis and treatment of systemic diseases. In this paper, a novel approach is developed to extract the retinal blood vessel network. Our method comprises four stages: (1) a preprocessing stage to prepare the dataset for segmentation; (2) an enhancement procedure including Gabor, Frangi, and Gauss filters obtained separately before a top-hat transform; (3) a hard and soft clustering stage which includes K-means and Fuzzy C-means (FCM) to get a binary vessel map; and (4) a postprocessing step which removes falsely segmented isolated regions. The method is tested on color retinal images obtained from the STARE and DRIVE databases, which are available online. As a result, the Gabor filter followed by the K-means clustering method achieves 95.94% and 95.71% accuracy for the STARE and DRIVE databases, respectively, which is acceptable for diagnosis systems.
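    A rough sketch of the Gabor-plus-K-means stage only (the paper's top-hat transform, Frangi and Gauss branches, and postprocessing are omitted): build a small bank of Gabor magnitude responses on the green channel and cluster pixels into vessel and background classes. Bank parameters are placeholders.

    ```python
    import numpy as np
    from skimage.filters import gabor
    from sklearn.cluster import KMeans

    def gabor_kmeans_vessels(green, frequencies=(0.1, 0.2),
                             n_orientations=6, n_clusters=2):
        """Cluster per-pixel Gabor magnitude responses into two classes;
        which cluster corresponds to vessels must be decided afterwards."""
        responses = []
        for f in frequencies:
            for theta in np.linspace(0, np.pi, n_orientations, endpoint=False):
                real, imag = gabor(green, frequency=f, theta=theta)
                responses.append(np.hypot(real, imag))
        features = np.stack(responses, axis=-1).reshape(-1, len(responses))
        labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(features)
        return labels.reshape(green.shape)
    ```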

  6. Chip-scale fluorescence microscope based on a silo-filter complementary metal-oxide semiconductor image sensor.

    Science.gov (United States)

    Ah Lee, Seung; Ou, Xiaoze; Lee, J Eugene; Yang, Changhuei

    2013-06-01

    We demonstrate a silo-filter (SF) complementary metal-oxide semiconductor (CMOS) image sensor for a chip-scale fluorescence microscope. The extruded pixel design with metal walls between neighboring pixels guides fluorescence emission through the thick absorptive filter to the photodiode of a pixel. Our prototype device achieves 13 μm resolution over a wide field of view (4.8 mm × 4.4 mm). We demonstrate bright-field and fluorescence longitudinal imaging of living cells in a compact, low-cost configuration.

  7. Automatic Detection and Evaluation of Solar Cell Micro-Cracks in Electroluminescence Images Using Matched Filters

    Energy Technology Data Exchange (ETDEWEB)

    Spataru, Sergiu; Hacke, Peter; Sera, Dezso

    2016-11-21

    A method for detecting micro-cracks in solar cells using two dimensional matched filters was developed, derived from the electroluminescence intensity profile of typical micro-cracks. We describe the image processing steps to obtain a binary map with the location of the micro-cracks. Finally, we show how to automatically estimate the total length of each micro-crack from these maps, and propose a method to identify severe types of micro-cracks, such as parallel, dendritic, and cracks with multiple orientations. With an optimized threshold parameter, the technique detects over 90 % of cracks larger than 3 cm in length. The method shows great potential for quantifying micro-crack damage after manufacturing or module transportation for the determination of a module quality criterion for cell cracking in photovoltaic modules.
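    A simplified matched-filtering sketch in the spirit of the method above, assuming a dark line on a bright cell as the crack model; the kernel shape, orientations and threshold are illustrative, and the length-measurement and crack-classification steps are not shown.

    ```python
    import numpy as np
    from scipy.ndimage import correlate, rotate

    def line_kernel(length=15, sigma=1.5):
        """Zero-mean, unit-L1 matched filter for a dark vertical line
        with a Gaussian cross-profile."""
        x = np.arange(-3 * sigma, 3 * sigma + 1)
        profile = -np.exp(-x**2 / (2 * sigma**2))
        k = np.tile(profile, (length, 1))
        k -= k.mean()
        return k / np.abs(k).sum()

    def crack_candidate_map(el_image, angles=(0, 45, 90, 135), thresh=0.15):
        """Maximum response over a few kernel orientations, thresholded
        into a binary micro-crack candidate map."""
        img = el_image.astype(float) / el_image.max()
        response = np.full(img.shape, -np.inf)
        for angle in angles:
            kernel = rotate(line_kernel(), angle, reshape=True)
            response = np.maximum(response, correlate(img, kernel))
        return response > thresh
    ```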

  8. Hierarchical detection of red lesions in retinal images by multiscale correlation filtering

    Science.gov (United States)

    Zhang, Bob; Wu, Xiangqian; You, Jane; Li, Qin; Karray, Fakhri

    2009-02-01

    This paper presents an approach to the computer aided diagnosis (CAD) of diabetic retinopathy (DR), a common and severe complication of long-term diabetes which damages the retina and causes blindness. Since red lesions are regarded as the first signs of DR, there has been extensive research on effective detection and localization of these abnormalities in retinal images. In contrast to existing algorithms, a new approach based on Multiscale Correlation Filtering (MSCF) and dynamic thresholding is developed. This consists of two levels, Red Lesion Candidate Detection (coarse level) and True Red Lesion Detection (fine level). The approach was evaluated using data from the Retinopathy On-line Challenge (ROC) competition website, and we conclude that our method is effective and efficient.

  9. Static Hyperspectral Fluorescence Imaging of Viscous Materials Based on a Linear Variable Filter Spectrometer

    Directory of Open Access Journals (Sweden)

    Alexander W. Koch

    2013-09-01

    Full Text Available This paper presents a low-cost hyperspectral measurement setup in a new application based on fluorescence detection in the visible (Vis) wavelength range. The aim of the setup is to take hyperspectral fluorescence images of viscous materials. Based on these images, fluorescent and non-fluorescent impurities in the viscous materials can be detected. For the illumination of the measurement object, a narrow-band high-power light-emitting diode (LED) with a center wavelength of 370 nm was used. The low-cost acquisition unit for the imaging consists of a linear variable filter (LVF) and a complementary metal oxide semiconductor (CMOS) 2D sensor array. The translucent wavelength range of the LVF is from 400 nm to 700 nm. For the confirmation of the concept, static measurements of fluorescent viscous materials with a non-fluorescent impurity have been performed and analyzed. With the presented setup, measurement surfaces in the micrometer range can be provided. The measurable minimum particle size of the impurities is in the nanometer range. The recording rate for the measurements depends on the exposure time of the used CMOS 2D sensor array and has been found to be in the microsecond range.

  10. A center-median filtering method for detection of temporal variation in coronal images

    Directory of Open Access Journals (Sweden)

    Plowman Joseph

    2016-01-01

    Full Text Available Events in the solar corona are often widely separated in their timescales, which can allow them to be identified when they would otherwise be confused with emission from other sources in the corona. Methods for cleanly separating such events based on their timescales are thus desirable for research in the field. This paper develops a technique for identifying time-varying signals in solar coronal image sequences which is based on a per-pixel running median filter and an understanding of photon-counting statistics. Example applications to “EIT waves” (named after EIT, the EUV Imaging Telescope on the Solar and Heliospheric Observatory) and small-scale dynamics are shown, both using 193 Å data from the Atmospheric Imaging Assembly (AIA) on the Solar Dynamics Observatory. The technique is found to discriminate EIT waves more cleanly than the running and base difference techniques most commonly used. It is also demonstrated that there is more signal in the data than is commonly appreciated, finding that the waves can be traced to the edge of the AIA field of view when the data are rebinned to increase the signal-to-noise ratio.
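    A compact sketch of the center-median idea, assuming the image sequence is stacked as a (time, y, x) cube: a per-pixel running median over time estimates the slowly varying background, and the residual, normalised by an approximate photon-counting noise level, highlights transient signals. Window length and normalisation are illustrative choices.

    ```python
    import numpy as np
    from scipy.ndimage import median_filter

    def time_varying_signal(cube, window=11):
        """Per-pixel running median along the time axis; the return value is
        the residual expressed in approximate Poisson sigma units."""
        background = median_filter(cube, size=(window, 1, 1))
        residual = cube - background
        noise = np.sqrt(np.maximum(background, 1.0))   # crude photon-counting noise
        return residual / noise
    ```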

  11. Comparison analysis between filtered back projection and algebraic reconstruction technique on microwave imaging

    Science.gov (United States)

    Ramadhan, Rifqi; Prabowo, Rian Gilang; Aprilliyani, Ria; Basari

    2018-02-01

    The number of victims of cancer and tumors grows each year, and cancer has become one of the leading causes of human deaths in the world. Cancer or tumor tissue cells grow abnormally, taking over and damaging the surrounding tissues. Cancers or tumors do not have definite symptoms in their early stages and can even attack tissues deep inside the body, a condition that is not identifiable by visual human observation. Therefore, an early detection system which is cheap, quick, simple, and portable is essentially required to anticipate the further development of cancer or tumors. Among all of the modalities, microwave imaging is considered a cheap, simple, and portable method. There are at least two simple image reconstruction algorithms, i.e., Filtered Back Projection (FBP) and the Algebraic Reconstruction Technique (ART), which have been adopted in some common modalities. In this paper, both algorithms are compared by reconstructing the image of an artificial tissue model (i.e., phantom) with two different dielectric distributions. We addressed two performance comparisons, namely quantitative and qualitative analysis. Qualitative analysis includes the smoothness of the image and the success in distinguishing dielectric differences by observing the image with human eyesight. Quantitative analysis includes histogram, Structural Similarity Index (SSIM), Mean Squared Error (MSE), and Peak Signal-to-Noise Ratio (PSNR) calculations. As a result, the quantitative parameters of FBP may show better values than those of ART. However, ART is more capable of distinguishing two different dielectric values than FBP, due to its higher contrast and wider grayscale distribution.
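    The quantitative part of such a comparison is straightforward to reproduce; the sketch below computes MSE, PSNR and SSIM for two reconstructions against a known phantom using scikit-image, assuming all images are scaled to [0, 1].

    ```python
    from skimage.metrics import (mean_squared_error,
                                 peak_signal_noise_ratio,
                                 structural_similarity)

    def compare_reconstructions(phantom, recon_fbp, recon_art):
        """Return MSE, PSNR and SSIM for FBP and ART reconstructions
        relative to the ground-truth phantom (images scaled to [0, 1])."""
        metrics = {}
        for name, recon in (("FBP", recon_fbp), ("ART", recon_art)):
            metrics[name] = {
                "MSE": mean_squared_error(phantom, recon),
                "PSNR": peak_signal_noise_ratio(phantom, recon, data_range=1.0),
                "SSIM": structural_similarity(phantom, recon, data_range=1.0),
            }
        return metrics
    ```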

  12. Convergent Filter Bases

    Directory of Open Access Journals (Sweden)

    Coghetto Roland

    2015-09-01

    Full Text Available We are inspired by the work of Henri Cartan [16], Bourbaki [10] (TG. I Filtres) and Claude Wagschal [34]. We define the base of filter, image filter, convergent filter bases, limit filter and the filter base of tails (fr: filtre des sections).
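    For orientation, the standard textbook definitions behind these notions are roughly the following (a sketch, not the article's exact Mizar statements):

    ```latex
    \begin{itemize}
      \item A family $\mathcal{B}$ of nonempty subsets of $X$ is a \emph{filter base}
            if for all $A,B \in \mathcal{B}$ there exists $C \in \mathcal{B}$ with
            $C \subseteq A \cap B$.
      \item For a map $f\colon X \to Y$, the \emph{image filter base} is
            $f(\mathcal{B}) = \{\, f(A) : A \in \mathcal{B} \,\}$.
      \item In a topological space, $\mathcal{B}$ \emph{converges} to $x$ if every
            neighbourhood of $x$ contains some member of $\mathcal{B}$.
    \end{itemize}
    ```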

  13. Filtering and deconvolution for bioluminescence imaging of small animals; Filtrage et deconvolution en imagerie de bioluminescence chez le petit animal

    Energy Technology Data Exchange (ETDEWEB)

    Akkoul, S.

    2010-06-22

    This thesis is devoted to the analysis of bioluminescence images of small animals. This kind of imaging modality is used in oncology studies. Nevertheless, the light from internal bioluminescent sources is diffused and absorbed by the tissues. In addition, system noise and cosmic ray noise are present. This degrades the quality of the images and makes them difficult to analyze. The purpose of this thesis is to overcome these disturbing effects. We first proposed an image formation model for the bioluminescence images. The processing chain consists of a filtering stage followed by a deconvolution stage. We proposed a new median filter to suppress the random-valued impulsive noise which corrupts the acquired images; this filter represents the first block of the proposed chain. For the deconvolution stage, we performed a comparative study of various deconvolution algorithms. It allowed us to choose a blind deconvolution algorithm initialized with the estimated point spread function of the acquisition system. We first validated our global approach by comparing our results with the ground truth. Through various clinical tests, we showed that the processing chain allows a significant improvement of the spatial resolution and a better distinction of very close tumor sources, which represents a considerable contribution for users of bioluminescence images. (author)
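    A two-stage sketch mirroring the chain described above, with stand-ins for both blocks: a plain median filter in place of the author's modified median filter, and Richardson-Lucy deconvolution (with a known PSF) in place of the blind deconvolution stage; sizes and iteration counts are placeholders.

    ```python
    import numpy as np
    from scipy.ndimage import median_filter
    from skimage.restoration import richardson_lucy

    def clean_and_deconvolve(bli_image, psf, median_size=3, n_iter=30):
        """Impulse-noise suppression followed by deconvolution."""
        filtered = median_filter(bli_image.astype(float), size=median_size)
        filtered /= filtered.max()          # richardson_lucy expects values in [0, 1]
        return richardson_lucy(filtered, psf, n_iter)
    ```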

  14. KALMAN FILTER BASED FEATURE ANALYSIS FOR TRACKING PEOPLE FROM AIRBORNE IMAGES

    Directory of Open Access Journals (Sweden)

    B. Sirmacek

    2012-09-01

    Full Text Available Recently, the real-time analysis of mass events using computer vision techniques has become a very important research field. In particular, understanding the motion of people can help prevent unpleasant conditions. Understanding the behavioral dynamics of people can also help to estimate future states of underground passages, public entrances such as shopping centers, or streets. In order to bring an automated solution to this problem, we propose a novel approach using airborne image sequences. Although airborne image resolutions are not sufficient to see each person in detail, we can still notice a change of color components in the place where a person exists. Therefore, we propose a color feature detection based probabilistic framework in order to detect people automatically. Extracted local features behave as observations of the probability density function (pdf) of the people locations to be estimated. Using an adaptive kernel density estimation method, we estimate the corresponding pdf. First, we use the estimated pdf to detect boundaries of dense crowds. After that, using background information of dense crowds and previously extracted local features, we detect other people in non-crowd regions automatically for each image in the sequence. We benefit from Kalman filtering to track the motion of detected people. To test our algorithm, we use a stadium entrance image data set taken from an airborne camera system. Our experimental results indicate possible usage of the algorithm in real-life mass events. We believe that the proposed approach can also provide crucial information to police departments and crisis management teams to achieve more detailed observations of people in large open area events to prevent possible accidents or unpleasant conditions.
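    The tracking stage can be illustrated with a textbook constant-velocity Kalman filter over one person's detected (x, y) positions; the state model and noise levels below are generic placeholders, not the paper's settings.

    ```python
    import numpy as np

    def kalman_track(measurements, dt=1.0, q=1e-2, r=1.0):
        """Smooth a sequence of (x, y) detections with a constant-velocity
        Kalman filter; returns the filtered positions."""
        F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
                      [0, 0, 1, 0], [0, 0, 0, 1]], float)   # state transition
        H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)   # observe position only
        Q, R = q * np.eye(4), r * np.eye(2)
        x = np.array([*measurements[0], 0.0, 0.0])
        P = np.eye(4)
        track = []
        for z in measurements:
            x, P = F @ x, F @ P @ F.T + Q                   # predict
            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # Kalman gain
            x = x + K @ (np.asarray(z, float) - H @ x)      # update
            P = (np.eye(4) - K @ H) @ P
            track.append(x[:2].copy())
        return np.array(track)
    ```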

  15. DESIGN OF DYADIC-INTEGER-COEFFICIENTS BASED BI-ORTHOGONAL WAVELET FILTERS FOR IMAGE SUPER-RESOLUTION USING SUB-PIXEL IMAGE REGISTRATION

    Directory of Open Access Journals (Sweden)

    P.B. Chopade

    2014-05-01

    Full Text Available This paper presents an image super-resolution scheme based on sub-pixel image registration through the design of a specific class of dyadic-integer-coefficient wavelet filters derived from the construction of a half-band polynomial. First, the integer-coefficient half-band polynomial is designed by the splitting approach. Next, this designed half-band polynomial is factorized and assigned a specific number of vanishing moments and roots to obtain the dyadic-integer-coefficient low-pass analysis and synthesis filters. The potential of these dyadic-integer-coefficient wavelet filters is explored in the field of image super-resolution using sub-pixel image registration. The two low-resolution frames are registered at a specific shift from one another to restore the resolution lost by the CCD array of the camera. The discrete wavelet transform (DWT) obtained from the designed coefficients is applied to these two low-resolution images to obtain the high-resolution image. The developed approach is validated by comparing the quality metrics with existing filter banks.

  16. Edge Detection from High Resolution Remote Sensing Images using Two-Dimensional log Gabor Filter in Frequency Domain

    International Nuclear Information System (INIS)

    Wang, K; Yu, T; Meng, Q Y; Wang, G K; Li, S P; Liu, S H

    2014-01-01

    Edges are vital features for describing the structural information of images, especially high spatial resolution remote sensing images. Edge features can be used to define the boundaries between different ground objects in high spatial resolution remote sensing images. Thus edge detection is important in remote sensing image processing. Even though many different edge detection algorithms have been proposed, it is difficult to extract edge features from high spatial resolution remote sensing images that include complex ground objects. This paper introduces a novel method to detect edges from high spatial resolution remote sensing images based on the frequency domain. Firstly, the high spatial resolution remote sensing images are Fourier transformed to obtain the magnitude spectrum image (frequency image) by FFT. Then, the frequency spectrum is analyzed by using radius and angle sampling, and a two-dimensional log Gabor filter with optimal parameters is designed according to the result of the spectrum analysis. Finally, the dot product between the Fourier transform of the image and the log Gabor filter is inverse Fourier transformed to obtain the detections. The experimental result shows that the proposed algorithm can detect edge features from high resolution remote sensing images effectively.
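    A single-orientation frequency-domain log-Gabor filtering sketch (the paper's parameter optimisation from the spectrum analysis is not reproduced; all values below are illustrative):

    ```python
    import numpy as np

    def log_gabor_response(image, f0=0.1, sigma_on_f=0.55,
                           theta0=0.0, sigma_theta=np.pi / 8):
        """Multiply the image spectrum by a radial log-Gabor window and an
        angular Gaussian window, then return the magnitude of the inverse
        transform as the edge response for orientation theta0."""
        ny, nx = image.shape
        fy = np.fft.fftfreq(ny)[:, None]
        fx = np.fft.fftfreq(nx)[None, :]
        f = np.sqrt(fx**2 + fy**2)
        theta = np.arctan2(fy, fx)
        with np.errstate(divide="ignore"):
            radial = np.exp(-(np.log(f / f0))**2 / (2 * np.log(sigma_on_f)**2))
        radial[f == 0] = 0.0                      # log-Gabor has no DC component
        dtheta = np.arctan2(np.sin(theta - theta0), np.cos(theta - theta0))
        angular = np.exp(-dtheta**2 / (2 * sigma_theta**2))
        filtered = np.fft.ifft2(np.fft.fft2(image) * radial * angular)
        return np.abs(filtered)
    ```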

  17. Quaternionic Spatiotemporal Filtering for Dense Motion Field Estimation in Ultrasound Imaging

    Directory of Open Access Journals (Sweden)

    Marion Adrien

    2010-01-01

    Full Text Available Blood motion estimation provides fundamental clinical information to prevent and detect pathologies such as cancer. Ultrasound imaging associated with Doppler methods is often used for blood flow evaluation. However, Doppler methods suffer from shortcomings such as limited spatial resolution and the inability to estimate lateral motion. Numerous methods such as block matching and decorrelation-based techniques have been proposed to overcome these limitations. In this paper, we propose an original method to estimate dense fields of vector velocity from ultrasound image sequences. Our proposal is based on a spatiotemporal approach and considers 2D+t data as a 3D volume. Orientation of the texture within this volume is related to velocity. Thus, we designed a bank of 3D quaternionic filters to estimate local orientation and then calculate local velocities. The method was applied to a large set of experimental and simulated flow sequences with low motion (< 1 mm/s) within small vessels (< 1 mm). Evaluation was conducted with several quantitative criteria such as the normalized mean error or the estimated mean velocity. The results obtained show the good behaviour of our method, characterizing the flows studied.

  18. Spatial filtering self-velocimeter for vehicle application using a CMOS linear image sensor

    Science.gov (United States)

    He, Xin; Zhou, Jian; Nie, Xiaoming; Long, Xingwu

    2015-03-01

    The idea of using a spatial filtering velocimeter (SFV) to measure the velocity of a vehicle for an inertial navigation system is put forward. The presented SFV is based on a CMOS linear image sensor with a high-speed data rate, large pixel size, and built-in timing generator. These advantages make the image sensor suitable to measure vehicle velocity. The power spectrum of the output signal is obtained by fast Fourier transform and is corrected by a frequency spectrum correction algorithm. This velocimeter was used to measure the velocity of a conveyor belt driven by a rotary table and the measurement uncertainty is ˜0.54%. Furthermore, it was also installed on a vehicle together with a laser Doppler velocimeter (LDV) to measure self-velocity. The measurement result of the designed SFV is compared with that of the LDV. It is shown that the measurement result of the SFV is coincident with that of the LDV. Therefore, the designed SFV is suitable for a vehicle self-contained inertial navigation system.

  19. GPR image analysis to locate water leaks from buried pipes by applying variance filters

    Science.gov (United States)

    Ocaña-Levario, Silvia J.; Carreño-Alvarado, Elizabeth P.; Ayala-Cabrera, David; Izquierdo, Joaquín

    2018-05-01

    Nowadays, there is growing interest in controlling and reducing the amount of water lost through leakage in water supply systems (WSSs). Leakage is, in fact, one of the biggest problems faced by the managers of these utilities. This work addresses the problem of leakage in WSSs by using GPR (Ground Penetrating Radar) as a non-destructive method. The main objective is to identify and extract features from GPR images, such as leaks and components, under controlled laboratory conditions using a methodology based on second-order statistical parameters and, using the obtained features, to create 3D models that allow quick visualization of components and leaks in WSSs from GPR image analysis and subsequent interpretation. This methodology has been used before in other fields and provided promising results. The results obtained with the proposed methodology are presented, analyzed, interpreted and compared with the results obtained by using a well-established multi-agent based methodology. These results show that the variance filter is capable of highlighting the characteristics of components and anomalies in an intuitive manner, which can be identified by non-highly qualified personnel using the 3D models we develop. This research intends to pave the way towards future intelligent detection systems that enable the automatic detection of leaks in WSSs.
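    The second-order statistic at the core of the approach is easy to state: a local variance computed as E[x²] − E[x]² over a sliding window. The sketch below uses SciPy uniform filters; the window size is an illustrative choice.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def variance_filter(gpr_slice, size=7):
        """Sliding-window variance of a radargram slice; high-variance regions
        tend to highlight pipes, leaks and other anomalies."""
        img = gpr_slice.astype(float)
        local_mean = uniform_filter(img, size)
        local_mean_sq = uniform_filter(img**2, size)
        return np.clip(local_mean_sq - local_mean**2, 0.0, None)
    ```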

  20. Distinguishing the Noise and image structures for detecting the correction term and filtering the noise by using fuzzy rules

    OpenAIRE

    Sridevi.Ravada,; Vani prasanna.Kanakala,; Ramya.Koilada

    2011-01-01

    A fuzzy filter is constructed from a set of fuzzy IF-THEN rules; these fuzzy rules come either from human experts or from matching input-output pairs. In this paper we propose a new fuzzy filter for the noise reduction of images corrupted with additive noise. In this approach, fuzzy derivatives for all eight directions, that is N, E, W, S, NE, NW, SE, SW, are first calculated using fuzzy IF-THEN rules and membership functions. Further, the fuzzy derivative values obtained are used in the fu...

  1. Field programmable gate array based hardware implementation of a gradient filter for edge detection in colour images with subpixel precision

    International Nuclear Information System (INIS)

    Schellhorn, M; Rosenberger, M; Correns, M; Blau, M; Goepfert, A; Rueckwardt, M; Linss, G

    2010-01-01

    Within the field of industrial image processing, the use of colour cameras is becoming ever more common. Increasingly, the established black-and-white cameras are being replaced by economical single-chip colour cameras with a Bayer pattern. The additional colour information is particularly important for recognition or inspection tasks, and it also becomes interesting for geometric metrology if measuring tasks can be solved more robustly or more exactly. However, only a few suitable algorithms are available to detect edges with the necessary precision, and all of them require additional computational expenditure. On the basis of a new filter for edge detection in colour images with subpixel precision, an implementation on a pre-processing hardware platform is presented. Hardware-implemented filters offer the advantage that they can be used easily with existing measuring software, since after the filtering a single-channel image is present which unites the information of all colour channels. Advanced field programmable gate arrays represent an ideal platform for the parallel processing of multiple channels. An efficient implementation, however, requires considerable programming effort. Using the colour filter implementation as an example, the problems that arise are analyzed and the chosen solution is presented.

  2. Pupil filter design by using a Bessel functions basis at the image plane.

    Science.gov (United States)

    Canales, Vidal F; Cagigal, Manuel P

    2006-10-30

    Many applications can benefit from the use of pupil filters for controlling the light intensity distribution near the focus of an optical system. Most of the design methods for such filters are based on a second-order expansion of the Point Spread Function (PSF). Here, we present a new procedure for designing radially-symmetric pupil filters. It is more precise than previous procedures as it considers the exact expression of the PSF, expanded as a function of first-order Bessel functions. Furthermore, this new method presents other advantages: the height of the side lobes can be easily controlled, it allows the design of amplitude-only, phase-only or hybrid filters, and the coefficients of the PSF expansion can be directly related to filter parameters. Finally, our procedure allows the design of filters with very different behaviours and optimal performance.

  3. SU-F-J-28: Development of a New Imaging Filter to Remove the Shadows From the Carbon Fiber Grid Table Top

    Energy Technology Data Exchange (ETDEWEB)

    Maehana, W [Kanagawa Cancer Center, Yokohama, Kanagawa (Japan); Yokohama National University, Yokohama, kanagawa (Japan); Nagao, T [Yokohama National University, Yokohama, kanagawa (Japan)

    2016-06-15

    Purpose: In image guided radiation therapy (IGRT), the shadows caused by the construction of the treatment couch top adversely affect visual evaluation. Therefore, we developed a new imaging filter to remove these shadows. The performance of the new filter was evaluated using clinical images. Methods: The new filter was composed of a band-pass filter (BPF) weighted by a k factor and a low-pass filter (LPF). In the frequency domain, the stop bandwidths were 8.3×10³ mm⁻¹ in the u direction and 11.1×10³ mm⁻¹ in the v direction for the BPF, and the pass bandwidths were 8.3×10³ mm⁻¹ in the u direction and 11.1×10³ mm⁻¹ in the v direction for the LPF. After applying the filter, the shadows from the carbon fiber grid table top (CFGTT, Varian) on the kV-image were removed. To check the filter effect, we compared the clinical images of the thorax and thoracoabdominal region with and without the filter. A subjective evaluation test using a three-point scale (agree, neither agree nor disagree, disagree) was performed with 15 persons in the department of radiation oncology. Results: We succeeded in removing all shadows of the CFGTT using the new filter. The filter is very useful, as shown by the results of the subjective evaluation, in which 23 of 30 ratings agreed with the filtered clinical images. Conclusion: We concluded that the proposed method is a useful tool for IGRT and that the new filter leads to an improvement in the accuracy of radiation therapy.

  4. Effect of antibrowning agents on fresh-cut potato tubers using frequency filtering of biospeckle images

    International Nuclear Information System (INIS)

    Minz, Preeti D; Ansari, Md Zaheer; Nirala, A K

    2015-01-01

    Our present work aims to study the physiological changes of chemically treated fresh-cut potato tubers and to correlate such changes with the results of the non-destructive and non-invasive laser biospeckle technique. The effect of chemical treatment (citric acid (CA-0.5% and 1.0%) and citric acid + sodium chloride (CS-0.5% and 1.0%)) of fresh-cut potato tubers on physiological activities such as the respiration rate and weight loss in cold storage has been studied for eight consecutive days. In addition, biospeckle recording has been carried out for eight consecutive days for all the chemically treated samples and, from the captured images, the numerical results (inertia moment (IM)) with and without frequency filtering have been obtained. A comparatively higher respiration rate and lower weight loss are observed for CS-treated samples than for CA-treated samples. The IM results obtained with the exclusion of the higher frequencies show a behaviour similar to the respiration rate, and the separations of the respiration curves at the two concentrations for both treatments are well correlated with the IM curves. The concentration effect of both treatments on the IM value with the exclusion of lower frequencies has also been presented. Thus the IM method with filtering of particular bands is able to separate the different physiological phenomena from one another and is also able to differentiate the chemical effect on the samples. (paper)

  5. A novel spatiotemporal muscle activity imaging approach based on the Extended Kalman Filter.

    Science.gov (United States)

    Wang, Jing; Zhang, Yingchun; Zhu, Xiangjun; Zhou, Ping; Liu, Chenguang; Rymer, William Z

    2012-01-01

    A novel spatiotemporal muscle activity imaging (sMAI) approach has been developed using the Extended Kalman Filter (EKF) to reconstruct internal muscle activities from non-invasive multi-channel surface electromyogram (sEMG) recordings. A distributed bioelectric dipole source model is employed to describe the internal muscle activity space, and a linear relationship between the muscle activity space and the sEMG measurement space is then established. The EKF is employed to recursively solve the ill-posed inverse problem in the sMAI approach, in which the weighted minimum norm (WMN) method is utilized to calculate the initial state and a new nonlinear method is developed based on the propagating features of muscle activities to predict the recursive state. A series of computer simulations was conducted to test the performance of the proposed sMAI approach. Results show that the localization error rapidly decreases over 35% and the overlap ratio rapidly increases over 45% compared to the results achieved using the WMN method only. The present promising results demonstrate the feasibility of utilizing the proposed EKF-based sMAI approach to accurately reconstruct internal muscle activities from non-invasive sEMG recordings.

  6. STUDIES OF NGC 6720 WITH CALIBRATED HST/WFC3 EMISSION-LINE FILTER IMAGES. I. STRUCTURE AND EVOLUTION

    International Nuclear Information System (INIS)

    O'Dell, C. R.; Ferland, G. J.; Henney, W. J.; Peimbert, M.

    2013-01-01

    We have performed a detailed analysis of the Ring Nebula (NGC 6720) using Hubble Space Telescope WFC3 images and derived a new three-dimensional model. Existing high spectral resolution spectra played an important supplementary role in our modeling. It is shown that the Main Ring of the nebula is an ionization-bounded irregular non-symmetric disk with a central cavity and perpendicular extended lobes pointed almost toward the observer. The faint outer halos are determined to be fossil radiation, i.e., radiation from gas ionized in an earlier stage of the nebula when it was not ionization bounded. The narrowband WFC3 filters that isolate some of the emission lines are affected by broadening on their short wavelength side and all the filters were calibrated using ground-based spectra. The filter calibration results are presented in an appendix.

  7. A Kalman Filter-Based Method to Generate Continuous Time Series of Medium-Resolution NDVI Images

    Directory of Open Access Journals (Sweden)

    Fernando Sedano

    2014-12-01

    Full Text Available A data assimilation method to produce complete temporal sequences of synthetic medium-resolution images is presented. The method implements a Kalman filter recursive algorithm that integrates medium and moderate resolution imagery. To demonstrate the approach, time series of 30-m spatial resolution NDVI images at 16-day time steps were generated using Landsat NDVI images and MODIS NDVI products at four sites with different ecosystems and land cover-land use dynamics. The results show that the time series of synthetic NDVI images captured seasonal land surface dynamics and maintained the spatial structure of the landscape at higher spatial resolution. The time series of synthetic medium-resolution NDVI images were validated within a Monte Carlo simulation framework. Normalized residuals decreased as the number of available observations increased, ranging from 0.2 to below 0.1. Residuals were also significantly lower for time series of synthetic NDVI images generated at combined recursion (smoothing) than individually at forward and backward recursions (filtering). Conversely, the uncertainties of the synthetic images also decreased when the number of available observations increased and combined recursions were implemented.

  8. Speckle Reduction for Ultrasonic Imaging Using Frequency Compounding and Despeckling Filters along with Coded Excitation and Pulse Compression

    Directory of Open Access Journals (Sweden)

    Joshua S. Ullom

    2012-01-01

    Full Text Available A method for improving the contrast-to-noise ratio (CNR) while maintaining the −6 dB axial resolution of ultrasonic B-mode images is proposed. The technique proposed is known as eREC-FC, which enhances a recently developed REC-FC technique. REC-FC is a combination of the coded excitation technique known as resolution enhancement compression (REC) and the speckle-reduction technique frequency compounding (FC). In REC-FC, image CNR is improved but at the expense of a reduction in axial resolution. However, by compounding various REC-FC images made from various subband widths, the tradeoff between axial resolution and CNR enhancement can be extended. Further improvements in CNR can be obtained by applying postprocessing despeckling filters to the eREC-FC B-mode images. The despeckling filters evaluated were the following: median, Lee, homogeneous mask area, geometric, and speckle-reducing anisotropic diffusion (SRAD). Simulations and experimental measurements were conducted with a single-element transducer (f/2.66) having a center frequency of 2.25 MHz and a −3 dB bandwidth of 50%. In simulations and experiments, the eREC-FC technique resulted in the same axial resolution that would be typically observed with conventional excitation with a pulse. Moreover, increases in CNR of 348% were obtained in experiments when comparing eREC-FC with a Lee filter to conventional pulsing methods.
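    Of the despeckling filters listed, the Lee filter is the simplest to sketch: an adaptive local-statistics estimator that smooths homogeneous regions while preserving edges. The window size and noise-variance estimate below are illustrative, not the paper's settings.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def lee_filter(bmode, size=7, noise_var=None):
        """Classic Lee filter: output = local mean + gain * (pixel - local mean),
        with gain derived from local and noise variance estimates."""
        img = bmode.astype(float)
        mean = uniform_filter(img, size)
        var = uniform_filter(img**2, size) - mean**2
        if noise_var is None:
            noise_var = np.mean(var)                      # crude global noise estimate
        gain = np.clip((var - noise_var) / np.maximum(var, 1e-12), 0.0, 1.0)
        return mean + gain * (img - mean)
    ```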

  9. CATE 2016 Indonesia: normalized radial graded filtering, site-to-site image registration, and preliminary results

    Science.gov (United States)

    Jensen, L.; Kovac, S. A.; Hare, H. S.; Mitchell, A. M.; McKay, M. A.; Bosh, R.; Watson, Z.; Penn, M.

    2016-12-01

    An area of the solar corona from 1 out to approximately 2.5 solar radii is currently poorly sampled in astronomy. This is largely due to difficulties inherent in observing the sun from space and from the ground. Specifically focusing on ground based observations, the main problem is scattered light in the Earth's atmosphere and in the telescopes themselves. A total solar eclipse solves this problem by blocking the light from the photosphere of the sun before it enters the atmosphere, reducing the scattered light in the atmosphere by a factor of 10,000. However, using a total solar eclipse introduces another challenge due to the small window of time it provides. At any given location in 2017, the totality will last for only about 2.5 minutes and such a small data set limits the studies that can be done on the inner corona. The Citizen Continental-America Telescopic Eclipse Experiment plans to overcome this issue by taking advantage of America's infrastructure and using 60 identical telescopes to collect continuous data of the solar eclipse as the shadow travels from Oregon to South Carolina. By splicing these data together 90 minutes of one-of-a-kind data can be collected, revealing the dynamics of the inner corona as never seen before. For the 2016 Indonesian total solar eclipse the CATE project collected data using 5 sites along the eclipse path. These data were then used to develop processing programs to use on future data. These processes included site-to-site image registration as well as normalized radial graded filtering of the images. Programs were also developed to begin performing studies on the data including overlapping CATE and LASCO space telescope data for a total coronal image as well as thread tracing routines to quantify direction in the coronal filaments. This work was made possible through the National Solar Observatory Research Experiences for Undergraduates (REU) Program, which is funded by the National Science Foundation (NSF). The NSO

  10. From Pixels to Response Maps: Discriminative Image Filtering for Face Alignment in the Wild

    NARCIS (Netherlands)

    Asthana, Akshay; Asthana, Ashish; Zafeiriou, Stefanos; Tzimiropoulos, Georgios; Cheng, Shiyang; Pantic, Maja

    2015-01-01

    We propose a face alignment framework that relies on the texture model generated by the responses of discriminatively trained part-based filters. Unlike standard texture models built from pixel intensities or responses generated by generic filters (e.g. Gabor), our framework has two important

  11. A pilot study on slit lamp-adapted optical coherence tomography imaging of trabeculectomy filtering blebs.

    NARCIS (Netherlands)

    Theelen, T.; Wesseling, P.; Keunen, J.E.E.; Klevering, B.J.

    2007-01-01

    BACKGROUND: Our study aims to identify anatomical characteristics of glaucoma filtering blebs by means of slit lamp-adapted optical coherence tomography (SL-OCT) and to identify new parameters for the functional prognosis of the filter in the early post-operative period. METHODS: Patients with

  12. Comparison of pure and hybrid iterative reconstruction techniques with conventional filtered back projection: Image quality assessment in the cervicothoracic region

    International Nuclear Information System (INIS)

    Katsura, Masaki; Sato, Jiro; Akahane, Masaaki; Matsuda, Izuru; Ishida, Masanori; Yasaka, Koichiro; Kunimatsu, Akira; Ohtomo, Kuni

    2013-01-01

    Objectives: To evaluate the impact on image quality of three different image reconstruction techniques in the cervicothoracic region: model-based iterative reconstruction (MBIR), adaptive statistical iterative reconstruction (ASIR), and filtered back projection (FBP). Methods: Forty-four patients underwent unenhanced standard-of-care clinical computed tomography (CT) examinations which included the cervicothoracic region with a 64-row multidetector CT scanner. Images were reconstructed with FBP, 50% ASIR-FBP blending (ASIR50), and MBIR. Two radiologists assessed the cervicothoracic region in a blinded manner for streak artifacts, pixilated blotchy appearances, critical reproduction of visually sharp anatomical structures (thyroid gland, common carotid artery, and esophagus), and overall diagnostic acceptability. Objective image noise was measured in the internal jugular vein. Data were analyzed using the sign test and pair-wise Student's t-test. Results: MBIR images had significant lower quantitative image noise (8.88 ± 1.32) compared to ASIR images (18.63 ± 4.19, P 0.9 for ASIR vs. FBP for both readers). MBIR images were all diagnostically acceptable. Unique features of MBIR images included pixilated blotchy appearances, which did not adversely affect diagnostic acceptability. Conclusions: MBIR significantly improves image noise and streak artifacts of the cervicothoracic region over ASIR and FBP. MBIR is expected to enhance the value of CT examinations for areas where image noise and streak artifacts are problematic

  13. On-Line Multi-Damage Scanning Spatial-Wavenumber Filter Based Imaging Method for Aircraft Composite Structure

    Directory of Open Access Journals (Sweden)

    Yuanqiang Ren

    2017-05-01

    Full Text Available Structural health monitoring (SHM) of aircraft composite structures is helpful to increase reliability and reduce maintenance costs. Due to its great effectiveness in distinguishing particular guided wave modes and identifying the propagation direction, the spatial-wavenumber filter technique has emerged as an interesting SHM topic. In this paper, a new scanning spatial-wavenumber filter (SSWF) based imaging method for multiple damages is proposed to conduct on-line monitoring of aircraft composite structures. Firstly, an on-line multi-damage SSWF is established, including the fundamental principle of SSWF for multiple damages based on a linear piezoelectric (PZT) sensor array, and a corresponding wavenumber-time imaging mechanism by using the multi-damage scattering signal. Secondly, through combining the on-line multi-damage SSWF and a PZT 2D cross-shaped array, an image-mapping method is proposed to conduct wavenumber synthesis and convert the two wavenumber-time images obtained by the PZT 2D cross-shaped array to an angle-distance image, from which the multiple damages can be directly recognized and located. In the experimental validation, both simulated multi-damage and real multi-damage introduced by repeated impacts are performed on a composite plate structure. The maximum localization error is less than 2 cm, which shows good performance of the multi-damage imaging method. Compared with the existing spatial-wavenumber filter based damage evaluation methods, the proposed method requires no more than the multi-damage scattering signal and can be performed without depending on any wavenumber modeling or measuring. Besides, this method locates multiple damages by imaging instead of the geometric method, which helps to improve the signal-to-noise ratio. Thus, it can be easily applied to on-line multi-damage monitoring of aircraft composite structures.

  14. Improved automatic filtering methodology for an optimal pharmacokinetic modelling of DCE-MR images of the prostate

    Energy Technology Data Exchange (ETDEWEB)

    Vazquez Martinez, V.; Bosch Roig, I.; Sanz Requena, R.

    2016-07-01

    In Dynamic Contrast-Enhanced Magnetic Resonance (DCEMR) studies with high temporal resolution, images are quite noisy due to the complicated balance between temporal and spatial resolution. For this reason, the temporal curves extracted from the images present considerable noise levels and, because of that, the pharmacokinetic parameters calculated by least squares fitting from the curves and the arterial phase (a useful marker in tumour diagnosis which appears in curves with high arterial contribution) are affected. In order to overcome these limitations, an automatic filtering method was developed by our group. In this work, an advanced automatic filtering methodology is presented to further improve noise reduction of the temporal curves in order to obtain more accurate kinetic parameters and a proper modelling of the arterial phase. (Author)

  15. Supervised retinal vessel segmentation from color fundus images based on matched filtering and AdaBoost classifier.

    Directory of Open Access Journals (Sweden)

    Nogol Memari

    Full Text Available The structure and appearance of the blood vessel network in retinal fundus images are an essential part of diagnosing various problems associated with the eyes, such as diabetes and hypertension. In this paper, an automatic retinal vessel segmentation method utilizing matched filter techniques coupled with an AdaBoost classifier is proposed. The fundus image is enhanced using morphological operations, the contrast is increased using the contrast limited adaptive histogram equalization (CLAHE) method and the inhomogeneity is corrected using the Retinex approach. Then, the blood vessels are enhanced using a combination of B-COSFIRE and Frangi matched filters. From this preprocessed image, different statistical features are computed on a pixel-wise basis and used in an AdaBoost classifier to extract the blood vessel network inside the image. Finally, the segmented images are postprocessed to remove the misclassified pixels and regions. The proposed method was validated using the publicly accessible Digital Retinal Images for Vessel Extraction (DRIVE), Structured Analysis of the Retina (STARE) and Child Heart and Health Study in England (CHASE_DB1) datasets commonly used for determining the accuracy of retinal vessel segmentation methods. The accuracy of the proposed segmentation method was comparable to other state-of-the-art methods while being very close to the manual segmentation provided by the second human observer with an average accuracy of 0.972, 0.951 and 0.948 in the DRIVE, STARE and CHASE_DB1 datasets, respectively.
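
    A minimal sketch of the preprocessing and vessel-enhancement stages described above, assuming scikit-image is available. A bundled test image stands in for the green channel of a fundus photograph, only the Frangi vesselness filter is shown (B-COSFIRE is not part of scikit-image), and the feature stack is merely an illustration of what could be passed to an AdaBoost classifier; this is not the authors' full pipeline.

    ```python
    import numpy as np
    from skimage import data, exposure, filters

    # Stand-in input: skimage's "camera" image used in place of a fundus green channel
    # (the DRIVE/STARE/CHASE_DB1 images themselves are not bundled here).
    img = data.camera().astype(float) / 255.0

    # Contrast enhancement with CLAHE, as in the preprocessing stage above.
    img_clahe = exposure.equalize_adapthist(img, clip_limit=0.02)

    # Vessel enhancement: the paper combines B-COSFIRE and Frangi matched filters;
    # only the Frangi vesselness filter is shown here.
    vesselness = filters.frangi(img_clahe)

    # Simple per-pixel feature stack (raw intensity, CLAHE intensity, vesselness) that could
    # be fed to a classifier such as sklearn's AdaBoostClassifier for supervised segmentation.
    features = np.stack([img.ravel(), img_clahe.ravel(), vesselness.ravel()], axis=1)
    print(features.shape)  # (n_pixels, 3)
    ```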

  16. Restoration of Static JPEG Images and RGB Video Frames by Means of Nonlinear Filtering in Conditions of Gaussian and Non-Gaussian Noise

    Science.gov (United States)

    Sokolov, R. I.; Abdullin, R. R.

    2017-11-01

    The use of nonlinear Markov process filtering makes it possible to restore both video stream frames and static photos at the preprocessing stage. This paper presents the results of a comparative study of the filtering quality achieved for these two image types by a special algorithm under Gaussian and non-Gaussian noise. Examples of filter operation at different signal-to-noise ratios are presented. A comparative analysis was performed, and the kind of noise that is filtered best was identified. The quality of the developed algorithm is shown to be much better than that of an adaptive filter for RGB signal filtering given the same a priori information about the signal. The method also shows an advantage over the median filter when both fluctuation and impulse noise are filtered.

  17. Task-based detectability in CT image reconstruction by filtered backprojection and penalized likelihood estimation

    Energy Technology Data Exchange (ETDEWEB)

    Gang, Grace J. [Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Ontario M5G 2M9, Canada and Department of Biomedical Engineering, Johns Hopkins University, Baltimore Maryland 21205 (Canada); Stayman, J. Webster; Zbijewski, Wojciech [Department of Biomedical Engineering, Johns Hopkins University, Baltimore Maryland 21205 (United States); Siewerdsen, Jeffrey H., E-mail: jeff.siewerdsen@jhu.edu [Institute of Biomaterials and Biomedical Engineering, University of Toronto, Toronto, Ontario M5G 2M9, Canada and Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Maryland 21205 (United States)

    2014-08-15

    Purpose: Nonstationarity is an important aspect of imaging performance in CT and cone-beam CT (CBCT), especially for systems employing iterative reconstruction. This work presents a theoretical framework for both filtered-backprojection (FBP) and penalized-likelihood (PL) reconstruction that includes explicit descriptions of nonstationary noise, spatial resolution, and task-based detectability index. Potential utility of the model was demonstrated in the optimal selection of regularization parameters in PL reconstruction. Methods: Analytical models for local modulation transfer function (MTF) and noise-power spectrum (NPS) were investigated for both FBP and PL reconstruction, including explicit dependence on the object and spatial location. For FBP, a cascaded systems analysis framework was adapted to account for nonstationarity by separately calculating fluence and system gains for each ray passing through any given voxel. For PL, the point-spread function and covariance were derived using the implicit function theorem and first-order Taylor expansion according to Fessler [“Mean and variance of implicitly defined biased estimators (such as penalized maximum likelihood): Applications to tomography,” IEEE Trans. Image Process. 5(3), 493–506 (1996)]. Detectability index was calculated for a variety of simple tasks. The model for PL was used in selecting the regularization strength parameter to optimize task-based performance, with both a constant and a spatially varying regularization map. Results: Theoretical models of FBP and PL were validated in 2D simulated fan-beam data and found to yield accurate predictions of local MTF and NPS as a function of the object and the spatial location. The NPS for both FBP and PL exhibit similar anisotropic nature depending on the pathlength (and therefore, the object and spatial location within the object) traversed by each ray, with the PL NPS experiencing greater smoothing along directions with higher noise. The MTF of FBP
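
    For readers unfamiliar with how a task-based detectability index is evaluated from MTF and NPS, the sketch below computes the standard non-prewhitening (NPW) observer form on a discrete 2D frequency grid. This is a generic textbook expression with synthetic MTF, NPS and task-function values; it is not the specific observer models, parameters or PL framework used by the authors.

    ```python
    import numpy as np

    # Non-prewhitening (NPW) detectability index from discrete 2D MTF, NPS and task function W,
    # all defined on the same frequency grid:
    #   d'^2 = [sum(W^2 * MTF^2) * bin_area]^2 / [sum(W^2 * MTF^2 * NPS) * bin_area]
    def npw_detectability(mtf, nps, w_task, bin_area):
        num = (np.sum(w_task**2 * mtf**2) * bin_area) ** 2
        den = np.sum(w_task**2 * mtf**2 * nps) * bin_area
        return np.sqrt(num / den)

    # Toy example on a 2D frequency grid (all quantities assumed/synthetic).
    n, df = 128, 0.05                                 # samples per axis, bin width in mm^-1
    fx = (np.arange(n) - n // 2) * df
    fX, fY = np.meshgrid(fx, fx)
    f = np.hypot(fX, fY)
    mtf = np.exp(-(f / 1.5) ** 2)                     # assumed Gaussian-like MTF
    nps = 1e-6 * (0.2 + f)                            # assumed ramp-like NPS (FBP-like shape)
    w_task = np.abs(np.sinc(2.0 * fX) * np.sinc(2.0 * fY))  # task: detection of a small square
    print(f"d' (NPW) = {npw_detectability(mtf, nps, w_task, df**2):.2f}")
    ```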

  18. SU-G-IeP4-15: Ultrasound Imaging of Absorbable Inferior Vena Cava Filters for Proper Placement

    Energy Technology Data Exchange (ETDEWEB)

    Mitcham, T; Bouchard, R; Melancon, A; Melancon, M [University of Texas MD Anderson Cancer Center, Houston, TX (United States); Eggers, M [Adient Medical Technologies, Pearland, TX (United States)

    2016-06-15

    Purpose: Inferior vena cava filters (IVCFs) are used in patients with a high risk of pulmonary embolism in situations when the use of blood thinning drugs would be inappropriate. These filters are implanted under x-ray guidance; however, this provides a dose of ionizing radiation to both patient and physician. B-mode ultrasound (US) imaging allows for localization of certain implanted devices without radiation dose concerns. The goal of this study was to investigate the feasibility of imaging the placement of absorbable IVCFs using US imaging to alleviate the dosage concern inherent to fluoroscopy. Methods: A phantom was constructed to mimic a human IVC using tissue-mimicking material with 0.5 dB/cm/MHz acoustic attenuation, while agar inclusions were used to model acoustic mismatch at the venous interface. Absorbable IVCFs were imaged at 15 cm depth using B-mode US at 2, 3, 5, and 7 MHz transmit frequencies. Then, to determine temporal stability, the IVCF was left in the phantom for 10 weeks; during this time, the IVCF was imaged using the same techniques as above, while the integrity of the filter was analyzed by inspecting for fiber discontinuities. Results: Visualization of the inferior vena cava filter was possible at 5, 7.5, and 15 cm depth at US central frequencies of 2, 3, 5, and 7 MHz. Imaging the IVCF at 5 MHz yielded the clearest images while maintaining acceptable spatial resolution for identifying the IVCFs, while lower frequencies provided noticeably worse image quality. No obvious degradation was observed over the course of the 10 weeks in a static phantom environment. Conclusion: Biodegradable IVCF localization was possible up to 15 cm in depth using conventional B-mode US in a tissue-mimicking phantom. This leads to the potential for using B-mode US to guide the placement of the IVCF upon deployment by the interventional radiologist. Mitch Eggers is an owner of Adient Medical Technologies. There are no other conflicts of interest to disclose.

  19. First Magnetic Resonance Imaging-Guided Aortic Stenting and Cava Filter Placement Using a Polyetheretherketone-Based Magnetic Resonance Imaging-Compatible Guidewire in Swine: Proof of Concept

    International Nuclear Information System (INIS)

    Kos, Sebastian; Huegli, Rolf; Hofmann, Eugen; Quick, Harald H.; Kuehl, Hilmar; Aker, Stephanie; Kaiser, Gernot M.; Borm, Paul J. A.; Jacob, Augustinus L.; Bilecen, Deniz

    2009-01-01

    The purpose of this study was to demonstrate feasibility of percutaneous transluminal aortic stenting and cava filter placement under magnetic resonance imaging (MRI) guidance exclusively using a polyetheretherketone (PEEK)-based MRI-compatible guidewire. Percutaneous transluminal aortic stenting and cava filter placement were performed in 3 domestic swine. Procedures were performed under MRI-guidance in an open-bore 1.5-T scanner. The applied 0.035-inch guidewire has a PEEK core reinforced by fibres, floppy tip, hydrophilic coating, and paramagnetic markings for passive visualization. Through an 11F sheath, the guidewire was advanced into the abdominal (swine 1) or thoracic aorta (swine 2), and the stents were deployed. The guidewire was advanced into the inferior vena cava (swine 3), and the cava filter was deployed. Postmortem autopsy was performed. Procedural success, guidewire visibility, pushability, and stent support were qualitatively assessed by consensus. Procedure times were documented. Guidewire guidance into the abdominal and thoracic aortas and the inferior vena cava was successful. Stent deployments were successful in the abdominal (swine 1) and thoracic (swine 2) segments of the descending aorta. Cava filter positioning and deployment was successful. Autopsy documented good stent and filter positioning. Guidewire visibility through applied markers was rated acceptable for aortic stenting and good for venous filter placement. Steerability, pushability, and device support were good. The PEEK-based guidewire allows both percutaneous MRI-guided aortic stenting in the thoracic and abdominal segments of the descending aorta and filter placement in the inferior vena cava with acceptable to good device visibility and offers good steerability, pushability, and device support.

  20. First magnetic resonance imaging-guided aortic stenting and cava filter placement using a polyetheretherketone-based magnetic resonance imaging-compatible guidewire in swine: proof of concept.

    Science.gov (United States)

    Kos, Sebastian; Huegli, Rolf; Hofmann, Eugen; Quick, Harald H; Kuehl, Hilmar; Aker, Stephanie; Kaiser, Gernot M; Borm, Paul J A; Jacob, Augustinus L; Bilecen, Deniz

    2009-05-01

    The purpose of this study was to demonstrate feasibility of percutaneous transluminal aortic stenting and cava filter placement under magnetic resonance imaging (MRI) guidance exclusively using a polyetheretherketone (PEEK)-based MRI-compatible guidewire. Percutaneous transluminal aortic stenting and cava filter placement were performed in 3 domestic swine. Procedures were performed under MRI-guidance in an open-bore 1.5-T scanner. The applied 0.035-inch guidewire has a PEEK core reinforced by fibres, floppy tip, hydrophilic coating, and paramagnetic markings for passive visualization. Through an 11F sheath, the guidewire was advanced into the abdominal (swine 1) or thoracic aorta (swine 2), and the stents were deployed. The guidewire was advanced into the inferior vena cava (swine 3), and the cava filter was deployed. Postmortem autopsy was performed. Procedural success, guidewire visibility, pushability, and stent support were qualitatively assessed by consensus. Procedure times were documented. Guidewire guidance into the abdominal and thoracic aortas and the inferior vena cava was successful. Stent deployments were successful in the abdominal (swine 1) and thoracic (swine 2) segments of the descending aorta. Cava filter positioning and deployment was successful. Autopsy documented good stent and filter positioning. Guidewire visibility through applied markers was rated acceptable for aortic stenting and good for venous filter placement. Steerability, pushability, and device support were good. The PEEK-based guidewire allows both percutaneous MRI-guided aortic stenting in the thoracic and abdominal segments of the descending aorta and filter placement in the inferior vena cava with acceptable to good device visibility and offers good steerability, pushability, and device support.

  1. A local region of interest image reconstruction via filtered backprojection for fan-beam differential phase-contrast computed tomography

    International Nuclear Information System (INIS)

    Qi Zhihua; Chen Guanghong

    2007-01-01

    Recently, x-ray differential phase contrast computed tomography (DPC-CT) has been experimentally implemented using a conventional source combined with several gratings. Images were reconstructed using a parallel-beam reconstruction formula. However, parallel-beam reconstruction formulae are not directly applicable for a large image object where the parallel-beam approximation fails. In this note, we present a new image reconstruction formula for fan-beam DPC-CT. There are two major features in this algorithm: (1) it enables the reconstruction of a local region of interest (ROI) using data acquired from an angular interval shorter than 180° + fan angle and (2) it still preserves the filtered backprojection structure. Numerical simulations have been conducted to validate the image reconstruction algorithm. (note)
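
    To make the "filter then backproject" structure mentioned in the note concrete, the sketch below implements ordinary parallel-beam filtered backprojection with a ramp filter on a toy sinogram. It is only a generic illustration under simplified assumptions (parallel rays, nearest-neighbour interpolation, approximate scaling); the authors' short-scan fan-beam DPC-CT formula is not reproduced here.

    ```python
    import numpy as np

    # Minimal parallel-beam filtered backprojection with a ramp filter.
    def fbp_parallel(sinogram, angles_deg):
        """sinogram: (n_angles, n_det) array; returns an (n_det, n_det) reconstruction."""
        n_angles, n_det = sinogram.shape
        # Ramp filter applied in the detector-frequency domain.
        freqs = np.fft.fftfreq(n_det)
        ramp = np.abs(freqs)
        filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))

        # Backproject each filtered projection along its view angle.
        x = np.arange(n_det) - n_det / 2
        X, Y = np.meshgrid(x, x)
        recon = np.zeros((n_det, n_det))
        for proj, theta in zip(filtered, np.deg2rad(angles_deg)):
            s = X * np.cos(theta) + Y * np.sin(theta) + n_det / 2   # detector coordinate
            idx = np.clip(np.round(s).astype(int), 0, n_det - 1)
            recon += proj[idx]
        return recon * np.pi / n_angles

    # Toy usage: reconstruct a centred disc from its analytic sinogram.
    n_det, angles = 128, np.arange(0.0, 180.0, 1.0)
    s = np.arange(n_det) - n_det / 2
    radius = 30.0
    chord = 2.0 * np.sqrt(np.clip(radius**2 - s**2, 0.0, None))     # line integrals of a disc
    sino = np.tile(chord, (len(angles), 1))
    image = fbp_parallel(sino, angles)
    print(image.shape, float(image[n_det // 2, n_det // 2]))
    ```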

  2. Comparison of pure and hybrid iterative reconstruction techniques with conventional filtered back projection: Image quality assessment in the cervicothoracic region

    Energy Technology Data Exchange (ETDEWEB)

    Katsura, Masaki, E-mail: mkatsura-tky@umin.ac.jp [Department of Radiology, Graduate School of Medicine, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8655 (Japan); Sato, Jiro; Akahane, Masaaki; Matsuda, Izuru; Ishida, Masanori; Yasaka, Koichiro; Kunimatsu, Akira; Ohtomo, Kuni [Department of Radiology, Graduate School of Medicine, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8655 (Japan)

    2013-02-15

    Objectives: To evaluate the impact on image quality of three different image reconstruction techniques in the cervicothoracic region: model-based iterative reconstruction (MBIR), adaptive statistical iterative reconstruction (ASIR), and filtered back projection (FBP). Methods: Forty-four patients underwent unenhanced standard-of-care clinical computed tomography (CT) examinations which included the cervicothoracic region with a 64-row multidetector CT scanner. Images were reconstructed with FBP, 50% ASIR-FBP blending (ASIR50), and MBIR. Two radiologists assessed the cervicothoracic region in a blinded manner for streak artifacts, pixilated blotchy appearances, critical reproduction of visually sharp anatomical structures (thyroid gland, common carotid artery, and esophagus), and overall diagnostic acceptability. Objective image noise was measured in the internal jugular vein. Data were analyzed using the sign test and pair-wise Student's t-test. Results: MBIR images had significantly lower quantitative image noise (8.88 ± 1.32) compared to ASIR images (18.63 ± 4.19, P < 0.01) and FBP images (26.52 ± 5.8, P < 0.01). Significant improvements in streak artifacts of the cervicothoracic region were observed with the use of MBIR (P < 0.001 each for MBIR vs. the other two image data sets for both readers), while no significant difference was observed between ASIR and FBP (P > 0.9 for ASIR vs. FBP for both readers). MBIR images were all diagnostically acceptable. Unique features of MBIR images included pixilated blotchy appearances, which did not adversely affect diagnostic acceptability. Conclusions: MBIR significantly improves image noise and streak artifacts of the cervicothoracic region over ASIR and FBP. MBIR is expected to enhance the value of CT examinations for areas where image noise and streak artifacts are problematic.

  3. Adaptive wiener filter based on Gaussian mixture distribution model for denoising chest X-ray CT image

    International Nuclear Information System (INIS)

    Tabuchi, Motohiro; Yamane, Nobumoto; Morikawa, Yoshitaka

    2008-01-01

    In recent decades, X-ray CT imaging has become more important as a result of its high-resolution performance. However, it is well known that the X-ray dose is insufficient in the techniques that use low-dose imaging in health screening or thin-slice imaging in work-up. Therefore, the degradation of CT images caused by the streak artifact frequently becomes problematic. In this study, we applied a Wiener filter (WF) using the universal Gaussian mixture distribution model (UNI-GMM) as a statistical model to remove streak artifact. In designing the WF, it is necessary to estimate the statistical model and the precise co-variances of the original image. In the proposed method, we obtained a variety of chest X-ray CT images using a phantom simulating a chest organ, and we estimated the statistical information using the images for training. The results of simulation showed that it is possible to fit the UNI-GMM to the chest X-ray CT images and reduce the specific noise. (author)
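
    As a point of reference for the Wiener filtering discussed above, the sketch below implements a plain locally adaptive Wiener (Lee-style) filter driven by local mean and variance estimates. It is a simplified stand-in with an assumed global noise variance and synthetic data; it does not implement the universal Gaussian mixture (UNI-GMM) statistical model described in the abstract.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    # Locally adaptive Wiener filter: shrink each pixel towards the local mean by an amount
    # governed by the ratio of estimated signal variance to total local variance.
    def adaptive_wiener(image, noise_var, window=5):
        local_mean = uniform_filter(image, size=window)
        local_sqr_mean = uniform_filter(image**2, size=window)
        local_var = np.maximum(local_sqr_mean - local_mean**2, 0.0)
        gain = np.maximum(local_var - noise_var, 0.0) / np.maximum(local_var, 1e-12)
        return local_mean + gain * (image - local_mean)

    # Toy usage on a synthetic noisy "CT slice" (assumed values throughout).
    rng = np.random.default_rng(0)
    clean = np.zeros((128, 128)); clean[32:96, 32:96] = 100.0
    noisy = clean + rng.normal(0.0, 10.0, clean.shape)
    denoised = adaptive_wiener(noisy, noise_var=10.0**2)
    print(float(np.std(noisy - clean)), float(np.std(denoised - clean)))
    ```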

  4. Guided Image Filtering-Based Pan-Sharpening Method: A Case Study of GaoFen-2 Imagery

    Directory of Open Access Journals (Sweden)

    Yalan Zheng

    2017-12-01

    Full Text Available GaoFen-2 (GF-2) is a civilian optical satellite self-developed by China equipped with both multispectral and panchromatic sensors, and is the first satellite in China with a resolution below 1 m. Because the pan-sharpening methods on GF-2 imagery have not been a focus of previous works, we propose a novel pan-sharpening method based on guided image filtering and compare the performance to state-of-the-art methods on GF-2 images. Guided image filtering was introduced to decompose and transfer the details and structures from the original panchromatic and multispectral bands. Thereafter, an adaptive model that considers the local spectral relationship was designed to properly inject spatial information back into the original spectral bands. Four pairs of GF-2 images acquired from urban, water body, cropland, and forest areas were selected for the experiments. Both quantitative and visual inspections were used for the assessment. The experimental results demonstrated that for GF-2 imagery acquired over different scenes, the proposed approach consistently achieves high spectral fidelity and enhances spatial details, thereby benefitting the potential classification procedures.
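
    The guided image filter used as the building block above can be sketched with the classic box-filter formulation of He et al.; the version below is a generic single-channel implementation with random stand-in arrays and a deliberately naive detail injection, not the paper's adaptive local spectral-injection model for GF-2.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    # Single-channel guided image filter: fit a local linear model q = a*guide + b in each window.
    def guided_filter(guide, src, radius=8, eps=1e-3):
        size = 2 * radius + 1
        mean_g  = uniform_filter(guide, size)
        mean_s  = uniform_filter(src, size)
        mean_gs = uniform_filter(guide * src, size)
        mean_gg = uniform_filter(guide * guide, size)
        var_g  = mean_gg - mean_g**2
        cov_gs = mean_gs - mean_g * mean_s
        a = cov_gs / (var_g + eps)
        b = mean_s - a * mean_g
        return uniform_filter(a, size) * guide + uniform_filter(b, size)

    # Hypothetical usage for pan-sharpening one multispectral band (arrays are stand-ins).
    pan = np.random.rand(256, 256)                       # panchromatic band (assumed)
    ms_band = uniform_filter(pan, 8) + 0.1 * np.random.rand(256, 256)  # upsampled MS band (assumed)
    base = guided_filter(pan, ms_band)                   # structure consistent with the pan band
    detail = pan - uniform_filter(pan, 17)               # spatial detail extracted from the pan band
    sharpened = ms_band + detail                         # naive injection; the paper weights this adaptively
    print(sharpened.shape)
    ```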

  5. Recent progress in plasmonic colour filters for image sensor and multispectral applications

    Science.gov (United States)

    Pinton, Nadia; Grant, James; Choubey, Bhaskar; Cumming, David; Collins, Steve

    2016-04-01

    Using nanostructured thin metal films as colour filters offers several important advantages, in particular high tunability across the entire visible spectrum and some of the infrared region, and also compatibility with conventional CMOS processes. Since 2003, the field of plasmonic colour filters has evolved rapidly and several different designs and materials, or combination of materials, have been proposed and studied. In this paper we present a simulation study for a single-step lithographically patterned multilayer structure able to provide competitive transmission efficiencies above 40% together with FWHM of the order of 30 nm across the visible spectrum. The total thickness of the proposed filters is less than 200 nm and is constant for every wavelength, unlike e.g. resonant cavity-based filters such as Fabry-Perot that require a variable stack of several layers according to the working frequency, and their passband characteristics are entirely controlled by changing the lithographic pattern. It will also be shown that a key to obtaining narrow-band optical response lies in the dielectric environment of a nanostructure and that it is not necessary to have a symmetric structure to ensure good coupling between the SPPs at the top and bottom interfaces. Moreover, an analytical method to evaluate the periodicity, given a specific structure and a desirable working wavelength, will be proposed and its accuracy demonstrated. This method conveniently eliminates the need to optimize the design of a filter numerically, i.e. by running several time-consuming simulations with different periodicities.

  6. Information content of the space-frequency filtering of blood plasma layers laser images in the diagnosis of pathological changes

    Science.gov (United States)

    Ushenko, A. G.; Boychuk, T. M.; Mincer, O. P.; Bodnar, G. B.; Kushnerick, L. Ya.; Savich, V. O.

    2013-12-01

    The bases of the method of space-frequency filtering of the phase distribution of blood plasma pellicles are given here. A model of the optical-anisotropic properties of the albumen chains of the blood plasma pellicle, with regard to the linear and circular double refraction of albumen and globulin crystals, is proposed. Comparative studies were carried out on the effectiveness of direct polarization mapping of the azimuth images of blood plasma pellicle layers versus space-frequency polarimetry of the laser radiation transformed by divaricate and hole-like optical-anisotropic chains of blood plasma pellicles. On the basis of a complex statistical, correlation and fractal analysis of the filtered frequency-dimensional polarization azimuth maps of the blood plasma pellicle structure, a set of criteria for the change in the double refraction of the albumen chains caused by prostate cancer was traced and proved.

  7. FDTD parallel computational analysis of grid-type scattering filter characteristics for medical X-ray image diagnosis

    International Nuclear Information System (INIS)

    Takahashi, Koichi; Miyazaki, Yasumitsu; Goto, Nobuo

    2007-01-01

    X-ray diagnosis depends on the intensity of transmitted and scattered waves in X-ray propagation in biomedical media. X-rays are scattered and absorbed by tissues, such as fat, bone and internal organs. However, image processing for medical diagnosis based on the scattering and absorption characteristics of these tissues in the X-ray spectrum has not been widely studied. To obtain precise information about tissues in a living body, the accurate characteristics of scattering and absorption are required. In this paper, X-ray scattering and absorption in biomedical media are studied using the 2-dimensional finite difference time domain (FDTD) method. In the FDTD method, the size of the analysis space is severely limited by the performance of available computers. To overcome this limitation, a parallel and successive FDTD method is introduced. As a result of computer simulation, the amplitudes of the transmitted and scattered waves are presented numerically. The fundamental filtering characteristics of the grid-type filter are also shown numerically. (author)

  8. Noise reduction and functional maps image quality improvement in dynamic CT perfusion using a new k-means clustering guided bilateral filter (KMGB).

    Science.gov (United States)

    Pisana, Francesco; Henzler, Thomas; Schönberg, Stefan; Klotz, Ernst; Schmidt, Bernhard; Kachelrieß, Marc

    2017-07-01

    Dynamic CT perfusion (CTP) consists of repeated acquisitions of the same volume at different time steps, slightly before, during and slightly after the injection of contrast media. Important functional information can be derived for each voxel, which reflects the local hemodynamic properties and hence the metabolism of the tissue. Different approaches are being investigated to exploit data redundancy and prior knowledge for noise reduction of such datasets, ranging from iterative reconstruction schemes to high dimensional filters. We propose a new spatial bilateral filter which makes use of the k-means clustering algorithm and of an optimally calculated guiding image. We named the proposed filter the k-means clustering guided bilateral filter (KMGB). In this study, the KMGB filter is compared with the partial temporal non-local means filter (PATEN), with the time-intensity profile similarity (TIPS) filter, and with a new version derived from it by introducing the guiding image (GB-TIPS). All the filters were tested on a digital in-house developed brain CTP phantom, where noise was added to simulate 80 kV and 200 mAs (default scanning parameters), 100 mAs and 30 mAs. Moreover, the filters' performances were tested on 7 noisy clinical datasets with different pathologies in different body regions. The original contribution of our work is two-fold: first, we propose an efficient algorithm to calculate a guiding image to improve the results of the TIPS filter; second, we propose the introduction of the k-means clustering step and demonstrate how this can potentially replace the TIPS part of the filter obtaining better results at lower computational efforts. As expected, in the GB-TIPS, the introduction of the guiding image limits the over-smoothing of the TIPS filter, improving spatial resolution by more than 50%. Furthermore, replacing the time-intensity profile similarity calculation with a fuzzy k-means clustering strategy (KMGB) allows to control the edge preserving
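
    The generic building block behind a guided bilateral filter is the joint (cross) bilateral filter, in which range weights come from a low-noise guiding image rather than from the noisy frame itself. The sketch below shows only that building block on synthetic CTP-like frames, using the temporal average as the guide; the k-means clustering step and the TIPS-style temporal similarity of the proposed KMGB filter are not reproduced.

    ```python
    import numpy as np

    # Joint (cross) bilateral filter: spatial Gaussian weights plus range weights from the guide.
    def joint_bilateral(image, guide, radius=3, sigma_s=2.0, sigma_r=10.0):
        pad = radius
        img_p = np.pad(image, pad, mode="reflect")
        gui_p = np.pad(guide, pad, mode="reflect")
        out = np.zeros_like(image, dtype=float)
        norm = np.zeros_like(image, dtype=float)
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                w_s = np.exp(-(dx * dx + dy * dy) / (2 * sigma_s**2))
                shifted_img = img_p[pad + dy: pad + dy + image.shape[0],
                                    pad + dx: pad + dx + image.shape[1]]
                shifted_gui = gui_p[pad + dy: pad + dy + image.shape[0],
                                    pad + dx: pad + dx + image.shape[1]]
                w = w_s * np.exp(-(shifted_gui - guide)**2 / (2 * sigma_r**2))
                out += w * shifted_img
                norm += w
        return out / norm

    # Toy usage: denoise one noisy CTP frame using the temporal average as the guiding image.
    rng = np.random.default_rng(1)
    frames = 50.0 + rng.normal(0, 15.0, (20, 64, 64))   # synthetic stand-in for a CTP series
    guide = frames.mean(axis=0)                          # low-noise guiding image
    filtered0 = joint_bilateral(frames[0], guide)
    print(float(frames[0].std()), float(filtered0.std()))
    ```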

  9. From Pixels to Response Maps: Discriminative Image Filtering for Face Alignment in the Wild.

    Science.gov (United States)

    Asthana, Akshay; Zafeiriou, Stefanos; Tzimiropoulos, Georgios; Cheng, Shiyang; Pantic, Maja

    2015-06-01

    We propose a face alignment framework that relies on the texture model generated by the responses of discriminatively trained part-based filters. Unlike standard texture models built from pixel intensities or responses generated by generic filters (e.g. Gabor), our framework has two important advantages. First, by virtue of discriminative training, invariance to external variations (like identity, pose, illumination and expression) is achieved. Second, we show that the responses generated by discriminatively trained filters (or patch-experts) are sparse and can be modeled using a very small number of parameters. As a result, the optimization methods based on the proposed texture model can better cope with unseen variations. We illustrate this point by formulating both part-based and holistic approaches for generic face alignment and show that our framework outperforms the state-of-the-art on multiple "wild" databases. The code and dataset annotations are available for research purposes from http://ibug.doc.ic.ac.uk/resources.

  10. SU-D-207-07: Implementation of Full/half Bowtie Filter Model in a Commercial Treatment Planning System for Kilovoltage X-Ray Imaging Dose Estimation

    International Nuclear Information System (INIS)

    Kim, S; Alaei, P

    2015-01-01

    Purpose: To implement full/half bowtie filter models in a commercial treatment planning system (TPS) to calculate the kilovoltage (kV) x-ray imaging dose of the Varian On-Board Imager (OBI) cone beam CT (CBCT) system. Methods: Full/half bowtie filters of the Varian OBI were created as compensator models in the Pinnacle TPS (version 9.6) using Matlab software (version 2011a). The profiles of both bowtie filters were acquired from the manufacturer, imported into the Matlab system and hard coded in binary file format. A Pinnacle script was written to import each bowtie filter's data into a Pinnacle treatment plan as a compensator. A kV x-ray beam model without the compensator model was commissioned for each bowtie filter setting based on percent depth dose and lateral profile data acquired from Monte Carlo simulations. To validate the bowtie filter models, a rectangular water phantom was generated in the planning system and an anterior/posterior beam with each bowtie filter was created. Using the Pinnacle script, each bowtie filter compensator was added to the treatment plan. The lateral profile at a depth of 3 cm and percent depth dose were measured using an ion chamber and compared with the data extracted from the treatment plans. Results: The kV x-ray beams for both the full and half bowtie filter have been modeled in a commercial TPS. The differences in lateral and depth dose profiles between dose calculations and ion chamber measurements were within 6%. Conclusion: Both full/half bowtie filter models provide reasonable results in kV x-ray dose calculations in the water phantom. This study demonstrates the possibility of using a model-based treatment planning system to calculate the kV imaging dose for both full and half bowtie filter modes. Further study is to be performed to evaluate the models in clinical situations.

  11. Sub-nanometre resolution imaging of polymer-fullerene photovoltaic blends using energy-filtered scanning electron microscopy.

    Science.gov (United States)

    Masters, Robert C; Pearson, Andrew J; Glen, Tom S; Sasam, Fabian-Cyril; Li, Letian; Dapor, Maurizio; Donald, Athene M; Lidzey, David G; Rodenburg, Cornelia

    2015-04-24

    The resolution capability of the scanning electron microscope has increased immensely in recent years, and is now within the sub-nanometre range, at least for inorganic materials. An equivalent advance has not yet been achieved for imaging the morphologies of nanostructured organic materials, such as organic photovoltaic blends. Here we show that energy-selective secondary electron detection can be used to obtain high-contrast, material-specific images of an organic photovoltaic blend. We also find that we can differentiate mixed phases from pure material phases in our data. The lateral resolution demonstrated is twice that previously reported from secondary electron imaging. Our results suggest that our energy-filtered scanning electron microscopy approach will be able to make major inroads into the understanding of complex, nano-structured organic materials.

  12. Sub-nanometre resolution imaging of polymer–fullerene photovoltaic blends using energy-filtered scanning electron microscopy

    Science.gov (United States)

    Masters, Robert C.; Pearson, Andrew J.; Glen, Tom S.; Sasam, Fabian-Cyril; Li, Letian; Dapor, Maurizio; Donald, Athene M.; Lidzey, David G.; Rodenburg, Cornelia

    2015-01-01

    The resolution capability of the scanning electron microscope has increased immensely in recent years, and is now within the sub-nanometre range, at least for inorganic materials. An equivalent advance has not yet been achieved for imaging the morphologies of nanostructured organic materials, such as organic photovoltaic blends. Here we show that energy-selective secondary electron detection can be used to obtain high-contrast, material-specific images of an organic photovoltaic blend. We also find that we can differentiate mixed phases from pure material phases in our data. The lateral resolution demonstrated is twice that previously reported from secondary electron imaging. Our results suggest that our energy-filtered scanning electron microscopy approach will be able to make major inroads into the understanding of complex, nano-structured organic materials. PMID:25906738

  13. An electronic image processing device featuring continuously selectable two-dimensional bipolar filter functions and real-time operation

    International Nuclear Information System (INIS)

    Charleston, B.D.; Beckman, F.H.; Franco, M.J.; Charleston, D.B.

    1981-01-01

    A versatile electronic-analogue image processing system has been developed for use in improving the quality of various types of images with emphasis on those encountered in experimental and diagnostic medicine. The operational principle utilizes spatial filtering which selectively controls the contrast of an image according to the spatial frequency content of relevant and non-relevant features of the image. Noise can be reduced or eliminated by selectively lowering the contrast of information in the high spatial frequency range. Edge sharpness can be enhanced by accentuating the upper midrange spatial frequencies. Both methods of spatial frequency control may be adjusted continuously in the same image to obtain maximum visibility of the features of interest. A precision video camera is used to view medical diagnostic images, either prints, transparencies or CRT displays. The output of the camera provides the analogue input signal for both the electronic processing system and the video display of the unprocessed image. The video signal input to the electronic processing system is processed by a two-dimensional spatial convolution operation. The system employs charge-coupled devices (CCDs), both tapped analogue delay lines (TADs) and serial analogue delay lines (SADs), to store information in the form of analogue potentials which are constantly being updated as new sampled analogue data arrive at the input. This information is convolved with a programmed bipolar radially symmetrical hexagonal function which may be controlled and varied at each radius by the operator in real-time by adjusting a set of front panel controls or by a programmed microprocessor control. Two TV monitors are used, one for processed image display and the other for constant reference to the original image. The working prototype has a full-screen display matrix size of 200 picture elements per horizontal line by 240 lines. The matrix can be expanded vertically and horizontally for the
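
    A bipolar, radially symmetric convolution of the kind the analogue hardware performs can be emulated digitally with a centre-surround kernel: a positive core boosts mid and high spatial frequencies while a negative surround suppresses low frequencies. The sketch below uses a difference-of-Gaussians on a square grid with assumed widths; the device's hexagonal sampling and per-radius front-panel weights are not reproduced.

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    # Bipolar radially symmetric (difference-of-Gaussians) kernel; it sums to ~0,
    # so flat regions map to ~0 and edges/mid-frequency detail are accentuated.
    def bipolar_kernel(radius=6, sigma_center=1.0, sigma_surround=3.0):
        y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        r2 = x**2 + y**2
        center = np.exp(-r2 / (2 * sigma_center**2))
        surround = np.exp(-r2 / (2 * sigma_surround**2))
        return center / center.sum() - surround / surround.sum()

    image = np.zeros((128, 128)); image[:, 64:] = 1.0          # synthetic edge "image"
    enhanced = image + 2.0 * convolve(image, bipolar_kernel(), mode="nearest")
    print(float(enhanced.min()), float(enhanced.max()))
    ```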

  14. Low-loss interference filter arrays made by plasma-assisted reactive magnetron sputtering (PARMS) for high-performance multispectral imaging

    Science.gov (United States)

    Broßmann, Jan; Best, Thorsten; Bauer, Thomas; Jakobs, Stefan; Eisenhammer, Thomas

    2016-10-01

    Optical remote sensing of the earth from air and space typically utilizes several channels in the visible and near infrared spectrum. Thin-film optical interference filters, mostly of narrow bandpass type, are applied to select these channels. The filters are arranged in filter wheels, arrays of discrete stripe filters mounted in frames, or patterned arrays on a monolithic substrate. Such multi-channel filter assemblies can be mounted close to the detector, which allows a compact and lightweight camera design. Recent progress in image resolution and sensor sensitivity requires improvements of the optical filter performance. Higher demands placed on blocking in the UV and NIR and in between the spectral channels, in-band transmission and filter edge steepness as well as scattering lead to more complex filter coatings with thicknesses in the range of 10 - 25μm. Technological limits of the conventionally used ion-assisted evaporation process (IAD) can be overcome only by more precise and higher-energetic coating technologies like plasma-assisted reactive magnetron sputtering (PARMS) in combination with optical broadband monitoring. Optics Balzers has developed a photolithographic patterning process for coating thicknesses up to 15μm that is fully compatible with the advanced PARMS coating technology. This provides the possibility of depositing multiple complex high-performance filters on a monolithic substrate. We present an overview of the performance of recently developed filters with improved spectral performance designed for both monolithic filter-arrays and stripe filters mounted in frames. The pros and cons as well as the resulting limits of the filter designs for both configurations are discussed.

  15. Spatially assisted down-track median filter for GPR image post-processing

    Science.gov (United States)

    Paglieroni, David W; Beer, N Reginald

    2014-10-07

    A method and system for detecting the presence of subsurface objects within a medium is provided. In some embodiments, the imaging and detection system operates in a multistatic mode to collect radar return signals generated by an array of transceiver antenna pairs that is positioned across the surface and that travels down the surface. The imaging and detection system pre-processes the return signal to suppress certain undesirable effects. The imaging and detection system then generates synthetic aperture radar images from real aperture radar images generated from the pre-processed return signal. The imaging and detection system then post-processes the synthetic aperture radar images to improve detection of subsurface objects. The imaging and detection system identifies peaks in the energy levels of the post-processed image frame, which indicates the presence of a subsurface object.
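
    The down-track median idea in the record above can be illustrated with a plain one-dimensional median taken along the scan axis only: horizontally extended background returns (e.g. the ground bounce) are estimated and subtracted, while compact subsurface-object responses survive. The data below are synthetic, and the "spatially assisted" selection logic of the patented system is not reproduced.

    ```python
    import numpy as np
    from scipy.ndimage import median_filter

    rng = np.random.default_rng(2)
    depth_bins, along_track = 256, 400
    gpr = rng.normal(0, 0.2, (depth_bins, along_track))                      # synthetic GPR image
    gpr += np.exp(-((np.arange(depth_bins) - 60) / 4.0) ** 2)[:, None]       # flat ground-bounce band
    gpr[120:140, 190:210] += 1.5                                             # compact buried-object response

    background = median_filter(gpr, size=(1, 51))   # long median window along the down-track axis only
    residual = gpr - background                     # object response stands out after subtraction
    print(float(residual[125:135, 195:205].mean()), float(residual[55:65, :].mean()))
    ```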

  16. A Low-Noise CMOS THz Imager Based on Source Modulation and an In-Pixel High-Q Passive Switched-Capacitor N-Path Filter.

    Science.gov (United States)

    Boukhayma, Assim; Dupret, Antoine; Rostaing, Jean-Pierre; Enz, Christian

    2016-03-03

    This paper presents the first low-noise complementary metal oxide semiconductor (CMOS) terahertz (THz) imager based on source modulation and in-pixel high-Q filtering. The 31 × 31 focal plane array has been fully integrated in a 0.13 μm standard CMOS process. The sensitivity has been improved significantly by modulating the active THz source that lights the scene and performing on-chip high-Q filtering. Each pixel encompasses a broadband bow tie antenna coupled to an N-type metal-oxide-semiconductor (NMOS) detector that shifts the THz radiation, a low noise adjustable gain amplifier and a high-Q filter centered at the modulation frequency. The filter is based on a passive switched-capacitor (SC) N-path filter combined with a continuous-time broad-band Gm-C filter. A simplified analysis that helps in designing and tuning the passive SC N-path filter is provided. The characterization of the readout chain shows that a Q factor of 100 has been achieved for the filter with a good matching between the analytical calculation and the measurement results. An input-referred noise of 0.2 μV RMS has been measured. Characterization of the chip with different THz wavelengths confirms the broadband feature of the antenna and shows that this THz imager reaches a total noise equivalent power of 0.6 nW at 270 GHz and 0.8 nW at 600 GHz.

  17. A high-transmission liquid-crystal Fabry-Perot infrared filter for electrically tunable spectral imaging detection

    Science.gov (United States)

    Liu, Zhonglun; Xin, Zhaowei; Long, Huabao; Wei, Dong; Dai, Wanwan; Zhang, Xinyu; Wang, Haiwei; Xie, Changsheng

    2018-02-01

    Previous studies have presented the usefulness of typical liquid-crystal Fabry-Perot (LC-FP) infrared filters for spectral imaging detection. Yet their infrared transmission performance still leaves room for improvement. In this paper, we propose a new type of electrically tunable LC-FP infrared filter to address this problem. The key component of the device is an FP resonant cavity composed of two parallel plane mirrors, in which zinc selenide (ZnSe) materials with a very high transmittance in the mid-long-wavelength infrared regions are used as the electrode substrates, and a layer of nano-aluminum (Al) film, which is in direct contact with the liquid-crystal material, is chosen to form the highly reflective mirrors as well as the electrodes. Particularly, it should be noted that the directional layer made of polyimide (PI) used previously is removed. The experimental results indicate that the filter can remarkably reduce the absorption of the infrared wave, and thus point a way to effectively improving the infrared transmittance.

  18. A motion-compensated image filter for low-dose fluoroscopy in a real-time tumor-tracking radiotherapy system

    International Nuclear Information System (INIS)

    Miyamoto, Naoki; Ishikawa, Masayori; Sutherland, Kenneth

    2015-01-01

    In the real-time tumor-tracking radiotherapy system, a surrogate fiducial marker inserted in or near the tumor is detected by fluoroscopy to realize respiratory-gated radiotherapy. The imaging dose caused by fluoroscopy should be minimized. In this work, an image processing technique is proposed for tracing a moving marker in low-dose imaging. The proposed tracking technique is a combination of a motion-compensated recursive filter and template pattern matching. The proposed image filter can reduce motion artifacts resulting from the recursive process based on the determination of the region of interest for the next frame according to the current marker position in the fluoroscopic images. The effectiveness of the proposed technique and the expected clinical benefit were examined by phantom experimental studies with actual tumor trajectories generated from clinical patient data. It was demonstrated that the marker motion could be traced in low-dose imaging by applying the proposed algorithm with acceptable registration error and high pattern recognition score in all trajectories, although some trajectories were not able to be tracked with the conventional spatial filters or without image filters. The positional accuracy is expected to be kept within ±2 mm. The total computation time required to determine the marker position is a few milliseconds. The proposed image processing technique is applicable for imaging dose reduction. (author)
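
    The two ingredients described above can be sketched together under simplified assumptions: a recursive (exponential) temporal filter applied to a region of interest that is re-centred on the last detected marker position, followed by template matching inside that region. The frame data, ROI size, filter gain and marker template below are synthetic stand-ins, not the clinical fluoroscopy data or the exact algorithm of the paper.

    ```python
    import numpy as np
    from skimage.feature import match_template

    def track_marker(frames, template, init_pos, roi_half=24, alpha=0.5):
        pos = np.array(init_pos)            # (row, col) of the marker in the first frame
        filtered_roi = None
        trajectory = []
        for frame in frames:
            r0 = int(pos[0]) - roi_half
            c0 = int(pos[1]) - roi_half
            roi = frame[r0:r0 + 2 * roi_half, c0:c0 + 2 * roi_half]
            # Recursive filter on the motion-compensated ROI to suppress quantum noise.
            filtered_roi = roi if filtered_roi is None else alpha * roi + (1 - alpha) * filtered_roi
            score = match_template(filtered_roi, template)       # normalized cross-correlation
            dr, dc = np.unravel_index(np.argmax(score), score.shape)
            pos = np.array([r0 + dr + template.shape[0] // 2,
                            c0 + dc + template.shape[1] // 2])
            trajectory.append(pos.copy())
        return np.array(trajectory)

    # Toy usage: a bright circular "marker" drifting across noisy low-dose frames.
    rng = np.random.default_rng(3)
    yy, xx = np.mgrid[:9, :9]
    template = np.exp(-((yy - 4) ** 2 + (xx - 4) ** 2) / 6.0)
    frames = []
    for k in range(30):
        f = rng.normal(0, 0.3, (128, 128))
        r, c = 40 + k, 40 + k // 2
        f[r - 4:r + 5, c - 4:c + 5] += template
        frames.append(f)
    print(track_marker(frames, template, init_pos=(40, 40))[-1])
    ```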

  19. Investigation of the influence of image reconstruction filter and scan parameters on operation of automatic tube current modulation systems for different CT scanners

    International Nuclear Information System (INIS)

    Sookpeng, Supawitoo; Martin, Colin J.; Gentle, David J.

    2015-01-01

    Variation in the user-selected CT scanning parameters under automatic tube current modulation (ATCM) between hospitals has a substantial influence on the radiation doses and image quality for patients. The aim of this study was to investigate the effect of changing image reconstruction filter and scan parameter settings on tube current, dose and image quality for various CT scanners operating under ATCM. The scan parameters varied were pitch factor, rotation time, collimator configuration, kVp, image thickness and image filter convolution (FC) used for reconstruction. The Toshiba scanner varies the tube current to achieve a set target noise. Changes in the FC setting and image thickness for the first reconstruction were the major factors affecting patient dose. A two-step change in FC from smoother to sharper filters doubles the dose, but is counterbalanced by an improvement in spatial resolution. In contrast, Philips and Siemens scanners maintained tube current values similar to those for a reference image and patient, and the tube current only varied slightly for changes in individual CT scan parameters. The selection of a sharp filter increased the image noise, while use of iDose iterative reconstruction reduced the noise. Since the principles used by CT manufacturers for ATCM vary, it is important that parameters which affect patient dose and image quality for each scanner are made clear to the operator to aid in optimisation. (authors)

  20. Signal-to-noise ratio enhancement on SEM images using a cubic spline interpolation with Savitzky-Golay filters and weighted least squares error.

    Science.gov (United States)

    Kiani, M A; Sim, K S; Nia, M E; Tso, C P

    2015-05-01

    A new technique based on cubic spline interpolation with Savitzky-Golay smoothing using weighted least squares error filter is enhanced for scanning electron microscope (SEM) images. A diversity of sample images is captured and the performance is found to be better when compared with the moving average and the standard median filters, with respect to eliminating noise. This technique can be implemented efficiently on real-time SEM images, with all mandatory data for processing obtained from a single image. Noise in images, and particularly in SEM images, are undesirable. A new noise reduction technique, based on cubic spline interpolation with Savitzky-Golay and weighted least squares error method, is developed. We apply the combined technique to single image signal-to-noise ratio estimation and noise reduction for SEM imaging system. This autocorrelation-based technique requires image details to be correlated over a few pixels, whereas the noise is assumed to be uncorrelated from pixel to pixel. The noise component is derived from the difference between the image autocorrelation at zero offset, and the estimation of the corresponding original autocorrelation. In the few test cases involving different images, the efficiency of the developed noise reduction filter is proved to be significantly better than those obtained from the other methods. Noise can be reduced efficiently with appropriate choice of scan rate from real-time SEM images, without generating corruption or increasing scanning time. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.
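
    A minimal sketch of two of the ideas above: row-wise Savitzky-Golay smoothing of scan lines, and an autocorrelation-based noise estimate in which the uncorrelated noise power is taken as the gap between the zero-offset autocorrelation and its value extrapolated from small lags. The data are synthetic stand-ins for SEM scan lines, and the cubic-spline interpolation and weighted-least-squares stages of the paper's method are not reproduced.

    ```python
    import numpy as np
    from scipy.signal import savgol_filter

    # Noise variance of one scan line from the autocorrelation gap at zero offset.
    def estimate_noise_var(row):
        row = row - row.mean()
        r0 = np.mean(row * row)                     # autocorrelation at zero offset
        r1 = np.mean(row[:-1] * row[1:])            # lag-1
        r2 = np.mean(row[:-2] * row[2:])            # lag-2
        r0_est = 2 * r1 - r2                        # linear extrapolation back to lag 0
        return max(r0 - r0_est, 0.0)                # uncorrelated (noise) part

    rng = np.random.default_rng(4)
    clean = np.sin(np.linspace(0, 8 * np.pi, 512))[None, :] * np.ones((64, 1))
    noisy = clean + rng.normal(0, 0.3, clean.shape)          # synthetic noisy scan lines

    smoothed = savgol_filter(noisy, window_length=15, polyorder=3, axis=1)
    noise_var = np.mean([estimate_noise_var(r) for r in noisy])
    print(f"estimated noise sigma: {np.sqrt(noise_var):.3f} (true 0.300)")
    print(f"residual RMS after smoothing: {np.sqrt(np.mean((smoothed - clean) ** 2)):.3f}")
    ```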

  1. Sci-Thur AM: YIS – 07: Optimizing dual-energy x-ray parameters using a single filter for both high and low-energy images to enhance soft-tissue imaging

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, Wesley; Sattarivand, Mike [Department of Radiation Oncology, Dalhousie University at Nova Scotia Health Authority, Department of Radiation Oncology, Dalhousie University at Nova Scotia Health Authority (Canada)

    2016-08-15

    Objective: To optimize dual-energy parameters of the ExacTrac stereoscopic x-ray imaging system for lung SBRT patients. Methods: Simulated spectra and a lung phantom were used to optimize filter material, thickness, kVps, and weighting factors to obtain bone-subtracted dual-energy images. Spektr simulations were used to identify material in the atomic number (Z) range [3–83] based on a metric defined to separate the high- and low-energy spectra. Both energies used the same filter due to time constraints of image acquisition in lung SBRT imaging. A lung phantom containing bone, soft tissue, and a tumor mimicking material was imaged with filter thicknesses in the range [0–1] mm and kVp in the range [60–140]. A cost function based on the contrast-to-noise ratio of bone, soft tissue, and tumor, as well as image noise content, was defined to optimize filter thickness and kVp. Using the optimized parameters, dual-energy images of an anthropomorphic Rando phantom were acquired and evaluated for bone subtraction. Imaging dose was measured with the dual-energy technique using tin filtering. Results: Tin was the material of choice providing the best energy separation, non-toxicity, and non-reactiveness. The best soft-tissue-only image in the lung phantom was obtained using 0.3 mm tin and the [140, 80] kVp pair. Dual-energy images of the Rando phantom had noticeable bone elimination when compared to no filtration. Dose was lower with tin filtering compared to no filtration. Conclusions: Dual-energy soft-tissue imaging is feasible using the ExacTrac stereoscopic imaging system utilizing a single tin filter for both high and low energies and optimized acquisition parameters.
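
    The weighting factor mentioned in the abstract acts in the usual dual-energy log-subtraction sense: the bone signal cancels when the weight matches the ratio of effective bone attenuation at the two energies. The sketch below demonstrates that cancellation with assumed attenuation coefficients and thicknesses; it does not use the ExacTrac spectra or the optimized parameters reported above.

    ```python
    import numpy as np

    # Assumed effective attenuation coefficients (1/cm) at the low- and high-energy settings.
    mu_bone_low, mu_bone_high = 0.60, 0.30
    mu_soft_low, mu_soft_high = 0.25, 0.18

    t_soft = np.full((64, 64), 10.0)                            # 10 cm of soft tissue everywhere
    t_bone = np.zeros((64, 64)); t_bone[20:40, 20:40] = 2.0     # 2 cm bone insert

    log_low  = mu_soft_low  * t_soft + mu_bone_low  * t_bone    # -ln(I/I0) at low kVp
    log_high = mu_soft_high * t_soft + mu_bone_high * t_bone    # -ln(I/I0) at high kVp

    w = mu_bone_high / mu_bone_low            # weighting factor that cancels the bone term
    soft_only = log_high - w * log_low
    print(float(soft_only[30, 30] - soft_only[5, 5]))   # ~0: bone removed from the image
    ```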

  2. Sci-Thur AM: YIS – 07: Optimizing dual-energy x-ray parameters using a single filter for both high and low-energy images to enhance soft-tissue imaging

    International Nuclear Information System (INIS)

    Bowman, Wesley; Sattarivand, Mike

    2016-01-01

    Objective: To optimize dual-energy parameters of the ExacTrac stereoscopic x-ray imaging system for lung SBRT patients. Methods: Simulated spectra and a lung phantom were used to optimize filter material, thickness, kVps, and weighting factors to obtain bone-subtracted dual-energy images. Spektr simulations were used to identify material in the atomic number (Z) range [3–83] based on a metric defined to separate the high- and low-energy spectra. Both energies used the same filter due to time constraints of image acquisition in lung SBRT imaging. A lung phantom containing bone, soft tissue, and a tumor mimicking material was imaged with filter thicknesses in the range [0–1] mm and kVp in the range [60–140]. A cost function based on the contrast-to-noise ratio of bone, soft tissue, and tumor, as well as image noise content, was defined to optimize filter thickness and kVp. Using the optimized parameters, dual-energy images of an anthropomorphic Rando phantom were acquired and evaluated for bone subtraction. Imaging dose was measured with the dual-energy technique using tin filtering. Results: Tin was the material of choice providing the best energy separation, non-toxicity, and non-reactiveness. The best soft-tissue-only image in the lung phantom was obtained using 0.3 mm tin and the [140, 80] kVp pair. Dual-energy images of the Rando phantom had noticeable bone elimination when compared to no filtration. Dose was lower with tin filtering compared to no filtration. Conclusions: Dual-energy soft-tissue imaging is feasible using the ExacTrac stereoscopic imaging system utilizing a single tin filter for both high and low energies and optimized acquisition parameters.

  3. An image-space parallel convolution filtering algorithm based on shadow map

    Science.gov (United States)

    Li, Hua; Yang, Huamin; Zhao, Jianping

    2017-07-01

    Shadow mapping is commonly used in real-time rendering. In this paper, we present an accurate and efficient method for soft shadow generation from planar area lights. First, the method generates a depth map from the light's view and analyzes the depth-discontinuity areas as well as shadow boundaries. These areas are then described as binary values in a texture map called the binary light-visibility map, and a GPU-based parallel convolution filtering algorithm is applied to smooth out the boundaries with a box filter. Experiments show that our algorithm is an effective shadow-map-based method that produces perceptually accurate soft shadows in real time with more detail at shadow boundaries compared with previous works.

  4. On-Chip Hardware for Cell Monitoring: Contact Imaging and Notch Filtering

    Science.gov (United States)

    2005-07-07

    a polymer carrier. Spectrophotometer chosen and purchased for testing optical filters and materials. Characterization and comparison of fabricated ... reproducibility of behavior. Multi-level SU8 process developed. Optimization of actuator for closing vial lids and development of lid sealing technology is ... bending angles characterized as a function of temperature in NaDBS solution. Photopatternable polymers are a viable interim packaging solution; through

  5. A content-boosted collaborative filtering algorithm for personalized training in interpretation of radiological imaging.

    Science.gov (United States)

    Lin, Hongli; Yang, Xuedong; Wang, Weisheng

    2014-08-01

    Devising a method that can select cases based on the performance levels of trainees and the characteristics of cases is essential for developing a personalized training program in radiology education. In this paper, we propose a novel hybrid prediction algorithm called content-boosted collaborative filtering (CBCF) to predict the difficulty level of each case for each trainee. The CBCF utilizes a content-based filtering (CBF) method to enhance existing trainee-case ratings data and then provides final predictions through a collaborative filtering (CF) algorithm. The CBCF algorithm incorporates the advantages of both CBF and CF, while not inheriting the disadvantages of either. The CBCF method is compared with the pure CBF and pure CF approaches using three datasets. The experimental data are then evaluated in terms of the MAE metric. Our experimental results show that the CBCF outperforms the pure CBF and CF methods by 13.33 and 12.17 %, respectively, in terms of prediction precision. This also suggests that the CBCF can be used in the development of personalized training systems in radiology education.
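
    The hybrid idea above can be sketched in a few lines: missing trainee-case ratings are first filled in by a per-trainee content-based predictor built from case features, and user-based collaborative filtering is then run on the densified matrix. The ratings, case features and the ridge-regression content model below are synthetic stand-ins, not the CBCF formulation or the radiology datasets evaluated in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_trainees, n_cases, n_feat = 6, 8, 3
    case_features = rng.random((n_cases, n_feat))
    ratings = rng.integers(1, 6, (n_trainees, n_cases)).astype(float)
    ratings[rng.random(ratings.shape) < 0.4] = np.nan            # unobserved difficulty ratings

    # Content-based stage: per-trainee ridge regression from case features to known ratings.
    dense = ratings.copy()
    for u in range(n_trainees):
        known = ~np.isnan(ratings[u])
        X, y = case_features[known], ratings[u, known]
        w = np.linalg.solve(X.T @ X + 0.1 * np.eye(n_feat), X.T @ y)
        dense[u, ~known] = case_features[~known] @ w             # fill the gaps

    # Collaborative stage: user-based CF with cosine similarity on the densified matrix.
    norm = np.linalg.norm(dense, axis=1, keepdims=True) + 1e-12
    sim = (dense @ dense.T) / (norm @ norm.T)
    pred = (sim @ dense) / np.abs(sim).sum(axis=1, keepdims=True)
    print(np.round(pred[0], 2))   # predicted difficulty levels of all cases for trainee 0
    ```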

  6. Application of a novel Kalman filter based block matching method to ultrasound images for hand tendon displacement estimation.

    Science.gov (United States)

    Lai, Ting-Yu; Chen, Hsiao-I; Shih, Cho-Chiang; Kuo, Li-Chieh; Hsu, Hsiu-Yun; Huang, Chih-Chung

    2016-01-01

    Information about tendon displacement is important for allowing clinicians to not only quantify preoperative tendon injuries but also to identify any adhesive scarring between tendon and adjacent tissue. The Fisher-Tippett (FT) similarity measure has recently been shown to be more accurate than the Laplacian sum of absolute differences (SAD) and Gaussian sum of squared differences (SSD) similarity measures for tracking tendon displacement in ultrasound B-mode images. However, all of these similarity measures can easily be influenced by the quality of the ultrasound image, particularly its signal-to-noise ratio. Ultrasound images of injured hands are unfortunately often of poor quality due to the presence of adhesive scars. The present study investigated a novel Kalman-filter scheme for overcoming this problem. Three state-of-the-art tracking methods (FT, SAD, and SSD) were used to track the displacements of phantom and cadaver tendons, while FT was used to track human tendons. These three tracking methods were combined individually with the proposed Kalman-filter (K1) scheme and another Kalman-filter scheme used in a previous study to optimize the displacement trajectories of the phantom and cadaver tendons. The motion of the human extensor digitorum communis tendon was measured in the present study using the FT-K1 scheme. The experimental results indicated that SSD exhibited better accuracy in the phantom experiments, whereas FT exhibited better performance for tracking real tendon motion in the cadaver experiments. All three tracking methods were influenced by the signal-to-noise ratio of the images. On the other hand, the K1 scheme was able to optimize the tracking trajectory of displacement in all experiments, even from a location with a poor image quality. The human experimental data indicated that the normal tendons were displaced more than the injured tendons, and that the motion ability of the injured tendon was restored after appropriate rehabilitation.
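
    The role of the Kalman filter in such a scheme is to smooth the noisy frame-to-frame displacements returned by block matching. The sketch below is a plain constant-velocity Kalman filter applied to a synthetic displacement trajectory; the process and measurement noise levels are assumed values, and it is not the K1 scheme or the ultrasound data used in the study.

    ```python
    import numpy as np

    def kalman_smooth_displacement(measurements, dt=1.0, q=1e-3, r=0.5):
        F = np.array([[1.0, dt], [0.0, 1.0]])       # state: [displacement, velocity]
        H = np.array([[1.0, 0.0]])
        Q = q * np.array([[dt**4 / 4, dt**3 / 2], [dt**3 / 2, dt**2]])
        R = np.array([[r]])
        x = np.array([[measurements[0]], [0.0]])
        P = np.eye(2)
        out = []
        for z in measurements:
            x = F @ x
            P = F @ P @ F.T + Q                       # predict
            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
            x = x + K @ (np.array([[z]]) - H @ x)     # update with the block-matching measurement
            P = (np.eye(2) - K @ H) @ P
            out.append(float(x[0, 0]))
        return np.array(out)

    rng = np.random.default_rng(6)
    true_disp = np.cumsum(0.2 * np.ones(100))                  # tendon gliding at 0.2 mm/frame
    measured = true_disp + rng.normal(0, 0.7, true_disp.size)  # noisy block-matching output
    smoothed = kalman_smooth_displacement(measured)
    print(float(np.std(measured - true_disp)), float(np.std(smoothed - true_disp)))
    ```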

  7. Comment on Vaknine, R. and Lorenz, W.J. Lateral filtering of medical ultrasonic B-scans before image generation.

    Science.gov (United States)

    Dickinson, R J

    1985-04-01

    In a recent paper, Vaknine and Lorenz discuss the merits of lateral deconvolution of demodulated B-scans. While this technique will decrease the lateral blurring of single discrete targets, such as the diaphragm in their figure 3, it is inappropriate to apply the method to the echoes arising from inhomogeneous structures such as soft tissue. In this latter case, the echoes from individual scatterers within the resolution cell of the transducer interfere to give random fluctuations in received echo amplitude termed speckle. Although this process can be modeled as a linear convolution similar to that of conventional image formation theory, the process of demodulation is a nonlinear process which loses the all-important phase information, and prevents the subsequent restoration of the image by Wiener filtering, itself a linear process.

  8. Characterizing high energy spectra of NIF ignition Hohlraums using a differentially filtered high energy multipinhole x-ray imager.

    Science.gov (United States)

    Park, Hye-Sook; Dewald, E D; Glenzer, S; Kalantar, D H; Kilkenny, J D; MacGowan, B J; Maddox, B R; Milovich, J L; Prasad, R R; Remington, B A; Robey, H F; Thomas, C A

    2010-10-01

    Understanding hot electron distributions generated inside Hohlraums is important to the national ignition campaign for controlling implosion symmetry and sources of preheat. While direct imaging of hot electrons is difficult, their spatial distribution and spectrum can be deduced by detecting high energy x-rays generated as they interact with target materials. We used an array of 18 pinholes with four independent filter combinations to image entire Hohlraums with a magnification of 0.87× during the Hohlraum energetics campaign on NIF. Comparing our results with Hohlraum simulations indicates that the characteristic 10-40 keV hot electrons are mainly generated from backscattered laser-plasma interactions rather than from Hohlraum hydrodynamics.

  9. MO-FG-CAMPUS-IeP1-01: Alternative K-Edge Filters for Low-Energy Image Acquisition in Contrast Enhanced Spectral Mammography

    Energy Technology Data Exchange (ETDEWEB)

    Shrestha, S; Vedantham, S; Karellas, A [University of Massachusetts Medical School, Worcester, MA (United States)

    2016-06-15

    Purpose: In Contrast Enhanced Spectral Mammography (CESM), Rh filter is often used during low-energy image acquisition. The potential for using Ag, In and Sn filters, which exhibit K-edge closer to, and just below that of Iodine, instead of the Rh filter, was investigated for the low-energy image acquisition. Methods: Analytical computations of the half-value thickness (HVT) and the photon fluence per mAs (photons/mm2/mAs) for 50µm Rh were compared with other potential K-edge filters (Ag, In and Sn), all with K-absorption edge below that of Iodine. Two strategies were investigated: fixed kVp and filter thickness (50µm for all filters) resulting in HVT variation, and fixed kVp and HVT resulting in variation in Ag, In and Sn thickness. Monte Carlo simulations (GEANT4) were conducted to determine if the scatter-to-primary ratio (SPR) and the point spread function of scatter (scatter PSF) differed between Rh and other K-edge filters. Results: Ag, In and Sn filters (50µm thick) increased photon fluence/mAs by 1.3–1.4, 1.8–2, and 1.7–2 at 28-32 kVp compared to 50µm Rh, which could decrease exposure time. Additionally, the fraction of spectra closer to and just below Iodine’s K-edge increased with these filters, which could improve post-subtraction image contrast. For HVT matched to 50µm Rh filtered spectra, the thickness range for Ag, In, and Sn were (41,44)µm, (49,55)µm and (45,53)µm, and increased photon fluence/mAs by 1.5–1.7, 1.6–2, and 1.6–2.2, respectively. Monte Carlo simulations showed that neither the SPR nor the scatter PSF of Ag, In and Sn differed from Rh, indicating no additional detriment due to x-ray scatter. Conclusion: The use of Ag, In and Sn filters for low-energy image acquisition in CESM is potentially feasible and could decrease exposure time and may improve post-subtraction image contrast. Effect of these filters on radiation dose, contrast, noise and associated metrics are being investigated. Funding Support: Supported in

  10. A background subtraction routine for enhancing energy-filtered plasmon images of MgAl2O4 implanted with Al+ and Mg+ ions

    International Nuclear Information System (INIS)

    Evans, N.D.; Kenik, E.A.; Bentley, J.; Zinkle, S.J.

    1995-01-01

    MgAl2O4, a candidate fusion reactor material, was irradiated with Al+ or Mg+ ions; electron energy-loss spectra and energy-filtered plasmon images showed that metallic Al colloids are present in the ion-irradiated regions. This paper describes in some detail the subtraction of the spinel plasmon component from images formed with 15-eV-loss electrons

  11. Preliminary energy-filtering neutron imaging with time-of-flight method on PKUNIFTY: A compact accelerator based neutron imaging facility at Peking University

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Hu; Zou, Yubin, E-mail: zouyubin@pku.edu.cn; Wen, Weiwei; Lu, Yuanrong; Guo, Zhiyu

    2016-07-01

    Peking University Neutron Imaging Facility (PKUNIFTY) works on an accelerator-based neutron source with a repetition period of 10 ms and a pulse duration of 0.4 ms, which gives a rather low Cd ratio. To improve the effective Cd ratio, and thus the detection capability of the facility, energy-filtering neutron imaging was realized with an intensified CCD camera and the time-of-flight (TOF) method. The time structure of the pulsed neutron source was first simulated with Geant4, and the simulation result was evaluated against experiment. Both simulation and experimental results indicated that fast and epithermal neutrons were concentrated in the first 0.8 ms of each pulse period, while in the period of 0.8-2.0 ms only thermal neutrons were present. Based on this result, neutron images were acquired with and without energy filtering, and the comparison showed that the detection capability of PKUNIFTY was improved by setting the exposure interval to 0.8-2.0 ms, especially for materials with strong moderating capability.

  12. Rotationally invariant correlation filtering

    International Nuclear Information System (INIS)

    Schils, G.F.; Sweeney, D.W.

    1985-01-01

    A method is presented for analyzing and designing optical correlation filters that have tailored rotational invariance properties. The concept of a correlation of an image with a rotation of itself is introduced. A unified theory of rotation-invariant filtering is then formulated. The unified approach describes matched filters (with no rotation invariance) and circular-harmonic filters (with full rotation invariance) as special cases. The continuum of intermediate cases is described in terms of a cyclic convolution operation over angle. The angular filtering approach allows an exact choice for the continuous trade-off between loss of the correlation energy (or specificity regarding the image) and the amount of rotational invariance desired
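
    As a rough illustration of the circular-harmonic end of this continuum, the sketch below resamples an image to polar coordinates about its centre and projects each radius onto exp(-imθ), which is the angular Fourier (circular-harmonic) coefficient from which a rotation-invariant matched filter can be built. It is a generic construction under assumed sampling parameters, not the authors' filter-design procedure.

```python
# Hedged sketch: extract the m-th circular-harmonic component of an image
# about its centre; a filter built from a single harmonic correlates with the
# target independently of its rotation.
import numpy as np
from scipy.ndimage import map_coordinates

def circular_harmonic(image, m, n_r=64, n_theta=256):
    cy, cx = (np.asarray(image.shape) - 1) / 2.0
    r = np.linspace(0.0, min(cy, cx), n_r)
    theta = np.linspace(0.0, 2 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(r, theta, indexing="ij")
    # Cartesian sampling points of the polar grid.
    coords = np.array([cy + rr * np.sin(tt), cx + rr * np.cos(tt)])
    polar = map_coordinates(image.astype(float), coords, order=1)
    # Angular Fourier coefficient at each radius.
    return polar @ np.exp(-1j * m * theta) / n_theta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((128, 128))
    h1 = circular_harmonic(img, m=1)
    print(h1.shape, abs(h1).max())
```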

  13. Comparative study of reconstruction filters for cranio examinations of the Philips system and its influence on the quality of the tomographic image

    International Nuclear Information System (INIS)

    Silveira, V.C.; Kodlulovich, S.; Delduck, R.S.; Oliveira, L.C.G.

    2011-01-01

    The aim of this study was to evaluate different reconstruction algorithms (kernels) applied to head examinations. The research was carried out using a 40-slice MDCT (Philips Brilliance 40 CT scanner) and an ACR phantom to evaluate image quality. The doses were estimated by applying the coefficients obtained from IMPACT. The study showed that, independently of the filter used, the CT number values did not change. The low-contrast evaluation showed that the choice of the correct filter might result in a 9% decrease in dose values. For spatial resolution, the sharp filter showed a better response at low mAs. For certain smooth filters the image noise did not change, even when the mAs values were reduced. (author)

  14. Efficient Implementation of Volterra Filters for De-interlacing TV Images - Snell and Wilcox

    DEFF Research Database (Denmark)

    Budd, Chris; Gravesen, Jens; Willson, Eddie

    1998-01-01

    A standard TV image is transmitted as a series of horizontal lines. To reduce bandwidth in transmission, half of a picture is transmitted in each frame, i.e., information is only given about the picture on alternate lines, a process called interlacing. A difficulty with this process is that other images (for example, computer images) do not have alternate lines omitted. Thus it is desirable to be able to interpolate an existing TV image to obtain information on the missing lines between the transmitted alternate lines; this is the de-interlacing process.

  15. Multicomponent Seismic Imaging of the Cheyenne Belt: Data Improvement Through Non-Conventional Filtering

    Science.gov (United States)

    Johnson, R. A.; Shoshitaishvili, E.; Sorenson, L. S.

    2001-12-01

    The Cheyenne Belt in southeastern Wyoming separates Archean Wyoming Craton from accreted juvenile Proterozoic crust making it one of the fundamental sutures in the Proterozoic assemblage of western North America. As one of the multidisciplinary components of the Continental Dynamics - Rocky Mountains Transect project (CDROM), reflection seismic data were acquired from south-central Wyoming to central Colorado to characterize crustal structure associated with this boundary and younger Proterozoic shear zones to the south. In addition to acquisition of more conventional vertical-component data, 3-component data were acquired to better constrain rock properties and reflection directionality, providing information that tends to be lost with one-component recording. In order to achieve the highest possible signal-to-noise ratios in the processed data, considerable work was focused on removal of noise caused by private vehicles driving on forest roads during active recording and, perhaps more problematical, harmonic noise generated from power-line and other electrical-equipment interference. Noise generated from these sources was successfully attenuated using 1) short-window 2D FFT filtering to remove irregular, high-amplitude vehicular noise, and 2) harmonic-noise-subtraction algorithms developed at the University of Arizona to remove harmonic electrical-induction noise. This latter filtering procedure used a time-domain-based method of automatic estimation of noise frequencies and their amplitudes, followed by subtraction of these estimated anomalous harmonics from the data. Since the technique estimates the best fit of noise for the entire trace, subtraction of the noise avoids many of the deleterious effects of simple notch filtering. After noise removal, it was possible to pick both P-wave and S-wave first arrivals and model shallow subsurface rock properties. This model provides a link between deeper events and the surface geology.
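
    The harmonic-noise subtraction described above amounts to estimating the amplitude and phase of the power-line fundamental and its harmonics over the whole trace and subtracting the fitted sinusoids. The sketch below shows that idea with a least-squares fit at a fixed, assumed 60 Hz fundamental; the cited algorithm additionally estimates the noise frequencies automatically, which is not reproduced here.

```python
# Hedged sketch of harmonic noise subtraction: fit sine/cosine amplitudes at a
# power-line fundamental and its harmonics over the whole trace, then subtract.
import numpy as np

def subtract_harmonics(trace, fs, f0=60.0, n_harmonics=5):
    t = np.arange(len(trace)) / fs
    cols = []
    for k in range(1, n_harmonics + 1):
        cols.append(np.cos(2 * np.pi * k * f0 * t))
        cols.append(np.sin(2 * np.pi * k * f0 * t))
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, trace, rcond=None)   # best-fit harmonic amplitudes
    return trace - A @ coeffs

if __name__ == "__main__":
    fs = 1000.0
    t = np.arange(4000) / fs
    signal = np.exp(-((t - 2.0) ** 2) / 0.01)             # a synthetic "reflection"
    noisy = signal + 0.5 * np.sin(2 * np.pi * 60.0 * t + 0.3)
    clean = subtract_harmonics(noisy, fs)
    print(float(np.abs(clean - signal).max()))             # residual after subtraction
```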

  16. Biomedical bandpass filter for fluorescence microscopy imaging based on TiO2/SiO2 and TiO2/MgF2 dielectric multilayers

    International Nuclear Information System (INIS)

    Butt, M A; Fomchenkov, S A; Verma, P; Khonina, S N; Ullah, A

    2016-01-01

    We report a design for creating multilayer dielectric optical filters based on TiO2 and SiO2/MgF2 alternating layers. We selected titanium dioxide (TiO2) as the high-refractive-index layer (2.5), and silicon dioxide (SiO2) and magnesium fluoride (MgF2) as low-refractive-index layers (1.45 and 1.37, respectively). Miniaturized visible spectrometers are useful for quick and mobile characterization of biological samples. Such devices can be fabricated using Fabry-Perot (FP) filters consisting of two highly reflecting mirrors with a central cavity in between. Distributed Bragg reflectors (DBRs), consisting of alternating high and low refractive index material pairs, are the most commonly used mirrors in FP filters due to their high reflectivity. However, DBRs have high reflectivity only over a selected range of wavelengths known as the stopband of the DBR. This range is usually much smaller than the sensitivity range of the spectrometer. Therefore bandpass filters are required to reject wavelengths outside the stopband of the FP DBRs. The proposed filter shows high quality, with an average transmission of 97.4% within the passbands and a transmission outside the passband of around 4%. Special attention has been given to keeping the thickness of the filters within economic limits. It can be suggested that these filters are an exceptional choice for fluorescence imaging and endoscopic narrow-band imaging. (paper)
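
    The stopband behaviour of such a TiO2/SiO2 stack can be checked with the standard characteristic-matrix (transfer-matrix) method. The sketch below does this for a quarter-wave stack at normal incidence using the indices quoted above; the centre wavelength, substrate index and number of pairs are assumed for illustration and are not taken from the paper.

```python
# Hedged sketch: normal-incidence reflectance of a quarter-wave TiO2/SiO2 Bragg
# stack via the standard characteristic (transfer) matrix method.
import numpy as np

def stack_reflectance(wavelength_nm, layers, n_in=1.0, n_sub=1.52):
    """layers: list of (refractive_index, thickness_nm); lossless, normal incidence."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2 * np.pi * n * d / wavelength_nm          # phase thickness
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])
    r = (n_in * B - C) / (n_in * B + C)                    # amplitude reflectance
    return abs(r) ** 2

if __name__ == "__main__":
    design_wl = 550.0                                       # assumed stopband centre (nm)
    pair = [(2.5, design_wl / (4 * 2.5)), (1.45, design_wl / (4 * 1.45))]
    dbr = pair * 8                                          # 8 high/low quarter-wave pairs
    for wl in (450.0, 550.0, 650.0):
        print(wl, round(stack_reflectance(wl, dbr), 4))
```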

  17. Geometry-Driven-Diffusion filtering of MR Brain Images using dissimilarities and optimal relaxation parameter

    Energy Technology Data Exchange (ETDEWEB)

    Bajla, Ivan [Austrian Research Centres Seibersdorf, Department of High Performance Image Processing and Video-Technology, A-2444 Seibersdorf (Austria); Hollander, Igor [Institute of Information Processing, Austrian Academy of Sciences, Sonnenfelsgasse 19/2, 1010 Wien (Austria)]

    1999-12-31

    A novel method for locally adapting the conductance using a pixel dissimilarity measure is developed. An alternative processing methodology is proposed, based on the intensity gradient histogram calculated for region interiors and boundaries of a phantom that models real MR brain scans. It involves a specific cost function suitable for the calculation of the optimum relaxation parameter Kopt and for the selection of the optimal exponential conductance. Computer experiments on locally adaptive geometry-driven-diffusion filtering of an MR brain phantom have been performed and evaluated. (authors) 6 refs., 3 figs., 2 tabs.

  18. Geometry-Driven-Diffusion filtering of MR Brain Images using dissimilarities and optimal relaxation parameter

    International Nuclear Information System (INIS)

    Bajla, Ivan; Hollander, Igor

    1998-01-01

    A novel method for locally adapting the conductance using a pixel dissimilarity measure is developed. An alternative processing methodology is proposed, based on the intensity gradient histogram calculated for region interiors and boundaries of a phantom that models real MR brain scans. It involves a specific cost function suitable for the calculation of the optimum relaxation parameter Kopt and for the selection of the optimal exponential conductance. Computer experiments on locally adaptive geometry-driven-diffusion filtering of an MR brain phantom have been performed and evaluated. (authors)

  19. Cross-correlated imaging of single-mode photonic crystal rod fiber with distributed mode filtering

    DEFF Research Database (Denmark)

    Laurila, Marko; Barankov, Roman; Jørgensen, Mette Marie

    2013-01-01

    Photonic crystal bandgap fibers employing a distributed mode filtering design provide near diffraction-limited light outputs, a critical property of fiber-based high-power lasers. The microstructure of the fibers is tailored to achieve single-mode operation at a specific wavelength by resonant mode coupling... We identify regimes of resonant coupling between higher-order core modes and the cladding band. We demonstrate a passive fiber design in which the higher-order modal content inside the single-mode guiding regime is suppressed by at least 20 dB, even for significantly misaligned input-coupling configurations.

  20. Initial results of a new generation dual source CT system using only an in-plane comb filter for ultra-high resolution temporal bone imaging

    International Nuclear Information System (INIS)

    Meyer, Mathias; Haubenreisser, Holger; Schoenberg, Stefan O.; Henzler, Thomas; Raupach, Rainer; Schmidt, Bernhard; Leidecker, Christianne; Allmendinger, Thomas; Flohr, Thomas; Lietzmann, Florian; Schad, Lothar R.

    2015-01-01

    To prospectively evaluate radiation dose and image quality of a third generation dual-source CT (DSCT) without a z-axis filter behind the patient for temporal bone CT. Forty-five patients were examined on either a first, second, or third generation DSCT in an ultra-high-resolution (UHR) temporal bone imaging mode. On the third generation DSCT system, the tighter focal spot of 0.2 mm² removes the necessity for an additional z-axis filter, leading to improved z-axis radiation dose efficiency. Images of 0.4 mm were reconstructed using standard filtered back-projection or an iterative reconstruction (IR) technique for the previous generations of DSCT and a novel IR algorithm for the third generation DSCT. Radiation dose and image quality were compared between the three DSCT systems. The statistically significantly highest subjective and objective image quality was found for the third generation DSCT when compared to the first or second generation DSCT systems (all p < 0.05). The total effective dose was 63%/39% lower for the third generation examination as compared to the first and second generation DSCT. Temporal bone imaging without a z-axis UHR filter and with a novel third generation IR algorithm allows for significantly higher image quality while lowering the effective dose when compared to the first two generations of DSCTs. (orig.)

  1. Adaptive bilateral filter for image denoising and its application to in-vitro Time-of-Flight data

    Science.gov (United States)

    Seitel, Alexander; dos Santos, Thiago R.; Mersmann, Sven; Penne, Jochen; Groch, Anja; Yung, Kwong; Tetzlaff, Ralf; Meinzer, Hans-Peter; Maier-Hein, Lena

    2011-03-01

    Image-guided therapy systems generally require registration of pre-operative planning data with the patient's anatomy. One common approach to achieve this is to acquire intra-operative surface data and match it to surfaces extracted from the planning image. Although increasingly popular for surface generation in general, the novel Time-of-Flight (ToF) technology has not yet been applied in this context. This may be attributed to the fact that ToF range images are subject to considerable noise. The contribution of this study is two-fold. Firstly, we present an adaptation of the well-known bilateral filter for denoising ToF range images based on the noise characteristics of the camera. Secondly, we assess the quality of organ surfaces generated from ToF range data with and without bilateral smoothing, using corresponding high resolution CT data as ground truth. According to an evaluation on five porcine organs, the root mean squared (RMS) distance between the denoised ToF data points and the reference computed tomography (CT) surfaces ranged from 3.0 mm (lung) to 9.0 mm (kidney). This corresponds to an error reduction of up to 36% compared to the error of the original ToF surfaces.
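
    For reference, a plain bilateral filter combines a spatial Gaussian with a range (depth) Gaussian. The adaptation described above ties the range kernel to the ToF camera's noise model; that is not reproduced in this minimal sketch, where both sigmas are simply fixed parameters applied to a synthetic range image.

```python
# Hedged sketch of a plain bilateral filter on a range image.
import numpy as np

def bilateral_filter(img, sigma_s=2.0, sigma_r=0.1, radius=4):
    img = img.astype(float)
    out = np.zeros_like(img)
    norm = np.zeros_like(img)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))   # spatial kernel
    padded = np.pad(img, radius, mode="edge")
    H, W = img.shape
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            shifted = padded[radius + dy:radius + dy + H, radius + dx:radius + dx + W]
            # Range kernel: weight drops with depth difference.
            w = spatial[dy + radius, dx + radius] * np.exp(-(shifted - img)**2 / (2 * sigma_r**2))
            out += w * shifted
            norm += w
    return out / norm

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    depth = np.tile(np.linspace(0.5, 1.5, 64), (64, 1))      # synthetic range image (m)
    noisy = depth + 0.02 * rng.standard_normal(depth.shape)
    print(float(np.std(noisy - depth)), float(np.std(bilateral_filter(noisy) - depth)))
```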

  2. Eigenimage filtering in MR imaging: Application in the abnormal chest wall

    International Nuclear Information System (INIS)

    Compton, R.A.; Kormos, D.W.; Kritzer, M.R.; Lent, A.H.; Murdoch, J.B.; Yeung, H.N.

    1986-01-01

    Three-dimensional multislab acquisition permits the simultaneous imaging of many thin sections in selected "slabs." The technique was proposed in 1985, clinically validated in 1986, and has now been further improved by recent work. The authors have applied the methodology of 1985 to design new "slabcutter" 90° and 180° radio-frequency pulses that excite slabs which are "squarer" than any previously known. These new pulses increase the number of usable sections per slab by 40% and also improve image quality. They demonstrate the enhancement by imaging two separate slabs of the temporomandibular joint simultaneously.

  3. Context-based adaptive filtering of interest points in image retrieval

    DEFF Research Database (Denmark)

    Nguyen, Phuong Giang; Andersen, Hans Jørgen

    2009-01-01

    Interest points have been used as local features with success in many computer vision applications such as image/video retrieval and object recognition. However, a major issue with this approach is the large number of interest points detected in each image, which creates a dense feature space... We therefore select a subset of features. Our approach differs from others in that the feature selection is based on the context of the given image. Our experimental results show a significant reduction rate of features while preserving the retrieval performance.

  4. Phase-correcting non-local means filtering for diffusion-weighted imaging of the spinal cord.

    Science.gov (United States)

    Kafali, Sevgi Gokce; Çukur, Tolga; Saritas, Emine Ulku

    2018-02-09

    DWI suffers from low SNR when compared to anatomical MRI. To maintain reasonable SNR at relatively high spatial resolution, multiple acquisitions must be averaged. However, subject motion or involuntary physiological motion during the diffusion-sensitizing gradients causes phase offsets among acquisitions. When the motion is localized to a small region, these phase offsets become particularly problematic. Complex averaging of acquisitions leads to cancellations from these phase offsets, whereas magnitude averaging results in noise amplification. Here, we propose an improved reconstruction for multi-acquisition DWI that effectively corrects for phase offsets while reducing noise. Each acquisition is processed with a refocusing reconstruction for global phase correction and a partial k-space reconstruction via projection-onto-convex-sets (POCS). The proposed reconstruction then embodies a new phase-correcting non-local means (PC-NLM) filter. PC-NLM is performed on the complex-valued outputs of the POCS algorithm aggregated across acquisitions. The PC-NLM filter leverages the shared structure among multiple acquisitions to simultaneously alleviate nuisance factors including phase offsets and noise. Extensive simulations and in vivo DWI experiments on the cervical spinal cord are presented. The results demonstrate that the proposed reconstruction improves image quality by mitigating signal loss due to phase offsets and by reducing noise. Importantly, these improvements are achieved while preserving the accuracy of apparent diffusion coefficient maps. An improved reconstruction incorporating a PC-NLM filter for multi-acquisition DWI is presented. This reconstruction can be particularly beneficial for high-resolution or high-b-value DWI acquisitions that suffer from low SNR and phase offsets from local motion. © 2018 International Society for Magnetic Resonance in Medicine.
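
    The filtering step of PC-NLM can be pictured as non-local means applied to complex-valued data, with patch distances computed on the complex values so that phase structure contributes to the weights. The sketch below shows only that generic step on a single synthetic complex image; the refocusing reconstruction, the POCS partial-k-space recovery and the aggregation across acquisitions described above are not included.

```python
# Hedged sketch: non-local means on a complex-valued image, as a stand-in for
# the filtering step of PC-NLM (multi-acquisition handling omitted).
import numpy as np

def nlm_complex(img, patch=3, search=5, h=0.3):
    p, s = patch // 2, search // 2
    padded = np.pad(img, p + s, mode="reflect")
    H, W = img.shape
    out = np.zeros_like(img, dtype=complex)
    for y in range(H):
        for x in range(W):
            yc, xc = y + p + s, x + p + s
            ref = padded[yc - p:yc + p + 1, xc - p:xc + p + 1]
            weights, acc = 0.0, 0.0 + 0.0j
            for dy in range(-s, s + 1):
                for dx in range(-s, s + 1):
                    nb = padded[yc + dy - p:yc + dy + p + 1, xc + dx - p:xc + dx + p + 1]
                    d2 = np.mean(np.abs(nb - ref) ** 2)     # complex patch distance
                    w = np.exp(-d2 / h ** 2)
                    weights += w
                    acc += w * padded[yc + dy, xc + dx]
            out[y, x] = acc / weights
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    truth = np.ones((32, 32), dtype=complex)
    noise = 0.1 * (rng.standard_normal((32, 32)) + 1j * rng.standard_normal((32, 32)))
    noisy = truth + noise
    print(float(np.abs(noisy - truth).mean()), float(np.abs(nlm_complex(noisy) - truth).mean()))
```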

  5. SU-E-J-243: Possibility of Exposure Dose Reduction of Cone-Beam Computed Tomography in An Image Guided Patient Positioning System by Using Various Noise Suppression Filters

    International Nuclear Information System (INIS)

    Kamezawa, H; Arimura, H; Ohki, M; Shirieda, K; Kameda, N

    2014-01-01

    Purpose: To investigate the possibility of exposure dose reduction of cone-beam computed tomography (CBCT) in an image guided patient positioning system by using 6 noise suppression filters. Methods: First, reference dose (RD) and low-dose (LD) CBCT (X-ray volume imaging system, Elekta Co.) images were acquired with a reference dose of 86.2 mGy (weighted CT dose index: CTDIw) and various low doses of 1.4 to 43.1 mGy, respectively. Second, an automated rigid registration for three axes was performed for estimating setup errors between a planning CT image and the LD-CBCT images, which were processed by 6 noise suppression filters, i.e., an averaging filter (AF), median filter (MF), Gaussian filter (GF), bilateral filter (BF), edge preserving smoothing filter (EPF) and adaptive partial median filter (AMF). Third, residual errors representing the patient positioning accuracy were calculated as the Euclidean distance between the setup error vectors estimated using the LD-CBCT image and the RD-CBCT image. Finally, the relationships between the residual error and CTDIw were obtained for the 6 noise suppression filters, and the CTDIw for LD-CBCT images processed by the noise suppression filters was measured at the same residual error as obtained with the RD-CBCT. This approach was applied to an anthropomorphic pelvic phantom and two cancer patients. Results: For the phantom, the exposure dose could be reduced by 61% (GF) to 78% (AMF) by applying the noise suppression filters to the CBCT images. The exposure dose in a prostate cancer case could be reduced by 8% (AF) to 61% (AMF), and the exposure dose in a lung cancer case could be reduced by 9% (AF) to 37% (AMF). Conclusion: Using noise suppression filters, particularly an adaptive partial median filter, could make it feasible to decrease the additional exposure dose to patients in image guided patient positioning systems.

  6. SU-E-J-243: Possibility of Exposure Dose Reduction of Cone-Beam Computed Tomography in An Image Guided Patient Positioning System by Using Various Noise Suppression Filters

    Energy Technology Data Exchange (ETDEWEB)

    Kamezawa, H [Graduate School of Medical Sciences, Kyushu University, Higashi-ku, Fukuoka (Japan); Fujimoto General Hospital, Miyakonojo, Miyazaki (Japan); Arimura, H; Ohki, M [Faculty of Medical Sciences, Kyushu University, Higashi-ku, Fukuoka (Japan); Shirieda, K; Kameda, N [Fujimoto General Hospital, Miyakonojo, Miyazaki (Japan)

    2014-06-01

    Purpose: To investigate the possibility of exposure dose reduction of cone-beam computed tomography (CBCT) in an image guided patient positioning system by using 6 noise suppression filters. Methods: First, reference dose (RD) and low-dose (LD) CBCT (X-ray volume imaging system, Elekta Co.) images were acquired with a reference dose of 86.2 mGy (weighted CT dose index: CTDIw) and various low doses of 1.4 to 43.1 mGy, respectively. Second, an automated rigid registration for three axes was performed for estimating setup errors between a planning CT image and the LD-CBCT images, which were processed by 6 noise suppression filters, i.e., an averaging filter (AF), median filter (MF), Gaussian filter (GF), bilateral filter (BF), edge preserving smoothing filter (EPF) and adaptive partial median filter (AMF). Third, residual errors representing the patient positioning accuracy were calculated as the Euclidean distance between the setup error vectors estimated using the LD-CBCT image and the RD-CBCT image. Finally, the relationships between the residual error and CTDIw were obtained for the 6 noise suppression filters, and the CTDIw for LD-CBCT images processed by the noise suppression filters was measured at the same residual error as obtained with the RD-CBCT. This approach was applied to an anthropomorphic pelvic phantom and two cancer patients. Results: For the phantom, the exposure dose could be reduced by 61% (GF) to 78% (AMF) by applying the noise suppression filters to the CBCT images. The exposure dose in a prostate cancer case could be reduced by 8% (AF) to 61% (AMF), and the exposure dose in a lung cancer case could be reduced by 9% (AF) to 37% (AMF). Conclusion: Using noise suppression filters, particularly an adaptive partial median filter, could make it feasible to decrease the additional exposure dose to patients in image guided patient positioning systems.
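
    The comparison above boils down to a bank of smoothing filters applied to the low-dose CBCT volume and a residual error defined as the Euclidean distance between setup-error vectors. The sketch below shows that scaffolding with the standard averaging, median and Gaussian filters from SciPy; the registration step is omitted, the setup-error vectors are placeholders, and the paper's edge-preserving and adaptive partial median filters are custom implementations not shown here.

```python
# Hedged sketch: filter bank and residual-error metric from the abstract.
# Registration is not implemented; setup-error vectors are placeholders.
import numpy as np
from scipy.ndimage import uniform_filter, median_filter, gaussian_filter

def residual_error(setup_error_ld, setup_error_rd):
    """Euclidean distance between setup-error vectors (x, y, z) in mm."""
    return float(np.linalg.norm(np.asarray(setup_error_ld) - np.asarray(setup_error_rd)))

filter_bank = {
    "AF": lambda vol: uniform_filter(vol, size=3),     # averaging filter
    "MF": lambda vol: median_filter(vol, size=3),      # median filter
    "GF": lambda vol: gaussian_filter(vol, sigma=1.0), # Gaussian filter
}

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    low_dose_cbct = rng.standard_normal((32, 32, 32))  # stand-in volume
    # Hypothetical setup errors (mm) that a registration step would return.
    err_ld, err_rd = (1.2, -0.4, 0.8), (1.0, -0.5, 0.9)
    for name, f in filter_bank.items():
        filtered = f(low_dose_cbct)
        print(name, filtered.shape, residual_error(err_ld, err_rd))
```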

  7. Reducing contrast contamination in radial turbo-spin-echo acquisitions by combining a narrow-band KWIC filter with parallel imaging.

    Science.gov (United States)

    Neumann, Daniel; Breuer, Felix A; Völker, Michael; Brandt, Tobias; Griswold, Mark A; Jakob, Peter M; Blaimer, Martin

    2014-12-01

    Cartesian turbo spin-echo (TSE) and radial TSE images are usually reconstructed by assembling data containing different contrast information into a single k-space. This approach results in mixed contrast contributions in the images, which may reduce their diagnostic value. The goal of this work is to improve the image contrast of radial TSE acquisitions by reducing the contribution of signals with undesired contrast information. Radial TSE acquisitions allow the reconstruction of multiple images with different T2 contrasts using the k-space weighted image contrast (KWIC) filter. In this work, the image contrast is improved by reducing the bandwidth of the KWIC filter. Data for the reconstruction of a single image are selected from within a small temporal range around the desired echo time. The resulting dataset is undersampled and, therefore, an iterative parallel imaging algorithm is applied to remove aliasing artifacts. Radial TSE images of the human brain reconstructed with the proposed method show an improved contrast when compared with Cartesian TSE images or radial TSE images with conventional KWIC reconstructions. The proposed method provides multi-contrast images from radial TSE data with contrasts similar to multi-spin-echo images. Contaminations from unwanted contrast weightings are strongly reduced. © 2014 Wiley Periodicals, Inc.

  8. Intraindividual evaluation of the influence of iterative reconstruction and filter kernel on subjective and objective image quality in computed tomography of the brain

    Energy Technology Data Exchange (ETDEWEB)

    Buhk, J.H. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Dept. of Neuroradiology; Laqmani, A.; Schultzendorff, H.C. von; Hammerle, D.; Adam, G.; Regier, M. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Dept. of Diagnostic and Interventional Radiology; Sehner, S. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Inst. of Medical Biometry and Epidemiology; Fiehler, J. [Univ. Medical Center, Hamburg-Eppendorf (Germany). Neuroradiology; Nagel, H.D. [Dr. HD Nagel, Science and Technology for Radiology, Buchholz (Germany)

    2013-08-15

    Objectives: To intraindividually evaluate the potential of 4th generation iterative reconstruction (IR) on brain CT with regard to subjective and objective image quality. Methods: 31 consecutive raw data sets of clinical routine native sequential brain CT scans were reconstructed with IR level 0 (= filtered back projection), 1, 3 and 4; 3 different brain filter kernels (smooth/standard/sharp) were applied respectively. Five independent radiologists with different levels of experience performed subjective image rating. Detailed ROI analysis of image contrast and noise was performed. Statistical analysis was carried out by applying a random intercept model. Results: Subjective scores for the smooth and the standard kernels were best at low IR levels, but both, in particular the smooth kernel, scored inferior with an increasing IR level. The sharp kernel scored lowest at IR 0, while the scores substantially increased at high IR levels, reaching significantly best scores at IR 4. Objective measurements revealed an overall increase in contrast-to-noise ratio at higher IR levels, which was highest when applying the soft filter kernel. The absolute grey-white contrast decreased with an increasing IR level and was highest when applying the sharp filter kernel. All subjective effects were independent of the raters' experience and the patients' age and sex. Conclusion: Different combinations of IR level and filter kernel substantially influence subjective and objective image quality of brain CT. (orig.)

  9. Intraindividual evaluation of the influence of iterative reconstruction and filter kernel on subjective and objective image quality in computed tomography of the brain

    International Nuclear Information System (INIS)

    Buhk, J.H.

    2013-01-01

    Objectives: To intraindividually evaluate the potential of 4th generation iterative reconstruction (IR) on brain CT with regard to subjective and objective image quality. Methods: 31 consecutive raw data sets of clinical routine native sequential brain CT scans were reconstructed with IR level 0 (= filtered back projection), 1, 3 and 4; 3 different brain filter kernels (smooth/standard/sharp) were applied respectively. Five independent radiologists with different levels of experience performed subjective image rating. Detailed ROI analysis of image contrast and noise was performed. Statistical analysis was carried out by applying a random intercept model. Results: Subjective scores for the smooth and the standard kernels were best at low IR levels, but both, in particular the smooth kernel, scored inferior with an increasing IR level. The sharp kernel scored lowest at IR 0, while the scores substantially increased at high IR levels, reaching significantly best scores at IR 4. Objective measurements revealed an overall increase in contrast-to-noise ratio at higher IR levels, which was highest when applying the soft filter kernel. The absolute grey-white contrast decreased with an increasing IR level and was highest when applying the sharp filter kernel. All subjective effects were independent of the raters' experience and the patients' age and sex. Conclusion: Different combinations of IR level and filter kernel substantially influence subjective and objective image quality of brain CT. (orig.)

  10. Exposure reduction in general dental practice using digital x-ray imaging system for intraoral radiography with additional x-ray beam filter

    International Nuclear Information System (INIS)

    Shibuya, Hitoshi; Mori, Toshimichi; Hayakawa, Yoshihiko; Kuroyanagi, Kinya; Ota, Yoshiko

    1997-01-01

    To measure exposure reduction in general dental practice using digital x-ray imaging systems for intraoral radiography with additional x-ray beam filter. Two digital x-ray imaging systems, Pana Digital (Pana-Heraus Dental) and CDR (Schick Technologies), were applied for intraoral radiography in general dental practice. Due to the high sensitivity to x-rays, additional x-ray beam filters for output reduction were used for examination. An Orex W II (Osada Electric Industry) x-ray generator was operated at 60 kVp, 7 mA. X-ray output (air-kerma; Gy) necessary for obtaining clinically acceptable images was measured at 0 to 20 cm in 5 cm steps from the cone tip using an ionizing chamber type 660 (Nuclear Associates) and compared with those for Ektaspeed Plus film (Eastman Kodak). The Pana Digital system was used with the optional filter supplied by Pana-Heraus Dental which reduced the output to 38%. The exposure necessary to obtain clinically acceptable images was only 40% of that for the film. The CDR system was used with the Dental X-ray Beam Filter Kit (Eastman Kodak) which reduced the x-ray output to 30%. The exposure necessary to obtain clinically acceptable images was only 20% of that for the film. The two digital x-ray imaging systems, Pana Digital and CDR, provided large dose savings (60-80%) compared with Ektaspeed Plus film when applied for intraoral radiography in general dental practice. (author)

  11. Generalised Filtering

    Directory of Open Access Journals (Sweden)

    Karl Friston

    2010-01-01

    We describe a Bayesian filtering scheme for nonlinear state-space models in continuous time. This scheme is called Generalised Filtering and furnishes posterior (conditional) densities on hidden states and unknown parameters generating observed data. Crucially, the scheme operates online, assimilating data to optimize the conditional density on time-varying states and time-invariant parameters. In contrast to Kalman and particle smoothing, Generalised Filtering does not require a backwards pass. In contrast to variational schemes, it does not assume conditional independence between the states and parameters. Generalised Filtering optimises the conditional density with respect to a free-energy bound on the model's log-evidence. This optimisation uses the generalised motion of hidden states and parameters, under the prior assumption that the motion of the parameters is small. We describe the scheme, present comparative evaluations with a fixed-form variational version, and conclude with an illustrative application to a nonlinear state-space model of brain imaging time-series.

  12. A new parallel algorithm and its simulation on hypercube simulator for low pass digital image filtering using systolic array

    International Nuclear Information System (INIS)

    Al-Hallaq, A.; Amin, S.

    1998-01-01

    This paper introduces a new parallel algorithm and its simulation on a hypercube simulator for low-pass digital image filtering using a systolic array. The new algorithm is faster than the old one (Amin, 1988). This is due to the fact that the old algorithm carries out the addition operations in a sequential mode. In our new design these addition operations are divided into two groups, which can be performed in parallel: one group is performed on one half of the systolic array and the other on the second half, that is, by folding. This parallelism reduces the time required for the whole process by almost a quarter of the time of the old algorithm. (authors). 18 refs., 3 figs

  13. Object-Oriented Semisupervised Classification of VHR Images by Combining MedLDA and a Bilateral Filter

    Directory of Open Access Journals (Sweden)

    Shi He

    2015-01-01

    A Bayesian hierarchical model is presented to classify very high resolution (VHR) images in a semisupervised manner, in which a maximum entropy discrimination latent Dirichlet allocation (MedLDA) model and a bilateral filter are combined into a novel application framework. The primary contribution of this paper is to nullify the disadvantages of traditional probabilistic topic models with respect to pixel-level supervised information and to achieve effective classification of VHR remote sensing images. The framework consists of the following two iterative steps. In the training stage, the model utilizes the central labeled pixel and its neighborhood, as a square labeled image object, to train the classifiers. In the classification stage, each central unlabeled pixel with its neighborhood, as an unlabeled object, is assigned the user-provided geoobject class label with the maximum posterior probability. Gibbs sampling is adopted for model inference. The experimental results demonstrate that the proposed method outperforms two classical SVM-based supervised classification methods and probabilistic-topic-model-based classification methods.

  14. Water level response measurement in a steel cylindrical liquid storage tank using image filter processing under seismic excitation

    Science.gov (United States)

    Kim, Sung-Wan; Choi, Hyoung-Suk; Park, Dong-Uk; Baek, Eun-Rim; Kim, Jae-Min

    2018-02-01

    Sloshing refers to the movement of fluid that occurs when kinetic energy (e.g., from excitation and vibration) is continuously applied to the fluid inside a storage tank. As the movement induced by an external force approaches the resonance frequency of the fluid, the effect of sloshing increases, and this can lead to a serious problem for the structural stability of the system. Thus, it is important to accurately understand the physics of sloshing and to effectively suppress and reduce it. In addition, a method for the economical measurement of the water level response of a liquid storage tank is needed for the exact analysis of sloshing. In this study, an image-based method was employed to measure the water level response of a liquid storage tank. The water level response was measured using an image filter processing algorithm that reduces the light-induced noise of the fluid and sharpens the structure installed in the liquid storage tank. A shaking table test was performed to verify the validity of this image-based method of measuring the water level response, and the result was analyzed and compared with the response measured using a water level gauge.

  15. Using photoshop filters to create anatomic line-art medical images.

    Science.gov (United States)

    Kirsch, Jacobo; Geller, Brian S

    2006-08-01

    There are multiple ways to obtain anatomic drawings suitable for publication or presentations. This article demonstrates how to use Photoshop to alter digital radiologic images to create line-art illustrations in a quick and easy way. We present two simple-to-use methods; however, not every image can adequately be transformed, and personal preferences and specific changes need to be applied to each image to obtain the desired result. Medical illustrators have always played a major role in radiology and the medical education process. Whether used to teach a complex surgical or radiologic procedure, to define typical or atypical patterns of the spread of disease, or to illustrate normal or aberrant anatomy, medical illustration significantly affects learning. However, if you are not an accomplished illustrator, the alternatives can be expensive (contacting a professional medical illustrator or buying an existing stock of digital images) or simply not applicable to what you are trying to communicate. The purpose of this article is to demonstrate how, by using Photoshop (Adobe Systems, San Jose, CA) to alter digital radiologic images, line-art illustrations can be created in a quick, inexpensive, and easy way in preparation for electronic presentations and publication.

  16. High-dynamic range compressive spectral imaging by grayscale coded aperture adaptive filtering

    Directory of Open Access Journals (Sweden)

    Nelson Eduardo Diaz

    2015-09-01

    The coded aperture snapshot spectral imaging system (CASSI) is an imaging architecture which senses the three-dimensional information of a scene with two-dimensional (2D) focal plane array (FPA) coded projection measurements. A reconstruction algorithm takes advantage of the sparsity of the compressive measurements to recover the underlying 3D data cube. Traditionally, CASSI uses block-unblock coded apertures (BCA) to spatially modulate the light. In CASSI the quality of the reconstructed images depends on the design of these coded apertures and the FPA dynamic range. This work presents a new CASSI architecture based on grayscale coded apertures (GCA), which reduce FPA saturation and increase the dynamic range of the reconstructed images. The set of GCA is calculated in a real-time adaptive manner, exploiting the information from the FPA compressive measurements. Extensive simulations show the improvement attained in the quality of the reconstructed images when GCA are employed. In addition, a comparison between traditional coded apertures and GCA is performed with respect to noise tolerance.

  17. Retina-Inspired Filter.

    Science.gov (United States)

    Doutsi, Effrosyni; Fillatre, Lionel; Antonini, Marc; Gaulmin, Julien

    2018-07-01

    This paper introduces a novel filter, which is inspired by the human retina. The human retina consists of three different layers: the outer plexiform layer (OPL), the inner plexiform layer, and the ganglionic layer. Our inspiration is the linear transform which takes place in the OPL and has been mathematically described by the neuroscientific model "virtual retina." This model is the cornerstone for deriving the non-separable spatio-temporal OPL retina-inspired filter, briefly renamed the retina-inspired filter, studied in this paper. This filter is connected to the dynamic behavior of the retina, which enables the retina to increase the sharpness of the visual stimulus during filtering before its transmission to the brain. We establish that this retina-inspired transform forms a group of spatio-temporal Weighted Difference of Gaussian (WDoG) filters when it is applied to a still image visible for a given time. We analyze the spatial frequency bandwidth of the retina-inspired filter with respect to time. It is shown that the WDoG spectrum varies from a lowpass filter to a bandpass filter. Therefore, as time increases, the retina-inspired filter enables the extraction of different kinds of information from the input image. Finally, we discuss the benefits of using the retina-inspired filter in image processing applications such as edge detection and compression.
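
    For a still image and a fixed observation time, the transform above reduces to a weighted difference of Gaussians. The purely spatial sketch below uses assumed sigmas and sweeps the surround weight to show the drift from lowpass-like to bandpass-like behaviour; the actual retina-inspired filter is non-separable in space and time and is not reproduced here.

```python
# Hedged sketch: a weighted difference-of-Gaussians (WDoG) filter. Increasing
# the surround weight moves the response from lowpass-like to bandpass-like.
import numpy as np
from scipy.ndimage import gaussian_filter

def wdog(img, sigma_center=1.0, sigma_surround=3.0, surround_weight=0.8):
    center = gaussian_filter(img.astype(float), sigma_center)
    surround = gaussian_filter(img.astype(float), sigma_surround)
    return center - surround_weight * surround

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    img = rng.random((64, 64))
    for w in (0.0, 0.5, 1.0):   # 0: essentially lowpass; 1: bandpass (edge enhancing)
        print(w, float(np.std(wdog(img, surround_weight=w))))
```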

  18. Choosing and using astronomical filters

    CERN Document Server

    Griffiths, Martin

    2014-01-01

    As a casual read through any of the major amateur astronomical magazines will demonstrate, there are filters available for all aspects of optical astronomy. This book provides a ready resource on the use of the following filters, among others, for observational astronomy or for imaging: light pollution filters, planetary filters, solar filters, neutral density filters for Moon observation, and deep-sky filters for such objects as galaxies, nebulae and more. Deep-sky objects can be imaged in much greater detail than was possible many years ago. Amateur astronomers can take

  19. Extraction of topographic and material contrasts on surfaces from SEM images obtained by energy filtering detection with low-energy primary electrons

    International Nuclear Information System (INIS)

    Nagoshi, Masayasu; Aoyama, Tomohiro; Sato, Kaoru

    2013-01-01

    Scanning electron microscope (SEM) images have been obtained for practical materials using low primary electron energies and an in-lens type annular detector, with a varying negative bias voltage supplied to a grid placed in front of the detector. The kinetic-energy distribution of the detected electrons was evaluated from the gradient of the bias-energy dependence of the image brightness. This distribution divides into two main parts at about 500 V, with high and low brightness in the low- and high-energy regions, respectively, and it shows differences among surface regions having different composition and topography. The combination of the negative grid bias and pixel-by-pixel image subtraction provides band-pass filtered images and extracts the material and topographic information of the specimen surfaces. -- Highlights: ► Secondary electron (SE) images contain many kinds of information on material surfaces. ► We investigate energy-filtered SE images for practical materials. ► The brightness of the images is divided into two parts by the bias voltage. ► Topographic and material contrasts are extracted by subtracting the filtered images.

  20. Spatial frequency domain imaging using a snap-shot filter mosaic camera with multi-wavelength sensitive pixels

    Science.gov (United States)

    Strömberg, Tomas; Saager, Rolf B.; Kennedy, Gordon T.; Fredriksson, Ingemar; Salerud, Göran; Durkin, Anthony J.; Larsson, Marcus

    2018-02-01

    Spatial frequency domain imaging (SFDI) utilizes a digital light processing (DLP) projector for illuminating turbid media with sinusoidal patterns. The tissue absorption (μa) and reduced scattering coefficient (μs') are calculated by analyzing the modulation transfer function for at least two spatial frequencies. We evaluated different illumination strategies with red, green and blue light-emitting diodes (LEDs) in the DLP, while imaging with a filter mosaic camera, XiSpec, with 16 different multi-wavelength-sensitive pixels in the 470-630 nm wavelength range. Data were compared to SFDI by a multispectral camera setup (MSI) consisting of four cameras with bandpass filters centered at 475, 560, 580 and 650 nm. A pointwise system for comprehensive microcirculation analysis (EPOS) was used for comparison. A 5-min arterial occlusion and release protocol on the forearm of a Caucasian male with fair skin was analyzed by fitting the absorption spectra of the chromophores HbO2, Hb and melanin to the estimated μa. The tissue fractions of red blood cells (fRBC) and melanin (fmel) and the Hb oxygenation (SO2) were calculated at baseline, at the end of occlusion, early after release and late after release. EPOS results showed a decrease in SO2 during the occlusion and hyperemia after release (SO2 = 40%, 5%, 80% and 51%). The fRBC showed an increase during the occlusion and release phases. The best MSI resemblance to the EPOS was for green LED illumination (SO2 = 53%, 9%, 82%, 65%). Several illumination and analysis strategies using the XiSpec gave un-physiological results (e.g., negative SO2). XiSpec with green LED illumination gave the expected change in fRBC, while the dynamics in SO2 were smaller than those for the EPOS. These results may be explained by the calculation of modulation using an illumination and detector setup with a broad spectral transmission bandwidth, with considerable variation in μa of the included chromophores. Approaches for either reducing the effective bandwidth of
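
    The chromophore step described above is a linear spectral unmixing: the estimated μa at the measured wavelengths is fitted with a combination of HbO2, Hb and melanin absorption spectra, and SO2 follows from the two haemoglobin coefficients. The sketch below shows only that least-squares fit, with placeholder extinction values (not tabulated spectra) and synthetic data.

```python
# Hedged sketch: least-squares unmixing of an absorption spectrum into HbO2,
# Hb and melanin contributions; extinction values are rough placeholders.
import numpy as np

wavelengths = np.array([475.0, 560.0, 580.0, 650.0])   # nm, as in the MSI setup

# Placeholder specific absorption columns (arbitrary units): HbO2, Hb, melanin.
E = np.array([
    [0.60, 0.75, 1.10],
    [0.90, 1.00, 0.70],
    [1.00, 0.80, 0.60],
    [0.15, 0.70, 0.35],
])

# Synthetic "measured" mu_a built from known fractions, plus noise.
true_c = np.array([0.030, 0.010, 0.020])
rng = np.random.default_rng(6)
mu_a = E @ true_c + 0.001 * rng.standard_normal(len(wavelengths))

c, *_ = np.linalg.lstsq(E, mu_a, rcond=None)            # fitted chromophore amounts
so2 = c[0] / (c[0] + c[1])                              # oxygen saturation from HbO2/Hb
print(f"fitted concentrations {np.round(c, 4)}  SO2 = {so2:.2f}")
```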

  1. Influence of Respiratory Gating, Image Filtering, and Animal Positioning on High-Resolution Electrocardiography-Gated Murine Cardiac Single-Photon Emission Computed Tomography

    Directory of Open Access Journals (Sweden)

    Chao Wu

    2015-01-01

    Cardiac parameters obtained from single-photon emission computed tomographic (SPECT) images can be affected by respiratory motion, image filtering, and animal positioning. We investigated the influence of these factors on ultra-high-resolution murine myocardial perfusion SPECT. Five mice were injected with technetium-99m (99mTc) tetrofosmin, and each was scanned in supine and prone positions in a U-SPECT-II scanner with respiratory and electrocardiographic (ECG) gating. ECG-gated SPECT images were created without applying respiratory motion correction or with two different respiratory motion correction strategies. The images were filtered with a range of three-dimensional gaussian kernels, after which end-diastolic volumes (EDVs), end-systolic volumes (ESVs), and left ventricular ejection fractions were calculated. No significant differences in the measured cardiac parameters were detected when any strategy to reduce or correct for respiratory motion was applied, whereas large differences (> 5%) in EDV and ESV were found with regard to different positioning of the animals. A linear relationship (p < .001) was found between the EDV or ESV and the kernel size of the gaussian filter. In short, respiratory gating did not significantly affect the cardiac parameters of mice obtained with ultra-high-resolution SPECT, whereas the position of the animals and the image filters should be the same in a comparative study with multiple scans to avoid systematic differences in measured cardiac parameters.

  2. Initial results of a new generation dual source CT system using only an in-plane comb filter for ultra-high resolution temporal bone imaging.

    Science.gov (United States)

    Meyer, Mathias; Haubenreisser, Holger; Raupach, Rainer; Schmidt, Bernhard; Lietzmann, Florian; Leidecker, Christianne; Allmendinger, Thomas; Flohr, Thomas; Schad, Lothar R; Schoenberg, Stefan O; Henzler, Thomas

    2015-01-01

    To prospectively evaluate radiation dose and image quality of a third generation dual-source CT (DSCT) without a z-axis filter behind the patient for temporal bone CT. Forty-five patients were examined on either a first, second, or third generation DSCT in an ultra-high-resolution (UHR) temporal bone imaging mode. On the third generation DSCT system, the tighter focal spot of 0.2 mm² removes the necessity for an additional z-axis filter, leading to improved z-axis radiation dose efficiency. Images of 0.4 mm were reconstructed using standard filtered back-projection or an iterative reconstruction (IR) technique for the previous generations of DSCT and a novel IR algorithm for the third generation DSCT. Radiation dose and image quality were compared between the three DSCT systems. The statistically significantly highest subjective and objective image quality was found for the third generation DSCT when compared to the first or second generation DSCT systems (all p < 0.05). The total effective dose was 63%/39% lower for the third generation examination as compared to the first and second generation DSCT. Temporal bone imaging without a z-axis UHR filter and with a novel third generation IR algorithm allows for significantly higher image quality while lowering the effective dose when compared to the first two generations of DSCTs. • Omitting the z-axis filter allows a reduction in radiation dose of 50% • A smaller focal spot of 0.2 mm² significantly improves spatial resolution • Ultra-high-resolution temporal bone CT helps to gain diagnostic information on the middle/inner ear.

  3. Fully automated muscle quality assessment by Gabor filtering of second harmonic generation images

    Science.gov (United States)

    Paesen, Rik; Smolders, Sophie; Vega, José Manolo de Hoyos; Eijnde, Bert O.; Hansen, Dominique; Ameloot, Marcel

    2016-02-01

    Although structural changes on the sarcomere level of skeletal muscle are known to occur due to various pathologies, rigorous studies of the reduced sarcomere quality remain scarce. This can possibly be explained by the lack of an objective tool for analyzing and comparing sarcomere images across biological conditions. Recent developments in second harmonic generation (SHG) microscopy and increasing insight into the interpretation of sarcomere SHG intensity profiles have made SHG microscopy a valuable tool to study microstructural properties of sarcomeres. Typically, sarcomere integrity is analyzed by fitting a set of manually selected, one-dimensional SHG intensity profiles with a supramolecular SHG model. To circumvent this tedious manual selection step, we developed a fully automated image analysis procedure to map the sarcomere disorder for the entire image at once. The algorithm relies on a single-frequency wavelet-based Gabor approach and includes a newly developed normalization procedure allowing for unambiguous data interpretation. The method was validated by showing the correlation between the sarcomere disorder, quantified by the M-band size obtained from manually selected profiles, and the normalized Gabor value ranging from 0 to 1 for decreasing disorder. Finally, to elucidate the applicability of our newly developed protocol, Gabor analysis was used to study the effect of experimental autoimmune encephalomyelitis on the sarcomere regularity. We believe that the technique developed in this work holds great promise for high-throughput, unbiased, and automated image analysis to study sarcomere integrity by SHG microscopy.
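
    The core measurement above is the response of a single-frequency Gabor filter tuned to the sarcomere spacing. The sketch below builds such a kernel and applies it to a synthetic striated pattern with an assumed period; the paper's wavelet formulation and its normalisation to a 0-1 disorder score are not reproduced.

```python
# Hedged sketch: single-frequency Gabor filtering of a synthetic striated
# (sarcomere-like) pattern with an assumed striation period.
import numpy as np
from scipy.ndimage import convolve

def gabor_kernel(freq, sigma, size=21, theta=0.0):
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * freq * xr)       # real (even) Gabor kernel

if __name__ == "__main__":
    period = 8.0                                           # assumed striation period (pixels)
    y, x = np.mgrid[0:128, 0:128]
    striations = 0.5 + 0.5 * np.cos(2 * np.pi * x / period)  # regular sarcomere pattern
    kernel = gabor_kernel(freq=1.0 / period, sigma=4.0)
    response = convolve(striations, kernel, mode="reflect")
    print(float(np.abs(response).max()))                   # strong response at the tuned frequency
```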

  4. Human tracking in thermal images using adaptive particle filters with online random forest learning

    Science.gov (United States)

    Ko, Byoung Chul; Kwak, Joon-Young; Nam, Jae-Yeal

    2013-11-01

    This paper presents a fast and robust human tracking method for use with a moving long-wave infrared thermal camera under poor illumination, in the presence of shadows and cluttered backgrounds. To improve the human tracking performance while minimizing the computation time, this study proposes online learning of classifiers based on particle filters and a combination of a local intensity distribution (LID) with oriented center-symmetric local binary patterns (OCS-LBP). Specifically, we design a real-time random forest (RF), an ensemble of decision trees for confidence estimation, and the confidences of the RF are converted into a likelihood function of the target state. First, the target model is selected by the user and particles are sampled. Then, RFs are generated by online learning using positive and negative examples with LID and OCS-LBP features. In the next stage, the learned RF classifiers are used to detect the most likely target position in the subsequent frame. Then, the RFs are learned again by means of fast retraining with the tracked object and background appearance in the new frame. The proposed algorithm is successfully applied to various thermal videos as tests, and its tracking performance is better than that of other methods.

  5. A Low-Noise CMOS THz Imager Based on Source Modulation and an In-Pixel High-Q Passive Switched-Capacitor N-Path Filter

    Science.gov (United States)

    Boukhayma, Assim; Dupret, Antoine; Rostaing, Jean-Pierre; Enz, Christian

    2016-01-01

    This paper presents the first low-noise complementary metal oxide semiconductor (CMOS) terahertz (THz) imager based on source modulation and in-pixel high-Q filtering. The 31×31 focal plane array has been fully integrated in a 0.13 μm standard CMOS process. The sensitivity has been improved significantly by modulating the active THz source that illuminates the scene and performing on-chip high-Q filtering. Each pixel encompasses a broadband bow-tie antenna coupled to an N-type metal-oxide-semiconductor (NMOS) detector that shifts the THz radiation, a low-noise adjustable-gain amplifier and a high-Q filter centered at the modulation frequency. The filter is based on a passive switched-capacitor (SC) N-path filter combined with a continuous-time broadband Gm-C filter. A simplified analysis that helps in designing and tuning the passive SC N-path filter is provided. The characterization of the readout chain shows that a Q factor of 100 has been achieved for the filter, with good matching between the analytical calculation and the measurement results. An input-referred noise of 0.2 μV RMS has been measured. Characterization of the chip at different THz wavelengths confirms the broadband feature of the antenna and shows that this THz imager reaches a total noise equivalent power of 0.6 nW at 270 GHz and 0.8 nW at 600 GHz. PMID:26950131

  6. A diagnostic imaging approach for online characterization of multi-impact in aircraft composite structures based on a scanning spatial-wavenumber filter of guided wave

    Science.gov (United States)

    Ren, Yuanqiang; Qiu, Lei; Yuan, Shenfang; Su, Zhongqing

    2017-06-01

    Monitoring of impact, and multi-impact in particular, in aircraft composite structures has been an intensive research topic in the field of guided-wave-based structural health monitoring (SHM). Compared with the majority of existing methods, such as those using signal features in the time, frequency or joint time-frequency domain, the approach based on a spatial-wavenumber filter of guided waves shows a clear advantage in effectively distinguishing particular wave modes and identifying their propagation direction relative to the sensor array. However, two major issues arise when conducting online characterization of a multi-impact event. Firstly, the spatial-wavenumber filter has to be realized in a situation where the high-spatial-resolution wavenumber of the complicated multi-impact signal cannot be measured or modeled. Secondly, it is difficult to identify the multiple impacts and localize them because of the overlapping of wavenumbers. To address these issues, a scanning spatial-wavenumber filter based diagnostic imaging method for online characterization of multi-impact events is proposed in this paper to perform multi-impact imaging and localization. The principle of the scanning filter for multi-impact is developed first, to conduct spatial-wavenumber filtering and to achieve wavenumber-time imaging of the multiple impacts. Then, a feature identification method for multi-impact based on eigenvalue decomposition and wavenumber searching is presented to estimate the number of impacts and calculate the wavenumber of the multi-impact signal, and an image mapping method is proposed to convert the wavenumber-time image to an angle-distance image in order to distinguish and locate the multiple impacts. A series of multi-impact events applied to a carbon fiber laminate plate are used to validate the proposed methods. The validation results show that localization of the multiple impacts is achieved.

  7. 3D soil water nowcasting using electromagnetic conductivity imaging and the ensemble Kalman filter

    Science.gov (United States)

    Huang, Jingyi; McBratney, Alex B.; Minasny, Budiman; Triantafilis, John

    2017-06-01

    Mapping and immediate forecasting of soil water content (θ) and its movement can be challenging. Although inversion of apparent electrical conductivity (ECa) measured by electromagnetic induction to calculate depth-specific electrical conductivity (σ) has been used, it is difficult to apply it across a field. In this paper we use a calibration established along a transect, across a 3.94-ha field with varying soil texture, using an ensemble Kalman filter (EnKF) to monitor and nowcast the 3-dimensional θ dynamics on 16 separate days over a period of 38 days. The EnKF combined a physical model fitted with θ measured by soil moisture sensors and an Artificial Neural Network model comprising σ generated by quasi-3d inversions of DUALEM-421S ECa data. Results showed that the distribution of θ was controlled by soil texture, topography, and vegetation. Soil water dried fastest at the beginning after the initial irrigation event and decreased with time and soil depth, which was consistent with classical soil drying theory and experiments. It was also found that the soil dried fastest in the loamy and duplex soils present in the field, which was attributable to deep drainage and preferential flow. It was concluded that the EnKF approach can be used to improve the irrigation efficiency by applying variable irrigation rates across the field. In addition, soil water status can be nowcasted across large spatial extents using this method with weather forecast information, which will provide guidance to farmers for real-time irrigation management.
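
    The data assimilation step described above follows the usual ensemble Kalman filter pattern: propagate an ensemble with the forecast model, then nudge it toward the observation with a gain computed from the ensemble spread. The sketch below shows that update for a single scalar θ state with an assumed exponential drying model and assumed noise levels; the observation simply stands in for the ANN estimate derived from the conductivity inversion.

```python
# Hedged sketch: a textbook EnKF update for a scalar soil water content state.
# Model, noise parameters and observation source are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(5)
n_ens, drying_rate, obs_std = 50, 0.05, 0.02

ensemble = rng.normal(0.30, 0.03, n_ens)        # initial theta ensemble
truth = 0.32

for step in range(10):
    # Forecast: exponential drying plus model noise.
    truth *= np.exp(-drying_rate)
    ensemble = ensemble * np.exp(-drying_rate) + rng.normal(0, 0.005, n_ens)

    # Synthetic observation of theta (stand-in for the ANN/EM estimate).
    obs = truth + rng.normal(0, obs_std)

    # EnKF analysis step (observation operator H = identity for a scalar state).
    P = np.var(ensemble, ddof=1)
    K = P / (P + obs_std**2)                    # Kalman gain from ensemble spread
    perturbed_obs = obs + rng.normal(0, obs_std, n_ens)
    ensemble = ensemble + K * (perturbed_obs - ensemble)

print(f"truth={truth:.3f}  analysis mean={ensemble.mean():.3f}")
```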

  8. Observational demonstration of a high image rejection SIS mixer receiver using a new waveguide filter at 230 GHz

    Science.gov (United States)

    Hasegawa, Yutaka; Asayama, Shinichiro; Harada, Ryohei; Tokuda, Kazuki; Kimura, Kimihiro; Ogawa, Hideo; Onishi, Toshikazu

    2017-12-01

    A new sideband separation method was developed for use in millimeter-/submillimeter-band radio receivers using a novel waveguide frequency separation filter (FSF), which consists of two branch-line hybrid couplers and two waveguide high-pass filters. The FSF was designed to allow the radio frequency (RF) signal to pass through to an output port when the frequency is higher than a certain value (225 GHz), and to reflect the RF signal back to another output port when the frequency is lower. The FSF is connected to two double-sideband superconductor-insulator-superconductor (SIS) mixers, and the image rejection ratio (IRR) is determined by the FSF characteristics. With this new sideband separation method, we can achieve a good and stable IRR without balancing two SIS mixers, as is necessary for conventional sideband-separating SIS mixers. To demonstrate the applicability of this method, we designed and developed an FSF for simultaneous observations of the J = 2-1 rotational transition lines of three CO isotopes (12CO, 13CO, and C18O): the 12CO line is in the upper sideband and the others are in the lower sideband, with an intermediate-frequency range of 4-8 GHz at the radio frequency of 220/230 GHz. This FSF was then installed in the receiver system of the 1.85 m radio telescope of Osaka Prefecture University and was used during the 2014 observation season. The observation results indicate that the IRR of the proposed receiver is 25 dB or higher for the 12CO line, and no significant fluctuation larger than 1 dB in the IRR was observed throughout the season. These results demonstrate the practical utility of the FSF receiver for observations such as extensive molecular cloud surveys in specified lines with a fixed frequency setting.

  9. AFM tip characterization by using FFT filtered images of step structures

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Yongda, E-mail: yanyongda@hit.edu.cn [Key Laboratory of Micro-systems and Micro-structures Manufacturing of Ministry of Education, Harbin Institute of Technology, Harbin, Heilongjiang 150001 (China); Center For Precision Engineering, Harbin Institute of Technology, Harbin, Heilongjiang 150001 (China); Xue, Bo [Key Laboratory of Micro-systems and Micro-structures Manufacturing of Ministry of Education, Harbin Institute of Technology, Harbin, Heilongjiang 150001 (China); Center For Precision Engineering, Harbin Institute of Technology, Harbin, Heilongjiang 150001 (China); Hu, Zhenjiang; Zhao, Xuesen [Center For Precision Engineering, Harbin Institute of Technology, Harbin, Heilongjiang 150001 (China)

    2016-01-15

    The measurement resolution of an atomic force microscope (AFM) is largely dependent on the radius of the tip. Meanwhile, when using an AFM to study nanoscale surface properties, the value of the tip radius is needed in calculations. As such, estimation of the tip radius is important for analyzing results obtained with an AFM. In this study, a geometrical model created by scanning a step structure with an AFM tip was developed. The tip was assumed to have a hemispherically capped cone shape. The spectra of profiles simulated by scanning with tips of different radii were calculated by the fast Fourier transform (FFT). By analyzing the influence of tip radius variation on the spectra of the simulated profiles, it was found that the low-frequency harmonics were more susceptible to this variation, and that the relationship between the tip radius and the low-frequency harmonic amplitude of the step structure varied monotonically. Based on this regularity, we developed a new method to characterize the radius of the hemispherical tip. The tip radii estimated with this approach were comparable to the results obtained using scanning electron microscope imaging and blind reconstruction methods. - Highlights: • AFM tips with different radii were simulated scanning a nano-step structure. • The spectra of the simulated scans at different radii were analyzed. • The functions of tip radius and harmonic amplitude were used for evaluating the tip. • The proposed method has been validated by SEM imaging and blind reconstruction.

  10. Why do commercial CT scanners still employ traditional, filtered back-projection for image reconstruction?

    International Nuclear Information System (INIS)

    Pan, Xiaochuan; Sidky, Emil Y; Vannier, Michael

    2009-01-01

    Despite major advances in x-ray sources, detector arrays, gantry mechanical design and especially computer performance, one component of computed tomography (CT) scanners has remained virtually constant for the past 25 years: the reconstruction algorithm. Fundamental advances have been made in the solution of inverse problems, especially tomographic reconstruction, but these works have not been translated into clinical and related practice. The reasons are not obvious and seldom discussed. This review seeks to examine the reasons for this discrepancy and provides recommendations on how it can be resolved. We take the example of the field of compressive sensing (CS), summarizing this new area of research from the perspective of practical medical physicists and explaining the disconnection between theoretical and application-oriented research. Using a few issues specific to CT, which engineers have addressed in very specific ways, we try to distill the mathematical problem underlying each of these issues, with the hope of demonstrating that there are interesting mathematical problems of general importance that can result from in-depth analysis of specific issues. We then sketch some unconventional CT-imaging designs that have the potential to impact CT applications if the link between applied mathematicians and engineers/physicists were stronger. Finally, we close with some observations on how the link could be strengthened. There is, we believe, an important opportunity to rapidly improve the performance of CT and related tomographic imaging techniques by addressing these issues. (topical review)

  11. Non-Euclidean phasor analysis for quantification of oxidative stress in ex vivo human skin exposed to sun filters using fluorescence lifetime imaging microscopy

    Science.gov (United States)

    Osseiran, Sam; Roider, Elisabeth M.; Wang, Hequn; Suita, Yusuke; Murphy, Michael; Fisher, David E.; Evans, Conor L.

    2017-12-01

    Chemical sun filters are commonly used as active ingredients in sunscreens due to their efficient absorption of ultraviolet (UV) radiation. Yet, it is known that these compounds can photochemically react with UV light and generate reactive oxygen species and oxidative stress in vitro, though this has yet to be validated in vivo. One label-free approach to probe oxidative stress is to measure and compare the relative endogenous fluorescence generated by cellular coenzymes nicotinamide adenine dinucleotides and flavin adenine dinucleotides. However, chemical sun filters are fluorescent, with emissive properties that contaminate endogenous fluorescent signals. To accurately distinguish the source of fluorescence in ex vivo skin samples treated with chemical sun filters, fluorescence lifetime imaging microscopy data were processed on a pixel-by-pixel basis using a non-Euclidean separation algorithm based on Mahalanobis distance and validated on simulated data. Applying this method, ex vivo samples exhibited a small oxidative shift when exposed to sun filters alone, though this shift was much smaller than that imparted by UV irradiation. Given the need for investigative tools to further study the clinical impact of chemical sun filters in patients, the reported methodology may be applied to visualize chemical sun filters and measure oxidative stress in patients' skin.

  12. TH-CD-202-04: Evaluation of Virtual Non-Contrast Images From a Novel Split-Filter Dual-Energy CT Technique

    International Nuclear Information System (INIS)

    Huang, J; Szczykutowicz, T; Bayouth, J; Miller, J

    2016-01-01

    Purpose: To compare the ability of two dual-energy CT techniques, a novel split-filter single-source technique with superior temporal resolution and an established sequential-scan technique, to remove iodine contrast from images with minimal impact on CT number accuracy. Methods: A phantom containing 8 tissue substitute materials and vials of varying iodine concentrations (1.7–20.1 mg I/mL) was imaged using a Siemens Edge CT scanner. Dual-energy virtual non-contrast (VNC) images were generated using the novel split-filter technique, in which a 120kVp spectrum is filtered by tin and gold to create high- and low-energy spectra with < 1 second temporal separation between the acquisition of low- and high-energy data. Additionally, VNC images were generated with the sequential-scan technique (80 and 140kVp) for comparison. CT number accuracy was evaluated for all materials at 15, 25, and 35mGy CTDIvol. Results: The spectral separation was greater for the sequential-scan technique than for the split-filter technique, with dual-energy ratios of 2.18 and 1.26, respectively. Both techniques successfully removed iodine contrast, resulting in mean CT numbers within 60HU of 0HU (split-filter) and 40HU of 0HU (sequential-scan) for all iodine concentrations. Additionally, for iodine vials of varying diameter (2–20 mm) with the same concentration (9.9 mg I/mL), the system accurately detected iodine for all sizes investigated. Both dual-energy techniques resulted in reduced CT numbers for bone materials (by >400HU for the densest bone). Increasing the imaging dose did not improve the CT number accuracy for bone in VNC images. Conclusion: VNC images from the split-filter technique successfully removed iodine contrast. These results demonstrate a potential for improving dose calculation accuracy and reducing patient imaging dose, while achieving superior temporal resolution in comparison to sequential scans. For both techniques, inaccuracies in CT numbers for bone materials were observed.

  13. TH-CD-202-04: Evaluation of Virtual Non-Contrast Images From a Novel Split-Filter Dual-Energy CT Technique

    Energy Technology Data Exchange (ETDEWEB)

    Huang, J; Szczykutowicz, T; Bayouth, J; Miller, J [University of Wisconsin, Madison, WI (United States)

    2016-06-15

    Purpose: To compare the ability of two dual-energy CT techniques, a novel split-filter single-source technique with superior temporal resolution and an established sequential-scan technique, to remove iodine contrast from images with minimal impact on CT number accuracy. Methods: A phantom containing 8 tissue substitute materials and vials of varying iodine concentrations (1.7–20.1 mg I/mL) was imaged using a Siemens Edge CT scanner. Dual-energy virtual non-contrast (VNC) images were generated using the novel split-filter technique, in which a 120kVp spectrum is filtered by tin and gold to create high- and low-energy spectra with < 1 second temporal separation between the acquisition of low- and high-energy data. Additionally, VNC images were generated with the sequential-scan technique (80 and 140kVp) for comparison. CT number accuracy was evaluated for all materials at 15, 25, and 35mGy CTDIvol. Results: The spectral separation was greater for the sequential-scan technique than for the split-filter technique, with dual-energy ratios of 2.18 and 1.26, respectively. Both techniques successfully removed iodine contrast, resulting in mean CT numbers within 60HU of 0HU (split-filter) and 40HU of 0HU (sequential-scan) for all iodine concentrations. Additionally, for iodine vials of varying diameter (2–20 mm) with the same concentration (9.9 mg I/mL), the system accurately detected iodine for all sizes investigated. Both dual-energy techniques resulted in reduced CT numbers for bone materials (by >400HU for the densest bone). Increasing the imaging dose did not improve the CT number accuracy for bone in VNC images. Conclusion: VNC images from the split-filter technique successfully removed iodine contrast. These results demonstrate a potential for improving dose calculation accuracy and reducing patient imaging dose, while achieving superior temporal resolution in comparison to sequential scans. For both techniques, inaccuracies in CT numbers for bone materials were observed.

  14. Reduction of radiation exposure while maintaining high-quality fluoroscopic images during interventional cardiology using novel x-ray tube technology with extra beam filtering.

    Science.gov (United States)

    den Boer, A; de Feyter, P J; Hummel, W A; Keane, D; Roelandt, J R

    1994-06-01

    Radiographic technology plays an integral role in interventional cardiology. The number of interventions continues to increase, and the associated radiation exposure to patients and personnel is of major concern. This study was undertaken to determine whether a newly developed x-ray tube deploying grid-switched pulsed fluoroscopy and extra beam filtering can achieve a reduction in radiation exposure while maintaining fluoroscopic images of high quality. Three fluoroscopic techniques were compared: continuous fluoroscopy, pulsed fluoroscopy, and a newly developed high-output pulsed fluoroscopy with extra filtering. To ascertain differences in the quality of images and to determine differences in patient entrance and investigator radiation exposure, the radiated volume curve was measured to determine the required high voltage levels (kVpeak) for different object sizes for each fluoroscopic mode. The fluoroscopic data of 124 patient procedures were combined. The data were analyzed for radiographic projections, image intensifier field size, and x-ray tube kilovoltage levels (kVpeak). On the basis of this analysis, a reference procedure was constructed. The reference procedure was tested on a phantom (dummy patient) with all three fluoroscopic modes. The phantom was designed so that the kilovoltage requirements for each projection were comparable to those needed for the average patient. Radiation exposure of the operator and patient was measured during each mode. The patient entrance dose was measured in air, and the operator dose was measured by 18 dosimeters on a dummy operator. Pulsed fluoroscopy, compared with continuous fluoroscopy, could be performed with improved image quality at lower kilovoltages; the patient entrance dose was reduced by 21% and the operator dose by 54%. High-output pulsed fluoroscopy with extra beam filtering, compared with continuous fluoroscopy, improved the image quality, lowered the kilovoltage requirements, and reduced the patient entrance dose by 55% and

  15. Comparison of 3D Maximum A Posteriori and Filtered Backprojection algorithms for high resolution animal imaging in microPET

    International Nuclear Information System (INIS)

    Chatziioannou, A.; Qi, J.; Moore, A.; Annala, A.; Nguyen, K.; Leahy, R.M.; Cherry, S.R.

    2000-01-01

    We have evaluated the performance of two three-dimensional reconstruction algorithms with data acquired from microPET, a high resolution tomograph dedicated to small animal imaging. The first was a linear filtered-backprojection algorithm (FBP) with reprojection of the missing data, and the second was a statistical maximum a posteriori probability algorithm (MAP). The two algorithms were evaluated in terms of their resolution performance, both in phantoms and in vivo. Sixty independent realizations of a phantom simulating the brain of a baby monkey were acquired, each containing 3 million counts. Each of these realizations was reconstructed independently with both algorithms. The ensemble of the sixty reconstructed realizations was used to estimate the standard deviation as a measure of the noise for each reconstruction algorithm. More detail was recovered in the MAP reconstruction without an increase in noise relative to FBP. Studies in a simple cylindrical compartment phantom demonstrated improved recovery of known activity ratios with MAP. Finally, in vivo studies also demonstrated a clear improvement in spatial resolution using the MAP algorithm. The quantitative accuracy of the MAP reconstruction was also evaluated by comparison with autoradiography and direct well counting of tissue samples and was shown to be superior.

  16. Adaptive statistical iterative reconstruction versus filtered back projection in the same patient: 64 channel liver CT image quality and patient radiation dose

    International Nuclear Information System (INIS)

    Mitsumori, Lee M.; Shuman, William P.; Busey, Janet M.; Kolokythas, Orpheus; Koprowicz, Kent M.

    2012-01-01

    To compare routine dose liver CT reconstructed with filtered back projection (FBP) versus low dose images reconstructed with FBP and adaptive statistical iterative reconstruction (ASIR). In this retrospective study, patients had a routine dose protocol reconstructed with FBP and, again within 17 months (median 6.1 months), had a low dose protocol reconstructed twice, with FBP and ASIR. These reconstructions were compared for noise, image quality, and radiation dose. Nineteen patients were included (12 male, mean age 58). Noise was significantly lower in low dose images reconstructed with ASIR compared to routine dose images reconstructed with FBP (liver: p < 0.05, aorta: p < 0.001). Low dose FBP images were scored significantly lower for subjective image quality than low dose ASIR (2.1 ± 0.5, 3.2 ± 0.8, p < 0.001). There was no difference in subjective image quality scores between routine dose FBP images and low dose ASIR images (3.6 ± 0.5, 3.2 ± 0.8, NS). Radiation dose was 41% less for the low dose protocol (4.4 ± 2.4 mSv versus 7.5 ± 5.5 mSv, p < 0.05). Our initial results suggest low dose CT images reconstructed with ASIR may have lower measured noise, similar image quality, yet significantly less radiation dose compared with higher dose images reconstructed with FBP. (orig.)

  17. Adaptive statistical iterative reconstruction versus filtered back projection in the same patient: 64 channel liver CT image quality and patient radiation dose

    Energy Technology Data Exchange (ETDEWEB)

    Mitsumori, Lee M.; Shuman, William P.; Busey, Janet M.; Kolokythas, Orpheus; Koprowicz, Kent M. [University of Washington School of Medicine, Department of Radiology, Seattle, WA (United States)

    2012-01-15

    To compare routine dose liver CT reconstructed with filtered back projection (FBP) versus low dose images reconstructed with FBP and adaptive statistical iterative reconstruction (ASIR). In this retrospective study, patients had a routine dose protocol reconstructed with FBP and, again within 17 months (median 6.1 months), had a low dose protocol reconstructed twice, with FBP and ASIR. These reconstructions were compared for noise, image quality, and radiation dose. Nineteen patients were included (12 male, mean age 58). Noise was significantly lower in low dose images reconstructed with ASIR compared to routine dose images reconstructed with FBP (liver: p < 0.05, aorta: p < 0.001). Low dose FBP images were scored significantly lower for subjective image quality than low dose ASIR (2.1 ± 0.5, 3.2 ± 0.8, p < 0.001). There was no difference in subjective image quality scores between routine dose FBP images and low dose ASIR images (3.6 ± 0.5, 3.2 ± 0.8, NS). Radiation dose was 41% less for the low dose protocol (4.4 ± 2.4 mSv versus 7.5 ± 5.5 mSv, p < 0.05). Our initial results suggest low dose CT images reconstructed with ASIR may have lower measured noise, similar image quality, yet significantly less radiation dose compared with higher dose images reconstructed with FBP. (orig.)

  18. Characterization of the Edges and Contrasts in a digital image with the variation of the Parameters of the High-pass Filters used in the Estimation of Atmospheric Visibility

    Directory of Open Access Journals (Sweden)

    Martha C. Guzmán-Zapata

    2013-11-01

    Full Text Available This paper considers the edges and contrasts obtained with the high-pass filters used to estimate daytime atmospheric visibility from digital images, and characterizes the behavior of these edges and contrasts as the parameters of high-pass filters such as the Ideal, Gaussian, and Homomorphic-Gaussian filters are varied. A synthetic image composed of regions with different contrasts is used to apply the different filters; an index is then defined to measure the quality of the edges obtained in the filtered image and is used to analyze the results. The results show that both the filter selection and the selection of its parameters affect the characteristics and quality of the edges detected in the filtered image, determine the amount of noise that the filter adds to the image (artifacts that were not present in the original image), and establish whether edge detection is achieved at all. The results also show that the edge quality index reaches maximum values at certain combinations of the filter parameters, which means that some combinations of parameters reduce the situations that distort the edges and distort atmospheric visibility measures based on the Fourier transform. The parameter combinations that provide maximum-quality edges are therefore established as suitable for use in visibility measurement.
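
    As a concrete illustration of one of the filters discussed (the frequency-domain Gaussian high-pass), a minimal NumPy sketch follows; the function name and the normalized-frequency cutoff parameterization are illustrative assumptions and do not reproduce the paper's edge quality index.

        import numpy as np

        def gaussian_highpass(image, cutoff):
            # image  : 2D grayscale array
            # cutoff : width (in cycles/pixel) of the suppressed low-frequency region
            rows, cols = image.shape
            u = np.fft.fftfreq(rows)[:, None]
            v = np.fft.fftfreq(cols)[None, :]
            d2 = u**2 + v**2
            H = 1.0 - np.exp(-d2 / (2.0 * cutoff**2))   # Gaussian high-pass transfer function
            filtered = np.fft.ifft2(np.fft.fft2(image) * H)
            return np.real(filtered)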

  19. Periodic additive noises reduction in 3D images used in building of voxel phantoms through an efficient implementation of the 3D FFT: zipper artifacts filtering

    International Nuclear Information System (INIS)

    Oliveira, Alex C.H. de; Lima, Fernando R.A.; Vieira, Jose W.; Leal Neto, Viriato

    2009-01-01

    The anthropomorphic models used in computational dosimetry are predominantly built from stacks of CT (computed tomography) or MRI (magnetic resonance imaging) images obtained from patients or volunteers. Building these stacks (usually called voxel phantoms or tomographic phantoms) requires computer processing before they can be used in an exposure computational model. Noise present in these stacks can be confused with significant structures. In a 3D image with periodic additive noise, the noise in the frequency domain is fully added to the central slice. The discrete Fourier transform is the fundamental mathematical tool that allows switching from the spatial domain to the frequency domain and vice versa, and the FFT (fast Fourier transform) algorithm is the ideal computational tool for performing this switch efficiently. This paper presents a new implementation, in managed C++ (Microsoft Visual Studio .NET), of the fast Fourier transform of 3D digital images (FFT3D) based essentially on trigonometric recombination. The reduction of periodic additive noise then consists of filtering only the central slice of the 3D image in the frequency domain and transforming it back into the spatial domain through the inverse FFT3D. An example application of this method is the filtering of zipper artifacts in MRI images. These processes were implemented in the software DIP (Digital Image Processing). (author)
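
    A minimal NumPy sketch of the central-slice filtering idea is given below, assuming the periodic noise concentrates in the zero-frequency plane along the slice axis and that a user-supplied 2D notch mask marks the noise peaks to be suppressed; function and argument names are illustrative.

        import numpy as np

        def filter_central_slice(volume, notch_mask):
            # volume     : 3D image stack (slices, rows, cols)
            # notch_mask : 2D array of in-plane shape (rows, cols); 1 keeps a frequency,
            #              values < 1 attenuate the periodic-noise peaks
            spectrum = np.fft.fftn(volume)
            spectrum = np.fft.fftshift(spectrum, axes=0)   # central index = zero frequency along slices
            centre = volume.shape[0] // 2
            spectrum[centre] = spectrum[centre] * notch_mask   # filter only the central slice
            spectrum = np.fft.ifftshift(spectrum, axes=0)
            return np.real(np.fft.ifftn(spectrum))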

  20. Primer uticaja filtriranja slike u sistemima za praćenje ciljeva primenom termovizije / An example of image filtering in target tracking systems with thermal imagery

    Directory of Open Access Journals (Sweden)

    Zvonko M. Radosavljević

    2003-07-01

    Full Text Available A case of image filtering in systems for detecting and tracking airborne targets with thermal imaging is described in this paper. Two image filtering methods are given: the first uses low-pass convolution filtering, and the second uses an averaging filter based on the mean gray-level value. These filters are applied in tracking systems based on infrared (IR) sensors. The filtering threshold level is selected using statistical features of the image. A very important step in the tracking process is the determination of the 'tracking window', which can be fixed or adaptive in size. A false estimate of a target being present in the window may be made in the presence of background noise, preamplifier and detector noise, etc. Filtering is a necessary step in these systems and a significant factor in increasing tracking speed and accuracy.

  1. Testing the Feasibility of Using PERM to Apply Scattering-Angle Filtering in the Image-Domain for FWI Applications

    KAUST Repository

    Alzahrani, Hani Ataiq

    2014-09-01

    Full Waveform Inversion (FWI) is a non-linear optimization problem aimed at estimating subsurface parameters by minimizing the misfit between modeled and recorded seismic data using gradient descent methods, which are the only practical choice because of the size of the problem. Due to the high non-linearity of the problem, gradient methods will converge to a local minimum if the starting model is not close to the true one. The accuracy of the long-wavelength components of the initial model controls the level of non-linearity of the inversion. In order for FWI to converge to the global minimum, we have to obtain the long-wavelength components of the model before inverting for the short wavelengths. Ultra-low temporal frequencies are sensitive to the smooth (long-wavelength) part of the model, and can be utilized by waveform inversion to resolve that part. Unfortunately, frequencies in this range are normally missing in field data due to data-acquisition limitations. The lack of low frequencies can be compensated for by utilizing wide-aperture data, as they include arrivals that are especially sensitive to the long-wavelength components of the model. The higher the scattering angle of a recorded event, the higher the model wavelength it can resolve. Based on this property, a scattering-angle filtering algorithm is proposed to start the inversion process with events corresponding to the highest scattering angle available in the data, and then include lower scattering angles progressively. The large scattering angles will resolve the smooth part of the model and reduce the non-linearity of the problem, then the lower ones will enhance the resolution of the model. Recorded data is first migrated using Pre-stack Exploding Reflector Migration (PERM), then the resulting pre-stack image is transformed into angle gathers to which

  2. Ross filter pairs for metal artefact reduction in x-ray tomography: a case study based on imaging and segmentation of metallic implants

    Science.gov (United States)

    Arhatari, Benedicta D.; Abbey, Brian

    2018-01-01

    Ross filter pairs have recently been demonstrated as a highly effective means of producing quasi-monoenergetic beams from polychromatic X-ray sources. They have found applications in both X-ray spectroscopy and for elemental separation in X-ray computed tomography (XCT). Here we explore whether they could be applied to the problem of metal artefact reduction (MAR) for applications in medical imaging. Metal artefacts are a common problem in X-ray imaging of metal implants embedded in bone and soft tissue. A number of data post-processing approaches to MAR have been proposed in the literature, however these can be time-consuming and sometimes have limited efficacy. Here we describe and demonstrate an alternative approach based on beam conditioning using Ross filter pairs. This approach obviates the need for any complex post-processing of the data and enables MAR and segmentation from the surrounding tissue by exploiting the absorption edge contrast of the implant.

  3. Filter arrays

    Science.gov (United States)

    Page, Ralph H.; Doty, Patrick F.

    2017-08-01

    The various technologies presented herein relate to a tiled filter array that can be used in connection with performance of spatial sampling of optical signals. The filter array comprises filter tiles, wherein a first plurality of filter tiles are formed from a first material, the first material being configured such that only photons having wavelengths in a first wavelength band pass therethrough. A second plurality of filter tiles is formed from a second material, the second material being configured such that only photons having wavelengths in a second wavelength band pass therethrough. The first plurality of filter tiles and the second plurality of filter tiles can be interspersed to form the filter array comprising an alternating arrangement of first filter tiles and second filter tiles.

  4. Mid infra-red hyper-spectral imaging with bright super continuum source and fast acousto-optic tuneable filter for cytological applications

    International Nuclear Information System (INIS)

    Farries, Mark; Ward, Jon; Valle, Stefano; Stephens, Gary; Moselund, Peter; Van der Zanden, Koen; Napier, Bruce

    2015-01-01

    Mid-IR imaging spectroscopy has the potential to offer an effective tool for early cancer diagnosis. The current development of bright super-continuum sources, narrow-band acousto-optic tunable filters and fast cameras has made feasible a system that can be used for fast diagnosis of cancer in vivo at the point of care. The performance of a prototype system that has been developed under the Minerva project is described. (paper)

  5. Energy-filtered real- and k-space secondary and energy-loss electron imaging with Dual Emission Electron spectro-Microscope: Cs/Mo(110)

    Energy Technology Data Exchange (ETDEWEB)

    Grzelakowski, Krzysztof P., E-mail: k.grzelakowski@opticon-nanotechnology.com

    2016-05-15

    Since its introduction the importance of complementary k∥-space (LEED) and real space (LEEM) information in the investigation of surface science phenomena has been widely demonstrated over the last five decades. In this paper we report the application of a novel kind of electron spectromicroscope, the Dual Emission Electron spectroMicroscope (DEEM), with two independent electron optical channels for reciprocal and real space quasi-simultaneous imaging, in an investigation of a Cs-covered Mo(110) single crystal using the 800 eV electron beam from an “in-lens” electron gun system developed for sample illumination. With the DEEM spectromicroscope it is possible to observe dynamic, irreversible processes at surfaces in the energy-filtered real space and in the corresponding energy-filtered k∥-space quasi-simultaneously in two independent imaging columns. The novel concept of high-energy electron beam sample illumination in cathode-lens based microscopes allows chemically selective imaging and analysis under laboratory conditions. - Highlights: • A novel concept of electron sample illumination with an “in-lens” e-gun is realized. • Quasi-simultaneous energy-selective observation of the real- and k-space in EELS mode. • Observation of the energy-filtered Auger electron diffraction at Cs atoms on Mo(110). • Energy-loss, Auger and secondary electron momentum microscopy is realized.

  6. Unsharp masking technique as a preprocessing filter for improvement of 3D-CT image of bony structure in the maxillofacial region

    International Nuclear Information System (INIS)

    Harada, Takuya; Nishikawa, Keiichi; Kuroyanagi, Kinya

    1998-01-01

    We evaluated the usefulness of the unsharp masking technique as a preprocessing filter to improve 3D-CT images of bony structure in the maxillofacial region. The effect of the unsharp masking technique, with several combinations of mask size and weighting factor, on image resolution was investigated using a spatial frequency phantom made of bone-equivalent material. The 3D-CT images were obtained with scans perpendicular to and parallel to the phantom plates. The contrast transfer function (CTF) and the full width at half maximum (FWHM) of each spatial frequency component were measured. The FWHM was expressed as a ratio against the actual thickness of the phantom plate. The effect on pseudoforamina was assessed using sliced CT images obtained in clinical bony 3D-CT examinations. The effect of the unsharp masking technique on image quality was also visually evaluated using five clinical fracture cases. The CTFs did not change. The FWHM ratios of the original 3D-CT images were smaller than 1.0, regardless of the scanning direction. Those in scans perpendicular to the phantom plates were not changed by the unsharp masking technique; those in parallel scanning increased with mask size and weighting factor. The area of pseudoforamina decreased with increases in mask size and weighting factor. The combination of a 3 x 3 pixel mask size and a weighting factor of 5 was optimal. Visual evaluation indicated that preprocessing with the unsharp masking technique improved the quality of the 3D-CT images. The unsharp masking technique is useful as a preprocessing filter to improve 3D-CT images of bony structure in the maxillofacial region. (author)
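
    A minimal Python/SciPy sketch of the unsharp masking operation described above is given below; the defaults mirror the 3 x 3 pixel mask and weighting factor of 5 reported as optimal, while the function name and the uniform (box) mean are illustrative assumptions.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def unsharp_mask(slice_image, mask_size=3, weight=5.0):
            # The local mean (the 'mask') over a mask_size x mask_size window is subtracted
            # from the original and the difference is amplified by the weighting factor.
            blurred = uniform_filter(slice_image.astype(float), size=mask_size)
            return slice_image + weight * (slice_image - blurred)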

  7. Spectrum image analysis tool - A flexible MATLAB solution to analyze EEL and CL spectrum images.

    Science.gov (United States)

    Schmidt, Franz-Philipp; Hofer, Ferdinand; Krenn, Joachim R

    2017-02-01

    Spectrum imaging techniques, which simultaneously gain structural (image) and spectroscopic data, require appropriate and careful processing to extract information from the dataset. In this article we introduce MATLAB-based software that uses three-dimensional data (an EEL/CL spectrum image in dm3 format (Gatan Inc.'s DigitalMicrograph®)) as input. A graphical user interface enables fast and easy mapping of spectrally dependent images and position-dependent spectra. First, data processing such as background subtraction, deconvolution and denoising; second, multiple display options including an EEL/CL moviemaker; and third, applicability to a large number of data sets with a small workload make this program an interesting tool to visualize otherwise hidden details. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Electrical resistance imaging of a time-varying interface in stratified flows using an unscented Kalman filter

    International Nuclear Information System (INIS)

    Ijaz, Umer Zeeshan; Khambampati, Anil Kumar; Kim, Kyung Youn; Chung, Soon Il; Kim, Sin

    2008-01-01

    In this paper, we estimate a time-varying interfacial boundary in stratified flows of two immiscible liquids using electrical resistance tomography. The interfacial boundary is approximated with front points spaced discretely along the interface. The design variables to be estimated are the locations of the front points, which vary with the moving interface. The inverse problem is treated as a stochastic nonlinear state estimation problem, with the nonstationary phase boundary (state) being estimated with the aid of an unscented Kalman filter. Numerical experiments are performed to evaluate the performance of the unscented Kalman filter. Specifically, a detailed analysis has been done on the effect of the number of front points and the contrast ratio on the reconstruction performance. The reconstruction results show that the unscented Kalman filter is better suited for this estimation than the conventional extended Kalman filter.

  9. Extraction of topographic and material contrasts on surfaces from SEM images obtained by energy filtering detection with low-energy primary electrons.

    Science.gov (United States)

    Nagoshi, Masayasu; Aoyama, Tomohiro; Sato, Kaoru

    2013-01-01

    Secondary electron microscope (SEM) images have been obtained for practical materials using low primary electron energies and an in-lens annular detector, while changing the negative bias voltage supplied to a grid placed in front of the detector. The kinetic-energy distribution of the detected electrons was evaluated from the gradient of the bias-energy dependence of the image brightness. The distribution divides into mainly two parts at about 500 V, with high and low brightness in the low- and high-energy regions, respectively, and differs among surface regions having different composition and topography. The combination of the negative grid bias and pixel-by-pixel image subtraction provides band-pass filtered images and extracts the material and topographic information of the specimen surfaces. Copyright © 2012 Elsevier B.V. All rights reserved.
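
    The band-pass construction by pixel-by-pixel subtraction can be sketched as follows in NumPy, under the assumption that the two input images were acquired at a less negative and a more negative grid bias, respectively, so that their difference retains electrons whose energies lie between the two retarding thresholds; the names and the clipping choice are illustrative.

        import numpy as np

        def bandpass_image(image_low_bias, image_high_bias):
            # image_low_bias  : acquired with the smaller (less negative) grid bias,
            #                   i.e. electrons above the lower energy threshold
            # image_high_bias : acquired with the larger (more negative) grid bias,
            #                   i.e. only electrons above the higher threshold
            low = image_low_bias.astype(float)
            high = image_high_bias.astype(float)
            return np.clip(low - high, 0, None)   # electrons between the two thresholds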

  10. Generalized Selection Weighted Vector Filters

    Directory of Open Access Journals (Sweden)

    Rastislav Lukac

    2004-09-01

    Full Text Available This paper introduces a class of nonlinear multichannel filters capable of removing impulsive noise in color images. The proposed generalized selection weighted vector filter class constitutes a powerful filtering framework for multichannel signal processing. Previously defined multichannel filters such as the vector median filter, the basic vector directional filter, the directional-distance filter, weighted vector median filters, and weighted vector directional filters are treated from a global viewpoint using the proposed framework. Robust order-statistic concepts and an increased degree of freedom in filter design make the proposed method attractive for a variety of applications. The introduced multichannel sigmoidal adaptation of the filter parameters and its modifications allow the filter parameters to be accommodated to varying signal and noise statistics. Simulation studies reported in this paper indicate that the proposed filter class is computationally attractive, yields excellent performance, and is able to preserve fine details and color information while efficiently suppressing impulsive noise. This paper is an extended version of the paper by Lukac et al. presented at the 2003 IEEE-EURASIP Workshop on Nonlinear Signal and Image Processing (NSIP '03) in Grado, Italy.
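
    For orientation, a plain (unweighted) vector median filter, one of the special cases covered by the proposed framework, can be sketched in NumPy as below; the brute-force window loop and the function name are illustrative simplifications rather than the authors' optimized filter class.

        import numpy as np

        def vector_median_filter(image, window=3):
            # image: RGB array of shape (H, W, 3). For each window the output pixel is the
            # input vector whose summed Euclidean distance to all other vectors is smallest.
            pad = window // 2
            padded = np.pad(image.astype(float), ((pad, pad), (pad, pad), (0, 0)), mode='edge')
            out = np.empty(image.shape, dtype=float)
            h, w = image.shape[:2]
            for i in range(h):
                for j in range(w):
                    block = padded[i:i + window, j:j + window].reshape(-1, 3)
                    dists = np.linalg.norm(block[:, None, :] - block[None, :, :], axis=2).sum(axis=1)
                    out[i, j] = block[np.argmin(dists)]
            return out.astype(image.dtype)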

  11. Comparison of iterative model, hybrid iterative, and filtered back projection reconstruction techniques in low-dose brain CT: impact of thin-slice imaging

    Energy Technology Data Exchange (ETDEWEB)

    Nakaura, Takeshi; Iyama, Yuji; Kidoh, Masafumi; Yokoyama, Koichi [Amakusa Medical Center, Diagnostic Radiology, Amakusa, Kumamoto (Japan); Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto (Japan); Oda, Seitaro; Yamashita, Yasuyuki [Kumamoto University, Department of Diagnostic Radiology, Graduate School of Medical Sciences, Kumamoto (Japan); Tokuyasu, Shinichi [Philips Electronics, Kumamoto (Japan); Harada, Kazunori [Amakusa Medical Center, Department of Surgery, Kumamoto (Japan)

    2016-03-15

    The purpose of this study was to evaluate the utility of iterative model reconstruction (IMR) in brain CT especially with thin-slice images. This prospective study received institutional review board approval, and prior informed consent to participate was obtained from all patients. We enrolled 34 patients who underwent brain CT and reconstructed axial images with filtered back projection (FBP), hybrid iterative reconstruction (HIR) and IMR with 1 and 5 mm slice thicknesses. The CT number, image noise, contrast, and contrast noise ratio (CNR) between the thalamus and internal capsule, and the rate of increase of image noise in 1 and 5 mm thickness images between the reconstruction methods, were assessed. Two independent radiologists assessed image contrast, image noise, image sharpness, and overall image quality on a 4-point scale. The CNRs in 1 and 5 mm slice thickness were significantly higher with IMR (1.2 ± 0.6 and 2.2 ± 0.8, respectively) than with FBP (0.4 ± 0.3 and 1.0 ± 0.4, respectively) and HIR (0.5 ± 0.3 and 1.2 ± 0.4, respectively) (p < 0.01). The mean rate of increasing noise from 5 to 1 mm thickness images was significantly lower with IMR (1.7 ± 0.3) than with FBP (2.3 ± 0.3) and HIR (2.3 ± 0.4) (p < 0.01). There were no significant differences in qualitative analysis of unfamiliar image texture between the reconstruction techniques. IMR offers significant noise reduction and higher contrast and CNR in brain CT, especially for thin-slice images, when compared to FBP and HIR. (orig.)

  12. Faraday anomalous dispersion optical filters

    Science.gov (United States)

    Shay, T. M.; Yin, B.; Alvarez, L. S.

    1993-01-01

    The effect of Faraday anomalous dispersion optical filters on infrared and blue transitions of some alkali atoms is calculated. A composite system is designed to further increase the background noise rejection. The measured results of the solar background rejection and image quality through the filter are presented. The results show that the filter may provide high transmission and high background noise rejection with excellent image quality.

  13. Rectifier Filters

    Directory of Open Access Journals (Sweden)

    Y. A. Bladyko

    2010-01-01

    Full Text Available The paper contains a definition of a smoothing factor that is suitable for any rectifier filter. Formulae for complex smoothing factors have been developed for simple and complex passive filters. The paper states the conditions for applying the calculation formulae and the filters.

  14. Combine TV-L1 model with guided image filtering for wide and faint ring artifacts correction of in-line x-ray phase contrast computed tomography.

    Science.gov (United States)

    Ji, Dongjiang; Qu, Gangrong; Hu, Chunhong; Zhao, Yuqing; Chen, Xiaodong

    2018-01-01

    In practice, mis-calibrated detector pixels give rise to wide and faint ring artifacts in reconstructed images from in-line phase-contrast computed tomography (IL-PC-CT). Ring artifact correction is therefore essential in IL-PC-CT. In this study, a novel method for correcting wide and faint ring artifacts is presented, based on combining a TV-L1 model with guided image filtering (GIF) in the reconstructed image domain. The new correction method includes two main steps, namely the GIF step and the TV-L1 step. To validate the performance of this method, simulation data and real experimental synchrotron data are provided. The results demonstrate that the TV-L1 model combined with the GIF step can effectively correct the wide and faint ring artifacts of IL-PC-CT.
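
    A minimal gray-scale guided image filter (after He et al.), of the kind usable as the GIF step, can be sketched in Python/SciPy as follows; the box-mean implementation, the parameter defaults, and the function name are assumptions for illustration, not the authors' code.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def guided_filter(guide, src, radius=8, eps=1e-3):
            # guide, src: 2D float arrays of the same shape; radius sets the box window,
            # eps regularizes the per-window linear model (larger eps = stronger smoothing).
            size = 2 * radius + 1
            mean_I = uniform_filter(guide, size)
            mean_p = uniform_filter(src, size)
            mean_Ip = uniform_filter(guide * src, size)
            mean_II = uniform_filter(guide * guide, size)

            cov_Ip = mean_Ip - mean_I * mean_p      # covariance of guide and input per window
            var_I = mean_II - mean_I * mean_I       # variance of the guide per window

            a = cov_Ip / (var_I + eps)              # linear coefficients of the local model
            b = mean_p - a * mean_I
            return uniform_filter(a, size) * guide + uniform_filter(b, size)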

  15. Deferred slanted-edge analysis: a unified approach to spatial frequency response measurement on distorted images and color filter array subsets.

    Science.gov (United States)

    van den Bergh, F

    2018-03-01

    The slanted-edge method of spatial frequency response (SFR) measurement is usually applied to grayscale images under the assumption that any distortion of the expected straight edge is negligible. By decoupling the edge orientation and position estimation step from the edge spread function construction step, it is shown in this paper that the slanted-edge method can be extended to allow it to be applied to images suffering from significant geometric distortion, such as produced by equiangular fisheye lenses. This same decoupling also allows the slanted-edge method to be applied directly to Bayer-mosaicked images so that the SFR of the color filter array subsets can be measured directly without the unwanted influence of demosaicking artifacts. Numerical simulation results are presented to demonstrate the efficacy of the proposed deferred slanted-edge method in relation to existing methods.

  16. A New Switching-Based Median Filtering Scheme and Algorithm for Removal of High-Density Salt and Pepper Noise in Images

    Directory of Open Access Journals (Sweden)

    Jayaraj V

    2010-01-01

    Full Text Available A new switching-based median filtering scheme for restoration of images that are highly corrupted by salt and pepper noise is proposed. An algorithm based on the scheme is developed. The new scheme introduces the concept of substitution of noisy pixels by linear prediction prior to estimation. A novel simplified linear predictor is developed for this purpose. The objective of the scheme and algorithm is the removal of high-density salt and pepper noise in images. The new algorithm shows significantly better image quality with good PSNR, reduced MSE, good edge preservation, and reduced streaking. The good performance is achieved with reduced computational complexity. A comparison of the performance is made with several existing algorithms in terms of visual and quantitative results. The performance of the proposed scheme and algorithm is demonstrated.
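
    As a baseline illustration of the switching idea (replacing only pixels flagged as salt or pepper, without the paper's linear-prediction substitution), a short NumPy/SciPy sketch follows; the extreme-value detection rule, the 3 x 3 window, and the function name are simplifying assumptions.

        import numpy as np
        from scipy.ndimage import median_filter

        def switching_median(image, low=0, high=255):
            # Pixels equal to the extreme values are treated as noise candidates and
            # replaced by the local 3 x 3 median; all other pixels are left untouched.
            medians = median_filter(image.astype(float), size=3)
            corrupted = (image == low) | (image == high)
            out = image.astype(float)
            out[corrupted] = medians[corrupted]
            return out.astype(image.dtype)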

  17. Comparison of the image qualities of filtered back-projection, adaptive statistical iterative reconstruction, and model-based iterative reconstruction for CT venography at 80 kVp

    International Nuclear Information System (INIS)

    Kim, Jin Hyeok; Choo, Ki Seok; Moon, Tae Yong; Lee, Jun Woo; Jeon, Ung Bae; Kim, Tae Un; Hwang, Jae Yeon; Yun, Myeong-Ja; Jeong, Dong Wook; Lim, Soo Jin

    2016-01-01

    To evaluate the subjective and objective qualities of computed tomography (CT) venography images at 80 kVp using model-based iterative reconstruction (MBIR) and to compare these with those of filtered back projection (FBP) and adaptive statistical iterative reconstruction (ASIR) using the same CT data sets. Forty-four patients (mean age: 56.1 ± 18.1) who underwent 80 kVp CT venography (CTV) for the evaluation of deep vein thrombosis (DVT) during 4 months were enrolled in this retrospective study. The same raw data were reconstructed using FBP, ASIR, and MBIR. Objective and subjective image analyses were performed at the inferior vena cava (IVC), femoral vein, and popliteal vein. The mean CNR of MBIR was significantly greater than those of FBP and ASIR, and images reconstructed using MBIR had significantly lower objective image noise (p < 0.001). Subjective image quality and confidence in detecting DVT with MBIR were significantly greater than with FBP and ASIR (p < 0.005), and MBIR had the lowest score for subjective image noise (p < 0.001). CTV at 80 kVp with MBIR was superior to FBP and ASIR regarding subjective and objective image qualities. (orig.)

  18. Relationship between pre-reconstruction filter and accuracy of registration software based on mutual-information maximization. A study of SPECT-MR brain phantom images

    International Nuclear Information System (INIS)

    Mito, Suzuko; Magota, Keiichi; Arai, Hiroshi; Omote, Hidehiko; Katsuura, Hidenori; Suzuki, Kotaro; Kubo, Naoki

    2005-01-01

    Image registration is becoming an increasingly important tool in SPECT. Recently, software based on mutual-information maximization has been developed for automatic multimodality image registration. The accuracy of the software is important for its application to image registration. During SPECT reconstruction, the projection data are pre-filtered in order to reduce Poisson noise, commonly using a Butterworth filter. We have investigated the dependence of the absolute accuracy of MRI-SPECT registration on the cut-off frequencies of a range of Butterworth filters. This study used a 3D Hoffman phantom (Model No. 9000, Data-spectrum Co.). For the reference volume, a magnetization-prepared rapid gradient echo (MPRAGE) sequence was performed on a Vision MRI (Siemens, 1.5 T). For the floating volumes, SPECT data of a phantom containing 99m Tc at 85 kBq/mL were acquired with a GCA-9300 (Toshiba Medical Systems Co.). During SPECT, the orbito-meatal (OM) line of the phantom was tilted by 5 deg and 15 deg to mimic the incline of a patient's head. The projection data were pre-filtered with Butterworth filters (cut-off frequency varying from 0.24 to 0.94 cycles/cm in 0.02 steps, order 8). The automated registrations were performed using iNRT β version software (Nihon Medi. Co.) and the rotation angles of SPECT for registration were noted. In this study, the registrations of all SPECT data were successful. Graphs of registration rotation angles against cut-off frequencies were scattered and showed no correlation between the two. The registration rotation angles ranged, with changing cut-off frequency, from -0.4 deg to +3.8 deg at a 5 deg tilt and from +12.7 deg to +19.6 deg at a 15 deg tilt. The registration rotation angles showed variation even for slight differences in cut-off frequency. The absolute errors were a few degrees for any cut-off frequency. Regardless of the cut-off frequency, automatic registration using this software provides similar results. (author)

  19. TU-H-206-04: An Effective Homomorphic Unsharp Mask Filtering Method to Correct Intensity Inhomogeneity in Daily Treatment MR Images

    International Nuclear Information System (INIS)

    Yang, D; Gach, H; Li, H; Mutic, S

    2016-01-01

    Purpose: The daily treatment MRIs acquired on MR-IGRT systems, like diagnostic MRIs, suffer from intensity inhomogeneity associated with B1 and B0 inhomogeneities. An improved homomorphic unsharp mask (HUM) filtering method, together with automatic and robust body segmentation and imaging field-of-view (FOV) detection methods, was developed to compute the multiplicative slowly varying correction field and correct the intensity inhomogeneity. The goal is to improve and normalize the voxel intensity so that the images can be processed more accurately by quantitative methods (e.g., segmentation and registration) that require consistent image voxel intensity values. Methods: HUM methods have been widely used for years. A body mask is required; otherwise the body surface in the corrected image would be incorrectly bright due to the sudden intensity transition at the body surface. In this study, we developed an improved HUM-based correction method that includes three main components: 1) robust body segmentation on the normalized image gradient map, 2) robust FOV detection (needed for body segmentation) using region growing and morphologic filters, and 3) an effective implementation of HUM using repeated Gaussian convolution. Results: The proposed method was successfully tested on patient images of common anatomical sites (H/N, lung, abdomen and pelvis). Initial qualitative comparisons showed that this improved HUM method outperformed three recently published algorithms (FCM, LEMS, MICO) in both computation speed (by 50+ times) and robustness (in intermediate to severe inhomogeneity situations). Currently implemented in MATLAB, it takes 20 to 25 seconds to process a 3D MRI volume. Conclusion: Compared to more sophisticated MRI inhomogeneity correction algorithms, the improved HUM method is simple and effective. The inhomogeneity correction, body mask, and FOV detection methods developed in this study would be useful as preprocessing tools for many MRI-related research and
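
    A compact sketch of HUM-style correction restricted to a body mask, using repeated Gaussian convolution as described, is given below in Python/SciPy; the normalized-convolution handling of the mask, the sigma and pass counts, and the function name are illustrative assumptions rather than the authors' MATLAB implementation.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def hum_correct(image, body_mask, sigma=30.0, passes=3, eps=1e-6):
            # Estimate a slowly varying bias field by repeated Gaussian smoothing of the
            # masked image (normalized by the smoothed mask), then divide it out inside the mask.
            img = image.astype(float)
            mask = body_mask.astype(float)

            bias = img * mask
            norm = mask.copy()
            for _ in range(passes):
                bias = gaussian_filter(bias, sigma)
                norm = gaussian_filter(norm, sigma)
            bias = bias / np.maximum(norm, eps)        # masked (normalized) smooth estimate

            mean_inside = img[body_mask > 0].mean()    # restore the mean level inside the body
            return np.where(body_mask > 0, img / np.maximum(bias, eps) * mean_inside, img)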

  20. Meta-optimization of the extended kalman filter's parameters for improved feature extraction on hyper-temporal images

    CSIR Research Space (South Africa)

    Salmon, BP

    2011-07-01

    Full Text Available This paper proposes a meta-optimization approach for setting the parameters of the non-linear extended Kalman filter to rapidly and efficiently estimate the features of the pair of triply modulated cosine functions. The approach is based on an unsupervised...

  1. Electron energy loss spectroscopy microanalysis and imaging in the transmission electron microscope: example of biological applications

    International Nuclear Information System (INIS)

    Diociaiuti, Marco

    2005-01-01

    This paper reports original results obtained in our laboratory over the past few years in the application of both electron energy loss spectroscopy (EELS) and electron spectroscopy imaging (ESI) to biological samples, performed in two transmission electron microscopes (TEM) equipped with high-resolution electron filters and spectrometers: a Gatan model 607 single magnetic sector double focusing EEL serial spectrometer attached to a Philips 430 TEM and a Zeiss EM902 Energy Filtering TEM. The primary interest was on the possibility offered by the combined application of these spectroscopic techniques with those offered by the TEM. In particular, the electron beam focusing available in a TEM allowed us to perform EELS and ESI on very small sample volumes, where high-resolution imaging and electron diffraction techniques can provide important structural information. I show that ESI was able to improve TEM performance, due to the reduced chromatic aberration and the possibility of avoiding the sample staining procedure. Finally, the analysis of the oscillating extended energy loss fine structure (EXELFS) beyond the ionization edges characterizing the EELS spectra allowed me, in a manner very similar to the extended X-ray absorption fine structure (EXAFS) analysis of the X-ray absorption spectra, to obtain short-range structural information for such light elements of biological interest as O or Fe. The Philips EM430 (250-300 keV) TEM was used to perform EELS microanalysis on Ca, P, O, Fe, Al and Si. The assessment of the detection limits of this method was obtained working with well-characterized samples containing Ca and P, and mimicking the actual cellular matrix. I applied EELS microanalysis to Ca detection in bone tissue during the mineralization process and to P detection in the cellular membrane of erythrocytes treated with an anti-tumoral drug, demonstrating that the cellular membrane is a drug target. I applied EELS microanalysis and selected area electron

  2. A search algorithm to meta-optimize the parameters for an extended Kalman filter to improve classification on hyper-temporal images

    CSIR Research Space (South Africa)

    Salmon

    2012-07-01

    Full Text Available IEEE International Geoscience and Remote Sensing Symposium, Munich, Germany, 22-27 July 2012. A search algorithm to meta-optimize the parameters for an extended Kalman filter to improve classification on hyper-temporal images, B.P. Salmon, ...

  3. Face Recognition using Gabor Filters

    Directory of Open Access Journals (Sweden)

    Sajjad MOHSIN

    2011-01-01

    Full Text Available An Elastic Bunch Graph Map (EBGM) algorithm is proposed in this research paper that implements face recognition using Gabor filters. The proposed system applies 40 different Gabor filters to an image, as a result of which 40 images with different angles and orientations are obtained. Next, the maximum intensity points in each filtered image are calculated and marked as fiducial points. The system reduces these points according to the distance between them. The next step is to calculate the distances between the reduced points using the distance formula. Finally, the distances are compared with the database; if a match occurs, the image is recognized.
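
    A minimal NumPy construction of a 40-filter Gabor bank (5 scales x 8 orientations, the layout commonly used in EBGM-style systems) is sketched below; the kernel size, scale progression, and aspect ratio are illustrative assumptions.

        import numpy as np

        def gabor_kernel(size, sigma, theta, wavelength, gamma=0.5):
            # Real part of a Gabor kernel: a Gaussian envelope times an oriented cosine carrier.
            half = size // 2
            y, x = np.mgrid[-half:half + 1, -half:half + 1]
            x_t = x * np.cos(theta) + y * np.sin(theta)
            y_t = -x * np.sin(theta) + y * np.cos(theta)
            envelope = np.exp(-(x_t**2 + (gamma * y_t)**2) / (2.0 * sigma**2))
            return envelope * np.cos(2.0 * np.pi * x_t / wavelength)

        # 40 filters: 5 scales x 8 orientations
        bank = [gabor_kernel(size=31, sigma=2.0 * 1.6**s, theta=o * np.pi / 8,
                             wavelength=4.0 * 1.6**s)
                for s in range(5) for o in range(8)]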

  4. Updating the OMERACT filter

    DEFF Research Database (Denmark)

    D'Agostino, Maria-Antonietta; Boers, Maarten; Kirwan, John

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides a framework for the validation of outcome measures for use in rheumatology clinical research. However, imaging and biochemical measures may face additional validation challenges because of their technical nature. The Imaging...... using the original OMERACT Filter and the newly proposed structure. Breakout groups critically reviewed the extent to which the candidate biomarkers complied with the proposed stepwise approach, as a way of examining the utility of the proposed 3-dimensional structure. RESULTS: Although...... was obtained for a proposed tri-axis structure to assess validation of imaging and soluble biomarkers; nevertheless, additional work is required to better evaluate its place within the OMERACT Filter 2.0....

  5. Effect of number of of projections on inverse radon transform based image reconstruction by using filtered back-projection for parallel beam transmission tomography

    International Nuclear Information System (INIS)

    Qureshi, S.A.; Mirza, S.M.; Arif, M.

    2007-01-01

    This paper presents the effect of the number of projections on inverse Radon transform (IRT) estimation using the filtered back-projection (FBP) technique for parallel beam transmission tomography. The head phantom and the lung phantom have been used in this work. Filters used in this study include the Ram-Lak, Shepp-Logan, Cosine, Hamming and Hanning filters. The slices have been reconstructed by increasing the number of projections through parallel beam transmission tomography while keeping the projections uniformly distributed. The Euclidean and mean squared errors and the peak signal-to-noise ratio (PSNR) have been analyzed for their sensitivity as functions of the number of projections. It has been found that image quality improves with the number of projections, but at the cost of computation time. The error has been minimized to obtain the best approximation of the inverse Radon transform (IRT) as the number of projections is increased. The value of PSNR has been found to increase from 8.20 to 24.53 dB as the number of projections is raised from 5 to 180 for the head phantom. (author)
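
    The dependence of reconstruction quality on the number of projections can be reproduced with standard tools. The following minimal sketch (assuming a recent scikit-image and the Shepp-Logan phantom as a stand-in for the head phantom used in the study) reconstructs with one of the window filters named above and reports PSNR versus the number of uniformly spaced projections.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon
from skimage.metrics import peak_signal_noise_ratio

phantom = shepp_logan_phantom()

# Reconstruct with an increasing number of uniformly spaced projections
for n_proj in (5, 15, 45, 90, 180):
    theta = np.linspace(0.0, 180.0, n_proj, endpoint=False)
    sinogram = radon(phantom, theta=theta)
    recon = iradon(sinogram, theta=theta, filter_name="hamming")
    psnr = peak_signal_noise_ratio(phantom, recon,
                                   data_range=phantom.max() - phantom.min())
    print(f"{n_proj:3d} projections: PSNR = {psnr:.2f} dB")
```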

  6. 3D skin surface reconstruction from a single image by merging global curvature and local texture using the guided filtering for 3D haptic palpation.

    Science.gov (United States)

    Lee, K; Kim, M; Kim, K

    2018-05-11

    Skin surface evaluation has been studied using various imaging techniques. However, all these studies had limited impact because they were performed using visual exam only. To improve on this scenario with haptic feedback, we propose 3D reconstruction of the skin surface using a single image. Unlike extant 3D skin surface reconstruction algorithms, we utilize the local texture and global curvature regions, combining the results for reconstruction. The first entails the reconstruction of global curvature, achieved by bilateral filtering that removes noise on the surface while maintaining the edge (ie, furrow) to obtain the overall curvature. The second entails the reconstruction of local texture, representing the fine wrinkles of the skin, using an advanced form of bilateral filtering. The final image is then composed by merging the two reconstructed images. We tested the curvature reconstruction part by comparing the resulting curvatures with measured values from real phantom objects while local texture reconstruction was verified by measuring skin surface roughness. Then, we showed the reconstructed result of our proposed algorithm via the reconstruction of various real skin surfaces. The experimental results demonstrate that our approach is a promising technology to reconstruct an accurate skin surface with a single skin image. We proposed 3D skin surface reconstruction using only a single camera. We highlighted the utility of global curvature, which has not been considered important in the past. Thus, we proposed a new method for 3D reconstruction that can be used for 3D haptic palpation, dividing the concepts of local and global regions. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
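
    The curvature/texture split described above can be sketched with an ordinary edge-preserving bilateral filter. The snippet below is only a minimal illustration under assumed parameters and a placeholder file name; the published method additionally uses guided filtering and an advanced bilateral variant for the fine-texture layer.

```python
import cv2
import numpy as np

gray = cv2.imread("skin.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)  # placeholder

# Edge-preserving smoothing keeps furrows while removing noise: global curvature layer
curvature = cv2.bilateralFilter(gray, d=15, sigmaColor=40, sigmaSpace=15)

# Residual after removing the curvature layer: fine wrinkle texture
texture = gray - curvature

# A height map can then be composed by re-weighting and merging the two layers
height_map = curvature + 0.5 * texture   # 0.5 is an arbitrary illustrative weight
```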

  7. Adaptive iterative dose reduction algorithm in CT: Effect on image quality compared with filtered back projection in body phantoms of different sizes

    International Nuclear Information System (INIS)

    Kim, Milim; Lee, Jeong Min; Son, Hyo Shin; Han, Joon Koo; Choi, Byung Ihn; Yoon, Jeong Hee; Choi, Jin Woo

    2014-01-01

    To evaluate the impact of the adaptive iterative dose reduction (AIDR) three-dimensional (3D) algorithm in CT on noise reduction and the image quality compared to the filtered back projection (FBP) algorithm and to compare the effectiveness of AIDR 3D on noise reduction according to the body habitus using phantoms with different sizes. Three different-sized phantoms with diameters of 24 cm, 30 cm, and 40 cm were built up using the American College of Radiology CT accreditation phantom and layers of pork belly fat. Each phantom was scanned eight times using different mAs. Images were reconstructed using the FBP and three different strengths of the AIDR 3D. The image noise, the contrast-to-noise ratio (CNR) and the signal-to-noise ratio (SNR) of the phantom were assessed. Two radiologists assessed the image quality of the 4 image sets in consensus. The effectiveness of AIDR 3D on noise reduction compared with FBP was also compared according to the phantom sizes. Adaptive iterative dose reduction 3D significantly reduced the image noise compared with FBP and enhanced the SNR and CNR (p < 0.05) with improved image quality (p < 0.05). When a stronger reconstruction algorithm was used, a greater increase of SNR and CNR as well as greater noise reduction was achieved (p < 0.05). The noise reduction effect of AIDR 3D was significantly greater in the 40-cm phantom than in the 24-cm or 30-cm phantoms (p < 0.05). The AIDR 3D algorithm is effective in reducing image noise and improving image-quality parameters compared with the FBP algorithm, and its effectiveness may increase as the phantom size increases.
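
    The objective metrics reported above are ROI statistics. A minimal sketch of how such values are commonly computed (assuming SNR = ROI mean / ROI standard deviation and CNR = |mean difference| / background noise; exact definitions vary between studies):

```python
import numpy as np

def roi_stats(image, y, x, half=10):
    """Mean and standard deviation inside a square ROI centred at (y, x)."""
    roi = image[y - half:y + half, x - half:x + half]
    return float(roi.mean()), float(roi.std())

def snr_cnr(image, target_yx, background_yx):
    mu_t, sd_t = roi_stats(image, *target_yx)
    mu_b, sd_b = roi_stats(image, *background_yx)
    snr = mu_t / sd_t                      # signal-to-noise ratio of the target ROI
    cnr = abs(mu_t - mu_b) / sd_b          # contrast-to-noise ratio vs. background
    return snr, cnr
```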

  8. Adaptive iterative dose reduction algorithm in CT: Effect on image quality compared with filtered back projection in body phantoms of different sizes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Milim; Lee, Jeong Min; Son, Hyo Shin; Han, Joon Koo; Choi, Byung Ihn [College of Medicine, Seoul National University, Seoul (Korea, Republic of); Yoon, Jeong Hee; Choi, Jin Woo [Dept. of Radiology, Seoul National University Hospital, Seoul (Korea, Republic of)

    2014-04-15

    To evaluate the impact of the adaptive iterative dose reduction (AIDR) three-dimensional (3D) algorithm in CT on noise reduction and the image quality compared to the filtered back projection (FBP) algorithm and to compare the effectiveness of AIDR 3D on noise reduction according to the body habitus using phantoms with different sizes. Three different-sized phantoms with diameters of 24 cm, 30 cm, and 40 cm were built up using the American College of Radiology CT accreditation phantom and layers of pork belly fat. Each phantom was scanned eight times using different mAs. Images were reconstructed using the FBP and three different strengths of the AIDR 3D. The image noise, the contrast-to-noise ratio (CNR) and the signal-to-noise ratio (SNR) of the phantom were assessed. Two radiologists assessed the image quality of the 4 image sets in consensus. The effectiveness of AIDR 3D on noise reduction compared with FBP was also compared according to the phantom sizes. Adaptive iterative dose reduction 3D significantly reduced the image noise compared with FBP and enhanced the SNR and CNR (p < 0.05) with improved image quality (p < 0.05). When a stronger reconstruction algorithm was used, a greater increase of SNR and CNR as well as greater noise reduction was achieved (p < 0.05). The noise reduction effect of AIDR 3D was significantly greater in the 40-cm phantom than in the 24-cm or 30-cm phantoms (p < 0.05). The AIDR 3D algorithm is effective in reducing image noise and improving image-quality parameters compared with the FBP algorithm, and its effectiveness may increase as the phantom size increases.

  9. Image quality and radiation dose of low dose coronary CT angiography in obese patients: Sinogram affirmed iterative reconstruction versus filtered back projection

    International Nuclear Information System (INIS)

    Wang, Rui; Schoepf, U. Joseph; Wu, Runze; Reddy, Ryan P.; Zhang, Chuanchen; Yu, Wei; Liu, Yi; Zhang, Zhaoqi

    2012-01-01

    Purpose: To investigate the image quality and radiation dose of low radiation dose CT coronary angiography (CTCA) using sinogram affirmed iterative reconstruction (SAFIRE) compared with standard dose CTCA using filtered back-projection (FBP) in obese patients. Materials and methods: Seventy-eight consecutive obese patients were randomized into two groups and scanned using a prospectively ECG-triggered step-and-shoot (SAS) CTCA protocol on a dual-source CT scanner. Thirty-nine patients (protocol A) were examined using a routine radiation dose protocol at 120 kV and images were reconstructed with FBP. Thirty-nine patients (protocol B) were examined using a low dose protocol at 100 kV and images were reconstructed with SAFIRE. Two blinded observers independently assessed the image quality of each coronary segment using a 4-point scale (1 = non-diagnostic, 4 = excellent) and measured the objective parameters image noise, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR). Radiation dose was calculated. Results: The coronary artery image quality scores, image noise, SNR and CNR were not significantly different between protocols A and B (all p > 0.05), with image quality scores of 3.51 ± 0.70 versus 3.55 ± 0.47, respectively. The effective radiation dose was significantly lower in protocol B (4.41 ± 0.83 mSv) than in protocol A (8.83 ± 1.74 mSv, p < 0.01). Conclusion: Compared with standard dose CTCA using FBP, low dose CTCA using SAFIRE can maintain diagnostic image quality with 50% reduction of radiation dose.

  10. Cerebral Ischemia Detected with Diffusion-Weighted MR Imaging after Protected Carotid Artery Stenting: Comparison of Distal Balloon and Filter Device

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Suk Jung; Jeon, Pyoung [Sungkyunkwan University School of Medicine, Seoul (Korea, Republic of); Roh, Hong Gee [Konkuk University Hospital, Seoul (Korea, Republic of)] (and others)

    2007-08-15

    The aim of this study was to examine the incidence of ischemia during protected carotid artery stenting (CAS) as well as to compare the protective efficacy of the balloon and filter devices on diffusion-weighted MR imaging (DWI). Seventy-one consecutive protected CAS procedures in 70 patients with a severe (> 70%) or symptomatic moderate (> 50%) carotid artery stenosis were examined. A balloon device (PercuSurge GuardWire) and a filter device (FilterWire EX/EZ, Emboshield) were used in 33 cases (CAS-B group) and 38 cases (CAS-F group) to prevent distal embolization, respectively. All the patients underwent DWI within seven days before and after the procedures. The number of new cerebral ischemic lesions on the post-procedural DWI was counted and divided into ipsilateral and contralateral lesions according to the relationship with the stenting side. New cerebral ischemic lesions were detected in 13 (39.4%) out of the 33 CAS-Bs and in 15 (39.5%) out of the 38 CAS-Fs. The mean number of total, ipsilateral and contralateral new cerebral ischemic lesions was 2.39, 1.67 and 0.73 in the CAS-B group and 2.11, 1.32 and 0.79 in the CAS-F group, respectively. No statistical differences were found between the two groups (p = 0.96, 0.74 and 0.65, respectively). The embolic complications encountered included two retinal infarctions and one hemiparesis in the CAS-B group (9.09%), and one retinal infarction, one hemiparesis and one ataxia in the CAS-F group (7.89%). There was a similar incidence of embolic complications in the two groups (p = 1.00). The type of distal protection device used, such as a balloon or filter, does not affect the incidence of cerebral embolization after protected CAS.

  11. SU-F-T-260: Using Portal Image Device for Pre-Treatment QA in Volumetric Modulated Arc Plans with Flattening Filter Free (FFF) Beams

    Energy Technology Data Exchange (ETDEWEB)

    Qu, H; Qi, P; Yu, N; Xia, P [The Cleveland Clinic Foundation, Cleveland, OH (United States)

    2016-06-15

    Purpose: To implement and validate a method of using an electronic portal image device (EPID) for pre-treatment quality assurance (QA) of volumetric modulated arc therapy (VMAT) plans using flattening filter free (FFF) beams for stereotactic body radiotherapy (SBRT). Methods: On a Varian Edge with a 6 MV FFF beam, open field (from 2×2 cm to 20×20 cm) EPID images were acquired with 200 monitor units (MU) at an image device to radiation source distance of 150 cm. With a 10×10 cm open field and the calibration unit (CU) assigned by the vendor to each EPID image pixel, a dose conversion factor was determined by dividing the center dose calculated by the treatment planning system (TPS) by the corresponding CU readout on the image. Water-phantom-measured beam profiles and output factors for various field sizes were further correlated to those of the EPID images. The dose conversion factor and correction factors were then used for converting the portal images to the planar dose distributions of clinical fields. A total of 28 VMAT fields of 14 SBRT plans (8 lung, 2 prostate, 2 liver and 2 spine) were measured. With a 10% low threshold cutoff, the delivered dose distributions were compared to the reference doses calculated in a water phantom from the TPS. A gamma index analysis was performed for the comparison using percentage dose difference/distance-to-agreement specifications. Results: The EPID device has a linear response to the open fields with increasing MU. For the clinical fields, the gamma indices between the converted EPID dose distributions and the TPS calculated 2D dose distributions were 98.7%±1.1%, 94.0%±3.4% and 70.3%±7.7% for the criteria of 3%/3mm, 2%/2mm and 1%/1mm, respectively. Conclusion: Using a portal image device, high-resolution and high-accuracy portal dosimetry was achieved for pre-treatment QA verification for SBRT VMAT plans with FFF beams.
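
    The gamma analysis mentioned above can be sketched as a brute-force global 2D comparison with a 10% low-dose cutoff. This is only an illustrative implementation under assumed conventions (global normalisation to the reference maximum, search limited to a few DTA radii); clinical tools additionally interpolate the evaluated distribution.

```python
import numpy as np

def gamma_pass_rate(ref, evl, spacing_mm, dose_pct=3.0, dta_mm=3.0, cutoff=0.10):
    """Brute-force global 2D gamma analysis; returns the pass rate in per cent."""
    dd = dose_pct / 100.0 * ref.max()                 # dose-difference criterion
    ys, xs = np.mgrid[0:ref.shape[0], 0:ref.shape[1]].astype(float)
    ys *= spacing_mm
    xs *= spacing_mm
    mask = ref >= cutoff * ref.max()                  # ignore the low-dose region
    gammas = []
    for iy, ix in zip(*np.nonzero(mask)):
        dist2 = (ys - iy * spacing_mm) ** 2 + (xs - ix * spacing_mm) ** 2
        near = dist2 <= (3.0 * dta_mm) ** 2           # local search window
        g2 = (evl[near] - ref[iy, ix]) ** 2 / dd ** 2 + dist2[near] / dta_mm ** 2
        gammas.append(np.sqrt(g2.min()))
    return 100.0 * np.mean(np.asarray(gammas) <= 1.0)

# e.g. gamma_pass_rate(tps_dose, epid_dose, spacing_mm=1.0, dose_pct=2.0, dta_mm=2.0)
```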

  12. Evaluation to Obtain the Image According to the Spatial Domain Filtering of Various Convolution Kernels in the Multi-Detector Row Computed Tomography

    International Nuclear Information System (INIS)

    Lee, Hoo Min; Yoo, Beong Gyu; Kweon, Dae Cheol

    2008-01-01

    Our objective was to evaluate spatial-domain filtering of images as an alternative to additional image reconstruction using different kernels in MDCT. Source images derived from thin collimation were generated using a water phantom and the abdomen with the B10 (very smooth), B20 (smooth), B30 (medium smooth), B40 (medium), B50 (medium sharp), B60 (sharp), B70 (very sharp) and B80 (ultra sharp) kernels. The MTF and spatial resolution were measured for the various convolution kernels. Quantitative CT attenuation coefficient and noise measurements provided comparable Hounsfield unit (HU) values. CT attenuation coefficients (mean HU) were 1.1∼1.8 HU in water and -998∼-1000 HU in air, with noise of 5.4∼44.8 HU in water and 3.6∼31.4 HU in air. In abdominal fat, the CT attenuation coefficient was -2.2∼0.8 HU with noise of 10.1∼82.4 HU; in abdominal muscle, 53.3∼54.3 HU with noise of 10.4∼70.7 HU; and in liver parenchyma, 60.4∼62.2 HU with noise of 7.6∼63.8 HU. Images reconstructed with a sharper convolution kernel (B80) showed increased noise, whereas the CT attenuation coefficients remained comparable. Modification of image sharpness and noise by spatial-domain filtering may eliminate the need for reconstruction using different kernels in the future. Adjusting the kernels, which should be chosen to take into account the examination being performed, may control the appearance of CT images and increase diagnostic accuracy.

  13. Filter apparatus

    International Nuclear Information System (INIS)

    Butterworth, D.J.

    1980-01-01

    This invention relates to liquid filters, precoated by replaceable powders, which are used in the production of ultra pure water required for steam generation of electricity. The filter elements are capable of being installed and removed by remote control so that they can be used in nuclear power reactors. (UK)

  14. SU-E-I-62: Assessing Radiation Dose Reduction and CT Image Optimization Through the Measurement and Analysis of the Detector Quantum Efficiency (DQE) of CT Images Using Different Beam Hardening Filters

    International Nuclear Information System (INIS)

    Collier, J; Aldoohan, S; Gill, K

    2014-01-01

    Purpose: Reducing patient dose while maintaining (or even improving) image quality is one of the foremost goals in CT imaging. To this end, we consider the feasibility of optimizing CT scan protocols in conjunction with the application of different beam-hardening filtrations and assess this augmentation through noise-power spectrum (NPS) and detector quantum efficiency (DQE) analysis. Methods: American College of Radiology (ACR) and Catphan phantoms (The Phantom Laboratory) were scanned with a 64-slice CT scanner with additional filtration of varying thickness and composition (e.g., copper, nickel, tantalum, titanium, and tungsten) applied. A MATLAB-based code was employed to calculate the NPS of the images. The Catphan Image Owl software suite was then used to compute the modulation transfer function (MTF) responses of the scanner. The DQE for each additional filter, including the inherent filtration, was then computed from these values. Finally, CT dose index (CTDIvol) values were obtained for each applied filtration through the use of a 100 mm pencil ionization chamber and CT dose phantom. Results: NPS, MTF, and DQE values were computed for each applied filtration and compared to the reference case of inherent beam-hardening filtration only. Results showed that the NPS values were reduced between 5 and 12% compared to the inherent filtration case. Additionally, CTDIvol values were reduced between 15 and 27% depending on the composition of filtration applied. However, no noticeable changes in image contrast-to-noise ratios were noted. Conclusion: The reduction in the quanta noise section of the NPS profile found in this phantom-based study is encouraging. The reduction in both noise and dose through the application of beam-hardening filters is reflected in our phantom image quality. However, further investigation is needed to ascertain the applicability of this approach to reducing patient dose while maintaining diagnostically acceptable image qualities in a
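
    The NPS estimation step can be illustrated with a standard estimator from uniform-phantom ROIs (mean-subtracted ROIs, squared FFT magnitude averaged over ROIs, scaled by pixel area over ROI size). This is a generic sketch, not the MATLAB code used in the study.

```python
import numpy as np

def nps_2d(rois, pixel_mm):
    """Estimate a 2D noise-power spectrum from a list of uniform-region ROIs."""
    ny, nx = rois[0].shape
    acc = np.zeros((ny, nx))
    for roi in rois:
        detrended = roi - roi.mean()                  # remove the mean (DC) term
        acc += np.abs(np.fft.fft2(detrended)) ** 2
    nps = acc / len(rois) * (pixel_mm ** 2) / (nx * ny)
    return np.fft.fftshift(nps)                       # centre zero frequency
```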

  15. Stack filter classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Reid B [Los Alamos National Laboratory; Hush, Don [Los Alamos National Laboratory

    2009-01-01

    Just as linear models generalize the sample mean and weighted average, weighted order statistic models generalize the sample median and weighted median. This analogy can be continued informally to generalized additive models in the case of the mean, and Stack Filters in the case of the median. Both of these model classes have been extensively studied for signal and image processing but it is surprising to find that for pattern classification, their treatment has been significantly one-sided. Generalized additive models are now a major tool in pattern classification and many different learning algorithms have been developed to fit model parameters to finite data. However, Stack Filters remain largely confined to signal and image processing and learning algorithms for classification are yet to be seen. This paper is a step towards Stack Filter Classifiers and it shows that the approach is interesting from both a theoretical and a practical perspective.

  16. Filtered Rayleigh Scattering Measurements in a Buoyant Flow Field

    National Research Council Canada - National Science Library

    Meents, Steven M

    2008-01-01

    Filtered Rayleigh Scattering (FRS) is a non-intrusive, laser-based flow characterization technique that consists of a narrow linewidth laser, a molecular absorption filter, and a high resolution camera behind the filter to record images...

  17. Design and application of finite impulse response digital filters

    International Nuclear Information System (INIS)

    Miller, T.R.; Sampathkumaran, K.S.

    1982-01-01

    The finite impulse response (FIR) digital filter is a spatial domain filter with a frequency domain representation. The theory of the FIR filter is presented and techniques are described for designing FIR filters with known frequency response characteristics. Rational design principles are emphasized based on characterization of the imaging system using the modulation transfer function and physical properties of the imaged objects. Bandpass, Wiener, and low-pass filters were designed and applied to 201Tl myocardial images. The bandpass filter eliminates low-frequency image components that represent background activity and high-frequency components due to noise. The Wiener, or minimum mean square error, filter 'sharpens' the image while also reducing noise. The Wiener filter illustrates the power of the FIR technique to design filters with any desired frequency response. The low-pass filter, while of relatively limited use, is presented to compare it with a popular elementary 'smoothing' filter. (orig.)
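
    A minimal sketch of spatial-domain FIR filtering in the spirit described above: a separable 2D low-pass kernel is built from a 1D windowed-sinc prototype, and a simple band-pass kernel is formed as the difference of two low-pass kernels. The tap count and cutoffs are illustrative assumptions, not the MTF-based design of the paper.

```python
import numpy as np
from scipy import signal

def fir2d_lowpass(numtaps=15, cutoff=0.2):
    """Separable 2D low-pass FIR kernel (cutoff normalised to the Nyquist frequency)."""
    h = signal.firwin(numtaps, cutoff, window="hamming")
    return np.outer(h, h)

def fir2d_bandpass(numtaps=15, low=0.05, high=0.35):
    """Simple band-pass kernel as the difference of two low-pass kernels."""
    return fir2d_lowpass(numtaps, high) - fir2d_lowpass(numtaps, low)

image = np.random.rand(128, 128)        # stand-in for a 201Tl myocardial image
smoothed = signal.convolve2d(image, fir2d_lowpass(), mode="same", boundary="symm")
bandpassed = signal.convolve2d(image, fir2d_bandpass(), mode="same", boundary="symm")
```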

  18. Cluster Based Vector Attribute Filtering

    NARCIS (Netherlands)

    Kiwanuka, Fred N.; Wilkinson, Michael H.F.

    2016-01-01

    Morphological attribute filters operate on images based on properties or attributes of connected components. Until recently, attribute filtering was based on a single global threshold on a scalar property to remove or retain objects. A single threshold struggles in case no single property or

  19. Tunable electro-optic filter stack

    Science.gov (United States)

    Fontecchio, Adam K.; Shriyan, Sameet K.; Bellingham, Alyssa

    2017-09-05

    A holographic polymer dispersed liquid crystal (HPDLC) tunable filter exhibits switching times of no more than 20 microseconds. The HPDLC tunable filter can be utilized in a variety of applications. An HPDLC tunable filter stack can be utilized in a hyperspectral imaging system capable of spectrally multiplexing hyperspectral imaging data acquired while the hyperspectral imaging system is airborne. HPDLC tunable filter stacks can be utilized in high speed switchable optical shielding systems, for example as a coating for a visor or an aircraft canopy. These HPDLC tunable filter stacks can be fabricated using a spin coating apparatus and associated fabrication methods.

  20. Prediction of the filter no-reflow phenomenon in patients with angina pectoris by using multimodality: Magnetic resonance imaging, optical coherence tomography, and serum biomarkers.

    Science.gov (United States)

    Matsumoto, Kenji; Ehara, Shoichi; Hasegawa, Takao; Otsuka, Kenichiro; Yoshikawa, Junichi; Shimada, Kenei

    2016-05-01

    Although the occurrence of no-reflow during percutaneous coronary intervention (PCI) has been shown to be associated with worse short- and long-term clinical outcomes, the clinical relevance of preventing flow deterioration by using the filter-based distal protection devices (DPDs) is controversial. We investigated predictors of the filter no-reflow (FNR) phenomenon during PCI by using multimodality, such as hyperintense plaques (HIPs) in the coronary artery on T1-weighted imaging (T1WI) non-contrast magnetic resonance, plaque composition by using optical coherence tomography (OCT), and serum biomarkers, in patients with angina pectoris. Fifty lesions from 50 patients with angina were examined. All patients underwent T1WI within 24 h before invasive coronary angiography was performed, and preinterventional OCT was performed on a native atherosclerotic culprit lesion. The signal intensity of coronary plaque to cardiac muscle ratio (PMR) was calculated on a standard console of the magnetic resonance system. Of the 50 lesions, 20 lesions showed FNR during PCI, while non-FNR was observed in 30 lesions. A cut-off value >1.85 of PMR had a sensitivity of 65%, a specificity of 93%, a positive predictive value of 87%, and a negative predictive value of 80% for identifying lesions with FNR. Multivariate analysis revealed that the presence of HIPs with PMR >1.85 (p=0.008) was the only independent predictor of the FNR phenomenon during PCI. This study shows that the presence of HIPs with PMR >1.85 on T1WI was a novel independent predictor of the FNR phenomenon during PCI in angina patients. This result may help in identifying high-risk lesions for no-reflow to deploy filter-based DPDs. Copyright © 2015 Japanese College of Cardiology. Published by Elsevier Ltd. All rights reserved.

  1. Filter systems

    International Nuclear Information System (INIS)

    Vanin, V.R.

    1990-01-01

    Multidetector systems for high-resolution gamma spectroscopy are presented. The observable parameters for identifying nuclides produced simultaneously in the reaction are analysed, and the efficiency of filter systems is discussed. (M.C.K.)

  2. Basic methodology of tomographic imaging by filtered back-projection at a turbo-pump. Project report; Methodische Grundlagen fuer die Tomographie durch gefilterte Rueckprojektion an einer Axialpumpe. Projektbericht

    Energy Technology Data Exchange (ETDEWEB)

    Hoppe, D.

    2000-11-01

    A two-phase medium consisting of a gas-containing liquid is transported in an axial turbo-pump by a propeller-like impeller, perpendicular to the impeller's axis of rotation. The interaction between the gaseous phase and the impeller is to be examined by tomography using gamma radiation. Reconstruction of the object image is to be performed by means of filtered back-projection. The methodological basis for applying this principle under the given geometric and measurement conditions is the subject of this work. (orig./CB)

  3. Characterisation of deuterium spectra from laser driven multi-species sources by employing differentially filtered image plate detectors in Thomson spectrometers

    International Nuclear Information System (INIS)

    Alejo, A.; Kar, S.; Ahmed, H.; Doria, D.; Green, A.; Jung, D.; Lewis, C. L. S.; Nersisyan, G.; Krygier, A. G.; Freeman, R. R.; Clarke, R.; Green, J. S.; Notley, M.; Fernandez, J.; Fuchs, J.; Kleinschmidt, A.; Roth, M.; Morrison, J. T.; Najmudin, Z.; Nakamura, H.

    2014-01-01

    A novel method for characterising the full spectrum of deuteron ions emitted by laser driven multi-species ion sources is discussed. The procedure is based on using differential filtering over the detector of a Thomson parabola ion spectrometer, which enables discrimination of deuterium ions from heavier ion species with the same charge-to-mass ratio (such as C6+, O8+, etc.). Commonly used Fuji Image plates were used as detectors in the spectrometer, whose absolute response to deuterium ions over a wide range of energies was calibrated by using slotted CR-39 nuclear track detectors. A typical deuterium ion spectrum diagnosed in a recent experimental campaign is presented, which was produced from a thin deuterated plastic foil target irradiated by a high power laser

  4. Characterisation of deuterium spectra from laser driven multi-species sources by employing differentially filtered image plate detectors in Thomson spectrometers

    Science.gov (United States)

    Alejo, A.; Kar, S.; Ahmed, H.; Krygier, A. G.; Doria, D.; Clarke, R.; Fernandez, J.; Freeman, R. R.; Fuchs, J.; Green, A.; Green, J. S.; Jung, D.; Kleinschmidt, A.; Lewis, C. L. S.; Morrison, J. T.; Najmudin, Z.; Nakamura, H.; Nersisyan, G.; Norreys, P.; Notley, M.; Oliver, M.; Roth, M.; Ruiz, J. A.; Vassura, L.; Zepf, M.; Borghesi, M.

    2014-09-01

    A novel method for characterising the full spectrum of deuteron ions emitted by laser driven multi-species ion sources is discussed. The procedure is based on using differential filtering over the detector of a Thomson parabola ion spectrometer, which enables discrimination of deuterium ions from heavier ion species with the same charge-to-mass ratio (such as C6+, O8+, etc.). Commonly used Fuji Image plates were used as detectors in the spectrometer, whose absolute response to deuterium ions over a wide range of energies was calibrated by using slotted CR-39 nuclear track detectors. A typical deuterium ion spectrum diagnosed in a recent experimental campaign is presented, which was produced from a thin deuterated plastic foil target irradiated by a high power laser.

  5. Characterisation of deuterium spectra from laser driven multi-species sources by employing differentially filtered image plate detectors in Thomson spectrometers

    Energy Technology Data Exchange (ETDEWEB)

    Alejo, A.; Kar, S., E-mail: s.kar@qub.ac.uk; Ahmed, H.; Doria, D.; Green, A.; Jung, D.; Lewis, C. L. S.; Nersisyan, G. [Centre for Plasma Physics, School of Mathematics and Physics, Queen's University Belfast, Belfast BT7 1NN (United Kingdom); Krygier, A. G.; Freeman, R. R. [Department of Physics, The Ohio State University, Columbus, Ohio 43210 (United States); Clarke, R.; Green, J. S.; Notley, M. [Central Laser Facility, Rutherford Appleton Laboratory, Didcot, Oxfordshire OX11 0QX (United Kingdom); Fernandez, J. [Central Laser Facility, Rutherford Appleton Laboratory, Didcot, Oxfordshire OX11 0QX (United Kingdom); Instituto de Fusión Nuclear, Universidad Politécnica de Madrid, 28006 Madrid (Spain); Fuchs, J. [LULI, École Polytechnique, CNRS, CEA, UPMC, 91128 Palaiseau (France); Kleinschmidt, A.; Roth, M. [Institut für Kernphysik, Technische Universität Darmstadt, Schloßgartenstrasse 9, D-64289 Darmstadt (Germany); Morrison, J. T. [Propulsion Systems Directorate, Air Force Research Lab, Wright Patterson Air Force Base, Ohio 45433 (United States); Najmudin, Z.; Nakamura, H. [Blackett Laboratory, Department of Physics, Imperial College, London SW7 2AZ (United Kingdom); and others

    2014-09-15

    A novel method for characterising the full spectrum of deuteron ions emitted by laser driven multi-species ion sources is discussed. The procedure is based on using differential filtering over the detector of a Thomson parabola ion spectrometer, which enables discrimination of deuterium ions from heavier ion species with the same charge-to-mass ratio (such as C6+, O8+, etc.). Commonly used Fuji Image plates were used as detectors in the spectrometer, whose absolute response to deuterium ions over a wide range of energies was calibrated by using slotted CR-39 nuclear track detectors. A typical deuterium ion spectrum diagnosed in a recent experimental campaign is presented, which was produced from a thin deuterated plastic foil target irradiated by a high power laser.

  6. Bloodstain detection and discrimination impacted by spectral shift when using an interference filter-based visible and near-infrared multispectral crime scene imaging system

    Science.gov (United States)

    Yang, Jie; Messinger, David W.; Dube, Roger R.

    2018-03-01

    Bloodstain detection and discrimination from nonblood substances on various substrates are critical in forensic science, as bloodstains are a key source for confirmatory DNA tests. Conventional bloodstain detection methods often involve time-consuming sample preparation, a chance of harm to investigators, the possibility of destruction of blood samples, and acquisition of too little data at crime scenes either in the field or in the laboratory. An imaging method has the advantages of being nondestructive, noncontact, real-time, and covering a large field-of-view. The abundant spectral information provided by multispectral imaging makes it a potential presumptive bloodstain detection and discrimination method. This article proposes an interference filter (IF) based area scanning three-spectral-band crime scene imaging system for forensic bloodstain detection and discrimination. The impact of large angles of view on the spectral shift of calibrated IFs is determined for both detecting and discriminating bloodstains from visually similar substances on multiple substrates. Spectral features in the visible and near-infrared portion of the spectrum are used, employing the relative band depth method. This study shows that a 1 ml bloodstain on black felt, gray felt, red felt, white cotton, white polyester, and raw wood can be detected. Bloodstains on the above substrates can be discriminated from cola, coffee, ketchup, orange juice, red wine, and green tea.
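
    The relative band depth feature can be sketched as the depth of a centre band below a linear continuum interpolated between two shoulder bands. The band positions, threshold and random test data below are illustrative assumptions; the study's actual bands and thresholds are not reproduced here.

```python
import numpy as np

def relative_band_depth(r_left, r_center, r_right, w_left, w_center, w_right):
    """Depth of the centre band relative to a linear continuum between the shoulders."""
    t = (w_center - w_left) / (w_right - w_left)
    continuum = (1.0 - t) * r_left + t * r_right
    return 1.0 - r_center / np.maximum(continuum, 1e-6)

# Co-registered reflectance images for three hypothetical bands (wavelengths assumed)
rng = np.random.default_rng(0)
band_l, band_c, band_r = (rng.uniform(0.1, 0.9, (64, 64)) for _ in range(3))
depth = relative_band_depth(band_l, band_c, band_r, 500.0, 570.0, 650.0)
blood_candidates = depth > 0.3        # arbitrary illustrative threshold
```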

  7. The visible to the near infrared narrow band acousto-optic tunable filter and the hyperspectral microscopic imaging on biomedicine study

    International Nuclear Information System (INIS)

    Zhang, Chunguang; Wang, Hao; Huang, Junfeng; Gao, Qiang

    2014-01-01

    Based on the parallel tangents momentum-matching condition, a narrow band noncollinear acousto-optic tunable filter (AOTF) using a single TeO2 crystal is designed with consideration of the birefringence and the rotatory property of the material. An effective setup is established to evaluate the performance of the designed AOTF. The experimentally observed spectrum pattern of the diffracted light is nearly the same as the theoretical result. The measured tuning relationship between the diffracted central optical wavelength and acoustic frequency is in accordance with the theoretical prediction. The optical bandwidth of the diffracted light is as narrow as 1.88 nm when the central wavelength is 556.75 nm. The high spectral resolution is significant in practical applications of imaging AOTFs. Additionally, an AOTF based hyperspectral microscopic imaging system is established. The stability and the image resolution of the designed narrow band AOTF are satisfactory. Finally, the study of the biologic samples indicates the feasibility of our system for biomedicine. (paper)

  8. Image quality of ct angiography using model-based iterative reconstruction in infants with congenital heart disease: Comparison with filtered back projection and hybrid iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Jia, Qianjun, E-mail: jiaqianjun@126.com [Southern Medical University, Guangzhou, Guangdong (China); Department of Radiology, Guangdong General Hospital, Guangdong Academy of Medical Sciences, Guangzhou, Guangdong (China); Department of Catheterization Lab, Guangdong Cardiovascular Institute, Guangdong Provincial Key Laboratory of South China Structural Heart Disease, Guangdong General Hospital, Guangdong Academy of Medical Sciences, Guangzhou, Guangdong (China); Zhuang, Jian, E-mail: zhuangjian5413@tom.com [Department of Cardiac Surgery, Guangdong Cardiovascular Institute, Guangdong Provincial Key Laboratory of South China Structural Heart Disease, Guangdong General Hospital, Guangdong Academy of Medical Sciences, Guangzhou, Guangdong (China); Jiang, Jun, E-mail: 81711587@qq.com [Department of Radiology, Shenzhen Second People’s Hospital, Shenzhen, Guangdong (China); Li, Jiahua, E-mail: 970872804@qq.com [Department of Catheterization Lab, Guangdong Cardiovascular Institute, Guangdong Provincial Key Laboratory of South China Structural Heart Disease, Guangdong General Hospital, Guangdong Academy of Medical Sciences, Guangzhou, Guangdong (China); Huang, Meiping, E-mail: huangmeiping_vip@163.com [Department of Catheterization Lab, Guangdong Cardiovascular Institute, Guangdong Provincial Key Laboratory of South China Structural Heart Disease, Guangdong General Hospital, Guangdong Academy of Medical Sciences, Guangzhou, Guangdong (China); Southern Medical University, Guangzhou, Guangdong (China); Liang, Changhong, E-mail: cjr.lchh@vip.163.com [Department of Radiology, Guangdong General Hospital, Guangdong Academy of Medical Sciences, Guangzhou, Guangdong (China); Southern Medical University, Guangzhou, Guangdong (China)

    2017-01-15

    Purpose: To compare the image quality, rate of coronary artery visualization and diagnostic accuracy of 256-slice multi-detector computed tomography angiography (CTA) with prospective electrocardiographic (ECG) triggering at a tube voltage of 80 kVp between 3 reconstruction algorithms (filtered back projection (FBP), hybrid iterative reconstruction (iDose4) and iterative model reconstruction (IMR)) in infants with congenital heart disease (CHD). Methods: Fifty-one infants with CHD who underwent cardiac CTA in our institution between December 2014 and March 2015 were included. The effective radiation doses were calculated. Imaging data were reconstructed using the FBP, iDose4 and IMR algorithms. Parameters of objective image quality (noise, signal-to-noise ratio (SNR) and contrast-to-noise ratio (CNR)); subjective image quality (overall image quality, image noise and margin sharpness); coronary artery visibility; and diagnostic accuracy for the three algorithms were measured and compared. Results: The mean effective radiation dose was 0.61 ± 0.32 mSv. Compared to FBP and iDose4, IMR yielded significantly lower noise (P < 0.01), higher SNR and CNR values (P < 0.01), and a greater subjective image quality score (P < 0.01). The total number of coronary segments visualized was significantly higher for both iDose4 and IMR than for FBP (P = 0.002 and P = 0.025, respectively), but there was no significant difference in this parameter between iDose4 and IMR (P = 0.397). There was no significant difference in the diagnostic accuracy between the FBP, iDose4 and IMR algorithms (χ2 = 0.343, P = 0.842). Conclusions: For infants with CHD undergoing cardiac CTA, the IMR reconstruction algorithm provided significantly increased objective and subjective image quality compared with the FBP and iDose4 algorithms. However, IMR did not improve the diagnostic accuracy or coronary artery visualization compared with iDose4.

  9. A three-dimensional-weighted cone beam filtered backprojection (CB-FBP) algorithm for image reconstruction in volumetric CT-helical scanning

    International Nuclear Information System (INIS)

    Tang Xiangyang; Hsieh Jiang; Nilsen, Roy A; Dutta, Sandeep; Samsonov, Dmitry; Hagiwara, Akira

    2006-01-01

    Based on the structure of the original helical FDK algorithm, a three-dimensional (3D)-weighted cone beam filtered backprojection (CB-FBP) algorithm is proposed for image reconstruction in volumetric CT under helical source trajectory. In addition to its dependence on view and fan angles, the 3D weighting utilizes the cone angle dependency of a ray to improve reconstruction accuracy. The 3D weighting is ray-dependent and the underlying mechanism is to give a favourable weight to the ray with the smaller cone angle out of a pair of conjugate rays but an unfavourable weight to the ray with the larger cone angle out of the conjugate ray pair. The proposed 3D-weighted helical CB-FBP reconstruction algorithm is implemented in the cone-parallel geometry that can improve noise uniformity and image generation speed significantly. Under the cone-parallel geometry, the filtering is naturally carried out along the tangential direction of the helical source trajectory. By exploring the 3D weighting's dependence on cone angle, the proposed helical 3D-weighted CB-FBP reconstruction algorithm can provide significantly improved reconstruction accuracy at moderate cone angle and high helical pitches. The 3D-weighted CB-FBP algorithm is experimentally evaluated by computer-simulated phantoms and phantoms scanned by a diagnostic volumetric CT system with a detector dimension of 64 x 0.625 mm over various helical pitches. The computer simulation study shows that the 3D weighting enables the proposed algorithm to reach reconstruction accuracy comparable to that of exact CB reconstruction algorithms, such as the Katsevich algorithm, under a moderate cone angle (4 deg.) and various helical pitches. Meanwhile, the experimental evaluation using the phantoms scanned by a volumetric CT system shows that the spatial resolution along the z-direction and noise characteristics of the proposed 3D-weighted helical CB-FBP reconstruction algorithm are maintained very well in comparison to the FDK

  10. Comparison of applied dose and image quality in staging CT of neuroendocrine tumor patients using standard filtered back projection and adaptive statistical iterative reconstruction

    International Nuclear Information System (INIS)

    Böning, G.; Schäfer, M.; Grupp, U.; Kaul, D.; Kahn, J.; Pavel, M.; Maurer, M.; Denecke, T.; Hamm, B.; Streitparth, F.

    2015-01-01

    Highlights: • Iterative reconstruction (IR) in staging CT provides equal objective image quality compared to filtered back projection (FBP). • IR delivers excellent subjective quality and reduces effective dose compared to FBP. • In patients with neuroendocrine tumor (NET) or many other hypervascular abdominal tumors, IR can be used without sacrificing diagnostic confidence. - Abstract: Objective: To investigate whether dose reduction via adaptive statistical iterative reconstruction (ASIR) affects image quality and diagnostic accuracy in neuroendocrine tumor (NET) staging. Methods: A total of 28 NET patients were enrolled in the study. Inclusion criteria were histologically proven NET and visible tumor in abdominal computed tomography (CT). In an intraindividual study design, the patients underwent a baseline CT (filtered back projection, FBP) and follow-up CT (ASIR 40%) using matched scan parameters. Image quality was assessed subjectively using a 5-grade scoring system and objectively by determining signal-to-noise ratio (SNR) and contrast-to-noise ratios (CNRs). The applied volume computed tomography dose index (CTDIvol) of each scan was taken from the dose report. Results: ASIR 40% significantly reduced CTDIvol (10.17 ± 3.06 mGy [FBP], 6.34 ± 2.25 mGy [ASIR]) (p < 0.001) by 37.6% and significantly increased CNRs (complete tumor-to-liver, 2.76 ± 1.87 [FBP], 3.2 ± 2.32 [ASIR]) (p < 0.05) (complete tumor-to-muscle, 2.74 ± 2.67 [FBP], 4.31 ± 4.61 [ASIR]) (p < 0.05) compared to FBP. Subjective scoring revealed no significant changes for diagnostic confidence (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]), visibility of suspicious lesion (4.8 ± 0.5 [FBP], 4.8 ± 0.5 [ASIR]) and artifacts (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]). ASIR 40% significantly decreased scores for noise (4.3 ± 0.6 [FBP], 4.0 ± 0.8 [ASIR]) (p < 0.05), contrast (4.4 ± 0.6 [FBP], 4.1 ± 0.8 [ASIR]) (p < 0.001) and visibility of small structures (4.5 ± 0.7 [FBP], 4.3 ± 0.8 [ASIR]) (p < 0

  11. Comparison of applied dose and image quality in staging CT of neuroendocrine tumor patients using standard filtered back projection and adaptive statistical iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Böning, G., E-mail: georg.boening@charite.de [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Schäfer, M.; Grupp, U. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Kaul, D. [Department of Radiation Oncology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Kahn, J. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Pavel, M. [Department of Gastroenterology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany); Maurer, M.; Denecke, T.; Hamm, B.; Streitparth, F. [Department of Radiology, Charité, Humboldt-University Medical School, Charitéplatz 1, 10117 Berlin (Germany)

    2015-08-15

    Highlights: • Iterative reconstruction (IR) in staging CT provides equal objective image quality compared to filtered back projection (FBP). • IR delivers excellent subjective quality and reduces effective dose compared to FBP. • In patients with neuroendocrine tumor (NET) or many other hypervascular abdominal tumors, IR can be used without sacrificing diagnostic confidence. - Abstract: Objective: To investigate whether dose reduction via adaptive statistical iterative reconstruction (ASIR) affects image quality and diagnostic accuracy in neuroendocrine tumor (NET) staging. Methods: A total of 28 NET patients were enrolled in the study. Inclusion criteria were histologically proven NET and visible tumor in abdominal computed tomography (CT). In an intraindividual study design, the patients underwent a baseline CT (filtered back projection, FBP) and follow-up CT (ASIR 40%) using matched scan parameters. Image quality was assessed subjectively using a 5-grade scoring system and objectively by determining signal-to-noise ratio (SNR) and contrast-to-noise ratios (CNRs). The applied volume computed tomography dose index (CTDIvol) of each scan was taken from the dose report. Results: ASIR 40% significantly reduced CTDIvol (10.17 ± 3.06 mGy [FBP], 6.34 ± 2.25 mGy [ASIR]) (p < 0.001) by 37.6% and significantly increased CNRs (complete tumor-to-liver, 2.76 ± 1.87 [FBP], 3.2 ± 2.32 [ASIR]) (p < 0.05) (complete tumor-to-muscle, 2.74 ± 2.67 [FBP], 4.31 ± 4.61 [ASIR]) (p < 0.05) compared to FBP. Subjective scoring revealed no significant changes for diagnostic confidence (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]), visibility of suspicious lesion (4.8 ± 0.5 [FBP], 4.8 ± 0.5 [ASIR]) and artifacts (5.0 ± 0 [FBP], 5.0 ± 0 [ASIR]). ASIR 40% significantly decreased scores for noise (4.3 ± 0.6 [FBP], 4.0 ± 0.8 [ASIR]) (p < 0.05), contrast (4.4 ± 0.6 [FBP], 4.1 ± 0.8 [ASIR]) (p < 0.001) and visibility of small structures (4.5 ± 0.7 [FBP], 4.3 ± 0.8 [ASIR]) (p < 0

  12. CT urography in the urinary bladder: To compare excretory phase images using a low noise index and a high noise index with adaptive noise reduction filter

    International Nuclear Information System (INIS)

    Takeyama, Nobuyuki; Hayashi, Takaki; Ohgiya, Yoshimitsu

    2011-01-01

    Background: Although CT urography (CTU) is widely used for the evaluation of the entire urinary tract, the most important drawback is the radiation exposure. Purpose: To evaluate the effect of a noise reduction filter (NRF) using a phantom and to quantitatively and qualitatively compare excretory phase (EP) images using a low noise index (NI) with those using a high NI and postprocessing NRF (pNRF). Material and Methods: Each NI value was defined for a slice thickness of 5 mm, and reconstructed images with a slice thickness of 1.25 mm were assessed. Sixty patients who were at high risk of developing bladder tumors (BT) were divided into two groups according to whether their EP images were obtained using an NI of 9.88 (29 patients; group A) or an NI of 20 and pNRF (31 patients; group B). The CT dose index volume (CTDIvol) and the contrast-to-noise ratio (CNR) of the bladder with respect to the anterior pelvic fat were compared in both groups. Qualitative assessment of the urinary bladder for image noise, sharpness, streak artifacts, homogeneity, and the conspicuity of polypoid or sessile-shaped BTs with a short-axis diameter greater than 10 mm was performed using a 3-point scale. Results: The phantom study showed noise reduction of approximately 40% and 76% dose reduction between group A and group B. CTDIvol demonstrated a 73% reduction in group B (4.6 ± 1.1 mGy) compared with group A (16.9 ± 3.4 mGy). The CNR value was not significantly different (P = 0.60) between group A (16.1 ± 5.1) and group B (16.6 ± 7.6). Although group A was superior (P < 0.01) to group B with regard to image noise, other qualitative analyses did not show significant differences. Conclusion: EP images using a high NI and pNRF were quantitatively and qualitatively comparable to those using a low NI, except with regard to image noise

  13. Ex-PRESS glaucoma filter: an MRI compatible metallic orbital foreign body imaged at 1.5 and 3 T

    International Nuclear Information System (INIS)

    Mabray, M.C.; Uzelac, A.; Talbott, J.F.; Lin, S.C.; Gean, A.D.

    2015-01-01

    Aim: To report on the MRI compatibility of the Ex-PRESS glaucoma filtration device, a tiny metallic implant placed into the anterior chamber of the eye that is much smaller than traditional glaucoma shunts, and to educate the radiology community regarding its appearance. Materials and methods: Seven patients with Ex-PRESS glaucoma filtration devices were identified that had undergone MRI at San Francisco General Hospital/University of California San Francisco Medical Center by searching and cross-referencing the radiology reporting system and the electronic medical record. MRI images were reviewed for artefact interfering with interpretation. Ophthalmology examinations were reviewed for evidence of complications. Results: Eighteen individual MRI examinations were performed during 12 unique MRI events on these 7 patients. 13/18 individual MRI examinations and 7/12 MRI events were performed at 3 T with the others performed at 1.5 T. Mean time from Ex-PRESS implantation to MRI was 17.5 months. Mean time from MRI to first ophthalmology examination was 1.1 months and from MRI to latest ophthalmology examination was 6.6 months. Susceptibility artefact did not interfere with image interpretation and no complications related to MRI were encountered. Conclusion: The Ex-PRESS glaucoma filtration device appears to be safe for MRI at 1.5 and 3 T and does not produce significant susceptibility artefact to affect diagnostic interpretation adversely. - Highlights: • The Ex-PRESS glaucoma filtration device is a tiny metallic orbital implant. • It can simulate a metallic orbital foreign body on imaging. • There is little information in the literature about its MRI safety. • We report 18 MRIs performed on 7 patients including the first at 3 T. • Imaging appears to be safe at 1.5 and 3 T in patients with this device

  14. Switching non-local vector median filter

    Science.gov (United States)

    Matsuoka, Jyohei; Koga, Takanori; Suetake, Noriaki; Uchino, Eiji

    2016-04-01

    This paper describes a novel image filtering method that removes random-valued impulse noise superimposed on a natural color image. In impulse noise removal, it is essential to employ a switching-type filtering method, as used in the well-known switching median filter, to preserve the detail of an original image with good quality. In color image filtering, it is generally preferable to deal with the red (R), green (G), and blue (B) components of each pixel of a color image as elements of a vectorized signal, as in the well-known vector median filter, rather than as component-wise signals to prevent a color shift after filtering. By taking these fundamentals into consideration, we propose a switching-type vector median filter with non-local processing that mainly consists of a noise detector and a noise removal filter. Concretely, we propose a noise detector that proactively detects noise-corrupted pixels by focusing attention on the isolation tendencies of pixels of interest not in an input image but in difference images between RGB components. Furthermore, as the noise removal filter, we propose an extended version of the non-local median filter that we proposed previously for grayscale image processing, named the non-local vector median filter, which is designed for color image processing. The proposed method realizes a superior balance between the preservation of detail and impulse noise removal by proactive noise detection and non-local switching vector median filtering, respectively. The effectiveness and validity of the proposed method are verified in a series of experiments using natural color images.
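
    For reference, a plain (non-switching) vector median filter treats each RGB pixel as a vector and replaces it with the window vector that minimises the summed Euclidean distance to all others. The sketch below shows only this baseline; the proposed method additionally adds the proactive noise detector and non-local switching.

```python
import numpy as np

def vector_median_filter(img, radius=1):
    """Baseline vector median filter for an RGB image (no switching, no non-local search)."""
    h, w, _ = img.shape
    out = np.empty_like(img, dtype=np.float64)
    pad = np.pad(img.astype(np.float64),
                 ((radius, radius), (radius, radius), (0, 0)), mode="edge")
    for y in range(h):
        for x in range(w):
            win = pad[y:y + 2 * radius + 1, x:x + 2 * radius + 1].reshape(-1, 3)
            # summed Euclidean distance from each colour vector to all others
            dists = np.linalg.norm(win[:, None, :] - win[None, :, :], axis=2).sum(axis=1)
            out[y, x] = win[np.argmin(dists)]
    return out.astype(img.dtype)
```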

  15. Noise reduction with complex bilateral filter.

    Science.gov (United States)

    Matsumoto, Mitsuharu

    2017-12-01

    This study introduces a noise reduction technique that uses a complex bilateral filter. A bilateral filter is a nonlinear filter originally developed for images that can reduce noise while preserving edge information. It is an attractive filter and has been used in many applications in image processing. When it is applied to an acoustical signal, small-amplitude noise is reduced while the speech signal is preserved. However, a bilateral filter cannot handle noise with relatively large amplitudes owing to its innate characteristics. In this study, the noisy signal is transformed into the time-frequency domain and the filter is improved to handle complex spectra. The high-amplitude noise is reduced in the time-frequency domain via the proposed filter. The features and the potential of the proposed filter are also confirmed through experiments.

  16. Filter This

    Directory of Open Access Journals (Sweden)

    Audrey Barbakoff

    2011-03-01

    Full Text Available In the Library with the Lead Pipe welcomes Audrey Barbakoff, a librarian at the Milwaukee Public Library, and Ahniwa Ferrari, Virtual Experience Manager at the Pierce County Library System in Washington, for a point-counterpoint piece on filtering in libraries. The opinions expressed here are those of the authors, and are not endorsed by their employers. [...

  17. Identifying differences in brain activities and an accurate detection of autism spectrum disorder using resting state functional-magnetic resonance imaging : A spatial filtering approach.

    Science.gov (United States)

    Subbaraju, Vigneshwaran; Suresh, Mahanand Belathur; Sundaram, Suresh; Narasimhan, Sundararajan

    2017-01-01

    This paper presents a new approach for detecting major differences in brain activities between Autism Spectrum Disorder (ASD) patients and neurotypical subjects using resting-state fMRI. Further, the method also extracts discriminative features for an accurate diagnosis of ASD. The proposed approach determines a spatial filter that projects the covariance matrices of the Blood Oxygen Level Dependent (BOLD) time-series signals from both the ASD patients and neurotypical subjects in orthogonal directions such that they are highly separable. The inverse of this filter also provides a spatial pattern map within the brain that highlights those regions responsible for the distinguishable activities between the ASD patients and neurotypical subjects. For a better classification, highly discriminative log-variance features providing the maximum separation between the two classes are extracted from the projected BOLD time-series data. A detailed study has been carried out using the publicly available data from the Autism Brain Imaging Data Exchange (ABIDE) consortium for the different gender and age-groups. The study results indicate that for all the above categories, the regional differences in resting state activities are more commonly found in the right hemisphere compared to the left hemisphere of the brain. Among males, a clear shift in activities to the prefrontal cortex is observed for ASD patients while other parts of the brain show diminished activities compared to neurotypical subjects. Among females, such a clear shift is not evident; however, several regions, especially in the posterior and medial portions of the brain show diminished activities due to ASD. Finally, the classification performance obtained using the log-variance features is found to be better when compared to earlier studies in the literature. Copyright © 2016 Elsevier B.V. All rights reserved.
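
    The discriminative spatial filtering and log-variance features described above resemble a common spatial patterns (CSP) style construction. The following is a generic sketch under that assumption (class-averaged covariance matrices of regional BOLD time series as inputs, possibly regularised), not the authors' exact formulation.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(cov_class_a, cov_class_b, n_pairs=3):
    """Spatial filters from a generalized eigenvalue problem: directions that
    maximise variance for one class while minimising it for the other."""
    vals, vecs = eigh(cov_class_a, cov_class_a + cov_class_b)
    order = np.argsort(vals)
    keep = np.r_[order[:n_pairs], order[-n_pairs:]]   # both ends of the spectrum
    return vecs[:, keep]                              # columns are spatial filters

def log_variance_features(bold, filters):
    """Project regional time series (regions x time) and take normalised log-variance."""
    projected = filters.T @ bold
    var = projected.var(axis=1)
    return np.log(var / var.sum())
```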

  18. Ex-PRESS glaucoma filter: an MRI compatible metallic orbital foreign body imaged at 1.5 and 3T.

    Science.gov (United States)

    Mabray, M C; Uzelac, A; Talbott, J F; Lin, S C; Gean, A D

    2015-05-01

    To report on the MRI compatibility of the Ex-PRESS glaucoma filtration device, a tiny metallic implant placed into the anterior chamber of the eye that is much smaller than traditional glaucoma shunts, and to educate the radiology community regarding its appearance. Seven patients with Ex-PRESS glaucoma filtration devices were identified that had undergone MRI at San Francisco General Hospital/University of California San Francisco Medical Center by searching and cross-referencing the radiology reporting system and the electronic medical record. MRI images were reviewed for artefact interfering with interpretation. Ophthalmology examinations were reviewed for evidence of complications. Eighteen individual MRI examinations were performed during 12 unique MRI events on these 7 patients. 13/18 individual MRI examinations and 7/12 MRI events were performed at 3 T with the others performed at 1.5 T. Mean time from Ex-PRESS implantation to MRI was 17.5 months. Mean time from MRI to first ophthalmology examination was 1.1 months and from MRI to latest ophthalmology examination was 6.6 months. Susceptibility artefact did not interfere with image interpretation and no complications related to MRI were encountered. The Ex-PRESS glaucoma filtration device appears to be safe for MRI at 1.5 and 3 T and does not produce significant susceptibility artefact to affect diagnostic interpretation adversely. Copyright © 2015 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  19. CHANGE DETECTION VIA SELECTIVE GUIDED CONTRASTING FILTERS

    Directory of Open Access Journals (Sweden)

    Y. V. Vizilter

    2017-05-01

    A change detection scheme based on guided contrasting was previously proposed. A guided contrasting filter takes two images (test and sample) as input and forms the output as a filtered version of the test image. Such a filter preserves the similar details and smooths the non-similar details of the test image with respect to the sample image. Due to this, the difference between the test image and its filtered version (difference map) can serve as a basis for robust change detection. Guided contrasting is performed in two steps: at the first step, some smoothing operator (SO) is applied to eliminate test image details; at the second step, all matched details are restored with local contrast proportional to the value of some local similarity coefficient (LSC). The original guided contrasting filter was based on local average smoothing as SO and local linear correlation as LSC. In this paper we propose and implement a new set of selective guided contrasting filters based on different combinations of various SO and thresholded LSC. Linear average and Gaussian smoothing, nonlinear median filtering, and morphological opening and closing are considered as SO. The local linear correlation coefficient, morphological correlation coefficient (MCC), mutual information, mean square MCC and geometrical correlation coefficients are applied as LSC. Thresholding of LSC allows operating with non-normalized LSC and enhances the selective properties of guided contrasting filters: details are either totally recovered or not recovered at all after the smoothing. These different guided contrasting filters are tested as a part of the previously proposed change detection pipeline, which contains the following stages: guided contrasting filtering on an image pyramid, calculation of the difference map, binarization, extraction of change proposals, and testing of change proposals using local MCC. Experiments on real and simulated image bases demonstrate the applicability of all proposed selective guided contrasting filters. All
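    A minimal sketch of the two-step guided contrasting idea described above, assuming local-mean smoothing as the SO and local linear correlation as the LSC; the thresholded variant corresponds to the selective filters proposed in the paper. Window size and threshold are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_contrasting(test, sample, size=7, threshold=None):
    """Guided contrasting sketch: smooth the test image, then restore details
    that match the sample image with contrast proportional to the local
    similarity coefficient (selective variant when a threshold is given)."""
    mu_t = uniform_filter(test.astype(float), size)
    mu_s = uniform_filter(sample.astype(float), size)
    dt, ds = test - mu_t, sample - mu_s
    var_t = uniform_filter(dt * dt, size)
    var_s = uniform_filter(ds * ds, size)
    cov = uniform_filter(dt * ds, size)
    corr = cov / np.sqrt(var_t * var_s + 1e-12)       # local similarity coefficient
    if threshold is not None:                          # selective variant:
        corr = (corr >= threshold).astype(float)       # details fully recovered or not at all
    else:
        corr = np.clip(corr, 0.0, 1.0)
    return mu_t + corr * dt                            # restore matched details only
```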

  20. Union operation image processing of data cubes separately processed by different objective filters and its application to void analysis in an all-solid-state lithium-ion battery.

    Science.gov (United States)

    Yamamoto, Yuta; Iriyama, Yasutoshi; Muto, Shunsuke

    2016-04-01

    In this article, we propose a smart image-analysis method suitable for extracting target features with hierarchical dimension from original data. The method was applied to three-dimensional volume data of an all-solid lithium-ion battery obtained by the automated sequential sample milling and imaging process using a focused ion beam/scanning electron microscope to investigate the spatial configuration of voids inside the battery. To automatically fully extract the shape and location of the voids, three types of filters were consecutively applied: a median blur filter to extract relatively larger voids, a morphological opening operation filter for small dot-shaped voids and a morphological closing operation filter for small voids with concave contrasts. Three data cubes separately processed by the above-mentioned filters were integrated by a union operation to the final unified volume data, which confirmed the correct extraction of the voids over the entire dimension contained in the original data. © The Author 2015. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
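    A minimal sketch of the union-operation workflow described above, assuming the three objective filters are a median blur, a grey-scale morphological opening, and a grey-scale morphological closing, each thresholded into a binary void mask before the union. The threshold values and the contrast conventions (large voids dark, dot-shaped residues bright) are assumptions, not the authors' exact settings.

```python
import numpy as np
from scipy import ndimage

def extract_voids(volume, dark_thresh, detail_thresh, median_size=5):
    """Process one 3D data cube with three filters, threshold each result into a
    binary void mask, and merge the masks with a logical OR (union operation)."""
    vol = np.asarray(volume, dtype=float)
    strel = np.ones((3, 3, 3), dtype=bool)
    # 1) median blur emphasizes relatively large, dark voids
    med = ndimage.median_filter(vol, size=median_size)
    large_voids = med < dark_thresh
    # 2) grey opening removes small bright dots; the residue marks dot-shaped features
    opened = ndimage.grey_opening(vol, footprint=strel)
    dot_voids = (vol - opened) > detail_thresh
    # 3) grey closing fills small dark features; the residue marks voids with concave contrast
    closed = ndimage.grey_closing(vol, footprint=strel)
    concave_voids = (closed - vol) > detail_thresh
    # union of the three separately processed data cubes
    return large_voids | dot_voids | concave_voids
```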

  1. Comparison of applied dose and image quality in staging CT of neuroendocrine tumor patients using standard filtered back projection and adaptive statistical iterative reconstruction.

    Science.gov (United States)

    Böning, G; Schäfer, M; Grupp, U; Kaul, D; Kahn, J; Pavel, M; Maurer, M; Denecke, T; Hamm, B; Streitparth, F

    2015-08-01

    To investigate whether dose reduction via adaptive statistical iterative reconstruction (ASIR) affects image quality and diagnostic accuracy in neuroendocrine tumor (NET) staging. A total of 28 NET patients were enrolled in the study. Inclusion criteria were histologically proven NET and visible tumor in abdominal computed tomography (CT). In an intraindividual study design, the patients underwent a baseline CT (filtered back projection, FBP) and follow-up CT (ASIR 40%) using matched scan parameters. Image quality was assessed subjectively using a 5-grade scoring system and objectively by determining signal-to-noise ratio (SNR) and contrast-to-noise ratios (CNRs). Applied volume computed tomography dose index (CTDIvol) of each scan was taken from the dose report. ASIR 40% significantly reduced CTDIvol (10.17±3.06mGy [FBP], 6.34±2.25mGy [ASIR] (pASIR]) (pASIR]) (pASIR]), visibility of suspicious lesion (4.8±0.5 [FBP], 4.8±0.5 [ASIR]) and artifacts (5.0±0 [FBP], 5.0±0 [ASIR]). ASIR 40% significantly decreased scores for noise (4.3±0.6 [FBP], 4.0±0.8 [ASIR]) (pASIR]) (pASIR]) (pASIR can be used to reduce radiation dose without sacrificing image quality and diagnostic confidence in staging CT of NET patients. This may be beneficial for patients with frequent follow-up and significant cumulative radiation exposure. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  2. MR angiography with a matched filter

    International Nuclear Information System (INIS)

    De Castro, J.B.; Riederer, S.J.; Lee, J.N.

    1987-01-01

    The technique of matched filtering was applied to a series of cine MR images. The filter was devised to yield a subtraction angiographic image in which direct current components present in the cine series are removed and the signal-to-noise ratio (S/N) of the vascular structures is optimized. The S/N of a matched filter was compared with that of a simple subtraction, in which an image with high flow is subtracted from one with low flow. Experimentally, a range of results from minimal improvement to significant (60%) improvement in S/N was seen in the comparisons of matched filtered subtraction with simple subtraction

  3. Preliminary validation of 2 magnetic resonance image scoring systems for osteoarthritis of the hip according to the OMERACT filter.

    Science.gov (United States)

    Maksymowych, Walter P; Cibere, Jolanda; Loeuille, Damien; Weber, Ulrich; Zubler, Veronika; Roemer, Frank W; Jaremko, Jacob L; Sayre, Eric C; Lambert, Robert G W

    2014-02-01

    Development of a validated magnetic resonance image (MRI) scoring system is essential in hip OA because radiographs are insensitive to change. We assessed the feasibility and reliability of 2 previously developed scoring methods: (1) the Hip Inflammation MRI Scoring System (HIMRISS) and (2) the Hip Osteoarthritis MRI Scoring System (HOAMS). Six readers (3 radiologists, 3 rheumatologists) participated in 2 reading exercises. In Reading Exercise 1, MRI scans of the hip of 20 subjects were read at a single time point, followed by further standardization of methodology. In Reading Exercise 2, MRI scans of the hip of 18 subjects from a randomized controlled trial, assessed at 2 timepoints, and 27 subjects from a cross-sectional study were read for HIMRISS and HOAMS bone marrow lesions (BML) and synovitis. Reliability was assessed using intraclass correlation coefficient (ICC) and kappa statistics. Both methods were considered feasible. For Reading 1, HIMRISS ICC were 0.52, 0.61, 0.70, and 0.58 for femoral BML, acetabular BML, effusion, and total scores, respectively; for HOAMS, summed BML and synovitis ICC were 0.52 and 0.46, respectively. For Reading 2, HIMRISS and HOAMS ICC for BML and synovitis-effusion improved substantially. Interobserver reliability for change scores was 0.81 and 0.71 for HIMRISS femoral and HOAMS summed BML, respectively. Responsiveness and discrimination were moderate to high for synovitis-effusion. Significant associations were noted between BML or synovitis scores and Western Ontario and McMaster Universities Osteoarthritis Index pain scores for baseline values (p ≤ 0.001). The BML and synovitis-effusion components of both the HIMRISS and HOAMS scoring systems are feasible and reliable, and should be validated further.

  4. Deposition of Aerosol Particles in Electrically Charged Membrane Filters

    Energy Technology Data Exchange (ETDEWEB)

    Stroem, L

    1972-05-15

    A theory for the influence of electric charge on particle deposition on the surface of charged filters has been developed. It has been tested experimentally on ordinary membrane filters and Nuclepore filters of 8 µm pore size, with a bipolar monodisperse test aerosol of 1 µm particle diameter, and at a filter charge of up to 20 µC/m². Agreement with theory was obtained for the Coulomb force between filter and particle for both kinds of filters. The image force between the charged filter and neutral particles did not result in the predicted deposition in the ordinary membrane filter, probably because of a lack of correspondence between the filter model employed in the theory and the real filter. For the Nuclepore filter, satisfactory agreement with theory was obtained, also for the image interaction.

  5. A comparison of filters for thoracic diagnosis

    International Nuclear Information System (INIS)

    Oestmann, J.W.; Hendrickx, P.; Rieder, P.; Geerlings, H.; Medizinische Hochschule Hannover

    1986-01-01

    The effect of three types of filter on the quality of radiographs of the chest was compared. These filters improve visualization of mediastinal structures without significantly reducing the quality of the pulmonary image. In practice the Du Pont filter proved best; the quality of the central and peripheral portions of the lung image is equal to that of an ordinary radiograph and visualization of the mediastinum is improved. The Agfa-Gevaert filter showed no significant disadvantages compared with the ordinary techniques but the improvement in mediastinal visualization is not that marked. The 3M-filter yields poor images of the central portions of the lung and its type of construction prevents the retrocardiac structures from being pictured as well as with the other filters. (orig.)

  6. Matched-Filter Thermography

    Directory of Open Access Journals (Sweden)

    Nima Tabatabaei

    2018-04-01

    Conventional infrared thermography techniques, including pulsed and lock-in thermography, have shown great potential for non-destructive evaluation of a broad spectrum of materials, spanning from metals to polymers to biological tissues. However, the performance of these techniques is often limited by the diffuse nature of thermal wave fields, resulting in an inherent compromise between inspection depth and depth resolution. Recently, matched-filter thermography has been introduced as a means of overcoming this classic limitation to enable depth-resolved subsurface thermal imaging and improve axial/depth resolution. This paper reviews the basic principles and experimental results of matched-filter thermography: first, mathematical and signal processing concepts related to matched filtering and pulse compression are discussed. Next, theoretical modeling of thermal-wave responses to matched-filter thermography using two categories of pulse compression techniques (linear frequency modulation and binary phase coding) is reviewed. Key experimental results from the literature demonstrating the maintenance of axial resolution while inspecting deep into opaque and turbid media are also presented and discussed. Finally, the concept of thermal coherence tomography for deconvolution of thermal responses of axially superposed sources and creation of depth-selective images in a diffusion-wave field is reviewed.
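    A minimal sketch of the linear-frequency-modulation pulse compression mentioned above: the measured response is cross-correlated with the known excitation chirp, which is the matched-filter operation. The chirp parameters and sampling rate here are arbitrary placeholders, not values from the paper.

```python
import numpy as np
from scipy.signal import chirp, correlate

def lfm_pulse_compression(response, fs=1000.0, duration=1.0, f0=0.1, f1=10.0):
    """Matched filtering by cross-correlation with a linear-frequency-modulated
    reference chirp (pulse compression). 'response' is the sampled signal."""
    t = np.arange(0, duration, 1.0 / fs)
    reference = chirp(t, f0=f0, t1=duration, f1=f1, method='linear')
    # Correlating with the reference is equivalent to filtering with its
    # time-reversed copy, i.e. the matched filter.
    compressed = correlate(np.asarray(response, float), reference, mode='same')
    return compressed / np.linalg.norm(reference)
```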

  7. Bag filters

    Energy Technology Data Exchange (ETDEWEB)

    Yoshida, M; Komeda, I; Takizaki, K

    1982-01-01

    Bag filters are widely used throughout the cement industry for recovering raw materials and products and for improving the environment. Their general mechanism, performance and advantages are shown in a classification table, and there are comparisons and explanations. The outer and inner sectional construction of the Shinto ultra-jet collector for pulverized coal is illustrated and there are detailed descriptions of dust cloud prevention, of measures used against possible sources of ignition, of oxygen supply and of other topics. Finally, explanations are given of matters that require careful and comprehensive study when selecting equipment.

  8. Digital filters

    CERN Document Server

    Hamming, Richard W

    1997-01-01

    Digital signals occur in an increasing number of applications: in telephone communications; in radio, television, and stereo sound systems; and in spacecraft transmissions, to name just a few. This introductory text examines digital filtering, the processes of smoothing, predicting, differentiating, integrating, and separating signals, as well as the removal of noise from a signal. The processes bear particular relevance to computer applications, one of the focuses of this book.Readers will find Hamming's analysis accessible and engaging, in recognition of the fact that many people with the s

  9. Dynamic beam filtering for miscentered patients.

    Science.gov (United States)

    Mao, Andrew; Shyr, William; Gang, Grace J; Stayman, J Webster

    2018-02-01

    Accurate centering of the patient within the bore of a CT scanner takes time and is often difficult to achieve precisely. Patient miscentering can result in significant dose and image noise penalties with the use of traditional bowtie filters. This work describes a system to dynamically position an x-ray beam filter during image acquisition to enable more consistent image performance and potentially lower dose needed for CT imaging. We propose a new approach in which two orthogonal low-dose scout images are used to estimate a parametric model of the object describing its shape, size, and location within the field of view (FOV). This model is then used to compute an optimal filter motion profile by minimizing the variance of the expected detector fluence for each projection. Dynamic filtration was implemented on a cone-beam CT (CBCT) test bench using two different physical filters: 1) an aluminum bowtie and 2) a structured binary filter called a multiple aperture device (MAD). Dynamic filtration performance was compared to a static filter in studies of dose and reconstruction noise as a function of the degree of miscentering of a homogeneous water phantom. Estimated filter trajectories were found to be largely sinusoidal with an amplitude proportional to the amount of miscentering. Dynamic filtration demonstrated an improved ability to keep the spatial distribution of dose and reconstruction noise at baseline levels across varying levels of miscentering, reducing the maximum noise and dose deviation from 53% to 15% and 42% to 14% respectively for the bowtie filter, and 25% to 8% and 24% to 15% respectively for the MAD filter. Dynamic positioning of beam filters during acquisition improves dose utilization and image quality over static filters for miscentered patients. Such dynamic filters relax positioning requirements and have the potential to reduce set-up time and lower dose requirements.
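    The abstract reports that the estimated filter trajectories are largely sinusoidal with amplitude proportional to the miscentering. A minimal sketch of that observation, assuming the object's centre offset has been estimated from the two orthogonal scouts; the actual optimization (minimizing the variance of the expected detector fluence per projection) is not reproduced here.

```python
import numpy as np

def filter_trajectory(dx, dy, angles):
    """Approximate dynamic-filtration motion profile: shift the beam filter to
    track the projection of the object centre offset (dx, dy) onto the detector
    axis at each gantry angle (radians). Yields a sinusoidal trajectory whose
    amplitude grows with the amount of miscentering."""
    angles = np.asarray(angles, dtype=float)
    return dx * np.cos(angles) + dy * np.sin(angles)
```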

  10. Graphic filter library implemented in CUDA language

    OpenAIRE

    Peroutková, Hedvika

    2009-01-01

    This thesis deals with the problem of reducing computation time of raster image processing by parallel computing on graphics processing unit. Raster image processing thereby refers to the application of graphic filters, which can be applied in sequence with different settings. This thesis evaluates the suitability of using parallelization on graphic card for raster image adjustments based on multicriterial choice. Filters are implemented for graphics processing unit in CUDA language. Opacity ...

  11. Filtering algorithm for dotted interferences

    Energy Technology Data Exchange (ETDEWEB)

    Osterloh, K., E-mail: kurt.osterloh@bam.de [Federal Institute for Materials Research and Testing (BAM), Division VIII.3, Radiological Methods, Unter den Eichen 87, 12205 Berlin (Germany); Buecherl, T.; Lierse von Gostomski, Ch. [Technische Universitaet Muenchen, Lehrstuhl fuer Radiochemie, Walther-Meissner-Str. 3, 85748 Garching (Germany); Zscherpel, U.; Ewert, U. [Federal Institute for Materials Research and Testing (BAM), Division VIII.3, Radiological Methods, Unter den Eichen 87, 12205 Berlin (Germany); Bock, S. [Technische Universitaet Muenchen, Lehrstuhl fuer Radiochemie, Walther-Meissner-Str. 3, 85748 Garching (Germany)

    2011-09-21

    An algorithm has been developed to reliably remove dotted interferences impairing the perceptibility of objects within a radiographic image. This is a particular challenge encountered with neutron radiographs collected at the NECTAR facility, Forschungs-Neutronenquelle Heinz Maier-Leibnitz (FRM II): the resulting images are dominated by features resembling a snow flurry. These artefacts are caused by scattered neutrons, gamma radiation, cosmic radiation, etc., all hitting the detector CCD directly in spite of a sophisticated shielding. This makes such images rather useless for further direct evaluation. One approach to this problem of random effects would be to collect a vast number of single images, to combine them appropriately and to process them with common image filtering procedures. However, it has been shown that, e.g., median filtering, depending on the kernel size in the plane and/or the number of single shots to be combined, is either insufficient or tends to blur sharp lined structures. This inevitably makes visually controlled, image-by-image processing unavoidable. Particularly in tomographic studies, it would be far too tedious to treat each single projection in this way. Alternatively, it would be not only more convenient but in many cases the only reasonable approach to filter a stack of images in a batch procedure to get rid of the disturbing interferences. The algorithm presented here meets all these requirements. It reliably frees the images from the snowy pattern described above without the loss of fine structures and without a general blurring of the image. It consists of an iterative, parameter-free filtering algorithm suitable for batch processing, aiming to eliminate the often complex interfering artefacts while leaving the original information untouched as far as possible.
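    A minimal sketch in the spirit of the approach described above: iteratively detect pixels that deviate strongly from their local median (relative to the local spread), replace only those pixels, and leave the rest of the image untouched. The deviation factor and window size are assumptions; the authors' algorithm is described as parameter-free.

```python
import numpy as np
from scipy.ndimage import median_filter

def remove_dotted_interference(image, k=5.0, size=3, max_iter=20):
    """Iterative spot removal: replace only outlier pixels by their local median
    and repeat until no outliers remain, preserving fine structures elsewhere."""
    out = image.astype(float).copy()
    for _ in range(max_iter):
        med = median_filter(out, size=size)
        dev = np.abs(out - med)
        spread = median_filter(dev, size=size) + 1e-12   # robust local scale estimate
        outliers = dev > k * spread
        if not outliers.any():
            break
        out[outliers] = med[outliers]                    # touch only the spot pixels
    return out
```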

  12. Filtering algorithm for dotted interferences

    International Nuclear Information System (INIS)

    Osterloh, K.; Buecherl, T.; Lierse von Gostomski, Ch.; Zscherpel, U.; Ewert, U.; Bock, S.

    2011-01-01

    An algorithm has been developed to reliably remove dotted interferences impairing the perceptibility of objects within a radiographic image. This is a particular challenge encountered with neutron radiographs collected at the NECTAR facility, Forschungs-Neutronenquelle Heinz Maier-Leibnitz (FRM II): the resulting images are dominated by features resembling a snow flurry. These artefacts are caused by scattered neutrons, gamma radiation, cosmic radiation, etc., all hitting the detector CCD directly in spite of a sophisticated shielding. This makes such images rather useless for further direct evaluation. One approach to this problem of random effects would be to collect a vast number of single images, to combine them appropriately and to process them with common image filtering procedures. However, it has been shown that, e.g., median filtering, depending on the kernel size in the plane and/or the number of single shots to be combined, is either insufficient or tends to blur sharp lined structures. This inevitably makes visually controlled, image-by-image processing unavoidable. Particularly in tomographic studies, it would be far too tedious to treat each single projection in this way. Alternatively, it would be not only more convenient but in many cases the only reasonable approach to filter a stack of images in a batch procedure to get rid of the disturbing interferences. The algorithm presented here meets all these requirements. It reliably frees the images from the snowy pattern described above without the loss of fine structures and without a general blurring of the image. It consists of an iterative, parameter-free filtering algorithm suitable for batch processing, aiming to eliminate the often complex interfering artefacts while leaving the original information untouched as far as possible.

  13. Clinical evaluation of image quality and radiation dose reduction in upper abdominal computed tomography using model-based iterative reconstruction; comparison with filtered back projection and adaptive statistical iterative reconstruction

    International Nuclear Information System (INIS)

    Nakamoto, Atsushi; Kim, Tonsok; Hori, Masatoshi; Onishi, Hiromitsu; Tsuboyama, Takahiro; Sakane, Makoto; Tatsumi, Mitsuaki; Tomiyama, Noriyuki

    2015-01-01

    Highlights: • MBIR significantly improves objective image quality. • MBIR reduces the radiation dose by 87.5% without increasing objective image noise. • A half dose will be needed to maintain the subjective image quality. - Abstract: Purpose: To evaluate the image quality of upper abdominal CT images reconstructed with model-based iterative reconstruction (MBIR) in comparison with filtered back projection (FBP) and adaptive statistical iterative reconstruction (ASIR) on scans acquired with various radiation exposure dose protocols. Materials and methods: This prospective study was approved by our institutional review board, and informed consent was obtained from all 90 patients who underwent both control-dose (CD) and reduced-dose (RD) CT of the upper abdomen (unenhanced: n = 45, contrast-enhanced: n = 45). The RD scan protocol was randomly selected from three protocols; Protocol A: 12.5% dose, Protocol B: 25% dose, Protocol C: 50% dose. Objective image noise, signal-to-noise (SNR) ratio for the liver parenchyma, visual image score and lesion conspicuity were compared among CD images of FBP and RD images of FBP, ASIR and MBIR. Results: RD images of MBIR yielded significantly lower objective image noise and higher SNR compared with RD images of FBP and ASIR for all protocols (P < .01) and CD images of FBP for Protocol C (P < .05). Although the subjective image quality of RD images of MBIR was almost acceptable for Protocol C, it was inferior to that of CD images of FBP for Protocols A and B (P < .0083). The conspicuity of the small lesions in RD images of MBIR tended to be superior to that in RD images of FBP and ASIR and inferior to that in CD images for Protocols A and B, although the differences were not significant (P > .0083). Conclusion: Although 12.5%-dose MBIR images (mean size-specific dose estimates [SSDE] of 1.13 mGy) yielded objective image noise and SNR comparable to CD-FBP images, at least a 50% dose (mean SSDE of 4.63 mGy) would be needed to

  14. Clinical evaluation of image quality and radiation dose reduction in upper abdominal computed tomography using model-based iterative reconstruction; comparison with filtered back projection and adaptive statistical iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Nakamoto, Atsushi, E-mail: a-nakamoto@radiol.med.osaka-u.ac.jp; Kim, Tonsok, E-mail: kim@radiol.med.osaka-u.ac.jp; Hori, Masatoshi, E-mail: mhori@radiol.med.osaka-u.ac.jp; Onishi, Hiromitsu, E-mail: h-onishi@radiol.med.osaka-u.ac.jp; Tsuboyama, Takahiro, E-mail: t-tsuboyama@radiol.med.osaka-u.ac.jp; Sakane, Makoto, E-mail: m-sakane@radiol.med.osaka-u.ac.jp; Tatsumi, Mitsuaki, E-mail: m-tatsumi@radiol.med.osaka-u.ac.jp; Tomiyama, Noriyuki, E-mail: tomiyama@radiol.med.osaka-u.ac.jp

    2015-09-15

    Highlights: • MBIR significantly improves objective image quality. • MBIR reduces the radiation dose by 87.5% without increasing objective image noise. • A half dose will be needed to maintain the subjective image quality. - Abstract: Purpose: To evaluate the image quality of upper abdominal CT images reconstructed with model-based iterative reconstruction (MBIR) in comparison with filtered back projection (FBP) and adaptive statistical iterative reconstruction (ASIR) on scans acquired with various radiation exposure dose protocols. Materials and methods: This prospective study was approved by our institutional review board, and informed consent was obtained from all 90 patients who underwent both control-dose (CD) and reduced-dose (RD) CT of the upper abdomen (unenhanced: n = 45, contrast-enhanced: n = 45). The RD scan protocol was randomly selected from three protocols; Protocol A: 12.5% dose, Protocol B: 25% dose, Protocol C: 50% dose. Objective image noise, signal-to-noise (SNR) ratio for the liver parenchyma, visual image score and lesion conspicuity were compared among CD images of FBP and RD images of FBP, ASIR and MBIR. Results: RD images of MBIR yielded significantly lower objective image noise and higher SNR compared with RD images of FBP and ASIR for all protocols (P < .01) and CD images of FBP for Protocol C (P < .05). Although the subjective image quality of RD images of MBIR was almost acceptable for Protocol C, it was inferior to that of CD images of FBP for Protocols A and B (P < .0083). The conspicuity of the small lesions in RD images of MBIR tended to be superior to that in RD images of FBP and ASIR and inferior to that in CD images for Protocols A and B, although the differences were not significant (P > .0083). Conclusion: Although 12.5%-dose MBIR images (mean size-specific dose estimates [SSDE] of 1.13 mGy) yielded objective image noise and SNR comparable to CD-FBP images, at least a 50% dose (mean SSDE of 4.63 mGy) would be needed to

  15. GENERALIZATION OF RAYLEIGH MAXIMUM LIKELIHOOD DESPECKLING FILTER USING QUADRILATERAL KERNELS

    Directory of Open Access Journals (Sweden)

    S. Sridevi

    2013-02-01

    Speckle noise is the most prevalent noise in clinical ultrasound images. It appears as light and dark spots and obscures the underlying pixel intensities. In fetal ultrasound images, edges and local fine details are particularly important for obstetricians and gynecologists carrying out prenatal diagnosis of congenital heart disease. A robust despeckling filter therefore has to be devised that proficiently suppresses speckle noise while simultaneously preserving these features. The proposed filter is a generalization of the Rayleigh maximum likelihood filter; it exploits statistical tools as tuning parameters and uses different shapes of quadrilateral kernels to estimate the noise-free pixel from the neighborhood. The performance of various filters, namely the Median, Kuwahara, Frost, Homogeneous mask filter and Rayleigh maximum likelihood filter, is compared with the proposed filter in terms of PSNR and image profile. Comparatively, the proposed filter surpasses the conventional filters.
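    A minimal sketch of the underlying Rayleigh maximum-likelihood estimation, assuming a plain square window rather than the quadrilateral kernels and tuning statistics proposed in the paper: for Rayleigh-distributed samples the ML estimate of the scale parameter is sqrt(sum(x^2) / (2N)), which serves as the despeckled pixel value.

```python
import numpy as np
from scipy.ndimage import generic_filter

def rayleigh_ml_despeckle(image, size=5):
    """Replace each pixel by the Rayleigh ML scale estimate of its neighborhood."""
    def ml_estimate(window):
        # window is the flattened neighborhood delivered by generic_filter
        return np.sqrt(np.mean(window ** 2) / 2.0)
    return generic_filter(image.astype(float), ml_estimate, size=size)
```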

  16. GPU Accelerated Vector Median Filter

    Science.gov (United States)

    Aras, Rifat; Shen, Yuzhong

    2011-01-01

    Noise reduction is an important step for most image processing tasks. For three-channel color images, a widely used technique is the vector median filter, in which the color values of pixels are treated as 3-component vectors. Vector median filters are computationally expensive; for a window size of n x n, each of the n² vectors has to be compared with the other n² - 1 vectors in terms of distance. General-purpose computation on graphics processing units (GPUs) is the paradigm of utilizing high-performance many-core GPU architectures for computation tasks that are normally handled by CPUs. In this work, NVIDIA's Compute Unified Device Architecture (CUDA) paradigm is used to accelerate vector median filtering, which, to the best of our knowledge, has never been done before. The performance of the GPU-accelerated vector median filter is compared to that of the CPU and MPI-based versions for different image and window sizes. Initial findings of the study showed a 100x performance improvement of the vector median filter implementation on GPUs over CPU implementations, and further speed-up is expected after more extensive optimizations of the GPU algorithm.
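    A minimal CPU reference sketch of the vector median filter described above (the straightforward version that benefits from GPU acceleration): in every window the output is the input vector whose summed distance to all other vectors in the window is smallest.

```python
import numpy as np

def vector_median_filter(image, size=3):
    """Brute-force vector median filter for a color image of shape (H, W, 3)."""
    pad = size // 2
    padded = np.pad(image.astype(float), ((pad, pad), (pad, pad), (0, 0)), mode='reflect')
    out = np.empty_like(image, dtype=float)
    h, w = image.shape[:2]
    for i in range(h):
        for j in range(w):
            window = padded[i:i + size, j:j + size].reshape(-1, 3)
            # pairwise Euclidean distances between all vectors in the window;
            # this O(n^4) cost per pixel is the motivation for GPU acceleration
            d = np.linalg.norm(window[:, None, :] - window[None, :, :], axis=2)
            out[i, j] = window[d.sum(axis=1).argmin()]
    return out
```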

  17. FLOWING BILATERAL FILTER: DEFINITION AND IMPLEMENTATIONS

    Directory of Open Access Journals (Sweden)

    Maxime Moreaud

    2015-06-01

    The bilateral filter plays a key role in image processing applications due to its intuitive parameterization and its high-quality filter result, smoothing homogeneous regions while preserving the edges of objects. Considering the image as a topological relief, seeing pixel intensities as peaks and valleys, we introduce a way to control the tonal weighting coefficients, the flowing bilateral filter, reducing the "halo" artifacts typically produced by the regular bilateral filter around a large peak surrounded by two valleys of lower values. In this paper we propose to investigate exact and approximated versions of CPU and parallel GPU (Graphical Processing Unit) based implementations of the regular and flowing bilateral filter using the NVidia CUDA API. Fast implementations of these filters are important for the processing of large 3D volumes of up to several GB acquired by X-ray or electron tomography.
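    A minimal sketch of the regular (brute-force) bilateral filter for reference; the flowing variant proposed in the paper additionally adapts the tonal weights to the local relief and is not reproduced here. Parameter values are illustrative.

```python
import numpy as np

def bilateral_filter(image, sigma_s=2.0, sigma_r=0.1, radius=5):
    """Regular bilateral filter on a greyscale image: each output pixel is a
    normalized average of its neighbours, weighted by both spatial distance
    and tonal (intensity) difference, so edges are preserved."""
    img = image.astype(float)
    h, w = img.shape
    out = np.zeros_like(img)
    norm = np.zeros_like(img)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs ** 2 + ys ** 2) / (2 * sigma_s ** 2))
    padded = np.pad(img, radius, mode='reflect')
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            shifted = padded[radius + dy:radius + dy + h, radius + dx:radius + dx + w]
            tonal = np.exp(-((shifted - img) ** 2) / (2 * sigma_r ** 2))
            weight = spatial[dy + radius, dx + radius] * tonal
            out += weight * shifted
            norm += weight
    return out / norm
```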

  18. Virtual experiment of optical spatial filtering in Matlab environment

    Science.gov (United States)

    Ji, Yunjing; Wang, Chunyong; Song, Yang; Lai, Jiancheng; Wang, Qinghua; Qi, Jing; Shen, Zhonghua

    2017-08-01

    The principle of the optical spatial filtering experiment is introduced, and a computer simulation platform with a graphical user interface (GUI) has been developed in the Matlab environment. Using it, various filtering processes for different input images or different filtering purposes can be completed accurately, and the filtering effect can be observed clearly while adjusting the experimental parameters. The physical nature of optical spatial filtering can thus be shown vividly, which promotes the effectiveness of experimental teaching.
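    The principle simulated by such a platform can be sketched in a few lines: transform the input image to the Fourier (frequency) plane, apply a mask that mimics the physical filter, and transform back. This is a generic numerical illustration (here in Python rather than Matlab), and the cutoff parameterization is an assumption.

```python
import numpy as np

def spatial_filter(image, cutoff=0.1, mode='lowpass'):
    """Numerical optical spatial filtering: circular low-pass or high-pass mask
    in the Fourier plane. 'cutoff' is the mask radius as a fraction of Nyquist."""
    F = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))
    fx = np.fft.fftshift(np.fft.fftfreq(w))
    radius = np.hypot(*np.meshgrid(fx, fy))            # radial frequency of each sample
    mask = radius <= cutoff * 0.5 if mode == 'lowpass' else radius > cutoff * 0.5
    filtered = np.fft.ifft2(np.fft.ifftshift(F * mask))
    return np.abs(filtered)
```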

  19. Analog Electronic Filters Theory, Design and Synthesis

    CERN Document Server

    Dimopoulos, Hercules G

    2012-01-01

    Filters are essential subsystems in a huge variety of electronic systems. Filter applications are innumerable; they are used for noise reduction, demodulation, signal detection, multiplexing, sampling, sound and speech processing, transmission line equalization and image processing, to name just a few. In practice, no electronic system can exist without filters. They can be found in everything from power supplies to mobile phones and hard disk drives and from loudspeakers and MP3 players to home cinema systems and broadband Internet connections. This textbook introduces basic concepts and methods and the associated mathematical and computational tools employed in electronic filter theory, synthesis and design. This book can be used as an integral part of undergraduate courses on analog electronic filters. Includes numerous solved examples, applied examples and exercises for each chapter. Includes detailed coverage of active and passive filters in an independent but correlated manner. Emphasizes real filter...

  20. The Dependence of Signal-To-Noise Ratio (S/N) Between Star Brightness and Background on the Filter Used in Images Taken by the Vulcan Photometric Planet Search Camera

    Science.gov (United States)

    Mena-Werth, Jose

    1998-01-01

    The Vulcan Photometric Planet Search is the ground-based counterpart of the Kepler Mission proposal. The Kepler proposal calls for the launch of a telescope to look intently at a small patch of sky for four years. The mission is designed to look for extra-solar planets that transit sun-like stars. The Kepler Mission should be able to detect Earth-size planets. This goal requires an instrument and software capable of detecting photometric changes of several parts per hundred thousand in the flux of a star. The goal also requires the continuous monitoring of about a hundred thousand stars. The Kepler Mission is a NASA Discovery-class proposal similar in cost to the Lunar Prospector. The Vulcan Search is also a NASA project, but based at Lick Observatory. A small wide-field telescope monitors various star fields successively during the year. Dozens of images, each containing tens of thousands of stars, are taken any night that weather permits. The images are then monitored for photometric changes of the order of one part in a thousand. Such changes would reveal the transit of an inner-orbit Jupiter-size planet similar to those discovered recently in spectroscopic searches. In order to achieve a photometric precision of one part in a thousand, even the choice of the filter used in taking an exposure can be critical. The ultimate purpose of a filter is to increase the signal-to-noise ratio (S/N) of one's observation. Ideally, filters reduce the sky glow caused by street lights and thereby make the star images more distinct. The higher the S/N, the higher the chance of observing a transit signal that indicates the presence of a new planet. It is, therefore, important to select the filter that maximizes the S/N.

  1. Wiener filter applied to a neutrongraphic system

    International Nuclear Information System (INIS)

    Crispim, V.R.; Lopes, R.T.; Borges, J.C.

    1986-01-01

    The random characteristics of the image formation process influence the spatial image obtained in a neutron radiograph. Several methods can be used to optimize this image through estimation of the noise added to the original signal. This work deals with the optimal filtering technique using Wiener's filter. A simulation is made in which the signal (the spatial resolution function) has a Lorentzian form, and ten kinds of random noise with increasing R.M.S. are generated and individually added to the original signal. Wiener's filter is applied for the different noise amplitudes, and the behaviour of the spatial resolution function of our system is analysed. (Author)
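    A minimal sketch of the kind of simulation described above: a Lorentzian signal with added white noise is restored by a frequency-domain Wiener filter with gain H = S / (S + N), where S and N are the (assumed known) signal and noise power spectra. All numerical values are illustrative.

```python
import numpy as np

def wiener_filter(noisy, signal_psd, noise_psd):
    """Frequency-domain Wiener filter: attenuate each frequency bin according to
    the ratio of the signal and noise power spectra."""
    H = signal_psd / (signal_psd + noise_psd + 1e-12)
    return np.real(np.fft.ifft(np.fft.fft(noisy) * H))

# Illustrative simulation: Lorentzian resolution function plus one noise realization.
x = np.linspace(-10, 10, 1024)
lorentzian = 1.0 / (1.0 + x ** 2)
rng = np.random.default_rng(0)
sigma = 0.05
noisy = lorentzian + rng.normal(0.0, sigma, x.size)
S = np.abs(np.fft.fft(lorentzian)) ** 2          # idealized signal power spectrum
N = np.full(x.size, sigma ** 2 * x.size)         # flat white-noise power spectrum
restored = wiener_filter(noisy, S, N)
```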

  2. Miniaturized dielectric waveguide filters

    OpenAIRE

    Sandhu, MY; Hunter, IC

    2016-01-01

    Design techniques for a new class of integrated monolithic high-permittivity ceramic waveguide filters are presented. These filters enable a size reduction of 50% compared to air-filled transverse electromagnetic filters with the same unloaded Q-factor. Designs for Chebyshev and asymmetric generalised Chebyshev filter and a diplexer are presented with experimental results for an 1800 MHz Chebyshev filter and a 1700 MHz generalised Chebyshev filter showing excellent agreement with theory.

  3. Performance tuning for CUDA-accelerated neighborhood denoising filters

    Energy Technology Data Exchange (ETDEWEB)

    Zheng, Ziyi; Mueller, Klaus [Stony Brook Univ., NY (United States). Center for Visual Computing, Computer Science; Xu, Wei

    2011-07-01

    Neighborhood denoising filters are powerful techniques in image processing and can effectively enhance the image quality in CT reconstructions. In this study, taking the bilateral filter and the non-local mean filter as two examples, we discuss their implementations and perform fine-tuning on the targeted GPU architecture. Experimental results show that the straightforward GPU-based neighborhood filters can be further accelerated by pre-fetching. The optimized GPU-accelerated denoising filters are ready to be plugged into the reconstruction framework to enable fast denoising without compromising image quality. (orig.)
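    A minimal CPU reference sketch of the non-local mean filter mentioned above (the bilateral filter is sketched under record 17). This plain nested-loop version is the natural candidate for the GPU acceleration and pre-fetch tuning discussed in the abstract; the patch size, search window and smoothing parameter are illustrative.

```python
import numpy as np

def nonlocal_means(image, patch=3, search=7, h=0.1):
    """Non-local means: each pixel becomes a weighted average of pixels in a
    search window, weighted by the similarity of the surrounding patches."""
    img = image.astype(float)
    pr, sr = patch // 2, search // 2
    padded = np.pad(img, pr + sr, mode='reflect')
    out = np.zeros_like(img)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + pr + sr, j + pr + sr
            ref = padded[ci - pr:ci + pr + 1, cj - pr:cj + pr + 1]
            weights, values = [], []
            for di in range(-sr, sr + 1):
                for dj in range(-sr, sr + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni - pr:ni + pr + 1, nj - pr:nj + pr + 1]
                    d2 = np.mean((ref - cand) ** 2)        # patch dissimilarity
                    weights.append(np.exp(-d2 / (h ** 2)))
                    values.append(padded[ni, nj])
            weights = np.asarray(weights)
            out[i, j] = np.dot(weights, values) / weights.sum()
    return out
```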

  4. Selection vector filter framework

    Science.gov (United States)

    Lukac, Rastislav; Plataniotis, Konstantinos N.; Smolka, Bogdan; Venetsanopoulos, Anastasios N.

    2003-10-01

    We provide a unified framework of nonlinear vector techniques outputting the lowest ranked vector. The proposed framework constitutes a generalized filter class for multichannel signal processing. A new class of nonlinear selection filters is based on robust order-statistic theory and the minimization of the weighted distance function to the other input samples. The proposed method can be designed to perform a variety of filtering operations, including previously developed techniques such as the vector median, basic vector directional filter, directional distance filter, weighted vector median filters and weighted directional filters. A wide range of filtering operations is guaranteed by the filter structure with two independent weight vectors for the angular and distance domains of the vector space. In order to adapt the filter parameters to varying signal and noise statistics, we also provide generalized optimization algorithms that take advantage of the weighted median filters and the relationship between the standard median filter and the vector median filter. Thus, we can deal with both statistical and deterministic aspects of the filter design process. It will be shown that the proposed method possesses the required properties, such as the capability of modelling the underlying system in the application at hand, robustness with respect to errors in the model of the underlying system, the availability of a training procedure and, finally, the simplicity of filter representation, analysis, design and implementation. Simulation studies also indicate that the new filters are computationally attractive and have excellent performance in environments corrupted by bit errors and impulsive noise.
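    A minimal sketch of the core selection idea described above: for one filter window, output the input vector with the lowest aggregated, weighted combination of a distance (magnitude) criterion and an angular (directional) criterion. The exact aggregation and the optimization of the two weight vectors in the paper are not reproduced; the weighting below is illustrative.

```python
import numpy as np

def selection_vector_filter(window_vectors, w_dist=None, w_ang=None):
    """Return the lowest-ranked vector of a window (array of shape (n, channels)).
    With w_ang all zeros this reduces to a (weighted) vector median filter;
    with w_dist all zeros it behaves like a basic vector directional filter."""
    X = np.asarray(window_vectors, dtype=float)
    n = len(X)
    w_dist = np.ones(n) if w_dist is None else np.asarray(w_dist, float)
    w_ang = np.ones(n) if w_ang is None else np.asarray(w_ang, float)
    norms = np.linalg.norm(X, axis=1) + 1e-12
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)     # pairwise distances
    cos = np.clip((X @ X.T) / np.outer(norms, norms), -1.0, 1.0)
    ang = np.arccos(cos)                                             # pairwise angles
    rank = dist @ w_dist + ang @ w_ang                               # aggregated criterion
    return X[np.argmin(rank)]
```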

  5. PORNOGRAPHIC IMAGE FILTERING METHOD BASED ON EROTOGENIC-ZONE DETECTION

    Institute of Scientific and Technical Information of China (English)

    陆蓓; 巩玉旺; 姚金良; 周建政

    2011-01-01

    At present, many non-pornographic images containing large exposed skin areas or skin-colour-like regions are prone to being misclassified as pornographic by most pornographic image filtering methods. In order to decrease this false detection rate, a new pornographic image filtering method based on erotogenic-zone detection is proposed in this paper. The method extracts several features, including skin-colour features, HOG (Histograms of Oriented Gradient) features describing the shape and appearance of local objects, spatial distribution based features, and Haar-like features describing local grayscale distribution; it trains a classifier for erotogenic-zone recognition with the AdaBoost learning algorithm and performs the pornographic image filtering through this classifier. Experimental results confirm that the method can precisely detect erotogenic zones in an image and can effectively reduce the false detection rate for non-pornographic images.
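    A minimal sketch of the classification stage described above, showing only the HOG-feature extraction and AdaBoost training; the skin-colour, spatial-distribution and Haar-like features used in the paper are omitted, and all function names and parameters here are illustrative assumptions.

```python
import numpy as np
from skimage.feature import hog
from sklearn.ensemble import AdaBoostClassifier

def train_zone_classifier(patches, labels):
    """Train an AdaBoost classifier on HOG descriptors of candidate image patches
    (greyscale arrays of equal size); labels are 1 for target zones, 0 otherwise."""
    features = np.array([
        hog(p, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
        for p in patches
    ])
    clf = AdaBoostClassifier(n_estimators=100)
    clf.fit(features, labels)
    return clf

def filter_image(clf, candidate_patches, threshold=0.5):
    """Flag an image if any candidate region is classified as a target zone."""
    feats = np.array([
        hog(p, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
        for p in candidate_patches
    ])
    return bool((clf.predict_proba(feats)[:, 1] > threshold).any())
```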

  6. Parieto-occipital areas involved in efficient filtering in search: a time course analysis of visual marking using behavioural and functional imaging procedures

    DEFF Research Database (Denmark)

    Humphreys, Glyn W; Kyllingsbæk, Søren; Watson, Derrick G.

    2004-01-01

    Search for a colour-form conjunction target can be facilitated by presenting one set of distractors prior to the second set of distractors and the target: the preview benefit (Watson & Humphreys, 1997). The early presentation of one set of distractors enables them to be efficiently filtered from...

  7. Comparison of image quality and visibility of normal and abnormal findings at submillisievert chest CT using filtered back projection, iterative model reconstruction (IMR) and iDose{sup 4}™

    Energy Technology Data Exchange (ETDEWEB)

    Laqmani, Azien, E-mail: a.laqmani@uke.de [Department of Diagnostic and Interventional Radiology and Nuclear Medicine, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Avanesov, Maxim; Butscheidt, Sebastian; Kurfürst, Maximilian [Department of Diagnostic and Interventional Radiology and Nuclear Medicine, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Sehner, Susanne [Department of Medical Biometry and Epidemiology, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Schmidt-Holtz, Jakob; Derlin, Thorsten; Behzadi, Cyrus [Department of Diagnostic and Interventional Radiology and Nuclear Medicine, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany); Nagel, Hans D. [Science & Technology for Radiology, Fritz-Reuter-Weg 5f, 21244 Buchholz, Germany, (Germany); Adam, Gerhard; Regier, Marc [Department of Diagnostic and Interventional Radiology and Nuclear Medicine, University Medical Center Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg (Germany)

    2016-11-15

    Objective: To compare both image quality and visibility of normal and abnormal findings at submillisievert chest CT (smSv-CT) using filtered back projection (FBP) and the two different iterative reconstruction (IR) techniques iterative model reconstruction (IMR) and iDose{sup 4}™. Materials and methods: This institutional review board approved study was based on retrospective interpretation of clinically indicated acquired data. The requirement to obtain informed consent was waived. 81 patients with suspected pneumonia underwent smSv-CT (Brilliance iCT, Philips Healthcare; mean effective dose: 0.86 ± 0.2 mSv). Data were reconstructed using FBP and two different IR techniques iDose{sup 4}™ and IMR (Philips Healthcare) at various iteration levels. Objective image noise (OIN) was measured. Two experienced readers independently assessed all images for image noise, image appearance and visibility of normal anatomic and abnormal findings. A random intercept model was used for statistical analysis. Results: Compared to FBP and iDose{sup 4}™, IMR reduced OIN up to 88% and 72%, respectively (p < 0.001). A mild blotchy image appearance was seen in IMR images, affecting diagnostic confidence. iDose{sup 4}™ images provided satisfactory to good image quality for visibility of normal and abnormal findings and were superior to FBP (p < 0.001). IMR images were significantly inferior for visibility of normal structures compared to iDose{sup 4}™, while being superior for visibility of abnormal findings except for reticular pattern (p < 0.001). Conclusion: IMR results for visibility of normal and abnormal lung findings are heterogeneous, indicating that IMR may not represent a priority technique for clinical routine. iDose{sup 4}™ represents a suitable method for evaluation of lung tissue at submillisievert chest CT.

  8. Feature-Based Nonlocal Polarimetric SAR Filtering

    Directory of Open Access Journals (Sweden)

    Xiaoli Xing

    2017-10-01

    Polarimetric synthetic aperture radar (PolSAR) images are inherently contaminated by multiplicative speckle noise, which complicates image interpretation and analysis. To reduce the speckle effect, several adaptive speckle filters have been developed based on a weighted average of similarity measures, commonly depending on a model or probability distribution and therefore affected by the distribution parameters and modelled texture components. In this paper, a novel filtering method introduces the coefficient of variation (CV) and the Pauli basis (PB) to measure similarity, and the two features are combined within the framework of nonlocal mean filtering. The CV is used to describe the complexity of various scenes and to distinguish scene heterogeneity; moreover, the Pauli basis is able to express the polarimetric information in PolSAR image processing. The proposed filter combines the CV and Pauli basis to improve the estimation accuracy of the similarity weights. The similarity of the features is then deduced according to a test statistic, and the filtering proceeds using nonlocal weighted estimation. The performance of the proposed filter is tested on simulated images and real PolSAR images acquired by the AIRSAR and ESAR systems. The qualitative and quantitative experiments indicate the validity of the proposed method in comparison with widely used despeckling methods.

  9. Recirculating electric air filter

    Science.gov (United States)

    Bergman, W.

    1985-01-09

    An electric air filter cartridge has a cylindrical inner high voltage electrode, a layer of filter material, and an outer ground electrode formed of a plurality of segments moveably connected together. The outer electrode can be easily opened to remove or insert filter material. Air flows through the two electrodes and the filter material and is exhausted from the center of the inner electrode.

  10. Passive Power Filters

    CERN Document Server

    Künzi, R.

    2015-06-15

    Power converters require passive low-pass filters which are capable of reducing voltage ripples effectively. In contrast to signal filters, the components of power filters must carry large currents or withstand large voltages, respectively. In this paper, three different suitable filter structures for d.c./d.c. power converters with inductive load are introduced. The formulas needed to calculate the filter components are derived step by step and practical examples are given. The behaviour of the three discussed