WorldWideScience

Sample records for speckle imaging algorithms

  1. Speckle imaging algorithms for planetary imaging

    Energy Technology Data Exchange (ETDEWEB)

    Johansson, E. [Lawrence Livermore National Lab., CA (United States)]

    1994-11-15

    I will discuss the speckle imaging algorithms used to process images of the impact sites of the collision of comet Shoemaker-Levy 9 with Jupiter. The algorithms use a phase retrieval process based on the average bispectrum of the speckle image data. High resolution images are produced by estimating the Fourier magnitude and Fourier phase of the image separately, then combining them and inverse transforming to achieve the final result. I will show raw speckle image data and high-resolution image reconstructions from our recent experiment at Lick Observatory.
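    The recombination step described above can be sketched in a few lines of NumPy (a toy round-trip, not the authors' code; the bispectrum-based phase estimation itself is omitted):

```python
import numpy as np

def combine_magnitude_phase(magnitude, phase):
    # Merge separately estimated Fourier magnitude and phase, then
    # inverse-transform to obtain the reconstructed image.
    spectrum = magnitude * np.exp(1j * phase)
    return np.fft.ifft2(spectrum).real

# Round-trip sanity check on a toy object: splitting an image into its
# Fourier magnitude and phase and recombining must reproduce it.
rng = np.random.default_rng(0)
obj = rng.random((32, 32))
F = np.fft.fft2(obj)
recovered = combine_magnitude_phase(np.abs(F), np.angle(F))
```

    In the actual pipeline, the magnitude would come from the averaged power spectrum of the speckle frames and the phase from the recursive bispectrum phase reconstruction.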

  2. Speckle imaging of globular clusters

    International Nuclear Information System (INIS)

    Sams, B.J. III

    1990-01-01

    Speckle imaging is a powerful tool for high resolution astronomy. Its application to the core regions of globular clusters produces high resolution stellar maps of the bright stars, but it is unable to image the faint stars that are the most reliable dynamical indicators. The limits on resolving these faint, extended objects are physical, not algorithmic, and cannot be overcome using speckle. High resolution maps may be useful for resolving multicomponent stellar systems in the cluster centers. 30 refs

  3. Speckle imaging using the principle value decomposition method

    International Nuclear Information System (INIS)

    Sherman, J.W.

    1978-01-01

    Obtaining diffraction-limited images in the presence of atmospheric turbulence is a topic of current interest. Two types of approaches have evolved: real-time correction and speckle imaging. A speckle imaging reconstruction method was developed by use of an ''optimal'' filtering approach. This method is based on a nonlinear integral equation which is solved by principle value decomposition. The method was implemented on a CDC 7600 for study. The restoration algorithm is discussed and its performance is illustrated. 7 figures

  4. Accelerated speckle imaging with the ATST visible broadband imager

    Science.gov (United States)

    Wöger, Friedrich; Ferayorni, Andrew

    2012-09-01

    The Advanced Technology Solar Telescope (ATST), a 4 meter class telescope for observations of the solar atmosphere currently under construction, will generate data at rates of the order of 10 TB/day with its state-of-the-art instrumentation. The high-priority ATST Visible Broadband Imager (VBI) instrument alone will create two data streams with a bandwidth of 960 MB/s each. Because of the related data handling issues, these data will be post-processed with speckle interferometry algorithms in near-real time at the telescope using cost-effective Graphics Processing Unit (GPU) technology, which is supported by the ATST Data Handling System. In this contribution, we lay out the VBI-specific approach to its image processing pipeline, put this into the context of the underlying ATST Data Handling System infrastructure, and describe in detail how the algorithms were redesigned to exploit data parallelism in the speckle image reconstruction. An algorithm redesign is often required to efficiently speed up an application using GPU technology; we have chosen NVIDIA's CUDA language as the basis for our implementation. We present preliminary results of the algorithm's performance on our test facilities and use these results to derive a conservative estimate of the requirements for a full system that could achieve near-real-time performance at the ATST.

  5. Multiple speckle illumination for optical-resolution photoacoustic imaging

    Science.gov (United States)

    Poisson, Florian; Stasio, Nicolino; Moser, Christophe; Psaltis, Demetri; Bossy, Emmanuel

    2017-03-01

    Optical-resolution photoacoustic microscopy offers exquisite and specific contrast to optical absorption. Conventional approaches generally involve raster scanning a focused spot over the sample. Here, we demonstrate that a full-field illumination approach with multiple speckle illumination can also provide diffraction-limited optical-resolution photoacoustic images. Two different proofs of concept are demonstrated with micro-structured test samples. The first approach follows the principle of correlation/ghost imaging,1, 2 and is based on cross-correlating photoacoustic signals under multiple speckle illumination with known speckle patterns measured during a calibration step. The second approach is a speckle scanning microscopy technique, which adapts the method proposed in fluorescence microscopy by Bertolotti et al.:3 in our work, spatially unresolved photoacoustic measurements are performed for various translations of unknown speckle patterns. A phase-retrieval algorithm is used to reconstruct the object from the knowledge of the modulus of its Fourier transform yielded by the measurements. Because speckle patterns naturally appear in many situations, including propagation through biological tissue or multimode fibers (for which focusing light is very demanding, if not impossible), speckle-illumination-based photoacoustic microscopy provides a powerful framework for the development of novel reconstruction approaches well suited to compressed sensing.2
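    The first (ghost-imaging) approach can be illustrated with a toy NumPy simulation (all sizes and data hypothetical): the spatially unresolved photoacoustic "bucket" signal is cross-correlated, pixel by pixel, with the calibrated speckle patterns.

```python
import numpy as np

rng = np.random.default_rng(1)
n, npat = 16, 5000                    # image side, speckle realisations

obj = np.zeros((n, n))
obj[4:8, 6:12] = 1.0                  # toy absorbing structure

# Calibration: known speckle patterns; measurement: one unresolved
# (scalar) photoacoustic amplitude per illumination pattern.
patterns = rng.random((npat, n, n))
bucket = patterns.reshape(npat, -1) @ obj.ravel()

# Ghost-imaging estimate: correlate bucket fluctuations with the
# intensity fluctuations of each pixel across patterns.
db = bucket - bucket.mean()
dp = patterns - patterns.mean(axis=0)
estimate = np.tensordot(db, dp, axes=(0, 0)) / npat

# The 24 brightest pixels of the estimate recover the object support.
top = np.argsort(estimate.ravel())[::-1][:24]
```

    With enough speckle realisations the correlation converges to an image of the absorber, which is why the approach lends itself to compressed-sensing reconstructions that need far fewer patterns.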

  6. Multiresolution edge detection using enhanced fuzzy c-means clustering for ultrasound image speckle reduction

    Energy Technology Data Exchange (ETDEWEB)

    Tsantis, Stavros [Department of Medical Physics, School of Medicine, University of Patras, Rion, GR 26504 (Greece); Spiliopoulos, Stavros; Karnabatidis, Dimitrios [Department of Radiology, School of Medicine, University of Patras, Rion, GR 26504 (Greece); Skouroliakou, Aikaterini [Department of Energy Technology Engineering, Technological Education Institute of Athens, Athens 12210 (Greece); Hazle, John D. [Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States); Kagadis, George C., E-mail: gkagad@gmail.com, E-mail: George.Kagadis@med.upatras.gr, E-mail: GKagadis@mdanderson.org [Department of Medical Physics, School of Medicine, University of Patras, Rion, GR 26504, Greece and Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States)

    2014-07-15

    Purpose: Speckle suppression in ultrasound (US) images of various anatomic structures via a novel speckle noise reduction algorithm. Methods: The proposed algorithm employs enhanced fuzzy c-means (EFCM) clustering and multiresolution wavelet analysis to distinguish edges from speckle noise in US images. The edge detection procedure involves a coarse-to-fine strategy with spatial and interscale constraints so as to classify the wavelet local maxima distribution at different frequency bands. As an outcome, an edge map across scales is derived, whereas the wavelet coefficients that correspond to speckle are suppressed in the inverse wavelet transform, yielding the denoised US image. Results: A total of 34 thyroid, liver, and breast US examinations were performed on a Logiq 9 US system. Each of these images was subjected to the proposed EFCM algorithm and, for comparison, to commercial speckle reduction imaging (SRI) software and another well-known denoising approach, Pizurica's method. The speckle suppression performance in the selected set of US images was quantified via the Speckle Suppression Index (SSI), with results of 0.61, 0.71, and 0.73 for the EFCM, SRI, and Pizurica methods, respectively. Peak signal-to-noise ratios of 35.12, 33.95, and 29.78 and edge preservation indices of 0.94, 0.93, and 0.86 were found for the EFCM, SRI, and Pizurica methods, respectively, demonstrating that the proposed method achieves superior speckle reduction and edge preservation properties. Based on two independent radiologists' qualitative evaluation, the proposed method significantly improved image characteristics over standard baseline B-mode images and those processed with Pizurica's method. Furthermore, it yielded results similar to those of SRI for breast and thyroid images and significantly better results than SRI for liver imaging, thus improving diagnostic accuracy in both superficial and in-depth structures. Conclusions: A
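    The two figures of merit quoted above can be computed as sketched below (common definitions, not necessarily the paper's exact formulas; the SSI here is the ratio of coefficients of variation, so values below 1 indicate suppression):

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    # Peak signal-to-noise ratio in dB.
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

def ssi(noisy, filtered):
    # Speckle Suppression Index: coefficient of variation after
    # filtering relative to before; < 1 means speckle was suppressed.
    return (filtered.std() / filtered.mean()) / (noisy.std() / noisy.mean())

# Toy demonstration: multiplicative speckle on a flat region, then a
# crude 4x4 block-mean "filter" standing in for a real denoiser.
rng = np.random.default_rng(2)
clean = np.full((64, 64), 100.0)
noisy = clean * rng.gamma(16.0, 1.0 / 16.0, clean.shape)
smoothed = (noisy.reshape(16, 4, 16, 4).mean(axis=(1, 3))
            .repeat(4, axis=0).repeat(4, axis=1))
```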

  7. Multiresolution edge detection using enhanced fuzzy c-means clustering for ultrasound image speckle reduction

    International Nuclear Information System (INIS)

    Tsantis, Stavros; Spiliopoulos, Stavros; Karnabatidis, Dimitrios; Skouroliakou, Aikaterini; Hazle, John D.; Kagadis, George C.

    2014-01-01

    Purpose: Speckle suppression in ultrasound (US) images of various anatomic structures via a novel speckle noise reduction algorithm. Methods: The proposed algorithm employs enhanced fuzzy c-means (EFCM) clustering and multiresolution wavelet analysis to distinguish edges from speckle noise in US images. The edge detection procedure involves a coarse-to-fine strategy with spatial and interscale constraints so as to classify the wavelet local maxima distribution at different frequency bands. As an outcome, an edge map across scales is derived, whereas the wavelet coefficients that correspond to speckle are suppressed in the inverse wavelet transform, yielding the denoised US image. Results: A total of 34 thyroid, liver, and breast US examinations were performed on a Logiq 9 US system. Each of these images was subjected to the proposed EFCM algorithm and, for comparison, to commercial speckle reduction imaging (SRI) software and another well-known denoising approach, Pizurica's method. The speckle suppression performance in the selected set of US images was quantified via the Speckle Suppression Index (SSI), with results of 0.61, 0.71, and 0.73 for the EFCM, SRI, and Pizurica methods, respectively. Peak signal-to-noise ratios of 35.12, 33.95, and 29.78 and edge preservation indices of 0.94, 0.93, and 0.86 were found for the EFCM, SRI, and Pizurica methods, respectively, demonstrating that the proposed method achieves superior speckle reduction and edge preservation properties. Based on two independent radiologists' qualitative evaluation, the proposed method significantly improved image characteristics over standard baseline B-mode images and those processed with Pizurica's method. Furthermore, it yielded results similar to those of SRI for breast and thyroid images and significantly better results than SRI for liver imaging, thus improving diagnostic accuracy in both superficial and in-depth structures. Conclusions: A

  8. Speckle correlation resolution enhancement of wide-field fluorescence imaging (Conference Presentation)

    Science.gov (United States)

    Yilmaz, Hasan

    2016-03-01

    Structured illumination enables high-resolution fluorescence imaging of nanostructures [1]. We demonstrate a new high-resolution fluorescence imaging method that uses a scattering layer with a high-index substrate as a solid immersion lens [2]. Random scattering of coherent light creates a speckle pattern with a very fine structure that illuminates the fluorescent nanospheres on the back surface of the high-index substrate. The speckle pattern is raster-scanned over the fluorescent nanospheres using a speckle correlation effect known as the optical memory effect. A series of standard-resolution fluorescence images, one per speckle pattern displacement, is recorded by an electron-multiplying CCD camera using a commercial microscope objective. We have developed a new phase-retrieval algorithm to reconstruct a high-resolution, wide-field image from several standard-resolution wide-field images. We have introduced the phase information of the Fourier components of the standard-resolution images as a new constraint in our algorithm, which discards ambiguities and therefore ensures convergence to a unique solution. We demonstrate two-dimensional fluorescence images of a collection of nanospheres with a deconvolved Abbe resolution of 116 nm and a field of view of 10 µm × 10 µm. Our method is robust against optical aberrations and stage drift, and is therefore excellent for imaging nanostructures under ambient conditions. [1] M. G. L. Gustafsson, J. Microsc. 198, 82-87 (2000). [2] H. Yilmaz, E. G. van Putten, J. Bertolotti, A. Lagendijk, W. L. Vos, and A. P. Mosk, Optica 2, 424-429 (2015).

  9. Development of Speckle Interferometry Algorithm and System

    International Nuclear Information System (INIS)

    Shamsir, A. A. M.; Jafri, M. Z. M.; Lim, H. S.

    2011-01-01

    Electronic speckle pattern interferometry (ESPI) is a whole-field, non-destructive measurement method widely used in industry, for example to detect defects on metal bodies and in integrated circuits in digital electronic components, and in the preservation of priceless artwork. In this research, an algorithm and a new laboratory setup were developed for implementing speckle pattern interferometry. In speckle interferometry, an optically rough test surface is illuminated with an expanded laser beam, creating a laser speckle pattern in the space surrounding the illuminated region. The speckle pattern is optically mixed with a second coherent light field that is either another speckle pattern or a smooth light field. This produces an interferometric speckle pattern that is detected by a sensor to track the change of the speckle pattern due to the applied force. In this project, an experimental ESPI setup is proposed to analyze a stainless steel plate using laser light of 632.8 nm (red) wavelength.
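    The fringe-formation principle can be sketched with a one-dimensional toy model (hypothetical deformation, not the experimental setup): subtracting speckle interferograms recorded before and after loading leaves correlation fringes modulated by the deformation phase.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 256
x = np.arange(n)

speckle_phase = rng.uniform(0.0, 2.0 * np.pi, n)  # random rough-surface phase
deform_phase = 6.0 * np.pi * x / n                # three fringes across the field

before = 1.0 + np.cos(speckle_phase)               # object + reference beams
after = 1.0 + np.cos(speckle_phase + deform_phase)

# |after - before| = 2|sin(speckle + d/2)| * |sin(d/2)|: the second
# factor is the fringe envelope, dark wherever the surface moved by a
# whole number of half-wavelengths (deform_phase a multiple of 2*pi).
fringes = np.abs(after - before)
```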

  10. Comparison of phase unwrapping algorithms for topography reconstruction based on digital speckle pattern interferometry

    Science.gov (United States)

    Li, Yuanbo; Cui, Xiaoqian; Wang, Hongbei; Zhao, Mengge; Ding, Hongbin

    2017-10-01

    Digital speckle pattern interferometry (DSPI) can diagnose topography evolution in a real-time, continuous, and non-destructive manner, and has been considered one of the most promising techniques for plasma-facing component (PFC) topography diagnostics in the complicated environment of a tokamak. For DSPI it is important to enhance the speckle patterns and obtain the real topography of the ablated crater. In this paper, two numerical models based on the flood-fill algorithm have been developed to obtain the real profile by unwrapping the wrapped phase of the speckle interference pattern, which can be calculated from four intensity images by means of the 4-step phase-shifting technique. During phase unwrapping with the flood-fill algorithm, noise pollution and other inevitable factors lead to poor-quality reconstruction results, which affects the authenticity of the restored topography. The calculation of quality parameters was therefore introduced to obtain a quality map from the wrapped phase map; this work presents two different methods to calculate the quality parameters. The quality parameters are then used to guide the path of the flood-fill algorithm, giving priority to pixels with good quality parameters, so that the quality of the reconstruction is improved. A comparison between the plain flood-fill algorithm and the quality-guided flood-fill algorithm (with the two different calculation approaches) shows that the errors caused by noise pollution and fringe discontinuities were successfully reduced.
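    The phase-extraction step, plus a minimal unwrapping stand-in, can be sketched as follows; the one-dimensional Itoh method below plays the role of the 2-D quality-guided flood fill discussed in the abstract:

```python
import numpy as np

def wrapped_phase(i1, i2, i3, i4):
    # 4-step phase shifting: I_k = a + b*cos(phi + k*pi/2), k = 0..3,
    # so I4 - I2 = 2b*sin(phi) and I1 - I3 = 2b*cos(phi).
    return np.arctan2(i4 - i2, i1 - i3)

def unwrap_1d(phase):
    # Itoh's method along a line: re-wrap neighbour differences into
    # (-pi, pi] and integrate them back up.
    d = np.diff(phase)
    d = (d + np.pi) % (2.0 * np.pi) - np.pi
    return np.concatenate(([phase[0]], phase[0] + np.cumsum(d)))

# Toy crater profile: a smooth phase ramp spanning three fringes.
true_phase = np.linspace(0.0, 6.0 * np.pi, 100)
a, b = 2.0, 1.0
imgs = [a + b * np.cos(true_phase + k * np.pi / 2.0) for k in range(4)]
w = wrapped_phase(*imgs)
unwrapped = unwrap_1d(w)
```

    A quality-guided flood fill generalises the same difference-integration idea to two dimensions, visiting high-quality pixels first so that noisy regions cannot propagate 2π errors.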

  11. Speckle Reduction on Ultrasound Liver Images Based on a Sparse Representation over a Learned Dictionary

    Directory of Open Access Journals (Sweden)

    Mohamed Yaseen Jabarulla

    2018-05-01

    Ultrasound images are corrupted with multiplicative noise known as speckle, which reduces the effectiveness of image processing and hampers interpretation. This paper proposes a multiplicative speckle suppression technique for ultrasound liver images, based on a signal reconstruction model known as sparse representation (SR) over a learned dictionary. In the proposed technique, the non-uniform multiplicative noise is first converted into additive noise using an enhanced homomorphic filter. This is followed by pixel-based total variation (TV) regularization and patch-based SR over a dictionary trained using K-singular value decomposition (KSVD). Finally, the split Bregman algorithm is used to solve the optimization problem and estimate the de-speckled image. In simulations performed on both synthetic and clinical ultrasound images, the proposed technique achieved peak signal-to-noise ratios of 35.537 dB for the dictionary trained on noisy image patches and 35.033 dB for the dictionary trained on a set of reference ultrasound image patches. Further, the evaluation results show that the proposed method performs better than other state-of-the-art denoising algorithms in terms of both peak signal-to-noise ratio and subjective visual quality assessment.
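    The homomorphic step rests on a simple identity: taking the logarithm turns the multiplicative speckle model into an additive one, after which additive-noise machinery (TV, sparse coding) applies. A toy sketch with hypothetical noise statistics:

```python
import numpy as np

rng = np.random.default_rng(4)
clean = np.full((64, 64), 50.0)
speckle = rng.gamma(9.0, 1.0 / 9.0, clean.shape)   # multiplicative, mean 1
observed = clean * speckle

# log(observed) = log(clean) + log(speckle): the noise is now additive.
log_obs = np.log(observed)

# Any additive denoiser can now act in the log domain; here a crude
# 4x4 block mean stands in for TV + sparse coding, followed by
# exponentiation back to the intensity domain.
smoothed = (log_obs.reshape(16, 4, 16, 4).mean(axis=(1, 3))
            .repeat(4, axis=0).repeat(4, axis=1))
estimate = np.exp(smoothed)
```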

  12. Algorithmic processing of intrinsic signals in affixed transmission speckle analysis (ATSA) (Conference Presentation)

    Science.gov (United States)

    Ghijsen, Michael T.; Tromberg, Bruce J.

    2017-03-01

    Affixed Transmission Speckle Analysis (ATSA) is a recently developed method for measuring blood flow, based on laser speckle imaging miniaturized into a clip-on form factor the size of a pulse oximeter. Measuring at a rate of 250 Hz, ATSA is capable of obtaining the cardiac waveform in blood flow data, referred to as the speckle plethysmogram (SPG). ATSA can also simultaneously measure the photoplethysmogram (PPG), a more conventional signal related to light intensity. In this work we present several novel algorithms for extracting physiologically relevant information from the combined SPG-PPG waveform data. First, we show that there is a slight time delay between the SPG and PPG that can be extracted computationally. Second, we present a set of frequency-domain algorithms that measure harmonic content on a pulse-by-pulse basis for both the SPG and PPG. Finally, we apply these algorithms to data obtained from a set of subjects including healthy controls and individuals with heightened cardiovascular risk. We hypothesize that the time delay and frequency content are correlated with cardiovascular health, specifically with vascular stiffening.
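    The SPG-PPG time delay can be estimated with a standard cross-correlation peak search; the sketch below uses synthetic pulse trains at the 250 Hz rate from the abstract (the 20 ms lag is an assumed value for illustration, not a measured one):

```python
import numpy as np

fs = 250.0                                  # ATSA sampling rate (Hz)
t = np.arange(0.0, 10.0, 1.0 / fs)

def pulse_train(tt):
    # Sharp periodic pulses standing in for a cardiac waveform.
    return np.sin(2.0 * np.pi * 1.2 * tt) ** 8

rng = np.random.default_rng(5)
delay_s = 0.020                             # hypothetical SPG-to-PPG lag
spg = pulse_train(t)
ppg = pulse_train(t - delay_s) + 0.05 * rng.standard_normal(t.size)

# Lag at the cross-correlation peak, in samples (positive: PPG trails SPG).
corr = np.correlate(ppg - ppg.mean(), spg - spg.mean(), mode="full")
lag_samples = int(np.argmax(corr)) - (t.size - 1)
lag_seconds = lag_samples / fs
```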

  13. Speckle pattern processing by digital image correlation

    Directory of Open Access Journals (Sweden)

    Gubarev Fedor

    2016-01-01

    This work tests a method of speckle pattern processing based on digital image correlation. The three most widely used formulas for the correlation coefficient are tested. To determine the accuracy of the speckle pattern processing, test speckle patterns with known displacement are used. The optimal size of the speckle pattern template used to determine the correlation, and hence the speckle pattern displacement, is also considered.
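    The core computation can be sketched with zero-normalised cross-correlation (one of the commonly used correlation-coefficient formulas) and an exhaustive integer-pixel search over toy data, not the paper's test patterns:

```python
import numpy as np

def zncc(a, b):
    # Zero-normalised cross-correlation coefficient of two patches.
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum()))

rng = np.random.default_rng(6)
frame = rng.random((80, 80))               # reference speckle pattern
tpl = frame[30:46, 20:36]                  # 16x16 template

# "Deformed" frame: the whole pattern rigidly shifted by (3, -2) pixels.
shifted = np.roll(frame, (3, -2), axis=(0, 1))

best, score = None, -2.0
for dy in range(-5, 6):
    for dx in range(-5, 6):
        patch = shifted[30 + dy:46 + dy, 20 + dx:36 + dx]
        s = zncc(tpl, patch)
        if s > score:
            best, score = (dy, dx), s
# best recovers the applied displacement (3, -2) at correlation ~1.
```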

  14. Optimized digital speckle patterns for digital image correlation by consideration of both accuracy and efficiency.

    Science.gov (United States)

    Chen, Zhenning; Shao, Xinxing; Xu, Xiangyang; He, Xiaoyuan

    2018-02-01

    The performance of digital image correlation (DIC), a technique widely used for noncontact deformation measurements in both science and engineering, is greatly affected by the quality of the speckle patterns. This study addresses the optimization of digital speckle patterns (DSPs) for DIC in consideration of both accuracy and efficiency. The root-mean-square error of the inverse compositional Gauss-Newton algorithm and the average number of iterations were used as quality metrics. Moreover, the influence of the subset size and the image noise level, which are basic parameters in the quality assessment formulations, was also considered. The simulated binary speckle patterns were first compared with Gaussian speckle patterns and captured DSPs. Both single-radius and multi-radius DSPs were optimized. Experimental tests and analyses were conducted to obtain the optimized and recommended DSP. The vector diagram of the optimized speckle pattern was also uploaded as a reference.

  15. Estimation of vessel diameter and blood flow dynamics from laser speckle images

    DEFF Research Database (Denmark)

    Postnov, Dmitry D.; Tuchin, Valery V.; Sosnovtseva, Olga

    2016-01-01

    Laser speckle imaging is a rapidly developing method to study changes of blood velocity in the vascular networks. However, to assess blood flow and vascular responses it is crucial to measure vessel diameter in addition to blood velocity dynamics. We suggest an algorithm that allows for dynamical...

  16. An adaptive Kalman filter for speckle reduction in ultrasound images

    International Nuclear Information System (INIS)

    Castellini, G.; Labate, D.; Masotti, L.; Mannini, E.; Rocchi, S.

    1988-01-01

    Speckle is the term used to describe the granular appearance found in ultrasound images. The presence of speckle reduces the diagnostic potential of the echographic technique because it tends to mask small inhomogeneities of the investigated tissue. We developed a new method of speckle reduction that utilizes an adaptive one-dimensional Kalman filter, based on the assumption that the observed image can be considered a superimposition of speckle on a ''true image''. The filter adaptivity, necessary to avoid loss of resolution, has been obtained through statistical considerations on the local signal variations. The results of applying this particular Kalman filter to both A-mode and B-mode images show a significant speckle reduction.
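    The filter's core can be sketched as a scalar predict/update recursion along one A-line (a simplified, non-adaptive sketch; the adaptive version would re-estimate the measurement variance from local signal statistics, as the abstract describes):

```python
import numpy as np

def kalman_1d(z, process_var, meas_var):
    # Scalar Kalman filter: the 'true image' level is modelled as a
    # slowly varying state observed through speckle (measurement noise).
    x, p = float(z[0]), 1.0
    out = np.empty(len(z))
    for i, zi in enumerate(z):
        p = p + process_var            # predict: uncertainty grows
        k = p / (p + meas_var)         # Kalman gain
        x = x + k * (zi - x)           # update with the new sample
        p = (1.0 - k) * p
        out[i] = x
    return out

rng = np.random.default_rng(7)
true = np.where(np.arange(400) < 200, 1.0, 3.0)   # step in reflectivity
z = true + 0.5 * rng.standard_normal(400)          # speckle-like noise
filtered = kalman_1d(z, process_var=1e-3, meas_var=0.25)
```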

  17. Ultrasound speckle reduction based on fractional order differentiation.

    Science.gov (United States)

    Shao, Dangguo; Zhou, Ting; Liu, Fan; Yi, Sanli; Xiang, Yan; Ma, Lei; Xiong, Xin; He, Jianfeng

    2017-07-01

    Ultrasound images show a granular pattern of noise known as speckle that diminishes their quality and results in difficulties in diagnosis. To preserve edges and features, this paper proposes a fractional differentiation-based image operator to reduce speckle in ultrasound. An image de-noising model based on fractional partial differential equations, with a balance relation between k (the gradient modulus threshold that controls the conduction) and v (the order of fractional differentiation), was constructed by effectively combining fractional calculus theory with a partial differential equation, and its numerical algorithm was implemented using a fractional differential mask operator. The proposed algorithm achieves better speckle reduction and structure preservation than three existing methods [the P-M model, the speckle reducing anisotropic diffusion (SRAD) technique, and the detail preserving anisotropic diffusion (DPAD) technique], and it is significantly faster than bilateral filtering (BF) while producing virtually the same experimental results. Ultrasound phantom testing and in vivo imaging show that the proposed method can improve the quality of an ultrasound image in terms of tissue SNR, CNR, and FOM values.

  18. General filtering method for electronic speckle pattern interferometry fringe images with various densities based on variational image decomposition.

    Science.gov (United States)

    Li, Biyuan; Tang, Chen; Gao, Guannan; Chen, Mingming; Tang, Shuwei; Lei, Zhenkun

    2017-06-01

    Filtering off speckle noise from a fringe image is one of the key tasks in electronic speckle pattern interferometry (ESPI). In general, ESPI fringe images can be divided into three categories: low-density, high-density, and variable-density fringe images. In this paper, we present a general filtering method based on variational image decomposition that can filter speckle noise for ESPI fringe images of all three densities. In our method, a variable-density ESPI fringe image is decomposed into low-density fringes, high-density fringes, and noise; a low-density fringe image into low-density fringes and noise; and a high-density fringe image into high-density fringes and noise. We give suitable function spaces to describe low-density fringes, high-density fringes, and noise, respectively. We then construct several models and numerical algorithms for ESPI fringe images with various densities and investigate the performance of these models in extensive experiments. Finally, we compare our proposed models with the windowed Fourier transform method and the coherence-enhancing diffusion partial differential equation filter, which may be the most effective filtering methods at present. Furthermore, we use the proposed method to filter a collection of experimentally obtained ESPI fringe images of poor quality. The experimental results demonstrate the performance of our proposed method.

  19. Phase-processing as a tool for speckle reduction in pulse-echo images

    DEFF Research Database (Denmark)

    Healey, AJ; Leeman, S; Forsberg, F

    1991-01-01

    Due to the coherent nature of conventional ultrasound medical imaging systems, interference artefacts occur in pulse-echo images. These artefacts are generically termed 'speckle'. The phenomenon may severely limit low-contrast resolution, with clinically relevant information being obscured. Traditional speckle reduction procedures regard speckle correction as a stochastic process and trade image smoothing (resolution loss) for speckle reduction. Recently, a new phase-acknowledging technique has been proposed that is unique in its ability to correct for speckle interference with no image...

  20. Detection of white spot lesions by segmenting laser speckle images using computer vision methods.

    Science.gov (United States)

    Gavinho, Luciano G; Araujo, Sidnei A; Bussadori, Sandra K; Silva, João V P; Deana, Alessandro M

    2018-05-05

    This paper aims to develop a method for laser speckle image segmentation of tooth surfaces for the diagnosis of early-stage caries. The method, applied directly to a raw image obtained by digital photography, is based on the difference between the speckle pattern of a carious lesion area of the tooth surface and that of a sound area. Each image is divided into blocks, which are identified in a working matrix by the χ² distance between the block histograms of the analyzed image and reference histograms previously obtained by K-means from healthy (h_Sound) and lesioned (h_Decay) areas, separately. If the χ² distance between a block histogram and h_Sound is greater than the distance to h_Decay, the block is marked as decayed. The experiments showed that the method can provide effective segmentation of initial lesions. We used 64 images to test the algorithm and achieved 100% accuracy in segmentation. Differences between the speckle pattern of a sound tooth surface region and that of a carious region, even at an early stage, can be evidenced by the χ² distance between histograms. The method enhances the contrast between sound and lesioned tissues, and the results were obtained at low computational cost. It has the potential for early diagnosis in a clinical environment through the development of low-cost portable equipment.
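    The classification rule can be sketched directly (the reference histograms below are synthetic stand-ins for the K-means-derived h_Sound and h_Decay, with the carious distribution assumed broader than the sound one):

```python
import numpy as np

def chi2_dist(h, g):
    # Chi-squared distance between two histograms (normalised first).
    h = h / h.sum()
    g = g / g.sum()
    m = (h + g) > 0
    return 0.5 * float(np.sum((h[m] - g[m]) ** 2 / (h[m] + g[m])))

rng = np.random.default_rng(8)
bins = dict(bins=32, range=(0, 255))

# Synthetic reference histograms: speckle intensities over sound enamel
# are assumed tighter than over a carious lesion.
h_sound = np.histogram(rng.normal(120, 10, 5000), **bins)[0]
h_decay = np.histogram(rng.normal(140, 30, 5000), **bins)[0]

# Classify one block by comparing its histogram to both references.
block = np.histogram(rng.normal(138, 28, 400), **bins)[0]
is_decay = chi2_dist(block, h_sound) > chi2_dist(block, h_decay)
```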

  1. Statistical Image Recovery From Laser Speckle Patterns With Polarization Diversity

    Science.gov (United States)

    2010-09-01


  2. Wavelet tree structure based speckle noise removal for optical coherence tomography

    Science.gov (United States)

    Yuan, Xin; Liu, Xuan; Liu, Yang

    2018-02-01

    We report a new speckle noise removal algorithm for optical coherence tomography (OCT). Though wavelet-domain thresholding algorithms have demonstrated clear advantages in suppressing noise magnitude and preserving image sharpness in OCT, the wavelet tree structure has not been investigated in previous applications. In this work, we propose an adaptive wavelet thresholding algorithm that exploits the tree structure of wavelet coefficients to remove speckle noise in OCT images. The threshold for each wavelet band is adaptively selected following a special rule to retain the structure of the image across different wavelet layers. Our results demonstrate that the proposed algorithm outperforms conventional wavelet thresholding, with significant advantages in preserving image features.
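    The baseline the authors improve upon, plain wavelet soft thresholding, can be sketched with a single-level Haar transform (one-dimensional for brevity; the tree-structured variant would couple thresholds across scales rather than fixing one per band):

```python
import numpy as np

def haar_1d(x):
    # One level of the 1-D Haar transform: approximation and detail.
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def ihaar_1d(a, d):
    # Inverse of one Haar level (perfect reconstruction).
    x = np.empty(a.size * 2)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def soft(coeffs, t):
    # Soft thresholding: shrink detail coefficients toward zero.
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

rng = np.random.default_rng(9)
clean = np.sin(np.linspace(0.0, 4.0 * np.pi, 256))
noisy = clean + 0.3 * rng.standard_normal(256)

# Threshold only the detail band, then invert the transform.
a, d = haar_1d(noisy)
denoised = ihaar_1d(a, soft(d, 0.3))
```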

  3. Laser Speckle Contrast Imaging: theory, instrumentation and applications.

    Science.gov (United States)

    Senarathna, Janaka; Rege, Abhishek; Li, Nan; Thakor, Nitish V

    2013-01-01

    Laser Speckle Contrast Imaging (LSCI) is a wide-field, non-scanning optical technique for observing blood flow. Speckles are produced when coherent light scattered back from biological tissue is diffracted through the limiting aperture of the focusing optics. Mobile scatterers cause the speckle pattern to blur; a model can be constructed by inversely relating the degree of blur, termed speckle contrast, to the scatterer speed. In tissue, red blood cells are the main source of moving scatterers, so blood flow acts as a virtual contrast agent, outlining blood vessels. The spatial resolution (~10 μm) and temporal resolution (10 ms to 10 s) of LSCI can be tailored to the application. Restricted by the penetration depth of light, LSCI can only visualize superficial blood flow, and, due to its non-scanning nature, it is unable to provide depth-resolved images. The simple setup and independence from exogenous contrast agents have made LSCI a popular tool for studying vascular structure and blood flow dynamics. We discuss the theory and practice of LSCI and critically analyze its merit in major areas of application such as retinal imaging, imaging of skin perfusion, and imaging of neurophysiology.
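    The central quantity, the local speckle contrast K = σ/μ, can be computed with a sliding window. In the toy data below (exponentially distributed intensities, a stand-in for fully developed speckle), K is near 1 for a static pattern and much lower where motion averages many independent patterns:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def speckle_contrast(img, w=7):
    # Local speckle contrast K = sigma/mean over w x w windows.
    win = sliding_window_view(img, (w, w))
    return win.std(axis=(-1, -2)) / win.mean(axis=(-1, -2))

rng = np.random.default_rng(10)
static = rng.exponential(1.0, (64, 64))                    # no motion
moving = rng.exponential(1.0, (20, 64, 64)).mean(axis=0)   # blur of 20 patterns

k_static = float(speckle_contrast(static).mean())
k_moving = float(speckle_contrast(moving).mean())
```

    Inverting the K-to-velocity model on each window then yields the relative flow maps used throughout the LSCI literature.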

  4. Statistical characterization of speckle noise in coherent imaging systems

    Science.gov (United States)

    Yaroslavsky, Leonid; Shefler, A.

    2003-05-01

    Speckle noise imposes a fundamental limitation on image quality in coherent-radiation-based imaging and optical metrology systems. Speckle noise phenomena are associated with the property of objects to diffusely scatter irradiation and with the fact that, in recording the wave field, a number of signal distortions inevitably occur due to technical limitations inherent to hologram sensors. The statistical theory of speckle noise was developed with regard only to the limited resolving power of coherent imaging devices, and it is valid only asymptotically, insofar as the central limit theorem of probability theory can be applied. In applications this assumption is not always satisfied; moreover, in treating the speckle noise problem one should also consider other sources of hologram deterioration. In this paper, the statistical properties of speckle due to the limitation of hologram size, dynamic range, and hologram signal quantization are studied by Monte Carlo simulation for holograms recorded in the near and far diffraction zones. The simulation experiments have shown that, for limited resolving power of the imaging system, the widely accepted opinion that speckle contrast is equal to one holds only for a rather severe level of hologram size limitation. For moderate limitations, speckle contrast changes gradually from zero (no limitation) to one (limitation to less than about 20% of the hologram size). The results obtained for the limitation of the hologram sensor's dynamic range and for hologram signal quantization reveal that the speckle noise due to these signal distortions is not multiplicative and is directly associated with the severity of the limitation and quantization. On the basis of the simulation results, analytical models are suggested.

  5. Simulations of multi-contrast x-ray imaging using near-field speckles

    Energy Technology Data Exchange (ETDEWEB)

    Zdora, Marie-Christine [Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany); Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire, OX11 0DE, United Kingdom and Department of Physics & Astronomy, University College London, London, WC1E 6BT (United Kingdom); Thibault, Pierre [Department of Physics & Astronomy, University College London, London, WC1E 6BT (United Kingdom); Herzen, Julia; Pfeiffer, Franz [Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany); Zanette, Irene [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire, OX11 0DE (United Kingdom); Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany)

    2016-01-28

    X-ray dark-field and phase-contrast imaging using near-field speckles is a novel technique that overcomes limitations inherent in conventional absorption x-ray imaging, i.e., poor contrast for features of similar density. Speckle-based imaging yields a wealth of information with a simple setup tolerant to polychromatic and divergent beams, and with simple data acquisition and analysis procedures. Here, we present simulation software used to model image formation in the speckle-based technique, and we compare simulated results on a phantom sample with experimental synchrotron data. Thorough simulation of a speckle-based imaging experiment will help in better understanding and optimising the technique itself.

  6. Laser speckle imaging based on photothermally driven convection

    Science.gov (United States)

    Regan, Caitlin; Choi, Bernard

    2016-02-01

    Laser speckle imaging (LSI) is an interferometric technique that provides information about the relative speed of moving scatterers in a sample. Photothermal LSI overcomes limitations in depth resolution faced by conventional LSI by incorporating an excitation pulse to target absorption by hemoglobin within the vascular network. Here we present results from experiments designed to determine the mechanism by which photothermal LSI decreases speckle contrast. We measured the impact of mechanical properties on speckle contrast, as well as the spatiotemporal temperature dynamics and bulk convective motion occurring during photothermal LSI. Our collective data strongly support the hypothesis that photothermal LSI achieves a transient reduction in speckle contrast due to bulk motion associated with thermally driven convection. The ability of photothermal LSI to image structures below a scattering medium may have important preclinical and clinical applications.

  7. Speckle Suppression by Weighted Euclidean Distance Anisotropic Diffusion

    Directory of Open Access Journals (Sweden)

    Fengcheng Guo

    2018-05-01

    To better reduce image speckle noise while maintaining edge information in synthetic aperture radar (SAR) images, we propose a novel anisotropic diffusion algorithm using weighted Euclidean distance (WEDAD). Presented here is a modified speckle reducing anisotropic diffusion (SRAD) method that constructs a new edge detection operator using weighted Euclidean distances. The new edge detection operator can adaptively distinguish between homogeneous and heterogeneous image regions, effectively generate anisotropic diffusion coefficients for each image pixel, and filter each pixel at a different scale. Additionally, the effects of two different weighting methods (Gaussian weighting and non-linear weighting) on de-noising were analyzed, and the effect of different adjustment-coefficient settings on speckle suppression was explored. A series of experiments was conducted using a noise-added image, a GF-3 SAR image, and a YG-29 SAR image. The experimental results demonstrate that the proposed method can not only significantly suppress speckle, improving the visual quality, but also better preserve the edge information of images.
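    For reference, the SRAD scheme that WEDAD modifies updates each pixel with a diffusion coefficient driven by the local coefficient of variation. A rough single-iteration sketch follows Yu and Acton's classic SRAD, not the WEDAD operator itself; wrap-around boundary handling and the parameter values are simplifications:

```python
import numpy as np

def srad_step(img, dt=0.1, q0=1.0):
    """One classic SRAD-style diffusion iteration (not the WEDAD variant)."""
    I = img.astype(float)
    # four-neighbour forward/backward differences (wrap-around borders)
    dN = np.roll(I, 1, 0) - I
    dS = np.roll(I, -1, 0) - I
    dW = np.roll(I, 1, 1) - I
    dE = np.roll(I, -1, 1) - I
    # instantaneous coefficient of variation q^2
    g2 = (dN**2 + dS**2 + dW**2 + dE**2) / (I**2 + 1e-12)
    lap = (dN + dS + dW + dE) / (I + 1e-12)
    q2 = (0.5 * g2 - (0.25 * lap) ** 2) / (1.0 + 0.25 * lap) ** 2
    # diffusion coefficient: ~1 in homogeneous speckle, ~0 at edges
    c = 1.0 / (1.0 + (q2 - q0**2) / (q0**2 * (1.0 + q0**2) + 1e-12))
    c = np.clip(c, 0.0, 1.0)
    # divergence of c * grad(I) using neighbour coefficients
    div = np.roll(c, -1, 0) * dS + c * dN + np.roll(c, -1, 1) * dE + c * dW
    return I + dt * 0.25 * div
```

    Iterating this on a speckled image smooths homogeneous regions while the coefficient c suppresses diffusion across strong gradients.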

  8. Speckle Imaging of Binary Stars with Large-Format CCDs

    Science.gov (United States)

    Horch, E.; Ninkov, Z.; Slawson, R. W.; van Altena, W. F.; Meyer, R. D.; Girard, T. M.

    1997-12-01

    In the past, bare (unintensified) CCDs have not been widely used in speckle imaging for two main reasons: 1) the readout rate of most scientific-grade CCDs is too slow to observe at the high frame rates necessary to capture speckle patterns efficiently, and 2) the read noise of CCDs limits the detectability of fainter objects, for which it becomes difficult to distinguish between speckles and noise peaks in the image. These facts have led to the current supremacy of intensified imaging systems (such as intensified CCDs) in this field, which can typically be read out at video rates or faster. We have developed a new approach that uses a large-format CCD not only to detect the incident photons but also to record many speckle patterns before the chip is read out. This approach effectively uses the large area of the CCD as a physical "memory cache" of previous speckle data frames. The method is described, and binary star observations from the University of Toronto Southern Observatory 60-cm telescope and the Wisconsin-Indiana-Yale-NOAO (WIYN) 3.5-m telescope are presented. Plans for future observing and instrumentation improvements are also outlined.

  9. Modeling laser speckle imaging of perfusion in the skin (Conference Presentation)

    Science.gov (United States)

    Regan, Caitlin; Hayakawa, Carole K.; Choi, Bernard

    2016-02-01

    Laser speckle imaging (LSI) enables visualization of relative blood flow and perfusion in the skin. It is frequently applied to monitor treatment of vascular malformations such as port wine stain birthmarks, and to measure changes in perfusion due to peripheral vascular disease. We developed a computational Monte Carlo simulation of laser speckle contrast imaging to quantify how tissue optical properties, blood vessel depths and speeds, and tissue perfusion affect speckle contrast values originating from coherent excitation. The simulated tissue geometry consisted of multiple layers to simulate the skin, or incorporated an inclusion such as a vessel or tumor at different depths. Our simulation used a 30 × 30 mm uniform flat light source to optically excite the region of interest in our sample to better mimic wide-field imaging. We used our model to simulate how dynamically scattered photons from a buried blood vessel affect speckle contrast at different lateral distances (0–1 mm) away from the vessel, and how these speckle contrast changes vary with depth (0–1 mm) and flow speed (0–10 mm/s). We applied the model to simulate perfusion in the skin, and observed how different optical properties, such as epidermal melanin concentration (1%–50%), affected speckle contrast. We simulated perfusion during a systolic forearm occlusion and found that contrast decreased by 35% (exposure time = 10 ms). Monte Carlo simulations of laser speckle contrast give us a tool to quantify what regions of the skin are probed with laser speckle imaging, and to measure how the tissue optical properties and blood flow affect the resulting images.

  10. Compensation for the signal processing characteristics of ultrasound B-mode scanners in adaptive speckle reduction.

    Science.gov (United States)

    Crawford, D C; Bell, D S; Bamber, J C

    1993-01-01

    A systematic method to compensate for nonlinear amplification of individual ultrasound B-scanners has been investigated in order to optimise performance of an adaptive speckle reduction (ASR) filter for a wide range of clinical ultrasonic imaging equipment. Three potential methods have been investigated: (1) a method involving an appropriate selection of the speckle recognition feature was successful when the scanner signal processing executes simple logarithmic compressions; (2) an inverse transform (decompression) of the B-mode image was effective in correcting for the measured characteristics of image data compression when the algorithm was implemented in full floating point arithmetic; (3) characterising the behaviour of the statistical speckle recognition feature under conditions of speckle noise was found to be the method of choice for implementation of the adaptive speckle reduction algorithm in limited precision integer arithmetic. In this example, the statistical features of variance and mean were investigated. The third method may be implemented on commercially available fast image processing hardware and is also better suited for transfer into dedicated hardware to facilitate real-time adaptive speckle reduction. A systematic method is described for obtaining ASR calibration data from B-mode images of a speckle producing phantom.
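    Method (2) above, undoing the scanner's amplitude compression before speckle filtering, can be sketched as follows. The logarithmic law and its parameters here are hypothetical; a real scanner's characteristic would have to be measured, as the abstract describes:

```python
import numpy as np

# Hypothetical B-scanner compression law: display = a * log(1 + b * envelope).
def compress(envelope, a=30.0, b=100.0):
    return a * np.log1p(b * envelope)

# Exact inverse transform, applied to the B-mode image so the adaptive
# speckle filter operates on (approximately) uncompressed echo amplitudes.
def decompress(display, a=30.0, b=100.0):
    return np.expm1(display / a) / b
```

    The round trip is exact in floating point, which is why the abstract notes this route worked when the algorithm ran in full floating-point arithmetic but was problematic in limited-precision integer arithmetic.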

  11. IMAGE ENHANCEMENT AND SPECKLE REDUCTION OF FULL POLARIMETRIC SAR DATA BY GAUSSIAN MARKOV RANDOM FIELD

    Directory of Open Access Journals (Sweden)

    M. Mahdian

    2013-09-01

    In recent years, the use of Polarimetric Synthetic Aperture Radar (PolSAR) data in different applications has increased dramatically. SAR imagery contains an interference phenomenon with random behavior called speckle noise, whose presence complicates the interpretation of the data; speckle can be considered a multiplicative noise affecting all coherent imaging systems. Indeed, speckle degrades the radiometric resolution of PolSAR images, so speckle filtering must be performed on SAR data. Markov Random Fields (MRF) have proven to be a powerful method for eliciting contextual information from remotely sensed images. In the present paper, a probability density function (PDF) that fits the PolSAR data well, according to a goodness-of-fit test, is first obtained for the pixel-wise analysis. Contextual smoothing is then achieved with the MRF method. This novel speckle reduction method combines an advanced statistical distribution with spatial contextual information for PolSAR data; the two sources of information are combined by a weighted summation of the pixel-wise and contextual models. The approach not only preserves edge information in the images but also improves the signal-to-noise ratio of the results: it maintains the mean value of the original signal in homogeneous areas and preserves feature edges in heterogeneous regions. Experiments on real medium-resolution ALOS data from Tehran, as well as high-resolution full-polarimetric SAR data over the Oberpfaffenhofen test site in Germany, demonstrate the effectiveness of the algorithm compared with well-known despeckling methods.

  12. Speckle reduction in optical coherence tomography images based on wave atoms

    Science.gov (United States)

    Du, Yongzhao; Liu, Gangjun; Feng, Guoying; Chen, Zhongping

    2014-01-01

    Optical coherence tomography (OCT) is an emerging noninvasive imaging technique based on low-coherence interferometry. OCT images suffer from speckle noise, which reduces image contrast. A shrinkage filter based on the wave atoms transform is proposed for speckle reduction in OCT images. The wave atoms transform is a new multiscale geometric analysis tool that offers sparser expansion and better representation for images containing oscillatory patterns and textures than traditional transforms such as the wavelet and curvelet transforms. Cycle-spinning-based technology is introduced to avoid visual artifacts, such as the Gibbs-like phenomenon, and to develop a translation-invariant wave atoms denoising scheme. The degree of speckle suppression in the denoised images is controlled by an adjustable parameter that determines the threshold in the wave atoms domain. The experimental results show that the proposed method can effectively remove speckle noise and improve OCT image quality. The signal-to-noise ratio, contrast-to-noise ratio, average equivalent number of looks, and cross-correlation (XCOR) values are obtained, and the results are compared with wavelet and curvelet thresholding techniques. PMID:24825507
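    Wave atoms transforms are not available in common numerical libraries, but the cycle-spinning idea the abstract describes is generic: threshold in a transform domain, repeat over shifted copies of the image, and average to suppress shift-dependent artifacts. A sketch using a 2-D DCT as a stand-in transform (the wave atoms transform itself and the OCT-specific threshold choice are not reproduced here):

```python
import numpy as np
from scipy.fft import dctn, idctn

def dct_threshold(img, thresh):
    """Hard thresholding in an orthonormal 2-D DCT domain."""
    c = dctn(img, norm='ortho')
    c[np.abs(c) < thresh] = 0.0
    return idctn(c, norm='ortho')

def cycle_spin_denoise(img, thresh, shifts=4):
    """Average the denoiser over circular shifts (translation invariance)."""
    out = np.zeros(img.shape, dtype=float)
    for dy in range(shifts):
        for dx in range(shifts):
            shifted = np.roll(np.roll(img, dy, 0), dx, 1)
            den = dct_threshold(shifted, thresh)
            out += np.roll(np.roll(den, -dy, 0), -dx, 1)
    return out / shifts**2
```

    Averaging over shifts smooths out the blocking/Gibbs-like artifacts that any fixed-grid transform thresholding produces on its own.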

  13. Spiking cortical model-based nonlocal means method for speckle reduction in optical coherence tomography images

    Science.gov (United States)

    Zhang, Xuming; Li, Liu; Zhu, Fei; Hou, Wenguang; Chen, Xinjian

    2014-06-01

    Optical coherence tomography (OCT) images are usually degraded by significant speckle noise, which will strongly hamper their quantitative analysis. However, speckle noise reduction in OCT images is particularly challenging because of the difficulty in differentiating between noise and the information components of the speckle pattern. To address this problem, the spiking cortical model (SCM)-based nonlocal means method is presented. The proposed method explores self-similarities of OCT images based on rotation-invariant features of image patches extracted by SCM and then restores the speckled images by averaging the similar patches. This method can provide sufficient speckle reduction while preserving image details very well due to its effectiveness in finding reliable similar patches under high speckle noise contamination. When applied to the retinal OCT image, this method provides signal-to-noise ratio improvements of >16 dB with a small 5.4% loss of similarity.
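    The patch-averaging core of nonlocal means can be sketched as below. This is the plain NLM baseline only; the paper's contribution, rotation-invariant patch features extracted by the spiking cortical model, is not reproduced, and the patch, search-window, and filtering parameters are illustrative:

```python
import numpy as np

def nlm_denoise(img, patch=3, search=7, h=0.5):
    """Brute-force nonlocal means: each pixel is a weighted average of
    nearby pixels, weighted by the similarity of their surrounding patches."""
    pad = patch // 2
    P = np.pad(img.astype(float), pad, mode='reflect')
    H, W = img.shape
    sr = search // 2
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            ref = P[i:i + patch, j:j + patch]
            wsum, acc = 0.0, 0.0
            for di in range(-sr, sr + 1):
                for dj in range(-sr, sr + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < H and 0 <= jj < W:
                        cand = P[ii:ii + patch, jj:jj + patch]
                        d2 = np.mean((ref - cand) ** 2)
                        w = np.exp(-d2 / h ** 2)
                        wsum += w
                        acc += w * img[ii, jj]
            out[i, j] = acc / wsum
    return out
```

    The double loop is quadratic and meant only to make the averaging explicit; practical implementations vectorize or restrict the candidate set, which is exactly where reliable patch matching under heavy speckle (the paper's focus) matters.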

  14. Signal-to-noise based local decorrelation compensation for speckle interferometry applications

    International Nuclear Information System (INIS)

    Molimard, Jerome; Cordero, Raul; Vautrin, Alain

    2008-01-01

    Speckle-based interferometric techniques allow assessment of the whole-field deformation induced in a specimen by an applied load. These high sensitivity optical techniques yield fringe images generated by subtracting speckle patterns captured while the specimen undergoes deformation. The quality of the fringes, and in turn the accuracy of the deformation measurements, strongly depends on the speckle correlation. Specimen rigid body motion leads to speckle decorrelation that, in general, cannot be effectively counteracted by applying a global translation to the involved speckle patterns. In this paper, we propose a recorrelation procedure based on locally evaluated translations. The proposed procedure divides the field into several regions, applies a local translation, and calculates, in every region, the signal-to-noise ratio (SNR). Since the latter is a correlation indicator (the noise increases with decorrelation), we argue that the proper translation is the one that maximizes the locally evaluated SNR. The search for the proper local translations is, of course, an iterative process that can be facilitated by an SNR optimization algorithm. The performance of the proposed recorrelation procedure was tested on two examples. First, the SNR optimization algorithm was applied to fringe images obtained by subtracting simulated speckle patterns. Next, it was applied to fringe images obtained with a shearography optical setup from a specimen subjected to mechanical deformation. Our results show that the proposed SNR optimization method can significantly improve the reliability of measurements performed with speckle-based techniques.
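    The local re-correlation search can be illustrated with a brute-force version: for each region, try candidate integer translations and keep the one that maximizes a correlation-type indicator. Normalized cross-correlation here stands in for the fringe SNR used in the paper:

```python
import numpy as np

def best_local_shift(ref, cur, max_shift=3):
    """Exhaustively search for the integer translation of `cur` that best
    re-correlates it with `ref` (correlation as a stand-in SNR indicator)."""
    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(cur, dy, 0), dx, 1)
            a = ref - ref.mean()
            b = shifted - shifted.mean()
            score = (a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum() + 1e-12)
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best
```

    Applying this per region, rather than once globally, is the paper's key point: rigid-body motion produces locally varying decorrelation that a single global shift cannot undo.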

  15. Evaluation of digital image correlation techniques using realistic ground truth speckle images

    International Nuclear Information System (INIS)

    Cofaru, C; Philips, W; Van Paepegem, W

    2010-01-01

    Digital image correlation (DIC) has been acknowledged and widely used in recent years in the field of experimental mechanics as a contactless method for determining full-field displacements and strains. Even though several sub-pixel motion estimation algorithms have been proposed in the literature, little is known about their accuracy and limitations in reproducing the complex underlying motion fields occurring in real mechanical tests. This paper presents a new method for evaluating sub-pixel motion estimation algorithms using ground truth speckle images that are realistically warped with artificial motion fields, obtained following two distinct approaches: in the first, the horizontal and vertical displacement fields are created according to theoretical formulas for the given type of experiment, while the second constructs the displacements through radial basis function interpolation starting from real DIC results. The method is applied in the evaluation of five DIC algorithms, with results indicating that gradient-based DIC methods generally have a quality advantage when using small block sizes and are a better choice for calculating very small displacements and strains. The Newton–Raphson algorithm is the overall best-performing method, with a notable quality advantage when large block sizes are employed and in experiments where large strain fields are of interest.
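    The first ground-truth approach, imposing an analytic displacement field on a real speckle image, can be sketched with spline resampling. The fields passed in would come from the theoretical formulas the paper mentions; this helper only performs the warping step:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_with_field(img, ux, uy):
    """Warp `img` so that features move by (ux, uy) pixels.

    Backward mapping: the output at (y, x) samples the source at
    (y - uy, x - ux), interpolated with a cubic spline.
    """
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    return map_coordinates(img, [yy - uy, xx - ux], order=3, mode='reflect')
```

    A DIC algorithm run on the (img, warped) pair can then be scored against the known (ux, uy) field, which is exactly the evaluation the paper performs.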

  16. Integration of image exposure time into a modified laser speckle imaging method

    Energy Technology Data Exchange (ETDEWEB)

    RamIrez-San-Juan, J C; Salazar-Hermenegildo, N; Ramos-Garcia, R; Munoz-Lopez, J [Optics Department, INAOE, Puebla (Mexico); Huang, Y C [Department of Electrical Engineering and Computer Science, University of California, Irvine, CA (United States); Choi, B, E-mail: jcram@inaoep.m [Beckman Laser Institute and Medical Clinic, University of California, Irvine, CA (United States)

    2010-11-21

    Speckle-based methods have been developed to characterize tissue blood flow and perfusion. One such method, called modified laser speckle imaging (mLSI), enables computation of blood flow maps with relatively high spatial resolution. Although it is known that the sensitivity and noise in LSI measurements depend on image exposure time, a fundamental disadvantage of mLSI is that it does not take into account this parameter. In this work, we integrate the exposure time into the mLSI method and provide experimental support of our approach with measurements from an in vitro flow phantom.

  17. Integration of image exposure time into a modified laser speckle imaging method

    International Nuclear Information System (INIS)

    RamIrez-San-Juan, J C; Salazar-Hermenegildo, N; Ramos-Garcia, R; Munoz-Lopez, J; Huang, Y C; Choi, B

    2010-01-01

    Speckle-based methods have been developed to characterize tissue blood flow and perfusion. One such method, called modified laser speckle imaging (mLSI), enables computation of blood flow maps with relatively high spatial resolution. Although it is known that the sensitivity and noise in LSI measurements depend on image exposure time, a fundamental disadvantage of mLSI is that it does not take into account this parameter. In this work, we integrate the exposure time into the mLSI method and provide experimental support of our approach with measurements from an in vitro flow phantom.
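    The dependence of speckle contrast on exposure time that the authors fold into mLSI is commonly modeled, for a simple single-scattering case, as K²(T) = (τc/2T)[1 − exp(−2T/τc)]; inverting it numerically recovers a correlation time from a measured contrast. This is the textbook model, not the authors' specific mLSI formulation:

```python
import numpy as np

def contrast_sq(T, tau_c):
    """K^2 as a function of exposure time T and correlation time tau_c."""
    x = 2.0 * T / tau_c
    return (1.0 - np.exp(-x)) / x

def tau_from_contrast(K, T, lo=1e-6, hi=10.0, iters=60):
    """Recover tau_c by bisection; K^2 is monotonically increasing in tau_c."""
    k2 = K * K
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if contrast_sq(T, mid) < k2:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

    The model makes the abstract's point concrete: the same flow (same τc) yields different contrast values at different exposure times, so a flow map computed without T is not comparable across instruments.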

  18. M2 FILTER FOR SPECKLE NOISE SUPPRESSION IN BREAST ULTRASOUND IMAGES

    Directory of Open Access Journals (Sweden)

    E.S. Samundeeswari

    2016-11-01

    Breast cancer, commonly found in women, is a serious, life-threatening disease due to its invasive nature. Ultrasound (US) imaging plays an effective role in the screening, early detection, and diagnosis of breast cancer. Speckle noise generally affects medical ultrasound images and causes a number of difficulties in identifying the region of interest. Suppressing speckle noise is a challenging task, as it can destroy fine edge details. No specific filter has yet been designed to produce a noise-free BUS image from one contaminated by speckle noise. In this paper the M2 filter, a novel hybrid of linear and nonlinear filters, is proposed and compared to other spatial filters with a 3×3 kernel size. The performance of the proposed M2 filter is measured by statistical quantity parameters such as MSE, PSNR, and SSI. The experimental analysis clearly shows that the proposed M2 filter outperforms the other spatial filters, achieving about 2% higher PSNR values with regard to speckle suppression.
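    The evaluation metrics named in the abstract are easy to state precisely. A sketch, taking SSI to be the common speckle suppression index (the filtered image's coefficient of variation normalized by that of the noisy image):

```python
import numpy as np

def mse(a, b):
    return float(np.mean((np.asarray(a, float) - np.asarray(b, float)) ** 2))

def psnr(a, b, peak=255.0):
    m = mse(a, b)
    return float('inf') if m == 0 else 10.0 * np.log10(peak ** 2 / m)

def ssi(filtered, noisy):
    # speckle suppression index: values below 1 indicate reduced speckle
    f, n = np.asarray(filtered, float), np.asarray(noisy, float)
    return float((f.std() / f.mean()) / (n.std() / n.mean()))
```

    Higher PSNR and lower SSI together indicate a filter that removes speckle without distorting the underlying image.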

  19. NESSI and `Alopeke: Two new dual-channel speckle imaging instruments

    Science.gov (United States)

    Scott, Nicholas J.

    2018-01-01

    NESSI and `Alopeke are two new speckle imagers built at NASA's Ames Research Center for community use at the WIYN and Gemini telescopes, respectively. The two instruments are functionally similar and include the capability for wide-field imaging in addition to speckle interferometry. The diffraction-limited imaging available through speckle effectively eliminates distortions due to Earth's atmosphere by 'freezing out' atmospheric changes: extremely short exposures are taken and the resultant speckles combined in Fourier space. This technique enables angular resolutions equal to the theoretical best possible for a given telescope, effectively giving space-based resolution from the ground. Our instruments provide the highest spatial resolution available today on any single-aperture telescope. A primary role of these instruments is exoplanet validation for the Kepler, K2, TESS, and many RV programs. Contrast ratios of 6 or more magnitudes are readily obtained. Each instrument uses two EMCCD cameras, and the simultaneous dual-color observations they provide help characterize detected companions. High-resolution imaging enables the identification of blended binaries that contaminate many exoplanet detections and lead to incorrectly measured radii. In this way small, rocky systems, such as Kepler-186b and the TRAPPIST-1 planet family, may be validated and the detected planets' radii correctly measured.

  20. A Novel Sub-pixel Measurement Algorithm Based on Mixed the Fractal and Digital Speckle Correlation in Frequency Domain

    Directory of Open Access Journals (Sweden)

    Zhangfang Hu

    2014-10-01

    Digital speckle correlation is a non-contact, in-plane displacement measurement method based on machine vision. Motivated by the low accuracy and heavy computational load of the traditional spatial-domain digital speckle correlation method, we introduce a sub-pixel displacement measurement algorithm that employs a fast interpolation method based on fractal theory together with digital speckle correlation in the frequency domain. The algorithm avoids both the blocking effect and the blurring caused by traditional interpolation methods, and the frequency-domain processing avoids the repeated searching required for correlation recognition in the spatial domain; the amount of computation is thus greatly reduced and the speed of information extraction improved. A comparative experiment verifies that the proposed algorithm is effective.
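    Frequency-domain correlation recognition at the integer-pixel level can be sketched with standard phase correlation; the fractal-interpolation sub-pixel refinement that is the paper's contribution is not reproduced here:

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Integer displacement (dy, dx) such that b ~ np.roll(a, (dy, dx), (0, 1))."""
    R = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
    R /= np.abs(R) + 1e-12                    # whiten: keep phase only
    corr = np.fft.ifft2(R).real               # correlation surface, peak at shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # map wrapped peak positions to signed shifts
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return dy, dx
```

    A single FFT-based evaluation replaces the exhaustive spatial search, which is the source of the speed-up the abstract claims; sub-pixel accuracy then comes from interpolating around the correlation peak.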

  1. Correcting for motion artifact in handheld laser speckle images

    Science.gov (United States)

    Lertsakdadet, Ben; Yang, Bruce Y.; Dunn, Cody E.; Ponticorvo, Adrien; Crouzet, Christian; Bernal, Nicole; Durkin, Anthony J.; Choi, Bernard

    2018-03-01

    Laser speckle imaging (LSI) is a wide-field optical technique that enables superficial blood flow quantification. LSI is normally performed in a mounted configuration to decrease the likelihood of motion artifact. However, mounted LSI systems are cumbersome and difficult to transport quickly in a clinical setting for which portability is essential in providing bedside patient care. To address this issue, we created a handheld LSI device using scientific grade components. To account for motion artifact of the LSI device used in a handheld setup, we incorporated a fiducial marker (FM) into our imaging protocol and determined the difference between highest and lowest speckle contrast values for the FM within each data set (Kbest and Kworst). The difference between Kbest and Kworst in mounted and handheld setups was 8% and 52%, respectively, thereby reinforcing the need for motion artifact quantification. When using a threshold FM speckle contrast value (KFM) to identify a subset of images with an acceptable level of motion artifact, mounted and handheld LSI measurements of speckle contrast of a flow region (KFLOW) in in vitro flow phantom experiments differed by 8%. Without the use of the FM, mounted and handheld KFLOW values differed by 20%. To further validate our handheld LSI device, we compared mounted and handheld data from an in vivo porcine burn model of superficial and full thickness burns. The speckle contrast within the burn region (KBURN) of the mounted and handheld LSI data differed by burns. Collectively, our results suggest the potential of handheld LSI with an FM as a suitable alternative to mounted LSI, especially in challenging clinical settings with space limitations such as the intensive care unit.
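    The FM-based gating described above can be sketched as: compute the speckle contrast inside the (static) fiducial-marker region for every frame, then keep the frames whose FM contrast stays above a threshold, since motion blur lowers contrast on a static target. The ROI coordinates and threshold below are illustrative:

```python
import numpy as np

def roi_contrast(frame, roi):
    """Speckle contrast (std / mean) inside a rectangular ROI."""
    y0, y1, x0, x1 = roi
    r = frame[y0:y1, x0:x1].astype(float)
    return float(r.std() / r.mean())

def select_stable_frames(stack, roi, k_min):
    """Indices of frames whose static fiducial marker keeps high contrast,
    i.e. frames with an acceptable level of motion artifact."""
    ks = np.array([roi_contrast(f, roi) for f in stack])
    return np.nonzero(ks >= k_min)[0]
```

    Flow statistics are then computed only over the selected subset, which is how the study brought handheld measurements close to the mounted reference.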

  2. Speckle suppression via sparse representation for wide-field imaging through turbid media.

    Science.gov (United States)

    Jang, Hwanchol; Yoon, Changhyeong; Chung, Euiheon; Choi, Wonshik; Lee, Heung-No

    2014-06-30

    Speckle suppression is one of the most important tasks in image transmission through turbid media. Insufficient speckle suppression requires an additional procedure such as temporal ensemble averaging over multiple exposures. In this paper, we consider the image recovery process based on the so-called transmission matrix (TM) of turbid media for image transmission through the media. We show that the speckle left unremoved in TM-based image recovery can be suppressed effectively via sparse representation (SR). SR is a relatively new signal reconstruction framework that works well even for ill-conditioned problems. This is the first study to show the benefit of using SR compared to phase conjugation (PC), the de facto standard method to date for TM-based imaging through turbid media, including imaging a live cell through a tissue slice.
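    Sparse-representation recovery in the TM framework amounts to solving y = Tx under an ℓ1 penalty. A minimal ISTA sketch follows; the actual TM dimensions, calibration, and the paper's choice of sparsifying basis are not modeled:

```python
import numpy as np

def ista(T, y, lam=0.02, iters=500):
    """Iterative soft-thresholding for min_x 0.5*||T x - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(T, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(T.shape[1])
    for _ in range(iters):
        z = x - T.T @ (T @ x - y) / L      # gradient step on the data term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrinkage
    return x
```

    Unlike phase conjugation, which applies T's conjugate transpose once, the iteration enforces sparsity of the recovered image and thereby suppresses residual speckle even when T is ill-conditioned.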

  3. Optoelectronic imaging of speckle using image processing method

    Science.gov (United States)

    Wang, Jinjiang; Wang, Pengfei

    2018-01-01

    Detailed image processing of laser speckle interferometry is presented as an example for a postgraduate course. Several image processing methods were combined to handle the optoelectronic imaging system: partial differential equations (PDEs) are used to reduce the effect of noise; thresholding segmentation, also based on a PDE heat equation, is applied; the central line is extracted from the image skeleton, with branches removed automatically; the phase level is calculated by spline interpolation; and the fringe phase is unwrapped. Finally, the image processing method was used to automatically measure a bubble in rubber under negative pressure, which could be applied in tire detection.

  4. Three Dimensional Speckle Imaging Employing a Frequency-Locked Tunable Diode Laser

    Energy Technology Data Exchange (ETDEWEB)

    Cannon, Bret D.; Bernacki, Bruce E.; Schiffern, John T.; Mendoza, Albert

    2015-09-01

    We describe a high accuracy frequency-stepping method for a tunable diode laser that improves a three-dimensional (3D) imaging approach based upon interferometric speckle imaging. The approach, modeled after Takeda, exploits tuning an illumination laser in frequency as speckle interferograms of the object (specklegrams) are acquired at each frequency in a Michelson interferometer. The resulting 3D hypercube of specklegrams encodes spatial information in the x-y plane of each image, with laser tuning arrayed along its z-axis. We present before-and-after laboratory data showing the enhanced 3D imaging that results from precise laser frequency control.

  5. ARTIFICIAL INCOHERENT SPECKLES ENABLE PRECISION ASTROMETRY AND PHOTOMETRY IN HIGH-CONTRAST IMAGING

    Energy Technology Data Exchange (ETDEWEB)

    Jovanovic, N.; Guyon, O.; Pathak, P.; Kudo, T. [National Astronomical Observatory of Japan, Subaru Telescope, 650 North A’Ohoku Place, Hilo, HI, 96720 (United States); Martinache, F. [Observatoire de la Cote d’Azur, Boulevard de l’Observatoire, F-06304 Nice (France); Hagelberg, J., E-mail: jovanovic.nem@gmail.com [Institute for Astronomy, University of Hawaii, 2680 Woodlawn Drive, Honolulu, HI 96822 (United States)

    2015-11-10

    State-of-the-art coronagraphs employed on extreme adaptive optics enabled instruments are constantly improving the contrast detection limit for companions at ever-closer separations from the host star. In order to constrain their properties and, ultimately, compositions, it is important to precisely determine orbital parameters and contrasts with respect to the stars they orbit. This can be difficult in the post-coronagraphic image plane, as by definition the central star has been occulted by the coronagraph. We demonstrate the flexibility of utilizing the deformable mirror in the adaptive optics system of the Subaru Coronagraphic Extreme Adaptive Optics system to generate a field of speckles for the purposes of calibration. Speckles can be placed up to 22.5 λ/D from the star, with any position angle, brightness, and abundance required. Most importantly, we show that a fast modulation of the added speckle phase, between 0 and π, during a long science integration renders these speckles effectively incoherent with the underlying halo. We quantitatively show for the first time that this incoherence, in turn, increases the robustness and stability of the adaptive speckles, which will improve the precision of astrometric and photometric calibration procedures. This technique will be valuable for high-contrast imaging observations with imagers and integral field spectrographs alike.

  6. Software phantom with realistic speckle modeling for validation of image analysis methods in echocardiography

    Science.gov (United States)

    Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten

    2014-03-01

    Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, make the majority of standard image analysis methods suboptimal. Furthermore, validation of adapted computer vision methods proves difficult due to missing ground truth information. There is no widely accepted software phantom in the community, and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with a realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy of the resulting textures.

  7. Laser speckle imaging: a novel method for detecting dental erosion.

    Directory of Open Access Journals (Sweden)

    Nelson H Koshoji

    Erosion is a highly prevalent condition known as a non-carious lesion that causes progressive tooth wear through chemical processes that do not involve the action of bacteria. Speckle images have proved sensitive to even minimal mineral loss from the enamel. The aim of the present study was to investigate the use of laser speckle imaging analysis in the spatial domain to quantify shifts in the microstructure of the tooth surface in an erosion model. Thirty-two fragments of the vestibular surface of bovine incisors were divided into four groups (10, 20, 30, and 40 min of acid etching) and immersed in a cola-based beverage (pH approximately 2.5) twice a day for 7 days to create artificial erosion. Analyzing the laser speckle contrast map (LASCA) in the eroded region compared with the sound region, it was observed that the LASCA map shifts proportionally to the acid etch duration: by 18%, 23%, 39%, and 44% for the 10, 20, 30, and 40 min groups, respectively. To the best of our knowledge, this is the first study to demonstrate the correlation between speckle patterns and erosion progression.

  8. A practical approach to optimizing the preparation of speckle patterns for digital-image correlation

    International Nuclear Information System (INIS)

    Lionello, Giacomo; Cristofolini, Luca

    2014-01-01

    The quality of strain measurements by digital image correlation (DIC) strongly depends on the quality of the pattern on the specimen’s surface. An ideal pattern should be highly contrasted, stochastic, and isotropic. In addition, the speckle pattern should have an average size that exceeds the image pixel size by a factor of 3–5. (Smaller speckles cause poor contrast, and larger speckles cause poor spatial resolution.) Finally, the ideal pattern should have a limited scatter in terms of speckle sizes. The aims of this study were: (i) to define the ideal speckle size in relation to the specimen size and acquisition system; (ii) provide practical guidelines to identify the optimal settings of an airbrush gun, in order to produce a pattern that is as close as possible to the desired one while minimizing the scatter of speckle sizes. Patterns of different sizes were produced using two different airbrush guns with different settings of the four most influential factors (dilution, airflow setting, spraying distance, and air pressure). A full-factorial DOE strategy was implemented to explore the four factors at two levels each: 36 specimens were analyzed for each of the 16 combinations. The images were acquired using the digital cameras of a DIC system. The distribution of speckle sizes was analyzed to calculate the average speckle size and the standard deviation of the corresponding truncated Gaussian distribution. A mathematical model was built to enable prediction of the average speckle size in relation to the airbrush gun settings. We showed that it is possible to obtain a pattern with a highly controlled average and a limited scatter of speckle sizes, so as to match the ideal distribution of speckle sizes for DIC. Although the settings identified here apply only to the specific equipment being used, this method can be adapted to any airbrush to produce a desired speckle pattern. (technical design note)
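
    A common way to quantify the average speckle size discussed above is the full width at half maximum (FWHM) of the intensity autocorrelation peak. The sketch below assumes a square image and integer-pixel precision (the paper instead fits a truncated Gaussian to the measured size distribution):

```python
import numpy as np

def speckle_size_autocorr(img):
    """Estimate mean speckle size (in pixels) as the FWHM of the
    intensity autocorrelation peak along the central row."""
    I = img - img.mean()
    # Wiener-Khinchin: autocorrelation via the power spectrum
    ac = np.fft.ifft2(np.abs(np.fft.fft2(I)) ** 2).real
    ac = np.fft.fftshift(ac)
    row = ac[ac.shape[0] // 2]
    row = row / row.max()
    c = int(np.argmax(row))
    below = np.where(row < 0.5)[0]
    right = below[below > c].min()
    left = below[below < c].max()
    return right - left - 1  # FWHM in pixels
```

    On synthetic speckle generated from a random-phase pupil, a smaller pupil (coarser speckle) yields a larger FWHM, matching the inverse relation between aperture and speckle size.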

  9. Speckle Reduction and Structure Enhancement by Multichannel Median Boosted Anisotropic Diffusion

    Directory of Open Access Journals (Sweden)

    Yang Zhi

    2004-01-01

    Full Text Available We propose a new approach to reduce speckle noise and enhance structures in speckle-corrupted images. It utilizes a median-anisotropic diffusion compound scheme. The median-filter-based reaction term acts as a guided energy source to boost the structures in the image being processed. In addition, it regularizes the diffusion equation to ensure the existence and uniqueness of a solution. We also introduce a decimation and back reconstruction scheme to further enhance the processing result. Before the iteration of the diffusion process, the image is decimated and a subpixel shifted image set is formed. This allows a multichannel parallel diffusion iteration, and more importantly, the speckle noise is broken into impulsive or salt-pepper noise, which is easy to remove by median filtering. The advantage of the proposed technique is clear when it is compared to other diffusion algorithms and the well-known adaptive weighted median filtering (AWMF scheme in both simulation and real medical ultrasound images.
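
    A toy illustration of the compound scheme: Perona-Malik diffusion with a reaction term that pulls the solution toward a 3x3 median estimate. All parameter values are illustrative, not the authors'; the decimation/back-reconstruction stage is omitted:

```python
import numpy as np

def median3(u):
    """3x3 median filter with edge replication (pure NumPy)."""
    p = np.pad(u, 1, mode="edge")
    stack = np.stack([p[i:i + u.shape[0], j:j + u.shape[1]]
                      for i in range(3) for j in range(3)])
    return np.median(stack, axis=0)

def median_boosted_diffusion(img, n_iter=30, kappa=0.5, dt=0.15, lam=0.2):
    """Perona-Malik diffusion plus a median-filter reaction term."""
    u = img.astype(float)
    g = lambda d: np.exp(-(d / kappa) ** 2)  # edge-stopping conduction
    for _ in range(n_iter):
        dN = np.roll(u, 1, 0) - u
        dS = np.roll(u, -1, 0) - u
        dE = np.roll(u, -1, 1) - u
        dW = np.roll(u, 1, 1) - u
        diffusion = g(dN) * dN + g(dS) * dS + g(dE) * dE + g(dW) * dW
        # the median term acts as a guided energy source: it boosts
        # structure while suppressing impulsive residue
        u = u + dt * diffusion + lam * (median3(u) - u)
    return u
```

    On a noisy step image this reduces the error against the clean image while the conduction function keeps the edge largely intact.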

  10. The Research on Denoising of SAR Image Based on Improved K-SVD Algorithm

    Science.gov (United States)

    Tan, Linglong; Li, Changkai; Wang, Yueqin

    2018-04-01

    SAR images often suffer from noise interference during acquisition and transmission, which can greatly reduce image quality and cause great difficulties for image processing. The existing complete DCT dictionary algorithm is fast, but its denoising effect is poor. In this paper, to address this poor denoising performance, an improved K-SVD (K-means and singular value decomposition) algorithm is applied to image noise suppression. Firstly, the sparse dictionary structure is introduced in detail. The dictionary has a compact representation and can effectively train the image signal. Then, the sparse dictionary is trained by the K-SVD algorithm according to the sparse representation of the dictionary. The algorithm has more advantages in high-dimensional data processing. Experimental results show that the proposed algorithm removes speckle noise more effectively than the complete DCT dictionary and better retains edge details.
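
    To illustrate the K-SVD building blocks (sparse coding followed by rank-1 SVD atom updates), here is a deliberately simplified 1-sparse variant; a real K-SVD denoiser codes image patches with OMP at higher sparsity levels:

```python
import numpy as np

def ksvd_1sparse(Y, n_atoms, n_iter=10, seed=0):
    """Toy K-SVD: each column of Y is approximated by one scaled atom
    (1-sparse coding); each atom is refreshed by a rank-1 SVD of the
    signals assigned to it. Returns the dictionary and error history."""
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((Y.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0)
    errs = []
    for _ in range(n_iter):
        # sparse coding step: best-matching atom and coefficient per signal
        corr = D.T @ Y
        idx = np.argmax(np.abs(corr), axis=0)
        coef = corr[idx, np.arange(Y.shape[1])]
        # dictionary update step (atoms decouple at sparsity 1)
        for k in range(n_atoms):
            users = np.where(idx == k)[0]
            if users.size == 0:
                continue
            U, s, Vt = np.linalg.svd(Y[:, users], full_matrices=False)
            D[:, k] = U[:, 0]
            coef[users] = s[0] * Vt[0]
        errs.append(np.linalg.norm(Y - D[:, idx] * coef))
    return D, errs
```

    Because both steps are optimal for the current assignment, the reconstruction error is non-increasing across iterations.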

  11. Speckle Noise Reduction for the Enhancement of Retinal Layers in Optical Coherence Tomography Images

    Directory of Open Access Journals (Sweden)

    Fereydoon Nowshiravan Rahatabad

    2015-09-01

    Full Text Available Introduction One of the most important pre-processing steps in optical coherence tomography (OCT is reducing speckle noise, resulting from multiple scattering of tissues, which degrades the quality of OCT images. Materials and Methods The present study focused on speckle noise reduction and edge detection techniques. Statistical filters with different masks and noise variances were applied on OCT and test images. Objective evaluation of both types of images was performed, using various image metrics such as peak signal-to-noise ratio (PSNR, root mean square error, correlation coefficient and elapsed time. For the purpose of recovery, Kuan filter was used as an input for edge enhancement. Also, a spatial filter was applied to improve image quality. Results The obtained results were presented as statistical tables and images. Based on statistical measures and visual quality of OCT images, Enhanced Lee filter (3×3 with a PSNR value of 43.6735 in low noise variance and Kuan filter (3×3 with a PSNR value of 37.2850 in high noise variance showed superior performance over other filters. Conclusion Based on the obtained results, by using speckle reduction filters such as Enhanced Lee and Kuan filters on OCT images, the number of compounded images, required to achieve a given image quality, could be reduced. Moreover, use of Kuan filters for promoting the edges allowed smoothing of speckle regions, while preserving image tissue texture.
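
    The Lee filter family evaluated above adapts local smoothing to local statistics. A minimal sketch of the classic (non-enhanced) Lee filter in NumPy; the global noise-variance guess and window size are illustrative assumptions:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def lee_filter(img, win=3, noise_var=None):
    """Classic Lee speckle filter: blend each pixel with its local mean,
    weighted by how much the local variance exceeds the noise variance."""
    img = img.astype(float)
    pad = win // 2
    p = np.pad(img, pad, mode="reflect")
    w = sliding_window_view(p, (win, win))
    mu = w.mean(axis=(-2, -1))
    var = w.var(axis=(-2, -1))
    if noise_var is None:
        noise_var = np.mean(var)  # crude global noise estimate
    gain = np.clip((var - noise_var) / np.maximum(var, 1e-12), 0, 1)
    # homogeneous regions (gain near 0) collapse to the local mean;
    # edges (gain near 1) keep the original pixel
    return mu + gain * (img - mu)
```

    On a homogeneous speckled region the output variance drops while the mean level is preserved, which is the behaviour the PSNR comparisons above measure.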

  12. An algorithm for improving the quality of structural images of turbid media in endoscopic optical coherence tomography

    Science.gov (United States)

    Potlov, A. Yu.; Frolov, S. V.; Proskurin, S. G.

    2018-04-01

    An algorithm for reconstructing high-quality structural OCT images of biological tissue in endoscopic optical coherence tomography is described. The key features of the presented algorithm are: (1) raster scanning with averaging of adjacent A-scans and pixels; (2) speckle level minimization. The described algorithm can be used in gastroenterology, urology, gynecology and otorhinolaryngology for in vivo and in situ diagnostics of mucous membranes and skin.

  13. Motion Estimation Using the Firefly Algorithm in Ultrasonic Image Sequence of Soft Tissue

    Directory of Open Access Journals (Sweden)

    Chih-Feng Chao

    2015-01-01

    Full Text Available Ultrasonic image sequences of soft tissue are widely used in disease diagnosis; however, speckle noise usually degrades the image quality. These images typically have a low signal-to-noise ratio, which makes traditional motion estimation algorithms unsuitable for measuring the motion vectors. In this paper, a new motion estimation algorithm is developed for assessing the velocity field of soft tissue in a sequence of ultrasonic B-mode images. The proposed iterative firefly algorithm (IFA) searches a few candidate points to obtain the optimal motion vector, and is compared to the traditional iterative full search algorithm (IFSA) via a series of experiments on in vivo ultrasonic image sequences. The experimental results show that the IFA can assess the vector with better efficiency and almost equal estimation quality compared to the traditional IFSA method.
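
    The baseline mentioned above, iterative full search, is exhaustive block matching. A sketch with a sum-of-absolute-differences (SAD) criterion; block size and search range are arbitrary illustrative values:

```python
import numpy as np

def full_search_motion(ref, cur, block=8, search=4):
    """Exhaustive block-matching motion estimation (SAD criterion).
    Returns a dict mapping block origin (y, x) -> motion vector (dy, dx)."""
    H, W = ref.shape
    vectors = {}
    for by in range(0, H - block + 1, block):
        for bx in range(0, W - block + 1, block):
            tpl = ref[by:by + block, bx:bx + block]
            best, best_v = np.inf, (0, 0)
            # test every candidate displacement in the search window
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > H or x + block > W:
                        continue
                    sad = np.abs(cur[y:y + block, x:x + block] - tpl).sum()
                    if sad < best:
                        best, best_v = sad, (dy, dx)
            vectors[(by, bx)] = best_v
    return vectors
```

    The firefly algorithm of the paper replaces this exhaustive candidate scan with a small population of guided candidate points, trading exhaustiveness for speed.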

  14. SU-D-210-05: The Accuracy of Raw and B-Mode Image Data for Ultrasound Speckle Tracking in Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    O’Shea, T; Bamber, J; Harris, E [The Institute of Cancer Research & Royal Marsden, Sutton and London (United Kingdom)

    2015-06-15

    Purpose: For ultrasound speckle tracking there is some evidence that the envelope-detected signal (the main step in B-mode image formation) may be more accurate than raw ultrasound data for tracking larger inter-frame tissue motion. This study investigates the accuracy of raw radio-frequency (RF) versus non-logarithmically compressed envelope-detected (B-mode) data for ultrasound speckle tracking in the context of image-guided radiation therapy. Methods: Transperineal ultrasound RF data were acquired (with a 7.5 MHz linear transducer operating at a 12 Hz frame rate) from a speckle phantom moving with realistic intra-fraction prostate motion derived from a commercial tracking system. A normalised cross-correlation template matching algorithm was used to track speckle motion at the focus using (i) the RF signal and (ii) the B-mode signal. A range of imaging rates (0.5 to 12 Hz) was simulated by decimating the imaging sequences, thereby simulating larger to smaller inter-frame displacements. Motion estimation accuracy was quantified by comparison with the known phantom motion. Results: The differences between RF and B-mode motion estimation accuracy (2D mean and 95% errors relative to ground-truth displacements) were less than 0.01 mm for stable and persistent motion types and 0.2 mm for transient motion, for imaging rates of 0.5 to 12 Hz. The mean correlation for all motion types and imaging rates was 0.851 and 0.845 for RF and B-mode data, respectively. Data type is expected to have the most impact on axial (Superior-Inferior) motion estimation. Axial differences were <0.004 mm for stable and persistent motion and <0.3 mm for transient motion (axial mean errors were lowest for B-mode in all cases). Conclusions: Using the RF or B-mode signal for speckle motion estimation is comparable for translational prostate motion. B-mode image formation may involve other signal-processing steps which also influence motion estimation accuracy.
A similar study for respiratory-induced motion

  15. Fast Superpixel Segmentation Algorithm for PolSAR Images

    Directory of Open Access Journals (Sweden)

    Zhang Yue

    2017-10-01

    Full Text Available As a pre-processing technique, superpixel segmentation algorithms should offer high computational efficiency, accurate boundary adherence and regular shape in homogeneous regions. A fast superpixel segmentation algorithm based on Iterative Edge Refinement (IER) has been shown to be applicable to optical images. However, it is difficult to obtain ideal results when IER is applied directly to PolSAR images, due to the speckle noise and the small or slim regions in PolSAR images. To address these problems, in this study, the unstable pixel set is initialized as all the pixels in the PolSAR image instead of the initial grid edge pixels. In the local relabeling of the unstable pixels, the fast revised Wishart distance is utilized instead of the Euclidean distance in CIELAB color space. Then, a post-processing procedure based on a dissimilarity measure is employed to remove isolated small superpixels as well as to retain the strong point targets. Finally, extensive experiments based on a simulated image and a real-world PolSAR image from Airborne Synthetic Aperture Radar (AirSAR are conducted, showing that the proposed algorithm, compared with three state-of-the-art methods, performs better in terms of several commonly used evaluation criteria, with high computational efficiency, accurate boundary adherence, and homogeneous regularity.

  16. Local scattering property scales flow speed estimation in laser speckle contrast imaging

    International Nuclear Information System (INIS)

    Miao, Peng; Chao, Zhen; Feng, Shihan; Ji, Yuanyuan; Yu, Hang; Thakor, Nitish V; Li, Nan

    2015-01-01

    Laser speckle contrast imaging (LSCI) has been widely used for in vivo blood flow imaging. However, the effect of the local scattering property (scattering coefficient µs) on blood flow speed estimation has not been well investigated. In this study, this effect was quantified and incorporated into the relation between the speckle autocorrelation time τc and the flow speed v, based on simulated flow experiments. For in vivo blood flow imaging, an improved estimation strategy was developed to eliminate the estimation bias due to the inhomogeneous distribution of the scattering property. Compared to traditional LSCI, the new estimation method significantly suppresses imaging noise and improves the imaging contrast of vasculature. Furthermore, the new method successfully captured the blood flow changes and vascular constriction patterns in the rat cerebral cortex from normothermia to mild and moderate hypothermia. (letter)

  17. Comparison of classification algorithms for various methods of preprocessing radar images of the MSTAR base

    Science.gov (United States)

    Borodinov, A. A.; Myasnikov, V. V.

    2018-04-01

    The present work is devoted to comparing the accuracy of known classification algorithms in the task of recognizing local objects in radar images for various image preprocessing methods. Preprocessing involves speckle noise filtering and normalization of the object orientation in the image, by the method of image moments and by a method based on the Hough transform. The following classification algorithms are compared: decision tree, support vector machine, AdaBoost, and random forest. Principal component analysis is used to reduce the dimensionality. The research is carried out on objects from the MSTAR radar image base. The paper presents the results of the conducted studies.

  18. Objective speckle velocimetry for autonomous vehicle odometry.

    Science.gov (United States)

    Francis, D; Charrett, T O H; Waugh, L; Tatam, R P

    2012-06-01

    Speckle velocimetry is investigated as a means of determining odometry data with potential for application on autonomous robotic vehicles. The technique described here relies on the integration of translation measurements made by normalized cross-correlation of speckle patterns to determine the change in position over time. The use of objective (non-imaged) speckle offers a number of advantages over subjective (imaged) speckle, such as a reduction in the number of optical components, reduced modulation of speckles at the edges of the image, and improved light efficiency. The influence of the source/detector configuration on the speckle-translation-to-vehicle-translation scaling factor for objective speckle is investigated using a computer model and verified experimentally. Experimental measurements are presented at velocities up to 80 mm s^-1, which show accuracy better than 0.4%.
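
    The speckle-translation measurement underlying this approach is usually computed via FFT-based cross-correlation. A minimal integer-pixel sketch (a real velocimeter would add normalization and sub-pixel peak interpolation):

```python
import numpy as np

def xcorr_shift(a, b):
    """Estimate the integer-pixel translation of b relative to a by
    locating the peak of their FFT-based circular cross-correlation."""
    A = np.fft.fft2(a - a.mean())
    B = np.fft.fft2(b - b.mean())
    cc = np.fft.ifft2(np.conj(A) * B).real
    peak = np.unravel_index(np.argmax(cc), cc.shape)
    # map peak coordinates to signed shifts (wrap-around convention)
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, cc.shape))
```

    Integrating such per-frame shifts over time, scaled by the source/detector geometry factor discussed above, yields the odometry estimate.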

  19. Speckle noise reduction for optical coherence tomography based on adaptive 2D dictionary

    Science.gov (United States)

    Lv, Hongli; Fu, Shujun; Zhang, Caiming; Zhai, Lin

    2018-05-01

    As a high-resolution biomedical imaging modality, optical coherence tomography (OCT) is widely used in medical sciences. However, OCT images often suffer from speckle noise, which can mask some important image information, and thus reduce the accuracy of clinical diagnosis. Taking full advantage of nonlocal self-similarity and adaptive 2D-dictionary-based sparse representation, in this work, a speckle noise reduction algorithm is proposed for despeckling OCT images. To reduce speckle noise while preserving local image features, similar nonlocal patches are first extracted from the noisy image and put into groups using a gamma-distribution-based block matching method. An adaptive 2D dictionary is then learned for each patch group. Unlike traditional vector-based sparse coding, we express each image patch by the linear combination of a few matrices. This image-to-matrix method can exploit the local correlation between pixels. Since each image patch might belong to several groups, the despeckled OCT image is finally obtained by aggregating all filtered image patches. The experimental results demonstrate the superior performance of the proposed method over other state-of-the-art despeckling methods, in terms of objective metrics and visual inspection.

  20. Simulations of x-ray speckle-based dark-field and phase-contrast imaging with a polychromatic beam

    Energy Technology Data Exchange (ETDEWEB)

    Zdora, Marie-Christine, E-mail: marie-christine.zdora@diamond.ac.uk [Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany); Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom); Department of Physics & Astronomy, University College London, London WC1E 6BT (United Kingdom); Thibault, Pierre [Department of Physics & Astronomy, University College London, London WC1E 6BT (United Kingdom); Pfeiffer, Franz [Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany); Zanette, Irene [Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany); Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom)

    2015-09-21

    Following the first experimental demonstration of x-ray speckle-based multimodal imaging using a polychromatic beam [I. Zanette et al., Phys. Rev. Lett. 112(25), 253903 (2014)], we present a simulation study on the effects of a polychromatic x-ray spectrum on the performance of this technique. We observe that the contrast of the near-field speckles is only mildly influenced by the bandwidth of the energy spectrum. Moreover, using a homogeneous object with simple geometry, we characterize the beam hardening artifacts in the reconstructed transmission and refraction angle images, and we describe how the beam hardening also affects the dark-field signal provided by speckle tracking. This study is particularly important for further implementations and developments of coherent speckle-based techniques at laboratory x-ray sources.

  1. Color quality improvement of reconstructed images in color digital holography using speckle method and spectral estimation

    Science.gov (United States)

    Funamizu, Hideki; Onodera, Yusei; Aizu, Yoshihisa

    2018-05-01

    In this study, we report color quality improvement of reconstructed images in color digital holography using the speckle method and the spectral estimation. In this technique, an object is illuminated by a speckle field and then an object wave is produced, while a plane wave is used as a reference wave. For three wavelengths, the interference patterns of two coherent waves are recorded as digital holograms on an image sensor. Speckle fields are changed by moving a ground glass plate in an in-plane direction, and a number of holograms are acquired to average the reconstructed images. After the averaging process of images reconstructed from multiple holograms, we use the Wiener estimation method for obtaining spectral transmittance curves in reconstructed images. The color reproducibility in this method is demonstrated and evaluated using a Macbeth color chart film and staining cells of onion.

  2. Measurement of deformation field in CT specimen using laser speckle

    International Nuclear Information System (INIS)

    Jeon, Moon Chang; Kang, Ki Ju

    2001-01-01

    To obtain A2 experimentally in the J-A2 theory, the deformation field on the lateral surface of a CT specimen was determined using the laser speckle method. Crack growth was measured using the direct current potential drop method, and most of the experimental procedure and data reduction was performed according to ASTM Standard E1737-96. Laser speckle images during crack propagation were monitored by two CCD cameras to cancel the effects of rotation and translation of the specimen. An algorithm to track the displacement of a point in each image was developed and successfully used to measure A2 continuously as the crack tip propagated. The effects of specimen thickness on the J-R curve and A2 were explored

  3. Quantitative, depth-resolved determination of particle motion using multi-exposure, spatial frequency domain laser speckle imaging.

    Science.gov (United States)

    Rice, Tyler B; Kwan, Elliott; Hayakawa, Carole K; Durkin, Anthony J; Choi, Bernard; Tromberg, Bruce J

    2013-01-01

    Laser Speckle Imaging (LSI) is a simple, noninvasive technique for rapid imaging of particle motion in scattering media such as biological tissue. LSI is generally used to derive a qualitative index of relative blood flow due to unknown impact from several variables that affect speckle contrast. These variables may include optical absorption and scattering coefficients, multi-layer dynamics including static, non-ergodic regions, and systematic effects such as laser coherence length. In order to account for these effects and move toward quantitative, depth-resolved LSI, we have developed a method that combines Monte Carlo modeling, multi-exposure speckle imaging (MESI), spatial frequency domain imaging (SFDI), and careful instrument calibration. Monte Carlo models were used to generate total and layer-specific fractional momentum transfer distributions. This information was used to predict speckle contrast as a function of exposure time, spatial frequency, layer thickness, and layer dynamics. To verify with experimental data, controlled phantom experiments with characteristic tissue optical properties were performed using a structured light speckle imaging system. Three main geometries were explored: 1) diffusive dynamic layer beneath a static layer, 2) static layer beneath a diffuse dynamic layer, and 3) directed flow (tube) submerged in a dynamic scattering layer. Data fits were performed using the Monte Carlo model, which accurately reconstructed the type of particle flow (diffusive or directed) in each layer, the layer thickness, and absolute flow speeds to within 15% or better.
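
    The multi-exposure analysis above builds on the standard single-tau relation between speckle contrast and exposure time T: K(T) = sqrt(beta * (exp(-2x) - 1 + 2x) / (2x^2)) with x = T/tau_c. The brute-force grid fit below is a simplification of the paper's Monte-Carlo-based fitting, with beta fixed to 1 for illustration:

```python
import numpy as np

def contrast_model(T, tau_c, beta=1.0):
    """Speckle contrast vs exposure time for a single correlation time:
    K^2 = beta * (exp(-2x) - 1 + 2x) / (2x^2), x = T / tau_c."""
    x = T / tau_c
    return np.sqrt(beta * (np.exp(-2 * x) - 1 + 2 * x) / (2 * x ** 2))

def fit_tau(T, K, taus):
    """Brute-force least-squares fit of tau_c over a candidate grid."""
    errs = [np.sum((contrast_model(T, t) - K) ** 2) for t in taus]
    return taus[int(np.argmin(errs))]
```

    Longer exposures average more speckle decorrelation, so K decreases monotonically with T; fitting the curve recovers the correlation time, and hence relative flow speed.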

  4. Speckle perception and disturbance limit in laser based projectors

    Science.gov (United States)

    Verschaffelt, Guy; Roelandt, Stijn; Meuret, Youri; Van den Broeck, Wendy; Kilpi, Katriina; Lievens, Bram; Jacobs, An; Janssens, Peter; Thienpont, Hugo

    2016-04-01

    We investigate the level of speckle that can be tolerated in a laser cinema projector. For this purpose, we equipped a movie theatre room with a prototype laser projector. A group of 186 participants was gathered to evaluate the speckle perception of several, short movie trailers in a subjective `Quality of Experience' experiment. This study is important as the introduction of lasers in projection systems has been hampered by the presence of speckle in projected images. We identify a speckle disturbance threshold by statistically analyzing the observers' responses for different values of the amount of speckle, which was monitored using a well-defined speckle measurement method. The analysis shows that the speckle perception of a human observer is not only dependent on the objectively measured amount of speckle, but it is also strongly influenced by the image content. As is also discussed in [Verschaffelt et al., Scientific Reports 5, art. nr. 14105, 2015] we find that, for moving images, the speckle becomes disturbing if the speckle contrast becomes larger than 6.9% for the red, 6.0% for the green, and 4.8% for the blue primary colors of the projector, whereas for still images the speckle detection threshold is about 3%. As we could not independently tune the speckle contrast of each of the primary colors, this speckle disturbance limit seems to be determined by the 6.9% speckle contrast of the red color as this primary color contains the largest amount of speckle. The speckle disturbance limit for movies thus turns out to be substantially larger than that for still images, and hence is easier to attain.

  5. Integration of instrumentation and processing software of a laser speckle contrast imaging system

    Science.gov (United States)

    Carrick, Jacob J.

    Laser speckle contrast imaging (LSCI) has the potential to be a powerful tool in medicine, but more research in the field is required so it can be used properly. To help in the progression of Michigan Tech's research in the field, a graphical user interface (GUI) was designed in Matlab to control the instrumentation of the experiments as well as process the raw speckle images into contrast images while they are being acquired. The design of the system was successful and it is currently being used by Michigan Tech's Biomedical Engineering department. This thesis describes the development of the LSCI GUI as well as offering a full introduction to the history, theory and applications of LSCI.

  6. Speckle-learning-based object recognition through scattering media.

    Science.gov (United States)

    Ando, Takamasa; Horisaki, Ryoichi; Tanida, Jun

    2015-12-28

    We experimentally demonstrated object recognition through scattering media based on direct machine learning of a number of speckle intensity images. In the experiments, speckle intensity images of amplitude or phase objects on a spatial light modulator between scattering plates were captured by a camera. We used the support vector machine for binary classification of the captured speckle intensity images of face and non-face data. The experimental results showed that speckles are sufficient for machine learning.
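
    As a stand-in for the support-vector-machine step, here is a minimal linear SVM trained with the Pegasos subgradient method on flattened intensity images. The data in the test are synthetic exponential "speckle-like" patterns with different mean levels, not the face/non-face data of the paper, and a constant bias feature is appended by the caller:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=150, seed=0):
    """Minimal linear SVM via the Pegasos subgradient method.
    X: (n, d) feature rows; y: labels in {-1, +1}. Returns weights w
    such that sign(X @ w) predicts the class."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, t = np.zeros(d), 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)       # decaying step size
            w = (1 - eta * lam) * w     # shrink (regularization gradient)
            if y[i] * (X[i] @ w) < 1:   # hinge-loss margin violation
                w += eta * y[i] * X[i]
    return w
```

    This is only a sketch of the classification machinery; the paper's point is that raw speckle intensities, fed directly to such a classifier, already carry enough object information.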

  7. Segmentation of pomegranate MR images using spatial fuzzy c-means (SFCM) algorithm

    Science.gov (United States)

    Moradi, Ghobad; Shamsi, Mousa; Sedaaghi, M. H.; Alsharif, M. R.

    2011-10-01

    Segmentation is one of the fundamental issues of image processing and machine vision, and it plays a prominent role in a variety of image processing applications. In this paper, one of the most important applications of image processing in MRI, the segmentation of pomegranate, is explored. Pomegranate is a fruit with pharmacological properties such as being anti-viral and anti-cancer. Having a high-quality product in hand is a critical factor in its marketing, and the internal quality of the product is comprehensively important in the sorting process. The determination of qualitative features cannot be made manually. Therefore, the segmentation of the internal structures of the fruit needs to be performed as accurately as possible in the presence of noise. The fuzzy c-means (FCM) algorithm is noise-sensitive, and pixels with noise are classified incorrectly. As a solution, in this paper, the spatial FCM (SFCM) algorithm is proposed for the segmentation of pomegranate MR images. The algorithm incorporates spatial neighborhood information into FCM and modifies the fuzzy membership function for each class. The segmentation results on the original pomegranate MR images and on images corrupted by Gaussian, salt-and-pepper and speckle noises show that the SFCM algorithm performs significantly better than the FCM algorithm. Also, after diverse steps of qualitative and quantitative analysis, we have concluded that the SFCM algorithm with a 5×5 window size is better than the other window sizes.
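
    A sketch of the spatial FCM idea, assuming grayscale images, a 3x3 spatial window (the paper favours 5x5), and illustrative membership-blending exponents p and q:

```python
import numpy as np

def spatial_fcm(img, n_clusters=2, m=2.0, p=1.0, q=1.0, n_iter=20):
    """Spatial fuzzy c-means: standard FCM memberships are blended with
    their local spatial average, so isolated noisy pixels follow their
    neighbourhood. Returns (label image, cluster centers)."""
    x = img.astype(float).ravel()
    H, W = img.shape
    centers = np.quantile(x, np.linspace(0, 1, n_clusters))
    for _ in range(n_iter):
        # standard FCM memberships from distances to the centers
        d = np.abs(x[None, :] - centers[:, None]) + 1e-9
        u = d ** (-2.0 / (m - 1))
        u /= u.sum(axis=0)
        # spatial function: mean membership over a 3x3 neighbourhood
        h = np.empty_like(u)
        for k in range(n_clusters):
            pad = np.pad(u[k].reshape(H, W), 1, mode="edge")
            h[k] = sum(pad[i:i + H, j:j + W]
                       for i in range(3) for j in range(3)).ravel() / 9.0
        u = (u ** p) * (h ** q)
        u /= u.sum(axis=0)
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)
    return u.argmax(axis=0).reshape(H, W), centers
```

    In the test below, a mid-gray outlier pixel inside the dark half is pulled to its neighbourhood's class by the spatial term, which plain FCM on intensity alone would not guarantee.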

  8. Speckle: Friend or foe?

    Science.gov (United States)

    Goodman, Joseph W.

    2013-05-01

    Speckle appears whenever coherent radiation of any kind is used. We review here the basic properties of speckle, the negative effects it has on imaging systems of various kinds, and the positive benefits it offers in certain nondestructive testing and metrology problems.

  9. Speckle interferometry

    Science.gov (United States)

    Sirohi, Rajpal S.

    2002-03-01

    Illumination of a rough surface by a coherent monochromatic wave creates a grainy structure in space termed a speckle pattern. It was considered a special kind of noise and was the bane of holographers. However, its information-carrying property was soon discovered and the phenomenon was used for metrological applications. The realization that a speckle pattern carried information led to a new measurement technique known as speckle interferometry (SI). Although the speckle phenomenon in itself is a consequence of interference among numerous randomly dephased waves, a reference wave is required in SI. Further, it employs an imaging geometry. Initially SI was performed mostly by using silver emulsions as the recording media. The double-exposure specklegram was filtered to extract the desired information. Since SI can be configured so as to be sensitive to the in-plane displacement component, the out-of-plane displacement component or their derivatives, the interferograms corresponding to these were extracted from the specklegram for further analysis. Since the speckle size can be controlled by the F number of the imaging lens, it was soon realized that SI could be performed with electronic detection, thereby increasing its accuracy and speed of measurement. Furthermore, a phase-shifting technique can also be incorporated. This technique came to be known as electronic speckle pattern interferometry (ESPI). It employed the same experimental configurations as SI. ESPI found many industrial applications as it supplements holographic interferometry. We present three examples covering diverse areas. In one application it has been used to measure residual stress in a blank recordable compact disk. In another application, microscopic ESPI has been used to study the influence of relative humidity on paint-coated figurines and also the effect of a conservation agent applied on top of this. The final application is to find the defects in pipes. These diverse applications

  10. Airplane wing deformation and flight flutter detection method by using three-dimensional speckle image correlation technology.

    Science.gov (United States)

    Wu, Jun; Yu, Zhijing; Wang, Tao; Zhuge, Jingchang; Ji, Yue; Xue, Bin

    2017-06-01

    Airplane wing deformation is an important element of aerodynamic characteristics, structure design, and fatigue analysis for aircraft manufacturing, as well as a key test item in flutter certification of airplanes. This paper presents a novel real-time detection method for wing deformation and flight flutter using three-dimensional speckle image correlation technology. Speckle patterns, whose positions are determined by the vibration characteristics of the aircraft, are coated on the wing; the speckle patterns are then imaged by CCD cameras mounted inside the aircraft cabin. In order to reduce the computation, a matching technique based on Geodetic Systems Incorporated coded points combined with the classical epipolar constraint is proposed, and a displacement vector map for the aircraft wing can be obtained by comparing the coordinates of the speckle points before and after deformation. Finally, verification experiments containing static and dynamic tests on an aircraft wing model demonstrate the accuracy and effectiveness of the proposed method.

  11. Improved quality of optical coherence tomography imaging of basal cell carcinomas using speckle reduction

    DEFF Research Database (Denmark)

    Mogensen, Mette; Jørgensen, Thomas Martini; Thrane, Lars

    2010-01-01

    suggests a method for improving OCT image quality for skin cancer imaging. EXPERIMENTAL DESIGN: OCT is an optical imaging method analogous to ultrasound. Two basal cell carcinomas (BCC) were imaged using an OCT speckle reduction technique (SR-OCT) based on repeated scanning by altering the distance between...

  12. Speckle reduction in digital holography with resampling ring masks

    Science.gov (United States)

    Zhang, Wenhui; Cao, Liangcai; Jin, Guofan

    2018-01-01

    One-shot digital holographic imaging has the advantages of high stability and low temporal cost, but the reconstruction is affected by speckle noise. A resampling ring-mask method in the spectrum domain is proposed for speckle reduction. The useful spectrum of one hologram is divided into several sub-spectra by ring masks. In the reconstruction, the angular spectrum transform, which involves no approximation, is applied to guarantee calculation accuracy. N reconstructed amplitude images are calculated from the corresponding sub-spectra. Thanks to the speckle's random distribution, superimposing these N uncorrelated amplitude images leads to a final reconstructed image with lower speckle noise. Normalized relative standard deviation values of the reconstructed image are used to evaluate the speckle reduction, and the method's effect on the spatial resolution of the reconstructed image is also quantitatively evaluated. Experimental and simulation results prove the feasibility and effectiveness of the proposed method.
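
    The splitting-and-averaging recipe described in this abstract can be sketched in a few lines. The following is a minimal illustration (not the authors' code), assuming the hologram spectrum is already centred and substituting a plain inverse FFT for the full angular-spectrum propagation; `n_rings` plays the role of N:

```python
import numpy as np

def ring_mask_reconstruction(spectrum, n_rings=4):
    """Divide a centred hologram spectrum into annular sub-spectra,
    reconstruct each one, and average the amplitudes to suppress speckle."""
    h, w = spectrum.shape
    fy, fx = np.indices((h, w))
    radius = np.hypot(fy - h / 2, fx - w / 2)
    edges = np.linspace(0.0, radius.max() + 1.0, n_rings + 1)
    average = np.zeros((h, w))
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (radius >= lo) & (radius < hi)
        # inverse transform of one sub-spectrum -> one speckled amplitude image
        average += np.abs(np.fft.ifft2(np.fft.ifftshift(spectrum * mask)))
    return average / n_rings  # incoherent average lowers speckle contrast
```

    Averaging N uncorrelated speckle realizations reduces speckle contrast roughly as 1/sqrt(N), at the cost of resolution set by each ring's bandwidth.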

  13. A pilot study to image the vascular network of small melanocytic choroidal tumors with speckle noise-free 1050-nm swept source optical coherence tomography (OCT choroidal angiography).

    Science.gov (United States)

    Maloca, Peter; Gyger, Cyrill; Hasler, Pascal W

    2016-06-01

    To visualize and measure the vascular network of melanocytic choroidal tumors with speckle noise-free swept source optical coherence tomography (SS-OCT choroidal angiography). Melanocytic choroidal tumors from 24 eyes were imaged with 1050-nm optical coherence tomography (Topcon DRI OCT-1 Atlantis). A semi-automated algorithm was developed to remove speckle noise and to extract and measure the volume of the choroidal vessels from the obtained OCT data. In all cases, analysis of the choroidal vessels could be performed with SS-OCT without the need for pupillary dilation. The proposed method allows speckle noise-free, structure-guided visualization and measurement of the larger choroidal vessels in three dimensions. The obtained data suggest that speckle noise-free OCT may be more effective at identifying choroidal structures than traditional OCT methods. The measured volume of the extracted choroidal vessels of Haller's layer and Sattler's layer in the examined tumorous eyes was on average 0.982463955 mm³ (982463956 μm³), with a range of 0.209764406 mm³ (209764405.9 μm³) to 1.78105544 mm³ (1781055440 μm³). Full-thickness obstruction of the choroidal vasculature by the tumor was found in 18 cases (72 %). In seven cases (18 %), the choroidal vessel architecture did not show pronounced morphological abnormalities. Speckle noise-free OCT may serve as a new illustrative imaging technology and enhance visualization of the choroidal vessels without the need for dye injection. OCT can be used to identify and evaluate the choroidal vessels of melanocytic choroidal tumors, and may represent a potentially useful tool for imaging and monitoring of choroidal nevi and melanoma.

  14. Optically Sectioned Imaging of Microvasculature of In-Vivo and Ex-Vivo Thick Tissue Models with Speckle-illumination HiLo Microscopy and HiLo Image Processing Implementation in MATLAB Architecture

    Science.gov (United States)

    Suen, Ricky Wai

    The work described in this thesis covers the conversion of HiLo image processing into MATLAB architecture and the use of speckle-illumination HiLo microscopy for ex-vivo and in-vivo imaging of thick tissue models. HiLo microscopy is a wide-field fluorescence imaging technique that has been demonstrated to produce optically sectioned images comparable to confocal microscopy in thin samples. The imaging technique was developed by Jerome Mertz and the Boston University Biomicroscopy Lab and has been implemented in our lab as a stand-alone optical setup and as a modification to a conventional fluorescence microscope. Speckle-illumination HiLo microscopy combines two images, taken under speckle illumination and standard uniform illumination, to generate an optically sectioned image that rejects out-of-focus fluorescence. The evaluated speckle contrast in the images is used as a weighting function, where elements that move out of focus have a speckle contrast that decays to zero. The experiments shown here demonstrate the capability of our HiLo microscopes to produce optically sectioned images of the microvasculature of ex-vivo and in-vivo thick tissue models. The HiLo microscopes were used to image the microvasculature of ex-vivo mouse heart sections prepared for optical histology and the microvasculature of in-vivo rodent dorsal window chamber models. Studies in label-free surface profiling with HiLo microscopy are also presented.
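
    A heavily simplified sketch of the speckle-illumination HiLo fusion described above (our own toy version, not the thesis implementation, which follows Mertz's published algorithm): local speckle contrast weights the low-frequency content of the uniform-illumination image, while the high-frequency content is taken as inherently sectioned. The window size, blur width `sigma` and blending factor `eta` are arbitrary choices here:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def hilo_fuse(uniform_img, speckle_img, window=7, sigma=4.0, eta=1.0):
    """Toy HiLo fusion: contrast-weighted low-pass plus plain high-pass."""
    s = speckle_img.astype(float)
    mean = uniform_filter(s, window)
    var = np.clip(uniform_filter(s * s, window) - mean**2, 0.0, None)
    contrast = np.sqrt(var) / (mean + 1e-9)  # decays to zero out of focus
    lo = gaussian_filter(contrast * uniform_img, sigma)   # sectioned low frequencies
    hi = uniform_img - gaussian_filter(uniform_img, sigma)  # high-pass, inherently sectioned
    return np.clip(eta * lo + hi, 0.0, None)
```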

  15. Speckle disturbance limit in laser-based cinema projection systems

    Science.gov (United States)

    Verschaffelt, Guy; Roelandt, Stijn; Meuret, Youri; van den Broeck, Wendy; Kilpi, Katriina; Lievens, Bram; Jacobs, An; Janssens, Peter; Thienpont, Hugo

    2015-09-01

    In a multi-disciplinary effort, we investigate the level of speckle that can be tolerated in a laser cinema projector based on a quality of experience experiment with movie clips shown to a test audience in a real-life movie theatre setting. We identify a speckle disturbance threshold by statistically analyzing the observers’ responses for different values of the amount of speckle, which was monitored using a well-defined speckle measurement method. The analysis shows that the speckle perception of a human observer is not only dependent on the objectively measured amount of speckle, but it is also strongly influenced by the image content. The speckle disturbance limit for movies turns out to be substantially larger than that for still images, and hence is easier to attain.

  16. Texture analysis of speckle in optical coherence tomography images of tissue phantoms

    International Nuclear Information System (INIS)

    Gossage, Kirk W; Smith, Cynthia M; Kanter, Elizabeth M; Hariri, Lida P; Stone, Alice L; Rodriguez, Jeffrey J; Williams, Stuart K; Barton, Jennifer K

    2006-01-01

    Optical coherence tomography (OCT) is an imaging modality capable of acquiring cross-sectional images of tissue using back-reflected light. Conventional OCT images have a resolution of 10-15 μm, and are thus best suited for visualizing tissue layers and structures. OCT images of collagen (with and without endothelial cells) have no resolvable features and may appear to simply show an exponential decrease in intensity with depth. However, examination of these images reveals that they display a characteristic repetitive structure due to speckle. The purpose of this study is to evaluate the application of statistical and spectral texture analysis techniques for differentiating living and non-living tissue phantoms containing various sizes and distributions of scatterers based on speckle content in OCT images. Statistically significant differences between texture parameters and excellent classification rates were obtained when comparing various endothelial cell concentrations ranging from 0 cells/ml to 25 million cells/ml. Statistically significant results and excellent classification rates were also obtained using various sizes of microspheres with concentrations ranging from 0 microspheres/ml to 500 million microspheres/ml. This study has shown that texture analysis of OCT images may be capable of differentiating tissue phantoms containing various sizes and distributions of scatterers.

  17. Development of an optimized algorithm for the characterization of microflow using speckle patterns present in optical coherence tomography signal

    International Nuclear Information System (INIS)

    Pretto, Lucas Ramos de

    2015-01-01

    This work discusses the optical coherence tomography (OCT) system and its application to the microfluidics area. To this end, physical characterization of microfluidic circuits was performed using 3D (three-dimensional) models constructed from OCT images of such circuits, and the technique was thus evaluated as a potential tool to aid in the inspection of microchannels. Going further, this work studies and develops analytical techniques for microfluidic flow, in particular techniques based on speckle patterns. In the first instance, existing methods were studied and improved, such as speckle variance OCT, where a gain of 31% was obtained in processing time. Other methods, such as LASCA (laser speckle contrast analysis), based on speckle autocorrelation, were adapted to OCT images. Derived from LASCA, the developed analysis technique based on intensity autocorrelation motivated the development of a custom OCT system as well as optimized acquisition software with a sampling rate of 8 kHz. The proposed method was then able to distinguish different flow rates, and its limits of detection were tested, demonstrating its feasibility for Brownian motion analysis and for flow rates below 10 μl/min. (author)
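
    The speckle-variance computation mentioned above reduces, in its basic form, to an inter-frame variance; a generic sketch (not the author's optimized implementation):

```python
import numpy as np

def speckle_variance(bscans):
    """Inter-frame speckle variance over N repeated B-scans of the same location.
    Static tissue gives near-zero variance; flow decorrelates the speckle
    between frames and gives high variance."""
    stack = np.asarray(bscans, dtype=float)  # shape (N, depth, width)
    return stack.var(axis=0)
```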

  18. Laser speckle contrast imaging of skin blood perfusion responses induced by laser coagulation

    Energy Technology Data Exchange (ETDEWEB)

    Ogami, M; Kulkarni, R; Wang, H; Reif, R; Wang, R K [University of Washington, Department of Bioengineering, Seattle, Washington 98195 (United States)

    2014-08-31

    We report application of laser speckle contrast imaging (LSCI), i.e., a fast imaging technique utilising backscattered light to distinguish moving objects such as red blood cells from stationary objects such as surrounding tissue, to localise skin injury. This imaging technique provides detailed information about the acute perfusion response after a blood vessel is occluded. In this study, a mouse ear model is used and pulsed laser coagulation serves as the method of occlusion. We have found that the downstream blood vessels lacked blood flow due to occlusion at the target site immediately after injury. Relative flow changes in nearby collaterals and anastomotic vessels have been approximated based on differences in intensity in the nearby collaterals and anastomoses. We have also estimated the density of the affected downstream vessels. Laser speckle contrast imaging is shown to provide high-resolution, fast imaging of the skin microvasculature. It also allows direct visualisation of the blood perfusion response to injury, which may provide novel insights to the field of cutaneous wound healing. (laser biophotonics)

  19. Laser speckle contrast imaging of skin blood perfusion responses induced by laser coagulation

    Science.gov (United States)

    Ogami, M.; Kulkarni, R.; Wang, H.; Reif, R.; Wang, R. K.

    2014-08-01

    We report application of laser speckle contrast imaging (LSCI), i.e., a fast imaging technique utilising backscattered light to distinguish moving objects such as red blood cells from stationary objects such as surrounding tissue, to localise skin injury. This imaging technique provides detailed information about the acute perfusion response after a blood vessel is occluded. In this study, a mouse ear model is used and pulsed laser coagulation serves as the method of occlusion. We have found that the downstream blood vessels lacked blood flow due to occlusion at the target site immediately after injury. Relative flow changes in nearby collaterals and anastomotic vessels have been approximated based on differences in intensity in the nearby collaterals and anastomoses. We have also estimated the density of the affected downstream vessels. Laser speckle contrast imaging is shown to provide high-resolution, fast imaging of the skin microvasculature. It also allows direct visualisation of the blood perfusion response to injury, which may provide novel insights to the field of cutaneous wound healing.

  20. Effect of static scatterers in laser speckle contrast imaging: an experimental study on correlation and contrast

    Science.gov (United States)

    Vaz, Pedro G.; Humeau-Heurtier, Anne; Figueiras, Edite; Correia, Carlos; Cardoso, João

    2018-01-01

    Laser speckle contrast imaging (LSCI) is a non-invasive microvascular blood flow assessment technique with good temporal and spatial resolution. Most LSCI systems, including commercial devices, can perform only qualitative blood flow evaluation, which is a major limitation of this technique. There are several factors that prevent the utilization of LSCI as a quantitative technique. Among these factors, we can highlight the effect of static scatterers. The goal of this work was to study the influence of differences in static and dynamic scatterer concentration on laser speckle correlation and contrast. In order to achieve this, a laser speckle prototype was developed and tested using an optical phantom with various concentrations of static and dynamic scatterers. It was found that the laser speckle correlation could be used to estimate the relative concentration of static/dynamic scatterers within a sample. Moreover, the speckle correlation proved to be independent of the dynamic scatterer velocity, which is a fundamental characteristic to be used in contrast correction.
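
    The idea of using speckle correlation to estimate the static/dynamic fraction can be illustrated with a simple inter-frame Pearson correlation (our sketch; the authors' estimator and phantom details differ). Independent dynamic speckle decorrelates between frames, while a static component raises the correlation floor irrespective of the dynamic scatterers' velocity:

```python
import numpy as np

def mean_frame_correlation(stack):
    """Average Pearson correlation between consecutive speckle frames;
    a high floor indicates a large static-scatterer fraction."""
    frames = np.asarray(stack, dtype=float).reshape(len(stack), -1)
    r = [np.corrcoef(a, b)[0, 1] for a, b in zip(frames[:-1], frames[1:])]
    return float(np.mean(r))
```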

  1. Rat retinal vasomotion assessed by laser speckle imaging

    DEFF Research Database (Denmark)

    Neganova, Anastasiia Y; Postnov, Dmitry D; Sosnovtseva, Olga

    2017-01-01

    Vasomotion is spontaneous or induced rhythmic changes in vascular tone or vessel diameter that lead to rhythmic changes in flow. While the vascular research community debates the physiological and pathophysiological consequence of vasomotion, there is a great need for experimental techniques...... that can address the role and dynamical properties of vasomotion in vivo. We apply laser speckle imaging to study spontaneous and drug induced vasomotion in retinal network of anesthetized rats. The results reveal a wide variety of dynamical patterns. Wavelet-based analysis shows that (i) spontaneous...

  2. Sensitivity evaluation of dynamic speckle activity measurements using clustering methods

    International Nuclear Information System (INIS)

    Etchepareborda, Pablo; Federico, Alejandro; Kaufmann, Guillermo H.

    2010-01-01

    We evaluate and compare the use of competitive neural networks, self-organizing maps, the expectation-maximization algorithm, K-means, and fuzzy C-means techniques as partitional clustering methods, when the sensitivity of the activity measurement of dynamic speckle images needs to be improved. The temporal history of the acquired intensity generated by each pixel is analyzed in a wavelet decomposition framework, and it is shown that the mean energy of its corresponding wavelet coefficients provides a suitable feature space for clustering purposes. The sensitivity obtained by using the evaluated clustering techniques is also compared with the well-known methods of Konishi-Fujii, weighted generalized differences, and wavelet entropy. The performance of the partitional clustering approach is evaluated using simulated dynamic speckle patterns and also experimental data.
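
    The feature space described above, the mean energy of the wavelet detail coefficients of each pixel's intensity history, can be sketched as follows, assuming a Haar decomposition and SciPy's k-means in place of the several clustering algorithms the paper compares:

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def wavelet_energy_labels(stack, n_clusters=2, levels=3):
    """Label pixels by clustering the mean energy of the Haar detail
    coefficients of each pixel's temporal intensity history."""
    n, h, w = stack.shape
    signal = stack.reshape(n, -1).astype(float)      # (time, pixels)
    features = []
    for _ in range(levels):
        approx = (signal[0::2] + signal[1::2]) / np.sqrt(2)
        detail = (signal[0::2] - signal[1::2]) / np.sqrt(2)
        features.append((detail**2).mean(axis=0))    # mean energy per level
        signal = approx
    X = np.stack(features, axis=1)                   # (pixels, levels)
    _, labels = kmeans2(X, n_clusters, minit='++', seed=0)
    return labels.reshape(h, w)
```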

  3. Speckle interferometry of asteroids

    International Nuclear Information System (INIS)

    Drummond, J.

    1988-01-01

    By studying the two-dimensional power spectra or autocorrelations of the images projected by an asteroid as it rotates, it is possible to locate its rotational pole and derive the dimensions of its three axes through speckle interferometry, under certain assumptions of uniform, geometric scattering and a triaxial ellipsoid shape. However, in cases where images can be reconstructed, the need for making these assumptions is obviated. Furthermore, the ultimate goal of speckle interferometry, image reconstruction, will lead to mapping albedo features (if they exist) as impact areas or geological units. The first glimpses of the surface of an asteroid were obtained from images of 4 Vesta reconstructed from speckle interferometric observations. These images reveal that Vesta is quite Moon-like in having large hemispheric-scale albedo features. All of its lightcurves can be produced from a simple model developed from the images. Although undoubtedly more intricate than the model, Vesta's lightcurves can be matched by a model with three dark and four bright spots. The dark areas so dominate one hemisphere that a lightcurve minimum occurs when the maximum cross-section area is visible. The triaxial ellipsoid shape derived for Vesta is not consistent with the notion that the asteroid has an equilibrium shape, in spite of its having apparently been differentiated.

  4. Speckle dynamics under ergodicity breaking

    Science.gov (United States)

    Sdobnov, Anton; Bykov, Alexander; Molodij, Guillaume; Kalchenko, Vyacheslav; Jarvinen, Topias; Popov, Alexey; Kordas, Krisztian; Meglinski, Igor

    2018-04-01

    Laser speckle contrast imaging (LSCI) is a well-known and versatile approach for the non-invasive visualization of flows and microcirculation localized in turbid scattering media, including biological tissues. In most conventional implementations of LSCI the ergodic regime is typically assumed valid. However, most composite turbid scattering media, especially biological tissues, are non-ergodic, containing a mixture of dynamic and static centers of light scattering. In the current study, we examined the speckle contrast in different dynamic conditions with the aim of assessing limitations in the quantitative interpretation of speckle contrast images. Based on a simple phenomenological approach, we introduced a coefficient of speckle dynamics to quantitatively assess the ratio of the dynamic part of a scattering medium to the static one. The introduced coefficient allows one to distinguish real changes in motion from the mere appearance of static components in the field of view. As examples of systems with static/dynamic transitions, thawing and heating of Intralipid samples were studied by the LSCI approach.

  5. Laser speckle imaging of rat retinal blood flow with hybrid temporal and spatial analysis method

    Science.gov (United States)

    Cheng, Haiying; Yan, Yumei; Duong, Timothy Q.

    2009-02-01

    Noninvasive monitoring of blood flow (BF) in the retinal circulation can reveal the progression and treatment of ocular disorders such as diabetic retinopathy, age-related macular degeneration and glaucoma. A non-invasive and direct BF measurement technique with high spatio-temporal resolution is needed for retinal imaging. Laser speckle imaging (LSI) is such a method. Currently, there are two analysis methods for LSI: spatial statistics LSI (SS-LSI) and temporal statistics LSI (TS-LSI). Comparing these two analysis methods, SS-LSI has a higher signal to noise ratio (SNR) and TS-LSI is less susceptible to artifacts from stationary speckle. We propose a hybrid temporal and spatial analysis method (HTS-LSI) to measure retinal blood flow. A gas challenge experiment was performed and images were analyzed by HTS-LSI. Results showed that HTS-LSI can not only remove the stationary speckle but also increase the SNR. Under 100% O2, retinal BF decreased by 20-30%, consistent with results observed with the laser Doppler technique. As retinal blood flow is a critical physiological parameter whose perturbation has been implicated in the early stages of many retinal diseases, HTS-LSI will be an efficient method for early detection of retinal diseases.
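
    One plausible reading of a hybrid temporal-spatial contrast estimator (a sketch under our own assumptions, not necessarily the authors' formulation) computes K = sigma/mean over a window spanning both the frame sequence and a small spatial neighbourhood:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def hybrid_contrast(stack, spatial=5):
    """Speckle contrast over a spatio-temporal window (all frames x spatial x spatial)."""
    s = np.asarray(stack, dtype=float)
    size = (s.shape[0], spatial, spatial)
    mean = uniform_filter(s, size)
    var = np.clip(uniform_filter(s * s, size) - mean**2, 0.0, None)
    k = np.sqrt(var) / (mean + 1e-9)
    return k[s.shape[0] // 2]  # one 2-D contrast map
```

    Lower K indicates stronger motion blur within the exposure, i.e. faster flow.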

  6. MATLAB for laser speckle contrast analysis (LASCA): a practice-based approach

    Science.gov (United States)

    Postnikov, Eugene B.; Tsoy, Maria O.; Postnov, Dmitry E.

    2018-04-01

    Laser Speckle Contrast Analysis (LASCA) is one of the most powerful modern methods for revealing blood dynamics. The experimental design and theory for this method are well established, and the computational recipe is often regarded as trivial. However, the achieved performance and spatial resolution may differ considerably between implementations. We present a mini-review of known approaches to spatial laser speckle contrast data processing and their realization in MATLAB code, providing an explicit correspondence to the mathematical representation and a discussion of available implementations. We also present an algorithm based on the 2D Haar wavelet transform, likewise supplied with program code. This new method provides an opportunity to introduce horizontal, vertical and diagonal speckle contrasts; it may be used for processing highly anisotropic images of vascular trees. We provide a comparative analysis of the accuracy of vascular pattern detection and the processing times, with special attention to details of the MATLAB procedures used.
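
    The basic spatial LASCA recipe the authors implement, a sliding-window contrast K = sigma/mean, translates to a few lines; shown here in Python/NumPy rather than MATLAB, with the window size as a free parameter:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lasca_contrast(img, window=7):
    """Spatial speckle contrast K = local std / local mean in a sliding window."""
    i = img.astype(float)
    mean = uniform_filter(i, window)
    var = np.clip(uniform_filter(i * i, window) - mean**2, 0.0, None)
    return np.sqrt(var) / (mean + 1e-9)
```

    For fully developed static speckle K approaches 1; motion during the exposure blurs the speckle and drives K toward 0.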

  7. Analyzing speckle contrast for HiLo microscopy optimization

    Science.gov (United States)

    Mazzaferri, J.; Kunik, D.; Belisle, J. M.; Singh, K.; Lefrançois, S.; Costantino, S.

    2011-07-01

    HiLo microscopy is a recently developed technique that provides both optical sectioning and fast imaging with a simple implementation and at a very low cost. The methodology combines widefield and speckled illumination images to obtain one optically sectioned image. Hence, the characteristics of the speckle illumination ultimately determine the quality of HiLo images and the overall performance of the method. In this work, we study how speckle contrast influences local variations of fluorescence intensity and brightness profiles of thick samples. We present this article as a guide for adjusting the parameters of the system to optimize the capabilities of this novel technology.

  8. X-ray speckle correlation interferometer

    International Nuclear Information System (INIS)

    Eisenhower, Rachel; Materlik, Gerhard

    2000-01-01

    Speckle Pattern Correlation Interferometry (SPCI) is a well-established technique in the visible-light regime for observing surface disturbances. Although not a direct imaging technique, SPCI gives full-field, high-resolution information about an object's motion. Since x-ray synchrotron radiation beamlines with high coherent flux have allowed the observation of x-ray speckle, x-ray SPCI could provide a means to measure strains and other quasi-static motions in disordered systems. This paper therefore examines the feasibility of an x-ray speckle correlation interferometer

  9. Analysis of microvascular perfusion with multi-dimensional complete ensemble empirical mode decomposition with adaptive noise algorithm: Processing of laser speckle contrast images recorded in healthy subjects, at rest and during acetylcholine stimulation.

    Science.gov (United States)

    Humeau-Heurtier, Anne; Marche, Pauline; Dubois, Severine; Mahe, Guillaume

    2015-01-01

    Laser speckle contrast imaging (LSCI) is a full-field imaging modality for monitoring microvascular blood flow. It gives images with high temporal and spatial resolution. However, when the skin is studied, the interpretation of the bidimensional data may be difficult. This is why the perfusion values in regions of interest are often averaged and the result followed in time, reducing the data to one-dimensional time series. In order to avoid such a procedure (which leads to a loss of the spatial resolution), we propose to extract patterns from LSCI data and to compare these patterns for two physiological states in healthy subjects: at rest and at the peak of acetylcholine-induced perfusion. For this purpose, the recent multi-dimensional complete ensemble empirical mode decomposition with adaptive noise (MCEEMDAN) algorithm is applied to LSCI data. The results show that the intrinsic mode functions and residue given by MCEEMDAN show different patterns for the two physiological states. The images, as bidimensional data, can therefore be processed to reveal microvascular perfusion patterns hidden in the images themselves. This work is therefore a feasibility study before analyzing data from patients with microvascular dysfunction.

  10. Speckle contrast diffuse correlation tomography of complex turbid medium flow

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Chong; Irwin, Daniel; Lin, Yu; Shang, Yu; He, Lian; Kong, Weikai; Yu, Guoqiang [Department of Biomedical Engineering, University of Kentucky, Lexington, Kentucky 40506 (United States); Luo, Jia [Department of Pharmacology and Nutritional Sciences, University of Kentucky, Lexington, Kentucky 40506 (United States)

    2015-07-15

    Purpose: Developed herein is a three-dimensional (3D) flow contrast imaging system leveraging advancements in the extension of laser speckle contrast imaging theories to deep tissues along with our recently developed finite-element diffuse correlation tomography (DCT) reconstruction scheme. This technique, termed speckle contrast diffuse correlation tomography (scDCT), enables incorporation of complex optical property heterogeneities and sample boundaries. When combined with a reflectance-based design, this system facilitates a rapid segue into flow contrast imaging of larger, in vivo applications such as humans. Methods: A highly sensitive CCD camera was integrated into a reflectance-based optical system. Four long-coherence laser source positions were coupled to an optical switch for sequencing of tomographic data acquisition providing multiple projections through the sample. This system was investigated through incorporation of liquid and solid tissue-like phantoms exhibiting optical properties and flow characteristics typical of human tissues. Computer simulations were also performed for comparisons. A uniquely encountered smear correction algorithm was employed to correct point-source illumination contributions during image capture with the frame-transfer CCD and reflectance setup. Results: Measurements with scDCT on a homogeneous liquid phantom showed that speckle contrast-based deep flow indices were within 12% of those from standard DCT. Inclusion of a solid phantom submerged below the liquid phantom surface allowed for heterogeneity detection and validation. The heterogeneity was identified successfully by reconstructed 3D flow contrast tomography with scDCT. The heterogeneity center and dimensions and averaged relative flow (within 3%) and localization were in agreement with actuality and computer simulations, respectively. Conclusions: A custom cost-effective CCD-based reflectance 3D flow imaging system demonstrated rapid acquisition of dense boundary

  11. Lagrangian speckle model and tissue-motion estimation--theory.

    Science.gov (United States)

    Maurice, R L; Bertrand, M

    1999-07-01

    It is known that when a tissue is subjected to movements such as rotation, shearing or scaling, the resulting changes in speckle patterns act as a noise source, often responsible for most of the displacement-estimate variance. From a modeling point of view, these changes can be thought of as resulting from two mechanisms: one is the motion of the speckles and the other the alteration of their morphology. In this paper, we propose a new tissue-motion estimator to counteract these speckle decorrelation effects. The estimator is based on a Lagrangian description of the speckle motion. This description allows us to follow local characteristics of the speckle field as if they were a material property, and it leads to an analytical description of the decorrelation that enables the derivation of an appropriate inverse filter for speckle restoration. The filter is appropriate for a linear geometrical transformation of the scattering function (LT), i.e., a constant-strain region of interest (ROI). As the LT itself is a parameter of the filter, the tissue-motion estimator can be formulated as a nonlinear minimization problem, seeking the best match between the pre-motion image and a restored-speckle post-motion image. The method is tested using simulated radio-frequency (RF) images of tissue undergoing axial shear.

  12. Speckle imaging with the PAPA detector. [Precision Analog Photon Address

    Science.gov (United States)

    Papaliolios, C.; Nisenson, P.; Ebstein, S.

    1985-01-01

    A new 2-D photon-counting camera, the PAPA (precision analog photon address) detector has been built, tested, and used successfully for the acquisition of speckle imaging data. The camera has 512 x 512 pixels and operates at count rates of at least 200,000/sec. In this paper, technical details on the camera are presented and some of the laboratory and astronomical results are included which demonstrate the detector's capabilities.

  13. Digital Speckle Photography of Subpixel Displacements of Speckle Structures Based on Analysis of Their Spatial Spectra

    Science.gov (United States)

    Maksimova, L. A.; Ryabukho, P. V.; Mysina, N. Yu.; Lyakin, D. V.; Ryabukho, V. P.

    2018-04-01

    We have investigated the capabilities of the method of digital speckle interferometry for determining subpixel displacements of a speckle structure formed by a displaceable or deformable object with a scattering surface. An analysis of the spatial spectra of speckle structures makes it possible to perform measurements with subpixel accuracy and to extend the lower boundary of the measurement range of speckle-structure displacements into the subpixel regime. The method is realized on the basis of digital recording of the images of undisplaced and displaced speckle structures, their spatial frequency analysis using numerically specified constant phase shifts, and correlation analysis of the spatial spectra of the speckle structures. Transformation into the frequency domain makes it possible to obtain the measured quantities with subpixel accuracy, either from the shift of the interference-pattern minimum in the diffraction halo produced by introducing an additional phase shift into the complex spatial spectrum of the speckle structure, or from the slope of the linear plot of the accumulated phase difference in the field of the complex spatial spectrum of the displaced speckle structure. The capabilities of the method have been verified in a physical experiment.
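
    The core idea of recovering a displacement with subpixel accuracy from the spatial spectrum can be illustrated with a phase-correlation sketch (our illustration; the paper's phase-shift and accumulated-phase-difference procedures differ in detail):

```python
import numpy as np

def subpixel_shift(ref, moved):
    """Estimate the (dy, dx) displacement of a speckle pattern from the phase
    of the normalized cross-power spectrum, with parabolic interpolation
    around the correlation peak for subpixel accuracy."""
    R = np.conj(np.fft.fft2(ref)) * np.fft.fft2(moved)
    R /= np.abs(R) + 1e-12                 # keep only the phase ramp
    corr = np.abs(np.fft.ifft2(R))
    py, px = np.unravel_index(corr.argmax(), corr.shape)

    def refine(vals, p):
        # fit a parabola through the peak and its two neighbours
        l, c, r = vals[p - 1], vals[p], vals[(p + 1) % len(vals)]
        d = l - 2 * c + r
        return p + (0.5 * (l - r) / d if d != 0 else 0.0)

    dy = refine(corr[:, px], py)
    dx = refine(corr[py, :], px)
    h, w = ref.shape
    return (dy - h if dy > h / 2 else dy), (dx - w if dx > w / 2 else dx)
```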

  14. Shift-Invariant Image Reconstruction of Speckle-Degraded Images Using Bispectrum Estimation

    Science.gov (United States)

    1990-05-01

    process with the requisite negative exponential pdf. I call this model the Negative Exponential Model (NEM). The NEM flowchart is seen in Figure 6. [The remainder of this record is OCR residue of figure captions: statistical histograms and phase plots, and a truth object speckled via the NEM (Figure 14a).]

  15. State of the Art of X-ray Speckle-Based Phase-Contrast and Dark-Field Imaging

    Directory of Open Access Journals (Sweden)

    Marie-Christine Zdora

    2018-04-01

    Full Text Available In the past few years, X-ray phase-contrast and dark-field imaging have evolved to be invaluable tools for non-destructive sample visualisation, delivering information inaccessible by conventional absorption imaging. X-ray phase-sensing techniques are furthermore increasingly used for at-wavelength metrology and optics characterisation. One of the latest additions to the group of differential phase-contrast methods is the X-ray speckle-based technique. It has drawn significant attention due to its simple and flexible experimental arrangement, cost-effectiveness and multimodal character, amongst others. Since its first demonstration at highly brilliant synchrotron sources, the method has seen rapid development, including the translation to polychromatic laboratory sources and extension to higher-energy X-rays. Recently, different advanced acquisition schemes have been proposed to tackle some of the main limitations of previous implementations. Current applications of the speckle-based method range from optics characterisation and wavefront measurement to biomedical imaging and materials science. This review provides an overview of the state of the art of the X-ray speckle-based technique. Its basic principles and different experimental implementations, as well as the latest advances and applications, are illustrated. In the end, an outlook for anticipated future developments of this promising technique is given.

  16. Laser speckle contrast imaging using light field microscope approach

    Science.gov (United States)

    Ma, Xiaohui; Wang, Anting; Ma, Fenghua; Wang, Zi; Ming, Hai

    2018-01-01

    In this paper, a laser speckle contrast imaging (LSCI) system using a light field (LF) microscope approach is proposed. To the best of our knowledge, this is the first time LSCI has been combined with LF imaging. To verify the idea, a prototype consisting of a modified LF microscope imaging system and an experimental device was built. A commercial Lytro camera was modified for microscope imaging. Hollow glass tubes fixed at different depths in a glass dish were used to simulate blood vessels in the brain and to test the performance of the system. Compared with conventional LSCI, three new functions can be realized with our system: refocusing, extending the depth of field (DOF) and gathering 3D information. Experiments show that the principle is feasible and the proposed system works well.

  17. Color speckle in laser displays

    Science.gov (United States)

    Kuroda, Kazuo

    2015-07-01

    At the beginning of this century, lighting technology shifted from discharge lamps, fluorescent lamps and electric bulbs to solid-state lighting. Current solid-state lighting is based on light-emitting diode (LED) technology, but laser lighting technology is developing rapidly, with applications such as laser cinema projectors, laser TVs, laser head-up displays, laser head-mounted displays, and laser headlamps for motor vehicles. One of the main issues of laser displays is the reduction of speckle noise1). For monochromatic laser light, speckle is a random interference pattern on the image plane (the retina, for a human observer). In laser displays, the RGB (red-green-blue) lasers form speckle patterns independently, which results in a random distribution of chromaticity, called color speckle2).

  18. Evaluation of segmentation algorithms for optical coherence tomography images of ovarian tissue

    Science.gov (United States)

    Sawyer, Travis W.; Rice, Photini F. S.; Sawyer, David M.; Koevary, Jennifer W.; Barton, Jennifer K.

    2018-02-01

    Ovarian cancer has the lowest survival rate among all gynecologic cancers due to predominantly late diagnosis. Early detection of ovarian cancer can increase 5-year survival rates from 40% up to 92%, yet no reliable early detection techniques exist. Optical coherence tomography (OCT) is an emerging technique that provides depth-resolved, high-resolution images of biological tissue in real time and demonstrates great potential for imaging of ovarian tissue. Mouse models are crucial to quantitatively assess the diagnostic potential of OCT for ovarian cancer imaging; however, due to small organ size, the ovaries must first be separated from the image background using the process of segmentation. Manual segmentation is time-intensive, as OCT yields three-dimensional data. Furthermore, speckle noise complicates OCT images, frustrating many processing techniques. While much work has investigated noise-reduction and automated segmentation for retinal OCT imaging, little has considered the application to the ovaries, which exhibit higher variance and inhomogeneity than the retina. To address these challenges, we evaluated a set of algorithms to segment OCT images of mouse ovaries. We examined five preprocessing techniques and six segmentation algorithms. While all preprocessing methods improve segmentation, Gaussian filtering is most effective, showing an improvement of 32% +/- 1.2%. Of the segmentation algorithms, active contours performs best, segmenting with an accuracy of 0.948 +/- 0.012 compared with manual segmentation (1.0 being identical). Nonetheless, further optimization could lead to maximizing the performance for segmenting OCT images of the ovaries.

  19. Highly porous nanoberyllium for X-ray beam speckle suppression

    Energy Technology Data Exchange (ETDEWEB)

    Goikhman, Alexander, E-mail: agoikhman@ymail.com; Lyatun, Ivan; Ershov, Petr [Immanuel Kant Baltic Federal University, Nevskogo str. 14, Kaliningrad 236041 (Russian Federation); Snigireva, Irina [European Synchrotron Radiation Facility, BP 220, 38043 Grenoble (France); Wojda, Pawel [Immanuel Kant Baltic Federal University, Nevskogo str. 14, Kaliningrad 236041 (Russian Federation); Gdańsk University of Technology, 11/12 G. Narutowicza, Gdańsk 80-233 (Poland); Gorlevsky, Vladimir; Semenov, Alexander; Sheverdyaev, Maksim; Koletskiy, Viktor [A. A. Bochvar High-Technology Scientific Research Institute for Inorganic Materials, Rogova str. 5a, Moscow 123098 (Russian Federation); Snigirev, Anatoly [Immanuel Kant Baltic Federal University, Nevskogo str. 14, Kaliningrad 236041 (Russian Federation); European Synchrotron Radiation Facility, BP 220, 38043 Grenoble (France)

    2015-04-09

    A speckle suppression device containing highly porous nanoberyllium is proposed for manipulating the spatial coherence length and removing undesirable speckle structure during imaging experiments. This paper reports a special device called a ‘speckle suppressor’, which contains a highly porous nanoberyllium plate squeezed between two beryllium windows. The insertion of the speckle suppressor in an X-ray beam allows manipulation of the spatial coherence length, thus changing the effective source size and removing the undesirable speckle structure in X-ray imaging experiments almost without beam attenuation. The absorption of the nanoberyllium plate is below 1% for 1 mm thickness at 12 keV. The speckle suppressor was tested on the ID06 ESRF beamline with X-rays in the energy range from 9 to 15 keV. It was applied for the transformation of the phase–amplitude contrast to the pure amplitude contrast in full-field microscopy.

  20. Highly porous nanoberyllium for X-ray beam speckle suppression

    International Nuclear Information System (INIS)

    Goikhman, Alexander; Lyatun, Ivan; Ershov, Petr; Snigireva, Irina; Wojda, Pawel; Gorlevsky, Vladimir; Semenov, Alexander; Sheverdyaev, Maksim; Koletskiy, Viktor; Snigirev, Anatoly

    2015-01-01

    A speckle suppression device containing highly porous nanoberyllium is proposed for manipulating the spatial coherence length and removing undesirable speckle structure during imaging experiments. This paper reports a special device called a ‘speckle suppressor’, which contains a highly porous nanoberyllium plate squeezed between two beryllium windows. The insertion of the speckle suppressor in an X-ray beam allows manipulation of the spatial coherence length, thus changing the effective source size and removing the undesirable speckle structure in X-ray imaging experiments almost without beam attenuation. The absorption of the nanoberyllium plate is below 1% for 1 mm thickness at 12 keV. The speckle suppressor was tested on the ID06 ESRF beamline with X-rays in the energy range from 9 to 15 keV. It was applied for the transformation of the phase–amplitude contrast to the pure amplitude contrast in full-field microscopy.

  1. A parallelizable real-time motion tracking algorithm with applications to ultrasonic strain imaging

    International Nuclear Information System (INIS)

    Jiang, J; Hall, T J

    2007-01-01

    Ultrasound-based mechanical strain imaging systems utilize signals from conventional diagnostic ultrasound systems to image tissue elasticity contrast, which provides new diagnostically valuable information. Previous work (Hall et al 2003 Ultrasound Med. Biol. 29 427; Zhu and Hall 2002 Ultrason. Imaging 24 161) demonstrated that uniaxial deformation with minimal elevation motion is preferred for breast strain imaging and that real-time strain image feedback to operators is important to accomplish this goal. The work reported here enhances the real-time speckle tracking algorithm with two significant modifications. One fundamental change is that the proposed algorithm is column-based (a column is defined by a line of data parallel to the ultrasound beam direction, i.e. an A-line), as opposed to row-based (a row is defined by a line of data perpendicular to the ultrasound beam direction). Displacement estimates from adjacent columns then provide good guidance for motion tracking in a significantly reduced search region, lowering computational cost. Consequently, the process of displacement estimation can be naturally split into at least two separate tasks, computed in parallel, propagating outward from the center of the region of interest (ROI). The proposed algorithm has been implemented and optimized as a stand-alone ANSI C++ program on a Windows® system. Results of preliminary tests, using numerical and tissue-mimicking phantoms, and in vivo tissue data, suggest that high contrast strain images can be consistently obtained with frame rates (10 frames s⁻¹) that exceed those of our previous methods.
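The column-guided search described in this abstract can be sketched in a few lines. The following is an illustrative simplification (integer-sample block matching by normalized cross-correlation, with made-up block and search sizes), not the authors' ANSI C++ implementation:

```python
import numpy as np

def track_column(pre, post, col, guide, block=8, search=4):
    """Estimate axial displacement for each block in one column (A-line),
    searching only a small window around the neighbouring column's estimates."""
    n = pre.shape[0]
    est = np.zeros(n // block, dtype=int)
    for b in range(n // block):
        top = b * block
        ref = pre[top:top + block, col]
        best_d, best_r = 0, -np.inf
        for d in range(guide[b] - search, guide[b] + search + 1):
            if top + d < 0 or top + d + block > n:
                continue  # candidate block falls outside the frame
            cand = post[top + d:top + d + block, col]
            r = np.corrcoef(ref, cand)[0, 1]  # normalized cross-correlation
            if r > best_r:
                best_d, best_r = d, r
        est[b] = best_d
    return est

# Toy frames: 'post' is 'pre' shifted axially by 3 samples.
rng = np.random.default_rng(0)
pre = rng.standard_normal((64, 4))
post = np.roll(pre, 3, axis=0)

guide = np.zeros(64 // 8, dtype=int)
for c in range(pre.shape[1]):          # propagate estimates column to column
    guide = track_column(pre, post, c, guide)
print(guide[:7])                       # blocks away from the frame edge recover the shift
```

Because each column only needs its neighbour's estimates, independent column chains can be dispatched to separate threads, which is the parallelism the abstract describes.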

  2. From synchrotron radiation to lab source: advanced speckle-based X-ray imaging using abrasive paper

    Science.gov (United States)

    Wang, Hongchang; Kashyap, Yogesh; Sawhney, Kawal

    2016-02-01

    X-ray phase and dark-field imaging techniques provide complementary information that is inaccessible to conventional X-ray absorption or visible light imaging. However, such methods typically require sophisticated experimental apparatus or X-ray beams with specific properties. Recently, an X-ray speckle-based technique has shown great potential for X-ray phase and dark-field imaging using a simple experimental arrangement. However, it still suffers from either poor resolution or the time-consuming process of collecting a large number of images. To overcome these limitations, in this report we demonstrate that absorption, dark-field, phase contrast, and two orthogonal differential phase contrast images can be generated simultaneously by scanning a piece of abrasive paper in only one direction. We propose a novel theoretical approach to quantitatively extract the above five images by utilising the remarkable properties of speckles. Importantly, the technique has been extended from a synchrotron light source to a lab-based microfocus X-ray source and flat panel detector. Removing the need to raster the optics in two directions significantly reduces the acquisition time and absorbed dose, which can be of vital importance for many biological samples. This new imaging method could potentially provide a breakthrough for numerous practical imaging applications in biomedical research and materials science.

  3. GPU-Based Block-Wise Nonlocal Means Denoising for 3D Ultrasound Images

    Directory of Open Access Journals (Sweden)

    Liu Li

    2013-01-01

    Full Text Available Speckle suppression plays an important role in improving ultrasound (US) image quality. While many algorithms have been proposed for 2D US image denoising with remarkable filtering quality, relatively little work has been done on 3D ultrasound speckle suppression, where the whole volume rather than a single frame needs to be considered. The most crucial problem with 3D US denoising is that the computational complexity increases tremendously. The nonlocal means (NLM) provides an effective method for speckle suppression in US images. In this paper, a programmable graphic-processor-unit- (GPU-) based fast NLM filter is proposed for 3D ultrasound speckle reduction. A Gamma distribution noise model, which is able to reliably capture image statistics for log-compressed ultrasound images, was used for the 3D block-wise NLM filter on the basis of a Bayesian framework. The most significant aspect of our method is the adoption of the powerful data-parallel computing capability of the GPU to improve overall efficiency. Experimental results demonstrate that the proposed method can enormously accelerate the algorithm.
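A minimal, CPU-only sketch of the underlying non-local means idea (pixel-wise, with a Euclidean patch distance; the paper's filter is block-wise, Gamma/Bayesian-weighted, and GPU-parallel):

```python
import numpy as np

def nlm_denoise(img, patch=3, search=5, h=0.5):
    """Naive pixel-wise non-local means: each pixel becomes a weighted
    average of pixels whose surrounding patches look similar."""
    pad = patch // 2
    padded = np.pad(img, pad + search, mode='reflect')
    out = np.zeros_like(img)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + pad + search, j + pad + search
            ref = padded[ci - pad:ci + pad + 1, cj - pad:cj + pad + 1]
            weights, values = [], []
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni - pad:ni + pad + 1, nj - pad:nj + pad + 1]
                    d2 = np.mean((ref - cand) ** 2)          # patch distance
                    weights.append(np.exp(-d2 / h ** 2))
                    values.append(padded[ni, nj])
            out[i, j] = np.average(values, weights=weights)
    return out

rng = np.random.default_rng(1)
clean = np.full((24, 24), 0.5)
clean[:, 12:] = 1.5
noisy = clean * rng.gamma(16.0, 1 / 16.0, clean.shape)  # multiplicative speckle-like noise
denoised = nlm_denoise(noisy)
print(np.mean((noisy - clean) ** 2), np.mean((denoised - clean) ** 2))
```

The nested loops over pixels and search offsets are embarrassingly parallel, which is what makes this filter a natural fit for a GPU.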

  4. Speckle-modulating optical coherence tomography in living mice and humans

    Science.gov (United States)

    Liba, Orly; Lew, Matthew D.; Sorelle, Elliott D.; Dutta, Rebecca; Sen, Debasish; Moshfeghi, Darius M.; Chu, Steven; de La Zerda, Adam

    2017-06-01

    Optical coherence tomography (OCT) is a powerful biomedical imaging technology that relies on the coherent detection of backscattered light to image tissue morphology in vivo. As a consequence, OCT is susceptible to coherent noise (speckle noise), which imposes significant limitations on its diagnostic capabilities. Here we show speckle-modulating OCT (SM-OCT), a method based purely on light manipulation that virtually eliminates speckle noise originating from a sample. SM-OCT accomplishes this by creating and averaging an unlimited number of scans with uncorrelated speckle patterns without compromising spatial resolution. Using SM-OCT, we reveal small structures in the tissues of living animals, such as the inner stromal structure of a live mouse cornea, the fine structures inside the mouse pinna, and sweat ducts and Meissner's corpuscle in the human fingertip skin: features that are otherwise obscured by speckle noise when using conventional OCT or OCT with current state-of-the-art speckle reduction methods.
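The statistical principle behind SM-OCT, averaging scans with uncorrelated speckle, can be checked numerically: fully developed speckle has a contrast (std/mean) of 1, and averaging N uncorrelated realizations lowers it by about 1/√N. A toy simulation of the statistics, not of the optical implementation:

```python
import numpy as np

rng = np.random.default_rng(2)

def speckle_frame(shape):
    """Fully developed speckle: intensity of a circular complex Gaussian field."""
    field = rng.standard_normal(shape) + 1j * rng.standard_normal(shape)
    return np.abs(field) ** 2

def contrast(img):
    return img.std() / img.mean()

single = speckle_frame((256, 256))
compound = np.mean([speckle_frame((256, 256)) for _ in range(25)], axis=0)

print(round(contrast(single), 2))    # close to 1.0 for fully developed speckle
print(round(contrast(compound), 2))  # close to 1/sqrt(25) = 0.2
```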

  5. X-ray pulse wavefront metrology using speckle tracking

    International Nuclear Information System (INIS)

    Berujon, Sebastien; Ziegler, Eric; Cloetens, Peter

    2015-01-01

    The theoretical description and experimental implementation of a speckle-tracking-based instrument that permits the characterisation of X-ray pulse wavefronts are presented. An instrument allowing the quantitative analysis of X-ray pulsed wavefronts is presented and its processing method explained. The system relies on the X-ray speckle tracking principle to accurately measure the phase gradient of the X-ray beam, from which beam optical aberrations can be deduced. The key component of this instrument, a semi-transparent scintillator emitting visible light while transmitting X-rays, allows simultaneous recording of two speckle images at two different propagation distances from the X-ray source. The speckle tracking procedure for a reference-less metrology mode is described with a detailed account of the advanced processing schemes used. A method to characterize and compensate for the imaging detector distortion, whose principle is also based on speckle, is included. The presented instrument is expected to find interest at synchrotrons and at the new X-ray free-electron laser sources under development worldwide, where successful exploitation of beams relies on the availability of an accurate wavefront metrology.

  6. SPECKLE IMAGING EXCLUDES LOW-MASS COMPANIONS ORBITING THE EXOPLANET HOST STAR TRAPPIST-1

    International Nuclear Information System (INIS)

    Howell, Steve B.; Scott, Nicholas J.; Everett, Mark E.; Horch, Elliott P.; Winters, Jennifer G.; Hirsch, Lea; Nusdeo, Dan

    2016-01-01

    We have obtained the highest-resolution images available of TRAPPIST-1 using the Gemini-South telescope and our speckle imaging camera. Observing at 692 and 883 nm, we reached the diffraction limit of the telescope providing a best resolution of 27 mas or, at the distance of TRAPPIST-1, a spatial resolution of 0.32 au. Our imaging of the star extends from 0.32 to 14.5 au. We show that to a high confidence level, we can exclude all possible stellar and brown dwarf companions, indicating that TRAPPIST-1 is a single star.

  7. SPECKLE IMAGING EXCLUDES LOW-MASS COMPANIONS ORBITING THE EXOPLANET HOST STAR TRAPPIST-1

    Energy Technology Data Exchange (ETDEWEB)

    Howell, Steve B.; Scott, Nicholas J. [NASA Ames Research Center, Moffett Field, CA 94035 (United States); Everett, Mark E. [National Optical Astronomy Observatory, 950 N. Cherry Avenue, Tucson, AZ 85719 (United States); Horch, Elliott P. [Department of Physics, Southern Connecticut State University, 501 Crescent Street, New Haven, CT, 06515 (United States); Winters, Jennifer G. [Harvard-Smithsonian Center for Astrophysics, Cambridge, MA, 02138 (United States); Hirsch, Lea [Astronomy Department, University of California, Berkeley, 510 Campbell Hall, Berkeley, CA, 94720 (United States); Nusdeo, Dan [Department of Physics and Astronomy, Georgia State University, P.O. Box 5060, Atlanta, GA 30302 (United States)

    2016-09-20

    We have obtained the highest-resolution images available of TRAPPIST-1 using the Gemini-South telescope and our speckle imaging camera. Observing at 692 and 883 nm, we reached the diffraction limit of the telescope providing a best resolution of 27 mas or, at the distance of TRAPPIST-1, a spatial resolution of 0.32 au. Our imaging of the star extends from 0.32 to 14.5 au. We show that to a high confidence level, we can exclude all possible stellar and brown dwarf companions, indicating that TRAPPIST-1 is a single star.

  8. SAR image regularization with fast approximate discrete minimization.

    Science.gov (United States)

    Denis, Loïc; Tupin, Florence; Darbon, Jérôme; Sigelle, Marc

    2009-07-01

    Synthetic aperture radar (SAR) images, like other coherent imaging modalities, suffer from speckle noise. The presence of this noise makes the automatic interpretation of images a challenging task, and noise reduction is often a prerequisite for successful use of classical image processing algorithms. Numerous approaches have been proposed to filter speckle noise. Markov random field (MRF) modelization provides a convenient way to express both data fidelity constraints and desirable properties of the filtered image. In this context, total variation minimization has been extensively used to constrain the oscillations in the regularized image while preserving its edges. Speckle noise follows heavy-tailed distributions, and the MRF formulation leads to a minimization problem involving nonconvex log-likelihood terms. Such a minimization can be performed efficiently by computing minimum cuts on weighted graphs. Due to memory constraints, exact minimization, although theoretically possible, is not achievable on the large images required by remote sensing applications. The computational burden of the state-of-the-art algorithm for approximate minimization (namely the α-expansion) is too heavy, especially when considering joint regularization of several images. We show that a satisfying solution can be reached, in a few iterations, by performing a graph-cut-based combinatorial exploration of large trial moves. This algorithm is applied to joint regularization of the amplitude and interferometric phase in urban area SAR images.

  9. Investigation of the ripeness of oil palm fresh fruit bunches using bio-speckle imaging

    Science.gov (United States)

    Salambue, R.; Adnan, A.; Shiddiq, M.

    2018-03-01

    The ripeness of oil palm Fresh Fruit Bunches (FFB) determines the yield of the oil produced. Traditionally, FFB ripeness is determined in two ways: by the number of loose fruits and by color changes. However, such visual determination is subjective and qualitative. In this study, FFB ripeness was investigated using a laser-based image processing technique, which is non-destructive, simple and quantitative. A FFB is inserted into a light-tight box containing a laser diode and a CMOS camera, illuminated, and then imaged. Images were recorded for four FFB fractions (F0, F3, F4 and F5) on the front and rear surfaces at three sections. The recorded images are speckled granules with varying light intensity (bio-speckle imaging). The feature extracted from each speckle image is the contrast value, obtained from the average gray-value intensity and the standard deviation. Based on the contrast values, the four FFB fractions could be grouped into three ripeness levels, unripe (F0), ripe (F3) and overripe (F4 and F5), with 75% accuracy on the front surface of the base section of the FFB.
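The contrast feature described here is simply the ratio of the standard deviation to the mean of the gray-level image. A hedged sketch with simulated speckle; the class thresholds and the frame-averaging stand-in for bio-speckle activity are illustrative assumptions, not the paper's calibrated values:

```python
import numpy as np

def speckle_contrast(gray):
    """Global bio-speckle contrast: standard deviation over mean intensity."""
    gray = np.asarray(gray, dtype=float)
    return gray.std() / gray.mean()

def ripeness_class(c, t_unripe=0.9, t_ripe=0.6):
    """Map a contrast value to a ripeness label (thresholds are illustrative)."""
    if c >= t_unripe:
        return "unripe"
    if c >= t_ripe:
        return "ripe"
    return "overripe"

def speckle_frame(rng, shape):
    z = rng.standard_normal(shape) + 1j * rng.standard_normal(shape)
    return np.abs(z) ** 2

rng = np.random.default_rng(3)
static = speckle_frame(rng, (128, 128))   # low activity: full speckle contrast
blurred = np.mean([speckle_frame(rng, (128, 128)) for _ in range(16)], axis=0)

print(ripeness_class(speckle_contrast(static)),
      ripeness_class(speckle_contrast(blurred)))
```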

  10. Exploiting the speckle-correlation scattering matrix for a compact reference-free holographic image sensor.

    Science.gov (United States)

    Lee, KyeoReh; Park, YongKeun

    2016-10-31

    The word 'holography' means a drawing that contains all of the information for light, both amplitude and wavefront. However, because of the insufficient bandwidth of current electronics, the direct measurement of the wavefront of light has not yet been achieved. Though reference-field-assisted interferometric methods have been utilized in numerous applications, introducing a reference field raises several fundamental and practical issues. Here we demonstrate a reference-free holographic image sensor. To achieve this, we propose a speckle-correlation scattering matrix approach; light-field information passing through a thin disordered layer is recorded and retrieved from a single-shot recording of speckle intensity patterns. Self-interference via diffusive scattering enables access to impinging light-field information, when light transport in the diffusive layer is precisely calibrated. As a proof-of-concept, we demonstrate direct holographic measurements of three-dimensional optical fields using a compact device consisting of a regular image sensor and a diffusor.

  11. Edge Probability and Pixel Relativity-Based Speckle Reducing Anisotropic Diffusion.

    Science.gov (United States)

    Mishra, Deepak; Chaudhury, Santanu; Sarkar, Mukul; Soin, Arvinder Singh; Sharma, Vivek

    2018-02-01

    Anisotropic diffusion filters are one of the best choices for speckle reduction in the ultrasound images. These filters control the diffusion flux flow using local image statistics and provide the desired speckle suppression. However, inefficient use of edge characteristics results in either oversmooth image or an image containing misinterpreted spurious edges. As a result, the diagnostic quality of the images becomes a concern. To alleviate such problems, a novel anisotropic diffusion-based speckle reducing filter is proposed in this paper. A probability density function of the edges along with pixel relativity information is used to control the diffusion flux flow. The probability density function helps in removing the spurious edges and the pixel relativity reduces the oversmoothing effects. Furthermore, the filtering is performed in superpixel domain to reduce the execution time, wherein a minimum of 15% of the total number of image pixels can be used. For performance evaluation, 31 frames of three synthetic images and 40 real ultrasound images are used. In most of the experiments, the proposed filter shows a better performance as compared to the state-of-the-art filters in terms of the speckle region's signal-to-noise ratio and mean square error. It also shows a comparative performance for figure of merit and structural similarity measure index. Furthermore, in the subjective evaluation, performed by the expert radiologists, the proposed filter's outputs are preferred for the improved contrast and sharpness of the object boundaries. Hence, the proposed filtering framework is suitable to reduce the unwanted speckle and improve the quality of the ultrasound images.
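For reference, the classic Perona-Malik scheme on which this family of filters builds can be written compactly. The sketch below is the textbook filter, not the proposed edge-probability/pixel-relativity variant:

```python
import numpy as np

def perona_malik(img, n_iter=30, kappa=0.15, gamma=0.2):
    """Classic Perona-Malik anisotropic diffusion: the conduction coefficient
    shrinks where the local gradient is large, so edges diffuse less."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        dn = np.roll(u, -1, axis=0) - u   # gradients toward the four neighbours
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        c = lambda d: np.exp(-(d / kappa) ** 2)
        u = u + gamma * (c(dn) * dn + c(ds) * ds + c(de) * de + c(dw) * dw)
    return u

rng = np.random.default_rng(4)
clean = np.full((32, 32), 0.4)
clean[:, 16:] = 1.2                      # a single strong edge
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
smooth = perona_malik(noisy)
print(np.mean((noisy - clean) ** 2), np.mean((smooth - clean) ** 2))
```

The error with respect to the clean image drops while the step edge survives, because the conduction coefficient is nearly zero across the strong gradient.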

  12. Speckle Interferometry

    Science.gov (United States)

    Chiang, F. P.; Jin, F.; Wang, Q.; Zhu, N.

    Before the milestone work of Leendertz in 1970, coherent speckles generated from a laser-illuminated object were considered noise to be eliminated or minimized. Leendertz showed that coherent speckles are actually information carriers. Since then, the speckle technique has found many applications in mechanics, metrology, nondestructive evaluation and materials science. Speckles need not be coherent: artificially created so-called white-light speckles can also be used as information carriers. In this paper we present two recent developments of the speckle technique, with applications to micromechanics problems using SIEM (Speckle Interferometry with Electron Microscopy), and to nondestructive evaluation of crevice corrosion, composite disbond and vibration of large structures using TADS (Time-Average Digital Specklegraphy).

  13. Utility of spatial frequency domain imaging (SFDI) and laser speckle imaging (LSI) to non-invasively diagnose burn depth in a porcine model☆

    Science.gov (United States)

    Burmeister, David M.; Ponticorvo, Adrien; Yang, Bruce; Becerra, Sandra C.; Choi, Bernard; Durkin, Anthony J.; Christy, Robert J.

    2015-01-01

    Surgical intervention of second degree burns is often delayed because of the difficulty of visual diagnosis, which increases the risk of scarring and infection. Non-invasive metrics have shown promise in accurately assessing burn depth. Here, we examine the use of spatial frequency domain imaging (SFDI) and laser speckle imaging (LSI) for predicting burn depth. Contact burn wounds of increasing severity were created on the dorsum of a Yorkshire pig, and wounds were imaged with SFDI/LSI starting immediately after burn and then daily for the next 4 days. In addition, on each day the burn wounds were biopsied for histological analysis of burn depth, defined by collagen coagulation, apoptosis, and adnexal/vascular necrosis. Histological results show that collagen coagulation progressed from day 0 to day 1, and then stabilized. Imaging of the burn wounds with these non-invasive techniques produced metrics that correlate with different predictors of burn depth: collagen coagulation and apoptosis correlated with the SFDI scattering coefficient parameter (μs′), and adnexal/vascular necrosis on the day of burn correlated with blood flow determined by LSI. Therefore, incorporating the SFDI scattering coefficient and LSI-determined blood flow may provide an algorithm for accurate assessment of the severity of burn wounds in real time. PMID:26138371

  14. Comet Shoemaker-Levy 9/Jupiter collision observed with a high resolution speckle imaging system

    Energy Technology Data Exchange (ETDEWEB)

    Gravel, D. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    During the week of July 16, 1994, comet Shoemaker-Levy 9, broken into 20 plus pieces by tidal forces on its last orbit, smashed into the planet Jupiter, releasing the explosive energy of 500 thousand megatons. A team of observers from LLNL used the LLNL Speckle Imaging Camera mounted on the University of California's Lick Observatory 3 Meter Telescope to capture continuous sequences of planet images during the comet encounter. Post processing with the bispectral phase reconstruction algorithm improves the resolution by removing much of the blurring due to atmospheric turbulence. High resolution images of the planet surface showing the aftermath of the impact are probably the best that were obtained from any ground-based telescope. We have been looking at the regions of the fragment impacts to try to discern any dynamic behavior of the spots left on Jupiter's cloud tops. Such information can lead to conclusions about the nature of the comet and of Jupiter's atmosphere. So far, the Hubble Space Telescope has observed expanding waves from the G impact whose mechanism is enigmatic since they appear to be too slow to be sound waves and too fast to be gravity waves, given the present knowledge of Jupiter's atmosphere. Some of our data on the G and L impact region complements the Hubble observations but, so far, is inconclusive about spot dynamics.

  15. Statistical model for OCT image denoising

    KAUST Repository

    Li, Muxingzi

    2017-08-01

    Optical coherence tomography (OCT) is a non-invasive technique with a large array of applications in clinical imaging and biological tissue visualization. However, the presence of speckle noise affects the analysis of OCT images and their diagnostic utility. In this article, we introduce a new OCT denoising algorithm. The proposed method is founded on a numerical optimization framework based on maximum-a-posteriori estimate of the noise-free OCT image. It combines a novel speckle noise model, derived from local statistics of empirical spectral domain OCT (SD-OCT) data, with a Huber variant of total variation regularization for edge preservation. The proposed approach exhibits satisfying results in terms of speckle noise reduction as well as edge preservation, at reduced computational cost.

  16. Development of an optimized algorithm for the characterization of microflow using speckle patterns present in optical coherence tomography signal; Desenvolvimento de um algoritimo otimizado para caracterizacao de fluxos microfluidicos utilizando padroes de speckle presentes no sinal de tomografia por coerencia optica

    Energy Technology Data Exchange (ETDEWEB)

    Pretto, Lucas Ramos de

    2015-07-01

    This work discusses Optical Coherence Tomography (OCT) and its application to microfluidics. To this end, physical characterization of microfluidic circuits was performed using 3D (three-dimensional) models constructed from OCT images of such circuits, and the technique was thus evaluated as a potential tool to aid in the inspection of microchannels. Going further, this work studies and develops analytical techniques for microfluidic flow, in particular techniques based on speckle patterns. First, existing methods were studied and improved, such as Speckle Variance OCT, where a 31% gain in processing time was obtained. Other methods, such as LASCA (Laser Speckle Contrast Analysis), based on speckle autocorrelation, were adapted to OCT images. Derived from LASCA, the developed analysis technique based on intensity autocorrelation motivated the development of a custom OCT system as well as optimized acquisition software, with a sampling rate of 8 kHz. The proposed method was then able to distinguish different flow rates, and limits of detection were tested, demonstrating its feasibility for Brownian motion analysis and flow rates below 10 μl/min. (author)

  17. The POKEMON Speckle Survey of Nearby M-Dwarfs

    Science.gov (United States)

    van Belle, Gerard; von Braun, Kaspar; Horch, Elliott; Clark, Catherine; DSSI Speckle Team

    2018-01-01

    The POKEMON (Pervasive Overview of Kompanions of Every M-dwarf in Our Neighborhood) survey of nearby M-dwarfs intends to inspect, at diffraction-limited resolution, every low-mass star out to 15pc, along with selected additional objects to 25pc. The primary emphasis of the survey is detection of low-mass companions to these M-dwarfs for refinement of the low-mass star multiplicity rate. The resultant catalog of M-dwarf companions will also guide immediate refinement of transit planet detection results from surveys such as TESS. POKEMON is using Lowell Observatory's 4.3-m Discovery Channel Telescope (DCT) with the Differential Speckle Survey Instrument (DSSI) speckle camera, along with the NN-Explore Exoplanet Stellar Speckle Imager (NESSI) speckle imager on 3.5-m WIYN; the survey takes advantage of the extremely rapid observing cadence rates possible with WIYN and (especially) DCT. The current status and preliminary results from the first 20+ nights of observing will be presented. Gotta observe them all!

  18. Vegetation Detection in Stress of Moisture Shortage Based on Laser Speckle Recognition

    Science.gov (United States)

    Ishizawa, Hiroaki; Matsuo, Tsukasa; Miki, Takashi

    This paper describes a new method of measuring plant vigor using laser speckle patterns, and proposes a practical application of the presented measurement system. The measuring instrument consists of a He-Ne laser as the light source and a set of optics, such as reflectors and a beam expander. The speckle pattern is captured by a CCD camera through lenses. A pothos (Epipremnum aureum) and a Japanese morning glory (Ipomoea nil) were used as sample plants, and speckle pattern images of their intact leaves were recorded. Small but visible vigor-related veins could be clearly observed in the speckle-pattern images, whereas withered leaves produced different images. A relationship was obtained between the image features and chlorophyll degradation. It is expected that a plant's response to stress could be detected by measuring the laser speckle pattern, and that the method could serve as a sensor for field-server systems at field monitoring sites.

  19. Optically sectioned in vivo imaging with speckle illumination HiLo microscopy

    Science.gov (United States)

    Lim, Daryl; Ford, Tim N.; Chu, Kengyeh K.; Mertz, Jerome

    2011-01-01

    We present a simple wide-field imaging technique, called HiLo microscopy, that is capable of producing optically sectioned images in real time, comparable in quality to confocal laser scanning microscopy. The technique is based on the fusion of two raw images, one acquired with speckle illumination and another with standard uniform illumination. The fusion can be numerically adjusted, using a single parameter, to produce optically sectioned images of varying thicknesses with the same raw data. Direct comparison between our HiLo microscope and a commercial confocal laser scanning microscope is made on the basis of sectioning strength and imaging performance. Specifically, we show that HiLo and confocal 3-D imaging of a GFP-labeled mouse brain hippocampus are comparable in quality. Moreover, HiLo microscopy is capable of faster, near video rate imaging over larger fields of view than attainable with standard confocal microscopes. The goal of this paper is to advertise the simplicity, robustness, and versatility of HiLo microscopy, which we highlight with in vivo imaging of common model organisms including planaria, C. elegans, and zebrafish.
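The two-image fusion described above can be sketched in a few lines. The following is a minimal toy implementation, assuming a plain box filter as the low-pass/high-pass pair and local speckle contrast as the sectioning weight; the published HiLo algorithm uses carefully designed bandpass filters, and all function names and parameters here are illustrative only.

```python
import numpy as np

def boxblur(img, k):
    """Separable box blur used as a stand-in low-pass filter."""
    kern = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kern, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kern, mode="same"), 0, out)

def hilo(uniform, speckle, eta=1.0, k=7):
    """Toy HiLo fusion: low frequencies come from a speckle-contrast-weighted
    version of the uniform image (optical sectioning), high frequencies come
    directly from the uniform image, which is already sectioned there.
    `eta` plays the role of the single fusion parameter in the abstract."""
    diff = speckle - uniform                     # speckle survives only in focus
    contrast = np.sqrt(boxblur(diff ** 2, k)) / (boxblur(uniform, k) + 1e-9)
    lo = boxblur(contrast * uniform, k)          # sectioned low-pass component
    hi = uniform - boxblur(uniform, k)           # complementary high-pass
    return eta * lo + hi
```

Varying `eta` against the same pair of raw images mimics the adjustable section thickness mentioned in the abstract.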

  20. Speckle reduction in echocardiography by temporal compounding and anisotropic diffusion filtering

    Science.gov (United States)

    Giraldo-Guzmán, Jader; Porto-Solano, Oscar; Cadena-Bonfanti, Alberto; Contreras-Ortiz, Sonia H.

    2015-01-01

    Echocardiography is a medical imaging technique based on ultrasound signals that is used to evaluate heart anatomy and physiology. Echocardiographic images are affected by speckle, a type of multiplicative noise that obscures details of the structures, and reduces the overall image quality. This paper shows an approach to enhance echocardiography using two processing techniques: temporal compounding and anisotropic diffusion filtering. We used twenty echocardiographic videos that include one or three cardiac cycles to test the algorithms. Two images from each cycle were aligned in space and averaged to obtain the compound images. These images were then processed using anisotropic diffusion filters to further improve their quality. Resultant images were evaluated using quality metrics and visual assessment by two medical doctors. The average total improvement on signal-to-noise ratio was up to 100.29% for videos with three cycles, and up to 32.57% for videos with one cycle.
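The two processing stages above can be sketched as follows. The alignment step is assumed done upstream, and the diffusion filter shown is the classical Perona-Malik scheme, one common choice; the paper's exact variant and parameter values may differ.

```python
import numpy as np

def compound(frames):
    """Temporal compounding: average frames already aligned in space."""
    return np.mean(np.asarray(frames, dtype=float), axis=0)

def perona_malik(img, n_iter=20, kappa=0.1, gamma=0.2):
    """Classical Perona-Malik anisotropic diffusion. `kappa` sets the
    edge-stopping threshold, `gamma` the step size (stable for <= 0.25)."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)      # diffuse less across strong edges
    for _ in range(n_iter):
        dn = np.roll(u, 1, 0) - u; dn[0, :] = 0  # zero-flux borders
        ds = np.roll(u, -1, 0) - u; ds[-1, :] = 0
        de = np.roll(u, -1, 1) - u; de[:, -1] = 0
        dw = np.roll(u, 1, 1) - u; dw[:, 0] = 0
        u = u + gamma * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u
```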

  1. Granulometry for the study of dynamic speckle patterns

    International Nuclear Information System (INIS)

    Mavilioa, Adriana; Fernandez, Margarita; Trivi, Marcelo; Rabal, Hector; Arizaga, Ricardo

    2009-01-01

    Dynamic speckle patterns are generated by laser light scattering on surfaces that exhibit some kind of activity, due to physical or biological processes that take place in the illuminated object. This dynamic process is characterized by studying the texture changes of auxiliary images, namely the temporal history of the speckle pattern (THSP), obtained from these speckle patterns. The drying process of water-borne paint is studied through a method based on mathematical morphology applied to THSP image processing: the granulometry of these images is computed to obtain their characteristic granulometric spectrum. From the granulometric size distribution of each THSP image four parameters are obtained: mean length, standard deviation, asymmetry and kurtosis. These parameters are found to be suitable as texture features. The Mahalanobis distance is calculated between the texture features of the THSP images representative of the temporal stages of the drying process and the features of the final stage or texture pattern. The behavior of the distance function describes the drying process of the water-borne paint satisfactorily. Finally, these results are compared with those obtained by other methods. The granulometric method reported in this work is distinguished by its simplicity and ease of implementation, and can be used to characterize the evolution of any process recorded through dynamic speckle. (Author)
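The granulometric spectrum and its four moments can be sketched with plain NumPy. This is a toy version with square structuring elements, periodic boundaries for brevity, and hypothetical size values; the paper's exact structuring elements and normalization are not reproduced here.

```python
import numpy as np

def grey_open(img, k):
    """Grey-level opening (erosion then dilation) with a k x k flat
    structuring element, built from shifted copies (periodic borders)."""
    def sweep(a, op):
        out, r = a.copy(), k // 2
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                out = op(out, np.roll(np.roll(a, dy, 0), dx, 1))
        return out
    return sweep(sweep(img, np.minimum), np.maximum)

def granulometric_features(img, sizes=(1, 3, 5, 7, 9)):
    """Granulometric size distribution of a THSP image: the loss of image
    'volume' between openings of increasing size, normalized to a pdf, then
    summarized by four moments (mean, std, asymmetry, kurtosis), mirroring
    the texture features named in the abstract."""
    vols = np.array([grey_open(img, k).sum() for k in sizes])
    pdf = -np.diff(vols)                 # openings are decreasing, so pdf >= 0
    pdf = pdf / pdf.sum()
    x = np.array(sizes[1:], dtype=float)
    mean = (x * pdf).sum()
    std = np.sqrt(((x - mean) ** 2 * pdf).sum())
    skew = (((x - mean) / (std + 1e-12)) ** 3 * pdf).sum()
    kurt = (((x - mean) / (std + 1e-12)) ** 4 * pdf).sum()
    return mean, std, skew, kurt
```

The Mahalanobis distance between these feature vectors and those of the final-stage texture would then track the drying process.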

  2. Speckle Noise Reduction via Nonconvex High Total Variation Approach

    Directory of Open Access Journals (Sweden)

    Yulian Wu

    2015-01-01

    We address the problem of speckle noise removal. The classical total variation is extensively used in this field, but it suffers from staircase-like artifacts and the loss of image details. To resolve these problems, a nonconvex total generalized variation (TGV) regularization is used to preserve both edges and details of the images. The TGV regularization, which is able to remove the staircase effect, has a strong theoretical guarantee by means of its high-order smoothness. Our method combines the merits of both the TGV method and the nonconvex variational method and avoids their main drawbacks. Furthermore, we develop an efficient algorithm for solving the nonconvex TGV-based optimization problem. We experimentally demonstrate the excellent performance of the technique, both visually and quantitatively.
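For reference, the classical convex total variation baseline that this work improves upon can be sketched as a smoothed ROF gradient descent. This is only the baseline the abstract contrasts against, not the paper's nonconvex TGV model, which requires a more elaborate solver (e.g. primal-dual splitting); step size and smoothing parameter are illustrative.

```python
import numpy as np

def tv_denoise(img, lam=0.1, n_iter=200, tau=0.1, eps=0.1):
    """Gradient descent on the smoothed ROF energy
        E(u) = 0.5 * ||u - f||^2 + lam * sum sqrt(|grad u|^2 + eps^2),
    i.e. classical convex TV (eps smooths the non-differentiable corner).
    Periodic boundaries via np.roll, for brevity."""
    f = img.astype(float)
    u = f.copy()
    for _ in range(n_iter):
        ux = np.roll(u, -1, 1) - u
        uy = np.roll(u, -1, 0) - u
        mag = np.sqrt(ux ** 2 + uy ** 2 + eps ** 2)
        px, py = ux / mag, uy / mag
        div = (px - np.roll(px, 1, 1)) + (py - np.roll(py, 1, 0))
        u -= tau * ((u - f) - lam * div)
    return u
```

The staircase artifacts mentioned in the abstract are a known property of exactly this first-order TV model; TGV adds a second-order term to avoid them.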

  3. Despeckle filtering for ultrasound imaging and video II selected applications

    CERN Document Server

    Loizou, Christos P

    2015-01-01

    In ultrasound imaging and video, visual perception is hindered by speckle, a multiplicative noise that degrades quality. Noise reduction is therefore essential for improving visual observation quality, or as a pre-processing step for further automated analysis such as image/video segmentation, texture analysis and encoding. The goal of the first book (book 1 of 2) was to introduce the problem of speckle in ultrasound image and video, as well as the theoretical background, algorithmic steps, and Matlab™ code for a representative group of despeckle filters.

  4. Dynamical properties of speckled speckles

    DEFF Research Database (Denmark)

    Hanson, Steen Grüner; Iversen, Theis Faber Quist; Hansen, Rene Skov

    2010-01-01

    the static diffuser and the plane of observation consist of an optical system that can be characterized by a complex-valued ABCD-matrix (e.g. simple and complex imaging systems, free space propagation in both the near- and far-field, and Fourier transform systems). The use of the complex ABCD-method means...... to be Gaussian but the derived expressions are not restricted to a plane incident beam. The results are applicable for speckle-based systems for determining mechanical displacements, especially for long-range systems, and for analyzing systems for measuring biological activity beyond a diffuse layer, e.g. blood...

  5. Simulation of speckle patterns with pre-defined correlation distributions

    Science.gov (United States)

    Song, Lipei; Zhou, Zhen; Wang, Xueyan; Zhao, Xing; Elson, Daniel S.

    2016-01-01

    We put forward a method to easily generate a single or a sequence of fully developed speckle patterns with pre-defined correlation distribution by utilizing the principle of coherent imaging. The few-to-one mapping between the input correlation matrix and the correlation distribution between simulated speckle patterns is realized and there is a simple square relationship between the values of these two correlation coefficient sets. This method is demonstrated both theoretically and experimentally. The square relationship enables easy conversion from any desired correlation distribution. Since the input correlation distribution can be defined by a digital matrix or a gray-scale image acquired experimentally, this method provides a convenient way to simulate real speckle-related experiments and to evaluate data processing techniques. PMID:27231589
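The coherent-imaging principle underlying the method can be sketched for a single fully developed speckle pattern: a random phase screen is low-pass filtered by a circular pupil, and the intensity of the resulting complex field is the speckle image. The paper's actual contribution (mapping a pre-defined correlation matrix onto a designed sequence of such patterns) builds on this and is not reproduced; the aperture size here is illustrative.

```python
import numpy as np

def speckle_pattern(n=256, aperture_frac=0.25, rng=None):
    """Fully developed speckle via a generic coherent-imaging model:
    unit-amplitude random phasors, pupil-filtered in the Fourier domain."""
    rng = rng or np.random.default_rng(0)
    field = np.exp(1j * 2 * np.pi * rng.random((n, n)))   # random phase screen
    fx = np.fft.fftfreq(n)
    pupil = (fx[None, :] ** 2 + fx[:, None] ** 2) < (aperture_frac / 2) ** 2
    img_field = np.fft.ifft2(np.fft.fft2(field) * pupil)  # circular pupil filter
    return np.abs(img_field) ** 2                         # detected intensity
```

A fully developed pattern generated this way has negative-exponential intensity statistics, i.e. speckle contrast (std/mean) close to 1.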

  6. An overview of methods to mitigate artifacts in optical coherence tomography imaging of the skin.

    Science.gov (United States)

    Adabi, Saba; Fotouhi, Audrey; Xu, Qiuyun; Daveluy, Steve; Mehregan, Darius; Podoleanu, Adrian; Nasiriavanaki, Mohammadreza

    2018-05-01

    Optical coherence tomography (OCT) of skin delivers three-dimensional images of tissue microstructures. Although OCT imaging offers a promising high-resolution modality, OCT images suffer from artifacts that lead to misinterpretation of tissue structures, so an overview of methods to mitigate artifacts in OCT imaging of the skin is of paramount importance. Speckle, intensity decay, and blurring are three major artifacts in OCT images. Speckle is due to the low-coherence light source used in the configuration of OCT, intensity decay is the deterioration of light with respect to depth, and blurring is the consequence of deficiencies of optical components. Two speckle reduction methods (one based on an artificial neural network and one based on spatial compounding), an attenuation compensation algorithm (based on the Beer-Lambert law) and a deblurring procedure (using deconvolution) are described. Moreover, an optical-properties extraction algorithm based on the extended Huygens-Fresnel (EHF) principle, which obtains additional information from OCT images, is discussed. In this short overview, we summarize some of the image enhancement algorithms for OCT images which address the abovementioned artifacts. The results showed a significant improvement in the visibility of the clinically relevant features in the images, and the quality improvement was evaluated using several numerical assessment measures. Clinical dermatologists can benefit from these image enhancement algorithms to improve OCT diagnosis, essentially making OCT function as a noninvasive optical biopsy. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
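The Beer-Lambert attenuation compensation mentioned above reduces, in its simplest single-scattering form, to rescaling each depth sample by the inverse of the expected round-trip decay. This is a minimal sketch under that assumption; the attenuation coefficient `mu` would in practice be estimated from the data, and the paper's algorithm may be more elaborate.

```python
import numpy as np

def compensate_attenuation(ascan, mu, dz=1.0):
    """Single-scattering Beer-Lambert model: detected intensity at depth z
    decays as exp(-2*mu*z) (factor 2 for the round trip), so compensation
    multiplies each depth sample by exp(+2*mu*z)."""
    z = np.arange(len(ascan)) * dz
    return ascan * np.exp(2.0 * mu * z)
```

Applied column-by-column to a B-scan, this flattens the depth-dependent brightness falloff.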

  7. Optical design of the comet Shoemaker-Levy speckle camera

    Energy Technology Data Exchange (ETDEWEB)

    Bissinger, H. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    An optical design is presented in which the Lick 3-meter telescope and a bare-CCD speckle camera system were used to image the collision sites of comet Shoemaker-Levy 9 with the planet Jupiter. The brief overview covers the optical constraints and system layout. The choice of a Risley prism combination to compensate for time-dependent atmospheric chromatic changes is described. Plate scale and signal-to-noise ratio curves resulting from imaging reference stars are compared with theory, and uncorrected images of Jupiter's impact sites are compared with their reconstructions. The results confirm that speckle imaging techniques can be used over an extended time period to image large extended objects.

  8. Feasibility of speckle variance OCT for imaging cutaneous microvasculature regeneration during healing of wounds in diabetic mice

    Science.gov (United States)

    Sharma, P.; Kumawat, J.; Kumar, S.; Sahu, K.; Verma, Y.; Gupta, P. K.; Rao, K. D.

    2018-02-01

    We report on a study to assess the feasibility of a swept source-based speckle variance optical coherence tomography setup for monitoring cutaneous microvasculature. Punch wounds created in the ear pinnae of diabetic mice were monitored at different times post wounding to assess the structural and vascular changes. It was observed that the epithelium thickness increases post wounding and continues to be thick even after healing. Also, the wound size assessed by vascular images is larger than the physical wound size. The results show that the developed speckle variance optical coherence tomography system can be used to monitor vascular regeneration during wound healing in diabetic mice.
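The core of speckle variance OCT is a per-pixel temporal variance over repeated B-scans of the same location: flowing blood decorrelates the speckle between frames, so vessels light up while static tissue stays dark. A minimal sketch (frame registration and thresholding, which a real pipeline needs, are omitted):

```python
import numpy as np

def speckle_variance(bscans):
    """Inter-frame speckle variance: input is a stack of N repeated B-scans
    of the same location, shape (N, rows, cols); output is the per-pixel
    temporal variance, high where speckle decorrelates (flow)."""
    stack = np.asarray(bscans, dtype=float)
    return stack.var(axis=0)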

  9. Deformation measurements of materials at low temperatures using laser speckle photography method

    International Nuclear Information System (INIS)

    Sumio Nakahara; Yukihide Maeda; Kazunori Matsumura; Shigeyoshi Hisada; Takeyoshi Fujita; Kiyoshi Sugihara

    1992-01-01

    The authors observed the deformation of several materials during the cooling process from room temperature to liquid nitrogen temperature using the laser speckle photography method. In-plane displacements were measured by image-plane speckle photography and out-of-plane displacement gradients by defocused speckle photography. The measured in-plane displacements are compared with the results of FEM analysis. The applicability of the laser speckle photography method to cryogenic engineering is also discussed.

  10. Speckle imaging of active galactic nuclei: NGC 1068 and NGC 4151

    International Nuclear Information System (INIS)

    Ebstein, S.M.

    1987-01-01

    High-resolution images of NGC 1068 and NGC 4151 in the [O III] 5007 Å line and the nearby continuum, produced from data taken with the PAPA photon-counting imaging detector using the technique of speckle imaging, are presented. The images show an unresolved core of [O III] 5007 Å emission in the middle of an extended emission region. The extended emission tends to lie alongside the subarcsecond radio structure. In NGC 4151, the extended emission comes from a nearly linear structure extending on both sides of the unresolved core. In NGC 1068, the extended emission is also a linear structure centered on the unresolved core, but the emission is concentrated in lobes lying to either side of the major axis. The continuum of NGC 4151 is spatially unresolved. The continuum of NGC 1068 is extended ∼1'' to the SW of the center of the [O III] 5007 Å emission. Certain aspects of the PAPA detector are discussed, including the variable-threshold discriminators that track the image intensifier pulse height, and the camera artifacts. The data processing is described in detail.

  11. Synchronized renal blood flow dynamics mapped with wavelet analysis of laser speckle flowmetry data

    DEFF Research Database (Denmark)

    Brazhe, Alexey R; Marsh, Donald J; von Holstein-Rathlou, Niels-Henrik

    2014-01-01

    Full-field laser speckle microscopy provides real-time imaging of superficial blood flow rate. Here we apply continuous wavelet transform to time series of speckle-estimated blood flow from each pixel of the images to map synchronous patterns in instantaneous frequency and phase on the surface of rat kidneys. The regulatory mechanism in the renal microcirculation generates oscillations in arterial blood flow at several characteristic frequencies. Our approach to laser speckle image processing allows detection of frequency and phase entrainments, visualization of their patterns, and estimation of the extent of synchronization in renal cortex dynamics.
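The phase-entrainment idea can be sketched with a complex Morlet wavelet: extract the instantaneous phase of one frequency band from each pixel's flow record, then compare phases across pixels. This is a generic sketch, not the authors' code; the frame rate, band frequency and wavelet parameter below are illustrative.

```python
import numpy as np

def morlet_phase(signal, freq, fs, w0=6.0):
    """Instantaneous phase at one frequency via convolution with a complex
    Morlet wavelet (central frequency `freq`, sampling rate `fs`)."""
    dt = 1.0 / fs
    scale = w0 / (2 * np.pi * freq)
    t = np.arange(-4 * scale, 4 * scale + dt, dt)
    wavelet = np.exp(1j * w0 * t / scale) * np.exp(-t ** 2 / (2 * scale ** 2))
    analytic = np.convolve(signal, wavelet, mode="same")
    return np.angle(analytic)

fs = 10.0                                    # Hz, a plausible speckle frame rate
t = np.arange(0, 60, 1 / fs)
a = np.sin(2 * np.pi * 0.2 * t)              # two pixels oscillating at 0.2 Hz,
b = np.sin(2 * np.pi * 0.2 * t - np.pi / 4)  # the second lagging by 45 degrees
# wrapped per-sample phase difference between the two records
lag = np.angle(np.exp(1j * (morlet_phase(a, 0.2, fs) - morlet_phase(b, 0.2, fs))))
```

A stable, narrow distribution of `lag` across time indicates phase entrainment between the two locations; mapping this over all pixel pairs (or against a reference) gives the synchronization maps described above.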

  12. Cognitive Machine-Learning Algorithm for Cardiac Imaging: A Pilot Study for Differentiating Constrictive Pericarditis From Restrictive Cardiomyopathy.

    Science.gov (United States)

    Sengupta, Partho P; Huang, Yen-Min; Bansal, Manish; Ashrafi, Ali; Fisher, Matt; Shameer, Khader; Gall, Walt; Dudley, Joel T

    2016-06-01

    Associating a patient's profile with the memories of prototypical patients built through previous repeat clinical experience is a key process in clinical judgment. We hypothesized that a similar process using a cognitive computing tool would be well suited for learning and recalling multidimensional attributes of speckle tracking echocardiography data sets derived from patients with known constrictive pericarditis and restrictive cardiomyopathy. Clinical and echocardiographic data of 50 patients with constrictive pericarditis and 44 with restrictive cardiomyopathy were used for developing an associative memory classifier-based machine-learning algorithm. The speckle tracking echocardiography data were normalized in reference to 47 controls with no structural heart disease, and the diagnostic area under the receiver operating characteristic curve of the associative memory classifier was evaluated for differentiating constrictive pericarditis from restrictive cardiomyopathy. Using only speckle tracking echocardiography variables, associative memory classifier achieved a diagnostic area under the curve of 89.2%, which improved to 96.2% with addition of 4 echocardiographic variables. In comparison, the area under the curve of early diastolic mitral annular velocity and left ventricular longitudinal strain were 82.1% and 63.7%, respectively. Furthermore, the associative memory classifier demonstrated greater accuracy and shorter learning curves than other machine-learning approaches, with accuracy asymptotically approaching 90% after a training fraction of 0.3 and remaining flat at higher training fractions. This study demonstrates feasibility of a cognitive machine-learning approach for learning and recalling patterns observed during echocardiographic evaluations. Incorporation of machine-learning algorithms in cardiac imaging may aid standardized assessments and support the quality of interpretations, particularly for novice readers with limited experience. © 2016

  13. Acousto-electrical speckle pattern in Lorentz force electrical impedance tomography

    International Nuclear Information System (INIS)

    Grasland-Mongrain, Pol; Destrempes, François; Cloutier, Guy; Mari, Jean-Martial; Souchon, Rémi; Catheline, Stefan; Chapelon, Jean-Yves; Lafon, Cyril

    2015-01-01

    Ultrasound speckle is a granular texture pattern appearing in ultrasound imaging. It can be used to distinguish tissues and identify pathologies. Lorentz force electrical impedance tomography is an ultrasound-based medical imaging technique of the tissue electrical conductivity. It is based on the application of an ultrasound wave in a medium placed in a magnetic field and on the measurement of the induced electric current due to Lorentz force. Similarly to ultrasound imaging, we hypothesized that a speckle could be observed with Lorentz force electrical impedance tomography imaging. In this study, we first assessed the theoretical similarity between the measured signals in Lorentz force electrical impedance tomography and in ultrasound imaging modalities. We then compared experimentally the signal measured in both methods using an acoustic and electrical impedance interface. Finally, a bovine muscle sample was imaged using the two methods. Similar speckle patterns were observed. This indicates the existence of an ‘acousto-electrical speckle’ in the Lorentz force electrical impedance tomography with spatial characteristics driven by the acoustic parameters but due to electrical impedance inhomogeneities instead of acoustic ones as is the case of ultrasound imaging. (paper)

  14. Acousto-electrical speckle pattern in Lorentz force electrical impedance tomography

    Science.gov (United States)

    Grasland-Mongrain, Pol; Destrempes, François; Mari, Jean-Martial; Souchon, Rémi; Catheline, Stefan; Chapelon, Jean-Yves; Lafon, Cyril; Cloutier, Guy

    2015-05-01

    Ultrasound speckle is a granular texture pattern appearing in ultrasound imaging. It can be used to distinguish tissues and identify pathologies. Lorentz force electrical impedance tomography is an ultrasound-based medical imaging technique of the tissue electrical conductivity. It is based on the application of an ultrasound wave in a medium placed in a magnetic field and on the measurement of the induced electric current due to Lorentz force. Similarly to ultrasound imaging, we hypothesized that a speckle could be observed with Lorentz force electrical impedance tomography imaging. In this study, we first assessed the theoretical similarity between the measured signals in Lorentz force electrical impedance tomography and in ultrasound imaging modalities. We then compared experimentally the signal measured in both methods using an acoustic and electrical impedance interface. Finally, a bovine muscle sample was imaged using the two methods. Similar speckle patterns were observed. This indicates the existence of an ‘acousto-electrical speckle’ in the Lorentz force electrical impedance tomography with spatial characteristics driven by the acoustic parameters but due to electrical impedance inhomogeneities instead of acoustic ones as is the case of ultrasound imaging.

  15. MLESAC Based Localization of Needle Insertion Using 2D Ultrasound Images

    Science.gov (United States)

    Xu, Fei; Gao, Dedong; Wang, Shan; Zhanwen, A.

    2018-04-01

    In 2D ultrasound images of ultrasound-guided percutaneous needle insertions, it is difficult to determine the positions of the needle axis and tip because of artifacts and other noise. In this work speckle is regarded as the noise of the ultrasound image, and a novel algorithm is presented to detect the needle in a 2D ultrasound image. First, wavelet soft thresholding based on the BayesShrink rule is used to suppress the speckle in the ultrasound image. Second, Otsu's thresholding method and morphological operations are applied to pre-process the image. Finally, the needle is localized in the 2D ultrasound image with the maximum likelihood estimation sample consensus (MLESAC) algorithm. The experimental results show that the proposed algorithm yields valid estimates of the positions of the needle axis and tip in ultrasound images. This work is expected to be useful in path planning and robot-assisted needle insertion procedures.
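The MLESAC step can be sketched for the needle-axis case as a robust 2D line fit: hypotheses are sampled from point pairs as in RANSAC, but each is scored by the negative log-likelihood of a Gaussian-inlier/uniform-outlier mixture rather than by an inlier count. This is a simplified sketch with a fixed mixing weight (the full algorithm re-estimates it by EM); all parameter values are hypothetical.

```python
import numpy as np

def mlesac_line(points, n_trials=200, sigma=1.0, nu=50.0, rng=None):
    """MLESAC line fit: returns (point_on_line, unit_direction) minimizing
    the mixture negative log-likelihood of perpendicular residuals, with
    Gaussian inlier std `sigma` and uniform outlier window width `nu`."""
    rng = rng or np.random.default_rng(0)
    pts = np.asarray(points, dtype=float)
    gamma = 0.5                                  # fixed inlier mixing weight
    best, best_cost = None, np.inf
    for _ in range(n_trials):
        i, j = rng.choice(len(pts), 2, replace=False)
        d = pts[j] - pts[i]
        norm = np.hypot(*d)
        if norm == 0:
            continue
        d = d / norm
        normal = np.array([-d[1], d[0]])
        r = (pts - pts[i]) @ normal              # perpendicular residuals
        lik = (gamma * np.exp(-r ** 2 / (2 * sigma ** 2))
               / (sigma * np.sqrt(2 * np.pi)) + (1 - gamma) / nu)
        cost = -np.log(lik).sum()
        if cost < best_cost:
            best, best_cost = (pts[i], d), cost
    return best
```

In the needle application the candidate points would be the bright pixels surviving the thresholding and morphology stages, and the tip would be taken as the extremal inlier along the fitted axis.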

  16. Mobile phone based laser speckle contrast imager for assessment of skin blood flow

    Science.gov (United States)

    Jakovels, Dainis; Saknite, Inga; Krievina, Gita; Zaharans, Janis; Spigulis, Janis

    2014-10-01

    Assessment of skin blood flow is of interest for evaluation of skin viability as well as for reflection of the overall condition of the circulatory system. Laser Doppler perfusion imaging (LDPI) and laser speckle contrast imaging (LASCI) are optical techniques used for assessment of skin perfusion, but these systems are still too expensive and bulky to be widely available. Implementation of such techniques as connection kits for mobile phones has potential for primary diagnostics. In this work we demonstrate a simple and low-cost LASCI connection kit for a mobile phone and compare it to a laser Doppler perfusion imager. Post-occlusive hyperemia and local thermal hyperemia tests are used to compare both techniques and to demonstrate the potential of the LASCI device.
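The quantity LASCI devices compute is the local speckle contrast, K = sigma/mean over a small sliding window: flow blurs the speckle during the camera exposure and lowers K. A minimal sketch using an integral-image trick (window size illustrative; the calibration from K to perfusion is model-dependent and omitted):

```python
import numpy as np

def speckle_contrast_map(img, win=7):
    """Local speckle contrast K = std/mean over a win x win sliding window.
    Returns the map over valid (fully covered) window positions only,
    shape (H - win + 1, W - win + 1)."""
    img = img.astype(float)
    def local_mean(a):
        # summed-area table for fast window sums
        c = np.cumsum(np.cumsum(np.pad(a, ((1, 0), (1, 0))), 0), 1)
        s = c[win:, win:] - c[:-win, win:] - c[win:, :-win] + c[:-win, :-win]
        return s / win ** 2
    m = local_mean(img)
    m2 = local_mean(img ** 2)
    return np.sqrt(np.maximum(m2 - m ** 2, 0)) / (m + 1e-9)
```

Perfusion indices are then typically derived from 1/K² per pixel, which is what both the phone-based kit and commercial imagers ultimately display.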

  17. Pixel Classification of SAR ice images using ANFIS-PSO Classifier

    Directory of Open Access Journals (Sweden)

    G. Vasumathi

    2016-12-01

    Synthetic Aperture Radar (SAR) plays a vital role in taking extremely high resolution radar images and is widely used to monitor ice-covered ocean regions. Sea monitoring is important for various purposes, including global climate systems and ship navigation. Classification of ice-infested areas yields important features which are further useful for various monitoring processes around the ice regions. The main objective of this paper is to classify SAR ice images so as to identify the regions around ice-infested areas. Three stages are considered in the classification of SAR ice images. The first is preprocessing, in which the speckled SAR ice images are denoised using various speckle-removal filters; a comparison is made among these filters to find the best one for speckle removal. The second stage is segmentation, in which different regions are segmented using the K-means and watershed algorithms; a comparison is made between these two algorithms to find the better one for segmenting SAR ice images. The last stage is pixel-based classification, which identifies and classifies the segmented regions using various supervised learning classifiers: back-propagation neural networks (BPN), a fuzzy classifier, an adaptive neuro-fuzzy inference system (ANFIS) classifier and the proposed ANFIS with particle swarm optimization (PSO) classifier; a comparison is made among all these classifiers to determine which is best suited for classifying SAR ice images. Various evaluation metrics are computed separately at each of these three stages.

  18. Dynamic laser speckle for non-destructive quality evaluation of bread

    Science.gov (United States)

    Stoykova, E.; Ivanov, B.; Shopova, M.; Lyubenova, T.; Panchev, I.; Sainov, V.

    2010-10-01

    Coherent illumination of a diffuse object yields a randomly varying interference pattern, which changes over time with any modification of the object. This phenomenon can be used for detection and visualization of physical or biological activity in various objects (e.g. fruits, seeds, coatings) through statistical description of laser speckle dynamics. The present report aims at non-destructive full-field evaluation of bread by spatial-temporal characterization of laser speckle. The main purpose of the experiments was to prove the ability of the dynamic speckle method to indicate activity within the studied bread samples. In the set-up for acquisition and storage of dynamic speckle patterns, an expanded beam from a DPSS laser (532 nm, 100 mW) illuminated the sample through a ground glass diffuser. A CCD camera, focused on the sample, regularly recorded a sequence of images (8 bits, 780 × 582 square pixels of size 8.1 × 8.1 μm) at a sampling frequency of 0.25 Hz. A temporal structure function was calculated from the full images in each sequence to evaluate the activity of the bread samples over time. In total, 7 samples of two types of bread were monitored during staling, a combined chemical and physical process. Segmentation of images into matrices of isometric fragments was also utilized. The results proved the potential of dynamic speckle as an effective means for monitoring the process of bread staling, and the ability of this approach to differentiate between different types of bread.
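The temporal structure function used above can be sketched for a single pixel's intensity record: S(tau) = ⟨|I(t + tau) - I(t)|⟩. Active (rapidly changing) speckle gives a fast-growing S; a stale, static sample gives a flat curve near zero. Averaging the per-pixel curves over a frame yields one activity curve per sample. A minimal sketch (the paper may use a higher-order variant):

```python
import numpy as np

def structure_function(series, taus):
    """First-order temporal structure function of an intensity time series:
    S(tau) = mean over t of |I(t + tau) - I(t)|, for each lag in `taus`."""
    s = np.asarray(series, dtype=float)
    return np.array([np.mean(np.abs(s[tau:] - s[:-tau])) for tau in taus])
```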

  19. A New Approach for Speckle Reduction in Holographic 3D printer

    International Nuclear Information System (INIS)

    Utsugi, Takeru; Yamaguchi, Masahiro

    2013-01-01

    A holographic 3D printer produces a high-quality 3D image reproduced by a full-color, full-parallax holographic stereogram with high-density light-ray recording. However, speckle noise localized behind the reconstructed image causes a loss of display quality. This noise originates from the speckle generated by a diffuser used to equalize the intensity distribution of the object light on the recording medium. We analyze some conventional approaches to speckle reduction using a band-limited diffuser and find that they cannot reduce the noise sufficiently. We then propose two methods: one introduces a moving diffuser, and the other introduces multiple exposures and a digital diffuser referred to as 4L-PRPS.

  20. A decade of innovation with laser speckle metrology

    Science.gov (United States)

    Ettemeyer, Andreas

    2003-05-01

    Speckle pattern interferometry has evolved from an experimental substitute for holographic interferometry into a powerful problem-solving tool in research and industry. The rapid development of computer and digital imaging techniques, in combination with miniaturization of the optical equipment, led to new applications which had not been anticipated before. While classical holographic interferometry always required careful consideration of environmental conditions such as vibration, noise and light, and could generally only be performed in the optical laboratory, it is now state of the art to operate portable speckle measuring equipment at almost any location. During the last decade, this change in design and technique has dramatically influenced the range of applications of speckle metrology and opened new markets. The integration of recent research results into speckle measuring equipment has led to handy devices, simplified operation and high-quality data output.

  1. Speckle-illuminated fluorescence confocal microscopy, using a digital micro-mirror device

    International Nuclear Information System (INIS)

    Jiang, Shi-Hong; Walker, John G

    2009-01-01

    An implementation of a speckle-illuminated fluorescence confocal microscope using a digital micro-mirror device (DMD) is described. The DMD not only projects a sequence of imaged binary speckle patterns onto the specimen at a very high frame rate but also operates as a spatial light modulator to perform real-time optical data processing. Frame averaging is accomplished by CCD charge accumulation during a single exposure. The recorded time-averaged image is a confocal image plus an unwanted non-confocal image which can be removed by recording a separate image. Experimental results with image acquisition within a fraction of a second are shown. Images of a thin biological sample are also shown to demonstrate practical application of the technique

  2. Study of nanometer-level precise phase-shift system used in electronic speckle shearography and phase-shift pattern interferometry

    Science.gov (United States)

    Jing, Chao; Liu, Zhongling; Zhou, Ge; Zhang, Yimo

    2011-11-01

    A nanometer-level precision phase-shift system is designed to realize phase-shift interferometry in electronic speckle shearography pattern interferometry. A PZT is used as the driving component of the phase-shift system, and a flexure-hinge translation stage is developed to realize friction-free, clearance-free micro-displacement. A closed-loop control system is designed for high-precision micro-displacement, in which an embedded digital controller executes the control algorithm and a capacitive sensor provides real-time displacement feedback. The dynamic model and control model of the phase-shift system are analyzed, and on this basis high-precision micro-displacement is realized with a digital PID control algorithm. Experiments show that the positioning error of the system is less than 2 nm for step displacement commands and less than 5 nm for continuous displacement commands, which satisfies the requirements of electronic speckle shearography and phase-shift pattern interferometry. Fringe images from four-step phase-shift interferometry and the final phase map correlated with object deformation are presented to prove the validity of the nanometer-level precision phase-shift system.
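The digital PID loop at the heart of such a stage can be sketched in a few lines. This is a textbook discrete positional PID closed around a deliberately crude toy actuator model; the gains, sample time and plant model are hypothetical, not the paper's embedded controller.

```python
class PID:
    """Textbook discrete positional PID controller."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt          # accumulated error (I term)
        deriv = (err - self.prev_err) / self.dt # error rate (D term)
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# close the loop on a toy integrator model of the PZT stage: the capacitive
# sensor reading is `pos` (nm), the command is a 100 nm step
pid = PID(kp=0.8, ki=2.0, kd=0.0, dt=0.001)
pos = 0.0
for _ in range(5000):
    drive = pid.update(100.0, pos)
    pos += 0.01 * drive          # toy actuator: displacement per unit drive
```

In the real system the loop would run on the embedded controller at a fixed sample rate, with the capacitive sensor supplying `measured` and the PZT amplifier receiving `drive`.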

  3. Supervised detection of exoplanets in high-contrast imaging sequences

    Science.gov (United States)

    Gomez Gonzalez, C. A.; Absil, O.; Van Droogenbroeck, M.

    2018-06-01

    Context. Post-processing algorithms play a key role in pushing the detection limits of high-contrast imaging (HCI) instruments. State-of-the-art image processing approaches for HCI enable the production of science-ready images relying on unsupervised learning techniques, such as low-rank approximations, for generating a model point spread function (PSF) and subtracting the residual starlight and speckle noise. Aims: In order to maximize the detection rate of HCI instruments and survey campaigns, advanced algorithms with higher sensitivities to faint companions are needed, especially for the speckle-dominated innermost region of the images. Methods: We propose a reformulation of the exoplanet detection task (for ADI sequences) that builds on well-established machine learning techniques to take HCI post-processing from an unsupervised to a supervised learning context. In this new framework, we present algorithmic solutions using two different discriminative models: SODIRF (random forests) and SODINN (neural networks). We test these algorithms on real ADI datasets from VLT/NACO and VLT/SPHERE HCI instruments. We then assess their performances by injecting fake companions and using receiver operating characteristic analysis. This is done in comparison with state-of-the-art ADI algorithms, such as ADI principal component analysis (ADI-PCA). Results: This study shows the improved sensitivity versus specificity trade-off of the proposed supervised detection approach. At the diffraction limit, SODINN improves the true positive rate by a factor ranging from 2 to 10 (depending on the dataset and angular separation) with respect to ADI-PCA when working at the same false-positive level. Conclusions: The proposed supervised detection framework outperforms state-of-the-art techniques in the task of discriminating planet signal from speckles. 
In addition, it offers the possibility of re-processing existing HCI databases to maximize their scientific return and potentially improve
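The comparison above fixes the false-positive level and compares true-positive rates across detectors. As an illustration only (synthetic Gaussian detection scores, not the SODINN/SODIRF models), a minimal sketch of that ROC-style comparison:

```python
import numpy as np

def tpr_at_fpr(scores_pos, scores_neg, target_fpr=0.01):
    """True-positive rate when the detection threshold is chosen so the
    empirical false-positive rate equals target_fpr."""
    thresh = np.quantile(scores_neg, 1.0 - target_fpr)
    return float(np.mean(scores_pos > thresh))

rng = np.random.default_rng(0)
neg = rng.normal(0.0, 1.0, 100_000)      # residual-speckle (no planet) scores
weak = rng.normal(2.0, 1.0, 10_000)      # companion scores, less sensitive detector
strong = rng.normal(4.0, 1.0, 10_000)    # same companions, more sensitive detector

tpr_a = tpr_at_fpr(weak, neg)
tpr_b = tpr_at_fpr(strong, neg)          # higher TPR at the same FPR
```

At a matched false-positive rate, the detector with better-separated score distributions recovers a larger fraction of the injected companions, which is exactly the comparison the abstract reports.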

  4. Laser speckle imaging identification of increases in cortical microcirculatory blood flow induced by motor activity during awake craniotomy

    NARCIS (Netherlands)

    Klijn, Eva; Hulscher, Hester C.; Balvers, Rutger K.; Holland, Wim P. J.; Bakker, Jan; Vincent, Arnaud J. P. E.; Dirven, Clemens M. F.; Ince, Can

    2013-01-01

    The goal of awake neurosurgery is to maximize resection of brain lesions with minimal injury to functional brain areas. Laser speckle imaging (LSI) is a noninvasive macroscopic technique with high spatial and temporal resolution used to monitor changes in capillary perfusion. In this study, the

  5. Early diagnosis of teeth erosion using polarized laser speckle imaging

    Science.gov (United States)

    Nader, Christelle Abou; Pellen, Fabrice; Loutfi, Hadi; Mansour, Rassoul; Jeune, Bernard Le; Brun, Guy Le; Abboud, Marie

    2016-07-01

    Dental erosion starts with a chemical attack on dental tissue causing tooth demineralization, altering the tooth structure and making it more sensitive to mechanical erosion. Medical diagnosis of dental erosion is commonly achieved through visual inspection by the dentist during dental checkups and is therefore highly dependent on the operator's experience. Detecting this condition at a preliminary stage is important since, once the damage is done, care becomes more complicated. We investigate the difference in light-scattering properties between healthy and eroded teeth. A change in light-scattering properties is observed, and a transition from volume to surface backscattering is detected by means of polarized laser speckle imaging as teeth undergo acid etching, suggesting an increase in enamel surface roughness.

  6. A novel effective method for the assessment of microvascular function in male patients with coronary artery disease: a pilot study using laser speckle contrast imaging

    Energy Technology Data Exchange (ETDEWEB)

    Borges, J.P. [Laboratório de Atividade Física e Promoção à Saúde, Departamento de Desporto Coletivo, Instituto de Educação Física e Desportos, Universidade do Estado do Rio de Janeiro, Rio de Janeiro, RJ (Brazil); Lopes, G.O. [Laboratório de Atividade Física e Promoção à Saúde, Departamento de Desporto Coletivo, Instituto de Educação Física e Desportos, Universidade do Estado do Rio de Janeiro, Rio de Janeiro, RJ (Brazil); Instituto Nacional de Cardiologia, Rio de Janeiro, RJ (Brazil); Verri, V.; Coelho, M.P.; Nascimento, P.M.C.; Kopiler, D.A. [Instituto Nacional de Cardiologia, Rio de Janeiro, RJ (Brazil); Tibirica, E. [Instituto Nacional de Cardiologia, Rio de Janeiro, RJ (Brazil); Laboratório de Investigação Cardiovascular, Departamento Osório de Almeida, Instituto Oswaldo Cruz, FIOCRUZ, Rio de Janeiro, RJ (Brazil)

    2016-09-01

    Evaluation of microvascular endothelial function is essential for investigating the pathophysiology and treatment of cardiovascular and metabolic diseases. Although laser speckle contrast imaging technology is well accepted as a noninvasive methodology for assessing microvascular endothelial function, it has never been used to compare male patients with coronary artery disease with male age-matched healthy controls. Thus, the aim of this study was to determine whether laser speckle contrast imaging could be used to detect differences in the systemic microvascular functions of patients with established cardiovascular disease (n=61) and healthy age-matched subjects (n=24). Cutaneous blood flow was assessed in the skin of the forearm using laser speckle contrast imaging coupled with the transdermal iontophoretic delivery of acetylcholine and post-occlusive reactive hyperemia. The maximum increase in skin blood flow induced by acetylcholine was significantly reduced in the cardiovascular disease patients compared with the control subjects (74 vs 116%; P<0.01). With regard to post-occlusive reactive hyperemia-induced vasodilation, the patients also presented reduced responses compared to the controls (0.42±0.15 vs 0.50±0.13 APU/mmHg; P=0.04). In conclusion, laser speckle contrast imaging can identify endothelial and microvascular dysfunctions in male individuals with cardiovascular disease. Thus, this technology appears to be an efficient non-invasive technique for evaluating systemic microvascular and endothelial functions, which could be valuable as a peripheral marker of atherothrombotic diseases in men.

  7. A novel effective method for the assessment of microvascular function in male patients with coronary artery disease: a pilot study using laser speckle contrast imaging

    International Nuclear Information System (INIS)

    Borges, J.P.; Lopes, G.O.; Verri, V.; Coelho, M.P.; Nascimento, P.M.C.; Kopiler, D.A.; Tibirica, E.

    2016-01-01

    Evaluation of microvascular endothelial function is essential for investigating the pathophysiology and treatment of cardiovascular and metabolic diseases. Although laser speckle contrast imaging technology is well accepted as a noninvasive methodology for assessing microvascular endothelial function, it has never been used to compare male patients with coronary artery disease with male age-matched healthy controls. Thus, the aim of this study was to determine whether laser speckle contrast imaging could be used to detect differences in the systemic microvascular functions of patients with established cardiovascular disease (n=61) and healthy age-matched subjects (n=24). Cutaneous blood flow was assessed in the skin of the forearm using laser speckle contrast imaging coupled with the transdermal iontophoretic delivery of acetylcholine and post-occlusive reactive hyperemia. The maximum increase in skin blood flow induced by acetylcholine was significantly reduced in the cardiovascular disease patients compared with the control subjects (74 vs 116%; P<0.01). With regard to post-occlusive reactive hyperemia-induced vasodilation, the patients also presented reduced responses compared to the controls (0.42±0.15 vs 0.50±0.13 APU/mmHg; P=0.04). In conclusion, laser speckle contrast imaging can identify endothelial and microvascular dysfunctions in male individuals with cardiovascular disease. Thus, this technology appears to be an efficient non-invasive technique for evaluating systemic microvascular and endothelial functions, which could be valuable as a peripheral marker of atherothrombotic diseases in men

  8. Analysis of the speckle properties in a laser projection system based on a human eye model.

    Science.gov (United States)

    Cui, Zhe; Wang, Anting; Ma, Qianli; Ming, Hai

    2014-03-01

    In this paper, the properties of the speckle that is observed by humans in laser projection systems are theoretically analyzed. The speckle pattern on the fovea of the human retina is numerically simulated by introducing a chromatic human eye model. The results show that the speckle contrast experienced by humans is affected by the light intensity of the projected images and the wavelength of the laser source when considering the paracentral vision. Furthermore, the image quality is also affected by these two parameters. We believe that these results are useful for evaluating the speckle noise in laser projection systems.

  9. Speckle reduction methods in laser-based picture projectors

    Science.gov (United States)

    Akram, M. Nadeem; Chen, Xuyuan

    2016-02-01

    Laser sources have long been promised as better light sources than traditional lamps or light-emitting diodes (LEDs) for projectors: they enable a wide colour gamut for vivid images, high brightness and contrast for the best picture quality, long lifetime for maintenance-free operation, mercury-free construction, and low power consumption. A major technological obstacle to using lasers for projection has been the speckle noise caused by the coherent nature of laser light. For speckle reduction, current state-of-the-art solutions employ moving parts with large physical space demands. Solutions beyond the state of the art, such as integrated optical components, hybrid MOEMS devices, and active phase modulators, need to be developed for compact speckle reduction. In this article, the major methods reported in the literature for speckle reduction in laser projectors are presented and explained. With the advancement of semiconductor lasers at greatly reduced cost for the red, green, and blue primary colours, and with the methods developed for their speckle reduction, it is hoped that lasers will be widely used in projector applications in the near future.

  10. MODERN POSSIBILITIES OF SPECKLE TRACKING ECHOCARDIOGRAPHY IN CLINICAL PRACTICE

    Directory of Open Access Journals (Sweden)

    V. S. Nikiforov

    2017-01-01

    Speckle-tracking echocardiography is a promising modern technique for evaluating structural and functional changes in the myocardium. It evaluates global longitudinal myocardial deformation, which is more sensitive than ejection fraction to early changes in left ventricular contractility. The diagnostic capabilities of speckle-tracking echocardiography are reflected in the clinical recommendations and consensus statements of the European Society of Cardiology (ESC), the European Association of Cardiovascular Imaging (EACVI), and the American Society of Echocardiography (ASE). The aim of this paper is to describe the basic principles of speckle-tracking echocardiography and the clinical applications of this new technology. Attention is paid to the use of speckle-tracking echocardiography in heart pathologies such as heart failure, coronary heart disease and myocardial infarction, left ventricular hypertrophy in arterial hypertension, hypertrophic cardiomyopathy and amyloidosis of the heart, valvular heart disease, constrictive pericarditis, and cancer therapy-induced cardiotoxicity.

  11. In-vivo brain blood flow imaging based on laser speckle contrast imaging and synchrotron radiation microangiography

    International Nuclear Information System (INIS)

    Miao, Peng; Feng, Shihan; Zhang, Qi; Lin, Xiaojie; Xie, Bohua; Liu, Chenwei; Yang, Guo-Yuan

    2014-01-01

    In-vivo imaging of blood flow in the cortex and sub-cortex is still a challenge in biological and pathological studies of cerebral vascular diseases. Laser speckle contrast imaging (LSCI) provides only cortical blood flow information, while traditional synchrotron radiation microangiography (SRA) provides sub-cortical vasculature information with high resolution. In this study, a bolus front-tracking method was developed to extract blood flow information from SRA. Combining LSCI and SRA, arterial blood flow in the ipsilateral cortex and sub-cortex was monitored after experimental intracerebral hemorrhage in mice. At 72 h after injury, a significant blood flow increase was observed in the lenticulostriate artery, along with a blood flow decrease in cortical branches of the middle cerebral artery. This combined strategy provides a new approach for the investigation of brain vasculature and blood flow changes in preclinical studies.

  12. Novel medical image enhancement algorithms

    Science.gov (United States)

    Agaian, Sos; McClendon, Stephen A.

    2010-01-01

    In this paper, we present two novel medical image enhancement algorithms. The first, a global image enhancement algorithm, utilizes an alpha-trimmed mean filter as its backbone to sharpen images. The second algorithm uses a cascaded unsharp masking technique to separate the high frequency components of an image in order for them to be enhanced using a modified adaptive contrast enhancement algorithm. Experimental results from enhancing electron microscopy, radiological, CT scan and MRI scan images, using the MATLAB environment, are then compared to the original images as well as other enhancement methods, such as histogram equalization and two forms of adaptive contrast enhancement. An image processing scheme for electron microscopy images of Purkinje cells will also be implemented and utilized as a comparison tool to evaluate the performance of our algorithm.
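The alpha-trimmed mean filter mentioned above discards the extreme values in each window before averaging, which suppresses impulse-like outliers while still smoothing. A minimal NumPy sketch (window size and trim count are illustrative, not the paper's settings):

```python
import numpy as np

def alpha_trimmed_mean_filter(img, size=3, trim=1):
    """Slide a size x size window over the image; in each window, discard the
    `trim` smallest and `trim` largest values and average the rest."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            window = np.sort(padded[i:i + size, j:j + size].ravel())
            out[i, j] = window[trim:window.size - trim].mean()
    return out

noisy = np.array([[10., 10., 10.],
                  [10., 255., 10.],   # impulse outlier
                  [10., 10., 10.]])
smoothed = alpha_trimmed_mean_filter(noisy, size=3, trim=1)
```

With trim=0 this reduces to a plain mean filter, and with maximal trimming it approaches the median filter; the trim count tunes that trade-off.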

  13. Understanding the exposure-time effect on speckle contrast measurements for laser displays

    Science.gov (United States)

    Suzuki, Koji; Kubota, Shigeo

    2018-02-01

    To evaluate the influence of exposure time on speckle noise in laser displays, a speckle contrast measurement method operating at the human eye's response time was developed, using a high-sensitivity camera with a signal-multiplying function. The nonlinearity of the camera's light sensitivity was calibrated so that speckle contrast could be measured accurately, and the lower measurement limit was improved by applying a spatial-frequency low-pass filter to the captured images. Three commercially available laser displays were measured over a wide range of exposure times, from tens of milliseconds to several seconds, without adjusting the display brightness. The speckle contrast of a raster-scanned mobile projector without any speckle-reduction device was nearly constant over the range of exposure times. In contrast, for full-frame projection laser displays equipped with a temporally averaging speckle-reduction device, speckle contrasts close to the lower measurement limit increased slightly at shorter exposure times due to noise. As a result, the exposure-time effect on speckle contrast could not be observed in our measurements, although it is more reasonable to expect that the speckle contrast of laser displays equipped with a temporally averaging speckle-reduction device depends on exposure time. This discrepancy may be attributed to underestimation of the temporal averaging factor. We expect this method to be useful for evaluating various laser displays and for clarifying the relationship between speckle noise and exposure time in further verification of speckle reduction.
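Speckle contrast is conventionally defined as K = σ_I/⟨I⟩, which approaches 1 for fully developed speckle and falls as independent speckle patterns are averaged over the exposure. A small NumPy illustration of both facts, using synthetic negative-exponential speckle rather than measured display data:

```python
import numpy as np

def speckle_contrast(intensity):
    """K = sigma_I / <I>; K ~ 1 for fully developed polarized speckle."""
    return float(intensity.std() / intensity.mean())

rng = np.random.default_rng(1)
# Fully developed polarized speckle has negative-exponential intensity statistics.
speckle = rng.exponential(scale=1.0, size=(512, 512))
k = speckle_contrast(speckle)            # close to 1

# A long exposure that averages 25 independent patterns behaves like a
# temporally averaging speckle-reduction device: K drops to ~1/sqrt(25).
blurred = rng.exponential(1.0, size=(25, 512, 512)).mean(axis=0)
k_blurred = speckle_contrast(blurred)    # close to 0.2
```

This 1/√N decay is why a temporally averaging reduction device is expected to show exposure-time dependence, as the abstract argues.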

  14. Nephron blood flow dynamics measured by laser speckle contrast imaging

    DEFF Research Database (Denmark)

    von Holstein-Rathlou, Niels-Henrik; Sosnovtseva, Olga V; Pavlov, Alexey N

    2011-01-01

    Tubuloglomerular feedback (TGF) has an important role in autoregulation of renal blood flow and glomerular filtration rate (GFR). Because of the characteristics of signal transmission in the feedback loop, the TGF undergoes self-sustained oscillations in single-nephron blood flow, GFR, and tubular pressure and flow. Nephrons interact by exchanging electrical signals conducted electrotonically through cells of the vascular wall, leading to synchronization of the TGF-mediated oscillations. Experimental studies of these interactions have been limited to observations on two or at most three nephrons simultaneously. The interacting nephron fields are likely to be more extensive. We have turned to laser speckle contrast imaging to measure the blood flow dynamics of 50-100 nephrons simultaneously on the renal surface of anesthetized rats. We report the application of this method and describe analytic…

  15. Patch Similarity Modulus and Difference Curvature Based Fourth-Order Partial Differential Equation for Image Denoising

    Directory of Open Access Journals (Sweden)

    Yunjiao Bai

    2015-01-01

    The traditional fourth-order nonlinear diffusion denoising model suffers from isolated speckles and the loss of fine details in the processed image. For this reason, a new fourth-order partial differential equation based on the patch similarity modulus and the difference curvature is proposed for image denoising. First, based on the intensity similarity of neighboring pixels, this paper presents a new edge indicator called the patch similarity modulus, which is strongly robust to noise. Furthermore, the difference curvature, which can effectively distinguish between edges and noise, is incorporated into the denoising algorithm to steer the diffusion process by adaptively adjusting the size of the diffusion coefficient. The experimental results show that the proposed algorithm not only preserves edges and texture details but also avoids isolated speckles and the staircase effect while filtering out noise, and it performs particularly well on images with abundant detail. Additionally, the subjective visual quality and objective evaluation indices of images denoised by the proposed algorithm are higher than those of related methods.
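The paper's fourth-order PDE with the patch similarity modulus is not reproduced here, but the underlying idea of edge-adaptive diffusion can be sketched with a simpler second-order Perona-Malik-style scheme, in which the diffusion coefficient shrinks wherever local differences are large (edges) and stays near 1 in flat regions (noise):

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, dt=0.15):
    """Perona-Malik-style diffusion: coefficient g = 1/(1+(|d|/kappa)^2)
    per neighbour difference d, so smoothing acts mostly in flat regions.
    Borders are treated as periodic via np.roll, adequate for this sketch."""
    u = img.astype(float).copy()
    g = lambda d: 1.0 / (1.0 + (d / kappa) ** 2)
    for _ in range(n_iter):
        dn = np.roll(u, 1, axis=0) - u    # differences to the four neighbours
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

rng = np.random.default_rng(2)
clean = np.zeros((64, 64)); clean[:, 32:] = 100.0     # vertical step edge
noisy = clean + rng.normal(0, 5, clean.shape)
denoised = anisotropic_diffusion(noisy)                # noise down, edge kept
```

The fourth-order model in the abstract replaces the gradient-based indicator with the patch similarity modulus and difference curvature precisely to make this edge/noise decision more robust.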

  16. Speckle Filtering of GF-3 Polarimetric SAR Data with Joint Restriction Principle.

    Science.gov (United States)

    Xie, Jinwei; Li, Zhenfang; Zhou, Chaowei; Fang, Yuyuan; Zhang, Qingjun

    2018-05-12

    Polarimetric SAR (PolSAR) scattering characteristics of imagery are always obtained from second-order moment estimation of multi-polarization data, that is, estimation of the covariance or coherency matrices. Because of the extra path lengths traveled by signals reflected from separate scatterers within a resolution cell, speckle noise is always present in SAR images and severely degrades the scattering performance, especially in single-look complex images. In order to achieve high accuracy in estimating the covariance or coherency matrices, three aspects are taken into consideration: (1) the edges and texture of the scene should remain distinct after speckle filtering; (2) the pixels used should have statistical characteristics similar to those of the object pixel; and (3) the polarimetric scattering signature should be preserved, in addition to speckle reduction. In this paper, a joint restriction principle is proposed to meet these requirements. Three different restriction principles are introduced into the speckle-filtering process. First, a new template, more suitable for point or line targets, is designed to ensure morphological consistency. Then, the extended sigma filter is used to restrict the pixels in the aforementioned template to an identical statistical characteristic. Finally, a polarimetric similarity factor is applied to the same pixels to guarantee similar polarimetric features among the candidate pixels. This processing procedure is named speckle filtering with a joint restriction principle, and the approach is applied to GF-3 polarimetric SAR data acquired in San Francisco, CA, USA. Its effectiveness in keeping image sharpness and preserving the scattering mechanism, as well as reducing speckle, is validated by comparison with boxcar filters and the refined Lee filter.

  17. Accelerated numerical processing of electronically recorded holograms with reduced speckle noise.

    Science.gov (United States)

    Trujillo, Carlos; Garcia-Sucerquia, Jorge

    2013-09-01

    The numerical reconstruction of digitally recorded holograms suffers from speckle noise. An accelerated method that uses general-purpose computing in graphics processing units to reduce that noise is shown. The proposed methodology utilizes parallelized algorithms to record, reconstruct, and superimpose multiple uncorrelated holograms of a static scene. For the best tradeoff between reduction of the speckle noise and processing time, the method records, reconstructs, and superimposes six holograms of 1024 × 1024 pixels in 68 ms; for this case, the methodology reduces the speckle noise by 58% compared with that exhibited by a single hologram. The fully parallelized method running on a commodity graphics processing unit is one order of magnitude faster than the same technique implemented on a regular CPU using its multithreading capabilities. Experimental results are shown to validate the proposal.
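The reported 58% noise reduction from six holograms is consistent with the 1/√N contrast law for averaging N uncorrelated speckle patterns (1 − 1/√6 ≈ 0.59). A quick numerical check with synthetic unit-mean negative-exponential speckle, standing in for the uncorrelated reconstructions:

```python
import numpy as np

rng = np.random.default_rng(3)
contrast = lambda I: I.std() / I.mean()   # speckle contrast K

# Six statistically independent speckle realizations of the same static scene.
frames = rng.exponential(1.0, size=(6, 512, 512))

single = contrast(frames[0])               # ~1.0 for fully developed speckle
averaged = contrast(frames.mean(axis=0))   # ~1/sqrt(6) ~ 0.41
reduction = 1.0 - averaged / single        # ~0.59, close to the reported 58%
```

Each extra uncorrelated hologram buys progressively less reduction, which is why the paper settles on six as the best trade-off against processing time.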

  18. Laser-induced speckle scatter patterns in Bacillus colonies

    Directory of Open Access Journals (Sweden)

    Huisung Kim

    2014-10-01

    Label-free bacterial colony phenotyping technology called BARDOT (BActerial Rapid Detection using Optical scattering Technology) has successfully classified several different bacteria at the genus, species, and serovar level. Recent experiments with colonies of Bacillus species revealed strikingly different elastic light scatter (ELS) patterns, comprised of random speckles, in contrast to other bacteria, whose patterns are dominated by concentric rings and spokes. Since this laser-based optical sensor interrogates the whole volume of the colony, 3-D information on micro- and macro-structures is encoded in the far-field scatter patterns. Here, we present a theoretical model explaining the underlying mechanism of speckle formation by colonies of Bacillus species. Except for Bacillus polymyxa, all Bacillus spp. produced random bright spots on the imaging plane, which presumably depend on the cellular and molecular organization and content within the colony. Our scatter-model-based analysis revealed that colony spread, resulting in variable surface roughness, can modify the wavefront of the scattered field. As the center diameter of a Bacillus spp. colony grew from 500 μm to 900 μm, the average speckle area decreased 2-fold and the number of small speckles increased 7-fold. In conclusion, as a Bacillus colony grows, the average speckle size in the scatter pattern decreases and the number of smaller speckles increases due to the swarming growth characteristics of the bacteria within the colony.

  19. Etendue invariance in speckle fields

    International Nuclear Information System (INIS)

    Medina, F.F.; Garcia-Sucerquia, J.; Henao, R.; Trivi, M.

    2000-04-01

    Experimental evidence is shown that confirms the etendue invariance in speckle fields. Because of this condition, the coherence patch of the speckle field can be significantly larger than the mean size of the speckles, as is shown by double-exposure speckle interferometry. (author)

  20. Optical Coherence Tomography Technology and Quality Improvement Methods for Optical Coherence Tomography Images of Skin: A Short Review

    Science.gov (United States)

    Adabi, Saba; Turani, Zahra; Fatemizadeh, Emad; Clayton, Anne; Nasiriavanaki, Mohammadreza

    2017-01-01

    Optical coherence tomography (OCT) delivers 3-dimensional images of tissue microstructures. Although OCT imaging offers a promising high-resolution method, OCT images suffer from artifacts that can lead to misapprehension of tissue structures. Speckle, intensity decay, and blurring are the 3 major artifacts in OCT images. Speckle is due to the low-coherence light source used in the configuration of OCT, intensity decay is the deterioration of the light signal with depth, and blurring is the consequence of deficiencies of optical components. In this short review, we summarize some of the image enhancement algorithms for OCT images which address the abovementioned artifacts. PMID:28638245

  1. Optical Coherence Tomography Technology and Quality Improvement Methods for Optical Coherence Tomography Images of Skin: A Short Review

    Directory of Open Access Journals (Sweden)

    Saba Adabi

    2017-06-01

    Optical coherence tomography (OCT) delivers 3-dimensional images of tissue microstructures. Although OCT imaging offers a promising high-resolution method, OCT images suffer from artifacts that can lead to misapprehension of tissue structures. Speckle, intensity decay, and blurring are the 3 major artifacts in OCT images. Speckle is due to the low-coherence light source used in the configuration of OCT, intensity decay is the deterioration of the light signal with depth, and blurring is the consequence of deficiencies of optical components. In this short review, we summarize some of the image enhancement algorithms for OCT images which address the abovementioned artifacts.

  2. Speckle reduction in optical coherence tomography images of human skin by a spatial diversity method - art. no. 66270P

    DEFF Research Database (Denmark)

    Jørgensen, Thomas Martini; Thrane, Lars; Mogensen, M.

    2007-01-01

    …system. Here, we consider a method that in principle can be fitted to most OCT systems without major modifications. Specifically, we address a spatial diversity technique for suppressing speckle noise in OCT images of human skin. The method is a variant of changing the position of the sample relative… the scheme with a mobile fiber-based time-domain real-time OCT system. Essential enhancement was obtained in image contrast when performing in vivo imaging of normal skin and lesions. Resulting images show improved delineation of structure, in correspondence with the observed improvements in contrast…

  3. New developments in NDT through electronic speckle pattern interferometry

    International Nuclear Information System (INIS)

    Mohan, S.; Murugesan, P; Mas, R.H.

    2007-01-01

    …the experiment was carried out using mechanical and thermal loading techniques. Different types of defects were introduced into the test specimen, and the corresponding interference fringe patterns were studied for the identification and detection of defects. The nature of the fringe anomalies in the presence of defects under mechanical and thermal loading was studied for the characterization of defects. This technique has also been used successfully in medical diagnosis. Endoscopy is a minimally invasive diagnostic medical procedure used to evaluate the interior surfaces of an organ by inserting a small tube into the body, often, but not necessarily, through a natural body opening. Through the endoscope, one can see lesions and other surface conditions. An endoscopic electronic speckle pattern interferometer camera can be applied to examine objects as well as for in vitro and in vivo minimally invasive medical diagnostics. The combination of holographic interferometric metrology with endoscopic imaging allows the development of a special class of instruments for nondestructive quantitative diagnostics within body cavities. Developments in digital imaging, digital holographic interferometry, and electronic speckle pattern interferometry are very useful in medical diagnosis. In the present investigation, nondestructive dynamic holographic endoscopy was used to study disturbances of stomach wall intensity using speckle images. Speckle images were recorded at different portions of the human stomach and esophagus. It is concluded that this is a very good method for studying deformations and abnormalities in the stomach and related organs. Speckle interferometry is a very useful tool in the biological and medical fields for studying deformations and displacements in tissues and related parameters. (author)

  4. Despeckling Polsar Images Based on Relative Total Variation Model

    Science.gov (United States)

    Jiang, C.; He, X. F.; Yang, L. J.; Jiang, J.; Wang, D. Y.; Yuan, Y.

    2018-04-01

    The relative total variation (RTV) algorithm, which can effectively separate structure from texture in an image, is employed to extract the main structures of the image. However, applying RTV directly to polarimetric SAR (PolSAR) image filtering does not preserve polarimetric information. A new RTV approach based on the complex Wishart distribution is therefore proposed that takes the polarimetric properties of PolSAR into account. The proposed polarimetric RTV (PolRTV) algorithm can be used for PolSAR image filtering. L-band Airborne SAR (AIRSAR) San Francisco data are used to demonstrate the effectiveness of the proposed algorithm in speckle suppression, structural information preservation, and polarimetric property preservation.

  5. Speckle Reduction for Ultrasonic Imaging Using Frequency Compounding and Despeckling Filters along with Coded Excitation and Pulse Compression

    Directory of Open Access Journals (Sweden)

    Joshua S. Ullom

    2012-01-01

    A method for improving the contrast-to-noise ratio (CNR) while maintaining the −6 dB axial resolution of ultrasonic B-mode images is proposed. The proposed technique, known as eREC-FC, enhances the recently developed REC-FC technique, which combines the coded-excitation technique known as resolution enhancement compression (REC) with the speckle-reduction technique of frequency compounding (FC). In REC-FC, image CNR is improved but at the expense of a reduction in axial resolution. However, by compounding various REC-FC images made from various subband widths, the tradeoff between axial resolution and CNR enhancement can be extended. Further improvements in CNR can be obtained by applying post-processing despeckling filters to the eREC-FC B-mode images. The despeckling filters evaluated were the median, Lee, homogeneous mask area, geometric, and speckle-reducing anisotropic diffusion (SRAD) filters. Simulations and experimental measurements were conducted with a single-element transducer (f/2.66) having a center frequency of 2.25 MHz and a −3 dB bandwidth of 50%. In simulations and experiments, the eREC-FC technique resulted in the same axial resolution that would typically be observed with conventional pulsed excitation. Moreover, increases in CNR of 348% were obtained in experiments when comparing eREC-FC with a Lee filter to conventional pulsing methods.
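Of the despeckling filters listed, the Lee filter is the easiest to sketch: each pixel is blended with its local mean according to how much the local variance exceeds an assumed noise variance, so flat regions are smoothed while high-variance edges pass through nearly unfiltered. A hedged NumPy version (additive-noise form; parameters are illustrative, not the paper's):

```python
import numpy as np

def lee_filter(img, size=5, noise_var=None):
    """Local-statistics (Lee) filter: out = m + w*(x - m), with
    w = clip(1 - noise_var/local_var, 0, 1). Flat areas -> local mean;
    high-variance areas (edges) -> nearly unchanged."""
    pad = size // 2
    padded = np.pad(img, pad, mode="reflect")
    win = np.lib.stride_tricks.sliding_window_view(padded, (size, size))
    m = win.mean(axis=(-2, -1))            # local mean
    v = win.var(axis=(-2, -1))             # local variance
    if noise_var is None:
        noise_var = np.median(v)           # crude noise-variance estimate
    w = np.clip(1.0 - noise_var / np.maximum(v, 1e-12), 0.0, 1.0)
    return m + w * (img - m)

rng = np.random.default_rng(4)
clean = np.zeros((64, 64)); clean[:, 32:] = 100.0   # step edge
noisy = clean + rng.normal(0, 10, clean.shape)
filtered = lee_filter(noisy, size=5, noise_var=100.0)
```

SRAD and the geometric filter pursue the same goal with diffusion-based machinery; the Lee filter's local-statistics weighting is the common baseline they are compared against.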

  6. Texture Based Quality Analysis of Simulated Synthetic Ultrasound Images Using Local Binary Patterns †

    Directory of Open Access Journals (Sweden)

    Prerna Singh

    2017-12-01

    Speckle noise reduction is an important area of research in the field of ultrasound image processing, and several algorithms for speckle noise characterization and analysis have recently been proposed. Synthetic ultrasound images can play a key role in noise evaluation methods, as they can be used to generate a variety of speckle noise models under different interpolation and sampling schemes and can also provide valuable ground-truth data for estimating the accuracy of the chosen methods. However, not much work has been done on modeling synthetic ultrasound images or on simulating speckle noise generation to obtain images that are as close as possible to real ultrasound images. An important aspect of simulated synthetic ultrasound images is the requirement for extensive quality assessment to ensure that they have the texture characteristics and gray-tone features of real images. This paper presents texture feature analysis of synthetic ultrasound images using local binary patterns (LBP) and demonstrates the usefulness of a set of LBP features for image quality assessment. Experimental results presented in the paper clearly show how these features can provide an accurate quality metric that correlates well with subjective evaluations performed by clinical experts.
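The LBP features used here can be illustrated with the basic 8-neighbour operator: each pixel receives an 8-bit code recording which neighbours are at least as bright as the centre, and the histogram of codes serves as the texture descriptor. A minimal sketch (not necessarily the paper's exact LBP variant):

```python
import numpy as np

def lbp_image(gray):
    """Basic 8-neighbour LBP: each interior pixel gets an 8-bit code,
    one bit per neighbour that is >= the centre value."""
    c = gray[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        nb = gray[1 + dy:gray.shape[0] - 1 + dy, 1 + dx:gray.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.uint8) << bit
    return code

def lbp_histogram(gray):
    """256-bin normalized histogram of LBP codes: the texture descriptor."""
    h = np.bincount(lbp_image(gray).ravel(), minlength=256)
    return h / h.sum()

peak = np.array([[5, 5, 5],
                 [5, 9, 5],
                 [5, 5, 5]], dtype=np.int32)
codes_peak = lbp_image(peak)      # local maximum -> code 0
flat = np.full((4, 4), 7)
codes_flat = lbp_image(flat)      # every neighbour >= centre -> code 255
```

Comparing LBP histograms of synthetic and real images (e.g. via a histogram distance) is one way to turn these codes into the quality metric the abstract describes.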

  7. Laser speckle imaging identification of increases in cortical microcirculatory blood flow induced by motor activity during awake craniotomy ; Clinical article

    NARCIS (Netherlands)

    E. Klijn (Elko); M.E.J.L. Hulscher (Marlies); R.K. Balvers (Rutger); W.P.J. Holland (Wim); J. Bakker (Jan); A.J.P.E. Vincent (Arnoud); C.M.F. Dirven (Clemens); C. Ince (Can)

    2013-01-01

    Object. The goal of awake neurosurgery is to maximize resection of brain lesions with minimal injury to functional brain areas. Laser speckle imaging (LSI) is a noninvasive macroscopic technique with high spatial and temporal resolution used to monitor changes in capillary perfusion. In

  8. Off-axis holographic laser speckle contrast imaging of blood vessels in tissues

    Science.gov (United States)

    Abdurashitov, Arkady; Bragina, Olga; Sindeeva, Olga; Sergey, Sindeev; Semyachkina-Glushkovskaya, Oxana V.; Tuchin, Valery V.

    2017-09-01

    Laser speckle contrast imaging (LSCI) has become one of the most common tools for functional imaging of tissues. Its incomplete theoretical description and the sophisticated interpretation its measurements require are outweighed by low-cost and simple hardware, speed, and consistent, repeatable results. Beyond the relatively small measurement volume, with a probing depth of around 700 μm for visible-spectrum illumination, conventional LSCI offers no depth selectivity; furthermore, with a high-NA objective, the actual penetration depth of light in tissue is greater than the depth of field (DOF) of the imaging system. Thus, information about these out-of-focus regions persists in the recorded frames but cannot be retrieved by intensity-based registration. We propose a simple modification of the LSCI system based on off-axis holography that introduces post-acquisition refocusing, overcoming both the depth-selectivity and DOF limitations and offering the potential to produce cross-sectional views of the specimen.

  9. SPECKLE CAMERA OBSERVATIONS FOR THE NASA KEPLER MISSION FOLLOW-UP PROGRAM

    International Nuclear Information System (INIS)

    Howell, Steve B.; Everett, Mark E.; Sherry, William; Horch, Elliott; Ciardi, David R.

    2011-01-01

    We present the first results from a speckle imaging survey of stars classified as candidate exoplanet host stars discovered by the Kepler mission. We use speckle imaging to search for faint companions or closely aligned background stars that could contribute flux to the Kepler light curves of their brighter neighbors. Background stars are expected to contribute significantly to the pool of false positive candidate transiting exoplanets discovered by the Kepler mission, especially in the case that the faint neighbors are eclipsing binary stars. Here, we describe our Kepler follow-up observing program, the speckle imaging camera used, our data reduction, and astrometric and photometric performance. Kepler stars range from R = 8 to 16 and our observations attempt to provide background non-detection limits 5-6 mag fainter and binary separations of ∼0.05-2.0 arcsec. We present data describing the relative brightness, separation, and position angles for secondary sources, as well as relative plate limits for non-detection of faint nearby stars around each of 156 target stars. Faint neighbors were found near 10 of the stars.

  10. Momentum transfer Monte Carlo for the simulation of laser speckle imaging and its application in the skin.

    Science.gov (United States)

    Regan, Caitlin; Hayakawa, Carole; Choi, Bernard

    2017-12-01

    Due to its simplicity and low cost, laser speckle imaging (LSI) has achieved widespread use in biomedical applications. However, interpretation of the blood-flow maps remains ambiguous, as LSI enables only limited visualization of vasculature below scattering layers such as the epidermis and skull. Here, we describe a computational model that enables flexible in-silico study of the impact of these factors on LSI measurements. The model uses Monte Carlo methods to simulate light and momentum transport in a heterogeneous tissue geometry. The virtual detectors of the model track several important characteristics of light. This model enables study of LSI aspects that may be difficult or unwieldy to address in an experimental setting, and enables detailed study of the fundamental origins of speckle contrast modulation in tissue-specific geometries. We applied the model to an in-depth exploration of the spectral dependence of speckle contrast signal in the skin, the effects of epidermal melanin content on LSI, and the depth-dependent origins of our signal. We found that LSI of transmitted light allows for a more homogeneous integration of the signal from the entire bulk of the tissue, whereas epi-illumination measurements of contrast are limited to a fraction of the light penetration depth. We quantified the spectral depth dependence of our contrast signal in the skin, and did not observe a statistically significant effect of epidermal melanin on speckle contrast. Finally, we corroborated these simulated results with experimental LSI measurements of flow beneath a thin absorbing layer. The results of this study suggest the use of LSI in the clinic to monitor perfusion in patients with different skin types, or inhomogeneous epidermal melanin distributions.

  11. Learnable despeckling framework for optical coherence tomography images

    Science.gov (United States)

    Adabi, Saba; Rashedi, Elaheh; Clayton, Anne; Mohebbi-Kalkhoran, Hamed; Chen, Xue-wen; Conforto, Silvia; Nasiriavanaki, Mohammadreza

    2018-01-01

    Optical coherence tomography (OCT) is a prevalent, interferometric, high-resolution imaging method with broad biomedical applications. Nonetheless, OCT images suffer from an artifact called speckle, which degrades image quality. Digital filters offer an opportunity for image improvement in clinical OCT devices, where hardware modification to enhance images is expensive. A wide variety of digital filters have been proposed to reduce speckle; selecting the most appropriate filter for a given OCT image or image set is a challenging decision, especially in dermatology applications of OCT, where a wide variety of tissues is imaged. To tackle this challenge, we propose an expandable, learnable despeckling framework, which we call LDF. LDF decides which speckle reduction algorithm is most effective on a given image by learning a figure of merit (FOM) as a single quantitative image assessment measure. LDF is learnable: when implemented on an OCT machine, it is retrained on each given image or image set and its performance improves. LDF is also expandable, meaning that any despeckling algorithm can easily be added to it. The architecture of LDF includes two main parts: (i) an autoencoder neural network and (ii) a filter classifier. The autoencoder learns the FOM from several quality assessment measures obtained from the OCT image, including signal-to-noise ratio, contrast-to-noise ratio, equivalent number of looks, edge preservation index, and mean structural similarity index. Subsequently, the filter classifier identifies the most efficient filter from the following categories: (a) sliding-window filters, including median, mean, and symmetric nearest neighborhood; (b) adaptive statistical filters, including Wiener, homomorphic Lee, and Kuwahara; and (c) edge-preserving patch- or pixel-correlation-based filters, including nonlocal means, total variation, and block-matching three-dimensional filtering.
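
Two of the quality measures the autoencoder consumes, signal-to-noise ratio and equivalent number of looks, are easy to sketch. The definitions below (mean signal power over background variance in dB, and mu^2/sigma^2 of a homogeneous region) are common conventions and may differ in detail from the paper's.

```python
import numpy as np

def snr_db(img, signal_mask, noise_mask):
    # Signal-to-noise ratio in dB: mean signal power over background variance.
    return 10 * np.log10(img[signal_mask].mean() ** 2 / img[noise_mask].var())

def enl(img, roi_mask):
    # Equivalent number of looks of a homogeneous ROI: mu^2 / sigma^2.
    r = img[roi_mask]
    return r.mean() ** 2 / r.var()

rng = np.random.default_rng(1)
img = np.ones((64, 64))
img[:, 32:] = 4.0                                  # brighter "tissue" region
img *= rng.gamma(4.0, 1.0 / 4.0, img.shape)        # multiplicative speckle, 4 looks
bg = np.zeros_like(img, dtype=bool); bg[:, :32] = True
looks = enl(img, bg)
snr_val = snr_db(img, ~bg, bg)
```

On the simulated 4-look speckle the ENL estimate lands near 4, which is exactly the kind of homogeneous-region statistic such a FOM aggregates.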

  12. Modeled and Measured Partially Coherent Illumination Speckle Effects from Sloped Surfaces for Tactical Tracking

    Science.gov (United States)

    2015-03-26

    ...the number of speckle samples obtained, laser power and coherence length, spot size, target reflectance, speckle size, and pixels per speckle width... the numerical model developed here and existing theory developed by Hu. A 671 nm diode laser source with a coherence length of 259 +/- 7 µm is reflected...

  13. En face speckle reduction in optical coherence microscopy by frequency compounding.

    Science.gov (United States)

    Magnain, Caroline; Wang, Hui; Sakadžić, Sava; Fischl, Bruce; Boas, David A

    2016-05-01

    We report the use of frequency compounding to significantly reduce speckle noise in optical coherence microscopy, more specifically on the en face images. This method relies on the fact that the speckle patterns recorded from different wavelengths simultaneously are independent; hence their summation yields significant reduction in noise, with only a single acquisition. The results of our experiments with microbeads show that the narrow confocal parameter, due to a high numerical aperture objective, restricts the axial resolution loss that would otherwise theoretically broaden linearly with the number of optical frequency bands used. This speckle reduction scheme preserves the lateral resolution since it is performed on individual A-scans. Finally, we apply this technique to images of fixed human brain tissue, showing significant improvements in contrast-to-noise ratio with only moderate loss of axial resolution, in an effort to improve automatic three-dimensional detection of cells and fibers in the cortex.
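
The core effect, summing independent speckle realizations from different optical frequency bands so that contrast falls roughly as the square root of the band count, can be illustrated with simulated speckle. The independence of the per-band patterns is assumed here, as in the abstract's argument; the band count is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
n_bands = 9
# One independent, fully developed speckle pattern per optical frequency band.
bands = rng.exponential(1.0, (n_bands, 128, 128))
single = bands[0]
compound = bands.mean(axis=0)          # incoherent sum over the bands

contrast = lambda im: im.std() / im.mean()
C_single = contrast(single)            # close to 1 for fully developed speckle
C_compound = contrast(compound)        # close to 1/sqrt(n_bands)
```

With nine bands the compounded contrast drops to roughly one third of the single-band value, the statistical basis of the noise reduction reported above.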

  14. Waveguide generated mitigation of speckle and scintillation on an actively illuminated target

    Science.gov (United States)

    Moore, Trevor D.; Raynor, Robert A.; Spencer, Mark F.; Schmidt, Jason D.

    2016-09-01

    Active illumination is often used when passive illumination cannot produce enough signal intensity for reliable imaging. However, the increase in signal intensity is often achieved with highly coherent laser sources, which produce undesirable effects such as speckle and scintillation. The deleterious effects of speckle and scintillation are often so severe that the imaging camera cannot receive intelligible data, rendering the active illumination technique useless. By reducing the spatial coherence of the laser beam that actively illuminates the object, it is possible to reduce the corruption of the received data caused by speckle and scintillation. The waveguide method discussed in this paper reduces spatial coherence through multiple total internal reflections, which create multiple virtual sources with diverse path lengths. The differing path lengths between the virtual sources and the target allow the temporal coherence properties of the laser to be translated into spatial coherence properties. The resulting partial spatial coherence helps to mitigate the self-interference of the beam as it travels through the atmosphere and reflects off optically rough targets. This mitigation yields a cleaner, intelligible image that may be further processed for the intended use, unlike its unmitigated counterpart. Previous research has independently reduced speckle or scintillation by way of spatial incoherence, but there has been no focus on modeling the waveguide, specifically the image plane the waveguide creates. Utilizing a ray-tracing method, we can determine the coherence length of the source necessary to create incoherent spots in the image plane, as well as accurately model that image plane.

  15. Quantitative X-ray dark-field and phase tomography using single directional speckle scanning technique

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Hongchang, E-mail: hongchang.wang@diamond.ac.uk; Kashyap, Yogesh; Sawhney, Kawal [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom)

    2016-03-21

    X-ray dark-field contrast tomography can provide important information about a sample's interior that is complementary to conventional absorption tomography. Recently, the X-ray speckle-based technique has been proposed to provide qualitative two-dimensional dark-field imaging with a simple experimental arrangement. In this letter, we deduce a relationship between the second moment of the scattering-angle distribution and the cross-correlation degradation of the speckle, and thereby establish a quantitative basis for X-ray dark-field tomography using a single-directional speckle scanning technique. In addition, phase-contrast images can be retrieved simultaneously, permitting tomographic reconstruction that yields enhanced contrast in weakly absorbing materials. Such a complementary tomography technique allows systematic investigation of complex samples containing both soft and hard materials.

  16. A Nonlinear Diffusion Equation-Based Model for Ultrasound Speckle Noise Removal

    Science.gov (United States)

    Zhou, Zhenyu; Guo, Zhichang; Zhang, Dazhi; Wu, Boying

    2018-04-01

    Ultrasound images are contaminated by speckle noise, which complicates further image analysis and clinical diagnosis. In this paper, we address this problem from the viewpoint of nonlinear diffusion equations. We develop a nonlinear diffusion equation-based model that takes into account not only the gradient information of the image but also its gray-level information. By using a region indicator as the variable exponent, we can adaptively control the diffusion type, which alternates between Perona-Malik diffusion and Charbonnier diffusion according to the image gray levels. Furthermore, we analyze the theoretical and numerical properties of the proposed model. Experiments show that the proposed method achieves much better speckle suppression and edge preservation than traditional despeckling methods, especially in low-gray-level and low-contrast regions.
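
For orientation, here is a sketch of the classic Perona-Malik diffusion that the proposed model alternates with Charbonnier diffusion. This is the textbook explicit scheme with assumed parameters and periodic boundaries for brevity, not the authors' variable-exponent model.

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=0.3, dt=0.2):
    # Classic Perona-Malik diffusion with an exponential edge-stopping
    # function; the paper's model additionally switches diffusion type
    # according to the local gray level.
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)   # edge-stopping function
    for _ in range(n_iter):
        dn = np.roll(u, 1, axis=0) - u        # differences to the 4 neighbours
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        u = u + dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

rng = np.random.default_rng(3)
clean = np.zeros((64, 64)); clean[:, 32:] = 1.0          # step edge
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
smooth = perona_malik(noisy)
```

Small (noise-scale) gradients see a diffusivity near 1 and are smoothed away, while the unit step edge sees a near-zero diffusivity and survives, which is the behaviour the proposed model tunes per gray level.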

  17. Enhanced diagnostic of skin conditions by polarized laser speckles: phantom studies and computer modeling

    Science.gov (United States)

    Tchvialeva, Lioudmila; Lee, Tim K.; Markhvida, Igor; Zeng, Haishan; Doronin, Alexander; Meglinski, Igor

    2014-03-01

    The incidence of skin melanoma, the most commonly fatal form of skin cancer, is increasing faster than that of any other potentially preventable cancer. Clinical practice is currently hampered by the inability to rapidly screen the functional and morphological properties of tissues. In our previous study we showed that quantifying the polarization of scattered laser light provides a useful metric for diagnosing malignant melanoma. In this study we explore whether image speckle can improve skin cancer diagnostics in comparison with the previously used free-space speckle. The study includes skin phantom measurements and computer modeling. To characterize the depolarization of light, we measure the spatial distribution of speckle patterns and analyse their depolarization ratio, taking radial symmetry into account. We examine the dependence of the depolarization ratio on roughness for phantoms whose optical properties are of the order of those of skin lesions. We demonstrate that variations in bulk optical properties produce assessable changes in the depolarization ratio. We show that image speckle differentiates phantoms significantly better than free-space speckle. The experimental measurements are compared with the results of Monte Carlo simulation.

  18. Accuracy concerns in digital speckle photography combined with Fresnel digital holographic interferometry

    Science.gov (United States)

    Zhao, Yuchen; Zemmamouche, Redouane; Vandenrijt, Jean-François; Georges, Marc P.

    2018-05-01

    A combination of digital holographic interferometry (DHI) and digital speckle photography (DSP) allows in-plane and out-of-plane displacement measurement between two states of an object. The former can be determined by correlating the two speckle patterns, whereas the latter is given by the phase difference obtained from DHI. We show that the amplitude of the numerically reconstructed object wavefront obtained from Fresnel in-line digital holography (DH), in combination with phase-shifting techniques, can be used as speckle patterns in DSP. The accuracy of the in-plane measurement is improved after correcting the phase errors induced by the reference wave during the reconstruction process. Furthermore, unlike a conventional imaging system, Fresnel DH offers the possibility to resize the pixel size of speckle patterns situated on the reconstruction plane under the same optical configuration simply by zero-padding the hologram. The flexibility of speckle-size adjustment in Fresnel DH ensures the accuracy of the DSP estimation result.

  19. Optical diagnostics of vascular reactions triggered by weak allergens using laser speckle-contrast imaging technique

    International Nuclear Information System (INIS)

    Kuznetsov, Yu L; Kalchenko, V V; Astaf'eva, N G; Meglinski, I V

    2014-01-01

    The capability of using the laser speckle contrast imaging technique with a long exposure time for visualisation of primary acute skin vascular reactions caused by a topical application of a weak contact allergen is considered. The method is shown to provide efficient and accurate detection of irritant-induced primary acute vascular reactions of skin. The presented technique possesses a high potential in everyday diagnostic practice, preclinical studies, as well as in the prognosis of skin reactions to the interaction with potentially allergenic materials. (laser biophotonics)

  20. Optical diagnostics of vascular reactions triggered by weak allergens using laser speckle-contrast imaging technique

    Energy Technology Data Exchange (ETDEWEB)

    Kuznetsov, Yu L; Kalchenko, V V [Department of Veterinary Resources, Weizmann Institute of Science, Rehovot, 76100 (Israel); Astaf'eva, N G [V.I.Razumovsky Saratov State Medical University, Saratov (Russian Federation); Meglinski, I V [N.G. Chernyshevsky Saratov State University, Saratov (Russian Federation)

    2014-08-31

    The capability of using the laser speckle contrast imaging technique with a long exposure time for visualisation of primary acute skin vascular reactions caused by a topical application of a weak contact allergen is considered. The method is shown to provide efficient and accurate detection of irritant-induced primary acute vascular reactions of skin. The presented technique possesses a high potential in everyday diagnostic practice, preclinical studies, as well as in the prognosis of skin reactions to the interaction with potentially allergenic materials. (laser biophotonics)

  1. Single-shot speckle reduction in numerical reconstruction of digitally recorded holograms.

    Science.gov (United States)

    Hincapie, Diego; Herrera-Ramírez, Jorge; Garcia-Sucerquia, Jorge

    2015-04-15

    A single-shot method to reduce the speckle noise in the numerical reconstructions of electronically recorded holograms is presented. A recorded hologram with the dimensions N×M is split into S=T×T sub-holograms. The uncorrelated superposition of the individually reconstructed sub-holograms leads to an image with the speckle noise reduced proportionally to the 1/S law. The experimental results are presented to support the proposed methodology.
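
The splitting-and-averaging idea can be sketched numerically. In this toy demonstration a plain zero-padded FFT stands in for the actual Fresnel reconstruction, a random-phase field stands in for the recorded hologram, and T = 2 gives S = 4 sub-holograms.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 128
# Stand-in for a recorded hologram of a diffuse object: a random-phase field.
holo = np.exp(2j * np.pi * rng.random((N, N)))

def reconstruct(h):
    # Toy "reconstruction": a zero-padded FFT stands in for Fresnel propagation.
    return np.abs(np.fft.fft2(h, s=(N, N))) ** 2

full = reconstruct(holo)

T = 2                                  # split into S = T*T sub-holograms
subs = [holo[i * N // T:(i + 1) * N // T, j * N // T:(j + 1) * N // T]
        for i in range(T) for j in range(T)]
averaged = np.mean([reconstruct(s) for s in subs], axis=0)

contrast = lambda im: im.std() / im.mean()
C_full, C_avg = contrast(full), contrast(averaged)
```

Because the S sub-holograms carry uncorrelated speckle, averaging their intensity reconstructions lowers the speckle contrast by about 1/sqrt(S), i.e. the speckle variance by 1/S, at the cost of the coarser resolution of each smaller aperture.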

  2. Real time speckle monitoring to control retinal photocoagulation

    Science.gov (United States)

    Bliedtner, Katharina; Seifert, Eric; Brinkmann, Ralf

    2017-07-01

    Photocoagulation is a treatment modality for several retinal diseases. Intra- and inter-individual variations of retinal absorption, as well as ocular transmission and light scattering, make it impossible to achieve a uniform effective exposure with one set of laser parameters. To guarantee uniform damage throughout the therapy, real-time control is highly desirable. Here, an approach to real-time optical feedback using dynamic speckle analysis in vivo is presented. A 532 nm continuous-wave Nd:YAG laser is used for coagulation. During coagulation, speckle dynamics are monitored under coherent object illumination from a 633 nm diode laser and analyzed by a CMOS camera with a frame rate of up to 1 kHz. An algorithm is presented that can discriminate between different categories of retinal pigment epithelial damage ex vivo in enucleated porcine eyes and that appears robust to noise in vivo. Tissue changes in rabbits during retinal coagulation were observed for different lesion strengths. This algorithm can run on an FPGA and calculates a feedback value that is correlated with the thermal and coagulation-induced tissue motion and thus with the achieved damage.

  3. Multiple rotation assessment through isothetic fringes in speckle photography

    International Nuclear Information System (INIS)

    Angel, Luciano; Tebaldi, Myrian; Bolognini, Nestor

    2007-01-01

    Different pupils are used to store each speckled image in speckle photography in order to determine multiple in-plane rotations. The method consists of recording a four-exposure specklegram, with the rotations performed between exposures. This specklegram is then optically processed in a whole-field approach to render isothetic fringes, which give detailed information about the multiple rotations. It is experimentally demonstrated that the proposed arrangement permits the depiction of six isothetics in order to measure either six different angles or three nonparallel components for two local general in-plane displacements.

  4. Infrared speckle observations of the binary Ross 614 AB - combined shift-and-add and zero-and-add analysis

    International Nuclear Information System (INIS)

    Davey, B.L.K.; Bates, R.H.T.; Cocke, W.J.; Mccarthy, D.W. Jr.; Christou, J.C.

    1989-01-01

    One-dimensional infrared speckle scans of Ross 614 AB were recorded at a wavelength of 2.2 microns, and the three bins corresponding to the three best seeing conditions were further processed by applying a shift-and-add algorithm to the set of images contained within each bin, generating three shift-and-add images with differing shift-and-add point-spread functions. A zero-and-add technique was used to deconvolve the three shift-and-add images in order to obtain parameters corresponding to the separation and the brightness ratio of a two-component model of Ross 614 AB. Least-squares analysis reveals a separation of 1.04 arcsec and a brightness ratio of 4.3 for the binary system at this wavelength. 31 refs
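
A minimal one-dimensional shift-and-add, aligning each scan on its brightest pixel before averaging, illustrates the first processing stage on simulated data; the zero-and-add deconvolution step is not shown, and the separation and brightness ratio below are arbitrary test values, not those of Ross 614 AB.

```python
import numpy as np

def shift_and_add(frames):
    # Align each 1-D frame on its brightest pixel, then average.
    n = frames.shape[1]
    centre = n // 2
    out = np.zeros(n)
    for f in frames:
        out += np.roll(f, centre - int(np.argmax(f)))
    return out / len(frames)

rng = np.random.default_rng(5)
n, n_frames = 256, 200
truth = np.zeros(n)
truth[100] = 1.0       # primary component
truth[140] = 0.25      # fainter companion, 40-pixel separation
frames = np.empty((n_frames, n))
for k in range(n_frames):
    shift = rng.integers(-20, 21)                    # random image motion
    frames[k] = np.roll(truth, shift) + 0.02 * rng.standard_normal(n)
sa = shift_and_add(frames)
```

The random per-frame image motion is removed by the alignment, so the primary piles up at the array centre and the companion reappears at the correct offset in the averaged result.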

  5. Robust information encryption diffractive-imaging-based scheme with special phase retrieval algorithm for a customized data container

    Science.gov (United States)

    Qin, Yi; Wang, Zhipeng; Wang, Hongjuan; Gong, Qiong; Zhou, Nanrun

    2018-06-01

    The diffractive-imaging-based encryption (DIBE) scheme has aroused wide interest due to its compact architecture and modest experimental requirements. Nevertheless, the primary information can hardly be recovered exactly in real applications when the speckle noise and potential occlusion imposed on the ciphertext are considered. To deal with this issue, a customized data container (CDC) is introduced into DIBE and a new phase retrieval algorithm (PRA) for plaintext retrieval is proposed. The PRA, designed according to the peculiarities of the CDC, combines two key techniques from previous approaches, i.e., input support constraint and median filtering. The proposed scheme guarantees reconstruction of the primary information despite heavy noise or occlusion, and its effectiveness and feasibility are demonstrated with simulation results.

  6. Fabrication of nanoscale speckle using broad ion beam milling on polymers for deformation analysis

    Directory of Open Access Journals (Sweden)

    Qinghua Wang

    2016-07-01

    We report, for the first time, a technique for fabricating nanoscale speckle patterns on polymers using broad ion beam milling. The proposed technique is simple and low-cost, producing speckles ranging from dozens of nanometers to less than three micrometers over a large area of several millimeters. Random patterns were successfully produced with an argon (Ar) ion beam on the surfaces of four kinds of polymers: the epoxy matrix of carbon fiber reinforced plastic, polyester, polyvinyl formal-acetal, and polyimide. The speckle morphologies vary slightly with different polymers. The fabricated speckle patterns have good temporal stability and are promising for measuring the nanoscale deformations of polymers using the digital image correlation method.

  7. Intrinsic speckle noise in in-line particle holography due to polydisperse and continuous particle sizes

    Science.gov (United States)

    Edwards, Philip J.; Hobson, Peter R.; Rodgers, G. J.

    2000-08-01

    In-line particle holography is subject to image deterioration due to intrinsic speckle noise. The resulting reduction in the signal-to-noise ratio (SNR) of the replayed image can become critical for applications such as holographic particle velocimetry (HPV) and 3D visualisation of marine plankton. Work has been done to extend the monodisperse model relevant to HPV to polydisperse particle fields appropriate for the visualisation of marine plankton. Continuous and discrete particle fields are both considered. It is found that random-walk statistics still apply in the polydisperse case: the speckle field is simply the summation of the individual speckle patterns due to each scatterer size. Therefore the characteristic speckle parameter (which encompasses particle diameter, concentration, and sample depth) is also just the summation of the individual speckle parameters. This reduces the SNR calculation to the same form as in the monodisperse case. For the continuous case, three distributions (power-law, exponential, and Gaussian) are discussed and the resulting SNR calculated. The work presented here was performed as part of the Holomar project to produce a working underwater holographic camera for recording plankton.

  8. Computer vision elastography: speckle adaptive motion estimation for elastography using ultrasound sequences.

    Science.gov (United States)

    Revell, James; Mirmehdi, Majid; McNally, Donal

    2005-06-01

    We present the development and validation of an image-based speckle tracking methodology for determining temporal two-dimensional (2-D) axial and lateral displacement and strain fields from ultrasound video streams. We refine a multiple-scale region matching approach incorporating novel solutions to known speckle tracking problems. Key contributions include automatic similarity-measure selection to adapt to varying speckle density, quantified trajectory fields, and spatiotemporal elastograms. Results are validated using tissue-mimicking phantoms and in vitro data, before applying the method to in vivo musculoskeletal ultrasound sequences. The method presented has the potential to improve clinical knowledge of tendon pathology from carpal tunnel syndrome, inflammation from implants, sports injuries, and many other conditions.
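
The region-matching core of such speckle tracking can be sketched as an exhaustive normalized cross-correlation search for a single block; the full method adds multi-scale refinement and adaptive similarity-measure selection, which are omitted in this illustration with an artificial rigid shift.

```python
import numpy as np

def block_displacement(pre, post, y, x, b=16, search=8):
    # Integer-pixel displacement of the b-by-b block at (y, x), found by
    # exhaustive normalized cross-correlation over a +/-search window.
    ref = pre[y:y + b, x:x + b].astype(float)
    ref = (ref - ref.mean()) / ref.std()
    best_score, best = -2.0, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = post[y + dy:y + dy + b, x + dx:x + dx + b].astype(float)
            cand = (cand - cand.mean()) / cand.std()
            score = (ref * cand).mean()      # normalized cross-correlation
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best

rng = np.random.default_rng(6)
pre = rng.random((64, 64))                   # synthetic speckle frame
post = np.roll(pre, (3, -2), axis=(0, 1))    # rigid motion: +3 axial, -2 lateral
dy, dx = block_displacement(pre, post, 24, 24)
```

Repeating this over a grid of blocks and over successive frames yields the displacement (and, after differentiation, strain) fields described above.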

  9. Speckle-based spectrometer

    DEFF Research Database (Denmark)

    Chakrabarti, Maumita; Jakobsen, Michael Linde; Hanson, Steen Grüner

    2015-01-01

    A novel spectrometer concept is analyzed and experimentally verified. The method relies on probing the speckle displacement due to a change in the incident wavelength. A rough surface is illuminated at an oblique angle, and the peak position of the covariance between the speckle patterns observed...

  10. Quantum Image Encryption Algorithm Based on Image Correlation Decomposition

    Science.gov (United States)

    Hua, Tianxiang; Chen, Jiamin; Pei, Dongju; Zhang, Wenquan; Zhou, Nanrun

    2015-02-01

    A novel quantum gray-level image encryption and decryption algorithm based on image correlation decomposition is proposed. The correlation among image pixels is established by utilizing the superposition and measurement principle of quantum states. A whole quantum image is divided into a series of sub-images. These sub-images are stored in a complete binary tree array constructed previously and then randomly subjected to one of the quantum random-phase gate, quantum rotation gate, and Hadamard transform operations. The encrypted image is obtained by superimposing the resulting sub-images with the superposition principle of quantum states. For the encryption algorithm, the keys are the parameters of the random-phase gate, the rotation angle, the binary sequence, and the orthonormal basis states. The security and the computational complexity of the proposed algorithm are analyzed. The proposed encryption algorithm can resist brute-force attack due to its very large key space and has lower computational complexity than its classical counterparts.

  11. Ant Colony Clustering Algorithm and Improved Markov Random Fusion Algorithm in Image Segmentation of Brain Images

    Directory of Open Access Journals (Sweden)

    Guohua Zou

    2016-12-01

    New medical imaging technologies, such as Computed Tomography and Magnetic Resonance Imaging (MRI), have been widely used in all aspects of medical diagnosis. The purpose of these imaging techniques is to obtain various qualitative and quantitative data about the patient comprehensively and accurately, and to provide correct digital information for diagnosis, treatment planning, and evaluation after surgery. MR has a good diagnostic imaging advantage for brain diseases. However, as the requirements on brain-image definition and quantitative analysis keep increasing, better segmentation of MR brain images is necessary. The FCM (Fuzzy C-means) algorithm is widely applied in image segmentation, but it has some shortcomings, such as long computation time and poor anti-noise capability. In this paper, the Ant Colony algorithm is first used to determine the cluster centers and the number of clusters for the FCM algorithm, so as to improve its running speed. Then an improved Markov random field model is used to refine the algorithm, improving its anti-noise ability. Experimental results show that the algorithm put forward in this paper has obvious advantages in image segmentation speed and segmentation quality.
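
A plain two-class FCM iteration on a one-dimensional grey-level vector shows the update rules being accelerated; the ant-colony seeding and Markov-random-field refinement of the paper are not reproduced here, and the deterministic min/max initialisation is a simplification.

```python
import numpy as np

def fcm(x, m=2.0, n_iter=50):
    # Plain two-class fuzzy C-means on a 1-D grey-level vector; the paper
    # instead seeds the centres (and their number) with an ant-colony step.
    centres = np.array([x.min(), x.max()])              # simple deterministic init
    for _ in range(n_iter):
        d = np.abs(x[None, :] - centres[:, None]) + 1e-12   # 2 x n distances
        u = d ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=0)                              # fuzzy memberships per pixel
        w = u ** m
        centres = (w @ x) / w.sum(axis=1)               # weighted-mean centre update
    return np.sort(centres)

rng = np.random.default_rng(7)
# Toy two-tissue "image": grey levels clustered near 0.2 and 0.8.
pixels = np.concatenate([0.2 + 0.05 * rng.standard_normal(500),
                         0.8 + 0.05 * rng.standard_normal(500)])
centres = fcm(pixels)
```

The alternating membership/centre updates converge to the two tissue grey levels; a good initial guess (which the ant-colony step provides in the paper) reduces the number of iterations needed.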

  12. 3-color photometry of a sunspot using speckle masking techniques

    NARCIS (Netherlands)

    Wiehr, E.; Sütterlin, P.

    1998-01-01

    A three-colour photometry is used to deduce the temperature of sunspot fine-structures. Using the Speckle-Masking method for image restoration, the resulting images (one per colour and burst) have a spatial resolution only limited by the telescope's aperture, i.e. 95 km (blue), 145 km (red) and

  13. Lensless coherent imaging of proteins and supramolecular assemblies: Efficient phase retrieval by the charge flipping algorithm.

    Science.gov (United States)

    Dumas, Christian; van der Lee, Arie; Palatinus, Lukáš

    2013-05-01

    Diffractive imaging using the intense and coherent beam of X-ray free-electron lasers opens new perspectives for structural studies of single nanoparticles and biomolecules. Simulations were carried out to generate 3D oversampled diffraction patterns of non-crystalline biological samples, ranging from peptides and proteins to megadalton complex assemblies, and to recover their molecular structures at nanometer to near-atomic resolutions. Using these simulated data, we show here that iterative reconstruction methods based on standard and variant forms of the charge flipping algorithm can efficiently solve the phase retrieval problem and extract a unique and reliable molecular structure. Contrary to conventional algorithms, where the estimation and use of a compact support are imposed, our approach does not require any prior information about the molecular assembly and is amenable to a wide range of biological assemblies. Importantly, the robustness of this ab initio approach is illustrated by the fact that it tolerates experimental noise and incompleteness of the intensity data at the center of the speckle pattern.
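
The basic charge-flipping loop (impose the measured Fourier moduli, then flip weak real-space density) can be sketched as below. The threshold and iteration count are illustrative choices, the toy point-atom structure is invented for the example, and any recovered solution is defined only up to origin shift and inversion.

```python
import numpy as np

def charge_flip(mags, delta=0.05, n_iter=500, seed=0):
    # Basic charge flipping: impose the measured Fourier moduli, then flip
    # the sign of all real-space density below a small positive threshold.
    rng = np.random.default_rng(seed)
    F = mags * np.exp(2j * np.pi * rng.random(mags.shape))   # random start phases
    F[0, 0] = mags[0, 0]                     # keep total density real and positive
    rho = np.zeros_like(mags)
    for _ in range(n_iter):
        rho = np.fft.ifft2(F).real
        flipped = np.where(rho < delta, -rho, rho)   # the "charge flip"
        G = np.fft.fft2(flipped)
        F = mags * np.exp(1j * np.angle(G))          # keep phases, reset moduli
        F[0, 0] = mags[0, 0]
    return rho

# Toy point-atom structure; only its Fourier magnitudes are retained.
truth = np.zeros((32, 32))
truth[8, 8] = truth[20, 12] = truth[14, 25] = 1.0
mags = np.abs(np.fft.fft2(truth))
rho = charge_flip(mags)
```

Note how the loop uses no support mask at all: the flip of sub-threshold density is the only real-space constraint, which is what lets the approach above dispense with prior information about the assembly.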

  14. Monitoring of bread cooling by statistical analysis of laser speckle patterns

    Science.gov (United States)

    Lyubenova, Tanya; Stoykova, Elena; Nacheva, Elena; Ivanov, Branimir; Panchev, Ivan; Sainov, Ventseslav

    2013-03-01

    The phenomenon of laser speckle can be used for detection and visualization of physical or biological activity in various objects (e.g. fruits, seeds, coatings) through statistical description of speckle dynamics. The paper presents the results of non-destructive monitoring of bread cooling by co-occurrence matrix and temporal structure function analysis of speckle patterns which have been recorded continuously within a few days. In total, 72960 and 39680 images were recorded and processed for two similar bread samples respectively. The experiments proved the expected steep decrease of activity related to the processes in the bread samples during the first several hours and revealed its oscillating character within the next few days. Characterization of activity over the bread sample surface was also obtained.
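
One standard activity measure used with temporal speckle sequences, the inertia moment of the co-occurrence matrix of successive frames, can be sketched as follows; this is an illustrative implementation on synthetic stacks, and binning and normalisation conventions vary between papers.

```python
import numpy as np

def inertia_moment(stack, levels=64):
    # Inertia moment of the temporal co-occurrence matrix of successive
    # frames: near zero for static speckle, large for fast dynamics.
    q = np.clip((stack * levels).astype(int), 0, levels - 1)
    com = np.zeros((levels, levels))
    for a, b in zip(q[:-1], q[1:]):              # successive frame pairs
        np.add.at(com, (a.ravel(), b.ravel()), 1.0)
    com /= com.sum(axis=1, keepdims=True) + 1e-12    # row-normalise
    i, j = np.indices(com.shape)
    return (com * (i - j) ** 2).sum()

rng = np.random.default_rng(8)
static = np.repeat(rng.random((1, 32, 32)), 20, axis=0)   # frozen speckle
active = rng.random((20, 32, 32))                         # fully decorrelated frames
IM_static = inertia_moment(static)
IM_active = inertia_moment(active)
```

A static sample keeps all counts on the matrix diagonal (zero inertia moment), while active samples spread counts off-diagonal, matching the steep decay of activity observed as the bread cools.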

  15. OBSERVATIONS OF BINARY STARS WITH THE DIFFERENTIAL SPECKLE SURVEY INSTRUMENT. I. INSTRUMENT DESCRIPTION AND FIRST RESULTS

    International Nuclear Information System (INIS)

    Horch, Elliott P.; Veillette, Daniel R.; Shah, Sagar C.; O'Rielly, Grant V.; Baena Galle, Roberto; Van Altena, William F.

    2009-01-01

    First results of a new speckle imaging system, the Differential Speckle Survey Instrument, are reported. The instrument is designed to take speckle data in two filters simultaneously with two independent CCD imagers. This feature results in three advantages over other speckle cameras: (1) twice as many frames can be obtained in the same observation time, which can increase the signal-to-noise ratio for astrometric measurements; (2) component colors can be derived from a single observation; and (3) the two colors give substantial leverage over atmospheric dispersion, allowing subdiffraction-limited separations to be measured reliably. Fifty-four observations are reported from the first use of the instrument at the Wisconsin-Indiana-Yale-NOAO (WIYN) 3.5 m Telescope (the WIYN Observatory is a joint facility of the University of Wisconsin-Madison, Indiana University, Yale University, and the National Optical Astronomy Observatories) in 2008 September, including seven components resolved for the first time. These observations are used to judge the basic capabilities of the instrument.

  16. Digital image processing an algorithmic approach with Matlab

    CERN Document Server

    Qidwai, Uvais

    2009-01-01

    Contents: Introduction to Image Processing and the MATLAB Environment (digital image definitions, image properties); Image Acquisition, Types, and File I/O (image types, file I/O basics, color images and other color spaces); Image Arithmetic (operator basics, theoretical and algorithmic treatment, coding examples); Affine and Logical Operations, Distortions, and Noise in Images (affine operations, logical operators, noise and distortions in images). Each chapter pairs a theoretical account with an algorithmic account and MATLAB code.

  17. Application of speckle image correlation for real-time assessment of metabolic activity in herpes virus-infected cells

    Science.gov (United States)

    Vladimirov, A. P.; Malygin, A. S.; Mikhailova, J. A.; Borodin, E. M.; Bakharev, A. A.; Poryvayeva, A. P.

    2014-09-01

    Earlier we reported developing a speckle interferometry technique and a device designed to assess the metabolic activity of a cell monolayer cultivated on a glass substrate. This paper is aimed at upgrading the technique and studying its potential for real-time assessment of the herpes virus development process. Speckle dynamics was recorded in the image plane of intact and virus-infected cell monolayers. HLE-3, L-41 and Vero cells were chosen as research targets. Herpes simplex virus 1 (HSV-1)-infected cell cultures were studied. For 24 h we recorded the digital value of the optical signal I in one pixel and a parameter η characterizing the change in the distribution of the optical signal over 10 × 10-pixel areas. The coefficient of multiple determination calculated from the η time dependences for three intact cell cultures equals 0.94. It was demonstrated that the activity parameters differ significantly between intact and virus-infected cells. The difference in η between intact and HSV-1-infected cells is detectable within 10 minutes of the start of the experiment.

  18. Application of speckle image correlation for real-time assessment of metabolic activity in herpes virus-infected cells

    International Nuclear Information System (INIS)

    Vladimirov, A P; Malygin, A S; Mikhailova, J A; Borodin, E M; Bakharev, A A; Poryvayeva, A P

    2014-01-01

    Earlier we reported developing a speckle interferometry technique and a device designed to assess the metabolic activity of a cell monolayer cultivated on a glass substrate. This paper is aimed at upgrading the technique and studying its potential for real-time assessment of the herpes virus development process. Speckle dynamics was recorded in the image plane of intact and virus-infected cell monolayers. HLE-3, L-41 and Vero cells were chosen as research targets. Herpes simplex virus 1 (HSV-1)-infected cell cultures were studied. For 24 h we recorded the digital value of the optical signal I in one pixel and a parameter η characterizing the change in the distribution of the optical signal over 10 × 10-pixel areas. The coefficient of multiple determination calculated from the η time dependences for three intact cell cultures equals 0.94. It was demonstrated that the activity parameters differ significantly between intact and virus-infected cells. The difference in η between intact and HSV-1-infected cells is detectable within 10 minutes of the start of the experiment.

  19. Real time processor for array speckle interferometry

    International Nuclear Information System (INIS)

    Chin, G.; Florez, J.; Borelli, R.; Fong, W.; Miko, J.; Trujillo, C.

    1989-01-01

    With the construction of several new large-aperture telescopes and the development of large-format array detectors in the near IR, the ability to obtain diffraction-limited seeing via IR array speckle interferometry offers a powerful tool. We are constructing a real-time processor to acquire image frames, perform array flat-fielding, execute a 64 x 64 element 2D complex FFT, and average the power spectrum, all within the 25 msec coherence time for speckles at near-IR wavelengths. The processor is a compact unit controlled by a PC with real-time display and data storage capability. It provides the ability to optimize observations and obtain results at the telescope, rather than waiting several weeks before the data can be analyzed and viewed with off-line methods.
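
    The processing chain described above (flat-field, 64 x 64 complex FFT, power spectrum accumulation) can be sketched in software; this NumPy stand-in illustrates only the arithmetic, not the real-time hardware:

```python
import numpy as np

def average_power_spectrum(frames, flat):
    """Core loop of the speckle processor described above: flat-field each
    frame, take the 64 x 64 2D complex FFT, and average the power spectra
    over the frame sequence."""
    acc = np.zeros((64, 64))
    for frame in frames:
        corrected = frame / flat            # array flat-fielding
        spectrum = np.fft.fft2(corrected)   # 64 x 64 2D complex FFT
        acc += np.abs(spectrum) ** 2        # accumulate power spectrum
    return acc / len(frames)

# A uniform scene concentrates all power in the DC bin.
ps = average_power_spectrum([np.ones((64, 64))] * 4, np.ones((64, 64)))
```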

  20. Speckle interferometry. Data acquisition and control for the SPID instrument.

    Science.gov (United States)

    Altarac, S.; Tallon, M.; Thiebaut, E.; Foy, R.

    1998-08-01

    SPID (SPeckle Imaging by Deconvolution) is a new speckle camera currently under construction at CRAL-Observatoire de Lyon. Its high spectral resolution and high image restoration capabilities open new astrophysical programs. The SPID instrument is composed of four main optical modules which are fully automated and computer controlled by software written in Tcl/Tk/Tix and C. This software provides intelligent assistance to the user by choosing observational parameters as a function of atmospheric parameters, computed in real time, and the desired restored image quality. Data acquisition is performed by a photon-counting detector (CP40). A VME-based computer under OS9 controls the detector and stores the data. The intelligent system runs under Linux on a PC. A slave PC under DOS controls the motors. These three computers communicate through an Ethernet network. SPID can be considered a precursor of the very high spatial resolution camera for the VLT (Very Large Telescope, four 8-meter telescopes currently being built in Chile by the European Southern Observatory).

  1. Camera-based speckle noise reduction for 3-D absolute shape measurements.

    Science.gov (United States)

    Zhang, Hao; Kuschmierz, Robert; Czarske, Jürgen; Fischer, Andreas

    2016-05-30

    Simultaneous position and velocity measurements enable absolute 3-D shape measurements of fast rotating objects, for instance for monitoring the cutting process in a lathe. Laser Doppler distance sensors enable simultaneous position and velocity measurements with a single sensor head by evaluating the scattered light signals. However, the superposition of several speckles with equal Doppler frequency but random phase on the photo detector results in an increased velocity and shape uncertainty. In this paper, we present a novel image evaluation method that overcomes the uncertainty limitations due to the speckle effect. For this purpose, the scattered light is detected with a camera instead of single photo detectors. Thus, the Doppler frequency from each speckle can be evaluated separately, and the velocity uncertainty decreases with the square root of the number of camera lines. A reduction of the velocity uncertainty by one order of magnitude is verified by both numerical simulations and experimental results. As a result, the measurement uncertainty of the absolute shape is no longer limited by the speckle effect.
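
    The square-root dependence of the velocity uncertainty on the number of camera lines can be checked with a small Monte Carlo sketch (the Doppler frequency and per-speckle scatter below are made-up values):

```python
import numpy as np

rng = np.random.default_rng(1)
f_true, sigma = 1.0e6, 50e3  # Hz: assumed Doppler frequency and per-speckle scatter

def uncertainty(n_lines, trials=2000):
    """Standard deviation of the frequency estimate obtained by averaging
    the independent per-speckle estimates from n_lines camera lines."""
    estimates = rng.normal(f_true, sigma, (trials, n_lines)).mean(axis=1)
    return estimates.std()

u_single = uncertainty(1)
u_avg = uncertainty(100)  # roughly 10x smaller, following 1/sqrt(n_lines)
```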

  2. Single shot imaging through turbid medium and around corner using coherent light

    Science.gov (United States)

    Li, Guowei; Li, Dayan; Situ, Guohai

    2018-01-01

    Optical imaging through turbid media and around corners is a difficult challenge. Even a very thin layer of a turbid medium, which randomly scatters the probe light, can appear opaque and hide any objects behind it. Despite many recent advances, no current method can image an object behind turbid media from a single record using coherent laser illumination. Here we report a method that allows non-invasive, single-shot optical imaging through turbid media and around corners via speckle correlation. Instead of being an obstacle to forming diffraction-limited images, speckle can actually be a carrier that encodes sufficient information for imaging through visually opaque layers. Optical imaging through turbid media and around corners is experimentally demonstrated using a traditional imaging system with the aid of an iterative phase retrieval algorithm. Our method requires neither scanning of the illumination, two-arm interferometry, nor long exposure times during acquisition, which has implications for optical sensing through common obscurants such as fog, smoke and haze.
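
    The key idea, that speckle carries object information, rests on the optical memory effect: within its range, the autocorrelation of the speckle pattern approximates that of the hidden object, and an iterative phase-retrieval loop then inverts it. A minimal sketch of the autocorrelation step (the phase-retrieval loop itself is omitted):

```python
import numpy as np

def autocorrelation(img):
    """Autocorrelation via the Wiener-Khinchin theorem. Within the optical
    memory effect range, the autocorrelation of the camera speckle pattern
    approximates that of the hidden object; a Fienup-type phase-retrieval
    loop (not shown) then recovers the object itself."""
    f = np.fft.fft2(img - img.mean())
    ac = np.fft.ifft2(np.abs(f) ** 2).real
    return np.fft.fftshift(ac)       # put the zero-lag peak at the center

rng = np.random.default_rng(3)
speckle = rng.random((32, 32))
ac = autocorrelation(speckle)
```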

  3. Impact of transducer frequency setting on speckle tracking measures

    DEFF Research Database (Denmark)

    Olsen, Flemming Javier; Svendsen, Jesper Hastrup; Køber, Lars

    2018-01-01

    .5/3.0 MHz. The images were obtained immediately after each other at the exact same position for the two settings. Speckle tracking was performed in three apical projections, allowing for acquisition of layered global longitudinal strain (GLS) and strain rate measures. Concordance between the frequency...

  4. Speckle noise reduction technique for Lidar echo signal based on self-adaptive pulse-matching independent component analysis

    Science.gov (United States)

    Xu, Fan; Wang, Jiaxing; Zhu, Daiyin; Tu, Qi

    2018-04-01

    Speckle noise has always been a particularly tricky problem in improving the ranging capability and accuracy of Lidar systems, especially in harsh environments. Currently, effective speckle de-noising techniques are extremely scarce and should be further developed. In this study, a speckle noise reduction technique is proposed based on independent component analysis (ICA). Since the shape of the laser pulse itself normally changes little, the authors employed the laser source as a reference pulse and executed the ICA decomposition to find the optimal matching position. In order to make the algorithm self-adaptive, a local mean square error (MSE) is defined as the criterion for evaluating the iteration results. The experimental results demonstrate that the self-adaptive pulse-matching ICA (PM-ICA) method can effectively decrease speckle noise and recover the useful Lidar echo signal component with high quality. In particular, the proposed method achieves a 4 dB greater improvement of the signal-to-noise ratio (SNR) than a traditional homomorphic wavelet method.

  5. Multi-wavelength speckle reduction for laser pico-projectors using diffractive optics

    Science.gov (United States)

    Thomas, Weston H.

    Personal electronic devices, such as cell phones and tablets, continue to decrease in size while the number of features and add-ons keep increasing. One particular feature of great interest is an integrated projector system. Laser pico-projectors have been considered, but the technology has not been developed enough to warrant integration. With new advancements in diode technology and MEMS devices, laser-based projection is currently being advanced for pico-projectors. A primary problem encountered when using a pico-projector is coherent interference known as speckle. Laser speckle can lead to eye irritation and headaches after prolonged viewing. Diffractive optical elements known as diffusers have been examined as a means to lower speckle contrast. Diffusers are often rotated to achieve temporal averaging of the spatial phase pattern provided by the diffuser surface. While diffusers are unable to completely eliminate speckle, they can be utilized to decrease the resultant contrast to provide a more visually acceptable image. This dissertation measures the reduction in speckle contrast achievable through the use of diffractive diffusers. A theoretical Fourier optics model is used to provide the diffuser's stationary and in-motion performance in terms of the resultant contrast level. Contrast measurements of two diffractive diffusers are calculated theoretically and compared with experimental results. In addition, a novel binary diffuser design based on Hadamard matrices is presented. Using two static in-line Hadamard diffusers eliminates the need for rotation or vibration of the diffuser for temporal averaging. Two Hadamard diffusers were fabricated and contrast values were subsequently measured, showing good agreement with theory and simulated values. Monochromatic speckle contrast values of 0.40 were achieved using the Hadamard diffusers. Finally, color laser projection devices require the use of red, green, and blue laser sources; therefore, using a

  6. 2D biological representations with reduced speckle obtained from two perpendicular ultrasonic arrays.

    Science.gov (United States)

    Rodriguez-Hernandez, Miguel A; Gomez-Sacristan, Angel; Sempere-Payá, Víctor M

    2016-04-29

    Ultrasound diagnosis is a widely used medical tool. Among the various ultrasound techniques, ultrasonic imaging is particularly relevant. This paper presents an improvement to a two-dimensional (2D) ultrasonic system using measurements taken from perpendicular planes, where digital signal processing techniques are used to combine one-dimensional (1D) A-scans acquired by individual transducers in arrays located in perpendicular planes. The algorithm used to combine the measurements is improved based on the wavelet transform, adding a denoising step to the 2D representation generation process. The inclusion of this new denoising stage generates higher quality 2D representations with a reduced level of speckle. The paper includes different 2D representations obtained from noisy A-scans and compares the improvements obtained by including the denoising stage.
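
    A minimal sketch of the kind of denoising step described, here a single-level Haar soft-threshold in plain NumPy (the paper's multi-level wavelet and threshold choice are not reproduced):

```python
import numpy as np

def haar_denoise(signal, thresh):
    """Single-level Haar wavelet soft-threshold denoising of a 1D A-scan.
    Detail coefficients below the threshold (mostly noise) are shrunk to
    zero before the inverse transform."""
    x = np.asarray(signal, float)          # assumes even length
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0)  # soft threshold
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2)         # inverse Haar transform
    y[1::2] = (a - d) / np.sqrt(2)
    return y

rng = np.random.default_rng(4)
clean = np.ones(64)
noisy = clean + rng.normal(0, 0.1, 64)
denoised = haar_denoise(noisy, thresh=0.3)
```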

  7. Ultrasonic particle image velocimetry for improved flow gradient imaging: algorithms, methodology and validation

    International Nuclear Information System (INIS)

    Niu Lili; Qian Ming; Yu Wentao; Jin Qiaofeng; Ling Tao; Zheng Hairong; Wan Kun; Gao Shen

    2010-01-01

    This paper presents a new algorithm for ultrasonic particle image velocimetry (Echo PIV) that improves flow velocity measurement accuracy and efficiency in regions with high velocity gradients. The conventional Echo PIV algorithm has been modified by incorporating a multiple iterative algorithm, a sub-pixel method, filter and interpolation methods, and a spurious vector elimination algorithm. The new algorithm's performance is assessed by analyzing simulated images with known displacements, and ultrasonic B-mode images of in vitro laminar pipe flow, rotational flow and in vivo rat carotid arterial flow. Results on the simulated images show that the new algorithm produces a much smaller bias from the known displacements. For laminar flow, the new algorithm deviates 1.1% from the analytically derived value, versus 8.8% for the conventional algorithm. The vector quality evaluation for the rotational flow imaging shows that the new algorithm produces better velocity vectors. For in vivo rat carotid arterial flow imaging, results from the new algorithm deviate from the Doppler-measured peak velocities by 6.6% on average, compared with 15% for the conventional algorithm. The new Echo PIV algorithm effectively improves measurement accuracy when imaging flow fields with high velocity gradients.
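
    The conventional building blocks being refined here, window cross-correlation with sub-pixel peak interpolation, can be sketched as follows (a generic PIV kernel, not the paper's exact filters or iteration scheme):

```python
import numpy as np

def displacement(win_a, win_b):
    """Displacement between two interrogation windows via FFT
    cross-correlation with a three-point Gaussian sub-pixel peak fit."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real
    iy, ix = np.unravel_index(corr.argmax(), corr.shape)
    ny, nx = corr.shape

    def subpixel(cm, c0, cp):
        # Gaussian three-point interpolation around the correlation peak.
        lm, l0, lp = np.log(abs(cm)), np.log(abs(c0)), np.log(abs(cp))
        return (lm - lp) / (2.0 * (lm - 2.0 * l0 + lp))

    dy = iy + subpixel(corr[(iy - 1) % ny, ix], corr[iy, ix], corr[(iy + 1) % ny, ix])
    dx = ix + subpixel(corr[iy, (ix - 1) % nx], corr[iy, ix], corr[iy, (ix + 1) % nx])
    # Wrap to signed displacements.
    dy -= ny * (dy > ny / 2)
    dx -= nx * (dx > nx / 2)
    return dy, dx

rng = np.random.default_rng(5)
frame = rng.random((32, 32))
shifted = np.roll(frame, (3, 5), axis=(0, 1))  # known displacement (3, 5)
dy, dx = displacement(frame, shifted)
```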

  8. A new chaotic algorithm for image encryption

    International Nuclear Information System (INIS)

    Gao Haojiang; Zhang Yisheng; Liang Shuyun; Li Dequn

    2006-01-01

    Recent research on image encryption algorithms has increasingly been based on chaotic systems, but the drawbacks of small key space and weak security in one-dimensional chaotic cryptosystems are obvious. This paper presents a new nonlinear chaotic algorithm (NCA) which uses a power function and a tangent function instead of a linear function. Its structural parameters are obtained by experimental analysis. An image encryption algorithm in a one-time-one-password system is then designed. The experimental results demonstrate that the image encryption algorithm based on NCA offers a large key space and high-level security while maintaining acceptable efficiency. Compared with general encryption algorithms such as DES, the encryption algorithm is more secure.
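
    The overall structure, a chaotic map driving a keystream that is XORed with the pixels, can be sketched as below. The sketch deliberately uses a plain logistic map, which is precisely the kind of one-dimensional system whose small key space the paper criticizes; the NCA itself substitutes a power/tangent map whose structural parameters are not given in this abstract:

```python
import numpy as np

def chaotic_xor(image, x0=0.3567, r=3.99, burn_in=100):
    """XOR a uint8 image with a keystream drawn from a chaotic map.
    Applying the function twice with the same key (x0, r) restores the
    image, since XOR is its own inverse."""
    x = x0
    for _ in range(burn_in):        # discard the transient
        x = r * x * (1 - x)
    flat = image.ravel()
    key = np.empty(flat.size, np.uint8)
    for i in range(flat.size):
        x = r * x * (1 - x)         # logistic map iteration (stand-in for NCA)
        key[i] = int(x * 256) % 256
    return (flat ^ key).reshape(image.shape)

rng = np.random.default_rng(6)
img = rng.integers(0, 256, (16, 16), dtype=np.uint8)
cipher = chaotic_xor(img)
plain = chaotic_xor(cipher)   # same key recovers the plaintext
```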

  9. Close Binary Star Speckle Interferometry on the McMath-Pierce 0.8-Meter Solar Telescope

    Science.gov (United States)

    Wiley, Edward; Harshaw, Richard; Jones, Gregory; Branston, Detrick; Boyce, Patrick; Rowe, David; Ridgely, John; Estrada, Reed; Genet, Russell

    2015-09-01

    Observations were made in April 2014 to assess the utility of the 0.8-meter solar telescope at the McMath-Pierce Solar Observatory at Kitt Peak National Observatory for performing speckle interferometry observations of close binary stars. Several configurations using science cameras, acquisition cameras, eyepieces, and flip mirrors were evaluated. Speckle images were obtained and recommendations for further improvement of the acquisition system are presented.

  10. In-situ measurement of the strain distribution in a tensile specimen by using a digital speckle pattern interferometer

    International Nuclear Information System (INIS)

    Park, Seung-Kyu; Baik, Sung-Hoon; Cha, Hyung-Ki; Kim, Young-Suk; Cheong, Yong-Moo

    2010-01-01

    Low sensitivity to environmental vibrations is essential for industrial applications of a digital speckle pattern interferometer (DSPI) measuring micro-deformations. In this paper, a single-fringe DSPI that is robust to mechanical vibrations is designed for measuring the strain distribution of a tensile specimen. The system adopts a noise-immune signal processing algorithm to acquire a 3D strain distribution image. To acquire an accurate strain distribution for a tensile specimen, locally averaged and directionally oriented filters operating in the frequency domain are used. The system uses a path-independent least-squares phase-unwrapping algorithm to acquire the 3D shape of the strain distribution. In initial experiments measuring the strain distribution of a tensile specimen in a vibration field, the system demonstrated its feasibility for industrial applications by providing reliable strain data.
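
    What a phase-unwrapping stage does can be illustrated with the simple path-following (Itoh) method below; the paper's path-independent least-squares unwrapper is more robust to noise, which this noise-free sketch does not attempt to reproduce:

```python
import numpy as np

def unwrap2d(wrapped):
    """Simple path-following 2D phase unwrap (Itoh's method): unwrap the
    first column, then each row, and stitch the rows back onto the column.
    Exact when the true phase gradient stays below pi per pixel."""
    first_col = np.unwrap(wrapped[:, 0])
    rows = np.unwrap(wrapped, axis=1)
    return rows - rows[:, :1] + first_col[:, None]

yy, xx = np.mgrid[0:32, 0:32]
true_phase = 0.3 * xx + 0.2 * yy              # smooth ramp spanning several cycles
wrapped = np.angle(np.exp(1j * true_phase))   # wrapped into (-pi, pi]
recovered = unwrap2d(wrapped)
```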

  11. Infrared speckle observations of Io - an eruption in the Loki region

    International Nuclear Information System (INIS)

    Howell, R.R.; Mcginn, M.T.

    1985-01-01

    Speckle observations of Jupiter's satellite Io at a wavelength of 5 micrometers during July 1984 resolved the disk and showed emission from a hot spot in the Loki region. The hot spot contributed a flux approximately equal to 60 percent of that from the disk. Images reconstructed by means of the Knox-Thompson algorithm showed the spot moving across the disk as the satellite rotated. It was located at 301° ± 6° west longitude, 10° ± 6° north latitude, and had a radiance of (2.96 ± 0.54) × 10²² ergs/sec cm sr/A, where A is the area of the spot. For an assumed temperature of 400 K, the area of the source would be 11,400 square kilometers. An active lava lake similar to that seen by Voyager may be the source of the infrared emission. 10 references

  12. Behaviors study of image registration algorithms in image guided radiation therapy

    International Nuclear Information System (INIS)

    Zou Lian; Hou Qing

    2008-01-01

    Objective: To study the behaviors of image registration algorithms and analyze the elements which influence the performance of image registration. Methods: Known corresponding coordinates were assigned to the reference image and the moving image, and the influence of region of interest (ROI) selection, transformation function initial parameters, and coupled parameter spaces on registration results was studied with a software platform developed in-house. Results: ROI selection had a pronounced influence on registration performance; an improperly chosen ROI resulted in a bad registration. Selecting transformation function initial parameters based on prior information could improve the accuracy of image registration. Coupled parameter spaces enhance the dependence of the image registration algorithm on ROI selection. Conclusions: For clinical IGRT it is necessary to obtain an ROI selection strategy (depending on the specific commercial software) correlated with tumor sites. Three suggestions for image registration technique developers are: automatic selection of the initial parameters of the transformation function based on prior information, developing specific image registration algorithms for specific image features, and assembling real-time image registration algorithms according to the tumor sites selected by the software user. (authors)

  13. Algorithms for contrast enhancement of electronic portal images

    International Nuclear Information System (INIS)

    Díez, S.; Sánchez, S.

    2015-01-01

    An implementation of two new automated image processing algorithms for contrast enhancement of portal images is presented as a suitable tool to facilitate setup verification and visualization of patients during radiotherapy treatments. In the first algorithm, called Automatic Segmentation and Histogram Stretching (ASHS), the portal image is automatically segmented into two sub-images delimited by the conformed treatment beam: one image consisting of the imaged patient obtained directly from the radiation treatment field, and a second composed of the imaged patient outside it. By segmenting the original image, histogram stretching can be performed and improved independently in both regions. The second algorithm involves a two-step process. In the first step, called Normalization to Local Mean (NLM), an inverse restoration filter is applied by dividing the portal image, pixel by pixel, by its blurred version. In the second step, named Linearly Combined Local Histogram Equalization (LCLHE), the contrast of the original image is strongly improved by a Local Contrast Enhancement (LCE) algorithm, revealing the anatomical structures of the patient; the output image is linearly combined with a portal image of the patient. Finally, the output images of the two previous algorithms (NLM and LCLHE) are linearly combined once again in order to obtain a contrast-enhanced image. The two algorithms have been tested on several portal images with good results. - Highlights: • Two algorithms are implemented to improve the contrast of electronic portal images. • The multi-leaf and conformed beam are automatically segmented in portal images. • Hidden anatomical and bony structures in portal images are revealed. • Patient setup verification is facilitated by the achieved contrast enhancement.
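
    The NLM step has a particularly compact form; a sketch under assumed parameters (the Gaussian blur width is chosen arbitrarily, and the paper does not specify its blur kernel here):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def normalize_to_local_mean(img, sigma=15.0):
    """NLM step as described above: divide the portal image, pixel by
    pixel, by a heavily blurred copy of itself (an inverse restoration
    filter that flattens slow intensity trends)."""
    blurred = gaussian_filter(img.astype(float), sigma)
    return img / (blurred + 1e-9)   # small epsilon guards against division by zero

# A flat image normalizes to ~1 everywhere; local detail stands out as
# deviations from 1.
flat = 100.0 * np.ones((64, 64))
out = normalize_to_local_mean(flat)
```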

  14. Improved Bat Algorithm Applied to Multilevel Image Thresholding

    Directory of Open Access Journals (Sweden)

    Adis Alihodzic

    2014-01-01

    Multilevel image thresholding is a very important image processing technique that serves as a basis for image segmentation and further higher-level processing. However, the computational time required for an exhaustive search grows exponentially with the number of desired thresholds. Swarm intelligence metaheuristics are well known as successful and efficient optimization methods for intractable problems. In this paper, we adjusted one of the latest swarm intelligence algorithms, the bat algorithm, to the multilevel image thresholding problem. Testing on standard benchmark images shows that the bat algorithm is comparable with other state-of-the-art algorithms. We then improved the standard bat algorithm, adding elements from differential evolution and from the artificial bee colony algorithm. The proposed improved bat algorithm proved better than five other state-of-the-art algorithms, improving the quality of results in all cases and significantly accelerating convergence.
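
    The objective such metaheuristics maximize is typically Otsu's between-class variance over the image histogram. The sketch below shows that fitness together with the exhaustive baseline whose cost grows combinatorially with the number of thresholds (synthetic three-mode data; the bat-algorithm search loop itself is not reproduced):

```python
import numpy as np
from itertools import combinations

def between_class_variance(hist, thresholds):
    """Otsu's between-class variance for a histogram split at the given
    (sorted) thresholds -- the fitness a multilevel-thresholding
    metaheuristic maximizes."""
    p = hist / hist.sum()
    levels = np.arange(p.size)
    mu_total = (p * levels).sum()
    var = 0.0
    edges = [0, *thresholds, p.size]
    for lo, hi in zip(edges[:-1], edges[1:]):
        w = p[lo:hi].sum()                          # class probability
        if w > 0:
            mu = (p[lo:hi] * levels[lo:hi]).sum() / w   # class mean
            var += w * (mu - mu_total) ** 2
    return var

# Exhaustive baseline: every threshold pair is tried, which is exactly the
# combinatorial cost the swarm metaheuristic avoids.
rng = np.random.default_rng(2)
pixels = np.concatenate([rng.normal(m, 5, 500) for m in (40, 120, 200)])
hist, _ = np.histogram(np.clip(pixels, 0, 255), bins=128, range=(0, 256))
best = max(combinations(range(1, 128), 2),
           key=lambda t: between_class_variance(hist, t))
```

    For three well-separated modes the optimum thresholds land in the gaps between them.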

  15. Quantum Image Steganography and Steganalysis Based On LSQu-Blocks Image Information Concealing Algorithm

    Science.gov (United States)

    A. AL-Salhi, Yahya E.; Lu, Songfeng

    2016-08-01

    Quantum steganography can solve some problems that are considered inefficient in classical image information concealing, and research on quantum image information concealing has been widely pursued in recent years. Quantum image information concealing can be categorized into quantum image digital blocking, quantum image steganography, anonymity, and other branches. Least significant bit (LSB) information concealing plays a vital role in the classical world because many image information concealing algorithms are designed based on it. Firstly, based on the novel enhanced quantum representation (NEQR) and clustering of uniform image blocks, a least significant Qu-block (LSQB) information concealing algorithm for quantum image steganography is presented. Secondly, a clustering algorithm is proposed to optimize the concealment of important data. Thirdly, the Con-Steg algorithm is used to conceal the clustered image blocks. Since information concealing in the Fourier domain of an image can achieve security of the image information, we further discuss a Fourier-domain LSQu-block information concealing algorithm for quantum images based on the quantum Fourier transform. In our algorithms, the corresponding unitary transformations are designed to conceal the secret information in the least significant Qu-block representing the color of the quantum cover image. Finally, the procedures for extracting the secret information are illustrated. The LSQu-block image information concealing algorithm can be applied in many fields according to different needs.
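
    The classical operation being lifted to quantum images is ordinary LSB embedding; for reference, a classical sketch (nothing quantum about it):

```python
import numpy as np

def lsb_embed(cover, bits):
    """Classical least-significant-bit embedding: write the payload bits
    into pixel LSBs, changing each carrier pixel by at most 1."""
    stego = cover.ravel().copy()
    payload = np.asarray(bits, dtype=np.uint8)
    stego[:payload.size] = (stego[:payload.size] & 0xFE) | payload
    return stego.reshape(cover.shape)

def lsb_extract(stego, n):
    """Read back the first n embedded bits."""
    return (stego.ravel()[:n] & 1).tolist()

rng = np.random.default_rng(6)
cover = rng.integers(0, 256, (8, 8), dtype=np.uint8)
bits = [1, 0, 1, 1, 0, 0, 1]
stego = lsb_embed(cover, bits)
```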

  16. A PHOTOMETRIC ANALYSIS OF SEVENTEEN BINARY STARS USING SPECKLE IMAGING

    International Nuclear Information System (INIS)

    Davidson, James W.; Baptista, Brian J.; Horch, Elliott P.; Franz, Otto; Van Altena, William F.

    2009-01-01

    Magnitude differences obtained from speckle imaging are used in combination with other data in the literature to place the components of binary star systems on the H-R diagram. Isochrones are compared with the positions obtained, and a best-fit isochrone is determined for each system, yielding both masses of the components as well as an age range consistent with the system parameters. Seventeen systems are studied, 12 of which were observed with the 0.6 m Lowell-Tololo Telescope at Cerro Tololo Inter-American Observatory and six of which were observed with the WIYN 3.5 m Telescope (The WIYN Observatory is a joint facility of the University of Wisconsin-Madison, Indiana University, Yale University, and the National Optical Astronomy Observatories) at Kitt Peak. One system was observed from both sites. In comparing photometric masses to mass information from orbit determinations, we find that the photometric masses agree very well with the dynamical masses, and are generally more precise. For three systems, no dynamical masses exist at present, and therefore the photometrically determined values are the first mass estimates derived for these components.

  17. Speckle and fringe dynamics in imaging-speckle-pattern interferometry for spatial-filtering velocimetry

    DEFF Research Database (Denmark)

    Jakobsen, Michael Linde; Iversen, Theis F. Q.; Yura, Harold T.

    2011-01-01

    This paper analyzes the dynamics of laser speckles and fringes formed in an imaging-speckle-pattern interferometer, with the purpose of sensing linear three-dimensional motion and out-of-plane components of rotation in real time using optical spatial-filtering-velocimetry techniques. The ensemble-average definition of the cross-correlation function is applied to the intensity distributions obtained in the observation plane at two positions of the object. The theoretical analysis provides a description of the dynamics of both the speckles and the fringes. The analysis reveals that both the magnitude and direction of all three linear displacement components of the object movement can be determined. Simultaneously, out-of-plane rotation of the object, including the corresponding directions, can be determined from the spatial gradient of the in-plane fringe motion throughout the observation plane.

  18. Image-based Proof of Work Algorithm for the Incentivization of Blockchain Archival of Interesting Images

    OpenAIRE

    Billings, Jake

    2017-01-01

    A new variation of blockchain proof of work algorithm is proposed to incentivize the timely execution of image processing algorithms. A sample image processing algorithm is proposed to determine interesting images using analysis of the entropy of pixel subsets within images. The efficacy of the image processing algorithm is examined using two small sets of training and test data. The interesting image algorithm is then integrated into a simplified blockchain mining proof of work algorithm bas...

  19. Optimization-Based Image Segmentation by Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Rosenberger C

    2008-01-01

    Many works in the literature focus on the definition of evaluation metrics and criteria that make it possible to quantify the performance of an image processing algorithm. These evaluation criteria can be used to define new image processing algorithms by optimizing them. In this paper, we propose a general scheme for segmenting images with a genetic algorithm. The developed method uses an evaluation criterion which quantifies the quality of an image segmentation result. The proposed segmentation method can integrate a local ground truth, when available, in order to set the desired level of precision of the final result. A genetic algorithm is then used to determine the best combination of the information extracted by the selected criterion. We show that this approach can be applied to gray-level or multicomponent images, in either a supervised or an unsupervised context. Finally, we show the efficiency of the proposed method through experimental results on several gray-level and multicomponent images.

  20. Optimization-Based Image Segmentation by Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    H. Laurent

    2008-05-01

    Many works in the literature focus on the definition of evaluation metrics and criteria that make it possible to quantify the performance of an image processing algorithm. These evaluation criteria can be used to define new image processing algorithms by optimizing them. In this paper, we propose a general scheme for segmenting images with a genetic algorithm. The developed method uses an evaluation criterion which quantifies the quality of an image segmentation result. The proposed segmentation method can integrate a local ground truth, when available, in order to set the desired level of precision of the final result. A genetic algorithm is then used to determine the best combination of the information extracted by the selected criterion. We show that this approach can be applied to gray-level or multicomponent images, in either a supervised or an unsupervised context. Finally, we show the efficiency of the proposed method through experimental results on several gray-level and multicomponent images.

  1. The SKED: speckle knife edge detector

    International Nuclear Information System (INIS)

    Sharples, S D; Light, R A; Achamfuo-Yeboah, S O; Clark, M; Somekh, M G

    2014-01-01

    The knife edge detector—also known as optical beam deflection—is a simple and robust method of detecting ultrasonic waves using a laser. It is particularly suitable for detection of high frequency surface acoustic waves as the response is proportional to variation of the local tilt of the surface. In the case of a specular reflection of the incident laser beam from a smooth surface, any lateral movement of the reflected beam caused by the ultrasonic waves is easily detected by a pair of photodiodes. The major disadvantage of the knife edge detector is that it does not cope well with optically rough surfaces, those that give a speckled reflection. The optical speckles from a rough surface adversely affect the efficiency of the knife edge detector, because 'dark' speckles move synchronously with 'bright' speckles, and their contributions to the ultrasonic signal cancel each other out. We have developed a new self-adapting sensor which can cope with the optical speckles reflected from a rough surface. It is inelegantly called the SKED—speckle knife edge detector—and like its smooth surface namesake it is simple, cheap, compact, and robust. We describe the theory of its operation, and present preliminary experimental results validating the overall concept and the operation of the prototype device

  2. The SKED: speckle knife edge detector

    Science.gov (United States)

    Sharples, S. D.; Light, R. A.; Achamfuo-Yeboah, S. O.; Clark, M.; Somekh, M. G.

    2014-06-01

    The knife edge detector—also known as optical beam deflection—is a simple and robust method of detecting ultrasonic waves using a laser. It is particularly suitable for detection of high frequency surface acoustic waves as the response is proportional to variation of the local tilt of the surface. In the case of a specular reflection of the incident laser beam from a smooth surface, any lateral movement of the reflected beam caused by the ultrasonic waves is easily detected by a pair of photodiodes. The major disadvantage of the knife edge detector is that it does not cope well with optically rough surfaces, those that give a speckled reflection. The optical speckles from a rough surface adversely affect the efficiency of the knife edge detector, because 'dark' speckles move synchronously with 'bright' speckles, and their contributions to the ultrasonic signal cancel each other out. We have developed a new self-adapting sensor which can cope with the optical speckles reflected from a rough surface. It is inelegantly called the SKED—speckle knife edge detector—and like its smooth surface namesake it is simple, cheap, compact, and robust. We describe the theory of its operation, and present preliminary experimental results validating the overall concept and the operation of the prototype device.

  3. Detection of early carious lesions using contrast enhancement with coherent light scattering (speckle imaging)

    International Nuclear Information System (INIS)

    Deana, A M; Jesus, S H C; Koshoji, N H; Bussadori, S K; Oliveira, M T

    2013-01-01

    Dental caries remains one of the most prevalent chronic diseases, present in most countries. The interaction between light and teeth (absorption, scattering and fluorescence) is intrinsically connected to the constitution of the dental tissue. Decay-induced mineral loss shifts the optical properties of the affected tissue; therefore, studying these properties may yield novel techniques for the early diagnosis of carious lesions. Based on the optical properties of the enamel, we demonstrate the application of first-order spatial statistics in laser speckle imaging, allowing the detection of carious lesions in their early stages. A highlight of this noninvasive, non-destructive, real-time and cost-effective approach is that it allows a dentist to detect a lesion even in the absence of biofilm or moisture. (paper)

  4. Speckle-free and halo-free low coherent Mach-Zehnder quantitative-phase-imaging module as a replacement of objective lens in conventional inverted microscopes

    Science.gov (United States)

    Yamauchi, Toyohiko; Yamada, Hidenao; Matsui, Hisayuki; Yasuhiko, Osamu; Ueda, Yukio

    2018-02-01

    We developed a compact Mach-Zehnder interferometer module to be used as a replacement for the objective lens in a conventional inverted microscope (Nikon, TS100-F), converting it into a quantitative phase microscope. The module has a 90-degree-flipped U-shape; its dimensions are 160 mm by 120 mm by 40 mm and its weight is 380 grams. A Mach-Zehnder interferometer with separate reference and sample arms was implemented in this U-shaped housing, and the path-length difference between the two arms was manually adjustable. The sample under test was placed on the stage of the microscope and the sample light passed through it. Both arms had identical achromatic lenses for image formation, and their lateral positions were also manually adjustable. Temporally and spatially low-coherence illumination was therefore applicable, because users could precisely balance the path lengths of the two arms and overlap the two wavefronts. In the experiment, spectrally filtered LED light (wavelength = 633 nm, bandwidth = 3 nm) was input to the interferometer module via a 50 micrometer core optical fiber. We successfully captured full-field interference images with a camera mounted on the trinocular tube of the microscope and constructed quantitative phase images of cultured cells by means of the quarter-wavelength phase-shifting algorithm. The resultant quantitative phase images were speckle-free and halo-free due to the spectrally and spatially low-coherence illumination.

  5. Tracking speckle displacement by double Kalman filtering

    Institute of Scientific and Technical Information of China (English)

    Donghui Li; Li Guo

    2006-01-01

    A tracking technique using two sequentially connected Kalman filters for tracking laser speckle displacement is presented. One Kalman filter tracks temporal speckle displacement, while the other tracks spatial speckle displacement. The temporal Kalman filter provides a prior for the spatial Kalman filter, and the spatial Kalman filter provides measurements for the temporal Kalman filter. The contribution of the prior to the estimates of the spatial Kalman filter is analyzed. An optical analysis system was set up to verify the double-Kalman-filter tracker's ability to track a constant speckle displacement.
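
    A minimal sketch of the building block used twice in the cascade above: a 1-D constant-velocity Kalman filter tracking a noisy displacement readout. The cascading of the temporal and spatial filters is not reproduced here, and all parameter values are illustrative.

```python
import numpy as np

def kalman_track(measurements, q=1e-4, r=0.25):
    """Minimal 1-D constant-velocity Kalman filter over a sequence of
    noisy position measurements. State is (position, velocity)."""
    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition
    H = np.array([[1.0, 0.0]])               # we only observe position
    Q = q * np.eye(2)                        # process noise covariance
    R = np.array([[r]])                      # measurement noise covariance
    x = np.zeros(2)
    P = np.eye(2)
    estimates = []
    for z in measurements:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([z]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return np.array(estimates)

# Speckle pattern drifting at a constant 0.5 px/frame, noisy position readouts
rng = np.random.default_rng(0)
true_pos = 0.5 * np.arange(100)
meas = true_pos + rng.normal(0, 0.5, 100)
est = kalman_track(meas)
```

    Because the constant-velocity model matches the constant speckle displacement used in the paper's verification, the filtered estimate converges well below the raw measurement noise.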

  6. Speckle-based three-dimensional velocity measurement using spatial filtering velocimetry

    DEFF Research Database (Denmark)

    Iversen, Theis Faber Quist; Jakobsen, Michael Linde; Hanson, Steen Grüner

    2011-01-01

    pattern is formed in the observation plane of the imaging system due to reflection from an area of the object illuminated by a coherent light source. The speckle pattern translates in response to in-plane translation of the object, and the presence of an angular offset reference wave coinciding...

  7. A combinational fast algorithm for image reconstruction

    International Nuclear Information System (INIS)

    Wu Zhongquan

    1987-01-01

    A combinational fast algorithm has been developed to increase the speed of reconstruction. First, an interpolation method based on B-spline functions is used in image reconstruction. Next, the influence of the assumed boundary conditions on the interpolation of filtered projections and on the image reconstruction is discussed. It is shown that these boundary conditions have almost no influence on the image in the central region of the image space, because the interpolation error decreases rapidly, by a factor of ten, within two pixels of the edge toward the center. In addition, a fast algorithm for computing the detecting angle has been combined with the interpolation algorithm, reducing the cost of detecting-angle computation by a factor of two. The implementation results show that, at the same subjective and objective fidelity, the computational cost of interpolation with this algorithm is about one-twelfth that of the conventional algorithm.

  8. Incident Light Frequency-Based Image Defogging Algorithm

    Directory of Open Access Journals (Sweden)

    Wenbo Zhang

    2017-01-01

    Full Text Available To solve the color distortion problem produced by the dark channel prior algorithm, an improved method for calculating the transmittance of each channel separately was proposed in this paper. Based on the Beer-Lambert Law, the influence of the frequency of the incident light on the transmittance was analyzed, and the ratios between each channel's transmittance were derived. Then, to increase efficiency, the input image was resized to a smaller size before acquiring the refined transmittance, which was afterwards resized back to the size of the original image. Finally, all the transmittances were obtained with the help of the proportions between the color channels, and they were used to restore the defogged image. Experiments suggest that the improved algorithm produces a much more natural result image than the original algorithm, eliminating the problem of high color saturation. Moreover, the improved algorithm is four to nine times faster than the original algorithm.

  9. Intraoperative laser speckle contrast imaging for monitoring cerebral blood flow: results from a 10-patient pilot study

    Science.gov (United States)

    Richards, Lisa M.; Weber, Erica L.; Parthasarathy, Ashwin B.; Kappeler, Kaelyn L.; Fox, Douglas J.; Dunn, Andrew K.

    2012-02-01

    Monitoring cerebral blood flow (CBF) during neurosurgery can provide important physiological information for a variety of surgical procedures. Although multiple intraoperative vascular monitoring technologies are currently available, a quantitative method that allows for continuous monitoring is still needed. Laser speckle contrast imaging (LSCI) is an optical imaging method with high spatial and temporal resolution that has been widely used to image CBF in animal models in vivo. In this pilot clinical study, we adapted a Zeiss OPMI Pentero neurosurgical microscope to obtain LSCI images by attaching a camera and a laser diode. This LSCI adapted instrument has been used to acquire full field flow images from 10 patients during tumor resection procedures. The patient's ECG was recorded during acquisition and image registration was performed in post-processing to account for pulsatile motion artifacts. Digital photographs confirmed alignment of vasculature and flow images in four cases, and a relative change in blood flow was observed in two patients after bipolar cautery. The LSCI adapted instrument has the capability to produce real-time, full field CBF image maps with excellent spatial resolution and minimal intervention to the surgical procedure. Results from this study demonstrate the feasibility of using LSCI to monitor blood flow during neurosurgery.
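
    The core LSCI computation is the local speckle contrast K = σ/μ over a small sliding window; lower K indicates blurring of the speckle by flow during the exposure. A minimal sketch of the standard formulation (not the authors' surgical pipeline or registration steps):

```python
import numpy as np

def speckle_contrast_map(img, w=7):
    """Local speckle contrast K = std/mean over a w x w sliding window,
    the basic quantity behind laser speckle contrast imaging."""
    img = img.astype(float)
    pad = w // 2
    padded = np.pad(img, pad, mode="reflect")
    # stack all w*w shifted views and reduce; simple but O(w^2) memory
    windows = np.stack([
        padded[i:i + img.shape[0], j:j + img.shape[1]]
        for i in range(w) for j in range(w)
    ])
    mean = windows.mean(axis=0)
    std = windows.std(axis=0)
    return np.where(mean > 0, std / mean, 0.0)

# Fully developed static speckle has K ~ 1; temporal averaging (flow) lowers K
rng = np.random.default_rng(0)
static = rng.exponential(1.0, (128, 128))                 # negative-exponential intensity
flow = (static + rng.exponential(1.0, (128, 128))) / 2    # blur from motion
K_static = speckle_contrast_map(static).mean()
K_flow = speckle_contrast_map(flow).mean()
```

    In a clinical map such as the one described above, regions of higher flow (vessels) show up as regions of lower K.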

  10. (Non-) homomorphic approaches to denoise intensity SAR images with non-local means and stochastic distances

    Science.gov (United States)

    Penna, Pedro A. A.; Mascarenhas, Nelson D. A.

    2018-02-01

    The development of new methods to denoise images still attracts researchers, who seek to combat noise with minimal loss of resolution and detail, such as edges and fine structures. Many algorithms aim to remove additive white Gaussian noise (AWGN). However, it is not the only type of noise that interferes with the analysis and interpretation of images. It is therefore extremely important to extend the capacity of these filters to the different noise models present in the literature, for example the multiplicative noise called speckle that is present in synthetic aperture radar (SAR) images. The state-of-the-art algorithms in the remote sensing area work with similarity between patches. This paper develops two approaches based on the non-local means (NLM) filter, originally developed for AWGN, extending its capacity to speckle in intensity SAR images. The first approach is grounded in the use of stochastic distances based on the G0 distribution, without transforming the data to the logarithm domain as in the homomorphic transformation. It takes the speckle and backscatter into account to estimate the parameters needed to compute the stochastic distances in NLM. The second method uses a priori NLM denoising with a homomorphic transformation and applies the inverse Gamma distribution to estimate the parameters used in NLM with stochastic distances. The latter method also presents a new alternative for computing the parameters of the G0 distribution. Finally, this work compares and analyzes the synthetic and real results of the proposed methods against some recent filters from the literature.
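
    The baseline that both approaches extend is plain non-local means, which averages pixels weighted by patch similarity. The sketch below uses a Euclidean patch distance on simulated multiplicative speckle; the papers above replace this distance with stochastic distances derived from the G0 and inverse Gamma models, which is the key difference.

```python
import numpy as np

def nlm_denoise(img, patch=3, search=7, h=0.6):
    """Plain non-local means with Euclidean patch distances (the stochastic
    distances used in the paper would replace the d2 computation below)."""
    pr, sr = patch // 2, search // 2
    padded = np.pad(img, pr + sr, mode="reflect")
    out = np.zeros_like(img, dtype=float)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            ii, jj = i + pr + sr, j + pr + sr
            ref = padded[ii - pr:ii + pr + 1, jj - pr:jj + pr + 1]
            wsum, acc = 0.0, 0.0
            for di in range(-sr, sr + 1):
                for dj in range(-sr, sr + 1):
                    cand = padded[ii + di - pr:ii + di + pr + 1,
                                  jj + dj - pr:jj + dj + pr + 1]
                    d2 = ((ref - cand) ** 2).mean()   # patch dissimilarity
                    w = np.exp(-d2 / (h * h))
                    wsum += w
                    acc += w * padded[ii + di, jj + dj]
            out[i, j] = acc / wsum
    return out

# Multiplicative speckle on a two-level cartoon image (4-look-like noise)
rng = np.random.default_rng(0)
clean = np.ones((32, 32)); clean[:, 16:] = 3.0
noisy = clean * rng.gamma(4.0, 1.0 / 4.0, clean.shape)
den = nlm_denoise(noisy)
```

    Even with the generic distance, the patch weights keep the step edge sharp while averaging within homogeneous regions; the stochastic distances make the weights statistically matched to the speckle model.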

  11. Comparison of SeaWinds Backscatter Imaging Algorithms

    Science.gov (United States)

    Long, David G.

    2017-01-01

    This paper compares the performance and tradeoffs of various backscatter imaging algorithms for the SeaWinds scatterometer when multiple passes over a target are available. Reconstruction methods are compared with conventional gridding algorithms. In particular, the performance and tradeoffs in conventional ‘drop in the bucket’ (DIB) gridding at the intrinsic sensor resolution are compared to high-spatial-resolution imaging algorithms such as fine-resolution DIB and the scatterometer image reconstruction (SIR) that generate enhanced-resolution backscatter images. Various options for each algorithm are explored, including considering both linear and dB computation. The effects of sampling density and reconstruction quality versus time are explored. Both simulated and actual data results are considered. The results demonstrate the effectiveness of high-resolution reconstruction using SIR as well as its limitations and the limitations of DIB and fDIB. PMID:28828143
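
    'Drop in the bucket' gridding simply averages every measurement whose location falls inside a grid cell. A minimal sketch on synthetic scatterometer-like data (the cell edges and the σ0 model below are invented for illustration, and averaging is done in dB; the paper also examines averaging in linear power):

```python
import numpy as np

def dib_grid(lats, lons, sigma0, lat_edges, lon_edges):
    """'Drop in the bucket' gridding: average each measurement into the
    single cell containing its location."""
    ny, nx = len(lat_edges) - 1, len(lon_edges) - 1
    acc = np.zeros((ny, nx))
    cnt = np.zeros((ny, nx))
    iy = np.clip(np.digitize(lats, lat_edges) - 1, 0, ny - 1)
    ix = np.clip(np.digitize(lons, lon_edges) - 1, 0, nx - 1)
    np.add.at(acc, (iy, ix), sigma0)   # unbuffered accumulation per cell
    np.add.at(cnt, (iy, ix), 1)
    return np.where(cnt > 0, acc / np.maximum(cnt, 1), np.nan)

# Synthetic pass: backscatter is a function of latitude only, plus noise
rng = np.random.default_rng(0)
lats = rng.uniform(0, 10, 5000)
lons = rng.uniform(0, 10, 5000)
sigma0 = -20 + lats + rng.normal(0, 0.5, 5000)   # dB
grid = dib_grid(lats, lons, sigma0,
                np.linspace(0, 10, 11), np.linspace(0, 10, 11))
```

    Reconstruction methods such as SIR instead solve for the image that best explains all overlapping measurements, trading noise amplification for resolution beyond the cell size.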

  12. Sound recovery via intensity variations of speckle pattern pixels selected with variance-based method

    Science.gov (United States)

    Zhu, Ge; Yao, Xu-Ri; Qiu, Peng; Mahmood, Waqas; Yu, Wen-Kai; Sun, Zhi-Bin; Zhai, Guang-Jie; Zhao, Qing

    2018-02-01

    In general, sound waves cause the vibration of objects encountered along their traveling path. If a laser beam illuminates the rough surface of such an object, it is scattered into a speckle pattern that vibrates with these sound waves. Here, an efficient variance-based method is proposed to recover the sound information from speckle patterns captured by a high-speed camera. This method selects the pixels that have large variances of gray-value variation over time, from a small region of the speckle patterns. The gray-value variations of these pixels are summed together according to a simple model to recover the sound with a high signal-to-noise ratio. Meanwhile, our method significantly simplifies the computation compared with the traditional digital-image-correlation technique. The effectiveness of the proposed method has been verified on a variety of objects. The experimental results illustrate that the proposed method is robust to the quality of the speckle patterns and requires over an order of magnitude less time to process the same number of speckle patterns. In our experiment, a sound signal lasting 1.876 s is recovered from various objects in only 5.38 s of computation.
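
    The selection idea can be sketched in a few lines: rank pixels by the temporal variance of their gray values, keep the top ones, and sum their variations. The sign-alignment step below is my addition so that the simulated traces add constructively; it stands in for the paper's summation model.

```python
import numpy as np

def recover_sound(frames, n_pixels=50):
    """Variance-based recovery: pick the pixels whose gray values vary most
    over time and sum their (sign-aligned) variations."""
    T = frames.shape[0]
    flat = frames.reshape(T, -1).astype(float)
    var = flat.var(axis=0)
    idx = np.argsort(var)[-n_pixels:]                 # highest-variance pixels
    traces = flat[:, idx] - flat[:, idx].mean(axis=0)
    # align signs against the strongest pixel so traces add constructively
    ref = traces[:, -1]
    signs = np.sign(traces.T @ ref)
    return (traces * signs).sum(axis=1)

# Simulate speckle frames whose intensities are modulated by a 440 Hz tone
fs, T = 4000, 400
t = np.arange(T) / fs
tone = np.sin(2 * np.pi * 440 * t)
rng = np.random.default_rng(0)
gains = rng.normal(0, 1, 100)                         # per-pixel modulation depth
frames = 100 + np.outer(tone, gains) * 20 + rng.normal(0, 1, (T, 100))
frames = frames.reshape(T, 10, 10)
rec = recover_sound(frames)
corr = np.corrcoef(rec, tone)[0, 1]
```

    Because only a handful of pixels are summed, the per-frame cost is far below full digital image correlation, which is the speed advantage the abstract reports.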

  13. Performance evaluation of spatial compounding in the presence of aberration and adaptive imaging

    Science.gov (United States)

    Dahl, Jeremy J.; Guenther, Drake; Trahey, Gregg E.

    2003-05-01

    Spatial compounding has been used for years to reduce speckle in ultrasonic images and to resolve anatomical features hidden behind the grainy appearance of speckle. Adaptive imaging restores image contrast and resolution by compensating for beamforming errors caused by tissue-induced phase errors. Spatial compounding represents a form of incoherent imaging, whereas adaptive imaging attempts to maintain a coherent, diffraction-limited aperture in the presence of aberration. Using a Siemens Antares scanner, we acquired single channel RF data on a commercially available 1-D probe. Individual channel RF data was acquired on a cyst phantom in the presence of a near field electronic phase screen. Simulated data was also acquired for both a 1-D and a custom built 8x96, 1.75-D probe (Tetrad Corp.). The data was compounded using a receive spatial compounding algorithm; a widely used algorithm because it takes advantage of parallel beamforming to avoid reductions in frame rate. Phase correction was also performed by using a least mean squares algorithm to estimate the arrival time errors. We present simulation and experimental data comparing the performance of spatial compounding to phase correction in contrast and resolution tasks. We evaluate spatial compounding and phase correction, and combinations of the two methods, under varying aperture sizes, aperture overlaps, and aberrator strength to examine the optimum configuration and conditions in which spatial compounding will provide a similar or better result than adaptive imaging. We find that, in general, phase correction is hindered at high aberration strengths and spatial frequencies, whereas spatial compounding is helped by these aberrators.
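
    Why compounding reduces speckle can be seen in a toy experiment: averaging N looks carrying independent, fully developed speckle lowers the speckle contrast (std/mean) from about 1 to about 1/sqrt(N). This is the incoherent-averaging effect the study weighs against coherent phase correction.

```python
import numpy as np

# Incoherent compounding of N independent speckle realizations.
rng = np.random.default_rng(0)
N = 4
looks = rng.exponential(1.0, (N, 256, 256))   # fully developed speckle intensity
single = looks[0]
compounded = looks.mean(axis=0)

c_single = single.std() / single.mean()        # speckle contrast of one look
c_comp = compounded.std() / compounded.mean()  # reduced by ~1/sqrt(N)
```

    In real spatial compounding the looks come from overlapping subapertures, so they are only partially decorrelated and the reduction is somewhat less than 1/sqrt(N); the tradeoff is the loss of coherent aperture, which is what adaptive imaging tries to preserve.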

  14. Enhanced imaging of microcalcifications in digital breast tomosynthesis through improved image-reconstruction algorithms

    International Nuclear Information System (INIS)

    Sidky, Emil Y.; Pan Xiaochuan; Reiser, Ingrid S.; Nishikawa, Robert M.; Moore, Richard H.; Kopans, Daniel B.

    2009-01-01

    Purpose: The authors develop a practical, iterative algorithm for image-reconstruction in undersampled tomographic systems, such as digital breast tomosynthesis (DBT). Methods: The algorithm controls image regularity by minimizing the image total p variation (TpV), a function that reduces to the total variation when p=1.0 or the image roughness when p=2.0. Constraints on the image, such as image positivity and estimated projection-data tolerance, are enforced by projection onto convex sets. The fact that the tomographic system is undersampled translates to the mathematical property that many widely varied resultant volumes may correspond to a given data tolerance. Thus the application of image regularity serves two purposes: (1) Reduction in the number of resultant volumes out of those allowed by fixing the data tolerance, finding the minimum image TpV for fixed data tolerance, and (2) traditional regularization, sacrificing data fidelity for higher image regularity. The present algorithm allows for this dual role of image regularity in undersampled tomography. Results: The proposed image-reconstruction algorithm is applied to three clinical DBT data sets. The DBT cases include one with microcalcifications and two with masses. Conclusions: Results indicate that there may be a substantial advantage in using the present image-reconstruction algorithm for microcalcification imaging.
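
    Two ingredients of the algorithm are easy to state in code: the total p-variation, which reduces to TV at p = 1 and to image roughness at p = 2, and positivity enforcement by projection onto the convex set {u >= 0}. The sketch below shows only these pieces; the full iterative reconstruction with projection-data tolerance is beyond this snippet.

```python
import numpy as np

def total_p_variation(u, p, eps=1e-8):
    """Image total p-variation: sum of gradient magnitudes raised to p.
    p=1.0 gives the total variation, p=2.0 the image roughness."""
    dx = np.diff(u, axis=0, append=u[-1:, :])
    dy = np.diff(u, axis=1, append=u[:, -1:])
    mag2 = dx ** 2 + dy ** 2 + eps        # eps keeps the p/2 power smooth at 0
    return (mag2 ** (p / 2.0)).sum()

def project_positive(u):
    """Projection onto the convex set {u >= 0}, one of the POCS constraints."""
    return np.maximum(u, 0.0)

# A sharp-edged image has far lower TV (p=1) than a noisy one: minimizing
# TpV at fixed data tolerance therefore selects the regular volume.
rng = np.random.default_rng(0)
step = np.zeros((32, 32)); step[:, 16:] = 1.0
noisy = step + rng.normal(0, 0.3, step.shape)
tv_step = total_p_variation(step, p=1.0)
tv_noisy = total_p_variation(noisy, p=1.0)
```

    In the reconstruction loop these are combined: gradient steps that decrease TpV alternate with projections enforcing positivity and the estimated data tolerance.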

  15. Adaptive Algorithms for Automated Processing of Document Images

    Science.gov (United States)

    2011-01-01

    Title of dissertation: Adaptive Algorithms for Automated Processing of Document Images. Mudit Agrawal, Doctor of Philosophy, 2011. Dissertation submitted to the Faculty of the Graduate School of the University...

  16. High speed display algorithm for 3D medical images using Multi Layer Range Image

    International Nuclear Information System (INIS)

    Ban, Hideyuki; Suzuki, Ryuuichi

    1993-01-01

    We propose a high-speed algorithm that displays 3D voxel images obtained from medical imaging systems such as MRI. The algorithm converts the voxel image data to six Multi Layer Range Image (MLRI) data sets, an augmentation of the range image data. To avoid calculations for invisible voxels, the algorithm selects at most three of the six MLRI data sets in accordance with the view direction. The proposed algorithm displays 256 x 256 x 256 voxel data within 0.6 seconds on a 22 MIPS workstation without special hardware such as a graphics engine. Real-time display will be possible on a 100 MIPS-class workstation with our algorithm. (author)

  17. Real time processor for array speckle interferometry

    Science.gov (United States)

    Chin, Gordon; Florez, Jose; Borelli, Renan; Fong, Wai; Miko, Joseph; Trujillo, Carlos

    1989-02-01

    The authors are constructing a real-time processor to acquire image frames, perform array flat-fielding, execute a 64 x 64 element two-dimensional complex FFT (fast Fourier transform) and average the power spectrum, all within the 25 ms coherence time for speckles at near-IR (infrared) wavelength. The processor will be a compact unit controlled by a PC with real-time display and data storage capability. This will provide the ability to optimize observations and obtain results on the telescope rather than waiting several weeks before the data can be analyzed and viewed with offline methods. The image acquisition and processing, design criteria, and processor architecture are described.

  18. Finding small displacements of recorded speckle patterns: revisited

    DEFF Research Database (Denmark)

    Hanson, Steen Grüner; Jakobsen, Michael Linde; Chakrabarti, Maumita

    2015-01-01

    An analytical expression for the bias effect in digital speckle correlation is derived based on a Gaussian approximation of the spatial pixel size and array extent. The evaluation is carried out having assumed an incident speckle field. The analysis is focused on speckle displacements in the order...

  19. Twofold processing for denoising ultrasound medical images.

    Science.gov (United States)

    Kishore, P V V; Kumar, K V V; Kumar, D Anil; Prasad, M V D; Goutham, E N D; Rahul, R; Krishna, C B S Vamsi; Sandeep, Y

    2015-01-01

    Ultrasound (US) medical imaging non-invasively pictures the inside of the human body for disease diagnostics. Speckle noise corrupts ultrasound images, degrading their visual quality. A twofold processing algorithm is proposed in this work to reduce this multiplicative speckle noise. The first fold applies block-based thresholding, both hard (BHT) and soft (BST), to pixels in the wavelet domain with 8, 16, 32 and 64 non-overlapping block sizes. This first-fold process reduces speckle effectively but also blurs the object of interest. The second-fold process restores object boundaries and texture with adaptive wavelet fusion: the degraded object in the block-thresholded US image is restored through wavelet-coefficient fusion of the object in the original US image and the block-thresholded US image. Fusion rules and wavelet decomposition levels are made adaptive for each block using gradient histograms with normalized differential mean (NDF) to introduce the highest level of contrast between the denoised pixels and the object pixels in the resultant image. The proposed twofold methods are thus named adaptive NDF block fusion with hard and soft thresholding (ANBF-HT and ANBF-ST). The results indicate a notable visual quality improvement with the proposed twofold processing, where the first fold removes noise and the second fold restores object properties. Peak signal to noise ratio (PSNR), normalized cross correlation coefficient (NCC), edge strength (ES), image quality index (IQI) and structural similarity index (SSIM) measure the quantitative quality of the twofold processing technique. The proposed method is validated by comparison with anisotropic diffusion (AD), total variational filtering (TVF) and empirical mode decomposition (EMD) for enhancement of US images. The US images were provided by the AMMA hospital radiology labs at Vijayawada, India.
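
    The first fold can be sketched with a hand-rolled one-level Haar transform and per-block soft thresholding of the detail bands. The block-size sweep, hard-thresholding variant, and second-fold fusion are omitted, and the per-block MAD-based threshold is my choice, not necessarily the authors'.

```python
import numpy as np

def haar2(u):
    """One-level 2D Haar transform, returning (LL, LH, HL, HH) quarters."""
    a = (u[0::2] + u[1::2]) / 2.0          # row-pair averages
    d = (u[0::2] - u[1::2]) / 2.0          # row-pair differences
    ll, lh = (a[:, 0::2] + a[:, 1::2]) / 2.0, (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl, hh = (d[:, 0::2] + d[:, 1::2]) / 2.0, (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def ihaar2(ll, lh, hl, hh):
    """Exact inverse of haar2."""
    H2, W2 = ll.shape
    a = np.empty((H2, 2 * W2)); d = np.empty((H2, 2 * W2))
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    u = np.empty((2 * H2, 2 * W2))
    u[0::2], u[1::2] = a + d, a - d
    return u

def soft(x, t):
    """Soft thresholding: shrink coefficients toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def block_soft_denoise(img, block=16, k=1.0):
    """Blockwise soft thresholding of Haar detail coefficients; each block
    gets its own MAD-based threshold (a simplified stand-in for BST)."""
    ll, lh, hl, hh = haar2(img)
    for band in (lh, hl, hh):
        for i in range(0, band.shape[0], block):
            for j in range(0, band.shape[1], block):
                blk = band[i:i + block, j:j + block]
                t = k * np.median(np.abs(blk)) / 0.6745   # per-block noise estimate
                band[i:i + block, j:j + block] = soft(blk, t)
    return ihaar2(ll, lh, hl, hh)

# Multiplicative speckle on a smooth ramp image
rng = np.random.default_rng(0)
clean = 1.0 + np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
noisy = clean * rng.gamma(10.0, 0.1, clean.shape)
denoised = block_soft_denoise(noisy, block=16, k=1.0)
```

    The second fold would then fuse coefficients of `denoised` and `noisy` per block to restore blurred object detail.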

  20. Algorithms for boundary detection in radiographic images

    International Nuclear Information System (INIS)

    Gonzaga, Adilson; Franca, Celso Aparecido de

    1996-01-01

    Edge-detection techniques applied to radiographic digital images are discussed. Some algorithms have been implemented and the results are displayed to enhance boundaries or hide details. An algorithm applied to a preprocessed, contrast-enhanced image is proposed and the results are discussed.

  1. Digital Image Encryption Algorithm Design Based on Genetic Hyperchaos

    Directory of Open Access Journals (Sweden)

    Jian Wang

    2016-01-01

    Full Text Available Current chaotic image encryption algorithms based on scrambling and diffusion are vulnerable to chosen-plaintext (chosen-ciphertext) attacks in the pixel-position scrambling process; we therefore put forward an image encryption algorithm based on a genetic hyperchaotic system. By introducing plaintext feedback into the scrambling process, the algorithm makes the scrambling effect depend on both the initial chaotic sequence and the plaintext itself, achieving an organic fusion of the image features and the encryption algorithm. Introducing a plaintext feedback mechanism into the diffusion process improves plaintext sensitivity and the algorithm's resistance to chosen-plaintext and chosen-ciphertext attacks, while making full use of the characteristics of the image information. Finally, experimental simulation and theoretical analysis show that the proposed algorithm not only effectively resists chosen-plaintext (chosen-ciphertext) attacks, statistical attacks, and information-entropy attacks, but also improves the efficiency of image encryption, making it a relatively secure and effective approach to image communication.
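
    The plaintext-feedback idea can be sketched with a logistic map standing in for the hyperchaotic system: a hash of the image perturbs the chaotic seed, so both the scrambling permutation and the diffusion keystream depend on the plaintext itself. The construction below is illustrative, not the paper's cipher; a real decoder would need the hash value transmitted alongside the ciphertext.

```python
import hashlib
import numpy as np

def logistic_seq(x0, n, r=3.99):
    """Chaotic keystream from the logistic map (a stand-in for the
    hyperchaotic system of the paper)."""
    xs = np.empty(n)
    for i in range(n):
        x0 = r * x0 * (1 - x0)
        xs[i] = x0
    return xs

def encrypt(img, key=0.3456):
    """Scrambling + diffusion with plaintext feedback: the chaotic seed is
    perturbed by a hash of the image, so the permutation and keystream
    depend on the plaintext (illustrative construction)."""
    flat = img.flatten().astype(np.uint8)
    fb = int(hashlib.sha256(flat.tobytes()).hexdigest(), 16) % 10**6 / 10**7
    seed = (key + fb) % 1.0
    perm = np.argsort(logistic_seq(seed, flat.size))   # plaintext-keyed scrambling
    scrambled = flat[perm]
    ks = (logistic_seq((seed + 0.1) % 1.0, flat.size) * 256).astype(np.uint8)
    out = np.empty_like(scrambled)
    prev = np.uint8(0)
    for i, p in enumerate(scrambled):                  # ciphertext-feedback diffusion
        out[i] = p ^ ks[i] ^ prev
        prev = out[i]
    return out.reshape(img.shape), seed

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (16, 16), dtype=np.uint8)
ct1, _ = encrypt(img)
img2 = img.copy(); img2[0, 0] ^= 1                     # flip one plaintext bit
ct2, _ = encrypt(img2)
diff = (ct1 != ct2).mean()
```

    Flipping a single plaintext bit changes the hash, hence the seed, permutation, and keystream, so nearly every ciphertext byte differs; this avalanche is what defeats the chosen-plaintext attack on fixed scrambling.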

  2. G0-WISHART Distribution Based Classification from Polarimetric SAR Images

    Science.gov (United States)

    Hu, G. C.; Zhao, Q. H.

    2017-09-01

    Enormous scientific and technical developments have been carried out over recent decades to further improve remote sensing, particularly the Polarimetric Synthetic Aperture Radar (PolSAR) technique, so classification methods based on PolSAR images have received much attention from scholars and related departments around the world. The multilook polarimetric G0-Wishart model is a flexible model that describes homogeneous, heterogeneous and extremely heterogeneous regions in an image. Moreover, the polarimetric G0-Wishart distribution does not include the modified Bessel function of the second kind; it is a simple statistical distribution model with few parameters. To prove its feasibility, a classification process was tested on a fully polarized Synthetic Aperture Radar (SAR) image with this method. First, multilook polarimetric SAR data processing and a speckle filter are applied to reduce the influence of speckle on the classification result. The image is initially classified into sixteen classes by H/A/α decomposition. The ICM algorithm then classifies features based on the G0-Wishart distance. Qualitative and quantitative results show that the proposed method can classify polarimetric SAR data effectively and efficiently.

  3. A Multiresolution Image Completion Algorithm for Compressing Digital Color Images

    Directory of Open Access Journals (Sweden)

    R. Gomathi

    2014-01-01

    Full Text Available This paper introduces a new framework for image coding that uses an image inpainting method. In the proposed algorithm, the input image is subjected to image analysis to purposefully remove some of its portions. At the same time, edges are extracted from the input image and passed to the decoder in compressed form. The edges transmitted to the decoder serve as auxiliary information, helping the inpainting process fill in the missing regions at the decoder. Textural synthesis and a new shearlet inpainting scheme based on the theory of the p-Laplacian operator are proposed for image restoration at the decoder. Shearlets have been mathematically proven to represent distributed discontinuities such as edges better than traditional wavelets and are a suitable tool for edge characterization. This novel shearlet p-Laplacian inpainting model can effectively reduce the staircase effect of the Total Variation (TV) inpainting model while still preserving edges as well as the TV model does. In the proposed scheme, a neural network is employed to improve the compression ratio for image coding. Test results are compared with the JPEG 2000 and H.264 Intra coding algorithms. The results show that the proposed algorithm works well.

  4. Twisted speckle entities inside wave-front reversal mirrors

    International Nuclear Information System (INIS)

    Okulov, A. Yu

    2009-01-01

    A previously unknown property of optical speckle patterns is reported. The interference of a speckle wave with the counterpropagating phase-conjugated (PC) speckle wave produces a randomly distributed ensemble of twisted entities (ropes) surrounding optical vortex lines. These entities appear for a wide range of randomly chosen speckle parameters inside phase-conjugating mirrors, regardless of the internal physical mechanism of the wave-front reversal. The numerically generated interference patterns are relevant to Brillouin PC mirrors and to four-wave-mixing PC mirrors based upon a laser-trapped ultracold atomic cloud.

  5. Type I Diabetic Akita Mouse Model is Characterized by Abnormal Cardiac Deformation During Early Stages of Diabetic Cardiomyopathy with Speckle-Tracking Based Strain Imaging.

    Science.gov (United States)

    Zhou, Yingchao; Xiao, Hong; Wu, Jianfei; Zha, Lingfeng; Zhou, Mengchen; Li, Qianqian; Wang, Mengru; Shi, Shumei; Li, Yanze; Lyu, Liangkun; Wang, Qing; Tu, Xin; Lu, Qiulun

    2018-01-01

    Diabetes mellitus (DM) has been demonstrated to have a strong association with heart failure. Conventional echocardiographic analysis cannot sensitively monitor cardiac dysfunction in type I diabetic Akita hearts, but the phenotype of heart failure is observed at the molecular level during the early stages. Male Akita (Ins2WT/C96Y) mice were monitored with echocardiographic imaging at various ages, with both conventional echocardiographic analysis and speckle-tracking based strain analyses. With speckle-tracking based strain analyses, diabetic Akita mice showed changes in average global radial strain at the age of 12 weeks, as well as decreased longitudinal strain. These changes occurred in the early stage and remained throughout the progression of diabetic cardiomyopathy in Akita mice. Speckle-tracking showed that the detailed and precise changes of cardiac deformation in the progression of diabetic cardiomyopathy in the genetic type I diabetic Akita mice were uncoupled. We monitored early-stage changes in the heart of diabetic Akita mice and utilize this technique to elucidate the underlying mechanism of heart failure in Akita genetic type I diabetic mice. It will further advance the assessment of cardiac abnormalities, as well as the discovery of new drug treatments, using Akita genetic type I diabetic mice. © 2018 The Author(s). Published by S. Karger AG, Basel.

  6. Speckle noise reduction in breast ultrasound images: SMU (SRAD median unsharp) approach

    International Nuclear Information System (INIS)

    Njeh, I.; Sassi, O. B.; Ben Hamida, A.; Chtourou, K.

    2011-01-01

    Image denoising has become essential for better information extraction from images, especially from heavily noised ones such as ultrasound images. In some cases, for instance in ultrasound images, the noise can obscure information that is valuable to the general practitioner. Medical images are consequently very inconsistent, and it is crucial to operate case by case. This paper presents a novel algorithm, SMU (SRAD Median Unsharp), for noise suppression in ultrasound breast images, in order to realize computer-aided diagnosis (CAD) for breast cancer.
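
    A reduced sketch of the pipeline's median and unsharp stages on simulated speckle (the SRAD anisotropic-diffusion stage is omitted, and all parameter values are illustrative, not the authors'):

```python
import numpy as np

def median3(img):
    """3x3 median filter (the 'Median' stage of the chain)."""
    p = np.pad(img, 1, mode="edge")
    stack = np.stack([p[i:i + img.shape[0], j:j + img.shape[1]]
                      for i in range(3) for j in range(3)])
    return np.median(stack, axis=0)

def unsharp(img, amount=0.7):
    """Unsharp masking: add back a scaled high-pass of the image.
    The median filter is reused here as the smoother."""
    blur = median3(img)
    return img + amount * (img - blur)

def smu_lite(img):
    """Median -> Unsharp pipeline (SRAD stage omitted)."""
    return unsharp(median3(img))

# Speckle-like multiplicative noise on a simple phantom with one bright lesion
rng = np.random.default_rng(0)
clean = np.ones((64, 64)); clean[24:40, 24:40] = 2.0
noisy = clean * rng.gamma(9.0, 1.0 / 9.0, clean.shape)
out = smu_lite(noisy)
```

    The median stage suppresses the multiplicative speckle while largely preserving the lesion boundary, and the unsharp stage restores some of the edge contrast the smoothing removed.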

  7. Ultrasound Vector Flow Imaging: Part I: Sequential Systems

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt; Nikolov, Svetoslav Ivanov; Yu, Alfred C. H.

    2016-01-01

    The paper gives a review of the most important methods for blood velocity vector flow imaging (VFI) for conventional, sequential data acquisition. This includes multibeam methods, speckle tracking, transverse oscillation, color flow mapping derived vector flow imaging, directional beamforming, and variants of these. The review covers both 2-D and 3-D velocity estimation and gives a historical perspective on the development along with a summary of various vector flow visualization algorithms. The current state-of-the-art is explained along with an overview of clinical studies conducted and methods...

  8. Statistics of spatially integrated speckle intensity difference

    DEFF Research Database (Denmark)

    Hanson, Steen Grüner; Yura, Harold

    2009-01-01

    We consider the statistics of the spatially integrated speckle intensity difference obtained from two separated finite collecting apertures. For fully developed speckle, closed-form analytic solutions for both the probability density function and the cumulative distribution function are derived here for both arbitrary values of the mean number of speckles contained within an aperture and the degree of coherence of the optical field. Additionally, closed-form expressions are obtained for the corresponding nth statistical moments.

  9. The optimal algorithm for Multi-source RS image fusion.

    Science.gov (United States)

    Fu, Wei; Huang, Shui-Guang; Li, Zeng-Shun; Shen, Hao; Li, Jun-Shuai; Wang, Peng-Yuan

    2016-01-01

    In order to solve the issue that the fusion rules cannot be self-adaptively adjusted by available fusion methods according to the subsequent processing requirements of Remote Sensing (RS) images, this paper puts forward GSDA (genetic-iterative self-organizing data analysis algorithm), which integrates the merits of genetic algorithms with those of the iterative self-organizing data analysis algorithm for multi-source RS image fusion. The proposed algorithm takes the translation-invariant wavelet transform as the model operator and the contrast pyramid transform as the observed operator. It then designs the objective function as a weighted sum of evaluation indices and optimizes this objective with GSDA so as to obtain a higher-resolution RS image. The main points of the text are summarized as follows.
    • The contribution proposes the iterative self-organizing data analysis algorithm for multi-source RS image fusion.
    • This article presents the GSDA algorithm for self-adaptive adjustment of the fusion rules.
    • This text proposes the model operator and the observed operator as the fusion scheme of RS images based on GSDA.
    The proposed algorithm opens up a novel algorithmic pathway for multi-source RS image fusion by means of GSDA.
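The GSDA procedure itself is not spelled out in the abstract, so the following is only a generic sketch of the idea it describes: score candidate fusion rules by an evaluation objective (here a single variance index stands in for the weighted sum of indices) and refine them with a small genetic-style select-and-mutate loop. All names, the single-weight fusion rule, and the parameters are illustrative assumptions.

```python
import random

def fuse(a, b, w):
    """Weighted pixel-wise fusion of two (flattened) source images."""
    return [w * x + (1 - w) * y for x, y in zip(a, b)]

def variance(v):
    m = sum(v) / len(v)
    return sum((x - m) ** 2 for x in v) / len(v)

def search_weight(a, b, generations=30, pop=8, seed=0):
    """Genetic-style select-and-mutate search for the fusion weight
    that maximizes the (here single-index) evaluation objective."""
    rng = random.Random(seed)
    ws = [rng.random() for _ in range(pop)]
    for _ in range(generations):
        ws.sort(key=lambda w: -variance(fuse(a, b, w)))  # rank by objective
        parents = ws[:pop // 2]                          # selection
        ws = parents + [min(1.0, max(0.0, w + rng.gauss(0, 0.1)))
                        for w in parents]                # mutation
    return max(ws, key=lambda w: variance(fuse(a, b, w)))
```

Because the best candidate always survives selection, the objective value of the returned weight is non-decreasing over generations.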

  10. Medical image segmentation using genetic algorithms.

    Science.gov (United States)

    Maulik, Ujjwal

    2009-03-01

    Genetic algorithms (GAs) have been found to be effective in the domain of medical image segmentation, since the problem can often be mapped to one of search in a complex and multimodal landscape. The challenges in medical image segmentation arise due to poor image contrast and artifacts that result in missing or diffuse organ/tissue boundaries. The resulting search space is therefore often noisy with a multitude of local optima. Not only does the genetic algorithmic framework prove to be effective in coming out of local optima, it also brings considerable flexibility into the segmentation procedure. In this paper, an attempt has been made to review the major applications of GAs to the domain of medical image segmentation.

  11. The Noise Clinic: a Blind Image Denoising Algorithm

    Directory of Open Access Journals (Sweden)

    Marc Lebrun

    2015-01-01

    Full Text Available This paper describes the complete implementation of a blind image denoising algorithm that takes any digital image as input. In a first step the algorithm estimates a Signal and Frequency Dependent (SFD) noise model. In a second step, the image is denoised by a multiscale adaptation of the Non-local Bayes denoising method. We focus here on a careful analysis of the denoising step and present a detailed discussion of the influence of its parameters. Extensive commented tests of the blind denoising algorithm are presented, on real JPEG images and scans of old photographs.

  12. Identification of a chemical inhibitor for nuclear speckle formation: Implications for the function of nuclear speckles in regulation of alternative pre-mRNA splicing

    Energy Technology Data Exchange (ETDEWEB)

    Kurogi, Yutaro; Matsuo, Yota; Mihara, Yuki; Yagi, Hiroaki; Shigaki-Miyamoto, Kaya; Toyota, Syukichi; Azuma, Yuko [Department of Biological Sciences, Graduate School of Science and Technology, Kumamoto University, Chuo-ku, Kumamoto 860-8555 (Japan); Igarashi, Masayuki [Laboratory of Disease Biology, Institute of Microbial Chemistry, Shinagawa-ku, Tokyo 141-0021 (Japan); Tani, Tokio, E-mail: ttani@sci.kumamoto-u.ac.jp [Department of Biological Sciences, Graduate School of Science and Technology, Kumamoto University, Chuo-ku, Kumamoto 860-8555 (Japan)

    2014-03-28

    Highlights: • We identified tubercidin as a compound inducing aberrant formation of the speckles. • Tubercidin causes delocalization of poly(A)+ RNAs from nuclear speckles. • Tubercidin induces dispersion of splicing factors from nuclear speckles. • Tubercidin affects alternative pre-mRNA splicing. • Nuclear speckles play a role in regulation of alternative pre-mRNA splicing. - Abstract: Nuclear speckles are subnuclear structures enriched with RNA processing factors and poly(A)+ RNAs comprising mRNAs and poly(A)+ non-coding RNAs (ncRNAs). Nuclear speckles are thought to be involved in post-transcriptional regulation of gene expression, such as pre-mRNA splicing. By screening 3585 culture extracts of actinomycetes with in situ hybridization using an oligo dT probe, we identified tubercidin, an analogue of adenosine, as an inhibitor of speckle formation, which induces the delocalization of poly(A)+ RNA and dispersion of the splicing factor SRSF1/SF2 from nuclear speckles in HeLa cells. Treatment with tubercidin also decreased steady-state levels of MALAT1 long ncRNA, thought to be involved in the retention of SRSF1/SF2 in nuclear speckles. In addition, we found that tubercidin treatment promoted exon skipping in the alternative splicing of Clk1 pre-mRNA. These results suggest that nuclear speckles play a role in modulating the concentration of splicing factors in the nucleoplasm to regulate alternative pre-mRNA splicing.

  13. Speckle-tracking echocardiography for predicting outcome in chronic aortic regurgitation during conservative management and after surgery

    DEFF Research Database (Denmark)

    Olsen, Niels Thue; Søgaard, Peter; Larsson, Henrik B W

    2011-01-01

    Objectives The aim of this study was to test myocardial deformation imaging using speckle-tracking echocardiography for predicting outcomes in chronic aortic regurgitation. Background In chronic aortic regurgitation, left ventricular (LV) dysfunction must be detected early to allow timely surgery. Speckle-tracking echocardiography has been proposed for this purpose, but the clinical value of this method in aortic regurgitation has not been established. Methods A longitudinal study was performed in 64 patients with moderate to severe aortic regurgitation. Thirty-five patients were managed conservatively with frequent clinical visits and sequential echocardiography and followed for an average of 19 ± 8 months, while 29 patients underwent surgery for the valve lesion and were followed for 6 months post-operatively. Baseline LV function by speckle-tracking and conventional echocardiography...

  14. A Constrained Algorithm Based NMFα for Image Representation

    Directory of Open Access Journals (Sweden)

    Chenxue Yang

    2014-01-01

    Full Text Available Nonnegative matrix factorization (NMF) is a useful tool in learning a basic representation of image data. However, its performance and applicability in real scenarios are limited because of the lack of image information. In this paper, we propose a constrained matrix decomposition algorithm for image representation which contains parameters associated with the characteristics of image data sets. In particular, we impose label information as additional hard constraints on the α-divergence-NMF unsupervised learning algorithm. The resulting algorithm is derived using the Karush-Kuhn-Tucker (KKT) conditions as well as the projected gradient, and its monotonic local convergence is proved using auxiliary functions. In addition, we provide a method to select the parameters of our semisupervised matrix decomposition algorithm in the experiments. Compared with state-of-the-art approaches, our method achieves the best classification accuracy on three image data sets.
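As background for the constrained α-divergence variant described above, here is a minimal sketch of plain NMF with Lee-Seung multiplicative updates under the Euclidean objective. The α-divergence, the label-information hard constraints, and the KKT-derived update rules of the paper are not reproduced; this only shows the unconstrained baseline the paper builds on.

```python
import numpy as np

def nmf(V, r, iters=200, seed=0):
    """Plain NMF, V ~ W @ H, via Lee-Seung multiplicative updates
    for the Euclidean objective ||V - WH||^2."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + 0.1   # strictly positive init
    H = rng.random((r, n)) + 0.1
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)   # coefficient update
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)   # basis update
    return W, H
```

Because the updates are multiplicative and the factors start strictly positive, nonnegativity is preserved automatically at every iteration.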

  15. The influence of image reconstruction algorithms on linear thorax EIT image analysis of ventilation

    International Nuclear Information System (INIS)

    Zhao, Zhanqi; Möller, Knut; Frerichs, Inéz; Pulletz, Sven; Müller-Lisse, Ullrich

    2014-01-01

    Analysis methods of electrical impedance tomography (EIT) images based on different reconstruction algorithms were examined. EIT measurements were performed on eight mechanically ventilated patients with acute respiratory distress syndrome. A maneuver with step increase of airway pressure was performed. EIT raw data were reconstructed offline with (1) filtered back-projection (BP); (2) the Dräger algorithm based on linearized Newton-Raphson (DR); (3) the GREIT (Graz consensus reconstruction algorithm for EIT) reconstruction algorithm with a circular forward model (GR_C) and (4) GREIT with individual thorax geometry (GR_T). Individual thorax contours were automatically determined from the routine computed tomography images. Five indices were calculated on the resulting EIT images: (a) the ratio between tidal and deep inflation impedance changes; (b) tidal impedance changes in the right and left lungs; (c) center of gravity; (d) the global inhomogeneity index and (e) ventilation delay at mid-dorsal regions. No significant differences were found in any of the examined indices among the four reconstruction algorithms (p > 0.2, Kruskal-Wallis test). The examined algorithms used for EIT image reconstruction do not influence the selected indices derived from the EIT image analysis. Indices validated for images from one reconstruction algorithm are also valid for other reconstruction algorithms. (paper)

  16. The influence of image reconstruction algorithms on linear thorax EIT image analysis of ventilation.

    Science.gov (United States)

    Zhao, Zhanqi; Frerichs, Inéz; Pulletz, Sven; Müller-Lisse, Ullrich; Möller, Knut

    2014-06-01

    Analysis methods of electrical impedance tomography (EIT) images based on different reconstruction algorithms were examined. EIT measurements were performed on eight mechanically ventilated patients with acute respiratory distress syndrome. A maneuver with step increase of airway pressure was performed. EIT raw data were reconstructed offline with (1) filtered back-projection (BP); (2) the Dräger algorithm based on linearized Newton-Raphson (DR); (3) the GREIT (Graz consensus reconstruction algorithm for EIT) reconstruction algorithm with a circular forward model (GR(C)) and (4) GREIT with individual thorax geometry (GR(T)). Individual thorax contours were automatically determined from the routine computed tomography images. Five indices were calculated on the resulting EIT images: (a) the ratio between tidal and deep inflation impedance changes; (b) tidal impedance changes in the right and left lungs; (c) center of gravity; (d) the global inhomogeneity index and (e) ventilation delay at mid-dorsal regions. No significant differences were found in any of the examined indices among the four reconstruction algorithms (p > 0.2, Kruskal-Wallis test). The examined algorithms used for EIT image reconstruction do not influence the selected indices derived from the EIT image analysis. Indices validated for images from one reconstruction algorithm are also valid for other reconstruction algorithms.
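One of the indices listed in the record above, the global inhomogeneity (GI) index, has a compact definition that can be sketched directly: the sum of absolute deviations of each lung pixel's tidal impedance change from the median, normalized by the total tidal impedance change. The function name and the flat-list input format are illustrative assumptions.

```python
def gi_index(tidal_pixels):
    """Global inhomogeneity index: sum of absolute deviations from the
    median tidal impedance change, normalized by the total change."""
    vals = sorted(tidal_pixels)
    n = len(vals)
    median = vals[n // 2] if n % 2 else (vals[n // 2 - 1] + vals[n // 2]) / 2
    return sum(abs(v - median) for v in vals) / sum(vals)
```

A perfectly homogeneous ventilation distribution yields 0; larger values indicate more uneven regional ventilation.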

  17. Inverse synthetic aperture radar imaging principles, algorithms and applications

    CERN Document Server

    Chen, Victor C

    2014-01-01

    Inverse Synthetic Aperture Radar Imaging: Principles, Algorithms and Applications is based on the latest research on ISAR imaging of moving targets and non-cooperative target recognition (NCTR). With a focus on advances and applications, this book provides readers with a working knowledge of various algorithms for ISAR imaging of targets and their implementation in MATLAB. These MATLAB algorithms will prove useful for visualizing and manipulating simulated ISAR images.

  18. Real-time three-dimensional speckle tracking echocardiography: technical aspects and clinical applications

    Directory of Open Access Journals (Sweden)

    Sorrentino R

    2016-11-01

    Full Text Available Regina Sorrentino, Roberta Esposito, Enrica Pezzullo, Maurizio Galderisi Department of Advanced Biomedical Sciences, Interdepartmental Laboratory of Cardiac Imaging, Federico II University Hospital, Naples, Italy Abstract: Three-dimensional speckle tracking echocardiography (3D STE) is a novel technique for the quantification of cardiac deformation based on tracking of ultrasonic speckles in gray scale full-volume 3D images. Developments in ultrasound technologies have made 3D speckle tracking widely available. Two-dimensional echocardiography has intrinsic limitations regarding estimation of left ventricular (LV) volumes, ejection fraction, and LV mechanics, due to its inherent foreshortening errors and dependency on geometric models. The development of 3D echocardiography has improved reproducibility and accuracy. Data regarding the feasibility, accuracy, and clinical applications of 3D STE are rapidly accumulating. From the tracking results, 3D STE derives several parameters, including longitudinal, circumferential and radial strain, as well as a combined assessment of longitudinal and circumferential strain, termed area strain. 3D STE can also quantify LV rotational movements such as rotation, twist, and torsion. 3D STE provides better insight into global and regional myocardial deformation. Main applications include detection of subclinical myocardial involvement in heart failure, arterial hypertension, dyssynchrony, and ischemic heart disease. Emerging areas of application include a large spectrum of heart-involving systemic conditions, such as prediction of rejection in heart transplant patients, early detection of cardiotoxicity in patients receiving chemotherapy for cancer, and deeper physiological understanding of LV contraction mechanics in different types of athletes. The aim of this review is to discuss the background, technical acquisition and processing aspects, as well as recognized and developing clinical applications of this emerging

  19. An improved ASIFT algorithm for indoor panorama image matching

    Science.gov (United States)

    Fu, Han; Xie, Donghai; Zhong, Ruofei; Wu, Yu; Wu, Qiong

    2017-07-01

    The generation of 3D models for indoor objects and scenes is an attractive tool for digital city, virtual reality and SLAM purposes. Panoramic images are becoming increasingly common in such applications due to their ability to capture the complete environment in one single image with a large field of view. The extraction and matching of image feature points are important and difficult steps in three-dimensional reconstruction, and ASIFT is a state-of-the-art algorithm for implementing these functions. Compared with the SIFT algorithm, the ASIFT algorithm generates more feature points and achieves higher matching accuracy, even for panoramic images with obvious distortions. However, the algorithm is time-consuming because of its complex operations, and it performs poorly for some indoor scenes with poor lighting or little texture. To solve this problem, this paper proposes an improved ASIFT algorithm for indoor panoramic images: firstly, the panoramic images are projected into multiple normal perspective images. Secondly, the original ASIFT algorithm is simplified from the affine transformation of tilt and rotation of the images to the tilt affine transformation only. Finally, the results are re-projected into the panoramic image space. Experiments in different environments show that this method not only ensures the precision of feature point extraction and matching, but also greatly reduces the computing time.

  20. Experimental study on deformation field evolution in rock sample with en echelon faults using digital speckle correlation method

    Science.gov (United States)

    Ma, S.; Ma, J.; Liu, L.; Liu, P.

    2007-12-01

    Digital speckle correlation method (DSCM) is a photomechanical deformation measurement method. DSCM obtains a continuous deformation field without contact simply by capturing speckle images from the specimen surface. It is therefore well suited to observing high-spatial-resolution deformation fields in tectonophysical experiments. However, in a typical DSCM experiment, the inspected surface of the specimen needs to be painted to bear speckle grains in order to obtain high-quality speckle images; this also interferes with the use of other measurement techniques. In this study, an improved DSCM system is developed and used to measure the deformation field of rock specimens without surface painting. Granodiorite, with its high-contrast natural grain, is chosen to manufacture the specimens, and a specially designed DSCM algorithm is developed to analyze this kind of natural speckle image. Verification and calibration experiments show that the system can record a continuous (about 15 Hz) high-resolution displacement field (with a resolution of 5 μm) and strain field (with a resolution of 50 με), dispensing with any preparation of the rock specimen. It can therefore be conveniently used to study the failure of rock structures. Samples with compressive en echelon faults and extensional en echelon faults are studied on a two-direction servo-control test machine. The failure process of the samples is discussed based on the DSCM results. Experimental results show that: 1) The contours of the displacement field clearly indicate the activities of faults and new cracks. The displacement gradient adjacent to active faults and cracks is much greater than in other areas. 2) Before failure of the samples, the mean strain of the jog area is largest for the compressive en echelon fault, while it is smallest for the extensional en echelon fault. This is consistent with the understanding that the jog area of a compressive fault is subjected to compression and that of an extensional fault is subjected to

  1. Multiple-algorithm parallel fusion of infrared polarization and intensity images based on algorithmic complementarity and synergy

    Science.gov (United States)

    Zhang, Lei; Yang, Fengbao; Ji, Linna; Lv, Sheng

    2018-01-01

    Diverse image fusion methods perform differently; each has advantages and disadvantages compared with the others. One notion is that the advantages of different fusion methods can be effectively combined. A multiple-algorithm parallel fusion method based on algorithmic complementarity and synergy is proposed. First, in view of the characteristics of the different algorithms and the difference-features among images, an index vector-based feature similarity is proposed to define the degree of complementarity and synergy. This index vector is a reliable evidence indicator for algorithm selection. Second, the algorithms with a high degree of complementarity and synergy are selected. Then, the different degrees of the various features and the infrared intensity images are used as the initial weights for nonnegative matrix factorization (NMF), which avoids the randomness of the NMF initialization parameters. Finally, the fused images of the different algorithms are integrated using the NMF because of its excellent data-fusing performance on independent features. Experimental results demonstrate that the visual effect and objective evaluation indices of the fused images obtained using the proposed method are better than those obtained using traditional methods. The proposed method retains the advantages that the individual fusion algorithms have.

  2. High-contrast imaging in the cloud with klipReduce and Findr

    Science.gov (United States)

    Haug-Baltzell, Asher; Males, Jared R.; Morzinski, Katie M.; Wu, Ya-Lin; Merchant, Nirav; Lyons, Eric; Close, Laird M.

    2016-08-01

    Astronomical data sets are growing ever larger, and the area of high contrast imaging of exoplanets is no exception. With the advent of fast, low-noise detectors operating at 10 to 1000 Hz, huge numbers of images can be taken during a single hours-long observation. High frame rates offer several advantages, such as improved registration, frame selection, and speckle calibration. However, advanced image processing algorithms are computationally challenging to apply. Here we describe a parallelized, cloud-based data reduction system developed for the Magellan Adaptive Optics VisAO camera, which is capable of rapidly exploring tens of thousands of parameter sets affecting the Karhunen-Loève image processing (KLIP) algorithm to produce high-quality direct images of exoplanets. We demonstrate these capabilities with a visible wavelength high contrast data set of a hydrogen-accreting brown dwarf companion.
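The KLIP algorithm mentioned above (Soummer et al. 2012) subtracts the stellar speckle pattern by projecting each science frame onto the leading Karhunen-Loève modes of a reference PSF library and removing that projection. A minimal sketch of that idea follows; the function signature and parameters are illustrative and are not the klipReduce interface.

```python
import numpy as np

def klip_subtract(target, refs, k):
    """Subtract the projection of `target` onto the first k
    Karhunen-Loeve modes of the reference library `refs`
    (one flattened frame per row)."""
    R = refs - refs.mean(axis=1, keepdims=True)   # center each frame
    t = target - target.mean()
    C = R @ R.T                                   # frame-frame covariance
    vals, vecs = np.linalg.eigh(C)
    order = np.argsort(vals)[::-1][:k]            # strongest modes first
    Z = vecs[:, order].T @ R                      # KL modes in pixel space
    Z /= np.linalg.norm(Z, axis=1, keepdims=True)
    return t - Z.T @ (Z @ t)                      # remove speckle projection
```

The number of modes k is exactly the kind of parameter the record describes exploring over tens of thousands of settings: too few modes leave speckle residuals, too many subtract planet signal.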

  3. On the link between the speckle free nature of optoacoustics and visibility of structures in limited-view tomography

    Directory of Open Access Journals (Sweden)

    Xosé Luís Deán-Ben

    2016-12-01

    Full Text Available Similar to pulse-echo ultrasound, optoacoustic imaging encodes the location of optical absorbers by the time-of-flight of ultrasound waves. Yet, signal generation mechanisms are fundamentally different for the two modalities, leading to significant distinction between the optimum image formation strategies. While interference of back-scattered ultrasound waves with random phases causes speckle noise in ultrasound images, speckle formation is hindered by the strong correlation between the optoacoustic responses corresponding to individual sources. However, visibility of structures is severely hampered when attempting to acquire optoacoustic images under limited-view tomographic geometries. In this tutorial article, we systematically describe the basic principles of optoacoustic signal generation and image formation for objects ranging from individual sub-resolution absorbers to a continuous absorption distribution. The results are of relevance for the proper interpretation of optoacoustic images acquired under limited-view scenarios and may also serve as a basis for optimal design of tomographic acquisition geometries and image formation strategies.

  4. Parallel image encryption algorithm based on discretized chaotic map

    International Nuclear Information System (INIS)

    Zhou Qing; Wong Kwokwo; Liao Xiaofeng; Xiang Tao; Hu Yue

    2008-01-01

    Recently, a variety of chaos-based algorithms have been proposed for image encryption. Nevertheless, none of them works efficiently in a parallel computing environment. In this paper, we propose a framework for parallel image encryption. Based on this framework, a new algorithm is designed using the discretized Kolmogorov flow map. It fulfills all the requirements for a parallel image encryption algorithm. Moreover, it is secure and fast. These properties make it a good choice for image encryption on parallel computing platforms.
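The record does not give the discretized Kolmogorov flow map itself, so the following is only an analogous toy showing the common permutation-plus-diffusion structure of chaos-based image ciphers, using the simpler logistic map as the chaotic source. It is not the paper's algorithm and is not secure for real use.

```python
def logistic_stream(x0, n, r=3.99):
    """Chaotic keystream from the logistic map x <- r*x*(1-x)."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(x)
    return out

def encrypt(pixels, key=0.3456):
    n = len(pixels)
    ks = logistic_stream(key, n)
    perm = sorted(range(n), key=lambda i: ks[i])      # chaotic permutation
    shuffled = [pixels[i] for i in perm]
    return [p ^ int(k * 256) for p, k in zip(shuffled, ks)]  # diffusion

def decrypt(cipher, key=0.3456):
    n = len(cipher)
    ks = logistic_stream(key, n)
    perm = sorted(range(n), key=lambda i: ks[i])
    shuffled = [c ^ int(k * 256) for c, k in zip(cipher, ks)]
    out = [0] * n
    for j, i in enumerate(perm):
        out[i] = shuffled[j]                          # undo the permutation
    return out
```

In the paper's parallel setting, the point is to design the permutation and diffusion so that independent blocks can be processed concurrently; this toy processes the whole image serially.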

  5. Developments In Electronic Speckle Pattern Interferometry For Automotive Vibration Analysis.

    Science.gov (United States)

    Davies, Jeremy C.; Buckberry, Clive H.; Jones, Julian D. C.; Pannell, Chris N.

    1989-01-01

    The incorporation of monomode fibre optics into an argon ion powered Electronic Speckle Pattern Interferometer (ESPI) is reported. The system, consisting of an optics assembly linked to the laser and a CCD camera transceiver, flexibly connected by 40m of monomode fibre optic cable to the optics, has been used to analyse the modal behaviour of structures up to 5m X 3m X 2m in size. Phase modulation of the reference beam in order to operate in a heterodyne mode has been implemented using a piezo-electric crystal operating on the monomode fibre. A new mode of operation - sequential time-average subtraction - and the results of a new processing algorithm are also reported. Their implementation enables speckle free, time-average vibration maps to be generated in real-time on large, unstable structures. Example results for a four cylinder power unit, a vehicle body shell component and an engine oil pan are included. In all cases the analysis was conducted in a general workshop environment without the need for vibration isolation facilities.

  6. Fast image matching algorithm based on projection characteristics

    Science.gov (United States)

    Zhou, Lijuan; Yue, Xiaobo; Zhou, Lijun

    2011-06-01

    Based on an analysis of the traditional template matching algorithm, this paper identifies the key factors restricting matching speed and puts forward a new fast matching algorithm based on projection. By projecting the grayscale image, the algorithm converts the two-dimensional information of the image into one-dimensional form and then matches and identifies through one-dimensional correlation; moreover, because normalization is performed, matching remains correct even when the image brightness or signal amplitude increases proportionally. Experimental results show that the projection-based image registration method proposed in this article greatly improves matching speed while maintaining matching accuracy.
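The scheme described above can be sketched directly: collapse both the template and the search image to row and column sums, then locate the template with two normalized 1-D correlations instead of one full 2-D correlation. The function names are illustrative; mean subtraction in the correlation provides the claimed invariance to proportional brightness changes.

```python
def row_projection(img):
    return [sum(row) for row in img]

def col_projection(img):
    return [sum(col) for col in zip(*img)]

def ncc(a, b):
    """Normalized 1-D correlation (mean-subtracted, scale-invariant)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) *
           sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / (den + 1e-12)

def match_1d(long_proj, short_proj):
    k = len(short_proj)
    best, best_c = 0, -2.0
    for off in range(len(long_proj) - k + 1):
        c = ncc(long_proj[off:off + k], short_proj)
        if c > best_c:
            best_c, best = c, off
    return best

def match(image, template):
    """Approximate (row, col) of `template` inside `image` using only
    the two 1-D projections instead of full 2-D correlation."""
    r = match_1d(row_projection(image), row_projection(template))
    c = match_1d(col_projection(image), col_projection(template))
    return r, c
```

The speedup comes from replacing an O(H·W·h·w) 2-D search with two O(H·h + W·w)-scale 1-D searches, at the cost of some robustness on cluttered backgrounds.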

  7. Image segmentation algorithm based on T-junctions cues

    Science.gov (United States)

    Qian, Yanyu; Cao, Fengyun; Wang, Lu; Yang, Xuejie

    2016-03-01

    To reduce the over-segmentation and over-merging produced by single image segmentation algorithms, a novel approach combining a graph-based algorithm with T-junction cues is proposed in this paper. First, L0 gradient minimization is applied to smooth the target image and eliminate artifacts caused by noise and texture detail. Then, an initial over-segmentation of the smoothed image is obtained using the graph-based algorithm. Finally, the final result is produced via a region fusion strategy driven by T-junction cues. Experimental results on a variety of images verify the new approach's efficiency in eliminating artifacts caused by noise; segmentation accuracy and time complexity are both significantly improved.

  8. An Unsupervised Change Detection Method Using Time-Series of PolSAR Images from Radarsat-2 and GaoFen-3.

    Science.gov (United States)

    Liu, Wensong; Yang, Jie; Zhao, Jinqi; Shi, Hongtao; Yang, Le

    2018-02-12

    Traditional unsupervised change detection methods based on the pixel level can only detect changes between two different times with the same sensor, and the results are easily affected by speckle noise. In this paper, a novel method is proposed to detect change based on time-series data from different sensors. Firstly, the overall difference image of the time-series PolSAR data is calculated by omnibus test statistics, and difference images between any two images at different times are acquired by Rj test statistics. Secondly, the difference images are segmented with a Generalized Statistical Region Merging (GSRM) algorithm, which can suppress the effect of speckle noise. A Generalized Gaussian Mixture Model (GGMM) is then used to obtain the time-series change detection maps in the final step of the proposed method. To verify the effectiveness of the proposed method, we carried out a change detection experiment using time-series PolSAR images acquired by Radarsat-2 and Gaofen-3 over the city of Wuhan, China. Results show that the proposed method can not only detect time-series changes from different sensors, but also better suppress the influence of speckle noise and improve the overall accuracy and Kappa coefficient.

  9. Measurements of 427 Double Stars With Speckle Interferometry: The Winter/Spring 2017 Observing Program at Brilliant Sky Observatory, Part 1

    Science.gov (United States)

    Harshaw, Richard

    2018-04-01

    In the winter and spring of 2017, an aggressive observing program of measuring close double stars with speckle interferometry and CCD imaging was undertaken at Brilliant Sky Observatory, my observing site in Cave Creek, Arizona. A total of 596 stars were observed, 8 of which were rejected for various reasons, leaving 588 pairs. Of these, 427 were observed and measured with speckle interferometry, while the remaining 161 were measured with a CCD. This paper reports the results of the observations of the 427 speckle cases. A separate paper in this issue will report the CCD measurements of the 161 other pairs.

  10. Hybrid wavefront sensing and image correction algorithm for imaging through turbulent media

    Science.gov (United States)

    Wu, Chensheng; Robertson Rzasa, John; Ko, Jonathan; Davis, Christopher C.

    2017-09-01

    It is well known that passive image correction of turbulence distortions often involves using geometry-dependent deconvolution algorithms. On the other hand, active imaging techniques using adaptive optic correction should use the distorted wavefront information for guidance. Our work shows that a hybrid hardware-software approach can obtain accurate and highly detailed images through turbulent media. The processing algorithm also requires far fewer iteration steps than conventional image processing algorithms. In our proposed approach, a plenoptic sensor is used as a wavefront sensor to guide post-stage image correction on a high-definition zoomable camera. Conversely, we show that given the ground truth of the highly detailed image and the plenoptic imaging result, we can generate an accurate prediction of the blurred image on a traditional zoomable camera. Similarly, the ground truth combined with the blurred image from the zoomable camera would provide the wavefront conditions. In application, our hybrid approach can be used as an effective way to conduct object recognition in a turbulent environment where the target has been significantly distorted or is even unrecognizable.

  11. A SIMPLE HETERODYNE TEMPORAL SPECKLE-PATTERN INTERFEROMETER

    International Nuclear Information System (INIS)

    Wong, W. O.; Gao, Z.; Lu, J.

    2010-01-01

    A common-path design for a heterodyne speckle pattern interferometer based on temporal speckle pattern interferometry is proposed for non-contact, full-field, real-time continuous displacement measurement. The double-frequency laser beam is produced by rotating a half-wave plate. An experiment was carried out to measure the dynamic displacement of a cantilever plate in order to test the proposed common-path heterodyne speckle pattern interferometer. The accuracy of the displacement measurement was checked by measuring the motion at the mid-point of the plate with a point displacement sensor.

  12. Parallel asynchronous systems and image processing algorithms

    Science.gov (United States)

    Coon, D. D.; Perera, A. G. U.

    1989-01-01

    A new hardware approach to implementation of image processing algorithms is described. The approach is based on silicon devices which would permit an independent analog processing channel to be dedicated to every pixel. A laminar architecture consisting of a stack of planar arrays of the device would form a two-dimensional array processor with a 2-D array of inputs located directly behind a focal plane detector array. A 2-D image data stream would propagate in neuronlike asynchronous pulse coded form through the laminar processor. Such systems would integrate image acquisition and image processing. Acquisition and processing would be performed concurrently as in natural vision systems. The research is aimed at implementation of algorithms, such as the intensity dependent summation algorithm and pyramid processing structures, which are motivated by the operation of natural vision systems. Implementation of natural vision algorithms would benefit from the use of neuronlike information coding and the laminar, 2-D parallel, vision system type architecture. Besides providing a neural network framework for implementation of natural vision algorithms, a 2-D parallel approach could eliminate the serial bottleneck of conventional processing systems. Conversion to serial format would occur only after raw intensity data has been substantially processed. An interesting challenge arises from the fact that the mathematical formulation of natural vision algorithms does not specify the means of implementation, so that hardware implementation poses intriguing questions involving vision science.

  13. Lensless Photoluminescence Hyperspectral Camera Employing Random Speckle Patterns.

    Czech Academy of Sciences Publication Activity Database

    Žídek, Karel; Denk, Ondřej; Hlubuček, Jiří

    2017-01-01

    Roč. 7, č. 1 (2017), č. článku 15309. ISSN 2045-2322 R&D Projects: GA MŠk(CZ) LO1206; GA ČR(CZ) GJ17-26284Y Institutional support: RVO:61389021 Keywords : compressed sensing * photoluminescence imaging * laser speckles * single-pixel camera Subject RIV: BH - Optics, Masers, Lasers OBOR OECD: Optics (including laser optics and quantum optics) Impact factor: 4.259, year: 2016 https://www.nature.com/articles/s41598-017-14443-4

  14. Algorithms for image processing and computer vision

    CERN Document Server

    Parker, J R

    2010-01-01

    A cookbook of algorithms for common image processing applications Thanks to advances in computer hardware and software, algorithms have been developed that support sophisticated image processing without requiring an extensive background in mathematics. This bestselling book has been fully updated with the newest of these, including 2D vision methods in content-based searches and the use of graphics cards as image processing computational aids. It's an ideal reference for software engineers and developers, advanced programmers, graphics programmers, scientists, and other specialists wh

  15. Laser speckle contrast imaging identifies ischemic areas on gastric tube reconstructions following esophagectomy.

    Science.gov (United States)

    Milstein, Dan M J; Ince, Can; Gisbertz, Suzanne S; Boateng, Kofi B; Geerts, Bart F; Hollmann, Markus W; van Berge Henegouwen, Mark I; Veelo, Denise P

    2016-06-01

    Gastric tube reconstruction (GTR) is a high-risk surgical procedure with substantial perioperative morbidity. Compromised arterial blood supply and venous congestion are believed to be the main etiologic factors associated with early and late anastomotic complications. Identifying low blood perfusion areas may provide information on the risks of future anastomotic leakage and could be essential for improving surgical techniques. The aim of this study was to generate a method for gastric microvascular perfusion analysis using laser speckle contrast imaging (LSCI) and to test the hypothesis that LSCI is able to identify ischemic regions on GTRs. Patients requiring elective laparoscopy-assisted GTR participated in this single-center observational investigation. A method for intraoperative evaluation of blood perfusion and postoperative analysis was generated and validated for reproducibility. Laser speckle measurements were performed at three different time points: baseline (devascularized) stomach (T0), after GTR (T1), and GTR at 20° reverse Trendelenburg (T2). Blood perfusion analysis inter-rater reliability was high, with intraclass correlation coefficients for each time point approximating 1 (P < 0.0001). Baseline (T0) and GTR (T1) mean blood perfusion profiles were highest at the base of the stomach and then progressively declined towards significant ischemia at the most cranial point, or anastomotic tip (P < 0.01). After GTR, a statistically significant improvement in mean blood perfusion was observed in the cranial gastric regions of interest (P < 0.05). A generalized significant decrease in mean blood perfusion was observed across all GTR regions of interest during 20° reverse Trendelenburg (P < 0.05). It was feasible to implement LSCI intraoperatively to produce blood perfusion assessments on intact and reconstructed whole stomachs. The analytical design presented in this study resulted in good reproducibility of gastric perfusion measurements.
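LSCI derives perfusion from the local blurring of the speckle pattern: the spatial speckle contrast K = std/mean is computed over a small sliding window, and perfusion is commonly summarized as a quantity proportional to 1/K². The following is a minimal sketch of that contrast-map computation; the window size and the 1/K² perfusion index are conventional illustrative choices, not the processing used in this study.

```python
import numpy as np

def speckle_contrast_map(raw, win=7):
    """Spatial speckle contrast K = std/mean over a sliding win x win window."""
    pad = win // 2
    img = np.pad(np.asarray(raw, dtype=float), pad, mode="reflect")
    K = np.empty(raw.shape, dtype=float)
    for i in range(raw.shape[0]):
        for j in range(raw.shape[1]):
            patch = img[i:i + win, j:j + win]
            m = patch.mean()
            K[i, j] = patch.std() / m if m > 0 else 0.0
    return K

def perfusion_index(K):
    """Common perfusion summary: proportional to 1/K^2 (arbitrary units)."""
    return 1.0 / np.maximum(K, 1e-6) ** 2
```

For fully developed speckle imaged with a static scatterer, K approaches 1; flowing blood blurs the pattern during the exposure and drives K (and hence 1/K²) towards low contrast and high perfusion.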

  16. Multiscale Distance Coherence Vector Algorithm for Content-Based Image Retrieval

    Science.gov (United States)

    Jiexian, Zeng; Xiupeng, Liu

    2014-01-01

    A multiscale distance coherence vector algorithm for content-based image retrieval (CBIR) is proposed to address two shortcomings of the distance coherence vector algorithm: it can produce the same descriptor for different shapes, and its noise robustness is poor. In this algorithm, the image contour curve is first evolved by a Gaussian function, and the distance coherence vector is then extracted from the contours of both the original image and the evolved images. The multiscale distance coherence vector is obtained by a reasonable weighting of the distance coherence vectors of the evolved image contours. The algorithm is not only invariant to translation, rotation, and scaling transformations but also has good noise robustness. The experimental results show that the algorithm achieves higher recall and precision rates in retrieving images corrupted by noise. PMID:24883416
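The contour-evolution step described above can be sketched as circular Gaussian smoothing of the sampled contour coordinates, with a per-scale distance signature computed afterwards. This is a hedged sketch assuming the contour is given as closed x/y coordinate arrays; the distance coherence vector itself and the weighting scheme are not reproduced.

```python
import numpy as np

def evolve_contour(xs, ys, sigma):
    """Smooth a closed contour by circular convolution with a Gaussian kernel."""
    n = len(xs)
    t = np.arange(n) - n // 2
    g = np.exp(-t ** 2 / (2.0 * sigma ** 2))
    g /= g.sum()
    g = np.roll(g, -(n // 2))            # center the kernel at index 0
    G = np.fft.fft(g)
    sx = np.fft.ifft(np.fft.fft(xs) * G).real
    sy = np.fft.ifft(np.fft.fft(ys) * G).real
    return sx, sy

def distance_signature(xs, ys):
    """Distance of each contour point to the centroid (one value per scale)."""
    return np.hypot(xs - xs.mean(), ys - ys.mean())
```

Evolving the contour at several sigmas and concatenating (or weighting) the per-scale signatures gives the multiscale flavor of descriptor the abstract describes.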

  17. Correlation of Spatially Filtered Dynamic Speckles in Distance Measurement Application

    International Nuclear Information System (INIS)

    Semenov, Dmitry V.; Nippolainen, Ervin; Kamshilin, Alexei A.; Miridonov, Serguei V.

    2008-01-01

    In this paper the statistical properties of spatially filtered dynamic speckles are considered. This phenomenon has not been studied sufficiently, even though spatial filtering is an important instrument for speckle velocity measurement. With spatial filtering, speckle velocity information is derived from the modulation frequency of the filtered light power, which is measured by a photodetector. A typical photodetector output is a narrow-band random signal that includes non-informative intervals; precise frequency measurement therefore requires averaging, which in turn presumes uncorrelated samples. In the course of this research, however, we found that correlation is a typical property not only of dynamic speckle patterns but also of spatially filtered speckles. With spatial filtering, the correlation appears as a repeated response when measurements probe the same part of the object surface, or when several adjacent photodetectors are used simultaneously. These correlations cannot be explained by the properties of unfiltered dynamic speckles alone. As we demonstrate, the subject of this paper is important not only theoretically but also for applied speckle metrology: for example, using a single spatial filter with an array of photodetectors can greatly improve the accuracy of speckle velocity measurements
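The averaging the abstract refers to can be sketched as averaging periodograms over signal segments before locating the spectral peak, which stabilizes the modulation-frequency estimate against the non-informative intervals of the photodetector signal. A hedged sketch (segment count and peak-picking are illustrative choices):

```python
import numpy as np

def dominant_frequency(signal, fs, nseg=8):
    """Average periodograms over segments, then locate the spectral peak."""
    seg_len = len(signal) // nseg
    acc = np.zeros(seg_len // 2)
    for k in range(nseg):
        seg = signal[k * seg_len:(k + 1) * seg_len]
        seg = seg - seg.mean()                        # drop the DC component
        acc += np.abs(np.fft.rfft(seg))[:seg_len // 2] ** 2
    freqs = np.fft.rfftfreq(seg_len, d=1.0 / fs)[:seg_len // 2]
    return freqs[np.argmax(acc)]
```

Under the usual spatial-filtering velocimetry geometry (an assumption here, not taken from this paper), the speckle velocity is then proportional to this frequency times the spatial period of the filter.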

  18. Clinical utility of speckle-tracking echocardiography in cardiac resynchronisation therapy.

    Science.gov (United States)

    Khan, Sitara G; Klettas, Dimitris; Kapetanakis, Stam; Monaghan, Mark J

    2016-03-01

    Cardiac resynchronisation therapy (CRT) can profoundly improve outcome in selected patients with heart failure; however, response is difficult to predict and can be absent in up to one in three patients. There has been a substantial amount of interest in the echocardiographic assessment of left ventricular dyssynchrony, with the ultimate aim of reliably identifying patients who will respond to CRT. The measurement of myocardial deformation (strain) has conventionally been assessed using tissue Doppler imaging (TDI), which is limited by its angle dependence and ability to measure in a single plane. Two-dimensional speckle-tracking echocardiography is a technique that provides measurements of strain in three planes, by tracking patterns of ultrasound interference ('speckles') in the myocardial wall throughout the cardiac cycle. Since its initial use over 15 years ago, it has emerged as a tool that provides more robust, reproducible and sensitive markers of dyssynchrony than TDI. This article reviews the use of two-dimensional and three-dimensional speckle-tracking echocardiography in the assessment of dyssynchrony, including the identification of echocardiographic parameters that may hold predictive potential for the response to CRT. It also reviews the application of these techniques in guiding optimal LV lead placement pre-implant, with promising results in clinical improvement post-CRT. © 2016 The authors.

  19. AN IMPROVED FUZZY CLUSTERING ALGORITHM FOR MICROARRAY IMAGE SPOTS SEGMENTATION

    Directory of Open Access Journals (Sweden)

    V.G. Biju

    2015-11-01

    Full Text Available An automatic cDNA microarray image processing method using an improved fuzzy clustering algorithm is presented in this paper. The proposed spot segmentation algorithm uses the gridding technique developed earlier by the authors to find the co-ordinates of each spot in an image, and spots are automatically cropped from the microarray image using these co-ordinates. The paper proposes an improved fuzzy clustering algorithm, Possibility Fuzzy Local Information C-Means (PFLICM), to segment the spot foreground (FG) from the background (BG). PFLICM improves the Fuzzy Local Information C-Means (FLICM) algorithm by incorporating the typicality of a pixel along with gray-level information and local spatial information. The performance of the algorithm is validated using a set of simulated cDNA microarray images corrupted with different levels of AWGN noise. The strength of the algorithm is tested by computing parameters such as the segmentation matching factor (SMF), probability of error (pe), discrepancy distance (D) and normalized mean square error (NMSE). The SMF value obtained for the PFLICM algorithm shows an improvement of 0.9% and 0.7% for high-noise and low-noise microarray images, respectively, compared to the FLICM algorithm. The PFLICM algorithm is also applied to real microarray images and gene expression values are computed.
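PFLICM extends the fuzzy c-means family. As a baseline illustration only, here is standard fuzzy c-means on raw pixel intensities for a two-class FG/BG split; this is not the authors' PFLICM, which additionally incorporates typicality and local spatial information.

```python
import numpy as np

def fuzzy_c_means(x, c=2, m=2.0, iters=50):
    """Standard fuzzy c-means on a 1-D feature vector (pixel intensities)."""
    centers = np.linspace(x.min(), x.max(), c)       # deterministic init
    u = np.full((c, x.size), 1.0 / c)
    for _ in range(iters):
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = d ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=0)                           # memberships sum to 1
        um = u ** m
        centers = um @ x / um.sum(axis=1)            # fuzzy-weighted means
    return centers, u
```

Each pixel is assigned to the cluster with the highest membership; in a microarray spot, the brighter cluster would be taken as foreground.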

  20. Application of the EM algorithm to radiographic images.

    Science.gov (United States)

    Brailean, J C; Little, D; Giger, M L; Chen, C T; Sullivan, B J

    1992-01-01

    The expectation maximization (EM) algorithm has received considerable attention in the area of positron emission tomography (PET) as a restoration and reconstruction technique. In this paper, the restoration capabilities of the EM algorithm when applied to radiographic images are investigated; this application does not involve reconstruction. The performance of the EM algorithm is quantitatively evaluated using a "perceived" signal-to-noise ratio (SNR) as the image quality metric. This perceived SNR is based on statistical decision theory and includes both the observer's visual response function and a noise component internal to the eye-brain system. For a variety of processing parameters, the relative SNR (ratio of the processed SNR to the original SNR) is calculated and used as a metric to compare quantitatively the effects of the EM algorithm with two other image enhancement techniques: global contrast enhancement (windowing) and unsharp mask filtering. The results suggest that the EM algorithm's performance is superior to unsharp mask filtering and global contrast enhancement for radiographic images containing objects smaller than 4 mm.
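For Poisson-noise imaging, the EM restoration iteration is the Richardson-Lucy update. A 1-D sketch of that iteration follows; the paper's radiographic setting and perceived-SNR evaluation are not reproduced, and the stopping rule here is a fixed iteration count.

```python
import numpy as np

def richardson_lucy(observed, psf, iters=30):
    """EM (Richardson-Lucy) deconvolution for Poisson-noise imaging, 1-D."""
    est = np.full(observed.shape, observed.mean(), dtype=float)
    psf_flip = psf[::-1]                     # adjoint of convolution with psf
    for _ in range(iters):
        blurred = np.convolve(est, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)
        est = est * np.convolve(ratio, psf_flip, mode="same")
    return est
```

The multiplicative update keeps the estimate non-negative, which is one reason the EM iteration is attractive for photon-counting data.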

  1. High performance deformable image registration algorithms for manycore processors

    CERN Document Server

    Shackleford, James; Sharp, Gregory

    2013-01-01

    High Performance Deformable Image Registration Algorithms for Manycore Processors develops highly data-parallel image registration algorithms suitable for use on modern multi-core architectures, including graphics processing units (GPUs). Focusing on deformable registration, we show how to develop data-parallel versions of the registration algorithm suitable for execution on the GPU. Image registration is the process of aligning two or more images into a common coordinate frame and is a fundamental step to be able to compare or fuse data obtained from different sensor measurements. E

  2. Image Registration Algorithm Based on Parallax Constraint and Clustering Analysis

    Science.gov (United States)

    Wang, Zhe; Dong, Min; Mu, Xiaomin; Wang, Song

    2018-01-01

    To resolve the problems of slow computation speed and low matching accuracy in image registration, a new image registration algorithm based on parallax constraint and clustering analysis is proposed. Firstly, the Harris corner detection algorithm is used to extract the feature points of the two images. Secondly, the Normalized Cross Correlation (NCC) function is used to perform approximate matching of the feature points, yielding the initial feature pairs. Then, according to the parallax constraint condition, the initial feature pairs are preprocessed by the K-means clustering algorithm, which removes feature point pairs with obvious errors from the approximate matching step. Finally, the Random Sample Consensus (RANSAC) algorithm is adopted to optimize the feature points and obtain the final feature point matching result, realizing fast and accurate image registration. The experimental results show that the proposed image registration algorithm can improve the accuracy of image matching while ensuring the real-time performance of the algorithm.
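The NCC scoring step of such a pipeline can be sketched as below. A brute-force search is shown for clarity; a real implementation would restrict the search to a window around each Harris corner rather than scanning the whole image.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def best_match(patch, image):
    """Exhaustively slide `patch` over `image`; return top-left of best NCC."""
    size = patch.shape[0]
    h, w = image.shape
    best, pos = -2.0, (0, 0)
    for i in range(h - size + 1):
        for j in range(w - size + 1):
            s = ncc(patch, image[i:i + size, j:j + size])
            if s > best:
                best, pos = s, (i, j)
    return pos, best
```

Matches whose NCC score falls below a threshold, or whose displacement violates the parallax constraint, would then be discarded before the RANSAC stage.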

  3. Advances in speckle metrology and related techniques

    CERN Document Server

    Kaufmann, Guillermo H

    2010-01-01

    Speckle metrology includes various optical techniques that are based on the speckle fields generated by reflection from a rough surface or by transmission through a rough diffuser. These techniques have proven to be very useful in testing different materials in a non-destructive way. They have changed dramatically during the last years due to the development of modern optical components, with faster and more powerful digital computers, and novel data processing approaches. This most up-to-date overview of the topic describes new techniques developed in the field of speckle metrology over the l

  4. OBSERVATIONS OF BINARY STARS WITH THE DIFFERENTIAL SPECKLE SURVEY INSTRUMENT. III. MEASURES BELOW THE DIFFRACTION LIMIT OF THE WIYN TELESCOPE

    International Nuclear Information System (INIS)

    Horch, Elliott P.; Van Altena, William F.; Howell, Steve B.; Sherry, William H.; Ciardi, David R.

    2011-01-01

    In this paper, we study the ability of CCD- and electron-multiplying-CCD-based speckle imaging to obtain reliable astrometry and photometry of binary stars below the diffraction limit of the WIYN 3.5 m Telescope. We present a total of 120 measures of binary stars, 75 of which are below the diffraction limit. The measures are divided into two groups that have different measurement accuracy and precision. The first group is composed of standard speckle observations, that is, a sequence of speckle images taken in a single filter, while the second group consists of paired observations where the two observations are taken on the same observing run and in different filters. The more recent paired observations were taken simultaneously with the Differential Speckle Survey Instrument, which is a two-channel speckle imaging system. In comparing our results to the ephemeris positions of binaries with known orbits, we find that paired observations provide the opportunity to identify cases of systematic error in separation below the diffraction limit and after removing these from consideration, we obtain a linear measurement uncertainty of 3-4 mas. However, if observations are unpaired or if two observations taken in the same filter are paired, it becomes harder to identify cases of systematic error, presumably because the largest source of this error is residual atmospheric dispersion, which is color dependent. When observations are unpaired, we find that it is unwise to report separations below approximately 20 mas, as these are most susceptible to this effect. Using the final results obtained, we are able to update two older orbits in the literature and present preliminary orbits for three systems that were discovered by Hipparcos.

  5. Skin perfusion evaluation between laser speckle contrast imaging and laser Doppler flowmetry

    Science.gov (United States)

    Humeau-Heurtier, Anne; Mahe, Guillaume; Durand, Sylvain; Abraham, Pierre

    2013-03-01

    In the biomedical field, laser Doppler flowmetry (LDF) and laser speckle contrast imaging (LSCI) are two optical techniques for monitoring, non-invasively, the microvascular blood perfusion. LDF has been used for nearly 40 years, whereas LSCI is a recent technique that overcomes some drawbacks of LDF. Both LDF and LSCI give perfusion assessments in arbitrary units. However, the possible relationship between the perfusion values given by LDF and by LSCI over large blood flow ranges has not yet been fully studied. We therefore evaluate herein the relationship between LDF and LSCI perfusion values across a broad range of skin blood flows. For this purpose, LDF and LSCI data were acquired simultaneously on the forearm of 12 healthy subjects at rest, during vascular occlusions of different durations, and during reactive hyperemia. For the range of skin blood flows studied, a power function fits the data better than a linear function: the exponents for individual subjects range from 1.2 to 1.7, and the exponent is close to 1.3 when all subjects are studied together. We thus suggest distinguishing the perfusion values given by the two optical systems.
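A power-function relation of the kind reported (LSCI ≈ a · LDF^b) can be fitted by linear regression in log-log space. A minimal sketch, assuming strictly positive perfusion values:

```python
import numpy as np

def fit_power_law(x, y):
    """Fit y = a * x**b by least squares in log-log space (x, y > 0)."""
    b, log_a = np.polyfit(np.log(x), np.log(y), 1)
    return np.exp(log_a), b
```

The slope of the log-log fit is the exponent b; an exponent near 1.3, as found in the pooled data above, means the two instruments' arbitrary units diverge increasingly at high flows.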

  6. An enhanced fractal image denoising algorithm

    International Nuclear Information System (INIS)

    Lu Jian; Ye Zhongxing; Zou Yuru; Ye Ruisong

    2008-01-01

    In recent years, there has been significant development in image denoising using fractal-based methods. This paper presents an enhanced fractal predictive denoising algorithm for images corrupted by additive white Gaussian noise (AWGN), using a quadratic gray-level function. A quantization method for the fractal gray-level coefficients of the quadratic function is also proposed to strictly guarantee the contractivity requirement of the enhanced fractal coding; in terms of the quality of the fractal representation measured by PSNR, the enhanced fractal image coding using a quadratic gray-level function generally performs better than standard fractal coding using a linear gray-level function. Based on this enhanced fractal coding, the enhanced fractal image denoising is implemented by estimating the fractal gray-level coefficients of the quadratic function of the noiseless image from its noisy observation. Experimental results show that, compared with other standard fractal-based image denoising schemes using a linear gray-level function, the enhanced fractal denoising algorithm can improve the quality of the restored image efficiently.

  7. New variational image decomposition model for simultaneously denoising and segmenting optical coherence tomography images

    International Nuclear Information System (INIS)

    Duan, Jinming; Bai, Li; Tench, Christopher; Gottlob, Irene; Proudlock, Frank

    2015-01-01

    Optical coherence tomography (OCT) imaging plays an important role in clinical diagnosis and monitoring of diseases of the human retina. Automated analysis of optical coherence tomography images is a challenging task as the images are inherently noisy. In this paper, a novel variational image decomposition model is proposed to decompose an OCT image into three components: the first component is the original image but with the noise completely removed; the second contains the set of edges representing the retinal layer boundaries present in the image; and the third is an image of noise, or in image decomposition terms, the texture, or oscillatory patterns of the original image. In addition, a fast Fourier transform based split Bregman algorithm is developed to improve computational efficiency of solving the proposed model. Extensive experiments are conducted on both synthesised and real OCT images to demonstrate that the proposed model outperforms the state-of-the-art speckle noise reduction methods and leads to accurate retinal layer segmentation. (paper)
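The split Bregman strategy the authors accelerate with FFTs can be illustrated on the simplest related problem, 1-D total-variation denoising, min_u ½‖u − f‖² + λ‖Du‖₁. This is a hedged sketch, not the paper's three-component decomposition model; a dense linear solve stands in for their FFT-based sub-problem solver.

```python
import numpy as np

def tv_denoise_1d(f, lam=0.5, mu=1.0, iters=100):
    """Split Bregman for min_u 0.5*||u-f||^2 + lam*||Du||_1 (anisotropic TV)."""
    n = f.size
    D = np.diff(np.eye(n), axis=0)              # forward-difference operator
    A = np.eye(n) + mu * (D.T @ D)              # u-subproblem normal matrix
    d = np.zeros(n - 1)                         # split variable for Du
    b = np.zeros(n - 1)                         # Bregman variable
    u = f.copy()
    for _ in range(iters):
        u = np.linalg.solve(A, f + mu * D.T @ (d - b))   # quadratic subproblem
        Du = D @ u
        d = np.sign(Du + b) * np.maximum(np.abs(Du + b) - lam / mu, 0.0)
        b += Du - d                              # Bregman update
    return u
```

The splitting turns the non-smooth TV term into a shrinkage step, leaving only a linear system per iteration, which is exactly the structure an FFT solver can exploit on images.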

  8. Real-time acquisition and display of flow contrast using speckle variance optical coherence tomography in a graphics processing unit.

    Science.gov (United States)

    Xu, Jing; Wong, Kevin; Jian, Yifan; Sarunic, Marinko V

    2014-02-01

    In this report, we describe a graphics processing unit (GPU)-accelerated processing platform for real-time acquisition and display of flow contrast images with Fourier domain optical coherence tomography (FDOCT) in mouse and human eyes in vivo. Motion contrast from blood flow is processed using the speckle variance OCT (svOCT) technique, which relies on the acquisition of multiple B-scan frames at the same location and tracking the change of the speckle pattern. Real-time mouse and human retinal imaging using two different custom-built OCT systems with processing and display performed on GPU are presented with an in-depth analysis of performance metrics. The display output included structural OCT data, en face projections of the intensity data, and the svOCT en face projections of retinal microvasculature; these results compare projections with and without speckle variance in the different retinal layers to reveal significant contrast improvements. As a demonstration, videos of real-time svOCT for in vivo human and mouse retinal imaging are included in our results. The capability of performing real-time svOCT imaging of the retinal vasculature may be a useful tool in a clinical environment for monitoring disease-related pathological changes in the microcirculation such as diabetic retinopathy.
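The svOCT contrast itself is simple: the per-pixel intensity variance across repeated B-scans acquired at the same location; flow produces high variance, static tissue low variance. A minimal sketch (real pipelines add bulk-motion correction and intensity thresholding, which are omitted here):

```python
import numpy as np

def speckle_variance(bscans):
    """svOCT contrast: per-pixel variance across N repeated B-scans.

    `bscans` has shape (frames, depth, width); pixels over flowing blood
    decorrelate between frames and yield high variance."""
    return np.var(np.asarray(bscans, dtype=float), axis=0)
```

The GPU acceleration described in the abstract parallelizes exactly this per-pixel reduction across the frame stack.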

  9. Target recognition of ladar range images using slice image: comparison of four improved algorithms

    Science.gov (United States)

    Xia, Wenze; Han, Shaokun; Cao, Jingya; Wang, Liang; Zhai, Yu; Cheng, Yang

    2017-07-01

    Compared with traditional 3-D shape data, ladar range images possess strong noise, shape degeneracy, and sparsity, which make feature extraction and representation difficult. The slice image is an effective feature descriptor to resolve this problem. We propose four improved algorithms for target recognition of ladar range images using the slice image. To improve the resolution invariance of the slice image, mean-value detection instead of maximum-value detection is applied in all four improved algorithms. To improve the rotation invariance of the slice image, three new feature descriptors (the feature slice image, slice-Zernike moments, and slice-Fourier moments) are applied to the last three improved algorithms, respectively. Backpropagation neural networks are used as feature classifiers in the last two improved algorithms. The performance of these four improved recognition systems is analyzed comprehensively in terms of the three invariances, recognition rate, and execution time. The final experimental results show that the improvements in these four algorithms achieve the desired effect, that the three invariances of the feature descriptors are not directly related to the final recognition performance of the systems, and that the four improved recognition systems perform differently under different conditions.

  10. An Advanced Rotation Invariant Descriptor for SAR Image Registration

    Directory of Open Access Journals (Sweden)

    Yuming Xiang

    2017-07-01

    Full Text Available The Scale-Invariant Feature Transform (SIFT algorithm and its many variants have been widely used in Synthetic Aperture Radar (SAR image registration. The SIFT-like algorithms maintain rotation invariance by assigning a dominant orientation for each keypoint, while the calculation of dominant orientation is not robust due to the effect of speckle noise in SAR imagery. In this paper, we propose an advanced local descriptor for SAR image registration to achieve rotation invariance without assigning a dominant orientation. Based on the improved intensity orders, we first divide a circular neighborhood into several sub-regions. Second, rotation-invariant ratio orientation histograms of each sub-region are proposed by accumulating the ratio values of different directions in a rotation-invariant coordinate system. The proposed descriptor is composed of the concatenation of the histograms of each sub-region. In order to increase the distinctiveness of the proposed descriptor, multiple image neighborhoods are aggregated. Experimental results on several satellite SAR images have shown an improvement in the matching performance over other state-of-the-art algorithms.

  11. Analysis of eroded bovine teeth through laser speckle imaging

    Science.gov (United States)

    Koshoji, Nelson H.; Bussadori, Sandra K.; Bortoletto, Carolina C.; Oliveira, Marcelo T.; Prates, Renato A.; Deana, Alessandro M.

    2015-02-01

    Dental erosion is a non-carious lesion that causes progressive wear of tooth structure through chemical processes that do not involve bacterial action. Its origin is related to eating habits or to systemic diseases involving tooth contact with substances of very low pH. This work demonstrates a new methodology to quantify erosion through coherent light scattering from the tooth surface. The technique shows a correlation between acid-etch duration and the laser speckle contrast analysis (LASCA) map. The experimental groups presented a relative contrast between eroded and sound tissue of 17.8(45)%, 23.4(68)%, 39.2(40)% and 44.3(30)% for 10 min, 20 min, 30 min and 40 min of acid etching, respectively.

  12. Regularization iteration imaging algorithm for electrical capacitance tomography

    Science.gov (United States)

    Tong, Guowei; Liu, Shi; Chen, Hongyan; Wang, Xueyao

    2018-03-01

    The image reconstruction method plays a crucial role in real-world applications of the electrical capacitance tomography technique. In this study, a new cost function that simultaneously considers the sparsity and low-rank properties of the imaging targets is proposed to improve the quality of the reconstruction images, in which the image reconstruction task is converted into an optimization problem. Within the framework of the split Bregman algorithm, an iterative scheme that splits a complicated optimization problem into several simpler sub-tasks is developed to solve the proposed cost function efficiently, in which the fast iterative shrinkage-thresholding algorithm (FISTA) is introduced to accelerate the convergence. Numerical experiment results verify the effectiveness of the proposed algorithm in improving the reconstruction precision and robustness.
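The fast iterative shrinkage thresholding step mentioned above can be sketched on a generic sparse recovery problem, min_x ½‖Ax − b‖² + λ‖x‖₁. This is a sketch of FISTA only, not the authors' full sparsity-plus-low-rank cost function or their split Bregman wrapper.

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding (shrinkage) operator."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(A, b, lam, iters=500):
    """FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2                # Lipschitz constant of grad
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(iters):
        x_new = soft(y - A.T @ (A @ y - b) / L, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x
```

The momentum sequence t is what distinguishes FISTA from plain ISTA and gives the O(1/k²) convergence rate that makes it attractive as an inner accelerator.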

  13. Laser speckle technique to study the effect of chemical pre-treatment on the quality of minimally processed apples

    International Nuclear Information System (INIS)

    Minz, Preeti D; Nirala, A K

    2016-01-01

    In the present study, the laser speckle technique has been used for the quality evaluation of chemically treated cut apples. Chemical pre-treatment included a 1% (w/v) solution of citric acid (CA), of sodium chloride (SC), and of a combination of CA and sodium chloride (CS). The variation in weight loss, respiration rate, total soluble solids (TSS), titratable acidity (TA), and absorbance of chemically treated cut apples stored at 5 °C was monitored for 11 d. The speckle grain size was calculated by an autocovariance method from the speckled images of freshly cut, chemically treated apples. The effect of the chemicals on the variation of the TSS and TA content of the cut apples correlated well with the linear speckle grain size. The circular degree of polarization confirms the presence of small scatterers and hence a Rayleigh diffusion regime. For all the treated cut apples, a decrease in the concentration of small particles from about the mid-period of storage onwards results in a fast decay of the circular degree of polarization. For non-invasive and fast analysis of the chemical constituents of fruits during minimal processing, the laser speckle technique can be used practically in the food industry. (paper)

  14. Laser speckle technique to study the effect of chemical pre-treatment on the quality of minimally processed apples

    Science.gov (United States)

    Minz, Preeti D.; Nirala, A. K.

    2016-04-01

    In the present study, the laser speckle technique has been used for the quality evaluation of chemically treated cut apples. Chemical pre-treatment included a 1% (w/v) solution of citric acid (CA), of sodium chloride (SC), and of a combination of CA and sodium chloride (CS). The variation in weight loss, respiration rate, total soluble solids (TSS), titratable acidity (TA), and absorbance of chemically treated cut apples stored at 5 °C was monitored for 11 d. The speckle grain size was calculated by an autocovariance method from the speckled images of freshly cut, chemically treated apples. The effect of the chemicals on the variation of the TSS and TA content of the cut apples correlated well with the linear speckle grain size. The circular degree of polarization confirms the presence of small scatterers and hence a Rayleigh diffusion regime. For all the treated cut apples, a decrease in the concentration of small particles from about the mid-period of storage onwards results in a fast decay of the circular degree of polarization. For non-invasive and fast analysis of the chemical constituents of fruits during minimal processing, the laser speckle technique can be used practically in the food industry.
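The autocovariance estimate of speckle grain size described in both records can be sketched as the lag at which the normalized row autocovariance drops below half its zero-lag value, averaged over rows. The half-maximum criterion here is an illustrative choice; the papers' exact width definition may differ.

```python
import numpy as np

def grain_size(img):
    """Mean speckle grain size (pixels): first lag where the normalized row
    autocovariance falls below half its zero-lag value, averaged over rows."""
    widths = []
    for row in np.asarray(img, dtype=float):
        r = row - row.mean()
        ac = np.correlate(r, r, mode="full")[r.size - 1:]   # lags 0..N-1
        if ac[0] <= 0:
            continue                                        # constant row
        below = np.nonzero(ac / ac[0] < 0.5)[0]
        if below.size:
            widths.append(below[0])
    return float(np.mean(widths))
```

Coarser speckle (larger grains) decorrelates more slowly along a row, so the half-maximum lag, and hence the returned grain size, grows with grain width.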

  15. An efficient fractal image coding algorithm using unified feature and DCT

    International Nuclear Information System (INIS)

    Zhou Yiming; Zhang Chao; Zhang Zengke

    2009-01-01

    Fractal image compression is a promising technique to improve the efficiency of image storage and transmission with a high compression ratio; however, the huge time consumption of fractal image coding is a great obstacle to practical applications. In order to speed up fractal image coding, efficient algorithms using a special unified feature and a DCT coder are proposed in this paper. Firstly, based on a necessary condition for the best-match search rule during fractal image coding, a fast algorithm using a special unified feature (UFC) is presented; it reduces the search space considerably and excludes most inappropriate matching subblocks before the best-match search. Secondly, building on the UFC algorithm and in order to improve the quality of the reconstructed image, a DCT coder is added to construct a hybrid fractal image coding algorithm (DUFC). Experimental results show that the proposed algorithms obtain good reconstructed image quality and need much less time than the baseline fractal coding algorithm.
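The pattern behind the unified-feature step, excluding most domain blocks with a cheap feature comparison before the expensive best-match search, can be illustrated with a simple normalized-deviation feature. The paper's actual unified feature is different; this only shows the pruning idea.

```python
import numpy as np

def prune_candidates(range_block, domain_blocks, tol=0.2):
    """Keep only domain blocks whose normalized deviation (std/mean) is close
    to that of the range block; full fractal matching runs on survivors only."""
    def feature(block):
        return block.std() / (block.mean() + 1e-12)
    f_r = feature(range_block)
    return [i for i, d in enumerate(domain_blocks) if abs(feature(d) - f_r) <= tol]
```

Because an affine gray-level map rescales mean and deviation together, blocks with a very different std/mean ratio cannot match well, so discarding them early shrinks the O(domains × ranges) search.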

  16. Intraoperative laser speckle contrast imaging improves the stability of rodent middle cerebral artery occlusion model

    Science.gov (United States)

    Yuan, Lu; Li, Yao; Li, Hangdao; Lu, Hongyang; Tong, Shanbao

    2015-09-01

    The rodent middle cerebral artery occlusion (MCAO) model is commonly used in stroke research. Creating a stable infarct volume has always been challenging due to variance in animal anatomy and in surgical technique, and the depth of filament suture advancement strongly influences the infarct volume. We investigated the cerebral blood flow (CBF) changes in the affected cortex using laser speckle contrast imaging while advancing the suture during MCAO surgery. The relative CBF drop area (CBF50, i.e., the percentage of cortical area with CBF less than 50% of baseline) increased from 20.9% to 69.1% when the insertion depth increased from 1.6 to 1.8 cm. Using the real-time CBF50 marker to guide suture insertion during surgery, our animal experiments showed that intraoperative CBF-guided surgery could significantly improve the stability of the MCAO model, with a more consistent infarct volume and lower mortality.
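The CBF50 marker as defined above is straightforward to compute from a baseline and a current laser speckle flow map:

```python
import numpy as np

def cbf50(baseline, current):
    """Percentage of pixels whose CBF is below 50% of the baseline map."""
    ratio = current / np.maximum(baseline, 1e-12)
    return 100.0 * float(np.mean(ratio < 0.5))
```

During suture advancement, this scalar is recomputed on each new speckle flow map, and the rising value signals when occlusion of the artery is sufficient.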

  17. Compressively sampled MR image reconstruction using generalized thresholding iterative algorithm

    Science.gov (United States)

    Elahi, Sana; kaleem, Muhammad; Omer, Hammad

    2018-01-01

    Compressed sensing (CS) is an emerging area of interest in Magnetic Resonance Imaging (MRI). CS is used for the reconstruction of the images from a very limited number of samples in k-space. This significantly reduces the MRI data acquisition time. One important requirement for signal recovery in CS is the use of an appropriate non-linear reconstruction algorithm. It is a challenging task to choose a reconstruction algorithm that would accurately reconstruct the MR images from the under-sampled k-space data. Various algorithms have been used to solve the system of non-linear equations for better image quality and reconstruction speed in CS. In the recent past, the iterative soft thresholding algorithm (ISTA) has been introduced in CS-MRI. This algorithm directly cancels the incoherent artifacts produced because of the undersampling in k-space. This paper introduces an improved iterative algorithm based on the p-thresholding technique for CS-MRI image reconstruction. The use of the p-thresholding function promotes sparsity in the image, which is a key factor for CS-based image reconstruction. The p-thresholding based iterative algorithm is a modification of ISTA, and minimizes non-convex functions. It has been shown that the proposed p-thresholding iterative algorithm can be used effectively to recover a fully sampled image from the under-sampled data in MRI. The performance of the proposed method is verified using simulated and actual MRI data taken at St. Mary's Hospital, London. The quality of the reconstructed images is measured in terms of peak signal-to-noise ratio (PSNR), artifact power (AP), and structural similarity index measure (SSIM). The proposed approach shows improved performance when compared to other iterative algorithms based on log thresholding, soft thresholding and hard thresholding techniques at different reduction factors.
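One common generalized threshold of this family is the "p-shrinkage" operator, which reduces to soft thresholding at p = 1 and shrinks large coefficients less for p < 1; plugged into a plain ISTA loop it gives the overall structure the abstract describes. This is a hedged sketch, since the paper's exact p-thresholding function is not specified here.

```python
import numpy as np

def p_shrink(x, t, p):
    """Generalized 'p-shrinkage' threshold; p = 1 gives soft thresholding."""
    absx = np.abs(x)
    with np.errstate(divide="ignore"):       # 0**(p-1) -> inf is shrunk to 0
        mag = np.maximum(absx - t ** (2.0 - p) * absx ** (p - 1.0), 0.0)
    return np.sign(x) * mag

def ista_p(A, b, lam, p=0.5, iters=300):
    """Plain ISTA with p-shrinkage in place of the soft-thresholding step."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = p_shrink(x - A.T @ (A @ x - b) / L, lam / L, p)
    return x
```

In CS-MRI, A would be the undersampled Fourier operator composed with a sparsifying transform; a dense matrix stands in for it here.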

  18. Position control of ECRH launcher mirrors by laser speckle sensor

    International Nuclear Information System (INIS)

    Michelsen, Poul K.; Bindslev, Henrik; Hansen, Rene Skov; Hanson, Steen G.

    2003-01-01

    The planned ECRH system for JET included several fixed and steerable mirrors, some of which should have been fixed to the building structure and some to the JET vessel structure. A similar system may be anticipated for ITER and for other fusion devices in the future. In order to have high reproducibility of the ECRH beam direction, it is necessary to know the exact positions of the mirrors. This is not a trivial problem because of thermal expansion of the vessel structures, of the launcher itself and of its support structure, the mechanical load on mirrors and support structures, and the accessibility of the various mirrors. We suggest using a combination of infrared diagnostics of beam spot positions and a recently published technique based on a non-contact laser speckle sensor for measuring one- and two-dimensional angular displacement. The method is based on Fourier transforming the scattered field from a single laser beam that illuminates the target. The angular distribution of the light field at the target is linearly mapped onto an array image sensor placed in the Fourier plane. Measuring the displacement of this so-called speckle pattern facilitates the determination of the mirror orientation. Transverse target movement can be measured by observing the speckle movement in the image plane of the object. No special surface treatment is required for surfaces having irregularities of the order of or larger than the wavelength of the incident light. For the JET ECRH launcher, it is mainly for the last mirror pointing towards the plasma that the technique may be useful. This mirror has to be steerable in order to reflect the microwave beam in the correct direction towards the plasma. Maximum performance of the microwave heating requires that the beam hits this mirror at its centre and that the mirror is turned to the correct angle. Inaccuracies in the positioning of the pull rods for controlling the mirror turning and thermal effects make it

  19. Autonomous algorithms for image restoration

    OpenAIRE

    Griniasty, Meir

    1994-01-01

    We describe a general theoretical framework for algorithms that adaptively tune all their parameters during the restoration of a noisy image. The adaptation procedure is based on a mean field approach known as ``Deterministic Annealing'', and is reminiscent of the ``Deterministic Boltzmann Machine''. The algorithm is less time consuming in comparison with its simulated annealing alternative. We apply the theory to several architectures and compare their performances.

  20. Analysis of Minute Features in Speckled Imagery with Maximum Likelihood Estimation

    Directory of Open Access Journals (Sweden)

    Alejandro C. Frery

    2004-12-01

    Full Text Available This paper deals with numerical problems arising when performing maximum likelihood parameter estimation in speckled imagery using small samples. The noise that appears in images obtained with coherent illumination, as is the case of sonar, laser, ultrasound-B, and synthetic aperture radar, is called speckle, and it can neither be assumed Gaussian nor additive. The properties of speckle noise are well described by the multiplicative model, a statistical framework from which stem several important distributions. Amongst these distributions, one is regarded as the universal model for speckled data, namely, the 𝒢0 law. This paper deals with amplitude data, so the 𝒢A0 distribution will be used. The literature reports that techniques for obtaining estimates (maximum likelihood, based on moments and on order statistics) of the parameters of the 𝒢A0 distribution require samples of hundreds, even thousands, of observations in order to obtain sensible values. This is verified for maximum likelihood estimation, and a proposal based on alternating optimization is made to alleviate this situation. The proposal is assessed with real and simulated data, showing that the convergence problems are no longer present. A Monte Carlo experiment is devised to estimate the quality of maximum likelihood estimators in small samples, and real data is successfully analyzed with the proposed alternating procedure. Stylized empirical influence functions are computed and used to choose a strategy for computing maximum likelihood estimates that is resistant to outliers.

  1. Comparison of analyzer-based imaging computed tomography extraction algorithms and application to bone-cartilage imaging

    International Nuclear Information System (INIS)

    Diemoz, Paul C; Bravin, Alberto; Coan, Paola; Glaser, Christian

    2010-01-01

    In x-ray phase-contrast analyzer-based imaging, the contrast is provided by a combination of absorption, refraction and scattering effects. Several extraction algorithms, which attempt to separate and quantify these different physical contributions, have been proposed and applied. In a previous work, we presented a quantitative comparison of five of the best-known extraction algorithms based on the geometrical optics approximation applied to planar images: diffraction-enhanced imaging (DEI), extended diffraction-enhanced imaging (E-DEI), generalized diffraction-enhanced imaging (G-DEI), multiple-image radiography (MIR) and Gaussian curve fitting (GCF). In this paper, we compare these algorithms in the case of the computed tomography (CT) modality. The extraction algorithms are applied to analyzer-based CT images of both plastic phantoms and biological samples (cartilage-on-bone cylinders), and absorption, refraction and scattering signals are derived. Results obtained with the different algorithms may vary greatly, especially in the case of large refraction angles. We show that ABI-CT extraction algorithms can provide an excellent tool to enhance the visualization of cartilage internal structures, which may find applications in a clinical context. In addition, using the refraction images, the refractive index decrements for both the cartilage matrix and the cartilage cells have been estimated.

  2. Low-Complexity Regularization Algorithms for Image Deblurring

    KAUST Repository

    Alanazi, Abdulrahman

    2016-11-01

    Image restoration problems deal with images in which information has been degraded by blur or noise. In practice, the blur is usually caused by atmospheric turbulence, motion, camera shake, and several other mechanical or physical processes. In this study, we present two regularization algorithms for the image deblurring problem. We first present a new method based on solving a regularized least-squares (RLS) problem. This method is proposed to find a near-optimal value of the regularization parameter in the RLS problems. Experimental results on the non-blind image deblurring problem are presented. In all experiments, comparisons are made with three benchmark methods. The results demonstrate that the proposed method clearly outperforms the other methods in terms of both the output PSNR and structural similarity, as well as the visual quality of the deblurred images. To reduce the complexity of the proposed algorithm, we propose a technique based on the bootstrap method to estimate the regularization parameter in low and high-resolution images. Numerical results show that the proposed technique can effectively reduce the computational complexity of the proposed algorithms. In addition, for some cases where the point spread function (PSF) is separable, we propose using a Kronecker product so as to reduce the computations. Furthermore, in the case where the image is smooth, it is always desirable to replace the regularization term in the RLS problems by a total variation term. Therefore, we propose a novel method for adaptively selecting the regularization parameter in a so-called square root regularized total variation (SRTV). Experimental results demonstrate that our proposed method outperforms the other benchmark methods when applied to smooth images in terms of PSNR, SSIM and the restored image quality. In this thesis, we focus on the non-blind image deblurring problem, where the blur kernel is assumed to be known. However, we developed algorithms that also work
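
For the non-blind case with a known, spatially invariant PSF and periodic boundary conditions, the regularized least-squares deblurring step has a closed form in the Fourier domain. The sketch below shows that core step only; the regularization-parameter selection and bootstrap techniques of the thesis are not reproduced, and `lam` is simply fixed by hand.

```python
import numpy as np

def tikhonov_deblur(blurred, psf, lam):
    """Closed-form solution of min_x ||h * x - b||^2 + lam*||x||^2
    for circular convolution, computed with the FFT."""
    H = np.fft.fft2(psf, s=blurred.shape)  # transfer function of the blur
    B = np.fft.fft2(blurred)
    X = np.conj(H) * B / (np.abs(H) ** 2 + lam)  # regularized Wiener-style division
    return np.real(np.fft.ifft2(X))
```

Larger `lam` suppresses noise amplification at frequencies where the blur nearly vanishes, at the price of a smoother estimate.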

  3. A Stereo Dual-Channel Dynamic Programming Algorithm for UAV Image Stitching.

    Science.gov (United States)

    Li, Ming; Chen, Ruizhi; Zhang, Weilong; Li, Deren; Liao, Xuan; Wang, Lei; Pan, Yuanjin; Zhang, Peng

    2017-09-08

    Dislocation is one of the major challenges in unmanned aerial vehicle (UAV) image stitching. In this paper, we propose a new algorithm for seamlessly stitching UAV images based on a dynamic programming approach. Our solution consists of two steps: Firstly, an image matching algorithm is used to correct the images so that they are in the same coordinate system. Secondly, a new dynamic programming algorithm is developed based on the concept of stereo dual-channel energy accumulation. A new energy aggregation and traversal strategy is adopted in our solution, which can find a more optimal seam line for image stitching. Our algorithm overcomes the theoretical limitation of the classical Duplaquet algorithm. Experiments show that the algorithm can effectively solve the dislocation problem in UAV image stitching, especially in dense urban areas. Our solution is also direction-independent, giving it better adaptability and robustness when stitching images.
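
The dynamic programming core of seam-line search is easy to illustrate. The sketch below finds a minimal-cost top-to-bottom seam through a single energy map, in the spirit of the classical Duplaquet formulation; the paper's contribution (accumulating energy over two stereo channels with a new traversal strategy) is not reproduced here.

```python
import numpy as np

def min_vertical_seam(energy):
    """Return the column index of a minimal-cost seam for each row,
    allowing moves to the three neighbouring columns below."""
    h, w = energy.shape
    cost = energy.astype(float).copy()
    # forward pass: accumulate the cheapest path cost into each cell
    for r in range(1, h):
        for c in range(w):
            lo, hi = max(c - 1, 0), min(c + 2, w)
            cost[r, c] += cost[r - 1, lo:hi].min()
    # backward pass: trace the seam from the cheapest bottom cell
    seam = [int(np.argmin(cost[-1]))]
    for r in range(h - 2, -1, -1):
        c = seam[-1]
        lo, hi = max(c - 1, 0), min(c + 2, w)
        seam.append(lo + int(np.argmin(cost[r, lo:hi])))
    seam.reverse()
    return seam
```

Stitching then blends the two overlapping images along the returned seam.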

  4. [Speckle tracking--a new ultrasound tool for the assessment of fetal myocardial function].

    Science.gov (United States)

    Willruth, A; Geipel, A; Merz, W; Gembruch, U

    2012-06-01

    Speckle tracking is a new ultrasound tool to assess 2D ventricular global and segmental myocardial velocity and deformation (strain, strain rate). Multiple factors such as fetal motion, high heart rates, low blood pressure, the small size of the heart, physiological cardiac translation, filling and maturational changes of the myocardium, polyhydramnios, maternal obesity and aortic pulsation can degrade the image quality and result in artifacts and measurement errors which may have an impact on the final analysis. Deformation indices such as strain and strain rate nevertheless offer a quantitative technique for the estimation of global and segmental myocardial function and contractility. At present, longitudinal peak systolic strain is the most commonly applied deformation parameter used to analyse segmental and global myocardial contractility in adults. When obtained using Doppler methods, these measurements are angle dependent, whereas speckle tracking techniques overcome this limitation of Doppler echocardiography, which is a particular advantage in fetal echocardiography. Nevertheless, the time and training necessary to acquire high-quality video clips limit the implementation of speckle tracking in clinical routine. It is not yet clear whether this new technique will identify subclinical myocardial impairment earlier than current techniques or allow for better discrimination between healthy fetuses and fetuses with congenital heart disease. The clinical use of speckle tracking will have to be demonstrated in larger groups of complicated pregnancies. © Georg Thieme Verlag KG Stuttgart · New York.

  5. Enhancement of SAR images using fuzzy shrinkage technique

    Indian Academy of Sciences (India)

    This paper presents speckle noise reduction in SAR images using a combination of curvelet and fuzzy logic techniques to restore speckle-affected images. This method overcomes the limitations of discontinuity in hard thresholding and permanent deviation in soft thresholding. First, it decomposes the noisy image into different ...

  6. SAR Imagery Segmentation by Statistical Region Growing and Hierarchical Merging

    Energy Technology Data Exchange (ETDEWEB)

    Ushizima, Daniela Mayumi; Carvalho, E.A.; Medeiros, F.N.S.; Martins, C.I.O.; Marques, R.C.P.; Oliveira, I.N.S.

    2010-05-22

    This paper presents an approach to synthetic aperture radar (SAR) image segmentation; such images are corrupted by speckle noise. Some ordinary segmentation techniques may require speckle filtering beforehand. Our approach performs radar image segmentation using the original noisy pixels as input data, eliminating preprocessing steps, an advantage over most current methods. The algorithm comprises a statistical region growing procedure combined with hierarchical region merging to extract regions of interest from SAR images. The region growing step over-segments the input image to enable region aggregation, employing a combination of the Kolmogorov-Smirnov (KS) test with a hierarchical stepwise optimization (HSWO) algorithm for process coordination. We have tested and assessed the proposed technique on artificially speckled images and real SAR data containing different types of targets.

  7. LSB Based Quantum Image Steganography Algorithm

    Science.gov (United States)

    Jiang, Nan; Zhao, Na; Wang, Luo

    2016-01-01

    Quantum steganography is the technique which hides a secret message into quantum covers such as quantum images. In this paper, two blind LSB steganography algorithms in the form of quantum circuits are proposed based on the novel enhanced quantum representation (NEQR) for quantum images. One algorithm is plain LSB which uses the message bits to substitute for the pixels' LSB directly. The other is block LSB which embeds a message bit into a number of pixels that belong to one image block. The extracting circuits can regain the secret message only according to the stego cover. Analysis and simulation-based experimental results demonstrate that the invisibility is good, and the balance between the capacity and the robustness can be adjusted according to the needs of applications.
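
Although the paper's algorithms are expressed as quantum circuits over NEQR images, the plain-LSB embedding they implement has a direct classical analogue, sketched below for illustration.

```python
import numpy as np

def embed_lsb(pixels, bits):
    """Plain LSB embedding: overwrite the least significant bit of the
    first len(bits) pixels with the message bits."""
    flat = pixels.ravel().copy()
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | np.asarray(bits, dtype=flat.dtype)
    return flat.reshape(pixels.shape)

def extract_lsb(pixels, n):
    """Blind extraction: read the LSB of the first n pixels of the stego cover."""
    return (pixels.ravel()[:n] & 1).tolist()
```

Block LSB differs only in spreading each message bit over the pixels of one image block, trading capacity for robustness.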

  8. Thinning an object boundary on digital image using pipelined algorithm

    International Nuclear Information System (INIS)

    Dewanto, S.; Aliyanta, B.

    1997-01-01

    In digital image processing, thinning of an object boundary is required to analyze the image structure by measuring parameters such as the area and circumference of the image object. The process needs sufficiently large memory and is time consuming if all the image pixels are stored in memory and subsequent processing starts only after all pixels have been transformed. A pipelined algorithm can reduce the time used in the process. This algorithm uses a buffer memory whose size can be adjusted, and the next thinning step does not need to wait for the transformation of all pixels. This paper describes the pipelined algorithm with some results of its use on digital images.
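
The abstract does not spell out the thinning rule itself, so as an illustration the sketch below uses the standard Zhang-Suen two-subiteration thinning, a common choice for reducing objects to one-pixel-wide skeletons; the paper's pipelined, buffered organization of the computation is not reproduced.

```python
import numpy as np

def zhang_suen_thin(img):
    """Thin a binary image (1 = object) to a roughly 1-pixel-wide skeleton."""
    img = img.astype(np.uint8).copy()
    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_delete = []
            for r in range(1, img.shape[0] - 1):
                for c in range(1, img.shape[1] - 1):
                    if img[r, c] != 1:
                        continue
                    # 8-neighbourhood, clockwise starting from the north pixel
                    p = [img[r-1, c], img[r-1, c+1], img[r, c+1], img[r+1, c+1],
                         img[r+1, c], img[r+1, c-1], img[r, c-1], img[r-1, c-1]]
                    b = sum(p)  # number of object neighbours
                    a = sum(p[i] == 0 and p[(i + 1) % 8] == 1 for i in range(8))
                    if not (2 <= b <= 6 and a == 1):
                        continue
                    if step == 0 and p[0] * p[2] * p[4] == 0 and p[2] * p[4] * p[6] == 0:
                        to_delete.append((r, c))
                    elif step == 1 and p[0] * p[2] * p[6] == 0 and p[0] * p[4] * p[6] == 0:
                        to_delete.append((r, c))
            for r, c in to_delete:
                img[r, c] = 0
            changed = changed or bool(to_delete)
    return img
```

Because each sub-iteration reads a fixed 3x3 neighbourhood, rows can be streamed through a small line buffer, which is exactly the kind of computation a pipelined implementation exploits.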

  9. Evaluation of Underwater Image Enhancement Algorithms under Different Environmental Conditions

    Directory of Open Access Journals (Sweden)

    Marino Mangeruga

    2018-01-01

    Full Text Available Underwater images usually suffer from poor visibility, lack of contrast and colour casting, mainly due to light absorption and scattering. In the literature, there are many algorithms that aim to enhance the quality of underwater images through different approaches. Our purpose was to identify an algorithm that performs well in different environmental conditions. We selected some algorithms from the state of the art and employed them to enhance a dataset of images produced in various underwater sites, representing different environmental and illumination conditions. These enhanced images were evaluated with several quantitative metrics. By analysing the results of these metrics, we tried to understand which of the selected algorithms performed better than the others. Another purpose of our research was to establish whether a quantitative metric is enough to judge the behaviour of an underwater image enhancement algorithm. We demonstrate that, even if the metrics can provide an indicative estimate of image quality, they can lead to inconsistent or erroneous evaluations.

  10. Ectoparasites and intestinal helminths of speckled pigeon ...

    African Journals Online (AJOL)

    Ectoparasites and intestinal helminths of speckled pigeon ( Columba guinea Hartlaub and Finsch 1870) in Zaria, Nigeria. ... Science World Journal ... A total of 30 (20 males and 10 females) Speckled Pigeons trapped from the wild in Zaria and its environs, Nigeria, were examined for ectoparasites and intestinal helminths, ...

  11. Improved cancer diagnostics by different image processing techniques on OCT images

    Science.gov (United States)

    Kanawade, Rajesh; Lengenfelder, Benjamin; Marini Menezes, Tassiana; Hohmann, Martin; Kopfinger, Stefan; Hohmann, Tim; Grabiec, Urszula; Klämpfl, Florian; Gonzales Menezes, Jean; Waldner, Maximilian; Schmidt, Michael

    2015-07-01

    Optical coherence tomography (OCT) is a promising non-invasive, high-resolution imaging modality which can be used for cancer diagnosis and its therapeutic assessment. However, speckle noise makes detection of cancer boundaries and image segmentation problematic and unreliable. Therefore, to improve the image analysis for precise cancer border detection, the performance of different image processing algorithms such as the mean, median and hybrid median filters and the rotational kernel transformation (RKT) is investigated. This is done on OCT images acquired from ex-vivo human cancerous mucosa and, in vitro, from cultivated tumours applied to organotypic hippocampal slice cultures. The preliminary results confirm that the border between healthy and cancerous lesions can be identified precisely. The obtained results are verified with fluorescence microscopy. This research can improve cancer diagnosis and the detection of borders between healthy and cancerous tissue. Thus, it could also reduce the number of biopsies required during screening endoscopy by providing better guidance to the physician.
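
Of the filters compared, the hybrid median is the least standard, so a sketch may help: it takes the median of three values (the median of the plus-shaped neighbours, the median of the diagonal neighbours, and the centre pixel), which preserves edges and corners better than a plain median. A minimal 3x3 NumPy version, written with explicit loops for clarity:

```python
import numpy as np

def hybrid_median_3x3(img):
    """Edge-preserving hybrid median filter on the interior of an image."""
    img = np.asarray(img, dtype=float)
    out = img.copy()
    for r in range(1, img.shape[0] - 1):
        for c in range(1, img.shape[1] - 1):
            plus = [img[r-1, c], img[r+1, c], img[r, c-1], img[r, c+1], img[r, c]]
            diag = [img[r-1, c-1], img[r-1, c+1], img[r+1, c-1], img[r+1, c+1], img[r, c]]
            out[r, c] = np.median([np.median(plus), np.median(diag), img[r, c]])
    return out
```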

  12. Spatial compression algorithm for the analysis of very large multivariate images

    Science.gov (United States)

    Keenan, Michael R. (Albuquerque, NM)

    2008-07-15

    A method for spatially compressing data sets enables the efficient analysis of very large multivariate images. The spatial compression algorithms use a wavelet transformation to map an image into a compressed image containing a smaller number of pixels that retain the original image's information content. Image analysis can then be performed on a compressed data matrix consisting of a reduced number of significant wavelet coefficients. Furthermore, a block algorithm can be used for performing common operations more efficiently. The spatial compression algorithms can be combined with spectral compression algorithms to provide further computational efficiencies.
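
The core of such spatial compression is a wavelet transform that concentrates the image's information into few coefficients. A minimal single-level 2D Haar pair (forward and inverse) is sketched below; the patent's actual choice of wavelet, number of levels, and coefficient-selection rule are not specified here, so this is only an illustration.

```python
import numpy as np

def haar2d(img):
    """One level of the 2D Haar transform (image dimensions must be even)."""
    a = (img[:, 0::2] + img[:, 1::2]) / 2.0   # horizontal average
    d = (img[:, 0::2] - img[:, 1::2]) / 2.0   # horizontal detail
    ll = (a[0::2] + a[1::2]) / 2.0            # approximation band
    lh = (a[0::2] - a[1::2]) / 2.0
    hl = (d[0::2] + d[1::2]) / 2.0
    hh = (d[0::2] - d[1::2]) / 2.0
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    """Exact inverse of haar2d."""
    h, w = ll.shape
    a = np.zeros((2 * h, w))
    d = np.zeros((2 * h, w))
    a[0::2] = ll + lh
    a[1::2] = ll - lh
    d[0::2] = hl + hh
    d[1::2] = hl - hh
    img = np.zeros((2 * h, 2 * w))
    img[:, 0::2] = a + d
    img[:, 1::2] = a - d
    return img
```

Compression then amounts to keeping only the largest-magnitude coefficients before inverting, so subsequent analysis runs on a much smaller data matrix.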

  13. Optimization of image processing algorithms on mobile platforms

    Science.gov (United States)

    Poudel, Pramod; Shirvaikar, Mukul

    2011-03-01

    This work presents a technique to optimize popular image processing algorithms on mobile platforms such as cell phones, net-books and personal digital assistants (PDAs). The increasing demand for video applications like context-aware computing on mobile embedded systems requires the use of computationally intensive image processing algorithms. The system engineer has a mandate to optimize them so as to meet real-time deadlines. A methodology to take advantage of the asymmetric dual-core processor, which includes an ARM and a DSP core supported by shared memory, is presented with implementation details. The target platform chosen is the popular OMAP 3530 processor for embedded media systems. It has an asymmetric dual-core architecture with an ARM Cortex-A8 and a TMS320C64x Digital Signal Processor (DSP). The development platform was the BeagleBoard with 256 MB of NAND RAM and 256 MB SDRAM memory. The basic image correlation algorithm is chosen for benchmarking as it finds widespread application for various template matching tasks such as face-recognition. The basic algorithm prototypes conform to OpenCV, a popular computer vision library. OpenCV algorithms can be easily ported to the ARM core which runs a popular operating system such as Linux or Windows CE. However, the DSP is architecturally more efficient at handling DFT algorithms. The algorithms are tested on a variety of images and performance results are presented measuring the speedup obtained due to dual-core implementation. A major advantage of this approach is that it allows the ARM processor to perform important real-time tasks, while the DSP addresses performance-hungry algorithms.
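
The benchmark kernel, normalized cross-correlation for template matching, can be stated compactly. The brute-force NumPy sketch below is the reference computation that the paper offloads to the DSP (typically via FFT-based correlation); it is an illustration, not the OpenCV or DSP implementation.

```python
import numpy as np

def normalized_cross_correlation(image, template):
    """Score every placement of template inside image; 1.0 = perfect match."""
    th, tw = template.shape
    t = template - template.mean()
    tn = np.sqrt((t ** 2).sum())
    H = image.shape[0] - th + 1
    W = image.shape[1] - tw + 1
    out = np.zeros((H, W))
    for r in range(H):
        for c in range(W):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            d = np.sqrt((wz ** 2).sum()) * tn
            out[r, c] = (wz * t).sum() / d if d > 0 else 0.0
    return out
```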

  14. An Efficient SAR Image Segmentation Framework Using Transformed Nonlocal Mean and Multi-Objective Clustering in Kernel Space

    Directory of Open Access Journals (Sweden)

    Dongdong Yang

    2015-02-01

    Full Text Available Synthetic aperture radar (SAR) image segmentation usually involves two crucial issues: a suitable speckle noise removal technique and an effective image segmentation methodology. Here, an efficient SAR image segmentation method considering both aspects is presented. As for the first issue, the well-known nonlocal mean (NLM) filter is introduced in this study to suppress the multiplicative speckle noise in SAR images. Furthermore, to achieve higher denoising accuracy, the local neighboring pixels in the searching window are projected into a lower-dimensional subspace by principal component analysis (PCA), and the nonlocal mean filter is implemented in this subspace. Afterwards, a multi-objective clustering algorithm is proposed using the principles of artificial immune systems (AIS) and kernel-induced distance measures. Multi-objective clustering has been shown to discover data distributions with different characteristics, and kernel methods improve its robustness to noise and outliers. Experiments demonstrate that the proposed method is able to partition SAR images more robustly and accurately than conventional approaches.

  15. Wholefield displacement measurements using speckle image processing techniques for crash tests

    Science.gov (United States)

    Sriram, P.; Hanagud, S.; Ranson, W. F.

    The digital correlation scheme of Peters et al. (1983) was extended to measure out-of-plane deformations, using a white light projection speckle technique. A simple ray optic theory and the digital correlation scheme are outlined. The technique was applied successfully to measure out-of-plane displacements of initially flat rotorcraft structures (an acrylic circular plate and a steel cantilever beam), using a low cost video camera and a desktop computer. The technique can be extended to measurements of three-dimensional deformations and dynamic deformations.
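
The correlation step underlying such speckle displacement measurements can be sketched with an FFT: the cross-correlation of the reference and displaced speckle images peaks at the shift. This integer-pixel illustration assumes periodic wrap-around; the cited work's subimage correlation and ray-optic projection model for out-of-plane motion are not reproduced here.

```python
import numpy as np

def speckle_shift(ref, moved):
    """Recover an integer (dy, dx) translation from the cross-correlation peak."""
    corr = np.fft.ifft2(np.fft.fft2(moved) * np.conj(np.fft.fft2(ref)))
    dy, dx = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    h, w = ref.shape
    # fold peak coordinates beyond the half-size into negative shifts
    return (dy - h if dy > h // 2 else dy, dx - w if dx > w // 2 else dx)
```

Sub-pixel accuracy is usually obtained by interpolating around the correlation peak.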

  16. Comparison of cerebral microcirculation of alloxan diabetes and healthy mice using laser speckle contrast imaging

    Science.gov (United States)

    Timoshina, Polina A.; Shi, Rui; Zhang, Yang; Zhu, Dan; Semyachkina-Glushkovskaya, Oxana V.; Tuchin, Valery V.; Luo, Qingming

    2015-03-01

    The study of blood microcirculation is one of the most important problems in medicine. This paper presents results of an experimental study of cerebral blood flow microcirculation in mice with alloxan-induced diabetes using Temporal Laser Speckle Imaging (TLSI). Additionally, the direct effect of glucose water solutions (concentrations 20% and 45%) on blood flow microcirculation was studied. In the research, 20 white laboratory mice weighing 20-30 g were used. The TLSI method allows one to investigate time-dependent scattering from objects with complex dynamics, since it possesses high temporal resolution. Results show that in the brains of the diabetic group the diameter of the sagittal vein is increased and the speed of blood flow reduced relative to the control group. Topical application of 20% or 45% glucose solutions also causes an increase in the diameter of blood vessels and slows down blood circulation. The results obtained show that diabetes development causes changes in the cerebral microcirculatory system, and TLSI techniques can be effectively used to quantify these alterations.
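
The TLSI quantity itself is simple: for each pixel, the temporal speckle contrast K = sigma/mean over a stack of frames, where lower K indicates faster moving scatterers (blood flow). A minimal sketch:

```python
import numpy as np

def temporal_speckle_contrast(stack):
    """Per-pixel temporal contrast K = std/mean over a (frames, H, W) stack."""
    mean = stack.mean(axis=0)
    std = stack.std(axis=0)
    # guard against division by zero in dark pixels
    return np.divide(std, mean, out=np.zeros_like(mean), where=mean > 0)
```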

  17. Applying laser speckle images to skin science: skin lesion differentiation by polarization

    Science.gov (United States)

    Lee, Tim K.; Tchvialeva, Lioudmila; Dhadwal, Gurbir; Sotoodian, Bahman; Kalai, Sunil; Zeng, Haishan; Lui, Harvey; McLean, David I.

    2012-01-01

    Skin cancer is a worldwide health problem. It is the most common cancer in countries with a large white population; furthermore, the incidence of malignant melanoma, the most dangerous form of skin cancer, has been increasing steadily over the last three decades. There is an urgent need to develop in-vivo, noninvasive diagnostic tools for the disease. This paper attempts to respond to the challenge by introducing a simple and fast method based on polarization and laser speckle. The degree of polarization maintenance estimates the fraction of the backscattered speckle field that retains linear polarization. Clinical experiments on 214 skin lesions including malignant melanomas, squamous cell carcinomas, basal cell carcinomas, nevi, and seborrheic keratoses demonstrated that such a parameter can potentially discriminate different skin lesion types. ROC analyses showed that malignant melanoma and seborrheic keratosis could be differentiated by both the blue and red lasers with areas under the curve (AUC) of 0.8 and 0.7, respectively. Malignant melanoma and squamous cell carcinoma could also be separated by the blue laser (AUC = 0.9), while nevus and seborrheic keratosis could be identified using the red laser (AUC = 0.7). These experiments demonstrate that polarization could be a potential in-vivo diagnostic indicator for skin diseases.

  18. Analysis and improvement of a chaos-based image encryption algorithm

    International Nuclear Information System (INIS)

    Xiao Di; Liao Xiaofeng; Wei Pengcheng

    2009-01-01

    The security of digital images has attracted much attention recently. In Guan et al. [Guan Z, Huang F, Guan W. Chaos-based image encryption algorithm. Phys Lett A 2005; 346: 153-7.], a chaos-based image encryption algorithm was proposed. In this paper, the cause of potential flaws in the original algorithm is analyzed in detail, and the corresponding enhancement measures are then proposed. Both theoretical analysis and computer simulation indicate that the improved algorithm can overcome these flaws and maintain all the merits of the original one.

  19. Multilevel Image Segmentation Based on an Improved Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    Kai Chen

    2016-01-01

    Full Text Available Multilevel image segmentation is time-consuming and involves large computation. The firefly algorithm has been applied to enhance the efficiency of multilevel image segmentation. However, in some cases the firefly algorithm is easily trapped in local optima. In this paper, an improved firefly algorithm (IFA) is proposed to search for multilevel thresholds. In IFA, in order to help fireflies escape from local optima and accelerate convergence, two strategies (a diversity-enhancing strategy with Cauchy mutation and a neighborhood strategy) are proposed and adaptively chosen according to different stagnation situations. The proposed IFA is compared with three benchmark optimization algorithms: Darwinian particle swarm optimization, hybrid differential evolution optimization, and the firefly algorithm. The experimental results show that the proposed method can efficiently segment multilevel images and obtains better performance than the other three methods.
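
The objective such fireflies search over is typically the multilevel Otsu criterion: choose thresholds that maximize the between-class variance of the resulting pixel classes. The exhaustive two-threshold baseline below makes the O(L^2) cost of brute force explicit, which is precisely the cost metaheuristics like IFA are meant to avoid; it is an illustration, not the paper's algorithm.

```python
import numpy as np

def two_threshold_otsu(pixels, levels=256):
    """Exhaustive search for two thresholds maximizing between-class variance."""
    pixels = np.asarray(pixels).ravel()
    mu = pixels.mean()
    best, best_t = -1.0, (0, 0)
    for t1 in range(1, levels - 1):
        for t2 in range(t1 + 1, levels):
            classes = [pixels[pixels < t1],
                       pixels[(pixels >= t1) & (pixels < t2)],
                       pixels[pixels >= t2]]
            var = 0.0
            for c in classes:
                if c.size:
                    var += c.size / pixels.size * (c.mean() - mu) ** 2
            if var > best:
                best, best_t = var, (t1, t2)
    return best_t
```

With k thresholds the search space grows combinatorially, which is why population-based optimizers are attractive for the multilevel case.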

  20. Secure image encryption algorithm design using a novel chaos based S-Box

    International Nuclear Information System (INIS)

    Çavuşoğlu, Ünal; Kaçar, Sezgin; Pehlivan, Ihsan; Zengin, Ahmet

    2017-01-01

    Highlights: • A new chaotic system is developed for creating the S-Box and the image encryption algorithm. • A chaos-based random number generator is designed with the help of the new chaotic system; NIST tests are run on the generated random numbers to verify randomness. • A new S-Box design algorithm is developed to create the chaos-based S-Box to be utilized in the encryption algorithm, and performance tests are made. • The newly developed S-Box based image encryption algorithm is introduced and an image encryption application is carried out. • To show the quality and strength of the encryption process, security analyses are performed and compared with AES and chaos algorithms. - Abstract: In this study, an encryption algorithm that uses a chaos-based S-Box is developed for secure and fast image encryption. First of all, a new chaotic system is developed for creating the S-Box and the image encryption algorithm. A chaos-based random number generator is designed with the help of the new chaotic system. Then, NIST tests are run on the generated random numbers to verify randomness. A new S-Box design algorithm is developed to create the chaos-based S-Box to be utilized in the encryption algorithm, and performance tests are made. As the next step, the newly developed S-Box based image encryption algorithm is introduced in detail. Finally, an image encryption application is carried out. To show the quality and strength of the encryption process, security analyses are performed and the proposed algorithm is compared with AES and chaos algorithms. According to the test results, the proposed image encryption algorithm is secure and fast for image encryption applications.

  1. Image analysis of speckle patterns as a probe of melting transitions in laser-heated diamond anvil cell experiments.

    Science.gov (United States)

    Salem, Ran; Matityahu, Shlomi; Melchior, Aviva; Nikolaevsky, Mark; Noked, Ori; Sterer, Eran

    2015-09-01

    The precision of melting curve measurements using laser-heated diamond anvil cell (LHDAC) is largely limited by the correct and reliable determination of the onset of melting. We present a novel image analysis of speckle interference patterns in the LHDAC as a way to define quantitative measures which enable an objective determination of the melting transition. Combined with our low-temperature customized IR pyrometer, designed for measurements down to 500 K, our setup allows studying the melting curve of materials with low melting temperatures, with relatively high precision. As an application, the melting curve of Te was measured up to 35 GPa. The results are found to be in good agreement with previous data obtained at pressures up to 10 GPa.

  2. Brain-inspired algorithms for retinal image analysis

    NARCIS (Netherlands)

    ter Haar Romeny, B.M.; Bekkers, E.J.; Zhang, J.; Abbasi-Sureshjani, S.; Huang, F.; Duits, R.; Dasht Bozorg, Behdad; Berendschot, T.T.J.M.; Smit-Ockeloen, I.; Eppenhof, K.A.J.; Feng, J.; Hannink, J.; Schouten, J.; Tong, M.; Wu, H.; van Triest, J.W.; Zhu, S.; Chen, D.; He, W.; Xu, L.; Han, P.; Kang, Y.

    2016-01-01

    Retinal image analysis is a challenging problem due to the precise quantification required and the huge numbers of images produced in screening programs. This paper describes a series of innovative brain-inspired algorithms for automated retinal image analysis, recently developed for the RetinaCheck

  3. Measurement of eye aberrations in a speckle field

    International Nuclear Information System (INIS)

    Larichev, A V; Ivanov, P V; Iroshnikov, N G; Shmalgauzen, V I

    2001-01-01

    The influence of speckles on the performance of a Shack-Hartmann wavefront sensor is investigated in eye aberration studies. The dependence of the phase distortion measurement error on the characteristic speckle size is determined experimentally. Scanning of the reference source was used to suppress the speckle structure of the laser beam scattered by the retina. The technique developed by us made it possible to study the time dependence of human eye aberrations with a resolution of 30 ms. (laser applications and other topics in quantum electronics)

  4. A Superresolution Image Reconstruction Algorithm Based on Landweber in Electrical Capacitance Tomography

    Directory of Open Access Journals (Sweden)

    Chen Deyun

    2013-01-01

    Full Text Available Image reconstruction accuracy in electrical capacitance tomography is limited by the "soft field" nature of the sensing field and the ill-conditioning of the inverse problem. This paper proposes a superresolution image reconstruction algorithm based on the Landweber method, grounded in the working principle of the electrical capacitance tomography system. The method regularizes the solution and derives a closed-form solution via fast Fourier transform of the convolution kernel, which guarantees the uniqueness of the solution and improves the stability and quality of the reconstruction results. Simulation results show that the imaging precision and real-time performance of the algorithm are better than those of the standard Landweber algorithm, offering a new approach to electrical capacitance tomography image reconstruction.
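The classical Landweber iteration underlying such reconstructions is a relaxed gradient descent on the linearized model c = S g, where S is the sensitivity matrix and c the normalized capacitance vector. A minimal numpy sketch follows; the paper's superresolution variant with its FFT-based closed-form step is not reproduced, and the projection onto [0, 1] is a common ECT convention assumed here.

```python
import numpy as np

def landweber(S, c, n_iter=200, omega=None):
    """Landweber iteration for the linearized ECT model c = S @ g.

    S     : (m, n) sensitivity matrix
    c     : (m,) measured normalized capacitances
    omega : relaxation factor, 0 < omega < 2 / ||S||_2^2 for convergence
    """
    if omega is None:
        omega = 1.0 / np.linalg.norm(S, 2) ** 2   # safe default step size
    g = np.zeros(S.shape[1])
    for _ in range(n_iter):
        g = g + omega * S.T @ (c - S @ g)   # gradient step on ||c - S g||^2
        g = np.clip(g, 0.0, 1.0)            # keep permittivity in physical range
    return g
```

Each iteration back-projects the capacitance residual through the transposed sensitivity matrix, which is why the method is slow to converge and why accelerated or closed-form variants such as the one in this paper are of interest.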

  5. A fast global fitting algorithm for fluorescence lifetime imaging microscopy based on image segmentation.

    Science.gov (United States)

    Pelet, S; Previte, M J R; Laiho, L H; So, P T C

    2004-10-01

    Global fitting algorithms have been shown to improve effectively the accuracy and precision of the analysis of fluorescence lifetime imaging microscopy data. Global analysis performs better than unconstrained data fitting when prior information exists, such as the spatial invariance of the lifetimes of individual fluorescent species. The highly coupled nature of global analysis often results in a significantly slower convergence of the data fitting algorithm as compared with unconstrained analysis. Convergence speed can be greatly accelerated by providing appropriate initial guesses. Realizing that the image morphology often correlates with fluorophore distribution, a global fitting algorithm has been developed to assign initial guesses throughout an image based on a segmentation analysis. This algorithm was tested on both simulated data sets and time-domain lifetime measurements. We have successfully measured fluorophore distribution in fibroblasts stained with Hoechst and calcein. This method further allows second harmonic generation from collagen and elastin autofluorescence to be differentiated in fluorescence lifetime imaging microscopy images of ex vivo human skin. On our experimental measurement, this algorithm increased convergence speed by over two orders of magnitude and achieved significantly better fits. Copyright 2004 Biophysical Society

  6. A chaos-based image encryption algorithm with variable control parameters

    International Nuclear Information System (INIS)

    Wang Yong; Wong, K.-W.; Liao Xiaofeng; Xiang Tao; Chen Guanrong

    2009-01-01

    In recent years, a number of image encryption algorithms based on the permutation-diffusion structure have been proposed. However, the control parameters used in the permutation stage are usually fixed in the whole encryption process, which favors attacks. In this paper, a chaos-based image encryption algorithm with variable control parameters is proposed. The control parameters used in the permutation stage and the keystream employed in the diffusion stage are generated from two chaotic maps related to the plain-image. As a result, the algorithm can effectively resist all known attacks against permutation-diffusion architectures. Theoretical analyses and computer simulations both confirm that the new algorithm possesses high security and fast encryption speed for practical image encryption.
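A minimal sketch of the generic permutation-diffusion structure the abstract describes, using a single logistic map for both stages: the chaotic sequence is sorted to obtain the pixel permutation, and further iterates are quantized into an XOR keystream. Note that this toy version uses fixed control parameters, which is exactly the weakness the paper addresses by making them plain-image-dependent; the map, the key values, and the function names are illustrative only.

```python
import numpy as np

def chaotic_sequence(x0, r, n, burn=100):
    """Iterate the logistic map x -> r*x*(1-x), discarding a transient."""
    x = x0
    for _ in range(burn):
        x = r * x * (1.0 - x)
    seq = np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        seq[i] = x
    return seq

def encrypt(img, x0=0.3567, r=3.99):
    flat = img.flatten()
    n = flat.size
    seq = chaotic_sequence(x0, r, 2 * n)
    perm = np.argsort(seq[:n])                     # permutation stage
    keystream = (seq[n:] * 256).astype(np.uint8)   # diffusion keystream
    return (flat[perm] ^ keystream).reshape(img.shape)

def decrypt(cipher, x0=0.3567, r=3.99):
    flat = cipher.flatten()
    n = flat.size
    seq = chaotic_sequence(x0, r, 2 * n)
    perm = np.argsort(seq[:n])
    keystream = (seq[n:] * 256).astype(np.uint8)
    shuffled = flat ^ keystream
    out = np.empty_like(shuffled)
    out[perm] = shuffled                           # invert the pixel shuffle
    return out.reshape(cipher.shape)
```

The paper's contribution is precisely that `x0` and `r` (and the keystream) should depend on the plain-image rather than stay fixed across encryptions.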

  7. An in vivo analysis of facial muscle change treated with botulinum toxin type A using digital image speckle correlation

    Science.gov (United States)

    Xu, Yan; Palmaccio, Samantha Palmaccio; Bui, Duc; Dagum, Alexander; Rafailovich, Miriam

    The neuromuscular blocking agent botulinum toxin type A (BTX-A), in clinical use since the early 1980s, has long been used to reduce wrinkles. However, little research has quantified the change in muscle contraction before and after injection, and most studies rely on subjective evaluation by patients and surgeons. In our research, Digital Image Speckle Correlation (DISC) was employed to study the mechanical properties of skin, the contraction mode of the injected muscles, and the reaction of neighboring, un-injected muscle groups. In addition, the displacement patterns (vector maps) generated by DISC can predict the injection locus for surgeons, who normally choose it by visual observation alone.

  8. A hash-based image encryption algorithm

    Science.gov (United States)

    Cheddad, Abbas; Condell, Joan; Curran, Kevin; McKevitt, Paul

    2010-03-01

    There exist several algorithms that deal with text encryption. However, there has been little research carried out to date on encrypting digital images or video files. This paper describes a novel way of encrypting digital images with password protection using a 1D SHA-2 algorithm coupled with a compound forward transform. A spatial mask is generated from the frequency domain by taking advantage of the conjugate symmetry of the complex imaginary part of the Fourier transform. This mask is then XORed with the bit stream of the original image. Exclusive OR (XOR) is a symmetric logical operation that yields 0 if both binary pixels are equal and 1 otherwise; this can be verified simply as (pixel1 + pixel2) mod 2. Finally, confusion is applied based on the displacement of the cipher's pixels in accordance with a reference mask. Both security and performance aspects of the proposed method are analyzed, and prove that the method is efficient and secure from a cryptographic point of view. One of the merits of such an algorithm is that it forces a continuous-tone payload, a steganographic term, to map onto a balanced bit-distribution sequence. This bit balance is needed in certain applications, such as steganography and watermarking, since it is likely to have a balanced perceptibility effect on the cover image when embedding.
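The XOR stage is symmetric, so the same function both encrypts and decrypts. The sketch below keeps the idea of a password-dependent mask derived from a SHA-256 digest, but expands the digest with a seeded pseudo-random generator instead of the paper's Fourier-domain construction; that substitution, and the function names, are assumptions for illustration.

```python
import hashlib
import numpy as np

def password_mask(password, shape):
    """Expand a SHA-256 digest of the password into a repeatable byte mask.

    (The paper derives its mask in the Fourier domain; the seeded
    generator here is a simplification that keeps the same XOR symmetry.)
    """
    digest = hashlib.sha256(password.encode()).digest()
    rng = np.random.default_rng(int.from_bytes(digest[:8], "big"))
    return rng.integers(0, 256, size=shape, dtype=np.uint8)

def xor_cipher(img, password):
    """Encryption and decryption are the same operation, since m ^ m = 0."""
    return img ^ password_mask(password, img.shape)
```

Applying `xor_cipher` twice with the same password returns the original image, which is the property the abstract's (pixel1 + pixel2) mod 2 remark expresses.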

  9. An Example-Based Super-Resolution Algorithm for Selfie Images

    Directory of Open Access Journals (Sweden)

    Jino Hans William

    2016-01-01

    Full Text Available A selfie is typically a self-portrait captured using the front camera of a smartphone. Most state-of-the-art smartphones are equipped with a high-resolution (HR) rear camera and a low-resolution (LR) front camera. As selfies are captured by the front camera with limited pixel resolution, fine details are inevitably missed. This paper aims to improve the resolution of selfies by exploiting the fine details in HR images captured by the rear camera using an example-based super-resolution (SR) algorithm. HR images captured by the rear camera carry significant fine detail and are used as exemplars to train an optimal matrix-value regression (MVR) operator. The MVR operator serves as an image-pair prior which learns the correspondence between LR-HR patch pairs and is effectively used to super-resolve LR selfie images. The proposed MVR algorithm avoids vectorization of image patch pairs and preserves image-level information during both the learning and recovery processes. The proposed algorithm is evaluated for its efficiency and effectiveness, both qualitatively and quantitatively, against other state-of-the-art SR algorithms. The results validate that the proposed algorithm is efficient, as it requires less than 3 seconds to super-resolve an LR selfie, and effective, as it preserves sharp details without introducing any counterfeit fine details.

  10. Speckle reduction for a laser light sectioning sensor

    Directory of Open Access Journals (Sweden)

    Tutsch Rainer

    2015-01-01

    Full Text Available Automated optical inspection is an important test procedure in electronic circuit assembly. Frequently 3D information is required, and laser light sectioning sensors are often applied. However, some effects complicate the reliable automatic detection of the shape of such assemblies and their components. The packages of electronic components are often made of black plastics or ceramics, so the intensity available for optical detection is quite low, especially in comparison to the surface of the PCBs on which the components are mounted. In addition, due to the rough surfaces of the components and the coherence length of the laser light, speckle structures arise. In the work presented here a piezo actuator is used to oscillate the illuminating laser lines along the direction of the lines. The aim is to reduce the visibility of the speckle structures by averaging while maintaining the geometrical shape of the lines. In addition, image processing methods like segmentation and skeletonization are used to allow the detection of the shape of components and assemblies even when materials with distinct differences in reflectivity are involved. The investigations include the influence of the amplitude and frequency parameters of the piezo actuator.

  11. Diagnosing cysts with correlation coefficient images from 2-dimensional freehand elastography.

    Science.gov (United States)

    Booi, Rebecca C; Carson, Paul L; O'Donnell, Matthew; Richards, Michael S; Rubin, Jonathan M

    2007-09-01

    We compared the diagnostic potential of using correlation coefficient images versus elastograms from 2-dimensional (2D) freehand elastography to characterize breast cysts. In this preliminary study, which was approved by the Institutional Review Board and compliant with the Health Insurance Portability and Accountability Act, we imaged 4 consecutive human subjects (4 cysts, 1 biopsy-verified benign breast parenchyma) with freehand 2D elastography. Data were processed offline with conventional 2D phase-sensitive speckle-tracking algorithms. The correlation coefficient in the cyst and surrounding tissue was calculated, and appearances of the cysts in the correlation coefficient images and elastograms were compared. The correlation coefficient in the cysts was considerably lower (14%-37%) than in the surrounding tissue because of the lack of sufficient speckle in the cysts, as well as the prominence of random noise, reverberations, and clutter, which decorrelated quickly. Thus, the cysts were visible in all correlation coefficient images. In contrast, the elastograms associated with these cysts each had different elastographic patterns. The solid mass in this study did not have the same high decorrelation rate as the cysts, having a correlation coefficient only 2.1% lower than that of surrounding tissue. Correlation coefficient images may produce a more direct, reliable, and consistent method for characterizing cysts than elastograms.

  12. Sensitivity study of voxel-based PET image comparison to image registration algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Yip, Stephen, E-mail: syip@lroc.harvard.edu; Chen, Aileen B.; Berbeco, Ross [Department of Radiation Oncology, Brigham and Women’s Hospital, Dana-Farber Cancer Institute and Harvard Medical School, Boston, Massachusetts 02115 (United States); Aerts, Hugo J. W. L. [Department of Radiation Oncology, Brigham and Women’s Hospital, Dana-Farber Cancer Institute and Harvard Medical School, Boston, Massachusetts 02115 and Department of Radiology, Brigham and Women’s Hospital and Harvard Medical School, Boston, Massachusetts 02115 (United States)

    2014-11-01

    Purpose: Accurate deformable registration is essential for voxel-based comparison of sequential positron emission tomography (PET) images for proper adaptation of treatment plans and treatment response assessment. The comparison may be sensitive to the method of deformable registration, as the optimal algorithm is unknown. This study investigated the impact of registration algorithm choice on therapy response evaluation. Methods: Sixteen patients with 20 lung tumors underwent pre- and post-chemoradiotherapy computed tomography (CT) and 4D FDG-PET scans. All CT images were coregistered using a rigid and ten deformable registration algorithms. The resulting transformations were then applied to the respective PET images. The tumor region defined by a physician on the registered PET images was classified into progressor, stable-disease, and responder subvolumes. Specifically, voxels with standardized uptake value (SUV) decreases >30% were classified as responder, while voxels with SUV increases >30% were progressor; all other voxels were considered stable-disease. The agreement of the subvolumes resulting from different registration algorithms was assessed by the Dice similarity index (DSI). The coefficient of variation (CV) was computed to assess the variability of DSI between individual tumors. The root mean square difference (RMS{sub rigid}) of the rigidly registered CT images was used to measure the degree of tumor deformation. RMS{sub rigid} and DSI were correlated by the Spearman correlation coefficient (R) to investigate the effect of tumor deformation on DSI. Results: Median DSI{sub rigid} was found to be 72%, 66%, and 80% for progressor, stable-disease, and responder, respectively. Median DSI{sub deformable} was 63%–84%, 65%–81%, and 82%–89%. Variability of DSI was substantial and similar for both rigid and deformable algorithms, with CV > 10% for all subvolumes. Tumor deformation had moderate to significant impact on DSI for progressor
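The voxel classification and the Dice similarity index described in this record can be sketched directly; the 30% threshold comes from the abstract, while the function names are illustrative.

```python
import numpy as np

def dice_similarity(a, b):
    """Dice similarity index of two binary masks: 2|A ∩ B| / (|A| + |B|)."""
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0                # two empty masks agree perfectly
    return 2.0 * np.logical_and(a, b).sum() / denom

def classify_response(suv_pre, suv_post, thresh=0.30):
    """Label voxels progressor / responder / stable by relative SUV change."""
    change = (suv_post - suv_pre) / suv_pre
    labels = np.full(change.shape, "stable", dtype=object)
    labels[change > thresh] = "progressor"
    labels[change < -thresh] = "responder"
    return labels
```

The study's comparison amounts to computing `dice_similarity` between the subvolume masks produced under each registration algorithm.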

  13. Statistical spatial properties of speckle patterns generated by multiple laser beams

    International Nuclear Information System (INIS)

    Le Cain, A.; Sajer, J. M.; Riazuelo, G.

    2011-01-01

    This paper investigates hot spot characteristics generated by the superposition of multiple laser beams. First, properties of speckle statistics are studied in the context of only one laser beam by computing the autocorrelation function. The case of multiple laser beams is then considered. In certain conditions, it is shown that speckles have an ellipsoidal shape. Analytical expressions of hot spot radii generated by multiple laser beams are derived and compared to numerical estimates made from the autocorrelation function. They are also compared to numerical simulations performed within the paraxial approximation. Excellent agreement is found for the speckle width as well as for the speckle length. Application to the speckle patterns generated in the Laser MegaJoule configuration in the zone where all the beams overlap is presented. Influence of polarization on the size of the speckles as well as on their abundance is studied.

  14. Image quality evaluation of full reference algorithm

    Science.gov (United States)

    He, Nannan; Xie, Kai; Li, Tong; Ye, Yushan

    2018-03-01

    Image quality evaluation is a classic research topic; the goal is to design algorithms whose evaluation values are consistent with subjective perception. This paper introduces several typical full-reference objective evaluation methods: Mean Squared Error (MSE), Peak Signal-to-Noise Ratio (PSNR), Structural Similarity (SSIM), and Feature Similarity (FSIM). The different evaluation methods are tested in Matlab, and their advantages and disadvantages are obtained by analysis and comparison. MSE and PSNR are simple, but they do not incorporate characteristics of the human visual system (HVS) into the evaluation, so their results are not ideal. SSIM correlates well with subjective scores and is simple to compute because it takes the human visual response into account, but it rests on assumptions that limit its results. The FSIM method can be applied to both grayscale and color images, and its results are better. Experimental results show that the image quality evaluation algorithm based on FSIM is the most accurate.
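The two simplest full-reference metrics, plus a single-window form of SSIM, can be written in a few lines. This global SSIM omits the sliding window of the standard method but uses the usual constants C1 = (0.01 L)² and C2 = (0.03 L)² for dynamic range L; it is a sketch, not any paper's reference implementation.

```python
import numpy as np

def mse(ref, test):
    """Mean squared error between a reference and a test image."""
    d = ref.astype(np.float64) - test.astype(np.float64)
    return np.mean(d ** 2)

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio in dB; infinite for identical images."""
    m = mse(ref, test)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)

def ssim_global(x, y, peak=255.0):
    """SSIM computed over the whole image as a single window."""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    c1, c2 = (0.01 * peak) ** 2, (0.03 * peak) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()   # covariance of the two images
    return ((2 * mx * my + c1) * (2 * cxy + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
```

The windowed SSIM of Wang et al. averages this quantity over local patches, which is what gives it its correlation with perceived quality.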

  15. Texture-based characterization of subskin features by specified laser speckle effects at λ = 650 nm region for more accurate parametric 'skin age' modelling.

    Science.gov (United States)

    Orun, A B; Seker, H; Uslan, V; Goodyer, E; Smith, G

    2017-06-01

    The textural structure of 'skin age'-related subskin components enables us to identify and analyse their unique characteristics, thus making substantial progress towards establishing an accurate skin age model. This is achieved by a two-stage process. First by the application of textural analysis using laser speckle imaging, which is sensitive to textural effects within the λ = 650 nm spectral band region. In the second stage, a Bayesian inference method is used to select attributes from which a predictive model is built. This technique enables us to contrast different skin age models, such as the laser speckle effect against the more widely used normal light (LED) imaging method, whereby it is shown that our laser speckle-based technique yields better results. The method introduced here is non-invasive, low cost and capable of operating in real time; having the potential to compete against high-cost instrumentation such as confocal microscopy or similar imaging devices used for skin age identification purposes. © 2016 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  16. Imaging for dismantlement verification: Information management and analysis algorithms

    International Nuclear Information System (INIS)

    Robinson, S.M.; Jarman, K.D.; Pitts, W.K.; Seifert, A.; Misner, A.C.; Woodring, M.L.; Myjak, M.J.

    2012-01-01

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute, which must be non-sensitive to be acceptable in an Information Barrier regime. However, this process must be performed with care. Features like the perimeter, area, and intensity of an object, for example, might reveal sensitive information. Any data-reduction technique must provide sufficient information to discriminate a real object from a spoofed or incorrect one, while avoiding disclosure (or storage) of any sensitive object qualities. Ultimately, algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We discuss the utility of imaging for arms control applications and present three image-based verification algorithms in this context. The algorithms reduce full image information to non-sensitive feature information, in a process that is intended to enable verification while eliminating the possibility of image reconstruction. The underlying images can be highly detailed, since they are dynamically generated behind an information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography. We study these algorithms in terms of technical performance in image analysis and application to an information barrier scheme.

  17. Novel prediction- and subblock-based algorithm for fractal image compression

    International Nuclear Information System (INIS)

    Chung, K.-L.; Hsu, C.-H.

    2006-01-01

    Fractal encoding is the most time-consuming part of fractal image compression. In this paper, a novel two-phase prediction- and subblock-based fractal encoding algorithm is presented. Initially the original gray image is partitioned into a set of variable-size blocks according to the S-tree- and interpolation-based decomposition principle. In the first phase, each current variable-size range block tries to find the best-matched domain block using the proposed prediction-based search strategy, which exploits the relevant neighboring variable-size domain blocks; this phase yields a significant computation-saving effect. If the domain block found within the predicted search space is unacceptable, in the second phase a subblock strategy is employed to partition the current variable-size range block into smaller blocks to improve the image quality. Experimental results show that our proposed prediction- and subblock-based fractal encoding algorithm outperforms the conventional full search algorithm and the recently published spatial-correlation-based algorithm by Truong et al. in terms of encoding time and image quality. In addition, the performance comparison among our proposed algorithm and two other algorithms, the no-search-based algorithm and the quadtree-based algorithm, is also investigated.

  18. A segmentation algorithm based on image projection for complex text layout

    Science.gov (United States)

    Zhu, Wangsheng; Chen, Qin; Wei, Chuanyi; Li, Ziyang

    2017-10-01

    Segmentation is an important part of layout analysis. Considering the efficiency advantage of the top-down approach and the particularity of the object, a projection-based layout segmentation algorithm is presented. The algorithm first partitions the text image into several columns, then scans the projection of each column, dividing the text image into several sub-regions through multiple projections. The experimental results show that this method inherits the rapid calculation speed of projection itself, avoids the effect of arc image information on page segmentation, and can accurately segment text images with complex layouts.
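A minimal sketch of such projection-based splitting: project the binary image onto one axis and cut at blank runs, applying the same routine first to columns and then to the rows within each column slice. The function name and the `min_gap` parameter are illustrative, not the paper's.

```python
import numpy as np

def split_by_projection(binary, axis, min_gap=2):
    """Split a binary text image along one axis at runs of empty projection.

    binary  : 2-D array, nonzero where ink is present
    axis    : 1 -> column ranges (vertical cuts), 0 -> row ranges
    min_gap : a blank run must be at least this long to act as a separator
    Returns a list of (start, stop) half-open index ranges holding content.
    """
    profile = binary.sum(axis=1 - axis)   # project onto the chosen axis
    filled = profile > 0
    segments, start, last = [], None, None
    for i, f in enumerate(filled):
        if f:
            if start is None:
                start = i
            last = i
        elif start is not None and i - last >= min_gap:
            segments.append((start, last + 1))
            start = None
    if start is not None:
        segments.append((start, last + 1))
    return segments
```

Top-down use: call with `axis=1` to find columns, then slice each column and call with `axis=0` to find the text blocks inside it.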

  19. FACT. New image parameters based on the watershed-algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Linhoff, Lena; Bruegge, Kai Arno; Buss, Jens [TU Dortmund (Germany). Experimentelle Physik 5b; Collaboration: FACT-Collaboration

    2016-07-01

    FACT, the First G-APD Cherenkov Telescope, is the first imaging atmospheric Cherenkov telescope to use Geiger-mode avalanche photodiodes (G-APDs) as photo sensors. The raw data produced by this telescope are processed in an analysis chain which leads to a classification of the primary particle that induces a shower and to an estimation of its energy. One important step in this analysis chain is the parameter extraction from shower images. By applying a watershed algorithm to the camera image, new parameters are computed. Perceiving the brightness of a pixel as height, a set of pixels can be seen as a 'landscape' with hills and valleys. A watershed algorithm groups all pixels belonging to the same hill into a cluster. From the resulting segmented image, one can derive new parameters for later analysis steps, e.g. the number of clusters, their shape, and the photon charge they contain. For FACT data, the FellWalker algorithm was chosen from the class of watershed algorithms because it was designed to work on discrete distributions, in this case the pixels of a camera image. The FellWalker algorithm is implemented in FACT-tools, which provides the low-level analysis framework for FACT. This talk will focus on the computation of new, FellWalker-based image parameters, which can be used for gamma-hadron separation. Additionally, their distributions for real and Monte Carlo data are compared.

  20. Multi-viewpoint Image Array Virtual Viewpoint Rapid Generation Algorithm Based on Image Layering

    Science.gov (United States)

    Jiang, Lu; Piao, Yan

    2018-04-01

    The use of a multi-view image array combined with virtual viewpoint generation technology to record 3D scene information in large scenes has become one of the key technologies for the development of integral imaging. This paper presents a virtual viewpoint rendering method based on an image layering algorithm. Firstly, the depth information of the reference viewpoint image is quickly obtained, with the sum of absolute differences (SAD) chosen as the similarity measure function. Then the reference image is layered and the parallax is calculated from the depth information. Based on the relative distance between the virtual viewpoint and the reference viewpoint, the image layers are weighted and panned. Finally, the virtual viewpoint image is rendered layer by layer according to the distance between the image layers and the viewer. This method avoids the disadvantages of the DIBR algorithm, such as its high-precision requirements on the depth map and its complex mapping operations. Experiments show that this algorithm can synthesize virtual viewpoints at any position within a 2×2 viewpoint range, and the rendering speed is also very impressive. On average the results show satisfactory image quality: the SSIM value relative to real viewpoint images reaches 0.9525, the PSNR reaches 38.353, and the image histogram similarity reaches 93.77%.
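Block matching with SAD as the similarity measure, as used for the depth step in this record, can be sketched as an exhaustive one-dimensional search. This is a generic illustration, not the paper's implementation; the window size and search range are arbitrary.

```python
import numpy as np

def sad(block_a, block_b):
    """Sum of absolute differences: the matching cost between two blocks."""
    return np.abs(block_a.astype(np.int32) - block_b.astype(np.int32)).sum()

def best_disparity(left, right, row, col, block=3, max_disp=5):
    """Find the horizontal shift of the block at (row, col) in `left`
    that best matches a block in `right`, by exhaustive SAD search."""
    h = block // 2
    ref = left[row - h:row + h + 1, col - h:col + h + 1]
    best, best_d = None, 0
    for d in range(max_disp + 1):
        c = col - d
        if c - h < 0:                 # candidate window leaves the image
            break
        cand = right[row - h:row + h + 1, c - h:c + h + 1]
        cost = sad(ref, cand)
        if best is None or cost < best:
            best, best_d = cost, d
    return best_d
```

Repeating this per pixel yields the disparity (and hence depth) map that the layering step then quantizes into image layers.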

  1. Compound speckles and their statistical and dynamical properties

    DEFF Research Database (Denmark)

    Hanson, Steen Grüner; Jakobsen, Michael Linde; Skov Hansen, Rene

    2008-01-01

    Two issues will be treated in this presentation, both focusing on gaining a deeper understanding of dynamic speckles, aiming at the use for probing dynamical properties of scattering structures. The first issue to be addressed is the dynamics of speckles arising from illuminating a solid surface...

  2. Improving performance of wavelet-based image denoising algorithm using complex diffusion process

    DEFF Research Database (Denmark)

    Nadernejad, Ehsan; Sharifzadeh, Sara; Korhonen, Jari

    2012-01-01

    Image enhancement and de-noising is an essential pre-processing step in many image processing algorithms. In any image de-noising algorithm, the main concern is to keep the interesting structures of the image. Such interesting structures often correspond to discontinuities (edges). The proposed algorithm has been evaluated using a variety of standard images, and its performance has been compared against several de-noising algorithms known from the prior art. Experimental results show that the proposed algorithm preserves the edges better and, in most cases, improves the measured visual quality of the denoised images in comparison to the existing methods known from the literature. The improvement is obtained without excessive computational cost, and the algorithm works well on a wide range of different types of noise.

  3. A high-performance spatial database based approach for pathology imaging algorithm evaluation.

    Science.gov (United States)

    Wang, Fusheng; Kong, Jun; Gao, Jingjing; Cooper, Lee A D; Kurc, Tahsin; Zhou, Zhengwen; Adler, David; Vergara-Niedermayr, Cristobal; Katigbak, Bryan; Brat, Daniel J; Saltz, Joel H

    2013-01-01

    Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. (1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) Create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) Develop a set of queries to support data sampling and result comparisons; (4) Achieve high performance computation capacity via a parallel data management infrastructure, with parallel data loading and spatial indexing optimizations. We have considered two scenarios for algorithm evaluation: (1) algorithm comparison, where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation, where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. The validated data were formatted based on the PAIS data model and

  4. A Rotor Tip Vortex Tracing Algorithm for Image Post-Processing

    Science.gov (United States)

    Overmeyer, Austin D.

    2015-01-01

    A neurite tracing algorithm, originally developed for medical image processing, was used to trace the location of the rotor tip vortex in density gradient flow visualization images. The tracing algorithm was applied to several representative test images to form case studies. The accuracy of the tracing algorithm was compared to two current methods including a manual point and click method and a cross-correlation template method. It is shown that the neurite tracing algorithm can reduce the post-processing time to trace the vortex by a factor of 10 to 15 without compromising the accuracy of the tip vortex location compared to other methods presented in literature.

  5. A locally adaptive algorithm for shadow correction in color images

    Science.gov (United States)

    Karnaukhov, Victor; Kober, Vitaly

    2017-09-01

    The paper deals with correction of color images distorted by spatially nonuniform illumination. A serious distortion occurs in real conditions when a part of the scene containing 3D objects close to a directed light source is illuminated much brighter than the rest of the scene. A locally-adaptive algorithm for correction of shadow regions in color images is proposed. The algorithm consists of segmentation of shadow areas with rank-order statistics followed by correction of nonuniform illumination with human visual perception approach. The performance of the proposed algorithm is compared to that of common algorithms for correction of color images containing shadow regions.

  6. Changes in speckle patterns induced by load application onto an optical fiber and its possible application for sensing purpose

    Science.gov (United States)

    Hasegawa, Makoto; Okumura, Jyun-ya; Hyuga, Akio

    2015-08-01

    Speckle patterns observed in the output light spot of an optical fiber are known to change due to external disturbances applied to the fiber. To investigate the possibility of utilizing such changes for sensing applications, a load was applied to a jacket-covered, communication-grade multi-mode glass optical fiber carrying laser light from a laser diode, and the resulting changes in the speckle patterns of the output light spot were recorded both as image data via a CCD camera and as the output voltage of a photovoltaic panel irradiated with the output light spot. The load was applied via a mechanism in which several ridges were provided on opposing flat plates and a number of weights were placed on top, so that corrugated bending of the optical fiber was intentionally induced. The results showed that both the number of speckles in the observed output pattern and the output voltage of the photovoltaic panel decreased upon load application, with relatively good repeatability. When the load was reduced, i.e., the weights were removed, the number of speckles recovered. These results indicate a certain possibility of utilizing changes in speckle patterns for sensing load application onto an optical fiber.

  7. Optimisation of centroiding algorithms for photon event counting imaging

    International Nuclear Information System (INIS)

    Suhling, K.; Airey, R.W.; Morgan, B.L.

    1999-01-01

    Approaches to photon event counting imaging in which the output events of an image intensifier are located using a centroiding technique have long been plagued by fixed pattern noise, in which a grid of dimensions similar to those of the CCD pixels is superimposed on the image. This is caused by a mismatch between the photon event shape and the centroiding algorithm. We have used hyperbolic cosine, Gaussian, Lorentzian, and parabolic algorithms, as well as 3-, 5-, and 7-point centre-of-gravity algorithms, and hybrids thereof, to assess means of minimising this fixed pattern noise. We show that fixed pattern noise generated by the widely used centre-of-gravity centroiding is due to intrinsic features of the algorithm. Our results confirm that the recently proposed use of Gaussian centroiding does indeed show a significant reduction of fixed pattern noise compared to centre-of-gravity centroiding (Michel et al., Mon. Not. R. Astron. Soc. 292 (1997) 611-620). However, the disadvantage of a Gaussian algorithm is a centroiding failure for small pulses, caused by a division by zero, which leads to a loss of detective quantum efficiency (DQE) and to small amounts of residual fixed pattern noise. Using both real data from an image intensifier system employing a progressive scan camera, framegrabber and PC, and also synthetic data from Monte Carlo simulations, we find that hybrid centroiding algorithms can reduce the fixed pattern noise without loss of resolution or loss of DQE. Imaging a test pattern to assess the features of the different algorithms shows that a hybrid of Gaussian and 3-point centre-of-gravity centroiding algorithms results in an optimum combination of low fixed pattern noise (lower than a simple Gaussian), high DQE, and high resolution. The Lorentzian algorithm gives the worst results in terms of high fixed pattern noise and low resolution, and the Gaussian and hyperbolic cosine algorithms have the lowest DQEs.
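The trade-off described above can be sketched numerically. The snippet below compares 3-point centre-of-gravity and Gaussian centroiding on a synthetic 1-D photon event profile; the pulse shape, width, and sub-pixel offset are assumptions for illustration, not the paper's data:

```python
import numpy as np

def cog3(window):
    """3-point centre-of-gravity centroid: offset (in pixels) from the
    central sample of a 3-sample window around the event peak."""
    l, c, r = window
    return (r - l) / (l + c + r)

def gaussian_centroid(window):
    """Gaussian centroid: a parabola fitted to the log-intensities.
    Fails with a division by zero when the pulse is flat or tiny."""
    l, c, r = np.log(np.asarray(window, dtype=float))
    denom = 2.0 * (2.0 * c - l - r)
    if denom == 0.0:
        raise ZeroDivisionError("flat pulse: Gaussian centroiding fails")
    return (r - l) / denom

# Synthetic event: Gaussian pulse whose true centre is 0.2 px right of pixel 1
x = np.arange(3)
event = np.exp(-0.5 * ((x - 1.2) / 0.7) ** 2)

print(round(gaussian_centroid(event), 3))  # -> 0.2  (exact for a Gaussian pulse)
print(round(cog3(event), 3))               # -> 0.17 (biased toward the pixel centre)
```

The centre-of-gravity bias toward the pixel centre is exactly the systematic error that accumulates into fixed pattern noise at the CCD grid period, while the Gaussian estimator is unbiased for Gaussian-shaped events but undefined for degenerate ones.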

  8. Design of an Image Motion Compensation (IMC) Algorithm for Image Registration of the Communication, Ocean, and Meteorological Satellite (COMS-1)

    Directory of Open Access Journals (Sweden)

    Taek Seo Jung

    2006-03-01

    Full Text Available This paper presents an Image Motion Compensation (IMC) algorithm for Korea's Communication, Ocean, and Meteorological Satellite (COMS-1). An IMC algorithm is a primary component of image registration in the Image Navigation and Registration (INR) system used to locate and register radiometric image data. Due to various perturbations, a satellite has orbit and attitude errors with respect to a reference motion. These errors cause depointing of the imager's aiming direction and, in consequence, image distortions. To correct the depointing of the imager's aiming direction, a compensation algorithm is designed by adapting equations different from those used for the GOES satellites. The capability of the algorithm is compared with that of the existing algorithm applied to the GOES INR system. The algorithm developed in this paper improves pointing accuracy by 40% and efficiently compensates for the depointing of the imager's aiming direction.

  9. Polarization-multiplexing ghost imaging

    Science.gov (United States)

    Dongfeng, Shi; Jiamin, Zhang; Jian, Huang; Yingjian, Wang; Kee, Yuan; Kaifa, Cao; Chenbo, Xie; Dong, Liu; Wenyue, Zhu

    2018-03-01

    A novel technique for polarization-multiplexing ghost imaging is proposed to simultaneously obtain multiple polarimetric information by a single detector. Here, polarization-division multiplexing speckles are employed for object illumination. The light reflected from the objects is detected by a single-pixel detector. An iterative reconstruction method is used to restore the fused image containing the different polarimetric information by using the weighted sum of the multiplexed speckles based on the correlation coefficients obtained from the detected intensities. Next, clear images of the different polarimetric information are recovered by demultiplexing the fused image. The results clearly demonstrate that the proposed method is effective.
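The correlation step at the heart of single-pixel ghost imaging can be sketched as below. The object, pattern count, and image size are invented for illustration, and the paper's iterative, weighted multiplexed reconstruction is replaced here by the basic covariance estimate for a single (unpolarized) channel:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 16x16 object: a bright square on a dark background
obj = np.zeros((16, 16))
obj[5:11, 5:11] = 1.0

M = 5000                                     # number of speckle illuminations
speckles = rng.random((M, 16, 16))           # random illumination patterns
bucket = (speckles * obj).sum(axis=(1, 2))   # single-pixel detector signals

# Correlation reconstruction: G = <I*S> - <I><S>  (per-pixel covariance)
recon = (bucket[:, None, None] * speckles).mean(axis=0) \
        - bucket.mean() * speckles.mean(axis=0)
```

Pixels inside the object reconstruct with a mean value near the per-pixel speckle variance (1/12 for uniform patterns), while background pixels average near zero, so the square emerges from purely single-pixel measurements.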

  10. Fractional order integration and fuzzy logic based filter for denoising of echocardiographic image.

    Science.gov (United States)

    Saadia, Ayesha; Rashdi, Adnan

    2016-12-01

    Ultrasound is widely used for imaging due to its cost effectiveness and safety. However, ultrasound images are inherently corrupted with speckle noise, which severely affects their quality and creates difficulty for physicians in diagnosis. To get maximum benefit from ultrasound imaging, image denoising is an essential requirement. To perform image denoising, a two-stage methodology using a fuzzy weighted mean and a fractional integration filter has been proposed in this research work. In stage 1, image pixels are processed by applying a 3 × 3 window around each pixel, and fuzzy logic is used to assign weights to the pixels in each window, replacing the central pixel of the window with the weighted mean of all neighboring pixels in the same window. Noise suppression is achieved by assigning weights to the pixels while preserving edges and other important features of the image. In stage 2, the resultant image is further improved by a fractional-order integration filter. The effectiveness of the proposed methodology has been analyzed on standard test images artificially corrupted with speckle noise and on real ultrasound B-mode images. Results of the proposed technique have been compared with different state-of-the-art techniques, including Lsmv, Wiener, Geometric filter, Bilateral, Non-local means, Wavelet, Perona et al., Total Variation (TV), Global Adaptive Fractional Integral Algorithm (GAFIA), and Improved Fractional Order Differential (IFD) model. Comparison has been done on a quantitative and qualitative basis. For quantitative analysis, metrics like Peak Signal to Noise Ratio (PSNR), Speckle Suppression Index (SSI), Structural Similarity (SSIM), Edge Preservation Index (β), and Correlation Coefficient (ρ) have been used. Simulations have been done using Matlab. Simulation results on artificially corrupted standard test images and two real echocardiographic images reveal that the proposed method outperforms existing image denoising techniques.
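Stage 1 of the pipeline above can be sketched in a few lines. The abstract does not specify the membership function, so a Gaussian membership around the window median is assumed here purely for illustration:

```python
import numpy as np

def fuzzy_weighted_mean(img, spread=0.1):
    """Stage-1 sketch: replace each pixel with a weighted mean of its 3x3
    neighbourhood. Weights come from a (hypothetical) Gaussian membership
    of each neighbour's similarity to the window median, so outliers --
    likely speckle -- get small weights while edges are preserved."""
    h, w = img.shape
    out = img.copy()
    pad = np.pad(img, 1, mode='edge')
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 3, j:j + 3]
            med = np.median(win)
            weights = np.exp(-((win - med) / spread) ** 2)  # fuzzy membership
            out[i, j] = (weights * win).sum() / weights.sum()
    return out

# Multiplicative, speckle-like noise on a flat test patch
rng = np.random.default_rng(1)
clean = np.full((32, 32), 0.5)
noisy = clean * (1 + 0.2 * rng.standard_normal(clean.shape))
den = fuzzy_weighted_mean(noisy)   # noise variance drops, mean is preserved
```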

  11. Image-Data Compression Using Edge-Optimizing Algorithm for WFA Inference.

    Science.gov (United States)

    Culik, Karel II; Kari, Jarkko

    1994-01-01

    Presents an inference algorithm that produces a weighted finite automaton (WFA) representing, in particular, the grayness functions of graytone images. The new inference algorithm produces a WFA with a relatively small number of edges. Image-data compression results, alone and in combination with wavelets, are discussed.

  12. Digital Image Speckle Correlation for the Quantification of the Cosmetic Treatment with Botulinum Toxin Type A (BTX-A)

    Science.gov (United States)

    Bhatnagar, Divya; Conkling, Nicole; Rafailovich, Miriam; Dagum, Alexander

    2012-02-01

    The skin on the face is directly attached to the underlying muscles. Here, we successfully introduce a non-invasive, non-contact technique, Digital Image Speckle Correlation (DISC), to measure the precise magnitude and duration of facial muscle paralysis produced by BTX-A. Subjective evaluation by clinicians and patients fails to objectively quantify the direct effect and duration of BTX-A on the facial musculature. By using DISC, we can (a) directly measure the deformation field of the facial skin and determine the locus of facial muscular tension; (b) quantify and monitor muscular paralysis and subsequent re-innervation following injection; and (c) continuously correlate the appearance of wrinkles with muscular tension. Two sequential photographs of slight facial motion (frowning, raising eyebrows) are taken, and DISC processes the images to produce a vector map of muscular displacement, from which spatially resolved information about facial tension is obtained. DISC can track the ability of different muscle groups to contract and can be used to predict the site of injection, quantify muscle paralysis, and measure the rate of recovery following BOTOX injection.
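The core operation behind DISC-style analysis, recovering how a speckle-like skin texture moved between two photographs, can be sketched with whole-pixel phase correlation. Real DISC implementations track a dense subset-by-subset displacement field with sub-pixel precision; this single global shift is a deliberate simplification:

```python
import numpy as np

def phase_correlate(a, b):
    """Recover the integer (dy, dx) shift of image b relative to a via
    phase correlation -- a minimal sketch of the correlation step used
    by speckle-correlation displacement tracking."""
    F = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    dy, dx = np.unravel_index(corr.argmax(), corr.shape)
    # Map peaks in the upper half of the range to negative shifts
    if dy > a.shape[0] // 2: dy -= a.shape[0]
    if dx > a.shape[1] // 2: dx -= a.shape[1]
    return int(dy), int(dx)

rng = np.random.default_rng(2)
before = rng.random((64, 64))                   # speckle-like texture
after = np.roll(before, (3, -2), axis=(0, 1))   # "skin" moved 3 px down, 2 px left
print(phase_correlate(before, after))           # -> (3, -2)
```

Applying this per subwindow over a grid of patches yields the vector map of displacements that the method above uses to localize muscular tension.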

  13. A Novel Plant Root Foraging Algorithm for Image Segmentation Problems

    Directory of Open Access Journals (Sweden)

    Lianbo Ma

    2014-01-01

    Full Text Available This paper presents a new type of biologically inspired global optimization methodology for image segmentation based on plant root foraging behavior, namely, the artificial root foraging algorithm (ARFO). The essential motive of ARFO is to imitate the significant characteristics of plant root foraging behavior, including branching, regrowing, and tropisms, to construct a heuristic algorithm for multidimensional and multimodal problems. A mathematical model is first designed to abstract various plant root foraging patterns. Then, the basic process of the ARFO algorithm derived from the model is described in detail. When tested on ten benchmark functions, ARFO shows superiority over other state-of-the-art algorithms on several of them. Further, we employ the ARFO algorithm to deal with the multilevel threshold image segmentation problem. Experimental results on a variety of images demonstrate the suitability of the proposed method for solving such problems.

  14. Quantification of global myocardial function by cine MRI deformable registration-based analysis: Comparison with MR feature tracking and speckle-tracking echocardiography.

    Science.gov (United States)

    Lamacie, Mariana M; Thavendiranathan, Paaladinesh; Hanneman, Kate; Greiser, Andreas; Jolly, Marie-Pierre; Ward, Richard; Wintersperger, Bernd J

    2017-04-01

    To evaluate deformable registration algorithms (DRA)-based quantification of cine steady-state free-precession (SSFP) for myocardial strain assessment in comparison with feature-tracking (FT) and speckle-tracking echocardiography (STE). Data sets of 28 patients/10 volunteers, undergoing same-day 1.5T cardiac MRI and echocardiography were included. LV global longitudinal (GLS), circumferential (GCS) and radial (GRS) peak systolic strain were assessed on cine SSFP data using commercially available FT algorithms and prototype DRA-based algorithms. STE was applied as standard of reference for accuracy, precision and intra-/interobserver reproducibility testing. DRA showed narrower limits of agreement compared to STE for GLS (-4.0 [-0.9,-7.9]) and GCS (-5.1 [1.1,-11.2]) than FT (3.2 [11.2,-4.9]; 3.8 [13.9,-6.3], respectively). While both DRA and FT demonstrated significant differences to STE for GLS and GCS (all pcine MRI. • Inverse DRA demonstrated superior reproducibility compared to feature-tracking (FT) methods. • Cine MR DRA and FT analysis demonstrate differences to speckle-tracking echocardiography • DRA demonstrated better correlation with STE than FT for MR-derived global strain data.

  15. Decoding using back-project algorithm from coded image in ICF

    International Nuclear Information System (INIS)

    Jiang shaoen; Liu Zhongli; Zheng Zhijian; Tang Daoyuan

    1999-01-01

    The principle of coded imaging and its decoding in inertial confinement fusion (ICF) is briefly described. The authors take the ring-aperture microscope as an example and use the back-projection (BP) algorithm to decode the coded image. The decoding program has been implemented for numerical simulation. Simulations of two models show that the accuracy of the BP algorithm is high and the reconstruction quality is good, indicating that the BP algorithm is applicable to decoding coded images in ICF experiments.

  16. Automated Photogrammetric Image Matching with Sift Algorithm and Delaunay Triangulation

    DEFF Research Database (Denmark)

    Karagiannis, Georgios; Antón Castro, Francesc/François; Mioc, Darka

    2016-01-01

    An algorithm for image matching of multi-sensor and multi-temporal satellite images is developed. The method is based on the SIFT feature detector proposed by Lowe in (Lowe, 1999). First, SIFT feature points are detected independently in two images (reference and sensed image). The features detec...... of each feature set for each image are computed. The isomorphism of the Delaunay triangulations is determined to guarantee the quality of the image matching. The algorithm is implemented in Matlab and tested on World-View 2, SPOT6 and TerraSAR-X image patches....

  17. An accelerated threshold-based back-projection algorithm for Compton camera image reconstruction

    International Nuclear Information System (INIS)

    Mundy, Daniel W.; Herman, Michael G.

    2011-01-01

    Purpose: Compton camera imaging (CCI) systems are currently under investigation for radiotherapy dose reconstruction and verification. The ability of such a system to provide real-time images during dose delivery will be limited by the computational speed of the image reconstruction algorithm. In this work, the authors present a fast and simple method by which to generate an initial back-projected image from acquired CCI data, suitable for use in a filtered back-projection algorithm or as a starting point for iterative reconstruction algorithms, and compare its performance to the current state of the art. Methods: Each detector event in a CCI system describes a conical surface that includes the true point of origin of the detected photon. Numerical image reconstruction algorithms require, as a first step, the back-projection of each of these conical surfaces into an image space. The algorithm presented here first generates a solution matrix for each slice of the image space by solving the intersection of the conical surface with the image plane. Each element of the solution matrix is proportional to the distance of the corresponding voxel from the true intersection curve. A threshold function was developed to extract those pixels sufficiently close to the true intersection to generate a binary intersection curve. This process is repeated for each image plane for each CCI detector event, resulting in a three-dimensional back-projection image. The performance of this algorithm was tested against a marching algorithm known for speed and accuracy. Results: The threshold-based algorithm was found to be approximately four times faster than the current state of the art with minimal deficit to image quality, arising from the fact that a generically applicable threshold function cannot provide perfect results in all situations. 
The algorithm fails to extract a complete intersection curve in image slices near the detector surface for detector event cones having axes nearly

  18. High-speed cell recognition algorithm for ultrafast flow cytometer imaging system

    Science.gov (United States)

    Zhao, Wanyue; Wang, Chao; Chen, Hongwei; Chen, Minghua; Yang, Sigang

    2018-04-01

    An optical time-stretch flow imaging system enables high-throughput examination of cells/particles with unprecedented high speed and resolution. A significant amount of raw image data is produced. A high-speed cell recognition algorithm is, therefore, highly demanded to analyze large amounts of data efficiently. A high-speed cell recognition algorithm consisting of two-stage cascaded detection and Gaussian mixture model (GMM) classification is proposed. The first stage of detection extracts cell regions. The second stage integrates distance transform and the watershed algorithm to separate clustered cells. Finally, the cells detected are classified by GMM. We compared the performance of our algorithm with support vector machine. Results show that our algorithm increases the running speed by over 150% without sacrificing the recognition accuracy. This algorithm provides a promising solution for high-throughput and automated cell imaging and classification in the ultrafast flow cytometer imaging platform.
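The clustered-cell separation step of the second detection stage (a distance transform feeding a watershed) can be approximated in a few lines. Here the watershed is simplified to thresholding the distance-transform peaks, and two overlapping disks are synthetic stand-ins for touching cells:

```python
import numpy as np
from scipy import ndimage

# Two overlapping "cells" (disks) that plain labelling sees as one object
yy, xx = np.mgrid[0:64, 0:64]
blob = (((yy - 32) ** 2 + (xx - 24) ** 2 < 100) |
        ((yy - 32) ** 2 + (xx - 42) ** 2 < 100))

n_naive = ndimage.label(blob)[1]          # -> 1: clustered cells merge

# Distance transform: the interior of each cell becomes a separate peak
dist = ndimage.distance_transform_edt(blob)
seeds = dist > 0.6 * dist.max()           # keep only the near-peak cores
n_sep = ndimage.label(seeds)[1]           # -> 2: the cells separate
print(n_naive, n_sep)
```

In a full pipeline the labelled seed cores would serve as watershed markers so that every pixel of the blob is assigned to one cell; the thresholded cores here are enough to show why the distance transform resolves clusters that connected-component labelling cannot.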

  19. An AK-LDMeans algorithm based on image clustering

    Science.gov (United States)

    Chen, Huimin; Li, Xingwei; Zhang, Yongbin; Chen, Nan

    2018-03-01

    Clustering is an effective analytical technique for handling unlabeled data for value mining. Its ultimate goal is to mark unclassified data quickly and correctly. We use the roadmap of current image processing as the experimental background. In this paper, we propose an AK-LDMeans algorithm that automatically locks the K value by designing a K-cost polyline and then uses a long-distance, high-density method to select the clustering centers, replacing the traditional initial cluster-center selection method and thereby improving the efficiency and accuracy of the traditional K-Means algorithm. The experimental results are compared with those of current clustering algorithms. The algorithm can provide an effective reference in the fields of image processing, machine vision, and data mining.

  20. Performance evaluation of the EM algorithm applied to radiographic images

    International Nuclear Information System (INIS)

    Brailean, J.C.; Giger, M.L.; Chen, C.T.; Sullivan, B.J.

    1990-01-01

    In this paper the authors evaluate the expectation maximization (EM) algorithm, both qualitatively and quantitatively, as a technique for enhancing radiographic images. Previous studies have qualitatively shown the usefulness of the EM algorithm but have failed to quantify and compare its performance with those of other image processing techniques. Recent studies by Loo et al, Ishida et al, and Giger et al, have explained improvements in image quality quantitatively in terms of a signal-to-noise ratio (SNR) derived from signal detection theory. In this study, we take a similar approach in quantifying the effect of the EM algorithm on detection of simulated low-contrast square objects superimposed on radiographic mottle. The SNRs of the original and processed images are calculated taking into account both the human visual system response and the screen-film transfer function as well as a noise component internal to the eye-brain system. The EM algorithm was also implemented on digital screen-film images of test patterns and clinical mammograms

  1. Research on Adaptive Optics Image Restoration Algorithm by Improved Expectation Maximization Method

    Directory of Open Access Journals (Sweden)

    Lijuan Zhang

    2014-01-01

    Full Text Available To improve the restoration of adaptive optics (AO) images, we put forward a deconvolution algorithm, improved by the EM algorithm, that jointly processes multiframe adaptive optics images based on expectation-maximization theory. First, a mathematical model is built for the degraded multiframe AO images: the function model for the time-varying point spread is deduced based on phase error, and the AO images are denoised using the image power spectral density and a support constraint. Second, the EM algorithm is improved by combining the AO imaging-system parameters with a regularization technique; a cost function for the joint deconvolution of multiframe AO images is given, and the optimization model for its parameter estimation is built. Finally, image-restoration experiments on both simulated images and real AO images are performed to verify the recovery effect of our algorithm. The experimental results show that, compared with the Wiener-IBD and RL-IBD algorithms, our algorithm reduces iterations by 14.3% and improves the estimation accuracy. The model distinguishes the PSF of the AO images and recovers the observed target images clearly.
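The multiplicative EM fixed point that such restoration schemes build on is the Richardson-Lucy update. Below is a single-frame sketch with a made-up boxcar PSF; the paper's joint multiframe cost function, PSF estimation, and regularization are omitted:

```python
import numpy as np
from scipy.ndimage import convolve

def richardson_lucy(blurred, psf, iters=50):
    """EM (Richardson-Lucy) deconvolution: at each step, re-blur the
    estimate, compare with the data, and apply the correction ratio
    back-projected through the flipped PSF."""
    psf_flip = psf[::-1, ::-1]
    est = np.full_like(blurred, blurred.mean())   # flat, positive start
    for _ in range(iters):
        ratio = blurred / (convolve(est, psf) + 1e-12)
        est *= convolve(ratio, psf_flip)
    return est

# Hypothetical point-like scene blurred by a 5x5 boxcar PSF
scene = np.zeros((32, 32)); scene[16, 16] = 1.0
psf = np.ones((5, 5)) / 25.0
blurred = convolve(scene, psf)
restored = richardson_lucy(blurred, psf)
# The update conserves total flux while re-concentrating it at the point
```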

  2. Applications of polarization speckle in skin cancer detection and monitoring

    Science.gov (United States)

    Lee, Tim K.; Tchvialeva, Lioudmila; Phillips, Jamie; Louie, Daniel C.; Zhao, Jianhua; Wang, Wei; Lui, Harvey; Kalia, Sunil

    2018-01-01

    Polarization speckle is a rapidly developing field. Unlike laser speckle, polarization speckle consists of stochastic interference patterns with spatially random polarizations, amplitudes, and phases. We have been working in this exciting research field, developing techniques to generate polarization patterns from skin. We hypothesize that polarization speckle patterns could be used in biomedical applications, especially for detecting and monitoring skin cancers, the most common neoplasms in white populations around the world. This paper describes our effort in developing two polarization speckle devices. One captures the Stokes parameters S0 and S1 simultaneously, and the other captures all four Stokes parameters S0, S1, S2, and S3 in one shot, within milliseconds; hence these two devices can be used in medical clinics to assess skin conditions in vivo. To validate our hypothesis, we conducted a series of three clinical studies. These are early pilot studies, and the results suggest that the devices have the potential to detect and monitor skin cancers.

  3. Images Encryption Method using Steganographic LSB Method, AES and RSA algorithm

    Science.gov (United States)

    Moumen, Abdelkader; Sissaoui, Hocine

    2017-03-01

    Vulnerability of communication of digital images is an extremely important issue nowadays, particularly when the images are communicated through insecure channels. To improve communication security, many cryptosystems have been presented in the image encryption literature. This paper proposes a novel image encryption technique based on an algorithm that is faster than current methods. The proposed algorithm eliminates the step in which the secret key is shared during the encryption process. It is formulated based on symmetric encryption, asymmetric encryption, and steganography theories. The image is encrypted using a symmetric algorithm; then, the secret key is encrypted by means of an asymmetric algorithm and hidden in the ciphered image using a least-significant-bit steganographic scheme. The analysis results show that, while enjoying faster computation, our method performs close to optimal in terms of accuracy.
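The overall flow — symmetric-encrypt the image, asymmetric-encrypt the session key, hide the encrypted key in the cipher image's least significant bits — can be sketched with toy stand-ins: an XOR keystream instead of AES and textbook RSA with tiny primes instead of real RSA. This is an illustration of the data flow only, never a secure implementation:

```python
import numpy as np

# Toy stand-ins (NOT secure): XOR keystream instead of AES, textbook RSA
def sym_encrypt(img, key):
    ks = np.random.default_rng(key).integers(0, 256, img.shape, dtype=np.uint8)
    return img ^ ks

sym_decrypt = sym_encrypt  # XOR with the same keystream is its own inverse

n, e, d = 3233, 17, 2753   # RSA toy: p=61, q=53, e*d = 1 mod phi(n)
rsa_enc = lambda m: pow(m, e, n)
rsa_dec = lambda c: pow(c, d, n)

# Sender: encrypt the image, then hide the RSA-encrypted session key
rng = np.random.default_rng(3)
img = rng.integers(0, 256, (16, 16), dtype=np.uint8)
key = 1234                                  # session key (must be < n)
cipher = sym_encrypt(img, key)
payload = [(rsa_enc(key) >> i) & 1 for i in range(12)]   # n < 2**12
stego = cipher.ravel().copy()
stego[:12] = (stego[:12] & 0xFE) | np.array(payload, dtype=np.uint8)
stego = stego.reshape(cipher.shape)

# Receiver: read the LSBs, recover the key with RSA, then decrypt
bits = stego.ravel()[:12] & 1
rec_key = rsa_dec(int(sum(int(b) << i for i, b in enumerate(bits))))
recovered = sym_decrypt(stego, rec_key)     # differs from img only in the
                                            # 12 LSBs that carried the key
```

Note the inherent trade-off the sketch exposes: embedding the key perturbs the LSBs of the pixels that carry it, so those bits of the decrypted image are sacrificed to the side channel.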

  4. Neural Network Blind Equalization Algorithm Applied in Medical CT Image Restoration

    Directory of Open Access Journals (Sweden)

    Yunshan Sun

    2013-01-01

    Full Text Available A new algorithm for iterative blind image restoration is presented in this paper. The method extends blind equalization from the signal case to images. A neural-network blind equalization algorithm is derived and used in conjunction with zigzag coding to restore the original image; as a result, the effect of the PSF can be removed by the proposed algorithm, which helps eliminate intersymbol interference (ISI). To obtain an estimate of the original image, the method optimizes a constant-modulus blind-equalization cost function applied to the grayscale CT image using the conjugate gradient method. Analysis of the convergence performance of the algorithm verifies the feasibility of this method theoretically; meanwhile, simulation results and evaluations with recent image-quality metrics are provided to assess the effectiveness of the proposed method.

  5. A high-performance spatial database based approach for pathology imaging algorithm evaluation

    Directory of Open Access Journals (Sweden)

    Fusheng Wang

    2013-01-01

    Full Text Available Background: Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. Context: The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. Aims: (1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) develop a set of queries to support data sampling and result comparisons; (4) achieve high-performance computation capacity via a parallel data management infrastructure, with parallel data loading and spatial indexing optimizations. Materials and Methods: We have considered two scenarios for algorithm evaluation: (1) algorithm comparison, where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation, where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. The

  6. Implementation of digital image encryption algorithm using logistic function and DNA encoding

    Science.gov (United States)

    Suryadi, MT; Satria, Yudi; Fauzi, Muhammad

    2018-03-01

    Cryptography is a method to secure information that might be in the form of a digital image. Based on past research, in order to increase the security level of chaos-based and DNA-based encryption algorithms, an encryption algorithm using a logistic function and DNA encoding was proposed. The digital image encryption algorithm uses DNA encoding to map the pixel values to DNA bases and scrambles them with DNA addition, DNA complement, and XOR operations. The logistic function in this algorithm serves as the random number generator needed in the DNA complement and XOR operations. The test results show that the PSNR values of cipher images are 7.98-7.99 bits, the entropy values are close to 8, the histograms of cipher images are uniformly distributed, and the correlation coefficients of cipher images are near 0. Thus, the cipher image can be decrypted perfectly, and the encryption algorithm has good resistance to entropy and statistical attacks.
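The logistic map's role as the scheme's random number generator can be sketched as follows. The DNA addition/complement steps are beyond a short example, so a plain XOR stands in for them here (an assumption for illustration, not the paper's full scheme):

```python
import numpy as np

def logistic_keystream(x0, r, n):
    """Byte stream from the logistic map x -> r*x*(1-x). The seed x0
    (0 < x0 < 1) and parameter r (close to 4, the chaotic regime)
    together act as the secret key."""
    out = np.empty(n, dtype=np.uint8)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) & 0xFF   # quantise the chaotic state to a byte
    return out

def xor_cipher(img, x0=0.3141, r=3.9999):
    ks = logistic_keystream(x0, r, img.size).reshape(img.shape)
    return img ^ ks

rng = np.random.default_rng(4)
img = rng.integers(0, 256, (32, 32), dtype=np.uint8)
cipher = xor_cipher(img)
plain = xor_cipher(cipher)   # the same keystream decrypts
```

The extreme sensitivity of the map to x0 and r is what makes the keystream key-dependent; the paper layers DNA-domain operations on top of this generator for the diffusion properties reported in the abstract.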

  7. Algorithms evaluation for fundus images enhancement

    International Nuclear Information System (INIS)

    Braem, V; Marcos, M; Bizai, G; Drozdowicz, B; Salvatelli, A

    2011-01-01

    Color images of the retina inherently involve noise and illumination artifacts. In order to improve the diagnostic quality of the images, it is desirable to homogenize the non-uniform illumination and increase contrast while preserving color characteristics. The visual results of different pre-processing techniques can be very dissimilar, and an objective assessment of the techniques is necessary in order to select the most suitable one. In this article, the performance of eight algorithms for non-uniform illumination correction, contrast modification, and color preservation was evaluated, and a general score was proposed to choose the most suitable. The results made a good impression on the experts, although some differences suggest that the image with the best statistical quality is not necessarily the one with the best diagnostic quality to the trained doctor's eye. This means that the best pre-processing algorithm for automatic classification may differ from the most suitable one for visual diagnosis. However, both should result in the same final diagnosis.

  8. Image Reconstruction Algorithm For Electrical Capacitance Tomography (ECT)

    International Nuclear Information System (INIS)

    Arko

    2001-01-01

    Most image reconstruction algorithms for electrical capacitance tomography (ECT) use sensitivity maps as weighting factors. The computation is fast, involving a simple multiply-and-accumulate (MAC) operation, but the resulting image suffers from blurring due to the soft-field effect of the sensor. This paper presents a low-cost iterative method employing proportional thresholding, which improves image quality dramatically. The strategy for implementation, the computational cost, and the achievable speed are examined for implementations on a personal computer (PC) and a Digital Signal Processor (DSP). For the PC implementation, the Watcom C++ 10.6 and Visual C++ 5.0 compilers were used. The experimental results are compared to images reconstructed by commercially available software. The new algorithm improves image quality significantly at the cost of a few iterations. This technique can be readily exploited for online applications.
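The contrast between the one-pass back-projection and the iterative refinement described above can be sketched as below. The sensitivity matrix here is random stand-in data, and the paper's proportional thresholding is approximated by clipping the image to [0, 1] inside a Landweber-style data-consistency loop:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical linearised ECT model: 66 electrode-pair measurements,
# a 32x32 permittivity image flattened to 1024 pixels.
n_meas, n_pix = 66, 1024
S = rng.random((n_meas, n_pix))
S /= S.sum(axis=1, keepdims=True)        # sensitivity maps as weights

true_perm = np.zeros(n_pix)
true_perm[200:280] = 1.0                 # a high-permittivity region
cap = S @ true_perm                      # simulated capacitance data

# One-pass linear back-projection: a single MAC sweep, heavily blurred
lbp = S.T @ cap
lbp = (lbp - lbp.min()) / (lbp.max() - lbp.min())

# Iterative refinement: correct against the data, then threshold
step = 1.0 / np.linalg.norm(S, ord=2) ** 2    # safe gradient step size
img = lbp.copy()
for _ in range(200):
    img += step * (S.T @ (cap - S @ img))     # data-consistency correction
    img = np.clip(img, 0.0, 1.0)              # stand-in for prop. thresholding
```

Each iteration costs only a few MAC sweeps, which is why such refinements remain feasible on a PC or DSP while substantially reducing the soft-field blur of the single-pass image.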

  9. 3-D Image Encryption Based on Rubik's Cube and RC6 Algorithm

    Science.gov (United States)

    Helmy, Mai; El-Rabaie, El-Sayed M.; Eldokany, Ibrahim M.; El-Samie, Fathi E. Abd

    2017-12-01

    A novel encryption algorithm based on the 3-D Rubik's cube is proposed in this paper to achieve 3D encryption of a group of images. The proposed algorithm begins with RC6 as a first step for encrypting multiple images separately. After that, the obtained encrypted images are further encrypted with the 3-D Rubik's cube, with the RC6-encrypted images used as the faces of the cube. From the perspective of image encryption, the RC6 algorithm adds a degree of diffusion, while the Rubik's cube algorithm adds a degree of permutation. The simulation results demonstrate that the proposed encryption algorithm is efficient and exhibits strong robustness and security. The encrypted images are further transmitted over a wireless Orthogonal Frequency Division Multiplexing (OFDM) system and decrypted at the receiver side. Evaluation of the quality of the decrypted images at the receiver side reveals good results.

  10. Determining the mechanical properties of rat skin with digital image speckle correlation.

    Science.gov (United States)

    Guan, E; Smilow, Sarah; Rafailovich, Miriam; Sokolov, Jonathan

    2004-01-01

    Accurate measurement of the mechanical properties of skin has numerous implications in surgical repair, dermal disorders, and the diagnosis and treatment of trauma to the skin. Investigation of facial wrinkle formation, as well as research in the areas of skin aging and cosmetic product assessment, can also benefit from alternative methodologies for the measurement of mechanical properties. A noncontact, noninvasive technique, digital image speckle correlation (DISC), has been successfully introduced to measure the deformation field of a skin sample loaded by a materials test machine. With the force information obtained from the loading device, the mechanical properties of the skin, such as Young's modulus, linear limit, and material strength, can be calculated using elastic or viscoelastic theory. The DISC method was used to measure the deformation of neonatal rat skin, with and without a glycerin-fruit-oil-based cream, under uniaxial tension. The deformation-to-failure behavior of newborn rat skin was recorded and analyzed; failures of single skin layers were observed and located by finding the strain concentrations. Young's moduli of freshly excised rat skin, cream-processed rat skin, and unprocessed rat skin 24 h after excision were found with tensile tests to be 1.6, 1.4, and 0.7 MPa, respectively. Our results show that DISC provides a novel technique for numerous applications in dermatology and reconstructive surgery. Copyright 2004 S. Karger AG, Basel

  11. Dynamical speckles in watery surfaces

    International Nuclear Information System (INIS)

    Llovera-Gonzalez, J.J.; Moreno-Yeras, A.; Garcia-Diaz, M.; Alvarez-Salgado, Y.

    2009-01-01

    Coverage of watery surfaces with monolayers of surfactant substances is of interest in diverse technological applications. The formation and study of molecular monolayers deposited on these surfaces require measurement techniques that allow the degree of local coverage to be evaluated without practically modifying the studied surface. In this paper, the preliminary results obtained by the authors applying the technique of dynamic speckle interferometry to watery surfaces are presented, together with its consideration as a possible resource for measuring the degree of local coverage of these surfaces, on the basis that the speckle pattern can reveal the dynamics of evaporation taking place in them. (Author)

  12. An Improved FCM Medical Image Segmentation Algorithm Based on MMTD

    Directory of Open Access Journals (Sweden)

    Ningning Zhou

    2014-01-01

    Full Text Available Image segmentation plays an important role in medical image processing. Fuzzy c-means (FCM) is one of the popular clustering algorithms for medical image segmentation. However, FCM is highly vulnerable to noise because it does not consider spatial information in image segmentation. This paper introduces the medium mathematics system, which is employed to process fuzzy information for image segmentation. It establishes a medium similarity measure based on the measure of medium truth degree (MMTD) and uses the correlation of a pixel and its neighbors to define the medium membership function. An improved FCM medical image segmentation algorithm based on MMTD, which takes spatial features into account, is proposed in this paper. The experimental results show that the proposed algorithm is more robust to noise than the standard FCM, with more certainty and less fuzziness, making it practical and effective for medical image segmentation.
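    For reference, the standard FCM baseline that the paper improves on alternates membership and center updates until convergence. A minimal sketch on scalar intensities (without the paper's MMTD spatial term):

```python
import numpy as np

def fcm(x, c, m=2.0, iters=100, seed=0):
    """Standard fuzzy c-means on a 1-D feature vector (e.g. pixel intensities).
    Returns cluster centers and the c-by-N membership matrix."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, x.size))
    u /= u.sum(axis=0)                        # memberships sum to 1 per pixel
    for _ in range(iters):
        um = u ** m                           # fuzzified memberships
        centers = (um @ x) / um.sum(axis=1)   # weighted cluster centers
        d = np.abs(x[None, :] - centers[:, None]) + 1e-9
        u = d ** (-2.0 / (m - 1.0))           # closer center -> higher membership
        u /= u.sum(axis=0)
    return centers, u
```

    The MMTD variant replaces the purely intensity-based membership with one that also weighs each pixel's neighborhood, which is what suppresses the noise sensitivity.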

  13. Quantization analysis of speckle intensity measurements for phase retrieval

    DEFF Research Database (Denmark)

    Maallo, Anne Margarette S.; Almoro, Percival F.; Hanson, Steen Grüner

    2010-01-01

    Speckle intensity measurements utilized for phase retrieval (PR) are sequentially taken with a digital camera, which introduces quantization error that diminishes the signal quality. Influences of quantization on the speckle intensity distribution and PR are investigated numerically and experimentally in the static wavefront sensing setup. Results show that 3 to 4 bits are adequate to represent the speckle intensities and yield acceptable reconstructions at relatively fast convergence rates. Computer memory requirements may be eased by a factor of 2.4 if a 4 bit instead of an 8 bit camera is used.

  14. The asymmetric facial skin perfusion distribution of Bell's palsy discovered by laser speckle imaging technology.

    Science.gov (United States)

    Cui, Han; Chen, Yi; Zhong, Weizheng; Yu, Haibo; Li, Zhifeng; He, Yuhai; Yu, Wenlong; Jin, Lei

    2016-01-01

    Bell's palsy is a peripheral neural disease that causes abrupt onset of unilateral facial weakness. Pathologic studies have evidenced ischemia of the facial nerve on the affected side of the face in Bell's palsy patients. Since the direction of facial nerve blood flow is primarily proximal to distal, facial skin microcirculation would also be affected after the onset of Bell's palsy. Therefore, monitoring the full area of facial skin microcirculation would help to identify the condition of Bell's palsy patients. In this study, a noninvasive, real-time and full-field imaging technology, laser speckle imaging (LSI), was applied to measure the facial skin blood perfusion distribution of Bell's palsy patients. 85 participants at different stages of Bell's palsy were included. Results showed that the patients' facial skin perfusion on the affected side was lower than that on the normal side in the eyelid region, and that the asymmetric distribution of facial skin perfusion between the two sides of the eyelid is positively related to the stage of the disease. We discovered that facial skin blood perfusion could reflect the stage of Bell's palsy, which suggests that microcirculation should be investigated in patients with this neurological deficit. LSI was also suggested as a potential diagnostic tool for Bell's palsy.

  15. A Novel Perceptual Hash Algorithm for Multispectral Image Authentication

    Directory of Open Access Journals (Sweden)

    Kaimeng Ding

    2018-01-01

    Full Text Available The perceptual hash algorithm is a technique for authenticating the integrity of images. While a few scholars have worked on mono-spectral image perceptual hashing, there is limited research on multispectral image perceptual hashing. In this paper, we propose a perceptual hash algorithm for the content authentication of a multispectral remote sensing image based on the synthetic characteristics of each band: firstly, the multispectral remote sensing image is preprocessed with band clustering and grid partition; secondly, the edge feature of the band subsets is extracted by band-fusion-based edge feature extraction; thirdly, the perceptual feature of the same region of the band subsets is compressed and normalized to generate the perceptual hash value. The authentication procedure is carried out via the normalized Hamming distance between the recomputed perceptual hash value and the original hash value. The experiments indicated that our proposed algorithm is robust to content-preserving operations and efficiently authenticates the integrity of multispectral remote sensing images.
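    The final authentication step described above reduces to a normalized Hamming distance between two binary hash strings compared against a threshold. A minimal sketch (the threshold value is illustrative, not from the paper):

```python
def normalized_hamming(h1: str, h2: str) -> float:
    """Fraction of differing bits between two equal-length binary hash strings."""
    if len(h1) != len(h2):
        raise ValueError("hash lengths differ")
    return sum(a != b for a, b in zip(h1, h2)) / len(h1)

def authenticate(recomputed: str, original: str, threshold: float = 0.1) -> bool:
    """Accept the image as authentic if the hashes are close enough."""
    return normalized_hamming(recomputed, original) <= threshold
```

    A small distance tolerates content-preserving operations (e.g. lossless recompression), while tampering flips enough perceptual-feature bits to push the distance over the threshold.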

  16. A Degree Distribution Optimization Algorithm for Image Transmission

    Science.gov (United States)

    Jiang, Wei; Yang, Junjie

    2016-09-01

    Luby Transform (LT) code is the first practical implementation of digital fountain code. The coding behavior of LT code is mainly decided by the degree distribution, which determines the relationship between source data and codewords. Two degree distributions are suggested by Luby. They work well in typical situations but not optimally in the case of finite encoding symbols. In this work, a degree distribution optimization algorithm is proposed to explore the potential of LT code. First, a selection scheme of sparse degrees for LT codes is introduced; then the probability distribution is optimized over the selected degrees. In image transmission, the bit stream is sensitive to channel noise, and even a single bit error may cause loss of synchronization between the encoder and the decoder. Therefore, the proposed algorithm is designed for the image transmission situation. Moreover, optimal class partition is studied for image transmission with unequal error protection. The experimental results are quite promising: compared with LT code with the robust soliton distribution, the proposed algorithm noticeably improves the final quality of recovered images at the same overhead.
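    The robust soliton distribution used as the baseline above is fully specified by Luby. A sketch of its construction (parameter values `c` and `delta` here are typical illustrative choices, not the paper's):

```python
import math

def robust_soliton(k, c=0.1, delta=0.5):
    """Robust soliton degree distribution for an LT code with k input symbols.
    Returns a probability list for degrees 1..k."""
    S = c * math.log(k / delta) * math.sqrt(k)
    # ideal soliton component: rho(1) = 1/k, rho(d) = 1/(d(d-1)) for d >= 2
    rho = [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
    # spike component tau, concentrated around degree k/S
    pivot = int(round(k / S))
    tau = [0.0] * k
    for d in range(1, min(pivot, k + 1)):
        tau[d - 1] = S / (d * k)
    if 1 <= pivot <= k:
        tau[pivot - 1] = S * math.log(S / delta) / k
    beta = sum(r + t for r, t in zip(rho, tau))   # normalization constant
    return [(r + t) / beta for r, t in zip(rho, tau)]
```

    The optimization algorithm in the paper replaces this fixed distribution with one tuned to a finite, sparse set of degrees.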

  17. Research on Image Reconstruction Algorithms for Tuber Electrical Resistance Tomography System

    Directory of Open Access Journals (Sweden)

    Jiang Zili

    2016-01-01

    Full Text Available The application of electrical resistance tomography (ERT) technology has been expanded to the field of agriculture, and the concept of TERT (Tuber Electrical Resistance Tomography) is proposed. On the basis of research on the forward and inverse problems of the TERT system, a hybrid algorithm based on the genetic algorithm is proposed, which can be used in the TERT system to monitor the growth status of plant tubers. Image reconstruction in the TERT system differs from that in a conventional ERT system for two-phase flow measurement: TERT imaging requires higher measurement precision, whereas conventional ERT cares more about reconstruction speed. A variety of algorithms, such as linear back projection, modified Newton-Raphson and the genetic algorithm, are analyzed and optimized to make them suitable for the TERT system. Experimental results showed that the novel hybrid algorithm is superior to the other algorithms and can effectively improve image reconstruction quality.

  18. Hybridizing Differential Evolution with a Genetic Algorithm for Color Image Segmentation

    Directory of Open Access Journals (Sweden)

    R. V. V. Krishna

    2016-10-01

    Full Text Available This paper proposes a hybrid of differential evolution and genetic algorithms to solve the color image segmentation problem. Clustering-based color image segmentation algorithms segment an image by clustering the features of color and texture, thereby obtaining accurate prototype cluster centers. In the proposed algorithm, the color features are obtained using the homogeneity model. A new texture feature named the Power Law Descriptor (PLD), which is a modification of the Weber Local Descriptor (WLD), is proposed and used as a texture feature for clustering. Genetic algorithms are competent in handling binary variables, while differential evolution is more efficient in handling real parameters. The obtained texture feature is binary in nature and the color feature is real-valued, which suits the hybrid cluster-center optimization problem in image segmentation very well. Thus, in the proposed algorithm, the optimum texture feature centers are evolved using genetic algorithms, whereas the optimum color feature centers are evolved using differential evolution.

  19. A SAR IMAGE REGISTRATION METHOD BASED ON SIFT ALGORITHM

    Directory of Open Access Journals (Sweden)

    W. Lu

    2017-09-01

    Full Text Available In order to improve the stability and rapidity of synthetic aperture radar (SAR) image matching, an effective method is presented. Firstly, adaptive smoothing filtering based on Wallis filtering is employed for image denoising, to avoid amplifying noise in the subsequent steps. Secondly, feature points are extracted by a simplified SIFT algorithm. Finally, exact matching of the images is achieved with these points. Compared with existing methods, the approach not only maintains the richness of features, but also reduces the noise of the image. The simulation results show that the proposed algorithm achieves a better matching effect.

  20. Estimation of individual response in finger blood concentration change under occlusion on human arm using speckle patterns

    Science.gov (United States)

    Yokoi, Naomichi; Shinohara, Tomomi; Okazaki, Syunya; Funamizu, Hideki; Kyoso, Masaki; Shimatani, Yuichi; Yuasa, Tomonori; Aizu, Yoshihisa

    2017-07-01

    We have developed a method for imaging blood flow and blood concentration changes using laser speckle under fiber illumination. We experimentally discuss the relationship between the blood occlusion condition and the individual response of the blood concentration change measured by this method.

  1. Non invasive blood flow assessment in diabetic foot ulcer using laser speckle contrast imaging technique

    Science.gov (United States)

    Jayanthy, A. K.; Sujatha, N.; Reddy, M. Ramasubba; Narayanamoorthy, V. B.

    2014-03-01

    Measuring microcirculatory tissue blood perfusion is of interest to both clinicians and researchers in a wide range of applications and can provide essential information on the progress of treatment of diseases that cause either increased or decreased blood flow. Diabetic ulcer, associated with alterations in tissue blood flow, is the most common cause of non-traumatic lower extremity amputations. A technique that can detect the onset of ulcer and provide essential information on the progress of its treatment would be of great help to clinicians. A noninvasive, noncontact and whole-field laser speckle contrast imaging (LSCI) technique is described in this paper, which is used to assess the changes in blood flow in diabetic ulcer affected areas of the foot. The blood flow assessment at the wound site can provide critical information on the efficiency and progress of the treatment given to diabetic ulcer subjects. The technique may also potentially fulfill a significant need in diabetic foot ulcer screening and management.
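    The quantity underlying LSCI is the local speckle contrast K = sigma/mean computed over a small sliding window: faster flow blurs the speckle during the exposure and lowers K. A minimal sketch of the contrast map computation (window size is an illustrative choice):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def speckle_contrast(img, w=7):
    """Local speckle contrast K = sigma / mean over a sliding w x w window."""
    win = sliding_window_view(img, (w, w))
    mu = win.mean(axis=(-2, -1))
    sigma = win.std(axis=(-2, -1))
    return sigma / (mu + 1e-12)
```

    For fully developed static speckle with exponentially distributed intensity, K is close to 1; motion blur from perfusion drives it toward 0, which is how flow maps are read off.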

  2. Improved SURF Algorithm and Its Application in Seabed Relief Image Matching

    Directory of Open Access Journals (Sweden)

    Zhang Hong-Mei

    2017-01-01

    Full Text Available Matching based on seabed relief images is widely used in underwater relief matching navigation and target recognition. However, influenced by various factors, conventional matching algorithms have difficulty obtaining ideal results on seabed relief images. The SURF (Speeded Up Robust Features) algorithm achieves matching through pairs of feature points and can get good results on seabed relief images. In practical applications, however, the traditional SURF algorithm is prone to false matches, especially when an area's features are similar or indistinct. In order to improve the robustness of the algorithm, this paper proposes an improved matching algorithm, which combines the SURF and RANSAC (Random Sample Consensus) algorithms. The new algorithm integrates the advantages of both: firstly, the SURF algorithm is applied to detect and extract the feature points and to pre-match; secondly, the RANSAC algorithm is utilized to eliminate mismatching points, and the accurate matching is then accomplished with the correct matching points. The experimental results show that the improved algorithm overcomes the mismatching problem effectively and has better precision and faster speed than the traditional SURF algorithm.
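    The RANSAC stage can be illustrated independently of SURF: sample a minimal set of matches, hypothesize a motion model, and keep the hypothesis with the largest consensus set. The sketch below assumes a pure-translation model for simplicity (a real pipeline would typically fit a homography or affine transform):

```python
import numpy as np

def ransac_translation(src, dst, iters=200, tol=1.0, seed=0):
    """Estimate a 2-D translation between matched point sets while rejecting
    outlier matches (RANSAC with a 1-point minimal sample)."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        i = int(rng.integers(len(src)))
        t = dst[i] - src[i]                      # hypothesis from one match
        inliers = np.linalg.norm(dst - (src + t), axis=1) < tol
        if inliers.sum() > best.sum():
            best = inliers
    t = (dst[best] - src[best]).mean(axis=0)     # refit on the consensus set
    return t, best
```

    Pre-matched SURF correspondences would be fed in as `src`/`dst`; the returned inlier mask is what "eliminates mismatching points" before the final matching step.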

  3. Simultaneous acquisition of 3D shape and deformation by combination of interferometric and correlation-based laser speckle metrology.

    Science.gov (United States)

    Dekiff, Markus; Berssenbrügge, Philipp; Kemper, Björn; Denz, Cornelia; Dirksen, Dieter

    2015-12-01

    A metrology system combining three laser speckle measurement techniques for simultaneous determination of 3D shape and micro- and macroscopic deformations is presented. While microscopic deformations are determined by a combination of Digital Holographic Interferometry (DHI) and Digital Speckle Photography (DSP), macroscopic 3D shape, position and deformation are retrieved by photogrammetry based on digital image correlation of a projected laser speckle pattern. The photogrammetrically obtained data extend the measurement range of the DHI-DSP system and also increase the accuracy of the calculation of the sensitivity vector. Furthermore, a precise assignment of microscopic displacements to the object's macroscopic shape for enhanced visualization is achieved. The approach allows for fast measurements with a simple setup. Key parameters of the system are optimized, and its precision and measurement range are demonstrated. As application examples, the deformation of a mandible model and the shrinkage of dental impression material are measured.

  4. From Pixels to Region: A Salient Region Detection Algorithm for Location-Quantification Image

    Directory of Open Access Journals (Sweden)

    Mengmeng Zhang

    2014-01-01

    Full Text Available Image saliency detection has become increasingly important with the development of intelligent identification and machine vision technology. This process is essential for many image processing algorithms, such as image retrieval, image segmentation, image recognition, and adaptive image compression. We propose a salient region detection algorithm for full-resolution images that analyzes the randomness and correlation of image pixels and employs a pixel-to-region saliency computation mechanism. The algorithm first obtains the points with higher saliency probability by using an improved smallest univalue segment assimilating nucleus operator. It then reconstructs the entire saliency region detection by taking these points as reference and combining them with the image's spatial color distribution, as well as regional and global contrasts. The results for subjective and objective image saliency detection show that the proposed algorithm exhibits outstanding performance in terms of technology indices such as precision and recall rates.

  5. Multimode waveguide speckle patterns for compressive sensing.

    Science.gov (United States)

    Valley, George C; Sefler, George A; Justin Shaw, T

    2016-06-01

    Compressive sensing (CS) of sparse gigahertz-band RF signals using microwave photonics may achieve better performance with smaller size, weight, and power than electronic CS or conventional Nyquist-rate sampling. The critical element in a CS system is the device that produces the CS measurement matrix (MM). We show that passive speckle patterns in multimode waveguides potentially provide excellent MMs for CS. We measure and calculate the MM for a multimode fiber and perform simulations using this MM in a CS system. We show that the speckle MM exhibits the sharp phase transition and coherence properties needed for CS and that these properties are similar to those of a sub-Gaussian MM with the same mean and standard deviation. We calculate the MM for a multimode planar waveguide and find dimensions of the planar guide that give a speckle MM with performance similar to that of the multimode fiber. The CS simulations show that all measured and calculated speckle MMs exhibit robust performance with equal-amplitude signals that are sparse in time, in frequency, and in wavelets (Haar wavelet transform). The planar waveguide results indicate a path to a microwave photonic integrated circuit for measuring sparse gigahertz-band RF signals using CS.
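    The CS pipeline can be sketched end to end: measure y = A x with a measurement matrix A, then recover the sparse x. The sketch below stands in a Gaussian matrix for the speckle MM (the abstract notes the speckle MM behaves like a sub-Gaussian one) and uses standard Orthogonal Matching Pursuit as the recovery algorithm, which is not necessarily the solver used by the authors:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: recover a k-sparse x from y = A @ x."""
    residual = y.astype(float).copy()
    support = []
    coeffs = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # most correlated atom
        if j not in support:
            support.append(j)
        # least-squares fit on the current support, then update the residual
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x = np.zeros(A.shape[1])
    x[support] = coeffs
    return x
```

    With far fewer measurements than signal samples (here 60 vs 120), a sufficiently incoherent MM still permits exact recovery of sparse signals, which is the property the speckle matrices are shown to share.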

  6. Image Encryption Using a Lightweight Stream Encryption Algorithm

    Directory of Open Access Journals (Sweden)

    Saeed Bahrami

    2012-01-01

    Full Text Available Security of multimedia data, including image and video, is one of the basic requirements for telecommunications and computer networks. In this paper, we consider a simple and lightweight stream encryption algorithm for image encryption, and a series of tests is performed to confirm the suitability of the described encryption algorithm. These tests include the visual test, histogram analysis, information entropy, encryption quality, correlation analysis, differential analysis, and performance analysis. Based on this analysis, it can be concluded that the present algorithm, in comparison to the A5/1 and W7 stream ciphers, offers the same security level, is faster, and is suitable for real-time applications.
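    A stream cipher encrypts by XOR-ing the plaintext with a key-derived keystream, so encryption and decryption are the same operation. The sketch below uses a hash-counter keystream purely as an illustrative stand-in; it is not the paper's algorithm, nor A5/1 or W7:

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Pseudorandom keystream from a hash in counter mode (illustrative only)."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def xor_cipher(data: bytes, key: bytes, nonce: bytes) -> bytes:
    """Stream encryption/decryption: XOR the data with the keystream."""
    ks = keystream(key, nonce, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))
```

    Applying `xor_cipher` twice with the same key and nonce recovers the plaintext, which is why lightweight stream ciphers suit real-time image pipelines: the per-byte cost is a single XOR once the keystream is available.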

  7. Algorithms for reconstructing images for industrial applications

    International Nuclear Information System (INIS)

    Lopes, R.T.; Crispim, V.R.

    1986-01-01

    Several algorithms for reconstructing objects from their projections are being studied in our laboratory for industrial applications. Such algorithms are useful for locating the position and shape of different material compositions in the object. A comparative study of two algorithms is made. The two investigated algorithms are MART (Multiplicative Algebraic Reconstruction Technique) and the convolution method. The comparison is carried out from the point of view of reconstructed image quality, number of views, and cost. (Author) [pt
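    MART updates the image multiplicatively, ray by ray, scaling each pixel by the ratio of the measured to the computed projection. A minimal sketch for a consistent nonnegative system (a toy dense system stands in for a real projection geometry):

```python
import numpy as np

def mart(A, p, iters=500, lam=1.0):
    """Multiplicative ART for a consistent nonnegative system A @ x = p.
    A holds ray weights (rows = rays), p the measured projections."""
    x = np.ones(A.shape[1])                     # strictly positive start
    for _ in range(iters):
        for i in range(A.shape[0]):             # one sweep over all rays
            proj = A[i] @ x                     # computed projection of ray i
            if proj > 0:
                # scale pixels on the ray by the measured/computed ratio
                x *= (p[i] / proj) ** (lam * A[i])
    return x
```

    Because the update is multiplicative, pixels stay nonnegative throughout, which is one reason MART is attractive for attenuation-type images.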

  8. Statistics of polarization speckle: theory versus experiment

    DEFF Research Database (Denmark)

    Wang, Wei; Hanson, Steen Grüner; Takeda, Mitsuo

    2010-01-01

    In this paper, we review our recent work on the statistical properties of polarization speckle, described by stochastic Stokes parameters fluctuating in space. Based on the Gaussian assumption for the random electric field components and a polar interferometer, we investigated theoretically and experimentally the statistics of the Stokes parameters of polarization speckle, including the probability density function of the Stokes parameters with the spatial degree of polarization, the autocorrelation of the Stokes vector, and the statistics of spatial derivatives of the Stokes parameters.

  9. A Spherical Model Based Keypoint Descriptor and Matching Algorithm for Omnidirectional Images

    Directory of Open Access Journals (Sweden)

    Guofeng Tong

    2014-04-01

    Full Text Available Omnidirectional images generally have nonlinear distortion in the radial direction. Unfortunately, traditional algorithms such as the scale-invariant feature transform (SIFT) and Descriptor-Nets (D-Nets) do not work well in matching omnidirectional images, precisely because they are incapable of dealing with this distortion. In order to solve this problem, a new voting algorithm is proposed based on the spherical model and the D-Nets algorithm. Because the spherical-model-based keypoint descriptor contains the distortion information of omnidirectional images, the proposed matching algorithm is invariant to distortion. Keypoint matching experiments are performed on three pairs of omnidirectional images, and a comparison is made among the proposed algorithm, SIFT and D-Nets. The results show that the proposed algorithm is more robust and more precise than SIFT and D-Nets in matching omnidirectional images. Compared with SIFT and D-Nets, the proposed algorithm has two main advantages: (a) there are more real matching keypoints; (b) the coverage range of the matching keypoints is wider, including the seriously distorted areas.

  10. Efficient Active Contour and K-Means Algorithms in Image Segmentation

    Directory of Open Access Journals (Sweden)

    J.R. Rommelse

    2004-01-01

    Full Text Available In this paper we discuss a classic clustering algorithm that can be used to segment images, together with a recently developed active contour image segmentation model. We propose integrating aspects of the classic algorithm to improve the active contour model. For the resulting CVK and B-means segmentation algorithms, we examine methods to decrease the size of the image domain. The CVK method has been implemented to run on parallel and distributed computers. By changing the order in which the pixels are updated, it was possible to replace synchronous communication with asynchronous communication and thereby improve parallel efficiency.
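    The classic clustering algorithm referred to is k-means. A minimal Lloyd-iteration sketch on scalar pixel intensities (a gray-level segmentation would label each pixel with its cluster index):

```python
import numpy as np

def kmeans_1d(x, k, iters=50, seed=0):
    """Lloyd's k-means on scalar features (e.g. gray levels for segmentation)."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(x, size=k, replace=False).astype(float)
    labels = np.zeros(x.size, dtype=int)
    for _ in range(iters):
        # assign each sample to its nearest center
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        # move each center to the mean of its assigned samples
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    return centers, labels
```

    In the hybrid schemes above, cluster statistics of this kind are folded into the active contour's region terms rather than used as a stand-alone segmenter.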

  11. A MAP blind image deconvolution algorithm with bandwidth over-constrained

    Science.gov (United States)

    Ren, Zhilei; Liu, Jin; Liang, Yonghui; He, Yulong

    2018-03-01

    We demonstrate a maximum a posteriori (MAP) blind image deconvolution algorithm with an over-constrained bandwidth and total variation (TV) regularization to recover a clear image from AO-corrected images. The point spread functions (PSFs) are estimated with their bandwidth constrained below the cutoff frequency of the optical system. Our algorithm performs well in avoiding noise magnification. The performance is demonstrated on simulated data.

  12. Iris recognition using image moments and k-means algorithm.

    Science.gov (United States)

    Khan, Yaser Daanial; Khan, Sher Afzal; Ahmad, Farooq; Islam, Saeed

    2014-01-01

    This paper presents a biometric technique for identification of a person using the iris image. The iris is first segmented from the acquired image of an eye using an edge detection algorithm. The disk-shaped area of the iris is transformed into a rectangular form. Moments are extracted from the grayscale image, which yields a feature vector containing scale-, rotation-, and translation-invariant moments. Images are clustered using the k-means algorithm and centroids for each cluster are computed. An arbitrary image is assumed to belong to the cluster whose centroid is nearest to its feature vector in terms of the Euclidean distance. The described model exhibits an accuracy of 98.5%.
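    Invariant moment features of the kind described can be sketched with the first two Hu moments, which are built from normalized central moments and are invariant to translation, scale, and rotation (the abstract does not specify which moment set the authors used; Hu moments are a standard choice):

```python
import numpy as np

def hu_first_two(img):
    """First two Hu invariant moments of a grayscale image."""
    y, x = np.mgrid[: img.shape[0], : img.shape[1]].astype(float)
    m00 = img.sum()
    cx, cy = (x * img).sum() / m00, (y * img).sum() / m00   # centroid

    def eta(p, q):   # normalized central moment (scale invariant)
        return (((x - cx) ** p) * ((y - cy) ** q) * img).sum() / m00 ** ((p + q) / 2 + 1)

    h1 = eta(2, 0) + eta(0, 2)
    h2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return np.array([h1, h2])
```

    Classification then reduces to assigning a feature vector to the k-means cluster whose centroid minimizes the Euclidean distance, e.g. `np.argmin(np.linalg.norm(centroids - f, axis=1))`.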

  13. [Present status and trend of heart fluid mechanics research based on medical image analysis].

    Science.gov (United States)

    Gan, Jianhong; Yin, Lixue; Xie, Shenghua; Li, Wenhua; Lu, Jing; Luo, Anguo

    2014-06-01

    After introducing the current main methods for heart fluid mechanics research, we study the characteristics and weaknesses of the three primary analysis methods, based on magnetic resonance imaging, color Doppler ultrasound and grayscale ultrasound images, respectively. It is pointed out that particle image velocimetry (PIV), speckle tracking and block matching are of the same nature: all three algorithms adopt block correlation. Further analysis shows that, with the development of information technology and sensors, research on cardiac function and fluid mechanics will focus on the energy transfer process of heart fluid, the characteristics of the chamber wall in relation to blood flow, and fluid-structure interaction.

  14. A Class of Manifold Regularized Multiplicative Update Algorithms for Image Clustering.

    Science.gov (United States)

    Yang, Shangming; Yi, Zhang; He, Xiaofei; Li, Xuelong

    2015-12-01

    Multiplicative update algorithms are important tools for information retrieval, image processing, and pattern recognition. However, when graph regularization is added to the cost function, different classes of sample data may be mapped to the same subspace, which leads to an increase in the data clustering error rate. In this paper, an improved nonnegative matrix factorization (NMF) cost function is introduced. Based on this cost function, a class of novel graph-regularized NMF algorithms is developed, which results in a class of extended multiplicative update algorithms with manifold structure regularization. Analysis shows that during learning, the proposed algorithms can efficiently minimize the rank of the data representation matrix. Theoretical results presented in this paper are confirmed by simulations. For different initializations and data sets, variation curves of cost functions and decomposition data are presented to show the convergence features of the proposed update rules. Basis images, reconstructed images, and clustering results are utilized to present the efficiency of the new algorithms. Finally, the clustering accuracies of different algorithms are also investigated, which shows that the proposed algorithms can achieve state-of-the-art performance in applications of image clustering.
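    The baseline multiplicative update rules being extended here are the Lee-Seung updates for NMF under the Frobenius cost, which keep W and H nonnegative by construction. A minimal sketch (without the paper's graph-regularization terms):

```python
import numpy as np

def nmf(V, r, iters=500, seed=0):
    """Lee-Seung multiplicative updates for V ~= W @ H (Frobenius cost)."""
    rng = np.random.default_rng(seed)
    W = rng.random((V.shape[0], r)) + 0.1    # strictly positive init
    H = rng.random((r, V.shape[1])) + 0.1
    for _ in range(iters):
        # each update multiplies by a nonnegative ratio, preserving sign
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H
```

    The graph-regularized variants in the paper add a Laplacian-derived term to the numerator and denominator of the H update so that nearby samples on the data manifold receive similar encodings.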

  15. An adaptive clustering algorithm for image matching based on corner feature

    Science.gov (United States)

    Wang, Zhe; Dong, Min; Mu, Xiaomin; Wang, Song

    2018-04-01

    Traditional image matching algorithms often cannot balance real-time performance and accuracy. To solve this problem, an adaptive clustering algorithm for image matching based on corner features is proposed in this paper. The method is based on the similarity of the vectors formed by matched point pairs, and adaptive clustering is performed on the matching point pairs. Harris corner detection is carried out first, the feature points of the reference image and the perceived image are extracted, and the feature points of the two images are initially matched using the Normalized Cross Correlation (NCC) function. Then, using the improved algorithm proposed in this paper, the matching results are clustered to reduce ineffective operations and improve the matching speed and robustness. Finally, the Random Sample Consensus (RANSAC) algorithm is used to match the clustered matching points. The experimental results show that the proposed algorithm can effectively eliminate most wrong matching points while the correct matching points are retained, improve the accuracy of RANSAC matching, and reduce the computational load of the whole matching process.
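    The NCC score used for the initial matching compares two patches after removing their means and normalizing their energies, giving a value in [-1, 1] that is invariant to affine intensity changes. A minimal sketch:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equal-size patches (-1..1)."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
    return float((a * b).sum() / denom)
```

    In the pipeline above, a patch around each Harris corner of the reference image is scored against candidate patches in the perceived image, and the highest-NCC candidate becomes the tentative match passed on to clustering and RANSAC.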

  16. A comparative study of image low level feature extraction algorithms

    Directory of Open Access Journals (Sweden)

    M.M. El-gayar

    2013-07-01

    Full Text Available Feature extraction and matching are at the base of many computer vision problems, such as object recognition or structure from motion. Current methods for assessing the performance of popular image matching algorithms are presented and rely on costly descriptors for detection and matching. Specifically, the method assesses the type of images under which each of the algorithms reviewed herein performs at its maximum or highest efficiency. The efficiency is measured in terms of the number of matches found by the algorithm and the number of type I and type II errors encountered when the algorithm is tested against a specific pair of images. Current comparative studies assess the performance of the algorithms based on the results obtained under different criteria such as speed, sensitivity, occlusion, and others. This study addresses the limitations of the existing comparative tools and delivers a generalized criterion to determine beforehand the level of efficiency expected from a matching algorithm given the type of images evaluated. The algorithms and the respective images used within this work are divided into two groups: feature-based and texture-based. From this broad classification, six of the most widely used algorithms are assessed: color histogram, FAST (Features from Accelerated Segment Test), SIFT (Scale Invariant Feature Transform), PCA-SIFT (Principal Component Analysis-SIFT), F-SIFT (fast-SIFT) and SURF (Speeded Up Robust Features). The performance of the F-SIFT feature detection method is compared for scale changes, rotation, blur, illumination changes and affine transformations. All the experiments use repeatability measurement and the number of correct matches as evaluation measures. SIFT is stable in most situations, although it is slow. F-SIFT is the fastest, with good performance on a par with SURF; SIFT and PCA-SIFT show advantages under rotation and illumination changes.

  17. Development of information preserving data compression algorithm for CT images

    International Nuclear Information System (INIS)

    Kobayashi, Yoshio

    1989-01-01

    Although digital imaging techniques in radiology are developing rapidly, problems arise in the archival storage and communication of image data. This paper reports on a new information-preserving data compression algorithm for computed tomographic (CT) images. The algorithm consists of the following five processes: 1. pixels surrounding the human body showing CT values smaller than -900 H.U. are eliminated; 2. each pixel is encoded by its numerical difference from its neighboring pixel along a matrix line; 3. difference values are encoded by a newly designed code rather than the natural binary code; 4. the image data obtained with the above process are decomposed into bit planes; 5. the bit state transitions in each bit plane are encoded by run length coding. Using this new algorithm, the compression ratios of brain, chest, and abdomen CT images are 4.49, 4.34, and 4.40, respectively. (author)
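    Two of the lossless building blocks above, difference (delta) encoding along a matrix line and run-length coding of bit planes, can be sketched with exact round trips (the paper's custom difference code and bit-plane decomposition are not reproduced here):

```python
def delta_encode(row):
    """Replace each pixel with its difference from the previous pixel."""
    return [row[0]] + [row[i] - row[i - 1] for i in range(1, len(row))]

def delta_decode(diffs):
    """Invert delta encoding by cumulative summation."""
    out = [diffs[0]]
    for d in diffs[1:]:
        out.append(out[-1] + d)
    return out

def rle_encode(bits):
    """Run-length encode a bit plane: (first bit, list of run lengths)."""
    runs, count = [], 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return bits[0], runs

def rle_decode(first, runs):
    """Expand (first bit, run lengths) back into the bit sequence."""
    out, bit = [], first
    for r in runs:
        out += [bit] * r
        bit ^= 1
    return out
```

    Delta encoding concentrates CT values near zero (neighboring pixels are similar), which makes the subsequent variable-length coding and bit-plane run-length coding effective while remaining fully reversible.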

  18. The ANACONDA algorithm for deformable image registration in radiotherapy

    International Nuclear Information System (INIS)

    Weistrand, Ola; Svensson, Stina

    2015-01-01

    Purpose: The purpose of this work was to describe a versatile algorithm for deformable image registration with applications in radiotherapy and to validate it on thoracic 4DCT data as well as CT/cone beam CT (CBCT) data. Methods: ANAtomically CONstrained Deformation Algorithm (ANACONDA) combines image information (i.e., intensities) with anatomical information as provided by contoured image sets. The registration problem is formulated as a nonlinear optimization problem and solved with an in-house developed solver, tailored to this problem. The objective function, which is minimized during optimization, is a linear combination of four nonlinear terms: 1. an image similarity term; 2. a grid regularization term, which aims at keeping the deformed image grid smooth and invertible; 3. a shape-based regularization term, which works to keep the deformation anatomically reasonable when regions of interest are present in the reference image; and 4. a penalty term, which is added to the optimization problem when controlling structures are used, aimed at deforming the selected structure in the reference image to the corresponding structure in the target image. Results: To validate ANACONDA, the authors have used 16 publicly available thoracic 4DCT data sets for which target registration errors from several algorithms have been reported in the literature. On average for the 16 data sets, the target registration error is 1.17 ± 0.87 mm, the Dice similarity coefficient is 0.98 for the two lungs, and image similarity, measured by the correlation coefficient, is 0.95. The authors have also validated ANACONDA using two pelvic cases and one head and neck case with planning CT and daily acquired CBCT. Each image has been contoured by a physician (radiation oncologist) or experienced radiation therapist. The results are an improvement with respect to rigid registration. However, for the head and neck case, the sample set is too small to show statistical significance. Conclusions: ANACONDA

  19. Algorithm-Architecture Matching for Signal and Image Processing

    CERN Document Server

    Gogniat, Guy; Morawiec, Adam; Erdogan, Ahmet

    2011-01-01

    Advances in signal and image processing together with increasing computing power are bringing mobile technology closer to applications in a variety of domains like automotive, health, telecommunication, multimedia, entertainment and many others. The development of these leading applications, involving a large diversity of algorithms (e.g. signal, image, video, 3D, communication, cryptography) is classically divided into three consecutive steps: a theoretical study of the algorithms, a study of the target architecture, and finally the implementation. Such a linear design flow is reaching its li

  20. Wavelet-based de-noising algorithm for images acquired with parallel magnetic resonance imaging (MRI)

    International Nuclear Information System (INIS)

    Delakis, Ioannis; Hammad, Omer; Kitney, Richard I

    2007-01-01

    Wavelet-based de-noising has been shown to improve image signal-to-noise ratio in magnetic resonance imaging (MRI) while maintaining spatial resolution. Wavelet-based de-noising techniques typically implemented in MRI require that noise displays uniform spatial distribution. However, images acquired with parallel MRI have spatially varying noise levels. In this work, a new algorithm for filtering images with parallel MRI is presented. The proposed algorithm extracts the edges from the original image and then generates a noise map from the wavelet coefficients at finer scales. The noise map is zeroed at locations where edges have been detected and directional analysis is also used to calculate noise in regions of low-contrast edges that may not have been detected. The new methodology was applied on phantom and brain images and compared with other applicable de-noising techniques. The performance of the proposed algorithm was shown to be comparable with other techniques in central areas of the images, where noise levels are high. In addition, finer details and edges were maintained in peripheral areas, where noise levels are low. The proposed methodology is fully automated and can be applied on final reconstructed images without requiring sensitivity profiles or noise matrices of the receiver coils, therefore making it suitable for implementation in a clinical MRI setting
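The edge-aware thresholding idea above can be sketched in a few lines. The sketch below is not the authors' implementation: it uses a single-level Haar transform written directly in NumPy, the noise map and edge mask are assumed to be supplied by earlier steps, and all names and threshold values are illustrative.

```python
import numpy as np

def haar2d(x):
    # single-level 2D Haar decomposition (x must have even dimensions)
    a = (x[0::2, :] + x[1::2, :]) / 2.0
    d = (x[0::2, :] - x[1::2, :]) / 2.0
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    # exact inverse of haar2d
    a = np.empty((ll.shape[0], 2 * ll.shape[1]))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    x = np.empty((2 * a.shape[0], a.shape[1]))
    x[0::2, :], x[1::2, :] = a + d, a - d
    return x

def denoise(img, noise_map, edge_mask):
    """Soft-threshold fine-scale coefficients with a spatially varying
    threshold; the threshold is zeroed where edges were detected, so
    edges pass through unattenuated (the idea in the abstract)."""
    ll, lh, hl, hh = haar2d(img)
    nm = noise_map[0::2, 0::2]          # crude downsample to coeff grid
    t = np.where(edge_mask[0::2, 0::2], 0.0, 3.0 * nm)
    soft = lambda c: np.sign(c) * np.maximum(np.abs(c) - t, 0.0)
    return ihaar2d(ll, soft(lh), soft(hl), soft(hh))
```

With a zero noise map the pipeline reduces to the identity, which makes the transform pair easy to check.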

  1. Interactive vs. automatic ultrasound image segmentation methods for staging hepatic lipidosis.

    Science.gov (United States)

    Weijers, Gert; Starke, Alexander; Haudum, Alois; Thijssen, Johan M; Rehage, Jürgen; De Korte, Chris L

    2010-07-01

    The aim of this study was to test the hypothesis that automatic segmentation of vessels in ultrasound (US) images can produce similar or better results in grading fatty livers than interactive segmentation. A study was performed in postpartum dairy cows (N=151), as an animal model of human fatty liver disease, to test this hypothesis. Five transcutaneous and five intraoperative US liver images were acquired in each animal and a liver biopsy was taken. In liver tissue samples, triacylglycerol (TAG) was measured by biochemical analysis and hepatic diseases other than hepatic lipidosis were excluded by histopathologic examination. Ultrasonic tissue characterization (UTC) parameters--mean echo level, standard deviation (SD) of echo level, signal-to-noise ratio (SNR), residual attenuation coefficient (ResAtt) and axial and lateral speckle size--were derived using a computer-aided US (CAUS) protocol and software package. First, the liver tissue was interactively segmented by two observers. With increasing fat content, fewer hepatic vessels were visible in the ultrasound images and, therefore, a smaller proportion of the liver needed to be excluded from these images. Automatic-segmentation algorithms were implemented and it was investigated whether better results could be achieved than with the subjective and time-consuming interactive-segmentation procedure. The automatic-segmentation algorithms were based on both fixed and adaptive thresholding techniques in combination with a 'speckle'-shaped moving-window exclusion technique. All data were analyzed with and without postprocessing as contained in CAUS and with different automated-segmentation techniques. This enabled us to study the effect of the applied postprocessing steps on single and multiple linear regressions of the various UTC parameters with TAG. Improved correlations for all US parameters were found by using automatic-segmentation techniques. Stepwise multiple linear-regression formulas were derived and used
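As a rough illustration of the adaptive-thresholding idea, the hypothetical sketch below excludes dark (vessel-like) pixels whose echo level falls well below a local mean computed via an integral image. The window size and threshold fraction are invented for illustration and are not values from the study.

```python
import numpy as np

def vessel_exclusion_mask(img, frac=0.5, win=15):
    """Adaptive-threshold segmentation sketch: mark pixels whose echo
    level falls well below the local mean as vessel lumen and exclude
    them from the liver-tissue region (True = keep)."""
    pad = win // 2
    p = np.pad(img.astype(float), pad, mode='edge')
    # local mean via an integral image (summed-area table)
    s = np.cumsum(np.cumsum(p, axis=0), axis=1)
    s = np.pad(s, ((1, 0), (1, 0)))
    h, w = img.shape
    local_sum = (s[win:win + h, win:win + w] - s[:h, win:win + w]
                 - s[win:win + h, :w] + s[:h, :w])
    local_mean = local_sum / (win * win)
    return img >= frac * local_mean
```

On a bright parenchyma with a dark inclusion, the inclusion is flagged for exclusion while the surrounding tissue is kept.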

  2. An Image Encryption Algorithm Based on Balanced Pixel and Chaotic Map

    Directory of Open Access Journals (Sweden)

    Jian Zhang

    2014-01-01

    Full Text Available Image encryption technology has been applied in many fields and is becoming a principal way of protecting image information security. Many image encryption methods exist; however, to obtain a good encryption effect, existing algorithms often need to encrypt an image several times, and there is no effective method to decide the number of encryption rounds, which is generally judged by the human eye. This paper proposes an image encryption algorithm based on chaos and, alongside it, a balanced-pixel algorithm to determine the number of encryption rounds. Many simulation experiments have been done, including encryption-effect and security analyses. Experimental results show that the proposed method is feasible and effective.

  3. Spatial filtering velocimetry of objective speckles for measuring out-of-plane motion

    DEFF Research Database (Denmark)

    Jakobsen, Michael Linde; Yura, H. T.; Hanson, Steen Grüner

    2012-01-01

    This paper analyzes the dynamics of objective laser speckles as the distance between the object and the observation plane continuously changes. With the purpose of applying optical spatial filtering velocimetry to the speckle dynamics to measure out-of-plane motion in real time, a rotationally symmetric spatial filter is designed. The spatial filter converts the speckle dynamics into a photocurrent with a quasi-sinusoidal response to the out-of-plane motion. The spatial filter is here emulated with a CCD camera, and is tested on speckles arising from a real application. The analysis...

  4. New segmentation-based tone mapping algorithm for high dynamic range image

    Science.gov (United States)

    Duan, Weiwei; Guo, Huinan; Zhou, Zuofeng; Huang, Huimin; Cao, Jianzhong

    2017-07-01

    Traditional tone mapping algorithms for the display of high dynamic range (HDR) images have the drawback of losing the impression of brightness, contrast and color information. To overcome this, we propose in this paper a new tone mapping algorithm based on dividing the image into different exposure regions. Firstly, the over-exposure region is determined using the Local Binary Pattern information of the HDR image. Then, based on the peak and average gray level of the histogram, the under-exposure and normal-exposure regions of the HDR image are selected separately. Finally, the different exposure regions are mapped by differentiated tone mapping methods to obtain the final result. Experimental results show that the proposed algorithm achieves better performance than other algorithms in both visual quality and an objective contrast criterion.
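The region-differentiated mapping can be sketched as follows. This is not the paper's method: the exposure regions are split by simple luminance quantiles rather than the LBP and histogram-peak rules described above, and the per-region gamma values are invented for illustration.

```python
import numpy as np

def region_tone_map(lum, low_q=0.05, high_q=0.95):
    """Split the luminance range into under-, normal- and over-exposed
    regions (by quantiles, as a stand-in for the paper's rules) and
    apply a differentiated gamma curve to each before display."""
    t_lo, t_hi = np.quantile(lum, [low_q, high_q])
    norm = lum / lum.max()
    under, over = lum <= t_lo, lum >= t_hi
    normal = ~(under | over)
    out = np.empty_like(norm)
    out[under] = norm[under] ** 0.5    # boost shadows
    out[normal] = norm[normal] ** 0.8  # gentle midtone compression
    out[over] = norm[over] ** 1.5      # compress highlights
    return out
```

The output stays in [0, 1] and is ready for an 8-bit display after scaling.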

  5. First results of genetic algorithm application in ML image reconstruction in emission tomography

    International Nuclear Information System (INIS)

    Smolik, W.

    1999-01-01

    This paper concerns the application of a genetic algorithm to maximum likelihood image reconstruction in emission tomography. An example genetic algorithm for image reconstruction is presented. The genetic algorithm was based on the typical genetic scheme, modified to suit the nature of the solved problem. The convergence of the algorithm was examined. Different adaptation functions, selection and crossover methods were verified. The algorithm was tested on simulated SPECT data. The obtained image reconstruction results are discussed. (author)

  6. Time Reversal Reconstruction Algorithm Based on PSO Optimized SVM Interpolation for Photoacoustic Imaging

    Directory of Open Access Journals (Sweden)

    Mingjian Sun

    2015-01-01

    Full Text Available Photoacoustic imaging is an innovative technique for imaging biomedical tissues. The time reversal reconstruction algorithm, in which a numerical model of the acoustic forward problem is run backwards in time, is widely used. In this paper, a time reversal reconstruction algorithm based on particle swarm optimization (PSO) optimized support vector machine (SVM) interpolation is proposed for photoacoustic imaging. Numerical results show that the reconstructed images of the proposed algorithm are more accurate than those of time reversal algorithms based on nearest neighbor, linear, or cubic convolution interpolation, and that it can provide higher imaging quality using significantly fewer measurement positions or scanning times.

  7. A novel image encryption algorithm based on a 3D chaotic map

    Science.gov (United States)

    Kanso, A.; Ghebleh, M.

    2012-07-01

    Recently, Solak et al. [Solak E, Çokal C, Yildiz OT, Biyikoǧlu T. Cryptanalysis of Fridrich's chaotic image encryption. Int J Bifur Chaos 2010;20:1405-1413] cryptanalyzed the chaotic image encryption algorithm of [Fridrich J. Symmetric ciphers based on two-dimensional chaotic maps. Int J Bifur Chaos 1998;8(6):1259-1284], which was considered a benchmark for measuring the security of many image encryption algorithms. This attack can also be applied to other encryption algorithms that have a structure similar to Fridrich's algorithm, such as that of [Chen G, Mao Y, Chui C. A symmetric image encryption scheme based on 3D chaotic cat maps. Chaos Soliton Fract 2004;21:749-761]. In this paper, we suggest a novel image encryption algorithm based on a three-dimensional (3D) chaotic map that can defeat the aforementioned attack, among other existing attacks. The design of the proposed algorithm is simple and efficient, and based on three phases which provide the necessary properties for a secure image encryption algorithm, including the confusion and diffusion properties. In phase I, the image pixels are shuffled according to a search rule based on the 3D chaotic map. In phases II and III, 3D chaotic maps are used to scramble shuffled pixels through mixing and masking rules, respectively. Simulation results show that the suggested algorithm satisfies the required performance tests such as high level security, large key space and acceptable encryption speed. These characteristics make it a suitable candidate for use in cryptographic applications.
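The shuffle-then-mask structure common to such schemes can be illustrated in miniature. The sketch below is not the paper's algorithm: it substitutes a 1D logistic map for the unspecified 3D chaotic map and collapses the mixing and masking phases into a single XOR mask, purely to show the confusion/diffusion skeleton.

```python
import numpy as np

def logistic_stream(x0, n, r=3.99):
    """Chaotic keystream from the logistic map x <- r*x*(1-x)
    (a stand-in for the paper's 3D map)."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

def encrypt(img, key=0.3571):
    flat = img.flatten()
    n = flat.size
    # Phase I: confusion -- permute pixel positions by chaotic ordering
    perm = np.argsort(logistic_stream(key, n))
    shuffled = flat[perm]
    # Phases II/III (collapsed): diffusion -- XOR with a second keystream
    mask = (logistic_stream(key / 2, n) * 256).astype(np.uint8)
    return shuffled ^ mask, perm, mask

def decrypt(cipher, perm, mask):
    shuffled = cipher ^ mask
    flat = np.empty_like(shuffled)
    flat[perm] = shuffled   # invert the chaotic permutation
    return flat
```

Decryption simply undoes the mask and then the permutation, recovering the plaintext exactly.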

  8. A novel algorithm for thermal image encryption.

    Science.gov (United States)

    Hussain, Iqtadar; Anees, Amir; Algarni, Abdulmohsen

    2018-04-16

    Thermal images play a vital role at nuclear plants, power stations, forensic labs, biological research facilities, and petroleum extraction sites, so their security is very important. Image data has unique features such as intensity, contrast, homogeneity, entropy and correlation among pixels, which make image encryption somewhat trickier than other kinds of encryption, and these features are normally hard to handle with conventional image encryption schemes. Cryptographers have therefore paid attention to attractive properties of chaotic maps, such as randomness and sensitivity, to build novel cryptosystems, and recently proposed image encryption techniques increasingly depend on the application of chaotic maps. This paper proposes an image encryption algorithm based on the Chebyshev chaotic map and substitution boxes drawn from the S8 symmetric group of permutations. First, parameters of the chaotic Chebyshev map are chosen as a secret key to confuse the primary image. Then, the plaintext image is encrypted by the method generated from the substitution boxes and the Chebyshev map. By this process we obtain a ciphertext image that is thoroughly twisted and dispersed. The outcomes of well-known experiments, key sensitivity tests and statistical analysis confirm that the proposed algorithm offers a safe and efficient approach for real-time image encryption.
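The two ingredients named above, a Chebyshev chaotic keystream and byte substitution through an S-box, can be sketched as follows. The S8-group boxes of the paper are not reproduced; the sketch builds a generic key-dependent S-box from a chaotic ordering instead, and all parameter values are illustrative.

```python
import numpy as np

def chebyshev_stream(x0, length, degree=4):
    """Iterate the Chebyshev map x_{k+1} = cos(degree * arccos(x_k)),
    which is chaotic on [-1, 1] for degree >= 2."""
    xs = np.empty(length)
    x = x0
    for i in range(length):
        x = np.cos(degree * np.arccos(x))
        xs[i] = x
    return xs

def substitute(img, sbox):
    """Byte substitution: replace every pixel value by its S-box image."""
    return sbox[img]

# Key-dependent S-box: a chaotic ordering of the byte values 0..255
# (argsort of a chaotic sequence is always a permutation).
stream = chebyshev_stream(x0=0.7, length=256)
sbox = np.argsort(stream).astype(np.uint8)
inv_sbox = np.argsort(sbox).astype(np.uint8)   # inverse permutation
```

Because the S-box is a bijection on bytes, applying the inverse box recovers the plaintext exactly.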

  9. A new modified fast fractal image compression algorithm

    DEFF Research Database (Denmark)

    Salarian, Mehdi; Nadernejad, Ehsan; MiarNaimi, Hossein

    2013-01-01

    In this paper, a new fractal image compression algorithm is proposed, in which the time of the encoding process is considerably reduced. The algorithm exploits a domain pool reduction approach, along with the use of innovative predefined values for contrast scaling factor, S, instead of searching...

  10. An Orthogonal Learning Differential Evolution Algorithm for Remote Sensing Image Registration

    Directory of Open Access Journals (Sweden)

    Wenping Ma

    2014-01-01

    Full Text Available We introduce an area-based method for remote sensing image registration. We use an orthogonal learning differential evolution algorithm to optimize the similarity metric between the reference image and the target image. Many local and global methods have been used to achieve the optimal similarity metric in the last few years. Because remote sensing images are usually affected by large distortions and high noise, local methods fail in some cases, so global methods are often required. The orthogonal learning (OL) strategy is efficient when searching in complex problem spaces, and it can discover more useful information via orthogonal experimental design (OED). Differential evolution (DE) is a heuristic algorithm that has been shown to be efficient in solving the remote sensing image registration problem. The orthogonal learning differential evolution algorithm (OLDE) combines the two: it uses the OL strategy to guide the DE algorithm toward more useful information, and is efficient for many optimization problems. Experiments show that the OLDE method is more robust and efficient for registering remote sensing images.
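The DE base optimizer that OLDE extends can be sketched as plain DE/rand/1/bin; the orthogonal-learning strategy itself is not reproduced here. In registration, the decision vector would encode the transform parameters (e.g., shift and rotation) and the objective would be the negated similarity metric; here a toy objective stands in.

```python
import numpy as np

def differential_evolution(f, bounds, pop=20, iters=200,
                           F=0.5, CR=0.9, seed=0):
    """Minimize f over the box `bounds` = [(lo, hi), ...] with
    classic DE/rand/1/bin."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    d = len(bounds)
    X = lo + rng.random((pop, d)) * (hi - lo)
    fx = np.array([f(x) for x in X])
    for _ in range(iters):
        for i in range(pop):
            # mutation: perturb a random base vector by a scaled difference
            a, b, c = X[rng.choice(pop, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # binomial crossover (at least one gene from the mutant)
            cross = rng.random(d) < CR
            cross[rng.integers(d)] = True
            trial = np.where(cross, mutant, X[i])
            # greedy selection
            ft = f(trial)
            if ft <= fx[i]:
                X[i], fx[i] = trial, ft
    return X[np.argmin(fx)], fx.min()
```

On a smooth 2D test function the population collapses onto the global optimum within a few hundred generations.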

  11. Bas-relief map using texture analysis with application to live enhancement of ultrasound images.

    Science.gov (United States)

    Du, Huarui; Ma, Rui; Wang, Xiaoying; Zhang, Jue; Fang, Jing

    2015-05-01

    For ultrasound imaging, speckle is one of the most important factors in the degradation of contrast resolution because it masks meaningful texture and has the potential to interfere with diagnosis. It is expected that researchers would explore appropriate ways to reduce the speckle noise, to find the edges of structures and enhance weak borders between different organs in ultrasound imaging. Inspired by the principle of differential interference contrast microscopy, a "bas-relief map" is proposed that depicts the texture structure of ultrasound images. Based on a bas-relief map, an adaptive bas-relief filter was developed for ultrafast despeckling. Subsequently, an edge map was introduced to enhance the edges of images in real time. The holistic bas-relief map approach has been used experimentally with synthetic phantoms and digital ultrasound B-scan images of liver, kidney and gallbladder. Based on the visual inspection and the performance metrics of the despeckled images, it was found that the bas-relief map approach is capable of effectively reducing the speckle while significantly enhancing contrast and tissue boundaries for ultrasonic images, and its speckle reduction ability is comparable to that of Kuan, Lee and Frost filters. Meanwhile, the proposed technique could preserve more intra-region details compared with the popular speckle reducing anisotropic diffusion technique and more effectively enhance edges. In addition, the adaptive bas-relief filter was much less time consuming than the Kuan, Lee and Frost filter and speckle reducing anisotropic diffusion techniques. The bas-relief map strategy is effective for speckle reduction and live enhancement of ultrasound images, and can provide a valuable tool for clinical diagnosis. Copyright © 2015 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
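For reference, the Lee filter that the study uses as a comparison baseline can be sketched as follows. The noise-variance parameter and window size are illustrative, not values from the study.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def lee_filter(img, win=7, sigma_n2=0.01):
    """Classic Lee speckle filter: pull each pixel toward the local mean
    by a gain k that is near 1 in high-variance (edge) regions and near
    0 in flat, speckle-dominated regions. sigma_n2 is the assumed
    multiplicative-noise variance."""
    pad = win // 2
    p = np.pad(img.astype(float), pad, mode='reflect')
    w = sliding_window_view(p, (win, win))
    mean = w.mean(axis=(-2, -1))
    var = w.var(axis=(-2, -1))
    k = np.clip(1.0 - sigma_n2 * mean ** 2 / np.maximum(var, 1e-12),
                0.0, 1.0)
    return mean + k * (img - mean)
```

On a constant region the filter returns the input unchanged, while on speckled regions it substitutes the local mean, reducing variance.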

  12. Image Retrieval Algorithm Based on Discrete Fractional Transforms

    Science.gov (United States)

    Jindal, Neeru; Singh, Kulbir

    2013-06-01

    Discrete fractional transforms are signal processing tools that offer computational algorithms and solutions for various sophisticated applications. In this paper, a new technique to retrieve an encrypted and scrambled image based on discrete fractional transforms is proposed. A two-dimensional image was encrypted using discrete fractional transforms with three fractional orders and two random phase masks placed in the two intermediate planes. A significant feature of discrete fractional transforms is the extra degree of freedom provided by their fractional orders. Security strength was enhanced (1024!)^4 times by scrambling the encrypted image. In the decryption process, image retrieval is sensitive to both the correct fractional order keys and the scrambling algorithm. The proposed approach makes a brute force attack infeasible. Mean square error and relative error are the performance parameters used to verify the validity of the proposed method.

  13. A Compressed Sensing-based Image Reconstruction Algorithm for Solar Flare X-Ray Observations

    Energy Technology Data Exchange (ETDEWEB)

    Felix, Simon; Bolzern, Roman; Battaglia, Marina, E-mail: simon.felix@fhnw.ch, E-mail: roman.bolzern@fhnw.ch, E-mail: marina.battaglia@fhnw.ch [University of Applied Sciences and Arts Northwestern Switzerland FHNW, 5210 Windisch (Switzerland)

    2017-11-01

    One way of imaging X-ray emission from solar flares is to measure Fourier components of the spatial X-ray source distribution. We present a new compressed sensing-based algorithm named VIS-CS, which reconstructs the spatial distribution from such Fourier components. We demonstrate the application of the algorithm on synthetic and observed solar flare X-ray data from the Reuven Ramaty High Energy Solar Spectroscopic Imager satellite and compare its performance with existing algorithms. VIS-CS produces competitive results with accurate photometry and morphology, without requiring any algorithm- and X-ray-source-specific parameter tuning. Its robustness and performance make this algorithm ideally suited for the generation of quicklook images or large image cubes without user intervention, such as for imaging spectroscopy analysis.

  14. A Compressed Sensing-based Image Reconstruction Algorithm for Solar Flare X-Ray Observations

    Science.gov (United States)

    Felix, Simon; Bolzern, Roman; Battaglia, Marina

    2017-11-01

    One way of imaging X-ray emission from solar flares is to measure Fourier components of the spatial X-ray source distribution. We present a new compressed sensing-based algorithm named VIS_CS, which reconstructs the spatial distribution from such Fourier components. We demonstrate the application of the algorithm on synthetic and observed solar flare X-ray data from the Reuven Ramaty High Energy Solar Spectroscopic Imager satellite and compare its performance with existing algorithms. VIS_CS produces competitive results with accurate photometry and morphology, without requiring any algorithm- and X-ray-source-specific parameter tuning. Its robustness and performance make this algorithm ideally suited for the generation of quicklook images or large image cubes without user intervention, such as for imaging spectroscopy analysis.

  15. Speckle reduction process based on digital filtering and wavelet compounding in optical coherence tomography for dermatology

    Science.gov (United States)

    Gómez Valverde, Juan J.; Ortuño, Juan E.; Guerra, Pedro; Hermann, Boris; Zabihian, Behrooz; Rubio-Guivernau, José L.; Santos, Andrés.; Drexler, Wolfgang; Ledesma-Carbayo, Maria J.

    2015-07-01

    Optical Coherence Tomography (OCT) has shown great potential as a complementary imaging tool in the diagnosis of skin diseases. Speckle noise is the most prominent artifact present in OCT images and can limit interpretation and detection capabilities. In this work we propose a new speckle reduction process and compare it with various denoising filters with high edge-preserving potential, using several sets of dermatological OCT B-scans. To validate the performance we used a custom-designed spectral domain OCT and two different data set groups. The first group consisted of five datasets of a single B-scan captured N times (with N<20); the second of five 3D volumes of 25 B-scans. As quality metrics we used the signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR) and equivalent number of looks (ENL). Our results show that a process based on a combination of a 2D enhanced sigma digital filter and a wavelet compounding method achieves the best results in terms of the improvement of the quality metrics. In the first group of individual B-scans we achieved improvements in SNR, CNR and ENL of 16.87 dB, 2.19 and 328, respectively; for the 3D volume datasets the improvements were 15.65 dB, 3.44 and 1148. Our results suggest that the proposed enhancement process may significantly reduce speckle, increasing SNR, CNR and ENL and reducing the number of extra acquisitions of the same frame.

  16. Acceleration of the direct reconstruction of linear parametric images using nested algorithms

    International Nuclear Information System (INIS)

    Wang Guobao; Qi Jinyi

    2010-01-01

    Parametric imaging using dynamic positron emission tomography (PET) provides important information for biological research and clinical diagnosis. Indirect and direct methods have been developed for reconstructing linear parametric images from dynamic PET data. Indirect methods are relatively simple and easy to implement because the image reconstruction and kinetic modeling are performed in two separate steps. Direct methods estimate parametric images directly from raw PET data and are statistically more efficient. However, the convergence rate of direct algorithms can be slow due to the coupling between the reconstruction and kinetic modeling. Here we present two fast gradient-type algorithms for direct reconstruction of linear parametric images. The new algorithms decouple the reconstruction and linear parametric modeling at each iteration by employing the principle of optimization transfer. Convergence speed is accelerated by running more sub-iterations of linear parametric estimation because the computation cost of the linear parametric modeling is much less than that of the image reconstruction. Computer simulation studies demonstrated that the new algorithms converge much faster than the traditional expectation maximization (EM) and the preconditioned conjugate gradient algorithms for dynamic PET.

  17. An algorithm for 4D CT image sorting using spatial continuity.

    Science.gov (United States)

    Li, Chen; Liu, Jie

    2013-01-01

    4D CT, which can locate the position of a moving tumor over the entire respiratory cycle and reduce image artifacts effectively, has been widely used in tumor radiation therapy. Current 4D CT methods require external surrogates of respiratory motion obtained from extra instruments. However, respiratory signals recorded by these external markers may not always accurately represent the internal tumor and organ movements, especially when irregular breathing patterns happen. In this paper we propose a novel automatic 4D CT sorting algorithm that performs without these external surrogates. The sorting algorithm requires collecting the image data with a cine scan protocol. Beginning with the first couch position, images from the adjacent couch position are selected according to spatial continuity. The process is continued until images from all couch positions are sorted and the entire 3D volume is produced. The algorithm is verified with respiratory phantom image data and clinical image data. The primary test results show that the 4D CT images created by our algorithm eliminate motion artifacts effectively and clearly demonstrate the movement of tumor and organs over the breathing period.
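The couch-by-couch selection step can be sketched as a greedy search. This hypothetical simplification treats each candidate as a single 2D slice and uses mean squared difference as the continuity measure; the actual algorithm operates on cine stacks and its exact similarity criterion is not reproduced here.

```python
import numpy as np

def sort_4dct(couch_groups, first_index=0):
    """Greedy spatial-continuity sorting: starting from a chosen image
    at the first couch position, pick from each subsequent couch
    position the candidate that best matches the current end of the
    volume (smallest mean squared difference). `couch_groups` is a list
    of lists of 2D slices, one inner list per couch position."""
    volume = [couch_groups[0][first_index]]
    for candidates in couch_groups[1:]:
        prev = volume[-1]
        errs = [np.mean((prev - c) ** 2) for c in candidates]
        volume.append(candidates[int(np.argmin(errs))])
    return volume
```

Given candidates at matching and mismatched respiratory phases, the greedy rule tracks the spatially continuous chain.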

  18. Enhanced deterministic phase retrieval using a partially developed speckle field

    DEFF Research Database (Denmark)

    Almoro, Percival F.; Waller, Laura; Agour, Mostafa

    2012-01-01

    A technique for enhanced deterministic phase retrieval using a partially developed speckle field (PDSF) and a spatial light modulator (SLM) is demonstrated experimentally. A smooth test wavefront impinges on a phase diffuser, forming a PDSF that is directed to a 4f setup. Two defocused speckle...... intensity measurements are recorded at the output plane corresponding to axially-propagated representations of the PDSF in the input plane. The speckle intensity measurements are then used in a conventional transport of intensity equation (TIE) to reconstruct directly the test wavefront. The PDSF in our...

  19. An intelligent despeckling method for swept source optical coherence tomography images of skin

    Science.gov (United States)

    Adabi, Saba; Mohebbikarkhoran, Hamed; Mehregan, Darius; Conforto, Silvia; Nasiriavanaki, Mohammadreza

    2017-03-01

    Optical coherence tomography (OCT) is a powerful high-resolution imaging method with broad biomedical application. Nonetheless, OCT images suffer from a multiplicative artefact, so-called speckle, which results from the coherent imaging of the system. Digital filters have become a ubiquitous means of speckle reduction. Since there is still room for improvement in OCT despeckling, we propose an intelligent speckle reduction framework based on morphological, textural and optical tissue features, in which a trained network selects the winning filter that adaptively suppresses the speckle noise while preserving the structural information of the OCT signal. These parameters are calculated at different steps of the procedure and fed to the designed artificial neural network decider, which selects the best denoising technique for each segment of the image. Training results show that the dominant filter in the last category is BM3D.

  20. Low dose reconstruction algorithm for differential phase contrast imaging.

    Science.gov (United States)

    Wang, Zhentian; Huang, Zhifeng; Zhang, Li; Chen, Zhiqiang; Kang, Kejun; Yin, Hongxia; Wang, Zhenchang; Stampanoni, Marco

    2011-01-01

    Differential phase contrast imaging computed tomography (DPCI-CT) is a novel x-ray inspection method that reconstructs the distribution of the refractive index rather than the attenuation coefficient in weakly absorbing samples. In this paper, we propose an iterative reconstruction algorithm for DPCI-CT which benefits from compressed sensing theory. We first realize a differential algebraic reconstruction technique (DART) by discretizing the projection process of differential phase contrast imaging into a linear partial derivative matrix. In this way the compressed sensing reconstruction problem of DPCI can be transformed into an already-solved problem in transmission CT imaging. Our algorithm has the potential to reconstruct the refractive index distribution of the sample from highly undersampled projection data, and thus can significantly reduce the dose and inspection time. The proposed algorithm has been validated by numerical simulations and actual experiments.

  1. A novel highly parallel algorithm for linearly unmixing hyperspectral images

    Science.gov (United States)

    Guerra, Raúl; López, Sebastián.; Callico, Gustavo M.; López, Jose F.; Sarmiento, Roberto

    2014-10-01

    Endmember extraction and abundance calculation represent critical steps within the process of linearly unmixing a given hyperspectral image, for two main reasons. The first is the need to compute a set of accurate endmembers in order to further obtain confident abundance maps. The second refers to the huge number of operations involved in these time-consuming processes. This work proposes an algorithm that estimates the endmembers of a hyperspectral image under analysis and its abundances at the same time. The main advantages of this algorithm are its high degree of parallelization and the mathematical simplicity of the operations implemented. The algorithm estimates the endmembers as virtual pixels. In particular, it performs gradient descent to iteratively refine the endmembers and the abundances, reducing the mean square error according to the linear unmixing model. Some mathematical restrictions must be added so that the method converges to a unique and realistic solution; given the nature of the algorithm, these restrictions can be easily implemented. The results obtained with synthetic images demonstrate the good behavior of the proposed algorithm, and the results obtained with the well-known Cuprite dataset also corroborate the benefits of our proposal.
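The joint refinement by gradient descent on the linear mixing model can be sketched as alternating projected gradient steps. This is only a sketch of the general idea: nonnegativity is the sole constraint enforced, whereas the paper adds further restrictions (such as sum-to-one abundances) that are omitted here, and the step size is illustrative.

```python
import numpy as np

def unmix(Y, p, iters=500, lr=1e-3, seed=0):
    """Jointly estimate endmembers E and abundances A by projected
    gradient descent on ||Y - E @ A||_F^2 (the linear mixing model).
    Y: (bands, pixels); p: number of endmembers."""
    rng = np.random.default_rng(seed)
    E = rng.random((Y.shape[0], p))    # endmembers as virtual pixels
    A = rng.random((p, Y.shape[1]))    # abundance map
    for _ in range(iters):
        # gradient step on E, then project onto the nonnegative orthant
        E = np.maximum(E - lr * (E @ A - Y) @ A.T, 0.0)
        # gradient step on A with the updated E
        A = np.maximum(A - lr * E.T @ (E @ A - Y), 0.0)
    return E, A
```

Each update is a dense matrix product followed by an elementwise clamp, which is what makes the scheme highly parallelizable.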

  2. The Magnetic Nanoparticle Movement in Magnetic Fluid Characterized by the Laser Dynamic Speckle Interferometry

    Directory of Open Access Journals (Sweden)

    Xijun Wang

    2014-01-01

    Full Text Available A dual scanning laser speckle interferometry experiment was designed to observe the dynamic behavior of a magnetic fluid actuated by a magnetic field. In order to improve the spatial resolution of the dynamic speckle measurement, phase delay scanning was used to compensate the additional phase variation caused by the transverse scanning. In the experiment on nanoscale magnetic clusters, the correlation coefficients of the temporal dynamic speckle patterns scattered from the nanoparticles within the same time interval were calculated. The speckle produced by the moving magnetic nanoparticle fluid was recorded within the interferometry fringes by a CCD with its lens removed, although the speckle led to a marked loss of light coherence. The results showed that the dynamic properties of the nanoparticle fluid appeared synergistically in the fringe speckles. Analyses of the nanoparticles' relative speed and of the displacement of the speckle pattern within the fringes demonstrated that the nanoparticles moved in a laminar flow in the experiment.

  3. Motion tolerant iterative reconstruction algorithm for cone-beam helical CT imaging

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, Hisashi; Goto, Taiga; Hirokawa, Koichi; Miyazaki, Osamu [Hitachi Medical Corporation, Chiba-ken (Japan). CT System Div.

    2011-07-01

    We have developed a new advanced iterative reconstruction algorithm for cone-beam helical CT. The features of this algorithm are: (a) it uses the separable paraboloidal surrogate (SPS) technique as a foundation for reconstruction to reduce noise and cone-beam artifact, (b) it uses a view weight in the back-projection process to reduce motion artifact. To confirm the improvement of our proposed algorithm over existing algorithms, such as Feldkamp-Davis-Kress (FDK) or SPS, we compared the motion artifact reduction, image noise reduction (standard deviation of CT number), and cone-beam artifact reduction on simulated and clinical data sets. Our results demonstrate that the proposed algorithm dramatically reduces motion artifacts compared with the SPS algorithm, and decreases image noise compared with the FDK algorithm. In addition, the proposed algorithm potentially improves the time resolution of iterative reconstruction. (orig.)

  4. Low-Complexity Compression Algorithm for Hyperspectral Images Based on Distributed Source Coding

    Directory of Open Access Journals (Sweden)

    Yongjian Nian

    2013-01-01

    Full Text Available A low-complexity compression algorithm for hyperspectral images based on distributed source coding (DSC) is proposed in this paper. The proposed distributed compression algorithm can realize both lossless and lossy compression, implemented by applying a scalar quantization strategy to the original hyperspectral images followed by distributed lossless compression. A multilinear regression model is introduced for the distributed lossless compression in order to improve the quality of the side information. The optimal quantization step is determined according to the constraint of correct DSC decoding, which allows the proposed algorithm to achieve near-lossless compression. Moreover, an effective rate-distortion algorithm is introduced to achieve low bit rates. Experimental results show that the compression performance of the proposed algorithm is competitive with that of state-of-the-art compression algorithms for hyperspectral images.
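    The side-information step described above is, at its core, a least-squares prediction of one spectral band from its neighbors. A minimal sketch of such a multilinear regression predictor (not the paper's implementation; the band data and shapes here are invented for illustration):

```python
import numpy as np

def predict_band(prev_bands, target):
    """Least-squares multilinear regression: fit
    target ~ c0 + sum_i c_i * prev_bands[i] across all pixels."""
    X = np.column_stack([b.ravel() for b in prev_bands]
                        + [np.ones(target.size)])
    y = target.ravel()
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return (X @ coeffs).reshape(target.shape)

# Toy data: band b3 is an exact linear mix of b1 and b2,
# so the prediction (the "side information") is near-perfect.
rng = np.random.default_rng(0)
b1, b2 = rng.random((8, 8)), rng.random((8, 8))
b3 = 0.6 * b1 + 0.4 * b2 + 0.01
side_info = predict_band([b1, b2], b3)
```

    In a DSC decoder, this prediction would be formed from already-decoded bands and then corrected by the transmitted bits.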

  5. High-speed computation of the EM algorithm for PET image reconstruction

    International Nuclear Information System (INIS)

    Rajan, K.; Patnaik, L.M.; Ramakrishna, J.

    1994-01-01

    The PET image reconstruction based on the EM algorithm has several attractive advantages over the conventional convolution backprojection algorithms. However, two major drawbacks have impeded its routine use: the long computational time due to slow convergence, and the large memory required for storage of the image, projection data, and probability matrix. In this study, the authors attempt to solve these two problems by parallelizing the EM algorithm on a multiprocessor system. The authors have implemented an extended hypercube (EH) architecture for high-speed computation of the EM algorithm, using commercially available fast floating-point digital signal processor (DSP) chips as the processing elements (PEs). The authors discuss and compare the performance of the EM algorithm on a 386/387 machine, a CD 4360 mainframe, and the EH system. The results show that the computational speed of an EH system using DSP chips as PEs executing the EM image reconstruction algorithm is about 130 times better than that of the CD 4360 mainframe. The EH topology is expandable to a larger number of PEs
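    The iteration being parallelized here is the multiplicative MLEM update, x ← x · Aᵀ(y / Ax) / Aᵀ1. A small serial sketch (illustrative only; the system matrix below is a toy, not real PET geometry):

```python
import numpy as np

def mlem(A, y, n_iter=200):
    """Multiplicative EM (MLEM) update for emission tomography:
    x <- x * A^T(y / (A x)) / A^T 1."""
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])          # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x                          # forward projection
        ratio = np.divide(y, proj, out=np.zeros_like(y), where=proj > 0)
        x *= (A.T @ ratio) / sens             # back-project and update
    return x

# Toy system: 4 measurement bins, 3 voxels, noise-free data
A = np.array([[1., 0., 0.],
              [0., 1., 0.],
              [0., 0., 1.],
              [1., 1., 1.]])
x_true = np.array([2., 1., 3.])
x_rec = mlem(A, A @ x_true)
```

    In a parallel implementation the forward and back-projections are split across PEs, with each PE holding only a slice of the probability matrix plus a global reduction per iteration.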

  6. [Assessment of left ventricular twist in type 2 diabetes mellitus by using two-dimensional ultrasound speckle tracking imaging].

    Science.gov (United States)

    Zhu, Pei-hua; Huang, Jing-yuan; Ye, Meng; Zheng, Zhe-lan

    2014-09-01

    To evaluate left ventricular twist characteristics in patients with type 2 diabetes by using two-dimensional speckle tracking imaging (STI). Ninety-three patients with type 2 diabetes admitted to Zhejiang Hospital from May 2012 to September 2013 were enrolled. According to left ventricular ejection fraction (LVEF), patients were divided into two groups: a normal left ventricular systolic function group (group A, LVEF≥0.50, n=46) and an abnormal left ventricular systolic function group (group B, LVEF<0.50). A consistency check for STI was conducted to assess its stability and reliability. The Peaktw, AVCtw, and MVOtw in group A were significantly higher than those in normal controls (P<0.05; between measurers: 95% consistency limits=-2.8-2.7; within measurer: R=0.964, bias=-0.2, 95% consistency limits=-2.7-2.2). STI can be used for early recognition of abnormal changes of cardiac function in type 2 diabetic patients, with high stability and reliability.

  7. Machine-Learning Algorithms to Automate Morphological and Functional Assessments in 2D Echocardiography.

    Science.gov (United States)

    Narula, Sukrit; Shameer, Khader; Salem Omar, Alaa Mabrouk; Dudley, Joel T; Sengupta, Partho P

    2016-11-29

    Machine-learning models may aid cardiac phenotypic recognition by using features of cardiac tissue deformation. This study investigated the diagnostic value of a machine-learning framework that incorporates speckle-tracking echocardiographic data for automated discrimination of hypertrophic cardiomyopathy (HCM) from the physiological hypertrophy seen in athletes (ATH). Expert-annotated speckle-tracking echocardiographic datasets obtained from 77 ATH and 62 HCM patients were used to develop an automated system. An ensemble machine-learning model with 3 different machine-learning algorithms (support vector machines, random forests, and artificial neural networks) was developed, and a majority voting method was used for conclusive predictions, with further K-fold cross-validation. Feature selection using an information gain (IG) algorithm revealed that volume was the best predictor for differentiating between HCM and ATH (IG = 0.24), followed by mid-left ventricular segmental strain (IG = 0.134) and average longitudinal strain (IG = 0.131). The ensemble machine-learning model showed increased sensitivity and specificity compared with the early-to-late diastolic transmitral velocity ratio, e', and strain; a subgroup analysis was also performed in patients with wall thickness >13 mm. In this subgroup analysis, the automated model continued to show equal sensitivity, but increased specificity, relative to the early-to-late diastolic transmitral velocity ratio, e', and strain. Our results suggest that machine-learning algorithms can assist in the discrimination of physiological versus pathological patterns of hypertrophic remodeling. This effort represents a step toward the development of a real-time, machine-learning-based system for automated interpretation of echocardiographic images, which may help novice readers with limited experience. Copyright © 2016 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
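    The majority-voting step of such an ensemble is simple to state concretely. A sketch with hypothetical 0/1 predictions standing in for the three models' outputs:

```python
import numpy as np

def majority_vote(*predictions):
    """Combine binary (0/1) predictions from several models:
    each sample gets the class predicted by the majority."""
    stacked = np.stack(predictions)
    return (2 * stacked.sum(axis=0) > stacked.shape[0]).astype(int)

# Hypothetical outputs for 4 samples from SVM, random forest, ANN
svm = np.array([1, 0, 1, 1])
rf = np.array([1, 1, 0, 1])
ann = np.array([0, 0, 1, 1])
combined = majority_vote(svm, rf, ann)  # -> [1 0 1 1]
```

    With an odd number of voters there are no ties, which is one reason three base learners is a convenient choice.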

  8. Speckle Tracking and Transthyretin Amyloid Cardiomyopathy

    Directory of Open Access Journals (Sweden)

    Alexandre Marins Rocha

    Full Text Available Abstract Background: Amyloidosis is a disease caused by deposits of insoluble fibrils in extracellular spaces. The most common type of familial amyloidosis is mediated by mutation of transthyretin, especially Val30Met. In cardiac amyloidosis, symptoms and a decrease in ejection fraction may appear only when the prognosis is already poor. Myocardial strain detected by two-dimensional speckle tracking echocardiography can indicate changes in myocardial function at early stages of the disease. Objective: To determine the accuracy of left ventricular longitudinal strain by two-dimensional speckle tracking echocardiography in patients with familial amyloidosis caused by the Val30Met transthyretin mutation. Methods: Eighteen consecutive patients, carriers of the transthyretin mutation, were evaluated by two-dimensional speckle tracking echocardiography, by which myocardial strain curves were obtained, following the American Society of Echocardiography recommendations. Results: Patients were divided into three groups: 1 - Val30Met with cardiac amyloidosis; 2 - Val30Met with extracardiac amyloidosis; 3 - Val30Met without evidence of disease. When the three groups were compared by the Mann-Whitney test, we found a statistically significant difference between groups 1 and 2 in mean longitudinal tension (p=0.01) and mean basal longitudinal strain (p=0.014); in mean longitudinal tension and mean longitudinal strain between groups 1 and 3 (p=0.005); and in the ratio of longitudinal strain of the apical septal segment to longitudinal strain of the basal septum (p=0.041) between groups 2 and 3. Conclusion: Left ventricular longitudinal strain detected by two-dimensional speckle tracking echocardiography is able to diagnose left ventricular dysfunction in early stages of familial amyloidosis caused by the transthyretin Val30Met mutation.

  9. Hypercube algorithms suitable for image understanding in uncertain environments

    International Nuclear Information System (INIS)

    Huntsberger, T.L.; Sengupta, A.

    1988-01-01

    Computer vision in a dynamic environment needs to be fast and able to tolerate incomplete or uncertain intermediate results. An appropriately chosen representation coupled with a parallel architecture addresses both concerns. The wide range of numerical and symbolic processing needed for robust computer vision can only be achieved through a blend of SIMD and MIMD processing techniques. The 1024-element hypercube architecture has these capabilities and was chosen as the test-bed hardware for development of highly parallel computer vision algorithms. This paper presents and analyzes parallel algorithms for color image segmentation and edge detection. These algorithms are part of a recently developed computer vision system which uses multiple-valued logic to represent uncertainty in the imaging process and in intermediate results. Algorithms for the extraction of three-dimensional properties of objects using dynamic scene analysis techniques within the same framework are examined. Results from experimental studies using a 1024-element hypercube implementation of the algorithms as applied to a series of natural scenes are reported

  10. Evaluation of imaging protocol for ECT based on CS image reconstruction algorithm

    International Nuclear Information System (INIS)

    Zhou Xiaolin; Yun Mingkai; Cao Xuexiang; Liu Shuangquan; Wang Lu; Huang Xianchao; Wei Long

    2014-01-01

    Single-photon emission computerized tomography and positron emission tomography are essential medical imaging tools, for which the sampling angle number and scan time should be carefully chosen to give a good compromise between image quality and radiopharmaceutical dose. In this study, the image quality of different acquisition protocols was evaluated via varied angle numbers and count numbers per angle with Monte Carlo simulation data. It was shown that, when similar imaging counts were used, the acquisition counts were more important than the sampling angle number in emission computerized tomography. To further reduce the activity requirement and the scan duration, an iterative image reconstruction algorithm for limited-view and low-dose tomography based on compressed sensing theory has been developed. Total variation regularization was added to the reconstruction process to improve the signal-to-noise ratio and reduce artifacts caused by the limited-angle sampling. Maximization of the likelihood of the estimated image given the measured data and minimization of the total variation of the image are implemented alternately. By using this advanced algorithm, the reconstruction process is able to achieve image quality matching or exceeding that of normal scans with only half of the injected radiopharmaceutical dose. (authors)
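    The total variation penalty in such schemes is just the summed magnitude of neighboring-pixel differences; minimizing it favors piecewise-constant images. A minimal illustration of the penalty term itself (the full alternating reconstruction is beyond this sketch):

```python
import numpy as np

def total_variation(img):
    """Anisotropic total variation: sum of absolute horizontal
    and vertical forward differences."""
    return (np.abs(np.diff(img, axis=1)).sum()
            + np.abs(np.diff(img, axis=0)).sum())

flat = np.full((8, 8), 5.0)                      # uniform image
noisy = flat + np.random.default_rng(1).normal(0.0, 1.0, (8, 8))
tv_flat, tv_noisy = total_variation(flat), total_variation(noisy)
```

    A uniform image has zero TV while noise inflates it, which is why adding this penalty suppresses the streak artifacts that limited-angle sampling produces.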

  11. Clustering Batik Images using Fuzzy C-Means Algorithm Based on Log-Average Luminance

    Directory of Open Access Journals (Sweden)

    Ahmad Sanmorino

    2012-06-01

    Full Text Available Batik is a fabric or clothing made with a special staining technique called wax-resist dyeing, and it is a piece of cultural heritage with high artistic value. In order to improve efficiency and give better semantics to the images, some researchers apply clustering algorithms for managing images before they can be retrieved. Image clustering is a process of grouping images based on their similarity. In this paper we attempt to provide an alternative method of grouping batik images using the fuzzy c-means (FCM) algorithm based on the log-average luminance of the batik. FCM is a clustering algorithm built on fuzzy models that allows each data point to belong to all clusters with different degrees of membership between 0 and 1. Log-average luminance (LAL) is the average value of the lighting in an image; with it, the lighting of one image can be compared to that of another. From the experiments performed, it can be concluded that the fuzzy c-means algorithm can be used for batik image clustering based on the log-average luminance of each image.
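    Log-average luminance is commonly defined as the geometric mean of the pixel luminances, with a small offset to avoid log 0; the paper may use a slightly different normalization, so treat this as an illustrative sketch:

```python
import numpy as np

def log_average_luminance(lum, delta=1e-6):
    """Geometric-mean luminance: exp(mean(log(delta + L)))."""
    return float(np.exp(np.mean(np.log(delta + lum))))

dark = np.full((4, 4), 0.1)    # uniformly dark image
light = np.full((4, 4), 0.8)   # uniformly bright image
lal_dark = log_average_luminance(dark)
lal_light = log_average_luminance(light)
```

    Each batik image can then be represented by its LAL value (or a small feature vector including it) before running FCM.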

  12. A Fuzzy Homomorphic Algorithm for Image Enhancement | Nnolim ...

    African Journals Online (AJOL)

    The implementation and analysis of a novel Fuzzy Homomorphic image enhancement technique are presented. The technique combines the logarithmic transform with fuzzy membership functions to deliver an intuitive method of image enhancement. This algorithm reduces the computational complexity by eliminating the ...

  13. Speckle Interferometry with the McMath-Pierce East Auxiliary Telescope

    Science.gov (United States)

    Harshaw, Richard; Ray, Jimmy; Douglass, David; Prause, Lori; Genet, Russell

    2015-09-01

    Engineering runs and tests on the McMath-Pierce 0.8 meter East Auxiliary telescope successfully configured the telescope for speckle interferometry observations of close visual double stars. This paper reports the procedure and results of the speckle analysis of four double stars.

  14. Three-dimensional speckle tracking imaging assessment of left ventricular change in patient with coronary heart disease and its correlation with serum indexes

    Directory of Open Access Journals (Sweden)

    Jian-Li Fu

    2016-10-01

    Full Text Available Objective: To analyze three-dimensional speckle tracking imaging assessment of left ventricular change in patients with coronary heart disease and its correlation with serum indexes. Methods: A total of 152 patients first diagnosed with coronary heart disease formed the observation group of the study, and 117 healthy subjects formed the control group. Three-dimensional speckle tracking imaging (3D-STI) was used to evaluate the left ventricular function parameters of the two groups, the serum content of endothelial function indexes and platelet function indexes was measured, and the correlation between left ventricular function parameters under 3D-STI and serum indexes was further analyzed. Results: The absolute values of the left ventricular function parameters LVGLS, LVGRS, LVGCS and LVGAS from 3D-STI in the observation group were significantly lower than those in the control group, while Ptw and Torsion levels were greater; the serum content of the endothelial function indexes vWF, sICAM-1, sVCAM-1 and ET-1 was significantly higher than in the control group, while vWF-cp and NO content was significantly lower; the serum content of the platelet function indexes CD62P, GMP-140, CD63, sP-selectin, sCD40L and PAC-1 was significantly higher than in the control group. The levels of left ventricular function parameters from 3D-STI in patients with coronary heart disease were directly correlated with the serum indexes. Conclusion: 3D-STI can accurately assess left ventricular function and overall disease severity in patients with coronary heart disease, and it is expected to become an effective method for early diagnosis of disease and guidance of clinical treatment.

  15. Low-dose multiple-information retrieval algorithm for X-ray grating-based imaging

    International Nuclear Information System (INIS)

    Wang Zhentian; Huang Zhifeng; Chen Zhiqiang; Zhang Li; Jiang Xiaolei; Kang Kejun; Yin Hongxia; Wang Zhenchang; Stampanoni, Marco

    2011-01-01

    The present work proposes a low-dose information retrieval algorithm for the X-ray grating-based multiple-information imaging (GB-MII) method, which can retrieve the attenuation, refraction and scattering information of samples from only three images. The algorithm aims at reducing the exposure time and the dose delivered to the sample. The multiple-information retrieval problem in GB-MII is solved by transforming a set of nonlinear equations into linear ones and exploiting the properties of the trigonometric functions. The proposed algorithm is validated by experiments on both a conventional X-ray source and a synchrotron X-ray source, and compared with the traditional multiple-image-based retrieval algorithm. The experimental results show that our algorithm is comparable with the traditional retrieval algorithm and especially suitable for high signal-to-noise systems.

  16. FPGA implementation of image dehazing algorithm for real time applications

    Science.gov (United States)

    Kumar, Rahul; Kaushik, Brajesh Kumar; Balasubramanian, R.

    2017-09-01

    Weather degradation such as haze, fog and mist severely reduces the effective range of visual surveillance. This degradation is a spatially varying phenomenon, which makes the problem non-trivial. Dehazing is an essential preprocessing stage in applications such as long-range imaging, border security and intelligent transportation systems. However, these applications require low latency of the preprocessing block. In this work, the single-image dark channel prior algorithm is modified and implemented for fast processing with comparable visual quality of the restored image/video. Although the conventional single-image dark channel prior algorithm is computationally expensive, it yields impressive results. Moreover, a two-stage image dehazing architecture is introduced, wherein the dark channel and airlight are estimated in the first stage, while the transmission map and intensity restoration are computed in the next stages. The algorithm is implemented using Xilinx Vivado software and validated using a Xilinx zc702 development board, which contains an Artix7-equivalent Field Programmable Gate Array (FPGA) and an ARM Cortex A9 dual-core processor. Additionally, a high definition multimedia interface (HDMI) has been incorporated for video feed and display purposes. The results show that the dehazing algorithm attains 29 frames per second at an image resolution of 1920x1080, which is suitable for real-time applications. The design utilizes 9 18K BRAMs, 97 DSP48s, 6508 FFs and 8159 LUTs.
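    The first stage of the pipeline, dark channel and airlight estimation, can be sketched in a few lines as a software reference model (not the FPGA implementation; the patch size and the brightest-0.1% airlight rule follow the common formulation of the dark channel prior):

```python
import numpy as np

def dark_channel(img, patch=3):
    """Per-pixel minimum over color channels, then a local
    minimum filter over a patch x patch neighborhood."""
    mins = img.min(axis=2)
    pad = patch // 2
    padded = np.pad(mins, pad, mode='edge')
    h, w = mins.shape
    out = np.empty_like(mins)
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return out

def estimate_airlight(img, dark, frac=0.001):
    """Mean color of the brightest dark-channel pixels."""
    n = max(1, int(dark.size * frac))
    idx = np.argsort(dark.ravel())[-n:]
    return img.reshape(-1, 3)[idx].mean(axis=0)

# Saturated blue-green test image: red channel is zero everywhere,
# so the dark channel is zero (as the prior predicts for haze-free scenes)
hazy = np.dstack([np.zeros((5, 5)), np.ones((5, 5)), np.ones((5, 5))])
dc = dark_channel(hazy)
airlight = estimate_airlight(hazy, dc)
```

    The hardware version replaces the nested loops with pipelined line buffers, which is what makes the 29 fps throughput feasible.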

  17. Color Image Encryption Algorithm Based on TD-ERCS System and Wavelet Neural Network

    Directory of Open Access Journals (Sweden)

    Kun Zhang

    2015-01-01

    Full Text Available In order to solve the security problem of transmitting images across public networks, a new image encryption algorithm based on the TD-ERCS system and a wavelet neural network is proposed in this paper. Image encryption is achieved through a permutation process and a binary XOR operation with the chaotic series produced by the TD-ERCS system and the wavelet neural network. The encryption algorithm is reversible: the original image is recovered by applying the inverse of the encryption process. Finally, computer simulation results show that the new chaotic encryption algorithm based on the TD-ERCS system and wavelet neural network is valid and offers higher security.
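    The XOR stage of such ciphers is easy to illustrate. The sketch below substitutes a logistic map for the TD-ERCS/wavelet-neural-network keystream (the real generator is more complex); the point is the reversibility of XOR masking:

```python
import numpy as np

def keystream(seed, n, r=3.99):
    """Byte keystream from a chaotic logistic map iteration
    (a simple stand-in for the paper's TD-ERCS series)."""
    x, out = seed, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) & 0xFF)
    return np.array(out, dtype=np.uint8)

def xor_image(img, seed=0.37):
    """Reversible masking: applying the same XOR twice restores img."""
    ks = keystream(seed, img.size).reshape(img.shape)
    return img ^ ks

img = np.arange(64, dtype=np.uint8).reshape(8, 8)
encrypted = xor_image(img)
decrypted = xor_image(encrypted)
```

    Because XOR is its own inverse, decryption is the same routine with the same seed, which is what makes the scheme reversible.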

  18. TRANSFORMATION ALGORITHM FOR IMAGES OBTAINED BY OMNIDIRECTIONAL CAMERAS

    Directory of Open Access Journals (Sweden)

    V. P. Lazarenko

    2015-01-01

    Full Text Available Omnidirectional optoelectronic systems find their application in areas where a wide viewing angle is critical. However, omnidirectional optoelectronic systems have a large distortion that makes their application more difficult. The paper compares the projection functions of traditional perspective lenses and omnidirectional wide-angle fish-eye lenses with a viewing angle not less than 180°. This comparison proves that the distortion models of omnidirectional cameras cannot be described as a deviation from the classic pinhole camera model. To solve this problem, an algorithm for transforming omnidirectional images has been developed. The paper provides a brief comparison of the four calibration methods available in open-source toolkits for omnidirectional optoelectronic systems. The geometrical projection model used for calibration of the omnidirectional optical system is given. The algorithm consists of three basic steps. At the first step, we calculate the field of view of a virtual pinhole PTZ camera. This field of view is characterized by an array of 3D points in the object space. At the second step, the array of pixels corresponding to these three-dimensional points is calculated. Then we calculate the projection function that expresses the relation between a given 3D point in the object space and the corresponding pixel. In this paper we use a calibration procedure providing the projection function for the calibrated instance of the camera. At the last step, the final image is formed pixel-by-pixel from the original omnidirectional image using the calculated array of 3D points and the projection function. The developed algorithm makes it possible to obtain an image for a part of the field of view of an omnidirectional optoelectronic system with corrected distortion from the original omnidirectional image. The algorithm is designed for operation with omnidirectional optoelectronic systems with both catadioptric and fish-eye lenses.
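    As a concrete example of a projection function for such a system, the equidistant fish-eye model r = f·θ maps a 3D ray to a pixel; real calibrations typically fit a polynomial instead, and the intrinsics f, cx, cy below are hypothetical:

```python
import numpy as np

def equidistant_project(point, f, cx, cy):
    """Equidistant fish-eye model: a ray at angle theta from the
    optical axis lands at radius r = f * theta from the center."""
    x, y, z = point[..., 0], point[..., 1], point[..., 2]
    theta = np.arctan2(np.hypot(x, y), z)   # angle from optical axis
    phi = np.arctan2(y, x)                  # azimuth in the image plane
    r = f * theta
    return cx + r * np.cos(phi), cy + r * np.sin(phi)

# A ray straight along the optical axis maps to the image center
u, v = equidistant_project(np.array([0.0, 0.0, 1.0]),
                           f=100.0, cx=320.0, cy=240.0)
```

    The transformation algorithm described above runs this mapping in reverse order: for every target pixel of the virtual pinhole camera it builds the 3D ray, applies the calibrated projection function, and samples the omnidirectional image at the resulting coordinates.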

  19. Sensitivity Analysis of the Scattering-Based SARBM3D Despeckling Algorithm.

    Science.gov (United States)

    Di Simone, Alessio

    2016-06-25

    Synthetic Aperture Radar (SAR) imagery greatly suffers from multiplicative speckle noise, typical of coherent image acquisition sensors, such as SAR systems. Therefore, a proper and accurate despeckling preprocessing step is almost mandatory to aid the interpretation and processing of SAR data by human users and computer algorithms, respectively. Very recently, a scattering-oriented version of the popular SAR Block-Matching 3D (SARBM3D) despeckling filter, named Scattering-Based (SB)-SARBM3D, was proposed. The new filter is based on the a priori knowledge of the local topography of the scene. In this paper, an experimental sensitivity analysis of the above-mentioned despeckling algorithm is carried out, and the main results are shown and discussed. In particular, the role of both electromagnetic and geometrical parameters of the surface and the impact of its scattering behavior are investigated. Furthermore, a comprehensive sensitivity analysis of the SB-SARBM3D filter against the Digital Elevation Model (DEM) resolution and the SAR image-DEM coregistration step is also provided. The sensitivity analysis shows a significant robustness of the algorithm against most of the surface parameters, while the DEM resolution plays a key role in the despeckling process. Furthermore, the SB-SARBM3D algorithm outperforms the original SARBM3D in the presence of the most realistic scattering behaviors of the surface. An actual scenario is also presented to assess the DEM role in real-life conditions.

  20. Linear array implementation of the EM algorithm for PET image reconstruction

    International Nuclear Information System (INIS)

    Rajan, K.; Patnaik, L.M.; Ramakrishna, J.

    1995-01-01

    The PET image reconstruction based on the EM algorithm has several attractive advantages over the conventional convolution backprojection algorithms. However, the PET image reconstruction based on the EM algorithm is computationally burdensome for today's single-processor systems. In addition, a large memory is required for storage of the image, projection data, and probability matrix. Since the computations are easily divided into tasks executable in parallel, multiprocessor configurations are the ideal choice for fast execution of the EM algorithms. In this study, the authors attempt to overcome these two problems by parallelizing the EM algorithm on a multiprocessor system. The parallel EM algorithm has been implemented on a linear array topology using commercially available fast floating-point digital signal processor (DSP) chips as the processing elements (PEs). The performance of the EM algorithm on a 386/387 machine, an IBM 6000 RISC workstation, and the linear array system is discussed and compared. The results show that the computational speed of a linear array using 8 DSP chips as PEs executing the EM image reconstruction algorithm is about 15.5 times better than that of the IBM 6000 RISC workstation. The novelty of the scheme is its simplicity. The linear array topology is expandable to a larger number of PEs. The architecture is not dependent on the DSP chip chosen, and substitution of the latest DSP chip is straightforward and could yield better speed performance

  1. Sonar Image Enhancements for Improved Detection of Sea Mines

    DEFF Research Database (Denmark)

    Jespersen, Karl; Sørensen, Helge Bjarup Dissing; Zerr, Benoit

    1999-01-01

    In this paper, five methods for enhancing sonar images prior to automatic detection of sea mines are investigated. Two of the methods have previously been published in connection with detection systems and serve as reference. The three new enhancement approaches are a variance-stabilizing log transform, nonlinear filtering, and pixel averaging for speckle reduction. The effect of the enhancement step is tested by using the full processing chain, i.e. enhancement, detection and thresholding, to determine the number of detections and false alarms. Substituting different enhancement algorithms in the processing chain gives a precise measure of the performance of the enhancement stage. The test is performed using a sonar image database with images ranging from very simple to very complex. The result of the comparison indicates that the new enhancement approaches improve the detection performance.

  2. Strain dyssynchrony index determined by three-dimensional speckle area tracking can predict response to cardiac resynchronization therapy

    Directory of Open Access Journals (Sweden)

    Onishi Tetsuari

    2011-04-01

    Full Text Available Abstract Background We have previously reported that the strain dyssynchrony index assessed by two-dimensional speckle tracking strain, a marker of both dyssynchrony and residual myocardial contractility, can predict response to cardiac resynchronization therapy (CRT). A newly developed three-dimensional (3-D) speckle tracking system can quantify the endocardial area change ratio (area strain), which combines both longitudinal and circumferential strain, from all 16 standard left ventricular (LV) segments using complete 3-D pyramidal datasets. Our objective was to test the hypothesis that a strain dyssynchrony index using area tracking (ASDI) can quantify dyssynchrony and predict response to CRT. Methods We studied 14 heart failure patients with ejection fraction of 27 ± 7% (all ≤ 35%) and QRS duration of 172 ± 30 ms (all ≥ 120 ms) who underwent CRT. Echocardiography was performed before and 6 months after CRT. ASDI was calculated as the average difference between peak and end-systolic area strain of the LV endocardium obtained from 3-D speckle tracking imaging using 16 segments. Conventional dyssynchrony measures were assessed by interventricular mechanical delay, Yu Index, and two-dimensional radial dyssynchrony by speckle-tracking strain. Response was defined as a ≥ 15% decrease in LV end-systolic volume 6 months after CRT. Results ASDI ≥ 3.8% was the best predictor of response to CRT, with a sensitivity of 78%, specificity of 100% and area under the curve (AUC) of 0.93. Conclusions ASDI can predict responders and LV reverse remodeling following CRT. This novel index using the 3-D speckle tracking system, which shows circumferential and longitudinal LV dyssynchrony and residual endocardial contractility, may thus have clinical significance for CRT patients.

  3. Wavelet denoising of multiframe optical coherence tomography data.

    Science.gov (United States)

    Mayer, Markus A; Borsdorf, Anja; Wagner, Martin; Hornegger, Joachim; Mardin, Christian Y; Tornow, Ralf P

    2012-03-01

    We introduce a novel speckle noise reduction algorithm for OCT images. Contrary to present approaches, the algorithm does not rely on simple averaging of multiple image frames or denoising of the final averaged image. Instead it uses wavelet decompositions of the single frames for a local noise and structure estimation. Based on this analysis, the wavelet detail coefficients are weighted, averaged and reconstructed. At a signal-to-noise gain of about 100%, we observe only a minor sharpness decrease, measured as a full-width-at-half-maximum reduction of 10.5%. While a similar signal-to-noise gain would require averaging of 29 frames, we achieve this result using only 8 frames as input to the algorithm. A possible application of the proposed algorithm is preprocessing in retinal structure segmentation algorithms, to allow a better differentiation between real tissue information and unwanted speckle noise.
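    The core idea, transform each frame, attenuate detail coefficients that look like noise, then reconstruct, can be sketched with a one-level Haar transform and soft thresholding (the paper instead derives coefficient weights from multi-frame noise estimates; the signal here is synthetic):

```python
import numpy as np

def haar_1d(x):
    """One-level 1-D Haar transform: approximation and detail."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def ihaar_1d(a, d):
    """Inverse one-level Haar transform."""
    out = np.empty(a.size * 2)
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out

def soft_threshold(d, t):
    """Shrink small detail coefficients toward zero (noise suppression)."""
    return np.sign(d) * np.maximum(np.abs(d) - t, 0.0)

step = np.array([4.0, 4.0, 4.0, 4.0, 10.0, 10.0, 10.0, 10.0])
noisy = step + np.array([0.1, -0.1, 0.1, -0.1, 0.1, -0.1, 0.1, -0.1])
a, d = haar_1d(noisy)
denoised = ihaar_1d(a, soft_threshold(d, 0.2))
```

    The large step edge survives in the approximation band while the small alternating noise lives entirely in the detail band, so thresholding removes it without blurring the edge.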

  4. A novel algorithm for segmentation of brain MR images

    International Nuclear Information System (INIS)

    Sial, M.Y.; Yu, L.; Chowdhry, B.S.; Rajput, A.Q.K.; Bhatti, M.I.

    2006-01-01

    Accurate and fully automatic segmentation of the brain from magnetic resonance (MR) scans is a challenging problem that has received an enormous amount of attention lately. Many researchers have applied various techniques; however, the standard fuzzy c-means algorithm has produced better results than other methods. In this paper, we present a modified fuzzy c-means (FCM) based algorithm for segmentation of brain MR images. Our algorithm is formulated by modifying the objective function of the standard FCM and uses a special spread method to obtain a smooth and slowly varying bias field. This method has the advantage that it can be applied at an early stage in an automated data analysis, before a tissue model is available. The results on MRI images show that this method provides better results than standard FCM algorithms. (author)

  5. In vivo visualization method by absolute blood flow velocity based on speckle and fringe pattern using two-beam multipoint laser Doppler velocimetry

    Energy Technology Data Exchange (ETDEWEB)

    Kyoden, Tomoaki, E-mail: kyouden@nc-toyama.ac.jp; Naruki, Shoji; Akiguchi, Shunsuke; Momose, Noboru; Homae, Tomotaka; Hachiga, Tadashi [National Institute of Technology, Toyama College, 1-2 Ebie-Neriya, Imizu, Toyama 933-0293 (Japan); Ishida, Hiroki [Department of Applied Physics, Faculty of Science, Okayama University of Science, 1-1 Ridai-cho, Okayama 700-0005 (Japan); Andoh, Tsugunobu [Graduate School of Medicine and Pharmaceutical Sciences, University of Toyama, 2630 Sugitani, Toyama 930-0194 (Japan); Takada, Yogo [Graduate School of Engineering, Osaka City University, 3-3-138 Sugimoto, Sumiyoshi, Osaka 558-8585 (Japan)

    2016-08-28

    Two-beam multipoint laser Doppler velocimetry (two-beam MLDV) is a non-invasive imaging technique able to provide an image of two-dimensional blood flow and has potential for observing cancer as previously demonstrated in a mouse model. In two-beam MLDV, the blood flow velocity can be estimated from red blood cells passing through a fringe pattern generated in the skin. The fringe pattern is created at the intersection of two beams in conventional LDV and two-beam MLDV. Being able to choose the depth position is an advantage of two-beam MLDV, and the position of a blood vessel can be identified in a three-dimensional space using this technique. Initially, we observed the fringe pattern in the skin, and the undeveloped or developed speckle pattern generated in a deeper position of the skin. The validity of the absolute velocity value detected by two-beam MLDV was verified while changing the number of layers of skin around a transparent flow channel. The absolute velocity value independent of direction was detected using the developed speckle pattern, which is created by the skin construct and two beams in the flow channel. Finally, we showed the relationship between the signal intensity and the fringe pattern, undeveloped speckle, or developed speckle pattern based on the skin depth. The Doppler signals were not detected at deeper positions in the skin, which qualitatively indicates the depth limit for two-beam MLDV.

  6. Metal artifact reduction algorithm based on model images and spatial information

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Jay [Institute of Radiological Science, Central Taiwan University of Science and Technology, Taichung, Taiwan (China); Shih, Cheng-Ting [Department of Biomedical Engineering and Environmental Sciences, National Tsing-Hua University, Hsinchu, Taiwan (China); Chang, Shu-Jun [Health Physics Division, Institute of Nuclear Energy Research, Taoyuan, Taiwan (China); Huang, Tzung-Chi [Department of Biomedical Imaging and Radiological Science, China Medical University, Taichung, Taiwan (China); Sun, Jing-Yi [Institute of Radiological Science, Central Taiwan University of Science and Technology, Taichung, Taiwan (China); Wu, Tung-Hsin, E-mail: tung@ym.edu.tw [Department of Biomedical Imaging and Radiological Sciences, National Yang-Ming University, No.155, Sec. 2, Linong Street, Taipei 112, Taiwan (China)

    2011-10-01

    Computed tomography (CT) has become one of the most favored choices for the diagnosis of trauma. However, high-density metal implants can induce metal artifacts in CT images, compromising image quality. In this study, we proposed a model-based metal artifact reduction (MAR) algorithm. First, we built a model image using the k-means clustering technique with spatial information and calculated the difference between the original image and the model image. Then, the projection data of these two images were combined using an exponential weighting function. Finally, the corrected image was reconstructed using the filtered back-projection algorithm. Two metal-artifact-contaminated images were studied. For the cylindrical water phantom image, the metal artifact was effectively removed. The mean CT number of water was improved from -28.95±97.97 to -4.76±4.28. For the clinical pelvic CT image, the dark band and the metal line were removed, and the continuity and uniformity of the soft tissue were recovered as well. These results indicate that the proposed MAR algorithm is useful for reducing metal artifacts and could improve the diagnostic value of metal-artifact-contaminated CT images.
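A minimal numpy sketch of the two model-building ingredients described above: k-means clustering with spatial features, and an exponential weighting between original and model projection data. The feature scaling, weighting form, and parameter values are illustrative assumptions, not the paper's exact formulas.

```python
import numpy as np

def kmeans_spatial(image, k=3, spatial_weight=0.1, iters=20, seed=0):
    """Build a piecewise-constant model image by k-means clustering on
    (intensity, x, y) features, mimicking the 'spatial information' idea."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    feats = np.stack([image.ravel(),
                      spatial_weight * xs.ravel(),
                      spatial_weight * ys.ravel()], axis=1)
    rng = np.random.default_rng(seed)
    centers = feats[rng.choice(len(feats), size=k, replace=False)]
    for _ in range(iters):
        # Assign each pixel to the nearest centroid in feature space.
        d = ((feats[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = feats[labels == j].mean(axis=0)
    # Model image: each pixel takes its cluster's mean intensity.
    model = centers[labels, 0].reshape(h, w)
    return model, labels.reshape(h, w)

def blend_projections(p_orig, p_model, alpha=3.0):
    """Combine original and model projection data with an exponential
    weight that trusts the model more where the two disagree strongly."""
    diff = np.abs(p_orig - p_model)
    weight = np.exp(-alpha * diff / (diff.max() + 1e-9))
    return weight * p_orig + (1.0 - weight) * p_model
```

In the full pipeline the blended sinogram would then be passed to filtered back-projection; that reconstruction step is omitted here.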

  7. Stochastic parallel gradient descent based adaptive optics used for a high contrast imaging coronagraph

    International Nuclear Information System (INIS)

    Dong Bing; Ren Deqing; Zhang Xi

    2011-01-01

    An adaptive optics (AO) system based on a stochastic parallel gradient descent (SPGD) algorithm is proposed to reduce the speckle noise in the optical system of a stellar coronagraph in order to further improve the contrast. The principle of the SPGD algorithm is described briefly and a metric suitable for point-source imaging optimization is given. The feasibility and good performance of the SPGD algorithm are demonstrated with an experimental system featuring a 140-actuator deformable mirror and a Hartmann-Shack wavefront sensor. The SPGD-based AO is then applied to a liquid crystal array (LCA) based coronagraph to improve the contrast. The LCA can modulate the incoming light to generate a pupil apodization mask of any pattern. A circular stepped pattern is used in our preliminary experiment, and the image contrast improves from 10^-3 to 10^-4.5 at an angular distance of 2λ/D after correction by the SPGD-based AO.
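The SPGD update described above (perturb all actuator channels in parallel, measure the metric at plus and minus the perturbation, and step along the estimated gradient) can be sketched as follows; the gain, perturbation size, and toy metric are illustrative, not the experiment's values.

```python
import numpy as np

def spgd(metric, u0, gain=1.0, delta=0.1, iters=500, seed=0):
    """Stochastic parallel gradient descent: apply a random +/- perturbation
    to every actuator channel at once and step along the estimated gradient
    of the metric (which is to be maximized)."""
    rng = np.random.default_rng(seed)
    u = np.asarray(u0, dtype=float).copy()
    for _ in range(iters):
        # Bernoulli perturbation applied to all channels in parallel.
        du = delta * rng.choice([-1.0, 1.0], size=u.shape)
        j_plus = metric(u + du)
        j_minus = metric(u - du)
        # Gradient estimate: (J+ - J-) is proportional to grad(J) . du.
        u += gain * (j_plus - j_minus) * du
    return u

# Toy metric standing in for point-source image sharpness: maximized
# when the "actuator" vector reaches the target shape.
target = np.array([0.3, -0.2, 0.5, 0.1])
u_opt = spgd(lambda u: -np.sum((u - target) ** 2), np.zeros(4))
```

In the real system the metric would be computed from the camera image (e.g. encircled energy of the point source) rather than from a known target.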

  8. FCM Clustering Algorithms for Segmentation of Brain MR Images

    Directory of Open Access Journals (Sweden)

    Yogita K. Dubey

    2016-01-01

    Full Text Available The study of brain disorders requires accurate tissue segmentation of magnetic resonance (MR) brain images, which is very important for detecting tumors, edema, and necrotic tissues. Segmentation of brain images, especially into the three main tissue types: cerebrospinal fluid (CSF), gray matter (GM), and white matter (WM), has an important role in computer-aided neurosurgery and diagnosis. Brain images mostly contain noise, intensity inhomogeneity, and weak boundaries. Therefore, accurate segmentation of brain images is still a challenging area of research. This paper presents a review of fuzzy c-means (FCM) clustering algorithms for the segmentation of brain MR images. The review covers a detailed analysis of FCM-based algorithms with intensity inhomogeneity correction and noise robustness. Different methods for modifying the standard fuzzy objective function, with updated membership and cluster centroids, are also discussed.
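The standard FCM iteration that these variants extend alternates a membership update and a centroid update. A minimal numpy sketch follows; the quantile initialization is an assumption chosen for determinism, and none of the inhomogeneity-correction or noise-robustness terms from the reviewed algorithms are included.

```python
import numpy as np

def fcm(data, c=3, m=2.0, iters=100, tol=1e-6):
    """Standard fuzzy c-means: alternate membership and centroid updates
    of the FCM objective with fuzzifier m > 1."""
    x = np.asarray(data, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    # Deterministic, spread-out initialization via data quantiles.
    centers = np.quantile(x, np.linspace(0.0, 1.0, c), axis=0)
    u_prev = None
    for _ in range(iters):
        # Membership update from inverse distances to the centroids.
        d = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        u = 1.0 / d ** (2.0 / (m - 1.0))
        u /= u.sum(axis=1, keepdims=True)
        # Centroid update: weighted mean with fuzzified memberships.
        w = u ** m
        centers = (w.T @ x) / w.sum(axis=0)[:, None]
        if u_prev is not None and np.abs(u - u_prev).max() < tol:
            break
        u_prev = u
    return u, centers
```

For brain MR segmentation the same loop would run on voxel intensities with c = 3 (CSF, GM, WM), typically after skull stripping.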

  9. A Modified Image Comparison Algorithm Using Histogram Features

    OpenAIRE

    Al-Oraiqat, Anas M.; Kostyukova, Natalya S.

    2018-01-01

    This article discusses the problem of color image content comparison. In particular, methods of image content comparison are analyzed, restrictions of the color histogram are described, and a modified method of image content comparison is proposed. This method uses color histograms and considers color locations. The base and modified algorithms are tested and analyzed. The modified method shows 97% average precision for a collection containing about 700 images without loss of the adv...
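One common way to make a histogram comparison sensitive to color location, in the spirit of the modified method above, is to compute per-block histograms on a spatial grid and intersect them. The block grid, bin count, and intersection score here are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def block_histograms(img, blocks=2, bins=8):
    """Concatenate normalized RGB histograms of a blocks x blocks grid,
    so the descriptor also encodes where colors occur in the image."""
    h, w, _ = img.shape
    feats = []
    for by in range(blocks):
        for bx in range(blocks):
            tile = img[by * h // blocks:(by + 1) * h // blocks,
                       bx * w // blocks:(bx + 1) * w // blocks]
            hist, _ = np.histogramdd(tile.reshape(-1, 3),
                                     bins=(bins,) * 3,
                                     range=((0, 256),) * 3)
            feats.append(hist.ravel() / hist.sum())
    return np.concatenate(feats)

def similarity(f1, f2):
    """Average histogram intersection over all blocks (1.0 = identical)."""
    return np.minimum(f1, f2).sum() / f1.sum()
```

Two images with the same global color distribution but different layouts now score lower than a plain (location-blind) histogram comparison would give them.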

  10. Study on the algorithm of computational ghost imaging based on discrete fourier transform measurement matrix

    Science.gov (United States)

    Zhang, Leihong; Liang, Dong; Li, Bei; Kang, Yi; Pan, Zilan; Zhang, Dawei; Gao, Xiumin; Ma, Xiuhua

    2016-07-01

    On the basis of analyzing the cosine light field with a determined analytic expression and the pseudo-inverse method, the object is illuminated by a preset light field with a determined discrete Fourier transform (DFT) measurement matrix, and the object image is reconstructed by the pseudo-inverse method. The analytic expression of the computational ghost imaging algorithm based on the DFT measurement matrix is deduced theoretically and compared with the compressive computational ghost imaging algorithm based on a random measurement matrix. The reconstruction process and the reconstruction error are analyzed, and simulations are performed to verify the theoretical analysis. When the number of sampling measurements is close to the number of object pixels, the rank of the DFT matrix equals that of the random measurement matrix, the PSNR of images reconstructed by the FGI and PGI algorithms is similar, and the reconstruction error of the traditional CGI algorithm is lower than that of the FGI and PGI algorithms. As the number of sampling measurements decreases, the PSNR of images reconstructed by the FGI algorithm decreases slowly, whereas the PSNR for the PGI and CGI algorithms decreases sharply. The reconstruction time of the FGI algorithm is lower than that of the other algorithms and is not affected by the number of sampling measurements. The FGI algorithm can effectively filter out random white noise through a low-pass filter, realizing reconstruction denoising with a higher denoising capability than the CGI algorithm. The FGI algorithm thus improves both the reconstruction accuracy and the reconstruction speed of computational ghost imaging.
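The measure-then-pseudo-invert pipeline described above can be sketched in a few lines. Using complex DFT rows directly as illumination patterns is a mathematical simplification of the physical (cosine) light field, so this is only an illustration of the linear-algebra core, not the paper's exact FGI scheme.

```python
import numpy as np

def ghost_reconstruct(obj, n_meas):
    """Simulate computational ghost imaging with a DFT measurement matrix:
    record bucket values y = A x, then reconstruct with the pseudo-inverse."""
    x = np.asarray(obj, dtype=float).ravel()
    n = x.size
    m = np.arange(n_meas)[:, None]
    k = np.arange(n)[None, :]
    A = np.exp(-2j * np.pi * m * k / n)   # first n_meas rows of the DFT matrix
    y = A @ x                             # simulated bucket-detector readings
    x_hat = np.linalg.pinv(A) @ y         # Moore-Penrose pseudo-inverse solve
    return x_hat.real.reshape(np.shape(obj))
```

With full sampling (n_meas equal to the pixel count) the matrix is invertible and reconstruction is exact; with fewer measurements the pseudo-inverse returns the minimum-norm solution, whose error behaves as the abstract describes.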

  11. Iterative Object Localization Algorithm Using Visual Images with a Reference Coordinate

    Directory of Open Access Journals (Sweden)

    We-Duke Cho

    2008-09-01

    Full Text Available We present a simplified algorithm for localizing an object using multiple visual images that are obtained from widely used digital imaging devices. We use a parallel projection model which supports both zooming and panning of the imaging devices. Our proposed algorithm is based on a virtual viewable plane for creating a relationship between an object position and a reference coordinate. The reference point is obtained from a rough estimate which may be obtained from the pre-estimation process. The algorithm minimizes localization error through an iterative process with relatively low computational complexity. In addition, nonlinear distortion of the digital imaging devices is compensated during the iterative process. Finally, the performance of several scenarios is evaluated and analyzed in both indoor and outdoor environments.

  12. ProxImaL: efficient image optimization using proximal algorithms

    KAUST Repository

    Heide, Felix

    2016-07-11

    Computational photography systems are becoming increasingly diverse, while computational resources (for example, on mobile platforms) are rapidly increasing. As diverse as these camera systems may be, slightly different variants of the underlying image processing tasks, such as demosaicking, deconvolution, denoising, inpainting, image fusion, and alignment, are shared between all of these systems. Formal optimization methods have recently been demonstrated to achieve state-of-the-art quality for many of these applications. Unfortunately, different combinations of natural image priors and optimization algorithms may be optimal for different problems, and implementing and testing each combination is currently a time-consuming and error-prone process. ProxImaL is a domain-specific language and compiler for image optimization problems that makes it easy to experiment with different problem formulations and algorithm choices. The language uses proximal operators as the fundamental building blocks of a variety of linear and nonlinear image formation models and cost functions, advanced image priors, and noise models. The compiler intelligently chooses the best way to translate a problem formulation and choice of optimization algorithm into an efficient solver implementation. In applications to the image processing pipeline, deconvolution in the presence of Poisson-distributed shot noise, and burst denoising, we show that a few lines of ProxImaL code can generate highly efficient solvers that achieve state-of-the-art results. We also show applications to the nonlinear and nonconvex problem of phase retrieval.
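The style of solver ProxImaL generates can be illustrated with the simplest proximal method: proximal gradient descent with a soft-thresholding prox. This is a generic sketch of a proximal building block, not ProxImaL's DSL or compiler output.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||x||_1 -- the kind of building block
    proximal algorithms compose."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_gradient_denoise(y, lam=0.5, step=1.0, iters=50):
    """Proximal gradient (ISTA) for min_x 0.5 * ||x - y||^2 + lam * ||x||_1.
    For this simple quadratic data term the minimizer is a single
    soft-threshold of y, which makes the iteration easy to verify."""
    x = np.zeros_like(y)
    for _ in range(iters):
        grad = x - y                                   # gradient of the data term
        x = soft_threshold(x - step * grad, step * lam)  # prox of the prior
    return x
```

Swapping in a different data term (e.g. a blur operator) or a different prox (e.g. total variation) changes the problem without changing the algorithmic skeleton, which is essentially the flexibility the ProxImaL language formalizes.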

  13. A Cognitive Machine Learning Algorithm for Cardiac Imaging: A Pilot Study for Differentiating Constrictive Pericarditis from Restrictive Cardiomyopathy

    Science.gov (United States)

    Sengupta, Partho P.; Huang, Yen-Min; Bansal, Manish; Ashrafi, Ali; Fisher, Matt; Shameer, Khader; Gall, Walt; Dudley, Joel T

    2016-01-01

    Background Associating a patient’s profile with the memories of prototypical patients built through previous repeat clinical experience is a key process in clinical judgment. We hypothesized that a similar process using a cognitive computing tool would be well suited for learning and recalling multidimensional attributes of speckle tracking echocardiography (STE) data sets derived from patients with known constrictive pericarditis (CP) and restrictive cardiomyopathy (RCM). Methods and Results Clinical and echocardiographic data of 50 patients with CP and 44 with RCM were used for developing an associative memory classifier (AMC) based machine learning algorithm. The STE data was normalized in reference to 47 controls with no structural heart disease, and the diagnostic area under the receiver operating characteristic curve (AUC) of the AMC was evaluated for differentiating CP from RCM. Using only STE variables, AMC achieved a diagnostic AUC of 89.2%, which improved to 96.2% with addition of 4 echocardiographic variables. In comparison, the AUC of early diastolic mitral annular velocity and left ventricular longitudinal strain were 82.1% and 63.7%, respectively. Furthermore, AMC demonstrated greater accuracy and shorter learning curves than other machine learning approaches, with accuracy asymptotically approaching 90% after a training fraction of 0.3 and remaining flat at higher training fractions. Conclusions This study demonstrates feasibility of a cognitive machine learning approach for learning and recalling patterns observed during echocardiographic evaluations. Incorporation of machine learning algorithms in cardiac imaging may aid standardized assessments and support the quality of interpretations, particularly for novice readers with limited experience. PMID:27266599

  14. Anisotropic conductivity imaging with MREIT using equipotential projection algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Degirmenci, Evren [Department of Electrical and Electronics Engineering, Mersin University, Mersin (Turkey); Eyueboglu, B Murat [Department of Electrical and Electronics Engineering, Middle East Technical University, 06531, Ankara (Turkey)

    2007-12-21

    Magnetic resonance electrical impedance tomography (MREIT) combines magnetic flux or current density measurements obtained by magnetic resonance imaging (MRI) with surface potential measurements to reconstruct images of true conductivity with high spatial resolution. Most biological tissues have anisotropic conductivity; therefore, anisotropy should be taken into account in conductivity image reconstruction. Almost all MREIT reconstruction algorithms proposed to date assume an isotropic conductivity distribution. In this study, a novel MREIT image reconstruction algorithm is proposed to image anisotropic conductivity. Relative anisotropic conductivity values are reconstructed iteratively, using only current density measurements without any potential measurement. To obtain true conductivity values, a single potential or conductivity measurement is sufficient to determine a scaling factor. The proposed technique is evaluated on simulated data for isotropic and anisotropic conductivity distributions, with and without measurement noise. Simulation results show that images of both anisotropic and isotropic conductivity distributions can be reconstructed successfully.

  15. Performance evaluation of 2D image registration algorithms with the numeric image registration and comparison platform

    International Nuclear Information System (INIS)

    Gerganov, G.; Kuvandjiev, V.; Dimitrova, I.; Mitev, K.; Kawrakow, I.

    2012-01-01

    The objective of this work is to present the capabilities of the NUMERICS web platform for evaluation of the performance of image registration algorithms. The NUMERICS platform is a web-accessible tool which provides access to dedicated numerical algorithms for registration and comparison of medical images (http://numerics.phys.uni-sofia.bg). The platform allows comparison of noisy medical images by means of different types of image comparison algorithms, which are based on statistical tests for outliers. The platform also allows 2D image registration with different techniques, such as elastic thin-plate spline registration, registration based on rigid transformations, affine transformations, as well as non-rigid image registration based on Mobius transformations. In this work we demonstrate how the platform can be used as a tool for evaluation of the quality of the image registration process. We demonstrate performance evaluation of a deformable image registration technique based on Mobius transformations. The transformations are applied with appropriate cost functions such as mutual information, correlation coefficient, and sum of squared differences. The emphasis is on the results provided by the platform to the user and their interpretation in the context of the performance evaluation of 2D image registration. The NUMERICS image registration and image comparison platform provides detailed statistical information about submitted image registration jobs and can be used to perform quantitative evaluation of the performance of different image registration techniques. (authors)
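The three cost functions named above (mutual information, correlation coefficient, sum of squared differences) can be computed as follows; the histogram-based MI estimator with 16 bins is one common choice and an assumption here, not necessarily the platform's implementation.

```python
import numpy as np

def ssd(a, b):
    """Sum of squared differences (lower = more similar)."""
    return np.sum((a - b) ** 2)

def correlation_coefficient(a, b):
    """Pearson correlation between the flattened images."""
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

def mutual_information(a, b, bins=16):
    """Histogram-based mutual information between two images (in nats)."""
    h, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = h / h.sum()                       # joint distribution
    px = p.sum(axis=1, keepdims=True)     # marginal of image a
    py = p.sum(axis=0, keepdims=True)     # marginal of image b
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz]))
```

A registration loop would evaluate one of these on the overlap of the fixed and warped moving image and feed the value to the chosen optimizer.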

  16. Static speckle experiments using white synchrotron radiation

    Energy Technology Data Exchange (ETDEWEB)

    Sant, Tushar; Panzner, Tobias; Pietsch, Ullrich [Solid State Physics Group, University of Siegen (Germany)

    2008-07-01

    Static speckle experiments were performed using coherent white X-ray radiation from a bending magnet at BESSY II. Semiconductor and polymer surfaces were investigated at incidence angles smaller than the critical angle of total external reflection. The scattering pattern of the sample results from the illumination function modified by the surface undulations. The periodic oscillations are caused by the illumination function, whereas other irregular features are associated with the sample surface. The speckle map of reflection from a laterally periodic structure, such as a GaAs grating, is studied. Under coherent illumination the grating peaks split into speckles because of fluctuations on the sample surface. It is important to understand which length scales on the sample surface are responsible for the oscillations in the reflectivity map. To investigate this, experiments were done with a triangular-shaped sample. Different parts of the sample were illuminated, with the footprint on the sample larger or smaller than the actual sample length. This gives prior information about the total illuminated area of the sample. Using this additional information, a detailed surface profile of the sample is reconstructed.

  17. Otsu Based Optimal Multilevel Image Thresholding Using Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    N. Sri Madhava Raja

    2014-01-01

    Full Text Available A histogram-based multilevel thresholding approach is proposed using a Brownian distribution (BD) guided firefly algorithm (FA). A bounded search technique is also presented to improve the optimization accuracy with fewer search iterations. Otsu's between-class variance function is maximized to obtain optimal threshold levels for gray-scale images. The performance of the proposed algorithm is demonstrated on twelve benchmark images and compared with existing FA variants such as Lévy flight (LF) guided FA and random operator guided FA. The performance comparison between the proposed and existing firefly algorithms is carried out using prevailing parameters such as the objective function, standard deviation, peak signal-to-noise ratio (PSNR), structural similarity (SSIM) index, and CPU search time. The results show that BD guided FA provides a better objective function, PSNR, and SSIM, whereas LF-based FA provides faster convergence with relatively lower CPU time.
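Otsu's between-class variance objective that the firefly algorithm maximizes can be written down directly. For a small number of thresholds an exhaustive search (used here in place of FA, purely for illustration) finds the same optimum the guided search aims for.

```python
import numpy as np
from itertools import combinations

def between_class_variance(hist, thresholds):
    """Otsu's between-class variance for a gray-level histogram split at
    the given threshold indices -- the objective FA maximizes."""
    p = hist / hist.sum()
    levels = np.arange(len(p))
    mu_total = np.sum(levels * p)
    edges = [0, *sorted(thresholds), len(p)]
    var_b = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        w = p[lo:hi].sum()                 # class probability
        if w > 0:
            mu = np.sum(levels[lo:hi] * p[lo:hi]) / w  # class mean
            var_b += w * (mu - mu_total) ** 2
    return var_b

def otsu_multilevel(hist, k=2):
    """Brute-force search over k thresholds (feasible only for small k);
    the firefly algorithm replaces this with a guided stochastic search."""
    best, best_t = -1.0, None
    for t in combinations(range(1, len(hist)), k):
        v = between_class_variance(hist, t)
        if v > best:
            best, best_t = v, t
    return best_t, best
```

For a 256-level histogram with several thresholds the brute-force search explodes combinatorially, which is exactly why metaheuristics such as BD- or LF-guided FA are used.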

  18. Frequency-domain imaging algorithm for ultrasonic testing by application of matrix phased arrays

    Directory of Open Access Journals (Sweden)

    Dolmatov Dmitry

    2017-01-01

    Full Text Available The constantly increasing demand for high-performance materials and systems in the aerospace industry requires advanced methods of nondestructive testing. One of the most promising methods is ultrasonic imaging using matrix phased arrays. This technique allows three-dimensional ultrasonic images with high lateral resolution to be created. Further progress in matrix phased array ultrasonic testing depends on the development of fast imaging algorithms. In this article, an imaging algorithm based on frequency-domain calculations is proposed. This approach is computationally efficient in comparison with time-domain algorithms. The performance of the proposed algorithm was tested via computer simulations for a planar specimen with flat-bottom holes.

  19. Speckles generated by skewed, short-coherence light beams

    International Nuclear Information System (INIS)

    Brogioli, D; Salerno, D; Ziano, R; Mantegazza, F; Croccolo, F

    2011-01-01

    When a coherent laser beam impinges on a random sample (e.g. a colloidal suspension), the scattered light exhibits characteristic speckles. If the temporal coherence of the light source is too short, then the speckles disappear, along with the possibility of performing homodyne or heterodyne scattering detection or photon correlation spectroscopy. Here we investigate the scattering of a so-called ‘skewed coherence beam’, i.e. a short-coherence beam modified such that the field is coherent within slabs that are skewed with respect to the wave fronts. We show that such a beam generates speckles and can be used for heterodyne scattering detection, despite its short temporal coherence. Moreover, we show that the heterodyne signal is not affected by multiple scattering. We suggest that the phenomenon presented here can be used as a means of carrying out heterodyne scattering measurement with any short-coherence radiation, including x-rays. (paper)

  20. Algorithm of pulmonary emphysema extraction using thoracic 3D CT images

    Science.gov (United States)

    Saita, Shinsuke; Kubo, Mitsuru; Kawata, Yoshiki; Niki, Noboru; Nakano, Yasutaka; Ohmatsu, Hironobu; Tominaga, Keigo; Eguchi, Kenji; Moriyama, Noriyuki

    2007-03-01

    Recently, due to aging populations and smoking, the number of emphysema patients has been increasing. Alveoli destroyed by emphysema cannot be restored, so early detection of emphysema is desirable. We describe a quantitative algorithm for extracting emphysematous lesions and quantitatively evaluating their distribution patterns using low-dose thoracic 3-D CT images. The algorithm identifies lung anatomies and extracts low attenuation areas (LAA) as emphysematous lesion candidates. By applying the algorithm to thoracic 3-D CT images and to follow-up 3-D CT images, we demonstrate its potential effectiveness in assisting radiologists and physicians to quantitatively evaluate the distribution of emphysematous lesions and their evolution over time.
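At its core, the low attenuation area (LAA) extraction step reduces to thresholding CT numbers inside the lung. A minimal sketch follows; the -950 HU cutoff is a commonly used emphysema criterion and an assumption here, not necessarily the paper's exact value, and real pipelines first need a lung segmentation step to produce the mask.

```python
import numpy as np

def laa_percent(ct_hu, lung_mask, threshold=-950):
    """LAA%: the fraction of lung voxels whose CT number (in Hounsfield
    units) falls below the low-attenuation threshold."""
    lung = ct_hu[lung_mask]
    return 100.0 * np.count_nonzero(lung < threshold) / lung.size
```

Follow-up scans can then be compared by tracking LAA% (and its regional distribution) over time, as the abstract describes.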

  1. Automatic brightness control algorithms and their effect on fluoroscopic imaging

    International Nuclear Information System (INIS)

    Quinn, P.W.; Gagne, R.M.

    1989-01-01

    This paper reports a computer model used to investigate the effect on dose and image quality of three automatic brightness control (ABC) algorithms used in the imaging of barium during general-purpose fluoroscopy. A model incorporating all aspects of image formation - i.e., x-ray production, phantom attenuation, and energy absorption in the CsI phosphor - was driven according to each ABC algorithm as a function of patient thickness. The energy absorbed in the phosphor was kept constant, while the changes in exposure, integral dose, organ dose, and contrast were monitored

  2. Successive approximation algorithm for cancellation of artifacts in DSA images

    International Nuclear Information System (INIS)

    Funakami, Raiko; Hiroshima, Kyoichi; Nishino, Junji

    2000-01-01

    In this paper, we propose an algorithm for cancellation of artifacts in DSA images. We have already proposed an automatic registration method based on the detection of local movements. When motion of the object is large, it is difficult to estimate the exact movement, and the cancellation of artifacts may therefore fail. The algorithm we propose here is based on a simple rigid model. We present the results of applying the proposed method to a series of experimental X-ray images, as well as the results of applying the algorithm as preprocessing for a registration method based on local movement. (author)

  3. Algorithms for detection of objects in image sequences captured from an airborne imaging system

    Science.gov (United States)

    Kasturi, Rangachar; Camps, Octavia; Tang, Yuan-Liang; Devadiga, Sadashiva; Gandhi, Tarak

    1995-01-01

    This research was initiated as a part of the effort at the NASA Ames Research Center to design a computer vision based system that can enhance the safety of navigation by aiding the pilots in detecting various obstacles on the runway during critical section of the flight such as a landing maneuver. The primary goal is the development of algorithms for detection of moving objects from a sequence of images obtained from an on-board video camera. Image regions corresponding to the independently moving objects are segmented from the background by applying constraint filtering on the optical flow computed from the initial few frames of the sequence. These detected regions are tracked over subsequent frames using a model based tracking algorithm. Position and velocity of the moving objects in the world coordinate is estimated using an extended Kalman filter. The algorithms are tested using the NASA line image sequence with six static trucks and a simulated moving truck and experimental results are described. Various limitations of the currently implemented version of the above algorithm are identified and possible solutions to build a practical working system are investigated.

  4. An Improved Recovery Algorithm for Decayed AES Key Schedule Images

    Science.gov (United States)

    Tsow, Alex

    A practical algorithm that recovers AES key schedules from decayed memory images is presented. Halderman et al. [1] established this recovery capability, dubbed the cold-boot attack, as a serious vulnerability for several widespread software-based encryption packages. Our algorithm recovers AES-128 key schedules tens of millions of times faster than the original proof-of-concept release. In practice, it enables reliable recovery of key schedules at 70% decay, well over twice the decay capacity of previous methods. The algorithm is generalized to AES-256 and is empirically shown to recover 256-bit key schedules that have suffered 65% decay. When solutions are unique, the algorithm efficiently validates this property and outputs the solution for memory images decayed up to 60%.

  5. Three-dimensional displacement measurement by fringe projection and speckle photography

    International Nuclear Information System (INIS)

    Barrientos, B.; Garcia-Marquez, J.; Cerca, M.; Hernandez-Bernal, C.

    2008-01-01

    3D displacement fields are measured by the combination of two optical methods: fringe projection and speckle photography. The use of only one camera to record the necessary information means that no calibration procedures are necessary, as is the case in techniques based on stereoscopy. The out-of-plane displacement is measured by fringe projection, whereas speckle photography yields the 2D in-plane component. To show the feasibility of the technique, we analyze in detail the morphological spatio-temporal evolution of a model of the Earth's crust subjected to compression forces. The results show that the combination of fringe projection and speckle photography is well suited to this type of study

  6. Dynamics of laser speckle imaging of blood flow and morphological changes in tissues with a full time local ischemia of pancreas

    Directory of Open Access Journals (Sweden)

    Alexandrov D.A.

    2014-12-01

    Full Text Available The purpose: to establish the influence of full ischemia of different durations and subsequent reperfusion on the development of pathology in the pancreas of rats by means of laser speckle visualization and intravital digital microscopy. Materials and Methods. The work was performed on 42 white Wistar rats weighing 200-250 g. Blood flow properties were studied by laser Doppler flowmetry, digital biomicroscopy, and laser speckle-contrast visualization. Results. After the end of a 5-minute full ischemia, blood flow velocity increased 2-3 times, and clinically marked pancreatic necrosis did not develop. After the end of a 20-minute full ischemia, blood flow velocity did not increase, and morphological and clinical signs of pancreatic necrosis appeared. Conclusion. The efficiency of monitoring the microhemodynamics of the pancreas in rats by full-field speckle-contrast imaging was shown. Multidirectional phases of perfusion changes in the pancreas were revealed after reversible disruption of the blood supply of different durations.

  7. Review of speckle observations of Supernova 1987A

    International Nuclear Information System (INIS)

    Meikle, W.P.S.

    1988-01-01

    SN 1987A is sufficiently close to allow a unique examination of the morphology of a supernova, using speckle interferometry. Several groups [Center for Astrophysics (CfA); Imperial College (IC); Mount Stromlo and Siding Spring Observatories/Anglo-Australian Observatory (M/A)] have reported optical speckle observations. At Hα, both CfA and M/A have determined the angular extent of the emission, and reasonable agreement is obtained. The speckle-derived values are consistent with those obtained from line profiles. IC has also succeeded in resolving the supernova at Hα. At wavelengths other than Hα, at early epochs, angular diameters obtained by CfA are larger than those derived from photometric and spectroscopic measurements, possibly due to scattering effects. At later epochs, the diameters exhibit little variation between the wavelengths examined. CfA reports significant asymmetry in the late epoch data. Several attempts have been made to re-observe (at optical wavelengths) the companion object, but none have succeeded. The nature of this phenomenon is still controversial, but the evidence indicates that the companion was real, with emission from dust apparently being the least problematic explanation. Support for this may lie in IR speckle observations (Haute Provence/Lyon) which, on about day 115, indicated the presence of one or more resolved components at an angular displacement comparable to that of the optical companion. 39 refs., 1 fig., 1 tab

  8. An automated algorithm for photoreceptors counting in adaptive optics retinal images

    Science.gov (United States)

    Liu, Xu; Zhang, Yudong; Yun, Dai

    2012-10-01

    Eyes are important human organs that detect light and form spatial and color vision. Knowing the exact number of cones in a retinal image is of great importance in helping us understand the mechanism of eye function and the pathology of some eye diseases. In order to analyze data in real time and process large-scale data, an automated algorithm was designed to label cone photoreceptors in adaptive optics (AO) retinal images. Images acquired by a flood-illuminated AO system were used to test the efficiency of this algorithm. We labeled these images both automatically and manually and compared the results of the two methods. A 94.1% to 96.5% agreement rate between the two methods was achieved in this experiment, which demonstrates the reliability and efficiency of the algorithm.
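A minimal version of automated cone labeling is local-maximum detection above a brightness threshold. Real AO images would also need smoothing and minimum-spacing constraints, so this numpy sketch is only a stand-in for the paper's algorithm.

```python
import numpy as np

def count_cones(img, threshold=0.5):
    """Label candidate cone photoreceptors as pixels that are local maxima
    of their 8-neighbourhood and brighter than a threshold."""
    # Pad with -inf so border pixels can be compared with 8 neighbours.
    p = np.pad(img, 1, mode='constant', constant_values=-np.inf)
    center = p[1:-1, 1:-1]
    is_max = np.ones_like(img, dtype=bool)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            neighbour = p[1 + dy:p.shape[0] - 1 + dy,
                          1 + dx:p.shape[1] - 1 + dx]
            is_max &= center >= neighbour
    peaks = is_max & (img > threshold)
    return int(peaks.sum()), np.argwhere(peaks)
```

Agreement with manual counts, as reported above, would be assessed by comparing the returned coordinates with hand-labeled cone positions.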

  9. Hypoperfusion Induced by Preconditioning Treadmill Training in Hyper-Early Reperfusion After Cerebral Ischemia: A Laser Speckle Imaging Study.

    Science.gov (United States)

    He, Zhijie; Lu, Hongyang; Yang, Xiaojiao; Zhang, Li; Wu, Yi; Niu, Wenxiu; Ding, Li; Wang, Guili; Tong, Shanbao; Jia, Jie

    2018-01-01

    Exercise preconditioning induces neuroprotective effects during cerebral ischemia and reperfusion, which involves the recovery of cerebral blood flow (CBF). Mechanisms underlying the neuroprotective effects of re-established CBF following ischemia and reperfusion are unclear. The present study investigated CBF in hyper-early stage of reperfusion by laser speckle contrast imaging, a full-field high-resolution optical imaging technique. Rats with or without treadmill training were subjected to middle cerebral artery occlusion followed by reperfusion. CBF in arteries, veins, and capillaries in hyper-early stage of reperfusion (1, 2, and 3 h after reperfusion) and in subacute stage (24 h after reperfusion) were measured. Neurological scoring and 2,3,5-triphenyltetrazolium chloride staining were further applied to determine the neuroprotective effects of exercise preconditioning. In hyper-early stage of reperfusion, CBF in the rats with exercise preconditioning was reduced significantly in arteries and veins, respectively, compared to rats with no exercise preconditioning. Capillary CBF remained stable in the hyper-early stage of reperfusion, though it increased significantly 24 h after reperfusion in the rats with exercise preconditioning. As a neuroprotective strategy, exercise preconditioning reduced the blood perfusion of arteries and veins in the hyper-early stage of reperfusion, which indicated intervention-induced neuroprotective hypoperfusion after reperfusion onset.

  10. A Review of Surface Deformation and Strain Measurement Using Two-Dimensional Digital Image Correlation

    Directory of Open Access Journals (Sweden)

    Khoo Sze-Wei

    2016-09-01

    Full Text Available Among the full-field optical measurement methods, Digital Image Correlation (DIC) is one of the techniques that has received particular attention. Technically, the DIC technique is a non-contact strain measurement method that mathematically compares the grey-intensity changes of images captured at two different states: before and after deformation. The measurement is performed by numerically calculating the displacement of speckles deposited on top of the object's surface. In this paper, Two-Dimensional Digital Image Correlation (2D-DIC) is presented and its fundamental concepts are discussed. Next, the development of 2D-DIC algorithms over the past 33 years is reviewed systematically. The improvement of 2D-DIC algorithms is presented with respect to two distinct aspects: computation efficiency and measurement accuracy. Furthermore, an analysis of 2D-DIC accuracy is included, followed by a review of DIC applications in two-dimensional measurements.
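The subset-matching principle described above can be sketched at integer-pixel precision: compare a reference subset with shifted subsets of the deformed image using zero-normalized cross-correlation (ZNCC). Function names and parameters here are illustrative assumptions; production DIC codes add subpixel refinement and shape functions.

```python
import numpy as np

def dic_displacement(ref, cur, center, half=5, search=8):
    """Integer-pixel 2D-DIC sketch: track one square subset.

    The subset around `center` in the reference image is compared with
    shifted subsets of the deformed image using zero-normalized
    cross-correlation (ZNCC); the best-scoring shift (dy, dx) is the
    measured displacement.
    """
    cy, cx = center
    f = ref[cy - half:cy + half + 1, cx - half:cx + half + 1].astype(float)
    f = f - f.mean()
    best, best_score = (0, 0), -np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y0, x0 = cy + dy - half, cx + dx - half
            y1, x1 = y0 + 2 * half + 1, x0 + 2 * half + 1
            if y0 < 0 or x0 < 0 or y1 > cur.shape[0] or x1 > cur.shape[1]:
                continue  # shifted subset falls outside the image
            g = cur[y0:y1, x0:x1].astype(float)
            g = g - g.mean()
            denom = np.sqrt((f * f).sum() * (g * g).sum())
            if denom == 0:
                continue  # featureless subset, ZNCC undefined
            score = (f * g).sum() / denom
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best
```

A dense displacement field (and from it, strain) is obtained by repeating this for a grid of subset centers.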

  11. GPU-based parallel algorithm for blind image restoration using midfrequency-based methods

    Science.gov (United States)

    Xie, Lang; Luo, Yi-han; Bao, Qi-liang

    2013-08-01

    GPU-based general-purpose computing is a new branch of modern parallel computing, so the study of parallel algorithms specially designed for the GPU hardware architecture is of great significance. To address the high computational complexity and poor real-time performance of blind image restoration, the midfrequency-based algorithm for blind image restoration was analyzed and improved in this paper. Furthermore, a midfrequency-based filtering method is used to restore the image with hardly any recursion or iteration. Combining the algorithm's data intensiveness and data-parallel character with the GPU execution model of single instruction, multiple threads, a new parallel midfrequency-based algorithm for blind image restoration is proposed, which is suitable for GPU stream computing. In this algorithm, the GPU is utilized to accelerate the estimation of class-G point spread functions and the midfrequency-based filtering. For better management of the GPU threads, the threads in a grid are scheduled according to the decomposition of the filtering data in the frequency domain, after optimization of data access and of communication between host and device. The kernel parallelism structure is determined by the decomposition of the filtering data so that the transmission rate is not constrained by the memory bandwidth. The results show that the new algorithm significantly increases operational speed and effectively improves the real-time performance of image restoration, especially for high-resolution images.

  12. Medical image registration by combining global and local information: a chain-type diffeomorphic demons algorithm

    International Nuclear Information System (INIS)

    Liu, Xiaozheng; Yuan, Zhenming; Zhu, Junming; Xu, Dongrong

    2013-01-01

    The demons algorithm is a popular algorithm for non-rigid image registration because of its computational efficiency and simple implementation. The deformation forces of the classic demons algorithm are derived from image gradients, with the deformation chosen to decrease the intensity dissimilarity between images. However, methods that rely on intensity differences for medical image registration are easily affected by image artifacts such as noise, non-uniform imaging, and partial volume effects. The gradient magnitude image is constructed from local image information, so differences in gradient magnitude images are more reliable and robust against these artifacts. Registering medical images by considering differences in both image intensity and gradient magnitude is therefore a natural choice. In this paper, based on a diffeomorphic demons algorithm, we propose a chain-type diffeomorphic demons algorithm that combines the differences in both image intensity and gradient magnitude for medical image registration. Previous work has shown that the classic demons algorithm can be considered an approximation of a second-order gradient descent on the sum of squared intensity differences. By optimizing the new dissimilarity criteria, we also present a set of new demons forces derived from the gradients of the image and of the gradient magnitude image. We show, in controlled experiments, that this advantage is confirmed and yields fast convergence. (paper)
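For reference, the classic per-pixel demons force that this paper extends can be written down in a few lines. This is a sketch of the standard Thirion force only; the chain-type variant above adds an analogous term computed on gradient-magnitude images.

```python
import numpy as np

def demons_force(fixed, moving):
    """Classic (Thirion) demons force field in 2D, per pixel:

        u = (m - f) * grad(f) / (|grad(f)|^2 + (m - f)^2)

    pushing the moving image toward the fixed image along the fixed
    image's gradient. Intensity-driven force only (see note above).
    """
    f = np.asarray(fixed, dtype=float)
    m = np.asarray(moving, dtype=float)
    gy, gx = np.gradient(f)          # axis-0 and axis-1 derivatives
    diff = m - f
    denom = gy ** 2 + gx ** 2 + diff ** 2
    safe = np.where(denom > 0, denom, 1.0)   # avoid 0/0; force is 0 there
    uy = diff * gy / safe
    ux = diff * gx / safe
    return uy, ux
```

In a full registration loop this force field would be smoothed (e.g. Gaussian-filtered) and composed with the current deformation at each iteration.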

  13. Strain measurement of abdominal aortic aneurysm with real-time 3D ultrasound speckle tracking.

    Science.gov (United States)

    Bihari, P; Shelke, A; Nwe, T H; Mularczyk, M; Nelson, K; Schmandra, T; Knez, P; Schmitz-Rixen, T

    2013-04-01

    Abdominal aortic aneurysm rupture is caused by mechanical vascular tissue failure. Although mechanical properties within the aneurysm vary, currently available ultrasound methods assess only one cross-sectional segment of the aorta. This study aims to establish real-time 3-dimensional (3D) speckle tracking ultrasound to explore local displacement and strain parameters of the whole abdominal aortic aneurysm. Validation was performed on a silicone aneurysm model, perfused in a pulsatile artificial circulatory system. Wall motion of the silicone model was measured simultaneously with a commercial real-time 3D speckle tracking ultrasound system and either with laser-scan micrometry or with video photogrammetry. After validation, 3D ultrasound data were collected from abdominal aortic aneurysms of five patients and displacement and strain parameters were analysed. Displacement parameters measured in vitro by 3D ultrasound and laser scan micrometer or video analysis were significantly correlated at pulse pressures between 40 and 80 mmHg. Strong local differences in displacement and strain were identified within the aortic aneurysms of patients. Local wall strain of the whole abdominal aortic aneurysm can be analysed in vivo with real-time 3D ultrasound speckle tracking imaging, offering the prospect of individual non-invasive rupture risk analysis of abdominal aortic aneurysms. Copyright © 2013 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.

  14. Algorithms of image processing in nuclear medicine

    International Nuclear Information System (INIS)

    Oliveira, V.A.

    1990-01-01

    The problem of image restoration from noisy measurements as encountered in Nuclear Medicine is considered. A new approach for treating the measurements wherein they are represented by a spatial noncausal interaction model prior to maximum entropy restoration is given. This model describes the statistical dependence among the image values and their neighbourhood. The particular application of the algorithms presented here relates to gamma ray imaging systems, and is aimed at improving the resolution-noise suppression product. Results for actual gamma camera data are presented and compared with more conventional techniques. (author)

  15. Analysis of statistical properties of laser speckles, forming in skin and mucous of colon: potential application in laser surgery

    Science.gov (United States)

    Rubtsov, Vladimir; Kapralov, Sergey; Chalyk, Iuri; Ulianova, Onega; Ulyanov, Sergey

    2013-02-01

    Statistical properties of laser speckles formed in skin and in the mucosa of the colon have been analyzed and compared. It has been demonstrated that the first- and second-order statistics of "skin" speckles and "mucosal" speckles are quite different. It is shown that speckles formed in the mucosa are not Gaussian. The layered structure of the colon mucosa causes the formation of speckled speckles. First- and second-order statistics of speckled speckles are reviewed in this paper. Statistical properties of Fresnel and Fraunhofer doubly scattered and cascade speckles are described. The non-Gaussian statistics of biospeckles may lead to high localization of coherent-light intensity in human tissue during laser surgery. A way of suppressing highly localized non-Gaussian speckles is suggested.
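The Gaussian baseline against which such speckles are compared is easy to simulate: for fully developed speckle (a sum of many independent random phasors), intensity is negative-exponentially distributed and the contrast std/mean tends to 1. This sketch, with assumed parameter names, checks that first-order signature; "speckled speckle" would show a contrast above 1.

```python
import numpy as np

def speckle_contrast(n_scatterers=100, n_samples=20000, seed=0):
    """Contrast C = std(I)/mean(I) of a simulated random phasor sum.

    For fully developed (Gaussian) speckle the intensity follows a
    negative-exponential law, so C -> 1; departures from C = 1 are a
    first-order signature of non-Gaussian speckle statistics.
    """
    rng = np.random.default_rng(seed)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=(n_samples, n_scatterers))
    field = np.exp(1j * phases).sum(axis=1) / np.sqrt(n_scatterers)
    intensity = np.abs(field) ** 2
    return intensity.std() / intensity.mean()
```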

  16. Evaluation of clinical image processing algorithms used in digital mammography.

    Science.gov (United States)

    Zanca, Federica; Jacobs, Jurgen; Van Ongeval, Chantal; Claus, Filip; Celis, Valerie; Geniets, Catherine; Provost, Veerle; Pauwels, Herman; Marchal, Guy; Bosmans, Hilde

    2009-03-01

    Screening is the only proven approach to reduce the mortality of breast cancer, but significant numbers of breast cancers remain undetected even when all quality assurance guidelines are implemented. With the increasing adoption of digital mammography systems, image processing may be a key factor in the imaging chain. Although to our knowledge statistically significant effects of manufacturer-recommended image processing algorithms have not previously been demonstrated, the subjective experience of our radiologists that apparent image quality can vary considerably between different algorithms motivated this study. This article addresses the impact of five such algorithms on the detection of clusters of microcalcifications. A database of unprocessed (raw) images of 200 normal digital mammograms, acquired with the Siemens Novation DR, was collected retrospectively. Realistic simulated microcalcification clusters were inserted in half of the unprocessed images. All unprocessed images were subsequently processed with five manufacturer-recommended image processing algorithms (Agfa Musica 1, IMS Raffaello Mammo 1.2, Sectra Mamea AB Sigmoid, Siemens OPVIEW v2, and Siemens OPVIEW v1). Four breast imaging radiologists were asked to locate and score the clusters in each image on a five-point rating scale. The free-response data were analyzed by the jackknife free-response receiver operating characteristic (JAFROC) method and, for comparison, also with the receiver operating characteristic (ROC) method. JAFROC analysis revealed highly significant differences between the image processing algorithms (F = 8.51, p < 0.0001), suggesting that image processing strongly impacts the detectability of clusters. Siemens OPVIEW2 and Siemens OPVIEW1 yielded the highest and lowest performances, respectively. ROC analysis of the data also revealed significant differences between the processing algorithms, but at lower significance (F = 3.47, p = 0.0305) than JAFROC. Both statistical analysis methods revealed that the

  17. A novel high-frequency encoding algorithm for image compression

    Science.gov (United States)

    Siddeq, Mohammed M.; Rodrigues, Marcos A.

    2017-12-01

    In this paper, a new method for image compression is proposed whose quality is demonstrated through accurate 3D reconstruction from 2D images. The method is based on the discrete cosine transform (DCT) together with a high-frequency minimization encoding algorithm at the compression stage and a new concurrent binary search algorithm at the decompression stage. The proposed compression method consists of five main steps: (1) divide the image into blocks and apply DCT to each block; (2) apply a high-frequency minimization method to the AC coefficients, reducing each block by 2/3 and resulting in a minimized array; (3) build a lookup table of probability data to enable the recovery of the original high frequencies at the decompression stage; (4) apply a delta or differential operator to the list of DC components; and (5) apply arithmetic encoding to the outputs of steps (2) and (4). At the decompression stage, the lookup table and the concurrent binary search algorithm are used to reconstruct all high-frequency AC coefficients, while the DC components are decoded by reversing the arithmetic coding. Finally, the inverse DCT recovers the original image. We tested the technique by compressing and decompressing 2D images, including images with structured light patterns for 3D reconstruction. The technique is compared with JPEG and JPEG2000 through 2D and 3D RMSE. Results demonstrate that the proposed compression method is perceptually superior to JPEG, with quality equivalent to JPEG2000. Concerning 3D surface reconstruction from images, the proposed method is superior to both JPEG and JPEG2000.
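Steps (1) and (4) of the pipeline above, the blockwise DCT and the differential coding of DC components, can be sketched directly; the high-frequency minimization, lookup table, and arithmetic coder are specific to the paper and are not reproduced here.

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II basis matrix (so T @ T.T is the identity)."""
    k = np.arange(n)[:, None]
    x = np.arange(n)[None, :]
    t = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * x + 1) * k / (2.0 * n))
    t[0] /= np.sqrt(2.0)
    return t

def block_dct(img, n=8):
    """Step (1): blockwise forward DCT (image sides divisible by n)."""
    t = dct_matrix(n)
    out = np.zeros(img.shape)
    for y in range(0, img.shape[0], n):
        for x in range(0, img.shape[1], n):
            out[y:y + n, x:x + n] = t @ img[y:y + n, x:x + n] @ t.T
    return out

def block_idct(coeff, n=8):
    """Inverse of block_dct, used after decoding to recover the image."""
    t = dct_matrix(n)
    out = np.zeros(coeff.shape)
    for y in range(0, coeff.shape[0], n):
        for x in range(0, coeff.shape[1], n):
            out[y:y + n, x:x + n] = t.T @ coeff[y:y + n, x:x + n] @ t
    return out

def delta_encode(dc):
    """Step (4): differential coding of the list of DC components."""
    dc = np.asarray(dc, dtype=float)
    return np.concatenate(([dc[0]], np.diff(dc)))

def delta_decode(deltas):
    """Reverse of delta_encode: prefix sums restore the DC list."""
    return np.cumsum(deltas)
```

Delta coding helps because neighbouring blocks have similar DC values, so the differences cluster near zero and compress well under the entropy coder of step (5).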

  18. An Uneven Illumination Correction Algorithm for Optical Remote Sensing Images Covered with Thin Clouds

    Directory of Open Access Journals (Sweden)

    Xiaole Shen

    2015-09-01

    Full Text Available The uneven illumination caused by thin clouds reduces the quality of remote sensing images and adversely affects image interpretation. To remove the effect of thin clouds on images, an uneven illumination correction can be applied. In this paper, an effective uneven illumination correction algorithm is proposed to remove the effect of thin clouds and to restore the ground information of the optical remote sensing image. The imaging model of remote sensing images covered by thin clouds is analyzed. Due to transmission attenuation, reflection, and scattering, thin cloud cover usually increases region brightness and reduces the saturation and contrast of the image. Accordingly, a wavelet-domain enhancement is performed on the image in Hue-Saturation-Value (HSV) color space. We use images with thin clouds over the Wuhan area captured by the QuickBird and ZiYuan-3 (ZY-3) satellites for experiments. Three traditional uneven illumination correction algorithms, i.e., the multi-scale Retinex (MSR) algorithm, the homomorphic filtering (HF)-based algorithm, and the wavelet transform-based MASK (WT-MASK) algorithm, are performed for comparison. Five indicators, i.e., mean value, standard deviation, information entropy, average gradient, and hue deviation index (HDI), are used to analyze the effect of the algorithms. The experimental results show that the proposed algorithm can effectively eliminate the influence of thin clouds and restore the real color of ground objects under thin clouds.

  19. Is STAPLE algorithm confident to assess segmentation methods in PET imaging?

    International Nuclear Information System (INIS)

    Dewalle-Vignion, Anne-Sophie; Betrouni, Nacim; Vermandel, Maximilien; Baillet, Clio

    2015-01-01

    Accurate tumor segmentation in [18F]-fluorodeoxyglucose positron emission tomography is crucial for tumor response assessment and target volume definition in radiation therapy. Evaluation of segmentation methods from clinical data without ground truth is usually based on physicians' manual delineations. In this context, the simultaneous truth and performance level estimation (STAPLE) algorithm could be useful to manage the multi-observer variability. In this paper, we evaluated how accurately this algorithm can estimate the ground truth in PET imaging. A complete evaluation study using different criteria was performed on simulated data. The STAPLE algorithm was applied to manual and automatic segmentation results. A specific configuration of the implementation provided by the Computational Radiology Laboratory was used. The consensus obtained by the STAPLE algorithm from manual delineations appeared to be more accurate than the manual delineations themselves (80% overlap). An improvement in accuracy was also observed when applying the STAPLE algorithm to automatic segmentation results. The STAPLE algorithm, with the configuration used in this paper, is more appropriate than manual delineations alone or automatic segmentation results alone for estimating the ground truth in PET imaging. Therefore, it might be preferred for assessing the accuracy of tumor segmentation methods in PET imaging. (paper)
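The core of STAPLE for a binary label can be sketched as a small EM loop. This didactic version (fixed scalar prior, no spatial model, assumed initial sensitivities/specificities of 0.9) is a stand-in for illustration only, not the Computational Radiology Laboratory implementation referenced above.

```python
import numpy as np

def staple_binary(D, n_iter=30):
    """Simplified binary STAPLE via expectation-maximization.

    D is an (R, N) 0/1 array: R rater segmentations over N voxels.
    E-step: posterior W[i] that voxel i is foreground given rater
    decisions and current (sensitivity p_j, specificity q_j).
    M-step: re-estimate p_j and q_j from the soft consensus W.
    """
    D = np.asarray(D, dtype=float)
    R, N = D.shape
    p = np.full(R, 0.9)          # sensitivities (assumed init)
    q = np.full(R, 0.9)          # specificities (assumed init)
    prior = D.mean()             # scalar foreground prior, kept fixed
    W = np.empty(N)
    for _ in range(n_iter):
        # E-step: likelihoods of each voxel being foreground/background
        a = prior * np.prod(p[:, None] ** D * (1 - p[:, None]) ** (1 - D), axis=0)
        b = (1 - prior) * np.prod(q[:, None] ** (1 - D) * (1 - q[:, None]) ** D, axis=0)
        W = a / (a + b)
        # M-step: update rater performance parameters
        p = (D * W).sum(axis=1) / W.sum()
        q = ((1 - D) * (1 - W)).sum(axis=1) / (1 - W).sum()
        p = np.clip(p, 1e-6, 1 - 1e-6)
        q = np.clip(q, 1e-6, 1 - 1e-6)
    return W, p, q
```

Thresholding W at 0.5 gives the estimated ground-truth segmentation; p and q quantify each rater's performance level.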

  20. Algorithms and programming tools for image processing on the MPP:3

    Science.gov (United States)

    Reeves, Anthony P.

    1987-01-01

    This is the third and final report on the work done for NASA Grant 5-403 on Algorithms and Programming Tools for Image Processing on the MPP:3. All the work done for this grant is summarized in the introduction. Work done since August 1986 is reported in detail. Research for this grant falls under the following headings: (1) fundamental algorithms for the MPP; (2) programming utilities for the MPP; (3) the Parallel Pascal Development System; and (4) performance analysis. In this report, the results of two efforts are presented: region growing, and performance analysis of important characteristic algorithms. In each case, timing results from MPP implementations are included. A paper is included in which parallel algorithms for region growing on the MPP are discussed. These algorithms permit different sized regions to be merged in parallel. Details on the implementation and performance of several important MPP algorithms are given. These include a number of standard permutations, the FFT, convolution, arbitrary data mappings, image warping, and pyramid operations, all of which have been implemented on the MPP. The permutation and image warping functions have been included in the standard development system library.

  1. Is STAPLE algorithm confident to assess segmentation methods in PET imaging?

    Science.gov (United States)

    Dewalle-Vignion, Anne-Sophie; Betrouni, Nacim; Baillet, Clio; Vermandel, Maximilien

    2015-12-01

    Accurate tumor segmentation in [18F]-fluorodeoxyglucose positron emission tomography is crucial for tumor response assessment and target volume definition in radiation therapy. Evaluation of segmentation methods from clinical data without ground truth is usually based on physicians' manual delineations. In this context, the simultaneous truth and performance level estimation (STAPLE) algorithm could be useful to manage the multi-observer variability. In this paper, we evaluated how accurately this algorithm can estimate the ground truth in PET imaging. A complete evaluation study using different criteria was performed on simulated data. The STAPLE algorithm was applied to manual and automatic segmentation results. A specific configuration of the implementation provided by the Computational Radiology Laboratory was used. The consensus obtained by the STAPLE algorithm from manual delineations appeared to be more accurate than the manual delineations themselves (80% overlap). An improvement in accuracy was also observed when applying the STAPLE algorithm to automatic segmentation results. The STAPLE algorithm, with the configuration used in this paper, is more appropriate than manual delineations alone or automatic segmentation results alone for estimating the ground truth in PET imaging. Therefore, it might be preferred for assessing the accuracy of tumor segmentation methods in PET imaging.

  2. Image defog algorithm based on open close filter and gradient domain recursive bilateral filter

    Science.gov (United States)

    Liu, Daqian; Liu, Wanjun; Zhao, Qingguo; Fei, Bowen

    2017-11-01

    To solve the problems of fuzzy details, color distortion, and low brightness in images produced by the dark channel prior defog algorithm, an image defog algorithm based on an open-close filter and a gradient domain recursive bilateral filter, referred to as OCRBF, is put forward. The OCRBF algorithm first uses a weighted quadtree to obtain a more accurate global atmospheric value, then applies a multiple-structure-element morphological open-close filter to the minimum channel map to obtain a rough scattering map from the dark channel prior, uses a variogram to correct the transmittance map, applies a gradient domain recursive bilateral filter for smoothing, and finally obtains recovered images through the image degradation model, with a contrast adjustment yielding a bright, clear, fog-free image. A large number of experimental results show that the proposed defog method removes fog well and recovers the color and definition of foggy images containing close-range scenes, wide perspectives, and bright areas. Compared with other image defog algorithms, it obtains clearer and more natural fog-free images with more visible details; moreover, the time complexity of the SIDA algorithm is linear in the number of image pixels.
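For context, the dark channel prior baseline that OCRBF refines can be sketched as follows. This is the classic prior pipeline (dark channel, atmospheric light, transmission, recovery), not OCRBF itself; the patch size, omega, and t0 values are conventional assumptions.

```python
import numpy as np

def defog_dark_channel(img, patch=7, omega=0.95, t0=0.1):
    """Dark channel prior defogging, a baseline sketch.

    dark channel -> atmospheric light A -> transmission
    t = 1 - omega * min_c(I / A) -> recovery J = (I - A)/max(t, t0) + A.
    `img` is an (H, W, 3) float array in [0, 1].
    """
    h, w, _ = img.shape
    r = patch // 2
    mins = img.min(axis=2)                        # per-pixel channel minimum
    padded = np.pad(mins, r, mode="edge")
    dark = np.min(                                # patch-wise minimum filter
        [padded[dy:dy + h, dx:dx + w] for dy in range(patch) for dx in range(patch)],
        axis=0,
    )
    n_top = max(1, int(0.001 * h * w))            # brightest 0.1% dark pixels
    idx = np.argsort(dark.ravel())[-n_top:]
    A = img.reshape(-1, 3)[idx].mean(axis=0)      # atmospheric light estimate
    t = 1.0 - omega * (img / A).min(axis=2)       # coarse transmission map
    t = np.maximum(t, t0)
    J = (img - A) / t[..., None] + A
    return np.clip(J, 0.0, 1.0), t
```

OCRBF replaces the crude minimum filter with morphological open-close filtering and smooths the transmission with a gradient domain recursive bilateral filter instead of the usual soft matting or guided filter.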

  3. Adaptive Proximal Point Algorithms for Total Variation Image Restoration

    Directory of Open Access Journals (Sweden)

    Ying Chen

    2015-02-01

    Full Text Available Image restoration is a fundamental problem in various areas of imaging science. This paper presents a class of adaptive proximal point algorithms (APPA) with a contraction strategy for total variation image restoration. In each iteration, the proposed methods choose an adaptive proximal parameter matrix which is not necessarily symmetric. In fact, there is an inner extrapolation in the prediction step, which is followed by a correction step for contraction, and the inner extrapolation is implemented by an adaptive scheme. Using the framework of contraction methods, a global convergence result and a convergence rate of O(1/N) are established for the proposed methods. Numerical results are reported to illustrate the efficiency of the APPA methods for solving total variation image restoration problems. Comparisons with state-of-the-art algorithms demonstrate that the proposed methods are comparable and promising.
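To make the underlying objective concrete, here is a plain gradient descent on a smoothed total variation model. This baseline only illustrates the TV restoration objective; it is not the APPA prediction-correction scheme of the paper, and the smoothing parameter eps is an assumption that proximal methods avoid.

```python
import numpy as np

def tv_denoise(noisy, lam=0.05, step=0.1, eps=0.2, n_iter=100):
    """Gradient descent on a smoothed total variation (TV) model.

    Minimizes 0.5*||u - noisy||^2 + lam * sum sqrt(|grad u|^2 + eps^2).
    """
    u = noisy.astype(float).copy()
    for _ in range(n_iter):
        gy, gx = np.gradient(u)
        norm = np.sqrt(gy ** 2 + gx ** 2 + eps ** 2)
        # divergence of the normalized gradient field (smoothed TV gradient)
        div = np.gradient(gy / norm, axis=0) + np.gradient(gx / norm, axis=1)
        u -= step * ((u - noisy) - lam * div)
    return u
```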

  4. IMAGING THE EPOCH OF REIONIZATION: LIMITATIONS FROM FOREGROUND CONFUSION AND IMAGING ALGORITHMS

    International Nuclear Information System (INIS)

    Vedantham, Harish; Udaya Shankar, N.; Subrahmanyan, Ravi

    2012-01-01

    Tomography of redshifted 21 cm transition from neutral hydrogen using Fourier synthesis telescopes is a promising tool to study the Epoch of Reionization (EoR). Limiting the confusion from Galactic and extragalactic foregrounds is critical to the success of these telescopes. The instrumental response or the point-spread function (PSF) of such telescopes is inherently three dimensional with frequency mapping to the line-of-sight (LOS) distance. EoR signals will necessarily have to be detected in data where continuum confusion persists; therefore, it is important that the PSF has acceptable frequency structure so that the residual foreground does not confuse the EoR signature. This paper aims to understand the three-dimensional PSF and foreground contamination in the same framework. We develop a formalism to estimate the foreground contamination along frequency, or equivalently LOS dimension, and establish a relationship between foreground contamination in the image plane and visibility weights on the Fourier plane. We identify two dominant sources of LOS foreground contamination—'PSF contamination' and 'gridding contamination'. We show that PSF contamination is localized in LOS wavenumber space, beyond which there potentially exists an 'EoR window' with negligible foreground contamination where we may focus our efforts to detect EoR. PSF contamination in this window may be substantially reduced by judicious choice of a frequency window function. Gridding and imaging algorithms create additional gridding contamination and we propose a new imaging algorithm using the Chirp Z Transform that significantly reduces this contamination. Finally, we demonstrate the analytical relationships and the merit of the new imaging algorithm for the case of imaging with the Murchison Widefield Array.

  5. A three-dimensional imaging algorithm based on the radiation model of an electric dipole

    International Nuclear Information System (INIS)

    Tian Bo; Zhong Weijun; Tong Chuangming

    2011-01-01

    A three-dimensional imaging algorithm based on the radiation model of a dipole (DBP) is presented. Building on the principle of the back projection (BP) algorithm, the relationship between the near-field and far-field imaging models is analyzed on the basis of the scattering model. First, the far-field sampling data are transformed into near-field sampling data by applying the radiation theory of the dipole. The transformed sampling data are then projected onto the imaging region to obtain images of the targets. The capability of the new algorithm to detect targets is verified using the finite-difference time-domain (FDTD) method, and the coupling effect on imaging is analyzed. (authors)

  6. Comparison of different reconstruction algorithms for three-dimensional ultrasound imaging in a neurosurgical setting.

    Science.gov (United States)

    Miller, D; Lippert, C; Vollmer, F; Bozinov, O; Benes, L; Schulte, D M; Sure, U

    2012-09-01

    Freehand three-dimensional ultrasound imaging (3D-US) is increasingly used in image-guided surgery. During image acquisition, a set of B-scans is acquired that is distributed in a non-parallel manner over the area of interest. Reconstructing these images into a regular array allows 3D visualization. However, the reconstruction process may introduce artefacts and may therefore reduce image quality. The aim of the study is to compare different algorithms with respect to image quality and diagnostic value for image guidance in neurosurgery. 3D-US data sets were acquired during surgery of various intracerebral lesions using an integrated ultrasound-navigation device. They were stored for post-hoc evaluation. Five different reconstruction algorithms, a standard multiplanar reconstruction with interpolation (MPR), a pixel nearest neighbour method (PNN), a voxel nearest neighbour method (VNN) and two voxel based distance-weighted algorithms (VNN2 and DW) were tested with respect to image quality and artefact formation. The capability of the algorithm to fill gaps within the sample volume was investigated and a clinical evaluation with respect to the diagnostic value of the reconstructed images was performed. MPR was significantly worse than the other algorithms in filling gaps. In an image subtraction test, VNN2 and DW reliably reconstructed images even if large amounts of data were missing. However, the quality of the reconstruction improved, if data acquisition was performed in a structured manner. When evaluating the diagnostic value of reconstructed axial, sagittal and coronal views, VNN2 and DW were judged to be significantly better than MPR and VNN. VNN2 and DW could be identified as robust algorithms that generate reconstructed US images with a high diagnostic value. These algorithms improve the utility and reliability of 3D-US imaging during intraoperative navigation. Copyright © 2012 John Wiley & Sons, Ltd.
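The distance-weighted (DW) idea that performed best above can be sketched in 2D as a toy: each grid node averages nearby scattered samples with inverse-distance weights, and nodes with no sample in range remain gaps (which VNN-style methods would instead fill from the single nearest sample). The radius and weighting are assumptions for illustration.

```python
import numpy as np

def reconstruct_dw(points, values, grid_shape, radius=1.5):
    """Distance-weighted reconstruction of scattered samples onto a grid.

    points: iterable of (y, x) sample positions; values: sample values.
    Grid nodes with no sample within `radius` are left as NaN gaps.
    """
    grid = np.full(grid_shape, np.nan)
    pts = np.asarray(points, dtype=float)
    vals = np.asarray(values, dtype=float)
    for iy in range(grid_shape[0]):
        for ix in range(grid_shape[1]):
            d = np.hypot(pts[:, 0] - iy, pts[:, 1] - ix)
            near = d < radius
            if near.any():
                w = 1.0 / (d[near] + 1e-6)      # inverse-distance weights
                grid[iy, ix] = (w * vals[near]).sum() / w.sum()
    return grid
```

The freehand 3D-US case is the same computation over voxels, with the B-scan pixels as the scattered samples; the gap-filling behaviour is exactly what the image subtraction test above probes.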

  7. A Pixel Correlation Technique for Smaller Telescopes to Measure Doubles

    Science.gov (United States)

    Wiley, E. O.

    2013-04-01

    Pixel correlation uses the same reduction techniques as speckle imaging but relies on autocorrelation among captured pixel hits rather than true speckles. A video camera operating at exposure times (8-66 milliseconds) similar to those used in lucky imaging captures 400-1,000 video frames. The AVI files are converted to bitmap images and analyzed using the interferometric algorithms in REDUC using all frames. This results in a series of correlograms from which theta and rho can be measured. Results using a 20 cm (8") Dall-Kirkham working at f/22.5 are presented for doubles with separations between 1" and 5.7" under average seeing conditions. I conclude that this form of visualizing and analyzing visual double stars is a viable alternative to lucky imaging that can be employed with telescopes whose apertures are too small to capture a sufficient number of speckles for true speckle interferometry.
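The reduction described above, an average autocorrelation computed via the power spectrum, can be sketched as follows; for a double star the correlogram shows secondary peaks at plus and minus the companion's offset, from which rho and theta are measured (with the usual 180-degree quadrant ambiguity of autocorrelation methods).

```python
import numpy as np

def autocorrelogram(frames):
    """Average autocorrelation of video frames via the power spectrum:

        AC = fftshift( IFFT( mean_k |FFT(frame_k - mean)|^2 ) )

    The zero-lag peak lands at the array center after fftshift.
    """
    acc = np.zeros(frames[0].shape)
    for f in frames:
        acc += np.abs(np.fft.fft2(f - f.mean())) ** 2
    ac = np.real(np.fft.ifft2(acc / len(frames)))
    return np.fft.fftshift(ac)
```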

  8. Spatial correlation genetic algorithm for fractal image compression

    International Nuclear Information System (INIS)

    Wu, M.-S.; Teng, W.-C.; Jeng, J.-H.; Hsieh, J.-G.

    2006-01-01

    Fractal image compression explores the self-similarity property of a natural image and utilizes the partitioned iterated function system (PIFS) to encode it. This technique is of great interest both in theory and in application. However, it is time-consuming in the encoding process, and this drawback renders it impractical for real-time applications. The time is mainly spent on the search for the best-match block in a large domain pool. In this paper, a spatial correlation genetic algorithm (SC-GA) is proposed to speed up the encoder. The SC-GA method has two stages. The first stage makes use of spatial correlations in images, for both the domain pool and the range pool, to exploit local optima. The second stage operates on the whole image to explore more adequate similarities if the local optima are not satisfactory. With the aid of spatial correlation in images, encoding is 1.5 times faster than with the traditional genetic algorithm method, while the quality of the retrieved image is almost the same. Moreover, about half of the matched blocks come from the correlated space, so fewer bits are required to represent the fractal transform, and therefore the compression ratio is also improved.

  9. A Novel Image Encryption Algorithm Based on DNA Subsequence Operation

    Directory of Open Access Journals (Sweden)

    Qiang Zhang

    2012-01-01

    Full Text Available We present a novel image encryption algorithm based on DNA subsequence operations. Unlike traditional DNA encryption methods, our algorithm does not use complex biological operations; it simply uses the idea of DNA subsequence operations (such as elongation, truncation, and deletion), combined with the logistic chaotic map, to scramble the locations and values of the image's pixels. The experimental results and security analysis show that the proposed algorithm is easy to implement, achieves a good encryption effect, has a large key space and strong sensitivity to the secret key, and is able to resist exhaustive and statistical attacks.
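The logistic-map location scrambling that the scheme combines with its DNA operations can be sketched on its own; the DNA subsequence value operations are omitted here, and the key value and map parameter are illustrative assumptions.

```python
import numpy as np

def logistic_sequence(x0, n, mu=3.99):
    """Logistic chaotic map: x_{k+1} = mu * x_k * (1 - x_k)."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = mu * x * (1 - x)
        xs[i] = x
    return xs

def scramble(img, key=0.3456):
    """Permute pixel locations using a key-seeded chaotic sequence."""
    flat = img.ravel()
    perm = np.argsort(logistic_sequence(key, flat.size))
    return flat[perm].reshape(img.shape)

def unscramble(scrambled, key=0.3456):
    """Invert the permutation by regenerating it from the same key."""
    perm = np.argsort(logistic_sequence(key, scrambled.size))
    flat = np.empty_like(scrambled.ravel())
    flat[perm] = scrambled.ravel()
    return flat.reshape(scrambled.shape)
```

The key sensitivity claimed above comes from the chaotic map: a tiny change in x0 yields a completely different sequence, hence a different permutation.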

  10. Overcoming turbulence-induced space-variant blur by using phase-diverse speckle.

    Science.gov (United States)

    Thelen, Brian J; Paxman, Richard G; Carrara, David A; Seldin, John H

    2009-01-01

    Space-variant blur occurs when imaging through volume turbulence over sufficiently large fields of view. Space-variant effects are particularly severe in horizontal-path imaging, slant-path (air-to-ground or ground-to-air) geometries, and ground-based imaging of low-elevation satellites or astronomical objects. In these geometries, the isoplanatic angle can be comparable to or even smaller than the diffraction-limited resolution angle. We report on a postdetection correction method that seeks to correct for the effects of space-variant aberrations, with the goal of reconstructing near-diffraction-limited imagery. Our approach has been to generalize the method of phase-diverse speckle (PDS) by using a physically motivated distributed-phase-screen model. Simulation results are presented that demonstrate the reconstruction of near-diffraction-limited imagery under both matched and mismatched model assumptions. In addition, we present evidence that PDS could be used as a beaconless wavefront sensor in a multiconjugate adaptive optics system when imaging extended scenes.

  11. Auto-SEIA: simultaneous optimization of image processing and machine learning algorithms

    Science.gov (United States)

    Negro Maggio, Valentina; Iocchi, Luca

    2015-02-01

    Object classification from images is an important task for machine vision and a crucial ingredient for many computer vision applications, ranging from security and surveillance to marketing. Image-based object classification techniques properly integrate image processing and machine learning (i.e., classification) procedures. In this paper we present a system for automatic simultaneous optimization of algorithms and parameters for object classification from images. More specifically, the proposed system is able to process a dataset of labelled images and to return the best configuration of image processing and classification algorithms, and of their parameters, with respect to classification accuracy. Experiments with real public datasets demonstrate the effectiveness of the developed system.

  12. Reliable Line Matching Algorithm for Stereo Images with Topological Relationship

    Directory of Open Access Journals (Sweden)

    WANG Jingxue

    2017-11-01

    Full Text Available Because individual line matching ignores the relationships between a candidate line and its neighbouring lines, and because individual line descriptors are unreliable on discontinuous textures, this paper presents a reliable line matching algorithm for stereo images based on topological relationships. The algorithm first generates grouped line pairs from the lines extracted from the reference image and the search image, according to basic topological relationships such as the distance and angle between lines. It then takes these grouped line pairs as matching primitives and matches them using, in order, the epipolar constraint, the homography constraint, the quadrant constraint and the grey-level correlation constraint of irregular triangles. Finally, it resolves each corresponding line pair into two pairs of corresponding individual lines and obtains one-to-one matching results after post-processing steps of merging, fitting and checking. Digital aerial images and close-range images with typical texture features are used for parameter analysis and line matching, and the experimental results demonstrate that the proposed algorithm obtains reliable line matching results.
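The grouping step can be pictured with a small sketch (not the authors' code) that pairs extracted line segments whose midpoints are close and whose orientations differ enough to form a distinctive topological primitive; the thresholds and test segments are illustrative:

```python
import math

def angle(line):
    """Orientation of a segment, folded into [0, pi)."""
    (x1, y1), (x2, y2) = line
    return math.atan2(y2 - y1, x2 - x1) % math.pi

def midpoint(line):
    (x1, y1), (x2, y2) = line
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def group_pairs(lines, max_dist=50.0, min_angle=math.radians(15)):
    """Pair nearby, non-parallel segments as grouped matching primitives."""
    pairs = []
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            d = math.dist(midpoint(lines[i]), midpoint(lines[j]))
            da = abs(angle(lines[i]) - angle(lines[j]))
            da = min(da, math.pi - da)  # smallest angle between directions
            if d <= max_dist and da >= min_angle:
                pairs.append((i, j))
    return pairs

lines = [((0, 0), (10, 0)),      # horizontal
         ((0, 5), (10, 15)),     # 45 degrees, nearby
         ((200, 0), (210, 0))]   # horizontal, far away
pairs = group_pairs(lines)
```

Only the first two segments form a pair: the third is too distant, and a pair of parallel segments would be rejected by the angle test.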

  13. High resolution reconstruction of PET images using the iterative OSEM algorithm

    International Nuclear Information System (INIS)

    Doll, J.; Bublitz, O.; Werling, A.; Haberkorn, U.; Semmler, W.; Adam, L.E.; Pennsylvania Univ., Philadelphia, PA; Brix, G.

    2004-01-01

    Aim: Improvement of the spatial resolution in positron emission tomography (PET) by incorporating the image-forming characteristics of the scanner into the process of iterative image reconstruction. Methods: All measurements were performed on the whole-body PET system ECAT EXACT HR+ in 3D mode. The acquired 3D sinograms were sorted into 2D sinograms by means of the Fourier rebinning (FORE) algorithm, which allows the use of 2D algorithms for image reconstruction. The scanner characteristics were described by a spatially variant line-spread function (LSF), which was determined from activated copper-64 line sources. This information was used to model the physical degradation processes in PET measurements during the course of 2D image reconstruction with the iterative OSEM algorithm. To assess the performance of the high-resolution OSEM algorithm, phantom measurements performed with a cylinder phantom, the Jaszczak hot-spot phantom and the 3D Hoffman brain phantom, as well as different patient examinations, were analyzed. Results: The scanner characteristics could be described by a Gaussian-shaped LSF with a full-width at half-maximum increasing from 4.8 mm at the center to 5.5 mm at a radial distance of 10.5 cm. Incorporation of the LSF into the iteration formula resulted in a markedly improved resolution of 3.0 and 3.5 mm, respectively. The evaluation of phantom and patient studies showed that the high-resolution OSEM algorithm led not only to a better contrast resolution in the reconstructed activity distributions but also to an improved accuracy in the quantification of activity concentrations in small structures, without amplifying image noise or introducing image artifacts. Conclusion: The spatial and contrast resolution of PET scans can be markedly improved by the presented image restoration algorithm, which is of special interest for the examination of both patients with brain disorders and small animals. (orig.)
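The key idea, folding the measured LSF into the system model of the iterative reconstruction, can be sketched in one dimension with an MLEM-style update (OSEM with a single subset; a toy stand-in for the clinical implementation, with an illustrative Gaussian kernel and grid size):

```python
import numpy as np

def gaussian_lsf(n, fwhm):
    """Discrete Gaussian line-spread function, normalized to unit sum."""
    sigma = fwhm / 2.3548
    x = np.arange(n) - n // 2
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def forward(img, lsf):
    return np.convolve(img, lsf, mode="same")

def mlem(data, lsf, n_iter):
    """MLEM with the LSF folded into forward and back projection."""
    est = np.ones_like(data)
    sens = forward(np.ones_like(data), lsf)  # sensitivity image
    for _ in range(n_iter):
        ratio = data / np.maximum(forward(est, lsf), 1e-12)
        est = est * forward(ratio, lsf) / sens  # symmetric LSF is its own adjoint
    return est

truth = np.zeros(64)
truth[30:34] = 1.0                     # a small "hot" structure
lsf = gaussian_lsf(64, fwhm=5.0)
blurred = forward(truth, lsf)          # noiseless measurement
recon = mlem(blurred, lsf, n_iter=200)
```

Because the degradation is modelled inside the update rather than deconvolved after the fact, the estimate stays nonnegative while contrast in the small structure is progressively recovered beyond what the blurred data show.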

  14. Information theoretic methods for image processing algorithm optimization

    Science.gov (United States)

    Prokushkin, Sergey F.; Galil, Erez

    2015-01-01

    Modern image processing pipelines (e.g., those used in digital cameras) are full of advanced, highly adaptive filters that often have a large number of tunable parameters (sometimes > 100). This makes the calibration procedure for these filters very complex and optimal results barely achievable by manual calibration; an automated approach is therefore a must. We will discuss an information theory based metric for evaluation of algorithm adaptive characteristics ("adaptivity criterion") using noise reduction algorithms as an example. The method allows finding an "orthogonal decomposition" of the filter parameter space into the "filter adaptivity" and "filter strength" directions. This metric can be used as a cost function in automatic filter optimization. Since it is a measure of physical "information restoration" rather than perceived image quality, it helps to reduce the set of filter parameters to a smaller subset that is easier for a human operator to tune to achieve better subjective image quality. With appropriate adjustments, the criterion can be used for assessment of the whole imaging system (sensor plus post-processing).
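The spirit of such a metric, scoring a filter by how much ground-truth information it restores rather than by perceived quality, can be illustrated with a much simpler stand-in: sweep a denoiser's strength parameter and score each setting against a known clean signal. The moving-average filter, test signal and negative-MSE score below are illustrative choices, not the paper's criterion:

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(256)
clean = np.sin(2 * np.pi * t / 64)            # ground-truth signal
noisy = clean + rng.normal(0.0, 0.5, t.size)  # degraded observation

def box_filter(signal, width):
    kernel = np.ones(width) / width
    return np.convolve(signal, kernel, mode="same")

def restoration_score(width):
    """Higher is better: negative MSE against the known ground truth."""
    return -np.mean((box_filter(noisy, width) - clean) ** 2)

widths = [1, 3, 5, 9, 21, 51]
best = max(widths, key=restoration_score)
```

Width 1 leaves the noise untouched and width 51 smears the signal away, so an objective restoration score, unlike a purely subjective rating, steers the search toward an intermediate filter strength.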

  15. Speckle Interferometry with the OCA Kuhn 22" Telescope

    Science.gov (United States)

    Wasson, Rick

    2018-04-01

    Speckle interferometry measurements of double stars were made in 2015 and 2016, using the Kuhn 22-inch classical Cassegrain telescope of the Orange County Astronomers, a Point Grey Blackfly CMOS camera, and three interference filters. 272 observations are reported for 177 systems, with separations ranging from 0.29" to 2.9". Data reduction was by means of the REDUC and Speckle Tool Box programs. Equipment, observing procedures, calibration, data reduction, and analysis are described, and unusual results for 11 stars are discussed in detail.

  16. Holographic interferometric and correlation-based laser speckle metrology for 3D deformations in dentistry

    Science.gov (United States)

    Dekiff, Markus; Kemper, Björn; Kröger, Elke; Denz, Cornelia; Dirksen, Dieter

    2017-03-01

    The mechanical loading of dental restorations and hard tissue is often investigated numerically. For validation and optimization of such simulations, comparisons with measured deformations are essential. We combine digital holographic interferometry and digital speckle photography for the determination of microscopic deformations with a photogrammetric method that is based on digital image correlation of a projected laser speckle pattern. This multimodal workstation allows the simultaneous acquisition of the specimen's macroscopic 3D shape and thus a quantitative comparison of measured deformations with simulation data. In order to demonstrate the feasibility of our system, two applications are presented: the quantitative determination of (1) the deformation of a mandible model due to mechanical loading of an inserted dental implant and of (2) the deformation of a (dental) bridge model under mechanical loading. The results were compared with data from finite element analyses of the investigated applications. The experimental results showed close agreement with those of the simulations.

  17. A novel image-domain-based cone-beam computed tomography enhancement algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Li Xiang; Li Tianfang; Yang Yong; Heron, Dwight E; Huq, M Saiful, E-mail: lix@upmc.edu [Department of Radiation Oncology, University of Pittsburgh Cancer Institute, Pittsburgh, PA 15232 (United States)

    2011-05-07

    Kilo-voltage (kV) cone-beam computed tomography (CBCT) plays an important role in image-guided radiotherapy. However, due to a large cone-beam angle, scatter effects significantly degrade the CBCT image quality and limit its clinical application. The goal of this study is to develop an image enhancement algorithm to reduce the low-frequency CBCT image artifacts, which are also called the bias field. The proposed algorithm is based on the hypothesis that image intensities of different types of materials in CBCT images are approximately globally uniform (in other words, a piecewise property). A maximum a posteriori probability framework was developed to estimate the bias field contribution from a given CBCT image. The performance of the proposed CBCT image enhancement method was tested using phantoms and clinical CBCT images. Compared to the original CBCT images, the corrected images using the proposed method achieved a more uniform intensity distribution within each tissue type and significantly reduced cupping and shading artifacts. In a head and a pelvic case, the proposed method reduced the Hounsfield unit (HU) errors within the region of interest from 300 HU to less than 60 HU. In a chest case, the HU errors were reduced from 460 HU to less than 110 HU. The proposed CBCT image enhancement algorithm demonstrated promising results by reducing the scatter-induced low-frequency image artifacts commonly encountered in kV CBCT imaging.
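A one-dimensional cartoon of the piecewise-uniformity hypothesis: if each material's intensity should be globally uniform, the residual from per-class means approximates the bias field, and smoothing that residual isolates its low-frequency part. This simple class-mean-plus-smoothing sketch stands in for the paper's maximum a posteriori estimation; all values are synthetic:

```python
import numpy as np

# Synthetic 1-D "CBCT profile": two tissue classes plus a smooth additive bias.
n = 200
tissue = np.where(np.arange(n) < n // 2, 100.0, 300.0)
bias = 40.0 * np.sin(np.linspace(0.0, np.pi, n))   # low-frequency artifact
image = tissue + bias

# Step 1: classify voxels (here a simple threshold), exploiting the piecewise
# hypothesis that each material has a globally uniform intensity.
labels = image > image.mean()
class_means = np.where(labels, image[labels].mean(), image[~labels].mean())

# Step 2: the residual from the class means approximates the bias field;
# smooth it to keep only the low-frequency component.
residual = image - class_means
kernel = np.ones(31) / 31
bias_est = np.convolve(residual, kernel, mode="same")

corrected = image - bias_est
```

After correction, the intensity spread within each tissue class shrinks markedly, which is the 1-D analogue of the reduced cupping and shading artifacts reported above.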

  18. Clinical utility of speckle-tracking echocardiography in cardiac resynchronisation therapy

    Directory of Open Access Journals (Sweden)

    Sitara G Khan

    2016-05-01

    Full Text Available Cardiac resynchronisation therapy (CRT can profoundly improve outcome in selected patients with heart failure; however, response is difficult to predict and can be absent in up to one in three patients. There has been a substantial amount of interest in the echocardiographic assessment of left ventricular dyssynchrony, with the ultimate aim of reliably identifying patients who will respond to CRT. The measurement of myocardial deformation (strain has conventionally been assessed using tissue Doppler imaging (TDI, which is limited by its angle dependence and ability to measure in a single plane. Two-dimensional speckle-tracking echocardiography is a technique that provides measurements of strain in three planes, by tracking patterns of ultrasound interference (‘speckles’ in the myocardial wall throughout the cardiac cycle. Since its initial use over 15 years ago, it has emerged as a tool that provides more robust, reproducible and sensitive markers of dyssynchrony than TDI. This article reviews the use of two-dimensional and three-dimensional speckle-tracking echocardiography in the assessment of dyssynchrony, including the identification of echocardiographic parameters that may hold predictive potential for the response to CRT. It also reviews the application of these techniques in guiding optimal LV lead placement pre-implant, with promising results in clinical improvement post-CRT.

  19. An efficient feedback calibration algorithm for direct imaging radio telescopes

    Science.gov (United States)

    Beardsley, Adam P.; Thyagarajan, Nithyanandan; Bowman, Judd D.; Morales, Miguel F.

    2017-10-01

    We present the E-field Parallel Imaging Calibration (EPICal) algorithm, which addresses the need for a fast calibration method for direct imaging radio astronomy correlators. Direct imaging involves a spatial fast Fourier transform of antenna signals, alleviating an O(Na^2) computational bottleneck typical in radio correlators and yielding a gentler O(Ng log2 Ng) scaling, where Na is the number of antennas in the array and Ng is the number of grid points in the imaging analysis. This can save orders of magnitude in computation cost for next-generation arrays consisting of hundreds or thousands of antennas. However, because antenna signals are mixed in the imaging correlator without creating visibilities, gain correction must be applied prior to imaging rather than on visibilities post-correlation. We develop the EPICal algorithm to form gain solutions quickly and without ever forming visibilities. This method scales as the number of antennas and produces results comparable to those from visibilities. We use simulations to demonstrate the EPICal technique and study the noise properties of our gain solutions, showing that they are similar to visibility-based solutions in realistic situations. By applying EPICal to 2 s of Long Wavelength Array data, we achieve a 65 per cent dynamic-range improvement compared to uncalibrated images, showing that this algorithm is a promising solution for next-generation instruments.
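The core trick, solving for per-antenna complex gains directly from the voltage streams, can be caricatured with a single known model signal: each antenna's gain follows from correlating its voltage series against the model, a single pass over the data that scales linearly with the number of antennas and never forms the O(Na^2) visibility set. This toy (with invented array sizes and noise levels) is illustrative, not the EPICal estimator itself:

```python
import numpy as np

rng = np.random.default_rng(1)
n_ant, n_samp = 8, 4096

# Unit-variance complex sky signal and per-antenna complex gains (ground truth).
signal = (rng.standard_normal(n_samp) + 1j * rng.standard_normal(n_samp)) / np.sqrt(2)
g_true = (1 + 0.3 * rng.standard_normal(n_ant)) \
    * np.exp(1j * rng.uniform(-np.pi, np.pi, n_ant))

# Each antenna records gain * signal + receiver noise; no visibilities formed.
noise = 0.1 * (rng.standard_normal((n_ant, n_samp))
               + 1j * rng.standard_normal((n_ant, n_samp)))
volts = g_true[:, None] * signal[None, :] + noise

# Gain estimate: correlate each voltage stream with the model signal.
g_est = (volts @ np.conj(signal)) / np.vdot(signal, signal)
```

The actual EPICal estimator works from the imaging correlator's own data products rather than a known reference signal; the sketch only shows why a visibility-free solve can scale as the number of antennas.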

  20. An analytical phantom for the evaluation of medical flow imaging algorithms

    International Nuclear Information System (INIS)

    Pashaei, A; Fatouraee, N

    2009-01-01

    Blood flow characteristics (e.g. velocity, pressure, shear stress, streamlines and volumetric flow rate) are effective tools in the diagnosis of cardiovascular diseases such as atherosclerotic plaque, aneurysm and cardiac muscle failure. Noninvasive estimation of cardiovascular blood flow characteristics is mostly limited to the measurement of velocity components by medical imaging modalities. Once the velocity field is obtained from the images, other flow characteristics within the cardiovascular system can be determined using algorithms relating them to the velocity components. In this work, we propose an analytical flow phantom to evaluate such algorithms accurately. The Navier-Stokes equations are used to derive the flow phantom; their exact solution yields analytical expressions for the flow characteristics inside the domain. Features such as pulsatility, incompressibility and viscosity are included in a three-dimensional domain. The velocity field of the resulting system is presented as reference images, which can be employed to evaluate the performance of different flow-characteristic algorithms. We also present some applications of the obtained phantom: the calculation of the pressure field from velocity data, the volumetric flow rate, wall shear stress and particle traces are the characteristics whose algorithms are evaluated here. We further demonstrate the application of this phantom in the analysis of noisy and low-resolution images. The presented phantom can be considered a benchmark test for comparing the accuracy of different flow-characteristic algorithms.
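As a concrete example of such an analytical benchmark (a steady special case, far simpler than the pulsatile 3-D phantom described above), steady Poiseuille flow in a rigid tube solves the Navier-Stokes equations exactly, so derived quantities like volumetric flow rate and wall shear stress have closed forms against which a numerical algorithm can be checked. Parameter values below are illustrative:

```python
import numpy as np

# Tube and fluid parameters (illustrative values, SI units).
R = 0.01        # tube radius, m
L = 0.1         # tube length, m
mu = 3.5e-3     # dynamic viscosity, Pa*s (blood-like)
dP = 100.0      # pressure drop over the length, Pa

# Exact Navier-Stokes solution: parabolic axial velocity profile u(r).
r = np.linspace(0.0, R, 10001)
u = dP / (4 * mu * L) * (R**2 - r**2)

# Closed-form reference values derived from the same solution.
Q_exact = np.pi * dP * R**4 / (8 * mu * L)   # volumetric flow rate
tau_wall_exact = dP * R / (2 * L)            # wall shear stress

# A "flow characteristic algorithm" under test: trapezoidal integration
# of the velocity profile, Q = integral of u * 2*pi*r dr from 0 to R.
f = u * 2 * np.pi * r
h = r[1] - r[0]
Q_numeric = h * (f[0] / 2 + f[1:-1].sum() + f[-1] / 2)

rel_error = abs(Q_numeric - Q_exact) / Q_exact
```

The pulsatile analogue of this exact solution (Womersley flow) extends the same benchmarking idea to the time-dependent case the phantom covers.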