Speckle imaging algorithms for planetary imaging
Energy Technology Data Exchange (ETDEWEB)
Johansson, E. [Lawrence Livermore National Lab., CA (United States)
1994-11-15
I will discuss the speckle imaging algorithms used to process images of the impact sites of the collision of comet Shoemaker-Levy 9 with Jupiter. The algorithms use a phase retrieval process based on the average bispectrum of the speckle image data. High resolution images are produced by estimating the Fourier magnitude and Fourier phase of the image separately, then combining them and inverse transforming to achieve the final result. I will show raw speckle image data and high-resolution image reconstructions from our recent experiment at Lick Observatory.
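As a toy illustration of the phase-retrieval step described above (not the authors' code; a noise-free, one-dimensional example using a single slice of the bispectrum, whereas the real algorithm averages the bispectrum over many speckle frames), the Fourier phase can be recovered recursively from the bispectrum identity B(u, v) = F(u) F(v) F*(u+v):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.random(16)                    # toy 1-D "image"
N = len(x)
F = np.fft.fft(x)

# One slice of the bispectrum: B(u, 1) = F(u) F(1) F*(u+1)
B = F * F[1] * np.conj(np.roll(F, -1))

# Recursive phase recovery; phi(0) = phi(1) = 0 fixes the arbitrary image shift
phi = np.zeros(N)
for u in range(1, N - 1):
    phi[u + 1] = phi[u] + phi[1] - np.angle(B[u])

# The recovered phase equals the true Fourier phase up to a linear ramp,
# i.e. up to an (unobservable) translation of the image
ramp = np.angle(F[1]) * np.arange(N)
err = np.angle(np.exp(1j * (phi - np.angle(F) + ramp)))
```

Combining this recovered phase with a Fourier magnitude estimate and inverse transforming yields the reconstructed image, as the abstract outlines.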
Speckle imaging of globular clusters
International Nuclear Information System (INIS)
Sams, B.J. III
1990-01-01
Speckle imaging is a powerful tool for high resolution astronomy. Its application to the core regions of globular clusters produces high resolution stellar maps of the bright stars, but it is unable to image the faint stars which are the most reliable dynamical indicators. The limits on resolving these faint, extended objects are physical, not algorithmic, and cannot be overcome using speckle. High resolution maps may be useful for resolving multicomponent stellar systems in the cluster centers. 30 refs
Development of Speckle Interferometry Algorithm and System
International Nuclear Information System (INIS)
Shamsir, A. A. M.; Jafri, M. Z. M.; Lim, H. S.
2011-01-01
Electronic speckle pattern interferometry (ESPI) is a whole-field, non-destructive measurement method widely used in industry, for example to detect defects on metal bodies and in integrated circuits in digital electronic components, and in the preservation of priceless artwork. In this work, algorithms were developed and a new laboratory setup was built for implementing speckle pattern interferometry. In speckle interferometry, an optically rough test surface is illuminated with an expanded laser beam, creating a laser speckle pattern in the space surrounding the illuminated region. The speckle pattern is optically mixed with a second coherent light field that is either another speckle pattern or a smooth light field. This produces an interferometric speckle pattern that is detected by a sensor to quantify the change in the speckle pattern due to the applied force. In this project, an experimental setup of ESPI is proposed to analyze a stainless steel plate using a 632.8 nm (red) laser.
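The subtraction step at the heart of ESPI can be sketched numerically. The following one-dimensional toy model (an assumption of this sketch, not the paper's setup: a unit-amplitude smooth reference beam and a deformation that adds a linear phase tilt) shows how subtracting two speckle interferograms reveals correlation fringes:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256
# Rough surface -> random-phase object field; smooth reference beam
obj = np.exp(1j * rng.uniform(0, 2 * np.pi, n))
ref = 1.0
I_before = np.abs(obj + ref) ** 2

# Deformation adds a slowly varying phase (here a tilt of three fringes)
dphi = np.linspace(0, 6 * np.pi, n)
I_after = np.abs(obj * np.exp(1j * dphi) + ref) ** 2

# Subtracting the two speckle interferograms reveals correlation fringes:
# dark wherever the deformation phase is a multiple of 2*pi
fringes = np.abs(I_after - I_before)
```

The fringe minima mark contours of equal displacement, which is what makes the subtracted pattern usable for deformation measurement.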
Speckle pattern processing by digital image correlation
Directory of Open Access Journals (Sweden)
Gubarev Fedor
2016-01-01
Full Text Available This work tests a method of speckle pattern processing based on digital image correlation. The three most widely used formulas for the correlation coefficient are tested. To determine the accuracy of the speckle pattern processing, test speckle patterns with known displacements are used. The optimal size of the speckle pattern template used to determine the correlation, and hence the speckle pattern displacement, is also considered.
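One widely used correlation coefficient of the kind the abstract refers to is the zero-normalized cross-correlation (ZNCC). A minimal sketch (toy synthetic speckle, integer-pixel exhaustive search; template size and search range are illustrative choices, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(2)
img = rng.random((64, 64))                    # synthetic speckle pattern
ref = img[20:36, 20:36]                       # 16x16 template (subset)
true_shift = (3, 5)                           # known test displacement
cur = np.roll(np.roll(img, true_shift[0], axis=0), true_shift[1], axis=1)

def zncc(a, b):
    # Zero-normalized cross-correlation: insensitive to offset and scale
    a = a - a.mean()
    b = b - b.mean()
    return (a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum())

# Exhaustive integer-pixel search for the correlation peak
best, best_c = None, -2.0
for du in range(-8, 9):
    for dv in range(-8, 9):
        cand = cur[20 + du:36 + du, 20 + dv:36 + dv]
        c = zncc(ref, cand)
        if c > best_c:
            best, best_c = (du, dv), c
```

Real DIC codes refine the integer-pixel peak to subpixel accuracy (e.g. by interpolation or iterative optimization), and the choice of template size trades noise robustness against spatial resolution, which is exactly the trade-off the abstract investigates.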
Speckle imaging using the principal value decomposition method
International Nuclear Information System (INIS)
Sherman, J.W.
1978-01-01
Obtaining diffraction-limited images in the presence of atmospheric turbulence is a topic of current interest. Two types of approaches have evolved: real-time correction and speckle imaging. A speckle imaging reconstruction method was developed by use of an ''optimal'' filtering approach. This method is based on a nonlinear integral equation which is solved by principal value decomposition. The method was implemented on a CDC 7600 for study. The restoration algorithm is discussed and its performance is illustrated. 7 figures
Multiple speckle illumination for optical-resolution photoacoustic imaging
Poisson, Florian; Stasio, Nicolino; Moser, Christophe; Psaltis, Demetri; Bossy, Emmanuel
2017-03-01
Optical-resolution photoacoustic microscopy offers exquisite and specific contrast to optical absorption. Conventional approaches generally involve raster scanning a focused spot over the sample. Here, we demonstrate that a full-field illumination approach with multiple speckle illumination can also provide diffraction-limited optical-resolution photoacoustic images. Two proofs of concept are demonstrated with micro-structured test samples. The first approach follows the principle of correlation/ghost imaging,1,2 and is based on cross-correlating photoacoustic signals under multiple speckle illumination with known speckle patterns measured during a calibration step. The second approach is a speckle scanning microscopy technique, which adapts the technique proposed in fluorescence microscopy by Bertolotti et al.:3 in our work, spatially unresolved photoacoustic measurements are performed for various translations of unknown speckle patterns. A phase-retrieval algorithm is used to reconstruct the object from the knowledge of the modulus of its Fourier transform yielded by the measurements. Because speckle patterns naturally appear in many situations, including propagation through biological tissue or multimode fibers (for which focusing light is very demanding, if not impossible), speckle-illumination-based photoacoustic microscopy provides a powerful framework for the development of novel reconstruction approaches, well suited to compressed sensing.2
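The first (ghost-imaging) approach can be sketched as follows. This is a generic correlation-imaging toy model, not the authors' pipeline: the "bucket" signal stands in for the spatially unresolved photoacoustic measurement, and an absorbing binary object and exponential speckle statistics are assumed:

```python
import numpy as np

rng = np.random.default_rng(10)
n, n_patterns = 16, 4000
obj = np.zeros((n, n))
obj[5:9, 6:12] = 1.0                      # absorbing test object

# Known random speckle patterns (measured in a calibration step) and the
# corresponding unresolved "bucket" signals from the object
P = rng.exponential(1.0, (n_patterns, n, n))
S = (P * obj).sum(axis=(1, 2))

# Ghost-imaging reconstruction: correlate bucket fluctuations with the
# pattern fluctuations, pixel by pixel
recon = ((S - S.mean())[:, None, None] * (P - P.mean(axis=0))).mean(axis=0)
```

The reconstruction sharpens as more patterns are used; compressed-sensing recovery, which the abstract mentions, exploits object sparsity to reduce the number of patterns required.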
Accelerated speckle imaging with the ATST visible broadband imager
Wöger, Friedrich; Ferayorni, Andrew
2012-09-01
The Advanced Technology Solar Telescope (ATST), a 4 meter class telescope for observations of the solar atmosphere currently in its construction phase, will generate data at rates on the order of 10 TB/day with its state-of-the-art instrumentation. The high-priority ATST Visible Broadband Imager (VBI) instrument alone will create two data streams with a bandwidth of 960 MB/s each. Because of the related data handling issues, these data will be post-processed with speckle interferometry algorithms in near-real time at the telescope using cost-effective Graphics Processing Unit (GPU) technology supported by the ATST Data Handling System. In this contribution, we lay out the VBI-specific approach to its image processing pipeline, put this into the context of the underlying ATST Data Handling System infrastructure, and finally describe how the algorithms were redesigned to exploit data parallelism in the speckle image reconstruction. An algorithm redesign is often required to efficiently speed up an application using GPU technology; we have chosen NVIDIA's CUDA language as the basis for our implementation. We present preliminary results of the algorithm performance using our test facilities, and on these results we base a conservative estimate of the requirements for a full system that could achieve near-real-time performance at the ATST.
Statistical Image Recovery From Laser Speckle Patterns With Polarization Diversity
2010-09-01
Several techniques exist for speckle suppression in optical imaging [19]; however, averaging nonimaged laser speckle patterns does not yield the same result.
Li, Yuanbo; Cui, Xiaoqian; Wang, Hongbei; Zhao, Mengge; Ding, Hongbin
2017-10-01
Digital speckle pattern interferometry (DSPI) can diagnose topography evolution in a real-time, continuous and non-destructive manner, and has been considered one of the most promising techniques for plasma-facing component (PFC) topography diagnostics under the complicated environment of a tokamak. It is important for digital speckle pattern interferometry to enhance speckle patterns and obtain the real topography of the ablated crater. In this paper, two numerical models based on the flood-fill algorithm have been developed to obtain the real profile by unwrapping the wrapped phase of the speckle interference pattern, which can be calculated from four intensity images by means of the 4-step phase-shifting technique. During phase unwrapping by the flood-fill algorithm, noise pollution and other inevitable factors lead to poor quality of the reconstruction results, which affects the authenticity of the restored topography. The calculation of quality parameters was therefore introduced to obtain a quality map from the wrapped phase map; this work presents two different methods to calculate the quality parameters. The quality parameters are then used to guide the path of the flood-fill algorithm, and pixels with good quality parameters are processed first, so that the quality of the reconstruction results is improved. A comparison between the flood-fill algorithm suited to speckle pattern interferometry and the quality-guided flood-fill algorithm (with the two different calculation approaches) shows that the errors caused by noise pollution and fringe discontinuities were successfully reduced.
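The 4-step phase-shifting computation that produces the wrapped phase can be sketched directly. A toy quadratic "crater" phase, unit background, and 0.8 modulation are assumptions of this sketch; the quality-guided flood-fill unwrapping that follows in the paper is not shown:

```python
import numpy as np

# Toy "crater" topography whose phase exceeds 2*pi, so it wraps
y, x = np.mgrid[0:64, 0:64]
phi = 0.002 * ((x - 32.0) ** 2 + (y - 32.0) ** 2)

# Four phase-shifted interferograms with shifts 0, pi/2, pi, 3*pi/2
I = [1.0 + 0.8 * np.cos(phi + k * np.pi / 2) for k in range(4)]

# 4-step formula: I0 - I2 = 2B cos(phi), I3 - I1 = 2B sin(phi),
# so arctan2 recovers the phase wrapped into (-pi, pi]
wrapped = np.arctan2(I[3] - I[1], I[0] - I[2])
```

Unwrapping then amounts to adding the correct multiple of 2π at each pixel; the quality-guided flood-fill in the paper chooses the order in which pixels are unwrapped so that noisy pixels are visited last.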
An adaptive Kalman filter for speckle reductions in ultrasound images
International Nuclear Information System (INIS)
Castellini, G.; Labate, D.; Masotti, L.; Mannini, E.; Rocchi, S.
1988-01-01
Speckle is the term used to describe the granular appearance found in ultrasound images. The presence of speckle reduces the diagnostic potential of the echographic technique because it tends to mask small inhomogeneities of the investigated tissue. We developed a new method of speckle reduction that utilizes an adaptive one-dimensional Kalman filter, based on the assumption that the observed image can be considered a superimposition of speckle on a ''true image''. The filter adaptivity, necessary to avoid loss of resolution, has been obtained by statistical considerations on the local signal variations. The results of applying this particular Kalman filter, on both A-mode and B-mode images, show a significant speckle reduction.
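A minimal sketch of the idea, not the authors' filter: a scalar random-walk Kalman filter run along one A-mode line, with Rayleigh multiplicative noise standing in for speckle and a crude adaptivity rule (all noise variances and the innovation threshold are toy values) that inflates the process noise at large local signal variations so edges are tracked rather than smoothed away:

```python
import numpy as np

rng = np.random.default_rng(3)
# Piecewise-constant "true image" along one A-mode line, with a tissue boundary
true = np.concatenate([np.ones(100), 3.0 * np.ones(100)])
# Fully developed speckle as multiplicative Rayleigh noise with unit mean
obs = true * rng.rayleigh(scale=1.0, size=200) / np.sqrt(np.pi / 2)

q_base, r = 0.01, 0.5      # process / measurement noise variances (toy values)
xhat, p = obs[0], 1.0
out = np.empty_like(obs)
out[0] = xhat
for k in range(1, len(obs)):
    # Adaptivity: inflate the process noise where the local signal jumps,
    # so the edge is tracked instead of being smoothed away
    innov = obs[k] - xhat
    q = q_base + 0.5 * innov ** 2 if abs(innov) > 2.5 else q_base
    p = p + q
    gain = p / (p + r)
    xhat = xhat + gain * innov
    p = (1.0 - gain) * p
    out[k] = xhat
```

In homogeneous regions the gain stays small and the filter averages the speckle away; at the boundary the inflated process noise briefly raises the gain, preserving resolution.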
Phase-processing as a tool for speckle reduction in pulse-echo images
DEFF Research Database (Denmark)
Healey, AJ; Leeman, S; Forsberg, F
1991-01-01
Due to the coherent nature of conventional ultrasound medical imaging systems, interference artefacts occur in pulse-echo images. These artefacts are generically termed 'speckle'. The phenomenon may severely limit low-contrast resolution, with clinically relevant information being obscured. Traditional speckle reduction procedures regard speckle correction as a stochastic process and trade image smoothing (resolution loss) for speckle reduction. Recently, a new phase-acknowledging technique has been proposed that is unique in its ability to correct for speckle interference with no image...
Ghijsen, Michael T.; Tromberg, Bruce J.
2017-03-01
Affixed Transmission Speckle Analysis (ATSA) is a recently developed method for measuring blood flow, based on laser speckle imaging miniaturized into a clip-on form factor the size of a pulse oximeter. Measuring at a rate of 250 Hz, ATSA is capable of obtaining the cardiac waveform in blood flow data, referred to as the Speckle-Plethysmogram (SPG). ATSA is also capable of simultaneously measuring the Photoplethysmogram (PPG), a more conventional signal related to light intensity. In this work we present several novel algorithms for extracting physiologically relevant information from the combined SPG-PPG waveform data. First, we show that there is a slight time delay between the SPG and PPG that can be extracted computationally. Second, we present a set of frequency-domain algorithms that measure harmonic content on a pulse-by-pulse basis for both the SPG and PPG. Finally, we apply these algorithms to data obtained from a set of subjects including healthy controls and individuals with heightened cardiovascular risk. We hypothesize that the time delay and frequency content are correlated with cardiovascular health, specifically with vascular stiffening.
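One standard way to extract such a time delay (a generic cross-correlation sketch, not necessarily the authors' algorithm; the 250 Hz rate matches the abstract, while the signals and the 20 ms delay are synthetic stand-ins):

```python
import numpy as np

fs = 250.0                                   # sampling rate from the abstract, Hz
rng = np.random.default_rng(4)
# Broadband surrogate waveform; SPG leads PPG by d samples in this toy setup
base = np.convolve(rng.standard_normal(2600), np.ones(5) / 5, mode="same")
d = 5                                        # true delay: 5 samples = 20 ms
spg = base[d:d + 2500]
ppg = base[:2500]

def lag_of_max_xcorr(a, b, fs, max_lag=50):
    # Positive result (in seconds) means signal `a` leads signal `b`
    lags = np.arange(-max_lag, max_lag + 1)
    n = len(a)
    c = [np.dot(a[max(0, -l):n - max(0, l)], b[max(0, l):n - max(0, -l)])
         for l in lags]
    return lags[int(np.argmax(c))] / fs

delay = lag_of_max_xcorr(spg, ppg, fs)
```

At 250 Hz the lag resolution is 4 ms; subsample delays would need interpolation of the correlation peak.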
Energy Technology Data Exchange (ETDEWEB)
Tsantis, Stavros [Department of Medical Physics, School of Medicine, University of Patras, Rion, GR 26504 (Greece); Spiliopoulos, Stavros; Karnabatidis, Dimitrios [Department of Radiology, School of Medicine, University of Patras, Rion, GR 26504 (Greece); Skouroliakou, Aikaterini [Department of Energy Technology Engineering, Technological Education Institute of Athens, Athens 12210 (Greece); Hazle, John D. [Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States); Kagadis, George C., E-mail: gkagad@gmail.com, E-mail: George.Kagadis@med.upatras.gr, E-mail: GKagadis@mdanderson.org [Department of Medical Physics, School of Medicine, University of Patras, Rion, GR 26504, Greece and Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, Houston, Texas 77030 (United States)
2014-07-15
Purpose: Speckle suppression in ultrasound (US) images of various anatomic structures via a novel speckle noise reduction algorithm. Methods: The proposed algorithm employs an enhanced fuzzy c-means (EFCM) clustering and multiresolution wavelet analysis to distinguish edges from speckle noise in US images. The edge detection procedure involves a coarse-to-fine strategy with spatial and interscale constraints so as to classify the wavelet local maxima distribution at different frequency bands. As an outcome, an edge map across scales is derived, whereas the wavelet coefficients that correspond to speckle are suppressed in the inverse wavelet transform, yielding the denoised US image. Results: A total of 34 thyroid, liver, and breast US examinations were performed on a Logiq 9 US system. Each of these images was subjected to the proposed EFCM algorithm and, for comparison, to commercial speckle reduction imaging (SRI) software and another well-known denoising approach, Pizurica's method. The quantification of the speckle suppression performance in the selected set of US images was carried out via the Speckle Suppression Index (SSI), with results of 0.61, 0.71, and 0.73 for the EFCM, SRI, and Pizurica methods, respectively. Peak signal-to-noise ratios of 35.12, 33.95, and 29.78 and edge preservation indices of 0.94, 0.93, and 0.86 were found for the EFCM, SRI, and Pizurica methods, respectively, demonstrating that the proposed method achieves superior speckle reduction performance and edge preservation properties. Based on two independent radiologists' qualitative evaluation, the proposed method significantly improved image characteristics over standard baseline B-mode images and those processed with Pizurica's method. Furthermore, it yielded results similar to those for SRI for breast and thyroid images, and significantly better results than SRI for liver imaging, thus improving diagnostic accuracy in both superficial and in-depth structures. Conclusions: A
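The Speckle Suppression Index used above is commonly defined as the ratio of the coefficient of variation after filtering to that before filtering (this sketch assumes that definition; constant-intensity synthetic data, Gamma speckle, and a box filter stand in for the real images and the EFCM denoiser):

```python
import numpy as np

rng = np.random.default_rng(5)
clean = np.full((128, 128), 100.0)
# Fully developed 4-look speckle: multiplicative Gamma noise with unit mean
noisy = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)

# A simple 3x3 box filter stands in for the despeckling step under evaluation
pad = np.pad(noisy, 1, mode="edge")
filtered = sum(pad[i:i + 128, j:j + 128]
               for i in range(3) for j in range(3)) / 9.0

def ssi(orig, filt):
    # Speckle Suppression Index: CV(filtered) / CV(original);
    # values below 1 indicate speckle suppression
    return (filt.std() / filt.mean()) / (orig.std() / orig.mean())
```

Because SSI alone rewards aggressive blurring, it is paired in the study with PSNR and an edge preservation index, which penalize the resolution loss a plain box filter would cause.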
Optoelectronic imaging of speckle using image processing method
Wang, Jinjiang; Wang, Pengfei
2018-01-01
A detailed image processing treatment of laser speckle interferometry is proposed as an example for a postgraduate course. Several image processing methods are used together in the optoelectronic imaging system: partial differential equations (PDEs) reduce the effect of noise; thresholding segmentation is likewise based on the heat equation with PDEs; the central line is extracted from the image skeleton, with branches removed automatically; the phase level is calculated by spline interpolation; and the fringe phase is then unwrapped. Finally, the image processing method was used to automatically measure a bubble in rubber under negative pressure, which could be applied in tire inspection.
Estimation of vessel diameter and blood flow dynamics from laser speckle images
DEFF Research Database (Denmark)
Postnov, Dmitry D.; Tuchin, Valery V.; Sosnovtseva, Olga
2016-01-01
Laser speckle imaging is a rapidly developing method for studying changes of blood velocity in vascular networks. However, to assess blood flow and vascular responses it is crucial to measure vessel diameter in addition to blood velocity dynamics. We suggest an algorithm that allows for dynamical...
Humeau-Heurtier, Anne; Marche, Pauline; Dubois, Severine; Mahe, Guillaume
2015-01-01
Laser speckle contrast imaging (LSCI) is a full-field imaging modality for monitoring microvascular blood flow. It is able to give images with high temporal and spatial resolution. However, when the skin is studied, the interpretation of the two-dimensional data may be difficult. This is why the perfusion values in regions of interest are often averaged and the result followed in time, reducing the data to one-dimensional time series. In order to avoid such a procedure (which leads to a loss of spatial resolution), we propose to extract patterns from LSCI data and to compare these patterns for two physiological states in healthy subjects: at rest and at the peak of acetylcholine-induced perfusion. For this purpose, the recent multi-dimensional complete ensemble empirical mode decomposition with adaptive noise (MCEEMDAN) algorithm is applied to LSCI data. The results show that the intrinsic mode functions and residue given by MCEEMDAN show different patterns for the two physiological states. The images, as two-dimensional data, can therefore be processed to reveal microvascular perfusion patterns hidden in the images themselves. This work is therefore a feasibility study before analyzing data from patients with microvascular dysfunction.
Directory of Open Access Journals (Sweden)
Mohamed Yaseen Jabarulla
2018-05-01
Full Text Available Ultrasound images are corrupted by multiplicative noise known as speckle, which reduces the effectiveness of image processing and hampers interpretation. This paper proposes a multiplicative speckle suppression technique for ultrasound liver images, based on a signal reconstruction model known as sparse representation (SR) over dictionary learning. In the proposed technique, the non-uniform multiplicative noise is first converted into additive noise using an enhanced homomorphic filter. This is followed by pixel-based total variation (TV) regularization and patch-based SR over a dictionary trained using K-singular value decomposition (KSVD). Finally, the split Bregman algorithm is used to solve the optimization problem and estimate the de-speckled image. In simulations performed on both synthetic and clinical ultrasound images, the proposed technique achieved peak signal-to-noise ratios of 35.537 dB for the dictionary trained on noisy image patches and 35.033 dB for the dictionary trained on a set of reference ultrasound image patches. Further, the evaluation results show that the proposed method performs better than other state-of-the-art denoising algorithms in terms of both peak signal-to-noise ratio and subjective visual quality assessment.
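The homomorphic step is the part of the pipeline that is easy to sketch: taking the logarithm turns multiplicative speckle into additive noise, after which any additive denoiser applies. In this toy sketch a box filter stands in for the paper's TV regularization and KSVD-trained sparse coding, and the ramp image and Gamma speckle are synthetic assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)
# Smooth intensity ramp corrupted by multiplicative speckle
clean = np.outer(np.linspace(50.0, 150.0, 64), np.ones(64))
speckled = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)

# Homomorphic step: the log turns multiplicative speckle into additive noise
log_img = np.log(speckled)

# Any additive denoiser may now be applied; a 5x5 box filter stands in here
pad = np.pad(log_img, 2, mode="edge")
smooth = sum(pad[i:i + 64, j:j + 64]
             for i in range(5) for j in range(5)) / 25.0

# Back to the intensity domain (a mean-bias correction is often added; omitted)
despeckled = np.exp(smooth)
```

The exponentiation reintroduces a small multiplicative bias (the mean of log-speckle is not zero), which practical homomorphic filters correct for.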
Laser speckle imaging based on photothermally driven convection
Regan, Caitlin; Choi, Bernard
2016-02-01
Laser speckle imaging (LSI) is an interferometric technique that provides information about the relative speed of moving scatterers in a sample. Photothermal LSI overcomes limitations in depth resolution faced by conventional LSI by incorporating an excitation pulse to target absorption by hemoglobin within the vascular network. Here we present results from experiments designed to determine the mechanism by which photothermal LSI decreases speckle contrast. We measured the impact of mechanical properties on speckle contrast, as well as the spatiotemporal temperature dynamics and bulk convective motion occurring during photothermal LSI. Our collective data strongly support the hypothesis that photothermal LSI achieves a transient reduction in speckle contrast due to bulk motion associated with thermally driven convection. The ability of photothermal LSI to image structures below a scattering medium may have important preclinical and clinical applications.
Statistical characterization of speckle noise in coherent imaging systems
Yaroslavsky, Leonid; Shefler, A.
2003-05-01
Speckle noise imposes a fundamental limitation on image quality in coherent-radiation-based imaging and optical metrology systems. Speckle noise phenomena are associated with the property of objects to diffusely scatter irradiation, and with the fact that in recording the wave field a number of signal distortions inevitably occur due to technical limitations inherent to hologram sensors. The statistical theory of speckle noise was developed with regard only to the limited resolving power of coherent imaging devices. It is valid only asymptotically, insofar as the central limit theorem of probability theory can be applied; in applications this assumption does not always hold. Moreover, in treating the speckle noise problem one should also consider other sources of hologram deterioration. In this paper, statistical properties of speckle due to the limitation of hologram size, dynamic range, and hologram signal quantization are studied by Monte Carlo simulation for holograms recorded in the near and far diffraction zones. The simulation experiments show that, for limited resolving power of the imaging system, the widely accepted opinion that speckle contrast is equal to one holds only for rather severe hologram size limitation. For moderate limitations, speckle contrast changes gradually from zero (no limitation) to one (limitation to less than about 20% of the hologram size). The results obtained for the limitation of the hologram sensor's dynamic range and hologram signal quantization reveal that speckle noise due to these hologram signal distortions is not multiplicative and is directly associated with the severity of the limitation and quantization. On the basis of the simulation results, analytical models are suggested.
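The baseline for such Monte Carlo studies is the random-phasor-sum model of fully developed speckle, whose intensity contrast tends to one under the central limit theorem; averaging independent realizations (a crude stand-in for the resolution and aperture effects studied in the paper) lowers the measured contrast. A minimal simulation:

```python
import numpy as np

rng = np.random.default_rng(7)
n_trials, n_phasors = 20000, 200

# Fully developed speckle: sum of many unit phasors with uniform random phases
phases = rng.uniform(0.0, 2.0 * np.pi, size=(n_trials, n_phasors))
I = np.abs(np.exp(1j * phases).sum(axis=1)) ** 2

# Speckle contrast C = std(I) / mean(I); -> 1 for fully developed speckle
C_full = I.std() / I.mean()

# Averaging N independent realizations reduces the contrast to 1/sqrt(N);
# here N = 4, so C drops to about 0.5
I4 = I.reshape(-1, 4).mean(axis=1)
C_avg = I4.std() / I4.mean()
```

The paper's point is that realistic hologram-size, dynamic-range, and quantization limits move the statistics away from this idealized unit-contrast regime.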
Chen, Zhenning; Shao, Xinxing; Xu, Xiangyang; He, Xiaoyuan
2018-02-01
The performance of digital image correlation (DIC), which has been widely used for noncontact deformation measurement in both scientific and engineering fields, is greatly affected by the quality of the speckle patterns. This study was concerned with the optimization of the digital speckle pattern (DSP) for DIC, in consideration of both accuracy and efficiency. The root-mean-square error of the inverse compositional Gauss-Newton algorithm and the average number of iterations were used as quality metrics. Moreover, the influence of subset size and image noise level, the basic parameters in the quality assessment formulations, was also considered. The simulated binary speckle patterns were first compared with Gaussian speckle patterns and captured DSPs. Both single-radius and multi-radius DSPs were optimized. Experimental tests and analyses were conducted to obtain the optimized and recommended DSP. The vector diagram of the optimized speckle pattern was also uploaded as a reference.
Detection of white spot lesions by segmenting laser speckle images using computer vision methods.
Gavinho, Luciano G; Araujo, Sidnei A; Bussadori, Sandra K; Silva, João V P; Deana, Alessandro M
2018-05-05
This paper aims to develop a method for laser speckle image segmentation of tooth surfaces for the diagnosis of early-stage caries. The method, applied directly to a raw image obtained by digital photography, is based on the difference between the speckle pattern of a tooth surface area with a carious lesion and that of a sound area. Each image is divided into blocks, which are identified in a working matrix by the χ² distances between the block histograms of the analyzed image and reference histograms previously obtained by K-means from healthy (h_Sound) and lesioned (h_Decay) areas, separately. If the χ² distance between a block histogram and h_Sound is greater than its distance to h_Decay, the block is marked as decayed. The experiments showed that the method can provide effective segmentation of initial lesions. We used 64 images to test the algorithm and achieved 100% accuracy in segmentation. Differences between the speckle pattern of a sound tooth surface region and a carious region, even at an early stage, can be evidenced by the χ² distance between histograms. The method enhances the contrast between sound and lesioned tissues, and the results were obtained at low computational cost. The method has the potential for early diagnosis in a clinical environment through the development of low-cost portable equipment.
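The block classification rule can be sketched directly. Beta-distributed toy intensities stand in for the sound and lesioned speckle statistics, and the reference histograms are built from labeled samples rather than the paper's K-means step:

```python
import numpy as np

rng = np.random.default_rng(8)
bins = np.linspace(0.0, 1.0, 17)         # 16-bin intensity histograms

def hist(pixels):
    return np.histogram(pixels, bins=bins, density=True)[0]

def chi2(h1, h2, eps=1e-10):
    # Symmetric chi-squared distance between two histograms
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

# Reference histograms; in the paper these come from K-means over known
# sound/decayed areas -- Beta-distributed toy speckle stands in here
h_sound = hist(rng.beta(2.0, 5.0, 4000))   # darker speckle statistics
h_decay = hist(rng.beta(5.0, 2.0, 4000))   # brighter speckle statistics

def classify(block):
    # Mark a block as decayed when its histogram is closer to h_Decay
    hb = hist(block)
    return "decay" if chi2(hb, h_decay) < chi2(hb, h_sound) else "sound"

label_a = classify(rng.beta(5.0, 2.0, 256))   # lesion-like block
label_b = classify(rng.beta(2.0, 5.0, 256))   # sound-like block
```

The block size controls the trade-off between histogram stability and segmentation granularity, which is why it matters for detecting small initial lesions.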
Speckle Imaging of Binary Stars with Large-Format CCDs
Horch, E.; Ninkov, Z.; Slawson, R. W.; van Altena, W. F.; Meyer, R. D.; Girard, T. M.
1997-12-01
In the past, bare (unintensified) CCDs have not been widely used in speckle imaging for two main reasons: 1) the readout rate of most scientific-grade CCDs is too slow to be able to observe at the high frame rates necessary to capture speckle patterns efficiently, and 2) the read noise of CCDs limits the detectability of fainter objects where it becomes difficult to distinguish between speckles and noise peaks in the image. These facts have led to the current supremacy of intensified imaging systems (such as intensified-CCDs) in this field, which can typically be read out at video rates or faster. We have developed a new approach that uses a large format CCD not only to detect the incident photons but also to record many speckle patterns before the chip is read out. This approach effectively uses the large area of the CCD as a physical ``memory cache'' of previous speckle data frames. The method is described, and binary star observations from the University of Toronto Southern Observatory 60-cm telescope and the Wisconsin-Indiana-Yale-NOAO (WIYN) 3.5-m telescope are presented. Plans for future observing and instrumentation improvements are also outlined.
Laser Speckle Contrast Imaging: theory, instrumentation and applications.
Senarathna, Janaka; Rege, Abhishek; Li, Nan; Thakor, Nitish V
2013-01-01
Laser Speckle Contrast Imaging (LSCI) is a wide-field, non-scanning optical technique for observing blood flow. Speckles are produced when coherent light scattered back from biological tissue is diffracted through the limiting aperture of the focusing optics. Mobile scatterers cause the speckle pattern to blur; a model can be constructed by inversely relating the degree of blur, termed speckle contrast, to the scatterer speed. In tissue, red blood cells are the main source of moving scatterers. Therefore, blood flow acts as a virtual contrast agent, outlining blood vessels. The spatial resolution (~10 μm) and temporal resolution (10 ms to 10 s) of LSCI can be tailored to the application. Restricted by the penetration depth of light, LSCI can only visualize superficial blood flow. Additionally, due to its non-scanning nature, LSCI is unable to provide depth-resolved images. The simple setup and independence from exogenous contrast agents have made LSCI a popular tool for studying vascular structure and blood flow dynamics. We discuss the theory and practice of LSCI and critically analyze its merit in major areas of application such as retinal imaging, imaging of skin perfusion, and imaging of neurophysiology.
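The basic LSCI statistic, the local contrast K = σ/μ over a small sliding window, can be sketched as follows. The "flowing" case is approximated here simply by averaging several independent speckle realizations (mimicking motion blur within the exposure); the window size and frame count are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(9)

def speckle_contrast_map(I, w=7):
    # Local K = sigma / mean in a w x w window (the basic LSCI statistic)
    pad = np.pad(I, w // 2, mode="reflect")
    H, W = I.shape
    s = np.zeros_like(I)
    s2 = np.zeros_like(I)
    for i in range(w):
        for j in range(w):
            win = pad[i:i + H, j:j + W]
            s += win
            s2 += win ** 2
    n = w * w
    mean = s / n
    var = np.maximum(s2 / n - mean ** 2, 0.0)
    return np.sqrt(var) / (mean + 1e-12)

# Static speckle keeps full contrast; motion blurs it, lowering K
static = rng.exponential(1.0, (64, 64))
flowing = np.mean(rng.exponential(1.0, (10, 64, 64)), axis=0)  # 10-frame blur
K_static = speckle_contrast_map(static).mean()
K_flow = speckle_contrast_map(flowing).mean()
```

Low K therefore maps to fast-moving scatterers, which is how vessels appear outlined against static tissue in an LSCI contrast map.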
Yilmaz, Hasan
2016-03-01
Structured illumination enables high-resolution fluorescence imaging of nanostructures [1]. We demonstrate a new high-resolution fluorescence imaging method that uses a scattering layer with a high-index substrate as a solid immersion lens [2]. Random scattering of coherent light creates a speckle pattern with a very fine structure that illuminates the fluorescent nanospheres on the back surface of the high-index substrate. The speckle pattern is raster-scanned over the fluorescent nanospheres using a speckle correlation effect known as the optical memory effect. A series of standard-resolution fluorescence images for each speckle pattern displacement is recorded by an electron-multiplying CCD camera using a commercial microscope objective. We have developed a new phase-retrieval algorithm to reconstruct a high-resolution, wide-field image from several standard-resolution wide-field images. We introduce the phase information of the Fourier components of the standard-resolution images as a new constraint in our algorithm, which discards ambiguities and therefore ensures convergence to a unique solution. We demonstrate two-dimensional fluorescence images of a collection of nanospheres with a deconvolved Abbe resolution of 116 nm and a field of view of 10 µm × 10 µm. Our method is robust against optical aberrations and stage drift, and is therefore excellent for imaging nanostructures under ambient conditions. [1] M. G. L. Gustafsson, J. Microsc. 198, 82-87 (2000). [2] H. Yilmaz, E. G. van Putten, J. Bertolotti, A. Lagendijk, W. L. Vos, and A. P. Mosk, Optica 2, 424-429 (2015).
Li, Biyuan; Tang, Chen; Gao, Guannan; Chen, Mingming; Tang, Shuwei; Lei, Zhenkun
2017-06-01
Filtering speckle noise from a fringe image is one of the key tasks in electronic speckle pattern interferometry (ESPI). In general, ESPI fringe images can be divided into three categories: low-density fringe images, high-density fringe images, and variable-density fringe images. In this paper, we first present a general filtering method based on variational image decomposition that can filter speckle noise from ESPI fringe images of all these densities. In our method, a variable-density ESPI fringe image is decomposed into low-density fringes, high-density fringes, and noise; a low-density fringe image into low-density fringes and noise; and a high-density fringe image into high-density fringes and noise. We give suitable function spaces to describe low-density fringes, high-density fringes, and noise, respectively. We then construct several models and numerical algorithms for ESPI fringe images of the various densities, and investigate the performance of these models in extensive experiments. Finally, we compare our proposed models with the windowed Fourier transform method and the coherence-enhancing diffusion partial differential equation filter, which may be the most effective filtering methods at present. Furthermore, we use the proposed method to filter a collection of experimentally obtained ESPI fringe images of poor quality. The experimental results demonstrate the performance of our proposed method.
Laser speckle imaging: a novel method for detecting dental erosion.
Directory of Open Access Journals (Sweden)
Nelson H Koshoji
Full Text Available Erosion is a highly prevalent condition known as a non-carious lesion that causes progressive tooth wear through chemical processes that do not involve the action of bacteria. Speckle images have proved sensitive to even minimal mineral loss from the enamel. The aim of the present study was to investigate the use of laser speckle imaging analysis in the spatial domain to quantify shifts in the microstructure of the tooth surface in an erosion model. Thirty-two fragments of the vestibular surface of bovine incisors were divided into four groups (10 min, 20 min, 30 min, and 40 min of acid etching) and immersed in a cola-based beverage (pH approximately 2.5) twice a day for 7 days to create artificial erosion. By analyzing the laser speckle contrast map (LASCA) in the eroded region compared to the sound region, it was observed that the LASCA map shifts in proportion to the acid etch duration: by 18%, 23%, 39%, and 44% for the 10 min, 20 min, 30 min, and 40 min groups, respectively. To the best of our knowledge, this is the first study to demonstrate the correlation between speckle patterns and erosion progression.
Directory of Open Access Journals (Sweden)
M. Mahdian
2013-09-01
Full Text Available In recent years, the use of Polarimetric Synthetic Aperture Radar (PolSAR) data in different applications has increased dramatically. SAR imagery contains an interference phenomenon with random behavior called speckle noise. Speckle, which can be considered a multiplicative noise affecting all coherent imaging systems, complicates the interpretation of the data. Indeed, speckle degrades the radiometric resolution of PolSAR images, so speckle filtering must be performed on this data type. Markov Random Fields (MRF) have proven to be a powerful method for extracting contextual information from remotely sensed images. In the present paper, a probability density function (PDF) that fits the PolSAR data well, according to a goodness-of-fit test, is first obtained for the pixel-wise analysis. Then contextual smoothing is achieved with the MRF method. This novel speckle reduction method combines an advanced statistical distribution with spatial contextual information for PolSAR data. The two sources of information are combined by a weighted summation of the pixel-wise and contextual models. The approach not only preserves edge information in the images but also improves the signal-to-noise ratio of the results. The method maintains the mean value of the original signal in homogeneous areas and preserves the edges of features in heterogeneous regions. Experiments on real medium-resolution ALOS data from Tehran, and on high-resolution full-polarimetric SAR data over the Oberpfaffenhofen test site in Germany, demonstrate the effectiveness of the algorithm compared with well-known despeckling methods.
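The weighted summation of pixel-wise and contextual models can be sketched with a toy smoothing step in which each pixel's own value (the pixel-wise term) is blended with its 4-neighbourhood mean (a simple MRF-style contextual term). The function name, the neighbourhood choice, and the omission of the fitted statistical PDF are all simplifying assumptions for illustration.

```python
import numpy as np

def mrf_despeckle(img, weight=0.6, n_iter=5):
    """Toy contextual smoothing: weighted sum of each pixel's own value
    (pixel-wise term) and its 4-neighbourhood mean (contextual MRF-style
    term). `weight` balances the two terms, loosely mirroring the paper's
    weighted summation; the advanced PDF model is omitted for brevity."""
    out = img.astype(float).copy()
    for _ in range(n_iter):
        padded = np.pad(out, 1, mode='edge')
        neigh = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                 padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
        out = weight * out + (1 - weight) * neigh
    return out
```

Because the pixel-wise term keeps a large share of the original value, sharp transitions are damped less aggressively than with pure neighbourhood averaging, which is the intuition behind edge preservation in the paper's scheme.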
Correcting for motion artifact in handheld laser speckle images
Lertsakdadet, Ben; Yang, Bruce Y.; Dunn, Cody E.; Ponticorvo, Adrien; Crouzet, Christian; Bernal, Nicole; Durkin, Anthony J.; Choi, Bernard
2018-03-01
Laser speckle imaging (LSI) is a wide-field optical technique that enables superficial blood flow quantification. LSI is normally performed in a mounted configuration to decrease the likelihood of motion artifact. However, mounted LSI systems are cumbersome and difficult to transport quickly in a clinical setting, where portability is essential for bedside patient care. To address this issue, we created a handheld LSI device using scientific-grade components. To account for motion artifact of the LSI device in a handheld setup, we incorporated a fiducial marker (FM) into our imaging protocol and determined the difference between the highest and lowest FM speckle contrast values within each data set (Kbest and Kworst). The difference between Kbest and Kworst in the mounted and handheld setups was 8% and 52%, respectively, reinforcing the need for motion artifact quantification. When using a threshold FM speckle contrast value (KFM) to identify a subset of images with an acceptable level of motion artifact, mounted and handheld LSI measurements of the speckle contrast of a flow region (KFLOW) in in vitro flow phantom experiments differed by 8%. Without the FM, mounted and handheld KFLOW values differed by 20%. To further validate our handheld LSI device, we compared mounted and handheld data from an in vivo porcine burn model of superficial and full-thickness burns. The speckle contrast within the burn region (KBURN) was comparable between the mounted and handheld LSI data for both burn types. Collectively, our results suggest the potential of handheld LSI with an FM as a suitable alternative to mounted LSI, especially in challenging clinical settings with space limitations such as the intensive care unit.
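The speckle contrast values (K) discussed above are conventionally defined as the ratio of the local standard deviation to the local mean of intensity in a sliding window. A minimal sketch of that computation, assuming a square window and box-filter statistics via integral images (not the authors' implementation):

```python
import numpy as np

def speckle_contrast(img, win=7):
    """Local speckle contrast K = sigma / mean over a sliding `win` x `win`
    window, computed with integral images (prefix sums). Lower K means more
    blurring of the speckle by moving scatterers, i.e. higher flow."""
    img = img.astype(float)

    def box_mean(a):
        pad = win // 2
        p = np.pad(a, pad, mode='reflect')
        c = np.cumsum(np.cumsum(p, axis=0), axis=1)
        c = np.pad(c, ((1, 0), (1, 0)))   # zero row/col so sums start at 0
        s = (c[win:, win:] - c[:-win, win:]
             - c[win:, :-win] + c[:-win, :-win])
        return s / win**2

    m = box_mean(img)
    v = box_mean(img**2) - m**2           # local variance
    return np.sqrt(np.clip(v, 0, None)) / np.clip(m, 1e-12, None)
```

For fully developed speckle from a static target K approaches 1, while a perfectly uniform (fully blurred) region gives K near 0, which is what makes K a flow-sensitive quantity.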
Zhang, Xuming; Li, Liu; Zhu, Fei; Hou, Wenguang; Chen, Xinjian
2014-06-01
Optical coherence tomography (OCT) images are usually degraded by significant speckle noise, which will strongly hamper their quantitative analysis. However, speckle noise reduction in OCT images is particularly challenging because of the difficulty in differentiating between noise and the information components of the speckle pattern. To address this problem, the spiking cortical model (SCM)-based nonlocal means method is presented. The proposed method explores self-similarities of OCT images based on rotation-invariant features of image patches extracted by SCM and then restores the speckled images by averaging the similar patches. This method can provide sufficient speckle reduction while preserving image details very well due to its effectiveness in finding reliable similar patches under high speckle noise contamination. When applied to the retinal OCT image, this method provides signal-to-noise ratio improvements of >16 dB with a small 5.4% loss of similarity.
Speckle imaging with the PAPA detector. [Precision Analog Photon Address
Papaliolios, C.; Nisenson, P.; Ebstein, S.
1985-01-01
A new 2-D photon-counting camera, the PAPA (precision analog photon address) detector, has been built, tested, and used successfully for the acquisition of speckle imaging data. The camera has 512 x 512 pixels and operates at count rates of at least 200,000/sec. In this paper, technical details of the camera are presented, along with laboratory and astronomical results that demonstrate the detector's capabilities.
Rat retinal vasomotion assessed by laser speckle imaging
DEFF Research Database (Denmark)
Neganova, Anastasiia Y; Postnov, Dmitry D; Sosnovtseva, Olga
2017-01-01
Vasomotion is spontaneous or induced rhythmic changes in vascular tone or vessel diameter that lead to rhythmic changes in flow. While the vascular research community debates the physiological and pathophysiological consequence of vasomotion, there is a great need for experimental techniques...... that can address the role and dynamical properties of vasomotion in vivo. We apply laser speckle imaging to study spontaneous and drug induced vasomotion in retinal network of anesthetized rats. The results reveal a wide variety of dynamical patterns. Wavelet-based analysis shows that (i) spontaneous...
Evaluation of digital image correlation techniques using realistic ground truth speckle images
International Nuclear Information System (INIS)
Cofaru, C; Philips, W; Van Paepegem, W
2010-01-01
Digital image correlation (DIC) has been acknowledged and widely used in recent years in the field of experimental mechanics as a contactless method for determining full-field displacements and strains. Even though several sub-pixel motion estimation algorithms have been proposed in the literature, little is known about their accuracy and limitations in reproducing the complex underlying motion fields occurring in real mechanical tests. This paper presents a new method for evaluating sub-pixel motion estimation algorithms using ground truth speckle images that are realistically warped with artificial motion fields obtained by two distinct approaches: in the first, the horizontal and vertical displacement fields are created according to theoretical formulas for the given type of experiment, while the second constructs the displacements through radial basis function interpolation starting from real DIC results. The method is applied in the evaluation of five DIC algorithms, with results indicating that the gradient-based DIC methods generally have a quality advantage when using small block sizes and are a better choice for calculating very small displacements and strains. The Newton–Raphson method is the overall best performer, with a notable quality advantage when large block sizes are employed and in experiments where large strain fields are of interest.
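The block-based matching that underlies DIC can be sketched with an integer-pixel zero-normalized cross-correlation (ZNCC) search; the sub-pixel algorithms evaluated in the paper (e.g. Newton-Raphson) refine such a coarse estimate. Function and parameter names here are illustrative, not the paper's code:

```python
import numpy as np

def dic_match(ref, cur, top, left, block=16, search=5):
    """Integer-pixel DIC: find the displacement of a reference subset by
    maximizing zero-normalized cross-correlation (ZNCC) over a small
    search window. Returns (u, v) displacement in pixels."""
    tmpl = ref[top:top + block, left:left + block].astype(float)
    tz = tmpl - tmpl.mean()
    best, best_uv = -2.0, (0, 0)
    for dv in range(-search, search + 1):
        for du in range(-search, search + 1):
            r, c = top + dv, left + du
            if r < 0 or c < 0 or r + block > cur.shape[0] or c + block > cur.shape[1]:
                continue                      # candidate window out of bounds
            win = cur[r:r + block, c:c + block].astype(float)
            wz = win - win.mean()
            denom = np.sqrt((tz**2).sum() * (wz**2).sum())
            if denom == 0:
                continue                      # flat window, ZNCC undefined
            zncc = (tz * wz).sum() / denom
            if zncc > best:
                best, best_uv = zncc, (du, dv)
    return best_uv
```

ZNCC is invariant to local brightness and contrast changes, which is why it (rather than plain SSD) is the standard similarity measure in DIC.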
High Resolution Astrophysical Observations Using Speckle Imaging
1986-04-11
Appendices include "A New Optical Source Associated with T Tauri" (P. Nisenson, R. V. Stachnik, M. Karovska, and R. Noyes, Harvard-Smithsonian Center for Astrophysics) and "On the α Orionis Triple System" (M. Karovska, P. Nisenson, and R. Noyes, Harvard-Smithsonian Center for Astrophysics); the latter reports values between 3.5 and 4.0 at a wavelength of 530 nm, and notes that Karovska (1984) inferred the possible existence of a second companion from an image reconstruction.
A practical approach to optimizing the preparation of speckle patterns for digital-image correlation
International Nuclear Information System (INIS)
Lionello, Giacomo; Cristofolini, Luca
2014-01-01
The quality of strain measurements by digital image correlation (DIC) strongly depends on the quality of the pattern on the specimen's surface. An ideal pattern should be highly contrasted, stochastic, and isotropic. In addition, the speckles should have an average size that exceeds the image pixel size by a factor of 3–5 (smaller speckles cause poor contrast; larger speckles cause poor spatial resolution). Finally, the ideal pattern should have limited scatter in speckle sizes. The aims of this study were: (i) to define the ideal speckle size in relation to the specimen size and acquisition system; and (ii) to provide practical guidelines for identifying the optimal settings of an airbrush gun, in order to produce a pattern as close as possible to the desired one while minimizing the scatter of speckle sizes. Patterns of different sizes were produced using two different airbrush guns with different settings of the four most influential factors (dilution, airflow setting, spraying distance, and air pressure). A full-factorial DOE strategy was implemented to explore the four factors at two levels each: 36 specimens were analyzed for each of the 16 combinations. The images were acquired using the digital cameras of a DIC system. The distribution of speckle sizes was analyzed to calculate the average speckle size and the standard deviation of the corresponding truncated Gaussian distribution. A mathematical model was built to enable prediction of the average speckle size from the airbrush gun settings. We show that it is possible to obtain a pattern with a highly controlled average and a limited scatter of speckle sizes, so as to match the ideal distribution of speckle sizes for DIC. Although the settings identified here apply only to the specific equipment used, the method can be adapted to any airbrush to produce a desired speckle pattern. (technical design note)
Laser speckle contrast imaging using light field microscope approach
Ma, Xiaohui; Wang, Anting; Ma, Fenghua; Wang, Zi; Ming, Hai
2018-01-01
In this paper, a laser speckle contrast imaging (LSCI) system using a light field (LF) microscope approach is proposed. To our knowledge, this is the first time LSCI has been combined with LF imaging. To verify the idea, a prototype consisting of a modified LF microscope imaging system and an experimental device was built. A commercial Lytro camera was modified for microscope imaging. Hollow glass tubes fixed at different depths in a glass dish were used to simulate vessels in the brain and to test the performance of the system. Compared with conventional LSCI, our system provides three new functions: refocusing, extended depth of field (DOF), and acquisition of 3D information. Experiments show that the principle is feasible and the proposed system works well.
Integration of image exposure time into a modified laser speckle imaging method
Energy Technology Data Exchange (ETDEWEB)
RamIrez-San-Juan, J C; Salazar-Hermenegildo, N; Ramos-Garcia, R; Munoz-Lopez, J [Optics Department, INAOE, Puebla (Mexico); Huang, Y C [Department of Electrical Engineering and Computer Science, University of California, Irvine, CA (United States); Choi, B, E-mail: jcram@inaoep.m [Beckman Laser Institute and Medical Clinic, University of California, Irvine, CA (United States)
2010-11-21
Speckle-based methods have been developed to characterize tissue blood flow and perfusion. One such method, called modified laser speckle imaging (mLSI), enables computation of blood flow maps with relatively high spatial resolution. Although it is known that the sensitivity and noise in LSI measurements depend on image exposure time, a fundamental disadvantage of mLSI is that it does not take into account this parameter. In this work, we integrate the exposure time into the mLSI method and provide experimental support of our approach with measurements from an in vitro flow phantom.
Integration of image exposure time into a modified laser speckle imaging method
International Nuclear Information System (INIS)
RamIrez-San-Juan, J C; Salazar-Hermenegildo, N; Ramos-Garcia, R; Munoz-Lopez, J; Huang, Y C; Choi, B
2010-01-01
Speckle-based methods have been developed to characterize tissue blood flow and perfusion. One such method, called modified laser speckle imaging (mLSI), enables computation of blood flow maps with relatively high spatial resolution. Although it is known that the sensitivity and noise in LSI measurements depend on image exposure time, a fundamental disadvantage of mLSI is that it does not take into account this parameter. In this work, we integrate the exposure time into the mLSI method and provide experimental support of our approach with measurements from an in vitro flow phantom.
Nephron blood flow dynamics measured by laser speckle contrast imaging
DEFF Research Database (Denmark)
von Holstein-Rathlou, Niels-Henrik; Sosnovtseva, Olga V; Pavlov, Alexey N
2011-01-01
Tubuloglomerular feedback (TGF) has an important role in autoregulation of renal blood flow and glomerular filtration rate (GFR). Because of the characteristics of signal transmission in the feedback loop, the TGF undergoes self-sustained oscillations in single-nephron blood flow, GFR, and tubular...... simultaneously. The interacting nephron fields are likely to be more extensive. We have turned to laser speckle contrast imaging to measure the blood flow dynamics of 50-100 nephrons simultaneously on the renal surface of anesthetized rats. We report the application of this method and describe analytic...... pressure and flow. Nephrons interact by exchanging electrical signals conducted electrotonically through cells of the vascular wall, leading to synchronization of the TGF-mediated oscillations. Experimental studies of these interactions have been limited to observations on two or at most three nephrons...
Early diagnosis of teeth erosion using polarized laser speckle imaging
Nader, Christelle Abou; Pellen, Fabrice; Loutfi, Hadi; Mansour, Rassoul; Jeune, Bernard Le; Brun, Guy Le; Abboud, Marie
2016-07-01
Dental erosion starts with a chemical attack on dental tissue causing tooth demineralization, altering the tooth structure and making it more sensitive to mechanical erosion. Medical diagnosis of dental erosion is commonly achieved through a visual inspection by the dentist during dental checkups and is therefore highly dependent on the operator's experience. The detection of this disease at preliminary stages is important since, once the damage is done, care becomes more complicated. We investigate the difference in light-scattering properties between healthy and eroded teeth. A change in light-scattering properties is observed and a transition from volume to surface backscattering is detected by means of polarized laser speckle imaging as teeth undergo acid etching, suggesting an increase in enamel surface roughness.
Simulations of multi-contrast x-ray imaging using near-field speckles
Energy Technology Data Exchange (ETDEWEB)
Zdora, Marie-Christine [Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany); Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire, OX11 0DE, United Kingdom and Department of Physics & Astronomy, University College London, London, WC1E 6BT (United Kingdom); Thibault, Pierre [Department of Physics & Astronomy, University College London, London, WC1E 6BT (United Kingdom); Herzen, Julia; Pfeiffer, Franz [Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany); Zanette, Irene [Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire, OX11 0DE (United Kingdom); Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany)
2016-01-28
X-ray dark-field and phase-contrast imaging using near-field speckles is a novel technique that overcomes limitations inherent in conventional absorption x-ray imaging, i.e. poor contrast for features with similar density. Speckle-based imaging yields a wealth of information with a simple setup tolerant to polychromatic and divergent beams, and simple data acquisition and analysis procedures. Here, we present a simulation software used to model the image formation with the speckle-based technique, and we compare simulated results on a phantom sample with experimental synchrotron data. Thorough simulation of a speckle-based imaging experiment will help for better understanding and optimising the technique itself.
Law, Yuen C.; Tenbrinck, Daniel; Jiang, Xiaoyi; Kuhlen, Torsten
2014-03-01
Computer-assisted processing and interpretation of medical ultrasound images is one of the most challenging tasks within image analysis. Physical phenomena in ultrasonographic images, e.g., the characteristic speckle noise and shadowing effects, make the majority of standard image analysis methods suboptimal. Furthermore, validation of adapted computer vision methods proves difficult due to missing ground truth information. There is no widely accepted software phantom in the community, and existing software phantoms are not flexible enough to support the use of specific speckle models for different tissue types, e.g., muscle and fat tissue. In this work we propose an anatomical software phantom with a realistic speckle pattern simulation to fill this gap and provide a flexible tool for validation purposes in medical ultrasound image analysis. We discuss the generation of speckle patterns and perform statistical analysis of the simulated textures to obtain quantitative measures of the realism and accuracy of the resulting textures.
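A minimal fully-developed-speckle simulation, of the kind such a phantom might build on, multiplies an echogenicity map by gamma-distributed noise with unit mean (L-look speckle; L=1 gives exponential intensity statistics). This is a generic textbook model, not the paper's tissue-specific speckle models:

```python
import numpy as np

def add_speckle(img, looks=4, seed=0):
    """Multiplicative speckle simulation: fully developed speckle intensity
    follows a gamma distribution with shape `looks` and unit mean (L-look
    averaging); looks=1 gives exponential (Rayleigh-amplitude) speckle.
    A tissue-specific phantom would vary `looks` per region."""
    rng = np.random.default_rng(seed)
    noise = rng.gamma(shape=looks, scale=1.0 / looks, size=img.shape)
    return img * noise
```

Because the noise has unit mean, the expected intensity of each region is preserved while its texture acquires the characteristic speckle granularity.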
M2 FILTER FOR SPECKLE NOISE SUPPRESSION IN BREAST ULTRASOUND IMAGES
Directory of Open Access Journals (Sweden)
E.S. Samundeeswari
2016-11-01
Full Text Available Breast cancer, commonly found in women, is a serious life-threatening disease due to its invasive nature. Ultrasound (US) imaging plays an effective role in screening, early detection, and diagnosis of breast cancer. Speckle noise generally affects medical ultrasound images and causes a number of difficulties in identifying the region of interest. Suppressing speckle noise is a challenging task, as it can destroy fine edge details. No specific filter has yet been designed to produce a noise-free BUS image contaminated by speckle noise. In this paper the M2 filter, a novel hybrid of linear and nonlinear filters, is proposed and compared to other spatial filters with a 3×3 kernel size. The performance of the proposed M2 filter is measured by statistical quantity parameters such as MSE, PSNR, and SSI. The experimental analysis clearly shows that the proposed M2 filter outperforms other spatial filters, achieving about 2% higher PSNR values with regard to speckle suppression.
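The abstract does not define the M2 filter itself, so as a hedged illustration of the linear/nonlinear hybrid idea, the sketch below simply blends a 3×3 mean filter with a 3×3 median filter; the blending weight and the function name are assumptions, not the published filter:

```python
import numpy as np

def hybrid_mean_median(img, alpha=0.5):
    """Illustrative hybrid of a 3x3 mean (linear) and 3x3 median (nonlinear)
    filter, blended by `alpha`. The mean term smooths speckle; the median
    term resists impulsive outliers and preserves edges better."""
    h, w = img.shape
    p = np.pad(img.astype(float), 1, mode='edge')
    # Stack the 9 shifted views of the 3x3 neighbourhood: shape (9, h, w).
    stack = np.stack([p[r:r + h, c:c + w]
                      for r in range(3) for c in range(3)])
    return alpha * stack.mean(axis=0) + (1 - alpha) * np.median(stack, axis=0)
```

Tuning `alpha` trades smoothing strength (mean-dominated) against edge preservation (median-dominated), which is the design trade-off such hybrid filters target.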
Modeling laser speckle imaging of perfusion in the skin (Conference Presentation)
Regan, Caitlin; Hayakawa, Carole K.; Choi, Bernard
2016-02-01
Laser speckle imaging (LSI) enables visualization of relative blood flow and perfusion in the skin. It is frequently applied to monitor treatment of vascular malformations such as port wine stain birthmarks, and measure changes in perfusion due to peripheral vascular disease. We developed a computational Monte Carlo simulation of laser speckle contrast imaging to quantify how tissue optical properties, blood vessel depths and speeds, and tissue perfusion affect speckle contrast values originating from coherent excitation. The simulated tissue geometry consisted of multiple layers to simulate the skin, or incorporated an inclusion such as a vessel or tumor at different depths. Our simulation used a 30x30mm uniform flat light source to optically excite the region of interest in our sample to better mimic wide-field imaging. We used our model to simulate how dynamically scattered photons from a buried blood vessel affect speckle contrast at different lateral distances (0-1mm) away from the vessel, and how these speckle contrast changes vary with depth (0-1mm) and flow speed (0-10mm/s). We applied the model to simulate perfusion in the skin, and observed how different optical properties, such as epidermal melanin concentration (1%-50%) affected speckle contrast. We simulated perfusion during a systolic forearm occlusion and found that contrast decreased by 35% (exposure time = 10ms). Monte Carlo simulations of laser speckle contrast give us a tool to quantify what regions of the skin are probed with laser speckle imaging, and measure how the tissue optical properties and blood flow affect the resulting images.
DEFF Research Database (Denmark)
Mogensen, Mette; Jørgensen, Thomas Martini; Thrane, Lars
2010-01-01
suggests a method for improving OCT image quality for skin cancer imaging. EXPERIMENTAL DESIGN: OCT is an optical imaging method analogous to ultrasound. Two basal cell carcinomas (BCC) were imaged using an OCT speckle reduction technique (SR-OCT) based on repeated scanning by altering the distance between...
A PHOTOMETRIC ANALYSIS OF SEVENTEEN BINARY STARS USING SPECKLE IMAGING
International Nuclear Information System (INIS)
Davidson, James W.; Baptista, Brian J.; Horch, Elliott P.; Franz, Otto; Van Altena, William F.
2009-01-01
Magnitude differences obtained from speckle imaging are used in combination with other data in the literature to place the components of binary star systems on the H-R diagram. Isochrones are compared with the positions obtained, and a best-fit isochrone is determined for each system, yielding both masses of the components as well as an age range consistent with the system parameters. Seventeen systems are studied, 12 of which were observed with the 0.6 m Lowell-Tololo Telescope at Cerro Tololo Inter-American Observatory and six of which were observed with the WIYN 3.5 m Telescope (The WIYN Observatory is a joint facility of the University of Wisconsin-Madison, Indiana University, Yale University, and the National Optical Astronomy Observatories) at Kitt Peak. One system was observed from both sites. In comparing photometric masses to mass information from orbit determinations, we find that the photometric masses agree very well with the dynamical masses, and are generally more precise. For three systems, no dynamical masses exist at present, and therefore the photometrically determined values are the first mass estimates derived for these components.
Directory of Open Access Journals (Sweden)
Zhangfang Hu
2014-10-01
Full Text Available Digital speckle correlation is a non-contact, in-plane displacement measurement method based on machine vision. Motivated by the low accuracy and heavy computational load of the traditional spatial-domain digital speckle correlation method, we introduce a sub-pixel displacement measurement algorithm that employs a fast interpolation method based on fractal theory together with digital speckle correlation in the frequency domain. The algorithm overcomes both the blocking effect and the blurring caused by traditional interpolation methods, and the frequency-domain processing avoids the repeated searching of spatial-domain correlation recognition, so the amount of computation is greatly reduced and the information extraction speed is improved. A comparative experiment verifies that the proposed algorithm is effective.
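Frequency-domain correlation recognition avoids the exhaustive spatial search: a single FFT product yields the full cross-correlation surface, whose peak gives the integer-pixel displacement (the paper's fractal interpolation then refines this to sub-pixel accuracy). A minimal sketch with illustrative names:

```python
import numpy as np

def fft_displacement(ref, cur):
    """Integer-pixel displacement via cross-correlation in the frequency
    domain: ifft2(conj(F) * G) peaks at the shift of `cur` relative to
    `ref`, replacing an exhaustive spatial-domain search."""
    F = np.fft.fft2(ref)
    G = np.fft.fft2(cur)
    xc = np.fft.ifft2(np.conj(F) * G).real
    dv, du = np.unravel_index(np.argmax(xc), xc.shape)
    # Map wrapped (circular) peak indices to signed shifts.
    if dv > ref.shape[0] // 2:
        dv -= ref.shape[0]
    if du > ref.shape[1] // 2:
        du -= ref.shape[1]
    return du, dv
```

The FFT route costs O(N log N) per image pair regardless of the search range, which is the source of the speed-up the paper claims over repeated spatial searching.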
Three Dimensional Speckle Imaging Employing a Frequency-Locked Tunable Diode Laser
Energy Technology Data Exchange (ETDEWEB)
Cannon, Bret D.; Bernacki, Bruce E.; Schiffern, John T.; Mendoza, Albert
2015-09-01
We describe a high-accuracy frequency-stepping method for a tunable diode laser that improves a three-dimensional (3D) imaging approach based upon interferometric speckle imaging. The approach, modeled after Takeda, exploits tuning an illumination laser in frequency as speckle interferograms of the object (specklegrams) are acquired at each frequency in a Michelson interferometer. The resulting 3D hypercube of specklegrams encodes spatial information in the x-y plane of each image, with laser tuning arrayed along its z-axis. We present before-and-after laboratory data showing enhanced 3D imaging resulting from precise laser frequency control.
Speckle suppression via sparse representation for wide-field imaging through turbid media.
Jang, Hwanchol; Yoon, Changhyeong; Chung, Euiheon; Choi, Wonshik; Lee, Heung-No
2014-06-30
Speckle suppression is one of the most important tasks in image transmission through turbid media. Insufficient speckle suppression requires an additional procedure such as temporal ensemble averaging over multiple exposures. In this paper, we consider the image recovery process based on the so-called transmission matrix (TM) of turbid media for image transmission through the media. We show that the speckle left unremoved in TM-based image recovery can be suppressed effectively via sparse representation (SR). SR is a relatively new signal reconstruction framework that works well even for ill-conditioned problems. This is the first study to show the benefit of using SR compared with phase conjugation (PC), the de facto standard method to date for TM-based imaging through turbid media, including imaging a live cell through a tissue slice.
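Sparse representation recovery of this kind is typically posed as an l1-regularized least-squares problem. A minimal iterative shrinkage-thresholding (ISTA) sketch is shown below; the matrix A stands in for a measured transmission matrix, and all names are illustrative rather than the paper's implementation:

```python
import numpy as np

def ista(A, y, lam=0.02, n_iter=500):
    """ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1.

    Each iteration takes a gradient step on the data term, then applies
    soft thresholding, which promotes sparse solutions even when A is
    ill-conditioned (as a turbid-medium transmission matrix can be)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        g = x - (A.T @ (A @ x - y)) / L    # gradient step
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return x
```

The shrinkage step is what suppresses the residual speckle: small, noise-like coefficients are driven to zero while the few strong image components survive.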
Funamizu, Hideki; Onodera, Yusei; Aizu, Yoshihisa
2018-05-01
In this study, we report color quality improvement of reconstructed images in color digital holography using the speckle method and spectral estimation. In this technique, an object is illuminated by a speckle field to produce an object wave, while a plane wave is used as the reference wave. For three wavelengths, the interference patterns of the two coherent waves are recorded as digital holograms on an image sensor. The speckle fields are changed by moving a ground glass plate in the in-plane direction, and a number of holograms are acquired to average the reconstructed images. After the averaging process over images reconstructed from multiple holograms, we use the Wiener estimation method to obtain spectral transmittance curves of the reconstructed images. The color reproducibility of this method is demonstrated and evaluated using a Macbeth color chart film and stained onion cells.
Speckle Noise Reduction for the Enhancement of Retinal Layers in Optical Coherence Tomography Images
Directory of Open Access Journals (Sweden)
Fereydoon Nowshiravan Rahatabad
2015-09-01
Full Text Available Introduction One of the most important pre-processing steps in optical coherence tomography (OCT) is reducing speckle noise, which results from multiple scattering in tissues and degrades the quality of OCT images. Materials and Methods The present study focused on speckle noise reduction and edge detection techniques. Statistical filters with different masks and noise variances were applied to OCT and test images. Objective evaluation of both types of images was performed using various image metrics, such as peak signal-to-noise ratio (PSNR), root mean square error, correlation coefficient, and elapsed time. For the purpose of recovery, the Kuan filter output was used as the input for edge enhancement. A spatial filter was also applied to improve image quality. Results The obtained results are presented as statistical tables and images. Based on statistical measures and the visual quality of the OCT images, the Enhanced Lee filter (3×3), with a PSNR value of 43.6735 at low noise variance, and the Kuan filter (3×3), with a PSNR value of 37.2850 at high noise variance, showed superior performance over the other filters. Conclusion Based on the obtained results, using speckle reduction filters such as the Enhanced Lee and Kuan filters on OCT images could reduce the number of compounded images required to achieve a given image quality. Moreover, using Kuan filters to enhance the edges allowed smoothing of speckle regions while preserving image tissue texture.
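The (non-enhanced) Lee filter referenced above has a compact closed form: each pixel is pulled toward its local mean by a gain that depends on the ratio of local signal variance to noise variance. A minimal sketch, assuming additive noise statistics and a crude global noise-variance estimate (the Enhanced Lee and Kuan variants modify the gain):

```python
import numpy as np

def lee_filter(img, win=3, noise_var=None):
    """Classic Lee filter: out = mean + k * (pixel - mean), with gain
    k = var / (var + noise_var). Homogeneous regions (low local variance)
    are smoothed strongly; edges (high variance) are left mostly intact."""
    img = img.astype(float)
    h, w = img.shape
    pad = win // 2
    p = np.pad(img, pad, mode='reflect')
    # Stack all win*win shifted views to get per-pixel local statistics.
    windows = np.stack([p[r:r + h, c:c + w]
                        for r in range(win) for c in range(win)])
    m = windows.mean(axis=0)
    v = windows.var(axis=0)
    if noise_var is None:
        noise_var = np.mean(v)             # crude global noise estimate
    k = v / (v + noise_var)
    return m + k * (img - m)
```

The same local-statistics machinery underlies the Kuan filter; only the expression for the gain k changes.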
ARTIFICIAL INCOHERENT SPECKLES ENABLE PRECISION ASTROMETRY AND PHOTOMETRY IN HIGH-CONTRAST IMAGING
Energy Technology Data Exchange (ETDEWEB)
Jovanovic, N.; Guyon, O.; Pathak, P.; Kudo, T. [National Astronomical Observatory of Japan, Subaru Telescope, 650 North A’Ohoku Place, Hilo, HI, 96720 (United States); Martinache, F. [Observatoire de la Cote d’Azur, Boulevard de l’Observatoire, F-06304 Nice (France); Hagelberg, J., E-mail: jovanovic.nem@gmail.com [Institute for Astronomy, University of Hawaii, 2680 Woodlawn Drive, Honolulu, HI 96822 (United States)
2015-11-10
State-of-the-art coronagraphs employed on extreme adaptive optics enabled instruments are constantly improving the contrast detection limit for companions at ever-closer separations from the host star. In order to constrain their properties and, ultimately, compositions, it is important to precisely determine orbital parameters and contrasts with respect to the stars they orbit. This can be difficult in the post-coronagraphic image plane, as by definition the central star has been occulted by the coronagraph. We demonstrate the flexibility of utilizing the deformable mirror in the adaptive optics system of the Subaru Coronagraphic Extreme Adaptive Optics system to generate a field of speckles for the purposes of calibration. Speckles can be placed up to 22.5 λ/D from the star, with any position angle, brightness, and abundance required. Most importantly, we show that a fast modulation of the added speckle phase, between 0 and π, during a long science integration renders these speckles effectively incoherent with the underlying halo. We quantitatively show for the first time that this incoherence, in turn, increases the robustness and stability of the adaptive speckles, which will improve the precision of astrometric and photometric calibration procedures. This technique will be valuable for high-contrast imaging observations with imagers and integral field spectrographs alike.
Speckle reduction in optical coherence tomography images based on wave atoms
Du, Yongzhao; Liu, Gangjun; Feng, Guoying; Chen, Zhongping
2014-01-01
Optical coherence tomography (OCT) is an emerging noninvasive imaging technique based on low-coherence interferometry. OCT images suffer from speckle noise, which reduces image contrast. A shrinkage filter based on the wave atoms transform is proposed for speckle reduction in OCT images. The wave atoms transform is a new multiscale geometric analysis tool that offers sparser expansion and better representation for images containing oscillatory patterns and textures than traditional transforms, such as wavelet and curvelet transforms. Cycle-spinning technology is introduced to avoid visual artifacts, such as the Gibbs-like phenomenon, and to develop a translation-invariant wave atoms denoising scheme. The degree of speckle suppression in the denoised images is controlled by an adjustable parameter that determines the threshold in the wave atoms domain. The experimental results show that the proposed method can effectively remove speckle noise and improve OCT image quality. Signal-to-noise ratio, contrast-to-noise ratio, average equivalent number of looks, and cross-correlation (XCOR) values are obtained, and the results are compared with wavelet and curvelet thresholding techniques. PMID:24825507
Comet Shoemaker-Levy 9/Jupiter collision observed with a high resolution speckle imaging system
Energy Technology Data Exchange (ETDEWEB)
Gravel, D. [Lawrence Livermore National Lab., CA (United States)
1994-11-15
During the week of July 16, 1994, comet Shoemaker-Levy 9, broken into 20 plus pieces by tidal forces on its last orbit, smashed into the planet Jupiter, releasing the explosive energy of 500 thousand megatons. A team of observers from LLNL used the LLNL Speckle Imaging Camera mounted on the University of California's Lick Observatory 3 Meter Telescope to capture continuous sequences of planet images during the comet encounter. Post processing with the bispectral phase reconstruction algorithm improves the resolution by removing much of the blurring due to atmospheric turbulence. High resolution images of the planet surface showing the aftermath of the impact are probably the best that were obtained from any ground-based telescope. We have been looking at the regions of the fragment impacts to try to discern any dynamic behavior of the spots left on Jupiter's cloud tops. Such information can lead to conclusions about the nature of the comet and of Jupiter's atmosphere. So far, the Hubble Space Telescope has observed expanding waves from the G impact whose mechanism is enigmatic since they appear to be too slow to be sound waves and too fast to be gravity waves, given the present knowledge of Jupiter's atmosphere. Some of our data on the G and L impact region complements the Hubble observations but, so far, is inconclusive about spot dynamics.
Vaz, Pedro G.; Humeau-Heurtier, Anne; Figueiras, Edite; Correia, Carlos; Cardoso, João
2018-01-01
Laser speckle contrast imaging (LSCI) is a non-invasive microvascular blood flow assessment technique with good temporal and spatial resolution. Most LSCI systems, including commercial devices, can perform only qualitative blood flow evaluation, which is a major limitation of this technique. There are several factors that prevent the utilization of LSCI as a quantitative technique. Among these factors, we can highlight the effect of static scatterers. The goal of this work was to study the influence of differences in static and dynamic scatterer concentration on laser speckle correlation and contrast. In order to achieve this, a laser speckle prototype was developed and tested using an optical phantom with various concentrations of static and dynamic scatterers. It was found that the laser speckle correlation could be used to estimate the relative concentration of static/dynamic scatterers within a sample. Moreover, the speckle correlation proved to be independent of the dynamic scatterer velocity, which is a fundamental characteristic to be used in contrast correction.
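The static-scatterer effect described above can be illustrated numerically: the detected intensity is |E_static + E_dynamic|², and increasing the static fraction lowers the temporal speckle contrast K = σ/⟨I⟩. This is a toy model of our own devising (field amplitudes and the 0.7 rad static phase are arbitrary), not the authors' prototype:

```python
# Toy model: static scatterers add a fixed field to a fluctuating one,
# lowering the temporal speckle contrast K = sigma / mean.
import cmath
import math
import random

random.seed(2)

def intensity_series(static_frac, n=20000):
    """Detector intensity |E_static + E_dynamic|^2 over time."""
    e_s = cmath.rect(static_frac, 0.7)   # fixed field from static scatterers
    amp_d = 1.0 - static_frac            # dynamic field amplitude budget
    return [abs(e_s + cmath.rect(amp_d * random.random(),
                                 random.uniform(0, 2 * cmath.pi))) ** 2
            for _ in range(n)]

def contrast(x):
    """Temporal speckle contrast K = sigma / mean."""
    m = sum(x) / len(x)
    var = sum((v - m) ** 2 for v in x) / len(x)
    return math.sqrt(var) / m

k_dynamic = contrast(intensity_series(0.0))   # purely dynamic scatterers
k_mixed = contrast(intensity_series(0.6))     # 60% static contribution
print(k_mixed < k_dynamic)  # True: static scattering lowers the contrast
```

This bias is exactly why uncorrected contrast maps over-estimate τ_c (and hence under-estimate flow) in regions with many static scatterers.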
NESSI and `Alopeke: Two new dual-channel speckle imaging instruments
Scott, Nicholas J.
2018-01-01
NESSI and `Alopeke are two new speckle imagers built at NASA's Ames Research Center for community use at the WIYN and Gemini telescopes, respectively. The two instruments are functionally similar and include the capability for wide-field imaging in addition to speckle interferometry. The diffraction-limited imaging available through speckle effectively eliminates distortions due to Earth's atmosphere by 'freezing out' atmospheric changes: extremely short exposures are taken and the resultant speckles are combined in Fourier space. This technique enables angular resolutions equal to the theoretical best possible for a given telescope, effectively giving space-based resolution from the ground. Our instruments provide the highest spatial resolution available today on any single-aperture telescope. A primary role of these instruments is exoplanet validation for Kepler, K2, TESS, and many RV programs. Contrast ratios of 6 or more magnitudes are easily obtained. Each instrument uses two EMCCD cameras, and the simultaneous dual-color observations they provide help to characterize detected companions. High-resolution imaging enables the identification of blended binaries that contaminate many exoplanet detections, leading to incorrectly measured radii. In this way small, rocky systems, such as Kepler-186b and the TRAPPIST-1 planet family, may be validated and the detected planets' radii correctly measured.
Local scattering property scales flow speed estimation in laser speckle contrast imaging
International Nuclear Information System (INIS)
Miao, Peng; Chao, Zhen; Feng, Shihan; Ji, Yuanyuan; Yu, Hang; Thakor, Nitish V; Li, Nan
2015-01-01
Laser speckle contrast imaging (LSCI) has been widely used for in vivo blood flow imaging. However, the effect of the local scattering property (scattering coefficient µ_s) on blood flow speed estimation has not been well investigated. In this study, such an effect was quantified and incorporated into the relation between the speckle autocorrelation time τ_c and the flow speed v, based on simulated flow experiments. For in vivo blood flow imaging, an improved estimation strategy was developed to eliminate the estimation bias due to the inhomogeneous distribution of the scattering property. Compared to traditional LSCI, the new estimation method significantly suppresses imaging noise and improves the imaging contrast of vasculature. Furthermore, the new method successfully captured the blood flow changes and vascular constriction patterns in rats' cerebral cortex from normothermia to mild and moderate hypothermia. (letter)
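The contrast-to-τ_c step underlying such flow estimates is commonly modelled (for a Lorentzian velocity distribution) as K² = (e^{−2x} − 1 + 2x) / (2x²) with x = T/τ_c, where T is the exposure time. This is the textbook LSCI relation, not the authors' bias-corrected estimator; a sketch of inverting it numerically:

```python
# Hedged sketch: recover the speckle correlation time tau_c from a
# measured contrast K via the standard Lorentzian-model relation.
import math

def k2_model(x):
    """K^2 as a function of x = T / tau_c (monotone decreasing)."""
    return (math.exp(-2 * x) - 1 + 2 * x) / (2 * x * x)

def tau_c_from_contrast(k, exposure_T, lo=1e-6, hi=1e6, iters=200):
    """Invert K(T) for tau_c by geometric bisection over x."""
    target = k * k
    for _ in range(iters):
        mid = math.sqrt(lo * hi)
        if k2_model(mid) > target:
            lo = mid     # contrast too high -> x must grow
        else:
            hi = mid
    return exposure_T / math.sqrt(lo * hi)

# Round trip: pick tau_c, synthesize K, recover tau_c
T = 5e-3                  # 5 ms exposure (illustrative)
tau_true = 1e-4
k_synth = math.sqrt(k2_model(T / tau_true))
tau_rec = tau_c_from_contrast(k_synth, T)
print(abs(tau_rec - tau_true) / tau_true < 1e-6)  # True
```

Flow speed is then taken as inversely proportional to τ_c; the paper's contribution is that the proportionality must be scaled by the local µ_s.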
International Nuclear Information System (INIS)
Pretto, Lucas Ramos de
2015-01-01
This work discusses the optical coherence tomography (OCT) system and its application to microfluidics. To this end, physical characterization of microfluidic circuits was performed using 3D (three-dimensional) models constructed from OCT images of such circuits, and the technique was thus evaluated as a potential tool to aid in the inspection of microchannels. Going further, this work studies and develops analytical techniques for microfluidic flow, in particular techniques based on the speckle pattern. First, existing methods were studied and improved, such as speckle variance OCT, where a 31% gain in processing time was obtained. Other methods, such as LASCA (laser speckle contrast analysis), based on speckle autocorrelation, were adapted to OCT images. Derived from LASCA, the developed analysis technique based on intensity autocorrelation motivated the development of a custom OCT system as well as optimized acquisition software with a sampling rate of 8 kHz. The proposed method was then able to distinguish different flow rates, and limits of detection were tested, proving its feasibility for Brownian motion analysis and flow rates below 10 μl/min. (author)
Potlov, A. Yu.; Frolov, S. V.; Proskurin, S. G.
2018-04-01
A high-quality structural image reconstruction algorithm for endoscopic optical coherence tomography of biological tissue is described. The key features of the presented algorithm are: (1) raster scanning and averaging of adjacent A-scans and pixels; (2) speckle level minimization. The described algorithm can be used in gastroenterology, urology, gynecology, and otorhinolaryngology for in vivo and in situ diagnostics of mucous membranes and skin.
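The averaging step in (1) can be sanity-checked with a toy model: for fully developed speckle the intensity at each depth is exponentially distributed around the true reflectivity, and averaging N decorrelated A-scans cuts the relative variance roughly by 1/N. The exponential-speckle assumption and the reflectivity profile below are ours, not the paper's:

```python
# Toy check: averaging N decorrelated A-scans suppresses speckle
# variance ~ 1/N while preserving the mean reflectivity profile.
import random

random.seed(3)

def a_scan(depth_profile):
    """Fully developed speckle: intensity ~ Exp(mean = reflectivity)."""
    return [random.expovariate(1.0 / r) for r in depth_profile]

profile = [1.0, 0.5, 2.0, 0.25] * 64   # "true" reflectivity vs depth
N = 16
scans = [a_scan(profile) for _ in range(N)]
averaged = [sum(col) / N for col in zip(*scans)]

def rel_var(est, truth):
    """Mean squared relative error against the true profile."""
    return sum(((e - t) / t) ** 2 for e, t in zip(est, truth)) / len(est)

v1 = rel_var(scans[0], profile)   # single A-scan: ~1 (speckle-limited)
vN = rel_var(averaged, profile)   # averaged: ~1/N
print(vN < v1 / 4)
```

In practice the gain saturates once adjacent A-scans overlap the same speckle realization, which is why the paper pairs averaging with raster-scan geometry.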
Analysis of eroded bovine teeth through laser speckle imaging
Koshoji, Nelson H.; Bussadori, Sandra K.; Bortoletto, Carolina C.; Oliveira, Marcelo T.; Prates, Renato A.; Deana, Alessandro M.
2015-02-01
Dental erosion is a non-carious lesion that causes progressive wear of tooth structure through chemical processes that do not involve bacterial action. Its origin is related to eating habits or systemic diseases involving tooth contact with substances that have a very low pH. This work demonstrates a new methodology to quantify erosion by coherent light scattering from the tooth surface. The technique shows a correlation between acid-etch duration and the laser speckle contrast analysis (LASCA) map. The experimental groups presented a relative contrast between eroded and sound tissue of 17.8(45)%, 23.4(68)%, 39.2(40)% and 44.3(30)% for 10 min, 20 min, 30 min and 40 min of acid etching, respectively.
Integration of instrumentation and processing software of a laser speckle contrast imaging system
Carrick, Jacob J.
Laser speckle contrast imaging (LSCI) has the potential to be a powerful tool in medicine, but more research in the field is required so it can be used properly. To help in the progression of Michigan Tech's research in the field, a graphical user interface (GUI) was designed in MATLAB to control the instrumentation of the experiments as well as process the raw speckle images into contrast images while they are being acquired. The design of the system was successful and the system is currently being used by Michigan Tech's Biomedical Engineering department. This thesis describes the development of the LSCI GUI as well as offering a full introduction to the history, theory and applications of LSCI.
Maloca, Peter; Gyger, Cyrill; Hasler, Pascal W
2016-06-01
To visualize and measure the vascular network of melanocytic choroidal tumors with speckle noise-free swept-source optical coherence tomography (SS-OCT choroidal angiography). Melanocytic choroidal tumors from 24 eyes were imaged with 1050-nm optical coherence tomography (Topcon DRI OCT-1 Atlantis). A semi-automated algorithm was developed to remove speckle noise and to extract and measure the volume of the choroidal vessels from the obtained OCT data. In all cases, analysis of the choroidal vessels could be performed with SS-OCT without the need for pupillary dilation. The proposed method allows speckle noise-free, structure-guided visualization and measurement of the larger choroidal vessels in three dimensions. The obtained data suggest that speckle noise-free OCT may be more effective at identifying choroidal structures than traditional OCT methods. The measured volume of the extracted choroidal vessels of Haller's and Sattler's layers in the examined tumorous eyes was on average 0.982463955 mm³ (982,463,956 μm³; range 0.209764406 mm³ to 1.78105544 mm³). Full-thickness obstruction of the choroidal vasculature by the tumor was found in 18 cases (72 %). In seven cases (18 %), the choroidal vessel architecture did not show pronounced morphological abnormalities. Speckle noise-free OCT may serve as a new illustrative imaging technology and enhance visualization of the choroidal vessels without the need for dye injection. OCT can be used to identify and evaluate the choroidal vessels of melanocytic choroidal tumors, and may represent a potentially useful tool for imaging and monitoring of choroidal nevi and melanoma.
Sirohi, Rajpal S.
2002-03-01
Illumination of a rough surface by a coherent monochromatic wave creates a grainy structure in space termed a speckle pattern. It was considered a special kind of noise and was the bane of holographers. However, its information-carrying property was soon discovered and the phenomenon was used for metrological applications. The realization that a speckle pattern carried information led to a new measurement technique known as speckle interferometry (SI). Although the speckle phenomenon in itself is a consequence of interference among numerous randomly dephased waves, a reference wave is required in SI. Further, it employs an imaging geometry. Initially SI was performed mostly by using silver emulsions as the recording media. The double-exposure specklegram was filtered to extract the desired information. Since SI can be configured so as to be sensitive to the in-plane displacement component, the out-of-plane displacement component or their derivatives, the interferograms corresponding to these were extracted from the specklegram for further analysis. Since the speckle size can be controlled by the F number of the imaging lens, it was soon realized that SI could be performed with electronic detection, thereby increasing its accuracy and speed of measurement. Furthermore, a phase-shifting technique can also be incorporated. This technique came to be known as electronic speckle pattern interferometry (ESPI). It employed the same experimental configurations as SI. ESPI found many industrial applications as it supplements holographic interferometry. We present three examples covering diverse areas. In one application it has been used to measure residual stress in a blank recordable compact disk. In another application, microscopic ESPI has been used to study the influence of relative humidity on paint-coated figurines and also the effect of a conservation agent applied on top of this. The final application is to find the defects in pipes. These diverse applications
Laser speckle imaging of rat retinal blood flow with hybrid temporal and spatial analysis method
Cheng, Haiying; Yan, Yumei; Duong, Timothy Q.
2009-02-01
Noninvasive monitoring of blood flow (BF) in the retinal circulation can reveal the progression and treatment of ocular disorders such as diabetic retinopathy, age-related macular degeneration and glaucoma. A non-invasive, direct BF measurement technique with high spatio-temporal resolution is needed for retinal imaging, and laser speckle imaging (LSI) is such a method. Currently, there are two analysis methods for LSI: spatial statistics LSI (SS-LSI) and temporal statistics LSI (TS-LSI). Comparing these two analysis methods, SS-LSI has a higher signal-to-noise ratio (SNR) and TS-LSI is less susceptible to artifacts from stationary speckle. We proposed a hybrid temporal and spatial analysis method (HTS-LSI) to measure retinal blood flow. A gas challenge experiment was performed and the images were analyzed by HTS-LSI. Results showed that HTS-LSI can not only remove the stationary speckle but also increase the SNR. Under 100% O2, retinal BF decreased by 20-30%, consistent with results observed with the laser Doppler technique. As retinal blood flow is a critical physiological parameter and its perturbation has been implicated in the early stages of many retinal diseases, HTS-LSI will be an efficient method for early detection of retinal diseases.
Novel medical image enhancement algorithms
Agaian, Sos; McClendon, Stephen A.
2010-01-01
In this paper, we present two novel medical image enhancement algorithms. The first, a global image enhancement algorithm, utilizes an alpha-trimmed mean filter as its backbone to sharpen images. The second algorithm uses a cascaded unsharp masking technique to separate the high frequency components of an image in order for them to be enhanced using a modified adaptive contrast enhancement algorithm. Experimental results from enhancing electron microscopy, radiological, CT scan and MRI scan images, using the MATLAB environment, are then compared to the original images as well as other enhancement methods, such as histogram equalization and two forms of adaptive contrast enhancement. An image processing scheme for electron microscopy images of Purkinje cells will also be implemented and utilized as a comparison tool to evaluate the performance of our algorithm.
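The alpha-trimmed mean filter named above has a compact definition: sort the samples in the sliding window, discard the alpha smallest and alpha largest, and average the remainder. The 1-D version below is a generic sketch (the paper applies a 2-D windowed variant); the signal values are invented:

```python
# Sketch of an alpha-trimmed mean filter: with alpha = 0 it is the mean
# filter, and with alpha = (window - 1) // 2 it becomes the median.
def alpha_trimmed_mean(window, alpha):
    s = sorted(window)
    kept = s[alpha:len(s) - alpha]   # drop alpha extremes on each side
    return sum(kept) / len(kept)

def filter_1d(signal, size=5, alpha=1):
    """Apply the filter over a sliding window with edge replication."""
    half = size // 2
    padded = [signal[0]] * half + list(signal) + [signal[-1]] * half
    return [alpha_trimmed_mean(padded[i:i + size], alpha)
            for i in range(len(signal))]

noisy = [10, 10, 10, 200, 10, 10, 0, 10, 10]   # impulses at 200 and 0
out = filter_1d(noisy)
print(out)  # all 10.0: both impulse outliers are rejected
```

Its appeal for sharpening pipelines is exactly this blend: it rejects impulse noise like a median filter while retaining some of the smoothing behaviour of the mean.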
Simulations of x-ray speckle-based dark-field and phase-contrast imaging with a polychromatic beam
Energy Technology Data Exchange (ETDEWEB)
Zdora, Marie-Christine, E-mail: marie-christine.zdora@diamond.ac.uk [Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany); Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom); Department of Physics & Astronomy, University College London, London WC1E 6BT (United Kingdom); Thibault, Pierre [Department of Physics & Astronomy, University College London, London WC1E 6BT (United Kingdom); Pfeiffer, Franz [Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany); Zanette, Irene [Lehrstuhl für Biomedizinische Physik, Physik-Department & Institut für Medizintechnik, Technische Universität München, 85748 Garching (Germany); Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire OX11 0DE (United Kingdom)
2015-09-21
Following the first experimental demonstration of x-ray speckle-based multimodal imaging using a polychromatic beam [I. Zanette et al., Phys. Rev. Lett. 112(25), 253903 (2014)], we present a simulation study on the effects of a polychromatic x-ray spectrum on the performance of this technique. We observe that the contrast of the near-field speckles is only mildly influenced by the bandwidth of the energy spectrum. Moreover, using a homogeneous object with simple geometry, we characterize the beam hardening artifacts in the reconstructed transmission and refraction angle images, and we describe how the beam hardening also affects the dark-field signal provided by speckle tracking. This study is particularly important for further implementations and developments of coherent speckle-based techniques at laboratory x-ray sources.
Wu, Jun; Yu, Zhijing; Wang, Tao; Zhuge, Jingchang; Ji, Yue; Xue, Bin
2017-06-01
Airplane wing deformation is an important element of aerodynamic characteristics, structural design, and fatigue analysis in aircraft manufacturing, as well as a key test item in flutter certification of airplanes. This paper presents a novel real-time method for detecting wing deformation and flight flutter using three-dimensional speckle image correlation technology. Speckle patterns, whose positions are determined by the vibration characteristics of the aircraft, are coated on the wing; the patterns are then imaged by CCD cameras mounted inside the aircraft cabin. In order to reduce the computation, a matching technique based on Geodetic Systems Incorporated coded points combined with the classical epipolar constraint is proposed, and a displacement vector map for the aircraft wing is obtained by comparing the coordinates of the speckle points before and after deformation. Finally, verification experiments comprising static and dynamic tests on an aircraft wing model demonstrate the accuracy and effectiveness of the proposed method.
SPECKLE IMAGING EXCLUDES LOW-MASS COMPANIONS ORBITING THE EXOPLANET HOST STAR TRAPPIST-1
Energy Technology Data Exchange (ETDEWEB)
Howell, Steve B.; Scott, Nicholas J. [NASA Ames Research Center, Moffett Field, CA 94035 (United States); Everett, Mark E. [National Optical Astronomy Observatory, 950 N. Cherry Avenue, Tucson, AZ 85719 (United States); Horch, Elliott P. [Department of Physics, Southern Connecticut State University, 501 Crescent Street, New Haven, CT, 06515 (United States); Winters, Jennifer G. [Harvard-Smithsonian Center for Astrophysics, Cambridge, MA, 02138 (United States); Hirsch, Lea [Astronomy Department, University of California, Berkeley, 510 Campbell Hall, Berkeley, CA, 94720 (United States); Nusdeo, Dan [Department of Physics and Astronomy, Georgia State University, P.O. Box 5060, Atlanta, GA 30302 (United States)
2016-09-20
We have obtained the highest-resolution images available of TRAPPIST-1 using the Gemini-South telescope and our speckle imaging camera. Observing at 692 and 883 nm, we reached the diffraction limit of the telescope providing a best resolution of 27 mas or, at the distance of TRAPPIST-1, a spatial resolution of 0.32 au. Our imaging of the star extends from 0.32 to 14.5 au. We show that to a high confidence level, we can exclude all possible stellar and brown dwarf companions, indicating that TRAPPIST-1 is a single star.
SPECKLE IMAGING EXCLUDES LOW-MASS COMPANIONS ORBITING THE EXOPLANET HOST STAR TRAPPIST-1
International Nuclear Information System (INIS)
Howell, Steve B.; Scott, Nicholas J.; Everett, Mark E.; Horch, Elliott P.; Winters, Jennifer G.; Hirsch, Lea; Nusdeo, Dan
2016-01-01
We have obtained the highest-resolution images available of TRAPPIST-1 using the Gemini-South telescope and our speckle imaging camera. Observing at 692 and 883 nm, we reached the diffraction limit of the telescope providing a best resolution of 27 mas or, at the distance of TRAPPIST-1, a spatial resolution of 0.32 au. Our imaging of the star extends from 0.32 to 14.5 au. We show that to a high confidence level, we can exclude all possible stellar and brown dwarf companions, indicating that TRAPPIST-1 is a single star.
Shift-Invariant Image Reconstruction of Speckle-Degraded Images Using Bispectrum Estimation
1990-05-01
process with the requisite negative exponential pdf; this model is termed the Negative Exponential Model (NEM). [The remainder of this fragment is figure residue: statistical histograms and phase plots for the negative-exponential-pdf multiplicative method, and a truth object speckled via the NEM.]
Texture analysis of speckle in optical coherence tomography images of tissue phantoms
International Nuclear Information System (INIS)
Gossage, Kirk W; Smith, Cynthia M; Kanter, Elizabeth M; Hariri, Lida P; Stone, Alice L; Rodriguez, Jeffrey J; Williams, Stuart K; Barton, Jennifer K
2006-01-01
Optical coherence tomography (OCT) is an imaging modality capable of acquiring cross-sectional images of tissue using back-reflected light. Conventional OCT images have a resolution of 10-15 μm and are thus best suited for visualizing tissue layers and structures. OCT images of collagen (with and without endothelial cells) have no resolvable features and may appear to simply show an exponential decrease in intensity with depth. However, examination of these images reveals that they display a characteristic repetitive structure due to speckle. The purpose of this study is to evaluate the application of statistical and spectral texture analysis techniques for differentiating living and non-living tissue phantoms containing various sizes and distributions of scatterers based on speckle content in OCT images. Statistically significant differences between texture parameters and excellent classification rates were obtained when comparing various endothelial cell concentrations ranging from 0 cells/ml to 25 million cells/ml. Statistically significant results and excellent classification rates were also obtained using various sizes of microspheres with concentrations ranging from 0 microspheres/ml to 500 million microspheres/ml. This study has shown that texture analysis of OCT images may be capable of differentiating tissue phantoms containing various sizes and distributions of scatterers.
Investigation of the ripeness of oil palm fresh fruit bunches using bio-speckle imaging
Salambue, R.; Adnan, A.; Shiddiq, M.
2018-03-01
The ripeness of oil palm Fresh Fruit Bunches (FFB) determines the yield of the oil produced. Traditionally, FFB ripeness is judged in two ways: by the number of loose fruits and by colour changes. One drawback of such visual determination is that the judgment is subjective and qualitative. In this study, FFB ripeness was investigated using a laser-based image processing technique, which has the advantages of being non-destructive, simple and quantitative. The working principle is as follows: an FFB is placed inside a light-tight box containing a laser diode and a CMOS camera, the FFB is illuminated, and an image is recorded. Image recording was performed on four FFB fractions (F0, F3, F4 and F5) on the front and rear surfaces at three sections. The recorded images consist of speckled granules with varying light intensity (bio-speckle imaging). The feature extracted from the speckle image is the contrast value, obtained from the average gray-value intensity and the standard deviation. Based on the contrast values, the four FFB fractions can be grouped into three levels of ripeness, unripe (F0), ripe (F3) and overripe (F4 and F5), on the front surface of the base section of the FFB with 75% accuracy.
Integration of speckle de-noising and image segmentation using ...
Indian Academy of Sciences (India)
Particle and speckle imaging velocimetry applied to a monostatic LIDAR
Halldorsson, Thorsteinn; Langmeier, Andreas; Prücklmeier, Andreas; Banakh, Viktor; Falits, Andrey
2006-11-01
A novel backscatter-lidar method for visualising air movement in the atmosphere is discussed. The method is based on the particle image velocimetry (PIV) principle: pairs of images of laser-illuminated thin atmospheric layers are recorded by a CCD camera and then cross-correlated to obtain velocity information. Both the computer simulation of this atmospheric version of the PIV technique and the first proof-of-concept experiments are described. It is proposed that the method can find application in the visualisation of wake vortices arising behind large aircraft.
Rice, Tyler B; Kwan, Elliott; Hayakawa, Carole K; Durkin, Anthony J; Choi, Bernard; Tromberg, Bruce J
2013-01-01
Laser Speckle Imaging (LSI) is a simple, noninvasive technique for rapid imaging of particle motion in scattering media such as biological tissue. LSI is generally used to derive a qualitative index of relative blood flow due to unknown impact from several variables that affect speckle contrast. These variables may include optical absorption and scattering coefficients, multi-layer dynamics including static, non-ergodic regions, and systematic effects such as laser coherence length. In order to account for these effects and move toward quantitative, depth-resolved LSI, we have developed a method that combines Monte Carlo modeling, multi-exposure speckle imaging (MESI), spatial frequency domain imaging (SFDI), and careful instrument calibration. Monte Carlo models were used to generate total and layer-specific fractional momentum transfer distributions. This information was used to predict speckle contrast as a function of exposure time, spatial frequency, layer thickness, and layer dynamics. To verify with experimental data, controlled phantom experiments with characteristic tissue optical properties were performed using a structured light speckle imaging system. Three main geometries were explored: 1) diffusive dynamic layer beneath a static layer, 2) static layer beneath a diffuse dynamic layer, and 3) directed flow (tube) submerged in a dynamic scattering layer. Data fits were performed using the Monte Carlo model, which accurately reconstructed the type of particle flow (diffusive or directed) in each layer, the layer thickness, and absolute flow speeds to within 15% or better.
Laser speckle contrast imaging of skin blood perfusion responses induced by laser coagulation
Energy Technology Data Exchange (ETDEWEB)
Ogami, M; Kulkarni, R; Wang, H; Reif, R; Wang, R K [University of Washington, Department of Bioengineering, Seattle, Washington 98195 (United States)
2014-08-31
We report the application of laser speckle contrast imaging (LSCI), a fast imaging technique utilising backscattered light to distinguish moving objects such as red blood cells from stationary objects such as surrounding tissue, to localising skin injury. This imaging technique provides detailed information about the acute perfusion response after a blood vessel is occluded. In this study, a mouse ear model is used and pulsed laser coagulation serves as the method of occlusion. We have found that the downstream blood vessels lacked blood flow due to occlusion at the target site immediately after injury. Relative flow changes in nearby collaterals and anastomotic vessels have been approximated based on differences in intensity in the nearby collaterals and anastomoses. We have also estimated the density of the affected downstream vessels. Laser speckle contrast imaging is thus shown to provide high-resolution, high-speed imaging of the skin microvasculature. It also allows direct visualisation of the blood perfusion response to injury, which may provide novel insights into the field of cutaneous wound healing. (laser biophotonics)
Laser speckle contrast imaging of skin blood perfusion responses induced by laser coagulation
Ogami, M.; Kulkarni, R.; Wang, H.; Reif, R.; Wang, R. K.
2014-08-01
We report the application of laser speckle contrast imaging (LSCI), a fast imaging technique utilising backscattered light to distinguish moving objects such as red blood cells from stationary objects such as surrounding tissue, to localising skin injury. This imaging technique provides detailed information about the acute perfusion response after a blood vessel is occluded. In this study, a mouse ear model is used and pulsed laser coagulation serves as the method of occlusion. We have found that the downstream blood vessels lacked blood flow due to occlusion at the target site immediately after injury. Relative flow changes in nearby collaterals and anastomotic vessels have been approximated based on differences in intensity in the nearby collaterals and anastomoses. We have also estimated the density of the affected downstream vessels. Laser speckle contrast imaging is thus shown to provide high-resolution, high-speed imaging of the skin microvasculature. It also allows direct visualisation of the blood perfusion response to injury, which may provide novel insights into the field of cutaneous wound healing.
Energy Technology Data Exchange (ETDEWEB)
O’Shea, T; Bamber, J; Harris, E [The Institute of Cancer Research & Royal Marsden, Sutton and London (United Kingdom)
2015-06-15
Purpose: For ultrasound speckle tracking there is some evidence that the envelope-detected signal (the main step in B-mode image formation) may be more accurate than raw ultrasound data for tracking larger inter-frame tissue motion. This study investigates the accuracy of raw radio-frequency (RF) versus non-logarithmically compressed envelope-detected (B-mode) data for ultrasound speckle tracking in the context of image-guided radiation therapy. Methods: Transperineal ultrasound RF data were acquired (with a 7.5 MHz linear transducer operating at a 12 Hz frame rate) from a speckle phantom moving with realistic intra-fraction prostate motion derived from a commercial tracking system. A normalised cross-correlation template matching algorithm was used to track speckle motion at the focus using (i) the RF signal and (ii) the B-mode signal. A range of imaging rates (0.5 to 12 Hz) was simulated by decimating the imaging sequences, thereby simulating larger to smaller inter-frame displacements. Motion estimation accuracy was quantified by comparison with the known phantom motion. Results: The differences between RF and B-mode motion estimation accuracy (2D mean and 95% errors relative to ground-truth displacements) were less than 0.01 mm for stable and persistent motion types and 0.2 mm for transient motion, for imaging rates of 0.5 to 12 Hz. The mean correlation over all motion types and imaging rates was 0.851 for RF and 0.845 for B-mode data. Data type is expected to have the most impact on axial (superior-inferior) motion estimation. Axial differences were <0.004 mm for stable and persistent motion and <0.3 mm for transient motion (axial mean errors were lowest for B-mode in all cases). Conclusions: Using the RF or B-mode signal for speckle motion estimation is comparable for translational prostate motion. B-mode image formation may involve other signal-processing steps which also influence motion estimation accuracy. A similar study for respiratory-induced motion
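The normalised cross-correlation (NCC) template matching at the heart of the method above can be sketched in 1-D: slide the template over the new frame and keep the lag that maximises the NCC. The study works on 2-D RF and B-mode frames; the synthetic Gaussian signal and 7-sample shift below are illustrative assumptions:

```python
# Minimal 1-D sketch of NCC template matching for speckle tracking.
import math
import random

random.seed(4)

def ncc(a, b):
    """Normalised cross-correlation of two equal-length windows."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = math.sqrt(sum((x - ma) ** 2 for x in a))
    db = math.sqrt(sum((y - mb) ** 2 for y in b))
    return num / (da * db)

def track(template, frame):
    """Return (lag, score) maximising NCC of the template in the frame."""
    best_lag, best_score = 0, -2.0
    for lag in range(len(frame) - len(template) + 1):
        score = ncc(template, frame[lag:lag + len(template)])
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag, best_score

speckle = [random.gauss(0, 1) for _ in range(256)]   # synthetic frame 1
template = speckle[100:132]                          # patch at sample 100
shift = 7
frame2 = speckle[shift:] + speckle[:shift]           # tissue moved 7 samples

lag, score = track(template, frame2)
print(lag, round(score, 3))   # lag 93: the patch moved by -7 samples
```

For RF data the same search is done on the oscillating signal; for B-mode it is done after envelope detection, which is exactly the comparison the abstract quantifies.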
International Nuclear Information System (INIS)
Kuznetsov, Yu L; Kalchenko, V V; Astaf'eva, N G; Meglinski, I V
2014-01-01
The capability of using the laser speckle contrast imaging technique with a long exposure time for visualisation of primary acute skin vascular reactions caused by a topical application of a weak contact allergen is considered. The method is shown to provide efficient and accurate detection of irritant-induced primary acute vascular reactions of skin. The presented technique possesses a high potential in everyday diagnostic practice, preclinical studies, as well as in the prognosis of skin reactions to the interaction with potentially allergenic materials. (laser biophotonics)
Borodinov, A. A.; Myasnikov, V. V.
2018-04-01
The present work compares the accuracy of known classification algorithms in the task of recognizing local objects in radar images under various image preprocessing methods. Preprocessing involves speckle noise filtering and normalization of the object orientation in the image, either by the method of image moments or by a method based on the Hough transform. The following classification algorithms are compared: decision tree, support vector machine, AdaBoost, and random forest. Principal component analysis is used to reduce the dimensionality. The research is carried out on objects from the MSTAR radar image database. The paper presents the results of the conducted studies.
Goodman, Joseph W.
2013-05-01
Speckle appears whenever coherent radiation of any kind is used. We review here the basic properties of speckle, the negative effects it has on imaging systems of various kinds, and the positive benefits it offers in certain nondestructive testing and metrology problems.
Lee, KyeoReh; Park, YongKeun
2016-10-31
The word 'holography' means a drawing that contains all of the information for light, both amplitude and wavefront. However, because of the insufficient bandwidth of current electronics, direct measurement of the wavefront of light has not yet been achieved. Though reference-field-assisted interferometric methods have been utilized in numerous applications, introducing a reference field raises several fundamental and practical issues. Here we demonstrate a reference-free holographic image sensor. To achieve this, we propose a speckle-correlation scattering matrix approach: light-field information passing through a thin disordered layer is recorded and retrieved from a single-shot recording of speckle intensity patterns. Self-interference via diffusive scattering enables access to the impinging light-field information when light transport in the diffusive layer is precisely calibrated. As a proof of concept, we demonstrate direct holographic measurements of three-dimensional optical fields using a compact device consisting of a regular image sensor and a diffuser.
Mobile phone based laser speckle contrast imager for assessment of skin blood flow
Jakovels, Dainis; Saknite, Inga; Krievina, Gita; Zaharans, Janis; Spigulis, Janis
2014-10-01
Assessment of skin blood flow is of interest for evaluation of skin viability as well as a reflection of the overall condition of the circulatory system. Laser Doppler perfusion imaging (LDPI) and laser speckle contrast imaging (LASCI) are optical techniques used for the assessment of skin perfusion. However, these systems are still too expensive and bulky to be widely available. Implementation of such techniques as connection kits for mobile phones has potential for primary diagnostics. In this work we demonstrate a simple and low-cost LASCI connection kit for a mobile phone and compare it to a laser Doppler perfusion imager. Post-occlusive hyperemia and local thermal hyperemia tests are used to compare the two techniques and to demonstrate the potential of the LASCI device.
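The quantity that LASCI devices like the one above map is the local speckle contrast K = σ/μ, computed in a small sliding window; blood flow blurs the speckles during the exposure and lowers K. A minimal sketch of this computation (our own illustration, not the kit's firmware; the window size is an assumption):

```python
import statistics

def speckle_contrast(img, win=5):
    """Local speckle contrast K = sigma/mu over a win x win sliding window.

    Border pixels narrower than the window are left at 0.
    """
    h, w, r = len(img), len(img[0]), win // 2
    K = [[0.0] * w for _ in range(h)]
    for i in range(r, h - r):
        for j in range(r, w - r):
            vals = [img[y][x] for y in range(i - r, i + r + 1)
                              for x in range(j - r, j + r + 1)]
            mu = statistics.mean(vals)
            K[i][j] = statistics.pstdev(vals) / mu if mu else 0.0
    return K

# A perfectly uniform (fully blurred) region has zero contrast, while a
# static, fully developed speckle field gives K near 1.
flat = [[100.0] * 9 for _ in range(9)]
print(speckle_contrast(flat)[4][4])   # → 0.0
```

Perfusion maps are then derived from K (lower contrast, higher flow), which is why both LDPI and LASCI report values in arbitrary units.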
Sharma, P.; Kumawat, J.; Kumar, S.; Sahu, K.; Verma, Y.; Gupta, P. K.; Rao, K. D.
2018-02-01
We report on a study to assess the feasibility of a swept source-based speckle variance optical coherence tomography setup for monitoring cutaneous microvasculature. Punch wounds created in the ear pinnae of diabetic mice were monitored at different times post wounding to assess the structural and vascular changes. It was observed that the epithelium thickness increases post wounding and continues to be thick even after healing. Also, the wound size assessed by vascular images is larger than the physical wound size. The results show that the developed speckle variance optical coherence tomography system can be used to monitor vascular regeneration during wound healing in diabetic mice.
E. Klijn (Elko); M.E.J.L. Hulscher (Marlies); R.K. Balvers (Rutger); W.P.J. Holland (Wim); J. Bakker (Jan); A.J.P.E. Vincent (Arnoud); C.M.F. Dirven (Clemens); C. Ince (Can)
2013-01-01
Object: The goal of awake neurosurgery is to maximize resection of brain lesions with minimal injury to functional brain areas. Laser speckle imaging (LSI) is a noninvasive macroscopic technique with high spatial and temporal resolution used to monitor changes in capillary perfusion. In
Optically sectioned in vivo imaging with speckle illumination HiLo microscopy
Lim, Daryl; Ford, Tim N.; Chu, Kengyeh K.; Mertz, Jerome
2011-01-01
We present a simple wide-field imaging technique, called HiLo microscopy, that is capable of producing optically sectioned images in real time, comparable in quality to confocal laser scanning microscopy. The technique is based on the fusion of two raw images, one acquired with speckle illumination and another with standard uniform illumination. The fusion can be numerically adjusted, using a single parameter, to produce optically sectioned images of varying thicknesses with the same raw data. Direct comparison between our HiLo microscope and a commercial confocal laser scanning microscope is made on the basis of sectioning strength and imaging performance. Specifically, we show that HiLo and confocal 3-D imaging of a GFP-labeled mouse brain hippocampus are comparable in quality. Moreover, HiLo microscopy is capable of faster, near video rate imaging over larger fields of view than attainable with standard confocal microscopes. The goal of this paper is to advertise the simplicity, robustness, and versatility of HiLo microscopy, which we highlight with in vivo imaging of common model organisms including planaria, C. elegans, and zebrafish.
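The single-parameter fusion at the heart of HiLo can be caricatured in one dimension. This is our own simplified sketch, not the authors' implementation: real HiLo derives its low-frequency, optically sectioned estimate from the local contrast of the speckle-illuminated image, whereas here we simply take two input signals and fuse low frequencies from one with high frequencies from the other, blended by a single parameter eta.

```python
def lowpass(sig, k=5):
    """Moving-average low-pass filter (window clipped at the edges)."""
    h = k // 2
    out = []
    for i in range(len(sig)):
        seg = sig[max(0, i - h):i + h + 1]
        out.append(sum(seg) / len(seg))
    return out

def hilo(uniform, sectioned, eta=1.0):
    """Fuse low frequencies of `sectioned` with high frequencies of `uniform`.

    `eta` is the single adjustable parameter mentioned in the abstract:
    it scales the low-frequency (sectioned) contribution.
    """
    lp = lowpass(sectioned)                                  # Lo component
    hp = [u - l for u, l in zip(uniform, lowpass(uniform))]  # Hi component
    return [eta * l + h for l, h in zip(lp, hp)]

# Sanity check: with eta = 1 and identical inputs, fusion returns the input.
sig = [1.0, 2.0, 4.0, 8.0, 4.0, 2.0, 1.0]
out = hilo(sig, sig)
```

Varying eta (or the low-pass cutoff) changes the effective optical section thickness from the same pair of raw images, which is the numerical adjustability the abstract describes.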
Wavelet tree structure based speckle noise removal for optical coherence tomography
Yuan, Xin; Liu, Xuan; Liu, Yang
2018-02-01
We report a new speckle noise removal algorithm in optical coherence tomography (OCT). Though wavelet domain thresholding algorithms have demonstrated superior advantages in suppressing noise magnitude and preserving image sharpness in OCT, the wavelet tree structure has not been investigated in previous applications. In this work, we propose an adaptive wavelet thresholding algorithm via exploiting the tree structure in wavelet coefficients to remove the speckle noise in OCT images. The threshold for each wavelet band is adaptively selected following a special rule to retain the structure of the image across different wavelet layers. Our results demonstrate that the proposed algorithm outperforms conventional wavelet thresholding, with significant advantages in preserving image features.
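For orientation, here is a minimal per-band soft-thresholding baseline on a 1-D Haar transform, the kind of conventional wavelet thresholding the paper improves on. This is our own sketch: the paper's contribution, the tree-structured rule for adapting each band's threshold across layers, is not reproduced here, and the fixed thresholds below are illustrative assumptions.

```python
import math

def haar_fwd(sig):
    """One level of the Haar DWT; len(sig) must be even."""
    s2 = math.sqrt(2.0)
    approx = [(sig[i] + sig[i + 1]) / s2 for i in range(0, len(sig), 2)]
    detail = [(sig[i] - sig[i + 1]) / s2 for i in range(0, len(sig), 2)]
    return approx, detail

def haar_inv(approx, detail):
    """Inverse of haar_fwd."""
    s2 = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / s2, (a - d) / s2]
    return out

def soft(x, t):
    """Soft threshold: shrink |x| towards zero by t, preserving sign."""
    return math.copysign(max(abs(x) - t, 0.0), x)

def denoise(sig, levels=2, thresholds=(0.5, 0.25)):
    """Per-band soft thresholding: one threshold per wavelet level."""
    details = []
    a = list(sig)
    for lvl in range(levels):
        a, d = haar_fwd(a)
        details.append([soft(x, thresholds[lvl]) for x in d])
    for lvl in reversed(range(levels)):
        a = haar_inv(a, details[lvl])
    return a

noisy = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
rec = denoise(noisy, thresholds=(0.0, 0.0))  # zero thresholds: exact reconstruction
```

The adaptive algorithm in the paper replaces the fixed `thresholds` tuple with values chosen per band so that coefficient structure is retained across wavelet layers.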
Autonomous algorithms for image restoration
Griniasty, Meir
1994-01-01
We describe a general theoretical framework for algorithms that adaptively tune all their parameters during the restoration of a noisy image. The adaptation procedure is based on a mean-field approach known as "deterministic annealing", reminiscent of the "deterministic Boltzmann machine". The algorithm is less time consuming than its simulated annealing alternative. We apply the theory to several architectures and compare their performances.
The Research on Denoising of SAR Image Based on Improved K-SVD Algorithm
Tan, Linglong; Li, Changkai; Wang, Yueqin
2018-04-01
SAR images often suffer noise interference during acquisition and transmission, which can greatly reduce image quality and cause great difficulties for image processing. The existing complete DCT dictionary algorithm is fast, but its denoising effect is poor. In this paper, to address the problem of poor denoising, the K-SVD (K-means and singular value decomposition) algorithm is applied to image noise suppression. Firstly, the sparse dictionary structure is introduced in detail; the dictionary has a compact representation and can be trained effectively on the image signal. Then, the sparse dictionary is trained by the K-SVD algorithm according to the sparse representation of the dictionary. The algorithm has advantages in high-dimensional data processing. Experimental results show that the proposed algorithm removes speckle noise more effectively than the complete DCT dictionary and better retains edge details.
Fast Superpixel Segmentation Algorithm for PolSAR Images
Directory of Open Access Journals (Sweden)
Zhang Yue
2017-10-01
As a pre-processing technique, superpixel segmentation algorithms should offer high computational efficiency, accurate boundary adherence, and regular shape in homogeneous regions. A fast superpixel segmentation algorithm based on Iterative Edge Refinement (IER) has been shown to be applicable to optical images. However, it is difficult to obtain ideal results when IER is applied directly to PolSAR images, due to speckle noise and the small or slim regions in PolSAR images. To address these problems, in this study, the unstable pixel set is initialized as all the pixels in the PolSAR image instead of the initial grid edge pixels. In the local relabeling of the unstable pixels, the fast revised Wishart distance is utilized instead of the Euclidean distance in CIELAB color space. Then, a post-processing procedure based on a dissimilarity measure is employed to remove isolated small superpixels as well as to retain strong point targets. Finally, extensive experiments based on a simulated image and a real-world PolSAR image from Airborne Synthetic Aperture Radar (AirSAR) are conducted, showing that the proposed algorithm, compared with three state-of-the-art methods, performs better in terms of several commonly used evaluation criteria, with high computational efficiency, accurate boundary adherence, and homogeneous regularity.
International Nuclear Information System (INIS)
Deana, A M; Jesus, S H C; Koshoji, N H; Bussadori, S K; Oliveira, M T
2013-01-01
Currently, dental caries remains one of the most prevalent chronic diseases, present in most countries. The interaction between light and teeth (absorption, scattering and fluorescence) is intrinsically connected to the constitution of the dental tissue. Decay-induced mineral loss introduces a shift in the optical properties of the affected tissue; therefore, study of these properties may produce novel techniques aimed at the early diagnosis of carious lesions. Based on the optical properties of the enamel, we demonstrate the application of first-order spatial statistics in laser speckle imaging, allowing the detection of carious lesions in their early stages. A highlight of this noninvasive, non-destructive, real-time and cost-effective approach is that it allows a dentist to detect a lesion even in the absence of biofilm or moisture. (paper)
Yuan, Lu; Li, Yao; Li, Hangdao; Lu, Hongyang; Tong, Shanbao
2015-09-01
Rodent middle cerebral artery occlusion (MCAO) model is commonly used in stroke research. Creating a stable infarct volume has always been challenging for technicians due to the variances of animal anatomy and surgical operations. The depth of filament suture advancement strongly influences the infarct volume as well. We investigated the cerebral blood flow (CBF) changes in the affected cortex using laser speckle contrast imaging when advancing suture during MCAO surgery. The relative CBF drop area (CBF50, i.e., the percentage area with CBF less than 50% of the baseline) showed an increase from 20.9% to 69.1% when the insertion depth increased from 1.6 to 1.8 cm. Using the real-time CBF50 marker to guide suture insertion during the surgery, our animal experiments showed that intraoperative CBF-guided surgery could significantly improve the stability of MCAO with a more consistent infarct volume and less mortality.
Speckle imaging of active galactic nuclei: NGC 1068 and NGC 4151
International Nuclear Information System (INIS)
Ebstein, S.M.
1987-01-01
High-resolution images of NGC 1068 and NGC 4151 in the [O III] 5007 Å line and the nearby continuum, produced from data taken with the PAPA photon-counting imaging detector using the technique of speckle imaging, are presented. The images show an unresolved core of [O III] 5007 Å emission in the middle of an extended emission region. The extended emission tends to lie alongside the subarcsecond radio structure. In NGC 4151, the extended emission comes from a nearly linear structure extending on both sides of the unresolved core. In NGC 1068, the extended emission is also a linear structure centered on the unresolved core, but the emission is concentrated in lobes lying to either side of the major axis. The continuum of NGC 4151 is spatially unresolved. The continuum of NGC 1068 is extended ∼1'' to the SW of the center of the [O III] 5007 Å emission. Certain aspects of the PAPA detector are discussed, including the variable-threshold discriminators that track the image intensifier pulse height, and the camera artifacts. The data processing is described in detail.
Burmeister, David M.; Ponticorvo, Adrien; Yang, Bruce; Becerra, Sandra C.; Choi, Bernard; Durkin, Anthony J.; Christy, Robert J.
2015-01-01
Surgical intervention for second-degree burns is often delayed because of the difficulty of visual diagnosis, which increases the risk of scarring and infection. Non-invasive metrics have shown promise in accurately assessing burn depth. Here, we examine the use of spatial frequency domain imaging (SFDI) and laser speckle imaging (LSI) for predicting burn depth. Contact burn wounds of increasing severity were created on the dorsum of a Yorkshire pig, and wounds were imaged with SFDI/LSI starting immediately after the burn and then daily for the next 4 days. In addition, on each day the burn wounds were biopsied for histological analysis of burn depth, defined by collagen coagulation, apoptosis, and adnexal/vascular necrosis. Histological results show that collagen coagulation progressed from day 0 to day 1 and then stabilized. Burn wound imaging using these non-invasive techniques produced metrics that correlate with different predictors of burn depth: collagen coagulation and apoptosis correlated with the SFDI scattering coefficient parameter (μs′), and adnexal/vascular necrosis on the day of the burn correlated with blood flow determined by LSI. Therefore, incorporating the SFDI scattering coefficient and LSI-determined blood flow may provide an algorithm for accurate real-time assessment of burn wound severity. PMID:26138371
Dynamical properties of speckled speckles
DEFF Research Database (Denmark)
Hanson, Steen Grüner; Iversen, Theis Faber Quist; Hansen, Rene Skov
2010-01-01
the static diffuser and the plane of observation consist of an optical system that can be characterized by a complex-valued ABCD-matrix (e.g. simple and complex imaging systems, free space propagation in both the near- and far-field, and Fourier transform systems). The use of the complex ABCD-method means … to be Gaussian, but the derived expressions are not restricted to a plane incident beam. The results are applicable for speckle-based systems for determining mechanical displacements, especially for long-range systems, and for analyzing systems for measuring biological activity beyond a diffuse layer, e.g. blood …
Motion Estimation Using the Firefly Algorithm in Ultrasonic Image Sequence of Soft Tissue
Directory of Open Access Journals (Sweden)
Chih-Feng Chao
2015-01-01
Ultrasonic image sequences of soft tissue are widely used in disease diagnosis; however, speckle noise usually degrades image quality, and these images usually have a low signal-to-noise ratio. As a result, traditional motion estimation algorithms are not suitable for measuring the motion vectors. In this paper, a new motion estimation algorithm is developed for assessing the velocity field of soft tissue in a sequence of ultrasonic B-mode images. The proposed iterative firefly algorithm (IFA) searches a few candidate points to obtain the optimal motion vector, and is compared to the traditional iterative full search algorithm (IFSA) via a series of experiments on in vivo ultrasonic image sequences. The experimental results show that the IFA estimates the motion vector more efficiently, with almost equal estimation quality, compared to the traditional IFSA method.
Off-axis holographic laser speckle contrast imaging of blood vessels in tissues
Abdurashitov, Arkady; Bragina, Olga; Sindeeva, Olga; Sergey, Sindeev; Semyachkina-Glushkovskaya, Oxana V.; Tuchin, Valery V.
2017-09-01
Laser speckle contrast imaging (LSCI) has become one of the most common tools for functional imaging in tissues. Its incomplete theoretical description and the sophisticated interpretation of measurement results are outweighed by low-cost and simple hardware, speed, consistent results, and repeatability. Beyond the relatively small measured volume, with a probing depth of around 700 μm for the visible spectral range of illumination, there is no depth selectivity in the conventional LSCI configuration; furthermore, in the case of a high-NA objective, the actual penetration depth of light in tissue is greater than the depth of field (DOF) of the imaging system. Thus, information about these out-of-focus regions persists in the recorded frames but cannot be retrieved due to the intensity-based registration method. We propose a simple modification of the LSCI system, based on off-axis holography, to introduce post-acquisition refocusing, overcoming both the depth-selectivity and DOF problems, and offering the potential to produce a cross-sectional view of the specimen.
International Nuclear Information System (INIS)
Miao, Peng; Feng, Shihan; Zhang, Qi; Lin, Xiaojie; Xie, Bohua; Liu, Chenwei; Yang, Guo-Yuan
2014-01-01
Abstract In-vivo imaging of blood flow in the cortex and sub-cortex is still a challenge in biological and pathological studies of cerebral vascular diseases. Laser speckle contrast imaging (LSCI) only provides cortex blood flow information. Traditional synchrotron radiation micro-angiography (SRA) provides sub-cortical vasculature information with high resolution. In this study, a bolus front-tracking method was developed to extract blood flow information based on SRA. Combining LSCI and SRA, arterial blood flow in the ipsilateral cortex and sub-cortex was monitored after experimental intracerebral hemorrhage of mice. At 72 h after injury, a significant blood flow increase was observed in the lenticulostriate artery along with blood flow decrease in cortical branches of the middle cerebral artery. This combined strategy provides a new approach for the investigation of brain vasculature and blood flow changes in preclinical studies. (paper)
Chiang, F. P.; Jin, F.; Wang, Q.; Zhu, N.
Before the milestone work of Leendertz in 1970, coherent speckles generated from a laser-illuminated object were considered noise to be eliminated or minimized. Leendertz showed that coherent speckles are actually information carriers. Since then the speckle technique has found many applications in mechanics, metrology, nondestructive evaluation and materials science. Speckles need not be coherent: artificially created so-called white-light speckles can also be used as information carriers. In this paper we present two recent developments of the speckle technique, with applications to micromechanics problems using SIEM (Speckle Interferometry with Electron Microscopy), and to nondestructive evaluation of crevice corrosion, composite disbond, and vibration of large structures using TADS (Time-Average Digital Specklegraphy).
Crawford, D C; Bell, D S; Bamber, J C
1993-01-01
A systematic method to compensate for nonlinear amplification of individual ultrasound B-scanners has been investigated in order to optimise performance of an adaptive speckle reduction (ASR) filter for a wide range of clinical ultrasonic imaging equipment. Three potential methods have been investigated: (1) a method involving an appropriate selection of the speckle recognition feature was successful when the scanner signal processing executes simple logarithmic compressions; (2) an inverse transform (decompression) of the B-mode image was effective in correcting for the measured characteristics of image data compression when the algorithm was implemented in full floating point arithmetic; (3) characterising the behaviour of the statistical speckle recognition feature under conditions of speckle noise was found to be the method of choice for implementation of the adaptive speckle reduction algorithm in limited precision integer arithmetic. In this example, the statistical features of variance and mean were investigated. The third method may be implemented on commercially available fast image processing hardware and is also better suited for transfer into dedicated hardware to facilitate real-time adaptive speckle reduction. A systematic method is described for obtaining ASR calibration data from B-mode images of a speckle producing phantom.
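Method (2) above amounts to inverting the scanner's amplitude compression before filtering. A minimal sketch under the assumption of a pure logarithmic display law y = a·ln(x) + b (real scanners deviate from this, which is why methods (1) and (3) exist; the parameter values here are illustrative, not measured scanner characteristics):

```python
import math

def compress(x, a=20.0, b=0.0):
    """Assumed scanner display law: y = a*ln(x) + b (x > 0)."""
    return a * math.log(x) + b

def decompress(y, a=20.0, b=0.0):
    """Inverse transform applied to B-mode pixels before the ASR filter."""
    return math.exp((y - b) / a)

# Round trip restores the linear echo amplitude.
amp = 3.7
restored = decompress(compress(amp))  # ≈ 3.7 up to float error
```

As the abstract notes, this decompression only behaved well in full floating-point arithmetic; in limited-precision integer pipelines, characterising the speckle statistics directly (method 3) was the method of choice.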
DEFF Research Database (Denmark)
Jørgensen, Thomas Martini; Thrane, Lars; Mogensen, M.
2007-01-01
the scheme with a mobile fiber-based time-domain real-time OCT system. Essential enhancement was obtained in image contrast when performing in vivo imaging of normal skin and lesions. Resulting images show improved delineation of structure in correspondence with the observed improvements in contrast … system. Here, we consider a method that in principle can be fitted to most OCT systems without major modifications. Specifically, we address a spatial diversity technique for suppressing speckle noise in OCT images of human skin. The method is a variant of changing the position of the sample relative …
Cui, Han; Chen, Yi; Zhong, Weizheng; Yu, Haibo; Li, Zhifeng; He, Yuhai; Yu, Wenlong; Jin, Lei
2016-01-01
Bell's palsy is a kind of peripheral neural disease that causes abrupt onset of unilateral facial weakness. Pathologic studies have shown ischemia of the facial nerve on the affected side of the face in Bell's palsy patients. Since the direction of facial nerve blood flow is primarily proximal to distal, facial skin microcirculation would also be affected after the onset of Bell's palsy. Therefore, monitoring the full area of facial skin microcirculation would help to identify the condition of Bell's palsy patients. In this study, a non-invasive, real-time and full-field imaging technology, laser speckle imaging (LSI), was applied to measure the facial skin blood perfusion distribution of Bell's palsy patients. 85 participants with different stages of Bell's palsy were included. Results showed that patients' facial skin perfusion on the affected side was lower than that on the normal side in the eyelid region, and that the asymmetry of facial skin perfusion between the two sides of the eyelid is positively related to the stage of the disease. We discovered that facial skin blood perfusion could reflect the stage of Bell's palsy, which suggests that microcirculation should be investigated in patients with this neurological deficit, and that LSI is a potential diagnostic tool for Bell's palsy.
State of the Art of X-ray Speckle-Based Phase-Contrast and Dark-Field Imaging
Directory of Open Access Journals (Sweden)
Marie-Christine Zdora
2018-04-01
In the past few years, X-ray phase-contrast and dark-field imaging have evolved into invaluable tools for non-destructive sample visualisation, delivering information inaccessible to conventional absorption imaging. X-ray phase-sensing techniques are furthermore increasingly used for at-wavelength metrology and optics characterisation. One of the latest additions to the group of differential phase-contrast methods is the X-ray speckle-based technique. It has drawn significant attention due to, amongst others, its simple and flexible experimental arrangement, cost-effectiveness and multimodal character. Since its first demonstration at highly brilliant synchrotron sources, the method has seen rapid development, including translation to polychromatic laboratory sources and extension to higher-energy X-rays. Recently, different advanced acquisition schemes have been proposed to tackle some of the main limitations of previous implementations. Current applications of the speckle-based method range from optics characterisation and wavefront measurement to biomedical imaging and materials science. This review provides an overview of the state of the art of the X-ray speckle-based technique. Its basic principles and different experimental implementations, as well as the latest advances and applications, are illustrated. In the end, an outlook for anticipated future developments of this promising technique is given.
Speckle noise reduction in breast ultrasound images: the SMU (SRAD median unsharp) approach
International Nuclear Information System (INIS)
Njeh, I.; Sassi, O. B.; Ben Hamida, A.; Chtourou, K.
2011-01-01
Image denoising has become essential for better information extraction from images, particularly from noisy ones such as ultrasound images. In certain cases, for instance in ultrasound images, the noise can obscure information which is valuable to the general practitioner. Consequently, medical images are very inconsistent, and it is crucial to operate case by case. This paper presents a novel algorithm, SMU (SRAD Median Unsharp), for noise suppression in ultrasound breast images, in order to realize computer-aided diagnosis (CAD) for breast cancer.
Milstein, Dan M J; Ince, Can; Gisbertz, Suzanne S; Boateng, Kofi B; Geerts, Bart F; Hollmann, Markus W; van Berge Henegouwen, Mark I; Veelo, Denise P
2016-06-01
Gastric tube reconstruction (GTR) is a high-risk surgical procedure with substantial perioperative morbidity. Compromised arterial blood supply and venous congestion are believed to be the main etiologic factors associated with early and late anastomotic complications. Identifying areas of low blood perfusion may provide information on the risks of future anastomotic leakage and could be essential for improving surgical techniques. The aim of this study was to develop a method for gastric microvascular perfusion analysis using laser speckle contrast imaging (LSCI) and to test the hypothesis that LSCI is able to identify ischemic regions on GTRs. Patients requiring elective laparoscopy-assisted GTR participated in this single-center observational investigation. A method for intraoperative evaluation of blood perfusion and postoperative analysis was developed and validated for reproducibility. Laser speckle measurements were performed at 3 different time points: baseline (devascularized) stomach (T0), after GTR (T1), and GTR at 20° reverse Trendelenburg (T2). Blood perfusion analysis inter-rater reliability was high, with intraclass correlation coefficients for each time point approximating 1 (P < 0.0001). Baseline (T0) and GTR (T1) mean blood perfusion profiles were highest at the base of the stomach and then progressively declined towards significant ischemia at the most cranial point, or anastomotic tip (P < 0.01). After GTR, a statistically significant improvement in mean blood perfusion was observed in the cranial gastric regions of interest (P < 0.05). A generalized significant decrease in mean blood perfusion was observed across all GTR regions of interest during 20° reverse Trendelenburg (P < 0.05). It was feasible to implement LSCI intraoperatively to produce blood perfusion assessments on intact and reconstructed whole stomachs. The analytical design presented in this study resulted in good reproducibility of gastric perfusion measurements.
Timoshina, Polina A.; Shi, Rui; Zhang, Yang; Zhu, Dan; Semyachkina-Glushkovskaya, Oxana V.; Tuchin, Valery V.; Luo, Qingming
2015-03-01
The study of blood microcirculation is one of the most important problems in medicine. This paper presents the results of an experimental study of cerebral blood flow microcirculation in mice with alloxan-induced diabetes using Temporal Laser Speckle Imaging (TLSI). Additionally, the direct effect of glucose water solutions (concentrations of 20% and 45%) on blood flow microcirculation was studied. In the research, 20 white laboratory mice weighing 20-30 g were used. The TLSI method allows one to investigate time-dependent scattering from objects with complex dynamics, since it possesses greater temporal resolution. Results show that in the brains of the diabetic group, the diameter of the sagittal vein is increased and the speed of blood flow is reduced relative to the control group. Topical application of 20%- or 45%-glucose solutions also causes an increase in the diameter of blood vessels and slows down blood circulation. The results obtained show that diabetes development causes changes in the cerebral microcirculatory system, and that TLSI techniques can be effectively used to quantify these alterations.
Verho, Tuukka; Karppinen, Pasi; Gröschel, André H; Ikkala, Olli
2018-01-01
Mollusk nacre is a prototypical biological inorganic-organic composite that combines high toughness, stiffness, and strength through its brick-and-mortar microstructure, which has inspired several synthetic mimics. Its remarkable fracture toughness relies on inelastic deformations at the process zone at the crack tip that dissipate stress concentrations and stop cracks. The micrometer-scale structure allows resolving the size and shape of the process zone to understand the fracture processes. However, for better scalability, nacre-mimetic nanocomposites with aligned inorganic or graphene nanosheets are extensively pursued, to avoid the packing problems of mesoscale sheets as in nacre, or slow in situ biomineralization. This calls for novel methods to explore the process zone of biomimetic nanocomposites. Here the fracture of nacre and a nacre-inspired clay/polymer nanocomposite is explored using laser speckle imaging, which reveals the process zone even in the absence of changes in optical scattering. To demonstrate the diagnostic value: compared to nacre, the nacre-inspired nanocomposite develops a process zone more abruptly, with macroscopic crack deflection shown by a flattened process zone. In situ scanning electron microscopy suggests similar toughening mechanisms in the nanocomposite and nacre. These new insights guide the design of nacre-inspired nanocomposites toward better mechanical properties, to reach the level of synergy of their biological model.
Skin perfusion evaluation between laser speckle contrast imaging and laser Doppler flowmetry
Humeau-Heurtier, Anne; Mahe, Guillaume; Durand, Sylvain; Abraham, Pierre
2013-03-01
In the biomedical field, laser Doppler flowmetry (LDF) and laser speckle contrast imaging (LSCI) are two optical techniques for non-invasively monitoring microvascular blood perfusion. LDF has been used for nearly 40 years, whereas LSCI is a recent technique that overcomes some drawbacks of LDF. Both LDF and LSCI give perfusion assessments in arbitrary units. However, the relationship between the perfusion values given by LDF and by LSCI over large blood flow ranges has not yet been fully studied. We therefore evaluate herein the relationship between LDF and LSCI perfusion values across a broad range of skin blood flows. For this purpose, LDF and LSCI data were acquired simultaneously on the forearms of 12 healthy subjects, at rest, during vascular occlusions of different durations, and during reactive hyperemia. For the range of skin blood flows studied, a power function fits the data better than a linear function: exponents for individual subjects range from 1.2 to 1.7, and the exponent is close to 1.3 when all subjects are pooled. We thus suggest distinguishing between the perfusion values given by the two optical systems.
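The power-law relation reported in the abstract above can be estimated by ordinary least squares in log-log space. The sketch below is illustrative only: the variable names and synthetic data are assumptions, not the study's actual measurements.

```python
import numpy as np

def fit_power_law(x, y):
    """Fit y ~ a * x**p by linear least squares in log-log space."""
    p, log_a = np.polyfit(np.log(x), np.log(y), 1)
    return np.exp(log_a), p

# Synthetic check: data generated with exponent 1.3 is recovered.
lsci = np.linspace(1.0, 10.0, 50)   # stand-in LSCI perfusion values
ldf = 2.0 * lsci ** 1.3             # stand-in LDF perfusion values
a, p = fit_power_law(lsci, ldf)
```

Because the synthetic data are exactly power-law, the fit recovers the exponent 1.3 to floating-point precision.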
Determining the mechanical properties of rat skin with digital image speckle correlation.
Guan, E; Smilow, Sarah; Rafailovich, Miriam; Sokolov, Jonathan
2004-01-01
Accurate measurement of the mechanical properties of skin has numerous implications in surgical repair, dermal disorders, and the diagnosis and treatment of trauma to the skin. Investigation of facial wrinkle formation, as well as research on skin aging and cosmetic product assessment, can also benefit from alternative methodologies for measuring mechanical properties. A noncontact, noninvasive technique, digital image speckle correlation (DISC), has been successfully introduced to measure the deformation field of a skin sample loaded by a materials test machine. With the force information obtained from the loading device, the mechanical properties of the skin, such as Young's modulus, linear limit and material strength, can be calculated using elastic or viscoelastic theory. The DISC method was used to measure the deformation of neonatal rat skin, with and without a glycerin-fruit-oil-based cream, under uniaxial tension. The deformation-to-failure behavior of newborn rat skin was recorded and analyzed. Single skin-layer failures were observed and located by finding the strain concentration. Young's moduli of freshly excised rat skin, cream-treated rat skin and untreated rat skin 24 h after excision were found by tensile testing to be 1.6, 1.4 and 0.7 MPa, respectively. Our results show that DISC provides a novel technique for numerous applications in dermatology and reconstructive surgery. Copyright 2004 S. Karger AG, Basel
Jayanthy, A. K.; Sujatha, N.; Reddy, M. Ramasubba; Narayanamoorthy, V. B.
2014-03-01
Measuring microcirculatory tissue blood perfusion is of interest to both clinicians and researchers in a wide range of applications, and it can provide essential information on the progress of treatment of diseases that cause either increased or decreased blood flow. Diabetic ulcers, associated with alterations in tissue blood flow, are the most common cause of non-traumatic lower-extremity amputations. A technique that can detect the onset of ulcers and provide essential information on the progress of their treatment would be of great help to clinicians. This paper describes a noninvasive, noncontact, whole-field laser speckle contrast imaging (LSCI) technique used to assess changes in blood flow in diabetic-ulcer-affected areas of the foot. Blood flow assessment at the wound site can provide critical information on the efficiency and progress of the treatment given to diabetic ulcer subjects. The technique may also fulfill a significant need in diabetic foot ulcer screening and management.
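The statistic underlying LSCI in abstracts such as this one is the local speckle contrast, the ratio of standard deviation to mean intensity in a small sliding window. A minimal sketch (window size and helper names are illustrative, not from the paper):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(img, win=7):
    """Local speckle contrast K = sigma / mean over a win x win window.
    Static, fully developed speckle gives K near 1; motion blurs the
    speckle during the exposure and drives K toward 0."""
    img = np.asarray(img, dtype=float)
    mean = uniform_filter(img, size=win)
    mean_sq = uniform_filter(img * img, size=win)
    var = np.clip(mean_sq - mean * mean, 0.0, None)  # guard rounding
    return np.sqrt(var) / np.maximum(mean, 1e-12)
```

Lower K in the resulting map is read as higher perfusion; practical systems then convert K to a relative flow index.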
From synchrotron radiation to lab source: advanced speckle-based X-ray imaging using abrasive paper
Wang, Hongchang; Kashyap, Yogesh; Sawhney, Kawal
2016-02-01
X-ray phase and dark-field imaging techniques provide complementary information that is inaccessible to conventional X-ray absorption or visible-light imaging. However, such methods typically require sophisticated experimental apparatus or X-ray beams with specific properties. Recently, an X-ray speckle-based technique has shown great potential for X-ray phase and dark-field imaging with a simple experimental arrangement. However, it still suffers from either poor resolution or the time-consuming process of collecting a large number of images. To overcome these limitations, we demonstrate in this report that absorption, dark-field, phase-contrast, and two orthogonal differential phase-contrast images can be generated simultaneously by scanning a piece of abrasive paper in only one direction. We propose a novel theoretical approach to quantitatively extract the above five images by utilising the remarkable properties of speckle. Importantly, the technique has been extended from a synchrotron light source to a lab-based microfocus X-ray source with a flat-panel detector. Removing the need to raster the optics in two directions significantly reduces the acquisition time and absorbed dose, which can be of vital importance for many biological samples. This new imaging method could provide a breakthrough for numerous practical imaging applications in biomedical research and materials science.
Signal-to-noise based local decorrelation compensation for speckle interferometry applications
International Nuclear Information System (INIS)
Molimard, Jerome; Cordero, Raul; Vautrin, Alain
2008-01-01
Speckle-based interferometric techniques allow assessment of the whole-field deformation induced on a specimen by the application of load. These high-sensitivity optical techniques yield fringe images generated by subtracting speckle patterns captured while the specimen undergoes deformation. The quality of the fringes, and in turn the accuracy of the deformation measurements, strongly depends on speckle correlation. Specimen rigid-body motion leads to speckle decorrelation that, in general, cannot be effectively counteracted by applying a global translation to the involved speckle patterns. In this paper, we propose a recorrelation procedure based on the application of locally evaluated translations. The procedure involves dividing the field into several regions, applying a local translation, and calculating, in every region, the signal-to-noise ratio (SNR). Since the SNR is a correlation indicator (the noise increases with decorrelation), we argue that the proper translation is the one that maximizes the locally evaluated SNR. The search for the proper local translations is, of course, an iterative process that can be facilitated by an SNR optimization algorithm. The performance of the proposed recorrelation procedure was tested on two examples. First, the SNR optimization algorithm was applied to fringe images obtained by subtracting simulated speckle patterns. Next, it was applied to fringe images obtained with a shearography optical setup from a specimen subjected to mechanical deformation. Our results show that the proposed SNR optimization method can significantly improve the reliability of measurements performed with speckle-based techniques.
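The local recorrelation idea can be sketched as a search over candidate integer translations in each region, keeping the one with the best score. Note the paper maximizes a fringe-image SNR; the sketch below substitutes a plain normalized cross-correlation score as a stand-in, and all names and parameters are illustrative:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def best_local_shift(ref, cur, max_shift=3):
    """Exhaustively search integer shifts of one region, keeping the one
    that maximizes the correlation score (a stand-in for the paper's SNR)."""
    best_score, best_shift = -2.0, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = ncc(ref, np.roll(np.roll(cur, dy, axis=0), dx, axis=1))
            if score > best_score:
                best_score, best_shift = score, (dy, dx)
    return best_shift, best_score
```

In practice this per-region search would be run on every region of the divided field, and an optimizer would replace the exhaustive loop.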
Regan, Caitlin; Hayakawa, Carole; Choi, Bernard
2017-12-01
Due to its simplicity and low cost, laser speckle imaging (LSI) has achieved widespread use in biomedical applications. However, interpretation of the blood-flow maps remains ambiguous, as LSI enables only limited visualization of vasculature below scattering layers such as the epidermis and skull. Here, we describe a computational model that enables flexible in-silico study of the impact of these factors on LSI measurements. The model uses Monte Carlo methods to simulate light and momentum transport in a heterogeneous tissue geometry. The virtual detectors of the model track several important characteristics of light. This model enables study of LSI aspects that may be difficult or unwieldy to address in an experimental setting, and enables detailed study of the fundamental origins of speckle contrast modulation in tissue-specific geometries. We applied the model to an in-depth exploration of the spectral dependence of speckle contrast signal in the skin, the effects of epidermal melanin content on LSI, and the depth-dependent origins of our signal. We found that LSI of transmitted light allows for a more homogeneous integration of the signal from the entire bulk of the tissue, whereas epi-illumination measurements of contrast are limited to a fraction of the light penetration depth. We quantified the spectral depth dependence of our contrast signal in the skin, and did not observe a statistically significant effect of epidermal melanin on speckle contrast. Finally, we corroborated these simulated results with experimental LSI measurements of flow beneath a thin absorbing layer. The results of this study suggest the use of LSI in the clinic to monitor perfusion in patients with different skin types, or inhomogeneous epidermal melanin distributions.
Directory of Open Access Journals (Sweden)
Joshua S. Ullom
2012-01-01
A method is proposed for improving the contrast-to-noise ratio (CNR) while maintaining the −6 dB axial resolution of ultrasonic B-mode images. The proposed technique, known as eREC-FC, enhances the recently developed REC-FC technique. REC-FC is a combination of the coded-excitation technique known as resolution enhancement compression (REC) and the speckle-reduction technique frequency compounding (FC). In REC-FC, image CNR is improved, but at the expense of a reduction in axial resolution. However, by compounding various REC-FC images made from various subband widths, the tradeoff between axial resolution and CNR enhancement can be extended. Further improvements in CNR can be obtained by applying post-processing despeckling filters to the eREC-FC B-mode images. The despeckling filters evaluated were the median, Lee, homogeneous mask area, geometric, and speckle-reducing anisotropic diffusion (SRAD) filters. Simulations and experimental measurements were conducted with a single-element transducer (f/2.66) having a center frequency of 2.25 MHz and a −3 dB bandwidth of 50%. In simulations and experiments, the eREC-FC technique resulted in the same axial resolution as typically observed with conventional pulsed excitation. Moreover, increases in CNR of 348% were obtained in experiments when comparing eREC-FC with a Lee filter to conventional pulsing methods.
Ultrasound speckle reduction based on fractional order differentiation.
Shao, Dangguo; Zhou, Ting; Liu, Fan; Yi, Sanli; Xiang, Yan; Ma, Lei; Xiong, Xin; He, Jianfeng
2017-07-01
Ultrasound images show a granular pattern of noise known as speckle that diminishes their quality and causes difficulties in diagnosis. To reduce speckle while preserving edges and features, this paper proposes an image operator based on fractional-order differentiation. An image de-noising model based on fractional partial differential equations, with a balance relation between k (the gradient-modulus threshold that controls the conduction) and v (the order of fractional differentiation), was constructed by effectively combining fractional calculus theory with a partial differential equation, and its numerical algorithm was implemented using a fractional differential mask operator. The proposed algorithm achieves better speckle reduction and structure preservation than three existing methods [the P-M model, the speckle-reducing anisotropic diffusion (SRAD) technique, and the detail-preserving anisotropic diffusion (DPAD) technique]. It is also significantly faster than bilateral filtering (BF) while producing virtually the same results. Ultrasound phantom testing and in vivo imaging show that the proposed method can improve the quality of an ultrasound image in terms of tissue SNR, CNR, and FOM values.
Speckle Suppression by Weighted Euclidean Distance Anisotropic Diffusion
Directory of Open Access Journals (Sweden)
Fengcheng Guo
2018-05-01
To better reduce image speckle noise while maintaining edge information in synthetic aperture radar (SAR) images, we propose a novel anisotropic diffusion algorithm using weighted Euclidean distance (WEDAD). Presented here is a modified speckle-reducing anisotropic diffusion (SRAD) method that constructs a new edge-detection operator using weighted Euclidean distances. The new edge-detection operator can adaptively distinguish between homogeneous and heterogeneous image regions, effectively generate anisotropic diffusion coefficients for each image pixel, and filter each pixel at a different scale. Additionally, the effects of two different weighting methods (Gaussian weighting and non-linear weighting) on de-noising were analyzed, and the effect of different adjustment-coefficient settings on speckle suppression was explored. A series of experiments were conducted using an image with added noise, a GF-3 SAR image, and a YG-29 SAR image. The experimental results demonstrate that the proposed method can not only significantly suppress speckle, thus improving the visual quality, but also better preserve the edge information of images.
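WEDAD builds on diffusion filters of the Perona-Malik/SRAD family. The sketch below shows only the generic gradient-driven diffusion step that such filters share; the paper's weighted-Euclidean-distance edge detector and speckle-adapted conduction are not reproduced, and all parameter values are illustrative:

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, k=0.1, lam=0.2):
    """Generic Perona-Malik diffusion step shared by SRAD-style filters:
    smooth strongly where differences are small (homogeneous regions),
    weakly across large differences (edges)."""
    u = np.asarray(img, dtype=float).copy()
    g = lambda d: np.exp(-(d / k) ** 2)  # conduction coefficient in (0, 1]
    for _ in range(n_iter):
        # Differences toward the four neighbours (circular borders).
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        u += lam * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u
```

With lam <= 0.25 the explicit update is stable; SRAD-type variants replace g with a statistic derived from local speckle properties.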
Objective speckle velocimetry for autonomous vehicle odometry.
Francis, D; Charrett, T O H; Waugh, L; Tatam, R P
2012-06-01
Speckle velocimetry is investigated as a means of determining odometry data, with potential application on autonomous robotic vehicles. The technique described here relies on the integration of translation measurements made by normalized cross-correlation of speckle patterns to determine the change in position over time. The use of objective (non-imaged) speckle offers a number of advantages over subjective (imaged) speckle, such as a reduction in the number of optical components, reduced modulation of speckles at the edges of the image, and improved light efficiency. The influence of the source/detector configuration on the speckle-translation-to-vehicle-translation scaling factor for objective speckle is investigated using a computer model and verified experimentally. Experimental measurements are presented at velocities up to 80 mm/s, showing accuracy better than 0.4%.
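The core translation measurement, cross-correlation of successive speckle frames, can be sketched with an FFT-based circular cross-correlation (integer-pixel shifts only; a real system would add subpixel interpolation and the calibrated scaling factor to vehicle coordinates):

```python
import numpy as np

def shift_between(frame0, frame1):
    """Integer-pixel shift between two speckle frames, from the peak of
    the FFT-based circular cross-correlation."""
    f0 = frame0 - frame0.mean()
    f1 = frame1 - frame1.mean()
    corr = np.fft.ifft2(np.fft.fft2(f0).conj() * np.fft.fft2(f1)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = corr.shape
    # Map wrapped indices to signed shifts.
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

Odometry then accumulates these per-frame shifts over time, scaled by the source/detector geometry factor the paper investigates.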
Improving high resolution retinal image quality using speckle illumination HiLo imaging.
Zhou, Xiaolin; Bedggood, Phillip; Metha, Andrew
2014-08-01
Retinal image quality from flood-illumination adaptive optics (AO) ophthalmoscopes is adversely affected by out-of-focus light scatter due to the lack of confocality. This effect is more pronounced in small eyes, such as those of rodents, because the requisite high optical power confers a large dioptric thickness to the retina. A recently developed structured-illumination microscopy (SIM) technique called HiLo imaging has been shown to reduce the effect of out-of-focus light scatter in flood-illumination microscopes and to produce pseudo-confocal images with significantly improved image quality. In this work, we adapted the HiLo technique to a flood-illumination AO ophthalmoscope and performed AO imaging in both (physical) model and live rat eyes. The improvement in image quality from HiLo imaging is shown both qualitatively and quantitatively using spatial spectral analysis.
Segmentation of pomegranate MR images using spatial fuzzy c-means (SFCM) algorithm
Moradi, Ghobad; Shamsi, Mousa; Sedaaghi, M. H.; Alsharif, M. R.
2011-10-01
Segmentation is one of the fundamental problems of image processing and machine vision, and it plays a prominent role in a variety of image processing applications. In this paper, one such application, the segmentation of pomegranate MR images, is explored. Pomegranate is a fruit with pharmacological properties such as anti-viral and anti-cancer activity. Having a high-quality product in hand is a critical factor in its marketing, and the internal quality of the product is especially important in the sorting process. The determination of qualitative features cannot be made manually; therefore, the segmentation of the internal structures of the fruit needs to be performed as accurately as possible in the presence of noise. The fuzzy c-means (FCM) algorithm is noise-sensitive, and noisy pixels are misclassified. As a solution, this paper proposes the spatial FCM (SFCM) algorithm for the segmentation of pomegranate MR images. The algorithm incorporates spatial neighborhood information into FCM and modifies the fuzzy membership function for each class. Segmentation results on original pomegranate MR images and on images corrupted by Gaussian, salt-and-pepper, and speckle noise show that the SFCM algorithm performs considerably better than the FCM algorithm. After several stages of qualitative and quantitative analysis, we also conclude that the SFCM algorithm with a 5×5 window performs better than with other window sizes.
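One common way to add spatial information to FCM is to smooth the membership maps over a neighborhood at each iteration. The sketch below is a generic SFCM variant under assumed parameters; the paper's exact membership modification may differ:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sfcm(img, c=2, m=2.0, n_iter=30, win=5, seed=0):
    """Fuzzy c-means on pixel intensities, with memberships averaged
    over a win x win neighbourhood each iteration (one common way to
    add spatial information and suppress noisy-pixel misclassification)."""
    x = np.asarray(img, dtype=float)
    rng = np.random.default_rng(seed)
    u = rng.random((c,) + x.shape)
    u /= u.sum(axis=0)
    for _ in range(n_iter):
        um = u ** m
        centers = (um * x).sum(axis=(1, 2)) / um.sum(axis=(1, 2))
        dist = np.abs(x[None] - centers[:, None, None]) + 1e-9
        u = dist ** (-2.0 / (m - 1.0))       # standard FCM membership
        u /= u.sum(axis=0)
        u = np.stack([uniform_filter(ui, size=win) for ui in u])
        u /= u.sum(axis=0)                   # spatial smoothing step
    return u.argmax(axis=0), centers
```

On a noisy two-region image the smoothing step removes isolated misclassified pixels that plain FCM would leave behind.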
Algorithms for boundary detection in radiographic images
International Nuclear Information System (INIS)
Gonzaga, Adilson; Franca, Celso Aparecido de
1996-01-01
Edge-detection techniques applied to digital radiographic images are discussed. Some algorithms have been implemented, and results are displayed that enhance boundaries or suppress details. An algorithm applied to a preprocessed, contrast-enhanced image is proposed, and the results are discussed.
Laser Speckle Imaging of Rat Pial Microvasculature during Hypoperfusion-Reperfusion Damage
Directory of Open Access Journals (Sweden)
Teresa Mastantuono
2017-09-01
The present study aimed to assess in vivo the blood flow oscillatory patterns in rat pial microvessels during 30 min of bilateral common carotid artery occlusion (BCCAO) and 60 min of reperfusion by laser speckle imaging (LSI). Pial microcirculation was visualized by fluorescence microscopy. The blood flow oscillations of single microvessels were recorded by LSI; spectral analysis was performed with the Wavelet transform. Under baseline conditions, arterioles and venules were characterized by blood flow oscillations in the frequency ranges 0.005–0.0095 Hz, 0.0095–0.021 Hz, 0.021–0.052 Hz, 0.052–0.150 Hz and 0.150–0.500 Hz. Arterioles showed oscillations with higher spectral density than venules. Moreover, the frequency components in the ranges 0.052–0.150 Hz and 0.150–0.500 Hz were predominant in the arteriolar total power spectrum, while the frequency component in the range 0.150–0.500 Hz showed the highest spectral density in venules. After 30 min of BCCAO, the arteriolar spectral density decreased compared to baseline; moreover, the arteriolar frequency component in the range 0.052–0.150 Hz significantly decreased in percent spectral density, while the frequency component in the range 0.150–0.500 Hz significantly increased in percent spectral density. However, an increase in arteriolar spectral density was detected at 60 min of reperfusion compared to BCCAO values; consequently, an increase in percent spectral density of the frequency component in the range 0.052–0.150 Hz was observed, while the percent spectral density of the frequency component in the range 0.150–0.500 Hz significantly decreased. The remaining frequency components did not change significantly during hypoperfusion and reperfusion. The changes in blood flow during hypoperfusion/reperfusion caused tissue damage in the cortex and striatum of all animals. In conclusion, our data demonstrate that the frequency component in the range 0.052–0.150 Hz
Fritz, Jason R; Phillips, Brett T; Conkling, Nicole; Fourman, Mitchell; Melendez, Mark M; Bhatnagar, Divya; Simon, Marcia; Rafailovich, Miriam; Dagum, Alexander B
2012-10-01
Dermal substitutes are currently used in plastic surgery to cover various soft tissue defects caused by trauma, burns, or ablative cancer surgery. Little information is available on the biomechanical properties of these dermal substitutes after adequate incorporation, as compared to normal skin. Determining parameters such as tensile strength in these skin substitutes will help us further understand their wound healing properties and their potential for developing artificial tissue constructs. We hypothesize that a dermal substitute has a lower stress-strain curve and altered stress-induced deformation, as quantified with tensiometry and digital image speckle correlation (DISC) analysis. Two separate 5×10-cm full-thickness wounds were created on the dorsum of 3 female swine. Fibrin glue was applied before either a full-thickness skin graft (FTSG) or application of artificial dermal matrix. On day 42, cultured autologous keratinocytes were applied as a cell sheet to the wound covered with Integra. On day 56, the wounds were fully excised, and fresh tissue specimens, including normal skin, were stored in a physiological solution and prepared for analysis. Rectangular samples measuring 4×4×30 mm were excised from the center of each specimen. Using a tensiometer and DISC analysis, we evaluated the tensile strength of 3 different groups of skin: normal, FTSG, and Integra. There is a significant difference between the Integra specimens and both normal skin and FTSG, whereas we found a minimal difference between the stress-strain curves of the latter two. Integra alone shows plastic deformation with continued stretching before ultimate midline fracture. There is a significant difference between the Young's moduli of normal skin and Integra, whereas there is little difference between FTSG and normal skin; DISC confirms this analysis. The normal skin and FTSG show a convergence of vectors to a linear plane, whereas Integra shows very little organization. Using 2 different
Suen, Ricky Wai
The work described in this thesis covers the conversion of HiLo image processing into MATLAB architecture and the use of speckle-illumination HiLo microscopy for ex-vivo and in-vivo imaging of thick tissue models. HiLo microscopy is a wide-field fluorescence imaging technique that has been demonstrated to produce optically sectioned images comparable to confocal microscopy in thin samples. The imaging technique was developed by Jerome Mertz and the Boston University Biomicroscopy Lab and has been implemented in our lab as a stand-alone optical setup and as a modification to a conventional fluorescence microscope. Speckle-illumination HiLo microscopy combines two images, taken under speckle illumination and standard uniform illumination, to generate an optically sectioned image that rejects out-of-focus fluorescence. The speckle contrast evaluated in the images is used as a weighting function, where elements that move out of focus have a speckle contrast that decays to zero. The experiments shown here demonstrate the capability of our HiLo microscopes to produce optically sectioned images of the microvasculature of ex-vivo and in-vivo thick tissue models. The HiLo microscopes were used to image the microvasculature of ex-vivo mouse heart sections prepared for optical histology and the microvasculature of in-vivo rodent dorsal window chamber models. Studies in label-free surface profiling with HiLo microscopy are also presented.
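The HiLo combination described above, low-frequency content weighted by local speckle contrast plus high-frequency content from the uniform image, can be sketched as follows. The cutoff, weighting, and scaling details are simplified assumptions rather than the thesis implementation:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def hilo(uniform_img, speckle_img, sigma=4.0, eta=1.0):
    """Minimal HiLo sketch: the local contrast of the speckle image
    weights the low-pass content (in-focus structure survives, since
    out-of-focus regions lose speckle contrast), while high-pass
    content comes from the uniform image directly."""
    u = np.asarray(uniform_img, dtype=float)
    s = np.asarray(speckle_img, dtype=float)
    mean = gaussian_filter(s, sigma)
    mean_sq = gaussian_filter(s * s, sigma)
    contrast = np.sqrt(np.clip(mean_sq - mean * mean, 0.0, None))
    contrast /= np.maximum(mean, 1e-12)
    lo = gaussian_filter(contrast * u, sigma)  # contrast-weighted Lo image
    hi = u - gaussian_filter(u, sigma)         # Hi image: u minus its low-pass
    return hi + eta * lo                       # eta balances the two bands
```

In a full implementation eta is chosen so the two bands fuse seamlessly at the crossover frequency.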
A new chaotic algorithm for image encryption
International Nuclear Information System (INIS)
Gao Haojiang; Zhang Yisheng; Liang Shuyun; Li Dequn
2006-01-01
Recent research on image encryption algorithms has increasingly been based on chaotic systems, but the drawbacks of small key space and weak security in one-dimensional chaotic cryptosystems are obvious. This paper presents a new nonlinear chaotic algorithm (NCA) that uses power and tangent functions instead of a linear function; its structural parameters are obtained by experimental analysis. An image encryption algorithm in a one-time-one-password system is then designed. The experimental results demonstrate that the image encryption algorithm based on NCA offers the advantages of a large key space and high-level security while maintaining acceptable efficiency. Compared with some general encryption algorithms such as DES, the encryption algorithm is more secure.
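A chaos-based stream cipher of the general kind discussed above can be sketched with a chaotic-map keystream XORed with the image bytes. This is an illustration of the approach only, using the classic logistic map; the paper's NCA uses power and tangent functions and a different key schedule precisely to avoid the small key space of such one-dimensional maps:

```python
import numpy as np

def logistic_keystream(n, x0=0.31415926, r=3.9999):
    """Keystream bytes from iterating the logistic map x -> r*x*(1-x).
    Illustrative only: this is the weak one-dimensional scheme the
    paper's NCA is designed to improve upon."""
    x = x0
    out = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) % 256
    return out

def xor_cipher(img, x0=0.31415926, r=3.9999):
    """XOR the image bytes with the keystream; the same call decrypts."""
    ks = logistic_keystream(img.size, x0, r)
    return (img.ravel() ^ ks).reshape(img.shape)
```

The key is the pair (x0, r); because XOR is an involution, applying the cipher twice with the same key returns the original image.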
A combinational fast algorithm for image reconstruction
International Nuclear Information System (INIS)
Wu Zhongquan
1987-01-01
A combinational fast algorithm has been developed to increase the speed of image reconstruction. First, an interpolation method based on B-spline functions is used in the reconstruction. Next, the influence of the assumed boundary conditions on the interpolation of filtered projections and on the image reconstruction is discussed. It is shown that the boundary condition has almost no influence on the image in the central region of the image space, because the interpolation error decreases rapidly, by a factor of ten, when shifting two pixels from the edge toward the center. In addition, a fast algorithm for computing the detection angle has been combined with the interpolation algorithm, reducing the cost of detection-angle computation by a factor of two. The implementation results show that, at the same subjective and objective fidelity, the computational cost of interpolation with this algorithm is about one-twelfth that of the conventional algorithm.
Vladimirov, A. P.; Malygin, A. S.; Mikhailova, J. A.; Borodin, E. M.; Bakharev, A. A.; Poryvayeva, A. P.
2014-09-01
Earlier we reported the development of a speckle interferometry technique and a device designed to assess the metabolic activity of a cell monolayer cultivated on a glass substrate. This paper aims at upgrading the technique and studying its potential for real-time assessment of the herpes virus development process. Speckle dynamics were recorded in the image plane of intact and virus-infected cell monolayers. HLE-3, L-41 and Vero cells were chosen as research targets, and herpes simplex virus 1 (HSV-1)-infected cell cultures were studied. For 24 h we recorded the digital value of the optical signal I in one pixel and a parameter η characterizing the change in the distribution of the optical signal over 10 × 10-pixel areas. The coefficient of multiple determination calculated from the η time dependences for three intact cell cultures equals 0.94. It was demonstrated that the activity parameters are significantly different for intact and virus-infected cells: the difference in η between intact and HSV-1-infected cells is detectable within 10 minutes of the start of the experiment.
Energy Technology Data Exchange (ETDEWEB)
Borges, J.P. [Laboratório de Atividade Física e Promoção à Saúde, Departamento de Desporto Coletivo, Instituto de Educação Física e Desportos, Universidade do Estado do Rio de Janeiro, Rio de Janeiro, RJ (Brazil); Lopes, G.O. [Laboratório de Atividade Física e Promoção à Saúde, Departamento de Desporto Coletivo, Instituto de Educação Física e Desportos, Universidade do Estado do Rio de Janeiro, Rio de Janeiro, RJ (Brazil); Instituto Nacional de Cardiologia, Rio de Janeiro, RJ (Brazil); Verri, V.; Coelho, M.P.; Nascimento, P.M.C.; Kopiler, D.A. [Instituto Nacional de Cardiologia, Rio de Janeiro, RJ (Brazil); Tibirica, E. [Instituto Nacional de Cardiologia, Rio de Janeiro, RJ (Brazil); Laboratório de Investigação Cardiovascular, Departamento Osório de Almeida, Instituto Oswaldo Cruz, FIOCRUZ, Rio de Janeiro, RJ (Brazil)
2016-09-01
Evaluation of microvascular endothelial function is essential for investigating the pathophysiology and treatment of cardiovascular and metabolic diseases. Although laser speckle contrast imaging technology is well accepted as a noninvasive methodology for assessing microvascular endothelial function, it has never been used to compare male patients with coronary artery disease with male age-matched healthy controls. Thus, the aim of this study was to determine whether laser speckle contrast imaging could be used to detect differences in the systemic microvascular functions of patients with established cardiovascular disease (n=61) and healthy age-matched subjects (n=24). Cutaneous blood flow was assessed in the skin of the forearm using laser speckle contrast imaging coupled with the transdermal iontophoretic delivery of acetylcholine and post-occlusive reactive hyperemia. The maximum increase in skin blood flow induced by acetylcholine was significantly reduced in the cardiovascular disease patients compared with the control subjects (74 vs 116%; P<0.01). With regard to post-occlusive reactive hyperemia-induced vasodilation, the patients also presented reduced responses compared to the controls (0.42±0.15 vs 0.50±0.13 APU/mmHg; P=0.04). In conclusion, laser speckle contrast imaging can identify endothelial and microvascular dysfunctions in male individuals with cardiovascular disease. Thus, this technology appears to be an efficient non-invasive technique for evaluating systemic microvascular and endothelial functions, which could be valuable as a peripheral marker of atherothrombotic diseases in men.
Richards, Lisa M.; Weber, Erica L.; Parthasarathy, Ashwin B.; Kappeler, Kaelyn L.; Fox, Douglas J.; Dunn, Andrew K.
2012-02-01
Monitoring cerebral blood flow (CBF) during neurosurgery can provide important physiological information for a variety of surgical procedures. Although multiple intraoperative vascular monitoring technologies are currently available, a quantitative method that allows for continuous monitoring is still needed. Laser speckle contrast imaging (LSCI) is an optical imaging method with high spatial and temporal resolution that has been widely used to image CBF in animal models in vivo. In this pilot clinical study, we adapted a Zeiss OPMI Pentero neurosurgical microscope to obtain LSCI images by attaching a camera and a laser diode. This LSCI adapted instrument has been used to acquire full field flow images from 10 patients during tumor resection procedures. The patient's ECG was recorded during acquisition and image registration was performed in post-processing to account for pulsatile motion artifacts. Digital photographs confirmed alignment of vasculature and flow images in four cases, and a relative change in blood flow was observed in two patients after bipolar cautery. The LSCI adapted instrument has the capability to produce real-time, full field CBF image maps with excellent spatial resolution and minimal intervention to the surgical procedure. Results from this study demonstrate the feasibility of using LSCI to monitor blood flow during neurosurgery.
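The speckle contrast statistic that underlies LSCI is simple to state: within a small sliding window, K = σ/⟨I⟩, and motion (flow) blurs the speckle during the exposure and lowers K. A minimal numpy sketch, where the window size and padding mode are illustrative choices, not taken from the paper:

```python
import numpy as np

def speckle_contrast(img, win=7):
    """Spatial speckle contrast K = std/mean over a sliding window.

    Static, fully developed speckle gives K near 1; flow blurs the
    pattern during the exposure and lowers K.
    """
    img = np.asarray(img, dtype=float)
    pad = win // 2
    p = np.pad(img, pad, mode="reflect")
    # Stack every window offset; fine for small windows like 5x5 or 7x7.
    offsets = np.stack([
        p[i:i + img.shape[0], j:j + img.shape[1]]
        for i in range(win) for j in range(win)
    ])
    mean = offsets.mean(axis=0)
    return offsets.std(axis=0) / np.maximum(mean, 1e-12)
```

A raw intensity frame in, a same-sized K map out; flow maps are then typically displayed as 1/K² or a similar monotone transform.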
Quantum Image Encryption Algorithm Based on Image Correlation Decomposition
Hua, Tianxiang; Chen, Jiamin; Pei, Dongju; Zhang, Wenquan; Zhou, Nanrun
2015-02-01
A novel quantum gray-level image encryption and decryption algorithm based on image correlation decomposition is proposed. The correlation among image pixels is established by utilizing the superposition and measurement principle of quantum states. And a whole quantum image is divided into a series of sub-images. These sub-images are stored into a complete binary tree array constructed previously and then randomly performed by one of the operations of quantum random-phase gate, quantum revolving gate and Hadamard transform. The encrypted image can be obtained by superimposing the resulting sub-images with the superposition principle of quantum states. For the encryption algorithm, the keys are the parameters of random phase gate, rotation angle, binary sequence and orthonormal basis states. The security and the computational complexity of the proposed algorithm are analyzed. The proposed encryption algorithm can resist brute force attack due to its very large key space and has lower computational complexity than its classical counterparts.
Algorithms for reconstructing images for industrial applications
International Nuclear Information System (INIS)
Lopes, R.T.; Crispim, V.R.
1986-01-01
Several algorithms for reconstructing objects from their projections are being studied in our laboratory for industrial applications. Such algorithms are useful for locating the position and shape of regions of different material composition in the object. A comparative study of two algorithms is made. The two investigated algorithms are MART (Multiplicative Algebraic Reconstruction Technique) and the convolution method. The comparison is carried out from the point of view of the quality of the reconstructed image, the number of views, and cost. (Author)
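The record does not spell MART out, but its standard form updates each pixel multiplicatively by the ratio of measured to computed projections. A toy numpy sketch of that classical update on a made-up 4-pixel system (the projection matrix, iteration count, and relaxation value are illustrative assumptions, not the authors' setup):

```python
import numpy as np

def mart(A, p, n_iter=200, relax=1.0):
    """Multiplicative ART: x_j <- x_j * (p_i / (A x)_i)^(relax * A_ij).

    A: nonnegative projection matrix, p: measured projections (positive).
    Starts from a uniform positive image, as MART requires x > 0.
    """
    x = np.ones(A.shape[1])
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            ax = A[i] @ x
            if ax > 0:
                x *= (p[i] / ax) ** (relax * A[i])
    return x
```

For a consistent system the iterates converge to a positive solution whose reprojections match the data.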
Speckle-learning-based object recognition through scattering media.
Ando, Takamasa; Horisaki, Ryoichi; Tanida, Jun
2015-12-28
We experimentally demonstrated object recognition through scattering media based on direct machine learning of a number of speckle intensity images. In the experiments, speckle intensity images of amplitude or phase objects on a spatial light modulator between scattering plates were captured by a camera. We used the support vector machine for binary classification of the captured speckle intensity images of face and non-face data. The experimental results showed that speckles are sufficient for machine learning.
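The authors trained a support vector machine on captured speckle intensity images. As a library-free illustration of the same idea, here is a minimal Pegasos-style linear SVM in numpy; the hyperparameters and the synthetic two-class data in the usage are made up for the sketch and are not the paper's:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Minimal Pegasos-style linear SVM (hinge loss, L2 regularizer).

    X: (n, d) feature vectors, y: labels in {-1, +1}.
    Returns weight vector w (d,) and bias b.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b, t = np.zeros(d), 0.0, 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)          # decaying step size
            margin = y[i] * (X[i] @ w + b)
            w *= (1 - eta * lam)           # shrink (regularization)
            if margin < 1:                 # hinge-loss subgradient step
                w += eta * y[i] * X[i]
                b += eta * y[i]
    return w, b
```

In practice the feature vector would be the flattened (or downsampled) speckle intensity image.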
Kang, Yu; Wang, Wei; Zhao, Hang; Qiao, Zhiqing; Shen, Xuedong; He, Ben
2017-07-10
Despite the clear therapeutic benefits of anthracyclines, anthracycline-induced cardiotoxicity is a major concern limiting their ability to reduce morbidity and mortality associated with cancers. The early identification of anthracycline-induced cardiotoxicity is of vital importance to weigh the cardiac risk against the potential cancer treatment. The aim was to investigate whether speckle-tracking analysis can provide a sensitive and accurate measurement for detecting doxorubicin-induced left ventricular injury. Wistar rats were divided into 4 groups of 8 rats each, given doxorubicin intraperitoneally at weekly intervals for up to 4 weeks. Group 1: 2.5 mg/kg/week; group 2: 3 mg/kg/week; group 3: 3.5 mg/kg/week; group 4: 4 mg/kg/week. An additional 5 rats were used as controls. Echocardiographic images were obtained at baseline and 1 week after the last dose of treatment. Radial (Srad) and circumferential (Scirc) strains, and radial (SRrad) and circumferential (SRcirc) strain rates, were analyzed. After the experiment, cardiac troponin I (cTnI) was analyzed and the heart samples were histologically evaluated. After doxorubicin exposure, LVEF was significantly reduced in group 4 (p = 0.006), but remained stable in the other groups. However, after treatment, Srads were reduced in groups 2, 3 and 4.
Wholefield displacement measurements using speckle image processing techniques for crash tests
Sriram, P.; Hanagud, S.; Ranson, W. F.
The digital correlation scheme of Peters et al. (1983) was extended to measure out-of-plane deformations, using a white light projection speckle technique. A simple ray optic theory and the digital correlation scheme are outlined. The technique was applied successfully to measure out-of-plane displacements of initially flat rotorcraft structures (an acrylic circular plate and a steel cantilever beam), using a low cost video camera and a desktop computer. The technique can be extended to measurements of three-dimensional deformations and dynamic deformations.
Speckle interferometry of asteroids
International Nuclear Information System (INIS)
Drummond, J.
1988-01-01
By studying the two-dimensional power spectra or autocorrelations of the images projected by an asteroid as it rotates, speckle interferometry makes it possible to locate its rotational pole and derive the dimensions of its three axes, under the assumptions of uniform geometric scattering and a triaxial ellipsoid shape. In cases where images can be reconstructed, however, the need for making these assumptions is obviated. Furthermore, the ultimate goal of speckle interferometry, image reconstruction, will lead to mapping albedo features (if they exist) such as impact areas or geological units. The first glimpses of the surface of an asteroid were obtained from images of 4 Vesta reconstructed from speckle interferometric observations. These images reveal that Vesta is quite Moon-like in having large hemispheric-scale albedo features. All of its lightcurves can be reproduced by a simple model developed from the images. Although the surface is undoubtedly more intricate than the model, Vesta's lightcurves can be matched by a model with three dark and four bright spots. The dark areas so dominate one hemisphere that a lightcurve minimum occurs when the maximum cross-sectional area is visible. The triaxial ellipsoid shape derived for Vesta is not consistent with the notion that the asteroid has an equilibrium shape, in spite of its apparently having been differentiated.
Multiparticle imaging technique for two-phase fluid flows using pulsed laser speckle velocimetry
Energy Technology Data Exchange (ETDEWEB)
Hassan, T.A.
1992-12-01
The practical use of Pulsed Laser Velocimetry (PLV) requires fast, reliable computer-based methods for tracking numerous particles suspended in a fluid flow. Two methods for performing tracking are presented. One method tracks a particle through multiple sequential images (a minimum of four is required) by prediction and verification of particle displacement and direction. The other method, requiring only two sequential images, uses a dynamic, binary, spatial cross-correlation technique. The algorithms are tested on computer-generated synthetic data and on experimental data obtained with traditional PLV methods. This allowed error analysis and testing of the algorithms on real engineering flows. A novel method is proposed which eliminates tedious, undesirable manual operator assistance in removing erroneous vectors. This method uses an iterative process involving an interpolated field produced from the most reliable vectors. Methods are developed to allow fast analysis and presentation of sets of PLV image data. Experimental investigation of a two-phase, horizontal, stratified flow regime was performed to determine the interface drag force and, correspondingly, the drag coefficient. A horizontal, stratified flow test facility using water and air was constructed to allow interface shear measurements with PLV techniques. The experimentally obtained local drag measurements were compared with theoretical results given by conventional interfacial drag theory. Close agreement was shown when local conditions near the interface were similar to space-averaged conditions. However, theory based on macroscopic, space-averaged flow behavior was shown to give incorrect results if the local gas velocity near the interface was unstable, transient, and dissimilar from the average gas velocity through the test facility.
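The two-frame correlation idea above can be sketched with a standard FFT-based correlation peak search. This is a generic integer-pixel illustration of displacement estimation between two frames, not the author's dynamic binary algorithm:

```python
import numpy as np

def displacement_fft(frame1, frame2):
    """Integer-pixel displacement of frame2 relative to frame1,
    taken from the peak of their circular cross-correlation (via FFT)."""
    f1 = frame1 - frame1.mean()
    f2 = frame2 - frame2.mean()
    corr = np.fft.ifft2(np.fft.fft2(f1).conj() * np.fft.fft2(f2)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Fold wrap-around peaks back to signed shifts.
    if dy > frame1.shape[0] // 2:
        dy -= frame1.shape[0]
    if dx > frame1.shape[1] // 2:
        dx -= frame1.shape[1]
    return int(dy), int(dx)
```

Real PIV/PLV codes apply this per interrogation window and add sub-pixel peak interpolation and outlier rejection on top.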
Salem, Ran; Matityahu, Shlomi; Melchior, Aviva; Nikolaevsky, Mark; Noked, Ori; Sterer, Eran
2015-09-01
The precision of melting curve measurements using laser-heated diamond anvil cell (LHDAC) is largely limited by the correct and reliable determination of the onset of melting. We present a novel image analysis of speckle interference patterns in the LHDAC as a way to define quantitative measures which enable an objective determination of the melting transition. Combined with our low-temperature customized IR pyrometer, designed for measurements down to 500 K, our setup allows studying the melting curve of materials with low melting temperatures, with relatively high precision. As an application, the melting curve of Te was measured up to 35 GPa. The results are found to be in good agreement with previous data obtained at pressures up to 10 GPa.
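The record does not give the exact quantitative measures the authors define. One common proxy for speckle motion is the correlation between consecutive frames, which drops when the surface begins to move at the onset of melting; the sketch below uses that stand-in measure, not the paper's own:

```python
import numpy as np

def frame_correlation(frames):
    """Pearson correlation between consecutive frames of a speckle movie.

    A sustained drop in inter-frame correlation is one possible
    quantitative indicator of surface motion, e.g. at melting onset.
    """
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        a = a.ravel() - a.mean()
        b = b.ravel() - b.mean()
        out.append(float(a @ b / np.sqrt((a @ a) * (b @ b))))
    return out
```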
Zhu, Pei-hua; Huang, Jing-yuan; Ye, Meng; Zheng, Zhe-lan
2014-09-01
To evaluate left ventricular twist characteristics in patients with type 2 diabetes using two-dimensional speckle tracking imaging (STI), ninety-three patients with type 2 diabetes admitted to Zhejiang Hospital from May 2012 to September 2013 were enrolled. According to left ventricular ejection fraction (LVEF), patients were divided into two groups: a normal left ventricular systolic function group (group A, LVEF≥0.50, n=46) and an abnormal left ventricular systolic function group (group B, LVEF<0.50). A consistency check for STI was conducted to assess its stability and reliability. The Peaktw, AVCtw, and MVOtw in group A were significantly higher than those in normal controls. Reproducibility was good (95% consistency limits -2.8 to 2.7; within measurer: R=0.964, bias=-0.2, 95% consistency limits -2.7 to 2.2). STI can be used for early recognition of abnormal changes of cardiac function in type 2 diabetes mellitus patients, with high stability and reliability.
Xu, Yan; Palmaccio, Samantha Palmaccio; Bui, Duc; Dagum, Alexander; Rafailovich, Miriam
In clinical use since the early 1980s, the neuromuscular blocking agent botulinum toxin type A (BTX-A) has long been used to reduce wrinkles. Little research has been done to quantify the change in muscle contraction before and after injection, and most research papers depend on subjective evaluation by both patients and surgeons. In our research, Digital Image Speckle Correlation (DISC) was employed to study the mechanical properties of skin, the contraction mode of injected muscles, and the reaction of neighboring (un-injected) muscle groups. At the same time, displacement patterns (vector maps) generated by DISC can predict the injection locus for surgeons, who normally choose it by visual observation alone.
Applying laser speckle images to skin science: skin lesion differentiation by polarization
Lee, Tim K.; Tchvialeva, Lioudmila; Dhadwal, Gurbir; Sotoodian, Bahman; Kalai, Sunil; Zeng, Haishan; Lui, Harvey; McLean, David I.
2012-01-01
Skin cancer is a worldwide health problem. It is the most common cancer in countries with a large white population; furthermore, the incidence of malignant melanoma, the most dangerous form of skin cancer, has been increasing steadily over the last three decades. There is an urgent need to develop in-vivo, noninvasive diagnostic tools for the disease. This paper attempts to respond to the challenge by introducing a simple and fast method based on polarization and laser speckle. The degree of maintaining polarization estimates the fraction of the backscattered speckle field that maintains linear polarization. Clinical experiments on 214 skin lesions including malignant melanomas, squamous cell carcinomas, basal cell carcinomas, nevi, and seborrheic keratoses demonstrated that such a parameter can potentially diagnose different skin lesion types. ROC analyses showed that malignant melanoma and seborrheic keratosis could be differentiated by both the blue and red lasers with the area under the curve (AUC) = 0.8 and 0.7, respectively. Also, malignant melanoma and squamous cell carcinoma could be separated by the blue laser (AUC = 0.9), while nevus and seborrheic keratosis could be identified using the red laser (AUC = 0.7). These experiments demonstrated that polarization could be a potential in-vivo diagnostic indicator for skin diseases.
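The paper's polarization parameter is only described qualitatively here. A standard related quantity is the degree of linear polarization computed from co- and cross-polarized intensity images, sketched below as an illustrative stand-in rather than the authors' exact metric:

```python
import numpy as np

def degree_of_polarization(i_par, i_perp):
    """Degree of linear polarization from co-polarized (i_par) and
    cross-polarized (i_perp) speckle intensity images:
    DOLP = (I_par - I_perp) / (I_par + I_perp), per pixel."""
    i_par = np.asarray(i_par, dtype=float)
    i_perp = np.asarray(i_perp, dtype=float)
    total = i_par + i_perp
    return np.where(total > 0,
                    (i_par - i_perp) / np.maximum(total, 1e-12), 0.0)
```

A lesion-level score could then be, for example, the mean DOLP over the lesion area.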
Evaluation of segmentation algorithms for optical coherence tomography images of ovarian tissue
Sawyer, Travis W.; Rice, Photini F. S.; Sawyer, David M.; Koevary, Jennifer W.; Barton, Jennifer K.
2018-02-01
Ovarian cancer has the lowest survival rate among all gynecologic cancers due to predominantly late diagnosis. Early detection of ovarian cancer can increase 5-year survival rates from 40% up to 92%, yet no reliable early detection techniques exist. Optical coherence tomography (OCT) is an emerging technique that provides depth-resolved, high-resolution images of biological tissue in real time and demonstrates great potential for imaging of ovarian tissue. Mouse models are crucial to quantitatively assess the diagnostic potential of OCT for ovarian cancer imaging; however, due to small organ size, the ovaries must first be separated from the image background using the process of segmentation. Manual segmentation is time-intensive, as OCT yields three-dimensional data. Furthermore, speckle noise complicates OCT images, frustrating many processing techniques. While much work has investigated noise reduction and automated segmentation for retinal OCT imaging, little has considered the application to the ovaries, which exhibit higher variance and inhomogeneity than the retina. To address these challenges, we evaluated a set of algorithms to segment OCT images of mouse ovaries. We examined five pre-processing techniques and six segmentation algorithms. While all pre-processing methods improve segmentation, Gaussian filtering is most effective, showing an improvement of 32% +/- 1.2%. Of the segmentation algorithms, active contours performs best, segmenting with an accuracy of 0.948 +/- 0.012 compared with manual segmentation (1.0 being identical). Nonetheless, further optimization could lead to maximizing the performance for segmenting OCT images of the ovaries.
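Accuracy against manual masks with 1.0 meaning identical is typically a Dice-style overlap score; the record does not name the exact formula, so the version below is an assumed but standard choice:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks.

    2|A ∩ B| / (|A| + |B|); 1.0 means the masks are identical.
    """
    a = np.asarray(mask_a, bool)
    b = np.asarray(mask_b, bool)
    inter = np.logical_and(a, b).sum()
    denom = a.sum() + b.sum()
    return 2.0 * inter / denom if denom else 1.0
```

Pre-processing such as Gaussian filtering is applied to the intensity image before segmentation; the score then compares the automated mask with the manual one.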
Yamauchi, Toyohiko; Yamada, Hidenao; Matsui, Hisayuki; Yasuhiko, Osamu; Ueda, Yukio
2018-02-01
We developed a compact Mach-Zehnder interferometer module to be used as a replacement for the objective lens in a conventional inverted microscope (Nikon, TS100-F), in order to turn it into a quantitative phase microscope. The module has a 90-degree-flipped U-shape; its dimensions are 160 mm by 120 mm by 40 mm and its weight is 380 grams. The Mach-Zehnder interferometer, equipped with separate reference and sample arms, was implemented in this U-shaped housing, and the path-length difference between the two arms was manually adjustable. The sample under test was put on the stage of the microscope and the sample beam passed through it. Both arms had identical achromatic lenses for image formation, and their lateral positions were also manually adjustable. Therefore, temporally and spatially low-coherence illumination was applicable, because the users were able to precisely balance the path lengths of the two arms and to overlap the two wavefronts. In the experiment, spectrally filtered LED light for illumination (wavelength = 633 nm and bandwidth = 3 nm) was input to the interferometer module via a 50 micrometer core optical fiber. We successfully captured full-field interference images with a camera mounted on the trinocular tube of the microscope and constructed quantitative phase images of cultured cells by means of the quarter-wavelength phase-shifting algorithm. The resultant quantitative phase images were speckle-free and halo-free due to the spectrally and spatially low-coherence illumination.
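The quarter-wavelength (four-step) phase-shifting algorithm mentioned at the end has a compact closed form: with interferograms recorded at phase offsets 0, π/2, π and 3π/2, the phase is φ = atan2(I4 − I2, I1 − I3). A small numpy sketch:

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Four-step phase shifting.

    For I_k = I0 + A*cos(phi + k*pi/2), k = 0..3:
    I4 - I2 = 2A*sin(phi) and I1 - I3 = 2A*cos(phi),
    so phi = atan2(I4 - I2, I1 - I3), wrapped to (-pi, pi].
    """
    return np.arctan2(i4 - i2, i1 - i3)
```

The result is a wrapped phase map; phase unwrapping is a separate step when optical path differences exceed one wavelength.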
Algorithms for image processing and computer vision
Parker, J R
2010-01-01
A cookbook of algorithms for common image processing applications. Thanks to advances in computer hardware and software, algorithms have been developed that support sophisticated image processing without requiring an extensive background in mathematics. This bestselling book has been fully updated with the newest of these, including 2D vision methods in content-based searches and the use of graphics cards as image processing computational aids. It's an ideal reference for software engineers and developers, advanced programmers, graphics programmers, scientists, and other specialists.
Parallel asynchronous systems and image processing algorithms
Coon, D. D.; Perera, A. G. U.
1989-01-01
A new hardware approach to implementation of image processing algorithms is described. The approach is based on silicon devices which would permit an independent analog processing channel to be dedicated to every pixel. A laminar architecture consisting of a stack of planar arrays of the device would form a two-dimensional array processor with a 2-D array of inputs located directly behind a focal plane detector array. A 2-D image data stream would propagate in neuronlike asynchronous pulse coded form through the laminar processor. Such systems would integrate image acquisition and image processing. Acquisition and processing would be performed concurrently as in natural vision systems. The research is aimed at implementation of algorithms, such as the intensity dependent summation algorithm and pyramid processing structures, which are motivated by the operation of natural vision systems. Implementation of natural vision algorithms would benefit from the use of neuronlike information coding and the laminar, 2-D parallel, vision system type architecture. Besides providing a neural network framework for implementation of natural vision algorithms, a 2-D parallel approach could eliminate the serial bottleneck of conventional processing systems. Conversion to serial format would occur only after raw intensity data has been substantially processed. An interesting challenge arises from the fact that the mathematical formulation of natural vision algorithms does not specify the means of implementation, so that hardware implementation poses intriguing questions involving vision science.
International Nuclear Information System (INIS)
Prentice, H. J.; Proud, W. G.
2006-01-01
A technique has been developed to determine experimentally the three-dimensional displacement field on the rear surface of a dynamically deforming plate. The technique combines speckle analysis with stereoscopy, using a modified angular-lens method: this incorporates split-frame photography and a simple method by which the effective lens separation can be adjusted and calibrated in situ. Whilst several analytical models exist to predict deformation in extended or semi-infinite targets, the non-trivial nature of the wave interactions complicates the generation and development of analytical models for targets of finite depth. By interrogating specimens experimentally to acquire three-dimensional strain data points, both analytical and numerical model predictions can be verified more rigorously. The technique is applied to the quasi-static deformation of a rubber sheet and dynamically to mild steel sheets of various thicknesses.
Color speckle in laser displays
Kuroda, Kazuo
2015-07-01
At the beginning of this century, lighting technology shifted from discharge lamps, fluorescent lamps and electric bulbs to solid-state lighting. Current solid-state lighting is based on light emitting diode (LED) technology, but laser lighting technology is developing rapidly, in forms such as laser cinema projectors, laser TVs, laser head-up displays, laser head mounted displays, and laser headlamps for motor vehicles. One of the main issues of laser displays is the reduction of speckle noise. For monochromatic laser light, speckle is a random interference pattern on the image plane (the retina, for a human observer). For laser displays, RGB (red-green-blue) lasers form speckle patterns independently, which results in a random distribution of chromaticity, called color speckle.
Energy Technology Data Exchange (ETDEWEB)
Pretto, Lucas Ramos de
2015-07-01
This work discusses the optical coherence tomography (OCT) system and its application to microfluidics. To this end, physical characterization of microfluidic circuits was performed using 3D (three-dimensional) models constructed from OCT images of such circuits. The technique was thus evaluated as a potential tool to aid in the inspection of microchannels. Going further, this work studies and develops analytical techniques for microfluidic flow, in particular techniques based on speckle patterns. In the first instance, existing methods were studied and improved, such as Speckle Variance OCT, where a gain of 31% was obtained in processing time. Other methods, such as LASCA (Laser Speckle Contrast Analysis), based on speckle autocorrelation, were adapted to OCT images. Derived from LASCA, the developed analysis technique based on intensity autocorrelation motivated the development of a custom OCT system as well as optimized acquisition software with a sampling rate of 8 kHz. The proposed method was then able to distinguish different flow rates, and limits of detection were tested, proving its feasibility for Brownian motion analysis and flow rates below 10 μl/min. (author)
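Speckle variance OCT, named above, contrasts flow by the per-pixel variance across N repeated B-scans: moving scatterers decorrelate between frames and show high variance, while static tissue does not. A minimal sketch of that core statistic (no claim to match the author's optimized implementation):

```python
import numpy as np

def speckle_variance(bscans):
    """Inter-frame speckle variance for OCT flow contrast.

    bscans: array-like of shape (N, H, W) holding N repeated B-scans.
    Returns the per-pixel variance across the N frames; flow regions
    decorrelate between frames and therefore show high variance.
    """
    stack = np.asarray(bscans, dtype=float)
    return stack.var(axis=0)
```

A flow mask is then typically obtained by thresholding the variance map.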
Speckle perception and disturbance limit in laser based projectors
Verschaffelt, Guy; Roelandt, Stijn; Meuret, Youri; Van den Broeck, Wendy; Kilpi, Katriina; Lievens, Bram; Jacobs, An; Janssens, Peter; Thienpont, Hugo
2016-04-01
We investigate the level of speckle that can be tolerated in a laser cinema projector. For this purpose, we equipped a movie theatre room with a prototype laser projector. A group of 186 participants was gathered to evaluate the speckle perception of several, short movie trailers in a subjective `Quality of Experience' experiment. This study is important as the introduction of lasers in projection systems has been hampered by the presence of speckle in projected images. We identify a speckle disturbance threshold by statistically analyzing the observers' responses for different values of the amount of speckle, which was monitored using a well-defined speckle measurement method. The analysis shows that the speckle perception of a human observer is not only dependent on the objectively measured amount of speckle, but it is also strongly influenced by the image content. As is also discussed in [Verschaffelt et al., Scientific Reports 5, art. nr. 14105, 2015] we find that, for moving images, the speckle becomes disturbing if the speckle contrast becomes larger than 6.9% for the red, 6.0% for the green, and 4.8% for the blue primary colors of the projector, whereas for still images the speckle detection threshold is about 3%. As we could not independently tune the speckle contrast of each of the primary colors, this speckle disturbance limit seems to be determined by the 6.9% speckle contrast of the red color as this primary color contains the largest amount of speckle. The speckle disturbance limit for movies thus turns out to be substantially larger than that for still images, and hence is easier to attain.
Medical image segmentation using genetic algorithms.
Maulik, Ujjwal
2009-03-01
Genetic algorithms (GAs) have been found to be effective in the domain of medical image segmentation, since the problem can often be mapped to one of search in a complex and multimodal landscape. The challenges in medical image segmentation arise due to poor image contrast and artifacts that result in missing or diffuse organ/tissue boundaries. The resulting search space is therefore often noisy with a multitude of local optima. Not only does the genetic algorithmic framework prove to be effective in coming out of local optima, it also brings considerable flexibility into the segmentation procedure. In this paper, an attempt has been made to review the major applications of GAs to the domain of medical image segmentation.
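As a toy illustration of the GA framework described, the sketch below evolves a single intensity threshold scored by Otsu's between-class variance. Real medical-segmentation GAs encode far richer chromosomes, so every detail here (fitness, operators, parameters) is an illustrative assumption:

```python
import numpy as np

def ga_threshold(pixels, pop_size=20, n_gen=40, seed=0):
    """Toy genetic algorithm evolving one intensity threshold.

    Fitness is Otsu's between-class variance, w1*w2*(mu1 - mu2)^2,
    which peaks when the threshold separates the intensity modes.
    """
    rng = np.random.default_rng(seed)
    lo, hi = float(pixels.min()), float(pixels.max())

    def fitness(t):
        fg, bg = pixels[pixels >= t], pixels[pixels < t]
        if fg.size == 0 or bg.size == 0:
            return 0.0
        w1, w2 = fg.size / pixels.size, bg.size / pixels.size
        return w1 * w2 * (fg.mean() - bg.mean()) ** 2

    pop = rng.uniform(lo, hi, pop_size)
    for _ in range(n_gen):
        fit = np.array([fitness(t) for t in pop])
        # Keep the fitter half, then refill with mutated copies.
        parents = pop[np.argsort(fit)][-pop_size // 2:]
        n_child = pop_size - parents.size
        children = rng.choice(parents, n_child) \
            + rng.normal(0, 0.05 * (hi - lo), n_child)
        pop = np.concatenate([parents, np.clip(children, lo, hi)])
    fit = np.array([fitness(t) for t in pop])
    return float(pop[np.argmax(fit)])
```

The flexibility the review emphasizes comes from swapping in richer fitness functions (e.g. region homogeneity plus boundary terms) without changing the evolutionary loop.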
On combining algorithms for deformable image registration
Muenzing, S.E.A.; Ginneken, van B.; Pluim, J.P.W.; Dawant, B.M.
2012-01-01
We propose a meta-algorithm for registration improvement by combining deformable image registrations (MetaReg). It is inspired by a well-established method from machine learning, the combination of classifiers. MetaReg consists of two main components: (1) A strategy for composing an improved
He, Zhijie; Lu, Hongyang; Yang, Xiaojiao; Zhang, Li; Wu, Yi; Niu, Wenxiu; Ding, Li; Wang, Guili; Tong, Shanbao; Jia, Jie
2018-01-01
Exercise preconditioning induces neuroprotective effects during cerebral ischemia and reperfusion, which involves the recovery of cerebral blood flow (CBF). Mechanisms underlying the neuroprotective effects of re-established CBF following ischemia and reperfusion are unclear. The present study investigated CBF in hyper-early stage of reperfusion by laser speckle contrast imaging, a full-field high-resolution optical imaging technique. Rats with or without treadmill training were subjected to middle cerebral artery occlusion followed by reperfusion. CBF in arteries, veins, and capillaries in hyper-early stage of reperfusion (1, 2, and 3 h after reperfusion) and in subacute stage (24 h after reperfusion) were measured. Neurological scoring and 2,3,5-triphenyltetrazolium chloride staining were further applied to determine the neuroprotective effects of exercise preconditioning. In hyper-early stage of reperfusion, CBF in the rats with exercise preconditioning was reduced significantly in arteries and veins, respectively, compared to rats with no exercise preconditioning. Capillary CBF remained stable in the hyper-early stage of reperfusion, though it increased significantly 24 h after reperfusion in the rats with exercise preconditioning. As a neuroprotective strategy, exercise preconditioning reduced the blood perfusion of arteries and veins in the hyper-early stage of reperfusion, which indicated intervention-induced neuroprotective hypoperfusion after reperfusion onset.
Bhatnagar, Divya; Conkling, Nicole; Rafailovich, Miriam; Dagum, Alexander
2012-02-01
The skin on the face is directly attached to the underlying muscles. Here, we successfully introduce a non-invasive, non-contact technique, Digital Image Speckle Correlation (DISC), to measure the precise magnitude and duration of facial muscle paralysis inflicted by BTX-A. Subjective evaluation by clinicians and patients fails to objectively quantify the direct effect and duration of BTX-A on the facial musculature. By using DISC, we can (a) directly measure the deformation field of the facial skin and determine the locus of facial muscular tension; (b) quantify and monitor muscular paralysis and subsequent re-innervation following injection; and (c) continuously correlate the appearance of wrinkles and muscular tension. Two sequential photographs of slight facial motion (frowning, raising eyebrows) are taken. DISC processes the images to produce a vector map of muscular displacement from which spatially resolved information is obtained regarding facial tension. DISC can track the ability of different muscle groups to contract and can be used to predict the site of injection, quantify muscle paralysis, and measure the rate of recovery following BOTOX injection.
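The vector maps DISC produces come from block-wise correlation between the two photographs: each small patch of the first image is located in the second by maximizing normalized cross-correlation. A minimal integer-pixel version for one block (window size and search range are arbitrary illustrative choices):

```python
import numpy as np

def disc_vector(ref, cur, y, x, block=8, search=5):
    """Displacement of one block between two images by exhaustive
    normalized cross-correlation search, the core of digital image
    speckle correlation in a minimal integer-pixel form."""
    tpl = ref[y:y + block, x:x + block].astype(float)
    tpl = tpl - tpl.mean()
    best, best_dv = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            if y + dy < 0 or x + dx < 0:
                continue  # avoid wrap-around via negative slicing
            win = cur[y + dy:y + dy + block,
                      x + dx:x + dx + block].astype(float)
            if win.shape != tpl.shape:
                continue  # window fell off the image edge
            win = win - win.mean()
            denom = np.sqrt((tpl ** 2).sum() * (win ** 2).sum())
            if denom == 0:
                continue
            score = float((tpl * win).sum() / denom)
            if score > best:
                best, best_dv = score, (dy, dx)
    return best_dv
```

Repeating this on a grid of blocks yields the displacement vector map; production DIC adds sub-pixel refinement.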
Algorithms of image processing in nuclear medicine
International Nuclear Information System (INIS)
Oliveira, V.A.
1990-01-01
The problem of image restoration from noisy measurements as encountered in Nuclear Medicine is considered. A new approach for treating the measurements wherein they are represented by a spatial noncausal interaction model prior to maximum entropy restoration is given. This model describes the statistical dependence among the image values and their neighbourhood. The particular application of the algorithms presented here relates to gamma ray imaging systems, and is aimed at improving the resolution-noise suppression product. Results for actual gamma camera data are presented and compared with more conventional techniques. (author)
LSB Based Quantum Image Steganography Algorithm
Jiang, Nan; Zhao, Na; Wang, Luo
2016-01-01
Quantum steganography is the technique which hides a secret message into quantum covers such as quantum images. In this paper, two blind LSB steganography algorithms in the form of quantum circuits are proposed based on the novel enhanced quantum representation (NEQR) for quantum images. One algorithm is plain LSB which uses the message bits to substitute for the pixels' LSB directly. The other is block LSB which embeds a message bit into a number of pixels that belong to one image block. The extracting circuits can regain the secret message only according to the stego cover. Analysis and simulation-based experimental results demonstrate that the invisibility is good, and the balance between the capacity and the robustness can be adjusted according to the needs of applications.
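The paper's algorithms are quantum circuits over NEQR images and cannot be run directly here, but the classical plain-LSB operation they generalize can. A sketch of classical LSB embedding and blind extraction, offered only as the classical analogue of the paper's quantum construction:

```python
import numpy as np

def lsb_embed(pixels, bits):
    """Plain LSB embedding: overwrite the least significant bit of the
    first len(bits) pixels with the message bits."""
    out = pixels.astype(np.uint8).copy().ravel()
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | (bit & 1)
    return out.reshape(pixels.shape)

def lsb_extract(pixels, n_bits):
    """Blind extraction: read back the LSBs of the first n_bits pixels,
    using only the stego cover."""
    return [int(v & 1) for v in pixels.ravel()[:n_bits]]
```

Each embedded bit changes a pixel value by at most 1, which is why LSB embedding is visually invisible.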
An enhanced fractal image denoising algorithm
International Nuclear Information System (INIS)
Lu Jian; Ye Zhongxing; Zou Yuru; Ye Ruisong
2008-01-01
In recent years, there has been significant development in image denoising using fractal-based methods. This paper presents an enhanced fractal predictive denoising algorithm for denoising images corrupted by additive white Gaussian noise (AWGN) by using a quadratic gray-level function. Meanwhile, a quantization method for the fractal gray-level coefficients of the quadratic function is proposed to strictly guarantee the contractivity requirement of the enhanced fractal coding; in terms of the quality of the fractal representation measured by PSNR, the enhanced fractal image coding using a quadratic gray-level function generally performs better than the standard fractal coding using a linear gray-level function. Based on this enhanced fractal coding, the enhanced fractal image denoising is implemented by estimating the fractal gray-level coefficients of the quadratic function of the noiseless image from its noisy observation. Experimental results show that, compared with other standard fractal-based image denoising schemes using a linear gray-level function, the enhanced fractal denoising algorithm can improve the quality of the restored image efficiently.
Image quality evaluation of full reference algorithm
He, Nannan; Xie, Kai; Li, Tong; Ye, Yushan
2018-03-01
Image quality evaluation is a classic research topic; the goal is to design algorithms whose evaluation values are consistent with subjective human judgment. This paper mainly introduces several typical full-reference objective evaluation methods: Mean Squared Error (MSE), Peak Signal-to-Noise Ratio (PSNR), Structural Similarity Image Metric (SSIM) and Feature Similarity (FSIM). The different evaluation methods are tested in Matlab, and their advantages and disadvantages are obtained through analysis and comparison. MSE and PSNR are simple, but they do not incorporate characteristics of the human visual system (HVS) into the evaluation, so their results are not ideal. SSIM correlates well with perception and is simple to compute, because it brings the human visual response into the evaluation; however, the SSIM method rests on a hypothesis, so its results are limited. The FSIM method can be used to test both grayscale and color images, with better results. Experimental results show that the image quality evaluation algorithm based on FSIM is more accurate.
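The two simplest of these metrics can be written down directly; a short sketch for 8-bit images stored as flat lists (the abstract's implementations are in Matlab, this is an equivalent Python rendering):

```python
import math

def mse(a, b):
    """Mean squared error between two equal-sized images (flat pixel lists)."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, peak=255):
    """Peak signal-to-noise ratio in dB; identical images give infinity."""
    e = mse(a, b)
    return float("inf") if e == 0 else 10 * math.log10(peak ** 2 / e)
```

SSIM and FSIM are deliberately omitted here: both need windowed statistics over the whole image and, as the abstract notes, they exist precisely because MSE/PSNR ignore the human visual system.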
Measurement of deformation field in CT specimen using laser speckle
International Nuclear Information System (INIS)
Jeon, Moon Chang; Kang, Ki Ju
2001-01-01
To obtain A2 experimentally in the J-A2 theory, the deformation field on the lateral surface of a CT specimen was determined using the laser speckle method. The crack growth was measured using the direct current potential drop method, and most of the experimental procedure and data reduction was performed according to ASTM Standard E1737-96. Laser speckle images during crack propagation were monitored by two CCD cameras to cancel the effect of rotation and translation of the specimen. An algorithm to track the displacement of a point from each image was developed and successfully used to measure A2 continuously as the crack tip propagated. The effects of specimen thickness on the J-R curve and A2 were explored.
Digital image processing an algorithmic approach with Matlab
Qidwai, Uvais
2009-01-01
Introduction to Image Processing and the MATLAB Environment: Introduction; Digital Image Definitions: Theoretical Account; Image Properties; MATLAB Algorithmic Account; MATLAB Code. Image Acquisition, Types, and File I/O: Image Acquisition; Image Types and File I/O; Basics of Color Images; Other Color Spaces; Algorithmic Account; MATLAB Code. Image Arithmetic: Introduction; Operator Basics; Theoretical Treatment; Algorithmic Treatment; Coding Examples. Affine and Logical Operations, Distortions, and Noise in Images: Introduction; Affine Operations; Logical Operators; Noise in Images; Distortions in Images; Algorithmic Account
Directory of Open Access Journals (Sweden)
Guohua Zou
2016-12-01
Full Text Available New medical imaging technologies, such as Computed Tomography (CT) and Magnetic Resonance Imaging (MRI), have been widely used in all aspects of medical diagnosis. The purpose of these imaging techniques is to obtain comprehensive and accurate qualitative and quantitative data about the patient, and to provide correct digital information for diagnosis, treatment planning and postoperative evaluation. MRI has good diagnostic advantages for brain diseases. However, as the requirements for brain image definition and quantitative analysis keep increasing, better segmentation of MR brain images is necessary. The FCM (Fuzzy C-means) algorithm is widely applied in image segmentation, but it has some shortcomings, such as long computation time and poor anti-noise capability. In this paper, the Ant Colony algorithm is first used to determine the cluster centers and the number of clusters for the FCM algorithm, so as to improve its running speed. Then an improved Markov random field model is used to refine the algorithm, improving its anti-noise ability. Experimental results show that the algorithm put forward in this paper has obvious advantages in image segmentation speed and segmentation effect.
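The FCM iteration that the ant-colony stage accelerates is the standard membership/centroid alternation; a minimal sketch on scalar data, with fixed initial centers standing in for the paper's ant-colony seeding (and no Markov-random-field refinement):

```python
def fcm(data, centers, m=2.0, iters=20):
    """Fuzzy C-means: alternate membership and centroid updates on 1D data."""
    u = []
    for _ in range(iters):
        # u_ic = 1 / sum_k (d_ic / d_kc)^(2/(m-1)): soft assignment of point to cluster
        u = []
        for x in data:
            d = [abs(x - c) + 1e-12 for c in centers]  # guard against zero distance
            u.append([1.0 / sum((d[i] / d[k]) ** (2.0 / (m - 1.0))
                                for k in range(len(centers)))
                      for i in range(len(centers))])
        # centers become u^m-weighted means of the data
        centers = [sum((u[j][i] ** m) * data[j] for j in range(len(data)))
                   / sum(u[j][i] ** m for j in range(len(data)))
                   for i in range(len(centers))]
    return centers, u
```

The fuzzifier m controls how soft the memberships are; m = 2 is the common default.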
A hash-based image encryption algorithm
Cheddad, Abbas; Condell, Joan; Curran, Kevin; McKevitt, Paul
2010-03-01
There exist several algorithms that deal with text encryption. However, there has been little research carried out to date on encrypting digital images or video files. This paper describes a novel way of encrypting digital images with password protection using the 1D SHA-2 algorithm coupled with a compound forward transform. A spatial mask is generated from the frequency domain by taking advantage of the conjugate symmetry of the complex imaginary part of the Fourier Transform. This mask is then XORed with the bit stream of the original image. Exclusive OR (XOR) is a logical symmetric operation that yields 0 if both binary pixels are zeros or if both are ones, and 1 otherwise; this can be verified simply as (pixel1 + pixel2) mod 2. Finally, confusion is applied based on the displacement of the cipher's pixels in accordance with a reference mask. Both security and performance aspects of the proposed method are analyzed, which prove that the method is efficient and secure from a cryptographic point of view. One of the merits of such an algorithm is to force a continuous-tone payload, a steganographic term, to map onto a balanced bit distribution sequence. This bit balance is needed in certain applications, such as steganography and watermarking, since it is likely to have a balanced perceptibility effect on the cover image when embedding.
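The arithmetic identity mentioned in the abstract, and the self-inverse property that lets the decoder strip the mask off again, are both easy to verify:

```python
def xor_bit(a, b):
    """XOR of two bits written arithmetically: 0 if the bits agree, 1 otherwise."""
    return (a + b) % 2

# Masking with XOR is self-inverse: (p ^ m) ^ m == p, which is exactly what
# makes the spatial mask removable at decryption.
image = [0b10110010, 0b01100111, 0b11110000]
mask = [0b11001100, 0b10101010, 0b00001111]
cipher = [p ^ m for p, m in zip(image, mask)]
restored = [c ^ m for c, m in zip(cipher, mask)]
```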
Directory of Open Access Journals (Sweden)
Xiao-jing Song
2014-01-01
Full Text Available The study was conducted to observe the effect of electroacupuncture (EA) on hepatic blood perfusion (HBP) and vascular regulation. We investigated 60 male anesthetized mice under the following 3 conditions: without EA stimulation (control group); EA stimulation at Zusanli (ST36 group); and EA stimulation at a nonacupoint (NA group) during 30 min. The HBP was measured using laser speckle perfusion imaging (LSPI). The levels of nitric oxide (NO), endothelin-1 (ET-1), and noradrenaline (NE) in liver tissue were detected by biochemical methods. Results were as follows. At each time point, the HBP increase in the ST36 group was higher than that in the NA group in anesthetized mice. HBP gradually decreased during the 30 min in the control group. The level of NO in the ST36 group was higher than that in the NA group. The levels of both ET-1 and NE were highest in the control group, followed by the NA group and the ST36 group. It is concluded that EA at ST36 could increase HBP, possibly by increasing the blood flow velocity (BFV), changing vascular activity, increasing the level of NO, and inhibiting the level of ET-1 in liver tissue.
A parallelizable real-time motion tracking algorithm with applications to ultrasonic strain imaging
International Nuclear Information System (INIS)
Jiang, J; Hall, T J
2007-01-01
Ultrasound-based mechanical strain imaging systems utilize signals from conventional diagnostic ultrasound systems to image tissue elasticity contrast that provides new diagnostically valuable information. Previous works (Hall et al 2003 Ultrasound Med. Biol. 29 427, Zhu and Hall 2002 Ultrason. Imaging 24 161) demonstrated that uniaxial deformation with minimal elevation motion is preferred for breast strain imaging and real-time strain image feedback to operators is important to accomplish this goal. The work reported here enhances the real-time speckle tracking algorithm with two significant modifications. One fundamental change is that the proposed algorithm is a column-based algorithm (a column is defined by a line of data parallel to the ultrasound beam direction, i.e. an A-line), as opposed to a row-based algorithm (a row is defined by a line of data perpendicular to the ultrasound beam direction). Then, displacement estimates from its adjacent columns provide good guidance for motion tracking in a significantly reduced search region to reduce computational cost. Consequently, the process of displacement estimation can be naturally split into at least two separated tasks, computed in parallel, propagating outward from the center of the region of interest (ROI). The proposed algorithm has been implemented and optimized in a Windows® system as a stand-alone ANSI C++ program. Results of preliminary tests, using numerical and tissue-mimicking phantoms, and in vivo tissue data, suggest that high contrast strain images can be consistently obtained with frame rates (10 frames s⁻¹) that exceed our previous methods.
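The guided-search idea is easy to illustrate in one dimension: a neighbouring column's displacement estimate seeds a small search window, so only a few candidate lags need to be correlated. A sketch using normalized cross-correlation on RF-like sample lists (function names and the similarity measure are illustrative, not the authors' C++ implementation):

```python
def ncc(a, b):
    """Normalized cross-correlation of two equal-length sample windows."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

def track(pre, post, start, win, guess, search=2):
    """Find the displacement of pre[start:start+win] inside `post`, searching
    only `search` lags around the neighbouring column's estimate `guess`."""
    ref = pre[start:start + win]
    best, best_d = -2.0, guess
    for d in range(guess - search, guess + search + 1):
        cand = post[start + d:start + d + win]
        if len(cand) < win:
            continue
        c = ncc(ref, cand)
        if c > best:
            best, best_d = c, d
    return best_d
```

Shrinking the search region this way is what makes the per-column estimates cheap enough to split across parallel tasks.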
Billings, Jake
2017-01-01
A new variation of blockchain proof of work algorithm is proposed to incentivize the timely execution of image processing algorithms. A sample image processing algorithm is proposed to determine interesting images using analysis of the entropy of pixel subsets within images. The efficacy of the image processing algorithm is examined using two small sets of training and test data. The interesting image algorithm is then integrated into a simplified blockchain mining proof of work algorithm bas...
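The entropy analysis of pixel subsets can be sketched as the Shannon entropy of a patch's intensity histogram; the threshold that makes a patch "interesting" is an assumption here, not the paper's trained value:

```python
import math
from collections import Counter

def patch_entropy(patch):
    """Shannon entropy (bits) of the intensity histogram of a pixel patch."""
    counts = Counter(patch)
    n = len(patch)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def is_interesting(patch, threshold=1.5):
    """Hypothetical selection rule: flat patches score 0, varied patches score high."""
    return patch_entropy(patch) >= threshold
```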
A Multiresolution Image Completion Algorithm for Compressing Digital Color Images
Directory of Open Access Journals (Sweden)
R. Gomathi
2014-01-01
Full Text Available This paper introduces a new framework for image coding that uses an image inpainting method. In the proposed algorithm, the input image is subjected to image analysis to remove some of the portions purposefully. At the same time, edges are extracted from the input image and are passed to the decoder in compressed form. The edges transmitted to the decoder act as assistant information and help the inpainting process fill in the missing regions at the decoder. Textural synthesis and a new shearlet inpainting scheme based on the theory of the p-Laplacian operator are proposed for image restoration at the decoder. Shearlets have been mathematically proven to represent distributed discontinuities such as edges better than traditional wavelets and are a suitable tool for edge characterization. This novel shearlet p-Laplacian inpainting model can effectively reduce the staircase effect of the Total Variation (TV) inpainting model while still keeping edges as well as the TV model does. In the proposed scheme, a neural network is employed to enhance the compression ratio for image coding. Test results are compared with JPEG 2000 and H.264 intra-coding algorithms. The results show that the proposed algorithm works well.
Ponticorvo, A.; Rowland, R.; Yang, B.; Lertsakdadet, B.; Crouzet, C.; Bernal, N.; Choi, B.; Durkin, A. J.
2017-02-01
Burn wounds are often characterized by injury depth, which then dictates wound management strategy. While most superficial burns and full thickness burns can be diagnosed through visual inspection, clinicians experience difficulty with accurate diagnosis of burns that fall between these extremes. Accurately diagnosing burn severity in a timely manner is critical for starting the appropriate treatment plan at the earliest time points to improve patient outcomes. To address this challenge, research groups have studied the use of commercial laser Doppler imaging (LDI) systems to provide objective characterization of burn-wound severity. Despite initial promising findings, LDI systems are not commonplace in part due to long acquisition times that can suffer from artifacts in moving patients. Commercial LDI systems are being phased out in favor of laser speckle imaging (LSI) systems that can provide similar information with faster acquisition speeds. To better understand the accuracy and usefulness of commercial LSI systems in burn-oriented research, we studied the performance of a commercial LSI system in three different sample systems and compared its results to a research-grade LSI system in the same environments. The first sample system involved laboratory measurements of intralipid (1%) flowing through a tissue simulating phantom, the second preclinical measurements in a controlled burn study in which wounds of graded severity were created on a Yorkshire pig, and the third clinical measurements involving a small sample of clinical patients. In addition to the commercial LSI system, a research grade LSI system that was designed and fabricated in our labs was used to quantitatively compare the performance of both systems and also to better understand the "Perfusion Unit" output of commercial systems.
Zhou, Yingchao; Xiao, Hong; Wu, Jianfei; Zha, Lingfeng; Zhou, Mengchen; Li, Qianqian; Wang, Mengru; Shi, Shumei; Li, Yanze; Lyu, Liangkun; Wang, Qing; Tu, Xin; Lu, Qiulun
2018-01-01
Diabetes mellitus (DM) has been demonstrated to have a strong association with heart failure. Conventional echocardiographic analysis cannot sensitively monitor cardiac dysfunction in type I diabetic Akita hearts, but the phenotype of heart failure is observed in molecular levels during the early stages. Male Akita (Ins2WT/C96Y) mice were monitored with echocardiographic imaging at various ages, and then with conventional echocardiographic analysis and speckle-tracking based strain analyses. With speckle-tracking based strain analyses, diabetic Akita mice showed changes in average global radial strain at the age of 12 weeks, as well as decreased longitudinal strain. These changes occurred in the early stage and remained throughout the progression of diabetic cardiomyopathy in Akita mice. Speckle-tracking showed that the detailed and precise changes of cardiac deformation in the progression of diabetic cardiomyopathy in the genetic type I diabetic Akita mice were uncoupled. We monitored early-stage changes in the heart of diabetic Akita mice. We utilize this technique to elucidate the underlying mechanism for heart failure in Akita genetic type I diabetic mice. It will further advance the assessment of cardiac abnormalities, as well as the discovery of new drug treatments using Akita genetic type I diabetic mice. © 2018 The Author(s). Published by S. Karger AG, Basel.
Speckle disturbance limit in laser-based cinema projection systems
Verschaffelt, Guy; Roelandt, Stijn; Meuret, Youri; van den Broeck, Wendy; Kilpi, Katriina; Lievens, Bram; Jacobs, An; Janssens, Peter; Thienpont, Hugo
2015-09-01
In a multi-disciplinary effort, we investigate the level of speckle that can be tolerated in a laser cinema projector based on a quality of experience experiment with movie clips shown to a test audience in a real-life movie theatre setting. We identify a speckle disturbance threshold by statistically analyzing the observers’ responses for different values of the amount of speckle, which was monitored using a well-defined speckle measurement method. The analysis shows that the speckle perception of a human observer is not only dependent on the objectively measured amount of speckle, but it is also strongly influenced by the image content. The speckle disturbance limit for movies turns out to be substantially larger than that for still images, and hence is easier to attain.
Speckle contrast diffuse correlation tomography of complex turbid medium flow
Energy Technology Data Exchange (ETDEWEB)
Huang, Chong; Irwin, Daniel; Lin, Yu; Shang, Yu; He, Lian; Kong, Weikai; Yu, Guoqiang [Department of Biomedical Engineering, University of Kentucky, Lexington, Kentucky 40506 (United States); Luo, Jia [Department of Pharmacology and Nutritional Sciences, University of Kentucky, Lexington, Kentucky 40506 (United States)
2015-07-15
Purpose: Developed herein is a three-dimensional (3D) flow contrast imaging system leveraging advancements in the extension of laser speckle contrast imaging theories to deep tissues along with our recently developed finite-element diffuse correlation tomography (DCT) reconstruction scheme. This technique, termed speckle contrast diffuse correlation tomography (scDCT), enables incorporation of complex optical property heterogeneities and sample boundaries. When combined with a reflectance-based design, this system facilitates a rapid segue into flow contrast imaging of larger, in vivo applications such as humans. Methods: A highly sensitive CCD camera was integrated into a reflectance-based optical system. Four long-coherence laser source positions were coupled to an optical switch for sequencing of tomographic data acquisition providing multiple projections through the sample. This system was investigated through incorporation of liquid and solid tissue-like phantoms exhibiting optical properties and flow characteristics typical of human tissues. Computer simulations were also performed for comparisons. A uniquely encountered smear correction algorithm was employed to correct point-source illumination contributions during image capture with the frame-transfer CCD and reflectance setup. Results: Measurements with scDCT on a homogeneous liquid phantom showed that speckle contrast-based deep flow indices were within 12% of those from standard DCT. Inclusion of a solid phantom submerged below the liquid phantom surface allowed for heterogeneity detection and validation. The heterogeneity was identified successfully by reconstructed 3D flow contrast tomography with scDCT. The heterogeneity center and dimensions and averaged relative flow (within 3%) and localization were in agreement with actuality and computer simulations, respectively. Conclusions: A custom cost-effective CCD-based reflectance 3D flow imaging system demonstrated rapid acquisition of dense boundary
A Survey of Image Encryption Algorithms
Kumari, Manju; Gupta, Shailender; Sardana, Pranshul
2017-12-01
Security of data/images is one of the crucial aspects in the gigantic and still expanding domain of digital transfer. Encryption of images is one of the well known mechanisms to preserve confidentiality of images over a reliable unrestricted public medium. This medium is vulnerable to attacks and hence efficient encryption algorithms are a necessity for secure data transfer. Various techniques have been proposed in the literature to date, each with an edge over the others, to catch up with the ever growing need for security. This paper is an effort to compare the most popular techniques available on the basis of various performance metrics such as differential, statistical and quantitative attack analysis. To measure the efficacy, all the modern and mature techniques are implemented in MATLAB-2015. The results show that the chaotic schemes used in the study provide highly scrambled encrypted images having uniform histogram distribution. In addition, the encrypted images exhibit very low correlation coefficient values in the horizontal, vertical and diagonal directions, proving their resistance against statistical attacks. These schemes are also able to resist differential attacks, as they show high sensitivity to the initial conditions, i.e. pixel and key values. Finally, the schemes provide a large key space, hence can resist brute-force attacks, and require very little computational time for image encryption/decryption in comparison to other schemes available in the literature.
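One of the statistical metrics in such comparisons, the adjacent-pixel correlation coefficient, is simple to reproduce; a sketch with a smooth gradient standing in for a plain image and a seeded pseudo-random array standing in for a cipher image (toy data, not the paper's test set):

```python
import random

def correlation(xs, ys):
    """Pearson correlation coefficient between two equal-length pixel sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / (vx * vy) ** 0.5

def horizontal_pairs(img):
    """All horizontally adjacent pixel pairs of a 2D image (list of rows)."""
    xs, ys = [], []
    for row in img:
        xs.extend(row[:-1])
        ys.extend(row[1:])
    return xs, ys

plain = [[10 * i + j for j in range(4)] for i in range(4)]         # smooth gradient
rng = random.Random(0)                                             # deterministic "cipher"
cipher = [[rng.randrange(256) for _ in range(16)] for _ in range(16)]
```

A natural image scores near 1; a well-encrypted one should score near 0 in all three directions.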
Sensitivity evaluation of dynamic speckle activity measurements using clustering methods
International Nuclear Information System (INIS)
Etchepareborda, Pablo; Federico, Alejandro; Kaufmann, Guillermo H.
2010-01-01
We evaluate and compare the use of competitive neural networks, self-organizing maps, the expectation-maximization algorithm, K-means, and fuzzy C-means techniques as partitional clustering methods, when the sensitivity of the activity measurement of dynamic speckle images needs to be improved. The temporal history of the acquired intensity generated by each pixel is analyzed in a wavelet decomposition framework, and it is shown that the mean energy of its corresponding wavelet coefficients provides a suitable feature space for clustering purposes. The sensitivity obtained by using the evaluated clustering techniques is also compared with the well-known methods of Konishi-Fujii, weighted generalized differences, and wavelet entropy. The performance of the partitional clustering approach is evaluated using simulated dynamic speckle patterns and also experimental data.
Staloff, Isabelle Afriat
Skin mechanical properties have been extensively studied and have led to an understanding of the structure and role of the collagen and elastin fiber network in the dermis and its changes due to aging. All these techniques have either isolated the skin from its natural environment (in vitro) or, when studying it in vivo, attempted to minimize the effect of the underlying tissues and muscles. The human facial region is unique compared to other parts of the body in that the underlying musculature runs through the subcutaneous tissue and is directly connected to the dermis with collagen-based fibrous tissues. These fibrous tissues comprise the superficial musculoaponeurotic system, commonly referred to as the SMAS layer. Retaining ligaments anchor the skin to the periosteum and hold the dermis to the SMAS. In addition, traditional techniques generally collect an average response of the skin. Data gathered in this manner are incomplete, as the skin is anisotropic and under constant tension. We therefore introduce the Digital Image Speckle Correlation (DISC) method, which maps in two dimensions the skin deformation under the complex set of forces involved during muscular activity. DISC, a non-contact in vivo technique, generates spatially resolved information. By observing the detailed motion of the facial skin we can infer the manner in which the complex ensemble of forces induced by movement of the muscles distributes and dissipates on the skin. By analyzing the effect of aging on the distribution of these complex forces we can measure its impact on skin elasticity and quantify the efficacy of skin care products. In addition, we speculate on the mechanism of wrinkle formation. Furthermore, we investigate the use of DISC to map the mechanism of film formation of various polymers on skin. Finally, we show that DISC can detect the involuntary facial muscular activity induced by various fragrances.
Directory of Open Access Journals (Sweden)
Cyril Puissant
Full Text Available Endothelial dysfunction precedes atherosclerosis. Vasodilation induced by acetylcholine (ACh) is a specific test of endothelial function. The reproducibility of laser techniques such as laser Doppler flowmetry (LDF) and laser speckle contrast imaging (LSCI) in detecting ACh vasodilation is debated, and the expression of results lacks standardization. We aimed to study, at a 7-day interval, (i) the inter-subject reproducibility, (ii) the intra-subject reproducibility, and (iii) the effect of result expression on variability. Using LDF and LSCI simultaneously, we performed two different ACh-iontophoresis protocols. The maximal ACh vasodilation (peak-ACh) was expressed as absolute or normalized flow or conductance values. Inter-subject reproducibility was expressed as a coefficient of variation (inter-CV, %). Intra-subject reproducibility was expressed as within-subject coefficients of variation (intra-CV, %) and intra-class correlation coefficients (ICC). Fifteen healthy subjects were included. The inter-subject reproducibility of peak-ACh depended upon the expression of the results and ranged from 55% to 162% for LDF and from 17% to 83% for LSCI. The intra-subject coefficients of variation of peak-ACh were reduced when assessed with LSCI compared to LDF, no matter how the results were expressed and whatever the protocol used. The highest intra-subject reproducibility was found using LSCI: 18.7%/0.87 for a single current stimulation (expressed as cutaneous vascular conductance) and 11.4%/0.61 for multiple current stimulations (expressed as absolute values). ACh-iontophoresis coupled with LSCI is a promising test to assess endothelial function because it is reproducible, safe, and non-invasive. Registration number: NCT01664572.
Algorithms evaluation for fundus images enhancement
International Nuclear Information System (INIS)
Braem, V; Marcos, M; Bizai, G; Drozdowicz, B; Salvatelli, A
2011-01-01
Color images of the retina inherently involve noise and illumination artifacts. In order to improve the diagnostic quality of the images, it is desirable to homogenize the non-uniform illumination and increase contrast while preserving color characteristics. The visual results of different pre-processing techniques can be very dissimilar, and an objective assessment of the techniques is necessary in order to select the most suitable one. In this article the performance of eight algorithms for correcting non-uniform illumination, modifying contrast and preserving color was evaluated. In order to choose the most suitable, a general score was proposed. The results made a good impression on the experts, although some differences suggest that the image with the best statistical quality is not necessarily the one of best diagnostic quality to the trained doctor's eye. This means that the best pre-processing algorithm for automatic classification may differ from the most suitable one for visual diagnosis. However, both should result in the same final diagnosis.
MATLAB for laser speckle contrast analysis (LASCA): a practice-based approach
Postnikov, Eugene B.; Tsoy, Maria O.; Postnov, Dmitry E.
2018-04-01
Laser Speckle Contrast Analysis (LASCA) is one of the most powerful modern methods for revealing blood dynamics. The experimental design and theory for this method are well established, and the computational recipe is often regarded as trivial. However, the achieved performance and spatial resolution may differ considerably between implementations. We present a mini-review of known approaches to spatial laser speckle contrast data processing and their realization in MATLAB code, providing an explicit correspondence to the mathematical representation and a discussion of available implementations. We also present an algorithm based on the 2D Haar wavelet transform, likewise supplied with program code. This new method provides an opportunity to introduce horizontal, vertical and diagonal speckle contrasts; it may be used for processing highly anisotropic images of vascular trees. We provide a comparative analysis of the accuracy of vascular pattern detection and of the processing times, with special attention to details of the MATLAB procedures used.
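The quantity all of these implementations compute is the spatial speckle contrast K = σ/μ over a small sliding window; a pure-Python sketch of the basic (non-wavelet) version, with window size and border handling as arbitrary choices:

```python
def speckle_contrast(img, w=3):
    """Spatial speckle contrast K = sigma/mean in each w-by-w window;
    borders are left at 0.0 for simplicity."""
    h, wd = len(img), len(img[0])
    r = w // 2
    out = [[0.0] * wd for _ in range(h)]
    for i in range(r, h - r):
        for j in range(r, wd - r):
            vals = [img[i + di][j + dj]
                    for di in range(-r, r + 1) for dj in range(-r, r + 1)]
            n = len(vals)
            mean = sum(vals) / n
            var = sum((v - mean) ** 2 for v in vals) / n
            out[i][j] = var ** 0.5 / mean if mean else 0.0
    return out
```

Fully blurred (fast-flow) speckle gives K near 0, while a static, fully developed speckle pattern pushes K toward 1, which is what makes the map a flow indicator.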
Autofocus algorithm for curvilinear SAR imaging
Bleszynski, E.; Bleszynski, M.; Jaroszewicz, T.
2012-05-01
We describe an approach to autofocusing for large apertures on curved SAR trajectories. It is a phase-gradient-type method in which phase corrections compensating trajectory perturbations are estimated not directly from the image itself, but rather on the basis of "partial" SAR data, functions of the slow and fast times, reconstructed (by an appropriate forward-projection procedure) from windowed scene patches of sizes comparable to distances between distinct targets or localized features of the scene. The resulting "partial data" can be shown to contain the same information on the phase perturbations as the original data, provided the frequencies of the perturbations do not exceed a quantity proportional to the patch size. The algorithm uses as input a sequence of conventional scene images based on moderate-size subapertures constituting the full aperture for which the phase corrections are to be determined. The subaperture images are formed with pixel sizes comparable to the range resolution which, for the optimal subaperture size, should also be approximately equal to the cross-range resolution. The method does not restrict the size or shape of the synthetic aperture and can be incorporated in the data collection process in persistent sensing scenarios. The algorithm has been tested on the publicly available set of GOTCHA data, intentionally corrupted by random-walk-type trajectory fluctuations (a possible model of errors caused by imprecise inertial navigation system readings) of maximum frequencies compatible with the selected patch size. It was able to efficiently remove image corruption for apertures of sizes up to 360 degrees.
Non-Imaging Speckle Interferometry for High Speed Nanometer-Scale Position Detection
van Putten, E. G.; Lagendijk, A.; Mosk, A. P.
2011-01-01
We experimentally demonstrate a non-imaging approach to displacement measurement for complex scattering materials. By spatially controlling the wave front of the light that is incident on the material, we concentrate the scattered light in a focus at a designated position. This wave front acts as a unique optical fingerprint that enables precise position detection of the illuminated material by simply measuring the intensity in the focus. By combining two optical fingerprints we demonstrate pos...
Improved Bat Algorithm Applied to Multilevel Image Thresholding
Directory of Open Access Journals (Sweden)
Adis Alihodzic
2014-01-01
Full Text Available Multilevel image thresholding is a very important image processing technique that is used as a basis for image segmentation and further higher level processing. However, the required computational time for exhaustive search grows exponentially with the number of desired thresholds. Swarm intelligence metaheuristics are well known as successful and efficient optimization methods for intractable problems. In this paper, we adjusted one of the latest swarm intelligence algorithms, the bat algorithm, for the multilevel image thresholding problem. The results of testing on standard benchmark images show that the bat algorithm is comparable with other state-of-the-art algorithms. We improved the standard bat algorithm, adding elements from differential evolution and from the artificial bee colony algorithm. Our new improved bat algorithm proved to be better than five other state-of-the-art algorithms, improving the quality of results in all cases and significantly improving convergence speed.
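The fitness such metaheuristics usually maximize is Otsu's between-class variance extended to several thresholds; a sketch of the objective on a toy 16-level histogram, with exhaustive search standing in for the bat algorithm itself:

```python
def between_class_variance(hist, thresholds):
    """Otsu objective: sum over classes of w_k * (mu_k - mu_total)^2, where the
    sorted thresholds cut the gray-level histogram into consecutive classes."""
    total = sum(hist)
    mu_total = sum(level * h for level, h in enumerate(hist)) / total
    bounds = [0] + sorted(thresholds) + [len(hist)]
    var = 0.0
    for a, b in zip(bounds[:-1], bounds[1:]):
        weight = sum(hist[a:b]) / total
        if weight == 0.0:
            continue
        mu = sum(level * hist[level] for level in range(a, b)) / sum(hist[a:b])
        var += weight * (mu - mu_total) ** 2
    return var

# Bimodal toy histogram: the best single threshold falls between the two modes.
hist = [0, 5, 10, 5, 0, 0, 0, 0, 0, 0, 0, 5, 10, 5, 0, 0]
best = max(range(1, 16), key=lambda t: between_class_variance(hist, [t]))
```

With k thresholds the exhaustive search above costs O(L^k), which is exactly the exponential growth the metaheuristic is brought in to avoid.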
Inverse synthetic aperture radar imaging principles, algorithms and applications
Chen , Victor C
2014-01-01
Inverse Synthetic Aperture Radar Imaging: Principles, Algorithms and Applications is based on the latest research on ISAR imaging of moving targets and non-cooperative target recognition (NCTR). With a focus on the advances and applications, this book will provide readers with a working knowledge on various algorithms of ISAR imaging of targets and implementation with MATLAB. These MATLAB algorithms will prove useful in order to visualize and manipulate some simulated ISAR images.
Parallel image encryption algorithm based on discretized chaotic map
International Nuclear Information System (INIS)
Zhou Qing; Wong Kwokwo; Liao Xiaofeng; Xiang Tao; Hu Yue
2008-01-01
Recently, a variety of chaos-based algorithms were proposed for image encryption. Nevertheless, none of them works efficiently in a parallel computing environment. In this paper, we propose a framework for parallel image encryption. Based on this framework, a new algorithm is designed using the discretized Kolmogorov flow map. It fulfills all the requirements for a parallel image encryption algorithm. Moreover, it is secure and fast. These properties make it a good choice for image encryption on parallel computing platforms.
Speckle dynamics under ergodicity breaking
Sdobnov, Anton; Bykov, Alexander; Molodij, Guillaume; Kalchenko, Vyacheslav; Jarvinen, Topias; Popov, Alexey; Kordas, Krisztian; Meglinski, Igor
2018-04-01
Laser speckle contrast imaging (LSCI) is a well-known and versatile approach for the non-invasive visualization of flows and microcirculation localized in turbid scattering media, including biological tissues. In most conventional implementations of LSCI the ergodic regime is typically assumed valid. However, most composite turbid scattering media, especially biological tissues, are non-ergodic, containing a mixture of dynamic and static centers of light scattering. In the current study, we examined the speckle contrast in different dynamic conditions with the aim of assessing limitations in the quantitative interpretation of speckle contrast images. Based on a simple phenomenological approach, we introduced a coefficient of speckle dynamics to quantitatively assess the ratio of the dynamic part of a scattering medium to the static one. The introduced coefficient allows one to distinguish real changes in motion from the mere appearance of static components in the field of view. As examples of systems with static/dynamic transitions, thawing and heating of Intralipid samples were studied by the LSCI approach.
X-ray speckle correlation interferometer
International Nuclear Information System (INIS)
Eisenhower, Rachel; Materlik, Gerhard
2000-01-01
Speckle Pattern Correlation Interferometry (SPCI) is a well-established technique in the visible-light regime for observing surface disturbances. Although not a direct imaging technique, SPCI gives full-field, high-resolution information about an object's motion. Since x-ray synchrotron radiation beamlines with high coherent flux have allowed the observation of x-ray speckle, x-ray SPCI could provide a means to measure strains and other quasi-static motions in disordered systems. This paper therefore examines the feasibility of an x-ray speckle correlation interferometer
Speckle noise reduction for optical coherence tomography based on adaptive 2D dictionary
Lv, Hongli; Fu, Shujun; Zhang, Caiming; Zhai, Lin
2018-05-01
As a high-resolution biomedical imaging modality, optical coherence tomography (OCT) is widely used in medical sciences. However, OCT images often suffer from speckle noise, which can mask important image information and thus reduce the accuracy of clinical diagnosis. Taking full advantage of nonlocal self-similarity and adaptive 2D-dictionary-based sparse representation, in this work, a speckle noise reduction algorithm is proposed for despeckling OCT images. To reduce speckle noise while preserving local image features, similar nonlocal patches are first extracted from the noisy image and put into groups using a gamma-distribution-based block matching method. An adaptive 2D dictionary is then learned for each patch group. Unlike traditional vector-based sparse coding, we express each image patch by the linear combination of a few matrices. This image-to-matrix method can exploit the local correlation between pixels. Since each image patch might belong to several groups, the despeckled OCT image is finally obtained by aggregating all filtered image patches. The experimental results demonstrate the superior performance of the proposed method over other state-of-the-art despeckling methods, in terms of objective metrics and visual inspection.
Image steganalysis using Artificial Bee Colony algorithm
Sajedi, Hedieh
2017-09-01
Steganography is the science of secure communication in which the presence of the communication cannot be detected, while steganalysis is the art of discovering the existence of the secret communication. Processing a huge amount of information usually takes extensive execution time and computational resources. As a result, a preprocessing phase is needed that can moderate the execution time and computational cost. In this paper, we propose a new feature-based blind steganalysis method for detecting stego images among cover (clean) images in JPEG format. In this regard, we present a feature selection technique based on an improved Artificial Bee Colony (ABC) algorithm. The ABC algorithm is inspired by honeybees' social behaviour in their search for food sources. In the proposed method, classifier performance and the dimension of the selected feature vector depend on the use of wrapper-based methods. The experiments are performed using two large data sets of JPEG images. Experimental results demonstrate the effectiveness of the proposed steganalysis technique compared to other existing techniques.
Efficient predictive algorithms for image compression
Rosário Lucas, Luís Filipe; Maciel de Faria, Sérgio Manuel; Morais Rodrigues, Nuno Miguel; Liberal Pagliari, Carla
2017-01-01
This book discusses efficient prediction techniques for the current state-of-the-art High Efficiency Video Coding (HEVC) standard, focusing on the compression of a wide range of video signals, such as 3D video, Light Fields and natural images. The authors begin with a review of the state-of-the-art predictive coding methods and compression technologies for both 2D and 3D multimedia content, which provides a good starting point for new researchers in the field of image and video compression. New prediction techniques that go beyond the standardized compression technologies are then presented and discussed. In the context of 3D video, the authors describe a new predictive algorithm for the compression of depth maps, which combines intra-directional prediction with flexible block partitioning and linear residue fitting. New approaches are described for the compression of Light Field and still images, which enforce sparsity constraints on linear models. The Locally Linear Embedding-based prediction method is in...
Sengupta, Partho P; Huang, Yen-Min; Bansal, Manish; Ashrafi, Ali; Fisher, Matt; Shameer, Khader; Gall, Walt; Dudley, Joel T
2016-06-01
Associating a patient's profile with the memories of prototypical patients built through previous repeated clinical experience is a key process in clinical judgment. We hypothesized that a similar process using a cognitive computing tool would be well suited for learning and recalling multidimensional attributes of speckle tracking echocardiography data sets derived from patients with known constrictive pericarditis and restrictive cardiomyopathy. Clinical and echocardiographic data of 50 patients with constrictive pericarditis and 44 with restrictive cardiomyopathy were used for developing an associative-memory-classifier-based machine-learning algorithm. The speckle tracking echocardiography data were normalized in reference to 47 controls with no structural heart disease, and the diagnostic area under the receiver operating characteristic curve of the associative memory classifier was evaluated for differentiating constrictive pericarditis from restrictive cardiomyopathy. Using only speckle tracking echocardiography variables, the associative memory classifier achieved a diagnostic area under the curve of 89.2%, which improved to 96.2% with the addition of 4 echocardiographic variables. In comparison, the areas under the curve of early diastolic mitral annular velocity and left ventricular longitudinal strain were 82.1% and 63.7%, respectively. Furthermore, the associative memory classifier demonstrated greater accuracy and shorter learning curves than other machine-learning approaches, with accuracy asymptotically approaching 90% after a training fraction of 0.3 and remaining flat at higher training fractions. This study demonstrates the feasibility of a cognitive machine-learning approach for learning and recalling patterns observed during echocardiographic evaluations. Incorporation of machine-learning algorithms in cardiac imaging may aid standardized assessments and support the quality of interpretations, particularly for novice readers with limited experience.
New imaging algorithm in diffusion tomography
Klibanov, Michael V.; Lucas, Thomas R.; Frank, Robert M.
1997-08-01
A novel imaging algorithm for diffusion/optical tomography is presented for the case of the time-dependent diffusion equation. Numerical tests are conducted for ranges of parameters realistic for applications to early breast cancer diagnosis using ultrafast laser pulses. This is a perturbation-like method which works for both homogeneous and heterogeneous background media. Its main innovation lies in a new approach to a novel linearized problem (LP). Such an LP is derived and reduced to a boundary value problem for a coupled system of elliptic partial differential equations. As is well known, the solution of such a system amounts to the factorization of well-conditioned, sparse matrices with few non-zero entries clustered along the diagonal, which can be done very rapidly. Thus, the main advantages of this technique are that it is fast and accurate. The authors call this approach the elliptic systems method (ESM). The ESM can be extended to other data collection schemes.
Image Registration Algorithm Based on Parallax Constraint and Clustering Analysis
Wang, Zhe; Dong, Min; Mu, Xiaomin; Wang, Song
2018-01-01
To resolve the problem of slow computation speed and low matching accuracy in image registration, a new image registration algorithm based on parallax constraint and clustering analysis is proposed. First, the Harris corner detection algorithm is used to extract the feature points of the two images. Second, the Normalized Cross-Correlation (NCC) function is used to perform approximate matching of the feature points, yielding the initial feature pairs. Then, according to the parallax constraint condition, the initial feature pairs are preprocessed by the K-means clustering algorithm, which removes the feature point pairs with obvious errors from the approximate matching process. Finally, the Random Sample Consensus (RANSAC) algorithm is adopted to optimize the feature points and obtain the final feature point matching result, realizing fast and accurate image registration. The experimental results show that the image registration algorithm proposed in this paper can improve the accuracy of image matching while ensuring the real-time performance of the algorithm.
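The NCC matching step described in this abstract is compact enough to sketch. Below is an illustrative exhaustive search only; the Harris detection, K-means preprocessing, and RANSAC stages are omitted, and the function names are hypothetical:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def match_template(image, templ):
    """Exhaustive NCC search; returns the best (row, col) and score."""
    th, tw = templ.shape
    best_score, best_pos = -2.0, (0, 0)
    for i in range(image.shape[0] - th + 1):
        for j in range(image.shape[1] - tw + 1):
            s = ncc(image[i:i + th, j:j + tw], templ)
            if s > best_score:
                best_score, best_pos = s, (i, j)
    return best_pos, best_score

rng = np.random.default_rng(1)
img = rng.normal(size=(40, 40))
patch = img[12:20, 17:25].copy()  # a patch cut from the image itself
pos, score = match_template(img, patch)
```

In the full pipeline, pairs produced this way would then be filtered by the parallax-constrained K-means step and refined by RANSAC.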
Speckle reduction in digital holography with resampling ring masks
Zhang, Wenhui; Cao, Liangcai; Jin, Guofan
2018-01-01
One-shot digital holographic imaging has the advantages of high stability and low temporal cost. However, the reconstruction is affected by speckle noise. A resampling ring-mask method in the spectrum domain is proposed for speckle reduction. The useful spectrum of one hologram is divided into several sub-spectra by ring masks. In the reconstruction, the angular spectrum transform, which involves no approximation, is applied to guarantee calculation accuracy. N reconstructed amplitude images are calculated from the corresponding sub-spectra. Thanks to the speckle's random distribution, superimposing these N uncorrelated amplitude images leads to a final reconstructed image with lower speckle noise. Normalized relative standard deviation values of the reconstructed image are used to evaluate the speckle reduction. The effect of the method on the spatial resolution of the reconstructed image is also quantitatively evaluated. Experimental and simulation results prove the feasibility and effectiveness of the proposed method.
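The ring-mask decomposition can be illustrated with a plain FFT. The sketch below applies ring masks directly to a complex field and averages the sub-band amplitudes; the hologram-specific steps (angular spectrum propagation, carrier filtering) are omitted, so this is an illustration of the idea, not the paper's implementation:

```python
import numpy as np

def ring_masks(shape, n):
    """Partition the Fourier plane into n concentric ring masks."""
    h, w = shape
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    radius = np.hypot(fy, fx)
    edges = np.linspace(0.0, radius.max() + 1e-9, n + 1)
    return [(radius >= edges[k]) & (radius < edges[k + 1]) for k in range(n)]

def ring_average(field, n=4):
    """Reconstruct n sub-images from ring sub-spectra and average their
    amplitudes; uncorrelated speckle in the sub-images averages down."""
    spectrum = np.fft.fft2(field)
    amplitudes = [np.abs(np.fft.ifft2(spectrum * m))
                  for m in ring_masks(field.shape, n)]
    return sum(amplitudes) / n

rng = np.random.default_rng(2)
field = rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))
masks = ring_masks(field.shape, 4)
result = ring_average(field, n=4)
```

The masks form an exact partition of the frequency plane, so no spectral content is lost when the sub-images are formed.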
Adaptive Algorithms for Automated Processing of Document Images
2011-01-01
Title of dissertation: Adaptive Algorithms for Automated Processing of Document Images. Mudit Agrawal, Doctor of Philosophy, 2011. Dissertation submitted to the Faculty of the Graduate School of the University...
Maksimova, L. A.; Ryabukho, P. V.; Mysina, N. Yu.; Lyakin, D. V.; Ryabukho, V. P.
2018-04-01
We have investigated the capabilities of the method of digital speckle interferometry for determining subpixel displacements of a speckle structure formed by a displaceable or deformable object with a scattering surface. An analysis of the spatial spectra of speckle structures makes it possible to perform measurements with subpixel accuracy and to extend the lower boundary of the range of measurable displacements of speckle structures to subpixel values. The method is realized on the basis of digital recording of the images of the undisplaced and displaced speckle structures, their spatial-frequency analysis using numerically specified constant phase shifts, and correlation analysis of the spatial spectra of the speckle structures. Transformation into the frequency domain makes it possible to obtain the measured quantities with subpixel accuracy, either from the shift of the interference-pattern minimum in the diffraction halo produced by introducing an additional phase shift into the complex spatial spectrum of the speckle structure, or from the slope of the linear plot of the accumulated phase difference in the field of the complex spatial spectrum of the displaced speckle structure. The capabilities of the method have been verified in a physical experiment.
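The idea of recovering a displacement from the phase of the spatial spectrum can be illustrated in one dimension. The sketch below estimates a subpixel shift from the slope of the cross-spectrum phase; the low-frequency cutoff and function names are assumptions, and the paper's interferometric processing is considerably richer than this:

```python
import numpy as np

def subpixel_shift(f, g):
    """Estimate d such that g(x) = f(x - d), from the slope of the
    cross-spectrum phase (a 1-D analogue of the spectral approach)."""
    F, G = np.fft.fft(f), np.fft.fft(g)
    cross = F * np.conj(G)                 # phase(cross) = 2*pi*k*d
    k = np.fft.fftfreq(len(f))
    sel = (np.abs(k) > 0) & (np.abs(k) < 0.15)   # avoid phase wrapping
    phase = np.angle(cross[sel])
    slope = np.sum(phase * k[sel]) / np.sum(k[sel] ** 2)  # least squares
    return slope / (2 * np.pi)

rng = np.random.default_rng(3)
signal = rng.normal(size=256)
k_full = np.fft.fftfreq(256)
d_true = 2.3   # a subpixel circular shift applied in Fourier space
shifted = np.fft.ifft(np.fft.fft(signal)
                      * np.exp(-2j * np.pi * k_full * d_true)).real
d_est = subpixel_shift(signal, shifted)
```

Because the shift information lives entirely in the phase, the estimate is not limited to integer pixels, which is the point of the spectral-domain formulation.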
Highly porous nanoberyllium for X-ray beam speckle suppression
Energy Technology Data Exchange (ETDEWEB)
Goikhman, Alexander, E-mail: agoikhman@ymail.com; Lyatun, Ivan; Ershov, Petr [Immanuel Kant Baltic Federal University, Nevskogo str. 14, Kaliningrad 236041 (Russian Federation); Snigireva, Irina [European Synchrotron Radiation Facility, BP 220, 38043 Grenoble (France); Wojda, Pawel [Immanuel Kant Baltic Federal University, Nevskogo str. 14, Kaliningrad 236041 (Russian Federation); Gdańsk University of Technology, 11/12 G. Narutowicza, Gdańsk 80-233 (Poland); Gorlevsky, Vladimir; Semenov, Alexander; Sheverdyaev, Maksim; Koletskiy, Viktor [A. A. Bochvar High-Technology Scientific Research Institute for Inorganic Materials, Rogova str. 5a, Moscow 123098 (Russian Federation); Snigirev, Anatoly [Immanuel Kant Baltic Federal University, Nevskogo str. 14, Kaliningrad 236041 (Russian Federation); European Synchrotron Radiation Facility, BP 220, 38043 Grenoble (France)
2015-04-09
A speckle suppression device containing highly porous nanoberyllium is proposed for manipulating the spatial coherence length and removing undesirable speckle structure during imaging experiments. This paper reports a special device called a ‘speckle suppressor’, which contains a highly porous nanoberyllium plate squeezed between two beryllium windows. The insertion of the speckle suppressor in an X-ray beam allows manipulation of the spatial coherence length, thus changing the effective source size and removing the undesirable speckle structure in X-ray imaging experiments almost without beam attenuation. The absorption of the nanoberyllium plate is below 1% for 1 mm thickness at 12 keV. The speckle suppressor was tested on the ID06 ESRF beamline with X-rays in the energy range from 9 to 15 keV. It was applied for the transformation of the phase–amplitude contrast to the pure amplitude contrast in full-field microscopy.
Highly porous nanoberyllium for X-ray beam speckle suppression
International Nuclear Information System (INIS)
Goikhman, Alexander; Lyatun, Ivan; Ershov, Petr; Snigireva, Irina; Wojda, Pawel; Gorlevsky, Vladimir; Semenov, Alexander; Sheverdyaev, Maksim; Koletskiy, Viktor; Snigirev, Anatoly
2015-01-01
A speckle suppression device containing highly porous nanoberyllium is proposed for manipulating the spatial coherence length and removing undesirable speckle structure during imaging experiments. This paper reports a special device called a ‘speckle suppressor’, which contains a highly porous nanoberyllium plate squeezed between two beryllium windows. The insertion of the speckle suppressor in an X-ray beam allows manipulation of the spatial coherence length, thus changing the effective source size and removing the undesirable speckle structure in X-ray imaging experiments almost without beam attenuation. The absorption of the nanoberyllium plate is below 1% for 1 mm thickness at 12 keV. The speckle suppressor was tested on the ID06 ESRF beamline with X-rays in the energy range from 9 to 15 keV. It was applied for the transformation of the phase–amplitude contrast to the pure amplitude contrast in full-field microscopy
Implementation of dictionary pair learning algorithm for image quality improvement
Vimala, C.; Aruna Priya, P.
2018-04-01
This paper proposes an image denoising method based on a dictionary pair learning algorithm. Visual information transmitted in the form of digital images is becoming a major method of communication in the modern age, but the image obtained after transmission is often corrupted with noise. The received image needs processing before it can be used in applications. Image denoising involves the manipulation of the image data to produce a visually high-quality image.
Algorithms for contrast enhancement of electronic portal images
International Nuclear Information System (INIS)
Díez, S.; Sánchez, S.
2015-01-01
An implementation of two new automated image processing algorithms for contrast enhancement of portal images is presented; these are suitable tools that facilitate the setup verification and visualization of patients during radiotherapy treatments. In the first algorithm, called Automatic Segmentation and Histogram Stretching (ASHS), the portal image is automatically segmented into two sub-images delimited by the conformed treatment beam: one consisting of the imaged patient inside the radiation treatment field, and the other of the imaged patient outside it. By segmenting the original image, histogram stretching can be performed and improved independently in both regions. The second algorithm involves a two-step process. In the first step, Normalization to Local Mean (NLM), an inverse restoration filter is applied by dividing the portal image, pixel by pixel, by its blurred version. In the second step, named Linearly Combined Local Histogram Equalization (LCLHE), the contrast of the original image is strongly improved by a Local Contrast Enhancement (LCE) algorithm, revealing the anatomical structures of the patient, and the output is linearly combined with the portal image of the patient. Finally, the output images of the two previous algorithms (NLM and LCLHE) are linearly combined once again in order to obtain a contrast-enhanced image. These two algorithms have been tested on several portal images with good results. - Highlights: • Two algorithms are implemented to improve the contrast of electronic portal images. • The multi-leaf and conformed beam are automatically segmented in portal images. • Hidden anatomical and bony structures in portal images are revealed. • The task of patient setup verification is facilitated by the contrast enhancement achieved.
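The NLM step (dividing the image by its blurred version) is simple to sketch. The example below uses a box blur computed via an integral image; the paper's actual blur kernel and window size are not stated here, so both are assumptions:

```python
import numpy as np

def normalize_to_local_mean(img, win=15, eps=1e-6):
    """Divide each pixel by its local mean (a box blur), flattening
    slow illumination variations; an inverse-restoration-style filter."""
    pad = win // 2
    padded = np.pad(img.astype(float), pad, mode="reflect")
    # box blur via an integral image (cumulative sums)
    ii = np.cumsum(np.cumsum(padded, axis=0), axis=1)
    ii = np.pad(ii, ((1, 0), (1, 0)))
    h, w = img.shape
    sums = (ii[win:win + h, win:win + w] - ii[:h, win:win + w]
            - ii[win:win + h, :w] + ii[:h, :w])
    local_mean = sums / (win * win)
    return img / (local_mean + eps)

flat = np.full((24, 32), 5.0)          # uniform region -> ratio near 1
corrected = normalize_to_local_mean(flat)
```

A uniform region maps to a constant near 1, while structure smaller than the window survives as a ratio against its local background.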
Automated Photogrammetric Image Matching with Sift Algorithm and Delaunay Triangulation
DEFF Research Database (Denmark)
Karagiannis, Georgios; Antón Castro, Francesc/François; Mioc, Darka
2016-01-01
An algorithm for image matching of multi-sensor and multi-temporal satellite images is developed. The method is based on the SIFT feature detector proposed by Lowe in (Lowe, 1999). First, SIFT feature points are detected independently in two images (reference and sensed image). The features detec...... of each feature set for each image are computed. The isomorphism of the Delaunay triangulations is determined to guarantee the quality of the image matching. The algorithm is implemented in Matlab and tested on World-View 2, SPOT6 and TerraSAR-X image patches....
A locally adaptive algorithm for shadow correction in color images
Karnaukhov, Victor; Kober, Vitaly
2017-09-01
The paper deals with correction of color images distorted by spatially nonuniform illumination. A serious distortion occurs in real conditions when a part of the scene containing 3D objects close to a directed light source is illuminated much brighter than the rest of the scene. A locally-adaptive algorithm for correction of shadow regions in color images is proposed. The algorithm consists of segmentation of shadow areas with rank-order statistics followed by correction of nonuniform illumination with human visual perception approach. The performance of the proposed algorithm is compared to that of common algorithms for correction of color images containing shadow regions.
Dumas, Christian; van der Lee, Arie; Palatinus, Lukáš
2013-05-01
Diffractive imaging using the intense and coherent beam of X-ray free-electron lasers opens new perspectives for structural studies of single nanoparticles and biomolecules. Simulations were carried out to generate 3D oversampled diffraction patterns of non-crystalline biological samples, ranging from peptides and proteins to megadalton complex assemblies, and to recover their molecular structure at nanometer to near-atomic resolutions. Using these simulated data, we show here that iterative reconstruction methods based on standard and variant forms of the charge flipping algorithm can efficiently solve the phase retrieval problem and extract a unique and reliable molecular structure. Contrary to conventional algorithms, where the estimation and use of a compact support is imposed, our approach does not require any prior information about the molecular assembly and is amenable to a wide range of biological assemblies. Importantly, the robustness of this ab initio approach is illustrated by the fact that it tolerates experimental noise and incompleteness of the intensity data at the center of the speckle pattern. Copyright © 2013 Elsevier Inc. All rights reserved.
AN IMPROVED FUZZY CLUSTERING ALGORITHM FOR MICROARRAY IMAGE SPOTS SEGMENTATION
Directory of Open Access Journals (Sweden)
V.G. Biju
2015-11-01
An automatic cDNA microarray image processing method using an improved fuzzy clustering algorithm is presented in this paper. The spot segmentation algorithm proposed uses the gridding technique developed earlier by the authors for finding the co-ordinates of each spot in an image. Automatic cropping of spots from the microarray image is done using these co-ordinates. The present paper proposes an improved fuzzy clustering algorithm, possibility fuzzy local information c-means (PFLICM), to segment the spot foreground (FG) from the background (BG). PFLICM improves the fuzzy local information c-means (FLICM) algorithm by incorporating the typicality of a pixel along with gray-level information and local spatial information. The performance of the algorithm is validated using a set of simulated cDNA microarray images corrupted with different levels of AWGN noise. The strength of the algorithm is tested by computing parameters such as the segmentation matching factor (SMF), probability of error (pe), discrepancy distance (D) and normalized mean square error (NMSE). The SMF value obtained for the PFLICM algorithm shows an improvement of 0.9% and 0.7% for high-noise and low-noise microarray images, respectively, compared to the FLICM algorithm. The PFLICM algorithm is also applied to real microarray images and gene expression values are computed.
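For orientation, the clustering baseline that FLICM and PFLICM extend is standard fuzzy c-means. The sketch below runs plain FCM on a 1-D intensity sample with a deterministic initialization; the local spatial information and typicality terms that distinguish PFLICM are not included:

```python
import numpy as np

def fuzzy_c_means(x, c=2, m=2.0, iters=50):
    """Standard fuzzy c-means on a 1-D sample vector; PFLICM extends
    this baseline with local spatial information and typicality."""
    centers = np.linspace(x.min(), x.max(), c)   # deterministic init
    u = np.full((c, x.size), 1.0 / c)
    for _ in range(iters):
        dist = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = dist ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=0)                       # memberships sum to 1
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)
    return centers, u

rng = np.random.default_rng(4)
# two intensity populations, e.g. spot foreground vs. background
x = np.concatenate([rng.normal(0.2, 0.03, 200), rng.normal(0.8, 0.03, 200)])
centers, u = fuzzy_c_means(x, c=2)
```

On well-separated intensity populations the two centers converge to the population means, which is the FG/BG split the spot segmentation relies on.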
Fast image matching algorithm based on projection characteristics
Zhou, Lijuan; Yue, Xiaobo; Zhou, Lijun
2011-06-01
Based on an analysis of the traditional template matching algorithm, this paper identifies the key factors restricting matching speed and puts forward a new fast matching algorithm based on projection. By projecting the grayscale image, this algorithm converts the two-dimensional information of the image into one dimension, and then matches and identifies through one-dimensional correlation; moreover, because normalization is performed, it also matches correctly when the image brightness or signal amplitude increases proportionally. Experimental results show that the projection-based image registration method proposed in this article can greatly improve matching speed while ensuring matching accuracy.
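The projection idea can be sketched as follows: each candidate window is reduced to normalized row and column projections, and matching proceeds by 1-D correlation. The exhaustive search below is illustrative only, and the normalization shows the claimed invariance to a proportional brightness change:

```python
import numpy as np

def projections(patch):
    """Row and column projections, zero-mean and unit-norm, so the
    match is invariant to a proportional brightness change."""
    def normalize(v):
        v = v - v.mean()
        norm = np.linalg.norm(v)
        return v / norm if norm > 0 else v
    return normalize(patch.sum(axis=1)), normalize(patch.sum(axis=0))

def projection_match(image, templ):
    """Locate templ by correlating 1-D projections instead of full
    2-D patches (illustrative; the paper's search strategy differs)."""
    th, tw = templ.shape
    t_rows, t_cols = projections(templ)
    best_score, best_pos = -np.inf, (0, 0)
    for i in range(image.shape[0] - th + 1):
        for j in range(image.shape[1] - tw + 1):
            r, c = projections(image[i:i + th, j:j + tw])
            score = float(r @ t_rows + c @ t_cols)
            if score > best_score:
                best_score, best_pos = score, (i, j)
    return best_pos

rng = np.random.default_rng(5)
img = rng.uniform(size=(30, 30))
templ = 2.0 * img[5:13, 8:16]        # template with brightness doubled
pos = projection_match(img, templ)
```

Correlating two 1-D vectors per window instead of a full 2-D patch is the source of the speed gain the abstract describes.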
High performance deformable image registration algorithms for manycore processors
Shackleford, James; Sharp, Gregory
2013-01-01
High Performance Deformable Image Registration Algorithms for Manycore Processors develops highly data-parallel image registration algorithms suitable for use on modern multi-core architectures, including graphics processing units (GPUs). Focusing on deformable registration, we show how to develop data-parallel versions of the registration algorithm suitable for execution on the GPU. Image registration is the process of aligning two or more images into a common coordinate frame and is a fundamental step in comparing or fusing data obtained from different sensor measurements.
The Noise Clinic: a Blind Image Denoising Algorithm
Directory of Open Access Journals (Sweden)
Marc Lebrun
2015-01-01
This paper describes the complete implementation of a blind image denoising algorithm that takes any digital image as input. In a first step the algorithm estimates a Signal and Frequency Dependent (SFD) noise model. In a second step, the image is denoised by a multiscale adaptation of the Non-local Bayes denoising method. We focus here on a careful analysis of the denoising step and present a detailed discussion of the influence of its parameters. Extensive commented tests of the blind denoising algorithm are presented, on real JPEG images and scans of old photographs.
Energy Technology Data Exchange (ETDEWEB)
Hassan, T.A.
1992-12-01
The practical use of Pulsed Laser Velocimetry (PLV) requires fast, reliable computer-based methods for tracking numerous particles suspended in a fluid flow. Two methods for performing tracking are presented. One method tracks a particle through multiple sequential images (a minimum of four is required) by prediction and verification of particle displacement and direction. The other method, requiring only two sequential images, uses a dynamic, binary, spatial cross-correlation technique. The algorithms are tested on computer-generated synthetic data and on experimental data obtained with traditional PLV methods, allowing error analysis and testing of the algorithms on real engineering flows. A novel method is proposed which eliminates tedious, undesirable, manual operator assistance in removing erroneous vectors. This method uses an iterative process involving an interpolated field produced from the most reliable vectors. Methods are developed to allow fast analysis and presentation of sets of PLV image data. An experimental investigation of a two-phase, horizontal, stratified flow regime was performed to determine the interface drag force and, correspondingly, the drag coefficient. A horizontal, stratified-flow test facility using water and air was constructed to allow interface shear measurements with PLV techniques. The experimentally obtained local drag measurements were compared with theoretical results given by conventional interfacial drag theory. Close agreement was shown when local conditions near the interface were similar to space-averaged conditions. However, theory based on macroscopic, space-averaged flow behavior was shown to give incorrect results if the local gas velocity near the interface was unstable, transient, and dissimilar from the average gas velocity through the test facility.
DEFF Research Database (Denmark)
Chakrabarti, Maumita; Jakobsen, Michael Linde; Hanson, Steen Grüner
2015-01-01
A novel spectrometer concept is analyzed and experimentally verified. The method relies on probing the speckle displacement due to a change in the incident wavelength. A rough surface is illuminated at an oblique angle, and the peak position of the covariance between the speckle patterns observed...
A Novel Plant Root Foraging Algorithm for Image Segmentation Problems
Directory of Open Access Journals (Sweden)
Lianbo Ma
2014-01-01
This paper presents a new type of biologically inspired global optimization methodology for image segmentation based on plant root foraging behavior, namely, the artificial root foraging algorithm (ARFO). The essential motive of ARFO is to imitate the significant characteristics of plant root foraging behavior, including branching, regrowing, and tropisms, to construct a heuristic algorithm for multidimensional and multimodal problems. A mathematical model is first designed to abstract various plant root foraging patterns. Then, the basic process of the ARFO algorithm derived from the model is described in detail. When tested against ten benchmark functions, ARFO shows superiority over other state-of-the-art algorithms on several of them. Further, we employ the ARFO algorithm to deal with the multilevel threshold image segmentation problem. Experimental results of the new algorithm on a variety of images demonstrate the suitability of the proposed method for solving such problems.
High speed display algorithm for 3D medical images using Multi Layer Range Image
International Nuclear Information System (INIS)
Ban, Hideyuki; Suzuki, Ryuuichi
1993-01-01
We propose a high-speed algorithm to display 3D voxel images obtained from medical imaging systems such as MRI. This algorithm converts voxel image data into six Multi Layer Range Image (MLRI) data sets, an augmentation of the range image data. To avoid calculations for invisible voxels, the algorithm selects at most three of the six MLRI data sets in accordance with the view direction. The proposed algorithm displays 256 x 256 x 256 voxel data within 0.6 seconds on a 22 MIPS workstation without special hardware such as a graphics engine. Real-time display will be possible on a 100 MIPS class workstation with our algorithm. (author)
Directory of Open Access Journals (Sweden)
Jian-Li Fu
2016-10-01
Objective: To analyze the three-dimensional speckle tracking imaging assessment of left ventricular change in patients with coronary heart disease and its correlation with serum indexes. Methods: A total of 152 patients first diagnosed with coronary heart disease formed the observation group of the study, and 117 healthy subjects formed the control group. Three-dimensional speckle tracking imaging (3D-STI) was used to evaluate the left ventricular function parameters of the two groups, the serum content of endothelial function indexes and platelet function indexes was detected, and the correlation between left ventricular function parameters under 3D-STI and serum indexes was further analyzed. Results: The absolute values of the left ventricular function parameters LVGLS, LVGRS, LVGCS and LVGAS from 3D-STI in the observation group were significantly less than those of the control group, while Ptw and Torsion levels were greater; the endothelial function indexes vWF, sICAM-1, sVCAM-1 and ET-1 content in serum were significantly higher than those of the control group, while vWF-cp and NO content were significantly lower; the platelet function indexes CD62P, GMP-140, CD63, sP-selectin, sCD40L and PAC-1 content in serum were significantly higher than those of the control group. The levels of left ventricular function parameters from 3D-STI in patients with coronary heart disease were directly correlated with serum indexes. Conclusion: 3D-STI can accurately assess left ventricular function and overall disease severity in patients with coronary heart disease, and is expected to become an effective method for early diagnosis and guidance of clinical treatment.
Barcelos, Amanda; Lamas, Cristiane; Tibiriça, Eduardo
2017-07-28
Infective endocarditis is a severe condition with high in-hospital and 5-year mortality. There is increasing incidence of infective endocarditis, which may be related to healthcare and changes in prophylaxis recommendations regarding oral procedures. Few studies have evaluated the microcirculation in patients with infective endocarditis, and so far, none have utilized laser-based technology or evaluated functional capillary density. The aim of the study is to evaluate the changes in the systemic microvascular bed of patients with both acute and subacute endocarditis. This is a cohort study that will include adult patients with confirmed active infective endocarditis according to the modified Duke criteria who were admitted to our center for treatment. A control group of sex- and age-matched healthy volunteers will be included. Functional capillary density, which is defined as the number of spontaneously perfused capillaries per square millimeter of skin, will be assessed by video-microscopy with an epi-illuminated fiber optic microscope. Capillary recruitment will be evaluated using post-occlusive reactive hyperemia. Microvascular flow will be evaluated in the forearm using a laser speckle contrast imaging system for the noninvasive and continuous measurement of cutaneous microvascular perfusion changes. Laser speckle contrast imaging will be used in combination with skin iontophoresis of acetylcholine, an endothelium-dependent vasodilator, or sodium nitroprusside (endothelium independent) to test microvascular reactivity. The present study will contribute to the investigation of microcirculatory changes in infective endocarditis and possibly lead to an earlier diagnosis of the condition and/or determination of its severity and complications. Trial registration ClinicalTrials.gov ID: NCT02940340.
The optimal algorithm for Multi-source RS image fusion.
Fu, Wei; Huang, Shui-Guang; Li, Zeng-Shun; Shen, Hao; Li, Jun-Shuai; Wang, Peng-Yuan
2016-01-01
In order to solve the issue that available fusion methods cannot self-adaptively adjust the fusion rules according to the subsequent processing requirements of Remote Sensing (RS) images, this paper puts forward GSDA (genetic-iterative self-organizing data analysis algorithm), which integrates the merits of genetic algorithms with the advantages of the iterative self-organizing data analysis algorithm, for multi-source RS image fusion. The proposed algorithm takes the translation-invariant wavelet transform as the model operator and the contrast pyramid conversion as the observation operator. The algorithm then designs the objective function as a weighted sum of evaluation indices and optimizes it with GSDA so as to obtain a higher-resolution RS image. The bullet points of the text are summarized as follows:
• The contribution proposes the iterative self-organizing data analysis algorithm for multi-source RS image fusion.
• This article presents the GSDA algorithm for self-adaptive adjustment of the fusion rules.
• This text puts forward the model operator and the observation operator as the fusion scheme of RS images based on GSDA.
The proposed algorithm opens up a novel algorithmic pathway for multi-source RS image fusion by means of GSDA.
An improved ASIFT algorithm for indoor panorama image matching
Fu, Han; Xie, Donghai; Zhong, Ruofei; Wu, Yu; Wu, Qiong
2017-07-01
The generation of 3D models for indoor objects and scenes is an attractive tool for digital city, virtual reality and SLAM purposes. Panoramic images are becoming increasingly common in such applications due to their ability to capture the complete environment in a single image with a large field of view. The extraction and matching of image feature points are important and difficult steps in three-dimensional reconstruction, and ASIFT is a state-of-the-art algorithm for these tasks. Compared with the SIFT algorithm, ASIFT generates more feature points and achieves higher matching accuracy, even for panoramic images with obvious distortions. However, the algorithm is time-consuming because of its complex operations, and it does not perform well for some indoor scenes under poor light or without rich textures. To solve this problem, this paper proposes an improved ASIFT algorithm for indoor panoramic images: firstly, the panoramic images are projected into multiple normal perspective images. Secondly, the original ASIFT algorithm is simplified from the affine transformation of both tilt and rotation to a tilt-only affine transformation. Finally, the results are re-projected into the panoramic image space. Experiments in different environments show that this method not only preserves the precision of feature point extraction and matching, but also greatly reduces the computing time.
Deformation measurements of materials at low temperatures using laser speckle photography method
International Nuclear Information System (INIS)
Sumio Nakahara; Yukihide Maeda; Kazunori Matsumura; Shigeyoshi Hisada; Takeyoshi Fujita; Kiyoshi Sugihara
1992-01-01
The authors observed the deformation of several materials during cooling from room temperature to liquid nitrogen temperature using the laser speckle photography method. The in-plane displacements were measured by image-plane speckle photography and the out-of-plane displacement gradients by defocused speckle photography. The measured in-plane displacements are compared with those of an FEM analysis. The applicability of the laser speckle photography method to cryogenic engineering is also discussed
Analyzing speckle contrast for HiLo microscopy optimization
Mazzaferri, J.; Kunik, D.; Belisle, J. M.; Singh, K.; Lefrançois, S.; Costantino, S.
2011-07-01
HiLo microscopy is a recently developed technique that provides both optical sectioning and fast imaging with a simple implementation at very low cost. The methodology combines widefield and speckle-illuminated images to obtain one optically sectioned image. Hence, the characteristics of the speckle illumination ultimately determine the quality of HiLo images and the overall performance of the method. In this work, we study how speckle contrast influences local variations of fluorescence intensity and brightness profiles of thick samples. We present this article as a guide to adjusting the parameters of the system to optimize the capabilities of this novel technology.
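The local speckle contrast that HiLo processing relies on is simply the ratio of standard deviation to mean over a sliding window. Below is a minimal NumPy sketch; the window size `win` and the reflect padding are illustrative assumptions, and published HiLo implementations differ in these details:

```python
import numpy as np

def local_speckle_contrast(img, win=7):
    """Local speckle contrast C = sigma / mean over a sliding win x win window.

    Hypothetical helper: the contrast definition is standard, but window
    size and normalization vary between implementations.
    """
    img = img.astype(float)
    pad = win // 2
    padded = np.pad(img, pad, mode="reflect")
    # Stack all shifted views of the image; axis 0 then spans the window.
    windows = np.stack([
        padded[i:i + img.shape[0], j:j + img.shape[1]]
        for i in range(win) for j in range(win)
    ])
    mean = windows.mean(axis=0)
    std = windows.std(axis=0)
    return std / np.maximum(mean, 1e-12)
```

For fully developed speckle the contrast is close to 1, while a uniform widefield image gives 0; that difference is what lets HiLo separate in-focus from out-of-focus contributions.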
Etendue invariance in speckle fields
International Nuclear Information System (INIS)
Medina, F.F.; Garcia-Sucerquia, J.; Henao, R.; Trivi, M.
2000-04-01
Experimental evidence is shown that confirms the etendue invariance in speckle fields. Because of this condition, the coherence patch of the speckle field can be significantly greater than the mean size of the speckles, as is shown by double exposure speckle interferometry. (author)
Compressively sampled MR image reconstruction using generalized thresholding iterative algorithm
Elahi, Sana; kaleem, Muhammad; Omer, Hammad
2018-01-01
Compressed sensing (CS) is an emerging area of interest in Magnetic Resonance Imaging (MRI). CS is used for the reconstruction of the images from a very limited number of samples in k-space. This significantly reduces the MRI data acquisition time. One important requirement for signal recovery in CS is the use of an appropriate non-linear reconstruction algorithm. It is a challenging task to choose a reconstruction algorithm that would accurately reconstruct the MR images from the under-sampled k-space data. Various algorithms have been used to solve the system of non-linear equations for better image quality and reconstruction speed in CS. In the recent past, the iterative soft thresholding algorithm (ISTA) has been introduced in CS-MRI. This algorithm directly cancels the incoherent artifacts produced because of the undersampling in k-space. This paper introduces an improved iterative algorithm based on a p-thresholding technique for CS-MRI image reconstruction. The use of the p-thresholding function promotes sparsity in the image, which is a key factor for CS based image reconstruction. The p-thresholding based iterative algorithm is a modification of ISTA, and minimizes non-convex functions. It has been shown that the proposed p-thresholding iterative algorithm can be used effectively to recover a fully sampled image from the under-sampled data in MRI. The performance of the proposed method is verified using simulated and actual MRI data taken at St. Mary's Hospital, London. The quality of the reconstructed images is measured in terms of peak signal-to-noise ratio (PSNR), artifact power (AP), and structural similarity index measure (SSIM). The proposed approach shows improved performance when compared to other iterative algorithms based on log thresholding, soft thresholding and hard thresholding techniques at different reduction factors.
An Improved FCM Medical Image Segmentation Algorithm Based on MMTD
Directory of Open Access Journals (Sweden)
Ningning Zhou
2014-01-01
Full Text Available Image segmentation plays an important role in medical image processing. Fuzzy c-means (FCM) is one of the popular clustering algorithms for medical image segmentation. But FCM is highly vulnerable to noise because it does not consider spatial information in image segmentation. This paper introduces the medium mathematics system, which is employed to process fuzzy information for image segmentation. It establishes a medium similarity measure based on the measure of medium truth degree (MMTD) and uses the correlation between a pixel and its neighbors to define the medium membership function. An improved FCM medical image segmentation algorithm based on MMTD, which takes spatial features into account, is proposed in this paper. The experimental results show that the proposed algorithm is more robust to noise than standard FCM, with more certainty and less fuzziness. This will lead to practical and effective applications in medical image segmentation.
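For reference, the standard FCM iteration that the paper augments with the MMTD spatial term alternates a center update and a membership update. A minimal sketch, with `m` as the usual fuzzifier (the spatial/medium-membership modification is not reproduced):

```python
import numpy as np

def fcm(data, c=2, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy c-means on data of shape (n_samples, n_features).

    Returns (centers, memberships); memberships u has shape (c, n_samples)
    with columns summing to 1.
    """
    rng = np.random.default_rng(seed)
    u = rng.random((c, len(data)))
    u /= u.sum(axis=0)                                  # random fuzzy init
    for _ in range(n_iter):
        um = u ** m
        centers = (um @ data) / um.sum(axis=1, keepdims=True)
        # Distances of every sample to every center, shape (c, n_samples).
        d = np.linalg.norm(data[None, :, :] - centers[:, None, :], axis=2)
        d = np.maximum(d, 1e-12)
        inv = d ** (-2.0 / (m - 1.0))                   # standard FCM update
        u = inv / inv.sum(axis=0)
    return centers, u
```

On grayscale segmentation, `data` would be the pixel intensities reshaped to a column vector, and the membership maps are reshaped back to image size.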
A Constrained Algorithm Based NMFα for Image Representation
Directory of Open Access Journals (Sweden)
Chenxue Yang
2014-01-01
Full Text Available Nonnegative matrix factorization (NMF) is a useful tool for learning a basic representation of image data. However, its performance and applicability in real scenarios are limited by the lack of image information. In this paper, we propose a constrained matrix decomposition algorithm for image representation which contains parameters associated with the characteristics of image data sets. In particular, we impose label information as additional hard constraints on the α-divergence-NMF unsupervised learning algorithm. The resulting algorithm is derived by using the Karush-Kuhn-Tucker (KKT) conditions as well as the projected gradient, and its monotonic local convergence is proved by using auxiliary functions. In addition, we provide a method to select the parameters of our semisupervised matrix decomposition algorithm in the experiments. Compared with state-of-the-art approaches, our method with these parameters has the best classification accuracy on three image data sets.
Thinning an object boundary on digital image using pipelined algorithm
International Nuclear Information System (INIS)
Dewanto, S.; Aliyanta, B.
1997-01-01
In digital image processing, thinning of an object boundary is required to analyze the image structure by measuring parameters such as the area and circumference of the image object. The process needs a sufficiently large memory and is time consuming if all the image pixels are stored in memory and the following step is performed only after all the pixels have been transformed. A pipelined algorithm can reduce the time used in the process. This algorithm uses a buffer memory whose size can be adjusted, and the next thinning step does not need to wait for the transformation of all pixels. This paper describes the pipelined algorithm and presents some results of applying it to digital images
A SAR IMAGE REGISTRATION METHOD BASED ON SIFT ALGORITHM
Directory of Open Access Journals (Sweden)
W. Lu
2017-09-01
Full Text Available In order to improve the stability and rapidity of synthetic aperture radar (SAR) image matching, an effective method is presented. Firstly, adaptive smoothing filtering based on Wallis filtering is employed for image denoising, to avoid amplifying noise in the subsequent processing. Secondly, feature points are extracted by a simplified SIFT algorithm. Finally, exact matching of the images is achieved with these points. Compared with existing methods, it not only maintains the richness of features, but also reduces the noise of the image. The simulation results show that the proposed algorithm can achieve a better matching effect.
Qin, Yi; Wang, Zhipeng; Wang, Hongjuan; Gong, Qiong; Zhou, Nanrun
2018-06-01
The diffractive-imaging-based encryption (DIBE) scheme has aroused wide interest due to its compact architecture and low requirements on experimental conditions. Nevertheless, the primary information can hardly be recovered exactly in real applications when considering the speckle noise and potential occlusion imposed on the ciphertext. To deal with this issue, a customized data container (CDC) is introduced into DIBE and a new phase retrieval algorithm (PRA) for plaintext retrieval is proposed. The PRA, designed according to the peculiarity of the CDC, combines two key techniques from previous approaches, i.e., input-support constraint and median filtering. The proposed scheme can guarantee complete reconstruction of the primary information despite heavy noise or occlusion, and its effectiveness and feasibility have been demonstrated with simulation results.
Brain-inspired algorithms for retinal image analysis
ter Haar Romeny, B.M.; Bekkers, E.J.; Zhang, J.; Abbasi-Sureshjani, S.; Huang, F.; Duits, R.; Dasht Bozorg, Behdad; Berendschot, T.T.J.M.; Smit-Ockeloen, I.; Eppenhof, K.A.J.; Feng, J.; Hannink, J.; Schouten, J.; Tong, M.; Wu, H.; van Triest, J.W.; Zhu, S.; Chen, D.; He, W.; Xu, L.; Han, P.; Kang, Y.
2016-01-01
Retinal image analysis is a challenging problem due to the precise quantification required and the huge numbers of images produced in screening programs. This paper describes a series of innovative brain-inspired algorithms for automated retinal image analysis, recently developed for the RetinaCheck
Chaos-based image encryption algorithm
International Nuclear Information System (INIS)
Guan Zhihong; Huang Fangjun; Guan Wenjie
2005-01-01
In this Letter, a new image encryption scheme is presented, in which shuffling the positions and changing the grey values of image pixels are combined to confuse the relationship between the cipher-image and the plain-image. Firstly, the Arnold cat map is used to shuffle the positions of the image pixels in the spatial domain. Then the discrete output signal of Chen's chaotic system is preprocessed to be suitable for grayscale image encryption, and the shuffled image is encrypted by the preprocessed signal pixel by pixel. The experimental results demonstrate that the key space is large enough to resist brute-force attack and that the distribution of grey values of the encrypted image has random-like behavior
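The position-shuffling stage can be illustrated with the standard Arnold cat map on a square image; the grey-value encryption with the Chen-system keystream is a separate step not shown here:

```python
import numpy as np

def arnold_cat(img, steps=1):
    """Arnold cat map position shuffle for an N x N image:
    (x, y) -> (x + y, x + 2y) mod N, applied `steps` times.
    """
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "cat map needs a square image"
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    out = img
    for _ in range(steps):
        shuffled = np.empty_like(out)
        # Move pixel (x, y) to its image under the cat map.
        shuffled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = shuffled
    return out
```

The map is a permutation of pixel positions and is periodic on any finite grid; on a 4x4 image, for example, three applications restore the original, which is one reason the iteration count must be part of the key.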
Multilevel Image Segmentation Based on an Improved Firefly Algorithm
Directory of Open Access Journals (Sweden)
Kai Chen
2016-01-01
Full Text Available Multilevel image segmentation is time-consuming and involves large computation. The firefly algorithm has been applied to enhance the efficiency of multilevel image segmentation. However, in some cases, the firefly algorithm is easily trapped in local optima. In this paper, an improved firefly algorithm (IFA) is proposed to search multilevel thresholds. In IFA, in order to help fireflies escape from local optima and accelerate convergence, two strategies (i.e., a diversity-enhancing strategy with Cauchy mutation and a neighborhood strategy) are proposed and adaptively chosen according to different stagnation states. The proposed IFA is compared with three benchmark optimization algorithms, that is, Darwinian particle swarm optimization, hybrid differential evolution optimization, and the firefly algorithm. The experimental results show that the proposed method can efficiently segment multilevel images and obtains better performance than the other three methods.
Speckle Reduction and Structure Enhancement by Multichannel Median Boosted Anisotropic Diffusion
Directory of Open Access Journals (Sweden)
Yang Zhi
2004-01-01
Full Text Available We propose a new approach to reduce speckle noise and enhance structures in speckle-corrupted images. It utilizes a median-anisotropic diffusion compound scheme. The median-filter-based reaction term acts as a guided energy source to boost the structures in the image being processed. In addition, it regularizes the diffusion equation to ensure the existence and uniqueness of a solution. We also introduce a decimation and back reconstruction scheme to further enhance the processing result. Before the iteration of the diffusion process, the image is decimated and a subpixel shifted image set is formed. This allows a multichannel parallel diffusion iteration, and more importantly, the speckle noise is broken into impulsive or salt-pepper noise, which is easy to remove by median filtering. The advantage of the proposed technique is clear when it is compared to other diffusion algorithms and the well-known adaptive weighted median filtering (AWMF scheme in both simulation and real medical ultrasound images.
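A minimal sketch of the core idea, a Perona-Malik-style diffusion term plus a median-filter reaction term that pulls each pixel toward its local median, is given below. The parameters `k`, `dt`, `alpha` and the periodic boundaries are illustrative assumptions, and the paper's multichannel decimation scheme is not reproduced:

```python
import numpy as np

def median3(img):
    # 3x3 median filter (reflect-padded), pure NumPy.
    p = np.pad(img, 1, mode="reflect")
    stack = np.stack([p[i:i + img.shape[0], j:j + img.shape[1]]
                      for i in range(3) for j in range(3)])
    return np.median(stack, axis=0)

def median_boosted_diffusion(img, n_iter=20, k=0.1, dt=0.15, alpha=0.2):
    """Diffusion with a median reaction term, in the spirit of the abstract."""
    u = img.astype(float)

    def g(d):
        # Perona-Malik edge-stopping function: small for large gradients.
        return np.exp(-(d / k) ** 2)

    for _ in range(n_iter):
        # Differences to the four neighbours (periodic via np.roll).
        dn = np.roll(u, 1, 0) - u
        ds = np.roll(u, -1, 0) - u
        de = np.roll(u, 1, 1) - u
        dw = np.roll(u, -1, 1) - u
        diffusion = g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw
        reaction = median3(u) - u            # boosts structure, kills impulses
        u = u + dt * diffusion + alpha * reaction
    return u
```

The median reaction term is what removes the salt-and-pepper component that the edge-stopping diffusion alone would preserve.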
A Method for Improving the Progressive Image Coding Algorithms
Directory of Open Access Journals (Sweden)
Ovidiu COSMA
2014-12-01
Full Text Available This article presents a method for increasing the performance of the progressive coding algorithms for the subbands of images, by representing the coefficients with a code that reduces the truncation error.
ProxImaL: efficient image optimization using proximal algorithms
Heide, Felix; Diamond, Steven; Nieß ner, Matthias; Ragan-Kelley, Jonathan; Heidrich, Wolfgang; Wetzstein, Gordon
2016-01-01
ProxImaL is a domain-specific language and compiler for image optimization problems that makes it easy to experiment with different problem formulations and algorithm choices. The language uses proximal operators as the fundamental building blocks of a variety
Comparison of SeaWinds Backscatter Imaging Algorithms
Long, David G.
2017-01-01
This paper compares the performance and tradeoffs of various backscatter imaging algorithms for the SeaWinds scatterometer when multiple passes over a target are available. Reconstruction methods are compared with conventional gridding algorithms. In particular, the performance and tradeoffs in conventional ‘drop in the bucket’ (DIB) gridding at the intrinsic sensor resolution are compared to high-spatial-resolution imaging algorithms such as fine-resolution DIB and the scatterometer image reconstruction (SIR) that generate enhanced-resolution backscatter images. Various options for each algorithm are explored, including considering both linear and dB computation. The effects of sampling density and reconstruction quality versus time are explored. Both simulated and actual data results are considered. The results demonstrate the effectiveness of high-resolution reconstruction using SIR as well as its limitations and the limitations of DIB and fDIB. PMID:28828143
A new modified fast fractal image compression algorithm
DEFF Research Database (Denmark)
Salarian, Mehdi; Nadernejad, Ehsan; MiarNaimi, Hossein
2013-01-01
In this paper, a new fractal image compression algorithm is proposed, in which the time of the encoding process is considerably reduced. The algorithm exploits a domain pool reduction approach, along with the use of innovative predefined values for contrast scaling factor, S, instead of searching...
A. AL-Salhi, Yahya E.; Lu, Songfeng
2016-08-01
Quantum steganography can solve some problems that are inefficient in classical image information concealing, and quantum image information concealing has been widely studied in recent years. Quantum image information concealing can be categorized into quantum image digital blocking, quantum image steganography, anonymity and other branches. Least significant bit (LSB) information concealing plays a vital role in the classical world because many image information concealing algorithms are designed based on it. Firstly, based on the novel enhanced quantum representation (NEQR) and clustering of uniform image blocks, a least significant Qu-block (LSQB) information concealing algorithm for quantum image steganography is presented. Secondly, a clustering algorithm is proposed to optimize the concealment of important data. Finally, the Con-Steg algorithm is used to conceal the clustered image blocks. Information concealing in the Fourier domain of an image can achieve security of the image information; thus we further discuss a Fourier-domain LSQu-block information concealing algorithm for quantum images based on quantum Fourier transforms. In our algorithms, the corresponding unitary transformations are designed to conceal the secret information in the least significant Qu-block representing the color of the quantum cover image. Finally, the procedures for extracting the secret information are illustrated. The quantum image LSQu-block information concealing algorithm can be applied in many fields according to different needs.
International Nuclear Information System (INIS)
Niu Lili; Qian Ming; Yu Wentao; Jin Qiaofeng; Ling Tao; Zheng Hairong; Wan Kun; Gao Shen
2010-01-01
This paper presents a new algorithm for ultrasonic particle image velocimetry (Echo PIV) that improves flow velocity measurement accuracy and efficiency in regions with high velocity gradients. The conventional Echo PIV algorithm has been modified by incorporating a multiple-iteration algorithm, a sub-pixel method, filtering and interpolation, and a spurious vector elimination algorithm. The new algorithm's performance is assessed by analyzing simulated images with known displacements, and ultrasonic B-mode images of in vitro laminar pipe flow, rotational flow and in vivo rat carotid arterial flow. Results on the simulated images show that the new algorithm produces much smaller bias from the known displacements. For laminar flow, the new algorithm results in 1.1% deviation from the analytically derived value, versus 8.8% for the conventional algorithm. The vector quality evaluation for rotational flow imaging shows that the new algorithm produces better velocity vectors. For in vivo rat carotid arterial flow imaging, the results from the new algorithm deviate from the Doppler-measured peak velocities by 6.6% on average, compared to 15% for the conventional algorithm. The new Echo PIV algorithm effectively improves measurement accuracy when imaging flow fields with high velocity gradients.
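The core displacement estimate underlying any PIV algorithm, including Echo PIV, is the location of the cross-correlation peak between two interrogation windows. A minimal FFT-based sketch follows (integer-pixel only; the paper's sub-pixel fit, iteration and vector validation refine this basic step):

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Integer-pixel displacement of win_b relative to win_a via circular
    FFT cross-correlation.
    """
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    # Cross-correlation theorem: c = ifft(conj(F(a)) * F(b)).
    corr = np.real(np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map FFT bin indices to signed shifts.
    shifts = [int(p) if p <= s // 2 else int(p) - s
              for p, s in zip(peak, corr.shape)]
    return tuple(shifts)
```

In a full PIV pipeline this is applied per interrogation window across the frame pair, and the sub-pixel peak position is then estimated by fitting a Gaussian or parabola around the correlation maximum.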
A Modified Image Comparison Algorithm Using Histogram Features
Al-Oraiqat, Anas M.; Kostyukova, Natalya S.
2018-01-01
This article discusses the problem of color image content comparison. In particular, methods of image content comparison are analyzed, restrictions of color histograms are described, and a modified method of image content comparison is proposed. This method uses color histograms and considers color locations. Testing and analysis of the base and modified algorithms are performed. The modified method shows 97% average precision for a collection containing about 700 images without loss of the adv...
Digital Image Encryption Algorithm Design Based on Genetic Hyperchaos
Directory of Open Access Journals (Sweden)
Jian Wang
2016-01-01
Full Text Available In view of the fact that present chaotic image encryption algorithms based on scrambling (diffusion) are vulnerable to chosen-plaintext (chosen-ciphertext) attacks in the process of pixel position scrambling, we put forward an image encryption algorithm based on a genetic hyperchaotic system. By introducing plaintext feedback into the scrambling process, the algorithm makes the scrambling effect depend on the initial chaotic sequence and on the plaintext itself, achieving an organic fusion of image features and the encryption algorithm. By introducing the plaintext feedback mechanism into the diffusion process, it improves plaintext sensitivity and resistance to chosen-plaintext and chosen-ciphertext attacks. At the same time, it also makes full use of the characteristics of the image information. Finally, experimental simulation and theoretical analysis show that our proposed algorithm can not only effectively resist chosen-plaintext (chosen-ciphertext) attacks, statistical attacks, and information entropy attacks, but can also effectively improve the efficiency of image encryption, making it a relatively secure and effective approach to image communication.
Reconstruction Algorithms in Undersampled AFM Imaging
DEFF Research Database (Denmark)
Arildsen, Thomas; Oxvig, Christian Schou; Pedersen, Patrick Steffen
2016-01-01
This paper provides a study of spatial undersampling in atomic force microscopy (AFM) imaging followed by different image reconstruction techniques based on sparse approximation as well as interpolation. The main reasons for using undersampling are that it reduces the path length and thereby the scanning time as well as the amount of interaction between the AFM probe and the specimen. It can easily be applied on conventional AFM hardware. Due to undersampling, it is then necessary to further process the acquired image in order to reconstruct an approximation of the image. Based on real AFM cell images, our simulations reveal that using a simple raster scanning pattern in combination with conventional image interpolation performs very well. Moreover, this combination enables a reduction by a factor 10 of the scanning time while retaining an average reconstruction quality around 36 dB PSNR
Fast parallel algorithm for CT image reconstruction.
Flores, Liubov A; Vidal, Vicent; Mayo, Patricia; Rodenas, Francisco; Verdú, Gumersindo
2012-01-01
In X-ray computed tomography (CT), X-rays are used to obtain the projection data needed to generate an image of the inside of an object. The image can be generated with different techniques. Iterative methods are more suitable for the reconstruction of images with high contrast and precision in noisy conditions and from a small number of projections. Their use may be important in portable scanners for their functionality in emergency situations. However, in practice, these methods are not widely used due to the high computational cost of their implementation. In this work we analyze iterative parallel image reconstruction with the Portable, Extensible Toolkit for Scientific Computation (PETSc).
Development of information preserving data compression algorithm for CT images
International Nuclear Information System (INIS)
Kobayashi, Yoshio
1989-01-01
Although digital imaging techniques in radiology are developing rapidly, problems arise in the archival storage and communication of image data. This paper reports on a new information-preserving data compression algorithm for computed tomographic (CT) images. This algorithm consists of the following five processes: 1. Pixels surrounding the human body showing CT values smaller than -900 H.U. are eliminated. 2. Each pixel is encoded by its numerical difference from its neighboring pixel along a matrix line. 3. Difference values are encoded by a newly designed code rather than the natural binary code. 4. Image data obtained with the above process are decomposed into bit planes. 5. The bit state transitions in each bit plane are encoded by run length coding. Using this new algorithm, the compression ratios of brain, chest, and abdomen CT images are 4.49, 4.34, and 4.40, respectively. (author)
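Steps 2 and 5 of the algorithm are easy to illustrate. The sketch below shows neighbor differencing along matrix lines and run-length coding of a bit plane; the specially designed difference code of step 3 is not reproduced:

```python
import numpy as np

def row_difference(img):
    """Step 2: encode each pixel as the difference from its left neighbour
    along the matrix line; the first column is kept as-is, so the
    transform is exactly invertible by a cumulative sum.
    """
    out = img.astype(np.int32).copy()
    out[:, 1:] = img[:, 1:].astype(np.int32) - img[:, :-1].astype(np.int32)
    return out

def run_length(bits):
    """Step 5: run-length code of the state transitions in one bit plane,
    returned as (value, run_length) pairs.
    """
    flat = np.asarray(bits).ravel()
    change = np.flatnonzero(np.diff(flat)) + 1       # indices where runs start
    starts = np.concatenate(([0], change))
    ends = np.concatenate((change, [len(flat)]))
    return [(int(flat[s]), int(e - s)) for s, e in zip(starts, ends)]
```

Differencing concentrates most values near zero, which is what makes the subsequent bit-plane run-length coding effective on smooth CT data.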
Successive approximation algorithm for cancellation of artifacts in DSA images
International Nuclear Information System (INIS)
Funakami, Raiko; Hiroshima, Kyoichi; Nishino, Junji
2000-01-01
In this paper, we propose an algorithm for cancellation of artifacts in DSA images. We have already proposed an automatic registration method based on the detection of local movements. When motion of the object is large, it is difficult to estimate the exact movement, and the cancellation of artifacts may therefore fail. The algorithm we propose here is based on a simple rigid model. We present the results of applying the proposed method to a series of experimental X-ray images, as well as the results of applying the algorithm as preprocessing for a registration method based on local movement. (author)
A Novel Image Encryption Algorithm Based on DNA Subsequence Operation
Directory of Open Access Journals (Sweden)
Qiang Zhang
2012-01-01
Full Text Available We present a novel image encryption algorithm based on DNA subsequence operations. Unlike traditional DNA encryption methods, our algorithm does not use complex biological operations, but just uses the idea of DNA subsequence operations (such as the elongation operation, truncation operation, deletion operation, etc.) combined with the logistic chaotic map to scramble the locations and values of pixel points in the image. The experimental results and security analysis show that the proposed algorithm is easy to implement, achieves a good encryption effect, has a large key space and strong sensitivity to the secret key, and is able to resist exhaustive and statistical attacks.
Image processing algorithm for robot tracking in reactor vessel
International Nuclear Information System (INIS)
Kim, Tae Won; Choi, Young Soo; Lee, Sung Uk; Jeong, Kyung Min; Kim, Nam Kyun
2011-01-01
In this paper, we propose an image processing algorithm to find the position of an underwater robot in the reactor vessel. The proposed algorithm is composed of a modified SURF (Speeded Up Robust Features) method based on Mean-Shift and CAMSHIFT (Continuously Adaptive Mean Shift), a color-based tracking algorithm. Noise filtering using a luminosity blend method and color clipping are applied as preprocessing. The initial tracking area for CAMSHIFT is determined using the modified SURF, and the contour and corner points are then extracted in the target area tracked by the CAMSHIFT method. Experiments were performed on a reactor vessel mockup and verified the algorithm's use for robot control by visual tracking
Optimisation of centroiding algorithms for photon event counting imaging
International Nuclear Information System (INIS)
Suhling, K.; Airey, R.W.; Morgan, B.L.
1999-01-01
Approaches to photon event counting imaging in which the output events of an image intensifier are located using a centroiding technique have long been plagued by fixed pattern noise in which a grid of dimensions similar to those of the CCD pixels is superimposed on the image. This is caused by a mismatch between the photon event shape and the centroiding algorithm. We have used hyperbolic cosine, Gaussian, Lorentzian, parabolic as well as 3-, 5-, and 7-point centre of gravity algorithms, and hybrids thereof, to assess means of minimising this fixed pattern noise. We show that fixed pattern noise generated by the widely used centre of gravity centroiding is due to intrinsic features of the algorithm. Our results confirm that the recently proposed use of Gaussian centroiding does indeed show a significant reduction of fixed pattern noise compared to centre of gravity centroiding (Michel et al., Mon. Not. R. Astron. Soc. 292 (1997) 611-620). However, the disadvantage of a Gaussian algorithm is a centroiding failure for small pulses, caused by a division by zero, which leads to a loss of detective quantum efficiency (DQE) and to small amounts of residual fixed pattern noise. Using both real data from an image intensifier system employing a progressive scan camera, framegrabber and PC, and also synthetic data from Monte-Carlo simulations, we find that hybrid centroiding algorithms can reduce the fixed pattern noise without loss of resolution or loss of DQE. Imaging a test pattern to assess the features of the different algorithms shows that a hybrid of Gaussian and 3-point centre of gravity centroiding algorithms results in an optimum combination of low fixed pattern noise (lower than a simple Gaussian), high DQE, and high resolution. The Lorentzian algorithm gives the worst results in terms of high fixed pattern noise and low resolution, and the Gaussian and hyperbolic cosine algorithms have the lowest DQEs
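The two centroiding families compared here can be sketched directly. The centre-of-gravity estimator works on the whole event window, while the Gaussian estimator fits a log-parabola through the three samples around a 1-D peak and, as the abstract notes, fails when a flank sample is zero:

```python
import numpy as np

def cog_centroid(spot):
    """Plain centre-of-gravity centroid of a small event window, the
    estimator shown to produce fixed pattern noise.
    """
    y, x = np.indices(spot.shape)
    total = spot.sum()
    return (y * spot).sum() / total, (x * spot).sum() / total

def gaussian_centroid_1d(profile):
    """Gaussian three-point centroid: a parabola fit in log space around
    the peak. Exact for a pure Gaussian pulse; raises/overflows when a
    flank sample is zero (the small-pulse failure mode).
    """
    i = int(np.argmax(profile))
    lnm, ln0, lnp = np.log(profile[i - 1:i + 2])
    return i + 0.5 * (lnm - lnp) / (lnm - 2 * ln0 + lnp)
```

For a sampled Gaussian pulse the three-point log-parabola recovers the sub-pixel peak position exactly, which is why it suppresses the fixed pattern noise that the mismatched centre-of-gravity estimator introduces.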
Convergence of iterative image reconstruction algorithms for Digital Breast Tomosynthesis
DEFF Research Database (Denmark)
Sidky, Emil; Jørgensen, Jakob Heide; Pan, Xiaochuan
2012-01-01
Most iterative image reconstruction algorithms are based on some form of optimization, such as minimization of a data-fidelity term plus an image regularizing penalty term. While achieving the solution of these optimization problems may not directly be clinically relevant, accurate optimization s...
A Fuzzy Homomorphic Algorithm for Image Enhancement | Nnolim ...
African Journals Online (AJOL)
The implementation and analysis of a novel Fuzzy Homomorphic image enhancement technique is presented. The technique combines the logarithmic transform with fuzzy membership functions to deliver an intuitive method of image enhancement. This algorithm reduces the computational complexity by eliminating the ...
Evaluation of Underwater Image Enhancement Algorithms under Different Environmental Conditions
Directory of Open Access Journals (Sweden)
Marino Mangeruga
2018-01-01
Full Text Available Underwater images usually suffer from poor visibility, lack of contrast and colour casting, mainly due to light absorption and scattering. In the literature, there are many algorithms aimed at enhancing the quality of underwater images through different approaches. Our purpose was to identify an algorithm that performs well under different environmental conditions. We selected some algorithms from the state of the art and employed them to enhance a dataset of images produced in various underwater sites, representing different environmental and illumination conditions. These enhanced images were evaluated through quantitative metrics. By analysing the results of these metrics, we tried to understand which of the selected algorithms performed better than the others. Another purpose of our research was to establish whether a quantitative metric is enough to judge the behaviour of an underwater image enhancement algorithm. We aim to demonstrate that, even if the metrics can provide an indicative estimation of image quality, they can lead to inconsistent or erroneous evaluations.
FACT. New image parameters based on the watershed-algorithm
Energy Technology Data Exchange (ETDEWEB)
Linhoff, Lena; Bruegge, Kai Arno; Buss, Jens [TU Dortmund (Germany). Experimentelle Physik 5b; Collaboration: FACT-Collaboration
2016-07-01
FACT, the First G-APD Cherenkov Telescope, is the first imaging atmospheric Cherenkov telescope that uses Geiger-mode avalanche photodiodes (G-APDs) as photo sensors. The raw data produced by this telescope are processed in an analysis chain, which leads to a classification of the primary particle that induces a shower and to an estimation of its energy. One important step in this analysis chain is the extraction of parameters from shower images. New parameters are computed by applying a watershed algorithm to the camera image. Perceiving the brightness of a pixel as height, a set of pixels can be seen as a 'landscape' with hills and valleys. A watershed algorithm groups all pixels belonging to the same hill into a cluster. From the resulting segmented image, one can derive new parameters for later analysis steps, e.g. the number of clusters, their shapes and their contained photon charge. For FACT data, the FellWalker algorithm was chosen from the class of watershed algorithms because it was designed to work on discrete distributions, in this case the pixels of a camera image. The FellWalker algorithm is implemented in FACT-tools, which provides the low-level analysis framework for FACT. This talk will focus on the computation of new, FellWalker-based image parameters, which can be used for gamma-hadron separation. Additionally, their distributions for real and Monte Carlo data are compared.
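The hill-climbing idea behind this family of watershed algorithms can be sketched in a few lines: treat brightness as height and let every pixel walk uphill to its peak, grouping pixels by the peak they reach. This is an illustrative simplification on a dense 2D grid, not the FACT-tools FellWalker implementation (which works on the telescope's hexagonal pixel layout); the function name is an assumption.

```python
import numpy as np

def fellwalker_clusters(img):
    """Assign each pixel to the local maximum reached by steepest ascent.

    Simplified FellWalker-style segmentation: brightness is treated as
    height and every pixel "walks" uphill (to the largest value in its
    3x3 neighbourhood) until it reaches a peak; all pixels arriving at
    the same peak form one cluster.
    """
    h, w = img.shape
    labels = -np.ones((h, w), dtype=int)
    peaks = {}                                  # peak coordinate -> cluster id
    for i in range(h):
        for j in range(w):
            path, (y, x) = [], (i, j)
            while labels[y, x] == -1:
                path.append((y, x))
                ys = slice(max(y - 1, 0), min(y + 2, h))
                xs = slice(max(x - 1, 0), min(x + 2, w))
                window = img[ys, xs]
                dy, dx = np.unravel_index(np.argmax(window), window.shape)
                ny, nx = ys.start + dy, xs.start + dx
                if (ny, nx) == (y, x):          # local maximum reached
                    peaks.setdefault((y, x), len(peaks))
                    labels[y, x] = peaks[(y, x)]
                    break
                y, x = ny, nx
            for py, px in path:                 # label the whole walked path
                labels[py, px] = labels[y, x]
    return labels
```

Derived parameters such as the number of clusters then follow directly, e.g. `len(np.unique(labels))`.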
Chien-Ching Ma; Ching-Yuan Chang
2013-07-01
Interferometry provides a high degree of accuracy in the measurement of sub-micrometer deformations; however, the noise associated with experimental measurement undermines the integrity of interference fringes. This study proposes the use of standard deviation in the temporal domain to improve the image quality of patterns obtained from temporal speckle pattern interferometry. The proposed method combines the advantages of both the mean and subtractive methods to remove background noise and ambient disturbance simultaneously, resulting in high-resolution images of excellent quality. The out-of-plane vibration of a thin piezoelectric plate is the main focus of this study, providing information useful to the development of energy harvesters. First, ten resonant states were measured using the proposed method, and both mode shape and resonant frequency were investigated. We then rebuilt the phase distribution of the first resonant mode based on the clear interference patterns obtained using the proposed method. This revealed instantaneous deformations in the dynamic characteristics of the resonant state. The proposed method also provides a frequency-sweeping function, facilitating its practical application in the precise measurement of resonant frequency. In addition, the mode shapes and resonant frequencies obtained using the proposed method were recorded and compared with results obtained using the finite element method and laser Doppler vibrometry, which demonstrated close agreement.
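The core of the temporal standard-deviation idea can be made concrete in one line: pixels whose intensity is modulated by vibration have a large standard deviation along the time axis, while static background and constant ambient bias cancel out. A minimal sketch (the frame-stack layout and function name are assumptions, not the paper's notation):

```python
import numpy as np

def temporal_std_fringes(frames):
    """Fringe pattern from the standard deviation along the time axis.

    `frames` has shape (n_frames, height, width). A pixel modulated as
    bias + A*cos(w*t) has temporal std A/sqrt(2) over whole periods,
    independent of the bias, so background and slow ambient drift are
    suppressed without explicit subtraction.
    """
    return np.std(np.asarray(frames, dtype=float), axis=0)
```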
Namgung, Bumseok; Ng, Yan Cheng; Nam, Jeonghun; Leo, Hwa Liang; Kim, Sangho
2015-01-01
This study examined the effect of dextran-induced RBC aggregation on venular flow in the microvasculature. We utilized laser speckle contrast imaging (LSCI) as a wide-field imaging technique to visualize the flow distribution in venules influenced by abnormally elevated levels of RBC aggregation at a network-scale level, which was unprecedented in previous studies. RBC aggregation in rats was induced by infusing Dextran 500. To elucidate the impact of RBC aggregation on microvascular perfusion, blood flow in the venular network of a rat cremaster muscle was analyzed with a stepwise reduction of the arterial pressure (100 → 30 mmHg). The LSCI analysis revealed a substantial decrease in the functional vascular density after the infusion of dextran. The relative decrease in flow velocity after dextran infusion was notably pronounced at low arterial pressures. Whole blood viscosity measurements implied that the reduction in venular flow with dextran infusion could be due to the elevation of medium viscosity in high shear conditions (>45 s⁻¹). In contrast, further augmentation of the flow reduction at low arterial pressures could be attributed to the formation of RBC aggregates (networks).
Automatic brightness control algorithms and their effect on fluoroscopic imaging
International Nuclear Information System (INIS)
Quinn, P.W.; Gagne, R.M.
1989-01-01
This paper reports a computer model used to investigate the effect on dose and image quality of three automatic brightness control (ABC) algorithms used in the imaging of barium during general-purpose fluoroscopy. A model incorporating all aspects of image formation, i.e., x-ray production, phantom attenuation, and energy absorption in the CsI phosphor, was driven according to each ABC algorithm as a function of patient thickness. The energy absorbed in the phosphor was kept constant, while the changes in exposure, integral dose, organ dose, and contrast were monitored.
Image segmentation algorithm based on T-junctions cues
Qian, Yanyu; Cao, Fengyun; Wang, Lu; Yang, Xuejie
2016-03-01
To reduce the over-segmentation and over-merging produced by single-cue image segmentation algorithms, a novel approach combining a graph-based algorithm with T-junction cues is proposed in this paper. First, L0 gradient minimization is applied to smooth the target image and eliminate artifacts caused by noise and texture detail; then an initial over-segmentation of the smoothed image is obtained using the graph-based algorithm; finally, the result is refined via a region-fusion strategy guided by T-junction cues. Experimental results on a variety of images verify the new approach's efficiency in eliminating artifacts caused by noise, and both segmentation accuracy and time complexity are significantly improved.
Regularization iteration imaging algorithm for electrical capacitance tomography
Tong, Guowei; Liu, Shi; Chen, Hongyan; Wang, Xueyao
2018-03-01
The image reconstruction method plays a crucial role in real-world applications of the electrical capacitance tomography technique. In this study, a new cost function that simultaneously considers the sparsity and low-rank properties of the imaging targets is proposed to improve the quality of the reconstruction images, in which the image reconstruction task is converted into an optimization problem. Within the framework of the split Bregman algorithm, an iterative scheme that splits a complicated optimization problem into several simpler sub-tasks is developed to solve the proposed cost function efficiently, in which the fast-iterative shrinkage thresholding algorithm is introduced to accelerate the convergence. Numerical experiment results verify the effectiveness of the proposed algorithm in improving the reconstruction precision and robustness.
Directory of Open Access Journals (Sweden)
Jing Yu
2016-08-01
Full Text Available Objective: To study the value of 3-dimensional speckle tracking imaging (3D-STI) for assessing early changes of left ventricular longitudinal systolic function in patients with primary hypertension. Methods: Patients with primary hypertension treated in our hospital from May 2012 to October 2015 were selected; according to Ganau typing, 40 patients with left ventricular normal (LVN) geometry and 40 patients with left ventricular remodeling (LVR) were screened and enrolled in the LVN and LVR groups respectively, and 40 healthy volunteers who received physical examination in our hospital during the same period were selected as the control group. Ultrasonic testing was conducted to determine conventional ultrasonic indicators and 3D-STI parameters, and serum was collected to determine AngII, ALD, TGF-β1 and Ang1-7 levels. Results: LVEDd, LVPWT and LVEF of the LVN group were not significantly different from those of the control group; LVEF of the LVR group was not significantly different from that of the LVN and control groups, while LVEDd and LVPWT of the LVR group were significantly higher than those of the LVN and control groups. Absolute values of GLS, GCS, GRS and GAS as well as serum Ang1-7 levels of the LVN group were significantly lower than those of the control group, and serum AngII, ALD and TGF-β1 levels were higher; absolute values of GLS, GCS, GRS and GAS as well as serum Ang1-7 levels of the LVR group were significantly lower than those of the LVN and control groups, and serum AngII, ALD and TGF-β1 levels were higher. Absolute values of GLS, GCS, GRS and GAS were negatively correlated with serum AngII, ALD and TGF-β1 levels, and positively correlated with serum Ang1-7 levels. Conclusion: 3-dimensional speckle tracking imaging can be used for early evaluation of left ventricular longitudinal systolic function in patients with primary hypertension.
Anisotropic conductivity imaging with MREIT using equipotential projection algorithm
Energy Technology Data Exchange (ETDEWEB)
Degirmenci, Evren [Department of Electrical and Electronics Engineering, Mersin University, Mersin (Turkey); Eyueboglu, B Murat [Department of Electrical and Electronics Engineering, Middle East Technical University, 06531, Ankara (Turkey)
2007-12-21
Magnetic resonance electrical impedance tomography (MREIT) combines magnetic flux or current density measurements obtained by magnetic resonance imaging (MRI) and surface potential measurements to reconstruct images of true conductivity with high spatial resolution. Most of the biological tissues have anisotropic conductivity; therefore, anisotropy should be taken into account in conductivity image reconstruction. Almost all of the MREIT reconstruction algorithms proposed to date assume isotropic conductivity distribution. In this study, a novel MREIT image reconstruction algorithm is proposed to image anisotropic conductivity. Relative anisotropic conductivity values are reconstructed iteratively, using only current density measurements without any potential measurement. In order to obtain true conductivity values, only either one potential or conductivity measurement is sufficient to determine a scaling factor. The proposed technique is evaluated on simulated data for isotropic and anisotropic conductivity distributions, with and without measurement noise. Simulation results show that the images of both anisotropic and isotropic conductivity distributions can be reconstructed successfully.
Speckle Noise Reduction via Nonconvex High Total Variation Approach
Directory of Open Access Journals (Sweden)
Yulian Wu
2015-01-01
Full Text Available We address the problem of speckle noise removal. The classical total variation is extensively used in this field, but it suffers from staircase-like artifacts and the loss of image details. To resolve these problems, a nonconvex total generalized variation (TGV) regularization is used to preserve both the edges and the details of the images. TGV regularization, which is able to remove the staircase effect, has strong theoretical guarantees by virtue of its higher-order smoothness. Our method combines the merits of both the TGV method and the nonconvex variational method and avoids their main drawbacks. Furthermore, we develop an efficient algorithm for solving the nonconvex TGV-based optimization problem. We experimentally demonstrate the excellent performance of the technique, both visually and quantitatively.
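To make the variational idea concrete, here is a sketch of the simpler first-order, smoothed total variation model (min over u of 0.5*||u - f||^2 + lam * sum of sqrt(|grad u|^2 + eps^2)) solved by gradient descent. This is only the baseline the abstract contrasts against: the paper's nonconvex TGV additionally penalizes second-order differences to avoid staircasing, and the boundary handling below (periodic in the divergence, for brevity) is an assumption.

```python
import numpy as np

def tv_denoise(f, lam=0.1, eps=0.1, n_iter=200, step=0.1):
    """Gradient descent on a smoothed first-order TV denoising model.

    The TV gradient is -div(grad u / |grad u|_eps); eps smooths the
    non-differentiable point so plain gradient descent applies.
    """
    u = f.copy()
    for _ in range(n_iter):
        gx = np.diff(u, axis=1, append=u[:, -1:])   # forward differences
        gy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(gx ** 2 + gy ** 2 + eps ** 2)
        px, py = gx / mag, gy / mag
        # divergence of (px, py); periodic boundary via roll for brevity
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u -= step * ((u - f) - lam * div)
    return u
```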
A comparative study of image low level feature extraction algorithms
Directory of Open Access Journals (Sweden)
M.M. El-gayar
2013-07-01
Full Text Available Feature extraction and matching are at the base of many computer vision problems, such as object recognition or structure from motion. Current methods for assessing the performance of popular image matching algorithms are presented and rely on costly descriptors for detection and matching. Specifically, the method assesses the type of images under which each of the algorithms reviewed herein performs at its highest efficiency. Efficiency is measured in terms of the number of matches found by the algorithm and the number of type I and type II errors encountered when the algorithm is tested against a specific pair of images. Current comparative studies assess the performance of the algorithms based on results obtained for different criteria such as speed, sensitivity, occlusion, and others. This study addresses the limitations of the existing comparative tools and delivers a generalized criterion to determine beforehand the level of efficiency expected from a matching algorithm given the type of images evaluated. The algorithms and the respective images used within this work are divided into two groups: feature-based and texture-based. From this broad classification, the most widely used algorithms are assessed: color histogram, FAST (Features from Accelerated Segment Test), SIFT (Scale Invariant Feature Transform), PCA-SIFT (Principal Component Analysis SIFT), F-SIFT (fast SIFT) and SURF (Speeded Up Robust Features). The Fast-SIFT (F-SIFT) feature detection method is compared under scale changes, rotation, blur, illumination changes and affine transformations. All the experiments use the repeatability measurement and the number of correct matches as evaluation measures. SIFT is stable in most situations although it is slow; F-SIFT is the fastest with good performance comparable to SURF, while SIFT and PCA-SIFT show advantages under rotation and illumination changes.
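The matching stage such studies benchmark can be sketched as brute-force nearest-neighbour descriptor matching with Lowe's ratio test; ambiguous matches rejected by the ratio test are a typical source of type I errors (false matches). This is a generic illustration, not the evaluation code of the paper:

```python
import numpy as np

def match_descriptors(d1, d2, ratio=0.8):
    """Brute-force nearest-neighbour matching with Lowe's ratio test.

    For each descriptor in d1 the two closest descriptors in d2 are
    found; the match is kept only when the best distance is clearly
    smaller than the second best. Counting kept matches against a known
    ground truth gives the "correct matches" evaluation measure.
    """
    matches = []
    for i, v in enumerate(d1):
        dist = np.linalg.norm(d2 - v, axis=1)
        j, k = np.argsort(dist)[:2]
        if dist[j] < ratio * dist[k]:
            matches.append((i, j))
    return matches
```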
Experience with CANDID: Comparison algorithm for navigating digital image databases
Energy Technology Data Exchange (ETDEWEB)
Kelly, P.; Cannon, M.
1994-10-01
This paper presents results from the authors experience with CANDID (Comparison Algorithm for Navigating Digital Image Databases), which was designed to facilitate image retrieval by content using a query-by-example methodology. A global signature describing the texture, shape, or color content is first computed for every image stored in a database, and a normalized similarity measure between probability density functions of feature vectors is used to match signatures. This method can be used to retrieve images from a database that are similar to a user-provided example image. Results for three test applications are included.
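The signature-and-similarity pipeline described above can be sketched as follows. The gray-level histogram signature and the Bhattacharyya coefficient used here are illustrative stand-ins: CANDID's actual texture/shape/color signatures and its particular normalized similarity measure between probability density functions are not spelled out in the abstract.

```python
import numpy as np

def signature(img, bins=16):
    """Global gray-level signature: a normalized histogram (a discrete PDF)."""
    h, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    return h / h.sum()

def similarity(p, q):
    """Bhattacharyya coefficient between two signatures, in [0, 1].

    1.0 means identical distributions, 0.0 means disjoint support; a
    query-by-example search ranks database images by this score against
    the signature of the user-provided example image.
    """
    return float(np.sum(np.sqrt(p * q)))
```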
Restoration algorithms for imaging through atmospheric turbulence
2017-02-18
temporal vector fields. Let us denote by f(i, j, n) the input sequence of N frames, where i, j are the 2D spatial variables and n is the frame number.
Parallel Algorithm for Reconstruction of TAC Images
International Nuclear Information System (INIS)
Vidal Gimeno, V.
2012-01-01
The algebraic reconstruction methods are based on solving a system of linear equations. In a previous study, the PETSc library was used and shown to be a scientific computing tool that facilitates and enables the optimal use of a computer system in the image reconstruction process.
Lagrangian speckle model and tissue-motion estimation--theory.
Maurice, R L; Bertrand, M
1999-07-01
It is known that when a tissue is subjected to movements such as rotation, shearing, scaling, etc., the resulting changes in speckle patterns act as a noise source, often responsible for most of the displacement-estimate variance. From a modeling point of view, these changes can be thought of as resulting from two mechanisms: one is the motion of the speckles and the other, the alterations of their morphology. In this paper, we propose a new tissue-motion estimator to counteract these speckle decorrelation effects. The estimator is based on a Lagrangian description of the speckle motion. This description allows us to follow local characteristics of the speckle field as if they were a material property. This method leads to an analytical description of the decorrelation in a way which enables the derivation of an appropriate inverse filter for speckle restoration. The filter is appropriate for linear geometrical transformation of the scattering function (LT), i.e., a constant-strain region of interest (ROI). As the LT itself is a parameter of the filter, a tissue-motion estimator can be formulated as a nonlinear minimization problem, seeking the best match between the pre-tissue-motion image and a restored-speckle post-motion image. The method is tested using simulated radio-frequency (RF) images of tissue undergoing axial shear.
Imaging for dismantlement verification: Information management and analysis algorithms
International Nuclear Information System (INIS)
Robinson, S.M.; Jarman, K.D.; Pitts, W.K.; Seifert, A.; Misner, A.C.; Woodring, M.L.; Myjak, M.J.
2012-01-01
The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute, which must be non-sensitive to be acceptable in an Information Barrier regime. However, this process must be performed with care. Features like the perimeter, area, and intensity of an object, for example, might reveal sensitive information. Any data-reduction technique must provide sufficient information to discriminate a real object from a spoofed or incorrect one, while avoiding disclosure (or storage) of any sensitive object qualities. Ultimately, algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We discuss the utility of imaging for arms control applications and present three image-based verification algorithms in this context. The algorithms reduce full image information to non-sensitive feature information, in a process that is intended to enable verification while eliminating the possibility of image reconstruction. The underlying images can be highly detailed, since they are dynamically generated behind an information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography. We study these algorithms in terms of technical performance in image analysis and application to an information barrier scheme.
An Improved Recovery Algorithm for Decayed AES Key Schedule Images
Tsow, Alex
A practical algorithm that recovers AES key schedules from decayed memory images is presented. Halderman et al. [1] established this recovery capability, dubbed the cold-boot attack, as a serious vulnerability for several widespread software-based encryption packages. Our algorithm recovers AES-128 key schedules tens of millions of times faster than the original proof-of-concept release. In practice, it enables reliable recovery of key schedules at 70% decay, well over twice the decay capacity of previous methods. The algorithm is generalized to AES-256 and is empirically shown to recover 256-bit key schedules that have suffered 65% decay. When solutions are unique, the algorithm efficiently validates this property and outputs the solution for memory images decayed up to 60%.
Otsu Based Optimal Multilevel Image Thresholding Using Firefly Algorithm
Directory of Open Access Journals (Sweden)
N. Sri Madhava Raja
2014-01-01
Full Text Available A histogram-based multilevel thresholding approach is proposed using a Brownian distribution (BD) guided firefly algorithm (FA). A bounded search technique is also presented to improve the optimization accuracy with fewer search iterations. Otsu's between-class variance function is maximized to obtain optimal threshold levels for gray-scale images. The performance of the proposed algorithm is demonstrated on twelve benchmark images and compared with existing FA variants such as Lévy flight (LF) guided FA and random operator guided FA. The performance comparison between the proposed and existing firefly algorithms is carried out using prevailing parameters such as the objective function, standard deviation, peak signal-to-noise ratio (PSNR), structural similarity (SSIM) index, and CPU search time. The results show that BD guided FA provides a better objective function, PSNR, and SSIM, whereas LF based FA provides faster convergence with relatively lower CPU time.
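Otsu's between-class variance, the fitness function such swarm methods maximize, is cheap to state and evaluate. The sketch below shows the single-threshold case solved by exhaustive search; a firefly swarm becomes attractive when several thresholds must be found jointly and exhaustive search explodes combinatorially. Function names are assumptions.

```python
import numpy as np

def otsu_objective(hist, t):
    """Between-class variance for threshold t: w0*w1*(mu0 - mu1)^2."""
    p = hist / hist.sum()
    w0, w1 = p[:t].sum(), p[t:].sum()
    if w0 == 0 or w1 == 0:
        return 0.0
    i = np.arange(len(p))
    mu0 = (i[:t] * p[:t]).sum() / w0
    mu1 = (i[t:] * p[t:]).sum() / w1
    return w0 * w1 * (mu0 - mu1) ** 2

def otsu_threshold(img, levels=256):
    """Exhaustive search over all thresholds for the maximum objective.

    A firefly swarm would instead sample this same objective at a few
    candidate positions per iteration, which is what keeps multilevel
    thresholding tractable.
    """
    hist, _ = np.histogram(img, bins=levels, range=(0, levels))
    return max(range(1, levels), key=lambda t: otsu_objective(hist, t))
```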
An AK-LDMeans algorithm based on image clustering
Chen, Huimin; Li, Xingwei; Zhang, Yongbin; Chen, Nan
2018-03-01
Clustering is an effective analytical technique for mining value from unlabelled data; its ultimate goal is to mark unclassified data quickly and correctly. We use road-map images from current image processing as the experimental background. In this paper, we propose an AK-LDMeans algorithm that automatically locks the K value by designing the K-cost fold line, and then uses a long-distance, high-density method to select the clustering centers, replacing the traditional initial-center selection method and further improving the efficiency and accuracy of the traditional K-means algorithm. The experimental results are compared with those of current clustering algorithms. The algorithm can provide an effective reference in the fields of image processing, machine vision and data mining.
A novel algorithm for segmentation of brain MR images
International Nuclear Information System (INIS)
Sial, M.Y.; Yu, L.; Chowdhry, B.S.; Rajput, A.Q.K.; Bhatti, M.I.
2006-01-01
Accurate and fully automatic segmentation of the brain from magnetic resonance (MR) scans is a challenging problem that has received an enormous amount of attention lately. Many researchers have applied various techniques; however, the standard fuzzy c-means algorithm has produced better results than other methods. In this paper, we present a modified fuzzy c-means (FCM) based algorithm for segmentation of brain MR images. Our algorithm is formulated by modifying the objective function of the standard FCM and uses a special spread method to obtain a smooth and slowly varying bias field. This method has the advantage that it can be applied at an early stage in an automated data analysis, before a tissue model is available. The results on MRI images show that this method provides better results than standard FCM algorithms. (author)
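For reference, the unmodified fuzzy c-means baseline that such papers build on alternates two closed-form updates: memberships from inverse distances, then centroids from membership-weighted means. The sketch below works on a 1-D intensity array; the paper's bias-field term and spread method are not reproduced, and the function name is an assumption.

```python
import numpy as np

def fcm(x, c=3, m=2.0, n_iter=50, seed=0):
    """Standard fuzzy c-means on a 1-D array of intensities.

    u[k, i] is the degree to which sample i belongs to cluster k; the
    fuzzifier m > 1 controls how soft the partition is.
    """
    rng = np.random.default_rng(seed)
    v = rng.choice(x, size=c, replace=False).astype(float)    # centroids
    for _ in range(n_iter):
        d = np.abs(x[None, :] - v[:, None]) + 1e-12           # c x n distances
        u = d ** (-2.0 / (m - 1.0))
        u /= u.sum(axis=0, keepdims=True)                     # memberships
        um = u ** m
        v = (um * x[None, :]).sum(axis=1) / um.sum(axis=1)    # new centroids
    return u, v
```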
COMPARISON OF DIFFERENT SEGMENTATION ALGORITHMS FOR DERMOSCOPIC IMAGES
Directory of Open Access Journals (Sweden)
A.A. Haseena Thasneem
2015-05-01
Full Text Available This paper compares different algorithms for the segmentation of skin lesions in dermoscopic images. The basic segmentation algorithms compared are thresholding techniques (global and adaptive), region-based techniques (k-means, fuzzy c-means, expectation maximization and statistical region merging), contour models (the active contour model and the Chan-Vese model) and spectral clustering. Accuracy, sensitivity, specificity, border error, Hammoude distance, Hausdorff distance, MSE, PSNR and elapsed time metrics were used to evaluate the various segmentation techniques.
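Several of the listed metrics reduce to pixel counts on the automatic and ground-truth masks. A minimal sketch follows; the Hammoude distance is taken here in its common XOR-over-union formulation, which may differ slightly from the normalization used in the paper.

```python
import numpy as np

def border_metrics(auto, gt):
    """Pixel-wise sensitivity, specificity and Hammoude (XOR) distance.

    auto and gt are binary masks of the same shape; sensitivity is the
    fraction of lesion pixels found, specificity the fraction of
    background pixels correctly rejected, and the Hammoude distance is
    |auto xor gt| / |auto or gt|.
    """
    auto, gt = auto.astype(bool), gt.astype(bool)
    tp = np.sum(auto & gt)
    tn = np.sum(~auto & ~gt)
    fp = np.sum(auto & ~gt)
    fn = np.sum(~auto & gt)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    hammoude = np.sum(auto ^ gt) / np.sum(auto | gt)
    return sens, spec, hammoude
```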
Optimization of image processing algorithms on mobile platforms
Poudel, Pramod; Shirvaikar, Mukul
2011-03-01
This work presents a technique to optimize popular image processing algorithms on mobile platforms such as cell phones, net-books and personal digital assistants (PDAs). The increasing demand for video applications like context-aware computing on mobile embedded systems requires the use of computationally intensive image processing algorithms. The system engineer has a mandate to optimize them so as to meet real-time deadlines. A methodology to take advantage of the asymmetric dual-core processor, which includes an ARM and a DSP core supported by shared memory, is presented with implementation details. The target platform chosen is the popular OMAP 3530 processor for embedded media systems. It has an asymmetric dual-core architecture with an ARM Cortex-A8 and a TMS320C64x Digital Signal Processor (DSP). The development platform was the BeagleBoard with 256 MB of NAND RAM and 256 MB SDRAM memory. The basic image correlation algorithm is chosen for benchmarking as it finds widespread application for various template matching tasks such as face-recognition. The basic algorithm prototypes conform to OpenCV, a popular computer vision library. OpenCV algorithms can be easily ported to the ARM core which runs a popular operating system such as Linux or Windows CE. However, the DSP is architecturally more efficient at handling DFT algorithms. The algorithms are tested on a variety of images and performance results are presented measuring the speedup obtained due to dual-core implementation. A major advantage of this approach is that it allows the ARM processor to perform important real-time tasks, while the DSP addresses performance-hungry algorithms.
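The benchmark kernel, normalized cross-correlation template matching, is simple to state in its direct spatial form. This sketch is the naive O(N^2 * M^2) reference version; the point of the paper's DSP offload is precisely that the C64x core accelerates the DFT-based equivalent of this loop, and none of the names below come from the paper.

```python
import numpy as np

def ncc_match(image, template):
    """Return the top-left (y, x) of the best normalized cross-correlation.

    Both the window and the template are mean-subtracted, so the score
    is in [-1, 1] and insensitive to uniform brightness changes; a
    constant window (zero denominator) is skipped.
    """
    th, tw = template.shape
    t = template - template.mean()
    tn = np.sqrt((t ** 2).sum())
    best, best_pos = -np.inf, (0, 0)
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            w = image[y:y + th, x:x + tw]
            wc = w - w.mean()
            denom = np.sqrt((wc ** 2).sum()) * tn
            if denom == 0:
                continue
            score = (wc * t).sum() / denom
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos
```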
Optimization-Based Image Segmentation by Genetic Algorithms
Directory of Open Access Journals (Sweden)
Rosenberger C
2008-01-01
Full Text Available Many works in the literature focus on the definition of evaluation metrics and criteria that make it possible to quantify the performance of an image processing algorithm. These evaluation criteria can be used to define new image processing algorithms by optimizing them. In this paper, we propose a general scheme to segment images by a genetic algorithm. The developed method uses an evaluation criterion which quantifies the quality of an image segmentation result. The proposed segmentation method can integrate a local ground truth when it is available in order to set the desired level of precision of the final result. A genetic algorithm is then used to determine the best combination of information extracted by the selected criterion. We show that this approach can be applied either to gray-level or to multicomponent images, in a supervised or in an unsupervised context. Lastly, we show the efficiency of the proposed method through experimental results on several gray-level and multicomponent images.
FPGA implementation of image dehazing algorithm for real time applications
Kumar, Rahul; Kaushik, Brajesh Kumar; Balasubramanian, R.
2017-09-01
Weather degradation such as haze, fog, and mist severely reduces the effective range of visual surveillance. This degradation is a spatially varying phenomenon, which makes the problem non-trivial. Dehazing is an essential preprocessing stage in applications such as long-range imaging, border security, and intelligent transportation systems; however, these applications require low latency from the preprocessing block. In this work, the single-image dark channel prior algorithm is modified and implemented for fast processing with comparable visual quality of the restored image/video. Although the conventional single-image dark channel prior algorithm is computationally expensive, it yields impressive results. Moreover, a two-stage image dehazing architecture is introduced, wherein the dark channel and airlight are estimated in the first stage, while the transmission map and intensity restoration are computed in the subsequent stages. The algorithm is implemented using Xilinx Vivado software and validated on a Xilinx zc702 development board, which contains an Artix-7 equivalent Field Programmable Gate Array (FPGA) and a dual-core ARM Cortex-A9 processor. Additionally, a high-definition multimedia interface (HDMI) has been incorporated for video feed and display purposes. The results show that the dehazing algorithm attains 29 frames per second at a resolution of 1920x1080, which is suitable for real-time applications. The design utilizes 9 18K_BRAM, 97 DSP_48, 6508 FFs and 8159 LUTs.
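The conventional dark channel prior pipeline that the paper starts from can be sketched as: patch-minimum over colour channels (dark channel), airlight from the brightest dark-channel pixels, transmission from the normalized minimum channel, then scene radiance recovery. This is a simplified software reference under the hazy-image model I = J*t + A*(1 - t) (pixel-wise rather than patch-wise transmission, no guided-filter refinement), not the paper's hardware-modified version.

```python
import numpy as np

def dehaze_dark_channel(img, patch=3, omega=0.95, t0=0.1):
    """Basic single-image dark channel prior dehazing.

    `img` is HxWx3 with values in [0, 1]; airlight is assumed strictly
    positive. omega keeps a little haze for depth perception and t0
    bounds the transmission to avoid noise amplification.
    """
    h, w, _ = img.shape
    dark = img.min(axis=2)
    pad = patch // 2                       # patch-wise minimum (erosion)
    padded = np.pad(dark, pad, mode='edge')
    dc = np.zeros_like(dark)
    for y in range(h):
        for x in range(w):
            dc[y, x] = padded[y:y + patch, x:x + patch].min()
    # airlight: mean colour of the brightest 0.1% dark-channel pixels
    n = max(1, int(0.001 * h * w))
    idx = np.argsort(dc, axis=None)[-n:]
    airlight = img.reshape(-1, 3)[idx].mean(axis=0)
    # transmission map and scene radiance recovery
    t = 1.0 - omega * (img / airlight).min(axis=2)
    t = np.maximum(t, t0)[..., None]
    return np.clip((img - airlight) / t + airlight, 0.0, 1.0)
```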
Image Reconstruction Algorithm For Electrical Capacitance Tomography (ECT)
International Nuclear Information System (INIS)
Arko
2001-01-01
Most image reconstruction algorithms for electrical capacitance tomography (ECT) use sensitivity maps as weighting factors. The computation is fast, involving a simple multiply-and-accumulate (MAC) operation, but the resulting image suffers from blurring due to the soft-field effect of the sensor. This paper presents a low-cost iterative method employing proportional thresholding, which improves image quality dramatically. The strategy for implementation, the computational cost, and the achievable speed are examined for implementations on a personal computer (PC) and a Digital Signal Processor (DSP). For the PC implementation, the Watcom C++ 10.6 and Visual C++ 5.0 compilers were used. The experimental results are compared to images reconstructed by commercially available software. The new algorithm improves the image quality significantly at the cost of a few iterations. This technique can be readily exploited for online applications.
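The sensitivity-map MAC step is just a weighted back-projection, and proportional thresholding can be illustrated as clipping pixels below a fraction of the image maximum. The sketch below assumes every pixel has nonzero accumulated sensitivity; both function names, and the exact thresholding rule, are illustrative stand-ins rather than the paper's method.

```python
import numpy as np

def lbp_reconstruct(cap, smaps):
    """Linear back-projection: the multiply-and-accumulate (MAC) baseline.

    `cap` holds M normalized capacitance measurements and `smaps` the
    M sensitivity maps flattened to shape (M, n_pixels); the result is
    normalized by the accumulated sensitivity per pixel.
    """
    g = smaps.T @ cap
    return g / smaps.sum(axis=0)

def proportional_threshold(g, frac=0.5):
    """One illustrative sharpening step against soft-field blurring:
    zero out pixels below a fraction of the current image maximum."""
    return np.where(g >= frac * g.max(), g, 0.0)
```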
FCM Clustering Algorithms for Segmentation of Brain MR Images
Directory of Open Access Journals (Sweden)
Yogita K. Dubey
2016-01-01
Full Text Available The study of brain disorders requires accurate tissue segmentation of magnetic resonance (MR) brain images, which is very important for detecting tumors, edema, and necrotic tissues. Segmentation of brain images, especially into the three main tissue types, Cerebrospinal Fluid (CSF), Gray Matter (GM), and White Matter (WM), plays an important role in computer-aided neurosurgery and diagnosis. Brain images mostly contain noise, intensity inhomogeneity, and weak boundaries. Therefore, accurate segmentation of brain images is still a challenging area of research. This paper presents a review of fuzzy c-means (FCM) clustering algorithms for the segmentation of brain MR images. The review covers a detailed analysis of FCM-based algorithms with intensity inhomogeneity correction and noise robustness. Different methods for modifying the standard fuzzy objective function, with updates of the membership and cluster centroids, are also discussed.
Iris recognition using image moments and k-means algorithm.
Khan, Yaser Daanial; Khan, Sher Afzal; Ahmad, Farooq; Islam, Saeed
2014-01-01
This paper presents a biometric technique for identification of a person using the iris image. The iris is first segmented from the acquired image of an eye using an edge detection algorithm. The disk-shaped area of the iris is transformed into a rectangular form. Moments are then extracted from the grayscale image, yielding a feature vector containing scale-, rotation-, and translation-invariant moments. Images are clustered using the k-means algorithm and the centroid of each cluster is computed. An arbitrary image is assigned to the cluster whose centroid is nearest to its feature vector in terms of Euclidean distance. The described model exhibits an accuracy of 98.5%.
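The feature-vector and nearest-centroid stages can be sketched with standard normalized central moments, which are translation- and scale-invariant by construction (eta_pq = mu_pq / mu_00^(1+(p+q)/2)). The paper's features are additionally rotation invariant, which would require combining these into Hu-style invariants; that step, and both function names, are omitted/assumed here.

```python
import numpy as np

def moment_features(img, orders=((2, 0), (0, 2), (1, 1), (3, 0), (0, 3))):
    """Translation- and scale-invariant normalized central moments."""
    y, x = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()
    cx, cy = (x * img).sum() / m00, (y * img).sum() / m00   # centroid
    feats = []
    for p, q in orders:
        mu = (((x - cx) ** p) * ((y - cy) ** q) * img).sum()
        feats.append(mu / m00 ** (1 + (p + q) / 2.0))
    return np.array(feats)

def assign_cluster(feats, centroids):
    """Nearest-centroid rule used after k-means: smallest Euclidean distance."""
    return int(np.argmin(np.linalg.norm(centroids - feats, axis=1)))
```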
A novel algorithm for thermal image encryption.
Hussain, Iqtadar; Anees, Amir; Algarni, Abdulmohsen
2018-04-16
Thermal images play a vital role at nuclear plants, power stations, forensic labs, biological research facilities, and in petroleum extraction, so their security is very important. Image data have unique features such as intensity, contrast, homogeneity, entropy, and correlation among pixels, which make image encryption trickier than other kinds of encryption; conventional image encryption schemes find these features hard to handle. Cryptographers have therefore paid attention to attractive properties of chaotic maps, such as randomness and sensitivity, to build novel cryptosystems, and recently proposed image encryption techniques increasingly depend on the application of chaotic maps. This paper proposes an image encryption algorithm based on the Chebyshev chaotic map and substitution boxes built on the S8 symmetric group of permutations. First, the parameters of the chaotic Chebyshev map are chosen as a secret key to confuse the primary image; then the plaintext image is encrypted using the substitution boxes and the Chebyshev map. This process yields a ciphertext image that is thoroughly confused and diffused. The outcomes of standard experiments, key sensitivity tests, and statistical analysis confirm that the proposed algorithm offers a safe and efficient approach for real-time image encryption.
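To make the chaotic-map idea concrete, here is a heavily simplified sketch of a Chebyshev-map keystream cipher (this omits the paper's S8 substitution boxes entirely; the seed x0 and map degree k stand in for the secret key and are arbitrary example values):

```python
import math

def chebyshev_stream(x0, k, n):
    """Keystream bytes from the Chebyshev map x_{n+1} = cos(k * arccos(x_n)),
    which stays in [-1, 1] and is highly sensitive to x0 and k."""
    x, out = x0, []
    for _ in range(n):
        x = math.cos(k * math.acos(x))
        out.append(int((x + 1.0) * 127.5) & 0xFF)   # map [-1, 1] to a byte
    return out

def xor_cipher(pixels, x0=0.3, k=4):
    """XOR pixels with the chaotic keystream; applying it twice decrypts."""
    ks = chebyshev_stream(x0, k, len(pixels))
    return [p ^ s for p, s in zip(pixels, ks)]

img = [10, 200, 34, 77, 255, 0]        # toy pixel values
enc = xor_cipher(img)
dec = xor_cipher(enc)                  # same key stream, so XOR inverts
```

The actual scheme adds permutation-group substitution boxes on top of the chaotic confusion step; this sketch only shows why the map's parameters can serve as a key.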
Low dose reconstruction algorithm for differential phase contrast imaging.
Wang, Zhentian; Huang, Zhifeng; Zhang, Li; Chen, Zhiqiang; Kang, Kejun; Yin, Hongxia; Wang, Zhenchang; Marco, Stampanoni
2011-01-01
Differential phase contrast imaging computed tomography (DPCI-CT) is a novel x-ray inspection method that reconstructs the distribution of the refraction index, rather than the attenuation coefficient, in weakly absorbing samples. In this paper, we propose an iterative reconstruction algorithm for DPCI-CT that benefits from the new compressed sensing theory. We first realize a differential algebraic reconstruction technique (DART) by discretizing the projection process of differential phase contrast imaging into a linear partial derivative matrix. In this way the compressed sensing reconstruction problem of DPCI is transformed into a solved problem of transmission CT imaging. Our algorithm has the potential to reconstruct the refraction index distribution of a sample from highly undersampled projection data, and thus can significantly reduce the dose and inspection time. The proposed algorithm has been validated by numerical simulations and actual experiments.
The Peak Pairs algorithm for strain mapping from HRTEM images
Energy Technology Data Exchange (ETDEWEB)
Galindo, Pedro L. [Departamento de Lenguajes y Sistemas Informaticos, CASEM, Universidad de Cadiz, Pol. Rio San Pedro s/n. 11510, Puerto Real, Cadiz (Spain)], E-mail: pedro.galindo@uca.es; Kret, Slawomir [Institute of Physics, PAS, AL. Lotnikow 32/46, 02-668 Warsaw (Poland); Sanchez, Ana M. [Departamento de Ciencia de los Materiales e Ing. Metalurgica y Q. Inorganica, Facultad de Ciencias, Universidad de Cadiz, Pol. Rio San Pedro s/n. 11510, Puerto Real, Cadiz (Spain); Laval, Jean-Yves [Laboratoire de Physique du Solide, UPR5 CNRS-ESPCI, Paris (France); Yanez, Andres; Pizarro, Joaquin; Guerrero, Elisa [Departamento de Lenguajes y Sistemas Informaticos, CASEM, Universidad de Cadiz, Pol. Rio San Pedro s/n. 11510, Puerto Real, Cadiz (Spain); Ben, Teresa; Molina, Sergio I. [Departamento de Ciencia de los Materiales e Ing. Metalurgica y Q. Inorganica, Facultad de Ciencias, Universidad de Cadiz, Pol. Rio San Pedro s/n. 11510, Puerto Real, Cadiz (Spain)
2007-11-15
Strain mapping is defined as a numerical image-processing technique that measures the local shifts of image details around a crystal defect with respect to the ideal, defect-free positions in the bulk. Algorithms to map elastic strains from high-resolution transmission electron microscopy (HRTEM) images may be classified into two categories: those based on the detection of intensity peaks in real space, and the Geometric Phase approach, calculated in Fourier space. In this paper, we discuss both categories and propose an alternative real-space algorithm (Peak Pairs) based on the detection of pairs of intensity maxima in an affine-transformed space that depends on the reference area. Although it is a real-space approach, the Peak Pairs algorithm exhibits good behaviour at heavily distorted defect cores, e.g. interfaces and dislocations. Quantitative results are reported from experiments to determine local strain in different types of semiconductor heterostructures.
A novel highly parallel algorithm for linearly unmixing hyperspectral images
Guerra, Raúl; López, Sebastián.; Callico, Gustavo M.; López, Jose F.; Sarmiento, Roberto
2014-10-01
Endmember extraction and abundance calculation are critical steps in linearly unmixing a given hyperspectral image, for two main reasons: the need to compute a set of accurate endmembers in order to obtain confident abundance maps, and the huge number of operations involved in these time-consuming processes. This work proposes an algorithm that estimates the endmembers of a hyperspectral image under analysis and their abundances at the same time. The main advantages of this algorithm are its high degree of parallelism and the mathematical simplicity of the operations involved. The algorithm estimates the endmembers as virtual pixels; in particular, it performs gradient descent to iteratively refine the endmembers and the abundances, reducing the mean square error according to the linear unmixing model. Some mathematical restrictions must be added so that the method converges to a unique and realistic solution; given the nature of the algorithm, these restrictions can be implemented easily. The results obtained with synthetic images demonstrate the good behavior of the proposed algorithm, and results obtained with the well-known Cuprite dataset corroborate the benefits of our proposal.
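The joint refinement can be sketched as projected gradient descent on the linear mixing model X ≈ E A (a toy illustration under assumed sizes and learning rate, not the paper's parallel implementation; the sum-to-one abundance constraint mentioned in the abstract is noted but omitted for brevity):

```python
import numpy as np

def unmix(X, p, n_iter=2000, lr=0.02, seed=0):
    """Jointly refine endmembers E (bands x p) and abundances A (p x pixels)
    by gradient descent on ||X - E A||^2, projecting onto nonnegativity."""
    rng = np.random.default_rng(seed)
    bands, pixels = X.shape
    E = rng.random((bands, p))
    A = rng.random((p, pixels))
    for _ in range(n_iter):
        R = E @ A - X                           # residual of the mixing model
        E = np.clip(E - lr * R @ A.T, 0.0, None)
        A = np.clip(A - lr * E.T @ R, 0.0, None)
        # (a sum-to-one constraint on A's columns would also be enforced here)
    return E, A

# synthetic 3-band scene mixed from two endmembers
E_true = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
A_true = np.array([[1.0, 0.0, 0.6], [0.0, 1.0, 0.4]])
X = E_true @ A_true
E, A = unmix(X, p=2)
```

Each update is a matrix multiply followed by an elementwise clip, which is why this style of algorithm parallelizes so well.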
Nguyen, Cu Dinh; Hult, Jenny; Sheikh, Rafi; Tenland, Kajsa; Dahlstrand, Ulf; Lindstedt, Sandra; Malmsjö, Malin
2017-10-11
It is well known that blood perfusion is important for the survival of skin flaps. As no study has investigated how blood perfusion in human eyelid skin flaps is affected by flap length and diathermy, the present study examined these factors in patients. Fifteen upper eyelids were dissected as part of a blepharoplastic procedure, releasing a 30-mm long piece of skin while allowing the 5-mm wide distal part of the skin to remain attached, to mimic a skin flap (hereafter called a "skin flap"). Blood perfusion was measured before and after repeated diathermy, using laser speckle contrast imaging. Blood perfusion decreased from the base to the tip of the flap: 5 mm from the base, perfusion was 69% of baseline, at 10 mm it was 40%, at 15 mm it was 20%, and at 20 mm only 13%. Diathermy further decreased blood perfusion (measured 15 mm from the base) to 13% after the first application, 6% after the second, and 4% after the third. Blood perfusion falls rapidly with distance from the base of skin flaps on the human eyelid, and diathermy reduces it even further. Clinically, it may be advised that flaps with a width of 5 mm be no longer than 15 mm (i.e., a width:length ratio of 1:3), and that the use of diathermy be carefully considered.
Molnár, Eszter; Molnár, Bálint; Lohinai, Zsolt; Tóth, Zsuzsanna; Benyó, Zoltán; Hricisák, Laszló; Windisch, Péter; Vág, János
2017-01-01
Laser speckle contrast imaging (LSCI) has proved to be a reliable tool for flap monitoring in general surgery; however, it has not yet been evaluated in oral surgery. We applied LSCI to compare the effect of a xenogeneic collagen matrix (Geistlich Mucograft®) with that of connective tissue grafts (CTG) on the microcirculation of the modified coronally advanced tunnel technique (MCAT) for gingival recession coverage. Gingival microcirculation and wound fluid were measured before and after surgery for six months at twenty-seven treated teeth. In males, flap microcirculation was restored within 3 days for both grafts, followed by a hyperemic response; during the first 8 days, blood flow was higher at the xenogeneic graft than at the CTG. In females, the ischemic period lasted for 7-12 days depending on the graft, no hyperemic response was observed, and wound fluid production was more intense and prolonged. The LSCI method is suitable for capturing the microcirculatory effect of surgical intervention in human oral mucosa. The application of xenogeneic collagen matrices as a CTG substitute does not seem to restrain the recovery of graft bed circulation. Gender may have an effect on postoperative circulation and inflammation.
Application of the EM algorithm to radiographic images.
Brailean, J C; Little, D; Giger, M L; Chen, C T; Sullivan, B J
1992-01-01
The expectation maximization (EM) algorithm has received considerable attention in the area of positron emission tomography (PET) as a restoration and reconstruction technique. In this paper, the restoration capabilities of the EM algorithm applied to radiographic images are investigated; this application does not involve reconstruction. The performance of the EM algorithm is quantitatively evaluated using a "perceived" signal-to-noise ratio (SNR) as the image quality metric. This perceived SNR is based on statistical decision theory and includes both the observer's visual response function and a noise component internal to the eye-brain system. For a variety of processing parameters, the relative SNR (the ratio of the processed SNR to the original SNR) is calculated and used to compare quantitatively the effects of the EM algorithm with two other image enhancement techniques: global contrast enhancement (windowing) and unsharp mask filtering. The results suggest that the EM algorithm's performance is superior to unsharp mask filtering and global contrast enhancement for radiographic images containing objects smaller than 4 mm.
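For a Poisson imaging model, the EM restoration iteration takes the familiar Richardson-Lucy multiplicative form. The 1-D sketch below is an illustration of that general iteration, not the paper's exact setup (the point-source scene and 3-tap blur are invented for the example):

```python
import numpy as np

def richardson_lucy(blurred, psf, n_iter=30):
    """EM (Richardson-Lucy) restoration for a Poisson imaging model:
    f <- f * H^T( g / (H f) ), where H is convolution with a normalized psf."""
    f = np.full_like(blurred, blurred.mean())   # flat positive initial estimate
    psf_flip = psf[::-1]                        # adjoint of convolution
    for _ in range(n_iter):
        est = np.convolve(f, psf, mode="same")
        ratio = blurred / np.maximum(est, 1e-12)
        f *= np.convolve(ratio, psf_flip, mode="same")
    return f

# 1-D example: a single bright "object" blurred by a small psf
psf = np.array([0.25, 0.5, 0.25])
truth = np.zeros(15)
truth[7] = 10.0
g = np.convolve(truth, psf, mode="same")
restored = richardson_lucy(g, psf)
```

Each iteration sharpens the estimate toward the point source while keeping it nonnegative, which is the behavior the SNR comparison in the paper quantifies.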
Incident Light Frequency-Based Image Defogging Algorithm
Directory of Open Access Journals (Sweden)
Wenbo Zhang
2017-01-01
Full Text Available To solve the color distortion problem produced by the dark channel prior algorithm, an improved method that calculates the transmittance of each channel separately is proposed in this paper. Based on the Beer-Lambert law, the influence of the frequency of the incident light on the transmittance is analyzed, and the ratios between each channel's transmittance are derived. Then, to increase efficiency, the input image is resized to a smaller size before the refined transmittance is acquired, after which the transmittance is resized back to the size of the original image. Finally, all the transmittances are obtained with the help of the proportions between the color channels and used to restore the defogged image. Experiments suggest that the improved algorithm produces a much more natural result than the original algorithm, eliminating the problem of high color saturation. Moreover, the improved algorithm is four to nine times faster than the original.
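For reference, the baseline dark channel prior transmittance estimate that this paper improves on can be sketched as follows (a minimal version with an assumed patch size, uniform airlight, and the conventional omega = 0.95; not the per-channel refinement the paper proposes):

```python
import numpy as np

def dark_channel(img, patch=3):
    """img: (H, W, 3) in [0, 1]. Minimum over color channels followed by a
    local minimum filter over a patch x patch neighborhood."""
    mins = img.min(axis=2)
    H, W = mins.shape
    pad = patch // 2
    padded = np.pad(mins, pad, mode="edge")
    out = np.zeros_like(mins)
    for i in range(H):
        for j in range(W):
            out[i, j] = padded[i:i + patch, j:j + patch].min()
    return out

def transmittance(img, airlight=1.0, omega=0.95):
    """Dark-channel-prior estimate: t = 1 - omega * dark(I / A)."""
    return 1.0 - omega * dark_channel(img / airlight)

fog = np.full((4, 4, 3), 0.8)      # heavy uniform haze: low transmittance
clear = np.zeros((4, 4, 3))        # dark haze-free scene: t close to 1
```

Applying one shared t to all channels this way is exactly what causes the color distortion the paper addresses by deriving per-channel transmittance ratios.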
International Nuclear Information System (INIS)
Sidky, Emil Y.; Pan Xiaochuan; Reiser, Ingrid S.; Nishikawa, Robert M.; Moore, Richard H.; Kopans, Daniel B.
2009-01-01
Purpose: The authors develop a practical, iterative algorithm for image reconstruction in undersampled tomographic systems, such as digital breast tomosynthesis (DBT). Methods: The algorithm controls image regularity by minimizing the image total p variation (TpV), a function that reduces to the total variation when p=1.0 or to the image roughness when p=2.0. Constraints on the image, such as image positivity and estimated projection-data tolerance, are enforced by projection onto convex sets. The fact that the tomographic system is undersampled translates to the mathematical property that many widely varied resultant volumes may correspond to a given data tolerance. Thus the application of image regularity serves two purposes: (1) reducing the number of resultant volumes allowed by a fixed data tolerance, by finding the minimum image TpV for that tolerance, and (2) traditional regularization, sacrificing data fidelity for higher image regularity. The present algorithm allows for this dual role of image regularity in undersampled tomography. Results: The proposed image-reconstruction algorithm is applied to three clinical DBT data sets: one case with microcalcifications and two with masses. Conclusions: Results indicate that there may be a substantial advantage in using the present image-reconstruction algorithm for microcalcification imaging.
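The TpV objective itself is simple to state: the sum over pixels of the gradient magnitude raised to the power p. A small sketch, matching the abstract's definition (forward differences and edge handling are illustrative choices):

```python
import numpy as np

def total_p_variation(img, p=1.0):
    """Image TpV: sum of |gradient|^p over pixels. p=1.0 gives the total
    variation; p=2.0 gives the image roughness, as in the abstract."""
    gx = np.diff(img, axis=0, append=img[-1:, :])   # forward differences,
    gy = np.diff(img, axis=1, append=img[:, -1:])   # replicated at the edge
    mag = np.sqrt(gx ** 2 + gy ** 2)
    return float((mag ** p).sum())

flat = np.full((8, 8), 3.0)        # constant image: zero TpV
step = np.zeros((8, 8))
step[:, 4:] = 1.0                  # one unit step per row: TV = 8
```

In the full algorithm this quantity is driven down subject to the positivity and data-tolerance convex constraints.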
Algorithm-Architecture Matching for Signal and Image Processing
Gogniat, Guy; Morawiec, Adam; Erdogan, Ahmet
2011-01-01
Advances in signal and image processing together with increasing computing power are bringing mobile technology closer to applications in a variety of domains like automotive, health, telecommunication, multimedia, entertainment and many others. The development of these leading applications, involving a large diversity of algorithms (e.g. signal, image, video, 3D, communication, cryptography), is classically divided into three consecutive steps: a theoretical study of the algorithms, a study of the target architecture, and finally the implementation. Such a linear design flow is reaching its limits.
Road detection in SAR images using a tensor voting algorithm
Shen, Dajiang; Hu, Chun; Yang, Bing; Tian, Jinwen; Liu, Jian
2007-11-01
In this paper, the problem of detecting road networks in Synthetic Aperture Radar (SAR) images is addressed. Most previous methods extract roads by detecting lines and reconstructing the network. Traditional algorithms used in the reconstruction process, such as MRFs, genetic algorithms, and level sets, are iterative. The tensor voting methodology we propose is non-iterative and insensitive to initialization; furthermore, its only free parameter is the size of the neighborhood, which is related to scale. The algorithm is shown to be effective when applied to road extraction from a real Radarsat image.
Fingerprint matching algorithm for poor quality images
Directory of Open Access Journals (Sweden)
Vedpal Singh
2015-04-01
Full Text Available The main aim of this study is to establish an efficient platform for fingerprint matching on low-quality images. Fingerprint matching approaches generally use minutiae points for authentication; however, minutiae are not reliable for low-quality images. To overcome this problem, the current study proposes a fingerprint matching methodology based on normalised cross-correlation, which improves performance, reduces miscalculations during authentication, and decreases computational complexity. The error rate of the proposed method is 5.4%, which is less than the 5.6% of two-dimensional (2D) dynamic programming (DP), the 5.9% of Lee's method, and the 6.1% of the combined method. The genuine accept rate is 89.3% at a 1% false accept rate and 96.7% at 0.1%. The outcome of this study suggests that the proposed methodology achieves a low error rate with minimal computational effort compared with existing methods such as Lee's method, 2D DP, and the combined method.
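Normalised cross-correlation, the core similarity measure here, is invariant to the brightness and contrast shifts that plague low-quality prints. A minimal sketch on toy patches (the patch values are invented; a real matcher would slide this over candidate alignments):

```python
import numpy as np

def ncc(a, b):
    """Normalised cross-correlation of two equally sized patches:
    1.0 for a perfect match up to brightness/contrast, near 0 for unrelated."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

template = np.array([[0., 1., 0.],
                     [1., 1., 1.],
                     [0., 1., 0.]])
same = template * 2.0 + 5.0        # brightness/contrast-shifted copy
other = np.array([[1., 0., 0.],
                  [0., 1., 0.],
                  [0., 0., 1.]])
```

Because the score depends only on the mean-removed, scale-normalised patches, a degraded but structurally matching region still scores near 1.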
Beam hardening correction algorithm in microtomography images
International Nuclear Information System (INIS)
Sales, Erika S.; Lima, Inaya C.B.; Lopes, Ricardo T.; Assis, Joaquim T. de
2009-01-01
Quantification of the mineral density of bone samples is directly related to the attenuation coefficient of bone. The X-rays used in microtomography are polychromatic, with a moderately broad energy spectrum, so low-energy X-rays passing through a sample are preferentially absorbed, causing a decrease in the measured attenuation coefficient and possibly artifacts. This decrease is due to a process called beam hardening. In this work, the beam hardening in microtomography images of vertebrae of Wistar rats from a hyperthyroidism study was corrected by linearization of the projections, discretized using an energy spectrum (the spectrum of Herman). Without beam hardening correction, the results showed significant differences in bone volume, which could lead to a possible diagnosis of osteoporosis; with correction, the decrease in bone volume was not significant at a 95% confidence level. (author)
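The linearization-of-projections idea can be sketched simply: fit a mapping from the measured polychromatic projection values back to the linear monochromatic ones, then apply it to every projection. The calibration numbers below are invented toy values showing the typical sublinear (hardened) growth, not data from the study:

```python
import numpy as np

# Calibration: known material thicknesses vs measured polychromatic
# projections (toy values; note the measured values grow sublinearly)
thickness = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
measured = np.array([0.0, 0.95, 1.75, 2.40, 2.95])

# Fit a polynomial p so that p(measured) ~ mu * thickness, the ideal
# linear projection at a reference attenuation coefficient mu
mu = 1.0
coeffs = np.polyfit(measured, mu * thickness, deg=3)
linearize = np.poly1d(coeffs)

corrected = linearize(measured)    # hardened projections mapped back to linear
```

Once fitted, `linearize` is applied to all projections before reconstruction, removing the cupping that would otherwise bias the attenuation (and hence bone volume) estimates.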
CANDID: Comparison algorithm for navigating digital image databases
Energy Technology Data Exchange (ETDEWEB)
Kelly, P.M.; Cannon, T.M.
1994-02-21
In this paper, we propose a method for calculating the similarity between two digital images. A global signature describing the texture, shape, or color content is first computed for every image stored in a database, and a normalized distance between probability density functions of feature vectors is used to match signatures. This method can be used to retrieve images from a database that are similar to an example target image. The algorithm is applied to the problem of search and retrieval for a database containing pulmonary CT imagery, and experimental results are provided.
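A minimal version of the signature-and-distance idea, using a gray-level histogram as the global signature and an L1-based normalized distance (the choice of histogram signature and this particular distance are illustrative assumptions, not CANDID's exact features):

```python
import numpy as np

def signature(img, bins=8):
    """Global gray-level histogram signature, normalized to a density."""
    h, _ = np.histogram(img, bins=bins, range=(0, 256))
    return h / h.sum()

def similarity(p, q):
    """1 minus half the L1 distance between densities:
    1.0 for identical signatures, 0.0 for disjoint ones."""
    return 1.0 - 0.5 * np.abs(p - q).sum()

dark = np.full((8, 8), 20)        # uniformly dark image
dark2 = np.full((8, 8), 25)       # falls in the same histogram bin
light = np.full((8, 8), 240)      # uniformly bright image
```

Retrieval then reduces to ranking database images by `similarity` against the query's signature.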
Categorization and Searching of Color Images Using Mean Shift Algorithm
Directory of Open Access Journals (Sweden)
Prakash PANDEY
2009-07-01
Full Text Available Image searching remains a challenging problem in content-based image retrieval (CBIR) systems. Most CBIR systems operate on all images without pre-sorting them, so search results contain many unrelated images. The aim of this research is to propose a new object-based indexing system based on extracting salient region representatives from the image, categorizing the image into different types, and searching for images similar to a given query image. In our approach, color features are extracted using the mean shift algorithm, a robust clustering technique, and dominant objects are obtained by region grouping of segmented thumbnails. The category for an image is generated automatically by analyzing the image for the presence of a dominant object. The images in the database are clustered based on region feature similarity using Euclidean distance. Placing an image into a category helps the user navigate retrieval results more effectively. Extensive experimental results illustrate excellent performance.
Image noise reduction algorithm for digital subtraction angiography: clinical results.
Söderman, Michael; Holmin, Staffan; Andersson, Tommy; Palmgren, Charlotta; Babic, Draženko; Hoornaert, Bart
2013-11-01
To test the hypothesis that an image noise reduction algorithm designed for digital subtraction angiography (DSA) in interventional neuroradiology enables a reduction in the patient entrance dose by a factor of 4 while maintaining image quality. This clinical prospective study was approved by the local ethics committee, and all 20 adult patients provided informed consent. DSA was performed with the default reference DSA program, a quarter-dose DSA program with modified acquisition parameters (to reduce patient radiation dose exposure), and a real-time noise-reduction algorithm. Two consecutive biplane DSA data sets were acquired in each patient. The dose-area product (DAP) was calculated for each image and compared. A randomized, blinded, offline reading study was conducted to show noninferiority of the quarter-dose image sets. Overall, 40 samples per treatment group were necessary to acquire 80% power, which was calculated by using a one-sided α level of 2.5%. The mean DAP with the quarter-dose program was 25.3% ± 0.8 of that with the reference program. The median overall image quality scores with the reference program were 9, 13, and 12 for readers 1, 2, and 3, respectively. These scores increased slightly to 12, 15, and 12, respectively, with the quarter-dose program imaging chain. In DSA, a change in technique factors combined with a real-time noise-reduction algorithm will reduce the patient entrance dose by 75%, without a loss of image quality. RSNA, 2013
ProxImaL: efficient image optimization using proximal algorithms
Heide, Felix
2016-07-11
Computational photography systems are becoming increasingly diverse, while computational resources, for example on mobile platforms, are rapidly increasing. As diverse as these camera systems may be, slightly different variants of the underlying image processing tasks, such as demosaicking, deconvolution, denoising, inpainting, image fusion, and alignment, are shared between all of these systems. Formal optimization methods have recently been demonstrated to achieve state-of-the-art quality for many of these applications. Unfortunately, different combinations of natural image priors and optimization algorithms may be optimal for different problems, and implementing and testing each combination is currently a time-consuming and error-prone process. ProxImaL is a domain-specific language and compiler for image optimization problems that makes it easy to experiment with different problem formulations and algorithm choices. The language uses proximal operators as the fundamental building blocks of a variety of linear and nonlinear image formation models and cost functions, advanced image priors, and noise models. The compiler intelligently chooses the best way to translate a problem formulation and choice of optimization algorithm into an efficient solver implementation. In applications to the image processing pipeline, deconvolution in the presence of Poisson-distributed shot noise, and burst denoising, we show that a few lines of ProxImaL code can generate highly efficient solvers that achieve state-of-the-art results. We also show applications to the nonlinear and nonconvex problem of phase retrieval.
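To see what a proximal operator building block looks like, here is the classic prox of the L1 norm (soft thresholding) used inside a proximal gradient loop for a toy sparse denoising problem (this is a generic illustration of the machinery ProxImaL composes, not ProxImaL's API):

```python
import numpy as np

def prox_l1(v, t):
    """Proximal operator of t*||x||_1: elementwise soft thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_denoise(y, lam=0.5, step=1.0, n_iter=50):
    """Proximal gradient (ISTA) for min_x 0.5*||x - y||^2 + lam*||x||_1:
    a gradient step on the smooth term, then the prox of the L1 term."""
    x = np.zeros_like(y)
    for _ in range(n_iter):
        grad = x - y
        x = prox_l1(x - step * grad, step * lam)
    return x

y = np.array([3.0, -2.0, 0.2, -0.1, 4.0])   # noisy signal, small entries = noise
x = ista_denoise(y)
```

For this separable problem the solution is exactly soft-thresholding y at lam, so small entries are zeroed while large ones shrink by lam; swapping in other proximal operators yields other priors, which is the composability ProxImaL automates.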
An efficient feedback calibration algorithm for direct imaging radio telescopes
Beardsley, Adam P.; Thyagarajan, Nithyanandan; Bowman, Judd D.; Morales, Miguel F.
2017-10-01
We present the E-field Parallel Imaging Calibration (EPICal) algorithm, which addresses the need for a fast calibration method for direct imaging radio astronomy correlators. Direct imaging involves a spatial fast Fourier transform of antenna signals, alleviating an O(Na^2) computational bottleneck typical in radio correlators and yielding a gentler O(Ng log2 Ng) scaling, where Na is the number of antennas in the array and Ng is the number of grid points in the imaging analysis. This can save orders of magnitude in computation cost for next generation arrays consisting of hundreds or thousands of antennas. However, because antenna signals are mixed in the imaging correlator without creating visibilities, gain correction must be applied prior to imaging, rather than on visibilities post-correlation. We develop the EPICal algorithm to form gain solutions quickly and without ever forming visibilities. This method scales as the number of antennas, and produces results comparable to those from visibilities. We use simulations to demonstrate the EPICal technique and study the noise properties of our gain solutions, showing they are similar to visibility-based solutions in realistic situations. By applying EPICal to 2 s of Long Wavelength Array data, we achieve a 65 per cent dynamic range improvement compared to uncalibrated images, showing this algorithm is a promising solution for next generation instruments.
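The key point that gain correction must happen before the spatial FFT can be shown with a toy gridded array (this illustrates the direct imaging data path only; it is not the EPICal solver, and the gain values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8                                         # antennas on an n x n grid

# True sky-induced voltage at each gridded antenna, and per-antenna gains
v_true = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
gains = (0.5 + rng.random((n, n))) * np.exp(1j * rng.uniform(0, 0.2, (n, n)))
v_meas = gains * v_true

# Direct imaging: spatial FFT of antenna voltages, then squared magnitude.
# The FFT mixes all antennas together, so per-antenna gains can only be
# divided out *before* the transform, never on the output image.
image_raw = np.abs(np.fft.fft2(v_meas)) ** 2
image_cal = np.abs(np.fft.fft2(v_meas / gains)) ** 2
image_true = np.abs(np.fft.fft2(v_true)) ** 2
```

EPICal's contribution is solving for those per-antenna gains quickly from the E-field data stream itself, without ever forming visibilities.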
Performance evaluation of the EM algorithm applied to radiographic images
International Nuclear Information System (INIS)
Brailean, J.C.; Giger, M.L.; Chen, C.T.; Sullivan, B.J.
1990-01-01
In this paper the authors evaluate the expectation maximization (EM) algorithm, both qualitatively and quantitatively, as a technique for enhancing radiographic images. Previous studies have qualitatively shown the usefulness of the EM algorithm but have failed to quantify and compare its performance with those of other image processing techniques. Recent studies by Loo et al., Ishida et al., and Giger et al. have explained improvements in image quality quantitatively in terms of a signal-to-noise ratio (SNR) derived from signal detection theory. In this study, we take a similar approach in quantifying the effect of the EM algorithm on detection of simulated low-contrast square objects superimposed on radiographic mottle. The SNRs of the original and processed images are calculated taking into account both the human visual system response and the screen-film transfer function, as well as a noise component internal to the eye-brain system. The EM algorithm was also implemented on digital screen-film images of test patterns and clinical mammograms.
Imaging of the spleen: a proposed algorithm
International Nuclear Information System (INIS)
Shirkhoda, A.; McCartney, W.H.; Staab, E.V.; Mittelstaedt, C.A.
1980-01-01
The 99mTc sulfur colloid scan is an effective initial method for evaluating splenic size, position, and focal or diffusely altered radionuclide uptake. Sonography is a useful next step in determining whether focal lesions are solid or cystic and the relation of the spleen to adjacent organs. In our opinion, computed tomography (CT) may be reserved for the few instances in which diagnostic questions remain unanswered after radionuclide scanning and sonography. Angiography is used primarily in splenic trauma. We evaluated 900 patients suspected of having liver-spleen abnormality. This experience, which led to a logically sequenced noninvasive imaging approach for evaluating suspected splenic pathology, is summarized and illustrated by several cases.
Research on fusion algorithm of polarization image in tetrolet domain
Zhang, Dexiang; Yuan, BaoHong; Zhang, Jingjing
2015-12-01
Tetrolets are Haar-type wavelets whose supports are tetrominoes, shapes made by connecting four equal-sized squares. A fusion method for polarization images based on the tetrolet transform is proposed. First, the magnitude-of-polarization and angle-of-polarization images are decomposed into low-frequency and high-frequency coefficients over multiple scales and directions using the tetrolet transform. The low-frequency coefficients are fused by averaging, while the directional high-frequency coefficients are fused by selecting the better coefficients according to a region spectral entropy criterion, motivated by the edge distribution differences in the high-frequency sub-band images. Finally, the fused image is obtained by applying the inverse transform to the fused tetrolet coefficients. Experimental results show that the proposed method detects image features more effectively and that the fused image has a better subjective visual effect.
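The fusion scheme (average the low band, pick the stronger detail coefficients) can be sketched with an ordinary one-level 2-D Haar split standing in for the tetrolet transform, and a simple larger-magnitude rule standing in for the region-entropy selection; both substitutions are simplifications of the paper's method:

```python
import numpy as np

def haar_decompose(img):
    """One-level 2-D Haar split into a low (approximation) band and three
    detail bands; a stand-in here for the tetrolet transform."""
    a = img[0::2, 0::2]
    b = img[0::2, 1::2]
    c = img[1::2, 0::2]
    d = img[1::2, 1::2]
    low = (a + b + c + d) / 4.0
    details = ((a - b + c - d) / 4.0,       # horizontal detail
               (a + b - c - d) / 4.0,       # vertical detail
               (a - b - c + d) / 4.0)       # diagonal detail
    return low, details

def fuse(img1, img2):
    """Average the low bands; keep the larger-magnitude detail coefficient."""
    low1, det1 = haar_decompose(img1)
    low2, det2 = haar_decompose(img2)
    low = (low1 + low2) / 2.0
    dets = [np.where(np.abs(d1) >= np.abs(d2), d1, d2)
            for d1, d2 in zip(det1, det2)]
    return low, dets

x = np.arange(16.0).reshape(4, 4)   # toy "magnitude of polarization" image
z = np.zeros((4, 4))                # toy featureless second image
low, dets = fuse(x, z)
```

The final fused image would then come from the corresponding inverse transform applied to the fused coefficients.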
Adaptive Proximal Point Algorithms for Total Variation Image Restoration
Directory of Open Access Journals (Sweden)
Ying Chen
2015-02-01
Full Text Available Image restoration is a fundamental problem in various areas of imaging science. This paper presents a class of adaptive proximal point algorithms (APPA) with a contraction strategy for total variation image restoration. In each iteration, the proposed methods choose an adaptive proximal parameter matrix which is not necessarily symmetric. There is an inner extrapolation in the prediction step, which is followed by a correction step for contraction, and the inner extrapolation is implemented by an adaptive scheme. Using the framework of contraction methods, a global convergence result and a convergence rate of O(1/N) are established for the proposed methods. Numerical results are reported to illustrate the efficiency of the APPA methods for solving total variation image restoration problems. Comparisons with state-of-the-art algorithms demonstrate that the proposed methods are comparable and promising.
Image Retrieval Algorithm Based on Discrete Fractional Transforms
Jindal, Neeru; Singh, Kulbir
2013-06-01
The discrete fractional transform is a signal processing tool that offers computational algorithms and solutions for various sophisticated applications. In this paper, a new technique to retrieve an encrypted and scrambled image based on discrete fractional transforms is proposed. A two-dimensional image is encrypted using discrete fractional transforms with three fractional orders and two random phase masks placed in the two intermediate planes. A significant feature of discrete fractional transforms is the extra degree of freedom provided by their fractional orders. The security strength is enhanced (1024!)^4 times by scrambling the encrypted image. In the decryption process, image retrieval is sensitive to both the correct fractional order keys and the scrambling algorithm, making a brute force attack infeasible. Mean square error and relative error are the evaluation parameters used to verify the validity of the proposed method.
Efficient generation of image chips for training deep learning algorithms
Han, Sanghui; Fafard, Alex; Kerekes, John; Gartley, Michael; Ientilucci, Emmett; Savakis, Andreas; Law, Charles; Parhan, Jason; Turek, Matt; Fieldhouse, Keith; Rovito, Todd
2017-05-01
Training deep convolutional networks for satellite or aerial image analysis often requires a large amount of training data. For a more robust algorithm, training data need to have variations not only in the background and target, but also radiometric variations in the image such as shadowing, illumination changes, atmospheric conditions, and imaging platforms with different collection geometry. Data augmentation is a commonly used approach to generating additional training data. However, this approach is often insufficient in accounting for real world changes in lighting, location or viewpoint outside of the collection geometry. Alternatively, image simulation can be an efficient way to augment training data that incorporates all these variations, such as changing backgrounds, that may be encountered in real data. The Digital Imaging and Remote Sensing Image Generation (DIRSIG) model is a tool that produces synthetic imagery using a suite of physics-based radiation propagation modules. DIRSIG can simulate images taken from different sensors with variation in collection geometry, spectral response, solar elevation and angle, atmospheric models, target, and background. Simulation of Urban Mobility (SUMO) is a multi-modal traffic simulation tool that explicitly models vehicles that move through a given road network. The output of the SUMO model was incorporated into DIRSIG to generate scenes with moving vehicles. The same approach was used with helicopters as targets, but with slight modifications. Using the combination of DIRSIG and SUMO, we quickly generated many small images, with the target at the center with different backgrounds. The simulations generated images with vehicles and helicopters as targets, and corresponding images without targets. Using parallel computing, 120,000 training images were generated in about an hour. Some preliminary results show an improvement in the deep learning algorithm when real image training data are augmented with
Speckle reduction in echocardiography by temporal compounding and anisotropic diffusion filtering
Giraldo-Guzmán, Jader; Porto-Solano, Oscar; Cadena-Bonfanti, Alberto; Contreras-Ortiz, Sonia H.
2015-01-01
Echocardiography is a medical imaging technique based on ultrasound signals that is used to evaluate heart anatomy and physiology. Echocardiographic images are affected by speckle, a type of multiplicative noise that obscures details of the structures, and reduces the overall image quality. This paper shows an approach to enhance echocardiography using two processing techniques: temporal compounding and anisotropic diffusion filtering. We used twenty echocardiographic videos that include one or three cardiac cycles to test the algorithms. Two images from each cycle were aligned in space and averaged to obtain the compound images. These images were then processed using anisotropic diffusion filters to further improve their quality. Resultant images were evaluated using quality metrics and visual assessment by two medical doctors. The average total improvement on signal-to-noise ratio was up to 100.29% for videos with three cycles, and up to 32.57% for videos with one cycle.
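The two-stage idea of the abstract above can be sketched as follows; this is not the authors' pipeline, but a minimal stand-in in which Perona-Malik diffusion plays the role of the anisotropic filter, and the synthetic frames, multiplicative-noise level, and parameters are all assumptions.

```python
import numpy as np

def perona_malik(img, iters=20, kappa=0.5, gamma=0.2):
    """Anisotropic (Perona-Malik) diffusion: smooths homogeneous regions
    while preserving edges via the conductance g = exp(-(|grad|/kappa)^2)."""
    u = img.astype(float).copy()
    for _ in range(iters):
        dn = np.roll(u, -1, 0) - u   # north/south/east/west finite differences
        ds = np.roll(u, 1, 0) - u
        de = np.roll(u, -1, 1) - u
        dw = np.roll(u, 1, 1) - u
        u += gamma * sum(np.exp(-(d / kappa) ** 2) * d for d in (dn, ds, de, dw))
    return u

rng = np.random.default_rng(1)
frame = np.zeros((64, 64)); frame[16:48, 16:48] = 1.0
# Two "aligned" frames of the same cardiac phase with independent speckle-like noise
f1 = frame * (1 + 0.3 * rng.standard_normal(frame.shape))   # multiplicative noise
f2 = frame * (1 + 0.3 * rng.standard_normal(frame.shape))
compound = 0.5 * (f1 + f2)           # temporal compounding
filtered = perona_malik(compound)    # anisotropic diffusion
```

Compounding averages independent speckle realizations, and the diffusion step then removes residual noise without blurring the edges the conductance term protects.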
Still Image Compression Algorithm Based on Directional Filter Banks
Chunling Yang; Duanwu Cao; Li Ma
2010-01-01
Hybrid wavelet and directional filter banks (HWD) is an effective multi-scale geometrical analysis method. Compared to the wavelet transform, it can better capture the directional information of images. However, the ringing artifact, which is caused by coefficient quantization in the transform domain, is the biggest drawback of image compression algorithms in the HWD domain. In this paper, by studying the relationship between directional decomposition and the ringing artifact, an improved decomposition ...
Evaluation of clinical image processing algorithms used in digital mammography.
Zanca, Federica; Jacobs, Jurgen; Van Ongeval, Chantal; Claus, Filip; Celis, Valerie; Geniets, Catherine; Provost, Veerle; Pauwels, Herman; Marchal, Guy; Bosmans, Hilde
2009-03-01
Screening is the only proven approach to reduce the mortality of breast cancer, but significant numbers of breast cancers remain undetected even when all quality assurance guidelines are implemented. With the increasing adoption of digital mammography systems, image processing may be a key factor in the imaging chain. Although to our knowledge statistically significant effects of manufacturer-recommended image processing algorithms have not been previously demonstrated, the subjective experience of our radiologists, that the apparent image quality can vary considerably between different algorithms, motivated this study. This article addresses the impact of five such algorithms on the detection of clusters of microcalcifications. A database of unprocessed (raw) images of 200 normal digital mammograms, acquired with the Siemens Novation DR, was collected retrospectively. Realistic simulated microcalcification clusters were inserted in half of the unprocessed images. All unprocessed images were subsequently processed with five manufacturer-recommended image processing algorithms (Agfa Musica 1, IMS Raffaello Mammo 1.2, Sectra Mamea AB Sigmoid, Siemens OPVIEW v2, and Siemens OPVIEW v1). Four breast imaging radiologists were asked to locate and score the clusters in each image on a five point rating scale. The free-response data were analyzed by the jackknife free-response receiver operating characteristic (JAFROC) method and, for comparison, also with the receiver operating characteristic (ROC) method. JAFROC analysis revealed highly significant differences between the image processing algorithms (F = 8.51, p < 0.0001), suggesting that image processing strongly impacts the detectability of clusters. Siemens OPVIEW2 and Siemens OPVIEW1 yielded the highest and lowest performances, respectively. ROC analysis of the data also revealed significant differences between the processing algorithms but at lower significance (F = 3.47, p = 0.0305) than JAFROC. Both statistical analysis methods revealed that the
A Novel Perceptual Hash Algorithm for Multispectral Image Authentication
Directory of Open Access Journals (Sweden)
Kaimeng Ding
2018-01-01
Full Text Available The perceptual hash algorithm is a technique to authenticate the integrity of images. While a few scholars have worked on mono-spectral image perceptual hashing, there is limited research on multispectral image perceptual hashing. In this paper, we propose a perceptual hash algorithm for the content authentication of a multispectral remote sensing image based on the synthetic characteristics of each band: firstly, the multispectral remote sensing image is preprocessed with band clustering and grid partition; secondly, the edge feature of the band subsets is extracted by band fusion-based edge feature extraction; thirdly, the perceptual feature of the same region of the band subsets is compressed and normalized to generate the perceptual hash value. The authentication procedure is achieved via the normalized Hamming distance between the recomputed perceptual hash value and the original hash value. The experiments indicated that our proposed algorithm is robust to content-preserving operations and efficiently authenticates the integrity of multispectral remote sensing images.
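The normalized-Hamming-distance authentication step can be illustrated with a toy hash; this is not the paper's band-clustering and edge-fusion scheme, just a block-mean hash computed per band, with all sizes and thresholds chosen for illustration.

```python
import numpy as np

def block_mean_hash(band, grid=8):
    """Hash one band: grid x grid block means thresholded at their median."""
    h, w = band.shape
    b = band[:h - h % grid, :w - w % grid]
    means = b.reshape(grid, h // grid, grid, w // grid).mean(axis=(1, 3))
    return (means > np.median(means)).ravel()

def multispectral_hash(bands):
    """Concatenate the per-band hashes into one perceptual hash."""
    return np.concatenate([block_mean_hash(b) for b in bands])

def normalized_hamming(h1, h2):
    return float(np.mean(h1 != h2))

rng = np.random.default_rng(3)
bands = [rng.random((64, 64)) for _ in range(4)]                        # 4-band toy image
noisy = [b + 0.01 * rng.standard_normal(b.shape) for b in bands]        # content-preserving change
other = [rng.random((64, 64)) for _ in range(4)]                        # different content
h = multispectral_hash(bands)
d_noisy = normalized_hamming(h, multispectral_hash(noisy))
d_other = normalized_hamming(h, multispectral_hash(other))
```

A small distance indicates a content-preserving operation; a distance near 0.5 indicates unrelated content, so authentication reduces to comparing `d` against a threshold.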
The influence of image reconstruction algorithms on linear thorax EIT image analysis of ventilation
International Nuclear Information System (INIS)
Zhao, Zhanqi; Möller, Knut; Frerichs, Inéz; Pulletz, Sven; Müller-Lisse, Ullrich
2014-01-01
Analysis methods of electrical impedance tomography (EIT) images based on different reconstruction algorithms were examined. EIT measurements were performed on eight mechanically ventilated patients with acute respiratory distress syndrome. A maneuver with step increase of airway pressure was performed. EIT raw data were reconstructed offline with (1) filtered back-projection (BP); (2) the Dräger algorithm based on linearized Newton–Raphson (DR); (3) the GREIT (Graz consensus reconstruction algorithm for EIT) reconstruction algorithm with a circular forward model (GRC) and (4) GREIT with individual thorax geometry (GRT). Individual thorax contours were automatically determined from the routine computed tomography images. Five indices were calculated on the resulting EIT images respectively: (a) the ratio between tidal and deep inflation impedance changes; (b) tidal impedance changes in the right and left lungs; (c) center of gravity; (d) the global inhomogeneity index and (e) ventilation delay at mid-dorsal regions. No significant differences were found in all examined indices among the four reconstruction algorithms (p > 0.2, Kruskal–Wallis test). The examined algorithms used for EIT image reconstruction do not influence the selected indices derived from the EIT image analysis. Indices validated for images from one reconstruction algorithm are also valid for other reconstruction algorithms. (paper)
The influence of image reconstruction algorithms on linear thorax EIT image analysis of ventilation.
Zhao, Zhanqi; Frerichs, Inéz; Pulletz, Sven; Müller-Lisse, Ullrich; Möller, Knut
2014-06-01
Analysis methods of electrical impedance tomography (EIT) images based on different reconstruction algorithms were examined. EIT measurements were performed on eight mechanically ventilated patients with acute respiratory distress syndrome. A maneuver with step increase of airway pressure was performed. EIT raw data were reconstructed offline with (1) filtered back-projection (BP); (2) the Dräger algorithm based on linearized Newton-Raphson (DR); (3) the GREIT (Graz consensus reconstruction algorithm for EIT) reconstruction algorithm with a circular forward model (GR(C)) and (4) GREIT with individual thorax geometry (GR(T)). Individual thorax contours were automatically determined from the routine computed tomography images. Five indices were calculated on the resulting EIT images respectively: (a) the ratio between tidal and deep inflation impedance changes; (b) tidal impedance changes in the right and left lungs; (c) center of gravity; (d) the global inhomogeneity index and (e) ventilation delay at mid-dorsal regions. No significant differences were found in all examined indices among the four reconstruction algorithms (p > 0.2, Kruskal-Wallis test). The examined algorithms used for EIT image reconstruction do not influence the selected indices derived from the EIT image analysis. Indices validated for images from one reconstruction algorithm are also valid for other reconstruction algorithms.
A fast image encryption algorithm based on chaotic map
Liu, Wenhao; Sun, Kehui; Zhu, Congxu
2016-09-01
Derived from the Sine map and the iterative chaotic map with infinite collapse (ICMIC), a new two-dimensional Sine ICMIC modulation map (2D-SIMM) is proposed based on a closed-loop modulation coupling (CMC) model, and its chaotic performance is analyzed by means of phase diagram, Lyapunov exponent spectrum and complexity. The analysis shows that this map has good ergodicity, hyperchaotic behavior, a large maximum Lyapunov exponent and high complexity. Based on this map, a fast image encryption algorithm is proposed. In this algorithm, the confusion and diffusion processes are combined into one stage. A chaotic shift transform (CST) is proposed to efficiently change the image pixel positions, and row and column substitutions are applied to scramble the pixel values simultaneously. The simulation and analysis results show that this algorithm has high security, low time complexity, and the ability to resist statistical, differential, brute-force, known-plaintext and chosen-plaintext attacks.
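The confusion/diffusion structure described above can be sketched with a toy cipher; here a logistic map stands in for the paper's 2D-SIMM map, and an argsort-based shuffle stands in for the chaotic shift transform. The key values are arbitrary, and this sketch is illustrative only, not cryptographically vetted.

```python
import numpy as np

def chaotic_keystream(x0, r, n):
    """Logistic-map byte keystream (a stand-in for the paper's 2D-SIMM map)."""
    out = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = x
    return (out * 256).astype(np.uint8)

def encrypt(img, x0=0.3141, r=3.9999):
    flat = img.ravel()
    ks = chaotic_keystream(x0, r, flat.size)                    # diffusion: XOR keystream
    perm = np.argsort(chaotic_keystream(x0 / 2, r, flat.size))  # confusion: chaotic shuffle
    return (flat[perm] ^ ks).reshape(img.shape)

def decrypt(cipher, x0=0.3141, r=3.9999):
    flat = cipher.ravel()
    ks = chaotic_keystream(x0, r, flat.size)
    perm = np.argsort(chaotic_keystream(x0 / 2, r, flat.size))
    out = np.empty_like(flat)
    out[perm] = flat ^ ks                                       # undo XOR, then unshuffle
    return out.reshape(cipher.shape)

rng = np.random.default_rng(4)
img = rng.integers(0, 256, (16, 16), dtype=np.uint8)
cipher = encrypt(img)
recovered = decrypt(cipher)
```

Because both the keystream and the permutation are regenerated deterministically from the key (x0, r), decryption inverts encryption exactly; sensitivity to x0 comes from the map's chaotic divergence.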
Image Encryption Using a Lightweight Stream Encryption Algorithm
Directory of Open Access Journals (Sweden)
Saeed Bahrami
2012-01-01
Full Text Available Security of multimedia data, including images and video, is one of the basic requirements for telecommunications and computer networks. In this paper, we consider a simple and lightweight stream encryption algorithm for image encryption, and a series of tests is performed to confirm the suitability of the described encryption algorithm. These tests include a visual test, histogram analysis, information entropy, encryption quality, correlation analysis, differential analysis, and performance analysis. Based on this analysis, it can be concluded that the present algorithm offers the same security level as the A5/1 and W7 stream ciphers, is faster, and is suitable for real-time applications.
Sparse Nonlinear Electromagnetic Imaging Accelerated With Projected Steepest Descent Algorithm
Desmal, Abdulla
2017-04-03
An efficient electromagnetic inversion scheme for imaging sparse 3-D domains is proposed. The scheme achieves its efficiency and accuracy by integrating two concepts. First, the nonlinear optimization problem is constrained using L₀ or L₁-norm of the solution as the penalty term to alleviate the ill-posedness of the inverse problem. The resulting Tikhonov minimization problem is solved using nonlinear Landweber iterations (NLW). Second, the efficiency of the NLW is significantly increased using a steepest descent algorithm. The algorithm uses a projection operator to enforce the sparsity constraint by thresholding the solution at every iteration. Thresholding level and iteration step are selected carefully to increase the efficiency without sacrificing the convergence of the algorithm. Numerical results demonstrate the efficiency and accuracy of the proposed imaging scheme in reconstructing sparse 3-D dielectric profiles.
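The projection-by-thresholding step described above can be illustrated on a linear toy problem; the sketch below is iterative hard thresholding on a random Gaussian system, standing in for the paper's nonlinear electromagnetic inversion, with all problem sizes chosen for illustration.

```python
import numpy as np

def projected_steepest_descent(A, y, k, iters=300):
    """Iterative hard thresholding: a steepest-descent step on ||Ax - y||^2,
    then projection onto the set of k-sparse vectors."""
    m, n = A.shape
    step = 1.0 / np.linalg.norm(A, 2) ** 2        # step from the spectral norm
    x = np.zeros(n)
    for _ in range(iters):
        x = x + step * (A.T @ (y - A @ x))        # gradient (steepest-descent) step
        small = np.argsort(np.abs(x))[:-k]        # indices of all but the k largest
        x[small] = 0.0                            # hard-thresholding projection
    return x

rng = np.random.default_rng(2)
n, m, k = 64, 32, 4
A = rng.standard_normal((m, n)) / np.sqrt(m)      # underdetermined measurement matrix
support = rng.choice(n, size=k, replace=False)
x_true = np.zeros(n); x_true[support] = rng.standard_normal(k)
y = A @ x_true
x_hat = projected_steepest_descent(A, y, k)
```

The projection enforces the sparsity constraint at every iteration, exactly the mechanism the abstract describes; in the paper the gradient step comes from the nonlinear forward model rather than a fixed matrix.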
Cryptanalysis of a chaos-based image encryption algorithm
International Nuclear Information System (INIS)
Cokal, Cahit; Solak, Ercan
2009-01-01
A chaos-based image encryption algorithm was proposed in [Z.-H. Guan, F. Huang, W. Guan, Phys. Lett. A 346 (2005) 153]. In this Letter, we analyze the security weaknesses of the proposal. By applying chosen-plaintext and known-plaintext attacks, we show that all the secret parameters can be revealed.
Security Analysis of A Chaos-based Image Encryption Algorithm
Lian, Shiguo; Sun, Jinsheng; Wang, Zhiquan
2006-01-01
The security of Fridrich's image encryption algorithm against brute-force, statistical, known-plaintext and chosen-plaintext attacks is analyzed by investigating the properties of the involved chaotic maps and diffusion functions. Based on the given analyses, some means are proposed to strengthen the overall performance of the cryptosystem.
Quality measures for HRR alignment based ISAR imaging algorithms
CSIR Research Space (South Africa)
Janse van Rensburg, V
2013-05-01
Full Text Available Some Inverse Synthetic Aperture Radar (ISAR) algorithms form the image in a two-step process of range alignment and phase conjugation. This paper discusses a comprehensive set of measures used to quantify the quality of range alignment, with the aim...
E-PLE: an Algorithm for Image Inpainting
Directory of Open Access Journals (Sweden)
Yi-Qing Wang
2013-12-01
Full Text Available Gaussian mixtures are a powerful tool for modeling the patch prior. In this work, a probabilistic view of an existing image inpainting algorithm, piecewise linear estimation (PLE), is presented, which leads to several theoretical and numerical improvements based on an effective use of Gaussian mixtures.
Computationally efficient algorithms for statistical image processing : implementation in R
Langovoy, M.; Wittich, O.
2010-01-01
In the series of our earlier papers on the subject, we proposed a novel statistical hypothesis testing method for the detection of objects in noisy images. The method uses results from percolation theory and random graph theory. We developed algorithms that allow the detection of objects of unknown shapes in
A Novel Algorithm of Surface Eliminating in Undersurface Optoacoustic Imaging
Directory of Open Access Journals (Sweden)
Zhulina Yulia V
2004-01-01
Full Text Available This paper analyzes the task of optoacoustic imaging of objects located under a covering surface. We suggest a surface-eliminating algorithm based on the fact that the intensity of the image, as a function of the spatial point, changes slowly inside local objects but suffers a discontinuity of its spatial gradients on their boundaries. The algorithm forms the two-dimensional curves along which the discontinuity of the signal derivatives is detected. Then, the algorithm divides the signal space into areas along these curves. The signals inside the areas with the maximum level of signal amplitudes and the maximum absolute gradient values on their edges are set to zero. The remaining signals are used for the image restoration. This method permits reconstruction of the surface boundaries with a higher contrast than the surface detection technique based on the maxima of the received signals. The algorithm does not require any prior knowledge of the signal statistics inside and outside the local objects, and it may be used for reconstructing any images from signals representing integrals over the object's volume. Simulation results and real data are provided to validate the proposed method.
Heuristic Scheduling Algorithm Oriented Dynamic Tasks for Imaging Satellites
Directory of Open Access Journals (Sweden)
Maocai Wang
2014-01-01
Full Text Available Imaging satellite scheduling is an NP-hard problem with many complex constraints. This paper studies the scheduling problem for dynamic tasks oriented to emergency cases. After analyzing the dynamic properties of satellite scheduling, an optimization model is proposed. Based on the model, two heuristic algorithms are proposed to solve the problem. The first heuristic algorithm, named IDI, arranges new tasks by inserting or deleting them, then inserting them repeatedly according to priority from low to high. The second one, called ISDR, adopts four steps: insert directly, insert by shifting, insert by deleting, and reinsert the deleted tasks. Moreover, two heuristic factors, the congestion degree of a time window and the overlapping degree of a task, are employed to improve the algorithms' performance. Finally, a case is given to test the algorithms. The results show that the IDI algorithm is better than ISDR from the running-time point of view, while the ISDR algorithm with heuristic factors is more effective with regard to scheduling performance. Moreover, the results also show that our method performs well for larger sets of dynamic tasks in comparison with the other two methods.
Development of computed tomography system and image reconstruction algorithm
International Nuclear Information System (INIS)
Khairiah Yazid; Mohd Ashhar Khalid; Azaman Ahmad; Khairul Anuar Mohd Salleh; Ab Razak Hamzah
2006-01-01
Computed tomography is one of the most advanced and powerful nondestructive inspection techniques and is currently used in many different industries. In several CT systems, detection has been performed by a combination of an X-ray image intensifier and a charge-coupled device (CCD) camera, or by using a line array detector. The recent development of the X-ray flat panel detector has made fast CT imaging feasible and practical. Therefore this paper explains the arrangement of a new detection system, which uses the existing high resolution (127 μm pixel size) flat panel detector in MINT, and the image reconstruction technique developed. The aim of the project is to develop a prototype flat panel detector based CT imaging system for NDE. The prototype consists of an X-ray tube, a flat panel detector system, a rotation table and a computer system to control the sample motion and image acquisition. Hence this project is divided into two major tasks: firstly to develop the image reconstruction algorithm, and secondly to integrate the X-ray imaging components into one CT system. An image reconstruction algorithm using the filtered back-projection method is developed and compared to other techniques. MATLAB is the tool used for the simulations and computations in this project. (Author)
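The filtered back-projection method mentioned above can be sketched compactly in NumPy; nearest-neighbour interpolation, the ideal ramp filter, and the 64×64 disk phantom are simplifications chosen for illustration, not the MINT system's implementation.

```python
import numpy as np

def radon(img, thetas):
    """Parallel-beam forward projection: rotate (nearest neighbour), sum rows."""
    n = img.shape[0]
    c = (n - 1) / 2.0
    yy, xx = np.mgrid[0:n, 0:n] - c
    sino = np.zeros((len(thetas), n))
    for i, t in enumerate(thetas):
        xs = np.cos(t) * xx + np.sin(t) * yy + c    # source coords of rotated image
        ys = -np.sin(t) * xx + np.cos(t) * yy + c
        xi, yi = np.round(xs).astype(int), np.round(ys).astype(int)
        ok = (xi >= 0) & (xi < n) & (yi >= 0) & (yi < n)
        rot = np.zeros((n, n))
        rot[ok] = img[yi[ok], xi[ok]]
        sino[i] = rot.sum(axis=0)                   # integrate along the rotated y-axis
    return sino

def fbp(sino, thetas):
    """Filtered back-projection: ramp-filter each view, smear back over the grid."""
    nviews, n = sino.shape
    ramp = np.abs(np.fft.fftfreq(n))                # ideal ramp filter
    filt = np.fft.ifft(np.fft.fft(sino, axis=1) * ramp, axis=1).real
    c = (n - 1) / 2.0
    yy, xx = np.mgrid[0:n, 0:n] - c
    recon = np.zeros((n, n))
    for i, t in enumerate(thetas):
        s = np.cos(t) * xx - np.sin(t) * yy + c     # detector bin seen by each pixel
        si = np.clip(np.round(s).astype(int), 0, n - 1)
        recon += filt[i][si]
    return recon * np.pi / nviews

thetas = np.linspace(0.0, np.pi, 90, endpoint=False)
gy, gx = np.mgrid[0:64, 0:64] - 31.5
phantom = ((gx ** 2 + gy ** 2) < 12 ** 2).astype(float)   # disk test phantom
recon = fbp(radon(phantom, thetas), thetas)
```

Production FBP implementations add apodized filters (e.g. Shepp-Logan) and linear interpolation, but the filter-then-backproject structure is the same.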
Advances and applications of optimised algorithms in image processing
Oliva, Diego
2017-01-01
This book presents a study of the use of optimization algorithms in complex image processing problems. The problems selected explore areas ranging from the theory of image segmentation to the detection of complex objects in medical images. Furthermore, the concepts of machine learning and optimization are analyzed to provide an overview of the application of these tools in image processing. The material has been compiled from a teaching perspective. Accordingly, the book is primarily intended for undergraduate and postgraduate students of Science, Engineering, and Computational Mathematics, and can be used for courses on Artificial Intelligence, Advanced Image Processing, Computational Intelligence, etc. Likewise, the material can be useful for research from the evolutionary computation, artificial intelligence and image processing co.
Image Blocking Encryption Algorithm Based on Laser Chaos Synchronization
Directory of Open Access Journals (Sweden)
Shu-Ying Wang
2016-01-01
Full Text Available In view of digital image transmission security, a novel image encryption scheme based on laser chaos synchronization and the Arnold cat map is proposed. Based on the pixel values of the plain image, a parameter is generated to influence the secret key. Sequences of the drive system and response system are pretreated by the same method and used to encrypt the plain image block by block. Finally, pixel positions are scrambled by a general Arnold transformation. In the decryption process, the chaotic synchronization accuracy is fully considered and the relationship between the synchronization effect and decryption is analyzed; the scheme has the characteristics of high precision, high efficiency, simplicity, flexibility, and good controllability. The experimental results show that the encrypted image has high security and good antijamming performance.
Performance evaluation of image segmentation algorithms on microscopic image data
Czech Academy of Sciences Publication Activity Database
Beneš, Miroslav; Zitová, Barbara
2015-01-01
Roč. 275, č. 1 (2015), s. 65-85 ISSN 0022-2720 R&D Projects: GA ČR GAP103/12/2211 Institutional support: RVO:67985556 Keywords : image segmentation * performance evaluation * microscopic images Subject RIV: JC - Computer Hardware ; Software Impact factor: 2.136, year: 2015 http://library.utia.cas.cz/separaty/2014/ZOI/zitova-0434809-DOI.pdf
Robust Dehaze Algorithm for Degraded Image of CMOS Image Sensors
Directory of Open Access Journals (Sweden)
Chen Qu
2017-09-01
Full Text Available The CMOS (Complementary Metal-Oxide-Semiconductor) sensor is a type of solid-state image sensor device widely used in object tracking, object recognition, intelligent navigation, and related fields. However, images captured by outdoor CMOS sensor devices are usually affected by suspended atmospheric particles (such as haze), causing a reduction in image contrast, color distortion, and other problems. In view of this, we propose a novel dehazing approach based on a locally consistent Markov random field (MRF) framework. The neighboring clique in the traditional MRF is extended to a non-neighboring clique defined on locally consistent blocks, based on two clues: both the atmospheric light and the transmission map satisfy the character of local consistency. In this framework, our model can strengthen the restriction on the whole image while incorporating more sophisticated statistical priors, resulting in more expressive modeling power, thus effectively addressing inadequate detail recovery and alleviating color distortion. Moreover, the locally consistent MRF framework obtains details while maintaining better dehazing results, which effectively improves the quality of images captured by the CMOS image sensor. Experimental results verify that the proposed method has the combined advantages of detail recovery and color preservation.
The ANACONDA algorithm for deformable image registration in radiotherapy
International Nuclear Information System (INIS)
Weistrand, Ola; Svensson, Stina
2015-01-01
Purpose: The purpose of this work was to describe a versatile algorithm for deformable image registration with applications in radiotherapy and to validate it on thoracic 4DCT data as well as CT/cone beam CT (CBCT) data. Methods: ANAtomically CONstrained Deformation Algorithm (ANACONDA) combines image information (i.e., intensities) with anatomical information as provided by contoured image sets. The registration problem is formulated as a nonlinear optimization problem and solved with an in-house developed solver, tailored to this problem. The objective function, which is minimized during optimization, is a linear combination of four nonlinear terms: 1. image similarity term; 2. grid regularization term, which aims at keeping the deformed image grid smooth and invertible; 3. a shape based regularization term which works to keep the deformation anatomically reasonable when regions of interest are present in the reference image; and 4. a penalty term which is added to the optimization problem when controlling structures are used, aimed at deforming the selected structure in the reference image to the corresponding structure in the target image. Results: To validate ANACONDA, the authors have used 16 publicly available thoracic 4DCT data sets for which target registration errors from several algorithms have been reported in the literature. On average for the 16 data sets, the target registration error is 1.17 ± 0.87 mm, Dice similarity coefficient is 0.98 for the two lungs, and image similarity, measured by the correlation coefficient, is 0.95. The authors have also validated ANACONDA using two pelvic cases and one head and neck case with planning CT and daily acquired CBCT. Each image has been contoured by a physician (radiation oncologist) or experienced radiation therapist. The results are an improvement with respect to rigid registration. However, for the head and neck case, the sample set is too small to show statistical significance. Conclusions: ANACONDA
Reliable Line Matching Algorithm for Stereo Images with Topological Relationship
Directory of Open Access Journals (Sweden)
WANG Jingxue
2017-11-01
Full Text Available Because individual line matching neglects the relationships between the matched line and its adjacent lines, and because individual line descriptors are unreliable on discontinuous textures, this paper presents a reliable line matching algorithm for stereo images based on topological relationships. The algorithm first generates grouped line pairs from the lines extracted from the reference image and the searching image, according to basic topological relationships such as the distance and angle between lines. It then takes the grouped line pairs as matching primitives and matches them using, in order, the epipolar constraint, homography constraint, quadrant constraint, and the gray correlation constraint of irregular triangles. Finally, it resolves the corresponding line pairs into pairs of corresponding individual lines, and obtains one-to-one matching results after post-processing by integrating, fitting, and checking. Digital aerial images and close-range images with typical texture features are used for parameter analysis and line matching, and the experimental results demonstrate that the proposed algorithm can obtain reliable line matching results.
A high performance hardware implementation image encryption with AES algorithm
Farmani, Ali; Jafari, Mohamad; Miremadi, Seyed Sohrab
2011-06-01
This paper describes the implementation of a high-speed, high-throughput algorithm for encrypting images. We select the highly secure symmetric key encryption algorithm AES (Advanced Encryption Standard) and increase its speed and throughput using a four-stage pipeline technique, a control unit based on logic gates, an optimized design of the multiplier blocks in the MixColumns phase, and simultaneous generation of keys and rounds. This procedure makes AES suitable for fast image encryption. A 128-bit AES has been implemented on an Altera FPGA, achieving a throughput of 6 Gbps at 471 MHz. Encrypting a 32×32 test image takes 1.15 ms.
A robust color image watermarking algorithm against rotation attacks
Han, Shao-cheng; Yang, Jin-feng; Wang, Rui; Jia, Gui-min
2018-01-01
A robust digital watermarking algorithm is proposed based on quaternion wavelet transform (QWT) and discrete cosine transform (DCT) for copyright protection of color images. The luminance component Y of a host color image in YIQ space is decomposed by QWT, and then the coefficients of four low-frequency subbands are transformed by DCT. An original binary watermark scrambled by Arnold map and iterated sine chaotic system is embedded into the mid-frequency DCT coefficients of the subbands. In order to improve the performance of the proposed algorithm against rotation attacks, a rotation detection scheme is implemented before watermark extracting. The experimental results demonstrate that the proposed watermarking scheme shows strong robustness not only against common image processing attacks but also against arbitrary rotation attacks.
RESEARCH ON FOREST FLAME RECOGNITION ALGORITHM BASED ON IMAGE FEATURE
Directory of Open Access Journals (Sweden)
Z. Wang
2017-09-01
Full Text Available In recent years, fire recognition based on image features has become a hotspot in fire monitoring. However, due to the complexity of the forest environment, the accuracy of forest fire recognition based on image features is low. On this basis, this paper proposes a feature extraction algorithm based on the YCrCb color space and K-means clustering. Firstly, the color characteristics of a large number of forest fire image samples are prepared and analyzed. Using the K-means clustering algorithm, a forest flame model is obtained by comparing two commonly used color spaces, and the suspected flame area is discriminated and extracted. The experimental results show that the extraction accuracy of the flame area based on the YCrCb color model is higher than that of the HSI color model; the method can be applied to forest fire identification in different scenes and is feasible in practice.
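The YCrCb plus K-means idea can be sketched on a synthetic scene; the conversion below is the standard BT.601 formula, the clustering is plain Lloyd's algorithm, and the image, colors, and the high-Cr rule for picking the flame cluster are illustrative assumptions, not the paper's trained flame model.

```python
import numpy as np

def rgb_to_ycrcb(rgb):
    """BT.601 RGB -> YCrCb conversion (inputs in 0..255)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cr = 0.713 * (r - y) + 128.0
    cb = 0.564 * (b - y) + 128.0
    return np.stack([y, cr, cb], axis=-1)

def kmeans(data, k, iters=30, seed=0):
    """Plain Lloyd's algorithm on row vectors."""
    rng = np.random.default_rng(seed)
    centers = data[rng.choice(len(data), k, replace=False)].copy()
    for _ in range(iters):
        labels = ((data[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = data[labels == j].mean(0)
    return labels, centers

# Synthetic scene: flame-coloured patch on a green background (illustrative values).
rng = np.random.default_rng(5)
img = np.empty((48, 48, 3))
img[...] = (40.0, 150.0, 60.0)                      # background (R, G, B)
img[16:32, 16:32] = (250.0, 120.0, 30.0)            # "flame" region
img += 5.0 * rng.standard_normal(img.shape)
pix = rgb_to_ycrcb(img).reshape(-1, 3)
labels, centers = kmeans(pix, k=2)
flame_cluster = int(centers[:, 1].argmax())         # cluster with highest Cr (red chroma)
mask = (labels == flame_cluster).reshape(48, 48)
```

Flame-colored pixels have high Cr (red chroma), which is why clustering in YCrCb separates them more cleanly than clustering raw RGB under varying brightness.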
Hypercube algorithms suitable for image understanding in uncertain environments
International Nuclear Information System (INIS)
Huntsberger, T.L.; Sengupta, A.
1988-01-01
Computer vision in a dynamic environment needs to be fast and able to tolerate incomplete or uncertain intermediate results. An appropriately chosen representation coupled with a parallel architecture addresses both concerns. The wide range of numerical and symbolic processing needed for robust computer vision can only be achieved through a blend of SIMD and MIMD processing techniques. The 1024-element hypercube architecture has these capabilities, and was chosen as the test-bed hardware for the development of highly parallel computer vision algorithms. This paper presents and analyzes parallel algorithms for color image segmentation and edge detection. These algorithms are part of a recently developed computer vision system which uses multiple-valued logic to represent uncertainty in the imaging process and in intermediate results. Algorithms for the extraction of three-dimensional properties of objects using dynamic scene analysis techniques within the same framework are examined. Results from experimental studies using a 1024-element hypercube implementation of the algorithm as applied to a series of natural scenes are reported.
National Research Council Canada - National Science Library
Freedman, Matthew T
2004-01-01
.... We have received US Army Human Use approval for study of tissue samples. We provided technical advice to Imperium, and have performed physics tests and imaging of pieces of animal tissue obtained in a supermarket...
National Research Council Canada - National Science Library
Freedman, Matthew
2003-01-01
.... We have received US Army Human Use approval for study of tissue samples. During this past year, we provided technical advice to Imperium, and have performed physics tests and imaging of pieces of animal tissue obtained in a supermarket...
Low-Complexity Regularization Algorithms for Image Deblurring
Alanazi, Abdulrahman
2016-11-01
Image restoration problems deal with images in which information has been degraded by blur or noise. In practice, the blur is usually caused by atmospheric turbulence, motion, camera shake, and several other mechanical or physical processes. In this study, we present two regularization algorithms for the image deblurring problem. We first present a new method based on solving a regularized least-squares (RLS) problem. This method is proposed to find a near-optimal value of the regularization parameter in the RLS problems. Experimental results on the non-blind image deblurring problem are presented. In all experiments, comparisons are made with three benchmark methods. The results demonstrate that the proposed method clearly outperforms the other methods in terms of both the output PSNR and structural similarity, as well as the visual quality of the deblurred images. To reduce the complexity of the proposed algorithm, we propose a technique based on the bootstrap method to estimate the regularization parameter in low- and high-resolution images. Numerical results show that the proposed technique can effectively reduce the computational complexity of the proposed algorithms. In addition, for some cases where the point spread function (PSF) is separable, we propose using a Kronecker product so as to reduce the computations. Furthermore, in the case where the image is smooth, it is always desirable to replace the regularization term in the RLS problems by a total variation term. Therefore, we propose a novel method for adaptively selecting the regularization parameter in a so-called square root regularized total variation (SRTV). Experimental results demonstrate that our proposed method outperforms the other benchmark methods when applied to smooth images in terms of PSNR, SSIM and the restored image quality. In this thesis, we focus on the non-blind image deblurring problem, where the blur kernel is assumed to be known. However, we developed algorithms that also work...
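The RLS step described above has a closed form when the blur is modeled as circular convolution. The following sketch (an assumption-laden illustration, not the thesis code) implements Tikhonov-regularized deconvolution in the Fourier domain, with `lam` standing in for the regularization parameter that the thesis tunes:

```python
import numpy as np

def deblur_rls(blurred, psf, lam):
    """Tikhonov-regularized least-squares deconvolution in the Fourier domain.

    Solves min_x ||h * x - y||^2 + lam * ||x||^2, whose per-frequency solution
    is X = conj(H) * Y / (|H|^2 + lam).
    """
    H = np.fft.fft2(psf, s=blurred.shape)
    Y = np.fft.fft2(blurred)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + lam)
    return np.real(np.fft.ifft2(X))

# Hypothetical example: blur a synthetic image with a 5x5 box PSF, then restore.
image = np.zeros((64, 64))
image[24:40, 24:40] = 1.0
psf = np.ones((5, 5)) / 25.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(psf, s=image.shape)))
restored = deblur_rls(blurred, psf, lam=1e-3)
```

With a larger `lam` the solution is smoother but blurrier; choosing that trade-off automatically is exactly the problem the thesis addresses.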
Analysis of the speckle properties in a laser projection system based on a human eye model.
Cui, Zhe; Wang, Anting; Ma, Qianli; Ming, Hai
2014-03-01
In this paper, the properties of the speckle observed by humans in laser projection systems are theoretically analyzed. The speckle pattern on the fovea of the human retina is numerically simulated by introducing a chromatic human eye model. The results show that the speckle contrast experienced by humans is affected by the light intensity of the projected images and the wavelength of the laser source when paracentral vision is considered. Furthermore, the image quality is also affected by these two parameters. We believe that these results are useful for evaluating the speckle noise in laser projection systems.
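Speckle contrast, the quantity analyzed in studies like this one, is conventionally defined as the ratio of the intensity standard deviation to the mean. A minimal numerical illustration (synthetic statistics, not the authors' eye-model simulation):

```python
import numpy as np

def speckle_contrast(intensity):
    """Speckle contrast C = sigma_I / <I>; C = 1 for fully developed speckle."""
    return float(intensity.std() / intensity.mean())

# Fully developed, polarized speckle has negative-exponential intensity
# statistics, for which the contrast is exactly 1.
rng = np.random.default_rng(0)
pattern = rng.exponential(1.0, (512, 512))
C = speckle_contrast(pattern)
```

Any mechanism that averages independent intensity realizations (wavelength diversity, eye motion, detector integration) pushes the measured contrast below 1.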
Improved document image segmentation algorithm using multiresolution morphology
Bukhari, Syed Saqib; Shafait, Faisal; Breuel, Thomas M.
2011-01-01
Page segmentation into text and non-text elements is an essential preprocessing step before optical character recognition (OCR) operation. In case of poor segmentation, an OCR classification engine produces garbage characters due to the presence of non-text elements. This paper describes modifications to the text/non-text segmentation algorithm presented by Bloomberg [1], which is also available in his open-source Leptonica library [2]. The modifications result in significant improvements and achieve better segmentation accuracy than the original algorithm for the UW-III, UNLV, ICDAR 2009 page segmentation competition test images and circuit diagram datasets.
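Bloomberg's actual method relies on multiresolution threshold reduction; as a much simpler illustration of the morphology-style smearing used in text segmentation, the sketch below applies horizontal run-length smoothing to merge character pixels into word/line blobs (a hypothetical helper, not the paper's algorithm):

```python
import numpy as np

def rlsa_horizontal(binary, gap):
    """Run-length smoothing: fill horizontal background gaps shorter than or
    equal to `gap` between foreground pixels, merging characters into blobs."""
    out = binary.copy()
    for row in out:
        fg = np.flatnonzero(row)
        for a, b in zip(fg[:-1], fg[1:]):
            if b - a <= gap:
                row[a:b] = 1
    return out

# Toy page: sparse "character" pixels on one text line.
page = np.zeros((3, 20), dtype=int)
page[1, [2, 5, 8, 15]] = 1
merged = rlsa_horizontal(page, gap=4)
```

Connected components of the merged image can then be classified as text or non-text by size and aspect ratio, which is the spirit of the segmentation step described above.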
Chaotic Image Encryption Algorithm Based on Circulant Operation
Directory of Open Access Journals (Sweden)
Xiaoling Huang
2013-01-01
Full Text Available A novel chaotic image encryption scheme based on the time-delay Lorenz system is presented in this paper, formulated in terms of circulant matrices. Making use of the chaotic sequence generated by the time-delay Lorenz system, the pixel permutation is carried out in diagonal and antidiagonal directions according to the first and second components. Then, a pseudorandom chaotic sequence is generated again from the time-delay Lorenz system using all components. Modular operation is further employed for diffusion by blocks, in which the control parameter is generated depending on the plain-image. Numerical experiments show that the proposed scheme possesses the properties of a large key space to resist brute-force attack, sensitive dependence on secret keys, uniform distribution of gray values in the cipher-image, and zero correlation between two adjacent cipher-image pixels. Therefore, it can be adopted as an effective and fast image encryption algorithm.
Chaotic Image Scrambling Algorithm Based on S-DES
International Nuclear Information System (INIS)
Yu, X Y; Zhang, J; Ren, H E; Xu, G S; Luo, X Y
2006-01-01
With the growing security requirements for images on networks, some typical image encryption methods, such as the Arnold cat map and the Hilbert transformation, can no longer meet the demands of encryption. The S-DES system can encrypt the binary stream of an image, but its fixed structure and small number of keys still carry risks. However, the sensitivity to initial values of the Logistic chaotic map makes it well suited to the S-DES system, giving S-DES greater randomness and a larger key space. A dual image encryption algorithm based on S-DES and the Logistic map is proposed. In Matlab simulation experiments, the key space reaches 10^17 and encrypting one image takes less than one second. Compared to traditional methods, the algorithm is easy to understand and offers rapid encryption, a large key space, and sensitivity to initial values.
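As a toy illustration of why the Logistic map's sensitivity to initial values suits encryption (this is not the paper's S-DES hybrid), the sketch below derives a keystream from the map and XORs it with the data; `x0` plays the role of the key:

```python
import numpy as np

def logistic_keystream(x0, r, n, burn_in=100):
    """Generate n pseudorandom bytes from the logistic map x -> r*x*(1-x).

    Exponential sensitivity to x0 means even a tiny key change yields a
    completely different keystream after the burn-in transient is discarded.
    """
    x = x0
    for _ in range(burn_in):
        x = r * x * (1.0 - x)
    out = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = int(x * 256) % 256
    return out

def xor_cipher(data, key_x0, r=3.99):
    """Encrypt/decrypt a uint8 array by XOR with the chaotic keystream."""
    ks = logistic_keystream(key_x0, r, data.size)
    return data ^ ks

plain = np.frombuffer(b"speckle imaging", dtype=np.uint8).copy()
cipher = xor_cipher(plain, key_x0=0.3141592653589793)
recovered = xor_cipher(cipher, key_x0=0.3141592653589793)
```

Decryption with the exact key recovers the plaintext; a key differing in the last decimal place produces garbage, which is the sensitivity property the abstract exploits.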
Image-reconstruction algorithms for positron-emission tomography systems
International Nuclear Information System (INIS)
Cheng, S.N.C.
1982-01-01
The positional uncertainty in the time-of-flight measurement of a positron-emission tomography system is modelled as a Gaussian distributed random variable and the image is assumed to be piecewise constant on a rectilinear lattice. A reconstruction algorithm using maximum-likelihood estimation is derived for the situation in which time-of-flight data are sorted as the most-likely-position array. The algorithm is formulated as a linear system described by a nonseparable, block-banded, Toeplitz matrix, and a sine-transform technique is used to implement this algorithm efficiently. The reconstruction algorithms for both the most-likely-position array and the confidence-weighted array are described by similar equations, hence similar linear systems can be used to describe the reconstruction algorithm for a discrete, confidence-weighted array, when the matrix and the entries in the data array are properly identified. It is found that the mean-square error depends on the ratio of the full width at half maximum (FWHM) of the time-of-flight measurement to the size of a pixel. When other parameters are fixed, the larger the pixel size, the smaller is the mean-square error. In the study of resolution, parameters that affect the impulse response of time-of-flight reconstruction algorithms are identified. It is found that the larger the pixel size, the larger is the standard deviation of the impulse response. This shows that small mean-square error and fine resolution are two contradictory requirements.
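The ratio discussed above is set by the positional uncertainty of the time-of-flight measurement. A small helper (illustrative, using the standard FWHM-to-sigma conversion for a Gaussian) computes the positional sigma along the line of response:

```python
import math

C_MM_PER_PS = 0.299792458  # speed of light in mm per picosecond

def tof_position_sigma_mm(timing_fwhm_ps):
    """Gaussian positional uncertainty (sigma, in mm) along the line of
    response for a given coincidence timing resolution (FWHM, in ps).

    The positional FWHM is c * dt / 2 (the two photons share the delay),
    and sigma = FWHM / (2 * sqrt(2 * ln 2)) for a Gaussian.
    """
    pos_fwhm = C_MM_PER_PS * timing_fwhm_ps / 2.0
    return pos_fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))

# e.g. a 500 ps FWHM timing resolution localizes the annihilation point
# to about 75 mm FWHM, i.e. roughly 32 mm sigma.
sigma = tof_position_sigma_mm(500.0)
```

Dividing this FWHM by the pixel size gives the ratio that, per the abstract, governs the mean-square error of the reconstruction.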
Robust digital image inpainting algorithm in the wireless environment
Karapetyan, G.; Sarukhanyan, H. G.; Agaian, S. S.
2014-05-01
Image or video inpainting is the process/art of retrieving missing portions of an image without introducing undesirable artifacts that are undetectable by an ordinary observer. An image/video can be damaged due to a variety of factors, such as deterioration due to scratches, laser dazzling effects, wear and tear, dust spots, loss of data when transmitted through a channel, etc. Applications of inpainting include image restoration (removing laser dazzling effects, dust spots, date, text, time, etc.), image synthesis (texture synthesis), completing panoramas, image coding, wireless transmission (recovery of the missing blocks), digital culture protection, image de-noising, fingerprint recognition, and film special effects and production. Most inpainting methods can be classified into two key groups: global and local methods. Global methods are used for generating large image regions from samples while local methods are used for filling in small image gaps. Each method has its own advantages and limitations. For example, the global inpainting methods perform well on textured image retrieval, whereas the classical local methods perform poorly. In addition, some of the techniques are computationally intensive, exceeding the capabilities of most currently used mobile devices. In general, the inpainting algorithms are not suitable for the wireless environment. This paper presents a new and efficient scheme that combines the advantages of both local and global methods into a single algorithm. In particular, it introduces a blind inpainting model to solve the above problems by adaptively selecting the support area for the inpainting scheme. The proposed method is applied to various challenging image restoration tasks, including recovering old photos, recovering missing data on real and synthetic images, and recovering the specular reflections in endoscopic images. A number of computer simulations demonstrate the effectiveness of our scheme and also illustrate the main properties...
Speckle-modulating optical coherence tomography in living mice and humans
Liba, Orly; Lew, Matthew D.; Sorelle, Elliott D.; Dutta, Rebecca; Sen, Debasish; Moshfeghi, Darius M.; Chu, Steven; de La Zerda, Adam
2017-06-01
Optical coherence tomography (OCT) is a powerful biomedical imaging technology that relies on the coherent detection of backscattered light to image tissue morphology in vivo. As a consequence, OCT is susceptible to coherent noise (speckle noise), which imposes significant limitations on its diagnostic capabilities. Here we show speckle-modulating OCT (SM-OCT), a method based purely on light manipulation that virtually eliminates speckle noise originating from a sample. SM-OCT accomplishes this by creating and averaging an unlimited number of scans with uncorrelated speckle patterns without compromising spatial resolution. Using SM-OCT, we reveal small structures in the tissues of living animals, such as the inner stromal structure of a live mouse cornea, the fine structures inside the mouse pinna, and sweat ducts and Meissner's corpuscle in the human fingertip skin, features that are otherwise obscured by speckle noise when using conventional OCT or OCT with current state-of-the-art speckle reduction methods.
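The core idea of SM-OCT, averaging uncorrelated speckle patterns, can be illustrated numerically: for fully developed speckle the contrast (std/mean) is 1, and averaging N uncorrelated patterns reduces it toward 1/sqrt(N). A sketch with synthetic exponential intensities (not OCT data):

```python
import numpy as np

# Averaging N uncorrelated speckle patterns reduces speckle contrast from 1
# toward 1/sqrt(N) without degrading the underlying structure, which is the
# principle SM-OCT exploits by physically decorrelating successive scans.
rng = np.random.default_rng(0)
N = 16
patterns = rng.exponential(1.0, (N, 256, 256))  # fully developed speckle

single_contrast = float(patterns[0].std() / patterns[0].mean())
averaged = patterns.mean(axis=0)
avg_contrast = float(averaged.std() / averaged.mean())
```

With N = 16 the contrast drops by a factor of 4, illustrating why an "unlimited" number of uncorrelated scans can make speckle arbitrarily weak.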
A Degree Distribution Optimization Algorithm for Image Transmission
Jiang, Wei; Yang, Junjie
2016-09-01
Luby Transform (LT) codes are the first practical implementation of digital fountain codes. The coding behavior of an LT code is mainly decided by the degree distribution, which determines the relationship between source data and codewords. Two degree distributions were suggested by Luby. They work well in typical situations but not optimally in the case of finite encoding symbols. In this work, a degree distribution optimization algorithm is proposed to explore the potential of LT codes. First, a selection scheme of sparse degrees for LT codes is introduced. Then the probability distribution is optimized according to the selected degrees. In image transmission, the bit stream is sensitive to channel noise, and even a single bit error may cause loss of synchronization between the encoder and the decoder. The proposed algorithm is therefore designed for the image transmission situation. Moreover, optimal class partition is studied for image transmission with unequal error protection. The experimental results are quite promising: compared with an LT code with the robust soliton distribution, the proposed algorithm noticeably improves the final quality of recovered images at the same overhead.
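The robust soliton distribution mentioned above, the baseline the paper compares against, can be computed directly from Luby's formulas. A sketch with the usual parameters `c` and `delta` (values here are illustrative defaults, not the paper's choices):

```python
import math

def robust_soliton(k, c=0.1, delta=0.5):
    """Robust soliton degree distribution for an LT code with k input symbols.

    rho is the ideal soliton part, tau the added spike near k/R that keeps
    the decoder's ripple alive; the normalized sum mu is the sampling
    distribution over degrees 1..k.
    """
    R = c * math.log(k / delta) * math.sqrt(k)
    rho = [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
    spike = int(round(k / R))
    tau = []
    for d in range(1, k + 1):
        if d < spike:
            tau.append(R / (d * k))
        elif d == spike:
            tau.append(R * math.log(R / delta) / k)
        else:
            tau.append(0.0)
    Z = sum(rho) + sum(tau)
    return [(r + t) / Z for r, t in zip(rho, tau)]

mu = robust_soliton(1000)
```

Degree 2 dominates the distribution, which is what makes belief-propagation decoding of LT codes work; the optimization algorithm in the paper reshapes exactly this distribution for finite k.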
A novel high-frequency encoding algorithm for image compression
Siddeq, Mohammed M.; Rodrigues, Marcos A.
2017-12-01
In this paper, a new method for image compression is proposed whose quality is demonstrated through accurate 3D reconstruction from 2D images. The method is based on the discrete cosine transform (DCT) together with a high-frequency minimization encoding algorithm at the compression stage and a new concurrent binary search algorithm at the decompression stage. The proposed compression method consists of five main steps: (1) divide the image into blocks and apply DCT to each block; (2) apply a high-frequency minimization method to the AC coefficients, reducing each block by 2/3 and resulting in a minimized array; (3) build a lookup table of probability data to enable the recovery of the original high frequencies at the decompression stage; (4) apply a delta or differential operator to the list of DC components; and (5) apply arithmetic encoding to the outputs of steps (2) and (4). At the decompression stage, the lookup table and the concurrent binary search algorithm are used to reconstruct all high-frequency AC coefficients while the DC components are decoded by reversing the arithmetic coding. Finally, the inverse DCT recovers the original image. We tested the technique by compressing and decompressing 2D images, including images with structured light patterns for 3D reconstruction. The technique is compared with JPEG and JPEG2000 through 2D and 3D RMSE. Results demonstrate that the proposed compression method is perceptually superior to JPEG with equivalent quality to JPEG2000. Concerning 3D surface reconstruction from images, it is demonstrated that the proposed method is superior to both JPEG and JPEG2000.
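Step (1) of the pipeline, a blockwise DCT followed by discarding insignificant coefficients, can be sketched as follows; keeping only the largest-magnitude coefficients is a simplified stand-in for the paper's high-frequency minimization and lookup-table machinery, not the proposed codec:

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II basis matrix D, so that X = D @ x @ D.T for an
    n x n block and x = D.T @ X @ D inverts it exactly."""
    k = np.arange(n)
    D = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    D *= np.sqrt(2.0 / n)
    D[0] *= 1.0 / np.sqrt(2.0)
    return D

def compress_block(block, keep=10):
    """Toy lossy step: forward DCT, zero all but the `keep` largest-magnitude
    coefficients (a stand-in for quantization + entropy coding), inverse DCT."""
    D = dct_matrix(block.shape[0])
    coeffs = D @ block @ D.T
    threshold = np.sort(np.abs(coeffs).ravel())[-keep]
    coeffs[np.abs(coeffs) < threshold] = 0.0
    return D.T @ coeffs @ D

# A smooth 8x8 block compresses well: 6 of 64 coefficients suffice.
smooth = np.outer(np.linspace(0, 1, 8), np.linspace(1, 2, 8))
recon = compress_block(smooth, keep=6)
```

Because the DCT concentrates a smooth block's energy in a few low frequencies, discarding the rest changes the block only slightly, which is the property both this method and JPEG rely on.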
TRANSFORMATION ALGORITHM FOR IMAGES OBTAINED BY OMNIDIRECTIONAL CAMERAS
Directory of Open Access Journals (Sweden)
V. P. Lazarenko
2015-01-01
Full Text Available Omnidirectional optoelectronic systems find their application in areas where a wide viewing angle is critical. However, omnidirectional optoelectronic systems have a large distortion that makes their application more difficult. The paper compares the projection functions of traditional perspective lenses and omnidirectional wide-angle fish-eye lenses with a viewing angle not less than 180°. This comparison shows that distortion models of omnidirectional cameras cannot be described as a deviation from the classic pinhole camera model. To solve this problem, an algorithm for transforming omnidirectional images has been developed. The paper provides a brief comparison of the four calibration methods available in open-source toolkits for omnidirectional optoelectronic systems. The geometrical projection model used for calibration of the omnidirectional optical system is given. The algorithm consists of three basic steps. At the first step, we calculate the field of view of a virtual pinhole PTZ camera. This field of view is characterized by an array of 3D points in the object space. At the second step, the array of corresponding pixels for these three-dimensional points is calculated. Then we calculate the projection function that expresses the relation between a given 3D point in the object space and a corresponding pixel point. In this paper we use a calibration procedure providing the projection function for the calibrated instance of the camera. At the last step, the final image is formed pixel by pixel from the original omnidirectional image using the calculated array of 3D points and the projection function. The developed algorithm makes it possible to obtain an image for a part of the field of view of an omnidirectional optoelectronic system with the distortion corrected from the original omnidirectional image. The algorithm is designed for operation with omnidirectional optoelectronic systems with both catadioptric and fish-eye lenses.
IMAGEP - A FORTRAN ALGORITHM FOR DIGITAL IMAGE PROCESSING
Roth, D. J.
1994-01-01
IMAGEP is a FORTRAN computer algorithm containing various image processing, analysis, and enhancement functions. It is a keyboard-driven program organized into nine subroutines. Within the subroutines are other routines, also selected via keyboard. Some of the functions performed by IMAGEP include digitization, storage and retrieval of images; image enhancement by contrast expansion, addition and subtraction, magnification, inversion, and bit shifting; display and movement of cursor; display of grey level histogram of image; and display of the variation of grey level intensity as a function of image position. This algorithm has possible scientific, industrial, and biomedical applications in material flaw studies, steel and ore analysis, and pathology, respectively. IMAGEP is written in VAX FORTRAN for DEC VAX series computers running VMS. The program requires the use of a Grinnell 274 image processor which can be obtained from Mark McCloud Associates, Campbell, CA. An object library of the required GMR series software is included on the distribution media. IMAGEP requires 1Mb of RAM for execution. The standard distribution medium for this program is a 1600 BPI 9-track magnetic tape in VAX FILES-11 format. It is also available on a TK50 tape cartridge in VAX FILES-11 format. This program was developed in 1991. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation.
Spatial correlation genetic algorithm for fractal image compression
International Nuclear Information System (INIS)
Wu, M.-S.; Teng, W.-C.; Jeng, J.-H.; Hsieh, J.-G.
2006-01-01
Fractal image compression explores the self-similarity property of a natural image and utilizes the partitioned iterated function system (PIFS) to encode it. This technique is of great interest both in theory and application. However, it is time-consuming in the encoding process and this drawback renders it impractical for real-time applications. The time is mainly spent on the search for the best-match block in a large domain pool. In this paper, a spatial correlation genetic algorithm (SC-GA) is proposed to speed up the encoder. There are two stages in the SC-GA method. The first stage makes use of spatial correlations in images for both the domain pool and the range pool to exploit local optima. The second stage operates on the whole image to explore more adequate similarities if the local optima are not satisfactory. With the aid of spatial correlation in images, the encoding time is 1.5 times faster than that of the traditional genetic algorithm method, while the quality of the retrieved image is almost the same. Moreover, about half of the matched blocks come from the correlated space, so fewer bits are required to represent the fractal transform and therefore the compression ratio is also improved.
International Nuclear Information System (INIS)
Gerganov, G.; Kuvandjiev, V.; Dimitrova, I.; Mitev, K.; Kawrakow, I.
2012-01-01
The objective of this work is to present the capabilities of the NUMERICS web platform for evaluation of the performance of image registration algorithms. The NUMERICS platform is a web-accessible tool which provides access to dedicated numerical algorithms for registration and comparison of medical images (http://numerics.phys.uni-sofia.bg). The platform allows comparison of noisy medical images by means of different types of image comparison algorithms, which are based on statistical tests for outliers. The platform also allows 2D image registration with different techniques such as elastic thin-plate spline registration, registration based on rigid transformations, affine transformations, as well as non-rigid image registration based on Mobius transformations. In this work we demonstrate how the platform can be used as a tool for evaluation of the quality of the image registration process. We demonstrate performance evaluation of a deformable image registration technique based on Mobius transformations. The transformations are applied with appropriate cost functions such as mutual information, correlation coefficient, and sum of squared differences. The emphasis is on the results provided by the platform to the user and their interpretation in the context of the performance evaluation of 2D image registration. The NUMERICS image registration and image comparison platform provides detailed statistical information about submitted image registration jobs and can be used to perform quantitative evaluation of the performance of different image registration techniques. (authors)
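Two of the cost functions named above, sum of squared differences and correlation coefficient, are easy to state directly (mutual information is omitted for brevity); the toy images below are hypothetical stand-ins for registered and misregistered pairs:

```python
import numpy as np

def ssd(a, b):
    """Sum of squared differences: zero for a perfect intensity match."""
    return float(np.sum((a - b) ** 2))

def correlation_coefficient(a, b):
    """Pearson correlation of intensities: 1 for linearly related images."""
    return float(np.corrcoef(a.ravel(), b.ravel())[0, 1])

# Toy check: a well-aligned (noisy) copy scores better than a shifted one
# under both measures, which is what a registration optimizer exploits.
rng = np.random.default_rng(0)
fixed = rng.random((32, 32))
aligned = fixed + rng.normal(0.0, 0.01, fixed.shape)
shifted = np.roll(fixed, 3, axis=1) + rng.normal(0.0, 0.01, fixed.shape)
```

A registration algorithm searches transformation parameters to minimize SSD (or maximize correlation); platforms like the one described evaluate how reliably that optimum is found.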
Behaviors study of image registration algorithms in image guided radiation therapy
International Nuclear Information System (INIS)
Zou Lian; Hou Qing
2008-01-01
Objective: Study the behaviors of image registration algorithms, and analyze the elements which influence the performance of image registration. Methods: Pre-known corresponding coordinates were assigned to the reference image and the moving image, and then the influence of region of interest (ROI) selection, transformation function initial parameters and coupled parameter spaces on registration results was studied with a software platform developed in-house. Results: Region of interest selection had a marked influence on registration performance. An improperly chosen ROI resulted in a bad registration. Transformation function initial parameter selection based on pre-known information could improve the accuracy of image registration. Coupled parameter spaces would increase the dependence of the image registration algorithm on ROI selection. Conclusions: It is necessary for clinical IGRT to obtain a ROI selection strategy (depending on the specific commercial software) correlated to tumor sites. Three suggestions for image registration technique developers are automatic selection of the initial parameters of the transformation function based on pre-known information, developing specific image registration algorithms for specific image features, and assembling real-time image registration algorithms according to tumor sites selected by the software user. (authors)
2015-03-26
...the number of speckle samples obtained, laser power and coherence length, spot size, target reflectance, speckle size, and pixels per speckle width... the numerical model developed here and existing theory developed by Hu. A 671 nm diode laser source with a coherence length of 259 +/- 7 µm is reflected...
An Improved Piecewise Linear Chaotic Map Based Image Encryption Algorithm
Directory of Open Access Journals (Sweden)
Yuping Hu
2014-01-01
Full Text Available An image encryption algorithm based on an improved piecewise linear chaotic map (MPWLCM) model was proposed. The algorithm uses the MPWLCM to permute and diffuse the plain image simultaneously. Owing to the sensitivity to initial key values and system parameters, and the ergodicity of the chaotic system, two pseudorandom sequences are designed and used in the permutation and diffusion processes. The order of processing pixels does not follow the index of pixels; instead, pixels are processed from the beginning and the end alternately. Cipher feedback is introduced in the diffusion process. Test results and security analysis show that the scheme not only achieves good encryption results but also has a key space large enough to resist brute-force attacks.
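A common form of the piecewise linear chaotic map, and a key-dependent pixel permutation derived from it, can be sketched as follows (an illustration of the ingredients, not the paper's MPWLCM algorithm; the parameter values are arbitrary):

```python
def pwlcm(x, p):
    """Piecewise linear chaotic map on (0, 1) with control parameter 0 < p < 0.5."""
    if x >= 0.5:          # the map is symmetric about x = 0.5
        x = 1.0 - x
    if x < p:
        return x / p
    return (x - p) / (0.5 - p)

def chaotic_sequence(x0, p, n, burn_in=200):
    """Iterate the map, discarding a transient, to obtain n values in (0, 1)."""
    x = x0
    for _ in range(burn_in):
        x = pwlcm(x, p)
    seq = []
    for _ in range(n):
        x = pwlcm(x, p)
        seq.append(x)
    return seq

def permutation_order(x0, p, n):
    """Rank pixels by their chaotic values: a key-dependent permutation,
    usable for the scrambling (permutation) stage of an image cipher."""
    seq = chaotic_sequence(x0, p, n)
    return sorted(range(n), key=seq.__getitem__)

perm = permutation_order(x0=0.123456789, p=0.31, n=16)
```

Here (`x0`, `p`) act as the key: the same key reproduces the same permutation for decryption, while a slightly different key yields an unrelated one.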
Research on Wavelet-Based Algorithm for Image Contrast Enhancement
Institute of Scientific and Technical Information of China (English)
Wu Ying-qian; Du Pei-jun; Shi Peng-fei
2004-01-01
A novel wavelet-based algorithm for image enhancement is proposed in the paper. On the basis of multiscale analysis, the proposed algorithm efficiently solves the problem of noise over-enhancement, which commonly occurs in traditional methods for contrast enhancement. The decomposed coefficients at the same scale are processed by a nonlinear method, and the coefficients at different scales are enhanced to different degrees. During the procedure, the method takes full advantage of the properties of the human visual system so as to achieve better performance. The simulations demonstrate that these characteristics of the proposed approach enable it to fully enhance the content in images, to efficiently limit the amplification of noise, and to achieve a much better enhancement effect than the traditional approaches.
MODERN POSSIBILITIES OF SPECKLE TRACKING ECHOCARDIOGRAPHY IN CLINICAL PRACTICE
Directory of Open Access Journals (Sweden)
V. S. Nikiforov
2017-01-01
Full Text Available Speckle-tracking echocardiography is a promising modern technique for evaluation of structural and functional changes in the myocardium. It evaluates global longitudinal myocardial strain, which is more sensitive than ejection fraction to early changes in left ventricular contractility. The diagnostic capabilities of speckle-tracking echocardiography are reflected in clinical recommendations and consensus statements of the European Society of Cardiology (ESC, the European Association of Cardiovascular Imaging (EACVI and the American Society of Echocardiography (ASE. The aim of this paper is to describe the basic principles of speckle-tracking echocardiography and the clinical applications of this new technology. Attention is paid to the use of speckle-tracking echocardiography in such heart pathologies as heart failure, coronary heart disease and myocardial infarction, left ventricular hypertrophy in arterial hypertension, hypertrophic cardiomyopathy and amyloidosis of the heart, valvular heart disease, constrictive pericarditis and cancer therapy-induced cardiotoxicity.
A decade of innovation with laser speckle metrology
Ettemeyer, Andreas
2003-05-01
Speckle pattern interferometry has evolved from an experimental substitute for holographic interferometry into a powerful problem-solving tool in research and industry. The rapid development of computer and digital imaging techniques, in combination with miniaturization of the optical equipment, led to new applications which had not been anticipated before. While classical holographic interferometry had always required careful consideration of environmental conditions such as vibration, noise, light, etc. and could generally only be performed in the optical laboratory, it is now state of the art to handle portable speckle measuring equipment at almost any place. During the last decade, the change in design and technique has dramatically influenced the range of applications of speckle metrology and opened new markets. The integration of recent research results into speckle measuring equipment has led to handy instruments, simplified operation, and high-quality data output.
Ponticorvo, A.; Rowland, R.; Baldado, M.; Burmeister, D. M.; Christy, R. J.; Bernal, N.; Durkin, A. J.
2018-02-01
The current standard for assessment of burn severity and subsequent wound healing is clinical examination, which is highly subjective. Accurate early assessment of burn severity is critical for dictating the course of wound management. Complicating matters is the fact that burn wounds are often large and can have multiple regions that vary in severity. In order to manage treatment more effectively, a tool that can provide spatially resolved information for mapping burn severity could aid clinicians when making decisions. Several new technologies focus on burn care in an attempt to help clinicians objectively determine burn severity. By quantifying perfusion, laser speckle imaging (LSI) has had success in categorizing burn wound severity at earlier time points than clinical assessment alone. Additionally, spatial frequency domain imaging (SFDI) is a new technique that can quantify the tissue structural damage associated with burns to achieve earlier categorization of burn severity. Here we compared the performance of a commercial LSI device (PeriCam PSI, Perimed Inc.), an SFDI device (Reflect RS, Modulated Imaging Inc.) and conventional clinical assessment in a controlled (porcine) model of graded burn wound severity over the course of 28 days. Specifically, we focused on the ability of each system to predict the spatial heterogeneity of the healed wound at 28 days, based on the images at an early time point. Spatial heterogeneity was defined by clinical assessment of distinct regions of healing on day 28. Across six pigs, 96 burn wounds (3 cm diameter) were created. Clinical assessment at day 28 indicated that 39 had appeared to heal in a heterogeneous manner. Clinical observation at day 1 found 35 / 39 (90%) to be spatially heterogeneous in terms of burn severity. The LSI system was able to detect spatial heterogeneity of burn severity in 14 / 39 (36%) cases on day 1 and 23 / 39 cases (59%) on day 7. By contrast, the SFDI system was able to...
Target recognition of ladar range images using slice image: comparison of four improved algorithms
Xia, Wenze; Han, Shaokun; Cao, Jingya; Wang, Liang; Zhai, Yu; Cheng, Yang
2017-07-01
Compared with traditional 3-D shape data, ladar range images possess properties of strong noise, shape degeneracy, and sparsity, which make feature extraction and representation difficult. The slice image is an effective feature descriptor to resolve this problem. We propose four improved algorithms for target recognition of ladar range images using the slice image. In order to improve resolution invariance of the slice image, mean-value detection instead of maximum-value detection is applied in all four improved algorithms. In order to improve rotation invariance of the slice image, three new improved feature descriptors (the feature slice image, slice-Zernike moments, and slice-Fourier moments) are applied to the last three improved algorithms, respectively. Backpropagation neural networks are used as feature classifiers in the last two improved algorithms. The performance of these four improved recognition systems is analyzed comprehensively in terms of the three invariances, recognition rate, and execution time. The final experimental results show that the improvements in these four algorithms achieve the desired effect, that the three invariances of the feature descriptors are not directly related to the final recognition performance of the recognition systems, and that the four improved recognition systems perform differently under different conditions.
Mapping Iterative Medical Imaging Algorithm on Cell Accelerator
Directory of Open Access Journals (Sweden)
Meilian Xu
2011-01-01
architectures that exploit data parallel applications, medical imaging algorithms such as OS-SART can be studied to produce increased performance. In this paper, we map OS-SART onto the cell broadband engine (Cell BE). We effectively use the architectural features of the Cell BE to provide an efficient mapping. The Cell BE consists of one PowerPC processor element (PPE) and eight SIMD coprocessors known as synergistic processor elements (SPEs). The limited memory storage on each of the SPEs makes the mapping challenging. Therefore, we present optimization techniques to efficiently map the algorithm onto the Cell BE for improved performance over the CPU version. We compare the performance of our proposed algorithm on the Cell BE to that of the Sun Fire X4600, a shared-memory machine. The Cell BE is five times faster than the AMD Opteron dual-core processor. The speedup of the algorithm on the Cell BE increases with the number of SPEs. We also experiment with various parameters, such as the number of subsets, the number of processing elements, and the number of DMA transfers between main memory and local memory, that impact the performance of the algorithm.
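Independent of the Cell BE mapping, the OS-SART iteration itself can be sketched in a few lines: for each ordered subset of measurements, the weighted residual is back-projected with SART-style row/column normalization. The toy system below is a random matrix standing in for a real projector geometry:

```python
import numpy as np

def os_sart(A, b, n_subsets=4, n_iters=200, lam=1.0):
    """OS-SART: cycle over ordered subsets of rays; for each subset,
    back-project the row-sum-weighted residual and normalize by column sums."""
    m, n = A.shape
    x = np.zeros(n)
    subsets = [np.arange(s, m, n_subsets) for s in range(n_subsets)]
    for _ in range(n_iters):
        for idx in subsets:
            As, bs = A[idx], b[idx]
            row_sums = As.sum(axis=1)
            col_sums = As.sum(axis=0)
            residual = (bs - As @ x) / np.where(row_sums > 0, row_sums, 1.0)
            x += lam * (As.T @ residual) / np.where(col_sums > 0, col_sums, 1.0)
    return x

# Tiny consistent toy system (48 "rays", 8 "pixels"), not a real CT matrix.
rng = np.random.default_rng(0)
A = rng.random((48, 8))
x_true = rng.random(8)
b = A @ x_true
x_rec = os_sart(A, b)
```

Using more subsets gives more updates per pass over the data, which is the source of the ordered-subsets speedup that makes the method attractive for parallel hardware like the Cell BE.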
First results of genetic algorithm application in ML image reconstruction in emission tomography
International Nuclear Information System (INIS)
Smolik, W.
1999-01-01
This paper concerns the application of a genetic algorithm to maximum likelihood image reconstruction in emission tomography. An example genetic algorithm for image reconstruction is presented. The genetic algorithm was based on the typical genetic scheme, modified to suit the nature of the problem being solved. The convergence of the algorithm was examined. Different adaptation functions and selection and crossover methods were verified. The algorithm was tested on simulated SPECT data. The obtained image reconstruction results are discussed. (author)
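As an illustration of the "typical genetic scheme" the abstract refers to (selection, crossover, mutation), here is a minimal, hypothetical sketch. The paper's actual fitness is the ML objective for emission tomography; a toy quadratic stands in for it below, and all parameter choices are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def genetic_maximize(fitness, n_genes, pop_size=40, n_gens=60, p_mut=0.1):
    """Toy GA: tournament selection, one-point crossover, per-gene
    mutation; returns the fittest individual of the final population."""
    pop = rng.random((pop_size, n_genes))
    for _ in range(n_gens):
        fit = np.array([fitness(p) for p in pop])
        # tournament selection: winner of each random pair becomes a parent
        i, j = rng.integers(pop_size, size=(2, pop_size))
        parents = pop[np.where(fit[i] > fit[j], i, j)]
        # one-point crossover between consecutive parents
        cut = rng.integers(1, n_genes, size=pop_size)
        mask = np.arange(n_genes) < cut[:, None]
        children = np.where(mask, parents, np.roll(parents, -1, axis=0))
        # mutation: re-draw a gene with probability p_mut
        m = rng.random(children.shape) < p_mut
        children[m] = rng.random(m.sum())
        pop = children
    fit = np.array([fitness(p) for p in pop])
    return pop[fit.argmax()]

# Toy stand-in objective: the best "image" is the all-0.5 vector.
best = genetic_maximize(lambda x: -np.sum((x - 0.5) ** 2), n_genes=8)
```

For the tomography problem, the individuals would encode pixel intensities and the fitness would be the Poisson log-likelihood of the measured projections.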
Multi-viewpoint Image Array Virtual Viewpoint Rapid Generation Algorithm Based on Image Layering
Jiang, Lu; Piao, Yan
2018-04-01
The use of a multi-view image array combined with virtual viewpoint generation technology to record 3D scene information in large scenes has become one of the key technologies for the development of integral imaging. This paper presents a virtual viewpoint rendering method based on an image layering algorithm. First, the depth information of the reference viewpoint image is quickly obtained, with SAD chosen as the similarity measure function. The reference image is then layered and the parallax is calculated from the depth information. Based on the relative distance between the virtual viewpoint and the reference viewpoint, the image layers are weighted and panned. Finally, the virtual viewpoint image is rendered layer by layer according to the distance between the image layers and the viewer. This method avoids the disadvantages of the DIBR algorithm, such as its high-precision requirements on the depth map and its complex mapping operations. Experiments show that this algorithm can synthesize virtual viewpoints at any position within a 2×2 viewpoint range, and the rendering speed is also very fast. On average, the method achieves satisfactory image quality: the average SSIM value of the results relative to real viewpoint images reaches 0.9525, the PSNR value reaches 38.353, and the image histogram similarity reaches 93.77%.
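The depth-estimation step uses SAD as the similarity measure. A minimal sketch of SAD block matching between a horizontally displaced image pair (an assumed setup; the paper's multi-view geometry and parameters are not specified here) could look like:

```python
import numpy as np

def sad_disparity(left, right, block=3, max_disp=8):
    """For each pixel, pick the horizontal shift d minimizing the sum
    of absolute differences (SAD) over a small block."""
    h, w = left.shape
    r = block // 2
    disp = np.zeros((h, w), dtype=int)
    for y in range(r, h - r):
        for x in range(r, w - r):
            patch = left[y - r:y + r + 1, x - r:x + r + 1].astype(float)
            best_cost, best_d = np.inf, 0
            for d in range(min(max_disp, x - r) + 1):
                cand = right[y - r:y + r + 1, x - d - r:x - d + r + 1]
                cost = np.abs(patch - cand).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

# Synthetic check: a random texture shifted right by 2 pixels should
# yield disparity 2 away from the wrapped border.
rng = np.random.default_rng(0)
right = rng.random((16, 16))
left = np.roll(right, 2, axis=1)
disp = sad_disparity(left, right)
```

The resulting disparity (depth) map is what the layering step would quantize into the image layers that are then weighted and panned.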
Despeckle filtering for ultrasound imaging and video II selected applications
Loizou, Christos P
2015-01-01
In ultrasound imaging and video, visual perception is hindered by multiplicative speckle noise that degrades image quality. Noise reduction is therefore essential for improving visual observation quality, or as a pre-processing step for further automated analysis, such as image/video segmentation, texture analysis, and encoding in ultrasound imaging and video. The goal of the first book (book 1 of 2) was to introduce the problem of speckle in ultrasound images and video as well as the theoretical background, algorithmic steps, and the MATLAB™ code for the following group of despeckle filters:
Understanding the exposure-time effect on speckle contrast measurements for laser displays
Suzuki, Koji; Kubota, Shigeo
2018-02-01
To evaluate the influence of exposure time on speckle noise in laser displays, a speckle contrast measurement method was developed that operates at human-eye response times, using a high-sensitivity camera with a signal-multiplying function. The nonlinearity of the camera's light sensitivity was calibrated to measure speckle contrast accurately, and the lower measurement limit of speckle contrast was improved by applying a spatial-frequency low-pass filter to the captured images. Three commercially available laser displays were measured over a wide range of exposure times, from tens of milliseconds to several seconds, without adjusting the brightness of the displays. The speckle contrast of a raster-scanned mobile projector without any speckle-reduction device was nearly constant over the various exposure times. In contrast, for full-frame projection laser displays equipped with a temporally averaging speckle-reduction device, some speckle contrasts close to the lower measurement limit increased slightly at shorter exposure times due to noise. As a result, the exposure-time effect on speckle contrast could not be observed in our measurements, although it is more reasonable to think that the speckle contrasts of laser displays equipped with a temporally averaging speckle-reduction device depend on the exposure time. This discrepancy may be attributed to an underestimation of the temporal averaging factor. We expect this method to be useful for evaluating various laser displays and for clarifying the relationship between speckle noise and exposure time in further verification of speckle reduction.
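The quantity being measured here is the speckle contrast, conventionally defined as the ratio of the standard deviation to the mean of the detected intensity. A small numpy sketch of this definition (the synthetic inputs are assumptions, not the paper's measurements):

```python
import numpy as np

def speckle_contrast(img):
    """Speckle contrast C = (standard deviation) / (mean) of intensity."""
    img = np.asarray(img, dtype=float)
    return img.std() / img.mean()

# Fully developed speckle has negative-exponential intensity statistics
# and hence C close to 1; a perfectly uniform field has C = 0.
rng = np.random.default_rng(1)
c_speckle = speckle_contrast(rng.exponential(1.0, size=100_000))
c_uniform = speckle_contrast(np.full(1000, 5.0))
```

Temporal averaging of N independent speckle patterns is expected to reduce C roughly as 1/sqrt(N), which is why longer exposures should lower the measured contrast for displays with temporally averaging speckle reduction.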
Comparison of segmentation algorithms for fluorescence microscopy images of cells.
Dima, Alden A; Elliott, John T; Filliben, James J; Halter, Michael; Peskin, Adele; Bernal, Javier; Kociolek, Marcin; Brady, Mary C; Tang, Hai C; Plant, Anne L
2011-07-01
The analysis of fluorescence microscopy images of cells often requires the determination of cell edges. This is typically done using segmentation techniques that separate the cell objects in an image from the surrounding background. This study compares segmentation results from nine different segmentation techniques applied to two different cell lines and five different sets of imaging conditions. Significant variability in the results of segmentation was observed, due solely to differences in imaging conditions or the application of different algorithms. We quantified and compared the results with a novel bivariate similarity index metric that evaluates the degree to which a cell object is underestimated or overestimated. The results show that commonly used threshold-based segmentation techniques are less accurate than k-means clustering with multiple clusters. Segmentation accuracy varies with imaging conditions that determine the sharpness of cell edges and with geometric features of a cell. Based on this observation, we propose a method that quantifies cell edge character to provide an estimate of how accurately an algorithm will perform. The results of this study will assist the development of criteria for evaluating interlaboratory comparability. Published 2011 Wiley-Liss, Inc.
Modified Discrete Grey Wolf Optimizer Algorithm for Multilevel Image Thresholding
Directory of Open Access Journals (Sweden)
Linguo Li
2017-01-01
Full Text Available The computation of image segmentation has become more complicated with the increasing number of thresholds, and the selection and application of thresholds in image thresholding has become an NP-hard problem. This paper puts forward the modified discrete grey wolf optimizer algorithm (MDGWO), which improves the optimal-solution updating mechanism of the search agent by means of weights. Taking Kapur's entropy as the optimized function and exploiting the discreteness of thresholds in image segmentation, the paper first discretizes the grey wolf optimizer (GWO) and then proposes a new attack strategy that uses a weight coefficient to replace the search formula for the optimal solution used in the original algorithm. The experimental results show that MDGWO can find the optimal thresholds efficiently and precisely; they are very close to the results obtained by exhaustive search. In comparison with electromagnetism optimization (EMO), differential evolution (DE), the artificial bee colony (ABC), and the classical GWO, it is concluded that MDGWO has advantages over the latter four in terms of image segmentation quality, objective function values, and their stability.
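The objective MDGWO maximizes is Kapur's entropy. For a single threshold it can still be evaluated exhaustively, which is the baseline the paper compares against; a numpy sketch (with a synthetic bimodal histogram as an assumed example) follows:

```python
import numpy as np

def kapur_entropy(hist, t):
    """Kapur's objective for one threshold t: the sum of the Shannon
    entropies of the two classes induced by the threshold."""
    p = hist / hist.sum()
    total = 0.0
    for cls in (p[:t], p[t:]):
        w = cls.sum()
        if w <= 0:
            return -np.inf          # empty class: invalid threshold
        q = cls[cls > 0] / w
        total += -(q * np.log(q)).sum()
    return total

# Synthetic bimodal histogram (assumed example): grey levels 50-70 and
# 150-170, equally populated. Exhaustive search over all thresholds.
img = np.concatenate([np.repeat(np.arange(50, 71), 20),
                      np.repeat(np.arange(150, 171), 20)])
hist, _ = np.histogram(img, bins=256, range=(0, 256))
best_t = max(range(1, 256), key=lambda t: kapur_entropy(hist, t))
```

With multiple thresholds the search space grows combinatorially, which is exactly where a metaheuristic such as the modified grey wolf optimizer replaces the exhaustive loop.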
Algorithms and Array Design Criteria for Robust Imaging in Interferometry
Kurien, Binoy George
Optical interferometry is a technique for obtaining high-resolution imagery of a distant target by interfering light from multiple telescopes. Image restoration from interferometric measurements poses a unique set of challenges. The first challenge is that the measurement set provides only a sparse-sampling of the object's Fourier Transform and hence image formation from these measurements is an inherently ill-posed inverse problem. Secondly, atmospheric turbulence causes severe distortion of the phase of the Fourier samples. We develop array design conditions for unique Fourier phase recovery, as well as a comprehensive algorithmic framework based on the notion of redundant-spaced-calibration (RSC), which together achieve reliable image reconstruction in spite of these challenges. Within this framework, we see that classical interferometric observables such as the bispectrum and closure phase can limit sensitivity, and that generalized notions of these observables can improve both theoretical and empirical performance. Our framework leverages techniques from lattice theory to resolve integer phase ambiguities in the interferometric phase measurements, and from graph theory, to select a reliable set of generalized observables. We analyze the expected shot-noise-limited performance of our algorithm for both pairwise and Fizeau interferometric architectures and corroborate this analysis with simulation results. We apply techniques from the field of compressed sensing to perform image reconstruction from the estimates of the object's Fourier coefficients. The end result is a comprehensive strategy to achieve well-posed and easily-predictable reconstruction performance in optical interferometry.
Information theoretic methods for image processing algorithm optimization
Prokushkin, Sergey F.; Galil, Erez
2015-01-01
Modern image processing pipelines (e.g., those used in digital cameras) are full of advanced, highly adaptive filters that often have a large number of tunable parameters (sometimes > 100). This makes the calibration procedure for these filters very complex, and optimal results are barely achievable by manual calibration; thus an automated approach is a must. We discuss an information-theory-based metric for evaluating the adaptive characteristics of an algorithm (an "adaptivity criterion"), using noise reduction algorithms as an example. The method allows finding an "orthogonal decomposition" of the filter parameter space into "filter adaptivity" and "filter strength" directions. This metric can be used as a cost function in automatic filter optimization. Since it is a measure of physical "information restoration" rather than perceived image quality, it helps to reduce the set of filter parameters to a smaller subset that is easier for a human operator to tune to achieve better subjective image quality. With appropriate adjustments, the criterion can be used for assessment of the whole imaging system (sensor plus post-processing).
Pomegranate MR images analysis using ACM and FCM algorithms
Morad, Ghobad; Shamsi, Mousa; Sedaaghi, M. H.; Alsharif, M. R.
2011-10-01
Segmentation plays an important role in image processing applications. In this paper, segmentation of pomegranate magnetic resonance (MR) images is explored. Pomegranate has valuable nutritional and medicinal properties, and its maturity indices and the quality of its internal tissues play an important role in the sorting process, where an admissible determination of these features cannot easily be achieved by a human operator. Seeds and soft tissues are the main internal components of pomegranate. For research purposes, such as non-destructive investigation, to determine the ripening index and the percentage of seeds during the growth period, segmentation of the internal structures should be performed as exactly as possible. In this paper, we present an automatic algorithm to segment the internal structure of pomegranate. Since the intensity of the stem and calyx is close to that of the internal tissues, stem and calyx pixels are usually mislabeled as internal tissue by segmentation algorithms. To solve this problem, the fruit shape is first extracted from its background using an active contour model (ACM). Then the stem and calyx are removed using morphological filters. Finally, the image is segmented by fuzzy c-means (FCM). The experimental results show an accuracy of 95.91% in the presence of the stem and calyx, while the segmentation accuracy increases to 97.53% when the stem and calyx are first removed by morphological filters.
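The final segmentation step uses fuzzy c-means. A minimal 1-D intensity FCM sketch follows; the deterministic initialization from the intensity range and the toy two-population data are assumptions for illustration, not taken from the paper:

```python
import numpy as np

def fcm(pixels, n_clusters=2, m=2.0, n_iters=50):
    """Minimal fuzzy c-means on a 1-D intensity vector: alternate the
    membership update and the weighted-centroid update."""
    x = np.asarray(pixels, dtype=float)
    centers = np.linspace(x.min(), x.max(), n_clusters)
    for _ in range(n_iters):
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = d ** (-2.0 / (m - 1.0))       # fuzzy memberships
        u /= u.sum(axis=1, keepdims=True)
        centers = (u ** m).T @ x / (u ** m).sum(axis=0)
    return centers, u

# Two noisy intensity populations (assumed toy data, not MR images).
rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(0.2, 0.02, 200),
                         rng.normal(0.8, 0.02, 200)])
centers, u = fcm(pixels)
```

A hard segmentation is then obtained by assigning each pixel to its maximum-membership cluster (`u.argmax(axis=1)`), which is where misassigned stem and calyx pixels would appear if they were not removed first.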
An Algorithm for Pedestrian Detection in Multispectral Image Sequences
Kniaz, V. V.; Fedorenko, V. V.
2017-05-01
The growing interest in self-driving cars creates a demand for scene understanding and obstacle detection algorithms. One of the most challenging problems in this field is pedestrian detection. The main difficulties arise from the diverse appearance of pedestrians. Poor visibility conditions, such as fog and low light, also significantly decrease the quality of pedestrian detection. This paper presents a new optical-flow-based algorithm, BipedDetect, that provides robust pedestrian detection on a single-board computer. The algorithm is based on the idea of simplified Kalman filtering suitable for realization on modern single-board computers. To detect a pedestrian, a synthetic optical flow of the scene without pedestrians is generated using a slanted-plane model. An estimate of the real optical flow is generated using a multispectral image sequence. The difference between the synthetic and real optical flows yields the optical flow induced by pedestrians. The final detection of pedestrians is done by segmenting this difference of optical flows. To evaluate the BipedDetect algorithm, a multispectral dataset was collected using a mobile robot.
Barcelos, Amanda; Tibirica, Eduardo; Lamas, Cristiane
2018-07-01
To evaluate the systemic microcirculation of patients with infective endocarditis (IE). This is a comparative study of patients with definite IE by the modified Duke criteria admitted to our center for treatment. A reference group of sex- and age-matched healthy volunteers was included. Microvascular flow was evaluated in the forearm using a laser speckle contrast imaging system, for noninvasive measurement of cutaneous microvascular perfusion, in combination with skin iontophoresis of acetylcholine (ACh) and sodium nitroprusside (SNP) to test microvascular reactivity. Microvascular density was evaluated using skin video-capillaroscopy. We studied 22 patients with IE; 15 were male and seven female. The mean age and standard deviation (SD) were 45.5 ± 17.3 years. Basal skin microvascular conductance was significantly increased in patients with IE, compared with healthy individuals (0.36 ± 0.13 versus 0.21 ± 0.08 APU/mmHg; P < 0.0001). The increase in microvascular conductance induced by ACh in patients was 0.21 ± 0.17 and in the reference group, it was 0.37 ± 0.14 APU/mmHg (P = 0.0012). The increase in microvascular conductance induced by SNP in patients was 0.18 ± 0.14 and it was 0.29 ± 0.15 APU/mmHg (P = 0.0140) in the reference group. The basal mean skin capillary density of patients (135 ± 24 capillaries/mm²) was significantly higher, compared with controls (97 ± 21 capillaries/mm²; P < 0.0001). The main findings in the microcirculation of patients with IE were greater basal vasodilation and a reduction of the endothelium-dependent and -independent microvascular reactivity, as well as greater functional skin capillary density compared to healthy individuals. Copyright © 2018 Elsevier Inc. All rights reserved.
Hybrid wavefront sensing and image correction algorithm for imaging through turbulent media
Wu, Chensheng; Robertson Rzasa, John; Ko, Jonathan; Davis, Christopher C.
2017-09-01
It is well known that passive image correction of turbulence distortions often involves using geometry-dependent deconvolution algorithms. On the other hand, active imaging techniques using adaptive optic correction should use the distorted wavefront information for guidance. Our work shows that a hybrid hardware-software approach is possible to obtain accurate and highly detailed images through turbulent media. The processing algorithm also takes much fewer iteration steps in comparison with conventional image processing algorithms. In our proposed approach, a plenoptic sensor is used as a wavefront sensor to guide post-stage image correction on a high-definition zoomable camera. Conversely, we show that given the ground truth of the highly detailed image and the plenoptic imaging result, we can generate an accurate prediction of the blurred image on a traditional zoomable camera. Similarly, the ground truth combined with the blurred image from the zoomable camera would provide the wavefront conditions. In application, our hybrid approach can be used as an effective way to conduct object recognition in a turbulent environment where the target has been significantly distorted or is even unrecognizable.
RESEARCH ON AIRBORNE SAR IMAGING BASED ON ESC ALGORITHM
Directory of Open Access Journals (Sweden)
X. T. Dong
2017-09-01
Full Text Available Owing to its ability to obtain abundant information flexibly, accurately, and quickly, airborne SAR is significant in the field of Earth observation and many other applications. Ideally the flight paths are straight lines, but in reality some deviation from the ideal path is impossible to avoid. A small disturbance from the ideal line has a major effect on the signal phase, dramatically deteriorating the quality of SAR images and data. Therefore, to get accurate echo information and radar images, it is essential to measure and compensate for the nonlinear motion of the antenna trajectory. By compensating each flight trajectory to its reference track, the MOCO method corrects the linear and quadratic phase errors caused by nonlinear antenna trajectories. Position and Orientation System (POS) data are used to acquire accurate motion attitudes and spatial positions of the antenna phase centre (APC). In this paper, the extended chirp scaling (ECS) algorithm is used to process the echo data of airborne SAR. An experiment is carried out using VV-polarization raw data from a C-band airborne SAR. Quality evaluations of compensated and uncompensated SAR images are performed in the experiment; the former always outperform the latter. After MOCO processing, azimuth ambiguity declines, the peak side-lobe ratio (PSLR) improves effectively, and the resolution of the images is improved markedly. The result shows the validity and operability of the imaging process for airborne SAR.
Research on Airborne SAR Imaging Based on Esc Algorithm
Dong, X. T.; Yue, X. J.; Zhao, Y. H.; Han, C. M.
2017-09-01
Owing to its ability to obtain abundant information flexibly, accurately, and quickly, airborne SAR is significant in the field of Earth observation and many other applications. Ideally the flight paths are straight lines, but in reality some deviation from the ideal path is impossible to avoid. A small disturbance from the ideal line has a major effect on the signal phase, dramatically deteriorating the quality of SAR images and data. Therefore, to get accurate echo information and radar images, it is essential to measure and compensate for the nonlinear motion of the antenna trajectory. By compensating each flight trajectory to its reference track, the MOCO method corrects the linear and quadratic phase errors caused by nonlinear antenna trajectories. Position and Orientation System (POS) data are used to acquire accurate motion attitudes and spatial positions of the antenna phase centre (APC). In this paper, the extended chirp scaling (ECS) algorithm is used to process the echo data of airborne SAR. An experiment is carried out using VV-polarization raw data from a C-band airborne SAR. Quality evaluations of compensated and uncompensated SAR images are performed in the experiment; the former always outperform the latter. After MOCO processing, azimuth ambiguity declines, the peak side-lobe ratio (PSLR) improves effectively, and the resolution of the images is improved markedly. The result shows the validity and operability of the imaging process for airborne SAR.
Utilizing Minkowski functionals for image analysis: a marching square algorithm
International Nuclear Information System (INIS)
Mantz, Hubert; Jacobs, Karin; Mecke, Klaus
2008-01-01
Comparing noisy experimental image data with statistical models requires a quantitative analysis of grey-scale images beyond mean values and two-point correlations. A real-space image analysis technique is introduced for digitized grey-scale images, based on Minkowski functionals of thresholded patterns. A novel feature of this marching square algorithm is the use of weighted side lengths for pixels, so that boundary lengths are captured accurately. As examples to illustrate the technique, we study surface topologies emerging during the dewetting process of thin films and analyse spinodal decomposition as well as turbulent patterns in chemical reaction-diffusion systems. The grey-scale value corresponds to the height of the film or to the concentration of chemicals, respectively. Comparison with analytic calculations in stochastic geometry models reveals a remarkable agreement of the examples with a Gaussian random field. Thus, a statistical test for non-Gaussian features in experimental data becomes possible with this image analysis technique, even for small image sizes. Implementations of the software used for the analysis are offered for download.
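For orientation, the three 2-D Minkowski functionals (area, boundary length, Euler characteristic) can be computed for a thresholded pattern by plain cell counting. The sketch below treats pixels as closed unit squares and deliberately does not implement the paper's weighted-side-length marching-square refinement, so its boundary lengths are the coarser pixel-edge counts:

```python
import numpy as np

def minkowski_2d(binary):
    """Area, boundary length, and Euler characteristic of a binary
    pattern, treating pixels as closed unit squares."""
    P = np.pad(np.asarray(binary, dtype=bool), 1)
    area = int(P.sum())
    # boundary edges: where an occupied pixel face meets an empty one
    perimeter = int(np.sum(P[:-1, :] ^ P[1:, :]) +
                    np.sum(P[:, :-1] ^ P[:, 1:]))
    # Euler characteristic from the pixel cell complex: V - E + F
    V = int((P[:-1, :-1] | P[:-1, 1:] | P[1:, :-1] | P[1:, 1:]).sum())
    E = int((P[:-1, :] | P[1:, :]).sum() + (P[:, :-1] | P[:, 1:]).sum())
    return area, perimeter, V - E + area

# Two disjoint 3x3 squares: area 18, boundary 24, two components.
two = np.zeros((10, 10), dtype=int)
two[1:4, 1:4] = 1
two[6:9, 6:9] = 1
# A 5x5 square with a one-pixel hole: one component minus one hole.
holed = np.zeros((8, 8), dtype=int)
holed[1:6, 1:6] = 1
holed[3, 3] = 0
```

Sweeping the threshold over a grey-scale image and plotting these three functionals against the threshold is the basic form of the Gaussian-random-field test described in the abstract.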
Pelet, S; Previte, M J R; Laiho, L H; So, P T C
2004-10-01
Global fitting algorithms have been shown to effectively improve the accuracy and precision of the analysis of fluorescence lifetime imaging microscopy data. Global analysis performs better than unconstrained data fitting when prior information exists, such as the spatial invariance of the lifetimes of individual fluorescent species. The highly coupled nature of global analysis often results in significantly slower convergence of the data fitting algorithm compared with unconstrained analysis. Convergence speed can be greatly accelerated by providing appropriate initial guesses. Recognizing that image morphology often correlates with fluorophore distribution, a global fitting algorithm has been developed that assigns initial guesses throughout an image based on a segmentation analysis. This algorithm was tested on both simulated data sets and time-domain lifetime measurements. We have successfully measured fluorophore distribution in fibroblasts stained with Hoechst and calcein. This method further allows second harmonic generation from collagen and elastin autofluorescence to be differentiated in fluorescence lifetime imaging microscopy images of ex vivo human skin. In our experimental measurements, this algorithm increased convergence speed by over two orders of magnitude and achieved significantly better fits. Copyright 2004 Biophysical Society
Laser-induced speckle scatter patterns in Bacillus colonies
Directory of Open Access Journals (Sweden)
Huisung Kim
2014-10-01
Full Text Available Label-free bacterial colony phenotyping technology called BARDOT (BActerial Rapid Detection using Optical scattering Technology) has provided successful classification of several different bacteria at the genus, species, and serovar level. Recent experiments with colonies of Bacillus species showed strikingly different characteristics in their elastic light scatter (ELS) patterns, which were comprised of random speckles, compared to other bacteria, whose patterns are dominated by concentric rings and spokes. Since this laser-based optical sensor interrogates the whole volume of the colony, 3-D information on micro- and macro-structures is encoded in the far-field scatter patterns. Here, we present a theoretical model explaining the underlying mechanism of speckle formation by colonies of Bacillus species. Except for Bacillus polymyxa, all Bacillus spp. produced random bright spots on the imaging plane, which presumably depend on the cellular and molecular organization and content within the colony. Our scatter-model-based analysis revealed that colony spread resulting in variable surface roughness can modify the wavefront of the scatter field. As the center diameter of a Bacillus spp. colony grew from 500 μm to 900 μm, the average speckle area decreased 2-fold and the number of small speckles increased 7-fold. In conclusion, as a Bacillus colony grows, the average speckle size in the scatter pattern decreases and the number of smaller speckles increases due to the swarming growth characteristics of the bacteria within the colony.
A Fast and Efficient Thinning Algorithm for Binary Images
Directory of Open Access Journals (Sweden)
Tarik Abu-Ain
2014-11-01
Full Text Available Skeletonization, also known as thinning, is an important step in the pre-processing phase of many pattern recognition techniques. The output of the skeletonization process is the skeleton of the pattern in the image. Skeletonization is a crucial process for many applications, such as OCR and writer identification. However, improvements in this area are only a recent phenomenon and still require more research. In this paper, a new skeletonization algorithm is proposed. This algorithm combines parallel and sequential strategies, and is categorized as an iterative approach. The suggested method is evaluated by experiments on a benchmark dataset, obtaining much better results than the other thinning methods discussed in the comparison section.
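The paper's own algorithm is not reproduced here; as a reference point for the iterative parallel thinning family it belongs to, a compact numpy implementation of the classical Zhang-Suen scheme is sketched below:

```python
import numpy as np

def zhang_suen(img):
    """Classical Zhang-Suen iterative thinning of a 0/1 image: two
    parallel subiterations per pass delete border pixels until the
    thin skeleton is stable."""
    im = img.astype(bool).copy()
    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            p = np.pad(im, 1)
            # the 8 neighbours of every pixel, clockwise from north
            n = [p[:-2, 1:-1], p[:-2, 2:], p[1:-1, 2:], p[2:, 2:],
                 p[2:, 1:-1], p[2:, :-2], p[1:-1, :-2], p[:-2, :-2]]
            B = sum(x.astype(int) for x in n)            # neighbour count
            A = sum(((~n[i]) & n[(i + 1) % 8]).astype(int)
                    for i in range(8))                   # 0->1 transitions
            if step == 0:
                cond = ~(n[0] & n[2] & n[4]) & ~(n[2] & n[4] & n[6])
            else:
                cond = ~(n[0] & n[2] & n[6]) & ~(n[0] & n[4] & n[6])
            kill = im & (B >= 2) & (B <= 6) & (A == 1) & cond
            if kill.any():
                im &= ~kill
                changed = True
    return im.astype(int)

# A filled 5x10 block thins down to (roughly) its medial line.
skeleton = zhang_suen(np.ones((5, 10), dtype=int))
```

Each subiteration examines all pixels in parallel against its deletion conditions, while the alternation between the two subiterations is the sequential component; hybrid methods like the one proposed vary exactly this interplay.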
Real time processor for array speckle interferometry
International Nuclear Information System (INIS)
Chin, G.; Florez, J.; Borelli, R.; Fong, W.; Miko, J.; Trujillo, C.
1989-01-01
With the construction of several new large-aperture telescopes and the development of large-format array detectors in the near IR, the ability to obtain diffraction-limited seeing via IR array speckle interferometry offers a powerful tool. We are constructing a real-time processor to acquire image frames, perform array flat-fielding, execute a 64 x 64 element 2D complex FFT, and average the power spectrum, all within the 25 msec coherence time for speckles at near-IR wavelengths. The processor is a compact unit controlled by a PC with real-time display and data storage capability. It provides the ability to optimize observations and obtain results at the telescope rather than waiting several weeks before the data can be analyzed and viewed with off-line methods.
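The per-frame pipeline described (flat-field, 2-D FFT, power-spectrum accumulation) is Labeyrie-style speckle interferometry. A minimal numpy sketch of the software equivalent of that pipeline (the synthetic shifted frames are an assumed test case):

```python
import numpy as np

def average_power_spectrum(frames, flat=None):
    """Flat-field each speckle frame, take its 2-D FFT, and accumulate
    the power spectrum (Labeyrie's method): the average retains
    diffraction-limited information that a long exposure blurs away."""
    acc = None
    for frame in frames:
        f = frame / flat if flat is not None else frame
        power = np.abs(np.fft.fft2(f)) ** 2
        acc = power if acc is None else acc + power
    return acc / len(frames)

# Circular shifts only change the Fourier phase, so the averaged power
# spectrum of randomly shifted copies equals that of the object itself.
rng = np.random.default_rng(0)
obj = rng.random((8, 8))
frames = [np.roll(obj, (int(rng.integers(8)), int(rng.integers(8))),
                  axis=(0, 1)) for _ in range(16)]
avg = average_power_spectrum(frames)
```

The hardware processor does exactly this accumulation per 25 ms frame; phase recovery (e.g. via the bispectrum, as in the planetary imaging work above) is then a separate step applied to the accumulated statistics.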
Adabi, Saba; Turani, Zahra; Fatemizadeh, Emad; Clayton, Anne; Nasiriavanaki, Mohammadreza
2017-01-01
Optical coherence tomography (OCT) delivers 3-dimensional images of tissue microstructures. Although OCT imaging offers a promising high-resolution method, OCT images experience some artifacts that lead to misapprehension of tissue structures. Speckle, intensity decay, and blurring are 3 major artifacts in OCT images. Speckle is due to the low-coherence light source used in the configuration of OCT. Intensity decay is a deterioration of light with respect to depth, and blurring is the consequence of deficiencies of optical components. In this short review, we summarize some of the image enhancement algorithms for OCT images which address the abovementioned artifacts. PMID:28638245
Directory of Open Access Journals (Sweden)
Saba Adabi
2017-06-01
Full Text Available Optical coherence tomography (OCT) delivers 3-dimensional images of tissue microstructures. Although OCT imaging offers a promising high-resolution method, OCT images experience some artifacts that lead to misapprehension of tissue structures. Speckle, intensity decay, and blurring are 3 major artifacts in OCT images. Speckle is due to the low-coherence light source used in the configuration of OCT. Intensity decay is a deterioration of light with respect to depth, and blurring is the consequence of deficiencies of optical components. In this short review, we summarize some of the image enhancement algorithms for OCT images which address the abovementioned artifacts.
The SUMO Ship Detector Algorithm for Satellite Radar Images
Directory of Open Access Journals (Sweden)
Harm Greidanus
2017-03-01
Full Text Available Search for Unidentified Maritime Objects (SUMO is an algorithm for ship detection in satellite Synthetic Aperture Radar (SAR images. It has been developed over the course of more than 15 years, using a large amount of SAR images from almost all available SAR satellites operating in L-, C- and X-band. As validated by benchmark tests, it performs very well on a wide range of SAR image modes (from Spotlight to ScanSAR and resolutions (from 1–100 m and for all types and sizes of ships, within the physical limits imposed by the radar imaging. This paper describes, in detail, the algorithmic approach in all of the steps of the ship detection: land masking, clutter estimation, detection thresholding, target clustering, ship attribute estimation and false alarm suppression. SUMO is a pixel-based CFAR (Constant False Alarm Rate detector for multi-look radar images. It assumes a K distribution for the sea clutter, corrected however for deviations of the actual sea clutter from this distribution, implementing a fast and robust method for the clutter background estimation. The clustering of detected pixels into targets (ships uses several thresholds to deal with the typically irregular distribution of the radar backscatter over a ship. In a multi-polarization image, the different channels are fused. Azimuth ambiguities, a common source of false alarms in ship detection, are removed. A reliability indicator is computed for each target. In post-processing, using the results of a series of images, additional false alarms from recurrent (fixed targets including range ambiguities are also removed. SUMO can run in semi-automatic mode, where an operator can verify each detected target. It can also run in fully automatic mode, where batches of over 10,000 images have successfully been processed in less than two hours. The number of satellite SAR systems keeps increasing, as does their application to maritime surveillance. The open data policy of the EU
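SUMO's core detector is a pixel-based CFAR over a K-distributed clutter model. The sketch below substitutes the simplest cell-averaging variant (a plain multiple of a local mean estimated in a training ring with guard cells), which conveys the structure of a CFAR detector but not SUMO's clutter statistics, multi-threshold clustering, or false-alarm suppression; all data and parameters are assumed:

```python
import numpy as np

def cfar_detect(img, guard=2, train=4, factor=5.0):
    """Toy cell-averaging CFAR: flag a pixel when it exceeds a multiple
    of the local clutter mean estimated in a training ring (guard cells
    excluded) around it."""
    h, w = img.shape
    det = np.zeros((h, w), dtype=bool)
    r_out, r_in = guard + train, guard
    for y in range(r_out, h - r_out):
        for x in range(r_out, w - r_out):
            outer = img[y - r_out:y + r_out + 1, x - r_out:x + r_out + 1]
            inner = img[y - r_in:y + r_in + 1, x - r_in:x + r_in + 1]
            clutter = (outer.sum() - inner.sum()) / (outer.size - inner.size)
            det[y, x] = img[y, x] > factor * clutter
    return det

# Exponential "sea clutter" with one bright point target (toy data).
rng = np.random.default_rng(3)
img = rng.exponential(1.0, size=(24, 24))
img[10, 10] = 50.0
det = cfar_detect(img)
```

In a real SAR detector the multiplier is derived from the assumed clutter distribution (here a K distribution) so that the false alarm rate stays constant as the background level varies, which is what "CFAR" denotes.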
Directory of Open Access Journals (Sweden)
Taek Seo Jung
2006-03-01
Full Text Available This paper presents an Image Motion Compensation (IMC) algorithm for Korea's Communication, Ocean, and Meteorological Satellite (COMS-1). An IMC algorithm is a priority component of image registration in the Image Navigation and Registration (INR) system, used to locate and register radiometric image data. Due to various perturbations, a satellite has orbit and attitude errors with respect to a reference motion. These errors cause depointing of the imager's aiming direction and, in consequence, image distortions. To correct the depointing of the imager's aiming direction, a compensation algorithm is designed by adapting equations different from those used for the GOES satellites. The capability of the algorithm is compared with that of the existing algorithm applied to the GOES INR system. The algorithm developed in this paper improves pointing accuracy by 40% and efficiently compensates for the depointing of the imager's aiming direction.
Pelet, S.; Previte, M.J.R.; Laiho, L.H.; So, P.T. C.
2004-01-01
Global fitting algorithms have been shown to improve effectively the accuracy and precision of the analysis of fluorescence lifetime imaging microscopy data. Global analysis performs better than unconstrained data fitting when prior information exists, such as the spatial invariance of the lifetimes of individual fluorescent species. The highly coupled nature of global analysis often results in a significantly slower convergence of the data fitting algorithm as compared with unconstrained ana...
International Nuclear Information System (INIS)
Delakis, Ioannis; Hammad, Omer; Kitney, Richard I
2007-01-01
Wavelet-based de-noising has been shown to improve image signal-to-noise ratio in magnetic resonance imaging (MRI) while maintaining spatial resolution. Wavelet-based de-noising techniques typically implemented in MRI require that noise displays uniform spatial distribution. However, images acquired with parallel MRI have spatially varying noise levels. In this work, a new algorithm for filtering images with parallel MRI is presented. The proposed algorithm extracts the edges from the original image and then generates a noise map from the wavelet coefficients at finer scales. The noise map is zeroed at locations where edges have been detected and directional analysis is also used to calculate noise in regions of low-contrast edges that may not have been detected. The new methodology was applied on phantom and brain images and compared with other applicable de-noising techniques. The performance of the proposed algorithm was shown to be comparable with other techniques in central areas of the images, where noise levels are high. In addition, finer details and edges were maintained in peripheral areas, where noise levels are low. The proposed methodology is fully automated and can be applied on final reconstructed images without requiring sensitivity profiles or noise matrices of the receiver coils, therefore making it suitable for implementation in a clinical MRI setting.
Directory of Open Access Journals (Sweden)
Alexandrov D.A.
2014-12-01
Purpose: to establish the influence of complete ischemia of different durations, and of subsequent reperfusion, on the development of pancreatic pathology in rats by means of laser speckle visualization and intravital digital microscopy. Materials and Methods: The work was performed on 42 white Wistar rats weighing 200-250 g. Blood flow properties were studied by laser Doppler flowmetry, digital biomicroscopy, and laser speckle-contrast visualization. Results: After the end of a 5-minute complete ischemia, blood flow velocity increased two- to three-fold, and no clinical pancreatic necrosis developed. After the end of a 20-minute complete ischemia, blood flow velocity did not increase, and morphological and clinical signs of pancreatic necrosis appeared. Conclusion: Full-field laser speckle imaging was shown to be effective for monitoring pancreatic microhemodynamics in rats. Multidirectional phases of perfusion changes in the pancreas were revealed after reversible interruption of the blood supply of different durations.
Development of Image Reconstruction Algorithms in electrical Capacitance Tomography
International Nuclear Information System (INIS)
Fernandez Marron, J. L.; Alberdi Primicia, J.; Barcala Riveira, J. M.
2007-01-01
Electrical Capacitance Tomography (ECT) has not yet seen wide industrial deployment. This is due, first, to the difficulty of measuring very small capacitances (in the femtofarad range) and, second, to the problem of reconstructing the images on-line. The latter problem is aggravated by the small number of electrodes (at most 16), which causes the usual reconstruction algorithms to produce large errors. In this work, a new, purely geometrical method that could be used for this purpose is described. (Author) 4 refs
Scheduling algorithms for rapid imaging using agile Cubesat constellations
Nag, Sreeja; Li, Alan S.; Merrick, James H.
2018-02-01
Distributed Space Missions, such as formation flight and constellations, are being recognized as important Earth Observation solutions to increase measurement samples over space and time. Cubesats are increasing in size (27U, ∼40 kg in development) with increasing capabilities to host imager payloads. Given the precise attitude control systems emerging in the commercial market, Cubesats now have the ability to slew and capture images within short notice. We propose a modular framework that combines orbital mechanics, attitude control and scheduling optimization to plan the time-varying, full-body orientation of agile Cubesats in a constellation such that they maximize the number of observed images and observation time, within the constraints of Cubesat hardware specifications. The attitude control strategy combines bang-bang and PD control, with constraints such as power consumption, response time and stability factored into the optimality computations, and a possible extension to PID control to account for disturbances. Schedule optimization is performed using dynamic programming with two levels of heuristics, verified and improved upon using mixed integer linear programming. The automated scheduler is expected to run on ground station resources, with the resultant schedules uplinked to the satellites for execution; however, it can be adapted for onboard scheduling, contingent on Cubesat hardware and software upgrades. The framework is generalizable over small steerable spacecraft, sensor specifications, imaging objectives and regions of interest, and is demonstrated using multiple 20 kg satellites in Low Earth Orbit for two case studies - rapid imaging of Landsat's land and coastal images and extended imaging of global, warm water coral reefs. The proposed algorithm captures up to 161% more Landsat images than nadir-pointing sensors with the same field of view, on a 2-satellite constellation over a 12-h simulation. Integer programming was able to verify that
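The schedule optimization described above uses dynamic programming with heuristics. As a much-simplified sketch of the underlying idea (hypothetical imaging windows, not the authors' attitude-aware model), weighted interval scheduling selects a maximum-value set of non-overlapping observations:

```python
import bisect

def schedule(observations):
    """Weighted interval scheduling by dynamic programming.
    observations: list of (start, end, value) imaging opportunities;
    returns the maximum total value of non-overlapping observations."""
    obs = sorted(observations, key=lambda o: o[1])  # sort by end time
    ends = [o[1] for o in obs]
    best = [0] * (len(obs) + 1)
    for i, (s, e, v) in enumerate(obs, 1):
        # p = number of earlier observations ending no later than this start
        p = bisect.bisect_right(ends, s, 0, i - 1)
        best[i] = max(best[i - 1],   # skip this observation
                      best[p] + v)   # or take it
    return best[-1]

# Hypothetical imaging windows (start, end, value):
windows = [(0, 3, 5), (2, 5, 6), (4, 7, 5), (7, 9, 4)]
print(schedule(windows))  # 14
```

The real problem adds slew-time and power constraints between consecutive targets, which is where the paper's heuristics and integer-programming verification come in.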
The SRT reconstruction algorithm for semiquantification in PET imaging
Energy Technology Data Exchange (ETDEWEB)
Kastis, George A., E-mail: gkastis@academyofathens.gr [Research Center of Mathematics, Academy of Athens, Athens 11527 (Greece); Gaitanis, Anastasios [Biomedical Research Foundation of the Academy of Athens (BRFAA), Athens 11527 (Greece); Samartzis, Alexandros P. [Nuclear Medicine Department, Evangelismos General Hospital, Athens 10676 (Greece); Fokas, Athanasios S. [Department of Applied Mathematics and Theoretical Physics, University of Cambridge, Cambridge CB30WA, United Kingdom and Research Center of Mathematics, Academy of Athens, Athens 11527 (Greece)
2015-10-15
Purpose: The spline reconstruction technique (SRT) is a new, fast algorithm based on a novel numerical implementation of an analytic representation of the inverse Radon transform. The mathematical details of this algorithm and comparisons with filtered backprojection were presented earlier in the literature. In this study, the authors present a comparison between SRT and the ordered-subsets expectation–maximization (OSEM) algorithm for determining contrast and semiquantitative indices of 18F-FDG uptake. Methods: The authors implemented SRT in the software for tomographic image reconstruction (STIR) open-source platform and evaluated this technique using simulated and real sinograms obtained from the GE Discovery ST positron emission tomography/computer tomography scanner. All simulations and reconstructions were performed in STIR. For OSEM, the authors used the clinical protocol of their scanner, namely, 21 subsets and two iterations. The authors also examined images at one, four, six, and ten iterations. For the simulation studies, the authors analyzed an image-quality phantom with cold and hot lesions. Two different versions of the phantom were employed at two different hot-sphere lesion-to-background ratios (LBRs), namely, 2:1 and 4:1. For each noiseless sinogram, 20 Poisson realizations were created at five different noise levels. In addition to making visual comparisons of the reconstructed images, the authors determined contrast and bias as a function of the background image roughness (IR). For the real-data studies, sinograms of an image-quality phantom simulating the human torso were employed. The authors determined contrast and LBR as a function of the background IR. Finally, the authors present plots of contrast as a function of IR after smoothing each reconstructed image with Gaussian filters of six different sizes. Statistical significance was determined by employing the Wilcoxon rank-sum test. Results: In both simulated and real studies, SRT
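OSEM is an ordered-subsets acceleration of the MLEM algorithm. A minimal, hedged MLEM sketch on a toy system matrix (not STIR, and far smaller than any real scanner model) illustrates the multiplicative update that SRT is being compared against:

```python
def mlem(A, y, n_iter=200):
    """Maximum-likelihood EM for emission tomography on a toy dense system.
    A: system matrix (list of rows), y: measured counts per detector bin.
    Update: x[j] <- x[j] / sens[j] * sum_i A[i][j] * y[i] / (A x)[i]."""
    n_pix = len(A[0])
    n_bin = len(A)
    sens = [sum(A[i][j] for i in range(n_bin)) for j in range(n_pix)]
    x = [1.0] * n_pix  # uniform non-negative start
    for _ in range(n_iter):
        proj = [sum(A[i][j] * x[j] for j in range(n_pix)) for i in range(n_bin)]
        ratio = [y[i] / proj[i] if proj[i] > 0 else 0.0 for i in range(n_bin)]
        back = [sum(A[i][j] * ratio[i] for i in range(n_bin)) for j in range(n_pix)]
        x = [x[j] * back[j] / sens[j] for j in range(n_pix)]
    return x

# Toy 2-pixel, 3-bin system with noiseless data from x_true = [2, 3].
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
x_true = [2.0, 3.0]
y = [sum(A[i][j] * x_true[j] for j in range(2)) for i in range(3)]
x = mlem(A, y)
print([round(v, 2) for v in x])  # converges toward [2.0, 3.0]
```

OSEM applies the same update using only a subset of the bins per sub-iteration, which is why iteration counts (here, 21 subsets and two iterations) matter for image roughness.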
International Nuclear Information System (INIS)
Niknam, Mehdi; Thulasiraman, Parimala; Camorlinga, Sergio
2010-01-01
Connected component labelling is an essential step in image processing. We provide a parallel version of Suzuki's sequential connected component algorithm in order to speed up the labelling process. We also modify the algorithm to enable labelling of grey-scale images. Due to the data dependencies in the algorithm, we used a pipeline-like method to exploit parallelism. The parallel algorithm achieved a speedup of 2.5 on images of 256 x 256 pixels using 4 processing threads.
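For reference, the sequential baseline being parallelized can be sketched as a standard two-pass labelling with union-find (a generic sketch, not Suzuki's exact algorithm):

```python
def label_components(img):
    """Two-pass connected-component labelling (4-connectivity) of a binary
    image given as a list of rows; returns an image of integer labels."""
    h, w = len(img), len(img[0])
    parent = {}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path compression
            a = parent[a]
        return a

    labels = [[0] * w for _ in range(h)]
    next_label = 1
    for y in range(h):
        for x in range(w):
            if not img[y][x]:
                continue
            left = labels[y][x - 1] if x > 0 else 0
            up = labels[y - 1][x] if y > 0 else 0
            if left == 0 and up == 0:
                parent[next_label] = next_label  # new provisional label
                labels[y][x] = next_label
                next_label += 1
            else:
                labels[y][x] = left or up
                if left and up and left != up:
                    parent[find(left)] = find(up)  # record equivalence
    # Second pass: resolve equivalences to canonical labels.
    for y in range(h):
        for x in range(w):
            if labels[y][x]:
                labels[y][x] = find(labels[y][x])
    return labels

img = [[1, 1, 0, 1],
       [0, 1, 0, 1],
       [0, 0, 0, 1]]
lab = label_components(img)
print(len({v for row in lab for v in row if v}))  # 2 components
```

The row-by-row data dependency visible here (each pixel looks at its left and upper neighbours) is exactly what forces the pipeline-style parallelization described in the abstract.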
Directory of Open Access Journals (Sweden)
Wenjing Zhao
2018-01-01
The SGK (sequential generalization of K-means) dictionary-learning denoising algorithm is fast and performs well; however, the noise standard deviation must be known in advance when using the SGK algorithm to process an image. This paper presents a denoising algorithm that combines SGK dictionary learning with principal component analysis (PCA) noise estimation. First, the noise standard deviation of the image is estimated with the PCA noise estimation algorithm; it is then supplied to the SGK dictionary learning algorithm. Experimental results show the following: (1) the SGK algorithm has the best denoising performance of the four dictionary learning algorithms compared; (2) the SGK algorithm combined with PCA is superior to the SGK algorithm combined with other noise estimation algorithms; and (3) compared with the original SGK algorithm, the proposed algorithm achieves higher PSNR and better denoising performance.
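PCA noise estimation exploits the fact that, over locally correlated patches, the smallest eigenvalue of the patch covariance matrix approaches the noise variance. A minimal, hedged version using 2-pixel patches and a closed-form 2x2 eigenvalue (real implementations use much larger patches and eigenvalue selection rules):

```python
import random

def pca_noise_std(img):
    """Estimate noise standard deviation by PCA: form 2-pixel horizontal
    patches, compute their 2x2 sample covariance, and take the smallest
    eigenvalue as the noise variance."""
    pairs = [(row[x], row[x + 1]) for row in img for x in range(len(row) - 1)]
    n = len(pairs)
    m0 = sum(p[0] for p in pairs) / n
    m1 = sum(p[1] for p in pairs) / n
    c00 = sum((p[0] - m0) ** 2 for p in pairs) / n
    c11 = sum((p[1] - m1) ** 2 for p in pairs) / n
    c01 = sum((p[0] - m0) * (p[1] - m1) for p in pairs) / n
    # Closed-form eigenvalues of the symmetric 2x2 covariance matrix.
    mean = (c00 + c11) / 2
    delta = ((c00 - c11) ** 2 / 4 + c01 ** 2) ** 0.5
    return max(mean - delta, 0.0) ** 0.5

random.seed(1)
# Smooth image (slow horizontal ramp) plus Gaussian noise of std 2.0.
img = [[0.1 * x + random.gauss(0.0, 2.0) for x in range(200)]
       for _ in range(200)]
print(round(pca_noise_std(img), 1))  # near the true 2.0
```

Because the smooth signal is almost identical in both patch pixels, it loads entirely on the first principal component, leaving the second eigenvalue to the noise.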
3-color photometry of a sunspot using speckle masking techniques
Wiehr, E.; Sütterlin, P.
1998-01-01
Three-colour photometry is used to deduce the temperature of sunspot fine structures. Using the speckle-masking method for image restoration, the resulting images (one per colour and burst) have a spatial resolution limited only by the telescope's aperture, i.e. 95 km (blue), 145 km (red) and
Zhang, Lei; Yang, Fengbao; Ji, Linna; Lv, Sheng
2018-01-01
Diverse image fusion methods perform differently; each has advantages and disadvantages relative to the others. One notion is that the advantages of different fusion methods can be effectively combined. A multiple-algorithm parallel fusion method based on algorithmic complementarity and synergy is proposed. First, considering the characteristics of the different algorithms and the difference-features among images, an index vector-based feature similarity is proposed to define the degree of complementarity and synergy; this index vector is a reliable evidence indicator for algorithm selection. Second, the algorithms with a high degree of complementarity and synergy are selected. The different degrees of the various features and the infrared intensity images are then used as initial weights for nonnegative matrix factorization (NMF), which avoids the randomness of the NMF initialization parameters. Finally, the fused images of the different algorithms are integrated using the NMF because of its excellent data-fusion performance on independent features. Experimental results demonstrate that the visual effect and objective evaluation indices of the fused images obtained using the proposed method are better than those obtained using traditional methods, while retaining the advantages of the individual fusion algorithms.
Plant phenomics: an overview of image acquisition technologies and image data analysis algorithms.
Perez-Sanz, Fernando; Navarro, Pedro J; Egea-Cortines, Marcos
2017-11-01
The study of phenomes, or phenomics, has been a central part of biology. The field of automatic image-based phenotype acquisition technologies has advanced considerably in recent years. As with other high-throughput technologies, it faces a common set of problems, including data acquisition and analysis. In this review, we give an overview of the main systems developed to acquire images, together with an in-depth analysis of image processing, its major issues, and the algorithms in use or emerging as useful for extracting data from images automatically. © The Author 2017. Published by Oxford University Press.
International Nuclear Information System (INIS)
Zeeberg, B.R.; Bacharach, S.; Carson, R.; Green, M.V.; Larson, S.M.; Soucaille, J.F.
1985-01-01
An algorithm is presented which permits the reconstruction of SPECT images in the presence of spatially varying attenuation. The algorithm treats the spatially variant attenuation as a perturbation of the constant-attenuation case and computes a reconstructed image together with a correction image that estimates the effects of this perturbation. The corrected image is computed from these two images and is comparable in quality, both visually and quantitatively, to standard reference images simulated for zero or constant attenuation. In addition, the algorithm is time-efficient: the time required is approximately 2.5 times that of a standard convolution backprojection algorithm.
Directory of Open Access Journals (Sweden)
Mingjian Sun
2015-01-01
Photoacoustic imaging is an innovative technique for imaging biomedical tissues. The time reversal reconstruction algorithm, in which a numerical model of the acoustic forward problem is run backwards in time, is widely used. In this paper, a time reversal reconstruction algorithm based on a particle swarm optimization (PSO)-optimized support vector machine (SVM) interpolation method is proposed for photoacoustic imaging. Numerical results show that the reconstructed images of the proposed algorithm are more accurate than those of time reversal algorithms based on nearest-neighbor, linear, or cubic convolution interpolation, providing higher imaging quality with significantly fewer measurement positions or scanning times.
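A hedged sketch of the PSO component alone, minimizing a toy 1-D objective (the actual paper tunes SVM interpolation hyperparameters, not this function):

```python
import random

def pso(f, bounds, n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimization of a 1-D function f on [lo, hi]."""
    random.seed(42)
    lo, hi = bounds
    pos = [random.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                       # personal best positions
    pbest_val = [f(p) for p in pos]
    g = pbest[min(range(n_particles), key=lambda i: pbest_val[i])]
    for _ in range(n_iter):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            vel[i] = (w * vel[i]
                      + c1 * r1 * (pbest[i] - pos[i])  # pull toward personal best
                      + c2 * r2 * (g - pos[i]))        # pull toward global best
            pos[i] = min(max(pos[i] + vel[i], lo), hi)
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i], v
        g = pbest[min(range(n_particles), key=lambda i: pbest_val[i])]
    return g

# Toy objective with its minimum at x = 3.
best = pso(lambda x: (x - 3.0) ** 2, (-10.0, 10.0))
print(round(best, 2))  # near 3.0
```

In the paper's setting, f would be a cross-validation error of the SVM interpolator, and each particle a candidate hyperparameter vector.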
Fixed-point image orthorectification algorithms for reduced computational cost
French, Joseph Clinton
Imaging systems have been applied to many new applications in recent years. With the advent of low-cost, low-power focal planes and more powerful, lower-cost computers, remote sensing applications have become more widespread. Many of these applications require some form of geolocation, especially when relative distances are desired. However, when greater global positional accuracy is needed, orthorectification becomes necessary. Orthorectification is the process of projecting an image onto a Digital Elevation Map (DEM), which removes terrain distortions and corrects the perspective distortion by changing the viewing angle to be perpendicular to the projection plane. Orthorectification is used in disaster tracking, landscape management, wildlife monitoring and many other applications. However, orthorectification is a computationally expensive process due to the floating point operations and divisions in the algorithm. To reduce the computational cost of on-board processing, two novel algorithm modifications are proposed. The first is projection using fixed-point arithmetic, which removes the floating point operations and reduces the processing time by operating only on integers. The second is replacement of the division inherent in projection with a multiplication by the inverse. Computing the inverse is itself an iterative operation, so the inverse is replaced with a linear approximation. As a result of these modifications, the processing time of projection is reduced by a factor of 1.3x with an average pixel position error of 0.2% of a pixel size for 128-bit integer processing, and by over 4x with an average pixel position error of less than 13% of a pixel size for 64-bit integer processing. A secondary inverse function approximation is also developed that replaces the linear approximation with a quadratic. The quadratic approximation produces a more accurate approximation of the inverse, allowing for an integer multiplication calculation
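The two modifications can be sketched as follows; the Q16.16 format and the reference point for the tangent-line reciprocal are illustrative assumptions, not the thesis' exact parameters:

```python
FRAC_BITS = 16
ONE = 1 << FRAC_BITS  # fixed-point representation of 1.0 (Q16.16)

def to_fixed(x):
    return int(round(x * ONE))

def fixed_mul(a, b):
    """Multiply two Q16.16 fixed-point numbers using only integer ops."""
    return (a * b) >> FRAC_BITS

def fixed_recip_linear(d, d0):
    """Approximate 1/d near a reference point d0 by its tangent line,
    1/d ~ (1/d0) * (2 - d/d0), replacing a division with multiplications.
    (d0 and its precomputed reciprocal would come from a lookup table.)"""
    inv_d0 = ONE * ONE // d0  # precomputed once per table entry
    return fixed_mul(inv_d0, 2 * ONE - fixed_mul(d, inv_d0))

d = to_fixed(1.05)   # divisor close to the reference point
d0 = to_fixed(1.0)
approx = fixed_recip_linear(d, d0) / ONE
print(round(approx, 4), round(1 / 1.05, 4))
```

The tangent-line error grows quadratically with the distance from d0, which is why the thesis' quadratic refinement buys additional accuracy at the cost of one more integer multiplication.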
From Pixels to Region: A Salient Region Detection Algorithm for Location-Quantification Image
Directory of Open Access Journals (Sweden)
Mengmeng Zhang
2014-01-01
Image saliency detection has become increasingly important with the development of intelligent identification and machine vision technology. The process is essential to many image processing algorithms, such as image retrieval, segmentation, recognition, and adaptive compression. We propose a salient region detection algorithm for full-resolution images that analyzes the randomness and correlation of image pixels together with a pixel-to-region saliency computation mechanism. The algorithm first obtains the points with the highest saliency probability using an improved smallest univalue segment assimilating nucleus (SUSAN) operator. It then reconstructs the entire salient region by taking these points as references and combining them with the spatial color distribution of the image, as well as regional and global contrasts. Subjective and objective evaluations of image saliency detection show that the proposed algorithm exhibits outstanding performance in terms of indices such as precision and recall.
Statistical model for OCT image denoising
Li, Muxingzi
2017-08-01
Optical coherence tomography (OCT) is a non-invasive technique with a large array of applications in clinical imaging and biological tissue visualization. However, the presence of speckle noise affects the analysis of OCT images and their diagnostic utility. In this article, we introduce a new OCT denoising algorithm. The proposed method is founded on a numerical optimization framework based on maximum-a-posteriori estimate of the noise-free OCT image. It combines a novel speckle noise model, derived from local statistics of empirical spectral domain OCT (SD-OCT) data, with a Huber variant of total variation regularization for edge preservation. The proposed approach exhibits satisfying results in terms of speckle noise reduction as well as edge preservation, at reduced computational cost.
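A minimal, hedged 1-D sketch of the Huber-variant total variation idea, using plain gradient descent rather than the authors' optimizer (parameters are illustrative):

```python
import random

def huber_grad(t, delta):
    """Derivative of the Huber function: quadratic near 0, linear beyond delta."""
    if abs(t) <= delta:
        return t
    return delta if t > 0 else -delta

def denoise_huber_tv(y, lam=2.0, delta=0.1, step=0.1, n_iter=500):
    """Gradient descent on 0.5*||x - y||^2 + lam * sum huber(x[i+1] - x[i])."""
    x = y[:]
    n = len(x)
    for _ in range(n_iter):
        g = [x[i] - y[i] for i in range(n)]  # data-fidelity gradient
        for i in range(n - 1):
            h = huber_grad(x[i + 1] - x[i], delta)
            g[i] -= lam * h
            g[i + 1] += lam * h
        x = [x[i] - step * g[i] for i in range(n)]
    return x

random.seed(3)
# Noisy step edge: the Huber-TV prior smooths noise but preserves the jump.
clean = [0.0] * 20 + [5.0] * 20
noisy = [c + random.gauss(0.0, 0.3) for c in clean]
out = denoise_huber_tv(noisy)
```

The Huber variant penalizes small (noise-scale) gradients quadratically, giving strong smoothing in flat regions, while large gradients at true edges are penalized only linearly and therefore survive, which is the edge-preservation property the abstract refers to.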
Algorithm of imaging modalities in cases of mandibular fractures
International Nuclear Information System (INIS)
Mihailova, H.
2009-01-01
Mandibular fracture is the most common bone fracture in maxillofacial trauma. Up to now, the main method for examining the mandible has been radiography. The aim of this paper is to present an algorithm of imaging modalities for investigating patients with mandibular trauma. It consists of a series of X-ray techniques and views of the facial skull, here called the mandibulo-facial series. This standardized mandibulo-facial series includes four exactly determined projections obtained with conventional X-ray techniques: a posterior-anterior view of the skull (PA or AP), an oblique view of the left mandible, an oblique view of the right mandible, and an occipito-mental view. Obtaining these four plain radiographs is obligatory in every case of mandibular trauma. Panoramic radiography is obligatory wherever the apparatus is available; it replaces only the two oblique views (left and right). The occipito-mental view of the skull shows the coronoid process of the mandible, the zygoma complex, the orbital rims and the maxillary sinus anatomically better than the Waters projection. The mandibulo-facial series of four plain radiographs thus serves not only for diagnosing mandibular fractures but also as a screening examination for maxillofacial trauma. Using this algorithm of imaging modalities in cases of mandibular fracture optimizes the diagnostic process in patients with mandibular trauma. (author)
Ameliorating mammograms by using novel image processing algorithms
Pillai, A.; Kwartowitz, D.
2014-03-01
Mammography is one of the most important tools for the early detection of breast cancer, typically through detection of characteristic masses and/or microcalcifications. Digital mammography has become commonplace in recent years. High-quality mammogram images are large in size, providing high-resolution data. Estimates of the false negative rate for cancers in mammography are approximately 10%-30%. This may be due to observation error, but more frequently it is because the cancer is hidden by other dense tissue in the breast and, even after retrospective review of the mammogram, cannot be seen. In this study, we report on the results of novel image processing algorithms that enhance the images, providing decision support to reading physicians. Techniques such as Butterworth high-pass filtering and Gabor filters are applied to enhance the images, followed by segmentation of the region of interest (ROI). Subsequently, textural features are extracted from the ROI and used to classify the ROIs as either masses or non-masses. Among the statistical methods most used for the characterization of textures, the co-occurrence matrix makes it possible to determine the frequency of appearance of two pixels separated by a given distance, at a given angle from the horizontal. This matrix contains a very large amount of complex information; it is therefore not used directly but through measurements known as texture indices, such as mean, variance, energy, contrast, correlation, normalized correlation and entropy.
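The co-occurrence matrix and a few of its texture indices can be sketched as follows (a generic grey-level co-occurrence matrix for horizontal neighbours, not the study's exact feature set):

```python
import math

def glcm(img, levels):
    """Grey-level co-occurrence matrix for horizontally adjacent pixels
    (distance 1, angle 0), normalized to probabilities."""
    m = [[0.0] * levels for _ in range(levels)]
    total = 0
    for row in img:
        for a, b in zip(row, row[1:]):
            m[a][b] += 1
            total += 1
    return [[v / total for v in row] for row in m]

def texture_indices(p):
    """Contrast, energy and entropy computed from a normalized GLCM."""
    n = len(p)
    contrast = sum(p[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))
    energy = sum(p[i][j] ** 2 for i in range(n) for j in range(n))
    entropy = -sum(p[i][j] * math.log(p[i][j])
                   for i in range(n) for j in range(n) if p[i][j] > 0)
    return contrast, energy, entropy

smooth = [[0, 0, 1, 1], [0, 0, 1, 1]]  # large uniform patches
rough = [[0, 1, 0, 1], [1, 0, 1, 0]]   # alternating pixels
c_smooth, _, _ = texture_indices(glcm(smooth, 2))
c_rough, _, _ = texture_indices(glcm(rough, 2))
print(c_smooth, c_rough)  # the rough texture has higher contrast
```

Mass candidates tend to differ from normal tissue in exactly such indices, which is why they feed the mass/non-mass classifier described above.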
Geometry correction Algorithm for UAV Remote Sensing Image Based on Improved Neural Network
Liu, Ruian; Liu, Nan; Zeng, Beibei; Chen, Tingting; Yin, Ninghao
2018-03-01
Aiming at the disadvantages of current geometry correction algorithms for UAV remote sensing images, a new algorithm is proposed that introduces an adaptive genetic algorithm (AGA) and an RBF neural network. Combined with the geometry correction principle for UAV remote sensing images, the AGA-RBF algorithm and its solving steps are presented in order to realize geometry correction for UAV remote sensing. Correction accuracy and computational efficiency are improved by optimizing the structure and connection weights of the RBF neural network separately with the AGA and the LMS algorithm. Finally, experiments show that the AGA-RBF algorithm offers high correction accuracy, a high running rate and strong generalization ability.
Mozaffarzadeh, Moein; Mahloojifar, Ali; Orooji, Mahdi; Adabi, Saba; Nasiriavanaki, Mohammadreza
2018-01-01
Photoacoustic imaging (PAI) is an emerging medical imaging modality capable of providing the high spatial resolution of ultrasound (US) imaging and the high contrast of optical imaging. Delay-and-Sum (DAS) is the most common beamforming algorithm in PAI; however, the DAS beamformer leads to low-resolution images and a considerable contribution from off-axis signals. A new paradigm, Delay-Multiply-and-Sum (DMAS), which was originally used as a reconstruction algorithm in confocal microwave imaging...
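The two beamformers can be contrasted in a minimal 1-D sketch with precomputed integer delays (hypothetical 4-channel pulse data, not a full PAI model):

```python
def das(channels, delays):
    """Delay-and-Sum: shift each channel by its delay (in samples) and sum."""
    n = len(channels[0])
    out = [0.0] * n
    for sig, d in zip(channels, delays):
        for t in range(n):
            if 0 <= t - d < n:
                out[t] += sig[t - d]
    return out

def dmas(channels, delays):
    """Delay-Multiply-and-Sum: combinatorially multiply pairs of delayed
    channels; the signed square root keeps the signal's dimensionality."""
    n = len(channels[0])
    delayed = [[sig[t - d] if 0 <= t - d < n else 0.0 for t in range(n)]
               for sig, d in zip(channels, delays)]
    out = [0.0] * n
    for i in range(len(delayed)):
        for j in range(i + 1, len(delayed)):
            for t in range(n):
                prod = delayed[i][t] * delayed[j][t]
                out[t] += (1 if prod >= 0 else -1) * abs(prod) ** 0.5
    return out

# Hypothetical 4-channel data: a unit pulse arriving with known delays.
n, arrival = 64, 20
delays = [0, 2, 4, 6]
channels = [[1.0 if t == arrival + d else 0.0 for t in range(n)]
            for d in delays]
b_das = das(channels, [-d for d in delays])   # compensate arrival delays
b_dmas = dmas(channels, [-d for d in delays])
print(b_das.index(max(b_das)), b_dmas.index(max(b_dmas)))  # both peak at 20
```

The pairwise products act like a correlation across the aperture, which is why DMAS suppresses incoherent off-axis contributions more strongly than the plain sum.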
Simulation of speckle patterns with pre-defined correlation distributions
Song, Lipei; Zhou, Zhen; Wang, Xueyan; Zhao, Xing; Elson, Daniel S.
2016-01-01
We put forward a method to easily generate a single or a sequence of fully developed speckle patterns with pre-defined correlation distribution by utilizing the principle of coherent imaging. The few-to-one mapping between the input correlation matrix and the correlation distribution between simulated speckle patterns is realized and there is a simple square relationship between the values of these two correlation coefficient sets. This method is demonstrated both theoretically and experimentally. The square relationship enables easy conversion from any desired correlation distribution. Since the input correlation distribution can be defined by a digital matrix or a gray-scale image acquired experimentally, this method provides a convenient way to simulate real speckle-related experiments and to evaluate data processing techniques. PMID:27231589
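A single fully developed speckle realization (without the paper's controlled correlation structure, which is an extra layer on top) can be simulated as a random phasor sum; its intensity then follows negative-exponential statistics, so the speckle contrast (std/mean) is close to 1:

```python
import cmath
import random

def speckle_intensity(n_scatterers=200):
    """Intensity of one speckle realization: squared magnitude of a sum of
    unit-amplitude phasors with uniformly random phases, normalized so the
    mean intensity is about 1."""
    field = sum(cmath.exp(1j * random.uniform(0.0, 2 * cmath.pi))
                for _ in range(n_scatterers))
    return abs(field) ** 2 / n_scatterers

random.seed(7)
samples = [speckle_intensity() for _ in range(3000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
contrast = var ** 0.5 / mean
print(round(contrast, 2))  # fully developed speckle has contrast near 1
```

In the paper's coherent-imaging construction, correlations between successive patterns are imposed by correlating the random input phases, which is where the square relationship between input and output correlation coefficients arises.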
X-ray pulse wavefront metrology using speckle tracking
International Nuclear Information System (INIS)
Berujon, Sebastien; Ziegler, Eric; Cloetens, Peter
2015-01-01
The theoretical description and experimental implementation of a speckle-tracking-based instrument for characterising X-ray pulse wavefronts are presented, and its processing method is explained. The system relies on the X-ray speckle tracking principle to accurately measure the phase gradient of the X-ray beam, from which beam optical aberrations can be deduced. The key component of this instrument, a semi-transparent scintillator emitting visible light while transmitting X-rays, allows simultaneous recording of two speckle images at two different propagation distances from the X-ray source. The speckle tracking procedure for a reference-less metrology mode is described, with a detailed account of the advanced processing schemes used. A method to characterize and compensate for the imaging detector distortion, whose principle is also based on speckle, is included. The presented instrument is expected to find interest at synchrotrons and at the new X-ray free-electron laser sources under development worldwide, where successful exploitation of beams relies on the availability of accurate wavefront metrology.
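The core of speckle tracking is finding the displacement of a speckle pattern between two recordings by maximizing a cross-correlation. A minimal, hedged 1-D sketch (integer shifts only; real systems use 2-D windows and subpixel interpolation):

```python
import random

def track_shift(ref, moved, max_shift):
    """Estimate the integer displacement between two 1-D speckle traces by
    maximizing their cross-correlation over candidate shifts."""
    n = len(ref)
    best_s, best_c = 0, float("-inf")
    for s in range(-max_shift, max_shift + 1):
        c = sum(ref[t] * moved[t + s]
                for t in range(n) if 0 <= t + s < n)
        if c > best_c:
            best_s, best_c = s, c
    return best_s

random.seed(5)
ref = [random.random() for _ in range(200)]
shift = 3
moved = [ref[t - shift] if 0 <= t - shift < len(ref) else 0.0
         for t in range(len(ref))]
print(track_shift(ref, moved, max_shift=10))  # recovers the shift: 3
```

The local shift of the speckle pattern between the two detector planes is proportional to the wavefront's phase gradient, which is how the correlation peak positions translate into wavefront aberrations.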
A Stereo Dual-Channel Dynamic Programming Algorithm for UAV Image Stitching.
Li, Ming; Chen, Ruizhi; Zhang, Weilong; Li, Deren; Liao, Xuan; Wang, Lei; Pan, Yuanjin; Zhang, Peng
2017-09-08
Dislocation is one of the major challenges in unmanned aerial vehicle (UAV) image stitching. In this paper, we propose a new algorithm for seamlessly stitching UAV images based on a dynamic programming approach. Our solution consists of two steps: Firstly, an image matching algorithm is used to correct the images so that they are in the same coordinate system. Secondly, a new dynamic programming algorithm is developed based on the concept of a stereo dual-channel energy accumulation. A new energy aggregation and traversal strategy is adopted in our solution, which can find a more optimal seam line for image stitching. Our algorithm overcomes the theoretical limitation of the classical Duplaquet algorithm. Experiments show that the algorithm can effectively solve the dislocation problem in UAV image stitching, especially for the cases in dense urban areas. Our solution is also direction-independent, which has better adaptability and robustness for stitching images.
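A much-simplified sketch of the energy-accumulation idea: a minimum-cost vertical seam through a cost grid found by dynamic programming (single-channel and toy-sized, unlike the paper's stereo dual-channel formulation):

```python
def min_seam(cost):
    """Find a minimum-cost vertical seam through a cost grid by dynamic
    programming: each cell accumulates the cheapest of the three cells
    above it, then the seam is recovered by backtracking."""
    h, w = len(cost), len(cost[0])
    acc = [cost[0][:]]  # accumulated cost, row by row
    for y in range(1, h):
        row = []
        for x in range(w):
            lo, hi = max(0, x - 1), min(w - 1, x + 1)
            row.append(cost[y][x] + min(acc[y - 1][lo:hi + 1]))
        acc.append(row)
    # Backtrack from the cheapest cell in the bottom row.
    x = min(range(w), key=lambda i: acc[-1][i])
    seam = [x]
    for y in range(h - 1, 0, -1):
        lo, hi = max(0, x - 1), min(w - 1, x + 1)
        x = min(range(lo, hi + 1), key=lambda i: acc[y - 1][i])
        seam.append(x)
    return seam[::-1]  # top-to-bottom column indices

cost = [[5, 1, 5, 8],
        [9, 5, 1, 9],
        [8, 9, 1, 9],
        [9, 9, 5, 1]]
print(min_seam(cost))  # follows the low-cost diagonal: [1, 2, 2, 3]
```

In image stitching, the cost grid would be the per-pixel difference between the two overlapping images, so the recovered seam passes where the images agree best.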
Sengupta, Partho P.; Huang, Yen-Min; Bansal, Manish; Ashrafi, Ali; Fisher, Matt; Shameer, Khader; Gall, Walt; Dudley, Joel T
2016-01-01
Background Associating a patient’s profile with the memories of prototypical patients built through previous repeat clinical experience is a key process in clinical judgment. We hypothesized that a similar process using a cognitive computing tool would be well suited for learning and recalling multidimensional attributes of speckle tracking echocardiography (STE) data sets derived from patients with known constrictive pericarditis (CP) and restrictive cardiomyopathy (RCM). Methods and Results Clinical and echocardiographic data of 50 patients with CP and 44 with RCM were used for developing an associative memory classifier (AMC) based machine learning algorithm. The STE data was normalized in reference to 47 controls with no structural heart disease, and the diagnostic area under the receiver operating characteristic curve (AUC) of the AMC was evaluated for differentiating CP from RCM. Using only STE variables, AMC achieved a diagnostic AUC of 89.2%, which improved to 96.2% with addition of 4 echocardiographic variables. In comparison, the AUC of early diastolic mitral annular velocity and left ventricular longitudinal strain were 82.1% and 63.7%, respectively. Furthermore, AMC demonstrated greater accuracy and shorter learning curves than other machine learning approaches, with accuracy asymptotically approaching 90% after a training fraction of 0.3 and remaining flat at higher training fractions. Conclusions This study demonstrates feasibility of a cognitive machine learning approach for learning and recalling patterns observed during echocardiographic evaluations. Incorporation of machine learning algorithms in cardiac imaging may aid standardized assessments and support the quality of interpretations, particularly for novice readers with limited experience. PMID:27266599
Synchronized renal blood flow dynamics mapped with wavelet analysis of laser speckle flowmetry data
DEFF Research Database (Denmark)
Brazhe, Alexey R; Marsh, Donald J; von Holstein-Rathlou, Niels-Henrik
2014-01-01
Full-field laser speckle microscopy provides real-time imaging of superficial blood flow rate. Here we apply continuous wavelet transform to time series of speckle-estimated blood flow from each pixel of the images to map synchronous patterns in instantaneous frequency and phase on the surface of rat kidneys. The regulatory mechanism in the renal microcirculation generates oscillations in arterial blood flow at several characteristic frequencies. Our approach to laser speckle image processing allows detection of frequency and phase entrainments, visualization of their patterns, and estimation of the extent of synchronization in renal cortex dynamics.
Evaluation of imaging protocol for ECT based on CS image reconstruction algorithm
International Nuclear Information System (INIS)
Zhou Xiaolin; Yun Mingkai; Cao Xuexiang; Liu Shuangquan; Wang Lu; Huang Xianchao; Wei Long
2014-01-01
Single-photon emission computerized tomography and positron emission tomography are essential medical imaging tools, for which the sampling angle number and scan time should be carefully chosen to give a good compromise between image quality and radiopharmaceutical dose. In this study, the image quality of different acquisition protocols was evaluated via varied angle number and count number per angle with Monte Carlo simulation data. It was shown that, for similar total imaging counts, the number of acquisition counts per angle matters more than the number of sampling angles in emission computerized tomography. To further reduce the activity requirement and the scan duration, an iterative image reconstruction algorithm for limited-view and low-dose tomography based on compressed sensing theory has been developed. Total variation regularization was added to the reconstruction process to improve the signal-to-noise ratio and reduce artifacts caused by the limited-angle sampling. Maximization of the likelihood of the estimated image given the measured data and minimization of the total variation of the image are implemented alternately. By using this advanced algorithm, the reconstruction process is able to achieve image quality matching or exceeding that of normal scans with only half of the injected radiopharmaceutical dose. (authors)
Spatial compression algorithm for the analysis of very large multivariate images
Keenan, Michael R [Albuquerque, NM
2008-07-15
A method for spatially compressing data sets enables the efficient analysis of very large multivariate images. The spatial compression algorithms use a wavelet transformation to map an image into a compressed image containing a smaller number of pixels that retain the original image's information content. Image analysis can then be performed on a compressed data matrix consisting of a reduced number of significant wavelet coefficients. Furthermore, a block algorithm can be used for performing common operations more efficiently. The spatial compression algorithms can be combined with spectral compression algorithms to provide further computational efficiencies.
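The core mechanism, a wavelet transform followed by retention of only the significant coefficients, can be sketched in 1-D with a Haar transform and hard thresholding (a generic sketch, not the patent's block algorithm):

```python
def haar_forward(x):
    """One level of the 1-D Haar transform: scaled averages and differences."""
    s = 2 ** -0.5
    approx = [(a + b) * s for a, b in zip(x[::2], x[1::2])]
    detail = [(a - b) * s for a, b in zip(x[::2], x[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    s = 2 ** -0.5
    x = []
    for a, d in zip(approx, detail):
        x.extend([(a + d) * s, (a - d) * s])
    return x

def compress(x, threshold):
    """Keep only significant detail coefficients; small ones are zeroed."""
    approx, detail = haar_forward(x)
    kept = [d if abs(d) > threshold else 0.0 for d in detail]
    return approx, kept

signal = [10.0, 10.2, 10.1, 9.9, 10.0, 50.0, 50.1, 49.9]
approx, detail = compress(signal, threshold=0.5)
recon = haar_inverse(approx, detail)
n_nonzero = sum(1 for d in detail if d != 0.0)
print(n_nonzero, [round(v, 1) for v in recon])  # 1 significant coefficient survives
```

Analysis can then operate on the short list of retained coefficients instead of the full pixel array, which is the source of the computational savings claimed above.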
International Nuclear Information System (INIS)
Liu, Xiaozheng; Yuan, Zhenming; Zhu, Junming; Xu, Dongrong
2013-01-01
The demons algorithm is popular for non-rigid image registration because of its computational efficiency and simple implementation. The deformation forces of the classic demons algorithm are derived from image gradients, with the deformation chosen to decrease the intensity dissimilarity between images. However, methods that rely on intensity differences for medical image registration are easily affected by image artifacts such as noise, non-uniform intensity and partial volume effects. Because the gradient magnitude image is constructed from local image information, differences in gradient magnitude can be regarded as more reliable and robust to these artifacts. Registering medical images by considering the differences in both image intensity and gradient magnitude is therefore a natural choice. In this paper, building on a diffeomorphic demons algorithm, we propose a chain-type diffeomorphic demons algorithm that combines the differences in both image intensity and gradient magnitude for medical image registration. Previous work has shown that the classic demons algorithm can be considered an approximation of a second-order gradient descent on the sum of squared intensity differences. By optimizing the new dissimilarity criterion, we also derive a set of new demons forces from the gradients of the image and of the gradient magnitude image. We show in controlled experiments that this advantage is confirmed and yields fast convergence. (paper)
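The classic demons force, and the idea of adding a second force computed on gradient-magnitude images, can be sketched as follows. This is an illustrative approximation, not the paper's chain-type diffeomorphic formulation: the toy images and the simple sum of the two forces are assumptions.

```python
import numpy as np

def demons_force(fixed, moving):
    """Classic demons force: u = (m - f) * grad(f) / (|grad f|^2 + (m - f)^2)."""
    diff = moving - fixed
    gy, gx = np.gradient(fixed)
    denom = gx**2 + gy**2 + diff**2
    denom = np.where(denom < 1e-9, 1e-9, denom)   # avoid division by zero
    return diff * gx / denom, diff * gy / denom

# Toy images: a bright square shifted by one pixel between the two frames.
fixed = np.zeros((32, 32)); fixed[10:20, 10:20] = 1.0
moving = np.zeros((32, 32)); moving[11:21, 10:20] = 1.0

# Force from intensity differences (classic demons).
ux, uy = demons_force(fixed, moving)

# Additional force computed on the gradient-magnitude images, which is the
# more artifact-robust cue the paper combines with intensity.
gmag_f = np.hypot(*np.gradient(fixed))
gmag_m = np.hypot(*np.gradient(moving))
vx, vy = demons_force(gmag_f, gmag_m)

fx, fy = ux + vx, uy + vy   # naive combination (sketch of the chain idea)
```

A full registration would smooth and compose these forces iteratively (and enforce diffeomorphisms via exponentiated velocity fields), which this fragment deliberately omits.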
Cone-beam and fan-beam image reconstruction algorithms based on spherical and circular harmonics
International Nuclear Information System (INIS)
Zeng, Gengsheng L; Gullberg, Grant T
2004-01-01
A cone-beam image reconstruction algorithm using spherical harmonic expansions is proposed. The reconstruction algorithm is in the form of a summation of inner products of two discrete arrays of spherical harmonic expansion coefficients at each cone-beam point of acquisition. This form is different from the common filtered backprojection algorithm and the direct Fourier reconstruction algorithm. There is no re-sampling of the data, and spherical harmonic expansions are used instead of Fourier expansions. As a special case, a new fan-beam image reconstruction algorithm is also derived in terms of a circular harmonic expansion. Computer simulation results for both cone-beam and fan-beam algorithms are presented for circular planar orbit acquisitions. The algorithms give accurate reconstructions; however, the implementation of the cone-beam reconstruction algorithm is computationally intensive. A relatively efficient algorithm is proposed for reconstructing the central slice of the image when a circular scanning orbit is used.
Compositional-prior-guided image reconstruction algorithm for multi-modality imaging
Fang, Qianqian; Moore, Richard H.; Kopans, Daniel B.; Boas, David A.
2010-01-01
The development of effective multi-modality imaging methods typically requires an efficient information fusion model, particularly when combining structural images with a complementary imaging modality that provides functional information. We propose a composition-based image segmentation method for X-ray digital breast tomosynthesis (DBT) and a structural-prior-guided image reconstruction for a combined DBT and diffuse optical tomography (DOT) breast imaging system. Using the 3D DBT images from 31 clinically measured healthy breasts, we create an empirical relationship between the X-ray intensities of adipose and fibroglandular tissue. We then use this relationship to segment another 58 healthy-breast DBT images from 29 subjects into compositional maps of different tissue types. For each breast, we build a weighted graph in the compositional space and construct a regularization matrix to incorporate the structural priors into a finite-element-based DOT image reconstruction. Use of the compositional priors enables us to fuse tissue anatomy into optical images with less restriction than a binary segmentation, allowing us to recover image contrast captured by DOT but not by DBT. We show that it is possible to fine-tune the strength of the structural priors by changing a single regularization parameter. Estimating the optical properties of adipose and fibroglandular tissue with the proposed algorithm gives results comparable or superior to those estimated with expert segmentations, without the time-consuming manual selection of regions of interest. PMID:21258460
Novel image reconstruction algorithm for multi-phase flow tomography system using γ ray method
International Nuclear Information System (INIS)
Hao Kuihong; Wang Huaxiang; Gao Mei
2007-01-01
After analyzing why image reconstruction by conventional back projection (IBP) is prone to produce spurious lines, and considering the characteristics of multi-phase flow tomography, a novel image reconstruction algorithm is proposed that carries out an intersection calculation on the back projection data. This algorithm obtains a near-ideal system point spread function and eliminates spurious lines more effectively. Simulation results show that the algorithm is effective for identifying multi-phase flow patterns. (authors)
Coelho, Luís Pedro; Shariff, Aabid; Murphy, Robert F.
2009-01-01
Image segmentation is an essential step in many image analysis pipelines and many algorithms have been proposed to solve this problem. However, they are often evaluated subjectively or based on a small number of examples. To fill this gap, we hand-segmented a set of 97 fluorescence microscopy images (a total of 4009 cells) and objectively evaluated some previously proposed segmentation algorithms.
Image-Data Compression Using Edge-Optimizing Algorithm for WFA Inference.
Culik, Karel II; Kari, Jarkko
1994-01-01
Presents an inference algorithm that produces a weighted finite automaton (WFA) representing, in particular, the grayness functions of graytone images. The new inference algorithm produces a WFA with a relatively small number of edges. Image-data compression results, both alone and in combination with wavelets, are discussed.…
Objectness Supervised Merging Algorithm for Color Image Segmentation
Directory of Open Access Journals (Sweden)
Haifeng Sima
2016-01-01
Full Text Available Ideal color image segmentation needs both low-level cues and high-level semantic features. This paper proposes a two-hierarchy segmentation model based on merging homogeneous superpixels. First, a region-growing strategy is designed to produce homogeneous and compact superpixels in different partitions. Total variation smoothing features are adopted in the growing procedure to locate real boundaries. Before merging, we define a combined color-texture histogram feature for superpixel description, and a novel objectness feature is proposed to supervise the region-merging procedure for reliable segmentation. Both color-texture histograms and objectness are computed to measure regional similarities between region pairs, and the mixed standard deviation of the union features is exploited to define the stopping criterion for the merging process. Experimental results on a popular benchmark dataset demonstrate the better segmentation performance of the proposed model compared to other well-known segmentation algorithms.
Optimized Laplacian image sharpening algorithm based on graphic processing unit
Ma, Tinghuai; Li, Lu; Ji, Sai; Wang, Xin; Tian, Yuan; Al-Dhelaan, Abdullah; Al-Rodhaan, Mznah
2014-12-01
In classical Laplacian image sharpening, all pixels are processed one by one, which entails a large amount of computation. Traditional Laplacian sharpening on a CPU is considerably time-consuming, especially for large pictures. In this paper, we propose a parallel implementation of Laplacian sharpening based on the Compute Unified Device Architecture (CUDA), a computing platform for Graphics Processing Units (GPUs), and analyze the impact of picture size on performance as well as the relationship between data transfer time and parallel computing time. Further, according to the different characteristics of the different memory types, an improved scheme is developed that exploits shared memory on the GPU instead of global memory and further increases efficiency. Experimental results show that the two novel algorithms outperform a traditional sequential method based on OpenCV in terms of computing speed.
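The underlying per-pixel operation can be sketched with a vectorized CPU version (the paper's contribution is the CUDA port, which this example does not reproduce); the 4-neighbour kernel, clipping range and test image are illustrative assumptions.

```python
import numpy as np

def laplacian_sharpen(img, alpha=1.0):
    """Sharpen by subtracting the 4-neighbour Laplacian: out = img - alpha*lap."""
    padded = np.pad(img, 1, mode="edge")
    lap = (padded[:-2, 1:-1] + padded[2:, 1:-1] +      # up + down neighbours
           padded[1:-1, :-2] + padded[1:-1, 2:] -      # left + right neighbours
           4.0 * img)
    return np.clip(img - alpha * lap, 0.0, 255.0)

# Toy image: a bright square on a dark background.
img = np.zeros((16, 16)); img[4:12, 4:12] = 128.0
sharp = laplacian_sharpen(img)
```

Every output pixel depends only on its local neighbourhood, which is exactly why the operation maps naturally onto one GPU thread per pixel (with shared memory caching the neighbourhood tiles, as the paper's improved scheme does).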
Automated microaneurysm detection algorithms applied to diabetic retinopathy retinal images
Directory of Open Access Journals (Sweden)
Akara Sopharak
2013-07-01
Full Text Available Diabetic retinopathy is the commonest cause of blindness in working-age people. It is characterised and graded by the development of retinal microaneurysms, haemorrhages and exudates. The damage caused by diabetic retinopathy can be prevented if it is treated in its early stages. Therefore, automated early detection can limit the severity of the disease, improve the follow-up management of diabetic patients and assist ophthalmologists in investigating and treating the disease more efficiently. This review focuses on microaneurysm detection as the earliest clinically localised characteristic of diabetic retinopathy, a frequently observed complication in both Type 1 and Type 2 diabetes. Algorithms used for microaneurysm detection from retinal images are reviewed, and a number of features used to detect microaneurysms are summarised. Furthermore, a comparative analysis of reported methods used to automatically detect microaneurysms is presented and discussed. The performance of the methods and their complexity are also discussed.
Sparse spectral deconvolution algorithm for noncartesian MR spectroscopic imaging.
Bhave, Sampada; Eslami, Ramin; Jacob, Mathews
2014-02-01
To minimize line-shape distortions and spectral leakage artifacts in MR spectroscopic imaging (MRSI), a spatially and spectrally regularized non-Cartesian MRSI algorithm is introduced that uses line-shape distortion priors, estimated from water reference data, to deconvolve the spectra. Sparse spectral regularization is used to minimize the noise amplification associated with deconvolution. A spiral MRSI sequence that heavily oversamples the central k-space regions is used to acquire the MRSI data. The spatial regularization term uses the spatial supports of brain and extracranial fat regions to recover the metabolite spectra and nuisance signals at two different resolutions. Specifically, the nuisance signals are recovered at the maximum resolution to minimize spectral leakage, while the point spread functions of the metabolites are controlled to obtain an acceptable signal-to-noise ratio. Comparisons of the algorithm against Tikhonov regularized reconstructions demonstrate considerably reduced line-shape distortions and improved metabolite maps. The proposed sparsity-constrained spectral deconvolution scheme is effective in minimizing line-shape distortions, and the dual-resolution reconstruction scheme is capable of minimizing spectral leakage artifacts. Copyright © 2013 Wiley Periodicals, Inc.
Algorithms for detection of objects in image sequences captured from an airborne imaging system
Kasturi, Rangachar; Camps, Octavia; Tang, Yuan-Liang; Devadiga, Sadashiva; Gandhi, Tarak
1995-01-01
This research was initiated as part of the effort at the NASA Ames Research Center to design a computer vision based system that can enhance the safety of navigation by aiding pilots in detecting various obstacles on the runway during critical sections of the flight, such as a landing maneuver. The primary goal is the development of algorithms for the detection of moving objects from a sequence of images obtained from an on-board video camera. Image regions corresponding to independently moving objects are segmented from the background by applying constraint filtering to the optical flow computed from the initial few frames of the sequence. These detected regions are tracked over subsequent frames using a model-based tracking algorithm. The position and velocity of the moving objects in world coordinates are estimated using an extended Kalman filter. The algorithms are tested using the NASA line image sequence with six static trucks and a simulated moving truck, and experimental results are described. Various limitations of the currently implemented version of the above algorithm are identified, and possible solutions for building a practical working system are investigated.
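The position/velocity estimation step can be illustrated with a plain linear Kalman filter under a constant-velocity model. The paper uses an extended Kalman filter in world coordinates; the matrices, noise levels and synthetic track below are invented for illustration only.

```python
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], float)      # constant-velocity motion model
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], float)      # only position is observed
Q = np.eye(4) * 1e-3                     # process noise (assumed)
R = np.eye(2) * 0.25                     # measurement noise (assumed)

x = np.array([0.0, 0.0, 1.0, 0.5])      # state: [px, py, vx, vy]
P = np.eye(4)

def kalman_step(x, P, z):
    """One predict/update cycle of a linear Kalman filter."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(4) - K @ H) @ P_pred
    return x_new, P_new

# Track a target moving at (1, 0.5) units/frame with noisy detections.
rng = np.random.default_rng(2)
for t in range(1, 20):
    z = np.array([t * 1.0, t * 0.5]) + rng.normal(0, 0.5, 2)
    x, P = kalman_step(x, P, z)

vx_est, vy_est = x[2], x[3]
```

An EKF replaces F and H with the Jacobians of a nonlinear camera/world model at each step; the predict/update structure is otherwise the same.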
Directory of Open Access Journals (Sweden)
Chen Deyun
2013-01-01
Full Text Available Because image reconstruction accuracy in electrical capacitance tomography is limited by the "soft field" nature of the sensing field and the ill-conditioning of the inverse problem, a super-resolution image reconstruction algorithm based on Landweber iteration is proposed in this paper, building on the working principle of the electrical capacitance tomography system. The method regularizes the solution and derives a closed-form solution through a fast Fourier transform of the convolution kernel, which guarantees the uniqueness of the solution and improves the stability and quality of the reconstruction results. Simulation results show that the imaging precision and real-time performance of the algorithm are better than those of the Landweber algorithm, providing a new approach to image reconstruction for electrical capacitance tomography.
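A basic projected Landweber iteration, which this paper builds on, can be sketched as follows; the sensitivity matrix, image size and relaxation factor are toy assumptions, and the paper's FFT-based closed-form acceleration is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy ECT-like linear problem: sensitivity matrix S, permittivity image g.
m, n = 40, 64                     # 40 capacitance measurements, 64-pixel image
S = rng.random((m, n))
S /= np.linalg.norm(S, 2)         # scale so the iteration is stable
g_true = np.zeros(n); g_true[18:22] = 1.0
c = S @ g_true                    # noiseless capacitance data

alpha = 1.0                       # relaxation factor (must be < 2/||S||^2)
g = np.zeros(n)
for _ in range(500):              # Landweber: g <- g + alpha * S^T (c - S g)
    g = g + alpha * (S.T @ (c - S @ g))
    g = np.clip(g, 0.0, 1.0)      # project onto physical permittivity bounds

residual = np.linalg.norm(c - S @ g)
```

The clipping step is the usual projection used in ECT practice; the paper's contribution is replacing this slow fixed-point iteration with a closed-form deconvolution in the Fourier domain.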
Implementation of digital image encryption algorithm using logistic function and DNA encoding
Suryadi, MT; Satria, Yudi; Fauzi, Muhammad
2018-03-01
Cryptography is a method to secure information, which may take the form of a digital image. Based on past research, and in order to increase the security level of chaos-based and DNA-based encryption algorithms, an encryption algorithm using a logistic function and DNA encoding was proposed. The algorithm uses DNA encoding to map the pixel values to DNA bases and scrambles them with DNA addition, DNA complement, and XOR operations. The logistic function serves as the random number generator needed in the DNA complement and XOR operations. Test results show that the PSNR values of the cipher images are 7.98-7.99, the entropy values are close to 8 bits, the histograms of the cipher images are uniformly distributed, and the correlation coefficients of the cipher images are near 0. The cipher images can be decrypted perfectly, and the encryption algorithm has good resistance to entropy and statistical attacks.
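The role of the logistic map as a keystream generator can be sketched as below; the DNA encoding/addition steps are omitted, and the parameters, byte mapping and toy "pixels" are illustrative assumptions rather than the paper's exact construction.

```python
def logistic_stream(x0, r=3.99, n=16, skip=100):
    """Generate n pseudo-random bytes from the logistic map x <- r*x*(1-x)."""
    x = x0
    for _ in range(skip):          # discard the transient
        x = r * x * (1 - x)
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return out

plain = [120, 45, 200, 7, 99, 13, 255, 0]          # toy "pixel" values
key = logistic_stream(0.3141592, n=len(plain))     # x0 acts as the secret key
cipher = [p ^ k for p, k in zip(plain, key)]       # XOR stage of the scheme
decrypted = [c ^ k for c, k in zip(cipher, key)]   # XOR is its own inverse
```

Because XOR is involutive, decryption with the same keystream recovers the plaintext exactly; sensitivity to x0 is what gives chaos-based schemes their key space.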
Granulometry use for the study of dynamics speckles patterns
International Nuclear Information System (INIS)
Mavilio, Adriana; Fernandez, Margarita; Trivi, Marcelo; Rabal, Hector; Arizaga, Ricardo
2009-01-01
Dynamic speckle patterns are generated by laser light scattering from surfaces that exhibit some kind of activity due to physical or biological processes taking place in the illuminated object. The characterization of this dynamic process is carried out by studying the texture changes of auxiliary images, the temporal history of the speckle pattern (THSP), obtained from these speckle patterns. The drying process of water-borne paint is studied through a method based on mathematical morphology applied to THSP image processing. It is based on obtaining the granulometry of these images and their characteristic granulometric spectrum. From the granulometric size distribution of each THSP image, four parameters are obtained: mean length, standard deviation, asymmetry and kurtosis. These parameters are found to be suitable as texture features. The Mahalanobis distance is calculated between the texture features of the THSP images representative of the temporal stages of the drying process and the features of the final stage, or pattern texture. The behavior of the distance function describes the drying process of the water-borne paint satisfactorily. Finally, these results are compared with those obtained by other methods. The granulometric method reported in this work is distinguished by its simplicity and ease of implementation, and can be used to characterize the evolution of any process recorded through dynamic speckle. (Author)
DEFF Research Database (Denmark)
Sidky, Emil Y.; Jørgensen, Jakob Heide; Pan, Xiaochuan
2012-01-01
The primal–dual optimization algorithm developed in Chambolle and Pock (CP) (2011 J. Math. Imag. Vis. 40 1–26) is applied to various convex optimization problems of interest in computed tomography (CT) image reconstruction. This algorithm allows for rapid prototyping of optimization problems… for the purpose of designing iterative image reconstruction algorithms for CT. The primal–dual algorithm is briefly summarized in this paper, and its potential for prototyping is demonstrated by explicitly deriving CP algorithm instances for many optimization problems relevant to CT. An example application…
Wognum, S.; Heethuis, S. E.; Rosario, T.; Hoogeman, M. S.; Bel, A.
2014-01-01
The spatial accuracy of deformable image registration (DIR) is important in the implementation of image guided adaptive radiotherapy techniques for cancer in the pelvic region. Validation of algorithms is best performed on phantoms with fiducial markers undergoing controlled large deformations.
Zhang, Leihong; Liang, Dong; Li, Bei; Kang, Yi; Pan, Zilan; Zhang, Dawei; Gao, Xiumin; Ma, Xiuhua
2016-07-01
On the basis of an analysis of the cosine light field with a determined analytic expression and of the pseudo-inverse method, the object is illuminated by a preset light field with a determined discrete Fourier transform measurement matrix, and the object image is reconstructed by the pseudo-inverse method. The analytic expression of the computational ghost imaging algorithm based on the discrete Fourier transform measurement matrix (FGI) is deduced theoretically and compared with the compressive computational ghost imaging algorithm based on a random measurement matrix (PGI). The reconstruction process and the reconstruction error are analyzed, and simulations are performed to verify the theoretical analysis. When the number of sampling measurements is similar to the number of object pixels, the rank of the discrete Fourier transform matrix is the same as that of the random measurement matrix, the PSNR of the images reconstructed by the FGI and PGI algorithms is similar, and the reconstruction error of the traditional CGI algorithm is lower than that of the FGI and PGI algorithms. As the number of sampling measurements decreases, the PSNR of the FGI reconstruction decreases slowly, while the PSNR of the PGI and CGI reconstructions decreases sharply. The reconstruction time of the FGI algorithm is lower than that of the other algorithms and is not affected by the number of sampling measurements. The FGI algorithm can effectively filter out random white noise through a low-pass filter, achieving reconstruction denoising with a higher denoising capability than the CGI algorithm. The FGI algorithm can thus improve both the reconstruction accuracy and the reconstruction speed of computational ghost imaging.
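The core idea, illuminating with deterministic structured patterns whose measurement matrix is known and inverting by pseudo-inverse, can be sketched with a 1-D toy object and a complex DFT matrix standing in for the cosine light field (an idealization, not the paper's optical setup).

```python
import numpy as np

n = 16                                  # toy object with 16 "pixels"
obj = np.zeros(n); obj[5:9] = 1.0

# Measurement matrix: rows of the discrete Fourier transform, playing the
# role of the preset structured illumination patterns.
k = np.arange(n)
Phi = np.exp(-2j * np.pi * np.outer(k, k) / n)

y = Phi @ obj                           # simulated bucket measurements
recon = np.real(np.linalg.pinv(Phi) @ y)  # pseudo-inverse reconstruction
```

With a full-rank deterministic matrix the pseudo-inverse recovers the object exactly and can be precomputed once, which is why reconstruction time does not depend on re-solving an optimization problem as in compressive ghost imaging.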
Automated drusen detection in retinal images using analytical modelling algorithms
Directory of Open Access Journals (Sweden)
Manivannan Ayyakkannu
2011-07-01
Full Text Available Abstract Background Drusen are common features in the ageing macula associated with exudative Age-Related Macular Degeneration (ARMD). They are visible in retinal images, and their quantitative analysis is important in the follow-up of ARMD. However, their evaluation is fastidious and difficult to reproduce when performed manually. Methods This article proposes a methodology for Automatic Drusen Deposits Detection and quantification in Retinal Images (AD3RI) using digital image processing techniques. It includes an image pre-processing method to correct the uneven illumination and to normalize the intensity contrast with smoothing splines. The drusen detection uses a gradient-based segmentation algorithm that isolates drusen and provides basic drusen characterization to the modelling stage. The detected drusen are then fitted by modified Gaussian functions, producing a model of the image that is used to evaluate the affected area. Twenty-two images were graded by eight experts, with the aid of custom-made software, and compared with AD3RI. This comparison was based both on the total area and on a pixel-to-pixel analysis. The coefficient of variation, the intraclass correlation coefficient, the sensitivity, the specificity and the kappa coefficient were calculated. Results The ground truth used in this study was the experts' average grading. In order to evaluate the proposed methodology, three indicators were defined: AD3RI compared to the ground truth (A2G), each expert compared to the other experts (E2E) and a standard Global Threshold method compared to the ground truth (T2G). The results obtained for the three indicators, A2G, E2E and T2G, were: coefficient of variation 28.8 %, 22.5 % and 41.1 %, intraclass correlation coefficient 0.92, 0.88 and 0.67, sensitivity 0.68, 0.67 and 0.74, specificity 0.96, 0.97 and 0.94, and kappa coefficient 0.58, 0.60 and 0.49, respectively. Conclusions The gradings produced by AD3RI obtained an agreement
3-D Image Encryption Based on Rubik's Cube and RC6 Algorithm
Helmy, Mai; El-Rabaie, El-Sayed M.; Eldokany, Ibrahim M.; El-Samie, Fathi E. Abd
2017-12-01
A novel encryption algorithm based on the 3-D Rubik's cube is proposed in this paper to achieve 3-D encryption of a group of images. The proposed encryption algorithm begins with RC6 as a first step for encrypting multiple images separately. After that, the obtained encrypted images are further encrypted with the 3-D Rubik's cube, with the RC6-encrypted images used as the faces of the cube. In terms of image encryption concepts, the RC6 algorithm adds a degree of diffusion, while the Rubik's cube algorithm adds a degree of permutation. The simulation results demonstrate that the proposed encryption algorithm is efficient and exhibits strong robustness and security. The encrypted images are further transmitted over a wireless Orthogonal Frequency Division Multiplexing (OFDM) system and decrypted at the receiver side. Evaluation of the quality of the decrypted images at the receiver side reveals good results.
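The permutation stage can be illustrated with Rubik's-cube-style circular row and column shifts; this sketch omits the RC6 diffusion stage and the OFDM transmission, and the key layout is an assumption for illustration.

```python
import numpy as np

def rubik_scramble(img, key_rows, key_cols):
    """Circularly shift each row, then each column, by key amounts."""
    out = img.copy()
    for i, s in enumerate(key_rows):
        out[i] = np.roll(out[i], s)
    for j, s in enumerate(key_cols):
        out[:, j] = np.roll(out[:, j], s)
    return out

def rubik_unscramble(img, key_rows, key_cols):
    """Invert the shifts in reverse order: columns first, then rows."""
    out = img.copy()
    for j, s in enumerate(key_cols):
        out[:, j] = np.roll(out[:, j], -s)
    for i, s in enumerate(key_rows):
        out[i] = np.roll(out[i], -s)
    return out

rng = np.random.default_rng(4)
img = rng.integers(0, 256, (8, 8))       # toy 8x8 image face
kr = np.arange(8)                        # hypothetical per-row shift key
kc = np.arange(8)[::-1]                  # hypothetical per-column shift key

scrambled = rubik_scramble(img, kr, kc)
restored = rubik_unscramble(scrambled, kr, kc)
```

A permutation alone leaves the pixel histogram unchanged, which is exactly why the scheme pairs it with RC6 diffusion.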
Frequency-domain imaging algorithm for ultrasonic testing by application of matrix phased arrays
Directory of Open Access Journals (Sweden)
Dolmatov Dmitry
2017-01-01
Full Text Available Constantly increasing demand for high-performance materials and systems in the aerospace industry requires advanced methods of nondestructive testing. One of the most promising methods is ultrasonic imaging using matrix phased arrays. This technique makes it possible to create three-dimensional ultrasonic images with high lateral resolution. Further progress in matrix phased array ultrasonic testing is determined by the development of fast imaging algorithms. In this article, an imaging algorithm based on frequency-domain calculations is proposed. This approach is computationally efficient in comparison with time-domain algorithms. The performance of the proposed algorithm was tested via computer simulations for a planar specimen with flat-bottom holes.
A segmentation algorithm based on image projection for complex text layout
Zhu, Wangsheng; Chen, Qin; Wei, Chuanyi; Li, Ziyang
2017-10-01
Segmentation is an important part of layout analysis. Considering the efficiency advantage of the top-down approach and the particularities of the target documents, we propose a projection-based layout segmentation algorithm. First, the algorithm partitions the text image into several columns; then, by scanning the projection of each column, the text image is divided into several sub-regions through multiple projections. Experimental results show that this method inherits the rapid calculation speed of projection itself, while avoiding the effect of arc-shaped image distortion on page segmentation, and can accurately segment text images with complex layouts.
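A projection-profile split of the kind described can be sketched as follows; the toy page layout and the zero-gap splitting rule are illustrative assumptions.

```python
import numpy as np

def split_by_projection(binary, axis):
    """Split a binary text image into bands wherever the projection is empty."""
    profile = binary.sum(axis=axis)
    bands, start = [], None
    for i, v in enumerate(profile):
        if v > 0 and start is None:
            start = i                      # band begins
        elif v == 0 and start is not None:
            bands.append((start, i))       # band ends at the empty gap
            start = None
    if start is not None:
        bands.append((start, len(profile)))
    return bands

# Toy page: two "columns" of black pixels separated by white space.
page = np.zeros((20, 30), dtype=int)
page[2:18, 2:12] = 1
page[2:18, 18:28] = 1

columns = split_by_projection(page, axis=0)   # vertical projection -> columns
```

Recursing with the horizontal projection inside each detected column (e.g. `split_by_projection(page[:, 2:12], axis=1)`) yields the sub-regions the abstract describes.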
Speckle reduction methods in laser-based picture projectors
Akram, M. Nadeem; Chen, Xuyuan
2016-02-01
Laser sources have been promised for many years to be better light sources than traditional lamps or light-emitting diodes (LEDs) for projectors: they enable projectors with a wide colour gamut for vivid images, high brightness and contrast for the best picture quality, a long lifetime for maintenance-free operation, mercury-free construction, and low power consumption for a green environment. A major technological obstacle to using lasers for projection has been the speckle noise caused by the coherent nature of laser light. For speckle reduction, current state-of-the-art solutions rely on moving parts with large physical space demands. Solutions beyond the state of the art need to be developed, such as integrated optical components, hybrid MOEMS devices, and active phase modulators for compact speckle reduction. In this article, the major methods reported in the literature for speckle reduction in laser projectors are presented and explained. With the advancement of semiconductor lasers, their greatly reduced cost for the red, green and blue primary colours, and the methods developed for speckle reduction, it is hoped that lasers will be widely utilized in different projector applications in the near future.
Multiple rotation assessment through isothetic fringes in speckle photography
International Nuclear Information System (INIS)
Angel, Luciano; Tebaldi, Myrian; Bolognini, Nestor
2007-01-01
The use of different pupils for storing each speckled image in speckle photography is employed to determine multiple in-plane rotations. The method consists of recording a four-exposure specklegram in which the rotations are done between exposures. This specklegram is then optically processed in a whole-field approach, rendering isothetic fringes that give detailed information about the multiple rotations. It is experimentally demonstrated that the proposed arrangement permits the depiction of six isothetics in order to measure either six different angles or three nonparallel components of two local general in-plane displacements.
Zhao, Yuchen; Zemmamouche, Redouane; Vandenrijt, Jean-François; Georges, Marc P.
2018-05-01
A combination of digital holographic interferometry (DHI) and digital speckle photography (DSP) allows in-plane and out-of-plane displacement measurement between two states of an object. The former can be determined by correlating the two speckle patterns, whereas the latter is given by the phase difference obtained from DHI. We show that the amplitude of the numerically reconstructed object wavefront obtained from Fresnel in-line digital holography (DH), in combination with phase-shifting techniques, can be used as speckle patterns in DSP. The accuracy of the in-plane measurement is improved after correcting the phase errors induced by the reference wave during the reconstruction process. Furthermore, unlike a conventional imaging system, Fresnel DH offers the possibility to resize the pixel size of the speckle patterns on the reconstruction plane under the same optical configuration simply by zero-padding the hologram. The flexibility of speckle-size adjustment in Fresnel DH ensures the accuracy of the estimation results using DSP.
Sensitivity study of voxel-based PET image comparison to image registration algorithms
Energy Technology Data Exchange (ETDEWEB)
Yip, Stephen, E-mail: syip@lroc.harvard.edu; Chen, Aileen B.; Berbeco, Ross [Department of Radiation Oncology, Brigham and Women’s Hospital, Dana-Farber Cancer Institute and Harvard Medical School, Boston, Massachusetts 02115 (United States); Aerts, Hugo J. W. L. [Department of Radiation Oncology, Brigham and Women’s Hospital, Dana-Farber Cancer Institute and Harvard Medical School, Boston, Massachusetts 02115 and Department of Radiology, Brigham and Women’s Hospital and Harvard Medical School, Boston, Massachusetts 02115 (United States)
2014-11-01
Purpose: Accurate deformable registration is essential for voxel-based comparison of sequential positron emission tomography (PET) images for proper adaptation of the treatment plan and treatment response assessment. The comparison may be sensitive to the method of deformable registration, as the optimal algorithm is unknown. This study investigated the impact of registration algorithm choice on therapy response evaluation. Methods: Sixteen patients with 20 lung tumors underwent pre- and post-treatment computed tomography (CT) and 4D FDG-PET scans before and after chemoradiotherapy. All CT images were coregistered using a rigid and ten deformable registration algorithms. The resulting transformations were then applied to the respective PET images. Moreover, the tumor region defined by a physician on the registered PET images was classified into progressor, stable-disease, and responder subvolumes. Specifically, voxels with standardized uptake value (SUV) decreases >30% were classified as responder, while voxels with SUV increases >30% were progressor. All other voxels were considered stable-disease. The agreement of the subvolumes resulting from different registration algorithms was assessed by the Dice similarity index (DSI). The coefficient of variation (CV) was computed to assess the variability of DSI between individual tumors. The root mean square difference (RMS_rigid) of the rigidly registered CT images was used to measure the degree of tumor deformation. RMS_rigid and DSI were correlated by the Spearman correlation coefficient (R) to investigate the effect of tumor deformation on DSI. Results: Median DSI_rigid was found to be 72%, 66%, and 80% for progressor, stable-disease, and responder, respectively. Median DSI_deformable was 63%-84%, 65%-81%, and 82%-89%. The variability of DSI was substantial and similar for both rigid and deformable algorithms, with CV > 10% for all subvolumes. Tumor deformation had a moderate to significant impact on DSI for progressor
Impact of transducer frequency setting on speckle tracking measures
DEFF Research Database (Denmark)
Olsen, Flemming Javier; Svendsen, Jesper Hastrup; Køber, Lars
2018-01-01
.5/3.0 MHz. The images were obtained immediately after each other at the exact same position for the two settings. Speckle tracking was performed in three apical projections, allowing for acquisition of layered global longitudinal strain (GLS) and strain rate measures. Concordance between the frequency...
Indian Academy of Sciences (India)
polynomial) division have been found in Vedic Mathematics, which is dated much before Euclid's algorithm. A programming language is used to describe an algorithm for execution on a computer. An algorithm expressed using a programming
Motion tolerant iterative reconstruction algorithm for cone-beam helical CT imaging
Energy Technology Data Exchange (ETDEWEB)
Takahashi, Hisashi; Goto, Taiga; Hirokawa, Koichi; Miyazaki, Osamu [Hitachi Medical Corporation, Chiba-ken (Japan). CT System Div.
2011-07-01
We have developed a new advanced iterative reconstruction algorithm for cone-beam helical CT. The features of this algorithm are: (a) it uses the separable paraboloidal surrogate (SPS) technique as a foundation for reconstruction to reduce noise and cone-beam artifacts; (b) it uses a view weight in the back-projection process to reduce motion artifacts. To confirm the improvement of our proposed algorithm over existing algorithms such as Feldkamp-Davis-Kress (FDK) or SPS, we compared motion artifact reduction, image noise reduction (standard deviation of CT number), and cone-beam artifact reduction on simulated and clinical data sets. Our results demonstrate that the proposed algorithm dramatically reduces motion artifacts compared with the SPS algorithm and decreases image noise compared with the FDK algorithm. In addition, the proposed algorithm potentially improves the time resolution of iterative reconstruction. (orig.)
Secure image encryption algorithm design using a novel chaos based S-Box
International Nuclear Information System (INIS)
Çavuşoğlu, Ünal; Kaçar, Sezgin; Pehlivan, Ihsan; Zengin, Ahmet
2017-01-01
Highlights: • A new chaotic system is developed for creating an S-Box and an image encryption algorithm. • A chaos-based random number generator is designed with the help of the new chaotic system. NIST tests are run on the generated random numbers to verify randomness. • A new S-Box design algorithm is developed to create the chaos-based S-Box to be utilized in the encryption algorithm, and performance tests are made. • The newly developed S-Box-based image encryption algorithm is introduced and an image encryption application is carried out. • To demonstrate the quality and strength of the encryption process, security analyses are performed and compared with the AES and chaos algorithms. - Abstract: In this study, an encryption algorithm that uses a chaos-based S-Box is developed for secure and fast image encryption. First of all, a new chaotic system is developed for creating the S-Box and the image encryption algorithm. A chaos-based random number generator is designed with the help of the new chaotic system. Then, NIST tests are run on the generated random numbers to verify randomness. A new S-Box design algorithm is developed to create the chaos-based S-Box to be utilized in the encryption algorithm, and performance tests are made. As the next step, the newly developed S-Box-based image encryption algorithm is introduced in detail. Finally, an image encryption application is carried out. To demonstrate the quality and strength of the encryption process, security analyses are performed, and the proposed algorithm is compared with the AES and chaos algorithms. According to the test results, the proposed image encryption algorithm is secure and fast for image encryption applications.
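The record does not specify how the chaotic trajectory becomes a bijective S-Box; a common construction is to rank-order the iterates of a chaotic map. A minimal sketch, assuming a plain logistic map in place of the authors' new chaotic system:

```python
import numpy as np

def chaotic_sbox(x0=0.7, r=3.99, skip=100):
    """Derive a bijective 8-bit S-Box by rank-ordering logistic-map iterates.

    Illustrative only: the paper uses its own novel chaotic system,
    not the logistic map assumed here.
    """
    x = x0
    for _ in range(skip):            # discard transient iterates
        x = r * x * (1.0 - x)
    vals = np.empty(256)
    for i in range(256):
        x = r * x * (1.0 - x)
        vals[i] = x
    return np.argsort(vals)          # a permutation of 0..255, i.e. a bijective S-Box

sbox = chaotic_sbox()
```

Substituting each image byte through `sbox` would then realize one confusion step of an encryption round; since `argsort` always returns a permutation, the S-Box is invertible by construction.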
Multiscale Distance Coherence Vector Algorithm for Content-Based Image Retrieval
Jiexian, Zeng; Xiupeng, Liu
2014-01-01
A multiscale distance coherence vector algorithm for content-based image retrieval (CBIR) is proposed to address two shortcomings of the distance coherence vector algorithm: identical descriptors for different shapes and poor noise robustness. In this algorithm, the image contour curve is first evolved with a Gaussian function, and the distance coherence vector is then extracted from the contours of the original image and the evolved images. The multiscale distance coherence vector is obtained by a reasonable weighted combination of the distance coherence vectors of the evolved image contours. The algorithm is not only invariant to translation, rotation, and scaling transformations but also robust to noise. Experimental results show that the algorithm achieves a higher recall rate and precision rate for the retrieval of images polluted by noise. PMID:24883416
Improving performance of wavelet-based image denoising algorithm using complex diffusion process
DEFF Research Database (Denmark)
Nadernejad, Ehsan; Sharifzadeh, Sara; Korhonen, Jari
2012-01-01
Image enhancement and de-noising is an essential pre-processing step in many image processing algorithms. In any image de-noising algorithm, the main concern is to keep the interesting structures of the image, which often correspond to discontinuities (edges). The proposed algorithm has been evaluated using a variety of standard images, and its performance has been compared against several de-noising algorithms known from the prior art. Experimental results show that the proposed algorithm preserves the edges better and, in most cases, improves the measured visual quality of the denoised images in comparison to the existing methods known from the literature. The improvement is obtained without excessive computational cost, and the algorithm works well on a wide range of different types of noise.
Robustness of Multiple Clustering Algorithms on Hyperspectral Images
National Research Council Canada - National Science Library
Williams, Jason P
2007-01-01
Various clustering algorithms were employed, including a hierarchical method, ISODATA, K-means, and X-means, and were used on a simple two-dimensional dataset in order to discover potential problems with the algorithms...
Sparse Nonlinear Electromagnetic Imaging Accelerated With Projected Steepest Descent Algorithm
Desmal, Abdulla; Bagci, Hakan
2017-01-01
steepest descent algorithm. The algorithm uses a projection operator to enforce the sparsity constraint by thresholding the solution at every iteration. Thresholding level and iteration step are selected carefully to increase the efficiency without
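The truncated record outlines the two ingredients: a steepest-descent step on the data misfit and a thresholding projection that enforces sparsity. A minimal linear-problem sketch (the paper treats the nonlinear electromagnetic problem; the sizes, unit step, and sparsity level here are assumptions):

```python
import numpy as np

def projected_steepest_descent(A, y, k, iters=200):
    """Sketch of sparse recovery: a steepest-descent step on ||y - Ax||^2
    followed by projection onto k-sparse vectors via hard thresholding.
    The unit step is valid here because A has orthonormal rows."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = x + A.T @ (y - A @ x)            # steepest-descent step
        small = np.argsort(np.abs(x))[:-k]   # all but the k largest entries
        x[small] = 0.0                       # thresholding projection
    return x

rng = np.random.default_rng(0)
A = np.linalg.qr(rng.standard_normal((128, 64)))[0].T   # 64x128, orthonormal rows
x_true = np.zeros(128)
x_true[[5, 40, 99]] = [2.0, -1.5, 1.0]                  # 3-sparse ground truth
y = A @ x_true
x_hat = projected_steepest_descent(A, y, k=3)
```

The thresholding level (here: keep the k largest magnitudes) and the step size are exactly the two quantities the abstract says must be selected carefully for efficiency.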
Research on Adaptive Optics Image Restoration Algorithm by Improved Expectation Maximization Method
Zhang, Lijuan; Li, Dongming; Su, Wei; Yang, Jinhua; Jiang, Yutong
2014-01-01
To improve the restoration of adaptive optics images, we put forward a deconvolution algorithm, improved by the EM algorithm, that jointly processes multiple frames of adaptive optics images based on expectation-maximization theory. First, a mathematical model is established for the degraded multiframe adaptive optics images. A point spread function model varying with time is deduced based on the phase error. The AO images are denoised using the image power spectral density and support constrain...
Mandell, Jacob C; Khurana, Bharti; Folio, Les R; Hyun, Hyewon; Smith, Stacy E; Dunne, Ruth M; Andriole, Katherine P
2017-06-01
A methodology is described using Adobe Photoshop and Adobe Extendscript to process DICOM images with a Relative Attenuation-Dependent Image Overlay (RADIO) algorithm to visualize the full dynamic range of CT in one view, without requiring a change in window and level settings. The potential clinical uses for such an algorithm are described in a pictorial overview, including applications in emergency radiology, oncologic imaging, and nuclear medicine and molecular imaging.
Imaging the Epoch of Reionization: Limitations from Foreground Confusion and Imaging Algorithms
International Nuclear Information System (INIS)
Vedantham, Harish; Udaya Shankar, N.; Subrahmanyan, Ravi
2012-01-01
Tomography of redshifted 21 cm transition from neutral hydrogen using Fourier synthesis telescopes is a promising tool to study the Epoch of Reionization (EoR). Limiting the confusion from Galactic and extragalactic foregrounds is critical to the success of these telescopes. The instrumental response or the point-spread function (PSF) of such telescopes is inherently three dimensional with frequency mapping to the line-of-sight (LOS) distance. EoR signals will necessarily have to be detected in data where continuum confusion persists; therefore, it is important that the PSF has acceptable frequency structure so that the residual foreground does not confuse the EoR signature. This paper aims to understand the three-dimensional PSF and foreground contamination in the same framework. We develop a formalism to estimate the foreground contamination along frequency, or equivalently LOS dimension, and establish a relationship between foreground contamination in the image plane and visibility weights on the Fourier plane. We identify two dominant sources of LOS foreground contamination—'PSF contamination' and 'gridding contamination'. We show that PSF contamination is localized in LOS wavenumber space, beyond which there potentially exists an 'EoR window' with negligible foreground contamination where we may focus our efforts to detect EoR. PSF contamination in this window may be substantially reduced by judicious choice of a frequency window function. Gridding and imaging algorithms create additional gridding contamination and we propose a new imaging algorithm using the Chirp Z Transform that significantly reduces this contamination. Finally, we demonstrate the analytical relationships and the merit of the new imaging algorithm for the case of imaging with the Murchison Widefield Array.
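The Chirp Z Transform at the heart of the proposed imaging algorithm can be evaluated with FFTs via Bluestein's identity nk = (n^2 + k^2 - (k-n)^2)/2. A toy sketch for a plain DFT of non-power-of-two length (not the authors' imager):

```python
import numpy as np

def bluestein_dft(x):
    """Evaluate the length-N DFT via Bluestein's chirp-z algorithm,
    using only power-of-two FFTs. Toy sketch, not the paper's imaging code."""
    N = len(x)
    n = np.arange(N)
    w = np.exp(-1j * np.pi * n**2 / N)       # chirp e^{-i*pi*n^2/N}
    L = 1 << (2 * N - 1).bit_length()        # FFT length >= 2N-1 avoids aliasing
    a = np.zeros(L, dtype=complex)
    a[:N] = x * w                            # pre-multiplied input
    b = np.zeros(L, dtype=complex)
    b[:N] = np.conj(w)                       # c_m = e^{+i*pi*m^2/N}, lags 0..N-1
    b[L - (N - 1):] = np.conj(w[1:])[::-1]   # negative lags wrapped circularly
    conv = np.fft.ifft(np.fft.fft(a) * np.fft.fft(b))
    return w * conv[:N]                      # post-multiply by the chirp

x = np.arange(7, dtype=complex)              # non-power-of-two length
X = bluestein_dft(x)
```

The same machinery lets the transform be evaluated on arbitrary output spacings, which is what makes a CZT-based imager attractive as a gridding-free alternative.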
Directory of Open Access Journals (Sweden)
Hui Huang
2017-01-01
Considering the strengths and weaknesses of the contourlet transform and the characteristics of multimodal medical imaging, we propose a novel image fusion algorithm that combines nonlinear approximation in the contourlet domain with regional image features. The most important coefficient bands of the contourlet sparse matrix are retained by nonlinear approximation. Low-frequency and high-frequency regional features are then used to fuse the medical images. The results strongly suggest that the proposed algorithm improves the visual effect and quality of medical image fusion, as well as image denoising and enhancement.
Optical design of the comet Shoemaker-Levy speckle camera
Energy Technology Data Exchange (ETDEWEB)
Bissinger, H. [Lawrence Livermore National Lab., CA (United States)
1994-11-15
An optical design is presented in which the Lick 3-meter telescope and a bare CCD speckle camera system were used to image the collision sites of the Shoemaker-Levy 9 comet with the planet Jupiter. The brief overview includes the optical constraints and system layout. The choice of a Risley prism combination to compensate for time-dependent atmospheric chromatic changes is described. Plate scale and signal-to-noise ratio curves resulting from imaging reference stars are compared with theory. Uncorrected and reconstructed images of Jupiter's impact sites are compared. The results confirm that speckle imaging techniques can be used over an extended time period to provide a method to image large extended objects.
Low-Complexity Compression Algorithm for Hyperspectral Images Based on Distributed Source Coding
Directory of Open Access Journals (Sweden)
Yongjian Nian
2013-01-01
A low-complexity compression algorithm for hyperspectral images based on distributed source coding (DSC) is proposed in this paper. The proposed distributed compression algorithm can realize both lossless and lossy compression, implemented by performing a scalar quantization strategy on the original hyperspectral images followed by distributed lossless compression. A multilinear regression model is introduced for distributed lossless compression in order to improve the quality of the side information. The optimal quantization step is determined according to the restriction of correct DSC decoding, which makes the proposed algorithm achieve near-lossless compression. Moreover, an effective rate-distortion algorithm is introduced to achieve a low bit rate. Experimental results show that the compression performance of the proposed algorithm is competitive with that of state-of-the-art compression algorithms for hyperspectral images.
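The near-lossless property rests on a basic fact: a uniform scalar quantizer with step size Δ bounds the per-sample reconstruction error by Δ/2. A minimal sketch, with the DSC coding of the quantizer indices omitted:

```python
import numpy as np

def quantize(x, step):
    """Uniform scalar quantization; the integer indices would be the
    input to the distributed (Slepian-Wolf style) lossless coder."""
    return np.round(x / step).astype(np.int64)

def dequantize(q, step):
    return q * step

rng = np.random.default_rng(1)
band = rng.uniform(0, 1000, size=(32, 32))   # stand-in for one spectral band
step = 4.0
recon = dequantize(quantize(band, step), step)
# rounding guarantees |band - recon| <= step/2 everywhere: "near lossless"
```

Choosing `step` as large as correct DSC decoding allows, as the paper does, then trades this bounded distortion for a lower bit rate.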
A Rotor Tip Vortex Tracing Algorithm for Image Post-Processing
Overmeyer, Austin D.
2015-01-01
A neurite tracing algorithm, originally developed for medical image processing, was used to trace the location of the rotor tip vortex in density gradient flow visualization images. The tracing algorithm was applied to several representative test images to form case studies. The accuracy of the tracing algorithm was compared to two current methods including a manual point and click method and a cross-correlation template method. It is shown that the neurite tracing algorithm can reduce the post-processing time to trace the vortex by a factor of 10 to 15 without compromising the accuracy of the tip vortex location compared to other methods presented in literature.
Analysis and improvement of a chaos-based image encryption algorithm
International Nuclear Information System (INIS)
Xiao Di; Liao Xiaofeng; Wei Pengcheng
2009-01-01
The security of digital images has attracted much attention recently. In Guan et al. [Guan Z, Huang F, Guan W. Chaos-based image encryption algorithm. Phys Lett A 2005;346:153-7], a chaos-based image encryption algorithm was proposed. In this paper, the cause of potential flaws in the original algorithm is analyzed in detail, and the corresponding enhancement measures are proposed. Both theoretical analysis and computer simulation indicate that the improved algorithm can overcome these flaws and maintain all the merits of the original one.
Hybridizing Differential Evolution with a Genetic Algorithm for Color Image Segmentation
Directory of Open Access Journals (Sweden)
R. V. V. Krishna
2016-10-01
This paper proposes a hybrid of differential evolution and genetic algorithms to solve the color image segmentation problem. Clustering-based color image segmentation algorithms segment an image by clustering color and texture features, thereby obtaining accurate prototype cluster centers. In the proposed algorithm, the color features are obtained using the homogeneity model. A new texture feature named the Power Law Descriptor (PLD), a modification of the Weber Local Descriptor (WLD), is proposed and used as a texture feature for clustering. Genetic algorithms are competent in handling binary variables, while differential evolution is more efficient in handling real parameters. The obtained texture feature is binary in nature and the color feature is real-valued, which suits the hybrid cluster center optimization problem in image segmentation very well. Thus, in the proposed algorithm, the optimum texture feature centers are evolved using genetic algorithms, whereas the optimum color feature centers are evolved using differential evolution.
Research on Image Reconstruction Algorithms for Tuber Electrical Resistance Tomography System
Directory of Open Access Journals (Sweden)
Jiang Zili
2016-01-01
The application of electrical resistance tomography (ERT) technology has been expanded to the field of agriculture, and the concept of TERT (Tuber Electrical Resistance Tomography) is proposed. On the basis of research on the forward and inverse problems of the TERT system, a hybrid algorithm based on a genetic algorithm is proposed, which can be used in a TERT system to monitor the growth status of plant tubers. Image reconstruction in a TERT system differs from that in a conventional ERT system for two-phase flow measurement: TERT requires higher measurement precision, whereas conventional ERT cares more about reconstruction speed. A variety of algorithms, for example linear back projection, modified Newton-Raphson, and genetic algorithms, are analyzed and optimized to make them suitable for the TERT system. Experimental results showed that the novel hybrid algorithm is superior to the other algorithms and can effectively improve image reconstruction quality.
High-speed cell recognition algorithm for ultrafast flow cytometer imaging system
Zhao, Wanyue; Wang, Chao; Chen, Hongwei; Chen, Minghua; Yang, Sigang
2018-04-01
An optical time-stretch flow imaging system enables high-throughput examination of cells/particles with unprecedented speed and resolution, producing a significant amount of raw image data. A high-speed cell recognition algorithm is therefore in high demand to analyze large amounts of data efficiently. We propose a high-speed cell recognition algorithm consisting of two-stage cascaded detection and Gaussian mixture model (GMM) classification. The first stage of detection extracts cell regions. The second stage integrates distance transform and the watershed algorithm to separate clustered cells. Finally, the detected cells are classified by the GMM. We compared the performance of our algorithm with a support vector machine. Results show that our algorithm increases the running speed by over 150% without sacrificing recognition accuracy. This algorithm provides a promising solution for high-throughput and automated cell imaging and classification in the ultrafast flow cytometer imaging platform.
Li, Shuo; Jin, Weiqi; Li, Li; Li, Yiyang
2018-05-01
Infrared thermal images can reflect the thermal-radiation distribution of a particular scene. However, the contrast of the infrared images is usually low. Hence, it is generally necessary to enhance the contrast of infrared images in advance to facilitate subsequent recognition and analysis. Based on the adaptive double plateaus histogram equalization, this paper presents an improved contrast enhancement algorithm for infrared thermal images. In the proposed algorithm, the normalized coefficient of variation of the histogram, which characterizes the level of contrast enhancement, is introduced as feedback information to adjust the upper and lower plateau thresholds. The experiments on actual infrared images show that compared to the three typical contrast-enhancement algorithms, the proposed algorithm has better scene adaptability and yields better contrast-enhancement results for infrared images with more dark areas or a higher dynamic range. Hence, it has high application value in contrast enhancement, dynamic range compression, and digital detail enhancement for infrared thermal images.
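A plain-numpy sketch of double-plateau histogram equalization may clarify the mechanism; the plateau thresholds are fixed here, whereas the proposed algorithm adapts them using the histogram's normalized coefficient of variation as feedback:

```python
import numpy as np

def double_plateau_equalize(img, t_low, t_up, levels=256):
    """Clip the histogram between two plateaus, then equalize.

    The upper plateau limits over-enhancement of large uniform backgrounds;
    the lower plateau protects sparsely populated detail bins. Thresholds
    are fixed here; the paper adjusts them adaptively.
    """
    hist, _ = np.histogram(img, bins=levels, range=(0, levels))
    clipped = np.minimum(hist, t_up)                           # upper plateau
    clipped[hist > 0] = np.maximum(clipped[hist > 0], t_low)   # lower plateau
    cdf = np.cumsum(clipped).astype(float)
    lut = np.round(cdf / cdf[-1] * (levels - 1)).astype(np.uint8)
    return lut[img]                                            # monotonic remapping

# synthetic low-contrast frame occupying only gray levels 0..15
ir = (np.tile(np.arange(64), (64, 1)) // 4).astype(np.uint8)
out = double_plateau_equalize(ir, t_low=2, t_up=50)
```

Because the lookup table comes from a cumulative sum, the mapping is monotonic, so relative radiometric ordering in the thermal image is preserved while the dynamic range is stretched.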
Global and Local Page Replacement Algorithms on Virtual Memory Systems for Image Processing
WADA, Ben Tsutom
1985-01-01
Three virtual memory systems for image processing, differing from one another in frame allocation algorithms and page replacement algorithms, were examined experimentally with respect to their page-fault characteristics. The hypothesis that global page replacement algorithms are susceptible to thrashing held in the raster-scan experiment, while it did not in another, non-raster-scan experiment. The results of the experiments may also be useful in making parallel image processors more efficient, while they a...
A Compressed Sensing-based Image Reconstruction Algorithm for Solar Flare X-Ray Observations
Energy Technology Data Exchange (ETDEWEB)
Felix, Simon; Bolzern, Roman; Battaglia, Marina, E-mail: simon.felix@fhnw.ch, E-mail: roman.bolzern@fhnw.ch, E-mail: marina.battaglia@fhnw.ch [University of Applied Sciences and Arts Northwestern Switzerland FHNW, 5210 Windisch (Switzerland)
2017-11-01
One way of imaging X-ray emission from solar flares is to measure Fourier components of the spatial X-ray source distribution. We present a new compressed sensing-based algorithm named VIS-CS, which reconstructs the spatial distribution from such Fourier components. We demonstrate the application of the algorithm on synthetic and observed solar flare X-ray data from the Reuven Ramaty High Energy Solar Spectroscopic Imager satellite and compare its performance with existing algorithms. VIS-CS produces competitive results with accurate photometry and morphology, without requiring any algorithm- and X-ray-source-specific parameter tuning. Its robustness and performance make this algorithm ideally suited for the generation of quicklook images or large image cubes without user intervention, such as for imaging spectroscopy analysis.
International Nuclear Information System (INIS)
Ogino, Takashi; Egawa, Sunao
1991-01-01
New algorithms of CT value correction for reconstructing a radiotherapy simulation image from axial CT images were developed. One, designated the plane weighting method, corrects the CT value in proportion to the position of the beam element passing through the voxel. The other, designated the solid weighting method, corrects the CT value in proportion to the length of the beam element passing through the voxel and the volume of the voxel. Phantom experiments showed fair spatial resolution in the transverse direction. In the longitudinal direction, however, spatial resolution finer than the slice thickness could not be obtained. Contrast resolution was equivalent for both methods. In patient studies, the reconstructed radiotherapy simulation image was visually comparable in density resolution to a simulation film taken by an X-ray simulator. (author)
Directory of Open Access Journals (Sweden)
Xiaole Shen
2015-09-01
The uneven illumination caused by thin clouds reduces the quality of remote sensing images and hinders image interpretation. To remove the effect of thin clouds, an uneven illumination correction can be applied. In this paper, an effective uneven illumination correction algorithm is proposed to remove the effect of thin clouds and to restore the ground information of optical remote sensing images. The imaging model of remote sensing images covered by thin clouds is analyzed. Due to transmission attenuation, reflection, and scattering, thin cloud cover usually increases regional brightness and reduces the saturation and contrast of the image. Accordingly, a wavelet-domain enhancement is performed on the image in Hue-Saturation-Value (HSV) color space. We use images with thin clouds over the Wuhan area captured by the QuickBird and ZiYuan-3 (ZY-3) satellites for experiments. Three traditional uneven illumination correction algorithms, i.e., the multi-scale Retinex (MSR) algorithm, the homomorphic filtering (HF)-based algorithm, and the wavelet transform-based MASK (WT-MASK) algorithm, are performed for comparison. Five indicators, i.e., mean value, standard deviation, information entropy, average gradient, and hue deviation index (HDI), are used to analyze the effect of the algorithms. The experimental results show that the proposed algorithm can effectively eliminate the influence of thin clouds and restore the real color of ground objects under thin clouds.
Improved SURF Algorithm and Its Application in Seabed Relief Image Matching
Directory of Open Access Journals (Sweden)
Zhang Hong-Mei
2017-01-01
Matching based on seabed relief images is widely used in underwater relief matching navigation and target recognition. However, influenced by various factors, conventional matching algorithms have difficulty obtaining ideal results on seabed relief images. The SURF (Speeded-Up Robust Features) algorithm achieves matching through pairs of feature points and can obtain good results on seabed relief images. In practical applications, however, the traditional SURF algorithm is prone to false matches, especially when an area's features are similar or indistinct. In order to improve the robustness of the algorithm, this paper proposes an improved matching algorithm that combines SURF with RANSAC (Random Sample Consensus). The new algorithm integrates the advantages of the two algorithms: first, the SURF algorithm detects and extracts the feature points and performs pre-matching; second, RANSAC eliminates mismatched points, and the accurate matching is then accomplished with the correct matching points. The experimental results show that the improved algorithm overcomes the mismatching problem effectively and has better precision and faster speed than the traditional SURF algorithm.
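The RANSAC stage can be illustrated independently of SURF: repeatedly fit a model to a minimal random sample of correspondences and keep the hypothesis with the most inliers. A toy sketch using a pure 2-D translation model (real matchers typically fit a homography):

```python
import numpy as np

def ransac_translation(src, dst, tol=1.0, iters=100, seed=0):
    """Toy RANSAC: the motion model is a 2-D translation, so the minimal
    sample is a single correspondence. Returns the best translation and
    the inlier mask identifying correct matches."""
    rng = np.random.default_rng(seed)
    best_t, best_mask = None, np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        i = rng.integers(len(src))           # minimal sample: one match
        t = dst[i] - src[i]                  # hypothesized translation
        mask = np.linalg.norm(dst - (src + t), axis=1) < tol
        if mask.sum() > best_mask.sum():
            best_t, best_mask = t, mask
    return best_t, best_mask

rng = np.random.default_rng(1)
src = rng.uniform(0, 100, (20, 2))           # feature points in image 1
dst = src + np.array([5.0, -3.0])            # true translation to image 2
dst[:4] += rng.uniform(20, 40, (4, 2))       # four gross mismatches
t_hat, inliers = ransac_translation(src, dst)
```

Hypotheses seeded by a mismatch collect almost no support, so the consensus set automatically isolates the correct matches, which is exactly the mismatch-elimination role RANSAC plays after SURF pre-matching.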
Despeckling Polsar Images Based on Relative Total Variation Model
Jiang, C.; He, X. F.; Yang, L. J.; Jiang, J.; Wang, D. Y.; Yuan, Y.
2018-04-01
The relative total variation (RTV) algorithm, which can effectively separate structure information from texture in an image, is employed to extract the main structures of the image. However, applying RTV directly to polarimetric SAR (PolSAR) image filtering does not preserve polarimetric information. A new RTV approach based on the complex Wishart distribution is proposed that accounts for the polarimetric properties of PolSAR data. The proposed polarimetric RTV (PolRTV) algorithm can be used for PolSAR image filtering. L-band Airborne SAR (AIRSAR) San Francisco data are used to demonstrate the effectiveness of the proposed algorithm in speckle suppression, structural information preservation, and polarimetric property preservation.
A novel image encryption algorithm based on a 3D chaotic map
Kanso, A.; Ghebleh, M.
2012-07-01
Recently, Solak et al. [Solak E, Çokal C, Yildiz OT, Biyikoǧlu T. Cryptanalysis of Fridrich's chaotic image encryption. Int J Bifur Chaos 2010;20:1405-1413] cryptanalyzed the chaotic image encryption algorithm of [Fridrich J. Symmetric ciphers based on two-dimensional chaotic maps. Int J Bifur Chaos 1998;8(6):1259-1284], which was considered a benchmark for measuring the security of many image encryption algorithms. This attack can also be applied to other encryption algorithms that have a structure similar to Fridrich's algorithm, such as that of [Chen G, Mao Y, Chui C. A symmetric image encryption scheme based on 3D chaotic cat maps. Chaos Soliton Fract 2004;21:749-761]. In this paper, we suggest a novel image encryption algorithm based on a three-dimensional (3D) chaotic map that can defeat the aforementioned attack, among other existing attacks. The design of the proposed algorithm is simple and efficient, and is based on three phases which provide the necessary properties for a secure image encryption algorithm, including the confusion and diffusion properties. In phase I, the image pixels are shuffled according to a search rule based on the 3D chaotic map. In phases II and III, 3D chaotic maps are used to scramble the shuffled pixels through mixing and masking rules, respectively. Simulation results show that the suggested algorithm satisfies the required performance tests, such as a high level of security, large key space, and acceptable encryption speed. These characteristics make it a suitable candidate for use in cryptographic applications.
An efficient fractal image coding algorithm using unified feature and DCT
International Nuclear Information System (INIS)
Zhou Yiming; Zhang Chao; Zhang Zengke
2009-01-01
Fractal image compression is a promising technique to improve the efficiency of image storage and image transmission with a high compression ratio; however, the huge time consumption of fractal image coding is a great obstacle to practical application. In order to improve fractal image coding, efficient fractal image coding algorithms using a special unified feature and a DCT coder are proposed in this paper. First, based on a necessary condition of the best-matching search rule during fractal image coding, a fast algorithm using a special unified feature (UFC) is presented; it reduces the search space considerably and excludes most inappropriate matching subblocks before the best-matching search. Second, on the basis of the UFC algorithm, a DCT coder is combined to construct a hybrid fractal image algorithm (DUFC) in order to improve the quality of the reconstructed image. Experimental results show that the proposed algorithms obtain good quality in the reconstructed images and need much less time than the baseline fractal coding algorithm.
Image defog algorithm based on open close filter and gradient domain recursive bilateral filter
Liu, Daqian; Liu, Wanjun; Zhao, Qingguo; Fei, Bowen
2017-11-01
To address the fuzzy details, color distortion, and low brightness of images produced by the dark-channel-prior defog algorithm, an image defog algorithm based on an open-close filter and a gradient-domain recursive bilateral filter, referred to as OCRBF, was put forward. The OCRBF algorithm first makes use of a weighted quadtree to obtain a more accurate global atmospheric value, then applies multiple-structure-element morphological open and close filters to the minimum channel map to obtain a rough scattering map via the dark channel prior. A variogram is used to correct the transmittance map, and a gradient-domain recursive bilateral filter performs the smoothing operation; finally, recovered images are obtained through the image degradation model, and contrast adjustment yields a bright, clear, fog-free image. A large number of experimental results show that the proposed defog method removes fog well and recovers the color and definition of foggy images containing close-range content, image perspective, and bright areas; compared with other image defog algorithms, it obtains clearer and more natural fog-free images with more visible details. Moreover, the time complexity of the SIDA algorithm is linearly correlated with the number of image pixels.
Directory of Open Access Journals (Sweden)
Dongming Li
2017-04-01
An adaptive optics (AO) system provides real-time compensation for atmospheric turbulence. However, an AO image usually has poor contrast because of the nature of the imaging process: the image contains information from both out-of-focus and in-focus planes of the object, which also brings about a loss in quality. In this paper, we present a robust multi-frame adaptive optics image restoration algorithm via maximum likelihood estimation. Our proposed algorithm uses a maximum likelihood method with image regularization as the basic principle, and constructs the joint log-likelihood function for multi-frame AO images based on a Poisson distribution model. To begin with, a frame selection method based on image variance is applied to the observed multi-frame AO images to select images of better quality and improve the convergence of the blind deconvolution algorithm. Then, by combining the imaging conditions and the AO system properties, a point spread function estimation model is built. Finally, we develop iterative solutions for AO image restoration addressing the joint deconvolution issue. We conduct a number of experiments to evaluate the performance of our proposed algorithm. Experimental results show that our algorithm produces accurate AO image restoration results and outperforms current state-of-the-art blind deconvolution methods.
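Maximizing a Poisson log-likelihood in deconvolution leads to the multiplicative Richardson-Lucy update; a single-frame, known-PSF sketch (the paper's algorithm is blind and multi-frame, which this does not attempt):

```python
import numpy as np

def richardson_lucy(observed, psf, iters=200, eps=1e-12):
    """Poisson maximum-likelihood deconvolution with a known PSF
    (single frame; the paper's method is blind and multi-frame).
    Convolutions are circular, via the FFT."""
    otf = np.fft.fft2(np.fft.ifftshift(psf))
    conv = lambda img, h: np.real(np.fft.ifft2(np.fft.fft2(img) * h))
    est = np.full_like(observed, observed.mean())       # flat initial guess
    for _ in range(iters):
        ratio = observed / (conv(est, otf) + eps)       # data / current model
        est = est * conv(ratio, np.conj(otf))           # conj(otf) = flipped PSF
    return est

# synthetic scene: two point sources blurred by a Gaussian PSF
n = 32
yy, xx = np.mgrid[:n, :n]
psf = np.exp(-((yy - n // 2) ** 2 + (xx - n // 2) ** 2) / (2 * 2.0 ** 2))
psf /= psf.sum()
truth = np.zeros((n, n))
truth[10, 10] = truth[20, 25] = 100.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(truth) * np.fft.fft2(np.fft.ifftshift(psf))))
blurred = np.maximum(blurred, 0.0)                      # clip FFT round-off negatives
restored = richardson_lucy(blurred, psf)
```

The multiplicative form keeps the estimate non-negative, a property the Poisson model shares with photon-counting AO data.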
The POKEMON Speckle Survey of Nearby M-Dwarfs
van Belle, Gerard; von Braun, Kaspar; Horch, Elliott; Clark, Catherine; DSSI Speckle Team
2018-01-01
The POKEMON (Pervasive Overview of Kompanions of Every M-dwarf in Our Neighborhood) survey of nearby M-dwarfs intends to inspect, at diffraction-limited resolution, every low-mass star out to 15pc, along with selected additional objects to 25pc. The primary emphasis of the survey is detection of low-mass companions to these M-dwarfs for refinement of the low-mass star multiplicity rate. The resultant catalog of M-dwarf companions will also guide immediate refinement of transit planet detection results from surveys such as TESS. POKEMON is using Lowell Observatory's 4.3-m Discovery Channel Telescope (DCT) with the Differential Speckle Survey Instrument (DSSI) speckle camera, along with the NN-Explore Exoplanet Stellar Speckle Imager (NESSI) speckle imager on 3.5-m WIYN; the survey takes advantage of the extremely rapid observing cadence rates possible with WIYN and (especially) DCT. The current status and preliminary results from the first 20+ nights of observing will be presented. Gotta observe them all!
A joint image encryption and watermarking algorithm based on compressive sensing and chaotic map
International Nuclear Information System (INIS)
Xiao Di; Cai Hong-Kun; Zheng Hong-Ying
2015-01-01
In this paper, a compressive sensing (CS) and chaotic map-based joint image encryption and watermarking algorithm is proposed. The transform domain coefficients of the original image are scrambled by Arnold map firstly. Then the watermark is adhered to the scrambled data. By compressive sensing, a set of watermarked measurements is obtained as the watermarked cipher image. In this algorithm, watermark embedding and data compression can be performed without knowing the original image; similarly, watermark extraction will not interfere with decryption. Due to the characteristics of CS, this algorithm features compressible cipher image size, flexible watermark capacity, and lossless watermark extraction from the compressed cipher image as well as robustness against packet loss. Simulation results and analyses show that the algorithm achieves good performance in the sense of security, watermark capacity, extraction accuracy, reconstruction, robustness, etc. (paper)
Images Encryption Method using Steganographic LSB Method, AES and RSA algorithm
Moumen, Abdelkader; Sissaoui, Hocine
2017-03-01
Vulnerability of communication of digital images is an extremely important issue nowadays, particularly when the images are communicated through insecure channels. To improve communication security, many cryptosystems have been presented in the image encryption literature. This paper proposes a novel image encryption technique based on an algorithm that is faster than current methods. The proposed algorithm eliminates the step in which the secret key is shared during the encryption process. It is formulated based on symmetric encryption, asymmetric encryption and steganography. The image is encrypted using a symmetric algorithm; then the secret key is encrypted by means of an asymmetric algorithm and hidden in the ciphered image using a least-significant-bit steganographic scheme. The analysis results show that, while enjoying faster computation, our method performs close to optimal in terms of accuracy.
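The steganographic step can be sketched as below. This is a minimal illustration in which a short byte string stands in for the RSA-encrypted secret key, and `embed_lsb`/`extract_lsb` are hypothetical helper names, not from the paper:

```python
import numpy as np

def embed_lsb(cover, payload):
    """Hide payload bytes in the least significant bits of a flattened
    8-bit cover image (a sketch of the steganographic step only)."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten()                      # flatten() returns a copy
    assert bits.size <= flat.size, "cover too small for payload"
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(cover.shape)

def extract_lsb(stego, n_bytes):
    """Read back n_bytes from the LSB plane of the stego image."""
    bits = stego.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

cover = np.random.default_rng(0).integers(0, 256, size=(16, 16), dtype=np.uint8)
secret_key = b"session-key"                     # stands in for the RSA-encrypted key
stego = embed_lsb(cover, secret_key)
recovered = extract_lsb(stego, len(secret_key))
```

Each cover pixel changes by at most one gray level, which is why LSB embedding is visually imperceptible.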
Neural Network Blind Equalization Algorithm Applied in Medical CT Image Restoration
Directory of Open Access Journals (Sweden)
Yunshan Sun
2013-01-01
Full Text Available A new algorithm for iterative blind image restoration is presented in this paper. The method extends blind equalization from the signal case to images. A neural network blind equalization algorithm is derived and used in conjunction with Zigzag coding to restore the original image. As a result, the effect of the PSF can be removed by using the proposed algorithm, which helps to eliminate intersymbol interference (ISI). To obtain an estimate of the original image, the method optimizes a constant modulus blind equalization cost function applied to the grayscale CT image using the conjugate gradient method. Analysis of the convergence performance of the algorithm verifies the feasibility of this method theoretically; meanwhile, simulation results and performance evaluations with recent image quality metrics are provided to assess the effectiveness of the proposed method.
Dynamical speckles in watery surfaces
International Nuclear Information System (INIS)
Llovera-Gonzalez, J.J.; Moreno-Yeras, A.; Garcia-Diaz, M.; Alvarez-Salgado, Y.
2009-01-01
Coverage of watery surfaces with monolayers of surfactant substances is of interest in diverse technological applications. The formation and study of molecular monolayers deposited on these surfaces require measurement techniques that can evaluate the degree of coverage locally without practically modifying the studied surface. In this paper, the preliminary results obtained by the authors in applying the technique of dynamic speckle interferometry to watery surfaces are presented, together with its consideration as a possible resource for measuring the degree of local coverage of these surfaces, on the basis that the speckle pattern can reveal the dynamics of evaporation taking place in them. (Author)
Revell, James; Mirmehdi, Majid; McNally, Donal
2005-06-01
We present the development and validation of an image-based speckle tracking methodology for determining temporal two-dimensional (2-D) axial and lateral displacement and strain fields from ultrasound video streams. We refine a multiple-scale region matching approach incorporating novel solutions to known speckle tracking problems. Key contributions include automatic similarity measure selection to adapt to varying speckle density, quantified trajectory fields, and spatiotemporal elastograms. Results are validated using tissue-mimicking phantoms and in vitro data, before applying the method to in vivo musculoskeletal ultrasound sequences. The method presented has the potential to improve clinical knowledge of tendon pathology from carpal tunnel syndrome, inflammation from implants, sports injuries, and many others.
International Nuclear Information System (INIS)
Diemoz, Paul C; Bravin, Alberto; Coan, Paola; Glaser, Christian
2010-01-01
In x-ray phase-contrast analyzer-based imaging, the contrast is provided by a combination of absorption, refraction and scattering effects. Several extraction algorithms, which attempt to separate and quantify these different physical contributions, have been proposed and applied. In a previous work, we presented a quantitative comparison of five of the best-known extraction algorithms based on the geometrical optics approximation applied to planar images: diffraction-enhanced imaging (DEI), extended diffraction-enhanced imaging (E-DEI), generalized diffraction-enhanced imaging (G-DEI), multiple-image radiography (MIR) and Gaussian curve fitting (GCF). In this paper, we compare these algorithms in the case of the computed tomography (CT) modality. The extraction algorithms are applied to analyzer-based CT images of both plastic phantoms and biological samples (cartilage-on-bone cylinders). Absorption, refraction and scattering signals are derived. Results obtained with the different algorithms may vary greatly, especially in the case of large refraction angles. We show that ABI-CT extraction algorithms can provide an excellent tool to enhance the visualization of internal cartilage structures, which may find applications in a clinical context. In addition, using the refraction images, the refractive index decrements for both the cartilage matrix and the cartilage cells have been estimated.
Low-dose multiple-information retrieval algorithm for X-ray grating-based imaging
International Nuclear Information System (INIS)
Wang Zhentian; Huang Zhifeng; Chen Zhiqiang; Zhang Li; Jiang Xiaolei; Kang Kejun; Yin Hongxia; Wang Zhenchang; Stampanoni, Marco
2011-01-01
The present work proposes a low-dose information retrieval algorithm for the X-ray grating-based multiple-information imaging (GB-MII) method, which can retrieve the attenuation, refraction and scattering information of samples from only three images. This algorithm aims at reducing the exposure time and the dose delivered to the sample. The multiple-information retrieval problem in GB-MII is solved by transforming a set of nonlinear equations into linear ones, exploiting the properties of the trigonometric functions. The proposed algorithm is validated by experiments on both a conventional X-ray source and a synchrotron source, and compared with the traditional multiple-image-based retrieval algorithm. The experimental results show that our algorithm is comparable with the traditional retrieval algorithm and especially suitable for high signal-to-noise-ratio systems.
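A closed-form three-image retrieval of this kind can be sketched as follows, assuming (an illustrative assumption, not the paper's exact model) a sinusoidal phase-stepping curve I_k = a + b·cos(φ + 2πk/3); the three unknowns map to the attenuation, scattering and refraction signals:

```python
import numpy as np

def retrieve_three_step(i0, i1, i2):
    """Closed-form retrieval from three phase steps psi_k = 2*pi*k/3 of an
    assumed sinusoidal phase-stepping curve I_k = a + b*cos(phi + psi_k)."""
    samples = np.stack([i0, i1, i2])
    c1 = np.fft.fft(samples, axis=0)[1]   # first Fourier coefficient over the steps
    a = samples.mean(axis=0)              # mean term (attenuation-related)
    b = 2.0 * np.abs(c1) / 3.0            # modulation amplitude (scattering-related)
    phi = np.angle(c1)                    # phase shift (refraction-related)
    return a, b, phi

# three simulated phase-step measurements of a single pixel
i0, i1, i2 = (1.8 + 0.4 * np.cos(0.6 + 2.0 * np.pi * k / 3.0) for k in range(3))
a, b, phi = retrieve_three_step(i0, i1, i2)
```

Three equally spaced phase steps are the minimum that determine the three unknowns exactly, which is what makes a three-image (and hence low-dose) protocol possible.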
Indian Academy of Sciences (India)
to as 'divide-and-conquer'. Although there has been a large effort in realizing efficient algorithms, there are not many universally accepted algorithm design paradigms. In this article, we illustrate algorithm design techniques such as balancing, greedy strategy, dynamic programming strategy, and backtracking or traversal of ...
A MAP blind image deconvolution algorithm with bandwidth over-constrained
Ren, Zhilei; Liu, Jin; Liang, Yonghui; He, Yulong
2018-03-01
We demonstrate a maximum a posteriori (MAP) blind image deconvolution algorithm with an over-constrained bandwidth and total variation (TV) regularization to recover a clear image from AO-corrected images. The point spread functions (PSFs) are estimated with their bandwidth constrained to lie below the cutoff frequency of the optical system. Our algorithm performs well in avoiding noise magnification. The performance is demonstrated on simulated data.
Fast Algorithms for Fitting Active Appearance Models to Unconstrained Images
Tzimiropoulos, Georgios; Pantic, Maja
2016-01-01
Fitting algorithms for Active Appearance Models (AAMs) are usually considered to be robust but slow, or fast but less able to generalize well to unseen variations. In this paper, we look into AAM fitting algorithms and make the following orthogonal contributions: We present a simple “project-out”
Color Image Encryption Algorithm Based on TD-ERCS System and Wavelet Neural Network
Directory of Open Access Journals (Sweden)
Kun Zhang
2015-01-01
Full Text Available In order to solve the security problem of transmitting images across public networks, a new image encryption algorithm based on the TD-ERCS system and a wavelet neural network is proposed in this paper. Image encryption is achieved through a permutation process and a binary XOR operation with the chaotic series produced by the TD-ERCS system and the wavelet neural network. The encryption algorithm is reversible, and the original image can be recovered by applying the inverse of the encryption process. Finally, computer simulation results show that the new chaotic encryption algorithm based on the TD-ERCS system and wavelet neural network is valid and has high security.
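The XOR diffusion stage can be sketched as follows; here the logistic map stands in for the TD-ERCS system and wavelet neural network (an assumption made purely for illustration), and the keystream construction is a generic choice, not the paper's:

```python
import numpy as np

def chaotic_keystream(x0, r, n):
    """Byte keystream from the logistic map x -> r*x*(1 - x); the logistic
    map is a stand-in here for the TD-ERCS/wavelet-network sequences."""
    x = x0
    out = np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = int(x * 1e6) % 256   # take low digits of the chaotic orbit
    return out

def xor_crypt(img, x0=0.3456, r=3.99):
    """XOR diffusion with the chaotic keystream; the same call decrypts."""
    ks = chaotic_keystream(x0, r, img.size).reshape(img.shape)
    return img ^ ks

img = np.random.default_rng(1).integers(0, 256, size=(8, 8), dtype=np.uint8)
cipher = xor_crypt(img)
plain = xor_crypt(cipher)   # XOR with the same keystream restores the image
```

Because XOR is its own inverse, the scheme is reversible exactly as the abstract describes: running the encryption with the same initial key decrypts.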
Hybrid phase retrieval algorithm for solving the twin image problem in in-line digital holography
Zhao, Jie; Wang, Dayong; Zhang, Fucai; Wang, Yunxin
2010-10-01
For the reconstruction in in-line digital holography, there are three terms overlapping with each other on the image plane, named the zero order term, the real image and the twin image, respectively. The unwanted twin image degrades the real image seriously. A hybrid phase retrieval algorithm is presented to address this problem, which combines the advantages of two popular phase retrieval algorithms. One is an improved version of the universal iterative algorithm (UIA), called the phase-flipping-based UIA (PFB-UIA). The key point of this algorithm is to flip the phase of the object iteratively. It is proved that the PFB-UIA is able to find the support of a complicated object. The other is the Fienup algorithm, a well-developed algorithm that uses the support of the object as the constraint throughout the iteration procedure. Thus, by running the Fienup algorithm immediately after the PFB-UIA, it is possible to produce the amplitude and phase distributions of the object with high fidelity. Preliminary simulation results show that the proposed algorithm is powerful for solving the twin image problem in in-line digital holography.
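A minimal Fienup-style error-reduction loop (alternating Fourier-magnitude and object-domain constraints) can be sketched as follows; this illustrates the general iteration that the Fienup stage belongs to, not the PFB-UIA itself, and the toy object and support are invented for the demo:

```python
import numpy as np

def error_reduction(fourier_mag, support, n_iter=300, seed=0):
    """Fienup-style error reduction: alternately enforce the measured Fourier
    magnitude and the object-domain support/positivity constraints."""
    rng = np.random.default_rng(seed)
    g = rng.random(fourier_mag.shape) * support       # random start inside support
    for _ in range(n_iter):
        G = np.fft.fft2(g)
        G = fourier_mag * np.exp(1j * np.angle(G))    # keep phase, replace magnitude
        g = np.real(np.fft.ifft2(G))
        g = np.clip(g, 0.0, None) * support           # positivity + support
    return g

# toy object: a bright block inside a known support region
obj = np.zeros((16, 16))
obj[4:8, 5:9] = 1.0
support = np.zeros((16, 16))
support[2:10, 3:11] = 1.0
mag = np.abs(np.fft.fft2(obj))
rec = error_reduction(mag, support)
```

The role of the PFB-UIA in the paper is precisely to supply the `support` array that this kind of iteration needs as its object-domain constraint.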
Miller, D; Lippert, C; Vollmer, F; Bozinov, O; Benes, L; Schulte, D M; Sure, U
2012-09-01
Freehand three-dimensional ultrasound imaging (3D-US) is increasingly used in image-guided surgery. During image acquisition, a set of B-scans is acquired that is distributed in a non-parallel manner over the area of interest. Reconstructing these images into a regular array allows 3D visualization. However, the reconstruction process may introduce artefacts and may therefore reduce image quality. The aim of the study is to compare different algorithms with respect to image quality and diagnostic value for image guidance in neurosurgery. 3D-US data sets were acquired during surgery of various intracerebral lesions using an integrated ultrasound-navigation device. They were stored for post-hoc evaluation. Five different reconstruction algorithms, a standard multiplanar reconstruction with interpolation (MPR), a pixel nearest neighbour method (PNN), a voxel nearest neighbour method (VNN) and two voxel based distance-weighted algorithms (VNN2 and DW) were tested with respect to image quality and artefact formation. The capability of the algorithm to fill gaps within the sample volume was investigated and a clinical evaluation with respect to the diagnostic value of the reconstructed images was performed. MPR was significantly worse than the other algorithms in filling gaps. In an image subtraction test, VNN2 and DW reliably reconstructed images even if large amounts of data were missing. However, the quality of the reconstruction improved, if data acquisition was performed in a structured manner. When evaluating the diagnostic value of reconstructed axial, sagittal and coronal views, VNN2 and DW were judged to be significantly better than MPR and VNN. VNN2 and DW could be identified as robust algorithms that generate reconstructed US images with a high diagnostic value. These algorithms improve the utility and reliability of 3D-US imaging during intraoperative navigation. Copyright © 2012 John Wiley & Sons, Ltd.
International Nuclear Information System (INIS)
Vaegler, Sven; Sauer, Otto; Stsepankou, Dzmitry; Hesser, Juergen
2015-01-01
The reduction of dose in cone beam computed tomography (CBCT) arises from decreasing the tube current for each projection as well as from reducing the number of projections. In order to maintain good image quality, sophisticated image reconstruction techniques are required. Prior Image Constrained Compressed Sensing (PICCS) incorporates prior images into the reconstruction algorithm and outperforms the widely used Feldkamp-Davis-Kress algorithm (FDK) when the number of projections is reduced. However, prior images that contain major variations are so far not appropriately considered in PICCS. We therefore propose the partial-PICCS (pPICCS) algorithm. This framework is a problem-specific extension of PICCS that additionally enables the incorporation of the reliability of the prior images. We assumed that the prior images are composed of areas with large and small deviations. Accordingly, a weighting matrix considered the assigned areas in the objective function. We applied our algorithm to the problem of image reconstruction from few views by simulations with a computer phantom as well as on clinical CBCT projections from a head-and-neck case. All prior images contained large local variations. The reconstructed images were compared to the reconstruction results of the FDK algorithm, Compressed Sensing (CS) and PICCS. To show the gain in image quality, we compared image details with the reference image and used quantitative metrics (root-mean-square error (RMSE), contrast-to-noise ratio (CNR)). The pPICCS reconstruction framework yields images with substantially improved quality even when the number of projections is very small. The images contained less streaking, blurring and fewer inaccurately reconstructed structures compared to the images reconstructed by FDK, CS and conventional PICCS. The increased image quality is also reflected in large RMSE differences. We proposed a modification of the original PICCS algorithm. The pPICCS algorithm
Infrared speckle observations of Io - an eruption in the Loki region
International Nuclear Information System (INIS)
Howell, R.R.; Mcginn, M.T.
1985-01-01
Speckle observations of Jupiter's satellite Io at a wavelength of 5 micrometers during July 1984 resolved the disk and showed emission from a hot spot in the Loki region. The hot spot contributed a flux approximately equal to 60 percent of that from the disk. Images reconstructed by means of the Knox-Thompson algorithm showed the spot moving across the disk as the satellite rotated. It was located at 301 ± 6 deg west longitude, 10 ± 6 deg north latitude, and had a radiance of (2.96 ± 0.54) × 10^22 ergs/sec cm sr/A, where A is the area of the spot. For an assumed temperature of 400 K, the area of the source would be 11,400 square kilometers. An active lava lake similar to that seen by Voyager may be the source of the infrared emission. 10 references
Rodriguez-Hernandez, Miguel A; Gomez-Sacristan, Angel; Sempere-Payá, Víctor M
2016-04-29
Ultrasound diagnosis is a widely used medical tool. Among the various ultrasound techniques, ultrasonic imaging is particularly relevant. This paper presents an improvement to a two-dimensional (2D) ultrasonic system using measurements taken from perpendicular planes, in which digital signal processing techniques combine one-dimensional (1D) A-scans acquired by individual transducers in arrays located in perpendicular planes. The algorithm used to combine the measurements is improved with a wavelet-transform-based denoising step during the 2D representation generation process. The inclusion of this new denoising stage generates higher-quality 2D representations with a reduced level of speckle. The paper includes different 2D representations obtained from noisy A-scans and compares the improvements obtained by including the denoising stage.
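The wavelet denoising stage can be sketched with a one-level Haar shrinkage; the wavelet choice, threshold value, and synthetic test signal are illustrative assumptions, not the paper's exact pipeline:

```python
import numpy as np

def haar_denoise(signal, threshold):
    """One-level Haar wavelet shrinkage (a minimal stand-in for the wavelet
    denoising stage): soft-threshold the detail coefficients, then invert."""
    a = (signal[0::2] + signal[1::2]) / np.sqrt(2.0)            # approximation band
    d = (signal[0::2] - signal[1::2]) / np.sqrt(2.0)            # detail band
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)     # soft threshold
    out = np.empty_like(signal)
    out[0::2] = (a + d) / np.sqrt(2.0)                          # inverse transform
    out[1::2] = (a - d) / np.sqrt(2.0)
    return out

# noisy synthetic A-scan: smooth echo envelope plus additive noise
t = np.linspace(0.0, 1.0, 256)
clean = np.exp(-((t - 0.5) ** 2) / 0.01)
rng = np.random.default_rng(7)
noisy = clean + 0.05 * rng.standard_normal(t.size)
denoised = haar_denoise(noisy, threshold=0.1)
```

Thresholding the detail band suppresses small, noise-like fluctuations while the smooth echo envelope, carried mostly by the approximation band, passes through.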
Real time processor for array speckle interferometry
Chin, Gordon; Florez, Jose; Borelli, Renan; Fong, Wai; Miko, Joseph; Trujillo, Carlos
1989-02-01
The authors are constructing a real-time processor to acquire image frames, perform array flat-fielding, execute a 64 x 64 element two-dimensional complex FFT (fast Fourier transform) and average the power spectrum, all within the 25 ms coherence time for speckles at near-IR (infrared) wavelength. The processor will be a compact unit controlled by a PC with real-time display and data storage capability. This will provide the ability to optimize observations and obtain results on the telescope rather than waiting several weeks before the data can be analyzed and viewed with offline methods. The image acquisition and processing, design criteria, and processor architecture are described.
SAR image regularization with fast approximate discrete minimization.
Denis, Loïc; Tupin, Florence; Darbon, Jérôme; Sigelle, Marc
2009-07-01
Synthetic aperture radar (SAR) images, like other coherent imaging modalities, suffer from speckle noise. The presence of this noise makes the automatic interpretation of images a challenging task, and noise reduction is often a prerequisite for the successful use of classical image processing algorithms. Numerous approaches have been proposed to filter speckle noise. Markov random field (MRF) modeling provides a convenient way to express both data fidelity constraints and desirable properties of the filtered image. In this context, total variation minimization has been extensively used to constrain the oscillations in the regularized image while preserving its edges. Speckle noise follows heavy-tailed distributions, and the MRF formulation leads to a minimization problem involving nonconvex log-likelihood terms. Such a minimization can be performed efficiently by computing minimum cuts on weighted graphs. Due to memory constraints, exact minimization, although theoretically possible, is not achievable on the large images required by remote sensing applications. The computational burden of the state-of-the-art algorithm for approximate minimization (namely the α-expansion) is too heavy, especially when considering joint regularization of several images. We show that a satisfying solution can be reached, in a few iterations, by performing a graph-cut-based combinatorial exploration of large trial moves. This algorithm is applied to the joint regularization of the amplitude and interferometric phase in urban-area SAR images.
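The total-variation regularization idea can be illustrated with a smooth gradient-descent surrogate; note that the paper's actual method is graph-cut-based combinatorial minimization, which this sketch does not reproduce, and the parameters and test image are invented for the demo:

```python
import numpy as np

def tv_denoise(img, lam=0.15, step=0.1, n_iter=200, eps=1e-6):
    """Gradient descent on 0.5*||u - img||^2 + lam*TV_eps(u), a smooth
    surrogate for the combinatorial TV minimization described above."""
    u = img.copy()
    for _ in range(n_iter):
        ux = np.diff(u, axis=1, append=u[:, -1:])     # forward differences
        uy = np.diff(u, axis=0, append=u[-1:, :])
        norm = np.sqrt(ux ** 2 + uy ** 2 + eps)
        px, py = ux / norm, uy / norm                 # normalized gradient field
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u = u - step * ((u - img) - lam * div)        # data term + TV term
    return u

# a noisy vertical edge: the classic test case for edge-preserving smoothing
rng = np.random.default_rng(0)
clean = np.zeros((16, 16))
clean[:, 8:] = 1.0
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
denoised = tv_denoise(noisy)
```

TV penalizes the total gradient magnitude, so oscillations are flattened while a single sharp edge, whose TV cost is fixed, is preserved; this is the property the paper exploits.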
Tan, Ru-Chao; Lei, Tong; Zhao, Qing-Min; Gong, Li-Hua; Zhou, Zhi-Hong
2016-12-01
To improve the slow processing speed of classical image encryption algorithms and enhance the security of private color images, a new quantum color image encryption algorithm based on a hyper-chaotic system is proposed, in which the sequences generated by Chen's hyper-chaotic system are used to scramble and diffuse the three components of the original color image. Subsequently, the quantum Fourier transform is exploited to fulfill the encryption. Numerical simulations show that the presented quantum color image encryption algorithm possesses a large key space to resist illegal attacks, sensitive dependence on initial keys, a uniform distribution of gray values in the encrypted image and weak correlation between adjacent pixels in the cipher-image.
An Image Filter Based on Shearlet Transformation and Particle Swarm Optimization Algorithm
Directory of Open Access Journals (Sweden)
Kai Hu
2015-01-01
Full Text Available Digital images are often polluted by noise, which makes data postprocessing difficult. To remove noise while preserving as much image detail as possible, this paper proposes an image filter algorithm that combines the merits of the Shearlet transform and the particle swarm optimization (PSO) algorithm. Firstly, we use the classical Shearlet transform to decompose the noised image into many subbands over multiple scales and orientations. Secondly, we assign weighting factors to the subbands obtained. Then, using the classical inverse Shearlet transform, we obtain a composite image composed of the weighted subbands. After that, we design a fast, rough evaluation method to estimate the noise level of the new image; using this method as the fitness, we adopt PSO to find the optimal weighting factors. After many iterations, the optimal factors and the inverse Shearlet transform yield the best denoised image. Experimental results show that the proposed algorithm eliminates noise effectively and yields a good peak signal-to-noise ratio (PSNR).
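A minimal PSO loop of the kind used to tune the weighting factors can be sketched as follows; the quadratic fitness below is a placeholder for the paper's noise-level estimate, and all parameter values are illustrative assumptions:

```python
import numpy as np

def pso(fitness, dim, n_particles=30, n_iter=100, seed=0,
        w=0.7, c1=1.5, c2=1.5, lo=-1.0, hi=1.0):
    """Minimal particle swarm optimizer: each particle tracks its personal
    best, and the swarm tracks a global best, blended into the velocity."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.array([fitness(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([fitness(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, float(pbest_f.min())

# placeholder fitness: distance of the weight vector from an "ideal" weighting
target = np.array([0.2, -0.5, 0.8])
best, best_f = pso(lambda wgt: float(np.sum((wgt - target) ** 2)), dim=3)
```

In the paper's setting, `fitness` would instead run the weighted inverse Shearlet transform and score the resulting image's estimated noise level.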
International Nuclear Information System (INIS)
Davey, B.L.K.; Bates, R.H.T.; Cocke, W.J.; Mccarthy, D.W. Jr.; Christou, J.C.
1989-01-01
One-dimensional infrared speckle scans of Ross 614 AB were recorded at a wavelength of 2.2 microns, and the three bins corresponding to the three best seeing conditions were further processed by applying a shift-and-add algorithm to the set of images contained within each bin, generating three shift-and-add images with differing shift-and-add point-spread functions. A zero-and-add technique was used to deconvolve the three shift-and-add images in order to obtain parameters corresponding to the separation and the brightness ratio of a two-component model of Ross 614 AB. Least-squares analysis results reveal a separation of 1.04 arcsec and a brightness ratio of 4.3 for the binary system at this wavelength. 31 refs
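The shift-and-add step can be sketched as follows. This demo uses 2-D frames and a brightest-pixel registration; the paper works with one-dimensional scans, so this is illustrative of the technique only, and the frame sizes and offsets are invented:

```python
import numpy as np

def shift_and_add(frames):
    """Shift-and-add: register each frame on its brightest pixel (moved to
    the array center), then average the registered frames."""
    stack = []
    for f in frames:
        iy, ix = np.unravel_index(np.argmax(f), f.shape)
        cy, cx = f.shape[0] // 2, f.shape[1] // 2
        stack.append(np.roll(np.roll(f, cy - iy, axis=0), cx - ix, axis=1))
    return np.mean(stack, axis=0)

# synthetic speckle frames: one bright point jittering about the center
rng = np.random.default_rng(3)
frames = []
for _ in range(20):
    f = rng.random((17, 17)) * 0.1          # weak background speckle
    oy, ox = rng.integers(-3, 4, size=2)    # random tip/tilt offset
    f[8 + oy, 8 + ox] = 1.0                 # the shifted bright core
    frames.append(f)
saa = shift_and_add(frames)
```

Registering on the brightest speckle before averaging is what preserves the diffraction-limited core that plain averaging would smear out.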
A high-performance spatial database based approach for pathology imaging algorithm evaluation
Directory of Open Access Journals (Sweden)
Fusheng Wang
2013-01-01
Full Text Available Background: Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. Context: The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. Aims: (1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) Create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) Develop a set of queries to support data sampling and result comparisons; (4) Achieve high performance computation capacity via a parallel data management infrastructure, parallel data loading and spatial indexing optimizations in this infrastructure. Materials and Methods: We have considered two scenarios for algorithm evaluation: (1) algorithm comparison where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. The
A high-performance spatial database based approach for pathology imaging algorithm evaluation.
Wang, Fusheng; Kong, Jun; Gao, Jingjing; Cooper, Lee A D; Kurc, Tahsin; Zhou, Zhengwen; Adler, David; Vergara-Niedermayr, Cristobal; Katigbak, Bryan; Brat, Daniel J; Saltz, Joel H
2013-01-01
Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. (1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) Create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) Develop a set of queries to support data sampling and result comparisons; (4) Achieve high performance computation capacity via a parallel data management infrastructure, parallel data loading and spatial indexing optimizations in this infrastructure. We have considered two scenarios for algorithm evaluation: (1) algorithm comparison where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. The validated data were formatted based on the PAIS data model and
An overview of methods to mitigate artifacts in optical coherence tomography imaging of the skin.
Adabi, Saba; Fotouhi, Audrey; Xu, Qiuyun; Daveluy, Steve; Mehregan, Darius; Podoleanu, Adrian; Nasiriavanaki, Mohammadreza
2018-05-01
Optical coherence tomography (OCT) of skin delivers three-dimensional images of tissue microstructures. Although OCT imaging offers a promising high-resolution modality, OCT images suffer from some artifacts that lead to misinterpretation of tissue structures. Therefore, an overview of methods to mitigate artifacts in OCT imaging of the skin is of paramount importance. Speckle, intensity decay, and blurring are three major artifacts in OCT images. Speckle is due to the low-coherence light source used in the configuration of OCT. Intensity decay is a deterioration of light with respect to depth, and blurring is the consequence of deficiencies of optical components. Two speckle reduction methods (one based on an artificial neural network and one based on spatial compounding), an attenuation compensation algorithm (based on the Beer-Lambert law) and a deblurring procedure (using deconvolution) are described. Moreover, an optical-properties extraction algorithm based on the extended Huygens-Fresnel (EHF) principle, used to obtain additional information from OCT images, is discussed. In this short overview, we summarize some of the image enhancement algorithms for OCT images which address the above-mentioned artifacts. The results showed a significant improvement in the visibility of clinically relevant features in the images. The quality improvement was evaluated using several numerical assessment measures. Clinical dermatologists can benefit from these image enhancement algorithms, which improve OCT diagnosis and help OCT essentially function as a noninvasive optical biopsy. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
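The Beer-Lambert attenuation compensation can be sketched as follows, assuming a known attenuation coefficient and a simple single-scattering decay model; the coefficient values and A-scan are invented for the demo:

```python
import numpy as np

def compensate_attenuation(ascan, mu, dz):
    """Beer-Lambert depth compensation: an OCT A-scan attenuated roughly as
    exp(-2*mu*z) is flattened by multiplying with exp(+2*mu*z).
    mu (attenuation coefficient) and dz (pixel depth) are assumed known."""
    z = np.arange(ascan.size) * dz
    return ascan * np.exp(2.0 * mu * z)

# synthetic A-scan: constant reflectivity decayed exponentially with depth
mu, dz = 1.2, 0.01
depth = np.arange(200) * dz
raw = 0.8 * np.exp(-2.0 * mu * depth)
flat = compensate_attenuation(raw, mu, dz)
```

The factor 2 accounts for the round trip of the probe light; in practice the gain must be limited at depth, since it amplifies noise along with signal.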
Zhang, B.; Sang, Jun; Alam, Mohammad S.
2013-03-01
An image hiding method based on the cascaded iterative Fourier transform and a public-key encryption algorithm is proposed. Firstly, the original secret image is encrypted into two phase-only masks M1 and M2 via the cascaded iterative Fourier transform (CIFT) algorithm. Then, the public-key encryption algorithm RSA is adopted to encrypt M2 into M2'. Finally, a host image is enlarged by extending each pixel into 2×2 pixels, and each element in M1 and M2' is multiplied by a superimposition coefficient and added to or subtracted from two different elements in the 2×2 pixels of the enlarged host image. To recover the secret image from the stego-image, the two masks are extracted from the stego-image without the original host image. By applying a public-key encryption algorithm, key distribution is facilitated; moreover, compared with image hiding methods based on optical interference, the proposed method may achieve higher robustness by exploiting the characteristics of the CIFT algorithm. Computer simulations show that this method has good robustness against image processing.
Research on Adaptive Optics Image Restoration Algorithm by Improved Expectation Maximization Method
Directory of Open Access Journals (Sweden)
Lijuan Zhang
2014-01-01
Full Text Available To improve the restoration of adaptive optics (AO) images, we propose a multiframe joint deconvolution algorithm based on expectation-maximization (EM) theory. Firstly, a mathematical model is built for the degraded multiframe AO images, and the model of the time-varying point spread function is derived from the phase error. The AO images are denoised using the image power spectral density and a support constraint. Secondly, the EM algorithm is improved by incorporating the AO imaging system parameters and a regularization technique. A cost function for the joint deconvolution of multiframe AO images is given, and the optimization model for its parameter estimation is built. Lastly, image restoration experiments on both simulated and real AO images are performed to verify the recovery performance of our algorithm. The experimental results show that, compared with the Wiener-IBD and RL-IBD algorithms, our algorithm reduces the number of iterations by 14.3% and improves the estimation accuracy. The model identifies the PSF of the AO images and recovers the observed target images clearly.
Clustering Batik Images using Fuzzy C-Means Algorithm Based on Log-Average Luminance
Directory of Open Access Journals (Sweden)
Ahmad Sanmorino
2012-06-01
Full Text Available Batik is a fabric or cloth made with a special staining technique called wax-resist dyeing and is part of a cultural heritage with high artistic value. In order to improve efficiency and give better semantics to the images, some researchers apply clustering algorithms to manage images before they are retrieved. Image clustering is a process of grouping images based on their similarity. In this paper we attempt to provide an alternative method of grouping batik images using the fuzzy c-means (FCM) algorithm based on the log-average luminance of the batik. The FCM clustering algorithm works with fuzzy models that allow each data point to belong to all clusters with different degrees of membership between 0 and 1. The log-average luminance (LAL) is the average value of the lighting in an image; we can compare the lighting of one image to another using the LAL. From the experiments that have been made, it can be concluded that the fuzzy c-means algorithm can be used for batik image clustering based on the log-average luminance of each image.
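Computing the log-average luminance and clustering it with a small fuzzy c-means loop can be sketched as follows; the center initialization, iteration count, and synthetic "batik" data are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def log_average_luminance(img, delta=1e-4):
    """LAL = exp(mean(log(delta + L))): the log-average of pixel luminance,
    used here as a one-number feature per image."""
    return float(np.exp(np.mean(np.log(delta + img))))

def fuzzy_c_means(x, c=2, m=2.0, n_iter=50):
    """1-D fuzzy c-means; centers start spread over the data range.
    Memberships in each row are positive and sum to 1."""
    centers = np.linspace(x.min(), x.max(), c)
    p = 2.0 / (m - 1.0)
    for _ in range(n_iter):
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = d ** (-p) / (d ** (-p)).sum(axis=1, keepdims=True)
        um = u ** m
        centers = (um * x[:, None]).sum(axis=0) / um.sum(axis=0)
    d = np.abs(x[:, None] - centers[None, :]) + 1e-12
    u = d ** (-p) / (d ** (-p)).sum(axis=1, keepdims=True)
    return u, centers

# two synthetic groups of "batik" images: dark vs bright
rng = np.random.default_rng(42)
dark = [rng.uniform(0.05, 0.2, (8, 8)) for _ in range(5)]
bright = [rng.uniform(0.6, 0.9, (8, 8)) for _ in range(5)]
feats = np.array([log_average_luminance(i) for i in dark + bright])
u, centers = fuzzy_c_means(feats, c=2)
labels = u.argmax(axis=1)
```

The geometric-style averaging in the LAL makes the feature less sensitive to a few very bright pixels than a plain mean, which suits it as a per-image lighting descriptor.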
Efficient Active Contour and K-Means Algorithms in Image Segmentation
Directory of Open Access Journals (Sweden)
J.R. Rommelse
2004-01-01
Full Text Available In this paper we discuss a classic clustering algorithm that can be used to segment images and a recently developed active contour image segmentation model. We propose integrating aspects of the classic algorithm to improve the active contour model. For the resulting CVK and B-means segmentation algorithms we examine methods to decrease the size of the image domain. The CVK method has been implemented to run on parallel and distributed computers. By changing the order in which pixels are updated, it was possible to replace synchronous communication with asynchronous communication, which subsequently improved the parallel efficiency.
Algorithm of pulmonary emphysema extraction using low dose thoracic 3D CT images
Saita, S.; Kubo, M.; Kawata, Y.; Niki, N.; Nakano, Y.; Omatsu, H.; Tominaga, K.; Eguchi, K.; Moriyama, N.
2006-03-01
Recently, due to aging and smoking, the number of emphysema patients has been increasing. Alveoli destroyed by emphysema cannot be restored, so early detection of emphysema is desirable. We describe a quantitative algorithm for extracting emphysematous lesions and quantitatively evaluating their distribution patterns using low-dose thoracic 3-D CT images. The algorithm identifies lung anatomies and extracts low attenuation areas (LAA) as emphysematous lesion candidates. Applying the algorithm to 100 thoracic 3-D CT images and to follow-up 3-D CT images, we demonstrate its potential to assist radiologists and physicians in quantitatively evaluating the distribution of emphysematous lesions and their evolution over time.
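At its core, the LAA extraction step reduces to thresholding CT attenuation values; a toy sketch in Python. The −950 HU cutoff is a commonly used emphysema threshold assumed here for illustration, not taken from the paper:

```python
def extract_laa(hu_slice, threshold=-950):
    """Mark voxels below a Hounsfield-unit threshold as low attenuation
    area (LAA) and report the LAA percentage, a common emphysema index."""
    mask = [[1 if v < threshold else 0 for v in row] for row in hu_slice]
    total = sum(len(row) for row in hu_slice)
    laa = sum(map(sum, mask))
    return mask, 100.0 * laa / total
```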
Algorithm of pulmonary emphysema extraction using thoracic 3D CT images
Saita, Shinsuke; Kubo, Mitsuru; Kawata, Yoshiki; Niki, Noboru; Nakano, Yasutaka; Ohmatsu, Hironobu; Tominaga, Keigo; Eguchi, Kenji; Moriyama, Noriyuki
2007-03-01
Recently, due to aging and smoking, the number of emphysema patients has been increasing. Alveoli destroyed by emphysema cannot be restored, so early detection of emphysema is desirable. We describe a quantitative algorithm for extracting emphysematous lesions and quantitatively evaluating their distribution patterns using low-dose thoracic 3-D CT images. The algorithm identifies lung anatomies and extracts low attenuation areas (LAA) as emphysematous lesion candidates. Applying the algorithm to thoracic 3-D CT images and to follow-up 3-D CT images, we demonstrate its potential to assist radiologists and physicians in quantitatively evaluating the distribution of emphysematous lesions and their evolution over time.
Decoding using back-project algorithm from coded image in ICF
International Nuclear Information System (INIS)
Jiang Shaoen; Liu Zhongli; Zheng Zhijian; Tang Daoyuan
1999-01-01
The principle of coded imaging in inertial confinement fusion (ICF) and its decoding is briefly described. The authors take a ring-aperture microscope as an example and use the back-projection (BP) algorithm to decode the coded image. The decoding program has been implemented for numerical simulation. Simulations of two models show that the BP algorithm is highly accurate and reconstructs images well, indicating that the BP algorithm is applicable to decoding coded images in ICF experiments.
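A toy illustration of back-projection decoding for a coded aperture: correlating the coded image with the aperture mask folds the recorded flux back onto the source position. The mask, sizes and point-source test below are illustrative, not the ICF ring-aperture geometry:

```python
def back_project(coded, mask):
    """Decode a coded image by back-projecting (correlating) it with the
    aperture mask; the correlation peaks at the source position."""
    H, W = len(coded), len(coded[0])
    h, w = len(mask), len(mask[0])
    out = [[0.0] * (W - w + 1) for _ in range(H - h + 1)]
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            out[y][x] = sum(coded[y + i][x + j] * mask[i][j]
                            for i in range(h) for j in range(w))
    return out
```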
DEFF Research Database (Denmark)
Henriksen, Lars
1996-01-01
The sonar simulator integrated environment (SSIE) is a tool for developing high-performance processing algorithms for single sonar images or sequences of sonar images. The tool is based on MATLAB, providing a very short lead time from concept to executable code and thereby assessment of the algorithms tested … of the algorithms is the availability of sonar images. To accommodate this problem the SSIE has been equipped with a simulator capable of generating high-fidelity sonar images for a given scene of objects, sea-bed, AUV path, etc. In the paper the main components of the SSIE are described and examples of different … processing steps are given …
A new image encryption algorithm based on the fractional-order hyperchaotic Lorenz system
Wang, Zhen; Huang, Xia; Li, Yu-Xia; Song, Xiao-Na
2013-01-01
We propose a new image encryption algorithm based on the fractional-order hyperchaotic Lorenz system. In the process of generating the key stream, the system parameters and the derivative order are embedded in the proposed algorithm to enhance security. The algorithm is evaluated through security analyses, including correlation analysis, information entropy analysis, run-statistic analysis, mean-variance grey value analysis, and key sensitivity analysis. The experimental results demonstrate that the proposed image encryption scheme offers a large key space and high security for practical image encryption.
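The stream-cipher core of such schemes can be sketched generically: iterate a chaotic map from a secret initial condition, quantize the orbit into a byte keystream, and XOR it with the pixel bytes. Below, a logistic map stands in for the fractional-order hyperchaotic Lorenz system; `x0`, `r` and the quantization rule are assumptions for illustration only:

```python
def keystream(n, x0=0.3, r=3.99):
    """Byte keystream from a logistic map orbit -- a simple chaotic
    stand-in for the hyperchaotic system used in the paper."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)          # chaotic iteration
        out.append(int(x * 256) % 256)  # quantize orbit to a byte
    return out

def xor_encrypt(pixels, key_seed=0.3):
    """XOR each pixel byte with the keystream; applying it twice decrypts."""
    ks = keystream(len(pixels), x0=key_seed)
    return [p ^ k for p, k in zip(pixels, ks)]
```

Because XOR is its own inverse, decryption is the same function with the same seed; changing the seed slightly yields a completely different ciphertext, which is the key-sensitivity property the abstract analyses.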
Dibble, Elizabeth H; Swenson, David W; Cartagena, Claudia; Baird, Grayson L; Herliczek, Thaddeus W
2018-03-01
Purpose: To establish, in a large cohort, the diagnostic performance of a staged algorithm involving ultrasonography (US) followed by conditional unenhanced magnetic resonance (MR) imaging for the imaging work-up of pediatric appendicitis. Materials and Methods: A staged imaging algorithm in which US and unenhanced MR imaging were performed in pediatric patients suspected of having appendicitis was implemented at the authors' institution on January 1, 2011, with US as the initial modality followed by unenhanced MR imaging when US findings were equivocal. A search of the radiology database revealed 2180 pediatric patients who had undergone imaging for suspected appendicitis from January 1, 2011, through December 31, 2012. Of the 2180 patients, 1982 (90.9%) were evaluated according to the algorithm. The authors reviewed the electronic medical records and imaging reports for all patients. Imaging reports were reviewed and classified as positive, negative, or equivocal for appendicitis and correlated with surgical and pathology reports. Results: The frequency of appendicitis was 20.5% (407 of 1982 patients). US alone was performed in 1905 of the 1982 patients (96.1%), yielding a sensitivity of 98.7% (386 of 391 patients) and specificity of 97.1% (1470 of 1514 patients) for appendicitis. Seventy-seven patients underwent unenhanced MR imaging after equivocal US findings, yielding an overall algorithm sensitivity of 98.2% (400 of 407 patients) and specificity of 97.1% (1530 of 1575 patients). Seven of the 1982 patients (0.4%) had false-negative results with the staged algorithm. The negative predictive value of the staged algorithm was 99.5% (1530 of 1537 patients). Conclusion: A staged algorithm of US and unenhanced MR imaging for pediatric appendicitis appears to be effective. The results of this study demonstrate that this staged algorithm is 98.2% sensitive and 97.1% specific for the diagnosis of appendicitis in pediatric patients. © RSNA, 2017.
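The reported operating characteristics follow directly from confusion-matrix counts; for example, for US alone the abstract gives 386 true positives out of 391 patients with appendicitis and 1470 true negatives out of 1514 without:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# US alone, using the counts reported above
sens, spec = sens_spec(tp=386, fn=5, tn=1470, fp=44)
```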
Single-shot speckle reduction in numerical reconstruction of digitally recorded holograms.
Hincapie, Diego; Herrera-Ramírez, Jorge; Garcia-Sucerquia, Jorge
2015-04-15
A single-shot method to reduce the speckle noise in the numerical reconstructions of electronically recorded holograms is presented. A recorded hologram with the dimensions N×M is split into S=T×T sub-holograms. The uncorrelated superposition of the individually reconstructed sub-holograms leads to an image with the speckle noise reduced proportionally to the 1/S law. The experimental results are presented to support the proposed methodology.
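The splitting-and-averaging step can be sketched with NumPy; here a plain 2-D FFT stands in for the actual numerical reconstruction of each sub-hologram, and T = 2 and the random test hologram are illustrative choices:

```python
import numpy as np

def split_and_average(hologram, T=2):
    """Split an N x M hologram into S = T x T sub-holograms, reconstruct
    each one (FFT as a stand-in for the numerical reconstruction), and
    average the intensities incoherently to reduce speckle noise."""
    N, M = hologram.shape
    n, m = N // T, M // T
    acc = np.zeros((n, m))
    for i in range(T):
        for j in range(T):
            sub = hologram[i * n:(i + 1) * n, j * m:(j + 1) * m]
            field = np.fft.fft2(sub)        # stand-in reconstruction
            acc += np.abs(field) ** 2       # uncorrelated intensity sum
    return acc / (T * T)
```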
Synthetic aperture integration (SAI) algorithm for SAR imaging
Chambers, David H; Mast, Jeffrey E; Paglieroni, David W; Beer, N. Reginald
2013-07-09
A method and system for detecting the presence of subsurface objects within a medium is provided. In some embodiments, the imaging and detection system operates in a multistatic mode to collect radar return signals generated by an array of transceiver antenna pairs that is positioned across the surface and that travels down the surface. The imaging and detection system pre-processes the return signal to suppress certain undesirable effects. The imaging and detection system then generates synthetic aperture radar images from real aperture radar images generated from the pre-processed return signal. The imaging and detection system then post-processes the synthetic aperture radar images to improve detection of subsurface objects. The imaging and detection system identifies peaks in the energy levels of the post-processed image frame, which indicate the presence of subsurface objects.
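The final detection step, finding peaks in the energy levels of the post-processed frame, can be sketched as a simple local-maximum search. The 4-neighbour comparison and the threshold parameter are illustrative assumptions, not the patent's detector:

```python
def find_peaks_2d(frame, threshold=0.0):
    """Flag local maxima above a threshold in an energy image: a pixel is
    a peak if it strictly exceeds its 4-neighbours."""
    H, W = len(frame), len(frame[0])
    peaks = []
    for y in range(H):
        for x in range(W):
            v = frame[y][x]
            if v <= threshold:
                continue
            neigh = [frame[y + dy][x + dx]
                     for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))
                     if 0 <= y + dy < H and 0 <= x + dx < W]
            if all(v > n for n in neigh):
                peaks.append((y, x))
    return peaks
```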
ITERATION FREE FRACTAL COMPRESSION USING GENETIC ALGORITHM FOR STILL COLOUR IMAGES
Directory of Open Access Journals (Sweden)
A.R. Nadira Banu Kamal
2014-02-01
Full Text Available The storage requirements for images can be excessive if true colour and a high perceived image quality are desired. An RGB image may be viewed as a stack of three grey-scale images that, when fed into the red, green and blue inputs of a colour monitor, produce a colour image on the screen. The large size of many images leads to long, costly transmission times. Hence, an iteration-free fractal algorithm is proposed in this research paper to design an efficient search of the domain pools for colour image compression using a Genetic Algorithm (GA). The proposed methodology reduces the coding time and the intensive computation tasks. Parameters such as image quality, compression ratio and coding time are analysed. It is observed that the proposed method achieves excellent performance in image quality with a reduction in storage space.
Audi, Ahmad; Pierrot-Deseilligny, Marc; Meynard, Christophe; Thom, Christian
2017-07-18
Images acquired with a long exposure time using a camera embedded on UAVs (Unmanned Aerial Vehicles) exhibit motion blur due to the erratic movements of the UAV. The aim of the present work is to be able to acquire several images with a short exposure time and use an image processing algorithm to produce a stacked image with an equivalent long exposure time. Our method is based on the feature point image registration technique. The algorithm is implemented on the light-weight IGN (Institut national de l'information géographique) camera, which has an IMU (Inertial Measurement Unit) sensor and an SoC (System on Chip)/FPGA (Field-Programmable Gate Array). To obtain the correct parameters for the resampling of the images, the proposed method accurately estimates the geometrical transformation between the first and the N -th images. Feature points are detected in the first image using the FAST (Features from Accelerated Segment Test) detector, then homologous points on other images are obtained by template matching using an initial position benefiting greatly from the presence of the IMU sensor. The SoC/FPGA in the camera is used to speed up some parts of the algorithm in order to achieve real-time performance as our ultimate objective is to exclusively write the resulting image to save bandwidth on the storage device. The paper includes a detailed description of the implemented algorithm, resource usage summary, resulting processing time, resulting images and block diagrams of the described architecture. The resulting stacked image obtained for real surveys does not seem visually impaired. An interesting by-product of this algorithm is the 3D rotation estimated by a photogrammetric method between poses, which can be used to recalibrate in real time the gyrometers of the IMU. Timing results demonstrate that the image resampling part of this algorithm is the most demanding processing task and should also be accelerated in the FPGA in future work.
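A stripped-down stacking sketch in NumPy: estimate a global integer-pixel shift between frames by FFT cross-correlation (standing in for the FAST-plus-template-matching registration run on the camera) and average the aligned frames. The circular shift and the test frames are illustrative simplifications:

```python
import numpy as np

def register_shift(ref, img):
    """Integer-pixel shift of img relative to ref via circular FFT
    cross-correlation; returns the (dy, dx) that re-aligns img onto ref."""
    corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(img))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    return dy, dx

def stack(images):
    """Resample (here: circularly shift) each frame onto the first one and
    average, emulating a single long-exposure image."""
    ref = images[0]
    acc = ref.astype(float)
    for img in images[1:]:
        dy, dx = register_shift(ref, img)
        acc = acc + np.roll(img, (dy, dx), axis=(0, 1))
    return acc / len(images)
```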
Analysis and Evaluation of IKONOS Image Fusion Algorithm Based on Land Cover Classification
Institute of Scientific and Technical Information of China (English)
Xia JING; Yan BAO
2015-01-01
Each fusion algorithm has its own advantages and limitations, so it is difficult to simply rank fusion algorithms as good or bad. Which algorithm is selected to fuse given images also depends on the sensor types and the specific research purpose. Firstly, five fusion methods, i.e. IHS, Brovey, PCA, SFIM and Gram-Schmidt, are briefly described in the paper. Then visual judgment and quantitative statistical parameters are used to assess the five algorithms. Finally, in order to determine the most suitable fusion method for land cover classification of IKONOS imagery, maximum likelihood classification (MLC) was applied to the five fused images. The results showed that the SFIM and Gram-Schmidt transforms were better than the other three image fusion methods in spatial detail improvement and spectral information fidelity, and the Gram-Schmidt technique was superior to the SFIM transform in expressing image details. The classification accuracy of the images fused with the Gram-Schmidt and SFIM algorithms was higher than that of the other three image fusion methods, with an overall accuracy greater than 98%. The IHS-fused image classification accuracy was the lowest; the overall accuracy and kappa coefficient were 83.14% and 0.76, respectively. Thus the IKONOS fusion images obtained with Gram-Schmidt and SFIM were better for improving land cover classification accuracy.
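Of the five methods, the Brovey transform is the simplest to state: each multispectral band is scaled per pixel by the ratio of the panchromatic value to the band sum. A sketch on flat band lists (the band values in the test are illustrative):

```python
def brovey_fuse(r, g, b, pan):
    """Brovey transform pan-sharpening: band_out = band * pan / (r + g + b),
    computed per pixel over parallel band lists."""
    fused = []
    for rr, gg, bb, p in zip(r, g, b, pan):
        s = (rr + gg + bb) or 1e-9   # guard against all-zero pixels
        fused.append((rr * p / s, gg * p / s, bb * p / s))
    return fused
```

By construction the fused bands of each pixel sum to the panchromatic value, which injects spatial detail at the cost of spectral distortion, consistent with Brovey ranking below SFIM and Gram-Schmidt above.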
Chaos-based image encryption algorithm [rapid communication]
Guan, Zhi-Hong; Huang, Fangjun; Guan, Wenjie
2005-10-01
In this Letter, a new image encryption scheme is presented, in which shuffling the positions and changing the grey values of image pixels are combined to confuse the relationship between the cipher-image and the plain-image. Firstly, the Arnold cat map is used to shuffle the positions of the image pixels in the spatial domain. Then the discrete output signal of Chen's chaotic system is preprocessed to be suitable for grey-scale image encryption, and the shuffled image is encrypted by the preprocessed signal pixel by pixel. The experimental results demonstrate that the key space is large enough to resist brute-force attack and that the distribution of grey values in the encrypted image has random-like behavior.
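The position-shuffling stage is easy to make concrete: the Arnold cat map sends pixel (x, y) of an N×N image to ((x + y) mod N, (x + 2y) mod N), a bijection that is iterated to scramble the image. This sketch covers only the shuffle, not the Chen-system grey-value encryption that follows it:

```python
def arnold_cat(image):
    """One iteration of the Arnold cat map pixel shuffle on an N x N image:
    (x, y) -> ((x + y) mod N, (x + 2y) mod N)."""
    n = len(image)
    out = [[0] * n for _ in range(n)]
    for y in range(n):
        for x in range(n):
            x2, y2 = (x + y) % n, (x + 2 * y) % n
            out[y2][x2] = image[y][x]
    return out
```

Because the map matrix has determinant 1, the shuffle is a permutation and is periodic: iterating it enough times returns the original image (the period depends on N).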
A novel image-domain-based cone-beam computed tomography enhancement algorithm
Energy Technology Data Exchange (ETDEWEB)
Li Xiang; Li Tianfang; Yang Yong; Heron, Dwight E; Huq, M Saiful, E-mail: lix@upmc.edu [Department of Radiation Oncology, University of Pittsburgh Cancer Institute, Pittsburgh, PA 15232 (United States)
2011-05-07
Kilo-voltage (kV) cone-beam computed tomography (CBCT) plays an important role in image-guided radiotherapy. However, due to a large cone-beam angle, scatter effects significantly degrade the CBCT image quality and limit its clinical application. The goal of this study is to develop an image enhancement algorithm to reduce the low-frequency CBCT image artifacts, which are also called the bias field. The proposed algorithm is based on the hypothesis that image intensities of different types of materials in CBCT images are approximately globally uniform (in other words, a piecewise property). A maximum a posteriori probability framework was developed to estimate the bias field contribution from a given CBCT image. The performance of the proposed CBCT image enhancement method was tested using phantoms and clinical CBCT images. Compared to the original CBCT images, the corrected images using the proposed method achieved a more uniform intensity distribution within each tissue type and significantly reduced cupping and shading artifacts. In a head and a pelvic case, the proposed method reduced the Hounsfield unit (HU) errors within the region of interest from 300 HU to less than 60 HU. In a chest case, the HU errors were reduced from 460 HU to less than 110 HU. The proposed CBCT image enhancement algorithm demonstrated a promising result by the reduction of the scatter-induced low-frequency image artifacts commonly encountered in kV CBCT imaging.
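A crude stand-in for the bias-field idea: treat the bias as the low-frequency component of the image, estimate it with a Gaussian low-pass in the Fourier domain, and subtract it while restoring the global mean. This is not the paper's maximum a posteriori estimation; the smoothing scale `sigma_px` is an assumed illustrative parameter:

```python
import numpy as np

def remove_bias(image, sigma_px=8):
    """Estimate a low-frequency bias field by Gaussian low-pass filtering
    and subtract it, keeping the image's global mean intensity."""
    H, W = image.shape
    fy = np.fft.fftfreq(H)[:, None]
    fx = np.fft.fftfreq(W)[None, :]
    # Fourier transform of a spatial Gaussian of width sigma_px
    lowpass = np.exp(-2 * (np.pi * sigma_px) ** 2 * (fy ** 2 + fx ** 2))
    bias = np.fft.ifft2(np.fft.fft2(image) * lowpass).real
    return image - bias + image.mean()
```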
International Nuclear Information System (INIS)
Park, Seung-Kyu; Baik, Sung-Hoon; Cha, Hyung-Ki; Kim, Young-Suk; Cheong, Yong-Moo
2010-01-01
Low sensitivity to environmental vibrations is essential for industrial applications of a digital speckle pattern interferometer (DSPI) measuring micro-deformations. In this paper, a single-fringe DSPI that is robust to mechanical vibrations is designed for measuring the strain distribution of a tensile specimen. The system adopts a noise-immune signal-processing algorithm to acquire a 3-D strain distribution image. To acquire an accurate strain distribution for a tensile specimen, locally averaged and directionally oriented filters operating in the frequency domain are used. The system uses a path-independent least-squares phase-unwrapping algorithm to acquire the 3-D shape of the strain distribution. In initial experiments measuring the strain distribution of a tensile specimen in a vibration field, the system demonstrated feasibility for industrial applications by providing reliable strain data.
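The paper's unwrapper is a path-independent least-squares method in 2-D; the underlying idea is easiest to see in the 1-D path-following form (Itoh's method), sketched here as a stand-in:

```python
import math

def unwrap_1d(phases):
    """Path-following 1-D phase unwrap: remove 2*pi jumps between
    neighbouring samples, assuming true increments stay below pi."""
    out = [phases[0]]
    for p in phases[1:]:
        d = p - out[-1]
        d -= 2 * math.pi * round(d / (2 * math.pi))  # wrap step into (-pi, pi]
        out.append(out[-1] + d)
    return out
```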