WorldWideScience

Sample records for image encryption algorithm

  1. A new chaotic algorithm for image encryption

    International Nuclear Information System (INIS)

    Recent research on image encryption algorithms has increasingly been based on chaotic systems, but the drawbacks of small key space and weak security in one-dimensional chaotic cryptosystems are obvious. This paper presents a new nonlinear chaotic algorithm (NCA) that uses a power function and a tangent function instead of a linear function; its structural parameters are obtained by experimental analysis. An image encryption algorithm for a one-time-one-password system is then designed. The experimental results demonstrate that the image encryption algorithm based on NCA offers a large key space and high-level security while maintaining acceptable efficiency. Compared with general encryption algorithms such as DES, the proposed encryption algorithm is more secure.
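
    The record above does not reproduce the NCA map itself; the sketch below is only a hedged illustration of a keystream generator built from tangent and power functions, folded back into the unit interval. The map form, the constants alpha, beta and lam, and the byte extraction are illustrative assumptions, not the authors' exact construction.

```python
# Illustrative keystream generator in the spirit of a tangent/power-based
# chaotic map. NOT the published NCA: the update rule and constants below
# are assumptions chosen only so the orbit stays in [0, 1).
import math

def nca_like_keystream(x0, alpha, beta, lam, n, burn_in=200):
    """Iterate a tangent/power-based map and emit one byte per iteration."""
    x = x0
    stream = []
    for i in range(burn_in + n):
        # Nonlinear update: a tangent term mixed with a power-function term,
        # folded back into [0, 1) with a modulo operation.
        x = (lam * math.tan(alpha * x) * (1.0 - x) ** beta) % 1.0
        if i >= burn_in:
            stream.append(int(x * 256) % 256)
    return stream

# Example: derive 16 keystream bytes from an initial condition (the key).
print(nca_like_keystream(x0=0.3567, alpha=1.7, beta=5.0, lam=3.9, n=16))
```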

  2. ALGORITHM FOR IMAGE MIXING AND ENCRYPTION

    Directory of Open Access Journals (Sweden)

    Ayman M. Abdalla

    2013-04-01

    Full Text Available This new algorithm mixes two or more images of different types and sizes by employing a shuffling procedure combined with S-box substitution to perform lossless image encryption. It combines a stream cipher with a block cipher, at the byte level, in mixing the images. When the algorithm was implemented, empirical analysis using test images of different types and sizes showed that it is effective and resistant to attacks.

  3. Parallel image encryption algorithm based on discretized chaotic map

    International Nuclear Information System (INIS)

    Recently, a variety of chaos-based algorithms have been proposed for image encryption. Nevertheless, none of them works efficiently in a parallel computing environment. In this paper, we propose a framework for parallel image encryption. Based on this framework, a new algorithm is designed using the discretized Kolmogorov flow map. It fulfills all the requirements for a parallel image encryption algorithm; moreover, it is secure and fast. These properties make it a good choice for image encryption on parallel computing platforms.

  4. Stegano-Crypto Hiding Encrypted Data in Encrypted Image Using Advanced Encryption Standard and Lossy Algorithm

    Directory of Open Access Journals (Sweden)

    Ari Shawakat Tahir

    2015-12-01

    Full Text Available Steganography is the art and science of hiding information by embedding messages within other, seemingly harmless messages, and many researchers are working in this area. The proposed system uses the AES algorithm and a lossy technique to overcome the limitations of previous work and to increase processing speed. The sender uses the AES algorithm to encrypt the message and the image, and then uses the LSB technique to hide the encrypted data in the encrypted image. The receiver recovers the original data using the keys that were used in the encryption process. The proposed system has been implemented in NetBeans 7.3 and was tested with images and data of different sizes to measure its speed.
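
    The sketch below illustrates the encrypt-then-hide idea in the abstract: a message is encrypted with AES and the ciphertext bits are embedded in the least significant bits of a carrier image. The authors' implementation is in Java (NetBeans 7.3); this Python version using PyCryptodome and NumPy is an illustrative assumption, not their code.

```python
# Encrypt-then-hide sketch: AES-encrypt a message, then write the ciphertext
# bits into the least significant bits of a uint8 carrier image.
import numpy as np
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

def aes_encrypt(message: bytes, key: bytes) -> bytes:
    cipher = AES.new(key, AES.MODE_EAX)           # EAX mode: no padding needed
    ciphertext, tag = cipher.encrypt_and_digest(message)
    return cipher.nonce + tag + ciphertext         # bundle nonce/tag with data

def embed_lsb(carrier: np.ndarray, payload: bytes) -> np.ndarray:
    """Write payload bits into the LSBs of a uint8 carrier image."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = carrier.flatten()
    if bits.size > flat.size:
        raise ValueError("carrier image too small for payload")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(carrier.shape)

key = get_random_bytes(16)
carrier = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # stand-in image
stego = embed_lsb(carrier, aes_encrypt(b"secret report", key))
```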

  5. Image encryption based on a new total shuffling algorithm

    International Nuclear Information System (INIS)

    This paper presents an image encryption scheme that employs a new total shuffling matrix to shuffle the positions of image pixels and then uses the combined states of two chaotic systems to confuse the relationship between the plain-image and the cipher-image. The experimental results demonstrate that the new total shuffling algorithm has low time complexity and that the suggested image encryption algorithm offers a large key space and high security; moreover, the distribution of grey values in the encrypted image exhibits random-like behavior.
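
    A hedged sketch of the generic permutation-plus-diffusion pattern referred to above: a key-driven permutation shuffles pixel positions and a logistic-map keystream XOR-diffuses the shuffled pixels. The authors' total shuffling matrix and two-chaotic-system combination are not reproduced; the seed, map parameters and XOR diffusion below are illustrative choices.

```python
# Generic shuffle-then-diffuse sketch (not the paper's exact scheme).
import numpy as np

def logistic_bytes(x0, r, n, burn_in=500):
    x, out = x0, np.empty(n, dtype=np.uint8)
    for i in range(burn_in + n):
        x = r * x * (1.0 - x)
        if i >= burn_in:
            out[i - burn_in] = int(x * 256) % 256
    return out

def encrypt(img, seed=12345, x0=0.427, r=3.99):
    flat = img.flatten()
    perm = np.random.default_rng(seed).permutation(flat.size)  # shuffle stage
    shuffled = flat[perm]
    keystream = logistic_bytes(x0, r, flat.size)                # diffusion stage
    return (shuffled ^ keystream).reshape(img.shape), perm

def decrypt(cipher, perm, x0=0.427, r=3.99):
    keystream = logistic_bytes(x0, r, cipher.size)
    shuffled = cipher.flatten() ^ keystream
    flat = np.empty_like(shuffled)
    flat[perm] = shuffled                                        # invert shuffle
    return flat.reshape(cipher.shape)

img = np.random.randint(0, 256, (32, 32), dtype=np.uint8)
c, perm = encrypt(img)
assert np.array_equal(decrypt(c, perm), img)
```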

  6. Meteosat Images Encryption based on AES and RSA Algorithms

    Directory of Open Access Journals (Sweden)

    Boukhatem Mohammed Belkaid

    2015-06-01

    Full Text Available Satellite image security plays a vital role in communication systems and on the Internet. This work addresses securing the transmission of Meteosat images over the Internet, in public or local networks. To enhance the security of Meteosat transmission in network communication, a hybrid encryption algorithm based on the Advanced Encryption Standard (AES) and Rivest-Shamir-Adleman (RSA) algorithms is proposed. AES is used for data transmission because of its higher efficiency in block encryption, and RSA is used to encrypt the AES key because of its key-management advantages. The encryption system generates a unique password for every new encryption session. Cryptanalysis and various experiments have been carried out, and the results reported in this paper demonstrate the feasibility and flexibility of the proposed scheme.
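
    The hybrid idea in the abstract can be sketched as follows: a fresh AES session key encrypts the image data and the receiver's RSA public key wraps that session key. The PyCryptodome calls, the EAX mode and the key sizes are assumptions for illustration; the paper does not state which library or mode it uses.

```python
# Minimal AES/RSA hybrid sketch: per-session AES key, RSA-wrapped.
from Crypto.PublicKey import RSA
from Crypto.Cipher import AES, PKCS1_OAEP
from Crypto.Random import get_random_bytes

def hybrid_encrypt(image_bytes: bytes, rsa_public):
    session_key = get_random_bytes(16)                 # fresh key per session
    aes = AES.new(session_key, AES.MODE_EAX)
    ciphertext, tag = aes.encrypt_and_digest(image_bytes)
    wrapped_key = PKCS1_OAEP.new(rsa_public).encrypt(session_key)
    return wrapped_key, aes.nonce, tag, ciphertext

def hybrid_decrypt(wrapped_key, nonce, tag, ciphertext, rsa_private):
    session_key = PKCS1_OAEP.new(rsa_private).decrypt(wrapped_key)
    aes = AES.new(session_key, AES.MODE_EAX, nonce=nonce)
    return aes.decrypt_and_verify(ciphertext, tag)

rsa_key = RSA.generate(2048)
blob = hybrid_encrypt(b"\x10\x20\x30" * 1000, rsa_key.publickey())
assert hybrid_decrypt(*blob, rsa_key) == b"\x10\x20\x30" * 1000
```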

  7. Image encryption a communication perspective

    CERN Document Server

    Abd El-Samie, Fathi E; Elashry, Ibrahim F; Shahieen, Mai H; Faragallah, Osama S; El-Rabaie, El-Sayed M; Alshebeili, Saleh A

    2013-01-01

    Presenting encryption algorithms with diverse characteristics, Image Encryption: A Communication Perspective examines image encryption algorithms for the purpose of secure wireless communication. It considers two directions for image encryption: permutation-based approaches and substitution-based approaches. Covering the spectrum of image encryption principles and techniques, the book compares image encryption with permutation- and diffusion-based approaches, and explores number theory-based encryption algorithms such as the Data Encryption Standard and the Advanced Encryption Standard.

  8. Chaos-Based Image Encryption Algorithm Using Decomposition

    Directory of Open Access Journals (Sweden)

    Xiuli Song

    2013-07-01

    Full Text Available The proposed chaos-based image encryption algorithm consists of four stages: decomposition, shuffling, diffusion and combination. In decomposition, the original image is decomposed into components according to some rule. The purpose of shuffling is to mask the original organization of the image pixels, and the purpose of diffusion is to change their values. Combination is not required at the sender. To improve efficiency, a parallel architecture is used to process the shuffling and diffusion. To enhance the security of the algorithm, a permutation of the labels is first designed; then two Logistic maps are used in the diffusion stage to encrypt the components, with one map encrypting the odd rows of a component and the other encrypting the even rows. Experimental results and security analysis demonstrate that the encryption algorithm is robust and flexible and can withstand common attacks such as statistical and differential attacks.

  9. Design and Implementation of Image Encryption Algorithm Using Chaos

    Directory of Open Access Journals (Sweden)

    Sandhya Rani M.H.

    2014-06-01

    Full Text Available Images are widely used in diverse areas such as medicine, the military, science, engineering, art, advertising, entertainment, education and training, increasing the use of digital techniques for transmitting and storing images. Maintaining the confidentiality and integrity of images has therefore become a major concern, which makes encryption necessary. The values of neighbouring pixels in a plain image are strongly correlated; the proposed algorithm breaks this correlation and increases the entropy. Correlation is reduced by changing the pixel positions, which is called confusion, and the histogram is equalized by changing the pixel values, which is called diffusion. The proposed encryption algorithm is based on chaos theory. The plain-image is divided into blocks, and three levels of shuffling are then performed using different chaotic maps: in the first level the pixels within a block are shuffled, in the second level the blocks are shuffled, and in the third level all the pixels in the image are shuffled. Finally, the shuffled image is diffused using a chaotic sequence generated from symmetric keys to produce the ciphered image for transmission. The experimental results demonstrate that the proposed algorithm can successfully encrypt and decrypt images with the secret keys. The analysis also shows that the algorithm provides a large key space and high key sensitivity, and the encrypted image shows a good encryption effect, high information entropy and low correlation coefficients.

  10. Analyzing the Efficiency of Text-to-Image Encryption Algorithm

    OpenAIRE

    Ahmad Abusukhon; Mohammad Talib; Maher A. Nabulsi

    2012-01-01

    Today many activities are performed online through the Internet. One of the methods used to protect data while sending it through the Internet is cryptography. In previous work we proposed the Text-to-Image Encryption algorithm (TTIE) as a novel algorithm for network security. In this paper we investigate the efficiency of TTIE for a large-scale collection.

  11. Using Genetic Algorithm for Symmetric key Generation in Image Encryption

    Directory of Open Access Journals (Sweden)

    Aarti Soni, Suyash Agrawal

    2012-12-01

    Full Text Available Cryptography is essential for protecting information, as the importance of security is increasing day by day with the advent of online transaction processing and e-commerce. Nowadays the security of digital images attracts much attention, especially when these digital images are stored in memory or sent through communication networks. Genetic algorithms are a class of optimization algorithms; many problems can be solved using genetic algorithms by modelling a simplified version of genetic processes. In this paper, a method based on a genetic algorithm is proposed to generate a key with the help of a pseudo-random number generator. The random number is generated on the basis of the current system time. Using the genetic algorithm, the strength of the key can be kept high while keeping the whole algorithm efficient. The symmetric key algorithm AES is proposed for encrypting the image, as it is a very secure method of symmetric key encryption.
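
    The abstract describes GA-based key generation only at a high level. The sketch below is a hypothetical reading: a time-seeded pseudo-random population of 128-bit keys is evolved with a simple fitness function (bit balance plus transition count, an assumption not taken from the paper), and the best individual would then serve as an AES-128 key.

```python
# Hypothetical GA key-generation sketch; the fitness function and GA
# parameters are assumptions, not the paper's specification.
import random, time

KEY_BITS = 128

def fitness(key):
    ones = sum(key)
    transitions = sum(a != b for a, b in zip(key, key[1:]))
    # Prefer keys with balanced bits and frequent 0/1 transitions.
    return -abs(ones - KEY_BITS // 2) + transitions

def evolve(pop_size=40, generations=60, mutation=0.02):
    rng = random.Random(time.time())                       # time-seeded PRNG
    pop = [[rng.randint(0, 1) for _ in range(KEY_BITS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, KEY_BITS)               # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < mutation) for bit in child]
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    return bytes(int("".join(map(str, best[i:i + 8])), 2) for i in range(0, KEY_BITS, 8))

print(evolve().hex())    # 16-byte key, e.g. for AES-128
```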

  12. Analyzing the Efficiency of Text-to-Image Encryption Algorithm

    Directory of Open Access Journals (Sweden)

    Ahmad Abusukhon

    2012-12-01

    Full Text Available Today many activities are performed online through the Internet. One of the methods used to protect data while sending it through the Internet is cryptography. In previous work we proposed the Text-to-Image Encryption algorithm (TTIE) as a novel algorithm for network security. In this paper we investigate the efficiency of TTIE for a large-scale collection.
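
    A hedged sketch of the text-to-image idea behind TTIE: each character is substituted by a secret, randomly chosen RGB pixel and the pixels are laid out as an image. The table construction, padding and layout below are illustrative assumptions rather than the published TTIE specification.

```python
# Illustrative text-to-image substitution: one secret RGB color per character.
import random
import numpy as np

def make_key(alphabet, seed):
    rng = random.Random(seed)
    # One distinct random color per character (the secret substitution table).
    colors = rng.sample(range(256 ** 3), len(alphabet))
    return {ch: ((c >> 16) & 255, (c >> 8) & 255, c & 255)
            for ch, c in zip(alphabet, colors)}

def text_to_image(text, key, width=16):
    pixels = [key[ch] for ch in text]
    pixels += [(0, 0, 0)] * (-len(pixels) % width)       # pad the last row
    return np.array(pixels, dtype=np.uint8).reshape(-1, width, 3)

alphabet = [chr(c) for c in range(32, 127)]              # printable ASCII
key = make_key(alphabet, seed=2024)
img = text_to_image("meet at dawn", key)
print(img.shape)    # (1, 16, 3) for this short message
```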

  13. Integral Imaging Based 3-D Image Encryption Algorithm Combined with Cellular Automata

    Scientific Electronic Library Online (English)

    Li, X. W.; Kim, D. H.; Cho, S. J.; Kim, S. T.

    2013-08-01

    Full Text Available A novel optical encryption method is proposed in this paper to achieve 3-D image encryption. The proposed algorithm combines computational integral imaging (CII) and linear-complemented maximum-length cellular automata (LC-MLCA) to encrypt a 3-D image. In the encryption process [...], the 2-D elemental image array (EIA) recorded by light rays of the 3-D image is mapped inversely through the lenslet array according to ray-tracing theory. Next, the 2-D EIA is encrypted by the LC-MLCA algorithm. When decrypting the encrypted image, the 2-D EIA is recovered by the LC-MLCA, and a 3-D object is subsequently reconstructed on the output plane from the recovered 2-D EIA using the computational integral imaging reconstruction (CIIR) technique. Because the 2-D EIA is composed of a number of elemental images having their own perspectives of the 3-D image, the 3-D image can be successfully reconstructed from only partial data even if the encrypted image is seriously damaged. To verify the usefulness of the proposed algorithm, we perform computational experiments and present the experimental results for various attacks. The experiments demonstrate that the proposed encryption method is valid and exhibits strong robustness and security.

  14. Double color image encryption using iterative phase retrieval algorithm in quaternion gyrator domain.

    Science.gov (United States)

    Shao, Zhuhong; Shu, Huazhong; Wu, Jiasong; Dong, Zhifang; Coatrieux, Gouenou; Coatrieux, Jean Louis

    2014-03-10

    This paper describes a novel algorithm to encrypt double color images into a single indistinguishable image in the quaternion gyrator domain. The phase masks used for encryption are obtained with an iterative phase retrieval algorithm. Subsequently, the encrypted image is generated via cascaded quaternion gyrator transforms with different rotation angles. The parameters of the quaternion gyrator transforms and the phases serve as encryption keys. Knowing these keys, the original color images can be fully recovered. Numerical simulations have demonstrated the validity of the proposed encryption system as well as its robustness against data loss and additive Gaussian noise. PMID:24663832

  15. Comparative Analysis and Implementation of Image Encryption Algorithms

    OpenAIRE

    Rajinder Kaur; Er. Kanwalpreet Singh

    2013-01-01

    Due to the rapid growth of digital communication and multimedia applications, security has become an important issue in the communication and storage of images. Image security is needed in many applications where the information (in the form of an image) must be protected from unauthorized access. Encryption is one of the ways to ensure high security. In recent years, encryption technology has developed and many image encryption methods have been used. These methods produce randomness in ...

  16. Quality of Encryption Measurement of Bitmap Images with RC6, MRC6, and Rijndael Block Cipher Algorithms

    OpenAIRE

    Nawal El-Fishawy; Osama M. Abu Zaid

    2007-01-01

    RC6, MRC6, and Rijndael are three block cipher algorithms. Different types of bitmap images are encrypted with each of the three encryption algorithms. Visual inspection alone is not sufficient for judging the quality of encrypted images, so other measuring factors are considered, based on: measuring the maximum deviation between the original and the encrypted images, measuring the correlation coefficient between the encrypted and the original images, the difference between the pixel value of the origi...
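
    Two of the measures listed above can be computed as in the simplified sketch below: the correlation coefficient between the plain and cipher images, and a histogram-based deviation between them. The exact weighting used for the maximum-deviation measure in the paper may differ from this simplified form.

```python
# Simplified encryption-quality measures for an 8-bit image pair.
import numpy as np

def correlation_coefficient(plain, cipher):
    return float(np.corrcoef(plain.ravel(), cipher.ravel())[0, 1])

def histogram_deviation(plain, cipher):
    hp, _ = np.histogram(plain, bins=256, range=(0, 256))
    hc, _ = np.histogram(cipher, bins=256, range=(0, 256))
    return int(np.abs(hp - hc).sum())        # larger => cipher deviates more

plain = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
cipher = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # stand-in
print(correlation_coefficient(plain, cipher), histogram_deviation(plain, cipher))
```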

  17. A chaos-based image encryption algorithm with variable control parameters

    International Nuclear Information System (INIS)

    In recent years, a number of image encryption algorithms based on the permutation-diffusion structure have been proposed. However, the control parameters used in the permutation stage are usually fixed in the whole encryption process, which favors attacks. In this paper, a chaos-based image encryption algorithm with variable control parameters is proposed. The control parameters used in the permutation stage and the keystream employed in the diffusion stage are generated from two chaotic maps related to the plain-image. As a result, the algorithm can effectively resist all known attacks against permutation-diffusion architectures. Theoretical analyses and computer simulations both confirm that the new algorithm possesses high security and fast encryption speed for practical image encryption.

  18. An image joint compression-encryption algorithm based on adaptive arithmetic coding

    International Nuclear Information System (INIS)

    Through a series of studies on arithmetic coding and arithmetic encryption, a novel joint image compression-encryption algorithm based on adaptive arithmetic coding is proposed. The contexts produced in the process of image compression are modified by keys in order to achieve joint image compression and encryption. Combined with the bit-plane coding technique, the discrete wavelet transform coefficients at different resolutions can be encrypted with different keys, so that resolution-selective encryption is realized to meet different application needs. Zero-tree coding is improved, and adaptive arithmetic coding is introduced. The proposed joint compression-encryption algorithm is then simulated. The simulation results show that, as long as the parameters are selected appropriately, the compression efficiency of the proposed algorithm is essentially identical to that of the original image compression algorithm, and its security is better than that of the joint encryption algorithm based on interval splitting.

  19. Analysis and improvement of a chaos-based image encryption algorithm

    International Nuclear Information System (INIS)

    The security of digital images has attracted much attention recently. In Guan et al. [Guan Z, Huang F, Guan W. Chaos-based image encryption algorithm. Phys Lett A 2005;346:153-7], a chaos-based image encryption algorithm was proposed. In this paper, the cause of potential flaws in the original algorithm is analyzed in detail, and the corresponding enhancement measures are proposed. Both theoretical analysis and computer simulation indicate that the improved algorithm can overcome these flaws and maintain all the merits of the original one.

  20. Image encryption using fingerprint as key based on phase retrieval algorithm and public key cryptography

    Science.gov (United States)

    Zhao, Tieyu; Ran, Qiwen; Yuan, Lin; Chi, Yingying; Ma, Jing

    2015-09-01

    In this paper, a novel image encryption system with a fingerprint used as the secret key is proposed, based on a phase retrieval algorithm and the RSA public key algorithm. In the system, the encryption keys are the fingerprint and the public key of the RSA algorithm, while the decryption keys are the fingerprint and the private key of the RSA algorithm. If the users share the fingerprint, the system meets the basic requirements of asymmetric cryptography. The system is also applicable to information authentication. The fingerprint is used as a secret key in both the encryption and decryption processes, so that the receiver can verify the authenticity of the ciphertext by using the fingerprint during decryption. Finally, the simulation results show the validity of the encryption scheme and its high robustness against attacks based on the phase retrieval technique.

  1. A Novel Image Encryption Algorithm Based on DNA Encoding and Spatiotemporal Chaos

    Directory of Open Access Journals (Sweden)

    Chunyan Song

    2015-10-01

    Full Text Available DNA computing-based image encryption is a new and promising field. In this paper, we propose a novel image encryption scheme based on DNA encoding and spatiotemporal chaos. In particular, after the plain image is first diffused with a bitwise exclusive-OR operation, a DNA mapping rule is introduced to encode the diffused image. To enhance the encryption, a spatiotemporal chaotic system is used to confuse the rows and columns of the DNA-encoded image. The experiments demonstrate that the proposed encryption algorithm has high key sensitivity and a large key space, and that it can resist brute-force, entropy, differential, chosen-plaintext, known-plaintext and statistical attacks.
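
    The DNA-encoding step can be illustrated as below: after a bitwise XOR diffusion, each pixel's four 2-bit pairs are mapped to the bases A, C, G and T under one of the eight standard mapping rules. The fixed rule, the single-byte XOR key and the omission of the spatiotemporal confusion stage are simplifications, not the authors' full scheme.

```python
# XOR diffusion followed by DNA encoding of each pixel (4 bases per byte).
import numpy as np

RULE_1 = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}   # one of 8 valid rules

def dna_encode(img: np.ndarray, key_byte: int = 0x5A) -> list:
    diffused = img ^ np.uint8(key_byte)                  # simple XOR diffusion
    encoded = []
    for pixel in diffused.ravel():
        p = int(pixel)
        # Split the byte into 2-bit pairs, most significant first,
        # and map each pair to a base (e.g. raw byte 0x5A -> "CCGG").
        bases = [RULE_1[(p >> shift) & 0b11] for shift in (6, 4, 2, 0)]
        encoded.append("".join(bases))
    return encoded

img = np.array([[90, 0], [255, 1]], dtype=np.uint8)
print(dna_encode(img))   # four 4-base strings, one per pixel
```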

  2. A fast image encryption algorithm based on only blocks in cipher text

    International Nuclear Information System (INIS)

    In this paper, a fast image encryption algorithm is proposed in which shuffling and diffusion are performed simultaneously. The cipher-text image is divided into blocks of k x k pixels, while the pixels of the plain-text are scanned one by one. Four logistic maps are used to generate the encryption keystream and the new position of each plain-image pixel in the cipher image, namely the row and column of the block to which the pixel belongs and the place within that block where the pixel is placed. After each pixel is encrypted, the initial conditions of the logistic maps are changed according to the encrypted pixel's value; after each row of the plain image is encrypted, the initial conditions are also changed by the skew tent map. Finally, it is illustrated that this algorithm has a fast speed, a large key space, and good resistance to differential attacks, statistical analysis, and known-plaintext and chosen-plaintext attacks.

  3. A Brief Study of Video Encryption Algorithms

    Directory of Open Access Journals (Sweden)

    Pranali Pasalkar

    2015-02-01

    Full Text Available A video is a set of images, and video encryption is the encryption of that set of images; thus video encryption simply hides a video from prying eyes. Video monitoring has always been a concern. Multimedia security is very important for multimedia commerce on the Internet, such as video on demand and real-time video multicast. There are various video encryption algorithms, and all have some kind of weakness. In this paper a classification of various existing algorithms, with their advantages and disadvantages, is discussed.

  4. SECURE PARTIAL IMAGE ENCRYPTION SCHEME USING SCAN BASED ALGORITHM

    OpenAIRE

    Sumithra Devi K A; K M Sunjiv Soyjaudah; Parameshachari B D

    2013-01-01

    Today data security is a very important and high-priority topic. With the rapid growth in communication and computer technologies, there is a huge volume of data transactions on the Internet, in teleconferencing, and in military applications, and all of these applications need security. Encryption is the primary solution for securing data travelling on a communication link between any pair of nodes, but partial encryption is a technique to save computational power, overhead, speed, time and to pr...

  5. Compressive Optical Image Encryption

    OpenAIRE

    Li, Jun; Sheng Li, Jiao; Yang Pan, Yang; Li, Rong

    2015-01-01

    An optical image encryption technique based on compressive sensing using fully optical means has been proposed. An object image is first encrypted to a white-sense stationary noise pattern using a double random phase encoding (DRPE) method in a Mach-Zehnder interferometer. Then, the encrypted image is highly compressed to a signal using single-pixel compressive holographic imaging in the optical domain. At the receiving terminal, the encrypted image is reconstructed well via compressive sensi...

  6. Analysis and improvement of a hash-based image encryption algorithm

    Science.gov (United States)

    Deng, Shaojiang; Zhan, Yanping; Xiao, Di; Li, Yantao

    2011-08-01

    The security of digital images has attracted much attention recently. A hash-based digital image encryption algorithm was proposed in Ref. [1], but both theoretical analysis and computer simulation show that its diffusion is too weak to resist chosen-plaintext and known-plaintext attacks: a one-bit difference in a plain pixel leads to only one corresponding bit change in the cipher pixel. In our improved algorithm, coupled with a self-adaptive algorithm, a difference of only one pixel in the plain-image causes changes in almost all pixels of the cipher-image (NPCR > 98.77%), and the unified average changing intensity is high (UACI > 30.96%). Both theoretical analysis and computer simulation indicate that the improved algorithm can overcome these flaws and maintain all the merits of the original one.
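
    The two differential metrics quoted above, NPCR and UACI, are standard and can be computed as follows for a pair of cipher images obtained from plain images differing in a single pixel; the random arrays here are stand-ins for real cipher images.

```python
# NPCR and UACI for 8-bit cipher images. A strong cipher is usually expected
# to reach NPCR above roughly 99% and UACI around 33% on such images.
import numpy as np

def npcr(c1: np.ndarray, c2: np.ndarray) -> float:
    """Number of Pixels Change Rate, in percent."""
    return 100.0 * np.mean(c1 != c2)

def uaci(c1: np.ndarray, c2: np.ndarray) -> float:
    """Unified Average Changing Intensity, in percent (8-bit images)."""
    return 100.0 * np.mean(np.abs(c1.astype(np.int16) - c2.astype(np.int16)) / 255.0)

c1 = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
c2 = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
print(f"NPCR = {npcr(c1, c2):.2f}%, UACI = {uaci(c1, c2):.2f}%")
```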

  7. A novel algorithm for image encryption based on mixture of chaotic maps

    Energy Technology Data Exchange (ETDEWEB)

    Behnia, S. [Department of Physics, IAU, Urmia (Iran, Islamic Republic of)], E-mail: s.behnia@iaurmia.ac.ir; Akhshani, A.; Mahmodi, H. [Department of Physics, IAU, Urmia (Iran, Islamic Republic of); Akhavan, A. [Department of Engineering, IAU, Urmia (Iran, Islamic Republic of)

    2008-01-15

    Chaos-based encryption appeared in the early 1990s as an original application of nonlinear dynamics in the chaotic regime. In this paper, an implementation of a digital image encryption scheme based on a mixture of chaotic systems is reported. The chaotic cryptography technique used in this paper is symmetric key cryptography. In this algorithm, a typical coupled map is mixed with a one-dimensional chaotic map and used for high-security image encryption while its speed remains acceptable. The proposed algorithm is described in detail, along with its security analysis and implementation. The experimental results based on the mixture of chaotic maps confirm the effectiveness of the proposed method and of the implementation of the algorithm. This mixture of chaotic maps offers the advantages of a large key space and high-level security. The ciphertext generated by this method is the same size as the plaintext and is suitable for practical use in the secure transmission of confidential information over the Internet.

  8. A novel algorithm for image encryption based on mixture of chaotic maps

    International Nuclear Information System (INIS)

    Chaos-based encryption appeared in the early 1990s as an original application of nonlinear dynamics in the chaotic regime. In this paper, an implementation of a digital image encryption scheme based on a mixture of chaotic systems is reported. The chaotic cryptography technique used in this paper is symmetric key cryptography. In this algorithm, a typical coupled map is mixed with a one-dimensional chaotic map and used for high-security image encryption while its speed remains acceptable. The proposed algorithm is described in detail, along with its security analysis and implementation. The experimental results based on the mixture of chaotic maps confirm the effectiveness of the proposed method and of the implementation of the algorithm. This mixture of chaotic maps offers the advantages of a large key space and high-level security. The ciphertext generated by this method is the same size as the plaintext and is suitable for practical use in the secure transmission of confidential information over the Internet.

  9. Novel Data Encryption Algorithm

    Directory of Open Access Journals (Sweden)

    Rajat Goel

    2011-07-01

    Full Text Available We always strive for better algorithms for securing data. A variety of such algorithms are used in cryptography; mainly block and stream ciphers are available, and one of them is the International Data Encryption Algorithm (IDEA), which was regarded as arguably one of the best for encryption purposes. Considerable time has elapsed since its advent, and this period has witnessed wide development in processing approaches and applications. The number of transactions and data exchanges has increased exponentially; consequently, better and novel attacks on data have evolved. Researchers believe that the security of the algorithm needs to be improved while keeping a check on time and space complexity. In this research work we present a robust algorithm known as NDEA which can be applied to secure modern environment applications.

  10. A joint image encryption and watermarking algorithm based on compressive sensing and chaotic map

    Science.gov (United States)

    Xiao, Di; Cai, Hong-Kun; Zheng, Hong-Ying

    2015-06-01

    In this paper, a compressive sensing (CS) and chaotic map-based joint image encryption and watermarking algorithm is proposed. The transform domain coefficients of the original image are scrambled by Arnold map firstly. Then the watermark is adhered to the scrambled data. By compressive sensing, a set of watermarked measurements is obtained as the watermarked cipher image. In this algorithm, watermark embedding and data compression can be performed without knowing the original image; similarly, watermark extraction will not interfere with decryption. Due to the characteristics of CS, this algorithm features compressible cipher image size, flexible watermark capacity, and lossless watermark extraction from the compressed cipher image as well as robustness against packet loss. Simulation results and analyses show that the algorithm achieves good performance in the sense of security, watermark capacity, extraction accuracy, reconstruction, robustness, etc. Project supported by the Open Research Fund of Chongqing Key Laboratory of Emergency Communications, China (Grant No. CQKLEC, 20140504), the National Natural Science Foundation of China (Grant Nos. 61173178, 61302161, and 61472464), and the Fundamental Research Funds for the Central Universities, China (Grant Nos. 106112013CDJZR180005 and 106112014CDJZR185501).

  11. Cryptanalysis of an image encryption scheme based on a new total shuffling algorithm

    International Nuclear Information System (INIS)

    Chaotic systems have been broadly exploited over the last two decades to build encryption methods. Recently, two new image encryption schemes have been proposed in which the encryption process involves a permutation operation and an XOR-like transformation of the shuffled pixels, controlled by three chaotic systems. This paper discusses some defects of the schemes and how to break them with a chosen-plaintext attack.

  12. Security Analysis of Image Encryption Based on Gyrator Transform by Searching the Rotation Angle with Improved PSO Algorithm

    Directory of Open Access Journals (Sweden)

    Jun Sang

    2015-08-01

    Full Text Available The gyrator transform has been widely used for image encryption recently. For gyrator transform-based image encryption, the rotation angle used in the gyrator transform is one of the secret keys. In this paper, by analyzing the properties of the gyrator transform, an improved particle swarm optimization (PSO) algorithm is proposed to search for the rotation angle in a single gyrator transform. Since the gyrator transform is continuous, it is time-consuming to search the rotation angle exhaustively, even considering the data precision of a computer; a computational intelligence-based search is therefore an alternative choice. Considering the severe local convergence and obvious global fluctuations of the gyrator transform, an improved PSO algorithm suitable for such situations is proposed. The experimental results demonstrate that the proposed improved PSO algorithm can significantly improve the efficiency of searching for the rotation angle in a single gyrator transform. Since the gyrator transform is the foundation of image encryption in gyrator transform domains, research on searching for the rotation angle in a single gyrator transform is useful for further study of the security of such image encryption algorithms.
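
    A minimal PSO sketch for a one-dimensional parameter search of the kind described above. The objective function is a hypothetical stand-in; in the actual attack it would score how well a candidate rotation angle inverts the gyrator transform of an intercepted cipher image, which is not reproduced here.

```python
# Generic particle swarm optimization over a single bounded parameter.
import random

def pso_search(objective, lo, hi, particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [random.uniform(lo, hi) for _ in range(particles)]
    vel = [0.0] * particles
    pbest = pos[:]
    pbest_val = [objective(p) for p in pos]
    gbest = min(zip(pbest_val, pbest))[1]
    for _ in range(iters):
        for i in range(particles):
            r1, r2 = random.random(), random.random()
            vel[i] = w * vel[i] + c1 * r1 * (pbest[i] - pos[i]) + c2 * r2 * (gbest - pos[i])
            pos[i] = min(max(pos[i] + vel[i], lo), hi)      # keep in bounds
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest_val[i], pbest[i] = val, pos[i]
        gbest = min(zip(pbest_val, pbest))[1]
    return gbest

# Hypothetical objective: distance to an unknown "true" angle of 1.234 rad.
print(pso_search(lambda a: abs(a - 1.234), lo=0.0, hi=3.1416))
```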

  13. Compressive Optical Image Encryption

    Science.gov (United States)

    Li, Jun; Sheng Li, Jiao; Yang Pan, Yang; Li, Rong

    2015-05-01

    An optical image encryption technique based on compressive sensing using fully optical means has been proposed. An object image is first encrypted to a white-sense stationary noise pattern using a double random phase encoding (DRPE) method in a Mach-Zehnder interferometer. Then, the encrypted image is highly compressed to a signal using single-pixel compressive holographic imaging in the optical domain. At the receiving terminal, the encrypted image is reconstructed well via compressive sensing theory, and the original image can be decrypted with three reconstructed holograms and the correct keys. The numerical simulations show that the method is effective and suitable for optical image security transmission in future all-optical networks because of the ability of completely optical implementation and substantially smaller hologram data volume.

  14. Compressive optical image encryption.

    Science.gov (United States)

    Li, Jun; Sheng Li, Jiao; Yang Pan, Yang; Li, Rong

    2015-01-01

    An optical image encryption technique based on compressive sensing using fully optical means has been proposed. An object image is first encrypted to a white-sense stationary noise pattern using a double random phase encoding (DRPE) method in a Mach-Zehnder interferometer. Then, the encrypted image is highly compressed to a signal using single-pixel compressive holographic imaging in the optical domain. At the receiving terminal, the encrypted image is reconstructed well via compressive sensing theory, and the original image can be decrypted with three reconstructed holograms and the correct keys. The numerical simulations show that the method is effective and suitable for optical image security transmission in future all-optical networks because of the ability of completely optical implementation and substantially smaller hologram data volume. PMID:25992946
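
    The DRPE step on its own can be sketched numerically in its classical Fourier-domain (4f) form, as below; the Mach-Zehnder interferometer and the single-pixel compressive holography stage of the paper are not modelled. The two random phase masks act as the keys.

```python
# Classical double random phase encoding (DRPE) simulated with FFTs.
import numpy as np

def drpe_encrypt(img: np.ndarray, seed1=1, seed2=2) -> np.ndarray:
    rng1, rng2 = np.random.default_rng(seed1), np.random.default_rng(seed2)
    m1 = np.exp(2j * np.pi * rng1.random(img.shape))      # input-plane mask
    m2 = np.exp(2j * np.pi * rng2.random(img.shape))      # Fourier-plane mask
    return np.fft.ifft2(np.fft.fft2(img * m1) * m2)

def drpe_decrypt(field: np.ndarray, seed1=1, seed2=2) -> np.ndarray:
    rng1, rng2 = np.random.default_rng(seed1), np.random.default_rng(seed2)
    m1 = np.exp(2j * np.pi * rng1.random(field.shape))
    m2 = np.exp(2j * np.pi * rng2.random(field.shape))
    return np.abs(np.fft.ifft2(np.fft.fft2(field) / m2) / m1)

img = np.random.rand(64, 64)
cipher = drpe_encrypt(img)          # white-noise-like complex field
assert np.allclose(drpe_decrypt(cipher), img)
```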

  15. Image Encryption using the Standard Hill Cipher

    Directory of Open Access Journals (Sweden)

    Gaurav Agarwal

    2010-12-01

    Full Text Available In the recent decade of cryptography, the concept of the image has played a big role. Hiding an image inside another image can be a good approach to image encryption, and considerable work has been done in this field. There are many ways to encrypt an image, but in this paper we present a new technique of image encryption using the standard Hill cipher. The Hill cipher is a symmetric key algorithm in which a matrix is used as the key for encrypting text data. An image is also a matrix of pixels, and each pixel has an intensity value. Using this concept, we build a function that selects a random key matrix and then encrypts the image using that key matrix; for decryption we use the same key matrix to recover the original image.
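
    A small runnable version of the idea above, assuming a fixed 2x2 key matrix for clarity (the paper selects the key matrix randomly): pixel pairs are multiplied by the key matrix modulo 256, and decryption uses the modular inverse of the matrix, which exists whenever the determinant is odd.

```python
# Hill cipher over pixel pairs, modulo 256.
import numpy as np

KEY = np.array([[3, 3],
                [2, 5]], dtype=np.int64)        # det = 9, odd => invertible

def inv_mod_256(m):
    det = int(round(np.linalg.det(m))) % 256
    det_inv = pow(det, -1, 256)                 # modular inverse of the det
    adj = np.array([[m[1, 1], -m[0, 1]],
                    [-m[1, 0], m[0, 0]]], dtype=np.int64)
    return (det_inv * adj) % 256

def hill(img, key):
    flat = img.astype(np.int64).ravel()
    if flat.size % 2:
        flat = np.append(flat, 0)               # pad to whole pixel pairs
    pairs = flat.reshape(-1, 2).T
    return ((key @ pairs) % 256).T.ravel().astype(np.uint8)

img = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
cipher = hill(img, KEY)
plain = hill(cipher.reshape(8, 8), inv_mod_256(KEY))
assert np.array_equal(plain.reshape(8, 8), img)
```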

  16. Simple and secure Image Encryption

    OpenAIRE

    V.V.Divya; S.K.Sudha; V.R.Resmy

    2012-01-01

    Image Encryption is a wide area of research. Encryption basically deals with converting data or information from its original form to some other form that hides the information in it. The protection of image data from unauthorized access is important. Encryption is employed to increase the data security. The Encrypted Image is secure from any kind cryptanalysis. In the proposed work, the image to be encrypted is decomposed into 8X8 blocks, these blocks are transformed from the spatial domain ...

  17. Comparison of TACIT Encryption Algorithm with Various Encryption Algorithms

    Directory of Open Access Journals (Sweden)

    Manmeet Kaur

    2012-03-01

    Full Text Available A common goal of cryptographic research is to design protocols that provide a confidential and authenticated transmission channel for messages over an insecure network. A cryptographic algorithm is considered computationally secure if it cannot be broken with standard resources, either current or future; apart from the algorithm itself, the distribution of keys and an efficient cryptosystem are also important. The TACIT Encryption Algorithm can produce the best possible results if the key size and the size of the packets expected to pass through the network are small. This paper compares various algorithms with the TACIT Encryption Algorithm on the basis of parameters such as key length, block size, type and features. This research work also investigates an HDL implementation of the TACIT Encryption Algorithm.

  18. METRICS OF A NEW SYMMETRICAL ENCRYPTION ALGORITHM

    Directory of Open Access Journals (Sweden)

    Dr. R. UMARANI

    2011-12-01

    Full Text Available Hacking is the greatest problem in wireless local area networks (WLANs). Many algorithms such as DES, 3DES, AES, CAST, UMARAM and RC6 have been used to prevent outside attackers from eavesdropping and to ensure that data is transferred to the end-user correctly. Authentication protocols have been used for authentication and key-exchange processes. A new symmetric encryption algorithm is proposed in this paper to prevent outside attackers from obtaining any information from any data exchange in a wireless local area network (WLAN). The new symmetric algorithm avoids key exchange between users and reduces the time taken for the encryption, decryption and authentication processes. It operates at a higher data rate than the DES, 3DES, AES, UMARAM and RC6 algorithms. It is applied to a text file and an image as an application. The encryption is more secure and achieves a higher data rate than DES, 3DES, AES, CAST, UMARAM and RC6. A comparison has been conducted for the encryption algorithms DES, 3DES, AES, CAST, UMARAM and RC6 at different settings for each algorithm, such as different sizes of data blocks, different data types, battery power consumption, different key sizes and finally encryption/decryption speed. Experimental results are given to demonstrate the effectiveness of each algorithm.

  19. 3D Chaotic Functions for Image Encryption

    OpenAIRE

    Pawan N. Khade; Manish Narnaware

    2012-01-01

    This paper proposes a chaotic encryption algorithm based on the 3D logistic map, the 3D Chebyshev map, and the 3D and 2D Arnold cat maps for color image encryption. Here the 2D Arnold cat map is used for image pixel scrambling and the 3D Arnold cat map is used for R, G, and B component substitution. The 3D Chebyshev map is used for key generation and the 3D logistic map is used for image scrambling. The use of 3D chaotic functions in the encryption algorithm provides more security by using the shuffling and substit...

  20. Hybrid Encryption Algorithms in Cloud Computing

    OpenAIRE

    Ping Guo; Liping Su; Lijiang Ning; Guangxiang Dan

    2013-01-01

    The security issues of user privacy and data have become one of the most important factors in cloud computing. In this paper we focus on data encryption and study how to improve the security of data in the cloud through data encryption. Combining the features of traditional encryption algorithms and the characteristics of cloud platforms, we design three hybrid encryption algorithms: 3DES-AES, TDAES and TDAESalt. Experiment results show that the designed algorithms ...

  1. Image encryption using the Sudoku matrix

    Science.gov (United States)

    Wu, Yue; Zhou, Yicong; Noonan, Joseph P.; Panetta, Karen; Agaian, Sos

    2010-04-01

    This paper introduces a new, effective and lossless image encryption algorithm that uses a Sudoku matrix to scramble and encrypt an image. The new algorithm encrypts an image through a three-stage process. In the first stage, a reference Sudoku matrix is generated as the foundation for the encryption and scrambling processes. The image pixels' intensities are then changed using the reference Sudoku matrix values, and the pixels' positions are shuffled using the Sudoku matrix as a mapping. This method efficiently encrypts a variety of digital images, such as binary, gray and RGB images, without any quality loss. The security keys of the presented algorithm are the combination of the parameters of a 1D chaotic logistic map, a parameter controlling the size of the Sudoku matrix, and the desired number of scrambling iterations. The possible security key space is extremely large. The principles of the presented scheme could be applied to provide security for a variety of systems, including image, audio and video systems.
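
    A simplified sketch of the Sudoku-based idea: a reference solved 9x9 grid is tiled over the image, pixel intensities are shifted by the tiled values modulo 256, and a permutation derived from the tiled grid shuffles pixel positions. The chaotic generation of the reference matrix and the paper's exact mapping rule are not reproduced; the multiplier 29 and the fixed grid below are illustrative choices.

```python
# Sudoku-driven intensity shift and position shuffle (simplified, invertible).
import numpy as np

SUDOKU = np.array([[5,3,4,6,7,8,9,1,2],
                   [6,7,2,1,9,5,3,4,8],
                   [1,9,8,3,4,2,5,6,7],
                   [8,5,9,7,6,1,4,2,3],
                   [4,2,6,8,5,3,7,9,1],
                   [7,1,3,9,2,4,8,5,6],
                   [9,6,1,5,3,7,2,8,4],
                   [2,8,7,4,1,9,6,3,5],
                   [3,4,5,2,8,6,1,7,9]], dtype=np.int16)   # a valid solved grid

def tiled(shape):
    reps = (-(-shape[0] // 9), -(-shape[1] // 9))           # ceil division
    return np.tile(SUDOKU, reps)[:shape[0], :shape[1]]

def sudoku_encrypt(img):
    ref = tiled(img.shape)
    shifted = (img.astype(np.int16) + 29 * ref) % 256        # intensity change
    perm = np.argsort(ref.ravel(), kind="stable")            # position mapping
    return shifted.ravel()[perm].reshape(img.shape).astype(np.uint8), perm

def sudoku_decrypt(cipher, perm):
    ref = tiled(cipher.shape)
    flat = np.empty(cipher.size, dtype=np.uint8)
    flat[perm] = cipher.ravel()                               # undo the shuffle
    return ((flat.reshape(cipher.shape).astype(np.int16) - 29 * ref) % 256).astype(np.uint8)

img = np.random.randint(0, 256, (18, 27), dtype=np.uint8)
c, perm = sudoku_encrypt(img)
assert np.array_equal(sudoku_decrypt(c, perm), img)
```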

  2. Hybrid Encryption Algorithms in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ping Guo

    2013-01-01

    Full Text Available The security issues of user privacy and data have become one of the most important factors in cloud computing. In this paper we focus on data encryption and study how to improve the security of data in the cloud through data encryption. Combining the features of traditional encryption algorithms and the characteristics of cloud platforms, we design three hybrid encryption algorithms: 3DES-AES, TDAES and TDAESalt. Experiment results show that the designed algorithms have strong encryption strength and can generate ciphertexts quickly on a cloud platform.

  3. Modulo image encryption with fractal keys

    Science.gov (United States)

    Rozouvan, Valerij

    2009-01-01

    An encryption technique based on the modulo operation is proposed. The technique is a one-to-one encryption-decryption, single-key algorithm. Fractal images are proposed as a source of randomness to generate strong keys. The use of the proposed method is verified both in single-image encryption-decryption and in a real-time streaming application. The described algorithm provides a mechanism for controlling the strength of the keys. The advantages of the proposed method are discussed. A video setup was built and GUI software implemented to test the method in practice. Numerical results of the test are provided and analyzed. Overall, the theoretically anticipated results are achieved, and the algorithm proves suitable for real-world cryptographic applications.
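
    The modulo operation at the heart of the scheme can be shown in its simplest form: a key image of bytes (derived from a fractal image in the paper, here a random stand-in) is added to the plain image modulo 256 to encrypt and subtracted to decrypt.

```python
# Modulo-addition cipher with an image-sized key.
import numpy as np

def mod_encrypt(img: np.ndarray, key_img: np.ndarray) -> np.ndarray:
    return ((img.astype(np.int16) + key_img) % 256).astype(np.uint8)

def mod_decrypt(cipher: np.ndarray, key_img: np.ndarray) -> np.ndarray:
    return ((cipher.astype(np.int16) - key_img) % 256).astype(np.uint8)

img = np.random.randint(0, 256, (32, 32), dtype=np.uint8)
key_img = np.random.randint(0, 256, (32, 32), dtype=np.int16)  # stand-in key
assert np.array_equal(mod_decrypt(mod_encrypt(img, key_img), key_img), img)
```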

  4. Scan image compression-encryption hardware system

    Science.gov (United States)

    Bourbakis, Nikolaos G.; Brause, R.; Alexopoulos, C.

    1995-04-01

    This paper deals with the hardware design of an image compression/encryption scheme called SCAN. The scheme is based on the principles and ideas reflected in the specification of the SCAN language. SCAN is a fractal-based context-free language which accesses the data of a 2D array sequentially, describing and generating a wide range (near n x n) of space-filling curves (or SCAN patterns) from a short set of simple ones. The SCAN method uses the algorithmic description of each 2D image as combinations of SCAN patterns for the compression and encryption of the image data. Note that each SCAN letter or word accesses the image data in a different order (or sequence), so applying a variety of SCAN words with the compression scheme produces various compressed versions of the same image. The compressed versions are compared in memory size, and the one with the smallest size in bits can be used for the image compression/encryption. The encryption of the image data results from the great number of possible space-filling curves that can be generated by SCAN. Since the software implementation of the SCAN compression/encryption scheme requires some time, a hardware design and implementation of the SCAN scheme is necessary to reduce the compression/encryption time to real time. The development of such an image compression/encryption system will have a significant impact on the transmission and storage of images, and will be applicable in multimedia and in transmitting images over communication lines.

  5. Analyzing the Superlative Symmetric Cryptographic Encryption Algorithm

    Directory of Open Access Journals (Sweden)

    panchamukesh chandaka

    2011-08-01

    Full Text Available Cryptology is a science that deals with codes and passwords. Cryptology is divided into cryptography and cryptanalysis: cryptography produces methods to protect data, and cryptanalysis attempts to break the protected data. Cryptography provides solutions for four security areas - confidentiality, authentication, integrity and control of the interaction between the parties involved in data exchange - which together lead to the security of information. Encryption algorithms play a key role in information security systems. This paper provides a critical analysis of six of the most common encryption algorithms, namely DES, 3DES, RC2, Blowfish, AES (Rijndael) and RC6. A comparative study has been carried out for these six encryption algorithms in terms of encryption key size, block size, number of rounds, encryption/decryption time, CPU processing time, CPU clock cycles (in the form of throughput) and power consumption. These comparisons are used to determine the best symmetric cryptography encryption algorithm.

  6. A Summarization on Image Encryption

    OpenAIRE

    Zhou Shihua; Zhang Qiang; Wei Xiaopeng; Zhou Changjun

    2010-01-01

    With the fast development of computer technology and information processing technology, the problem of information security is becoming more and more important. Information hiding is usually used to protect important information from disclosure when it is transmitted over an insecure channel. Digital image encryption is one of the most important methods of image information hiding and camouflage. The image encryption techniques mainly include compression methodology, modern cryptogra...

  7. Experimental optical encryption system based on a single-lens imaging architecture combined with a phase retrieval algorithm

    Science.gov (United States)

    Mosso, F.; Bolognini, N.; Pérez, D. G.

    2015-06-01

    We propose, and experimentally demonstrate, a single-lens imaging system as a compact encoding architecture using a hybrid protocol for data processing. In the encryption process, coherent light illuminates a random phase mask attached to an input image (the data); the outgoing complex field then propagates until it reaches a second random phase mask next to a lens, and the encrypted data is obtained at an output plane after the lens. We demonstrate the feasibility of this proposal and highlight the advantages of using a three-dimensional speckle as a secure random carrier instead of the standard ciphertext recording of holographic-based encryption techniques. Moreover, we show the benefits of the compact system compared with conventional encrypting architectures in terms of energy loss and tolerance against classical attacks applicable to any linear cryptosystem. Experimental results validate our approach.

  8. Remote-sensing image encryption in hybrid domains

    Science.gov (United States)

    Zhang, Xiaoqiang; Zhu, Guiliang; Ma, Shilong

    2012-04-01

    Remote-sensing technology plays an important role in military and industrial fields. Remote-sensing images are the main means of acquiring information from satellites, and they often contain confidential information. To securely transmit and store remote-sensing images, we propose a new image encryption algorithm in hybrid domains. This algorithm makes full use of the advantages of image encryption in both the spatial domain and the transform domain. First, the low-pass subband coefficients of the image's DWT (discrete wavelet transform) decomposition are sorted by a PWLCM system in the transform domain. Second, the image after IDWT (inverse discrete wavelet transform) reconstruction is diffused with a 2D (two-dimensional) Logistic map and an XOR operation in the spatial domain. The experimental results and algorithm analyses show that the new algorithm possesses a large key space, can resist brute-force, statistical and differential attacks, and offers the encryption efficiency required in practice.
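
    The transform-domain step can be sketched with a piecewise linear chaotic map (PWLCM) whose sort order permutes a coefficient block, as below. The DWT/IDWT itself and the spatial-domain 2D-Logistic XOR stage are omitted, and the parameter values are illustrative assumptions.

```python
# PWLCM-driven sorting permutation of a coefficient block.
import numpy as np

def pwlcm_step(x: float, p: float) -> float:
    x = x if x < 0.5 else 1.0 - x            # map is symmetric about 0.5
    return x / p if x < p else (x - p) / (0.5 - p)

def pwlcm_sequence(x0: float, p: float, n: int) -> np.ndarray:
    seq, x = np.empty(n), x0
    for i in range(n):
        x = pwlcm_step(x, p)
        seq[i] = x
    return seq

def permute_coefficients(coeffs: np.ndarray, x0=0.37, p=0.29) -> np.ndarray:
    order = np.argsort(pwlcm_sequence(x0, p, coeffs.size))
    return coeffs.ravel()[order].reshape(coeffs.shape)

band = np.random.randn(8, 8)                 # stand-in for a DWT low-pass band
print(permute_coefficients(band)[:2])
```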

  9. A novel image encryption scheme based on spatial chaos map

    International Nuclear Information System (INIS)

    In recent years, chaos-based cryptographic algorithms have suggested some new and efficient ways to develop secure image encryption techniques, but the drawbacks of small key space and weak security in one-dimensional chaotic cryptosystems are obvious. In this paper, a spatial chaos system is used for high-security image encryption while its speed remains acceptable. The proposed algorithm is described in detail. The basic idea is to encrypt the image in space with a spatial chaos map pixel by pixel, and then to confuse the pixels in multiple spatial directions. Using this method for one cycle, the image becomes indistinguishable in space due to the inherent properties of spatial chaotic systems. Several experimental results, key sensitivity tests, key space analysis and statistical analysis show that the approach provides an efficient and secure way for real-time image encryption and transmission from the cryptographic viewpoint.

  10. Studying the Effects of Most Common Encryption Algorithms

    Directory of Open Access Journals (Sweden)

    Diaa Salama

    2011-01-01

    Full Text Available Wireless networks play critical roles at work, at home and in public places, so the need to protect such networks has increased. Encryption algorithms play vital roles in information system security, but they consume a significant amount of computing resources such as CPU time, memory and battery power. CPU and memory capabilities are increasing at suitable rates, but battery technology is improving at a slower rate, and this slower improvement forms a "battery gap". Designing efficient secure protocols for wireless devices from the viewpoint of battery consumption requires understanding how encryption techniques affect battery power consumption with and without data transmission. This paper studies the effects of six of the most common symmetric encryption algorithms on power consumption for wireless devices at different settings for each algorithm. These settings include different sizes of data blocks, different data types (text, images and audio files), battery power consumption, different key sizes, different cases of data transmission, the effect of varying signal-to-noise ratio, and finally encryption/decryption speed. The experimental results show the superiority of two encryption algorithms over the others in terms of power consumption, processing time and throughput. These results can aid the design of new security protocols where energy efficiency is the main focus. Some suggestions for the design of secure communications systems that handle the varying wireless environment are provided to reduce the energy consumption of security protocols.

  11. How Good Is The DES Algorithm In Image Ciphering?

    Directory of Open Access Journals (Sweden)

    Said F. El-Zoghdy

    2011-03-01

    Full Text Available Digital image and video encryption plays an important role in today's multimedia world. Many encryption schemes have been proposed to provide security for digital images. Usually, symmetric key ciphering algorithms are used for encrypting digital images because they are fast and use block- and stream-cipher techniques. The Data Encryption Standard (DES) is a symmetric key encryption algorithm. In spite of the successful cracking of the Data Encryption Standard by massive brute-force attacks, the DES algorithm is an entrenched technology and still useful for many purposes. In this paper, we use several image encryption quality measures to study the effect of the DES algorithm on image ciphering. The results show that the DES algorithm is fast and achieves a good encryption rate for image ciphering under different modes of operation.

  12. Color image encryption based on Coupled Nonlinear Chaotic Map

    International Nuclear Information System (INIS)

    Image encryption is somewhat different from text encryption due to some inherent features of images, such as bulk data capacity and high correlation among pixels, which are generally difficult to handle by conventional methods. The desirable cryptographic properties of chaotic maps, such as sensitivity to initial conditions and random-like behavior, have attracted the attention of cryptographers to develop new encryption algorithms. Therefore, recent research on image encryption algorithms has been increasingly based on chaotic systems, though the drawbacks of small key space and weak security in one-dimensional chaotic cryptosystems are obvious. This paper proposes a Coupled Nonlinear Chaotic Map, called CNCM, and a novel chaos-based image encryption algorithm to encrypt color images using CNCM. The chaotic cryptography technique used in this paper is symmetric key cryptography with a stream cipher structure. To increase the security of the proposed algorithm, a 240-bit-long secret key is used to generate the initial conditions and parameters of the chaotic map by applying some algebraic transformations to the key. These transformations, as well as the nonlinearity and coupling structure of the CNCM, enhance the cryptosystem security. To obtain higher security and higher complexity, the paper incorporates the image size and color components into the cryptosystem, thereby significantly increasing the resistance to known/chosen-plaintext attacks. The results of several experiments, statistical analyses and key sensitivity tests show that the proposed image encryption scheme provides an efficient and secure way for real-time image encryption and transmission.

  13. A New Quaternion-Based Encryption Method for DICOM Images.

    Science.gov (United States)

    Dzwonkowski, Mariusz; Papaj, Michal; Rykaczewski, Roman

    2015-11-01

    In this paper, a new quaternion-based lossless encryption technique for Digital Imaging and Communications in Medicine (DICOM) images is proposed. We have scrutinized and slightly modified the concept of the DICOM network to point out the best location for the proposed encryption scheme, which significantly improves the speed of DICOM image encryption in comparison with the advanced encryption standard and triple data encryption standard algorithms originally embedded in DICOM. The proposed algorithm decomposes a DICOM image into two 8-bit gray-tone images in order to perform encryption. The algorithm implements a Feistel network like the scheme proposed by Sastry and Kumar. It uses special properties of quaternions to perform rotations of data sequences in 3D space for each of the cipher rounds. The images are written as Lipschitz quaternions, and modular arithmetic is implemented for operations with the quaternions. A computer-based analysis has been carried out, and the obtained results are shown at the end of this paper. PMID:26276993

  14. Algoritmi selektivnog šifrovanja - pregled sa ocenom performansi / Selective encryption algorithms: Overview with performance evaluation

    Directory of Open Access Journals (Sweden)

    Boriša Ž. Jovanović

    2010-10-01

    Full Text Available Digital multimedia content is becoming more widespread and is increasingly exchanged over computer networks and public channels (satellite communications, wireless networks, the Internet, etc.), which are insecure media for transmitting sensitive information. Mechanisms for the cryptographic protection of image and video content are therefore gaining importance. Traditional cryptographic processing in systems that transmit this kind of data guarantees a high level of security, but has its drawbacks: a high implementation cost and considerable transmission delay. These shortcomings can be overcome by applying selective encryption algorithms. Introduction: In traditional image and video content protection schemes, called fully layered, the whole content is first compressed; the compressed bitstream is then entirely encrypted using a standard cipher (DES - Data Encryption Standard, IDEA - International Data Encryption Algorithm, AES - Advanced Encryption Standard, etc.). The specific characteristics of this kind of data, a high transmission rate with limited bandwidth, make standard encryption algorithms inadequate. Another limitation of traditional systems is that they alter the whole bitstream syntax, which may disable some codec functionalities on the coder at the delivery site and the decoder at the receiving site. Selective encryption is a new trend in image and video content protection: as its name says, it consists of encrypting only a subset of the data, with the aim of reducing the amount of data to encrypt while preserving a sufficient level of security. Theoretical foundation of selective encryption: The first theoretical foundation of selective encryption was given indirectly by Claude Elwood Shannon in his work on the communication theory of secrecy systems. It is well known that the statistics of image and video data differ greatly from those of classical text data; indeed, image and video data are strongly correlated and have strong spatial/temporal redundancy. Evaluation criteria: A set of evaluation criteria is needed to evaluate and compare selective encryption algorithms: tunability, visual degradation, cryptographic security, encryption ratio, compression friendliness, format compliance and error tolerance. Classification of selective encryption algorithms: One possible classification of selective encryption algorithms is relative to when encryption is performed with respect to compression; this classification is adequate since it has intrinsic consequences on the behaviour of selective encryption algorithms. Three classes of algorithms are considered: precompression, incompression and postcompression. Overview of selective encryption algorithms: In accordance with this classification, selective encryption algorithms are compared, briefly described with their advantages and disadvantages, and their quality is assessed.
    Applications: Selective encryption mechanisms are becoming more and more important and can be applied in many different areas. Some potential application areas are: monitoring encrypted content; PDAs (Personal Digital Assistants), mobile phones and other mobile terminals; multiple encryptions; and transcodability/scalability of encrypted content. Conclusion: The foregoing analysis shows that tunability, cryptographic security and error tolerance are the main unsatisfied criteria.

  15. PERFORMANCE ANALYSIS OF IMAGE SECURITY BASED ON ENCRYPTED HYBRID COMPRESSION

    Directory of Open Access Journals (Sweden)

    D. Ramkumar

    2014-01-01

    Full Text Available In this research, we propose an image security scheme using hybrid compression techniques. In this scheme, the data is given two-fold security by both an encryption stage and a hiding stage. The data/message which has to be secured undergoes an encryption technique in the initial stage. In this stage, a permutation algorithm is employed which requires a pair of numbers as a key to permute the original message. Following the encryption stage, the deformed message is then embedded into a JPEG image by considering the low and high quantization tables. The main motivation behind this research work is to provide image security through compression. The final result is an encrypted and compressed JPEG image with a different image quality. The receiver has to perform the reverse process to extract the original data/information. The performance analysis is performed in terms of PSNR for different quantization tables.
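    As an illustration of a permutation keyed by a pair of numbers (the abstract does not specify the permutation, so this affine index map is only an assumed stand-in), the following minimal Python sketch permutes and restores a byte message.

        from math import gcd

        def permute(message: bytes, a: int, b: int) -> bytes:
            """Permute byte positions with the affine map i -> (a*i + b) mod N, keyed by the pair (a, b)."""
            n = len(message)
            if gcd(a, n) != 1:
                raise ValueError("a must be coprime with the message length for the map to be a permutation")
            out = bytearray(n)
            for i in range(n):
                out[(a * i + b) % n] = message[i]
            return bytes(out)

        def unpermute(cipher: bytes, a: int, b: int) -> bytes:
            n = len(cipher)
            a_inv = pow(a, -1, n)                      # modular inverse of a
            out = bytearray(n)
            for j in range(n):
                out[(a_inv * (j - b)) % n] = cipher[j]
            return bytes(out)

        msg = b"message to protect before embedding"
        assert unpermute(permute(msg, a=9, b=11), a=9, b=11) == msg

    The permuted message would then be embedded into the JPEG cover image, a step not reproduced here.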

  16. A fractal-based image encryption system

    KAUST Repository

    Abd-El-Hafiz, S. K.

    2014-12-01

    This study introduces a novel image encryption system based on diffusion and confusion processes in which the image information is hidden inside the complex details of fractal images. A simplified encryption technique is, first, presented using a single-fractal image and statistical analysis is performed. A general encryption system utilising multiple fractal images is, then, introduced to improve the performance and increase the encryption key up to hundreds of bits. This improvement is achieved through several parameters: feedback delay, multiplexing and independent horizontal or vertical shifts. The effect of each parameter is studied separately and, then, they are combined to illustrate their influence on the encryption quality. The encryption quality is evaluated using different analysis techniques such as correlation coefficients, differential attack measures, histogram distributions, key sensitivity analysis and the National Institute of Standards and Technology (NIST) statistical test suite. The obtained results show great potential compared to other techniques.

  17. A New Color Image Encryption Scheme Using CML and a Fractional-Order Chaotic System

    Science.gov (United States)

    Wu, Xiangjun; Li, Yang; Kurths, Jürgen

    2015-01-01

    The chaos-based image cryptosystems have been widely investigated in recent years to provide real-time encryption and transmission. In this paper, a novel color image encryption algorithm by using coupled-map lattices (CML) and a fractional-order chaotic system is proposed to enhance the security and robustness of the encryption algorithms with a permutation-diffusion structure. To make the encryption procedure more confusing and complex, an image division-shuffling process is put forward, where the plain-image is first divided into four sub-images, and then the position of the pixels in the whole image is shuffled. In order to generate initial conditions and parameters of two chaotic systems, a 280-bit long external secret key is employed. The key space analysis, various statistical analysis, information entropy analysis, differential analysis and key sensitivity analysis are introduced to test the security of the new image encryption algorithm. The cryptosystem speed is analyzed and tested as well. Experimental results confirm that, in comparison to other image encryption schemes, the new algorithm has higher security and is fast for practical image encryption. Moreover, an extensive tolerance analysis of some common image processing operations such as noise adding, cropping, JPEG compression, rotation, brightening and darkening, has been performed on the proposed image encryption technique. Corresponding results reveal that the proposed image encryption method has good robustness against some image processing operations and geometric attacks. PMID:25826602

  18. A new encryption and signing algorithm

    OpenAIRE

    Urszula Romañczuk

    2008-01-01

    In this paper we describe a new method of encryption that originates from public key cryptography and number theory. Our algorithm was inspired by the RSA algorithm and the Diffie-Hellman key exchange protocol. It is based on a computationally difficult problem - the discrete logarithm problem in a multiplicative group.

  19. Hardware Realization of Chaos Based Symmetric Image Encryption

    KAUST Repository

    Barakat, Mohamed L.

    2012-06-01

    This thesis presents a novel work on the hardware realization of symmetric image encryption utilizing chaos-based continuous systems as pseudo random number generators. Digital implementation of chaotic systems results in serious degradations in the dynamics of the system. Such defects are mitigated through a new technique of generalized post-processing with very low hardware cost. The thesis further discusses two encryption algorithms designed and implemented as a block cipher and a stream cipher. The security of both systems is thoroughly analyzed and the performance is compared with other reported systems, showing superior results. Both systems are realized on a Xilinx Virtex-4 FPGA with hardware and throughput performance surpassing known encryption systems.

  20. A Novel Image Encryption Algorithm Based on the Two-Dimensional Logistic Map and the Latin Square Image Cipher

    Science.gov (United States)

    Machkour, M.; Saaidi, A.; Benmaati, M. L.

    2015-12-01

    In this paper, we introduce a new hybrid system consisting of a permutation-substitution network based on two different encryption techniques: chaotic systems and the Latin square. The combination of the two techniques provides good confusion and diffusion properties, as well as robustness to noise introduced during decryption. The security analysis shows that the system is secure enough to resist brute-force attacks, differential attacks, chosen-plaintext attacks, known-plaintext attacks and statistical attacks. This robustness is therefore proven and justified.

  1. Cryptanalysis of "an improvement over an image encryption method based on total shuffling"

    Science.gov (United States)

    Akhavan, A.; Samsudin, A.; Akhshani, A.

    2015-09-01

    In the past two decades, several image encryption algorithms based on chaotic systems have been proposed. Many of the proposed algorithms are meant to improve other chaos-based and conventional cryptographic algorithms. However, many of the proposed improvement methods suffer from serious security problems. In this paper, the security of a recently proposed improvement method for a chaos-based image encryption algorithm is analyzed. The results indicate the weakness of the analyzed algorithm against chosen plaintext attacks.

  2. Image Encryption Based On Diffusion And Multiple Chaotic Maps

    Directory of Open Access Journals (Sweden)

    G.A.Sathishkumar

    2011-03-01

    Full Text Available In the recent world, security is a prime important issue, and encryption is one of the best alternative ways to ensure security. Moreover, many image encryption schemes have been proposed, each of which has its own strengths and weaknesses. This paper presents a new algorithm for an image encryption/decryption scheme and is devoted to providing a secured image encryption technique using multiple chaotic based circular mapping. In this paper, first, a pair of sub keys is given by using chaotic logistic maps. Second, the image is encrypted using the logistic map sub key, and its transformation leads to a diffusion process. Third, sub keys are generated by four different chaotic maps. Based on the initial conditions, each map may produce various random numbers from various orbits of the maps. Among those random numbers, a particular number from a particular orbit is selected as a key for the encryption algorithm. Based on the key, a binary sequence is generated to control the encryption algorithm. The 2-D input image is transformed into a 1-D array by using two different scanning patterns (raster and zigzag) and then divided into various sub blocks. Then position permutation and value permutation are applied to each binary matrix based on multiple chaos maps. Finally, the receiver uses the same sub keys to decrypt the encrypted images. The salient features of the proposed image encryption method are losslessness, good peak signal-to-noise ratio (PSNR), symmetric key encryption, low cross correlation, a very large number of secret keys, and key-dependent pixel value replacement.
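    As a rough illustration of two of the steps described above (not the authors' implementation), the sketch below generates a logistic-map key stream and performs a zig-zag 2-D to 1-D scan; the parameter values, image size and XOR-based diffusion are illustrative assumptions.

        import numpy as np

        def logistic_keystream(x0, r, n, burn_in=100):
            """Iterate the logistic map x -> r*x*(1-x) and quantise each state to a byte."""
            x = x0
            for _ in range(burn_in):          # discard transient iterations
                x = r * x * (1.0 - x)
            out = np.empty(n, dtype=np.uint8)
            for i in range(n):
                x = r * x * (1.0 - x)
                out[i] = int(x * 256) % 256   # map the state in (0,1) to 0..255
            return out

        def zigzag_indices(rows, cols):
            """Return the pixel order produced by a zig-zag (anti-diagonal) scan."""
            order = []
            for s in range(rows + cols - 1):
                diag = [(i, s - i) for i in range(rows) if 0 <= s - i < cols]
                order.extend(diag if s % 2 == 0 else diag[::-1])
            return order

        img = np.random.randint(0, 256, (4, 4), dtype=np.uint8)            # stand-in plain image
        order = zigzag_indices(*img.shape)
        scanned = np.array([img[i, j] for i, j in order], dtype=np.uint8)  # 2-D -> 1-D scan
        cipher = scanned ^ logistic_keystream(x0=0.3571, r=3.9999, n=scanned.size)  # diffusion by XOR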

  3. Design of AES Pipelined Architecture for Image Encryption/Decryption Module

    Directory of Open Access Journals (Sweden)

    Pravin V. Kinge

    2014-07-01

    Full Text Available The relentless growth of Internet and communication technologies has made the extensive use of images unavoidable. The specific characteristics of images, such as a high transmission rate with limited bandwidth, redundancy, bulk capacity and correlation among pixels, make standard algorithms unsuitable for image encryption. In order to overcome these limitations for real time applications, the design of new algorithms that require less computational power while preserving a sufficient level of security has always been a subject of interest. Here the Advanced Encryption Standard (AES), the most widely used encryption algorithm in many security applications, is considered. The AES standard has different key size variants, where longer keys provide more secure ciphered output. The available AES algorithm is used for data, and it is also suitable for image encryption and decryption to protect a confidential image from unauthorized access. This project proposes a method in which the image data is fed to the pipelined AES algorithm through Textio to obtain the encrypted image, and the encrypted image is the input to the pipelined AES decryption to recover the original image. The project implements the 128-, 192- and 256-bit pipelined AES algorithm for image encryption and decryption, and also compares the latency, efficiency, security, frequency and throughput. The proposed work is synthesized and simulated with Xilinx ISE 13.2 and the ModelSim tool, respectively, in the Very High Speed Integrated Circuit Hardware Description Language (VHDL).

  4. OCML-based colour image encryption

    International Nuclear Information System (INIS)

    The chaos-based cryptographic algorithms have suggested some new ways to develop efficient image-encryption schemes. While most of these schemes are based on low-dimensional chaotic maps, it has been proposed recently to use high-dimensional chaos namely spatiotemporal chaos, which is modelled by one-way coupled-map lattices (OCML). Owing to their hyperchaotic behaviour, such systems are assumed to enhance the cryptosystem security. In this paper, we propose an OCML-based colour image encryption scheme with a stream cipher structure. We use a 192-bit-long external key to generate the initial conditions and the parameters of the OCML. We have made several tests to check the security of the proposed cryptosystem namely, statistical tests including histogram analysis, calculus of the correlation coefficients of adjacent pixels, security test against differential attack including calculus of the number of pixel change rate (NPCR) and unified average changing intensity (UACI), and entropy calculus. The cryptosystem speed is analyzed and tested as well.
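    A minimal, hypothetical sketch of the kind of one-way coupled-map-lattice keystream the abstract refers to, used here in stream-cipher fashion on a colour image; the lattice size, coupling strength, quantisation and seed values are illustrative assumptions, not the paper's parameters.

        import numpy as np

        def ocml_keystream(seed_states, eps, r, n_bytes):
            """One-way coupled logistic lattice: each site is driven by its left neighbour."""
            x = np.array(seed_states, dtype=np.float64)        # lattice state, derived from the external key
            f = lambda v: r * v * (1.0 - v)                    # local logistic map
            out = []
            while len(out) < n_bytes:
                fx = f(x)
                x = (1.0 - eps) * fx + eps * np.roll(fx, 1)    # one-way coupling from the previous site
                out.extend((np.floor(x * 1e6) % 256).astype(np.uint8))  # quantise each site to a byte
            return np.array(out[:n_bytes], dtype=np.uint8)

        # Stream-cipher style use on a flattened RGB image (all values are placeholders).
        rgb = np.random.randint(0, 256, (8, 8, 3), dtype=np.uint8)
        ks = ocml_keystream(seed_states=[0.12, 0.45, 0.78, 0.33], eps=0.9, r=3.99, n_bytes=rgb.size)
        cipher = (rgb.reshape(-1) ^ ks).reshape(rgb.shape)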

  5. Watermarking patient data in encrypted medical images

    Indian Academy of Sciences (India)

    A Lavanya; V Natarajan

    2012-12-01

    In this paper, we propose a method for watermarking medical images for data integrity which consists of image encryption, data embedding and image-recovery phases. The embedded data can be completely recovered from the watermarked image after the watermark has been extracted. In the proposed method, we utilize a standard stream cipher for image encryption and select a non-region-of-interest tile to embed the patient data. We show that the lower bound of the PSNR (peak signal-to-noise ratio) values for medical images is about 48 dB. Experimental results demonstrate that the proposed scheme can embed a large amount of data while keeping the high visual quality of the test images.

  6. Encryption Quality Analysis and Security Evaluation of CAST-128 Algorithm and its Modified Version using Digital Images

    OpenAIRE

    Krishnamurthy G N; Dr. V Ramaswamy

    2009-01-01

    This paper demonstrates an analysis of the well-known block cipher CAST-128 and its modified version using the avalanche criterion and other tests, namely encryption quality, correlation coefficient, histogram analysis and key sensitivity tests.

  7. Image encryption method using a class of fractals

    Science.gov (United States)

    Alexopoulos, C.; Bourbakis, Nikolaos G.; Ioannou, N.

    1995-07-01

    We present a cryptographic scheme for encrypting 2D gray scale images by using a large family of fractals. This scheme is based on a transposition of the image elements implemented by a generator of 2D hierarchical scanning patterns producing a large subset of the (n²)! possible orders defined on a 2D image of n × n elements. Each pattern defines a distinct order of pixels and can be described by an expression, which is considered as the key of the transposition. This transposition cipher can easily be combined with various substitution ciphers, producing efficient product ciphers operating on pictorial data. Two such ciphers are constructed and their effects on real gray value images are shown. Encryption and decryption algorithms are derived from a parallel algorithm implementing the creation of the family of scanning patterns.
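    The transposition idea can be illustrated with the short sketch below: a scanning pattern is represented as a permutation of pixel indices and applied to the flattened image. A pseudo-random permutation stands in for the hierarchical scanning-pattern generator described in the record; all names and sizes are illustrative.

        import numpy as np

        def transpose_image(img, perm):
            """Transposition cipher: re-order the flattened pixels according to perm."""
            flat = img.reshape(-1)
            return flat[perm].reshape(img.shape)

        def invert(perm):
            """Inverse permutation, used for decryption."""
            inv = np.empty_like(perm)
            inv[perm] = np.arange(perm.size)
            return inv

        img = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
        # Stand-in for a scanning-pattern key: any permutation of the n*n pixel positions.
        perm = np.random.default_rng(seed=2024).permutation(img.size)
        cipher = transpose_image(img, perm)
        plain = transpose_image(cipher, invert(perm))
        assert np.array_equal(plain, img)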

  8. Comparative Study of Speech Encryption Algorithms Using Mobile Applications

    Directory of Open Access Journals (Sweden)

    Jaspreet Kaur, Er. Kanwal Preet Singh

    2013-07-01

    Full Text Available Speech communication is very popular nowadays, and everyone wants secure communication, which is why encryption and decryption schemes are used. They are mainly used for military and business purposes, where people want a high security level during their communication. A number of algorithms are used for speech encryption and decryption. In this paper the work is done on three different algorithms, NTRU, RSA and RIJNDAEL, which are popular approaches for speech encryption and decryption. NTRU and RSA are asymmetric in nature, while the RIJNDAEL algorithm is symmetric. In speech encryption, the speech is first converted into text, and the text is then converted into cipher text. The cipher text is sent to the particular receiver with whom the transmitter wants to communicate. At the receiver end, the receiver recovers the original data through the decryption process. Finally, the performance of these three approaches is analyzed. The parameters calculated are encryption, decryption and delay time, complexity, packet loss and security level. In these three approaches, encryption, decryption and delay time vary according to the number of bits per second. On the other hand, complexity and packet loss are approximately the same, and there is no packet loss during transmission and reception of the data. After the analysis of these three algorithms, the NTRU algorithm is faster in encryption and decryption time than the other algorithms, and its security level is higher. The Android platform is used for these three algorithms to find which algorithm takes less time to encrypt or decrypt the data and to help evaluate the performance of the speech encryption algorithms.

  9. LOW POWER ENCRYPTED MIPS PROCESSOR BASED ON AES ALGORITHM

    OpenAIRE

    Shivani Parmar; Kirat Pal Singh

    2012-01-01

    The paper describes the Low power 32-bit encrypted MIPS processor based on AES algorithm and MIPS pipeline architecture. The pipeline stages of MIPS processor are arranged in such a way that pipeline can be clocked at high frequency and clock gating technique is used for reducing power consumption. Encryption blocks of Advanced Encryption Standard (AES) cryptosystem and dependency among pipeline stages are explained in detail with the help of block diagram. In order to reduce the power consum...

  10. A RESEARCH PAPER ON A SECURE IMAGE ENCRYPTION - THEN COMPRESSION SYSTEM USING WAVELET VIA PREDICTION ERROR CLUSTERING AND RANDOM PERMUTATION

    Directory of Open Access Journals (Sweden)

    Er. Maninder Kaur

    2015-09-01

    Full Text Available Images can be encrypted in many ways, and several techniques have used different encryption methods. In this research, we apply a new modified International Data Encryption Algorithm to encrypt the full image in an efficient, secure manner; after encryption the original file is compressed and we get a compressed image. This paper introduces a highly efficient image encryption-then-compression (ETC) system using wavelets. The proposed image encryption scheme operated in the prediction error domain is shown to be able to provide a reasonably high level of security. We also demonstrate that an arithmetic coding-based approach can be exploited to efficiently compress the encrypted images. More notably, the proposed compression approach applied to encrypted images is only slightly worse, in terms of compression efficiency, than the state-of-the-art lossless/lossy image coders, which take original, unencrypted images as inputs. In contrast, most of the existing ETC solutions induce a significant penalty on compression efficiency. For the implementation of this proposed work we use the Image Processing Toolbox under Matlab software.

  11. Developing and Evaluation of New Hybrid Encryption Algorithms

    Directory of Open Access Journals (Sweden)

    Diaa Salama AbdElminaam

    2014-03-01

    Full Text Available Wireless sensor networks consist of hundreds or thousands of low cost, low power and self-organizing nodes which are highly distributed. As wireless sensor networks continue to grow, so does the need for effective security mechanisms, because sensor networks may interact with sensitive data. Encryption algorithms play important roles in information security systems (ISS). Those algorithms consume a significant amount of computing resources such as battery power, and wireless sensor networks are powered by a battery, which is a very limited resource. At present, various types of cryptographic algorithms provide high security to information on networks, but they also have some drawbacks. The present asymmetric and symmetric encryption methods can offer security levels but with many limitations. For instance, key maintenance is a great problem faced in symmetric encryption methods, while a lower security level is the problem of asymmetric encryption methods, even though key maintenance is easy. To improve the strength of these algorithms, we propose a new hybrid cryptographic algorithm in this paper. The algorithm is designed using a combination of two symmetric cryptographic techniques and two asymmetric cryptographic techniques. This protocol provides three cryptographic primitives: integrity, confidentiality and authentication. It is a hybrid encryption method where elliptic curve cryptography (ECC) and the Advanced Encryption Standard (AES) are combined to provide node encryption, the RSA algorithm and Blowfish are combined to provide authentication, and MD5 is used for integrity. The results show that the proposed hybrid cryptographic algorithm gives better performance in terms of computation time and the size of the cipher text. This paper tries to present a fair comparison between the new protocol and four different existing hybrid protocols according to power consumption. A comparison has been conducted for those protocols at different settings for each protocol, such as different sizes of data blocks, and finally encryption/decryption speed. Experimental results are given to demonstrate the effectiveness of each algorithm.

  12. Image compression and encryption scheme based on 2D compressive sensing and fractional Mellin transform

    Science.gov (United States)

    Zhou, Nanrun; Li, Haolin; Wang, Di; Pan, Shumin; Zhou, Zhihong

    2015-05-01

    Most of the existing image encryption techniques bear security risks for taking linear transforms or suffer encryption data expansion for adopting nonlinear transformation directly. To overcome these difficulties, a novel image compression-encryption scheme is proposed by combining 2D compressive sensing with the nonlinear fractional Mellin transform. In this scheme, the original image is measured by measurement matrices in two directions to achieve compression and encryption simultaneously, and then the resulting image is re-encrypted by the nonlinear fractional Mellin transform. The measurement matrices are controlled by a chaotic map. The Newton Smoothed l0 Norm (NSL0) algorithm is adopted to obtain the decrypted image. Simulation results verify the validity and the reliability of this scheme.
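    A simplified sketch of the two-direction measurement step, which compresses and encrypts at the same time, using chaos-controlled measurement matrices; the logistic-map construction, matrix sizes and scaling are assumptions for illustration, and the NSL0 reconstruction step is omitted.

        import numpy as np

        def logistic_matrix(x0, r, rows, cols):
            """Chaos-controlled measurement matrix: logistic-map samples reshaped and centred."""
            x, vals = x0, []
            for _ in range(rows * cols):
                x = r * x * (1.0 - x)
                vals.append(x)
            m = np.array(vals).reshape(rows, cols)
            return (m - 0.5) * 2.0 / np.sqrt(rows)        # roughly zero-mean, rescaled

        img = np.random.rand(64, 64)                      # stand-in image (normalised grey levels)
        m = 32                                            # number of measurements per direction
        Phi1 = logistic_matrix(0.23, 3.99, m, img.shape[0])
        Phi2 = logistic_matrix(0.67, 3.99, img.shape[1], m)
        measurement = Phi1 @ img @ Phi2                   # measuring both directions: 64x64 -> 32x32
        # Recovery would use a sparse-reconstruction solver (the paper uses NSL0), not shown here.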

  13. Overview on Selective Encryption of Image and Video: Challenges and Perspectives

    Directory of Open Access Journals (Sweden)

    Massoudi A

    2008-01-01

    Full Text Available In traditional image and video content protection schemes, called fully layered, the whole content is first compressed. Then, the compressed bitstream is entirely encrypted using a standard cipher (DES, AES, IDEA, etc.). The specific characteristics of this kind of data (a high transmission rate with limited bandwidth) make standard encryption algorithms inadequate. Another limitation of fully layered systems consists of altering the whole bitstream syntax, which may disable some codec functionalities. Selective encryption is a new trend in image and video content protection. It consists of encrypting only a subset of the data. The aim of selective encryption is to reduce the amount of data to encrypt while preserving a sufficient level of security. This computation saving is very desirable, especially in constrained communications (real-time networking, high-definition delivery, and mobile communications with limited computational power devices). In addition, selective encryption allows preserving some codec functionalities such as scalability. This tutorial is intended to give an overview of selective encryption algorithms. The theoretical background of selective encryption, potential applications, challenges, and perspectives are presented.

  14. A New Encryption Method for Secure Transmission of Images

    Directory of Open Access Journals (Sweden)

    B. V. Rama Devi

    2010-12-01

    Full Text Available In this paper, a novel approach is designed for transmitting images securely using a technique called Gödelization followed by public key encryption. The image which is to be transmitted is transformed into a sequence called a Gödel Number Sequence (GNS) using a new technique called Gödelization. This is compressed using alphabetic coding (AC) and encrypted by an encryption method. The encrypted string is transmitted and reconstructed at the decoding end by using the reverse process.

  15. An Image Encryption Method: SD-Advanced Image Encryption Standard: SD-AIES

    Directory of Open Access Journals (Sweden)

    Somdip Dey

    2015-05-01

    Full Text Available The security of digital information in modern times is one of the most important factors to keep in mind. For this reason, in this paper, the author has proposed a new standard method of image encryption. The proposed method consists of 4 different stages: (1) first, a number is generated from the password and each pixel of the image is converted to its equivalent eight-bit binary number, and in that eight-bit number, a number of bits equal to the length of the number generated from the password are rotated and reversed; (2) in the second stage, an extended Hill cipher technique is applied by using an involutory matrix, which is generated from the same password, to make it more secure; (3) in the third stage, a generalized modified Vernam cipher with a feedback mechanism is used on the file to create the next level of encryption; (4) finally, in the fourth stage, the whole image file is randomized a multiple number of times using the modified MSA randomization encryption technique, and the randomization depends on another number, which is generated from the password provided for the encryption method. SD-AIES is an upgraded version of the SD-AEI Image Encryption Technique. The proposed method, SD-AIES, is tested on different image files and the results were far more than satisfactory.

  16. Securing Image Transmission Using in- Compression Encryption Technique

    Directory of Open Access Journals (Sweden)

    Shaimaa A. El-said

    2010-12-01

    Full Text Available Multimedia is one of the most popular types of data shared on the Web, and its protection via encryption techniques is of vast interest. In this paper, a secure and computationally feasible algorithm called the Optimized Multiple Huffman Tables (OMHT) technique is proposed. OMHT depends on using a statistical model-based compression method to generate different tables from the same data type of images or videos to be encrypted, leading to increased compression efficiency and security of the used tables. A systematic study on how to strategically integrate different atomic operations to build a multimedia encryption system is presented. The resulting system can provide superior performance over other techniques by both its generic encryption and its simple adaptation to multimedia in terms of a joint consideration of security and bitrate overhead. The effectiveness and robustness of this scheme are verified by measuring its security strength and comparing its computational cost against other techniques. The proposed technique guarantees security and speed without a noticeable increase in encoded image size.

  17. On the Speed of Image Encryption with Chaotically Coupled Chaotic Maps

    OpenAIRE

    A Akhavan; Samsudin, A; A. Akhshani

    2012-01-01

    Pisarchik et al. presented an image encryption algorithm based on chaotically coupled chaotic maps. But G. Arroyo et al. found some flaws and security weaknesses in the proposed cryptosystem. In this letter, the speed of this chaotic cryptosystem is analyzed and the encryption speed claimed by the authors is argued. The maximum possible speed for the proposed algorithm is calculated analytically. Although it was claimed that, the chaotic cryptosystem is high-speed, our analysis shows that suc...

  18. Reversible Data Hiding in Encrypted Image and Separating the Image and Data

    Directory of Open Access Journals (Sweden)

    Naresh Achari B., Sri. Swami Naik J

    2013-05-01

    Full Text Available For many years, the protection of multimedia data has been becoming more and more important. Such multimedia data can be protected with encryption or data hiding algorithms. To decrease the transmission time, data compression is necessary. For a few years, a new problem has been to try to combine compression, encryption and data hiding in a single step. So far, few solutions have been proposed to combine image encryption and compression, for example. Nowadays, a new challenge consists of embedding data in encrypted images. Since the entropy of an encrypted image is maximal, the embedding step, considered like noise, is not possible using standard data hiding algorithms. A new idea is to apply reversible data hiding algorithms on encrypted images, with the aim of removing the embedded data before the image decryption. Recent reversible data hiding methods have been proposed with high capacity, but these methods are not applicable to encrypted images. In this paper we propose an analysis of the local variance of the marked encrypted images in order to remove the embedded data during the decryption step. We have applied our method to various images, and we show and analyze the obtained results.

  19. Image encryption based on nonlinear encryption system and public-key cryptography

    Science.gov (United States)

    Zhao, Tieyu; Ran, Qiwen; Chi, Yingying

    2015-03-01

    Recently, optical asymmetric cryptosystems (OACS) have become the focus of discussion and concern of researchers. Some researchers pointed out that OACS was not tenable because of a misunderstanding of the concept of an asymmetric cryptosystem (ACS). We propose an improved cryptosystem using the RSA public-key algorithm based on the existing OACS, and the new system conforms to the basic agreement of public key cryptosystems. At the beginning of the encryption process, the system produces an independent phase matrix and allocates the input image, which also conforms to a one-time pad cryptosystem. The simulation results show the validity of the improved cryptosystem and its high robustness against attack schemes using the phase retrieval technique.

  20. On the Design of Perceptual MPEG-Video Encryption Algorithms

    CERN Document Server

    Li, Shujun; Chen, Guanrong; Cheung, Albert; Bhargava, Bharat

    2005-01-01

    In this paper, some existing perceptual encryption algorithms of MPEG videos are surveyed, and a more effective design is proposed, which selectively encrypts fixed-length codewords (FLC) in MPEG-video bitstreams under the control of three perceptibility factors. Compared with the previously-proposed schemes, the new design can provide more useful features, such as strict size-preservation, on-the-fly encryption and multiple perceptibility, which makes it possible to support more applications with different requirements. Four different methods are suggested to provide better security against known/chosen-plaintext attacks.

  1. Image Encryption Using Differential Evolution Approach in Frequency Domain

    CERN Document Server

    Hassan, Maaly Awad S; 10.5121/sipij.2011.2105

    2011-01-01

    This paper presents a new effective method for image encryption which employs magnitude and phase manipulation using a Differential Evolution (DE) approach. The novelty of this work lies in deploying the concept of a keyed discrete Fourier transform (DFT) followed by DE operations for encryption purposes. To this end, a secret key is shared between both the encryption and decryption sides. Firstly, a two-dimensional (2-D) keyed discrete Fourier transform is carried out on the original image to be encrypted. Secondly, crossover is performed between two components of the encrypted image, which are selected based on a Linear Feedback Shift Register (LFSR) index generator. Similarly, keyed mutation is performed on the real parts of certain components selected based on the LFSR index generator. The LFSR index generator initializes its seed with the shared secret key to ensure the security of the resulting indices. The process shuffles the positions of image pixels. A new image encryption scheme based on the DE approach is developed...

  2. Implementation Of Encryption Algorithm For Communication By Microcontrollers

    Directory of Open Access Journals (Sweden)

    Udayan Patankar

    2014-04-01

    Full Text Available Abstract- This paper presents a novel architecture for the Advanced Encryption algorithm for use on low end microcontrollers with a smaller number of bits. Nowadays, communication with high data rate transmission and low power consumption is required, delivering fewer errors. Though the data rate is enhanced, we try to keep things less complicated with respect to manufacturing and packaging. Thus, rather than going for a microcontroller with more bits, we implement it with low bit size microcontrollers, which is also cost effective. Since ancient times it has been the trend to use coded language for highly secure data as well as for fast communication; on the same basis, the algorithm used here encrypts frame information using an encrypted key. The key undergoes 4 stages of encryption. Then the frame information is passed through 15 stages of encryption using the encrypted key to create a cipher text. Thus the system delivers fewer errors and guaranteed communication is possible.

  3. Image Encryption with Space-filling Curves (Short Communication

    Directory of Open Access Journals (Sweden)

    V. Suresh

    2012-01-01

    Full Text Available Conventional encryption techniques are usually applicable to text data and often unsuited for encrypting multimedia objects, for two reasons. Firstly, the huge sizes associated with multimedia objects make conventional encryption computationally costly. Secondly, multimedia objects come with massive redundancies which are useful in avoiding encryption of the objects in their entirety. Hence a class of encryption techniques devoted to encrypting multimedia objects like images has been developed. These techniques make use of the fact that the data comprising multimedia objects like images can in general be segregated into two disjoint components, namely salient and non-salient. While the former component contributes to the perceptual quality of the object, the latter only adds minor details to it. In the context of images, the salient component is often much smaller in size than the non-salient component. Encryption effort is considerably reduced if only the salient component is encrypted while leaving the other component unencrypted. A key challenge is to find means to achieve a desirable segregation so that the unencrypted component does not reveal any information about the object itself. In this study, an image encryption approach that uses fractal structures, known as space-filling curves, in order to reduce the encryption overhead is presented. In addition, the approach also enables high quality lossy compression of images. Defence Science Journal, 2012, 62(1), pp. 46-50, DOI: http://dx.doi.org/10.14429/dsj.62.1441

  4. LOW POWER ENCRYPTED MIPS PROCESSOR BASED ON AES ALGORITHM

    Directory of Open Access Journals (Sweden)

    Shivani Parmar

    2012-05-01

    Full Text Available The paper describes a low power 32-bit encrypted MIPS processor based on the AES algorithm and the MIPS pipeline architecture. The pipeline stages of the MIPS processor are arranged in such a way that the pipeline can be clocked at high frequency, and a clock gating technique is used for reducing power consumption. The encryption blocks of the Advanced Encryption Standard (AES) cryptosystem and the dependencies among pipeline stages are explained in detail with the help of a block diagram. In order to reduce power consumption, especially for portable devices and security applications, switching activity is used inside the pipeline stages. The design has been synthesized at 40 nm process technology targeting a Xilinx Virtex-6 device. The encrypted MIPS pipeline processor can work at 210 MHz and its power consumption is 1.313 W.

  5. A Robust Chaotic and Fast Walsh Transform Encryption for Gray Scale Biomedical Image Transmission

    Directory of Open Access Journals (Sweden)

    Adelaide Nicole Kengnou Telem

    2015-06-01

    Full Text Available In this work, a new scheme for image encryption based on chaos and the Fast Walsh Transform (FWT) is proposed. We use two chaotic logistic maps and combine chaotic encryption methods with the two-dimensional FWT of images. The encryption process involves two steps: firstly, chaotic sequences generated by the chaotic logistic maps are used to permute and mask the intermediate results or arrays of the FWT; the next step consists of changing the chaotic sequences or the initial conditions of the chaotic logistic maps between two intermediate results of the same row or column. Changing the encryption key several times on the same row or column makes the cipher more robust against any attack. We tested our algorithms on many biomedical images. We also used images from databases to compare our algorithm to those in the literature. Statistical analysis and key sensitivity tests show that our proposed image encryption scheme provides an efficient and secure way for real-time encryption and transmission of biomedical images.
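    A hypothetical sketch of the building blocks mentioned above: a fast Walsh-Hadamard transform of one image row, followed by a chaotic permutation and an additive mask of its coefficients. The parameters, masking rule and row-wise processing are illustrative, not the authors' exact scheme.

        import numpy as np

        def fwht(a):
            """Fast Walsh-Hadamard transform of a length 2^k vector (unnormalised)."""
            a = a.astype(np.float64).copy()
            h = 1
            while h < a.size:
                for i in range(0, a.size, 2 * h):
                    x, y = a[i:i+h].copy(), a[i+h:i+2*h].copy()
                    a[i:i+h], a[i+h:i+2*h] = x + y, x - y
                h *= 2
            return a

        def logistic_sequence(x0, r, n):
            x, out = x0, np.empty(n)
            for i in range(n):
                x = r * x * (1.0 - x)
                out[i] = x
            return out

        row = np.random.randint(0, 256, 256).astype(np.float64)   # one image row (length a power of two)
        coeffs = fwht(row)
        key = logistic_sequence(0.41, 3.98, coeffs.size)
        perm = np.argsort(key)                                    # chaotic permutation of the coefficients
        masked = coeffs[perm] + 100.0 * (key - 0.5)               # additive chaotic mask
        # Decryption reverses the mask, undoes the permutation, and applies the inverse FWHT (fwht(x) / x.size).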

  6. Micro-lens array based 3-D color image encryption using the combination of gravity model and Arnold transform

    Science.gov (United States)

    You, Suping; Lu, Yucheng; Zhang, Wei; Yang, Bo; Peng, Runling; Zhuang, Songlin

    2015-11-01

    This paper proposes a 3-D image encryption scheme based on a micro-lens array. The 3-D image can be reconstructed by applying a digital refocusing algorithm to the picked-up light field. To improve the security of the cryptosystem, the Arnold transform and a Gravity Model based image encryption method are employed. Experimental results demonstrate the high security, in terms of key space, of the proposed encryption scheme. The results also indicate that the employment of light field imaging significantly strengthens the robustness of the cipher image against some conventional image processing attacks.

  7. A REVIEW PAPER ON A SECURE IMAGE ENCRYPTION-THEN COMPRESSION SYSTEM USING WAVELET VIA PREDICTION ERROR CLUSTERING AND RANDOM PERMUTATION

    Directory of Open Access Journals (Sweden)

    Er. Maninder Kaur

    2015-04-01

    Full Text Available Images can be encrypted in many ways; several techniques have used different encryption methods. In this research, we apply a new modified International Data Encryption Algorithm to encrypt the full image in an efficient, secure manner; after encryption the original file is compressed and we get a compressed image. This paper introduces a highly efficient image encryption-then-compression (ETC) system using wavelets. The proposed image encryption scheme operated in the prediction error domain is shown to be able to provide a reasonably high level of security. We also demonstrate that an arithmetic coding-based approach can be exploited to efficiently compress the encrypted images. More notably, the proposed compression approach applied to encrypted images is only slightly worse, in terms of compression efficiency, than the state-of-the-art lossless/lossy image coders, which take original, unencrypted images as inputs. In contrast, most of the existing ETC solutions induce a significant penalty on compression efficiency. For the implementation of this proposed work we use the Image Processing Toolbox under Matlab software.

  8. Analization and Comparison of Selective Encryption Algorithms with Full Encryption for Wireless Networks

    Directory of Open Access Journals (Sweden)

    Pavithra C., Vinod B. Durdi

    2013-05-01

    Full Text Available Cryptography has been widely accepted as a traditional platform of data protection for decades. The most significant and efficient cryptosystems these days are the symmetric key algorithms for cryptography; hence, they have a very wide range of applications in many realms. Ad-hoc networks are the most commonly used type in the present scenario because of their non-fixed infrastructure. Providing security to such kinds of networks is the main objective of the work here. In this project, we present a systematic approach for selective encryption of data. In the present day scenario, where all wireless ad-hoc network nodes run on battery, full encryption of all the data may lead to a high overhead and also waste computational power or resources. Hence, two selective encryption algorithms are introduced and a secure method for communication between the user and the trusted party is also carried out. Eventually, we carry out an extensive set of experiments using Core Java and Java cryptosystems. A very attractive GUI has been designed to make it more user friendly. This can be used whenever people work remotely and connect to their host server through a VPN. We first create an ad-hoc network and communicate between the nodes of the network using a basic server-client methodology. Two selective encryption algorithms were developed and more than 50 percent encryption of the data was maintained in both algorithms. However, the security aspect can be changed depending on the kind of data which is being communicated.

  9. Cryptanalysis of a modulo image encryption scheme with fractal keys

    Science.gov (United States)

    Yoon, Eun-Jun; Yoo, Kee-Young

    2010-07-01

    Recently, Rozouvan proposed a modulo image encryption scheme with fractal keys. This paper demonstrates that Rozouvan's scheme is not secure against the following three classical types of attacks: chosen plaintext, chosen ciphertext, and known plaintext. In all three attacks, only a single plaintext/ciphertext pair is needed to break the image encryption scheme.
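    The weakness exploited by such attacks can be illustrated on a toy additive modulo cipher with a fixed key image (standing in for the fractal key, which the actual scheme derives differently): a single known plaintext/ciphertext pair reveals the entire key and breaks every other ciphertext.

        import numpy as np

        # Toy additive modulo cipher with a fixed key image K (a stand-in for the fractal key).
        K = np.random.randint(0, 256, (16, 16), dtype=np.uint8)
        encrypt = lambda P: (P.astype(np.uint16) + K) % 256
        decrypt = lambda C, key: (C.astype(np.int16) - key) % 256

        # Known-plaintext attack: one (P, C) pair leaks the whole key.
        P1 = np.random.randint(0, 256, (16, 16), dtype=np.uint8)
        C1 = encrypt(P1)
        recovered_key = (C1.astype(np.int16) - P1) % 256

        # Any other image encrypted under the same key is now readable.
        P2 = np.random.randint(0, 256, (16, 16), dtype=np.uint8)
        assert np.array_equal(decrypt(encrypt(P2), recovered_key), P2)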

  10. Shannon Entropy based Randomness Measurement and Test for Image Encryption

    CERN Document Server

    Wu, Yue; Agaian, Sos

    2011-01-01

    The quality of image encryption is commonly measured by the Shannon entropy over the ciphertext image. However, this measurement does not consider the randomness of local image blocks and is inappropriate for scrambling-based image encryption methods. In this paper, a new information entropy-based randomness measurement for image encryption is introduced which, for the first time, answers the question of whether a given ciphertext image is sufficiently random-like. It measures the randomness over the ciphertext in a fairer way by calculating the averaged entropy of a series of small image blocks within the entire test image. In order to fulfill both quantitative and qualitative measurement, the expectation and the variance of this averaged block entropy for a true-random image are strictly derived and corresponding numerical reference tables are also provided. Moreover, a hypothesis test at significance α-level is given to help accept or reject the hypothesis that the test image is ideally encrypted/random-...
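    A minimal sketch of the averaged block-entropy measurement described above, assuming an 8-bit grey-scale ciphertext and 16 x 16 blocks; the block size is an illustrative choice and the derived expectation/variance tables from the paper are not reproduced.

        import numpy as np

        def block_entropy(img, block=16):
            """Average Shannon entropy over non-overlapping block x block tiles of an 8-bit image."""
            h, w = img.shape
            entropies = []
            for r in range(0, h - block + 1, block):
                for c in range(0, w - block + 1, block):
                    tile = img[r:r+block, c:c+block]
                    p = np.bincount(tile.reshape(-1), minlength=256) / tile.size
                    p = p[p > 0]
                    entropies.append(-(p * np.log2(p)).sum())
            return float(np.mean(entropies))

        cipher = np.random.randint(0, 256, (256, 256), dtype=np.uint8)   # stand-in ciphertext image
        print(block_entropy(cipher))   # compared against the derived expectation/variance in the paper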

  11. Studying the Effects of Most Common Encryption Algorithms

    OpenAIRE

    Diaa Salama; Hatem Abdual Kader; Mohiy Hadhoud

    2011-01-01

    Wireless networks play critical roles in present work, home, and public places, so the need to protect such networks has increased. Encryption algorithms play vital roles in information systems security. Those algorithms consume a significant amount of computing resources such as CPU time, memory, and battery power. CPU and memory usability are increasing at suitable rates, but battery technology is increasing at a slower rate. The problem of the slower increasing battery technology form...

  12. Plaintext Related Two-level Secret Key Image Encryption Scheme

    OpenAIRE

    Bin Chen; Peng Cai; Yong Zhang; Jiali Xia

    2012-01-01

    Some chaos-based image encryption schemes using plain-image-independent secret code streams have weak encryption security and are vulnerable to chosen-plaintext and chosen-ciphertext attacks. This paper proposes a two-level secret key image encryption scheme, where the first-level secret key is the private symmetric secret key, and the second-level secret key is derived from both the first-level secret key and the plain image by iterating a piecewise linear map and a Logistic map. Even though t...

  13. Encryption-Decryption RGB Color Image Using Matrix Multiplication

    Directory of Open Access Journals (Sweden)

    Mohamad M.AL-Laham

    2015-10-01

    Full Text Available An enhanced technique of color image encryption based on random matrix key encoding is proposed. To encrypt the color image, a separation into Red, Green and Blue (R, G, B) channels is applied. Each channel is encrypted using a technique called double random matrix key encoding, and then three new coded image matrices are constructed. To obtain a reconstructed image that is the same as the original image at the receiving side, simple extraction and decryption operations are performed. The results show that the proposed technique is powerful for color image encryption and decryption, and MATLAB simulations were used to obtain the results. The proposed technique has high security features because each color component is treated separately using its own double random matrix key, which is generated randomly and makes the process of hacking the three keys very difficult.
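    A rough sketch of per-channel double random matrix key encoding, assuming well-conditioned real-valued key matrices; the exact key construction in the record may differ, and decryption here is exact only up to floating-point rounding.

        import numpy as np

        rng = np.random.default_rng(seed=7)
        img = rng.integers(0, 256, (64, 64, 3)).astype(np.float64)      # stand-in RGB image

        def channel_keys(n, m):
            """Two random invertible key matrices per channel (the 'double random matrix key')."""
            return rng.normal(size=(n, n)) + n * np.eye(n), rng.normal(size=(m, m)) + m * np.eye(m)

        encrypted, keys = [], []
        for c in range(3):                                               # treat R, G, B separately
            K1, K2 = channel_keys(*img.shape[:2])
            encrypted.append(K1 @ img[..., c] @ K2)
            keys.append((K1, K2))

        decrypted = np.stack(
            [np.linalg.inv(K1) @ E @ np.linalg.inv(K2) for E, (K1, K2) in zip(encrypted, keys)],
            axis=-1,
        )
        assert np.allclose(decrypted, img)        # exact up to floating-point rounding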

  14. Selective Image Encryption Using DCT with Stream Cipher

    OpenAIRE

    Sapna Sasidharan; Jithin R

    2010-01-01

    Encryption is used to securely transmit data in open networks. Each type of data has its own features; therefore different techniques should be used to protect confidential image data from unauthorized access. In this paper, selective image encryption using DCT with Stream Cipher is done. In the DCT method, the basic idea is to decompose the image into 8×8 blocks and these blocks are transformed from the spatial domain to the frequency domain by the DCT. Then, the DCT coefficients correlated ...

  15. Bi-serial DNA Encryption Algorithm(BDEA)

    CERN Document Server

    Prabhu, D

    2011-01-01

    The vast parallelism, exceptional energy efficiency and extraordinary information inherent in DNA molecules are being explored for computing, data storage and cryptography. DNA cryptography is an emerging field of cryptography. In this paper a novel encryption algorithm is devised based on number conversion, DNA digital coding and PCR amplification, which can effectively prevent attacks. Data treatment is used to transform the plain text into cipher text, which provides excellent security.

  16. Efficient Hardware Implementation of the Lightweight Block Encryption Algorithm LEA

    OpenAIRE

    Donggeon Lee; Dong-Chan Kim; Daesung Kwon; Howon Kim

    2014-01-01

    Recently, due to the advent of resource-constrained trends, such as smartphones and smart devices, the computing environment is changing. Because our daily life is deeply intertwined with ubiquitous networks, the importance of security is growing. A lightweight encryption algorithm is essential for secure communication between these kinds of resource-constrained devices, and many researchers have been investigating this field. Recently, a lightweight block cipher called LEA was proposed. LEA ...

  17. A Chaotic Encryption Algorithm Based on Skew Tent Map

    Directory of Open Access Journals (Sweden)

    Chen Shou-gang

    2009-04-01

    Full Text Available With the rapid development and extensive application of computer technology, network technology, communication technology, and the Internet in particular, the security of network information is becoming an increasingly critical problem that must be solved urgently. Applying chaos theory to secure communication and information encryption has already become one of the hot research topics at the intersection of nonlinear science and information science, and it is a novel branch of high-tech research fields. In this paper, a chaotic encryption algorithm based on the skew tent map is proposed. In the encryption process, the updated look-up table depends on the plaintext and an external key, and the 8-bit subkey is dynamically generated with the skew tent map and depends on the updated look-up table. The key consists of the initial condition X0 of the skew tent map, the control parameter p and an external key K. Theoretical analysis and simulated experiments show that the algorithm can resist statistical and differential attacks and has high security.
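    A minimal sketch of skew-tent-map subkey generation used as a keystream; the plaintext- and key-dependent look-up-table update described in the abstract is not reproduced here, and all parameter values are illustrative assumptions.

        def skew_tent(x, p):
            """Skew tent map on (0,1) with control parameter p."""
            return x / p if x < p else (1.0 - x) / (1.0 - p)

        def subkey_bytes(x0, p, n, burn_in=64):
            """Generate n 8-bit subkeys by iterating the skew tent map (look-up-table update omitted)."""
            x = x0
            for _ in range(burn_in):            # discard transient iterations
                x = skew_tent(x, p)
            out = bytearray()
            for _ in range(n):
                x = skew_tent(x, p)
                out.append(int(x * 256) % 256)  # quantise the state to one subkey byte
            return bytes(out)

        plaintext = b"secret message"
        ks = subkey_bytes(x0=0.123456, p=0.499, n=len(plaintext))
        ciphertext = bytes(a ^ b for a, b in zip(plaintext, ks))   # simple keystream use of the subkeys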

  18. A New Data Encryption Algorithm Based on the Location of Mobile Users

    OpenAIRE

    Hsien-Chou Liao; Yun-Hsiang Chao

    2008-01-01

    The wide spread of WLAN and the popularity of mobile devices increase the frequency of data transmission among mobile users. However, most data encryption technology is location-independent: encrypted data can be decrypted anywhere, and the encryption technology cannot restrict the location of data decryption. In order to meet the demands of mobile users in the future, a location-dependent approach, called the Location-Dependent Data Encryption Algorithm (LDEA), is proposed in this study. A...

  19. A Digital Watermarking for Lifting Based Compression And Encryption of JPEG 2000 Images

    Directory of Open Access Journals (Sweden)

    Ansu Anna Ponnachen, Lidiya Xavier

    2013-06-01

    Full Text Available In the digital world, digital media is currently evolving so rapidly that copyright protection is becoming increasingly important. Nowadays this media is available in various image formats, which makes it simple to copy and resell without any loss of quality. A wide range of digital media is often distributed by multiple levels of distributors in a compressed and encrypted format. It is sometimes necessary to watermark the compressed encrypted media items in the compressed encrypted domain itself for tamper detection, ownership declaration or copyright management purposes. The objective of image compression is to reduce irrelevance and redundancy of the image data in order to be able to store or transmit data in an efficient form. This paper deals with the watermarking of compressed and encrypted JPEG 2000 images. The compression is achieved on JPEG 2000 images by a lifting-based architecture. The encryption algorithm used is a stream cipher. The identification of the watermark can be done in the decrypted domain. The watermarking technique used is spread spectrum. This can be implemented through Matlab.

  20. Optical image encryption by random shifting in fractional Fourier domains

    OpenAIRE

    Hennelly, Bryan M.; Sheridan, John T.

    2003-01-01

    A number of methods have recently been proposed in the literature for the encryption of two-dimensional information by use of optical systems based on the fractional Fourier transform. Typically, these methods require random phase screen keys for decrypting the data, which must be stored at the receiver and must be carefully aligned with the received encrypted data. A new technique based on a random shifting, or jigsaw, algorithm is proposed. This method does not require the use of phase keys...

  1. An Analysis of Encryption and Decryption Application by using One Time Pad Algorithm

    Directory of Open Access Journals (Sweden)

    Zaeniah

    2015-09-01

    Full Text Available Security of data in a computer is needed to protect critical data and information from other parties. One way to protect data is to apply the science of cryptography to perform data encryption. There is a wide variety of algorithms used for encryption of data; this study uses the one-time pad algorithm for encrypting data. The One Time Pad algorithm uses the same key in the encryption and decryption of the data. Encrypted data is transformed into cipher text so that only the person who has the key can open that data. Therefore, an analysis is done of an application that implements the one-time pad algorithm for encrypting data. The application that implements the one-time pad algorithm can help users to store data securely.
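    A minimal sketch of one-time-pad encryption using XOR (the application described may use a different combining operation); the key must be truly random, at least as long as the message, and never reused.

        import os

        def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
            """XOR one-time pad; the same function decrypts because XOR is its own inverse."""
            if len(key) < len(plaintext):
                raise ValueError("one-time pad key must be at least as long as the message")
            return bytes(p ^ k for p, k in zip(plaintext, key))

        message = b"confidential data"
        key = os.urandom(len(message))        # a fresh random key, used once
        cipher = otp_encrypt(message, key)
        assert otp_encrypt(cipher, key) == message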

  2. Performance Evaluation of Encryption Algorithms' Key Length Size on Web Browsers

    OpenAIRE

    Syed Idrus, Syed Zulkarnain; Aljunid, Syed Alwee; Mohd Asi, Salina; Sudin, Suhizaz

    2012-01-01

    In this article, the research correlates to our previous study done on encryption algorithms' "text length size". However, in this study, the evaluation is analysed on a different means instead, which is the encryption algorithms' "key length size", but by imposing the same method and programming language over the same Web browsers in order to signify their performance differences. The performance is based on the encryption process of the programming language's script with the Web browsers. W...

  3. Simultaneous Color Image Compression and Encryption using Number Theory

    Directory of Open Access Journals (Sweden)

    Navaneethakrishnan Navaneethakrishnan

    2012-09-01

    Full Text Available The dependence on computing machines and the utility of information have been growing tremendously in the last few decades. As a result, evolving effective techniques for storing and transmitting the ever increasing volumes of data has become a high priority issue. Image compression addresses the problem by reducing the amount of data required to represent a digital image. The underlying basis of the compression process is the removal of redundant data. Selection of a suitable compression scheme for a given application depends on the available memory for processing, the number of mathematical computations and the available bandwidth for transmission. The security of digital images is another important issue that has been receiving considerable attention in the recent past. Different image encryption methods have been proposed in the literature towards ensuring the security of data. The encryption process transforms a 2-D pixel array into a statistically uncorrelated data set. In this paper, an enhanced number theory based color image compression and encryption scheme is proposed. This technique encompasses the twin-based application of image compression and image encryption simultaneously, adopting a model-based paradigm for the general compression-encryption standards.

  4. SECURE IMAGE DATA BY USING DIFFERENT ENCRYPTION TECHNIQUES A REVIEW

    Directory of Open Access Journals (Sweden)

    GAYATHRI D.

    2013-03-01

    Full Text Available Information security is an increasingly important problem in the present era of advanced technology, because of which encryption is becoming very important to ensure security. The popular application of multimedia technology and the increasing transmission ability of networks gradually lead us to acquire information directly and clearly through images. Digital images, which are transmitted over the internet, must be protected from unauthorized access during storage and transmission for communication, copyright protection and authentication purposes. This can be accomplished using image encryption, which is an intelligent hiding of information. In this paper, I survey existing work that uses different techniques for image encryption and also give a general introduction to cryptography.

  5. A Chaos Based Encryption Method for Monochrom Images and Text.

    Directory of Open Access Journals (Sweden)

    Varsha S

    2012-08-01

    Full Text Available We propose a new method for encryption of monochrome (black & white) images and text documents using the Hilbert transform and chaos theory, with the added security feature of a Rubik Cube operation. The input text or image is transformed using the Hilbert transform. A random phase mask is generated using a logistic map function. The transformed image is combined with the random phase mask. The pixels of the image obtained from the combination are shifted row and column wise according to a random number sequence, which also acts as a key. We call this the Rubik Cube operation as it resembles the Rubik cube. The image obtained after the Rubik Cube operation is the encrypted image. The image has been decrypted, and the MSE and correlation coefficient between the decrypted and input images are calculated.

  6. A technique for image encryption using digital signature

    Science.gov (United States)

    Sinha, Aloka; Singh, Kehar

    2003-04-01

    We propose a new technique to encrypt an image for secure image transmission. The digital signature of the original image is added to the encoded version of the original image. The encoding of the image is done using an appropriate error control code, such as a Bose-Chaudhuri-Hocquenghem (BCH) code. At the receiver end, after the decryption of the image, the digital signature can be used to verify the authenticity of the image. Detailed simulations have been carried out to test the encryption technique. An optical correlator, in either the JTC or the VanderLugt geometry, or a digital correlation technique, can be used to verify the authenticity of the decrypted image.

  7. A Chaos Based Encryption Method for Monochrome Images and Text.

    OpenAIRE

    Varsha S; Avinash Kumar Jha

    2012-01-01

    We propose a new method for encryption of monochrome (black & white) images and text documents using the Hilbert transform and chaos theory, with the added security feature of a Rubik Cube operation. The input text or image is transformed using the Hilbert transform. A random phase mask is generated using a logistic map function. The transformed image is combined with the random phase mask. The pixels of the image obtained from the combination are shifted row and column wise according to a random number seque...

  8. Efficient image or video encryption based on spatiotemporal chaos system

    International Nuclear Information System (INIS)

    In this paper, an efficient image/video encryption scheme is constructed based on a spatiotemporal chaos system. The chaotic lattices are used to generate pseudorandom sequences and then encrypt image blocks one by one. By iterating the chaotic maps a certain number of times, the generated pseudorandom sequences achieve high initial-value sensitivity and good randomness. The pseudorandom bits in each lattice are used to encrypt the Direct Current coefficient (DC) and the signs of the Alternating Current coefficients (ACs). Theoretical analysis and experimental results show that the scheme has good cryptographic security and perceptual security, and it does not noticeably affect the compression efficiency. These properties make the scheme a suitable choice for practical applications.

  9. Single-Channel Color Image Encryption Using the Reality-Preserving Fractional Discrete Cosine Transform in YCbCr Space

    Directory of Open Access Journals (Sweden)

    Jianhua Wu

    2013-11-01

    Full Text Available A novel single-channel color image encryption algorithm is proposed, which utilizes the reality-preserving fractional discrete cosine transform in YCbCr space. The color image to be encrypted is decomposed into Y, Cb, and Cr components, which are then separately transformed by the Discrete Cosine Transform (DCT). The resulting three spectra sequences, obtained by zig-zag scanning the spectra matrices, are truncated, and the lower-frequency coefficients of the three components are scrambled into a single matrix of the same size as the original color image. The obtained single matrix is then encrypted by the fractional discrete cosine transform, which provides secrecy of both pixel values and pixel positions simultaneously. The encrypted image is convenient for display, transmission and storage, thanks to the reality-preserving property of the fractional discrete cosine transform. Additionally, the proposed algorithm enlarges the key space by employing the generating sequence as an extra key in addition to the fractional orders. Simulation results and security analysis demonstrate that the proposed algorithm is feasible, effective and secure. The robustness to noise attack is also guaranteed to some extent.

  10. Improving Security of Parallel Algorithm Using Key Encryption Technique

    Directory of Open Access Journals (Sweden)

    R. Swarna Raja

    2013-01-01

    Full Text Available In the recent cloud era, computing has moved to a new plane of running large-scale scientific applications. Many parallel algorithms have been created to support large datasets. MapReduce is one such parallel data processing framework, adopted widely for scientific research, machine learning and high-end computing. The most prevalent implementation of MapReduce is the open source project Hadoop. To protect the integrity and confidentiality of uploaded data, MapReduce introduced a Kerberos-based model with tokens for data blocks and processing nodes. The tokens are symmetrically encrypted and distributed across the nodes. Such a technique is vulnerable to man-in-the-middle attacks such as data loss, data modification and the stealing of keys. In this study, a novel technique is proposed based on public key encryption on top of the Kerberos model to enhance security. The various attack scenarios on the current Hadoop implementation model have been analyzed and a secure environment has been proposed. The study shows that the proposed framework provides an improved level of security when using RSA (Rivest-Shamir-Adleman): a key size of 65,537 consumed 23 milliseconds, while a key size of 257 bits consumed 21 milliseconds.

  11. Image encryption using the two-dimensional logistic chaotic map

    Science.gov (United States)

    Wu, Yue; Yang, Gelan; Jin, Huixia; Noonan, Joseph P.

    2012-01-01

    Chaos maps and chaotic systems have been proved to be useful and effective for cryptography. In our study, the two-dimensional logistic map with complicated basin structures and attractors is first used for image encryption. The proposed method adopts the classic framework of the permutation-substitution network in cryptography and thus ensures both confusion and diffusion properties for a secure cipher. The proposed method is able to encrypt an intelligible image into a random-like one from both the statistical point of view and the human visual system point of view. Extensive simulation results using test images from the USC-SIPI image database demonstrate the effectiveness and robustness of the proposed method. Security analysis results using both conventional and the most recent tests show that the encryption quality of the proposed method reaches or excels the current state-of-the-art methods. Similar encryption ideas can be applied to digital data in other formats (e.g., digital audio and video). We also publish the cipher MATLAB open-source-code under the web page https://sites.google.com/site/tuftsyuewu/source-code.
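
    The permutation-substitution structure described above can be illustrated with a short sketch: a chaotic trajectory is sorted to obtain a pixel permutation, and a second segment of the trajectory is quantized into a byte keystream for the substitution (diffusion) stage. The sketch below uses the 1D logistic map and NumPy for brevity; it is not the 2D-logistic-map cipher of the cited paper, and all parameter values are illustrative assumptions.

```python
# Illustrative permutation-substitution image cipher driven by a logistic map.
# Didactic sketch only; not the 2D-logistic-map cipher of the cited paper.
import numpy as np

def logistic_sequence(x0, r, n):
    """Iterate the 1D logistic map x <- r*x*(1-x) and return n samples."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

def encrypt(img, x0=0.3567, r=3.99):
    flat = img.flatten()
    n = flat.size
    chaos = logistic_sequence(x0, r, 2 * n)
    # Permutation stage: sorting chaotic samples yields a pseudorandom permutation.
    perm = np.argsort(chaos[:n])
    permuted = flat[perm]
    # Substitution (diffusion) stage: XOR with a keystream quantized to bytes.
    keystream = np.floor(chaos[n:] * 256).astype(np.uint8)
    cipher = permuted ^ keystream
    # In practice only (x0, r) would be shared as the key and perm/keystream
    # regenerated; they are returned here to keep the sketch short.
    return cipher.reshape(img.shape), perm, keystream

def decrypt(cipher, perm, keystream):
    flat = cipher.flatten() ^ keystream
    plain = np.empty_like(flat)
    plain[perm] = flat                      # undo the permutation
    return plain.reshape(cipher.shape)

img = np.random.randint(0, 256, (8, 8), dtype=np.uint8)  # stand-in grayscale image
c, perm, ks = encrypt(img)
assert np.array_equal(decrypt(c, perm, ks), img)
```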

  12. An Image Encryption Scheme Based on Lorenz System for Low Profile Applications

    Science.gov (United States)

    Anees, Amir

    2015-09-01

    The Advanced Encryption Standard, a benchmark for encryption, is well suited to digital image encryption for security reasons, but it might not be effective for low-profile applications due to its high computational and hardware complexity. In this paper, we present a robust image encryption scheme for these types of applications based on the chaotic sequences of the Lorenz system, while also ensuring system security. The security strength is evident from the results of the statistical and key analyses performed in this paper.
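
    As an illustration of how a Lorenz trajectory can drive a lightweight image cipher, the sketch below integrates the Lorenz equations with a forward-Euler step and XOR-masks the image bytes with a quantized keystream. The integration step, quantization rule and key format are assumptions for demonstration only; this is not the cited scheme, and the keystream is not claimed to be cryptographically vetted.

```python
# Sketch: XOR-masking an image with a keystream derived from the Lorenz system.
# The byte quantization rule is an illustrative choice, not the cited scheme.
import numpy as np

def lorenz_keystream(n_bytes, x=0.1, y=0.0, z=0.0,
                     sigma=10.0, rho=28.0, beta=8.0 / 3.0, dt=0.001):
    ks = np.empty(n_bytes, dtype=np.uint8)
    for i in range(n_bytes):
        # Forward-Euler integration of the Lorenz equations.
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        # Quantize the state into one byte (illustrative rule).
        ks[i] = int(abs(x + y + z) * 1e6) % 256
    return ks

def xor_cipher(img, key_state):
    flat = img.flatten()
    ks = lorenz_keystream(flat.size, *key_state)  # key = initial conditions
    return (flat ^ ks).reshape(img.shape)

key = (0.123, 0.456, 0.789)
img = np.random.randint(0, 256, (16, 16), dtype=np.uint8)
cipher = xor_cipher(img, key)
assert np.array_equal(xor_cipher(cipher, key), img)  # XOR masking is an involution
```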

  13. A Review On Data Hiding Techniques In Encrypted Images

    Directory of Open Access Journals (Sweden)

    Ms. Anagha Markandey

    2013-10-01

    Full Text Available Nowadays, data hacking is a major problem in the networking era. A number of techniques are available in the industry to overcome this problem. Data hiding in an encrypted image is one such solution, but its drawback is that the original cover cannot be losslessly recovered. For this reason, more and more attention has recently been paid to reversible data hiding (RDH) in encrypted images, since it offers the excellent property that the original cover can be recovered without any loss after the embedded data are extracted, while protecting the confidentiality of the image content. This paper enlists the various methods of data hiding in images, such as difference expansion, histogram shifting, or a combination of both techniques. These methods are useful because they recover the image in its original quality with an improved PSNR.

  14. DATA SECURITY IN LOCAL AREA NETWORK BASED ON FAST ENCRYPTION ALGORITHM

    Directory of Open Access Journals (Sweden)

    G. Ramesh

    2010-06-01

    Full Text Available Hacking is one of the greatest problems in wireless local area networks. Many algorithms have been used to prevent outside attackers from eavesdropping and to ensure that data are transferred to the end-user safely and correctly. In this paper, a new symmetric encryption algorithm is proposed that prevents outside attacks. The new algorithm avoids key exchange between users and reduces the time taken for encryption and decryption. It operates at a high data rate in comparison with the Data Encryption Standard (DES), Triple DES (TDES), the Advanced Encryption Standard (AES-256), and RC6. The new algorithm has been applied successfully to both text files and voice messages.

  15. Optical image encryption via photon-counting imaging and compressive sensing based ptychography

    Science.gov (United States)

    Rawat, Nitin; Hwang, In-Chul; Shi, Yishi; Lee, Byung-Geun

    2015-06-01

    In this study, we investigate the integration of compressive sensing (CS) and photon-counting imaging (PCI) techniques with a ptychography-based optical image encryption system. Primarily, the plaintext real-valued image is optically encrypted and recorded via a classical ptychography technique. Further, sparse-based representations of the original encrypted complex data can be produced by combining CS and PCI techniques with the primary encrypted image. Such a combination takes advantage of reduced encrypted samples (i.e., linearly projected random compressive complex samples and photon-counted complex samples) that can be exploited to realize optical decryption, which inherently serves as a secret key (i.e., independent of the encryption phase keys) and makes an intruder attack futile. In addition, recording fewer encrypted samples provides a substantial bandwidth reduction in online transmission. We demonstrate that the fewer sparse-based complex samples have adequate information to realize decryption. To the best of our knowledge, this is the first report on integrating CS and PCI with conventional ptychography-based optical image encryption.

  16. Hybrid approach for Image Encryption Using SCAN Patterns and Carrier Images

    OpenAIRE

    Panduranga H. T.; Naveen Kumar S.K

    2010-01-01

    We propose a hybrid technique for image encryption which employs the concept of a carrier image and SCAN patterns generated by the SCAN methodology. Although it involves an existing method, the SCAN methodology, the novelty of the work lies in the hybridization and in carrier image creation for encryption. Here the carrier image is created with the help of an alphanumeric keyword. Each alphanumeric key has a unique 8-bit value generated by a 4-out-of-8 code. This newly generated carrier ...

  17. An Effective Method in Steganography to Improve Protection Using Advanced Encryption Standard Algorithm

    Directory of Open Access Journals (Sweden)

    K Kamalam

    2014-12-01

    Full Text Available Steganography is the art and science of writing hidden messages in such a way that no one, apart from the sender and the intended recipient, suspects the existence of the information, a form of security through obscurity. It is an emerging area used for secure data broadcast over public media. In this study, a novel approach to image steganography based on LSB (Least Significant Bit) insertion, RSA encryption and the AES (Advanced Encryption Standard) algorithm for lossless JPEG images has been proposed. We present a strategy for attaining maximum embedding capacity in an image, in which the maximum possible neighboring pixels are analyzed for their frequencies to determine the amount of content to be added to each pixel. The techniques provide a seamless insertion of data into the carrier image and reduce the error and artifact insertion to a minimum. We validate our approach with an experimental evaluation of a prototype implementation of the proposed model.
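
    A minimal sketch of the LSB insertion step is given below; the payload is assumed to have been encrypted already (e.g., with RSA/AES as the abstract describes), so only the embedding and extraction of payload bits into the cover image's least significant bits are shown.

```python
# Sketch of LSB insertion: hide an (already encrypted) byte string in the
# least significant bits of a grayscale cover image.
import numpy as np

def embed_lsb(cover, payload: bytes):
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten()
    if bits.size > flat.size:
        raise ValueError("cover image too small for payload")
    stego = flat.copy()
    stego[:bits.size] = (stego[:bits.size] & 0xFE) | bits   # overwrite LSBs
    return stego.reshape(cover.shape)

def extract_lsb(stego, n_bytes):
    bits = stego.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
secret = b"\x8f\x01\xab\xcd"          # stands in for an AES-encrypted payload
stego = embed_lsb(cover, secret)
assert extract_lsb(stego, len(secret)) == secret
```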

  18. NEW CONCEPT OF SYMMETRIC ENCRYPTION ALGORITHM A HYBRID APPROACH OF CAESAR CIPHER AND COLUMNAR TRANSPOSITION IN MULTI STAGES

    Directory of Open Access Journals (Sweden)

    Dharmendra Kumar Gupta

    2012-02-01

    Full Text Available Internet and network applications are growing very fast, so the need to protect such applications has increased. Encryption algorithms play a main role in information security systems. This work gives an insight into a new hybrid approach to conventional encryption that provides strong encryption of data. Symmetric encryption, also called conventional or single-key encryption, was the only type of encryption in use prior to the development of public-key encryption. The classical conventional primitives on their own are weak, and brute-force and cryptanalysis attacks can easily recover the plaintext. With the increasing use of secure transmission of data and information over the internet, the need for strong encryption algorithms grows day by day. In this work we present a new symmetric encryption algorithm that hybridizes two primitive and individually weak techniques, the Caesar cipher and columnar transposition, in multiple stages to make the new approach more secure and stronger than the earlier concepts. The core of this algorithm is the use of two different secret keys, one for the Caesar cipher and one for the columnar transposition. Cryptanalysis cannot determine the plaintext easily, and a brute-force attack requires a long time to obtain the plaintext. The mechanism of this new hybrid algorithm is to first encrypt the message with the Caesar cipher and then transpose the encrypted message with the columnar transposition; this process is repeated as many times as there are digits in the secret key of the Caesar cipher.
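
    The following sketch shows the general idea of alternating a Caesar shift with a columnar transposition over several stages. The key handling (fixed shift, fixed keyword, fixed stage count) is a simplification for illustration and is not the exact keying scheme of the cited work.

```python
# Sketch of a multi-stage hybrid of Caesar shift and columnar transposition.
def caesar(text, shift):
    # Shift alphabetic characters only; leave others (e.g., spaces) untouched.
    return "".join(chr((ord(c) - 65 + shift) % 26 + 65) if c.isalpha() else c
                   for c in text.upper())

def columnar(text, key):
    # Write row-wise into len(key) columns, then read columns in key order.
    cols = len(key)
    rows = -(-len(text) // cols)              # ceiling division
    padded = text.ljust(rows * cols, "X")
    order = sorted(range(cols), key=lambda i: key[i])
    return "".join(padded[r * cols + c] for c in order for r in range(rows))

def hybrid_encrypt(plain, caesar_key=7, trans_key="ZEBRA", stages=3):
    text = plain
    for _ in range(stages):                   # alternate the two primitives
        text = columnar(caesar(text, caesar_key), trans_key)
    return text

print(hybrid_encrypt("ATTACK AT DAWN"))
```

    Neither primitive is secure on its own, which is exactly the point of the abstract: the (modest) strength comes from repeating the substitution-then-transposition pair over several stages with independent keys.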

  19. SECURE IMAGE DATA BY USING DIFFERENT ENCRYPTION TECHNIQUES A REVIEW

    OpenAIRE

    D.Gayathri

    2013-01-01

    Information security is an increasingly important problem in the present era of advanced technology, because of which encryption is becoming very important to ensure security. Popular application of multimedia technology and increasing transmission ability of network gradually leads us to acquire information directly and clearly through Images. The digital images, which are transmitted over the internet, must be protected from unauthorized access during storage and transmission for communicat...

  20. The New Image Encryption and Decryption Using Quasi Group

    Directory of Open Access Journals (Sweden)

    Ankit Agarwal

    2014-07-01

    Full Text Available Multimedia communication is the new age of communication, and image communication is one of its most popular types. This type of communication always faces security challenges: security breaches are rising at a rapid rate, and image privacy faces an even bigger threat. There are several image cryptography techniques, each with positive and negative consequences. In this paper, we present a new approach to image encryption and decryption using pixel shifting and a quasigroup (Latin square) without performing translation. It requires low computation. Image pixel reshuffling is done randomly and in a non-repeating manner.
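
    A quasigroup over Z_256 built from a Latin square can be used as a simple stream-like transformation of pixel bytes, as sketched below. The particular quasigroup (addition modulo 256) and leader element are illustrative choices and not necessarily those of the cited scheme.

```python
# Sketch of a quasigroup (Latin-square) transformation applied to image bytes.
import numpy as np

SIZE = 256
# Latin square L[a][b] = (a + b) mod 256: every row and column is a permutation,
# so (Z_256, L) is a quasigroup with a well-defined "division" for decryption.
L = (np.add.outer(np.arange(SIZE), np.arange(SIZE)) % SIZE).astype(np.uint8)

def qg_encrypt(data, leader):
    out = np.empty_like(data)
    prev = leader
    for i, m in enumerate(data):
        prev = L[prev, m]                 # c_i = Q(c_{i-1}, m_i)
        out[i] = prev
    return out

def qg_decrypt(cipher, leader):
    out = np.empty_like(cipher)
    prev = leader
    for i, c in enumerate(cipher):
        # Invert within the row: m_i = (c_i - c_{i-1}) mod 256 for this quasigroup.
        out[i] = (int(c) - prev) % SIZE
        prev = int(c)
    return out

img = np.random.randint(0, 256, 32, dtype=np.uint8)   # flattened image bytes
cip = qg_encrypt(img, leader=173)
assert np.array_equal(qg_decrypt(cip, leader=173), img)
```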

  1. An efficient diffusion approach for chaos-based image encryption

    International Nuclear Information System (INIS)

    One of the existing chaos-based image cryptosystems is composed of alternative substitution and diffusion stages. A multi-dimensional chaotic map is usually employed in the substitution stage for image pixel permutation while a one-dimensional (1D) chaotic map is used for diffusion purpose. As the latter usually involves real number arithmetic operations, the overall encryption speed is limited by the diffusion stage. In this paper, we propose a more efficient diffusion mechanism using simple table lookup and swapping techniques as a light-weight replacement of the 1D chaotic map iteration. Simulation results show that at a similar security level, the proposed cryptosystem needs about one-third the encryption time of a similar cryptosystem. The effective acceleration of chaos-based image cryptosystems is thus achieved.

  2. Efficient Hardware Implementation of the Lightweight Block Encryption Algorithm LEA

    Directory of Open Access Journals (Sweden)

    Donggeon Lee

    2014-01-01

    Full Text Available Recently, with the advent of resource-constrained devices such as smartphones and other smart devices, the computing environment has been changing. Because our daily life is deeply intertwined with ubiquitous networks, the importance of security is growing. A lightweight encryption algorithm is essential for secure communication between these kinds of resource-constrained devices, and many researchers have been investigating this field. Recently, a lightweight block cipher called LEA was proposed. LEA was originally targeted at efficient implementation on microprocessors, as it is fast when implemented in software and, furthermore, has a small memory footprint. To reflect recent technology, all required calculations use 32-bit wide operations. In addition, the algorithm is composed not of complex S-box-like structures but of simple addition, rotation, and XOR operations. To the best of our knowledge, this paper is the first report on a comprehensive hardware implementation of LEA. We present various hardware structures and their implementation results according to key sizes. Even though LEA was originally targeted at software efficiency, it also shows high efficiency when implemented as hardware.

  3. Optical image encryption and hiding based on a modified Mach-Zehnder interferometer.

    Science.gov (United States)

    Li, Jun; Li, Jiaosheng; Shen, Lina; Pan, Yangyang; Li, Rong

    2014-02-24

    A method for optical image encryption and hiding in the Fresnel domain via completely optical means is proposed, which encodes the original object image into an encrypted image and then embeds it into a host image in our modified Mach-Zehnder interferometer architecture. The modified Mach-Zehnder interferometer not only provides phase shifts to record the complex amplitude of the final encrypted object image on the CCD plane but also introduces the host image into the reference path of the interferometer to hide it. The final encrypted object image is registered as interference patterns, which resemble a Fresnel diffraction pattern of the host image, and thus the secure information is imperceptible to unauthorized receivers. The method can simultaneously realize image encryption and image hiding at high speed in a purely optical system. The validity of the method and its robustness against some common attacks are investigated by numerical simulations and experiments. PMID:24663801

  4. An image encryption scheme based on quantum logistic map

    Science.gov (United States)

    Akhshani, A.; Akhavan, A.; Lim, S.-C.; Hassan, Z.

    2012-12-01

    The topic of quantum chaos has begun to draw increasing attention in recent years, although a satisfactory definition that differentiates it from its classical counterpart has not yet been settled. Dissipative quantum maps can be characterized by sensitive dependence on initial conditions, like classical maps. Considering this property, an image encryption scheme based on the quantum logistic map is proposed. The security and performance of the proposed image encryption scheme are analyzed using well-known methods. The results of the reliability analysis are encouraging, and it can be concluded that the proposed scheme is efficient and secure. The results of this study also suggest the application of other quantum maps, such as the quantum standard map and the quantum baker map, in cryptography and other aspects of security and privacy.

  5. Hardware realization of chaos based block cipher for image encryption

    KAUST Repository

    Barakat, Mohamed L.

    2011-12-01

    Unlike stream ciphers, block ciphers are very essential for parallel processing applications. In this paper, the first hardware realization of chaotic-based block cipher is proposed for image encryption applications. The proposed system is tested for known cryptanalysis attacks and for different block sizes. When implemented on Virtex-IV, system performance showed high throughput and utilized small area. Passing successfully in all tests, our system proved to be secure with all block sizes. © 2011 IEEE.

  6. Chaos-based encryption for fractal image coding

    Science.gov (United States)

    Yuen, Ching-Hung; Wong, Kwok-Wo

    2012-01-01

    A chaos-based cryptosystem for fractal image coding is proposed. The Rényi chaotic map is employed to determine the order of processing the range blocks and to generate the keystream for masking the encoded sequence. Compared with the standard approach of fractal image coding followed by the Advanced Encryption Standard, our scheme offers a higher sensitivity to both plaintext and ciphertext at a comparable operating efficiency. The keystream generated by the Rényi chaotic map passes the randomness tests set by the United States National Institute of Standards and Technology, and so the proposed scheme is sensitive to the key.

  7. Chaos-based encryption for fractal image coding

    International Nuclear Information System (INIS)

    A chaos-based cryptosystem for fractal image coding is proposed. The Rényi chaotic map is employed to determine the order of processing the range blocks and to generate the keystream for masking the encoded sequence. Compared with the standard approach of fractal image coding followed by the Advanced Encryption Standard, our scheme offers a higher sensitivity to both plaintext and ciphertext at a comparable operating efficiency. The keystream generated by the Rényi chaotic map passes the randomness tests set by the United States National Institute of Standards and Technology, and so the proposed scheme is sensitive to the key. (general)

  8. An image encryption approach based on chaotic maps

    International Nuclear Information System (INIS)

    It is well-known that images are different from texts in many aspects, such as high redundancy and correlation, local structure and amplitude-frequency characteristics. As a result, conventional encryption methods are not directly applicable to images. In this paper, we improve the confusion and diffusion properties in terms of discrete exponential chaotic maps, and design a key scheme for resistance to statistical attack, differential attack and grey code attack. Experimental and theoretical results also show that our scheme is efficient and very secure.

  9. An efficient and robust image encryption scheme for medical applications

    Science.gov (United States)

    Kanso, A.; Ghebleh, M.

    2015-07-01

    In this paper, we propose a novel full and selective chaos-based image encryption scheme suitable for medical image encryption applications. The proposed approach consists of several rounds, where each round is made up of two phases, a shuffling phase and a masking phase. Both phases are block-based and use chaotic cat maps to shuffle and mask an input image. To improve the speed of the proposed scheme while maintaining a high level of security, the scheme employs a pseudorandom matrix, of the same size as the input image, in the masking phase of each round. Blocks of this pseudorandom matrix are permuted in each round of the shuffling phase according to the outputs of some chaotic maps. The proposed scheme applies mixing between blocks of the image in order to prevent cryptanalytic attacks such as differential attacks. Simulation results demonstrate high performance of the proposed scheme and show its robustness against cryptanalytic attacks, thus confirming its suitability for real-time secure image communication.

  10. Subjective and Objective Quality Assessment of Transparently Encrypted JPEG2000 Images

    OpenAIRE

    Stutz, Thomas; Pankajakshan, Vinod; Autrusseau, Florent; Uhl, Andreas; Hofbauer, Heinz

    2010-01-01

    Transparent encryption has two main requirements, i.e., security and perceived quality. The perceptual quality aspect has never been thoroughly investigated. In this work, three variants of transparently encrypting JPEG2000 images are compared from a perceptual quality viewpoint. The assessment is based on subjective and objective quality assessment of the transparently encrypted images and on whether the requirements with respect to the desired functionalities can be met by the respective techniques. In p...

  11. A novel chaotic image encryption scheme using DNA sequence operations

    Science.gov (United States)

    Wang, Xing-Yuan; Zhang, Ying-Qian; Bao, Xue-Mei

    2015-10-01

    In this paper, we propose a novel image encryption scheme based on DNA (deoxyribonucleic acid) sequence operations and a chaotic system. Firstly, we perform a bitwise exclusive OR operation on the pixels of the plain image using the pseudorandom sequences produced by the spatiotemporal chaos system, i.e., the CML (coupled map lattice). Secondly, a DNA matrix is obtained by encoding the confused image using a kind of DNA encoding rule. Then we generate the new initial conditions of the CML according to this DNA matrix and the previous initial conditions, which makes the encryption result closely depend on every pixel of the plain image. Thirdly, the rows and columns of the DNA matrix are permuted. Then the permuted DNA matrix is confused once again. Finally, after decoding the confused DNA matrix using a kind of DNA decoding rule, we obtain the ciphered image. Experimental results and theoretical analysis show that the scheme is able to resist various attacks, so it has extraordinarily high security.
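
    To make the DNA-coding step concrete, the sketch below encodes bytes under one fixed DNA rule (A=00, C=01, G=10, T=11) and applies a base-wise XOR with a key sequence. The chaotic CML-driven selection of encoding rules and the permutation stages of the cited scheme are omitted here.

```python
# Sketch of DNA-coding-based diffusion under a single fixed encoding rule.
BASES = "ACGT"                        # A=00, C=01, G=10, T=11
B2I = {b: i for i, b in enumerate(BASES)}

def byte_to_dna(byte):
    return "".join(BASES[(byte >> shift) & 0b11] for shift in (6, 4, 2, 0))

def dna_to_byte(dna):
    val = 0
    for b in dna:
        val = (val << 2) | B2I[b]
    return val

def dna_xor(seq1, seq2):
    # XOR acts on the 2-bit codes of the bases.
    return "".join(BASES[B2I[a] ^ B2I[b]] for a, b in zip(seq1, seq2))

plain = [120, 45, 200]                # pixel values
key   = [33, 210, 77]                 # keystream bytes (e.g., from a chaotic map)
cipher_dna = [dna_xor(byte_to_dna(p), byte_to_dna(k)) for p, k in zip(plain, key)]
recovered  = [dna_to_byte(dna_xor(c, byte_to_dna(k))) for c, k in zip(cipher_dna, key)]
assert recovered == plain             # base-wise XOR is its own inverse
```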

  12. Bluetooth Based Chaos Synchronization Using Particle Swarm Optimization and Its Applications to Image Encryption

    Directory of Open Access Journals (Sweden)

    Tzu-Hsiang Hung

    2012-06-01

    Full Text Available This study used the complex dynamic characteristics of chaotic systems and Bluetooth to explore the topic of wireless chaotic communication secrecy and develop a communication security system. The PID controller for chaos synchronization control was applied, and the optimum parameters of this PID controller were obtained using a Particle Swarm Optimization (PSO algorithm. Bluetooth was used to realize wireless transmissions, and a chaotic wireless communication security system was developed in the design concept of a chaotic communication security system. The experimental results show that this scheme can be used successfully in image encryption.

  13. Performance Analysis of Most Common Encryption Algorithms on Different Web Browsers

    Directory of Open Access Journals (Sweden)

    R. Umarani

    2012-11-01

    Full Text Available Hacking is the greatest problem in wireless local area networks (WLANs). Many algorithms, such as DES, 3DES, AES, UMARAM, RC6 and UR5, have been used to prevent outside attackers from eavesdropping and to ensure that data are transferred to the end-user correctly. We propose a Web programming language to be analyzed with five Web browsers in terms of their performance in processing the encryption of the programming language's script. This is followed by conducting simulation tests in order to determine the best encryption algorithm for each Web browser. The results of the experimental analysis are presented in the form of graphs. We conclude from the findings that different algorithms perform differently with different Web browsers, such as Internet Explorer, Mozilla Firefox, Opera and Netscape Navigator. Hence, we determine which algorithm works best and is most compatible with which Web browser. A comparison has been conducted for these encryption algorithms at different settings, such as encryption/decryption speed, in the different Web browsers. Experimental results are given to demonstrate the effectiveness of each algorithm.

  14. Optimized Partial Image Encryption Using Pixel Position Manipulation Technique Based on Region of Interest

    Directory of Open Access Journals (Sweden)

    Parameshachari Bidare Divakarachari

    2014-10-01

    Full Text Available Today, image encryption is one of the most important means of providing confidentiality. In real-time applications, classical and modern ciphers are often not appropriate because of the vast quantity of data. However, certain applications like Pay-TV or payable Internet imaging albums do not require encryption of the entire content, but require part of the image to be transparent to all users. Partial encryption is an approach that encodes only the most essential portion of the data in order to afford proportional confidentiality, trim down the computational requirements and reduce the execution time for encryption. In this paper, partial image encryption of color images using a pixel position manipulation technique based on region of interest is proposed. It offers the amenities of partial encryption and rebuilds the images partially. Here the input image is divided into sub-blocks, and the selected blocks are then encrypted using the proposed technique. The proposed technique provides rapid security by encrypting only the selected blocks of an image.
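
    A minimal sketch of block-wise partial encryption is shown below: only the blocks marked as the region of interest are masked, here with a seeded keystream XOR standing in for the paper's pixel position manipulation.

```python
# Sketch of partial (ROI-based) image encryption: only selected blocks are masked.
import numpy as np

def encrypt_roi(img, block=8, roi_blocks=(), seed=12345):
    rng = np.random.default_rng(seed)         # keystream source (key = seed)
    out = img.copy()
    for (bi, bj) in roi_blocks:
        ys, xs = bi * block, bj * block
        ks = rng.integers(0, 256, (block, block), dtype=np.uint8)
        out[ys:ys + block, xs:xs + block] ^= ks
    return out

img = np.random.randint(0, 256, (32, 32), dtype=np.uint8)
roi = [(0, 0), (1, 2), (3, 3)]                # blocks deemed sensitive
cipher = encrypt_roi(img, roi_blocks=roi)
# Decryption re-runs the same keystream, since XOR is its own inverse.
assert np.array_equal(encrypt_roi(cipher, roi_blocks=roi), img)
```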

  15. Sparse-based multispectral image encryption via ptychography

    Science.gov (United States)

    Rawat, Nitin; Shi, Yishi; Kim, Byoungho; Lee, Byung-Geun

    2015-12-01

    Recently, we proposed a model of securing a ptychography-based monochromatic image encryption system via the classical Photon-counting imaging (PCI) technique. In this study, we examine a single-channel multispectral sparse-based photon-counting ptychography imaging (SMPI)-based cryptosystem. A ptychography-based cryptosystem creates a complex object wave field, which can be reconstructed by a series of diffraction intensity patterns through an aperture movement. The PCI sensor records only a few complex Bayer patterned samples that have been utilized in the decryption process. Sparse sensing and nonlinear properties of the classical PCI system, together with the scanning probes, enlarge the key space, and such a combination therefore enhances the system's security. We demonstrate that the sparse samples have adequate information for image decryption, as well as information authentication by means of optical correlation.

  16. An Approach of Visual Cryptography Scheme by Cumulative Image Encryption Technique Using Image-key Encryption, Bit-Sieved Operation and K-N Secret Sharing Scheme

    Directory of Open Access Journals (Sweden)

    Anupam Bhakta,

    2013-06-01

    Full Text Available Visual cryptography is a special type of encryption technique that obscures image-based secret information, which can be decrypted by the human visual system (HVS). The secret information cannot be revealed unless a certain number of shares (k or more out of n) are superimposed. As the decryption process is done by the human visual system, the secret information can be retrieved by anyone who obtains at least k shares. For this reason, simple visual cryptography is very insecure. In the current work we propose a method in which encryption is done at several levels. First we use a variable-length image key to encrypt the original image, then a bit-sieve procedure is applied to the resulting image, and lastly we perform a K-N secret sharing scheme on the final encrypted image. Decryption is done in the reverse order: the K-N secret sharing scheme, the bit-sieve method and image-key decryption, respectively. As multiple levels of encryption are used, security is increased to a great extent.
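
    For context, the classic (2,2) visual cryptography construction, in which each secret pixel expands into two subpixels per share, can be sketched as follows; the multi-level pipeline of the cited work (image key, bit-sieve, K-N sharing) is not reproduced here.

```python
# Sketch of the classic (2,2) visual cryptography share construction for a
# binary image. Stacking (OR-ing) the shares reveals the secret; one share
# alone is uniformly random.
import numpy as np

def make_shares(secret):                      # secret: 2-D array of 0/1 (1 = black)
    h, w = secret.shape
    s1 = np.zeros((h, 2 * w), dtype=np.uint8)
    s2 = np.zeros((h, 2 * w), dtype=np.uint8)
    rng = np.random.default_rng()
    for i in range(h):
        for j in range(w):
            pattern = rng.integers(0, 2)      # choose [1,0] or [0,1] at random
            a = np.array([1, 0]) if pattern else np.array([0, 1])
            s1[i, 2*j:2*j+2] = a
            # White pixel: identical patterns (stack shows one black subpixel).
            # Black pixel: complementary patterns (stack is fully black).
            s2[i, 2*j:2*j+2] = a if secret[i, j] == 0 else 1 - a
    return s1, s2

secret = np.random.randint(0, 2, (4, 4))
sh1, sh2 = make_shares(secret)
stacked = sh1 | sh2                           # physical stacking ~ pixelwise OR
# Black secret pixels become fully black (2 subpixels), white ones half black.
assert all((stacked[i, 2*j] + stacked[i, 2*j+1] == 2) == (secret[i, j] == 1)
           for i in range(4) for j in range(4))
```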

  17. Double-image encryption scheme combining DWT-based compressive sensing with discrete fractional random transform

    Science.gov (United States)

    Zhou, Nanrun; Yang, Jianping; Tan, Changfa; Pan, Shumin; Zhou, Zhihong

    2015-11-01

    A new discrete fractional random transform based on two circular matrices is designed and a novel double-image encryption-compression scheme is proposed by combining compressive sensing with discrete fractional random transform. The two random circular matrices and the measurement matrix utilized in compressive sensing are constructed by using a two-dimensional sine Logistic modulation map. Two original images can be compressed, encrypted with compressive sensing and connected into one image. The resulting image is re-encrypted by Arnold transform and the discrete fractional random transform. Simulation results and security analysis demonstrate the validity and security of the scheme.

  18. A NEW DATA HIDING ALGORITHM WITH ENCRYPTED SECRET MESSAGE USING TTJSA SYMMETRIC KEY CRYPTO SYSTEM

    OpenAIRE

    Sayak Guha

    2012-01-01

    In the present work we propose a new steganography method to hide an encrypted secret message inside a cover file by substitution in the LSB. For encrypting the secret message we have used a new algorithm, namely TTJSA, developed by Nath et al [10]. For hiding the secret message we have used a method proposed by Nath et al [2]. The TTJSA method comprises 3 distinct methods, which were also developed by Nath et al [1,7]. The methods are MSA [Meheboob, Saima and Asoke][1], NJJSAA [Neeraj, Joel, Joyshr...

  19. Vulnerability issues on research in WLAN encryption algorithms WEP WPA/WPA2 Personal

    International Nuclear Information System (INIS)

    This paper presents historic and new evidence that wireless encryption algorithms can be cracked or even bypassed, as has been proved by other researchers. The paper describes how WEP and WPA/WPA2 Personal encrypt data and how the passphrase is shared between the nodes of the network. Modern tools available on the internet have been evaluated, decomposed and tested to provide evidence on the reliability of passwords. A number of criteria are used to compare the tools and their efficiency.

  20. Parallel Vectorized Algebraic AES in MATLAB for Rapid Prototyping of Encrypted Sensor Processing Algorithms and Database Analytics

    OpenAIRE

    Kepner, Jeremy; Gadepally, Vijay; Hancock, Braden; Michaleas, Peter; Michel, Elizabeth; Varia, Mayank

    2015-01-01

    The increasing use of networked sensor systems and networked databases has led to an increased interest in incorporating encryption directly into sensor algorithms and database analytics. MATLAB is the dominant tool for rapid prototyping of sensor algorithms and has extensive database analytics capabilities. The advent of high level and high performance Galois Field mathematical environments allows encryption algorithms to be expressed succinctly and efficiently. This work l...

  1. Compressive optical image encryption with two-step-only quadrature phase-shifting digital holography

    Science.gov (United States)

    Li, Jun; Li, Hongbing; Li, Jiaosheng; Pan, Yangyang; Li, Rong

    2015-06-01

    An image encryption method which combines two-step-only quadrature phase-shifting digital holography with compressive sensing (CS) has been proposed in the fully optical domain. An object image is first encrypted to two on-axis quadrature-phase holograms using the two random phase masks in the Mach-Zehnder interferometer. Then, the two encrypted images are highly compressed to a one-dimensional signal using single-pixel compressive holographic imaging in the optical domain. At the receiving terminal, the two compressive encrypted holograms are exactly reconstructed from far fewer observations than the Nyquist sampling number by solving an optimization problem, and the original image can be decrypted with only the two reconstructed holograms and the correct keys. This method largely decreases the hologram data volume for the current optical image encryption system, and it is also suitable for some special optical imaging cases such as different-wavelength imaging and weak light imaging. Numerical simulation is performed to demonstrate the feasibility and validity of this novel image encryption method.

  2. An Improved FPGA Implementation of the Modified Hybrid Hiding Encryption Algorithm (MHHEA) for Data Communication Security

    OpenAIRE

    Hala A. Farouk; Saeb, Magdy

    2005-01-01

    The hybrid hiding encryption algorithm, as its name implies, embraces concepts from both steganography and cryptography. In this work, an improved micro-architecture Field Programmable Gate Array (FPGA) implementation of this algorithm is presented. This design overcomes the observed limitations of a previously designed micro-architecture. These observed limitations are: no exploitation of the possibility of parallel bit replacement, and the fact that the input plaintext...

  3. Ensemble Of Blowfish With Chaos Based S Box Design For Text And Image Encryption

    Directory of Open Access Journals (Sweden)

    Jeyamala Chandrasekaran

    2011-08-01

    Full Text Available The rapid and extensive usage of the Internet in the present decade has made information security an utmost concern. Most commercial transactions taking place over the Internet involve a wide variety of data, including text, images, audio and video. With the increasing use of digital techniques for transmitting and storing multimedia data, the fundamental issue of protecting the confidentiality, integrity and authenticity of the information poses a major challenge for security professionals and has led to major developments in cryptography. In cryptography, an S-box (substitution box) is a basic component of symmetric key algorithms which performs substitution and is typically used to make the relationship between the key and the ciphertext nonlinear; most symmetric key algorithms, such as DES and Blowfish, make use of S-boxes. This paper proposes a new method for the design of S-boxes based on chaos theory. Chaotic equations are popularly known for their randomness, extreme sensitivity to initial conditions and ergodicity. The modified design has been tested with the Blowfish algorithm, which has no effective cryptanalysis reported against its design to date because of its salient design features, including key-dependent S-boxes and a complex key generation process. However, every new key requires pre-processing equivalent to encrypting about 4 kilobytes of text, which is very slow compared to other block ciphers and prevents its usage in memory-limited applications and embedded systems. The modified design of S-boxes maintains the nonlinearity [3][5] and key dependency factors of S-boxes with a major reduction in the time complexity of generating the S-boxes and P-arrays. The algorithm has been implemented and the proposed design has been analyzed for key space size, key sensitivity and avalanche effect. Experimental results on text and image encryption show that the modified key generation design continues to offer the same level of security as the original Blowfish cipher with less computational overhead in key generation.
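
    One common way to derive a key-dependent, bijective 8-bit S-box from a chaotic map is to rank a logistic-map trajectory, as sketched below. The initial condition plays the role of the key; the integration with Blowfish's key schedule described in the paper is not shown, and the parameters are illustrative.

```python
# Sketch of chaos-based S-box generation: iterate the logistic map and rank
# the trajectory to obtain a key-dependent permutation of 0..255.
import numpy as np

def chaotic_sbox(x0=0.7131, r=3.9999, burn_in=1000):
    x = x0
    for _ in range(burn_in):                  # discard transient iterations
        x = r * x * (1.0 - x)
    samples = np.empty(256)
    for i in range(256):
        x = r * x * (1.0 - x)
        samples[i] = x
    # Ranking the chaotic samples yields a bijection on {0, ..., 255}.
    return np.argsort(samples).astype(np.uint8)

sbox = chaotic_sbox()
assert sorted(sbox) == list(range(256))       # bijective, hence invertible
inv_sbox = np.argsort(sbox)                   # inverse S-box for decryption
```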

  4. AES Encryption Algorithm Hardware Implementation: Throughput and Area Comparison of 128, 192 and 256-bits Key

    Directory of Open Access Journals (Sweden)

    Samir El Adib

    2012-06-01

    Full Text Available The Advanced Encryption Standard (AES) was adopted by the National Institute of Standards and Technology (NIST) to replace the existing Data Encryption Standard (DES), and it is the most widely used encryption algorithm in many security applications. To date, the AES standard has key size variants of 128, 192, and 256 bits, where longer keys provide more secure ciphertext output. From the hardware perspective, a bigger key size also means a bigger area and lower throughput. Some companies that employ ultra-high security in their systems may look for a key size bigger than 128-bit AES. In this paper, 128, 192 and 256-bit AES hardware implementations are compared in terms of throughput and area. The target hardware used in this paper is the Virtex XC5VLX50 FPGA from Xilinx. Total area and throughput results are presented and graphically compared.

  5. Applying Encryption Algorithm for Data Security and Privacy in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Mohit Marwaha

    2013-01-01

    Full Text Available Cloud computing is the next big thing after the internet in the field of information technology; some say it is a metaphor for the internet. It is an Internet-based computing technology in which software, shared resources and information are provided to consumers and devices on demand, as per users' requirements, on a pay-per-use model. Even though the cloud continues to grow in popularity, usability and respectability, problems with data protection, data privacy and other security issues remain a major setback in the field of cloud computing. Privacy and security are the key issues for cloud storage. Encryption is a well-known technology for protecting sensitive data. A combination of public- and private-key encryption is used to hide users' sensitive data and to support ciphertext retrieval. The paper analyzes the feasibility of applying encryption algorithms for data security and privacy in cloud storage.

  6. A NOVEL APPROACH OF HYBRID MODEL OF ENCRYPTION ALGORITHMS AND FRAGMENTATION TO ENSURE CLOUD SECURITY

    Directory of Open Access Journals (Sweden)

    Amandeep Kaur

    2015-09-01

    Full Text Available Cloud is a term used as a metaphor for wide area networks (like the internet) or any such large networked environment. It came partly from the cloud-like symbol used to represent the complexities of networks in schematic diagrams. It represents all the complexities of the network, which may include everything from cables, routers, servers and data centers to other such devices. Cloud-based systems store the data of multiple organizations on shared hardware. Data segregation is done by encrypting users' data, but encryption alone is not a complete solution. In the proposed work, we have tried to increase cloud security by using encryption algorithms like AES and RSA along with OTP authentication. We have also fragmented the data by using data distribution at the server end.
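
    The hybrid idea (a symmetric cipher for the bulk data and an asymmetric cipher to protect the symmetric key) can be sketched with the widely used `cryptography` package as below; the OTP authentication and server-side fragmentation described in the paper are outside this snippet, and the key sizes are illustrative.

```python
# Sketch of hybrid encryption: AES-GCM for the data, RSA-OAEP to wrap the AES key.
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

def hybrid_encrypt(data: bytes):
    aes_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(aes_key).encrypt(nonce, data, None)
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    wrapped_key = public_key.encrypt(aes_key, oaep)   # only the key holder can unwrap
    return wrapped_key, nonce, ciphertext

def hybrid_decrypt(wrapped_key, nonce, ciphertext):
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    aes_key = private_key.decrypt(wrapped_key, oaep)
    return AESGCM(aes_key).decrypt(nonce, ciphertext, None)

blob = b"tenant data stored in the cloud"
assert hybrid_decrypt(*hybrid_encrypt(blob)) == blob
```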

  7. Asymmetric multiple-image encryption based on the cascaded fractional Fourier transform

    Science.gov (United States)

    Li, Yanbin; Zhang, Feng; Li, Yuanchao; Tao, Ran

    2015-09-01

    A multiple-image cryptosystem is proposed based on the cascaded fractional Fourier transform. During an encryption procedure, each of the original images is directly separated into two phase masks. A portion of the masks is subsequently modulated into an interim mask, which is encrypted into the ciphertext image; the others are used as the encryption keys. Using phase truncation in the fractional Fourier domain, one can use an asymmetric cryptosystem to produce a real-valued noise-like ciphertext, while a legal user can reconstruct all of the original images using a different group of phase masks. The encryption key is an indivisible part of the corresponding original image and is still useful during decryption. The proposed system has high resistance to various potential attacks, including the chosen-plaintext attack. Numerical simulations also demonstrate the security and feasibility of the proposed scheme.

  8. A sensitive data extraction algorithm based on the content associated encryption technology for ICS

    Science.gov (United States)

    Wang, Wei; Hao, Huang; Xie, Changsheng

    With the development of HD video, the protection of copyright becomes more complicated, and more advanced copyright protection technology is needed. Traditional digital copyright protection technology generally uses a direct or selective encryption algorithm, and the key is not associated with the video content [1]. Once the encryption method is cracked or the key is stolen, the copyright of the video will be violated. To address this issue, this paper proposes a Sensitive Data Extraction Algorithm (SDEA) based on content-associated encryption technology which applies to the Internet Certification Service (ICS). The principle of content-associated encryption is to extract some data from the video and use the extracted data as the key to encrypt the rest of the data. The extracted part of the video is called the sensitive data, and the remaining part is called the main data. After extraction, the main data will not play, or will play poorly. The encrypted sensitive data reach the terminal device through a safety-certificated network, and the main data arrive on the ICS disc. The terminal equipment is responsible for synthesizing and playing these two parts of the data. Consequently, even if the main data on the disc are illegally obtained, the video cannot be played normally due to the lack of the necessary sensitive data. Experiments show that ICS using SDEA can effectively corrupt the video with a 0.25% extraction rate, and the corrupted video cannot be played well. It also guarantees the consistency of the destructive effect on different videos with different contents. The sensitive data can be transported smoothly under home Internet bandwidth.

  9. Optical interference-based image encryption using circular harmonic expansion and spherical illumination in gyrator transform domain

    Science.gov (United States)

    Wang, Qu; Guo, Qing; Lei, Liang; Zhou, Jinyun

    2015-07-01

    In this paper, a new optical interference-based encryption method using off-axis circular harmonic component (CHC) expansion and iterative phase retrieval algorithm in gyrator transform (GT) domain is proposed. Off-axis CHC expansion is employed to divide the inverse GT spectrum of primitive image into two parts: the zero-order CHC and the sum of the other CHCs. The sum term of CHCs is further encrypted into a complex image whose amplitude constraint is devised to be the amplitude of zero-order CHC by the iterative retrieval GT algorithm. The amplitude part of CHC is the final ciphertext which has rotation-symmetric distribution. Three phase-only keys, the main keys of this proposal, are also calculated during the digital encryption process. To recover the primitive image correctly, two identical ciphertexts placed in the two interference branch should be illuminated by two spherical waves with required parameters (wavelength and radius). Moreover, rotational center of ciphertexts must be placed in a predefined position, which is off the optical axis. The transform angles of GTs, the propagation parameters of spherical waves and the relative position of rotational center of ciphertext are sensitive additional keys for correct retrieval. Numerical simulation tests have been carried out to verify the effectiveness of the proposed scheme.

  10. Fast ghost imaging and ghost encryption based on the discrete cosine transform

    International Nuclear Information System (INIS)

    We introduce the discrete cosine transform as an advanced compression tool for images in computational ghost imaging. A novel approach to fast imaging and encryption, the discrete cosine transform, promotes the security level of ghost images and reduces the image retrieval time. To discuss the advantages of this technique we compare experimental outcomes with simulated ones. (paper)

  11. Optical transformation based image encryption and data embedding techniques using MATLAB

    Science.gov (United States)

    Bhattacharya, Debalina; Ghosh, Ajay

    2015-06-01

    The proposed work describes optical-transformation-based encryption and decryption of images, such as Fourier-transform and Fresnel-transform methods, using random phase masks (RPMs). The encrypted images have been embedded in secret cover files of other formats, like text files, word files, audio files, etc., to increase robustness in security applications. So, if anyone wants to send confidential documents, it will be difficult for an interloper to unhide the secret information. The whole work has been done in MATLAB®.

  12. A Modified Location-Dependent Image Encryption for Mobile Information System

    Directory of Open Access Journals (Sweden)

    Prasad Reddy.P.V.G.D

    2010-05-01

    Full Text Available The widespread use of WLANs (wireless LANs) and the popularity of mobile devices increase the frequency of data transmission among mobile users. In such a scenario, a need for secure communication arises. Secure communication is possible through encryption of data. A lot of encryption techniques have evolved over time; however, most data encryption techniques are location-independent. Data encrypted with such techniques can be decrypted anywhere, and the encryption technology cannot restrict the location of data decryption. GPS-based encryption (or geoencryption) is an innovative technique that uses GPS technology to encode location information into the encryption keys to provide location-based security. In this paper, a location-dependent approach is proposed for mobile information systems. The mobile client transmits a target latitude/longitude coordinate to the information server, and an LDEA key is obtained for data encryption. The client can only decrypt the ciphertext when the coordinate acquired from the GPS receiver matches the target coordinate. For improved security, a random key (R-key) is incorporated in addition to the LDEA key. The proposed method is applied to images.

  13. Cloud Computing: A CRM Service Based on a Separate Encryption and Decryption using Blowfish algorithm

    Directory of Open Access Journals (Sweden)

    G.Devi, M.Pramod Kumar

    2012-08-01

    Full Text Available An LMS (Learning Management System) service using the Blowfish algorithm is described in this project. It gives LMS service providers more accessibility to send their training modules and syllabi via the Internet at any hour of the day, much more efficiently. This gives rise to a reduced cost of hardware and software tools, which in return would scale up the e-learning environment. In the existing system the RSA algorithm is used; it requires more computation time for large volumes of data. To reduce this computation time, we use the Blowfish algorithm. The LMS service utilizes three cloud systems, including an encryption and decryption system, a storage system, and an LMS application system.

  14. Analysis and Comparison of Selective Encryption Algorithms with Full Encryption for Wireless Networks

    OpenAIRE

    Pavithra. C; Vinod. B. Durdi

    2013-01-01

    Cryptography has been widely accepted as a traditional platform for data protection for decades. The most significant and efficient cryptosystems these days are symmetric key algorithms; hence, they have a very wide range of applications in many realms. Ad-hoc networks are the most commonly used type in the present scenario because of their non-fixed infrastructure. Providing security to such kinds of networks is the main objective of the work here. In this project, we prese...

  15. Encrypting image by assembling the fractal-image addition method and the binary encoding method

    Science.gov (United States)

    Lin, Kuang Tsan; Yeh, Sheng Lih

    2012-05-01

    The fractal-image addition method and the binary encoding method are assembled into a hybrid method for encrypting a digital covert image. For this hybrid method, a host image is used to create an overt image carrying the information of the covert image. First, the fractal-image addition method is used to add some fractal images and the covert image to form an image-mixing matrix. Then, all the pixel values of the image-mixing matrix are transferred into binary data. Finally, the binary data are encoded into the host image to create an overt image. The pixels of the overt image contain eight groups of codes used for reconstructing the covert image. The eight groups of codes are identification codes, row amount codes, covert-image dimension codes, fractal-image amount codes, starting-pixel codes, character amount codes, character codes, and information codes. The overt image and the host image look almost the same to the eye. Furthermore, the covert image can be reconstructed directly from the overt image without using the host image. The most important feature is that the reconstructed covert image is identical to the original covert image, i.e., there is no distortion in the decoding.

  16. Multi-image encryption based on synchronization of chaotic lasers and iris authentication

    Science.gov (United States)

    Banerjee, Santo; Mukhopadhyay, Sumona; Rondoni, Lamberto

    2012-07-01

    A new technique for transmitting encrypted combinations of grayscale and chromatic images using chaotic lasers derived from the Maxwell-Bloch equations has been proposed. This novel scheme utilizes the general method of solution of a set of linear equations to transmit similarly sized heterogeneous images which are a combination of monochrome and chromatic images. The chaos-encrypted grayscale images are concatenated along the three color planes, resulting in color images. These are then transmitted over a secure channel along with a cover image, which is an iris scan. The entire cryptology is augmented with an iris-based authentication scheme. The secret messages are retrieved once the authentication is successful. The objectives of our work are briefly outlined as follows: (a) the biometric information is the iris, which is encrypted before transmission; (b) the iris is used for personal identification and for verifying message integrity; (c) the information transmitted securely consists of colored images resulting from a combination of gray images; (d) each of the images transmitted is encrypted through chaos-based cryptography; (e) these encrypted multiple images are then coupled with the iris through a linear combination of images before being communicated over the network. The several layers of encryption, together with the ergodicity and randomness of chaos, render enough confusion and diffusion properties to guarantee a fool-proof approach to achieving secure communication, as demonstrated by exhaustive statistical methods. The result is vital from the perspective of opening a fundamentally new dimension in multiplexing and simultaneous transmission of several monochromatic and chromatic images along with biometry-based authentication and cryptography.

  17. Optical binary image encryption using aperture-key and dual wavelengths.

    Science.gov (United States)

    Wang, Xiaogang; Chen, Wen; Chen, Xudong

    2014-11-17

    We described a method where the secret binary image that has been encoded into a single amplitude pattern in Fresnel domain can be recovered based on phase retrieval with an aperture-key and wavelength keys, and no holographic recording is needed in the encryption. The predesigned aperture-key not only realizes the intensity modulation of the encrypted image, but also helps to retrieve the secret image with high quality. All the necessary decryption keys can be kept in digital form that facilitates data transmission and loading in image retrieval process. Numerical simulation results are given for testing the validity and security of the proposed approach. PMID:25402048

  18. A New Loss-Tolerant Image Encryption Scheme Based on Secret Sharing and Two Chaotic Systems

    Directory of Open Access Journals (Sweden)

    Li Li

    2012-04-01

    Full Text Available In this study, we propose an efficient loss-tolerant image encryption scheme that protects both confidentiality and loss-tolerance simultaneously in shadow images. In this scheme, we generate the key sequence based on two chaotic maps and then encrypt the image during the sharing phase based on Shamir's method. Experimental results show better performance of the proposed scheme for different images than other methods from the standpoint of human vision. Security analysis confirms a high probability of resisting both brute-force and collusion attacks.

  19. Implementation of Optimized DES Encryption Algorithm upto 4 Round on Spartan 3

    Directory of Open Access Journals (Sweden)

    Nimmi Gupta

    2012-02-01

    Full Text Available Data security is an important parameter for industry. It can be achieved by encryption algorithms, which are used to prevent unauthorized access to data. Cryptography is the science of keeping data transfer secure, so that eavesdroppers (or attackers) cannot decipher the transmitted message. In this paper the DES algorithm is optimized up to 4 rounds using Xilinx software and implemented on a Spartan-3 with ModelSim. The paper deals with various parameters, such as variable key length and the key generation mechanism, used in order to provide optimized results.

  20. Grayscale image encryption using a hyperchaotic unstable dissipative system.

    Czech Academy of Sciences Publication Activity Database

    Ontanon-García, L.J.; García-Martínez, M.; Campos-Cantón, E.; Čelikovský, Sergej

    London : IEEE, 2013, pp. 508-512. ISBN 978-1-908320-16-2. [The 8th International Conference for Internet Technology and Secured Transactions (ICITST-2013). London (GB), 09.12.2013-12.12.2013] R&D Projects: GA ČR(CZ) GAP103/12/1794 Institutional support: RVO:67985556 Keywords : Hyperchaos * piecewise linear systems * multi-scrolls * chaotic encryption * stream cypher encryption Subject RIV: BC - Control Systems Theory

  1. A symmetric image encryption scheme based on 3D chaotic cat maps

    International Nuclear Information System (INIS)

    Encryption of images is different from that of texts due to some intrinsic features of images such as bulk data capacity and high redundancy, which are generally difficult to handle by traditional methods. Due to the exceptionally desirable properties of mixing and sensitivity to initial conditions and parameters of chaotic maps, chaos-based encryption has suggested a new and efficient way to deal with the intractable problem of fast and highly secure image encryption. In this paper, the two-dimensional chaotic cat map is generalized to 3D for designing a real-time secure symmetric encryption scheme. This new scheme employs the 3D cat map to shuffle the positions (and, if desired, grey values as well) of image pixels and uses another chaotic map to confuse the relationship between the cipher-image and the plain-image, thereby significantly increasing the resistance to statistical and differential attacks. Thorough experimental tests are carried out with detailed analysis, demonstrating the high security and fast encryption speed of the new scheme
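
    As a rough illustration of the permutation stage described in this record, the sketch below applies the classical 2D Arnold cat map to shuffle the pixel positions of a square grayscale image. It is only a 2D analogue of the paper's 3D construction, and the parameters `a`, `b` and the number of rounds are arbitrary illustrative choices, not values taken from the paper.

    ```python
    import numpy as np

    def arnold_cat_permute(img, a=1, b=1, rounds=10):
        """Shuffle pixel positions of a square image with the 2D Arnold cat map.

        Each pixel at (x, y) moves to ((x + a*y) mod N, (b*x + (a*b + 1)*y) mod N).
        The map has determinant 1, so it is a bijection on the N x N grid and can
        be undone by iterating the inverse map the same number of rounds.
        """
        n = img.shape[0]
        assert img.shape[0] == img.shape[1], "cat map needs a square image"
        out = img.copy()
        for _ in range(rounds):
            x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
            new_x = (x + a * y) % n
            new_y = (b * x + (a * b + 1) * y) % n
            shuffled = np.empty_like(out)
            shuffled[new_x, new_y] = out[x, y]
            out = shuffled
        return out

    if __name__ == "__main__":
        img = np.arange(64, dtype=np.uint8).reshape(8, 8)
        print(arnold_cat_permute(img, a=3, b=5, rounds=7))
    ```

    In a full scheme such as the one above, this position shuffling would be followed by a separate diffusion stage driven by another chaotic map.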

  2. A Novel Image Encryption Scheme Based on Multi-orbit Hybrid of Discrete Dynamical System

    Directory of Open Access Journals (Sweden)

    Ruisong Ye

    2014-10-01

    Full Text Available A multi-orbit hybrid image encryption scheme based on discrete chaotic dynamical systems is proposed. One generalized Arnold map is adopted to generate three orbits from three initial conditions. Another chaotic dynamical system, the tent map, is applied to generate a pseudo-random sequence that determines, for each hybrid orbit point, which of the three orbits of the generalized Arnold map it is drawn from. The hybrid orbit sequence is then utilized to shuffle the pixel positions of the plain-image so as to obtain a permuted image. To enhance the encryption security, two rounds of pixel gray value diffusion are employed as well. The proposed encryption scheme is simple and easy to manipulate. The security and performance of the proposed image encryption have been analyzed, including histograms, correlation coefficients, information entropy, key sensitivity analysis, key space analysis, differential analysis, etc. All the experimental results suggest that the proposed image encryption scheme is robust and secure and can be used for secure image and video communication applications.

  3. High performance optical encryption based on computational ghost imaging with QR code and compressive sensing technique

    Science.gov (United States)

    Zhao, Shengmei; Wang, Le; Liang, Wenqiang; Cheng, Weiwen; Gong, Longyan

    2015-10-01

    In this paper, we propose a high performance optical encryption (OE) scheme based on computational ghost imaging (GI) with a QR code and the compressive sensing (CS) technique, named the QR-CGI-OE scheme. N random phase screens, generated by Alice, serve as the secret key and are shared with the authorized user, Bob. The information is first encoded by Alice with a QR code, and the QR-coded image is then encrypted with the aid of a computational ghost imaging optical system. Here, the measurement results from the GI optical system's bucket detector are the encrypted information and are transmitted to Bob. With the key, Bob decrypts the encrypted information to obtain the QR-coded image with the GI and CS techniques, and further recovers the information by QR decoding. The experimental and numerically simulated results show that the authorized users can recover the original image completely, whereas the eavesdroppers cannot acquire any information about the image even when the eavesdropping ratio (ER) is up to 60% at the given number of measurements. For the proposed scheme, the number of bits sent from Alice to Bob is reduced considerably and the robustness is enhanced significantly. Meanwhile, the number of measurements in the GI system is reduced and the quality of the reconstructed QR-coded image is improved.

  4. Image encryption based on a delayed fractional-order chaotic logistic system

    International Nuclear Information System (INIS)

    A new image encryption scheme is proposed based on a delayed fractional-order chaotic logistic system. In the process of generating a key stream, the time-varying delay and fractional derivative are embedded in the proposed scheme to improve the security. Such a scheme is described in detail with security analyses including correlation analysis, information entropy analysis, run statistic analysis, mean-variance gray value analysis, and key sensitivity analysis. Experimental results show that the newly proposed image encryption scheme possesses high security. (general)
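
    For readers unfamiliar with the generic "chaotic keystream XOR" step that such schemes build on, here is a minimal sketch using a plain logistic map. The cited scheme adds a time-varying delay and a fractional-order derivative, neither of which is reproduced here, and the key values x0 and r are illustrative assumptions.

    ```python
    import numpy as np

    def logistic_keystream(x0, r, n, burn_in=200):
        """Generate n keystream bytes from the logistic map x <- r*x*(1-x)."""
        x = x0
        for _ in range(burn_in):            # discard transient iterations
            x = r * x * (1.0 - x)
        stream = np.empty(n, dtype=np.uint8)
        for i in range(n):
            x = r * x * (1.0 - x)
            stream[i] = int(x * 256) % 256  # quantise the orbit to a byte
        return stream

    def xor_diffuse(image, x0=0.3761, r=3.99):
        """XOR every pixel with the chaotic keystream (self-inverse)."""
        flat = image.reshape(-1)
        ks = logistic_keystream(x0, r, flat.size)
        return (flat ^ ks).reshape(image.shape)

    if __name__ == "__main__":
        img = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
        enc = xor_diffuse(img)
        dec = xor_diffuse(enc)              # same keys recover the plain image
        assert np.array_equal(img, dec)
    ```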

  5. An Image Encryption Scheme Based on 2D Tent Map and Coupled Map Lattice

    Directory of Open Access Journals (Sweden)

    Ruisong Ye

    2011-12-01

    Full Text Available This paper proposes a chaos-based image encryption scheme in which one 2D tent map with two control parameters is utilized to generate chaotic orbits that scramble the pixel positions, while one coupled map lattice is employed to yield random gray value sequences that change the gray values and thereby enhance the security. Experiments are carried out with detailed analysis to demonstrate that the proposed image encryption scheme possesses a large key space to resist brute-force attack and good statistical properties to frustrate statistical analysis attacks. Experiments are also performed to illustrate the robustness against malicious attacks like cropping, noising and JPEG compression.

  6. A novel chaotic based image encryption using a hybrid model of deoxyribonucleic acid and cellular automata

    Science.gov (United States)

    Enayatifar, Rasul; Sadaei, Hossein Javedani; Abdullah, Abdul Hanan; Lee, Malrey; Isnin, Ismail Fauzi

    2015-08-01

    Currently, many studies have been conducted on improving the security of digital images in order to protect such data while they are sent over the internet. This work proposes a new approach based on a hybrid model of the Tinkerbell chaotic map, deoxyribonucleic acid (DNA) and cellular automata (CA). DNA rules, the DNA sequence XOR operator and CA rules are used simultaneously to encrypt the plain-image pixels. To determine the rule number in the DNA sequence and also in the CA, a 2-dimensional Tinkerbell chaotic map is employed. Experimental results and computer simulations both confirm that the proposed scheme not only demonstrates outstanding encryption, but also resists various typical attacks.
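
    The DNA part of such hybrid schemes can be illustrated compactly. The sketch below encodes bytes with one fixed DNA rule and applies the DNA XOR operator; the paper instead selects rule numbers per pixel with a Tinkerbell map and additionally applies cellular automata rules, so this is only a simplified, assumed form of the DNA step.

    ```python
    # One of the eight commonly used DNA encoding rules (an illustrative choice):
    # two bits per base, so one byte becomes four bases.
    BITS_TO_BASE = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
    BASE_TO_BITS = {v: k for k, v in BITS_TO_BASE.items()}

    def byte_to_dna(b):
        """Encode one byte as four DNA bases (most significant bit pair first)."""
        return "".join(BITS_TO_BASE[(b >> shift) & 0b11] for shift in (6, 4, 2, 0))

    def dna_to_byte(seq):
        """Decode four DNA bases back into a byte."""
        b = 0
        for base in seq:
            b = (b << 2) | BASE_TO_BITS[base]
        return b

    def dna_xor(seq1, seq2):
        """DNA XOR: XOR the 2-bit codes of corresponding bases."""
        return "".join(BITS_TO_BASE[BASE_TO_BITS[a] ^ BASE_TO_BITS[b]]
                       for a, b in zip(seq1, seq2))

    if __name__ == "__main__":
        pixel, key = 0x5A, 0xC3
        cipher = dna_xor(byte_to_dna(pixel), byte_to_dna(key))
        # XOR with the same key sequence recovers the plain pixel
        assert dna_to_byte(dna_xor(cipher, byte_to_dna(key))) == pixel
    ```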

  7. Performance Analysis of Most Common Encryption Algorithms on Different Web Browsers

    OpenAIRE

    R.Umarani; G.Ramesh

    2012-01-01

    Hacking is the greatest problem in the wireless local area network (WLAN). Many algorithms like DES, 3DES, AES, UMARAM, RC6 and UR5 have been used to prevent outside attacks that eavesdrop or prevent the data from being transferred to the end-user correctly. We have proposed a Web programming language to be analyzed with five Web browsers in terms of their performance in processing the encryption of the programming language’s script with the Web browsers. This is followed by conducting tests sim...

  8. A fast image encryption system based on chaotic maps with finite precision representation

    International Nuclear Information System (INIS)

    In this paper, a fast chaos-based image encryption system with a stream cipher structure is proposed. In order to achieve a fast throughput and facilitate hardware realization, 32-bit precision representation with fixed point arithmetic is assumed. The major core of the encryption system is a pseudo-random keystream generator based on a cascade of chaotic maps, serving the purpose of sequence generation and random mixing. Unlike other existing chaos-based pseudo-random number generators, the proposed keystream generator not only achieves a very fast throughput, but also passes the statistical tests of an up-to-date test suite even under quantization. The overall design of the image encryption system is explained, while detailed cryptanalysis is given and compared with some existing schemes.
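
    A minimal sketch of the finite-precision idea, assuming a single logistic map iterated in 32-bit fixed-point (Q0.32) arithmetic; the paper's generator actually cascades several maps with random mixing, and a hardware version would use a bounded-width accumulator rather than Python's arbitrary-precision integers.

    ```python
    FRAC_X = 32            # x is stored as an unsigned Q0.32 fixed-point number
    FRAC_R = 29            # r is stored as Q3.29 (values up to 4.0 fit)

    def to_fixed_x(x):  return int(x * (1 << FRAC_X))
    def to_fixed_r(r):  return int(r * (1 << FRAC_R))

    def logistic_fixed(x_fx, r_fx):
        """One logistic-map step x <- r*x*(1-x) using only integer arithmetic."""
        one = 1 << FRAC_X
        t = (x_fx * (one - x_fx)) >> FRAC_X   # x*(1-x), still Q0.32
        return (r_fx * t) >> FRAC_R           # multiply by r, back to Q0.32

    def keystream_bytes(x0=0.6180339887, r=3.9876, n=16):
        """Emit n bytes by taking the top 8 bits of each fixed-point iterate."""
        x_fx, r_fx = to_fixed_x(x0), to_fixed_r(r)
        out = bytearray()
        for _ in range(n):
            x_fx = logistic_fixed(x_fx, r_fx)
            out.append((x_fx >> (FRAC_X - 8)) & 0xFF)
        return bytes(out)

    if __name__ == "__main__":
        print(keystream_bytes().hex())
    ```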

  9. A NEW TECHNIQUE BASED ON CHAOTIC STEGANOGRAPHY AND ENCRYPTION TEXT IN DCT DOMAIN FOR COLOR IMAGE

    Directory of Open Access Journals (Sweden)

    MELAD J. SAEED

    2013-10-01

    Full Text Available Image steganography is the art of hiding information in a cover image. This paper presents a new technique based on chaotic steganography and text encryption in the DCT domain for color images, where the DCT is used to transform the original image (cover image) from the spatial domain to the frequency domain. This technique uses a chaotic function in two phases: first, for encrypting the secret message; second, for embedding it in the DCT of the cover image. With this new technique, good results are obtained by satisfying the important properties of steganography, such as imperceptibility, evaluated by mean square error (MSE), peak signal-to-noise ratio (PSNR) and normalized correlation (NC), and capacity, improved by encoding the secret message characters with variable-length codes and embedding the secret message in only one level of the color image.

  10. An Implementation of BLOWFISH Encryption Algorithm using KERBEROS Authentication Mechanism

    Directory of Open Access Journals (Sweden)

    Ch Panchamukesh

    2011-07-01

    Full Text Available The sensitive information stored on computers and transmitted over the Internet needs information security and safety measures. Without our knowledge, intruders sneak into systems, misuse them and even create back doors to our computer systems. Thus, there must not be any compromise in securing our resources. Hence, cryptography is mainly used to ensure secrecy. An access control policy is used for securing the resources as an initial step; it determines the potential threats, the solutions and the ways of implementing the security. The various security solutions that block unauthenticated users range from firewalls to Kerberos, and most of them need a strong cryptographic base. Cryptography provides solutions for four different security areas: confidentiality, authentication, integrity and control of interaction between different parties involved in data exchange, which ultimately lead to the security of information. Among these, Kerberos authentication promises to be the most secure and unbreakable. It works on the basis of granting tickets for each session and resource access. This paper includes a mechanism that implements the Blowfish algorithm with a 64-bit key length and improved security assurance.

  11. A Lightweight White-Box Symmetric Encryption Algorithm against Node Capture for WSNs

    Directory of Open Access Journals (Sweden)

    Yang Shi

    2015-05-01

    Full Text Available Wireless Sensor Networks (WSNs) are often deployed in hostile environments and, thus, nodes can potentially be captured by an adversary. This is a typical white-box attack context, i.e., the adversary may have total visibility of the implementation of the built-in cryptosystem and full control over its execution platform. Handling white-box attacks in a WSN scenario is a challenging task. Existing encryption algorithms for white-box attack contexts require a large memory footprint and, hence, are not applicable to wireless sensor network scenarios. As a countermeasure against the threat in this context, in this paper we propose a class of lightweight, secure implementations of the symmetric encryption algorithm SMS4. The basic idea of our approach is to merge several steps of the round function of SMS4 into table lookups, blended by randomly generated mixing bijections. Therefore, the size of the implementations is significantly reduced while keeping the same security efficiency. The security and efficiency of the proposed solutions are theoretically analyzed. Evaluation shows our solutions satisfy the requirements of sensor nodes in terms of limited memory size and low computational costs.

  12. An Image Encryption Scheme Based on Hyperchaotic Rabinovich and Exponential Chaos Maps

    Directory of Open Access Journals (Sweden)

    Xiaojun Tong

    2015-01-01

    Full Text Available This paper proposes a new four-dimensional hyperchaotic map based on the Rabinovich system to realize chaotic encryption in higher dimensions and improve the security. The chaotic sequences generated by the Runge-Kutta method are combined with the chaotic sequences generated by an exponential chaos map to generate key sequences. The key sequences are used for image encryption. The security test results indicate that the new hyperchaotic system has high security and complexity. The comparison between the new hyperchaotic system and several low-dimensional chaotic systems shows that the proposed system performs more efficiently.

  13. Enhancing RSA algorithm using Mersenne Primes with reduced size of encrypted file

    Directory of Open Access Journals (Sweden)

    Shilpa Madhaorao Pund

    2013-05-01

    Full Text Available Message passing from source to destination is one of the important aspects of communication. However, it is often required that this message be transmitted secretly, so that no unauthorized person gains knowledge of its contents. Retaining the confidentiality of the transmitted message is a challenging task, as it needs to be guaranteed that the message arrives in the right hands exactly as it was transmitted. Another challenge is transmitting the message over a public, insecure channel. In this paper, the RSA algorithm is implemented using Mersenne primes, which guarantees primality. This is an enhanced algorithm which increases the strength of RSA by generating large prime numbers and also reduces the size of the encrypted file.
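
    A toy sketch of the mechanics described above, assuming the Mersenne exponents 13 and 17. These moduli are far too small for real security, and since only a handful of Mersenne primes are known they could not remain secret in practice, so this only illustrates the construction.

    ```python
    def lucas_lehmer(p):
        """Lucas-Lehmer primality test for the Mersenne number M_p = 2**p - 1."""
        if p == 2:
            return True
        m = (1 << p) - 1
        s = 4
        for _ in range(p - 2):
            s = (s * s - 2) % m
        return s == 0

    def rsa_from_mersenne(exp1=13, exp2=17, e=65537):
        """Build a toy RSA key pair whose primes are the Mersenne primes 2**13-1 and 2**17-1."""
        assert lucas_lehmer(exp1) and lucas_lehmer(exp2), "exponents must give Mersenne primes"
        p, q = (1 << exp1) - 1, (1 << exp2) - 1
        n, phi = p * q, (p - 1) * (q - 1)
        d = pow(e, -1, phi)        # modular inverse, Python 3.8+
        return (n, e), (n, d)

    if __name__ == "__main__":
        public, private = rsa_from_mersenne()
        msg = 123456789
        cipher = pow(msg, public[1], public[0])
        assert pow(cipher, private[1], private[0]) == msg
    ```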

  14. System for Information Encryption Implementing Several Chaotic Orbits

    Directory of Open Access Journals (Sweden)

    Jiménez-Rodríguez Maricela

    2015-07-01

    Full Text Available This article proposes a symmetric encryption algorithm that takes as input the original information of length L and, when encoded, generates a ciphertext of greater length LM. A discrete chaotic system (the logistic map) is implemented to generate 3 different orbits: the first is used to apply a diffusion technique in order to mix the original data, the second orbit is combined with the mixed information and increases the length from L to LM, and with the third orbit the confusion technique is implemented. The encryption algorithm was applied to encode an image, which is then totally recovered by the keys used to encrypt it and its respective decryption algorithm. The algorithm can encode any information, simply by dividing it into 8-bit blocks; it can cover the requirements for high-level security, uses 7 keys to encrypt and provides good encryption speed.

  15. Medical Image Protection using steganography by crypto-image as cover image

    OpenAIRE

    Vinay Pandey; Manish Shrivastava

    2012-01-01

    This paper presents a method for securing the transmission of medical images. The presented algorithms will be applied to images. This work presents a new method that combines image cryptography, data hiding and steganography techniques for denoised and safe image transmission. In this method we encrypt the original image with a two-shares-mechanism encryption algorithm, then embed the encrypted image with patient information by using a lossless data embedding technique with a data hiding method, after tha...

  16. Hybrid Encryption-Compression Scheme Based on Multiple Parameter Discrete Fractional Fourier Transform with Eigen Vector Decomposition Algorithm

    Directory of Open Access Journals (Sweden)

    Deepak Sharma

    2014-09-01

    Full Text Available Encryption along with compression is the process used to secure any multimedia content processing with minimum data storage and transmission. Transforms play a vital role in optimizing any encryption-compression system. Earlier, the original information in the existing security system based on the fractional Fourier transform (FRFT) was protected by only a certain order of the FRFT. In this article, a novel method for an encryption-compression scheme based on multiple parameters of the discrete fractional Fourier transform (DFRFT) with random phase matrices is proposed. The multiple-parameter discrete fractional Fourier transform (MPDFRFT) possesses all the desired properties of the discrete fractional Fourier transform. The MPDFRFT converts to the DFRFT when all of its order parameters are the same. We exploit the properties of the multiple-parameter DFRFT and propose a novel encryption-compression scheme using the double random phase in the MPDFRFT domain for encrypting and compressing data. The proposed scheme with MPDFRFT significantly enhances the data security along with the image quality of the decompressed image compared to DFRFT and FRFT, and it shows consistent performance with different images. The numerical simulations demonstrate the validity and efficiency of the scheme based on peak signal-to-noise ratio (PSNR) and compression ratio (CR), and the robustness of the scheme against brute force attack is examined.

  17. Color image encryption using iterative phase retrieve process in quaternion Fourier transform domain

    Science.gov (United States)

    Sui, Liansheng; Duan, Kuaikuai

    2015-02-01

    A single-channel color image encryption method is proposed based on an iterative phase retrieval process in the quaternion Fourier transform domain. First, the three components of the plain color image are confused separately by using a cat map. Second, the confused components are combined into a pure quaternion image, which is encoded into a phase-only function by using an iterative phase retrieval process. Finally, the phase-only function is encrypted into a gray-scale ciphertext with stationary white noise distribution based on chaotic diffusion, which has a camouflage property to some extent. The corresponding plain color image can be recovered from the ciphertext only with the correct keys in the decryption process. Simulation results verify the feasibility and effectiveness of the proposed method.

  18. Research on image self-recovery algorithm based on DCT

    Directory of Open Access Journals (Sweden)

    Shengbing Che

    2010-06-01

    Full Text Available An image compression operator based on the discrete cosine transform is presented. A more secure location scrambling operator is put forward based on the concept of anti-tamper radius. The basic idea of the algorithm is to first combine the compressed data of an image block with the eigenvalues of the image block and its offset block, then scramble or encrypt them and embed them into the least significant bits of the corresponding offset block. This algorithm can pinpoint the tampered image block and the tampering type accurately. It can recover a tampered block with good image quality when tampering occurs within the limits of the anti-tamper radius. It can effectively resist vector quantization and synchronous counterfeiting attacks on self-embedding watermarking schemes.

  19. An Efficient Chaos-based Image Encryption Scheme Using Affine Modular Maps

    OpenAIRE

    Ruisong Ye; Haiying Zhao

    2012-01-01

    Linear congruential generator has been widely applied to generate pseudo-random numbers successfully. This paper proposes a novel chaos-based image encryption scheme using affine modular maps, which are extensions of linear congruential generators, acting on the unit interval. A permutation process utilizes two affine modular maps to get two index order sequences for the shuffling of image pixel positions, while a diffusion process employs another two affine modular maps to yield two pseudo-r...

  20. Speckle imaging algorithms for planetary imaging

    Energy Technology Data Exchange (ETDEWEB)

    Johansson, E. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    I will discuss the speckle imaging algorithms used to process images of the impact sites of the collision of comet Shoemaker-Levy 9 with Jupiter. The algorithms use a phase retrieval process based on the average bispectrum of the speckle image data. High resolution images are produced by estimating the Fourier magnitude and Fourier phase of the image separately, then combining them and inverse transforming to achieve the final result. I will show raw speckle image data and high-resolution image reconstructions from our recent experiment at Lick Observatory.

  1. Modified Prime Number Factorization Algorithm (MPFA For RSA Public Key Encryption

    Directory of Open Access Journals (Sweden)

    Kuldeep Singh

    2012-09-01

    Full Text Available The security of public-key encryption such as the RSA scheme relies on the integer factoring problem. The security of the RSA algorithm is based on the positive integer N, which is the product of two prime numbers; the factorization of N is very intricate. In this paper a factorization method is proposed, which is used to obtain the factors of the positive integer N. The present work focuses on the factorization of all trivial and nontrivial integers as per the Fermat method and requires fewer steps for the factorization of the RSA modulus N. Experimental results show that the factorization speed increases compared to the traditional Trial Division method, Fermat factorization method, Brent's factorization method and Pollard Rho factorization method.
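
    For reference, the Fermat factorization method that the proposed approach builds on can be sketched in a few lines; the modification described in the abstract is not reproduced here.

    ```python
    import math

    def fermat_factor(n):
        """Factor an odd composite n by Fermat's method: find a, b with n = a*a - b*b."""
        assert n % 2 == 1, "Fermat's method as written needs an odd n"
        a = math.isqrt(n)
        if a * a < n:
            a += 1
        while True:
            b2 = a * a - n
            b = math.isqrt(b2)
            if b * b == b2:
                return a - b, a + b       # n = (a - b) * (a + b)
            a += 1

    if __name__ == "__main__":
        p, q = 10007, 10009               # RSA-style modulus with close primes
        print(fermat_factor(p * q))       # -> (10007, 10009)
    ```

    Fermat's method converges quickly when the two prime factors are close together, which is why trial moduli with nearby primes are used in the example.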

  2. A novel scheme for image encryption based on 2D piecewise chaotic maps

    Science.gov (United States)

    Akhshani, A.; Behnia, S.; Akhavan, A.; Hassan, H. Abu; Hassan, Z.

    2010-09-01

    In this paper, a hierarchy of two-dimensional piecewise nonlinear chaotic maps with an invariant measure is introduced. These maps have interesting features such as an invariant measure, ergodicity and the possibility of K-S entropy calculation. Then, by using significant properties of these chaotic maps such as ergodicity, sensitivity to initial conditions and control parameters, one-way computation and random-like behavior, we present a new scheme for image encryption. Based on all the analysis and experimental results, it can be concluded that this scheme is efficient, practicable and reliable, with high potential to be adopted for network security and secure communications. Although the two-dimensional piecewise nonlinear chaotic maps presented in this paper aim at image encryption, they are not limited to this area and can be widely applied in other information security fields.

  3. A New Public-Key Encryption Scheme Based on Non-Expansion Visual Cryptography and Boolean Operation

    OpenAIRE

    Abdullah M. Jaafar; Azman Samsudin

    2010-01-01

    Currently, most of the existing public-key encryption schemes are based on complex algorithms with heavy computations. In 1994, Naor and Shamir proposed a simple cryptography method for digital images called visual cryptography. Existing visual cryptography primitives can be considered as a special kind of secret-key cryptography that does not require heavy computations for encrypting and decrypting an image. In this paper, we propose a new public-key encryption scheme for image based on non-...

  4. A Survey on Various Encryption Techniques

    Directory of Open Access Journals (Sweden)

    John Justin M

    2012-03-01

    Full Text Available This paper focuses mainly on the different kinds of encryption techniques that exist, framing all the techniques together as a literature survey. The aim is an extensive experimental study of implementations of various available encryption techniques. It also focuses on image encryption techniques, information encryption techniques, double encryption and chaos-based encryption techniques. The study extends to the performance parameters used in encryption processes and analyzes their security issues.

  5. Image Compression Algorithms Using Dct

    OpenAIRE

    Er. Abhishek Kaushik; Er. Deepti Nain

    2014-01-01

    Image compression is the application of Data compression on digital images. The discrete cosine transform (DCT) is a technique for converting a signal into elementary frequency components. It is widely used in image compression. Here we develop some simple functions to compute the DCT and to compress images. An image compression algorithm was comprehended using Matlab code, and modified to perform better when implemented in hardware description language. The IMAP block and IMA...

  6. Enhancement of Image Security Using Random Permutation

    Directory of Open Access Journals (Sweden)

    S.Vasu Deva Simha

    2014-11-01

    Full Text Available In recent days, transmitting digital media with large data size through the internet has become a simple task, but providing security has become a big issue. Image encryption is obtained using a pseudorandom permutation. Confidentiality and access control are achieved by encryption. Image encryption has to be conducted prior to image compression. This paper addresses how to design a pair of image encryption and compression algorithms such that compressing encrypted images can still be efficiently performed. This paper introduces a highly efficient image encryption-then-compression (ETC) system. The proposed image encryption scheme operated in the prediction error domain is able to provide a reasonably high level of security. More notably, the proposed compression approach applied to encrypted images is only slightly worse than compression with unencrypted images as inputs.
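
    The permutation part of such an encryption-then-compression pipeline can be sketched as a keyed pseudorandom shuffle of pixel positions; the cited system actually operates in the prediction-error domain, which is omitted here, and the key value is an arbitrary illustrative seed.

    ```python
    import numpy as np

    def permute_pixels(image, key, inverse=False):
        """Apply (or undo) a keyed pseudorandom permutation of all pixel positions."""
        rng = np.random.default_rng(key)          # the key seeds the permutation
        perm = rng.permutation(image.size)
        flat = image.reshape(-1)
        if inverse:
            out = np.empty_like(flat)
            out[perm] = flat                      # invert the scatter
        else:
            out = flat[perm]                      # scatter pixels to new positions
        return out.reshape(image.shape)

    if __name__ == "__main__":
        img = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
        enc = permute_pixels(img, key=2024)
        dec = permute_pixels(enc, key=2024, inverse=True)
        assert np.array_equal(img, dec)
    ```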

  7. Triple Layered Encryption Algorithm for IEEE 802.11 WLANs in E-Government Services

    Directory of Open Access Journals (Sweden)

    M A Kabir

    2013-09-01

    Full Text Available A wireless local area network (WLAN) can provide e-government services at all levels, from local to national, as WLAN-enabled devices have the flexibility to move from one place to another within offices while maintaining connectivity with the network. However, government organizations are subject to strict security policies and other compliance requirements. Therefore, a WLAN must safeguard the privacy of individual data with the strictest levels of security. The 802.11 MAC specifications describe an encryption protocol called Wired Equivalent Privacy (WEP), which is used to protect wireless communications from eavesdropping. It is also capable of preventing unauthorized access. However, the WEP protocol often fails to accomplish its security goal due to the weakness in RC4 and the way it is applied in the WEP protocol. This paper focuses on the improvement of the existing WEP protocol by varying the secret key for each transmission. This will remove the insecurities that currently make RC4 unattractive for secure networking and will add further cryptographic strength if applied to the Rijndael algorithm. Our results show that the proposed algorithm is more suitable for small and medium packets and AES for large packets.

  8. Optical Encryption with Jigsaw Transform using Matlab

    OpenAIRE

    Giraldo, Leidy Marcela; Villegas, Edward Yesid

    2012-01-01

    This article describes an optical image encryption technique which is proposed in both an analogue and a digital way. The development of the technique at a digital level is carried out by implementing algorithms (routines) in MATLAB. We propose a functional diagram for the described analogue development, from which the optical systems associated with each functional block are designated. The level of security that the jigsaw algorithms provide when applied to an image, which has 

  9. A Modified Location-Dependent Image Encryption for Mobile Information System

    OpenAIRE

    Prasad Reddy.P.V.G.D; Sudha, K. R.; P Sanyasi Naidu

    2010-01-01

    The widespread use of WLAN (Wireless LAN) and the popularity of mobile devices increase the frequency of data transmission among mobile users. In such a scenario, a need for secure communication arises. Secure communication is possible through encryption of data. A lot of encryption techniques have evolved over time. However, most data encryption techniques are location-independent. Data encrypted with such techniques can be decrypted anywhere. The encryption technology cannot restrict...

  10. An Efficient Chaos-based Image Encryption Scheme Using Affine Modular Maps

    Directory of Open Access Journals (Sweden)

    Ruisong Ye

    2012-07-01

    Full Text Available The linear congruential generator has been widely and successfully applied to generate pseudo-random numbers. This paper proposes a novel chaos-based image encryption scheme using affine modular maps, which are extensions of linear congruential generators, acting on the unit interval. A permutation process utilizes two affine modular maps to obtain two index order sequences for the shuffling of image pixel positions, while a diffusion process employs another two affine modular maps to yield two pseudo-random gray value sequences for a two-way diffusion of gray values. Experiments are carried out with detailed analysis to demonstrate that the proposed image encryption scheme possesses a large key space to frustrate brute-force attack efficiently and can resist statistical attack, differential attack, known-plaintext attack as well as chosen-plaintext attack, thanks to the gray value sequences yielded in the diffusion process being not only sensitive to the control parameters and initial conditions of the considered chaotic maps, but also strongly dependent on the plain-image being processed.
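
    A minimal sketch of the two stages described above, assuming the affine modular map x <- (a*x + b) mod 1 on the unit interval; the parameter values, the single one-way diffusion pass and the byte quantisation are illustrative simplifications of the paper's two-way, four-map construction.

    ```python
    import numpy as np

    def affine_orbit(x0, a, b, n):
        """Orbit of the affine modular map x <- (a*x + b) mod 1 on the unit interval."""
        xs = np.empty(n)
        x = x0
        for i in range(n):
            x = (a * x + b) % 1.0
            xs[i] = x
        return xs

    def encrypt(image, perm_key=(0.37, 23.0, 0.61), diff_key=(0.54, 31.0, 0.29)):
        flat = image.reshape(-1).astype(np.uint8)
        n = flat.size
        # Permutation stage: sort one orbit to obtain an index order sequence.
        order = np.argsort(affine_orbit(*perm_key, n))
        shuffled = flat[order]
        # Diffusion stage: a keystream from a second orbit, chained with the
        # previous cipher byte so each output depends on all earlier pixels.
        ks = (affine_orbit(*diff_key, n) * 255).round().astype(np.uint8)
        out = np.empty_like(shuffled)
        prev = np.uint8(0)
        for i in range(n):
            out[i] = shuffled[i] ^ ks[i] ^ prev
            prev = out[i]
        return out.reshape(image.shape), order

    if __name__ == "__main__":
        img = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
        cipher, order = encrypt(img)
        print(cipher)
    ```

    Decryption (not shown) would reverse the diffusion chain first and then undo the index-order permutation.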

  11. Advanced Imaging Algorithms for Radiation Imaging Systems

    Energy Technology Data Exchange (ETDEWEB)

    Marleau, Peter [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    The intent of the proposed work, in collaboration with the University of Michigan, is to develop the algorithms that will bring the analysis from qualitative images to quantitative attributes of objects containing SNM. The first step to achieving this is to develop an in-depth understanding of the intrinsic errors associated with the deconvolution and MLEM algorithms. A significant new effort will be undertaken to relate the image data to a posited three-dimensional model of geometric primitives that can be adjusted to get the best fit. In this way, parameters of the model such as sizes, shapes, and masses can be extracted for both radioactive and non-radioactive materials. This model-based algorithm will need the integrated response of a hypothesized configuration of material to be calculated many times. As such, both the MLEM and the model-based algorithm require significant increases in calculation speed in order to converge to solutions in practical amounts of time.

  12. A Space-bit-plane Scrambling Algorithm for Image Based on Chaos

    Directory of Open Access Journals (Sweden)

    Rui Liu

    2011-10-01

    Full Text Available Image scrambling is an important technique in digital image encryption and digital image watermarking. This paper's main purpose is to research how to scramble an image by the space-bit-plane operation (SBPO). Based on an analysis of traditional bit-operation scrambling of individual pixels, this paper proposes a new scrambling algorithm that combines the SBPO with a chaotic sequence. First, groups of eight pixels from different areas of the image are selected according to the chaotic sequence and gathered into collections. Second, the SBPO is performed on every collection, producing eight new pixel values for the image. The scrambled image is generated when all pixels have been processed. In this way, the proposed algorithm drastically transforms the statistical characteristics of the original image information, so it increases the difficulty for an unauthorized individual to break the encryption. The simulation results and the performance analysis show that the algorithm has a large secret-key space, high security, fast scrambling speed and strong robustness, and is suitable for practical use to protect the security of digital image information over the Internet.
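
    The SBPO itself is not fully specified in this record, so the sketch below assumes one plausible form: the eight selected pixels are viewed as an 8 x 8 bit matrix (one row per pixel, one column per bit plane) and transposed, which spreads every bit plane across the whole group. The chaotic selection of the eight pixels is omitted.

    ```python
    import numpy as np

    def sbpo_mix(pixels):
        """Space-bit-plane style mixing of a group of eight 8-bit pixels.

        Transposing the 8x8 bit matrix exchanges pixel rows with bit-plane
        columns; applying the operation twice restores the original values,
        so it is its own inverse.
        """
        bits = np.unpackbits(np.asarray(pixels, dtype=np.uint8).reshape(8, 1), axis=1)
        return np.packbits(bits.T, axis=1).reshape(-1)

    if __name__ == "__main__":
        group = np.array([3, 12, 48, 192, 255, 0, 170, 85], dtype=np.uint8)
        mixed = sbpo_mix(group)
        assert np.array_equal(sbpo_mix(mixed), group)   # involution
        print(mixed)
    ```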

  13. Encrypted Three-dimensional Dynamic Imaging using Snapshot Time-of-flight Compressed Ultrafast Photography.

    Science.gov (United States)

    Liang, Jinyang; Gao, Liang; Hai, Pengfei; Li, Chiye; Wang, Lihong V

    2015-01-01

    Compressed ultrafast photography (CUP), a computational imaging technique, is synchronized with short-pulsed laser illumination to enable dynamic three-dimensional (3D) imaging. By leveraging the time-of-flight (ToF) information of pulsed light backscattered by the object, ToF-CUP can reconstruct a volumetric image from a single camera snapshot. In addition, the approach unites the encryption of depth data with the compressed acquisition of 3D data in a single snapshot measurement, thereby allowing efficient and secure data storage and transmission. We demonstrated high-speed 3D videography of moving objects at up to 75 volumes per second. The ToF-CUP camera was applied to track the 3D position of a live comet goldfish. We have also imaged a moving object obscured by a scattering medium. PMID:26503834

  14. A simple public-key attack on phase-truncation-based double-images encryption system

    Science.gov (United States)

    Ding, Xiangling; Yang, Gaobo; He, Dajiang

    2015-07-01

    Phase-truncation-based double-image cryptosystems can avoid iterative Fourier transforms and realize double-image encryption. In this paper, a simple public-key attack is proposed to break this cryptosystem by using arbitrary position parameters and three public keys. The attack process is composed of two steps. Firstly, the decryption keys are simply generated with the help of arbitrary position parameters and the three public keys. Secondly, two approximate versions of the original images are obtained by using the generated decryption keys. Moreover, the proposed public-key attack is different from existing attacks. It is not sensitive to the position parameters of the double images, and its computational efficiency is also much better. Computer simulation results further prove the cryptosystem's vulnerability.

  15. New Comprehensive Study to Assess Comparatively the QKD, XKMS, KDM in the PKI encryption algorithms

    OpenAIRE

    Bilal Bahaa Zaidan; Aws Alaa Zaidan; Harith Mwafak

    2009-01-01

    Protecting data is a very old art, in wide use since the Egyptians, who devised the first encryption method in the world. Some specialists categorize data protection under computer forensics, while others locate it under network security; cryptography methods are the backbone of this art. Nowadays many new techniques for attackers are being developed, and many information protection methods have started dropping, such as RSA and standard encryption methods; at the same time a new metho...

  16. Query Processing Performance and Searching Over Encrypted Data By Using An Efficient Algorithm

    OpenAIRE

    Sharma, Manish; Chaudhary, Atul; Kumar, Santosh

    2013-01-01

    Data is the central asset of today's dynamically operating organizations and their business. This data is usually stored in a database. A major consideration is the security of that data from unauthorized access and intruders. Data encryption is a strong option for the security of data in databases, especially in organizations where security risks are high. But there is a potential disadvantage of performance degradation. When we apply encryption on database ...

  17. An Efficient Image Encryption Scheme Based on a Peter De Jong Chaotic Map and a RC4 Stream Cipher

    Science.gov (United States)

    Hanchinamani, Gururaj; Kulkarni, Linganagouda

    2015-09-01

    Security is a vital issue in the communication and storage of images, and encryption is one of the ways to ensure it. This paper proposes an efficient image encryption scheme based on a Peter De Jong chaotic map and an RC4 stream cipher. A Peter De Jong map is employed to determine the initial keys for the RC4 stream generator and is also used during the permutation stage. The RC4 stream generator is utilized to generate the pseudo-random numbers for the pixel value rotation and diffusion operations. Each encryption round comprises three stages: permutation, pixel value rotation and diffusion. The permutation is based on scrambling the rows and columns, in addition to circular rotations of the rows and columns in alternate orientations. The second stage circularly rotates each and every pixel value by utilizing M × N pseudo-random numbers. The last stage carries out the diffusion twice by scanning the image in two different ways. Each of the two diffusions accomplishes the diffusion in two orientations (forward and backward) with two previously diffused pixels and two pseudo-random numbers. The security and performance of the proposed method are assessed thoroughly by using key space, statistical, differential, entropy and performance analysis. Moreover, two rounds of the call to the encrypt function provide sufficient security. The experimental results show that the proposed encryption scheme is computationally fast with high security.
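
    A compact sketch of the key-derivation idea, assuming a Peter De Jong orbit quantised to bytes that then seeds a standard RC4 keystream; the map parameters are illustrative, and the scheme's permutation, pixel-rotation and two-way diffusion stages are not reproduced.

    ```python
    import math

    def de_jong_bytes(x=0.1, y=0.1, a=1.4, b=-2.3, c=2.4, d=-2.1, n=16, burn_in=100):
        """Derive n key bytes from the orbit of a Peter De Jong map."""
        out = []
        for i in range(burn_in + n):
            x, y = math.sin(a * y) - math.cos(b * x), math.sin(c * x) - math.cos(d * y)
            if i >= burn_in:
                out.append(int((x + 2.0) * 1e6) % 256)   # quantise the orbit to a byte
        return bytes(out)

    def rc4_keystream(key, n):
        """Standard RC4 key scheduling (KSA) and pseudo-random generation (PRGA)."""
        s = list(range(256))
        j = 0
        for i in range(256):                              # KSA
            j = (j + s[i] + key[i % len(key)]) % 256
            s[i], s[j] = s[j], s[i]
        i = j = 0
        stream = bytearray()
        for _ in range(n):                                # PRGA
            i = (i + 1) % 256
            j = (j + s[i]) % 256
            s[i], s[j] = s[j], s[i]
            stream.append(s[(s[i] + s[j]) % 256])
        return bytes(stream)

    if __name__ == "__main__":
        key = de_jong_bytes()                 # the chaotic map supplies the RC4 key
        ks = rc4_keystream(key, 8)
        plaintext = bytes([10, 20, 30, 40, 50, 60, 70, 80])
        cipher = bytes(p ^ k for p, k in zip(plaintext, ks))
        plain = bytes(c ^ k for c, k in zip(cipher, rc4_keystream(key, 8)))
        assert plain == plaintext
    ```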

  18. A non-linear preprocessing for opto-digital image encryption using multiple-parameter discrete fractional Fourier transform

    Science.gov (United States)

    Azoug, Seif Eddine; Bouguezel, Saad

    2016-01-01

    In this paper, a novel opto-digital image encryption technique is proposed by introducing a new non-linear preprocessing and using the multiple-parameter discrete fractional Fourier transform (MPDFrFT). The non-linear preprocessing is performed digitally on the input image in the spatial domain using a piecewise linear chaotic map (PLCM) coupled with the bitwise exclusive OR (XOR). The resulting image is multiplied by a random phase mask before applying the MPDFrFT to whiten the image. Then, a chaotic permutation is performed on the output of the MPDFrFT using another PLCM different from the one used in the spatial domain. Finally, another MPDFrFT is applied to obtain the encrypted image. The parameters of the PLCMs together with the multiple fractional orders of the MPDFrFTs constitute the secret key for the proposed cryptosystem. Computer simulation results and security analysis are presented to show the robustness of the proposed opto-digital image encryption technique and the great importance of the new non-linear preprocessing introduced to enhance the security of the cryptosystem and overcome the problem of linearity encountered in the existing permutation-based opto-digital image encryption schemes.
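
    The spatial-domain preprocessing step can be sketched as a PLCM-driven XOR; the coupling with the MPDFrFT stages and the second, permutation-stage PLCM are omitted, and the map parameters below are illustrative assumptions.

    ```python
    import numpy as np

    def plcm(x, p):
        """Piecewise linear chaotic map with control parameter 0 < p < 0.5."""
        if x < p:
            return x / p
        if x <= 0.5:
            return (x - p) / (0.5 - p)
        return plcm(1.0 - x, p)     # symmetric on the upper half of the interval

    def plcm_bytes(x0, p, n, burn_in=100):
        """Quantise a PLCM orbit into n pseudo-random bytes."""
        x = x0
        for _ in range(burn_in):
            x = plcm(x, p)
        out = np.empty(n, dtype=np.uint8)
        for i in range(n):
            x = plcm(x, p)
            out[i] = int(x * 256) % 256
        return out

    def nonlinear_preprocess(image, x0=0.2345, p=0.31):
        """Bitwise-XOR the image with a PLCM-driven byte sequence (spatial-domain step)."""
        flat = image.reshape(-1)
        return (flat ^ plcm_bytes(x0, p, flat.size)).reshape(image.shape)

    if __name__ == "__main__":
        img = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
        pre = nonlinear_preprocess(img)
        assert np.array_equal(nonlinear_preprocess(pre), img)   # XOR is invertible
    ```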

  19. Integrating Error Detection with data encryption algorithm using Permutation Invariant RAO Alaka Shift transform

    Directory of Open Access Journals (Sweden)

    A.V. Narasimha Rao

    2008-10-01

    Full Text Available This paper adopts a novel approach for ensuring the security of data with error detection capability. The RAS transform is a nonlinear recursive transform. This simple but very effective RAS transform is permutation invariant and is used to code the digital data at two levels, so that the data is encrypted and there is also a multilevel error detection mechanism based on the properties of the RAS transform. The first type is data independent and the latter is data dependent. In data dependent encryption, the partially encrypted data is subjected to RAS transformation at two levels, namely byte level and block level, before transmission. The outcome is 128 bits of encrypted data together with an encryption key. A code book of only 20 valid code words is generated to represent the 256 possible octets of 8-bit data words. From each of the code words, the data word can be uniquely recovered using the data dependent symmetric encryption key. The result of this coding on sample text data of about 189 characters is presented.

  20. A Comparison between Memetic algorithm and Genetic algorithm for the cryptanalysis of Simplified Data Encryption Standard algorithm

    OpenAIRE

    Poonam Garg

    2010-01-01

    Genetic algorithms are population-based meta-heuristics. They have been successfully applied to many optimization problems. However, premature convergence is an inherent characteristic of such classical genetic algorithms that makes them incapable of searching numerous solutions of the problem domain. A memetic algorithm is an extension of the traditional genetic algorithm. It uses a local search technique to reduce the likelihood of premature convergence. The cryptana...

  1. STEGANOGRAPHY FOR TWO AND THREE LSBs USING EXTENDED SUBSTITUTION ALGORITHM

    OpenAIRE

    R.S. Gutte; Y.D. Chincholkar; P.U. Lahane

    2013-01-01

    The security of data on the internet has become a priority. Even though a message may be encrypted using a strong cryptographic algorithm, it cannot avoid the suspicion of an intruder. This paper proposes an approach in which data is encrypted using an Extended Substitution Algorithm and the resulting cipher text is then concealed at two or three LSB positions of the carrier image. This algorithm covers almost all types of symbols and alphabets. The encrypted text is concealed variably in the LSBs. There...
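
    A minimal sketch of fixed two-LSB embedding, assuming the payload has already been encrypted; the Extended Substitution Algorithm and the variable two-or-three-LSB choice described in the abstract are not reproduced.

    ```python
    import numpy as np

    def embed_2lsb(cover, payload):
        """Hide payload bytes in the two least significant bits of each cover pixel."""
        bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
        pairs = bits.reshape(-1, 2)                       # two payload bits per pixel
        flat = cover.reshape(-1).copy()
        if pairs.shape[0] > flat.size:
            raise ValueError("payload too large for this cover image")
        two_bit_values = pairs[:, 0] * 2 + pairs[:, 1]
        flat[:pairs.shape[0]] = (flat[:pairs.shape[0]] & 0xFC) | two_bit_values
        return flat.reshape(cover.shape)

    def extract_2lsb(stego, n_bytes):
        """Recover n_bytes of payload from the two LSBs of the stego image."""
        flat = stego.reshape(-1)[: n_bytes * 4]           # 4 pixels carry one byte
        pairs = np.stack([(flat >> 1) & 1, flat & 1], axis=1)
        return np.packbits(pairs.reshape(-1)[: n_bytes * 8]).tobytes()

    if __name__ == "__main__":
        cover = np.random.randint(0, 256, (16, 16), dtype=np.uint8)
        secret = b"cipher text"                           # assumed already encrypted
        stego = embed_2lsb(cover, secret)
        assert extract_2lsb(stego, len(secret)) == secret
    ```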

  2. Hardware stream cipher with controllable chaos generator for colour image encryption

    KAUST Repository

    Barakat, Mohamed L.

    2014-01-01

    This study presents hardware realisation of chaos-based stream cipher utilised for image encryption applications. A third-order chaotic system with signum non-linearity is implemented and a new post processing technique is proposed to eliminate the bias from the original chaotic sequence. The proposed stream cipher utilises the processed chaotic output to mask and diffuse input pixels through several stages of XORing and bit permutations. The performance of the cipher is tested with several input images and compared with previously reported systems showing superior security and higher hardware efficiency. The system is experimentally verified on Xilinx Virtex 4 field programmable gate array (FPGA) achieving small area utilisation and a throughput of 3.62 Gb/s. © The Institution of Engineering and Technology 2013.

  3. An Integrated Algorithm Supporting Confidentiality and Integrity for Secured Access and Storage of DICOM Images

    Directory of Open Access Journals (Sweden)

    Suresh Jaganathan

    2012-04-01

    Full Text Available In the healthcare industry, patients' medical data plays a vital role because diagnosis of any ailment is done by using those data. The high volume of medical data leads to scalability and maintenance issues when using healthcare providers' on-site picture archiving and communication systems (PACS) and network-oriented storage systems. Therefore a standard is needed for maintaining the medical data and for better diagnosis. Since medical data reflects individuals' personal information in a similar way, secrecy should be maintained. Maintaining secrecy can be done by encrypting the data, but as medical data involves images and videos, traditional text-based encryption/decryption schemes are not adequate for providing confidentiality. In this paper, we propose an algorithm for securing DICOM-format medical archives by providing better confidentiality and maintaining their integrity. Our contribution in this algorithm is twofold: (1) development of an improved chaotic-based Arnold cat map for encryption/decryption of DICOM files and (2) application of a new hash algorithm based on chaotic theory to those encrypted files for maintaining integrity. By applying this algorithm, the secrecy of medical data is maintained. The proposed algorithm is tested with various DICOM-format image archives by studying the following parameters: (i) PSNR, for the quality of images, and (ii) key, for security.

  4. New Way for Encryption Data Using Hourglas

    OpenAIRE

    Hamid Mehdi

    2013-01-01

    Nowadays, there are many encryption algorithms to protect information. Data confidentiality is one of the most important functions of encryption algorithms; it means that data transferred between different systems is unintelligible to unauthorized systems or people. Moreover, encryption algorithms must maintain data integrity and provide availability for information. New encryption methods prevent attackers from simply accessing the information and do not allow discovering the relationship betwee...

  5. Data Hiding and Retrival Using Advanced Encryption and Decryption Algorithms Mamtha Shetty, Shreedhar. A. Joshi

    OpenAIRE

    Mamtha Shetty; SHREEDHAR A JOSHI

    2014-01-01

    In this era of digital world, with the evolution of technology, there is an essential need for optimization of online digital data and information. Nowadays, Security and Authenticity of digital data has become a big challenge. This paper proposes an innovative method to authenticate the digital documents. A new method is introduced here, which allows multiple encryption and decryption of digital data.

  6. Image Compression Using Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    Ryan Rey M. Daga

    2012-09-01

    Full Text Available Image compression techniques are important and useful in data storage and image transmission through the Internet. These techniques eliminate redundant information in an image, which minimizes the physical space requirement of the image. Numerous types of image compression algorithms have been developed, but the resulting image quality is still less than optimal. The Harmony Search Algorithm (HSA), a meta-heuristic optimization algorithm inspired by the music improvisation process of musicians, was applied as the underlying algorithm for image compression. Experimental results show that it is feasible to use the harmony search algorithm for image compression. The HSA-based image compression technique was able to compress colored and grayscale images with minimal visual information loss.

  7. Image Fusion using Evolutionary Algorithm (GA

    Directory of Open Access Journals (Sweden)

    V Jyothi

    2011-03-01

    Full Text Available Image fusion is the process of combining images taken from different sources to obtain better situational awareness. In fusing source images, the objective is to combine the most relevant information from the source images into a composite image. The genetic algorithm is used for solving optimization problems and can be employed for image fusion where some kind of parameter optimization is required. In this paper we propose genetic algorithm based schemes for image fusion and prove that these schemes perform better than the conventional methods through comparison of parameters, namely image quality index, mutual information, root mean square error and peak signal to noise ratio.

  8. Data Hiding and Retrival Using Advanced Encryption and Decryption Algorithms Mamtha Shetty, Shreedhar. A. Joshi

    Directory of Open Access Journals (Sweden)

    Mamtha Shetty

    2014-04-01

    Full Text Available In this era of digital world, with the evolution of technology, there is an essential need for optimization of online digital data and information. Nowadays, Security and Authenticity of digital data has become a big challenge. This paper proposes an innovative method to authenticate the digital documents. A new method is introduced here, which allows multiple encryption and decryption of digital data.

  9. A FUZZY LOGIC APPROACH TO ENCRYPTED WATERMARKING FOR STILL IMAGES IN WAVELET DOMAIN ON FPGA

    OpenAIRE

    Pankaj U.Lande; Sanjay N. Talbar; G.N.Shinde

    2010-01-01

    In this paper a fuzzy logic approach is introduced to embed an encrypted watermark in the wavelet domain. The multi-resolution representation based on the DWT incorporates a model of the Human Visual System (HVS). Encryption and digital watermarking techniques need to be incorporated in digital rights management. It is clear that these two technologies complement each other, and the complete security of the digital contents depends on both. Encryption transforms the original content into unreada...

  10. Cryptanalysis and improvement of an optical image encryption scheme using a chaotic Baker map and double random phase encoding

    International Nuclear Information System (INIS)

    In this paper, we evaluate the security of an enhanced double random phase encoding (DRPE) image encryption scheme (2013 J. Lightwave Technol. 31 2533). The original system employs a chaotic Baker map prior to DRPE to provide more protection to the plain image and hence promote the security level of DRPE, as claimed. However, cryptanalysis shows that this scheme is vulnerable to a chosen-plaintext attack, and the ciphertext can be precisely recovered. The corresponding improvement is subsequently reported upon the basic premise that no extra equipment or computational complexity is required. The simulation results and security analyses prove its effectiveness and security. The proposed achievements are suitable for all cryptosystems under permutation and, following that, the DRPE architecture, and we hope that our work can motivate the further research on optical image encryption. (paper)

  11. New Comprehensive Study to Assess Comparatively the QKD, XKMS, KDM in the PKI encryption algorithms

    Directory of Open Access Journals (Sweden)

    Bilal Bahaa Zaidan

    2009-11-01

    Full Text Available Protecting data is a very old art, in wide use since the Egyptians, who devised the first encryption method in the world. Some specialists categorize data protection under computer forensics, while others locate it under network security; cryptography methods are the backbone of this art. Nowadays many new techniques for attackers are being developed, and many information protection methods have started to fail, such as RSA and standard encryption methods; at the same time new methods have appeared, such as quantum cryptography. The new methods have faced problems in key distribution or key management, and in some cases, such as RSA, there is a function that may estimate the keys. In this paper we make a comparative study between the key management and distribution methods; in particular we discuss QKD encryption in the fiber optic area versus KDM in normal networks, for instance between the two known methods KDM "Diffie-Hellman" and XKMS.

  12. A Novel Steganographic Scheme Based on Hash Function Coupled With AES Encryption

    Directory of Open Access Journals (Sweden)

    Rinu Tresa M J

    2014-04-01

    Full Text Available In the present scenario the use of images has increased extremely in the cyber world, so that we can easily transfer data with the help of these images in a secure way. Image steganography becomes important in this manner. Steganography and cryptography are two techniques that are often confused with each other. The input and output of steganography look alike, but for cryptography the output will be in an encrypted form which always draws the attention of an attacker. This paper combines both steganography and cryptography so that the attacker does not know about the existence of the message, and the message itself is encrypted to ensure more security. The textual data entered by the user is encrypted using the AES algorithm. After encryption, the encrypted data is stored in a colour image by using a hash-based algorithm. Most of the steganographic algorithms available today are suitable only for a specific image format, and these algorithms suffer from poor quality of the embedded image. The proposed work does not corrupt the image quality in any form. The striking feature is that this algorithm is suitable for almost all image formats, e.g. JPEG/JPG, Bitmap, TIFF and GIF.

  13. Optical Encryption with Jigsaw Transform using Matlab

    CERN Document Server

    Giraldo, Leidy Marcela

    2012-01-01

    This article describes an optical image encryption technique which is proposed in both an analogue and a digital way. The development of the technique at a digital level is carried out by implementing algorithms (routines) in MATLAB. We propose a functional diagram for the described analogue development, from which the optical systems associated with each functional block are designated. The level of security that the jigsaw algorithms provide when applied to an image that has been decomposed into its bit-planes is significantly better than when they are applied to an image that has not been previously decomposed.

  14. Testing a Variety of Encryption Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Henson, T J

    2001-04-09

    Review and test speeds of various encryption technologies using Entrust Software. Multiple encryption algorithms are included in the product. Algorithms tested were IDEA, CAST, DES, and RC2. Test consisted of taking a 7.7 MB Word document file which included complex graphics and timing encryption, decryption and signing. Encryption is discussed in the GIAC Kickstart section: Information Security: The Big Picture--Part VI.

  15. Double-image encryption based on discrete multiple-parameter fractional angular transform and two-coupled logistic maps

    Science.gov (United States)

    Sui, Liansheng; Duan, Kuaikuai; Liang, Junli

    2015-05-01

    A new discrete fractional transform defined by the fractional order, periodicity and vector parameters is presented, which is named the discrete multiple-parameter fractional angular transform. Based on this transform and a two-coupled logistic map, a double-image encryption scheme is proposed. First, an enlarged image is obtained by connecting two plaintext images sequentially and is scrambled by using a chaotic permutation process, in which the sequences of chaotic pairs are generated by using the two-coupled logistic map. Then, the scrambled enlarged image is decomposed into two new components. Second, a chaotic random phase mask is generated based on the logistic map, with which one of the two components is converted to the modulation phase mask. The other component is encoded into an interim matrix with the help of the modulation phase mask. Finally, the two-dimensional discrete multiple-parameter fractional angular transform is performed on the interim matrix to obtain the ciphertext with stationary white noise distribution. The proposed encryption scheme has the obvious advantage that no phase keys are used in the encryption and decryption process, which is convenient for key management. Moreover, the security of the cryptosystem can be enhanced by using extra parameters such as the initial values of the chaos functions, the fractional orders and the vector parameters of the transform. Simulation results and security analysis verify the feasibility and effectiveness of the proposed scheme.

  16. Image Compression Based on Improved FFT Algorithm

    OpenAIRE

    Juanli Hu; Jiabin Deng; Juebo Wu

    2011-01-01

    Image compression is a crucial step in the image processing area. The image Fourier transform is a classical algorithm which can convert an image from the spatial domain to the frequency domain. Because of its good energy concentration property under the transform, the Fourier transform has been widely applied in image coding, image segmentation and image reconstruction. This paper adopts the Radix-4 Fast Fourier Transform (Radix-4 FFT) to realize limited-distortion image coding, and to discuss the feasibility and the...

  17. Image Segmentation Algorithms and their use on Doppler Images

    OpenAIRE

    Tatiana D. C. A. Silva; Zhen Ma; Tavares, João Manuel R. S.

    2011-01-01

    This paper aims to review the current segmentation algorithms used for medical images. Image segmentation algorithms can be classified according to their methodologies, namely those based on thresholds, clustering, and deformable models. Each type of algorithm is discussed and its main application fields are identified; additionally, the advantages and disadvantages of each type are pointed out. Experiments that apply the algorithms to segment Doppler images are presented to fu...

  18. Parallelized Dilate Algorithm for Remote Sensing Image

    OpenAIRE

    Suli Zhang; Haoran Hu; Xin Pan

    2014-01-01

    As an important algorithm, the dilate algorithm can give a more connected view of a remote sensing image that has broken lines or objects. However, with the technological progress of satellite sensors, the resolution of remote sensing images has been increasing and the data quantities have become very large. This leads to a decrease in algorithm running speed, or to the inability to obtain a result within limited memory or time. To solve this problem, our research proposed a parallelized dilate algorithm for re...

  19. Parallelized Dilate Algorithm for Remote Sensing Image

    Science.gov (United States)

    Zhang, Suli; Hu, Haoran; Pan, Xin

    2014-01-01

    As an important algorithm, the dilate algorithm can give a more connected view of a remote sensing image that has broken lines or objects. However, with the technological progress of satellite sensors, the resolution of remote sensing images has been increasing and the data quantities have become very large. This leads to a decrease in algorithm running speed, or to the inability to obtain a result within limited memory or time. To solve this problem, our research proposed a parallelized dilate algorithm for remote sensing images based on MPI and MP. Experiments show that our method runs faster than the traditional single-process algorithm. PMID:24955392
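
    The record itself uses MPI and shared-memory parallelism; as a language-agnostic illustration of the same idea (split the image into strips with a one-pixel halo, dilate each strip independently, stitch the results), here is a small Python sketch using the multiprocessing module instead.

        import numpy as np
        from multiprocessing import Pool

        def dilate3x3(strip):
            """Grayscale dilation with a 3x3 structuring element (max over the neighborhood)."""
            padded = np.pad(strip, 1, mode='edge')
            out = np.zeros_like(strip)
            for dy in range(3):
                for dx in range(3):
                    np.maximum(out, padded[dy:dy + strip.shape[0], dx:dx + strip.shape[1]], out=out)
            return out

        def parallel_dilate(image, workers=4):
            """Dilate horizontal strips (with a 1-row halo) in parallel, then reassemble.
            Assumes the image has at least `workers` rows; call from under
            `if __name__ == "__main__":` when using multiprocessing."""
            h = image.shape[0]
            bounds = np.linspace(0, h, workers + 1, dtype=int)
            strips = [image[max(a - 1, 0):min(b + 1, h)] for a, b in zip(bounds, bounds[1:])]
            with Pool(workers) as pool:
                results = pool.map(dilate3x3, strips)
            pieces = []
            for (a, b), d in zip(zip(bounds, bounds[1:]), results):
                top = 0 if a == 0 else 1          # drop the halo row before stitching
                pieces.append(d[top:top + (b - a)])
            return np.vstack(pieces)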

  20. A novel chaotic encryption scheme based on pseudorandom bit padding

    CERN Document Server

    Sadra, Yaser; Fard, Zahra Arasteh

    2012-01-01

    Cryptography is always very important for data origin authentication, entity authentication, data integrity and confidentiality. In recent years, a variety of chaotic cryptographic schemes have been proposed. These schemes have a typical structure that performs the permutation and the diffusion stages alternately. Random number generators are indispensable in cryptographic schemes and are used in the diffusion functions of image encryption to diffuse the pixels of the plain image. In this paper, we propose a chaotic encryption scheme based on pseudorandom bit padding in which the bits are generated by a novel logistic pseudorandom image algorithm. To evaluate the security of the cipher image of this scheme, key space analysis, the correlation of two adjacent pixels and a differential attack were performed. This scheme tries to remedy common weaknesses of such encryption, namely small key space and low level of security.

  1. Genetic algorithm applied to fractal image compression

    Directory of Open Access Journals (Sweden)

    Y. Chakrapani

    2009-02-01

    Full Text Available In this paper the technique of the Genetic Algorithm (GA) is applied to Fractal Image Compression (FIC). With the help of this evolutionary algorithm, an effort is made to reduce the search complexity of matching between range blocks and domain blocks. Fractal Image Compression is one of the image compression techniques in the spatial domain, but its main drawback is that it involves more computational time due to the global search. In order to improve the computational time while retaining acceptable quality of the decoded image, the Genetic Algorithm is proposed. Experimental results show that the Genetic Algorithm is a better method than the traditional exhaustive search method.

  2. ENHANCED SECURITY MECHANISM IN CLOUD COMPUTING USING HYBRID ENCRYPTION ALGORITHM AND FRAGMENTATION: A REVIEW

    Directory of Open Access Journals (Sweden)

    Amandeep Kaur

    2015-05-01

    Full Text Available Cloud is a term used as a metaphor for wide area networks (like the internet) or any such large networked environment. It came partly from the cloud-like symbol used to represent the complexities of networks in schematic diagrams. It represents all the complexities of the network, which may include everything from cables, routers, servers and data centers to all other such devices. Cloud-based systems save data of multiple organizations on shared hardware systems. Data segregation is done by encrypting users' data, but encryption is not a complete solution. We can also segregate data by creating virtual partitions of data for storage and allowing each user to access data in his partition only. We will implement cloud security aspects for data mining by implementing a cloud system. After implementing the cloud infrastructure for data mining, we shall evaluate security measures for data mining in the cloud and address threats that data mining poses to personal/private data in cloud systems.

  3. An Algorithm for image stitching and blending

    Science.gov (United States)

    Rankov, Vladan; Locke, Rosalind J.; Edens, Richard J.; Barber, Paul R.; Vojnovic, Borivoj

    2005-03-01

    In many clinical studies, including those of cancer, it is highly desirable to acquire images of whole tumour sections whilst retaining a microscopic resolution. A usual approach to this is to create a composite image by appropriately overlapping individual images acquired at high magnification under a microscope. A mosaic of these images can be accurately formed by applying image registration, overlap removal and blending techniques. We describe an optimised, automated, fast and reliable method for both image joining and blending. These algorithms can be applied to most types of light microscopy imaging. Examples from histology, from in vivo vascular imaging and from fluorescence applications are shown, both in 2D and 3D. The algorithms are robust to the varying image overlap of a manually moved stage, though examples of composite images acquired both with manually-driven and computer-controlled stages are presented. The overlap-removal algorithm is based on the cross-correlation method; this is used to determine and select the best correlation point between any new image and the previous composite image. A complementary image blending algorithm, based on a gradient method, is used to eliminate sharp intensity changes at the image joins, thus gradually blending one image onto the adjacent 'composite'. The details of the algorithm to overcome both intensity discrepancies and geometric misalignments between the stitched images will be presented and illustrated with several examples.
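
    As a minimal illustration of the overlap-removal idea (the blending stage is not shown), the offset between a new tile and the existing composite can be estimated from the peak of an FFT-based cross-correlation; the function below is a generic sketch, not the authors' optimised implementation.

        import numpy as np

        def estimate_shift(composite, tile):
            """Return the (dy, dx) translation that best aligns `tile` with `composite`."""
            a = composite.astype(float) - composite.mean()
            b = tile.astype(float) - tile.mean()
            shape = (a.shape[0] + b.shape[0], a.shape[1] + b.shape[1])   # pad to cover all shifts
            corr = np.fft.irfft2(np.fft.rfft2(a, shape) * np.conj(np.fft.rfft2(b, shape)), shape)
            peak = np.unravel_index(np.argmax(corr), corr.shape)
            dy = peak[0] - shape[0] if peak[0] > shape[0] // 2 else peak[0]
            dx = peak[1] - shape[1] if peak[1] > shape[1] // 2 else peak[1]
            return dy, dx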

  4. Simple Encryption/Decryption Application

    Directory of Open Access Journals (Sweden)

    Majdi Al-qdah

    2007-06-01

    Full Text Available This paper presents an Encryption/Decryption application that is able to work with any type of file, for example image files, data files, documentation files, etc. The method of encryption is simple enough yet powerful enough to fit the needs of students and staff in a small institution. The application uses a simple key generation method based on random number generation and combination. The final encryption is a binary one performed through rotation of bits and an XOR operation applied to each block of data in any file, using a symmetric decimal key. The key generation and encryption are all done by the system itself after clicking the encryption button, transparently to the user. The same encryption key is also used to decrypt the encrypted binary file.
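
    A toy byte-level version of the rotate-and-XOR idea is sketched below; the key derivation and file handling of the actual application are not described in enough detail to reproduce, so a random byte key stands in for the symmetric decimal key.

        import os

        def rotl(byte, k):
            """Rotate an 8-bit value left by k positions."""
            k %= 8
            return ((byte << k) | (byte >> (8 - k))) & 0xFF

        def encrypt(data, key):
            """Rotate each byte by a key-derived amount, then XOR it with the key byte."""
            return bytes(rotl(b, key[i % len(key)] & 0x07) ^ key[i % len(key)]
                         for i, b in enumerate(data))

        def decrypt(data, key):
            return bytes(rotl(b ^ key[i % len(key)], 8 - (key[i % len(key)] & 0x07))
                         for i, b in enumerate(data))

        key = os.urandom(16)                       # stand-in for the generated symmetric key
        ct = encrypt(b"contents of any file", key)
        assert decrypt(ct, key) == b"contents of any file"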

  5. Investigating Encrypted Material

    Science.gov (United States)

    McGrath, Niall; Gladyshev, Pavel; Kechadi, Tahar; Carthy, Joe

    When encrypted material is discovered during a digital investigation and the investigator cannot decrypt the material then s/he is faced with the problem of how to determine the evidential value of the material. This research is proposing a methodology of extracting probative value from the encrypted file of a hybrid cryptosystem. The methodology also incorporates a technique for locating the original plaintext file. Since child pornography (KP) images and terrorist related information (TI) are transmitted in encrypted format the digital investigator must ask the question Cui Bono? - who benefits or who is the recipient? By doing this the scope of the digital investigation can be extended to reveal the intended recipient.

  6. Interferometry based multispectral photon-limited 2D and 3D integral image encryption employing the Hartley transform.

    Science.gov (United States)

    Muniraj, Inbarasan; Guo, Changliang; Lee, Byung-Geun; Sheridan, John T

    2015-06-15

    We present a method of securing multispectral 3D photon-counted integral imaging (PCII) using classical Hartley transform (HT) based encryption employing optical interferometry. This method has the simultaneous advantages of minimizing complexity, by eliminating the need for holographic recording, and of addressing the phase sensitivity problem encountered when using digital cameras. These, together with single-channel multispectral 3D data compactness and the inherent properties of the classical photon counting detection model, i.e. sparse sensing and the capability for nonlinear transformation, permit better authentication of the retrieved 3D scene at various depth cues. Furthermore, the proposed technique works for both spatially and temporally incoherent illumination. To validate the proposed technique, simulations were carried out for both the 2D and 3D cases. Experimental data is processed and the results support the feasibility of the encryption method. PMID:26193568

  7. Video Encryption-A Survey

    OpenAIRE

    Jolly Shah; Vikas Saxena

    2011-01-01

    Multimedia data security is becoming important with the continuous increase of digital communication on the internet. The encryption algorithms developed to secure text data are not suitable for multimedia applications because of the large data size and real-time constraints. In this paper, a classification and description of various video encryption algorithms are presented. Analysis and comparison of these algorithms with respect to various parameters like visual degradation, enc...

  8. Implementing of microscopic images mosaic revising algorithm

    Directory of Open Access Journals (Sweden)

    Haishun Wang

    2011-04-01

    Full Text Available Microscopic image mosaicing stitches several adjacent images into an integrated seamless picture, and is of significant practical value to remote medicine applications, especially remote diagnosis. However, due to limitations of the image acquisition method, mismatches can occur as a result of variance in adjacent image stitching data and the accumulation of errors. Current image stitching methods still have room for improvement in processing speed and effectiveness, particularly in precision. In this paper, we propose a new image mosaic revising algorithm based on the correlation of adjacent images, expound the principle and equations of image mosaic error revising, and achieve automatic intelligent calculation with the revised algorithm. In experiments, inaccurate pathological mosaic images from 20 groups were revised rapidly and accurately, with the error controlled within one pixel. This proves that the approach is effective in revising mismatches in microscopic image mosaics. Moreover, it is easy to operate and effective for more accurate image stitching.

  9. STEGANOGRAPHY FOR TWO AND THREE LSBs USING EXTENDED SUBSTITUTION ALGORITHM

    Directory of Open Access Journals (Sweden)

    R.S. Gutte

    2013-03-01

    Full Text Available The security of data on the internet has become a priority. Even though a message may be encrypted using a strong cryptographic algorithm, encryption alone cannot avoid the suspicion of an intruder. This paper proposes an approach in which data is encrypted using the Extended Substitution Algorithm and the resulting cipher text is then concealed at two or three LSB positions of the carrier image. This algorithm covers almost all types of symbols and alphabets. The encrypted text is concealed variably into the LSBs, making it a stronger approach. The visible characteristics of the carrier image before and after concealment remain almost the same. The algorithm has been implemented using Matlab.
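
    A minimal sketch of the concealment step follows, assuming a grayscale uint8 cover and a payload that has already been encrypted by the substitution stage; two LSBs per pixel are used, as in one of the variants described above.

        import numpy as np

        def embed_2lsb(cover, payload):
            """Hide a byte string in the two least significant bits of the cover pixels."""
            bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8)).reshape(-1, 2)
            flat = cover.reshape(-1).copy()
            if bits.shape[0] > flat.size:
                raise ValueError("payload too large for this cover image")
            flat[:bits.shape[0]] = (flat[:bits.shape[0]] & 0b11111100) | (bits[:, 0] * 2 + bits[:, 1])
            return flat.reshape(cover.shape)

        def extract_2lsb(stego, n_bytes):
            two_bits = stego.reshape(-1)[:n_bytes * 4] & 0b11
            bits = np.stack([(two_bits >> 1) & 1, two_bits & 1], axis=1).reshape(-1)
            return np.packbits(bits).tobytes()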

  10. Comparative Study of Image Denoising Algorithms in Digital Image Processing

    OpenAIRE

    Aarti .; Gaurav Pushkarna

    2014-01-01

    This paper proposes a basic scheme for understanding the fundamentals of digital image processing and image denoising algorithms. There are three basic operations in image processing, i.e. image rectification and restoration, enhancement, and information extraction. Image denoising is a basic problem in digital image processing. The main task is to make the image free from noise. Salt & pepper (impulse) noise and the additive white Gaussian noise and blurrednes...

  11. Three-Dimensional Image Security System Combines the Use of Smart Mapping Algorithm and Fibonacci Transformation Technique

    Scientific Electronic Library Online (English)

    Xiao-Wei, Li; Sung-Jin, Cho; In-Kwon, Lee; Seok-Tae, Kim.

    Full Text Available In this paper, a three-dimensional (3D) image security system that combines the use of the smart pixel mapping (SPM) algorithm and the Fibonacci transformation technique is proposed. In order to reconstruct orthoscopic 3D images with improved image quality, a smart pixel mapping process is adopted. Based [...] on the SPM-based computational integral imaging (CII) system, the depth-converted elemental image array (EIA) is obtained for increasing the quality of the reconstructed image. In the encryption process, the depth-converted EIA is scrambled by the Fibonacci transformation (FT) algorithm. Meanwhile, the computational integral imaging reconstruction (CIIR) technique is used to reconstruct the 3D image in the image reconstruction process. Compared with conventional CII-based 3D image encryption methods, the proposed method enables us to reconstruct high-resolution orthoscopic 3D images at long distances. To demonstrate the effectiveness of the proposed method, numerical experiments are performed to test the validity and capability of the proposed 3D image security system.

  12. Uncloneable Encryption

    CERN Document Server

    Gottesman, D

    2002-01-01

    Quantum states cannot be cloned. I show how to extend this property to classical messages encoded using quantum states, a task I call "uncloneable encryption." An uncloneable encryption scheme has the property that an eavesdropper Eve not only cannot read the encrypted message, but she cannot copy it down for later decoding. She could steal it, but then the receiver Bob would not receive the message, and would thus be alerted that something was amiss. I prove that any authentication scheme for quantum states acts as a secure uncloneable encryption scheme. Uncloneable encryption is also closely related to quantum key distribution (QKD), demonstrating a close connection between cryptographic tasks for quantum states and for classical messages. Thus, studying uncloneable encryption and quantum authentication allows for some modest improvements in QKD protocols. While the main results apply to a one-time key with unconditional security, I also show uncloneable encryption remains secure with a pseudorandom key. In...

  13. AMALGAMATION OF CYCLIC BIT OPERATION IN SD-EI IMAGE ENCRYPTION METHOD: AN ADVANCED VERSION OF SD-EI METHOD: SD-EI VER-2

    OpenAIRE

    Somdip Dey

    2012-01-01

    In this paper, the author presents an advanced version of an image encryption technique, which is itself an upgraded version of the SD-EI image encryption method. In this new method, SD-EI Ver-2, there are more bit-wise manipulations compared to the original SD-EI method. The proposed method consists of three stages: 1) First, a number is generated from the password and each pixel of the image is converted to its equivalent eight-bit binary number, and in that eight-bit number, the number of bits, which are ...

  14. SRI RAMSHALAKA: A VEDIC METHOD OF TEXT ENCRYPTION AND DECRYPTION

    Directory of Open Access Journals (Sweden)

    Rajkishore Prasad

    2013-07-01

    Full Text Available This paper investigates the usability of SriRamshalakha, a vedic tool used in Indian astrology, in the encryption and decryption of plain English text. Sri Ram Shalaka appears in Sri RamCharitmanas, one of the very popular sacred epics of the Hindu religion, written by the great saint Tulsidasji. SriRamshalakha is used by believers to fetch/infer approximate answers to questions/decisions. Basically, the said shalaka embeds nine philosophical verses from Sri RamCharitmanas in a matrix form, based on which answers to queries are inferred and ingrained. However, none of the verses are visible or directly readable. Thus, here we take SriRamshalakha as an ancient Indian method of text encryption and decryption, and based on the same, algorithms for the encryption and decryption of plain English text are proposed. The developed algorithms are presented with examples, and the possibility of their use in steganography and text-to-image transformation is also discussed.

  15. Improved decryption quality and security of a joint transform correlator-based encryption system

    International Nuclear Information System (INIS)

    Some image encryption systems based on modified double random phase encoding and joint transform correlator architecture produce low quality decrypted images and are vulnerable to a variety of attacks. In this work, we analyse the algorithm of some reported methods that optically implement the double random phase encryption in a joint transform correlator. We show that it is possible to significantly improve the quality of the decrypted image by introducing a simple nonlinear operation in the encrypted function that contains the joint power spectrum. This nonlinearity also makes the system more resistant to chosen-plaintext attacks. We additionally explore the system resistance against this type of attack when a variety of probability density functions are used to generate the two random phase masks of the encryption–decryption process. Numerical results are presented and discussed. (paper)

  16. Using a Grid to Evaluate the Threat of Zombie Networks against Asymmetric Encryption Algorithms

    International Nuclear Information System (INIS)

    The large network of compromised computers currently available can be used to run complex algorithms in order to solve problems considered to be beyond current computational power. A distributed algorithm for obtaining prime numbers is presented. The algorithm was run on a grid system to obtain all prime numbers up to 2^40, measuring execution times and storage requirements. Extrapolating the results to larger limits, it was found that using zombie networks it is possible to obtain prime numbers up to a limit of 2^50. The main conclusion is that using this technique, low transfer rates and storage capabilities are the current limiting factors, rather than the limited computing power. (Author)

  17. Using a Grid to Evaluate the Threat of Zombie Networks against Asymmetric Encryption Algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Palacios, R.

    2007-07-01

    The large network of compromised computers currently available can be used to run complex algorithms in order to solve problems considered to be beyond current computational power. A distributed algorithm for obtaining prime numbers is presented. The algorithm was run on a grid system to obtain all prime numbers up to 2^40, measuring execution times and storage requirements. Extrapolating the results to larger limits, it was found that using zombie networks it is possible to obtain prime numbers up to a limit of 2^50. The main conclusion is that using this technique, low transfer rates and storage capabilities are the current limiting factors, rather than the limited computing power. (Author)
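
    The abstract gives no implementation detail beyond distributing the prime search across nodes; the sketch below shows one plausible way to segment the work (each segment is sieved independently, so segments can be farmed out to workers or remote machines), with local processes standing in for grid nodes.

        from math import isqrt
        from multiprocessing import Pool

        def simple_sieve(n):
            """All primes below n, by the classic sieve of Eratosthenes."""
            flags = bytearray([1]) * n
            flags[:2] = b"\x00\x00"
            for p in range(2, isqrt(n) + 1):
                if flags[p]:
                    flags[p * p::p] = bytearray(len(flags[p * p::p]))
            return [i for i, f in enumerate(flags) if f]

        def sieve_segment(bounds):
            """Sieve the interval [lo, hi) using the base primes below sqrt(hi)."""
            lo, hi = bounds
            is_prime = bytearray([1]) * (hi - lo)
            for p in simple_sieve(isqrt(hi) + 1):
                start = max(p * p, ((lo + p - 1) // p) * p)
                for m in range(start, hi, p):
                    is_prime[m - lo] = 0
            return [lo + i for i, flag in enumerate(is_prime) if flag]

        def primes_up_to(limit, workers=4, segment=1_000_000):
            """Split [2, limit) into independent segments, one unit of work per node."""
            bounds = [(lo, min(lo + segment, limit)) for lo in range(2, limit, segment)]
            with Pool(workers) as pool:
                return [p for chunk in pool.map(sieve_segment, bounds) for p in chunk]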

  18. Medical Image Segmentation by Using the Pillar K Means Algorithm

    OpenAIRE

    S. Vaseem Akram; K. Sravya

    2013-01-01

    This paper presents new algorithms for medical image segmentation. It presents a method to extract the malignant part from a T1-weighted MR brain image and analyzes its performance against various algorithms. The algorithms considered are the K-means algorithm, the C-means algorithm, and the Pillar K-means algorithm. K-means and C-means are already existing algorithms; the newly proposed algorithm is the Pillar K-means algorithm. This segmentation process includes a new mechanism for clustering the el...

  19. Classification of Encrypted Text and Encrypted Speech (Short Communication

    Directory of Open Access Journals (Sweden)

    Rajesh Asthana

    2010-07-01

    Full Text Available The information to be exchanged between two parties can be text data or speech data. This data is encrypted for its security and communicated to the other end. When an adversary intercepts the encrypted data, then in order to recover the actual information, his first step is to identify whether the intercepted data is encrypted text or encrypted speech. The next step is to get the actual information from the encrypted text or encrypted speech. In this paper, pattern recognition techniques are applied for the identification of encrypted text and encrypted speech. Some new and modified feature extraction techniques have been used to convert the text and speech data into three-dimensional, four-dimensional, and five-dimensional measurement vectors. These multi-dimensional measurement vectors are converted into two-dimensional vectors using a projection pursuit technique based on Sammon's algorithm and Chien's algorithm. The quantified classification performances using a minimum distance classifier and a maximum likelihood classifier have also been given. Defence Science Journal, 2010, 60(4), pp. 420-422, DOI: http://dx.doi.org/10.14429/dsj.60.497

  20. An Improved Algorithm For Image Compression Using Geometric Image Approximation

    Directory of Open Access Journals (Sweden)

    V. J. Rehna

    2014-06-01

    Full Text Available Our dependence on digital media continues to grow, and therefore finding competent ways of storing and conveying these large amounts of data has become a major concern. The technique of image compression has thus become very essential and highly applicable. In this paper, the performance of an efficient image coding method based on geometric wavelets, which divides the desired image using a recursive procedure for image coding, is explored. The objective of the work is to optimize the performance of the geometric wavelet based image coding scheme and to suggest a method to reduce the time complexity of the algorithm. We have used the polar coordinate form of the straight line in the BSP scheme for partitioning the image domain. A novel pruning algorithm is applied to optimize the rate-distortion curve and achieve the desired bit rate. The algorithm is also implemented with the concept of no tiling, and its effect on PSNR and computation time is explored. The enhanced results show a gain of 2.24 dB over the EZW algorithm and 1.4 dB over the SPIHT algorithm at the bit rate 0.0625 bpp for the Lena test image. Image tiling is found to considerably reduce the computational complexity, and in turn the time complexity of the algorithm, without affecting its coding efficiency. The algorithm offers remarkable results in terms of PSNR compared to existing techniques.

  1. Image processing technologies algorithms, sensors, and applications

    CERN Document Server

    Aizawa, Kiyoharu; Suenaga, Yasuhito

    2004-01-01

    Showcasing the most influential developments, experiments, and architectures impacting the digital, surveillance, automotive, industrial, and medical sciences, Image Processing Technologies tracks the evolution and advancement of computer vision and image processing (CVIP) technologies, examining methods and algorithms for image analysis, optimization, segmentation, and restoration. It focuses on recent approaches and techniques in CVIP applications development and explores various coding methods for individual types of 3-D images. This text/reference brings researchers and specialists up-to-

  2. Algorithms for reconstructing images for industrial applications

    International Nuclear Information System (INIS)

    Several algorithms for reconstructing objects from their projections are being studied in our laboratory for industrial applications. Such algorithms are useful for locating the position and shape of different compositions of materials in the object. A comparative study of two algorithms is made. The two investigated algorithms are the MART (Multiplicative Algebraic Reconstruction Technique) and the convolution method. The comparison is carried out from the point of view of the quality of the reconstructed image, the number of views and the cost. (Author)
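
    For reference, the MART update is multiplicative and row-by-row: each pixel is scaled by the ratio of the measured projection to the current estimate, raised to a power given by the pixel's weight in that ray. A compact sketch, assuming a nonnegative system matrix with entries roughly in [0, 1] and nonnegative measurements, is:

        import numpy as np

        def mart(A, b, n_iter=50, lam=1.0):
            """Multiplicative ART: x_j <- x_j * (b_i / (A_i . x)) ** (lam * a_ij), ray by ray.
            A is the (rays x pixels) projection matrix, b the measured projections."""
            x = np.ones(A.shape[1])
            for _ in range(n_iter):
                for i in range(A.shape[0]):
                    est = A[i] @ x
                    if est > 0:
                        x *= (b[i] / est) ** (lam * A[i])
            return x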

  3. Image feature extraction in encrypted domain with privacy-preserving SIFT.

    Science.gov (United States)

    Hsu, Chao-Yung; Lu, Chun-Shien; Pei, Soo-Chang

    2012-11-01

    Privacy has received considerable attention but is still largely ignored in the multimedia community. Consider a cloud computing scenario where the server is resource-abundant, and is capable of finishing the designated tasks. It is envisioned that secure media applications with privacy preservation will be treated seriously. In view of the fact that scale-invariant feature transform (SIFT) has been widely adopted in various fields, this paper is the first to target the importance of privacy-preserving SIFT (PPSIFT) and to address the problem of secure SIFT feature extraction and representation in the encrypted domain. As all of the operations in SIFT must be moved to the encrypted domain, we propose a privacy-preserving realization of the SIFT method based on homomorphic encryption. We show through the security analysis based on the discrete logarithm problem and RSA that PPSIFT is secure against ciphertext only attack and known plaintext attack. Experimental results obtained from different case studies demonstrate that the proposed homomorphic encryption-based privacy-preserving SIFT performs comparably to the original SIFT and that our method is useful in SIFT-based privacy-preserving applications. PMID:22711774
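
    The abstract relies on an additively homomorphic cryptosystem so that sums (and hence filtering-like operations) can be evaluated on ciphertexts. A toy Paillier-style example of that homomorphism, with deliberately tiny primes and no claim to match the paper's exact parameters, is:

        from math import gcd
        import secrets

        p, q = 2003, 1999                       # real deployments use large random primes
        n, n2 = p * q, (p * q) ** 2
        g = n + 1
        lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)
        mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

        def encrypt(m):
            while True:
                r = secrets.randbelow(n - 1) + 1
                if gcd(r, n) == 1:
                    return (pow(g, m, n2) * pow(r, n, n2)) % n2

        def decrypt(c):
            return ((pow(c, lam, n2) - 1) // n * mu) % n

        c1, c2 = encrypt(25), encrypt(17)
        assert decrypt((c1 * c2) % n2) == 42    # E(a) * E(b) decrypts to a + b (mod n)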

  4. Color Image Clustering using Block Truncation Algorithm

    CERN Document Server

    Silakari, Sanjay; Maheshwari, Manish

    2009-01-01

    With the advancement of image capturing devices, image data is being generated at high volume. If images are analyzed properly, they can reveal useful information to human users. Content-based image retrieval addresses the problem of retrieving images relevant to the user's needs from image databases on the basis of low-level visual features that can be derived from the images. Grouping images into meaningful categories to reveal useful information is a challenging and important problem. Clustering is a data mining technique to group a set of unsupervised data based on the conceptual clustering principle: maximizing the intraclass similarity and minimizing the interclass similarity. The proposed framework focuses on color as the feature. Color Moment and Block Truncation Coding (BTC) are used to extract features for the image dataset. An experimental study using the K-Means clustering algorithm is conducted to group the image dataset into various clusters.
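
    A small sketch of the BTC-based feature extraction follows (K-means clustering of the resulting vectors is then straightforward); the block size and feature layout here are illustrative choices, not necessarily those of the paper.

        import numpy as np

        def btc_levels(channel, block=4):
            """Summarize one color channel by the averages of its per-block BTC levels."""
            h = (channel.shape[0] // block) * block
            w = (channel.shape[1] // block) * block
            blocks = channel[:h, :w].reshape(h // block, block, w // block, block)
            blocks = blocks.swapaxes(1, 2).reshape(-1, block * block).astype(float)
            mean = blocks.mean(axis=1, keepdims=True)
            std = blocks.std(axis=1, keepdims=True)
            q = (blocks >= mean).sum(axis=1, keepdims=True)        # pixels at or above the mean
            p = block * block
            lo = mean - std * np.sqrt(q / np.maximum(p - q, 1))    # moment-preserving low level
            hi = mean + std * np.sqrt((p - q) / np.maximum(q, 1))  # moment-preserving high level
            return np.array([lo.mean(), hi.mean()])

        def btc_feature(rgb):
            """Concatenate the BTC levels of the R, G, B channels into a 6-D feature vector."""
            return np.concatenate([btc_levels(rgb[..., c]) for c in range(3)])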

  5. Quantum fully homomorphic encryption scheme based on universal quantum circuit

    Science.gov (United States)

    Liang, Min

    2015-08-01

    Fully homomorphic encryption enables arbitrary computation on encrypted data without decrypting the data. Here it is studied in the context of quantum information processing. Based on universal quantum circuit, we present a quantum fully homomorphic encryption (QFHE) scheme, which permits arbitrary quantum transformation on any encrypted data. The QFHE scheme is proved to be perfectly secure. In the scheme, the decryption key is different from the encryption key; however, the encryption key cannot be revealed. Moreover, the evaluation algorithm of the scheme is independent of the encryption key, so it is suitable for delegated quantum computing between two parties.

  6. Geometric direct search algorithms for image registration.

    Science.gov (United States)

    Lee, Seok; Choi, Minseok; Kim, Hyungmin; Park, Frank Chongwoo

    2007-09-01

    A widely used approach to image registration involves finding the general linear transformation that maximizes the mutual information between two images, with the transformation being rigid-body [i.e., belonging to SE(3)] or volume-preserving [i.e., belonging to SL(3)]. In this paper, we present coordinate-invariant, geometric versions of the Nelder-Mead optimization algorithm on the groups SL(3), SE(3), and their various subgroups, that are applicable to a wide class of image registration problems. Because the algorithms respect the geometric structure of the underlying groups, they are numerically more stable, and exhibit better convergence properties than existing local coordinate-based algorithms. Experimental results demonstrate the improved convergence properties of our geometric algorithms. PMID:17784595

  7. Review on Reserving Room Before Encryption for Reversible Data Hiding

    Directory of Open Access Journals (Sweden)

    Akshata Malwad

    2014-03-01

    Full Text Available Nowadays, data hacking is a very big problem. There are a number of techniques available in the industry to maintain the security of data. So data hiding in encrypted images comes into the picture, but the problem is the occurrence of distortion in the original cover at the time of data extraction. That is why, recently, more and more attention is paid to reversible data hiding (RDH) in encrypted images, since it offers an excellent way for the original cover to be recovered without any loss after the embedded data is extracted, while protecting the confidentiality of the image content. In this paper we propose a novel method of reserving room before encryption (RRBE) with a traditional RDH algorithm, which makes it easy for the data hider to reversibly embed data in the encrypted image. Using RRBE we can achieve real reversibility, that is, data extraction and image recovery are free of any error, and we can also increase the rate of data to be hidden. This method recovers the image with its original quality and an improved PSNR.

  8. Public Key Encryption Algorithms for Wireless Sensor Networks In tinyOS

    Directory of Open Access Journals (Sweden)

    Chandni Vaghasia

    2013-03-01

    Full Text Available Generally, when people consider wireless devices they think of items such as cell phones, personal digital assistants, or laptops. These items are costly, target specialized applications, and rely on the pre-deployment of extensive infrastructure support. In contrast, wireless sensor networks use small, low-cost embedded devices for a wide range of applications and do not rely on any pre-existing infrastructure. The emerging field of wireless sensor networks (WSN) combines sensing, computation, and communication into a single tiny device called a sensor node or mote. Through advanced mesh networking protocols, these devices form a sea of connectivity that extends the reach of cyberspace out into the physical world. Here some algorithms are implemented and the results are analyzed on different platforms such as PC, MICA, Mica2 and Mica2dot, to determine which algorithm is best for which platform.

  9. ROBUST SECURE AND BLIND WATERMARKING BASED ON DWT DCT PARTIAL MULTI MAP CHAOTIC ENCRYPTION

    Directory of Open Access Journals (Sweden)

    Esam A. Hagras

    2011-11-01

    Full Text Available In this paper, a novel Commutative Watermarking and Partial Encryption (CWPE) algorithm based on the Discrete Wavelet Transform and Discrete Cosine Transform (DWT-DCT) for watermarking and Multi-Map Wavelet Chaotic Encryption (MMW-CE) is proposed. The original host image is first decomposed into four sub-bands using the DWT; each sub-band's coefficients are relocated using the Arnold transform to create a noise-like version, then a partial encryption scheme is applied in which a chaotic scrambled random number pattern is bitwise XORed with the scrambled horizontal coefficients only, and the shuffled approximation coefficients are divided into non-overlapping and equal-sized blocks. The watermark embedding process is based on extracting the DCT middle frequencies of the encrypted approximation coefficient blocks. Based on a comparison threshold of the extracted DCT mid-band coefficients, watermark bits are embedded in the coefficients of the corresponding DCT middle frequencies. The experimental results show that the proposed algorithm is robust against common signal processing attacks. The proposed algorithm is able to reduce encryption to one quarter of the image information. Statistical and differential analyses are performed to estimate the security strength of the proposed algorithm. The results of the security analysis show that the proposed algorithm provides a high security level for real time application.

  10. IMAGE STEGANOGRAPHY USING LEAST SIGNIFICANT BIT WITH CRYPTOGRAPHY

    OpenAIRE

    Vikas Tyagi

    2012-01-01

    To increase the security of messages sent over the internet, steganography is used. This paper discusses a technique based on the LSB (least significant bit) and a new encryption algorithm. By matching data to an image, there is less chance of an attacker being able to use steganalysis to recover the data. Before hiding the data in an image, the application first encrypts it. Keywords: Steganography, LSB (least significant bit), Encryption, Decryption.

  11. Algorithm for fast fractal image compression

    Science.gov (United States)

    Kominek, John

    1995-04-01

    Fractal image compression is a promising new technology that may successfully provide a codec for PC-to-PC video communications. Unfortunately, the large amount of computation needed for the compression stage is a major obstacle that needs to be overcome. This paper introduces the Fast Fractal Image Compression algorithm, a new approach to breaking the `speed problem' that has plagued previous efforts. For still images, experiments show that at comparable quality levels the FFIC algorithm is 5 to 50 times faster than the current state of the art. Such an improvement brings real-time video applications within the reach of fractal mathematics.

  12. Image Compression Based on Improved FFT Algorithm

    Directory of Open Access Journals (Sweden)

    Juanli Hu

    2011-07-01

    Full Text Available Image compression is a crucial step in the image processing area. The Fourier transform is the classical algorithm that can convert an image from the spatial domain to the frequency domain. Because of its good energy-concentration property, the Fourier transform has been widely applied in image coding, image segmentation and image reconstruction. This paper adopts the Radix-4 Fast Fourier transform (Radix-4 FFT) to realize limited-distortion image coding, and discusses the feasibility and the advantages of the Fourier transform for image compression. It aims to deal with the complexity and time consumption of the Fourier transform, exploiting the conjugate symmetry of the image's Fourier transform to reduce data storage and computational complexity. Using the Radix-4 FFT also reduces the algorithm's running time; three non-uniform quantization tables are designed for different compression requirements, i.e. different demands on image quality and compression ratio. Taking the standard Lena image as experimental data, the results show that the implementation by Radix-4 FFT is simple, the effect is ideal, and it is less time-consuming.
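
    A compact illustration of the energy-compaction idea (keep only the strongest Fourier coefficients and invert) is given below; it uses NumPy's generic FFT rather than the paper's Radix-4 implementation and omits the non-uniform quantization tables.

        import numpy as np

        def fft_compress(image, keep=0.05):
            """Zero all but the strongest `keep` fraction of Fourier coefficients, then invert."""
            F = np.fft.fft2(image.astype(float))
            mags = np.abs(F).ravel()
            threshold = np.sort(mags)[int((1 - keep) * (mags.size - 1))]
            F_sparse = np.where(np.abs(F) >= threshold, F, 0)
            recon = np.real(np.fft.ifft2(F_sparse))
            return np.clip(recon, 0, 255).astype(np.uint8)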

  13. Enhanced Throughput AES Encryption

    OpenAIRE

    Kunal Lala; Ajay Kumar; De, Amit Kumar

    2012-01-01

    This paper presents our experience in implementing the Advanced Encryption Standard (AES) algorithm. We have used a 128-bit block size and a 128-bit cipher key for the implementation. The AES, also known as the Rijndael algorithm, is used to ensure the security of transmission channels. The Xilinx design tool 13.3 and the Xilinx Project Navigator design tool are used for synthesis and simulation. Very high speed integrated circuit hardware description language (VHDL) is used for coding. The fully pipelined desig...

  14. Dermoscopic Image Segmentation using Machine Learning Algorithm

    Directory of Open Access Journals (Sweden)

    L. P. Suresh

    2011-01-01

    Full Text Available Problem statement: Malignant melanoma is the most frequent type of skin cancer. Its incidence has been rapidly increasing over the last few decades. Medical image segmentation is the most essential and crucial process for facilitating the characterization and visualization of the structure of interest in medical images. Approach: This study explains the task of segmenting skin lesions in dermoscopy images based on intelligent systems such as fuzzy and neural network clustering techniques for the early diagnosis of malignant melanoma. The various intelligent-system-based clustering techniques used are the Fuzzy C Means algorithm (FCM), the Possibilistic C Means algorithm (PCM), the Hierarchical C Means algorithm (HCM), the C-means-based Fuzzy Hopfield Neural Network, the Adaline Neural Network and the Regression Neural Network. Results: The segmented images are compared with the ground truth image using various parameters such as False Positive Error (FPE), False Negative Error (FNE), coefficient of similarity and spatial overlap, and their performance is evaluated. Conclusion: The experimental results show that the Hierarchical C Means (fuzzy) algorithm provides better segmentation than the other clustering algorithms (Fuzzy C Means, Possibilistic C Means, Adaline Neural Network, FHNN and GRNN). Thus the Hierarchical C Means approach can handle the uncertainties that exist in the data efficiently and is useful for lesion segmentation in a computer-aided diagnosis system to assist the clinical diagnosis of dermatologists.

  15. Amalgamation of Cyclic Bit Operation in SD-EI Image Encryption Method: An Advanced Version of SD-EI Method: SD-EI Ver-2

    Directory of Open Access Journals (Sweden)

    Somdip Dey

    2015-05-01

    Full Text Available In this paper, the author presents an advanced version of an image encryption technique, which is itself an upgraded version of the SD-EI image encryption method. In this new method, SD-EI Ver-2, there are more bit-wise manipulations compared to the original SD-EI method. The proposed method consists of three stages: 1) First, a number is generated from the password and each pixel of the image is converted to its equivalent eight-bit binary number, and in that eight-bit number, a number of bits equal to the length of the number generated from the password are rotated and reversed; 2) In the second stage, the extended hill cipher technique is applied by using an involutory matrix, which is generated by the same password used in the second stage of encryption, to make it more secure; 3) In the last stage, we perform a modified cyclic bit manipulation. First, the pixel values are again converted to their 8-bit binary format. Then 8 consecutive pixels are chosen and an 8x8 matrix is formed out of these 8-bit values of the 8 pixels. After that, a matrix cyclic operation is performed a randomized number of times, which is again dependent on the password provided for encryption. After the generation of the new 8-bit values of the pixels, they are converted back to their decimal format and the new value is written in place of the old pixel value. SD-EI Ver-2 has been tested on different image files and the results were very satisfactory.
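
    One simple reading of the first stage is sketched below: a rotation count is derived from the password, every pixel's byte is rotated by that amount and its bit order reversed. Both the password-to-number derivation and the exact bit selection are assumptions made for illustration; the hill cipher and cyclic-matrix stages are not shown.

        import numpy as np

        BIT_REVERSE = np.array([int(f"{b:08b}"[::-1], 2) for b in range(256)], dtype=np.uint8)

        def password_number(password, low=1, high=7):
            """Illustrative derivation of a small rotation count from the password."""
            return low + sum(password.encode()) % (high - low + 1)

        def stage1(image, password):
            k = password_number(password)
            px = image.astype(np.uint16)
            rotated = (((px << k) | (px >> (8 - k))) & 0xFF).astype(np.uint8)
            return BIT_REVERSE[rotated]                # rotate left by k, then reverse bit order

        def stage1_inverse(image, password):
            k = password_number(password)
            px = BIT_REVERSE[image].astype(np.uint16)  # undo the bit reversal first
            return (((px >> k) | (px << (8 - k))) & 0xFF).astype(np.uint8)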

  16. Design of AES Algorithm for 128/192/256 Key Length in FPGA

    Directory of Open Access Journals (Sweden)

    Pravin V. Kinge

    2014-05-01

    Full Text Available Cryptographic algorithms can be implemented in software or built in pure hardware. However, a Field Programmable Gate Array (FPGA) implementation offers a quicker solution and can be easily upgraded to incorporate any protocol changes. The AES algorithm is used for data, and it is also suitable for image encryption and decryption to protect confidential images from unauthorized access. This project proposes a method in which the image data is the input to the AES algorithm, to obtain the encrypted image, and the encrypted image is the input to AES decryption to get the original image. This project proposes to implement the 128-, 192- and 256-bit AES algorithm for data encryption and decryption, and also to compare the speed of operation, efficiency, security and frequency. The proposed work will be synthesized and simulated on the FPGA family with Xilinx ISE 13.2 and the ModelSim tool, respectively, in the Very high speed integrated circuit Hardware Description Language (VHDL).
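
    For comparison with the hardware design, the same 128/192/256-bit AES choice can be exercised in software; the sketch below uses the PyCryptodome library in CBC mode on the raw pixel bytes (the mode and padding are illustrative choices, not taken from the record).

        import numpy as np
        from Crypto.Cipher import AES
        from Crypto.Random import get_random_bytes
        from Crypto.Util.Padding import pad, unpad

        def encrypt_image(pixels, key):
            iv = get_random_bytes(16)
            cipher = AES.new(key, AES.MODE_CBC, iv)
            return iv + cipher.encrypt(pad(pixels.tobytes(), AES.block_size))

        def decrypt_image(blob, key, shape, dtype=np.uint8):
            cipher = AES.new(key, AES.MODE_CBC, blob[:16])
            raw = unpad(cipher.decrypt(blob[16:]), AES.block_size)
            return np.frombuffer(raw, dtype=dtype).reshape(shape)

        key = get_random_bytes(32)                    # 16/24/32 bytes -> AES-128/192/256
        img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
        assert np.array_equal(decrypt_image(encrypt_image(img, key), key, img.shape), img)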

  17. Fractional Fourier domain optical image hiding using phase retrieval algorithm based on iterative nonlinear double random phase encoding.

    Science.gov (United States)

    Wang, Xiaogang; Chen, Wen; Chen, Xudong

    2014-09-22

    We present a novel image hiding method based on phase retrieval algorithm under the framework of nonlinear double random phase encoding in fractional Fourier domain. Two phase-only masks (POMs) are efficiently determined by using the phase retrieval algorithm, in which two cascaded phase-truncated fractional Fourier transforms (FrFTs) are involved. No undesired information disclosure, post-processing of the POMs or digital inverse computation appears in our proposed method. In order to achieve the reduction in key transmission, a modified image hiding method based on the modified phase retrieval algorithm and logistic map is further proposed in this paper, in which the fractional orders and the parameters with respect to the logistic map are regarded as encryption keys. Numerical results have demonstrated the feasibility and effectiveness of the proposed algorithms. PMID:25321769

  18. FPGA Implementation of Park-Miller Algorithm to Generate Sequence of 32-Bit Pseudo Random Key for Encryption and Decryption of Plain Text

    Directory of Open Access Journals (Sweden)

    Bharatesh N

    2014-02-01

    Full Text Available Many problems arise in randomized algorithms whose solutions are fundamentally based on the assumption that pure random numbers exist; pseudo-random number generators can imitate randomness sufficiently well for most applications. The proposed scheme is an FPGA implementation of the Park-Miller algorithm for generating a sequence of pseudo-random keys. Properties such as high speed, low power and flexibility of the designed PRNG (Pseudo-Random Number Generator) make any digital circuit faster and smaller. The algorithm uses a PRNG module containing a 32-bit Booth multiplier, a 32-bit floating-point divider and an FSM module. After generating a sequence of 32-bit pseudo-random numbers, these numbers are used as a key to encrypt 128-bit plain text into cipher text, and the same key is used to decrypt the encrypted data to recover the original plain text. The programming is done in Verilog HDL, successfully synthesized and implemented on a Xilinx Spartan-3E FPGA kit.
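
    The Park-Miller ("minimal standard") recurrence is x_{n+1} = 16807 * x_n mod (2^31 - 1). How the generated words combine with the 128-bit block is not specified in the record, so the XOR keystream below is only an illustrative software analogue of the hardware design.

        M = 2**31 - 1            # Mersenne prime modulus
        A = 16807                # Park-Miller multiplier

        def park_miller(seed):
            """Yield successive 31-bit pseudo-random values of the Lehmer recurrence."""
            x = seed % M or 1    # the seed must lie in [1, M - 1]
            while True:
                x = (A * x) % M
                yield x

        def xor_crypt(data, seed):
            """XOR the data with the generator output, 4 bytes per generated value;
            running it twice with the same seed restores the plaintext."""
            gen, out = park_miller(seed), bytearray(data)
            for i in range(0, len(out), 4):
                word = next(gen).to_bytes(4, "big")
                for j in range(min(4, len(out) - i)):
                    out[i + j] ^= word[j]
            return bytes(out)

        ct = xor_crypt(b"a 128-bit block.", 20140212)
        assert xor_crypt(ct, 20140212) == b"a 128-bit block."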

  19. Variable Weighted Ordered Subset Image Reconstruction Algorithm

    OpenAIRE

    Ming Jiang; Yan Han; Tie Zhou; Jinxiao Pan

    2006-01-01

    We propose two variable weighted iterative reconstruction algorithms (VW-ART and VW-OS-SART) to improve the algebraic reconstruction technique (ART) and simultaneous algebraic reconstruction technique (SART) and establish their convergence. In the two algorithms, the weighting varies with the geometrical direction of the ray. Experimental results with both numerical simulation and real CT data demonstrate that the VW-ART has a significant improvement in the quality of reconstructed images ove...

  20. Implementation of LBG Algorithm for Image Compression

    Directory of Open Access Journals (Sweden)

    Ms. Asmita A. Bardekar; Mr. P. A. Tijare

    2011-12-01

    Full Text Available This paper presents an implementation of the LBG algorithm for image compression, which makes it possible to create file sizes of manageable, storable and transmittable dimensions. Image compression techniques fall under two categories, namely lossless and lossy. The Linde, Buzo, and Gray (LBG) algorithm is an iterative algorithm which alternately solves the two optimality criteria, i.e. the nearest-neighbor condition and the centroid condition. The algorithm requires an initial codebook to start with. The codebook is generated using a training set of images. There are different methods, such as random codes and splitting, with which the initial codebook can be obtained. Here the initial codebook is obtained by the splitting method of the LBG algorithm. In this method an initial code vector is set as the average of the entire training sequence. This code vector is then split into two. The iterative algorithm is run with these two vectors as the initial codebook. The final two code vectors are split into four, and the process is repeated until the desired number of code vectors is obtained. The LBG algorithm is measured by calculating performance metrics such as the compression ratio (CR), mean square error (MSE) and peak signal-to-noise ratio (PSNR).
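
    A compact version of the splitting-based LBG iteration described above (the training vectors would typically be flattened 4x4 image blocks):

        import numpy as np

        def lbg(training, n_codes=16, eps=0.01, iters=20):
            """Grow a codebook from the global mean by repeated splitting and refinement."""
            codebook = training.mean(axis=0, keepdims=True)
            while codebook.shape[0] < n_codes:
                codebook = np.vstack([codebook * (1 + eps), codebook * (1 - eps)])  # split
                for _ in range(iters):
                    # Nearest-neighbor condition: assign each vector to its closest code
                    d = ((training[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
                    nearest = d.argmin(axis=1)
                    # Centroid condition: move each code to the mean of its assigned vectors
                    for k in range(codebook.shape[0]):
                        members = training[nearest == k]
                        if len(members):
                            codebook[k] = members.mean(axis=0)
            return codebook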

  1. Binary image authentication based on watermarking algorithm

    Science.gov (United States)

    Masoodifar, Behrang; Hashemi, S. Mojtaba; Zarei, Omid

    2011-06-01

    A digital image watermark embedding and extracting algorithm is presented based on the Finite Ridgelet Transform (FRT), which can efficiently represent images with linear singularities. In general, the ridgelet transform also has directional sensitivity, so that among the transformed coefficients the most significant one represents the most energetic direction of straight edges in an image. In this paper the effect of the ridgelet transform is compared with the wavelet transform in a watermarking application. Different noises with different PSNR are added to the watermarked image in the experiments, and the results show robustness and transparency.

  2. Robust Secure and Blind Watermarking Based on DWT DCT Partial Multi Map Chaotic Encryption

    Directory of Open Access Journals (Sweden)

    Esam A. Hagras

    2011-12-01

    Full Text Available In this paper, a novel Commutative Watermarking and Partial Encryption (CWPE) algorithm based on the Discrete Wavelet Transform and Discrete Cosine Transform (DWT-DCT) for watermarking and Multi-Map Wavelet Chaotic Encryption (MMW-CE) is proposed. The original host image is first decomposed into four sub-bands using the DWT; each sub-band's coefficients are relocated using the Arnold transform to create a noise-like version, then a partial encryption scheme is applied in which a chaotic scrambled random number pattern is bitwise XORed with the scrambled horizontal coefficients only, and the shuffled approximation coefficients are divided into non-overlapping and equal-sized blocks. The watermark embedding process is based on extracting the DCT middle frequencies of the encrypted approximation coefficient blocks. Based on a comparison threshold of the extracted DCT mid-band coefficients, watermark bits are embedded in the coefficients of the corresponding DCT middle frequencies. The experimental results show that the proposed algorithm is robust against common signal processing attacks. The proposed algorithm is able to reduce encryption to one quarter of the image information. Statistical and differential analyses are performed to estimate the security strength of the proposed algorithm. The results of the security analysis show that the proposed algorithm provides a high security level for real time application.

  3. Algorithm for Fast Registration of Radar Images

    Directory of Open Access Journals (Sweden)

    Subrata Rakshit

    2002-07-01

    Full Text Available Radar imagery provides all-weather, 24 h coverage, making it ideal for critical defence applications. In some applications, multiple images acquired of an area need to be registered for further processing. Such situations arise in battlefield surveillance based on satellite imagery. The registration has to be done between an earlier (reference) image and a new (live) image. For automated surveillance, registration is a prerequisite for change detection. Speed is essential due to the large volumes of data involved and the need for quick responses. The registration transformation is quite simple, being mainly a global translation (scale and rotation corrections can be applied based on known camera parameters). The challenge lies in the fact that radar images are not as feature-rich as optical images and the image content variation can be as high as 90 per cent. Even though the change on the ground may not be drastic, seasonal variations can significantly alter the radar signatures of ground, vegetation, and water bodies. This necessitates a novel approach different from the techniques developed for optical images. An algorithm has been developed that leads to fast registration of radar images, even in the presence of specular noise and significant scene content variation. The key features of this approach are adaptability to sensor/terrain types, ability to handle large content variations and false positive rejection. The present work shows that this algorithm allows for various cost-performance trade-offs, making it suitable for a wide variety of applications. The algorithm, in various cost-performance configurations, is tested on a set of ERS images. Results of such tests have been reported, indicating the performance of the algorithm for various cost-performance trade-offs.

  4. Recursive algorithms for implementing digital image filters.

    Science.gov (United States)

    Ferrari, L A; Sankar, P V; Shinnaka, S; Sklansky, J

    1987-03-01

    The B-spline functions are used to develop recursive algorithms for the efficient implementation of two-dimensional linear digital image filters. These filters may be spatially varying. The B-splines are used in a representation of the desired point spread function. We show that this leads to recursive algorithms and hardware implementations which are more efficient than either direct spatial domain filter realizations or FFT implementations. The Z-transform is used to develop a discrete version of Duhamel's theorem. A computer architecture for B-spline image filters is proposed and a complexity analysis and comparison to other approaches is provided. PMID:22516640

  5. Digital image processing an algorithmic approach with Matlab

    CERN Document Server

    Qidwai, Uvais

    2009-01-01

    Contents include: Introduction to Image Processing and the MATLAB Environment (Introduction; Digital Image Definitions: Theoretical Account; Image Properties; MATLAB; Algorithmic Account; MATLAB Code); Image Acquisition, Types, and File I/O (Image Acquisition; Image Types and File I/O; Basics of Color Images; Other Color Spaces; Algorithmic Account; MATLAB Code); Image Arithmetic (Introduction; Operator Basics; Theoretical Treatment; Algorithmic Treatment; Coding Examples); Affine and Logical Operations, Distortions, and Noise in Images (Introduction; Affine Operations; Logical Operators; Noise in Images; Distortions in Images; Algorithmic Account)

  6. Tuning Range Image Segmentation by Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Gianluca Pignalberi

    2003-07-01

    Full Text Available Several range image segmentation algorithms have been proposed, each one to be tuned by a number of parameters in order to provide accurate results on a given class of images. Segmentation parameters are generally affected by the type of surfaces (e.g., planar versus curved) and the nature of the acquisition system (e.g., laser range finders or structured light scanners). It is impossible to answer the question: which is the best set of parameters, given a range image within a class and a range segmentation algorithm? Systems proposing such a parameter optimization are often based either on careful selection or on solution space-partitioning methods. Their main drawback is that they have to limit their search to a subset of the solution space to provide an answer in acceptable time. In order to provide a different automated method to search a larger solution space, and possibly to answer the above question more effectively, we propose a tuning system based on genetic algorithms. A complete set of tests was performed over a range of different images and with different segmentation algorithms. Our system provided a particularly high degree of effectiveness in terms of segmentation quality and search time.

  7. A fast chaotic encryption scheme based on piecewise nonlinear chaotic maps

    Energy Technology Data Exchange (ETDEWEB)

    Behnia, S. [Department of Physics, IAU, Urmia (Iran, Islamic Republic of)]. E-mail: s.behnia@iaurmia.ac.ir; Akhshani, A. [Department of Physics, IAU, Urmia (Iran, Islamic Republic of); Ahadpour, S. [Department of Theoretical Physics and Astrophysics, Tabriz University (Iran, Islamic Republic of); Department of Physics, Mohaghegh Ardabili University (Iran, Islamic Republic of); Mahmodi, H. [Department of Physics, IAU, Urmia (Iran, Islamic Republic of); Akhavan, A. [Department of Engineering, IAU, Urmia (Iran, Islamic Republic of)

    2007-07-02

    In recent years, a growing number of discrete chaotic cryptographic algorithms have been proposed. However, most of them encounter some problems such as a lack of robustness and security. In this Letter, we introduce a new image encryption algorithm based on one-dimensional piecewise nonlinear chaotic maps. The system is a measurable dynamical system with the interesting property of being either ergodic or having a stable period-one fixed point. The maps bifurcate from a stable single periodic state to a chaotic one and vice versa without the usual period-doubling or period-n-tupling scenario. Also, we present the KS-entropy of these maps with respect to the control parameter. This algorithm tries to remedy common weaknesses of such encryption, namely small key space, encryption speed and level of security.

  8. Experimental Study of Fractal Image Compression Algorithm

    Directory of Open Access Journals (Sweden)

    Chetan R. Dudhagara

    2012-08-01

    Full Text Available Image compression applications have been increasing in recent years. Fractal compression is a lossy compression method for digital images, based on fractals. The method is best suited for textures and natural images, relying on the fact that parts of an image often resemble other parts of the same image. In this paper, a study on fractal-based image compression with fixed-size partitioning will be made, analyzed for performance and compared with a standard frequency-domain image compression method, JPEG. Sample images will be used to perform compression and decompression. Performance metrics such as compression ratio, compression time and decompression time will be measured in the JPEG case. Also, the phenomenon of resolution/scale independence will be studied and described with examples. Fractal algorithms convert these parts into mathematical data called "fractal codes" which are used to recreate the encoded image. Fractal encoding is a mathematical process used to encode bitmaps containing a real-world image as a set of mathematical data that describes the fractal properties of the image. Fractal encoding relies on the fact that all natural, and most artificial, objects contain redundant information in the form of similar, repeating patterns called fractals.

  9. Image segmentation using an improved differential algorithm

    Science.gov (United States)

    Gao, Hao; Shi, Yujiao; Wu, Dongmei

    2014-10-01

    Among all the existing segmentation techniques, the thresholding technique is one of the most popular due to its simplicity, robustness, and accuracy (e.g. the maximum entropy method, Otsu's method, and K-means clustering). However, the computation time of these algorithms grows exponentially with the number of thresholds due to their exhaustive searching strategy. As a population-based optimization algorithm, the differential algorithm (DE) uses a population of potential solutions and decision-making processes. It has shown considerable success in solving complex optimization problems within a reasonable time limit. Thus, applying this method to segmentation algorithms should be a good choice due to its fast computational ability. In this paper, we first propose a new differential algorithm with a balance strategy, which seeks a balance between the exploration of new regions and the exploitation of the already sampled regions. Then, we apply the new DE to the traditional Otsu method to shorten the computation time. Experimental results of the new algorithm on a variety of images show that, compared with EA-based thresholding methods, the proposed DE algorithm gives more effective and efficient results. It also shortens the computation time of the traditional Otsu method.
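
    For reference, the quantity being maximized is Otsu's between-class variance; the exhaustive single-threshold search below is the baseline whose multi-threshold generalization the differential algorithm is meant to speed up.

        import numpy as np

        def otsu_threshold(gray):
            """Return the threshold that maximizes the between-class variance of a uint8 image."""
            prob = np.bincount(gray.ravel(), minlength=256).astype(float)
            prob /= prob.sum()
            best_t, best_var = 0, -1.0
            for t in range(1, 256):
                w0, w1 = prob[:t].sum(), prob[t:].sum()
                if w0 == 0 or w1 == 0:
                    continue
                mu0 = (np.arange(t) * prob[:t]).sum() / w0
                mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
                var_between = w0 * w1 * (mu0 - mu1) ** 2
                if var_between > best_var:
                    best_t, best_var = t, var_between
            return best_t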

  10. CHAOS-BASED ADVANCED ENCRYPTION STANDARD

    KAUST Repository

    Abdulwahed, Naif B.

    2013-05-01

    This thesis introduces a new chaos-based Advanced Encryption Standard (AES). The AES is a well-known encryption algorithm that was standardized by the U.S. National Institute of Standards and Technology (NIST) in 2001. The thesis investigates and explores the behavior of the AES algorithm by replacing two of its original modules, namely the S-Box and the Key Schedule, with two chaos-based modules. Three chaos systems are considered in designing the new modules: the Lorenz system with multiplication nonlinearity, the Chen system with sign modules nonlinearity, and a 1D multiscroll system with staircase nonlinearity. The three systems are evaluated on their sensitivity to initial conditions and as Pseudo Random Number Generators (PRNG) after applying a post-processing technique to their output and then performing the NIST SP 800-22 statistical tests. The thesis presents a hardware implementation of dynamic S-Boxes for AES that are populated using the three chaos systems. Moreover, a full MATLAB package to analyze the chaos-generated S-Boxes based on graphical analysis, Walsh-Hadamard spectrum analysis, and image encryption analysis is developed. Although these S-Boxes are dynamic, meaning they are regenerated whenever the encryption key is changed, the analysis results show that such S-Boxes exhibit good properties, such as the Strict Avalanche Criterion (SAC) and nonlinearity, and perform well in image encryption. Furthermore, the thesis presents a new Lorenz-chaos-based key expansion for the AES. Many researchers have pointed out defects in the original AES key expansion, which motivated this chaos-based key expansion proposal. The new proposed key schedule is analyzed and assessed in terms of confusion and diffusion by performing the frequency and SAC tests, respectively. The obtained results show that the new proposed design is more secure than the original AES key schedule and other designs proposed in the literature. The proposed design is then enhanced to increase the operating speed using the divide-and-conquer concept. This enhancement not only made the AES algorithm more secure, but also enabled it to be faster, as it can now operate at higher frequencies, and more area-efficient.
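
    One simple way to populate a bijective, key-dependent S-box from a chaotic source is to rank a stretch of the chaotic orbit and use the ranking as a permutation of 0-255; the logistic map below is only a stand-in for the thesis's Lorenz/Chen/multiscroll generators and post-processing, and the seed plays the role of the key.

```python
import numpy as np

def chaotic_sbox(x0=0.7131, r=3.99, burn_in=1000):
    """Build a bijective 256-entry S-box by ranking 256 samples of a chaotic orbit."""
    x = x0
    for _ in range(burn_in):
        x = r * x * (1.0 - x)          # logistic map: stand-in chaos source
    samples = np.empty(256)
    for i in range(256):
        x = r * x * (1.0 - x)
        samples[i] = x
    # argsort of the orbit values gives a permutation of 0..255, i.e. a bijective S-box
    return np.argsort(samples).astype(np.uint8)

sbox = chaotic_sbox()                  # changing (x0, r) regenerates a different S-box
inv = np.empty_like(sbox)
inv[sbox] = np.arange(256, dtype=np.uint8)
assert np.array_equal(inv[sbox], np.arange(256, dtype=np.uint8))   # invertible
```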

  11. ALGORITHMIC IMAGING APPROACH TO IMPOTENCE

    Directory of Open Access Journals (Sweden)

    Mahyar Ghafoori

    2012-05-01

    Full Text Available Impotence is a common problem that has a great impact on quality of life. Clinical evaluation can usually exclude endocrinologic imbalance, neurogenic dysfunction, and psychological problems as the etiology. A patient who fails to get an erection after injection of vasoactive medications probably has hemodynamic impotence. Dynamic studies that include imaging techniques are now available to discriminate between arterial and venous pathology. Doppler ultrasound with color flow and spectral analysis, dynamic infusion corpus cavernosometry and cavernosography, and selective internal pudendal arteriography are outpatient diagnostic procedures that will differentiate, image and quantify the abnormalities in patients with hemodynamic impotence. Not all tests are needed in every patient. Each of these examinations is preceded by the intracavernosal injection of vasoactive medication. Papaverine hydrochloride, phentolamine mesylate, or prostaglandin E1 will overcome normal sympathetic tone and produce an erection by smooth muscle relaxation and arterial dilatation in a normal patient. Color-flow Doppler and spectral analysis will show the cavernosal arteries and can identify the hemodynamic effects of stricture or occlusion. Peak systolic velocity is measured; normal ranges are well established. Spectral analysis is also used to predict the presence of venous disease. Sizable venous leaks in the dorsal penile vein are readily imaged. While the technique may not adequately identify low-grade venous pathology, it will identify the size and location of fibrous plaque formation associated with Peyronie's disease. Cavernosography or cavernosometry is a separate procedure that will quantitate the severity of venous incompetence as well as specifically identify the various avenues of systemic venous return that must be localized if venous occlusive therapy is chosen. In this study, the peak arterial systolic occlusion pressure is quantified during erection, and the presence of arterial pathology can be confirmed. The arterial data are not as reliable as the ultrasound-obtained data because they rely on audible Doppler, which can be obscured in the underlying "noise" heard with erection. The arterial data obtained with both of these examinations are quantitative and replace the qualitative audible Doppler used previously. Specialized equipment allows dynamic data acquisition, ensuring that the needed information is obtained at peak stimulation. Arteriography is done only if reconstructive surgery is contemplated. The examination includes subselective catheterization of the internal pudendal arteries, magnification technique, and evaluation of the recurrent epigastric arteries, which will be harvested for revascularization. An arterial operation is usually successful in younger patients with perineal trauma and a single point of stenosis or occlusion, but has been least successful in patients with atherosclerosis because of the multifocal nature of this disease. Diagnostic information available today is more specific and reliable than at any time in the past and can realistically estimate the severity of the hemodynamic disorder, allowing individualized treatment options. Ongoing studies will show whether the data these studies provide make an important difference in the treatment of vascular impotence.

  12. Simple Encryption/Decryption Application

    OpenAIRE

    Majdi Al-qdah; Lin Yi Hui

    2007-01-01

    This paper presents an Encryption/Decryption application that is able to work with any type of file; for example: image files, data files, documentation files…etc. The method of encryption is simple enough yet powerful enough to fit the needs of students and staff in a small institution. The application uses simple key generation method of random number generation and combination. The final encryption is a binary one performed through rotation of bits and XOR operation applied on each b...
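
    Since the abstract is truncated, the following is only a guess at the flavour of such a scheme: a randomly generated key combined with per-byte bit rotation and XOR. The key length, rotation amounts and overall structure are assumptions, not the paper's design.

```python
import secrets

def rotl8(b, n):
    n &= 7
    return ((b << n) | (b >> (8 - n))) & 0xFF

def rotr8(b, n):
    n &= 7
    return ((b >> n) | (b << (8 - n))) & 0xFF

def encrypt(data: bytes, key: bytes) -> bytes:
    # Rotate each byte by a key-dependent amount, then XOR with the key byte.
    return bytes(rotl8(b, key[i % len(key)]) ^ key[i % len(key)]
                 for i, b in enumerate(data))

def decrypt(data: bytes, key: bytes) -> bytes:
    # Undo the XOR first, then the rotation.
    return bytes(rotr8(b ^ key[i % len(key)], key[i % len(key)])
                 for i, b in enumerate(data))

key = secrets.token_bytes(16)          # simple random key generation
msg = b"any file content: image, data, document..."
assert decrypt(encrypt(msg, key), key) == msg
```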

  13. Variable Weighted Ordered Subset Image Reconstruction Algorithm

    Directory of Open Access Journals (Sweden)

    Ming Jiang

    2006-10-01

    Full Text Available We propose two variable weighted iterative reconstruction algorithms (VW-ART and VW-OS-SART to improve the algebraic reconstruction technique (ART and simultaneous algebraic reconstruction technique (SART and establish their convergence. In the two algorithms, the weighting varies with the geometrical direction of the ray. Experimental results with both numerical simulation and real CT data demonstrate that the VW-ART has a significant improvement in the quality of reconstructed images over ART and OS-SART. Moreover, both VW-ART and VW-OS-SART are more promising in convergence speed than the ART and SART, respectively.
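
    For reference, the basic (unweighted) ART update that VW-ART augments with ray-direction-dependent weights is the Kaczmarz sweep sketched below; the variable weighting itself is not reproduced here.

```python
import numpy as np

def art(A, b, iters=10, relax=0.5, x0=None):
    """Basic ART / Kaczmarz: sweep the rays, projecting x toward each row's hyperplane."""
    m, n = A.shape
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    row_norms = (A * A).sum(axis=1)
    for _ in range(iters):
        for i in range(m):
            if row_norms[i] > 0:
                x += relax * (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Tiny usage example on a random consistent system:
rng = np.random.default_rng(0)
A = rng.random((40, 16))
x_true = rng.random(16)
b = A @ x_true
print(np.linalg.norm(art(A, b, iters=200, relax=1.0) - x_true))   # small residual error
```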

  14. A speedy pixon image reconstruction algorithm

    OpenAIRE

    Eke, Vincent

    1999-01-01

    A speedy pixon algorithm for image reconstruction is described. Two applications of the method to simulated astronomical data sets are also reported. In one case, galaxy clusters are extracted from multiwavelength microwave sky maps using the spectral dependence of the Sunyaev-Zel'dovich effect to distinguish them from the microwave background fluctuations and the instrumental noise. The second example involves the recovery of a sharply peaked emission profile, such as might...

  15. Segmentation of Medical Image using Clustering and Watershed Algorithms

    OpenAIRE

    M. C.J. Christ; R.M.S. Parvathi

    2011-01-01

    Problem statement: Segmentation plays an important role in medical imaging. Segmentation of an image is the division or separation of the image into dissimilar regions of similar attribute. In this study we proposed a methodology that integrates clustering algorithm and marker controlled watershed segmentation algorithm for medical image segmentation. The use of the conservative watershed algorithm for medical image analysis is pervasive because of its advantages, such as always being able to...

  16. Image Colour Segmentation by Genetic Algorithms

    CERN Document Server

    Ramos, V; Ramos, Vitorino; Muge, Fernando

    2004-01-01

    Segmentation of a colour image composed of different kinds of texture regions can be a hard problem, namely computing exact texture fields and deciding the optimum number of segmentation areas when the image contains similar and/or non-stationary texture fields. In this work, a method is described for evolving adaptive procedures for these problems. In many real-world applications data clustering constitutes a fundamental issue whenever behavioural or feature domains can be mapped into topological domains. We formulate the segmentation problem upon such images as an optimisation problem and adopt the evolutionary strategy of Genetic Algorithms for the clustering of small regions in colour feature space. The present approach integrates k-Means unsupervised clustering into Genetic Algorithms, namely for guiding the evolutionary algorithm in its search for the optimal or sub-optimal data partition, a task that, as we know, requires a non-trivial search because of its intrinsic NP-complet...

  17. Multiplexing of encrypted data using fractal masks.

    Science.gov (United States)

    Barrera, John F; Tebaldi, Myrian; Amaya, Dafne; Furlan, Walter D; Monsoriu, Juan A; Bolognini, Néstor; Torroba, Roberto

    2012-07-15

    In this Letter, we present to the best of our knowledge a new all-optical technique for multiple-image encryption and multiplexing, based on fractal encrypting masks. The optical architecture is a joint transform correlator. The multiplexed encrypted data are stored in a photorefractive crystal. The fractal parameters of the key can be easily tuned to lead to a multiplexing operation without cross talk effects. Experimental results that support the potential of the method are presented. PMID:22825170

  18. Multiplexing of encrypted data using fractal masks

    OpenAIRE

    Monsoriu Serra, Juan Antonio; Barrera, J.F.; M. Tebaldi; Amaya, D.; Furlan, W.D.; BOLOGNINI, NESTOR ALBERTO; TORROBA, ROBERTO DANIEL

    2012-01-01

    In this Letter, we present to the best of our knowledge a new all-optical technique for multiple-image encryption and multiplexing, based on fractal encrypting masks. The optical architecture is a joint transform correlator. The multiplexed encrypted data are stored in a photorefractive crystal. The fractal parameters of the key can be easily tuned to lead to a multiplexing operation without cross talk effects. Experimental results that support the potential of the method are presented.

  19. GPUs benchmarking in subpixel image registration algorithm

    Science.gov (United States)

    Sanz-Sabater, Martin; Picazo-Bueno, Jose Angel; Micó, Vicente; Ferrerira, Carlos; Granero, Luis; Garcia, Javier

    2015-05-01

    Image registration techniques are used in different scientific fields, such as medical imaging or optical metrology. The most straightforward way to calculate the shift between two images is to use the cross correlation, taking the highest value of this correlation image. The shift resolution is then given in whole pixels, which may not be enough for certain applications. Better results can be achieved by interpolating both images up to the desired resolution and applying the same technique, but the memory needed by the system is significantly higher. To avoid this memory consumption we are implementing a subpixel shifting method based on the FFT. With the original images, subpixel shifting can be achieved by multiplying their discrete Fourier transforms by a linear phase with different slopes. This is a highly time-consuming method because checking each candidate shift requires new calculations. The algorithm, highly parallelizable, is very suitable for high performance computing systems. GPU (Graphics Processing Unit) accelerated computing became very popular more than ten years ago because GPUs provide hundreds of computational cores on a reasonably cheap card. In our case, we register the shift between two images, making a first approach by FFT-based correlation and then refining it at subpixel level using the technique described before. We consider it a `brute force' method. We will present a benchmark of the algorithm consisting of a first approach (pixel resolution) followed by subpixel refinement, decreasing the shift step in every loop and achieving high resolution in a few steps. This program will be executed on three different computers. At the end, we will present the results of the computation with different kinds of CPUs and GPUs, checking the accuracy of the method and the time consumed on each computer, and discussing the advantages and disadvantages of the use of GPUs.
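
    The Fourier-domain trick described in the abstract is easy to sketch: a subpixel shift is applied by multiplying the spectrum by a linear phase, and candidate shifts are scored against the reference; the loop below is the plain 'brute force' CPU version, without the GPU acceleration or the coarse-to-fine step refinement.

```python
import numpy as np

def subpixel_shift(img, dy, dx):
    """Shift an image by (dy, dx) pixels (possibly fractional) via a Fourier phase ramp."""
    fy = np.fft.fftfreq(img.shape[0])[:, None]
    fx = np.fft.fftfreq(img.shape[1])[None, :]
    phase = np.exp(-2j * np.pi * (fy * dy + fx * dx))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * phase))

def register(ref, moving, search=1.0, step=0.1):
    """Brute-force subpixel registration: try phase-shifted copies, keep the best correlation."""
    best = (-np.inf, 0.0, 0.0)
    shifts = np.arange(-search, search + 1e-9, step)
    for dy in shifts:
        for dx in shifts:
            score = np.sum(subpixel_shift(moving, dy, dx) * ref)
            if score > best[0]:
                best = (score, dy, dx)
    return best[1], best[2]

# Usage: recover a known (-0.3, 0.4) pixel shift of a smooth test image.
y, x = np.mgrid[0:64, 0:64]
ref = np.exp(-((x - 32.0) ** 2 + (y - 32.0) ** 2) / 50.0)
moving = subpixel_shift(ref, -0.3, 0.4)
print(register(ref, moving))            # expect roughly (0.3, -0.4)
```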

  20. A cubic map chaos criterion theorem with applications in generalized synchronization based pseudorandom number generator and image encryption

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Xiuping, E-mail: yangxiuping-1990@163.com; Min, Lequan, E-mail: minlequan@sina.com; Wang, Xue, E-mail: wangxue-20130818@163.com [Schools of Mathematics and Physics, University of Science and Technology Beijing, Beijing 100083 (China)

    2015-05-15

    This paper sets up a chaos criterion theorem for a kind of cubic polynomial discrete map. Using this theorem, Zhou-Song's chaos criterion theorem on quadratic polynomial discrete maps and a generalized synchronization (GS) theorem, an eight-dimensional chaotic GS system is constructed. Numerical simulations have been carried out to verify the effectiveness of the theoretical results. The chaotic GS system is used to design a chaos-based pseudorandom number generator (CPRNG). The FIPS 140-2 test suite/Generalized FIPS 140-2 test suite is used to test the randomness of two sets of 1000 key streams, each consisting of 20 000 bits generated by the CPRNG. The results show that 99.9%/98.5% of the key streams pass the FIPS 140-2/Generalized FIPS 140-2 tests. Numerical simulations show that different keystreams have on average 50.001% of the same codes. The key space of the CPRNG is larger than 2^1345. As an application of the CPRNG, this study gives an image encryption example. Experimental results show that the linear coefficients between the plaintext and the ciphertext and the decrypted ciphertexts via the 100 key streams with perturbed keys are less than 0.00428. The result suggests that the texts decrypted via keystreams generated with perturbed keys of the CPRNG are almost completely independent of the original image text, and brute-force attacks are needed to break the cryptographic system.

  1. A cubic map chaos criterion theorem with applications in generalized synchronization based pseudorandom number generator and image encryption

    International Nuclear Information System (INIS)

    This paper sets up a chaos criterion theorem for a kind of cubic polynomial discrete map. Using this theorem, Zhou-Song's chaos criterion theorem on quadratic polynomial discrete maps and a generalized synchronization (GS) theorem, an eight-dimensional chaotic GS system is constructed. Numerical simulations have been carried out to verify the effectiveness of the theoretical results. The chaotic GS system is used to design a chaos-based pseudorandom number generator (CPRNG). The FIPS 140-2 test suite/Generalized FIPS 140-2 test suite is used to test the randomness of two sets of 1000 key streams, each consisting of 20 000 bits generated by the CPRNG. The results show that 99.9%/98.5% of the key streams pass the FIPS 140-2/Generalized FIPS 140-2 tests. Numerical simulations show that different keystreams have on average 50.001% of the same codes. The key space of the CPRNG is larger than 2^1345. As an application of the CPRNG, this study gives an image encryption example. Experimental results show that the linear coefficients between the plaintext and the ciphertext and the decrypted ciphertexts via the 100 key streams with perturbed keys are less than 0.00428. The result suggests that the texts decrypted via keystreams generated with perturbed keys of the CPRNG are almost completely independent of the original image text, and brute-force attacks are needed to break the cryptographic system.
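
    As a toy analogue of the testing workflow (not the paper's GS-based generator), the sketch below derives a bitstream from a single chaotic map and applies the FIPS 140-2 monobit check, whose acceptance bounds for a 20,000-bit block are 9,725 to 10,275 ones; a simple stand-in source like this will usually, but not always, pass.

```python
import numpy as np

def logistic_bits(n_bits, x0=0.37, r=3.99, burn_in=1000):
    """Generate a bitstream from a chaotic orbit (stand-in for the paper's CPRNG)."""
    x = x0
    for _ in range(burn_in):
        x = r * x * (1.0 - x)
    bits = np.empty(n_bits, dtype=np.uint8)
    for i in range(n_bits):
        x = r * x * (1.0 - x)
        bits[i] = 1 if x > 0.5 else 0     # threshold the state to one bit
    return bits

def fips_monobit(bits):
    """FIPS 140-2 monobit test: the number of ones in 20,000 bits must lie in (9725, 10275)."""
    assert bits.size == 20000
    ones = int(bits.sum())
    return 9725 < ones < 10275, ones

ok, ones = fips_monobit(logistic_bits(20000))
print(ok, ones)
```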

  2. Digital Image Mosaic Technology Based on Improved Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Li Yan

    2014-03-01

    Full Text Available Image mosaic technology is an important technology in the field of image processing. Based on the general adaptability and clustering behaviour of the genetic algorithm, we improve it and apply it to the mosaic algorithm in image processing. In this paper, we test the validity and reliability of the designed algorithm in the image mosaic process. Based on image illumination mosaics and painting-texture mosaics, we achieve a certain artistic effect. The convergence results of the general algorithm show substantial oscillation, with the oscillation amplitude reaching its maximum value. The calculation results of the genetic algorithm still show a certain degree of oscillation, with the amplitude reaching a maximum value, and its convergence results are slightly better than those of the general algorithm. The improved genetic algorithm results show no oscillation, the stability is very good, and the results have better convergence.

  3. A Fast SIFT Image Mosaic Algorithm Based on Wavelet Transformation

    Directory of Open Access Journals (Sweden)

    ZUO Yi

    2014-05-01

    Full Text Available Recently, the SIFT feature matching algorithm has become a focus of image mosaicking research. In the traditional SIFT algorithm, the mosaic procedure is computationally intensive and time-consuming. To solve this problem, an improved SIFT algorithm is presented in this paper. The proposed algorithm combines the wavelet transform with SIFT to simplify the scale-invariant feature extraction process and speed up the image mosaic. Mosaicking tests on two images were carried out with the classical SIFT algorithm and with the algorithm proposed in this paper. With the control parameter set to 0.5, the classical SIFT algorithm took 1.192 891 seconds to extract the feature points, while the improved algorithm took 0.856 712 seconds. The comparative simulation tests demonstrate the validity of the proposed algorithm and show that its speed and accuracy are improved while the quality of the image mosaic is maintained.

  4. Enhanced Throughput AES Encryption

    Directory of Open Access Journals (Sweden)

    Kunal Lala

    2012-09-01

    Full Text Available This paper presents our experience in implementing the Advanced Encryption Standard (AES) algorithm. We have used a 128 bit block size and a 128 bit cipher key for the implementation. The AES, also known as the Rijndael algorithm, is used to ensure the security of transmission channels. The Xilinx design tool 13.3 and the Xilinx project navigator design tool are used for synthesis and simulation. Very high speed integrated circuit hardware description language (VHDL) is used for coding. The fully pipelined design was implemented on the Virtex 6 FPGA family and a throughput of 49.3 Gbit/s was achieved at an operational frequency of 384.793 MHz.

  5. Polarization image fusion algorithm based on improved PCNN

    Science.gov (United States)

    Zhang, Siyuan; Yuan, Yan; Su, Lijuan; Hu, Liang; Liu, Hui

    2013-12-01

    The polarization detection technique provides polarization information of objects that conventional detection techniques are unable to obtain. In order to fully utilize the obtained polarization information, various polarization image fusion algorithms have been developed. In this research, we propose a polarization image fusion algorithm based on an improved pulse coupled neural network (PCNN). The improved PCNN algorithm uses polarization parameter images to generate the fused polarization image with object details for polarization information analysis and uses the matching degree M as the fusion rule. The improved PCNN fused image is compared with fused images based on the Laplacian pyramid (LP) algorithm, the wavelet algorithm and the standard PCNN algorithm. Several performance indicators are introduced to evaluate the fused images. The comparison shows that the presented algorithm yields an image of much higher quality and preserves more detail information of the objects.

  6. Evaluation of various deformable image registration algorithms for thoracic images

    International Nuclear Information System (INIS)

    We evaluated the accuracy of one commercially available and three publicly available deformable image registration (DIR) algorithms for thoracic four-dimensional (4D) computed tomography (CT) images. Five patients with esophagus cancer were studied. The datasets of the five patients were provided by DIR-lab (dir-lab.com) and consisted of thoracic 4D CT images and a coordinate list of anatomical landmarks that had been identified manually. Expert landmark correspondence was used for evaluating DIR spatial accuracy. First, the manually measured displacement vector field (mDVF) was obtained from the coordinate list of anatomical landmarks. Then the automatically calculated displacement vector field (aDVF) was computed using the following four DIR algorithms: B-spline implemented in Velocity AI (Velocity Medical, Atlanta, GA, USA), free-form deformation (FFD), Horn-Schunck optical flow (OF) and Demons in DIRART of the MATLAB software. Registration error is defined as the difference between the mDVF and the aDVF. The mean 3D registration errors were 2.7 ± 0.8 mm for B-spline, 3.6 ± 1.0 mm for FFD, 2.4 ± 0.9 mm for OF and 2.4 ± 1.2 mm for Demons. The results showed that reasonable accuracy was achieved by B-spline, OF and Demons, and that these algorithms have the potential to be used for 4D dose calculation, automatic image segmentation and 4D CT ventilation imaging in patients with thoracic cancer. However, for all algorithms, the accuracy might be improved by using an optimized parameter setting. Furthermore, for B-spline in Velocity AI, the 3D registration error was small for displacements of less than approximately 10 mm, indicating that this software may be useful in this range of displacements
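
    The evaluation itself reduces to a small computation: for each landmark, the registration error is the Euclidean norm of the difference between the manual and automatic displacement vectors. A sketch, assuming both DVFs are given as N x 3 arrays in millimetres:

```python
import numpy as np

def registration_error(mDVF, aDVF, spacing=(1.0, 1.0, 1.0)):
    """Mean and SD of the 3D registration error between manual and automatic DVFs.

    mDVF, aDVF: (N, 3) landmark displacement vectors; spacing converts voxel units
    to mm if needed (leave as ones when the vectors are already in mm)."""
    diff = (np.asarray(mDVF) - np.asarray(aDVF)) * np.asarray(spacing)
    err = np.linalg.norm(diff, axis=1)
    return err.mean(), err.std()

# Hypothetical example with 300 landmarks:
rng = np.random.default_rng(1)
mDVF = rng.normal(0, 5, (300, 3))
aDVF = mDVF + rng.normal(0, 1.5, (300, 3))
print("mean ± SD error (mm): %.1f ± %.1f" % registration_error(mDVF, aDVF))
```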

  7. MR Brain Image Segmentation using Bacteria Foraging Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    E. Ben George

    2012-10-01

    Full Text Available The most important task in digital image processing is image segmentation. This paper puts forward a unique image segmentation algorithm that makes use of a Markov Random Field (MRF) hybridized with the biologically inspired Bacteria Foraging Optimization Algorithm (BFOA) for brain Magnetic Resonance images. The proposed algorithm works on the image pixel data and a region/neighborhood map to form a context in which they can merge. The MR brain image is segmented using MRF-BFOA and the results are compared to a traditional metaheuristic segmentation method, the Genetic Algorithm. All the experimental results show that MRF-BFOA has better performance than the standard MRF-GA

  8. Performance analysis of Non Linear Filtering Algorithms for underwater images

    OpenAIRE

    Suresh Kumar Thakur; Subashini, Dr. P.; Mr. M. Muthu Kumar; Padmavathi, Dr. G.

    2009-01-01

    Image filtering algorithms are applied on images to remove the different types of noise that are either present in the image during capturing or injected in to the image during transmission. Underwater images when captured usually have Gaussian noise, speckle noise and salt and pepper noise. In this work, five different image filtering algorithms are compared for the three different noise types. The performances of the filters are compared using the Peak Signal to Noise Rati...

  9. Image Fusion Algorithms for Medical Images-A Comparison

    Directory of Open Access Journals (Sweden)

    M.D. Nandeesh

    2015-07-01

    Full Text Available This paper presents a comparative study of medical image fusion algorithms along with a performance analysis. Magnetic Resonance Imaging (MRI) and Computed Tomography (CT) images are fused to form a composite image so as to improve the complementary and redundant information for diagnostic purposes. For this, Discrete Wavelet Transform (DWT), Stationary Wavelet Transform (SWT), Principal Component Analysis (PCA) and curvelet transform techniques are employed and their experimental results are evaluated and compared. Comparison of fusion performance is based on root mean square error (RMSE), peak signal to noise ratio (PSNR), Mutual Information (MI) and Entropy (H). The comparison results demonstrate that better fusion performance is achieved by using the curvelet transform.

  10. Algorithms evaluation for fundus images enhancement

    International Nuclear Information System (INIS)

    Color images of the retina inherently involve noise and illumination artifacts. In order to improve the diagnostic quality of the images, it is desirable to homogenize the non-uniform illumination and increase contrast while preserving color characteristics. The visual result of different pre-processing techniques can be very dissimilar, and it is necessary to make an objective assessment of the techniques in order to select the most suitable one. In this article the performance of eight algorithms for correcting non-uniform illumination, modifying contrast and preserving color was evaluated. In order to choose the most suitable one, a general score was proposed. The results made a good impression on the experts, although some differences suggest that the image with the best statistical quality is not necessarily the one of best diagnostic quality to the trained doctor's eye. This means that the best pre-processing algorithm for an automatic classification may be different from the most suitable one for visual diagnosis. However, both should result in the same final diagnosis.

  11. Improved bat algorithm applied to multilevel image thresholding.

    Science.gov (United States)

    Alihodzic, Adis; Tuba, Milan

    2014-01-01

    Multilevel image thresholding is a very important image processing technique that is used as a basis for image segmentation and further higher-level processing. However, the required computational time for exhaustive search grows exponentially with the number of desired thresholds. Swarm intelligence metaheuristics are well known as successful and efficient optimization methods for intractable problems. In this paper, we adjusted one of the latest swarm intelligence algorithms, the bat algorithm, for the multilevel image thresholding problem. The results of testing on standard benchmark images show that the bat algorithm is comparable with other state-of-the-art algorithms. We then improved the standard bat algorithm with modifications that add some elements from differential evolution and from the artificial bee colony algorithm. Our proposed improved bat algorithm proved to be better than five other state-of-the-art algorithms, improving the quality of results in all cases and significantly improving convergence speed. PMID:25165733

  12. Fast and efficient fractal image compression algorithm

    Science.gov (United States)

    Wang, Yigang; Peng, Qunsheng; Jin, Yiwen

    1998-09-01

    Since the encoding time of the fractal encoding method is high and of practical importance, how to speed up the encoding is a key issue. In this paper, we propose a fast and efficient fractal coding algorithm based on a merged quadtree partitioning scheme. In this algorithm, the searching procedure at the finer level fully reuses the computing results of the coarser one. For the searches at finer levels, most of the computation has already been completed by the searches at the coarser level, so the total search time is reduced greatly. Besides, we use a simple merged quadtree to reduce the number of transformations needed to encode the whole image, so at the same PSNR we also gain an increase in the compression ratio.

  13. A sparse reconstruction algorithm for ultrasonic images in nondestructive testing.

    Science.gov (United States)

    Guarneri, Giovanni Alfredo; Pipa, Daniel Rodrigues; Neves Junior, Flávio; de Arruda, Lúcia Valéria Ramos; Zibetti, Marcelo Victor Wüst

    2015-01-01

    Ultrasound imaging systems (UIS) are essential tools in nondestructive testing (NDT). In general, the quality of images depends on two factors: system hardware features and image reconstruction algorithms. This paper presents a new image reconstruction algorithm for ultrasonic NDT. The algorithm reconstructs images from A-scan signals acquired by an ultrasonic imaging system with a monostatic transducer in pulse-echo configuration. It is based on regularized least squares using an l1 regularization norm. The method is tested on the reconstruction of an image of a point-like reflector, using both simulated and real data. The resolution of the reconstructed image is compared with four traditional ultrasonic imaging reconstruction algorithms: B-scan, SAFT, ω-k SAFT and regularized least squares (RLS). The method demonstrates significant resolution improvement when compared with B-scan - about 91% using real data. The proposed scheme also outperforms traditional algorithms in terms of signal-to-noise ratio (SNR). PMID:25905700
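
    A standard solver for the underlying l1-regularized least-squares problem, min 0.5*||Ax - y||^2 + lambda*||x||_1, is the iterative shrinkage-thresholding algorithm (ISTA); whether the authors use ISTA or another solver is not stated, so the sketch below is only a generic reference implementation on toy data.

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, lam, iters=500):
    """ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the data-fit gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - y)
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy usage: recover a sparse reflectivity vector from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.normal(size=(80, 200))
x_true = np.zeros(200)
x_true[rng.choice(200, 5, replace=False)] = rng.normal(0, 5, 5)
y = A @ x_true + 0.01 * rng.normal(size=80)
x_hat = ista(A, y, lam=0.1)
print(np.count_nonzero(np.abs(x_hat) > 1e-3))   # roughly the number of true reflectors
```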

  14. Performance Comparison Of Evolutionary Algorithms For Image Clustering

    Science.gov (United States)

    Civicioglu, P.; Atasever, U. H.; Ozkan, C.; Besdok, E.; Karkinli, A. E.; Kesikoglu, A.

    2014-09-01

    Evolutionary computation tools are able to process real-valued numerical sets in order to extract suboptimal solutions of a designed problem. Data clustering algorithms have been used intensively for image segmentation in remote sensing applications. Despite the wide usage of evolutionary algorithms for data clustering, their clustering performance has scarcely been studied using clustering validation indexes. In this paper, recently proposed evolutionary algorithms (i.e., the Artificial Bee Colony Algorithm (ABC), Gravitational Search Algorithm (GSA), Cuckoo Search Algorithm (CS), Adaptive Differential Evolution Algorithm (JADE), Differential Search Algorithm (DSA) and Backtracking Search Optimization Algorithm (BSA)) and some classical image clustering techniques (i.e., k-means, fcm, som networks) have been used to cluster images, and their performances have been compared using four clustering validation indexes. Experimental test results showed that evolutionary algorithms give more reliable cluster centers than classical clustering techniques, but their convergence time is quite long.

  15. An Open Question on the Uniqueness of (Encrypted) Arithmetic

    OpenAIRE

    Breuer, Peter T.; Bowen, Jonathan P.

    2013-01-01

    We ask whether two or more images of arithmetic may inhabit the same space via different encodings. The answers have significance for a class of processor design that does all its computation in an encrypted form, without ever performing any decryption or encryption itself. Against the possibility of algebraic attacks against the arithmetic in a `crypto-processor' (KPU) we propose a defence called `ABC encryption' and show how this kind of encryption makes it impossible for ...

  16. New 2D CA based Image Encryption Scheme and a novel Non-Parametric Test for Pixel Randomness

    OpenAIRE

    J, BalaSuyambu; R, Radha; R, Rama

    2015-01-01

    In this paper we have proposed a new test for pixel randomness using non-parametric method in statistics. In order to validate this new non-parametric test we have designed an encryption scheme based on 2D cellular automata. The strength of the designed encryption scheme is first assessed by standard methods for security analysis and the pixel randomness is then determined by the newly proposed non-parametric method.

  17. Edge Detection of Medical Images Using Morphological Algorithms

    Directory of Open Access Journals (Sweden)

    Anurag Sharma

    2012-08-01

    Full Text Available Medical image edge detection is an important task for object recognition of human organs and an important pre-processing step in medical image segmentation and 3D reconstruction. Conventionally, edges are detected with early algorithms such as gradient-based and template-based algorithms, but these are not well suited to edge detection in noisy medical images. In this paper, basic mathematical morphological theory and operations are introduced first, and then a novel mathematical morphological edge detection algorithm is proposed to detect the edges of lung CT images corrupted with salt-and-pepper noise. The experimental results show that the proposed algorithm is more efficient for medical image denoising and edge detection than the commonly used template-based edge detection algorithms and general morphological edge detection algorithms.
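
    A minimal version of the morphological approach to edges in noisy images is a median prefilter against salt-and-pepper noise followed by the morphological gradient (dilation minus erosion); the structuring-element sizes below are assumptions and this is not the paper's exact operator.

```python
import numpy as np
from scipy import ndimage

def morphological_edges(img, noise_filter_size=3, se_size=3):
    """Median-filter salt-and-pepper noise, then take the morphological gradient."""
    denoised = ndimage.median_filter(img, size=noise_filter_size)
    dilated = ndimage.grey_dilation(denoised, size=(se_size, se_size))
    eroded = ndimage.grey_erosion(denoised, size=(se_size, se_size))
    return dilated - eroded          # high values along object boundaries

# Usage on a synthetic "CT slice": a bright disc corrupted with salt-and-pepper noise.
y, x = np.mgrid[0:128, 0:128]
img = ((x - 64) ** 2 + (y - 64) ** 2 < 40 ** 2).astype(float) * 200
noise = np.random.default_rng(0).random(img.shape)
img[noise < 0.02] = 0
img[noise > 0.98] = 255
edges = morphological_edges(img)
```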

  18. An Algorithm for Transforming Color Images into Tactile Graphics

    CERN Document Server

    Rataj, A

    2004-01-01

    This paper presents an algorithm that transforms color visual images, like photographs or paintings, into tactile graphics. In the algorithm, the edges of objects are detected and colors of the objects are estimated. Then, the edges and the colors are encoded into lines and textures in the output tactile image. Design of the method is substantiated by various qualities of haptic recognizing of images. Also, means of presentation of the tactile images in printouts are discussed. Example translated images are shown.

  19. Multiply-agile encryption in high speed communication networks

    Energy Technology Data Exchange (ETDEWEB)

    Pierson, L.G. [Sandia National Labs., Albuquerque, NM (United States); Witzke, E.L. [RE/SPEC Inc., Albuquerque, NM (United States)

    1997-05-01

    Different applications have different security requirements for data privacy, data integrity, and authentication. Encryption is one technique that addresses these requirements. Encryption hardware, designed for use in high-speed communications networks, can satisfy a wide variety of security requirements if that hardware is key-agile, robustness-agile and algorithm-agile. Hence, multiply-agile encryption provides enhanced solutions to the secrecy, interoperability and quality of service issues in high-speed networks. This paper defines these three types of agile encryption. Next, implementation issues are discussed. While single-algorithm, key-agile encryptors exist, robustness-agile and algorithm-agile encryptors are still research topics.

  20. AN INTENSITY-BASED MEDICAL IMAGE REGISTRATION USING GENETIC ALGORITHM

    Directory of Open Access Journals (Sweden)

    Shanmugapriya.S

    2014-10-01

    Full Text Available Medical imaging plays a vital role in creating images of the human body for clinical purposes. Biomedical imaging has taken a leap by entering the field of image registration. Image registration integrates the large amount of medical information embedded in images taken at different time intervals and at different orientations. In this paper, an intensity-based real-coded genetic algorithm is used for registering two MRI images. To demonstrate the efficiency of the developed algorithm, the alignment of the image is altered and the algorithm is tested for performance. The work also compares two similarity metrics and, based on the outcome, identifies the metric best suited to the genetic algorithm.

  1. Research on Target Type Recognition Algorithm of Aerial Infrared Image

    OpenAIRE

    Bin Liu; Yangyu Fan; Jian Guo

    2013-01-01

    In order to improve the target type recognition rate for aerial infrared images under the new requirements of omnidirectional crossing and multiple target types, a recognition algorithm with four steps is researched in this paper. Firstly, the maximum between-cluster variance (Otsu) algorithm is applied to segment the target from the infrared image. Secondly, a new edge detection algorithm is proposed to get the target edge in the segmented image. Thirdly, the edge points are fitted to be a polygo...

  2. Edge Detection of Medical Images Using Morphological Algorithms

    OpenAIRE

    Anurag Sharma; Pankaj Sharma3; Rashmi,; Hardeep Kumar

    2012-01-01

    Medical images edge detection is an important work for object recognition of the human organs and it is an important pre-processing step in medical image segmentation and 3D reconstruction. Conventionally, edge is detected according to some early brought forward algorithms such as gradient-based algorithm and template-based algorithm, but they are not so good for noise medical image edge detection. In this paper, basic mathematical morphological theory and operations are introduced at first, ...

  3. Convergence of iterative image reconstruction algorithms for Digital Breast Tomosynthesis

    DEFF Research Database (Denmark)

    Sidky, Emil; JØrgensen, Jakob Heide

    2012-01-01

    Most iterative image reconstruction algorithms are based on some form of optimization, such as minimization of a data-fidelity term plus an image-regularizing penalty term. While achieving the solution of these optimization problems may not directly be clinically relevant, accurate optimization solutions can aid in iterative image reconstruction algorithm design. This issue is particularly acute for iterative image reconstruction in Digital Breast Tomosynthesis (DBT), where the corresponding data model is particularly poorly conditioned. The impact of this poor conditioning is that iterative algorithms applied to this system can be slow to converge. Recent developments in first-order algorithms are now beginning to allow for accurate solutions to optimization problems of interest to tomographic imaging in general. In particular, we investigate an algorithm developed by Chambolle and Pock (2011, J. Math. Imaging Vis., vol. 40, pp. 120-145) and apply it to iterative image reconstruction in DBT.
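
    For reference, the generic Chambolle-Pock primal-dual iteration for min_x G(x) + F(Kx), which is what gets specialised to the DBT data-fidelity-plus-penalty objective (the specific choices of F, G and K for DBT are not reproduced here):

```latex
\begin{aligned}
y^{n+1} &= \operatorname{prox}_{\sigma F^{*}}\!\left(y^{n} + \sigma K \bar{x}^{n}\right),\\
x^{n+1} &= \operatorname{prox}_{\tau G}\!\left(x^{n} - \tau K^{T} y^{n+1}\right),\\
\bar{x}^{n+1} &= x^{n+1} + \theta\left(x^{n+1} - x^{n}\right),
\qquad \theta = 1,\quad \sigma\tau\,\lVert K\rVert^{2} < 1 .
\end{aligned}
```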

  4. Automatic image enhancement by artificial bee colony algorithm

    Science.gov (United States)

    Yimit, Adiljan; Hagihara, Yoshihiro; Miyoshi, Tasuku; Hagihara, Yukari

    2013-03-01

    With regard to improving image quality, image enhancement is an important process that assists humans with better perception. This paper presents an automatic image enhancement method based on the Artificial Bee Colony (ABC) algorithm. In this method, the ABC algorithm is applied to find the optimum parameters of a transformation function, which performs the enhancement by utilizing the local and global information of the image. In order to solve the optimization problem with the ABC algorithm, an objective criterion in terms of entropy and edge information is introduced to measure image quality, which makes the enhancement an automatic process. Several images are used in experiments to compare the method with other enhancement approaches based on genetic algorithms and particle swarm optimization.

  5. Multi-agent Remote Sensing Image Segmentation Algorithm

    Directory of Open Access Journals (Sweden)

    Jing Chen

    2014-05-01

    Full Text Available The fractal network evolution algorithm (FNEA) for high spatial resolution remote sensing images (HSRI) relies on a global control strategy that traverses all objects in each cycle; this strategy is limited, makes poor use of the spatial continuity of homogeneous areas, and can lead to problems such as poor image segmentation. This paper therefore puts forward a remote sensing image segmentation algorithm based on multiple agents. In its merging criterion, the algorithm combines the spectral and shape information of the image, and the region merging process is controlled by multiple agents in parallel; this global merging control strategy ensures that the algorithm has the advantages of parallel computing while fully considering regional homogeneity and continuity. Finally, a simulation experiment was performed against the FNEA algorithm; the experimental results show that the proposed algorithm is better than the FNEA algorithm in overall segmentation effect and has good stability

  6. Analysis of image thresholding segmentation algorithms based on swarm intelligence

    Science.gov (United States)

    Zhang, Yi; Lu, Kai; Gao, Yinghui; Yang, Bo

    2013-03-01

    Swarm intelligence-based image thresholding segmentation algorithms are playing an important role in the research field of image segmentation. In this paper, we briefly introduce the theories of four existing image segmentation algorithms based on swarm intelligence including fish swarm algorithm, artificial bee colony, bacteria foraging algorithm and particle swarm optimization. Then some image benchmarks are tested in order to show the differences of the segmentation accuracy, time consumption, convergence and robustness for Salt & Pepper noise and Gaussian noise of these four algorithms. Through these comparisons, this paper gives qualitative analyses for the performance variance of the four algorithms. The conclusions in this paper would give a significant guide for the actual image segmentation.

  7. Bluetooth Based Chaos Synchronization Using Particle Swarm Optimization and Its Applications to Image Encryption

    OpenAIRE

    Tzu-Hsiang Hung; Her-Terng Yau; Chia-Chun Hsieh

    2012-01-01

    This study used the complex dynamic characteristics of chaotic systems and Bluetooth to explore the topic of wireless chaotic communication secrecy and develop a communication security system. The PID controller for chaos synchronization control was applied, and the optimum parameters of this PID controller were obtained using a Particle Swarm Optimization (PSO) algorithm. Bluetooth was used to realize wireless transmissions, and a chaotic wireless communication security system was developed ...

  8. Low complexity image recognition algorithm for handheld applications

    OpenAIRE

    Ayyalasomayajula, Pradyumna; Grassi Pauletti, Sara; Farine, Pierre-André

    2011-01-01

    We propose a low complexity image recognition algorithm based on Content Based Image Retrieval (CBIR) suitable for handheld applications. The target application is an Alternative and Augmentative Communication (AAC) device used in speech rehabilitation and education. The device recognizes images (pictograms and pictures) and plays a sound message associated with the recognized image. Experimental validation of the proposed algorithm using MATLAB and its DSP implementation is presented.

  9. Iris Recognition Using Image Moments and k-Means Algorithm

    OpenAIRE

    Yaser Daanial Khan; Sher Afzal Khan; Farooq Ahmad; Saeed Islam

    2014-01-01

    This paper presents a biometric technique for identification of a person using the iris image. The iris is first segmented from the acquired image of an eye using an edge detection algorithm. The disk shaped area of the iris is transformed into a rectangular form. Described moments are extracted from the grayscale image which yields a feature vector containing scale, rotation, and translation invariant moments. Images are clustered using the k-means algorithm and centroids for each cluster ar...

  10. Performance analysis of Non Linear Filtering Algorithms for underwater images

    CERN Document Server

    Padmavathi, Dr G; Kumar, Mr M Muthu; Thakur, Suresh Kumar

    2009-01-01

    Image filtering algorithms are applied to images to remove the different types of noise that are either present in the image during capture or injected into the image during transmission. Underwater images, when captured, usually have Gaussian noise, speckle noise and salt and pepper noise. In this work, five different image filtering algorithms are compared for the three different noise types. The performances of the filters are compared using the Peak Signal to Noise Ratio (PSNR) and Mean Square Error (MSE). The modified spatial median filter gives desirable results in terms of the above two parameters for the three different noise types. Forty underwater images are taken for the study.
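
    The two evaluation metrics are straightforward to compute for 8-bit images; a short sketch:

```python
import numpy as np

def mse(reference, filtered):
    ref = reference.astype(np.float64)
    out = filtered.astype(np.float64)
    return np.mean((ref - out) ** 2)

def psnr(reference, filtered, max_value=255.0):
    m = mse(reference, filtered)
    return float("inf") if m == 0 else 10.0 * np.log10(max_value ** 2 / m)

# Usage: compare a filtered (here, artificially noisy) image against the clean reference.
rng = np.random.default_rng(0)
clean = rng.integers(0, 256, (256, 256)).astype(np.uint8)
noisy = np.clip(clean + rng.normal(0, 10, clean.shape), 0, 255).astype(np.uint8)
print("MSE: %.2f  PSNR: %.2f dB" % (mse(clean, noisy), psnr(clean, noisy)))
```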

  11. Non-parametric Diffeomorphic Image Registration with the Demons Algorithm

    OpenAIRE

    Vercauteren, Tom; Pennec, Xavier; Perchant, Aymeric; Ayache, Nicholas

    2007-01-01

    We propose a non-parametric diffeomorphic image registration algorithm based on Thirion's demons algorithm. The demons algorithm can be seen as an optimization procedure on the entire space of displacement fields. The main idea of our algorithm is to adapt this procedure to a space of diffeomorphic transformations. In contrast to many diffeomorphic registration algorithms, our solution is computationally efficient since in practice it only replaces an addition of free form deformations by a f...

  12. Fast image mosaic algorithm based on the improved Harris-SIFT algorithm

    Science.gov (United States)

    Jiang, Zetao; Liu, Min

    2015-08-01

    This paper proposes a fast image mosaic algorithm based on an improved Harris-SIFT algorithm, to address problems such as high memory consumption, a large number of redundant feature points and slow operation speed that result from using the SIFT algorithm in the image matching stage of the image mosaic process. In the matching stage of the algorithm, corner points are first extracted using the multi-scale Harris detector, the feature descriptor is constructed as an 88-dimensional vector based on the SIFT feature, coarse matching is carried out by the nearest neighbor matching method, and then the precise matching point pairs and the image transformation matrix are obtained by the RANSAC method. A seamless mosaic is achieved by using weighted-average image fusion. The experimental results show that, compared with the traditional algorithm, this algorithm not only achieves a precise seamless mosaic but also improves operational efficiency.
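
    A hedged OpenCV sketch of the pipeline the abstract outlines is given below (corner detection, SIFT description, nearest-neighbour matching with a ratio test, RANSAC homography); it uses OpenCV's stock Harris response and 128-dimensional SIFT descriptors rather than the paper's multi-scale Harris and 88-dimensional descriptor, and omits the weighted-average blending.

```python
import cv2
import numpy as np

def mosaic(img1, img2):
    g1, g2 = (cv2.cvtColor(i, cv2.COLOR_BGR2GRAY) for i in (img1, img2))

    # Harris-style corners via goodFeaturesToTrack with the Harris response enabled.
    def harris_keypoints(gray):
        pts = cv2.goodFeaturesToTrack(gray, maxCorners=1000, qualityLevel=0.01,
                                      minDistance=5, useHarrisDetector=True, k=0.04)
        return [cv2.KeyPoint(float(x), float(y), 7.0) for [[x, y]] in pts]

    sift = cv2.SIFT_create()
    kp1, des1 = sift.compute(g1, harris_keypoints(g1))
    kp2, des2 = sift.compute(g2, harris_keypoints(g2))

    # Coarse matching by nearest neighbour with Lowe's ratio test.
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]

    # Precise matching and transformation matrix via RANSAC.
    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # Warp img1 into img2's frame and paste img2 on top (no seam blending here).
    h, w = img2.shape[:2]
    out = cv2.warpPerspective(img1, H, (w * 2, h))
    out[0:h, 0:w] = img2
    return out
```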

  13. Segmentation of Medical Image using Clustering and Watershed Algorithms

    Directory of Open Access Journals (Sweden)

    M. C.J. Christ

    2011-01-01

    Full Text Available Problem statement: Segmentation plays an important role in medical imaging. Segmentation of an image is the division or separation of the image into dissimilar regions of similar attribute. In this study we proposed a methodology that integrates a clustering algorithm and the marker-controlled watershed segmentation algorithm for medical image segmentation. The use of the conventional watershed algorithm for medical image analysis is pervasive because of its advantages, such as always being able to construct an entire division of the image. On the other hand, its disadvantages include over-segmentation and sensitivity to false edges. Approach: In this study we proposed a methodology that integrates K-Means clustering with the marker-controlled watershed segmentation algorithm and, separately, Fuzzy C-Means clustering with the marker-controlled watershed segmentation algorithm for medical image segmentation. The clustering algorithms are unsupervised learning algorithms, while the marker-controlled watershed segmentation algorithm makes use of automated thresholding on the gradient magnitude map and post-segmentation merging of the initial partitions to reduce the number of false edges and the over-segmentation. Results: In this study, we compared K-Means clustering with the marker-controlled watershed algorithm against Fuzzy C-Means clustering with the marker-controlled watershed algorithm. We also showed that our proposed method produced segmentation maps with fewer partitions than the segmentation maps produced by the conventional watershed algorithm. Conclusion: The integration of K-Means clustering with the marker-controlled watershed algorithm gave better segmentation than the integration of Fuzzy C-Means clustering with the marker-controlled watershed algorithm. By reducing the amount of over-segmentation, we obtained a segmentation map that is more representative of the several anatomies in the medical images.

  14. Three-dimensional imaging reconstruction algorithm of gated-viewing laser imaging with compressive sensing.

    Science.gov (United States)

    Li, Li; Xiao, Wei; Jian, Weijian

    2014-11-20

    Three-dimensional (3D) laser imaging combining compressive sensing (CS) has an advantage in lower power consumption and less imaging sensors; however, it brings enormous stress to subsequent calculation devices. In this paper we proposed a fast 3D imaging reconstruction algorithm to deal with time-slice images sampled by single-pixel detectors. The algorithm implements 3D imaging reconstruction before CS recovery, thus it saves plenty of runtime of CS recovery. Several experiments are conducted to verify the performance of the algorithm. Simulation results demonstrated that the proposed algorithm has better performance in terms of efficiency compared to an existing algorithm. PMID:25607878

  15. Image encryption based on nonseparable fractional Fourier transform and chaotic map

    Science.gov (United States)

    Ran, Qiwen; Yuan, Lin; Zhao, Tieyu

    2015-08-01

    In this paper an image cryptosystem is constructed by using double random phase masks and a chaotic map together with a novel transform which is similar to fractional Fourier transform and gyrator transform to some extent. The new transform is not periodic with respect to the transform order and cannot be expressed as a tensor product of two one-dimensional transforms neither in the space domain nor in the Wigner space-frequency domain. In the cryptosystem, the parameters of Arnold map, transform orders of the proposed transform and phase information serve as the main keys. The numerical simulations have demonstrated the validity and high security level of the image cryptosystem based on the proposed transform.
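
    The Arnold-map scrambling component of such cryptosystems is easy to make concrete; the generalized cat map below (with illustrative parameters a and b acting as part of the key) permutes the pixel positions of a square image and is exactly invertible. The nonseparable transform and the phase masks of the proposed cryptosystem are not reproduced here.

```python
import numpy as np

def arnold_scramble(img, iterations=1, a=1, b=1):
    """Generalized Arnold cat map on an N x N image: (x, y) -> (x + a*y, b*x + (a*b + 1)*y) mod N."""
    n = img.shape[0]
    assert img.shape[0] == img.shape[1], "the Arnold map needs a square image"
    out = img.copy()
    for _ in range(iterations):
        nxt = np.empty_like(out)
        for y in range(n):
            for x in range(n):
                nxt[(b * x + (a * b + 1) * y) % n, (x + a * y) % n] = out[y, x]
        out = nxt
    return out

def arnold_unscramble(img, iterations=1, a=1, b=1):
    """Invert by reading each scrambled pixel back from its mapped position."""
    n = img.shape[0]
    out = img.copy()
    for _ in range(iterations):
        nxt = np.empty_like(out)
        for y in range(n):
            for x in range(n):
                nxt[y, x] = out[(b * x + (a * b + 1) * y) % n, (x + a * y) % n]
        out = nxt
    return out

img = np.arange(64 * 64, dtype=np.uint16).reshape(64, 64)
assert np.array_equal(arnold_unscramble(arnold_scramble(img, 5), 5), img)
```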

  16. An Encryption Scheme with DNA Technology and JPEG Zigzag Coding for Secure Transmission of Images

    OpenAIRE

    Jacob, Grasha; Murugan, A.

    2013-01-01

    The Internet is a ubiquitous and affordable communications network suited for e-commerce and medical image communications. Security has become a major issue as data communication channels can be intruded by intruders during transmission. Though, different methods have been proposed and used to protect the transmission of data from illegal and unauthorized access, code breakers have come up with various methods to crack them. DNA based Cryptography brings forward a new hope f...
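
    The JPEG zigzag component named in the title is simple to pin down: it reorders an 8x8 coefficient block into a 1-D sequence by walking the anti-diagonals; how that ordering is combined with the DNA encoding is not shown here.

```python
import numpy as np

def zigzag_indices(n=8):
    """Return the (row, col) visiting order of the JPEG zigzag scan for an n x n block."""
    return sorted(((r, c) for r in range(n) for c in range(n)),
                  key=lambda rc: (rc[0] + rc[1],
                                  rc[1] if (rc[0] + rc[1]) % 2 == 0 else rc[0]))

def zigzag(block):
    return np.array([block[r, c] for r, c in zigzag_indices(block.shape[0])])

def inverse_zigzag(seq, n=8):
    block = np.empty((n, n), dtype=np.asarray(seq).dtype)
    for value, (r, c) in zip(seq, zigzag_indices(n)):
        block[r, c] = value
    return block

block = np.arange(64).reshape(8, 8)
assert np.array_equal(inverse_zigzag(zigzag(block)), block)
print(zigzag(block)[:10])   # first elements: 0, 1, 8, 16, 9, 2, 3, 10, 17, 24
```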

  17. Image Registration Algorithm For A PC-Based System

    Science.gov (United States)

    Nutter, Brian S.; Mitra, Sunanda; Krile, Thomas F.

    1988-01-01

    Image registration algorithms are essential for subtractive analysis of sequential images. Discrepancies in lighting, image orientation, and scale must be minimized before effective subtraction of two images can occur. We have successfully implemented computationally intensive algorithms for registration, which include illuminance normalization and magnification correction, in a PC-based image processing system. A homomorphic filter in the spatial domain is used to reduce the illumination variations in the images. A modified sequential similarity detection technique is used to derive the minimum error factor associated with each combination of translation, magnification, and rotation variations. Each variation of the test image is masked with one of three masks, and the squares of the pixel intensity differences are summed for every test image. An adaptive threshold is used to decrease the time required for a misfit by aborting the test image under consideration when its summation exceeds the value of the previous best fit summation. After the best fit parameters are obtained, they are used to register the images so that the images can be subtracted. The difference image is subjected to further image enhancement operations. The execution time of the image registration algorithm has been reduced through use of a hybrid program written in C and Assembly languages. Applications of the registration algorithms in analysis of fundus images will be presented.

  18. A Multi-Stage Algorithm for Enhanced XRay Image Segmentation

    Directory of Open Access Journals (Sweden)

    ADITYA A. TIRODKAR

    2011-09-01

    Full Text Available With the ever-increasing usage of empirical data collected from X-Ray and other digital imaging techniques, it has become imperative that this data be subjected to computer algorithms for speedy and more accurate diagnosis. Segmentation is one of the key techniques employed during the pre-processing stages of these algorithms for separating from the images those details that are required for analysis. There are currently a number of widespread segmentation techniques in use. Our proposed algorithm is a quick and qualitatively more efficient technique for segmentation that is optimized for X-Ray images. It applies Otsu's algorithm to provide thresholding values that can be used for contrasting and binarizing the images. Also, an edge detection technique has been applied to better evince observations, allowing more fruitful extraction of information, and the algorithm has itself been tested on a set of 40 images.

  19. A Quick Image Registration Algorithm Based on Delaunay Triangulation

    OpenAIRE

    Rui Zhang; Li Ma; Yongmei Zhang

    2013-01-01

    The traditional image matching algorithms adopt more complex strategies when dealing with mismatch caused by a lot of noise. In this paper, a simple, intuitive and effective noise processing algorithm is proposed based on Delaunay triangulation in computational geometry. The algorithm extracts feature points using SIFT method, respectively establishes Delaunay triangulation in multi-spectral and panchromatic images, and removes the feature points that three points are collinear and four po...

  20. A Fast and Efficient Topological Coding Algorithm for Compound Images

    Directory of Open Access Journals (Sweden)

    Xin Li

    2003-11-01

    Full Text Available We present a fast and efficient coding algorithm for compound images. Unlike popular mixed raster content (MRC) based approaches, we propose to attack the compound image coding problem from the perspective of modeling the location uncertainty of image singularities. We suggest that a computationally simple two-class segmentation strategy is sufficient for the coding of compound images. We argue that jointly exploiting the topological properties of the image source in the classification and coding stages is beneficial to the robustness of compound image coding systems. Experimental results have justified the effectiveness and robustness of the proposed topological coding algorithm.

  1. Encryption And Portable Data Storage

    Directory of Open Access Journals (Sweden)

    Cynthia L. Knott

    2011-04-01

    Full Text Available The protection of data is a key issue in today's world. The wide availability and use of portable technologies such as USB flash drives has increased concern about securing the data that resides on these devices. Because USB flash drives are small, relatively inexpensive, and easy to use, the security of the information stored on these thumb drives is an on-going concern. A number of approaches to safeguarding the information stored on these drives are available. This paper examines one approach to this goal through the use of encryption. This method encrypts all the data on the drive. In addition, the fact that the data on the drive is encrypted is not visually obvious when viewing the contents of the disk. The proposed approach uses publicly available and free encryption algorithms. A user password is needed to view and access the data that has been encrypted. The proposed methodology is quick and easy to use. Individuals who routinely carry around their USB drives need to be able to decrypt and encrypt the device quickly and conveniently. Furthermore, if the device is lost, it is still possible with the method advocated in this paper to include information about how to return the device to the owner without compromising the secured data on the drive. Without encrypting the data on portable drives, the user risks the disclosure of information. This paper argues that portable storage should be secured and suggests a way to secure the data through passwords and encryption that further enhances the usability and flexibility of the USB flash drive. The paper includes the results and analysis of an undergraduate student survey that determined what habits and practices students followed with respect to securing their personal data and files. Some of the questions included in the analysis are the following: Do you encrypt your USB flash drive? Do you use any type of security for your USB flash drive? How important do you think security is for a flash drive? (a Likert scale) Do you use passwords to protect your USB flash drive? Do you back up your work? Do you think it is important to use security when using a USB flash drive? The findings of the survey help in understanding the perspective of today's students and how to address the critical need for them to secure their information and data files.

  2. High performance deformable image registration algorithms for manycore processors

    CERN Document Server

    Shackleford, James; Sharp, Gregory

    2013-01-01

    High Performance Deformable Image Registration Algorithms for Manycore Processors develops highly data-parallel image registration algorithms suitable for use on modern multi-core architectures, including graphics processing units (GPUs). Focusing on deformable registration, we show how to develop data-parallel versions of the registration algorithm suitable for execution on the GPU. Image registration is the process of aligning two or more images into a common coordinate frame and is a fundamental step to be able to compare or fuse data obtained from different sensor measurements. E

  3. Fast image matching algorithm based on projection characteristics

    Science.gov (United States)

    Zhou, Lijuan; Yue, Xiaobo; Zhou, Lijun

    2011-06-01

    Based on an analysis of the traditional template matching algorithm, this paper identifies the key factors restricting matching speed and puts forward a new fast matching algorithm based on projection. By projecting the grayscale image, the algorithm converts the two-dimensional information of the image into one-dimensional signals and then matches and identifies through one-dimensional correlation; moreover, because normalization is applied, correct matching is still achieved when the image brightness or signal amplitude increases in proportion. Experimental results show that the proposed projection-based image registration method greatly improves matching speed while preserving matching accuracy.
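
    A rough sketch of the projection idea is given below (assumptions: grayscale NumPy inputs and normalized cross-correlation as the 1-D similarity measure); the template and the search image are collapsed into row and column projections and matched in one dimension:

```python
import numpy as np

def projections(img):
    """Collapse a 2-D grayscale image into normalized row and column projections."""
    rows = img.sum(axis=1).astype(float)
    cols = img.sum(axis=0).astype(float)
    norm = lambda v: (v - v.mean()) / (v.std() + 1e-12)   # brightness/amplitude invariance
    return norm(rows), norm(cols)

def match_1d(signal, template):
    """Return the offset maximizing normalized correlation of template within signal."""
    n = len(template)
    best, best_off = -np.inf, 0
    for off in range(len(signal) - n + 1):
        win = signal[off:off + n] - signal[off:off + n].mean()
        tpl = template - template.mean()
        score = float(np.dot(win, tpl)) / (np.linalg.norm(win) * np.linalg.norm(tpl) + 1e-12)
        if score > best:
            best, best_off = score, off
    return best_off

def projection_match(image, template):
    """Estimate the (row, col) position of template in image via 1-D projections."""
    img_r, img_c = projections(image)
    tpl_r, tpl_c = projections(template)
    return match_1d(img_r, tpl_r), match_1d(img_c, tpl_c)
```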

  4. Encrypted message transmission in a QO-STBC encoded MISO wireless Communication system under implementation of low complexity ML decoding algorithm

    Directory of Open Access Journals (Sweden)

    Most. Farjana Sharmin

    2012-04-01

    Full Text Available In this paper, we present a comprehensive BER simulation study of a quasi-orthogonal space-time block encoded (QO-STBC) multiple-input single-output (MISO) system. The communication system under investigation incorporates four digital modulations (QPSK, QAM, 16PSK and 16QAM) over Additive White Gaussian Noise (AWGN) and Rayleigh fading channels for three transmit and one receive antennas. In its FEC channel coding section, three schemes, namely Cyclic, Reed-Solomon and ½-rate convolutional encoding, are used. Under implementation of low-complexity ML-decoding-based channel estimation and RSA cryptographic encoding/decoding algorithms, the conducted simulation tests on encrypted text message transmission show that the communication system with QAM digital modulation and ½-rate convolutional encoding is highly effective at combating inherent interference under Rayleigh fading and AWGN channels. It is also noticeable from the study that the retrieval performance of the communication system degrades with decreasing signal-to-noise ratio (SNR) and with increasing modulation order.

  5. Generalized flow pattern image reconstruction algorithm for electrical capacitance tomography

    International Nuclear Information System (INIS)

    Successful applications of electrical capacitance tomography (ECT) depend on the speed and precision of the image reconstruction algorithms. In this paper, based on the semiparametric model, a generalized objective functional that considers both the outliers in the measured capacitance data and the model error is proposed, and a regularized combination minimax estimation is developed. An efficient algorithm is proposed that integrates the advantages of the homotopy method (in which the homotopy equation is designed by fixed-point homotopy and solved with a fixed-point iteration based on an alternate iteration scheme), the quantum particle swarm optimization algorithm coupled with crossover and mutation operators, and the simulated annealing algorithm. The algorithm is tested on noise-free and noise-contaminated capacitance data, and encouraging results are observed. Numerical simulation results reveal the effectiveness and superiority of the proposed algorithm. For the reconstructed objects considered in this paper, the results of the proposed algorithm show great improvement in spatial resolution and accuracy: the spatial resolution of the reconstructed images is enhanced, and artifacts in the reconstructed images are removed effectively. Furthermore, the results obtained under noise-contaminated capacitance data reveal that the proposed algorithm copes well with the inaccurate nature of the capacitance data. Consequently, a promising algorithm is introduced for ECT image reconstruction.

  6. Image Compression Using A Fast 2-D DCT Algorithm

    International Nuclear Information System (INIS)

    Different efficient algorithms have been developed for the computation of the 1-D DCT and 2-D DCT. Although these algorithms differ, achieving high speed and high accuracy is the common goal. We present in this paper a fast and efficient 2-D DCT algorithm, which reduces computational complexity as measured by the number of multiplications and additions while keeping the accuracy of the reconstructed images. The algorithm is suitable for VLSI implementation.
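
    A 2-D DCT is commonly computed separably by applying a fast 1-D DCT along rows and then columns; the sketch below (using SciPy's DCT-II rather than the specific fast algorithm of the paper) illustrates the transform and a crude low-frequency truncation on an 8x8 block:

```python
import numpy as np
from scipy.fft import dct, idct

def dct2(block):
    """Separable 2-D DCT-II: 1-D DCT along columns, then along rows."""
    return dct(dct(block, type=2, norm='ortho', axis=0), type=2, norm='ortho', axis=1)

def idct2(coeffs):
    """Inverse 2-D DCT, again applied separably."""
    return idct(idct(coeffs, type=2, norm='ortho', axis=0), type=2, norm='ortho', axis=1)

block = np.random.rand(8, 8)
coeffs = dct2(block)
# Keep only the low-frequency quarter of the coefficients (toy "compression").
mask = np.zeros_like(coeffs)
mask[:4, :4] = 1
approx = idct2(coeffs * mask)
print("max reconstruction error:", np.abs(block - approx).max())
```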

  7. Genetic Algorithms for Image Segmentation using Active Contours

    Directory of Open Access Journals (Sweden)

    Neeru Gulati

    2013-02-01

    Full Text Available The genetic algorithm is a search technique used in computing to find approximate solutions to optimization and search problems. As a search strategy, genetic algorithms have been applied successfully in many fields. This paper first describes the evolution process of genetic algorithms. It then describes active contours, which detect the boundaries of objects whose boundaries are not well defined, and finally describes the use of genetic algorithms with active contours in image segmentation.

  8. Genetic Algorithms for Image Segmentation using Active Contours

    OpenAIRE

    Neeru Gulati; Poonam Panwar

    2013-01-01

    The genetic algorithm is a search technique used in computing to find approximate solutions to optimization and search problems. As a search strategy, genetic algorithms have been applied successfully in many fields. This paper first describes the evolution process of genetic algorithms. It then describes active contours, which detect the boundaries of objects whose boundaries are not well defined, and finally describes the use of genetic algorithms with active contours in image segmentation.

  9. Implementation of watershed based image segmentation algorithm in FPGA

    OpenAIRE

    Ruparelia, Sameer

    2012-01-01

    The watershed algorithm is a commonly used method of solving the image segmentation problem. However, of the many variants of the watershed algorithm not all are equally well suited for hardware implementation. Different algorithms are studied and the watershed algorithm based on connected components is selected for the implementation, as it exhibits least computational complexity, good segmentation quality and can be implemented in the FPGA. It has simplified memory access compared to all ot...

  10. A Fast Image Matching Algorithm Based on GPU Parallel Computing

    OpenAIRE

    Wen Yongge; He Hongzhou; Li Haiyang

    2013-01-01

    In the process of image matching, the Scale Invariant Feature Transform (SIFT) algorithm is one of the best-performing algorithms, but its complexity and long running time limit its application in many fields. To address this shortcoming of SIFT, this study proposes a fast SIFT algorithm based on the Graphics Processing Unit (GPU) and analyzes its parallelism. It is further optimized according to a detailed analysis of the thread and memory model of the graphics hardware. It is experi...

  11. A review on the current segmentation algorithms for medical images

    OpenAIRE

    Zhen Ma; João Manuel Ribeiro da Silva Tavares; Renato Manuel Natal Jorge Jorge

    2009-01-01

    This paper reviews the current segmentation algorithms used for medical images. Algorithms are divided into three categories according to their main ideas: those based on thresholds, those based on pattern recognition techniques and those based on deformable models. The main tendency of each category, together with its principal ideas, application fields, advantages and disadvantages, is discussed, and for each category some typical algorithms are described. Algorithms of the thir...

  12. CS-based fast ultrasound imaging with improved FISTA algorithm

    Science.gov (United States)

    Lin, Jie; He, Yugao; Shi, Guangming; Han, Tingyu

    2015-08-01

    In an ultrasound imaging system, wave emission and data acquisition are time consuming; this can be addressed by adopting a plane wave as the transmitted signal and using compressed sensing (CS) theory for data acquisition and image reconstruction. To overcome the very high computational complexity caused by introducing CS into ultrasound imaging, in this paper we propose an improvement of the fast iterative shrinkage-thresholding algorithm (FISTA) to achieve fast reconstruction of the ultrasound image, in which the step-size parameter is modified for each iteration. Further, a GPU strategy is designed for the proposed algorithm to guarantee real-time imaging. The simulation results show that the GPU-based image reconstruction algorithm achieves fast ultrasound imaging without damaging image quality.
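
    For reference, a minimal FISTA iteration for the l1-regularized least-squares problem min_x 0.5*||Ax - y||^2 + lambda*||x||_1 is sketched below (plain NumPy with a fixed step size of 1/L; the paper's modified per-iteration step-size rule and GPU mapping are not reproduced):

```python
import numpy as np

def soft_threshold(x, t):
    """Shrinkage operator used in ISTA/FISTA."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(A, y, lam, n_iter=200):
    """FISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2               # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    z, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ z - y)
        x_new = soft_threshold(z - grad / L, lam / L)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = x_new + ((t - 1) / t_new) * (x_new - x)   # momentum step
        x, t = x_new, t_new
    return x

# Toy usage: recover a sparse vector from compressed measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((64, 256))
x_true = np.zeros(256)
x_true[rng.choice(256, 8, replace=False)] = 1.0
x_hat = fista(A, A @ x_true, lam=0.05)
```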

  13. Algorithm of image fusion based on finite ridgelet transform

    Science.gov (United States)

    Liu, Kun; Guo, Lei; Chang, Weiwei; Li, Huihui

    2007-11-01

    The finite ridgelet transform (FRIT) overcomes the weakness of the wavelet transform in representing two- or higher-dimensional structure, and it can efficiently represent linear singularities in an image. When FRIT is applied to image fusion, the characteristics of the original images can be effectively extracted and more important information is preserved. In this paper, we discuss the advantages of applying FRIT to image fusion and describe the FRIT-based fusion algorithm in detail. Two sets of images are taken as experimental data, and both subjective and objective criteria are used to evaluate the results. The experimental results show that the FRIT-based algorithm gives much better fusion results than the wavelet-based one, and that image fusion based on FRIT is an effective and feasible approach.

  14. Imaging algorithm for steadily flying and maneuvering big targets

    Science.gov (United States)

    Xing, Mengdao; Bao, Zheng

    2001-08-01

    Inverse synthetic aperture radar (ISAR) imaging is usually applied to small aircraft at long range, where the coherent integration angle is small, so the target's wavenumber spectrum support region can be regarded as a rectangle. In that case, the Range-Doppler (RD) algorithm or the Range-Instantaneous-Doppler (RID) algorithm is employed for image reconstruction after translational motion compensation (TMC), which includes envelope alignment (such as the envelope correlation algorithm or the minimum entropy algorithm) and autofocus (such as the single-PPP algorithm, multiple-PPP algorithm, PGA, or the weighted least squares algorithm). Migration through resolution cells (MTRC) is usually not considered after TMC; in fact, scatterers across the target undergo MTRC if the target is large. In this paper, we first align and focus the high-resolution radar target echoes with respect to the target center, and then apply a time-scale transform in the target's wavenumber domain, namely Soumekh's 'keystone' interpolation, to compensate for MTRC (which can also be realized rapidly by DFT-IFFT or SFT-IFFT in the azimuth direction). After range compression (range IFFT), for a steadily flying target the image is obtained simply by azimuth compression (FFT in the azimuth direction); for a maneuvering target, time-frequency analysis must be applied to every range cell, and existing instantaneous imaging algorithms (such as the joint time-frequency distribution algorithm or the Radon-Wigner algorithm) are also effective for obtaining RID images. The paper gives the ISAR imaging algorithm flow for obtaining images from raw data of steadily flying and maneuvering big targets, and both simulated and real data prove that the algorithm flow is effective.

  15. A Distortion Input Parameter in Image Denoising Algorithms with Wavelets

    OpenAIRE

    Anisia GOGU; Dorel AIORDACHIOAIE

    2009-01-01

    The problem of image denoising based on wavelets is considered. The paper proposes an image denoising method that imposes a distortion input parameter instead of a threshold. The method comprises two algorithms. The first runs off line: it is applied to a prototype of the image class and builds a specific dependency, linear or nonlinear, between the final desired distortion and the required probability of the detail coefficients. The next algorithm directly applies the denoi...

  16. Wireless Physical Layer Encryption

    Directory of Open Access Journals (Sweden)

    Gao Baojian

    2013-01-01

    Full Text Available With the rapid development of wireless and cognitive network technology, the security of wireless communication faces great challenges, since parameters of a wireless link such as modulation type and frequency are increasingly easy to detect. As a result, commercial and especially military communications face increasingly serious problems of targeted interference and content security. In this study, a hiding algorithm for OFDM constellation mapping based on physical layer encryption is proposed. A secret seed key is used to control a phase rotation factor and the amplitude, thereby disrupting the MPSK/MQAM-based OFDM constellation mapping process. In this way, the modulation modes used by legitimate users cannot be detected, modulation protection is achieved, and illegal users cannot distinguish the modulation type. Simulation results show that this algorithm has a high capacity for modulation hiding without changing the original system performance.
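
    A toy illustration of the key-controlled rotation idea follows (hypothetical parameters; real OFDM framing, amplitude scaling and synchronization are omitted): a seed shared by the legitimate users drives per-symbol phase rotations, so an eavesdropper observes a scrambled constellation.

```python
import numpy as np

QPSK = np.exp(1j * (np.pi / 4 + np.pi / 2 * np.arange(4)))   # toy QPSK mapping

def encrypt_symbols(bits, seed):
    """Map bit pairs to QPSK and rotate each symbol by a key-driven phase."""
    rng = np.random.default_rng(seed)            # the secret seed acts as the shared key
    idx = bits.reshape(-1, 2) @ np.array([2, 1])
    symbols = QPSK[idx]
    phases = rng.uniform(0, 2 * np.pi, size=symbols.size)
    return symbols * np.exp(1j * phases)

def decrypt_symbols(rx, seed):
    """Legitimate receiver regenerates the phases and de-rotates before demapping."""
    rng = np.random.default_rng(seed)
    phases = rng.uniform(0, 2 * np.pi, size=rx.size)
    derotated = rx * np.exp(-1j * phases)
    return np.array([np.argmin(np.abs(s - QPSK)) for s in derotated])

bits = np.random.randint(0, 2, 64)
tx = encrypt_symbols(bits, seed=1234)
recovered = decrypt_symbols(tx, seed=1234)
assert np.array_equal(recovered, bits.reshape(-1, 2) @ np.array([2, 1]))
```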

  17. Image Compression Algorithms Using Intensity Based Adaptive Quantization Coding

    Directory of Open Access Journals (Sweden)

    Saad Al-Azawi

    2011-01-01

    Full Text Available Problem statement: Low-complexity image compression algorithms are necessary for modern portable devices such as mobile phones, wireless sensor networks and devices with tight power-consumption constraints. In such applications, a low bit rate together with acceptable image quality is an essential requirement. Approach: This study proposes low- and moderate-complexity algorithms for colour image compression. Two algorithms are presented: the first is intensity-based adaptive quantization coding, while the second combines the discrete wavelet transform with the intensity-based adaptive quantization coding algorithm. Adaptive quantization coding produces a good Peak Signal to Noise Ratio (PSNR) but with high bit rates compared with other low-complexity algorithms. The presented algorithms produce a low bit rate whilst keeping the PSNR and image quality in an acceptable range. Results: Experiments were performed using different kinds of standard colour images, a multi-level quantizer, different thresholds, different block sizes and different wavelet filters. Both algorithms considered the intensity variation of each colour plane. At high compression ratios the proposed algorithms produced a 1-3 bpp bit-rate reduction compared with stand-alone adaptive quantization coding for the same image quality. This reduction was achieved by dropping blocks deemed to have low intensity variation according to a comparison with predefined thresholds for each colour plane. The results show that the bit rate of each low-variation image block can be reduced by 72-88% from the original bit rate. Conclusion: The results obtained show a good reduction in bit rate with the same, or only slightly lower, PSNR than a stand-alone adaptive quantization coding algorithm. Further bit-rate reduction was achieved by decomposing the input image using different wavelet filters together with intensity-based adaptive quantization coding. The proposed algorithm comprises a number of parameters to control the quality of the compressed images.

  18. An Improved Image Restoration Algorithm for Overcast Based on MSR

    Directory of Open Access Journals (Sweden)

    Zhen Chen

    2013-10-01

    Full Text Available According to the degradation model of cloudy images, the paper uses the MSR (multi-scale retinex) algorithm for restoration. However, MSR cannot effectively restore the details and color of cloudy images, so the paper proposes an improved MSR algorithm for processing them. The three scales of traditional MSR are extended to four: a Gaussian function with a middle scale is added to preserve the details and color of the image. In addition, an information fusion strategy in the wavelet transform domain replaces MSR's linear weighting of the multi-scale reflection images. The basic idea of the fusion is as follows: first, the images to be fused undergo a two-level wavelet decomposition; then, the high-frequency components are fused by taking the maximum absolute value to emphasize the details in the image, while a local energy method is used for the low-frequency component to adjust the background and color, achieving fidelity. Finally, subjective observation and objective evaluation in the paper indicate that the proposed algorithm restores details and preserves color better than the traditional MSR algorithm when restoring cloudy images.
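
    For orientation, a plain multi-scale retinex sketch with four Gaussian scales is given below; the wavelet-domain fusion proposed in the paper is replaced here by the standard equal-weight average, so this is only the baseline the authors improve upon (the scale values are illustrative).

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def multi_scale_retinex(img, sigmas=(15, 40, 80, 200), eps=1.0):
    """MSR on one channel: average of log(I) - log(Gaussian-blurred I) over the scales."""
    img = img.astype(float) + eps
    out = np.zeros_like(img)
    for sigma in sigmas:
        out += np.log(img) - np.log(gaussian_filter(img, sigma) + eps)
    out /= len(sigmas)
    # Stretch the result back to a displayable 8-bit range.
    out = (out - out.min()) / (out.max() - out.min() + 1e-12)
    return (255 * out).astype(np.uint8)
```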

  19. A new modified fast fractal image compression algorithm

    DEFF Research Database (Denmark)

    Salarian, Mehdi; Nadernejad, Ehsan

    2013-01-01

    In this paper, a new fractal image compression algorithm is proposed, in which the time of the encoding process is considerably reduced. The algorithm exploits a domain pool reduction approach, along with innovative predefined values for the contrast scaling factor, S, instead of searching for it. Only the domain blocks with entropy greater than a threshold are considered to belong to the domain pool. The algorithm has been tested on some well-known images and the results have been compared with state-of-the-art algorithms. The experiments show that the proposed algorithm has considerably lower encoding time than the other algorithms while giving approximately the same quality for the encoded images.

  20. A Fast Image Matching Algorithm Based on GPU Parallel Computing

    Directory of Open Access Journals (Sweden)

    Wen Yongge

    2013-01-01

    Full Text Available In the process of image matching, the Scale Invariant Feature Transform (SIFT) algorithm is one of the best-performing algorithms, but its complexity and long running time limit its application in many fields. To address this shortcoming of SIFT, this study proposes a fast SIFT algorithm based on the Graphics Processing Unit (GPU) and analyzes its parallelism. It is further optimized according to a detailed analysis of the thread and memory model of the graphics hardware. It is experimentally shown that the proposed algorithm is 25-45 times faster than the original algorithm; processing 640×480 images, it can reach 24 frames per second. The conclusion is that the proposed algorithm can meet the needs of real-time applications.

  1. Implementation of Parallel Algorithms for Image Enhancement Using Matlab

    Directory of Open Access Journals (Sweden)

    Shaimaa Ibrahem

    2014-08-01

    Full Text Available This paper presents an efficient implementation of algorithms used for the image enhancement process, which filter and restore large images more easily and faster. An application with a sequential algorithm can no longer improve program performance sufficiently: image enhancement must work with large image data, which takes a lot of time, and parallel computing is an efficient way to handle large images and reduce processing time. This paper focuses on measuring the execution time of the parallel filtering program and comparing it with the corresponding sequential execution time; we also discuss the filtering results for different filter types.

  2. Content-oriented Multi-level Security Authorization of Remote Sensing Images

    OpenAIRE

    Yuan Ye; Tu Chunxia; Liu Xiaojun

    2013-01-01

    In this study, on the basis of the characteristics of the large quantity of remote sensing data and the application requirements for security, a scheme for authorizing the use of remote sensing images based on multi-level security is put forward. We propose a content-based encryption algorithm for multi-region, multi-level confidential information in remote sensing images. The same remote sensing images, after encryption, are distributed to users of different levels, such as authorized users and partly authoriz...

  3. Ultrasonic particle image velocimetry for improved flow gradient imaging: algorithms, methodology and validation

    International Nuclear Information System (INIS)

    This paper presents a new algorithm for ultrasonic particle image velocimetry (Echo PIV) that improves flow velocity measurement accuracy and efficiency in regions with high velocity gradients. The conventional Echo PIV algorithm has been modified by incorporating a multiple iterative algorithm, a sub-pixel method, filtering and interpolation, and a spurious vector elimination algorithm. The new algorithm's performance is assessed by analyzing simulated images with known displacements, and ultrasonic B-mode images of in vitro laminar pipe flow, rotational flow and in vivo rat carotid arterial flow. Results on the simulated images show that the new algorithm produces a much smaller bias from the known displacements. For laminar flow, the new algorithm deviates 1.1% from the analytically derived value, versus 8.8% for the conventional algorithm. The vector quality evaluation for rotational flow imaging shows that the new algorithm produces better velocity vectors. For in vivo rat carotid arterial flow imaging, the results from the new algorithm deviate on average 6.6% from the Doppler-measured peak velocities, compared to 15% for the conventional algorithm. The new Echo PIV algorithm effectively improves measurement accuracy when imaging flow fields with high velocity gradients.

  4. A Fingerprint Encryption Scheme Based on Irreversible Function and Secure Authentication

    OpenAIRE

    Yang, Yijun; Yu, JianPing; Zhang , Peng; Wang, Shulan

    2015-01-01

    A fingerprint encryption scheme based on an irreversible function is designed in this paper. Since the fingerprint template includes almost all of the information in a user's fingerprint, personal authentication can be determined by the fingerprint features alone. This paper proposes an irreversible transformation function (using the improved SHA1 algorithm) to transform the original minutiae extracted from the thinned fingerprint image. Then, the Chinese remainder theorem is used to...

  5. ASC-1 : An Authenticated Encryption Stream Cipher

    DEFF Research Database (Denmark)

    Jakimoski, Goce; Khajuria, Samant

    2011-01-01

    The goal of the modes of operation for authenticated encryption is to achieve faster encryption and message authentication by performing both the encryption and the message authentication in a single pass, as opposed to the traditional encrypt-then-mac approach, which requires two passes. Unfortunately, the use of a block cipher as a building block limits the performance of the authenticated encryption schemes to at most one message block per block cipher evaluation. In this paper, we propose the authenticated encryption scheme ASC-1 (Authenticating Stream Cipher One). Similarly to LEX, ASC-1 uses leak extraction from different AES rounds to compute the key material that is XOR-ed with the message to compute the ciphertext. Unlike LEX, ASC-1 operates in a CFB fashion to compute an authentication tag over the encrypted message. We argue that ASC-1 is secure by reducing its (IND-CCA, INT-CTXT) security to the problem of distinguishing the case when the round keys are uniformly random from the case when the round keys are generated by a key scheduling algorithm.

  6. Comparative Assessment of Some Target Detection Algorithms for Hyperspectral Images

    Directory of Open Access Journals (Sweden)

    Manoj K. Arora

    2013-01-01

    Full Text Available Target detection is of particular interest in hyperspectral image analysis, as many unknown and subtle signals (spectral responses unresolved by multispectral sensors) can be discovered in hyperspectral images. The detection of signals in the form of small objects and targets from hyperspectral sensors has a wide range of applications, both civilian and military. A number of target detection algorithms are in vogue; each has its own advantages, disadvantages and assumptions. The selection of a particular algorithm may depend on the amount of information available as required by the algorithm, the application area, the computational complexity, etc. In the present study, three algorithms, namely orthogonal subspace projection (OSP), constrained energy minimization (CEM) and a nonlinear version of OSP called kernel orthogonal subspace projection (KOSP), have been investigated for target detection from hyperspectral remote sensing data. The efficacy of the algorithms has been examined on two different hyperspectral datasets, a synthetic image and an AVIRIS image. The quality of target detection has been evaluated through visual interpretation as well as receiver operating characteristic (ROC) curves. The performance of the OSP algorithm has been found to be better than or comparable to the CEM algorithm; however, KOSP outperforms both.

  7. Three-dimensional information encryption and anticounterfeiting using digital holography.

    Science.gov (United States)

    Shiu, Min-Tzung; Chew, Yang-Kun; Chan, Huang-Tian; Wong, Xin-Yu; Chang, Chi-Ching

    2015-01-01

    In this work, arbitrary micro phase-step digital holography with optical interferometry and digital image processing is utilized to obtain information about an image of a three-dimensional object and encrypting keys. Then, a computer-generated hologram is used for the purpose of holographic encryption. All information about the keys is required to perform the decryption, comprising the amplitude and phase distribution of the encrypting key, the distance of image reconstruction, zero-order term elimination, and twin-image term suppression. In addition to using identifiable information on different image planes and linear superposition processing hidden within the encrypted information, not only can we convey an important message, but we can also achieve anticounterfeiting. This approach retains the strictness of traditional holographic encryption and the convenience of digital holographic processing without image distortion. Therefore, this method provides better solutions to earlier methods for the security of the transmission of holographic information. PMID:25967026

  8. System for Information Encryption Implementing Several Chaotic Orbits

    Scientific Electronic Library Online (English)

    Maricela, Jiménez-Rodríguez; Octavio, Flores-Siordia; María Guadalupe, González-Novoa.

    2015-09-01

    Full Text Available This article proposes a symmetric encryption algorithm that takes as input the original information of length L and, when encoded, generates ciphertext of greater length LM. The discrete chaotic logistic map is implemented to generate three different orbits: the first is used to apply a diffusion technique in order to mix the original data, the second orbit is combined with the mixed information and increases the length from L to LM, and with the third orbit the confusion technique is implemented. The encryption algorithm was applied to encode an image, which is then fully recovered using the keys employed for encryption and the corresponding decryption algorithm. The algorithm can encode any kind of information simply by dividing it into 8-bit blocks, can meet high-level security requirements, uses 7 keys to encrypt, and provides good encryption speed.
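
    A minimal sketch of the diffusion/confusion idea with logistic-map orbits follows (illustrative parameters; the paper's 7-key scheme and the length expansion from L to LM are not reproduced): one orbit permutes the byte positions and a second orbit is quantized into a keystream that is XOR-ed with the data.

```python
import numpy as np

def logistic_orbit(x0, r, n, burn_in=100):
    """Iterate x_{k+1} = r*x_k*(1-x_k) and return n samples after a burn-in."""
    x = x0
    orbit = np.empty(n)
    for _ in range(burn_in):
        x = r * x * (1 - x)
    for i in range(n):
        x = r * x * (1 - x)
        orbit[i] = x
    return orbit

def encrypt(data: bytes, key=(0.3141, 0.7182, 3.9999)):
    x1, x2, r = key                              # key values are illustrative
    n = len(data)
    perm = np.argsort(logistic_orbit(x1, r, n))                    # diffusion: shuffle positions
    keystream = (logistic_orbit(x2, r, n) * 256).astype(np.uint8)  # confusion: XOR mask
    shuffled = np.frombuffer(data, dtype=np.uint8)[perm]
    return bytes(shuffled ^ keystream)

def decrypt(cipher: bytes, key=(0.3141, 0.7182, 3.9999)):
    x1, x2, r = key
    n = len(cipher)
    perm = np.argsort(logistic_orbit(x1, r, n))
    keystream = (logistic_orbit(x2, r, n) * 256).astype(np.uint8)
    shuffled = np.frombuffer(cipher, dtype=np.uint8) ^ keystream
    out = np.empty(n, dtype=np.uint8)
    out[perm] = shuffled                         # undo the position shuffle
    return bytes(out)

assert decrypt(encrypt(b"plain image bytes")) == b"plain image bytes"
```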

  9. COMPARISON AND ANALYSIS OF WATERMARKING ALGORITHMS IN COLOR IMAGES: IMAGE SECURITY PARADIGM

    OpenAIRE

    Biswas, D; Biswas, S; P.P. Sarkar; Sarkar, D.; Banerjee, S.; Pal, A.

    2011-01-01

    This paper presents a comparative study of different watermarking techniques, such as the LSB hiding algorithm, (2, 2) visual-cryptography-based watermarking for color images [3,4] and the randomized LSB-MSB hiding algorithm [1]. Here, we embed the secret image in a host or original image by using these bit-wise pixel manipulation algorithms. This is followed by a comparative study of the resultant images through Peak Signal to Noise Ratio (PSNR) calculation. The property-wise variation of di...
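
    As a reference point for the first of the compared techniques, a bare-bones LSB hiding sketch is shown below (NumPy arrays, one secret bit per cover byte; this illustrates the generic technique, not the exact algorithms of the cited papers):

```python
import numpy as np

def embed_lsb(cover: np.ndarray, secret_bits: np.ndarray) -> np.ndarray:
    """Write one secret bit into the least significant bit of each cover byte."""
    flat = cover.flatten().astype(np.uint8)      # flatten() returns a copy
    flat[:secret_bits.size] = (flat[:secret_bits.size] & 0xFE) | secret_bits.astype(np.uint8)
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bits: int) -> np.ndarray:
    """Read the first n_bits least significant bits back out."""
    return stego.flatten()[:n_bits] & 1

cover = np.random.randint(0, 256, (8, 8), dtype=np.uint8)   # stand-in for a host image
bits = np.random.randint(0, 2, 16)
stego = embed_lsb(cover, bits)
assert np.array_equal(extract_lsb(stego, 16), bits)
```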

  10. Camera Image Mosaicing Based on an Optimized SURF Algorithm

    Directory of Open Access Journals (Sweden)

    Yanshuang Song

    2012-12-01

    Full Text Available For real-time and robust web camera image mosaicing, a method based on optimized SURF (Speeded Up Robust Features) is proposed in this paper. First, feature points in the overlapping parts of the reference image and the target image are extracted and matched with the fast matching algorithm BBF (Best-Bin-First). Then RANSAC (Random Sample Consensus) is used to eliminate mismatched features and to calculate the projection matrix used to resample the target image and obtain the calibrated one. Finally, the registered images are fused with an evolutional fusion algorithm to produce the mosaic. The results demonstrate that our image mosaicing method can effectively, efficiently and robustly stitch images captured by web cameras under noise and different lighting.

  11. Study of image matching algorithm and sub-pixel fitting algorithm in target tracking

    Science.gov (United States)

    Yang, Ming-dong; Jia, Jianjun; Qiang, Jia; Wang, Jian-yu

    2015-03-01

    Image correlation matching is a tracking method that searches for the region most similar to a target template based on a correlation measure between two images. Because there is no need to segment the image, the computational cost of this method is low, and image correlation matching is a basic method of target tracking. This paper mainly studies a grayscale image matching algorithm whose precision is at the sub-pixel level. The matching measure used in this paper is SAD (Sum of Absolute Differences), which excels in real-time systems because of its low computational complexity. The SAD method is introduced first, together with the most frequently used sub-pixel fitting algorithms. Those fitting algorithms are too complex for real-time systems, yet target tracking often requires high real-time performance; with this in mind we put forward a fitting algorithm, the paraboloidal fitting algorithm, which is simple and easily realized in a real-time system. The result of this algorithm is compared with that of surface fitting through image matching simulation; by comparison, the precision difference between the two algorithms is small, less than 0.01 pixel. In order to study the influence of target rotation on matching precision, a camera rotation experiment was carried out. The detector used in the camera is a CMOS detector; it is fixed to an arc pendulum table, and pictures are taken as the camera is rotated through different angles. A subarea of the original picture is chosen as the template, and the best matching spot is searched for using the image matching algorithm described above. The results show that the matching error grows as the target rotation angle increases, in an approximately linear relation. Finally, the influence of noise on matching precision was studied. Gaussian noise and salt-and-pepper noise were added to the image, the image was processed by mean and median filters, and image matching was then performed. The results show that when the noise is small, mean and median filters achieve good results; but when the density of the salt-and-pepper noise exceeds 0.4, or the variance of the Gaussian noise exceeds 0.0015, the image matching result will be wrong.
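
    The SAD search and the parabolic sub-pixel refinement discussed above can be sketched as follows (plain NumPy, exhaustive search; the three-point parabola vertex formula is applied independently along rows and columns):

```python
import numpy as np

def sad_match(image, template):
    """Exhaustive SAD search; returns the integer (row, col) of the best match and the SAD map."""
    H, W = image.shape
    h, w = template.shape
    sad = np.empty((H - h + 1, W - w + 1))
    for r in range(sad.shape[0]):
        for c in range(sad.shape[1]):
            sad[r, c] = np.abs(image[r:r+h, c:c+w].astype(float) - template).sum()
    r, c = np.unravel_index(np.argmin(sad), sad.shape)
    return (r, c), sad

def parabolic_offset(left, center, right):
    """Vertex of the parabola through three equally spaced samples; offset in (-0.5, 0.5)."""
    denom = left - 2 * center + right
    return 0.0 if denom == 0 else 0.5 * (left - right) / denom

def subpixel_match(image, template):
    """Integer SAD minimum refined to sub-pixel precision along each axis."""
    (r, c), sad = sad_match(image, template)
    dr = dc = 0.0
    if 0 < r < sad.shape[0] - 1:
        dr = parabolic_offset(sad[r-1, c], sad[r, c], sad[r+1, c])
    if 0 < c < sad.shape[1] - 1:
        dc = parabolic_offset(sad[r, c-1], sad[r, c], sad[r, c+1])
    return r + dr, c + dc
```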

  12. KM_GrabCut: a fast interactive image segmentation algorithm

    Science.gov (United States)

    Li, Jianbo; Yao, Yiping; Tang, Wenjie

    2015-03-01

    Image segmentation is critical for image processing. Among the available algorithms, GrabCut is well known for requiring little user interaction and producing desirable segmentation results. However, it takes a lot of time to adjust the Gaussian Mixture Model (GMM) and to cut the weighted graph iteratively with the Max-Flow/Min-Cut algorithm. To solve this problem, we first build a common algorithmic framework that can be shared by the class of GrabCut-like segmentation algorithms, and then propose the KM_GrabCut algorithm based on this framework. KM_GrabCut first uses the K-means clustering algorithm to cluster pixels in the foreground and background respectively, then constructs a GMM from each clustering result and cuts the corresponding weighted graph only once. Experimental results demonstrate that KM_GrabCut outperforms GrabCut, with higher performance, comparable segmentation results and comparable user interaction.

  13. Fuzzy algorithms with applications to image processing and pattern recognition

    CERN Document Server

    Hong, Yan

    1996-01-01

    This text deals with the subject of fuzzy algorithms and their applications to image processing and pattern recognition. Subjects covered include membership functions; fuzzy clustering; fuzzy rules and defuzzification; fuzzy classifiers; and combined classifiers.

  14. Computerized Clinical Imaging Algorithms for Medical Student Education

    OpenAIRE

    Cronan, John J.; Hanson, Daniel J.; McEnery, Kevin W.; Rowe, Lynda E.

    1988-01-01

    A computer-based education system has been developed which teaches medical students efficient imaging management of common medical and surgical problems. Through the use of imaging algorithms, students learn the sequence of imaging tests necessary to effectively determine a patient's clinical diagnosis. The instruction is presented on a personal computer interfaced to a random-access slide projector. Visual feedback of the images ordered is provided along with textual results.

  15. Algorithm for Improved Image Compression and Reconstruction Performances

    OpenAIRE

    G.Chenchu Krishnaiah; T. Jayachandra Prasad; M.N. Giri Prasad

    2012-01-01

    The energy-efficient wavelet image transform algorithm (EEWITA) is capable of evolving non-wavelet transforms that consistently outperform wavelets when applied to a large class of images subject to quantization error. EEWITA can evolve a set of coefficients describing a matched forward and inverse transform pair that can be used at each level of a multi-resolution analysis (MRA) transform to minimize the original image size and the mean squared error (MSE) in the reconstructed image. Simula...

  16. Robust Algorithm for Face Detection in Color Images

    OpenAIRE

    Hlaing Htake Khaung Tin

    2012-01-01

    A robust algorithm is presented for frontal face detection in color images. Face detection is an important task in facial analysis systems, which require faces to be localized in a given image beforehand. Applications such as face tracking, facial expression recognition and gesture recognition, for example, have the prerequisite that a face is already located in the given image or image sequence. Facial features such as eyes, nose and mouth are automatically detected based on properties of the ...

  17. Triple Encrypted Holographic Storage and Digital Holographic System

    International Nuclear Information System (INIS)

    We propose a triple encrypted holographic memory containing a digital holographic system. The original image is encrypted using double random phase encryption and stored in a LiNbO3:Fe crystal with shift-multiplexing. Both the reference beams of the memory and the digital holographic system are random phase encoded. We theoretically and experimentally demonstrate the encryption and decryption of multiple images and the results show high quality and good fault tolerance. The total key length of this system is larger than 4.7 × 10^33.

  18. Algorithm to reduce anisoplanatism effects on infrared images

    Science.gov (United States)

    Roggemann, Michael C.; Welsh, Byron M.; Klein, Troy L.

    2000-11-01

    Atmospheric turbulence adversely affects imaging systems by causing a random distribution of the index of refraction of the air through which the light must propagate. The resulting image degradation can seriously undermine the effectiveness of the sensor. In many astronomical systems, which typically have a very narrow field of view, the entire image can be modeled by the convolution of the object with a single point spread function (PSF), and as a result of the narrow field of view, adaptive optical systems can be highly effective in correcting astronomical images. In the case of tactical infrared sensors the field of view is generally much larger than the isoplanatic angle, and the image cannot be modeled by a single point spread function convolved with the scene. Hence, adaptive optical solutions to wide angle infrared imaging over horizontal paths would be difficult, if not impossible, and post-detection processing of the images is required to mitigate turbulence effects. The overall effect of turbulence within a given isoplanatic path is not as strong as in the astronomical imaging case due to shorter paths and longer wavelengths. Tilt and low order turbulence modes dominate the aberration experienced within individual isoplanatic patches, greatly simplifying image reconstruction problems. In this paper we describe an algorithm for processing video sequences capable of partially correcting these turbulence effects. The algorithm is based on block matching algorithms used in video compression. Simulation results show that this algorithm reduces the squared error of the imagery, and subjectively better images are obtained.

  19. Gray Cerebrovascular Image Skeleton Extraction Algorithm Using Level Set Model

    OpenAIRE

    Jian Wu; Guang-ming Zhang; Jie Xia; Zhi-ming Cui

    2010-01-01

    The ambiguity and complexity of medical cerebrovascular images make the skeleton obtained by conventional skeleton algorithms discontinuous, sensitive at weak edges, poorly robust and prone to burrs. This paper proposes a cerebrovascular image skeleton extraction algorithm based on the Level Set model, using a Euclidean distance field and an improved gradient vector flow to obtain two different energy functions. The first energy function controls the acquisition of topological nodes ...

  20. Fourier-transform Ghost Imaging based on Compressive Sampling algorithm

    OpenAIRE

    Wang, Hui; Han, Shensheng

    2010-01-01

    A special algorithm for the Fourier-transform Ghost Imaging (GI) scheme is discussed based on the Compressive Sampling (CS) theory. The CS algorithm could also be used for the Fourier spectrum reconstruction of pure phase object by setting a proper sensing matrix. This could find its application in diffraction imaging of X-ray, neutron and electron with higher efficiency and resolution. Experiment results are also presented to prove the feasibility.

  1. A New Hybrid Watermarking Algorithm for Images in Frequency Domain

    Directory of Open Access Journals (Sweden)

    AhmadReza Naghsh-Nilchi

    2008-03-01

    Full Text Available In recent years, digital watermarking, which hides secret information in digital images, has become a popular technique for protecting copyright. The goal of this paper is to develop a hybrid watermarking algorithm. The algorithm uses DCT and DWT coefficients to embed the watermark, and the extraction procedure is blind. The proposed approach is robust to a variety of signal distortions, such as JPEG compression, image cropping and scaling.

  2. Image Matching Algorithm Based On Human Perception

    Directory of Open Access Journals (Sweden)

    MRIGANK SHARMA, AMRITA PRIYAM

    2013-05-01

    Full Text Available There has been in-depth analysis and numerous studies of image matching and storage. Major RDBMS vendors provide image storage based on textual descriptions. Since images can then be searched only by textual information, which is not very efficient, techniques are needed that allow search based on image features rather than on a textual description of the image. In recent years, dramatic changes have been seen in digital image libraries and other multimedia databases. In order to effectively and precisely retrieve the desired images from a large image database, the development of content-based image matching systems has become an important research issue. However, most of the proposed approaches emphasize finding the best representation for different image features, whereas human perception compares images differently. Visual perception is the ability to interpret the surrounding environment by processing information contained in visible light; the resulting perception is also known as eyesight, sight, or vision. Hence, color attributes such as the mean value, the standard deviation and the image bitmap of a color image are used as features for matching. In addition, the entropy based on the gray-level co-occurrence matrix and the edge histogram of an image are considered as texture features.

  3. The Research of Mobile phone Entrance Guard System Model based on the Encryption Two-dimensional Code

    Directory of Open Access Journals (Sweden)

    Chu Jianli

    2013-09-01

    Full Text Available This article designs a new mobile-phone entrance guard system that uses encrypted two-dimensional codes for identity authentication. Unlike other similar products on the market, this system does not rely on a specialized mobile phone card or an NFC (near field communication) module; it can be realized directly in mobile-phone software and is simpler and safer to operate. The article designs the whole system model, including its structure, functions and workflow. It also analyzes the main algorithms used in the system, which include the security policy algorithm, the encrypted two-dimensional code algorithm and the image recognition algorithm. Finally, it provides a solution for the problems found in the experimental simulation and evaluates and summarizes the experimental results.

  4. Encrypted Domain DCT Based on Homomorphic Cryptosystems

    OpenAIRE

    Tiziano Bianchi; Alessandro Piva; Mauro Barni

    2009-01-01

    Signal processing in the encrypted domain (s.p.e.d.) appears to be an elegant solution in application scenarios where valuable signals must be protected from a possibly malicious processing device. In this paper, we consider the application of the Discrete Cosine Transform (DCT) to images encrypted using an appropriate homomorphic cryptosystem. An s.p.e.d. one-dimensional DCT is obtained by defining a convenient signal model and is extended to the two-dimensional case by using separable processing...

  5. Design of AES Algorithm for 128/192/256 Key Length in FPGA

    OpenAIRE

    Pravin V Kinge; S.J. Honale; C.M. Bobade

    2014-01-01

    Cryptographic algorithms can be implemented in software or built with pure hardware. However, a Field Programmable Gate Array (FPGA) implementation offers a quicker solution and can be easily upgraded to incorporate any protocol changes. The AES algorithm is used for data and is also suitable for image encryption and decryption, protecting confidential images from unauthorized access. This project proposes a method in which the image data is an input to the AES algorithm, to ...
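
    As a software illustration of feeding image data through AES (a sketch with PyCryptodome rather than the FPGA design discussed above; the key size and CBC mode are assumptions), the pixel buffer is padded, encrypted, and later restored to its original shape:

```python
import numpy as np
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

def encrypt_image(pixels: np.ndarray, key: bytes):
    """Encrypt a uint8 image buffer with AES-CBC; returns (iv, ciphertext, shape)."""
    iv = get_random_bytes(16)
    cipher = AES.new(key, AES.MODE_CBC, iv)
    ct = cipher.encrypt(pad(pixels.tobytes(), AES.block_size))
    return iv, ct, pixels.shape

def decrypt_image(iv: bytes, ct: bytes, shape, key: bytes) -> np.ndarray:
    cipher = AES.new(key, AES.MODE_CBC, iv)
    data = unpad(cipher.decrypt(ct), AES.block_size)
    return np.frombuffer(data, dtype=np.uint8).reshape(shape)

key = get_random_bytes(32)                                    # 256-bit key; 16 or 24 bytes also work
img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)     # stand-in for a real image
iv, ct, shape = encrypt_image(img, key)
assert np.array_equal(decrypt_image(iv, ct, shape, key), img)
```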

  6. Attack on Fully Homomorphic Encryption over the Integers

    Directory of Open Access Journals (Sweden)

    Gu Chunsheng

    2012-09-01

    Full Text Available This paper presents an attack on fully homomorphic encryption over the integers using a lattice reduction algorithm. Our result shows that the FHE scheme in [4] is not secure for some parameter settings. We also present an improved FHE scheme to avoid this lattice attack. Keywords: Fully Homomorphic Encryption, Cryptanalysis, Lattice Reduction

  7. State Of Art in Homomorphic Encryption Schemes

    Directory of Open Access Journals (Sweden)

    S. Sobitha Ahila

    2014-02-01

    Full Text Available The demand for privacy of digital data and of algorithms for handling more complex structures has increased exponentially over the last decade. However, a critical problem arises when there is a requirement to compute publicly with private data or to modify functions or algorithms in such a way that they remain executable while their privacy is ensured. This is where homomorphic cryptosystems can be used, since these systems enable computations on encrypted data. A fully homomorphic encryption scheme enables the computation of arbitrary functions on encrypted data, which allows a customer to generate a program that can be executed by a third party without revealing the underlying algorithm or the processed data. We take the reader through a journey of these developments and provide a glimpse of the exciting research directions that lie ahead. In this paper, we present a selection of the most important available solutions, discussing their properties and limitations.
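
    To make the "computing on encrypted data" idea concrete, here is a toy additively homomorphic Paillier example in pure Python (tiny, insecure parameters chosen purely for illustration; production schemes use far larger keys and, for arbitrary functions, fully homomorphic constructions rather than Paillier):

```python
import math
import random

# Toy Paillier keypair: tiny, insecure demo primes (real keys are thousands of bits).
p, q = 1000003, 1000033
n = p * q
n2 = n * n
g = n + 1
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)      # lcm(p-1, q-1)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)            # needs Python 3.8+ for pow(x, -1, n)

def encrypt(m: int) -> int:
    """Paillier encryption: c = g^m * r^n mod n^2 with a fresh random r."""
    r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """Paillier decryption: m = L(c^lam mod n^2) * mu mod n, with L(u) = (u-1)/n."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# Homomorphic property: multiplying ciphertexts adds the underlying plaintexts.
a, b = 1234, 5678
assert decrypt((encrypt(a) * encrypt(b)) % n2) == a + b
```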

  8. A New Adaptive Visible Watermarking Algorithm for Document Images

    Directory of Open Access Journals (Sweden)

    Mingfang Jiang

    2012-01-01

    Full Text Available Many previous visible image watermarking schemes are not applicable for document images. A novel visible watermarking algorithm is proposed by considering the unique visual perception of document images. To obtain good visual quality of watermarked images and good perceptual watermark translucence, an adaptive scaling factor and embedding factor calculation method is designed by exploiting luminance and edge masking characteristics of document images. Experimental results prove that the scheme has both good visual quality of watermarked images and satisfactory visible watermark translucence. In addition, it successfully resists removal attacks.

  9. Evaluation of Image Warping Algorithms for Implementation in FPGA

    OpenAIRE

    Serguienko, Anton

    2008-01-01

    The target of this master thesis is to evaluate the image warping technique and propose a possible design for an implementation in an FPGA. Image warping is widely used in image processing for image correction and rectification. A DSP is the usual choice for implementation of image processing algorithms, but to decrease the cost of the target system it was proposed to use an FPGA for the implementation. In this work, different image warping methods were evaluated in terms of performance, produ...

  10. A biological phantom for evaluation of CT image reconstruction algorithms

    Science.gov (United States)

    Cammin, J.; Fung, G. S. K.; Fishman, E. K.; Siewerdsen, J. H.; Stayman, J. W.; Taguchi, K.

    2014-03-01

    In recent years, iterative algorithms have become popular in diagnostic CT imaging to reduce noise or radiation dose to the patient. The non-linear nature of these algorithms leads to non-linearities in the imaging chain. However, the methods to assess the performance of CT imaging systems were developed assuming the linear process of filtered backprojection (FBP). Those methods may not be suitable any longer when applied to non-linear systems. In order to evaluate the imaging performance, a phantom is typically scanned and the image quality is measured using various indices. For reasons of practicality, cost, and durability, those phantoms often consist of simple water containers with uniform cylinder inserts. However, these phantoms do not represent the rich structure and patterns of real tissue accurately. As a result, the measured image quality or detectability performance for lesions may not reflect the performance on clinical images. The discrepancy between estimated and real performance may be even larger for iterative methods which sometimes produce "plastic-like", patchy images with homogeneous patterns. Consequently, more realistic phantoms should be used to assess the performance of iterative algorithms. We designed and constructed a biological phantom consisting of porcine organs and tissue that models a human abdomen, including liver lesions. We scanned the phantom on a clinical CT scanner and compared basic image quality indices between filtered backprojection and an iterative reconstruction algorithm.

  11. A Wavelet Transform Algorithm for 2^n Shades Image

    Directory of Open Access Journals (Sweden)

    Aditya Kumar

    2011-05-01

    Full Text Available Wavelet analysis is a widely appreciated, up-to-date tool for time-frequency analysis. It developed out of Fourier analysis and plays a significant role in signal processing, notably in image compression. By analyzing the relations of the coefficients between each sub-block and the complete image, we can establish that the index location of the sub-band of each sub-block within the corresponding sub-band of the whole image is the same as that of the sub-block in the original image. Many image compression algorithms (lossy or lossless) have already been devised from their respective points of view. In this paper we propose an algorithm for image compression that minimizes the number of bits needed to store an image on disk and reduces the spatial redundancy and correlation between pixels.
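
    To ground the sub-band discussion above, here is a minimal wavelet thresholding sketch with PyWavelets (the Haar wavelet, two decomposition levels and the threshold value are illustrative assumptions, not the paper's algorithm):

```python
import numpy as np
import pywt

def wavelet_compress(img, wavelet='haar', level=2, threshold=10.0):
    """Two-level 2-D DWT, hard-threshold the detail coefficients, then reconstruct."""
    coeffs = pywt.wavedec2(img.astype(float), wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]
    kept = [approx]
    for (cH, cV, cD) in details:
        # Small detail coefficients are zeroed, reducing the bits needed to store them.
        kept.append(tuple(np.where(np.abs(c) < threshold, 0.0, c) for c in (cH, cV, cD)))
    return pywt.waverec2(kept, wavelet)
```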

  12. An Efficient Algorithm for Image Enhancement

    Directory of Open Access Journals (Sweden)

    Manglesh Khandelwal

    2011-02-01

    Full Text Available In the digital image processing field, enhancing an image and removing its noise is a critical issue. We propose a new algorithm to enhance color images corrupted by Gaussian noise using fuzzy logic, which describes uncertain features of images, together with a modification of the median filter. The performance of the proposed technique has been evaluated and compared to the existing mean and median filters.

  13. Successive approximation algorithm for cancellation of artifacts in DSA images

    International Nuclear Information System (INIS)

    In this paper, we propose an algorithm for cancellation of artifacts in DSA images. We have already proposed an automatic registration method based on the detection of local movements. When motion of the object is large, it is difficult to estimate the exact movement, and the cancellation of artifacts may therefore fail. The algorithm we propose here is based on a simple rigid model. We present the results of applying the proposed method to a series of experimental X-ray images, as well as the results of applying the algorithm as preprocessing for a registration method based on local movement. (author)

  14. Improving Diagnostic Viewing of Medical Images using Enhancement Algorithms

    Directory of Open Access Journals (Sweden)

    Hanan S.S. Ahmed

    2011-01-01

    Full Text Available Problem statement: Many images are of low quality, making it difficult to detect and extract information from them. Such images therefore have to undergo a process called image enhancement, which comprises a collection of techniques that seek to improve the visual appearance of an image. Medical images are among the most important, because they are used in the particularly sensitive field of medicine. The raw data obtained straight from medical acquisition devices may give a comparatively poor representation of the image and may be corrupted by several types of noise. Image enhancement (IE) and denoising algorithms that meet the requirements of digital medical image enhancement are introduced. The main goal of this study is to improve the features and characteristics of medical images for a correct diagnosis. Approach: The proposed technique starts with a median filter to remove noise from the images, followed by an unsharp mask filter, probably the most common type of sharpening. Medical images are usually of poor quality, especially in contrast; to solve this problem, we use Contrast Limited Adaptive Histogram Equalization (CLAHE), a well-known technique in the image processing domain, to improve the contrast of the images. Results: For testing purposes, medical images of different sizes and modalities were used, more than 60 images covering different parts of the body. According to the experts' evaluation, the enhanced images improved by up to 80% compared with the original images, depending on the imaging modality. Conclusion: The proposed algorithms significantly increased the visibility of relevant details without distorting the images.
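
    A compact version of the described pipeline (median denoising, unsharp masking, then CLAHE) can be sketched with scikit-image; the filter sizes and clip limit below are assumptions, not the authors' settings:

```python
import numpy as np
from skimage import exposure, filters, morphology, util

def enhance_medical_image(img: np.ndarray) -> np.ndarray:
    """Median filter -> unsharp mask -> CLAHE, on a grayscale 8-bit image."""
    img = util.img_as_float(img)
    denoised = filters.median(img, morphology.disk(2))              # remove impulsive noise
    sharpened = np.clip(filters.unsharp_mask(denoised, radius=1.5, amount=1.0), 0, 1)
    enhanced = exposure.equalize_adapthist(sharpened, clip_limit=0.02)   # CLAHE
    return util.img_as_ubyte(enhanced)
```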

  15. Reconstruction Algorithms in Undersampled AFM Imaging

    DEFF Research Database (Denmark)

    Arildsen, Thomas; Oxvig, Christian Schou

    2015-01-01

    This paper provides a study of spatial undersampling in atomic force microscopy (AFM) imaging followed by different image reconstruction techniques based on sparse approximation as well as interpolation. The main reasons for using undersampling is that it reduces the path length and thereby the scanning time as well as the amount of interaction between the AFM probe and the specimen. It can easily be applied on conventional AFM hardware. Due to undersampling, it is then necessary to further process the acquired image in order to reconstruct an approximation of the image. Based on real AFM cell images, our simulations reveal that using a simple raster scanning pattern in combination with conventional image interpolation performs very well. Moreover, this combination enables a reduction by a factor 10 of the scanning time while retaining an average reconstruction quality around 36 dB PSNR on the tested cell images.

  16. A Comparative Study of Image Compression Algorithms

    Directory of Open Access Journals (Sweden)

    Kiran Bindu

    2012-09-01

    Full Text Available Digital images in their uncompressed form require an enormous amount of storage capacity, and such uncompressed data needs a large transmission bandwidth. The Discrete Cosine Transform (DCT) is one of the most widely used image compression methods, and the Discrete Wavelet Transform (DWT) provides substantial improvements in picture quality because of its multi-resolution nature. Image compression reduces the storage space of an image while maintaining its quality. In this study the performance of three widely used techniques, namely DCT, DWT and hybrid DCT-DWT, is discussed for image compression, and their performance is evaluated in terms of Peak Signal to Noise Ratio (PSNR), Mean Square Error (MSE) and Compression Ratio (CR). The experimental results show that the hybrid DCT-DWT technique generally performs better than DCT or DWT alone.
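
    The three evaluation metrics used in the study are straightforward to compute; a small helper for 8-bit images is sketched below (the compression ratio is taken simply as original size over compressed size):

```python
import numpy as np

def mse(original: np.ndarray, reconstructed: np.ndarray) -> float:
    """Mean squared error between two images of the same shape."""
    return float(np.mean((original.astype(float) - reconstructed.astype(float)) ** 2))

def psnr(original: np.ndarray, reconstructed: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    e = mse(original, reconstructed)
    return float('inf') if e == 0 else 10.0 * np.log10(peak ** 2 / e)

def compression_ratio(original_bytes: int, compressed_bytes: int) -> float:
    """CR = size of the original bitstream / size of the compressed bitstream."""
    return original_bytes / compressed_bytes
```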

  17. Probabilistic Encryption Based ECC Mechanism

    Directory of Open Access Journals (Sweden)

    Addepalli V.N. Krishna

    2011-04-01

    Full Text Available Elliptic Curve Cryptography provides a secure means of exchanging keys among communicating hosts using the Diffie-Hellman Key Exchange algorithm. Encryption and decryption of texts and messages have also been attempted. In the paper [15], the authors presented an implementation of ECC by first transforming the message into an affine point on the EC, and then applying the knapsack algorithm to the ECC-encrypted message over the finite field GF(p). The knapsack problem is not secure by present standards, and moreover, in that work the authors' decryption process used the elliptic curve discrete logarithm to recover the plain text. This may become a computationally infeasible problem if the values used in generating the plain text are large enough. In the present work the output of the ECC algorithm is provided with probabilistic features which make the algorithm free from chosen-ciphertext attack. Thus, with key lengths of even less than 160 bits, the present algorithm provides sufficient strength against cryptanalysis, and its performance can be compared with standard algorithms like RSA.

  18. Image processing algorithm for robot tracking in reactor vessel

    International Nuclear Information System (INIS)

    In this paper, we propose an image processing algorithm to find the position of an underwater robot in the reactor vessel. The proposed algorithm is composed of a modified SURF (Speeded Up Robust Features) method based on Mean-Shift and the CAMSHIFT (Continuously Adaptive Mean Shift) colour-tracking algorithm. Noise filtering using a luminosity blend method and colour clipping are applied as preprocessing. The initial tracking area for CAMSHIFT is determined using the modified SURF, and the contour and corner points are then extracted in the area of the target tracked by the CAMSHIFT method. Experiments were performed on a reactor vessel mockup and verified the algorithm's use for robot control by visual tracking.

  19. Image processing algorithm for robot tracking in reactor vessel

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Won; Choi, Young Soo; Lee, Sung Uk; Jeong, Kyung Min [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Nam Kyun [Korea Plant Service and Engineering Co., Seongnam (Korea, Republic of)

    2011-10-15

    In this paper, we propose an image processing algorithm to find the position of an underwater robot in the reactor vessel. The proposed algorithm is composed of a modified SURF (Speeded Up Robust Features) method based on Mean-Shift and the CAMSHIFT (Continuously Adaptive Mean Shift) colour-tracking algorithm. Noise filtering using a luminosity blend method and colour clipping are applied as preprocessing. The initial tracking area for CAMSHIFT is determined using the modified SURF, and the contour and corner points are then extracted in the area of the target tracked by the CAMSHIFT method. Experiments were performed on a reactor vessel mockup and verified the algorithm's use for robot control by visual tracking.
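
    A minimal sketch of the colour-histogram CAMSHIFT tracking stage described above. In the paper the initial window comes from the modified-SURF step; here it is a hard-coded assumption, as are the video file name and the HSV ranges.

    ```python
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("reactor_mockup.avi")         # hypothetical video source
    ok, frame = cap.read()
    x, y, w, h = 200, 150, 60, 60                         # assumed initial window
    roi = frame[y:y + h, x:x + w]

    # Hue histogram of the initial region of interest.
    hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv_roi, (0, 60, 32), (180, 255, 255))
    hist = cv2.calcHist([hsv_roi], [0], mask, [180], [0, 180])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

    term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    window = (x, y, w, h)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
        rot_rect, window = cv2.CamShift(backproj, window, term)
        pts = cv2.boxPoints(rot_rect).astype(np.int32)
        cv2.polylines(frame, [pts], True, (0, 255, 0), 2)   # draw the tracked target
        cv2.imshow("tracking", frame)
        if cv2.waitKey(30) == 27:
            break
    ```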

  20. An Adaptive Algorithm for Improving the Fractal Image Compression (FIC

    Directory of Open Access Journals (Sweden)

    Taha Mohammed Hasan

    2011-12-01

    Full Text Available In this paper an adaptive algorithm is proposed to reduce the long time taken by the Fractal Image Compression (FIC) technique. The algorithm reduces the number of matching operations between range and domain blocks by reducing both the range and the domain blocks needed in the matching process. For this purpose, two techniques have been proposed. The first is called Range Exclusion (RE); in this technique a variance factor is used to reduce the number of range blocks by excluding ranges in homogeneous or flat regions from the matching process. The second technique is called Reducing the Domain Image Size (RDIZ); it is responsible for reducing the domain by minimizing the domain image size to 1/16th instead of the 1/4th of the original image size used in traditional FIC. This in turn affects the encoding time, compression ratio and reconstructed image quality. To get the best results, the two techniques are coupled in one algorithm, called RD-RE. The tested (256x256) gray images are partitioned into fixed (4x4) blocks and then compressed using Visual C++ 6.0 code. The results show that the RE technique is faster and achieves a higher compression ratio than traditional FIC while keeping high reconstructed image quality, while RD-RE is faster still and achieves a higher compression ratio than RE, but with a slight loss in reconstructed image quality.
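
    The Range Exclusion (RE) idea can be illustrated with a short sketch: range blocks whose variance falls below a threshold are treated as flat and excluded from the costly range-domain matching. The block size, variance threshold and synthetic test image below are illustrative assumptions.

    ```python
    import numpy as np

    def candidate_ranges(img, block=4, var_threshold=5.0):
        """Return (row, col) of range blocks whose variance exceeds the threshold."""
        h, w = img.shape
        kept, skipped = [], 0
        for r in range(0, h - block + 1, block):
            for c in range(0, w - block + 1, block):
                patch = img[r:r + block, c:c + block]
                if patch.var() > var_threshold:
                    kept.append((r, c))
                else:
                    skipped += 1        # flat region: encode by its mean only
        return kept, skipped

    # Smooth ramp (mostly flat blocks) with a textured patch in the middle.
    img = np.tile(np.linspace(0, 255, 256), (256, 1))
    img[96:160, 96:160] += np.random.default_rng(2).normal(0, 30, (64, 64))

    kept, skipped = candidate_ranges(img)
    print(f"ranges kept for matching: {len(kept)}, excluded as flat: {skipped}")
    ```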

  1. Remote Sensing Image Resolution Enlargement Algorithm Based on Wavelet Transformation

    Directory of Open Access Journals (Sweden)

    Samiul Azam

    2014-05-01

    Full Text Available In this paper, we present a new image resolution enhancement algorithm based on cycle spinning and stationary wavelet subband padding. The proposed technique uses the stationary wavelet transform (SWT) to decompose the low resolution (LR) image into frequency subbands. All these frequency subbands are interpolated using either bicubic or Lanczos interpolation, and the interpolated subbands are put through the inverse SWT process to generate an intermediate high resolution (HR) image. Finally, cycle spinning (CS) is applied to this intermediate high resolution image to reduce blocking artifacts, followed by a traditional Laplacian sharpening filter to make the generated high resolution image sharper. The new technique has been tested on several satellite images. Experimental results show that the proposed technique outperforms conventional and state-of-the-art techniques in terms of peak signal to noise ratio, root mean square error and entropy, as well as visual quality.

  2. Anisotropic conductivity imaging with MREIT using equipotential projection algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Degirmenci, Evren [Department of Electrical and Electronics Engineering, Mersin University, Mersin (Turkey); Eyueboglu, B Murat [Department of Electrical and Electronics Engineering, Middle East Technical University, 06531, Ankara (Turkey)

    2007-12-21

    Magnetic resonance electrical impedance tomography (MREIT) combines magnetic flux or current density measurements obtained by magnetic resonance imaging (MRI) and surface potential measurements to reconstruct images of true conductivity with high spatial resolution. Most of the biological tissues have anisotropic conductivity; therefore, anisotropy should be taken into account in conductivity image reconstruction. Almost all of the MREIT reconstruction algorithms proposed to date assume isotropic conductivity distribution. In this study, a novel MREIT image reconstruction algorithm is proposed to image anisotropic conductivity. Relative anisotropic conductivity values are reconstructed iteratively, using only current density measurements without any potential measurement. In order to obtain true conductivity values, only either one potential or conductivity measurement is sufficient to determine a scaling factor. The proposed technique is evaluated on simulated data for isotropic and anisotropic conductivity distributions, with and without measurement noise. Simulation results show that the images of both anisotropic and isotropic conductivity distributions can be reconstructed successfully.

  3. Anisotropic conductivity imaging with MREIT using equipotential projection algorithm

    International Nuclear Information System (INIS)

    Magnetic resonance electrical impedance tomography (MREIT) combines magnetic flux or current density measurements obtained by magnetic resonance imaging (MRI) and surface potential measurements to reconstruct images of true conductivity with high spatial resolution. Most of the biological tissues have anisotropic conductivity; therefore, anisotropy should be taken into account in conductivity image reconstruction. Almost all of the MREIT reconstruction algorithms proposed to date assume isotropic conductivity distribution. In this study, a novel MREIT image reconstruction algorithm is proposed to image anisotropic conductivity. Relative anisotropic conductivity values are reconstructed iteratively, using only current density measurements without any potential measurement. In order to obtain true conductivity values, only either one potential or conductivity measurement is sufficient to determine a scaling factor. The proposed technique is evaluated on simulated data for isotropic and anisotropic conductivity distributions, with and without measurement noise. Simulation results show that the images of both anisotropic and isotropic conductivity distributions can be reconstructed successfully

  4. Anisotropic conductivity imaging with MREIT using equipotential projection algorithm

    Science.gov (United States)

    Değirmenci, Evren; Murat Eyüboğlu, B.

    2007-12-01

    Magnetic resonance electrical impedance tomography (MREIT) combines magnetic flux or current density measurements obtained by magnetic resonance imaging (MRI) and surface potential measurements to reconstruct images of true conductivity with high spatial resolution. Most of the biological tissues have anisotropic conductivity; therefore, anisotropy should be taken into account in conductivity image reconstruction. Almost all of the MREIT reconstruction algorithms proposed to date assume isotropic conductivity distribution. In this study, a novel MREIT image reconstruction algorithm is proposed to image anisotropic conductivity. Relative anisotropic conductivity values are reconstructed iteratively, using only current density measurements without any potential measurement. In order to obtain true conductivity values, only either one potential or conductivity measurement is sufficient to determine a scaling factor. The proposed technique is evaluated on simulated data for isotropic and anisotropic conductivity distributions, with and without measurement noise. Simulation results show that the images of both anisotropic and isotropic conductivity distributions can be reconstructed successfully.

  5. Dermoscopic Image Segmentation using Machine Learning Algorithm

    OpenAIRE

    L. P. Suresh; Shunmuganathan, K. L.; S. H.K. Veni

    2011-01-01

    Problem statement: Malignant melanoma is the most frequent type of skin cancer. Its incidence has been rapidly increasing over the last few decades. Medical image segmentation is the most essential and crucial process in order to facilitate the characterization and visualization of the structure of interest in medical images. Approach: This study explains the task of segmenting skin lesions in Dermoscopy images based on intelligent systems such as Fuzzy and Neural Networks clustering techniqu...

  6. Survey of Watermarking Algorithms For Medical Images

    OpenAIRE

    A.Umaamaheshvari, K.Thanuskodi

    2012-01-01

    Watermarking is a branch of information hiding which is used to hide proprietary information in digital media like photographs, digital music, or digital video, and it has seen a lot of research interest recently. Medical images are also very important in the field of medicine; all these medical images need to be stored for future reference of the patients and their hospital findings, and hence the medical image needs to undergo a process of compression before storing it. In this...

  7. Distributed computing of Seismic Imaging Algorithms

    OpenAIRE

    Emami, Masnida; Setayesh, Ali; Jaberi, Nasrin

    2012-01-01

    The primary use of technical computing in the oil and gas industries is for seismic imaging of the earth's subsurface, driven by the business need for making well-informed drilling decisions during petroleum exploration and production. Since each oil/gas well in exploration areas costs several tens of millions of dollars, producing high-quality seismic images in a reasonable time can significantly reduce the risk of drilling a "dry hole". Similarly, these images are importan...

  8. Visual cryptography for JPEG color images

    Science.gov (United States)

    Sudharsanan, Subramania I.

    2004-10-01

    There have been a large number of methods proposed for encrypting images by shared key encryption mechanisms. All the existing techniques are applicable to primarily non-compressed images. However, most imaging applications including digital photography, archiving, and internet communications nowadays use images in the JPEG domain. Application of the existing shared key cryptographic schemes for these images requires conversion back into spatial domain. In this paper we propose a shared key algorithm that works directly in the JPEG domain, thus enabling shared key image encryption for a variety of applications. The scheme directly works on the quantized DCT coefficient domain and the resulting noise-like shares are also stored in the JPEG format. The decryption process is lossless. Our experiments indicate that each share image is approximately the same size as the original JPEG retaining the storage advantage provided by JPEG.

  9. A robust improved image stitching algorithm based on keypoints registration

    Science.gov (United States)

    Lei, Hua; Gu, Feiyong; Feng, Huajun; Xu, Zhihai; Li, Qi

    2010-08-01

    An improved algorithm based on Harris corner detection is presented. It has good noise resistance, can extract feature points on edges precisely, and reduces the effect of noise and isolated pixels. To improve the Harris corner detection performance, several methods are involved. Instead of the Gaussian function, a B-spline smoothing function is used; it preserves more corner information and prevents corner position offset. To get more robust results, a modified corner response is also employed. In the corner matching process, the gradient information within a 5×5 window is used to perform a more robust matching. With the improved image matching algorithm, feature points can be matched precisely and incorrect matches reduced. After the images are aligned, the overlapping band is smoothed with a fusion algorithm to obtain a seamless mosaic image.
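
    A minimal sketch of corner-based matching in this spirit: Harris corners are detected in two overlapping images and matched by comparing a local 5x5 neighbourhood, here with normalised cross-correlation rather than the paper's modified corner response and gradient measure. The file names and thresholds are assumptions.

    ```python
    import cv2
    import numpy as np

    def harris_points(gray, thresh=0.01, max_pts=200):
        resp = cv2.cornerHarris(np.float32(gray), blockSize=2, ksize=3, k=0.04)
        ys, xs = np.nonzero(resp > thresh * resp.max())
        order = np.argsort(resp[ys, xs])[::-1][:max_pts]
        return list(zip(ys[order], xs[order]))

    def patch(gray, y, x, r=2):
        p = gray[y - r:y + r + 1, x - r:x + r + 1].astype(float)
        return (p - p.mean()) / (p.std() + 1e-9)   # zero-mean, unit-norm 5x5 patch

    img1 = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # hypothetical input pair
    img2 = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
    pts1, pts2 = harris_points(img1), harris_points(img2)

    matches = []
    for y1, x1 in pts1:
        if y1 < 2 or x1 < 2 or y1 > img1.shape[0] - 3 or x1 > img1.shape[1] - 3:
            continue
        p1 = patch(img1, y1, x1)
        best, best_score = None, -1.0
        for y2, x2 in pts2:
            if y2 < 2 or x2 < 2 or y2 > img2.shape[0] - 3 or x2 > img2.shape[1] - 3:
                continue
            score = float(np.mean(p1 * patch(img2, y2, x2)))
            if score > best_score:
                best, best_score = (y2, x2), score
        if best_score > 0.9:                     # keep only confident correspondences
            matches.append(((y1, x1), best))
    print(f"{len(matches)} tentative correspondences")
    ```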

  10. Medical Image Registration by Simulated Annealing and genetic algorithms

    OpenAIRE

    Ait-Aoudia, Samy; Mahiou, Ramdane

    2007-01-01

    Registration techniques in medical image processing are used to match anatomic structures from two or more images (CT, MRI, PET,...) taken at different times to track for example the evolution of a disease. The core of the registration process is the maximization of a cost function expressing the similarity between these images. To resolve this problem, we have tested two global optimization techniques that are genetic algorithms and simulated annealing. In this paper we show some results obt...

  11. Compressed Sensing Photoacoustic Imaging Based on Fast Alternating Direction Algorithm

    OpenAIRE

    Liu, Xueyan; Peng, Dong; Guo, Wei; Ma, Xibo; YANG, Xin; Tian, Jie

    2012-01-01

    Photoacoustic imaging (PAI) has been employed to reconstruct endogenous optical contrast present in tissues. At the cost of longer calculations, a compressive sensing reconstruction scheme can achieve artifact-free imaging with fewer measurements. In this paper, an effective acceleration framework using the alternating direction method (ADM) was proposed for recovering images from limited-view and noisy observations. Results of the simulation demonstrated that the proposed algorithm could per...

  12. Note: thermal imaging enhancement algorithm for gas turbine aerothermal characterization.

    Science.gov (United States)

    Beer, S K; Lawson, S A

    2013-08-01

    An algorithm was developed to convert radiation intensity images acquired using a black and white CCD camera to thermal images without requiring knowledge of incident background radiation. This unique infrared (IR) thermography method was developed to determine aerothermal characteristics of advanced cooling concepts for gas turbine cooling application. Compared to IR imaging systems traditionally used for gas turbine temperature monitoring, the system developed for the current study is relatively inexpensive and does not require calibration with surface mounted thermocouples. PMID:24007128

  13. A comparative study of Image Region-Based Segmentation Algorithms

    OpenAIRE

    Lahouaoui LALAOUI; Tayeb MOHAMADI

    2013-01-01

    Image segmentation has recently become an essential step in image processing as it mainly conditions the interpretation which is done afterwards. It is still difficult to justify the accuracy of a segmentation algorithm, regardless of the nature of the treated image. In this paper we perform an objective comparison of region-based segmentation techniques such as supervised and unsupervised deterministic classification, non-parametric and parametric probabilistic classification. Eight methods ...

  14. Einstein’s Image Compression Algorithm: Version 1.00

    Directory of Open Access Journals (Sweden)

    Yasser Arafat

    2011-12-01

    Full Text Available Purpose: The Einstein’s compression technique is a new method of compression and decompression of images by matrix addition and the possible sequence of the sum. The main purpose of implementing a new algorithm is to reduce the complexity of the algorithms used for image compression today. The major advantages of this technique are that the compression is highly secure and highly compressed. This method does not use earlier compression techniques. It is a raster compression and can be used for astronomical and medical images because the compression is considered to be lossless. Design/Methodology/Approach: The idea uses the previous literature as a base to explore the use of the image compression technique. Findings: This type of compression can be used to reduce the size of the database for infrequently used but important data. This compression technique will in future be used for compression of colour images and will also be researched for file compression. Social Implications: This idea of image compression is expected to create a new technique of image compression and will encourage more researchers to carry out research on this type of compression. Originality/Value: The idea intends to create a new compression technique in image compression research. Keywords: Image Compression; Einstein’s Image Compression; New Compression Technique; Matrix Addition Based Compression. Paper Type: Technical

  15. Parametric blind-deconvolution algorithm to remove image artifacts in hybrid imaging systems.

    Science.gov (United States)

    Demenikov, Mads; Harvey, Andrew R

    2010-08-16

    Hybrid imaging systems employing cubic phase modulation in the pupil-plane enable significantly increased depth of field, but artifacts in the recovered images are a major problem. We present a parametric blind-deconvolution algorithm, based on minimization of the high-frequency content of the restored image that enables recovery of artifact-free images for a wide range of defocus. We show that the algorithm enables robust matching of the image recovery kernel with the optical point-spread function to enable, for the first time, optimally low noise levels in recovered images. PMID:20721189

  16. Imaging for dismantlement verification: Information management and analysis algorithms

    International Nuclear Information System (INIS)

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute, which must be non-sensitive to be acceptable in an Information Barrier regime. However, this process must be performed with care. Features like the perimeter, area, and intensity of an object, for example, might reveal sensitive information. Any data-reduction technique must provide sufficient information to discriminate a real object from a spoofed or incorrect one, while avoiding disclosure (or storage) of any sensitive object qualities. Ultimately, algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We discuss the utility of imaging for arms control applications and present three image-based verification algorithms in this context. The algorithms reduce full image information to non-sensitive feature information, in a process that is intended to enable verification while eliminating the possibility of image reconstruction. The underlying images can be highly detailed, since they are dynamically generated behind an information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography. We study these algorithms in terms of technical performance in image analysis and application to an information barrier scheme.

  17. Image fusion approach with noise reduction using Genetic algorithm

    Directory of Open Access Journals (Sweden)

    Gehad Mohamed Taher

    2013-12-01

    Full Text Available Image fusion is becoming a challenging field owing to its importance in different applications. Multi-focus image fusion is a type of image fusion used in medical, surveillance and military settings to obtain an all-in-focus image from several images, each of which is in focus in a different part. To make the input images more accurate before the fusion process, we use a Genetic Algorithm (GA) for image de-noising as a preprocessing step. In this research paper we introduce a new approach that begins with image de-noising using the GA and then applies the curvelet transform for image decomposition, in order to obtain a multi-focus fusion image that is in focus in all of its parts. The results show that the curvelet transform is effective at detecting image activity along curves and increases the quality of the obtained fused images, and that applying the mean fusion rule for fusing multi-focus images gives more accurate results than the PCA, contrast and mode fusion rules. The GA also shows more accurate results in image de-noising when compared to the contourlet transform.

  18. Image Enhancement Based on Selective - Retinex Fusion Algorithm

    Directory of Open Access Journals (Sweden)

    Xuebo Jin

    2012-06-01

    Full Text Available A brightness adjustment method for night-vision image enhancement is considered in this paper. The colour RGB night-vision image is transformed into an uncorrelated colour space, the YUV space. According to the characteristics of the night-vision image, we first develop a modified Retinex algorithm based on an S-curve, by which the luminance component is enhanced and the brightness of the night-vision image is effectively improved. Then the luminance component of the source image is enhanced by a selective, nonlinear grey mapping to retain the essential sunlight and shade information. From these two enhancement images, a night-vision image with sufficient brightness and the necessary sunlight and shade information is combined using a weighting parameter. According to the experimental results, the night-vision image obtained is well suited for visual observation.
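
    A minimal sketch of the luminance-adjustment idea: the image is moved to YUV, the Y channel is remapped both with an S-curve and with a selective nonlinear mapping, and the two results are blended with a weight before converting back. The S-curve shape, the gamma value and the weight are illustrative assumptions, not the paper's exact mappings.

    ```python
    import cv2
    import numpy as np

    img = cv2.imread("night_scene.png")                   # hypothetical night-vision frame
    yuv = cv2.cvtColor(img, cv2.COLOR_BGR2YUV).astype(np.float32)
    y = yuv[:, :, 0] / 255.0

    # S-curve (logistic) boosts mid-tones while compressing shadows and highlights.
    s_curve = 1.0 / (1.0 + np.exp(-8.0 * (y - 0.5)))

    # Selective nonlinear mapping (here a simple gamma lift of dark regions).
    gamma_lift = np.power(y, 0.6)

    # Weighted combination of the two enhanced luminance images.
    w = 0.6
    yuv[:, :, 0] = np.clip(255.0 * (w * s_curve + (1 - w) * gamma_lift), 0, 255)

    enhanced = cv2.cvtColor(yuv.astype(np.uint8), cv2.COLOR_YUV2BGR)
    cv2.imwrite("night_scene_enhanced.png", enhanced)
    ```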

  19. Insight Into Efficient Image Registration Techniques and the Demons Algorithm : Insight Into Efficient Image Registration Techniques and the Demons Algorithm

    OpenAIRE

    Vercauteren, Tom; Pennec, Xavier; Malis, Ezio; Perchant, Aymeric; Ayache, Nicholas

    2007-01-01

    As image registration becomes more and more central to many biomedical imaging applications, the efficiency of the algorithms becomes a key issue. Image registration is classically performed by optimizing a similarity criterion over a given spatial transformation space. Even if this problem is considered as almost solved for linear registration, we show in this paper that some tools that have recently been developed in the field of vision-based robot control can outperform classical solutions...

  20. Segmentation algorithms for ear image data towards biomechanical studies.

    Science.gov (United States)

    Ferreira, Ana; Gentil, Fernanda; Tavares, João Manuel R S

    2014-01-01

    In recent years, the segmentation, i.e. the identification, of ear structures in video-otoscopy, computerised tomography (CT) and magnetic resonance (MR) image data, has gained significant importance in the medical imaging area, particularly in CT and MR imaging. Segmentation is the fundamental step of any automated technique for supporting the medical diagnosis and, in particular, in biomechanics studies, for building realistic geometric models of ear structures. In this paper, a review of the algorithms used in ear segmentation is presented. The review includes an introduction to the usual biomechanical modelling approaches and also to the common imaging modalities. Afterwards, several segmentation algorithms for ear image data are described, and their specificities and difficulties as well as their advantages and disadvantages are identified and analysed using experimental examples. Finally, the conclusions are presented as well as a discussion about possible trends for future research concerning ear segmentation. PMID:22994296

  1. A one-step reconstruction algorithm for quantitative photoacoustic imaging

    Science.gov (United States)

    Ding, Tian; Ren, Kui; Vallélian, Sarah

    2015-09-01

    Quantitative photoacoustic tomography (QPAT) is a recent hybrid imaging modality that couples optical tomography with ultrasound imaging to achieve high resolution imaging of optical properties of scattering media. Image reconstruction in QPAT is usually a two-step process. In the first step, the initial pressure field inside the medium, generated by the photoacoustic effect, is reconstructed using measured acoustic data. In the second step, this initial ultrasound pressure field datum is used to reconstruct optical properties of the medium. We propose in this work a one-step inversion algorithm for image reconstruction in QPAT that reconstructs the optical absorption coefficient directly from measured acoustic data. The algorithm can be used to recover simultaneously the absorption coefficient and the ultrasound speed of the medium from multiple acoustic data sets, with appropriate a priori bounds on the unknowns. We demonstrate, through numerical simulations based on synthetic data, the feasibility of the proposed reconstruction method.

  2. A Flexible Hierarchical Classification Algorithm for Content Based Image Retrieval

    Directory of Open Access Journals (Sweden)

    Qiao Liu

    2013-06-01

    Full Text Available The goal of this paper is to describe a flexible hierarchical classification algorithm and a new image similarity model based on mixing several image features, in order to improve the performance and retrieval speed of content-based image retrieval. Through an experimental comparison of a large number of different representative point selection approaches, we seek a method for uniform division of the image space and eventually design a novel approach, inspired by high-dimensional indexing and social networking, that introduces directivity into image classification; this directivity is used to explain the convergence of images towards edge points of the high-dimensional feature space. Meanwhile, we find the rules for setting the parameters of this algorithm through experiments, and these rules achieve satisfactory effects on different datasets. In addition to the algorithm, we also find some features that, assembled with a reasonable formula, represent images better in colour, texture and shape. Experimental results based on a database of about 50,000 person images demonstrate improved performance, as compared with other combinations in our descriptor set consisting of several general features mentioned below.

  3. Reconstruction Algorithms in Undersampled AFM Imaging

    DEFF Research Database (Denmark)

    Arildsen, Thomas; Oxvig, Christian Schou; Pedersen, Patrick Steffen; Østergaard, Jan; Larsen, Torben

    2015-01-01

    This paper provides a study of spatial undersampling in atomic force microscopy (AFM) imaging followed by different image reconstruction techniques based on sparse approximation as well as interpolation. The main reasons for using undersampling is that it reduces the path length and thereby the scanning time as well as the amount of interaction between the AFM probe and the specimen. It can easily be applied on conventional AFM hardware. Due to undersampling, it is then necessary to further proc...

  4. Image Segmentation and Region Growing Algorithm

    Directory of Open Access Journals (Sweden)

    Shilpa Kamdi

    2012-02-01

    Full Text Available In areas such as computer vision and image processing, image segmentation has been and still is a relevant research area due to its widespread usage and application. This paper provides a survey of achievements, problems being encountered, and open issues in the research area of image segmentation and the usage of its techniques in different areas. We consider the techniques under the following three groups: threshold-based, edge-based and region-based.
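
    As a concrete example of one of the region-based techniques surveyed here, the sketch below implements simple seeded region growing: starting from a seed pixel, 4-connected neighbours are absorbed while their intensity stays close to the running region mean. The tolerance and the synthetic test image are illustrative assumptions.

    ```python
    import numpy as np

    def region_grow(img, seed, tol=10.0):
        h, w = img.shape
        visited = np.zeros((h, w), dtype=bool)
        stack = [seed]
        visited[seed] = True
        region_sum, region_n = float(img[seed]), 1
        while stack:
            y, x = stack.pop()
            mean = region_sum / region_n
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and not visited[ny, nx]:
                    if abs(float(img[ny, nx]) - mean) <= tol:
                        visited[ny, nx] = True
                        stack.append((ny, nx))
                        region_sum += float(img[ny, nx])
                        region_n += 1
        return visited   # boolean mask of the grown region

    img = np.full((64, 64), 40.0)
    img[16:48, 16:48] = 200.0                  # bright square to segment
    mask = region_grow(img, seed=(32, 32))
    print("segmented pixels:", int(mask.sum()))
    ```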

  5. Algorithm for haze determination using digital camera images

    Science.gov (United States)

    Wong, C. J.; MatJafri, M. Z.; Abdullah, K.; Lim, H. S.; Hashim, S. A.

    2008-04-01

    An algorithm for haze determination was developed based on atmospheric optical properties to determine the concentration of particulate matter with diameter less than 10 micrometers (PM10). The purpose of this study was to use digital camera images to determine the PM10 concentration. This algorithm was developed based on the relationship between the measured PM10 concentration and the components reflected from a surface material and the atmosphere. A digital camera was used to capture images of dark and bright targets at near and far distances from the position of the targets. Ground-based PM10 measurements were carried out at selected locations simultaneously with the digital camera image acquisition using a DustTrak meter. The PCI Geomatica version 9.1 digital image processing software was used in all image processing analyses. The digital colour images were separated into three bands, namely red, green and blue, for multi-spectral analysis. The digital numbers (DN) for each band corresponding to the ground-truth locations were extracted and converted to radiance and reflectance values. The atmospheric reflectance was then related to the PM10 using regression analysis. The proposed algorithm produced a high correlation coefficient (R) and low root-mean-square error (RMS) between the measured and estimated PM10. This indicates that the technique using digital camera images can provide a useful tool for air quality studies.
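
    The final calibration step, relating per-band atmospheric reflectance at the ground-truth locations to the co-located PM10 readings, amounts to a multiple linear regression. The sketch below shows that step on synthetic placeholder numbers, not on measurements from the study.

    ```python
    import numpy as np

    # Atmospheric reflectance for red/green/blue at a few ground-truth points.
    refl = np.array([[0.12, 0.10, 0.09],
                     [0.18, 0.15, 0.13],
                     [0.25, 0.21, 0.18],
                     [0.31, 0.27, 0.23],
                     [0.40, 0.34, 0.30]])
    pm10 = np.array([35.0, 52.0, 71.0, 88.0, 110.0])     # co-located PM10, ug/m3

    # Multiple linear regression: PM10 = a*R_red + b*R_green + c*R_blue + d.
    A = np.column_stack([refl, np.ones(len(pm10))])
    coeffs, *_ = np.linalg.lstsq(A, pm10, rcond=None)

    pred = A @ coeffs
    r = np.corrcoef(pred, pm10)[0, 1]
    rms = np.sqrt(np.mean((pred - pm10) ** 2))
    print(f"R = {r:.3f}, RMS = {rms:.2f} ug/m3")
    ```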

  6. A novel algorithm for segmentation of brain MR images

    International Nuclear Information System (INIS)

    Accurate and fully automatic segmentation of the brain from magnetic resonance (MR) scans is a challenging problem that has received an enormous amount of attention lately. Many researchers have applied various techniques; however, a standard fuzzy c-means algorithm has produced better results compared to other methods. In this paper, we present a modified fuzzy c-means (FCM) based algorithm for segmentation of brain MR images. Our algorithm is formulated by modifying the objective function of the standard FCM and uses a special spread method to obtain a smooth and slowly varying bias field. This method has the advantage that it can be applied at an early stage in an automated data analysis, before a tissue model is available. The results on MRI images show that this method provides better results compared to standard FCM algorithms. (author)
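
    For reference, the sketch below implements the standard fuzzy c-means baseline that the modified algorithm builds on, clustering pixel intensities only; the paper's objective-function modification and bias-field spread method are not reproduced. Cluster count and fuzzifier are illustrative assumptions.

    ```python
    import numpy as np

    def fcm(intensities, c=3, m=2.0, iters=50, eps=1e-5):
        x = intensities.reshape(-1).astype(float)
        rng = np.random.default_rng(0)
        u = rng.random((c, x.size))
        u /= u.sum(axis=0)                        # memberships sum to 1 per pixel
        for _ in range(iters):
            um = u ** m
            centers = (um @ x) / um.sum(axis=1)   # weighted cluster centres
            d = np.abs(x[None, :] - centers[:, None]) + 1e-9
            new_u = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2 / (m - 1)), axis=1)
            if np.max(np.abs(new_u - u)) < eps:
                u = new_u
                break
            u = new_u
        return centers, u

    # Synthetic "image" with three intensity classes.
    img = np.random.default_rng(3).choice([30.0, 120.0, 220.0], size=(64, 64))
    centers, u = fcm(img)
    labels = np.argmax(u, axis=0).reshape(img.shape)
    print("cluster centres:", np.sort(centers))
    ```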

  7. Image Segmentation with PCNN Model and Immune Algorithm

    Directory of Open Access Journals (Sweden)

    Jianfeng Li

    2013-09-01

    Full Text Available In the domain of image processing, the PCNN (Pulse Coupled Neural Network) needs its parameters adjusted time after time to obtain good performance. To this end, we propose a novel algorithm for automatic determination of PCNN parameters based on an immune algorithm. The proposed method transforms the PCNN parameter-setting problem into parameter optimization based on the immune algorithm. It takes image entropy as the fitness evaluation of the immune algorithm, so that the PCNN parameters can be adjusted adaptively. Meanwhile, to prevent the population from falling into a local optimum, the proposed method introduces gradient information to affect the evolution of the antibodies and keep the population active. Experimental results show that the proposed method realizes adaptive adjustment of the PCNN parameters and yields better segmentation performance than many existing methods.

  8. Leukemic cells segmentation algorithm based on molecular spectral imaging technology

    Science.gov (United States)

    Li, Qingli; Dai, Chunni; Liu, Hongjing; Liu, Jingao

    2009-07-01

    A molecular spectral imaging system, instead of a common microscope, was used to capture spectral images of blood smears. An improved spectral angle mapper algorithm for automatic blood cell segmentation is then presented. In this algorithm, the spectral vectors of the blood cells are normalized first, then the spectral angles over all bands and over partial bands are calculated respectively, and finally the blood cells are segmented according to the spectral angles combined with a threshold segmentation method. As a case study, leukemia cells were selected as the target and segmented with the new algorithm. The results demonstrate that this algorithm utilizes both the spectral and spatial information of blood cells and segments the leukemia cells more accurately.
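
    A minimal sketch of the spectral angle mapper idea: each pixel spectrum is compared with a reference spectrum via the angle between them, and pixels under an angle threshold are labelled as target cells. The spectra, the planted target region and the threshold are synthetic assumptions.

    ```python
    import numpy as np

    def spectral_angle(cube, reference):
        """cube: (H, W, B) spectral image, reference: (B,) spectrum -> (H, W) angles."""
        dot = np.tensordot(cube, reference, axes=([2], [0]))
        norms = np.linalg.norm(cube, axis=2) * np.linalg.norm(reference)
        return np.arccos(np.clip(dot / (norms + 1e-12), -1.0, 1.0))

    rng = np.random.default_rng(4)
    cube = rng.random((64, 64, 20))                 # stand-in spectral image
    reference = rng.random(20)                      # assumed mean target-cell spectrum
    cube[10:20, 10:20, :] = reference * 0.8         # plant a region matching the reference

    angles = spectral_angle(cube, reference)
    mask = angles < 0.1                             # radians; illustrative threshold
    print("pixels classified as target:", int(mask.sum()))
    ```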

  9. Optimization of image processing algorithms on mobile platforms

    Science.gov (United States)

    Poudel, Pramod; Shirvaikar, Mukul

    2011-03-01

    This work presents a technique to optimize popular image processing algorithms on mobile platforms such as cell phones, net-books and personal digital assistants (PDAs). The increasing demand for video applications like context-aware computing on mobile embedded systems requires the use of computationally intensive image processing algorithms. The system engineer has a mandate to optimize them so as to meet real-time deadlines. A methodology to take advantage of the asymmetric dual-core processor, which includes an ARM and a DSP core supported by shared memory, is presented with implementation details. The target platform chosen is the popular OMAP 3530 processor for embedded media systems. It has an asymmetric dual-core architecture with an ARM Cortex-A8 and a TMS320C64x Digital Signal Processor (DSP). The development platform was the BeagleBoard with 256 MB of NAND RAM and 256 MB SDRAM memory. The basic image correlation algorithm is chosen for benchmarking as it finds widespread application for various template matching tasks such as face-recognition. The basic algorithm prototypes conform to OpenCV, a popular computer vision library. OpenCV algorithms can be easily ported to the ARM core which runs a popular operating system such as Linux or Windows CE. However, the DSP is architecturally more efficient at handling DFT algorithms. The algorithms are tested on a variety of images and performance results are presented measuring the speedup obtained due to dual-core implementation. A major advantage of this approach is that it allows the ARM processor to perform important real-time tasks, while the DSP addresses performance-hungry algorithms.

  10. Genetic algorithm and image processing for osteoporosis diagnosis.

    Science.gov (United States)

    Jennane, R; Almhdie-Imjabber, A; Hambli, R; Ucan, O N; Benhamou, C L

    2010-01-01

    Osteoporosis is considered as a major public health threat. It is characterized by a decrease in the density of bone, decreasing its strength and leading to an increased risk of fracture. In this work, the morphological, topological and mechanical characteristics of 2 populations of arthritic and osteoporotic trabecular bone samples are evaluated using artificial intelligence and recently developed skeletonization algorithms. Results show that genetic algorithms associated with image processing tools can precisely separate the 2 populations. PMID:21096487

  11. A review of segmentation algorithms for ear image data

    OpenAIRE

    Ana Ferreira; Tavares, João Manuel R. S.; Fernanda Gentil

    2012-01-01

    The segmentation of ear structures in computed tomography (CT) and magnetic resonance (MR) images has gained high importance in the last years. In this paper, we present a review on the segmentation algorithms presented in the literature. As such, several algorithms, approaches, related issues and problems are defined and explained. Additionally, their use is illustrated and discussed, and their advantages and disadvantages are pointed out.

  12. COMPARISON OF DIFFERENT SEGMENTATION ALGORITHMS FOR DERMOSCOPIC IMAGES

    Directory of Open Access Journals (Sweden)

    A.A. Haseena Thasneem

    2015-05-01

    Full Text Available This paper compares different algorithms for the segmentation of skin lesions in dermoscopic images. The basic segmentation algorithms compared are thresholding techniques (global and adaptive), region-based techniques (K-means, Fuzzy C-means, Expectation Maximization and Statistical Region Merging), contour models (Active Contour Model and Chan-Vese Model) and spectral clustering. Accuracy, sensitivity, specificity, border error, Hammoude distance, Hausdorff distance, MSE, PSNR and elapsed-time metrics were used to evaluate the various segmentation techniques.

  13. Reduction Algorithms for the Multiband Imaging Photometer for Spitzer

    OpenAIRE

    Gordon, K. D.; Rieke, G. H.; Engelbracht, C. W.; Muzerolle, J.; Stansberry, J. A.; Misselt, K. A.; Morrison, J. E.; Cadien, J.; Young, E T; Dole, H.; Kelly, D. M.; Alonso-Herrero, A.; Egami, E.; Su, K. Y. L.; Papovich, C

    2005-01-01

    We describe the data reduction algorithms for the Multiband Imaging Photometer for Spitzer (MIPS) instrument. These algorithms were based on extensive preflight testing and modeling of the Si:As (24 micron) and Ge:Ga (70 and 160 micron) arrays in MIPS and have been refined based on initial flight data. The behaviors we describe are typical of state-of-the-art infrared focal planes operated in the low backgrounds of space. The Ge arrays are bulk photoconductors and therefore ...

  14. Image Retrieval Relevance Feedback algorithms: Trends and Techniques

    OpenAIRE

    Puja Kumar

    2013-01-01

    Abstract: With many applications, Content Based Image Retrieval (CBIR) has come into attention in recent decades. To reduce the semantic gap, a wide variety of relevance feedback (RF) algorithms have been developed in recent years to improve the performance of CBIR systems. These RF algorithms capture the user's preferences and bridge the semantic gap. Many schemes and techniques of relevance feedback exist, with many assumptions and operating criteria. Yet there exist few ways of quantitati...

  15. Image Restoration Algorithm Research on Local Motion-blur

    Directory of Open Access Journals (Sweden)

    Yan Chen

    2011-06-01

    Full Text Available In this paper, we address the restoration of local motion blur. On the basis of a basic model of local motion blur, the formation mechanism of local motion blur is analyzed, and a new restoration algorithm aimed at local motion blur in a complex background is proposed. In the algorithm, the problem of restoring a blurred image with a complex background is simplified. First, the blurred part is extracted from the complex background and pasted onto a monochromatic background. After restoration over the monochromatic background, the restored part is pasted back into the original complex background. All operations can be completed in the spatial domain. Because the restoration of a blurred image with a monochromatic background is easier, the algorithm proposed in this paper is simple and fast. It is an effective method of blurred image restoration.

  16. A comparative study of Image Region-Based Segmentation Algorithms

    Directory of Open Access Journals (Sweden)

    Lahouaoui LALAOUI

    2013-07-01

    Full Text Available Image segmentation has recently become an essential step in image processing as it largely conditions the interpretation that is done afterwards. It is still difficult to assess the accuracy of a segmentation algorithm regardless of the nature of the treated image. In this paper we perform an objective comparison of region-based segmentation techniques such as supervised and unsupervised deterministic classification, and non-parametric and parametric probabilistic classification. Eight methods among those well known and used in the scientific community have been selected and compared. Martin's criteria (GCE, LCE), the probabilistic Rand Index (RI), Variation of Information (VI) and Boundary Displacement Error (BDE) are used to evaluate the performance of these algorithms on Magnetic Resonance (MR) brain images, a synthetic MR image, and synthetic images. The MR brain images are composed of grey matter (GM), white matter (WM), cerebrospinal fluid (CSF) and other tissues, and the synthetic MR image is composed of the same tissues as the real image plus edema and tumour. Results show that segmentation is an image-dependent process and that some of the evaluated methods are well suited for better segmentation.

  17. Retrieval Applications in Image Segmentation using Wavelets and CBIR Algorithm

    Directory of Open Access Journals (Sweden)

    MANIMEGALAI.S

    2013-04-01

    Full Text Available Segmentation refers to the process of partitioning a digital image into multiple segments. Image segmentation is typically used to locate objects and boundaries (lines, curves, etc.) in images. Processing images for specific targets on a large scale has to handle various kinds of content with regular processing steps. To segment objects in an image, the dual multiScalE Graylevel mOrphological open and close recoNstructions (SEGON) algorithm is used. It can be used to build a BackGround (BG) gray-level variation mesh, which helps to identify BG and object regions. Content-Based Image Retrieval (CBIR) was carried out to evaluate the object segmentation capability when dealing with large-scale database images. CBIR is a technique which uses visual content to search for images in a large-scale database. The object segmentation method can be extended to extract other image features, and new feature types can be incorporated into the algorithm to further improve image retrieval performance.

  18. Distributed computing of Seismic Imaging Algorithms

    CERN Document Server

    Emami, Masnida; Jaberi, Nasrin

    2012-01-01

    The primary use of technical computing in the oil and gas industries is for seismic imaging of the earth's subsurface, driven by the business need for making well-informed drilling decisions during petroleum exploration and production. Since each oil/gas well in exploration areas costs several tens of millions of dollars, producing high-quality seismic images in a reasonable time can significantly reduce the risk of drilling a "dry hole". Similarly, these images are important as they can improve the position of wells in a billion-dollar producing oil field. However, seismic imaging is very data- and compute-intensive: it needs to process terabytes of data and requires Gflop-years of computation (using "flop" to mean floating point operations per second). Due to the data- and compute-intensive nature of seismic imaging, parallel computing is used to process the data and reduce the computation time. With the introduction of cloud computing, the MapReduce programming model has attracted a lot of attention in parallel and di...

  19. COMPARISON AND ANALYSIS OF WATERMARKING ALGORITHMS IN COLOR IMAGESIMAGE SECURITY PARADIGM

    Directory of Open Access Journals (Sweden)

    D. Biswas

    2011-06-01

    Full Text Available This paper is based on a comparative study of different watermarking techniques such as the LSB hiding algorithm, (2, 2) visual cryptography based watermarking for color images [3,4] and the randomized LSB-MSB hiding algorithm [1]. Here, we embed the secret image in a host or original image by using these bit-wise pixel manipulation algorithms. This is followed by a comparative study of the resultant images through Peak Signal to Noise Ratio (PSNR) calculation. The property-wise variation of the different types of secret images that are embedded into the host image plays an important role in this context. The Peak Signal to Noise Ratio is calculated for different color levels (red, green, blue) and also for their equivalent gray level images. From the results, we try to predict which technique is more suitable for which type of secret image.
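
    A minimal sketch of LSB hiding and of the PSNR measurement used for comparison: one bit of a binary secret image is written into the least significant bit of each host pixel, the watermark is read back from the LSB plane, and the embedding distortion is reported as PSNR. The images below are synthetic stand-ins.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    host = rng.integers(0, 256, (256, 256), dtype=np.uint8)
    secret_bits = rng.integers(0, 2, (256, 256), dtype=np.uint8)   # binary watermark

    # Embed: clear the host LSB and write the secret bit into it.
    watermarked = (host & 0xFE) | secret_bits

    # Extract: the watermark is simply the LSB plane of the stego image.
    extracted = watermarked & 0x01
    assert np.array_equal(extracted, secret_bits)

    # PSNR between host and watermarked image.
    mse = np.mean((host.astype(float) - watermarked.astype(float)) ** 2)
    psnr = 10 * np.log10(255.0 ** 2 / mse)
    print(f"PSNR after LSB embedding: {psnr:.2f} dB")
    ```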

  20. Research on Target Type Recognition Algorithm of Aerial Infrared Image

    Directory of Open Access Journals (Sweden)

    Bin Liu

    2013-10-01

    Full Text Available In order to improve the target type recognition rate for aerial infrared images under the new requirements of omnidirectional crossing and multiple target types, a recognition algorithm with four steps is researched in this paper. Firstly the maximum between-cluster variance (Otsu) algorithm is applied to segment the target from the infrared image. Secondly a new edge detection algorithm is proposed to obtain the target edge in the segmented image. Thirdly the edge points are fitted to a polygon and its Fourier descriptor is extracted to obtain the target feature. Finally the target type is identified by a classification recognition algorithm based on a BP neural network and the Fourier descriptor of the target. The simulation results show that the recognition rate of the target type recognition algorithm is more than 80% for typical aerial targets under every complex condition, and the processing time is just 0.1 s. So the researched algorithm can meet the requirements of real-time performance and high recognition rate.

  1. Optimized Password Recovery for Encrypted RAR on GPUs

    OpenAIRE

    An, Xiaojing; Zhao, Haojun; Lulu DING; Fan, Zhongrui; Wang, Hanyue

    2015-01-01

    RAR uses SHA-1 hashing for key derivation and the classic symmetric encryption algorithm AES for encryption, and the only method of password recovery is brute force, which is very time-consuming. In this paper, we present an approach using GPUs to speed up the password recovery process. However, because the major calculation and most time-consuming part, SHA-1 hashing, is hard to parallelize, this paper adopts coarse-grained parallelism. That is, one GPU thread is responsible for the ...

  2. Imaging volcanic infrasound sources using time reversal mirror algorithm

    Science.gov (United States)

    Kim, Keehoon; Lees, Jonathan M.

    2015-09-01

    We investigate the capability of the Time Reversal Mirror (TRM) algorithm to image local acoustic sources. We apply a weighted imaging condition to compensate for the complicated transmission loss of the time-reversed wavefield and demonstrate that the presented condition significantly improves the focusing quality of TRM in the presence of complex topography. The resulting TRM source images exhibit remarkable agreement with visual observation of the eruption, implying that the TRM method with a proper imaging condition can be used to localize and track acoustic sources associated with complex volcanic eruptions.

  3. Publickey encryption by ordering

    OpenAIRE

    Terasawa, Yoshihiro

    2014-01-01

    In 1999, public key cryptography using matrices was devised by a 16-year-old high school student, Sarah Flannery. This cryptosystem seemed faster than RSA and appeared to have the strength to surpass RSA encryption. However, the scheme was broken before her papers were published. In this paper, we try to construct a public-key encryption scheme from a permutation group, which is equivalent to a matrix group as a noncommutative group. And we explore the potential ...

  4. Algorithm-Architecture Matching for Signal and Image Processing

    CERN Document Server

    Gogniat, Guy; Morawiec, Adam; Erdogan, Ahmet

    2011-01-01

    Advances in signal and image processing together with increasing computing power are bringing mobile technology closer to applications in a variety of domains like automotive, health, telecommunication, multimedia, entertainment and many others. The development of these leading applications, involving a large diversity of algorithms (e.g. signal, image, video, 3D, communication, cryptography) is classically divided into three consecutive steps: a theoretical study of the algorithms, a study of the target architecture, and finally the implementation. Such a linear design flow is reaching its li

  5. Performance Evaluation of Noise Reduction Algorithm in Magnetic Resonance Images

    Directory of Open Access Journals (Sweden)

    Milindkumar V Sarode

    2011-03-01

    Full Text Available The objective of this paper is to estimate the noise in Magnetic Resonance images and to evaluate the noise reduction algorithm presented in this paper. We propose a method for the reduction of Rician noise in MRI. This method shows an optimal estimation result that is more accurate in recovering the true signal from Rician noise. The method is proposed specifically for Rician noise reduction, but because Rician noise can be approximated as Gaussian when the SNR is high, we expect the proposed algorithm to also have an advantage in the denoising of complex MR images.

  6. A New Method For Encryption Using Fuzzy Set Theory

    Directory of Open Access Journals (Sweden)

    Dr.S.S.Dhenakaran, N.Kavinilavu

    2012-06-01

    Full Text Available Security of data is an important factor in data transmission through a network. In this paper, we propose a new method using fuzzy set theory to enhance security. The data, in the form of text to be transmitted, is encrypted using the AES Rijndael algorithm. The encryption algorithm is the mathematical procedure for performing encryption of the data; a key is used to cipher a message and to decipher it back to the original message. The scrambled, encrypted text is then converted into numeric form by applying fuzzy set theory: the fuzzy logic maps the text to values between zero and one. Before decryption, these numerics are converted back into scrambled text. After this, if the key provided by the user is the same key that was used for encryption, the original data is retrieved. The paper integrates the encryption of text with the conversion of the unscrambled text from numeric form back to the original using fuzzy logic. We employ a matrix conversion of the text after encryption, so intruders are unaware of the text that is encrypted; fuzzy membership functions are involved in this matrix conversion. Before decryption, the matrix transformation takes place to recover the text; decryption then takes place when the key is provided.
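
    A minimal sketch of the AES (Rijndael) encryption and decryption step described here, using the PyCryptodome library; the fuzzy-set and matrix post-processing of the paper are not reproduced. The key, cipher mode and message are illustrative assumptions.

    ```python
    from Crypto.Cipher import AES
    from Crypto.Random import get_random_bytes
    from Crypto.Util.Padding import pad, unpad

    key = get_random_bytes(16)                      # 128-bit key shared by both parties
    plaintext = b"Text to be transmitted securely"

    # Encrypt with AES in CBC mode (IV generated automatically).
    cipher = AES.new(key, AES.MODE_CBC)
    ciphertext = cipher.encrypt(pad(plaintext, AES.block_size))
    iv = cipher.iv

    # Decrypt with the same key and IV to recover the original message.
    decipher = AES.new(key, AES.MODE_CBC, iv)
    recovered = unpad(decipher.decrypt(ciphertext), AES.block_size)
    assert recovered == plaintext
    ```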

  7. Photonic encryption using all optical logic.

    Energy Technology Data Exchange (ETDEWEB)

    Blansett, Ethan L.; Schroeppel, Richard Crabtree; Tang, Jason D.; Robertson, Perry J.; Vawter, Gregory Allen; Tarman, Thomas David; Pierson, Lyndon George

    2003-12-01

    With the build-out of large transport networks utilizing optical technologies, more and more capacity is being made available. Innovations in Dense Wave Division Multiplexing (DWDM) and the elimination of optical-electrical-optical conversions have brought on advances in communication speeds as we move into 10 Gigabit Ethernet and above. Of course, there is a need to encrypt data on these optical links as the data traverses public and private network backbones. Unfortunately, as the communications infrastructure becomes increasingly optical, advances in encryption (done electronically) have failed to keep up. This project examines the use of optical logic for implementing encryption in the photonic domain to achieve the requisite encryption rates. In order to realize photonic encryption designs, technology developed for electrical logic circuits must be translated to the photonic regime. This paper examines two classes of all optical logic (SEED, gain competition) and how each discrete logic element can be interconnected and cascaded to form an optical circuit. Because there is no known software that can model these devices at a circuit level, the functionality of the SEED and gain competition devices in an optical circuit were modeled in PSpice. PSpice allows modeling of the macro characteristics of the devices in context of a logic element as opposed to device level computational modeling. By representing light intensity as voltage, 'black box' models are generated that accurately represent the intensity response and logic levels in both technologies. By modeling the behavior at the systems level, one can incorporate systems design tools and a simulation environment to aid in the overall functional design. Each black box model of the SEED or gain competition device takes certain parameters (reflectance, intensity, input response), and models the optical ripple and time delay characteristics. These 'black box' models are interconnected and cascaded in an encrypting/scrambling algorithm based on a study of candidate encryption algorithms. We found that a low gate count, cascadable encryption algorithm is most feasible given device and processing constraints. The modeling and simulation of optical designs using these components is proceeding in parallel with efforts to perfect the physical devices and their interconnect. We have applied these techniques to the development of a 'toy' algorithm that may pave the way for more robust optical algorithms. These design/modeling/simulation techniques are now ready to be applied to larger optical designs in advance of our ability to implement such systems in hardware.

  8. Image Enhancement Of Under Water Images Using Structured Preserving Noise Reduction Algorithm

    Directory of Open Access Journals (Sweden)

    D.Napoleon

    2013-10-01

    Full Text Available This paper presents the SUSAN (structure-preserving noise reduction) algorithm for gray scale images contaminated by traditional median noise. Remote sensing images are affected by different types of noise like Gaussian noise, speckle noise and impulse noise. These noises are introduced into underwater images during the acquisition or transmission process. In this paper the traditional median filter and the structure-preserving noise reduction algorithm are used to reduce the noise rate; when this filter and the SUSAN algorithm are compared, structure-preserving noise reduction gives better results.

  9. Adaptive Proximal Point Algorithms for Total Variation Image Restoration

    Directory of Open Access Journals (Sweden)

    Ying Chen

    2015-02-01

    Full Text Available Image restoration is a fundamental problem in various areas of imaging sciences. This paper presents a class of adaptive proximal point algorithms (APPA) with a contraction strategy for total variation image restoration. In each iteration, the proposed methods choose an adaptive proximal parameter matrix which is not necessarily symmetric. In fact, there is an inner extrapolation in the prediction step, which is followed by a correction step for contraction, and the inner extrapolation is implemented by an adaptive scheme. By using the framework of the contraction method, a global convergence result and a convergence rate of O(1/N) can be established for the proposed methods. Numerical results are reported to illustrate the efficiency of the APPA methods for solving total variation image restoration problems. Comparisons with state-of-the-art algorithms demonstrate that the proposed methods are comparable and promising.

  10. Performance evaluation of image segmentation algorithms on microscopic image data.

    Czech Academy of Sciences Publication Activity Database

    Beneš, Miroslav; Zitová, Barbara

    -, - (2014), s. 1-21. ISSN 0022-2720 R&D Projects: GA ČR GAP103/12/2211 Institutional support: RVO:67985556 Keywords: image segmentation * performance evaluation * microscopic images Subject RIV: JC - Computer Hardware ; Software Impact factor: 2.331, year: 2014 http://library.utia.cas.cz/separaty/2014/ZOI/zitova-0434809-DOI.pdf

  11. Evaluation of image deblurring algorithms for real-time applications

    OpenAIRE

    Airo Farulla, Giuseppe; Rolfo, Daniele; Russo, Ludovico Orlando; Indaco, Marco; Trotta, Pascal

    2014-01-01

    Camera shake is a well-known source of degradation in digital images, as it introduces motion blur. Taking satisfactory photos under dim lighting conditions or using a hand-held camera is challenging. The same problems arise when the camera is attached to mechanical equipment that transfers vibrations to the camera itself. For decades, many different theories and algorithms have been proposed with the aim of retrieving latent images from blurry inputs; most of them work quite well, but very often...

  12. Performance evaluation of 2D image registration algorithms with the numeric image registration and comparison platform

    International Nuclear Information System (INIS)

    The objective of this work is to present the capabilities of the NUMERICS web platform for evaluation of the performance of image registration algorithms. The NUMERICS platform is a web accessible tool which provides access to dedicated numerical algorithms for registration and comparison of medical images (http://numerics.phys.uni-sofia.bg). The platform allows comparison of noisy medical images by means of different types of image comparison algorithms, which are based on statistical tests for outliers. The platform also allows 2D image registration with different techniques like Elastic Thin-Plate Spline registration, registration based on rigid transformations, affine transformations, as well as non-rigid image registration based on Mobius transformations. In this work we demonstrate how the platform can be used as a tool for evaluation of the quality of the image registration process. We demonstrate performance evaluation of a deformable image registration technique based on Mobius transformations. The transformations are applied with appropriate cost functions like: Mutual information, Correlation coefficient, Sum of Squared Differences. The accent is on the results provided by the platform to the user and their interpretation in the context of the performance evaluation of 2D image registration. The NUMERICS image registration and image comparison platform provides detailed statistical information about submitted image registration jobs and can be used to perform quantitative evaluation of the performance of different image registration techniques. (authors)
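
    The cost functions named above are standard; as a minimal sketch, two of them, the sum of squared differences and the correlation coefficient, are shown below for two grayscale arrays of equal shape (mutual information is omitted for brevity, and this is a generic illustration rather than the NUMERICS implementation).

    ```python
    import numpy as np

    def ssd(a, b):
        """Sum of squared differences between two equally sized images."""
        return float(np.sum((a.astype(float) - b.astype(float)) ** 2))

    def correlation_coefficient(a, b):
        """Pearson correlation coefficient of the flattened images."""
        a, b = a.astype(float).ravel(), b.astype(float).ravel()
        a -= a.mean()
        b -= b.mean()
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    ```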

  13. An Overview of Transform Domain Robust Digital Image Watermarking Algorithms

    Directory of Open Access Journals (Sweden)

    Baisa L. Gunjal

    2011-01-01

    Full Text Available Internet and multimedia technologies have become part of our daily needs, and it has become common practice to create, copy, transmit and distribute digital data. Obviously, this leads to the problem of unauthorized replication. Digital image watermarking provides copyright protection by hiding appropriate information in the original image to declare rightful ownership. The aim of this paper is to provide a complete overview of digital image watermarking. The study focuses on the quality factors essential for good watermarking, performance evaluation metrics (PSNR and correlation factors) and possible attacks. An overview of several spatial and transform domain watermarking methods is given with detailed mathematical formulae, their implementations, strengths and weaknesses. Generalized algorithms are presented for DWT-based, CDMA-based and combined DCT-DWT approaches. The ridgelet transform is also introduced. Comparative results of digital image watermarking using LSB, DCT and DWT are also presented. The paper recommends DWT-based techniques for achieving robustness in digital image watermarking.
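
    PSNR, one of the quality metrics listed above, is simple to compute; the snippet below is a generic sketch for 8-bit images, not code from the surveyed methods.

    ```python
    import numpy as np

    def psnr(original, watermarked, peak=255.0):
        """Peak signal-to-noise ratio in dB between an image and its watermarked copy."""
        mse = np.mean((original.astype(float) - watermarked.astype(float)) ** 2)
        if mse == 0:
            return float("inf")   # identical images
        return 10.0 * np.log10(peak ** 2 / mse)
    ```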

  14. Overview and Challenges of Different Image Morphing Algorithms

    Directory of Open Access Journals (Sweden)

    Md. Baharul Islam, Md. Tukhrejul Inam, Balaji Kaliyaperumal

    2013-04-01

    Full Text Available Gradual transformation from one graphical object or image (the source) to another graphical object or image (the target) is called morphing. It is useful for creating special effects in animation, and the technology is very popular in the film and advertisement industries. The idea is to distort the first image into the second one and, possibly, vice versa. The intermediate image, which is neither the first image nor the second, is the key to this technology: if it looks good, then the entire animated sequence will probably look good. A complete overview of this technology and its challenges is needed by researchers who want to work with it. This paper describes an overview and the challenges of almost all morphing algorithms used in computer animation. Cross-dissolving is one of the problems in morphing technology.
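
    Cross-dissolving, mentioned above, is the simplest (and weakest) morphing step: a pixel-wise blend of source and target with no geometric warping. A minimal sketch:

    ```python
    import numpy as np

    def cross_dissolve(source, target, alpha):
        """Blend two equally sized images; alpha=0 gives the source, alpha=1 the target."""
        blend = (1.0 - alpha) * source.astype(float) + alpha * target.astype(float)
        return blend.astype(np.uint8)

    # A full morph would warp both images toward an intermediate geometry before blending;
    # cross-dissolving alone produces the ghosting artifacts the record refers to.
    ```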

  15. Object Recognition Algorithm Utilizing Graph Cuts Based Image Segmentation

    Directory of Open Access Journals (Sweden)

    Zhaofeng Li

    2014-02-01

    Full Text Available This paper concentrates on designing an object recognition algorithm utilizing image segmentation. The main innovations of this paper lie in converting the image segmentation problem into a graph cut problem, whose result is obtained by calculating, for a given pixel, the probability that its intensity belongs to the object or to the background. After the graph cut process, the pixels in the same component are similar, and the pixels in different components are dissimilar. To detect the objects in the test image, the visual similarity between the segments of the testing images and the object types deduced from the training images is estimated. Finally, a series of experiments is conducted for performance evaluation. Experimental results illustrate that, compared with existing methods, the proposed scheme can effectively detect salient objects. In particular, we verify that, in our scheme, the precision of object recognition is proportional to the image segmentation accuracy.

  16. Cropping and noise resilient steganography algorithm using secret image sharing

    Science.gov (United States)

    Juarez-Sandoval, Oswaldo; Fierro-Radilla, Atoany; Espejel-Trujillo, Angelina; Nakano-Miyatake, Mariko; Perez-Meana, Hector

    2015-03-01

    This paper proposes an image steganography scheme in which a secret image is hidden into a cover image using a secret image sharing (SIS) scheme. Taking advantage of the fault-tolerant property of the (k,n)-threshold SIS, where any k of the n shares (k≤n) recover the secret data without ambiguity, the proposed steganography algorithm becomes resilient to cropping and impulsive noise contamination. Among the many SIS schemes proposed until now, Lin and Chan's scheme is selected as the SIS, due to its lossless recovery capability for a large amount of secret data. The proposed scheme is evaluated from several points of view, such as imperceptibility of the stegoimage with respect to its original cover image and robustness of the hidden data to cropping and impulsive noise contamination. The evaluation results show a high quality of the extracted secret image from the stegoimage even when it has suffered more than 20% cropping or high-density noise contamination.
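
    Lin and Chan's scheme is not reproduced here, but the (k,n)-threshold idea it builds on can be sketched with classic polynomial (Shamir-style) sharing over a small prime field: any k of the n shares recover the secret byte, fewer reveal nothing.

    ```python
    import random

    P = 257  # prime field large enough for one byte of secret per share set

    def make_shares(secret, k, n):
        """Split one secret value into n shares; any k of them recover it."""
        coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
        return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
                for x in range(1, n + 1)]

    def recover(shares):
        """Lagrange interpolation at x = 0 recovers the secret."""
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num = den = 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * (-xj) % P
                    den = den * (xi - xj) % P
            secret = (secret + yi * num * pow(den, P - 2, P)) % P
        return secret

    shares = make_shares(123, k=3, n=5)
    assert recover(shares[:3]) == 123 and recover(shares[1:4]) == 123
    ```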

  17. Visual-hint Boundary to Segment Algorithm for Image Segmentation

    CERN Document Server

    Su, Yu

    2010-01-01

    Image segmentation has been a very active research topic in the image analysis area. Currently, most image segmentation algorithms are designed on the idea that images are partitioned into a set of regions preserving homogeneous intra-regions and inhomogeneous inter-regions. However, human visual intuition does not always follow this pattern. A new image segmentation method named Visual-Hint Boundary to Segment (VHBS) is introduced, which is more consistent with human perception. VHBS abides by two visual-hint rules based on human perception: (i) global-scale boundaries tend to be the real boundaries of objects; (ii) two adjacent regions with quite different colors or textures tend to produce real boundaries between them. Experiments demonstrate that, compared with traditional image segmentation methods, VHBS has better performance while preserving higher computational efficiency.

  18. Color Image Segmentation Method Based on Improved Spectral Clustering Algorithm

    Directory of Open Access Journals (Sweden)

    Dong Qin

    2014-08-01

    Full Text Available To address the high sparsity of image data and the problem of determining the number of clusters, we propose a color image segmentation algorithm that combines semi-supervised machine learning with spectral graph theory. Building on the theory and methods of spectral clustering, we introduce the concept of information entropy to design a method that automatically optimizes the scale parameter, which avoids the instability in clustering results caused by setting the scale parameter manually. In addition, we exploit the prior information available in the large amount of non-generic data and apply a semi-supervised algorithm to improve clustering performance on rare classes. We also use the added tag data to compute the similarity matrix and perform clustering through the FKCM algorithm. Experiments on a standard dataset and on image segmentation demonstrate that our algorithm overcomes the defects of traditional spectral clustering methods, which are sensitive to outliers, prone to falling into local optima, and slow to converge.

  19. Development of computed tomography system and image reconstruction algorithm

    International Nuclear Information System (INIS)

    Computed tomography is one of the most advanced and powerful nondestructive inspection techniques and is currently used in many different industries. In several CT systems, detection has been performed by a combination of an X-ray image intensifier and a charge-coupled device (CCD) camera, or by using a line array detector. The recent development of X-ray flat panel detectors has made fast CT imaging feasible and practical. This paper therefore explains the arrangement of a new detection system using the existing high-resolution (127 μm pixel size) flat panel detector at MINT, together with the image reconstruction technique developed. The aim of the project is to develop a prototype flat-panel-detector-based CT imaging system for NDE. The prototype consists of an X-ray tube, a flat panel detector system, a rotation table and a computer system to control the sample motion and image acquisition. Hence the project is divided into two major tasks: first, to develop the image reconstruction algorithm, and second, to integrate the X-ray imaging components into one CT system. The image reconstruction algorithm, using the filtered back-projection method, is developed and compared to other techniques. MATLAB is the tool used for the simulations and computations in this project. (Author)
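
    The record's implementation is in MATLAB; as a rough cross-check of the filtered back-projection step, the same idea can be sketched in a few lines of Python with scikit-image. The phantom and angle count below are arbitrary illustrative choices, not the MINT system's data.

    ```python
    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon, rescale

    # Simulate projections of a standard phantom and reconstruct by filtered back-projection.
    image = rescale(shepp_logan_phantom(), 0.5)
    theta = np.linspace(0.0, 180.0, 180, endpoint=False)
    sinogram = radon(image, theta=theta)            # forward projection
    reconstruction = iradon(sinogram, theta=theta)  # back-projection with the default ramp filter
    print("reconstruction RMS error:", np.sqrt(np.mean((reconstruction - image) ** 2)))
    ```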

  20. An Improved Fast SPIHT Image Compression Algorithm for Aerial Applications

    Directory of Open Access Journals (Sweden)

    Ning Zhang

    2011-12-01

    Full Text Available In this paper, an improved fast SPIHT algorithm is presented. SPIHT and NLS (Not List SPIHT) are efficient compression algorithms, but their application in aviation is limited by poor error resistance and slow compression speed. In this paper, the error resilience and the compression speed are improved. The remote sensing images are decomposed with the Le Gall 5/3 wavelet, and the wavelet coefficients are indexed, scanned and allocated by means of family blocks. The bit-plane importance is predicted by a bitwise OR, so that N bit-planes can be encoded at the same time. Compared with the SPIHT algorithm, the improved algorithm is easy to implement in hardware and the compression speed is improved. The PSNR of reconstructed images encoded by fast SPIHT is 0.3 to 0.9 dB higher than SPIHT and CCSDS, and the encoding is 4-6 times faster than the SPIHT encoding process. The algorithm meets the high speed and reliability requirements of aerial applications.

  1. Three-dimensional optical encryption based on ptychography

    Science.gov (United States)

    Zhang, Jun; Li, Tuo; Wang, Yali; Qiao, Liang; Yang, Xiubo; Shi, Yishi

    2015-10-01

    We propose a novel optical encryption system for three-dimensional imaging based on three-dimensional ptychography. Employing the proposed cryptosystem, a 3D object can be encrypted and decrypted successfully. Compared with conventional three-dimensional cryptosystems, not only can a pure-amplitude 3D object be encrypted, but the encryption of a complex-amplitude 3D object is also achievable. Considering that probes overlapping with each other are the crucial factor in ptychography, their complex-amplitude functions can serve as a kind of secret key, leading to an enlarged key space and enhanced system security. A variety of simulation results demonstrate the feasibility and robustness of the cryptosystem. Furthermore, the proposed system could also be used for other potential applications, such as three-dimensional information hiding and multiple-image encryption.

  2. Digital Image Watermarking Using Edge Detection and Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Seyed Sahand Mohammadi Ziabari, Reza Ebrahimi Atani, Kian Keyghobad, Abdolmajid Riazi

    2014-01-01

    Full Text Available Watermarking in regions identified as image edges by edge-detection filters improves the watermarking results. However, since an embedding strength coefficient is involved, finding the best value for this coefficient requires optimization. Here, a genetic algorithm is used to find the best values of the embedding coefficients.

  3. Improved MCA-TV algorithm for interference hyperspectral image decomposition

    Science.gov (United States)

    Wen, Jia; Zhao, Junsuo; Cailing, Wang

    2015-12-01

    The technology of interference hyperspectral imaging, which can obtain the spectral and spatial information of observed targets, is a very powerful technology in the field of remote sensing. Due to the special imaging principle, there are many position-fixed interference fringes in each frame of interference hyperspectral image (IHI) data. This characteristic affects the results of compressed sensing theory and of traditional compression algorithms applied to IHI data. According to this characteristic of IHI data, morphological component analysis (MCA) is adopted to separate the interference fringe layers and the background layers of LSMIS (Large Spatially Modulated Interference Spectral Image) data, and an improved algorithm combining MCA and Total Variation (TV) is proposed in this paper. A new update scheme for the threshold in traditional MCA is proposed, and the traditional TV algorithm is also improved according to the unidirectional characteristic of the interference fringes in IHI data. The experimental results show that the proposed improved MCA-TV (IMT) algorithm obtains better results than traditional MCA and also meets the convergence conditions much faster than traditional MCA.

  4. A Novel Algorithm of Surface Eliminating in Undersurface Optoacoustic Imaging

    Directory of Open Access Journals (Sweden)

    Zhulina Yulia V

    2004-01-01

    Full Text Available This paper analyzes the task of optoacoustic imaging of objects located under a covering surface. We suggest a surface-elimination algorithm based on the fact that the intensity of the image, as a function of the spatial point, should change slowly inside local objects and suffer a discontinuity of the spatial gradients on their boundaries. The algorithm forms the two-dimensional curves along which the discontinuity of the signal derivatives is detected, and then divides the signal space into areas along these curves. The signals inside the areas with the maximum level of signal amplitude and the maximum absolute gradient values on their edges are set to zero; the remaining signals are used for the image restoration. This method permits reconstruction of the surface boundaries with a higher contrast than the surface detection technique based on the maxima of the received signals. The algorithm does not require any prior knowledge of the signal statistics inside or outside the local objects, and it may be used for reconstructing any images from signals representing an integral over the object's volume. Simulation and real data are also provided to validate the proposed method.

  5. Gray Cerebrovascular Image Skeleton Extraction Algorithm Using Level Set Model

    Directory of Open Access Journals (Sweden)

    Jian Wu

    2010-06-01

    Full Text Available The ambiguity and complexity of medical cerebrovascular images make the skeletons obtained by conventional skeletonization algorithms discontinuous, sensitive at weak edges, poorly robust and full of burrs. This paper proposes a cerebrovascular image skeleton extraction algorithm based on a level set model, using a Euclidean distance field and an improved gradient vector flow to obtain two different energy functions. The first energy function controls the extraction of the topological nodes from which the skeleton curve starts. The second energy function controls the extraction of the skeleton surface. The algorithm avoids locating and classifying the skeleton connection points that guide the skeleton extraction, and because all of its parameters are obtained by analysis and reasoning, no manual intervention is needed.

  6. Volumetric segmentation of brain images using parallel genetic algorithms.

    Science.gov (United States)

    Fan, Yong; Jiang, Tianzi; Evans, David J

    2002-08-01

    Active model-based segmentation has frequently been used in medical image processing with considerable success. Although the active model-based method was initially viewed as an optimization problem, most researchers implement it as a partial differential equation solution. The advantages and disadvantages of the active model-based method are distinct: speed and stability. To improve its performance, a parallel genetic algorithm-based active model method is proposed and applied to segment the lateral ventricles from magnetic resonance brain images. First, an objective function is defined. Then one instance surface was extracted using the finite-difference method-based active model and used to initialize the first generation of a parallel genetic algorithm. Finally, the parallel genetic algorithm is employed to refine the result. We demonstrate that the method successfully overcomes numerical instability and is capable of generating an accurate and robust anatomic descriptor for complex objects in the human brain, such as the lateral ventricles. PMID:12472263

  7. Image compression based on fuzzy algorithms for learning vector quantization and wavelet image decomposition.

    Science.gov (United States)

    Karayiannis, N B; Pai, P I; Zervos, N

    1998-01-01

    This work evaluates the performance of an image compression system based on wavelet-based subband decomposition and vector quantization. The images are decomposed using wavelet filters into a set of subbands with different resolutions corresponding to different frequency bands. The resulting subbands are vector quantized using the Linde-Buzo-Gray (LBG) algorithm and various fuzzy algorithms for learning vector quantization (FALVQ). These algorithms perform vector quantization by updating all prototypes of a competitive neural network through an unsupervised learning process. The quality of the multiresolution codebooks designed by these algorithms is measured on the reconstructed images belonging to the training set used for multiresolution codebook design and the reconstructed images from a testing set. PMID:18276335

  8. Semi-Supervised Learning Based Social Image Semantic Mining Algorithm

    Directory of Open Access Journals (Sweden)

    Guangwu AO

    2014-02-01

    Full Text Available Social image semantic mining is of great importance in social image retrieval and can also help bridge the semantic gap. In this paper, a novel social image semantic mining algorithm based on semi-supervised learning is proposed. Firstly, labels that tag the images in the test image dataset are extracted, and noisy semantic information is pruned. Secondly, the labels are propagated to construct an extended collection. Thirdly, visual features are extracted from the unlabeled images in three steps: watershed segmentation, region feature extraction and codebook construction. Fourthly, vectors of image visual features are obtained by dimension reduction. Fifthly, after semi-supervised learning and classifier training, the confidence scores of semantic terms for the unlabeled images are calculated by integrating different types of social image features, and the heterogeneous feature spaces are divided into several disjoint groups. Finally, experiments are conducted for performance evaluation. Compared with other existing methods, the proposed algorithm effectively extracts the semantic information of social images.

  9. The ANACONDA algorithm for deformable image registration in radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Weistrand, Ola; Svensson, Stina, E-mail: stina.svensson@raysearchlabs.com [RaySearch Laboratories AB, Sveavägen 44, SE-11134 Stockholm (Sweden)

    2015-01-15

    Purpose: The purpose of this work was to describe a versatile algorithm for deformable image registration with applications in radiotherapy and to validate it on thoracic 4DCT data as well as CT/cone beam CT (CBCT) data. Methods: ANAtomically CONstrained Deformation Algorithm (ANACONDA) combines image information (i.e., intensities) with anatomical information as provided by contoured image sets. The registration problem is formulated as a nonlinear optimization problem and solved with an in-house developed solver, tailored to this problem. The objective function, which is minimized during optimization, is a linear combination of four nonlinear terms: 1. image similarity term; 2. grid regularization term, which aims at keeping the deformed image grid smooth and invertible; 3. a shape based regularization term which works to keep the deformation anatomically reasonable when regions of interest are present in the reference image; and 4. a penalty term which is added to the optimization problem when controlling structures are used, aimed at deforming the selected structure in the reference image to the corresponding structure in the target image. Results: To validate ANACONDA, the authors have used 16 publically available thoracic 4DCT data sets for which target registration errors from several algorithms have been reported in the literature. On average for the 16 data sets, the target registration error is 1.17 ± 0.87 mm, Dice similarity coefficient is 0.98 for the two lungs, and image similarity, measured by the correlation coefficient, is 0.95. The authors have also validated ANACONDA using two pelvic cases and one head and neck case with planning CT and daily acquired CBCT. Each image has been contoured by a physician (radiation oncologist) or experienced radiation therapist. The results are an improvement with respect to rigid registration. However, for the head and neck case, the sample set is too small to show statistical significance. Conclusions: ANACONDA performs well in comparison with other algorithms. By including CT/CBCT data in the validation, the various aspects of the algorithm such as its ability to handle different modalities, large deformations, and air pockets are shown.

  10. The ANACONDA algorithm for deformable image registration in radiotherapy

    International Nuclear Information System (INIS)

    Purpose: The purpose of this work was to describe a versatile algorithm for deformable image registration with applications in radiotherapy and to validate it on thoracic 4DCT data as well as CT/cone beam CT (CBCT) data. Methods: ANAtomically CONstrained Deformation Algorithm (ANACONDA) combines image information (i.e., intensities) with anatomical information as provided by contoured image sets. The registration problem is formulated as a nonlinear optimization problem and solved with an in-house developed solver, tailored to this problem. The objective function, which is minimized during optimization, is a linear combination of four nonlinear terms: 1. image similarity term; 2. grid regularization term, which aims at keeping the deformed image grid smooth and invertible; 3. a shape based regularization term which works to keep the deformation anatomically reasonable when regions of interest are present in the reference image; and 4. a penalty term which is added to the optimization problem when controlling structures are used, aimed at deforming the selected structure in the reference image to the corresponding structure in the target image. Results: To validate ANACONDA, the authors have used 16 publically available thoracic 4DCT data sets for which target registration errors from several algorithms have been reported in the literature. On average for the 16 data sets, the target registration error is 1.17 ± 0.87 mm, Dice similarity coefficient is 0.98 for the two lungs, and image similarity, measured by the correlation coefficient, is 0.95. The authors have also validated ANACONDA using two pelvic cases and one head and neck case with planning CT and daily acquired CBCT. Each image has been contoured by a physician (radiation oncologist) or experienced radiation therapist. The results are an improvement with respect to rigid registration. However, for the head and neck case, the sample set is too small to show statistical significance. Conclusions: ANACONDA performs well in comparison with other algorithms. By including CT/CBCT data in the validation, the various aspects of the algorithm such as its ability to handle different modalities, large deformations, and air pockets are shown
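
    ANACONDA's objective is described above as a weighted sum of four nonlinear terms. The sketch below is only a schematic stand-in for that structure, not RaySearch's formulation: it keeps a sum-of-squared-differences similarity term and a finite-difference grid-smoothness term, and leaves the anatomy-based shape and controlling-structure terms as zero-weight placeholders.

    ```python
    import numpy as np

    def registration_objective(displacement, fixed, moving_warped,
                               w_sim=1.0, w_grid=0.1, w_shape=0.0, w_ctrl=0.0):
        """Schematic weighted-sum objective for deformable registration (illustrative only)."""
        # 1. Image similarity: squared intensity differences, assuming the moving image
        #    has already been warped by `displacement` upstream.
        similarity = np.sum((fixed.astype(float) - moving_warped.astype(float)) ** 2)
        # 2. Grid regularization: penalize rough displacement fields via finite differences.
        smoothness = sum(np.sum(g ** 2) for g in np.gradient(displacement))
        # 3./4. Shape-based regularization and controlling-structure penalty would use
        #       contoured regions of interest; left here as placeholders.
        shape_term = 0.0
        control_term = 0.0
        return (w_sim * similarity + w_grid * smoothness
                + w_shape * shape_term + w_ctrl * control_term)
    ```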

  11. Fingerprint matching algorithm for poor quality images

    Directory of Open Access Journals (Sweden)

    Vedpal Singh

    2015-04-01

    Full Text Available The main aim of this study is to establish an efficient platform for fingerprint matching on low-quality images. Generally, fingerprint matching approaches use minutiae points for authentication; however, this is not a reliable authentication method for low-quality images. To overcome this problem, the current study proposes a fingerprint matching methodology based on normalised cross-correlation, which improves performance, reduces miscalculations during authentication and decreases computational complexity. The error rate of the proposed method is 5.4%, which is less than the two-dimensional (2D) dynamic programming (DP) error rate of 5.6%, while Lee's method produces 5.9% and the combined method has a 6.1% error rate. The genuine accept rate at a 1% false accept rate is 89.3%, and at 0.1% it is 96.7%, which is higher. The outcome of this study suggests that the proposed methodology has a low error rate with minimal computational effort compared with existing methods such as Lee's method, 2D DP and the combined method.
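
    Normalised cross-correlation, the core of the method above, can be sketched as follows for a template compared against a patch of the same size. This is a generic illustration, not the authors' full matcher.

    ```python
    import numpy as np

    def ncc(patch, template):
        """Normalised cross-correlation between two equally sized grayscale patches."""
        p = patch.astype(float) - patch.mean()
        t = template.astype(float) - template.mean()
        denom = np.sqrt(np.sum(p ** 2) * np.sum(t ** 2))
        return float(np.sum(p * t) / denom) if denom > 0 else 0.0

    # Scores close to +1 indicate a strong match; matching slides the template over the
    # candidate fingerprint and keeps the position with the highest score.
    ```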

  12. Beam hardening correction algorithm in microtomography images

    Energy Technology Data Exchange (ETDEWEB)

    Sales, Erika S.; Lima, Inaya C.B.; Lopes, Ricardo T., E-mail: esales@con.ufrj.b, E-mail: ricardo@lin.ufrj.b [Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Lab. de Instrumentacao Nuclear; Assis, Joaquim T. de, E-mail: joaquim@iprj.uerj.b [Universidade do Estado do Rio de Janeiro (UERJ), Nova Friburgo, RJ (Brazil). Inst. Politecnico. Dept. de Engenharia Mecanica

    2009-07-01

    Quantification of the mineral density of bone samples is directly related to the attenuation coefficient of bone. The X-rays used in microtomography are polychromatic and have a moderately broad energy spectrum, which causes the low-energy X-rays passing through a sample to be absorbed, leading to a decrease in the attenuation coefficient and possibly to artifacts. This decrease in the attenuation coefficient is due to a process called beam hardening. In this work the beam hardening in microtomography images of vertebrae of Wistar rats subjected to a study of hyperthyroidism was corrected by the method of linearization of the projections, discretized using an energy spectrum, also called the spectrum of Herman. The results without beam hardening correction showed significant differences in bone volume, which could lead to a possible diagnosis of osteoporosis. The corrected data showed a decrease in bone volume, but this decrease was not significant within a 95% confidence interval. (author)

  13. Image Compression Algorithms for Fingerprint System

    Directory of Open Access Journals (Sweden)

    Preeti Pathak

    2010-05-01

    Full Text Available Fingerprints, which have been used for about 100 years, are the oldest biometric signs of identity. Humans have used fingerprints for personal identification for centuries and the validity of fingerprint identification is well established. In fact, fingerprint technology is so common in human identification that it has almost become the synonym of biometrics. Fingerprints are believed to be unique across individuals and across fingers of the same individual; even identical twins with similar DNA are believed to have different fingerprints. The analysis of fingerprints for matching purposes generally requires the comparison of several features of the print pattern. These include patterns, which are aggregate characteristics of ridges, and minutia points, which are unique features found within the patterns. It is also necessary to know the structure and properties of human skin in order to successfully employ some of the imaging technologies. A major approach for fingerprint recognition today is to extract minutiae from fingerprint images and to perform fingerprint matching based on the number of corresponding minutiae pairings. One of the most difficult problems in fingerprint recognition has been that the recognition performance is significantly influenced by the fingertip surface condition, which may vary depending on environmental or personal causes. Addressing this problem, this paper proposes some extra features that can be used to strengthen the present approaches followed in developing fingerprint recognition systems. To increase security and accuracy, an infrared technique and a technique that assigns a score value to each extracted minutia can be used.

  14. Beam hardening correction algorithm in microtomography images

    International Nuclear Information System (INIS)

    Quantification of the mineral density of bone samples is directly related to the attenuation coefficient of bone. The X-rays used in microtomography are polychromatic and have a moderately broad energy spectrum, which causes the low-energy X-rays passing through a sample to be absorbed, leading to a decrease in the attenuation coefficient and possibly to artifacts. This decrease in the attenuation coefficient is due to a process called beam hardening. In this work the beam hardening in microtomography images of vertebrae of Wistar rats subjected to a study of hyperthyroidism was corrected by the method of linearization of the projections, discretized using an energy spectrum, also called the spectrum of Herman. The results without beam hardening correction showed significant differences in bone volume, which could lead to a possible diagnosis of osteoporosis. The corrected data showed a decrease in bone volume, but this decrease was not significant within a 95% confidence interval. (author)

  15. Genetic Algorithm Processor for Image Noise Filtering Using Evolvable Hardware

    Directory of Open Access Journals (Sweden)

    K. Sri Rama Krishna, A. Guruva Reddy, M.N. Giri Prasad, K. Chandrabushan Rao & M. Madhavi

    2010-08-01

    Full Text Available General-purpose image filters lack the flexibility and adaptability needed for un-modeled noise types. On the contrary, evolutionary-algorithm-based filter architectures seem very promising due to their capability of providing solutions to hard design problems. Through this novel approach, it is possible to have an image filter with a completely different design style, produced by an evolutionary algorithm. In this context, an evolutionary-algorithm-based filter is designed in this paper, with the kernel or the whole circuit evolved automatically. The evolvable hardware architecture proposed in this paper can evolve filters without a priori information. The proposed filter architecture takes a spatial domain approach and uses an overlapping window to filter the signal. The approach chosen in this work is based on functional-level evolution, whose architecture includes nonlinear functions and uses a genetic algorithm to find the best filter configuration.

  16. A Study Of Image Segmentation Algorithms For Different Types Of Images

    Directory of Open Access Journals (Sweden)

    Krishna Kant Singh

    2010-09-01

    Full Text Available In computer vision, segmentation refers to the process of partitioning a digital image into multiple segments (sets of pixels, also known as superpixels). Image segmentation is typically used to locate objects and boundaries (lines, curves, etc.) in images. More precisely, image segmentation is the process of assigning a label to every pixel in an image such that pixels with the same label share certain visual characteristics. The result of image segmentation is a set of segments that collectively cover the entire image, or a set of contours extracted from the image (see edge detection). Each of the pixels in a region is similar with respect to some characteristic or computed property, such as color, intensity, or texture. Due to the importance of image segmentation, a number of algorithms have been proposed, but the algorithm should be chosen based on the input image to get the best results. In this paper the author presents a study of the various algorithms that are available for color images, text and gray scale images.

  17. State Of Art in Homomorphic Encryption Schemes

    OpenAIRE

    S. Sobitha Ahila; Dr. K.L.Shunmuganathan

    2014-01-01

    The demand for privacy of digital data and for algorithms handling more complex structures has increased exponentially over the last decade. However, a critical problem arises when there is a requirement for publicly computing with private data, or for modifying functions or algorithms in such a way that they are still executable while their privacy is ensured. This is where homomorphic cryptosystems can be used, since these systems enable computations with encrypted data. A f...
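
    As a concrete (and deliberately tiny) illustration of computing on encrypted data, textbook RSA is multiplicatively homomorphic: the product of two ciphertexts decrypts to the product of the plaintexts. The toy parameters below are the classic small example and are useless for real security; they only demonstrate the homomorphic property the record discusses.

    ```python
    # Textbook RSA with toy parameters: n = 61 * 53 = 3233, e = 17, d = 2753.
    n, e, d = 3233, 17, 2753

    m1, m2 = 42, 55
    c1, c2 = pow(m1, e, n), pow(m2, e, n)

    # Multiply the ciphertexts, then decrypt: the result is the product of the plaintexts mod n.
    product_cipher = (c1 * c2) % n
    assert pow(product_cipher, d, n) == (m1 * m2) % n
    print("decrypted product:", pow(product_cipher, d, n))   # 42 * 55 = 2310
    ```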

  18. A robust algorithm for sky background computation in CCD images

    CERN Document Server

    Patat, F

    2003-01-01

    In this paper we present a non-interactive algorithm to estimate a representative value for the sky background on CCD images. The method we have devised uses the mode as a robust estimator of the background brightness in sub-windows distributed across the input frame. The presence of contaminating objects is detected through the study of the local intensity distribution function and the perturbed areas are rejected using a statistical criterion which was derived from numerical simulations. The technique has been extensively tested on a large amount of images and it is suitable for fully automatic processing of large data volumes. The implementation we discuss here has been optimized for the ESO-FORS1 instrument, but it can be easily generalized to all CCD imagers with a sufficiently large field of view. The algorithm has been successfully used for the UBVRI ESO-Paranal night sky brightness survey (Patat 2003).
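
    The mode-in-sub-windows idea can be sketched in a few lines. The grid size, histogram binning and the final median combination below are arbitrary illustrative choices, not the statistical rejection criterion tuned for FORS1 in the record.

    ```python
    import numpy as np

    def sky_background(image, grid=8):
        """Estimate the sky level as the median of per-sub-window histogram modes."""
        h, w = image.shape
        modes = []
        for ys in np.array_split(np.arange(h), grid):
            for xs in np.array_split(np.arange(w), grid):
                window = image[np.ix_(ys, xs)].ravel()
                counts, edges = np.histogram(window, bins=64)
                peak = np.argmax(counts)
                modes.append(0.5 * (edges[peak] + edges[peak + 1]))
        # The original method instead rejects contaminated sub-windows with a
        # statistical criterion before combining the remaining modes.
        return float(np.median(modes))
    ```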

  19. A Distortion Input Parameter in Image Denoising Algorithms with Wavelets

    Directory of Open Access Journals (Sweden)

    Anisia GOGU

    2009-07-01

    Full Text Available The problem of image denoising based on wavelets is considered. The paper proposes an image denoising method that imposes a distortion input parameter instead of a threshold. The method has two algorithms. The first one runs off-line: it is applied to a prototype of the image class and builds a specific dependency, linear or nonlinear, between the final desired distortion and the required probability of the detail coefficients. The second algorithm directly applies the denoising with a threshold computed from the previous step. The threshold is estimated by using the probability density function of the detail coefficients and by imposing the probability of the coefficients that will be kept. The obtained results are at the same quality level as other well-known methods.

  20. Analysis of Fast- ICA Algorithm for Separation of Mixed Images

    Directory of Open Access Journals (Sweden)

    Tanmay Awasthy

    2013-10-01

    Full Text Available Independent component analysis (ICA) is a recently developed method in which the aim is to find a linear representation of non-Gaussian data such that the components are statistically independent, or as independent as possible. Such techniques are actively used in the study of both statistical image processing and unsupervised neural learning applications. This paper presents the fast independent component analysis (FastICA) algorithm for the separation of mixed images. To solve blind signal separation problems, the independent component analysis approach uses the statistical independence of the source signals. The paper focuses on the theory and methods of ICA, in contrast to classical transformations, along with the application of this method to blind source separation. To illustrate the algorithm, the unmixing process is visualized with a set of images, and simulations are presented to express the results of our analysis.
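
    For readers who want to try the separation, scikit-learn ships a FastICA implementation. The sketch below unmixes two synthetic signal mixtures; flattened images would be handled the same way, with one observation per pixel. The mixing matrix and signals are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    t = np.linspace(0, 8, 2000)
    sources = np.c_[np.sin(2 * t), np.sign(np.cos(3 * t))]   # two independent sources
    mixing = np.array([[1.0, 0.5], [0.4, 1.0]])
    observed = sources @ mixing.T                             # observed mixtures

    ica = FastICA(n_components=2, random_state=0)
    recovered = ica.fit_transform(observed)   # estimated sources, up to scale and order
    print("estimated mixing matrix:\n", ica.mixing_)
    ```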

  1. Multi-node Cooperative Image Mosaicking Algorithm for Wireless Multimedia Sensor Networks

    Directory of Open Access Journals (Sweden)

    Xiong Zhe-Yuan

    2013-01-01

    Full Text Available Panoramic images are important for video surveillance in Wireless Multimedia Sensor Networks (WMSN). Traditional image mosaicking approaches involve considerable energy and bandwidth consumption, which is not suitable for resource-constrained WMSN. To make full use of the constrained resources when generating high-resolution images with a wide field of view, a cooperative image mosaicking algorithm among multiple video nodes is proposed. An image block searching algorithm is applied for image registration to decrease the energy consumption, the sum of absolute differences is adopted to improve the accuracy of image registration, and a weighted mean algorithm is applied for image stitching. The volume of transmitted data is decreased after image mosaicking, which efficiently reduces the network load. Simulation results demonstrate that the computational complexity of the proposed algorithm is lower than that of other image mosaicking algorithms, while maintaining image registration accuracy and image quality.
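
    The sum of absolute differences used for registration above is a simple block-matching criterion; a minimal sketch of an exhaustive SAD search is shown below. The block size and search range are arbitrary, and the real algorithm restricts the search to save energy.

    ```python
    import numpy as np

    def sad_search(reference, target, block, top_left, search=16):
        """Find the offset in `target` whose block best matches `reference` at `top_left`."""
        y0, x0 = top_left
        ref_block = reference[y0:y0 + block, x0:x0 + block].astype(float)
        best, best_offset = np.inf, (0, 0)
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                y, x = y0 + dy, x0 + dx
                if y < 0 or x < 0 or y + block > target.shape[0] or x + block > target.shape[1]:
                    continue
                cand = target[y:y + block, x:x + block].astype(float)
                score = np.sum(np.abs(ref_block - cand))   # sum of absolute differences
                if score < best:
                    best, best_offset = score, (dy, dx)
        return best_offset, best
    ```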

  2. Fast Correction Algorithm Research of Image Geometric Distortion in the Image Tracking

    Directory of Open Access Journals (Sweden)

    Tan Lian

    2013-01-01

    Full Text Available In high-performance image tracking systems, image geometric distortion is an important factor restricting the accuracy of the algorithm. In order to ensure accuracy and minimize computation time, this study proposes a correction method. Radial distortion is the main component of image distortion; through a coordinate transformation, the camera distortion model is obtained. The study briefly introduces the principles of geometric image distortion and, on the basis of analyzing polynomial algorithms for the coordinate conversion, develops a region-wise non-uniform distortion correction algorithm. The fixed-focus image is mapped to the corresponding model and non-uniformly divided into rectangular areas, within each of which a polynomial correction is applied under the higher-order polynomial model. By comparing the correction effect and the time cost of the region-wise first-order method with a third-order polynomial correction, the validity of the proposed method is verified.
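
    The radial distortion model mentioned above is commonly written with even-order polynomial terms. The sketch below maps undistorted normalized coordinates to distorted ones under assumed coefficients k1 and k2; it is a generic model, not the paper's region-wise correction.

    ```python
    def apply_radial_distortion(x, y, k1=-0.2, k2=0.05, cx=0.0, cy=0.0):
        """Map undistorted normalized coords to distorted coords: r_d = r * (1 + k1*r^2 + k2*r^4)."""
        xr, yr = x - cx, y - cy
        r2 = xr ** 2 + yr ** 2
        scale = 1.0 + k1 * r2 + k2 * r2 ** 2
        return cx + xr * scale, cy + yr * scale

    # Correction inverts this mapping, e.g. by solving for the undistorted radius numerically
    # or, as in the paper, by fitting region-wise polynomial approximations.
    ```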

  3. Secure Transmission and Recovery of Embedded Patient Information from Biomedical Images of Different Modalities through a Combination of Cryptography and Watermarking

    OpenAIRE

    Subhajit Koley; Koushik Pal; Goutam Ghosh; Mahua Bhattacharya

    2014-01-01

    In this paper a new type of information hiding technique for biomedical images is proposed through a combination of cryptography and digital watermarking to achieve enhanced confidentiality, authenticated data storage and secure transmission. Here the patient's name and the doctor's name are considered as the patient's information, which is encrypted using cryptography and embedded in the scan image of that patient through watermarking. The RSA algorithm is used for encryption and higher order bit LSB r...
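
    The LSB step referred to above is truncated in the record; the fragment below is only a generic least-significant-bit embedding of a byte string into a grayscale image, with the RSA encryption assumed to have happened beforehand.

    ```python
    import numpy as np

    def embed_lsb(image, payload: bytes):
        """Hide payload bits in the least significant bit of each pixel (capacity permitting)."""
        bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
        flat = image.flatten()
        if bits.size > flat.size:
            raise ValueError("payload too large for this cover image")
        flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
        return flat.reshape(image.shape)

    def extract_lsb(stego, n_bytes):
        """Read back n_bytes hidden by embed_lsb."""
        bits = stego.flatten()[:n_bytes * 8] & 1
        return np.packbits(bits).tobytes()
    ```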

  4. Extending the Visual Cryptography Algorithm Without Removing Cover Images

    Directory of Open Access Journals (Sweden)

    Dr. V.R. Anitha, Dilip Kumar Kotthapalli

    2013-04-01

    Full Text Available Visual cryptography is a simple and powerful method that can provide high security for confidential information. The technique generates noise-like random pixels on share images to hide secret information, which is decrypted when the shares are overlaid; this is known as the conventional visual secret sharing scheme. It suffers from a management problem, because of which dealers cannot visually identify each share. This problem is solved by the extended visual cryptography scheme (EVCS), which adds a meaningful cover image to each share. But removing the extra cover image introduces extra noise or degrades the quality of the hidden image. Hence we extend the visual cryptography algorithm without removing the cover images, which reduces the pixel expansion problem and the share matching time.

  5. An automatic image fusion algorithm for unregistered multiply multi-focus images

    Science.gov (United States)

    Liu, Yan; Yu, Feihong

    2015-04-01

    The multi-focus image fusion technique provides a promising way to extend the depth of field of defocused images by combining multiple images with diverse focuses into a single focused one. In this paper, we present a robust and automated algorithm for the fusion of unregistered multiple multi-focus images. The motivation of our method lies in the fact that the source images are assumed to be perfectly aligned in the majority of previous research, an assumption that is difficult to satisfy in many practical situations. Hence, an image registration method for multi-focus images is discussed in this paper. We choose one multi-focus image as the reference for the registration process using entropy theory. The Speeded Up Robust Features (SURF) detector with the Binary Robust Invariant Scalable Keypoints (BRISK) descriptor is used in the feature matching process, and an improved RANdom SAmple Consensus (RANSAC) algorithm is adopted to reject incorrect matches. The registered images are fused using the stationary wavelet transform (SWT) with the sym5 wavelet basis. The experimental results prove that the proposed algorithm achieves better performance for unregistered multiple multi-focus images, and that it is especially robust to scale and rotation transformations compared with the traditional direct fusion method.

  6. Knapsack Based ECC Encryption and Decryption

    Directory of Open Access Journals (Sweden)

    R. Rajaram Ramasamy

    2009-11-01

    Full Text Available Elliptic Curve Cryptography provides a secure means of exchanging keys among communicating hosts using the Diffie-Hellman key exchange algorithm. Encryption and decryption of texts and messages have also been attempted. This paper presents an implementation of ECC that first transforms the message into an affine point on the elliptic curve and then applies the knapsack algorithm to the ECC-encrypted message over the finite field GF(p). In ECC we normally start with an affine point $P_m(x,y)$ that lies on the elliptic curve. In this paper we illustrate encryption/decryption involving the ASCII values of the characters constituting the message, which is then subjected to the knapsack algorithm. We compare the proposed algorithm with the RSA algorithm and show that our algorithm is better due to the high degree of sophistication and complexity involved; it is almost infeasible to attempt a brute force attack. Moreover, only one parameter, namely the knapsack vector $a_i$, needs to be kept secret. On the contrary, in RSA three parameters, namely the modulus n and its factors p and q, need to be kept secret.

  7. Robust digital image inpainting algorithm in the wireless environment

    Science.gov (United States)

    Karapetyan, G.; Sarukhanyan, H. G.; Agaian, S. S.

    2014-05-01

    Image or video inpainting is the process/art of retrieving missing portions of an image in a way that is undetectable by an ordinary observer, without introducing undesirable artifacts. An image/video can be damaged due to a variety of factors, such as deterioration due to scratches, laser dazzling effects, wear and tear, dust spots, loss of data when transmitted through a channel, etc. Applications of inpainting include image restoration (removing laser dazzling effects, dust spots, date, text, time, etc.), image synthesis (texture synthesis), completing panoramas, image coding, wireless transmission (recovery of the missing blocks), digital culture protection, image de-noising, fingerprint recognition, and film special effects and production. Most inpainting methods can be classified into two key groups: global and local methods. Global methods are used for generating large image regions from samples, while local methods are used for filling in small image gaps. Each method has its own advantages and limitations. For example, global inpainting methods perform well on textured image retrieval, whereas the classical local methods perform poorly. In addition, some of the techniques are computationally intensive, exceeding the capabilities of most currently used mobile devices. In general, existing inpainting algorithms are not suitable for the wireless environment. This paper presents a new and efficient scheme that combines the advantages of both local and global methods into a single algorithm. In particular, it introduces a blind inpainting model that solves the above problems by adaptively selecting the support area for the inpainting scheme. The proposed method is applied to various challenging image restoration tasks, including recovering old photos, recovering missing data on real and synthetic images, and recovering the specular reflections in endoscopic images. A number of computer simulations demonstrate the effectiveness of our scheme and also illustrate the main properties and implementation steps of the presented algorithm. Furthermore, the simulation results show that the presented method is among the state-of-the-art and compares favorably against many available methods in the wireless environment. Robustness in the wireless environment with respect to the shape of the manually selected "marked" region is also illustrated. Currently, we are working on the expansion of this work to video and 3-D data.

  8. Computer vision algorithms in DNA ploidy image analysis

    Science.gov (United States)

    Alexandratou, Eleni; Sofou, Anastasia; Papasaika, Haris; Maragos, Petros; Yova, Dido; Kavantzas, Nikolaos

    2006-02-01

    The high incidence and mortality rates of prostate cancer have stimulated research for prevention, early diagnosis and appropriate treatment. The DNA ploidy status of tumour cells is an important parameter with diagnostic and prognostic significance. In the current study, DNA ploidy analysis was performed using an image cytometry technique together with digital image processing and analysis. Tissue samples from prostate patients were stained using the Feulgen method. Images were acquired using a digital imaging microscopy system consisting of an Olympus BX-50 microscope equipped with a color CCD camera. Segmentation of such images is not a trivial problem because of the uneven background, intensity variations within the nuclei and cell clustering. In this study, specific algorithms were developed in Matlab based on the most prominent image segmentation approaches that emanate from the field of mathematical morphology, focusing on region-based watershed segmentation. First, the biomedical images were simplified by non-linear filtering (alternating sequential filters, levelings); next, image features such as gradient information and markers were extracted to guide the segmentation process. The extracted markers are used as seeds, and the watershed transformation is applied to the gradient of the filtered image. Image flooding is performed isotropically from the markers using hierarchical queues, following the Beucher and Meyer methodology. The developed algorithms successfully segmented the cells from the background and from cell clusters as well. To characterize the nuclei, we attempt to derive a set of effective color features. By analyzing more than 50 color features, we have found that the set consisting of hue, saturation-weighted hue, I1=(R+G+B)/3, I2=(R-B), I3=(2G-R-B)/2, the Karhunen-Loeve transformation and an energy operator is effective.
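
    The marker-controlled watershed flow described above maps closely onto the scikit-image API. The fragment below is a generic sketch: the sample image, smoothing and marker thresholds are placeholders, not the study's tuned Matlab pipeline.

    ```python
    import numpy as np
    from skimage import data, filters, segmentation

    # Grayscale sample with blob-like objects; in the study this would be the filtered Feulgen image.
    image = data.coins()

    # Gradient image to flood, plus markers from conservative background/foreground thresholds.
    gradient = filters.sobel(image)
    markers = np.zeros_like(image, dtype=np.int32)
    markers[image < 30] = 1     # assumed background threshold
    markers[image > 150] = 2    # assumed object threshold

    labels = segmentation.watershed(gradient, markers)
    print("segments found:", len(np.unique(labels)))
    ```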

  9. Encrypted Data Storage in EGEE

    CERN Document Server

    Frohner, Ákos

    2006-01-01

    The medical community is routinely using clinical images and associated medical data for diagnosis, intervention planning and therapy follow-up. Medical imaging is producing an increasing number of digital images for which computerized archiving, processing and analysis are needed. Grids are promising infrastructures for managing and analyzing these huge medical databases. Given the sensitive nature of medical images, however, practitioners are often reluctant to use distributed systems. Security is often implemented by isolating the imaging network from the outside world inside hospitals. Given the wide-scale distribution of grid infrastructures and their multiple administrative entities, the level of security for manipulating medical data should be particularly high. In this presentation we describe the architecture of a solution, the gLite Encrypted Data Storage (EDS), which was developed in the framework of Enabling Grids for E-sciencE (EGEE), a project of the European Commission (contract number INFSO--508...

  10. A Colour Image Quantization Algorithm for Time-Constrained Applications

    Directory of Open Access Journals (Sweden)

    Wattanapong KURDTHONGMEE

    2005-06-01

    Full Text Available Many techniques have been proposed to quantize a digital colour image in order to reduce the number of representative colours so that the image is suitable for presentation on different types of display screens. In addition, such techniques have been used to significantly reduce the amount of image data that must be transferred over a communication network. Most of the published techniques are targeted at implementation on a general-purpose multitasking computer with few restrictions on time and resource utilization. Their drawback is that they cannot fulfill the real-time and limited-resource requirements of some applications; in addition, most of them are too complex for hardware realization. In this paper, an algorithm which is more suitable for time-critical applications, and which is simple to implement on FPGA (Field Programmable Gate Array) platforms, is proposed, and the details of its implementation and experimentation are presented. The key point of the proposed algorithm is that it utilizes the weighted sum of the nearest distances along the axis under consideration, instead of the squared Euclidean distance, which is nontrivial to calculate, to find the axis to split. The proposed algorithm also shows that, by reducing the number of subspaces considered during the calculation of the variance representative value from 8 to 2, the quality of the quantized images remains comparable to previously proposed approaches. This makes it possible to further speed up the quantization algorithm.

  11. An Iterative CT Reconstruction Algorithm for Fast Fluid Flow Imaging.

    Science.gov (United States)

    Van Eyndhoven, Geert; Batenburg, K Joost; Kazantsev, Daniil; Van Nieuwenhove, Vincent; Lee, Peter D; Dobson, Katherine J; Sijbers, Jan

    2015-11-01

    The study of fluid flow through solid matter by computed tomography (CT) imaging has many applications, ranging from petroleum and aquifer engineering to biomedical, manufacturing, and environmental research. To avoid motion artifacts, current experiments are often limited to slow fluid flow dynamics. This severely limits the applicability of the technique. In this paper, a new iterative CT reconstruction algorithm for improved temporal/spatial resolution in the imaging of fluid flow through solid matter is introduced. The proposed algorithm exploits prior knowledge in two ways. First, the time-varying object is assumed to consist of stationary (the solid matter) and dynamic regions (the fluid flow). Second, the attenuation curve of a particular voxel in the dynamic region is modeled by a piecewise constant function over time, which is in accordance with the actual advancing fluid/air boundary. Quantitative and qualitative results on different simulation experiments and a real neutron tomography data set show that, in comparison with the state-of-the-art algorithms, the proposed algorithm allows reconstruction from substantially fewer projections per rotation without image quality loss. Therefore, the temporal resolution can be substantially increased, and thus fluid flow experiments with faster dynamics can be performed. PMID:26259219

  12. Hypercube algorithms suitable for image understanding in uncertain environments

    International Nuclear Information System (INIS)

    Computer vision in a dynamic environment needs to be fast and able to tolerate incomplete or uncertain intermediate results. An appropriately chosen representation coupled with a parallel architecture addresses both concerns. The wide range of numerical and symbolic processing needed for robust computer vision can only be achieved through a blend of SIMD and MIMD processing techniques. The 1024 element hypercube architecture has these capabilities, and was chosen as the test-bed hardware for development of highly parallel computer vision algorithms. This paper presents and analyzes parallel algorithms for color image segmentation and edge detection. These algorithms are part of a recently developed computer vision system which uses multiple valued logic to represent uncertainty in the imaging process and in intermediate results. Algorithms for the extraction of three dimensional properties of objects using dynamic scene analysis techniques within the same framework are examined. Results from experimental studies using a 1024 element hypercube implementation of the algorithm as applied to a series of natural scenes are reported.

  13. Multispectral Image Clustering Using Enhanced Genetic k-Means Algorithm

    Directory of Open Access Journals (Sweden)

    K. Venkatalakshmi

    2007-01-01

    Full Text Available An attempt has been made in this study to find globally optimal cluster centers for multispectral images with an Enhanced Genetic k-Means algorithm. The idea is to avoid the expensive crossover or fitness computations needed to produce valid clusters in a pure GA and to improve the convergence time. The drawback of using a pure GA for this problem is the use of an expensive crossover or fitness function to produce valid (non-empty) clusters. To circumvent this disadvantage, hybridization of GA with k-Means, known as Genetic k-Means, has already been proposed. The Genetic k-Means Algorithm (GKA) always finds the globally optimal cluster centers, but its drawback is the use of an expensive fitness function involving truncation. The Enhanced GKA alleviates this problem by using a simple fitness function with an incremental factor. A k-Means operator (one step of the k-Means algorithm), used in GKA as a search operator, is adopted in this study; in Enhanced GKA the mutation involves less computation than the mutation used in GKA. In order to avoid invalid clusters formed during the iterations, empty clusters are converted into singleton clusters by adding a randomly selected data item until none of the clusters is empty. The results show that the proposed algorithm converges to the global optimum in fewer generations than the conventional GA and also has lower computational complexity than GKA. It proves to be an effective clustering algorithm for multispectral images.

  14. STEGANOGRAPHY USING RC4 ALGORITHM

    Directory of Open Access Journals (Sweden)

    Prayag S. Desale

    2015-06-01

    Full Text Available Steganography, the art of hiding data inside an image, is discussed in this paper. This is done using encryption and decryption techniques. Internet security is a big issue nowadays, especially in areas where highly confidential and secret data need to be transferred and could be hacked. It is therefore necessary to provide a high level of security for the secret data. We have built a device that can capture an image in real time and send it over the internet, and that can be used to transfer secret data over the internet. Using this device, encrypted data is hidden inside an image with the help of the RC4 algorithm and then transferred to its destination, where it is decrypted and the original message is obtained and delivered.
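
    A minimal sketch of the described pipeline, RC4 encryption of the secret message followed by least-significant-bit embedding in a cover image, is shown below. The key, the message, and the synthetic 8-bit grayscale cover image are assumptions for illustration.

```python
import numpy as np

def rc4(key: bytes, data: bytes) -> bytes:
    """RC4 stream cipher (KSA + PRGA); encryption and decryption are identical."""
    S = list(range(256))
    j = 0
    for i in range(256):                      # key-scheduling algorithm
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out, i, j = bytearray(), 0, 0
    for byte in data:                         # pseudo-random generation algorithm
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

def embed_lsb(image: np.ndarray, payload: bytes) -> np.ndarray:
    """Hide the payload bits in the least significant bit of each pixel."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = image.flatten()
    assert bits.size <= flat.size, "cover image too small"
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(image.shape)

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # stand-in cover image
cipher = rc4(b"secret-key", b"confidential message")
stego = embed_lsb(cover, cipher)
```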

  15. An algorithm to estimate the object support in truncated images

    International Nuclear Information System (INIS)

    Purpose: Truncation artifacts in CT occur if the object to be imaged extends past the scanner field of view (SFOV). These artifacts impede diagnosis and could possibly introduce errors in dose plans for radiation therapy. Several approaches exist for correcting truncation artifacts, but existing correction algorithms do not accurately recover the skin line (or support) of the patient, which is important in some dose planning methods. The purpose of this paper was to develop an iterative algorithm that recovers the support of the object. Methods: The authors assume that the truncated portion of the image is made up of soft tissue of uniform CT number and attempt to find a shape consistent with the measured data. Each known measurement in the sinogram is interpreted as an estimate of missing mass along a line. An initial estimate of the object support is generated by thresholding a reconstruction made using a previous truncation artifact correction algorithm (e.g., water cylinder extrapolation). This object support is iteratively deformed to reduce the inconsistency with the measured data. The missing data are estimated using this object support to complete the dataset. The method was tested on simulated and experimentally truncated CT data. Results: The proposed algorithm produces a better defined skin line than water cylinder extrapolation. On the experimental data, the RMS error of the skin line is reduced by about 60%. For moderately truncated images, some soft tissue contrast is retained near the SFOV. As the extent of truncation increases, the soft tissue contrast outside the SFOV becomes unusable although the skin line remains clearly defined, and in reformatted images it varies smoothly from slice to slice as expected. Conclusions: The support recovery algorithm provides a more accurate estimate of the patient outline than thresholded, basic water cylinder extrapolation, and may be preferred in some radiation therapy applications

  16. Remote sensing image registration algorithm based on circle correlation

    Science.gov (United States)

    Yang, Changqing; Xie, Tianhua

    2015-08-01

    To estimate scaling parameters larger than 1.3, a robust remote sensing image registration algorithm is proposed. It decomposes the corner feature windows into new circular feature curve sequences. Exploiting scaling invariance, the corresponding points are obtained by calculating the correlation of the curves. Finally, linear least squares is used to estimate the transform parameters. The experiments show that the corresponding point pair error is less than 5%, and the parameter errors are less than 0.5 pixel, 0.1 degree, and 0.1% in scale. In particular, when the scaling is larger than 1.3, the algorithm still performs well.
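
    The final least-squares step can be sketched as follows: given hypothetical corresponding point pairs, a similarity transform (scale s, rotation theta, translation tx, ty) is estimated by solving a linear system in (a, b, tx, ty) with a = s*cos(theta) and b = s*sin(theta). The point data below are made up for illustration.

```python
import numpy as np

def estimate_similarity(src, dst):
    """Least-squares similarity transform dst ~ s*R(theta)*src + t.
    Solve for (a, b, tx, ty) with a = s*cos(theta), b = s*sin(theta)."""
    A, y = [], []
    for (x, yv), (u, v) in zip(src, dst):
        A.append([x, -yv, 1, 0]); y.append(u)   # u = a*x - b*y + tx
        A.append([yv,  x, 0, 1]); y.append(v)   # v = b*x + a*y + ty
    (a, b, tx, ty), *_ = np.linalg.lstsq(np.asarray(A, float),
                                         np.asarray(y, float), rcond=None)
    scale = np.hypot(a, b)
    theta = np.degrees(np.arctan2(b, a))
    return scale, theta, tx, ty

src = np.array([[10, 20], [50, 80], [120, 40], [200, 150]], float)
s, th = 1.4, np.radians(12.0)
R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
dst = s * src @ R.T + np.array([5.0, -3.0])
print(estimate_similarity(src, dst))   # approximately (1.4, 12.0, 5.0, -3.0)
```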

  17. Spatial correlation genetic algorithm for fractal image compression

    Energy Technology Data Exchange (ETDEWEB)

    Wu, M.-S. [Department of Electrical Engineering, National Sun Yet-Sen University, 70 Lien-Hai Rd., Kaohsiung 804, Taiwan (China)] e-mail: d9131806@student.nsysu.edu.tw; Teng, W.-C. [Department of Electrical Engineering, National Sun Yet-Sen University, 70 Lien-Hai Rd., Kaohsiung 804, Taiwan (China)] e-mail: m923010008@student.nsysu.edu.tw; Jeng, J.-H. [Department of Information Engineering, I-Shou University, Kaohsiung, Taiwan (China)] e-mail: jjeng@isu.edu.tw; Hsieh, J.-G. [Department of Electrical Engineering, National Sun Yet-Sen University, 70 Lien-Hai Rd., Kaohsiung 804, Taiwan (China)] e-mail: jghsieh@mail.ee.nsysu.edu.tw

    2006-04-01

    Fractal image compression explores the self-similarity property of a natural image and utilizes the partitioned iterated function system (PIFS) to encode it. This technique is of great interest both in theory and application. However, it is time-consuming in the encoding process and such drawback renders it impractical for real time applications. The time is mainly spent on the search for the best-match block in a large domain pool. In this paper, a spatial correlation genetic algorithm (SC-GA) is proposed to speed up the encoder. There are two stages for the SC-GA method. The first stage makes use of spatial correlations in images for both the domain pool and the range pool to exploit local optima. The second stage is operated on the whole image to explore more adequate similarities if the local optima are not satisfactory. With the aid of spatial correlation in images, the encoding time is 1.5 times faster than that of the traditional genetic algorithm method, while the quality of the retrieved image is almost the same. Moreover, about half of the matched blocks come from the correlated space, so fewer bits are required to represent the fractal transform and therefore the compression ratio is also improved.

  18. Spatial correlation genetic algorithm for fractal image compression

    International Nuclear Information System (INIS)

    Fractal image compression explores the self-similarity property of a natural image and utilizes the partitioned iterated function system (PIFS) to encode it. This technique is of great interest both in theory and application. However, it is time-consuming in the encoding process and such drawback renders it impractical for real time applications. The time is mainly spent on the search for the best-match block in a large domain pool. In this paper, a spatial correlation genetic algorithm (SC-GA) is proposed to speed up the encoder. There are two stages for the SC-GA method. The first stage makes use of spatial correlations in images for both the domain pool and the range pool to exploit local optima. The second stage is operated on the whole image to explore more adequate similarities if the local optima are not satisfactory. With the aid of spatial correlation in images, the encoding time is 1.5 times faster than that of the traditional genetic algorithm method, while the quality of the retrieved image is almost the same. Moreover, about half of the matched blocks come from the correlated space, so fewer bits are required to represent the fractal transform and therefore the compression ratio is also improved

  19. Algorithm for Improved Image Compression and Reconstruction Performances

    Directory of Open Access Journals (Sweden)

    G.Chenchu Krishnaiah

    2012-04-01

    Full Text Available The energy efficient wavelet image transform algorithm (EEWITA), which is capable of evolving non-wavelet transforms, consistently outperforms wavelets when applied to a large class of images subject to quantization error. An EEWITA can evolve a set of coefficients describing a matched forward and inverse transform pair that can be used at each level of a multi-resolution analysis (MRA) transform to minimize the original image size and the mean squared error (MSE) in the reconstructed image. Simulation results indicate that the benefit of using evolved transforms instead of wavelets increases in proportion to the quantization level. Furthermore, coefficients evolved against a single representative training image generalize to effectively reduce MSE for a broad class of reconstructed images. In this paper an attempt has been made to compare the performance of various wavelets and non-wavelets. Experimental results were obtained using different types of wavelets and non-wavelets for different types of photographic images (color and monochrome). These results show that the EEWITA method is competitive with well-known methods for lossy image compression in terms of compression ratio (CR), mean squared error (MSE), peak signal-to-noise ratio (PSNR), encoding time, decoding time, and transforming or decomposition time. This analysis will help in choosing the wavelet for the decomposition of images as required in particular applications.
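
    The evaluation criteria mentioned above (CR, MSE, PSNR) can be computed as in the following sketch; the 8-bit test images and the compressed byte count are placeholders, not data from the paper.

```python
import numpy as np

def mse(original, reconstructed):
    return float(np.mean((original.astype(float) - reconstructed.astype(float)) ** 2))

def psnr(original, reconstructed, peak=255.0):
    m = mse(original, reconstructed)
    return float('inf') if m == 0 else 10.0 * np.log10(peak ** 2 / m)

def compression_ratio(original_bytes, compressed_bytes):
    return original_bytes / compressed_bytes

original = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
reconstructed = np.clip(original + np.random.randint(-3, 4, original.shape),
                        0, 255).astype(np.uint8)
print(mse(original, reconstructed), psnr(original, reconstructed))
print(compression_ratio(256 * 256, 8192))   # hypothetical compressed size in bytes
```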

  20. Majorization-minimization algorithms for wavelet-based image restoration.

    Science.gov (United States)

    Figueiredo, Mário A T; Bioucas-Dias, José M; Nowak, Robert D

    2007-12-01

    Standard formulations of image/signal deconvolution under wavelet-based priors/regularizers lead to very high-dimensional optimization problems involving the following difficulties: the non-Gaussian (heavy-tailed) wavelet priors lead to objective functions which are nonquadratic, usually nondifferentiable, and sometimes even nonconvex; the presence of the convolution operator destroys the separability which underlies the simplicity of wavelet-based denoising. This paper presents a unified view of several recently proposed algorithms for handling this class of optimization problems, placing them in a common majorization-minimization (MM) framework. One of the classes of algorithms considered (when using quadratic bounds on nondifferentiable log-priors) shares the infamous "singularity issue" (SI) of "iteratively reweighted least squares" (IRLS) algorithms: the possibility of having to handle infinite weights, which may cause both numerical and convergence issues. In this paper, we prove several new results which strongly support the claim that the SI does not compromise the usefulness of this class of algorithms. Exploiting the unified MM perspective, we introduce a new algorithm, resulting from using l1 bounds for nonconvex regularizers; the experiments confirm the superior performance of this method, when compared to the one based on quadratic majorization. Finally, an experimental comparison of the several algorithms reveals their relative merits for different standard types of scenarios. PMID:18092597
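
    As a rough sketch of the MM idea applied to an l1-regularized deconvolution problem: majorizing the quadratic data term with a separable surrogate yields the familiar iterative soft-thresholding update. The toy 1-D spike-deconvolution setup below (with an identity basis standing in for a wavelet frame, and assumed kernel and regularization values) illustrates the framework, not the paper's algorithms.

```python
import numpy as np

def soft_threshold(x, tau):
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def mm_deconvolve(y, h, lam, alpha, n_iter=300):
    """Minimize 0.5*||y - h*x||^2 + lam*||x||_1 via majorization-minimization:
    the data term is majorized by a separable quadratic with curvature alpha,
    giving the iterative soft-thresholding update."""
    H = lambda x: np.convolve(x, h, mode='same')         # forward blur
    Ht = lambda r: np.convolve(r, h[::-1], mode='same')  # adjoint (correlation)
    x = np.zeros_like(y)
    for _ in range(n_iter):
        x = soft_threshold(x + Ht(y - H(x)) / alpha, lam / alpha)
    return x

# Sparse spike train blurred by a small Gaussian kernel plus noise (assumed data).
rng = np.random.default_rng(1)
x_true = np.zeros(200); x_true[[30, 90, 150]] = [1.0, -0.7, 1.3]
h = np.exp(-0.5 * (np.arange(-4, 5) / 1.5) ** 2); h /= h.sum()
y = np.convolve(x_true, h, mode='same') + 0.01 * rng.standard_normal(200)
x_hat = mm_deconvolve(y, h, lam=0.02, alpha=1.0)
```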

  1. An optimal point spread function subtraction algorithm for high-contrast imaging: a demonstration with angular differential imaging

    Energy Technology Data Exchange (ETDEWEB)

    Lafreniere, D; Marois, C; Doyon, R; Artigau, E; Nadeau, D

    2006-09-19

    Direct imaging of exoplanets is limited by bright quasi-static speckles in the point spread function (PSF) of the central star. This limitation can be reduced by subtraction of reference PSF images. We have developed an algorithm to construct an optimal reference PSF image from an arbitrary set of reference images. This image is built as a linear combination of all available images and is optimized independently inside multiple subsections of the image to ensure that the absolute minimum residual noise is achieved within each subsection. The algorithm developed is completely general and can be used with many high contrast imaging observing strategies, such as angular differential imaging (ADI), roll subtraction, spectral differential imaging, reference star observations, etc. The performance of the algorithm is demonstrated for ADI data. It is shown that for this type of data the new algorithm provides a gain in sensitivity of up to a factor of 3 at small separation over the algorithm previously used.
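
    The core step, building an optimized reference as a linear combination of the available PSF images inside one image subsection, reduces to a linear least-squares problem, sketched below on synthetic patches; this is a minimal illustration, not the authors' full pipeline.

```python
import numpy as np

def optimal_reference(target_patch, reference_patches):
    """Find coefficients c minimizing ||target - sum_k c_k * reference_k||^2
    inside one image subsection, and return the combined reference patch."""
    R = np.stack([r.ravel() for r in reference_patches], axis=1)  # pixels x K
    c, *_ = np.linalg.lstsq(R, target_patch.ravel(), rcond=None)
    return (R @ c).reshape(target_patch.shape), c

rng = np.random.default_rng(2)
refs = [rng.normal(size=(32, 32)) for _ in range(10)]     # stand-in reference PSFs
target = 0.6 * refs[0] + 0.3 * refs[3] + 0.05 * rng.normal(size=(32, 32))
ref_opt, coeffs = optimal_reference(target, refs)
residual = target - ref_opt                               # speckle-subtracted patch
```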

  2. Encryption Of Data Using Elliptic Curve Over Finite Fields

    Directory of Open Access Journals (Sweden)

    D. Sravana Kumar

    2012-02-01

    Full Text Available Cryptography is the study of techniques for ensuring the secrecy and authentication of the information. Public-key encryption schemes are secure only if the authenticity of the public-key is assured. Elliptic curve arithmetic can be used to develop a variety of elliptic curve cryptographic (ECC) schemes including key exchange, encryption and digital signature. The principal attraction of elliptic curve cryptography compared to RSA is that it offers equal security for a smaller key-size, thereby reducing the processing overhead. In the present paper we propose a new encryption algorithm using Elliptic Curve over finite fields.
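
    For illustration, a toy elliptic-curve ElGamal-style encryption over a small prime field is sketched below (point addition, scalar multiplication, and encryption/decryption of a message already encoded as a curve point). The curve y^2 = x^3 + x + 6 over F_11 with base point (2, 7) is a textbook example chosen for readability; it is deliberately tiny and insecure, and the generic scheme shown is not necessarily the authors' exact algorithm.

```python
# Toy curve y^2 = x^3 + a*x + b over F_p (parameters for illustration only).
p, a, b = 11, 1, 6

def add(P, Q):
    """Elliptic-curve point addition with None as the point at infinity."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def mul(k, P):
    """Scalar multiplication by double-and-add."""
    R = None
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

G = (2, 7)                      # base point: 2^3 + 2 + 6 = 16 = 5 = 7^2 (mod 11)
d = 7                           # receiver's private key (assumed)
Q = mul(d, G)                   # receiver's public key

def encrypt(M, k):              # M: message encoded as a curve point, k: ephemeral key
    return mul(k, G), add(M, mul(k, Q))

def decrypt(C1, C2):
    S = mul(d, C1)
    return add(C2, (S[0], (-S[1]) % p))   # C2 - d*C1

M = mul(4, G)                   # sample message point
C1, C2 = encrypt(M, k=3)
assert decrypt(C1, C2) == M
```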

  3. Encryption of Data using Elliptic Curve over Finite fields

    CERN Document Server

    Kumar, D Sravana; Chandrasekhar, A; 10.5121/ijdps.2012.3125

    2012-01-01

    Cryptography is the study of techniques for ensuring the secrecy and authentication of the information. Public-key encryption schemes are secure only if the authenticity of the public-key is assured. Elliptic curve arithmetic can be used to develop a variety of elliptic curve cryptography (ECC) schemes including key exchange, encryption and digital signature. The principal attraction of elliptic curve cryptography compared to RSA is that it offers equal security for a smaller key-size, thereby reducing the processing overhead. In the present paper we propose a new encryption algorithm using some Elliptic Curve over finite fields

  4. Image nonlinearity and non-uniformity corrections using the Papoulis-Gerchberg algorithm in gamma imaging systems

    Science.gov (United States)

    Shemer, A.; Schwarz, A.; Gur, E.; Cohen, E.; Zalevsky, Z.

    2015-04-01

    In this paper, the authors describe a novel technique for image nonlinearity and non-uniformity corrections in imaging systems based on gamma detectors. The limitations of the gamma detector prevent the production of high quality images of the radionuclide distribution, causing nonlinearity and non-uniformity distortions in the obtained image. Many techniques have been developed to correct or compensate for these image artifacts using complex calibration processes. The presented method is based on the Papoulis-Gerchberg (PG) iterative algorithm and works without detector calibration, a tuning process, or any special test phantom.
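
    The Papoulis-Gerchberg iteration alternates between re-imposing the known (reliable) samples in the signal domain and enforcing a band limit in the Fourier domain. A minimal 1-D sketch on synthetic data follows; the bandwidth, mask, and test signal are assumptions, not the authors' detector-specific correction.

```python
import numpy as np

def papoulis_gerchberg(samples, known_mask, bandwidth, n_iter=200):
    """Recover missing samples of a band-limited signal.
    samples: observed signal, arbitrary values where known_mask is False.
    bandwidth: number of low-frequency bins (per side) kept at each iteration."""
    x = np.where(known_mask, samples, 0.0)
    n = len(x)
    for _ in range(n_iter):
        X = np.fft.fft(x)
        X[bandwidth:n - bandwidth] = 0.0          # project onto band-limited signals
        x = np.fft.ifft(X).real
        x[known_mask] = samples[known_mask]       # re-impose the reliable samples
    return x

# Band-limited test signal with roughly 30% of samples missing (assumed setup).
n = 256
t = np.arange(n)
signal = np.sin(2 * np.pi * 3 * t / n) + 0.5 * np.cos(2 * np.pi * 7 * t / n)
mask = np.random.default_rng(3).random(n) > 0.3
recovered = papoulis_gerchberg(signal * mask, mask, bandwidth=10)
```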

  5. Improvements To The PCS Algorithm For Binary Images

    Science.gov (United States)

    Arps, Ronald B.; Pasco, Richard C.

    1989-04-01

    Progressive coding [1] for image data is the focus of current standardization activity within the CCITT and ISO organizations. Recently, a Joint Bi-level Image experts Group (JBIG) has been formed by these two organizations to study the question of progressive coding for bi-level image data. The point of departure for this work is the PCS algorithm for binary images described in ISO/CCITT standards meetings by Endoh and Yamazaki [2], in which they added the use of arithmetic coding [3] to previous proposals. Our paper summarizes some enhancements that can be made to this PCS scheme by improving the application of adaptive binary arithmetic coding (ABAC) technology. In particular, the "Q-coder" form of ABAC technology is used [4], which was recently featured in the November 1988 issue of the IBM Journal of Research and Development [5].

  6. Effect of severe image compression on face recognition algorithms

    Science.gov (United States)

    Zhao, Peilong; Dong, Jiwen; Li, Hengjian

    2015-10-01

    In today's information age, people depend more and more on computers to obtain and use information, and there is a big gap between the large data volumes of digitized multimedia and the storage resources and network bandwidth that current hardware can provide. Image storage and transmission are a case in point. Image compression becomes useful when images need to be transmitted across networks less expensively, by reducing data volume and transmission time. This paper discusses the effect of image compression on a face recognition system. For compression purposes, we adopted the JPEG, JPEG2000, and JPEG XR coding standards. The face recognition algorithm studied is SIFT. Experimental results show that the system still maintains a high recognition rate under high compression ratios, and that the JPEG XR standard is superior to the other two in terms of performance and complexity.

  7. A novel algorithm of image fusion using finite ridgelet transform

    Science.gov (United States)

    Miao, Qiguang; Wang, Baoshu

    2006-04-01

    In this paper, a novel image fusion method based on the finite ridgelet transform (FRIT) is proposed. First, the problem that the wavelet transform cannot efficiently represent linear and curved singularities in image processing is analyzed. Second, the principle of the FRIT and its good performance in representing singularities in two or higher dimensions are studied. Finally, the feasibility of image fusion using the FRIT is discussed in detail. A new fusion method based on the FRIT and the corresponding fusion framework are proposed, and the transform coefficient structure and fusion procedure are described in detail. Experiments show that the proposed algorithm preserves edge and texture information better than the wavelet transform and Laplacian pyramid methods in image fusion.

  8. Algorithm for Improved Image Compression and Reconstruction Performances

    Directory of Open Access Journals (Sweden)

    G.Chenchu Krishnaiah

    2012-05-01

    Full Text Available The energy efficient wavelet image transform algorithm (EEWITA), which is capable of evolving non-wavelet transforms, consistently outperforms wavelets when applied to a large class of images subject to quantization error. An EEWITA can evolve a set of coefficients describing a matched forward and inverse transform pair that can be used at each level of a multi-resolution analysis (MRA) transform to minimize the original image size and the mean squared error (MSE) in the reconstructed image. Simulation results indicate that the benefit of using evolved transforms instead of wavelets increases in proportion to the quantization level. Furthermore, coefficients evolved against a single representative training image generalize to effectively reduce MSE for a broad class of reconstructed images. In this paper an attempt has been made to compare the performance of various wavelets and non-wavelets. Experimental results were obtained using different types of wavelets and non-wavelets for different types of photographic images (color and monochrome). These results show that the EEWITA method is competitive with well-known methods for lossy image compression in terms of compression ratio (CR), mean squared error (MSE), peak signal-to-noise ratio (PSNR), encoding time, decoding time, and transforming or decomposition time. This analysis will help in choosing the wavelet for the decomposition of images as required in particular applications.

  9. Retaining local image information in gamut mapping algorithms.

    Science.gov (United States)

    Zolliker, Peter; Simon, Klaus

    2007-03-01

    Our topic is the potential of combining global gamut mapping with spatial methods to retain perceived local image information in gamut mapping algorithms. The main goal is to recover the original local contrast between neighboring pixels in addition to the usual optimization for preserving lightness, saturation, and global contrast. Special emphasis is placed on avoiding artifacts introduced by the gamut mapping algorithm itself. We present an unsharp masking technique based on an edge-preserving smoothing algorithm that avoids halo artifacts. The good performance of the presented approach is verified by a psycho-visual experiment using newspaper printing as a representative small-destination-gamut application. Furthermore, the improved mapping properties are documented with local mapping histograms. PMID:17357727
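
    The local-contrast idea can be sketched as follows: split the image into a base layer (edge-preserving smoothing) and a detail layer, apply the global gamut mapping to the base only, and add the detail back so that halos around strong edges are suppressed. The tiny bilateral filter and the stand-in global mapping below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def bilateral_smooth(img, radius=3, sigma_s=2.0, sigma_r=0.1):
    """Small, direct bilateral filter (edge-preserving smoothing)."""
    pad = np.pad(img, radius, mode='edge')
    out = np.zeros_like(img)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs ** 2 + ys ** 2) / (2 * sigma_s ** 2))
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            rangew = np.exp(-(patch - img[i, j]) ** 2 / (2 * sigma_r ** 2))
            w = spatial * rangew
            out[i, j] = (w * patch).sum() / w.sum()
    return out

def gamut_map_with_detail(img, global_map, detail_gain=1.0):
    base = bilateral_smooth(img)
    detail = img - base                        # local contrast to be retained
    return np.clip(global_map(base) + detail_gain * detail, 0.0, 1.0)

img = np.random.default_rng(4).random((64, 64))    # stand-in lightness channel
compress = lambda L: 0.1 + 0.7 * L                 # toy global gamut compression
out = gamut_map_with_detail(img, compress)
```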

  10. Identity-based encryption

    CERN Document Server

    Chatterjee, Sanjit

    2011-01-01

    Identity Based Encryption (IBE) is a type of public key encryption and has been intensely researched in the past decade. Identity-Based Encryption summarizes the available research for IBE and the main ideas that would enable users to pursue further work in this area. This book will also cover a brief background on Elliptic Curves and Pairings, security against chosen-ciphertext attacks, standards and more. Advanced-level students in computer science and mathematics who specialize in cryptology, and the general community of researchers in the area of cryptology and data security will find Ide

  11. Classification of Image Registration Techniques and Algorithms in Digital Image Processing – A Research Survey

    Directory of Open Access Journals (Sweden)

    Sindhu Madhuri G

    2014-09-01

    Full Text Available Image Registration (IR) occupies a dominant role in digital image processing in general and image analysis in particular. Image registration is the process of transforming different sets of data into one coordinate system; the data may come from (a) multiple photographs or (b) different sensors, and both may vary across different (i) times, (ii) depths, and (iii) viewpoints, so the images must be aligned to monitor the subtle differences between two or more of them. The development of IR techniques and algorithms is highly complex because spatial correspondences must be found among images, and it has vast applications in computer vision, medical imaging, image mosaicking, biological imaging and brain mapping, remote sensing, military and satellite communication, criminology, optimization, and more. Image registration techniques are essential for comparing data and images obtained from different measurements according to their application requirements. Because of this high research potential, there is a need to carry out a survey of image registration techniques in order to understand the phenomenon of image registration and its implementation methodologies. This survey emphasizes image registration as an essential part of panoramic image generation and creation, with wide applications for researchers seeking to invent and implement alternative image registration methods for general, specific, and complex applications.

  12. An imaging algorithm for vertex reconstruction for ATLAS Run-2

    CERN Document Server

    The ATLAS collaboration

    2015-01-01

    The reconstruction of vertices corresponding to proton-proton collisions in ATLAS is an essential element of event reconstruction used in many performance studies and physics analyses. During Run-1 of the LHC, ATLAS has employed an iterative approach to vertex finding. In order to improve the flexibility of the algorithm and ensure continued performance for very high numbers of simultaneous collisions in future LHC data taking, a new approach to seeding vertex finding is being developed inspired by image reconstruction techniques. This note provides a brief outline of how reconstructed tracks are used to create an image of likely vertex collisions in an event and presents some preliminary results of the performance of the algorithm in simulation approximating early Run-2 conditions.

  13. Despeckle filtering algorithms and software for ultrasound imaging

    CERN Document Server

    Loizou, Christos

    2008-01-01

    It is well-known that speckle is a multiplicative noise that degrades image quality and the visual evaluation in ultrasound imaging. This necessitates the need for robust despeckling techniques for both routine clinical practice and teleconsultation. The goal for this book is to introduce the theoretical background (equations), the algorithmic steps, and the MATLAB™ code for the following group of despeckle filters: linear filtering, nonlinear filtering, anisotropic diffusion filtering and wavelet filtering. The book proposes a comparative evaluation framework of these despeckle filters based

  14. Image Matching Based on Improved Harris Criterion Algorithm

    Directory of Open Access Journals (Sweden)

    Shyi-Ching Liang

    2014-01-01

    Full Text Available This paper proposes a local invariant feature for image matching based on the Harris threshold criterion. Starting from extracted SIFT invariant features, the algorithm selects effective features using the Harris threshold criterion, which removes many feature points with poor distinctiveness and thus retains relatively stable and more distinctive feature points. Precise matching between the feature point sets is then completed by combining the invariant feature vectors with a graph transformation matching method. Experimental results show the feasibility and robustness of this method for image matching.

  15. Deblurring Algorithms for Out-of-focus Infrared Images

    OpenAIRE

    Zhu, Peter

    2010-01-01

    An image that has been subject to the out-of-focus phenomenon has reduced sharpness, contrast and level of detail depending on the amount of defocus. To restore out-of-focused images is a complex task due to the information loss that occurs. However there exist many restoration algorithms that attempt to revert this defocus by estimating a noise model and utilizing the point spread function. The purpose of this thesis, proposed by FLIR Systems, was to find a robust algorithm that can restore focus a...
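
    One standard family of restoration methods for defocus of the kind surveyed in such work is Wiener deconvolution with a uniform-disk (defocus) PSF and a noise-to-signal constant; a minimal frequency-domain sketch on synthetic data follows, which is not necessarily the algorithm ultimately selected in the thesis.

```python
import numpy as np

def disk_psf(shape, radius):
    """Uniform disk PSF modelling ideal defocus blur, shifted for FFT use."""
    h, w = shape
    y, x = np.ogrid[:h, :w]
    cy, cx = h // 2, w // 2
    psf = ((y - cy) ** 2 + (x - cx) ** 2 <= radius ** 2).astype(float)
    psf /= psf.sum()
    return np.fft.ifftshift(psf)               # move centre to index (0, 0)

def wiener_deconvolve(blurred, psf, nsr=1e-2):
    """Frequency-domain Wiener filter: H* / (|H|^2 + NSR)."""
    H = np.fft.fft2(psf)
    G = np.fft.fft2(blurred)
    F_hat = np.conj(H) / (np.abs(H) ** 2 + nsr) * G
    return np.real(np.fft.ifft2(F_hat))

rng = np.random.default_rng(5)
sharp = rng.random((128, 128))                 # stand-in infrared frame
psf = disk_psf(sharp.shape, radius=4)
blurred = np.real(np.fft.ifft2(np.fft.fft2(sharp) * np.fft.fft2(psf))) \
          + 0.01 * rng.standard_normal(sharp.shape)
restored = wiener_deconvolve(blurred, psf)
```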

  16. Mammographic images segmentation based on chaotic map clustering algorithm

    International Nuclear Information System (INIS)

    This work investigates the applicability of a novel clustering approach to the segmentation of mammographic digital images. The chaotic map clustering algorithm is used to group together similar subsets of image pixels resulting in a medically meaningful partition of the mammography. The image is divided into pixels subsets characterized by a set of conveniently chosen features and each of the corresponding points in the feature space is associated to a map. A mutual coupling strength between the maps depending on the associated distance between feature space points is subsequently introduced. On the system of maps, the simulated evolution through chaotic dynamics leads to its natural partitioning, which corresponds to a particular segmentation scheme of the initial mammographic image. The system provides a high recognition rate for small mass lesions (about 94% correctly segmented inside the breast) and the reproduction of the shape of regions with denser micro-calcifications in about 2/3 of the cases, while being less effective on identification of larger mass lesions. We can summarize our analysis by asserting that due to the particularities of the mammographic images, the chaotic map clustering algorithm should not be used as the sole method of segmentation. It is rather the joint use of this method along with other segmentation techniques that could be successfully used for increasing the segmentation performance and for providing extra information for the subsequent analysis stages such as the classification of the segmented ROI

  17. A robust algorithm for sky background computation in CCD images

    OpenAIRE

    Patat, F.

    2003-01-01

    In this paper we present a non-interactive algorithm to estimate a representative value for the sky background on CCD images. The method we have devised uses the mode as a robust estimator of the background brightness in sub-windows distributed across the input frame. The presence of contaminating objects is detected through the study of the local intensity distribution function and the perturbed areas are rejected using a statistical criterion which was derived from numeric...
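
    The sub-window mode estimation can be sketched as below, using iterative sigma clipping followed by the common approximation mode ≈ 2.5·median − 1.5·mean; this approximation, the clipping threshold, and the box size are assumptions standing in for the paper's exact estimator and rejection criterion.

```python
import numpy as np

def clipped_mode(values, k=3.0, n_iter=5):
    """Robust mode estimate in one sub-window: sigma-clip, then use the
    classic approximation mode ~ 2.5*median - 1.5*mean."""
    v = np.asarray(values, float).ravel()
    for _ in range(n_iter):
        med, std = np.median(v), np.std(v)
        v = v[np.abs(v - med) < k * std]
    return 2.5 * np.median(v) - 1.5 * np.mean(v)

def background_map(image, box=64):
    """Mode estimate in each box-sized sub-window across the frame."""
    h, w = image.shape
    ny, nx = h // box, w // box
    bkg = np.zeros((ny, nx))
    for i in range(ny):
        for j in range(nx):
            bkg[i, j] = clipped_mode(image[i*box:(i+1)*box, j*box:(j+1)*box])
    return bkg

rng = np.random.default_rng(6)
frame = 100.0 + 5.0 * rng.standard_normal((256, 256))       # flat sky + noise
frame[100:110, 100:110] += 500.0                            # a contaminating source
print(background_map(frame))
```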

  18. Algorithm for Soybean Classification Using Medium Resolution Satellite Images

    OpenAIRE

    Anibal Gusso; Jorge Ricardo Ducati

    2012-01-01

    An accurate estimation of soybean crop areas while the plants are still in the field is highly necessary for reliable calculation of real crop parameters as to yield, production and other data important to decision-making policies related to government planning. An algorithm for soybean classification over the Rio Grande do Sul State, Brazil, was developed as an objective, automated tool. It is based on reflectance from medium spatial resolution images. The classification method was called th...

  19. Inverse transport calculations in optical imaging with subspace optimization algorithms

    OpenAIRE

    Ding, Tian; Ren, Kui

    2014-01-01

    Inverse boundary value problems for the radiative transport equation play important roles in optics-based medical imaging techniques such as diffuse optical tomography (DOT) and fluorescence optical tomography (FOT). Despite the rapid progress in the mathematical theory and numerical computation of these inverse problems in recent years, developing robust and efficient reconstruction algorithms remains as a challenging task and an active research topic. We propose here a rob...

  20. Genetic algorithms for fast search in fractal image coding

    Science.gov (United States)

    Redmill, David W.; Bull, David R.; Martin, Ralph R.

    1996-02-01

    This paper demonstrates the application of genetic algorithms (GAs) to the real-time search problem of fractal image compression. An approach using GAs has been simulated and compared with both an exhaustive search method and a heuristic multi-grid method. Results, for various block sizes, show that the GA-based approach offers a computationally more efficient search than either of the other methods.
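
    A minimal sketch of a GA search over the domain pool for one range block: each chromosome encodes a candidate domain position, fitness is the collage error after a least-squares fit of contrast and brightness, and simple selection, crossover, and mutation evolve the population. The encoding, operators, and parameters below are illustrative assumptions, not those of the simulated system.

```python
import numpy as np

rng = np.random.default_rng(7)
image = rng.random((128, 128))                 # stand-in grayscale image
R = 8                                          # range block size; domains are 2R x 2R

def collage_error(range_block, dy, dx):
    """Least-squares contrast/brightness fit of a downsampled domain block."""
    dom = image[dy:dy + 2*R, dx:dx + 2*R].reshape(R, 2, R, 2).mean(axis=(1, 3))
    d, r = dom.ravel(), range_block.ravel()
    s, o = np.polyfit(d, r, 1)                 # r ~ s*d + o
    return float(np.mean((s * d + o - r) ** 2))

def ga_search(range_block, pop_size=30, n_gen=40, p_mut=0.2):
    hi = image.shape[0] - 2*R
    pop = rng.integers(0, hi + 1, size=(pop_size, 2))          # (dy, dx) chromosomes
    for _ in range(n_gen):
        fit = np.array([collage_error(range_block, dy, dx) for dy, dx in pop])
        parents = pop[np.argsort(fit)[:pop_size // 2]]         # truncation selection
        children = parents.copy()
        children[:, 1] = parents[::-1, 1]                      # simple coordinate crossover
        mutate = rng.random(children.shape) < p_mut
        children[mutate] = rng.integers(0, hi + 1, size=mutate.sum())
        pop = np.vstack([parents, children])
    fit = np.array([collage_error(range_block, dy, dx) for dy, dx in pop])
    return pop[int(np.argmin(fit))], float(fit.min())

range_block = image[16:16 + R, 16:16 + R]
best_pos, best_err = ga_search(range_block)
```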