WorldWideScience

Sample records for cryptographic hash algorithm

  1. Cryptographic quantum hashing

    Science.gov (United States)

    Ablayev, F. M.; Vasiliev, A. V.

    2014-02-01

    We present a version of quantum hash functions based on non-binary discrete functions. The proposed quantum procedure is ‘classical-quantum’, that is, it takes a classical bit string as an input and produces a quantum state. The resulting function has the property of a one-way function (pre-image resistance); in addition, it has properties analogous to the second pre-image resistance and collision resistance of classical cryptographic hash functions. We also show that the proposed function can be naturally used in a quantum digital signature protocol.

  2. Cryptographic quantum hashing

    International Nuclear Information System (INIS)

    Ablayev, F M; Vasiliev, A V

    2014-01-01

    We present a version of quantum hash functions based on non-binary discrete functions. The proposed quantum procedure is ‘classical-quantum’, that is, it takes a classical bit string as an input and produces a quantum state. The resulting function has the property of a one-way function (pre-image resistance); in addition, it has properties analogous to the second pre-image resistance and collision resistance of classical cryptographic hash functions. We also show that the proposed function can be naturally used in a quantum digital signature protocol. (letter)

  3. 76 FR 11433 - Federal Transition To Secure Hash Algorithm (SHA)-256

    Science.gov (United States)

    2011-03-02

    ... ADMINISTRATION [FAR-N-2011-01; Docket No. 2011-0083; Sequence 1] Federal Transition To Secure Hash Algorithm (SHA... acquisition community to transition to Secure Hash Algorithm SHA-256. SHA-256 is a cryptographic hash function... persons attending. Please cite ``Federal Transition to Secure Hash Algorithm SHA-256'' in all...

  4. Analysis and Implementation of Cryptographic Hash Functions in Programmable Logic Devices

    Directory of Open Access Journals (Sweden)

    Tautvydas Brukštus

    2016-06-01

    Full Text Available In today's world there is an ever greater focus on data protection, and cryptography is the science used to provide it. Safe storage of passwords, in particular, relies on cryptographic hash functions. In this article the SHA-256 cryptographic hash function was selected for implementation and investigation, since it is currently popular and considered secure: no theoretical weaknesses or conflict situations have been found for SHA-256. The SHA-256 hash function is also used in cryptographic currencies, which are currently popular and highly valued. Programmable logic integrated circuits were chosen for the measurements because they are less efficient than ASICs. We chose programmable logic integrated circuits produced by Altera Corporation and investigated the computation speed on three devices belonging to the same family but to different generations, each manufactured with a different process technology: EP3C16, EP4CE115 and 5CSEMA5F31. The measured performance parameters are provided in tables and graphs. The research shows the computation speed and stability of the different programmable logic circuits.
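
    For orientation, the function studied above can also be computed in software with any standard library; a minimal reference computation of a SHA-256 digest in Python is shown below (the message is only an example input).

        import hashlib

        # Reference computation of the SHA-256 digest studied in the record above.
        message = b"example password or other data"
        digest = hashlib.sha256(message).hexdigest()
        print(digest)  # 64 hex characters = 256 bits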

  5. Cryptanalysis of Tav-128 hash function

    DEFF Research Database (Denmark)

    Kumar, Ashish; Sanadhya, Somitra Kumar; Gauravaram, Praveen

    2010-01-01

    Many RFID protocols use cryptographic hash functions for their security. The resource constrained nature of RFID systems forces the use of light weight cryptographic algorithms. Tav-128 is one such 128-bit light weight hash function proposed by Peris-Lopez et al. for a low-cost RFID tag authentic...

  6. Robust hashing for 3D models

    Science.gov (United States)

    Berchtold, Waldemar; Schäfer, Marcel; Rettig, Michael; Steinebach, Martin

    2014-02-01

    3D models and applications are of utmost interest in both science and industry. As their usage increases, so do their number and thereby the challenge of correctly identifying them. Content identification is commonly done with cryptographic hashes. However, these fail in application scenarios such as computer aided design (CAD), scientific visualization or video games, because even the smallest alteration of the 3D model, e.g. a conversion or compression operation, massively changes the cryptographic hash as well. Therefore, this work presents a robust hashing algorithm for 3D mesh data. The algorithm applies several different bit extraction methods. They are built to resist desired alterations of the model as well as malicious attacks intended to prevent correct allocation. The different bit extraction methods are tested against each other and, as far as possible, the hashing algorithm is compared to the state of the art. The parameters tested are robustness, security and runtime performance, as well as the False Acceptance Rate (FAR) and False Rejection Rate (FRR); the probability of hash collision is also calculated. The introduced hashing algorithm is kept adaptive, e.g. in hash length, to serve as a proper tool for all applications in practice.

  7. Chaos-based hash function (CBHF) for cryptographic applications

    International Nuclear Information System (INIS)

    Amin, Mohamed; Faragallah, Osama S.; Abd El-Latif, Ahmed A.

    2009-01-01

    As the core of cryptography, hashing is a basic technique for information security. Many hash functions generate the message digest through a randomizing process applied to the original message. A chaotic system likewise produces random-looking behaviour, while at the same time being completely deterministic. In this paper, an algorithm for one-way hash function construction based on chaos theory is introduced. Theoretical analysis and computer simulation indicate that the algorithm satisfies all performance requirements of a hash function in an efficient and flexible manner and is secure against birthday attacks and meet-in-the-middle attacks, which makes it a good choice for data integrity or authentication.
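
    The record does not spell out the CBHF construction itself; as a rough illustration of the general idea - driving a chaotic map with message bytes and quantizing the trajectory into a digest - a toy sketch based on the logistic map is given below. The choice of map, the parameter r = 3.99 and the output length are assumptions for illustration only; this is not the authors' algorithm and it is not cryptographically secure.

        # Illustrative toy only: drive the logistic map with message bytes and
        # quantize the trajectory into a 256-bit digest.  Not the CBHF algorithm
        # from the record above, and not cryptographically secure.
        def chaotic_toy_hash(message: bytes, r: float = 3.99) -> bytes:
            x = 0.5
            # Absorb: perturb the chaotic orbit with each message byte.
            for b in message:
                x = r * x * (1.0 - x)
                x = (x + b / 256.0) % 1.0
                x = x or 0.5                 # avoid the degenerate fixed point at zero
            # Squeeze: iterate further and quantize the orbit into 32 output bytes.
            out = bytearray()
            for _ in range(32):
                for _ in range(16):          # extra mixing iterations per byte
                    x = r * x * (1.0 - x)
                out.append(int(x * 256) % 256)
            return bytes(out)

        print(chaotic_toy_hash(b"hello").hex())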

  8. Chaos-based hash function (CBHF) for cryptographic applications

    Energy Technology Data Exchange (ETDEWEB)

    Amin, Mohamed [Dept. of Mathematics and Computer Science, Faculty of Science, Menoufia University, Shebin El-Koom 32511 (Egypt)], E-mail: mamin04@yahoo.com; Faragallah, Osama S. [Dept. of Computer Science and Engineering, Faculty of Electronic Engineering, Menoufia University, Menouf 32952 (Egypt)], E-mail: osam_sal@yahoo.com; Abd El-Latif, Ahmed A. [Dept. of Mathematics and Computer Science, Faculty of Science, Menoufia University, Shebin El-Koom 32511 (Egypt)], E-mail: ahmed_rahiem@yahoo.com

    2009-10-30

    As the core of cryptography, hashing is a basic technique for information security. Many hash functions generate the message digest through a randomizing process applied to the original message. A chaotic system likewise produces random-looking behaviour, while at the same time being completely deterministic. In this paper, an algorithm for one-way hash function construction based on chaos theory is introduced. Theoretical analysis and computer simulation indicate that the algorithm satisfies all performance requirements of a hash function in an efficient and flexible manner and is secure against birthday attacks and meet-in-the-middle attacks, which makes it a good choice for data integrity or authentication.

  9. The FPGA realization of the general cellular automata based cryptographic hash functions: Performance and effectiveness

    Directory of Open Access Journals (Sweden)

    P. G. Klyucharev

    2014-01-01

    Full Text Available In the paper the author considers hardware implementation of the GRACE-H family of general cellular automata based cryptographic hash functions. VHDL is used as the language and Altera FPGAs as the platform for hardware implementation. Performance and effectiveness of the FPGA implementations of GRACE-H hash functions were compared with the Keccak (SHA-3), SHA-256, BLAKE, Groestl, JH, and Skein hash functions. According to the performed tests, the performance of the hardware implementation of the GRACE-H family hash functions significantly (up to 12 times) exceeded the performance of the hardware implementations of previously known hash functions, and the effectiveness of that hardware implementation was also better (up to 4 times).

  10. The hash function BLAKE

    CERN Document Server

    Aumasson, Jean-Philippe; Phan, Raphael; Henzen, Luca

    2014-01-01

    This is a comprehensive description of the cryptographic hash function BLAKE, one of the five final contenders in the NIST SHA3 competition, and of BLAKE2, an improved version popular among developers. It describes how BLAKE was designed and why BLAKE2 was developed, and it offers guidelines on implementing and using BLAKE, with a focus on software implementation.   In the first two chapters, the authors offer a short introduction to cryptographic hashing, the SHA3 competition, and BLAKE. They review applications of cryptographic hashing, they describe some basic notions such as security de
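
    Since BLAKE2, the successor described in the book, ships in Python's standard hashlib module, a short usage example can be given; the messages, key and digest sizes below are arbitrary.

        import hashlib

        # BLAKE2b with a caller-chosen digest size (1..64 bytes).
        h = hashlib.blake2b(b"the quick brown fox", digest_size=32)
        print(h.hexdigest())

        # BLAKE2 also supports keyed hashing, usable as a MAC / PRF.
        mac = hashlib.blake2b(b"message", key=b"shared secret", digest_size=16)
        print(mac.hexdigest())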

  11. Cryptanalysis of Tav-128 hash function

    DEFF Research Database (Denmark)

    Many RFID protocols use cryptographic hash functions for their security. The resource constrained nature of RFID systems forces the use of light weight cryptographic algorithms. Tav-128 is one such 128-bit light weight hash function proposed by Peris-Lopez et al. for a low-cost RFID tag...... authentication protocol. Apart from some statistical tests for randomness by the designers themselves, Tav-128 has not undergone any other thorough security analysis. Based on these tests, the designers claimed that Tav-128 does not possess any trivial weaknesses. In this article, we carry out the first third...... party security analysis of Tav-128 and show that this hash function is neither collision resistant nor second preimage resistant. Firstly, we show a practical collision attack on Tav-128 having a complexity of 2^37 calls to the compression function and produce message pairs of arbitrary length which

  12. MiMC: Efficient encryption and cryptographic hashing with minimal multiplicative complexity

    DEFF Research Database (Denmark)

    Albrecht, Martin; Grassi, Lorenzo; Rechberger, Christian

    2016-01-01

    and cryptographic hash functions is to reconsider and simplify the round function of the Knudsen-Nyberg cipher from 1995. The mapping F(x) := x^3 is used as the main component there and is also the main component of our family of proposals called “MiMC”. We study various attack vectors for this construction and give...... a new attack vector that outperforms others in relevant settings. Due to its very low number of multiplications, the design lends itself well to a large class of applications, especially when the depth does not matter but the total number of multiplications in the circuit dominates all aspects
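
    To make the cubing round concrete, a minimal MiMC-like sketch over a prime field is given below. The field p = 65537 (chosen only because gcd(3, p-1) = 1, so cubing is a permutation), the round count and the round constants are toy assumptions; the paper's instances use cryptographically large fields, more rounds and pseudo-randomly generated constants.

        # Minimal MiMC-like sketch: each round computes (x + k + c_i)^3 over F_p,
        # followed by a final key addition.  Toy parameters only.
        p = 65537                                            # gcd(3, p - 1) = 1
        round_constants = [0] + [(i * 2654435761) % p for i in range(1, 11)]

        def mimc_encrypt(x: int, k: int) -> int:
            for c in round_constants:
                x = pow((x + k + c) % p, 3, p)               # the F(x) = x^3 round
            return (x + k) % p                               # final key addition

        print(mimc_encrypt(1234, 5678))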

  13. Cryptographic Hash Functions

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Knudsen, Lars Ramkilde

    2010-01-01

    functions, also called message authentication codes (MACs) serve data integrity and data origin authentication in the secret key setting. The building blocks of hash functions can be designed using block ciphers, modular arithmetic or from scratch. The design principles of the popular Merkle...

  14. The LabelHash algorithm for substructure matching

    Directory of Open Access Journals (Sweden)

    Bryant Drew H

    2010-11-01

    Full Text Available Abstract Background There is an increasing number of proteins with known structure but unknown function. Determining their function would have a significant impact on understanding diseases and designing new therapeutics. However, experimental protein function determination is expensive and very time-consuming. Computational methods can facilitate function determination by identifying proteins that have high structural and chemical similarity. Results We present LabelHash, a novel algorithm for matching substructural motifs to large collections of protein structures. The algorithm consists of two phases. In the first phase the proteins are preprocessed in a fashion that allows for instant lookup of partial matches to any motif. In the second phase, partial matches for a given motif are expanded to complete matches. The general applicability of the algorithm is demonstrated with three different case studies. First, we show that we can accurately identify members of the enolase superfamily with a single motif. Next, we demonstrate how LabelHash can complement SOIPPA, an algorithm for motif identification and pairwise substructure alignment. Finally, a large collection of Catalytic Site Atlas motifs is used to benchmark the performance of the algorithm. LabelHash runs very efficiently in parallel; matching a motif against all proteins in the 95% sequence identity filtered non-redundant Protein Data Bank typically takes no more than a few minutes. The LabelHash algorithm is available through a web server and as a suite of standalone programs at http://labelhash.kavrakilab.org. The output of the LabelHash algorithm can be further analyzed with Chimera through a plugin that we developed for this purpose. Conclusions LabelHash is an efficient, versatile algorithm for large-scale substructure matching. When LabelHash is running in parallel, motifs can typically be matched against the entire PDB on the order of minutes. The algorithm is able to identify

  15. A Novel Perceptual Hash Algorithm for Multispectral Image Authentication

    Directory of Open Access Journals (Sweden)

    Kaimeng Ding

    2018-01-01

    Full Text Available The perceptual hash algorithm is a technique for authenticating the integrity of images. While a few scholars have worked on mono-spectral image perceptual hashing, there is limited research on multispectral image perceptual hashing. In this paper, we propose a perceptual hash algorithm for the content authentication of a multispectral remote sensing image based on the synthetic characteristics of each band: firstly, the multispectral remote sensing image is preprocessed with band clustering and grid partition; secondly, the edge feature of the band subsets is extracted by band fusion-based edge feature extraction; thirdly, the perceptual feature of the same region of the band subsets is compressed and normalized to generate the perceptual hash value. Authentication is performed by computing the normalized Hamming distance between the recomputed perceptual hash value and the original hash value. The experiments indicate that our proposed algorithm is robust to content-preserving operations and efficiently authenticates the integrity of multispectral remote sensing images.
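
    The band-clustering and edge-feature steps are specific to the paper, but the final authentication step it describes - thresholding the normalized Hamming distance between the stored and recomputed hashes - can be sketched generically; the threshold of 0.1 below is an illustrative assumption, not the paper's value.

        # Generic final step of perceptual-hash authentication: compare two
        # equal-length binary hash strings by normalized Hamming distance.
        def normalized_hamming(h1: str, h2: str) -> float:
            assert len(h1) == len(h2)
            return sum(a != b for a, b in zip(h1, h2)) / len(h1)

        def authenticate(stored: str, recomputed: str, threshold: float = 0.1) -> bool:
            return normalized_hamming(stored, recomputed) <= threshold

        print(authenticate("1011001110101100", "1011001110111100"))  # 1 bit differs -> True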

  16. Implementation of 4-way Superscalar Hash MIPS Processor Using FPGA

    Science.gov (United States)

    Sahib Omran, Safaa; Fouad Jumma, Laith

    2018-05-01

    Due to the quick advancements in personal communications systems and wireless communications, providing data security has become a more essential subject. This security concern becomes more complicated when next-generation system requirements and real-time calculation speed are considered. Hash functions are among the most essential cryptographic primitives and are utilized in many fields for signature authentication and communication integrity. These functions are used to obtain a fixed-size unique fingerprint, or hash value, of an arbitrary-length message. In this paper, Secure Hash Algorithms (SHA) of types SHA-1, SHA-2 (SHA-224, SHA-256) and SHA-3 (BLAKE) are implemented on a Field-Programmable Gate Array (FPGA) in a processor structure. The design is described and implemented using a hardware description language, namely VHSIC “Very High Speed Integrated Circuit” Hardware Description Language (VHDL). Since the logical operations of the hash types (SHA-1, SHA-224, SHA-256 and SHA-3) are 32-bit, a superscalar Hash Microprocessor without Interlocked Pipelines (MIPS) processor is designed with only the few instructions required for invoking the desired hash algorithms. When the four types of hash algorithms are executed sequentially using the designed processor, the total time required is approximately 342 us, with a throughput of 4.8 Mbps, while the time required to execute the same four hash algorithms using the designed four-way superscalar processor is reduced to 237 us, with the throughput improved to 5.1 Mbps.

  17. Cryptographic protocol security analysis based on bounded constructing algorithm

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    An efficient approach to analyzing cryptographic protocols is to develop automatic analysis tools based on formal methods. However, this approach has encountered a high computational complexity problem because protocol participants are arbitrary, their message structures are complex and their executions are concurrent. We propose an efficient automatic verifying algorithm for analyzing cryptographic protocols based on the recently proposed Cryptographic Protocol Algebra (CPA) model, in which algebraic techniques are used to simplify the description of cryptographic protocols and their executions. Redundant states generated in the analysis process are greatly reduced by introducing a new algebraic technique called Universal Polynomial Equation, and the algorithm can be used to verify the correctness of protocols in an infinite state space. We have implemented an efficient automatic analysis tool for cryptographic protocols, called ACT-SPA, based on this algorithm, and used the tool to check more than 20 cryptographic protocols. The analysis results show that this tool is more efficient, and an attack instance not reported previously was found by using this tool.

  18. Proposals for Iterated Hash Functions

    DEFF Research Database (Denmark)

    Knudsen, Lars Ramkilde; Thomsen, Søren Steffen

    2008-01-01

    The past few years have seen an increase in the number of attacks on cryptographic hash functions. These include attacks directed at specific hash functions, and generic attacks on the typical method of constructing hash functions. In this paper we discuss possible methods for protecting against...... some generic attacks. We also give a concrete proposal for a new hash function construction, given a secure compression function which, unlike in typical existing constructions, is not required to be resistant to all types of collisions. Finally, we show how members of the SHA-family can be turned...

  19. Proposals for iterated hash functions

    DEFF Research Database (Denmark)

    Knudsen, Lars Ramkilde; Thomsen, Søren Steffen

    2006-01-01

    The past few years have seen an increase in the number of attacks on cryptographic hash functions. These include attacks directed at specific hash functions, and generic attacks on the typical method of constructing hash functions. In this paper we discuss possible methods for protecting against...... some generic attacks. We also give a concrete proposal for a new hash function construction, given a secure compression function which, unlike in typical existing constructions, is not required to be resistant to all types of collisions. Finally, we show how members of the SHA-family can be turned...

  20. Building Modern GPU Brute-Force Collision Resistible Hash Algorithm

    Directory of Open Access Journals (Sweden)

    L. A. Nadeinsky

    2012-03-01

    Full Text Available The article considers methods of fixing the security vulnerability of storing passwords in hashed form. The suggested hashing algorithm is based on the specifics of the architecture of modern graphics processors.

  1. Final report for LDRD Project 93633 : new hash function for data protection.

    Energy Technology Data Exchange (ETDEWEB)

    Draelos, Timothy John; Dautenhahn, Nathan; Schroeppel, Richard Crabtree; Tolk, Keith Michael; Orman, Hilarie (PurpleStreak, Inc.); Walker, Andrea Mae; Malone, Sean; Lee, Eric; Neumann, William Douglas; Cordwell, William R.; Torgerson, Mark Dolan; Anderson, Eric; Lanzone, Andrew J.; Collins, Michael Joseph; McDonald, Timothy Scott; Caskey, Susan Adele

    2009-03-01

    The security of the widely-used cryptographic hash function SHA1 has been impugned. We have developed two replacement hash functions. The first, SHA1X, is a drop-in replacement for SHA1. The second, SANDstorm, has been submitted as a candidate to the NIST-sponsored SHA3 Hash Function competition.

  2. Quantum hashing is maximally secure against classical leakage

    OpenAIRE

    Huang, Cupjin; Shi, Yaoyun

    2017-01-01

    Cryptographic hash functions are fundamental primitives widely used in practice. For such a function $f:\{0,1\}^n\to\{0,1\}^m$, it is nearly impossible for an adversary to produce the hash $f(x)$ without knowing the secret message $x\in\{0,1\}^n$. Unfortunately, all hash functions are vulnerable under the side-channel attack, which is a grave concern for information security in practice. This is because typically $m\ll n$ and an adversary needs only $m$ bits of information to pass the veri...

  3. The Speech multi features fusion perceptual hash algorithm based on tensor decomposition

    Science.gov (United States)

    Huang, Y. B.; Fan, M. H.; Zhang, Q. Y.

    2018-03-01

    With constant progress in modern speech communication technologies, speech data is prone to being attacked by noise or maliciously tampered with. In order to give the speech perceptual hash algorithm strong robustness and high efficiency, this paper puts forward a speech perceptual hash algorithm based on tensor decomposition and multiple features. The algorithm analyses the perceptual features of speech and applies wavelet packet decomposition to each speech component. The LPCC, LSP and ISP features of each speech component are extracted to constitute the speech feature tensor. Speech authentication is done by generating the hash values through feature matrix quantification using the mid-value. Experimental results show that the proposed algorithm is robust to content-preserving operations compared with similar algorithms. It is able to resist the attack of common background noise. Also, the algorithm is highly efficient in terms of arithmetic, and is able to meet the real-time requirements of speech communication and complete the speech authentication quickly.

  4. Adoption of the Hash algorithm in a conceptual model for the civil registry of Ecuador

    Science.gov (United States)

    Toapanta, Moisés; Mafla, Enrique; Orizaga, Antonio

    2018-04-01

    The Hash security algorithm was analyzed in order to mitigate information security issues in a distributed architecture. The objective of this research is to develop a prototype for the adoption of the Hash algorithm in a conceptual model for the Civil Registry of Ecuador. The deductive method was used to analyze the published articles that have a direct relation with the research project “Algorithms and Security Protocols for the Civil Registry of Ecuador” and articles related to the Hash security algorithm. The results of this research are: the SHA-1 security algorithm is appropriate for use in Ecuador's civil registry; we adopted the SHA-1 algorithm described using the flowchart technique; and finally we obtained the adoption of the hash algorithm in a conceptual model. It is concluded from the comparison of the MD5 and SHA-1 algorithms that, in the case of an implementation, the SHA-1 algorithm should be chosen due to the amount of information and data available from the Civil Registry of Ecuador. It is determined that the SHA-1 algorithm that was defined using the flowchart technique can be modified according to the requirements of each institution, and that the model for adopting the hash algorithm in a conceptual model is a prototype that can be modified according to all the actors that make up each organization.

  5. Parallel Algorithm of Geometrical Hashing Based on NumPy Package and Processes Pool

    Directory of Open Access Journals (Sweden)

    Klyachin Vladimir Aleksandrovich

    2015-10-01

    Full Text Available The article considers the problem of multi-dimensional geometric hashing. The paper describes a mathematical model of geometric hashing and considers an example of its use in point localization problems. A method of constructing the corresponding hash matrix by a parallel algorithm is considered. In this paper an algorithm for parallel geometric hashing using the “process pool” development pattern is proposed. The implementation of the algorithm is written in the Python programming language using the NumPy package for manipulating multidimensional data. To implement the process pool it is proposed to use the ProcessPoolExecutor class imported from the module concurrent.futures, which is included in the distribution of the Python interpreter since version 3.2. All the solutions are presented in the paper by corresponding UML class diagrams. The designed GeomNash package includes the classes Data, Result, GeomHash, and Job. The results of the developed program are presented in the corresponding graphs. Also, the article presents the theoretical justification for applying a process pool in the implementation of parallel algorithms. The condition t2 > (p/(p-1))*t1 for the appropriateness of the process pool is obtained, where t1 is the time to transmit a unit of data between processes, and t2 is the time for one processor to process a unit of data.
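
    A minimal sketch of the process-pool pattern the record describes - partitioning the input points with NumPy and hashing each partition in a worker process via concurrent.futures.ProcessPoolExecutor - is given below. The per-block grid-bucketing function is a placeholder for illustration, not the GeomHash class from the paper.

        from concurrent.futures import ProcessPoolExecutor
        import numpy as np

        def hash_block(points: np.ndarray, cell: float = 0.5) -> dict:
            # Toy geometric hashing: bucket points by their quantized grid cell.
            buckets = {}
            for p in points:
                key = tuple((p // cell).astype(int))
                buckets.setdefault(key, []).append(p.tolist())
            return buckets

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            data = rng.random((10000, 2))
            blocks = np.array_split(data, 4)          # one block per worker
            with ProcessPoolExecutor(max_workers=4) as pool:
                partial_tables = list(pool.map(hash_block, blocks))
            print(sum(len(t) for t in partial_tables), "partial buckets built")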

  6. Cryptanalysis of the LAKE Hash Family

    DEFF Research Database (Denmark)

    Biryukov, Alex; Gauravaram, Praveen; Guo, Jian

    2009-01-01

    We analyse the security of the cryptographic hash function LAKE-256 proposed at FSE 2008 by Aumasson, Meier and Phan. By exploiting non-injectivity of some of the building primitives of LAKE, we show three different collision and near-collision attacks on the compression function. The first attac...

  7. SPONGENT: The Design Space of Lightweight Cryptographic Hashing

    DEFF Research Database (Denmark)

    Bogdanov, Andrey; Knezevic, Miroslav; Leander, Gregor

    2013-01-01

    construction instantiated with present-type permutations. The resulting family of hash functions is called spongent. We propose 13 spongent variants--for different levels of collision and (second) preimage resistance as well as for various implementation constraints. For each of them, we provide several ASIC

  8. Practical Attacks on AES-like Cryptographic Hash Functions

    DEFF Research Database (Denmark)

    Kölbl, Stefan; Rechberger, Christian

    2015-01-01

    to drastically reduce the complexity of attacks to very practical values for reduced-round versions. Furthermore, we describe new and practical attacks on Whirlpool and the recently proposed GOST R hash function with one or more of the following properties: more rounds, less time/memory complexity, and more...

  9. Automated detection and classification of cryptographic algorithms in binary programs through machine learning

    OpenAIRE

    Hosfelt, Diane Duros

    2015-01-01

    Threats from the internet, particularly malicious software (i.e., malware) often use cryptographic algorithms to disguise their actions and even to take control of a victim's system (as in the case of ransomware). Malware and other threats proliferate too quickly for the time-consuming traditional methods of binary analysis to be effective. By automating detection and classification of cryptographic algorithms, we can speed program analysis and more efficiently combat malware. This thesis wil...

  10. A multi-pattern hash-binary hybrid algorithm for URL matching in the HTTP protocol.

    Directory of Open Access Journals (Sweden)

    Ping Zeng

    Full Text Available In this paper, based on our previous multi-pattern uniform resource locator (URL) binary-matching algorithm called HEM, we propose an improved multi-pattern matching algorithm called MH that is based on hash tables and binary tables. The MH algorithm can be applied to the fields of network security, data analysis, load balancing, cloud robotic communications, and so on, all of which require string matching from a fixed starting position. Our approach effectively solves the performance problems of the classical multi-pattern matching algorithms. This paper explores ways to improve string matching performance under the HTTP protocol by using a hash method combined with a binary method that transforms the symbol-space matching problem into a digital-space numerical-size comparison and hashing problem. The MH approach has a fast matching speed, requires little memory, performs better than both the classical algorithms and HEM for matching fields in an HTTP stream, and it has great promise for use in real-world applications.

  11. DEVELOPMENT AND IMPLEMENTATION OF HASH FUNCTION FOR GENERATING HASHED MESSAGE

    Directory of Open Access Journals (Sweden)

    Amir Ghaeedi

    2016-09-01

    Full Text Available Steganography is a method of sending confidential information in such a way that the existence of the communication channel remains secret. A collaborative approach between steganography and digital signatures provides highly secure hidden data. Unfortunately, there is a wide variety of attacks that affect the quality of image steganography. Two issues that need to be addressed are the large size of the ciphered data in the digital signature and the high bandwidth required. The aim of the research is to propose a new method for producing a dynamic hashed message algorithm for digital signatures, which is then embedded into an image to enhance the robustness of image steganography with reduced bandwidth. A digital signature with a smaller hash size than other hash algorithms was developed for authentication purposes. A hash function is used in the digital signature generation. The encoder function encodes the hashed message to generate the digital signature, which is then embedded into an image as a stego-image. To enhance the robustness of the digital signature, we compressed or encoded it, or performed both operations, before embedding the data into the image. This encryption algorithm is also computationally efficient: for messages smaller than 1600 bytes, the hashed file reduces the original file by up to 8.51%.

  12. FSH: fast spaced seed hashing exploiting adjacent hashes.

    Science.gov (United States)

    Girotto, Samuele; Comin, Matteo; Pizzi, Cinzia

    2018-01-01

    Patterns with wildcards in specified positions, namely spaced seeds, are increasingly used instead of k-mers in many bioinformatics applications that require indexing, querying and rapid similarity search, as they can provide better sensitivity. Many of these applications require computing the hash of each position in the input sequences with respect to the given spaced seed, or to multiple spaced seeds. While the hashing of k-mers can be rapidly computed by exploiting the large overlap between consecutive k-mers, spaced seed hashing is usually computed from scratch for each position in the input sequence, thus resulting in slower processing. The method proposed in this paper, fast spaced-seed hashing (FSH), exploits the similarity of the hash values of spaced seeds computed at adjacent positions in the input sequence. In our experiments we compute the hash for each position of metagenomics reads from several datasets, with respect to different spaced seeds. We also propose a generalized version of the algorithm for the simultaneous computation of multiple spaced seed hashes. In the experiments, our algorithm can compute the hashing values of spaced seeds with a speedup, with respect to the traditional approach, between 1.6× and 5.3×, depending on the structure of the spaced seed. Spaced seed hashing is a routine task for several bioinformatics applications. FSH allows this task to be performed efficiently and raises the question of whether other hashing tasks can be exploited to further improve the speedup. This has the potential of major impact in the field, making spaced seed applications not only accurate, but also faster and more efficient. The software FSH is freely available for academic use at: https://bitbucket.org/samu661/fsh/overview.
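
    For reference, the quantity FSH speeds up is the per-window spaced-seed hash: for each window of the sequence, only the characters at the seed's '1' positions are kept and packed into a hash. The naive baseline below recomputes every window from scratch (the 2-bit DNA encoding is an assumption); FSH's reuse of adjacent hash values is not reproduced here.

        # Naive spaced-seed hashing baseline: pack 2 bits per selected position.
        ENCODE = {"A": 0, "C": 1, "G": 2, "T": 3}

        def spaced_seed_hashes(seq: str, seed: str):
            ones = [i for i, ch in enumerate(seed) if ch == "1"]
            for start in range(len(seq) - len(seed) + 1):
                h = 0
                for j in ones:
                    h = (h << 2) | ENCODE[seq[start + j]]
                yield start, h

        for pos, h in spaced_seed_hashes("ACGTACGTAC", "1101"):
            print(pos, h)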

  13. Cryptographic Boolean functions and applications

    CERN Document Server

    Cusick, Thomas W

    2009-01-01

    Boolean functions are the building blocks of symmetric cryptographic systems. Symmetrical cryptographic algorithms are fundamental tools in the design of all types of digital security systems (i.e. communications, financial and e-commerce).Cryptographic Boolean Functions and Applications is a concise reference that shows how Boolean functions are used in cryptography. Currently, practitioners who need to apply Boolean functions in the design of cryptographic algorithms and protocols need to patch together needed information from a variety of resources (books, journal articles and other sources). This book compiles the key essential information in one easy to use, step-by-step reference. Beginning with the basics of the necessary theory the book goes on to examine more technical topics, some of which are at the frontier of current research.-Serves as a complete resource for the successful design or implementation of cryptographic algorithms or protocols using Boolean functions -Provides engineers and scient...

  14. Dakota - hashing from a combination of modular arithmetic and symmetric cryptography

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Knudsen, Lars Ramkilde; Thomsen, Søren Steffen

    In this paper a cryptographic hash function is proposed, where collision resistance is based upon an assumption that involves squaring modulo an RSA modulus in combination with a one-way function that does not compress its input, and may therefore be constructed from standard techniques and assum...

  15. Dakota – Hashing from a Combination of Modular Arithmetic and Symmetric Cryptography

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Knudsen, Lars Ramkilde; Thomsen, Søren Steffen

    2008-01-01

    In this paper a cryptographic hash function is proposed, where collision resistance is based upon an assumption that involves squaring modulo an RSA modulus in combination with a one-way function that does not compress its input, and may therefore be constructed from standard techniques and assum...

  16. Incremental cryptography and security of public hash functions ...

    African Journals Online (AJOL)

    An investigation of incremental algorithms for cryptographic functions was initiated. The problem, for collision-free hashing, is to design a scheme for which there exists an efficient “update” algorithm: this algorithm is given the hash function H, the hash h = H(M) of message M and the “replacement request” (j, m), and outputs ...

  17. An algorithm for the detection of move repetition without the use of hash-keys

    Directory of Open Access Journals (Sweden)

    Vučković Vladan

    2007-01-01

    Full Text Available This paper addresses the theoretical and practical aspects of an important problem in computer chess programming - the problem of draw detection in cases of position repetition. The standard approach used in the majority of computer chess programs is hash-oriented. This method is sufficient in most cases, as the Zobrist keys are already present due to the systematic positional hashing, so they need not be computed anew for the purpose of draw detection. The new type of algorithm that we have developed solves the problem of draw detection in cases when Zobrist keys are not used in the program, i.e. in cases when the memory is not hashed.

  18. Fast and powerful hashing using tabulation

    DEFF Research Database (Denmark)

    Thorup, Mikkel

    2017-01-01

    Randomized algorithms are often enjoyed for their simplicity, but the hash functions employed to yield the desired probabilistic guarantees are often too complicated to be practical. Here, we survey recent results on how simple hashing schemes based on tabulation provide unexpectedly strong......, linear probing and Cuckoo hashing. Next, we consider twisted tabulation where one input character is "twisted" in a simple way. The resulting hash function has powerful distributional properties: Chernoff-style tail bounds and a very small bias for minwise hashing. This also yields an extremely fast...... pseudorandom number generator that is provably good for many classic randomized algorithms and data structures. Finally, we consider double tabulation where we compose two simple tabulation functions, applying one to the output of the other, and show that this yields very high independence in the classic
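
    A minimal sketch of simple tabulation hashing, the scheme surveyed above: the key is split into characters, each character indexes its own table of random words, and the looked-up words are XORed. Splitting a 32-bit key into four 8-bit characters is an illustrative choice.

        import random

        random.seed(42)
        # One table of 256 random 32-bit words per 8-bit character of the key.
        TABLES = [[random.getrandbits(32) for _ in range(256)] for _ in range(4)]

        def tabulation_hash(key: int) -> int:
            h = 0
            for i in range(4):
                char = (key >> (8 * i)) & 0xFF
                h ^= TABLES[i][char]
            return h

        print(hex(tabulation_hash(0xDEADBEEF)))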

  19. Structure Sensitive Hashing With Adaptive Product Quantization.

    Science.gov (United States)

    Liu, Xianglong; Du, Bowen; Deng, Cheng; Liu, Ming; Lang, Bo

    2016-10-01

    Hashing has been proved an attractive solution to approximate nearest neighbor search, owing to its theoretical guarantee and computational efficiency. Though most prior hashing algorithms can achieve low memory and computation consumption by pursuing compact hash codes, they are still far from capable of learning discriminative hash functions from data with a complex inherent structure. To address this issue, in this paper, we propose a structure sensitive hashing based on cluster prototypes, which explicitly exploits both global and local structures. An alternating optimization algorithm, respectively minimizing the quantization loss and spectral embedding loss, is presented to simultaneously discover the cluster prototypes for each hash function, and optimally assign unique binary codes to them satisfying the affinity alignment between them. For hash codes of a desired length, an adaptive bit assignment is further appended to the product quantization of the subspaces, approximating the Hamming distances and meanwhile balancing the variance among hash functions. Experimental results on four large-scale benchmarks CIFAR-10, NUS-WIDE, SIFT1M, and GIST1M demonstrate that our approach significantly outperforms state-of-the-art hashing methods in terms of semantic and metric neighbor search.

  20. Parallel keyed hash function construction based on chaotic maps

    International Nuclear Information System (INIS)

    Xiao Di; Liao Xiaofeng; Deng Shaojiang

    2008-01-01

    Recently, a variety of chaos-based hash functions have been proposed. Nevertheless, none of them works efficiently in parallel computing environment. In this Letter, an algorithm for parallel keyed hash function construction is proposed, whose structure can ensure the uniform sensitivity of hash value to the message. By means of the mechanism of both changeable-parameter and self-synchronization, the keystream establishes a close relation with the algorithm key, the content and the order of each message block. The entire message is modulated into the chaotic iteration orbit, and the coarse-graining trajectory is extracted as the hash value. Theoretical analysis and computer simulation indicate that the proposed algorithm can satisfy the performance requirements of hash function. It is simple, efficient, practicable, and reliable. These properties make it a good choice for hash on parallel computing platform

  1. Scalable Packet Classification with Hash Tables

    Science.gov (United States)

    Wang, Pi-Chung

    In the last decade, the technique of packet classification has been widely deployed in various network devices, including routers, firewalls and network intrusion detection systems. In this work, we improve the performance of packet classification by using multiple hash tables. The existing hash-based algorithms have superior scalability with respect to the required space; however, their search performance may not be comparable to other algorithms. To improve the search performance, we propose a tuple reordering algorithm to minimize the number of accessed hash tables with the aid of bitmaps. We also use pre-computation to ensure the accuracy of our search procedure. Performance evaluation based on both real and synthetic filter databases shows that our scheme is effective and scalable and the pre-computation cost is moderate.

  2. A hash-based image encryption algorithm

    Science.gov (United States)

    Cheddad, Abbas; Condell, Joan; Curran, Kevin; McKevitt, Paul

    2010-03-01

    There exist several algorithms that deal with text encryption. However, there has been little research carried out to date on encrypting digital images or video files. This paper describes a novel way of encrypting digital images with password protection using a 1D SHA-2 algorithm coupled with a compound forward transform. A spatial mask is generated from the frequency domain by taking advantage of the conjugate symmetry of the complex imaginary part of the Fourier Transform. This mask is then XORed with the bit stream of the original image. Exclusive OR (XOR) is a symmetric logical operation that yields 0 if both binary pixels are zeros or both are ones, and 1 otherwise; this is equivalent to (pixel1 + pixel2) mod 2. Finally, confusion is applied based on the displacement of the cipher's pixels in accordance with a reference mask. Both security and performance aspects of the proposed method are analyzed, which prove that the method is efficient and secure from a cryptographic point of view. One of the merits of such an algorithm is to force a continuous-tone payload, a steganographic term, to map onto a balanced bit distribution sequence. This bit balance is needed in certain applications, such as steganography and watermarking, since it is likely to have a balanced perceptibility effect on the cover image when embedding.
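
    The paper derives its mask from the Fourier domain of the image; the generic XOR-masking step itself can be sketched with a stand-in keystream derived from SHA-256 of a password (an assumption, not the paper's mask construction). Because XOR is its own inverse, applying the same mask twice restores the image.

        import hashlib
        import numpy as np

        def keystream(password: bytes, length: int) -> np.ndarray:
            # Stand-in mask: repeated SHA-256 of password || counter.
            out, counter = bytearray(), 0
            while len(out) < length:
                out += hashlib.sha256(password + counter.to_bytes(4, "big")).digest()
                counter += 1
            return np.frombuffer(bytes(out[:length]), dtype=np.uint8)

        def xor_image(image: np.ndarray, password: bytes) -> np.ndarray:
            mask = keystream(password, image.size).reshape(image.shape)
            return image ^ mask                       # XOR is its own inverse

        img = np.arange(16, dtype=np.uint8).reshape(4, 4)
        enc = xor_image(img, b"secret")
        print(np.array_equal(xor_image(enc, b"secret"), img))   # True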

  3. Random multispace quantization as an analytic mechanism for BioHashing of biometric and random identity inputs.

    Science.gov (United States)

    Teoh, Andrew B J; Goh, Alwyn; Ngo, David C L

    2006-12-01

    Biometric analysis for identity verification is becoming a widespread reality. Such implementations necessitate large-scale capture and storage of biometric data, which raises serious issues in terms of data privacy and (if such data is compromised) identity theft. These problems stem from the essential permanence of biometric data, which (unlike secret passwords or physical tokens) cannot be refreshed or reissued if compromised. Our previously presented biometric-hash framework prescribes the integration of external (password or token-derived) randomness with user-specific biometrics, resulting in bitstring outputs with security characteristics (i.e., noninvertibility) comparable to cryptographic ciphers or hashes. The resultant BioHashes are hence cancellable, i.e., straightforwardly revoked and reissued (via refreshed password or reissued token) if compromised. BioHashing furthermore enhances recognition effectiveness, which is explained in this paper as arising from the Random Multispace Quantization (RMQ) of biometric and external random inputs.

  4. An adaptive cryptographic accelerator for network storage security on dynamically reconfigurable platform

    Science.gov (United States)

    Tang, Li; Liu, Jing-Ning; Feng, Dan; Tong, Wei

    2008-12-01

    Existing security solutions in network storage environments perform poorly because cryptographic operations (encryption and decryption) implemented in software can dramatically reduce system performance. In this paper we propose a cryptographic hardware accelerator on a dynamically reconfigurable platform for the security of high performance network storage systems. We employ a dynamically reconfigurable platform based on an FPGA to implement a PowerPC-based embedded system, which executes cryptographic algorithms. To reduce the reconfiguration latency, we apply prefetch scheduling. Moreover, the processing elements can be dynamically configured to support different cryptographic algorithms according to the request received by the accelerator. In the experiment, we have implemented the AES (Rijndael) and 3DES cryptographic algorithms in the reconfigurable accelerator. Our proposed reconfigurable cryptographic accelerator can dramatically increase performance compared with traditional software-based network storage systems.

  5. Linear Subspace Ranking Hashing for Cross-Modal Retrieval.

    Science.gov (United States)

    Li, Kai; Qi, Guo-Jun; Ye, Jun; Hua, Kien A

    2017-09-01

    Hashing has attracted a great deal of research in recent years due to its effectiveness for the retrieval and indexing of large-scale high-dimensional multimedia data. In this paper, we propose a novel ranking-based hashing framework that maps data from different modalities into a common Hamming space where the cross-modal similarity can be measured using Hamming distance. Unlike existing cross-modal hashing algorithms where the learned hash functions are binary space partitioning functions, such as the sign and threshold function, the proposed hashing scheme takes advantage of a new class of hash functions closely related to rank correlation measures which are known to be scale-invariant, numerically stable, and highly nonlinear. Specifically, we jointly learn two groups of linear subspaces, one for each modality, so that features' ranking orders in different linear subspaces maximally preserve the cross-modal similarities. We show that the ranking-based hash function has a natural probabilistic approximation which transforms the original highly discontinuous optimization problem into one that can be efficiently solved using simple gradient descent algorithms. The proposed hashing framework is also flexible in the sense that the optimization procedures are not tied up to any specific form of loss function, which is typical for existing cross-modal hashing methods, but rather we can flexibly accommodate different loss functions with minimal changes to the learning steps. We demonstrate through extensive experiments on four widely-used real-world multimodal datasets that the proposed cross-modal hashing method can achieve competitive performance against several state-of-the-arts with only moderate training and testing time.

  6. Deep Constrained Siamese Hash Coding Network and Load-Balanced Locality-Sensitive Hashing for Near Duplicate Image Detection.

    Science.gov (United States)

    Hu, Weiming; Fan, Yabo; Xing, Junliang; Sun, Liang; Cai, Zhaoquan; Maybank, Stephen

    2018-09-01

    We construct a new efficient near duplicate image detection method using a hierarchical hash code learning neural network and load-balanced locality-sensitive hashing (LSH) indexing. We propose a deep constrained siamese hash coding neural network combined with deep feature learning. Our neural network is able to extract effective features for near duplicate image detection. The extracted features are used to construct a LSH-based index. We propose a load-balanced LSH method to produce load-balanced buckets in the hashing process. The load-balanced LSH significantly reduces the query time. Based on the proposed load-balanced LSH, we design an effective and feasible algorithm for near duplicate image detection. Extensive experiments on three benchmark data sets demonstrate the effectiveness of our deep siamese hash encoding network and load-balanced LSH.

  7. One-way hash function construction based on the spatiotemporal chaotic system

    International Nuclear Information System (INIS)

    Luo Yu-Ling; Du Ming-Hui

    2012-01-01

    Based on the spatiotemporal chaotic system, a novel algorithm for constructing a one-way hash function is proposed and analysed. The message is divided into fixed length blocks. Each message block is processed by the hash compression function in parallel. The hash compression is constructed based on the spatiotemporal chaos. In each message block, the ASCII code and its position in the whole message block chain constitute the initial conditions and the key of the hash compression function. The final hash value is generated by further compressing the mixed result of all the hash compression values. Theoretic analyses and numerical simulations show that the proposed algorithm presents high sensitivity to the message and key, good statistical properties, and strong collision resistance. (general)

  8. Cryptographic framework for document-objects resulting from multiparty collaborative transactions.

    Science.gov (United States)

    Goh, A

    2000-01-01

    Multiparty transactional frameworks--i.e. Electronic Data Interchange (EDI) or Health Level (HL) 7--often result in composite documents which can be accurately modelled using hyperlinked document-objects. The structural complexity arising from multiauthor involvement and transaction-specific sequencing would be poorly handled by conventional digital signature schemes based on a single evaluation of a one-way hash function and asymmetric cryptography. In this paper we outline the generation of structure-specific authentication hash-trees for the authentication of transactional document-objects, followed by asymmetric signature generation on the hash-tree value. Server-side multi-client signature verification would probably constitute the single most compute-intensive task, hence the motivation for our usage of the Rabin signature protocol which results in significantly reduced verification workloads compared to the more commonly applied Rivest-Shamir-Adleman (RSA) protocol. Data privacy is handled via symmetric encryption of message traffic using session-specific keys obtained through key-negotiation mechanisms based on discrete-logarithm cryptography. Individual client-to-server channels can be secured using a double key-pair variation of Diffie-Hellman (DH) key negotiation, usage of which also enables bidirectional node authentication. The reciprocal server-to-client multicast channel is secured through Burmester-Desmedt (BD) key-negotiation which enjoys significant advantages over the usual multiparty extensions to the DH protocol. The implementation of hash-tree signatures and bi/multidirectional key negotiation results in a comprehensive cryptographic framework for multiparty document-objects satisfying both authentication and data privacy requirements.
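
    A minimal sketch of the hash-tree idea the record builds on: leaf documents are hashed, parent nodes hash the concatenation of their children, and the asymmetric signature is then computed over the root. The hash function choice (SHA-256) and the padding rule for odd levels are illustrative assumptions, not necessarily the paper's parameters.

        import hashlib

        def h(data: bytes) -> bytes:
            return hashlib.sha256(data).digest()

        def merkle_root(documents: list) -> bytes:
            level = [h(doc) for doc in documents]
            while len(level) > 1:
                if len(level) % 2:
                    level.append(level[-1])          # duplicate an odd trailing node
                level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
            return level[0]

        docs = [b"lab report", b"prescription", b"insurance claim"]
        print(merkle_root(docs).hex())   # the value the asymmetric signature covers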

  9. Improved security analysis of Fugue-256

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Bagheri, Nasour; Knudsen, Lars Ramkilde

    2011-01-01

    Fugue is a cryptographic hash function designed by Halevi, Hall and Jutla and was one of the fourteen hash algorithms of the second round of NIST’s SHA3 hash competition. We consider Fugue-256, the 256-bit instance of Fugue. Fugue-256 updates a state of 960 bits with a round transformation R para...

  10. One-way hash function construction based on chaotic map network

    International Nuclear Information System (INIS)

    Yang Huaqian; Wong, K.-W.; Liao Xiaofeng; Wang Yong; Yang Degang

    2009-01-01

    A novel chaotic hash algorithm based on a network structure formed by 16 chaotic maps is proposed. The original message is first padded with zeros to make the length a multiple of four. Then it is divided into a number of blocks, each containing 4 bytes. In the hashing process, the blocks are mixed together by the chaotic map network, since the initial value and the control parameter of each tent map are dynamically determined by the output of its neighbors. To enhance the confusion and diffusion effect, the cipher block chaining (CBC) mode is adopted in the algorithm. Theoretic analyses and numerical simulations both show that the proposed hash algorithm possesses good statistical properties, strong collision resistance and high flexibility, as required by practical keyed hash functions.

  11. Paradeisos: A perfect hashing algorithm for many-body eigenvalue problems

    Science.gov (United States)

    Jia, C. J.; Wang, Y.; Mendl, C. B.; Moritz, B.; Devereaux, T. P.

    2018-03-01

    We describe an essentially perfect hashing algorithm for calculating the position of an element in an ordered list, appropriate for the construction and manipulation of many-body Hamiltonian, sparse matrices. Each element of the list corresponds to an integer value whose binary representation reflects the occupation of single-particle basis states for each element in the many-body Hilbert space. The algorithm replaces conventional methods, such as binary search, for locating the elements of the ordered list, eliminating the need to store the integer representation for each element, without increasing the computational complexity. Combined with the "checkerboard" decomposition of the Hamiltonian matrix for distribution over parallel computing environments, this leads to a substantial savings in aggregate memory. While the algorithm can be applied broadly to many-body, correlated problems, we demonstrate its utility in reducing total memory consumption for a series of fermionic single-band Hubbard model calculations on small clusters with progressively larger Hilbert space dimension.
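
    One standard way to realize the "position of an element in an ordered list" lookup the record describes, without storing the list, is combinatorial ranking of the occupation bit pattern by binomial coefficients; the sketch below shows that generic scheme and may differ in detail from the Paradeisos algorithm itself.

        from math import comb

        def rank(state: int, n_bits: int) -> int:
            # Lexicographic rank of an n-bit pattern among all patterns with the
            # same number of set bits, computed without any stored table.
            r, ones_seen = 0, 0
            for pos in range(n_bits):                # scan from least significant bit
                if (state >> pos) & 1:
                    ones_seen += 1
                    r += comb(pos, ones_seen)        # patterns preceding this one
            return r

        # All 4-bit states with two particles, in increasing integer order:
        states = [s for s in range(16) if bin(s).count("1") == 2]
        print([rank(s, 4) for s in states])          # [0, 1, 2, 3, 4, 5]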

  12. An Ad Hoc Adaptive Hashing Technique forNon-Uniformly Distributed IP Address Lookup in Computer Networks

    Directory of Open Access Journals (Sweden)

    Christopher Martinez

    2007-02-01

    Full Text Available Hashing algorithms have long been widely adopted to design a fast address look-up process, which involves a search through a large database to find a record associated with a given key. Hashing algorithms involve transforming a key inside each target data item into a hash value, hoping that the hashing renders the database a uniform distribution with respect to this new hash value. The closer the final distribution is to uniform, the less search time is required when a query is made. When the database is already key-wise uniformly distributed, any regular hashing algorithm, such as bit-extraction, bit-group XOR, etc., would easily lead to a statistically perfect uniform distribution after the hashing. On the other hand, if records in the database are not uniformly distributed, as in almost all known practical applications, then even different regular hash functions would lead to very different performance. When the target database has a key with a highly skewed value distribution, the performance delivered by regular hashing algorithms usually becomes far from desirable. This paper aims at designing a hashing algorithm that achieves the highest probability of leading to a uniformly distributed hash result from a non-uniformly distributed database. An analytical pre-process on the original database is first performed to extract critical information that significantly benefits the design of a better hashing algorithm. This process includes sorting on the bits of the key to prioritize their use in the XOR hashing sequence, or in simple bit extraction, or even a combination of both. Such an ad hoc hash design is critical to adapting to all real-time situations when there exists a changing (and/or expanding) database with an irregular non-uniform distribution. Significant improvement is obtained in simulation results on randomly generated data as well as real data.
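
    The two "regular" schemes the record contrasts - plain bit extraction and bit-group XOR folding - are sketched below for a 32-bit IPv4 key. Which bit positions are extracted (or how groups are folded) is exactly what the paper's analytical pre-process chooses; the positions used here are arbitrary assumptions.

        def bit_extraction_hash(key: int, positions=(0, 5, 11, 17, 23, 29, 30, 31)) -> int:
            h = 0
            for i, pos in enumerate(positions):       # pick selected key bits
                h |= ((key >> pos) & 1) << i
            return h                                  # 8-bit bucket index

        def xor_fold_hash(key: int, bits: int = 8) -> int:
            h = 0
            while key:                                # XOR successive 8-bit groups
                h ^= key & ((1 << bits) - 1)
                key >>= bits
            return h

        ip = (192 << 24) | (168 << 16) | (1 << 8) | 77   # 192.168.1.77
        print(bit_extraction_hash(ip), xor_fold_hash(ip))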

  13. Hash function based on piecewise nonlinear chaotic map

    International Nuclear Information System (INIS)

    Akhavan, A.; Samsudin, A.; Akhshani, A.

    2009-01-01

    Chaos-based cryptography appeared recently in the early 1990s as an original application of nonlinear dynamics in the chaotic regime. In this paper, an algorithm for one-way hash function construction based on a piecewise nonlinear chaotic map with a variant probability parameter is proposed. The proposed algorithm is also an attempt to present a new chaotic hash function based on multithreaded programming. In this chaotic scheme, the message is connected to the chaotic map using the probability parameter and other parameters of the chaotic map, such as the control parameter and initial condition, so that the generated hash value is highly sensitive to the message. Simulation results indicate that the proposed algorithm presents several interesting features, such as high flexibility, good statistical properties, high key sensitivity and message sensitivity. These properties make the scheme a suitable choice for practical applications.

  14. A novel method for one-way hash function construction based on spatiotemporal chaos

    Energy Technology Data Exchange (ETDEWEB)

    Ren Haijun [College of Software Engineering, Chongqing University, Chongqing 400044 (China); State Key Laboratory of Power Transmission Equipment and System Security and New Technology, Chongqing University, Chongqing 400044 (China)], E-mail: jhren@cqu.edu.cn; Wang Yong; Xie Qing [Key Laboratory of Electronic Commerce and Logistics of Chongqing, Chongqing University of Posts and Telecommunications, Chongqing 400065 (China); Yang Huaqian [Department of Computer and Modern Education Technology, Chongqing Education of College, Chongqing 400067 (China)

    2009-11-30

    A novel hash algorithm based on a spatiotemporal chaos is proposed. The original message is first padded with zeros if needed. Then it is divided into a number of blocks, each of which contains 32 bytes. In the hashing process, each block is partitioned into eight 32-bit values and input into the spatiotemporal chaotic system. Then, after iterating the system four times, the next block is processed in the same way. To enhance the confusion and diffusion effect, the cipher block chaining (CBC) mode is adopted in the algorithm. The hash value is obtained from the final state value of the spatiotemporal chaotic system. Theoretic analyses and numerical simulations both show that the proposed hash algorithm possesses good statistical properties, strong collision resistance and high efficiency, as required by practical keyed hash functions.

  15. A novel method for one-way hash function construction based on spatiotemporal chaos

    International Nuclear Information System (INIS)

    Ren Haijun; Wang Yong; Xie Qing; Yang Huaqian

    2009-01-01

    A novel hash algorithm based on a spatiotemporal chaos is proposed. The original message is first padded with zeros if needed. Then it is divided into a number of blocks, each of which contains 32 bytes. In the hashing process, each block is partitioned into eight 32-bit values and input into the spatiotemporal chaotic system. Then, after iterating the system four times, the next block is processed in the same way. To enhance the confusion and diffusion effect, the cipher block chaining (CBC) mode is adopted in the algorithm. The hash value is obtained from the final state value of the spatiotemporal chaotic system. Theoretic analyses and numerical simulations both show that the proposed hash algorithm possesses good statistical properties, strong collision resistance and high efficiency, as required by practical keyed hash functions.

  16. Security Analysis of Randomize-Hash-then-Sign Digital Signatures

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Knudsen, Lars Ramkilde

    2012-01-01

    At CRYPTO 2006, Halevi and Krawczyk proposed two randomized hash function modes and analyzed the security of digital signature algorithms based on these constructions. They showed that the security of signature schemes based on the two randomized hash function modes relies on properties similar...... functions, such as for the Davies-Meyer construction used in the popular hash functions such as MD5 designed by Rivest and the SHA family of hash functions designed by the National Security Agency (NSA), USA and published by NIST in the Federal Information Processing Standards (FIPS). We show an online...... 800-106. We discuss some important applications of our attacks and discuss their applicability on signature schemes based on hash functions with ‘built-in’ randomization. Finally, we compare our attacks on randomize-hash-then-sign schemes with the generic forgery attacks on the standard hash...

  17. Large-Scale Unsupervised Hashing with Shared Structure Learning.

    Science.gov (United States)

    Liu, Xianglong; Mu, Yadong; Zhang, Danchen; Lang, Bo; Li, Xuelong

    2015-09-01

    Hashing methods are effective in generating compact binary signatures for images and videos. This paper addresses an important open issue in the literature, i.e., how to learn compact hash codes by enhancing the complementarity among different hash functions. Most of prior studies solve this problem either by adopting time-consuming sequential learning algorithms or by generating the hash functions which are subject to some deliberately-designed constraints (e.g., enforcing hash functions orthogonal to one another). We analyze the drawbacks of past works and propose a new solution to this problem. Our idea is to decompose the feature space into a subspace shared by all hash functions and its complementary subspace. On one hand, the shared subspace, corresponding to the common structure across different hash functions, conveys most relevant information for the hashing task. Similar to data de-noising, irrelevant information is explicitly suppressed during hash function generation. On the other hand, in case that the complementary subspace also contains useful information for specific hash functions, the final form of our proposed hashing scheme is a compromise between these two kinds of subspaces. To make hash functions not only preserve the local neighborhood structure but also capture the global cluster distribution of the whole data, an objective function incorporating spectral embedding loss, binary quantization loss, and shared subspace contribution is introduced to guide the hash function learning. We propose an efficient alternating optimization method to simultaneously learn both the shared structure and the hash functions. Experimental results on three well-known benchmarks CIFAR-10, NUS-WIDE, and a-TRECVID demonstrate that our approach significantly outperforms state-of-the-art hashing methods.

  18. Online Hashing for Scalable Remote Sensing Image Retrieval

    Directory of Open Access Journals (Sweden)

    Peng Li

    2018-05-01

    Full Text Available Recently, hashing-based large-scale remote sensing (RS image retrieval has attracted much attention. Many new hashing algorithms have been developed and successfully applied to fast RS image retrieval tasks. However, there exists an important problem rarely addressed in the research literature of RS image hashing. The RS images are practically produced in a streaming manner in many real-world applications, which means the data distribution keeps changing over time. Most existing RS image hashing methods are batch-based models whose hash functions are learned once for all and kept fixed all the time. Therefore, the pre-trained hash functions might not fit the ever-growing new RS images. Moreover, the batch-based models have to load all the training images into memory for model learning, which consumes many computing and memory resources. To address the above deficiencies, we propose a new online hashing method, which learns and adapts its hashing functions with respect to the newly incoming RS images in terms of a novel online partial random learning scheme. Our hash model is updated in a sequential mode such that the representative power of the learned binary codes for RS images are improved accordingly. Moreover, benefiting from the online learning strategy, our proposed hashing approach is quite suitable for scalable real-world remote sensing image retrieval. Extensive experiments on two large-scale RS image databases under online setting demonstrated the efficacy and effectiveness of the proposed method.

  19. Five Performance Enhancements for Hybrid Hash Join

    National Research Council Canada - National Science Library

    Graefe, Goetz

    1992-01-01

    .... We discuss five performance enhancements for hybrid hash join algorithms, namely data compression, large cluster sizes and multi-level recursion, role reversal of build and probe inputs, histogram...

  20. Constructing a one-way hash function based on the unified chaotic system

    International Nuclear Information System (INIS)

    Long Min; Peng Fei; Chen Guanrong

    2008-01-01

    A new one-way hash function based on the unified chaotic system is constructed. With different values of a key parameter, the unified chaotic system represents different chaotic systems, based on which the one-way hash function algorithm is constructed with three round operations and an initial vector on an input message. In each round operation, the parameters are processed by three different chaotic systems generated from the unified chaotic system. Feed-forwards are used at the end of each round operation and at the end of each element of the message processing. Meanwhile, in each round operation, parameter-exchanging operations are implemented. Then, the hash value of length 160 bits is obtained from the last six parameters. Simulation and analysis both demonstrate that the algorithm has great flexibility, satisfactory hash performance, weak collision property, and high security. (general)

  1. Ranking Based Locality Sensitive Hashing Enabled Cancelable Biometrics: Index-of-Max Hashing

    OpenAIRE

    Jin, Zhe; Lai, Yen-Lung; Hwang, Jung-Yeon; Kim, Soohyung; Teoh, Andrew Beng Jin

    2017-01-01

    In this paper, we propose a ranking based locality sensitive hashing inspired two-factor cancelable biometrics, dubbed "Index-of-Max" (IoM) hashing for biometric template protection. With externally generated random parameters, IoM hashing transforms a real-valued biometric feature vector into discrete index (max ranked) hashed code. We demonstrate two realizations from IoM hashing notion, namely Gaussian Random Projection based and Uniformly Random Permutation based hashing schemes. The disc...
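
    The Gaussian Random Projection realisation described above is straightforward to sketch: project the feature vector onto groups of random directions and keep only the index of the maximum per group. The following is an illustrative toy version, not the authors' implementation; the parameter names (m, q) are assumptions.

```python
import numpy as np

def iom_hash(x, m=16, q=8, seed=0):
    """Index-of-Max hashing, Gaussian Random Projection flavour (toy sketch).

    x    : real-valued biometric feature vector
    m, q : produce m hashed entries, each the argmax over q random projections
    """
    rng = np.random.default_rng(seed)            # externally generated random parameters
    x = np.asarray(x, dtype=float)
    code = []
    for _ in range(m):
        W = rng.standard_normal((q, x.size))     # q random projection directions
        code.append(int(np.argmax(W @ x)))       # keep only the index of the maximum
    return code

print(iom_hash(np.random.rand(128)))
```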

  2. Hashing for Statistics over K-Partitions

    DEFF Research Database (Denmark)

    Dahlgaard, Soren; Knudsen, Mathias Baek Tejs; Rotenberg, Eva

    2015-01-01

    In this paper we analyze a hash function for k-partitioning a set into bins, obtaining strong concentration bounds for standard algorithms combining statistics from each bin. This generic method was originally introduced by Flajolet and Martin [FOCS'83] in order to save a factor Ω(k) of time per...... concentration bounds on the most popular applications of k-partitioning similar to those we would get using a truly random hash function. The analysis is very involved and implies several new results of independent interest for both simple and double tabulation, e.g. A simple and efficient construction...
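
    A toy example of the generic k-partitioning pattern (one hash value selects a bin and feeds the per-bin statistic, and the k statistics are combined) is sketched below for distinct-count estimation; it uses SHA-256 as a stand-in for the tabulation-based hash functions the paper actually analyses.

```python
import hashlib

def estimate_distinct(items, k=64):
    """k-partition sketch: one hash value both selects a bin and feeds the
    per-bin statistic (here, a Flajolet-Martin style bit pattern), and the
    k per-bin statistics are combined into a single estimate."""
    max_rho = [0] * k
    for item in items:
        h = int.from_bytes(hashlib.sha256(str(item).encode()).digest()[:8], "big")
        bin_idx = h % k                          # which of the k partitions
        rest = h // k                            # remaining bits feed the statistic
        rho = (rest & -rest).bit_length()        # 1 + number of trailing zero bits
        max_rho[bin_idx] = max(max_rho[bin_idx], rho)
    # stochastic averaging over the k bins, with the classic correction factor
    return k * 2 ** (sum(max_rho) / k) / 0.77351

print(estimate_distinct(range(100_000)))
```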

  3. Design and Analysis of Optimization Algorithms to Minimize Cryptographic Processing in BGP Security Protocols.

    Science.gov (United States)

    Sriram, Vinay K; Montgomery, Doug

    2017-07-01

    The Internet is subject to attacks due to vulnerabilities in its routing protocols. One proposed approach to attain greater security is to cryptographically protect network reachability announcements exchanged between Border Gateway Protocol (BGP) routers. This study proposes and evaluates the performance and efficiency of various optimization algorithms for validation of digitally signed BGP updates. In particular, this investigation focuses on the BGPSEC (BGP with SECurity extensions) protocol, currently under consideration for standardization in the Internet Engineering Task Force. We analyze three basic BGPSEC update processing algorithms: Unoptimized, Cache Common Segments (CCS) optimization, and Best Path Only (BPO) optimization. We further propose and study cache management schemes to be used in conjunction with the CCS and BPO algorithms. The performance metrics used in the analyses are: (1) routing table convergence time after BGPSEC peering reset or router reboot events and (2) peak-second signature verification workload. Both analytical modeling and detailed trace-driven simulation were performed. Results show that the BPO algorithm is 330% to 628% faster than the unoptimized algorithm for routing table convergence in a typical Internet core-facing provider edge router.

  4. One-way hash function based on hyper-chaotic cellular neural network

    International Nuclear Information System (INIS)

    Yang Qunting; Gao Tiegang

    2008-01-01

    The design of an efficient one-way hash function with good performance is a hot spot in modern cryptography research. In this paper, a hash function construction method based on a cellular neural network with hyper-chaos characteristics is proposed. First, the chaos sequence is obtained by iterating the cellular neural network with the Runge–Kutta algorithm, and then the chaos sequence is iterated with the message. The hash code is obtained through the corresponding transform of the latter chaos sequence. Simulation and analysis demonstrate that the new method has the merits of convenience, high sensitivity to initial values, good hash performance and, especially, strong stability. (general)

  5. On Randomizing Hash Functions to Strengthen the Security of Digital Signatures

    DEFF Research Database (Denmark)

    Halevi and Krawczyk proposed a message randomization algorithm called RMX as a front-end tool to the hash-then-sign digital signature schemes such as DSS and RSA in order to free their reliance on the collision resistance property of the hash functions. They have shown that to forge a RMX-hash-th...... that use Davies-Meyer schemes and a variant of RMX published by NIST in its Draft Special Publication (SP) 800-106. We discuss some important applications of our attack....

  6. On randomizing hash functions to strengthen the security of digital signatures

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Knudsen, Lars Ramkilde

    2009-01-01

    Halevi and Krawczyk proposed a message randomization algorithm called RMX as a front-end tool to the hash-then-sign digital signature schemes such as DSS and RSA in order to free their reliance on the collision resistance property of the hash functions. They have shown that to forge a RMX-hash-th...... schemes that use Davies-Meyer schemes and a variant of RMX published by NIST in its Draft Special Publication (SP) 800-106. We discuss some important applications of our attack....

  7. Quicksort, largest bucket, and min-wise hashing with limited independence

    DEFF Research Database (Denmark)

    Knudsen, Mathias Bæk Tejs; Stöckel, Morten

    2015-01-01

    Randomized algorithms and data structures are often analyzed under the assumption of access to a perfect source of randomness. The most fundamental metric used to measure how “random” a hash function or a random number generator is, is its independence: a sequence of random variables is said...... to be k-independent if every variable is uniform and every size k subset is independent. In this paper we consider three classic algorithms under limited independence. Besides the theoretical interest in removing the unrealistic assumption of full independence, the work is motivated by lower independence...... being more practical. We provide new bounds for randomized quicksort, min-wise hashing and largest bucket size under limited independence. Our results can be summarized as follows. Randomized Quicksort. When pivot elements are computed using a 5-independent hash function, Karloff and Raghavan, J.ACM’93...

  8. The Usefulness of Multilevel Hash Tables with Multiple Hash Functions in Large Databases

    Directory of Open Access Journals (Sweden)

    A.T. Akinwale

    2009-05-01

    Full Text Available In this work, an attempt is made to select three good hash functions which uniformly distribute hash values, permute their internal states, and allow the input bits to generate different output bits. These functions are used in different levels of hash tables that are coded in the Java programming language, and a sizeable number of data records serve as primary data for testing the performance. The results show that two-level hash tables with three different hash functions give superior performance over a one-level hash table with two hash functions or a zero-level hash table with one function, in terms of reducing key conflicts and providing quick lookup of a particular element. The result helps reduce the complexity of the join operation in a query language from O(n²) to O(1) by placing the larger query result, if any, in multilevel hash tables with multiple hash functions, generating a shorter query result.
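
    A minimal sketch of the multilevel idea, with a distinct hash function at each level, might look as follows (illustrative Python rather than the authors' Java implementation; the table sizes and the _h1/_h2 mixers are arbitrary assumptions):

```python
class TwoLevelHashTable:
    """Two-level hash table with a distinct hash function per level
    (a sketch of the multilevel idea, not the authors' Java code)."""

    def __init__(self, top_size=97, bucket_size=13):
        self.top_size, self.bucket_size = top_size, bucket_size
        # first level -> second-level tables -> short collision lists
        self.table = [[[] for _ in range(bucket_size)] for _ in range(top_size)]

    @staticmethod
    def _h1(key):                                  # level-1 hash function
        return hash(("level-1", key))

    @staticmethod
    def _h2(key):                                  # level-2 hash function, different mixing
        return hash(("level-2", key))

    def insert(self, key, value):
        slots = self.table[self._h1(key) % self.top_size]
        slots[self._h2(key) % self.bucket_size].append((key, value))

    def lookup(self, key):
        slots = self.table[self._h1(key) % self.top_size]
        for k, v in slots[self._h2(key) % self.bucket_size]:
            if k == key:
                return v
        return None

t = TwoLevelHashTable()
for i in range(1000):
    t.insert(f"key-{i}", i)
print(t.lookup("key-123"))   # -> 123
```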

  9. A fingerprint key binding algorithm based on vector quantization and error correction

    Science.gov (United States)

    Li, Liang; Wang, Qian; Lv, Ke; He, Ning

    2012-04-01

    In recent years, research on seamlessly combining cryptosystems with biometric technologies, e.g. fingerprint recognition, has been conducted by many researchers. In this paper, we propose an algorithm for binding a fingerprint template with a cryptographic key, so that the key is protected and can be accessed through fingerprint verification. In order to cope with the intrinsic fuzziness of variant fingerprints, vector quantization and error correction techniques are introduced to transform the fingerprint template and then bind it with the key, after a process of fingerprint registration and extraction of the global ridge pattern of the fingerprint. The key itself is secure because only a hash value is stored, and it is released only when fingerprint verification succeeds. Experimental results demonstrate the effectiveness of our ideas.

  10. Probabilistic hypergraph based hash codes for social image search

    Institute of Scientific and Technical Information of China (English)

    Yi XIE; Hui-min YU; Roland HU

    2014-01-01

    With the rapid development of the Internet, recent years have seen the explosive growth of social media. This brings great challenges in performing efficient and accurate image retrieval on a large scale. Recent work shows that using hashing methods to embed high-dimensional image features and tag information into Hamming space provides a powerful way to index large collections of social images. By learning hash codes through a spectral graph partitioning algorithm, spectral hashing (SH) has shown promising performance among various hashing approaches. However, it is incomplete to model the relations among images only by pairwise simple graphs which ignore the relationship in a higher order. In this paper, we utilize a probabilistic hypergraph model to learn hash codes for social image retrieval. A probabilistic hypergraph model offers a higher order representation among social images by connecting more than two images in one hyperedge. Unlike a normal hypergraph model, a probabilistic hypergraph model considers not only the grouping information, but also the similarities between vertices in hyperedges. Experiments on Flickr image datasets verify the performance of our proposed approach.

  11. RFID Cryptographic Protocol Based on Cyclic Redundancy Check for High Efficiency

    Directory of Open Access Journals (Sweden)

    Nian Liu

    2014-04-01

    Full Text Available In this paper, an RFID encryption protocol is proposed to address the security problems of the wireless signal channel. To solve the privacy issues of electronic tags, the most common approach is to improve algorithms based on the Hash function. However, such algorithms tend to work only in specific domains. Given the limitations of these various algorithms, this paper puts forward a new protocol. When target tags need to be located accurately and rapidly in a mobile environment, this protocol achieves high efficiency by combining the Hash function, binary search and a CRC check. The results show that this algorithm can accurately identify the tags, with the merits of low cost, high execution rate, resistance to attack, etc.

  12. Cryptographic Protocols Based on Root Extracting

    DEFF Research Database (Denmark)

    Koprowski, Maciej

    In this thesis we design new cryptographic protocols, whose security is based on the hardness of root extracting or, more specifically, the RSA problem. First we study the problem of root extraction in finite Abelian groups, where the group order is unknown. This is a natural generalization of the...... complexity of root extraction, even if the algorithm can choose the "public exponent" itself. In other words, both the standard and the strong RSA assumption are provably true w.r.t. generic algorithms. The results hold for arbitrary groups, so security w.r.t. generic attacks follows for any cryptographic...... groups. In all cases, security follows from a well defined complexity assumption (the strong root assumption), without relying on random oracles. A smooth natural number has no big prime factors. The probability, that a random natural number not greater than x has all prime factors smaller than x^(1/u)

  13. Toward Optimal Manifold Hashing via Discrete Locally Linear Embedding.

    Science.gov (United States)

    Rongrong Ji; Hong Liu; Liujuan Cao; Di Liu; Yongjian Wu; Feiyue Huang

    2017-11-01

    Binary code learning, also known as hashing, has received increasing attention in large-scale visual search. By transforming high-dimensional features to binary codes, the original Euclidean distance is approximated via Hamming distance. More recently, it is advocated that it is the manifold distance, rather than the Euclidean distance, that should be preserved in the Hamming space. However, it remains an open problem to directly preserve the manifold structure by hashing. In particular, it first needs to build the local linear embedding in the original feature space, and then quantize such embedding to binary codes. Such a two-step coding is problematic and less optimized. Besides, the off-line learning is extremely time and memory consuming, as it needs to calculate the similarity matrix of the original data. In this paper, we propose a novel hashing algorithm, termed discrete locality linear embedding hashing (DLLH), which well addresses the above challenges. The DLLH directly reconstructs the manifold structure in the Hamming space, which learns optimal hash codes to maintain the local linear relationship of data points. To learn discrete locally linear embedding codes, we further propose a discrete optimization algorithm with an iterative parameter updating scheme. Moreover, an anchor-based acceleration scheme, termed Anchor-DLLH, is further introduced, which approximates the large similarity matrix by the product of two low-rank matrices. Experimental results on three widely used benchmark data sets, i.e., CIFAR10, NUS-WIDE, and YouTube Face, have shown superior performance of the proposed DLLH over the state-of-the-art approaches.

  14. Algorithmic cryptanalysis

    CERN Document Server

    Joux, Antoine

    2009-01-01

    Illustrating the power of algorithms, Algorithmic Cryptanalysis describes algorithmic methods with cryptographically relevant examples. Focusing on both private- and public-key cryptographic algorithms, it presents each algorithm either as a textual description, in pseudo-code, or in a C code program.Divided into three parts, the book begins with a short introduction to cryptography and a background chapter on elementary number theory and algebra. It then moves on to algorithms, with each chapter in this section dedicated to a single topic and often illustrated with simple cryptographic applic

  15. Elliptic net and its cryptographic application

    Science.gov (United States)

    Muslim, Norliana; Said, Mohamad Rushdan Md

    2017-11-01

    Elliptic net is a generalization of the elliptic divisibility sequence and, in the field of cryptography, most cryptographic pairings that are based on elliptic curves, such as the Tate pairing, can be improved by applying the elliptic net algorithm. The elliptic net is constructed by using an n-dimensional array of rational values satisfying nonlinear recurrence relations that arise from elliptic divisibility sequences. The two main properties that hold in the recurrence relations are: for all positive integers m > n, h_{m+n} h_{m-n} = h_{m+1} h_{m-1} h_n^2 - h_{n+1} h_{n-1} h_m^2, and h_n divides h_m whenever n divides m. In this research, we discuss the elliptic divisibility sequence associated with elliptic nets from a cryptographic perspective and its possible research directions.
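
    Writing W(n) for h_n, the recurrence can be used directly to generate further terms of an elliptic divisibility sequence from its first few values. The sketch below uses a classical example sequence (from the curve y² + y = x³ − x and the point (0, 0)) purely to illustrate the recurrence; it is not the pairing-oriented elliptic net algorithm itself.

```python
from functools import lru_cache

# First terms of a classical elliptic divisibility sequence (associated with
# the curve y^2 + y = x^3 - x and the point (0, 0)); W(0) = 0 by convention.
INIT = {0: 0, 1: 1, 2: 1, 3: -1, 4: 1}

@lru_cache(maxsize=None)
def W(n):
    """n-th term, computed from the recurrence
    W(m+n) W(m-n) = W(m+1) W(m-1) W(n)^2 - W(n+1) W(n-1) W(m)^2."""
    if n in INIT:
        return INIT[n]
    k = n // 2
    if n % 2:          # n = 2k+1: take m = k+1 and n = k in the recurrence
        return W(k + 2) * W(k) ** 3 - W(k - 1) * W(k + 1) ** 3
    # n = 2k: take m = k+1 and n = k-1 (division by W(2) is exact for an EDS)
    return (W(k) * (W(k + 2) * W(k - 1) ** 2 - W(k - 2) * W(k + 1) ** 2)) // W(2)

print([W(n) for n in range(1, 13)])   # 1, 1, -1, 1, 2, -1, -3, -5, 7, -4, -23, 29
```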

  16. Cache-Oblivious Hashing

    DEFF Research Database (Denmark)

    Pagh, Rasmus; Wei, Zhewei; Yi, Ke

    2014-01-01

    The hash table, especially its external memory version, is one of the most important index structures in large databases. Assuming a truly random hash function, it is known that in a standard external hash table with block size b, searching for a particular key only takes expected average t_q = 1...

  17. Cracking PwdHash: A Bruteforce Attack on Client-side Password Hashing

    OpenAIRE

    Llewellyn-Jones, David; Rymer, Graham Matthew

    2017-01-01

    PwdHash is a widely-used tool for client-side password hashing. Originally released as a browser extension, it replaces the user’s password with a hash that combines both the password and the website’s domain. As a result, while the user only remembers a single secret, the passwords received are all unique for each site. We demonstrate how the hashcat password recovery tool can be extended to allow passwords generated using PwdHash to be identified and recovered, revealing the user’s master p...

  18. Using pseudo-random number generator for making iterative algorithms of hashing data

    International Nuclear Information System (INIS)

    Ivanov, M.A.; Vasil'ev, N.P.; Kozyrskij, B.L.

    2014-01-01

    The method of stochastic data transformation intended for use in cryptographic methods of information protection has been analyzed. The authors justify the use of cryptographically strong pseudo-random number generators as a basis for the Sponge construction. This means that the known methods and tools for assessing the statistical security of pseudo-random number generators can be used effectively to analyse its quality.

  19. One-way Hash function construction based on the chaotic map with changeable-parameter

    International Nuclear Information System (INIS)

    Xiao Di; Liao Xiaofeng; Deng Shaojiang

    2005-01-01

    An algorithm for one-way Hash function construction based on the chaotic map with changeable-parameter is proposed in this paper. A piecewise linear chaotic map with changeable-parameter P is chosen, and cipher block chaining mode (CBC) is introduced to ensure that the parameter P in each iteration is dynamically decided by the last-time iteration value and the corresponding message bit in different positions. The final Hash value is obtained by means of the linear transform on the iteration sequence. Theoretical analysis and computer simulation indicate that our algorithm can satisfy all the performance requirements of Hash function in an efficient and flexible manner. It is practicable and reliable, with high potential to be adopted for E-commerce
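
    A heavily simplified sketch of the general pattern described here, a piecewise linear chaotic map whose parameter is re-derived at each step from the previous iteration value and the current message bit, with CBC-like chaining, is given below; the constants, iteration counts and final combination are illustrative assumptions, not the authors' exact construction.

```python
def pwlcm(x, p):
    """Piecewise linear chaotic map on [0, 1] with control parameter p in (0, 0.5)."""
    if x < p:
        return x / p
    if x < 0.5:
        return (x - p) / (0.5 - p)
    if x < 1.0 - p:
        return (1.0 - p - x) / (0.5 - p)
    return (1.0 - x) / p

def chaotic_hash(message: bytes, x0=0.232323, rounds_per_bit=4):
    """Toy hash: each message bit and the previous iteration value jointly
    decide the map parameter (CBC-like chaining); the digest is read off the
    trajectory. Illustrative only, not a secure construction."""
    x = x0
    trajectory = []
    for byte in message:
        for i in range(8):
            bit = (byte >> i) & 1
            p = 0.1 + 0.3 * ((x + bit) / 2.0)    # "changeable parameter" for this step
            for _ in range(rounds_per_bit):
                x = pwlcm(x, p)
        trajectory.append(int(x * (1 << 32)) & 0xFFFFFFFF)
    digest = 0
    for word in trajectory[-5:]:                 # combine the last trajectory words
        digest = (digest << 32) | word
    return digest

print(hex(chaotic_hash(b"hello world")))
```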

  20. One-way Hash function construction based on the chaotic map with changeable-parameter

    Energy Technology Data Exchange (ETDEWEB)

    Xiao Di [College of Computer Science and Engineering, Chongqing University, Chongqing 400044 (China) and College of Mechanical Engineering, Chongqing University, Chongqing 400044 (China)]. E-mail: xiaodi_cqu@hotmail.com; Liao Xiaofeng [College of Computer Science and Engineering, Chongqing University, Chongqing 400044 (China)]. E-mail: xfliao@cqu.edu.cn; Deng Shaojiang [College of Computer Science and Engineering, Chongqing University, Chongqing 400044 (China)

    2005-04-01

    An algorithm for one-way Hash function construction based on the chaotic map with changeable-parameter is proposed in this paper. A piecewise linear chaotic map with changeable-parameter P is chosen, and cipher block chaining mode (CBC) is introduced to ensure that the parameter P in each iteration is dynamically decided by the last-time iteration value and the corresponding message bit in different positions. The final Hash value is obtained by means of the linear transform on the iteration sequence. Theoretical analysis and computer simulation indicate that our algorithm can satisfy all the performance requirements of Hash function in an efficient and flexible manner. It is practicable and reliable, with high potential to be adopted for E-commerce.

  1. Cryptographic Primitives with Quasigroup Transformations

    OpenAIRE

    Mileva, Aleksandra

    2010-01-01

    Cryptology is the science of secret communication, which consists of two complementary disciplines: cryptography and cryptanalysis. Cryptography deals with the design and development of new primitives, algorithms and schemas for data enciphering and deciphering. For many centuries cryptographic techniques have been applied to protect secrecy and authentication in diplomatic, political and military correspondence and communications. Cryptanalysis deals with different attacks on c...

  2. NESSIE: A European Approach to Evaluate Cryptographic Algorithms

    OpenAIRE

    Preneel, Bart

    2002-01-01

    The NESSIE project (New European Schemes for Signature, Integrity and Encryption) intends to put forward a portfolio containing the next generation of cryptographic primitives. These primitives will offer a higher security level than existing primitives, and/or will offer a higher confidence level, built up by an open evaluation process. Moreover, they should be better suited for the constraints of future hardware and software environments. In order to reach this goal, the project has launche...

  3. Perceptual Audio Hashing Functions

    Directory of Open Access Journals (Sweden)

    Emin Anarım

    2005-07-01

    Full Text Available Perceptual hash functions provide a tool for fast and reliable identification of content. We present new audio hash functions based on summarization of the time-frequency spectral characteristics of an audio document. The proposed hash functions are based on the periodicity series of the fundamental frequency and on singular-value description of the cepstral frequencies. They are found, on one hand, to perform very satisfactorily in identification and verification tests, and on the other hand, to be very resilient to a large variety of attacks. Moreover, we address the issue of security of hashes and propose a keying technique, and thereby a key-dependent hash function.

  4. A scalable lock-free hash table with open addressing

    DEFF Research Database (Denmark)

    Nielsen, Jesper Puge; Karlsson, Sven

    2016-01-01

    and concurrent operations without any locks. In this paper, we present a new fully lock-free open addressed hash table with a simpler design than prior published work. We split hash table insertions into two atomic phases: first inserting a value ignoring other concurrent operations, then in the second phase......Concurrent data structures synchronized with locks do not scale well with the number of threads. As more scalable alternatives, concurrent data structures and algorithms based on widely available, however advanced, atomic operations have been proposed. These data structures allow for correct...

  5. Local Deep Hashing Matching of Aerial Images Based on Relative Distance and Absolute Distance Constraints

    Directory of Open Access Journals (Sweden)

    Suting Chen

    2017-12-01

    Full Text Available Aerial images have features of high resolution and complex background, and usually require large amounts of calculation; however, most algorithms used in matching of aerial images adopt shallow hand-crafted features expressed as floating-point descriptors (e.g., SIFT (Scale-Invariant Feature Transform), SURF (Speeded Up Robust Features)), which may suffer from poor matching speed and are not well represented in the literature. Here, we propose a novel Local Deep Hashing Matching (LDHM) method for matching of aerial images with large size, with lower complexity and faster matching speed. The basic idea of the proposed algorithm is to utilize a deep network model on the local areas of the aerial images, and to learn the local features as well as the hash function of the images. Firstly, according to the course overlap rate of the aerial images, the algorithm extracts the local areas for matching to avoid the processing of redundant information. Secondly, a triplet network structure is proposed to mine the deep features of the patches of the local image, and the learned features are imported to the hash layer, thus obtaining the representation of a binary hash code. Thirdly, constraints of the positive samples on the absolute distance are added on the basis of the triplet loss, and a new objective function is constructed to optimize the parameters of the network and enhance the discriminating capabilities of image patch features. Finally, the obtained deep hash code of each image patch is used for the similarity comparison of the image patches in the Hamming space to complete the matching of aerial images. The proposed LDHM algorithm was evaluated on the UltraCam-D dataset and a set of actual aerial images; simulation results demonstrate that it may significantly outperform the state-of-the-art algorithm in terms of efficiency and performance.

  6. Security analysis of a one-way hash function based on spatiotemporal chaos

    International Nuclear Information System (INIS)

    Wang Shi-Hong; Shan Peng-Yang

    2011-01-01

    The collision and statistical properties of a one-way hash function based on spatiotemporal chaos are investigated. Analysis and simulation results indicate that collisions exist in the original algorithm and, therefore, the original algorithm is insecure and vulnerable. An improved algorithm is proposed to avoid the collisions. (general)

  7. Enhanced K-means clustering with encryption on cloud

    Science.gov (United States)

    Singh, Iqjot; Dwivedi, Prerna; Gupta, Taru; Shynu, P. G.

    2017-11-01

    This paper tries to solve the problem of storing and managing big files over the cloud by implementing hashing on Hadoop for big data, and to ensure security while uploading and downloading files. Cloud computing is a term that emphasises sharing data and facilitates the sharing of infrastructure and resources.[10] Hadoop is open source software that gives us access to store and manage big files according to our needs on the cloud. The K-means clustering algorithm is used to calculate the distance between the centroid of a cluster and the data points. Hashing is a technique in which data is stored and retrieved using hash keys. The hashing algorithm, called a hash function, is used to map the original data and later to fetch the data stored at the specific key. [17] Encryption is a process that transforms electronic data into a non-readable form known as cipher text. Decryption is the opposite process of encryption; it transforms the cipher text into plain text that the end user can read and understand. For encryption and decryption, a symmetric key cryptographic algorithm is used; specifically, the DES algorithm is used for secure storage of the files. [3

  8. Research on the Maritime Communication Cryptographic Chip’s Compiler Optimization

    Directory of Open Access Journals (Sweden)

    Sheng Li

    2017-08-01

    Full Text Available In the process of ocean development, maritime communication systems are a hot research field, in which information security is vital for the normal operation of the whole system and is also one of the difficulties in maritime communication research. In this paper, a maritime communication cryptographic SOC (system on chip) is introduced, and its compiler framework is put forward through analysis of its working mode and the problems faced by the compiler front end. Then, a loop unrolling factor calculating algorithm based on queueing theory, named UFBOQ (unrolling factor based on queue), is proposed to perform parallel optimization in the compiler front end while respecting the instruction memory capacity limit. Finally, the scalar replacement method is used to optimize the unrolled code, exploiting the contiguous data storage characteristics of cryptographic algorithms to mitigate the effect of memory access latency on parallel computing efficiency. The UFBOQ algorithm and scalar replacement prove effective and appropriate, and the achieved effect is a linear speedup.

  9. Multi-operation cryptographic engine: VLSI design and implementation

    International Nuclear Information System (INIS)

    Selimis, George; Koufopavlou, Odysseas

    2005-01-01

    The smart card environment lacks system resources, but commercial and economic transactions via smart cards demand the use of certified and secure cryptographic methods. In this paper, a hardware cryptographic approach for smart cards is proposed. The proposed system supports two basic operations of cryptography: authentication and encryption. The basic component of the system is one round of the DES algorithm, which supports the DES, Triple DES and ANSI X9.17 standards. The proposed system is efficient in terms of area resources, and techniques for low power consumption have been applied. Since the system is intended for smart card applications, the overall throughput outperforms typical smart card throughput standards.

  10. Side channel analysis of some hash based MACs:A response to SHA-3 requirements

    DEFF Research Database (Denmark)

    The forthcoming NIST's Advanced Hash Standard (AHS) competition to select SHA-3 hash function requires that each candidate hash function submission must have at least one construction to support FIPS 198 HMAC application. As part of its evaluation, NIST is aiming to select either a candidate hash...... function which is more resistant to known side channel attacks (SCA) when plugged into HMAC, or that has an alternative MAC mode which is more resistant to known SCA than the other submitted alternatives. In response to this, we perform differential power analysis (DPA) on the possible smart card...... implementations of some of the recently proposed MAC alternatives to NMAC (a fully analyzed variant of HMAC) and HMAC algorithms and NMAC/HMAC versions of some recently proposed hash and compression function modes. We show that the recently proposed BNMAC and KMDP MAC schemes are even weaker than NMAC...

  11. Architecture-Conscious Hashing

    NARCIS (Netherlands)

    M. Zukowski (Marcin); S. Héman (Sándor); P.A. Boncz (Peter)

    2006-01-01

    textabstractHashing is one of the fundamental techniques used to implement query processing operators such as grouping, aggregation and join. This paper studies the interaction between modern computer architecture and hash-based query processing techniques. First, we focus on extracting maximum

  12. Efficient computation of hashes

    International Nuclear Information System (INIS)

    Lopes, Raul H C; Franqueira, Virginia N L; Hobson, Peter R

    2014-01-01

    The sequential computation of hashes at the core of many distributed storage systems and found, for example, in grid services can hinder efficiency in service quality and even pose security challenges that can only be addressed by the use of parallel hash tree modes. The main contributions of this paper are, first, the identification of several efficiency and security challenges posed by the use of sequential hash computation based on the Merkle-Damgard engine. In addition, alternatives for the parallel computation of hash trees are discussed, and a prototype for a new parallel implementation of the Keccak function, the SHA-3 winner, is introduced.
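
    A minimal illustration of the parallel hash-tree alternative to sequential hashing: hash fixed-size chunks independently (here with SHA-256 in worker threads) and combine the digests pairwise into a single root. This is a generic Merkle-tree sketch, not the paper's Keccak tree-mode prototype.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

def merkle_root(data: bytes, chunk_size=1 << 20):
    """Hash-tree alternative to a purely sequential computation: leaves are
    hashed independently (here in worker threads), then combined pairwise
    up to a single root digest."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)] or [b""]
    with ThreadPoolExecutor() as pool:                       # leaf hashes in parallel
        level = list(pool.map(lambda c: hashlib.sha256(c).digest(), chunks))
    while len(level) > 1:                                    # build the tree bottom-up
        if len(level) % 2:
            level.append(level[-1])                          # duplicate last node if odd
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return level[0]

print(merkle_root(b"x" * (5 * 1024 * 1024)).hex())
```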

  13. A brief history of cryptology and cryptographic algorithms

    CERN Document Server

    Dooley, John F

    2013-01-01

    The science of cryptology is made up of two halves. Cryptography is the study of how to create secure systems for communications. Cryptanalysis is the study of how to break those systems. The conflict between these two halves of cryptology is the story of secret writing. For over 2,000 years, the desire to communicate securely and secretly has resulted in the creation of numerous and increasingly complicated systems to protect one's messages. Yet for every system there is a cryptanalyst creating a new technique to break that system. With the advent of computers the cryptographer seems to final

  14. Symbolic Analysis of Cryptographic Protocols

    DEFF Research Database (Denmark)

    Dahl, Morten

    We present our work on using abstract models for formally analysing cryptographic protocols: First, we present an efficient method for verifying trace-based authenticity properties of protocols using nonces, symmetric encryption, and asymmetric encryption. The method is based on a type system...... of Gordon et al., which we modify to support fully-automated type inference. Tests conducted via an implementation of our algorithm found it to be very efficient. Second, we show how privacy may be captured in a symbolic model using an equivalence-based property and give a formal definition. We formalise...

  15. Type-Based Automated Verification of Authenticity in Asymmetric Cryptographic Protocols

    DEFF Research Database (Denmark)

    Dahl, Morten; Kobayashi, Naoki; Sun, Yunde

    2011-01-01

    Gordon and Jeffrey developed a type system for verification of asymmetric and symmetric cryptographic protocols. We propose a modified version of Gordon and Jeffrey's type system and develop a type inference algorithm for it, so that protocols can be verified automatically as they are, without any...... type annotations or explicit type casts. We have implemented a protocol verifier SpiCa based on the algorithm, and confirmed its effectiveness....

  16. The Grindahl Hash Functions

    DEFF Research Database (Denmark)

    Knudsen, Lars Ramkilde; Rechberger, Christian; Thomsen, Søren Steffen

    2007-01-01

    to the state. We propose two concrete hash functions, Grindahl-256 and Grindahl-512 with claimed security levels with respect to collision, preimage and second preimage attacks of 2^128 and 2^256, respectively. Both proposals have lower memory requirements than other hash functions at comparable speeds...

  17. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption

    Science.gov (United States)

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-01

    Quantum information and quantum computation have achieved a huge success during the last years. In this paper, we investigate the capability of quantum Hash function, which can be constructed by subtly modifying quantum walks, a famous quantum computation model. It is found that quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems with higher security. As a byproduct, quantum Hash function can also be used for pseudo-random number generation due to its inherent chaotic dynamics. Further we discuss the application of quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that quantum Hash function is eligible for privacy amplification in quantum key distribution, pseudo-random number generation and image encryption in terms of various hash tests and randomness tests. It extends the scope of application of quantum computation and quantum information.

  18. Quantum Hash function and its application to privacy amplification in quantum key distribution, pseudo-random number generation and image encryption

    Science.gov (United States)

    Yang, Yu-Guang; Xu, Peng; Yang, Rui; Zhou, Yi-Hua; Shi, Wei-Min

    2016-01-01

    Quantum information and quantum computation have achieved a huge success during the last years. In this paper, we investigate the capability of quantum Hash function, which can be constructed by subtly modifying quantum walks, a famous quantum computation model. It is found that quantum Hash function can act as a hash function for the privacy amplification process of quantum key distribution systems with higher security. As a byproduct, quantum Hash function can also be used for pseudo-random number generation due to its inherent chaotic dynamics. Further we discuss the application of quantum Hash function to image encryption and propose a novel image encryption algorithm. Numerical simulations and performance comparisons show that quantum Hash function is eligible for privacy amplification in quantum key distribution, pseudo-random number generation and image encryption in terms of various hash tests and randomness tests. It extends the scope of application of quantum computation and quantum information. PMID:26823196

  19. RANCANG BANGUN APLIKASI ANTIVIRUS KOMPUTER DENGAN MENGGUNAKAN METODE SECURE HASH ALGORITHM 1 (SHA1 DAN HEURISTIC STRING

    Directory of Open Access Journals (Sweden)

    I Gusti Made Panji Indrawinatha

    2016-12-01

    Full Text Available A computer virus is malicious software that can damage data and replicate itself on a computer system. To detect and clean viruses from a computer system, antivirus applications are built. In detecting various kinds of viruses, an antivirus application usually employs several methods. This study discusses the design of an antivirus application that uses the Secure Hash Algorithm 1 (SHA1) and heuristic string methods for virus detection. The tests carried out show that without the heuristic, the antivirus detects only 12 out of 34 virus sample files, a detection accuracy of 35%, whereas with the heuristic, the antivirus detects 31 out of 34 virus sample files, a detection accuracy of 91%.
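
    A minimal sketch of the two detection stages described above (exact SHA-1 signature matching plus a heuristic string scan) might look like this; the signature digest and the suspicious strings are placeholders, not data from the paper.

```python
import hashlib

# Placeholder signature database: SHA-1 digests of known malicious files.
KNOWN_BAD_SHA1 = {"0" * 40}                      # fill with real sample digests

# Placeholder heuristic strings often associated with malicious behaviour.
SUSPICIOUS_STRINGS = [b"CreateRemoteThread", b"WScript.Shell", b"cmd.exe /c"]

def scan_file(path):
    """Two-stage check: exact SHA-1 signature match, then heuristic string scan."""
    with open(path, "rb") as f:
        data = f.read()
    if hashlib.sha1(data).hexdigest() in KNOWN_BAD_SHA1:
        return "infected (signature match)"
    hits = [s for s in SUSPICIOUS_STRINGS if s in data]
    if hits:
        return "suspicious (heuristic hits: %r)" % hits
    return "clean"
```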

  20. Remarks on Gödel's Code as a Hash Function

    Czech Academy of Sciences Publication Activity Database

    Mikuš, M.; Savický, Petr

    2010-01-01

    Roč. 47, č. 3 (2010), s. 67-80 ISSN 1210-3195 R&D Projects: GA ČR GAP202/10/1333 Institutional research plan: CEZ:AV0Z10300504 Keywords : Gödel numbering function * hash function * rational reconstruction * integer relation algorithm Subject RIV: BA - General Mathematics http://www.sav.sk/journals/uploads/0317151904m-s.pdf

  1. Rationality in the Cryptographic Model

    DEFF Research Database (Denmark)

    Hubacek, Pavel

    This thesis presents results in the field of rational cryptography. In the first part we study the use of cryptographic protocols to avoid mediation and binding commitment when implementing game theoretic equilibrium concepts. First, we concentrate on the limits of cryptographic cheap talk...... to implement correlated equilibria of two-player strategic games in a sequentially rational way. We show that there exist two-player games for which no cryptographic protocol can implement the mediator in a sequentially rational way; that is, without introducing empty threats. In the context of computational...... with appealing economic applications. Our implementation puts forward a notion of cryptographically blinded games that exploits the power of encryption to selectively restrict the information available to players about sampled action profiles, such that these desirable equilibria can be stably achieved...

  2. Authenticated hash tables

    DEFF Research Database (Denmark)

    Triandopoulos, Nikolaos; Papamanthou, Charalampos; Tamassia, Roberto

    2008-01-01

    Hash tables are fundamental data structures that optimally answer membership queries. Suppose a client stores n elements in a hash table that is outsourced at a remote server so that the client can save space or achieve load balancing. Authenticating the hash table functionality, i.e., verifying...... to a scheme that achieves different trade-offs---namely, constant update time and O(n^ε / log^{κε} n) query time for fixed ε > 0 and κ > 0. An experimental evaluation of our solution shows very good scalability.

  3. A Symmetric Key Cryptographic Technique Through Swapping Bits in Binary Field Using p-Box Matrix

    OpenAIRE

    Subhranil Som; Soumasree Banerjee

    2014-01-01

    In this paper a symmetric key cryptographic algorithm named as “A Symmetric Key Cryptographic Technique Through Swapping Bits in Binary Field Using p-box Matrix“ is proposed. Secret sharing is a technique by which any information can be break down into small pieces. The secret can be reconstructed only when a sufficient number of pieces of shares are combined together; individual shares are of no use on their own. Traditional secret sharing scheme possesses high computational ...

  4. Efficient hash tables for network applications.

    Science.gov (United States)

    Zink, Thomas; Waldvogel, Marcel

    2015-01-01

    Hashing has yet to be widely accepted as a component of hard real-time systems and hardware implementations, due to still existing prejudices concerning the unpredictability of space and time requirements resulting from collisions. While in theory perfect hashing can provide optimal mapping, in practice, finding a perfect hash function is too expensive, especially in the context of high-speed applications. The introduction of hashing with multiple choices, d-left hashing and probabilistic table summaries, has caused a shift towards deterministic DRAM access. However, high amounts of rare and expensive high-speed SRAM need to be traded off for predictability, which is infeasible for many applications. In this paper we show that previous suggestions suffer from the false precondition of full generality. Our approach exploits four individual degrees of freedom available in many practical applications, especially hardware and high-speed lookups. This reduces the requirement of on-chip memory up to an order of magnitude and guarantees constant lookup and update time at the cost of only minute amounts of additional hardware. Our design makes efficient hash table implementations cheaper, more predictable, and more practical.
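
    One of the multiple-choice schemes mentioned above, d-left hashing, is easy to sketch: d sub-tables with independent hash functions, and each insert goes to the least-loaded candidate bucket with ties broken to the left, so a lookup needs exactly d bucket probes. The code below is an illustrative toy, not the authors' hardware design; BLAKE2b with per-table salts stands in for the real hash functions.

```python
import hashlib

class DLeftHashTable:
    """d-left hashing sketch: d sub-tables, each with its own hash function.
    Each insert goes to the least-loaded candidate bucket (ties broken to the
    left), which keeps bucket occupancy tightly bounded; a lookup needs exactly
    d bucket probes."""

    def __init__(self, d=4, buckets_per_table=1024):
        self.d, self.m = d, buckets_per_table
        self.tables = [[[] for _ in range(buckets_per_table)] for _ in range(d)]

    def _bucket(self, key, i):
        digest = hashlib.blake2b(key.encode(), salt=bytes([i] * 16)).digest()
        return int.from_bytes(digest[:8], "big") % self.m

    def insert(self, key, value):
        loads = [(len(self.tables[i][self._bucket(key, i)]), i) for i in range(self.d)]
        _, i = min(loads)                                    # least loaded, leftmost on ties
        self.tables[i][self._bucket(key, i)].append((key, value))

    def lookup(self, key):
        for i in range(self.d):                              # exactly d bucket probes
            for k, v in self.tables[i][self._bucket(key, i)]:
                if k == key:
                    return v
        return None
```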

  5. Neighborhood Discriminant Hashing for Large-Scale Image Retrieval.

    Science.gov (United States)

    Tang, Jinhui; Li, Zechao; Wang, Meng; Zhao, Ruizhen

    2015-09-01

    With the proliferation of large-scale community-contributed images, hashing-based approximate nearest neighbor search in huge databases has aroused considerable interest from the fields of computer vision and multimedia in recent years because of its computational and memory efficiency. In this paper, we propose a novel hashing method named neighborhood discriminant hashing (NDH) (for short) to implement approximate similarity search. Different from the previous work, we propose to learn a discriminant hashing function by exploiting local discriminative information, i.e., the labels of a sample can be inherited from the neighbor samples it selects. The hashing function is expected to be orthogonal to avoid redundancy in the learned hashing bits as much as possible, while an information theoretic regularization is jointly exploited using maximum entropy principle. As a consequence, the learned hashing function is compact and nonredundant among bits, while each bit is highly informative. Extensive experiments are carried out on four publicly available data sets and the comparison results demonstrate the outperforming performance of the proposed NDH method over state-of-the-art hashing techniques.

  6. Robust Image Hashing Using Radon Transform and Invariant Features

    Directory of Open Access Journals (Sweden)

    Y.L. Liu

    2016-09-01

    Full Text Available A robust image hashing method based on the Radon transform and invariant features is proposed for image authentication, image retrieval, and image detection. Specifically, an input image is first converted into a counterpart with a normalized size. Then the invariant centroid algorithm is applied to obtain the invariant feature point and the surrounding circular area, and the Radon transform is employed to acquire the mapping coefficient matrix of the area. Finally, the hashing sequence is generated by combining the feature vectors and the invariant moments calculated from the coefficient matrix. Experimental results show that this method can resist not only normal image processing operations, but also some geometric distortions. Comparisons of receiver operating characteristic (ROC) curves indicate that the proposed method outperforms some existing methods in classification performance between perceptual robustness and discrimination.

  7. Design of cryptographically secure AES like S-Box using second-order reversible cellular automata for wireless body area network applications

    Science.gov (United States)

    Rafi Ahamed, Shaik

    2016-01-01

    In biomedical, data security is the most expensive resource for wireless body area network applications. Cryptographic algorithms are used in order to protect the information against unauthorised access. Advanced encryption standard (AES) cryptographic algorithm plays a vital role in telemedicine applications. The authors propose a novel approach for design of substitution bytes (S-Box) using second-order reversible one-dimensional cellular automata (RCA2) as a replacement to the classical look-up-table (LUT) based S-Box used in AES algorithm. The performance of proposed RCA2 based S-Box and conventional LUT based S-Box is evaluated in terms of security using the cryptographic properties such as the nonlinearity, correlation immunity bias, strict avalanche criteria and entropy. Moreover, it is also shown that RCA2 based S-Boxes are dynamic in nature, invertible and provide high level of security. Further, it is also found that the RCA2 based S-Box have comparatively better performance than that of conventional LUT based S-Box. PMID:27733924

  8. Design of cryptographically secure AES like S-Box using second-order reversible cellular automata for wireless body area network applications.

    Science.gov (United States)

    Gangadari, Bhoopal Rao; Rafi Ahamed, Shaik

    2016-09-01

    In biomedical, data security is the most expensive resource for wireless body area network applications. Cryptographic algorithms are used in order to protect the information against unauthorised access. Advanced encryption standard (AES) cryptographic algorithm plays a vital role in telemedicine applications. The authors propose a novel approach for design of substitution bytes (S-Box) using second-order reversible one-dimensional cellular automata (RCA2) as a replacement to the classical look-up-table (LUT) based S-Box used in AES algorithm. The performance of proposed RCA2 based S-Box and conventional LUT based S-Box is evaluated in terms of security using the cryptographic properties such as the nonlinearity, correlation immunity bias, strict avalanche criteria and entropy. Moreover, it is also shown that RCA2 based S-Boxes are dynamic in nature, invertible and provide high level of security. Further, it is also found that the RCA2 based S-Box have comparatively better performance than that of conventional LUT based S-Box.

  9. Spongent: A lightweight hash function

    DEFF Research Database (Denmark)

    Bogdanov, Andrey; Knežević, Miroslav; Leander, Gregor

    2011-01-01

    This paper proposes spongent - a family of lightweight hash functions with hash sizes of 88 (for preimage resistance only), 128, 160, 224, and 256 bits based on a sponge construction instantiated with a present-type permutation, following the hermetic sponge strategy. Its smallest implementations...
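
    The sponge construction underlying spongent can be illustrated with a toy absorb/squeeze loop; the permutation below is a deliberately simple stand-in (not the present-type permutation), and the rate/capacity split is chosen only for readability.

```python
def toy_permutation(state):
    """Stand-in permutation on 8 words mod 2**16 (NOT present-type,
    purely illustrative of where the real permutation plugs in)."""
    for r in range(8):
        state = [(v * 31 + i * 17 + r) % 65536 for i, v in enumerate(state)]
        state = state[1:] + state[:1]
    return state

def sponge_hash(message: bytes, rate=2, capacity=6, out_words=8):
    """Sponge construction: absorb message blocks into the 'rate' part of the
    state, permute, then squeeze output words; security rests on the capacity."""
    state = [0] * (rate + capacity)
    msg = list(message) + [0x80]                  # simple padding
    while len(msg) % (2 * rate):
        msg.append(0)
    # absorbing phase: XOR 16-bit words into the rate portion, then permute
    for i in range(0, len(msg), 2 * rate):
        for j in range(rate):
            word = msg[i + 2 * j] | (msg[i + 2 * j + 1] << 8)
            state[j] ^= word
        state = toy_permutation(state)
    # squeezing phase: read out the rate portion, permuting between reads
    out = []
    while len(out) < out_words:
        out.extend(state[:rate])
        state = toy_permutation(state)
    return bytes(b for w in out[:out_words] for b in (w & 0xFF, w >> 8))

print(sponge_hash(b"spongent-style, but with a toy permutation").hex())
```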

  10. Efficient tabling of structured data with enhanced hash-consing

    DEFF Research Database (Denmark)

    Zhou, Neng-Fa; Have, Christian Theil

    2012-01-01

    techniques, called input sharing and hash code memoization, for reducing the time complexity by avoiding computing hash codes for certain terms. The improved system is able to eliminate the extra linear factor in the old system for processing sequences, thus significantly enhancing the scalability...... uses hash tables, but also systems that use tries such as XSB and YAP. In this paper, we apply hash-consing to tabling structured data in B-Prolog. While hash-consing can reduce the space consumption when sharing is effective, it does not change the time complexity. We enhance hash-consing with two...

  11. Symmetric cryptographic protocols

    CERN Document Server

    Ramkumar, Mahalingam

    2014-01-01

    This book focuses on protocols and constructions that make good use of symmetric pseudo random functions (PRF) like block ciphers and hash functions - the building blocks for symmetric cryptography. Readers will benefit from detailed discussion of several strategies for utilizing symmetric PRFs. Coverage includes various key distribution strategies for unicast, broadcast and multicast security, and strategies for constructing efficient digests of dynamic databases using binary hash trees.   •        Provides detailed coverage of symmetric key protocols •        Describes various applications of symmetric building blocks •        Includes strategies for constructing compact and efficient digests of dynamic databases

  12. On hash functions using checksums

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Kelsey, John; Knudsen, Lars Ramkilde

    2010-01-01

    We analyse the security of iterated hash functions that compute an input dependent checksum which is processed as part of the hash computation. We show that a large class of such schemes, including those using non-linear or even one-way checksum functions, is not secure against the second preimag...

  13. On hash functions using checksums

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Kelsey, John; Knudsen, Lars Ramkilde

    2008-01-01

    We analyse the security of iterated hash functions that compute an input dependent checksum which is processed as part of the hash computation. We show that a large class of such schemes, including those using non-linear or even one-way checksum functions, is not secure against the second preimag...

  14. Deep Hashing Based Fusing Index Method for Large-Scale Image Retrieval

    Directory of Open Access Journals (Sweden)

    Lijuan Duan

    2017-01-01

    Full Text Available Hashing has been widely deployed to perform the Approximate Nearest Neighbor (ANN) search for large-scale image retrieval to solve the problem of storage and retrieval efficiency. Recently, deep hashing methods have been proposed to perform simultaneous feature learning and hash code learning with deep neural networks. Even though deep hashing has shown better performance than traditional hashing methods with handcrafted features, the compact hash code learned from one deep hashing network may not provide the full representation of an image. In this paper, we propose a novel hashing indexing method, called the Deep Hashing based Fusing Index (DHFI), to generate a more compact hash code which has stronger expression ability and distinction capability. In our method, we train two deep hashing subnetworks with different architectures and fuse the hash codes generated by the two subnetworks together to unify images. Experiments on two real datasets show that our method can outperform state-of-the-art image retrieval applications.

  15. Multiview alignment hashing for efficient image search.

    Science.gov (United States)

    Liu, Li; Yu, Mengyang; Shao, Ling

    2015-03-01

    Hashing is a popular and efficient method for nearest neighbor search in large-scale data spaces by embedding high-dimensional feature descriptors into a similarity preserving Hamming space with a low dimension. For most hashing methods, the performance of retrieval heavily depends on the choice of the high-dimensional feature descriptor. Furthermore, a single type of feature cannot be descriptive enough for different images when it is used for hashing. Thus, how to combine multiple representations for learning effective hashing functions is an imminent task. In this paper, we present a novel unsupervised multiview alignment hashing approach based on regularized kernel nonnegative matrix factorization, which can find a compact representation uncovering the hidden semantics and simultaneously respecting the joint probability distribution of data. In particular, we aim to seek a matrix factorization to effectively fuse the multiple information sources meanwhile discarding the feature redundancy. Since the raised problem is regarded as nonconvex and discrete, our objective function is then optimized via an alternate way with relaxation and converges to a locally optimal solution. After finding the low-dimensional representation, the hashing functions are finally obtained through multivariable logistic regression. The proposed method is systematically evaluated on three data sets: 1) Caltech-256; 2) CIFAR-10; and 3) CIFAR-20, and the results show that our method significantly outperforms the state-of-the-art multiview hashing techniques.

  16. Forensic hash for multimedia information

    Science.gov (United States)

    Lu, Wenjun; Varna, Avinash L.; Wu, Min

    2010-01-01

    Digital multimedia such as images and videos are prevalent on today's internet and cause significant social impact, which can be evidenced by the proliferation of social networking sites with user generated contents. Due to the ease of generating and modifying images and videos, it is critical to establish trustworthiness for online multimedia information. In this paper, we propose novel approaches to perform multimedia forensics using compact side information to reconstruct the processing history of a document. We refer to this as FASHION, standing for Forensic hASH for informatION assurance. Based on the Radon transform and scale space theory, the proposed forensic hash is compact and can effectively estimate the parameters of geometric transforms and detect local tampering that an image may have undergone. Forensic hash is designed to answer a broader range of questions regarding the processing history of multimedia data than the simple binary decision from traditional robust image hashing, and also offers more efficient and accurate forensic analysis than multimedia forensic techniques that do not use any side information.

  17. Analysis of Hash Functions in IDEA Encryption for Information Record Security

    Directory of Open Access Journals (Sweden)

    Ramen Antonov Purba

    2014-02-01

    Full Text Available Issues of data security and confidentiality are very important to organizations and individuals, especially when the data reside in a network of computers connected to a public network such as the Internet, where important data can be viewed or hijacked by unauthorized persons; if this happens, the data may be corrupted or even lost, causing huge material losses. This research discusses a system for securing the transmission of messages/data using encryption, which aims to protect a message from people who are not authorized to access it. Because the scope of such security systems is very broad, this work is limited to the IDEA algorithm combined with hash functions, covering encryption and decryption. By combining the IDEA (International Data Encryption Algorithm) encryption method, which encrypts the contents of the messages/data, with a hash function that detects changes to the content of the messages/data, the security level is expected to improve. The result of this study is software that can perform encryption and decryption of messages/data and generate a security key based on the encrypted message/data.
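
    As a rough illustration of the pattern described above (encrypting a message together with its hash so that tampering can be detected), here is a hedged Python sketch; IDEA itself is not in the Python standard library, so a SHA-256-derived XOR keystream stands in for the cipher, and all names are illustrative rather than the authors' implementation.

    import hashlib
    import hmac
    import os

    def keystream_encrypt(key: bytes, data: bytes) -> bytes:
        """Toy stand-in for IDEA: XOR with a SHA-256-derived keystream (illustration only)."""
        out = bytearray()
        counter = 0
        while len(out) < len(data):
            out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
            counter += 1
        return bytes(d ^ k for d, k in zip(data, out))

    def protect(key: bytes, message: bytes) -> bytes:
        digest = hashlib.sha256(message).digest()          # fingerprint of the plaintext
        return keystream_encrypt(key, message + digest)    # encrypt message || digest

    def recover(key: bytes, blob: bytes) -> bytes:
        plain = keystream_encrypt(key, blob)               # XOR keystream is its own inverse
        message, digest = plain[:-32], plain[-32:]
        if not hmac.compare_digest(hashlib.sha256(message).digest(), digest):
            raise ValueError("message was modified in transit")
        return message

    key = os.urandom(16)
    assert recover(key, protect(key, b"important message")) == b"important message"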

  18. Comparison of current and new hash functions

    OpenAIRE

    Suchan, Martin

    2007-01-01

    The goal of this study is to present a comparison of today's most widely used cryptographic hash functions and to compare them with drafts of new hash functions currently being developed for the Advanced Hash Standard competition. This study also includes an implementation of all described functions in the C# programming language.

  19. Securing ad hoc wireless sensor networks under Byzantine attacks by implementing non-cryptographic method

    Directory of Open Access Journals (Sweden)

    Shabir Ahmad Sofi

    2017-05-01

    Full Text Available An ad hoc wireless sensor network (WSN) is a collection of nodes that do not need to rely on a predefined infrastructure to keep the network connected. The level of security and performance are always somehow related to each other; therefore, due to the limited resources in a WSN, cryptographic methods for securing the network against attacks are not feasible. Byzantine attacks disrupt the communication between nodes in the network without regard to the attacker's own resource consumption. This paper discusses the performance of cluster-based WSNs, comparing LEACH with advanced-node-based clusters under Byzantine attacks. This paper also proposes an algorithm for detection and isolation of the compromised nodes to mitigate the attacks by non-cryptographic means. The throughput increases after using the algorithm for isolation of the malicious nodes, by 33% in the case of the Gray Hole attack and 62% in the case of the Black Hole attack.

  20. Compact binary hashing for music retrieval

    Science.gov (United States)

    Seo, Jin S.

    2014-03-01

    With the huge volume of music clips available for protection, browsing, and indexing, there is increasing attention to retrieving the information content of music archives. Music-similarity computation is an essential building block for browsing, retrieval, and indexing of digital music archives. In practice, as the number of songs available for searching and indexing increases, the storage cost in retrieval systems becomes a serious problem. This paper deals with the storage problem by extending the supervector concept with binary hashing. We utilize a similarity-preserving binary embedding to generate a hash code from the supervector of each music clip. In particular, we compare the performance of various binary hashing methods for music retrieval tasks on the widely used genre dataset and an in-house singer dataset. Through the evaluation, we find an effective way of generating hash codes for music similarity estimation which improves the retrieval performance.

  1. Lightweight Cryptographic Techniques

    National Research Council Canada - National Science Library

    Yuen, Horace

    2004-01-01

    The objective of this project was to develop new cryptographic techniques, and to modify the important existing ones, for applications to encryption and authentication in energy-constrained sensors...

  2. Hash3: Proofs, Analysis and Implementation

    DEFF Research Database (Denmark)

    Gauravaram, Praveen

    2009-01-01

    This report outlines the talks presented at the winter school on Hash3: Proofs, Analysis, and Implementation, ECRYPT II Event on Hash Functions. In general, speakers may not write everything what they talk on the slides. So, this report also outlines such findings following the understanding of t...

  3. FBC: a flat binary code scheme for fast Manhattan hash retrieval

    Science.gov (United States)

    Kong, Yan; Wu, Fuzhang; Gao, Lifa; Wu, Yanjun

    2018-04-01

    Hash coding is a widely used technique in approximate nearest neighbor (ANN) search, especially in document search and multimedia (such as image and video) retrieval. Based on the distance measure used, hash methods are generally classified into two categories: Hamming hashing and Manhattan hashing. Benefitting from better neighborhood structure preservation, Manhattan hashing methods outperform earlier methods in search effectiveness. However, due to using decimal arithmetic operations instead of bit operations, Manhattan hashing becomes a more time-consuming process, which significantly decreases the whole search efficiency. To solve this problem, we present an intuitive hash scheme which uses Flat Binary Code (FBC) to encode the data points. As a result, the decimal arithmetic used in previous Manhattan hashing can be replaced by the more efficient XOR operator. The final experiments show that, with a reasonable growth in memory space, our FBC is on average more than 80% faster, without any loss of search accuracy, compared with state-of-the-art Manhattan hashing methods.
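
    One simple way to realize a "flat" binary code with the property described above is a unary (thermometer) encoding, for which the Hamming distance computed with XOR and popcount equals the Manhattan distance on the quantized values. The sketch below illustrates this principle only and is not necessarily the exact FBC construction of the paper.

    LEVELS = 8  # number of quantization levels per dimension (illustrative)

    def thermometer(value: int) -> int:
        """Encode an integer in [0, LEVELS-1] as LEVELS-1 bits: `value` ones followed by zeros."""
        return (1 << value) - 1

    def encode(point):
        """Concatenate the per-dimension thermometer codes into one integer bit string."""
        code = 0
        for v in point:
            code = (code << (LEVELS - 1)) | thermometer(v)
        return code

    def hamming(a: int, b: int) -> int:
        return bin(a ^ b).count("1")

    p, q = (3, 7, 0), (5, 2, 1)
    manhattan = sum(abs(x - y) for x, y in zip(p, q))
    assert hamming(encode(p), encode(q)) == manhattan  # XOR/popcount reproduces Manhattan distance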

  4. Hierarchical Recurrent Neural Hashing for Image Retrieval With Hierarchical Convolutional Features.

    Science.gov (United States)

    Lu, Xiaoqiang; Chen, Yaxiong; Li, Xuelong

    Hashing has been an important and effective technology in image retrieval due to its computational efficiency and fast search speed. The traditional hashing methods usually learn hash functions to obtain binary codes by exploiting hand-crafted features, which cannot optimally represent the information of the sample. Recently, deep learning methods can achieve better performance, since deep learning architectures can learn more effective image representation features. However, these methods only use semantic features to generate hash codes by shallow projection but ignore texture details. In this paper, we propose a novel hashing method, namely hierarchical recurrent neural hashing (HRNH), to exploit a hierarchical recurrent neural network to generate effective hash codes. There are three contributions of this paper. First, a deep hashing method is proposed to extensively exploit both spatial details and semantic information, in which we leverage hierarchical convolutional features to construct an image pyramid representation. Second, our proposed deep network can exploit convolutional feature maps directly as input to preserve the spatial structure of convolutional feature maps. Finally, we propose a new loss function that considers the quantization error of binarizing the continuous embeddings into the discrete binary codes, and simultaneously maintains the semantic similarity and balanceable property of hash codes. Experimental results on four widely used data sets demonstrate that the proposed HRNH can achieve superior performance over other state-of-the-art hashing methods.

  5. Protecting Cryptographic Memory against Tampering Attack

    DEFF Research Database (Denmark)

    Mukherjee, Pratyay

    In this dissertation we investigate the question of protecting cryptographic devices from tampering attacks. Traditional theoretical analysis of cryptographic devices is based on black-box models which do not take into account attacks on the implementations, known as physical attacks. In practice such attacks can be executed easily, e.g. by heating the device, as substantiated by numerous works in the past decade. Tampering attacks are a class of such physical attacks where the attacker can change the memory/computation, gains additional (non-black-box) knowledge by interacting with the faulty device and then tries to break the security. Prior works show that generically approaching such a problem is notoriously difficult. So, in this dissertation we attempt to solve an easier question, known as memory-tampering, where the attacker is allowed to tamper only with the memory of the device...

  6. Hash function based on chaotic map lattices.

    Science.gov (United States)

    Wang, Shihong; Hu, Gang

    2007-06-01

    A new hash function system, based on coupled chaotic map dynamics, is suggested. By combining floating point computation of chaos and some simple algebraic operations, the system reaches very high bit confusion and diffusion rates, and this enables the system to have desired statistical properties and strong collision resistance. The chaos-based hash function has its advantages for high security and fast performance, and it serves as one of the most highly competitive candidates for practical applications of hash function for software realization and secure information communications in computer networks.
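
    To make the general idea concrete, here is a toy sketch in the same spirit (message bytes perturbing coupled logistic maps, with the final states quantized into a digest); it is purely illustrative, is not the authors' scheme, and has not been analysed for collision resistance.

    def logistic(x: float, r: float = 3.99) -> float:
        return r * x * (1.0 - x)

    def chaos_hash(message: bytes, n_maps: int = 8, rounds: int = 32) -> bytes:
        # initial lattice of map states in (0, 1)
        states = [(i + 1) / (n_maps + 1) for i in range(n_maps)]
        for j, byte in enumerate(message):
            i = j % n_maps
            # inject the message byte into one map, then couple it with its neighbour
            states[i] = logistic((states[i] + byte / 256.0) % 1.0 or 0.5)
            states[i] = (states[i] + 0.3 * states[(i + 1) % n_maps]) % 1.0 or 0.5
        for _ in range(rounds):                      # extra mixing iterations for diffusion
            states = [logistic((s + 0.3 * states[(i + 1) % n_maps]) % 1.0 or 0.5)
                      for i, s in enumerate(states)]
        # quantize each final map state to one output byte
        return bytes(int(s * 256) % 256 for s in states)

    print(chaos_hash(b"hello").hex())
    print(chaos_hash(b"hellp").hex())   # a one-character change diffuses into a very different digest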

  7. Enhanced Matrix Power Function for Cryptographic Primitive Construction

    Directory of Open Access Journals (Sweden)

    Eligijus Sakalauskas

    2018-02-01

    Full Text Available A new enhanced matrix power function (MPF) is presented for the construction of cryptographic primitives. According to the definition in previously published papers, an MPF is an action of two matrices powering some base matrix on the left and right. The MPF inversion equations, corresponding to the MPF problem, are derived and have some structural similarity with classical multivariate quadratic (MQ) problem equations. Unlike the MQ problem, the MPF problem seems to be more complicated, since its equations are not defined over the field, but are represented as left–right action of two matrices defined over the infinite near-semiring on the matrix defined over the certain infinite, additive, noncommuting semigroup. The main results are the following: (1) the proposition of infinite, nonsymmetric, and noncommuting algebraic structures for the construction of the enhanced MPF, satisfying associativity conditions, which are necessary for cryptographic applications; (2) the proof that MPF inversion is polynomially equivalent to the solution of a certain kind of generalized multivariate quadratic (MQ) problem which can be reckoned as hard; (3) the estimation of the effectiveness of direct MPF value computation; and (4) the presentation of preliminary security analysis, the determination of the security parameter, and specification of its secure value. These results allow us to make a conjecture that enhanced MPF can be a candidate one-way function (OWF), since the effective (polynomial-time) inversion algorithm for it is not yet known. An example of the application of the proposed MPF for the Key Agreement Protocol (KAP) is presented. Since the direct MPF value is computed effectively, the proposed MPF is suitable for the realization of cryptographic protocols in devices with restricted computation resources.
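
    For orientation, the sketch below computes a plain matrix power function over the multiplicative group of integers modulo a small prime, following the commonly published definition e_ij = prod_{k,l} q_kl^(x_ik * y_lj); the enhanced MPF of the paper works over more general noncommuting structures, so this commutative toy only illustrates the left-right powering action, and the modulus and matrices are arbitrary examples.

    P = 1009  # small prime modulus (illustrative)

    def mpf(X, Q, Y, p=P):
        """E = {}^X Q {}^Y with e_ij = prod_{k,l} q_kl^(x_ik * y_lj) mod p."""
        m = len(Q)
        E = [[1] * m for _ in range(m)]
        for i in range(m):
            for j in range(m):
                acc = 1
                for k in range(m):
                    for l in range(m):
                        acc = (acc * pow(Q[k][l], X[i][k] * Y[l][j], p)) % p
                E[i][j] = acc
        return E

    X = [[2, 3], [1, 4]]
    Q = [[5, 7], [11, 13]]
    Y = [[3, 1], [2, 5]]
    print(mpf(X, Q, Y))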

  8. Authenticity techniques for PACS images and records

    Science.gov (United States)

    Wong, Stephen T. C.; Abundo, Marco; Huang, H. K.

    1995-05-01

    Along with the digital radiology environment supported by picture archiving and communication systems (PACS) comes a new problem: How to establish trust in multimedia medical data that exist only in the easily altered memory of a computer. Trust is characterized in terms of integrity and privacy of digital data. Two major self-enforcing techniques can be used to assure the authenticity of electronic images and text -- key-based cryptography and digital time stamping. Key-based cryptography associates the content of an image with the originator using one or two distinct keys and prevents alteration of the document by anyone other than the originator. A digital time stamping algorithm generates a characteristic `digital fingerprint' for the original document using a mathematical hash function, and checks that it has not been modified. This paper discusses these cryptographic algorithms and their appropriateness for a PACS environment. It also presents experimental results of cryptographic algorithms on several imaging modalities.

  9. Hashing in computer science fifty years of slicing and dicing

    CERN Document Server

    Konheim, Alan G

    2009-01-01

    Written by one of the developers of the technology, Hashing is both a historical document on the development of hashing and an analysis of the applications of hashing in a society increasingly concerned with security. The material in this book is based on courses taught by the author, and key points are reinforced in sample problems and an accompanying instructor's manual. Graduate students and researchers in mathematics, cryptography, and security will benefit from this overview of hashing and the complicated mathematics that it requires.

  10. Putting Wings on SPHINCS

    DEFF Research Database (Denmark)

    Kölbl, Stefan

    2018-01-01

    SPHINCS is a recently proposed stateless hash-based signature scheme and promising candidate for a post-quantum secure digital signature scheme. In this work we provide a comparison of the performance when instantiating SPHINCS with different cryptographic hash functions on both recent Intel...

  11. Collision analysis of one kind of chaos-based hash function

    International Nuclear Information System (INIS)

    Xiao Di; Peng Wenbing; Liao Xiaofeng; Xiang Tao

    2010-01-01

    In the last decade, various chaos-based hash functions have been proposed. Nevertheless, the corresponding analyses of them lag far behind. In this Letter, we firstly take a chaos-based hash function proposed very recently in Amin, Faragallah and Abd El-Latif (2009) as a sample to analyze its computational collision problem, and then generalize the construction method of one kind of chaos-based hash function and summarize some attentions to avoid the collision problem. It is beneficial to the hash function design based on chaos in the future.

  12. Super-Encryption Implementation Using Monoalphabetic Algorithm and XOR Algorithm for Data Security

    Science.gov (United States)

    Rachmawati, Dian; Andri Budiman, Mohammad; Aulia, Indra

    2018-03-01

    The exchange of data that occurs offline and online is very vulnerable to the threat of data theft. In general, cryptography is the science and art of maintaining data secrecy. Encryption is a cryptographic process in which data is transformed into ciphertext, something unreadable and meaningless that cannot be read or understood by other parties. In super-encryption, two or more encryption algorithms are combined to make the result more secure. In this work, the Monoalphabetic algorithm and the XOR algorithm are combined to form a super-encryption. The Monoalphabetic algorithm works by changing a particular letter into a new letter based on existing keywords, while the XOR algorithm works by using the logical XOR operation. Since the Monoalphabetic algorithm is a classical cryptographic algorithm and the XOR algorithm is a modern cryptographic algorithm, this scheme is expected to be both easy to implement and more secure. The combination of the two algorithms is capable of securing the data and restoring it back to its original form (plaintext), so the data integrity is still ensured.
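
    A minimal sketch of the described super-encryption follows, assuming one common keyword-to-alphabet rule (keyword letters first, then the remaining letters) and a repeating XOR key; the authors' exact key handling may differ, and the keyword and key below are illustrative.

    import string

    def keyed_alphabet(keyword: str) -> str:
        """Build a 26-letter substitution alphabet: keyword letters first, then the rest."""
        seen = []
        for ch in (keyword.upper() + string.ascii_uppercase):
            if ch in string.ascii_uppercase and ch not in seen:
                seen.append(ch)
        return "".join(seen)

    def mono_encrypt(plain: str, keyword: str) -> str:
        table = str.maketrans(string.ascii_uppercase, keyed_alphabet(keyword))
        return plain.upper().translate(table)

    def mono_decrypt(cipher: str, keyword: str) -> str:
        table = str.maketrans(keyed_alphabet(keyword), string.ascii_uppercase)
        return cipher.translate(table)

    def xor_layer(data: bytes, key: bytes) -> bytes:
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    def super_encrypt(plain: str, keyword: str, xor_key: bytes) -> bytes:
        return xor_layer(mono_encrypt(plain, keyword).encode(), xor_key)

    def super_decrypt(blob: bytes, keyword: str, xor_key: bytes) -> str:
        return mono_decrypt(xor_layer(blob, xor_key).decode(), keyword)

    ct = super_encrypt("ATTACK AT DAWN", "SECRET", b"k3y")
    assert super_decrypt(ct, "SECRET", b"k3y") == "ATTACK AT DAWN"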

  13. Feature hashing for fast image retrieval

    Science.gov (United States)

    Yan, Lingyu; Fu, Jiarun; Zhang, Hongxin; Yuan, Lu; Xu, Hui

    2018-03-01

    Currently, research on content-based image retrieval mainly focuses on robust feature extraction. However, due to the exponential growth of online images, it is necessary to consider searching among large-scale image collections, which is very time-consuming and does not scale well. Hence, we need to pay much attention to the efficiency of image retrieval. In this paper, we propose a feature hashing method for image retrieval which not only generates compact fingerprints for image representation, but also prevents large semantic loss during the process of hashing. To generate the fingerprint, an objective function of semantic loss is constructed and minimized, which combines the influence of both the neighborhood structure of the feature data and the mapping error. Since machine-learning-based hashing effectively preserves the neighborhood structure of the data, it yields visual words with strong discriminability. Furthermore, the generated binary codes make image representation building low-complexity, efficient, and scalable to large-scale databases. Experimental results show the good performance of our approach.

  14. Multi-biometrics based cryptographic key regeneration scheme

    OpenAIRE

    Kanade , Sanjay Ganesh; Petrovska-Delacrétaz , Dijana; Dorizzi , Bernadette

    2009-01-01

    Biometrics lack revocability and privacy while cryptography cannot detect the user's identity. By obtaining cryptographic keys using biometrics, one can achieve the properties such as revocability, assurance about user's identity, and privacy. In this paper, we propose a multi-biometric based cryptographic key regeneration scheme. Since left and right irises of a person are uncorrelated, we treat them as two independent biometrics and combine in our system. We propose ...

  15. Secure and Efficient Regression Analysis Using a Hybrid Cryptographic Framework: Development and Evaluation.

    Science.gov (United States)

    Sadat, Md Nazmus; Jiang, Xiaoqian; Aziz, Md Momin Al; Wang, Shuang; Mohammed, Noman

    2018-03-05

    Machine learning is an effective data-driven tool that is being widely used to extract valuable patterns and insights from data. Specifically, predictive machine learning models are very important in health care for clinical data analysis. The machine learning algorithms that generate predictive models often require pooling data from different sources to discover statistical patterns or correlations among different attributes of the input data. The primary challenge is to fulfill one major objective: preserving the privacy of individuals while discovering knowledge from data. Our objective was to develop a hybrid cryptographic framework for performing regression analysis over distributed data in a secure and efficient way. Existing secure computation schemes are not suitable for processing the large-scale data that are used in cutting-edge machine learning applications. We designed, developed, and evaluated a hybrid cryptographic framework, which can securely perform regression analysis, a fundamental machine learning algorithm, using somewhat homomorphic encryption and a newly introduced secure hardware component of Intel Software Guard Extensions (Intel SGX) to ensure both privacy and efficiency at the same time. Experimental results demonstrate that our proposed method provides a better trade-off in terms of security and efficiency than solely secure hardware-based methods. In addition, there is no approximation error: computed model parameters are identical to the plaintext results. To the best of our knowledge, this kind of secure computation model using a hybrid cryptographic framework, which leverages both somewhat homomorphic encryption and Intel SGX, has not been proposed or evaluated to date. Our proposed framework ensures data security and computational efficiency at the same time. ©Md Nazmus Sadat, Xiaoqian Jiang, Md Momin Al Aziz, Shuang Wang, Noman Mohammed. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 05.03.2018.

  16. Side channel analysis of some hash based MACs: A response to SHA-3 requirements

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Okeya, Katsuyuki

    2008-01-01

    The forthcoming NIST's Advanced Hash Standard (AHS) competition to select SHA-3 hash function requires that each candidate hash function submission must have at least one construction to support FIPS 198 HMAC application. As part of its evaluation, NIST is aiming to select either a candidate hash...

  17. Cryptanalysis of an Iterated Halving-based hash function: CRUSH

    DEFF Research Database (Denmark)

    Bagheri, Nasour; Henricksen, Matt; Knudsen, Lars Ramkilde

    2009-01-01

    Iterated Halving has been suggested as a replacement to the Merkle–Damgård (MD) construction in 2004 anticipating the attacks on the MDx family of hash functions. The CRUSH hash function provides a specific instantiation of the block cipher for Iterated Halving. The authors identify structural pr...

  18. Model-based recognition of 3-D objects by geometric hashing technique

    International Nuclear Information System (INIS)

    Severcan, M.; Uzunalioglu, H.

    1992-09-01

    A model-based object recognition system is developed for recognition of polyhedral objects. The system consists of feature extraction, modelling and matching stages. Linear features are used for object descriptions. Lines are obtained from edges using rotation transform. For modelling and recognition process, geometric hashing method is utilized. Each object is modelled using 2-D views taken from the viewpoints on the viewing sphere. A hidden line elimination algorithm is used to find these views from the wire frame model of the objects. The recognition experiments yielded satisfactory results. (author). 8 refs, 5 figs
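
    For reference, a compact two-dimensional, point-based sketch of classic geometric hashing is shown below (model preprocessing into a hash table keyed by quantized basis-relative coordinates, then voting at recognition time); the paper applies the same idea to line features extracted from 2-D views of polyhedral models, so the point set, quantization step, and model name here are illustrative assumptions.

    from collections import defaultdict
    from itertools import permutations

    BIN = 0.25  # quantization step for hash keys

    def to_basis(p, b0, b1):
        """Coordinates of p in the frame where b0 is the origin and b1-b0 the unit x-axis."""
        ux, uy = b1[0] - b0[0], b1[1] - b0[1]
        norm2 = ux * ux + uy * uy
        dx, dy = p[0] - b0[0], p[1] - b0[1]
        return ((dx * ux + dy * uy) / norm2, (-dx * uy + dy * ux) / norm2)

    def quantize(xy):
        return (round(xy[0] / BIN), round(xy[1] / BIN))

    def build_table(models):
        """Preprocessing: store (model, basis) under the quantized coordinates of every other point."""
        table = defaultdict(list)
        for name, pts in models.items():
            for b0, b1 in permutations(pts, 2):
                for p in pts:
                    if p not in (b0, b1):
                        table[quantize(to_basis(p, b0, b1))].append((name, (b0, b1)))
        return table

    def vote(table, scene_pts, b0, b1):
        """Recognition: a candidate scene basis votes for the stored (model, basis) entries."""
        votes = defaultdict(int)
        for p in scene_pts:
            if p not in (b0, b1):
                for entry in table.get(quantize(to_basis(p, b0, b1)), []):
                    votes[entry] += 1
        return max(votes.items(), key=lambda kv: kv[1]) if votes else None

    models = {"wedge": [(0, 0), (2, 0), (2, 1), (0, 1), (1, 2)]}
    table = build_table(models)
    scene = [(0, 0), (2, 0), (2, 1), (0, 1), (1, 2)]          # same object, same pose here
    print(vote(table, scene, (0, 0), (2, 0)))                  # -> (('wedge', ((0, 0), (2, 0))), 3)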

  19. HashDist: Reproducible, Relocatable, Customizable, Cross-Platform Software Stacks for Open Hydrological Science

    Science.gov (United States)

    Ahmadia, A. J.; Kees, C. E.

    2014-12-01

    Developing scientific software is a continuous balance between not reinventing the wheel and getting fragile codes to interoperate with one another. Binary software distributions such as Anaconda provide a robust starting point for many scientific software packages, but this solution alone is insufficient for many scientific software developers. HashDist provides a critical component of the development workflow, enabling highly customizable, source-driven, and reproducible builds for scientific software stacks, available from both the IPython Notebook and the command line. To address these issues, the Coastal and Hydraulics Laboratory at the US Army Engineer Research and Development Center has funded the development of HashDist in collaboration with Simula Research Laboratories and the University of Texas at Austin. HashDist is motivated by a functional approach to package build management, and features intelligent caching of sources and builds, parametrized build specifications, and the ability to interoperate with system compilers and packages. HashDist enables the easy specification of "software stacks", which allow both the novice user to install a default environment and the advanced user to configure every aspect of their build in a modular fashion. As an advanced feature, HashDist builds can be made relocatable, allowing the easy redistribution of binaries on all three major operating systems as well as cloud, and supercomputing platforms. As a final benefit, all HashDist builds are reproducible, with a build hash specifying exactly how each component of the software stack was installed. This talk discusses the role of HashDist in the hydrological sciences, including its use by the Coastal and Hydraulics Laboratory in the development and deployment of the Proteus Toolkit as well as the Rapid Operational Access and Maneuver Support project. We demonstrate HashDist in action, and show how it can effectively support development, deployment, teaching, and

  20. Hash function construction using weighted complex dynamical networks

    International Nuclear Information System (INIS)

    Song Yu-Rong; Jiang Guo-Ping

    2013-01-01

    A novel scheme to construct a hash function based on a weighted complex dynamical network (WCDN) generated from an original message is proposed in this paper. First, the original message is divided into blocks. Then, each block is divided into components, and the nodes and weighted edges are well defined from these components and their relations. Namely, the WCDN closely related to the original message is established. Furthermore, the node dynamics of the WCDN are chosen as a chaotic map. After chaotic iterations, quantization and exclusive-or operations, the fixed-length hash value is obtained. This scheme has the property that any tiny change in message can be diffused rapidly through the WCDN, leading to very different hash values. Analysis and simulation show that the scheme possesses good statistical properties, excellent confusion and diffusion, strong collision resistance and high efficiency. (general)

  1. Discriminative Projection Selection Based Face Image Hashing

    Science.gov (United States)

    Karabat, Cagatay; Erdogan, Hakan

    Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.
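
    The sketch below illustrates the general random-projection hashing pipeline with a simple Fisher-style ranking of projection rows and median thresholding; the selection rule and the quantizer (the paper uses a bimodal Gaussian mixture model) are stand-in assumptions, and the dimensions and synthetic enrollment data are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    def fisher_scores(projections, labels):
        """Per-row ratio of between-class to within-class variance of the projections."""
        classes = np.unique(labels)
        overall = projections.mean(axis=1, keepdims=True)
        between = np.zeros(projections.shape[0])
        within = np.zeros(projections.shape[0])
        for c in classes:
            cols = projections[:, labels == c]
            mu = cols.mean(axis=1, keepdims=True)
            between += cols.shape[1] * ((mu - overall) ** 2).ravel()
            within += ((cols - mu) ** 2).sum(axis=1)
        return between / (within + 1e-9)

    def biometric_hash(feature, W, rows, thresholds):
        """Project with the selected rows and binarize against the learned thresholds."""
        return ((W[rows] @ feature) > thresholds).astype(np.uint8)

    dim, n_proj, n_bits = 64, 128, 32
    W = rng.standard_normal((n_proj, dim))                 # user-independent random projections
    samples = rng.standard_normal((dim, 60))               # synthetic enrollment features (columns)
    labels = np.repeat(np.arange(6), 10)                   # 6 users x 10 samples each
    proj = W @ samples
    rows = np.argsort(fisher_scores(proj, labels))[-n_bits:]   # keep the most discriminative rows
    thresholds = np.median(proj[rows], axis=1)
    print(biometric_hash(samples[:, 0], W, rows, thresholds))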

  2. Designing an ASIP for cryptographic pairings over Barreto-Naehrig curves

    NARCIS (Netherlands)

    Kammler, D.; Zhang, D.; Schwabe, P.; Scharwaechter, H.; Langenberg, M.; Auras, D.; Ascheid, G.; Mathar, R.; Clavier, C.; Gaj, K.

    2009-01-01

    This paper presents a design-space exploration of an application-specific instruction-set processor (ASIP) for the computation of various cryptographic pairings over Barreto-Naehrig curves (BN curves). Cryptographic pairings are based on elliptic curves over finite fields—in the case of BN curves a

  3. Secure method for biometric-based recognition with integrated cryptographic functions.

    Science.gov (United States)

    Chiou, Shin-Yan

    2013-01-01

    Biometric systems refer to biometric technologies which can be used to achieve authentication. Unlike cryptography-based technologies, the certification ratio in biometric systems need not achieve 100% accuracy. However, biometric data can only be directly compared through proximal access to the scanning device and cannot be combined with cryptographic techniques. Moreover, repeated use, improper storage, or transmission leaks may compromise security. Prior studies have attempted to combine cryptography and biometrics, but these methods require the synchronization of internal systems and are vulnerable to power analysis attacks, fault-based cryptanalysis, and replay attacks. This paper presents a new secure cryptographic authentication method using biometric features. The proposed system combines the advantages of biometric identification and cryptographic techniques. By adding a subsystem to existing biometric recognition systems, we can simultaneously achieve the security of cryptographic technology and the error tolerance of biometric recognition. This method can be used for biometric data encryption, signatures, and other types of cryptographic computation. The method offers a high degree of security with protection against power analysis attacks, fault-based cryptanalysis, and replay attacks. Moreover, it can be used to improve the confidentiality of biological data storage and biodata identification processes. Remote biometric authentication can also be safely applied.

  4. Secure Method for Biometric-Based Recognition with Integrated Cryptographic Functions

    Directory of Open Access Journals (Sweden)

    Shin-Yan Chiou

    2013-01-01

    Full Text Available Biometric systems refer to biometric technologies which can be used to achieve authentication. Unlike cryptography-based technologies, the certification ratio in biometric systems need not achieve 100% accuracy. However, biometric data can only be directly compared through proximal access to the scanning device and cannot be combined with cryptographic techniques. Moreover, repeated use, improper storage, or transmission leaks may compromise security. Prior studies have attempted to combine cryptography and biometrics, but these methods require the synchronization of internal systems and are vulnerable to power analysis attacks, fault-based cryptanalysis, and replay attacks. This paper presents a new secure cryptographic authentication method using biometric features. The proposed system combines the advantages of biometric identification and cryptographic techniques. By adding a subsystem to existing biometric recognition systems, we can simultaneously achieve the security of cryptographic technology and the error tolerance of biometric recognition. This method can be used for biometric data encryption, signatures, and other types of cryptographic computation. The method offers a high degree of security with protection against power analysis attacks, fault-based cryptanalysis, and replay attacks. Moreover, it can be used to improve the confidentiality of biological data storage and biodata identification processes. Remote biometric authentication can also be safely applied.

  5. NHash: Randomized N-Gram Hashing for Distributed Generation of Validatable Unique Study Identifiers in Multicenter Research.

    Science.gov (United States)

    Zhang, Guo-Qiang; Tao, Shiqiang; Xing, Guangming; Mozes, Jeno; Zonjy, Bilal; Lhatoo, Samden D; Cui, Licong

    2015-11-10

    A unique study identifier serves as a key for linking research data about a study subject without revealing protected health information in the identifier. While sufficient for single-site and limited-scale studies, the use of common unique study identifiers has several drawbacks for large multicenter studies, where thousands of research participants may be recruited from multiple sites. An important property of study identifiers is error tolerance (or validatable), in that inadvertent editing mistakes during their transmission and use will most likely result in invalid study identifiers. This paper introduces a novel method called "Randomized N-gram Hashing (NHash)," for generating unique study identifiers in a distributed and validatable fashion, in multicenter research. NHash has a unique set of properties: (1) it is a pseudonym serving the purpose of linking research data about a study participant for research purposes; (2) it can be generated automatically in a completely distributed fashion with virtually no risk for identifier collision; (3) it incorporates a set of cryptographic hash functions based on N-grams, with a combination of additional encryption techniques such as a shift cipher; (4) it is validatable (error tolerant) in the sense that inadvertent edit errors will mostly result in invalid identifiers. NHash consists of 2 phases. First, an intermediate string using randomized N-gram hashing is generated. This string consists of a collection of N-gram hashes f1, f2, ..., fk. The input for each function fi has 3 components: a random number r, an integer n, and input data m. The result, fi(r, n, m), is an n-gram of m with a starting position s, which is computed as (r mod |m|), where |m| represents the length of m. The output for Step 1 is the concatenation of the sequence f1(r1, n1, m1), f2(r2, n2, m2), ..., fk(rk, nk, mk). In the second phase, the intermediate string generated in Phase 1 is encrypted using techniques such as shift cipher. The result
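
    A minimal reading of the two phases described above can be sketched as follows; the wrap-around handling of n-grams, the alphabet, the shift amount, and the example (r, n, m) components are assumptions introduced for illustration and are not taken from the paper.

    import string

    def ngram(r: int, n: int, m: str) -> str:
        """f(r, n, m): the n-gram of m starting at position s = r mod |m| (assumed wrap-around)."""
        s = r % len(m)
        doubled = m + m
        return doubled[s:s + n]

    def phase1(components):
        """components: iterable of (r, n, m) triples; returns the intermediate string."""
        return "".join(ngram(r, n, m) for r, n, m in components)

    def shift_cipher(text: str, shift: int) -> str:
        alphabet = string.ascii_uppercase + string.digits
        return "".join(alphabet[(alphabet.index(c) + shift) % len(alphabet)] for c in text)

    def nhash(components, shift=7):
        return shift_cipher(phase1(components), shift)

    # hypothetical inputs: site code, subject counter, enrollment date
    components = [(13, 3, "SITE04"), (5, 4, "000187"), (21, 4, "20150809")]
    print(nhash(components))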

  6. On Cryptographic Information Security in Cloud Infrastructures: PKI and IBE Methods

    Directory of Open Access Journals (Sweden)

    Konstantin Grigorevich Kogos

    2014-05-01

    Full Text Available The application of cryptographic security methods to cloud infrastructure information security is analyzed. The cryptographic problems arising in cloud infrastructures are identified; the appropriate protocols are investigated; and the underlying mathematical problems are examined.

  7. Practical security and privacy attacks against biometric hashing using sparse recovery

    Science.gov (United States)

    Topcu, Berkay; Karabat, Cagatay; Azadmanesh, Matin; Erdogan, Hakan

    2016-12-01

    Biometric hashing is a cancelable biometric verification method that has received research interest recently. This method can be considered as a two-factor authentication method which combines a personal password (or secret key) with a biometric to obtain a secure binary template which is used for authentication. We present novel practical security and privacy attacks against biometric hashing when the attacker is assumed to know the user's password in order to quantify the additional protection due to biometrics when the password is compromised. We present four methods that can reconstruct a biometric feature and/or the image from a hash and one method which can find the closest biometric data (i.e., face image) from a database. Two of the reconstruction methods are based on 1-bit compressed sensing signal reconstruction for which the data acquisition scenario is very similar to biometric hashing. Previous literature introduced simple attack methods, but we show that we can achieve higher level of security threats using compressed sensing recovery techniques. In addition, we present privacy attacks which reconstruct a biometric image which resembles the original image. We quantify the performance of the attacks using detection error tradeoff curves and equal error rates under advanced attack scenarios. We show that conventional biometric hashing methods suffer from high security and privacy leaks under practical attacks, and we believe more advanced hash generation methods are necessary to avoid these attacks.

  8. Quantum Communication Attacks on Classical Cryptographic Protocols

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre

    In the literature on cryptographic protocols, it has been studied several times what happens if a classical protocol is attacked by a quantum adversary. Usually, this is taken to mean that the adversary runs a quantum algorithm, but communicates classically with the honest players. In several cases, one can show that the protocol remains secure even under such an attack. However, there are also cases where the honest players are quantum as well, even if the protocol uses classical communication. For instance, this is the case when classical multiparty computation is used as a “subroutine” in quantum multiparty computation. Furthermore, in the future, players in a protocol may employ quantum computing simply to improve efficiency of their local computation, even if the communication is supposed to be classical. In such cases, it no longer seems clear that a quantum adversary must be limited...

  10. Superposition Attacks on Cryptographic Protocols

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Funder, Jakob Løvstad; Nielsen, Jesper Buus

    2011-01-01

    Attacks on classical cryptographic protocols are usually modeled by allowing an adversary to ask queries from an oracle. Security is then defined by requiring that as long as the queries satisfy some constraint, there is some problem the adversary cannot solve, such as compute a certain piece of information. In this paper, we introduce a fundamentally new model of quantum attacks on classical cryptographic protocols, where the adversary is allowed to ask several classical queries in quantum superposition. This is a strictly stronger attack than the standard one, and we consider the security of several primitives in this model. We show that a secret-sharing scheme that is secure with threshold $t$ in the standard model is secure against superposition attacks if and only if the threshold is lowered to $t/2$. We use this result to give zero-knowledge proofs for all of NP in the common reference...

  11. Protecting privacy in a clinical data warehouse.

    Science.gov (United States)

    Kong, Guilan; Xiao, Zhichun

    2015-06-01

    Peking University has several prestigious teaching hospitals in China. To make secondary use of massive medical data for research purposes, construction of a clinical data warehouse is imperative in Peking University. However, a big concern for clinical data warehouse construction is how to protect patient privacy. In this project, we propose to use a combination of symmetric block ciphers, asymmetric ciphers, and cryptographic hashing algorithms to protect patient privacy information. The novelty of our privacy protection approach lies in message-level data encryption, the key caching system, and the cryptographic key management system. The proposed privacy protection approach is scalable to clinical data warehouse construction with any size of medical data. With the composite privacy protection approach, the clinical data warehouse can be secure enough to keep the confidential data from leaking to the outside world. © The Author(s) 2014.
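
    As an illustration of the cryptographic-hashing component of such an approach, the sketch below pseudonymizes patient identifiers with a keyed hash before records are loaded; the key handling is a placeholder assumption, and the symmetric and asymmetric encryption layers mentioned above would be provided by a cryptographic library and the key management system rather than by this snippet.

    import hashlib
    import hmac

    PEPPER = b"warehouse-secret-key"   # placeholder: in practice served by the key management system

    def pseudonymize(patient_id: str) -> str:
        """Replace an identifier by a keyed (HMAC-SHA-256) pseudonym that still allows linkage."""
        return hmac.new(PEPPER, patient_id.encode(), hashlib.sha256).hexdigest()

    record = {"patient_id": "PKU-000123", "diagnosis": "I10", "age_band": "60-69"}
    loaded = {**record, "patient_id": pseudonymize(record["patient_id"])}
    print(loaded["patient_id"][:16], "...")   # the same input always maps to the same pseudonym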

  12. A comprehensive evaluation of alignment algorithms in the context of RNA-seq.

    Directory of Open Access Journals (Sweden)

    Robert Lindner

    Full Text Available Transcriptome sequencing (RNA-Seq) overcomes limitations of previously used RNA quantification methods and provides one experimental framework for both high-throughput characterization and quantification of transcripts at the nucleotide level. The first step and a major challenge in the analysis of such experiments is the mapping of sequencing reads to a transcriptomic origin including the identification of splicing events. In recent years, a large number of such mapping algorithms have been developed, all of which have in common that they require algorithms for aligning a vast number of reads to genomic or transcriptomic sequences. Although the FM-index based aligner Bowtie has become a de facto standard within mapping pipelines, a much larger number of possible alignment algorithms have been developed also including other variants of FM-index based aligners. Accordingly, developers and users of RNA-seq mapping pipelines have the choice among a large number of available alignment algorithms. To provide guidance in the choice of alignment algorithms for these purposes, we evaluated the performance of 14 widely used alignment programs from three different algorithmic classes: algorithms using either hashing of the reference transcriptome, hashing of reads, or a compressed FM-index representation of the genome. Here, special emphasis was placed on both precision and recall and the performance for different read lengths and numbers of mismatches and indels in a read. Our results clearly showed the significant reduction in memory footprint and runtime provided by FM-index based aligners at a precision and recall comparable to the best hash table based aligners. Furthermore, the recently developed Bowtie 2 alignment algorithm shows a remarkable tolerance to both sequencing errors and indels, thus, essentially making hash-based aligners obsolete.
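
    For context, the hash-table class of aligners discussed above boils down to seeding reads from a k-mer index of the reference; the toy sketch below shows only that seeding step (exact matches, tiny k and a made-up reference), whereas real aligners add mismatch- and indel-tolerant extension, quality-aware scoring, and spliced alignment on top.

    from collections import defaultdict

    K = 5  # seed length (illustrative; real tools use much longer seeds)

    def build_index(reference: str):
        """Hash every k-mer of the reference to the list of positions where it occurs."""
        index = defaultdict(list)
        for pos in range(len(reference) - K + 1):
            index[reference[pos:pos + K]].append(pos)
        return index

    def seed_positions(read: str, index):
        """Vote for reference offsets implied by each exact k-mer hit in the read."""
        votes = defaultdict(int)
        for offset in range(len(read) - K + 1):
            for pos in index.get(read[offset:offset + K], []):
                votes[pos - offset] += 1
        return sorted(votes.items(), key=lambda kv: -kv[1])

    reference = "ACGTACGTTAGCCGATCGATCGGGTACGTTAGC"
    index = build_index(reference)
    print(seed_positions("CGATCGATCGGG", index)[:1])   # -> [(12, 8)]: offset 12, supported by 8 seeds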

  13. Object-Location-Aware Hashing for Multi-Label Image Retrieval via Automatic Mask Learning.

    Science.gov (United States)

    Huang, Chang-Qin; Yang, Shang-Ming; Pan, Yan; Lai, Han-Jiang

    2018-09-01

    Learning-based hashing is a leading approach of approximate nearest neighbor search for large-scale image retrieval. In this paper, we develop a deep supervised hashing method for multi-label image retrieval, in which we propose to learn a binary "mask" map that can identify the approximate locations of objects in an image, so that we use this binary "mask" map to obtain length-limited hash codes which mainly focus on an image's objects but ignore the background. The proposed deep architecture consists of four parts: 1) a convolutional sub-network to generate effective image features; 2) a binary "mask" sub-network to identify image objects' approximate locations; 3) a weighted average pooling operation based on the binary "mask" to obtain feature representations and hash codes that pay most attention to foreground objects but ignore the background; and 4) the combination of a triplet ranking loss designed to preserve relative similarities among images and a cross entropy loss defined on image labels. We conduct comprehensive evaluations on four multi-label image data sets. The results indicate that the proposed hashing method achieves superior performance gains over the state-of-the-art supervised or unsupervised hashing baselines.

  14. Improving the security of a parallel keyed hash function based on chaotic maps

    Energy Technology Data Exchange (ETDEWEB)

    Xiao Di, E-mail: xiaodi_cqu@hotmail.co [College of Computer Science and Engineering, Chongqing University, Chongqing 400044 (China); Liao Xiaofeng [College of Computer Science and Engineering, Chongqing University, Chongqing 400044 (China); Wang Yong [College of Computer Science and Engineering, Chongqing University, Chongqing 400044 (China)] [College of Economy and Management, Chongqing University of Posts and Telecommunications, Chongqing 400065 (China)

    2009-11-23

    In this Letter, we analyze the cause of vulnerability of the original parallel keyed hash function based on chaotic maps in detail, and then propose the corresponding enhancement measures. Theoretical analysis and computer simulation indicate that the modified hash function is more secure than the original one. At the same time, it can keep the parallel merit and satisfy the other performance requirements of hash function.

  15. Improving the security of a parallel keyed hash function based on chaotic maps

    International Nuclear Information System (INIS)

    Xiao Di; Liao Xiaofeng; Wang Yong

    2009-01-01

    In this Letter, we analyze the cause of vulnerability of the original parallel keyed hash function based on chaotic maps in detail, and then propose the corresponding enhancement measures. Theoretical analysis and computer simulation indicate that the modified hash function is more secure than the original one. At the same time, it can keep the parallel merit and satisfy the other performance requirements of hash function.

  16. Robust visual hashing via ICA

    International Nuclear Information System (INIS)

    Fournel, Thierry; Coltuc, Daniela

    2010-01-01

    Designed to maximize information transmission in the presence of noise, independent component analysis (ICA) could appear in certain circumstances as a statistics-based tool for robust visual hashing. Several ICA-based scenarios can attempt to reach this goal. A first one is here considered.

  17. Cryptographic Key Management System

    Energy Technology Data Exchange (ETDEWEB)

    No, author

    2014-02-21

    This report summarizes the outcome of U.S. Department of Energy (DOE) contract DE-OE0000543, requesting the design of a Cryptographic Key Management System (CKMS) for the secure management of cryptographic keys for the energy sector infrastructure. Prime contractor Sypris Electronics, in collaboration with Oak Ridge National Laboratories (ORNL), Electric Power Research Institute (EPRI), Valicore Technologies, and Purdue University's Center for Education and Research in Information Assurance and Security (CERIAS) and Smart Meter Integration Laboratory (SMIL), has designed, developed and evaluated the CKMS solution. We provide an overview of the project in Section 3, review the core contributions of all contractors in Section 4, and discuss benefits to the DOE in Section 5. In Section 6 we describe the technical construction of the CKMS solution, and review its key contributions in Section 6.9. Section 7 describes the evaluation and demonstration of the CKMS solution in different environments. We summarize the key project objectives in Section 8, list publications resulting from the project in Section 9, and conclude with a discussion on commercialization in Section 10 and future work in Section 11.

  18. Matching Real and Synthetic Panoramic Images Using a Variant of Geometric Hashing

    Science.gov (United States)

    Li-Chee-Ming, J.; Armenakis, C.

    2017-05-01

    This work demonstrates an approach to automatically initialize a visual model-based tracker, and recover from lost tracking, without prior camera pose information. These approaches are commonly referred to as tracking-by-detection. Previous tracking-by-detection techniques used either fiducials (i.e. landmarks or markers) or the object's texture. The main contribution of this work is the development of a tracking-by-detection algorithm that is based solely on natural geometric features. A variant of geometric hashing, a model-to-image registration algorithm, is proposed that searches for a matching panoramic image from a database of synthetic panoramic images captured in a 3D virtual environment. The approach identifies corresponding features between the matched panoramic images. The corresponding features are to be used in a photogrammetric space resection to estimate the camera pose. The experiments apply this algorithm to initialize a model-based tracker in an indoor environment using the 3D CAD model of the building.

  19. sPECTRA: a Precise framEwork for analyzing CrypTographic vulneRabilities in Android apps

    OpenAIRE

    Gajrani, J.; Tripathi, M.; Laxmi, V.; Gaur, M. S.; Conti, M.; Rajarajan, M.

    2017-01-01

    The majority of Android applications (apps) deals with user's personal data. Users trust these apps and allow them to access all sensitive data. Cryptography, when employed in an appropriate way, can be used to prevent misuse of data. Unfortunately, cryptographic libraries also include vulnerable cryptographic services. Since Android app developers may not be cryptographic experts, this makes apps become the target of various attacks due to cryptographic vulnerabilities. In this work, we pres...

  20. Range-efficient consistent sampling and locality-sensitive hashing for polygons

    DEFF Research Database (Denmark)

    Gudmundsson, Joachim; Pagh, Rasmus

    2017-01-01

    Locality-sensitive hashing (LSH) is a fundamental technique for similarity search and similarity estimation in high-dimensional spaces. The basic idea is that similar objects should produce hash collisions with probability significantly larger than objects with low similarity. We consider LSH for...... or union of a set of preprocessed polygons. Curiously, our consistent sampling method uses transformation to a geometric problem....

  1. Authentication codes from ε-ASU hash functions with partially secret keys

    NARCIS (Netherlands)

    Liu, S.L.; Tilborg, van H.C.A.; Weng, J.; Chen, Kefei

    2014-01-01

    An authentication code can be constructed with a family of ε-almost strongly universal (ε-ASU) hash functions, with the index of hash functions as the authentication key. This paper considers the performance of authentication codes from ε-ASU hash functions, when the authentication key is only partially secret. We
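
    For background, the textbook construction such codes build on looks like the following sketch: the secret key indexes a strongly universal hash family (here h_{a,b}(m) = a*m + b mod p) and the tag is simply the hash value. This only illustrates the ε-ASU-based construction; the paper's own contribution, the analysis of partially leaked keys, is not reproduced here, and the prime and message are arbitrary examples.

    import secrets

    P = (1 << 61) - 1   # a Mersenne prime, used as the field size

    def keygen():
        """The key is the index (a, b) selecting one member of the hash family."""
        return secrets.randbelow(P), secrets.randbelow(P)

    def tag(key, message: int) -> int:
        a, b = key
        return (a * (message % P) + b) % P

    key = keygen()
    m = 424242
    t = tag(key, m)
    assert tag(key, m) == t            # the receiver recomputes and compares the tag
    assert tag(key, m + 1) != t        # a forged message fails (except with negligible probability)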

  2. Linear-XOR and Additive Checksums Don't Protect Damgard-Merkle Hashes

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Kelsey, John

    2008-01-01

    We consider the security of Damgård-Merkle variants which compute linear-XOR or additive checksums over message blocks, intermediate hash values, or both, and process these checksums in computing the final hash value. We show that these Damgård-Merkle variants gain almost no security...

  3. Gene function prediction based on Gene Ontology Hierarchy Preserving Hashing.

    Science.gov (United States)

    Zhao, Yingwen; Fu, Guangyuan; Wang, Jun; Guo, Maozu; Yu, Guoxian

    2018-02-23

    Gene Ontology (GO) uses structured vocabularies (or terms) to describe the molecular functions, biological roles, and cellular locations of gene products in a hierarchical ontology. GO annotations associate genes with GO terms and indicate the given gene products carrying out the biological functions described by the relevant terms. However, predicting correct GO annotations for genes from a massive set of GO terms as defined by GO is a difficult challenge. To combat with this challenge, we introduce a Gene Ontology Hierarchy Preserving Hashing (HPHash) based semantic method for gene function prediction. HPHash firstly measures the taxonomic similarity between GO terms. It then uses a hierarchy preserving hashing technique to keep the hierarchical order between GO terms, and to optimize a series of hashing functions to encode massive GO terms via compact binary codes. After that, HPHash utilizes these hashing functions to project the gene-term association matrix into a low-dimensional one and performs semantic similarity based gene function prediction in the low-dimensional space. Experimental results on three model species (Homo sapiens, Mus musculus and Rattus norvegicus) for interspecies gene function prediction show that HPHash performs better than other related approaches and it is robust to the number of hash functions. In addition, we also take HPHash as a plugin for BLAST based gene function prediction. From the experimental results, HPHash again significantly improves the prediction performance. The codes of HPHash are available at: http://mlda.swu.edu.cn/codes.php?name=HPHash. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. Handwriting: Feature Correlation Analysis for Biometric Hashes

    Science.gov (United States)

    Vielhauer, Claus; Steinmetz, Ralf

    2004-12-01

    In the application domain of electronic commerce, biometric authentication can provide one possible solution for the key management problem. Besides server-based approaches, methods of deriving digital keys directly from biometric measures appear to be advantageous. In this paper, we analyze one of our recently published specific algorithms of this category based on behavioral biometrics of handwriting, the biometric hash. Our interest is to investigate to which degree each of the underlying feature parameters contributes to the overall intrapersonal stability and interpersonal value space. We will briefly discuss related work in feature evaluation and introduce a new methodology based on three components: the intrapersonal scatter (deviation), the interpersonal entropy, and the correlation between both measures. Evaluation of the technique is presented based on two data sets of different size. The method presented will allow determination of effects of parameterization of the biometric system, estimation of value space boundaries, and comparison with other feature selection approaches.

  5. Handwriting: Feature Correlation Analysis for Biometric Hashes

    Directory of Open Access Journals (Sweden)

    Ralf Steinmetz

    2004-04-01

    Full Text Available In the application domain of electronic commerce, biometric authentication can provide one possible solution for the key management problem. Besides server-based approaches, methods of deriving digital keys directly from biometric measures appear to be advantageous. In this paper, we analyze one of our recently published specific algorithms of this category based on behavioral biometrics of handwriting, the biometric hash. Our interest is to investigate to which degree each of the underlying feature parameters contributes to the overall intrapersonal stability and interpersonal value space. We will briefly discuss related work in feature evaluation and introduce a new methodology based on three components: the intrapersonal scatter (deviation), the interpersonal entropy, and the correlation between both measures. Evaluation of the technique is presented based on two data sets of different size. The method presented will allow determination of effects of parameterization of the biometric system, estimation of value space boundaries, and comparison with other feature selection approaches.

  6. Two-phase hybrid cryptography algorithm for wireless sensor networks

    Directory of Open Access Journals (Sweden)

    Rawya Rizk

    2015-12-01

    Full Text Available For achieving security in wireless sensor networks (WSNs), cryptography plays an important role. In this paper, a new security algorithm using a combination of both symmetric and asymmetric cryptographic techniques is proposed to provide high security with minimized key maintenance. It guarantees three cryptographic primitives: integrity, confidentiality and authentication. Elliptic Curve Cryptography (ECC) and the Advanced Encryption Standard (AES) are combined to provide encryption. The XOR-DUAL RSA algorithm is considered for authentication and Message Digest-5 (MD5) for integrity. The results show that the proposed hybrid algorithm gives better performance in terms of computation time, the size of the ciphertext, and the energy consumption in the WSN. It is also robust against different types of attacks in the case of image encryption.
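
    A minimal sketch of the symmetric-encryption-plus-digest portion of such a hybrid scheme is shown below, assuming the third-party pyca/cryptography package for AES and the standard library for MD5; the ECC/RSA key-agreement and authentication steps described in the abstract are not modelled, and the key is simply assumed to be already shared.

    ```python
    # Sketch of the confidentiality (AES-CTR) and integrity (MD5 digest) steps of a
    # hybrid scheme. The ECC/XOR-DUAL RSA parts of the paper are not shown here.
    import os
    import hashlib
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def encrypt_with_digest(key: bytes, plaintext: bytes):
        nonce = os.urandom(16)                       # per-message CTR nonce
        encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
        ciphertext = encryptor.update(plaintext) + encryptor.finalize()
        digest = hashlib.md5(plaintext).hexdigest()  # integrity check value
        return nonce, ciphertext, digest

    def decrypt_and_verify(key: bytes, nonce: bytes, ciphertext: bytes, digest: str):
        decryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
        plaintext = decryptor.update(ciphertext) + decryptor.finalize()
        if hashlib.md5(plaintext).hexdigest() != digest:
            raise ValueError("integrity check failed")
        return plaintext

    key = os.urandom(16)                             # 128-bit AES key, assumed shared
    print(decrypt_and_verify(key, *encrypt_with_digest(key, b"sensor reading 42")))
    ```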

  7. A Verifiable Language for Cryptographic Protocols

    DEFF Research Database (Denmark)

    Nielsen, Christoffer Rosenkilde

    We develop a formal language for specifying cryptographic protocols in a structured and clear manner, which allows verification of many interesting properties; in particular confidentiality and integrity. The study sheds new light on the problem of creating intuitive and human readable languages...

  8. System using data compression and hashing adapted for use for multimedia encryption

    Science.gov (United States)

    Coffland, Douglas R [Livermore, CA

    2011-07-12

    A system and method is disclosed for multimedia encryption. Within the system of the present invention, a data compression module receives and compresses a media signal into a compressed data stream. A data acquisition module receives and selects a set of data from the compressed data stream. And, a hashing module receives and hashes the set of data into a keyword. The method of the present invention includes the steps of compressing a media signal into a compressed data stream; selecting a set of data from the compressed data stream; and hashing the set of data into a keyword.
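
    A rough standard-library sketch of the compress-select-hash pipeline described above follows; the rule for selecting data from the compressed stream (here, every fourth byte) is an illustrative assumption, not the patented selection method.

    ```python
    # Sketch of the three modules named in the abstract:
    # compression -> data acquisition (selection) -> hashing into a keyword.
    import zlib
    import hashlib

    def keyword_from_media(media: bytes) -> str:
        compressed = zlib.compress(media)            # data compression module
        selected = compressed[::4]                   # data acquisition module (assumed rule)
        return hashlib.sha256(selected).hexdigest()  # hashing module -> keyword

    print(keyword_from_media(b"example media signal" * 100))
    ```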

  9. “Robots in Space” Multiagent Problem: Complexity, Information and Cryptographic Aspects

    Directory of Open Access Journals (Sweden)

    A. Yu. Bernstein

    2013-01-01

    Full Text Available We study a multiagent algorithmic problem that we call Robots in Space (RinS): there are n ≥ 2 autonomous robots that need to agree, without outside interference, on a distribution of shelters, so that straight paths to the shelters will not intersect. The problem is closely related to the assignment problem in Graph Theory, to the convex hull problem in Combinatorial Geometry, and to the path-planning problem in Artificial Intelligence. Our algorithm grew out of a local search solution of the problem suggested by E.W. Dijkstra. We present a multiagent, anonymous and scalable algorithm (protocol) solving the problem, give an upper bound for the algorithm, prove (manually) its correctness, and examine two communication aspects of the RinS problem, the informational and the cryptographic. We prove that (1) there is no protocol solving RinS that transfers only a bounded number of bits, and (2) we suggest a protocol that allows robots to check whether their paths intersect without revealing additional information about their relative positions (with respect to shelters). The present paper continues the research presented in Mars Robot Puzzle (a Multiagent Approach to the Dijkstra Problem) by E.V. Bodin, N.O. Garanina, and N.V. Shilov, published in Modeling and Analysis of Information Systems, 18(2), 2011.

  10. Data Collision Prevention with Overflow Hashing Technique in Closed Hash Searching Process

    Science.gov (United States)

    Rahim, Robbi; Nurjamiyah; Rafika Dewi, Arie

    2017-12-01

    Hash search is a method that can be used for various search processes such as search engines, sorting, machine learning, neural networks and so on. During the search process data collisions may occur, and one of several ways to prevent them is to use the overflow technique. The technique was applied to data of varying lengths and can prevent the occurrence of data collisions.
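
    The idea can be illustrated with a toy closed hash table that diverts colliding keys to a separate overflow area instead of overwriting existing entries; the table size and hash rule below are assumptions for illustration only, not the scheme evaluated in the paper.

    ```python
    # Toy closed hash table with an overflow area: keys that collide in the
    # fixed-size primary table are appended to the overflow list rather than
    # clobbering the entry already stored there.
    class OverflowHashTable:
        def __init__(self, size=8):
            self.primary = [None] * size   # closed (fixed-size) primary area
            self.overflow = []             # overflow area for colliding keys

        def insert(self, key, value):
            slot = hash(key) % len(self.primary)
            if self.primary[slot] is None or self.primary[slot][0] == key:
                self.primary[slot] = (key, value)
            else:
                self.overflow.append((key, value))   # collision: divert, do not overwrite

        def lookup(self, key):
            slot = hash(key) % len(self.primary)
            if self.primary[slot] and self.primary[slot][0] == key:
                return self.primary[slot][1]
            for k, v in self.overflow:
                if k == key:
                    return v
            return None

    table = OverflowHashTable()
    for i, word in enumerate(["alpha", "beta", "gamma", "delta", "epsilon"]):
        table.insert(word, i)
    print(table.lookup("delta"))
    ```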

  11. Cryptanalysis of Lin et al.'s Efficient Block-Cipher-Based Hash Function

    NARCIS (Netherlands)

    Liu, Bozhong; Gong, Zheng; Chen, Xiaohong; Qiu, Weidong; Zheng, Dong

    2010-01-01

    Hash functions are widely used in authentication. In this paper, the security of Lin et al.'s efficient block-cipher-based hash function is reviewed. By using Joux's multicollisions and Kelsey et al.'s expandable message techniques, we find the scheme is vulnerable to collision, preimage and second

  12. Using Compilers to Enhance Cryptographic Product Development

    Science.gov (United States)

    Bangerter, E.; Barbosa, M.; Bernstein, D.; Damgård, I.; Page, D.; Pagter, J. I.; Sadeghi, A.-R.; Sovio, S.

    Developing high-quality software is hard in the general case, and it is significantly more challenging in the case of cryptographic software. A high degree of new skill and understanding must be learnt and applied without error to avoid vulnerability and inefficiency. This is often beyond the financial, manpower or intellectual resources available. In this paper we present the motivation for the European funded CACE (Computer Aided Cryptography Engineering) project. The main objective of CACE is to provide engineers (with limited or no expertise in cryptography) with a toolbox that allows them to generate robust and efficient implementations of cryptographic primitives. We also present some preliminary results already obtained in the early stages of this project, and discuss the relevance of the project as perceived by stakeholders in the mobile device arena.

  13. A novel, privacy-preserving cryptographic approach for sharing sequencing data

    Science.gov (United States)

    Cassa, Christopher A; Miller, Rachel A; Mandl, Kenneth D

    2013-01-01

    Objective DNA samples are often processed and sequenced in facilities external to the point of collection. These samples are routinely labeled with patient identifiers or pseudonyms, allowing for potential linkage to identity and private clinical information if intercepted during transmission. We present a cryptographic scheme to securely transmit externally generated sequence data which does not require any patient identifiers, public key infrastructure, or the transmission of passwords. Materials and methods This novel encryption scheme cryptographically protects participant sequence data using a shared secret key that is derived from a unique subset of an individual’s genetic sequence. This scheme requires access to a subset of an individual’s genetic sequence to acquire full access to the transmitted sequence data, which helps to prevent sample mismatch. Results We validate that the proposed encryption scheme is robust to sequencing errors, population uniqueness, and sibling disambiguation, and provides sufficient cryptographic key space. Discussion Access to a set of an individual’s genotypes and a mutually agreed cryptographic seed is needed to unlock the full sequence, which provides additional sample authentication and authorization security. We present modest fixed and marginal costs to implement this transmission architecture. Conclusions It is possible for genomics researchers who sequence participant samples externally to protect the transmission of sequence data using unique features of an individual’s genetic sequence. PMID:23125421

  14. Locality-sensitive Hashing without False Negatives

    DEFF Research Database (Denmark)

    Pagh, Rasmus

    2016-01-01

    We consider a new construction of locality-sensitive hash functions for Hamming space that is covering in the sense that it is guaranteed to produce a collision for every pair of vectors within a given radius r. The construction is efficient in the sense that the expected number of hash collisions between vectors at distance cr, for a given c > 1, comes close to that of the best possible data independent LSH without the covering guarantee, namely, the seminal LSH construction of Indyk and Motwani (FOCS '98). The efficiency of the new construction essentially matches their bound if cr = log(n)/k, where n is the number of points in the data set and k ∊ N, and differs from it by at most a factor ln(4) in the exponent for general values of cr. As a consequence, LSH-based similarity search in Hamming space can avoid the problem of false negatives at little or no cost in efficiency.
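
    For contrast, the sketch below shows the classic data-independent bit-sampling LSH family for Hamming space that the abstract compares against, not the covering construction itself; the parameters k and L and the random data are assumptions.

    ```python
    # Bit-sampling LSH for Hamming space: each hash function samples k bit
    # positions; similar vectors collide in at least one of the L tables with
    # high probability, dissimilar ones rarely do.
    import random

    def make_hash(dim, k, rng):
        positions = rng.sample(range(dim), k)
        return lambda v: tuple(v[p] for p in positions)

    def build_tables(vectors, dim, k=4, L=8, seed=1):
        rng = random.Random(seed)
        funcs = [make_hash(dim, k, rng) for _ in range(L)]
        tables = [{} for _ in range(L)]
        for idx, v in enumerate(vectors):
            for h, table in zip(funcs, tables):
                table.setdefault(h(v), []).append(idx)
        return funcs, tables

    def candidates(query, funcs, tables):
        out = set()
        for h, table in zip(funcs, tables):
            out.update(table.get(h(query), []))
        return out

    dim = 16
    data = [[random.randint(0, 1) for _ in range(dim)] for _ in range(100)]
    funcs, tables = build_tables(data, dim)
    print(candidates(data[0], funcs, tables))
    ```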

  15. Simultenious binary hash and features learning for image retrieval

    Science.gov (United States)

    Frantc, V. A.; Makov, S. V.; Voronin, V. V.; Marchuk, V. I.; Semenishchev, E. A.; Egiazarian, K. O.; Agaian, S.

    2016-05-01

    Content-based image retrieval systems have plenty of applications in the modern world. The most important one is image search by query image or by semantic description. Approaches to this problem are employed in personal photo-collection management systems, web-scale image search engines, medical systems, etc. Automatic analysis of large unlabeled image datasets is virtually impossible without a satisfactory image-retrieval technique. This is the main reason why this kind of automatic image processing has attracted so much attention during recent years. Despite rather substantial progress in the field, semantically meaningful image retrieval still remains a challenging task. The main issue here is the demand to provide reliable results in a short amount of time. This paper addresses the problem with a novel technique for simultaneous learning of global image features and binary hash codes. Our approach provides a mapping from a pixel-based image representation to the hash-value space while trying to preserve as much of the semantic image content as possible. We use deep learning methodology to generate image descriptions with properties of similarity preservation and statistical independence. The main advantage of our approach, in contrast to existing ones, is the ability to fine-tune the retrieval procedure for a very specific application, which allows us to provide better results in comparison to general techniques. The framework for data-dependent image hashing presented in the paper is based on two different kinds of neural networks: convolutional neural networks for image description and an autoencoder for feature-to-hash-space mapping. Experimental results confirm that our approach shows promising results in comparison to other state-of-the-art methods.

  16. Quantum-secured blockchain

    OpenAIRE

    Kiktenko, E. O.; Pozhar, N. O.; Anufriev, M. N.; Trushechkin, A. S.; Yunusov, R. R.; Kurochkin, Y. V.; Lvovsky, A. I.; Fedorov, A. K.

    2017-01-01

    Blockchain is a distributed database which is cryptographically protected against malicious modifications. While promising for a wide range of applications, current blockchain platforms rely on digital signatures, which are vulnerable to attacks by means of quantum computers. The same, albeit to a lesser extent, applies to cryptographic hash functions that are used in preparing new blocks, so parties with access to quantum computation would have unfair advantage in procuring mining rewards. H...

  17. Implementation of Rivest Shamir Adleman Algorithm (RSA) and Vigenere Cipher In Web Based Information System

    Science.gov (United States)

    Aryanti, Aryanti; Mekongga, Ikhthison

    2018-02-01

    Data security and confidentiality is one of the most important aspects of information systems at the moment. One way to secure data is by using cryptography. In this study, a data security system was developed by implementing the Rivest Shamir Adleman (RSA) and Vigenere Cipher cryptographic algorithms. The research was done by combining the RSA and Vigenere Cipher algorithms and applying them to document files, whether Word, Excel, or PDF. This application includes the process of encryption and decryption of data and was created using PHP and MySQL. Data encryption is done on the transmitting side through RSA cryptographic calculations using the public key, and then proceeds with the Vigenere Cipher algorithm, which also uses the public key. On the receiving side, decryption first uses the Vigenere Cipher algorithm, still with the public key, and then the RSA cryptographic algorithm with the private key. Test results show that the system can encrypt files, decrypt files and transmit files. Tests performed on the encryption and decryption of files with different file sizes show that file size affects the process: the larger the file, the longer the encryption and decryption take.
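
    A sketch of the Vigenere half of such a two-layer pipeline is given below; the RSA layer is omitted and the repeating key is an illustrative assumption, so this is only a toy view of the combination, not the authors' implementation.

    ```python
    # Repeating-key (Vigenere-style) substitution over bytes. In the described
    # system this layer would be applied after an RSA encryption step.
    def vigenere(data: bytes, key: bytes, decrypt: bool = False) -> bytes:
        sign = -1 if decrypt else 1
        return bytes((b + sign * key[i % len(key)]) % 256 for i, b in enumerate(data))

    key = b"PUBLICKEY"                        # illustrative key material
    ct = vigenere(b"confidential document contents", key)
    print(vigenere(ct, key, decrypt=True))
    ```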

  18. Cryptographic framework for analyzing the privacy of recommender algorithms

    NARCIS (Netherlands)

    Tang, Qiang

    2012-01-01

    Recommender algorithms are widely used, ranging from traditional Video on Demand to a wide variety of Web 2.0 services. Unfortunately, the related privacy concerns have not received much attention. In this paper, we study the privacy concerns associated with recommender algorithms and present a

  19. The suffix-free-prefix-free hash function construction and its indifferentiability security analysis

    DEFF Research Database (Denmark)

    Bagheri, Nasour; Gauravaram, Praveen; Knudsen, Lars R.

    2012-01-01

    In this paper, we observe that in the seminal work on indifferentiability analysis of iterated hash functions by Coron et al. and in subsequent works, the initial value (IV) of hash functions is fixed. In addition, these indifferentiability results do not depend on the Merkle–Damgård (MD) str...

  20. On Boolean functions with generalized cryptographic properties

    NARCIS (Netherlands)

    Braeken, A.; Nikov, V.S.; Nikova, S.I.; Preneel, B.; Canteaut, A.; Viswanathan, K.

    2004-01-01

    By considering a new metric, we generalize cryptographic properties of Boolean functions such as resiliency and propagation characteristics. These new definitions result in a better understanding of the properties of Boolean functions and provide a better insight in the space defined by this metric.

  1. Low-power cryptographic coprocessor for autonomous wireless sensor networks

    Science.gov (United States)

    Olszyna, Jakub; Winiecki, Wiesław

    2013-10-01

    The concept of autonomous wireless sensor networks involves energy harvesting, as well as effective management of system resources. Public-key cryptography (PKC) offers the advantage of elegant key agreement schemes with which a secret key can be securely established over insecure channels. In addition to solving the key management problem, the other major application of PKC is digital signatures, with which non-repudiation of message exchanges can be achieved. The motivation for studying low-power and area-efficient modular arithmetic algorithms comes from enabling public-key security for low-power devices that can perform under constrained environments like autonomous wireless sensor networks. This paper presents a cryptographic coprocessor tailored to the constraints of autonomous wireless sensor networks. Such a hardware circuit is aimed at supporting the implementation of different public-key cryptosystems based on modular arithmetic in GF(p) and GF(2^m). Key components of the coprocessor are described as GEZEL models and can be easily transformed to VHDL and implemented in hardware.

  2. Cryptographic Trust Management Requirements Specification: Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Edgar, Thomas W.

    2009-09-30

    The Cryptographic Trust Management (CTM) Project is being developed for Department of Energy, OE-10 by the Pacific Northwest National Laboratory (PNNL). It is a component project of the NSTB Control Systems Security R&D Program.

  3. Implementation of Rivest Shamir Adleman Algorithm (RSA) and Vigenere Cipher In Web Based Information System

    Directory of Open Access Journals (Sweden)

    Aryanti Aryanti

    2018-01-01

    Full Text Available Data security and confidentiality is one of the most important aspects of information systems at the moment. One way to secure data is by using cryptography. In this study, a data security system was developed by implementing the Rivest Shamir Adleman (RSA) and Vigenere Cipher cryptographic algorithms. The research was done by combining the RSA and Vigenere Cipher algorithms and applying them to document files, whether Word, Excel, or PDF. This application includes the process of encryption and decryption of data and was created using PHP and MySQL. Data encryption is done on the transmitting side through RSA cryptographic calculations using the public key, and then proceeds with the Vigenere Cipher algorithm, which also uses the public key. On the receiving side, decryption first uses the Vigenere Cipher algorithm, still with the public key, and then the RSA cryptographic algorithm with the private key. Test results show that the system can encrypt files, decrypt files and transmit files. Tests performed on the encryption and decryption of files with different file sizes show that file size affects the process: the larger the file, the longer the encryption and decryption take.

  4. SIMPL Systems, or: Can We Design Cryptographic Hardware without Secret Key Information?

    Science.gov (United States)

    Rührmair, Ulrich

    This paper discusses a new cryptographic primitive termed SIMPL system. Roughly speaking, a SIMPL system is a special type of Physical Unclonable Function (PUF) which possesses a binary description that allows its (slow) public simulation and prediction. Besides this public key like functionality, SIMPL systems have another advantage: No secret information is, or needs to be, contained in SIMPL systems in order to enable cryptographic protocols - neither in the form of a standard binary key, nor as secret information hidden in random, analog features, as it is the case for PUFs. The cryptographic security of SIMPLs instead rests on (i) a physical assumption on their unclonability, and (ii) a computational assumption regarding the complexity of simulating their output. This novel property makes SIMPL systems potentially immune against many known hardware and software attacks, including malware, side channel, invasive, or modeling attacks.

  5. Internal differential collision attacks on the reduced-round Grøstl-0 hash function

    DEFF Research Database (Denmark)

    Ideguchi, Kota; Tischhauser, Elmar Wolfgang; Preneel, Bart

    2014-01-01

    This results in collision attacks and semi-free-start collision attacks on the Grøstl-0 hash function and compression function with reduced rounds. Specifically, we show collision attacks on the Grøstl-0-256 hash function reduced to 5 and 6 out of 10 rounds with time complexities 2^48 and 2^112, and on the Grøstl-0-512 hash function reduced to 6 out of 14 rounds with time complexity 2^183. Furthermore, we demonstrate semi-free-start collision attacks on the Grøstl-0-256 compression function reduced to 8 rounds and the Grøstl-0-512 compression function reduced to 9 rounds. Finally, we show improved...

  6. Designing and implementing of improved cryptographic algorithm using modular arithmetic theory

    Directory of Open Access Journals (Sweden)

    Maryam Kamarzarrin

    2015-05-01

    Full Text Available Maintaining the privacy and security of people's information are two of the most important principles of an electronic health plan. One of the methods of creating privacy and security of information is using a public key cryptography system. In this paper, we compare two algorithms, the Common and the Fast Exponentiation algorithms, for enhancing the efficiency of public key cryptography. We show that a system designed with the Fast Exponentiation algorithm has higher speed and performance but lower power consumption and occupied space compared with the Common Exponentiation algorithm. Although systems designed with the Common Exponentiation algorithm have lower speed and performance, designing with this algorithm has less complexity and is easier than designing with the Fast Exponentiation algorithm. In this paper, we examine and compare these two methods of exponentiation and also observe the performance impact of the two approaches when implemented in hardware with the VHDL language on an FPGA.

  7. Cryptanalysis of the 10-Round Hash and Full Compression Function of SHAvite-3-512

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Leurent, Gaëtan; Mendel, Florian

    2010-01-01

    In this paper, we analyze SHAvite-3-512 hash function, as proposed for round 2 of the SHA-3 competition. We present cryptanalytic results on 10 out of 14 rounds of the hash function SHAvite-3-512, and on the full 14 round compression function of SHAvite-3-512. We show a second preimage attack on ...

  8. On another two cryptographic identities in universal Osborn loops

    Directory of Open Access Journals (Sweden)

    T. G. Jaiyéolá

    2010-03-01

    Full Text Available In this study, by establishing an identity for universal Osborn loops, two other identities (of degrees 4 and 6) are deduced from it, and they are recognized and recommended for cryptography in a similar spirit to that in which the cross inverse property (of degree 2) has been used by Keedwell, following the observation that universal Osborn loops that do not have the 3-power associative property, or weaker forms of the inverse property, power associativity and diassociativity, to mention a few, will have cycles (even long ones). These identities are found to be cryptographic in nature for universal Osborn loops and are thereby called cryptographic identities. They were also found applicable to security patterns, arrangements and networks to which the CIP may not be applicable.

  9. The first collision for full SHA-1

    NARCIS (Netherlands)

    M.M.J. Stevens (Marc); E. Bursztein (Elie); P. Karpman (Pierre); A. Albertini (Ange); Y. Markov (Yarik)

    2017-01-01

    SHA-1 is a widely used 1995 NIST cryptographic hash function standard that was officially deprecated by NIST in 2011 due to fundamental security weaknesses demonstrated in various analyses and theoretical attacks. Despite its deprecation, SHA-1 remains widely used in 2017 for document

  10. Second-Preimage Analysis of Reduced SHA-1

    DEFF Research Database (Denmark)

    Rechberger, Christian

    2010-01-01

    Many applications using cryptographic hash functions do not require collision resistance, but some kind of preimage resistance. That's also the reason why the widely used SHA-1 continues to be recommended in all applications except digital signatures after 2010. Recent work on preimage and second...

  11. Visual hashing of digital video : applications and techniques

    NARCIS (Netherlands)

    Oostveen, J.; Kalker, A.A.C.M.; Haitsma, J.A.; Tescher, A.G.

    2001-01-01

    This paper presents the concept of robust video hashing as a tool for video identification. We present considerations and a technique for (i) extracting essential perceptual features from a moving image sequence and (ii) identifying any sufficiently long unknown video segment by efficiently

  12. Efficient key management for cryptographically enforced access control

    NARCIS (Netherlands)

    Zych, Anna; Petkovic, Milan; Jonker, Willem

    Cryptographic enforcement of access control mechanisms relies on encrypting protected data with the keys stored by authorized users. This approach poses the problem of the distribution of secret keys. In this paper, a key management scheme is presented where each user stores a single key and is

  13. Cryptographic key generation using handwritten signature

    OpenAIRE

    Freire, Manuel R.; Fiérrez, Julián; Ortega-García, Javier

    2006-01-01

    M. Freire-Santos ; J. Fierrez-Aguilar ; J. Ortega-Garcia; "Cryptographic key generation using handwritten signature", Biometric Technology for Human Identification III, Proc. SPIE 6202 (April 17, 2006); doi:10.1117/12.665875. Copyright 2006 Society of Photo‑Optical Instrumentation Engineers. One print or electronic copy may be made for personal use only. Systematic reproduction and distribution, duplication of any material in this paper for a fee or for commercial purposes, or modification of...

  14. Novel Quantum Encryption Algorithm Based on Multiqubit Quantum Shift Register and Hill Cipher

    International Nuclear Information System (INIS)

    Khalaf, Rifaat Zaidan; Abdullah, Alharith Abdulkareem

    2014-01-01

    Based on a quantum shift register, a novel quantum block cryptographic algorithm that can be used to encrypt classical messages is proposed. The message is encoded and decoded by using a code generated by the quantum shift register. The security of this algorithm is analysed in detail. It is shown that, in the quantum block cryptographic algorithm, two keys can be used. One of them is the classical key that is used in the Hill cipher algorithm, where Alice and Bob use the authenticated Diffie-Hellman key exchange algorithm, using the concept of a digital signature for the authentication of the two communicating parties, and so eliminate the man-in-the-middle attack. The other key is generated by the quantum shift register and used for the coding of the encryption message, where Alice and Bob share the key by using the BB84 protocol. The novel algorithm can prevent a quantum attack strategy as well as a classical attack strategy. The problem of key management is discussed and circuits for the encryption and the decryption are suggested
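
    The classical Hill cipher step that such a hybrid scheme builds on can be sketched as follows; the quantum shift-register key stream and the BB84/Diffie-Hellman key exchange are not modelled, and the 2x2 key matrix is an assumption (it must be invertible mod 26).

    ```python
    # Classical Hill cipher encryption with a 2x2 key matrix over the alphabet
    # A..Z (mod 26). Decryption would use the matrix inverse mod 26 (not shown).
    def hill_encrypt(text: str, key=((3, 3), (2, 5))) -> str:
        nums = [ord(c) - ord('A') for c in text.upper() if c.isalpha()]
        if len(nums) % 2:
            nums.append(ord('X') - ord('A'))          # pad to an even length
        out = []
        for a, b in zip(nums[0::2], nums[1::2]):
            out.append((key[0][0] * a + key[0][1] * b) % 26)
            out.append((key[1][0] * a + key[1][1] * b) % 26)
        return ''.join(chr(n + ord('A')) for n in out)

    print(hill_encrypt("HELP"))   # -> "HIAT" with the assumed key matrix
    ```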

  15. 75 FR 52798 - State-07, Cryptographic Clearance Records

    Science.gov (United States)

    2010-08-27

    ... Information Programs and Services, A/GIS/ IPS, Department of State, SA-2, 515 22nd Street, NW., Washington, DC... Department of State and Agency for International Development who have applied for cryptographic clearances as... that apply to all of its Privacy Act systems of records. These notices appear in the form of a...

  16. Modelling Cryptographic Keys in Dynamic Epistemic Logic with DEMO

    NARCIS (Netherlands)

    H. van Ditmarsch (Hans); D.J.N. van Eijck (Jan); F.A.G. Sietsma (Floor); S.E. Simon (Sunil); not CWI et al; J.B. Perez; not CWI et al

    2012-01-01

    It is far from obvious to find logical counterparts to cryptographic protocol primitives. In logic, a common assumption is that agents are perfectly rational and have no computational limitations. This creates a dilemma. If one merely abstracts from computational aspects, protocols

  17. UQlust: combining profile hashing with linear-time ranking for efficient clustering and analysis of big macromolecular data.

    Science.gov (United States)

    Adamczak, Rafal; Meller, Jarek

    2016-12-28

    Advances in computing have enabled current protein and RNA structure prediction and molecular simulation methods to dramatically increase their sampling of conformational spaces. The quickly growing number of experimentally resolved structures, and databases such as the Protein Data Bank, also implies large scale structural similarity analyses to retrieve and classify macromolecular data. Consequently, the computational cost of structure comparison and clustering for large sets of macromolecular structures has become a bottleneck that necessitates further algorithmic improvements and development of efficient software solutions. uQlust is a versatile and easy-to-use tool for ultrafast ranking and clustering of macromolecular structures. uQlust makes use of structural profiles of proteins and nucleic acids, while combining a linear-time algorithm for implicit comparison of all pairs of models with profile hashing to enable efficient clustering of large data sets with a low memory footprint. In addition to ranking and clustering of large sets of models of the same protein or RNA molecule, uQlust can also be used in conjunction with fragment-based profiles in order to cluster structures of arbitrary length. For example, hierarchical clustering of the entire PDB using profile hashing can be performed on a typical laptop, thus opening an avenue for structural explorations previously limited to dedicated resources. The uQlust package is freely available under the GNU General Public License at https://github.com/uQlust . uQlust represents a drastic reduction in the computational complexity and memory requirements with respect to existing clustering and model quality assessment methods for macromolecular structure analysis, while yielding results on par with traditional approaches for both proteins and RNAs.

  18. EFFICIENCY ANALYSIS OF HASHING METHODS FOR FILE SYSTEMS IN USER MODE

    Directory of Open Access Journals (Sweden)

    E. Y. Ivanov

    2013-05-01

    Full Text Available The article deals with the characteristics and performance of interaction protocols between the virtual file system and the file system, and their influence on the processing power of microkernel operating systems. A user-mode implementation of the ext2 file system for MINIX 3 OS is used to show that in microkernel operating systems file object identification time might increase up to 26 times in comparison with monolithic systems. Therefore, we present an efficiency analysis of various hashing methods for file systems running in user mode. Studies have shown that using the hashing methods recommended in this paper it is possible to achieve competitive performance of the considered component of the I/O stacks in microkernel and monolithic operating systems.
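
    As an illustration of the kind of string hashing such a study might evaluate (not necessarily the methods recommended in the paper), the sketch below hashes path names with FNV-1a and reduces the result to a bucket index for a user-mode file-object table; the table size is an assumption.

    ```python
    # FNV-1a over path names, reduced to a bucket index for a file-object table.
    # The two 64-bit constants are the standard FNV offset basis and prime.
    def fnv1a_64(data: bytes) -> int:
        h = 0xcbf29ce484222325
        for byte in data:
            h ^= byte
            h = (h * 0x100000001b3) & 0xFFFFFFFFFFFFFFFF
        return h

    def bucket_for_path(path: str, buckets: int = 1024) -> int:
        return fnv1a_64(path.encode()) % buckets

    print(bucket_for_path("/usr/src/minix/fs/ext2/inode.c"))
    ```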

  19. HASH: the Hong Kong/AAO/Strasbourg Hα planetary nebula database

    International Nuclear Information System (INIS)

    Parker, Quentin A; Bojičić, Ivan S; Frew, David J

    2016-01-01

    By incorporating our major recent discoveries with re-measured and verified contents of existing catalogues we provide, for the first time, an accessible, reliable, on-line SQL database of essential, up-to-date information for all known Galactic planetary nebulae (PNe). We have attempted to: i) reliably remove PN mimics/false IDs that have biased previous studies and ii) provide accurate positions, sizes, morphologies, multi-wavelength imagery and spectroscopy. We also provide a link to CDS/Vizier for the archival history of each object and other valuable links to external data. With the HASH interface, users can sift, select, browse, collate, investigate, download and visualise the entire currently known Galactic PNe diversity. HASH provides the community with the most complete and reliable data with which to undertake new science. (paper)

  20. Investigation of the Practical Possibility of Solving Problems on Generalized Cellular Automata Associated with Cryptanalysis by Mean Algebraic Methods

    Directory of Open Access Journals (Sweden)

    P. G. Klyucharev

    2017-01-01

    Full Text Available A number of the author's previous papers proposed methods for constructing various cryptographic algorithms, including block ciphers and cryptographic hash functions, based on generalized cellular automata. This paper is aimed at studying the possibility of applying algebraic cryptanalysis methods related to the construction of Gröbner bases to the generalized cellular automata used in cryptography, i.e. it studies the possibility of using algebraic cryptanalysis methods to solve the problems of inverting a generalized cellular automaton and of recovering the key of such an automaton. If a cryptographic algorithm is represented as a system of polynomial equations over a certain finite field, then its breach is reduced to solving this system with respect to the key. Although the problem of solving a system of polynomial equations over a finite field is NP-hard in the general case, the solution of a particular system can have low computational cost. Cryptanalysis based on constructing a system of polynomial equations that links plaintext, ciphertext and key, and solving it by algebraic methods, is usually called algebraic cryptanalysis. Among the main methods for solving systems of polynomial equations are those that construct Gröbner bases. Cryptanalysis of ciphers and hash functions based on generalized cellular automata can be reduced to various problems. We consider two such problems: the problem of inverting a generalized cellular automaton, which, given the values of the cells after k iterations, enables us to find the initial values; and the task of recovering the key, which is to find the initial values of the remaining cells, using the cell values after k steps and the initial values of a part of the cells. A computational experiment was carried out to solve the two problems stated above in order to determine the maximum size of a generalized cellular automaton for which the solution of these

  1. Simulation-Based Performance Evaluation of Predictive-Hashing Based Multicast Authentication Protocol

    Directory of Open Access Journals (Sweden)

    Seonho Choi

    2012-12-01

    Full Text Available A predictive-hashing based Denial-of-Service (DoS) resistant multicast authentication protocol was proposed based upon predictive-hashing, one-way key chain, erasure codes, and distillation codes techniques [4, 5]. It was claimed that this new scheme should be more resistant to various types of DoS attacks, and its worst-case resource requirements were derived in terms of coarse-level system parameters including CPU times for signature verification and erasure/distillation decoding operations, attack levels, etc. To show the effectiveness of our approach and to analyze exact resource requirements in various attack scenarios with different parameter settings, we designed and implemented an attack simulator which is platform-independent. Various attack scenarios may be created with different attack types and parameters against a receiver equipped with the predictive-hashing based protocol. The design of the simulator is explained, and the simulation results are presented with detailed resource usage statistics. In addition, the resistance level to various types of DoS attacks is formulated with a newly defined resistance metric. By comparing these results to those from another approach, PRABS [8], we show that the resistance level of our protocol is greatly enhanced even in the presence of many attack streams.
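
    The one-way key chain ingredient mentioned above can be sketched as follows: keys are derived by repeated hashing and disclosed in reverse order, so each newly disclosed key can be verified against the previously authenticated one. The seed and chain length are assumptions.

    ```python
    # One-way hash chain: the anchor (last element) is distributed first; later
    # disclosures are authenticated by hashing them and comparing to the anchor
    # (or to the most recently verified key).
    import hashlib

    def build_chain(seed: bytes, length: int):
        chain = [seed]
        for _ in range(length):
            chain.append(hashlib.sha256(chain[-1]).digest())
        return chain                         # chain[-1] is the published anchor

    def verify(disclosed: bytes, last_authenticated: bytes) -> bool:
        return hashlib.sha256(disclosed).digest() == last_authenticated

    chain = build_chain(b"secret seed", 5)
    anchor = chain[-1]                       # given to receivers in advance
    print(verify(chain[-2], anchor))         # sender later discloses chain[-2]
    ```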

  2. A Key Management Method for Cryptographically Enforced Access Control

    NARCIS (Netherlands)

    Zych, Anna; Petkovic, Milan; Jonker, Willem; Fernández-Medina, Eduardo; Yagüe, Mariemma I.

    Cryptographic enforcement of access control mechanisms relies on encrypting protected data with the keys stored by authorized users. This approach poses the problem of the distribution of secret keys. In this paper, a key management scheme is presented where each user stores a single key and is

  3. Predicting the winner of the 2008 US Presidential Elections using a Sony PlayStation 3

    NARCIS (Netherlands)

    Stevens, M.M.J.; Lenstra, A.K.; Weger, de B.M.M.

    2007-01-01

    We have used a Sony Playstation 3 to correctly predict the outcome of the 2008 US presidential elections. In order not to influence the voters we keep our prediction secret, but commit to it by publishing its cryptographic hash on this website. The document with the correct prediction and matching
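
    The commit-then-reveal pattern relied on here is easy to illustrate: publish the hash of a salted prediction now, and reveal the salt and prediction after the event. The sketch below uses SHA-256 and a random salt as assumptions, not the exact hash and encoding the authors published.

    ```python
    # Hash commitment: publishing the digest commits to the prediction without
    # revealing it; disclosing (salt, prediction) later lets anyone verify it.
    import os
    import hashlib

    prediction = b"Candidate X wins"         # kept secret until after the event
    salt = os.urandom(16)                    # prevents guessing a short message
    commitment = hashlib.sha256(salt + prediction).hexdigest()
    print("publish now:", commitment)

    # after the event, reveal (salt, prediction) so anyone can check:
    assert hashlib.sha256(salt + prediction).hexdigest() == commitment
    ```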

  4. A pipelined FPGA implementation of an encryption algorithm based on genetic algorithm

    Science.gov (United States)

    Thirer, Nonel

    2013-05-01

    With the evolution of digital data storage and exchange, it is essential to protect the confidential information from every unauthorized access. High performance encryption algorithms were developed and implemented by software and hardware. Also many methods to attack the cipher text were developed. In the last years, the genetic algorithm has gained much interest in cryptanalysis of cipher texts and also in encryption ciphers. This paper analyses the possibility to use the genetic algorithm as a multiple key sequence generator for an AES (Advanced Encryption Standard) cryptographic system, and also to use a three stages pipeline (with four main blocks: Input data, AES Core, Key generator, Output data) to provide a fast encryption and storage/transmission of a large amount of data.

  5. A comparative study of Message Digest 5(MD5) and SHA256 algorithm

    Science.gov (United States)

    Rachmawati, D.; Tarigan, J. T.; Ginting, A. B. C.

    2018-03-01

    A document is a collection of written or printed data containing information. The more rapidly technology advances, the more important it becomes to preserve the integrity of documents. Because a document is open by nature, its contents can be read and modified by many parties, so the integrity of the information it contains is not automatically preserved. To maintain the integrity of the data, a mechanism called a digital signature is needed. A digital signature is a specific code generated by a digital-signature function. One class of algorithms used to create digital signatures is hash functions. There are many hash functions; two of them are Message Digest 5 (MD5) and SHA256. Both algorithms have their own advantages and disadvantages. The purpose of this research is to determine which algorithm is better. The parameters used to compare the two algorithms are running time and complexity. The research results show that the complexity of the MD5 and SHA256 algorithms is the same, i.e., Θ(N), but regarding speed, MD5 performs better than SHA256.
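
    The running-time comparison can be reproduced on any machine with the standard library alone; absolute numbers depend on hardware, so only the MD5/SHA-256 ratio is meaningful. The payload size and repetition count below are assumptions.

    ```python
    # Time MD5 and SHA-256 over the same payload to compare their throughput.
    import hashlib
    import timeit

    payload = b"x" * (1 << 20)   # 1 MiB test document

    for name in ("md5", "sha256"):
        seconds = timeit.timeit(lambda n=name: hashlib.new(n, payload).hexdigest(),
                                number=200)
        print(f"{name}: {seconds:.3f} s for 200 hashes")
    ```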

  6. Accelerating SPARQL queries by exploiting hash-based locality and adaptive partitioning

    KAUST Repository

    Al-Harbi, Razen; Abdelaziz, Ibrahim; Kalnis, Panos; Mamoulis, Nikos; Ebrahim, Yasser; Sahli, Majed

    2016-01-01

    State-of-the-art distributed RDF systems partition data across multiple computer nodes (workers). Some systems perform cheap hash partitioning, which may result in expensive query evaluation. Others try to minimize inter-node communication, which

  7. Generating cryptographic keys by radioactive decays

    International Nuclear Information System (INIS)

    Grupen, Claus; Maurer, Ingo; Schmidt, Dieter; Smolik, Ludek

    2001-01-01

    We present a new method for the generation of a statistically genuine random bitstream at very high frequency which can be employed for cryptographic purposes. The method uses statistically unpredictable radioactive decays as the source of randomness. The measured quantity is the time distance between the responses of a small ionisation chamber due to the recording of ionising decay products. This time measurement is converted into states representing 0 or 1. The data generated in our experiment successfully passed the FIPS PUB 140-1 and diehard statistical tests. For the simulation of systematic effects, Monte Carlo techniques were used
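
    The bit-extraction idea can be illustrated with a simulation: since decays form a Poisson process, successive inter-arrival times are independent, and comparing pairs of intervals yields unbiased bits (a von Neumann style extractor). The exponential sampling below merely stands in for the ionisation-chamber timing and is not the authors' hardware procedure.

    ```python
    # Simulated decay timing: draw pairs of exponential inter-arrival times and
    # emit one bit per pair based on which interval was longer.
    import random

    def decay_bits(n_bits: int, rate: float = 1.0, seed: int = 0):
        rng = random.Random(seed)
        bits = []
        while len(bits) < n_bits:
            t1 = rng.expovariate(rate)     # simulated time between decay events
            t2 = rng.expovariate(rate)
            if t1 != t2:                   # discard ties, keep the comparison bit
                bits.append(1 if t1 > t2 else 0)
        return bits

    print(decay_bits(32))
    ```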

  8. Evaluation of Information Leakage from Cryptographic Hardware via Common-Mode Current

    Science.gov (United States)

    Hayashi, Yu-Ichi; Homma, Naofumi; Mizuki, Takaaki; Sugawara, Takeshi; Kayano, Yoshiki; Aoki, Takafumi; Minegishi, Shigeki; Satoh, Akashi; Sone, Hideaki; Inoue, Hiroshi

    This paper presents a possibility of Electromagnetic (EM) analysis against cryptographic modules outside their security boundaries. The mechanism behind the information leakage is explained from the view point of Electromagnetic Compatibility: electric fluctuation released from cryptographic modules can conduct to peripheral circuits based on ground bounce, resulting in radiation. We demonstrate the consequence of the mechanism through experiments where the ISO/IEC standard block cipher AES (Advanced Encryption Standard) is implemented on an FPGA board and EM radiations from power and communication cables are measured. Correlation Electromagnetic Analysis (CEMA) is conducted in order to evaluate the information leakage. The experimental results show that secret keys are revealed even though there are various disturbing factors such as voltage regulators and AC/DC converters between the target module and the measurement points. We also discuss information-suppression techniques as electrical-level countermeasures against such CEMAs.

  9. HashLearn Now: Mobile Tutoring in India

    OpenAIRE

    Arun Kumar Agariya; Binay Krishna Shivam; Shashank Murali; Jyoti Tikoria

    2016-01-01

    Looking at today's competitive exam scenario, a single mark may lead to a rank difference of hundreds or even thousands of places. Looking at this problem from the student's perspective, this article discusses the role of anywhere, anytime help for students in getting answers to their problems on a real-time basis from the application known as HashLearn Now. The smartphone usage by students clearly signifies the importance of this application for getting their queries answered b...

  10. AUTHENTICATION ALGORITHM FOR PARTICIPANTS OF INFORMATION INTEROPERABILITY IN PROCESS OF OPERATING SYSTEM REMOTE LOADING ON THIN CLIENT

    Directory of Open Access Journals (Sweden)

    Y. A. Gatchin

    2016-05-01

    Full Text Available Subject of Research. This paper presents a solution to the authentication problem for all components of information interoperability in the process of operating system network loading on a thin client from a terminal server. System Definition. In the proposed solution, the operating system integrity check is performed by a hardware-software module that includes a USB token with protected memory for secure storage of cryptographic keys and a loader. The key requirement for the solution is mutual authentication of four participants: the terminal server, the thin client, the token and the user. We have created two algorithms for solving the problem. The first of the designed algorithms compares the encrypted one-time password (a random number) with the reference value stored in the memory of the token and updates this number in case of successful authentication. The second algorithm uses the public and private keys of the token and the server. As a result of the cryptographic transformation, the participants are authenticated and a secure channel is formed between the token, the thin client and the terminal server. Main Results. Additional research was carried out to find out whether the designed algorithms meet the necessary requirements. Criteria used included applicability in a multi-access terminal system architecture, evaluation of potential threats and overall system security. According to the analysis results, it is recommended to use the algorithm based on PKI due to its high scalability and usability. A high level of data security is ensured by the application of asymmetric cryptography, with the guarantee that participants' private keys are never sent during the authentication process. Practical Relevance. The designed PKI-based algorithm allows solving the problem with the use of cryptographic algorithms according to the state standard, even in the absence of such a standard for asymmetric cryptography. Thus, it can be applied in State Information Systems with increased information security requirements.

  11. The Cryptographic Implications of the LinkedIn Data Breach

    OpenAIRE

    Gune, Aditya

    2017-01-01

    Data security and personal privacy are difficult to maintain in the Internet age. In 2012, the professional networking site LinkedIn suffered a breach, compromising the logins of over 100 million accounts. The passwords were cracked and sold online, exposing the authentication credentials of millions of users. This manuscript dissects the cryptographic failures implicated in the breach, and explores more secure methods of storing passwords.
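
    One example of the more secure storage the abstract alludes to is a salted, iterated password hash; the sketch below uses PBKDF2 from the Python standard library with an assumed iteration count, rather than whichever scheme the manuscript itself recommends.

    ```python
    # Salted, iterated password hashing with PBKDF2-HMAC-SHA256, plus a
    # constant-time comparison for verification.
    import os
    import hashlib
    import hmac

    def hash_password(password: str, iterations: int = 200_000):
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
        return salt, iterations, digest

    def check_password(password: str, salt: bytes, iterations: int, digest: bytes) -> bool:
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
        return hmac.compare_digest(candidate, digest)

    record = hash_password("correct horse battery staple")
    print(check_password("correct horse battery staple", *record))
    ```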

  12. k-Nearest Neighbors Algorithm in Profiling Power Analysis Attacks

    Directory of Open Access Journals (Sweden)

    Z. Martinasek

    2016-06-01

    Full Text Available Power analysis presents the typical example of successful attacks against trusted cryptographic devices such as RFID (Radio-Frequency IDentification) tags and contact smart cards. In recent years, the cryptographic community has explored new approaches in power analysis based on machine learning models such as Support Vector Machine (SVM), RF (Random Forest) and Multi-Layer Perceptron (MLP). In this paper, we made an extensive comparison of machine learning algorithms in the power analysis. For this purpose, we implemented a verification program that always chooses the optimal settings of individual machine learning models in order to obtain the best classification accuracy. In our research, we used three datasets, the first containing the power traces of an unprotected AES (Advanced Encryption Standard) implementation. The second and third datasets are created independently from publicly available power traces corresponding to a masked AES implementation (DPA Contest v4). The obtained results revealed some interesting facts, namely, an elementary k-NN (k-Nearest Neighbors) algorithm, which has not been commonly used in power analysis yet, shows great application potential in practice.
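
    A bare-bones k-NN classifier of the kind used in profiled power analysis can be sketched as follows; the traces and labels are synthetic stand-ins, and a real attack would operate on thousands of measured power samples per trace.

    ```python
    # Each "trace" is a vector of power samples labelled with a key-dependent
    # intermediate value; a new trace is classified by majority vote among its
    # k nearest profiling traces (squared Euclidean distance).
    from collections import Counter

    def knn_predict(train, labels, query, k=3):
        order = sorted(range(len(train)),
                       key=lambda i: sum((a - b) ** 2 for a, b in zip(train[i], query)))
        return Counter(labels[i] for i in order[:k]).most_common(1)[0][0]

    train = [[0.1, 0.2, 0.1], [0.1, 0.3, 0.2], [0.9, 0.8, 0.7], [0.8, 0.9, 0.9]]
    labels = ["hw_low", "hw_low", "hw_high", "hw_high"]
    print(knn_predict(train, labels, [0.85, 0.85, 0.8]))
    ```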

  13. Distributed hash table theory, platforms and applications

    CERN Document Server

    Zhang, Hao; Xie, Haiyong; Yu, Nenghai

    2013-01-01

    This SpringerBrief summarizes the development of Distributed Hash Table in both academic and industrial fields. It covers the main theory, platforms and applications of this key part in distributed systems and applications, especially in large-scale distributed environments. The authors teach the principles of several popular DHT platforms that can solve practical problems such as load balance, multiple replicas, consistency and latency. They also propose DHT-based applications including multicast, anycast, distributed file systems, search, storage, content delivery network, file sharing and c

  14. Cryptographic applications of analytic number theory complexity lower bounds and pseudorandomness

    CERN Document Server

    2003-01-01

    The book introduces new ways of using analytic number theory in cryptography and related areas, such as complexity theory and pseudorandom number generation. Key topics and features: - various lower bounds on the complexity of some number theoretic and cryptographic problems, associated with classical schemes such as RSA, Diffie-Hellman, DSA as well as with relatively new schemes like XTR and NTRU - a series of very recent results about certain important characteristics (period, distribution, linear complexity) of several commonly used pseudorandom number generators, such as the RSA generator, Blum-Blum-Shub generator, Naor-Reingold generator, inversive generator, and others - one of the principal tools is bounds of exponential sums, which are combined with other number theoretic methods such as lattice reduction and sieving - a number of open problems of different level of difficulty and proposals for further research - an extensive and up-to-date bibliography Cryptographers and number theorists will find th...

  15. Rebound Attacks on the Reduced Grøstl Hash Function

    DEFF Research Database (Denmark)

    Mendel, Florian; Rechberger, C.; Schlaffer, Martin

    2010-01-01

    Grøstl is one of 14 second round candidates of the NIST SHA-3 competition. Cryptanalytic results on the wide-pipe compression function of Grøstl-256 have already been published. However, little is known about the hash function, arguably a much more interesting cryptanalytic setting. Also, Grøstl...

  16. Accelerating SPARQL queries by exploiting hash-based locality and adaptive partitioning

    KAUST Repository

    Al-Harbi, Razen

    2016-02-08

    State-of-the-art distributed RDF systems partition data across multiple computer nodes (workers). Some systems perform cheap hash partitioning, which may result in expensive query evaluation. Others try to minimize inter-node communication, which requires an expensive data preprocessing phase, leading to a high startup cost. Apriori knowledge of the query workload has also been used to create partitions, which, however, are static and do not adapt to workload changes. In this paper, we propose AdPart, a distributed RDF system, which addresses the shortcomings of previous work. First, AdPart applies lightweight partitioning on the initial data, which distributes triples by hashing on their subjects; this renders its startup overhead low. At the same time, the locality-aware query optimizer of AdPart takes full advantage of the partitioning to (1) support the fully parallel processing of join patterns on subjects and (2) minimize data communication for general queries by applying hash distribution of intermediate results instead of broadcasting, wherever possible. Second, AdPart monitors the data access patterns and dynamically redistributes and replicates the instances of the most frequent ones among workers. As a result, the communication cost for future queries is drastically reduced or even eliminated. To control replication, AdPart implements an eviction policy for the redistributed patterns. Our experiments with synthetic and real data verify that AdPart: (1) starts faster than all existing systems; (2) processes thousands of queries before other systems become online; and (3) gracefully adapts to the query load, being able to evaluate queries on billion-scale RDF data in subseconds.
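
    The lightweight initial partitioning step can be sketched as hashing each triple's subject to a worker, so that all triples sharing a subject are co-located and subject-subject joins need no communication; the worker count and example triples below are assumptions.

    ```python
    # Hash-partition (subject, predicate, object) triples by subject so that all
    # triples of a given subject land on the same worker.
    import hashlib

    def worker_for(subject: str, n_workers: int = 4) -> int:
        return int.from_bytes(hashlib.md5(subject.encode()).digest()[:4], "big") % n_workers

    triples = [("alice", "knows", "bob"),
               ("alice", "livesIn", "paris"),
               ("bob", "knows", "carol")]
    partitions = {}
    for s, p, o in triples:
        partitions.setdefault(worker_for(s), []).append((s, p, o))
    print(partitions)
    ```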

  17. The legal response to illegal "hash clubs" in Denmark

    DEFF Research Database (Denmark)

    Asmussen, V.; Moesby-Johansen, C.

    2004-01-01

    From the mid-1990s, a number of hash clubs have sprung up in Denmark. Broadly, there are two kinds of clubs: points of sale and drop-in venues. The first kind of club is organized exclusively around the sale of hash, while the second kind are clubs where one can both buy hash and stay on the premises to take part in ...

  18. CWI cryptanalyst discovers new cryptographic attack variant in Flame spy malware

    NARCIS (Netherlands)

    M.M.J. Stevens (Marc); R.J.F. Cramer (Ronald)

    2012-01-01

    Cryptanalyst Marc Stevens from the Centrum Wiskunde & Informatica (CWI) in Amsterdam, known for breaking the https security in 2008 using a cryptanalytic attack on MD5, analyzed the recent Flame virus this week. He discovered that for this spy malware an as yet unknown cryptographic

  19. Cryptographic Key Management and Critical Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Abercrombie, Robert K [ORNL

    2014-05-01

    The Department of Energy Office of Electricity Delivery and Energy Reliability (DOE-OE) CyberSecurity for Energy Delivery Systems (CSEDS) industry led program (DE-FOA-0000359) entitled "Innovation for Increasing CyberSecurity for Energy Delivery Systems (12CSEDS)," awarded a contract to Sypris Electronics LLC to develop a Cryptographic Key Management System for the smart grid (Scalable Key Management Solutions for Critical Infrastructure Protection). As a result of that award, Oak Ridge National Laboratory (ORNL) and Sypris Electronics, LLC entered into a CRADA (NFE-11-03562). ORNL provided its Cyber Security Econometrics System (CSES) as a tool to be modified and used as a metric to address risks and vulnerabilities in the management of cryptographic keys within the Advanced Metering Infrastructure (AMI) domain of the electric sector. ORNL concentrated the analysis on the AMI domain, for which the National Electric Sector Cybersecurity Organization Resource (NESCOR) Working Group 1 (WG1) has documented 29 failure scenarios. The computational infrastructure of this metric involves system stakeholders, security requirements, system components and security threats. To compute this metric, we estimated the stakes that each stakeholder associates with each security requirement, as well as stochastic matrices that represent the probability of a threat causing a component failure and the probability of a component failure causing a security requirement violation. We applied this model to estimate the security of the AMI by leveraging the recently established National Institute of Standards and Technology Interagency Report (NISTIR) 7628 guidelines for smart grid security and the International Electrotechnical Commission (IEC) 63351, Part 9 to identify the life cycle for cryptographic key management, resulting in a vector that assigned to each stakeholder an estimate of their average loss in terms of dollars per day of system

  20. Parallel Algorithms for the Exascale Era

    Energy Technology Data Exchange (ETDEWEB)

    Robey, Robert W. [Los Alamos National Laboratory

    2016-10-19

    New parallel algorithms are needed to reach the Exascale level of parallelism with millions of cores. We look at some of the research developed by students in projects at LANL. The research blends ideas from the early days of computing while weaving in the fresh approach brought by students new to the field of high performance computing. We look at reproducibility of global sums and why it is important to parallel computing. Next we look at how the concept of hashing has led to the development of more scalable algorithms suitable for next-generation parallel computers. Nearly all of this work has been done by undergraduates and published in leading scientific journals.

  1. Secure Programming Cookbook for C and C++ Recipes for Cryptography, Authentication, Input Validation & More

    CERN Document Server

    Viega, John

    2009-01-01

    Secure Programming Cookbook for C and C++ is an important new resource for developers serious about writing secure code for Unix® (including Linux®) and Windows® environments. This essential code companion covers a wide range of topics, including safe initialization, access control, input validation, symmetric and public key cryptography, cryptographic hashes and MACs, authentication and key exchange, PKI, random numbers, and anti-tampering.

  2. Optimized Data Indexing Algorithms for OLAP Systems

    Directory of Open Access Journals (Sweden)

    Lucian BORNAZ

    2010-12-01

Full Text Available The need to process and analyze large data volumes, as well as to convey the information contained therein to decision makers, naturally led to the development of OLAP systems. Similarly to DBMSs, OLAP systems must ensure optimum access to the storage environment. Although there are several ways to optimize database systems, implementing a correct data indexing solution is the most effective and least costly. Thus, OLAP uses indexing algorithms for relational data and n-dimensional summarized data stored in cubes. Today database systems implement derived indexing algorithms based on the well-known Tree, Bitmap and Hash indexing algorithms. This is because no indexing algorithm provides the best performance for every situation (type, structure, data volume, application). This paper presents a new n-dimensional cube indexing algorithm, derived from the well-known B-Tree index, which indexes data stored in data warehouses taking into consideration their multi-dimensional nature and provides better performance in comparison to the already implemented Tree-like index types.

  3. SECOM: A novel hash seed and community detection based-approach for genome-scale protein domain identification

    KAUST Repository

    Fan, Ming

    2012-06-28

    With rapid advances in the development of DNA sequencing technologies, a plethora of high-throughput genome and proteome data from a diverse spectrum of organisms have been generated. The functional annotation and evolutionary history of proteins are usually inferred from domains predicted from the genome sequences. Traditional database-based domain prediction methods cannot identify novel domains, however, and alignment-based methods, which look for recurring segments in the proteome, are computationally demanding. Here, we propose a novel genome-wide domain prediction method, SECOM. Instead of conducting all-against-all sequence alignment, SECOM first indexes all the proteins in the genome by using a hash seed function. Local similarity can thus be detected and encoded into a graph structure, in which each node represents a protein sequence and each edge weight represents the shared hash seeds between the two nodes. SECOM then formulates the domain prediction problem as an overlapping community-finding problem in this graph. A backward graph percolation algorithm that efficiently identifies the domains is proposed. We tested SECOM on five recently sequenced genomes of aquatic animals. Our tests demonstrated that SECOM was able to identify most of the known domains identified by InterProScan. When compared with the alignment-based method, SECOM showed higher sensitivity in detecting putative novel domains, while it was also three orders of magnitude faster. For example, SECOM was able to predict a novel sponge-specific domain in nucleoside-triphosphatase (NTPases). Furthermore, SECOM discovered two novel domains, likely of bacterial origin, that are taxonomically restricted to sea anemone and hydra. SECOM is an open-source program and available at http://sfb.kaust.edu.sa/Pages/Software.aspx. © 2012 Fan et al.
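
    For illustration only (not the SECOM code, and with hypothetical sequence data), the indexing step described above can be sketched in Python: each sequence is decomposed into fixed-length seeds, seeds are hashed into a table, and the edge weight between two sequences is the number of seeds they share.

        # Sketch: hash-seed indexing and a shared-seed graph, in the spirit of the
        # abstract above; data and parameters are illustrative.
        from collections import defaultdict
        from itertools import combinations

        def seed_index(sequences, k=5):
            """Map each hashed k-mer seed to the set of sequence ids containing it."""
            index = defaultdict(set)
            for sid, seq in sequences.items():
                for i in range(len(seq) - k + 1):
                    index[hash(seq[i:i + k])].add(sid)
            return index

        def shared_seed_graph(index):
            """Edge weight = number of seeds shared by a pair of sequences."""
            weights = defaultdict(int)
            for ids in index.values():
                for a, b in combinations(sorted(ids), 2):
                    weights[(a, b)] += 1
            return weights

        proteins = {"p1": "MKTAYIAKQR", "p2": "GKTAYIAKQL", "p3": "MSSHHHHHHG"}
        print(shared_seed_graph(seed_index(proteins)))  # p1 and p2 share seeds; p3 does not

    In the abstract's terms, the resulting weighted graph is what an overlapping community-finding step would then partition into putative domains.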

  4. SECOM: A novel hash seed and community detection based-approach for genome-scale protein domain identification

    KAUST Repository

    Fan, Ming; Wong, Ka-Chun; Ryu, Tae Woo; Ravasi, Timothy; Gao, Xin

    2012-01-01

    With rapid advances in the development of DNA sequencing technologies, a plethora of high-throughput genome and proteome data from a diverse spectrum of organisms have been generated. The functional annotation and evolutionary history of proteins are usually inferred from domains predicted from the genome sequences. Traditional database-based domain prediction methods cannot identify novel domains, however, and alignment-based methods, which look for recurring segments in the proteome, are computationally demanding. Here, we propose a novel genome-wide domain prediction method, SECOM. Instead of conducting all-against-all sequence alignment, SECOM first indexes all the proteins in the genome by using a hash seed function. Local similarity can thus be detected and encoded into a graph structure, in which each node represents a protein sequence and each edge weight represents the shared hash seeds between the two nodes. SECOM then formulates the domain prediction problem as an overlapping community-finding problem in this graph. A backward graph percolation algorithm that efficiently identifies the domains is proposed. We tested SECOM on five recently sequenced genomes of aquatic animals. Our tests demonstrated that SECOM was able to identify most of the known domains identified by InterProScan. When compared with the alignment-based method, SECOM showed higher sensitivity in detecting putative novel domains, while it was also three orders of magnitude faster. For example, SECOM was able to predict a novel sponge-specific domain in nucleoside-triphosphatase (NTPases). Furthermore, SECOM discovered two novel domains, likely of bacterial origin, that are taxonomically restricted to sea anemone and hydra. SECOM is an open-source program and available at http://sfb.kaust.edu.sa/Pages/Software.aspx. © 2012 Fan et al.

  5. Physically Unclonable Cryptographic Primitives by Chemical Vapor Deposition of Layered MoS2.

    Science.gov (United States)

    Alharbi, Abdullah; Armstrong, Darren; Alharbi, Somayah; Shahrjerdi, Davood

    2017-12-26

Physically unclonable cryptographic primitives are promising for securing the rapidly growing number of electronic devices. Here, we introduce physically unclonable primitives from layered molybdenum disulfide (MoS2) by leveraging the natural randomness of their island growth during chemical vapor deposition (CVD). We synthesize a MoS2 monolayer film covered with speckles of multilayer islands, where the growth process is engineered for an optimal speckle density. Using the Clark-Evans test, we confirm that the distribution of islands on the film exhibits complete spatial randomness, hence indicating the growth of multilayer speckles is a spatial Poisson process. Such a property is highly desirable for constructing unpredictable cryptographic primitives. The security primitive is an array of 2048 pixels fabricated from this film. The complex structure of the pixels makes the physical duplication of the array impossible (i.e., physically unclonable). A unique optical response is generated by applying an optical stimulus to the structure. The basis for this unique response is the dependence of the photoemission on the number of MoS2 layers, which by design is random throughout the film. Using a threshold value for the photoemission, we convert the optical response into binary cryptographic keys. We show that the proper selection of this threshold is crucial for maximizing combination randomness and that the optimal value of the threshold is linked directly to the growth process. This study reveals an opportunity for generating robust and versatile security primitives from layered transition metal dichalcogenides.

  6. Speaker Linking and Applications using Non-Parametric Hashing Methods

    Science.gov (United States)

    2016-09-08

    nonparametric estimate of a multivariate density function,” The Annals of Math- ematical Statistics , vol. 36, no. 3, pp. 1049–1051, 1965. [9] E. A. Patrick...Speaker Linking and Applications using Non-Parametric Hashing Methods† Douglas Sturim and William M. Campbell MIT Lincoln Laboratory, Lexington, MA...with many approaches [1, 2]. For this paper, we focus on using i-vectors [2], but the methods apply to any embedding. For the task of speaker QBE and

  7. Secure Method for Biometric-Based Recognition with Integrated Cryptographic Functions

    OpenAIRE

    Chiou, Shin-Yan

    2013-01-01

Biometric systems refer to biometric technologies which can be used to achieve authentication. Unlike cryptography-based technologies, certification in biometric systems need not achieve 100% accuracy. However, biometric data can only be directly compared through proximal access to the scanning device and cannot be combined with cryptographic techniques. Moreover, repeated use, improper storage, or transmission leaks may compromise security. Prior studies have attempted to c...

  8. GSHR-Tree: a spatial index tree based on dynamic spatial slot and hash table in grid environments

    Science.gov (United States)

    Chen, Zhanlong; Wu, Xin-cai; Wu, Liang

    2008-12-01

Computation Grids enable the coordinated sharing of large-scale distributed heterogeneous computing resources that can be used to solve computationally intensive problems in science, engineering, and commerce. Grid spatial applications are made possible by high-speed networks and a new generation of Grid middleware that resides between networks and traditional GIS applications. The integration of multi-source, heterogeneous spatial information, the management of distributed spatial resources, and the sharing of and cooperation on spatial data and Grid services are the key problems to resolve in the development of Grid GIS. The spatial index mechanism is a key technology of Grid GIS, and its performance affects the overall performance of a GIS in Grid environments. In order to improve the efficiency of parallel processing of massive spatial data in a distributed parallel computing Grid environment, this paper presents a new grid-slot hash parallel spatial index, the GSHR-Tree, established within the parallel spatial indexing mechanism. Based on a hash table and dynamic spatial slots, this paper improves the structure of the classical parallel R-tree index. The GSHR-Tree index makes full use of the good qualities of the R-Tree and of hash data structures, yielding a new parallel spatial index that can meet the needs of parallel grid computing over massive spatial data in a distributed network. The algorithm splits space into multiple slots and maps these slots to sites in the distributed and parallel system. Each site organizes the spatial objects in its spatial slot into an R-tree. On the basis of this tree structure, the index data is distributed among multiple nodes in the grid network using a large-node R-tree method. Load imbalance during processing can be quickly corrected by means of a dynamic adjustment algorithm. This tree structure has considered the

  9. Analysis of cryptographic mechanisms used in ransomware CryptXXX v3

    Directory of Open Access Journals (Sweden)

    Michał Glet

    2016-12-01

Full Text Available The main purpose of this paper was to analyse how malicious software uses cryptographic mechanisms. Reverse engineering was applied in order to discover the mechanisms used in the ransomware CryptXXX v3. Finally, some useful advice is given on how CryptXXX could be improved. Keywords: ransomware, software engineering, reverse engineering, RC4, RSA, malicious software

  10. Construction of secure and fast hash functions using nonbinary error-correcting codes

    DEFF Research Database (Denmark)

    Knudsen, Lars Ramkilde; Preneel, Bart

    2002-01-01

    constructions based on block ciphers such as the Data Encryption Standard (DES), where the key size is slightly smaller than the block size; IDEA, where the key size is twice the block size; Advanced Encryption Standard (AES), with a variable key size; and to MD4-like hash functions. Under reasonable...

  11. Review and Analysis of Cryptographic Schemes Implementing Threshold Signature

    Directory of Open Access Journals (Sweden)

    Anastasiya Victorovna Beresneva

    2015-03-01

Full Text Available This work is devoted to the study of threshold signature schemes. The threshold signature schemes were systematized, and cryptographic constructions based on Lagrange interpolation polynomials, elliptic curves and bilinear pairings were investigated. Different methods of generating and verifying threshold signatures were explored, e.g. those used in mobile agents, Internet banking and e-currency. The significance of the work lies in reducing the level of counterfeit electronic documents signed by a certain group of users.

  12. Analysis and improvement for the performance of Baptista's cryptographic scheme

    International Nuclear Information System (INIS)

    Wei Jun; Liao Xiaofeng; Wong, K.W.; Zhou Tsing; Deng Yigui

    2006-01-01

    Based on Baptista's chaotic cryptosystem, we propose a secure and robust chaotic cryptographic scheme after investigating the problems found in this cryptosystem as well as its variants. In this proposed scheme, a subkey array generated from the key and the plaintext is adopted to enhance the security. Some methods are introduced to increase the efficiency. Theoretical analyses and numerical simulations indicate that the proposed scheme is secure and efficient for practical use

  13. Design and implementation of an ASIP-based cryptography processor for AES, IDEA, and MD5

    Directory of Open Access Journals (Sweden)

    Karim Shahbazi

    2017-08-01

Full Text Available In this paper, a new 32-bit ASIP-based crypto processor for AES, IDEA, and MD5 is designed. The instruction-set consists of both general purpose and specific instructions for the above cryptographic algorithms. The proposed architecture has nine function units and two data buses. It also has two types of 32-bit instruction formats for executing Memory Reference (M.R.), Register Reference (R.R.), and Input/Output Reference (I/O R.) instructions. The maximum achieved frequency is 166.916 MHz. The encoded output results of the encryption process of a 128-bit input block are obtained after 122, 146 and 170 clock cycles for AES-128, AES-192, and AES-256, respectively. Moreover, it takes 95 clock cycles to encrypt or decrypt a 64-bit input block by using IDEA. Finally, the MD5 hash algorithm requires 469 clock cycles to generate the coded outputs for a block of 512 bits. The performance of the proposed processor is compared to some previous and state-of-the-art implementations in terms of speed, latency, throughput, and flexibility.

  14. Authentication and Encryption Using Modified Elliptic Curve Cryptography with Particle Swarm Optimization and Cuckoo Search Algorithm

    Science.gov (United States)

    Kota, Sujatha; Padmanabhuni, Venkata Nageswara Rao; Budda, Kishor; K, Sruthi

    2018-05-01

Elliptic Curve Cryptography (ECC) uses two keys, a private key and a public key, and is considered a public key cryptographic algorithm that is used both for authentication of a person and for confidentiality of data. Either one of the keys is used in encryption and the other in decryption, depending on usage. The private key is used for encryption by the user and the public key is used to identify the user in the case of authentication. Similarly, the sender encrypts with the private key and the public key is used to decrypt the message in the case of confidentiality. Choosing the private key is always an issue in all public key cryptographic algorithms such as RSA and ECC. If tiny values are chosen at random, the security of the complete algorithm becomes an issue. Since the public key is computed from the private key, keys that are not chosen optimally can generate points at infinity. The proposed Modified Elliptic Curve Cryptography selects the values with either of two options: the first uses Particle Swarm Optimization and the second uses the Cuckoo Search Algorithm to choose the values randomly. The proposed algorithms are developed and tested using a sample database, and both are found to be secure and reliable. The test results show that the private key is chosen optimally, neither repetitive nor tiny, and that the public key computations do not reach the point at infinity.

  15. Live chat alternative security protocol

    Science.gov (United States)

    Rahman, J. P. R.; Nugraha, E.; Febriany, A.

    2018-05-01

Indonesia is one of the largest e-commerce markets in Southeast Asia: as many as 5 million people make e-commerce transactions, and therefore more and more people use live chat services to communicate with customer service. In live chat, customer service often asks for customers' data such as full name, address, e-mail and transaction id, in order to verify the purchase of the product. One of the risks is sniffing, which leads to the theft of confidential information and can cause huge losses to the customer. To anticipate this, we build an alternative security protocol for user interaction in live chat using cryptographic algorithms that protect confidential messages. Live chat requires confidentiality and data integrity, provided by encryption and hash functions. The algorithms used are Rijndael 256-bit, RSA, and SHA-256. To increase the complexity, the Rijndael algorithm is modified in the S-box and ShiftRow sections based on the Shannon principle; the results show that all modifications pass the randomness test, but the modification in ShiftRow shows a better avalanche effect. Therefore the messages will be difficult to steal or alter.

  16. Cryptographic pseudo-random sequence from the spatial chaotic map

    International Nuclear Information System (INIS)

    Sun Fuyan; Liu Shutang

    2009-01-01

    A scheme for pseudo-random binary sequence generation based on the spatial chaotic map is proposed. In order to face the challenge of using the proposed PRBS in cryptography, the proposed PRBS is subjected to statistical tests which are the well-known FIPS-140-1 in the area of cryptography, and correlation properties of the proposed sequences are investigated. The proposed PRBS successfully passes all these tests. Results of statistical testing of the sequences are found encouraging. The results of statistical tests suggest strong candidature for cryptographic applications.
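
    As a generic illustration of turning a chaotic orbit into a bit sequence (the record uses a spatial chaotic map, which is not reproduced here), one can threshold successive iterates of a one-dimensional map; any such generator must of course pass statistical tests such as FIPS-140-1 before cryptographic use.

        # Illustrative sketch only: bits from a thresholded logistic-map orbit.
        def chaotic_bits(x0=0.123456, r=3.99, n=20000, skip=1000):
            x, bits = x0, []
            for i in range(n + skip):
                x = r * x * (1.0 - x)          # logistic map iteration
                if i >= skip:                  # discard the initial transient
                    bits.append(1 if x > 0.5 else 0)
            return bits

        bits = chaotic_bits()
        ones = sum(bits)
        print(ones, len(bits) - ones)          # rough monobit balance check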

  17. Web crawler micro-blog collection technology without login based on Regex web-page denoising and hash comparison

    Institute of Scientific and Technical Information of China (English)

    陈宇; 孟凡龙; 刘培玉; 朱振方

    2015-01-01

In view of the lack of accurate denoising methods in current micro-blog collection and the inability to collect micro-blogs without logging in, we propose a web crawler collection scheme based on Regex web-page denoising and hash comparison, and realise login-free collection by means of a browser plug-in. The method builds DFA and NFA models from regular expressions to remove web-page noise, determines which pages to collect through hash comparison, and achieves login-free access through plug-in privilege elevation. This effectively avoids the hash value diverging from the actual page content and solves the identity-authentication problem caused by repeated URL collection during virtual login of web crawlers. Experiments show that the method can obtain micro-blog information quickly and in real time, providing large volumes of accurate data for public-opinion analysis.
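
    A minimal sketch of the two ideas in the title, regex-based denoising followed by a hash comparison that decides whether a page has really changed; it is illustrative and not the authors' crawler or plug-in.

        # Sketch: strip page noise with regular expressions, then compare a hash of
        # the remaining content to decide whether the page needs to be re-collected.
        import hashlib
        import re

        def denoise(html):
            text = re.sub(r"<script.*?</script>|<style.*?</style>", "", html, flags=re.S)
            text = re.sub(r"<[^>]+>", " ", text)       # drop remaining tags
            return re.sub(r"\s+", " ", text).strip()   # collapse whitespace

        def content_hash(html):
            return hashlib.sha256(denoise(html).encode("utf-8")).hexdigest()

        old = "<html><style>a{}</style><p>post #1</p></html>"
        new = "<html><p>post #1</p><script>track()</script></html>"
        print(content_hash(old) == content_hash(new))  # True: only noise differs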

  18. A model of quantum communication device for quantum hashing

    International Nuclear Information System (INIS)

    Vasiliev, A

    2016-01-01

    In this paper we consider a model of quantum communications between classical computers aided with quantum processors, connected by a classical and a quantum channel. This type of communications implying both classical and quantum messages with moderate use of quantum processing is implicitly used in many quantum protocols, such as quantum key distribution or quantum digital signature. We show that using the model of a quantum processor on multiatomic ensembles in the common QED cavity we can speed up quantum hashing, which can be the basis of quantum digital signature and other communication protocols. (paper)

  19. Integral computer-generated hologram via a modified Gerchberg-Saxton algorithm

    International Nuclear Information System (INIS)

    Wu, Pei-Jung; Lin, Bor-Shyh; Chen, Chien-Yue; Huang, Guan-Syun; Deng, Qing-Long; Chang, Hsuan T

    2015-01-01

    An integral computer-generated hologram, which modulates the phase function of an object based on a modified Gerchberg–Saxton algorithm and compiles a digital cryptographic diagram with phase synthesis, is proposed in this study. When the diagram completes position demultiplexing decipherment, multi-angle elemental images can be reconstructed. Furthermore, an integral CGH with a depth of 225 mm and a visual angle of ±11° is projected through the lens array. (paper)

  20. The generation of shared cryptographic keys through channel impulse response estimation at 60 GHz.

    Energy Technology Data Exchange (ETDEWEB)

    Young, Derek P.; Forman, Michael A.; Dowdle, Donald Ryan

    2010-09-01

    Methods to generate private keys based on wireless channel characteristics have been proposed as an alternative to standard key-management schemes. In this work, we discuss past work in the field and offer a generalized scheme for the generation of private keys using uncorrelated channels in multiple domains. Proposed cognitive enhancements measure channel characteristics, to dynamically change transmission and reception parameters as well as estimate private key randomness and expiration times. Finally, results are presented on the implementation of a system for the generation of private keys for cryptographic communications using channel impulse-response estimation at 60 GHz. The testbed is composed of commercial millimeter-wave VubIQ transceivers, laboratory equipment, and software implemented in MATLAB. Novel cognitive enhancements are demonstrated, using channel estimation to dynamically change system parameters and estimate cryptographic key strength. We show for a complex channel that secret key generation can be accomplished on the order of 100 kb/s.

  1. Power efficient and high performance VLSI architecture for AES algorithm

    Directory of Open Access Journals (Sweden)

    K. Kalaiselvi

    2015-09-01

Full Text Available Advanced encryption standard (AES) algorithm has been widely deployed in cryptographic applications. This work proposes a low power and high throughput implementation of the AES algorithm using a key expansion approach. We minimize the power consumption and critical path delay using the proposed high performance architecture. It supports both encryption and decryption using 256-bit keys with a throughput of 0.06 Gbps. The VHDL language is utilized for simulating the design and an FPGA chip has been used for the hardware implementations. Experimental results reveal that the proposed AES architecture offers superior performance compared to existing VLSI architectures in terms of power, throughput and critical path delay.

  2. Fast Structural Alignment of Biomolecules Using a Hash Table, N-Grams and String Descriptors

    Directory of Open Access Journals (Sweden)

    Robert Preissner

    2009-04-01

    Full Text Available This work presents a generalized approach for the fast structural alignment of thousands of macromolecular structures. The method uses string representations of a macromolecular structure and a hash table that stores n-grams of a certain size for searching. To this end, macromolecular structure-to-string translators were implemented for protein and RNA structures. A query against the index is performed in two hierarchical steps to unite speed and precision. In the first step the query structure is translated into n-grams, and all target structures containing these n-grams are retrieved from the hash table. In the second step all corresponding n-grams of the query and each target structure are subsequently aligned, and after each alignment a score is calculated based on the matching n-grams of query and target. The extendable framework enables the user to query and structurally align thousands of protein and RNA structures on a commodity machine and is available as open source from http://lajolla.sf.net.
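
    The two-step lookup described above can be illustrated with a small sketch (hypothetical data, not the published tool): an index maps n-grams of the structure strings to the targets containing them, and a query is scored by the fraction of its n-grams found in each target.

        # Sketch: n-gram hash index for coarse structural-string matching.
        from collections import defaultdict

        def ngrams(s, n=3):
            return {s[i:i + n] for i in range(len(s) - n + 1)}

        def build_index(targets, n=3):
            index = defaultdict(set)
            for name, s in targets.items():
                for g in ngrams(s, n):
                    index[g].add(name)
            return index

        def query(index, q, n=3):
            hits = defaultdict(int)
            grams = ngrams(q, n)
            for g in grams:
                for name in index.get(g, ()):
                    hits[name] += 1
            return sorted(((c / len(grams), name) for name, c in hits.items()), reverse=True)

        targets = {"1abc": "HHHEEELLLHHH", "2xyz": "EEEEELLLLEEE"}
        print(query(build_index(targets), "HHHEEELLL"))  # ranks 1abc above 2xyz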

  3. An implementation of super-encryption using RC4A and MDTM cipher algorithms for securing PDF Files on android

    Science.gov (United States)

    Budiman, M. A.; Rachmawati, D.; Parlindungan, M. R.

    2018-03-01

MDTM is a classical symmetric cryptographic algorithm. As with other classical algorithms, the MDTM Cipher algorithm is easy to implement but is less secure than modern symmetric algorithms. In order to make it more secure, the stream cipher RC4A is added, and the cryptosystem thus becomes a super encryption. In this process, plaintexts derived from PDFs are first encrypted with the MDTM Cipher algorithm and then encrypted once more with the RC4A algorithm. The test results show that the complexity is Θ(n²) and that the running time is directly (linearly) proportional to the length of the plaintext and of the keys entered.

  4. Cryptanalysis on a parallel keyed hash function based on chaotic maps

    International Nuclear Information System (INIS)

    Guo Wei; Wang Xiaoming; He Dake; Cao Yang

    2009-01-01

This Letter analyzes the security of a novel parallel keyed hash function based on chaotic maps, proposed by Xiao et al. to improve efficiency in parallel computing environments. We first show how to devise forgery attacks on Xiao's scheme with differential cryptanalysis and give the experimental results of two kinds of forgery attacks. Furthermore, we discuss the problem of weak keys in the scheme and demonstrate how weak keys can be used to construct collisions.

  5. MinHash-Based Fuzzy Keyword Search of Encrypted Data across Multiple Cloud Servers

    Directory of Open Access Journals (Sweden)

    Jingsha He

    2018-05-01

Full Text Available To enhance the efficiency of data searching, most data owners store their data files in different cloud servers in the form of cipher-text. Thus, efficient search using fuzzy keywords becomes a critical issue in such a cloud computing environment. This paper proposes a method that aims at improving the efficiency of cipher-text retrieval and lowering the storage overhead of fuzzy keyword search. In contrast to traditional approaches, the proposed method reduces the complexity of MinHash-based fuzzy keyword search by using MinHash fingerprints to avoid the need to construct the fuzzy keyword set. The method uses Jaccard similarity to rank the retrieval results, thus reducing the amount of similarity calculation and saving considerable time and space overhead. The method also takes multiple user queries into consideration through re-encryption technology and updates user permissions dynamically. Security analysis demonstrates that the method provides better privacy preservation, and experimental results show that the proposed method improves retrieval time and lowers storage overhead.
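
    The underlying MinHash/Jaccard idea can be sketched generically (this is not the paper's scheme, and a real deployment would use keyed hashes over encrypted indexes): two keyword sets are reduced to short signatures whose fraction of matching positions estimates their Jaccard similarity.

        # Sketch: MinHash signatures approximating Jaccard similarity of keyword sets.
        import hashlib

        def minhash(words, num_hashes=64):
            sig = []
            for i in range(num_hashes):
                sig.append(min(int(hashlib.sha256(f"{i}:{w}".encode()).hexdigest(), 16)
                               for w in words))
            return sig

        def estimated_jaccard(sig_a, sig_b):
            return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

        a = {"secure", "cloud", "search", "keyword"}
        b = {"secure", "cloud", "search", "fuzzy"}
        print(estimated_jaccard(minhash(a), minhash(b)))  # close to the true Jaccard value 3/5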

  6. Cost Comparison Among Provable Data Possession Schemes

    Science.gov (United States)

    2016-03-01

    of Acronyms and Abbreviations AE authenticated encryption AWS Amazon Web Services CIO Chief Information Officer DISA Defense Information Systems Agency...the number of possible challenges, H be a cryptographic hash function, AE be an authenticated encryption scheme, f be a keyed pseudo-random function...key kenc R←− Kenc for symmetric encryption scheme Enc, and a random HMAC key kmac R←− Kmac. The secret key is sk = 〈kenc, kmac〉 and public key is pk

  7. BIOMETRIC CRYPTOGRAPHY AND NETWORK AUTHENTICATION

    Directory of Open Access Journals (Sweden)

    Tonimir Kišasondi

    2007-06-01

Full Text Available In this paper we present some schemes for strengthening network authentication over insecure channels with biometric concepts, i.e. how to securely transfer or use biometric characteristics as cryptographic keys. We show why some current authentication schemes are insufficient, and we present our concepts of biometric hashes and authentication that rely on unimodal and multimodal biometrics. Our concept can be applied to any biometric authentication scheme and is universal for all systems.

  8. Security of Cooperative Intelligent Transport Systems: Standards, Threats Analysis and Cryptographic Countermeasures

    Directory of Open Access Journals (Sweden)

    Elyes Ben Hamida

    2015-07-01

Full Text Available Due to the growing number of vehicles on the roads worldwide, road traffic accidents are currently recognized as a major public safety problem. In this context, connected vehicles are considered as the key enabling technology to improve road safety and to foster the emergence of next generation cooperative intelligent transport systems (ITS). Through the use of wireless communication technologies, the deployment of ITS will enable vehicles to autonomously communicate with other nearby vehicles and roadside infrastructures and will open the door for a wide range of novel road safety and driver assistive applications. However, connecting wireless-enabled vehicles to external entities can make ITS applications vulnerable to various security threats, thus impacting the safety of drivers. This article reviews the current research challenges and opportunities related to the development of secure and safe ITS applications. It first explores the architecture and main characteristics of ITS systems and surveys the key enabling standards and projects. Then, various ITS security threats are analyzed and classified, along with their corresponding cryptographic countermeasures. Finally, a detailed ITS safety application case study is analyzed and evaluated in light of the European ETSI TC ITS standard. An experimental test-bed is presented, and several elliptic curve digital signature algorithms (ECDSA) are benchmarked for signing and verifying ITS safety messages. To conclude, lessons learned, open research challenges and opportunities are discussed.

  9. Mobile Device Based Dynamic Key Management Protocols for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Chin-Ling Chen

    2015-01-01

Full Text Available In recent years, wireless sensor network (WSN) applications have tended to transmit data hop by hop, from sensor nodes through cluster nodes to the base station. As a result, users must collect data from the base station. This study considers two different applications: hop by hop transmission of data from cluster nodes to the base station and the direct access to cluster node data by mobile users via mobile devices. Due to the hardware limitations of WSNs, some low-cost operations such as symmetric cryptographic algorithms and hash functions are used to implement a dynamic key management. The session key can be updated to prevent threats of attack from each communication. With these methods, the data gathered in wireless sensor networks can be more securely communicated. Moreover, the proposed scheme is analyzed and compared with related schemes. In addition, an NS2 simulation is developed in which the experimental results show that the designed communication protocol is workable.

  10. A neural algorithm for a fundamental computing problem.

    Science.gov (United States)

    Dasgupta, Sanjoy; Stevens, Charles F; Navlakha, Saket

    2017-11-10

    Similarity search-for example, identifying similar images in a database or similar documents on the web-is a fundamental computing problem faced by large-scale information retrieval systems. We discovered that the fruit fly olfactory circuit solves this problem with a variant of a computer science algorithm (called locality-sensitive hashing). The fly circuit assigns similar neural activity patterns to similar odors, so that behaviors learned from one odor can be applied when a similar odor is experienced. The fly algorithm, however, uses three computational strategies that depart from traditional approaches. These strategies can be translated to improve the performance of computational similarity searches. This perspective helps illuminate the logic supporting an important sensory function and provides a conceptually new algorithm for solving a fundamental computational problem. Copyright © 2017 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
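
    One common reading of the fly-inspired scheme (hedged here, since the abstract does not spell out the three strategies) is a sparse random expansion followed by a winner-take-all step that keeps the indices of the largest activations as the hash tag; the sizes and parameters below are illustrative.

        # Sketch of a fly-inspired locality-sensitive hash: sparse binary expansion,
        # then a winner-take-all step keeps the top-k activation indices as the tag.
        import numpy as np

        def make_projection(d, expansion=50, sparsity=0.2, seed=0):
            rng = np.random.default_rng(seed)
            return (rng.random((expansion, d)) < sparsity).astype(float)

        def fly_hash(x, proj, k=6):
            activity = proj @ x
            return frozenset(np.argsort(activity)[-k:])   # winner-take-all tag

        proj = make_projection(d=5)
        a = np.array([1.0, 0.9, 0.0, 0.1, 0.0])
        b = np.array([0.9, 1.0, 0.1, 0.0, 0.0])           # similar "odor"
        c = np.array([0.0, 0.1, 1.0, 0.0, 0.9])           # dissimilar "odor"
        print(len(fly_hash(a, proj) & fly_hash(b, proj)),
              len(fly_hash(a, proj) & fly_hash(c, proj)))  # similar inputs tend to overlap more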

  11. A Hash Based Remote User Authentication and Authenticated Key Agreement Scheme for the Integrated EPR Information System.

    Science.gov (United States)

    Li, Chun-Ta; Weng, Chi-Yao; Lee, Cheng-Chi; Wang, Chun-Cheng

    2015-11-01

    To protect patient privacy and ensure authorized access to remote medical services, many remote user authentication schemes for the integrated electronic patient record (EPR) information system have been proposed in the literature. In a recent paper, Das proposed a hash based remote user authentication scheme using passwords and smart cards for the integrated EPR information system, and claimed that the proposed scheme could resist various passive and active attacks. However, in this paper, we found that Das's authentication scheme is still vulnerable to modification and user duplication attacks. Thereafter we propose a secure and efficient authentication scheme for the integrated EPR information system based on lightweight hash function and bitwise exclusive-or (XOR) operations. The security proof and performance analysis show our new scheme is well-suited to adoption in remote medical healthcare services.

  12. 76 FR 7817 - Announcing Draft Federal Information Processing Standard 180-4, Secure Hash Standard, and Request...

    Science.gov (United States)

    2011-02-11

    ...-02] Announcing Draft Federal Information Processing Standard 180-4, Secure Hash Standard, and Request... and request for comments. SUMMARY: This notice announces the Draft Federal Information Processing..., Information Technology Laboratory, Attention: Comments on Draft FIPS 180-4, 100 Bureau Drive--Stop 8930...

  13. Low Power S-Box Architecture for AES Algorithm using Programmable Second Order Reversible Cellular Automata: An Application to WBAN.

    Science.gov (United States)

    Gangadari, Bhoopal Rao; Ahamed, Shaik Rafi

    2016-12-01

In this paper, we present a novel low-energy architecture for the S-Box used in the Advanced Encryption Standard (AES) algorithm using programmable second order reversible cellular automata (RCA2). The architecture entails a low-power implementation with minimal delay overhead; the security of the proposed RCA2-based S-Box is evaluated using cryptographic properties such as nonlinearity, correlation immunity bias, strict avalanche criteria and entropy, and the proposed architecture is found to be secure enough for cryptographic applications. Moreover, simulation studies of the proposed AES architecture show an energy consumption of 68.726 nJ and power dissipation of 3.856 mW for 0.18-μm technology at 13.69 MHz, and an energy consumption of 29.408 nJ and power dissipation of 1.65 mW for 0.13-μm technology at 13.69 MHz. The proposed AES algorithm with the RCA2-based S-Box shows a reduction in power consumption of 50% and in energy consumption of 5% compared to the best classical S-Box and composite-field-arithmetic-based AES algorithms. Apart from that, it is also shown that RCA2-based S-Boxes are dynamic in nature, invertible and have low power dissipation compared to LUT-based S-Boxes, and hence are suitable for Wireless Body Area Network (WBAN) applications.

  14. VIRTEX-5 Fpga Implementation of Advanced Encryption Standard Algorithm

    Science.gov (United States)

    Rais, Muhammad H.; Qasim, Syed M.

    2010-06-01

    In this paper, we present an implementation of Advanced Encryption Standard (AES) cryptographic algorithm using state-of-the-art Virtex-5 Field Programmable Gate Array (FPGA). The design is coded in Very High Speed Integrated Circuit Hardware Description Language (VHDL). Timing simulation is performed to verify the functionality of the designed circuit. Performance evaluation is also done in terms of throughput and area. The design implemented on Virtex-5 (XC5VLX50FFG676-3) FPGA achieves a maximum throughput of 4.34 Gbps utilizing a total of 399 slices.

  15. Adaptive sampling algorithm for detection of superpoints

    Institute of Scientific and Technical Information of China (English)

    CHENG Guang; GONG Jian; DING Wei; WU Hua; QIANG ShiQiang

    2008-01-01

Superpoints are the sources (or destinations) that connect with a great number of destinations (or sources) during a measurement time interval, so detecting superpoints in real time is very important to network security and management. Previous algorithms are not able to control memory usage while delivering the desired accuracy, so it is hard to detect superpoints on a high-speed link in real time. In this paper, we propose an adaptive sampling algorithm to detect superpoints in real time, which uses a flow sample-and-hold module to reduce the detection of non-superpoints and to improve the measurement accuracy for superpoints. We also design a data stream structure to maintain the flow records, which compensates statistically for flow hash collisions. An adaptive process based on different sampling probabilities is used to maintain the recorded IP addresses in the limited memory. The algorithm is compared with other algorithms by analyzing real network trace data. Experimental results and mathematical analysis show that the algorithm has the advantages of both a limited memory requirement and high measurement accuracy.
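
    The flow sample-and-hold idea can be sketched as follows (probabilities and addresses are made up): a new flow enters the table only with probability p, but once held, every later packet of that flow updates the per-source fan-out count, so heavy fan-out sources stand out even under sampling.

        # Sketch of sample-and-hold fan-out counting for superpoint detection.
        import random

        random.seed(1)
        p_sample = 0.1          # probability of admitting an unseen flow
        held = set()            # flows currently held in memory
        fanout = {}             # source -> destinations seen after holding

        def packet(src, dst):
            flow = (src, dst)
            if flow not in held:
                if random.random() >= p_sample:
                    return                      # not sampled: ignore this packet
                held.add(flow)
            fanout.setdefault(src, set()).add(dst)

        for d in range(1000):                               # scanning source
            packet("scanner", f"10.0.{d // 256}.{d % 256}")
        for _ in range(1000):                               # ordinary source
            packet("normal", "10.0.0.1")

        print(len(fanout.get("scanner", ())), len(fanout.get("normal", ())))

    A real detector would scale the held count by 1/p to estimate the true fan-out and, as in the abstract, correct statistically for hash collisions in the flow table.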

  16. A novel method to design S-box based on chaotic map and genetic algorithm

    International Nuclear Information System (INIS)

    Wang, Yong; Wong, Kwok-Wo; Li, Changbing; Li, Yang

    2012-01-01

    The substitution box (S-box) is an important component in block encryption algorithms. In this Letter, the problem of constructing S-box is transformed to a Traveling Salesman Problem and a method for designing S-box based on chaos and genetic algorithm is proposed. Since the proposed method makes full use of the traits of chaotic map and evolution process, stronger S-box is obtained. The results of performance test show that the presented S-box has good cryptographic properties, which justify that the proposed algorithm is effective in generating strong S-boxes. -- Highlights: ► The problem of constructing S-box is transformed to a Traveling Salesman Problem. ► We present a new method for designing S-box based on chaos and genetic algorithm. ► The proposed algorithm is effective in generating strong S-boxes.

  17. A novel method to design S-box based on chaotic map and genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yong, E-mail: wangyong_cqupt@163.com [State Key Laboratory of Power Transmission Equipment and System Security and New Technology, Chongqing University, Chongqing 400044 (China); Key Laboratory of Electronic Commerce and Logistics, Chongqing University of Posts and Telecommunications, Chongqing 400065 (China); Wong, Kwok-Wo [Department of Electronic Engineering, City University of Hong Kong, 83 Tat Chee Avenue, Kowloon Tong (Hong Kong); Li, Changbing [Key Laboratory of Electronic Commerce and Logistics, Chongqing University of Posts and Telecommunications, Chongqing 400065 (China); Li, Yang [Department of Automatic Control and Systems Engineering, The University of Sheffield, Mapping Street, S1 3DJ (United Kingdom)

    2012-01-30

    The substitution box (S-box) is an important component in block encryption algorithms. In this Letter, the problem of constructing S-box is transformed to a Traveling Salesman Problem and a method for designing S-box based on chaos and genetic algorithm is proposed. Since the proposed method makes full use of the traits of chaotic map and evolution process, stronger S-box is obtained. The results of performance test show that the presented S-box has good cryptographic properties, which justify that the proposed algorithm is effective in generating strong S-boxes. -- Highlights: ► The problem of constructing S-box is transformed to a Traveling Salesman Problem. ► We present a new method for designing S-box based on chaos and genetic algorithm. ► The proposed algorithm is effective in generating strong S-boxes.

  18. Rotation invariant deep binary hashing for fast image retrieval

    Science.gov (United States)

    Dai, Lai; Liu, Jianming; Jiang, Aiwen

    2017-07-01

In this paper, we study how to compactly represent an image's characteristics for fast image retrieval. We propose supervised, rotation-invariant, compact discriminative binary descriptors obtained by combining a convolutional neural network with hashing. In the proposed network, binary codes are learned by employing a hidden layer that represents the latent concepts dominating the class labels. A loss function is proposed to minimize the difference between the binary descriptors of a reference image and of its rotated version. Compared with some other supervised methods, the proposed network does not require pair-wise inputs for binary code learning. Experimental results show that our method is effective and achieves state-of-the-art results on the CIFAR-10 and MNIST datasets.

  19. Scalable Content Authentication in H.264/SVC Videos Using Perceptual Hashing based on Dempster-Shafer theory

    Directory of Open Access Journals (Sweden)

    Ye Dengpan

    2012-09-01

Full Text Available The content authenticity of multimedia delivery is an important issue given the rapid development and wide use of multimedia technology. To date, many authentication solutions have been proposed, such as cryptography- and watermarking-based methods. However, in the latest heterogeneous networks video streams are coded in a scalable way, such as H.264/SVC, and there is still no good authentication solution for them. In this paper, we first summarize related work and propose a scalable content authentication scheme using ratio-of-different-energy (RDE) based perceptual hashing in the Q/S dimension, which uses Dempster-Shafer theory and is combined with the latest scalable video coding (H.264/SVC) construction. The idea of "sign once and verify in a scalable way" can thus be realized. Compared with previous methods, the proposed perceptual-hashing-based scheme outperforms previous works in uncertainty (robustness) and efficiency for H.264/SVC video streams. Finally, the experimental results verify the performance of our scheme.

  20. Architecture for the Secret-Key BC3 Cryptography Algorithm

    Directory of Open Access Journals (Sweden)

    Arif Sasongko

    2011-08-01

Full Text Available Cryptography is a very important aspect of data security. The focus of research in this field is shifting from the security aspect alone to also consider the implementation aspect. This paper aims to introduce the BC3 algorithm with a focus on its hardware implementation, and it proposes an architecture for the hardware implementation of this algorithm. The BC3 algorithm is a secret-key cryptography algorithm developed with two considerations: robustness and implementation efficiency. The algorithm has been implemented in software and has good performance compared to the AES algorithm. BC3 is an improvement of the BC2 and AE cryptographic algorithms and is expected to have the same level of robustness while gaining competitive advantages in the implementation aspect. The development of the architecture pays particular attention to (1) resource sharing and (2) having a single clock for each round, and it exploits the regularity of the algorithm. The architecture is then implemented on an FPGA. This implementation occupies an area three times smaller than AES, yet is about five times faster. Furthermore, this BC3 hardware implementation performs better than the BC3 software in both the key expansion stage and the randomizing stage. In the future, the security of this implementation must be reviewed, especially against side channel attacks.

  1. Hash-chain-based authentication for IoT

    Directory of Open Access Journals (Sweden)

    Antonio PINTO

    2016-12-01

Full Text Available The number of everyday interconnected devices continues to increase and constitute the Internet of Things (IoT). Things are small computers equipped with sensors and wireless communications capabilities that are driven by energy constraints, since they use batteries and may be required to operate over long periods of time. The majority of these devices perform data collection. The collected data is stored on-line using web-services that, sometimes, operate without any special considerations regarding security and privacy. The current work proposes a modified hash-chain authentication mechanism that, with the help of a smartphone, can authenticate each interaction of the devices with a REST web-service using One Time Passwords (OTP) while using open wireless networks. Moreover, the proposed authentication mechanism adheres to the stateless, HTTP-like behavior expected of REST web-services, even allowing the caching of server authentication replies within a predefined time window. No other known web-service authentication mechanism operates in such a manner.
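
    The hash-chain mechanism can be illustrated with a textbook Lamport-style sketch (not the paper's exact protocol): the server stores the last element of a chain of repeated hashes, and each authentication reveals the previous element, which the server verifies with a single hash and then adopts as its new anchor.

        # Sketch: Lamport-style hash-chain one-time passwords.
        import hashlib

        def h(x):
            return hashlib.sha256(x).digest()

        def make_chain(seed, n):
            chain = [seed]
            for _ in range(n):
                chain.append(h(chain[-1]))
            return chain                       # chain[i] = h^i(seed)

        chain = make_chain(b"device-secret", n=1000)
        server_anchor = chain[-1]              # server stores only the chain tail

        def verify(otp, anchor):
            return h(otp) == anchor

        for i in range(999, 996, -1):          # three successive authentications
            otp = chain[i]
            assert verify(otp, server_anchor)
            server_anchor = otp                # server adopts the revealed element as its new anchor
        print("three one-time passwords verified")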

  2. Murasaki: a fast, parallelizable algorithm to find anchors from multiple genomes.

    Directory of Open Access Journals (Sweden)

    Kris Popendorf

Full Text Available BACKGROUND: With the number of available genome sequences increasing rapidly, the magnitude of sequence data required for multiple-genome analyses is a challenging problem. When large-scale rearrangements break the collinearity of gene orders among genomes, genome comparison algorithms must first identify sets of short well-conserved sequences present in each genome, termed anchors. Previously, anchor identification among multiple genomes has been achieved using pairwise alignment tools like BLASTZ through progressive alignment tools like TBA, but the computational requirements for sequence comparisons of multiple genomes quickly become a limiting factor as the number and scale of genomes grow. METHODOLOGY/PRINCIPAL FINDINGS: Our algorithm, named Murasaki, makes it possible to identify anchors within multiple large sequences on the scale of several hundred megabases in a few minutes using a single CPU. Two advanced features of Murasaki are (1) adaptive hash function generation, which enables efficient use of arbitrary mismatch patterns (spaced seeds) and therefore the comparison of multiple mammalian genomes in a practical amount of computation time, and (2) parallelizable execution that decreases the required wall-clock and CPU times. Murasaki can perform a sensitive anchoring of eight mammalian genomes (human, chimp, rhesus, orangutan, mouse, rat, dog, and cow) in 21 hours CPU time (42 minutes wall time). This is the first single-pass in-core anchoring of multiple mammalian genomes. We evaluated Murasaki by comparing it with the genome alignment programs BLASTZ and TBA. We show that Murasaki can anchor multiple genomes in near linear time, compared to the quadratic time requirements of BLASTZ and TBA, while improving overall accuracy. CONCLUSIONS/SIGNIFICANCE: Murasaki provides an open source platform to take advantage of long patterns, cluster computing, and novel hash algorithms to produce accurate anchors across multiple genomes with
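
    Feature (1) rests on spaced seeds, which are easy to sketch (the pattern and sequences below are illustrative, not Murasaki's adaptive hash functions): only positions marked 1 in the pattern contribute to the hash key, so windows that differ only at the don't-care positions still land in the same bucket and become candidate anchors.

        # Sketch: spaced-seed hashing of DNA windows; '0' positions are ignored.
        from collections import defaultdict

        PATTERN = "1101011"     # illustrative spaced seed (span 7, weight 5)

        def spaced_key(window, pattern=PATTERN):
            return "".join(c for c, p in zip(window, pattern) if p == "1")

        def index_sequence(name, seq, table, pattern=PATTERN):
            span = len(pattern)
            for i in range(len(seq) - span + 1):
                table[spaced_key(seq[i:i + span])].append((name, i))

        table = defaultdict(list)
        index_sequence("genomeA", "ACGTACGTTG", table)
        index_sequence("genomeB", "ACCTACGATG", table)    # differs at a don't-care position

        shared = {k: v for k, v in table.items() if len({n for n, _ in v}) > 1}
        print(shared)   # buckets hit by both sequences are candidate anchor seeds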

  3. Encryption algorithms for databases

    Directory of Open Access Journals (Sweden)

    Doina FUSARU

    2010-06-01

Full Text Available In most cases, people use an encrypted mode when sending personal information to a server via an electronic form. Whenever shopping is done online, the browser uses cryptographic methods to send the credit card number and private information to the server. Owing to the surprising development of the Internet, and not to the structural models (OSI and TCP/IP) this technology is based on, electronic commerce requires quality, security, reliability and, above all, the possibility of implementing all such concepts. It is interesting that none of the widely used cryptographic systems has been mathematically proven to be safe. As a matter of fact, the entire technology of cryptography is based on mathematical problems that are still unsolved. In view of the above, the study of cryptographic and security methods, as well as the search for strong crypto-systems, is still a pivotal issue.

  4. Quantum key management

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, Richard John; Thrasher, James Thomas; Nordholt, Jane Elizabeth

    2016-11-29

    Innovations for quantum key management harness quantum communications to form a cryptography system within a public key infrastructure framework. In example implementations, the quantum key management innovations combine quantum key distribution and a quantum identification protocol with a Merkle signature scheme (using Winternitz one-time digital signatures or other one-time digital signatures, and Merkle hash trees) to constitute a cryptography system. More generally, the quantum key management innovations combine quantum key distribution and a quantum identification protocol with a hash-based signature scheme. This provides a secure way to identify, authenticate, verify, and exchange secret cryptographic keys. Features of the quantum key management innovations further include secure enrollment of users with a registration authority, as well as credential checking and revocation with a certificate authority, where the registration authority and/or certificate authority can be part of the same system as a trusted authority for quantum key distribution.
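
    The Merkle hash tree ingredient mentioned above is classical and easy to sketch (generic code, not the system described in the record): leaves are hashes of one-time public keys, each internal node hashes the concatenation of its children, and only the root needs to be authenticated.

        # Sketch: Merkle tree root over a batch of one-time-signature public keys.
        import hashlib

        def h(data):
            return hashlib.sha256(data).digest()

        def merkle_root(leaves):
            level = [h(leaf) for leaf in leaves]
            while len(level) > 1:
                if len(level) % 2:                      # duplicate last node on odd levels
                    level.append(level[-1])
                level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
            return level[0]

        one_time_public_keys = [f"otp-key-{i}".encode() for i in range(8)]
        print(merkle_root(one_time_public_keys).hex())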

  5. DNA Cryptography and Deep Learning using Genetic Algorithm with NW algorithm for Key Generation.

    Science.gov (United States)

    Kalsi, Shruti; Kaur, Harleen; Chang, Victor

    2017-12-05

Cryptography is not only the science of applying complex mathematics and logic to design strong methods to hide data, called encryption, but also of retrieving the original data back, called decryption. The purpose of cryptography is to transmit a message between a sender and receiver such that an eavesdropper is unable to comprehend it. To accomplish this, we need not only a strong algorithm, but a strong key and a strong concept for the encryption and decryption process. We have introduced the concept of DNA Deep Learning Cryptography, which is defined as a technique of concealing data in terms of DNA sequences and deep learning. In the cryptographic technique, each alphabet letter is converted into a different combination of the four bases, namely Adenine (A), Cytosine (C), Guanine (G) and Thymine (T), which make up human deoxyribonucleic acid (DNA). Actual implementations with DNA do not go beyond the laboratory level and are expensive. To bring DNA computing to a digital level, easy and effective algorithms are proposed in this paper. In the proposed work we have introduced, first, a method and its implementation for key generation based on the theory of natural selection, using a Genetic Algorithm with the Needleman-Wunsch (NW) algorithm, and second, a method for implementing encryption and decryption based on DNA computing, using the biological operations of transcription, translation, DNA sequencing and deep learning.

  6. UnoHop: Efficient Distributed Hash Table with O(1 Lookup Performance

    Directory of Open Access Journals (Sweden)

    Herry Sitepu

    2008-05-01

Full Text Available Distributed Hash Tables (DHTs) with O(1) lookup performance strive to minimize the maintenance traffic required for propagating membership change information (events). Distributing these events allows each node in the peer-to-peer network to maintain accurate routing tables with complete membership information. We present UnoHop, a novel DHT protocol with O(1) lookup performance. The protocol uses an efficient mechanism to distribute events through a dissemination tree that is constructed dynamically and rooted at the node that detects the events. Our protocol produces symmetric bandwidth usage at all nodes while decreasing the event propagation delay.

  7. Correlation Immunity, Avalanche Features, and Other Cryptographic Properties of Generalized Boolean Functions

    Science.gov (United States)

    2017-09-01

    satisfying the strict avalanche criterion,” Discrete Math ., vol. 185, pp. 29–39, 1998. [2] R.C. Bose, “On some connections between the design of... Discrete Appl. Math ., vol. 149, pp. 73–86, 2005. [11] T.W. Cusick and P. Stănică, Cryptographic Boolean Functions and Applications, 2nd ed., San Diego...Stănică, “Bisecting binomial coefficients,” Discrete Appl. Math ., vol. 227, pp. 70–83, 2017. [28] T. Martinsen, W. Meidl, and P. Stănică, “Generalized

  8. Secure Hashing of Dynamic Hand Signatures Using Wavelet-Fourier Compression with BioPhasor Mixing and Discretization

    Directory of Open Access Journals (Sweden)

    Wai Kuan Yip

    2007-01-01

Full Text Available We introduce a novel method for the secure computation of a biometric hash on dynamic hand signatures using BioPhasor mixing and discretization. The use of BioPhasor as the mixing process provides a one-way transformation that precludes exact recovery of the biometric vector from compromised hashes and stolen tokens. In addition, our user-specific discretization acts both as an error correction step and as a real-to-binary space converter. We also propose a new method of extracting a compressed representation of dynamic hand signatures using the discrete wavelet transform (DWT) and the discrete Fourier transform (DFT). By avoiding the conventional use of dynamic time warping, the proposed method avoids storing the user's hand signature template, an important consideration for protecting the privacy of the biometric owner. Our results show that the proposed method produces stable and distinguishable bit strings, with equal error rates (EERs) of and for random and skilled forgeries in the stolen-token (worst case) scenario, and for both forgeries in the genuine-token (optimal) scenario.

  9. Implementing SSL/TLS using cryptography and PKI

    CERN Document Server

    Davies, Joshua

    2011-01-01

    Hands-on, practical guide to implementing SSL and TLS protocols for Internet security If you are a network professional who knows C programming, this practical book is for you.  Focused on how to implement Secure Socket Layer (SSL) and Transport Layer Security (TLS), this book guides you through all necessary steps, whether or not you have a working knowledge of cryptography. The book covers SSLv2, TLS 1.0, and TLS 1.2, including implementations of the relevant cryptographic protocols, secure hashing, certificate parsing, certificate generation, and more.  Coverage includes: Underst

  10. Privacy-Preserving and Scalable Service Recommendation Based on SimHash in a Distributed Cloud Environment

    Directory of Open Access Journals (Sweden)

    Yanwei Xu

    2017-01-01

Full Text Available With the increasing volume of web services in the cloud environment, Collaborative Filtering (CF) based service recommendation has become one of the most effective techniques to alleviate the heavy burden of service selection decisions on a target user. However, the recommendation bases, that is, the historical service usage data, are often distributed across different cloud platforms. Two challenges are present in such a cross-cloud service recommendation scenario. First, a cloud platform is often not willing to share its data with other cloud platforms due to privacy concerns, which severely decreases the feasibility of cross-cloud service recommendation. Second, the historical service usage data recorded in each cloud platform may be updated over time, which significantly reduces recommendation scalability. In view of these two challenges, a novel privacy-preserving and scalable service recommendation approach based on SimHash, named SerRecSimHash, is proposed in this paper. Finally, through a set of experiments deployed on a real distributed service quality dataset, WS-DREAM, we validate the feasibility of our proposal in terms of recommendation accuracy and efficiency while guaranteeing privacy preservation.
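
    The SimHash fingerprinting step on which the approach builds can be sketched generically (this is standard SimHash, not SerRecSimHash itself): each feature's hash pushes per-bit counters up or down by its weight, and the fingerprint records the sign of each counter, so similar usage records typically end up at a small Hamming distance.

        # Sketch: 64-bit SimHash fingerprints of weighted feature sets.
        import hashlib

        def simhash(features, bits=64):
            counters = [0] * bits
            for feat, weight in features.items():
                digest = int(hashlib.sha256(feat.encode()).hexdigest(), 16)
                for i in range(bits):
                    counters[i] += weight if (digest >> i) & 1 else -weight
            return sum(1 << i for i, c in enumerate(counters) if c > 0)

        def hamming(a, b):
            return bin(a ^ b).count("1")

        u1 = {"service_A": 3.0, "service_B": 1.0, "service_C": 2.0}
        u2 = {"service_A": 3.0, "service_B": 1.0, "service_D": 2.0}   # mostly similar usage
        u3 = {"service_X": 2.0, "service_Y": 4.0}                     # dissimilar usage
        print(hamming(simhash(u1), simhash(u2)), hamming(simhash(u1), simhash(u3)))

    Smaller distances indicate more similar usage, which is what lets fingerprints stand in for the raw, privacy-sensitive usage data when platforms compare records.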

  11. MEANING OF THE BITCOIN CRYPTOGRAPHIC CURRENCY AS A MEDIUM OF EXCHANGE

    Directory of Open Access Journals (Sweden)

    Łukasz Dopierała

    2014-06-01

    Full Text Available This article presents one of the new elements of virtual reality, the Bitcoin cryptocurrency. It focuses on the condition and development prospects of the trading function of this instrument. The authors discuss the legal aspects of the functioning of Bitcoin, conduct a SWOT analysis of this cryptocurrency as a medium of exchange, and examine the scale of use of Bitcoin for transaction purposes. As of March 1, 2014, the trading system is gradually developing and the strengths of this cryptographic currency outweigh its weaknesses, but the future of Bitcoin as a medium of exchange is difficult to determine.

  12. An Experimental Evaluation of the DQ-DHT Algorithm in a Grid Information Service

    Science.gov (United States)

    Papadakis, Harris; Trunfio, Paolo; Talia, Domenico; Fragopoulou, Paraskevi

    DQ-DHT is a resource discovery algorithm that combines the Dynamic Querying (DQ) technique used in unstructured peer-to-peer networks with an algorithm for efficient broadcast over a Distributed Hash Table (DHT). Similarly to DQ, DQ-DHT dynamically controls the query propagation on the basis of the desired number of results and the popularity of the resource to be located. Differently from DQ, DQ-DHT exploits the structural properties of a DHT to avoid message duplications, thus reducing the amount of network traffic generated by each query. The goal of this paper is to evaluate experimentally the amount of traffic generated by DQ-DHT compared to the DQ algorithm in a Grid infrastructure. A prototype of a Grid information service, which can use both DQ and DQ-DHT as resource discovery algorithm, has been implemented and deployed on the Grid'5000 infrastructure for evaluation. The experimental results presented in this paper show that DQ-DHT significantly reduces the amount of network traffic generated during the discovery process compared to the original DQ algorithm.

  13. Computer Security: Your privacy at CERN matters

    CERN Multimedia

    Stefan Lueders, Computer Security Team

    2015-01-01

    Congrats to all those who spotted that our last contribution to the CERN Bulletin (“CERN Secure Password Competition” – see here) was an April Fools’ Day hoax. Of course, there is no review and no jury and there won’t be any competition. Consequently, we are sorry to say that we cannot announce any winners. The extension of the password history rule and the initiative of finding password duplicates are absolute nonsense too.   In fact, the Computer Security team, just like the CERN Account Management service, the Single Sign-On team and the ServiceDesk, does not know and has no need to know your password. Passwords are actually salted and hashed using the SHA256 cryptographic hash function. Thus, there is no literal password database and no way that anyone apart from you can know your password – unless you have given it away intentionally or inadvertently… Remember, your password is yours and only yours, so please do not...
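
    The salting-and-hashing idea mentioned above can be sketched in a few lines (an illustrative example only, not CERN's actual code; a production system would normally prefer a deliberately slow, iterated scheme such as PBKDF2 or bcrypt):

      import hashlib, hmac, os

      def hash_password(password, salt=None):
          # Only (salt, digest) is stored; the plaintext password never is.
          salt = salt or os.urandom(16)
          digest = hashlib.sha256(salt + password.encode()).hexdigest()
          return salt, digest

      def verify(password, salt, stored_digest):
          _, digest = hash_password(password, salt)
          return hmac.compare_digest(digest, stored_digest)   # constant-time compare

      salt, digest = hash_password("correct horse battery staple")
      print(verify("correct horse battery staple", salt, digest))   # True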

  14. Refined repetitive sequence searches utilizing a fast hash function and cross species information retrievals

    Directory of Open Access Journals (Sweden)

    Reneker Jeff

    2005-05-01

    Full Text Available Abstract Background Searching for small tandem/disperse repetitive DNA sequences streamlines many biomedical research processes. For instance, whole genomic array analysis in yeast has revealed 22 PHO-regulated genes. The promoter regions of all but one of them contain at least one of the two core Pho4p binding sites, CACGTG and CACGTT. In humans, microsatellites play a role in a number of rare neurodegenerative diseases such as spinocerebellar ataxia type 1 (SCA1). SCA1 is a hereditary neurodegenerative disease caused by an expanded CAG repeat in the coding sequence of the gene. In bacterial pathogens, microsatellites are proposed to regulate expression of some virulence factors. For example, bacteria commonly generate intra-strain diversity through phase variation which is strongly associated with virulence determinants. A recent analysis of the complete sequences of the Helicobacter pylori strains 26695 and J99 has identified 46 putative phase-variable genes among the two genomes through their association with homopolymeric tracts and dinucleotide repeats. Life scientists are increasingly interested in studying the function of small sequences of DNA. However, current search algorithms often generate thousands of matches – most of which are irrelevant to the researcher. Results We present our hash function as well as our search algorithm to locate small sequences of DNA within multiple genomes. Our system applies information retrieval algorithms to discover knowledge of cross-species conservation of repeat sequences. We discuss our incorporation of the Gene Ontology (GO) database into these algorithms. We conduct an exhaustive time analysis of our system for various repetitive sequence lengths. For instance, a search for eight bases of sequence within 3.224 GBases on 49 different chromosomes takes 1.147 seconds on average. To illustrate the relevance of the search results, we conduct a search with and without added annotation terms for the
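
    The idea of hashing short DNA words for fast lookup can be conveyed with a toy index (an illustration only, not the authors' hash function): every k-mer of the genome is packed into an integer key and stored with its positions, so a motif such as the Pho4p core site CACGTG is located in constant expected time.

      from collections import defaultdict

      ENC = {"A": 0, "C": 1, "G": 2, "T": 3}

      def kmer_key(kmer):
          # Pack a DNA word into an integer using 2 bits per base.
          key = 0
          for base in kmer:
              key = (key << 2) | ENC[base]
          return key

      def build_index(genome, k):
          index = defaultdict(list)
          for i in range(len(genome) - k + 1):
              index[kmer_key(genome[i:i + k])].append(i)
          return index

      genome = "TTCACGTGACACGTTAG"
      index = build_index(genome, 6)
      print(index[kmer_key("CACGTG")])   # [2]: position of the core binding site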

  15. Enforcing Security Mechanisms in the IP-Based Internet of Things: An Algorithmic Overview

    Directory of Open Access Journals (Sweden)

    Luca Veltri

    2013-04-01

    Full Text Available The Internet of Things (IoT) refers to the Internet-like structure of billions of interconnected constrained devices, denoted as “smart objects”. Smart objects have limited capabilities, in terms of computational power and memory, and might be battery-powered devices, thus raising the need to adopt particularly energy efficient technologies. Among the most notable challenges that building interconnected smart objects brings about, there are standardization and interoperability. The use of IP has been foreseen as the standard for interoperability for smart objects. As billions of smart objects are expected to come to life and IPv4 addresses have eventually reached depletion, IPv6 has been identified as a candidate for smart-object communication. The deployment of the IoT raises many security issues coming from (i) the very nature of smart objects, e.g., the adoption of lightweight cryptographic algorithms, in terms of processing and memory requirements; and (ii) the use of standard protocols, e.g., the need to minimize the amount of data exchanged between nodes. This paper provides a detailed overview of the security challenges related to the deployment of smart objects. Security protocols at network, transport, and application layers are discussed, together with lightweight cryptographic algorithms proposed to be used instead of conventional and demanding ones, in terms of computational resources. Security aspects, such as key distribution and security bootstrapping, and application scenarios, such as secure data aggregation and service authorization, are also discussed.

  16. A Secure and Robust User Authenticated Key Agreement Scheme for Hierarchical Multi-medical Server Environment in TMIS.

    Science.gov (United States)

    Das, Ashok Kumar; Odelu, Vanga; Goswami, Adrijit

    2015-09-01

    The telecare medicine information system (TMIS) helps the patients to gain the health monitoring facility at home and access medical services over the Internet or mobile networks. Recently, Amin and Biswas presented a smart card based user authentication and key agreement security protocol usable for TMIS using the cryptographic one-way hash function and a biohashing function, and claimed that their scheme is secure against all possible attacks. Though their scheme is efficient due to the usage of a one-way hash function, we show that their scheme has several security pitfalls and design flaws, such as (1) it fails to protect against privileged-insider attack, (2) it fails to protect against strong replay attack, (3) it fails to protect against strong man-in-the-middle attack, (4) it has a design flaw in the user registration phase, (5) it has a design flaw in the login phase, (6) it has a design flaw in the password change phase, (7) it lacks support for a biometric update phase, and (8) it has flaws in its formal security analysis. In order to withstand these security pitfalls and design flaws, we aim to propose a secure and robust user authenticated key agreement scheme for the hierarchical multi-server environment suitable in TMIS using the cryptographic one-way hash function and fuzzy extractor. Through the rigorous security analysis including the formal security analysis using the widely-accepted Burrows-Abadi-Needham (BAN) logic, the formal security analysis under the random oracle model and the informal security analysis, we show that our scheme is secure against possible known attacks. Furthermore, we simulate our scheme using the most widely accepted and used Automated Validation of Internet Security Protocols and Applications (AVISPA) tool. The simulation results show that our scheme is also secure. Our scheme is more efficient in computation and communication as compared to Amin-Biswas's scheme and other related schemes. In addition, our scheme supports extra functionality features as compared to

  17. MATCHING AERIAL IMAGES TO 3D BUILDING MODELS BASED ON CONTEXT-BASED GEOMETRIC HASHING

    Directory of Open Access Journals (Sweden)

    J. Jung

    2016-06-01

    Full Text Available In this paper, a new model-to-image framework to automatically align a single airborne image with existing 3D building models using geometric hashing is proposed. As a prerequisite process for various applications such as data fusion, object tracking, change detection and texture mapping, the proposed registration method is used for determining accurate exterior orientation parameters (EOPs) of a single image. This model-to-image matching process consists of three steps: (1) feature extraction, (2) similarity measure and matching, and (3) adjustment of the EOPs of a single image. For feature extraction, we propose two types of matching cues: edged corner points representing the saliency of building corner points with associated edges, and contextual relations among the edged corner points within an individual roof. These matching features are extracted from both the 3D building models and the single airborne image. A set of matched corners is found with a given proximity measure through geometric hashing, and optimal matches are then finally determined by maximizing the matching cost encoding contextual similarity between matching candidates. Final matched corners are used for adjusting the EOPs of the single airborne image by the least square method based on collinearity equations. The result shows that acceptable accuracy of a single image's EOPs can be achieved with the proposed registration approach, as an alternative to the labour-intensive manual registration process.

  18. MULTIMEDIA DATA TRANSMISSION THROUGH TCP/IP USING HASH BASED FEC WITH AUTO-XOR SCHEME

    OpenAIRE

    R. Shalin; D. Kesavaraja

    2012-01-01

    The most preferred mode for communication of multimedia data is through the TCP/IP protocol. But on the other hand, the TCP/IP protocol produces huge packet loss, unavoidable due to network traffic and congestion. In order to provide an efficient communication it is necessary to recover the loss of packets. The proposed scheme implements hash-based FEC with an auto-XOR scheme for this purpose. The scheme is implemented through forward error correction, MD5 and XOR for providing efficient transmissi...
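
    The combination of an XOR parity packet with an MD5 digest can be sketched as follows (an illustrative reconstruction of the general idea, not the authors' exact scheme): the sender transmits the data packets plus their XOR, and the receiver can rebuild any single lost packet and check its integrity against the digest.

      import hashlib
      from functools import reduce

      def xor_bytes(a, b):
          return bytes(x ^ y for x, y in zip(a, b))

      packets = [b"frame-00-payload", b"frame-01-payload", b"frame-02-payload"]
      parity = reduce(xor_bytes, packets)                        # FEC parity packet
      digests = [hashlib.md5(p).hexdigest() for p in packets]    # per-packet MD5

      # Suppose packet 1 is lost in transit; rebuild it from the others + parity.
      received = [packets[0], None, packets[2]]
      recovered = reduce(xor_bytes, [p for p in received if p] + [parity])
      assert hashlib.md5(recovered).hexdigest() == digests[1]    # integrity check
      print(recovered)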

  19. Algoritmi selektivnog šifrovanja - pregled sa ocenom performansi / Selective encryption algorithms: Overview with performance evaluation

    Directory of Open Access Journals (Sweden)

    Boriša Ž. Jovanović

    2010-10-01

    Full Text Available Digital multimedia content is becoming more widespread and is increasingly exchanged over computer networks and public channels (satellite communications, wireless networks, the Internet, etc.) that are insecure media for transmitting sensitive information. Mechanisms for the cryptographic protection of images and video content are therefore gaining importance. Traditional cryptographic processing in systems for transmitting these kinds of information guarantees a high degree of security, but it also has drawbacks: a high implementation cost and considerable delays in data transmission. These shortcomings are overcome by applying selective encryption algorithms. / Digital multimedia content is becoming widely used and increasingly exchanged over computer networks and public channels (satellite, wireless networks, Internet, etc.), which are unsecured transmission media for exchanging that kind of information. Mechanisms designed to encrypt image and video data are becoming more and more significant. Traditional cryptographic techniques can guarantee a high level of security, but at the cost of expensive implementation and significant transmission delays. These shortcomings can be overcome using selective encryption algorithms. Introduction: In traditional image and video content protection schemes, called fully layered, the whole content is first compressed. Then, the compressed bitstream is entirely encrypted using a standard cipher (DES - Data Encryption Standard, IDEA - International Data Encryption Algorithm, AES - Advanced Encryption Standard, etc.). The specific characteristics of this kind of data, a high transmission rate with limited bandwidth, make standard encryption algorithms inadequate. Another limitation of traditional systems is that they alter the whole bitstream syntax, which may disable some codec functionalities on the coder at the delivery side and the decoder at the receiving side. Selective encryption is a new trend in image and video content protection. As its

  20. Formalizing the Relationship Between Commitment and Basic Cryptographic Primitives

    Directory of Open Access Journals (Sweden)

    S. Sree Vivek

    2016-11-01

    Full Text Available Signcryption is a cryptographic primitive which offers the functionality of both digital signature and encryption with lower combined computational cost. On the other hand, a commitment scheme allows an entity to commit to a value, where the entity reveals the committed value later during a decommit phase. In this paper, we explore the connection between commitment schemes, public key encryption, digital signatures and signcryption. We establish formal relationships between commitment and the other primitives. Our main result is that we show signcryption can be used as a commitment scheme with appropriate security notions. We show that if the underlying signcryption scheme is IND-CCA2 secure, then the hiding property of the commitment scheme is satisfied. Similarly, we show that if the underlying signcryption scheme is unforgeable, then the relaxed binding property of the commitment scheme is satisfied. Moreover, we prove that if the underlying signcryption scheme is NM-CCA2, then the commitment scheme is non-malleable.

  1. Cryptographic Key Management in Delay Tolerant Networks: A Survey

    Directory of Open Access Journals (Sweden)

    Sofia Anna Menesidou

    2017-06-01

    Full Text Available Since their appearance at the dawn of the second millennium, Delay or Disruption Tolerant Networks (DTNs) have gradually evolved, spurring the development of a variety of methods and protocols for making them more secure and resilient. In this context, perhaps the most challenging problem to deal with is that of cryptographic key management. To the best of our knowledge, the work at hand is the first to survey the relevant literature and classify the various so far proposed key management approaches in such a restricted and harsh environment. Towards this goal, we have grouped the surveyed key management methods into three major categories depending on whether the particular method copes with (a) security initialization, (b) key establishment, and (c) key revocation. We have attempted to provide a concise but fairly complete evaluation of the proposed up-to-date methods in a generalized way with the aim of offering a central reference point for future research.

  2. Cryptographic robustness of a quantum cryptography system using phase-time coding

    International Nuclear Information System (INIS)

    Molotkov, S. N.

    2008-01-01

    A cryptographic analysis is presented of a new quantum key distribution protocol using phase-time coding. An upper bound is obtained for the error rate that guarantees secure key distribution. It is shown that the maximum tolerable error rate for this protocol depends on the counting rate in the control time slot. When no counts are detected in the control time slot, the protocol guarantees secure key distribution if the bit error rate in the sifted key does not exceed 50%. This protocol partially discriminates between errors due to system defects (e.g., imbalance of a fiber-optic interferometer) and eavesdropping. In the absence of eavesdropping, the counts detected in the control time slot are not caused by interferometer imbalance, which reduces the requirements for interferometer stability.

  3. A hash based mutual RFID tag authentication protocol in telecare medicine information system.

    Science.gov (United States)

    Srivastava, Keerti; Awasthi, Amit K; Kaul, Sonam D; Mittal, R C

    2015-01-01

    Radio Frequency Identification (RFID) is a technology with multidimensional applications that reduce the complexity of daily life. RFID is in widespread use in areas such as access control, transportation, real-time inventory, asset management and automated payment systems. Recently, this technology has been expanding into healthcare environments, where potential applications include patient monitoring, object traceability and drug administration systems. In this paper, we propose a secure RFID-based protocol for the medical sector. This protocol is based on a hash operation with a synchronized secret. The protocol is safe against active and passive attacks such as forgery, traceability, replay and de-synchronization attacks.
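
    A hash-with-synchronized-secret exchange of the kind described can be sketched as below (a simplified illustration, not the exact protocol of the paper): tag and reader share a secret, prove knowledge of it with hashed nonces, and then both roll the secret forward so that replayed messages and de-synchronization attempts are detectable.

      import hashlib, os

      H = lambda *parts: hashlib.sha256(b"".join(parts)).digest()

      class Party:
          def __init__(self, secret):
              self.secret = secret

      tag, reader = Party(b"shared-secret"), Party(b"shared-secret")

      # Reader challenges the tag.
      r_nonce = os.urandom(16)
      tag_resp = H(tag.secret, r_nonce)                # tag proves knowledge of the secret
      assert tag_resp == H(reader.secret, r_nonce)     # reader verifies

      # Tag challenges the reader the same way (mutual authentication).
      t_nonce = os.urandom(16)
      reader_resp = H(reader.secret, t_nonce, b"reader")
      assert reader_resp == H(tag.secret, t_nonce, b"reader")

      # Both sides update the synchronized secret for the next session.
      tag.secret, reader.secret = H(tag.secret), H(reader.secret)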

  4. SEMANTIC SEGMENTATION OF BUILDING ELEMENTS USING POINT CLOUD HASHING

    Directory of Open Access Journals (Sweden)

    M. Chizhova

    2018-05-01

    Full Text Available For the interpretation of point clouds, the semantic definition of segments extracted from point clouds or images is a common problem. Usually, the semantics of geometrically pre-segmented point cloud elements are determined using probabilistic networks and scene databases. The proposed semantic segmentation method is based on the psychological human interpretation of geometric objects, especially on fundamental rules of primary comprehension. Starting from these rules, buildings can be quite well and simply classified by a human operator (e.g. an architect) into different building types and structural elements (dome, nave, transept etc.), including particular building parts which are visually detected. The key part of the procedure is a novel method based on hashing, in which point cloud projections are transformed into binary pixel representations. The segmentation approach, demonstrated on the example of classical Orthodox churches, is suitable for other buildings and objects characterized by a particular typology in their construction (e.g. industrial objects in standardized environments with strict component design), allowing clear semantic modelling.

  5. Parameter-free Locality Sensitive Hashing for Spherical Range Reporting

    DEFF Research Database (Denmark)

    Ahle, Thomas Dybdahl; Pagh, Rasmus; Aumüller, Martin

    2017-01-01

    We present a data structure for *spherical range reporting* on a point set S, i.e., reporting all points in S that lie within radius r of a given query point q. Our solution builds upon the Locality-Sensitive Hashing (LSH) framework of Indyk and Motwani, which represents the asymptotically best solutions to near neighbor problems in high dimensions. While traditional LSH data structures have several parameters whose optimal values depend on the distance distribution from q to the points of S, our data structure is parameter-free, except for the space usage, which is configurable by the user. It achieves a query time bounded by O(t(n/t)^ρ), where t is the number of points to report and ρ ∈ (0,1) depends on the data distribution and the strength of the LSH family used. We further present a parameter-free way of using multi-probing, for LSH families that support it, and show that for many such families
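
    The LSH building block underlying such data structures can be illustrated with the classic p-stable (Euclidean) hash family (a generic textbook construction, not necessarily the family used in the paper): nearby points fall into the same bucket with higher probability than distant ones.

      import math, random

      def make_euclidean_lsh(dim, w=4.0, seed=0):
          # h(v) = floor((a . v + b) / w) with Gaussian a and uniform b in [0, w).
          rng = random.Random(seed)
          a = [rng.gauss(0, 1) for _ in range(dim)]
          b = rng.uniform(0, w)
          def h(v):
              return math.floor((sum(ai * vi for ai, vi in zip(a, v)) + b) / w)
          return h

      h = make_euclidean_lsh(dim=3)
      p, q, far = (1.0, 2.0, 3.0), (1.1, 2.0, 2.9), (8.0, -5.0, 0.0)
      print(h(p) == h(q), h(p) == h(far))   # close points usually collide, far ones rarely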

  6. PDES, Fips Standard Data Encryption Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Nessett, D N [Lawrence Livermore National Laboratory (United States)

    1991-03-26

    Description of program or function: PDES performs the National Bureau of Standards FIPS Pub. 46 data encryption/decryption algorithm used for the cryptographic protection of computer data. The DES algorithm is designed to encipher and decipher blocks of data consisting of 64 bits under control of a 64-bit key. The key is generated in such a way that each of the 56 bits used directly by the algorithm are random and the remaining 8 error-detecting bits are set to make the parity of each 8-bit byte of the key odd, i. e. there is an odd number of '1' bits in each 8-bit byte. Each member of a group of authorized users of encrypted computer data must have the key that was used to encipher the data in order to use it. Data can be recovered from cipher only by using exactly the same key used to encipher it, but with the schedule of addressing the key bits altered so that the deciphering process is the reverse of the enciphering process. A block of data to be enciphered is subjected to an initial permutation, then to a complex key-dependent computation, and finally to a permutation which is the inverse of the initial permutation. Two PDES routines are included; both perform the same calculation. One, identified as FDES.MAR, is designed to achieve speed in execution, while the other identified as PDES.MAR, presents a clearer view of how the algorithm is executed
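
    The odd-parity rule for DES key bytes described above is easy to show in code (a small illustrative helper, not part of the PDES/FDES routines themselves): the least significant bit of each key byte is set so that the byte contains an odd number of '1' bits.

      def set_odd_parity(key):
          # Force each byte of a DES key to odd parity.
          out = bytearray()
          for b in key:
              ones = bin(b >> 1).count("1")                 # parity of the 7 key bits
              out.append((b & 0xFE) | (ones % 2 == 0))      # set LSB so the total is odd
          return bytes(out)

      raw = bytes.fromhex("0123456789abcdef")
      print(set_odd_parity(raw).hex())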

  7. PDES, Fips Standard Data Encryption Algorithm

    International Nuclear Information System (INIS)

    Nessett, D.N.

    1991-01-01

    Description of program or function: PDES performs the National Bureau of Standards FIPS Pub. 46 data encryption/decryption algorithm used for the cryptographic protection of computer data. The DES algorithm is designed to encipher and decipher blocks of data consisting of 64 bits under control of a 64-bit key. The key is generated in such a way that each of the 56 bits used directly by the algorithm are random and the remaining 8 error-detecting bits are set to make the parity of each 8-bit byte of the key odd, i. e. there is an odd number of '1' bits in each 8-bit byte. Each member of a group of authorized users of encrypted computer data must have the key that was used to encipher the data in order to use it. Data can be recovered from cipher only by using exactly the same key used to encipher it, but with the schedule of addressing the key bits altered so that the deciphering process is the reverse of the enciphering process. A block of data to be enciphered is subjected to an initial permutation, then to a complex key-dependent computation, and finally to a permutation which is the inverse of the initial permutation. Two PDES routines are included; both perform the same calculation. One, identified as FDES.MAR, is designed to achieve speed in execution, while the other identified as PDES.MAR, presents a clearer view of how the algorithm is executed

  8. Evaluación de la propuesta algorítmica criptográfica con la incorporación de la esteganografía en imágenes

    Directory of Open Access Journals (Sweden)

    Méndez-Naranjo, Pablo

    2017-12-01

    Full Text Available The present study was quasi-experimental and applicative, and it integrated two fields of security: cryptography, which encrypts the message, and steganography, which hides the message behind a multimedia medium, thereby strengthening the level of security. NetBeans was used as the development environment, Beyond Compare to compare the hexadecimal code of the images, Ion Forge Image Diff to compare the pixel-by-pixel differences between images, and CrypTool for the cryptanalysis tests. The AES (Advanced Encryption Standard) cryptographic algorithm was used as the basis, and LSB (Least Significant Bit) was selected as the steganographic technique for images. New functions included in Prototype II were implemented and evaluated, and cryptanalysis results on the encrypted messages were compared between Prototype II, which uses the new cryptographic algorithm named NAES, and Prototype I, which uses the base AES algorithm; both incorporate the LSB steganographic technique in images. It was concluded that the new cryptographic algorithm NAES, with the incorporation of the LSB technique, improved security in comparison with the AES cryptographic algorithm, since the message is more diffuse.
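
    The LSB technique referred to above can be sketched in a few lines (a toy illustration on a raw byte buffer, not the prototype's code): each bit of the ciphertext replaces the least significant bit of one pixel byte, which changes the cover image imperceptibly.

      def embed_lsb(pixels, payload):
          bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
          assert len(bits) <= len(pixels), "cover image too small"
          out = bytearray(pixels)
          for i, bit in enumerate(bits):
              out[i] = (out[i] & 0xFE) | bit        # overwrite the least significant bit
          return out

      def extract_lsb(pixels, n_bytes):
          bits = [pixels[i] & 1 for i in range(n_bytes * 8)]
          return bytes(sum(bit << (7 - j) for j, bit in enumerate(bits[k:k + 8]))
                       for k in range(0, len(bits), 8))

      cover = bytearray(range(256)) * 4             # stand-in for grayscale pixel data
      stego = embed_lsb(cover, b"NAES ciphertext")
      print(extract_lsb(stego, 15))                 # b'NAES ciphertext'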

  9. Detection of beamsplitting attack in a quantum cryptographic channel based on photon number statistics monitoring

    International Nuclear Information System (INIS)

    Gaidash, A A; Egorov, V I; Gleim, A V

    2014-01-01

    Quantum cryptography in theory allows distributing secure keys between two users so that any performed eavesdropping attempt would be immediately discovered. However, in practice an eavesdropper can obtain key information from multi-photon states when attenuated laser radiation is used as a source. In order to overcome this possibility, it is generally suggested to implement special cryptographic protocols, like decoy states or SARG04. We present an alternative method based on monitoring photon number statistics after detection. This method can therefore be used with any existing protocol

  10. Matching Aerial Images to 3D Building Models Using Context-Based Geometric Hashing

    Directory of Open Access Journals (Sweden)

    Jaewook Jung

    2016-06-01

    Full Text Available A city is a dynamic entity whose environment is continuously changing over time. Accordingly, its virtual city models also need to be regularly updated to support accurate model-based decisions for various applications, including urban planning, emergency response and autonomous navigation. A concept of continuous city modeling is to progressively reconstruct city models by accommodating their changes recognized in the spatio-temporal domain, while preserving unchanged structures. A first critical step for continuous city modeling is to coherently register remotely sensed data taken at different epochs with existing building models. This paper presents a new model-to-image registration method using a context-based geometric hashing (CGH) method to align a single image with existing 3D building models. This model-to-image registration process consists of three steps: (1) feature extraction; (2) similarity measure and matching; and (3) estimating exterior orientation parameters (EOPs) of a single image. For feature extraction, we propose two types of matching cues: edged corner features representing the saliency of building corner points with associated edges, and contextual relations among the edged corner features within an individual roof. A set of matched corners is found with a given proximity measure through geometric hashing, and optimal matches are then finally determined by maximizing the matching cost encoding contextual similarity between matching candidates. Final matched corners are used for adjusting the EOPs of the single airborne image by the least square method based on collinearity equations. The result shows that acceptable accuracy of the EOPs of a single image can be achieved using the proposed registration approach as an alternative to a labor-intensive manual registration process.

  11. Design and analysis of cryptographic algorithms

    DEFF Research Database (Denmark)

    Kölbl, Stefan

    In today’s world computers are ubiquitous. They can be found in virtually any industry and most households own at least one personal computer or have a mobile phone. Apart from these fairly large and complex devices, we also see computers on a much smaller scale appear in everyday objects ... to this development. However, most of this communication happens over inherently insecure channels, requiring methods to protect our communication. A further issue is the vast amount of data generated, which raises serious privacy concerns. Cryptography provides the key components for protecting our communication ... From securing our passwords and personal data to protecting mobile communication from eavesdroppers and our electronic bank transactions from manipulation. These applications would be impossible without cryptography. The main topic of this thesis is the design and security analysis of the most ...

  12. A Reusable Software Copy Protection Using Hash Result and Asymetrical Encryption

    Directory of Open Access Journals (Sweden)

    Aswin Wibisurya

    2014-12-01

    Full Text Available Desktop applications are among the most popular types of application used on computers, due to their one-time installation simplicity and their quick accessibility from the moment the computer is turned on. Limiting the copying and usage of desktop applications has long been an important issue for application providers. For security reasons, software copy protection is usually integrated with the application. However, developers seek to reuse the copy protection component of the software. This paper proposes an approach to reusable software copy protection which consists of a certificate validator on the client computer and a certificate generator on the server. The certificate validator's integrity is protected using a hash result, while all communications are encrypted using asymmetric encryption to ensure the security of this approach.

  13. Comparison of Various Similarity Measures for Average Image Hash in Mobile Phone Application

    Science.gov (United States)

    Farisa Chaerul Haviana, Sam; Taufik, Muhammad

    2017-04-01

    One of the main issues in Content Based Image Retrieval (CBIR) is the similarity measure for the resulting image hashes. The key challenge is to find the most beneficial distance or similarity measure for calculating similarity in terms of speed and computing cost, especially on devices with limited computing capabilities such as mobile phones. In this study we compare twelve of the most common and popular distance or similarity measures, implemented in a mobile phone application. The results show that all similarity measures implemented in this study performed equally well in the mobile phone application. This gives more possibilities for method combinations to be implemented for image retrieval.
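
    The average hash (aHash) whose similarity measures are being compared works roughly as follows (a minimal sketch on an 8x8 grayscale grid, assuming the image has already been downscaled): pixels above the mean become 1-bits, and two hashes are compared with a normalized Hamming similarity.

      def average_hash(gray_8x8):
          # gray_8x8: 64 grayscale values of a downscaled image.
          mean = sum(gray_8x8) / len(gray_8x8)
          return sum(1 << i for i, p in enumerate(gray_8x8) if p > mean)

      def hamming_similarity(h1, h2, bits=64):
          return 1.0 - bin(h1 ^ h2).count("1") / bits

      img_a = [10 * (i % 8) + (i // 8) for i in range(64)]   # toy gradient image
      img_b = [v + 2 for v in img_a]                         # slightly brightened copy
      print(hamming_similarity(average_hash(img_a), average_hash(img_b)))   # 1.0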

  14. Classification of cognitive systems dedicated to data sharing

    Science.gov (United States)

    Ogiela, Lidia; Ogiela, Marek R.

    2017-08-01

    This paper presents a classification of new cognitive information systems dedicated to cryptographic data splitting and sharing processes. Cognitive processes of semantic data analysis and interpretation will be used to describe new classes of intelligent information and vision systems. In addition, cryptographic data splitting algorithms and cryptographic threshold schemes will be used to improve processes of secure and efficient information management with the application of such cognitive systems. The utility of the proposed cognitive sharing procedures and distributed data sharing algorithms will also be presented. A few possible applications of cognitive approaches for visual information management and encryption will also be described.
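
    One of the simplest cryptographic data-splitting primitives that such systems can build on is XOR-based secret splitting, shown here as a generic n-of-n illustration (not the authors' cognitive or threshold scheme): every share alone is indistinguishable from random noise, and only the XOR of all shares restores the data.

      import os
      from functools import reduce

      def split(secret, n):
          # Split into n shares; all n are required for reconstruction.
          shares = [os.urandom(len(secret)) for _ in range(n - 1)]
          last = bytes(reduce(lambda a, b: a ^ b, row) for row in zip(secret, *shares))
          return shares + [last]

      def combine(shares):
          return bytes(reduce(lambda a, b: a ^ b, row) for row in zip(*shares))

      shares = split(b"patient record #42", 3)
      print(combine(shares))        # b'patient record #42'
      print(combine(shares[:2]))    # random-looking bytes without the last share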

  15. Investigation of Current State of Crytpography and Theoretical Implementation of a Cryptographic System for the Combat Service Support Control System.

    Science.gov (United States)

    1987-05-01

    Advances in Cryptology: Proceedings of CRYPTO 84, ed. by G.R. Blakely and D. Chaum. [Wagn84b] Wagner, Neal R., "... in Distributed Computer Systems," IEEE Trans. on Computers, Vol. C-35, No. 7, Jul. 86, pp. 583-590. Gifford, David K., "Cryptographic Sealing for ..."

  16. MULTIMEDIA DATA TRANSMISSION THROUGH TCP/IP USING HASH BASED FEC WITH AUTO-XOR SCHEME

    Directory of Open Access Journals (Sweden)

    R. Shalin

    2012-09-01

    Full Text Available The most preferred mode for communication of multimedia data is through the TCP/IP protocol. But on the other hand, the TCP/IP protocol produces huge packet loss, unavoidable due to network traffic and congestion. In order to provide an efficient communication it is necessary to recover the loss of packets. The proposed scheme implements hash-based FEC with an auto-XOR scheme for this purpose. The scheme is implemented through forward error correction, MD5 and XOR to provide efficient transmission of multimedia data. The proposed scheme provides high transmission accuracy and throughput with low latency and loss.

  17. Energy efficient security in MANETs: a comparison of cryptographic and artificial immune systems

    International Nuclear Information System (INIS)

    Mazhar, N.

    2010-01-01

    MANET is characterized by a set of mobile nodes in an inherently insecure environment, having limited battery capacities. Provisioning of energy efficient security in MANETs is, therefore, an open problem for which a number of solutions have been proposed. In this paper, we present an overview and comparison of the MANET security at routing layer by using the cryptographic and Artificial Immune System (AIS) approaches. The BeeAdHoc protocol, which is a Bio-inspired MANET routing protocol based on the foraging principles of honey bee colony, is taken as case study. We carry out an analysis of the three security frameworks that we have proposed earlier for securing BeeAdHoc protocol; one based on asymmetric key encryption, i.e BeeSec, and the other two using the AIS approach, i.e BeeAIS based on self non-self discrimination from adaptive immune system and BeeAIS-DC based on Dendritic Cell (DC) behavior from innate immune system. We extensively evaluate the performance of the three protocols through network simulations in ns-2 and compare with BeeAdHoc, the base protocol, as well as with state-of-the-art MANET routing protocols DSR and AODV. Our results clearly indicate that AIS based systems provide security at much lower cost to energy as compared with the cryptographic systems. Moreover, the use of dendritic cells and danger signals instead of the classical self non-self discrimination allows to detect the non-self antigens with greater accuracy. Based on the results of this investigation, we also propose a composite AIS model for BeeAdHoc security by combining the concepts from both the adaptive and the innate immune systems by modelling the attributes and behavior of the B-cells and DCs. (author)

  18. Secure Network Coding against Wiretapping and Byzantine Attacks

    Directory of Open Access Journals (Sweden)

    Qin Guo

    2010-01-01

    Full Text Available In wireless networks, an attacker can tune a receiver and tap the communication between two nodes. Whether or not some meaningful information is obtained by tapping a wireless connection depends on the transmission scheme. In this paper, we design some secure network coding by combining information-theoretic approaches with cryptographic approaches. It ensures that the wiretapper cannot get any meaningful information no matter how many channels are wiretapped. In addition, if each source packet is augmented with a hash symbol which is computed from a simple nonlinear polynomial function of the data symbols, then the probability of detecting the modification is very high.

  19. Architecture for the Secret-Key BC3 Cryptography Algorithm

    Directory of Open Access Journals (Sweden)

    Arif Sasongko

    2014-11-01

    Full Text Available Cryptography is a very important aspect of data security. The focus of research in this field is shifting from merely the security aspect to considering the implementation aspect as well. This paper aims to introduce the BC3 algorithm with a focus on its hardware implementation. It proposes an architecture for the hardware implementation of this algorithm. BC3 is a secret-key cryptography algorithm developed with two considerations: robustness and implementation efficiency. The algorithm has been implemented in software and has good performance compared to the AES algorithm. BC3 is an improvement of the BC2 and AE cryptographic algorithms and is expected to have the same level of robustness and to gain competitive advantages in the implementation aspect. The development of the architecture pays much attention to (1) resource sharing and (2) having a single clock for each round. It exploits the regularity of the algorithm. This architecture was then implemented on an FPGA. The implementation occupies three times less area than AES, but is about five times faster. Furthermore, this BC3 hardware implementation has better performance than the BC3 software, both in the key expansion stage and in the randomizing stage. In the future, the security of this implementation must be reviewed, especially against side channel attacks.

  20. Experimental realization of Shor's quantum factoring algorithm using nuclear magnetic resonance.

    Science.gov (United States)

    Vandersypen, L M; Steffen, M; Breyta, G; Yannoni, C S; Sherwood, M H; Chuang, I L

    The number of steps any classical computer requires in order to find the prime factors of an l-digit integer N increases exponentially with l, at least using algorithms known at present. Factoring large integers is therefore conjectured to be intractable classically, an observation underlying the security of widely used cryptographic codes. Quantum computers, however, could factor integers in only polynomial time, using Shor's quantum factoring algorithm. Although important for the study of quantum computers, experimental demonstration of this algorithm has proved elusive. Here we report an implementation of the simplest instance of Shor's algorithm: factorization of N = 15 (whose prime factors are 3 and 5). We use seven spin-1/2 nuclei in a molecule as quantum bits, which can be manipulated with room temperature liquid-state nuclear magnetic resonance techniques. This method of using nuclei to store quantum information is in principle scalable to systems containing many quantum bits, but such scalability is not implied by the present work. The significance of our work lies in the demonstration of experimental and theoretical techniques for precise control and modelling of complex quantum computers. In particular, we present a simple, parameter-free but predictive model of decoherence effects in our system.

  1. Evaluation of Four Encryption Algorithms for Viability, Reliability and Performance Estimation

    Directory of Open Access Journals (Sweden)

    J. B. Awotunde

    2016-12-01

    Full Text Available Data and information in storage, in transit or during processing are found in various computers and computing devices with a wide range of hardware specifications. Cryptography is the practice of using codes to encrypt and decrypt data. It enables one to store sensitive information or transmit it across computers in a more secure way so that it cannot be read by anyone except the intended receiver. Cryptography also allows secure storage of sensitive data on any computer. Cryptography as an approach to computer security comes at a cost in terms of resource utilization, such as time, memory and CPU time, which in some cases may not be abundant enough to achieve the objective of protecting data. This work examined the memory usage, key sizes, CPU utilization time and encryption speed of the four algorithms to determine the amount of computing resource expended and how long it takes each algorithm to complete its task. Results show that the key length of a cryptographic algorithm is proportional to its resource utilization in most cases, as observed for the Blowfish, AES, 3DES and DES algorithms respectively. Further research can be carried out to determine the power utilization of each of these algorithms.

  2. A Secure Alignment Algorithm for Mapping Short Reads to Human Genome.

    Science.gov (United States)

    Zhao, Yongan; Wang, Xiaofeng; Tang, Haixu

    2018-05-09

    The elastic and inexpensive computing resources such as clouds have been recognized as a useful solution to analyzing massive human genomic data (e.g., acquired by using next-generation sequencers) in biomedical researches. However, outsourcing human genome computation to public or commercial clouds was hindered due to privacy concerns: even a small number of human genome sequences contain sufficient information for identifying the donor of the genomic data. This issue cannot be directly addressed by existing security and cryptographic techniques (such as homomorphic encryption), because they are too heavyweight to carry out practical genome computation tasks on massive data. In this article, we present a secure algorithm to accomplish the read mapping, one of the most basic tasks in human genomic data analysis based on a hybrid cloud computing model. Comparing with the existing approaches, our algorithm delegates most computation to the public cloud, while only performing encryption and decryption on the private cloud, and thus makes the maximum use of the computing resource of the public cloud. Furthermore, our algorithm reports similar results as the nonsecure read mapping algorithms, including the alignment between reads and the reference genome, which can be directly used in the downstream analysis such as the inference of genomic variations. We implemented the algorithm in C++ and Python on a hybrid cloud system, in which the public cloud uses an Apache Spark system.

  3. Physically unclonable cryptographic primitives using self-assembled carbon nanotubes

    Science.gov (United States)

    Hu, Zhaoying; Comeras, Jose Miguel M. Lobez; Park, Hongsik; Tang, Jianshi; Afzali, Ali; Tulevski, George S.; Hannon, James B.; Liehr, Michael; Han, Shu-Jen

    2016-06-01

    Information security underpins many aspects of modern society. However, silicon chips are vulnerable to hazards such as counterfeiting, tampering and information leakage through side-channel attacks (for example, by measuring power consumption, timing or electromagnetic radiation). Single-walled carbon nanotubes are a potential replacement for silicon as the channel material of transistors due to their superb electrical properties and intrinsic ultrathin body, but problems such as limited semiconducting purity and non-ideal assembly still need to be addressed before they can deliver high-performance electronics. Here, we show that by using these inherent imperfections, an unclonable electronic random structure can be constructed at low cost from carbon nanotubes. The nanotubes are self-assembled into patterned HfO2 trenches using ion-exchange chemistry, and the width of the trench is optimized to maximize the randomness of the nanotube placement. With this approach, two-dimensional (2D) random bit arrays are created that can offer ternary-bit architecture by determining the connection yield and switching type of the nanotube devices. As a result, our cryptographic keys provide a significantly higher level of security than conventional binary-bit architecture with the same key size.

  4. An update on the side channel cryptanalysis of MACs based on cryptographic hash functions

    DEFF Research Database (Denmark)

    Gauravaram, Praveen; Okeya, Katsuyuki

    2007-01-01

    Okeya has established that HMAC/NMAC implementations based only on the Matyas-Meyer-Oseas (MMO) PGV scheme and his two refined PGV schemes are secure against side channel DPA attacks when the block cipher in these constructions is secure against these attacks. The significant result of Okeya's analys...
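
    For context, the HMAC envelope that these side-channel results concern can be written out explicitly (a generic sketch of the standard construction, not the MMO-specific analysis): the key is padded, XORed with the ipad/opad constants, and the message passes through two nested hash calls.

      import hashlib, hmac

      def hmac_sha256(key, msg):
          block = 64                                    # SHA-256 block size in bytes
          if len(key) > block:
              key = hashlib.sha256(key).digest()
          key = key.ljust(block, b"\x00")
          ipad = bytes(b ^ 0x36 for b in key)
          opad = bytes(b ^ 0x5C for b in key)
          inner = hashlib.sha256(ipad + msg).digest()
          return hashlib.sha256(opad + inner).digest()

      key, msg = b"secret key", b"message to authenticate"
      assert hmac_sha256(key, msg) == hmac.new(key, msg, hashlib.sha256).digest()
      print(hmac_sha256(key, msg).hex())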

  5. Opportunities in white-box cryptography

    NARCIS (Netherlands)

    Michiels, W.

    White-box cryptography is the discipline of implementing a cryptographic algorithm in software such that an adversary will have difficulty extracting the cryptographic key. This approach assumes that the adversary has full access to and full control over the implementation's execution. White-box

  6. An Efficient Biometric-Based Algorithm Using Heart Rate Variability for Securing Body Sensor Networks.

    Science.gov (United States)

    Pirbhulal, Sandeep; Zhang, Heye; Mukhopadhyay, Subhas Chandra; Li, Chunyue; Wang, Yumei; Li, Guanglin; Wu, Wanqing; Zhang, Yuan-Ting

    2015-06-26

    Body Sensor Network (BSN) is a network of several associated sensor nodes on, inside or around the human body to monitor vital signals, such as, Electroencephalogram (EEG), Photoplethysmography (PPG), Electrocardiogram (ECG), etc. Each sensor node in BSN delivers major information; therefore, it is very significant to provide data confidentiality and security. All existing approaches to secure BSN are based on complex cryptographic key generation procedures, which not only demands high resource utilization and computation time, but also consumes large amount of energy, power and memory during data transmission. However, it is indispensable to put forward energy efficient and computationally less complex authentication technique for BSN. In this paper, a novel biometric-based algorithm is proposed, which utilizes Heart Rate Variability (HRV) for simple key generation process to secure BSN. Our proposed algorithm is compared with three data authentication techniques, namely Physiological Signal based Key Agreement (PSKA), Data Encryption Standard (DES) and Rivest Shamir Adleman (RSA). Simulation is performed in Matlab and results suggest that proposed algorithm is quite efficient in terms of transmission time utilization, average remaining energy and total power consumption.

  7. Embedded Platform for Automatic Testing and Optimizing of FPGA Based Cryptographic True Random Number Generators

    Directory of Open Access Journals (Sweden)

    M. Varchola

    2009-12-01

    Full Text Available This paper deals with an evaluation platform for cryptographic True Random Number Generators (TRNGs) based on the hardware implementation of statistical tests for FPGAs. It was developed in order to provide an automatic tool that helps to speed up the TRNG design process and can provide new insights into TRNG behavior, as will be shown on a particular example in the paper. It enables testing of sufficient statistical properties of various TRNG designs under various working conditions on the fly. Moreover, the tests are suitable for embedding into cryptographic hardware products in order to recognize TRNG output of weak quality and thus increase robustness and reliability. The tests are fully compatible with the FIPS 140 standard and are implemented in the VHDL language as an IP core for vendor-independent FPGAs. A recent Flash-based Actel Fusion FPGA was chosen for preliminary experiments. The Actel version of the tests possesses an interface to Actel's CoreMP7 softcore processor, which is fully compatible with the industry-standard ARM7TDMI. Moreover, an identical test suite was implemented on the Xilinx Virtex 2 and 5 in order to compare the performance of the proposed solution with that of an already published one based on the same FPGAs. Clock frequencies 25% and 65% greater, respectively, were achieved while consuming almost equal resources of the Xilinx FPGAs. On top of that, the proposed FIPS 140 architecture is capable of processing one random bit per clock cycle, which results in a throughput of 311.5 Mbps on the Virtex 5 FPGA.
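
    The flavour of such embedded statistical tests can be conveyed with a software version of the monobit test (a sketch only; the acceptance band shown is the commonly cited FIPS 140-2 bound for a 20,000-bit sample and should be checked against the standard before use): the number of 1-bits must stay within a narrow band around 10,000.

      import os

      def monobit_test(bits):
          # FIPS 140-style monobit test on a 20,000-bit sample.
          assert len(bits) == 20000
          ones = sum(bits)
          return 9725 < ones < 10275        # commonly cited FIPS 140-2 acceptance band

      sample = os.urandom(2500)             # 20,000 bits from the OS RNG, for illustration
      bits = [(byte >> i) & 1 for byte in sample for i in range(8)]
      print(monobit_test(bits))             # expected True for a healthy generator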

  8. A novel image encryption algorithm based on a 3D chaotic map

    Science.gov (United States)

    Kanso, A.; Ghebleh, M.

    2012-07-01

    Recently, Solak et al. [Solak E, Çokal C, Yildiz OT, Biyikoǧlu T. Cryptanalysis of Fridrich's chaotic image encryption. Int J Bifur Chaos 2010;20:1405-1413] cryptanalyzed the chaotic image encryption algorithm of [Fridrich J. Symmetric ciphers based on two-dimensional chaotic maps. Int J Bifur Chaos 1998;8(6):1259-1284], which was considered a benchmark for measuring the security of many image encryption algorithms. This attack can also be applied to other encryption algorithms that have a structure similar to Fridrich's algorithm, such as that of [Chen G, Mao Y, Chui C. A symmetric image encryption scheme based on 3D chaotic cat maps. Chaos Soliton Fract 2004;21:749-761]. In this paper, we suggest a novel image encryption algorithm based on a three dimensional (3D) chaotic map that can defeat the aforementioned attack, among other existing attacks. The design of the proposed algorithm is simple and efficient, and based on three phases which provide the necessary properties for a secure image encryption algorithm, including the confusion and diffusion properties. In phase I, the image pixels are shuffled according to a search rule based on the 3D chaotic map. In phases II and III, 3D chaotic maps are used to scramble the shuffled pixels through mixing and masking rules, respectively. Simulation results show that the suggested algorithm satisfies the required performance tests, such as high level security, large key space and acceptable encryption speed. These characteristics make it a suitable candidate for use in cryptographic applications.
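
    A much simplified, one-dimensional cousin of such chaos-based masking is shown below (a toy logistic-map keystream, purely illustrative and neither the 3D map of the paper nor a secure cipher): the key is the initial condition, and the chaotic orbit is quantized into a byte stream that is XORed with the pixel values.

      def logistic_keystream(x0, n, r=3.99):
          # Iterate the logistic map and quantize each state to one byte.
          x, stream = x0, []
          for _ in range(n):
              x = r * x * (1.0 - x)
              stream.append(int(x * 256) & 0xFF)
          return stream

      def xor_mask(data, x0):
          ks = logistic_keystream(x0, len(data))
          return bytes(d ^ k for d, k in zip(data, ks))

      pixels = bytes(range(16))
      masked = xor_mask(pixels, x0=0.3141592653)
      print(xor_mask(masked, x0=0.3141592653) == pixels)   # True: XOR masking is symmetric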

  9. LSHSIM: A Locality Sensitive Hashing based method for multiple-point geostatistics

    Science.gov (United States)

    Moura, Pedro; Laber, Eduardo; Lopes, Hélio; Mesejo, Daniel; Pavanelli, Lucas; Jardim, João; Thiesen, Francisco; Pujol, Gabriel

    2017-10-01

    Reservoir modeling is a very important task that permits the representation of a geological region of interest, so as to generate a considerable number of possible scenarios. Since its inception, many methodologies have been proposed and, in the last two decades, multiple-point geostatistics (MPS) has been the dominant one. This methodology is strongly based on the concept of training image (TI) and the use of its characteristics, which are called patterns. In this paper, we propose a new MPS method that combines the application of a technique called Locality Sensitive Hashing (LSH), which permits to accelerate the search for patterns similar to a target one, with a Run-Length Encoding (RLE) compression technique that speeds up the calculation of the Hamming similarity. Experiments with both categorical and continuous images show that LSHSIM is computationally efficient and produce good quality realizations. In particular, for categorical data, the results suggest that LSHSIM is faster than MS-CCSIM, one of the state-of-the-art methods.

  10. Fast parallel molecular algorithms for DNA-based computation: solving the elliptic curve discrete logarithm problem over GF2.

    Science.gov (United States)

    Li, Kenli; Zou, Shuting; Xv, Jin

    2008-01-01

    Elliptic curve cryptographic algorithms convert input data to unrecognizable encryption and the unrecognizable data back again into its original decrypted form. The security of this form of encryption hinges on the enormous difficulty that is required to solve the elliptic curve discrete logarithm problem (ECDLP), especially over GF(2^n), n in Z+. This paper describes an effective method to find solutions to the ECDLP by means of a molecular computer. We propose that this research accomplishment would represent a breakthrough for applied biological computation and this paper demonstrates that in principle this is possible. Three DNA-based algorithms: a parallel adder, a parallel multiplier, and a parallel inverse over GF(2^n) are described. The biological operation time of all of these algorithms is polynomial with respect to n. Considering this analysis, cryptography using a public key might be less secure. In this respect, a principal contribution of this paper is to provide enhanced evidence of the potential of molecular computing to tackle such ambitious computations.

  11. Threshold quantum cryptograph based on Grover's algorithm

    International Nuclear Information System (INIS)

    Du Jianzhong; Qin Sujuan; Wen Qiaoyan; Zhu Fuchen

    2007-01-01

    We propose a threshold quantum protocol based on Grover's operator and a permutation operator on one two-qubit signal. The protocol is secure because the dishonest parties can extract only 2 of the 3 bits of information about the operation on one two-qubit signal, while having to introduce an error probability of 3/8. The protocol includes a detection scheme to resist the Trojan horse attack. With probability 1/2, the detection scheme can detect a multi-qubit signal that is used to replace a single-qubit signal, while it leaves every legitimate qubit invariant.

  12. Cryptographically Secure Multiparty Computation and Distributed Auctions Using Homomorphic Encryption

    Directory of Open Access Journals (Sweden)

    Anunay Kulshrestha

    2017-12-01

    Full Text Available We introduce a robust framework that allows for cryptographically secure multiparty computations, such as distributed private value auctions. The security is guaranteed by two-sided authentication of all network connections, homomorphically encrypted bids, and the publication of zero-knowledge proofs of every computation. This also allows a non-participant verifier to verify the result of any such computation using only the information broadcasted on the network by each individual bidder. Building on previous work on such systems, we design and implement an extensible framework that puts the described ideas to practice. Apart from the actual implementation of the framework, our biggest contribution is the level of protection we are able to guarantee from attacks described in previous work. In order to provide guidance to users of the library, we analyze the use of zero knowledge proofs in ensuring the correct behavior of each node in a computation. We also describe the usage of the library to perform a private-value distributed auction, as well as the other challenges in implementing the protocol, such as auction registration and certificate distribution. Finally, we provide performance statistics on our implementation of the auction.
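
    The additively homomorphic property that makes encrypted bid aggregation possible can be demonstrated with a toy Paillier instance (the textbook construction with deliberately tiny primes, not the framework's implementation and far too small to be secure; requires Python 3.9+ for math.lcm and the modular inverse via pow): multiplying two ciphertexts yields an encryption of the sum of the bids.

      import math, random

      # Toy Paillier keypair (insecure parameter sizes, for illustration only).
      p, q = 293, 433
      n, n2 = p * q, (p * q) ** 2
      g = n + 1
      lam = math.lcm(p - 1, q - 1)
      L = lambda x: (x - 1) // n
      mu = pow(L(pow(g, lam, n2)), -1, n)

      def encrypt(m):
          r = random.randrange(1, n)
          while math.gcd(r, n) != 1:
              r = random.randrange(1, n)
          return (pow(g, m, n2) * pow(r, n, n2)) % n2

      def decrypt(c):
          return (L(pow(c, lam, n2)) * mu) % n

      bid1, bid2 = 4200, 1337
      c = (encrypt(bid1) * encrypt(bid2)) % n2    # homomorphic addition of bids
      print(decrypt(c))                           # 5537, without decrypting either bid alone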

  13. A Fast Approximate Algorithm for Mapping Long Reads to Large Reference Databases.

    Science.gov (United States)

    Jain, Chirag; Dilthey, Alexander; Koren, Sergey; Aluru, Srinivas; Phillippy, Adam M

    2018-04-30

    Emerging single-molecule sequencing technologies from Pacific Biosciences and Oxford Nanopore have revived interest in long-read mapping algorithms. Alignment-based seed-and-extend methods demonstrate good accuracy, but face limited scalability, while faster alignment-free methods typically trade decreased precision for efficiency. In this article, we combine a fast approximate read mapping algorithm based on minimizers with a novel MinHash identity estimation technique to achieve both scalability and precision. In contrast to prior methods, we develop a mathematical framework that defines the types of mapping targets we uncover, establish probabilistic estimates of p-value and sensitivity, and demonstrate tolerance for alignment error rates up to 20%. With this framework, our algorithm automatically adapts to different minimum length and identity requirements and provides both positional and identity estimates for each mapping reported. For mapping human PacBio reads to the hg38 reference, our method is 290 × faster than Burrows-Wheeler Aligner-MEM with a lower memory footprint and recall rate of 96%. We further demonstrate the scalability of our method by mapping noisy PacBio reads (each ≥5 kbp in length) to the complete NCBI RefSeq database containing 838 Gbp of sequence and >60,000 genomes.
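
    The MinHash identity estimation at the heart of the method can be sketched as follows (a generic bottom-k k-mer MinHash example, not the authors' code): each sequence is reduced to its smallest k-mer hash values, and the overlap of these sketches estimates the Jaccard similarity from which sequence identity can be derived.

      import hashlib

      def kmers(seq, k=5):
          return {seq[i:i + k] for i in range(len(seq) - k + 1)}

      def minhash_sketch(seq, k=5, size=64):
          hashes = sorted(int.from_bytes(hashlib.sha1(km.encode()).digest()[:8], "big")
                          for km in kmers(seq, k))
          return set(hashes[:size])              # keep the `size` smallest hash values

      def jaccard_estimate(s1, s2, size=64):
          merged = sorted(s1 | s2)[:size]        # bottom-k of the union
          return sum(h in s1 and h in s2 for h in merged) / len(merged)

      a = "ACGTACGTTACGGATCCATGCATGCAAGGTT"
      b = "ACGTACGTTACGGATCCATGCATGCAAGGAA"
      print(jaccard_estimate(minhash_sketch(a), minhash_sketch(b)))   # close to 1 for similar reads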

  14. Data protection by using the «Сhua’s circuit » chaos generator

    Directory of Open Access Journals (Sweden)

    Тетяна Олександрівна Левицька

    2017-07-01

    Full Text Available This article focuses on justifying the use of cryptosystems based on a mathematical model of the chaos generator (an electric circuit exhibiting modes of chaotic oscillation, proposed by Leon Chua in 1983). It also describes the principles of implementation of the cryptographic algorithm and its application prospects. The following questions are reviewed: the problems of widespread cryptosystems, the theory of cryptographically strong algorithms, absolutely and computationally secure ciphers, and a particular theoretical method for increasing the reliability of hybrid computational proof systems by including a mathematical model of chaos as a generator for encrypting the transmitted data key. Recommendations on the implementation of the cryptographic system and requirements on the Chua's circuit generator are also described here

  15. Evaluating privacy-preserving record linkage using cryptographic long-term keys and multibit trees on large medical datasets.

    Science.gov (United States)

    Brown, Adrian P; Borgs, Christian; Randall, Sean M; Schnell, Rainer

    2017-06-08

    Integrating medical data using databases from different sources by record linkage is a powerful technique increasingly used in medical research. Under many jurisdictions, unique personal identifiers needed for linking the records are unavailable. Since sensitive attributes, such as names, have to be used instead, privacy regulations usually demand encrypting these identifiers. The corresponding set of techniques for privacy-preserving record linkage (PPRL) has received widespread attention. One recent method is based on Bloom filters. Due to superior resilience against cryptographic attacks, composite Bloom filters (cryptographic long-term keys, CLKs) are considered best practice for privacy in PPRL. Real-world performance of these techniques using large-scale data is unknown up to now. Using a large subset of Australian hospital admission data, we tested the performance of an innovative PPRL technique (CLKs using multibit trees) against a gold-standard derived from clear-text probabilistic record linkage. Linkage time and linkage quality (recall, precision and F-measure) were evaluated. Clear text probabilistic linkage resulted in marginally higher precision and recall than CLKs. PPRL required more computing time but 5 million records could still be de-duplicated within one day. However, the PPRL approach required fine tuning of parameters. We argue that increased privacy of PPRL comes with the price of small losses in precision and recall and a large increase in computational burden and setup time. These costs seem to be acceptable in most applied settings, but they have to be considered in the decision to apply PPRL. Further research on the optimal automatic choice of parameters is needed.
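
    A hedged sketch of the Bloom-filter encoding behind cryptographic long-term keys (generic bigram hashing with made-up parameters, not the exact configuration evaluated in the study): similar names set similar bit positions, so encoded records can still be compared with a Dice score.

```python
import hashlib

def bigrams(value: str) -> list[str]:
    padded = f"_{value.lower()}_"               # pad so first/last characters also form bigrams
    return [padded[i:i + 2] for i in range(len(padded) - 1)]

def clk_encode(fields: list[str], size: int = 1000, k: int = 20, secret: str = "shared-key") -> set[int]:
    """Set k keyed-hash bit positions per bigram of every identifying field."""
    bits = set()
    for field in fields:
        for gram in bigrams(field):
            for i in range(k):
                digest = hashlib.sha256(f"{secret}|{i}|{gram}".encode()).digest()
                bits.add(int.from_bytes(digest[:4], "big") % size)
    return bits

def dice(a: set[int], b: set[int]) -> float:
    """Dice coefficient between two bit sets, the usual CLK similarity score."""
    return 2 * len(a & b) / (len(a) + len(b))

print(dice(clk_encode(["john", "smith"]), clk_encode(["jon", "smith"])))   # high similarity
print(dice(clk_encode(["john", "smith"]), clk_encode(["maria", "lopez"]))) # low similarity
```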

  16. Combination of Rivest-Shamir-Adleman Algorithm and End of File Method for Data Security

    Science.gov (United States)

    Rachmawati, Dian; Amalia, Amalia; Elviwani

    2018-03-01

    Data security is one of the crucial issues in the delivery of information. One way to secure data is to encode it into something that is not comprehensible by human beings using cryptographic techniques. The Rivest-Shamir-Adleman (RSA) cryptographic algorithm has been proven robust for securing messages. Since this algorithm uses two different keys (i.e., a public key and a private key) for encryption and decryption, it is classified as an asymmetric cryptography algorithm. Steganography is a method used to secure a message by inserting the bits of the message into a larger medium such as an image. One of the known steganography methods is End of File (EoF). In this research, the ciphertext resulting from the RSA algorithm is compiled into an array and appended to the end of the image. The result of the EoF method is an image with a line of black gradations under it; this line contains the secret message. This combination of cryptography and steganography is expected to increase the security of the message, since the message encryption technique (RSA) is combined with the data hiding technique (EoF).
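
    A minimal sketch of the End of File idea, assuming the RSA ciphertext bytes are already available (the marker and length field are illustrative choices, not taken from the paper): the payload is appended after the image data and sliced back off the tail during extraction.

```python
MARKER = b"EOFSTEGO"          # illustrative marker, assumed not to occur inside the payload

def eof_embed(image_bytes: bytes, ciphertext: bytes) -> bytes:
    """Append marker + 4-byte payload length + ciphertext after the image data."""
    return image_bytes + MARKER + len(ciphertext).to_bytes(4, "big") + ciphertext

def eof_extract(stego_bytes: bytes) -> bytes:
    pos = stego_bytes.rindex(MARKER)                        # locate the appended block
    start = pos + len(MARKER) + 4
    length = int.from_bytes(stego_bytes[pos + len(MARKER):start], "big")
    return stego_bytes[start:start + length]

image = b"\x89PNG\r\n\x1a\n" + bytes(64)   # stand-in for real image bytes
secret = b"rsa-ciphertext-bytes"
stego = eof_embed(image, secret)
assert eof_extract(stego) == secret
```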

  17. Maximize Minimum Utility Function of Fractional Cloud Computing System Based on Search Algorithm Utilizing the Mittag-Leffler Sum

    Directory of Open Access Journals (Sweden)

    Rabha W. Ibrahim

    2018-01-01

    Full Text Available The maximum min utility function (MMUF) problem is an important representative of a large class of cloud computing systems (CCS), with numerous applications in practice, especially in economics and industry. This paper introduces an effective solution-based search (SBS) algorithm for solving the MMUF problem. First, we suggest a new formula for the utility function in terms of the capacity of the cloud. We formulate the capacity in CCS using a fractional diffeo-integral equation, which usually describes the flow of the CCS. The new formula of the utility function modifies recent active utility functions. The suggested technique first creates a high-quality initial solution by eliminating the less promising components, and then improves the quality of the achieved solution by the summation search solution (SSS). This method uses the Mittag-Leffler sum as a hash function to determine the position of the agent. Experimental results on benchmarks commonly utilized in the literature demonstrate that the proposed algorithm competes favorably with state-of-the-art algorithms in terms of both solution quality and computational efficiency.
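
    The Mittag-Leffler sum referred to above is the one-parameter series E_alpha(z) = sum_{k>=0} z^k / Gamma(alpha*k + 1). A minimal sketch of evaluating a truncated version in Python follows; the truncation depth is an illustrative choice, not a value from the paper.

```python
from math import gamma

def mittag_leffler(alpha: float, z: float, terms: int = 50) -> float:
    """Truncated one-parameter Mittag-Leffler sum E_alpha(z)."""
    # E_alpha(z) = sum_{k=0}^{inf} z**k / Gamma(alpha*k + 1); we keep the first `terms` terms.
    return sum(z**k / gamma(alpha * k + 1) for k in range(terms))

# For alpha = 1 the series reduces to exp(z), a quick sanity check.
print(mittag_leffler(1.0, 2.0))   # ~7.389
print(mittag_leffler(0.5, 1.0))   # value for alpha = 0.5
```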

  18. Implementasi Algoritma Kriptografi RSA untuk Enkripsi dan Dekripsi Email

    Directory of Open Access Journals (Sweden)

    Albert Ginting

    2015-04-01

    Full Text Available In the world of the Internet, nothing is completely safe; there is always a gap in any application, and email delivery is no exception. To minimize attacks on data transmission, cryptography is usually applied. One fairly popular cryptographic algorithm is RSA. This study discusses the implementation of the RSA cryptographic algorithm for the encryption and decryption of email. For testing, a Java-based email client program with message encryption and decryption features was created. The application uses the Java programming language with NetBeans 7.4 as the editor, and Google Mail as the mail server. The first step of this study was to download email from the Google server and encrypt the message. The second step was to decrypt the message to verify whether it was still identical to the original message before encryption. The result of this study is an application that can encrypt and decrypt messages using the RSA cryptographic algorithm. With this application, mail delivery is expected to be much safer, because the encrypted email is rendered as random decimal numbers of unknown value.
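
    For reference, a toy sketch of the RSA primitive the study builds on (textbook RSA on tiny primes, not the authors' Java client; real keys need at least 2048-bit moduli and proper padding such as OAEP):

```python
# Textbook RSA on toy parameters, for illustration only.
p, q = 61, 53                      # illustrative small primes
n = p * q                          # modulus
phi = (p - 1) * (q - 1)
e = 17                             # public exponent, coprime with phi
d = pow(e, -1, phi)                # private exponent (modular inverse, Python 3.8+)

message = 42                       # integer-encoded plaintext, must be < n
ciphertext = pow(message, e, n)    # encryption: c = m^e mod n
recovered = pow(ciphertext, d, n)  # decryption: m = c^d mod n
assert recovered == message
print(n, ciphertext, recovered)
```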

  19. EFFICIENT ADAPTIVE STEGANOGRAPHY FOR COLOR IMAGESBASED ON LSBMR ALGORITHM

    Directory of Open Access Journals (Sweden)

    B. Sharmila

    2012-02-01

    Full Text Available Steganography is the art of hiding the fact that communication is taking place by hiding information in another medium. Many different carrier file formats can be used, but digital images are the most popular because of their frequent use on the Internet. A large variety of steganographic techniques exists for hiding secret information in images. The Least Significant Bit (LSB) based approach is the simplest type of steganographic algorithm. In most existing approaches, the region within a cover image is chosen without considering the relationship between the image content and the size of the secret message, so the plain regions of the cover are degraded after data hiding even at a low data rate. Choosing edge regions for data hiding is therefore a solution, and many algorithms use image edges for this purpose. The paper 'Edge adaptive image steganography based on LSBMR algorithm' presented results for gray-scale images only. This paper analyzes the performance of edge-adaptive steganography for colored (JPEG) images. The algorithms have been slightly modified for color images and are compared on the basis of evaluation parameters such as peak signal-to-noise ratio (PSNR) and mean square error (MSE). The method selects edge regions depending on the length of the secret message and the difference between two consecutive pixel values in the cover image. When the message is short, only small edge regions are utilized while other regions are left as they are; when the data rate increases, more regions can be used adaptively for data hiding by adjusting the parameters. In addition, the message is encrypted using an efficient cryptographic algorithm, which further increases security.
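
    A minimal sketch of the plain LSB baseline that the edge-adaptive LSBMR scheme improves on (pixels are modeled as a flat list of 8-bit intensities; this is not the paper's adaptive region selection):

```python
def lsb_embed(pixels: list[int], message: bytes) -> list[int]:
    """Replace the least significant bit of successive pixels with message bits."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for this message")
    stego = pixels[:]
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # clear the LSB, then set it to the message bit
    return stego

def lsb_extract(pixels: list[int], n_bytes: int) -> bytes:
    """Read back n_bytes worth of LSBs."""
    bits = [p & 1 for p in pixels[:n_bytes * 8]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[k:k + 8]))
        for k in range(0, len(bits), 8)
    )

cover = list(range(256)) * 4                   # toy 1024-pixel "image"
stego = lsb_embed(cover, b"hi")
assert lsb_extract(stego, 2) == b"hi"
```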

  20. Centralized Cryptographic Key Management and Critical Risk Assessment - CRADA Final Report For CRADA Number NFE-11-03562

    Energy Technology Data Exchange (ETDEWEB)

    Abercrombie, R. K. [ORNL; Peters, Scott [Sypris Electronics, LLC

    2014-05-28

    The Department of Energy Office of Electricity Delivery and Energy Reliability (DOE-OE) Cyber Security for Energy Delivery Systems (CSEDS) industry-led program (DE-FOA-0000359) entitled "Innovation for Increasing Cyber Security for Energy Delivery Systems (12CSEDS)," awarded a contract to Sypris Electronics LLC to develop a Cryptographic Key Management System for the smart grid (Scalable Key Management Solutions for Critical Infrastructure Protection). As a result of that award, Oak Ridge National Laboratory (ORNL) and Sypris Electronics, LLC entered into a CRADA (NFE-11-03562). ORNL provided its Cyber Security Econometrics System (CSES) as a tool to be modified and used as a metric to address risks and vulnerabilities in the management of cryptographic keys within the Advanced Metering Infrastructure (AMI) domain of the electric sector. ORNL concentrated its analysis on the AMI domain, for which the National Electric Sector Cybersecurity Organization Resource (NESCOR) Working Group 1 (WG1) has documented 29 failure scenarios. The computational infrastructure of this metric involves system stakeholders, security requirements, system components and security threats. To compute this metric, we estimated the stakes that each stakeholder associates with each security requirement, as well as stochastic matrices that represent the probability that a threat causes a component failure and the probability that a component failure causes a security requirement violation. We applied this model to estimate the security of the AMI by leveraging the recently established National Institute of Standards and Technology Interagency Report (NISTIR) 7628 guidelines for smart grid security and the International Electrotechnical Commission (IEC) 62351, Part 9 to identify the life cycle for cryptographic key management, resulting in a vector that assigned to each stakeholder an estimate of their average loss in terms of dollars per day of system

  1. A novel algorithm for thermal image encryption.

    Science.gov (United States)

    Hussain, Iqtadar; Anees, Amir; Algarni, Abdulmohsen

    2018-04-16

    Thermal images play a vital role at nuclear plants, power stations, forensic labs, biological research, and petroleum products extraction, so the safety of thermal images is very important. Image data has unique features such as intensity, contrast, homogeneity, entropy, and correlation among pixels, which is why image encryption is trickier than other encryption tasks; conventional image encryption schemes normally find these features hard to handle. Therefore, cryptographers have turned to attractive properties of chaotic maps, such as randomness and sensitivity, to build novel cryptosystems, and recently proposed image encryption techniques increasingly depend on the application of chaotic maps. This paper proposes an image encryption algorithm based on the Chebyshev chaotic map and substitution boxes drawn from the S8 symmetric group of permutations. First, the parameters of the chaotic Chebyshev map are chosen as a secret key to confuse the plain image. Then, the plaintext image is encrypted by the method generated from the substitution boxes and the Chebyshev map. Through this process we obtain a ciphertext image that is thoroughly permuted and diffused. The outcomes of standard experiments, key sensitivity tests, and statistical analysis confirm that the proposed algorithm offers a safe and efficient approach for real-time image encryption.
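
    A hedged sketch of how a Chebyshev chaotic map can drive encryption (a generic keystream construction, not the authors' exact S8 substitution design): iterate x_{n+1} = cos(k * arccos(x_n)), quantize each state to a byte, and XOR the result with the image data; the pair (x0, k) plays the role of the secret key.

```python
import math

def chebyshev_keystream(x0: float, k: int, length: int) -> bytes:
    """Generate a byte keystream from the Chebyshev map x_{n+1} = cos(k*arccos(x_n))."""
    x, out = x0, bytearray()
    for _ in range(length):
        x = math.cos(k * math.acos(x))                 # chaotic iteration on [-1, 1]
        out.append(int((x + 1.0) / 2.0 * 255) & 0xFF)  # quantize the state to one byte
    return bytes(out)

def xor_encrypt(pixels: bytes, x0: float, k: int) -> bytes:
    ks = chebyshev_keystream(x0, k, len(pixels))
    return bytes(p ^ s for p, s in zip(pixels, ks))

plain = bytes(range(16))                      # toy "image" data
cipher = xor_encrypt(plain, x0=0.31, k=6)     # (x0, k) acts as the key
assert xor_encrypt(cipher, x0=0.31, k=6) == plain
```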

  2. An Efficient Biometric-Based Algorithm Using Heart Rate Variability for Securing Body Sensor Networks

    Directory of Open Access Journals (Sweden)

    Sandeep Pirbhulal

    2015-06-01

    Full Text Available Body Sensor Network (BSN) is a network of several associated sensor nodes on, inside, or around the human body to monitor vital signals, such as Electroencephalogram (EEG), Photoplethysmography (PPG), Electrocardiogram (ECG), etc. Each sensor node in a BSN delivers major information; therefore, it is very important to provide data confidentiality and security. Existing approaches to securing BSNs are based on complex cryptographic key generation procedures, which not only demand high resource utilization and computation time, but also consume large amounts of energy, power, and memory during data transmission. It is therefore indispensable to put forward an energy-efficient and computationally less complex authentication technique for BSNs. In this paper, a novel biometric-based algorithm is proposed, which utilizes Heart Rate Variability (HRV) for a simple key generation process to secure the BSN. The proposed algorithm is compared with three data authentication techniques, namely Physiological Signal based Key Agreement (PSKA), Data Encryption Standard (DES), and Rivest Shamir Adleman (RSA). Simulation is performed in Matlab and the results suggest that the proposed algorithm is quite efficient in terms of transmission time utilization, average remaining energy, and total power consumption.

  3. An Efficient Biometric-Based Algorithm Using Heart Rate Variability for Securing Body Sensor Networks

    Science.gov (United States)

    Pirbhulal, Sandeep; Zhang, Heye; Mukhopadhyay, Subhas Chandra; Li, Chunyue; Wang, Yumei; Li, Guanglin; Wu, Wanqing; Zhang, Yuan-Ting

    2015-01-01

    Body Sensor Network (BSN) is a network of several associated sensor nodes on, inside or around the human body to monitor vital signals, such as, Electroencephalogram (EEG), Photoplethysmography (PPG), Electrocardiogram (ECG), etc. Each sensor node in BSN delivers major information; therefore, it is very significant to provide data confidentiality and security. All existing approaches to secure BSN are based on complex cryptographic key generation procedures, which not only demands high resource utilization and computation time, but also consumes large amount of energy, power and memory during data transmission. However, it is indispensable to put forward energy efficient and computationally less complex authentication technique for BSN. In this paper, a novel biometric-based algorithm is proposed, which utilizes Heart Rate Variability (HRV) for simple key generation process to secure BSN. Our proposed algorithm is compared with three data authentication techniques, namely Physiological Signal based Key Agreement (PSKA), Data Encryption Standard (DES) and Rivest Shamir Adleman (RSA). Simulation is performed in Matlab and results suggest that proposed algorithm is quite efficient in terms of transmission time utilization, average remaining energy and total power consumption. PMID:26131666
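
    A hedged sketch of the general idea behind HRV-based key generation (not the authors' exact procedure; the quantization step and interval values are illustrative): two sensors on the same body observe nearly identical inter-beat intervals, so coarse quantization of a shared window yields matching key material on both nodes without transmitting the key.

```python
import hashlib

def key_from_ibis(inter_beat_intervals_ms: list[int], step_ms: int = 8) -> bytes:
    """Derive a symmetric key by coarsely quantizing inter-beat intervals (IBIs)."""
    # Coarse quantization absorbs small measurement differences between two on-body sensors.
    quantized = bytes((ibi // step_ms) & 0xFF for ibi in inter_beat_intervals_ms)
    return hashlib.sha256(quantized).digest()   # hash the quantized sequence into a 256-bit key

# Two nodes measuring the same heartbeat see slightly different values.
node_a = [812, 795, 830, 801, 788, 820, 815, 799]
node_b = [813, 794, 831, 800, 789, 821, 814, 798]
print(key_from_ibis(node_a) == key_from_ibis(node_b))  # True only if quantization agrees
```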

  4. Enhanced diffie-hellman algorithm for reliable key exchange

    Science.gov (United States)

    Aryan; Kumar, Chaithanya; Vincent, P. M. Durai Raj

    2017-11-01

    Diffie-Hellman is one of the first public-key procedures and is a way of exchanging cryptographic keys securely. The concept was introduced by Ralph Merkle and is named after Whitfield Diffie and Martin Hellman. In the Diffie-Hellman algorithm, the sender and receiver establish a common secret key and then communicate with each other over a public channel that is known to everyone. A number of Internet services are secured by Diffie-Hellman. In a public-key cryptosystem, the sender has to trust the public key received from the receiver and vice versa, and this is the challenge of public-key cryptosystems. A man-in-the-middle attack is quite possible on the existing Diffie-Hellman algorithm: the attacker sits in the public channel, receives the public keys of both sender and receiver, and sends them public keys generated by himself. Denial-of-service attacks are also common against Diffie-Hellman; here the attacker tries to stop the communication between sender and receiver by deleting messages or by confusing the parties with miscommunication. Further attacks, such as insider and outsider attacks, are also possible. To reduce the possibility of attacks on the Diffie-Hellman algorithm, we enhance it to the next level. In this paper, we extend the Diffie-Hellman algorithm by applying its own concept a second time to obtain a stronger secret key, which is then exchanged between the sender and the receiver so that a new shared secret key is generated for each message. The second secret key is generated by taking a primitive root of the first secret key.
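
    A minimal sketch of the classic Diffie-Hellman exchange that the paper extends (toy group parameters, illustrative only; real deployments use groups of 2048 bits or more):

```python
import secrets

# Publicly agreed toy group parameters: prime modulus p and a small base g (illustrative).
p, g = 0xFFFFFFFB, 5              # far too small for real use

a = secrets.randbelow(p - 2) + 1  # Alice's private exponent
b = secrets.randbelow(p - 2) + 1  # Bob's private exponent

A = pow(g, a, p)                  # Alice -> Bob over the public channel
B = pow(g, b, p)                  # Bob -> Alice over the public channel

shared_alice = pow(B, a, p)       # (g^b)^a mod p
shared_bob = pow(A, b, p)         # (g^a)^b mod p
assert shared_alice == shared_bob  # both sides now hold the same secret
```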

  5. Cryptographic analysis on the key space of optical phase encryption algorithm based on the design of discrete random phase mask

    Science.gov (United States)

    Lin, Chao; Shen, Xueju; Li, Zengyan

    2013-07-01

    The key space of phase encryption algorithm using discrete random phase mask is investigated by numerical simulation in this paper. Random phase mask with finite and discrete phase levels is considered as the core component in most practical optical encryption architectures. The key space analysis is based on the design criteria of discrete random phase mask. The role of random amplitude mask and random phase mask in optical encryption system is identified from the perspective of confusion and diffusion. The properties of discrete random phase mask in a practical double random phase encoding scheme working in both amplitude encoding (AE) and phase encoding (PE) modes are comparably analyzed. The key space of random phase encryption algorithm is evaluated considering both the encryption quality and the brute-force attack resistibility. A method for enlarging the key space of phase encryption algorithm is also proposed to enhance the security of optical phase encryption techniques.

  6. A covert authentication and security solution for GMOs.

    Science.gov (United States)

    Mueller, Siguna; Jafari, Farhad; Roth, Don

    2016-09-21

    Proliferation and expansion of security risks necessitates new measures to ensure authenticity and validation of GMOs. Watermarking and other cryptographic methods are available which conceal and recover the original signature, but in the process reveal the authentication information. In many scenarios watermarking and standard cryptographic methods are necessary but not sufficient and new, more advanced, cryptographic protocols are necessary. Herein, we present a new crypto protocol, that is applicable in broader settings, and embeds the authentication string indistinguishably from a random element in the signature space and the string is verified or denied without disclosing the actual signature. Results show that in a nucleotide string of 1000, the algorithm gives a correlation of 0.98 or higher between the distribution of the codon and that of E. coli, making the signature virtually invisible. This algorithm may be used to securely authenticate and validate GMOs without disclosing the actual signature. While this protocol uses watermarking, its novelty is in use of more complex cryptographic techniques based on zero knowledge proofs to encode information.

  7. A novel heuristic method for obtaining S-boxes

    International Nuclear Information System (INIS)

    Chen Guo

    2008-01-01

    An efficient algorithm named chaotic multi-swapping and simulated annealing (CMSSA) for obtaining cryptographically strong 8 x 8 S-boxes is presented. The method is based on chaotic maps and simulated annealing. In addition, cryptographic properties such as bijectivity, strict avalanche criterion, nonlinearity, output bits independence criterion and equiprobable input/output XOR distribution are analyzed in detail for the S-box produced. The results of numerical analysis show that the box has nearly fulfilled the criteria for a cryptographically strong S-box and can effectively resist several attacks

  8. A Dynamic Linear Hashing Method for Redundancy Management in Train Ethernet Consist Network

    Directory of Open Access Journals (Sweden)

    Xiaobo Nie

    2016-01-01

    Full Text Available Massive transportation systems like trains are considered critical systems because they use the communication network to control essential subsystems on board. Critical systems require zero recovery time when a failure occurs in the communication network. The newly published IEC 62439-3 defines the High-availability Seamless Redundancy (HSR) protocol, which fulfills this requirement and ensures no frame loss in the presence of an error. This paper adopts this protocol for the train Ethernet consist network. The challenge is the management of the circulating frames, which must cope with real-time processing requirements, fast switching times, high throughput, and deterministic behavior. The main contributions of this paper are an in-depth analysis of the network parameters imposed by applying the protocol to the train control and monitoring system (TCMS), and a method for discarding redundant circulating frames based on dynamic linear hashing, chosen as the fastest method for resolving the issues involved.
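
    A hedged sketch of the core duplicate-discard task (a generic bounded table keyed by source and sequence number, not the paper's dynamic linear hashing structure): the second copy of a circulating HSR frame is dropped because its key has already been seen.

```python
from collections import OrderedDict

class DuplicateDiscard:
    """Drop the redundant copy of an HSR frame keyed by (source MAC, sequence number)."""

    def __init__(self, capacity: int = 4096):
        self.seen: OrderedDict[tuple[str, int], None] = OrderedDict()
        self.capacity = capacity            # bounded table so stale entries age out

    def accept(self, src_mac: str, seq_no: int) -> bool:
        key = (src_mac, seq_no)
        if key in self.seen:
            return False                    # second copy of the same frame: discard
        self.seen[key] = None
        if len(self.seen) > self.capacity:
            self.seen.popitem(last=False)   # evict the oldest entry
        return True

dd = DuplicateDiscard()
print(dd.accept("00:11:22:33:44:55", 7))    # True, first copy is forwarded
print(dd.accept("00:11:22:33:44:55", 7))    # False, duplicate is dropped
```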

  9. Data Recovery of Distributed Hash Table with Distributed-to-Distributed Data Copy

    Science.gov (United States)

    Doi, Yusuke; Wakayama, Shirou; Ozaki, Satoshi

    To realize huge-scale information services, many Distributed Hash Table (DHT) based systems have been proposed. For example, there are proposals to manage item-level product traceability information with DHTs. In such an application, each entry of a huge number of item-level IDs needs to be available on a DHT. To ensure data availability, the soft-state approach has been employed in previous works. However, this does not scale well with the number of entries on a DHT. As we expect 10^10 products in the traceability case, the soft-state approach is unacceptable. In this paper, we propose Distributed-to-Distributed Data Copy (D3C). With D3C, users can reconstruct the data as they detect data loss, or even migrate to another DHT system. We show why it scales well with the number of entries on a DHT. We have confirmed our approach with a prototype, and evaluation shows that it fits well on a DHT with a low rate of failure and a huge number of data entries.

  10. DNA-based watermarks using the DNA-Crypt algorithm

    Directory of Open Access Journals (Sweden)

    Barnekow Angelika

    2007-05-01

    Full Text Available Abstract Background The aim of this paper is to demonstrate the application of watermarks based on DNA sequences to identify the unauthorized use of genetically modified organisms (GMOs protected by patents. Predicted mutations in the genome can be corrected by the DNA-Crypt program leaving the encrypted information intact. Existing DNA cryptographic and steganographic algorithms use synthetic DNA sequences to store binary information however, although these sequences can be used for authentication, they may change the target DNA sequence when introduced into living organisms. Results The DNA-Crypt algorithm and image steganography are based on the same watermark-hiding principle, namely using the least significant base in case of DNA-Crypt and the least significant bit in case of the image steganography. It can be combined with binary encryption algorithms like AES, RSA or Blowfish. DNA-Crypt is able to correct mutations in the target DNA with several mutation correction codes such as the Hamming-code or the WDH-code. Mutations which can occur infrequently may destroy the encrypted information, however an integrated fuzzy controller decides on a set of heuristics based on three input dimensions, and recommends whether or not to use a correction code. These three input dimensions are the length of the sequence, the individual mutation rate and the stability over time, which is represented by the number of generations. In silico experiments using the Ypt7 in Saccharomyces cerevisiae shows that the DNA watermarks produced by DNA-Crypt do not alter the translation of mRNA into protein. Conclusion The program is able to store watermarks in living organisms and can maintain the original information by correcting mutations itself. Pairwise or multiple sequence alignments show that DNA-Crypt produces few mismatches between the sequences similar to all steganographic algorithms.

  11. DNA-based watermarks using the DNA-Crypt algorithm.

    Science.gov (United States)

    Heider, Dominik; Barnekow, Angelika

    2007-05-29

    The aim of this paper is to demonstrate the application of watermarks based on DNA sequences to identify the unauthorized use of genetically modified organisms (GMOs) protected by patents. Predicted mutations in the genome can be corrected by the DNA-Crypt program leaving the encrypted information intact. Existing DNA cryptographic and steganographic algorithms use synthetic DNA sequences to store binary information however, although these sequences can be used for authentication, they may change the target DNA sequence when introduced into living organisms. The DNA-Crypt algorithm and image steganography are based on the same watermark-hiding principle, namely using the least significant base in case of DNA-Crypt and the least significant bit in case of the image steganography. It can be combined with binary encryption algorithms like AES, RSA or Blowfish. DNA-Crypt is able to correct mutations in the target DNA with several mutation correction codes such as the Hamming-code or the WDH-code. Mutations which can occur infrequently may destroy the encrypted information, however an integrated fuzzy controller decides on a set of heuristics based on three input dimensions, and recommends whether or not to use a correction code. These three input dimensions are the length of the sequence, the individual mutation rate and the stability over time, which is represented by the number of generations. In silico experiments using the Ypt7 in Saccharomyces cerevisiae shows that the DNA watermarks produced by DNA-Crypt do not alter the translation of mRNA into protein. The program is able to store watermarks in living organisms and can maintain the original information by correcting mutations itself. Pairwise or multiple sequence alignments show that DNA-Crypt produces few mismatches between the sequences similar to all steganographic algorithms.

  12. DNA-based watermarks using the DNA-Crypt algorithm

    Science.gov (United States)

    Heider, Dominik; Barnekow, Angelika

    2007-01-01

    Background The aim of this paper is to demonstrate the application of watermarks based on DNA sequences to identify the unauthorized use of genetically modified organisms (GMOs) protected by patents. Predicted mutations in the genome can be corrected by the DNA-Crypt program leaving the encrypted information intact. Existing DNA cryptographic and steganographic algorithms use synthetic DNA sequences to store binary information however, although these sequences can be used for authentication, they may change the target DNA sequence when introduced into living organisms. Results The DNA-Crypt algorithm and image steganography are based on the same watermark-hiding principle, namely using the least significant base in case of DNA-Crypt and the least significant bit in case of the image steganography. It can be combined with binary encryption algorithms like AES, RSA or Blowfish. DNA-Crypt is able to correct mutations in the target DNA with several mutation correction codes such as the Hamming-code or the WDH-code. Mutations which can occur infrequently may destroy the encrypted information, however an integrated fuzzy controller decides on a set of heuristics based on three input dimensions, and recommends whether or not to use a correction code. These three input dimensions are the length of the sequence, the individual mutation rate and the stability over time, which is represented by the number of generations. In silico experiments using the Ypt7 in Saccharomyces cerevisiae shows that the DNA watermarks produced by DNA-Crypt do not alter the translation of mRNA into protein. Conclusion The program is able to store watermarks in living organisms and can maintain the original information by correcting mutations itself. Pairwise or multiple sequence alignments show that DNA-Crypt produces few mismatches between the sequences similar to all steganographic algorithms. PMID:17535434

  13. Compact data structure and scalable algorithms for the sparse grid technique

    KAUST Repository

    Murarasu, Alin

    2011-01-01

    The sparse grid discretization technique enables a compressed representation of higher-dimensional functions. In its original form, it relies heavily on recursion and complex data structures, thus being far from well-suited for GPUs. In this paper, we describe optimizations that enable us to implement compression and decompression, the crucial sparse grid algorithms for our application, on Nvidia GPUs. The main idea consists of a bijective mapping between the set of points in a multi-dimensional sparse grid and a set of consecutive natural numbers. The resulting data structure consumes a minimum amount of memory. For a 10-dimensional sparse grid with approximately 127 million points, it consumes up to 30 times less memory than trees or hash tables which are typically used. Compared to a sequential CPU implementation, the speedups achieved on GPU are up to 17 for compression and up to 70 for decompression, respectively. We show that the optimizations are also applicable to multicore CPUs. Copyright © 2011 ACM.

  14. IMPLEMENTATION OF NEURAL - CRYPTOGRAPHIC SYSTEM USING FPGA

    Directory of Open Access Journals (Sweden)

    KARAM M. Z. OTHMAN

    2011-08-01

    Full Text Available Modern cryptography techniques are virtually unbreakable. As the Internet and other forms of electronic communication become more prevalent, electronic security is becoming increasingly important. Cryptography is used to protect e-mail messages, credit card information, and corporate data. The designed cryptography system is a conventional one that uses a single key for the encryption and decryption process. The chosen cryptographic algorithm is a stream cipher that encrypts one bit at a time. The central problem in stream-cipher cryptography is the difficulty of generating a long, unpredictable sequence of binary signals from a short random key. Pseudo random number generators (PRNGs) have been widely used to construct this key sequence. Here, the pseudo random number generator was designed using an Artificial Neural Network (ANN), which provides the required nonlinearity and improves the statistical randomness properties of the generator. The learning algorithm of this neural network is the backpropagation algorithm. The learning process was done in software in Matlab to obtain the efficient weights. Then, the learned neural network was implemented using a field programmable gate array (FPGA).

  15. Learning binary code via PCA of angle projection for image retrieval

    Science.gov (United States)

    Yang, Fumeng; Ye, Zhiqiang; Wei, Xueqi; Wu, Congzhong

    2018-01-01

    With the benefits of low storage costs and high query speeds, binary code representation methods are widely researched for efficiently retrieving large-scale data. In image hashing methods, learning a hashing function that embeds high-dimensional features into Hamming space is the key step for accurate retrieval. Principal component analysis (PCA) is widely used in compact hashing methods: most of these methods adopt PCA projection functions to project the original data onto several real-valued dimensions, and each projected dimension is then quantized into one bit by thresholding. The variances of the projected dimensions differ, and the real-valued projection introduces large quantization error. To avoid this, we propose to use a cosine-similarity (angle) projection for each dimension; the angle projection preserves the original structure and yields a more compact code. We combine our method with the ITQ hashing algorithm, and extensive experiments on the public CIFAR-10 and Caltech-256 datasets validate the effectiveness of the proposed method.

  16. A chimeric fusion of the hASH1 and EZH2 promoters mediates high and specific reporter and suicide gene expression and cytotoxicity in small cell lung cancer cells

    DEFF Research Database (Denmark)

    Poulsen, T.T.; Pedersen, N.; Juel, H.

    2008-01-01

    Transcriptionally targeted gene therapy is a promising experimental modality for treatment of systemic malignancies such as small cell lung cancer (SCLC). We have identified the human achaete-scute homolog 1 (hASH1) and enhancer of zeste homolog 2 (EZH2) genes as highly upregulated in SCLC compar...

  17. Breaking an encryption scheme based on chaotic baker map

    International Nuclear Information System (INIS)

    Alvarez, Gonzalo; Li, Shujun

    2006-01-01

    In recent years, a growing number of cryptosystems based on chaos have been proposed, many of them fundamentally flawed by a lack of robustness and security. This Letter describes the security weaknesses of a recently proposed cryptographic algorithm with chaos at the physical level based on the baker map. It is shown that the security is trivially compromised for practical implementations of the cryptosystem with finite computing precision and for the use of the iteration number n as the secret key. Some possible countermeasures to enhance the security of the chaos-based cryptographic algorithm are also discussed

  18. Best-First Heuristic Search for Multicore Machines

    Science.gov (United States)

    2010-01-01

    Otto, 1998) to implement an asynchronous version of PRA* that they call Hash Distributed A* (HDA*). HDA* distributes nodes using a hash function in...nodes which are being communicated between peers are in transit. In contact with the authors of HDA*, we have created an implementation of HDA* for...Also, our implementation of HDA* allows us to make a fair comparison between algorithms by sharing common data structures such as priority queues and

  19. A hybrid cloud read aligner based on MinHash and kmer voting that preserves privacy

    Science.gov (United States)

    Popic, Victoria; Batzoglou, Serafim

    2017-05-01

    Low-cost clouds can alleviate the compute and storage burden of the genome sequencing data explosion. However, moving personal genome data analysis to the cloud can raise serious privacy concerns. Here, we devise a method named Balaur, a privacy preserving read mapper for hybrid clouds based on locality sensitive hashing and kmer voting. Balaur can securely outsource a substantial fraction of the computation to the public cloud, while being highly competitive in accuracy and speed with non-private state-of-the-art read aligners on short read data. We also show that the method is significantly faster than the state of the art in long read mapping. Therefore, Balaur can enable institutions handling massive genomic data sets to shift part of their analysis to the cloud without sacrificing accuracy or exposing sensitive information to an untrusted third party.

  20. An adaptive secret key-directed cryptographic scheme for secure transmission in wireless sensor networks

    International Nuclear Information System (INIS)

    Muhammad, K.; Jan, Z.; Khan, Z

    2015-01-01

    Wireless Sensor Networks (WSNs) are memory and bandwidth limited networks whose main goals are to maximize the network lifetime and minimize the energy consumption and transmission cost. To achieve these goals, different techniques of compression and clustering have been used. However, security is an open and major issue in WSNs for which different approaches are used, both in centralized and distributed WSNs' environments. This paper presents an adaptive cryptographic scheme for secure transmission of various sensitive parameters, sensed by wireless sensors to the fusion center for further processing in WSNs such as military networks. The proposed method encrypts the sensitive captured data of sensor nodes using various encryption procedures (bitxor operation, bits shuffling, and secret key based encryption) and then sends it to the fusion center. At the fusion center, the received encrypted data is decrypted for taking further necessary actions. The experimental results with complexity analysis, validate the effectiveness and feasibility of the proposed method in terms of security in WSNs. (author)

  1. A Cryptographic SoC for Robust Protection of Secret Keys in IPTV DRM Systems

    Science.gov (United States)

    Lee, Sanghan; Yang, Hae-Yong; Yeom, Yongjin; Park, Jongsik

    The security level of an internet protocol television (IPTV) digital right management (DRM) system ultimately relies on protection of secret keys. Well known devices for the key protection include smartcards and battery backup SRAMs (BB-SRAMs); however, these devices could be vulnerable to various physical attacks. In this paper, we propose a secure and cost-effective design of a cryptographic system on chip (SoC) that integrates the BB-SRAM with a cell-based design technique. The proposed SoC provides robust safeguard against the physical attacks, and satisfies high-speed and low-price requirements of IPTV set-top boxes. Our implementation results show that the maximum encryption rate of the SoC is 633Mb/s. In order to verify the data retention capabilities, we made a prototype chip using 0.18µm standard cell technology. The experimental results show that the integrated BB-SRAM can reliably retain data with a 1.4µA leakage current.

  2. BIX Certificates: Cryptographic Tokens for Anonymous Transactions Based on Certificates Public Ledger

    Directory of Open Access Journals (Sweden)

    Sead Muftic

    2016-12-01

    Full Text Available With the widespread use of Internet, Web, and mobile technologies, a new category of applications and transactions that requires anonymity is gaining increased interest and importance. Examples of such new applications are innovative payment systems, digital notaries, electronic voting, documents sharing, electronic auctions, medical applications, and many others. In addition to anonymity, these applications and transactions also require standard security services: identification, authentication, and authorization of users and protection of their transactions. Providing those services in combination with anonymity is an especially challenging issue, because all security services require explicit user identification and authentication. To solve this issue and enable applications with security and also anonymity we introduce a new type of cryptographically encapsulated objects called BIX certificates. “BIX” is an abbreviation for “Blockchain Information Exchange.” Their purpose is equivalent to X.509 certificates: to support security services for users and transactions, but also enhanced with anonymity. This paper describes the structure and attributes of BIX certificate objects and all related protocols for their creation, distribution, and use. The BIX Certification Infrastructure (BCI as a distributed public ledger is also briefly described.

  3. A Note on 5-bit Quadratic Permutations’ Classification

    OpenAIRE

    Božilov, Dušan; Bilgin, Begül; Sahin, Hacı Ali

    2017-01-01

    Classification of vectorial Boolean functions up to affine equivalence is used widely to analyze various cryptographic and implementation properties of symmetric-key algorithms. We show that there exist 75 affine equivalence classes of 5-bit quadratic permutations. Furthermore, we explore important cryptographic properties of these classes, such as linear and differential properties and degrees of their inverses, together with multiplicative complexity and existence of uniform threshold reali...

  4. Continuous Non-malleable Codes

    DEFF Research Database (Denmark)

    Faust, Sebastian; Mukherjee, Pratyay; Nielsen, Jesper Buus

    2014-01-01

    or modify it to the encoding of a completely unrelated value. This paper introduces an extension of the standard non-malleability security notion - so-called continuous non-malleability - where we allow the adversary to tamper continuously with an encoding. This is in contrast to the standard notion of non...... is necessary to achieve continuous non-malleability in the split-state model. Moreover, we illustrate that none of the existing constructions satisfies our uniqueness property and hence is not secure in the continuous setting. We construct a split-state code satisfying continuous non-malleability. Our scheme...... is based on the inner product function, collision-resistant hashing and non-interactive zero-knowledge proofs of knowledge and requires an untamperable common reference string. We apply continuous non-malleable codes to protect arbitrary cryptographic primitives against tampering attacks. Previous...

  5. BLAZE-DEM: A GPU based Polyhedral DEM particle transport code

    CSIR Research Space (South Africa)

    Govender, Nicolin

    2013-05-01

    Full Text Available expensive and cannot be done in real time. This paper will discuss methods and algorithms that substantially reduce the computational run-time of such simulations. An example is the spatial partitioning and hashing algorithm that allows just the nearest...

  6. Threshold implementations : as countermeasure against higher-order differential power analysis

    NARCIS (Netherlands)

    Bilgin, Begül

    2015-01-01

    Embedded devices are used pervasively in a wide range of applications some of which require cryptographic algorithms in order to provide security. Today’s standardized algorithms are secure in the black-box model where an adversary has access to several inputs and/or outputs of the algorithm.

  7. Cryptographically supported NFC tags in medication for better inpatient safety.

    Science.gov (United States)

    Özcanhan, Mehmet Hilal; Dalkılıç, Gökhan; Utku, Semih

    2014-08-01

    Reliable sources report that errors in drug administration are increasing the number of harmed or killed inpatients, during healthcare. This development is in contradiction to patient safety norms. A correctly designed hospital-wide ubiquitous system, using advanced inpatient identification and matching techniques, should provide correct medicine and dosage at the right time. Researchers are still making grouping proof protocol proposals based on the EPC Global Class 1 Generation 2 ver. 1.2 standard tags, for drug administration. Analyses show that such protocols make medication unsecure and hence fail to guarantee inpatient safety. Thus, the original goal of patient safety still remains. In this paper, a very recent proposal (EKATE) upgraded by a cryptographic function is shown to fall short of expectations. Then, an alternative proposal IMS-NFC which uses a more suitable and newer technology; namely Near Field Communication (NFC), is described. The proposed protocol has the additional support of stronger security primitives and it is compliant to ISO communication and security standards. Unlike previous works, the proposal is a complete ubiquitous system that guarantees full patient safety; and it is based on off-the-shelf, new technology products available in every corner of the world. To prove the claims the performance, cost, security and scope of IMS-NFC are compared with previous proposals. Evaluation shows that the proposed system has stronger security, increased patient safety and equal efficiency, at little extra cost.

  8. On Federated and Proof Of Validation Based Consensus Algorithms In Blockchain

    Science.gov (United States)

    Ambili, K. N.; Sindhu, M.; Sethumadhavan, M.

    2017-08-01

    Almost all real-world activities have been digitized, and various client-server architecture based systems are in place to handle them; these all rely on trust in third parties. There is an active attempt to implement blockchain-based systems which ensure that IT systems are immutable, avoid double spending, and provide cryptographic strength. A successful implementation of blockchain as the backbone of existing information technology systems is bound to eliminate various types of fraud and ensure quicker delivery of the traded item. To adapt IT systems to a blockchain architecture, an efficient consensus algorithm needs to be designed. Blockchain based on proof of work first came up as the backbone of cryptocurrency; after this, several other methods with a variety of interesting features have emerged. In this paper, we survey existing attempts to achieve consensus in blockchain, and compare a federated consensus method with a proof of validation method.

  9. Speeding up detection of SHA-1 collision attacks using unavoidable attack conditions

    NARCIS (Netherlands)

    M.M.J. Stevens (Marc); D. Shumow

    2017-01-01

    Counter-cryptanalysis, the concept of using cryptanalytic techniques to detect cryptanalytic attacks, was introduced by Stevens at CRYPTO 2013 [22] with a hash collision detection algorithm. That is, an algorithm that detects whether a given single message is part of a colliding message

  10. Massively parallel algorithms for trace-driven cache simulations

    Science.gov (United States)

    Nicol, David M.; Greenberg, Albert G.; Lubachevsky, Boris D.

    1991-01-01

    Trace-driven cache simulation is central to computer design. A trace is a very long sequence of reference lines from main memory. At the t-th instant, reference x_t is hashed into a set of cache locations, the contents of which are then compared with x_t. If at the t-th instant x_t is not present in the cache, then it is said to be a miss, and is loaded into the cache set, possibly forcing the replacement of some other memory line, and making x_t present for the (t+1)-st instant. The problem of parallel simulation of a subtrace of N references directed to a C-line cache set is considered, with the aim of determining which references are misses and related statistics. A simulation method is presented for the Least Recently Used (LRU) policy, which regardless of the set size C runs in time O(log N) using N processors on the exclusive read, exclusive write (EREW) parallel model. A simpler LRU simulation algorithm is given that runs in O(C log N) time using N/log N processors. Timings are presented of the second algorithm's implementation on the MasPar MP-1, a machine with 16384 processors. A broad class of reference-based line replacement policies is considered, which includes LRU as well as the Least Frequently Used and Random replacement policies. A simulation method is presented for any such policy that, on any trace of length N directed to a C-line set, runs in O(C log N) time with high probability using N processors on the EREW model. The algorithms are simple, have very little space overhead, and are well suited for SIMD implementation.
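
    For reference, a minimal sequential sketch of the task being parallelized: counting misses when a trace is driven through a single C-line LRU set (a baseline illustration, not the paper's parallel EREW algorithm).

```python
from collections import OrderedDict

def lru_misses(trace: list[int], set_size: int) -> int:
    """Count misses for a single LRU cache set of `set_size` lines over a reference trace."""
    cache: OrderedDict[int, None] = OrderedDict()   # keys ordered from least to most recent
    misses = 0
    for line in trace:
        if line in cache:
            cache.move_to_end(line)                 # hit: mark the line as most recently used
        else:
            misses += 1                             # miss: load the line,
            cache[line] = None
            if len(cache) > set_size:
                cache.popitem(last=False)           # evicting the least recently used one
    return misses

print(lru_misses([1, 2, 3, 1, 4, 2, 1], set_size=3))  # 5 misses on this toy trace
```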

  11. A Novel Medical Image Watermarking in Three-dimensional Fourier Compressed Domain

    Directory of Open Access Journals (Sweden)

    Baoru Han

    2015-09-01

    Full Text Available Digital watermarking, which protects digital image copyright, is a research hotspot in the field of image security. In order to ensure medical image information security, a novel medical image digital watermarking algorithm in the three-dimensional Fourier compressed domain is proposed. The algorithm is a robust zero-watermarking scheme that takes advantage of the characteristics of the three-dimensional Fourier compressed domain, the encryption features of a Legendre chaotic neural network, and the robustness of difference hashing. On one hand, the original watermark image is encrypted with the Legendre chaotic neural network in order to enhance security. On the other hand, the zero-watermark is constructed by difference hashing in the three-dimensional Fourier compressed domain. The algorithm does not need to select a region of interest and thus avoids affecting the medical image content. The specific implementation of the algorithm and the experimental results are given in the paper. The simulation results testify that the novel algorithm possesses desirable robustness against common and geometric attacks.
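
    A hedged sketch of plain difference hashing on a small grayscale matrix, illustrating the robust-hash building block mentioned above rather than the paper's three-dimensional Fourier construction: each bit records whether a pixel is brighter than its right-hand neighbor, and similar images give nearby hashes.

```python
def difference_hash(gray: list[list[int]]) -> int:
    """Compute a dHash-style bit string: 1 if a pixel is brighter than its right neighbor."""
    bits = 0
    for row in gray:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits, used to compare two perceptual hashes."""
    return bin(a ^ b).count("1")

img = [[10, 20, 30], [90, 80, 70], [40, 40, 50]]        # toy 3x3 "downscaled" image
noisy = [[11, 19, 31], [92, 79, 69], [41, 39, 49]]      # slightly perturbed copy
print(hamming(difference_hash(img), difference_hash(noisy)))  # small distance expected
```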

  12. Classification of Encrypted Web Traffic Using Machine Learning Algorithms

    Science.gov (United States)

    2013-06-01

    DPI devices to block certain websites; Yu, Cong, Chen, and Lei [52] suggest hashing the domains of pornographic and illegal websites so ISPs can...Zhenming Lei. “Blocking pornographic , illegal websites by internet host domain using FPGA and Bloom Filter”. Network Infrastructure and Digital Content

  13. An evaluation of multi-probe locality sensitive hashing for computing similarities over web-scale query logs.

    Directory of Open Access Journals (Sweden)

    Graham Cormode

    Full Text Available Many modern applications of AI, such as web search, mobile browsing, image processing, and natural language processing, rely on finding similar items in a large database of complex objects. Due to the very large scale of the data involved (e.g., users' queries from commercial search engines), computing such near or nearest neighbors is a non-trivial task, as the computational cost grows significantly with the number of items. To address this challenge, we adopt Locality Sensitive Hashing (LSH) methods and evaluate four variants in a distributed computing environment (specifically, Hadoop). We identify several optimizations which improve performance and are suitable for deployment in very large scale settings. The experimental results demonstrate that our variants of LSH achieve robust performance with better recall compared with "vanilla" LSH, even when using the same amount of space.
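
    A hedged sketch of the basic LSH recipe the paper evaluates (generic banding over MinHash signatures, not the multi-probe variants themselves; the band and row counts are illustrative): signatures are cut into bands, each band value maps to a bucket, and items sharing any bucket become candidate near neighbors.

```python
from collections import defaultdict

def lsh_candidates(signatures: dict[str, list[int]], bands: int, rows: int) -> set[tuple[str, str]]:
    """Band a MinHash signature matrix and return candidate pairs that collide in any band."""
    assert all(len(sig) == bands * rows for sig in signatures.values())
    buckets: dict[tuple[int, tuple[int, ...]], list[str]] = defaultdict(list)
    for item, sig in signatures.items():
        for b in range(bands):
            band = tuple(sig[b * rows:(b + 1) * rows])   # the rows of this band
            buckets[(b, band)].append(item)              # same band value -> same bucket
    pairs = set()
    for members in buckets.values():
        for i in range(len(members)):
            for j in range(i + 1, len(members)):
                pairs.add(tuple(sorted((members[i], members[j]))))
    return pairs

sigs = {"q1": [3, 1, 4, 1, 5, 9], "q2": [3, 1, 4, 2, 6, 9], "q3": [2, 7, 1, 8, 2, 8]}
print(lsh_candidates(sigs, bands=3, rows=2))   # {("q1", "q2")}: they agree on the first band
```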

  14. On cryptographic security of end-to-end encrypted connections in WhatsApp and Telegram messengers

    Directory of Open Access Journals (Sweden)

    Sergey V. Zapechnikov

    2017-11-01

    Full Text Available The aim of this work is to analyze the available possibilities for improving secure messaging with end-to-end connections under conditions of external violator actions and distrusted service provider. We made a comparative analysis of cryptographic security mechanisms for two widely used messengers: Telegram and WhatsApp. It was found that Telegram is based on MTProto protocol, while WhatsApp is based on the alternative Signal protocol. We examine the specific features of messengers implementation associated with random number generation on the most popular Android mobile platform. It was shown that Signal has better security properties. It is used in several other popular messengers such as TextSecure, RedPhone, GoogleAllo, FacebookMessenger, Signal along with WhatsApp. A number of possible attacks on both messengers were analyzed in details. In particular, we demonstrate that the metadata are poorly protected in both messengers. Metadata security may be one of the goals for further studies.

  15. Algebraic Side-Channel Attack on Twofish

    Directory of Open Access Journals (Sweden)

    Chujiao Ma

    2017-05-01

    Full Text Available While algebraic side-channel attack (ASCA has been successful in breaking simple cryptographic algorithms, it has never been done on larger or more complex algorithms such as Twofish. Compared to other algorithms that ASCA has been used on, Twofish is more difficult to attack due to the key-dependent S-boxes as well as the complex key scheduling. In this paper, we propose the first algebraic side-channel attack on Twofish, and examine the importance of side-channel information in getting past the key-dependent S-boxes and the complex key scheduling. The cryptographic algorithm and side-channel information are both expressed as boolean equations and a SAT solver is used to recover the key. While algebraic attack by itself is not sufficient to break the algorithm, with the help of side-channel information such as Hamming weights, we are able to correctly solve for 96 bits of the 128 bits key in under 2 hours with known plaintext/ciphertext.

  16. A novel block cryptosystem based on iterating a chaotic map

    International Nuclear Information System (INIS)

    Xiang Tao; Liao Xiaofeng; Tang Guoping; Chen Yong; Wong, Kwok-wo

    2006-01-01

    A block cryptographic scheme based on iterating a chaotic map is proposed. With random binary sequences generated from the real-valued chaotic map, the plaintext block is permuted by a key-dependent shift approach and then encrypted by the classical chaotic masking technique. Simulation results show that performance and security of the proposed cryptographic scheme are better than those of existing algorithms. Advantages and security of our scheme are also discussed in detail

  17. Quadratic Sieve integer factorization using Hadoop

    OpenAIRE

    Ghebregiorgish, Semere Tsehaye

    2012-01-01

    Master's thesis in Computer Science Integer factorization problem is one of the most important parts in the world of cryptography. The security of the widely-used public-key cryptographic algorithm, RSA [1], and the Blum Blum Shub cryptographic pseudorandom number generator [2] heavily depend on the presumed difficulty of factoring a number to its prime constituents. As the size of the number to be factored gets larger, the difficulty of the problem increases enormously. Thi...

  18. Meet-in-the-Middle Preimage Attacks on Hash Modes of Generalized Feistel and Misty Schemes with SP Round Function

    Science.gov (United States)

    Moon, Dukjae; Hong, Deukjo; Kwon, Daesung; Hong, Seokhie

    We assume that the domain extender is the Merkle-Damgård (MD) scheme and the message is padded by a ‘1’, a minimum number of ‘0’s, and a fixed-size length field, so that the length of the padded message is a multiple of the block length. Under this assumption, we analyze the security of the hash mode when the compression function follows the Davies-Meyer (DM) scheme and the underlying block cipher is one of the plain Feistel or Misty schemes, or the generalized Feistel or Misty schemes with a Substitution-Permutation (SP) round function. We base this work on Meet-in-the-Middle (MitM) preimage attack techniques and develop several useful initial structures.
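
    A minimal sketch of the Merkle-Damgård padding rule assumed above (generic MD strengthening with a 64-byte block and a 64-bit big-endian length field; these sizes are illustrative, not taken from the paper):

```python
def md_pad(message: bytes, block_size: int = 64) -> bytes:
    """Append 0x80, then zeros, then the 64-bit bit-length, up to a block boundary."""
    bit_len = 8 * len(message)
    padded = message + b"\x80"                      # a single '1' bit followed by seven '0' bits
    while (len(padded) + 8) % block_size != 0:      # leave room for the 8-byte length field
        padded += b"\x00"
    return padded + bit_len.to_bytes(8, "big")      # fixed-size length information

padded = md_pad(b"abc")
assert len(padded) % 64 == 0
print(len(padded), padded[-8:])                     # block-aligned; last 8 bytes encode 24 bits
```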

  19. Secure Oblivious Hiding, Authentication, Tamper Proofing, and Verification Techniques

    National Research Council Canada - National Science Library

    Fridrich, Jessica

    2002-01-01

    In this report, we describe an algorithm for robust visual hash functions with applications to digital image watermarking for authentication and integrity verification of video data and still images...

  20. Online Voting System Based on Image Steganography and Visual Cryptography

    Directory of Open Access Journals (Sweden)

    Biju Issac

    2017-01-01

    Full Text Available This paper discusses the implementation of an online voting system based on image steganography and visual cryptography. The system was implemented in Java EE with a web-based interface, with a MySQL database server and the GlassFish application server as the backend. After considering the requirements of an online voting system, current electronic voting schemes in the published literature were examined. Next, the cryptographic and steganographic techniques best suited to the requirements of the voting system were chosen, and the software was implemented. We have incorporated into our system techniques such as a password-hashing scheme, visual cryptography, F5 image steganography, and a threshold decryption cryptosystem. The analysis, design, and implementation phases of the software development of the voting system are discussed in detail, and a questionnaire survey and user acceptance testing of the system were also carried out.
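
    A hedged sketch of the password-hashing building block mentioned above (a generic salted PBKDF2 example in Python, not the authors' Java EE code): the server stores only the salt and the derived key, never the plaintext password.

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 200_000) -> tuple[bytes, bytes]:
    """Return (salt, derived_key) for storage; the plaintext password is never stored."""
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, key

def verify_password(password: str, salt: bytes, stored_key: bytes, iterations: int = 200_000) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, stored_key)   # constant-time comparison

salt, key = hash_password("voter-secret")
print(verify_password("voter-secret", salt, key))   # True
print(verify_password("wrong-guess", salt, key))    # False
```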

  1. Authentication for Bulk Data Dissemination in Sensor Networks Using Symmetric Keys

    National Research Council Canada - National Science Library

    Wang, Limin; Kulkarni, Sandeep

    2007-01-01

    .... Our protocol uses the secret instantiation algorithm for distributing the keys. We apply the symmetric key signatures at the segment/group level and use hashed verification at the packet level...

  2. An Access Control Protocol for Wireless Sensor Network Using Double Trapdoor Chameleon Hash Function

    Directory of Open Access Journals (Sweden)

    Tejeshwari Thakur

    2016-01-01

    Full Text Available A wireless sensor network (WSN), a type of communication system, is normally deployed in an unattended environment where the intended user can get access to the network. The sensor nodes collect data from this environment. If the data are valuable and confidential, then security measures are needed to protect them from unauthorized access. This situation requires an access control protocol (ACP) in the design of the sensor network, because sensor nodes are vulnerable to various malicious attacks during the authentication, key establishment, and new node addition phases. In this paper, we propose a secure ACP for such a WSN. This protocol is based on the Elliptic Curve Discrete Log Problem (ECDLP) and a double trapdoor chameleon hash function, which secures the WSN from malicious attacks such as node masquerading attacks, replay attacks, man-in-the-middle attacks, and forgery attacks. The proposed ACP has a special feature known as session key security. Also, the proposed ACP is more efficient as it requires only one modular multiplication during the initialization phase.

  3. Combining RSA-CRT with Random LSB for Data Security at Kanwil Kementerian Agama Prov. Sumatera Utara

    Directory of Open Access Journals (Sweden)

    Niti Ravika Nasution

    2017-04-01

    Full Text Available In this study the authors use the Rivest Shamir Adleman Chinese Remainder Theorem (RSA-CRT) cryptographic algorithm and the Random Least Significant Bit (LSB) steganography technique. RSA-CRT is basically the same as standard RSA, but it uses the Chinese Remainder Theorem to shorten the effective bit size of the decryption exponent d by splitting d across a system of congruences, which speeds up decryption; the differences lie in the key generation and decryption processes. The ciphertext produced by the RSA-CRT algorithm is hidden in a picture (image) using the Random LSB steganography technique. Random LSB stores the message (ciphertext) in the first or second least significant bit of pixels selected at random, using a Pseudo Random Number Generator (PRNG) based on the Linear Congruential Generator (LCG) method. The ciphertext is extracted from the image by re-running the random number generator with the same key that was used when the message was inserted. The ciphertext is then decrypted by the RSA-CRT algorithm to recover the original text (plaintext). Compared with combining RSA-CRT with simple LSB, combining it with Random LSB yields a higher PSNR and a lower MSE, which means a better level of data security and more resistance to attack, making the secret message harder to recover by cryptanalysis and steganalysis.
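    A minimal sketch of the Random LSB embedding idea described above, assuming a flat byte buffer of pixel values and illustrative LCG parameters (the paper's actual PRNG constants and pixel layout are not specified here):

```python
# Illustrative sketch only: an LCG chooses pseudo-random pixel positions and
# the ciphertext bits are written into the least significant bit of each
# selected byte. The parameters (a, c, m) and the flat pixel buffer are assumptions.
def lcg(seed, a=1103515245, c=12345, m=2**31):
    while True:
        seed = (a * seed + c) % m
        yield seed

def embed_lsb(pixels: bytearray, cipher: bytes, seed: int) -> bytearray:
    gen = lcg(seed)
    used = set()
    bits = [(byte >> i) & 1 for byte in cipher for i in range(8)]
    for bit in bits:
        pos = next(gen) % len(pixels)
        while pos in used:                    # avoid overwriting a used pixel
            pos = next(gen) % len(pixels)
        used.add(pos)
        pixels[pos] = (pixels[pos] & 0xFE) | bit
    return pixels

stego = embed_lsb(bytearray(range(256)), b"Hi", seed=12345)
```

    Extraction would re-run the same LCG with the same seed so the pixels are visited in the same order, reading back the low bits to rebuild the ciphertext.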

  4. Security in Wireless Sensor Networks Employing MACGSP6

    Science.gov (United States)

    Nitipaichit, Yuttasart

    2010-01-01

    Wireless Sensor Networks (WSNs) have unique characteristics which constrain them; including small energy stores, limited computation, and short range communication capability. Most traditional security algorithms use cryptographic primitives such as Public-key cryptography and are not optimized for energy usage. Employing these algorithms for the…

  5. A one-time pad color image cryptosystem based on SHA-3 and multiple chaotic systems

    Science.gov (United States)

    Wang, Xingyuan; Wang, Siwei; Zhang, Yingqian; Luo, Chao

    2018-04-01

    A novel image encryption algorithm is proposed that combines the SHA-3 hash function and two chaotic systems: the hyper-chaotic Lorenz and Chen systems. First, a 384-bit keystream hash value is obtained by applying SHA-3 to the plaintext. The sensitivity of the SHA-3 algorithm and the chaotic systems ensures the effect of a one-time pad. Second, the color image is expanded into three-dimensional space. During permutation, it undergoes plane-plane displacements in the x, y and z dimensions. During diffusion, we use the adjacent pixel dataset and the corresponding chaotic value to encrypt each pixel. Finally, the structure of alternating between permutation and diffusion is applied to enhance the level of security. Furthermore, we design techniques to improve the algorithm's encryption speed. Our experimental simulations show that the proposed cryptosystem achieves excellent encryption performance and can resist brute-force, statistical, and chosen-plaintext attacks.
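    The first step, deriving 384 bits of key material from the plaintext, can be sketched with the standard library's SHA-3 implementation; splitting the digest into initial conditions for the chaotic systems is an assumption for illustration, not the paper's exact mapping:

```python
import hashlib

# Sketch of the first step only: derive a 384-bit digest from the plaintext
# with SHA-3, then split it into candidate key material for the chaotic systems.
plaintext = b"example image bytes"                      # stand-in for the image data
digest = hashlib.sha3_384(plaintext).digest()           # 384 bits = 48 bytes
chunks = [digest[i:i + 8] for i in range(0, 48, 8)]
initial_conditions = [int.from_bytes(c, "big") / 2**64 for c in chunks]  # map to [0, 1)
```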

  6. Efficient Deterministic Finite Automata Minimization Based on Backward Depth Information.

    Science.gov (United States)

    Liu, Desheng; Huang, Zhiping; Zhang, Yimeng; Guo, Xiaojun; Su, Shaojing

    2016-01-01

    Obtaining a minimal automaton is a fundamental issue in the theory and practical implementation of deterministic finite automata (DFAs). A minimization algorithm is presented in this paper that consists of two main phases. In the first phase, the backward depth information is built, and the state set of the DFA is partitioned into many blocks. In the second phase, the state set is refined using a hash table. The minimization algorithm has a lower time complexity O(n) than a naive comparison of transitions O(n^2). Few states need to be refined by the hash table, because most states have been partitioned by the backward depth information in the coarse partition. This method achieves greater generality than previous methods because building the backward depth information is independent of the topological complexity of the DFA. The proposed algorithm can be applied not only to the minimization of acyclic automata or simple cyclic automata, but also to automata with high topological complexity. Overall, the proposal has three advantages: lower time complexity, greater generality, and scalability. A comparison to Hopcroft's algorithm demonstrates experimentally that the algorithm runs faster than traditional algorithms.
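    A toy illustration of the second phase (hash-based refinement), assuming a dictionary-of-transitions DFA representation; the real algorithm iterates this refinement and combines it with the backward-depth coarse partition described above:

```python
from collections import defaultdict

# One refinement pass: states already grouped coarsely are split by hashing each
# state's transition signature (the block reached on every symbol). The DFA
# representation used here is an assumption for illustration.
def refine(blocks, delta, alphabet):
    block_of = {s: i for i, b in enumerate(blocks) for s in b}
    refined = []
    for b in blocks:
        buckets = defaultdict(list)
        for s in b:
            sig = tuple(block_of.get(delta.get((s, a))) for a in alphabet)
            buckets[sig].append(s)   # states with equal signatures stay together
        refined.extend(buckets.values())
    return refined

blocks = [[0, 1, 2], [3]]            # coarse partition (e.g. non-final / final)
delta  = {(0, 'a'): 1, (1, 'a'): 2, (2, 'a'): 3, (3, 'a'): 3}
print(refine(blocks, delta, alphabet=['a']))   # -> [[0, 1], [2], [3]]
```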

  7. Universal Intelligent Data Encryption Standards: A Review

    Directory of Open Access Journals (Sweden)

    Renjith V Ravi

    2014-06-01

    Full Text Available One of the most challenging aspects of electronic communication is data security. The significance of the data exchanged over the internet and other media is increasing. One of the most interesting subjects in the security community is the hunt for the best solution to offer essential protection against intruders' attacks while providing these services in time. Cryptography is one of the main categories of data security; it converts information from its original form into an unreadable form. Two main characteristics distinguish one encryption system from another: its ability to secure the protected data against cryptanalytic attacks, and its speed and efficiency. Cryptographic research has a common objective: to design protocols that offer a confidential and authenticated transmission channel for messages over an insecure network. If a cryptographic algorithm is said to be computationally secure, it cannot be broken with typical resources, either present or future; and apart from the algorithm itself, key distribution is also important for building a proficient cryptographic system.

  8. Cryptography- An ideal solution to privacy, data integrity and non ...

    African Journals Online (AJOL)

    Encryption, hashing and digital signatures are the three primitives of cryptography; these have been treated in depth, and their performance on text data and image data has been studied. The most secure algorithms so far in use have been introduced, and the respective performance of each primitive's algorithm on ...

  9. Twisted Polynomials and Forgery Attacks on GCM

    DEFF Research Database (Denmark)

    Abdelraheem, Mohamed Ahmed A. M. A.; Beelen, Peter; Bogdanov, Andrey

    2015-01-01

    Polynomial hashing as an instantiation of universal hashing is a widely employed method for the construction of MACs and authenticated encryption (AE) schemes, the ubiquitous GCM being a prominent example. It is also used in recent AE proposals within the CAESAR competition which aim at providing...... in an improved key recovery algorithm. As cryptanalytic applications of our twisted polynomials, we develop the first universal forgery attacks on GCM in the weak-key model that do not require nonce reuse. Moreover, we present universal weak-key forgeries for the nonce-misuse resistant AE scheme POET, which...

  10. Design and implementation of a privacy preserving electronic health record linkage tool in Chicago.

    Science.gov (United States)

    Kho, Abel N; Cashy, John P; Jackson, Kathryn L; Pah, Adam R; Goel, Satyender; Boehnke, Jörn; Humphries, John Eric; Kominers, Scott Duke; Hota, Bala N; Sims, Shannon A; Malin, Bradley A; French, Dustin D; Walunas, Theresa L; Meltzer, David O; Kaleba, Erin O; Jones, Roderick C; Galanter, William L

    2015-09-01

    To design and implement a tool that creates a secure, privacy preserving linkage of electronic health record (EHR) data across multiple sites in a large metropolitan area in the United States (Chicago, IL), for use in clinical research. The authors developed and distributed a software application that performs standardized data cleaning, preprocessing, and hashing of patient identifiers to remove all protected health information. The application creates seeded hash code combinations of patient identifiers using a Health Insurance Portability and Accountability Act compliant SHA-512 algorithm that minimizes re-identification risk. The authors subsequently linked individual records using a central honest broker with an algorithm that assigns weights to hash combinations in order to generate high specificity matches. The software application successfully linked and de-duplicated 7 million records across 6 institutions, resulting in a cohort of 5 million unique records. Using a manually reconciled set of 11,292 patients as a gold standard, the software achieved a sensitivity of 96% and a specificity of 100%, with a majority of the missed matches accounted for by patients with both a missing social security number and a last name change. Using 3 disease examples, it is demonstrated that the software can reduce duplication of patient records across sites by as much as 28%. Software that standardizes the assignment of a unique seeded hash identifier merged through an agreed upon third-party honest broker can enable large-scale secure linkage of EHR data for epidemiologic and public health research. The software algorithm can improve future epidemiologic research by providing more comprehensive data given that patients may make use of multiple healthcare systems. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
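    The seeded-hash idea can be sketched as follows; the normalization steps, field choices, and seed value are illustrative assumptions, not the authors' exact software:

```python
import hashlib
import unicodedata

# Minimal sketch of the general idea (not the authors' application): normalize a
# combination of identifiers and hash it with a shared seed ("salt") using SHA-512,
# so that sites exchange only hash codes rather than protected health information.
def seeded_hash(seed: str, *fields: str) -> str:
    norm = "|".join(unicodedata.normalize("NFKD", f).strip().upper() for f in fields)
    return hashlib.sha512((seed + norm).encode("utf-8")).hexdigest()

# Hypothetical identifier combination: last name, first name, date of birth.
code = seeded_hash("shared-secret-seed", "DOE", "JOHN", "1970-01-01")
print(code[:16], "...")
```

    An honest broker comparing such codes from different sites can match records without ever seeing the underlying identifiers, which is the property the abstract relies on.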

  11. Towards Symbolic Encryption Schemes

    DEFF Research Database (Denmark)

    Ahmed, Naveed; Jensen, Christian D.; Zenner, Erik

    2012-01-01

    Symbolic encryption, in the style of Dolev-Yao models, is ubiquitous in formal security models. In its common use, encryption on a whole message is specified as a single monolithic block. From a cryptographic perspective, however, this may require a resource-intensive cryptographic algorithm, namely an authenticated encryption scheme that is secure under chosen ciphertext attack. Therefore, many reasonable encryption schemes, such as AES in the CBC or CFB mode, are not among the implementation options. In this paper, we report new attacks on CBC and CFB based implementations of the well-known Needham-Schroeder and Denning-Sacco protocols. To avoid such problems, we advocate the use of refined notions of symbolic encryption that have natural correspondence to standard cryptographic encryption schemes.

  12. Bit-level differential power analysis attack on implementations of advanced encryption standard software running inside a PIC18F2420 microcontroller

    CSIR Research Space (South Africa)

    Mpalane, K

    2015-12-01

    Full Text Available importance. For this reason, cryptographic device developers rely on cryptography to secure their data [1]. Consequently, cryptographic devices depend on cipher algorithms to ensure confidentiality and integrity of data. The goal of cryptography is to use...

  13. A fast exact simulation method for a class of Markov jump processes.

    Science.gov (United States)

    Li, Yao; Hu, Lili

    2015-11-14

    A new method of the stochastic simulation algorithm (SSA), named the Hashing-Leaping method (HLM), for exact simulations of a class of Markov jump processes, is presented in this paper. The HLM has a conditional constant computational cost per event, which is independent of the number of exponential clocks in the Markov process. The main idea of the HLM is to repeatedly implement a hash-table-like bucket sort algorithm for all times of occurrence covered by a time step with length τ. This paper serves as an introduction to this new SSA method. We introduce the method, demonstrate its implementation, analyze its properties, and compare its performance with three other commonly used SSA methods in four examples. Our performance tests and CPU operation statistics show certain advantages of the HLM for large scale problems.

  14. Secured Session-key Distribution using control Vector Encryption / Decryption Process

    International Nuclear Information System (INIS)

    Ismail Jabiullah, M.; Abdullah Al-Shamim; Khaleqdad Khan, ANM; Lutfar Rahman, M.

    2006-01-01

    Frequent key changes are very desirable for secret communications and are thus in high demand. A session-key distribution technique has been designed and implemented in the C programming language, in which the communication between the end-users is encrypted with a session key used for the duration of a logical connection. Each session-key is obtained from the key distribution center (KDC) over the same networking facilities used for end-user communication. The control vector is cryptographically coupled with the session-key at the time of key generation in the KDC. For this, the generated hash value, the master key and the session-key are used to produce the encrypted session-key, which has to be transferred. All the operations have been performed using the C programming language. This process can be widely applied to all sorts of electronic transactions, online or offline, commercial and academic. (authors)
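    A simplified sketch of the control-vector coupling described above, with XOR standing in for the block cipher and SHA-256 as the hash; the actual primitives and key sizes used in the paper are not specified here:

```python
import hashlib
from secrets import token_bytes

# Sketch of the control-vector idea (simplified): the control vector is hashed,
# XORed with the master key, and the result is used to encrypt the session key.
# XOR stands in for the real block cipher purely for illustration.
def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

master_key  = token_bytes(32)
control_vec = b"usage=data-encryption;export=no"   # hypothetical control vector
session_key = token_bytes(32)

coupled_key  = xor(master_key, hashlib.sha256(control_vec).digest())
encrypted_sk = xor(session_key, coupled_key)        # toy "encryption" for transfer
recovered_sk = xor(encrypted_sk, xor(master_key, hashlib.sha256(control_vec).digest()))
assert recovered_sk == session_key
```

    Because the control vector enters the key derivation, a receiver holding the master key can only recover the session key if it uses the same control vector, which binds the key to its declared usage.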

  15. A Novel Method for Generating Encryption Keys

    Directory of Open Access Journals (Sweden)

    Dascalescu Ana Cristina

    2009-12-01

    Full Text Available The development of the informational society, which has led to an impressive growth of the information volume circulating in computer networks, has accelerated the evolution and especially the use of modern cryptography instruments. Today, commercial products use standard cryptographic libraries that implement certified and tested cryptographic algorithms. Instead, the fragility of encryption algorithms comes from compositional operations like key handling or key generation. In this sense, the article proposes an innovative method to generate pseudorandom numbers which can be used for the construction of secure stream encryption keys. The proposed method is based on the algebra of finite fields and uses a particularized structure of linear feedback shift registers.
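    A minimal LFSR sketch of the kind of generator described above; the register width and tap positions are illustrative and not the particular structure proposed in the paper:

```python
# Minimal LFSR sketch: a linear feedback shift register with a chosen tap mask
# produces a pseudo-random bit stream usable as raw key material.
def lfsr_bits(state: int, taps: int, width: int, n: int):
    mask = (1 << width) - 1
    for _ in range(n):
        yield state & 1                        # output the low bit
        fb = bin(state & taps).count("1") & 1  # XOR of the tapped positions
        state = ((state >> 1) | (fb << (width - 1))) & mask

# Illustrative 7-bit register with taps at positions 6 and 5 (x^7 + x^6 + 1).
bits = list(lfsr_bits(state=0b1011011, taps=0b1100000, width=7, n=16))
print(bits)
```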

  16. Network Security via Biometric Recognition of Patterns of Gene Expression

    Science.gov (United States)

    Shaw, Harry C.

    2016-01-01

    Molecular biology provides the ability to implement forms of information and network security completely outside the bounds of legacy security protocols and algorithms. This paper addresses an approach which instantiates the power of gene expression for security. Molecular biology provides a rich source of gene expression and regulation mechanisms, which can be adopted to use in the information and electronic communication domains. Conventional security protocols are becoming increasingly vulnerable due to more intensive, highly capable attacks on the underlying mathematics of cryptography. Security protocols are being undermined by social engineering and substandard implementations by IT (Information Technology) organizations. Molecular biology can provide countermeasures to these weak points with the current security approaches. Future advances in instruments for analyzing assays will also enable this protocol to advance from one of cryptographic algorithms to an integrated system of cryptographic algorithms and real-time assays of gene expression products.

  17. Network Security via Biometric Recognition of Patterns of Gene Expression

    Science.gov (United States)

    Shaw, Harry C.

    2016-01-01

    Molecular biology provides the ability to implement forms of information and network security completely outside the bounds of legacy security protocols and algorithms. This paper addresses an approach which instantiates the power of gene expression for security. Molecular biology provides a rich source of gene expression and regulation mechanisms, which can be adopted to use in the information and electronic communication domains. Conventional security protocols are becoming increasingly vulnerable due to more intensive, highly capable attacks on the underlying mathematics of cryptography. Security protocols are being undermined by social engineering and substandard implementations by IT organizations. Molecular biology can provide countermeasures to these weak points with the current security approaches. Future advances in instruments for analyzing assays will also enable this protocol to advance from one of cryptographic algorithms to an integrated system of cryptographic algorithms and real-time expression and assay of gene expression products.

  18. Remote Control and Testing of the Interactive TV-Decoder

    Directory of Open Access Journals (Sweden)

    K. Vlcek

    1995-12-01

    Full Text Available The article deals with the assembly and application of a complex sequential circuit VHDL (VHSIC (Very High-Speed Integrated Circuit) Hardware Description Language) model. The circuit model is the core of a cryptographic device for encoding and decoding the signal of discrete transmissions over a TV-cable network. The cryptographic algorithm is changeable according to the user's wishes. The principles of creation and example implementations are presented in the article. The behavioural model is used to minimize mistakes in the ASICs (Application Specific Integrated Circuits). The circuit implementation uses FPGA (Field Programmable Gate Array) technology. The diagnostics of the circuit are based on remote testing per IEEE Std 1149.1-1990. The VHDL model of the diagnostic subsystem is created as a model orthogonal to the cryptographic circuit VHDL model.

  19. A Novel Single Pass Authenticated Encryption Stream Cipher for Software Defined Radios

    DEFF Research Database (Denmark)

    Khajuria, Samant

    2012-01-01

    to propose cryptographic services such as confidentiality, integrity and authentication. Therefore, integration of security services into SDR devices is essential. Authenticated Encryption schemes denote the class of cryptographic algorithms that are designed for protecting both message confidentiality...... This makes authenticated encryption very attractive for low-cost low-power hardware implementations, as it allows for a substantial decrease in the circuit area and power consumed compared to the traditional schemes. In this thesis, an authenticated encryption scheme is proposed with the focus of achieving...... high throughput and low overhead for SDRs. The thesis is divided into two research topics. One topic is the design of a 1-pass authenticated encryption scheme that can accomplish both message secrecy and authenticity in a single cryptographic primitive. The other topic is the implementation...

  20. LDPC and SHA based iris recognition for image authentication

    Directory of Open Access Journals (Sweden)

    K. Seetharaman

    2012-11-01

    Full Text Available We introduce a novel way to authenticate an image using a Low Density Parity Check (LDPC) and Secure Hash Algorithm (SHA) based iris recognition method with a reversible watermarking scheme, which is based on the Integer Wavelet Transform (IWT) and a threshold embedding technique. The parity checks and parity matrix of LDPC encoding and the cancellable biometric, i.e., the hash string of the unique iris code from SHA-512, are embedded into an image for authentication purposes using the reversible watermarking scheme based on IWT and threshold embedding. Simply by reversing the embedding process, the original image, parity checks, parity matrix and SHA-512 hash are extracted back from the watermarked image. For authentication, the new hash string produced by employing SHA-512 on the error-corrected iris code from a live person is compared with the hash string extracted from the watermarked image. The LDPC code reduces the Hamming distance for genuine comparisons by a larger amount than for the impostor comparisons. This results in better separation between genuine and impostor users, which improves the authentication performance. Security of this scheme is very high due to the security complexity of SHA-512, which is 2^256 under a birthday attack. Experimental results show that this approach can assure more accurate authentication with a low false rejection or false acceptance rate and outperforms the prior arts in terms of PSNR.

  1. Cryptanalysis of "an improvement over an image encryption method based on total shuffling"

    Science.gov (United States)

    Akhavan, A.; Samsudin, A.; Akhshani, A.

    2015-09-01

    In the past two decades, several image encryption algorithms based on chaotic systems have been proposed. Many of the proposed algorithms are meant to improve other chaos-based and conventional cryptographic algorithms. However, many of the proposed improvement methods suffer from serious security problems. In this paper, the security of a recently proposed improvement method for a chaos-based image encryption algorithm is analyzed. The results indicate the weakness of the analyzed algorithm against chosen plain-text attacks.

  2. Associations between butane hash oil use and cannabis-related problems.

    Science.gov (United States)

    Meier, Madeline H

    2017-10-01

    High-potency cannabis concentrates are increasingly popular in the United States, and there is concern that use of high-potency cannabis might increase risk for cannabis-related problems. However, little is known about the potential negative consequences of concentrate use. This study reports on associations between past-year use of a high-potency cannabis concentrate, known as butane hash oil (BHO), and cannabis-related problems. A sample of 821 college students were recruited to complete a survey about their health and behavior. Participants who had used cannabis in the past year (33%, n=273) completed questions about their cannabis use, including their use of BHO and cannabis-related problems in eight domains: physical dependence, impaired control, academic-occupational problems, social-interpersonal problems, self-care problems, self-perception, risk behavior, and blackouts. Approximately 44% (n=121) of past-year cannabis users had used BHO in the past year. More frequent BHO use was associated with higher levels of physical dependence (RR=1.8), cannabis-related academic/occupational problems (RR=1.5, p=0.004), poor self-care (RR=1.3, p=0.002), and cannabis-related risk behavior (RR=1.2, p=0.001). After accounting for sociodemographic factors, age of onset of cannabis use, sensation seeking, overall frequency of cannabis use, and frequency of other substance use, BHO use was still associated with higher levels of physical dependence (RR=1.2, p=0.014). BHO use is associated with greater physiological dependence on cannabis, even after accounting for potential confounders. Longitudinal research is needed to determine if cannabis users with higher levels of physiological dependence seek out BHO and/or if BHO use increases risk for physiological dependence. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Privacy-Preserving Patient Similarity Learning in a Federated Environment: Development and Analysis.

    Science.gov (United States)

    Lee, Junghye; Sun, Jimeng; Wang, Fei; Wang, Shuang; Jun, Chi-Hyuck; Jiang, Xiaoqian

    2018-04-13

    There is an urgent need for the development of global analytic frameworks that can perform analyses in a privacy-preserving federated environment across multiple institutions without privacy leakage. A few studies on the topic of federated medical analysis have been conducted recently with the focus on several algorithms. However, none of them have solved similar patient matching, which is useful for applications such as cohort construction for cross-institution observational studies, disease surveillance, and clinical trials recruitment. The aim of this study was to present a privacy-preserving platform in a federated setting for patient similarity learning across institutions. Without sharing patient-level information, our model can find similar patients from one hospital to another. We proposed a federated patient hashing framework and developed a novel algorithm to learn context-specific hash codes to represent patients across institutions. The similarities between patients can be efficiently computed using the resulting hash codes of corresponding patients. To avoid security attack from reverse engineering on the model, we applied homomorphic encryption to patient similarity search in a federated setting. We used sequential medical events extracted from the Multiparameter Intelligent Monitoring in Intensive Care-III database to evaluate the proposed algorithm in predicting the incidence of five diseases independently. Our algorithm achieved averaged area under the curves of 0.9154 and 0.8012 with balanced and imbalanced data, respectively, in κ-nearest neighbor with κ=3. We also confirmed privacy preservation in similarity search by using homomorphic encryption. The proposed algorithm can help search similar patients across institutions effectively to support federated data analysis in a privacy-preserving manner. ©Junghye Lee, Jimeng Sun, Fei Wang, Shuang Wang, Chi-Hyuck Jun, Xiaoqian Jiang. Originally published in JMIR Medical Informatics (http

  4. Report on Pairing-based Cryptography.

    Science.gov (United States)

    Moody, Dustin; Peralta, Rene; Perlner, Ray; Regenscheid, Andrew; Roginsky, Allen; Chen, Lily

    2015-01-01

    This report summarizes study results on pairing-based cryptography. The main purpose of the study is to form NIST's position on standardizing and recommending pairing-based cryptography schemes currently published in research literature and standardized in other standard bodies. The report reviews the mathematical background of pairings. This includes topics such as pairing-friendly elliptic curves and how to compute various pairings. It includes a brief introduction to existing identity-based encryption (IBE) schemes and other cryptographic schemes using pairing technology. The report provides a complete study of the current status of standard activities on pairing-based cryptographic schemes. It explores different application scenarios for pairing-based cryptography schemes. As an important aspect of adopting pairing-based schemes, the report also considers the challenges inherent in validation testing of cryptographic algorithms and modules. Based on the study, the report suggests an approach for including pairing-based cryptography schemes in the NIST cryptographic toolkit. The report also outlines several questions that will require further study if this approach is followed.

  5. User characteristics and effect profile of Butane Hash Oil: An extremely high-potency cannabis concentrate.

    Science.gov (United States)

    Chan, Gary C K; Hall, Wayne; Freeman, Tom P; Ferris, Jason; Kelly, Adrian B; Winstock, Adam

    2017-09-01

    Recent reports suggest an increase in use of extremely potent cannabis concentrates such as Butane Hash Oil (BHO) in some developed countries. The aims of this study were to examine the characteristics of BHO users and the effect profiles of BHO. Anonymous online survey in over 20 countries in 2014 and 2015. Participants aged 18 years or older were recruited through onward promotion and online social networks. The overall sample size was 181,870. In this sample, 46% (N=83,867) reported using some form of cannabis in the past year, and 3% reported BHO use (n=5922). Participants reported their use of 7 types of cannabis in the past 12 months, the source of their cannabis, reasons for use, use of other illegal substances, and lifetime diagnosis for depression, anxiety and psychosis. Participants were asked to rate subjective effects of BHO and high potency herbal cannabis. Participants who reported a lifetime diagnosis of depression (OR=1.15, p=0.003), anxiety (OR=1.72, pcannabis. BHO users also reported stronger negative effects and less positive effects when using BHO than high potency herbal cannabis (pcannabis. Copyright © 2017. Published by Elsevier B.V.

  6. A Robust and Effective Smart-Card-Based Remote User Authentication Mechanism Using Hash Function

    Science.gov (United States)

    Odelu, Vanga; Goswami, Adrijit

    2014-01-01

    In a remote user authentication scheme, a remote server verifies whether a login user is genuine and trustworthy, and also for mutual authentication purpose a login user validates whether the remote server is genuine and trustworthy. Several remote user authentication schemes using the password, the biometrics, and the smart card have been proposed in the literature. However, most schemes proposed in the literature are either computationally expensive or insecure against several known attacks. In this paper, we aim to propose a new robust and effective password-based remote user authentication scheme using smart card. Our scheme is efficient, because our scheme uses only efficient one-way hash function and bitwise XOR operations. Through the rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. We perform the simulation for the formal security analysis using the widely accepted AVISPA (Automated Validation Internet Security Protocols and Applications) tool to ensure that our scheme is secure against passive and active attacks. Furthermore, our scheme supports efficiently the password change phase always locally without contacting the remote server and correctly. In addition, our scheme performs significantly better than other existing schemes in terms of communication, computational overheads, security, and features provided by our scheme. PMID:24892078

  7. A Robust and Effective Smart-Card-Based Remote User Authentication Mechanism Using Hash Function

    Directory of Open Access Journals (Sweden)

    Ashok Kumar Das

    2014-01-01

    Full Text Available In a remote user authentication scheme, a remote server verifies whether a login user is genuine and trustworthy, and also for mutual authentication purpose a login user validates whether the remote server is genuine and trustworthy. Several remote user authentication schemes using the password, the biometrics, and the smart card have been proposed in the literature. However, most schemes proposed in the literature are either computationally expensive or insecure against several known attacks. In this paper, we aim to propose a new robust and effective password-based remote user authentication scheme using smart card. Our scheme is efficient, because our scheme uses only efficient one-way hash function and bitwise XOR operations. Through the rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. We perform the simulation for the formal security analysis using the widely accepted AVISPA (Automated Validation Internet Security Protocols and Applications tool to ensure that our scheme is secure against passive and active attacks. Furthermore, our scheme supports efficiently the password change phase always locally without contacting the remote server and correctly. In addition, our scheme performs significantly better than other existing schemes in terms of communication, computational overheads, security, and features provided by our scheme.

  8. A robust and effective smart-card-based remote user authentication mechanism using hash function.

    Science.gov (United States)

    Das, Ashok Kumar; Odelu, Vanga; Goswami, Adrijit

    2014-01-01

    In a remote user authentication scheme, a remote server verifies whether a login user is genuine and trustworthy, and also for mutual authentication purpose a login user validates whether the remote server is genuine and trustworthy. Several remote user authentication schemes using the password, the biometrics, and the smart card have been proposed in the literature. However, most schemes proposed in the literature are either computationally expensive or insecure against several known attacks. In this paper, we aim to propose a new robust and effective password-based remote user authentication scheme using smart card. Our scheme is efficient, because our scheme uses only efficient one-way hash function and bitwise XOR operations. Through the rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. We perform the simulation for the formal security analysis using the widely accepted AVISPA (Automated Validation Internet Security Protocols and Applications) tool to ensure that our scheme is secure against passive and active attacks. Furthermore, our scheme supports efficiently the password change phase always locally without contacting the remote server and correctly. In addition, our scheme performs significantly better than other existing schemes in terms of communication, computational overheads, security, and features provided by our scheme.
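    The hash-and-XOR style of computation underlying such smart-card schemes can be sketched roughly as follows; the quantities and message flow are simplified assumptions, not the exact protocol of the paper:

```python
import hashlib
import secrets

h = lambda *parts: hashlib.sha256(b"|".join(parts)).digest()
xor = lambda a, b: bytes(x ^ y for x, y in zip(a, b))

# Registration (illustrative values only): the server's long-term secret x is
# folded with the identity, then masked by a hash of the password and stored
# on the smart card, so the card alone reveals neither x nor the password.
x = secrets.token_bytes(32)
ID, PW = b"alice", b"correct horse"
card_value = xor(h(ID, x), h(PW))

# Login: the card re-derives h(ID, x) from the typed password and builds a
# fresh authenticator bound to a nonce, which the server can verify using x.
nonce = secrets.token_bytes(16)
login_proof = h(xor(card_value, h(PW)), nonce)     # equals h(h(ID, x), nonce)
assert login_proof == h(h(ID, x), nonce)           # server-side check
```

    Because only hashes and XORs are involved, the per-login cost stays far below public-key operations, which is the efficiency argument made in the abstracts above.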

  9. A System-Level Throughput Model for Quantum Key Distribution

    Science.gov (United States)

    2015-09-17

    discrete logarithms in a finite field [35]. Arguably the most popular asymmetric encryption scheme is the RSA algorithm, published a year later in...

  10. A Secure and Robust Object-Based Video Authentication System

    Directory of Open Access Journals (Sweden)

    He Dajun

    2004-01-01

    Full Text Available An object-based video authentication system, which combines watermarking, error correction coding (ECC, and digital signature techniques, is presented for protecting the authenticity between video objects and their associated backgrounds. In this system, a set of angular radial transformation (ART coefficients is selected as the feature to represent the video object and the background, respectively. ECC and cryptographic hashing are applied to those selected coefficients to generate the robust authentication watermark. This content-based, semifragile watermark is then embedded into the objects frame by frame before MPEG4 coding. In watermark embedding and extraction, groups of discrete Fourier transform (DFT coefficients are randomly selected, and their energy relationships are employed to hide and extract the watermark. The experimental results demonstrate that our system is robust to MPEG4 compression, object segmentation errors, and some common object-based video processing such as object translation, rotation, and scaling while securely preventing malicious object modifications. The proposed solution can be further incorporated into public key infrastructure (PKI.

  11. PRESAGE: PRivacy-preserving gEnetic testing via SoftwAre Guard Extension.

    Science.gov (United States)

    Chen, Feng; Wang, Chenghong; Dai, Wenrui; Jiang, Xiaoqian; Mohammed, Noman; Al Aziz, Md Momin; Sadat, Md Nazmus; Sahinalp, Cenk; Lauter, Kristin; Wang, Shuang

    2017-07-26

    Advances in DNA sequencing technologies have prompted a wide range of genomic applications to improve healthcare and facilitate biomedical research. However, privacy and security concerns have emerged as a challenge for utilizing cloud computing to handle sensitive genomic data. We present one of the first implementations of Software Guard Extension (SGX) based securely outsourced genetic testing framework, which leverages multiple cryptographic protocols and minimal perfect hash scheme to enable efficient and secure data storage and computation outsourcing. We compared the performance of the proposed PRESAGE framework with the state-of-the-art homomorphic encryption scheme, as well as the plaintext implementation. The experimental results demonstrated significant performance over the homomorphic encryption methods and a small computational overhead in comparison to plaintext implementation. The proposed PRESAGE provides an alternative solution for secure and efficient genomic data outsourcing in an untrusted cloud by using a hybrid framework that combines secure hardware and multiple crypto protocols.

  12. Cipher block based authentication module: A hardware design perspective

    NARCIS (Netherlands)

    Michail, H.E.; Schinianakis, D.; Goutis, C.E.; Kakarountas, A.P.; Selimis, G.

    2011-01-01

    Message Authentication Codes (MACs) are widely used in order to authenticate data packets, which are transmitted thought networks. Typically MACs are implemented using modules like hash functions and in conjunction with encryption algorithms (like Block Ciphers), which are used to encrypt the

  13. Lorenz's attractor applied to the stream cipher (Ali-Pacha generator)

    International Nuclear Information System (INIS)

    Ali-Pacha, Adda; Hadj-Said, Naima; M'Hamed, A.; Belgoraf, A.

    2007-01-01

    The safety of information is primarily founded today on the calculation of algorithms whose confidentiality depends on the number of the necessary bits for the definition of a cryptographic key. If this type of system has proved reliable, then the increasing power of the means of calculation threatens the confidentiality of these methods. The powerful computers are certainly able to quantify and decipher information quickly, but their computing speed allows parallel cryptanalysis, which aims 'to break' a code by discovering the key, for example, by testing all the possible keys. The only evocation of the principle of the quantum computer, with the potentially colossal capacities of calculation, has started a shock, even in the most savaged who are convinced of algorithmic cryptography. To mitigate this concern, we will introduce in this article a new cryptographic system based on chaotic concepts

  14. An efficient entire chaos-based scheme for deniable authentication

    International Nuclear Information System (INIS)

    Xiao Di; Liao Xiaofeng; Wong, K.W.

    2005-01-01

    By using a chaotic encryption-hash parallel algorithm and the semi-group property of Chebyshev chaotic map, we propose a secure and efficient scheme for the deniable authentication. The scheme is efficient, practicable and reliable, with high potential to be adopted for e-commerce
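    The semi-group property of Chebyshev polynomials relied on above, T_r(T_s(x)) = T_rs(x) = T_s(T_r(x)), can be checked numerically with the trigonometric definition T_n(x) = cos(n * arccos(x)); this is only a numerical illustration of the property, not the authentication scheme itself:

```python
import math

# Numerical check of the semi-group property: composing Chebyshev maps with
# degrees r and s gives the same result as a single map of degree r*s.
def cheb(n: int, x: float) -> float:
    return math.cos(n * math.acos(x))

x, r, s = 0.4321, 7, 11
assert abs(cheb(r, cheb(s, x)) - cheb(r * s, x)) < 1e-9
assert abs(cheb(s, cheb(r, x)) - cheb(r * s, x)) < 1e-9
```

    This commutativity is what allows two parties to reach the same value from public intermediate results, in the same spirit as Diffie-Hellman style key agreement.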

  15. Scalability and Total Recall with Fast CoveringLSH

    DEFF Research Database (Denmark)

    Pham, Ninh Dang; Pagh, Rasmus

    2016-01-01

    Locality-sensitive hashing (LSH) has emerged as the dominant algorithmic technique for similarity search with strong performance guarantees in high-dimensional spaces. A drawback of traditional LSH schemes is that they may have false negatives, i.e., the recall is less than 100%. This limits...

  16. An efficient entire chaos-based scheme for deniable authentication

    Energy Technology Data Exchange (ETDEWEB)

    Xiao Di [College of Computer Science and Engineering, Chongqing University, Chongqing, 400044 (China) and College of Mechanical Engineering, Chongqing University, Chongqing, 400044 (China)]. E-mail: xiaodi_cqu@hotmail.com; Liao Xiaofeng [College of Computer Science and Engineering, Chongqing University, Chongqing, 400044 (China); Wong, K.W. [Department of Computer Engineering and Information Technology, City University of Hong Kong, Hong Kong (China)

    2005-02-01

    By using a chaotic encryption-hash parallel algorithm and the semi-group property of Chebyshev chaotic map, we propose a secure and efficient scheme for the deniable authentication. The scheme is efficient, practicable and reliable, with high potential to be adopted for e-commerce.

  17. Post-quantum cryptography

    Science.gov (United States)

    Bernstein, Daniel J.; Lange, Tanja

    2017-09-01

    Cryptography is essential for the security of online communication, cars and implanted medical devices. However, many commonly used cryptosystems will be completely broken once large quantum computers exist. Post-quantum cryptography is cryptography under the assumption that the attacker has a large quantum computer; post-quantum cryptosystems strive to remain secure even in this scenario. This relatively young research area has seen some successes in identifying mathematical operations for which quantum algorithms offer little advantage in speed, and then building cryptographic systems around those. The central challenge in post-quantum cryptography is to meet demands for cryptographic usability and flexibility without sacrificing confidence.

  18. Post-quantum cryptography.

    Science.gov (United States)

    Bernstein, Daniel J; Lange, Tanja

    2017-09-13

    Cryptography is essential for the security of online communication, cars and implanted medical devices. However, many commonly used cryptosystems will be completely broken once large quantum computers exist. Post-quantum cryptography is cryptography under the assumption that the attacker has a large quantum computer; post-quantum cryptosystems strive to remain secure even in this scenario. This relatively young research area has seen some successes in identifying mathematical operations for which quantum algorithms offer little advantage in speed, and then building cryptographic systems around those. The central challenge in post-quantum cryptography is to meet demands for cryptographic usability and flexibility without sacrificing confidence.

  19. Evaluating the multi-threading countermeasure

    CSIR Research Space (South Africa)

    Frieslaar, Ibraheem

    2016-12-01

    Full Text Available to obfuscate individuals' information from people attempting to intercept data. One of these cryptographic algorithms is the AES algorithm [1]. This algorithm has been declared the standard protocol for encrypting information by the National Institute...-128 algorithm, four steps were followed: while the AES-128 algorithm was executing the encryption process, the power traces along with the corresponding input text were captured; a power leakage model was implemented where the guess of a key byte...

  20. Investigating multi-thread utilization as a software defence mechanism against side channel attacks

    CSIR Research Space (South Africa)

    Frieslaar, Ibraheem

    2016-11-01

    Full Text Available out information at critical points in the cryptographic algorithm and confuse the attacker. This research demonstrates it is capable of outperforming the known countermeasure of hiding and shuffling in terms of preventing the secret information from...

  1. Lattice Based Mix Network for Location Privacy in Mobile System

    Directory of Open Access Journals (Sweden)

    Kunwar Singh

    2015-01-01

    Full Text Available In 1981, David Chaum proposed a cryptographic primitive for privacy called the mix network (mixnet). A mixnet is a cryptographic construction that establishes an anonymous communication channel through a set of servers. In 2004, Golle et al. proposed a new cryptographic primitive called universal reencryption, which takes as input messages encrypted under the public key of the recipients, not the public key of the universal mixnet. In Eurocrypt 2010, Gentry, Halevi, and Vaikuntanathan presented a cryptosystem which is additively homomorphic and multiplicatively homomorphic for only one multiplication. In MIST 2013, Singh et al. presented a lattice based universal reencryption scheme under the learning with errors (LWE) assumption. In this paper, we have improved Singh et al.'s scheme using Fairbrother's idea. LWE is a hard lattice problem for which, so far, no polynomial time quantum algorithm is known. Wiangsripanawan et al. proposed a protocol for location privacy in mobile systems using universal reencryption whose security is reducible to the Decisional Diffie-Hellman assumption. Once quantum computers become a reality, universal reencryption can be broken in polynomial time by Shor's algorithm. In postquantum cryptography, our scheme can replace the universal reencryption scheme used in Wiangsripanawan et al.'s protocol for location privacy in mobile systems.

  2. Cryptanalysis of Compact-LWE and Related Lightweight Public Key Encryption

    Directory of Open Access Journals (Sweden)

    Dianyan Xiao

    2018-01-01

    Full Text Available In the emerging Internet of Things (IoT, lightweight public key cryptography plays an essential role in security and privacy protection. With the approach of quantum computing era, it is important to design and evaluate lightweight quantum-resistant cryptographic algorithms applicable to IoT. LWE-based cryptography is a widely used and well-studied family of postquantum cryptographic constructions whose hardness is based on worst-case lattice problems. To make LWE friendly to resource-constrained IoT devices, a variant of LWE, named Compact-LWE, was proposed and used to design lightweight cryptographic schemes. In this paper, we study the so-called Compact-LWE problem and clarify that under certain parameter settings it can be solved in polynomial time. As a consequence, our result leads to a practical attack against an instantiated scheme based on Compact-LWE proposed by Liu et al. in 2017.

  3. Number Theory and Public-Key Cryptography.

    Science.gov (United States)

    Lefton, Phyllis

    1991-01-01

    Described are activities in the study of techniques used to conceal the meanings of messages and data. Some background information and two BASIC programs that illustrate the algorithms used in a new cryptographic system called "public-key cryptography" are included. (CW)

  4. Second order statistical behavior of LLL and BKZ

    NARCIS (Netherlands)

    Y. Yu (Yang); L. Ducas (Léo)

    2017-01-01

    textabstractThe LLL algorithm (from Lenstra, Lenstra and Lovász) and its generalization BKZ (from Schnorr and Euchner) are widely used in cryptanalysis, especially for lattice-based cryptography. Precisely understanding their behavior is crucial for deriving appropriate key-size for cryptographic

  5. Vulnerability of advanced encryption standard algorithm to differential power analysis attacks implemented on ATmega-128 microcontroller

    CSIR Research Space (South Africa)

    Mpalane, Kealeboga

    2016-09-01

    Full Text Available A wide variety of cryptographic embedded devices including smartcards, ASICs and FPGAs must be secure against breaking in. However, these devices are vulnerable to side channel attacks. A side channel attack uses physical attributes...

  6. A novel image encryption scheme based on spatial chaos map

    International Nuclear Information System (INIS)

    Sun Fuyan; Liu Shutang; Li Zhongqin; Lue Zongwang

    2008-01-01

    In recent years, chaos-based cryptographic algorithms have suggested some new and efficient ways to develop secure image encryption techniques, but the drawbacks of small key space and weak security in one-dimensional chaotic cryptosystems are obvious. In this paper, a spatial chaos system is used for high-security image encryption while keeping its speed acceptable. The proposed algorithm is described in detail. The basic idea is to encrypt the image in space with a spatial chaos map pixel by pixel, and then the pixels are confused in multiple directions of space. After only one cycle of this method, the image becomes indistinguishable in space due to inherent properties of spatial chaotic systems. Several experimental results, key sensitivity tests, key space analysis, and statistical analysis show that the approach provides an efficient and secure way for real-time image encryption and transmission from the cryptographic viewpoint.

  7. An enhanced biometric authentication scheme for telecare medicine information systems with nonce using chaotic hash function.

    Science.gov (United States)

    Das, Ashok Kumar; Goswami, Adrijit

    2014-06-01

    Recently, Awasthi and Srivastava proposed a novel biometric remote user authentication scheme for the telecare medicine information system (TMIS) with nonce. Their scheme is very efficient as it is based on an efficient chaotic one-way hash function and bitwise XOR operations. In this paper, we first analyze Awasthi-Srivastava's scheme and then show that their scheme has several drawbacks: (1) incorrect password change phase, (2) fails to preserve the user anonymity property, (3) fails to establish a secret session key between a legal user and the server, (4) fails to protect against strong replay attacks, and (5) lacks rigorous formal security analysis. We then propose a novel and secure biometric-based remote user authentication scheme in order to withstand the security flaws found in Awasthi-Srivastava's scheme and enhance the features required for an ideal user authentication scheme. Through the rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. In addition, we simulate our scheme for the formal security verification using the widely-accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool and show that our scheme is secure against passive and active attacks, including the replay and man-in-the-middle attacks. Our scheme is also efficient as compared to Awasthi-Srivastava's scheme.

  8. Fast Bitwise Implementation of the Algebraic Normal Form Transform

    OpenAIRE

    Bakoev, Valentin

    2017-01-01

    The representation of Boolean functions by their algebraic normal forms (ANFs) is very important for cryptography, coding theory and other scientific areas. The ANFs are used in computing the algebraic degree of S-boxes, some other cryptographic criteria and parameters of error-correcting codes. Their applications require these criteria and parameters to be computed by fast algorithms. Hence the corresponding ANFs should also be obtained by fast algorithms. Here we continue o...

  9. Cryptanalysis of Application of Laplace Transform for Cryptography

    OpenAIRE

    Gençoğlu Muharrem Tuncay

    2017-01-01

    Although the Laplace Transform is a good application field in the design of cryptosystems, many cryptographic algorithm proposals turn out to be unsatisfactory for secure communication. In this cryptanalysis study, one of the significant disadvantages of the proposed algorithm is that its security analysis is performed with only a statistical test. In this study, we explain what should be considered when performing security analysis of Laplace Transform based encryption systems and, using basic mathematical rules, p...

  10. Secured Hash Based Burst Header Authentication Design for Optical Burst Switched Networks

    Science.gov (United States)

    Balamurugan, A. M.; Sivasubramanian, A.; Parvathavarthini, B.

    2017-12-01

    Optical burst switching (OBS) is a promising technology that could meet fast-growing network demand, with the ability to meet the bandwidth requirements of applications that demand intensive bandwidth. OBS proves to be a satisfactory technology to tackle huge bandwidth constraints, but it suffers from security vulnerabilities. The objective of this proposed work is to design a faster and more efficient burst header authentication algorithm for core nodes. There are two important key features in this work, viz., header encryption and authentication. Since the burst header is important in an optical burst switched network, it has to be encrypted; otherwise it is prone to attack. The proposed MD5&RC4-4S based burst header authentication algorithm runs 20.75 ns faster than the conventional algorithms. The modification suggested in the proposed RC4-4S algorithm gives better security and solves the correlation problems between the publicly known outputs during the key generation phase. The modified MD5 recommended in this work provides a 7.81 % better avalanche effect than the conventional algorithm. The device utilization result also shows the suitability of the proposed algorithm for header authentication in real-time applications.

  11. APE: Authenticated Permutation-Based Encryption for Lightweight Cryptography

    DEFF Research Database (Denmark)

    Andreeva, Elena; Bilgin, Begül; Bogdanov, Andrey

    2015-01-01

    The domain of lightweight cryptography focuses on cryptographic algorithms for extremely constrained devices. It is very costly to avoid nonce reuse in such environments, because this requires either a hardware source of randomness, or non-volatile memory to store a counter. At the same time, a lot...

  12. Survey and Benchmark of Block Ciphers for Wireless Sensor Networks

    NARCIS (Netherlands)

    Law, Y.W.; Doumen, J.M.; Hartel, Pieter H.

    Cryptographic algorithms play an important role in the security architecture of wireless sensor networks (WSNs). Choosing the most storage- and energy-efficient block cipher is essential, due to the facts that these networks are meant to operate without human intervention for a long period of time

  13. System of end-to-end symmetric database encryption

    Science.gov (United States)

    Galushka, V. V.; Aydinyan, A. R.; Tsvetkova, O. L.; Fathi, V. A.; Fathi, D. V.

    2018-05-01

    The article is devoted to the pressing problem of protecting databases from information leakage performed by bypassing access control mechanisms. To solve this problem, it is proposed to use end-to-end data encryption, implemented at the end nodes of the interaction between the information system components using one of the symmetric cryptographic algorithms. For this purpose, a key management method designed for use in a multi-user system has been developed and described; it is based on a distributed key representation model in which one part of the key is stored in the database and the other part is obtained by converting the user's password. In this case, the key is calculated immediately before the cryptographic transformations and is not stored in memory after the completion of these transformations. Algorithms for registering and authorizing a user, as well as changing their password, have been described, and methods for calculating the parts of a key when performing these operations have been provided.
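    A sketch of the split-key idea described above; the KDF choice (PBKDF2), share sizes, and XOR combination are assumptions for illustration, not the authors' exact method:

```python
import hashlib
import secrets

# Sketch of the key-splitting idea: one share is stored in the database, the
# other is derived from the user's password, and the working key is their
# combination, computed only at the moment of encryption or decryption.
def derive_key(password: str, salt: bytes, db_share: bytes) -> bytes:
    pw_share = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return bytes(a ^ b for a, b in zip(pw_share, db_share))  # combine the shares

salt, db_share = secrets.token_bytes(16), secrets.token_bytes(32)
key = derive_key("user passphrase", salt, db_share)  # used immediately, not stored
```

    The combined key exists only for the duration of the cryptographic operation and is discarded afterwards, matching the behaviour described in the abstract.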

  14. Applying Cuckoo Search for analysis of LFSR based cryptosystem

    Directory of Open Access Journals (Sweden)

    Maiya Din

    2016-09-01

    Full Text Available Cryptographic techniques are employed to minimize security hazards to sensitive information. To make systems more robust, the ciphers in use need to be analysed, for which cryptanalysts require ways to automate the process so that cryptographic systems can be tested more efficiently. Evolutionary algorithms provide one such resort, as they are capable of searching for globally optimal solutions very quickly. The Cuckoo Search (CS) algorithm has been used effectively in the cryptanalysis of conventional systems like the Vigenère and transposition ciphers. The Linear Feedback Shift Register (LFSR) is a crypto primitive used extensively in the design of cryptosystems. In this paper, we analyse an LFSR-based cryptosystem using Cuckoo Search to find the correct initial states of the LFSRs used. Primitive polynomials of degree 11, 13, 17 and 19 are considered to analyse text crypts of length 200, 300 and 400 characters. Optimal solutions were obtained for the following CS parameters: Lévy distribution parameter (β = 1.5) and alien-egg discovery probability (pa = 0.25).

  15. Pseudo-cryptanalysis of the Original Blue Midnight Wish

    DEFF Research Database (Denmark)

    Thomsen, Søren Steffen

    2010-01-01

    The hash function Blue Midnight Wish (BMW) is a candidate in the SHA-3 competition organized by the U.S. National Institute of Standards and Technology (NIST). BMW was selected for the second round of the competition, but the algorithm was tweaked in a number of ways. In this paper we describe cr...

  16. Utilizing the Double-Precision Floating-Point Computing Power of GPUs for RSA Acceleration

    Directory of Open Access Journals (Sweden)

    Jiankuo Dong

    2017-01-01

    Full Text Available Asymmetric cryptographic algorithm (e.g., RSA and Elliptic Curve Cryptography) implementations on Graphics Processing Units (GPUs) have been researched for over a decade. The basic idea of most previous contributions is to exploit the highly parallel GPU architecture and port the integer-based algorithms from general-purpose CPUs to GPUs, to offer high performance. However, the great potential cryptographic computing power of GPUs, especially from the more powerful floating-point instructions, has in fact not been comprehensively investigated. In this paper, we fully exploit the floating-point computing power of GPUs through various designs, including a floating-point-based Montgomery multiplication/exponentiation algorithm and a Chinese Remainder Theorem (CRT) implementation on the GPU. For practical usage of the proposed algorithm, a new method is introduced to convert the input/output between octet strings and floating-point numbers, fully utilizing GPUs and further improving the overall performance by about 5%. The performance of RSA-2048/3072/4096 decryption on an NVIDIA GeForce GTX TITAN reaches 42,211/12,151/5,790 operations per second, respectively, which achieves 13 times the performance of the previous fastest floating-point-based implementation (published in Eurocrypt 2009). The RSA-4096 decryption exceeds the existing fastest integer-based result by 23%.

  17. Defence against Black Hole and Selective Forwarding Attacks for Medical WSNs in the IoT

    Directory of Open Access Journals (Sweden)

    Avijit Mathur

    2016-01-01

    Full Text Available Wireless sensor networks (WSNs) are being used to facilitate monitoring of patients in hospital and home environments. These systems consist of a variety of different components/sensors and many processes like clustering, routing, security, and self-organization. Routing is necessary for medical-based WSNs because it allows remote data delivery and it facilitates network scalability in large hospitals. However, routing entails several problems, mainly due to the open nature of wireless networks, and these need to be addressed. This paper looks at two of the problems that arise due to wireless routing between the nodes and access points of a medical WSN (for IoT use): black hole and selective forwarding (SF) attacks. A solution to the former can readily be provided through the use of cryptographic hashes, while the latter makes use of a neighbourhood watch and threshold-based analysis to detect and correct SF attacks. The scheme proposed here is capable of detecting a selective forwarding attack with over 96% accuracy and successfully identifying the malicious node with 83% accuracy.

  18. Defence against Black Hole and Selective Forwarding Attacks for Medical WSNs in the IoT.

    Science.gov (United States)

    Mathur, Avijit; Newe, Thomas; Rao, Muzaffar

    2016-01-19

    Wireless sensor networks (WSNs) are being used to facilitate monitoring of patients in hospital and home environments. These systems consist of a variety of different components/sensors and many processes like clustering, routing, security, and self-organization. Routing is necessary for medical-based WSNs because it allows remote data delivery and it facilitates network scalability in large hospitals. However, routing entails several problems, mainly due to the open nature of wireless networks, and these need to be addressed. This paper looks at two of the problems that arise due to wireless routing between the nodes and access points of a medical WSN (for IoT use): black hole and selective forwarding (SF) attacks. A solution to the former can readily be provided through the use of cryptographic hashes, while the latter makes use of a neighbourhood watch and threshold-based analysis to detect and correct SF attacks. The scheme proposed here is capable of detecting a selective forwarding attack with over 96% accuracy and successfully identifying the malicious node with 83% accuracy.

  19. A Review of RSA and Public-Key Cryptosystems | Rabah | Botswana ...

    African Journals Online (AJOL)

    ... study and analyze the RSA cryptosystems – a public-key cryptographic algorithm - a system that uses two sets of keys; one for encryption and the other for decryption. Key Words: Public-key cryptography, DH, RSA, Internet Security and attacks, Digital Signature, Message digest, Authentication, Secure Socket Layer (SSL)

  20. Practical secure decision tree learning in a teletreatment application

    NARCIS (Netherlands)

    de Hoogh, Sebastiaan; Schoenmakers, Berry; Chen, Ping; op den Akker, Harm

    In this paper we develop a range of practical cryptographic protocols for secure decision tree learning, a primary problem in privacy preserving data mining. We focus on particular variants of the well-known ID3 algorithm allowing a high level of security and performance at the same time. Our

  1. Practical secure decision tree learning in a teletreatment application

    NARCIS (Netherlands)

    Hoogh, de S.J.A.; Schoenmakers, B.; Chen, Ping; Op den Akker, H.; Christin, N.; Safavi-Naini, R.

    2014-01-01

    In this paper we develop a range of practical cryptographic protocols for secure decision tree learning, a primary problem in privacy preserving data mining. We focus on particular variants of the well-known ID3 algorithm allowing a high level of security and performance at the same time. Our

  2. Cyber Moat: Adaptive Virtualized Network Framework for Deception and Disinformation

    Science.gov (United States)

    2016-12-12

    a suite of cryptographic algorithms including AES, RSA, and SHA-1 in the cache with small performance impacts. The results of this work have been... algorithms and mechanisms. A successful system prototype will be delivered and lead to a powerful new capability for using moving target defense...

  3. Security problems with a chaos-based deniable authentication scheme

    International Nuclear Information System (INIS)

    Alvarez, Gonzalo

    2005-01-01

    Recently, a new scheme was proposed for deniable authentication. Its main originality lay in applying a chaos-based encryption-hash parallel algorithm and the semi-group property of the Chebyshev chaotic map. Although original and practicable, its insecurity and inefficiency are shown in this paper, thus rendering it inadequate for adoption in e-commerce

  4. Security problems with a chaos-based deniable authentication scheme

    Energy Technology Data Exchange (ETDEWEB)

    Alvarez, Gonzalo [Instituto de Fisica Aplicada, Consejo Superior de Investigaciones Cientificas, Serrano 144, 28006 Madrid (Spain)] e-mail: gonzalo@iec.csic.es

    2005-10-01

    Recently, a new scheme was proposed for deniable authentication. Its main originality lay in applying a chaos-based encryption-hash parallel algorithm and the semi-group property of the Chebyshev chaotic map. Although original and practicable, its insecurity and inefficiency are shown in this paper, thus rendering it inadequate for adoption in e-commerce.

  5. Learning Perfectly Secure Cryptography to Protect Communications with Adversarial Neural Cryptography

    Directory of Open Access Journals (Sweden)

    Murilo Coutinho

    2018-04-01

    Full Text Available Research in Artificial Intelligence (AI) has achieved many important breakthroughs, especially in recent years. In some cases, AI learns alone from scratch and performs human tasks faster and better than humans. With the recent advances in AI, it is natural to wonder whether Artificial Neural Networks will be used to successfully create or break cryptographic algorithms. A bibliographic review shows that the main approach to this problem has been addressed through complex Neural Networks, but without understanding or proving the security of the generated model. This paper presents an analysis of the security of cryptographic algorithms generated by a new technique called Adversarial Neural Cryptography (ANC). Using the proposed network, we show limitations and directions to improve the current approach of ANC. Training the proposed Artificial Neural Network with the improved model of ANC, we show that artificially intelligent agents can learn the unbreakable One-Time Pad (OTP) algorithm, without human knowledge, to communicate securely through an insecure communication channel. This paper shows in which conditions an AI agent can learn a secure encryption scheme. However, it also shows that, without a stronger adversary, it is more likely to obtain an insecure one.

  6. Learning Perfectly Secure Cryptography to Protect Communications with Adversarial Neural Cryptography.

    Science.gov (United States)

    Coutinho, Murilo; de Oliveira Albuquerque, Robson; Borges, Fábio; García Villalba, Luis Javier; Kim, Tai-Hoon

    2018-04-24

    Research in Artificial Intelligence (AI) has achieved many important breakthroughs, especially in recent years. In some cases, AI learns alone from scratch and performs human tasks faster and better than humans. With the recent advances in AI, it is natural to wonder whether Artificial Neural Networks will be used to successfully create or break cryptographic algorithms. A bibliographic review shows that the main approach to this problem has been addressed through complex Neural Networks, but without understanding or proving the security of the generated model. This paper presents an analysis of the security of cryptographic algorithms generated by a new technique called Adversarial Neural Cryptography (ANC). Using the proposed network, we show limitations and directions to improve the current approach of ANC. Training the proposed Artificial Neural Network with the improved model of ANC, we show that artificially intelligent agents can learn the unbreakable One-Time Pad (OTP) algorithm, without human knowledge, to communicate securely through an insecure communication channel. This paper shows in which conditions an AI agent can learn a secure encryption scheme. However, it also shows that, without a stronger adversary, it is more likely to obtain an insecure one.

  7. Security of the data transmission in the industrial control system

    Directory of Open Access Journals (Sweden)

    Marcin Bednarek

    2015-12-01

    Full Text Available This paper presents the data transmission security system used between the stations of an industrial control system. The possible options for secure communications between process stations, as well as between process and operator stations, are described. The transmission security mechanism is based on algorithms for symmetric and asymmetric encryption. The authentication process uses a software token algorithm and a one-way hash function. The algorithm for establishing a secured connection between the stations, including the authentication process and the encryption of the data transmission, is given. The process of securing the transmission consists of four sub-processes: (I) authentication; (II) asymmetric public key transmission; (III) symmetric key transmission; (IV) data transmission. The presented process of securing the transmission was realized in an industrial controller and an emulator. For this purpose, programming languages in accordance with EN 61131 were used. The functions were implemented as user function blocks, which allows a mixed code (both ST and FBD) to be included in the structure of a block. Available function categories: support of asymmetric encryption; asymmetric encryption utility functions; support of symmetric encryption; symmetric encryption utility functions; support of hash value calculations; utility conversion functions. Keywords: transmission security, encryption, authentication, industrial control system
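
    A minimal sketch of the authentication sub-process (I) described above, i.e. a software token combined with a one-way hash in a challenge-response exchange; the token format and message layout are assumptions made for illustration, not the EN 61131 function blocks used by the authors:

    ```python
    import hashlib
    import hmac
    import os

    SHARED_TOKEN = b"pre-shared software token"   # provisioned on both stations

    def make_challenge() -> bytes:
        # Station A sends a fresh random challenge to station B.
        return os.urandom(16)

    def respond(challenge: bytes) -> bytes:
        # Station B proves knowledge of the token with a keyed one-way hash.
        return hmac.new(SHARED_TOKEN, challenge, hashlib.sha256).digest()

    def verify(challenge: bytes, response: bytes) -> bool:
        expected = hmac.new(SHARED_TOKEN, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)

    challenge = make_challenge()
    assert verify(challenge, respond(challenge))   # phase (I); phases (II)-(IV) follow
    ```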

  8. Implementasi Modified LSB (Least Significant Bit) dan Algoritma DES (Data Encryption Standard) Pada Pengamanan Data Text

    OpenAIRE

    Gulo, Hengky P.F.

    2017-01-01

    The development of computer technology has advanced the exchange of data and information. Data and information are now exchanged very quickly and are needed by everyone, so data security becomes very important; data are frequently misused and exposed to third-party interference. Cryptography and steganography techniques are needed to resolve this data security issue. In this study, the cryptographic algorithm used is the DES algorithm...

  9. Penambahan Chinese Reminder Theorem Untuk Mempercepat Proses Enkripsi Dan Dekripsi Pada RSA

    OpenAIRE

    Hasibuan, Andi Hazri

    2015-01-01

    Many methods are used to protect digital data stored on or transmitted via electronic media. One way is to use the RSA (Rivest-Shamir-Adleman) cryptographic algorithm. Standard RSA uses modular arithmetic to perform encryption and decryption. This thesis discusses the addition of the Chinese Remainder Theorem to speed up RSA.
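
    The CRT speed-up referred to here is the standard one: the private exponentiation modulo n = p·q is split into two half-size exponentiations modulo p and q and then recombined. A minimal sketch with toy parameters, not the thesis code:

    ```python
    # Textbook RSA decryption with and without the Chinese Remainder Theorem.
    # Toy primes only -- real keys use primes of 1024+ bits.
    p, q = 61, 53
    n = p * q
    e = 17
    d = pow(e, -1, (p - 1) * (q - 1))           # private exponent

    def decrypt_plain(c):
        return pow(c, d, n)                      # one full-size exponentiation

    def decrypt_crt(c):
        dp, dq = d % (p - 1), d % (q - 1)        # reduced exponents
        q_inv = pow(q, -1, p)                    # precomputable constant
        mp, mq = pow(c, dp, p), pow(c, dq, q)    # two half-size exponentiations
        h = (q_inv * (mp - mq)) % p              # Garner recombination
        return mq + h * q

    c = pow(42, e, n)                            # encrypt the message 42
    assert decrypt_plain(c) == decrypt_crt(c) == 42
    ```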

  10. Bäcklund transformations and divisor doubling

    Science.gov (United States)

    Tsiganov, A. V.

    2018-03-01

    In classical mechanics, well-known cryptographic algorithms and protocols can be very useful for the construction of canonical transformations preserving the form of Hamiltonians. We consider the application of a standard generic divisor doubling to the construction of new auto Bäcklund transformations for the Lagrange top and the Hénon-Heiles system separable in parabolic coordinates.

  11. Implementation of the On-the-fly Encryption for the Linux OS Based on Certified CPS

    Directory of Open Access Journals (Sweden)

    Alexander Mikhailovich Korotin

    2013-02-01

    Full Text Available The article is devoted to tools for on-the-fly encryption and a method to implement such a tool for the Linux OS based on a certified CPS. The idea is to modify the existing tool named eCryptfs. Russian cryptographic algorithms will be used in the user and kernel modes.

  12. A novel method of S-box design based on chaotic map and composition method

    International Nuclear Information System (INIS)

    Lambić, Dragan

    2014-01-01

    Highlights: • Novel chaotic S-box generation method is presented. • Presented S-box has better cryptographic properties than other examples of chaotic S-boxes. • The advantages of the proposed method are the low complexity and large key space. -- Abstract: An efficient algorithm for obtaining random bijective S-boxes based on chaotic maps and composition method is presented. The proposed method is based on compositions of S-boxes from a fixed starting set. The sequence of the indices of starting S-boxes used is obtained by using chaotic maps. The results of performance test show that the S-box presented in this paper has good cryptographic properties. The advantages of the proposed method are the low complexity and the possibility to achieve large key space

  13. Fast Implementation of Two Hash Algorithms on nVidia CUDA GPU

    OpenAIRE

    Lerchundi Osa, Gorka

    2009-01-01

    Project carried out in collaboration with the Norwegian University of Science and Technology, Department of Telematics. User needs increase as time passes. We started with room-sized computers, where punched cards performed the same function that machine code objects do today, and at present we are at a point where the number of processors within our graphics device unit is not enough for our requirements. A change in the evolution of computing is looming. We are in a t...

  14. Security for Virtual Private Networks

    OpenAIRE

    Magdalena Nicoleta Iacob

    2015-01-01

    Network security must be a permanent concern for every company, given the fact that threats are evolving more rapidly today than in the past. This paper contains a general classification of the cryptographic algorithms used in today's networks and presents an implementation of virtual private networks using one of the most secure methods: digital certificate authentication.

  15. A new image cipher in time and frequency domains

    Science.gov (United States)

    Abd El-Latif, Ahmed A.; Niu, Xiamu; Amin, Mohamed

    2012-10-01

    Recently, various encryption techniques based on chaos have been proposed. However, most existing chaotic encryption schemes still suffer from fundamental problems such as a small key space, weak security functions and slow performance. This paper introduces an efficient encryption scheme for still visual data that overcomes these disadvantages. The proposed scheme is based on hybrid Linear Feedback Shift Register (LFSR) and chaotic systems in hybrid domains. The core idea is to scramble the pixel positions based on 2D chaotic systems in the frequency domain. Then, diffusion is performed on the scrambled image based on cryptographic primitive operations and the incorporation of the LFSR and chaotic systems as round keys. The hybrid compound of the LFSR, chaotic system and cryptographic primitive operations strengthens the encryption performance and enlarges the key space required to resist brute-force attacks. Results of statistical and differential analysis show that the proposed algorithm has high security for secure digital images. Furthermore, it has key sensitivity together with a large key space and is very fast compared to other competitive algorithms.

  16. Content-Based Image Retrial Based on Hadoop

    Directory of Open Access Journals (Sweden)

    DongSheng Yin

    2013-01-01

    Full Text Available Generally, the time complexity of algorithms for content-based image retrieval is extremely high. In order to retrieve images from large-scale databases efficiently, a new retrieval approach based on the Hadoop distributed framework is proposed. First, a database of image features is built using the Speeded Up Robust Features algorithm and Locality-Sensitive Hashing; the search is then performed on the Hadoop platform in a specially designed parallel way. Considerable experimental results show that it is able to retrieve images by content effectively on large-scale clusters and image sets.
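
    The Locality-Sensitive Hashing step can be sketched as follows (random-hyperplane LSH over feature vectors); the bucketing scheme is an illustrative assumption, not the paper's exact design:

    ```python
    import random

    def make_hyperplanes(dim, nbits, seed=0):
        rnd = random.Random(seed)
        # Each hash bit comes from the sign of a projection onto a random hyperplane.
        return [[rnd.gauss(0, 1) for _ in range(dim)] for _ in range(nbits)]

    def lsh_signature(vec, planes):
        # Similar feature vectors fall on the same side of most hyperplanes,
        # so they collide in the same bucket with high probability.
        bits = 0
        for plane in planes:
            dot = sum(v * p for v, p in zip(vec, plane))
            bits = (bits << 1) | (1 if dot >= 0 else 0)
        return bits

    planes = make_hyperplanes(dim=64, nbits=16)         # e.g., 64-dim SURF descriptors
    index = {}                                          # bucket -> list of image ids

    def insert(image_id, feature):
        index.setdefault(lsh_signature(feature, planes), []).append(image_id)

    def candidates(query_feature):
        return index.get(lsh_signature(query_feature, planes), [])
    ```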

  17. RETRACTED: The Application of Symmetric Key Cryptographic Algorithms in Wireless Sensor Networks

    Science.gov (United States)

    Si, Lingling; Ji, Zhigang; Wang, Zhihui

    This article has been retracted: please see Elsevier Policy on Article Withdrawal. This article has been retracted at the request of the Publisher. The authors have plagiarized a paper that had already appeared in "Queen's 25th Biennial Symposium on Communications", page 168-172, print ISBN 978-1-4244-5709-0, http://dx.doi.org/10.1109/BSC.2010.5472979. One of the conditions of submission of a paper for publication is that authors declare explicitly that their work is original and has not appeared in a publication elsewhere. Re-use of any data should be appropriately cited. As such this article represents a severe abuse of the scientific publishing system. The scientific community takes a very strong view on this matter and apologies are offered to readers of the journal that this was not detected during the submission process.

  18. An Analysis of the Computer Security Ramifications of Weakened Asymmetric Cryptographic Algorithms

    Science.gov (United States)

    2012-06-01

    OpenVPN (Yonan). TLS (and by extension SSL) obviously rely on encryption to provide the confidentiality, integrity and authentication services it... “Secure Shell (SSH) Transport Layer Protocol.” IETF, Jan. 2006. <tools.ietf.org/html/rfc4253>; Yonan, James, and Mattock. “OpenVPN.” SourceForge... 11 May 2012. <http://sourceforge.net/projects/openvpn/>

  19. Breaking down the barriers of using strong authentication and encryption in resource constrained embedded systems

    Science.gov (United States)

    Knobler, Ron; Scheffel, Peter; Jackson, Scott; Gaj, Kris; Kaps, Jens Peter

    2013-05-01

    Various embedded systems, such as unattended ground sensors (UGS), are deployed in dangerous areas, where they are subject to compromise. Since numerous systems contain a network of devices that communicate with each other (often times with commercial off the shelf [COTS] radios), an adversary is able to intercept messages between system devices, which jeopardizes sensitive information transmitted by the system (e.g. location of system devices). Secret key algorithms such as AES are a very common means to encrypt all system messages to a sufficient security level, for which lightweight implementations exist for even very resource constrained devices. However, all system devices must use the appropriate key to encrypt and decrypt messages from each other. While traditional public key algorithms (PKAs), such as RSA and Elliptic Curve Cryptography (ECC), provide a sufficiently secure means to provide authentication and a means to exchange keys, these traditional PKAs are not suitable for very resource constrained embedded systems or systems which contain low reliability communication links (e.g. mesh networks), especially as the size of the network increases. Therefore, most UGS and other embedded systems resort to pre-placed keys (PPKs) or other naïve schemes which greatly reduce the security and effectiveness of the overall cryptographic approach. McQ has teamed with the Cryptographic Engineering Research Group (CERG) at George Mason University (GMU) to develop an approach using revolutionary cryptographic techniques that provides both authentication and encryption, but on resource constrained embedded devices, without the burden of large amounts of key distribution or storage.

  20. The IAEA's Universal Instrument Token

    International Nuclear Information System (INIS)

    Naumann, I.; Wishard, B.; Morgan, K.; Christoph, B.; Schwier, A.; Frank, T.

    2015-01-01

    The IAEA currently seeks to improve the harmonization of security approaches across safeguards equipment. The protection of digital safeguards data is based on several principles: a) the signing of data in measurement devices using standard public/private-key-based signature generation, b) the storage of secret keys on certified, tamper-protected cryptographic devices, and c) well-established cryptographic algorithms and protocols based on global standards and internationally recognized cryptographic libraries. This paper discusses a cryptographic token, the Universal Instrument Token, which constitutes the core element of the architecture for signing safeguards data. This architecture supports the above principles and is compliant with the IAEA's information security policies and guidelines. An important side-condition is that the UIT must be implemented across a wide range of operating systems and hardware architectures, which mandates the use of open-source software for all software-related parts involved. The UIT is permanently connected to the measuring device (usually via the USB port) and requires complex hardware drivers and middleware components. Identifying open-source based, mature and ready-for-use smart card drivers and tools that are compatible with a range of operating systems was a major challenge. Reliable and well-established cryptographic libraries reside at the core of every information-security application. Different types of review software, typically software products used at IAEA headquarters in Vienna but occasionally also in the facilities, need to contain some specific software modules in order to verify the digital signatures attached to the data. Finally, also required are enrollment tools which generate private keys and certify their corresponding public counterparts using the IAEA's internal Certification Authority. In 2014, the roll-out of the UIT has raised the security of IAEA instrument data signing to a level which is

  1. Document-Based and Message-Centric Security Using XML Authentication and Encryption for Coalition and Interagency Operations

    Science.gov (United States)

    2009-09-01

    running the document through a mathematical hashing algorithm to obtain a reproducible fingerprint (Message Digest) of the document... combining the message fingerprint with the computed hash of the document. If the message digests match and there exists a trusted third party, the Certificate Authority...

  2. A Novel Fast and Secure Approach for Voice Encryption Based on DNA Computing

    Science.gov (United States)

    Kakaei Kate, Hamidreza; Razmara, Jafar; Isazadeh, Ayaz

    2018-06-01

    Today, in the world of information communication, voice information has particular importance. One way to protect voice data from attacks is voice encryption. Encryption algorithms use various techniques such as hashing, chaotic maps, mixing, and many others. In this paper, an algorithm is proposed for voice encryption based on three different schemes to increase the flexibility and strength of the algorithm. The proposed algorithm uses an innovative encoding scheme, the DNA encryption technique and a permutation function to provide a secure and fast solution for voice encryption. The algorithm is evaluated based on various measures including signal-to-noise ratio, peak signal-to-noise ratio, correlation coefficient, signal similarity and signal frequency content. The results demonstrate the applicability of the proposed method for secure and fast encryption of voice files.
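
    The DNA-encoding step that such schemes typically rely on maps every two bits of the signal onto a nucleotide symbol. A minimal sketch; the mapping table is one illustrative choice and not necessarily the authors' exact construction:

    ```python
    # Illustrative 2-bit-to-nucleotide mapping (one of several admissible tables).
    ENCODE = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
    DECODE = {v: k for k, v in ENCODE.items()}

    def dna_encode(data: bytes) -> str:
        symbols = []
        for byte in data:
            for shift in (6, 4, 2, 0):                 # four 2-bit groups per byte
                symbols.append(ENCODE[(byte >> shift) & 0b11])
        return "".join(symbols)

    def dna_decode(seq: str) -> bytes:
        out = bytearray()
        for i in range(0, len(seq), 4):
            byte = 0
            for sym in seq[i:i + 4]:
                byte = (byte << 2) | DECODE[sym]
            out.append(byte)
        return bytes(out)

    sample = bytes([0x3A, 0x7F])                       # stand-in for two voice samples
    assert dna_decode(dna_encode(sample)) == sample    # the encoding is lossless
    ```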

  3. A Theoretical and Experimental Comparison of One Time Pad Cryptography using Key and Plaintext Insertion and Transposition (KPIT and Key Coloumnar Transposition (KCT Method

    Directory of Open Access Journals (Sweden)

    Pryo Utomo

    2017-06-01

    Full Text Available One Time Pad (OTP) is a cryptographic algorithm that is quite easy to implement. The algorithm works by converting the plaintext and the key into decimal and then into binary numbers, and computing the Exclusive-OR of the two. In this paper, the authors compare OTP cryptography using KPI and KCT so that the generated ciphertext is more difficult to recover. In the Key and Plaintext Insertion (KPI) method, we modify the OTP algorithm by inserting the key into the split plaintext. Meanwhile, in the Key Columnar Transposition (KCT) method, we modify the OTP algorithm by dividing the key into parts in a matrix of rows and columns. The algorithms are implemented in the PHP programming language.
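
    The underlying OTP operation is a plain XOR of plaintext and key bytes. A minimal sketch (in Python rather than the authors' PHP, and without the KPI/KCT pre-processing described above):

    ```python
    import os

    def otp_xor(data: bytes, key: bytes) -> bytes:
        # Encryption and decryption are the same operation; the pad must be
        # truly random, at least as long as the message, and never reused.
        if len(key) < len(data):
            raise ValueError("one-time pad must be at least as long as the data")
        return bytes(d ^ k for d, k in zip(data, key))

    message = b"attack at dawn"
    pad = os.urandom(len(message))            # the one-time pad
    ciphertext = otp_xor(message, pad)
    assert otp_xor(ciphertext, pad) == message
    ```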

  4. An Ultra-Lightweight Encryption Scheme in Underwater Acoustic Networks

    Directory of Open Access Journals (Sweden)

    Chunyan Peng

    2016-01-01

    Full Text Available We tackle a fundamental security problem in underwater acoustic networks (UANs). The S-box in existing block encryption algorithms is energy consuming and unsuitable for resource-constrained UANs. In this paper, instead of an S-box, we present a lightweight, 8-round iterated block cipher for UAN communication based on chaos theory, and we increase the key space by changing the number of iteration rounds. We further propose a secure network architecture for UANs. By analysis, our algorithm can resist brute-force searches and adversarial attacks. Simulation results show that, compared with the traditional AES-128 and PRESENT algorithms, our cryptographic algorithm can make a good trade-off between security and overhead, has better energy efficiency, and is applicable to UANs.

  5. Applications of Fast Truncated Multiplication in Cryptography

    Directory of Open Access Journals (Sweden)

    Laszlo Hars

    2006-12-01

    Full Text Available Truncated multiplications compute truncated products, contiguous subsequences of the digits of integer products. For an n-digit multiplication algorithm of time complexity O(n^α), with 1 < α ≤ 2, there is a truncated multiplication algorithm which is a constant factor faster when computing a short enough truncated product. Applying these fast truncated multiplications, several cryptographic long integer arithmetic algorithms are improved, including integer reciprocals, divisions, Barrett and Montgomery multiplications, and 2n-digit modular multiplication on hardware for n-digit half products. For example, Montgomery multiplication is performed in 2.6 Karatsuba multiplication time.

  6. Development of a cellulose-based insulating composite material for green buildings: Case of treated organic waste (paper, cardboard, hash)

    Science.gov (United States)

    Ouargui, Ahmed; Belouaggadia, Naoual; Elbouari, Abdeslam; Ezzine, Mohammed

    2018-05-01

    Buildings are responsible for 36% of the final energy consumption in Morocco [1-2], and a reduction of this energy consumption is a priority for the kingdom in order to reach its energy saving goals. One of the most effective actions to reduce energy consumption is the selection and development of innovative and efficient building materials [3]. In this work, we present an experimental study of the effect of adding treated organic waste (paper, cardboard, hash) on the mechanical and thermal properties of cement and clay bricks. Thermal conductivity, specific heat and mechanical resistance were investigated in terms of additive content and size. Soaking time and drying temperature were also taken into account. The results reveal that the thermal conductivity decreases for both the paper-cement and the paper-clay mixtures and seems to stabilize around 40%. In the case of the paper-cement composite, it is found that, for an additive quantity exceeding 15%, the compressive strength exceeds the standard for hollow non-load-bearing masonry. However, the paper-clay mixture seems to give more interesting results in terms of compressive strength for a mass composition of 15% paper. Given the positive results achieved, it seems possible to use these composites for the construction of walls, ceilings and roofs of housing while minimizing the energy consumption of the building.

  7. Wireless, amphibious theory for reinforcement learning

    Science.gov (United States)

    Li, Jinci

    2013-10-01

    Cryptographers agree that heterogeneous information are an interesting new topic in the field of cryptography, and biologists concur. Given the current status of stochastic epistemologies, security experts clearly desire the construction of flip-flop gates [1, 2, 3]. Mungo, our new system for authenticated algorithms, is the solution to all of these challenges. Though such a hypothesis at first glance seems perverse, it has ample historical precedence.

  8. Sweet Dreams and Nightmares: Security in the Internet of Things

    OpenAIRE

    Kasper , Timo; Oswald , David; Paar , Christof

    2014-01-01

    Part 1: Invited Paper; International audience; Wireless embedded devices are predominant in the Internet of Things: Objects tagged with Radio Frequency IDentification and Near Field Communication technology, smartphones, and other embedded tokens interact from device to device and thereby often process information that is security or privacy relevant for humans. For protecting sensitive data and preventing attacks, many embedded devices employ cryptographic algorithms and authentication schem...

  9. Algebraic dynamics algorithm: Numerical comparison with Runge-Kutta algorithm and symplectic geometric algorithm

    Institute of Scientific and Technical Information of China (English)

    WANG ShunJin; ZHANG Hua

    2007-01-01

    Based on the exact analytical solution of ordinary differential equations, a truncation of the Taylor series of the exact solution to the Nth order leads to the Nth order algebraic dynamics algorithm. A detailed numerical comparison is presented with the Runge-Kutta algorithm and the symplectic geometric algorithm for 12 test models. The results show that the algebraic dynamics algorithm can better preserve both geometrical and dynamical fidelity of a dynamical system at a controllable precision, and it can solve the problem of algorithm-induced dissipation for the Runge-Kutta algorithm and the problem of algorithm-induced phase shift for the symplectic geometric algorithm.

  10. Algebraic dynamics algorithm:Numerical comparison with Runge-Kutta algorithm and symplectic geometric algorithm

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Based on the exact analytical solution of ordinary differential equations, a truncation of the Taylor series of the exact solution to the Nth order leads to the Nth order algebraic dynamics algorithm. A detailed numerical comparison is presented with Runge-Kutta algorithm and symplectic geometric algorithm for 12 test models. The results show that the algebraic dynamics algorithm can better preserve both geometrical and dynamical fidelity of a dynamical system at a controllable precision, and it can solve the problem of algorithm-induced dissipation for the Runge-Kutta algorithm and the problem of algorithm-induced phase shift for the symplectic geometric algorithm.

  11. ASSIST: a fast versatile local structural comparison tool.

    Science.gov (United States)

    Caprari, Silvia; Toti, Daniele; Viet Hung, Le; Di Stefano, Maurizio; Polticelli, Fabio

    2014-04-01

    Structural genomics initiatives are increasingly leading to the determination of the 3D structure of target proteins whose catalytic function is not known. The aim of this work was that of developing a novel versatile tool for searching structural similarity, which allows to predict the catalytic function, if any, of these proteins. The algorithm implemented by the tool is based on local structural comparison to find the largest subset of similar residues between an input protein and known functional sites. The method uses a geometric hashing approach where information related to residue pairs from the input structures is stored in a hash table and then is quickly retrieved during the comparison step. Tests on proteins belonging to different functional classes, done using the Catalytic Site Atlas entries as targets, indicate that the algorithm is able to identify the correct functional class of the input protein in the vast majority of the cases. The application was developed in Java SE 6, with a Java Swing Graphic User Interface (GUI). The system can be run locally on any operating system (OS) equipped with a suitable Java Virtual Machine, and is available at the following URL: http://www.computationalbiology.it/software/ASSISTv1.zip.

  12. The Algorithm for Algorithms: An Evolutionary Algorithm Based on Automatic Designing of Genetic Operators

    Directory of Open Access Journals (Sweden)

    Dazhi Jiang

    2015-01-01

    Full Text Available At present there is a wide range of evolutionary algorithms available to researchers and practitioners. Despite the great diversity of these algorithms, virtually all of the algorithms share one feature: they have been manually designed. A fundamental question is “are there any algorithms that can design evolutionary algorithms automatically?” A more complete definition of the question is “can computer construct an algorithm which will generate algorithms according to the requirement of a problem?” In this paper, a novel evolutionary algorithm based on automatic designing of genetic operators is presented to address these questions. The resulting algorithm not only explores solutions in the problem space like most traditional evolutionary algorithms do, but also automatically generates genetic operators in the operator space. In order to verify the performance of the proposed algorithm, comprehensive experiments on 23 well-known benchmark optimization problems are conducted. The results show that the proposed algorithm can outperform standard differential evolution algorithm in terms of convergence speed and solution accuracy which shows that the algorithm designed automatically by computers can compete with the algorithms designed by human beings.

  13. Cost analysis of hash collisions : will quantum computers make SHARCS obsolete?

    NARCIS (Netherlands)

    Bernstein, D.J.

    2009-01-01

    Current proposals for special-purpose factorization hardware will become obsolete if large quantum computers are built: the number-field sieve scales much more poorly than Shor's quantum algorithm for factorization. Will all special-purpose cryptanalytic hardware become obsolete in a post-quantum

  14. Techniques for Performance Improvement of Integer Multiplication in Cryptographic Applications

    Directory of Open Access Journals (Sweden)

    Robert Brumnik

    2014-01-01

    Full Text Available The performance of arithmetic operations in number fields is actively researched by many scientists, as evidenced by significant publications in this field. In this work, we offer some techniques to increase the performance of software implementations of a finite field multiplication algorithm, for both 32-bit and 64-bit platforms. The developed technique, called the “delayed carry mechanism,” removes the need to handle the most significant bit carry at each iteration of the sum accumulation loop. This mechanism reduces the total number of additions and applies modern parallelization technologies effectively.
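
    The idea can be illustrated with a schoolbook multi-precision multiplication in which column sums are accumulated in wide registers and carries are propagated once at the end. This sketch is a general assumption about how such a delayed-carry scheme works, not the authors' implementation:

    ```python
    BASE = 2 ** 32   # limb size; a wide accumulator can absorb many 32-bit products

    def mul_delayed_carry(a, b):
        """Multiply two little-endian lists of 32-bit limbs.

        Partial products are summed into per-column accumulators without
        propagating carries inside the inner loop (the "delayed carry");
        a single carry-propagation pass runs at the end.
        """
        acc = [0] * (len(a) + len(b))
        for i, ai in enumerate(a):
            for j, bj in enumerate(b):
                acc[i + j] += ai * bj          # no per-iteration carry handling
        carry, result = 0, []
        for column in acc:                     # one final carry-propagation sweep
            column += carry
            result.append(column % BASE)
            carry = column // BASE
        while carry:
            result.append(carry % BASE)
            carry //= BASE
        return result

    def to_int(limbs):
        return sum(limb << (32 * i) for i, limb in enumerate(limbs))

    a, b = [0xFFFFFFFF, 0x12345678], [0xDEADBEEF, 0x1]
    assert to_int(mul_delayed_carry(a, b)) == to_int(a) * to_int(b)
    ```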

  15. Performance Evaluation of Software Routers with VPN Features

    Directory of Open Access Journals (Sweden)

    H. Redžović

    2017-11-01

    Full Text Available This paper presents implementation and analysis of the VPN software router which is based on Quagga and strongSwan open-source software tools. We validated the functionalities of strongSwan and Quagga in realistic environment which include scenarios with link failures. Also, we measured and analyzed the performance of encryption and hash algorithms supported by strongSwan software, in order to advise an optimal VPN configuration that provides the best performance.

  16. Color image encryption based on Coupled Nonlinear Chaotic Map

    International Nuclear Information System (INIS)

    Mazloom, Sahar; Eftekhari-Moghadam, Amir Masud

    2009-01-01

    Image encryption is somewhat different from text encryption due to some inherent features of images, such as bulk data capacity and high correlation among pixels, which are generally difficult to handle by conventional methods. The desirable cryptographic properties of chaotic maps, such as sensitivity to initial conditions and random-like behavior, have attracted the attention of cryptographers to develop new encryption algorithms. Therefore, recent research on image encryption algorithms has increasingly been based on chaotic systems, although the drawbacks of small key space and weak security in one-dimensional chaotic cryptosystems are obvious. This paper proposes a Coupled Nonlinear Chaotic Map, called CNCM, and a novel chaos-based image encryption algorithm to encrypt color images by using CNCM. The chaotic cryptography technique used in this paper is a symmetric key cryptography with a stream cipher structure. In order to increase the security of the proposed algorithm, a 240-bit-long secret key is used to generate the initial conditions and parameters of the chaotic map by making some algebraic transformations to the key. These transformations, as well as the nonlinearity and coupling structure of the CNCM, enhance the cryptosystem's security. To obtain higher security and higher complexity, the current paper incorporates the image size and color components into the cryptosystem, thereby significantly increasing the resistance to known/chosen-plaintext attacks. The results of several experimental, statistical and key sensitivity tests show that the proposed image encryption scheme provides an efficient and secure way for real-time image encryption and transmission.

  17. Low-Power Public Key Cryptography

    Energy Technology Data Exchange (ETDEWEB)

    BEAVER,CHERYL L.; DRAELOS,TIMOTHY J.; HAMILTON,VICTORIA A.; SCHROEPPEL,RICHARD C.; GONZALES,RITA A.; MILLER,RUSSELL D.; THOMAS,EDWARD V.

    2000-11-01

    This report presents research on public key, digital signature algorithms for cryptographic authentication in low-powered, low-computation environments. We assessed algorithms for suitability based on their signature size, and computation and storage requirements. We evaluated a variety of general purpose and special purpose computing platforms to address issues such as memory, voltage requirements, and special functionality for low-powered applications. In addition, we examined custom design platforms. We found that a custom design offers the most flexibility and can be optimized for specific algorithms. Furthermore, the entire platform can exist on a single Application Specific Integrated Circuit (ASIC) or can be integrated with commercially available components to produce the desired computing platform.

  18. Cryptanalysis of Application of Laplace Transform for Cryptography

    Directory of Open Access Journals (Sweden)

    Gençoğlu Muharrem Tuncay

    2017-01-01

    Full Text Available Although the Laplace Transform is a good application field in the design of cryptosystems, many cryptographic algorithm proposals remain unsatisfactory for secure communication. A significant disadvantage of the analysed proposal is that its security analysis was performed with statistical tests only. In this study, we explain what should be considered when performing security analysis of Laplace Transform based encryption systems and, using basic mathematical rules, break the cipher without knowing the secret key. In essence, this study is a refutation of the article titled Application of Laplace Transform for Cryptography written by Hiwerakar [3].

  19. Isomorphism Theorem on Vector Spaces over a Ring

    Directory of Open Access Journals (Sweden)

    Futa Yuichi

    2017-10-01

    Full Text Available In this article, we formalize in the Mizar system [1, 4] some properties of vector spaces over a ring. We formally prove the first isomorphism theorem of vector spaces over a ring. We also formalize the product space of vector spaces. ℤ-modules are useful for lattice problems such as the LLL (Lenstra, Lenstra and Lovász [5]) basis reduction algorithm and cryptographic systems [6, 2].

  20. Pseudo-random number generator based on mixing of three chaotic maps

    Science.gov (United States)

    François, M.; Grosges, T.; Barchiesi, D.; Erra, R.

    2014-04-01

    A secure pseudo-random number generator, three-mixer, is proposed. The principle of the method consists in mixing three chaotic maps produced from an input initial vector. The algorithm uses permutations whose positions are computed and indexed by a standard chaotic function and a linear congruence. The performance of the scheme is evaluated through statistical analysis. The cryptosystem exhibits significant cryptographic qualities at a high security level.
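
    A toy version of such a chaotic-map mixer is sketched below; the three maps, the mixing rule and the output reduction are simplifications chosen for illustration (and are not cryptographically secure as written), rather than the construction evaluated in the paper:

    ```python
    import math

    def logistic(x, r=3.99):
        return r * x * (1.0 - x)

    def tent(x, mu=1.99):
        return mu * x if x < 0.5 else mu * (1.0 - x)

    def sine(x, a=0.99):
        return a * math.sin(math.pi * x)

    def chaotic_bytes(seed, n):
        """Mix three chaotic trajectories into a byte stream (illustrative only)."""
        x, y, z = seed
        out = bytearray()
        for _ in range(n):
            x, y, z = logistic(x), tent(y), sine(z)
            mixed = (x + y + z) * 1e6          # keep low-order digits of the sum
            out.append(int(mixed) & 0xFF)      # reduce to one output byte
        return bytes(out)

    print(chaotic_bytes((0.1234, 0.5678, 0.9101), 16).hex())
    ```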

  1. Tamper-Proof Circuits : : How to Trade Leakage for Tamper-Resilience

    DEFF Research Database (Denmark)

    Faust, Sebastian; Pietrzak, Krzysztof; Venturi, Daniele

    2011-01-01

    Tampering attacks are cryptanalytic attacks on the implementation of cryptographic algorithms (e.g., smart cards), where an adversary introduces faults with the hope that the tampered device will reveal secret information. Inspired by the work of Ishai et al. [Eurocrypt’06], we propose a compiler...... complex computation to protecting simple components....... that transforms any circuit into a new circuit with the same functionality, but which is resilient against a well-defined and powerful tampering adversary. More concretely, our transformed circuits remain secure even if the adversary can adaptively tamper with every wire in the circuit as long as the tampering......-box access to the original circuit and log(q) bits of additional auxiliary information. Thus, if the implemented cryptographic scheme is secure against log(q) bits of leakage, then our implementation is tamper-proof in the above sense. Surprisingly, allowing for this small amount of information leakage...

  2. Choice of optical system is critical for the security of double random phase encryption systems

    Science.gov (United States)

    Muniraj, Inbarasan; Guo, Changliang; Malallah, Ra'ed; Cassidy, Derek; Zhao, Liang; Ryle, James P.; Healy, John J.; Sheridan, John T.

    2017-06-01

    The linear canonical transform (LCT) is used in modeling a coherent light-field propagation through first-order optical systems. Recently, a generic optical system, known as the quadratic phase encoding system (QPES), for encrypting a two-dimensional image has been reported. In such systems, two random phase keys and the individual LCT parameters (α,β,γ) serve as secret keys of the cryptosystem. It is important that such encryption systems also satisfy some dynamic security properties. We, therefore, examine such systems using two cryptographic evaluation methods, the avalanche effect and bit independence criterion, which indicate the degree of security of the cryptographic algorithms using QPES. We compared our simulation results with the conventional Fourier and the Fresnel transform-based double random phase encryption (DRPE) systems. The results show that the LCT-based DRPE has an excellent avalanche and bit independence characteristics compared to the conventional Fourier and Fresnel-based encryption systems.

  3. Optimization of incremental structure from motion combining a random k-d forest and pHash for unordered images in a complex scene

    Science.gov (United States)

    Zhan, Zongqian; Wang, Chendong; Wang, Xin; Liu, Yi

    2018-01-01

    On the basis of today's popular virtual reality and scientific visualization, three-dimensional (3-D) reconstruction is widely used in disaster relief, virtual shopping, reconstruction of cultural relics, etc. In the traditional incremental structure from motion (incremental SFM) method, the time cost of the matching is one of the main factors restricting the popularization of this method. To make the whole matching process more efficient, we propose a preprocessing method before the matching process: (1) we first construct a random k-d forest with the large-scale scale-invariant feature transform features in the images and combine this with the pHash method to obtain a value of relatedness, (2) we then construct a connected weighted graph based on the relatedness value, and (3) we finally obtain a planned sequence of adding images according to the principle of the minimum spanning tree. On this basis, we attempt to thin the minimum spanning tree to reduce the number of matchings and ensure that the images are well distributed. The experimental results show a great reduction in the number of matchings with enough object points, with only a small influence on the inner stability, which proves that this method can quickly and reliably improve the efficiency of the SFM method with unordered multiview images in complex scenes.

  4. Status Report on the First Round of the Development of the Advanced Encryption Standard

    Science.gov (United States)

    Nechvatal, James; Barker, Elaine; Dodson, Donna; Dworkin, Morris; Foti, James; Roback, Edward

    1999-01-01

    In 1997, the National Institute of Standards and Technology (NIST) initiated a process to select a symmetric-key encryption algorithm to be used to protect sensitive (unclassified) Federal information in furtherance of NIST’s statutory responsibilities. In 1998, NIST announced the acceptance of 15 candidate algorithms and requested the assistance of the cryptographic research community in analyzing the candidates. This analysis included an initial examination of the security and efficiency characteristics for each algorithm. NIST has reviewed the results of this research and selected five algorithms (MARS, RC6™, Rijndael, Serpent and Twofish) as finalists. The research results and rationale for the selection of the finalists are documented in this report. The five finalists will be the subject of further study before the selection of one or more of these algorithms for inclusion in the Advanced Encryption Standard.

  5. Critical analysis of the Bennett-Riedel attack on secure cryptographic key distributions via the Kirchhoff-Law-Johnson-noise scheme.

    Science.gov (United States)

    Kish, Laszlo B; Abbott, Derek; Granqvist, Claes G

    2013-01-01

    Recently, Bennett and Riedel (BR) (http://arxiv.org/abs/1303.7435v1) argued that thermodynamics is not essential in the Kirchhoff-law-Johnson-noise (KLJN) classical physical cryptographic exchange method in an effort to disprove the security of the KLJN scheme. They attempted to demonstrate this by introducing a dissipation-free deterministic key exchange method with two batteries and two switches. In the present paper, we first show that BR's scheme is unphysical and that some elements of its assumptions violate basic protocols of secure communication. All our analyses are based on a technically unlimited Eve with infinitely accurate and fast measurements limited only by the laws of physics and statistics. For non-ideal situations and active (invasive) attacks, the uncertainty principle between measurement duration and statistical errors makes it impossible for Eve to extract the key regardless of the accuracy or speed of her measurements. To show that thermodynamics and noise are essential for the security, we crack the BR system with 100% success via passive attacks, in ten different ways, and demonstrate that the same cracking methods do not function for the KLJN scheme that employs Johnson noise to provide security underpinned by the Second Law of Thermodynamics. We also present a critical analysis of some other claims by BR; for example, we prove that their equations for describing zero security do not apply to the KLJN scheme. Finally, we give mathematical security proofs for each BR-attack against the KLJN scheme and conclude that the information theoretic (unconditional) security of the KLJN method has not been successfully challenged.

  6. Quantum copying and simplification of the quantum Fourier transform

    Science.gov (United States)

    Niu, Chi-Sheng

    Theoretical studies of quantum computation and quantum information theory are presented in this thesis. Three topics are considered: simplification of the quantum Fourier transform in Shor's algorithm, optimal eavesdropping in the BB84 quantum cryptographic protocol, and quantum copying of one qubit. The quantum Fourier transform preceding the final measurement in Shor's algorithm is simplified by replacing a network of quantum gates with one that has fewer and simpler gates controlled by classical signals. This simplification results from an analysis of the network using the consistent history approach to quantum mechanics. The optimal amount of information which an eavesdropper can gain, for a given level of noise in the communication channel, is worked out for the BB84 quantum cryptographic protocol. The optimal eavesdropping strategy is expressed in terms of various quantum networks. A consistent history analysis of these networks using two conjugate quantum bases shows how the information gain in one basis influences the noise level in the conjugate basis. The no-cloning property of quantum systems, which is the physics behind quantum cryptography, is studied by considering copying machines that generate two imperfect copies of one qubit. The best qualities these copies can have are worked out with the help of the Bloch sphere representation for one qubit, and a quantum network is worked out for an optimal copying machine. If the copying machine does not have additional ancillary qubits, the copying process can be viewed using a 2-dimensional subspace in a product space of two qubits. A special representation of such a two-dimensional subspace makes possible a complete characterization of this type of copying. This characterization in turn leads to simplified eavesdropping strategies in the BB84 and the B92 quantum cryptographic protocols.

  7. Network perimeter security building defense in-depth

    CERN Document Server

    Riggs, Cliff

    2003-01-01

    PREFACE: Who is this Book For? The Path to Network Security; Who Should Read This Book? MANAGING NETWORK SECURITY: The Big Picture: Security Policies from A to Z; Administrative Countermeasures; Physical Countermeasures; Technological Countermeasures; Creating the Security Standards Document; Creating the Configuration Guide Document; Pulling it All Together: Sample Security Policy Creation; Proteris Security Standards and Procedures. THE NETWORK STACK AND SECURITY: Connecting the Network; Protocols; Servers and Hosts. CRYPTOGRAPHY AND VPN TERMINOLOGY: Keys; Certificates; Hashing; Digital Signatures; Common Encryption Algorithms; Split

  8. Enabling Medical Device Interoperability for the Integrated Clinical Environment

    Science.gov (United States)

    2016-02-01

    The MD PnP Program had already tested different technologies to carry out this migration (e.g. the Hibernate framework to persist Java objects into...Security, a powerful and highly customizable authentication and access control framework, and BCrypt, the Java implementation of a hashing algorithm...including C, C++, and Java . Our OpenICE demo code is written in JavaScript and runs in the web browser in order to make it as easy as possible for

  9. Subjective Audio Quality over a Secure IEEE 802.11n Draft 2.0 Wireless Local Area Network

    Science.gov (United States)

    2009-03-01

    encrypted with the key K. A significant vulnerability in the Diffie-Hellman algorithm is the man-in-the-middle (MitM) attack. The encrypted... ZRTP) which prevents MitM attacks through the use of shared keys [Zim06]. The Zfone Project IP phone software uses ZRTP for key negotiation. ZRTP... system by reading the SAS as displayed on a screen [Zim06]. ZRTP provides further protection from MitM attacks by using some hashed key material for

  10. Microbiological quality of five potato products obtained at retail markets.

    OpenAIRE

    Duran, A P; Swartzentruber, A; Lanier, J M; Wentz, B A; Schwab, A H; Barnard, R J; Read, R B

    1982-01-01

    The microbiological quality of frozen hash brown potatoes, dried hash brown potatoes with onions, frozen french fried potatoes, dried instant mashed potatoes, and potato salad was determined by a national sampling at the retail level. A wide range of results was obtained, with most sampling units of each product having excellent microbiological quality. Geometric mean aerobic plate counts were as follows: dried hash brown potatoes, 270/g; frozen hash brown potatoes with onions, 580/g; frozen...

  11. Efficient Sampling of the Structure of Crypto Generators' State Transition Graphs

    Science.gov (United States)

    Keller, Jörg

    Cryptographic generators, e.g. stream cipher generators like the A5/1 used in GSM networks or pseudo-random number generators, are widely used in cryptographic network protocols. Basically, they are finite state machines with deterministic transition functions. Their state transition graphs typically cannot be analyzed analytically, nor can they be explored completely because of their size which typically is at least n = 2^64. Yet, their structure, i.e. number and sizes of weakly connected components, is of interest because a structure deviating significantly from expected values for random graphs may form a distinguishing attack that indicates a weakness or backdoor. By sampling, one randomly chooses k nodes, derives their distribution onto connected components by graph exploration, and extrapolates these results to the complete graph. In known algorithms, the computational cost to determine the component for one randomly chosen node is up to O(√n), which severely restricts the sample size k. We present an algorithm where the computational cost to find the connected component for one randomly chosen node is O(1), so that a much larger sample size k can be analyzed in a given time. We report on the performance of a prototype implementation, and about preliminary analysis for several generators.

  12. Shall we trust WDDL?

    Science.gov (United States)

    Guilley, Sylvain; Chaudhuri, Sumanta; Sauvage, Laurent; Graba, Tarik; Danger, Jean-Luc; Hoogvorst, Philippe; Vong, Vinh-Nga; Nassar, Maxime; Flament, Florent

    Security is not only a matter of cryptographic algorithms' robustness but also a question of securing their implementation. P. Kocher’s differential power analysis (DPA) is one of the many side-channel attacks that are increasingly studied by the security community. Indeed, side-channel attacks (SCA) have proved to be very powerful on cryptographic algorithms such as DES and AES, customarily implemented in a wide variety of devices, ranging from smart-cards or ASICs to FPGAs. Among the proposed countermeasures, the “dual-rail with precharge logic” (DPL) aims at hiding information leaked by the circuit by making the power consumption independent of the calculation. However, DPL logic could be subject to second-order attacks exploiting timing differences between dual nets. In this article, we characterize by simulation the vulnerability due to timing unbalance in the eight DES substitution boxes implemented in DPL WDDL style. The characterization results in a classification of the nodes according to their timing unbalance. Our results show that the timing unbalance is a major weakness of the WDDL logic, and that it could be used to retrieve the key using a DPA attack. This vulnerability has been experimentally observed on a full DES implementation using WDDL style for an Altera Stratix EP1S25 FPGA.

  13. Secure Communication and Information Exchange using Authenticated Ciphertext Policy Attribute-Based Encryption in Mobile Ad-hoc Network

    Directory of Open Access Journals (Sweden)

    Samsul Huda

    2016-08-01

    Full Text Available MANETs are considered suitable for commercial applications such as law enforcement, conference meetings, and information sharing in a student classroom, as well as for critical services such as military operations, disaster relief, and rescue operations. In military operations especially, the battlefield is an open medium that naturally demands high mobility and flexibility; applying MANETs there makes these networks vulnerable to various types of attacks such as packet eavesdropping, data dissemination, message replay, message modification, and especially privacy issues. In this paper, we propose secure communication and information exchange in a MANET, considering both secure ad-hoc routing and secure information exchange. Regarding the privacy (anonymity) issue, we use a reliable asymmetric encryption scheme, CP-ABE (Ciphertext-Policy Attribute-Based Encryption), which protects user privacy by utilizing insensitive user attributes as the user identity. We also design protocols to implement the proposed scheme for various battlefield scenarios in a real environment using embedded devices. Our experimental results showed that the addition of HMAC (Keyed-Hash Message Authentication Code) and AES (Advanced Encryption Standard) schemes on a 1.2 GHz processor takes only about 4.452 ms of processing time, so we can confirm that our approach using CP-ABE with added HMAC and AES incurs low overhead.
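
    To make the HMAC and AES layer mentioned above concrete, the following sketch shows a generic encrypt-then-MAC construction (AES-CTR plus HMAC-SHA256). It is an assumption-laden illustration rather than the paper's CP-ABE implementation, and it presumes the third-party 'cryptography' package is available.

        # Encrypt-then-MAC sketch: AES-CTR for confidentiality, HMAC-SHA256 for
        # integrity/authentication.  Key sizes and framing are assumptions.
        import hmac, hashlib, os
        from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

        def seal(enc_key: bytes, mac_key: bytes, plaintext: bytes) -> bytes:
            nonce = os.urandom(16)
            encryptor = Cipher(algorithms.AES(enc_key), modes.CTR(nonce)).encryptor()
            ciphertext = encryptor.update(plaintext) + encryptor.finalize()
            tag = hmac.new(mac_key, nonce + ciphertext, hashlib.sha256).digest()
            return nonce + ciphertext + tag

        def open_blob(enc_key: bytes, mac_key: bytes, blob: bytes) -> bytes:
            nonce, ciphertext, tag = blob[:16], blob[16:-32], blob[-32:]
            expected = hmac.new(mac_key, nonce + ciphertext, hashlib.sha256).digest()
            if not hmac.compare_digest(tag, expected):
                raise ValueError("HMAC verification failed")
            decryptor = Cipher(algorithms.AES(enc_key), modes.CTR(nonce)).decryptor()
            return decryptor.update(ciphertext) + decryptor.finalize()

        if __name__ == "__main__":
            k_enc, k_mac = os.urandom(32), os.urandom(32)
            blob = seal(k_enc, k_mac, b"coordinates 47.1N 15.4E")
            assert open_blob(k_enc, k_mac, blob) == b"coordinates 47.1N 15.4E"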

  14. A Lightweight Protocol for Secure Video Streaming.

    Science.gov (United States)

    Venčkauskas, Algimantas; Morkevicius, Nerijus; Bagdonas, Kazimieras; Damaševičius, Robertas; Maskeliūnas, Rytis

    2018-05-14

    The Internet of Things (IoT) introduces many new challenges which cannot be solved using traditional cloud and host computing models. A new architecture known as fog computing is emerging to address these technological and security gaps. Traditional security paradigms focused on providing perimeter-based protections and client/server point to point protocols (e.g., Transport Layer Security (TLS)) are no longer the best choices for addressing new security challenges in fog computing end devices, where energy and computational resources are limited. In this paper, we present a lightweight secure streaming protocol for the fog computing "Fog Node-End Device" layer. This protocol is lightweight, connectionless, supports broadcast and multicast operations, and is able to provide data source authentication, data integrity, and confidentiality. The protocol is based on simple and energy efficient cryptographic methods, such as Hash Message Authentication Codes (HMAC) and symmetrical ciphers, and uses modified User Datagram Protocol (UDP) packets to embed authentication data into streaming data. Data redundancy could be added to improve reliability in lossy networks. The experimental results summarized in this paper confirm that the proposed method efficiently uses energy and computational resources and at the same time provides security properties on par with the Datagram TLS (DTLS) standard.
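
    The idea of embedding authentication data into connectionless streaming packets can be sketched with the Python standard library alone. The field layout, tag truncation, and function names below are assumptions made for the example; they are not the packet format defined in the paper.

        # HMAC-authenticated UDP-style datagram sketch (stdlib only).
        import hmac, hashlib, struct

        TAG_LEN = 8   # truncated HMAC tag, a common trade-off on constrained devices

        def make_packet(key: bytes, stream_id: int, seq: int, payload: bytes) -> bytes:
            header = struct.pack("!HI", stream_id, seq)        # 2-byte id, 4-byte sequence number
            tag = hmac.new(key, header + payload, hashlib.sha256).digest()[:TAG_LEN]
            return header + tag + payload

        def parse_packet(key: bytes, packet: bytes):
            header, tag, payload = packet[:6], packet[6:6 + TAG_LEN], packet[6 + TAG_LEN:]
            expected = hmac.new(key, header + payload, hashlib.sha256).digest()[:TAG_LEN]
            if not hmac.compare_digest(tag, expected):
                return None                                     # drop forged or corrupted packet
            stream_id, seq = struct.unpack("!HI", header)
            return stream_id, seq, payload

        if __name__ == "__main__":
            key = b"\x01" * 32
            pkt = make_packet(key, stream_id=7, seq=42, payload=b"frame-bytes")
            print(parse_packet(key, pkt))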

  15. Lightweight Privacy-Preserving Authentication Protocols Secure against Active Attack in an Asymmetric Way

    Science.gov (United States)

    Cui, Yank; Kobara, Kazukuni; Matsuura, Kanta; Imai, Hideki

    As pervasive computing technologies develop rapidly, privacy protection becomes a crucial issue that needs to be handled very carefully. Typically, it is difficult to efficiently identify and manage a large number of low-cost pervasive devices, such as Radio Frequency Identification Devices (RFID), without leaking any private information. In particular, the attacker may not only eavesdrop on the communication in a passive way, but also mount an active attack by asking queries adaptively, which is obviously more dangerous. To address this problem, in this paper we propose two lightweight authentication protocols which are privacy-preserving against active attacks, in an asymmetric way. This asymmetric style with privacy-oriented simplification reduces the load on the low-cost devices and drastically decreases the computation cost of server-side management. This is because, unlike the usual management of identities, our approach requires neither synchronization nor exhaustive search in the database, which is very convenient for a large-scale system. The protocols are based on a fast asymmetric encryption with specialized simplification and only one cryptographic hash function, which consequently assigns easy work to the pervasive devices. Moreover, our results do not require the strong random oracle assumption.

  16. Algorithms

    Indian Academy of Sciences (India)

    polynomial) division have been found in Vedic Mathematics which are dated much before Euclid's algorithm. A programming language is used to describe an algorithm for execution on a computer. An algorithm expressed using a programming.

  17. STAR Algorithm Integration Team - Facilitating operational algorithm development

    Science.gov (United States)

    Mikles, V. J.

    2015-12-01

    The NOAA/NESDIS Center for Satellite Research and Applications (STAR) provides technical support of the Joint Polar Satellite System (JPSS) algorithm development and integration tasks. Utilizing data from the S-NPP satellite, JPSS generates over thirty Environmental Data Records (EDRs) and Intermediate Products (IPs) spanning atmospheric, ocean, cryosphere, and land weather disciplines. The Algorithm Integration Team (AIT) brings technical expertise and support to product algorithms, specifically in testing and validating science algorithms in a pre-operational environment. The AIT verifies that new and updated algorithms function in the development environment, enforces established software development standards, and ensures that delivered packages are functional and complete. AIT facilitates the development of new JPSS-1 algorithms by implementing a review approach based on the Enterprise Product Lifecycle (EPL) process. Building on relationships established during the S-NPP algorithm development process and coordinating directly with science algorithm developers, the AIT has implemented structured reviews with self-contained document suites. The process has supported algorithm improvements for products such as ozone, active fire, vegetation index, and temperature and moisture profiles.

  18. Secured Data Transmission Using Wavelet Based Steganography and cryptography

    OpenAIRE

    K.Ravindra Reddy; Ms Shaik Taj Mahaboob

    2014-01-01

    Steganography and cryptographic methods are used together with wavelets to increase the security of data while it is transmitted through networks. A related technology, digital watermarking, is the process of embedding information into a digital (image) signal. Before the plain text is embedded into the image, it is encrypted using the Data Encryption Standard (DES) algorithm. The encrypted text is embedded into the LL sub-band of the wavelet-decomposed image using Le...

  19. A survey of noninteractive zero knowledge proof system and its applications.

    Science.gov (United States)

    Wu, Huixin; Wang, Feng

    2014-01-01

    The zero knowledge proof system, which has received extensive attention since it was proposed, is an important branch of cryptography and computational complexity theory. Among its variants, the noninteractive zero knowledge proof system contains only one message, sent by the prover to the verifier. It is widely used in the construction of various types of cryptographic protocols and cryptographic algorithms because of its good privacy, authentication, and lower interaction complexity. This paper reviews and analyzes the basic principles of the noninteractive zero knowledge proof system and summarizes the research progress achieved on the following aspects: the definition and related models of the noninteractive zero knowledge proof system; noninteractive zero knowledge proof systems for NP problems; noninteractive statistical and perfect zero knowledge; the connections between noninteractive zero knowledge proof systems, interactive zero knowledge proof systems, and zaps; and the specific applications of the noninteractive zero knowledge proof system. This paper also points out future research directions.

  20. A Survey of Noninteractive Zero Knowledge Proof System and Its Applications

    Directory of Open Access Journals (Sweden)

    Huixin Wu

    2014-01-01

    Full Text Available The zero knowledge proof system, which has received extensive attention since it was proposed, is an important branch of cryptography and computational complexity theory. Among its variants, the noninteractive zero knowledge proof system contains only one message, sent by the prover to the verifier. It is widely used in the construction of various types of cryptographic protocols and cryptographic algorithms because of its good privacy, authentication, and lower interaction complexity. This paper reviews and analyzes the basic principles of the noninteractive zero knowledge proof system and summarizes the research progress achieved on the following aspects: the definition and related models of the noninteractive zero knowledge proof system; noninteractive zero knowledge proof systems for NP problems; noninteractive statistical and perfect zero knowledge; the connections between noninteractive zero knowledge proof systems, interactive zero knowledge proof systems, and zaps; and the specific applications of the noninteractive zero knowledge proof system. This paper also points out future research directions.

  1. Key Management in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Ismail Mansour

    2015-09-01

    Full Text Available Wireless sensor networks are a challenging field of research when it comes to security issues. Using low-cost sensor nodes with limited resources makes it difficult for cryptographic algorithms to function without impacting energy consumption and latency. In this paper, we focus on key management issues in multi-hop wireless sensor networks. These networks are easy to attack due to the open nature of the wireless medium. Intruders could try to penetrate the network, capture nodes or take control over particular nodes. In this context, it is important to revoke and renew keys that might be learned by malicious nodes. We propose several secure protocols for key revocation and key renewal based on symmetric encryption and elliptic curve cryptography. All protocols are secure, but have different security levels. Each proposed protocol is formally proven and analyzed using Scyther, an automatic verification tool for cryptographic protocols. For the sake of efficiency comparison, we implemented all protocols on real testbeds using TelosB motes and discussed their performance.

  2. Selfish Gene Algorithm Vs Genetic Algorithm: A Review

    Science.gov (United States)

    Ariff, Norharyati Md; Khalid, Noor Elaiza Abdul; Hashim, Rathiah; Noor, Noorhayati Mohamed

    2016-11-01

    Evolutionary algorithms are among the algorithms inspired by nature, and within little more than a decade hundreds of papers have reported successful applications of EAs. This paper considers the Selfish Gene Algorithm (SFGA), one of the latest evolutionary algorithms (EAs), inspired by the Selfish Gene Theory, an interpretation of Darwinian ideas put forward by the biologist Richard Dawkins in 1989. Following a brief introduction to the SFGA, the chronology of its evolution is presented. The purpose of this paper is to give an overview of the concepts of the SFGA as well as of its opportunities and challenges. Accordingly, the history and the steps involved in the algorithm are discussed, and its different applications, together with an analysis of these applications, are evaluated.

  3. The Rebound Attack and Subspace Distinguishers: Application to Whirlpool

    DEFF Research Database (Denmark)

    Lamberger, Mario; Mendel, Florian; Schläffer, Martin

    2015-01-01

    We introduce the rebound attack as a variant of differential cryptanalysis on hash functions and apply it to the hash function Whirlpool, standardized by ISO/IEC. We give attacks on reduced variants of the 10-round Whirlpool hash function and compression function. Our results are collisions for 5...

  4. Algorithms

    Indian Academy of Sciences (India)

    to as 'divide-and-conquer'. Although there has been a large effort in realizing efficient algorithms, there are not many universally accepted algorithm design paradigms. In this article, we illustrate algorithm design techniques such as balancing, greedy strategy, dynamic programming strategy, and backtracking or traversal of ...

  5. Algorithmic mathematics

    CERN Document Server

    Hougardy, Stefan

    2016-01-01

    Algorithms play an increasingly important role in nearly all fields of mathematics. This book allows readers to develop basic mathematical abilities, in particular those concerning the design and analysis of algorithms as well as their implementation. It presents not only fundamental algorithms like the sieve of Eratosthenes, the Euclidean algorithm, sorting algorithms, algorithms on graphs, and Gaussian elimination, but also discusses elementary data structures, basic graph theory, and numerical questions. In addition, it provides an introduction to programming and demonstrates in detail how to implement algorithms in C++. This textbook is suitable for students who are new to the subject and covers a basic mathematical lecture course, complementing traditional courses on analysis and linear algebra. Both authors have given this "Algorithmic Mathematics" course at the University of Bonn several times in recent years.
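
    For concreteness, two of the fundamental algorithms named above can be written out in a few lines of Python (a minimal sketch, not the book's C++ treatment):

        # Euclidean algorithm and sieve of Eratosthenes, two of the classics
        # mentioned in the description.
        def gcd(a: int, b: int) -> int:
            """Euclidean algorithm for the greatest common divisor."""
            while b:
                a, b = b, a % b
            return a

        def sieve_of_eratosthenes(n: int):
            """Return all primes up to n."""
            is_prime = [True] * (n + 1)
            is_prime[0:2] = [False, False]
            for p in range(2, int(n ** 0.5) + 1):
                if is_prime[p]:
                    for multiple in range(p * p, n + 1, p):
                        is_prime[multiple] = False
            return [i for i, flag in enumerate(is_prime) if flag]

        if __name__ == "__main__":
            print(gcd(252, 105))               # 21
            print(sieve_of_eratosthenes(30))   # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]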

  6. Critical analysis of the Bennett-Riedel attack on secure cryptographic key distributions via the Kirchhoff-Law-Johnson-noise scheme.

    Directory of Open Access Journals (Sweden)

    Laszlo B Kish

    Full Text Available Recently, Bennett and Riedel (BR) (http://arxiv.org/abs/1303.7435v1) argued that thermodynamics is not essential in the Kirchhoff-law-Johnson-noise (KLJN) classical physical cryptographic exchange method in an effort to disprove the security of the KLJN scheme. They attempted to demonstrate this by introducing a dissipation-free deterministic key exchange method with two batteries and two switches. In the present paper, we first show that BR's scheme is unphysical and that some elements of its assumptions violate basic protocols of secure communication. All our analyses are based on a technically unlimited Eve with infinitely accurate and fast measurements limited only by the laws of physics and statistics. For non-ideal situations and under active (invasive) attacks, the uncertainty principle between measurement duration and statistical errors makes it impossible for Eve to extract the key regardless of the accuracy or speed of her measurements. To show that thermodynamics and noise are essential for the security, we crack the BR system with 100% success via passive attacks, in ten different ways, and demonstrate that the same cracking methods do not function for the KLJN scheme that employs Johnson noise to provide security underpinned by the Second Law of Thermodynamics. We also present a critical analysis of some other claims by BR; for example, we prove that their equations for describing zero security do not apply to the KLJN scheme. Finally, we give mathematical security proofs for each BR-attack against the KLJN scheme and conclude that the information theoretic (unconditional) security of the KLJN method has not been successfully challenged.

  7. The joy of factoring

    CERN Document Server

    Wagstaff, Samuel S

    2013-01-01

    This book is about the theory and practice of integer factorization presented in a historic perspective. It describes about twenty algorithms for factoring and a dozen other number theory algorithms that support the factoring algorithms. Most algorithms are described both in words and in pseudocode to satisfy both number theorists and computer scientists. Each of the ten chapters begins with a concise summary of its contents. The book starts with a general explanation of why factoring integers is important. The next two chapters present number theory results that are relevant to factoring. Further on there is a chapter discussing, in particular, mechanical and electronic devices for factoring, as well as factoring using quantum physics and DNA molecules. Another chapter applies factoring to breaking certain cryptographic algorithms. Yet another chapter is devoted to practical vs. theoretical aspects of factoring. The book contains more than 100 examples illustrating various algorithms and theorems. It also co...
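
    As a small taste of the subject, the following sketch implements Pollard's rho method, one of the classic factoring algorithms such treatments cover; it is an illustrative example, not code taken from the book.

        # Pollard's rho factoring sketch for small composites.
        import math, random

        def pollard_rho(n: int) -> int:
            """Return a non-trivial factor of composite n (retries on failure)."""
            if n % 2 == 0:
                return 2
            while True:
                x = random.randrange(2, n)
                y, c, d = x, random.randrange(1, n), 1
                while d == 1:
                    x = (x * x + c) % n          # tortoise
                    y = (y * y + c) % n          # hare moves twice per step
                    y = (y * y + c) % n
                    d = math.gcd(abs(x - y), n)
                if d != n:                        # cycle collapsed onto n: try new parameters
                    return d

        if __name__ == "__main__":
            n = 8051                              # 83 * 97
            f = pollard_rho(n)
            print(f, n // f)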

  8. Essential algorithms a practical approach to computer algorithms

    CERN Document Server

    Stephens, Rod

    2013-01-01

    A friendly and accessible introduction to the most useful algorithms Computer algorithms are the basic recipes for programming. Professional programmers need to know how to use algorithms to solve difficult programming problems. Written in simple, intuitive English, this book describes how and when to use the most practical classic algorithms, and even how to create new algorithms to meet future needs. The book also includes a collection of questions that can help readers prepare for a programming job interview. Reveals methods for manipulating common data structures s

  9. Denni Algorithm An Enhanced Of SMS (Scan, Move and Sort) Algorithm

    Science.gov (United States)

    Aprilsyah Lubis, Denni; Salim Sitompul, Opim; Marwan; Tulus; Andri Budiman, M.

    2017-12-01

    Sorting has been a profound area for algorithm researchers, and many resources are invested in the search for better-performing sorting algorithms; for this purpose, many existing sorting algorithms have been studied in terms of their algorithmic complexity. Efficient sorting is important to optimize the use of other algorithms that require sorted lists to work correctly. Sorting is considered a fundamental problem in the study of algorithms for several reasons: the need to sort information is inherent in many applications; algorithms often use sorting as a key subroutine; many essential algorithm design techniques are represented in the body of sorting algorithms; and many engineering issues come to the fore when implementing sorting algorithms. Many algorithms for sorting unordered lists are well known, and one that makes sorting more economical and efficient is the SMS (Scan, Move and Sort) algorithm, an enhancement of Quicksort invented by Rami Mansi in 2010. This paper presents a new sorting algorithm called the Denni algorithm, an enhancement of the SMS algorithm in the average and worst cases. The Denni algorithm is compared with the SMS algorithm, and the results are promising.
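
    The abstract does not spell out the Denni or SMS procedures themselves, so as a point of reference the sketch below shows the plain Quicksort that SMS is described as enhancing (an illustrative baseline, not the Denni algorithm):

        # Plain recursive Quicksort, the baseline that SMS and Denni build on.
        def quicksort(items):
            if len(items) <= 1:
                return items
            pivot = items[len(items) // 2]
            smaller = [x for x in items if x < pivot]
            equal   = [x for x in items if x == pivot]
            larger  = [x for x in items if x > pivot]
            return quicksort(smaller) + equal + quicksort(larger)

        if __name__ == "__main__":
            print(quicksort([9, 1, 8, 2, 7, 3, 7]))   # [1, 2, 3, 7, 7, 8, 9]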

  10. Python algorithms mastering basic algorithms in the Python language

    CERN Document Server

    Hetland, Magnus Lie

    2014-01-01

    Python Algorithms, Second Edition explains the Python approach to algorithm analysis and design. Written by Magnus Lie Hetland, author of Beginning Python, this book is sharply focused on classical algorithms, but it also gives a solid understanding of fundamental algorithmic problem-solving techniques. The book deals with some of the most important and challenging areas of programming and computer science in a highly readable manner. It covers both algorithmic theory and programming practice, demonstrating how theory is reflected in real Python programs. Well-known algorithms and data struc

  11. Finite automata over algebraic structures: models and some methods of analysis

    Directory of Open Access Journals (Sweden)

    Volodymyr V. Skobelev

    2015-10-01

    Full Text Available In this paper some results of research on two new trends in finite automata theory are presented. To convey the value and aim of this research, a short retrospective analysis of the development of finite automata theory is given. The first trend deals with families of finite automata defined via recurrence relations on algebraic structures over finite rings. The problem of designing an algorithm that simulates, with some accuracy, any element of a given family of automata is investigated. A general scheme for designing families of hash functions defined by outputless automata is elaborated, and the computational security of these families of hash functions is analyzed. Automata defined on varieties with some algebra are presented and their homomorphisms are characterized. A special case of these automata, namely automata on elliptic curves, is investigated in detail. The second trend deals with quantum automata. Languages accepted by some basic models of quantum automata are characterized under the supposition that the unitary operators associated with the input alphabet commute with each other.

  12. File Detection On Network Traffic Using Approximate Matching

    Directory of Open Access Journals (Sweden)

    Frank Breitinger

    2014-09-01

    Full Text Available In recent years, Internet technologies have changed enormously, allowing faster Internet connections, higher data rates and mobile usage. Hence, it is possible to send huge amounts of data and files easily, which is often exploited by insiders or attackers to steal intellectual property. As a consequence, data leakage prevention systems (DLPS) have been developed, which analyze network traffic and raise an alert in case of a data leak. Although the overall concepts of the detection techniques are known, the systems are mostly closed and commercial. Within this paper we present a new technique for network traffic analysis based on approximate matching (a.k.a. fuzzy hashing), which is very common in digital forensics to correlate similar files. This paper demonstrates how to optimize approximate matching and apply it to single network packets. Our contribution is a straightforward concept which does not need a comprehensive configuration: hash the file and store the digest in the database. Within our experiments we obtained false positive rates between 10^-4 and 10^-5 and an algorithm throughput of over 650 Mbit/s.
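
    A much-simplified version of the "hash the file, store the digest, score the packets" idea can be sketched as follows. Real approximate matching (e.g. context-triggered piecewise hashing) is considerably more elaborate; the chunk size and function names here are assumptions for the example.

        # Chunk-hash matching sketch: store digests of fixed-size chunks of a
        # monitored file, then score packet payloads by how many chunks match.
        import hashlib

        CHUNK = 64   # bytes per chunk; arbitrary choice for the sketch

        def chunk_digests(data: bytes) -> set:
            return {hashlib.sha256(data[i:i + CHUNK]).digest()
                    for i in range(0, len(data) - CHUNK + 1, CHUNK)}

        def packet_score(database: set, payload: bytes) -> float:
            """Fraction of the packet's chunks that match a monitored file."""
            digests = chunk_digests(payload)
            if not digests:
                return 0.0
            return sum(d in database for d in digests) / len(digests)

        if __name__ == "__main__":
            secret = bytes(range(256)) * 16              # stand-in for a sensitive file
            db = chunk_digests(secret)
            leaked_packet = secret[512:1024]             # packet carrying part of the file
            unrelated_packet = b"\xff" * 512
            print(packet_score(db, leaked_packet))       # close to 1.0
            print(packet_score(db, unrelated_packet))    # 0.0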

  13. Embedded Lattice and Properties of Gram Matrix

    Directory of Open Access Journals (Sweden)

    Futa Yuichi

    2017-03-01

    Full Text Available In this article, we formalize in Mizar [14] the definition of the embedding of a lattice and its properties. We formally define an inner product on an embedded module. We also formalize properties of the Gram matrix and formally prove that the inverse of the Gram matrix for a rational lattice exists. The lattice of a Z-module is necessary for lattice problems, the LLL (Lenstra, Lenstra and Lovász) base reduction algorithm [16], and cryptographic systems based on lattices [17].
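
    As a purely numerical companion to the formalization (the article itself works in the Mizar proof system, not in Python), the Gram matrix of a small basis and its inverse can be computed as follows; the example basis is arbitrary and assumed linearly independent.

        # Gram matrix of a lattice basis and its inverse, using numpy for illustration.
        import numpy as np

        basis = np.array([[1, 0, 1],
                          [0, 2, 1],
                          [1, 1, 3]], dtype=float)     # rows are basis vectors

        gram = basis @ basis.T                         # G[i, j] = <b_i, b_j>
        print(gram)
        print(np.linalg.inv(gram))                     # exists because det(G) != 0
        print(np.linalg.det(gram))                     # equals det(basis)**2 = 9 here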

  14. Rigorous Analysis of a Randomised Number Field Sieve

    OpenAIRE

    Lee, Jonathan; Venkatesan, Ramarathnam

    2018-01-01

    Factorisation of integers $n$ is of number theoretic and cryptographic significance. The Number Field Sieve (NFS) introduced circa 1990, is still the state of the art algorithm, but no rigorous proof that it halts or generates relationships is known. We propose and analyse an explicitly randomised variant. For each $n$, we show that these randomised variants of the NFS and Coppersmith's multiple polynomial sieve find congruences of squares in expected times matching the best-known heuristic e...

  15. BITCOIN – A NEW GLOBAL CURRENCY, INVESTMENT OPPORTUNITY OR SOMETHING ELSE?

    OpenAIRE

    Buterin, Denis; Ribarić, Eda; Savić, Suzana

    2015-01-01

    The history of bitcoin started in 2008 when the article “Bitcoin: A Peer-to-Peer Electronic Cash System” was published under the pseudonym Satoshi Nakamoto. Since then the bitcoin has undergone significant changes marked with rapid growth and decline of its value, accompanied by public attention. The bitcoin is a system based on complex cryptographic algorithms without central authority that releases money or monitors transactions. There are discussions over the issue whether bitcoin could po...

  16. Evaluating SPARQL queries on massive RDF datasets

    KAUST Repository

    Al-Harbi, Razen; Abdelaziz, Ibrahim; Kalnis, Panos; Mamoulis, Nikos

    2015-01-01

    In this paper, we propose AdHash, a distributed RDF system which addresses the shortcomings of previous work. First, AdHash initially applies lightweight hash partitioning, which drastically minimizes the startup cost, while favoring the parallel processing of join patterns on subjects, without any data communication. Using a locality-aware planner, queries that cannot be processed in parallel are evaluated with minimal communication. Second, AdHash monitors the data access patterns and adapts dynamically to the query load by incrementally redistributing and replicating frequently accessed data. As a result, the communication cost for future queries is drastically reduced or even eliminated. Our experiments with synthetic and real data verify that AdHash (i) starts faster than all existing systems, (ii) processes thousands of queries before other systems become online, and (iii) gracefully adapts to the query load, being able to evaluate queries on billion-scale RDF data in sub-seconds. In this demonstration, the audience can use a graphical interface of AdHash to verify its performance superiority compared to state-of-the-art distributed RDF systems.
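
    The lightweight hash-partitioning step can be illustrated with a minimal sketch: each triple goes to the worker responsible for the hash of its subject, so subject-bound join patterns can be evaluated locally. The adaptive redistribution and replication of AdHash are not modelled here, and all names below are invented for the example.

        # Hash partitioning of RDF triples by subject (illustrative only).
        import hashlib

        def partition_of(subject: str, num_workers: int) -> int:
            digest = hashlib.md5(subject.encode("utf-8")).digest()   # non-cryptographic use
            return int.from_bytes(digest[:4], "big") % num_workers

        def partition_triples(triples, num_workers: int):
            shards = [[] for _ in range(num_workers)]
            for s, p, o in triples:
                shards[partition_of(s, num_workers)].append((s, p, o))
            return shards

        if __name__ == "__main__":
            data = [(":alice", ":knows",   ":bob"),
                    (":alice", ":worksAt", ":kaust"),
                    (":bob",   ":knows",   ":carol")]
            for i, shard in enumerate(partition_triples(data, num_workers=2)):
                print(i, shard)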

  17. Retraction notice to: "The Application of Symmetric Key Cryptographic Algorithms in Wireless Sensor Networks"

    Science.gov (United States)

    Si, Lingling; Ji, Zhigang; Wang, Zhihui

    This article has been retracted: please see Elsevier Policy on Article Withdrawal. This article has been retracted at the request of the Publisher. The authors have plagiarized a paper that had already appeared in "Queen's 25th Biennial Symposium on Communications", page 168-172, print ISBN 978-1-4244-5709-0. One of the conditions of submission of a paper for publication is that authors declare explicitly that their work is original and has not appeared in a publication elsewhere. Re-use of any data should be appropriately cited. As such this article represents a severe abuse of the scientific publishing system. The scientific community takes a very strong view on this matter and apologies are offered to readers of the journal that this was not detected during the submission process.

  18. Hybrid Cryptosystem Using Tiny Encryption Algorithm and LUC Algorithm

    Science.gov (United States)

    Rachmawati, Dian; Sharif, Amer; Jaysilen; Andri Budiman, Mohammad

    2018-01-01

    Security has become a very important issue in data transmission, and there are many methods to make files more secure. One of them is cryptography, a method to secure a file by turning it into hidden code that conceals the original content; anyone not involved in the cryptography cannot decrypt the hidden code to read the original file. Many methods are used in cryptography; one of them is the hybrid cryptosystem, which uses a symmetric algorithm to secure the file and an asymmetric algorithm to secure the symmetric algorithm's key. In this research, the TEA algorithm is used as the symmetric algorithm and the LUC algorithm as the asymmetric algorithm. The system is tested by encrypting and decrypting a file with the TEA algorithm and using the LUC algorithm to encrypt and decrypt the TEA key. The result of this research is that, when the TEA algorithm is used to encrypt the file, the ciphertext consists of characters from the ASCII (American Standard Code for Information Interchange) table written as hexadecimal numbers, and the ciphertext size increases by sixteen bytes for every eight characters added to the plaintext.
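
    The symmetric half of the scheme, TEA, is small enough to show in full. The sketch below follows the commonly published TEA reference routine (64-bit block, 128-bit key, 32 rounds); the LUC half and the paper's file handling are omitted, and the test key is arbitrary.

        # TEA block encryption/decryption sketch (32 rounds, 32-bit word arithmetic).
        MASK = 0xFFFFFFFF
        DELTA = 0x9E3779B9

        def tea_encrypt_block(v0: int, v1: int, key):
            k0, k1, k2, k3 = key
            s = 0
            for _ in range(32):
                s = (s + DELTA) & MASK
                v0 = (v0 + ((((v1 << 4) + k0) & MASK) ^ ((v1 + s) & MASK) ^ (((v1 >> 5) + k1) & MASK))) & MASK
                v1 = (v1 + ((((v0 << 4) + k2) & MASK) ^ ((v0 + s) & MASK) ^ (((v0 >> 5) + k3) & MASK))) & MASK
            return v0, v1

        def tea_decrypt_block(v0: int, v1: int, key):
            k0, k1, k2, k3 = key
            s = (DELTA * 32) & MASK
            for _ in range(32):
                v1 = (v1 - ((((v0 << 4) + k2) & MASK) ^ ((v0 + s) & MASK) ^ (((v0 >> 5) + k3) & MASK))) & MASK
                v0 = (v0 - ((((v1 << 4) + k0) & MASK) ^ ((v1 + s) & MASK) ^ (((v1 >> 5) + k1) & MASK))) & MASK
                s = (s - DELTA) & MASK
            return v0, v1

        if __name__ == "__main__":
            key = (0x0123, 0x4567, 0x89AB, 0xCDEF)             # arbitrary test key
            c = tea_encrypt_block(0x01234567, 0x89ABCDEF, key)
            assert tea_decrypt_block(*c, key) == (0x01234567, 0x89ABCDEF)
            print(tuple(hex(x) for x in c))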

  19. Sound algorithms

    OpenAIRE

    De Götzen , Amalia; Mion , Luca; Tache , Olivier

    2007-01-01

    International audience; We call sound algorithms the categories of algorithms that deal with digital sound signals. Sound algorithms appeared in the very infancy of computing. Sound algorithms present strong specificities that are the consequence of two dual considerations: the properties of the digital sound signal itself and its uses, and the properties of auditory perception.

  20. WebVR——Web Virtual Reality Engine Based on P2P network

    OpenAIRE

    zhihan LV; Tengfei Yin; Yong Han; Yong Chen; Ge Chen

    2011-01-01

    WebVR, a multi-user online virtual reality engine, is introduced. The main contributions are mapping the geographical space and the virtual space onto the P2P overlay network space, and dividing the three spaces with a quad-tree method. Each geocode is identified by a hash value, which is used to index the user list, the terrain data, and the model object data. Data sharing through an improved Kademlia network model is designed and implemented. In this model, the XOR algorithm is used to calculate the distanc...
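
    The XOR distance at the heart of Kademlia-style lookups can be sketched briefly; the identifier width, the use of SHA-1, and the 'geocode' key below are assumptions made for the illustration, not details taken from WebVR.

        # Kademlia-style XOR distance between hashed identifiers.
        import hashlib

        def node_id(name: str) -> int:
            """160-bit identifier derived from a name (e.g. a geocode or user id)."""
            return int.from_bytes(hashlib.sha1(name.encode("utf-8")).digest(), "big")

        def xor_distance(a: int, b: int) -> int:
            return a ^ b

        def closest_nodes(target: int, nodes: dict, k: int = 2):
            """Return the k node names whose identifiers are XOR-closest to the target."""
            return sorted(nodes, key=lambda name: xor_distance(nodes[name], target))[:k]

        if __name__ == "__main__":
            nodes = {name: node_id(name) for name in ["peer-a", "peer-b", "peer-c", "peer-d"]}
            terrain_key = node_id("geocode:31.1N-121.4E")   # hypothetical geocoded data key
            print(closest_nodes(terrain_key, nodes))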