WorldWideScience

Sample records for threshold cryptography improving

  1. Threshold quantum cryptography

    International Nuclear Information System (INIS)

    Tokunaga, Yuuki; Okamoto, Tatsuaki; Imoto, Nobuyuki

    2005-01-01

    We present the concept of threshold collaborative unitary transformation, or threshold quantum cryptography, a quantum analogue of threshold cryptography. In threshold quantum cryptography, classical shared secrets are distributed to several parties, and a subset of them whose number exceeds a threshold collaborates to compute a quantum cryptographic function while keeping each share secret within each party. The shared secrets are reusable if no cheating is detected. As a concrete example of this concept, we show a distributed protocol (with threshold) of conjugate coding
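
    A minimal sketch of the classical threshold primitive this concept builds on: Shamir (t, n) secret sharing over a prime field, written in Python. The prime, the threshold t = 3 and the share count n = 5 are illustrative choices, not parameters from the paper.

```python
# Minimal (t, n) Shamir secret sharing over GF(PRIME): any t shares recover
# the secret, fewer reveal nothing. Illustrative parameters, not the paper's.
import random

PRIME = 2_147_483_647  # Mersenne prime, large enough for a toy secret

def make_shares(secret, t, n):
    # Random polynomial of degree t-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    f = lambda x: sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0.
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

if __name__ == "__main__":
    shares = make_shares(secret=123456789, t=3, n=5)
    print(reconstruct(shares[:3]))  # any 3 of the 5 shares print 123456789
```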

  2. AUTHENTICATION ARCHITECTURE USING THRESHOLD CRYPTOGRAPHY IN KERBEROS FOR MOBILE AD HOC NETWORKS

    Directory of Open Access Journals (Sweden)

    Hadj Gharib

    2014-06-01

    Full Text Available The use of wireless technologies is gradually increasing, and the risks related to the use of these technologies are considerable. Due to their dynamically changing topology and open environment without the centralized policy control of a traditional network, mobile ad hoc networks (MANETs) are vulnerable to the presence of malicious nodes and attacks. The ideal solution to overcome a myriad of security concerns in MANETs is the use of a reliable authentication architecture. In this paper we propose a new key management scheme based on threshold cryptography in Kerberos for MANETs; the proposed scheme uses elliptic curve cryptography, which consumes fewer resources and is well adapted to the wireless environment. Our approach shows strength and effectiveness against attacks.

  3. RSA cryptography and multi prime RSA cryptography

    Science.gov (United States)

    Sani, Nur Atiqah Abdul; Kamarulhaili, Hailiza

    2017-08-01

    RSA cryptography is one of the most powerful and popular cryptosystems and is still being applied today. One variant of RSA cryptography is Multi Prime RSA (MPRSA) cryptography, an improved version of RSA. We only need to modify a few steps in the key generation part and apply the Chinese Remainder Theorem (CRT) in the decryption part to get the MPRSA algorithm. The focus of this research is to compare the standard RSA cryptography and MPRSA cryptography in a few aspects. The research shows that MPRSA cryptography is more efficient than RSA cryptography. A time-complexity comparison using Mathematica software was also conducted, and it shows that MPRSA cryptography takes less time, implying that its computational time is lower than that of RSA cryptography. Mathematica software version 9.0 and an HP ProBook 4331s laptop were used to check the timing and to implement both algorithms.
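
    To make the comparison concrete, here is a small Python sketch of multi-prime RSA with CRT decryption under illustrative assumptions (three small fixed primes, e = 65537, no padding); it is not the authors' Mathematica code.

```python
# Multi-prime RSA sketch with CRT decryption (illustrative primes, e = 65537,
# no padding); not the authors' code. Requires Python 3.8+ for pow(x, -1, m).
from math import gcd

p, q, r = 1009, 1013, 1019
n = p * q * r
phi = (p - 1) * (q - 1) * (r - 1)
e = 65537
assert gcd(e, phi) == 1
d = pow(e, -1, phi)                     # private exponent

def encrypt(m):
    return pow(m, e, n)

def decrypt_crt(c):
    # Decrypt modulo each prime with a reduced exponent, then recombine (CRT).
    m, modulus = 0, 1
    for prime in (p, q, r):
        residue = pow(c % prime, d % (prime - 1), prime)
        t = ((residue - m) * pow(modulus, -1, prime)) % prime
        m += modulus * t
        modulus *= prime
    return m

if __name__ == "__main__":
    msg = 424242
    assert decrypt_crt(encrypt(msg)) == msg
    print("round trip OK")
```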

  4. Security improvement by using a modified coherent state for quantum cryptography

    International Nuclear Information System (INIS)

    Lu, Y.J.; Zhu, Luobei; Ou, Z.Y.

    2005-01-01

    Weak coherent states as a photon source for quantum cryptography have a limit in secure data rate and transmission distance because of the presence of multiphoton events and loss in the transmission line. Two-photon events in a coherent state can be removed by a two-photon interference scheme. We investigate the security of utilizing this modified coherent state in quantum cryptography. A 4-dB improvement in the secure data rate or a nearly twofold increase in transmission distance over the coherent state is found. With a recently proposed and improved encoding strategy, further improvement is possible

  5. A Distributed Public Key Infrastructure Based on Threshold Cryptography for the HiiMap Next Generation Internet Architecture

    Directory of Open Access Journals (Sweden)

    Oliver Hanka

    2011-02-01

    Full Text Available In this article, a security extension for the HiiMap Next Generation Internet Architecture is presented. We describe a public key infrastructure which is integrated into the mapping infrastructure of the locator/identifier-split addressing scheme. The security approach is based on threshold cryptography, which enables a sharing of keys among the mapping servers. Hence, a more trustworthy and fair approach for a Next Generation Internet Architecture is fostered compared to the state-of-the-art approach. Additionally, we give an evaluation based on IETF AAA recommendations for security-related systems.

  6. Visual cryptography for image processing and security theory, methods, and applications

    CERN Document Server

    Liu, Feng

    2014-01-01

    This unique book describes the fundamental concepts, theories and practice of visual cryptography. The design, construction, analysis, and application of visual cryptography schemes (VCSs) are discussed in detail. Original, cutting-edge research is presented on probabilistic, size-invariant, threshold, concolorous, and cheating-immune VCS. Features: provides a thorough introduction to the field; examines various common problems in visual cryptography, including the alignment, flipping, cheating, distortion, and thin line problems; reviews a range of VCSs, including XOR-based visual cryptography
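
    As a concrete illustration of the threshold idea behind VCSs, the following Python sketch implements the textbook (2, 2) scheme with two-subpixel expansion; it is a generic construction, not code from the book.

```python
# Textbook (2, 2) visual cryptography: every secret pixel becomes two
# subpixels per share; overlaying the shares (logical OR) reveals the secret.
import random

PATTERNS = [(0, 1), (1, 0)]  # 1 = black subpixel, 0 = transparent

def make_shares(secret_row):        # secret_row: 0 = white pixel, 1 = black pixel
    share1, share2 = [], []
    for pixel in secret_row:
        a = random.choice(PATTERNS)
        b = a if pixel == 0 else tuple(1 - s for s in a)  # complement for black
        share1.extend(a)
        share2.extend(b)
    return share1, share2

def stack(s1, s2):
    # Stacking transparencies behaves like OR on the black subpixels.
    return [x | y for x, y in zip(s1, s2)]

if __name__ == "__main__":
    secret = [1, 0, 1, 1, 0]
    s1, s2 = make_shares(secret)
    print(stack(s1, s2))  # black pixels -> (1, 1); white pixels -> one black, one white
```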

  7. A NEW ERA OF CRYPTOGRAPHY: QUANTUM CRYPTOGRAPHY

    OpenAIRE

    Sandeepak Bhandari

    2016-01-01

    ABSTRACT Security is the first priority in today's digital world for secure communication between sender and receiver. Various cryptography techniques have been developed over time for secure communication. Quantum cryptography is one of the latest and most advanced cryptography techniques; it is different from all other cryptography techniques and more secure. As its name suggests, it is based on quantum physics, which makes it more secure than all other cryptography techniques and unbreakable. In this paper about...

  8. Pairing based threshold cryptography improving on Libert-Quisquater and Baek-Zheng

    DEFF Research Database (Denmark)

    Desmedt, Yvo; Lange, Tanja

    2006-01-01

    In this paper we apply techniques from secret sharing and threshold decryption to show how to properly design an ID-based threshold system in which one assumes no trust in any party. In our scheme: We avoid that any single machine ever knew the master secret s of the trusted authority (TA). Inste...

  9. Post-Quantum Cryptography

    DEFF Research Database (Denmark)

    Gauthier Umana, Valérie

    The public key cryptosystems that can resist these emerging attacks are called quantum resistant or post-quantum cryptosystems. There are mainly four classes of public-key cryptography that are believed to resist classical and quantum attacks: code-based cryptography, hash-based cryptography, lattice-based cryptography and multivariate public-key cryptography. In this thesis, we focus on the first two classes. In the first part, we introduce coding theory and give an overview of code-based cryptography. The main contribution is an attack on two promising variants of McEliece's cryptosystem, based on quasi...

  10. Neural cryptography with feedback.

    Science.gov (United States)

    Ruttor, Andreas; Kinzel, Wolfgang; Shacham, Lanir; Kanter, Ido

    2004-04-01

    Neural cryptography is based on a competition between attractive and repulsive stochastic forces. A feedback mechanism is added to neural cryptography which increases the repulsive forces. Using numerical simulations and an analytic approach, the probability of a successful attack is calculated for different model parameters. Scaling laws are derived which show that feedback improves the security of the system. In addition, a network with feedback generates a pseudorandom bit sequence which can be used to encrypt and decrypt a secret message.
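
    For readers unfamiliar with the underlying protocol, the sketch below synchronizes two tree parity machines by plain mutual Hebbian learning in Python; the feedback mechanism analyzed in the paper is deliberately omitted, and the parameters K, N, L are common textbook values rather than the authors' settings.

```python
# Two tree parity machines synchronizing by mutual Hebbian learning; the
# feedback mechanism studied in the paper is omitted. K, N, L are common
# textbook values, not the authors' settings.
import numpy as np

K, N, L = 3, 100, 3
rng = np.random.default_rng(0)

def tpm_output(w, x):
    sigma = np.sign(np.sum(w * x, axis=1))
    sigma[sigma == 0] = -1              # break ties
    return sigma, int(np.prod(sigma))

def hebbian(w, x, sigma, tau):
    # Update only the hidden units that agree with the common output, then clip.
    for k in range(K):
        if sigma[k] == tau:
            w[k] += x[k] * tau
    np.clip(w, -L, L, out=w)

wA = rng.integers(-L, L + 1, size=(K, N))
wB = rng.integers(-L, L + 1, size=(K, N))
steps = 0
while not np.array_equal(wA, wB):
    x = rng.choice([-1, 1], size=(K, N))   # common public input
    sA, tA = tpm_output(wA, x)
    sB, tB = tpm_output(wB, x)
    if tA == tB:                            # learn only when outputs agree
        hebbian(wA, x, sA, tA)
        hebbian(wB, x, sB, tB)
    steps += 1
print("synchronized after", steps, "exchanged outputs")
```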

  11. Coding and cryptography synergy for a robust communication

    CERN Document Server

    Zivic, Natasa

    2013-01-01

    This book presents the benefits of the synergetic effect of the combination of coding and cryptography. It introduces new directions for the interoperability between the components of a communication system. Coding and cryptography are standard components in today's distributed systems. The integration of cryptography into coding aspects is very interesting, as cryptography will become commonly used, even in industrial applications. The book is based on new developments of coding and cryptography, which use real numbers to express reliability values of bits instead of the binary values 0 and 1. The presented methods are novel and designed for noisy communication, which doesn't allow the successful use of cryptography. The rate of successful verifications is improved substantially, not only for standard or "hard" verification, but even more after the introduction of "soft" verification. A security analysis shows the impact on the security. Information security and cryptography follow the late developments of c...

  12. Chocolate Key Cryptography

    Science.gov (United States)

    Bachman, Dale J.; Brown, Ezra A.; Norton, Anderson H.

    2010-01-01

    Cryptography is the science of hidden or secret writing. More generally, cryptography refers to the science of safeguarding information. Cryptography allows people to use a public medium such as the Internet to transmit private information securely, thus enabling a whole range of conveniences, from online shopping to personally printed movie…

  13. Halftone visual cryptography.

    Science.gov (United States)

    Zhou, Zhi; Arce, Gonzalo R; Di Crescenzo, Giovanni

    2006-08-01

    Visual cryptography encodes a secret binary image (SI) into n shares of random binary patterns. If the shares are xeroxed onto transparencies, the secret image can be visually decoded by superimposing a qualified subset of transparencies, but no secret information can be obtained from the superposition of a forbidden subset. The binary patterns of the n shares, however, have no visual meaning and hinder the objectives of visual cryptography. Extended visual cryptography [1] was proposed recently to construct meaningful binary images as shares using hypergraph colourings, but the visual quality is poor. In this paper, a novel technique named halftone visual cryptography is proposed to achieve visual cryptography via halftoning. Based on the blue-noise dithering principles, the proposed method utilizes the void and cluster algorithm [2] to encode a secret binary image into n halftone shares (images) carrying significant visual information. The simulation shows that the visual quality of the obtained halftone shares is observably better than that attained by any available visual cryptography method known to date.

  14. Post-quantum cryptography

    Science.gov (United States)

    Bernstein, Daniel J.; Lange, Tanja

    2017-09-01

    Cryptography is essential for the security of online communication, cars and implanted medical devices. However, many commonly used cryptosystems will be completely broken once large quantum computers exist. Post-quantum cryptography is cryptography under the assumption that the attacker has a large quantum computer; post-quantum cryptosystems strive to remain secure even in this scenario. This relatively young research area has seen some successes in identifying mathematical operations for which quantum algorithms offer little advantage in speed, and then building cryptographic systems around those. The central challenge in post-quantum cryptography is to meet demands for cryptographic usability and flexibility without sacrificing confidence.

  15. Post-quantum cryptography.

    Science.gov (United States)

    Bernstein, Daniel J; Lange, Tanja

    2017-09-13

    Cryptography is essential for the security of online communication, cars and implanted medical devices. However, many commonly used cryptosystems will be completely broken once large quantum computers exist. Post-quantum cryptography is cryptography under the assumption that the attacker has a large quantum computer; post-quantum cryptosystems strive to remain secure even in this scenario. This relatively young research area has seen some successes in identifying mathematical operations for which quantum algorithms offer little advantage in speed, and then building cryptographic systems around those. The central challenge in post-quantum cryptography is to meet demands for cryptographic usability and flexibility without sacrificing confidence.

  16. Contemporary cryptography

    CERN Document Server

    Oppliger, Rolf

    2011-01-01

    Whether you're new to the field or looking to broaden your knowledge of contemporary cryptography, this newly revised edition of an Artech House classic puts all aspects of this important topic into perspective. Delivering an accurate introduction to the current state-of-the-art in modern cryptography, the book offers you an in-depth understanding of essential tools and applications to help you with your daily work. The second edition has been reorganized and expanded, providing mathematical fundamentals and important cryptography principles in the appropriate appendixes, rather than summarize

  17. Calculator Cryptography.

    Science.gov (United States)

    Hall, Matthew

    2003-01-01

    Uses cryptography to demonstrate the importance of algebra and the use of technology as an effective real application of mathematics. Explains simple encoding and decoding of messages for student learning of modular arithmetic. This elementary encounter with cryptography along with its historical and modern background serves to motivate student…

  18. Introduction to modern cryptography

    CERN Document Server

    Katz, Jonathan

    2014-01-01

    Praise for the First Edition: "This book is a comprehensive, rigorous introduction to what the authors name 'modern' cryptography. … a novel approach to how cryptography is taught, replacing the older, construction-based approach. … The concepts are clearly stated, both in an intuitive fashion and formally. … I would heartily recommend this book to anyone who is interested in cryptography. … The exercises are challenging and interesting, and can benefit readers of all academic levels." -IACR Book Reviews, January 2010. "Over the past 30 years, cryptography has been transformed from a mysterious

  19. Broadband Quantum Cryptography

    CERN Document Server

    Rogers, Daniel

    2010-01-01

    Quantum cryptography is a rapidly developing field that draws from a number of disciplines, from quantum optics to information theory to electrical engineering. By combining some fundamental quantum mechanical principles of single photons with various aspects of information theory, quantum cryptography represents a fundamental shift in the basis for security from numerical complexity to the fundamental physical nature of the communications channel. As such, it promises the holy grail of data security: theoretically unbreakable encryption. Of course, implementing quantum cryptography in real br

  20. Step to improve neural cryptography against flipping attacks.

    Science.gov (United States)

    Zhou, Jiantao; Xu, Qinzhen; Pei, Wenjiang; He, Zhenya; Szu, Harold

    2004-12-01

    Synchronization of neural networks by mutual learning has been demonstrated to be possible for constructing a key exchange protocol over a public channel. However, the neural cryptography schemes presented so far are not the most secure under regular flipping attack (RFA) and are completely insecure under majority flipping attack (MFA). We propose a scheme that splits the mutual information and the training process to improve the security of the neural cryptosystem against flipping attacks. Both analytical and simulation results show that the success probability of RFA on the proposed scheme can be decreased to the level of a brute force attack (BFA), and the success probability of MFA still decays exponentially with the weights' level L. The synchronization time of the parties also remains polynomial in L. Moreover, we analyze the security under an advanced flipping attack.

  1. Quantum cryptography communication technology

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jai Wan; Choi, Young Soo; Lee, Jae Chul; Choi, Yu Rak; Jung, Gwang Il; Jung, Jong Eun; Hong, Seok Boong; Koo, In Soo

    2007-09-15

    Quantum cryptography communication based on quantum mechanics provides unconditional security between two users. Even though huge advances have been made since 1984, a complete system is still far off. In real quantum cryptography communication systems, the unconditional security level is lowered by the imperfection of the communication unit. It is important to investigate the unconditional security of quantum communication protocols based on these experimental results and on implementation examples from around the world. The Japanese report titled 'Investigation report on the worldwide trends of quantum cryptography communications systems' was translated and summarized in this report. An unconditional security theory of quantum cryptography and real implementation examples in the domestic area are also investigated. The goal of the report is to make quantum cryptography communication a more useful and reliable alternative telecommunication infrastructure as part of the cyber security program for the class 1-E communication system of a nuclear power plant. Another goal of this report is to provide a quantitative decision basis for quantum cryptography communication when this secure communication system is to be used in the class 1-E communication channel of a nuclear power plant.

  2. Quantum cryptography communication technology

    International Nuclear Information System (INIS)

    Cho, Jai Wan; Choi, Young Soo; Lee, Jae Chul; Choi, Yu Rak; Jung, Gwang Il; Jung, Jong Eun; Hong, Seok Boong; Koo, In Soo

    2007-09-01

    Quantum cryptography communication based on quantum mechanics provides unconditional security between two users. Even though huge advances have been made since 1984, a complete system is still far off. In real quantum cryptography communication systems, the unconditional security level is lowered by the imperfection of the communication unit. It is important to investigate the unconditional security of quantum communication protocols based on these experimental results and on implementation examples from around the world. The Japanese report titled 'Investigation report on the worldwide trends of quantum cryptography communications systems' was translated and summarized in this report. An unconditional security theory of quantum cryptography and real implementation examples in the domestic area are also investigated. The goal of the report is to make quantum cryptography communication a more useful and reliable alternative telecommunication infrastructure as part of the cyber security program for the class 1-E communication system of a nuclear power plant. Another goal of this report is to provide a quantitative decision basis for quantum cryptography communication when this secure communication system is to be used in the class 1-E communication channel of a nuclear power plant.

  3. Lightweight Cryptography for Passive RFID Tags

    DEFF Research Database (Denmark)

    David, Mathieu

    2012-01-01

    were mostly unsatisfactory. As a conclusion, a new branch of cryptography, commonly called Lightweight Cryptography, emerged to address the issues of these tiny ubiquitous devices. This thesis presents a comprehensive engineering approach to lightweight cryptography, proposes a classification, and explores its various ramifications by giving key examples in each of them. We select two of these branches, ultralightweight cryptography and symmetric-key cryptography, and propose a cryptographic primitive in each of them. In the case of symmetric-key cryptography, we propose a stream cipher that has a footprint ... of an integrator for a particular application. Finally, we conclude that the research for finding robust cryptographic primitives in the branch of lightweight cryptography still has some nice days ahead, and that providing a secure cryptosystem for printed electronics RFID tags remains an open research topic.

  4. Understanding and applying cryptography and data security

    CERN Document Server

    Elbirt, Adam J

    2009-01-01

    Contents: Introduction (A Brief History of Cryptography and Data Security; Cryptography and Data Security in the Modern World; Existing Texts; Book Organization); Symmetric-Key Cryptography (Cryptosystem Overview; The Modulo Operator; Greatest Common Divisor; The Ring Zm; Homework Problems); Symmetric-Key Cryptography: Substitution Ciphers (Basic Cryptanalysis; Shift Ciphers; Affine Ciphers; Homework Problems); Symmetric-Key Cryptography: Stream Ciphers (Random Numbers; The One-Time Pad; Key Stream Generators; Real-World Applications; Homework Problems); Symmetric-Key Cryptography: Block Ciphers (The Data Encryption Standard; The Advance
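
    As a small taste of the substitution-cipher chapters listed above, here is an affine cipher sketch in Python; the key (a, b) = (5, 8) is an arbitrary illustrative choice and the code is not taken from the book.

```python
# Affine cipher: E(x) = (a*x + b) mod 26 with gcd(a, 26) = 1.
# Key (5, 8) is an arbitrary illustrative choice. Python 3.8+ for pow(a, -1, m).
from math import gcd

A, B, M = 5, 8, 26
assert gcd(A, M) == 1
A_INV = pow(A, -1, M)

def encrypt(text):
    return "".join(chr((A * (ord(c) - 65) + B) % M + 65)
                   for c in text.upper() if c.isalpha())

def decrypt(text):
    return "".join(chr(A_INV * (ord(c) - 65 - B) % M + 65) for c in text)

if __name__ == "__main__":
    ct = encrypt("AFFINE CIPHER")
    print(ct, decrypt(ct))   # decrypts back to AFFINECIPHER
```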

  5. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    Book chapter prepared for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release. Authors: Ariel Hamlin, Nabil... (contact: arkady@ll.mit.edu). Chapter 1, Cryptography for Big Data Security, 1.1 Introduction: With the amount

  6. Cryptography Engineering Design Principles and Practical Applications

    CERN Document Server

    Ferguson, Niels; Kohno, Tadayoshi

    2012-01-01

    The ultimate guide to cryptography, updated from an author team of the world's top cryptography experts. Cryptography is vital to keeping information safe, in an era when the formula to do so becomes more and more challenging. Written by a team of world-renowned cryptography experts, this essential guide is the definitive introduction to all major areas of cryptography: message security, key negotiation, and key management. You'll learn how to think like a cryptographer. You'll discover techniques for building cryptography into products from the start and you'll examine the many technical chan

  7. Fourier-based automatic alignment for improved Visual Cryptography schemes.

    Science.gov (United States)

    Machizaud, Jacques; Chavel, Pierre; Fournel, Thierry

    2011-11-07

    In Visual Cryptography, several images, called "shadow images", that separately contain no information, are overlapped to reveal a shared secret message. We develop a method to digitally register one printed shadow image acquired by a camera with a purely digital shadow image, stored in memory. Using Fourier techniques derived from Fourier Optics concepts, the idea is to enhance and exploit the quasi periodicity of the shadow images, composed by a random distribution of black and white patterns on a periodic sampling grid. The advantage is to speed up the security control or the access time to the message, in particular in the cases of a small pixel size or of large numbers of pixels. Furthermore, the interest of visual cryptography can be increased by embedding the initial message in two shadow images that do not have identical mathematical supports, making manual registration impractical. Experimental results demonstrate the successful operation of the method, including the possibility to directly project the result onto the printed shadow image.

  8. Quantum cryptography: towards realization in realistic conditions

    International Nuclear Information System (INIS)

    Imoto, M.; Koashi, M.; Shimizu, K.; Huttner, B.

    1997-01-01

    Many quantum cryptography schemes have been proposed based on assumptions such as no transmission loss, no measurement error, and an ideal single-photon generator. We have been trying to develop a theory of quantum cryptography considering realistic conditions. As such attempts, we propose quantum cryptography with coherent states, quantum cryptography with two-photon interference, and a generalization of two-state cryptography to two-mixed-state cases. (author)

  9. Quantum cryptography: towards realization in realistic conditions

    Energy Technology Data Exchange (ETDEWEB)

    Imoto, M; Koashi, M; Shimizu, K [NTT Basic Research Laboratories, 3-1 Morinosato-Wakamiya, Atsugi-shi, Kanagawa 243-01 (Japan)]; Huttner, B [Universite de Geneve, GAP-optique, 20, Rue de l'Ecole de Medecine CH1211, Geneve 4 (Switzerland)]

    1997-05-11

    Many quantum cryptography schemes have been proposed based on assumptions such as no transmission loss, no measurement error, and an ideal single-photon generator. We have been trying to develop a theory of quantum cryptography considering realistic conditions. As such attempts, we propose quantum cryptography with coherent states, quantum cryptography with two-photon interference, and a generalization of two-state cryptography to two-mixed-state cases. (author) 15 refs., 1 fig., 1 tab.

  10. Online Voting System Based on Image Steganography and Visual Cryptography

    Directory of Open Access Journals (Sweden)

    Biju Issac

    2017-01-01

    Full Text Available This paper discusses the implementation of an online voting system based on image steganography and visual cryptography. The system was implemented in Java EE on a web-based interface, with a MySQL database server and the Glassfish application server as the backend. After considering the requirements of an online voting system, current technologies on electronic voting schemes in the published literature were examined. Next, the cryptographic and steganography techniques best suited to the requirements of the voting system were chosen, and the software was implemented. We have incorporated into our system techniques such as a password-hash-based scheme, visual cryptography, F5 image steganography, and a threshold decryption cryptosystem. The analysis, design, and implementation phases of the software development of the voting system are discussed in detail. We also conducted a questionnaire survey and performed user acceptance testing of the system.

  11. Theory and practice of chaotic cryptography

    International Nuclear Information System (INIS)

    Amigo, J.M.; Kocarev, L.; Szczepanski, J.

    2007-01-01

    In this Letter we address some basic questions about chaotic cryptography, not least the very definition of chaos in discrete systems. We propose a conceptual framework and illustrate it with different examples from private and public key cryptography. We elaborate also on possible limits of chaotic cryptography

  12. Everyday cryptography fundamental principles and applications

    CERN Document Server

    Martin, Keith M

    2012-01-01

    Cryptography is a vital technology that underpins the security of information in computer networks. This book presents a comprehensive introduction to the role that cryptography plays in providing information security for technologies such as the Internet, mobile phones, payment cards, and wireless local area networks. Focusing on the fundamental principles that ground modern cryptography as they arise in modern applications, it avoids both an over-reliance on transient current technologies and overwhelming theoretical research. Everyday Cryptography is a self-contained and widely accessible in

  13. Fast and simple high-capacity quantum cryptography with error detection.

    Science.gov (United States)

    Lai, Hong; Luo, Ming-Xing; Pieprzyk, Josef; Zhang, Jun; Pan, Lei; Li, Shudong; Orgun, Mehmet A

    2017-04-13

    Quantum cryptography is commonly used to generate fresh secure keys with quantum signal transmission for instant use between two parties. However, research shows that the relatively low key generation rate hinders its practical use where a symmetric cryptography component consumes the shared key. That is, the security of the symmetric cryptography demands a frequent rate of key updates, which leads to a higher consumption of the internal one-time-pad communication bandwidth, since it requires the length of the key to be as long as that of the secret. In order to alleviate these issues, we develop a matrix algorithm for fast and simple high-capacity quantum cryptography. Our scheme can achieve secure private communication with fresh keys generated from Fibonacci- and Lucas-valued orbital angular momentum (OAM) states for the seed to construct recursive Fibonacci and Lucas matrices. Moreover, the proposed matrix algorithm for quantum cryptography can ultimately be simplified to matrix multiplication, which is implemented and optimized in modern computers. Most importantly, considerable information capacity can be gained effectively and efficiently through the recursive property of Fibonacci and Lucas matrices, thereby avoiding the restriction of physical conditions, such as the communication bandwidth.
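
    The "matrix multiplication" ingredient mentioned in the abstract can be illustrated with the standard Fibonacci Q-matrix; the sketch below generates Fibonacci numbers by repeated 2x2 multiplication and is a generic illustration, not the authors' key-construction algorithm.

```python
# Fibonacci numbers from repeated 2x2 Q-matrix multiplication:
# Q^n = [[F(n+1), F(n)], [F(n), F(n-1)]]. Generic illustration only.

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def fib_matrix(n):
    q = [[1, 1], [1, 0]]
    result = [[1, 0], [0, 1]]   # identity
    for _ in range(n):
        result = mat_mul(result, q)
    return result

if __name__ == "__main__":
    print(fib_matrix(10)[0][1])   # F(10) = 55
```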

  14. Fast and simple high-capacity quantum cryptography with error detection

    Science.gov (United States)

    Lai, Hong; Luo, Ming-Xing; Pieprzyk, Josef; Zhang, Jun; Pan, Lei; Li, Shudong; Orgun, Mehmet A.

    2017-04-01

    Quantum cryptography is commonly used to generate fresh secure keys with quantum signal transmission for instant use between two parties. However, research shows that the relatively low key generation rate hinders its practical use where a symmetric cryptography component consumes the shared key. That is, the security of the symmetric cryptography demands a frequent rate of key updates, which leads to a higher consumption of the internal one-time-pad communication bandwidth, since it requires the length of the key to be as long as that of the secret. In order to alleviate these issues, we develop a matrix algorithm for fast and simple high-capacity quantum cryptography. Our scheme can achieve secure private communication with fresh keys generated from Fibonacci- and Lucas-valued orbital angular momentum (OAM) states for the seed to construct recursive Fibonacci and Lucas matrices. Moreover, the proposed matrix algorithm for quantum cryptography can ultimately be simplified to matrix multiplication, which is implemented and optimized in modern computers. Most importantly, considerable information capacity can be gained effectively and efficiently through the recursive property of Fibonacci and Lucas matrices, thereby avoiding the restriction of physical conditions, such as the communication bandwidth.

  15. Cryptography Basics

    DEFF Research Database (Denmark)

    Wattenhofer, Roger; Förster, Klaus-Tycho

    2017-01-01

    Public-key cryptography is one of the biggest scientific achievements of the last century. Two people that never met before can establish a common secret in plain sight? Sounds like pure magic! The idea of this chapter is to reveal some of the tricks of this “crypto magic”. This chapter is not ta...

  16. Protocols and plan of quantum cryptography

    Directory of Open Access Journals (Sweden)

    Milorad S. Markagić

    2012-01-01

    Full Text Available Along with the growing importance of the confidentiality of data and resources, there is a need to develop systems that provide confidentiality. Currently, the most widely used systems are classical cryptographic systems and public key encryption systems. However, none of these systems provides a solution for the famous 'catch 22' of cryptography. Owing to the intensive development of quantum mechanics, in the last 30 years an entirely new kind of cryptography has emerged: quantum cryptography. Its greatest contribution is the possibility of detecting the interception of a communication channel by a third party. The question is: is this really true? The question arises: 'If quantum cryptography is so good, why is it not widely used?' The aim of this paper is, on the one hand, to define the basic mechanisms of quantum cryptography IP, and, on the other hand, to point to its shortcomings as they relate to the capabilities of today's devices and flaws in protocols.

  17. Quantum cryptography for secure free-space communications

    International Nuclear Information System (INIS)

    Hughes, R.J.; Buttler, W.T.; Kwiat, P.G.; Lamoreaux, S.K.; Luther, G.G.; Morgan, G.L.; Nordholt, J.E.; Peterson, C.G.

    1999-01-01

    The secure distribution of the secret random bit sequences known as key material is an essential precursor to their use for the encryption and decryption of confidential communications. Quantum cryptography is a new technique for secure key distribution with single-photon transmissions: Heisenberg's uncertainty principle ensures that an adversary can neither successfully tap the key transmissions, nor evade detection (eavesdropping raises the key error rate above a threshold value). The authors have developed experimental quantum cryptography systems based on the transmission of non-orthogonal photon polarization states to generate shared key material over line-of-sight optical links. Key material is built up using the transmission of a single photon per bit of an initial secret random sequence. A quantum-mechanically random subset of this sequence is identified, becoming the key material after a data reconciliation stage with the sender. The authors have developed and tested a free-space quantum key distribution (QKD) system over an outdoor optical path of ∼1 km at Los Alamos National Laboratory under nighttime conditions. Results show that free-space QKD can provide secure real-time key distribution between parties who have a need to communicate secretly. Finally, they examine the feasibility of surface-to-satellite QKD
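
    A purely classical Python sketch of the basis-sifting step used in such polarization-encoded BB84 systems (no channel noise, no eavesdropper, illustrative parameters only):

```python
# Classical simulation of BB84 basis sifting: no channel noise, no eavesdropper.
import random

N = 32
alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("+x") for _ in range(N)]   # rectilinear / diagonal
bob_bases   = [random.choice("+x") for _ in range(N)]

# Bob reads the correct bit when the bases match, a random outcome otherwise.
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: publicly compare bases and keep only the positions where they agree.
keep = [i for i in range(N) if alice_bases[i] == bob_bases[i]]
key_alice = [alice_bits[i] for i in keep]
key_bob   = [bob_bits[i] for i in keep]
print(len(keep), "sifted bits; keys match:", key_alice == key_bob)
```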

  18. A Generic Simulation Framework for Non-Entangled based Experimental Quantum Cryptography and Communication: Quantum Cryptography and Communication Simulator (QuCCs)

    Science.gov (United States)

    Buhari, Abudhahir; Zukarnain, Zuriati Ahmad; Khalid, Roszelinda; Zakir Dato', Wira Jaafar Ahmad

    2016-11-01

    The applications of quantum information science are moving towards bigger and better heights for next-generation technology. Especially in the fields of quantum cryptography and quantum computation, the world has already witnessed various ground-breaking tangible products and promising results. Quantum cryptography is one of the mature fields of quantum mechanics and is already available in the market. The current state of quantum cryptography is still under active research in order to reach the heights of digital cryptography. The complexity of quantum cryptography is higher due to the combination of hardware and software. The lack of an effective simulation tool to design and analyze quantum cryptography experiments delays progress towards success. In this paper, we propose a framework to achieve an effective non-entanglement-based quantum cryptography simulation tool. We applied a hybrid simulation technique, i.e., discrete event, continuous event, and system dynamics. We also highlight the limitations of experiments based on a commercial photonic simulation tool. Finally, we discuss ideas for achieving a one-stop simulation package for quantum-based secure key distribution experiments. All the modules of the simulation framework are viewed from the computer science perspective.

  19. Low power cryptography

    International Nuclear Information System (INIS)

    Kitsos, P; Koufopavlou, O; Selimis, G; Sklavos, N

    2005-01-01

    Today more and more sensitive data is stored digitally. Bank accounts, medical records and personal emails are some categories of data that must be kept secure. The science of cryptography tries to counter this lack of security. Data confidentiality, authentication, non-repudiation and data integrity are some of the main parts of cryptography. The evolution of cryptography has led to very complex cryptographic models which could not have been implemented until a few years ago. The use of systems with increasing complexity, which usually are more secure, results in lower throughput rates and more energy consumption. However, the evolution of a cipher has no practical impact if it has only a theoretical background. Every encryption algorithm should exploit as much as possible the conditions of the specific system without ignoring the physical, area and timing limitations. This fact requires new ways of designing architectures for secure and reliable crypto systems. A main issue in the design of crypto systems is the reduction of power consumption, especially for portable systems such as smart cards. (invited paper)

  20. Report on Pairing-based Cryptography.

    Science.gov (United States)

    Moody, Dustin; Peralta, Rene; Perlner, Ray; Regenscheid, Andrew; Roginsky, Allen; Chen, Lily

    2015-01-01

    This report summarizes study results on pairing-based cryptography. The main purpose of the study is to form NIST's position on standardizing and recommending pairing-based cryptography schemes currently published in research literature and standardized in other standard bodies. The report reviews the mathematical background of pairings. This includes topics such as pairing-friendly elliptic curves and how to compute various pairings. It includes a brief introduction to existing identity-based encryption (IBE) schemes and other cryptographic schemes using pairing technology. The report provides a complete study of the current status of standard activities on pairing-based cryptographic schemes. It explores different application scenarios for pairing-based cryptography schemes. As an important aspect of adopting pairing-based schemes, the report also considers the challenges inherent in validation testing of cryptographic algorithms and modules. Based on the study, the report suggests an approach for including pairing-based cryptography schemes in the NIST cryptographic toolkit. The report also outlines several questions that will require further study if this approach is followed.

  1. Coding Theory, Cryptography and Related Areas

    DEFF Research Database (Denmark)

    Buchmann, Johannes; Stichtenoth, Henning; Tapia-Recillas, Horacio

    Proceedings of an International Conference on Coding Theory, Cryptography and Related Areas, held in Guanajuato, Mexico, in April 1998.

  2. Approach to design neural cryptography: a generalized architecture and a heuristic rule.

    Science.gov (United States)

    Mu, Nankun; Liao, Xiaofeng; Huang, Tingwen

    2013-06-01

    Neural cryptography, a type of public key exchange protocol, is widely considered an effective method for sharing a common secret key between two neural networks over public channels. How to design neural cryptography remains a great challenge. In this paper, in order to provide an approach to solving this challenge, a generalized network architecture and a significant heuristic rule are designed. The proposed generic framework is named the tree state classification machine (TSCM), which extends and unifies the existing structures, i.e., the tree parity machine (TPM) and the tree committee machine (TCM). Furthermore, we carefully study and find that the heuristic rule can improve the security of TSCM-based neural cryptography. Therefore, TSCM and the heuristic rule can guide us in designing a great deal of effective neural cryptography candidates, among which it is possible to achieve more secure instances. Significantly, in the light of TSCM and the heuristic rule, we further show that our designed neural cryptography outperforms TPM (the most secure model at present) in security. Finally, a series of numerical simulation experiments are provided to verify the validity and applicability of our results.

  3. Conventional Cryptography.

    Science.gov (United States)

    Wright, Marie A.

    1993-01-01

    Cryptography is the science that renders data unintelligible to prevent its unauthorized disclosure or modification. Presents an application of matrices used in linear transformations to illustrate a cryptographic system. An example is provided. (17 references) (MDH)
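
    A classic example of matrices used as encrypting linear transformations is the Hill cipher; the sketch below shows a 2x2 version modulo 26. The key matrix is an arbitrary invertible example and may differ from the one used in the article.

```python
# 2x2 Hill cipher: letter pairs (A=0..Z=25) multiplied by a key matrix mod 26.
# The key is an arbitrary invertible example; Python 3.8+ for pow(det, -1, 26).
KEY = [[3, 3], [2, 5]]   # det = 9, gcd(9, 26) = 1, so the key is invertible
M = 26

def inv_key(k):
    det_inv = pow((k[0][0] * k[1][1] - k[0][1] * k[1][0]) % M, -1, M)
    return [[( k[1][1] * det_inv) % M, (-k[0][1] * det_inv) % M],
            [(-k[1][0] * det_inv) % M, ( k[0][0] * det_inv) % M]]

def apply(k, text):   # text: even length, A-Z only
    nums = [ord(c) - 65 for c in text]
    out = []
    for i in range(0, len(nums), 2):
        x, y = nums[i], nums[i + 1]
        out.append(chr((k[0][0] * x + k[0][1] * y) % M + 65))
        out.append(chr((k[1][0] * x + k[1][1] * y) % M + 65))
    return "".join(out)

if __name__ == "__main__":
    ct = apply(KEY, "HELP")
    print(ct, apply(inv_key(KEY), ct))   # HIAT HELP
```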

  4. Quantum cryptography

    International Nuclear Information System (INIS)

    Tittel, W.; Brendel, J.; Gissin, N.; Ribordy, G.; Zbinden, H.

    1999-01-01

    The principles of quantum cryptography based on non-local correlations of entangled photons are outlined. The method of coding and decoding of information and the experiments are also described. The prospects of the technique are briefly discussed. (Z.J.)

  5. Security, Privacy, and Applied Cryptography Engineering

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the Second International Conference on Security, Privacy and Applied Cryptography Engineering held in Chennai, India, in November 2012. The 11 papers presented were carefully reviewed and selected from 61 submissions. The papers are organized...... and applications, high-performance computing in cryptology and cryptography in ubiquitous devices....

  6. An introduction to mathematical cryptography

    CERN Document Server

    Hoffstein, Jeffrey; Silverman, Joseph H

    2014-01-01

    This self-contained introduction to modern cryptography emphasizes the mathematics behind the theory of public key cryptosystems and digital signature schemes. The book focuses on these key topics while developing the mathematical tools needed for the construction and security analysis of diverse cryptosystems. Only basic linear algebra is required of the reader; techniques from algebra, number theory, and probability are introduced and developed as required. This text provides an ideal introduction for mathematics and computer science students to the mathematical foundations of modern cryptography. The book includes an extensive bibliography and index; supplementary materials are available online. The book covers a variety of topics that are considered central to mathematical cryptography. Key topics include: classical cryptographic constructions, such as Diffie–Hellman key exchange, discrete logarithm-based cryptosystems, the RSA cryptosystem, and digital signatures; fundamental mathematical tools for cr...
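
    One of the constructions listed above, Diffie-Hellman key exchange, fits in a few lines; the following toy Python example uses a deliberately small prime and generator and is not meant to be secure.

```python
# Toy Diffie-Hellman key exchange; the prime and generator are far too small
# for real use and are illustrative only.
import random

p = 0xFFFFFFFB      # the prime 2**32 - 5; real systems use 2048+ bit groups
g = 5

a = random.randrange(2, p - 1)       # Alice's secret exponent
b = random.randrange(2, p - 1)       # Bob's secret exponent
A = pow(g, a, p)                     # sent Alice -> Bob
B = pow(g, b, p)                     # sent Bob -> Alice

assert pow(B, a, p) == pow(A, b, p)  # both sides derive the same shared secret
print(hex(pow(B, a, p)))
```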

  7. Lightweight cryptography for constrained devices

    DEFF Research Database (Denmark)

    Alippi, Cesare; Bogdanov, Andrey; Regazzoni, Francesco

    2014-01-01

    Lightweight cryptography is a rapidly evolving research field that responds to the request for security in resource-constrained devices. This need arises from crucial pervasive IT applications, such as those based on RFID tags, where cost and energy constraints drastically limit the solution complexity, with the consequence that traditional cryptography solutions become too costly to be implemented. In this paper, we survey design strategies and techniques suitable for implementing security primitives in constrained devices.

  8. Quantum cryptography approaching the classical limit.

    Science.gov (United States)

    Weedbrook, Christian; Pirandola, Stefano; Lloyd, Seth; Ralph, Timothy C

    2010-09-10

    We consider the security of continuous-variable quantum cryptography as we approach the classical limit, i.e., when the unknown preparation noise at the sender's station becomes significantly noisy or thermal (even by as much as 10^4 times greater than the variance of the vacuum mode). We show that, provided the channel transmission losses do not exceed 50%, the security of quantum cryptography is not dependent on the channel transmission, and is therefore incredibly robust against significant amounts of excess preparation noise. We extend these results to consider for the first time quantum cryptography at wavelengths considerably longer than optical and find that regions of security still exist all the way down to the microwave.

  9. Towards Practical Whitebox Cryptography: Optimizing Efficiency and Space Hardness

    DEFF Research Database (Denmark)

    Bogdanov, Andrey; Isobe, Takanori; Tischhauser, Elmar Wolfgang

    2016-01-01

    Whitebox cryptography aims to provide security for cryptographic algorithms in an untrusted environment where the adversary has full access to their implementation. Typical security goals for whitebox cryptography include key extraction security and decomposition security: Indeed, it should...... the practical requirements to whitebox cryptography in real-world applications such as DRM or mobile payments. Moreover, we formalize resistance towards decomposition in form of weak and strong space hardness at various security levels. We obtain bounds on space hardness in all those adversarial models...... real-world applications with whitebox cryptography....

  10. On the improvement of neural cryptography using erroneous transmitted information with error prediction.

    Science.gov (United States)

    Allam, Ahmed M; Abbas, Hazem M

    2010-12-01

    Neural cryptography deals with the problem of "key exchange" between two neural networks using the mutual learning concept. The two networks exchange their outputs (in bits) and the key between the two communicating parties is eventually represented in the final learned weights, when the two networks are said to be synchronized. Security of neural synchronization is put at risk if an attacker is capable of synchronizing with any of the two parties during the training process. Therefore, diminishing the probability of such a threat improves the reliability of exchanging the output bits through a public channel. The synchronization with feedback algorithm is one of the existing algorithms that enhances the security of neural cryptography. This paper proposes three new algorithms to enhance the mutual learning process. They mainly depend on disrupting the attacker confidence in the exchanged outputs and input patterns during training. The first algorithm is called "Do not Trust My Partner" (DTMP), which relies on one party sending erroneous output bits, with the other party being capable of predicting and correcting this error. The second algorithm is called "Synchronization with Common Secret Feedback" (SCSFB), where inputs are kept partially secret and the attacker has to train its network on input patterns that are different from the training sets used by the communicating parties. The third algorithm is a hybrid technique combining the features of the DTMP and SCSFB. The proposed approaches are shown to outperform the synchronization with feedback algorithm in the time needed for the parties to synchronize.

  11. High-rate measurement-device-independent quantum cryptography

    DEFF Research Database (Denmark)

    Pirandola, Stefano; Ottaviani, Carlo; Spedalieri, Gaetana

    2015-01-01

    Quantum cryptography achieves a formidable task - the remote distribution of secret keys by exploiting the fundamental laws of physics. Quantum cryptography is now headed towards solving the practical problem of constructing scalable and secure quantum networks. A significant step in this direction...

  12. Practical free space quantum cryptography

    International Nuclear Information System (INIS)

    Schmitt-Manderbach, T.; Weier, H.; Regner, N.; Kurtsiefer, C.; Weinfurter, H.

    2005-01-01

    Full text: Quantum cryptography, the secure key distribution between two parties, is the first practical application of quantum information technology. By encoding digital information into different polarization states of single photons, a string of key bits can be established between two parties, where the laws of quantum mechanics ensure that a possible eavesdropper has negligible knowledge of it. Having shown the feasibility of a long-distance quantum key distribution scheme, the emphasis of this work is to incorporate the previously developed compact sender and receiver modules into a quantum cryptography system suitable for every-day use in metropolitan areas. The permanent installation with automatic alignment allows us to investigate in detail the sensitivity of the free-space optical link to weather conditions and air turbulence commonly encountered in urban areas. We report on a successful free-space quantum cryptography experiment over a distance of 500 m between the rooftops of two university buildings using the BB84 protocol. The bit error rates obtained in first runs of this experiment, using faint coherent pulses with an average photon number ranging from 0.1 to 1.0, were measured to be below 3 percent for experiments carried out during the night, leading to average raw key rates (before error correction and privacy amplification) of 50 kbits per second. Thanks to its simplicity of implementation, our experiment brings free-space quantum key distribution a big step closer to practical usability in metropolitan networks and to a level with fibre-based quantum cryptography, which up to now offers the only ready-to-use systems available. Compact and automated free-space hardware is also a prerequisite for a possible earth-satellite quantum key distribution system in order to break the distance limit of about 100 km of current quantum cryptography schemes. (author)

  13. Position-based quantum cryptography and catalytic computation

    NARCIS (Netherlands)

    Speelman, F.

    2016-01-01

    In this thesis, we present several results along two different lines of research. The first part concerns the study of position-based quantum cryptography, a topic in quantum cryptography. By combining quantum mechanics with special relativity theory, new cryptographic tasks can be developed that

  14. Gröbner Bases, Coding, and Cryptography

    CERN Document Server

    Sala, Massimiliano; Perret, Ludovic

    2009-01-01

    Coding theory and cryptography allow secure and reliable data transmission, which is at the heart of modern communication. This book offers a comprehensive overview on the application of commutative algebra to coding theory and cryptography. It analyzes important properties of algebraic/geometric coding systems individually.

  15. APPLICATION OF NATURAL TRANSFORM IN CRYPTOGRAPHY

    OpenAIRE

    Chindhe, Anil Dhondiram; Kiwne, Sakharam

    2017-01-01

    Abstract: The newly defined integral transform, the Natural transform, has many applications in the fields of science and engineering. In this paper we describe the application of the Natural transform to cryptography. This provides an algorithm for cryptography in which we use the natural transform of the exponential function for encryption of the plain text and the corresponding inverse natural transform for decryption

  16. A Quantum Cryptography Communication Network Based on Software Defined Network

    Directory of Open Access Journals (Sweden)

    Zhang Hongliang

    2018-01-01

    Full Text Available With the development of the Internet, information security has attracted great attention in today's society, and a quantum cryptography communication network based on quantum key distribution (QKD) is a very important part of this field, since quantum key distribution combined with the one-time-pad encryption scheme can guarantee the unconditional security of the information. The secret key generated by quantum key distribution protocols is a very valuable resource, so making full use of key resources is particularly important. Software-defined networking (SDN) is a new type of network architecture; it separates the control plane and the data plane of network devices through OpenFlow technology, thus realizing flexible control of the network resources. In this paper, a quantum cryptography communication network model based on SDN is proposed to realize flexible control of quantum key resources in the whole cryptography communication network. Moreover, we propose a routing algorithm which takes into account both the hop count and the end-to-end available keys, so that the secret key generated by QKD can be used effectively. We also simulate this quantum cryptography communication network, and the results show that, based on SDN and the proposed routing algorithm, the performance of this network is improved owing to the effective use of the quantum key resources.
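
    A minimal sketch of the kind of key-aware routing rule described above: drop links whose remaining key material is below the demand, then pick the fewest-hop path. The topology, key counts and selection rule are hypothetical illustrations, not the paper's algorithm.

```python
# Hypothetical key-aware routing: drop links whose stored key material is below
# the demand, then take the fewest-hop path on what remains.
import heapq

EDGES = [("A", "B", 800), ("B", "D", 200), ("A", "C", 900), ("C", "D", 700)]
# each tuple: (node, node, available key bits on that QKD link)

def build_graph(edges, demand):
    graph = {}
    for u, v, keys in edges:
        if keys >= demand:                       # link has enough key material
            graph.setdefault(u, []).append(v)
            graph.setdefault(v, []).append(u)
    return graph

def fewest_hops(graph, src, dst):
    # Dijkstra with unit edge weights, i.e. plain hop count.
    dist, prev, pq = {src: 0}, {}, [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            path = [u]
            while u in prev:
                u = prev[u]
                path.append(u)
            return path[::-1]
        for v in graph.get(u, []):
            if v not in dist or d + 1 < dist[v]:
                dist[v], prev[v] = d + 1, u
                heapq.heappush(pq, (d + 1, v))
    return None

if __name__ == "__main__":
    print(fewest_hops(build_graph(EDGES, demand=500), "A", "D"))   # ['A', 'C', 'D']
```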

  17. Neural Network Approach to Locating Cryptography in Object Code

    Energy Technology Data Exchange (ETDEWEB)

    Jason L. Wright; Milos Manic

    2009-09-01

    Finding and identifying cryptography is a growing concern in the malware analysis community. In this paper, artificial neural networks are used to classify functional blocks from a disassembled program as being either cryptography related or not. The resulting system, referred to as NNLC (Neural Net for Locating Cryptography) is presented and results of applying this system to various libraries are described.

  18. Dynamics of neural cryptography.

    Science.gov (United States)

    Ruttor, Andreas; Kinzel, Wolfgang; Kanter, Ido

    2007-05-01

    Synchronization of neural networks has been used for public channel protocols in cryptography. In the case of tree parity machines the dynamics of both bidirectional synchronization and unidirectional learning is driven by attractive and repulsive stochastic forces. Thus it can be described well by a random walk model for the overlap between participating neural networks. For that purpose transition probabilities and scaling laws for the step sizes are derived analytically. Both these calculations as well as numerical simulations show that bidirectional interaction leads to full synchronization on average. In contrast, successful learning is only possible by means of fluctuations. Consequently, synchronization is much faster than learning, which is essential for the security of the neural key-exchange protocol. However, this qualitative difference between bidirectional and unidirectional interaction vanishes if tree parity machines with more than three hidden units are used, so that those neural networks are not suitable for neural cryptography. In addition, the effective number of keys which can be generated by the neural key-exchange protocol is calculated using the entropy of the weight distribution. As this quantity increases exponentially with the system size, brute-force attacks on neural cryptography can easily be made unfeasible.

  19. Dynamics of neural cryptography

    International Nuclear Information System (INIS)

    Ruttor, Andreas; Kinzel, Wolfgang; Kanter, Ido

    2007-01-01

    Synchronization of neural networks has been used for public channel protocols in cryptography. In the case of tree parity machines the dynamics of both bidirectional synchronization and unidirectional learning is driven by attractive and repulsive stochastic forces. Thus it can be described well by a random walk model for the overlap between participating neural networks. For that purpose transition probabilities and scaling laws for the step sizes are derived analytically. Both these calculations as well as numerical simulations show that bidirectional interaction leads to full synchronization on average. In contrast, successful learning is only possible by means of fluctuations. Consequently, synchronization is much faster than learning, which is essential for the security of the neural key-exchange protocol. However, this qualitative difference between bidirectional and unidirectional interaction vanishes if tree parity machines with more than three hidden units are used, so that those neural networks are not suitable for neural cryptography. In addition, the effective number of keys which can be generated by the neural key-exchange protocol is calculated using the entropy of the weight distribution. As this quantity increases exponentially with the system size, brute-force attacks on neural cryptography can easily be made unfeasible


  1. Report of the Public Cryptography Study Group.

    Science.gov (United States)

    American Council on Education, Washington, DC.

    Concerns of the National Security Agency (NSA) that information contained in some articles about cryptography in learned and professional journals and in monographs might be inimical to the national security are addressed. The Public Cryptography Study Group, with one dissenting opinion, recommends that a voluntary system of prior review of…

  2. Public Key Cryptography.

    Science.gov (United States)

    Tapson, Frank

    1996-01-01

    Describes public key cryptography, also known as RSA, which is a system using two keys, one used to put a message into cipher and another used to decipher the message. Presents examples using small prime numbers. (MKR)
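
    Since the record above presents RSA examples with small primes, a comparable toy calculation is sketched below (Python; the primes and message are illustrative and far too small for real security).

        # Toy RSA with small primes, purely illustrative (real keys use primes
        # hundreds of digits long).
        p, q = 61, 53
        n = p * q                     # modulus
        phi = (p - 1) * (q - 1)
        e = 17                        # public exponent, coprime to phi
        d = pow(e, -1, phi)           # private exponent (modular inverse, Python 3.8+)

        m = 65                        # message encoded as a number < n
        c = pow(m, e, n)              # encrypt: c = m^e mod n
        assert pow(c, d, n) == m      # decrypt: m = c^d mod n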

  3. Cryptography with chaos using Chua's system

    International Nuclear Information System (INIS)

    Oliveira, C H; Pizolato, J C Jr

    2011-01-01

    In recent years, chaotic systems have been applied in information security. These systems have complex and unpredictable behavior, which makes them more attractive for data cryptography applications. In this work, the chaotic behavior of signals generated by Chua's system is combined with the original information in order to obtain a safe cryptographic method. The experimental results demonstrate that the proposed scheme can be used in data cryptography applications.

  4. Three-Stage Quantum Cryptography Protocol under Collective-Rotation Noise

    OpenAIRE

    Wu, Linsen; Chen, Yuhua

    2015-01-01

    Information security is increasingly important as society migrates to the information age. Classical cryptography widely used nowadays is based on computational complexity, which means that it assumes that solving some particular mathematical problems is hard on a classical computer. With the development of supercomputers and, potentially, quantum computers, classical cryptography has more and more potential risks. Quantum cryptography provides a solution which is based on the Heisenberg unce...

  5. Practical Leakage-Resilient Symmetric Cryptography

    DEFF Research Database (Denmark)

    Faust, Sebastian; Pietrzak, Krzysztof; Schipper, Joachim

    2012-01-01

    Leakage resilient cryptography attempts to incorporate side-channel leakage into the black-box security model and designs cryptographic schemes that are provably secure within it. Informally, a scheme is leakage-resilient if it remains secure even if an adversary learns a bounded amount of arbitrary...

  6. Mathematical Background of Public Key Cryptography

    DEFF Research Database (Denmark)

    Frey, Gerhard; Lange, Tanja

    2005-01-01

    The two main systems used for public key cryptography are RSA and protocols based on the discrete logarithm problem in some cyclic group. We focus on the latter problem and state cryptographic protocols and mathematical background material.
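
    As an illustration of a protocol based on the discrete logarithm problem in a cyclic group, a textbook Diffie-Hellman key agreement is sketched below (Python; the tiny modulus and generator are illustrative only, real deployments use much larger groups or elliptic curves).

        # Textbook Diffie-Hellman key agreement over a tiny cyclic group (toy numbers).
        import secrets

        p, g = 23, 5                       # toy prime modulus and generator

        a = secrets.randbelow(p - 2) + 1   # Alice's secret exponent
        b = secrets.randbelow(p - 2) + 1   # Bob's secret exponent

        A = pow(g, a, p)                   # Alice sends A to Bob
        B = pow(g, b, p)                   # Bob sends B to Alice

        assert pow(B, a, p) == pow(A, b, p)   # both sides derive g^(a*b) mod p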

  7. Device independence for two-party cryptography and position verification with memoryless devices

    Science.gov (United States)

    Ribeiro, Jérémy; Thinh, Le Phuc; Kaniewski, Jedrzej; Helsen, Jonas; Wehner, Stephanie

    2018-06-01

    Quantum communication has demonstrated its usefulness for quantum cryptography far beyond quantum key distribution. One domain is two-party cryptography, whose goal is to allow two parties who may not trust each other to solve joint tasks. Another interesting application is position-based cryptography whose goal is to use the geographical location of an entity as its only identifying credential. Unfortunately, security of these protocols is not possible against an all-powerful adversary. However, if we impose some realistic physical constraints on the adversary, there exist protocols for which security can be proven, but these so far relied on the knowledge of the quantum operations performed during the protocols. In this work we improve the device-independent security proofs of Kaniewski and Wehner [New J. Phys. 18, 055004 (2016), 10.1088/1367-2630/18/5/055004] for two-party cryptography (with memoryless devices) and we add a security proof for device-independent position verification (also memoryless devices) under different physical constraints on the adversary. We assess the quality of the devices by observing a Bell violation, and, as for Kaniewski and Wehner [New J. Phys. 18, 055004 (2016), 10.1088/1367-2630/18/5/055004], security can be attained for any violation of the Clauser-Horne-Shimony-Holt inequality.

  8. Device-independence for two-party cryptography and position verification

    DEFF Research Database (Denmark)

    Ribeiro, Jeremy; Thinh, Le Phuc; Kaniewski, Jedrzej

    Quantum communication has demonstrated its usefulness for quantum cryptography far beyond quantum key distribution. One domain is two-party cryptography, whose goal is to allow two parties who may not trust each other to solve joint tasks. Another interesting application is position-based cryptography whose goal is to use the geographical location of an entity as its only identifying credential. Unfortunately, security of these protocols is not possible against an all powerful adversary. However, if we impose some realistic physical constraints on the adversary, there exist protocols for which security can be proven, but these so far relied on the knowledge of the quantum operations performed during the protocols. In this work we give device-independent security proofs of two-party cryptography and Position Verification for memoryless devices under different physical constraints on the adversary...

  9. Distinguishability of quantum states and shannon complexity in quantum cryptography

    Science.gov (United States)

    Arbekov, I. M.; Molotkov, S. N.

    2017-07-01

    The proof of the security of quantum key distribution is a rather complex problem. Security is defined in terms different from the requirements imposed on keys in classical cryptography. In quantum cryptography, the security of keys is expressed in terms of the closeness of the quantum state of an eavesdropper after key distribution to an ideal quantum state that is uncorrelated to the key of legitimate users. A metric of closeness between two quantum states is given by the trace metric. In classical cryptography, the security of keys is understood in terms of, say, the complexity of key search in the presence of side information. In quantum cryptography, side information for the eavesdropper is given by the whole volume of information on keys obtained from both quantum and classical channels. The fact that the mathematical apparatuses used in the proof of key security in classical and quantum cryptography are essentially different leads to misunderstanding and emotional discussions [1]. Therefore, one should be able to answer the question of how different cryptographic robustness criteria are related to each other. In the present study, it is shown that there is a direct relationship between the security criterion in quantum cryptography, which is based on the trace distance determining the distinguishability of quantum states, and the criterion in classical cryptography, which uses guesswork on the determination of a key in the presence of side information.
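
    For reference, the trace-distance security criterion mentioned above is conventionally written as follows (standard notation; epsilon denotes the tolerated deviation of the real state from the ideal, uncorrelated one):

        D(\rho, \sigma) = \tfrac{1}{2} \lVert \rho - \sigma \rVert_1
                        = \tfrac{1}{2} \operatorname{Tr} \sqrt{(\rho - \sigma)^{\dagger} (\rho - \sigma)},
        \qquad
        D(\rho_{\mathrm{real}}, \rho_{\mathrm{ideal}}) \le \varepsilon .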

  10. A prototype quantum cryptography system

    Energy Technology Data Exchange (ETDEWEB)

    Surasak, Chiangga

    1998-07-01

    In this work we have constructed a new secure quantum key distribution system based on the BB84 protocol. Many current state-of-the-art quantum cryptography systems encounter major problems concerning low bit rate, synchronization, and stabilization. Our quantum cryptography system utilizes only laser diodes and standard passive optical components, to enhance the stability and also to decrease the space requirements. The development of this demonstration for a practical quantum key distribution system is a consequence of our previous work on the quantum cryptographic system using optical fiber components for the transmitter and receiver. There we found that the optical fiber couplers should not be used due to the problems with space, stability and alignment. The goal of the synchronization is to use as little transmission capacities as possible. The experimental results of our quantum key distribution system show the feasibility of getting more than 90 % transmission capacities with the approaches developed in this work. Therefore it becomes feasible to securely establish a random key sequence at a rate of 1 to ∼ 5K bit/s by using our stable, compact, cheap, and user-friendly modules for quantum cryptography. (author)

  11. A prototype quantum cryptography system

    International Nuclear Information System (INIS)

    Chiangga Surasak

    1998-07-01

    In this work we have constructed a new secure quantum key distribution system based on the BB84 protocol. Many current state-of-the-art quantum cryptography systems encounter major problems concerning low bit rate, synchronization, and stabilization. Our quantum cryptography system utilizes only laser diodes and standard passive optical components, to enhance the stability and also to decrease the space requirements. The development of this demonstration for a practical quantum key distribution system is a consequence of our previous work on the quantum cryptographic system using optical fiber components for the transmitter and receiver. There we found that the optical fiber couplers should not be used due to the problems with space, stability and alignment. The goal of the synchronization is to use as little transmission capacities as possible. The experimental results of our quantum key distribution system show the feasibility of getting more than 90 % transmission capacities with the approaches developed in this work. Therefore it becomes feasible to securely establish a random key sequence at a rate of 1 to ∼ 5K bit/s by using our stable, compact, cheap, and user-friendly modules for quantum cryptography. (author)
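
    A minimal simulation of the BB84 basis-sifting step used by such systems is sketched below (Python; an ideal noiseless channel with no eavesdropper is assumed, and the variable names are illustrative rather than part of the cited prototype).

        # Minimal BB84 sifting simulation: random bits and bases on Alice's side,
        # random measurement bases on Bob's side; only matching-basis rounds are kept.
        import secrets

        n = 32
        alice_bits  = [secrets.randbelow(2) for _ in range(n)]
        alice_bases = [secrets.randbelow(2) for _ in range(n)]   # 0: rectilinear, 1: diagonal
        bob_bases   = [secrets.randbelow(2) for _ in range(n)]

        # Ideal channel: Bob's result equals Alice's bit when the bases match,
        # and is random otherwise.
        bob_bits = [a if ab == bb else secrets.randbelow(2)
                    for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

        sifted_alice = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
        sifted_bob   = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
        assert sifted_alice == sifted_bob    # noiseless channel: sifted keys agree
        print(f"kept {len(sifted_alice)} of {n} bits after sifting")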

  12. Non-commutative cryptography and complexity of group-theoretic problems

    CERN Document Server

    Myasnikov, Alexei; Ushakov, Alexander

    2011-01-01

    This book is about relations between three different areas of mathematics and theoretical computer science: combinatorial group theory, cryptography, and complexity theory. It explores how non-commutative (infinite) groups, which are typically studied in combinatorial group theory, can be used in public-key cryptography. It also shows that there is remarkable feedback from cryptography to combinatorial group theory because some of the problems motivated by cryptography appear to be new to group theory, and they open many interesting research avenues within group theory. In particular, a lot of emphasis in the book is put on studying search problems, as compared to decision problems traditionally studied in combinatorial group theory. Then, complexity theory, notably generic-case complexity of algorithms, is employed for cryptanalysis of various cryptographic protocols based on infinite groups, and the ideas and machinery from the theory of generic-case complexity are used to study asymptotically dominant prop...

  13. Three-Stage Quantum Cryptography Protocol under Collective-Rotation Noise

    Directory of Open Access Journals (Sweden)

    Linsen Wu

    2015-05-01

    Information security is increasingly important as society migrates to the information age. Classical cryptography widely used nowadays is based on computational complexity, which means that it assumes that solving some particular mathematical problems is hard on a classical computer. With the development of supercomputers and, potentially, quantum computers, classical cryptography has more and more potential risks. Quantum cryptography provides a solution which is based on the Heisenberg uncertainty principle and no-cloning theorem. While BB84-based quantum protocols are only secure when a single photon is used in communication, the three-stage quantum protocol is multi-photon tolerant. However, existing analyses assume perfect noiseless channels. In this paper, a multi-photon analysis is performed for the three-stage quantum protocol under the collective-rotation noise model. The analysis provides insights into the impact of the noise level on a three-stage quantum cryptography system.

  14. Quantum cryptography; Kvantova kryptografie

    Energy Technology Data Exchange (ETDEWEB)

    Tittel, W; Brendel, J; Gisin, N; Ribordy, G; Zbinden, H [GAP-Optique, Universite de Geneve, 20 rue de l'Ecole de Medecine, Geneva (Switzerland)]

    1999-07-01

    The principles of quantum cryptography based on non-local correlations of entanglement photons are outlined. The method of coding and decoding of information and experiments is also described. The prospects of the technique are briefly discussed. (Z.J.)

  15. Color extended visual cryptography using error diffusion.

    Science.gov (United States)

    Kang, InKoo; Arce, Gonzalo R; Lee, Heung-Kyu

    2011-01-01

    Color visual cryptography (VC) encrypts a color secret message into n color halftone image shares. Previous methods in the literature show good results for black and white or gray scale VC schemes, however, they are not sufficient to be applied directly to color shares due to different color structures. Some methods for color visual cryptography are not satisfactory in terms of producing either meaningless shares or meaningful shares with low visual quality, leading to suspicion of encryption. This paper introduces the concept of visual information pixel (VIP) synchronization and error diffusion to attain a color visual cryptography encryption method that produces meaningful color shares with high visual quality. VIP synchronization retains the positions of pixels carrying visual information of original images throughout the color channels and error diffusion generates shares pleasant to human eyes. Comparisons with previous approaches show the superior performance of the new method.

  16. QRS Detection Based on Improved Adaptive Threshold

    Directory of Open Access Journals (Sweden)

    Xuanyu Lu

    2018-01-01

    Cardiovascular disease is the leading cause of death around the world. In accomplishing quick and accurate diagnosis, automatic electrocardiogram (ECG) analysis algorithms play an important role, and their first step is QRS detection. The threshold algorithm for QRS complex detection is known for its high-speed computation and minimized memory storage. In this mobile era, the threshold algorithm can easily be ported to portable, wearable, and wireless ECG systems. However, the detection rate of the threshold algorithm still calls for improvement. An improved adaptive threshold algorithm for QRS detection is reported in this paper. The main steps of this algorithm are preprocessing, peak finding, and adaptive-threshold QRS detection. The detection rate is 99.41%, the sensitivity (Se) is 99.72%, and the specificity (Sp) is 99.69% on the MIT-BIH Arrhythmia database. A comparison with two other algorithms is also made to demonstrate its superiority. The suspicious abnormal area is shown at the end of the algorithm, and an RR-Lorenz plot is drawn for doctors and cardiologists to use as an aid for diagnosis.
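
    A generic adaptive-threshold peak detector in the spirit described above is sketched below (Python; the update constants and the synthetic test signal are assumptions for illustration and do not reproduce the exact algorithm of the cited paper).

        # Generic adaptive-threshold peak detector (illustrative). A local maximum is
        # accepted as a QRS candidate when it exceeds a threshold that tracks running
        # estimates of the signal and noise levels.
        import numpy as np

        def detect_qrs(x, fs, refractory=0.2):
            spk, npk = np.max(x[:2 * fs]) * 0.5, 0.0      # initial signal/noise levels
            threshold = npk + 0.25 * (spk - npk)
            peaks, last = [], -np.inf
            for i in range(1, len(x) - 1):
                if x[i] > x[i - 1] and x[i] >= x[i + 1]:  # local maximum
                    if x[i] > threshold and (i - last) > refractory * fs:
                        peaks.append(i)
                        last = i
                        spk = 0.125 * x[i] + 0.875 * spk  # update signal level
                    else:
                        npk = 0.125 * x[i] + 0.875 * npk  # update noise level
                    threshold = npk + 0.25 * (spk - npk)
            return peaks

        # Synthetic test: 1 Hz spikes on a noisy baseline sampled at 250 Hz.
        fs = 250
        t = np.arange(0, 10, 1 / fs)
        ecg = 0.05 * np.random.randn(t.size)
        ecg[::fs] += 1.0
        print(detect_qrs(ecg, fs))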

  17. Bent functions results and applications to cryptography

    CERN Document Server

    Tokareva, Natalia

    2015-01-01

    Bent Functions: Results and Applications to Cryptography offers a unique survey of the objects of discrete mathematics known as Boolean bent functions. As these maximal, nonlinear Boolean functions and their generalizations have many theoretical and practical applications in combinatorics, coding theory, and cryptography, the text provides a detailed survey of their main results, presenting a systematic overview of their generalizations and applications, and considering open problems in classification and systematization of bent functions. The text is appropriate for novices and advanced

  18. Implementation of diffie-Hellman key exchange on wireless sensor using elliptic curve cryptography

    DEFF Research Database (Denmark)

    Khajuria, Samant; Tange, Henrik

    2009-01-01

    This work describes a low-cost public key cryptography (PKC) based solution for security services such as authentication as required for wireless sensor networks. We have implemented a software approach using elliptic curve cryptography (ECC) over GF(2^m) in order to obtain stronger cryptography...

  19. Entropy in quantum information theory - Communication and cryptography

    DEFF Research Database (Denmark)

    Majenz, Christian

    in quantum Shannon theory. While immensely more entanglement-consuming, the variant of port based teleportation is interesting for applications like instantaneous non-local computation and attacks on quantum position-based cryptography. Port based teleportation cannot be implemented perfectly..., for vanishing error. As a byproduct, a new lower bound for the size of the program register for an approximate universal programmable quantum processor is derived. Finally, the mix is completed with a result in quantum cryptography. While quantum key distribution is the most well-known quantum cryptographic protocol, there has been increased interest in extending the framework of symmetric key cryptography to quantum messages. We give a new definition for information-theoretic quantum non-malleability, strengthening the previous definition by Ambainis et al. We show that quantum non-malleability implies secrecy...

  20. Comment on "Cheating prevention in visual cryptography".

    Science.gov (United States)

    Chen, Yu-Chi; Horng, Gwoboa; Tsai, Du-Shiau

    2012-07-01

    Visual cryptography (VC), proposed by Naor and Shamir, has numerous applications, including visual authentication and identification, steganography, and image encryption. In 2006, Horng showed that cheating is possible in VC, where some participants can deceive the remaining participants by forged transparencies. Since then, designing cheating-prevention visual secret-sharing (CPVSS) schemes has been studied by many researchers. In this paper, we cryptanalyze the Hu-Tzeng CPVSS scheme and show that it is not cheating immune. We also outline an improvement that helps to overcome the problem.

  1. Cheating prevention in visual cryptography.

    Science.gov (United States)

    Hu, Chih-Ming; Tzeng, Wen-Guey

    2007-01-01

    Visual cryptography (VC) is a method of encrypting a secret image into shares such that stacking a sufficient number of shares reveals the secret image. Shares are usually presented in transparencies. Each participant holds a transparency. Most of the previous research work on VC focuses on improving two parameters: pixel expansion and contrast. In this paper, we studied the cheating problem in VC and extended VC. We considered the attacks of malicious adversaries who may deviate from the scheme in any way. We presented three cheating methods and applied them on attacking existent VC or extended VC schemes. We improved one cheat-preventing scheme. We proposed a generic method that converts a VCS to another VCS that has the property of cheating prevention. The overhead of the conversion is near optimal in both contrast degression and pixel expansion.
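
    For readers unfamiliar with the underlying construction, a basic (2, 2) visual cryptography share generator in the style of Naor and Shamir is sketched below (Python; the two-subpixel patterns are one common choice, not the specific schemes analysed in the paper).

        # Basic (2,2) visual cryptography sketch with horizontal pixel expansion:
        # each secret pixel becomes two subpixels per share; stacking (OR-ing) the
        # shares reveals the secret, while a single share alone is random.
        import secrets

        def make_shares(secret_row):
            """secret_row: list of 0 (white) / 1 (black) pixels."""
            share1, share2 = [], []
            for pixel in secret_row:
                pattern = [0, 1] if secrets.randbelow(2) else [1, 0]
                share1 += pattern
                # white pixel: identical subpixels; black pixel: complementary subpixels
                share2 += pattern if pixel == 0 else [1 - pattern[0], 1 - pattern[1]]
            return share1, share2

        def stack(s1, s2):
            return [a | b for a, b in zip(s1, s2)]    # stacking transparencies = OR

        row = [1, 0, 1, 1, 0]
        s1, s2 = make_shares(row)
        print(stack(s1, s2))   # black pixels -> two dark subpixels, white -> one dark subpixel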

  2. Practical device-independent quantum cryptography via entropy accumulation.

    Science.gov (United States)

    Arnon-Friedman, Rotem; Dupuis, Frédéric; Fawzi, Omar; Renner, Renato; Vidick, Thomas

    2018-01-31

    Device-independent cryptography goes beyond conventional quantum cryptography by providing security that holds independently of the quality of the underlying physical devices. Device-independent protocols are based on the quantum phenomena of non-locality and the violation of Bell inequalities. This high level of security could so far only be established under conditions which are not achievable experimentally. Here we present a property of entropy, termed "entropy accumulation", which asserts that the total amount of entropy of a large system is the sum of its parts. We use this property to prove the security of cryptographic protocols, including device-independent quantum key distribution, while achieving essentially optimal parameters. Recent experimental progress, which enabled loophole-free Bell tests, suggests that the achieved parameters are technologically accessible. Our work hence provides the theoretical groundwork for experimental demonstrations of device-independent cryptography.

  3. Counterfactual quantum cryptography network with untrusted relay

    Science.gov (United States)

    Chen, Yuanyuan; Gu, Xuemei; Jiang, Dong; Xie, Ling; Chen, Lijun

    2015-07-01

    Counterfactual quantum cryptography allows two remote parties to share a secret key even though a physical particle is not in fact transmitted through the quantum channel. In order to extend the scope of counterfactual quantum cryptography, we use an untrusted relay to construct a multi-user network. The implementation issues are discussed to show that the scheme can be realized with current technologies. We also prove the practical security advantages of the scheme by eliminating the probability that an eavesdropper can directly access the signal or an untrusted relay can perform false operations.

  4. Optical hiding with visual cryptography

    Science.gov (United States)

    Shi, Yishi; Yang, Xiubo

    2017-11-01

    We propose an optical hiding method based on visual cryptography. In the hiding process, we convert the secret information into a set of fabricated phase-keys, which are completely independent of each other, intensity-detected-proof and image-covered, leading to the high security. During the extraction process, the covered phase-keys are illuminated with laser beams and then incoherently superimposed to extract the hidden information directly by human vision, without complicated optical implementations and any additional computation, resulting in the convenience of extraction. Also, the phase-keys are manufactured as the diffractive optical elements that are robust to the attacks, such as the blocking and the phase-noise. Optical experiments verify that the high security, the easy extraction and the strong robustness are all obtainable in the visual-cryptography-based optical hiding.

  5. Ear surgery techniques results on hearing threshold improvement

    Directory of Open Access Journals (Sweden)

    Farhad Mokhtarinejad

    2013-01-01

    Background: Bone conduction (BC) threshold depression is not always due to sensorineural hearing loss; sometimes it is an artifact caused by middle ear pathologies and ossicular chain problems. In this research, the influence of ear surgeries on bone conduction was evaluated. Materials and Methods: This study was conducted as a clinical trial. Ear surgery was performed on 83 patients classified into four categories: stapedectomy, tympanomastoid surgery, and partial or total ossicular reconstruction with a Partial Ossicular Replacement Prosthesis (PORP) or a Total Ossicular Replacement Prosthesis (TORP). Bone conduction thresholds were assessed at frequencies of 250, 500, 1000, 2000 and 4000 Hz before and after surgery. Results: In the stapedectomy group, the average BC threshold improved at all frequencies, by approximately 6 dB at 2000 Hz. In the tympanomastoid group, the BC threshold at 500, 1000 and 2000 Hz changed by 4 dB (P-value < 0.05). Moreover, in the PORP group, a 5 dB enhancement was seen at 1000 and 2000 Hz. In the TORP group, the results confirmed that the BC threshold improved at all frequencies, especially at 4000 Hz, by about 6.5 dB. Conclusion: According to the results of this study, a BC threshold shift was seen after several ear surgeries such as stapedectomy, tympanoplasty, PORP and TORP. The average BC improvement was approximately 5 dB. It must be considered that BC depression might occur because of ossicular chain problems; therefore, by resolving middle ear pathologies, a better BC threshold is obtained and fewer hearing problems are faced.

  6. Applied quantum cryptography

    International Nuclear Information System (INIS)

    Kollmitzer, Christian; Pivk, Mario

    2010-01-01

    Using the quantum properties of single photons to exchange binary keys between two partners for subsequent encryption of secret data is an absolutely novel technology. Only a few years ago quantum cryptography - or better: quantum key distribution - was the domain of basic research laboratories at universities. But during the last few years things changed. QKD left the laboratories and was picked up by more practically oriented teams that worked hard to develop a practically applicable technology out of the astonishing results of basic research. One major milestone towards a QKD technology was a large research and development project funded by the European Commission that aimed at combining quantum physics with complementary technologies that are necessary to create a technical solution: electronics, software, and network components were added within the project SECOQC (Development of a Global Network for Secure Communication based on Quantum Cryptography) that teamed up all expertise at the European level to get a technology for future encryption. The practical application of QKD in a standard optical fibre network was demonstrated in October 2008 in Vienna, giving a glimpse of the future of secure communication. Although many steps still have to be taken in order to achieve a truly mature technology, the cornerstone for future secure communication is already laid. QKD will not be the Holy Grail of security; it will not be able to solve all problems for evermore. But QKD has the potential to replace one of the weakest parts of symmetric encryption: the exchange of the key. It can be proven that the key exchange process cannot be corrupted and that keys that are generated and exchanged quantum cryptographically will be secure forever (as long as some additional conditions are kept). This book will show the state of the art of Quantum Cryptography and it will sketch how it can be implemented in standard communication infrastructure. The growing vulnerability of sensitive

  7. Focus on Quantum Cryptography

    International Nuclear Information System (INIS)

    Kwiat, Paul G.

    2002-01-01

    Full text: In our modern era of telecommunications and the Internet, information has become a valuable commodity. Sometimes it must therefore be protected against theft - in this case, loss of secret information to an eavesdropper. Most of today's transactions are protected using encryption unproven to be secure against a computational attack by a classical computer and, in fact, the standardly used encryption algorithms are provably vulnerable to the mind-boggling parallelism of a quantum computer, should one ever be physically realized. Enter quantum cryptography. Underlying nearly all forms of encryption is the necessity for a truly secret key, a random string of zeros and ones; the basic notion of quantum cryptography is to employ single photon transmissions (or the closest attainable approximation to these) to distribute the random key material, while removing the threat of an undetected eavesdropper. Now, nearly twenty years since the seminal quantum cryptography paper by Bennett and Brassard (Bennett C H and Brassard G 1984 Proc. IEEE Int. Conf. on Computers, Systems, and Signal Processing (Bangalore) (New York: IEEE) pp 175-9), we take a look at several state-of-the-art implementations, and glimpse how future quantum cryptosystems might look. We start with papers from three of the world's leading experimental quantum cryptography efforts: Stucki et al and Bethune and Risk describe working systems for quantum key distribution (QKD) over telecommunications fibres (at 1550 nanometres and 1300 nanometres, respectively). The former's achievement of quantum key exchange over 67 kilometres of optical fibre is a world record, as is the experimental demonstration by Hughes et al of daylight free-space QKD over a 10 km atmospheric range. Next, Luetkenhaus and Jahma explore the possible vulnerabilities of such systems (which employ attenuated laser pulses instead of actual single photon states) to conceivable future eavesdropping technologies. Enzer et al have

  8. Application of Improved Wavelet Thresholding Function in Image Denoising Processing

    Directory of Open Access Journals (Sweden)

    Hong Qi Zhang

    2014-07-01

    Wavelet analysis is a time-frequency analysis method that handles time-frequency localization well. This paper analyzes the basic principles of the wavelet transform and the relationship between the Lipschitz exponent of a signal singularity and the local maxima of the modulus of the wavelet transform coefficients. The principles of wavelet-transform image denoising are analyzed and the disadvantages of the traditional wavelet thresholding functions are studied. The thresholding function is improved to address the discontinuity of the hard threshold and the constant bias of the soft threshold, and images are denoised using the improved threshold function.
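
    The classical hard and soft thresholding functions that such improved functions interpolate between can be written compactly as below (Python; the paper's own improved function is not reproduced here).

        # Classical hard and soft wavelet thresholding of detail coefficients
        # (the cited paper's improved function is a smooth compromise between the two;
        # its exact form is not reproduced here).
        import numpy as np

        def hard_threshold(w, t):
            return np.where(np.abs(w) > t, w, 0.0)

        def soft_threshold(w, t):
            return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

        w = np.array([-2.5, -0.3, 0.1, 0.8, 3.0])
        t = 0.5
        print(hard_threshold(w, t))   # keeps large coefficients unchanged (discontinuous at +/- t)
        print(soft_threshold(w, t))   # shrinks all surviving coefficients by t (constant bias)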

  9. Conditional efficient multiuser quantum cryptography network

    International Nuclear Information System (INIS)

    Xue Peng; Li Chuanfeng; Guo Guangcan

    2002-01-01

    We propose a conditional quantum key distribution scheme with three nonorthogonal states. Combined with the idea presented by Lo et al. (H.-K. Lo, H. F. Chau, and M. Ardehali, e-print arXiv: quant-ph/0011056), the efficiency of this scheme is increased to approach 100%. Also, such a refined data analysis guarantees the security of our scheme against the most general eavesdropping strategy. Then, based on the scheme, we present a quantum cryptography network with the addition of a device called a "space optical switch." Moreover, we present a realization of a quantum random number generator. Thus, a feasible experimental scheme for this efficient quantum cryptography network is given in full.

  10. Application of AVK and selective encryption in improving performance of quantum cryptography and networks

    International Nuclear Information System (INIS)

    Bhunia, C.T.

    2006-07-01

    The subject of quantum cryptography has emerged as an important area of research. Reported theoretical and practical investigations have conclusively established reliable quantum key distribution (QKD) protocols with a higher level of security. For perfect security, the implementation of a time-variant key is essential. The nature of the cost and operation involved in using quantum key distribution to distribute a time-variant key from session to session or message to message has yet to be addressed from an implementation angle, and it is understood to be hard with currently available technology. Besides, the disadvantages of quantum cryptanalysis, in the form of 'quantum cheating' and quantum errors, have been demonstrated in the literature. This calls for an investigation of an affordable hybrid solution that uses QKD with conventional classical methods of key distribution to implement a time-variant key. The paper proposes a hybrid solution towards this investigation. The solutions suggested will improve the performance of computer networks for secure transport of data in general. (author)

  11. The 'golden' matrices and a new kind of cryptography

    International Nuclear Information System (INIS)

    Stakhov, A.P.

    2007-01-01

    We consider a new class of square matrices called the 'golden' matrices. They are a generalization of the classical Fibonacci Q-matrix for continuous domain. The 'golden' matrices can be used for creation of a new kind of cryptography called the 'golden' cryptography. The method is very fast and simple for technical realization and can be used for cryptographic protection of digital signals (telecommunication and measurement systems)
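
    As orientation, the classical Fibonacci Q-matrix coding that the 'golden' matrices generalize can be sketched as follows (Python; integer case only, the paper's continuous 'golden' generalization is not reproduced).

        # Classical Fibonacci Q-matrix coding, which the 'golden' matrices generalize
        # to a continuous parameter (this sketch stays with the integer case).
        import numpy as np

        Q = np.array([[1, 1],
                      [1, 0]], dtype=np.int64)

        def encode(block, n):
            return block @ np.linalg.matrix_power(Q, n)

        def decode(cipher, n):
            Qn = np.linalg.matrix_power(Q, n)
            # det(Q^n) = (-1)^n, so the inverse is integer-valued
            inv = np.round(np.linalg.inv(Qn)).astype(np.int64)
            return cipher @ inv

        M = np.array([[5, 3],
                      [2, 7]], dtype=np.int64)
        C = encode(M, 8)
        assert np.array_equal(decode(C, 8), M)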

  12. Learning Perfectly Secure Cryptography to Protect Communications with Adversarial Neural Cryptography

    Directory of Open Access Journals (Sweden)

    Murilo Coutinho

    2018-04-01

    Research in Artificial Intelligence (AI) has achieved many important breakthroughs, especially in recent years. In some cases, AI learns alone from scratch and performs human tasks faster and better than humans. With the recent advances in AI, it is natural to wonder whether Artificial Neural Networks will be used to successfully create or break cryptographic algorithms. A bibliographic review shows that the main approaches to this problem have relied on complex Neural Networks, but without understanding or proving the security of the generated model. This paper presents an analysis of the security of cryptographic algorithms generated by a new technique called Adversarial Neural Cryptography (ANC). Using the proposed network, we show the limitations of the current approach of ANC and directions to improve it. Training the proposed Artificial Neural Network with the improved model of ANC, we show that artificially intelligent agents can learn the unbreakable One-Time Pad (OTP) algorithm, without human knowledge, to communicate securely through an insecure communication channel. This paper shows in which conditions an AI agent can learn a secure encryption scheme. However, it also shows that, without a stronger adversary, it is more likely to obtain an insecure one.

  13. Learning Perfectly Secure Cryptography to Protect Communications with Adversarial Neural Cryptography.

    Science.gov (United States)

    Coutinho, Murilo; de Oliveira Albuquerque, Robson; Borges, Fábio; García Villalba, Luis Javier; Kim, Tai-Hoon

    2018-04-24

    Research in Artificial Intelligence (AI) has achieved many important breakthroughs, especially in recent years. In some cases, AI learns alone from scratch and performs human tasks faster and better than humans. With the recent advances in AI, it is natural to wonder whether Artificial Neural Networks will be used to successfully create or break cryptographic algorithms. A bibliographic review shows that the main approaches to this problem have relied on complex Neural Networks, but without understanding or proving the security of the generated model. This paper presents an analysis of the security of cryptographic algorithms generated by a new technique called Adversarial Neural Cryptography (ANC). Using the proposed network, we show the limitations of the current approach of ANC and directions to improve it. Training the proposed Artificial Neural Network with the improved model of ANC, we show that artificially intelligent agents can learn the unbreakable One-Time Pad (OTP) algorithm, without human knowledge, to communicate securely through an insecure communication channel. This paper shows in which conditions an AI agent can learn a secure encryption scheme. However, it also shows that, without a stronger adversary, it is more likely to obtain an insecure one.
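
    The one-time pad that the agents learn is itself a one-line construction; a direct reference implementation is sketched below (Python) for comparison with the learned scheme.

        # The one-time pad is just XOR with a truly random, single-use key
        # of the same length as the message.
        import secrets

        message = b"attack at dawn"
        key = secrets.token_bytes(len(message))          # must never be reused
        ciphertext = bytes(m ^ k for m, k in zip(message, key))
        recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
        assert recovered == message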

  14. Securing information display by use of visual cryptography.

    Science.gov (United States)

    Yamamoto, Hirotsugu; Hayasaki, Yoshio; Nishida, Nobuo

    2003-09-01

    We propose a secure display technique based on visual cryptography. The proposed technique ensures the security of visual information. The display employs a decoding mask based on visual cryptography. Without the decoding mask, the displayed information cannot be viewed. The viewing zone is limited by the decoding mask so that only one person can view the information. We have developed a set of encryption codes to maintain the designed viewing zone and have demonstrated a display that provides a limited viewing zone.

  15. Quantum discord as a resource for quantum cryptography.

    Science.gov (United States)

    Pirandola, Stefano

    2014-11-07

    Quantum discord is the minimal bipartite resource which is needed for a secure quantum key distribution, being a cryptographic primitive equivalent to non-orthogonality. Its role becomes crucial in device-dependent quantum cryptography, where the presence of preparation and detection noise (inaccessible to all parties) may be so strong to prevent the distribution and distillation of entanglement. The necessity of entanglement is re-affirmed in the stronger scenario of device-independent quantum cryptography, where all sources of noise are ascribed to the eavesdropper.

  16. Improved Bat Algorithm Applied to Multilevel Image Thresholding

    Directory of Open Access Journals (Sweden)

    Adis Alihodzic

    2014-01-01

    Multilevel image thresholding is a very important image processing technique that is used as a basis for image segmentation and further higher-level processing. However, the required computational time for exhaustive search grows exponentially with the number of desired thresholds. Swarm intelligence metaheuristics are well known as successful and efficient optimization methods for intractable problems. In this paper, we adjusted one of the latest swarm intelligence algorithms, the bat algorithm, for the multilevel image thresholding problem. The results of testing on standard benchmark images show that the bat algorithm is comparable with other state-of-the-art algorithms. We improved the standard bat algorithm with modifications that add some elements from differential evolution and from the artificial bee colony algorithm. Our proposed improved bat algorithm proved to be better than five other state-of-the-art algorithms, improving the quality of results in all cases and significantly improving convergence speed.

  17. Two-out-of-two color matching based visual cryptography schemes.

    Science.gov (United States)

    Machizaud, Jacques; Fournel, Thierry

    2012-09-24

    Visual cryptography which consists in sharing a secret message between transparencies has been extended to color prints. In this paper, we propose a new visual cryptography scheme based on color matching. The stacked printed media reveal a uniformly colored message decoded by the human visual system. In contrast with the previous color visual cryptography schemes, the proposed one enables to share images without pixel expansion and to detect a forgery as the color of the message is kept secret. In order to correctly print the colors on the media and to increase the security of the scheme, we use spectral models developed for color reproduction describing printed colors from an optical point of view.

  18. Cryptography in the Cloud Computing: the Current State and Logical Tasks

    OpenAIRE

    Sergey Nikolaevich Kyazhin; Andrey Vladimirovich Moiseev

    2013-01-01

    The current state of the cloud computing (CC) information security is analysed and logical problems of storage and data transmission security at CC are allocated. Cryptographic methods of data security in CC, in particular, lightweight cryptography and the cryptography based on bilinear pairings are described.

  19. Opportunities in white-box cryptography

    NARCIS (Netherlands)

    Michiels, W.

    White-box cryptography is the discipline of implementing a cryptographic algorithm in software such that an adversary will have difficulty extracting the cryptographic key. This approach assumes that the adversary has full access to and full control over the implementation's execution. White-box

  20. An Anti-Cheating Visual Cryptography Scheme Based on Chaotic Encryption System

    Science.gov (United States)

    Han, Yanyan; Xu, Zhuolin; Ge, Xiaonan; He, Wencai

    By using a chaotic encryption system and introducing a trusted third party (TTP), an anti-cheating visual cryptography scheme (VCS) is proposed in this paper. The scheme solves the problem of dishonest participants and improves the security of the chaotic encryption system. Simulation results and analysis show that the recovered image is acceptable and that the system can detect cheating by participants effectively and with high security.

  1. Genetic attack on neural cryptography.

    Science.gov (United States)

    Ruttor, Andreas; Kinzel, Wolfgang; Naeh, Rivka; Kanter, Ido

    2006-03-01

    Different scaling properties for the complexity of bidirectional synchronization and unidirectional learning are essential for the security of neural cryptography. Incrementing the synaptic depth of the networks increases the synchronization time only polynomially, but the success of the geometric attack is reduced exponentially and it clearly fails in the limit of infinite synaptic depth. This method is improved by adding a genetic algorithm, which selects the fittest neural networks. The probability of a successful genetic attack is calculated for different model parameters using numerical simulations. The results show that scaling laws observed in the case of other attacks hold for the improved algorithm, too. The number of networks needed for an effective attack grows exponentially with increasing synaptic depth. In addition, finite-size effects caused by Hebbian and anti-Hebbian learning are analyzed. These learning rules converge to the random walk rule if the synaptic depth is small compared to the square root of the system size.

  2. Genetic attack on neural cryptography

    International Nuclear Information System (INIS)

    Ruttor, Andreas; Kinzel, Wolfgang; Naeh, Rivka; Kanter, Ido

    2006-01-01

    Different scaling properties for the complexity of bidirectional synchronization and unidirectional learning are essential for the security of neural cryptography. Incrementing the synaptic depth of the networks increases the synchronization time only polynomially, but the success of the geometric attack is reduced exponentially and it clearly fails in the limit of infinite synaptic depth. This method is improved by adding a genetic algorithm, which selects the fittest neural networks. The probability of a successful genetic attack is calculated for different model parameters using numerical simulations. The results show that scaling laws observed in the case of other attacks hold for the improved algorithm, too. The number of networks needed for an effective attack grows exponentially with increasing synaptic depth. In addition, finite-size effects caused by Hebbian and anti-Hebbian learning are analyzed. These learning rules converge to the random walk rule if the synaptic depth is small compared to the square root of the system size

  3. Genetic attack on neural cryptography

    Science.gov (United States)

    Ruttor, Andreas; Kinzel, Wolfgang; Naeh, Rivka; Kanter, Ido

    2006-03-01

    Different scaling properties for the complexity of bidirectional synchronization and unidirectional learning are essential for the security of neural cryptography. Incrementing the synaptic depth of the networks increases the synchronization time only polynomially, but the success of the geometric attack is reduced exponentially and it clearly fails in the limit of infinite synaptic depth. This method is improved by adding a genetic algorithm, which selects the fittest neural networks. The probability of a successful genetic attack is calculated for different model parameters using numerical simulations. The results show that scaling laws observed in the case of other attacks hold for the improved algorithm, too. The number of networks needed for an effective attack grows exponentially with increasing synaptic depth. In addition, finite-size effects caused by Hebbian and anti-Hebbian learning are analyzed. These learning rules converge to the random walk rule if the synaptic depth is small compared to the square root of the system size.

  4. Quantum cryptography: The power of independence

    Science.gov (United States)

    Ekert, Artur

    2018-02-01

    Device-independent quantum cryptography promises unprecedented security, but it is regarded as a theorist's dream and an experimentalist's nightmare. A new mathematical tool has now pushed its experimental demonstration much closer to reality.

  5. Cryptography in the Cloud Computing: the Current State and Logical Tasks

    Directory of Open Access Journals (Sweden)

    Sergey Nikolaevich Kyazhin

    2013-09-01

    The current state of the cloud computing (CC) information security is analysed and logical problems of storage and data transmission security at CC are allocated. Cryptographic methods of data security in CC, in particular, lightweight cryptography and the cryptography based on bilinear pairings are described.

  6. Asymmetric cryptography based on wavefront sensing.

    Science.gov (United States)

    Peng, Xiang; Wei, Hengzheng; Zhang, Peng

    2006-12-15

    A system of asymmetric cryptography based on wavefront sensing (ACWS) is proposed for the first time to our knowledge. One of the most significant features of the asymmetric cryptography is that a trapdoor one-way function is required and constructed by analogy to wavefront sensing, in which the public key may be derived from optical parameters, such as the wavelength or the focal length, while the private key may be obtained from a kind of regular point array. The ciphertext is generated by the encoded wavefront and represented with an irregular array. In such an ACWS system, the encryption key is not identical to the decryption key, which is another important feature of an asymmetric cryptographic system. The processes of asymmetric encryption and decryption are formulized mathematically and demonstrated with a set of numerical experiments.

  7. Cryptography and computational number theory

    CERN Document Server

    Shparlinski, Igor; Wang, Huaxiong; Xing, Chaoping; Workshop on Cryptography and Computational Number Theory, CCNT'99

    2001-01-01

    This volume contains the refereed proceedings of the Workshop on Cryptography and Computational Number Theory, CCNT'99, which was held in Singapore during the week of November 22-26, 1999. The workshop was organized by the Centre for Systems Security of the National University of Singapore. We gratefully acknowledge the financial support from the Singapore National Science and Technology Board under the grant number RP960668/M. The idea for this workshop grew out of the recognition of the recent, rapid development in various areas of cryptography and computational number theory. The event followed the concept of the research programs at such well-known research institutions as the Newton Institute (UK), Oberwolfach and Dagstuhl (Germany), and Luminy (France). Accordingly, there were only invited lectures at the workshop with plenty of time for informal discussions. It was hoped and successfully achieved that the meeting would encourage and stimulate further research in information and computer s...

  8. Device-independent two-party cryptography secure against sequential attacks

    DEFF Research Database (Denmark)

    Kaniewski, Jedrzej; Wehner, Stephanie

    2016-01-01

    The goal of two-party cryptography is to enable two parties, Alice and Bob, to solve common tasks without the need for mutual trust. Examples of such tasks are private access to a database, and secure identification. Quantum communication enables security for all of these problems in the noisy......-storage model by sending more signals than the adversary can store in a certain time frame. Here, we initiate the study of device-independent (DI) protocols for two-party cryptography in the noisy-storage model. Specifically, we present a relatively easy to implement protocol for a cryptographic building block...... known as weak string erasure and prove its security even if the devices used in the protocol are prepared by the dishonest party. DI two-party cryptography is made challenging by the fact that Alice and Bob do not trust each other, which requires new techniques to establish security. We fully analyse...

  9. Improved Image Encryption for Real-Time Application over Wireless Communication Networks using Hybrid Cryptography Technique

    Directory of Open Access Journals (Sweden)

    Kazeem B. Adedeji

    2016-12-01

    Advances in communication networks have enabled organizations to send confidential data such as digital images over wireless networks. However, the broadcast nature of the wireless communication channel has made it vulnerable to attack from eavesdroppers. We have developed a hybrid cryptography technique, and we present its application to digital images as a means of improving the security of digital images for transmission over wireless communication networks. The hybrid technique uses a combination of a symmetric (Data Encryption Standard) and an asymmetric (Rivest Shamir Adleman) cryptographic algorithm to secure data to be transmitted between different nodes of a wireless network. Three different image samples of type jpeg, png and jpg were tested using this technique. The results obtained showed that the hybrid system encrypts the images with minimal simulation time and high throughput. More importantly, there is no relation or information between the original images and their encrypted form, according to Shannon's definition of perfect security, thereby making the system much more secure.
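
    The general shape of such a hybrid scheme, a symmetric session key for the bulk image data plus an asymmetric wrap of that key, is sketched below (Python; toy stand-ins are used for both primitives, so this illustrates the pattern rather than the authors' DES/RSA implementation).

        # Shape of the hybrid scheme: a random session key encrypts the bulk image
        # data with a symmetric cipher, and the session key itself is encrypted with
        # the receiver's public key. Toy stand-ins are used for both primitives here
        # (a hash-based XOR keystream instead of DES, small-prime RSA instead of real RSA).
        import hashlib, secrets

        def keystream(key: bytes, length: int) -> bytes:
            out, counter = b"", 0
            while len(out) < length:
                out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
                counter += 1
            return out[:length]

        def sym_encrypt(key, data):                 # stands in for DES; XOR is its own inverse
            return bytes(d ^ k for d, k in zip(data, keystream(key, len(data))))

        # Toy RSA key pair (stands in for a real RSA key).
        p, q, e = 61, 53, 17
        n = p * q
        d = pow(e, -1, (p - 1) * (q - 1))

        image = b"...raw image bytes..."
        session_key = secrets.token_bytes(16)
        encrypted_image = sym_encrypt(session_key, image)
        encrypted_key = [pow(b, e, n) for b in session_key]        # wrap key byte-by-byte (toy!)

        # Receiver side: unwrap the session key, then decrypt the image.
        recovered_key = bytes(pow(c, d, n) for c in encrypted_key)
        assert sym_encrypt(recovered_key, encrypted_image) == image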

  10. Image communication scheme based on dynamic visual cryptography and computer generated holography

    Science.gov (United States)

    Palevicius, Paulius; Ragulskis, Minvydas

    2015-01-01

    Computer generated holograms are often exploited to implement optical encryption schemes. This paper proposes the integration of dynamic visual cryptography (an optical technique based on the interplay of visual cryptography and time-averaging geometric moiré) with Gerchberg-Saxton algorithm. A stochastic moiré grating is used to embed the secret into a single cover image. The secret can be visually decoded by a naked eye if only the amplitude of harmonic oscillations corresponds to an accurately preselected value. The proposed visual image encryption scheme is based on computer generated holography, optical time-averaging moiré and principles of dynamic visual cryptography. Dynamic visual cryptography is used both for the initial encryption of the secret image and for the final decryption. Phase data of the encrypted image are computed by using Gerchberg-Saxton algorithm. The optical image is decrypted using the computationally reconstructed field of amplitudes.
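
    A generic Gerchberg-Saxton phase-retrieval loop, the computer-generated-holography ingredient of the scheme above, is sketched below (Python with numpy; the target image and iteration count are illustrative, and the moiré and visual-cryptography layers of the actual scheme are not included).

        # Generic Gerchberg-Saxton iteration: find a phase-only hologram whose
        # far-field intensity approximates a target image (illustrative only).
        import numpy as np

        def gerchberg_saxton(target_amplitude, iterations=50):
            phase = np.exp(1j * 2 * np.pi * np.random.rand(*target_amplitude.shape))
            field = target_amplitude * phase
            for _ in range(iterations):
                hologram = np.fft.ifft2(field)
                hologram = np.exp(1j * np.angle(hologram))                # keep phase only
                field = np.fft.fft2(hologram)
                field = target_amplitude * np.exp(1j * np.angle(field))   # enforce target amplitude
            return np.angle(hologram)

        target = np.zeros((64, 64))
        target[24:40, 24:40] = 1.0             # simple square as the secret image
        phase_keys = gerchberg_saxton(target)
        reconstruction = np.abs(np.fft.fft2(np.exp(1j * phase_keys)))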

  11. Proposal for founding mistrustful quantum cryptography on coin tossing

    International Nuclear Information System (INIS)

    Kent, Adrian

    2003-01-01

    A significant branch of classical cryptography deals with the problems which arise when mistrustful parties need to generate, process, or exchange information. As Kilian showed a while ago, mistrustful classical cryptography can be founded on a single protocol, oblivious transfer, from which general secure multiparty computations can be built. The scope of mistrustful quantum cryptography is limited by no-go theorems, which rule out, inter alia, unconditionally secure quantum protocols for oblivious transfer or general secure two-party computations. These theorems apply even to protocols which take relativistic signaling constraints into account. The best that can be hoped for, in general, are quantum protocols which are computationally secure against quantum attack. Here a method is described for building a classically certified bit commitment, and hence every other mistrustful cryptographic task, from a secure coin-tossing protocol. No security proof is attempted, but reasons are sketched why these protocols might resist quantum computational attack

  12. Number Theory and Public-Key Cryptography.

    Science.gov (United States)

    Lefton, Phyllis

    1991-01-01

    Described are activities in the study of techniques used to conceal the meanings of messages and data. Some background information and two BASIC programs that illustrate the algorithms used in a new cryptographic system called "public-key cryptography" are included. (CW)

  13. Relativistic quantum cryptography

    Science.gov (United States)

    Kaniewski, Jedrzej

    Special relativity states that information cannot travel faster than the speed of light, which means that communication between agents occupying distinct locations incurs some minimal delay. Alternatively, we can see it as temporary communication constraints between distinct agents and such constraints turn out to be useful for cryptographic purposes. In relativistic cryptography we consider protocols in which interactions occur at distinct locations at well-defined times and we investigate why such a setting allows to implement primitives which would not be possible otherwise. (Abstract shortened by UMI.).

  14. Code-Based Cryptography: New Security Solutions Against a Quantum Adversary

    OpenAIRE

    Sendrier , Nicolas; Tillich , Jean-Pierre

    2016-01-01

    Cryptography is one of the key tools for providing security in our quickly evolving technological society. An adversary with the ability to use a quantum computer would defeat most of the cryptographic solutions that are deployed today to secure our communications. We do not know when quantum computing will become available, but nevertheless, the cryptographic research community must get ready for it now. Code-based cryptography is among the few cryptographic technique...

  15. Implementation of multiplexing in a subcarrier-wave quantum cryptography system

    International Nuclear Information System (INIS)

    Chistyakov, V V; Gleim, A V; Egorov, V I; Nazarov, Yu V

    2014-01-01

    Quantum cryptography allows distributing secure keys in a way that any eavesdropping in the channel is inevitably detected. This work is dedicated to introducing wavelength division multiplexing in a subcarrier-wave quantum cryptography system. Compared to other existing schemes, the resulting device is able to achieve higher bitrates (up to 2.26 Mbit/s at 20 km), is robust against external conditions and compatible with standard telecommunication fibres in multi-user environment

  16. Cryptography and the Internet: lessons and challenges

    Energy Technology Data Exchange (ETDEWEB)

    McCurley, K.S.

    1996-12-31

    The popularization of the Internet has brought fundamental changes to the world, because it allows a universal method of communication between computers. This carries enormous benefits with it, but also raises many security considerations. Cryptography is a fundamental technology used to provide security of computer networks, and there is currently a widespread engineering effort to incorporate cryptography into various aspects of the Internet. The system-level engineering required to provide security services for the Internet carries some important lessons for researchers whose study is focused on narrowly defined problems. It also offers challenges to the cryptographic research community by raising new questions not adequately addressed by the existing body of knowledge. This paper attempts to summarize some of these lessons and challenges for the cryptographic research community.

  17. Cryptography as a Pedagogical Tool

    Science.gov (United States)

    Kaur, Manmohan

    2008-01-01

    In order to get undergraduates interested in mathematics, it is necessary to motivate them, give them good reasons to spend time on a subject that requires hard work, and, if possible, involve them in undergraduate research. This article discusses how cryptography can be used for all these purposes. In particular, a special topics course on…

  18. Quantum cryptography beyond quantum key distribution

    NARCIS (Netherlands)

    Broadbent, A.; Schaffner, C.

    2016-01-01

    Quantum cryptography is the art and science of exploiting quantum mechanical effects in order to perform cryptographic tasks. While the most well-known example of this discipline is quantum key distribution (QKD), there exist many other applications such as quantum money, randomness generation,

  19. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption-essentially the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real world uses, how codes can be broken, and the use of technology in this oft-overlooked field.

  20. Electronic Voting Protocol Using Identity-Based Cryptography

    Directory of Open Access Journals (Sweden)

    Gina Gallegos-Garcia

    2015-01-01

    Full Text Available Electronic voting protocols proposed to date meet their properties based on Public Key Cryptography (PKC), which offers high flexibility through key agreement protocols and authentication mechanisms. However, when PKC is used, it is necessary to implement a Certification Authority (CA) to provide certificates which bind public keys to entities and enable verification of such public key bindings. Consequently, the components of the protocol increase notably. An alternative is to use Identity-Based Encryption (IBE). With this kind of cryptography, it is possible to have all the benefits offered by PKC, without the need for certificates or for all the core components of a Public Key Infrastructure (PKI). Considering the aforementioned, in this paper we propose an electronic voting protocol, which meets the privacy and robustness properties by using bilinear maps.

  1. Electronic Voting Protocol Using Identity-Based Cryptography.

    Science.gov (United States)

    Gallegos-Garcia, Gina; Tapia-Recillas, Horacio

    2015-01-01

    Electronic voting protocols proposed to date meet their properties based on Public Key Cryptography (PKC), which offers high flexibility through key agreement protocols and authentication mechanisms. However, when PKC is used, it is necessary to implement a Certification Authority (CA) to provide certificates which bind public keys to entities and enable verification of such public key bindings. Consequently, the components of the protocol increase notably. An alternative is to use Identity-Based Encryption (IBE). With this kind of cryptography, it is possible to have all the benefits offered by PKC, without the need for certificates or for all the core components of a Public Key Infrastructure (PKI). Considering the aforementioned, in this paper we propose an electronic voting protocol, which meets the privacy and robustness properties by using bilinear maps.

  2. Architecture for the Secret-Key BC3 Cryptography Algorithm

    Directory of Open Access Journals (Sweden)

    Arif Sasongko

    2011-08-01

    Full Text Available Cryptography is a very important aspect of data security. The focus of research in this field is shifting from merely the security aspect to considering the implementation aspect as well. This paper aims to introduce the BC3 algorithm with a focus on its hardware implementation, and proposes an architecture for the hardware implementation of this algorithm. BC3 is a secret-key cryptography algorithm developed with two considerations: robustness and implementation efficiency. The algorithm has been implemented in software and has good performance compared to the AES algorithm. BC3 is an improvement of the BC2 and AE cryptographic algorithms and is expected to have the same level of robustness and to gain competitive advantages in the implementation aspect. The development of the architecture gives much attention to (1) resource sharing and (2) having a single clock for each round, and exploits the regularity of the algorithm. This architecture is then implemented on an FPGA. The implementation requires three times less area than AES, but is about five times faster. Furthermore, this BC3 hardware implementation has better performance than the BC3 software in both the key expansion stage and the randomizing stage. In the future, the security of this implementation must be reviewed, especially against side-channel attacks.

  3. Privacy-Enhancing Auctions Using Rational Cryptography

    DEFF Research Database (Denmark)

    Miltersen, Peter Bro; Nielsen, Jesper Buus; Triandopoulos, Nikolaos

    2009-01-01

    show how to use rational cryptography to approximately implement any given ex interim individually strictly rational equilibrium of such an auction without a trusted mediator through a cryptographic protocol that uses only point-to-point authenticated channels between the players. By “ex interim...

  4. Power-Split Hybrid Electric Vehicle Energy Management Based on Improved Logic Threshold Approach

    Directory of Open Access Journals (Sweden)

    Zhumu Fu

    2013-01-01

    Full Text Available We design an improved logic threshold approach to energy management for a power-split HEV assisted by an integrated starter generator (ISG). By combining the efficiency map and the optimum torque curve of the internal combustion engine (ICE) with the state of charge (SOC) of the batteries, the improved logic threshold controller first manages the ICE within its peak efficiency region. Then the electrical power demand is established based on the ICE energy output. On that premise, a variable logic threshold value K is defined to achieve the power distribution between the ISG and the electric motor/generator (EMG). Finally, simulation models for the power-split HEV with the improved logic threshold controller are established in ADVISOR. Compared to the same power-split HEV with the conventional logic threshold controller, the improved logic threshold controller improves the battery power consumption, the ICE efficiency, the fuel consumption, and the motor driving system efficiency.
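    As an illustration of the control idea summarized above (not taken from the paper itself), the minimal rule-based Python sketch below splits a power request between the ICE, the ISG and the EMG; the efficiency band, SOC limits and threshold value K are hypothetical placeholders.

    ```python
    # Illustrative rule-based power split with a variable logic threshold K.
    # All numerical values (efficiency band, SOC limits, K) are hypothetical.

    def split_power(p_demand_kw, soc, ice_peak_band=(20.0, 60.0),
                    soc_low=0.3, soc_high=0.8):
        """Return (p_ice, p_isg, p_emg) in kW for a requested traction power."""
        lo, hi = ice_peak_band                    # keep the ICE in its peak-efficiency band
        p_ice = min(max(p_demand_kw, lo), hi)     # clamp ICE output to that band
        if soc < soc_low:                         # low battery: run the ICE high to recharge
            p_ice = hi
        p_elec = p_demand_kw - p_ice              # electrical demand (negative = charging)
        # Variable threshold K decides how the electrical demand is shared
        # between the ISG and the electric motor/generator (EMG).
        k = 0.5 if soc_low <= soc <= soc_high else (0.2 if soc < soc_low else 0.7)
        p_isg = k * p_elec
        p_emg = (1.0 - k) * p_elec
        return p_ice, p_isg, p_emg

    print(split_power(p_demand_kw=45.0, soc=0.55))
    ```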

  5. An application of different dioids in public key cryptography

    International Nuclear Information System (INIS)

    Durcheva, Mariana I.

    2014-01-01

    Dioids provide a natural framework for analyzing a broad class of discrete event dynamical systems, such as the design and analysis of bus and railway timetables, scheduling of high-throughput industrial processes, solution of combinatorial optimization problems, and the analysis and improvement of flow systems in communication networks. They have appeared in several branches of mathematics such as functional analysis, optimization, stochastic systems and dynamic programming, tropical geometry, and fuzzy logic. In this paper we show how to involve dioids in public key cryptography. The main goal is to create key-exchange protocols based on dioids. Additionally, a digital signature scheme is presented.

  6. An application of different dioids in public key cryptography

    Energy Technology Data Exchange (ETDEWEB)

    Durcheva, Mariana I., E-mail: mdurcheva66@gmail.com [Technical University of Sofia, Faculty of Applied Mathematics and Informatics, 8 Kliment Ohridski St., Sofia 1000 (Bulgaria)

    2014-11-18

    Dioids provide a natural framework for analyzing a broad class of discrete event dynamical systems, such as the design and analysis of bus and railway timetables, scheduling of high-throughput industrial processes, solution of combinatorial optimization problems, and the analysis and improvement of flow systems in communication networks. They have appeared in several branches of mathematics such as functional analysis, optimization, stochastic systems and dynamic programming, tropical geometry, and fuzzy logic. In this paper we show how to involve dioids in public key cryptography. The main goal is to create key-exchange protocols based on dioids. Additionally, a digital signature scheme is presented.

  7. Generalized logistic map and its application in chaos based cryptography

    Science.gov (United States)

    Lawnik, M.

    2017-12-01

    The logistic map is commonly used in, for example, chaos-based cryptography. However, its properties do not allow a safe construction of encryption algorithms. Thus, the scope of the paper is a proposal to generalize the logistic map by means of a well-recognized family of chaotic maps. In the next step, the Lyapunov exponent and the distribution of the iterated variable are analyzed. The obtained results confirm that the analyzed model can safely and effectively replace the classic logistic map in applications involving chaotic cryptography.
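    For readers unfamiliar with the underlying construction, the following minimal Python sketch iterates the classic logistic map and estimates its Lyapunov exponent numerically; the paper's generalized map is not reproduced here and would simply replace the map function.

    ```python
    # Classic logistic map x_{n+1} = r*x_n*(1-x_n) with a numerical estimate of
    # its Lyapunov exponent (positive values indicate chaotic behaviour).
    import math

    def logistic(x, r=4.0):
        return r * x * (1.0 - x)

    def lyapunov(x0=0.3, r=4.0, n=100_000, burn_in=1_000):
        x = x0
        for _ in range(burn_in):                 # discard the transient
            x = logistic(x, r)
        acc = 0.0
        for _ in range(n):
            deriv = abs(r * (1.0 - 2.0 * x))     # |derivative of the map at x|
            acc += math.log(max(deriv, 1e-12))   # guard against log(0)
            x = logistic(x, r)
        return acc / n

    print(lyapunov())   # approximately ln(2) = 0.693 for r = 4
    ```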

  8. Multivariate Cryptography Based on Clipped Hopfield Neural Network.

    Science.gov (United States)

    Wang, Jia; Cheng, Lee-Ming; Su, Tong

    2018-02-01

    Designing secure and efficient multivariate public key cryptosystems [multivariate cryptography (MVC)] to strengthen the security of RSA and ECC in conventional and quantum computational environments continues to be a challenging research topic. In this paper, we describe multivariate public key cryptosystems based on an extended Clipped Hopfield Neural Network (CHNN) and implement them using the MVC (CHNN-MVC) framework operated in space. The Diffie-Hellman key exchange algorithm is extended into the matrix field, which illustrates the feasibility of its new applications in both classic and post-quantum cryptography. The efficiency of our proposed new public key cryptosystem CHNN-MVC is evaluated by simulation, and its underlying problem is shown to be NP-hard. The proposed algorithm strengthens multivariate public key cryptosystems and allows practical hardware realization.

  9. Is Calculus a Failure in Cryptography?

    Indian Academy of Sciences (India)

    P Vanchinathan. General Article, Resonance – Journal of Science Education, Volume 21, Issue 3, March 2016, pp. 239–245. Permanent link: https://www.ias.ac.in/article/fulltext/reso/021/03/0239-0245

  10. Number Theory and Applications : Proceedings of the International Conferences on Number Theory and Cryptography

    CERN Document Server

    Ramakrishnan, B

    2009-01-01

    This collection of articles contains the proceedings of the two international conferences (on Number Theory and Cryptography) held at the Harish-Chandra Research Institute. In recent years the interest in number theory has increased due to its applications in areas like error-correcting codes and cryptography. These proceedings contain papers in various areas of number theory, such as combinatorial, algebraic, analytic and transcendental aspects, arithmetic algebraic geometry, as well as graph theory and cryptography. While some papers do contain new results, several of the papers are expository articles that mention open questions, which will be useful to young researchers.

  11. Evolutionary Algorithms for Boolean Functions in Diverse Domains of Cryptography.

    Science.gov (United States)

    Picek, Stjepan; Carlet, Claude; Guilley, Sylvain; Miller, Julian F; Jakobovic, Domagoj

    2016-01-01

    The role of Boolean functions is prominent in several areas including cryptography, sequences, and coding theory. Therefore, various methods for the construction of Boolean functions with desired properties are of direct interest. New motivations on the role of Boolean functions in cryptography with attendant new properties have emerged over the years. There are still many combinations of design criteria left unexplored and in this matter evolutionary computation can play a distinct role. This article concentrates on two scenarios for the use of Boolean functions in cryptography. The first uses Boolean functions as the source of the nonlinearity in filter and combiner generators. Although relatively well explored using evolutionary algorithms, it still presents an interesting goal in terms of the practical sizes of Boolean functions. The second scenario appeared rather recently where the objective is to find Boolean functions that have various orders of the correlation immunity and minimal Hamming weight. In both these scenarios we see that evolutionary algorithms are able to find high-quality solutions where genetic programming performs the best.

  12. Cyber Security for Smart Grid, Cryptography, and Privacy

    Directory of Open Access Journals (Sweden)

    Swapna Iyer

    2011-01-01

    Full Text Available The invention of the “smart grid” promises to improve the efficiency and reliability of the power system. As the smart grid is turning out to be one of the most promising technologies, its security concerns are becoming more crucial. The grid is susceptible to different types of attacks. This paper will focus on these threats and risks, especially those relating to cyber security. Cyber security is a vital topic, since the smart grid uses a high level of computation, much like IT systems. We will also see cryptography and key management techniques that are required to overcome these attacks. Privacy of consumers is another important security concern that this paper will deal with.

  13. AUDIO CRYPTANALYSIS- AN APPLICATION OF SYMMETRIC KEY CRYPTOGRAPHY AND AUDIO STEGANOGRAPHY

    Directory of Open Access Journals (Sweden)

    Smita Paira

    2016-09-01

    Full Text Available With recent trends in networking and technology, “Cryptography” and “Steganography” have emerged as essential elements of providing network security. Although Cryptography plays a major role in the fabrication and modification of a secret message into an encrypted version, it has certain drawbacks. Steganography is the art that addresses one of the basic limitations of Cryptography. In this paper, a new algorithm has been proposed based on both Symmetric Key Cryptography and Audio Steganography. The combination of a randomly generated Symmetric Key with the LSB technique of Audio Steganography sends a secret message, unrecognizable, through an insecure medium. The generated Stego File is almost lossless, giving 100 percent recovery of the original message. This paper also presents a detailed experimental analysis of the algorithm, a brief comparison with other existing algorithms, and a future scope. The experimental verification and security issues are promising.
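    A minimal Python sketch of the LSB embedding step is given below for illustration; it assumes PCM samples already loaded as integers and a message that has already been encrypted with the shared symmetric key, and it is not the paper's exact algorithm.

    ```python
    # Minimal LSB embedding/extraction over PCM samples held as a list of ints.
    # A real implementation would read/write WAV frames (e.g. via the wave module)
    # and encrypt `message` with the shared symmetric key before embedding.

    def embed(samples, message: bytes):
        bits = [(byte >> i) & 1 for byte in message for i in range(8)]
        if len(bits) > len(samples):
            raise ValueError("cover audio too short for message")
        stego = list(samples)
        for i, bit in enumerate(bits):
            stego[i] = (stego[i] & ~1) | bit      # overwrite the least significant bit
        return stego

    def extract(stego, n_bytes: int):
        out = bytearray()
        for b in range(n_bytes):
            byte = 0
            for i in range(8):
                byte |= (stego[b * 8 + i] & 1) << i
            out.append(byte)
        return bytes(out)

    cover = list(range(200))                      # stand-in for real PCM samples
    secret = b"key-encrypted secret"
    stego = embed(cover, secret)
    assert extract(stego, len(secret)) == secret
    ```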

  14. Principles of the new quantum cryptography protocols building

    International Nuclear Information System (INIS)

    Kurochkin, V.; Kurochkin, Yu.

    2009-01-01

    The main aim of quantum cryptography protocols is maximal secrecy under the conditions of a real experiment. This work presents the result of building a new protocol through secrecy maximization. While using some well-known approaches, this method has allowed us to achieve completely new results in quantum cryptography. The protocol elaboration proceeds from upgrading the standard BB84 protocol to building a completely new protocol with an arbitrarily large number of bases. The secrecy proofs of the elaborated protocol appear to be a natural continuation of the protocol building process. This approach reveals the possibility of reaching extremely high protocol parameters. It suits both the restrictions of contemporary technologies and the requirements for a high bit rate while remaining absolutely secret.

  15. Counterfactual quantum cryptography.

    Science.gov (United States)

    Noh, Tae-Gon

    2009-12-04

    Quantum cryptography allows one to distribute a secret key between two remote parties using the fundamental principles of quantum mechanics. The well-established paradigm for quantum key distribution relies on the actual transmission of a signal particle through a quantum channel. In this Letter, we show that the task of secret key distribution can be accomplished even though a particle carrying secret information is not in fact transmitted through the quantum channel. The proposed protocols can be implemented with current technologies and provide practical security advantages by eliminating the possibility that an eavesdropper can directly access the entire quantum system of each signal particle.

  16. Cryptography, quantum computation and trapped ions

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, Richard J.

    1998-03-01

    The significance of quantum computation for cryptography is discussed. Following a brief survey of the requirements for quantum computational hardware, an overview of the ion trap quantum computation project at Los Alamos is presented. The physical limitations to quantum computation with trapped ions are analyzed and an assessment of the computational potential of the technology is made.

  17. Optimization problem in quantum cryptography

    International Nuclear Information System (INIS)

    Brandt, Howard E

    2003-01-01

    A complete optimization was recently performed, yielding the maximum information gain by a general unitary entangling probe in the four-state protocol of quantum cryptography. A larger set of optimum probe parameters was found than was known previously from an incomplete optimization. In the present work, a detailed comparison is made between the complete and incomplete optimizations. Also, a new set of optimum probe parameters is identified for the four-state protocol

  18. Event-by-event simulation of quantum cryptography protocols

    NARCIS (Netherlands)

    Zhao, S.; Raedt, H. De

    We present a new approach to simulate quantum cryptography protocols using event-based processes. The method is validated by simulating the BB84 protocol and the Ekert protocol, both without and with the presence of an eavesdropper.

  19. Implementation Cryptography Data Encryption Standard (DES) and Triple Data Encryption Standard (3DES) Method in Communication System Based Near Field Communication (NFC)

    Science.gov (United States)

    Ratnadewi; Pramono Adhie, Roy; Hutama, Yonatan; Saleh Ahmar, A.; Setiawan, M. I.

    2018-01-01

    Cryptography is a method used to create secure communication by manipulating sent messages during the communication so that only the intended party can know their content. Some of the most commonly used cryptography methods to protect sent messages, especially in the form of text, are the DES and 3DES methods. This research explains the DES and 3DES cryptography methods and their use for securing data stored in smart cards operating in an NFC-based communication system. The research covers the way the DES and 3DES cryptography methods protect data, as well as the software engineering involved in creating an application in the C++ programming language to realize and test the performance of the DES and 3DES methods when writing encrypted data to smart cards and reading decrypted data from smart cards. The execution time of writing data to and reading data from a smart card is faster with the DES cryptography method than with 3DES.
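    As a hedged illustration of the comparison described above (the paper's own C++ application is not reproduced), the following Python sketch uses the PyCryptodome package, an assumption of this example, to time DES and 3DES ECB encryption and decryption of a padded record.

    ```python
    # Rough timing comparison of DES vs 3DES using PyCryptodome; 3DES is expected
    # to be roughly three times slower than single DES, matching the observation
    # that the 3DES read/write path on the smart card is the slower one.
    import time
    from Crypto.Cipher import DES, DES3
    from Crypto.Util.Padding import pad

    data = pad(b"smart card record: balance=100", DES.block_size)

    def timeit(factory, rounds=10_000):
        t0 = time.perf_counter()
        for _ in range(rounds):
            ct = factory().encrypt(data)      # fresh cipher objects: PyCryptodome
            factory().decrypt(ct)             # forbids mixing encrypt/decrypt calls
        return time.perf_counter() - t0

    print("DES :", timeit(lambda: DES.new(b"8bytekey", DES.MODE_ECB)), "s")
    print("3DES:", timeit(lambda: DES3.new(b"0123456789abcdef01234567",
                                           DES3.MODE_ECB)), "s")
    ```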

  20. Introduction to public-key cryptography (Chapter 1)

    NARCIS (Netherlands)

    Avanzi, R.; Lange, T.; Cohen, H.; Frey, G.

    2006-01-01

    In this chapter we introduce the basic building blocks for cryptography based on the discrete logarithm problem that will constitute the main motivation for considering the groups studied in this book. We also briefly introduce the RSA cryptosystem as for use in practice it is still an important

  1. Security proof of counterfactual quantum cryptography against general intercept-resend attacks and its vulnerability

    International Nuclear Information System (INIS)

    Zhang Sheng; Wang Jian; Tang Chao-Jing

    2012-01-01

    Counterfactual quantum cryptography, recently proposed by Noh, is featured with no transmission of signal particles. This exhibits evident security advantages, such as its immunity to the well-known photon-number-splitting attack. In this paper, the theoretical security of counterfactual quantum cryptography protocol against the general intercept-resend attacks is proved by bounding the information of an eavesdropper Eve more tightly than in Yin's proposal [Phys. Rev. A 82 042335 (2010)]. It is also shown that practical counterfactual quantum cryptography implementations may be vulnerable when equipped with imperfect apparatuses, by proving that a negative key rate can be achieved when Eve launches a time-shift attack based on imperfect detector efficiency. (general)

  2. Strengthen Cloud Computing Security with Federal Identity Management Using Hierarchical Identity-Based Cryptography

    Science.gov (United States)

    Yan, Liang; Rong, Chunming; Zhao, Gansen

    More and more companies are beginning to provide different kinds of cloud computing services for Internet users, and at the same time these services bring some security problems. Currently the majority of cloud computing systems provide digital identities for users to access their services, which brings some inconvenience for a hybrid cloud that includes multiple private clouds and/or public clouds. Today most cloud computing systems use asymmetric and traditional public key cryptography to provide data security and mutual authentication. Identity-based cryptography has some attractive characteristics that seem to fit well the requirements of cloud computing. In this paper, by adopting federated identity management together with hierarchical identity-based cryptography (HIBC), not only the key distribution but also the mutual authentication can be simplified in the cloud.

  3. Two-phase hybrid cryptography algorithm for wireless sensor networks

    Directory of Open Access Journals (Sweden)

    Rawya Rizk

    2015-12-01

    Full Text Available For achieving security in wireless sensor networks (WSNs), cryptography plays an important role. In this paper, a new security algorithm using a combination of both symmetric and asymmetric cryptographic techniques is proposed to provide high security with minimized key maintenance. It guarantees three cryptographic primitives: integrity, confidentiality and authentication. Elliptic Curve Cryptography (ECC) and the Advanced Encryption Standard (AES) are combined to provide encryption. The XOR-DUAL RSA algorithm is considered for authentication and Message Digest-5 (MD5) for integrity. The results show that the proposed hybrid algorithm gives better performance in terms of computation time, the size of the cipher text, and the energy consumption in a WSN. It is also robust against different types of attacks in the case of image encryption.
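    The sketch below illustrates the general ECC-plus-AES hybrid pattern using the pyca/cryptography package; it is an assumption-laden stand-in for the paper's specific construction (the XOR-DUAL RSA and MD5 components are not reproduced), deriving an AES-GCM key from an ECDH exchange between a sensor node and its sink.

    ```python
    # ECDH shared secret -> HKDF -> AES-GCM encryption of a sensor payload.
    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    import os

    node_key = ec.generate_private_key(ec.SECP256R1())       # sensor node key pair
    sink_key = ec.generate_private_key(ec.SECP256R1())       # base-station key pair

    shared = node_key.exchange(ec.ECDH(), sink_key.public_key())
    aes_key = HKDF(algorithm=hashes.SHA256(), length=16,
                   salt=None, info=b"wsn session").derive(shared)

    nonce = os.urandom(12)
    ciphertext = AESGCM(aes_key).encrypt(nonce, b"temp=21.5C", b"node-42")

    # The sink derives the same key from its private key and the node's public key.
    shared2 = sink_key.exchange(ec.ECDH(), node_key.public_key())
    aes_key2 = HKDF(algorithm=hashes.SHA256(), length=16,
                    salt=None, info=b"wsn session").derive(shared2)
    print(AESGCM(aes_key2).decrypt(nonce, ciphertext, b"node-42"))
    ```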

  4. Harry Potter and the Cryptography with Matrices

    Science.gov (United States)

    Chua, Boon Liang

    2006-01-01

    This article describes cryptography, defined as the science of encrypting and deciphering messages written in secret codes; it has played a vital role in securing information since ancient times. There are several cryptographic techniques and many make extensive use of mathematics to secure information. The author discusses an activity built…
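    As a representative example of matrix-based encryption (the article's own classroom activity is not described in this record), a small Hill-style cipher is sketched below in Python.

    ```python
    # Hill-style matrix cipher: plaintext letters are grouped into column vectors
    # and multiplied by a key matrix modulo 26.
    import numpy as np

    KEY = np.array([[3, 3], [2, 5]])     # invertible mod 26 (det = 9, gcd(9, 26) = 1)

    def encrypt(plaintext: str) -> str:
        nums = [ord(c) - ord('A') for c in plaintext.upper() if c.isalpha()]
        if len(nums) % 2:
            nums.append(ord('X') - ord('A'))          # pad to an even length
        blocks = np.array(nums).reshape(-1, 2).T      # one column per digraph
        cipher = (KEY @ blocks) % 26
        return ''.join(chr(int(n) + ord('A')) for n in cipher.T.flatten())

    # Decryption multiplies by the inverse of KEY modulo 26.
    print(encrypt("HELP"))
    ```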

  5. Spectral coherent-state quantum cryptography.

    Science.gov (United States)

    Cincotti, Gabriella; Spiekman, Leo; Wada, Naoya; Kitayama, Ken-ichi

    2008-11-01

    A novel implementation of quantum-noise optical cryptography is proposed, which is based on a simplified architecture that allows long-haul, high-speed transmission in a fiber optical network. By using a single multiport encoder/decoder and 16 phase shifters, this new approach can provide the same confidentiality as other implementations of Yuen's encryption protocol, which use a larger number of phase or polarization coherent states. Data confidentiality and error probability for authorized and unauthorized receivers are carefully analyzed.

  6. Key distillation in quantum cryptography

    Science.gov (United States)

    Slutsky, Boris Aron

    1998-11-01

    Quantum cryptography is a technique which permits two parties to communicate over an open channel and establish a shared sequence of bits known only to themselves. This task, provably impossible in classical cryptography, is accomplished by encoding the data on quantum particles and harnessing their unique properties. It is believed that no eavesdropping attack consistent with the laws of quantum theory can compromise the secret data without the knowledge of the legitimate users of the channel. Any attempt by a hostile actor to monitor the data-carrying particles while in transit reveals itself through the transmission errors it must inevitably introduce. Unfortunately, in practice a communication is not free of errors even when no eavesdropping is present. Key distillation is a technique that permits the parties to overcome this difficulty and establish a secret key despite channel defects, under the assumption that every particle is handled independently of other particles by the enemy. In the present work, key distillation is described and its various aspects are studied. A relationship is derived between the average error rate resulting from an eavesdropping attack and the amount of information obtained by the attacker. A formal definition of the security of the final key is developed. The net throughput of secret bits in a quantum cryptosystem employing key distillation is assessed. An overview of quantum cryptographic protocols and related information theoretical results is also given.

  7. Special Issue on Entropy-Based Applied Cryptography and Enhanced Security for Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

    James (Jong Hyuk) Park

    2016-09-01

    Full Text Available Entropy is a basic and important concept in information theory. It is also often used as a measure of the unpredictability of a cryptographic key in cryptography research areas. Ubiquitous computing (Ubi-comp) has emerged rapidly as an exciting new paradigm. In this special issue, we mainly selected and discussed papers related to core theories based on graph theory to solve computational problems in cryptography and security, and practical technologies, applications and services for Ubi-comp, including secure encryption techniques; identity and authentication; credential cloning attacks and countermeasures; switching generators with resistance against algebraic and side channel attacks; entropy-based network anomaly detection; applied cryptography using chaos functions, information hiding and watermarking, secret sharing, and message authentication; detection and modeling of cyber attacks with Petri Nets; and quantum flows for secret key distribution, etc.

  8. APE: Authenticated Permutation-Based Encryption for Lightweight Cryptography

    DEFF Research Database (Denmark)

    Andreeva, Elena; Bilgin, Begül; Bogdanov, Andrey

    2015-01-01

    The domain of lightweight cryptography focuses on cryptographic algorithms for extremely constrained devices. It is very costly to avoid nonce reuse in such environments, because this requires either a hardware source of randomness, or non-volatile memory to store a counter. At the same time, a lot...

  9. McBits: fast constant-time code-based cryptography

    NARCIS (Netherlands)

    Bernstein, D.J.; Chou, T.; Schwabe, P.

    2015-01-01

    This paper presents extremely fast algorithms for code-based public-key cryptography, including full protection against timing attacks. For example, at a 2^128 security level, this paper achieves a reciprocal decryption throughput of just 60493 cycles (plus cipher cost etc.) on a single Ivy Bridge

  10. Modern cryptography and elliptic curves a beginner's guide

    CERN Document Server

    Shemanske, Thomas R

    2017-01-01

    This book offers the beginning undergraduate student some of the vista of modern mathematics by developing and presenting the tools needed to gain an understanding of the arithmetic of elliptic curves over finite fields and their applications to modern cryptography. This gradual introduction also makes a significant effort to teach students how to produce or discover a proof by presenting mathematics as an exploration, and at the same time, it provides the necessary mathematical underpinnings to investigate the practical and implementation side of elliptic curve cryptography (ECC). Elements of abstract algebra, number theory, and affine and projective geometry are introduced and developed, and their interplay is exploited. Algebra and geometry combine to characterize congruent numbers via rational points on the unit circle, and group law for the set of points on an elliptic curve arises from geometric intuition provided by Bézout's theorem as well as the construction of projective space. The structure of the...

  11. Design of an Elliptic Curve Cryptography processor for RFID tag chips.

    Science.gov (United States)

    Liu, Zilong; Liu, Dongsheng; Zou, Xuecheng; Lin, Hui; Cheng, Jian

    2014-09-26

    Radio Frequency Identification (RFID) is an important technique for wireless sensor networks and the Internet of Things. Recently, considerable research has been performed in the combination of public key cryptography and RFID. In this paper, an efficient architecture of Elliptic Curve Cryptography (ECC) Processor for RFID tag chip is presented. We adopt a new inversion algorithm which requires fewer registers to store variables than the traditional schemes. A new method for coordinate swapping is proposed, which can reduce the complexity of the controller and shorten the time of iterative calculation effectively. A modified circular shift register architecture is presented in this paper, which is an effective way to reduce the area of register files. Clock gating and asynchronous counter are exploited to reduce the power consumption. The simulation and synthesis results show that the time needed for one elliptic curve scalar point multiplication over GF(2^163) is 176.7 K clock cycles and the gate area is 13.8 K with UMC 0.13 μm Complementary Metal Oxide Semiconductor (CMOS) technology. Moreover, the low power and low cost consumption make the Elliptic Curve Cryptography Processor (ECP) a prospective candidate for application in the RFID tag chip.
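    The toy Python sketch below shows the double-and-add scalar multiplication that such a processor realises in hardware; it uses a textbook curve over a tiny prime field rather than GF(2^163), so it illustrates only the iterative point arithmetic, not the paper's architecture.

    ```python
    # Double-and-add scalar multiplication on the textbook curve
    # y^2 = x^3 + 2x + 2 over GF(17) with base point G = (5, 1).
    P_MOD, A = 17, 2
    INF = None                                       # point at infinity

    def point_add(p, q):
        if p is None: return q
        if q is None: return p
        (x1, y1), (x2, y2) = p, q
        if x1 == x2 and (y1 + y2) % P_MOD == 0:
            return INF                               # p + (-p) = O
        if p == q:
            lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD)
        else:
            lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD)
        lam %= P_MOD
        x3 = (lam * lam - x1 - x2) % P_MOD
        return x3, (lam * (x1 - x3) - y1) % P_MOD

    def scalar_mult(k, p):
        result, addend = INF, p
        while k:                                     # scan the scalar bit by bit
            if k & 1:
                result = point_add(result, addend)
            addend = point_add(addend, addend)       # point doubling
            k >>= 1
        return result

    G = (5, 1)
    print(scalar_mult(9, G))                         # 9*G on the toy curve
    ```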

  12. Device-independent two-party cryptography secure against sequential attacks

    International Nuclear Information System (INIS)

    Kaniewski, Jędrzej; Wehner, Stephanie

    2016-01-01

    The goal of two-party cryptography is to enable two parties, Alice and Bob, to solve common tasks without the need for mutual trust. Examples of such tasks are private access to a database, and secure identification. Quantum communication enables security for all of these problems in the noisy-storage model by sending more signals than the adversary can store in a certain time frame. Here, we initiate the study of device-independent (DI) protocols for two-party cryptography in the noisy-storage model. Specifically, we present a relatively easy to implement protocol for a cryptographic building block known as weak string erasure and prove its security even if the devices used in the protocol are prepared by the dishonest party. DI two-party cryptography is made challenging by the fact that Alice and Bob do not trust each other, which requires new techniques to establish security. We fully analyse the case of memoryless devices (for which sequential attacks are optimal) and the case of sequential attacks for arbitrary devices. The key ingredient of the proof, which might be of independent interest, is an explicit (and tight) relation between the violation of the Clauser–Horne–Shimony–Holt inequality observed by Alice and Bob and uncertainty generated by Alice against Bob who is forced to measure his system before finding out Alice’s setting (guessing with postmeasurement information). In particular, we show that security is possible for arbitrarily small violation. (paper)

  13. Device-independent two-party cryptography secure against sequential attacks

    Science.gov (United States)

    Kaniewski, Jędrzej; Wehner, Stephanie

    2016-05-01

    The goal of two-party cryptography is to enable two parties, Alice and Bob, to solve common tasks without the need for mutual trust. Examples of such tasks are private access to a database, and secure identification. Quantum communication enables security for all of these problems in the noisy-storage model by sending more signals than the adversary can store in a certain time frame. Here, we initiate the study of device-independent (DI) protocols for two-party cryptography in the noisy-storage model. Specifically, we present a relatively easy to implement protocol for a cryptographic building block known as weak string erasure and prove its security even if the devices used in the protocol are prepared by the dishonest party. DI two-party cryptography is made challenging by the fact that Alice and Bob do not trust each other, which requires new techniques to establish security. We fully analyse the case of memoryless devices (for which sequential attacks are optimal) and the case of sequential attacks for arbitrary devices. The key ingredient of the proof, which might be of independent interest, is an explicit (and tight) relation between the violation of the Clauser-Horne-Shimony-Holt inequality observed by Alice and Bob and uncertainty generated by Alice against Bob who is forced to measure his system before finding out Alice’s setting (guessing with postmeasurement information). In particular, we show that security is possible for arbitrarily small violation.

  14. Electrocardiogram signal denoising based on a new improved wavelet thresholding

    Science.gov (United States)

    Han, Guoqiang; Xu, Zhijun

    2016-08-01

    Good quality electrocardiograms (ECGs) are utilized by physicians for the interpretation and identification of physiological and pathological phenomena. In general, ECG signals may be mixed with various noises such as baseline wander, power line interference, and electromagnetic interference during the gathering and recording process. As ECG signals are non-stationary physiological signals, the wavelet transform has been investigated as an effective tool to discard noise from corrupted signals. A new compromising threshold function, called the sigmoid-function-based thresholding scheme, is adopted in processing ECG signals. Compared with other methods such as hard/soft thresholding or other existing thresholding functions, the new algorithm has many advantages in the noise reduction of ECG signals. It perfectly overcomes the discontinuity at ±T of hard thresholding and reduces the fixed deviation of soft thresholding. The improved wavelet thresholding denoising is shown to be more efficient than existing algorithms in ECG signal denoising. The signal-to-noise ratio, mean square error, and percent root mean square difference are calculated as quantitative tools to verify the denoising performance. The experimental results reveal that the waves of the ECG signals after denoising, including the P, Q, R, and S waves, coincide with the original ECG signals when the newly proposed method is employed.
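    The sketch below contrasts hard and soft thresholding of wavelet coefficients with a generic sigmoid-based compromise function; the paper's exact sigmoid function is not reproduced, and the shape parameter k is a hypothetical tuning knob.

    ```python
    # Hard, soft, and a generic sigmoid-based compromise thresholding of
    # wavelet coefficients w with threshold T.
    import numpy as np

    def hard(w, T):
        return np.where(np.abs(w) > T, w, 0.0)

    def soft(w, T):
        return np.sign(w) * np.maximum(np.abs(w) - T, 0.0)

    def sigmoid_thresh(w, T, k=10.0):
        # Smoothly attenuates small coefficients and keeps large ones, avoiding
        # the jump of hard() at |w| = T and the constant bias of soft().
        return w / (1.0 + np.exp(-k * (np.abs(w) - T)))

    w = np.array([-2.0, -0.6, -0.2, 0.1, 0.5, 1.5])
    T = 0.4
    print(hard(w, T), soft(w, T), sigmoid_thresh(w, T), sep="\n")
    ```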

  15. Cryptography in constant parallel time

    CERN Document Server

    Applebaum, Benny

    2013-01-01

    Locally computable (NC0) functions are 'simple' functions for which every bit of the output can be computed by reading a small number of bits of their input. The study of locally computable cryptography attempts to construct cryptographic functions that achieve this strong notion of simplicity and simultaneously provide a high level of security. Such constructions are highly parallelizable and they can be realized by Boolean circuits of constant depth.This book establishes, for the first time, the possibility of local implementations for many basic cryptographic primitives such as one-way func

  16. Cooperating attackers in neural cryptography.

    Science.gov (United States)

    Shacham, Lanir N; Klein, Einat; Mislovaty, Rachel; Kanter, Ido; Kinzel, Wolfgang

    2004-06-01

    A successful attack strategy in neural cryptography is presented. The neural cryptosystem, based on synchronization of neural networks by mutual learning, has been recently shown to be secure under different attack strategies. The success of the advanced attacker presented here, called the "majority-flipping attacker," does not decay with the parameters of the model. This attacker's outstanding success is due to its using a group of attackers which cooperate throughout the synchronization process, unlike any other attack strategy known. An analytical description of this attack is also presented, and fits the results of simulations.

  17. A "proof-reading" of Some Issues in Cryptography

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre

    2007-01-01

    In this paper, we identify some issues in the interplay between practice and theory in cryptography, issues that have repeatedly appeared in different incarnations over the years. These issues are related to fundamental concepts in the field, e.g., to what extent we can prove that a system is secure...

  18. Trichocyanines: a Red-Hair-Inspired Modular Platform for Dye-Based One-Time-Pad Molecular Cryptography.

    Science.gov (United States)

    Leone, Loredana; Pezzella, Alessandro; Crescenzi, Orlando; Napolitano, Alessandra; Barone, Vincenzo; d'Ischia, Marco

    2015-06-01

    Current molecular cryptography (MoCryp) systems are almost exclusively based on DNA chemistry and reports of cryptography technologies based on other less complex chemical systems are lacking. We describe herein, as proof of concept, the prototype of the first asymmetric MoCryp system, based on an 8-compound set of a novel bioinspired class of cyanine-type dyes called trichocyanines. These novel acidichromic cyanine-type dyes inspired by red hair pigments were synthesized and characterized with the aid of density functional theory (DFT) calculations. Trichocyanines consist of a modular scaffold easily accessible via an expedient condensation of 3-phenyl- or 3-methyl-2H-1,4-benzothiazines with N-dimethyl- or o-methoxyhydroxy-substituted benzaldehyde or cinnamaldehyde derivatives. The eight representative members synthesized herein can be classified as belonging to two three-state systems tunable through four different control points. This versatile dye platform can generate an expandable palette of colors and appears to be specifically suited to implement an unprecedented single-use asymmetric molecular cryptography system. With this system, we intend to pioneer the translation of digital public-key cryptography into a chemical-coding one-time-pad-like system.

  19. Conference on Algebraic Geometry for Coding Theory and Cryptography

    CERN Document Server

    Lauter, Kristin; Walker, Judy

    2017-01-01

    Covering topics in algebraic geometry, coding theory, and cryptography, this volume presents interdisciplinary group research completed for the February 2016 conference at the Institute for Pure and Applied Mathematics (IPAM) in cooperation with the Association for Women in Mathematics (AWM). The conference gathered research communities across disciplines to share ideas and problems in their fields and formed small research groups made up of graduate students, postdoctoral researchers, junior faculty, and group leaders who designed and led the projects. Peer reviewed and revised, each of this volume's five papers achieves the conference’s goal of using algebraic geometry to address a problem in either coding theory or cryptography. Proposed variants of the McEliece cryptosystem based on different constructions of codes, constructions of locally recoverable codes from algebraic curves and surfaces, and algebraic approaches to the multicast network coding problem are only some of the topics covered in this vo...

  20. Composability in quantum cryptography

    International Nuclear Information System (INIS)

    Mueller-Quade, Joern; Renner, Renato

    2009-01-01

    If we combine two secure cryptographic systems, is the resulting system still secure? Answering this question is highly nontrivial and has recently sparked a considerable research effort, in particular, in the area of classical cryptography. A central insight was that the answer to the question is yes, but only within a well-specified composability framework and for carefully chosen security definitions. In this article, we review several aspects of composability in the context of quantum cryptography. The first part is devoted to key distribution. We discuss the security criteria that a quantum key distribution (QKD) protocol must fulfill to allow its safe use within a larger security application (e.g. for secure message transmission); and we demonstrate-by an explicit example-what can go wrong if conventional (non-composable) security definitions are used. Finally, to illustrate the practical use of composability, we show how to generate a continuous key stream by sequentially composing rounds of a QKD protocol. In the second part, we take a more general point of view, which is necessary for the study of cryptographic situations involving, for example, mutually distrustful parties. We explain the universal composability (UC) framework and state the composition theorem that guarantees that secure protocols can securely be composed to larger applications. We focus on the secure composition of quantum protocols into unconditionally secure classical protocols. However, the resulting security definition is so strict that some tasks become impossible without additional security assumptions. Quantum bit commitment is impossible in the UC framework even with mere computational security. Similar problems arise in the quantum bounded storage model and we observe a trade-off between the UC and the use of the weakest possible security assumptions.

  1. Geospatial cryptography: enabling researchers to access private, spatially referenced, human subjects data for cancer control and prevention.

    Science.gov (United States)

    Jacquez, Geoffrey M; Essex, Aleksander; Curtis, Andrew; Kohler, Betsy; Sherman, Recinda; Emam, Khaled El; Shi, Chen; Kaufmann, Andy; Beale, Linda; Cusick, Thomas; Goldberg, Daniel; Goovaerts, Pierre

    2017-07-01

    As the volume, accuracy and precision of digital geographic information have increased, concerns regarding individual privacy and confidentiality have come to the forefront. Not only do these challenge a basic tenet underlying the advancement of science by posing substantial obstacles to the sharing of data to validate research results, but they are obstacles to conducting certain research projects in the first place. Geospatial cryptography involves the specification, design, implementation and application of cryptographic techniques to address privacy, confidentiality and security concerns for geographically referenced data. This article defines geospatial cryptography and demonstrates its application in cancer control and surveillance. Four use cases are considered: (1) national-level de-duplication among state or province-based cancer registries; (2) sharing of confidential data across cancer registries to support case aggregation across administrative geographies; (3) secure data linkage; and (4) cancer cluster investigation and surveillance. A secure multi-party system for geospatial cryptography is developed. Solutions under geospatial cryptography are presented and computation time is calculated. As services provided by cancer registries to the research community, de-duplication, case aggregation across administrative geographies and secure data linkage are often time-consuming and in some instances precluded by confidentiality and security concerns. Geospatial cryptography provides secure solutions that hold significant promise for addressing these concerns and for accelerating the pace of research with human subjects data residing in our nation's cancer registries. Pursuit of the research directions posed herein conceivably would lead to a geospatially encrypted geographic information system (GEGIS) designed specifically to promote the sharing and spatial analysis of confidential data. Geospatial cryptography holds substantial promise for accelerating the

  2. Performance improvement of per-user threshold based multiuser switched scheduling system

    KAUST Repository

    Nam, Haewoon

    2013-01-01

    This letter proposes a multiuser switched scheduling scheme with per-user thresholds and post user selection, and provides a generic analytical framework for determining the optimal feedback thresholds. The proposed scheme applies an individual feedback threshold for each user rather than a single common threshold for all users, achieving some capacity gain due to the flexibility of threshold selection as well as a lower scheduling outage probability. In addition, since scheduling outage may occur with a non-negligible probability, the proposed scheme employs post user selection in order to further improve the ergodic capacity, where the user with the highest potential for a higher channel quality than the other users is selected. Numerical and simulation results show that the capacity gain from post user selection is significant when a random sequence is used. Copyright © 2013 The Institute of Electronics, Information and Communication Engineers.
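    A Monte-Carlo sketch of the scheduling idea is given below for illustration; the Rayleigh fading model and the threshold values are assumptions of this example, not the letter's optimized thresholds.

    ```python
    # Switched scheduling with per-user feedback thresholds and post user selection.
    import random, math

    def schedule(snrs, thresholds):
        # Probe users in sequence and stop at the first one above its own threshold.
        for user, (snr, th) in enumerate(zip(snrs, thresholds)):
            if snr >= th:
                return user
        # Scheduling outage: post user selection picks the strongest user instead.
        return max(range(len(snrs)), key=lambda u: snrs[u])

    def ergodic_capacity(thresholds, avg_snrs, trials=100_000):
        total = 0.0
        for _ in range(trials):
            snrs = [random.expovariate(1.0 / g) for g in avg_snrs]  # Rayleigh fading power
            total += math.log2(1.0 + snrs[schedule(snrs, thresholds)])
        return total / trials

    avg = [10.0, 6.0, 3.0]                          # heterogeneous average SNRs
    print(ergodic_capacity([8.0, 5.0, 2.5], avg))   # per-user thresholds
    print(ergodic_capacity([6.0, 6.0, 6.0], avg))   # single common threshold
    ```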

  3. Autocompensating quantum cryptography

    International Nuclear Information System (INIS)

    Bethune, Donald S.; Risk, William P.

    2002-01-01

    Quantum cryptographic key distribution (QKD) uses extremely faint light pulses to carry quantum information between two parties (Alice and Bob), allowing them to generate a shared, secret cryptographic key. Autocompensating QKD systems automatically and passively compensate for uncontrolled time-dependent variations of the optical fibre properties by coding the information as a differential phase between orthogonally polarized components of a light pulse sent on a round trip through the fibre, reflected at mid-course using a Faraday mirror. We have built a prototype system based on standard telecom technology that achieves a privacy-amplified bit generation rate of ∼1000 bits s⁻¹ over a 10 km optical fibre link. Quantum cryptography is an example of an application that, by using quantum states of individual particles to represent information, accomplishes a practical task that is impossible using classical means. (author)

  4. An Improved Digital Signature Protocol to Multi-User Broadcast Authentication Based on Elliptic Curve Cryptography in Wireless Sensor Networks (WSNs

    Directory of Open Access Journals (Sweden)

    Hamed Bashirpour

    2018-03-01

    Full Text Available In wireless sensor networks (WSNs), users can use broadcast authentication mechanisms to connect to the target network and disseminate their messages within the network. Since data transfer in sensor networks is wireless, attackers can easily eavesdrop on deployed sensor nodes and the data sent between them, or modify the content of eavesdropped data and inject false data into the sensor network. Hence, the implementation of message authentication mechanisms (in order to prevent changes to and injection of messages into the wireless sensor network) is essential. In this paper, we present an improved protocol based on elliptic curve cryptography (ECC) to accelerate the authentication of multi-user message broadcasting. In comparison with previous ECC-based schemes, the complexity and computational overhead of the proposed scheme are significantly decreased. The proposed scheme also supports user anonymity, which is an important property in broadcast authentication schemes for WSNs to preserve user privacy and prevent user tracking.
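    As a hedged illustration of ECC-based broadcast authentication (not the paper's anonymity-preserving protocol), the following Python sketch signs a broadcast message and verifies it at a node using the pyca/cryptography package.

    ```python
    # ECDSA sign/verify as the basic building block of broadcast authentication.
    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.hazmat.primitives import hashes
    from cryptography.exceptions import InvalidSignature

    user_key = ec.generate_private_key(ec.SECP256R1())      # broadcasting user's key pair
    message = b"seq=17|cmd=report-temperature"
    signature = user_key.sign(message, ec.ECDSA(hashes.SHA256()))

    # Each sensor node holds the user's public key and checks the broadcast.
    try:
        user_key.public_key().verify(signature, message, ec.ECDSA(hashes.SHA256()))
        print("broadcast accepted")
    except InvalidSignature:
        print("broadcast rejected")
    ```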

  5. Low Cost and Compact Quantum Cryptography

    OpenAIRE

    Duligall, J. L.; Godfrey, M. S.; Harrison, K. A.; Munro, W. J.; Rarity, J. G.

    2006-01-01

    We present the design of a novel free-space quantum cryptography system, complete with purpose-built software, that can operate in daylight conditions. The transmitter and receiver modules are built using inexpensive off-the-shelf components. Both modules are compact allowing the generation of renewed shared secrets on demand over a short range of a few metres. An analysis of the software is shown as well as results of error rates and therefore shared secret yields at varying background light...

  6. Enhancing Undergraduate Mathematics Curriculum via Coding Theory and Cryptography

    Science.gov (United States)

    Aydin, Nuh

    2009-01-01

    The theory of error-correcting codes and cryptography are two relatively recent applications of mathematics to information and communication systems. The mathematical tools used in these fields generally come from algebra, elementary number theory, and combinatorics, including concepts from computational complexity. It is possible to introduce the…

  7. Efficient multiuser quantum cryptography network based on entanglement.

    Science.gov (United States)

    Xue, Peng; Wang, Kunkun; Wang, Xiaoping

    2017-04-04

    We present an efficient quantum key distribution protocol with a certain entangled state to solve a special cryptographic task. Also, we provide a proof of security of this protocol by generalizing the proof of the modified Lo-Chau scheme. Based on this two-user scheme, a quantum cryptography network protocol is proposed without any quantum memory.

  8. Architecture for the Secret-Key BC3 Cryptography Algorithm

    Directory of Open Access Journals (Sweden)

    Arif Sasongko

    2014-11-01

    Full Text Available Cryptography is a very important aspect of data security. The focus of research in this field is shifting from merely the security aspect to considering the implementation aspect as well. This paper aims to introduce the BC3 algorithm with a focus on its hardware implementation, and proposes an architecture for the hardware implementation of this algorithm. BC3 is a secret-key cryptography algorithm developed with two considerations: robustness and implementation efficiency. The algorithm has been implemented in software and has good performance compared to the AES algorithm. BC3 is an improvement of the BC2 and AE cryptographic algorithms and is expected to have the same level of robustness and to gain competitive advantages in the implementation aspect. The development of the architecture gives much attention to (1) resource sharing and (2) having a single clock for each round, and exploits the regularity of the algorithm. This architecture is then implemented on an FPGA. The implementation requires three times less area than AES, but is about five times faster. Furthermore, this BC3 hardware implementation has better performance than the BC3 software in both the key expansion stage and the randomizing stage. In the future, the security of this implementation must be reviewed, especially against side-channel attacks.

  9. Insecurity of position-based quantum-cryptography protocols against entanglement attacks

    International Nuclear Information System (INIS)

    Lau, Hoi-Kwan; Lo, Hoi-Kwong

    2011-01-01

    Recently, position-based quantum cryptography has been claimed to be unconditionally secure. On the contrary, here we show that the existing proposals for position-based quantum cryptography are, in fact, insecure if entanglement is shared among two adversaries. Specifically, we demonstrate how the adversaries can incorporate ideas of quantum teleportation and quantum secret sharing to compromise the security with certainty. The common flaw to all current protocols is that the Pauli operators always map a codeword to a codeword (up to an irrelevant overall phase). We propose a modified scheme lacking this property in which the same cheating strategy used to undermine the previous protocols can succeed with a rate of at most 85%. We prove the modified protocol is secure when the shared quantum resource between the adversaries is a two- or three-level system.

  10. Fast elliptic-curve cryptography on the Cell Broadband Engine

    NARCIS (Netherlands)

    Costigan, N.; Schwabe, P.; Preneel, B.

    2009-01-01

    This paper is the first to investigate the power of the Cell Broadband Engine for state-of-the-art public-key cryptography. We present a high-speed implementation of elliptic-curve Diffie-Hellman (ECDH) key exchange for this processor, which needs 697080 cycles on one Synergistic Processor Unit for

  11. The mathematics of ciphers number theory and RSA cryptography

    CERN Document Server

    Coutinho, S C

    1999-01-01

    This book is an introduction to the algorithmic aspects of number theory and its applications to cryptography, with special emphasis on the RSA cryptosystem. It covers many of the familiar topics of elementary number theory, all with an algorithmic twist. The text also includes many interesting historical notes.

  12. Coding theory and cryptography the essentials

    CERN Document Server

    Hankerson, DC; Leonard, DA; Phelps, KT; Rodger, CA; Wall, JR; Wall, J R

    2000-01-01

    Containing data on number theory, encryption schemes, and cyclic codes, this highly successful textbook, proven by the authors in a popular two-quarter course, presents coding theory, construction, encoding, and decoding of specific code families in an "easy-to-use" manner appropriate for students with only a basic background in mathematics, offering revised and updated material on the Berlekamp-Massey decoding algorithm and convolutional codes. Introducing the mathematics as it is needed and providing exercises with solutions, this edition includes an extensive section on cryptography, desig

  13. Steganography and Cryptography Inspired Enhancement of Introductory Programming Courses

    Science.gov (United States)

    Kortsarts, Yana; Kempner, Yulia

    2015-01-01

    Steganography is the art and science of concealing communication. The goal of steganography is to hide the very existence of information exchange by embedding messages into unsuspicious digital media covers. Cryptography, or secret writing, is the study of the methods of encryption, decryption and their use in communications protocols.…

  14. Quantum-tomographic cryptography with a semiconductor single-photon source

    International Nuclear Information System (INIS)

    Kaszlikowski, D.; Yang, L.J.; Yong, L.S.; Willeboordse, F.H.; Kwek, L.C.

    2005-01-01

    We analyze the security of so-called quantum-tomographic cryptography with the source producing entangled photons via an experimental scheme proposed by Fattal et al. [Phys. Rev. Lett. 92, 37903 (2004)]. We determine the range of the experimental parameters for which the protocol is secure against the most general incoherent attacks

  15. Thresher: an improved algorithm for peak height thresholding of microbial community profiles.

    Science.gov (United States)

    Starke, Verena; Steele, Andrew

    2014-11-15

    This article presents Thresher, an improved technique for finding peak height thresholds for automated rRNA intergenic spacer analysis (ARISA) profiles. We argue that thresholds must be sample dependent, taking community richness into account. In most previous fragment analyses, a common threshold is applied to all samples simultaneously, ignoring richness variations among samples and thereby compromising cross-sample comparison. Our technique solves this problem, and at the same time provides a robust method for outlier rejection, selecting for removal any replicate pairs that are not valid replicates. Thresholds are calculated individually for each replicate in a pair, and separately for each sample. The thresholds are selected to be the ones that minimize the dissimilarity between the replicates after thresholding. If a choice of threshold results in the two replicates in a pair failing a quantitative test of similarity, either that threshold or that sample must be rejected. We compare thresholded ARISA results with sequencing results, and demonstrate that the Thresher algorithm outperforms conventional thresholding techniques. The software is implemented in R, and the code is available at http://verenastarke.wordpress.com or by contacting the author. vstarke@ciw.edu or http://verenastarke.wordpress.com Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
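    A simplified Python illustration of the sample-dependent thresholding idea is sketched below; the actual Thresher package is implemented in R and additionally performs outlier rejection over many candidate thresholds, so this is only a conceptual stand-in.

    ```python
    # For one replicate pair, try candidate peak-height thresholds for each
    # replicate separately and keep the pair of thresholds that minimizes the
    # Bray-Curtis dissimilarity between the thresholded profiles.
    import numpy as np

    def bray_curtis(a, b):
        denom = (a + b).sum()
        return np.abs(a - b).sum() / denom if denom else 0.0

    def threshold_pair(rep1, rep2, candidates=np.linspace(0.0, 0.05, 26)):
        best = (None, None, np.inf)
        for t1 in candidates:
            for t2 in candidates:
                a = np.where(rep1 >= t1, rep1, 0.0)
                b = np.where(rep2 >= t2, rep2, 0.0)
                d = bray_curtis(a, b)
                if d < best[2]:
                    best = (t1, t2, d)
        return best        # (threshold for replicate 1, for replicate 2, dissimilarity)

    rng = np.random.default_rng(0)
    profile = rng.dirichlet(np.ones(40))                       # relative peak heights
    rep1 = (profile + rng.normal(0, 0.002, 40)).clip(min=0)    # two noisy replicates
    rep2 = (profile + rng.normal(0, 0.002, 40)).clip(min=0)
    print(threshold_pair(rep1, rep2))
    ```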

  16. On a two-pass scheme without a faraday mirror for free-space relativistic quantum cryptography

    Energy Technology Data Exchange (ETDEWEB)

    Kravtsov, K. S.; Radchenko, I. V. [Russian Academy of Sciences, Prokhorov General Physics Institute (Russian Federation)]; Korol'kov, A. V. [Academy of Cryptography (Russian Federation)]; Kulik, S. P., E-mail: sergei.kulik@gmail.com [Moscow State University (Russian Federation)]; Molotkov, S. N., E-mail: sergei.molotkov@gmail.com [Academy of Cryptography (Russian Federation)]

    2013-05-15

    The stability of destructive interference independent of the input polarization and the state of a quantum communication channel in fiber optic systems used in quantum cryptography plays a principal role in providing the security of communicated keys. A novel optical scheme is proposed that can be used both in relativistic quantum cryptography for communicating keys in open space and for communicating them over fiber optic lines. The scheme ensures stability of destructive interference and admits simple automatic balancing of a fiber interferometer.

  17. On a two-pass scheme without a faraday mirror for free-space relativistic quantum cryptography

    International Nuclear Information System (INIS)

    Kravtsov, K. S.; Radchenko, I. V.; Korol’kov, A. V.; Kulik, S. P.; Molotkov, S. N.

    2013-01-01

    The stability of destructive interference independent of the input polarization and the state of a quantum communication channel in fiber optic systems used in quantum cryptography plays a principal role in providing the security of communicated keys. A novel optical scheme is proposed that can be used both in relativistic quantum cryptography for communicating keys in open space and for communicating them over fiber optic lines. The scheme ensures stability of destructive interference and admits simple automatic balancing of a fiber interferometer.

  18. Developmental Mechanisms Underlying Improved Contrast Thresholds for Discriminations of Orientation Signals Embedded in Noise

    Directory of Open Access Journals (Sweden)

    Seong Taek eJeon

    2014-09-01

    Full Text Available We combined an external noise paradigm with an efficient procedure for obtaining contrast thresholds (Lesmes et al., 2006) in order to model developmental changes during childhood. Specifically, we measured the contrast thresholds of 5-, 7-, 9-year-olds and adults (n = 20/age) in a two alternative forced-choice orientation discrimination task over a wide range of external noise levels and at three levels of accuracy. Overall, as age increased, contrast thresholds decreased over the entire range of external noise levels tested. The decrease was greatest between 5 and 7 years of age. The reduction in threshold after age 5 was greater in the high than the low external noise region, a pattern implying greater tolerance to the irrelevant background noise as children became older. To model the mechanisms underlying these developmental changes in terms of internal noise components, we adapted the original perceptual template model (Lu and Dosher, 1998) and normalized the magnitude of performance changes against the performance of 5-year-olds. The resulting model provided an excellent fit (r2 = 0.985) to the contrast thresholds at multiple levels of accuracy (60, 75, and 90%) across a wide range of external noise levels. The improvements in contrast thresholds with age were best modelled by a combination of reductions in internal additive noise, reductions in internal multiplicative noise, and improvements in excluding external noise by template retuning. In line with the data, the improvement was greatest between 5 and 7 years of age, accompanied by a 39% reduction in additive noise, 71% reduction in multiplicative noise, and 45% improvement in external noise exclusion. The modelled improvements likely reflect developmental changes at the cortical level, rather than changes in front-end structural properties (Kiorpes et al., 2003).

  19. Post-Quantum Cryptography: Riemann Primitives and Chrysalis

    OpenAIRE

    Malloy, Ian; Hollenbeck, Dennis

    2018-01-01

    The Chrysalis project is a proposed method for post-quantum cryptography using the Riemann sphere. To this end, Riemann primitives are introduced in addition to a novel implementation of this new method. Chrysalis itself is the first cryptographic scheme to rely on Holomorphic Learning with Errors, which is a complex form of Learning with Errors relying on the Gauss Circle Problem within the Riemann sphere. The principle security reduction proposed by this novel cryptographic scheme applies c...

  20. Introduction to cryptography

    CERN Document Server

    Buchmann, Johannes A

    2004-01-01

    Cryptography is a key technology in electronic key systems. It is used to keep data secret, digitally sign documents, access control, etc. Therefore, users should not only know how its techniques work, but they must also be able to estimate their efficiency and security. For this new edition, the author has updated the discussion of the security of encryption and signature schemes and recent advances in factoring and computing discrete logarithms. He has also added descriptions of time-memory trade of attacks and algebraic attacks on block ciphers, the Advanced Encryption Standard, the Secure Hash Algorithm, secret sharing schemes, and undeniable and blind signatures. Johannes A. Buchmann is a Professor of Computer Science and Mathematics at the Technical University of Darmstadt, and the Associate Editor of the Journal of Cryptology. In 1985, he received the Feodor Lynen Fellowship of the Alexander von Humboldt Foundation. Furthermore, he has received the most prestigious award in science in Germany, the Leib...

  1. File Cryptography with AES and RSA for Mobile Based on Android

    Science.gov (United States)

    laia, Yonata; Nababan, Marlince; Sihombing, Oloan; Aisyah, Siti; Sitanggang, Delima; Parsaoran, Saut; Zendato, Niskarto

    2018-04-01

    The number of Android-based mobile users keeps increasing, and mobile devices are now used much like computers, including for storing important personal data. Storing such data on a mobile device is risky because it is a target for hackers. For this reason, the researchers add cryptography that combines the Advanced Encryption Standard (AES) with the Rivest-Shamir-Adleman (RSA) algorithm. The combined method can encrypt and decrypt data on the device, with the following measured times: encryption of a 25.44 KB file takes 4 s, 200 KB takes 5 s, 600 KB takes 7 s, and 2.29 MB takes 10 s; decryption of 25.44 KB takes 2 s, 200 KB takes 1.5 s, 600 KB takes 2.5 s, and 2.29 MB takes 2.7 s.
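
    A common way to combine the two ciphers, sketched here only as a hedged illustration of the general idea rather than the authors' exact construction, is hybrid encryption: each file is encrypted with a fresh AES key, and that key is wrapped with the recipient's RSA public key. The sketch below uses the Python cryptography package; the key sizes and the use of AES-GCM are assumptions.

```python
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives import hashes

# Recipient's RSA key pair (in practice the public key is distributed beforehand).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def encrypt_file(data: bytes):
    """Encrypt data with a fresh AES-256-GCM key, then wrap the key with RSA-OAEP."""
    aes_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(aes_key).encrypt(nonce, data, None)
    wrapped_key = public_key.encrypt(aes_key, oaep)
    return wrapped_key, nonce, ciphertext

def decrypt_file(wrapped_key, nonce, ciphertext):
    """Unwrap the AES key with the RSA private key, then decrypt the data."""
    aes_key = private_key.decrypt(wrapped_key, oaep)
    return AESGCM(aes_key).decrypt(nonce, ciphertext, None)

w, n, c = encrypt_file(b"individual important data stored on the phone")
assert decrypt_file(w, n, c) == b"individual important data stored on the phone"
```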

  2. Cryptographie quantique à variables continues

    Science.gov (United States)

    Bencheikh, K.; Jankovic, A.; Symul, T.; Levenson, J. A.

    2002-06-01

    We have developed a quantum cryptography protocol that makes it possible to generate and distribute a random secret key. The protocol relies on pairs of electromagnetic fields whose quadratures exhibit Einstein-Podolsky-Rosen quantum correlations. The instantaneous quantum fluctuations constitute the random bits of the secret key, and the irreversible degradation of the quadrature quantum correlations caused by a third party makes it possible to detect eavesdropping and to guarantee the security of the exchange.

  3. Combining Cryptography with EEG Biometrics.

    Science.gov (United States)

    Damaševičius, Robertas; Maskeliūnas, Rytis; Kazanavičius, Egidijus; Woźniak, Marcin

    2018-01-01

    Cryptographic frameworks depend on key sharing for ensuring security of data. While the keys in cryptographic frameworks must be correctly reproducible and not unequivocally connected to the identity of a user, in biometric frameworks this is different. Joining cryptography techniques with biometrics can solve these issues. We present a biometric authentication method based on the discrete logarithm problem and Bose-Chaudhuri-Hocquenghem (BCH) codes, perform its security analysis, and demonstrate its security characteristics. We evaluate a biometric cryptosystem using our own dataset of electroencephalography (EEG) data collected from 42 subjects. The experimental results show that the described biometric user authentication system is effective, achieving an Equal Error Rate (EER) of 0.024.

  4. Introduction to Cryptography and the Bitcoin Protocol (1/2)

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    The Bitcoin protocol not only supports an electronic currency, but also has the possibility for being (mis)used in other ways. Topics will include the basic operation of how Bitcoin operates including motivations and also such things as block chaining, bitcoin mining, and how financial transactions operate. A knowledge of the topics covered in the Basic Cryptography lecture will be assumed.

  5. Introduction to Cryptography and the Bitcoin Protocol (2/2)

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    The Bitcoin protocol not only supports an electronic currency, but also has the possibility for being (mis)used in other ways. Topics will include the basic operation of how Bitcoin operates including motivations and also such things as block chaining, bitcoin mining, and how financial transactions operate. A knowledge of the topics covered in the Basic Cryptography lecture will be assumed.

  6. Cryptography from quantum uncertainty in the presence of quantum side information

    NARCIS (Netherlands)

    Bouman, Niek Johannes

    2012-01-01

    The thesis starts with a high-level introduction to cryptography and quantum mechanics. Chapter 2 gives a theoretical foundation by introducing probability theory, information theory, functional analysis, quantum mechanics and quantum information theory. Chapters 3, 4 and 5 are editions of work…

  7. Decoy state method for quantum cryptography based on phase coding into faint laser pulses

    Science.gov (United States)

    Kulik, S. P.; Molotkov, S. N.

    2017-12-01

    We discuss the photon number splitting attack (PNS) in systems of quantum cryptography with phase coding. It is shown that this attack, as well as the structural equations for the PNS attack for phase encoding, differs physically from the analogous attack applied to the polarization coding. As far as we know, in practice, in all works to date processing of experimental data has been done for phase coding, but using formulas for polarization coding. This can lead to inadequate results for the length of the secret key. These calculations are important for the correct interpretation of the results, especially if it concerns the criterion of secrecy in quantum cryptography.

  8. Cryptography- An ideal solution to privacy, data integrity and non ...

    African Journals Online (AJOL)

    Encryption, hashing and digital signatures are the three primitives of cryptography. These have been treated in depth, and their performances on text data and image data have been studied. The most secure algorithms so far in use have been introduced, together with the respective performance of each primitive's algorithm on …

  9. A key distribution scheme using elliptic curve cryptography in wireless sensor networks

    CSIR Research Space (South Africa)

    Louw, J

    2016-12-01

    Full Text Available Wireless sensor networks (WSNs) have become increasingly popular in many applications across a broad range of fields. Securing WSNs poses unique challenges mainly due to their resource constraints. Traditional public key cryptography (PKC...

  10. Implementing SSL/TLS using cryptography and PKI

    CERN Document Server

    Davies, Joshua

    2011-01-01

    Hands-on, practical guide to implementing SSL and TLS protocols for Internet security If you are a network professional who knows C programming, this practical book is for you.  Focused on how to implement Secure Socket Layer (SSL) and Transport Layer Security (TLS), this book guides you through all necessary steps, whether or not you have a working knowledge of cryptography. The book covers SSLv2, TLS 1.0, and TLS 1.2, including implementations of the relevant cryptographic protocols, secure hashing, certificate parsing, certificate generation, and more.  Coverage includes: Underst

  11. ID based cryptography for secure cloud data storage

    OpenAIRE

    Kaaniche , Nesrine; Boudguiga , Aymen; Laurent , Maryline

    2013-01-01

    International audience; This paper addresses the security issues of storing sensitive data in a cloud storage service and the need for users to trust the commercial cloud providers. It proposes a cryptographic scheme for cloud storage, based on an original usage of ID-Based Cryptography. Our solution has several advantages. First, it provides secrecy for encrypted data which are stored in public servers. Second, it offers controlled data access and sharing among users, so that unauthorized us...

  12. Analysis of Multiple Data Hiding Combined Coloured Visual Cryptography and LSB

    Science.gov (United States)

    Maulana, Halim; Rahman Syahputra, Edy

    2017-12-01

    Currently the level of data security is a major factor in data transfer. Whenever data is sent through any medium, there is a risk of it being hacked. Techniques such as steganography and cryptography are often used to secure data, but individual algorithms do not stay secure for long once their weaknesses are discovered, so a variety of new algorithms is needed to guarantee data security. This study tries to combine two techniques, visual cryptography and LSB steganography. In the experiments, two data types are secured, image data and text data; both are regarded as the message, so to obtain the correct information the receiver must recover both types of data.
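
    The LSB half of such a combined scheme can be sketched as follows; the visual-cryptography share generation is omitted, and the cover is modelled as a plain byte array standing in for pixel data, so this is an illustration of generic LSB embedding rather than the authors' exact method.

```python
def embed_lsb(cover: bytearray, message: bytes) -> bytearray:
    """Hide message bits in the least significant bits of the cover bytes."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for message")
    stego = bytearray(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # replace the LSB with a message bit
    return stego

def extract_lsb(stego: bytes, n_bytes: int) -> bytes:
    """Read back n_bytes hidden in the least significant bits."""
    out = bytearray()
    for k in range(n_bytes):
        byte = 0
        for b in stego[8 * k: 8 * (k + 1)]:
            byte = (byte << 1) | (b & 1)
        out.append(byte)
    return bytes(out)

cover = bytearray(range(256))          # hypothetical pixel values
stego = embed_lsb(cover, b"key")
assert extract_lsb(stego, 3) == b"key"
```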

  13. Cryptographic robustness of practical quantum cryptography: BB84 key distribution protocol

    International Nuclear Information System (INIS)

    Molotkov, S. N.

    2008-01-01

    In real fiber-optic quantum cryptography systems, the avalanche photodiodes are not perfect, the source of quantum states is not a single-photon one, and the communication channel is lossy. For these reasons, key distribution is impossible under certain conditions for the system parameters. A simple analysis is performed to find relations between the parameters of real cryptography systems and the length of the quantum channel that guarantee secure quantum key distribution when the eavesdropper's capabilities are limited only by fundamental laws of quantum mechanics while the devices employed by the legitimate users are based on current technologies. Critical values are determined for the rate of secure real-time key generation that can be reached under the current technology level. Calculations show that the upper bound on channel length can be as high as 300 km for imperfect photodetectors (avalanche photodiodes) with present-day quantum efficiency (η ∼ 20%) and dark count probability (p_dark ∼ 10^-7)

  14. Cryptographic Research and NSA: Report of the Public Cryptography Study Group.

    Science.gov (United States)

    Davida, George I.

    1981-01-01

    The Public Cryptography Study Group accepted the claim made by the National Security Agency that some information in some publications concerning cryptology could be inimical to national security, and is allowing the establishment of a voluntary mechanism, on an experimental basis, for NSA to review cryptology manuscripts. (MLW)

  15. A copyright protection scheme for digital images based on shuffled singular value decomposition and visual cryptography.

    Science.gov (United States)

    Devi, B Pushpa; Singh, Kh Manglem; Roy, Sudipta

    2016-01-01

    This paper proposes a new watermarking algorithm based on the shuffled singular value decomposition and the visual cryptography for copyright protection of digital images. It generates the ownership and identification shares of the image based on visual cryptography. It decomposes the image into low and high frequency sub-bands. The low frequency sub-band is further divided into blocks of same size after shuffling it and then the singular value decomposition is applied to each randomly selected block. Shares are generated by comparing one of the elements in the first column of the left orthogonal matrix with its corresponding element in the right orthogonal matrix of the singular value decomposition of the block of the low frequency sub-band. The experimental results show that the proposed scheme clearly verifies the copyright of the digital images, and is robust to withstand several image processing attacks. Comparison with the other related visual cryptography-based algorithms reveals that the proposed method gives better performance. The proposed method is especially resilient against the rotation attack.

  16. Nonlinear laser dynamics from quantum dots to cryptography

    CERN Document Server

    Lüdge, Kathy

    2012-01-01

    A distinctive discussion of the nonlinear dynamical phenomena of semiconductor lasers. The book combines recent results of quantum dot laser modeling with mathematical details and an analytic understanding of nonlinear phenomena in semiconductor lasers and points out possible applications of lasers in cryptography and chaos control. This interdisciplinary approach makes it a unique and powerful source of knowledge for anyone intending to contribute to this field of research.By presenting both experimental and theoretical results, the distinguished authors consider solitary lase

  17. Cryptography with chaos and shadowing

    International Nuclear Information System (INIS)

    Smaoui, Nejib; Kanso, Ali

    2009-01-01

    In this paper, we present a novel approach to encrypt a message (a text composed by some alphabets) using chaos and shadowing. First, we generate a numerical chaotic orbit based on the logistic map, and use the shadowing algorithm of Smaoui and Kostelich [Smaoui N, Kostelich E. Using chaos to shadow the quadratic map for all time. Int J Comput Math 1998;70:117-29] to show that there exists a finite number of true orbits that shadow the numerical orbit. Then, the finite number of maps generated is used in Baptista's algorithm [Baptista MS. Cryptography with chaos. Phys Lett A 1998;240:50-4] to encrypt each character of the message. It is shown that the use of chaos and shadowing in the encryption process enhances the security level.
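
    The Baptista-style step referenced above can be sketched as a minimal illustration, without the shadowing refinement; the logistic-map parameter, the attractor interval, the alphabet, and the minimum iteration count below are assumptions.

```python
# Minimal Baptista-style encryption on the logistic map x -> r*x*(1-x).
R = 3.99                                      # chaotic regime (assumed parameter)
ALPHABET = "abcdefghijklmnopqrstuvwxyz "      # 27 symbols
LO, HI = 0.2, 0.8                             # attractor interval mapped onto the alphabet
WIDTH = (HI - LO) / len(ALPHABET)

def site(x):
    """Index of the alphabet site the trajectory currently visits (or -1 if outside)."""
    return int((x - LO) // WIDTH) if LO <= x < HI else -1

def encrypt(message, x0=0.3, min_iter=250):
    x, cipher = x0, []
    for ch in message:
        target, n = ALPHABET.index(ch), 0
        while n < min_iter or site(x) != target:   # iterate until the target site is hit
            x = R * x * (1.0 - x)
            n += 1
        cipher.append(n)                           # ciphertext = number of iterations
    return cipher

def decrypt(cipher, x0=0.3):
    x, message = x0, []
    for n in cipher:
        for _ in range(n):                         # replay the same trajectory
            x = R * x * (1.0 - x)
        message.append(ALPHABET[site(x)])
    return "".join(message)

c = encrypt("chaos")
assert decrypt(c) == "chaos"
```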

  18. Cryptography with chaos and shadowing

    Energy Technology Data Exchange (ETDEWEB)

    Smaoui, Nejib [Department of Mathematics and Computer Science, Kuwait University, P.O. Box 5969, Safat 13060 (Kuwait)], E-mail: nsmaoui64@yahoo.com; Kanso, Ali [Department of Mathematics and Computer Science, Kuwait University, P.O. Box 5969, Safat 13060 (Kuwait)], E-mail: akanso@hotmail.com

    2009-11-30

    In this paper, we present a novel approach to encrypt a message (a text composed by some alphabets) using chaos and shadowing. First, we generate a numerical chaotic orbit based on the logistic map, and use the shadowing algorithm of Smaoui and Kostelich [Smaoui N, Kostelich E. Using chaos to shadow the quadratic map for all time. Int J Comput Math 1998;70:117-29] to show that there exists a finite number of true orbits that shadow the numerical orbit. Then, the finite number of maps generated is used in Baptista's algorithm [Baptista MS. Cryptography with chaos. Phys Lett A 1998;240:50-4] to encrypt each character of the message. It is shown that the use of chaos and shadowing in the encryption process enhances the security level.

  19. One-way entangled-photon autocompensating quantum cryptography

    Science.gov (United States)

    Walton, Zachary D.; Abouraddy, Ayman F.; Sergienko, Alexander V.; Saleh, Bahaa E.; Teich, Malvin C.

    2003-06-01

    A quantum cryptography implementation is presented that uses entanglement to combine one-way operation with an autocompensating feature that has hitherto only been available in implementations that require the signal to make a round trip between the users. Using the concept of advanced waves, it is shown that this proposed implementation is related to the round-trip implementation in the same way that Ekert’s two-particle scheme is related to the original one-particle scheme of Bennett and Brassard. The practical advantages and disadvantages of the proposed implementation are discussed in the context of existing schemes.

  20. One-way entangled-photon autocompensating quantum cryptography

    International Nuclear Information System (INIS)

    Walton, Zachary D.; Abouraddy, Ayman F.; Sergienko, Alexander V.; Saleh, Bahaa E. A.; Teich, Malvin C.

    2003-01-01

    A quantum cryptography implementation is presented that uses entanglement to combine one-way operation with an autocompensating feature that has hitherto only been available in implementations that require the signal to make a round trip between the users. Using the concept of advanced waves, it is shown that this proposed implementation is related to the round-trip implementation in the same way that Ekert's two-particle scheme is related to the original one-particle scheme of Bennett and Brassard. The practical advantages and disadvantages of the proposed implementation are discussed in the context of existing schemes

  1. Experimental quantum secret sharing and third-man quantum cryptography.

    Science.gov (United States)

    Chen, Yu-Ao; Zhang, An-Ning; Zhao, Zhi; Zhou, Xiao-Qi; Lu, Chao-Yang; Peng, Cheng-Zhi; Yang, Tao; Pan, Jian-Wei

    2005-11-11

    Quantum secret sharing (QSS) and third-man quantum cryptography (TQC) are essential for advanced quantum communication; however, the low intensity and fragility of the multiphoton entanglement source in previous experiments have made their realization an extreme experimental challenge. Here, we develop and exploit an ultrastable high intensity source of four-photon entanglement to report an experimental realization of QSS and TQC. The technology developed in our experiment will be important for future multiparty quantum communication.

  2. An Incomplete Cryptography based Digital Rights Management with DCFF

    OpenAIRE

    Thanh, Ta Minh; Iwakiri, Munetoshi

    2014-01-01

    In general, DRM (Digital Rights Management) system is responsible for the safe distribution of digital content, however, DRM system is achieved with individual function modules of cryptography, watermarking and so on. In this typical system flow, it has a problem that all original digital contents are temporarily disclosed with perfect condition via decryption process. In this paper, we propose the combination of the differential codes and fragile fingerprinting (DCFF) method based on incompl...

  3. Characterization of collective Gaussian attacks and security of coherent-state quantum cryptography.

    Science.gov (United States)

    Pirandola, Stefano; Braunstein, Samuel L; Lloyd, Seth

    2008-11-14

    We provide a simple description of the most general collective Gaussian attack in continuous-variable quantum cryptography. In the scenario of such general attacks, we analyze the asymptotic secret-key rates which are achievable with coherent states, joint measurements of the quadratures and one-way classical communication.

  4. Threshold-improved predictions for charm production in deep-inelastic scattering

    International Nuclear Information System (INIS)

    Lo Presti, N.A.; Kawamura, H.; Vogt, A.

    2010-08-01

    We have extended previous results on the threshold expansion of the gluon coefficient function for the charm contribution to the deep-inelastic structure function F_2 by deriving all threshold-enhanced contributions at the next-to-next-to-leading order. The size of these corrections is briefly illustrated, and a first step towards extending this improvement to more differential charm-production cross sections is presented. (orig.)

  5. DNA Cryptography and Deep Learning using Genetic Algorithm with NW algorithm for Key Generation.

    Science.gov (United States)

    Kalsi, Shruti; Kaur, Harleen; Chang, Victor

    2017-12-05

    Cryptography is not only the science of applying complex mathematics and logic to design strong methods of hiding data, called encryption, but also of retrieving the original data back, called decryption. The purpose of cryptography is to transmit a message between a sender and receiver such that an eavesdropper is unable to comprehend it. To accomplish this, we need not only a strong algorithm, but also a strong key and a strong concept for the encryption and decryption process. We have introduced the concept of DNA Deep Learning Cryptography, defined as a technique of concealing data in terms of a DNA sequence and deep learning. In the cryptographic technique, each letter of the alphabet is converted into a different combination of the four bases that make up human deoxyribonucleic acid (DNA), namely adenine (A), cytosine (C), guanine (G) and thymine (T). Actual implementations with DNA do not exceed the laboratory level and are expensive. To bring DNA computing to a digital level, easy and effective algorithms are proposed in this paper. In the proposed work we introduce, first, a method and its implementation for key generation based on the theory of natural selection, using a Genetic Algorithm with the Needleman-Wunsch (NW) algorithm, and second, a method for the implementation of encryption and decryption based on DNA computing using the biological operations of transcription, translation, DNA sequencing and deep learning.
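
    The DNA-encoding step can be illustrated with a small sketch; the particular two-bits-per-base mapping used here is an assumption, since the paper's specific base combinations are not reproduced.

```python
# Assumed mapping: two plaintext bits per nucleotide (00->A, 01->C, 10->G, 11->T).
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {b: k for k, b in BITS_TO_BASE.items()}

def to_dna(text: str) -> str:
    """Encode text as a DNA sequence, four bases per 8-bit character."""
    bits = "".join(f"{ord(ch):08b}" for ch in text)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def from_dna(seq: str) -> str:
    """Decode a DNA sequence back to text."""
    bits = "".join(BASE_TO_BITS[b] for b in seq)
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

dna = to_dna("key")
print(dna)                  # CGGTCGCCCTGC under the assumed mapping
assert from_dna(dna) == "key"
```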

  6. Entropy-as-a-Service: Unlocking the Full Potential of Cryptography.

    Science.gov (United States)

    Vassilev, Apostol; Staples, Robert

    2016-09-01

    Securing the Internet requires strong cryptography, which depends on the availability of good entropy for generating unpredictable keys and accurate clocks. Attacks abusing weak keys or old inputs portend challenges for the Internet. EaaS is a novel architecture providing entropy and timestamps from a decentralized root of trust, scaling gracefully across diverse geopolitical locales and remaining trustworthy unless much of the collective is compromised.

  7. Iris Cryptography for Security Purpose

    Science.gov (United States)

    Ajith, Srighakollapu; Balaji Ganesh Kumar, M.; Latha, S.; Samiappan, Dhanalakshmi; Muthu, P.

    2018-04-01

    In today's world, security has become a major issue for every human being, and hacking is a particular concern: hackers are everywhere, and even as technology develops there are many cases where it fails to provide adequate security. Engineers and scientists have been developing new security products such as biometric sensors for face recognition, pattern recognition, gesture recognition, voice authentication and so on, but these devices often fail to reach the expected results. In this work, we present an approach to generate a unique secure key from the iris template. The iris templates are processed using well-defined processing techniques, and through encryption and decryption they are stored, traversed and utilized. From this work we can conclude that iris cryptography gives the expected results for securing data from eavesdroppers.

  8. A New Wavelet Threshold Function and Denoising Application

    Directory of Open Access Journals (Sweden)

    Lu Jing-yi

    2016-01-01

    Full Text Available In order to improve denoising performance, this paper introduces the basic principles of wavelet threshold denoising and of the traditional threshold functions, and proposes an improved wavelet threshold function together with an improved fixed threshold formula. First, the paper studies the problems of the traditional wavelet threshold functions and introduces adjustment factors to construct a new threshold function based on the soft threshold function. Then, it studies the fixed threshold and introduces a logarithmic function of the number of wavelet decomposition levels to design a new fixed threshold formula. Finally, the paper uses the hard threshold, soft threshold, Garrote threshold, and improved threshold functions to denoise different signals, and computes the signal-to-noise ratio (SNR) and mean square error (MSE) of the output for each of them. Theoretical analysis and experimental results show that the proposed approach remedies the constant-deviation problem of the soft threshold function and the discontinuity problem of the hard threshold function, addresses the problem of applying the same threshold value at different decomposition scales, effectively filters the noise in the signals, and improves the SNR and reduces the MSE of the output signals.
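
    For reference, standard soft-threshold denoising with the fixed universal threshold can be sketched as below; this is the baseline the paper improves upon, not the proposed threshold function itself. PyWavelets, the db4 wavelet, and the test signal are assumptions.

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Soft-threshold denoising with the universal threshold sigma*sqrt(2*ln n)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Estimate the noise level from the finest detail coefficients (robust MAD estimate).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    threshold = sigma * np.sqrt(2.0 * np.log(len(signal)))
    denoised = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)

t = np.linspace(0, 1, 1024)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.3 * np.random.randn(t.size)
recovered = wavelet_denoise(noisy)[:t.size]
snr = 10 * np.log10(np.mean(clean**2) / np.mean((recovered - clean)**2))
print(f"output SNR ~ {snr:.1f} dB")
```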

  9. Cryptography with chaos at the physical level

    International Nuclear Information System (INIS)

    Machado, Romuel F.; Baptista, Murilo S.; Grebogi, C.

    2004-01-01

    In this work, we devise a chaos-based secret key cryptography scheme for digital communication where the encryption is realized at the physical level, that is, the encrypting transformations are applied to the wave signal instead to the symbolic sequence. The encryption process consists of transformations applied to a two-dimensional signal composed of the message carrying signal and an encrypting signal that has to be a chaotic one. The secret key, in this case, is related to the number of times the transformations are applied. Furthermore, we show that due to its chaotic nature, the encrypting signal is able to hide the statistics of the original signal

  10. Security Enhanced User Authentication Protocol for Wireless Sensor Networks Using Elliptic Curves Cryptography

    Directory of Open Access Journals (Sweden)

    Younsung Choi

    2014-06-01

    Full Text Available Wireless sensor networks (WSNs consist of sensors, gateways and users. Sensors are widely distributed to monitor various conditions, such as temperature, sound, speed and pressure but they have limited computational ability and energy. To reduce the resource use of sensors and enhance the security of WSNs, various user authentication protocols have been proposed. In 2011, Yeh et al. first proposed a user authentication protocol based on elliptic curve cryptography (ECC for WSNs. However, it turned out that Yeh et al.’s protocol does not provide mutual authentication, perfect forward secrecy, and key agreement between the user and sensor. Later in 2013, Shi et al. proposed a new user authentication protocol that improves both security and efficiency of Yeh et al.’s protocol. However, Shi et al.’s improvement introduces other security weaknesses. In this paper, we show that Shi et al.’s improved protocol is vulnerable to session key attack, stolen smart card attack, and sensor energy exhausting attack. In addition, we propose a new, security-enhanced user authentication protocol using ECC for WSNs.

  11. Security enhanced user authentication protocol for wireless sensor networks using elliptic curves cryptography.

    Science.gov (United States)

    Choi, Younsung; Lee, Donghoon; Kim, Jiye; Jung, Jaewook; Nam, Junghyun; Won, Dongho

    2014-06-10

    Wireless sensor networks (WSNs) consist of sensors, gateways and users. Sensors are widely distributed to monitor various conditions, such as temperature, sound, speed and pressure but they have limited computational ability and energy. To reduce the resource use of sensors and enhance the security of WSNs, various user authentication protocols have been proposed. In 2011, Yeh et al. first proposed a user authentication protocol based on elliptic curve cryptography (ECC) for WSNs. However, it turned out that Yeh et al.'s protocol does not provide mutual authentication, perfect forward secrecy, and key agreement between the user and sensor. Later in 2013, Shi et al. proposed a new user authentication protocol that improves both security and efficiency of Yeh et al.'s protocol. However, Shi et al.'s improvement introduces other security weaknesses. In this paper, we show that Shi et al.'s improved protocol is vulnerable to session key attack, stolen smart card attack, and sensor energy exhausting attack. In addition, we propose a new, security-enhanced user authentication protocol using ECC for WSNs.
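
    The ECC primitive underlying such protocols is elliptic-curve Diffie-Hellman key agreement; the sketch below shows that generic primitive with the Python cryptography package, not the specific authentication protocol analyzed or proposed in the paper.

```python
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Each party (e.g. user and sensor/gateway) generates an EC key pair on P-256.
user_key = ec.generate_private_key(ec.SECP256R1())
sensor_key = ec.generate_private_key(ec.SECP256R1())

# Both sides derive the same shared secret from their private key and the peer's public key.
user_shared = user_key.exchange(ec.ECDH(), sensor_key.public_key())
sensor_shared = sensor_key.exchange(ec.ECDH(), user_key.public_key())
assert user_shared == sensor_shared

# A session key is then derived from the shared secret with a KDF.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"wsn session").derive(user_shared)
print(session_key.hex())
```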

  12. Coding, cryptography and combinatorics

    CERN Document Server

    Niederreiter, Harald; Xing, Chaoping

    2004-01-01

    It has long been recognized that there are fascinating connections between cod­ ing theory, cryptology, and combinatorics. Therefore it seemed desirable to us to organize a conference that brings together experts from these three areas for a fruitful exchange of ideas. We decided on a venue in the Huang Shan (Yellow Mountain) region, one of the most scenic areas of China, so as to provide the additional inducement of an attractive location. The conference was planned for June 2003 with the official title Workshop on Coding, Cryptography and Combi­ natorics (CCC 2003). Those who are familiar with events in East Asia in the first half of 2003 can guess what happened in the end, namely the conference had to be cancelled in the interest of the health of the participants. The SARS epidemic posed too serious a threat. At the time of the cancellation, the organization of the conference was at an advanced stage: all invited speakers had been selected and all abstracts of contributed talks had been screened by the p...

  13. The Design and Evaluation of a Cryptography Teaching Strategy for Software Engineering Students

    Science.gov (United States)

    Dowling, T.

    2006-01-01

    The present paper describes the design, implementation and evaluation of a cryptography module for final-year software engineering students. The emphasis is on implementation architectures and practical cryptanalysis rather than a standard mathematical approach. The competitive continuous assessment process reflects this approach and rewards…

  14. Development of the polarization tracking scheme for free-space quantum cryptography

    Science.gov (United States)

    Toyoshima, Morio; Takayama, Yoshihisa; Kunimori, Hiroo; Takeoka, Masahiro; Fujiwara, Mikio; Sasaki, Masahide

    2008-04-01

    Quantum cryptography is a new technique for transmitting quantum information. The information is securely transmitted due to the laws of physics. In such systems, the vehicle that transfers quantum information is a single photon. The problem with using photons is that the transmission distance is limited by the absorption of the photons by the optical fiber along which they pass. The maximum demonstrated range so far is approximately 100 km. Using free-space quantum cryptography between a ground station and a satellite is a possible way of sending quantum information farther than is possible with optical fibers. This is because there is no birefringence effect in the atmosphere. However, there is a complication in that the directions of the polarization basis between the transmitter and the receiver must coincide with each other. This polarization changes because the mobile terminals for free-space transmission continuously change their attitudes. If the transmission protocol is based on polarization, it is necessary to compensate for the change in attitude between the mobile terminals. We are developing a scheme to track the polarization basis between the transceivers. The preliminary result is presented.

  15. Geometry, algebra and applications from mechanics to cryptography

    CERN Document Server

    Encinas, Luis; Gadea, Pedro; María, Mª

    2016-01-01

    This volume collects contributions written by different experts in honor of Prof. Jaime Muñoz Masqué. It covers a wide variety of research topics, from differential geometry to algebra, but particularly focuses on the geometric formulation of variational calculus; geometric mechanics and field theories; symmetries and conservation laws of differential equations, and pseudo-Riemannian geometry of homogeneous spaces. It also discusses algebraic applications to cryptography and number theory. It offers state-of-the-art contributions in the context of current research trends. The final result is a challenging panoramic view of connecting problems that initially appear distant.

  16. Novel optical scanning cryptography using Fresnel telescope imaging.

    Science.gov (United States)

    Yan, Aimin; Sun, Jianfeng; Hu, Zhijuan; Zhang, Jingtao; Liu, Liren

    2015-07-13

    We propose a new method called modified optical scanning cryptography using Fresnel telescope imaging technique for encryption and decryption of remote objects. An image or object can be optically encrypted on the fly by Fresnel telescope scanning system together with an encryption key. For image decryption, the encrypted signals are received and processed with an optical coherent heterodyne detection system. The proposed method has strong performance through use of secure Fresnel telescope scanning with orthogonal polarized beams and efficient all-optical information processing. The validity of the proposed method is demonstrated by numerical simulations and experimental results.

  17. Cryptanalysis of Application of Laplace Transform for Cryptography

    Directory of Open Access Journals (Sweden)

    Gençoğlu Muharrem Tuncay

    2017-01-01

    Full Text Available Although the Laplace Transform is a good application field in the design of cryptosystems, many cryptographic algorithm proposals turn out to be unsatisfactory for secure communication. One significant disadvantage of the analyzed algorithm is that its security analysis consists only of statistical tests. In this cryptanalysis study, we explain what should be considered when performing a security analysis of Laplace-Transform-based encryption systems and, using basic mathematical rules, break the cipher without knowing the secret key. In essence, this study is a refutation of the article titled Application of Laplace Transform for Cryptography written by Hiwerakar [3].

  18. Introduction to number theory with cryptography

    CERN Document Server

    Kraft, James S

    2013-01-01

    Table of contents: Introduction (Diophantine Equations; Modular Arithmetic; Primes and the Distribution of Primes; Cryptography); Divisibility (Divisibility; Euclid's Theorem; Euclid's Original Proof; The Sieve of Eratosthenes; The Division Algorithm; The Greatest Common Divisor; The Euclidean Algorithm; Other Bases; Linear Diophantine Equations; The Postage Stamp Problem; Fermat and Mersenne Numbers; Chapter Highlights; Problems); Unique Factorization (Preliminary Results; The Fundamental Theorem of Arithmetic; Euclid and the Fundamental Theorem of Arithmetic; Chapter Highlights; Problems); Applications of Unique Factorization (A Puzzle; Irrationality Proofs; The Rational Root Theorem; Pythagorean Triples; Differences of Squares; Prime Factorization of Factorials; The Riemann Zeta Function; Chapter Highlights; Problems); Congruences (Definitions and Examples; Modular Exponentiation; Divisibility Tests; Linear Congruences; The Chinese Remainder Theorem; Fractions mod m; Fermat's Theorem; Euler's Theorem; Wilson's Theorem; Queens on a Chessboard; Chapter Highlights; Problems); Cryptographic App...

  19. Microscale optical cryptography using a subdiffraction-limit optical key

    Science.gov (United States)

    Ogura, Yusuke; Aino, Masahiko; Tanida, Jun

    2018-04-01

    We present microscale optical cryptography using a subdiffraction-limit optical pattern, which is finer than the diffraction-limit size of the decrypting optical system, as a key and a substrate with a reflectance distribution as an encrypted image. Because of the subdiffraction-limit spatial coding, this method enables us to construct a secret image with the diffraction-limit resolution. Simulation and experimental results demonstrate, both qualitatively and quantitatively, that the secret image becomes recognizable when and only when the substrate is illuminated with the designed key pattern.

  20. Deterministic and efficient quantum cryptography based on Bell's theorem

    International Nuclear Information System (INIS)

    Chen, Z.-B.; Zhang, Q.; Bao, X.-H.; Schmiedmayer, J.; Pan, J.-W.

    2005-01-01

    Full text: We propose a novel double-entanglement-based quantum cryptography protocol that is both efficient and deterministic. The proposal uses photon pairs with entanglement both in polarization and in time degrees of freedom; each measurement in which both of the two communicating parties register a photon can establish a key bit with the help of classical communications. Eavesdropping can be detected by checking the violation of local realism for the detected events. We also show that our protocol allows a robust implementation under current technology. (author)

  1. QC-LDPC code-based cryptography

    CERN Document Server

    Baldi, Marco

    2014-01-01

    This book describes the fundamentals of cryptographic primitives based on quasi-cyclic low-density parity-check (QC-LDPC) codes, with a special focus on the use of these codes in public-key cryptosystems derived from the McEliece and Niederreiter schemes. In the first part of the book, the main characteristics of QC-LDPC codes are reviewed, and several techniques for their design are presented, while tools for assessing the error correction performance of these codes are also described. Some families of QC-LDPC codes that are best suited for use in cryptography are also presented. The second part of the book focuses on the McEliece and Niederreiter cryptosystems, both in their original forms and in some subsequent variants. The applicability of QC-LDPC codes in these frameworks is investigated by means of theoretical analyses and numerical tools, in order to assess their benefits and drawbacks in terms of system efficiency and security. Several examples of QC-LDPC code-based public key cryptosystems are prese...

  2. Quantum key distribution and cryptography

    International Nuclear Information System (INIS)

    Alleaume, R.

    2005-01-01

    Full text: Originally proposed by classical cryptographers, the ideas behind Quantum Key Distribution (QKD) have attracted considerable interest among the quantum optics community, which has significantly helped bring these ideas to reality. Experimental realizations have quickly evolved from early lab demonstrations to QKD systems that are now deployed in real conditions and targeting commercial applications. Although QKD can be theoretically proven to rely on 'unconditional security proofs' and should thus be able to provide security levels unachievable through computationally-based cryptographic techniques, the debate on the cryptographic applications of QKD remains somehow controversial. It seems that a consensus on that matter cannot be reached without a careful analysis of assumptions and definitions related to security models used in classical or in quantum cryptography. In this talk, we will try to present a comprehensive synthesis on this topic. We have initiated this work as a contribution to the European IP SECOQC project, confronting views and knowledge among experimental and theoretical quantum physicists, as well as classical cryptographers. (author)

  3. Improved bounds on the epidemic threshold of exact SIS models on complex networks

    KAUST Repository

    Ruhi, Navid Azizan; Thrampoulidis, Christos; Hassibi, Babak

    2017-01-01

    The SIS (susceptible-infected-susceptible) epidemic model on an arbitrary network, without making approximations, is a 2^n-state Markov chain with a unique absorbing state (the all-healthy state). This makes analysis of the SIS model and, in particular, determining the threshold of epidemic spread quite challenging. It has been shown that the exact marginal probabilities of infection can be upper bounded by an n-dimensional linear time-invariant system, a consequence of which is that the Markov chain is “fast-mixing” when the LTI system is stable, i.e. when βλmax(A)/δ < 1 (where β is the infection rate per link, δ is the recovery rate, and λmax(A) is the largest eigenvalue of the network's adjacency matrix). This well-known threshold has been recently shown not to be tight in several cases, such as in a star network. In this paper, we provide tighter upper bounds on the exact marginal probabilities of infection, by also taking pairwise infection probabilities into account. Based on this improved bound, we derive tighter eigenvalue conditions that guarantee fast mixing (i.e., logarithmic mixing time) of the chain. We demonstrate the improvement of the threshold condition by comparing the new bound with the known one on various networks with various epidemic parameters.

  4. Improved bounds on the epidemic threshold of exact SIS models on complex networks

    KAUST Repository

    Ruhi, Navid Azizan

    2017-01-05

    The SIS (susceptible-infected-susceptible) epidemic model on an arbitrary network, without making approximations, is a 2^n-state Markov chain with a unique absorbing state (the all-healthy state). This makes analysis of the SIS model and, in particular, determining the threshold of epidemic spread quite challenging. It has been shown that the exact marginal probabilities of infection can be upper bounded by an n-dimensional linear time-invariant system, a consequence of which is that the Markov chain is “fast-mixing” when the LTI system is stable, i.e. when βλmax(A)/δ < 1 (where β is the infection rate per link, δ is the recovery rate, and λmax(A) is the largest eigenvalue of the network's adjacency matrix). This well-known threshold has been recently shown not to be tight in several cases, such as in a star network. In this paper, we provide tighter upper bounds on the exact marginal probabilities of infection, by also taking pairwise infection probabilities into account. Based on this improved bound, we derive tighter eigenvalue conditions that guarantee fast mixing (i.e., logarithmic mixing time) of the chain. We demonstrate the improvement of the threshold condition by comparing the new bound with the known one on various networks with various epidemic parameters.
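
    The classical fast-mixing condition can be checked numerically for a given network by comparing β/δ with 1/λmax(A); the sketch below does this with NumPy for a star graph. The graph and the rates are illustrative assumptions, and the paper's tighter pairwise bound is not reproduced.

```python
import numpy as np

def classical_sis_threshold(adjacency, beta, delta):
    """Return (effective rate beta/delta, classical threshold 1/lambda_max)."""
    lam_max = np.max(np.linalg.eigvalsh(adjacency))
    return beta / delta, 1.0 / lam_max

# Star network on n nodes: hub connected to n-1 leaves (lambda_max = sqrt(n-1)).
n = 101
A = np.zeros((n, n))
A[0, 1:] = A[1:, 0] = 1.0

ratio, threshold = classical_sis_threshold(A, beta=0.05, delta=0.8)
print(f"beta/delta = {ratio:.3f}, 1/lambda_max = {threshold:.3f}")
print("classical condition satisfied (fast mixing):", ratio < threshold)
```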

  5. Optical cryptography topology based on a three-dimensional particle-like distribution and diffractive imaging.

    Science.gov (United States)

    Chen, Wen; Chen, Xudong

    2011-05-09

    In recent years, coherent diffractive imaging has been considered as a promising alternative for information retrieval instead of conventional interference methods. Coherent diffractive imaging using the X-ray light source has opened up a new research perspective for the measurement of non-crystalline and biological specimens, and can achieve unprecedentedly high resolutions. In this paper, we show how a three-dimensional (3D) particle-like distribution and coherent diffractive imaging can be applied for a study of optical cryptography. An optical multiple-random-phase-mask encoding approach is used, and the plaintext is considered as a series of particles distributed in a 3D space. A topology concept is also introduced into the proposed optical cryptosystem. During image decryption, a retrieval algorithm is developed to extract the plaintext from the ciphertexts. In addition, security and advantages of the proposed optical cryptography topology are also analyzed. © 2011 Optical Society of America

  6. A NOVEL ROLLING BASED DNA CRYPTOGRAPHY

    Directory of Open Access Journals (Sweden)

    Rejwana Haque

    2017-05-01

    Full Text Available DNA cryptography can be defined as hiding data in terms of a DNA sequence. In this paper we propose a new DNA encryption technique in which three different types of ordering are used to turn binary data into ciphertext. The main stages of the encryption technique are Key Analysis, Data and Key Arrangement, Roll-in Encoding, Secondary Arrangement and Shifting. The decryption process has six main steps to obtain the original binary data from the encrypted data and the key: Key Analysis, Shifting, Secondary Arrangement, Key Arrangement, Roll-out Decoding and Data Arrangement. Here the key size is half of the binary data, and since the key varies from data to data it is used as a one-time pad. We also discuss an implementation on sample data and a security analysis of the method.

  7. Tight finite-key analysis for quantum cryptography.

    Science.gov (United States)

    Tomamichel, Marco; Lim, Charles Ci Wen; Gisin, Nicolas; Renner, Renato

    2012-01-17

    Despite enormous theoretical and experimental progress in quantum cryptography, the security of most current implementations of quantum key distribution is still not rigorously established. One significant problem is that the security of the final key strongly depends on the number, M, of signals exchanged between the legitimate parties. Yet, existing security proofs are often only valid asymptotically, for unrealistically large values of M. Another challenge is that most security proofs are very sensitive to small differences between the physical devices used by the protocol and the theoretical model used to describe them. Here we show that these gaps between theory and experiment can be simultaneously overcome by using a recently developed proof technique based on the uncertainty relation for smooth entropies.

  8. Fast, efficient error reconciliation for quantum cryptography

    International Nuclear Information System (INIS)

    Buttler, W.T.; Lamoreaux, S.K.; Torgerson, J.R.; Nickel, G.H.; Donahue, C.H.; Peterson, C.G.

    2003-01-01

    We describe an error-reconciliation protocol, which we call Winnow, based on the exchange of parity and Hamming's 'syndrome' for N-bit subunits of a large dataset. The Winnow protocol was developed in the context of quantum-key distribution and offers significant advantages and net higher efficiency compared to other widely used protocols within the quantum cryptography community. A detailed mathematical analysis of the Winnow protocol is presented in the context of practical implementations of quantum-key distribution; in particular, the information overhead required for secure implementation is one of the most important criteria in the evaluation of a particular error-reconciliation protocol. The increase in efficiency for the Winnow protocol is largely due to the reduction in authenticated public communication required for its implementation
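
    The first stage of such parity-based reconciliation can be sketched as follows; only the block-parity comparison is shown, while the Hamming-syndrome exchange and privacy maintenance steps of the full Winnow protocol are omitted, and the block size and sample keys are assumptions.

```python
def block_parities(key, block_size=8):
    """Parity of each block_size-bit block of a key (key is a list of 0/1 ints)."""
    return [sum(key[i:i + block_size]) % 2
            for i in range(0, len(key), block_size)]

def mismatched_blocks(alice_key, bob_key, block_size=8):
    """Indices of blocks whose parities disagree; these blocks contain an odd
    number of errors and would be passed on to the syndrome-exchange step."""
    pa = block_parities(alice_key, block_size)
    pb = block_parities(bob_key, block_size)
    return [i for i, (a, b) in enumerate(zip(pa, pb)) if a != b]

# Hypothetical sifted keys differing in bit 10 (a single channel error).
alice = [1, 0, 1, 1, 0, 0, 1, 0] * 4
bob = list(alice)
bob[10] ^= 1
print(mismatched_blocks(alice, bob))   # -> [1]
```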

  9. Implementation and Analysis Audio Steganography Used Parity Coding for Symmetric Cryptography Key Delivery

    Directory of Open Access Journals (Sweden)

    Afany Zeinata Firdaus

    2013-12-01

    Full Text Available In today's era of communication, the number of online data transactions is increasing and information is ever more accessible, both for upload and download, so a capable security system is required. Blowfish cryptography combined with audio steganography is one way to secure data so that it cannot be accessed by unauthorized parties. In this study an audio steganography technique is implemented using the parity coding method to deliver the Blowfish cryptographic key in an Android-based e-commerce application. The results show that the average computation time of the insertion stage (embedding the secret message) is shorter than the average computation time of the extraction stage (extracting the secret message). The tests also show that the more characters are embedded, the greater the received noise: the highest SNR, 11.9905 dB, is obtained when 506 characters are inserted, while the lowest SNR, 5.6897 dB, is obtained when 2006 characters are inserted. Keywords: audio steganography, parity coding, embedding, extraction, Blowfish cryptography.
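
    The parity-coding idea can be sketched as follows: each key bit is spread over a block of audio samples, and if the parity of the block's least significant bits disagrees with the bit, one LSB in the block is flipped. This is a minimal sketch; the sample values, block size, and key fragment are assumptions.

```python
def embed_parity(samples, bits, block=16):
    """Embed one bit per block of samples by adjusting the block's LSB parity."""
    out = list(samples)
    for i, bit in enumerate(bits):
        blk = out[i * block:(i + 1) * block]
        if (sum(s & 1 for s in blk) % 2) != bit:
            out[i * block] ^= 1          # flip one LSB to fix the parity
    return out

def extract_parity(samples, n_bits, block=16):
    """Recover the bits from the LSB parity of each block."""
    return [sum(s & 1 for s in samples[i * block:(i + 1) * block]) % 2
            for i in range(n_bits)]

# Hypothetical 16-bit PCM samples carrying a 4-bit fragment of a Blowfish key.
audio = [1000 + 7 * k for k in range(64)]
key_bits = [1, 0, 1, 1]
stego = embed_parity(audio, key_bits)
assert extract_parity(stego, 4) == key_bits
```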

  10. SURVEY ON CLOUD SECURITY BY DATA ENCRYPTION USING ELLIPTIC CURVE CRYPTOGRAPHY

    OpenAIRE

    Akanksha Tomar*, Jamwant Kumbhre

    2016-01-01

    Cloud computing is one of the latest technology trends of the IT trade for the business area. Cloud computing security has become a demanding topic in information technology and computer science research programs. Cloud computing is a conceptual service-based technology which is widely used by many companies these days. An Elliptic Curve Cryptography based algorithm provides highly secure communication, data integrity and authentication, along with non-repudiation communicat...

  11. Quantum cryptography using a photon source based on postselection from entangled two-photon states

    Czech Academy of Sciences Publication Activity Database

    Peřina ml., Jan; Haderka, Ondřej; Soubusta, Jan

    2001-01-01

    Roč. 64, - (2001), s. 052305-1-152305-13 ISSN 1050-2947 R&D Projects: GA MŠk LN00A015 Institutional research plan: CEZ:AV0Z1010914 Keywords : quantum cryptography * photon number squeezing Subject RIV: BH - Optics, Masers, Lasers Impact factor: 2.810, year: 2001

  12. Quantum cryptography: Theoretical protocols for quantum key distribution and tests of selected commercial QKD systems in commercial fiber networks

    Science.gov (United States)

    Jacak, Monika; Jacak, Janusz; Jóźwiak, Piotr; Jóźwiak, Ireneusz

    2016-06-01

    The overview of the current status of quantum cryptography is given in regard to quantum key distribution (QKD) protocols, implemented both on nonentangled and entangled flying qubits. Two commercial R&D platforms of QKD systems are described (the Clavis II platform by idQuantique implemented on nonentangled photons and the EPR S405 Quelle platform by AIT based on entangled photons) and tested for feasibility of their usage in commercial TELECOM fiber metropolitan networks. The comparison of systems efficiency, stability and resistivity against noise and hacker attacks is given with some suggestion toward system improvement, along with assessment of two models of QKD.

  13. Analysis of limiting information characteristics of quantum-cryptography protocols

    International Nuclear Information System (INIS)

    Sych, D V; Grishanin, Boris A; Zadkov, Viktor N

    2005-01-01

    The problem of increasing the critical error rate of quantum-cryptography protocols by varying a set of letters in a quantum alphabet for space of a fixed dimensionality is studied. Quantum alphabets forming regular polyhedra on the Bloch sphere and the continual alphabet equally including all the quantum states are considered. It is shown that, in the absence of basis reconciliation, a protocol with the tetrahedral alphabet has the highest critical error rate among the protocols considered, while after the basis reconciliation, a protocol with the continual alphabet possesses the highest critical error rate. (quantum optics and quantum computation)

  14. Deterministic and efficient quantum cryptography based on Bell's theorem

    International Nuclear Information System (INIS)

    Chen Zengbing; Pan Jianwei; Zhang Qiang; Bao Xiaohui; Schmiedmayer, Joerg

    2006-01-01

    We propose a double-entanglement-based quantum cryptography protocol that is both efficient and deterministic. The proposal uses photon pairs with entanglement both in polarization and in time degrees of freedom; each measurement in which both of the two communicating parties register a photon can establish one and only one perfect correlation, and thus deterministically create a key bit. Eavesdropping can be detected by violation of local realism. A variation of the protocol shows a higher security, similar to the six-state protocol, under individual attacks. Our scheme allows a robust implementation under the current technology

  15. Entanglement witnessing and quantum cryptography with nonideal ferromagnetic detectors

    Science.gov (United States)

    Kłobus, Waldemar; Grudka, Andrzej; Baumgartner, Andreas; Tomaszewski, Damian; Schönenberger, Christian; Martinek, Jan

    2014-03-01

    We investigate theoretically the use of nonideal ferromagnetic contacts as a means to detect quantum entanglement of electron spins in transport experiments. We use a designated entanglement witness and find a minimal spin polarization of η > 1/√3 ≈ 58% required to demonstrate spin entanglement. This is significantly less stringent than the ubiquitous tests of Bell's inequality with η > 2^(-1/4) ≈ 84%. In addition, we discuss the impact of decoherence and noise on entanglement detection and apply the presented framework to a simple quantum cryptography protocol. Our results are directly applicable to a large variety of experiments.

  16. Constant-current regulator improves tunnel diode threshold-detector performance

    Science.gov (United States)

    Cancro, C. A.

    1965-01-01

    Grounded-base transistor is placed in a tunnel diode threshold detector circuit, and a bias voltage is applied to the tunnel diode. This provides the threshold detector with maximum voltage output and overload protection.

  17. Automatic video shot boundary detection using k-means clustering and improved adaptive dual threshold comparison

    Science.gov (United States)

    Sa, Qila; Wang, Zhihui

    2018-03-01

    At present, content-based video retrieval (CBVR) is the most mainstream video retrieval method, using a video's own features to perform automatic identification and retrieval. This method involves a key technology, i.e. shot segmentation. In this paper, a method for automatic video shot boundary detection with K-means clustering and improved adaptive dual-threshold comparison is proposed. First, the visual features of every frame are extracted and divided into two categories using the K-means clustering algorithm, namely, one with significant change and one with no significant change. Then, based on the classification results, the improved adaptive dual-threshold comparison method is used to determine both abrupt and gradual shot boundaries. Finally, an automatic video shot boundary detection system is achieved.
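
    As a rough illustration of the pipeline described in this record, the following Python sketch clusters per-frame difference features with k-means and then applies an adaptive dual-threshold test for abrupt and gradual boundaries. The feature (mean absolute grey-level difference), the threshold scale factors and the two-cluster k-means are simplifying assumptions, not the authors' implementation.

        import numpy as np

        def frame_difference(frames):
            """Mean absolute grey-level difference between consecutive frames."""
            f = [np.asarray(fr, dtype=float) for fr in frames]
            return np.array([np.mean(np.abs(f[i + 1] - f[i])) for i in range(len(f) - 1)])

        def kmeans_1d(x, iters=20):
            """Two-cluster k-means on a 1-D feature: significant vs. insignificant change."""
            c = np.array([x.min(), x.max()], dtype=float)
            for _ in range(iters):
                labels = np.abs(x[:, None] - c[None, :]).argmin(axis=1)
                for k in (0, 1):
                    if np.any(labels == k):
                        c[k] = x[labels == k].mean()
            return labels, c

        def detect_shot_boundaries(frames, low_scale=1.5, high_scale=4.0):
            d = frame_difference(frames)
            labels, centres = kmeans_1d(d)
            base = d[labels == centres.argmin()].mean()          # typical "no change" level
            t_low, t_high = low_scale * base, high_scale * base  # adaptive dual thresholds
            cuts, graduals, acc, start = [], [], 0.0, None
            for i, v in enumerate(d):
                if v >= t_high:
                    cuts.append(i)                               # abrupt boundary
                    acc, start = 0.0, None
                elif v >= t_low:
                    if start is None:
                        start = i
                    acc += v
                    if acc >= t_high:                            # accumulated change -> gradual boundary
                        graduals.append((start, i))
                        acc, start = 0.0, None
                else:
                    acc, start = 0.0, None
            return cuts, graduals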

  18. An Application-Independent Cryptography Model That Is Easy to Use for All Level Users

    Science.gov (United States)

    Gabrielson, Anthony J.

    2013-01-01

    Cryptography libraries are inflexible and difficult for developers to integrate with their applications. These difficulties are often encountered by applications, like PGP, which are non-intuitive for end-users and are often used improperly or not at all. This thesis discusses the negative impact of the current prevailing poor usability on…

  19. Position-based quantum cryptography over untrusted networks

    International Nuclear Information System (INIS)

    Nadeem, Muhammad

    2014-01-01

    In this article, we propose quantum position verification (QPV) schemes where all the channels are untrusted except the position of the prover and the distant reference stations of the verifiers. We review and analyze the existing QPV schemes containing some pre-shared data between the prover and verifiers. Most of these schemes are based on non-cryptographic assumptions, i.e. that the quantum/classical channels between the verifiers are secure. This seems impractical in an environment fully controlled by adversaries and would lead to security compromises in practical implementations. In contrast, our proposed formula for QPV is more robust and secure and accords with the standard assumptions of cryptography. Furthermore, once the position of the prover is verified, our schemes establish secret keys in parallel and can be used for authentication and secret communication between the prover and verifiers. (paper)

  20. Dynamic visual cryptography on deformable finite element grids

    Science.gov (United States)

    Aleksiene, S.; Vaidelys, M.; Aleksa, A.; Ragulskis, M.

    2017-07-01

    A dynamic visual cryptography scheme based on time-averaged moiré fringes on deformable finite element grids is introduced in this paper. A predefined eigenshape function is used for the selection of the pitch of the moiré grating. The relationship between the pitch of the moiré grating, the roots of the zero-order Bessel function of the first kind and the amplitude of harmonic oscillations is derived and validated by computational experiments. A phase regularization algorithm is used in the entire area of the cover image in order to embed the secret image and to avoid large fluctuations of the moiré grating. Computational simulations are used to demonstrate the efficiency and the applicability of the proposed image hiding technique.

  1. PREFACE: Quantum Information, Communication, Computation and Cryptography

    Science.gov (United States)

    Benatti, F.; Fannes, M.; Floreanini, R.; Petritis, D.

    2007-07-01

    The application of quantum mechanics to information related fields such as communication, computation and cryptography is a fast growing line of research that has been witnessing an outburst of theoretical and experimental results, with possible practical applications. On the one hand, quantum cryptography with its impact on secrecy of transmission is having its first important actual implementations; on the other hand, the recent advances in quantum optics, ion trapping, BEC manipulation, spin and quantum dot technologies allow us to put to direct test a great deal of theoretical ideas and results. These achievements have stimulated a reborn interest in various aspects of quantum mechanics, creating a unique interplay between physics, both theoretical and experimental, mathematics, information theory and computer science. In view of all these developments, it appeared timely to organize a meeting where graduate students and young researchers could be exposed to the fundamentals of the theory, while senior experts could exchange their latest results. The activity was structured as a school followed by a workshop, and took place at The Abdus Salam International Center for Theoretical Physics (ICTP) and The International School for Advanced Studies (SISSA) in Trieste, Italy, from 12-23 June 2006. The meeting was part of the activity of the Joint European Master Curriculum Development Programme in Quantum Information, Communication, Cryptography and Computation, involving the Universities of Cergy-Pontoise (France), Chania (Greece), Leuven (Belgium), Rennes1 (France) and Trieste (Italy). This special issue of Journal of Physics A: Mathematical and Theoretical collects 22 contributions from well known experts who took part in the workshop. They summarize the present day status of the research in the manifold aspects of quantum information. The issue is opened by two review articles, the first by G Adesso and F Illuminati discussing entanglement in continuous variable

  2. One-time pad, complexity of verification of keys, and practical security of quantum cryptography

    Energy Technology Data Exchange (ETDEWEB)

    Molotkov, S. N., E-mail: sergei.molotkov@gmail.com [Russian Academy of Sciences, Institute of Solid State Physics (Russian Federation)

    2016-11-15

    A direct relation between the complexity of the complete verification of keys, which is one of the main criteria of security in classical systems, and a trace distance used in quantum cryptography is demonstrated. Bounds for the minimum and maximum numbers of verification steps required to determine the actual key are obtained.

  3. One-time pad, complexity of verification of keys, and practical security of quantum cryptography

    International Nuclear Information System (INIS)

    Molotkov, S. N.

    2016-01-01

    A direct relation between the complexity of the complete verification of keys, which is one of the main criteria of security in classical systems, and a trace distance used in quantum cryptography is demonstrated. Bounds for the minimum and maximum numbers of verification steps required to determine the actual key are obtained.

  4. A Luggage Control System Based on NFC and Homomorphic Cryptography

    Directory of Open Access Journals (Sweden)

    Néstor Álvarez-Díaz

    2017-01-01

    We propose an innovative luggage tracking and management system that can be used to secure airport terminal services and reduce the waiting time of passengers during check-in. This addresses an urgent need to streamline and optimize passenger flows at airport terminals and lowers the risk of terrorist threats. The system employs Near Field Communication (NFC) technology and homomorphic cryptography (the Paillier cryptosystem) to protect wireless communication and stored data. A security analysis and a performance test show the usability and applicability of the proposed system.

  5. Integrating real-time subsurface hydrologic monitoring with empirical rainfall thresholds to improve landslide early warning

    Science.gov (United States)

    Mirus, Benjamin B.; Becker, Rachel E.; Baum, Rex L.; Smith, Joel B.

    2018-01-01

    Early warning for rainfall-induced shallow landsliding can help reduce fatalities and economic losses. Although these commonly occurring landslides are typically triggered by subsurface hydrological processes, most early warning criteria rely exclusively on empirical rainfall thresholds and other indirect proxies for subsurface wetness. We explore the utility of explicitly accounting for antecedent wetness by integrating real-time subsurface hydrologic measurements into landslide early warning criteria. Our efforts build on previous progress with rainfall thresholds, monitoring, and numerical modeling along the landslide-prone railway corridor between Everett and Seattle, Washington, USA. We propose a modification to previously established recent versus antecedent (RA) cumulative rainfall thresholds by replacing the antecedent 15-day rainfall component with the average saturation observed over the same timeframe. We calculate this antecedent saturation with real-time telemetered measurements from five volumetric water content probes installed in the shallow subsurface within a steep vegetated hillslope. Our hybrid rainfall versus saturation (RS) threshold still relies on the same recent 3-day rainfall component as the existing RA thresholds, to facilitate ready integration with quantitative precipitation forecasts. During the 2015–2017 monitoring period, this RS hybrid approach yields more true positives and fewer false positives and false negatives than the previous RA rainfall-only thresholds. We also demonstrate that alternative hybrid threshold formats could be even more accurate, which suggests that further development and testing during future landslide seasons is needed. These positive results confirm that accounting for antecedent wetness conditions with direct subsurface hydrologic measurements can improve thresholds for alert systems and early warning of rainfall-induced shallow landsliding.
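
    A minimal sketch of what such a hybrid recent-rainfall versus antecedent-saturation (RS) alert rule can look like in code. The linear threshold form and the coefficients a and b below are hypothetical placeholders, not the published threshold; only the structure (3-day rainfall compared against a saturation-dependent limit) follows the description above.

        def rs_alert(recent_3day_rain_mm, antecedent_saturation, a=80.0, b=120.0):
            """Return True when recent rainfall exceeds a saturation-dependent threshold.

            antecedent_saturation: mean saturation (0..1) over the previous 15 days,
            derived from telemetered volumetric water content probes.
            """
            threshold_mm = a - b * (antecedent_saturation - 0.5)  # wetter ground -> less rain needed
            return recent_3day_rain_mm >= threshold_mm

        # Example: 60 mm of rain over 3 days falling on ground that is 85% saturated
        print(rs_alert(60.0, 0.85))   # True with these placeholder coefficients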

  6. An Online Banking System Based on Quantum Cryptography Communication

    Science.gov (United States)

    Zhou, Ri-gui; Li, Wei; Huan, Tian-tian; Shen, Chen-yi; Li, Hai-sheng

    2014-07-01

    In this paper, an online banking system has been built. Based on quantum cryptography communication, this system is proved unconditionally secure. Two sets of GHZ states are applied, which can ensure the safety of purchase and payment, respectively. In other words, the three trading participants in each triplet state group form an interdependent and interactive relationship. In the meantime, trading authorization and blind signature are introduced by means of controllable quantum teleportation. Thus, effective monitoring is practiced on the premise that the privacy of the trading partners is guaranteed. If there is a dispute or deceptive behavior, the system will identify the deceiver immediately according to the relationship mentioned above.

  7. Mesoscopic quantum cryptography

    Energy Technology Data Exchange (ETDEWEB)

    Molotkov, S. N., E-mail: sergei.molotkov@gmail.com [Russian Academy of Sciences, Institute of Solid State Physics (Russian Federation)

    2017-03-15

    Since a strictly single-photon source is not yet available, in quantum cryptography systems, one uses, as information quantum states, coherent radiation of a laser with an average number of photons of μ ≈ 0.1–0.5 in a pulse, attenuated to the quasi-single-photon level. The linear independence of a set of coherent quasi-single-photon information states leads to the possibility of unambiguous measurements that, in the presence of losses in the line, restrict the transmission range of secret keys. Starting from a certain value of critical loss (the length of the line), the eavesdropper knows the entire key, does not make errors, and is not detected; the distribution of secret keys becomes impossible. This problem is solved by introducing an additional reference state with an average number of photons of μ_cl ≈ 10^3–10^6, depending on the length of the communication line. It is shown that the use of a reference state does not allow the eavesdropper to carry out measurements with conclusive outcome while remaining undetected. A reference state guarantees detecting an eavesdropper in a channel with high losses. In this case, information states may contain a mesoscopic average number of photons in the range of μ_q ≈ 0.5–10^2. The protocol proposed is easy to implement technically, admits flexible adjustment of parameters to the length of the communication line, and is simple and transparent for proving the secrecy of keys.

  8. Optical asymmetric cryptography using a three-dimensional space-based model

    International Nuclear Information System (INIS)

    Chen, Wen; Chen, Xudong

    2011-01-01

    In this paper, we present optical asymmetric cryptography combined with a three-dimensional (3D) space-based model. An optical multiple-random-phase-mask encoding system is developed in the Fresnel domain, and one random phase-only mask and the plaintext are combined as a series of particles. Subsequently, the series of particles is translated along an axial direction, and is distributed in a 3D space. During image decryption, the robustness and security of the proposed method are further analyzed. Numerical simulation results are presented to show the feasibility and effectiveness of the proposed optical image encryption method

  9. Improving the segmentation of therapy-induced leukoencephalopathy using apriori information and a gradient magnitude threshold

    Science.gov (United States)

    Glass, John O.; Reddick, Wilburn E.; Reeves, Cara; Pui, Ching-Hon

    2004-05-01

    Reliably quantifying therapy-induced leukoencephalopathy in children treated for cancer is a challenging task due to its varying MR properties and similarity to normal tissues and imaging artifacts. T1, T2, PD, and FLAIR images were analyzed for a subset of 15 children from an institutional protocol for the treatment of acute lymphoblastic leukemia. Three different analysis techniques were compared to examine improvements in the segmentation accuracy of leukoencephalopathy versus manual tracings by two expert observers. The first technique utilized no apriori information and a white matter mask based on the segmentation of the first serial examination of each patient. MR images were then segmented with a Kohonen Self-Organizing Map. The other two techniques combine apriori maps from the ICBM atlas spatially normalized to each patient and resliced using SPM99 software. The apriori maps were included as input and a gradient magnitude threshold calculated on the FLAIR images was also utilized. The second technique used a 2-dimensional threshold, while the third algorithm utilized a 3-dimensional threshold. Kappa values were compared for the three techniques to each observer, and improvements were seen with each addition to the original algorithm (Observer 1: 0.651, 0.653, 0.744; Observer 2: 0.603, 0.615, 0.699).

  10. Femtosecond Laser--Pumped Source of Entangled Photons for Quantum Cryptography Applications

    International Nuclear Information System (INIS)

    Pan, D.; Donaldson, W.; Sobolewski, R.

    2007-01-01

    We present an experimental setup for generation of entangled-photon pairs via spontaneous parametric down-conversion, based on a femtosecond-pulsed laser. Our entangled-photon source utilizes a 76-MHz-repetition-rate, 100-fs-pulse-width, mode-locked, ultrafast femtosecond laser, which can produce, on average, more photon pairs than a cw laser of equal pump power. The resulting entangled pairs are counted by a pair of high-quantum-efficiency, single-photon, silicon avalanche photodiodes. Our apparatus is intended as an efficient source/receiver system for quantum communications and quantum cryptography applications.

  11. Secure Programming Cookbook for C and C++ Recipes for Cryptography, Authentication, Input Validation & More

    CERN Document Server

    Viega, John

    2009-01-01

    Secure Programming Cookbook for C and C++ is an important new resource for developers serious about writing secure code for Unix® (including Linux®) and Windows® environments. This essential code companion covers a wide range of topics, including safe initialization, access control, input validation, symmetric and public key cryptography, cryptographic hashes and MACs, authentication and key exchange, PKI, random numbers, and anti-tampering.

  12. Adaptive Wavelet Threshold Denoising Method for Machinery Sound Based on Improved Fruit Fly Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Jing Xu

    2016-07-01

    As the sound signal of a machine contains abundant information and is easy to measure, acoustic-based monitoring or diagnosis systems exhibit obvious superiority, especially in some extreme conditions. However, the sound directly collected in an industrial field is always polluted. In order to eliminate noise components from machinery sound, a wavelet threshold denoising method optimized by an improved fruit fly optimization algorithm (WTD-IFOA) is proposed in this paper. The sound is firstly decomposed by the wavelet transform (WT) to obtain the coefficients of each level. As the wavelet threshold functions proposed by Donoho were discontinuous, many modified functions with continuous first- and second-order derivatives have been presented to realize adaptive denoising. However, the function-based denoising process is time-consuming and it is difficult to find optimal thresholds. To overcome these problems, the fruit fly optimization algorithm (FOA) was introduced into the process. Moreover, to avoid falling into local extremes, an improved fly distance range obeying a normal distribution was proposed on the basis of the original FOA. Then, the sound signal of a motor was recorded in a soundproof laboratory, and Gaussian white noise was added to the signal. The simulation results illustrated the effectiveness and superiority of the proposed approach through a comprehensive comparison among five typical methods. Finally, an industrial application on a shearer in a coal mining working face was performed to demonstrate the practical effect.
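
    The decompose / threshold / reconstruct core of wavelet threshold denoising can be sketched as below, assuming the PyWavelets package is available. The per-level thresholds that the paper tunes with the improved fruit fly optimization algorithm are replaced here by a single universal threshold, so the sketch shows the denoising pipeline rather than the optimization step.

        import numpy as np
        import pywt  # PyWavelets

        def wavelet_denoise(signal, wavelet="db8", level=4, thr=None):
            """Soft-threshold wavelet denoising of a 1-D signal."""
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            if thr is None:
                sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise estimate from finest level
                thr = sigma * np.sqrt(2.0 * np.log(len(signal)))    # universal threshold
            shrunk = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(shrunk, wavelet)[: len(signal)]

        # Example: a tone buried in Gaussian white noise, as in the motor-sound simulation
        t = np.linspace(0, 1, 4096)
        noisy = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.randn(t.size)
        denoised = wavelet_denoise(noisy)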

  13. Real Time MODBUS Transmissions and Cryptography Security Designs and Enhancements of Protocol Sensitive Information

    Directory of Open Access Journals (Sweden)

    Aamir Shahzad

    2015-07-01

    Information technology (IT) security has become a major concern due to the growing demand for information and the massive development of client/server applications for various types of applications running on modern IT infrastructure. How has security been taken into account, and which paradigms are necessary to minimize security issues while increasing efficiency, reducing the influence on transmissions, ensuring protocol independence and achieving substantial performance? We have found cryptography to be an absolute security mechanism for client/server architectures, and in this study, a new security design was developed with the MODBUS protocol, which is considered to offer phenomenal performance for future development and enhancement of real IT infrastructure. This study is also considered to be a complete development because security is tested in almost all modes of MODBUS communication. The computed measurements are evaluated to validate the overall development, and the results indicate a substantial improvement in security that is differentiated from conventional methods.

  14. Handbook of elliptic and hyperelliptic curve cryptography

    CERN Document Server

    Cohen, Henri; Avanzi, Roberto; Doche, Christophe; Lange, Tanja; Nguyen, Kim; Vercauteren, Frederik

    2005-01-01

    … very comprehensive coverage of this vast subject area … a useful and essential treatise for anyone involved in elliptic curve algorithms … this book offers the opportunity to grasp the ECC technology with a diversified and comprehensive perspective. … This book will remain on my shelf for a long time and will land on my desk on many occasions, if only because the coverage of the issues common to factoring and discrete log cryptosystems is excellent.-IACR Book Reviews, June 2011… the book is designed for people who are working in the area and want to learn more about a specific issue. The chapters are written to be relatively independent so that readers can focus on the part of interest for them. Such readers will be grateful for the excellent index and extensive bibliography. … the handbook covers a wide range of topics and will be a valuable reference for researchers in curve-based cryptography. -Steven D. Galbraith, Mathematical Reviews, Issue 2007f.

  15. Encrypted Objects and Decryption Processes: Problem-Solving with Functions in a Learning Environment Based on Cryptography

    Science.gov (United States)

    White, Tobin

    2009-01-01

    This paper introduces an applied problem-solving task, set in the context of cryptography and embedded in a network of computer-based tools. This designed learning environment engaged students in a series of collaborative problem-solving activities intended to introduce the topic of functions through a set of linked representations. In a…

  16. Characterizing multi-photon quantum interference with practical light sources and threshold single-photon detectors

    Science.gov (United States)

    Navarrete, Álvaro; Wang, Wenyuan; Xu, Feihu; Curty, Marcos

    2018-04-01

    The experimental characterization of multi-photon quantum interference effects in optical networks is essential in many applications of photonic quantum technologies, which include quantum computing and quantum communication as two prominent examples. However, such characterization often requires technologies which are beyond our current experimental capabilities, and today's methods suffer from errors due to the use of imperfect sources and photodetectors. In this paper, we introduce a simple experimental technique to characterize multi-photon quantum interference by means of practical laser sources and threshold single-photon detectors. Our technique is based on well-known methods in quantum cryptography which use decoy settings to tightly estimate the statistics provided by perfect devices. As an illustration of its practicality, we use this technique to obtain a tight estimation of both the generalized Hong‑Ou‑Mandel dip in a beamsplitter with six input photons and the three-photon coincidence probability at the output of a tritter.

  17. Hardening CISCO Devices based on Cryptography and Security Protocols - Part One: Background Theory

    Directory of Open Access Journals (Sweden)

    Faisal Waheed

    2018-07-01

    Network security is a vital part of any corporate and enterprise network. Network attacks greatly compromise not only the sensitive data of the consumers but also cause outages to these networks. Thus inadequately protected networks need to be “hardened”. The hardening of network devices refers to the hardware and software components, device operating system features, management controls, access-list restrictions, operational configurations and, above all, making sure that the data and credentials are not stored or transferred in ‘plaintext’ over the network. This article investigates the use of cryptography and network protocols based on encryption to meet the need for essential security requirements. The use of non-secure protocols and the underrating or misconfiguration of management protection are reasons why network devices are not properly hardened, hence leaving vulnerabilities for intruders. The gap identified after conducting an intense search and review of past work is used as the foundation to present solutions. When performing cryptography techniques by encrypting packets using tunnelling and security protocols, management-level credentials are encrypted. These include password encryption and exceptional analysis of the emulated IOS (Internetwork Operating System). Necessary testing is carried out to evaluate an acceptable level of protection of these devices. In a virtual testing environment, security flaws are found mainly in the emulated IOS. The discoveries do not depend on the hardware or chassis of a networking device. Since routers primarily rely on their operating system (OS), attackers focus on manipulating the command line configuration before initiating an attack. Substantial work is devoted to the implementation and testing of cryptography and security protocols in the border router. This is deployed at the core layer and acts as the first point of entry for any trusted and untrusted traffic. A step

  18. Breaking the Unbreakable : Exploiting Loopholes in Bell’s Theorem to Hack Quantum Cryptography

    OpenAIRE

    Jogenfors, Jonathan

    2017-01-01

    In this thesis we study device-independent quantum key distribution based on energy-time entanglement. This is a method for cryptography that promises not only perfect secrecy, but also to be a practical method for quantum key distribution thanks to the reduced complexity when compared to other quantum key distribution protocols. However, there still exist a number of loopholes that must be understood and eliminated in order to rule out eavesdroppers. We study several relevant loopholes and s...

  19. Disorder generated by interacting neural networks: application to econophysics and cryptography

    International Nuclear Information System (INIS)

    Kinzel, Wolfgang; Kanter, Ido

    2003-01-01

    When neural networks are trained on their own output signals they generate disordered time series. In particular, when two neural networks are trained on their mutual output they can synchronize; they relax to a time-dependent state with identical synaptic weights. Two applications of this phenomenon are discussed for (a) econophysics and (b) cryptography. (a) When agents competing in a closed market (minority game) are using neural networks to make their decisions, the total system relaxes to a state of good performance. (b) Two partners communicating over a public channel can find a common secret key
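
    The mutual-learning mechanism behind application (b) is usually formulated with tree parity machines. The sketch below, with textbook parameter choices (K=3 hidden units, N=100 inputs, weight bound L=3) rather than any values from the paper, shows two networks trained on their mutual output until their weights become identical and can serve as a shared key.

        import numpy as np

        class TreeParityMachine:
            def __init__(self, K=3, N=100, L=3, rng=None):
                self.K, self.N, self.L = K, N, L
                rng = rng or np.random.default_rng()
                self.w = rng.integers(-L, L + 1, size=(K, N))

            def output(self, x):
                self.sigma = np.sign(np.sum(self.w * x, axis=1))
                self.sigma[self.sigma == 0] = -1
                return int(np.prod(self.sigma))

            def update(self, x, tau):
                # Hebbian rule: adapt only the hidden units that agree with the common output.
                for k in range(self.K):
                    if self.sigma[k] == tau:
                        self.w[k] = np.clip(self.w[k] + tau * x[k], -self.L, self.L)

        def synchronize(max_steps=100000, seed=1):
            rng = np.random.default_rng(seed)
            a, b = TreeParityMachine(rng=rng), TreeParityMachine(rng=rng)
            for step in range(max_steps):
                x = rng.choice([-1, 1], size=(a.K, a.N))   # public random inputs
                ta, tb = a.output(x), b.output(x)
                if ta == tb:                               # update only on agreement
                    a.update(x, ta)
                    b.update(x, tb)
                if np.array_equal(a.w, b.w):               # identical weights = shared secret
                    return step, a.w
            return None, None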

  20. EDITORIAL: Focus on Quantum Cryptography: Theory and Practice FOCUS ON QUANTUM CRYPTOGRAPHY: THEORY AND PRACTICE

    Science.gov (United States)

    Lütkenhaus, N.; Shields, A. J.

    2009-04-01

    Quantum cryptography, and especially quantum key distribution (QKD), is steadily progressing to become a viable tool for cryptographic services. In recent years we have witnessed a dramatic increase in the secure bit rate of QKD, as well as its extension to ever longer fibre- and air-based links and the emergence of metro-scale trusted networks. In the foreseeable future even global-scale communications may be possible using quantum repeaters or Earth-satellite links. A handful of start-ups and some bigger companies are already active in the field. The launch of an initiative to form industrial standards for QKD, under the auspices of the European Telecommunication Standards Institute, described in the paper by Laenger and Lenhart in this Focus Issue, can be taken as a sign of the growing commercial interest. Recent progress has seen an increase in the secure bit rate of QKD links, by orders of magnitude, to over 1 Mb s-1. This has resulted mainly from an improvement in the detection technology. Here changes in the way conventional semiconductor detectors are gated, as well as the development of novel devices based on non-linear processes and superconducting materials, are leading the way. Additional challenges for QKD at GHz clock rates include the design of high speed electronics, remote synchronization and high rate random number generation. Substantial effort is being devoted to increasing the range of individual links, which is limited by attenuation and other losses in optical fibres and air links. An important advance in the past few years has been the introduction of protocols with the same scaling as an ideal single-photon set-up. The good news is that these schemes use standard optical devices, such as weak laser pulses. Thanks to these new protocols and improvements in the detection technology, the range of a single fibre link can exceed a few hundred km. Outstanding issues include proving the unconditional security of some of the schemes. Much of the

  1. General Theory of Decoy-State Quantum Cryptography with Dark Count Rate Fluctuation

    International Nuclear Information System (INIS)

    Xiang, Gao; Shi-Hai, Sun; Lin-Mei, Liang

    2009-01-01

    The existing theory of decoy-state quantum cryptography assumes that the dark count rate is a constant, but in practice it fluctuates. We develop a new decoy-state scheme, achieve a more practical key generation rate in the presence of fluctuation of the dark count rate, and compare the result with that of the decoy state without fluctuation. It is found that the key generation rate and the maximal secure distance will be decreased under the influence of the fluctuation of the dark count rate.
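
    For orientation, the standard weak-coherent-pulse rate model used in decoy-state analyses makes the role of the dark count explicit; the short sketch below evaluates a key-rate lower bound for a few values of the background yield Y0 and shows the rate dropping as Y0 grows. The channel and detector parameters are illustrative assumptions, and the single-photon yield and error estimates are the ideal (infinite-decoy) ones rather than the bounds derived in the paper.

        import numpy as np

        def h2(x):
            x = np.clip(x, 1e-12, 1 - 1e-12)
            return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

        def key_rate(mu, eta, Y0, e_det, f_ec=1.16, e0=0.5):
            Q_mu = Y0 + 1.0 - np.exp(-eta * mu)                       # overall gain
            E_mu = (e0 * Y0 + e_det * (1.0 - np.exp(-eta * mu))) / Q_mu
            Y1 = Y0 + eta                                             # single-photon yield
            Q1 = Y1 * mu * np.exp(-mu)                                # single-photon gain
            e1 = (e0 * Y0 + e_det * eta) / Y1                         # single-photon error rate
            R = 0.5 * (-Q_mu * f_ec * h2(E_mu) + Q1 * (1.0 - h2(e1)))
            return max(R, 0.0)

        eta = 0.1 * 10 ** (-0.21 * 50 / 10)   # 10% detector efficiency, 50 km of 0.21 dB/km fiber
        for Y0 in (1e-5, 2e-5, 5e-5):         # fluctuating dark-count contribution
            print(Y0, key_rate(mu=0.5, eta=eta, Y0=Y0, e_det=0.033))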

  2. Finite automata over magmas: models and some applications in Cryptography

    Directory of Open Access Journals (Sweden)

    Volodymyr V. Skobelev

    2018-05-01

    In the paper the families of finite semi-automata and reversible finite Mealy and Moore automata over finite magmas are defined and analyzed in detail. On the basis of these models it is established that the set of finite quasigroups is the most acceptable subset of the set of finite magmas for resolving model problems in cryptography, such as the design of iterated hash functions and stream ciphers. The defined families of finite semi-automata and reversible finite automata over finite T-quasigroups are investigated in detail. It is established that in this case the time and space complexity of simulating one step of the automaton's operation can be much lower than in the general case.
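
    To make the connection to stream ciphers concrete, the following toy shows the quasigroup (e-)transformation that such constructions typically build on: encryption walks through the Cayley table of a quasigroup, and decryption uses left division. The order-4 Latin square and the leader element are arbitrary illustrative choices, not taken from the paper.

        Q = [            # Cayley table of a quasigroup of order 4 (a Latin square)
            [2, 1, 0, 3],
            [1, 3, 2, 0],
            [3, 0, 1, 2],
            [0, 2, 3, 1],
        ]
        LD = [[row.index(c) for c in range(4)] for row in Q]   # left division: Q[a][LD[a][c]] == c

        def e_transform(message, leader):
            """c_1 = leader * m_1,  c_i = c_(i-1) * m_i  -- a simple stream-cipher-like map."""
            out, prev = [], leader
            for m in message:
                prev = Q[prev][m]
                out.append(prev)
            return out

        def d_transform(cipher, leader):
            """Inverse transformation via left division."""
            out, prev = [], leader
            for c in cipher:
                out.append(LD[prev][c])
                prev = c
            return out

        msg = [0, 3, 2, 1, 1, 0, 2]
        assert d_transform(e_transform(msg, leader=1), leader=1) == msg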

  3. Improvement of the damage threshold of high reflectivity multidielectric coatings 1.06 μM

    International Nuclear Information System (INIS)

    Geenen, B.; Malherbes, A.; Guerain, J.; Boisgard, D.

    1985-01-01

    Development of new high-power lasers for laser-matter interaction at C.E.A. Limeil requires the realization of H.R. coatings with damage thresholds above 8 J/cm². In 1982 the damage threshold of commercial mirrors produced by MATRA's laboratory ''couches minces optiques'' (thin optical layers) was around 3.5 J/cm². In order to obtain better results the authors decided to improve the control of evaporation parameters such as: vacuum and regulation of the oxygen pressure by means of a mass spectrometer; better measurement of the evaporation temperature and regulation of the evaporation rate; measurement and control of the substrate temperature by pyrometric observation; and to automate the process. These different measurements and controls enabled them to establish new processing procedures giving better evaporation conditions. The result was an increase of the damage threshold from 3.5 J/cm² to 8 J/cm².

  4. All-optical cryptography of M-QAM formats by using two-dimensional spectrally sliced keys.

    Science.gov (United States)

    Abbade, Marcelo L F; Cvijetic, Milorad; Messani, Carlos A; Alves, Cleiton J; Tenenbaum, Stefan

    2015-05-10

    There has been an increased interest in enhancing the security of optical communications systems and networks. All-optical cryptography methods have been considered as an alternative to electronic data encryption. In this paper we propose and verify the use of a novel all-optical scheme based on cryptographic keys applied on the spectral signal for encryption of the M-QAM modulated data with bit rates of up to 200 gigabits per second.

  5. Lead-chalcogenide mid-infrared vertical external cavity surface emitting lasers with improved threshold: Theory and experiment

    Science.gov (United States)

    Fill, Matthias; Debernardi, Pierluigi; Felder, Ferdinand; Zogg, Hans

    2013-11-01

    Mid-infrared Vertical External Cavity Surface Emitting Lasers (VECSEL) based on narrow gap lead-chalcogenide (IV-VI) semiconductors exhibit strongly reduced threshold powers if the active layers are structured laterally for improved optical confinement. This is predicted by 3-d optical calculations; they show that lateral optical confinement is needed to counteract the anti-guiding features of IV-VIs due to their negative temperature dependence of the refractive index. An experimental proof is performed with PbSe quantum well based VECSEL grown on a Si-substrate by molecular beam epitaxy and emitting around 3.3 μm. With proper mesa-etching, the threshold intensity is about 8-times reduced.

  6. Lead-chalcogenide mid-infrared vertical external cavity surface emitting lasers with improved threshold: Theory and experiment

    Energy Technology Data Exchange (ETDEWEB)

    Fill, Matthias [ETH Zurich, Laser Spectroscopy and Sensing Lab, 8093 Zurich (Switzerland); Phocone AG, 8005 Zurich (Switzerland); Debernardi, Pierluigi [IEIIT-CNR, Torino 10129 (Italy); Felder, Ferdinand [Phocone AG, 8005 Zurich (Switzerland); Zogg, Hans [ETH Zurich (Switzerland)

    2013-11-11

    Mid-infrared Vertical External Cavity Surface Emitting Lasers (VECSEL) based on narrow gap lead-chalcogenide (IV-VI) semiconductors exhibit strongly reduced threshold powers if the active layers are structured laterally for improved optical confinement. This is predicted by 3-d optical calculations; they show that lateral optical confinement is needed to counteract the anti-guiding features of IV-VIs due to their negative temperature dependence of the refractive index. An experimental proof is performed with PbSe quantum well based VECSEL grown on a Si-substrate by molecular beam epitaxy and emitting around 3.3 μm. With proper mesa-etching, the threshold intensity is about 8-times reduced.

  7. Oxytocin administration selectively improves olfactory detection thresholds for lyral in patients with schizophrenia.

    Science.gov (United States)

    Woolley, J D; Lam, O; Chuang, B; Ford, J M; Mathalon, D H; Vinogradov, S

    2015-03-01

    Olfaction plays an important role in mammalian social behavior. Olfactory deficits are common in schizophrenia and correlate with negative symptoms and low social drive. Despite their prominence and possible clinical relevance, little is understood about the pathological mechanisms underlying olfactory deficits in schizophrenia and there are currently no effective treatments for these deficits. The prosocial neuropeptide oxytocin may affect the olfactory system when administered intranasally to humans and there is growing interest in its therapeutic potential in schizophrenia. To examine this model, we administered 40IU of oxytocin and placebo intranasally to 31 patients with a schizophrenia spectrum illness and 34 age-matched healthy control participants in a randomized, double-blind, placebo-controlled, cross-over study. On each test day, participants completed an olfactory detection threshold test for two different odors: (1) lyral, a synthetic fragrance compound for which patients with schizophrenia have specific olfactory detection threshold deficits, possibly related to decreased cyclic adenosine 3',5'-monophosphate (cAMP) signaling; and (2) anise, a compound for which olfactory detection thresholds change with menstrual cycle phase in women. On the placebo test day, patients with schizophrenia did not significantly differ from healthy controls in detection of either odor. We found that oxytocin administration significantly and selectively improved olfactory detection thresholds for lyral but not for anise in patients with schizophrenia. In contrast, oxytocin had no effect on detection of either odor in healthy controls. Our data indicate that oxytocin administration may ameliorate olfactory deficits in schizophrenia and suggest the effects of intranasal oxytocin may extend to influencing the olfactory system. Given that oxytocin has been found to increase cAMP signaling in vitro a possible mechanism for these effects is discussed. Published by Elsevier Ltd.

  8. No information flow using statistical fluctuations and quantum cryptography

    Science.gov (United States)

    Larsson, Jan-Åke

    2004-04-01

    The communication protocol of Home and Whitaker [Phys. Rev. A 67, 022306 (2003)] is examined in some detail, and found to work equally well using a separable state. The protocol is in fact completely classical, based on postselection of suitable experimental runs. The quantum-cryptography protocol proposed in the same publication is also examined, and this protocol uses entanglement, a strictly quantum property of the system. An individual eavesdropping attack on each qubit pair would be detected by the security test proposed in the mentioned paper. However, the key is provided by groups of qubits, and there exists a coherent attack, internal to these groups, that will go unnoticed in that security test. A modified test is proposed here that will ensure security, even against such a coherent attack.

  9. No information flow using statistical fluctuations and quantum cryptography

    International Nuclear Information System (INIS)

    Larsson, Jan-Aake

    2004-01-01

    The communication protocol of Home and Whitaker [Phys. Rev. A 67, 022306 (2003)] is examined in some detail, and found to work equally well using a separable state. The protocol is in fact completely classical, based on postselection of suitable experimental runs. The quantum-cryptography protocol proposed in the same publication is also examined, and this protocol uses entanglement, a strictly quantum property of the system. An individual eavesdropping attack on each qubit pair would be detected by the security test proposed in the mentioned paper. However, the key is provided by groups of qubits, and there exists a coherent attack, internal to these groups, that will go unnoticed in that security test. A modified test is proposed here that will ensure security, even against such a coherent attack

  10. Effect of Smaller Left Ventricular Capture Threshold Safety Margins to Improve Device Longevity in Recipients of Cardiac Resynchronization-Defibrillation Therapy.

    Science.gov (United States)

    Steinhaus, Daniel A; Waks, Jonathan W; Collins, Robert; Kleckner, Karen; Kramer, Daniel B; Zimetbaum, Peter J

    2015-07-01

    Device longevity in cardiac resynchronization therapy (CRT) is affected by the pacing capture threshold (PCT) and programmed pacing amplitude of the left ventricular (LV) pacing lead. The aims of this study were to evaluate the stability of LV pacing thresholds in a nationwide sample of CRT defibrillator recipients and to determine potential longevity improvements associated with a decrease in the LV safety margin while maintaining effective delivery of CRT. CRT defibrillator patients in the Medtronic CareLink database were eligible for inclusion. LV PCT stability was evaluated using ≥2 measurements over a 14-day period. Separately, a random sample of 7,250 patients with programmed right atrial and right ventricular amplitudes ≤2.5 V, LV thresholds ≤2.5 V, and LV pacing ≥90% were evaluated to estimate the theoretical battery longevity improvement using LV safety margins of 0.5 and 1.5 V. Threshold stability analysis in 43,256 patients demonstrated LV PCT stability of <1.0 V in nearly all patients; patients whose programmed LV amplitude could be reduced by >1 V had the greatest increases in battery life (mean increase 0.86 years, 95% confidence interval 0.85 to 0.87). In conclusion, nearly all CRT defibrillator patients had LV PCT stability <1.0 V. Decreasing the LV safety margin from 1.5 to 0.5 V provided consistent delivery of CRT for most patients and significantly improved battery longevity. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Comment on 'Two-way protocols for quantum cryptography with a nonmaximally entangled qubit pair'

    International Nuclear Information System (INIS)

    Qin Sujuan; Gao Fei; Wen Qiaoyan; Guo Fenzhuo

    2010-01-01

    Three protocols of quantum cryptography with a nonmaximally entangled qubit pair [Phys. Rev. A 80, 022323 (2009)] were recently proposed by Shimizu, Tamaki, and Fukasaka. The security of these protocols is based on the quantum-mechanical constraint for a state transformation between nonmaximally entangled states. However, we find that the second protocol is vulnerable under the correlation-elicitation attack. An eavesdropper can obtain the encoded bit M although she has no knowledge about the random bit R.

  12. Design of secure digital communication systems using chaotic modulation, cryptography and chaotic synchronization

    International Nuclear Information System (INIS)

    Chien, T.-I.; Liao, T.-L.

    2005-01-01

    This paper presents a secure digital communication system based on chaotic modulation, cryptography, and chaotic synchronization techniques. The proposed system consists of a Chaotic Modulator (CM), a Chaotic Secure Transmitter (CST), a Chaotic Secure Receiver (CSR) and a Chaotic Demodulator (CDM). The CM module incorporates a chaotic system and a novel Chaotic Differential Peaks Keying (CDPK) modulation scheme to generate analog patterns corresponding to the input digital bits. The CST and CSR modules are designed such that a single scalar signal is transmitted in the public channel. Furthermore, by giving certain structural conditions of a particular class of chaotic system, the CST and the nonlinear observer-based CSR with an appropriate observer gain are constructed to synchronize with each other. These two slave systems are driven simultaneously by the transmitted signal and are designed to synchronize and generate appropriate cryptography keys for encryption and decryption purposes. In the CDM module, a nonlinear observer is designed to estimate the chaotic modulating system in the CM. A demodulation mechanism is then applied to decode the transmitted input digital bits. The effectiveness of the proposed scheme is demonstrated through the numerical simulation of an illustrative communication system. Synchronization between the chaotic circuits of the transmitter and receiver modules is guaranteed through the Lyapunov stability theorem. Finally, the security features of the proposed system in the event of attack by an intruder in either the time domain or the frequency domain are discussed
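
    The drive-response synchronization idea at the heart of such schemes can be illustrated with a Lorenz system: the transmitter sends only the scalar x(t), and the receiver's (y, z) subsystem driven by that signal converges to the transmitter's state. This minimal sketch (forward-Euler integration, standard Lorenz parameters) omits the chaotic modulation, key generation and observer-gain design described in the paper.

        import numpy as np

        sigma, rho, beta, dt = 10.0, 28.0, 8.0 / 3.0, 1e-3

        def step_transmitter(s):
            x, y, z = s
            return s + dt * np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

        def step_receiver(r, x_drive):
            y, z = r
            return r + dt * np.array([x_drive * (rho - z) - y, x_drive * y - beta * z])

        tx = np.array([1.0, 1.0, 1.0])
        rx = np.array([-5.0, 12.0])            # receiver starts far from the transmitter state
        for _ in range(60000):
            tx = step_transmitter(tx)
            rx = step_receiver(rx, tx[0])      # only the scalar x travels over the public channel

        print(np.abs(tx[1:] - rx))             # synchronization error, close to zero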

  13. VLSI IMPLEMENTATION OF NOVEL ROUND KEYS GENERATION SCHEME FOR CRYPTOGRAPHY APPLICATIONS BY ERROR CONTROL ALGORITHM

    Directory of Open Access Journals (Sweden)

    B. SENTHILKUMAR

    2015-05-01

    A novel implementation of a code-based cryptography (cryptocoding) technique for a multi-layer key distribution scheme is presented. A VLSI chip is designed for storing information on the generation of round keys. A new algorithm is developed for reduced key size with optimal performance. An error control algorithm is employed both for the generation of round keys and for the diffusion of non-linearity among them. Two new functions for bit inversion and its reversal are developed for cryptocoding. The probability of retrieving the original key from any other round key is reduced by diffusing nonlinear selective bit inversions on the round keys. Randomized selective bit inversions are done on equal-length key bits by a round-constant feedback shift register within the error correction limits of the chosen code. The complexity of retrieving the original key from any other round key is increased by optimal hardware usage. The proposed design is simulated and synthesized using VHDL coding for a Spartan3E FPGA and the results are shown. A comparative analysis is done between 128-bit Advanced Encryption Standard round keys and the proposed round keys to show the security strength of the proposed algorithm. This paper concludes that the chip-based multi-layer key distribution of the proposed algorithm is an enhanced solution to the existing threats on cryptography algorithms.

  14. Authentication in insecure environments using visual cryptography and non-transferable credentials in practise

    CERN Document Server

    Pape, Sebastian

    2014-01-01

    Sebastian Pape discusses two different scenarios for authentication. On the one hand, users cannot trust their devices and nevertheless want to be able to do secure authentication. On the other hand, users may not want to be tracked while their service provider does not want them to share their credentials. Many users may not be able to determine whether their device is trustworthy, i.e. it might contain malware. One solution is to use visual cryptography for authentication. The author generalizes this concept to human-decipherable encryption schemes and establishes a relationship to CAPTCHAs.

  15. Field test of a practical secure communication network with decoy-state quantum cryptography.

    Science.gov (United States)

    Chen, Teng-Yun; Liang, Hao; Liu, Yang; Cai, Wen-Qi; Ju, Lei; Liu, Wei-Yue; Wang, Jian; Yin, Hao; Chen, Kai; Chen, Zeng-Bing; Peng, Cheng-Zhi; Pan, Jian-Wei

    2009-04-13

    We present a secure network communication system that operated with decoy-state quantum cryptography in a real-world application scenario. The full key exchange and application protocols were performed in real time among three nodes, in which two adjacent nodes were connected by approximately 20 km of commercial telecom optical fiber. The generated quantum keys were immediately employed and demonstrated for communication applications, including unbreakable real-time voice telephony between any two of the three communication nodes, or a broadcast from one node to the other two nodes by using one-time pad encryption.

  16. A Novel Basis Splitting Eavesdropping Scheme in Quantum Cryptography Based on the BB84 Protocol

    International Nuclear Information System (INIS)

    Zhao Nan; Zhu Chang-Hua; Quan Dong-Xiao

    2015-01-01

    We propose a novel strategy named the basis-splitting scheme, which splits the intercepted quanta into several portions based on different bases, for eavesdropping in the process of quantum cryptography. Compared with the intercept-resend strategy, our simulation results for the basis-splitting scheme under non-ideal conditions show better performance, especially with the increase of the length of shifted bits. Consequently our scheme can aid an eavesdropper in gathering much more useful information. (paper)

  17. Improved protein structure reconstruction using secondary structures, contacts at higher distance thresholds, and non-contacts.

    Science.gov (United States)

    Adhikari, Badri; Cheng, Jianlin

    2017-08-29

    Residue-residue contacts are key features for accurate de novo protein structure prediction. For the optimal utilization of these predicted contacts in folding proteins accurately, it is important to study the challenges of reconstructing protein structures using true contacts. Because the contact-guided protein modeling approach is valuable for predicting the folds of proteins that do not have structural templates, it is necessary for reconstruction studies to focus on hard-to-predict protein structures. Using a data set consisting of 496 structural domains released in recent CASP experiments and a dataset of 150 representative protein structures, in this work, we discuss three techniques to improve the reconstruction accuracy using true contacts - adding secondary structures, increasing contact distance thresholds, and adding non-contacts. We find that reconstruction using secondary structures and contacts can deliver accuracy higher than using full contact maps. Similarly, we demonstrate that non-contacts can improve reconstruction accuracy not only when the used non-contacts are true but also when they are predicted. On the dataset consisting of 150 proteins, we find that simply using low-ranked predicted contacts as non-contacts and adding them as additional restraints can increase the reconstruction accuracy by 5% when the reconstructed models are evaluated using TM-score. Our findings suggest that secondary structures are invaluable companions of contacts for accurate reconstruction. Confirming some earlier findings, we also find that larger distance thresholds are useful for folding many protein structures which cannot be folded using the standard definition of contacts. Our findings also suggest that for more accurate reconstruction using predicted contacts it is useful to predict contacts at higher distance thresholds (beyond 8 Å) and to predict non-contacts.

  18. System Level Design of Reconfigurable Server Farms Using Elliptic Curve Cryptography Processor Engines

    Directory of Open Access Journals (Sweden)

    Sangook Moon

    2014-01-01

    As today's hardware architectures become more and more complicated, it is getting harder to modify or improve the microarchitecture of a design at the register transfer level (RTL). Consequently, the traditional methods we have used to develop a design are not capable of coping with complex designs. In this paper, we suggest a way of designing complex digital logic circuits with a soft and advanced type of SystemVerilog at the electronic system level. We apply the concept of design-and-reuse with a high level of abstraction to implement elliptic curve crypto-processor server farms. With a level of abstraction superior to the RTL used in traditional HDL design, we successfully achieved a soft implementation of the crypto-processor server farms as well as robust test bench code with trivial effort in the same simulation environment. Otherwise, it would have required error-prone Verilog simulations for the hardware IPs and other time-consuming jobs such as C/SystemC verification for the software, sacrificing more time and effort. In the design of the elliptic curve cryptography processor engine, we propose a 3X faster GF(2^m) serial multiplication architecture.
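
    For readers unfamiliar with the arithmetic the hardware multiplier accelerates, a bit-serial GF(2^m) multiplication can be written in a few lines of software. The field GF(2^163) with the reduction polynomial x^163 + x^7 + x^6 + x^3 + 1 is used here only as an illustrative assumption; the paper does not state which field its engine targets.

        M = 163
        POLY = (1 << 163) | (1 << 7) | (1 << 6) | (1 << 3) | 1   # irreducible reduction polynomial

        def gf2m_mul(a, b, m=M, poly=POLY):
            """Shift-and-add ("serial") polynomial multiplication with interleaved reduction."""
            r = 0
            for i in reversed(range(m)):
                r <<= 1
                if (r >> m) & 1:        # reduce as soon as the degree reaches m
                    r ^= poly
                if (b >> i) & 1:
                    r ^= a
            return r

        # Sanity checks: multiplication by 1, and distributivity over XOR
        a, b, c = 0b1011011, 0x1F3, 0x2A7
        assert gf2m_mul(a, 1) == a
        assert gf2m_mul(a, b ^ c) == gf2m_mul(a, b) ^ gf2m_mul(a, c)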

  19. An improved experimental scheme for simultaneous measurement of high-resolution zero electron kinetic energy (ZEKE) photoelectron and threshold photoion (MATI) spectra

    Science.gov (United States)

    Michels, François; Mazzoni, Federico; Becucci, Maurizio; Müller-Dethlefs, Klaus

    2017-10-01

    An improved detection scheme is presented for threshold ionization spectroscopy with simultaneous recording of the Zero Electron Kinetic Energy (ZEKE) and Mass Analysed Threshold Ionisation (MATI) signals. The objective is to obtain accurate dissociation energies for larger molecular clusters by simultaneously detecting the fragment and parent ion MATI signals with identical transmission. The scheme preserves an optimal ZEKE spectral resolution together with excellent separation of the spontaneous ion and MATI signals in the time-of-flight mass spectrum. The resulting improvement in sensitivity will allow for the determination of dissociation energies in clusters with substantial mass difference between parent and daughter ions.

  20. Multiple Schemes for Mobile Payment Authentication Using QR Code and Visual Cryptography

    Directory of Open Access Journals (Sweden)

    Jianfeng Lu

    2017-01-01

    The QR code (quick response code) is widely used due to its beneficial properties, especially in the mobile payment field. However, there exists an inevitable risk in the transaction process: it is not easily perceived when an attacker tampers with or replaces the QR code that contains the merchant's beneficiary account. Thus, it is of great urgency to authenticate QR codes. In this study, we propose a novel mechanism based on a visual cryptography scheme (VCS) and aesthetic QR codes, which comprises three primary schemes for different concealment levels. The main steps of these schemes are as follows. Firstly, one original QR code is split into two shadows using multiple VC rules; secondly, the two shadows are each embedded into the same background image, and the embedded results are each fused with the same carrier QR code using the XOR mechanism of Reed-Solomon (RS) codes and the QR code error-correction mechanism. Finally, the two aesthetic QR codes can be stacked precisely and the original QR code restored according to the defined VCS. Experiments corresponding to the three proposed schemes are conducted and demonstrate the feasibility and security of the mobile payment authentication, the significant improvement in concealment of the shadows in the QR code, and the diversity of mobile payment authentication.
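
    The share-splitting step can be illustrated with a minimal XOR-based (2,2) scheme on a binary matrix standing in for a QR code: one share is a one-time random matrix and the other is its XOR with the secret, so stacking (XOR-ing) the two restores the secret exactly. The aesthetic-QR embedding and the interplay with Reed-Solomon error correction are not reproduced in this sketch.

        import numpy as np

        rng = np.random.default_rng(42)

        def split(secret):
            """secret: 2-D array of 0/1 values; returns two shares whose XOR is the secret."""
            share1 = rng.integers(0, 2, size=secret.shape)   # one-time random share
            share2 = share1 ^ secret                         # second share completes the XOR
            return share1, share2

        def stack(share1, share2):
            return share1 ^ share2

        secret = rng.integers(0, 2, size=(21, 21))           # toy 21x21 binary matrix
        s1, s2 = split(secret)
        assert np.array_equal(stack(s1, s2), secret)         # exact, lossless reconstruction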

  1. Application of visual cryptography for learning in optics and photonics

    Science.gov (United States)

    Mandal, Avikarsha; Wozniak, Peter; Vauderwange, Oliver; Curticapean, Dan

    2016-09-01

    In the age of data digitalization, important applications of optics- and photonics-based sensors and technology lie in the fields of biometrics and image processing. Protecting user data in a safe and secure way is an essential task in this area. However, traditional cryptographic protocols rely heavily on computer-aided computation. Secure protocols which rely only on human interactions are usually simpler to understand, and in many scenarios the development of such protocols is also important for ease of implementation and deployment. Visual cryptography (VC) is an encryption technique on images (or text) in which decryption is done by the human visual system. In this technique, an image is encrypted into a number of pieces (known as shares). When the printed shares are physically superimposed, the image can be decrypted with human vision. Modern digital watermarking technologies can be combined with VC for image copyright protection, where the shares can be watermarks (small identification marks) embedded in the image. Similarly, VC can be used for improving the security of biometric authentication. This paper presents the design and implementation of a practical laboratory experiment based on the concept of VC for a course in media engineering. Specifically, our contribution deals with the integration of VC into different schemes for applications like digital watermarking and biometric authentication in the field of optics and photonics. We describe the theoretical concepts and propose our infrastructure for the experiment. Finally, we evaluate the learning outcome of the experiment, as performed by the students.
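
    A laboratory exercise of this kind usually starts from the classical (2,2) construction with pixel expansion, in which physically overlaying the printed transparencies acts as a logical OR. The sketch below, with the standard 2x2 subpixel patterns as an assumed design choice, generates two such shares from a binary secret image.

        import numpy as np

        rng = np.random.default_rng(0)
        PATTERNS = [np.array([[1, 0], [0, 1]]), np.array([[0, 1], [1, 0]])]  # two black subpixels each

        def make_shares(secret):
            """secret: 2-D array with 1 = black, 0 = white."""
            h, w = secret.shape
            s1 = np.zeros((2 * h, 2 * w), dtype=int)
            s2 = np.zeros_like(s1)
            for i in range(h):
                for j in range(w):
                    p = PATTERNS[rng.integers(2)]
                    s1[2 * i:2 * i + 2, 2 * j:2 * j + 2] = p
                    # white pixel: identical blocks; black pixel: complementary blocks
                    s2[2 * i:2 * i + 2, 2 * j:2 * j + 2] = p if secret[i, j] == 0 else 1 - p
            return s1, s2

        def stack(s1, s2):
            return s1 | s2   # overlaying transparencies: black wins

        secret = rng.integers(0, 2, size=(8, 8))
        a, b = make_shares(secret)
        overlay = stack(a, b)    # black pixels -> fully black blocks, white -> half-black blocks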

  2. Information hiding based on double random-phase encoding and public-key cryptography.

    Science.gov (United States)

    Sheng, Yuan; Xin, Zhou; Alam, Mohammed S; Xi, Lu; Xiao-Feng, Li

    2009-03-02

    A novel information hiding method based on double random-phase encoding (DRPE) and the Rivest-Shamir-Adleman (RSA) public-key cryptosystem is proposed. In the proposed technique, the inherent diffusion property of DRPE is cleverly utilized to make up for the diffusion insufficiency of RSA public-key cryptography, while the RSA cryptosystem is utilized for simultaneous transmission of the ciphertext and the two phase masks, which is not possible under the DRPE technique. This technique combines the complementary advantages of the DRPE and RSA encryption techniques and brings security and convenience for efficient information transmission. Extensive numerical simulation results are presented to verify the performance of the proposed technique.
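
    The DRPE half of the scheme is easy to prototype with two FFTs; a numpy sketch of the classical Fourier-domain version is given below. The RSA layer that the paper uses to transport the ciphertext and the two phase masks is omitted, and the image and masks are random stand-ins.

        import numpy as np

        rng = np.random.default_rng(7)

        def drpe_encrypt(img, mask1, mask2):
            return np.fft.ifft2(np.fft.fft2(img * np.exp(2j * np.pi * mask1))
                                * np.exp(2j * np.pi * mask2))

        def drpe_decrypt(cipher, mask1, mask2):
            field = np.fft.ifft2(np.fft.fft2(cipher) * np.exp(-2j * np.pi * mask2))
            return np.abs(field * np.exp(-2j * np.pi * mask1))

        img = rng.random((64, 64))                  # stand-in for a plaintext image
        m1, m2 = rng.random(img.shape), rng.random(img.shape)
        cipher = drpe_encrypt(img, m1, m2)          # white-noise-like complex field
        assert np.allclose(drpe_decrypt(cipher, m1, m2), img)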

  3. Implementation of Pollard Rho attack on elliptic curve cryptography over binary fields

    Science.gov (United States)

    Wienardo, Yuliawan, Fajar; Muchtadi-Alamsyah, Intan; Rahardjo, Budi

    2015-09-01

    Elliptic Curve Cryptography (ECC) is a public key cryptosystem with a security level determined by a discrete logarithm problem called the Elliptic Curve Discrete Logarithm Problem (ECDLP). John M. Pollard proposed an algorithm for the discrete logarithm problem based on the Monte Carlo method, known as the Pollard Rho algorithm. The best currently known generic attack on ECC is the Pollard Rho algorithm. In this research we implement a modified Pollard Rho algorithm on ECC over GF(2^41). As a result, the runtime of the Pollard Rho algorithm increases exponentially with the increase of the ECC key length. This work also presents the estimated runtime of the Pollard Rho attack on ECC with longer key lengths.
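
    The collision-finding idea behind the attack can be shown compactly on the multiplicative group of a small prime field instead of an elliptic curve group; the random walk and the final congruence are the same ingredients used against ECC. All numbers below are toy values chosen for illustration, far from cryptographic sizes.

        from math import gcd

        def pollard_rho_dlog(g, h, p, n):
            """Return x with g^x = h (mod p), where n is the order of g; may return None (retry)."""
            def step(x, a, b):
                s = x % 3                                   # crude 3-way partition of the group
                if s == 0:
                    return (x * x) % p, (2 * a) % n, (2 * b) % n
                if s == 1:
                    return (x * g) % p, (a + 1) % n, b
                return (x * h) % p, a, (b + 1) % n

            x, a, b = 1, 0, 0                               # tortoise: x = g^a * h^b
            X, A, B = 1, 0, 0                               # hare
            for _ in range(n):
                x, a, b = step(x, a, b)
                X, A, B = step(*step(X, A, B))
                if x == X:                                  # collision: g^a h^b = g^A h^B
                    r = (B - b) % n
                    d = gcd(r, n)
                    if r == 0 or (a - A) % d != 0:
                        return None
                    x0 = ((a - A) % n) // d * pow(r // d, -1, n // d) % (n // d)
                    for i in range(d):                      # enumerate the d candidate solutions
                        cand = x0 + i * (n // d)
                        if pow(g, cand, p) == h:
                            return cand
                    return None
            return None

        p, g = 1019, 2
        n, y = 1, g
        while y != 1:                                       # order of g by brute force (toy size only)
            y, n = (y * g) % p, n + 1
        secret = 777 % n
        h = pow(g, secret, p)
        print(pollard_rho_dlog(g, h, p, n), secret)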

  4. Selected Areas in Cryptography - SAC 2013 : 20th International Conference, Burnaby BC, Canada, August 14-16, 2013 : Revised Selected Papers

    NARCIS (Netherlands)

    Lange, T.; Lauter, K.; Lisonek, P.

    2014-01-01

    This book constitutes the proceedings of the 20th International Conference on Selected Areas in Cryptography, SAC 2013, held in Burnaby, Canada, in August 2013. The 26 papers presented in this volume were carefully reviewed and selected from 98 submissions. They are organized in topical sections

  5. Invisible transmission in quantum cryptography using continuous variables: A proof of Eve's vulnerability

    International Nuclear Information System (INIS)

    Navez, Patrick; Gatti, Alessandra; Lugiato, Luigi A.

    2002-01-01

    By analogy to classical cryptography, we develop a quantum cryptographic scheme in which the public and private keys consist of two entangled beams of squeezed light. Analog secret information is encrypted by modulating the phase of the beam sent in public. Knowledge of the degree of nonclassical correlation between the beam quadratures measured in private and in public allows only the receiver to decrypt the secret information. Finally, with a view towards absolute security, we formally prove that any external intervention by an eavesdropper makes him vulnerable to subsequent detection

  6. Quantum cryptography with a predetermined key, using continuous-variable Einstein-Podolsky-Rosen correlations

    Science.gov (United States)

    Reid, M. D.

    2000-12-01

    Correlations of the type discussed by EPR in their original 1935 paradox for continuous variables exist for the quadrature phase amplitudes of two spatially separated fields. These correlations were first experimentally reported in 1992. We propose to use such EPR beams in quantum cryptography to transmit messages with high efficiency in such a way that the receiver and sender may later determine whether eavesdropping has occurred. The merit of the new proposal is the possibility of transmitting a reasonably secure yet predetermined key. This would allow relay of a cryptographic key over long distances in the presence of lossy channels.

  7. Reduced randomness in quantum cryptography with sequences of qubits encoded in the same basis

    International Nuclear Information System (INIS)

    Lamoureux, L.-P.; Cerf, N. J.; Bechmann-Pasquinucci, H.; Gisin, N.; Macchiavello, C.

    2006-01-01

    We consider the cloning of sequences of qubits prepared in the states used in the BB84 or six-state quantum cryptography protocol, and show that the single-qubit fidelity is unaffected even if entire sequences of qubits are prepared in the same basis. This result is only valid provided that the sequences are much shorter than the total key. It is of great importance for practical quantum cryptosystems because it reduces the need for high-speed random number generation without impairing the security against finite-size cloning attacks

  8. Optical asymmetric cryptography based on amplitude reconstruction of elliptically polarized light

    Science.gov (United States)

    Cai, Jianjun; Shen, Xueju; Lei, Ming

    2017-11-01

    We propose a novel optical asymmetric image encryption method based on amplitude reconstruction of elliptically polarized light, which is free from the silhouette problem. The original image is first analytically separated into two phase-only masks, and then the two masks are encoded into the amplitudes of the orthogonal polarization components of an elliptically polarized light beam. Finally, the elliptically polarized light propagates through a linear polarizer, and the output intensity distribution is recorded by a CCD camera to obtain the ciphertext. The whole encryption procedure can be implemented with commonly used optical elements, and it combines a diffusion process and a confusion process. As a result, the proposed method achieves high robustness against iterative-algorithm-based attacks. Simulation results are presented to prove the validity of the proposed cryptography.

  9. Decoding chaotic cryptography without access to the superkey

    International Nuclear Information System (INIS)

    Vaidya, P.G.; Angadi, Savita

    2003-01-01

    Some chaotic systems can be synchronized by sending only a part of the state space information. This property is used to create keys for cryptography using the unsent state spaces. This idea was first used in connection with the Lorenz equation. It has been assumed for that equation that access to the unsent information is impossible without knowing the three parameters of the equation. This is why the values of these parameters are collectively known as the 'superkey'. The exhaustive search for this key from the existing data is time consuming and can easily be countered by changing the key. We show in this paper how the superkey can be found in a very rapid manner from the synchronizing signal. We achieve this by first transforming the Lorenz equation to a canonical form. Then we use our recently developed method to find highly accurate derivatives from data. Next we transform a nonlinear equation for the superkey to a linear form by embedding it in four dimensions. The final equations are solved by using the generalized inverse
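
    The final step described above — solving for the parameters once they appear linearly — is essentially a generalized-inverse (least-squares) solve. The sketch below illustrates that step generically with NumPy; the coefficient matrix and parameter values are made up for illustration and are not the Lorenz-derived system of the paper:

      import numpy as np

      # Generic illustration of the final step described above: once the unknown
      # parameters enter linearly (here a made-up 3-parameter example, not the
      # actual Lorenz-derived system), an overdetermined system A k = y built from
      # many data samples is solved with the Moore-Penrose generalized inverse.

      rng = np.random.default_rng(0)
      k_true = np.array([10.0, 28.0, 8.0 / 3.0])         # stand-in for a "superkey"

      A = rng.normal(size=(200, 3))                       # rows built from measured data
      y = A @ k_true + rng.normal(scale=1e-3, size=200)   # noisy observations

      k_est = np.linalg.pinv(A) @ y                       # generalized-inverse solution
      print(np.round(k_est, 3))                           # ~ [10.0, 28.0, 2.667]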

  10. Decoding chaotic cryptography without access to the superkey

    CERN Document Server

    Vaidya, P G

    2003-01-01

    Some chaotic systems can be synchronized by sending only a part of the state space information. This property is used to create keys for cryptography using the unsent state spaces. This idea was first used in connection with the Lorenz equation. It has been assumed for that equation that access to the unsent information is impossible without knowing the three parameters of the equation. This is why the values of these parameters are collectively known as the 'superkey'. The exhaustive search for this key from the existing data is time consuming and can easily be countered by changing the key. We show in this paper how the superkey can be found in a very rapid manner from the synchronizing signal. We achieve this by first transforming the Lorenz equation to a canonical form. Then we use our recently developed method to find highly accurate derivatives from data. Next we transform a nonlinear equation for the superkey to a linear form by embedding it in four dimensions. The final equations are solved by using t...

  11. Achieving the Limits of the Noisy-Storage Model Using Entanglement Sampling

    DEFF Research Database (Denmark)

    Dupont-Dupuis, Fréderic; Fawzi, Omar; Wehner, Stephanie

    2013-01-01

    . Cryptology, 2000) with constructions of logarithmic-depth formulae which compute threshold functions using only constant fan-in threshold gates. Using this approach, we simplify and improve on previous results in cryptography and distributed computing. In particular: - We provide conceptually simple...... on the following complexity-theoretic contributions, which may be of independent interest: - We show that for every j,k ∈ ℕ such that m ≜ (k-1)/(j-1) is an integer, there is an explicit (poly(n)-time) construction of a logarithmic-depth formula which computes a good approximation of an (n...... approximation than for the general threshold case, and also an exact explicit construction based on standard complexity-theoretic or cryptographic assumptions. © 2013 International Association for Cryptologic Research....

  12. Why cryptography should not rely on physical attack complexity

    CERN Document Server

    Krämer, Juliane

    2015-01-01

    This book presents two practical physical attacks. It shows how attackers can reveal the secret key of symmetric as well as asymmetric cryptographic algorithms based on these attacks, and presents countermeasures on the software and the hardware level that can help to prevent them in the future. Though their theory has been known for several years now, since neither attack has yet been successfully implemented in practice, they have generally not been considered a serious threat. In short, their physical attack complexity has been overestimated and the implied security threat has been underestimated. First, the book introduces the photonic side channel, which offers not only temporal resolution, but also the highest possible spatial resolution. Due to the high cost of its initial implementation, it has not been taken seriously. The work shows both simple and differential photonic side channel analyses. Then, it presents a fault attack against pairing-based cryptography. Due to the need for at least two indepe...

  13. Wigner representation for experiments on quantum cryptography using two-photon polarization entanglement produced in parametric down-conversion

    International Nuclear Information System (INIS)

    Casado, A; Guerra, S; Placido, J

    2008-01-01

    In this paper, the theory of parametric down-conversion in the Wigner representation is applied to Ekert's quantum cryptography protocol. We analyse the relation between two-photon entanglement and (non-secure) quantum key distribution within the Wigner framework in the Heisenberg picture. Experiments using two-qubit polarization entanglement generated in nonlinear crystals are analysed in this formalism, along with the effects of eavesdropping attacks in the case of projective measurements

  14. Security in software-defined wireless sensor networks: threats, challenges and potential solutions

    CSIR Research Space (South Africa)

    Pritchard, SW

    2017-07-01

    Full Text Available have focused on low resource cryptography methods to secure the network [27] - [29], [33]. Cryptography methods are separated into symmetric cryptography and asymmetric cryptography. While symmetric cryptography solutions are preferred due to low... implementation cost and efficiency [5], they present many problems when managing large networks and attempts to improve this cryptography for WSNs [11] have resulted in the cost of resources. Symmetric cryptography is also difficult to implement in software...

  15. Perspectives on Entangled Nuclear Particle Pairs Generation and Manipulation in Quantum Communication and Cryptography Systems

    Directory of Open Access Journals (Sweden)

    Octavian Dănilă

    2012-01-01

    Full Text Available Entanglement between two quantum elements is a phenomenon which presents a broad application spectrum, being used largely in quantum cryptography schemes and in physical characterisation of the universe. Commonly known entangled states have been obtained with photons and electrons, but other quantum elements such as quarks, leptons, and neutrinos have shown their informational potential. In this paper, we present the perspective of exploiting the phenomenon of entanglement that appears in nuclear particle interactions as a resource for quantum key distribution protocols.

  16. Determinants of Change in the Cost-effectiveness Threshold.

    Science.gov (United States)

    Paulden, Mike; O'Mahony, James; McCabe, Christopher

    2017-02-01

    The cost-effectiveness threshold in health care systems with a constrained budget should be determined by the cost-effectiveness of displacing health care services to fund new interventions. Using comparative statics, we review some potential determinants of the threshold, including the budget for health care, the demand for existing health care interventions, the technical efficiency of existing interventions, and the development of new health technologies. We consider the anticipated direction of impact that would affect the threshold following a change in each of these determinants. Where the health care system is technically efficient, an increase in the health care budget unambiguously raises the threshold, whereas an increase in the demand for existing, non-marginal health interventions unambiguously lowers the threshold. Improvements in the technical efficiency of existing interventions may raise or lower the threshold, depending on the cause of the improvement in efficiency, whether the intervention is already funded, and, if so, whether it is marginal. New technologies may also raise or lower the threshold, depending on whether the new technology is a substitute for an existing technology and, again, whether the existing technology is marginal. Our analysis permits health economists and decision makers to assess if and in what direction the threshold may change over time. This matters, as threshold changes impact the cost-effectiveness of interventions that require decisions now but have costs and effects that fall in future periods.

  17. A secure RFID mutual authentication protocol for healthcare environments using elliptic curve cryptography.

    Science.gov (United States)

    Jin, Chunhua; Xu, Chunxiang; Zhang, Xiaojun; Zhao, Jining

    2015-03-01

    Radio Frequency Identification (RFID) is an automatic identification technology which can be widely used in healthcare environments to locate and track staff, equipment and patients. However, potential security and privacy problems in RFID systems remain a challenge. In this paper, we design a mutual authentication protocol for RFID based on elliptic curve cryptography (ECC). We use a pre-computing method within the tag's communication so that our protocol achieves better efficiency. In terms of security, our protocol can achieve confidentiality, unforgeability, mutual authentication, tag anonymity, availability and forward security. Our protocol can also overcome the weaknesses of existing protocols. Therefore, our protocol is suitable for healthcare environments.

  18. Authentication and Encryption Using Modified Elliptic Curve Cryptography with Particle Swarm Optimization and Cuckoo Search Algorithm

    Science.gov (United States)

    Kota, Sujatha; Padmanabhuni, Venkata Nageswara Rao; Budda, Kishor; K, Sruthi

    2018-05-01

    Elliptic Curve Cryptography (ECC) uses two keys, a private key and a public key, and is considered a public key cryptographic algorithm that is used both for authentication of a person and for confidentiality of data. One of the keys is used in encryption and the other in decryption, depending on usage. For authentication, the user signs with the private key and the public key is used to verify the user's identity; for confidentiality, the sender encrypts with the recipient's public key and the private key is used to decrypt the message. Choosing the private key is always an issue in public key cryptographic algorithms such as RSA and ECC: if tiny values are chosen at random, the security of the complete algorithm becomes an issue. Since the public key is computed from the private key, keys that are not chosen optimally can generate infinity values. The proposed Modified Elliptic Curve Cryptography selects these values by either of two methods: the first option uses Particle Swarm Optimization and the second uses the Cuckoo Search Algorithm. The proposed algorithms are developed and tested using a sample database, and both are found to be secure and reliable. The test results show that the private key is chosen optimally, being neither repetitive nor tiny, and that the computations of the public key do not reach infinity.
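
    Whatever search heuristic picks the private scalar, the underlying relation is always public key = private scalar × base point. A textbook double-and-add sketch on a tiny illustrative curve (y^2 = x^3 + 2x + 2 over GF(17), base point (5, 1) of order 19 — far too small for real use) shows that relation:

      # Textbook double-and-add scalar multiplication on a toy curve
      # y^2 = x^3 + 2x + 2 over GF(17) with base point G = (5, 1) of order 19
      # (illustrative parameters only). The public key is d*G for a private
      # scalar d chosen in [1, order - 1].
      P, A = 17, 2          # field prime and curve coefficient a
      G, ORDER = (5, 1), 19

      def add(p1, p2):
          if p1 is None: return p2
          if p2 is None: return p1
          (x1, y1), (x2, y2) = p1, p2
          if x1 == x2 and (y1 + y2) % P == 0:
              return None                                # point at infinity
          if p1 == p2:
              lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
          else:
              lam = (y2 - y1) * pow(x2 - x1, -1, P) % P
          x3 = (lam * lam - x1 - x2) % P
          return x3, (lam * (x1 - x3) - y1) % P

      def scalar_mult(d, point):
          result, addend = None, point
          while d:
              if d & 1:
                  result = add(result, addend)
              addend = add(addend, addend)
              d >>= 1
          return result

      d = 7                          # private key, 1 <= d < ORDER
      Q = scalar_mult(d, G)          # public key
      print("public key:", Q)
      assert scalar_mult(ORDER, G) is None   # G indeed has order 19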

  19. Wigner representation for experiments on quantum cryptography using two-photon polarization entanglement produced in parametric down-conversion

    Energy Technology Data Exchange (ETDEWEB)

    Casado, A [Departamento de Fisica Aplicada III, Escuela Superior de Ingenieros, Universidad de Sevilla, 41092 Sevilla (Spain); Guerra, S [Centro Asociado de la Universidad Nacional de Educacion a Distancia de Las Palmas de Gran Canaria (Spain); Placido, J [Departamento de Fisica, Universidad de Las Palmas de Gran Canaria (Spain)], E-mail: acasado@us.es

    2008-02-28

    In this paper, the theory of parametric down-conversion in the Wigner representation is applied to Ekert's quantum cryptography protocol. We analyse the relation between two-photon entanglement and (non-secure) quantum key distribution within the Wigner framework in the Heisenberg picture. Experiments using two-qubit polarization entanglement generated in nonlinear crystals are analysed in this formalism, along with the effects of eavesdropping attacks in the case of projective measurements.

  20. Azygos Vein Lead Implantation For High Defibrillation Thresholds In Implantable Cardioverter Defibrillator Placement

    Directory of Open Access Journals (Sweden)

    Naga VA Kommuri

    2010-01-01

    Full Text Available Evaluation of the defibrillation threshold is a standard of care during implantation of an implantable cardioverter defibrillator. High defibrillation thresholds are often encountered and pose a challenge for electrophysiologists seeking to improve them. We describe a case series in which defibrillation thresholds were improved after implanting a defibrillation lead in the azygos vein.

  1. Algebra for applications: cryptography, secret sharing, error-correcting, fingerprinting, compression

    CERN Document Server

    Slinko, Arkadii

    2015-01-01

    This book examines the relationship between mathematics and data in the modern world. Indeed, modern societies are awash with data which must be manipulated in many different ways: encrypted, compressed, shared between users in a prescribed manner, protected from unauthorised access and transmitted over unreliable channels. All of these operations can be understood only by a person with knowledge of basics in algebra and number theory. This book provides the necessary background in arithmetic, polynomials, groups, fields and elliptic curves that is sufficient to understand such real-life applications as cryptography, secret sharing, error-correcting, fingerprinting and compression of information. It is the first to cover many recent developments in these topics. Based on a lecture course given to third-year undergraduates, it is self-contained with numerous worked examples and exercises provided to test understanding. It can additionally be used for self-study.

  2. Postselection technique for quantum channels with applications to quantum cryptography.

    Science.gov (United States)

    Christandl, Matthias; König, Robert; Renner, Renato

    2009-01-16

    We propose a general method for studying properties of quantum channels acting on an n-partite system, whose action is invariant under permutations of the subsystems. Our main result is that, in order to prove that a certain property holds for an arbitrary input, it is sufficient to consider the case where the input is a particular de Finetti-type state, i.e., a state which consists of n identical and independent copies of an (unknown) state on a single subsystem. Our technique can be applied to the analysis of information-theoretic problems. For example, in quantum cryptography, we get a simple proof for the fact that security of a discrete-variable quantum key distribution protocol against collective attacks implies security of the protocol against the most general attacks. The resulting security bounds are tighter than previously known bounds obtained with help of the exponential de Finetti theorem.

  3. Full-field implementation of a perfect eavesdropper on a quantum cryptography system.

    Science.gov (United States)

    Gerhardt, Ilja; Liu, Qin; Lamas-Linares, Antía; Skaar, Johannes; Kurtsiefer, Christian; Makarov, Vadim

    2011-06-14

    Quantum key distribution (QKD) allows two remote parties to grow a shared secret key. Its security is founded on the principles of quantum mechanics, but in reality it significantly relies on the physical implementation. Technological imperfections of QKD systems have been previously explored, but no attack on an established QKD connection has been realized so far. Here we show the first full-field implementation of a complete attack on a running QKD connection. An installed eavesdropper obtains the entire 'secret' key, while none of the parameters monitored by the legitimate parties indicate a security breach. This confirms that non-idealities in physical implementations of QKD can be fully practically exploitable, and must be given increased scrutiny if quantum cryptography is to become highly secure.

  4. A neural-network approach for visual cryptography and authorization.

    Science.gov (United States)

    Yue, Tai-Wen; Chiang, Suchen

    2004-06-01

    In this paper, we propose a neural-network approach for visual authorization, which is an application of visual cryptography (VC). The scheme contains a key-share and a set of user-shares. The administrator owns the key-share, and each user owns a user-share issued by the administrator from the user-share set. The shares in the user-share set are visually indistinguishable, i.e. they have the same pictorial meaning. However, stacking the key-share with different user-shares will reveal significantly different images. Therefore, the administrator (in fact, only the administrator) can visually recognize the authority assigned to a particular user by viewing the information appearing in the superposed image of the key-share and user-share. This approach is completely different from traditional VC approaches. The salient features include: (i) the access schemes are described using a set of graytone images; (ii) the codebooks to fulfil them are not required; and (iii) the size of the share images is the same as the size of the target image.

  5. Psychophysical thresholds of face visibility during infancy

    DEFF Research Database (Denmark)

    Gelskov, Sofie; Kouider, Sid

    2010-01-01

    The ability to detect and focus on faces is a fundamental prerequisite for developing social skills. But how well can infants detect faces? Here, we address this question by studying the minimum duration at which faces must appear to trigger a behavioral response in infants. We used a preferential looking method in conjunction with masking and brief presentations (300 ms and below) to establish the temporal thresholds of visibility at different stages of development. We found that 5 and 10 month-old infants have remarkably similar visibility thresholds, about three times higher than those of adults. By contrast, 15 month-olds not only revealed adult-like thresholds, but also improved their performance through memory-based strategies. Our results imply that the development of face visibility follows a non-linear course and is determined by a radical improvement occurring between 10 and 15 months.

  6. Optical double-image cryptography based on diffractive imaging with a laterally-translated phase grating.

    Science.gov (United States)

    Chen, Wen; Chen, Xudong; Sheppard, Colin J R

    2011-10-10

    In this paper, we propose a method using structured-illumination-based diffractive imaging with a laterally-translated phase grating for optical double-image cryptography. An optical cryptosystem is designed, and multiple random phase-only masks are placed in the optical path. When a phase grating is laterally translated just before the plaintexts, several diffraction intensity patterns (i.e., ciphertexts) can be correspondingly obtained. During image decryption, an iterative retrieval algorithm is developed to extract plaintexts from the ciphertexts. In addition, security and advantages of the proposed method are analyzed. Feasibility and effectiveness of the proposed method are demonstrated by numerical simulation results. © 2011 Optical Society of America

  7. A fully automated entanglement-based quantum cryptography system for telecom fiber networks

    International Nuclear Information System (INIS)

    Treiber, Alexander; Ferrini, Daniele; Huebel, Hannes; Zeilinger, Anton; Poppe, Andreas; Loruenser, Thomas; Querasser, Edwin; Matyus, Thomas; Hentschel, Michael

    2009-01-01

    We present in this paper a quantum key distribution (QKD) system based on polarization entanglement for use in telecom fibers. A QKD exchange up to 50 km was demonstrated in the laboratory with a secure key rate of 550 bits s⁻¹. The system is compact and portable with a fully automated start-up, and stabilization modules for polarization, synchronization and photon coupling allow hands-off operation. Stable and reliable key exchange in a deployed optical fiber of 16 km length was demonstrated. In this fiber network, we achieved over 2 weeks an automatic key generation with an average key rate of 2000 bits s⁻¹ without manual intervention. During this period, the system had an average entanglement visibility of 93%, highlighting the technical level and stability achieved for entanglement-based quantum cryptography.

  8. Cross-modal attention influences auditory contrast sensitivity: Decreasing visual load improves auditory thresholds for amplitude- and frequency-modulated sounds.

    Science.gov (United States)

    Ciaramitaro, Vivian M; Chow, Hiu Mei; Eglington, Luke G

    2017-03-01

    We used a cross-modal dual task to examine how changing visual-task demands influenced auditory processing, namely auditory thresholds for amplitude- and frequency-modulated sounds. Observers had to attend to two consecutive intervals of sounds and report which interval contained the auditory stimulus that was modulated in amplitude (Experiment 1) or frequency (Experiment 2). During auditory-stimulus presentation, observers simultaneously attended to a rapid sequential visual presentation-two consecutive intervals of streams of visual letters-and had to report which interval contained a particular color (low load, demanding less attentional resources) or, in separate blocks of trials, which interval contained more of a target letter (high load, demanding more attentional resources). We hypothesized that if attention is a shared resource across vision and audition, an easier visual task should free up more attentional resources for auditory processing on an unrelated task, hence improving auditory thresholds. Auditory detection thresholds were lower-that is, auditory sensitivity was improved-for both amplitude- and frequency-modulated sounds when observers engaged in a less demanding (compared to a more demanding) visual task. In accord with previous work, our findings suggest that visual-task demands can influence the processing of auditory information on an unrelated concurrent task, providing support for shared attentional resources. More importantly, our results suggest that attending to information in a different modality, cross-modal attention, can influence basic auditory contrast sensitivity functions, highlighting potential similarities between basic mechanisms for visual and auditory attention.

  9. Identifying thresholds for ecosystem-based management.

    Directory of Open Access Journals (Sweden)

    Jameal F Samhouri

    Full Text Available BACKGROUND: One of the greatest obstacles to moving ecosystem-based management (EBM) from concept to practice is the lack of a systematic approach to defining ecosystem-level decision criteria, or reference points that trigger management action. METHODOLOGY/PRINCIPAL FINDINGS: To assist resource managers and policymakers in developing EBM decision criteria, we introduce a quantitative, transferable method for identifying utility thresholds. A utility threshold is the level of human-induced pressure (e.g., pollution) at which small changes produce substantial improvements toward the EBM goal of protecting an ecosystem's structural (e.g., diversity) and functional (e.g., resilience) attributes. The analytical approach is based on the detection of nonlinearities in relationships between ecosystem attributes and pressures. We illustrate the method with a hypothetical case study of (1) fishing and (2) nearshore habitat pressure using an empirically-validated marine ecosystem model for British Columbia, Canada, and derive numerical threshold values in terms of the density of two empirically-tractable indicator groups, sablefish and jellyfish. We also describe how to incorporate uncertainty into the estimation of utility thresholds and highlight their value in the context of understanding EBM trade-offs. CONCLUSIONS/SIGNIFICANCE: For any policy scenario, an understanding of utility thresholds provides insight into the amount and type of management intervention required to make significant progress toward improved ecosystem structure and function. The approach outlined in this paper can be applied in the context of single or multiple human-induced pressures, to any marine, freshwater, or terrestrial ecosystem, and should facilitate more effective management.

  10. Improvement of the drift chamber system in the SAPHIR detector and first measurements of the Φ meson production at threshold

    International Nuclear Information System (INIS)

    Scholmann, J.N.

    1996-09-01

    The SAPHIR detector at ELSA enables the measurement of photon-induced Φ meson production from threshold up to 3 GeV in the full kinematical range. A considerable improvement of the drift chamber system is a precondition for gaining the necessary data rate in an acceptable time. The research focuses attention on the choice of the chamber gas and on a different mechanical construction, so as to minimize the negative influence of the photon beam crossing the sensitive volume of the drift chamber system. In addition, first preliminary results for the total and differential cross sections of Φ meson production close to threshold were evaluated. (orig.)

  11. A Robust Threshold for Iterative Channel Estimation in OFDM Systems

    Directory of Open Access Journals (Sweden)

    A. Kalaycioglu

    2010-04-01

    Full Text Available A novel threshold computation method for pilot symbol assisted iterative channel estimation in OFDM systems is considered. As the bits are transmitted in packets, the proposed technique is based on calculating a particular threshold for each data packet in order to select the reliable decoder output symbols and thereby improve the channel estimation performance. Iteratively, additional pilot symbols are established according to the threshold and the channel is re-estimated with the new pilots inserted into the known channel estimation pilot set. The proposed threshold calculation method for selecting additional pilots performs better than non-iterative channel estimation and the no-threshold and fixed-threshold techniques in poor HF channel simulations.
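
    The record does not reproduce its threshold formula, so the sketch below only illustrates the general loop with an assumed reliability metric (distance of each equalized symbol from the nearest QPSK point) and an assumed per-packet threshold (the mean of those distances); symbols below the threshold are promoted to pseudo-pilots before re-estimation:

      import numpy as np

      # Illustrative sketch only (the record's exact threshold formula is not
      # reproduced): equalized symbols whose distance to the nearest QPSK point
      # falls below a per-packet threshold are promoted to pseudo-pilots, and
      # the channel is re-estimated from the enlarged pilot set.

      rng = np.random.default_rng(1)
      QPSK = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
      N = 64

      tx = QPSK[rng.integers(0, 4, N)]                       # transmitted packet
      h_true = 1 + 0.4 * rng.standard_normal(N) + 0.2j * rng.standard_normal(N)
      rx = h_true * tx + 0.05 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))

      def estimate(known_idx, known_sym):
          """Per-subcarrier LS at known positions, linear interpolation elsewhere."""
          h_ls = rx[known_idx] / known_sym
          re = np.interp(np.arange(N), known_idx, h_ls.real)
          im = np.interp(np.arange(N), known_idx, h_ls.imag)
          return re + 1j * im

      pilot_idx = np.arange(0, N, 8)
      h_est = estimate(pilot_idx, tx[pilot_idx])

      for _ in range(3):
          eq = rx / h_est
          hard = QPSK[np.argmin(np.abs(eq[:, None] - QPSK), axis=1)]   # decisions
          dist = np.abs(eq - hard)
          threshold = dist.mean()                   # assumed per-packet threshold
          extra = np.where(dist < threshold)[0]
          idx = np.union1d(pilot_idx, extra)
          sym = np.where(np.isin(idx, pilot_idx), tx[idx], hard[idx])
          h_est = estimate(idx, sym)

      print("mean estimation error:", np.round(np.abs(h_est - h_true).mean(), 4))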

  12. Robust Public Key Cryptography — A New Cryptosystem Surviving Private Key Compromise

    Science.gov (United States)

    Shaik, Cheman

    A weakness of the present-day public key cryptosystems is that these cryptosystems do not survive private-key compromise attacks resulting from an internal breach of trust. In a competitive business environment, private key compromise is a common incident that voids the strength of public key cryptosystems such as RSA and ECC. Bribing corporate employees to disclose their secret keys and inadvertently disclosing secret information are among a plethora of practical attacks that occur at the implementation level. Once a breach of trust takes place and subsequently the private key is revealed, any public key cryptosystem fails to secure electronic data in Internet communications. The revealed key may be used by an attacker to decipher the intercepted data at an intermediary router. This weakness of public key cryptography calls for an additional security measure that enables encryptions to survive private key compromise attacks.

  13. Separable Reversible Data Hiding in Encrypted Signals with Public Key Cryptography

    Directory of Open Access Journals (Sweden)

    Wei-Liang Tai

    2018-01-01

    Full Text Available We propose separable reversible data hiding in an encrypted signal with public key cryptography. In our separable framework, the image owner encrypts the original image by using a public key. On receipt of the encrypted signal, the data-hider embeds data in it by using a data-hiding key. The image decryption and data extraction are independent and separable at the receiver side. Even though the receiver, who has only the data-hiding key, does not learn about the decrypted content, he can extract data from the received marked encrypted signal. However, the receiver who has only the private key cannot extract the embedded data, but he can directly decrypt the received marked encrypted signal to obtain the original image without any error. Compared with other schemes using a cipher stream to encrypt the image, the proposed scheme is more appropriate for cloud services without degrading the security level.

  14. A high sensitivity process variation sensor utilizing sub-threshold operation

    OpenAIRE

    Meterelliyoz, Mesut; Song, Peilin; Stellari, Franco; Kulkarni, Jaydeep P.; Roy, Kaushik

    2008-01-01

    In this paper, we propose a novel low-power, bias-free, high-sensitivity process variation sensor for monitoring random variations in the threshold voltage. The proposed sensor design utilizes the exponential current-voltage relationship of sub-threshold operation thereby improving the sensitivity by 2.3X compared to the above-threshold operation. A test-chip containing 128 PMOS and 128 NMOS devices has been fabri...

  15. Inter-symbol interference and beat noise in flexible data-rate coherent OCDMA and the BER improvement by using optical thresholding.

    Science.gov (United States)

    Wang, Xu; Wada, Naoya; Kitayama, Ken-Ichi

    2005-12-26

    Impairments of inter-symbol interference and beat noise in coherent time-spreading optical code-division-multiple-access are investigated theoretically and experimentally by sweeping the data-rate from 622 Mbps up to 10 Gbps with 511-chip superstructured fiber Bragg grating. The BER improvement by using optical thresholding technique has been verified in the experiment.

  16. Glint Field Trial Results and Application to Glint Threshold Distance Algorithm

    National Research Council Canada - National Science Library

    Chevalier, William

    1998-01-01

    .... Glint threshold algorithm. Software adjustments would tentatively be made to the existing algorithm to improve glint threshold distance calculation accuracy, making the modified model a better iterative eye armor design tool...

  17. Optical cryptography with biometrics for multi-depth objects.

    Science.gov (United States)

    Yan, Aimin; Wei, Yang; Hu, Zhijuan; Zhang, Jingtao; Tsang, Peter Wai Ming; Poon, Ting-Chung

    2017-10-11

    We propose an optical cryptosystem for encrypting images of multi-depth objects based on the combination of the optical heterodyne technique and fingerprint keys. Optical heterodyning requires two optical beams to be mixed. For encryption, each optical beam is modulated by an optical mask containing the fingerprint of either the person sending or the person receiving the image. The pair of optical masks are taken as the encryption keys. Subsequently, the two beams are used to scan over a multi-depth 3-D object to obtain an encrypted hologram. During the decryption process, each sectional image of the 3-D object is recovered by convolving its encrypted hologram (through numerical computation) with the encrypted hologram of a pinhole image that is positioned at the same depth as the sectional image. Our proposed method has three major advantages. First, the lost-key situation can be avoided with the use of fingerprints as the encryption keys. Second, the method can be applied to encrypt 3-D images for subsequent decrypted sectional images. Third, since optical heterodyne scanning is employed to encrypt a 3-D object, the optical system is incoherent, resulting in a negligible amount of speckle noise upon decryption. To the best of our knowledge, this is the first time optical cryptography of 3-D object images has been demonstrated in an incoherent optical system with biometric keys.

  18. Threshold concepts in finance: student perspectives

    Science.gov (United States)

    Hoadley, Susan; Kyng, Tim; Tickle, Leonie; Wood, Leigh N.

    2015-10-01

    Finance threshold concepts are the essential conceptual knowledge that underpin well-developed financial capabilities and are central to the mastery of finance. In this paper we investigate threshold concepts in finance from the point of view of students, by establishing the extent to which students are aware of threshold concepts identified by finance academics. In addition, we investigate the potential of a framework of different types of knowledge to differentiate the delivery of the finance curriculum and the role of modelling in finance. Our purpose is to identify ways to improve curriculum design and delivery, leading to better student outcomes. Whilst we find that there is significant overlap between what students identify as important in finance and the threshold concepts identified by academics, much of this overlap is expressed by indirect reference to the concepts. Further, whilst different types of knowledge are apparent in the student data, there is evidence that students do not necessarily distinguish conceptual from other types of knowledge. As well as investigating the finance curriculum, the research demonstrates the use of threshold concepts to compare and contrast student and academic perceptions of a discipline and, as such, is of interest to researchers in education and other disciplines.

  19. A Comparative Study of Improved Artificial Bee Colony Algorithms Applied to Multilevel Image Thresholding

    Directory of Open Access Journals (Sweden)

    Kanjana Charansiriphaisan

    2013-01-01

    Full Text Available Multilevel thresholding is a highly useful tool for the application of image segmentation. Otsu's method, a common exhaustive search for finding optimal thresholds, involves a high computational cost. There has been a lot of recent research into various meta-heuristic searches in the area of optimization research. This paper analyses and discusses the use of a family of artificial bee colony algorithms, namely the standard ABC, ABC/best/1, ABC/best/2, IABC/best/1, IABC/rand/1, and CABC, and some particle swarm optimization-based algorithms for searching multilevel thresholds. The strategy for an onlooker bee to select an employed bee was modified to serve our purposes. The metric measures used to compare the algorithms are the maximum number of function calls, success rate, and success performance. Ranking was performed with Friedman ranks. The experimental results showed that IABC/best/1 outperformed the other techniques when all of them were applied to multilevel image thresholding. Furthermore, the experiments confirmed that IABC/best/1 is a simple, general, and high-performance algorithm.
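
    Whichever swarm variant does the searching, every candidate threshold vector is scored against the same Otsu objective. A minimal sketch of that between-class-variance objective for two thresholds on an 8-bit histogram is shown below, with an exhaustive search standing in for the meta-heuristic (synthetic trimodal data, illustrative only):

      import numpy as np
      from itertools import combinations

      # Minimal sketch of the objective the ABC/PSO variants above maximize:
      # Otsu's between-class variance for a pair of thresholds (t1, t2) on an
      # 8-bit grayscale histogram. Exhaustive search is shown for clarity; the
      # meta-heuristics replace this loop when more levels make it too costly.

      def between_class_variance(hist, thresholds):
          p = hist / hist.sum()
          levels = np.arange(len(hist))
          mu_total = (p * levels).sum()
          edges = [0, *thresholds, len(hist)]
          var = 0.0
          for lo, hi in zip(edges[:-1], edges[1:]):
              w = p[lo:hi].sum()
              if w > 0:
                  mu = (p[lo:hi] * levels[lo:hi]).sum() / w
                  var += w * (mu - mu_total) ** 2
          return var

      rng = np.random.default_rng(2)
      pixels = np.concatenate([rng.normal(60, 10, 3000),
                               rng.normal(130, 12, 3000),
                               rng.normal(200, 8, 3000)]).clip(0, 255)
      hist, _ = np.histogram(pixels, bins=256, range=(0, 256))

      best = max(combinations(range(1, 256), 2),
                 key=lambda t: between_class_variance(hist, t))
      print("optimal thresholds:", best)   # land between the three histogram modes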

  20. True random number generation from mobile telephone photo based on chaotic cryptography

    International Nuclear Information System (INIS)

    Zhao Liang; Liao Xiaofeng; Xiao Di; Xiang Tao; Zhou Qing; Duan Shukai

    2009-01-01

    A cheap, convenient and universal TRNG based on mobile telephone photos for producing random bit sequences is proposed. To settle the problem of sequential pixels and comparability, three chaos-based approaches are applied to post-process the generated binary image. The random numbers produced by three users are tested using the US NIST RNG statistical test software. The experimental results indicate that the Arnold cat map is the fastest way to generate a random bit sequence and can be accepted on a general PC. The 'MASK' algorithm also performs well. Finally, compared with the TRNG of Hu et al. [Hu Y, Liao X, Wong KW, Zhou Q. A true random number generator based on mouse movement and chaotic cryptography. Chaos, Solitons and Fractals 2007. doi: 10.1016/j.chaos.2007.10.022], the proposed TRNG is found to have many merits.
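
    Of the three post-processing options, the Arnold cat map is the easiest to sketch. The toy example below scrambles an N x N block of bits with the area-preserving map (x, y) -> (x + y, x + 2y) mod N; it only illustrates the scrambling step, not the record's full TRNG pipeline:

      import numpy as np

      # Sketch of the Arnold cat map scrambling step mentioned above: the map
      # (x, y) -> (x + y, x + 2y) mod N permutes the pixels of an N x N binary
      # block, breaking up the sequential correlation of neighbouring pixels.
      # (The full TRNG pipeline of the record is not reproduced here.)

      def arnold_cat(bits, iterations=1):
          n = bits.shape[0]
          out = bits.copy()
          for _ in range(iterations):
              scrambled = np.empty_like(out)
              for x in range(n):
                  for y in range(n):
                      scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
              out = scrambled
          return out

      rng = np.random.default_rng(4)
      image_bits = (rng.random((8, 8)) > 0.5).astype(np.uint8)   # stand-in binary photo
      print(arnold_cat(image_bits, iterations=3))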

  1. Silica aerogel threshold Cherenkov counters for the JLab Hall A spectrometers: improvements and proposed modifications

    CERN Document Server

    Lagamba, L; Colilli, S; Crateri, R; De Leo, R; Frullani, S; Garibaldi, F; Giuliani, F; Gricia, M; Iodice, M; Iommi, R; Leone, A; Lucentini, M; Mostarda, A; Nappi, E; Perrino, R; Pierangeli, L; Santavenere, F; Urciuoli, G M

    2001-01-01

    Recently approved experiments at Jefferson Lab Hall A require clean kaon identification in a large electron, pion, and proton background environment. To this end, improved performance is required of the silica aerogel threshold Cherenkov counters installed in the focal plane of the two Hall A spectrometers. In this paper we propose two strategies to improve the performance of the Cherenkov counters, which presently use a hydrophilic aerogel radiator and convey Cherenkov photons towards the photomultipliers by means of mirrors with a parabolic shape in one direction and flat in the other. The first strategy is aerogel baking. In the second strategy we propose a modification of the counter geometry by replacing the mirrors with a planar diffusing surface and by repositioning the photomultipliers. Tests at CERN with a 5 GeV/c multiparticle beam revealed that both strategies are able to significantly increase the number of detected Cherenkov photons and, therefore, the detector performan...

  2. Psst, Can You Keep a Secret?

    Science.gov (United States)

    Vassilev, Apostol; Mouha, Nicky; Brandão, Luís

    2018-01-01

    The security of encrypted data depends not only on the theoretical properties of cryptographic primitives but also on the robustness of their implementations in software and hardware. Threshold cryptography introduces a computational paradigm that enables higher assurance for such implementations.

  3. Secure Multi-Player Protocols

    DEFF Research Database (Denmark)

    Fehr, Serge

    While classically cryptography is concerned with the problem of private communication among two entities, say players, in modern cryptography multi-player protocols play an important role. And among these, it is probably fair to say that secret sharing, and its stronger version verifiable secret sharing (VSS), as well as multi-party computation (MPC) belong to the most appealing and/or useful ones. The former two are basic tools to achieve better robustness of cryptographic schemes against malfunction or misuse by “decentralizing” the security from one single to a whole group of individuals (captured by the term threshold cryptography). The latter allows—at least in principle—to execute any collaboration among a group of players in a secure way that guarantees the correctness of the outcome but simultaneously respects the privacy of the participants. In this work, we study three aspects
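
    Since secret sharing underlies both this entry and the threshold-cryptography theme of this collection, a minimal Shamir (t, n) sketch over a prime field may help fix ideas; the parameters are illustrative only:

      import random

      # Minimal Shamir (t, n) secret-sharing sketch over a prime field: any t of
      # the n shares reconstruct the secret by Lagrange interpolation at x = 0,
      # while fewer than t shares reveal nothing about it.

      PRIME = 2**61 - 1          # field modulus (illustrative choice)

      def share(secret, t, n):
          coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
          def f(x):
              return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
          return [(x, f(x)) for x in range(1, n + 1)]

      def reconstruct(shares):
          secret = 0
          for i, (xi, yi) in enumerate(shares):
              num = den = 1
              for j, (xj, _) in enumerate(shares):
                  if i != j:
                      num = num * (-xj) % PRIME
                      den = den * (xi - xj) % PRIME
              secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
          return secret

      shares = share(secret=123456789, t=3, n=5)
      print(reconstruct(shares[:3]))     # any 3 shares recover 123456789
      print(reconstruct(shares[2:]))     # a different subset of 3 also works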

  4. Deciphering the language of nature: cryptography, secrecy, and alterity in Francis Bacon.

    Science.gov (United States)

    Clody, Michael C

    2011-01-01

    The essay argues that Francis Bacon's considerations of parables and cryptography reflect larger interpretative concerns of his natural philosophic project. Bacon describes nature as having a language distinct from those of God and man, and, in so doing, establishes a central problem of his natural philosophy—namely, how can the language of nature be accessed through scientific representation? Ultimately, Bacon's solution relies on a theory of differential and duplicitous signs that conceal within them the hidden voice of nature, which is best recognized in the natural forms of efficient causality. The "alphabet of nature"—those tables of natural occurrences—consequently plays a central role in his program, as it renders nature's language susceptible to a process and decryption that mirrors the model of the bilateral cipher. It is argued that while the writing of Bacon's natural philosophy strives for literality, its investigative process preserves a space for alterity within scientific representation, that is made accessible to those with the interpretative key.

  5. An Interoperability Consideration in Selecting Domain Parameters for Elliptic Curve Cryptography

    Science.gov (United States)

    Ivancic, Will (Technical Monitor); Eddy, Wesley M.

    2005-01-01

    Elliptic curve cryptography (ECC) will be an important technology for electronic privacy and authentication in the near future. There are many published specifications for elliptic curve cryptosystems, most of which contain detailed descriptions of the process for the selection of domain parameters. Selecting strong domain parameters ensures that the cryptosystem is robust to attacks. Due to a limitation in several published algorithms for doubling points on elliptic curves, some ECC implementations may produce incorrect, inconsistent, and incompatible results if domain parameters are not carefully chosen under a criterion that we describe. Few documents specify the addition or doubling of points in such a manner as to avoid this problematic situation. The safety criterion we present is not listed in any ECC specification we are aware of, although several other guidelines for domain selection are discussed in the literature. We provide a simple example of how a set of domain parameters not meeting this criterion can produce catastrophic results, and outline a simple means of testing curve parameters for interoperable safety over doubling.

  6. Upper bounds for the security of two distributed-phase reference protocols of quantum cryptography

    International Nuclear Information System (INIS)

    Branciard, Cyril; Gisin, Nicolas; Scarani, Valerio

    2008-01-01

    The differential-phase-shift (DPS) and the coherent-one-way (COW) are among the most practical protocols for quantum cryptography, and are therefore the object of fast-paced experimental developments. The assessment of their security is also a challenge for theorists: the existing tools, which allow one to prove security against the most general attacks, do not apply to these two protocols in any straightforward way. We present new upper bounds for their security in the limit of large distances (d ≳ 50 km with typical values in optical fibers) by considering a large class of collective attacks, namely those in which the adversary attaches ancillary quantum systems to each pulse or to each pair of pulses. We also introduce two modified versions of the COW protocol, which may prove more robust than the original one

  7. Perioperative transfusion threshold and ambulation after hip revision surgery

    DEFF Research Database (Denmark)

    Nielsen, Kamilla; Johansson, Pär I; Dahl, Benny

    2014-01-01

    BACKGROUND: Transfusion with red blood cells (RBC) may be needed during hip revision surgery but the appropriate haemoglobin concentration (Hb) threshold for transfusion has not been well established. We hypothesized that a higher transfusion threshold would improve ambulation after hip revision surgery. METHODS: The trial was registered at Clinicaltrials.gov (NCT00906295). Sixty-six patients aged 18 years or older undergoing hip revision surgery were randomized to receive RBC at a Hb threshold of either 7.3 g/dL (restrictive group) or 8.9 g/dL (liberal group). Postoperative ambulation...... received RBC. CONCLUSIONS: A Hb transfusion threshold of 8.9 g/dL was associated with a statistically significantly faster TUG after hip revision surgery compared to a threshold of 7.3 g/dL but the clinical importance is questionable and the groups did not differ in Hb at the time of testing.

  8. CARA Risk Assessment Thresholds

    Science.gov (United States)

    Hejduk, M. D.

    2016-01-01

    Warning remediation threshold (Red threshold): Pc level at which warnings are issued, and active remediation considered and usually executed. Analysis threshold (Green to Yellow threshold): Pc level at which analysis of event is indicated, including seeking additional information if warranted. Post-remediation threshold: Pc level to which remediation maneuvers are sized in order to achieve event remediation and obviate any need for immediate follow-up maneuvers. Maneuver screening threshold: Pc compliance level for routine maneuver screenings (more demanding than regular Red threshold due to additional maneuver uncertainty).

  9. Elliptic Curve Cryptography with Security System in Wireless Sensor Networks

    Science.gov (United States)

    Huang, Xu; Sharma, Dharmendra

    2010-10-01

    The rapid progress of wireless communications and embedded micro-electro-system technologies has made wireless sensor networks (WSN) very popular and even part of our daily life. WSN design is generally application driven, namely a particular application's requirements will determine how the network behaves. However, the nature of WSNs has attracted increasing attention in recent years due to their linear scalability, small software footprint, low hardware implementation cost, low bandwidth requirement, and high device performance. It is noted that today's software applications are mainly characterized by their component-based structures, which are usually heterogeneous and distributed, and WSNs are no exception. But WSNs typically need to configure themselves automatically and support ad hoc routing. Agent technology provides a method for handling increasing software complexity and supporting rapid and accurate decision making. Based on our previous works [1, 2], this paper makes three contributions, namely (a) a fuzzy controller for a dynamic sliding window size to improve the performance of running ECC, (b) a first presentation of a hidden generation point for protection from man-in-the-middle attacks, and (c) a first investigation of applying multi-agents to key exchange. Security systems have been drawing great attention as cryptographic algorithms have gained popularity due to the properties that make them suitable for use in constrained environments such as mobile sensor information applications, where computing resources and power availability are limited. Elliptic curve cryptography (ECC) is one of the high-potential candidates for WSNs, as it requires less computational power, communication bandwidth, and memory in comparison with other cryptosystems. To save pre-computing storage, there is a recent trend in sensor networks for the sensor group leaders, rather than the sensors, to communicate with the end database, which highlights the need to prevent the man

  10. Compressively sampled MR image reconstruction using generalized thresholding iterative algorithm

    Science.gov (United States)

    Elahi, Sana; kaleem, Muhammad; Omer, Hammad

    2018-01-01

    Compressed sensing (CS) is an emerging area of interest in Magnetic Resonance Imaging (MRI). CS is used for the reconstruction of images from a very limited number of samples in k-space. This significantly reduces the MRI data acquisition time. One important requirement for signal recovery in CS is the use of an appropriate non-linear reconstruction algorithm. It is a challenging task to choose a reconstruction algorithm that would accurately reconstruct the MR images from the under-sampled k-space data. Various algorithms have been used to solve the system of non-linear equations for better image quality and reconstruction speed in CS. In the recent past, the iterative soft thresholding algorithm (ISTA) has been introduced in CS-MRI. This algorithm directly cancels the incoherent artifacts produced because of the undersampling in k-space. This paper introduces an improved iterative algorithm based on the p-thresholding technique for CS-MRI image reconstruction. The use of the p-thresholding function promotes sparsity in the image, which is a key factor for CS-based image reconstruction. The p-thresholding-based iterative algorithm is a modification of ISTA and minimizes non-convex functions. It has been shown that the proposed p-thresholding iterative algorithm can be used effectively to recover a fully sampled image from the under-sampled data in MRI. The performance of the proposed method is verified using simulated and actual MRI data taken at St. Mary's Hospital, London. The quality of the reconstructed images is measured in terms of peak signal-to-noise ratio (PSNR), artifact power (AP), and structural similarity index measure (SSIM). The proposed approach shows improved performance when compared to other iterative algorithms based on log-thresholding, soft-thresholding and hard-thresholding techniques at different reduction factors.
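
    The p-thresholding rule itself is not given in this record, so the sketch below only shows the iterative-shrinkage structure it modifies: a plain ISTA loop with the standard soft-threshold operator (the p = 1 case) on a toy sparse-recovery problem rather than actual k-space data:

      import numpy as np

      # Generic iterative shrinkage-thresholding loop (ISTA with the standard
      # soft-threshold operator, i.e. the p = 1 case) on a toy sparse-recovery
      # problem. The paper's p-thresholding rule (0 < p < 1) would replace
      # soft() below; its exact form is not reproduced here.

      def soft(x, lam):
          return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

      rng = np.random.default_rng(3)
      n, m, k = 256, 96, 8                  # signal length, measurements, sparsity
      x_true = np.zeros(n)
      x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

      A = rng.standard_normal((m, n)) / np.sqrt(m)     # sensing matrix
      y = A @ x_true                                   # under-sampled measurements

      step = 1.0 / np.linalg.norm(A, 2) ** 2           # 1 / Lipschitz constant
      lam = 0.01
      x = np.zeros(n)
      for _ in range(500):
          x = soft(x + step * A.T @ (y - A @ x), lam * step)

      print("recovery error:", np.round(np.linalg.norm(x - x_true), 4))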

  11. Weighted-noise threshold based channel estimation for OFDM ...

    Indian Academy of Sciences (India)

    Existing optimal time-domain thresholds exhibit suboptimal behavior for completely unavailable KCS ... Compared with no truncation case, truncation improved the MSE ... channel estimation errors has been studied. ...... Consumer Electron.

  12. Novel Threshold Changeable Secret Sharing Schemes Based on Polynomial Interpolation.

    Science.gov (United States)

    Yuan, Lifeng; Li, Mingchu; Guo, Cheng; Choo, Kim-Kwang Raymond; Ren, Yizhi

    2016-01-01

    After any distribution of secret sharing shadows in a threshold changeable secret sharing scheme, the threshold may need to be adjusted to deal with changes in the security policy and adversary structure. For example, when employees leave the organization, it is not realistic to expect departing employees to ensure the security of their secret shadows. Therefore, in 2012, Zhang et al. proposed (t → t', n) and ({t1, t2,⋯, tN}, n) threshold changeable secret sharing schemes. However, their schemes suffer from a number of limitations such as a strict limit on the threshold values, a large storage space requirement for secret shadows, and significant computation for constructing and recovering polynomials. To address these limitations, we propose two improved dealer-free threshold changeable secret sharing schemes. In our schemes, we construct polynomials to update secret shadows, and use a two-variable one-way function to resist collusion attacks and secure the information stored by the combiner. We then demonstrate that our schemes can adjust the threshold safely.

  13. Intraoperative transfusion threshold and tissue oxygenation

    DEFF Research Database (Denmark)

    Nielsen, K; Dahl, B; Johansson, P I

    2012-01-01

    Transfusion with allogeneic red blood cells (RBCs) may be needed to maintain oxygen delivery during major surgery, but the appropriate haemoglobin (Hb) concentration threshold has not been well established. We hypothesised that a higher level of Hb would be associated with improved subcutaneous oxygen tension during major spinal surgery....

  14. A rule based method for context sensitive threshold segmentation in SPECT using simulation

    International Nuclear Information System (INIS)

    Fleming, John S.; Alaamer, Abdulaziz S.

    1998-01-01

    Robust techniques for automatic or semi-automatic segmentation of objects in single photon emission computed tomography (SPECT) are still the subject of development. This paper describes a threshold based method which uses empirical rules derived from analysis of computer simulated images of a large number of objects. The use of simulation allowed the factors affecting the threshold which correctly segmented objects to be investigated systematically. Rules could then be derived from these data to define the threshold in any particular context. The technique operated iteratively and calculated local context sensitive thresholds along radial profiles from the centre of gravity of the object. It was evaluated in a further series of simulated objects and in human studies, and compared to the use of a global fixed threshold. The method was capable of improving accuracy of segmentation and volume assessment compared to the global threshold technique. The improvements were greater for small volumes, shapes with large surface area to volume ratio, variable surrounding activity and non-uniform distributions. The method was applied successfully to simulated objects and human studies and is considered to be a significant advance on global fixed threshold techniques. (author)

  15. Is the diagnostic threshold for bulimia nervosa clinically meaningful?

    Science.gov (United States)

    Chapa, Danielle A N; Bohrer, Brittany K; Forbush, Kelsie T

    2018-01-01

    The DSM-5 differentiates full- and sub-threshold bulimia nervosa (BN) according to average weekly frequencies of binge eating and inappropriate compensatory behaviors. This study was the first to evaluate the modified frequency criterion for BN published in the DSM-5. The purpose of this study was to test whether community-recruited adults (N=125; 83.2% women) with current full-threshold (n=77) or sub-threshold BN (n=48) differed in comorbid psychopathology and eating disorder (ED) illness duration, symptom severity, and clinical impairment. Participants completed the Clinical Impairment Assessment and participated in semi-structured clinical interviews of ED- and non-ED psychopathology. Differences between the sub- and full-threshold BN groups were assessed using MANOVA and Chi-square analyses. ED illness duration, age-of-onset, body mass index (BMI), alcohol and drug misuse, and the presence of current and lifetime mood or anxiety disorders did not differ between participants with sub- and full-threshold BN. Participants with full-threshold BN had higher levels of clinical impairment and weight concern than those with sub-threshold BN. However, minimal clinically important difference analyses suggested that statistically significant differences between participants with sub- and full-threshold BN on clinical impairment and weight concern were not clinically significant. In conclusion, sub-threshold BN did not differ from full-threshold BN in clinically meaningful ways. Future studies are needed to identify an improved frequency criterion for BN that better distinguishes individuals in ways that will more validly inform prognosis and effective treatment planning for BN. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. SART-Type Half-Threshold Filtering Approach for CT Reconstruction.

    Science.gov (United States)

    Yu, Hengyong; Wang, Ge

    2014-01-01

    The ℓ1 regularization problem has been widely used to solve sparsity constrained problems. To enhance the sparsity constraint for better imaging performance, a promising direction is to use the ℓp norm (0 < p < 1) and solve the ℓp minimization problem. Very recently, Xu et al. developed an analytic solution for the ℓ1/2 regularization via an iterative thresholding operation, which is also referred to as half-threshold filtering. In this paper, we design a simultaneous algebraic reconstruction technique (SART)-type half-threshold filtering framework to solve the computed tomography (CT) reconstruction problem. In the medical imaging field, the discrete gradient transform (DGT) is widely used to define the sparsity. However, the DGT is noninvertible and cannot be applied to half-threshold filtering for CT reconstruction. To demonstrate the utility of the proposed SART-type half-threshold filtering framework, an emphasis of this paper is to construct a pseudoinverse transform for the DGT. The proposed algorithms are evaluated with numerical and physical phantom data sets. Our results show that the SART-type half-threshold filtering algorithms have great potential to improve the reconstructed image quality from few and noisy projections. They are complementary to the counterparts of the state-of-the-art soft-threshold filtering and hard-threshold filtering.
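
    The half-threshold operation at the heart of this framework has a closed analytic form (following Xu et al.'s solution for ℓ1/2 regularization). The sketch below implements only that element-wise operator in NumPy as an illustration; it is an assumption-hedged fragment, not the authors' full SART framework or their pseudoinverse DGT.

```python
import numpy as np

def half_threshold(x, lam):
    """Element-wise half-threshold operator for l_{1/2} regularization
    (analytic form attributed to Xu et al.; shown here only as a sketch)."""
    x = np.asarray(x, dtype=float)
    out = np.zeros_like(x)
    # Values whose magnitude exceeds this cutoff are shrunk; the rest are set to zero.
    cutoff = (54.0 ** (1.0 / 3.0) / 4.0) * lam ** (2.0 / 3.0)
    mask = np.abs(x) > cutoff
    phi = np.arccos((lam / 8.0) * (np.abs(x[mask]) / 3.0) ** (-1.5))
    out[mask] = (2.0 / 3.0) * x[mask] * (1.0 + np.cos(2.0 * np.pi / 3.0 - 2.0 * phi / 3.0))
    return out

# A SART-type loop would alternate an algebraic update with this filtering step, e.g.
# (sart_update, transform and inverse_transform are hypothetical helper names):
#   x = sart_update(x, projections); x = inverse_transform(half_threshold(transform(x), lam))
print(half_threshold([0.1, 0.5, -2.0, 5.0], lam=1.0))
```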

  17. Precipitation thresholds for landslide occurrence near Seattle, Mukilteo, and Everett, Washington

    Science.gov (United States)

    Scheevel, Caroline R.; Baum, Rex L.; Mirus, Benjamin B.; Smith, Joel B.

    2017-04-27

    Shallow landslides along coastal bluffs frequently occur in the railway corridor between Seattle and Everett, Washington. These slides disrupt passenger rail service, both because of required track maintenance and because the railroad owner, Burlington Northern Santa Fe Railway, does not allow passenger travel for 48 hours after a disruptive landslide. Sound Transit, which operates commuter trains in the corridor, is interested in a decision-making tool to help preemptively cancel passenger railway service in dangerous conditions and reallocate resources to alternative transportation. Statistical analysis showed that a majority of landslides along the Seattle-Everett Corridor are strongly correlated with antecedent rainfall, but that 21-37 percent of recorded landslide dates experienced less than 1 inch of precipitation in the 3 days preceding the landslide and less than 4 inches of rain in the 15 days prior to the preceding 3 days. We developed two empirical thresholds to identify precipitation conditions correlated with landslide occurrence. The two thresholds are defined as P3 = 2.16-0.44P15 and P3 = 2.16-0.22P32, where P3 is the cumulative precipitation in the 3 days prior to the considered date and P15 or P32 is the cumulative precipitation in the 15 days or 32 days prior to P3 (all measurements given in inches). The two thresholds, when compared to a previously developed threshold, quantitatively improve the prediction rate. We also investigated rainfall intensity-duration (ID) thresholds to determine whether revision would improve identification of moderate-intensity, landslide-producing storms. New, optimized ID thresholds evaluate rainstorms lasting at least 12 hours and identify landslide-inducing storms that were typically missed by previously published ID thresholds. The main advantage of the ID thresholds appears when they are combined with recent-antecedent thresholds because rainfall conditions that exceed both threshold types are more likely to induce
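
    The two recent-antecedent thresholds quoted above are simple linear inequalities in cumulative precipitation, so checking whether a given day exceeds them is a one-line computation. The snippet below is a small illustration using the published coefficients; the function and variable names are my own.

```python
def exceeds_thresholds(p3, p15, p32):
    """Check the two recent-antecedent precipitation thresholds (all values in inches):
    P3 = 2.16 - 0.44*P15  and  P3 = 2.16 - 0.22*P32."""
    t15 = 2.16 - 0.44 * p15   # threshold on P3 given the prior 15-day total
    t32 = 2.16 - 0.22 * p32   # threshold on P3 given the prior 32-day total
    return p3 >= t15, p3 >= t32

# Example: 1.5 inches over 3 days following a moderately wet month.
print(exceeds_thresholds(p3=1.5, p15=2.0, p32=4.0))   # (True, True)
```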

  18. Image size invariant visual cryptography for general access structures subject to display quality constraints.

    Science.gov (United States)

    Lee, Kai-Hui; Chiu, Pei-Ling

    2013-10-01

    Conventional visual cryptography (VC) suffers from a pixel-expansion problem, or an uncontrollable display quality problem for recovered images, and lacks a general approach to construct visual secret sharing schemes for general access structures. We propose a general and systematic approach to address these issues without sophisticated codebook design. This approach can be used for binary secret images in non-computer-aided decryption environments. To avoid pixel expansion, we design a set of column vectors to encrypt secret pixels rather than using the conventional VC-based approach. We begin by formulating a mathematical model for the VC construction problem to find the column vectors for the optimal VC construction, after which we develop a simulated-annealing-based algorithm to solve the problem. The experimental results show that the display quality of the recovered image is superior to that reported in previous papers.

  19. Theory of threshold phenomena

    International Nuclear Information System (INIS)

    Hategan, Cornel

    2002-01-01

    Theory of Threshold Phenomena in Quantum Scattering is developed in terms of Reduced Scattering Matrix. Relationships of different types of threshold anomalies both to nuclear reaction mechanisms and to nuclear reaction models are established. Magnitude of threshold effect is related to spectroscopic factor of zero-energy neutron state. The Theory of Threshold Phenomena, based on Reduced Scattering Matrix, does establish relationships between different types of threshold effects and nuclear reaction mechanisms: the cusp and non-resonant potential scattering, s-wave threshold anomaly and compound nucleus resonant scattering, p-wave anomaly and quasi-resonant scattering. A threshold anomaly related to resonant or quasi resonant scattering is enhanced provided the neutron threshold state has large spectroscopic amplitude. The Theory contains, as limit cases, Cusp Theories and also results of different nuclear reactions models as Charge Exchange, Weak Coupling, Bohr and Hauser-Feshbach models. (author)

  20. Winner's Curse Correction and Variable Thresholding Improve Performance of Polygenic Risk Modeling Based on Genome-Wide Association Study Summary-Level Data.

    Directory of Open Access Journals (Sweden)

    Jianxin Shi

    2016-12-01

    Full Text Available Recent heritability analyses have indicated that genome-wide association studies (GWAS) have the potential to improve genetic risk prediction for complex diseases based on polygenic risk score (PRS), a simple modelling technique that can be implemented using summary-level data from the discovery samples. We herein propose modifications to improve the performance of PRS. We introduce threshold-dependent winner's-curse adjustments for marginal association coefficients that are used to weight the single-nucleotide polymorphisms (SNPs) in PRS. Further, as a way to incorporate external functional/annotation knowledge that could identify subsets of SNPs highly enriched for associations, we propose variable thresholds for SNP selection. We applied our methods to GWAS summary-level data of 14 complex diseases. Across all diseases, a simple winner's curse correction uniformly led to enhancement of performance of the models, whereas incorporation of functional SNPs was beneficial only for selected diseases. Compared to the standard PRS algorithm, the proposed methods in combination led to notable gain in efficiency (25-50% increase in the prediction R²) for 5 of 14 diseases. As an example, for GWAS of type 2 diabetes, winner's curse correction improved prediction R² from 2.29% based on the standard PRS to 3.10% (P = 0.0017) and incorporating functional annotation data further improved R² to 3.53% (P = 2×10⁻⁵). Our simulation studies illustrate why differential treatment of certain categories of functional SNPs, even when shown to be highly enriched for GWAS-heritability, does not lead to proportionate improvement in genetic risk-prediction because of non-uniform linkage disequilibrium structure.
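
    At its core, the standard PRS is a weighted sum of risk-allele counts over the SNPs passing a p-value threshold. The sketch below shows that baseline computation with synthetic data; the optional `shrink` factor is a hypothetical per-SNP shrinkage standing in for where a winner's-curse correction of the marginal coefficients would enter (the paper's correction is threshold-dependent and more involved).

```python
import numpy as np

def polygenic_risk_score(genotypes, betas, pvalues, p_threshold, shrink=None):
    """Baseline PRS: sum of effect sizes times allele counts over SNPs whose
    marginal p-value falls below the selection threshold."""
    keep = pvalues < p_threshold
    weights = betas[keep] if shrink is None else betas[keep] * shrink[keep]
    return genotypes[:, keep] @ weights

rng = np.random.default_rng(0)
G = rng.integers(0, 3, size=(5, 100))    # 5 individuals x 100 SNPs (0/1/2 risk-allele counts)
beta = rng.normal(0.0, 0.05, 100)        # marginal GWAS coefficients (synthetic)
pval = rng.uniform(0.0, 1.0, 100)        # marginal p-values (synthetic)
print(polygenic_risk_score(G, beta, pval, p_threshold=0.05))
```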

  1. Quantum Information, computation and cryptography. An introductory survey of theory, technology and experiments

    International Nuclear Information System (INIS)

    Benatti, Fabio; Fannes, Mark; Floreanini, Roberto; Petritis, Dimitri

    2010-01-01

    This multi-authored textbook addresses graduate students with a background in physics, mathematics or computer science. No research experience is necessary. Consequently, rather than comprehensively reviewing the vast body of knowledge and literature gathered in the past twenty years, this book concentrates on a number of carefully selected aspects of quantum information theory and technology. Given the highly interdisciplinary nature of the subject, the multi-authored approach brings together different points of view from various renowned experts, providing a coherent picture of the subject matter. The book consists of ten chapters and includes examples, problems, and exercises. The first five present the mathematical tools required for a full comprehension of various aspects of quantum mechanics, classical information, and coding theory. Chapter 6 deals with the manipulation and transmission of information in the quantum realm. Chapters 7 and 8 discuss experimental implementations of quantum information ideas using photons and atoms. Finally, chapters 9 and 10 address ground-breaking applications in cryptography and computation. (orig.)

  2. Particles near threshold

    International Nuclear Information System (INIS)

    Bhattacharya, T.; Willenbrock, S.

    1993-01-01

    We propose returning to the definition of the width of a particle in terms of the pole in the particle's propagator. Away from thresholds, this definition of width is equivalent to the standard perturbative definition, up to next-to-leading order; however, near a threshold, the two definitions differ significantly. The width as defined by the pole position provides more information in the threshold region than the standard perturbative definition and, in contrast with the perturbative definition, does not vanish when a two-particle s-wave threshold is approached from below

  3. Adaptive Hardware Cryptography Engine Based on FPGA

    International Nuclear Information System (INIS)

    Afify, M.A.A.

    2011-01-01

    In the last two decades, with the spread of real-time applications over public networks and communications, the need for information security has become more important, together with very high-speed data processing to keep up with real-time application requirements; this is the reason for using an FPGA as the implementation platform for the proposed cryptography engine. Hence, in this thesis a new S-Box design has been demonstrated and implemented. The simulation results of the proposed S-Box are compared with different S-Box designs in the DES, Twofish and Rijndael algorithms, and a further comparison is made among proposed S-Boxes of different sizes. The proposed S-Box is implemented with 32-bit input data lines and compared with designs in the encryption algorithms with the same input lines; the proposed S-Box achieves a maximum frequency of 120 MHz, whereas the DES S-Box gives 34 MHz and Rijndael 71 MHz. On the other hand, the proposed design gives the best implementation area, occupying 50 configurable logic blocks (CLBs) while DES requires 88 CLBs. The proposed S-Box was also implemented with different input sizes of 64, 128, and 256 bits. The implementation was carried out using a UniDAq PCI card with an FPGA chip XCV 800; synthesis was carried out for all designs using Leonardo Spectrum, and simulation using the ModelSim simulator from the FPGA Advantage package. Finally, the results were evaluated and verified using the UniDAq FPGA PCI card with the XCV 800 chip. Different case studies have been implemented: data encryption, image encryption, voice encryption, and video encryption. A prototype of a Remote Monitoring Control System has been implemented. Finally, the proposed S-Box design achieves significant improvements in maximum frequency, implementation area, and encryption strength.

  4. Elliptic Curve Cryptography-Based Authentication with Identity Protection for Smart Grids.

    Directory of Open Access Journals (Sweden)

    Liping Zhang

    Full Text Available In a smart grid, the power service provider enables the expected power generation amount to be measured according to current power consumption, thus stabilizing the power system. However, the data transmitted over smart grids are not protected and thus suffer from several types of security threats and attacks. A robust and efficient authentication protocol should therefore be provided to strengthen the security of smart grid networks. As the Supervisory Control and Data Acquisition system provides the security protection between the control center and substations in most smart grid environments, we focus on how to secure the communications between the substations and smart appliances. Existing security approaches fail to address the performance-security balance. In this study, we suggest a mitigation authentication protocol based on Elliptic Curve Cryptography with privacy protection by using a tamper-resistant device at the smart appliance side to achieve a delicate balance between performance and security of smart grids. The proposed protocol provides some attractive features such as identity protection, mutual authentication and key agreement. Finally, we demonstrate the completeness of the proposed protocol using the Gong-Needham-Yahalom logic.

  5. Elliptic Curve Cryptography-Based Authentication with Identity Protection for Smart Grids.

    Science.gov (United States)

    Zhang, Liping; Tang, Shanyu; Luo, He

    2016-01-01

    In a smart grid, the power service provider enables the expected power generation amount to be measured according to current power consumption, thus stabilizing the power system. However, the data transmitted over smart grids are not protected and thus suffer from several types of security threats and attacks. A robust and efficient authentication protocol should therefore be provided to strengthen the security of smart grid networks. As the Supervisory Control and Data Acquisition system provides the security protection between the control center and substations in most smart grid environments, we focus on how to secure the communications between the substations and smart appliances. Existing security approaches fail to address the performance-security balance. In this study, we suggest a mitigation authentication protocol based on Elliptic Curve Cryptography with privacy protection by using a tamper-resistant device at the smart appliance side to achieve a delicate balance between performance and security of smart grids. The proposed protocol provides some attractive features such as identity protection, mutual authentication and key agreement. Finally, we demonstrate the completeness of the proposed protocol using the Gong-Needham-Yahalom logic.
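
    The protocol itself is not reproduced here, but the elliptic-curve primitive such schemes build on (an ECDH key agreement followed by key derivation) can be sketched with the Python `cryptography` package. This is only the generic building block under my own naming, not the paper's authentication scheme.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each side (say, a substation and a smart appliance) generates an EC key pair.
appliance_priv = ec.generate_private_key(ec.SECP256R1())
substation_priv = ec.generate_private_key(ec.SECP256R1())

# After exchanging public keys, both sides derive the same shared secret via ECDH.
secret_a = appliance_priv.exchange(ec.ECDH(), substation_priv.public_key())
secret_s = substation_priv.exchange(ec.ECDH(), appliance_priv.public_key())

def session_key(shared_secret: bytes) -> bytes:
    # A KDF turns the raw shared secret into a fixed-length session key.
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"smart-grid-session").derive(shared_secret)

assert session_key(secret_a) == session_key(secret_s)
```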

  6. A User Authentication Scheme Based on Elliptic Curves Cryptography for Wireless Ad Hoc Networks.

    Science.gov (United States)

    Chen, Huifang; Ge, Linlin; Xie, Lei

    2015-07-14

    Because a wireless ad hoc network (WANET) operates without infrastructure support, it is vulnerable to various attacks. Moreover, user authentication is the first safety barrier in a network. Mutual trust is achieved by a protocol that enables the communicating parties to authenticate each other and to exchange session keys at the same time. For the resource-constrained WANET, an efficient and lightweight user authentication scheme is necessary. In this paper, we propose a user authentication scheme based on the self-certified public key system and elliptic curve cryptography for a WANET. Using the proposed scheme, efficient two-way user authentication and secure session key agreement can be achieved. Security analysis shows that our proposed scheme is resilient to common known attacks. In addition, the performance analysis shows that our proposed scheme performs similarly to or better than some existing user authentication schemes.

  7. An efficient RFID authentication protocol to enhance patient medication safety using elliptic curve cryptography.

    Science.gov (United States)

    Zhang, Zezhong; Qi, Qingqing

    2014-05-01

    Medication errors are very dangerous and can cause serious, even fatal, harm to patients. In order to reduce medication errors, automated patient medication systems using Radio Frequency Identification (RFID) technology have been deployed in many hospitals. The data transmitted in those medication systems are important and sensitive. In the past decade, many security protocols have been proposed to ensure their secure transmission, and these have attracted wide attention. Because it provides mutual authentication between the medication server and the tag, the RFID authentication protocol is considered the most important security protocol in those systems. In this paper, we propose an RFID authentication protocol to enhance patient medication safety using elliptic curve cryptography (ECC). The analysis shows that the proposed protocol overcomes security weaknesses in previous protocols and has better performance. Therefore, the proposed protocol is very suitable for automated patient medication systems.

  8. Synchronization of random bit generators based on coupled chaotic lasers and application to cryptography.

    Science.gov (United States)

    Kanter, Ido; Butkovski, Maria; Peleg, Yitzhak; Zigzag, Meital; Aviad, Yaara; Reidler, Igor; Rosenbluh, Michael; Kinzel, Wolfgang

    2010-08-16

    Random bit generators (RBGs) constitute an important tool in cryptography, stochastic simulations and secure communications. The latter in particular has some difficult requirements: a high generation rate of unpredictable bit strings and secure key-exchange protocols over public channels. Deterministic algorithms generate pseudo-random number sequences at high rates; however, their unpredictability is limited by the very nature of their deterministic origin. Recently, physical RBGs based on chaotic semiconductor lasers were shown to exceed Gbit/s rates. Whether secure synchronization of two high-rate physical RBGs is possible remains an open question. Here we propose a method whereby two fast RBGs, based on mutually coupled chaotic lasers, are synchronized. Using information-theoretic analysis we demonstrate security against a powerful computational eavesdropper, capable of noiseless amplification, where all parameters are publicly known. The method is also extended to secure synchronization of a small network of three RBGs.

  9. Music effect on pain threshold evaluated with current perception threshold

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    AIM: Music relieves anxiety and psychotic tension. This effect of music is applied during surgical operations in hospitals and dental offices. It is still unclear whether this effect is limited to the psychological rather than the physical aspect, or whether it is influenced by the mood or emotion of the listener. To elucidate these issues, we evaluated the effect of music on pain threshold using the current perception threshold (CPT) and the profile of mood states (POMC) test. METHODS: Thirty healthy subjects (12 men, 18 women, 25-49 years old, mean age 34.9) were tested. (1) After the POMC test, the pain thresholds of all subjects were evaluated with CPT using a Neurometer (Radionics, USA) under six conditions (silence, slow-tempo classical music, nursery music, hard rock music, classical piano music and relaxation music) with 30-second intervals. (2) After the Stroop color-word test as a stressor, the pain threshold was evaluated with CPT under two conditions: silence and listening to slow-tempo classical music. RESULTS: While listening to music, CPT scores increased, especially at the 2 000 Hz level, which is related to compression, warmth and pain sensation. Type of music, musical preference and stress also affected the CPT score. CONCLUSION: The present study demonstrated that concentration on music raises the pain threshold and that stress and mood influence the effect of music on the pain threshold.

  10. Image encryption using fingerprint as key based on phase retrieval algorithm and public key cryptography

    Science.gov (United States)

    Zhao, Tieyu; Ran, Qiwen; Yuan, Lin; Chi, Yingying; Ma, Jing

    2015-09-01

    In this paper, a novel image encryption system with a fingerprint used as a secret key is proposed based on the phase retrieval algorithm and the RSA public key algorithm. In the system, the encryption keys include the fingerprint and the public key of the RSA algorithm, while the decryption keys are the fingerprint and the private key of the RSA algorithm. If the users share the fingerprint, then the system meets the basic requirements of asymmetric cryptography. The system is also applicable to information authentication. The fingerprint is used as a secret key in both the encryption and decryption processes, so that the receiver can verify the authenticity of the ciphertext by using the fingerprint in the decryption process. Finally, the simulation results show the validity of the encryption scheme and its high robustness against attacks based on the phase retrieval technique.

  11. Optical asymmetric cryptography based on elliptical polarized light linear truncation and a numerical reconstruction technique.

    Science.gov (United States)

    Lin, Chao; Shen, Xueju; Wang, Zhisong; Zhao, Cheng

    2014-06-20

    We demonstrate a novel optical asymmetric cryptosystem based on the principle of elliptical polarized light linear truncation and a numerical reconstruction technique. An array of linear polarizers is introduced to achieve linear truncation of the spatially resolved elliptical polarization distribution during image encryption. This encoding process can be characterized as confusion-based optical cryptography that involves no Fourier lens and no diffusion operation. Based on the Jones matrix formalism, the intensity transmittance for this truncation is deduced in order to perform elliptical polarized light reconstruction from two intensity measurements. The use of a quick response code makes the proposed cryptosystem practical, with versatile key sensitivity and fault tolerance. Both simulation and preliminary experimental results that support the theoretical analysis are presented. An analysis of the resistance of the proposed method against a known public key attack is also provided.

  12. Threshold-driven optimization for reference-based auto-planning

    Science.gov (United States)

    Long, Troy; Chen, Mingli; Jiang, Steve; Lu, Weiguo

    2018-02-01

    We study threshold-driven optimization methodology for automatically generating a treatment plan that is motivated by a reference DVH for IMRT treatment planning. We present a framework for threshold-driven optimization for reference-based auto-planning (TORA). Commonly used voxel-based quadratic penalties have two components for penalizing under- and over-dosing of voxels: a reference dose threshold and associated penalty weight. Conventional manual- and auto-planning using such a function involves iteratively updating the preference weights while keeping the thresholds constant, an unintuitive and often inconsistent method for planning toward some reference DVH. However, driving a dose distribution by threshold values instead of preference weights can achieve similar plans with less computational effort. The proposed methodology spatially assigns reference DVH information to threshold values, and iteratively improves the quality of that assignment. The methodology effectively handles both sub-optimal and infeasible DVHs. TORA was applied to a prostate case and a liver case as a proof-of-concept. Reference DVHs were generated using a conventional voxel-based objective, then altered to be either infeasible or easy-to-achieve. TORA was able to closely recreate reference DVHs in 5-15 iterations of solving a simple convex sub-problem. TORA has the potential to be effective for auto-planning based on reference DVHs. As dose prediction and knowledge-based planning becomes more prevalent in the clinical setting, incorporating such data into the treatment planning model in a clear, efficient way will be crucial for automated planning. A threshold-focused objective tuning should be explored over conventional methods of updating preference weights for DVH-guided treatment planning.
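
    The voxel-based quadratic penalty described above can be written down directly: each voxel contributes an under-dose and an over-dose term relative to its reference dose threshold, scaled by penalty weights. The sketch below is a generic illustration of that objective, not the TORA algorithm itself; threshold-driven planning would iterate on `thresholds`, whereas conventional planning iterates on the weights.

```python
import numpy as np

def quadratic_dose_penalty(dose, thresholds, w_under, w_over):
    """Voxel-wise quadratic penalty with separate under- and over-dose terms.
    dose and thresholds are per-voxel arrays; weights may be scalars or arrays."""
    under = np.maximum(thresholds - dose, 0.0)   # penalize only voxels below their threshold
    over = np.maximum(dose - thresholds, 0.0)    # penalize only voxels above their threshold
    return np.sum(w_under * under**2 + w_over * over**2)

dose = np.array([58.0, 61.5, 70.2, 45.0])       # hypothetical per-voxel doses (Gy)
ref = np.array([60.0, 60.0, 66.0, 50.0])        # per-voxel reference dose thresholds (Gy)
print(quadratic_dose_penalty(dose, ref, w_under=1.0, w_over=0.5))
```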

  13. Symmetric Stream Cipher using Triple Transposition Key Method and Base64 Algorithm for Security Improvement

    Science.gov (United States)

    Nurdiyanto, Heri; Rahim, Robbi; Wulan, Nur

    2017-12-01

    Symmetric cryptography algorithms are known to have many weaknesses in the encryption process compared with asymmetric algorithms. A symmetric stream cipher is an algorithm that works by XORing the plaintext with the key. To improve the security of the symmetric stream cipher algorithm, an improvement is made using a Triple Transposition Key, developed from the Transposition Cipher, together with the Base64 algorithm as the final step of the encryption process; experiments show that the resulting ciphertext is sufficiently random.
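
    As a rough illustration of the pipeline the abstract describes (XOR of the plaintext with a key stream, then Base64 as the final step), here is a minimal sketch. The key handling below is a plain repeating key and merely stands in for the authors' Triple Transposition Key derivation, which is not reproduced.

```python
import base64
from itertools import cycle

def xor_stream(data: bytes, key: bytes) -> bytes:
    # XOR each plaintext byte with the corresponding key-stream byte.
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

def encrypt(plaintext: bytes, key: bytes) -> str:
    # Stream-cipher step followed by Base64 as the final encoding step.
    return base64.b64encode(xor_stream(plaintext, key)).decode("ascii")

def decrypt(ciphertext: str, key: bytes) -> bytes:
    return xor_stream(base64.b64decode(ciphertext), key)

c = encrypt(b"attack at dawn", b"not-a-real-key")
assert decrypt(c, b"not-a-real-key") == b"attack at dawn"
print(c)
```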

  14. Implementation of floating gate MOSFET in inverter for threshold voltage tunability

    Directory of Open Access Journals (Sweden)

    Musa F.A.S.

    2017-01-01

    Full Text Available This paper presents the ability of the floating-gate MOSFET (FGMOS) threshold voltage to be programmed or tuned, which is exploited to improve the performance of electronic circuit designs. This special characteristic of the FGMOS definitely contributes towards low-voltage and low-power circuit design. The threshold voltages of the FGMOS and a conventional NMOS are compared in order to show that the FGMOS can produce a lower threshold voltage than the conventional NMOS. In addition, the FGMOS is implemented in an inverter circuit to demonstrate the programmability of its threshold voltage. The operation of the inverter circuits is verified using Synopsys simulation in 0.1 μm CMOS technology with a supply voltage of 1.8 V.

  15. Modelling the regulatory system for diabetes mellitus with a threshold window

    Science.gov (United States)

    Yang, Jin; Tang, Sanyi; Cheke, Robert A.

    2015-05-01

    Piecewise (or non-smooth) glucose-insulin models with threshold windows for type 1 and type 2 diabetes mellitus are proposed and analyzed with a view to improving understanding of the glucose-insulin regulatory system. For glucose-insulin models with a single threshold, the existence and stability of regular, virtual, pseudo-equilibria and tangent points are addressed. Then the relations between regular equilibria and a pseudo-equilibrium are studied. Furthermore, the sufficient and necessary conditions for the global stability of regular equilibria and the pseudo-equilibrium are provided by using qualitative analysis techniques of non-smooth Filippov dynamic systems. Sliding bifurcations related to boundary node bifurcations were investigated with theoretical and numerical techniques, and insulin clinical therapies are discussed. For glucose-insulin models with a threshold window, the effects of glucose thresholds or the widths of threshold windows on the durations of insulin therapy and glucose infusion were addressed. The duration of the effects of an insulin injection is sensitive to the variation of thresholds. Our results indicate that blood glucose level can be maintained within a normal range using piecewise glucose-insulin models with a single threshold or a threshold window. Moreover, our findings suggest that it is critical to individualise insulin therapy for each patient separately, based on initial blood glucose levels.

  16. A Novel Image Steganography Technique for Secured Online Transaction Using DWT and Visual Cryptography

    Science.gov (United States)

    Anitha Devi, M. D.; ShivaKumar, K. B.

    2017-08-01

    Online payment ecosystems are a prime target for cyber fraud, so end-to-end encryption is needed in order to maintain the integrity of secret information related to transactions carried out online. With access to payment-related sensitive information, which enables a large number of money transactions every day, the payment infrastructure is a major target for hackers. The proposed system highlights an ideal approach for secure online fund-transfer transactions, using a unique combination of visual cryptography and a Haar-based discrete wavelet transform steganography technique. This combination of data-hiding techniques reduces the amount of information shared between the consumer and the online merchant needed for a successful online transaction, while providing enhanced security for the customer's account details, thereby increasing customer confidence and preventing "identity theft" and "phishing". To evaluate the effectiveness of the proposed algorithm, root mean square error and peak signal-to-noise ratio have been used as evaluation parameters.

  17. A New Wavelet Threshold Determination Method Considering Interscale Correlation in Signal Denoising

    Directory of Open Access Journals (Sweden)

    Can He

    2015-01-01

    Full Text Available Due to its simple calculation and good denoising effect, the wavelet threshold denoising method has been widely used in signal denoising. In this method, the threshold is an important parameter that affects the denoising effect. In order to improve the denoising effect of the existing methods, a new threshold considering interscale correlation is presented. Firstly, a new correlation index is proposed based on the propagation characteristics of the wavelet coefficients. Then, a threshold determination strategy is obtained using the new index. At the end of the paper, a simulation experiment is given to verify the effectiveness of the proposed method. In the experiment, four benchmark signals are used as test signals. Simulation results show that the proposed method can achieve a good denoising effect under various signal types, noise intensities, and thresholding functions.
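
    A baseline wavelet-threshold denoiser, of the kind against which an interscale-correlation threshold such as the one proposed here would be compared, can be sketched with PyWavelets. The universal threshold used below is a standard stand-in, not the paper's new determination rule.

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Baseline wavelet denoising: soft-threshold the detail coefficients
    with the universal threshold, then reconstruct."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Noise level estimated from the finest detail coefficients (median absolute deviation).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))      # universal threshold
    denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)

t = np.linspace(0.0, 1.0, 1024)
noisy = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.default_rng(0).standard_normal(t.size)
clean = wavelet_denoise(noisy)
print(clean.shape)
```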

  18. Compilation Techniques Specific for a Hardware Cryptography-Embedded Multimedia Mobile Processor

    Directory of Open Access Journals (Sweden)

    Masa-aki FUKASE

    2007-12-01

    Full Text Available The development of single chip VLSI processors is the key technology of ever growing pervasive computing to answer overall demands for usability, mobility, speed, security, etc. We have so far developed a hardware cryptography-embedded multimedia mobile processor architecture, HCgorilla. Since HCgorilla integrates a wide range of techniques from architectures to applications and languages, one-sided design approach is not always useful. HCgorilla needs more complicated strategy, that is, hardware/software (H/S codesign. Thus, we exploit the software support of HCgorilla composed of a Java interface and parallelizing compilers. They are assumed to be installed in servers in order to reduce the load and increase the performance of HCgorilla-embedded clients. Since compilers are the essence of software's responsibility, we focus in this article on our recent results about the design, specifications, and prototyping of parallelizing compilers for HCgorilla. The parallelizing compilers are composed of a multicore compiler and a LIW compiler. They are specified to abstract parallelism from executable serial codes or the Java interface output and output the codes executable in parallel by HCgorilla. The prototyping compilers are written in Java. The evaluation by using an arithmetic test program shows the reasonability of the prototyping compilers compared with hand compilers.

  19. Social Thresholds and their Translation into Social-ecological Management Practices

    Directory of Open Access Journals (Sweden)

    Lisa Christensen

    2012-03-01

    Full Text Available The objective of this paper is to provide a preliminary discussion of how to improve our conceptualization of social thresholds using (1) a more sociological analysis of social resilience, and (2) results from research carried out in collaboration with the Champagne and Aishihik First Nations of the Yukon Territory, Canada. Our sociological analysis of the concept of resilience begins with a review of the literature followed by placement of the concept in the domain of sociological theory to gain insight into its strengths and limitations. A new notion of social thresholds is proposed and case study research discussed to support the proposition. Our findings suggest that, rather than viewing social thresholds as breakpoints between two regimes, as they are typically conceived in the resilience literature, they be viewed in terms of collectively recognized points that signify new experiences. Some examples of thresholds identified in our case study include power in decision making, level of healing from historical events, and a preference for small-scale development over large capital intensive projects.

  20. Generalized optical angular momentum sorter and its application to high-dimensional quantum cryptography.

    Science.gov (United States)

    Larocque, Hugo; Gagnon-Bischoff, Jérémie; Mortimer, Dominic; Zhang, Yingwen; Bouchard, Frédéric; Upham, Jeremy; Grillo, Vincenzo; Boyd, Robert W; Karimi, Ebrahim

    2017-08-21

    The orbital angular momentum (OAM) carried by optical beams is a useful quantity for encoding information. This form of encoding has been incorporated into various works ranging from telecommunications to quantum cryptography, most of which require methods that can rapidly process the OAM content of a beam. Among current state-of-the-art schemes that can readily acquire this information are so-called OAM sorters, which consist of devices that spatially separate the OAM components of a beam. Such devices have found numerous applications in optical communications, a field that is in constant demand for additional degrees of freedom, such as polarization and wavelength, into which information can also be encoded. Here, we report the implementation of a device capable of sorting a beam based on its OAM and polarization content, which could be of use in works employing both of these degrees of freedom as information channels. After characterizing our fabricated device, we demonstrate how it can be used for quantum communications via a quantum key distribution protocol.

  1. Effect of imperfect Faraday mirrors on the security of a Faraday–Michelson quantum cryptography system

    International Nuclear Information System (INIS)

    Wang, Wei-Long; Gao, Ming; Ma, Zhi

    2013-01-01

    The one-way Faraday–Michelson system is a very useful practical quantum cryptography system where Faraday mirrors (FMs) play an important role. In this paper we analyze the security of this system against imperfect FMs. We consider the security loophole caused by imperfect FMs in Alice’s and Bob’s security zones. Then we implement a passive FM attack in this system. By changing the values of the imperfection parameters of Alice’s FMs, we calculate the quantum bit error rate between Alice and Bob induced by Eve and the probability that Eve obtains outcomes successfully. It is shown that the imperfection of one of Alice’s two FMs makes the system sensitive to an attack. Finally we give a modified key rate as a function of the FM imperfections. The security analysis indicates that both Alice’s and Bob’s imperfect FMs can compromise the secure key. (paper)

  2. Laser-damage thresholds of thin-film optical coatings at 248 nm

    International Nuclear Information System (INIS)

    Milam, D.; Rainer, F.; Lowdermilk, W.H.

    1981-01-01

    We have measured the laser-induced damage thresholds for 248 nm wavelength light of over 100 optical coatings from commercial vendors and research institutions. All samples were irradiated once per damage site with temporally multi-lobed, 20-ns pulses generated by a KrF laser. The survey included high, partial, and dichroic reflectors, anti-reflective coatings, and single-layer films. The samples were supplied by ten vendors. The majority of samples tested were high reflectors and antireflective coatings; the highest damage thresholds for these two coating types were 8.5 and 9.4 J/cm², respectively. Although these represent extremes of what has been tested so far, several vendors have produced coatings of both types with thresholds which consistently exceed 6 J/cm². Repeated irradiations of some sites were made on a few samples. These yielded no degradation in threshold, but in fact some improvement in damage resistance. These same samples also exhibited no change in threshold after being retested seven months later.

  3. Dynamic-Threshold-Limited Timed-Token (DTLTT) Protocol | Kalu ...

    African Journals Online (AJOL)

    An improved version of the Static-Threshold-Limited On-Demand Guaranteed Service Timed-Token (STOGSTT) Media Access Control (MAC) protocol for channel capacity allocation to the asynchronous traffic in Multiservice Local Area Networks (MLANs) was developed and analyzed. The TLODGSTT protocol uses a static value of ...

  4. Threshold Signature Schemes Application

    Directory of Open Access Journals (Sweden)

    Anastasiya Victorovna Beresneva

    2015-10-01

    Full Text Available This work is devoted to the investigation of threshold signature schemes. The threshold signature schemes are systematized, and cryptographic constructions based on Lagrange interpolation polynomials, elliptic curves and bilinear pairings are examined. Different methods of generating and verifying threshold signatures are explored, and the practical applicability of threshold schemes to mobile agents, Internet banking and e-currency is shown. Topics for further investigation are given, which could reduce the level of counterfeit electronic documents signed by a group of users.
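
    The Lagrange-interpolation construction mentioned above is easiest to see in its simplest form, Shamir's (t, n) secret sharing over a prime field, which underlies many threshold signature schemes. The sketch below illustrates only that building block, not a complete signature scheme; the modulus and parameters are arbitrary choices for the example.

```python
import random

P = 2**127 - 1   # a Mersenne prime used as the field modulus for illustration

def make_shares(secret, t, n):
    """Split `secret` into n shares so that any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from any t shares."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(secret=123456789, t=3, n=5)
assert reconstruct(shares[:3]) == 123456789
```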

  5. Thresholds in radiobiology

    International Nuclear Information System (INIS)

    Katz, R.; Hofmann, W.

    1982-01-01

    Interpretations of biological radiation effects frequently use the word 'threshold'. The meaning of this word is explored together with its relationship to the fundamental character of radiation effects and to the question of perception. It is emphasised that although the existence of either a dose or an LET threshold can never be settled by experimental radiobiological investigations, it may be argued on fundamental statistical grounds that for all statistical processes, and especially where the number of observed events is small, the concept of a threshold is logically invalid. (U.K.)

  6. Analysis of the width-w non-adjacent form in conjunction with hyperelliptic curve cryptography and with lattices

    Science.gov (United States)

    Krenn, Daniel

    2013-01-01

    In this work the number of occurrences of a fixed non-zero digit in the width-w non-adjacent forms of all elements of a lattice in some region (e.g. a ball) is analysed. As bases, expanding endomorphisms with eigenvalues of the same absolute value are allowed. Applications of the main result are to numeral systems with an algebraic integer as base. These arise from efficient scalar multiplication methods (Frobenius-and-add methods) in hyperelliptic curve cryptography, and the result is needed for analysing the running time of such algorithms. The counting result itself is an asymptotic formula whose main term coincides with the full block length analysis. Its second-order term exhibits a periodic fluctuation. The proof follows Delange's method. PMID:23805020
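
    For readers unfamiliar with the object being counted, the width-w non-adjacent form of an ordinary integer can be computed with the standard recoding below; the algebraic-integer and lattice setting analysed in the paper generalizes this. Digit counts from such expansions are what the asymptotic formula describes.

```python
def width_w_naf(n, w):
    """Standard width-w NAF of a non-negative integer: digits lie in
    (-2^(w-1), 2^(w-1)) and any w consecutive positions contain at most one non-zero digit."""
    digits = []
    while n > 0:
        if n % 2 == 1:
            d = n % (1 << w)
            if d >= (1 << (w - 1)):
                d -= (1 << w)        # take the signed residue of smallest magnitude
            n -= d
        else:
            d = 0
        digits.append(d)
        n //= 2
    return digits                     # least-significant digit first

print(width_w_naf(239, 3))            # [-1, 0, 0, 0, -1, 0, 0, 0, 1]
```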

  7. Shifts in the relationship between motor unit recruitment thresholds versus derecruitment thresholds during fatigue.

    Science.gov (United States)

    Stock, Matt S; Mota, Jacob A

    2017-12-01

    Muscle fatigue is associated with diminished twitch force amplitude. We examined changes in the motor unit recruitment versus derecruitment threshold relationship during fatigue. Nine men (mean age = 26 years) performed repeated isometric contractions at 50% maximal voluntary contraction (MVC) knee extensor force until exhaustion. Surface electromyographic signals were detected from the vastus lateralis, and were decomposed into their constituent motor unit action potential trains. Motor unit recruitment and derecruitment thresholds and firing rates at recruitment and derecruitment were evaluated at the beginning, middle, and end of the protocol. On average, 15 motor units were studied per contraction. For the initial contraction, three subjects showed greater recruitment thresholds than derecruitment thresholds for all motor units. Five subjects showed greater recruitment thresholds than derecruitment thresholds for only low-threshold motor units at the beginning, with a mean cross-over of 31.6% MVC. As the muscle fatigued, many motor units were derecruited at progressively higher forces. In turn, decreased slopes and increased y-intercepts were observed. These shifts were complemented by increased firing rates at derecruitment relative to recruitment. As the vastus lateralis fatigued, the central nervous system's compensatory adjustments resulted in a shift of the regression line of the recruitment versus derecruitment threshold relationship. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.

  8. Modified Discrete Grey Wolf Optimizer Algorithm for Multilevel Image Thresholding

    Directory of Open Access Journals (Sweden)

    Linguo Li

    2017-01-01

    Full Text Available The computation of image segmentation has become more complicated with the increasing number of thresholds, and the selection and application of thresholds in image thresholding has become an NP problem at the same time. The paper puts forward the modified discrete grey wolf optimizer algorithm (MDGWO), which improves the optimal-solution updating mechanism of the search agent by means of weights. Taking Kapur's entropy as the optimized function and based on the discreteness of thresholds in image segmentation, the paper firstly discretizes the grey wolf optimizer (GWO) and then proposes a new attack strategy that uses a weight coefficient to replace the search formula for the optimal solution used in the original algorithm. The experimental results show that MDGWO can search out the optimal thresholds efficiently and precisely, and that they are very close to the results obtained by exhaustive search. In comparison with electromagnetism optimization (EMO), differential evolution (DE), the Artificial Bee Colony (ABC), and the classical GWO, it is concluded that MDGWO has advantages over the latter four in terms of image segmentation quality, objective function values, and their stability.
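
    Kapur's entropy, the objective that MDGWO maximizes, is the sum of the entropies of the histogram classes induced by a candidate threshold vector. A direct NumPy evaluation is sketched below; in practice the thresholds would be proposed by the optimizer rather than fixed by hand.

```python
import numpy as np

def kapur_entropy(hist, thresholds):
    """Sum of class entropies for the grey-level classes split at `thresholds`.
    hist: counts per grey level; thresholds: sorted cut points (grey-level indices)."""
    p = hist / hist.sum()
    edges = [0] + sorted(thresholds) + [len(hist)]
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        w = p[lo:hi].sum()
        if w > 0:
            q = p[lo:hi][p[lo:hi] > 0] / w      # normalized probabilities within the class
            total += -(q * np.log(q)).sum()
    return total

hist = np.bincount(np.random.default_rng(1).integers(0, 256, 10000), minlength=256)
print(kapur_entropy(hist, thresholds=[85, 170]))   # higher values are better for the optimizer
```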

  9. HMM-ModE – Improved classification using profile hidden Markov models by optimising the discrimination threshold and modifying emission probabilities with negative training sequences

    Directory of Open Access Journals (Sweden)

    Nandi Soumyadeep

    2007-03-01

    Full Text Available Abstract Background Profile Hidden Markov Models (HMM) are statistical representations of protein families derived from patterns of sequence conservation in multiple alignments and have been used in identifying remote homologues with considerable success. These conservation patterns arise from fold specific signals, shared across multiple families, and function specific signals unique to the families. The availability of sequences pre-classified according to their function permits the use of negative training sequences to improve the specificity of the HMM, both by optimizing the threshold cutoff and by modifying emission probabilities to minimize the influence of fold-specific signals. A protocol to generate family specific HMMs is described that first constructs a profile HMM from an alignment of the family's sequences and then uses this model to identify sequences belonging to other classes that score above the default threshold (false positives). Ten-fold cross validation is used to optimise the discrimination threshold score for the model. The advent of fast multiple alignment methods enables the use of the profile alignments to align the true and false positive sequences, and the resulting alignments are used to modify the emission probabilities in the original model. Results The protocol, called HMM-ModE, was validated on a set of sequences belonging to six sub-families of the AGC family of kinases. These sequences have an average sequence similarity of 63% among the group though each sub-group has a different substrate specificity. The optimisation of discrimination threshold, by using negative sequences scored against the model improves specificity in test cases from an average of 21% to 98%. Further discrimination by the HMM after modifying model probabilities using negative training sequences is provided in a few cases, the average specificity rising to 99%. Similar improvements were obtained with a sample of G-Protein coupled receptors

  10. Metaanalysis of ketosis milk indicators in terms of their threshold estimation

    Directory of Open Access Journals (Sweden)

    Oto Hanuš

    2013-01-01

    Full Text Available Real-time analyses of the main milk components are now carried out in milking parlours. Regular daily information without delay is advantageous: farmers can know the milk composition every day, calculate milk energy quotients, identify subclinical ketosis in early lactation of dairy cows, and thus improve ketosis prevention and avoid economic losses. The aim was to improve, by meta-analysis, the reliability of the estimated thresholds of milk indicators of energy metabolism for subclinical ketosis detection and prevention support; such an analysis can be more reliable than individual studies. Results of similar papers were analysed, focused on ketosis indicators in milk (acetone, AC) and milk energy quotients (fat/crude protein, F/CP; fat/lactose, F/L) and their thresholds for subclinical ketosis. The methods used for threshold derivation were specified (statistical comparison to a reference procedure; calculation according to the relevant data frequency distribution; qualified estimation; and combinations of these procedures) and served as the weighting source. Variability in AC subclinical ketosis cut-off values was high (78.5%) and in ketosis milk quotients was low (from 5 to 8%). The value 10.57 mg·l⁻¹ could be taken as the validated estimate of the milk AC cut-off limit for subclinical ketosis identification; similarly, the milk quotients F/CP and F/L give 1.276 and 0.82. The F/CP and F/L relationship is closer in the first third of lactation (0.89; P < 0.001) than over the whole lactation (0.86; P < 0.001). This could be one proof of their ability to identify subclinical ketosis, because the majority of cases occur in early lactation. The improved threshold estimates for milk indicators in early lactation can be used for subclinical ketosis detection with this technological innovation. Combined use of both quotients could bring an improvement in the regular diagnosis of subclinical ketosis.

  11. Clipper Meets Apple vs. FBI—A Comparison of the Cryptography Discourses from 1993 and 2016

    Directory of Open Access Journals (Sweden)

    Matthias Schulze

    2017-03-01

    Full Text Available This article analyzes two cryptography discourses dealing with the question of whether governments should be able to monitor secure and encrypted communication, for example via security vulnerabilities in cryptographic systems. The Clipper chip debate of 1993 and the FBI vs. Apple case of 2016 are analyzed to infer whether these discourses show similarities in their arguments and to draw lessons from them. The study is based on the securitization framework and analyzes the social construction of security threats in political discourses. The findings are that the arguments made by the proponents of exceptional access show major continuities between the two cases. In contrast, the arguments of the critics are more diverse. The critical arguments for stronger encryption remain highly relevant, especially in the context of the Snowden revelations. The article concludes that we need to adopt a more general cyber security perspective, considering the threat of cyber crime and state hacking, when debating whether the government should be able to weaken encryption.

  12. Threshold factorization redux

    Science.gov (United States)

    Chay, Junegone; Kim, Chul

    2018-05-01

    We reanalyze the factorization theorems for the Drell-Yan process and for deep inelastic scattering near threshold, as constructed in the framework of the soft-collinear effective theory (SCET), from a new, consistent perspective. In order to formulate the factorization near threshold in SCET, we should include an additional degree of freedom with small energy, collinear to the beam direction. The corresponding collinear-soft mode is included to describe the parton distribution function (PDF) near threshold. The soft function is modified by subtracting the contribution of the collinear-soft modes in order to avoid double counting on the overlap region. As a result, the proper soft function becomes infrared finite, and all the factorized parts are free of rapidity divergence. Furthermore, the separation of the relevant scales in each factorized part becomes manifest. We apply the same idea to the dihadron production in e+e- annihilation near threshold, and show that the resultant soft function is also free of infrared and rapidity divergences.

  13. Comparison between intensity- duration thresholds and cumulative rainfall thresholds for the forecasting of landslide

    Science.gov (United States)

    Lagomarsino, Daniela; Rosi, Ascanio; Rossi, Guglielmo; Segoni, Samuele; Catani, Filippo

    2014-05-01

    This work makes a quantitative comparison between the results of landslide forecasting obtained using two different rainfall threshold models, one using intensity-duration thresholds and the other based on cumulative rainfall thresholds, in an area of northern Tuscany of 116 km². The first methodology identifies rainfall intensity-duration thresholds by means of software called MaCumBA (Massive CUMulative Brisk Analyzer) that analyzes rain-gauge records, extracts the intensities (I) and durations (D) of the rainstorms associated with the initiation of landslides, plots these values on a diagram, and identifies thresholds that define the lower bounds of the I-D values. A back analysis using data from past events can be used to identify the threshold conditions associated with the fewest false alarms. The second method (SIGMA) is based on the hypothesis that anomalous or extreme values of rainfall are responsible for landslide triggering: the statistical distribution of the rainfall series is analyzed, and multiples of the standard deviation (σ) are used as thresholds to discriminate between ordinary and extraordinary rainfall events. The name of the model, SIGMA, reflects the central role of the standard deviations in the proposed methodology. The definition of intensity-duration rainfall thresholds requires the combined use of rainfall measurements and an inventory of dated landslides, whereas the SIGMA model can be implemented using only rainfall data. These two methodologies were applied in an area of 116 km² where a database of 1200 landslides was available for the period 2000-2012. The results obtained are compared and discussed. Although several examples of visual comparisons between different intensity-duration rainfall thresholds are reported in the international literature, a quantitative comparison between thresholds obtained in the same area using different techniques and approaches is a relatively undebated research topic.
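
    The SIGMA idea of flagging rainfall that is anomalous relative to the local statistical distribution can be illustrated in a few lines. The sketch below simply compares a cumulative rainfall value with the mean plus k standard deviations of a historical series for the same duration, which is a simplification of the full model.

```python
import numpy as np

def sigma_exceedance(cumulative_rain, historical, k=2.0):
    """Flag a cumulative rainfall value as extraordinary if it exceeds
    mean + k * standard deviation of the historical series (same duration)."""
    mu, sigma = np.mean(historical), np.std(historical)
    return cumulative_rain > mu + k * sigma

# Synthetic 3-day cumulative rainfall history (mm); the distribution and k are illustrative.
history_3day = np.random.default_rng(2).gamma(shape=2.0, scale=10.0, size=500)
print(sigma_exceedance(90.0, history_3day, k=2.0))
```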

  14. Detection thresholds of macaque otolith afferents.

    Science.gov (United States)

    Yu, Xiong-Jie; Dickman, J David; Angelaki, Dora E

    2012-06-13

    The vestibular system is our sixth sense and is important for spatial perception functions, yet the sensory detection and discrimination properties of vestibular neurons remain relatively unexplored. Here we have used signal detection theory to measure detection thresholds of otolith afferents using 1 Hz linear accelerations delivered along three cardinal axes. Direction detection thresholds were measured by comparing mean firing rates centered on response peak and trough (full-cycle thresholds) or by comparing peak/trough firing rates with spontaneous activity (half-cycle thresholds). Thresholds were similar for utricular and saccular afferents, as well as for lateral, fore/aft, and vertical motion directions. When computed along the preferred direction, full-cycle direction detection thresholds were 7.54 and 3.01 cm/s² for regular and irregular firing otolith afferents, respectively. Half-cycle thresholds were approximately double, with excitatory thresholds being half as large as inhibitory thresholds. The variability in threshold among afferents was directly related to neuronal gain and did not depend on spike count variance. The exact threshold values depended on both the time window used for spike count analysis and the filtering method used to calculate mean firing rate, although differences between regular and irregular afferent thresholds were independent of analysis parameters. The fact that minimum thresholds measured in macaque otolith afferents are of the same order of magnitude as human behavioral thresholds suggests that the vestibular periphery might determine the limit on our ability to detect or discriminate small differences in head movement, with little noise added during downstream processing.

  15. Evaluating the "Threshold Theory": Can Head Impact Indicators Help?

    Science.gov (United States)

    Mihalik, Jason P; Lynall, Robert C; Wasserman, Erin B; Guskiewicz, Kevin M; Marshall, Stephen W

    2017-02-01

    This study aimed to determine the clinical utility of biomechanical head impact indicators by measuring the sensitivity, specificity, positive predictive value (PV+), and negative predictive value (PV-) of multiple thresholds. Head impact biomechanics (n = 283,348) from 185 football players in one Division I program were collected. A multidisciplinary clinical team independently made concussion diagnoses (n = 24). We dichotomized each impact using diagnosis (yes = 24, no = 283,324) and across a range of plausible impact indicator thresholds (10g increments beginning with a resultant linear head acceleration of 50g and ending with 120g). Some thresholds had adequate sensitivity, specificity, and PV-. All thresholds had low PV+, with the best recorded PV+ less than 0.4% when accounting for all head impacts sustained by our sample. Even when conservatively adjusting the frequency of diagnosed concussions by a factor of 5 to account for unreported/undiagnosed injuries, the PV+ of head impact indicators at any threshold was no greater than 1.94%. Although specificity and PV- appear high, the low PV+ would generate many unnecessary evaluations if these indicators were the sole diagnostic criteria. The clinical diagnostic value of head impact indicators is considerably questioned by these data. Notwithstanding, valid sensor technologies continue to offer objective data that have been used to improve player safety and reduce injury risk.
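
    The very low PV+ reported above follows directly from Bayes' rule, given how rare diagnosed concussions are among recorded impacts. The snippet below reproduces that arithmetic for an arbitrary threshold's sensitivity and specificity; the example sensitivity and specificity are placeholders, not values from the study, while the prevalence uses the reported counts.

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Positive and negative predictive values from Bayes' rule."""
    tp = sensitivity * prevalence
    fp = (1.0 - specificity) * (1.0 - prevalence)
    tn = specificity * (1.0 - prevalence)
    fn = (1.0 - sensitivity) * prevalence
    return tp / (tp + fp), tn / (tn + fn)

prev = 24 / 283348    # 24 diagnosed concussions among 283,348 recorded impacts
ppv, npv = predictive_values(sensitivity=0.75, specificity=0.95, prevalence=prev)
print(f"PV+ = {ppv:.4%}, PV- = {npv:.6f}")   # PV+ stays well under 1% despite high specificity
```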

  16. Threshold guidance update

    International Nuclear Information System (INIS)

    Wickham, L.E.

    1986-01-01

    The Department of Energy (DOE) is developing the concept of threshold quantities for use in determining which waste materials must be handled as radioactive waste and which may be disposed of as nonradioactive waste at its sites. Waste above this concentration level would be managed as radioactive or mixed waste (if hazardous chemicals are present); waste below this level would be handled as sanitary waste. Last year's activities (1984) included the development of a threshold guidance dose, the development of threshold concentrations corresponding to the guidance dose, the development of supporting documentation, review by a technical peer review committee, and review by the DOE community. As a result of the comments, areas have been identified for more extensive analysis, including an alternative basis for selection of the guidance dose and the development of quality assurance guidelines. Development of quality assurance guidelines will provide a reasonable basis for determining that a given waste stream qualifies as a threshold waste stream and can then be the basis for a more extensive cost-benefit analysis. The threshold guidance and supporting documentation will be revised, based on the comments received. The revised documents will be provided to DOE by early November. DOE-HQ has indicated that the revised documents will be available for review by DOE field offices and their contractors.

  17. Determination of prospective displacement-based gate threshold for respiratory-gated radiation delivery from retrospective phase-based gate threshold selected at 4D CT simulation

    International Nuclear Information System (INIS)

    Vedam, S.; Archambault, L.; Starkschall, G.; Mohan, R.; Beddar, S.

    2007-01-01

    and delivery gate thresholds to within 0.3%. For patient data analysis, differences between simulation and delivery gate thresholds are reported as a fraction of the total respiratory motion range. For the smaller phase interval, the differences between simulation and delivery gate thresholds are 8±11% and 14±21% with and without audio-visual biofeedback, respectively, when the simulation gate threshold is determined based on the mean respiratory displacement within the 40%-60% gating phase interval. For the longer phase interval, corresponding differences are 4±7% and 8±15% with and without audio-visual biofeedback, respectively. Alternatively, when the simulation gate threshold is determined based on the maximum average respiratory displacement within the gating phase interval, greater differences between simulation and delivery gate thresholds are observed. A relationship between retrospective simulation gate threshold and prospective delivery gate threshold for respiratory gating is established and validated for regular and nonregular respiratory motion. Using this relationship, the delivery gate threshold can be reliably estimated at the time of 4D CT simulation, thereby improving the accuracy and efficiency of respiratory-gated radiation delivery
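
    A minimal numerical sketch of the displacement-threshold idea (assumed array layout and a synthetic breathing trace, not the authors' patient data or software): the simulation gate threshold is taken as the mean displacement inside the retrospective 40%-60% phase window, and the beam is then gated on whenever the displacement falls below that threshold.

        import numpy as np

        def displacement_gate_threshold(displacement, phase, lo=0.40, hi=0.60):
            """Mean respiratory displacement inside the gating phase window."""
            in_window = (phase >= lo) & (phase <= hi)
            return float(np.mean(displacement[in_window]))

        def beam_on(displacement, threshold):
            """Prospective gating: beam enabled while displacement is below threshold."""
            return displacement <= threshold

        # Synthetic 0.25 Hz breathing trace; phase 0 is end-inhale, phase 0.5 is end-exhale.
        t = np.linspace(0.0, 60.0, 6001)
        displacement = 1.0 + np.cos(2 * np.pi * 0.25 * t)      # cm, peak-to-trough 2 cm
        phase = (0.25 * t) % 1.0
        thr = displacement_gate_threshold(displacement, phase)
        print(f"gate threshold {thr:.2f} cm, duty cycle {beam_on(displacement, thr).mean():.0%}")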

  18. Information verification cryptosystem using one-time keys based on double random phase encoding and public-key cryptography

    Science.gov (United States)

    Zhao, Tieyu; Ran, Qiwen; Yuan, Lin; Chi, Yingying; Ma, Jing

    2016-08-01

    A novel image encryption system based on double random phase encoding (DRPE) and RSA public-key algorithm is proposed. The main characteristic of the system is that each encryption process produces a new decryption key (even for the same plaintext), thus the encryption system conforms to the feature of the one-time pad (OTP) cryptography. The other characteristic of the system is the use of fingerprint key. Only with the rightful authorization will the true decryption be obtained, otherwise the decryption will result in noisy images. So the proposed system can be used to determine whether the ciphertext is falsified by attackers. In addition, the system conforms to the basic agreement of asymmetric cryptosystem (ACS) due to the combination with the RSA public-key algorithm. The simulation results show that the encryption scheme has high robustness against the existing attacks.
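
    A minimal numerical sketch of the double random phase encoding step alone (NumPy, illustrative only): the RSA key wrapping, fingerprint key and one-time-key bookkeeping described above are omitted, and the random phase masks here simply stand in for the keys.

        import numpy as np

        def drpe_encrypt(img, phase1, phase2):
            """Classic 4f double random phase encoding of a real-valued image."""
            field = img * np.exp(1j * phase1)                   # input-plane random phase mask
            spectrum = np.fft.fft2(field) * np.exp(1j * phase2) # Fourier-plane random phase mask
            return np.fft.ifft2(spectrum)                       # complex-valued ciphertext

        def drpe_decrypt(cipher, phase1, phase2):
            spectrum = np.fft.fft2(cipher) * np.exp(-1j * phase2)
            return np.abs(np.fft.ifft2(spectrum) * np.exp(-1j * phase1))

        rng = np.random.default_rng(1)
        img = rng.random((64, 64))                              # stand-in plaintext image
        p1, p2 = (2 * np.pi * rng.random((64, 64)) for _ in range(2))
        cipher = drpe_encrypt(img, p1, p2)
        assert np.allclose(drpe_decrypt(cipher, p1, p2), img)   # wrong masks would give noise instead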

  19. Network-level reproduction number and extinction threshold for vector-borne diseases.

    Science.gov (United States)

    Xue, Ling; Scoglio, Caterina

    2015-06-01

    The basic reproduction number of deterministic models is an essential quantity to predict whether an epidemic will spread or not. Thresholds for disease extinction contribute crucial knowledge of disease control, elimination, and mitigation of infectious diseases. Relationships between basic reproduction numbers of two deterministic network-based ordinary differential equation vector-host models, and extinction thresholds of corresponding stochastic continuous-time Markov chain models are derived under some assumptions. Numerical simulation results for malaria and Rift Valley fever transmission on heterogeneous networks are in agreement with analytical results without any assumptions, reinforcing that the relationships may always exist and proposing a mathematical problem for proving existence of the relationships in general. Moreover, numerical simulations show that the basic reproduction number does not monotonically increase or decrease with the extinction threshold. Consistent trends of extinction probability observed through numerical simulations provide novel insights into mitigation strategies to increase the disease extinction probability. Research findings may improve understandings of thresholds for disease persistence in order to control vector-borne diseases.
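
    The threshold role of the basic reproduction number can be illustrated with a generic next-generation-matrix computation (a sketch with invented parameters and a simplified patch structure, not the specific vector-host model or Markov chain analysis of the paper):

        import numpy as np

        def basic_reproduction_number(beta_hv, beta_vh, gamma_h, gamma_v):
            """Spectral radius of K = F V^{-1} for a patchy vector-host model:
            beta_hv[i, j] = rate at which vectors on patch j infect hosts on patch i,
            beta_vh[i, j] = rate at which hosts on patch j infect vectors on patch i."""
            n = beta_hv.shape[0]
            zero = np.zeros((n, n))
            F = np.block([[zero, beta_hv], [beta_vh, zero]])    # new-infection terms
            V = np.diag(np.concatenate([gamma_h, gamma_v]))     # removal/transition terms
            return max(abs(np.linalg.eigvals(F @ np.linalg.inv(V))))

        rng = np.random.default_rng(2)
        n = 3                                                    # three patches (illustrative)
        R0 = basic_reproduction_number(beta_hv=0.3 * rng.random((n, n)),
                                       beta_vh=0.2 * rng.random((n, n)),
                                       gamma_h=np.full(n, 0.1),
                                       gamma_v=np.full(n, 0.2))
        print(f"R0 = {R0:.2f}:", "outbreak expected" if R0 > 1 else "extinction expected")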

  20. How to Share a Lattice Trapdoor

    DEFF Research Database (Denmark)

    Bendlin, Rikke; Peikert, Chris; Krehbiel, Sara

    2013-01-01

    We develop secure threshold protocols for two important operations in lattice cryptography, namely, generating a hard lattice Λ together with a "strong" trapdoor, and sampling from a discrete Gaussian distribution over a desired coset of Λ using the trapdoor. These are the central operations...... delegation, which is used in lattice-based hierarchical IBE schemes. Our work therefore directly transfers all these systems to the threshold setting. Our protocols provide information-theoretic (i.e., statistical) security against adaptive corruptions in the UC framework, and they are robust against up to ℓ...

  1. Intermediate structure and threshold phenomena

    International Nuclear Information System (INIS)

    Hategan, Cornel

    2004-01-01

    The Intermediate Structure, evidenced through microstructures of the neutron strength function, is reflected in open reaction channels as fluctuations in excitation function of nuclear threshold effects. The intermediate state supporting both neutron strength function and nuclear threshold effect is a micro-giant neutron threshold state. (author)

  2. Analysis of the Threshold Effect of Financial Development on China’s Carbon Intensity

    Directory of Open Access Journals (Sweden)

    Xiongfeng Pan

    2016-03-01

    Full Text Available Using panel data on 30 provinces in China from 2005 to 2012, this paper conducts an empirical test on the threshold effect of the relationship between financial development and carbon emission intensity from the perspectives of financial scale and financial efficiency. The results show that at a low level of per capita GDP, the expansion of the financial scale and the enhancement of financial efficiency will increase carbon intensity. When the per capita GDP is greater than the threshold value (RMB 37,410), the expansion of the financial scale will also increase carbon intensity, but the potency of this effect will be weaker. At the same time, the improvement of financial efficiency will help reduce carbon intensity. Most provinces with per capita GDP greater than the threshold value (RMB 37,410) are located in the eastern coastal areas of China, whereas most provinces with per capita GDP less than the threshold value are located in the central and western areas of China. Both raising the level of openness and improving the industrial structure will have significantly positive effects on carbon intensity.

  3. Reconciling threshold and subthreshold expansions for pion-nucleon scattering

    Science.gov (United States)

    Siemens, D.; Ruiz de Elvira, J.; Epelbaum, E.; Hoferichter, M.; Krebs, H.; Kubis, B.; Meißner, U.-G.

    2017-07-01

    Heavy-baryon chiral perturbation theory (ChPT) at one loop fails in relating the pion-nucleon amplitude in the physical region and for subthreshold kinematics due to loop effects enhanced by large low-energy constants. Studying the chiral convergence of threshold and subthreshold parameters up to fourth order in the small-scale expansion, we address the question to what extent this tension can be mitigated by including the Δ (1232) as an explicit degree of freedom and/or using a covariant formulation of baryon ChPT. We find that the inclusion of the Δ indeed reduces the low-energy constants to more natural values and thereby improves consistency between threshold and subthreshold kinematics. In addition, even in the Δ-less theory the resummation of 1 /mN corrections in the covariant scheme improves the results markedly over the heavy-baryon formulation, in line with previous observations in the single-baryon sector of ChPT that so far have evaded a profound theoretical explanation.

  4. Reconciling threshold and subthreshold expansions for pion–nucleon scattering

    Directory of Open Access Journals (Sweden)

    D. Siemens

    2017-07-01

    Full Text Available Heavy-baryon chiral perturbation theory (ChPT) at one loop fails in relating the pion–nucleon amplitude in the physical region and for subthreshold kinematics due to loop effects enhanced by large low-energy constants. Studying the chiral convergence of threshold and subthreshold parameters up to fourth order in the small-scale expansion, we address the question to what extent this tension can be mitigated by including the Δ(1232) as an explicit degree of freedom and/or using a covariant formulation of baryon ChPT. We find that the inclusion of the Δ indeed reduces the low-energy constants to more natural values and thereby improves consistency between threshold and subthreshold kinematics. In addition, even in the Δ-less theory the resummation of 1/mN corrections in the covariant scheme improves the results markedly over the heavy-baryon formulation, in line with previous observations in the single-baryon sector of ChPT that so far have evaded a profound theoretical explanation.

  5. Nuclear threshold effects and neutron strength function

    International Nuclear Information System (INIS)

    Hategan, Cornel; Comisel, Horia

    2003-01-01

    One proves that a Nuclear Threshold Effect is dependent, via Neutron Strength Function, on Spectroscopy of Ancestral Neutron Threshold State. The magnitude of the Nuclear Threshold Effect is proportional to the Neutron Strength Function. Evidence for relation of Nuclear Threshold Effects to Neutron Strength Functions is obtained from Isotopic Threshold Effect and Deuteron Stripping Threshold Anomaly. The empirical and computational analysis of the Isotopic Threshold Effect and of the Deuteron Stripping Threshold Anomaly demonstrate their close relationship to Neutron Strength Functions. It was established that the Nuclear Threshold Effects depend, in addition to genuine Nuclear Reaction Mechanisms, on Spectroscopy of (Ancestral) Neutron Threshold State. The magnitude of the effect is proportional to the Neutron Strength Function, in their dependence on mass number. This result constitutes also a proof that the origins of these threshold effects are Neutron Single Particle States at zero energy. (author)

  6. The Singapore protocol [for quantum cryptography

    International Nuclear Information System (INIS)

    Englert, B.

    2005-01-01

    The qubit protocol for quantum key distribution presented in this talk is fully tomographic and more efficient than other tomographic protocols. Under ideal circumstances the efficiency is log_2(4/3) = 0.415 key bits per qubit sent, which is 25% more than the efficiency of 1/3 = 0.333 for the standard 6-state protocol. One can extract 0.4 key bits per qubit by a simple two-way communication scheme, and thus get close to the information-theoretical limit. The noise thresholds for secure key bit generation in the presence of unbiased noise will be reported and discussed. (author)

  7. A Trusted Computing Architecture of Embedded System Based on Improved TPM

    Directory of Open Access Journals (Sweden)

    Wang Xiaosheng

    2017-01-01

    Full Text Available The Trusted Platform Module (TPM) currently used by PCs is not suitable for embedded systems, so the existing TPM needs to be improved. The paper proposes a trusted computing architecture for embedded systems with a new TPM and the cryptographic system developed by China. The improved TPM consists of the Embedded System Trusted Cryptography Module (eTCM) and the Embedded System Trusted Platform Control Module (eTPCM), which are combined to implement the TPM's autonomous control, active defense, high-speed encryption/decryption and other functions through an internal bus arbitration module and symmetric and asymmetric cryptographic engines, effectively protecting the security of the embedded system. In the improved TPM, a trusted measurement method combining a chain model and a star-type model is used. Finally, the improved TPM is implemented on an FPGA and applied to a trusted PDA for experimental verification. Experiments show that the trusted architecture of the embedded system based on the improved TPM is efficient, reliable and secure.

  8. Ankle Accelerometry for Assessing Physical Activity among Adolescent Girls: Threshold Determination, Validity, Reliability, and Feasibility

    Science.gov (United States)

    Hager, Erin R.; Treuth, Margarita S.; Gormely, Candice; Epps, LaShawna; Snitker, Soren; Black, Maureen M.

    2015-01-01

    Purpose: Ankle accelerometry allows for 24-hr data collection and improves data volume/integrity versus hip accelerometry. Using Actical ankle accelerometry, the purpose of this study was to (a) develop sensitive/specific thresholds, (b) examine validity/reliability, (c) compare new thresholds with those of the manufacturer, and (d) examine…

  9. A Framework for Optimizing Phytosanitary Thresholds in Seed Systems.

    Science.gov (United States)

    Choudhury, Robin Alan; Garrett, Karen A; Klosterman, Steven J; Subbarao, Krishna V; McRoberts, Neil

    2017-10-01

    Seedborne pathogens and pests limit production in many agricultural systems. Quarantine programs help prevent the introduction of exotic pathogens into a country, but few regulations directly apply to reducing the reintroduction and spread of endemic pathogens. Use of phytosanitary thresholds helps limit the movement of pathogen inoculum through seed, but the costs associated with rejected seed lots can be prohibitive for voluntary implementation of phytosanitary thresholds. In this paper, we outline a framework to optimize thresholds for seedborne pathogens, balancing the cost of rejected seed lots and benefit of reduced inoculum levels. The method requires relatively small amounts of data, and the accuracy and robustness of the analysis improves over time as data accumulate from seed testing. We demonstrate the method first and illustrate it with a case study of seedborne oospores of Peronospora effusa, the causal agent of spinach downy mildew. A seed lot threshold of 0.23 oospores per seed could reduce the overall number of oospores entering the production system by 90% while removing 8% of seed lots destined for distribution. Alternative mitigation strategies may result in lower economic losses to seed producers, but have uncertain efficacy. We discuss future challenges and prospects for implementing this approach.
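
    The balancing act described above can be sketched numerically (illustrative Python with simulated lot data, not the authors' framework or the spinach case-study numbers): pick the least restrictive lot-rejection threshold that still removes a target share of the total inoculum.

        import numpy as np

        def choose_threshold(oospores_per_seed, seeds_per_lot, target_reduction=0.90):
            """Smallest per-seed contamination threshold whose rejected lots carry
            at least `target_reduction` of the total oospore load."""
            load = oospores_per_seed * seeds_per_lot                 # oospores per lot
            order = np.argsort(oospores_per_seed)[::-1]              # most contaminated lots first
            cumulative_share = np.cumsum(load[order]) / load.sum()
            k = np.searchsorted(cumulative_share, target_reduction) + 1
            return oospores_per_seed[order][k - 1], k / len(load)    # threshold, rejected fraction

        rng = np.random.default_rng(3)
        contamination = rng.lognormal(mean=-3.0, sigma=1.5, size=200)   # simulated oospores/seed
        lot_sizes = rng.integers(100_000, 1_000_000, size=200)
        thr, rejected = choose_threshold(contamination, lot_sizes)
        print(f"reject lots above {thr:.3f} oospores/seed ({rejected:.0%} of lots)")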

  10. EbayesThresh: R Programs for Empirical Bayes Thresholding

    Directory of Open Access Journals (Sweden)

    Iain Johnstone

    2005-04-01

    Full Text Available Suppose that a sequence of unknown parameters is observed subject to independent Gaussian noise. The EbayesThresh package in the S language implements a class of Empirical Bayes thresholding methods that can take advantage of possible sparsity in the sequence, to improve the quality of estimation. The prior for each parameter in the sequence is a mixture of an atom of probability at zero and a heavy-tailed density. Within the package, this can be either a Laplace (double exponential) density or else a mixture of normal distributions with tail behavior similar to the Cauchy distribution. The mixing weight, or sparsity parameter, is chosen automatically by marginal maximum likelihood. If estimation is carried out using the posterior median, this is a random thresholding procedure; the estimation can also be carried out using other thresholding rules with the same threshold, and the package provides the posterior mean, and hard and soft thresholding, as additional options. This paper reviews the method, and gives details (far beyond those previously published) of the calculations needed for implementing the procedures. It explains and motivates both the general methodology, and the use of the EbayesThresh package, through simulated and real data examples. When estimating the wavelet transform of an unknown function, it is appropriate to apply the method level by level to the transform of the observed data. The package can carry out these calculations for wavelet transforms obtained using various packages in R and S-PLUS. Details, including a motivating example, are presented, and the application of the method to image estimation is also explored. The final topic considered is the estimation of a single sequence that may become progressively sparser along the sequence. An iterated least squares isotone regression method allows for the choice of a threshold that depends monotonically on the order in which the observations are made. An alternative
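
    For readers unfamiliar with the thresholding rules the package offers, the sketch below shows plain hard and soft thresholding applied to a sparse Gaussian-noise sequence (Python rather than R/S, and with a fixed universal threshold; the empirical Bayes choice of the threshold by marginal maximum likelihood is exactly the part this sketch leaves out).

        import numpy as np

        def hard_threshold(x, t):
            return np.where(np.abs(x) > t, x, 0.0)

        def soft_threshold(x, t):
            return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

        rng = np.random.default_rng(4)
        theta = np.zeros(1000)
        theta[:25] = rng.normal(0.0, 5.0, 25)                 # sparse mean sequence
        y = theta + rng.normal(size=theta.size)               # observed under unit Gaussian noise
        t = np.sqrt(2 * np.log(y.size))                       # fixed "universal" threshold
        print(np.mean((soft_threshold(y, t) - theta) ** 2),   # thresholding exploits the sparsity
              np.mean((y - theta) ** 2))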

  11. Rate modulation detection thresholds for cochlear implant users.

    Science.gov (United States)

    Brochier, Tim; McKay, Colette; McDermott, Hugh

    2018-02-01

    The perception of temporal amplitude modulations is critical for speech understanding by cochlear implant (CI) users. The present study compared the ability of CI users to detect sinusoidal modulations of the electrical stimulation rate and current level, at different presentation levels (80% and 40% of the dynamic range) and modulation frequencies (10 and 100 Hz). Rate modulation detection thresholds (RMDTs) and amplitude modulation detection thresholds (AMDTs) were measured and compared to assess whether there was a perceptual advantage to either modulation method. Both RMDTs and AMDTs improved with increasing presentation level and decreasing modulation frequency. RMDTs and AMDTs were correlated, indicating that a common processing mechanism may underlie the perception of rate modulation and amplitude modulation, or that some subject-dependent factors affect both types of modulation detection.

  12. The laser second threshold: Its exact analytical dependence on detuning and relaxation rates

    International Nuclear Information System (INIS)

    Bakasov, A.A.; Abraham, N.B.

    1992-11-01

    An exact analysis has been carried out for general analytical expressions for the second threshold of a single-mode homogeneously broadened laser and for the initial pulsation frequency at the second threshold for arbitrary physical values of the relaxation rates, and at arbitrary detuning between the cavity frequency and the atomic resonance frequency. These expressions also give correspondingly exact forms for asymptotic cases that have previously been studied with some approximations. Earlier approximate results are partly confirmed and partly improved by these more general expressions. The physical status of various expressions and approximations is re-considered and specified more clearly, including an analysis of which can reasonably be attained in lasers or masers. A general analytical proof is given that for larger detuning of the laser cavity from resonance a higher value of the laser excitation is required to destabilize the steady state solution (the second threshold). We also present results for the minimum value of the second threshold at fixed detuning as a function of the other parameters of the system, and for the dependence of the ratio of the second threshold to the first threshold on detuning. Minima of the second threshold and of the threshold ratio occur only if the population relaxation rate is equal to zero. The minima of the threshold ratio are shown to be bounded from above as well as from below (as functions of the relaxation rates, so long as the second threshold exists). The upper bound on the threshold ratio is equal to 17. The variation of the second threshold in the semi-infinite parameter space of the decay rates is shown at various detunings in plots with a finite domain by normalizing the material relaxation rates to the cavity decay rate. (author). 53 refs, 22 figs, 3 tabs

  13. Threshold behavior in electron-atom scattering

    International Nuclear Information System (INIS)

    Sadeghpour, H.R.; Greene, C.H.

    1996-01-01

    Ever since the classic work of Wannier in 1953, the process of treating two threshold electrons in the continuum of a positively charged ion has been an active field of study. The authors have developed a treatment motivated by the physics below the double ionization threshold. By modeling the double ionization as a series of Landau-Zener transitions, they obtain an analytical formulation of the absolute threshold probability which has a leading power law behavior, akin to Wannier's law. Some of the noteworthy aspects of this derivation are that the derivation can be conveniently continued below threshold, giving rise to a "cusp" at threshold, and that on both sides of the threshold, absolute values of the cross sections are obtained

  14. Analysis of the width-[Formula: see text] non-adjacent form in conjunction with hyperelliptic curve cryptography and with lattices.

    Science.gov (United States)

    Krenn, Daniel

    2013-06-17

    In this work the number of occurrences of a fixed non-zero digit in the width-[Formula: see text] non-adjacent forms of all elements of a lattice in some region (e.g. a ball) is analysed. As bases, expanding endomorphisms with eigenvalues of the same absolute value are allowed. Applications of the main result are on numeral systems with an algebraic integer as base. Those come from efficient scalar multiplication methods (Frobenius-and-add methods) in hyperelliptic curves cryptography, and the result is needed for analysing the running time of such algorithms. The counting result itself is an asymptotic formula, where its main term coincides with the full block length analysis. In its second order term a periodic fluctuation is exhibited. The proof follows Delange's method.
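
    As a concrete (and much simpler) special case of the object being counted, the sketch below computes the width-w non-adjacent form of an ordinary integer and tallies its non-zero digits; the paper's setting of lattice elements and algebraic-integer bases is not reproduced here.

        def width_w_naf(n, w):
            """Digits d_i (least significant first) with n = sum(d_i * 2**i),
            every non-zero d_i odd with |d_i| < 2**(w-1), and at most one
            non-zero digit in any w consecutive positions."""
            digits = []
            while n != 0:
                if n % 2:
                    d = n % (1 << w)
                    if d >= (1 << (w - 1)):
                        d -= 1 << w
                    n -= d
                else:
                    d = 0
                digits.append(d)
                n //= 2
            return digits

        digits = width_w_naf(314159, w=3)
        assert sum(d * 2**i for i, d in enumerate(digits)) == 314159
        nonzero = sum(d != 0 for d in digits)   # the digit count whose asymptotics are analysed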

  15. Double Photoionization Near Threshold

    Science.gov (United States)

    Wehlitz, Ralf

    2007-01-01

    The threshold region of the double-photoionization cross section is of particular interest because both ejected electrons move slowly in the Coulomb field of the residual ion. Near threshold both electrons have time to interact with each other and with the residual ion. Also, different theoretical models compete to describe the double-photoionization cross section in the threshold region. We have investigated that cross section for lithium and beryllium and have analyzed our data with respect to the latest results in the Coulomb-dipole theory. We find that our data support the idea of a Coulomb-dipole interaction.

  16. Combined threshold and transverse momentum resummation for inclusive observables

    International Nuclear Information System (INIS)

    Muselli, Claudio; Forte, Stefano; Ridolfi, Giovanni

    2017-01-01

    We present a combined resummation for the transverse momentum distribution of a colorless final state in perturbative QCD, expressed as a function of transverse momentum p_T and the scaling variable x. Its expression satisfies three requirements: it reduces to standard transverse momentum resummation to any desired logarithmic order in the limit p_T→0 for fixed x, up to power suppressed corrections in p_T; it reduces to threshold resummation to any desired logarithmic order in the limit x→1 for fixed p_T, up to power suppressed corrections in 1−x; upon integration over transverse momentum it reproduces the resummation of the total cross section at any given logarithmic order in the threshold x→1 limit, up to power suppressed corrections in 1−x. Its main ingredient, and our main new result, is a modified form of transverse momentum resummation, which leads to threshold resummation upon integration over p_T, and for which we provide a simple closed-form analytic expression in Fourier-Mellin (b,N) space. We give explicit coefficients up to NNLL order for the specific case of Higgs production in gluon fusion in the effective field theory limit. Our result allows for a systematic improvement of the transverse momentum distribution through threshold resummation which holds for all p_T, and elucidates the relation between transverse momentum resummation and threshold resummation at the inclusive level, specifically by providing within perturbative QCD a simple derivation of the main consequence of the so-called collinear anomaly of SCET.

  17. Combined threshold and transverse momentum resummation for inclusive observables

    Energy Technology Data Exchange (ETDEWEB)

    Muselli, Claudio; Forte, Stefano [Tif Lab, Dipartimento di Fisica, Università di Milano and INFN, Sezione di Milano,Via Celoria 16, I-20133 Milano (Italy); Ridolfi, Giovanni [Dipartimento di Fisica, Università di Genova and INFN, Sezione di Genova,Via Dodecaneso 33, I-16146 Genova (Italy)

    2017-03-21

    We present a combined resummation for the transverse momentum distribution of a colorless final state in perturbative QCD, expressed as a function of transverse momentum p_T and the scaling variable x. Its expression satisfies three requirements: it reduces to standard transverse momentum resummation to any desired logarithmic order in the limit p_T→0 for fixed x, up to power suppressed corrections in p_T; it reduces to threshold resummation to any desired logarithmic order in the limit x→1 for fixed p_T, up to power suppressed corrections in 1−x; upon integration over transverse momentum it reproduces the resummation of the total cross section at any given logarithmic order in the threshold x→1 limit, up to power suppressed corrections in 1−x. Its main ingredient, and our main new result, is a modified form of transverse momentum resummation, which leads to threshold resummation upon integration over p_T, and for which we provide a simple closed-form analytic expression in Fourier-Mellin (b,N) space. We give explicit coefficients up to NNLL order for the specific case of Higgs production in gluon fusion in the effective field theory limit. Our result allows for a systematic improvement of the transverse momentum distribution through threshold resummation which holds for all p_T, and elucidates the relation between transverse momentum resummation and threshold resummation at the inclusive level, specifically by providing within perturbative QCD a simple derivation of the main consequence of the so-called collinear anomaly of SCET.

  18. Composite Field Multiplier based on Look-Up Table for Elliptic Curve Cryptography Implementation

    Directory of Open Access Journals (Sweden)

    Marisa W. Paryasto

    2012-04-01

    Full Text Available Implementing a secure cryptosystem requires operations involving hundreds of bits. One of the most recommended algorithms is Elliptic Curve Cryptography (ECC). The complexity of elliptic curve algorithms and parameters with hundreds of bits requires a specific design and implementation strategy. The design architecture must be customized according to security requirements, available resources and parameter choices. In this work we propose the use of a composite field to implement finite field multiplication for ECC implementation. We use a 299-bit key length represented in GF((2^13)^23) instead of in GF(2^299). A composite field multiplier can be implemented using different multipliers for the ground field and for the extension field. In this paper, a LUT is used for multiplication in the ground field and a classic multiplier is used for the extension field multiplication. A generic architecture for the multiplier is presented. Implementation is done in VHDL with the target device Altera DE2. The work in this paper uses the simplest algorithm to confirm the idea that by splitting the field into a composite field and using different multipliers for the base and extension fields, a better time-area trade-off can be obtained. This work is the beginning of our more advanced further research that implements composite fields using Mastrovito Hybrid, KOA and LUT.

  19. Composite Field Multiplier based on Look-Up Table for Elliptic Curve Cryptography Implementation

    Directory of Open Access Journals (Sweden)

    Marisa W. Paryasto

    2013-09-01

    Full Text Available Implementing a secure cryptosystem requires operations involving hundreds of bits. One of the most recommended algorithms is Elliptic Curve Cryptography (ECC). The complexity of elliptic curve algorithms and parameters with hundreds of bits requires a specific design and implementation strategy. The design architecture must be customized according to security requirements, available resources and parameter choices. In this work we propose the use of a composite field to implement finite field multiplication for ECC implementation. We use a 299-bit key length represented in GF((2^13)^23) instead of in GF(2^299). A composite field multiplier can be implemented using different multipliers for the ground field and for the extension field. In this paper, a LUT is used for multiplication in the ground field and a classic multiplier is used for the extension field multiplication. A generic architecture for the multiplier is presented. Implementation is done in VHDL with the target device Altera DE2. The work in this paper uses the simplest algorithm to confirm the idea that by splitting the field into a composite field and using different multipliers for the base and extension fields, a better time-area trade-off can be obtained. This work is the beginning of our more advanced further research that implements composite fields using Mastrovito Hybrid, KOA and LUT.
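
    The idea of a look-up-table ground-field multiplier can be illustrated in software with a toy field (illustrative Python over GF(2^4) with an assumed primitive polynomial, far smaller than the GF(2^13) ground field of the records above, and without the VHDL/hardware aspects):

        # Build log/antilog tables for GF(2^4) with primitive polynomial x^4 + x + 1,
        # then multiply by table look-up, the software analogue of a LUT multiplier.
        MOD = 0b10011                     # x^4 + x + 1 (assumed primitive polynomial)
        exp_table, log_table = [0] * 15, {}
        value = 1
        for i in range(15):               # powers of the generator x
            exp_table[i] = value
            log_table[value] = i
            value <<= 1
            if value & 0b10000:
                value ^= MOD

        def gf16_mul(a, b):
            """Ground-field multiplication via the precomputed tables."""
            if a == 0 or b == 0:
                return 0
            return exp_table[(log_table[a] + log_table[b]) % 15]

        # Spot-check against schoolbook carry-less multiplication with reduction.
        def gf16_mul_slow(a, b):
            r = 0
            while b:
                if b & 1:
                    r ^= a
                a <<= 1
                if a & 0b10000:
                    a ^= MOD
                b >>= 1
            return r

        assert all(gf16_mul(a, b) == gf16_mul_slow(a, b) for a in range(16) for b in range(16))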

  20. Numerical investigation of the inertial cavitation threshold under multi-frequency ultrasound.

    Science.gov (United States)

    Suo, Dingjie; Govind, Bala; Zhang, Shengqi; Jing, Yun

    2018-03-01

    Through the introduction of multi-frequency sonication in High Intensity Focused Ultrasound (HIFU), enhancement of efficiency has been noted in several applications including thrombolysis, tissue ablation, sonochemistry, and sonoluminescence. One key experimental observation is that multi-frequency ultrasound can help lower the inertial cavitation threshold, thereby improving the power efficiency. However, this has not been well corroborated by the theory. In this paper, a numerical investigation on the inertial cavitation threshold of microbubbles (MBs) under multi-frequency ultrasound irradiation is conducted. The relationships between the cavitation threshold and MB size at various frequencies and in different media are investigated. The results of single-, dual and triple frequency sonication show reduced inertial cavitation thresholds by introducing additional frequencies which is consistent with previous experimental work. In addition, no significant difference is observed between dual frequency sonication with various frequency differences. This study, not only reaffirms the benefit of using multi-frequency ultrasound for various applications, but also provides a possible route for optimizing ultrasound excitations for initiating inertial cavitation. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Cognitive approaches for patterns analysis and security applications

    Science.gov (United States)

    Ogiela, Marek R.; Ogiela, Lidia

    2017-08-01

    This paper presents new opportunities for developing innovative solutions for semantic pattern classification and visual cryptography based on cognitive and bio-inspired approaches. Such techniques can be used to evaluate the meaning of analyzed patterns or encrypted information, and allow that meaning to be incorporated into the classification task or encryption process. They also allow crypto-biometric solutions to be used to extend personalized cryptography methodologies based on visual pattern analysis. In particular, the application of cognitive information systems to the semantic analysis of different patterns is presented, along with a novel application of such systems to visual secret sharing. Visual shares for the divided information can be created using a threshold procedure, which may depend on personal abilities to recognize image details visible in the divided images.
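
    As a toy illustration of dividing visual information into shares (a minimal (2, 2) XOR sharing of a binary image; the cognitive, personalized and threshold-dependent aspects described above are well beyond this sketch):

        import numpy as np

        rng = np.random.default_rng(6)
        secret = (rng.random((32, 32)) > 0.5).astype(np.uint8)          # stand-in binary image
        share1 = rng.integers(0, 2, size=secret.shape, dtype=np.uint8)  # purely random share
        share2 = secret ^ share1                                        # second share hides the rest
        assert np.array_equal(share1 ^ share2, secret)                  # only both shares reveal it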

  2. Automatic Threshold Determination for a Local Approach of Change Detection in Long-Term Signal Recordings

    Directory of Open Access Journals (Sweden)

    David Hewson

    2007-01-01

    Full Text Available CUSUM (cumulative sum) is a well-known method that can be used to detect changes in a signal when the parameters of this signal are known. This paper presents an adaptation of CUSUM-based change detection algorithms to long-term signal recordings where the various hypotheses contained in the signal are unknown. The starting point of the work was the dynamic cumulative sum (DCS) algorithm, previously developed for application to long-term electromyography (EMG) recordings. DCS has been improved in two ways. The first was a new procedure to estimate the distribution parameters so that the detectability property is respected. The second was the definition of two separate, automatically determined thresholds. One of them (the lower threshold) acted to stop the estimation process; the other (the upper threshold) was applied to the detection function. The automatic determination of the thresholds was based on the Kullback-Leibler distance, which gives information about the distance between the detected segments (events). Tests on simulated data demonstrated the efficiency of these improvements of the DCS algorithm.
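
    For reference, the textbook CUSUM that the DCS algorithm generalizes looks as follows (a sketch assuming Gaussian data with known pre- and post-change means; the on-line parameter estimation and the two automatically determined thresholds described above are precisely what this simple version lacks).

        import numpy as np

        def cusum_alarm(x, mu0, mu1, sigma, h):
            """Return the index of the first alarm, or None if the statistic never exceeds h."""
            llr = (mu1 - mu0) / sigma**2 * (x - (mu0 + mu1) / 2.0)  # per-sample log-likelihood ratio
            g = 0.0
            for k, z in enumerate(llr):
                g = max(0.0, g + z)       # reset at zero, accumulate evidence for a change
                if g > h:
                    return k
            return None

        rng = np.random.default_rng(5)
        signal = np.concatenate([rng.normal(0.0, 1.0, 300),     # pre-change segment
                                 rng.normal(1.0, 1.0, 200)])    # mean shifts at sample 300
        print(cusum_alarm(signal, mu0=0.0, mu1=1.0, sigma=1.0, h=5.0))  # alarm shortly after 300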

  3. Particle identification using the time-over-threshold method in the ATLAS Transition Radiation Tracker

    International Nuclear Information System (INIS)

    Akesson, T.; Arik, E.; Assamagan, K.; Baker, K.; Barberio, E.; Barberis, D.; Bertelsen, H.; Bytchkov, V.; Callahan, J.; Catinaccio, A.; Danielsson, H.; Dittus, F.; Dolgoshein, B.; Dressnandt, N.; Ebenstein, W.L.; Eerola, P.; Farthouat, P.; Froidevaux, D.; Grichkevitch, Y.; Hajduk, Z.; Hansen, J.R.; Keener, P.T.; Kekelidze, G.; Konovalov, S.; Kowalski, T.; Kramarenko, V.A.; Krivchitch, A.; Laritchev, A.; Lichard, P.; Lucotte, A.; Lundberg, B.; Luehring, F.; Mailov, A.; Manara, A.; McFarlane, K.; Mitsou, V.A.; Morozov, S.; Muraviev, S.; Nadtochy, A.; Newcomer, F.M.; Olszowska, J.; Ogren, H.; Oh, S.H.; Peshekhonov, V.; Rembser, C.; Romaniouk, A.; Rousseau, D.; Rust, D.R.; Schegelsky, V.; Sapinski, M.; Shmeleva, A.; Smirnov, S.; Smirnova, L.N.; Sosnovtsev, V.; Soutchkov, S.; Spiridenkov, E.; Tikhomirov, V.; Van Berg, R.; Vassilakopoulos, V.; Wang, C.; Williams, H.H.

    2001-01-01

    Test-beam studies of the ATLAS Transition Radiation Tracker (TRT) straw tube performance in terms of electron-pion separation using a time-over-threshold method are described. The test-beam data are compared with Monte Carlo simulations of charged particles passing through the straw tubes of the TRT. For energies below 10 GeV, the time-over-threshold method combined with the standard transition-radiation cluster-counting technique significantly improves the electron-pion separation in the TRT. The use of the time-over-threshold information also provides some kaon-pion separation, thereby significantly enhancing the B-physics capabilities of the ATLAS detector

  4. An Implementation of RC4+ Algorithm and Zig-zag Algorithm in a Super Encryption Scheme for Text Security

    Science.gov (United States)

    Budiman, M. A.; Amalia; Chayanie, N. I.

    2018-03-01

    Cryptography is the art and science of using mathematical methods to preserve message security. There are two types of cryptography, namely classical and modern cryptography. Nowadays, most people would rather use modern cryptography than classical cryptography because it is harder to break than the classical one. One classical algorithm is the Zig-zag algorithm, which uses the transposition technique: the original message is unreadable unless the person has the key to decrypt the message. To improve security, the Zig-zag Cipher is combined with the RC4+ Cipher, a symmetric key algorithm in the form of a stream cipher. The two algorithms are combined to make a super encryption, so the message becomes harder for a cryptanalyst to break. The results showed that the complexity of the combined algorithm is Θ(n^2), while the complexities of the Zig-zag Cipher and the RC4+ Cipher are Θ(n^2) and Θ(n), respectively.
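
    The classical layer of the scheme, the zig-zag (rail fence) transposition, is easy to sketch (illustrative Python; the RC4+ stream-cipher layer of the super encryption is deliberately not reproduced here):

        def zigzag_encrypt(text, rails):
            """Write the plaintext along a zig-zag over `rails` rows, then read row by row."""
            rows = [[] for _ in range(rails)]
            r, step = 0, 1
            for ch in text:
                rows[r].append(ch)
                if r == 0:
                    step = 1
                elif r == rails - 1:
                    step = -1
                r += step
            return "".join("".join(row) for row in rows)

        # Classic textbook example with three rails.
        assert zigzag_encrypt("WEAREDISCOVEREDFLEEATONCE", 3) == "WECRLTEERDSOEEFEAOCAIVDEN"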

  5. Small-threshold behaviour of two-loop self-energy diagrams: two-particle thresholds

    International Nuclear Information System (INIS)

    Berends, F.A.; Davydychev, A.I.; Moskovskij Gosudarstvennyj Univ., Moscow; Smirnov, V.A.; Moskovskij Gosudarstvennyj Univ., Moscow

    1996-01-01

    The behaviour of two-loop two-point diagrams at non-zero thresholds corresponding to two-particle cuts is analyzed. The masses involved in a cut and the external momentum are assumed to be small as compared to some of the other masses of the diagram. By employing general formulae of asymptotic expansions of Feynman diagrams in momenta and masses, we construct an algorithm to derive analytic approximations to the diagrams. In such a way, we calculate several first coefficients of the expansion. Since no conditions on relative values of the small masses and the external momentum are imposed, the threshold irregularities are described analytically. Numerical examples, using diagrams occurring in the standard model, illustrate the convergence of the expansion below the first large threshold. (orig.)

  6. Number Theory in Science and Communication With Applications in Cryptography, Physics, Digital Information, Computing, and Self-Similarity

    CERN Document Server

    Schroeder, Manfred

    2009-01-01

    "Number Theory in Science and Communication" is a well-known introduction for non-mathematicians to this fascinating and useful branch of applied mathematics . It stresses intuitive understanding rather than abstract theory and highlights important concepts such as continued fractions, the golden ratio, quadratic residues and Chinese remainders, trapdoor functions, pseudoprimes and primitive elements. Their applications to problems in the real world are one of the main themes of the book. This revised fifth edition is augmented by recent advances in coding theory, permutations and derangements and a chapter in quantum cryptography. From reviews of earlier editions – "I continue to find [Schroeder’s] Number Theory a goldmine of valuable information. It is a marvellous book, in touch with the most recent applications of number theory and written with great clarity and humor.’ Philip Morrison (Scientific American) "A light-hearted and readable volume with a wide range of applications to which the author ha...

  7. Hard decoding algorithm for optimizing thresholds under general Markovian noise

    Science.gov (United States)

    Chamberland, Christopher; Wallman, Joel; Beale, Stefanie; Laflamme, Raymond

    2017-04-01

    Quantum error correction is instrumental in protecting quantum systems from noise in quantum computing and communication settings. Pauli channels can be efficiently simulated and threshold values for Pauli error rates under a variety of error-correcting codes have been obtained. However, realistic quantum systems can undergo noise processes that differ significantly from Pauli noise. In this paper, we present an efficient hard decoding algorithm for optimizing thresholds and lowering failure rates of an error-correcting code under general completely positive and trace-preserving (i.e., Markovian) noise. We use our hard decoding algorithm to study the performance of several error-correcting codes under various non-Pauli noise models by computing threshold values and failure rates for these codes. We compare the performance of our hard decoding algorithm to decoders optimized for depolarizing noise and show improvements in thresholds and reductions in failure rates by several orders of magnitude. Our hard decoding algorithm can also be adapted to take advantage of a code's non-Pauli transversal gates to further suppress noise. For example, we show that using the transversal gates of the 5-qubit code allows arbitrary rotations around certain axes to be perfectly corrected. Furthermore, we show that Pauli twirling can increase or decrease the threshold depending upon the code properties. Lastly, we show that even if the physical noise model differs slightly from the hypothesized noise model used to determine an optimized decoder, failure rates can still be reduced by applying our hard decoding algorithm.

  8. Coherent eavesdropping attacks in tomographic quantum cryptography: Nonequivalence of quantum and classical key distillation

    International Nuclear Information System (INIS)

    Kaszlikowski, Dagomir; Lim, J.Y.; Englert, Berthold-Georg; Kwek, L.C.

    2005-01-01

    The security of a cryptographic key that is generated by communication through a noisy quantum channel relies on the ability to distill a shorter secure key sequence from a longer insecure one. We show that - for protocols that use quantum channels of any dimension and completely characterize them by state tomography - the noise threshold for classical advantage distillation of a specific kind is substantially lower than the threshold for quantum entanglement distillation if the eavesdropper can perform powerful coherent attacks. In marked contrast, earlier investigations had shown that the thresholds are identical for incoherent attacks on the same classical distillation scheme. It remains an open question whether other schemes for classical advantage distillation have higher thresholds for coherent eavesdropping attacks

  9. An n -material thresholding method for improving integerness of solutions in topology optimization

    International Nuclear Information System (INIS)

    Watts, Seth; Engineering); Tortorelli, Daniel A.; Engineering)

    2016-01-01

    It is common in solving topology optimization problems to replace an integer-valued characteristic function design field with the material volume fraction field, a real-valued approximation of the design field that permits "fictitious" mixtures of materials during intermediate iterations in the optimization process. This is reasonable so long as one can interpolate properties for such materials and so long as the final design is integer valued. For this purpose, we present a method for smoothly thresholding the volume fractions of an arbitrary number of material phases which specify the design. This method is trivial for two-material design problems, for example, the canonical topology design problem of specifying the presence or absence of a single material within a domain, but it becomes more complex when three or more materials are used, as often occurs in material design problems. We take advantage of the similarity in properties between the volume fractions and the barycentric coordinates on a simplex to derive a thresholding method that is applicable to an arbitrary number of materials. As we show in a sensitivity analysis, this method has smooth derivatives, allowing it to be used in gradient-based optimization algorithms. Finally, we present results that show synergistic effects when used with Solid Isotropic Material with Penalty and Rational Approximation of Material Properties material interpolation functions, popular methods of ensuring integerness of solutions.
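
    One way to picture such a smooth multi-material threshold (an illustrative choice, not the projection derived in the paper) is a differentiable sharpening of the per-element volume fractions that preserves the simplex constraint:

        import numpy as np

        def sharpen(volume_fractions, p=4.0):
            """Push each row of volume fractions (summing to one, like barycentric
            coordinates) toward the dominant phase while staying differentiable."""
            v = np.asarray(volume_fractions, dtype=float) ** p
            return v / v.sum(axis=-1, keepdims=True)

        mix = np.array([[0.60, 0.30, 0.10],    # clear winner: snaps toward phase 0
                        [0.34, 0.33, 0.33]])   # near-tie: stays mixed, gradients stay finite
        print(np.round(sharpen(mix, p=8.0), 3))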

  10. The PM2.5 threshold for aerosol extinction in the Beijing megacity

    Science.gov (United States)

    Kong, Lingbin; Xin, Jinyuan; Liu, Zirui; Zhang, Kequan; Tang, Guiqian; Zhang, Wenyu; Wang, Yuesi

    2017-10-01

    Particulate pollution has remained at a high level in the megacity of Beijing in the past decade. The PM2.5, PM10, aerosol optical depth (AOD), Angstrom exponent (α), and PM2.5/PM10 ratio (the proportion of PM2.5 in PM10) in Beijing were 70±6 μg m^-3, 128±6 μg m^-3, 0.57 ± 0.05, 1.10 ± 0.08, and 45 ± 4%, respectively, from 2005 to 2014. The annual means of PM concentration, AOD, α, and the PM2.5/PM10 ratio decreased slightly during this decade, while PM concentration increased in winter. Furthermore, we found that there were thresholds of PM2.5 concentration for aerosol extinction. When the PM concentration was lower than a certain threshold, AOD decreased quickly with the decline of PM concentration. To make the improvement in particle pollution more noticeable, the PM concentration should be controlled below the threshold. The annual averaged threshold is 63 μg m^-3; the threshold values reached a maximum of 74 μg m^-3 in spring and ranged from 54 to 56 μg m^-3 in the three other seasons. The threshold values ranged from 55 to 77 μg m^-3 under other relevant factors, including air mass directions and relative humidity.

  11. Physico-chemical thresholds in the distribution of fish species among French lakes

    Directory of Open Access Journals (Sweden)

    Roubeix Vincent

    2017-01-01

    Full Text Available The management of lakes requires the definition of physico-chemical thresholds to be used for ecosystem preservation or restoration. According to the European Water Framework Directive, the limits between physico-chemical quality classes must be set consistently with biological quality elements. One way to do this consists in analyzing the response of aquatic communities to environmental gradients across monitoring sites and in identifying ecological community thresholds, i.e. zones in the gradients where the species turnover is the highest. In this study, fish data from 196 lakes in France were considered to derive ecological thresholds using the multivariate method of gradient forest. The analysis was performed on 25 species and 36 environmental parameters. The results revealed the highest importance of maximal water temperature in the distribution of fish species. Other important parameters included geographical factors, dissolved organic carbon concentration and water transparency, while nutrients appeared to have low influence. In spite of the diversity of species responses to the gradients, community thresholds were detected in the gradients of the most important physico-chemical parameters and of total phosphorus and nitrate concentrations as well. The thresholds identified in such macroecological study may highlight new patterns of species natural distribution and improve niche characterization. Moreover, when factors that may be influenced by human activities are involved, the thresholds could be used to set environmental standards for lake preservation.

  12. Remote sensing of aquatic vegetation distribution in Taihu Lake using an improved classification tree with modified thresholds.

    Science.gov (United States)

    Zhao, Dehua; Jiang, Hao; Yang, Tangwu; Cai, Ying; Xu, Delin; An, Shuqing

    2012-03-01

    Classification trees (CT) have been used successfully in the past to classify aquatic vegetation from spectral indices (SI) obtained from remotely-sensed images. However, applying CT models developed for certain image dates to other time periods within the same year or among different years can reduce the classification accuracy. In this study, we developed CT models with modified thresholds using extreme SI values (CT(m)) to improve the stability of the models when applying them to different time periods. A total of 903 ground-truth samples were obtained in September of 2009 and 2010 and classified as emergent, floating-leaf, or submerged vegetation or other cover types. Classification trees were developed for 2009 (Model-09) and 2010 (Model-10) using field samples and a combination of two images from winter and summer. Overall accuracies of these models were 92.8% and 94.9%, respectively, which confirmed the ability of CT analysis to map aquatic vegetation in Taihu Lake. However, Model-10 had only 58.9-71.6% classification accuracy and 31.1-58.3% agreement (i.e., pixels classified the same in the two maps) for aquatic vegetation when it was applied to image pairs from both a different time period in 2010 and a similar time period in 2009. We developed a method to estimate the effects of extrinsic (EF) and intrinsic (IF) factors on model uncertainty using Modis images. Results indicated that 71.1% of the instability in classification between time periods was due to EF, which might include changes in atmospheric conditions, sun-view angle and water quality. The remainder was due to IF, such as phenological and growth status differences between time periods. The modified version of Model-10 (i.e. CT(m)) performed better than traditional CT with different image dates. When applied to 2009 images, the CT(m) version of Model-10 had very similar thresholds and performance as Model-09, with overall accuracies of 92.8% and 90.5% for Model-09 and the CT(m) version of Model

  13. A novel gate and drain engineered charge plasma tunnel field-effect transistor for low sub-threshold swing and ambipolar nature

    Science.gov (United States)

    Yadav, Dharmendra Singh; Raad, Bhagwan Ram; Sharma, Dheeraj

    2016-12-01

    In this paper, we focus on improving the figures of merit of the charge-plasma-based tunnel field-effect transistor (TFET) in terms of ON-state current, threshold voltage, sub-threshold swing, ambipolar behavior, and gate-to-drain capacitance, which provides better channel control of the device with improved high-frequency response at ultra-low supply voltages. To this end, we simultaneously employ work function engineering on the drain and gate electrodes of the charge plasma TFET. Gate work function engineering modulates the barrier at the source/channel interface, leading to improvements in the ON-state current, threshold voltage, and sub-threshold swing. In addition, the use of work function engineering on the drain electrode, applied here for the first time, increases the tunneling barrier for the flow of holes at the drain/channel interface, which suppresses the ambipolar behavior. The lowering of the gate-to-drain capacitance in turn enhances the high-frequency parameters. Overall, the presence of dual work functionality at the gate electrode and over the drain region improves the performance of the charge-plasma-based TFET.

  14. Modification of electrical pain threshold by voluntary breathing-controlled electrical stimulation (BreEStim in healthy subjects.

    Directory of Open Access Journals (Sweden)

    Shengai Li

    Full Text Available BACKGROUND: Pain has a distinct sensory and affective (i.e., unpleasantness) component. BreEStim, during which electrical stimulation is delivered during voluntary breathing, has been shown to selectively reduce the affective component of post-amputation phantom pain. The objective was to examine whether BreEStim increases pain threshold such that subjects could have improved tolerance of sensation of painful stimuli. METHODS: Eleven pain-free healthy subjects (7 males, 4 females) participated in the study. All subjects received BreEStim (100 stimuli) and conventional electrical stimulation (EStim, 100 stimuli) to two acupuncture points (Neiguan and Weiguan) of the dominant hand in a random order. The two different treatments were provided at least three days apart. Painful, but tolerable electrical stimuli were delivered randomly during EStim, but were triggered by effortful inhalation during BreEStim. Measurements of tactile sensation threshold, electrical sensation and electrical pain thresholds, and thermal (cold sensation, warm sensation, cold pain and heat pain) thresholds were recorded from the thenar eminence of both hands. These measurements were taken pre-intervention and 10-min post-intervention. RESULTS: There was no difference in the pre-intervention baseline measurement of all thresholds between BreEStim and EStim. The electrical pain threshold significantly increased after BreEStim (27.5±6.7% for the dominant hand and 28.5±10.8% for the non-dominant hand, respectively). The electrical pain threshold significantly decreased after EStim (9.1±2.8% for the dominant hand and 10.2±4.6% for the non-dominant hand, respectively) (F[1, 10] = 30.992, p = .00024). There was no statistically significant change in other thresholds after BreEStim and EStim. The intensity of electrical stimuli was progressively increased, but no difference was found between BreEStim and EStim. CONCLUSION: Voluntary breathing controlled electrical stimulation

  15. Superconducting nanowire single-photon detectors: physics and applications

    International Nuclear Information System (INIS)

    Natarajan, Chandra M; Tanner, Michael G; Hadfield, Robert H

    2012-01-01

    Single-photon detectors based on superconducting nanowires (SSPDs or SNSPDs) have rapidly emerged as a highly promising photon-counting technology for infrared wavelengths. These devices offer high efficiency, low dark counts and excellent timing resolution. In this review, we consider the basic SNSPD operating principle and models of device behaviour. We give an overview of the evolution of SNSPD device design and the improvements in performance which have been achieved. We also evaluate device limitations and noise mechanisms. We survey practical refrigeration technologies and optical coupling schemes for SNSPDs. Finally we summarize promising application areas, ranging from quantum cryptography to remote sensing. Our goal is to capture a detailed snapshot of an emerging superconducting detector technology on the threshold of maturity. (topical review)

  16. Hyper-arousal decreases human visual thresholds.

    Directory of Open Access Journals (Sweden)

    Adam J Woods

    Full Text Available Arousal has long been known to influence behavior and serves as an underlying component of cognition and consciousness. However, the consequences of hyper-arousal for visual perception remain unclear. The present study evaluates the impact of hyper-arousal on two aspects of visual sensitivity: visual stereoacuity and contrast thresholds. Sixty-eight participants took part in two experiments. Thirty-four participants were randomly divided into two groups in each experiment: Arousal Stimulation or Sham Control. The Arousal Stimulation group underwent a 50-second cold pressor stimulation (immersing the foot in 0-2 °C water), a technique known to increase arousal. In contrast, the Sham Control group immersed their foot in room temperature water. Stereoacuity thresholds (Experiment 1) and contrast thresholds (Experiment 2) were measured before and after stimulation. The Arousal Stimulation groups demonstrated significantly lower stereoacuity and contrast thresholds following cold pressor stimulation, whereas the Sham Control groups showed no difference in thresholds. These results provide the first evidence that hyper-arousal from sensory stimulation can lower visual thresholds. Hyper-arousal's ability to decrease visual thresholds has important implications for survival, sports, and everyday life.

  17. Hierarchical Threshold Adaptive for Point Cloud Filter Algorithm of Moving Surface Fitting

    Directory of Open Access Journals (Sweden)

    ZHU Xiaoxiao

    2018-02-01

    Full Text Available In order to improve the accuracy, efficiency and adaptability of point cloud filtering, a hierarchical threshold-adaptive point cloud filtering algorithm based on moving surface fitting is proposed. First, noisy points are removed using a statistical histogram method. Second, a grid index is established by grid segmentation, and the surface equation is fitted through the lowest points among the neighborhood grids; the actual height and the fitted height are calculated, and the difference between the elevation and the threshold determines whether a point is filtered out. Finally, to improve the filtering accuracy, hierarchical filtering changes the grid size and automatically sets the neighborhood size and threshold until the filtering result reaches the accuracy requirement. Test data provided by the International Society for Photogrammetry and Remote Sensing (ISPRS) are used to verify the algorithm. The type I error, type II error and total error are 7.33%, 10.64% and 6.34%, respectively. The algorithm is compared with the eight classical filtering algorithms published by ISPRS. The experimental results show that the method is well adapted and yields highly accurate filtering results.

  18. A numerical study of threshold states

    International Nuclear Information System (INIS)

    Ata, M.S.; Grama, C.; Grama, N.; Hategan, C.

    1979-01-01

    There is some experimental evidence of charged-particle threshold states. On the statistical background of levels, some simple structures were observed in the excitation spectrum. They occur near the Coulomb threshold and have a large reduced width for decay into the threshold channel. These states were identified as charged-cluster threshold states. Such threshold states were observed in ^{15,16,17,18}O, ^{18,19}F, ^{19,20}Ne, ^{24}Mg and ^{32}S. The types of clusters involved were d, t, ^3He, α and even ^12C. They were observed in heavy-ion transfer reactions as strongly excited levels in the residual nucleus. The charged-particle threshold states occur as simple structures at high excitation energy. They could be of interest from both the nuclear structure and the nuclear reaction mechanism points of view. They could be excited as simple structures both in the compound and in the residual nucleus. (author)

  19. Prediction Model and Principle of End-of-Life Threshold for Lithium Ion Batteries Based on Open Circuit Voltage Drifts

    International Nuclear Information System (INIS)

    Cui, Yingzhi; Yang, Jie; Du, Chunyu; Zuo, Pengjian; Gao, Yunzhi; Cheng, Xinqun; Ma, Yulin; Yin, Geping

    2017-01-01

    Highlights: •Open circuit voltage evolution over ageing of lithium ion batteries is deciphered. •The mechanism responsible for the end-of-life (EOL) threshold is elaborated. •A new prediction model of EOL threshold with improved accuracy is developed. •This EOL prediction model is promising for the applications in electric vehicles. -- Abstract: The end-of-life (EOL) of a lithium ion battery (LIB) is defined as the time point when the LIB can no longer provide sufficient power or energy to accomplish its intended function. Generally, the EOL occurs abruptly when the degradation of a LIB reaches the threshold. Therefore, current prediction methods of EOL by extrapolating the early degradation behavior often result in significant errors. To address this problem, this paper analyzes the reason for the EOL threshold of a LIB with shallow depth of discharge. It is found that the sudden appearance of EOL threshold results from the drift of open circuit voltage (OCV) at the end of both shallow depth and full discharges. Further, a new EOL threshold prediction model with highly improved accuracy is developed based on the OCV drifts and their evolution mechanism, which can effectively avoid the misjudgment of EOL threshold. The accuracy of this EOL threshold prediction model is verified by comparing with experimental results. The EOL threshold prediction model can be applied to other battery chemistry systems and its possible application in electric vehicles is finally discussed.

  20. Conceptions of nuclear threshold status

    International Nuclear Information System (INIS)

    Quester, G.H.

    1991-01-01

    This paper reviews some alternative definitions of nuclear threshold status. Each of them is important, and major analytical confusion would result if one sense of the term were mistaken for another. The motives for nations entering into such threshold status are a blend of civilian and military gains, and of national interests versus parochial or bureaucratic interests. A portion of the rationale for threshold status emerges inevitably from the pursuit of economic goals, and another portion is made more attractive by the drives of the domestic political process. Yet the impact on international security cannot be dismissed, especially where conflicts among the states remain real. Among the military or national-security motives are basic deterrence, psychological warfare, war-fighting and, more generally, national prestige. In the end, as the threshold phenomenon is assayed for lessons concerning the role of nuclear weapons more generally in international relations and security, one might conclude that threshold status and outright proliferation converge to a degree in the motives for all of the states involved and in the advantages attained. As this paper has illustrated, nuclear threshold status is more subtle and more ambiguous than outright proliferation, and it takes considerable time to sort out the complexities. Yet the world has now had a substantial amount of time to deal with this ambiguous status, and this may tempt more states to exploit it.

  1. Variation in the Hearing Threshold in Women during the Menstrual Cycle.

    Science.gov (United States)

    Souza, Dayse da Silva; Luckwu, Brunna; Andrade, Wagner Teobaldo Lopes de; Pessoa, Luciane Spinelli de Figueiredo; Nascimento, João Agnaldo do; Rosa, Marine Raquel Diniz da

    2017-10-01

    Introduction  The hormonal changes that occur during the menstrual cycle and their relationship with hearing problems have been studied, but they have not been well explained. Objective  The objective of our study is to investigate the variation in hearing thresholds in women during the menstrual cycle. Method  We conducted a longitudinal cohort study of 30 volunteers aged 18-39 years, of whom 20 were women tested across the phases of the menstrual cycle and 10 were men (control group); all underwent audiometry and impedance exams to correlate possible audiological changes with each phase of the menstrual cycle. Results  Significant changes in hearing thresholds were observed across the menstrual cycle phases both in the group of women who used hormonal contraceptives and in the group who did not. Improved hearing thresholds were observed in the late follicular phase in the group who did not use hormonal contraceptives, and thresholds at high frequencies were better. Across the menstrual cycle phases, the mean variation between weeks was 3.6 dB HL in the group who used hormonal contraceptives and 4.09 dB HL in the group who did not. Conclusions  The present study found that there may be a relationship between hearing changes and hormonal fluctuations during the menstrual cycle, based on changes in the hearing thresholds of women. In addition, this study suggests that estrogen has an otoprotective effect on hearing, since the best hearing thresholds were found when estrogen was at its maximum peak.

  2. Fabrication of Pt nanowires with a diffraction-unlimited feature size by high-threshold lithography

    International Nuclear Information System (INIS)

    Li, Li; Zhang, Ziang; Yu, Miao; Song, Zhengxun; Weng, Zhankun; Wang, Zuobin; Li, Wenjun; Wang, Dapeng; Zhao, Le; Peng, Kuiqing

    2015-01-01

    Although the nanoscale world can already be observed at a diffraction-unlimited resolution using far-field optical microscopy, to make the step from microscopy to lithography still requires a suitable photoresist material system. In this letter, we consider the threshold to be a region with a width characterized by the extreme feature size obtained using a Gaussian beam spot. By narrowing such a region through improvement of the threshold sensitization to intensity in a high-threshold material system, the minimal feature size becomes smaller. By using platinum as the negative photoresist, we demonstrate that high-threshold lithography can be used to fabricate nanowire arrays with a scalable resolution along the axial direction of the linewidth from the micro- to the nanoscale using a nanosecond-pulsed laser source with a wavelength λ0 = 1064 nm. The minimal feature size is only several nanometers (sub-λ0/100). Compared with conventional polymer resist lithography, the advantages of high-threshold lithography are sharper pinpoints of laser intensity triggering the threshold response and also higher robustness, allowing for large-area exposure by a less expensive nanosecond-pulsed laser.

  3. Threshold Concepts in Finance: Student Perspectives

    Science.gov (United States)

    Hoadley, Susan; Kyng, Tim; Tickle, Leonie; Wood, Leigh N.

    2015-01-01

    Finance threshold concepts are the essential conceptual knowledge that underpin well-developed financial capabilities and are central to the mastery of finance. In this paper we investigate threshold concepts in finance from the point of view of students, by establishing the extent to which students are aware of threshold concepts identified by…

  4. Threshold resummation for slepton-pair production at hadron colliders

    International Nuclear Information System (INIS)

    Bozzi, Giuseppe; Fuks, Benjamin; Klasen, Michael

    2007-01-01

    We present a first and extensive study of threshold resummation effects for supersymmetric (SUSY) particle production at hadron colliders, focusing on Drell-Yan like slepton-pair and slepton-sneutrino associated production. After confirming the known next-to-leading order (NLO) QCD corrections and generalizing the NLO SUSY-QCD corrections to the case of mixing squarks in the virtual loop contributions, we employ the usual Mellin N-space resummation formalism with the minimal prescription for the inverse Mellin-transform and improve it by resumming 1/N-suppressed and a class of N-independent universal contributions. Numerically, our results increase the theoretical cross sections by 5 to 15% with respect to the NLO predictions and stabilize them by reducing the scale dependence from up to 20% at NLO to less than 10% with threshold resummation

  6. Thresholding magnetic resonance images of human brain

    Institute of Scientific and Technical Information of China (English)

    Qing-mao HU; Wieslaw L NOWINSKI

    2005-01-01

    In this paper, methods are proposed and validated to determine low and high thresholds for segmenting out gray matter and white matter in MR images of different pulse sequences of the human brain. First, a two-dimensional reference image is determined to represent the intensity characteristics of the original three-dimensional data. Then a region of interest of the reference image is determined where brain tissues are present. Unsupervised fuzzy c-means clustering is employed to determine: the threshold for obtaining the head mask, the low threshold for T2-weighted and PD-weighted images, and the high threshold for T1-weighted, SPGR and FLAIR images. Supervised range-constrained thresholding is employed to determine the low threshold for T1-weighted, SPGR and FLAIR images. Thresholding based on pairs of boundary pixels is proposed to determine the high threshold for T2- and PD-weighted images. Quantification against public data sets with various noise and inhomogeneity levels shows that the proposed methods can yield segmentation robust to noise and intensity inhomogeneity. Qualitatively, the proposed methods work well with real clinical data.
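
    The clustering step lends itself to a compact illustration. The sketch below, a simplification of the paper's pipeline, runs unsupervised fuzzy c-means on a 1-D sample of intensities and places a low threshold midway between the two lowest cluster centres; the number of clusters and the midpoint rule are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def fuzzy_cmeans_1d(x, n_clusters=3, m=2.0, n_iter=100, seed=0):
    """Unsupervised fuzzy c-means on 1-D intensities; returns the sorted cluster centres."""
    x = np.asarray(x, float).ravel()
    rng = np.random.default_rng(seed)
    centres = rng.choice(x, size=n_clusters, replace=False)
    for _ in range(n_iter):
        d = np.abs(x[None, :] - centres[:, None]) + 1e-9             # (clusters, N) distances
        u = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2 / (m - 1)), axis=1)
        um = u ** m                                                   # fuzzified memberships
        centres = (um @ x) / um.sum(axis=1)
    return np.sort(centres)

def low_threshold(intensities):
    """Illustrative rule: threshold between the background and the darker tissue class."""
    c = fuzzy_cmeans_1d(intensities, n_clusters=3)
    return 0.5 * (c[0] + c[1])
```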

  7. Near threshold fatigue testing

    Science.gov (United States)

    Freeman, D. C.; Strum, M. J.

    1993-01-01

    Measurement of the near-threshold fatigue crack growth rate (FCGR) behavior provides a basis for the design and evaluation of components subjected to high-cycle fatigue. Typically, the near-threshold fatigue regime describes crack growth rates below approximately 10^-5 mm/cycle (4 x 10^-7 inch/cycle). One such evaluation was recently performed for the binary alloy U-6Nb. The procedures developed for this evaluation are described in detail to provide a general test method for near-threshold FCGR testing. In particular, techniques for high-resolution measurements of crack length performed in situ through a direct current potential drop (DCPD) apparatus, and a method which eliminates crack closure effects through the use of loading cycles with constant maximum stress intensity, are described.

  8. An Advanced Method to Apply Multiple Rainfall Thresholds for Urban Flood Warnings

    Directory of Open Access Journals (Sweden)

    Jiun-Huei Jang

    2015-11-01

    Full Text Available Issuing warning information to the public when rainfall exceeds given thresholds is a simple and widely-used method to minimize flood risk; however, this method lacks sophistication when compared with hydrodynamic simulation. In this study, an advanced methodology is proposed to improve the warning effectiveness of the rainfall threshold method for urban areas through deterministic-stochastic modeling, without sacrificing simplicity and efficiency. With regards to flooding mechanisms, rainfall thresholds of different durations are divided into two groups accounting for flooding caused by drainage overload and disastrous runoff, which help in grading the warning level in terms of emergency and severity when the two are observed together. A flood warning is then classified into four levels distinguished by green, yellow, orange, and red lights in ascending order of priority that indicate the required measures, from standby, flood defense, evacuation to rescue, respectively. The proposed methodology is tested according to 22 historical events in the last 10 years for 252 urbanized townships in Taiwan. The results show satisfactory accuracy in predicting the occurrence and timing of flooding, with a logical warning time series for taking progressive measures. For systems with multiple rainfall thresholds already in place, the methodology can be used to ensure better application of rainfall thresholds in urban flood warnings.
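
    One plausible reading of the four-level grading is sketched below: short-duration thresholds stand in for drainage overload, long-duration thresholds for disastrous runoff, and the warning colour follows from which group is exceeded. The threshold values and durations are placeholders; the operational values are calibrated per township in the paper.

```python
# Hypothetical duration-specific rainfall thresholds in mm (placeholders, not calibrated values)
OVERLOAD_THRESHOLDS = {1: 40, 3: 70}     # short durations -> drainage overload
RUNOFF_THRESHOLDS = {6: 110, 12: 160}    # long durations  -> disastrous runoff

def warning_level(accumulated_rainfall):
    """accumulated_rainfall maps a duration in hours to the observed rainfall in mm."""
    overload = any(accumulated_rainfall.get(d, 0) >= t for d, t in OVERLOAD_THRESHOLDS.items())
    runoff = any(accumulated_rainfall.get(d, 0) >= t for d, t in RUNOFF_THRESHOLDS.items())
    if overload and runoff:
        return "red"       # rescue
    if runoff:
        return "orange"    # evacuation
    if overload:
        return "yellow"    # flood defense
    return "green"         # standby

print(warning_level({1: 45, 3: 60, 6: 90}))   # -> "yellow"
```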

  9. A New Image Encryption Technique Combining Hill Cipher Method, Morse Code and Least Significant Bit Algorithm

    Science.gov (United States)

    Nofriansyah, Dicky; Defit, Sarjon; Nurcahyo, Gunadi W.; Ganefri, G.; Ridwan, R.; Saleh Ahmar, Ansari; Rahim, Robbi

    2018-01-01

    Cybercrime is one of the most serious threats to information security. One effort to reduce cybercrime is to find new techniques for securing data, such as combinations of cryptography, steganography and watermarking. Cryptography and steganography are growing fields of data-security research, and combining them is one way to improve data integrity. The new technique proposed here combines several algorithms, namely the Hill cipher method, Morse code and the least significant bit (LSB) algorithm. Morse code is one of the communication codes used in the Scouting field and consists of dots and dashes, giving the scheme a mix of classic and modern elements for maintaining data integrity. The combination of these three methods is expected to yield a new algorithm that improves the security of data, especially images.
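
    For reference, the classical Hill cipher stage of such a combination can be sketched in a few lines of Python; the 2x2 key matrix below is an arbitrary invertible-mod-26 example, not the key or the exact pipeline used by the authors, and the Morse and LSB stages are only indicated in a comment.

```python
import numpy as np

KEY = np.array([[3, 3],
                [2, 5]])      # example key: det = 9, gcd(9, 26) = 1, so it is invertible mod 26

def hill_encrypt(plaintext, key=KEY):
    """Classic Hill cipher over the 26-letter alphabet (illustrative key and padding)."""
    nums = [ord(c) - ord('A') for c in plaintext.upper() if c.isalpha()]
    if len(nums) % 2:
        nums.append(ord('X') - ord('A'))          # pad to the 2-letter block size
    blocks = np.array(nums).reshape(-1, 2)
    cipher = (blocks @ key.T) % 26                # C = K * P for each block
    return ''.join(chr(int(v) + ord('A')) for v in cipher.ravel())

print(hill_encrypt("SECRET"))
# A combined scheme would then translate the ciphertext into Morse dots/dashes and
# embed the resulting bits in an image's least significant bits (LSB steganography).
```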

  10. Input-output relation and energy efficiency in the neuron with different spike threshold dynamics.

    Science.gov (United States)

    Yi, Guo-Sheng; Wang, Jiang; Tsang, Kai-Ming; Wei, Xi-Le; Deng, Bin

    2015-01-01

    A neuron encodes and transmits information by generating sequences of output spikes, which is a highly energy-consuming process. A spike is initiated when membrane depolarization reaches a threshold voltage. In many neurons, the threshold is dynamic and depends on the rate of membrane depolarization (dV/dt) preceding a spike. Identifying the metabolic energy involved in neural coding and its relationship to threshold dynamics is critical to understanding neuronal function and evolution. Here, we use a modified Morris-Lecar model to investigate the neuronal input-output property and energy efficiency associated with different spike threshold dynamics. We find that neurons with a dynamic threshold sensitive to dV/dt generate a discontinuous frequency-current curve and a type II phase response curve (PRC) through a Hopf bifurcation, and weak noise can prohibit spiking when the bifurcation has just occurred. A threshold that is insensitive to dV/dt, instead, results in a continuous frequency-current curve, a type I PRC and a saddle-node-on-invariant-circle bifurcation, and weak noise cannot inhibit spiking. It is also shown that the bifurcation, frequency-current curve and PRC type associated with different threshold dynamics arise from distinct subthreshold interactions of membrane currents. Further, we observe that the energy consumption of the neuron is related to its firing characteristics. The depolarization of the spike threshold improves neuronal energy efficiency by reducing the overlap of Na(+) and K(+) currents during an action potential. High energy efficiency is achieved at a more depolarized spike threshold and high stimulus current. These results provide a fundamental biophysical connection that links spike threshold dynamics, input-output relations, energetics and spike initiation, which could contribute to uncovering the neural encoding mechanism.

  11. SU-E-I-56: Threshold Effect of ASIR Before Which Image Improve and After Which Image Degrades.

    Science.gov (United States)

    Abdulkhaliq, F; Mail, N; Saoudi, A

    2012-06-01

    This study examined to what extent ASIR improves CT images and to what extent it degrades them. We used a GE HD750 CT scanner, a Siemens Sensation CT scanner, a Catphan phantom, a PTW pin ion chamber and a CTDI phantom. CT dose was measured using the PTW pin ion chamber and the CTDI phantom; image quality and noise were evaluated using the Catphan and a GE water phantom. Image noise decreases as higher levels of ASIR are applied. A phantom scan showed that 50% ASIR at 50% lower dose (10.8 mGy) achieved the same image noise as a standard FBP image at full dose, 21.7 mGy (noise ~5). To confirm that the two same-noise images retain the same image quality, two scans were compared: one at full dose, 260 mAs (21.7 mGy), and one at 50% lower dose, 130 mAs (10.8 mGy). The results showed that ASIR failed to retain the same quality. For high-contrast resolution, 50% ASIR reduced the resolution of 7 lp/cm patterns but improved the detectability of 6 lp/cm patterns. ASIR degraded the CNR of low-contrast objects of 5 HU (from a CNR of 1.4 at 260 mAs standard to 1.08 at 130 mAs ASIR) but improved the CNR of low-contrast objects of 10 HU (from 2.35 at 260 mAs standard to 2.63 at 130 mAs ASIR). ASIR degraded edges and removed some of the small objects. This shows that ASIR has a critical point between improvement and degradation. ASIR can also improve images at the same dose, but high levels of ASIR (e.g. 100% ASIR) cause small low-contrast objects (e.g. 2 mm) to disappear. ASIR is often assumed only to improve images and reduce patient dose; our study showed that it has drawbacks, with a threshold before which ASIR is beneficial and after which it is detrimental. Currently only GE offers ASIR on the market, but our study showed that other CT systems, such as Siemens, can achieve similar performance. © 2012 American Association of Physicists in Medicine.

  12. At-Risk-of-Poverty Threshold

    Directory of Open Access Journals (Sweden)

    Táňa Dvornáková

    2012-06-01

    Full Text Available European Statistics on Income and Living Conditions (EU-SILC) is a survey on households' living conditions. The main aim of the survey is to obtain long-term comparable data on the social and economic situation of households. Data collected in the survey are used mainly in connection with the evaluation of income poverty and the determination of the at-risk-of-poverty rate. This article deals with the calculation of the at-risk-of-poverty threshold based on data from EU-SILC 2009. The main task is to compare two approaches to the computation of the at-risk-of-poverty threshold. The first approach is based on calculating the threshold for each country separately, while the second is based on calculating the threshold for all states together. The introduction summarizes attributes common to both calculations of the at-risk-of-poverty threshold, such as disposable household income and equivalised household income. Further, the different approaches to the two calculations are introduced and their advantages and disadvantages are stated. Finally, the calculation of the at-risk-of-poverty rate is described and the at-risk-of-poverty rates based on these two different approaches are compared.
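
    As a rough illustration of the quantities involved (not the article's own computation), the sketch below equivalises household income with the modified OECD scale and sets the threshold at the conventional 60% of the median equivalised income; the income arrays and country codes are hypothetical.

```python
import numpy as np

# Modified OECD scale: 1.0 for the first adult, 0.5 per further member aged 14+, 0.3 per child
def equivalised_income(disposable_income, adults, children):
    return disposable_income / (1.0 + 0.5 * (adults - 1) + 0.3 * children)

def poverty_threshold(equiv_incomes, share=0.6):
    """At-risk-of-poverty threshold as a share (conventionally 60%) of the median."""
    return share * np.median(equiv_incomes)

# The two approaches compared in the article: per-country thresholds vs. one pooled threshold
# incomes_by_country = {"CZ": np.array([...]), "AT": np.array([...]), ...}   # hypothetical data
# per_country = {c: poverty_threshold(v) for c, v in incomes_by_country.items()}
# pooled = poverty_threshold(np.concatenate(list(incomes_by_country.values())))
```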

  13. Lightweight Data Aggregation Scheme against Internal Attackers in Smart Grid Using Elliptic Curve Cryptography

    Directory of Open Access Journals (Sweden)

    Debiao He

    2017-01-01

    Full Text Available Recent advances in Internet and microelectronics technologies have led to the concept of the smart grid, which has attracted widespread attention from industry, governments and academia. The openness of communications in the smart grid environment makes the system vulnerable to different types of attacks. The implementation of secure communication and the protection of consumers' privacy have become challenging issues. The data aggregation scheme is an important technique for preserving consumers' privacy because it can stop the leakage of a specific consumer's data. To satisfy the security requirements of practical applications, many data aggregation schemes have been presented over the last several years. However, most of them suffer from security weaknesses or have poor performance. To reduce computation cost and achieve better security, we construct a lightweight data aggregation scheme against internal attackers in the smart grid environment using Elliptic Curve Cryptography (ECC). Security analysis of our proposed approach shows that it is provably secure and can provide confidentiality, authentication and integrity. Performance analysis demonstrates that both the computation and communication costs of the proposed scheme are much lower than those of three previous schemes. As a result of these benefits, the proposed lightweight data aggregation scheme is more practical for deployment in the smart grid environment.

  14. Threshold Multi Split-Row algorithm for decoding irregular LDPC codes

    Directory of Open Access Journals (Sweden)

    Chakir Aqil

    2017-12-01

    Full Text Available In this work, we propose a new threshold multi split-row algorithm that improves on the multi split-row algorithm for decoding irregular LDPC codes. We give a complete description of our algorithm as well as its advantages for LDPC codes. Simulation results over an additive white Gaussian noise channel show an improvement in error performance of between 0.4 dB and 0.6 dB compared to the multi split-row algorithm.

  15. Frequency modulation television analysis: Threshold impulse analysis. [with computer program

    Science.gov (United States)

    Hodge, W. H.

    1973-01-01

    A computer program is developed to calculate the FM threshold impulse rates as a function of the carrier-to-noise ratio for a specified FM system. The system parameters and a vector of 1024 integers, representing the probability density of the modulating voltage, are required as input parameters. The computer program is utilized to calculate threshold impulse rates for twenty-four sets of measured probability data supplied by NASA and for sinusoidal and Gaussian modulating waveforms. As a result of the analysis several conclusions are drawn: (1) The use of preemphasis in an FM television system improves the threshold by reducing the impulse rate. (2) Sinusoidal modulation produces a total impulse rate which is a practical upper bound for the impulse rates of TV signals providing the same peak deviations. (3) As the moment of the FM spectrum about the center frequency of the predetection filter increases, the impulse rate tends to increase. (4) A spectrum having an expected frequency above (below) the center frequency of the predetection filter produces a higher negative (positive) than positive (negative) impulse rate.

  16. Summary of DOE threshold limits efforts

    International Nuclear Information System (INIS)

    Wickham, L.E.; Smith, C.F.; Cohen, J.J.

    1987-01-01

    The Department of Energy (DOE) has been developing the concept of threshold quantities for use in determining which waste materials may be disposed of as nonradioactive waste in DOE sanitary landfills. Waste above a threshold level could be managed as radioactive or mixed waste (if hazardous chemicals are present); waste below this level would be handled as sanitary waste. After extensive review of a draft threshold guidance document in 1985, a second draft threshold background document was produced in March 1986. The second draft included a preliminary cost-benefit analysis and quality assurance considerations. The review of the second draft has been completed. Final changes to be incorporated include an in-depth cost-benefit analysis of two example sites and recommendations of how to further pursue (i.e. employ) the concept of threshold quantities within the DOE. 3 references

  17. A system dynamics model of clinical decision thresholds for the detection of developmental-behavioral disorders

    Directory of Open Access Journals (Sweden)

    R. Christopher Sheldrick

    2016-11-01

    Full Text Available Abstract Background Clinical decision-making has been conceptualized as a sequence of two separate processes: assessment of patients’ functioning and application of a decision threshold to determine whether the evidence is sufficient to justify a given decision. A range of factors, including use of evidence-based screening instruments, has the potential to influence either or both processes. However, implementation studies seldom specify or assess the mechanism by which screening is hypothesized to influence clinical decision-making, thus limiting their ability to address unexpected findings regarding clinicians’ behavior. Building on prior theory and empirical evidence, we created a system dynamics (SD model of how physicians’ clinical decisions are influenced by their assessments of patients and by factors that may influence decision thresholds, such as knowledge of past patient outcomes. Using developmental-behavioral disorders as a case example, we then explore how referral decisions may be influenced by changes in context. Specifically, we compare predictions from the SD model to published implementation trials of evidence-based screening to understand physicians’ management of positive screening results and changes in referral rates. We also conduct virtual experiments regarding the influence of a variety of interventions that may influence physicians’ thresholds, including improved access to co-located mental health care and improved feedback systems regarding patient outcomes. Results Results of the SD model were consistent with recent implementation trials. For example, the SD model suggests that if screening improves physicians’ accuracy of assessment without also influencing decision thresholds, then a significant proportion of children with positive screens will not be referred and the effect of screening implementation on referral rates will be modest—results that are consistent with a large proportion of published

  18. Phased arrays: A strategy to lower the energy threshold for neutrinos

    Directory of Open Access Journals (Sweden)

    Wissel Stephanie

    2017-01-01

    Full Text Available In-ice radio arrays are optimized for detecting the highest-energy, cosmogenic neutrinos expected to be produced through cosmic ray interactions with background photons. However, there are two expected populations of high-energy neutrinos: the astrophysical flux observed by IceCube (~1 PeV) and the cosmogenic flux (~10^17 eV, or 100 PeV). Typical radio arrays employ a noise-riding trigger, which limits their minimum energy threshold based on the background noise temperature of the ice. Phased radio arrays could lower the energy threshold by combining the signals from several channels before triggering, thereby improving the signal-to-noise ratio at the trigger level. Reducing the energy threshold would allow radio experiments to overlap more efficiently with optical Cherenkov neutrino telescopes and to search more efficiently for cosmogenic neutrinos. We discuss the proposed technique and prototype phased arrays deployed in an anechoic chamber and at Greenland's Summit Station.
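
    The gain from phasing can be illustrated with a toy simulation: a weak, already time-aligned signal is common to all channels while the thermal noise is independent, so the coherently summed waveform seen by the trigger has an SNR roughly N times that of a single antenna. The signal amplitude, noise level and antenna count below are arbitrary choices, not detector parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
n_ant, n_samp = 8, 4096
signal = 0.3 * np.sin(2 * np.pi * 0.05 * np.arange(n_samp))   # weak common signal

# The same (already time-aligned) signal appears on every antenna; thermal noise is independent
noise = rng.normal(0.0, 1.0, size=(n_ant, n_samp))

def snr_db(sig, nse):
    return 10 * np.log10(np.mean(sig ** 2) / np.mean(nse ** 2))

single_snr = snr_db(signal, noise[0])                     # one antenna feeding the trigger
phased_snr = snr_db(n_ant * signal, noise.sum(axis=0))    # coherent sum before the trigger

# Signal voltage adds as N while noise power adds as N, so the trigger-level SNR
# gains roughly 10*log10(N), about 9 dB for 8 antennas.
print(f"single: {single_snr:.1f} dB   phased: {phased_snr:.1f} dB")
```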

  19. Optical Associative Memory Model With Threshold Modification Using Complementary Vector

    Science.gov (United States)

    Bian, Shaoping; Xu, Kebin; Hong, Jing

    1989-02-01

    A new criterion to evaluate the similarity between two vectors in associative memory is presented. According to it, an experimental study of an optical associative memory model with threshold modification using a complementary vector is carried out. This model is capable of eliminating the possibility of erroneous recall, and therefore the accuracy of readout is improved.

  20. Pressure Systems Stored-Energy Threshold Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Paulsen, Samuel S.

    2009-08-25

    Federal Regulation 10 CFR 851, which became effective in February 2007, brought to light potential weaknesses in the Pressure Safety Program at the Pacific Northwest National Laboratory (PNNL). The definition of a pressure system in 10 CFR 851 does not contain a limit based upon pressure or any other criterion. Therefore, the need for a method to determine an appropriate risk-based hazard level for pressure safety was identified. The Laboratory has historically used a stored energy of 1000 lbf-ft to define a pressure hazard; however, an analytical basis for this value had not been documented. This document establishes the technical basis by evaluating the use of stored energy as an appropriate criterion to establish a pressure hazard, exploring a suitable risk threshold for pressure hazards, and reviewing the methods used to determine stored energy. The literature review and technical analysis conclude that the use of stored energy as a method for determining potential risk, the 1000 lbf-ft threshold, and the methods used by PNNL to calculate stored energy are all appropriate. Recommendations for further program improvements are also discussed.

  1. An Advanced Encryption Standard Powered Mutual Authentication Protocol Based on Elliptic Curve Cryptography for RFID, Proven on WISP

    Directory of Open Access Journals (Sweden)

    Alaauldin Ibrahim

    2017-01-01

    Full Text Available Information in patients’ medical histories is subject to various security and privacy concerns. Meanwhile, any modification or error in a patient’s medical data may cause serious or even fatal harm. To protect and transfer this valuable and sensitive information in a secure manner, radio-frequency identification (RFID technology has been widely adopted in healthcare systems and is being deployed in many hospitals. In this paper, we propose a mutual authentication protocol for RFID tags based on elliptic curve cryptography and advanced encryption standard. Unlike existing authentication protocols, which only send the tag ID securely, the proposed protocol could also send the valuable data stored in the tag in an encrypted pattern. The proposed protocol is not simply a theoretical construct; it has been coded and tested on an experimental RFID tag. The proposed scheme achieves mutual authentication in just two steps and satisfies all the essential security requirements of RFID-based healthcare systems.
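
    A minimal sketch of the general ECC-plus-AES pattern (not the authors' two-step protocol) using the PyCA `cryptography` package: an ECDH exchange between tag and reader keys yields a session key, and the tag's stored record travels under authenticated AES-GCM encryption. The curve, key length, tag identifier and sample record are assumptions made for illustration only.

```python
# pip install cryptography -- standard PyCA library, not the authors' implementation
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Long-term EC key pairs for the RFID tag and the reader (curve choice is illustrative)
tag_priv = ec.generate_private_key(ec.SECP256R1())
reader_priv = ec.generate_private_key(ec.SECP256R1())

# Each side derives the same shared secret from its private key and the peer's public key
shared = tag_priv.exchange(ec.ECDH(), reader_priv.public_key())
session_key = HKDF(algorithm=hashes.SHA256(), length=16, salt=None,
                   info=b"rfid-session").derive(shared)

# The tag sends its stored record encrypted (and authenticated) under the session key
nonce = os.urandom(12)
record = b"patient-id:1234|allergy:penicillin"      # hypothetical tag contents
ciphertext = AESGCM(session_key).encrypt(nonce, record, b"tag-id-42")

# The reader recovers it; tampering with the ciphertext or the tag ID raises an error
assert AESGCM(session_key).decrypt(nonce, ciphertext, b"tag-id-42") == record
```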

  2. A Key Generation Model for Improving the Security of Cryptographic ...

    African Journals Online (AJOL)

    Cryptography is a mathematical technique that plays an important role in information security techniques for addressing authentication, interactive proofs, data origination, sender/receiver identity, non-repudiation, secure computation, data integrity and confidentiality, message integrity checking and digital signatures.

  3. A threshold-voltage model for small-scaled GaAs nMOSFET with stacked high-k gate dielectric

    International Nuclear Information System (INIS)

    Liu Chaowen; Xu Jingping; Liu Lu; Lu Hanhan; Huang Yuan

    2016-01-01

    A threshold-voltage model for a stacked high-k gate dielectric GaAs MOSFET is established by solving a two-dimensional Poisson's equation in channel and considering the short-channel, DIBL and quantum effects. The simulated results are in good agreement with the Silvaco TCAD data, confirming the correctness and validity of the model. Using the model, impacts of structural and physical parameters of the stack high-k gate dielectric on the threshold-voltage shift and the temperature characteristics of the threshold voltage are investigated. The results show that the stacked gate dielectric structure can effectively suppress the fringing-field and DIBL effects and improve the threshold and temperature characteristics, and on the other hand, the influence of temperature on the threshold voltage is overestimated if the quantum effect is ignored. (paper)

  4. Parton distributions with threshold resummation

    CERN Document Server

    Bonvini, Marco; Rojo, Juan; Rottoli, Luca; Ubiali, Maria; Ball, Richard D.; Bertone, Valerio; Carrazza, Stefano; Hartland, Nathan P.

    2015-01-01

    We construct a set of parton distribution functions (PDFs) in which fixed-order NLO and NNLO calculations are supplemented with soft-gluon (threshold) resummation up to NLL and NNLL accuracy respectively, suitable for use in conjunction with any QCD calculation in which threshold resummation is included at the level of partonic cross sections. These resummed PDF sets, based on the NNPDF3.0 analysis, are extracted from deep-inelastic scattering, Drell-Yan, and top quark pair production data, for which resummed calculations can be consistently used. We find that, close to threshold, the inclusion of resummed PDFs can partially compensate the enhancement in resummed matrix elements, leading to resummed hadronic cross-sections closer to the fixed-order calculation. On the other hand, far from threshold, resummed PDFs reduce to their fixed-order counterparts. Our results demonstrate the need for a consistent use of resummed PDFs in resummed calculations.

  5. Effect of threshold quantization in opportunistic splitting algorithm

    KAUST Repository

    Nam, Haewoon

    2011-12-01

    This paper discusses algorithms to find the optimal threshold and also investigates the impact of threshold quantization on the scheduling outage performance of the opportunistic splitting scheduling algorithm. Since this algorithm aims at finding the user with the highest channel quality within the minimal number of mini-slots by adjusting the threshold every mini-slot, optimizing the threshold is of paramount importance. Hence, in this paper we first discuss how to compute the optimal threshold along with two tight approximations for the optimal threshold. Closed-form expressions are provided for those approximations for simple calculations. Then, we consider linear quantization of the threshold to take the limited number of bits for signaling messages in practical systems into consideration. Due to the limited granularity for the quantized threshold value, an irreducible scheduling outage floor is observed. The numerical results show that the two approximations offer lower scheduling outage probability floors compared to the conventional algorithm when the threshold is quantized. © 2006 IEEE.
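
    A simulation sketch of the underlying idea, under illustrative assumptions (uniformly distributed channel-quality metrics, bisection of the threshold, a fixed mini-slot budget): quantizing the broadcast threshold to a few bits leaves the algorithm unable to separate users whose metrics fall inside one quantization step, which produces the scheduling outage floor discussed above.

```python
import numpy as np

def opportunistic_splitting(gains, n_minislots=6, n_bits=None):
    """Find the user with the largest channel metric via threshold splitting.
    gains: per-user channel qualities mapped to (0, 1), e.g. by their CDF.
    n_bits: if given, the broadcast threshold is quantized to 2**n_bits levels."""
    lo, hi = 0.0, 1.0
    for _ in range(n_minislots):
        thr = (lo + hi) / 2.0
        if n_bits is not None:                  # limited-rate signaling of the threshold
            step = 1.0 / (2 ** n_bits)
            thr = np.round(thr / step) * step
        above = np.flatnonzero(gains > thr)
        if len(above) == 1:                     # unique winner: schedule this user
            return above[0]
        if len(above) == 0:                     # nobody above: lower the threshold
            hi = thr
        else:                                   # collision: raise the threshold
            lo = thr
    return None                                 # scheduling outage

rng = np.random.default_rng(0)
outage = np.mean([opportunistic_splitting(rng.random(32), n_bits=4) is None
                  for _ in range(2000)])
print(f"outage probability with a 4-bit threshold: {outage:.3f}")
```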

  6. THRESHOLD DETERMINATION FOR LOCAL INSTANTANEOUS SEA SURFACE HEIGHT DERIVATION WITH ICEBRIDGE DATA IN BEAUFORT SEA

    Directory of Open Access Journals (Sweden)

    C. Zhu

    2018-05-01

    Full Text Available The NASA Operation IceBridge (OIB) mission, initiated in 2009, is currently the largest airborne program in Earth polar remote sensing; it collects airborne remote sensing measurements to bridge the gap between NASA's ICESat and the upcoming ICESat-2 mission. This paper develops an improved method that optimizes the selection of Digital Mapping System (DMS) images and uses the optimal threshold obtained from experiments in the Beaufort Sea to calculate the local instantaneous sea surface height in this area. The optimal threshold is determined by comparing manual selections with the lowest Airborne Topographic Mapper (ATM) L1B elevations at thresholds of 2%, 1%, 0.5%, 0.2%, 0.1% and 0.05% in sections A, B and C; the means of the mean differences are 0.166 m, 0.124 m, 0.083 m, 0.018 m, 0.002 m and −0.034 m, respectively. Our study shows that the lowest 0.1% of the L1B data is the optimal threshold. The optimal threshold and manual selections are also used to calculate the instantaneous sea surface height over images with leads, and we find that the improved method agrees more closely with the L1B manual selections. For images without leads, the local instantaneous sea surface height is estimated using the linear relations between distance and the sea surface heights calculated over images with leads.
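
    The threshold rule itself reduces to a one-liner: sort the ATM L1B elevations over a lead and average the lowest fraction (0.1% in the study) to obtain the local instantaneous sea surface height. The function below is a sketch that assumes the elevations have already been screened to a lead region.

```python
import numpy as np

def local_sea_surface_height(atm_elevations, lowest_fraction=0.001):
    """Average the lowest fraction of ATM L1B elevations over a lead; the study
    found 0.1% (lowest_fraction=0.001) to be the optimal threshold."""
    elev = np.sort(np.asarray(atm_elevations, float))
    n = max(1, int(np.ceil(lowest_fraction * elev.size)))
    return elev[:n].mean()
```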

  7. Robust optimization of the laser induced damage threshold of dielectric mirrors for high power lasers.

    Science.gov (United States)

    Chorel, Marine; Lanternier, Thomas; Lavastre, Éric; Bonod, Nicolas; Bousquet, Bruno; Néauport, Jérôme

    2018-04-30

    We report on a numerical optimization of the laser induced damage threshold of multi-dielectric high reflection mirrors in the sub-picosecond regime. We highlight the interplay between the electric field distribution, refractive index and intrinsic laser induced damage threshold of the materials on the overall laser induced damage threshold (LIDT) of the multilayer. We describe an optimization method of the multilayer that minimizes the field enhancement in high refractive index materials while preserving a near perfect reflectivity. This method yields a significant improvement of the damage resistance since a maximum increase of 40% can be achieved on the overall LIDT of the multilayer.

  8. Log canonical thresholds of smooth Fano threefolds

    International Nuclear Information System (INIS)

    Cheltsov, Ivan A; Shramov, Konstantin A

    2008-01-01

    The complex singularity exponent is a local invariant of a holomorphic function determined by the integrability of fractional powers of the function. The log canonical thresholds of effective Q-divisors on normal algebraic varieties are algebraic counterparts of complex singularity exponents. For a Fano variety, these invariants have global analogues. In the former case, it is the so-called α-invariant of Tian; in the latter case, it is the global log canonical threshold of the Fano variety, which is the infimum of log canonical thresholds of all effective Q-divisors numerically equivalent to the anticanonical divisor. An appendix to this paper contains a proof that the global log canonical threshold of a smooth Fano variety coincides with its α-invariant of Tian. The purpose of the paper is to compute the global log canonical thresholds of smooth Fano threefolds (altogether, there are 105 deformation families of such threefolds). The global log canonical thresholds are computed for every smooth threefold in 64 deformation families, and the global log canonical thresholds are computed for a general threefold in 20 deformation families. Some bounds for the global log canonical thresholds are computed for 14 deformation families. Appendix A is due to J.-P. Demailly.

  9. Memory reactivation improves visual perception.

    Science.gov (United States)

    Amar-Halpert, Rotem; Laor-Maayany, Rony; Nemni, Shlomi; Rosenblatt, Jonathan D; Censor, Nitzan

    2017-10-01

    Human perception thresholds can improve through learning. Here we report findings challenging the fundamental 'practice makes perfect' basis of procedural learning theory, showing that brief reactivations of encoded visual memories are sufficient to improve perceptual discrimination thresholds. Learning was comparable to standard practice-induced learning and was not due to short training per se, nor to an epiphenomenon of primed retrieval enhancement. The results demonstrate that basic perceptual functions can be substantially improved by memory reactivation, supporting a new account of perceptual learning dynamics.

  10. Damage threshold of lithium niobate crystal under single and multiple femtosecond laser pulses: theoretical and experimental study

    International Nuclear Information System (INIS)

    Meng, Qinglong; Zhang, Bin; Zhong, Sencheng; Zhu, Liguo

    2016-01-01

    The damage threshold of lithium niobate crystal under single and multiple femtosecond laser pulses has been studied theoretically and experimentally. First, a model for predicting the damage threshold of crystal materials based on an improved rate equation is proposed. Then, the experimental method for measuring the damage threshold of crystal materials is described in detail. On this basis, the variation of the damage threshold of lithium niobate crystal with pulse duration is analyzed quantitatively. Finally, the damage threshold of lithium niobate crystal under multiple laser pulses is measured and compared with the theoretical results. The results show that the transmittance of lithium niobate crystal is almost constant when the laser pulse fluence is relatively low, whereas it decreases linearly with increasing laser pulse fluence below the damage threshold. The damage threshold of lithium niobate crystal increases with the duration of the femtosecond laser pulse, and the damage threshold under multiple laser pulses is clearly lower than that under a single laser pulse. The theoretical data are in good agreement with the experimental results. (orig.)

  11. Noninteractive Verifiable Outsourcing Algorithm for Bilinear Pairing with Improved Checkability

    Directory of Open Access Journals (Sweden)

    Yanli Ren

    2017-01-01

    Full Text Available It is well known that the computation of a bilinear pairing is the most expensive operation in pairing-based cryptography. In this paper, we propose a noninteractive verifiable outsourcing algorithm for bilinear pairing based on two servers in the one-malicious model. The outsourcer need not execute any expensive operation, such as scalar multiplication or modular exponentiation. Moreover, the outsourcer can detect any failure with a probability close to 1 if one of the servers misbehaves. Therefore, the proposed algorithm improves checkability and decreases communication cost compared with previous ones. Finally, we utilize the proposed algorithm as a subroutine to achieve an anonymous identity-based encryption (AIBE) scheme with outsourced decryption and an identity-based signature (IBS) scheme with outsourced verification.

  12. Provable Secure and Efficient Digital Rights Management Authentication Scheme Using Smart Card Based on Elliptic Curve Cryptography

    Directory of Open Access Journals (Sweden)

    Yuanyuan Zhang

    2015-01-01

    Full Text Available Since the concept of ubiquitous computing was first proposed by Mark Weiser, its connotation has been extended and expanded by many scholars. In pervasive computing application environments, many kinds of small devices containing smart cards are used to communicate with others. In 2013, Yang et al. proposed an enhanced authentication scheme using smart cards for digital rights management and demonstrated that their scheme is secure enough. However, Mishra et al. pointed out that Yang et al.'s scheme suffers from the password guessing attack and the denial-of-service attack. Moreover, they also demonstrated that Yang et al.'s scheme is not efficient enough when the user inputs an incorrect password. In this paper, we analyze Yang et al.'s scheme again and find that their scheme is vulnerable to the session key attack, and that there are some mistakes in their scheme. To surmount the weaknesses of Yang et al.'s scheme, we propose a more efficient and provably secure digital rights management authentication scheme using smart cards based on elliptic curve cryptography.

  13. Threshold Concepts and Information Literacy

    Science.gov (United States)

    Townsend, Lori; Brunetti, Korey; Hofer, Amy R.

    2011-01-01

    What do we teach when we teach information literacy in higher education? This paper describes a pedagogical approach to information literacy that helps instructors focus content around transformative learning thresholds. The threshold concept framework holds promise for librarians because it grounds the instructor in the big ideas and underlying…

  14. A Threshold Continuum for Aeolian Sand Transport

    Science.gov (United States)

    Swann, C.; Ewing, R. C.; Sherman, D. J.

    2015-12-01

    The threshold of motion for aeolian sand transport marks the initial entrainment of sand particles by the force of the wind. This is typically defined and modeled as a single wind speed for a given grain size and is based on field and laboratory experimental data. However, the definition of threshold varies significantly between these empirical models, largely because the definition is based on visual observations of initial grain movement. For example, in his seminal experiments, Bagnold defined the threshold of motion as the point when he observed that 100% of the bed was in motion. Others have used 50% and lesser values. Differences in threshold models, in turn, result in large errors in predicting the fluxes associated with sand and dust transport. Here we use a wind tunnel and a novel sediment trap to capture the fractions of sand in creep, reptation and saltation at Earth and Mars pressures and show that the threshold of motion for aeolian sand transport is best defined as a continuum in which grains progress through stages defined by the proportion of grains in creep and saltation. We propose the use of scale-dependent thresholds modeled by distinct probability distribution functions that differentiate the threshold based on micro- to macro-scale applications. For example, a geologic-timescale application corresponds to a threshold when 100% of the bed is in motion, whereas a sub-second application corresponds to a threshold when a single particle is set in motion. We provide quantitative measurements (number and mode of particle movement) corresponding to visual observations, percent of bed in motion and degrees of transport intermittency for Earth and Mars. Understanding transport as a continuum provides a basis for re-evaluating sand transport thresholds on Earth, Mars and Titan.

  15. A threshold-voltage model for small-scaled GaAs nMOSFET with stacked high-k gate dielectric

    Science.gov (United States)

    Chaowen, Liu; Jingping, Xu; Lu, Liu; Hanhan, Lu; Yuan, Huang

    2016-02-01

    A threshold-voltage model for a stacked high-k gate dielectric GaAs MOSFET is established by solving a two-dimensional Poisson's equation in channel and considering the short-channel, DIBL and quantum effects. The simulated results are in good agreement with the Silvaco TCAD data, confirming the correctness and validity of the model. Using the model, impacts of structural and physical parameters of the stack high-k gate dielectric on the threshold-voltage shift and the temperature characteristics of the threshold voltage are investigated. The results show that the stacked gate dielectric structure can effectively suppress the fringing-field and DIBL effects and improve the threshold and temperature characteristics, and on the other hand, the influence of temperature on the threshold voltage is overestimated if the quantum effect is ignored. Project supported by the National Natural Science Foundation of China (No. 61176100).

  16. The influence of thresholds on the risk assessment of carcinogens in food.

    Science.gov (United States)

    Pratt, Iona; Barlow, Susan; Kleiner, Juliane; Larsen, John Christian

    2009-08-01

    The risks from exposure to chemical contaminants in food must be scientifically assessed in order to safeguard the health of consumers. Risk assessment of chemical contaminants that are both genotoxic and carcinogenic presents particular difficulties, since the effects of such substances are normally regarded as being without a threshold. No safe level can therefore be defined, and this has implications for both risk management and risk communication. Risk management of these substances in food has traditionally involved application of the ALARA (As Low As Reasonably Achievable) principle; however, ALARA does not enable risk managers to assess the urgency and extent of the risk reduction measures needed. A more refined approach is needed, and several such approaches have been developed. Low-dose linear extrapolation from animal carcinogenicity studies or epidemiological studies to estimate risks for humans at low exposure levels has been applied by a number of regulatory bodies, while more recently the Margin of Exposure (MOE) approach has been applied by both the European Food Safety Authority and the Joint FAO/WHO Expert Committee on Food Additives. A further approach is the Threshold of Toxicological Concern (TTC), which establishes exposure thresholds for chemicals present in food, dependent on structure. Recent experimental evidence that genotoxic responses may be thresholded has significant implications for the risk assessment of chemicals that are both genotoxic and carcinogenic. In relation to existing approaches such as linear extrapolation, MOE and TTC, the existence of a threshold reduces the uncertainties inherent in such methodology and improves confidence in the risk assessment. However, for the foreseeable future, regulatory decisions based on the concept of thresholds for genotoxic carcinogens are likely to be taken case-by-case, based on convincing data on the Mode of Action indicating that the rate-limiting variable for the development of cancer
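
    For orientation, the Margin of Exposure mentioned above is simply the ratio of a reference point from the dose-response data (typically a BMDL10) to the estimated human exposure; the numbers in the sketch are purely illustrative.

```python
def margin_of_exposure(bmdl10_mg_per_kg_day, exposure_mg_per_kg_day):
    """MOE = reference point from the dose-response data / estimated human exposure."""
    return bmdl10_mg_per_kg_day / exposure_mg_per_kg_day

# Illustrative values only: BMDL10 of 0.3 mg/kg bw/day, dietary exposure of 1 microgram/kg bw/day
print(margin_of_exposure(0.3, 0.001))   # 300 -- an MOE of 10,000 or more is usually read as low concern
```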

  17. Human impacts on morphodynamic thresholds in estuarine systems

    Science.gov (United States)

    Wang, Z. B.; Van Maren, D. S.; Ding, P. X.; Yang, S. L.; Van Prooijen, B. C.; De Vet, P. L. M.; Winterwerp, J. C.; De Vriend, H. J.; Stive, M. J. F.; He, Q.

    2015-12-01

    Many estuaries worldwide are modified, primarily driven by economic gain or safety. These works, combined with global climate changes, heavily influence the morphologic development of estuaries. In this paper, we analyze the impact of human activities on the morphodynamic developments of the Scheldt Estuary and the Wadden Sea basins in the Netherlands and the Yangtze Estuary in China at various spatial scales, and identify mechanisms responsible for their change. Human activities in these systems include engineering works and dredging activities for improving and maintaining the navigation channels, engineering works for flood protection, and shoreline management activities such as land reclamations. The Yangtze Estuary is influenced by human activities in the upstream river basin as well, especially through the construction of many dams. The tidal basins in the Netherlands are also influenced by human activities along the adjacent coasts. Furthermore, all these systems are influenced by global changes through (accelerated) sea-level rise and changing weather patterns. We show that the cumulative impacts of these human activities and global changes may lead to exceeding thresholds beyond which the morphology of the tidal basins significantly changes and loses its natural characteristics. A threshold is called a tipping point when the changes are irreversible. Knowledge of such thresholds or tipping points is important for the sustainable management of these systems. We have identified and quantified various examples of such thresholds and/or tipping points for the morphodynamic developments at various spatial and temporal scales. At the largest scale (mega-scale) we consider the sediment budget of a tidal basin as a whole. A smaller scale (macro-scale) is the development of channel structures in an estuary, especially the development of two competing channels. At the smallest scale (meso-scale) we analyze the developments of tidal flats and the connecting

  18. Iran: the next nuclear threshold state?

    OpenAIRE

    Maurer, Christopher L.

    2014-01-01

    Approved for public release; distribution is unlimited. A nuclear threshold state is one that could quickly operationalize its peaceful nuclear program into one capable of producing a nuclear weapon. This thesis compares two known threshold states, Japan and Brazil, with Iran to determine whether the Islamic Republic could also be labeled a threshold state. Furthermore, it highlights the implications such a status could have for U.S. nonproliferation policy. Although Iran's nuclear program is mir...

  19. Hydrometeorological threshold conditions for debris flow initiation in Norway

    Directory of Open Access Journals (Sweden)

    N. K. Meyer

    2012-10-01

    Full Text Available Debris flows, triggered by extreme precipitation events and rapid snow melt, cause considerable damage to the Norwegian infrastructure every year. To define intensity-duration (ID) thresholds for debris flow initiation, critical water supply conditions arising from intensive rainfall or snow melt were assessed on the basis of daily hydro-meteorological information for 502 documented debris flow events. Two threshold types were computed: one based on absolute ID relationships and one using ID relationships normalized by the local precipitation day normal (PDN). For each threshold type, minimum, medium and maximum threshold values were defined by fitting power law curves along the 10th, 50th and 90th percentiles of the data population. Depending on the duration of the event, the absolute threshold intensities needed for debris flow initiation vary between 15 and 107 mm day^-1. Since the PDN changes locally, the normalized thresholds show spatial variations. Depending on location, duration and threshold level, the normalized threshold intensities vary between 6 and 250 mm day^-1. The thresholds obtained were used for a frequency analysis of over-threshold events, giving an estimate of the exceedance probability and thus the potential for debris flow events in different parts of Norway. The absolute thresholds are most often exceeded along the west coast, while the normalized thresholds are most frequently exceeded on the west-facing slopes of the Norwegian mountain ranges. The minimum thresholds derived in this study are in the range of other thresholds obtained for regions with a climate comparable to Norway. Statistics reveal that the normalized threshold is more reliable than the absolute threshold, as the former shows no spatial clustering of debris flows related to water supply events captured by the threshold.
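
    A sketch of how such percentile-based ID thresholds can be fitted, assuming a hypothetical list of triggering events with durations and water-supply intensities: events are binned by duration, the chosen percentile of intensity is taken per bin, and a power law I = a*D^b is fitted in log-log space. The binning choices are assumptions, not the paper's exact procedure.

```python
import numpy as np

def id_threshold(durations, intensities, percentile=10, n_bins=8):
    """Fit I = a * D**b along a chosen percentile of triggering events:
    bin the events by duration (log-spaced bins), take the percentile of the
    water-supply intensity in each bin, then fit a line in log-log space."""
    d = np.asarray(durations, float)
    i = np.asarray(intensities, float)
    edges = np.logspace(np.log10(d.min()), np.log10(d.max()), n_bins + 1)
    centres, q = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (d >= lo) & (d <= hi)
        if m.sum() >= 3:                        # need a few events per bin
            centres.append(np.sqrt(lo * hi))
            q.append(np.percentile(i[m], percentile))
    b, log_a = np.polyfit(np.log(centres), np.log(q), 1)
    return np.exp(log_a), b                     # threshold curve: I = a * D**b

# Minimum, medium and maximum thresholds would use percentile=10, 50 and 90, e.g.
# a_min, b_min = id_threshold(event_durations, event_intensities, percentile=10)
```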

  20. 11 CFR 9036.1 - Threshold submission.

    Science.gov (United States)

    2010-01-01

    ... credit or debit card, including one made over the Internet, the candidate shall provide sufficient... section shall not count toward the threshold amount. (c) Threshold certification by Commission. (1) After...

  1. High-frequency (8 to 16 kHz) reference thresholds and intrasubject threshold variability relative to ototoxicity criteria using a Sennheiser HDA 200 earphone.

    Science.gov (United States)

    Frank, T

    2001-04-01

    The first purpose of this study was to determine high-frequency (8 to 16 kHz) thresholds for standardizing reference equivalent threshold sound pressure levels (RETSPLs) for a Sennheiser HDA 200 earphone. The second and perhaps more important purpose of this study was to determine whether repeated high-frequency thresholds using a Sennheiser HDA 200 earphone had a lower intrasubject threshold variability than the ASHA 1994 significant threshold shift criteria for ototoxicity. High-frequency thresholds (8 to 16 kHz) were obtained for 100 (50 male, 50 female) normally hearing (0.25 to 8 kHz) young adults (mean age of 21.2 yr) in four separate test sessions using a Sennheiser HDA 200 earphone. The mean and median high-frequency thresholds were similar for each test session and increased as frequency increased. At each frequency, the high-frequency thresholds were not significantly (p > 0.05) different for gender, test ear, or test session. The median thresholds at each frequency were similar to the 1998 interim ISO RETSPLs; however, large standard deviations and wide threshold distributions indicated very high intersubject threshold variability, especially at 14 and 16 kHz. Threshold repeatability was determined by finding the threshold differences between each possible test session comparison (N = 6). About 98% of all of the threshold differences were within a clinically acceptable range of +/-10 dB from 8 to 14 kHz. The threshold differences between each subject's second, third, and fourth minus their first test session were also found to determine whether intrasubject threshold variability was less than the ASHA 1994 criteria for determining a significant threshold shift due to ototoxicity. The results indicated a false-positive rate of 0% for a threshold shift > or = 20 dB at any frequency and a false-positive rate of 2% for a threshold shift >10 dB at two consecutive frequencies. This study verified that the output of high-frequency audiometers at 0 dB HL using

  2. Otsu Based Optimal Multilevel Image Thresholding Using Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    N. Sri Madhava Raja

    2014-01-01

    Full Text Available A histogram-based multilevel thresholding approach is proposed using a Brownian distribution (BD) guided firefly algorithm (FA). A bounded search technique is also presented to improve the optimization accuracy with fewer search iterations. Otsu's between-class variance function is maximized to obtain optimal threshold levels for gray-scale images. The performance of the proposed algorithm is demonstrated on twelve benchmark images and compared with existing FA variants such as the Lévy flight (LF) guided FA and the random operator guided FA. The performance assessment between the proposed and existing firefly algorithms is carried out using prevailing parameters such as the objective function, standard deviation, peak signal-to-noise ratio (PSNR), structural similarity (SSIM) index, and CPU search time. The results show that the BD guided FA provides a better objective function, PSNR, and SSIM, whereas the LF based FA provides faster convergence with relatively lower CPU time.
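
    A short sketch of the objective such a metaheuristic maximizes: Otsu's between-class variance for a given set of gray-level thresholds. A brute-force single-threshold search on a synthetic bimodal histogram stands in for the firefly/Brownian search, which is not reproduced here.

        # Otsu's between-class variance for a set of gray-level thresholds.
        import numpy as np

        def between_class_variance(hist, thresholds):
            """hist: counts for gray levels 0..255; thresholds: sorted cut points."""
            p = hist / hist.sum()
            levels = np.arange(len(hist))
            mu_total = np.sum(p * levels)
            edges = [0] + list(thresholds) + [len(hist)]
            sigma_b = 0.0
            for lo, hi in zip(edges[:-1], edges[1:]):
                w = p[lo:hi].sum()
                if w > 0:
                    mu = np.sum(p[lo:hi] * levels[lo:hi]) / w
                    sigma_b += w * (mu - mu_total) ** 2
            return sigma_b

        rng = np.random.default_rng(1)
        pixels = np.concatenate([rng.normal(70, 12, 5000), rng.normal(180, 15, 5000)])
        hist, _ = np.histogram(np.clip(pixels, 0, 255), bins=256, range=(0, 256))

        best_t = max(range(1, 256), key=lambda t: between_class_variance(hist, [t]))
        print("Otsu threshold:", best_t)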

  3. Exogenous and endogenous attention during perceptual learning differentially affect post-training target thresholds

    Science.gov (United States)

    Mukai, Ikuko; Bahadur, Kandy; Kesavabhotla, Kartik; Ungerleider, Leslie G.

    2012-01-01

    There is conflicting evidence in the literature regarding the role played by attention in perceptual learning. To further examine this issue, we independently manipulated exogenous and endogenous attention and measured the rate of perceptual learning of oriented Gabor patches presented in different quadrants of the visual field. In this way, we could track learning at attended, divided-attended, and unattended locations. We also measured contrast thresholds of the Gabor patches before and after training. Our results showed that, for both exogenous and endogenous attention, accuracy in performing the orientation discrimination improved to a greater extent at attended than at unattended locations. Importantly, however, only exogenous attention resulted in improved contrast thresholds. These findings suggest that both exogenous and endogenous attention facilitate perceptual learning, but that these two types of attention may be mediated by different neural mechanisms. PMID:21282340

  4. Optimal Investment Under Transaction Costs: A Threshold Rebalanced Portfolio Approach

    Science.gov (United States)

    Tunc, Sait; Donmez, Mehmet Ali; Kozat, Suleyman Serdar

    2013-06-01

    We study optimal investment in a financial market having a finite number of assets from a signal processing perspective. We investigate how an investor should distribute capital over these assets and when he should reallocate the distribution of the funds over these assets to maximize the cumulative wealth over any investment period. In particular, we introduce a portfolio selection algorithm that maximizes the expected cumulative wealth in i.i.d. two-asset discrete-time markets where the market levies proportional transaction costs in buying and selling stocks. We achieve this using "threshold rebalanced portfolios", where trading occurs only if the portfolio breaches certain thresholds. Under the assumption that the relative price sequences have log-normal distribution from the Black-Scholes model, we evaluate the expected wealth under proportional transaction costs and find the threshold rebalanced portfolio that achieves the maximal expected cumulative wealth over any investment period. Our derivations can be readily extended to markets having more than two stocks, where these extensions are pointed out in the paper. As predicted from our derivations, we significantly improve the achieved wealth over portfolio selection algorithms from the literature on historical data sets.
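
    A hedged sketch of the trading rule described: hold a target stock fraction b and rebalance back to it, paying a proportional cost c, only when the fraction drifts outside the band [b − d, b + d]. The log-normal price path, cost rate and band width are illustrative placeholders rather than the paper's calibration.

        # Two-asset threshold rebalanced portfolio with proportional transaction costs.
        import numpy as np

        def simulate(rel_prices, b=0.5, d=0.1, c=0.01, wealth0=1.0):
            cash = (1 - b) * wealth0
            stock = b * wealth0
            for x in rel_prices:                  # x = relative price change of the stock
                stock *= x
                wealth = cash + stock
                frac = stock / wealth
                if frac < b - d or frac > b + d:  # threshold breached -> rebalance to b
                    trade = b * wealth - stock    # buy (>0) or sell (<0) this much stock
                    cost = c * abs(trade)
                    stock += trade
                    cash = wealth - stock - cost
            return cash + stock

        rng = np.random.default_rng(2)
        rel_prices = np.exp(rng.normal(0.0005, 0.02, 2500))   # log-normal relative prices
        print("final wealth:", round(simulate(rel_prices), 4))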

  5. Adjusting patients streaming initiated by a wait time threshold in emergency department for minimizing opportunity cost.

    Science.gov (United States)

    Kim, Byungjoon B J; Delbridge, Theodore R; Kendrick, Dawn B

    2017-07-10

    Purpose Two different systems for streaming patients were considered to improve efficiency measures such as waiting times (WTs) and length of stay (LOS) for a current emergency department (ED). A typical fast track area (FTA) and a fast track with a wait time threshold (FTW) were designed, and their effectiveness was compared from the perspective of the total opportunity cost of all patients' WTs in the ED. The paper aims to discuss these issues. Design/methodology/approach This retrospective case study used computerized ED patient arrival-to-discharge time logs (between July 1, 2009 and June 30, 2010) to build computer simulation models for the FTA and FTW systems. Various wait time thresholds were applied to stream patients of different acuity levels. The national average wait time for each acuity level was considered as a threshold to stream patients. Findings The fast track with a wait time threshold (FTW) showed a statistically significant shorter total wait time than the current system or a typical FTA system. Patient streaming management would improve the service quality of the ED as well as patients' opportunity costs by reducing the total LOS in the ED. Research limitations/implications The results of this study were based on computer simulation models with some assumptions, such as no transfer times between processes, an assumed arrival distribution of patients, and no deviation of flow pattern. Practical implications When the streaming of patient flow can be managed based on the wait time before being seen by a physician, it is possible for patients to see a physician within a tolerable wait time, which would result in less crowding in the ED. Originality/value A new scheme for streaming patient flow may improve the performance of a fast track system.
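
    A minimal sketch of the streaming decision described: a lower-acuity patient is sent to the fast track when the predicted wait in the main ED exceeds a per-acuity wait-time threshold (e.g., a national-average wait). The eligibility set and all numbers are hypothetical placeholders, not values from the study.

        # Route a patient to the fast track when the predicted main-ED wait exceeds
        # that acuity level's wait-time threshold (hypothetical values).
        FAST_TRACK_ELIGIBLE = {3, 4, 5}               # lower-acuity patients only
        WAIT_THRESHOLD_MIN = {3: 30, 4: 60, 5: 120}   # per-acuity thresholds, minutes

        def stream(acuity_level, predicted_main_ed_wait_min):
            if (acuity_level in FAST_TRACK_ELIGIBLE
                    and predicted_main_ed_wait_min > WAIT_THRESHOLD_MIN[acuity_level]):
                return "fast track"
            return "main ED"

        print(stream(4, 75))   # fast track
        print(stream(4, 40))   # main ED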

  6. Thermotactile perception thresholds measurement conditions.

    Science.gov (United States)

    Maeda, Setsuo; Sakakibara, Hisataka

    2002-10-01

    The purpose of this paper is to investigate the effects of posture, push force and rate of temperature change on thermotactile thresholds and to clarify suitable measuring conditions for Japanese people. Thermotactile (warm and cold) thresholds on the right middle finger were measured with an HVLab thermal aesthesiometer. Subjects were eight healthy male Japanese students. The effects of posture were examined with the straight hand and forearm placed on a support, the same posture without a support, and the fingers and hand flexed at the wrist with the elbow placed on a desk. The finger push force applied to the applicator of the thermal aesthesiometer was controlled at 0.5, 1.0, 2.0 and 3.0 N. The applicator temperature was changed at rates of 0.5, 1.0, 1.5, 2.0 and 2.5 degrees C/s. After each measurement, subjects were asked about comfort under the measuring conditions. Three series of experiments were conducted on different days to evaluate repeatability. Repeated-measures ANOVA showed that warm thresholds were affected by the push force and the rate of temperature change and that cold thresholds were influenced by posture and push force. The comfort assessment indicated that the measurement posture of a straight hand and forearm laid on a support was the most comfortable for the subjects. Relatively high repeatability was obtained under measurement conditions of a 1 degree C/s temperature change rate and a 0.5 N push force. Measurement posture, push force and rate of temperature change can affect the thermal threshold. Judging from the repeatability, a push force of 0.5 N and a temperature change rate of 1.0 degree C/s, with the straight hand and forearm laid on a support, are recommended for warm and cold threshold measurements.

  7. DOE approach to threshold quantities

    International Nuclear Information System (INIS)

    Wickham, L.E.; Kluk, A.F.; Department of Energy, Washington, DC)

    1985-01-01

    The Department of Energy (DOE) is developing the concept of threshold quantities for use in determining which waste materials must be handled as radioactive waste and which may be disposed of as nonradioactive waste at its sites. Waste above this concentration level would be managed as radioactive or mixed waste (if hazardous chemicals are present); waste below this level would be handled as sanitary waste. Ideally, the threshold must be set high enough to significantly reduce the amount of waste requiring special handling. It must also be low enough so that waste at the threshold quantity poses a very small health risk and multiple exposures to such waste would still constitute a small health risk. It should also be practical to segregate waste above or below the threshold quantity using available instrumentation. Guidance is being prepared to aid DOE sites in establishing threshold quantity values based on pathways analysis using site-specific parameters (waste stream characteristics, maximum exposed individual, population considerations, and site specific parameters such as rainfall, etc.). A guidance dose of between 0.001 to 1.0 mSv/y (0.1 to 100 mrem/y) was recommended with 0.3 mSv/y (30 mrem/y) selected as the guidance dose upon which to base calculations. Several tasks were identified, beginning with the selection of a suitable pathway model for relating dose to the concentration of radioactivity in the waste. Threshold concentrations corresponding to the guidance dose were determined for waste disposal sites at a selected humid and arid site. Finally, cost-benefit considerations at the example sites were addressed. The results of the various tasks are summarized and the relationship of this effort with related developments at other agencies discussed

  8. Threshold concepts as barriers to understanding climate science

    Science.gov (United States)

    Walton, P.

    2013-12-01

    scientific engagement with the public to develop climate literacy. The analysis of journals from 3 successive cohorts of students who followed the same degree module identified that threshold concepts do exist within the field, such as those related to the role of ocean circulation, the use of proxy indicators, forcing factors and feedback mechanisms. Once identified, the study looked at possible strategies to overcome these barriers to support student climate literacy. It concluded that the use of threshold concepts could be problematic when trying to improve climate literacy, as each individual has their own concepts they find 'troublesome' that do not necessarily relate to others. For scientists this presents the difficulty of developing a strategy that supports the individual while remaining cost- and time-effective. However, the study identifies that eLearning can be used effectively to help people understand troublesome knowledge.

  9. Doubler system quench detection threshold

    International Nuclear Information System (INIS)

    Kuepke, K.; Kuchnir, M.; Martin, P.

    1983-01-01

    The experimental study leading to the determination of the sensitivity needed for protecting the Fermilab Doubler from damage during quenches is presented. The quench voltage thresholds involved were obtained from measurements made on Doubler cable of resistance versus temperature and voltage versus time during quenches under several currents, and from data collected during operation of the Doubler Quench Protection System as implemented in the B-12 string of 20 magnets. At 4 kA, a quench voltage threshold in excess of 5.0 V will limit the peak Doubler cable temperature to 452 K for quenches originating in the magnet coils, whereas a threshold of 0.5 V is required for quenches originating outside of coils.

  10. Reaction thresholds in doubly special relativity

    International Nuclear Information System (INIS)

    Heyman, Daniel; Major, Seth; Hinteleitner, Franz

    2004-01-01

    Two theories of special relativity with an additional invariant scale, 'doubly special relativity', are tested with calculations of particle process kinematics. Using the Judes-Visser modified conservation laws, thresholds are studied in both theories. In contrast with some linear approximations, which allow for particle processes forbidden in special relativity, both the Amelino-Camelia and Magueijo-Smolin frameworks allow no additional processes. To first order, the Amelino-Camelia framework thresholds are lowered and the Magueijo-Smolin framework thresholds may be raised or lowered

  11. An ultra-low-energy/frame multi-standard JPEG co-processor in 65 nm CMOS with sub/near-threshold power supply.

    NARCIS (Netherlands)

    Pu, Yu; Pineda de Gyvez, J.; Corporaal, H.; Ha, Y.

    2009-01-01

    Many digital ICs can benefit from sub/near threshold operations that provide ultra-low-energy/operation for long battery lifetime. In addition, sub/near threshold operation largely mitigates the transient current hence lowering the ground bounce noise. This also helps to improve the performance of

  12. Thresholds in chemical respiratory sensitisation.

    Science.gov (United States)

    Cochrane, Stella A; Arts, Josje H E; Ehnes, Colin; Hindle, Stuart; Hollnagel, Heli M; Poole, Alan; Suto, Hidenori; Kimber, Ian

    2015-07-03

    There is a continuing interest in determining whether it is possible to identify thresholds for chemical allergy. Here allergic sensitisation of the respiratory tract by chemicals is considered in this context. This is an important occupational health problem, being associated with rhinitis and asthma, and in addition provides toxicologists and risk assessors with a number of challenges. In common with all forms of allergic disease chemical respiratory allergy develops in two phases. In the first (induction) phase exposure to a chemical allergen (by an appropriate route of exposure) causes immunological priming and sensitisation of the respiratory tract. The second (elicitation) phase is triggered if a sensitised subject is exposed subsequently to the same chemical allergen via inhalation. A secondary immune response will be provoked in the respiratory tract resulting in inflammation and the signs and symptoms of a respiratory hypersensitivity reaction. In this article attention has focused on the identification of threshold values during the acquisition of sensitisation. Current mechanistic understanding of allergy is such that it can be assumed that the development of sensitisation (and also the elicitation of an allergic reaction) is a threshold phenomenon; there will be levels of exposure below which sensitisation will not be acquired. That is, all immune responses, including allergic sensitisation, have threshold requirement for the availability of antigen/allergen, below which a response will fail to develop. The issue addressed here is whether there are methods available or clinical/epidemiological data that permit the identification of such thresholds. This document reviews briefly relevant human studies of occupational asthma, and experimental models that have been developed (or are being developed) for the identification and characterisation of chemical respiratory allergens. The main conclusion drawn is that although there is evidence that the

  13. Low-threshold support for families with dementia in Germany

    Directory of Open Access Journals (Sweden)

    Hochgraeber Iris

    2012-06-01

    Full Text Available Abstract Background Low-threshold support services are a part of the German health care system and help to relieve family caregivers. There is limited information available on how to construct and implement low-threshold support services for people with dementia and their families in Germany. Some studies separately describe different perspectives of experiences and expectations, but there is no study combining all the different perspectives of those involved and taking the arrangements and organisation, as well as opinions on supporting and inhibiting factors, into consideration. Findings This protocol describes the design of a study on low-threshold support services for families with a person with dementia in two German regions. The aim is to develop recommendations on how to build up these services and how to implement them in a region. A quantitative as well as a qualitative approach will be used. The quantitative part will be a survey on characteristics of service users and providers, as well as the health care structures of the two project regions, and an evaluation of important aspects derived from a literature search. Group discussions and semi-structured interviews will be carried out to get a deeper insight into the facilitators and barriers for both using and providing these services. All people involved will be included, such as the people with dementia, their relatives, volunteers, coordinators and institution representatives. Discussion Results of this study will provide important aspects for policymakers who are interested in effective and low-threshold support for people with dementia. Furthermore, the emerging recommendations can help staff and institutions to improve quality of care and can contribute to developing health and social care structures in Germany.

  14. Non-periodic preventive maintenance with reliability thresholds for complex repairable systems

    International Nuclear Information System (INIS)

    Lin, Zu-Liang; Huang, Yeu-Shiang; Fang, Chih-Chiang

    2015-01-01

    In general, a non-periodic condition-based PM policy with different condition variables is often more effective than a periodic age-based policy for deteriorating complex repairable systems. In this study, system reliability is estimated and used as the condition variable, and three reliability-based PM models are then developed with consideration of different scenarios, which can assist in evaluating the maintenance cost for each scenario. The proposed approach provides the optimal reliability thresholds and PM schedules in advance, by which the system availability and quality can be ensured and the organizational resources can be well prepared and managed. The results of the sensitivity analysis indicate that PM activities performed at a high reliability threshold can not only significantly improve the system availability but also efficiently extend the system lifetime, although such a PM strategy is more costly than that for a low reliability threshold. The optimal reliability threshold increases along with the number of PM activities to prevent future breakdowns caused by severe deterioration, and thus substantially reduces repair costs. - Highlights: • The PM problems for repairable deteriorating systems are formulated. • The structural properties of the proposed PM models are investigated. • The corresponding algorithms to find the optimal PM strategies are provided. • Imperfect PM activities are allowed to reduce the occurrences of breakdowns. • Provide managers with insights about the critical factors in the planning stage
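
    A small sketch of scheduling PM at a reliability threshold: assuming Weibull reliability R(t) = exp(-(t/η)^β) that is fully restored by each (perfect) PM, the PM interval solves R(t) = R* in closed form. The Weibull parameters are illustrative, and the imperfect PM considered in the record is not modelled.

        # PM interval such that reliability since the last perfect PM equals R*.
        import math

        def pm_interval(r_star, beta=2.0, eta=1000.0):
            """Hours until Weibull(beta, eta) reliability drops to r_star."""
            return eta * (-math.log(r_star)) ** (1.0 / beta)

        for r_star in (0.95, 0.90, 0.80):
            print(f"R* = {r_star:.2f} -> PM every {pm_interval(r_star):.0f} h")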

  15. Compositional threshold for Nuclear Waste Glass Durability

    International Nuclear Information System (INIS)

    Kruger, Albert A.; Farooqi, Rahmatullah; Hrma, Pavel R.

    2013-01-01

    Within the composition space of glasses, a distinct threshold appears to exist that separates 'good' glasses, i.e., those which are sufficiently durable, from 'bad' glasses of a low durability. The objective of our research is to clarify the origin of this threshold by exploring the relationship between glass composition, glass structure and chemical durability around the threshold region

  16. Optimizing Systems of Threshold Detection Sensors

    National Research Council Canada - National Science Library

    Banschbach, David C

    2008-01-01

    .... Below the threshold all signals are ignored. We develop a mathematical model for setting individual sensor thresholds to obtain optimal probability of detecting a significant event, given a limit on the total number of false positives allowed...
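
    A hedged sketch of the threshold-setting problem described: assuming Gaussian noise and a shifted Gaussian signal at each sensor, a total expected false-positive budget is split equally across sensors, each threshold is set from the noise tail, and the resulting detection probabilities are reported. The equal split and all parameters are illustrative simplifications of the optimization in the record.

        # Per-sensor thresholds under a total false-positive budget (Gaussian assumptions).
        from scipy.stats import norm

        def set_thresholds(sigmas, signal_means, total_fp_rate):
            fp_each = total_fp_rate / len(sigmas)             # naive equal allocation
            out = []
            for sigma, mu in zip(sigmas, signal_means):
                thr = norm.ppf(1 - fp_each, loc=0, scale=sigma)   # noise-only tail = fp_each
                p_detect = norm.sf(thr, loc=mu, scale=sigma)      # P(signal exceeds threshold)
                out.append((thr, p_detect))
            return out

        sigmas = [1.0, 1.5, 0.8]
        signal_means = [3.0, 3.5, 2.5]
        for i, (thr, pd) in enumerate(set_thresholds(sigmas, signal_means, total_fp_rate=0.01)):
            print(f"sensor {i}: threshold={thr:.2f}, P(detect)={pd:.3f}")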

  17. Improving occupational injury surveillance by using a severity threshold: development of a new occupational health indicator.

    Science.gov (United States)

    Sears, Jeanne M; Bowman, Stephen M; Rotert, Mary; Blanar, Laura; Hogg-Johnson, Sheilah

    2016-06-01

    Hospital discharge data are used for occupational injury surveillance, but observed hospitalisation trends are affected by trends in healthcare practices and workers' compensation coverage that may increasingly impair ascertainment of minor injuries relative to severe injuries. The objectives of this study were to (1) describe the development of a severe injury definition for surveillance purposes and (2) assess the impact of imposing a severity threshold on estimated occupational and non-occupational injury trends. Three independent methods were used to estimate injury severity for the severe injury definition. 10 population-based hospital discharge databases were used to estimate trends (1998-2009), including the National Hospital Discharge Survey (NHDS) and State Inpatient Databases (SID) from the Healthcare Cost and Utilization Project (HCUP), Agency for Healthcare Research and Quality. Negative binomial regression was used to model injury trends with and without severity restriction and to test trend divergence by severity. Trend estimates for occupational injuries were biased downwards in the absence of severity restriction, more so than for non-occupational injuries. Imposing a severity threshold resulted in a markedly different historical picture. Severity restriction can be used as an injury surveillance methodology to increase the accuracy of trend estimates, which can then be used by occupational health researchers, practitioners and policy-makers to identify prevention opportunities and to support state and national investments in occupational injury prevention efforts. The newly adopted state-based occupational health indicator, 'Work-Related Severe Traumatic Injury Hospitalizations', incorporates a severity threshold that will reduce temporal ascertainment threats to accurate trend estimates. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
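
    A minimal sketch of the trend model named in the record: a negative binomial regression (log link) of annual hospitalisation counts on year with a population offset, here fitted with statsmodels to synthetic data rather than the HCUP/NHDS series.

        # Negative binomial trend model for annual injury hospitalisation counts.
        import numpy as np
        import statsmodels.api as sm

        years = np.arange(1998, 2010)
        population = np.full(years.size, 5_000_000.0)
        rng = np.random.default_rng(3)
        counts = rng.poisson(2000 * np.exp(-0.03 * (years - 1998)))   # declining trend

        X = sm.add_constant(years - 1998)
        model = sm.GLM(counts, X,
                       family=sm.families.NegativeBinomial(),
                       offset=np.log(population))
        fit = model.fit()
        print("estimated annual % change:", round(100 * (np.exp(fit.params[1]) - 1), 2))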

  18. Identifying Threshold Concepts for Information Literacy: A Delphi Study

    Directory of Open Access Journals (Sweden)

    Lori Townsend

    2016-06-01

    Full Text Available This study used the Delphi method to engage expert practitioners on the topic of threshold concepts for information literacy. A panel of experts considered two questions. First, is the threshold concept approach useful for information literacy instruction? The panel unanimously agreed that the threshold concept approach holds potential for information literacy instruction. Second, what are the threshold concepts for information literacy instruction? The panel proposed and discussed over fifty potential threshold concepts, finally settling on six information literacy threshold concepts.

  19. Multiuser switched diversity scheduling systems with per-user threshold

    KAUST Repository

    Nam, Haewoon

    2010-05-01

    A multiuser switched diversity scheduling scheme with per-user feedback threshold is proposed and analyzed in this paper. The conventional multiuser switched diversity scheduling scheme uses a single feedback threshold for every user, where the threshold is a function of the average signal-to-noise ratios (SNRs) of the users as well as the number of users involved in the scheduling process. The proposed scheme, however, constructs a sequence of feedback thresholds instead of a single feedback threshold such that each user compares its channel quality with the corresponding feedback threshold in the sequence. Numerical and simulation results show that thanks to the flexibility of threshold selection, where a potentially different threshold can be used for each user, the proposed scheme provides a higher system capacity than that for the conventional scheme. © 2006 IEEE.
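
    A small simulation sketch of the scheduling rule described: users are probed in sequence and the first whose instantaneous SNR exceeds its own feedback threshold is scheduled, falling back to the last probed user if none qualifies (one common convention, assumed here). The per-user threshold sequence and Rayleigh-faded SNRs are illustrative.

        # Multiuser switched diversity scheduling with per-user feedback thresholds.
        import numpy as np

        def schedule(snrs, thresholds):
            for user, (snr, thr) in enumerate(zip(snrs, thresholds)):
                if snr >= thr:
                    return user
            return len(snrs) - 1          # fallback: last probed user

        rng = np.random.default_rng(4)
        n_users, avg_snr = 4, 10.0
        thresholds = np.array([8.0, 9.0, 10.0, 11.0])      # a per-user threshold sequence

        capacities = []
        for _ in range(10_000):
            snrs = rng.exponential(avg_snr, n_users)       # Rayleigh fading -> exponential SNR
            u = schedule(snrs, thresholds)
            capacities.append(np.log2(1 + snrs[u]))
        print("avg capacity (bits/s/Hz):", round(np.mean(capacities), 3))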

  20. Pain thresholds, supra-threshold pain and lidocaine sensitivity in patients with erythromelalgia, including the I848T mutation in NaV1.7.

    Science.gov (United States)

    Helås, T; Sagafos, D; Kleggetveit, I P; Quiding, H; Jönsson, B; Segerdahl, M; Zhang, Z; Salter, H; Schmelz, M; Jørum, E

    2017-09-01

    Nociceptive thresholds and supra-threshold pain ratings, as well as their reduction upon local injection with lidocaine, were compared between healthy subjects and patients with erythromelalgia (EM). Lidocaine (0.25, 0.50, 1.0 or 10 mg/mL) or placebo (saline) was injected intradermally in non-painful areas of the lower arm, in a randomized, double-blind manner, to test the effect on dynamic and static mechanical sensitivity, mechanical pain sensitivity, thermal thresholds and supra-threshold heat pain sensitivity. Heat pain thresholds and pain ratings to supra-threshold heat stimulation did not differ between EM patients (n = 27) and controls (n = 25), neither did the dose-response curves for lidocaine. Only the subgroup of EM patients with mutations in the sodium channel subunits NaV1.7, 1.8 or 1.9 (n = 8) had increased lidocaine sensitivity for supra-threshold heat stimuli, contrasting with lower sensitivity to strong mechanical stimuli. This pattern was particularly clear in the two patients carrying the NaV1.7 I848T mutation, in whom lidocaine's hyperalgesic effect on mechanical pain sensitivity contrasted with more effective heat analgesia. Heat pain thresholds are not sensitized in EM patients, even in those with gain-of-function mutations in NaV1.7. Differential lidocaine sensitivity was overt only for noxious stimuli in the supra-threshold range, suggesting that sensitized supra-threshold encoding is important for the clinical pain phenotype in EM in addition to a lower activation threshold. Intracutaneous lidocaine dose-dependently blocked nociceptive sensations, but we did not identify EM patients with particularly high lidocaine sensitivity that could have provided valuable therapeutic guidance. Acute pain thresholds and supra-threshold heat pain in controls and patients with erythromelalgia do not differ and have the same lidocaine sensitivity. Acute heat pain thresholds even in EM patients with the NaV1.7 I848T mutation are normal and only nociceptor

  1. A brief peripheral motion contrast threshold test predicts older drivers' hazardous behaviors in simulated driving.

    Science.gov (United States)

    Henderson, Steven; Woods-Fry, Heather; Collin, Charles A; Gagnon, Sylvain; Voloaca, Misha; Grant, John; Rosenthal, Ted; Allen, Wade

    2015-05-01

    Our research group has previously demonstrated that the peripheral motion contrast threshold (PMCT) test predicts older drivers' self-reported accident risk, as well as simulated driving performance. However, the PMCT is too lengthy to be part of a battery of tests to assess fitness to drive. Therefore, we have developed a new version of this test, which takes under two minutes to administer. We assessed the motion contrast thresholds of 24 younger drivers (19-32) and 25 older drivers (65-83) with both the PMCT-10min and the PMCT-2min test and investigated whether thresholds were associated with measures of simulated driving performance. Younger participants had significantly lower motion contrast thresholds than older participants, and there were no significant correlations between younger participants' thresholds and any measures of driving performance. The PMCT-10min and the PMCT-2min thresholds of older drivers predicted simulated crash risk, as well as the minimum distance of approach to all hazards. This suggests that our tests of motion processing can help predict the risk of collision or near collision in older drivers. Thresholds were also correlated with the total lane deviation time, suggesting a deficiency in processing of peripheral flow and delayed detection of adjacent cars. The PMCT-2min is an improved version of a previously validated test, and it has the potential to help assess older drivers' fitness to drive. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Designing and implementing of improved cryptographic algorithm using modular arithmetic theory

    Directory of Open Access Journals (Sweden)

    Maryam Kamarzarrin

    2015-05-01

    Full Text Available Maintaining the privacy and security of people's information are two of the most important principles of an electronic health plan. One method of providing privacy and security of information is the use of a public-key cryptography system. In this paper, we compare two algorithms, the common exponentiation algorithm and the fast exponentiation algorithm, for enhancing the efficiency of public-key cryptography. We show that a system designed with the fast exponentiation algorithm has higher speed and performance but lower power consumption and occupied area compared with the common exponentiation algorithm. Although systems designed with the common exponentiation algorithm have slower speed and lower performance, designing with this algorithm is less complex and easier than with the fast exponentiation algorithm. In this paper, we examine and compare these two methods of exponentiation and observe the performance impact of the two approaches implemented in hardware in VHDL on an FPGA.
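
    A Python sketch (the record's implementation is in VHDL on an FPGA) contrasting the two exponentiation strategies for the modular powers used in public-key cryptography. The toy modulus and exponent are illustrative, and Python's built-in pow(base, exp, mod) already performs fast exponentiation.

        def common_exp(base, exp, mod):
            """Repeated multiplication: O(exp) modular multiplications."""
            result = 1
            for _ in range(exp):
                result = (result * base) % mod
            return result

        def fast_exp(base, exp, mod):
            """Square-and-multiply: O(log exp) modular multiplications."""
            result, base = 1, base % mod
            while exp > 0:
                if exp & 1:                 # multiply when the current exponent bit is 1
                    result = (result * base) % mod
                base = (base * base) % mod  # square for the next bit
                exp >>= 1
            return result

        m, e, n = 42, 65537, 3233           # toy numbers, not a secure key
        assert fast_exp(m, e, n) == pow(m, e, n) == common_exp(m, e, n)
        print(fast_exp(m, e, n))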

  3. When do price thresholds matter in retail categories?

    OpenAIRE

    Pauwels, Koen; Srinivasan, Shuba; Franses, Philip Hans

    2007-01-01

    Marketing literature has long recognized that brand price elasticity need not be monotonic and symmetric, but has yet to provide generalizable market-level insights on threshold-based price elasticity, asymmetric thresholds, and the sign and magnitude of elasticity transitions. This paper introduces smooth transition regression models to study threshold-based price elasticity of the top 4 brands across 20 fast-moving consumer good categories. Threshold-based price elasticity is fo...

  4. Acrolein-stressed threshold adaptation alters the molecular and metabolic bases of an engineered Saccharomyces cerevisiae to improve glutathione production.

    Science.gov (United States)

    Zhou, Wenlong; Yang, Yan; Tang, Liang; Cheng, Kai; Li, Changkun; Wang, Huimin; Liu, Minzhi; Wang, Wei

    2018-03-14

    Acrolein (Acr) was used as a selection agent to improve the glutathione (GSH) overproduction of the prototrophic strain W303-1b/FGP PT. After two rounds of adaptive laboratory evolution (ALE), an unexpected result was obtained wherein identical GSH production was observed in the selected isolates. Then, a threshold selection mechanism of Acr-stressed adaptation was clarified based on the formation of an Acr-GSH adduct, and a diffusion coefficient (0.36 ± 0.02 μmol·min⁻¹·OD₆₀₀⁻¹) was calculated. Metabolomic analysis was carried out to reveal the molecular bases that triggered GSH overproduction. The results indicated that all three precursors (glutamic acid (Glu), glycine (Gly) and cysteine (Cys)) needed for GSH synthesis were at relatively higher concentrations in the evolved strain and that the accumulation of homocysteine (Hcy) and cystathionine might promote Cys synthesis and then improve GSH production. In addition to GSH and Cys, it was observed that other non-protein thiols and molecules related to ATP generation were at markedly different levels. To divert the accumulated thiols to GSH biosynthesis, combinatorial strategies, including deletion of cystathionine β-lyase (STR3), overexpression of cystathionine γ-lyase (CYS3) and cystathionine β-synthase (CYS4), and reduction of the unfolded protein response (UPR) through up-regulation of protein disulphide isomerase (PDI), were also investigated.

  5. High-order above-threshold dissociation of molecules

    Science.gov (United States)

    Lu, Peifen; Wang, Junping; Li, Hui; Lin, Kang; Gong, Xiaochun; Song, Qiying; Ji, Qinying; Zhang, Wenbin; Ma, Junyang; Li, Hanxiao; Zeng, Heping; He, Feng; Wu, Jian

    2018-03-01

    Electrons bound to atoms or molecules can simultaneously absorb multiple photons via above-threshold ionization, featuring discrete peaks in the photoelectron spectrum on account of the quantized nature of the light energy. Analogously, above-threshold dissociation of molecules has been proposed to address the multiple-photon energy deposition in the nuclei of molecules. In this case, nuclear energy spectra consisting of photon-energy-spaced peaks exceeding the binding energy of the molecular bond are predicted. Although the observation of such phenomena is difficult, this scenario is nevertheless logical and is based on fundamental laws. Here, we report conclusive experimental observation of high-order above-threshold dissociation of H2 in strong laser fields, where the tunneling-ionized electron transfers the absorbed multiphoton energy, which is above the ionization threshold, to the nuclei via field-driven inelastic rescattering. Our results provide unambiguous evidence that the electron and nuclei of a molecule as a whole absorb multiple photons, and thus above-threshold ionization and above-threshold dissociation must appear simultaneously, which is the cornerstone of today's strong-field molecular physics.

  6. Thresholds in Xeric Hydrology and Biogeochemistry

    Science.gov (United States)

    Meixner, T.; Brooks, P. D.; Simpson, S. C.; Soto, C. D.; Yuan, F.; Turner, D.; Richter, H.

    2011-12-01

    Due to water limitation, thresholds in hydrologic and biogeochemical processes are common in arid and semi-arid systems. Some of these thresholds, such as those focused on rainfall-runoff relationships, have been well studied. However, to gain a full picture of the role that thresholds play in driving the hydrology and biogeochemistry of xeric systems, a view of the entire array of processes at work is needed. Here a walk through the landscape of xeric systems is conducted, illustrating the powerful role of hydrologic thresholds in xeric system biogeochemistry. To understand xeric hydro-biogeochemistry, two key ideas need to be focused on. First, it is important to start from a framework of reaction and transport. Second, an understanding of the temporal and spatial components of thresholds that have a large impact on hydrologic and biogeochemical fluxes needs to be offered. In the uplands themselves, episodic rewetting and drying of soils permits accelerated biogeochemical processing but also more gradual drainage of water through the subsurface than expected in simple conceptions of biogeochemical processes. Hydrologic thresholds (water content above hygroscopic) result in a stop-start nutrient spiral of material across the landscape, since runoff connecting uplands to xeric perennial riparian zones is episodic and often only transports materials a short distance (100's of m). This episodic movement results in important and counter-intuitive nutrient inputs to riparian zones but also significant processing and uptake of nutrients. The floods that transport these biogeochemicals also result in significant input to riparian groundwater and may be key to sustaining these critical ecosystems. Importantly, the flood-driven recharge process itself is a threshold process dependent on flood characteristics (floods greater than 100 cubic meters per second) and antecedent conditions (losing to near-neutral gradients). Floods also appear to influence where arid and semi

  7. Low heat pain thresholds in migraineurs between attacks.

    Science.gov (United States)

    Schwedt, Todd J; Zuniga, Leslie; Chong, Catherine D

    2015-06-01

    Between attacks, migraine is associated with hypersensitivities to sensory stimuli. The objective of this study was to investigate hypersensitivity to pain in migraineurs between attacks. Cutaneous heat pain thresholds were measured in 112 migraineurs, migraine free for ≥ 48 hours, and 75 healthy controls. Pain thresholds at the head and at the arm were compared between migraineurs and controls using two-tailed t-tests. Among migraineurs, correlations between heat pain thresholds and headache frequency, allodynia symptom severity, and time interval until next headache were calculated. Migraineurs had lower pain thresholds than controls at the head (43.9 ℃ ± 3.2 ℃ vs. 45.1 ℃ ± 3.0 ℃, p = 0.015) and arm (43.2 ℃ ± 3.4 ℃ vs. 44.8 ℃ ± 3.3 ℃). There were no significant correlations between pain thresholds and headache frequency or allodynia symptom severity. For the 41 migraineurs for whom time to next headache was known, there were positive correlations between time to next headache and pain thresholds at the head (r = 0.352, p = 0.024) and arm (r = 0.312, p = 0.047). This study provides evidence that migraineurs have low heat pain thresholds between migraine attacks. Mechanisms underlying these lower pain thresholds could also predispose migraineurs to their next migraine attack, a hypothesis supported by finding positive correlations between pain thresholds and time to next migraine attack. © International Headache Society 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  8. A Theoretical and Experimental Comparison of One Time Pad Cryptography using Key and Plaintext Insertion and Transposition (KPIT) and Key Coloumnar Transposition (KCT) Method

    Directory of Open Access Journals (Sweden)

    Pryo Utomo

    2017-06-01

    Full Text Available One Time Pad (OTP) is a cryptographic algorithm that is quite easy to implement. The algorithm works by converting the plaintext and key into decimal, converting them into binary numbers, and computing the Exclusive-OR of the two. In this paper, the authors compare OTP cryptography using the KPI and KCT methods so that the generated ciphertext is more difficult to recover. In the Key and Plaintext Insertion (KPI) method, we modify the OTP algorithm by inserting the key into the plaintext that has been split. Meanwhile, in the Key Coloumnar Transposition (KCT) method, we modify the OTP algorithm by dividing the key into several parts in a matrix of rows and columns. The algorithms are implemented in the PHP programming language.
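
    A minimal Python sketch (the record's implementation is in PHP) of the basic OTP operation the KPI and KCT variants build on: XOR of the plaintext with an equally long random key. The key-insertion and columnar-transposition modifications are only described at a high level in the record and are not reproduced here.

        # Basic One Time Pad: XOR with an equally long, truly random key.
        import secrets

        def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
            assert len(key) == len(plaintext), "OTP requires a key as long as the message"
            return bytes(p ^ k for p, k in zip(plaintext, key))

        otp_decrypt = otp_encrypt          # XOR is its own inverse

        message = b"threshold cryptography"
        key = secrets.token_bytes(len(message))
        ciphertext = otp_encrypt(message, key)
        assert otp_decrypt(ciphertext, key) == message
        print(ciphertext.hex())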

  9. Some considerations regarding the creep crack growth threshold

    International Nuclear Information System (INIS)

    Thouless, M.D.; Evans, A.G.

    1984-01-01

    The preceding analysis reveals that the existence of a threshold determined by the sintering stress does not influence the post-threshold crack velocity. Considerations of the sintering stress can thus be conveniently excluded from analysis of the post-threshold crack velocity. The presence of a crack growth threshold has been predicted, based on the existence of cavity nucleation controlled crack growth. A preliminary analysis of cavity nucleation rates within the damage zone reveals that this threshold is relatively abrupt, in accord with experimental observations. Consequently, at stress intensities below K_th, growth becomes nucleation limited and crack blunting occurs in preference to crack growth.

  10. Crack Growth Behavior in the Threshold Region for High Cycle Fatigue Loading

    Science.gov (United States)

    Forman, R. G.; Zanganeh, M.

    2014-01-01

    This paper describes the results of a research program conducted to improve the understanding of fatigue crack growth rate behavior in the threshold growth rate region and to answer a question on the validity of threshold region test data. The validity question relates to the view held by some experimentalists that using the ASTM load shedding test method does not produce valid threshold test results and material properties. The question involves the fanning behavior observed in the threshold region of da/dN plots for some materials, in which the low R-ratio data fans out from the high R-ratio data. This fanning behavior, or elevation of threshold values in the low R-ratio tests, is generally assumed to be caused by an increase in crack closure in the low R-ratio tests. Also, the increase in crack closure is assumed by some experimentalists to result from using the ASTM load shedding test procedure. The belief is that this procedure induces load history effects which cause remote closure from plasticity and/or roughness changes in the surface morphology. However, experimental studies performed by the authors have shown that the increase in crack closure is a result of extensive crack tip bifurcations that can occur in some materials, particularly in aluminum alloys, when the crack tip cyclic yield zone size becomes less than the grain size of the alloy. This behavior is related to the high stacking fault energy (SFE) property of aluminum alloys, which results in easier slip characteristics. Therefore, the fanning behavior which occurs in aluminum alloys is a function of the intrinsic dislocation properties of the alloy, and therefore the fanned data does represent the true threshold properties of the material. However, for the corrosion sensitive steel alloys tested in laboratory air, the occurrence of fanning results from fretting corrosion at the crack tips, and these results should not be considered to be representative of valid threshold properties because the fanning is

  11. When Do Price Thresholds Matter in Retail Categories?

    OpenAIRE

    Koen Pauwels; Shuba Srinivasan; Philip Hans Franses

    2007-01-01

    Marketing literature has long recognized that brand price elasticity need not be monotonic and symmetric, but has yet to provide generalizable market-level insights on threshold-based price elasticity, asymmetric thresholds, and the sign and magnitude of elasticity transitions. This paper introduces smooth transition regression models to study threshold-based price elasticity of the top 4 brands across 20 fast-moving consumer good categories. Threshold-based price elasticity is found for 76% ...

  12. Estimating the Threshold Level of Inflation for Thailand

    OpenAIRE

    Jiranyakul, Komain

    2017-01-01

    Abstract. This paper analyzes the relationship between inflation and economic growth in Thailand using an annual dataset covering 1990 to 2015. The threshold model is estimated for different levels of the threshold inflation rate. The results suggest that the threshold level of inflation, above which inflation significantly slows growth, is estimated at 3 percent. The negative relationship between inflation and growth is apparent above this threshold level of inflation. In other words, the inflation rat...
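
    A hedged sketch of one common way such a threshold is estimated: grid-search a candidate threshold k, regress growth on inflation with an extra slope for inflation above k, and keep the k that minimizes the residual sum of squares. The data are synthetic and the specification is illustrative, not the paper's exact model.

        # Grid-search estimate of a threshold inflation level.
        import numpy as np

        rng = np.random.default_rng(5)
        inflation = rng.uniform(0, 12, 200)
        growth = 4 + 0.1 * inflation - 0.5 * np.maximum(inflation - 3, 0) + rng.normal(0, 1, 200)

        def sse_at(k):
            X = np.column_stack([np.ones_like(inflation),
                                 inflation,
                                 np.maximum(inflation - k, 0)])   # slope change above k
            beta, *_ = np.linalg.lstsq(X, growth, rcond=None)
            resid = growth - X @ beta
            return resid @ resid

        candidates = np.arange(1.0, 10.0, 0.25)
        k_hat = min(candidates, key=sse_at)
        print("estimated threshold inflation:", k_hat)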

  13. Improved approach to quantitative cardiac volumetrics using automatic thresholding and manual trimming: a cardiovascular MRI study.

    Science.gov (United States)

    Rayarao, Geetha; Biederman, Robert W W; Williams, Ronald B; Yamrozik, June A; Lombardi, Richard; Doyle, Mark

    2018-01-01

    To establish the clinical validity and accuracy of automatic thresholding and manual trimming (ATMT) by comparing the method with the conventional contouring method for in vivo cardiac volume measurements. CMR was performed on 40 subjects (30 patients and 10 controls) using steady-state free precession cine sequences with slices oriented in the short-axis and acquired contiguously from base to apex. Left ventricular (LV) volumes, end-diastolic volume, end-systolic volume, and stroke volume (SV) were obtained with ATMT and with the conventional contouring method. Additionally, SV was measured independently using CMR phase velocity mapping (PVM) of the aorta for validation. Three methods of calculating SV were compared by applying Bland-Altman analysis. The Bland-Altman standard deviation of variation (SD) and offset bias for LV SV for the three sets of data were: ATMT-PVM (7.65, [Formula: see text]), ATMT-contours (7.85, [Formula: see text]), and contour-PVM (11.01, 4.97), respectively. Equating the observed range to the error contribution of each approach, the error magnitude of ATMT:PVM:contours was in the ratio 1:2.4:2.5. Use of ATMT for measuring ventricular volumes accommodates trabeculae and papillary structures more intuitively than contemporary contouring methods. This results in lower variation when analyzing cardiac structure and function and consequently improved accuracy in assessing chamber volumes.
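
    A small sketch of the Bland-Altman quantities used in the record (offset bias, SD of the differences, and the usual ±1.96 SD limits of agreement) for two stroke-volume series. The values are synthetic, not the study's measurements.

        # Bland-Altman comparison of two stroke-volume measurement methods.
        import numpy as np

        rng = np.random.default_rng(6)
        sv_atmt = rng.normal(80, 15, 40)                 # e.g., ATMT-derived stroke volumes (mL)
        sv_pvm = sv_atmt + rng.normal(-1.0, 7.6, 40)     # e.g., PVM-derived stroke volumes (mL)

        diff = sv_atmt - sv_pvm
        bias, sd = diff.mean(), diff.std(ddof=1)
        print(f"bias = {bias:.2f} mL, SD = {sd:.2f} mL, "
              f"limits of agreement = [{bias - 1.96*sd:.2f}, {bias + 1.96*sd:.2f}] mL")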

  14. Time-efficient multidimensional threshold tracking method

    DEFF Research Database (Denmark)

    Fereczkowski, Michal; Kowalewski, Borys; Dau, Torsten

    2015-01-01

    Traditionally, adaptive methods have been used to reduce the time it takes to estimate psychoacoustic thresholds. However, even with adaptive methods, there are many cases where the testing time is too long to be clinically feasible, particularly when estimating thresholds as a function of anothe...

  15. A light-powered sub-threshold microprocessor

    Energy Technology Data Exchange (ETDEWEB)

    Liu Ming; Chen Hong; Zhang Chun; Li Changmeng; Wang Zhihua, E-mail: lium02@mails.tsinghua.edu.cn [Institute of Microelectronics, Tsinghua University, Beijing 100084 (China)

    2010-11-15

    This paper presents an 8-bit sub-threshold microprocessor which can be powered by an integrated photosensitive diode. With a custom-designed sub-threshold standard cell library and a 1 kbit sub-threshold SRAM design, a leakage power of 58 nW, a dynamic power of 385 nW at 165 kHz, an EDP of 13 pJ/inst and an operating voltage of 350 mV are achieved. Under a light of about 150 kLux, the microprocessor can run at a rate of up to 500 kHz. The microprocessor can be used for wireless-sensor-network nodes.

  16. Bedding material affects mechanical thresholds, heat thresholds and texture preference

    Science.gov (United States)

    Moehring, Francie; O’Hara, Crystal L.; Stucky, Cheryl L.

    2015-01-01

    It has long been known that the bedding type animals are housed on can affect breeding behavior and cage environment. Yet little is known about its effects on evoked behavior responses or non-reflexive behaviors. C57BL/6 mice were housed for two weeks on one of five bedding types: Aspen Sani Chips® (standard bedding for our institute), ALPHA-Dri®, Cellu-Dri™, Pure-o’Cel™ or TEK-Fresh. Mice housed on Aspen exhibited the lowest (most sensitive) mechanical thresholds while those on TEK-Fresh exhibited 3-fold higher thresholds. While bedding type had no effect on responses to punctate or dynamic light touch stimuli, TEK-Fresh housed animals exhibited greater responsiveness in a noxious needle assay, than those housed on the other bedding types. Heat sensitivity was also affected by bedding as animals housed on Aspen exhibited the shortest (most sensitive) latencies to withdrawal whereas those housed on TEK-Fresh had the longest (least sensitive) latencies to response. Slight differences between bedding types were also seen in a moderate cold temperature preference assay. A modified tactile conditioned place preference chamber assay revealed that animals preferred TEK-Fresh to Aspen bedding. Bedding type had no effect in a non-reflexive wheel running assay. In both acute (two day) and chronic (5 week) inflammation induced by injection of Complete Freund’s Adjuvant in the hindpaw, mechanical thresholds were reduced in all groups regardless of bedding type, but TEK-Fresh and Pure-o’Cel™ groups exhibited a greater dynamic range between controls and inflamed cohorts than Aspen housed mice. PMID:26456764

  17. 40 CFR 68.115 - Threshold determination.

    Science.gov (United States)

    2010-07-01

    ... (CONTINUED) CHEMICAL ACCIDENT PREVENTION PROVISIONS Regulated Substances for Accidental Release Prevention... process exceeds the threshold. (b) For the purposes of determining whether more than a threshold quantity... portion of the process is less than 10 millimeters of mercury (mm Hg), the amount of the substance in the...

  18. Approach to DOE threshold guidance limits

    International Nuclear Information System (INIS)

    Shuman, R.D.; Wickham, L.E.

    1984-01-01

    The need for less restrictive criteria governing disposal of extremely low-level radioactive waste has long been recognized. The Low-Level Waste Management Program has been directed by the Department of Energy (DOE) to aid in the development of a threshold guidance limit for DOE low-level waste facilities. Project objectives are concerned with the definition of a threshold limit dose and pathway analysis of radionuclide transport within selected exposure scenarios at DOE sites. Results of the pathway analysis will be used to determine waste radionuclide concentration guidelines that meet the defined threshold limit dose. Methods of measurement and verification of concentration limits round out the project's goals. Work on defining a threshold limit dose is nearing completion. Pathway analysis of sanitary landfill operations at the Savannah River Plant and the Idaho National Engineering Laboratory is in progress using the DOSTOMAN computer code. Concentration limit calculations and determination of implementation procedures shall follow completion of the pathways work. 4 references

  19. Towards a unifying basis of auditory thresholds: binaural summation.

    Science.gov (United States)

    Heil, Peter

    2014-04-01

    Absolute auditory threshold decreases with increasing sound duration, a phenomenon explainable by the assumptions that the sound evokes neural events whose probabilities of occurrence are proportional to the sound's amplitude raised to an exponent of about 3 and that a constant number of events are required for threshold (Heil and Neubauer, Proc Natl Acad Sci USA 100:6151-6156, 2003). Based on this probabilistic model and on the assumption of perfect binaural summation, an equation is derived here that provides an explicit expression of the binaural threshold as a function of the two monaural thresholds, irrespective of whether they are equal or unequal, and of the exponent in the model. For exponents >0, the predicted binaural advantage is largest when the two monaural thresholds are equal and decreases towards zero as the monaural threshold difference increases. This equation is tested and the exponent derived by comparing binaural thresholds with those predicted on the basis of the two monaural thresholds for different values of the exponent. The thresholds, measured in a large sample of human subjects with equal and unequal monaural thresholds and for stimuli with different temporal envelopes, are compatible only with an exponent close to 3. An exponent of 3 predicts a binaural advantage of 2 dB when the two ears are equally sensitive. Thus, listening with two (equally sensitive) ears rather than one has the same effect on absolute threshold as doubling duration. The data suggest that perfect binaural summation occurs at threshold and that peripheral neural signals are governed by an exponent close to 3. They might also shed new light on mechanisms underlying binaural summation of loudness.
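
    A sketch of the binaural prediction implied by the stated assumptions (event probability proportional to amplitude cubed, perfect summation across ears). The explicit formula below is a reconstruction from those assumptions, not the authors' published equation; it reproduces the quoted roughly 2 dB advantage when the two monaural thresholds are equal, and the advantage shrinks toward zero as they diverge.

        # Predicted binaural threshold from two monaural thresholds (dB), assuming an
        # exponent of 3 and perfect summation:
        #   L_b = -(20/3) * log10(10**(-3*L1/20) + 10**(-3*L2/20))
        import math

        def binaural_threshold(l1_db, l2_db, exponent=3.0):
            k = exponent / 20.0
            return -(1.0 / k) * math.log10(10 ** (-k * l1_db) + 10 ** (-k * l2_db))

        print(round(binaural_threshold(20.0, 20.0), 2))   # ~17.99 dB -> ~2 dB binaural advantage
        print(round(binaural_threshold(20.0, 40.0), 2))   # advantage nearly vanishes for unequal ears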

  20. Spike-threshold adaptation predicted by membrane potential dynamics in vivo.

    Directory of Open Access Journals (Sweden)

    Bertrand Fontaine

    2014-04-01

    Full Text Available Neurons encode information in sequences of spikes, which are triggered when their membrane potential crosses a threshold. In vivo, the spiking threshold displays large variability, suggesting that threshold dynamics have a profound influence on how the combined input of a neuron is encoded in its spiking. Threshold variability could be explained by adaptation to the membrane potential. However, it could also be the case that most threshold variability reflects noise and processes other than threshold adaptation. Here, we investigated threshold variation in the responses of auditory neurons recorded in vivo in barn owls. We found that the spike threshold is quantitatively predicted by a model in which the threshold adapts, tracking the membrane potential at a short timescale. As a result, in these neurons, slow voltage fluctuations do not contribute to spiking because they are filtered by threshold adaptation. More importantly, these neurons can only respond to input spikes arriving together on a millisecond timescale. These results demonstrate that fast adaptation to the membrane potential captures spike threshold variability in vivo.
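
    A hedged simulation sketch of the mechanism described: a spike threshold that tracks the membrane potential with a short time constant, so a slow depolarizing ramp is filtered out while brief, coincident input pulses still trigger spikes. The leaky-integrator model and all parameters are illustrative, not the model fitted to the barn-owl recordings.

        # Adaptive spike threshold tracking the membrane potential at a short timescale.
        import numpy as np

        dt = 0.1e-3                        # 0.1 ms time step
        T = int(0.3 / dt)                  # 300 ms simulation
        tau_m, tau_th = 5e-3, 2e-3         # membrane / threshold time constants (s)
        v_rest, th0, alpha = -65.0, -60.0, 0.9

        I = np.linspace(0.0, 10.0, T)      # slow depolarizing ramp (mV-equivalent drive)
        for t_ms in (100, 200):            # brief, strong "coincident input" pulses
            i0 = int(t_ms * 1e-3 / dt)
            I[i0:i0 + 20] += 40.0          # 2 ms pulses

        v, theta, spikes = v_rest, th0, []
        for t in range(T):
            v += dt / tau_m * (v_rest - v + I[t])
            # threshold relaxes toward a baseline plus a fraction of the depolarization
            theta += dt / tau_th * (th0 + alpha * (v - v_rest) - theta)
            if v >= theta:
                spikes.append(round(t * dt * 1e3, 1))   # spike time in ms
                v = v_rest                              # reset after a spike
        print("spike times (ms):", spikes)              # only the pulses evoke spikes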