WorldWideScience

Sample records for threshold cryptography improving

  1. Threshold quantum cryptography

    International Nuclear Information System (INIS)

    Tokunaga, Yuuki; Okamoto, Tatsuaki; Imoto, Nobuyuki

    2005-01-01

    We present the concept of threshold collaborative unitary transformation, or threshold quantum cryptography, which is a quantum version of threshold cryptography. In threshold quantum cryptography, classical shared secrets are distributed to several parties, and a subset of them, whose number exceeds a threshold, collaborates to compute a quantum cryptographic function while keeping each share secret within its party. The shared secrets are reusable if no cheating is detected. As a concrete example of this concept, we show a distributed protocol (with threshold) of conjugate coding
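
    The classical threshold primitive this quantum analogue builds on can be illustrated with a toy Shamir (t, n) secret-sharing sketch (an illustrative classical analogue only; the paper's quantum protocol is not shown here):

```python
# Toy (t, n) Shamir secret sharing over a prime field: any t shares
# reconstruct the secret, fewer reveal nothing about it.
import random

PRIME = 2**61 - 1  # a Mersenne prime; secrets must be smaller than this

def make_shares(secret, t, n):
    """Split `secret` into n shares with threshold t (random polynomial of degree t-1)."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret
```

    Any t = 3 of n = 5 shares suffice; which three does not matter.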

  2. Pairing based threshold cryptography improving on Libert-Quisquater and Baek-Zheng

    DEFF Research Database (Denmark)

    Desmedt, Yvo; Lange, Tanja

    2006-01-01

    In this paper we apply techniques from secret sharing and threshold decryption to show how to properly design an ID-based threshold system in which one assumes no trust in any party. In our scheme: We avoid that any single machine ever knew the master secret s of the trusted authority (TA). Inste...

  3. AUTHENTICATION ARCHITECTURE USING THRESHOLD CRYPTOGRAPHY IN KERBEROS FOR MOBILE AD HOC NETWORKS

    Directory of Open Access Journals (Sweden)

    Hadj Gharib

    2014-06-01

    Full Text Available The use of wireless technologies is gradually increasing, and the risks related to their use are considerable. Due to its dynamically changing topology and open environment without the centralized policy control of a traditional network, a mobile ad hoc network (MANET) is vulnerable to the presence of malicious nodes and to attacks. The ideal solution to overcome a myriad of security concerns in MANETs is the use of a reliable authentication architecture. In this paper we propose a new key management scheme based on threshold cryptography in Kerberos for MANETs. The proposed scheme uses elliptic curve cryptography, which consumes fewer resources and is well adapted to the wireless environment. Our approach shows strength and effectiveness against attacks.

  4. Step to improve neural cryptography against flipping attacks.

    Science.gov (United States)

    Zhou, Jiantao; Xu, Qinzhen; Pei, Wenjiang; He, Zhenya; Szu, Harold

    2004-12-01

    Synchronization of neural networks by mutual learning has been demonstrated to be possible for constructing a key exchange protocol over a public channel. However, the neural cryptography schemes presented so far are not sufficiently secure under regular flipping attack (RFA) and are completely insecure under majority flipping attack (MFA). We propose a scheme that splits the mutual information and the training process to improve the security of the neural cryptosystem against flipping attacks. Both analytical and simulation results show that the success probability of RFA on the proposed scheme can be decreased to the level of a brute force attack (BFA), and the success probability of MFA still decays exponentially with the weights' level L. The synchronization time of the parties also remains polynomial in L. Moreover, we analyze the security under an advanced flipping attack.
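
    The mutual-learning setup referred to here is usually modeled with tree parity machines (TPMs). A minimal sketch, with arbitrary small parameters K, N, L (assumptions, not the paper's values), shows the Hebbian update rule and the fact that a synchronized pair stays synchronized under identical inputs; it does not simulate synchronization from scratch or the attacks:

```python
# Minimal tree parity machine (TPM) mutual-learning sketch.
import math
import random

K, N, L = 3, 10, 3  # hidden units, inputs per unit, weight bound (illustrative)

def output(w, x):
    """Hidden-unit signs sigma and the parity output tau (0 counts as -1)."""
    sigma = [1 if sum(wk[i] * xk[i] for i in range(N)) > 0 else -1
             for wk, xk in zip(w, x)]
    return sigma, math.prod(sigma)

def hebbian(w, x, sigma, tau):
    """Update only hidden units that agree with tau; clip weights to [-L, L]."""
    for k in range(K):
        if sigma[k] == tau:
            for i in range(N):
                w[k][i] = max(-L, min(L, w[k][i] + sigma[k] * x[k][i]))

random.seed(1)
wA = [[random.randint(-L, L) for _ in range(N)] for _ in range(K)]
wB = [row[:] for row in wA]  # start from identical weights: sync is a fixed point
for _ in range(200):
    x = [[random.choice((-1, 1)) for _ in range(N)] for _ in range(K)]
    sA, tA = output(wA, x)
    sB, tB = output(wB, x)
    if tA == tB:                 # parties update only when their outputs agree
        hebbian(wA, x, sA, tA)
        hebbian(wB, x, sB, tB)
```

    The final, common weight matrix is what the parties would use as the shared key.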

  5. Fourier-based automatic alignment for improved Visual Cryptography schemes.

    Science.gov (United States)

    Machizaud, Jacques; Chavel, Pierre; Fournel, Thierry

    2011-11-07

    In Visual Cryptography, several images, called "shadow images", that separately contain no information, are overlapped to reveal a shared secret message. We develop a method to digitally register one printed shadow image acquired by a camera with a purely digital shadow image, stored in memory. Using Fourier techniques derived from Fourier Optics concepts, the idea is to enhance and exploit the quasi periodicity of the shadow images, composed of a random distribution of black and white patterns on a periodic sampling grid. The advantage is to speed up the security control or the access time to the message, in particular in the cases of a small pixel size or of large numbers of pixels. Furthermore, the interest of visual cryptography can be increased by embedding the initial message in two shadow images that do not have identical mathematical supports, making manual registration impractical. Experimental results demonstrate the successful operation of the method, including the possibility to directly project the result onto the printed shadow image.

  6. On the improvement of neural cryptography using erroneous transmitted information with error prediction.

    Science.gov (United States)

    Allam, Ahmed M; Abbas, Hazem M

    2010-12-01

    Neural cryptography deals with the problem of "key exchange" between two neural networks using the mutual learning concept. The two networks exchange their outputs (in bits) and the key between the two communicating parties is eventually represented in the final learned weights, when the two networks are said to be synchronized. Security of neural synchronization is put at risk if an attacker is capable of synchronizing with any of the two parties during the training process. Therefore, diminishing the probability of such a threat improves the reliability of exchanging the output bits through a public channel. The synchronization with feedback algorithm is one of the existing algorithms that enhances the security of neural cryptography. This paper proposes three new algorithms to enhance the mutual learning process. They mainly depend on disrupting the attacker's confidence in the exchanged outputs and input patterns during training. The first algorithm is called "Do not Trust My Partner" (DTMP), which relies on one party sending erroneous output bits, with the other party being capable of predicting and correcting this error. The second algorithm is called "Synchronization with Common Secret Feedback" (SCSFB), where inputs are kept partially secret and the attacker has to train its network on input patterns that are different from the training sets used by the communicating parties. The third algorithm is a hybrid technique combining the features of the DTMP and SCSFB. The proposed approaches are shown to outperform the synchronization with feedback algorithm in the time needed for the parties to synchronize.

  7. Neural cryptography with feedback.

    Science.gov (United States)

    Ruttor, Andreas; Kinzel, Wolfgang; Shacham, Lanir; Kanter, Ido

    2004-04-01

    Neural cryptography is based on a competition between attractive and repulsive stochastic forces. A feedback mechanism is added to neural cryptography which increases the repulsive forces. Using numerical simulations and an analytic approach, the probability of a successful attack is calculated for different model parameters. Scaling laws are derived which show that feedback improves the security of the system. In addition, a network with feedback generates a pseudorandom bit sequence which can be used to encrypt and decrypt a secret message.

  8. Calculator Cryptography.

    Science.gov (United States)

    Hall, Matthew

    2003-01-01

    Uses cryptography to demonstrate the importance of algebra and the use of technology as an effective real application of mathematics. Explains simple encoding and decoding of messages for student learning of modular arithmetic. This elementary encounter with cryptography along with its historical and modern background serves to motivate student…

  9. Contemporary cryptography

    CERN Document Server

    Oppliger, Rolf

    2011-01-01

    Whether you're new to the field or looking to broaden your knowledge of contemporary cryptography, this newly revised edition of an Artech House classic puts all aspects of this important topic into perspective. Delivering an accurate introduction to the current state-of-the-art in modern cryptography, the book offers you an in-depth understanding of essential tools and applications to help you with your daily work. The second edition has been reorganized and expanded, providing mathematical fundamentals and important cryptography principles in the appropriate appendixes, rather than summarize

  10. Conventional Cryptography.

    Science.gov (United States)

    Wright, Marie A.

    1993-01-01

    Cryptography is the science that renders data unintelligible to prevent its unauthorized disclosure or modification. Presents an application of matrices used in linear transformations to illustrate a cryptographic system. An example is provided. (17 references) (MDH)
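
    The matrix-based system described is in the spirit of a Hill cipher: blocks of letters are multiplied by an invertible matrix modulo 26. A small sketch with a textbook key (an assumption, not necessarily the article's example):

```python
# Hill-style cipher: encryption is a linear transformation of letter
# vectors modulo 26. The key matrix must be invertible mod 26.
KEY = [[3, 3], [2, 5]]         # det = 9, gcd(9, 26) = 1, so invertible
KEY_INV = [[15, 17], [20, 9]]  # KEY^{-1} mod 26

def apply(m, pair):
    """Multiply a 2x2 matrix by a 2-vector, mod 26."""
    return [(m[0][0] * pair[0] + m[0][1] * pair[1]) % 26,
            (m[1][0] * pair[0] + m[1][1] * pair[1]) % 26]

def transform(text, m):
    """Encrypt or decrypt uppercase text of even length, two letters at a time."""
    nums = [ord(c) - 65 for c in text]
    out = []
    for i in range(0, len(nums), 2):
        out += apply(m, nums[i:i + 2])
    return "".join(chr(n + 65) for n in out)

ct = transform("HELP", KEY)          # -> "HIAT"
pt = transform(ct, KEY_INV)          # -> "HELP"
```

    Decryption simply applies the inverse matrix, so the same `transform` routine serves both directions.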

  11. The Improvement of Flip (2, 2) Visual Cryptography Images Using Two Key Images

    Directory of Open Access Journals (Sweden)

    Ratna Dewi

    2016-09-01

    Full Text Available Flip (2, 2) Visual Cryptography (FVC) is one of the techniques used to encrypt two secret images into two dual-purpose transparencies. The two transparencies can be sent to the intended person. The first secret image can be obtained by stacking the two transparencies, and the second secret image can be obtained by stacking one transparency with the other transparency flipped. Unfortunately, the resulting decryption still has noise, and the quality of the decrypted secret images is not the same as that of the original secret images. This article proposes a new algorithm to improve the quality of the decrypted secret images. In this process, the two secret images from the decryption process are compared with the two original secret images. The difference value of each pixel, computed by subtracting the decrypted image from the original secret image, is inserted into the two key images. The experimental results of this improvement show good similarity: the noise in the decryption process is eliminated, so the two reconstructed secret images are similar to the original secret images.

  12. Application of AVK and selective encryption in improving performance of quantum cryptography and networks

    International Nuclear Information System (INIS)

    Bhunia, C.T.

    2006-07-01

    The subject of quantum cryptography has emerged as an important area of research. Reported theoretical and practical investigations have conclusively established reliable quantum key distribution (QKD) protocols with a higher level of security. For perfect security, the implementation of a time-variant key is essential. The cost and operation involved in using quantum key distribution to distribute a time-variant key from session to session or message to message have yet to be addressed from an implementation angle, and doing so is understood to be hard with currently available technology. Besides, disadvantages on the side of quantum cryptanalysis, in the form of 'quantum cheating' and quantum error, are demonstrated in the literature. This calls for an investigation of an affordable hybrid solution using QKD with conventional classical methods of key distribution to implement a time-variant key. The paper proposes a hybrid solution towards this investigation. The solutions suggested will improve the performance of computer networks for secure transport of data in general. (author)
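
    The time-variant (automatic variable) key idea can be sketched with the simple evolution rule key_{i+1} = key_i XOR data_i; the paper's exact construction may differ, so treat this as an illustrative assumption:

```python
# Automatic variable key (AVK) sketch: the session key evolves as a
# function of previously exchanged data, so it never stays fixed.
def xor_bytes(a, b):
    """Byte-wise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def send_sessions(initial_key, messages):
    """Sender: encrypt each message with the current key, then evolve the key."""
    key, ciphertexts = initial_key, []
    for m in messages:
        ciphertexts.append(xor_bytes(key, m))  # toy XOR 'encryption'
        key = xor_bytes(key, m)                # next session key depends on past data
    return ciphertexts

def recv_sessions(initial_key, ciphertexts):
    """Receiver evolves the key in lockstep and recovers the messages."""
    key, messages = initial_key, []
    for c in ciphertexts:
        m = xor_bytes(key, c)
        messages.append(m)
        key = xor_bytes(key, m)
    return messages

msgs = [b"datadata", b"moredata"]
cts = send_sessions(b"init_key", msgs)
assert recv_sessions(b"init_key", cts) == msgs
```

    Only the initial key needs to be distributed (e.g. via QKD); subsequent keys are derived on both sides without further key transport.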

  13. Quantum cryptography

    CERN Document Server

    Gilbert, Gerald; Hamrick, Michael

    2013-01-01

    This book provides a detailed account of the theory and practice of quantum cryptography. Suitable as the basis for a course in the subject at the graduate level, it crosses the disciplines of physics, mathematics, computer science and engineering. The theoretical and experimental aspects of the subject are derived from first principles, and attention is devoted to the practical development of realistic quantum communications systems. The book also includes a comprehensive analysis of practical quantum cryptography systems implemented in actual physical environments via either free-space or fiber-optic cable quantum channels. This book will be a valuable resource for graduate students, as well as professional scientists and engineers, who desire an introduction to the field that will enable them to undertake research in quantum cryptography. It will also be a useful reference for researchers who are already active in the field, and for academic faculty members who are teaching courses in quantum information s...

  14. Achieving Higher-Fidelity Conjunction Analyses Using Cryptography to Improve Information Sharing

    Science.gov (United States)

    2014-01-01

    debris—the man-made orbital junk that represents a collision risk to operational satellites—is a growing threat that will increasingly affect future space...

  15. A NEW ERA OF CRYPTOGRAPHY: QUANTUM CRYPTOGRAPHY

    OpenAIRE

    Sandeepak Bhandari

    2016-01-01

    ABSTRACT Security is the first priority in today's digital world for secure communication between sender and receiver. Various cryptography techniques have been developed over time for secure communication. Quantum cryptography is one of the latest and most advanced cryptography techniques; it is different from all other cryptography techniques and more secure. It is based on quantum physics, from which it takes its name, which makes it more secure than all other cryptography and unbreakable. In this paper about...

  16. Cryptography Basics

    DEFF Research Database (Denmark)

    Wattenhofer, Roger; Förster, Klaus-Tycho

    2017-01-01

    Public-key cryptography is one of the biggest scientific achievements of the last century. Two people that never met before can establish a common secret in plain sight? Sounds like pure magic! The idea of this chapter is to reveal some of the tricks of this “crypto magic”. This chapter is not ta...

  17. Chocolate Key Cryptography

    Science.gov (United States)

    Bachman, Dale J.; Brown, Ezra A.; Norton, Anderson H.

    2010-01-01

    Cryptography is the science of hidden or secret writing. More generally, cryptography refers to the science of safeguarding information. Cryptography allows people to use a public medium such as the Internet to transmit private information securely, thus enabling a whole range of conveniences, from online shopping to personally printed movie…

  18. An improved three party authenticated key exchange protocol using hash function and elliptic curve cryptography for mobile-commerce environments

    Directory of Open Access Journals (Sweden)

    S.K. Hafizul Islam

    2017-07-01

    Full Text Available In the literature, many three-party authenticated key exchange (3PAKE) protocols have been put forward to establish a secure session key between two users with the help of a trusted server. The computed session key ensures secure message exchange between the users over any insecure communication network. In this paper, we identify some deficiencies in Tan's 3PAKE protocol and then devise an improved 3PAKE protocol without symmetric key en/decryption techniques for mobile-commerce environments. The proposed protocol is based on elliptic curve cryptography and a one-way cryptographic hash function. In order to validate the security of the proposed 3PAKE protocol, we have used the widely accepted AVISPA software, whose results confirm that the proposed protocol is secure against active and passive attacks, including replay and man-in-the-middle attacks. The proposed protocol is not only secure according to the AVISPA software, but is also secure against numerous relevant security attacks such as the man-in-the-middle attack, impersonation attack, parallel attack, key-compromise impersonation attack, etc. In addition, our protocol is designed with lower computation cost than other relevant protocols. Therefore, the proposed protocol is more efficient and suitable for practical use than other protocols in mobile-commerce environments.
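
    The ECC primitive such protocols rest on is scalar multiplication on an elliptic curve. A toy Diffie-Hellman exchange over the textbook curve y^2 = x^3 + 2x + 2 over F_17 (illustrative parameters only, not the paper's protocol):

```python
# Toy elliptic-curve Diffie-Hellman on a tiny curve (for illustration only;
# real deployments use standardized curves with ~256-bit fields).
P, A = 17, 2        # field prime and curve coefficient a (b = 2)
G = (5, 1)          # generator; its subgroup has order 19

def add(p1, p2):
    """Group law: add two points (None is the point at infinity)."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                                   # inverses sum to infinity
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def mul(k, p):
    """Double-and-add scalar multiplication k*p."""
    r = None
    while k:
        if k & 1:
            r = add(r, p)
        p = add(p, p)
        k >>= 1
    return r

a, b = 3, 7                      # the two parties' secret scalars
shared1 = mul(a, mul(b, G))      # party 1 computes a * (b*G)
shared2 = mul(b, mul(a, G))      # party 2 computes b * (a*G)
assert shared1 == shared2        # both derive the same shared point
```

    A hash of the shared point's coordinates would then serve as the session key; 3PAKE protocols add server-assisted authentication on top of this primitive.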

  19. Coding and cryptography synergy for a robust communication

    CERN Document Server

    Zivic, Natasa

    2013-01-01

    This book presents the benefits of the synergetic effect of the combination of coding and cryptography. It introduces new directions for the interoperability between the components of a communication system. Coding and cryptography are standard components in today's distributed systems. The integration of cryptography into coding aspects is very interesting, as the usage of cryptography will be in common use, even in industrial applications. The book is based on new developments of coding and cryptography, which use real numbers to express reliability values of bits instead of the binary values 0 and 1. The presented methods are novel and designed for noisy communication, which doesn't allow the successful use of cryptography. The rate of successful verifications is improved essentially, not only for standard or "hard" verification, but even more after the introduction of "soft" verification. A security analysis shows the impact on the security. Information security and cryptography follow the late developments of c...

  20. Broadband Quantum Cryptography

    CERN Document Server

    Rogers, Daniel

    2010-01-01

    Quantum cryptography is a rapidly developing field that draws from a number of disciplines, from quantum optics to information theory to electrical engineering. By combining some fundamental quantum mechanical principles of single photons with various aspects of information theory, quantum cryptography represents a fundamental shift in the basis for security from numerical complexity to the fundamental physical nature of the communications channel. As such, it promises the holy grail of data security: theoretically unbreakable encryption. Of course, implementing quantum cryptography in real br

  1. Post-quantum cryptography

    Science.gov (United States)

    Bernstein, Daniel J.; Lange, Tanja

    2017-09-01

    Cryptography is essential for the security of online communication, cars and implanted medical devices. However, many commonly used cryptosystems will be completely broken once large quantum computers exist. Post-quantum cryptography is cryptography under the assumption that the attacker has a large quantum computer; post-quantum cryptosystems strive to remain secure even in this scenario. This relatively young research area has seen some successes in identifying mathematical operations for which quantum algorithms offer little advantage in speed, and then building cryptographic systems around those. The central challenge in post-quantum cryptography is to meet demands for cryptographic usability and flexibility without sacrificing confidence.

  2. Introduction to modern cryptography

    CERN Document Server

    Katz, Jonathan

    2014-01-01

    Praise for the First Edition: "This book is a comprehensive, rigorous introduction to what the authors name 'modern' cryptography. … a novel approach to how cryptography is taught, replacing the older, construction-based approach. … The concepts are clearly stated, both in an intuitive fashion and formally. … I would heartily recommend this book to anyone who is interested in cryptography. … The exercises are challenging and interesting, and can benefit readers of all academic levels." -IACR Book Reviews, January 2010. "Over the past 30 years, cryptography has been transformed from a mysterious

  3. Post-quantum cryptography.

    Science.gov (United States)

    Bernstein, Daniel J; Lange, Tanja

    2017-09-13

    Cryptography is essential for the security of online communication, cars and implanted medical devices. However, many commonly used cryptosystems will be completely broken once large quantum computers exist. Post-quantum cryptography is cryptography under the assumption that the attacker has a large quantum computer; post-quantum cryptosystems strive to remain secure even in this scenario. This relatively young research area has seen some successes in identifying mathematical operations for which quantum algorithms offer little advantage in speed, and then building cryptographic systems around those. The central challenge in post-quantum cryptography is to meet demands for cryptographic usability and flexibility without sacrificing confidence.

  4. Arithmetic and Cryptography

    Indian Academy of Sciences (India)

    Keywords. Number theory; arithmetic; cryptography; RSA; public key cryptosystem; prime numbers; factorization; algorithms; residue class ring; theoretical computer science; internet security; information theory; trapdoor one-way function.

  5. Randomized dynamical decoupling strategies and improved one-way key rates for quantum cryptography

    International Nuclear Information System (INIS)

    Kern, Oliver

    2009-01-01

    noisy preprocessing) followed by the use of a structured block code, higher secure key rates may be obtained. For the BB84 protocol it is shown that iterating the combined preprocessing leads to an even higher gain. In order to speed up the numerical evaluation of the key rates, results of representation theory come into play. If a coherent version of the protocol is considered, the block code used in the preprocessing stage becomes a concatenated stabilizer code which is obtained by concatenating an outer random code with an inner deterministic one. This concatenated stabilizer code is used to compute an improved lower bound on the quantum capacity of a certain quantum channel (the so-called qubit depolarizing channel). (orig.)

  6. Randomized dynamical decoupling strategies and improved one-way key rates for quantum cryptography

    Energy Technology Data Exchange (ETDEWEB)

    Kern, Oliver

    2009-05-25

    noisy preprocessing) followed by the use of a structured block code, higher secure key rates may be obtained. For the BB84 protocol it is shown that iterating the combined preprocessing leads to an even higher gain. In order to speed up the numerical evaluation of the key rates, results of representation theory come into play. If a coherent version of the protocol is considered, the block code used in the preprocessing stage becomes a concatenated stabilizer code which is obtained by concatenating an outer random code with an inner deterministic one. This concatenated stabilizer code is used to compute an improved lower bound on the quantum capacity of a certain quantum channel (the so-called qubit depolarizing channel). (orig.)

  7. Improved Bat Algorithm Applied to Multilevel Image Thresholding

    Directory of Open Access Journals (Sweden)

    Adis Alihodzic

    2014-01-01

    Full Text Available Multilevel image thresholding is a very important image processing technique that is used as a basis for image segmentation and further higher-level processing. However, the required computational time for exhaustive search grows exponentially with the number of desired thresholds. Swarm intelligence metaheuristics are well known as successful and efficient optimization methods for intractable problems. In this paper, we adjusted one of the latest swarm intelligence algorithms, the bat algorithm, for the multilevel image thresholding problem. The results of testing on standard benchmark images show that the bat algorithm is comparable with other state-of-the-art algorithms. We improved the standard bat algorithm, with modifications that add elements from differential evolution and from the artificial bee colony algorithm. Our proposed improved bat algorithm proved to be better than five other state-of-the-art algorithms, improving the quality of results in all cases and significantly improving convergence speed.
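
    The exhaustive search whose exponential cost motivates such metaheuristics can be sketched for a tiny histogram using Otsu's between-class variance criterion (illustrative data and criterion, not the paper's benchmarks):

```python
# Brute-force multilevel Otsu thresholding on a toy 6-level histogram:
# try every pair of thresholds and keep the one maximizing the
# between-class variance. Cost explodes as levels and thresholds grow.
from itertools import combinations

hist = [10, 0, 0, 5, 0, 10]  # pixel counts for gray levels 0..5
total = sum(hist)

def between_class_variance(thresholds):
    """Otsu criterion: sum over classes of w_c * (mu_c - mu_total)^2."""
    bounds = [0] + [t + 1 for t in thresholds] + [len(hist)]
    mu_total = sum(g * h for g, h in enumerate(hist)) / total
    var = 0.0
    for lo, hi in zip(bounds, bounds[1:]):
        w = sum(hist[lo:hi]) / total          # class probability
        if w == 0:
            continue
        mu = sum(g * hist[g] for g in range(lo, hi)) / (w * total)
        var += w * (mu - mu_total) ** 2
    return var

best = max(combinations(range(len(hist) - 1), 2),
           key=between_class_variance)
```

    A metaheuristic such as the bat algorithm replaces the `combinations` enumeration with a guided sampling of threshold vectors.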

  8. Public Key Cryptography.

    Science.gov (United States)

    Tapson, Frank

    1996-01-01

    Describes public key cryptography, also known as RSA, which is a system using two keys: one used to put a message into cipher and another used to decipher the message. Presents examples using small prime numbers. (MKR)
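
    The small-prime examples mentioned can be reproduced with textbook RSA (toy numbers only; real RSA needs large primes and padding):

```python
# Textbook RSA with small primes (the classic p = 61, q = 53 example).
p, q = 61, 53
n = p * q                # 3233, the public modulus
phi = (p - 1) * (q - 1)  # 3120
e = 17                   # public exponent, coprime to phi
d = pow(e, -1, phi)      # private exponent: 2753

m = 65                   # message, encoded as a number < n
c = pow(m, e, n)         # encrypt: c = m^e mod n  -> 2790
assert pow(c, d, n) == m # decrypt: m = c^d mod n
```

    The public key is (n, e), the private key is d; security rests on the difficulty of factoring n.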

  9. Electrocardiogram signal denoising based on a new improved wavelet thresholding

    Science.gov (United States)

    Han, Guoqiang; Xu, Zhijun

    2016-08-01

    Good quality electrocardiogram (ECG) is utilized by physicians for the interpretation and identification of physiological and pathological phenomena. In general, ECG signals may be mixed with various noises, such as baseline wander, power line interference, and electromagnetic interference, in the gathering and recording process. As ECG signals are non-stationary physiological signals, the wavelet transform has proved to be an effective tool for discarding noise from corrupted signals. A new compromising threshold function, a sigmoid function-based thresholding scheme, is adopted in processing ECG signals. Compared with other methods, such as hard/soft thresholding or other existing thresholding functions, the new algorithm has many advantages in the noise reduction of ECG signals. It perfectly overcomes the discontinuity at ±T of hard thresholding and reduces the fixed deviation of soft thresholding. The improved wavelet thresholding denoising is shown to be more efficient than existing algorithms in ECG signal denoising. The signal to noise ratio, mean square error, and percent root mean square difference are calculated to verify the denoising performance as quantitative tools. The experimental results reveal that the waves, including the P, Q, R, and S waves, of ECG signals after denoising coincide with the original ECG signals when employing the new proposed method.
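
    The thresholding rules discussed can be sketched as follows; the sigmoid-based compromise shown here is an illustrative choice, not necessarily the paper's exact function:

```python
# Wavelet-coefficient thresholding rules: hard (discontinuous at +-T),
# soft (continuous but biased by T for large coefficients), and a
# sigmoid-based compromise that is continuous yet nearly unbiased.
import math

T = 1.0  # threshold

def hard(w):
    """Keep the coefficient unchanged if |w| > T, else zero it (jumps at +-T)."""
    return w if abs(w) > T else 0.0

def soft(w):
    """Shrink every surviving coefficient toward zero by T."""
    return math.copysign(max(abs(w) - T, 0.0), w)

def sigmoid_thresh(w, k=10.0):
    """Smoothly suppress |w| < T while approaching w (no bias) for large |w|."""
    return w / (1.0 + math.exp(-k * (abs(w) - T)))
```

    Applied to each detail-band wavelet coefficient before reconstruction, the sigmoid rule avoids both the discontinuity of hard thresholding and the fixed deviation of soft thresholding.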

  10. Quantum cryptography communication technology

    International Nuclear Information System (INIS)

    Cho, Jai Wan; Choi, Young Soo; Lee, Jae Chul; Choi, Yu Rak; Jung, Gwang Il; Jung, Jong Eun; Hong, Seok Boong; Koo, In Soo

    2007-09-01

    Quantum cryptography communication based on quantum mechanics provides unconditional security between two users. Even though huge advances have been made since 1984, having a complete system is still far away. In real quantum cryptography communication systems, the unconditional security level is lowered by imperfections in the communication unit. It is important to investigate the unconditional security of quantum communication protocols based on these experimental results and implementation examples from around the world. The Japanese report titled 'Investigation report on the worldwide trends of quantum cryptography communications systems' was translated and summarized in this report. An unconditional security theory of quantum cryptography and real implementation examples in the domestic area are also investigated. The goal of the report is to make quantum cryptography communication a more useful and reliable alternative telecommunication infrastructure as part of the cyber security program for the class 1-E communication system of a nuclear power plant. Another goal of this report is to provide a quantitative decision basis on quantum cryptography communication when this secure communication system will be used in the class 1-E communication channel of the nuclear power plant.

  11. Halftone visual cryptography.

    Science.gov (United States)

    Zhou, Zhi; Arce, Gonzalo R; Di Crescenzo, Giovanni

    2006-08-01

    Visual cryptography encodes a secret binary image (SI) into n shares of random binary patterns. If the shares are xeroxed onto transparencies, the secret image can be visually decoded by superimposing a qualified subset of transparencies, but no secret information can be obtained from the superposition of a forbidden subset. The binary patterns of the n shares, however, have no visual meaning and hinder the objectives of visual cryptography. Extended visual cryptography [1] was proposed recently to construct meaningful binary images as shares using hypergraph colourings, but the visual quality is poor. In this paper, a novel technique named halftone visual cryptography is proposed to achieve visual cryptography via halftoning. Based on blue-noise dithering principles, the proposed method utilizes the void and cluster algorithm [2] to encode a secret binary image into n halftone shares (images) carrying significant visual information. The simulation shows that the visual quality of the obtained halftone shares is observably better than that attained by any available visual cryptography method known to date.
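
    The basic (2, 2) visual cryptography construction behind these schemes can be sketched directly: each secret pixel expands into two subpixels per share, and stacking the shares (logical OR, since printed black wins) reveals the secret. This is the classical scheme, not the halftone method of the paper:

```python
# Minimal (2, 2) visual cryptography: each share alone is a uniformly
# random pattern; superimposing both reveals the secret image.
import random

def make_shares(secret):
    """secret: rows of 0 (white) / 1 (black) pixels. Returns two shares."""
    s1, s2 = [], []
    for row in secret:
        r1, r2 = [], []
        for pix in row:
            pat = random.choice(([0, 1], [1, 0]))            # random 2-subpixel pattern
            r1 += pat
            r2 += pat if pix == 0 else [1 - b for b in pat]  # same = white, complement = black
        s1.append(r1)
        s2.append(r2)
    return s1, s2

def stack(s1, s2):
    """Superimpose transparencies: a subpixel is black if either share is black."""
    return [[a | b for a, b in zip(r1, r2)] for r1, r2 in zip(s1, s2)]

secret = [[1, 0], [0, 1]]
a, b = make_shares(secret)
overlay = stack(a, b)
# black secret pixels -> both subpixels black; white -> one black, one white
```

    The contrast loss (white pixels come out half-black) is exactly what halftone visual cryptography aims to manage gracefully.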

  12. Quantum cryptography communication technology

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jai Wan; Choi, Young Soo; Lee, Jae Chul; Choi, Yu Rak; Jung, Gwang Il; Jung, Jong Eun; Hong, Seok Boong; Koo, In Soo

    2007-09-15

    Quantum cryptography communication based on quantum mechanics provides unconditional security between two users. Even though huge advances have been made since 1984, having a complete system is still far away. In real quantum cryptography communication systems, the unconditional security level is lowered by imperfections in the communication unit. It is important to investigate the unconditional security of quantum communication protocols based on these experimental results and implementation examples from around the world. The Japanese report titled 'Investigation report on the worldwide trends of quantum cryptography communications systems' was translated and summarized in this report. An unconditional security theory of quantum cryptography and real implementation examples in the domestic area are also investigated. The goal of the report is to make quantum cryptography communication a more useful and reliable alternative telecommunication infrastructure as part of the cyber security program for the class 1-E communication system of a nuclear power plant. Another goal of this report is to provide a quantitative decision basis on quantum cryptography communication when this secure communication system will be used in the class 1-E communication channel of the nuclear power plant.

  13. Visual cryptography for image processing and security theory, methods, and applications

    CERN Document Server

    Liu, Feng

    2014-01-01

    This unique book describes the fundamental concepts, theories and practice of visual cryptography. The design, construction, analysis, and application of visual cryptography schemes (VCSs) are discussed in detail. Original, cutting-edge research is presented on probabilistic, size invariant, threshold, concolorous, and cheating immune VCS. Features: provides a thorough introduction to the field; examines various common problems in visual cryptography, including the alignment, flipping, cheating, distortion, and thin line problems; reviews a range of VCSs, including XOR-based visual cryptograph

  14. An Improved Digital Signature Protocol to Multi-User Broadcast Authentication Based on Elliptic Curve Cryptography in Wireless Sensor Networks (WSNs)

    Directory of Open Access Journals (Sweden)

    Hamed Bashirpour

    2018-03-01

    Full Text Available In wireless sensor networks (WSNs), users can use broadcast authentication mechanisms to connect to the target network and disseminate their messages within the network. Since data transfer in sensor networks is wireless, attackers can easily eavesdrop on deployed sensor nodes and the data sent between them, or modify the content of eavesdropped data and inject false data into the sensor network. Hence, implementing message authentication mechanisms (in order to avoid messages being changed or injected into the network) is essential for wireless sensor networks. In this paper, we present an improved protocol based on elliptic curve cryptography (ECC) to accelerate authentication of multi-user message broadcasting. In comparison with previous ECC-based schemes, the complexity and computational overhead of the proposed scheme are significantly decreased. The proposed scheme also supports user anonymity, an important property in broadcast authentication schemes for WSNs to preserve user privacy and prevent user tracking.

  15. Post-Quantum Cryptography

    DEFF Research Database (Denmark)

    Gauthier Umana, Valérie

    The security of almost all the public-key cryptosystems used in practice depends on the fact that the prime factorization of a number and the discrete logarithm are hard problems to solve. In 1994, Peter Shor found a polynomial-time algorithm which solves these two problems using quantum computers...... The public key cryptosystems that can resist these emerging attacks are called quantum resistant or post-quantum cryptosystems. There are mainly four classes of public-key cryptography that are believed to resist classical and quantum attacks: code-based cryptography, hash-based cryptography, lattice...... part, we first give an overview of hash-based signature schemes. Their security is based on the collision resistance of a hash function and is a good quantum-resistant alternative to the signature schemes used today. We show that several existing proposals of how to make multiple-time signature schemes...

  16. Threshold for improvement in insulin sensitivity with adolescent weight loss.

    Science.gov (United States)

    Abrams, Pamela; Levitt Katz, Lorraine E; Moore, Reneé H; Xanthopoulos, Melissa S; Bishop-Gilyard, Chanelle T; Wadden, Thomas A; Berkowitz, Robert I

    2013-09-01

    To assess the association of weight loss and insulin sensitivity, glucose tolerance, and metabolic syndrome (MS) in obese adolescents following weight loss treatment, and to determine the threshold amount of weight loss required to observe improvements in these measures. A randomized, controlled behavioral weight loss trial was conducted with 113 obese adolescents. Changes in fasting insulin, homeostasis model assessment of insulin resistance, whole body insulin sensitivity index (WBISI), body mass index (BMI), and MS criteria were assessed at baseline and at month 4. There was significant improvement in all measures of insulin sensitivity at month 4. Mean fasting insulin dropped from 22.3 to 16.6 μU/mL. An approximate decrease in BMI of 8% was the threshold level at which insulin sensitivity improved. As more weight loss programs are designed for obese adolescents, it will be important to have reasonable weight loss goals that will yield improvements in metabolic and cardiovascular disease risk factors. Copyright © 2013. Published by Mosby, Inc.

  17. Dynamics of neural cryptography

    International Nuclear Information System (INIS)

    Ruttor, Andreas; Kinzel, Wolfgang; Kanter, Ido

    2007-01-01

    Synchronization of neural networks has been used for public channel protocols in cryptography. In the case of tree parity machines the dynamics of both bidirectional synchronization and unidirectional learning is driven by attractive and repulsive stochastic forces. Thus it can be described well by a random walk model for the overlap between participating neural networks. For that purpose transition probabilities and scaling laws for the step sizes are derived analytically. Both these calculations as well as numerical simulations show that bidirectional interaction leads to full synchronization on average. In contrast, successful learning is only possible by means of fluctuations. Consequently, synchronization is much faster than learning, which is essential for the security of the neural key-exchange protocol. However, this qualitative difference between bidirectional and unidirectional interaction vanishes if tree parity machines with more than three hidden units are used, so that those neural networks are not suitable for neural cryptography. In addition, the effective number of keys which can be generated by the neural key-exchange protocol is calculated using the entropy of the weight distribution. As this quantity increases exponentially with the system size, brute-force attacks on neural cryptography can easily be made unfeasible
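The tree-parity-machine dynamics summarized above can be sketched as a toy bidirectional synchronization loop. The parameters K, N, L, the seed, and the step cap are illustrative choices, far smaller than the values studied in the paper:

```python
import random

# Toy tree parity machines: K hidden units, N inputs each, synaptic depth L.
K, N, L = 3, 4, 3

def outputs(w, x):
    """Hidden-unit signs sigma and the total parity output tau."""
    sigma = [1 if sum(wi * xi for wi, xi in zip(w[k], x[k])) >= 0 else -1
             for k in range(K)]
    return sigma, sigma[0] * sigma[1] * sigma[2]

def hebbian(w, x, sigma, tau):
    """Update only hidden units that agree with the total output,
    keeping every weight clipped to [-L, L]."""
    for k in range(K):
        if sigma[k] == tau:
            w[k] = [max(-L, min(L, wi + sigma[k] * xi))
                    for wi, xi in zip(w[k], x[k])]

rng = random.Random(1)
wa = [[rng.randint(-L, L) for _ in range(N)] for _ in range(K)]
wb = [[rng.randint(-L, L) for _ in range(N)] for _ in range(K)]

steps = 0
while wa != wb and steps < 50000:
    # both parties see the same public random inputs each round
    x = [[rng.choice((-1, 1)) for _ in range(N)] for _ in range(K)]
    sa, ta = outputs(wa, x)
    sb, tb = outputs(wb, x)
    if ta == tb:  # bidirectional rule: learn only when the outputs agree
        hebbian(wa, x, sa, ta)
        hebbian(wb, x, sb, tb)
    steps += 1
```

With small parameters like these the two machines typically reach identical weight matrices (full synchronization) well before the step cap, while a passive attacker restricted to unidirectional learning lags behind; the cap merely guards the sketch against pathological seeds.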

  18. Low power cryptography

    International Nuclear Information System (INIS)

    Kitsos, P; Koufopavlou, O; Selimis, G; Sklavos, N

    2005-01-01

    Today more and more sensitive data is stored digitally. Bank accounts, medical records and personal emails are some categories of data that must be kept secure. The science of cryptography tries to counter this lack of security. Data confidentiality, authentication, non-repudiation and data integrity are some of the main parts of cryptography. The evolution of cryptography has led to very complex cryptographic models that could not have been implemented a few years ago. The use of systems with increasing complexity, which are usually more secure, results in low throughput rates and higher energy consumption. However, the evolution of a cipher has no practical impact if it has only a theoretical background. Every encryption algorithm should exploit as much as possible the conditions of the specific system without ignoring the physical, area and timing limitations. This fact requires new design approaches for secure and reliable cryptosystems. A main issue in the design of cryptosystems is the reduction of power consumption, especially for portable systems such as smart cards. (invited paper)

  19. Dynamics of neural cryptography.

    Science.gov (United States)

    Ruttor, Andreas; Kinzel, Wolfgang; Kanter, Ido

    2007-05-01

    Synchronization of neural networks has been used for public channel protocols in cryptography. In the case of tree parity machines the dynamics of both bidirectional synchronization and unidirectional learning is driven by attractive and repulsive stochastic forces. Thus it can be described well by a random walk model for the overlap between participating neural networks. For that purpose transition probabilities and scaling laws for the step sizes are derived analytically. Both these calculations as well as numerical simulations show that bidirectional interaction leads to full synchronization on average. In contrast, successful learning is only possible by means of fluctuations. Consequently, synchronization is much faster than learning, which is essential for the security of the neural key-exchange protocol. However, this qualitative difference between bidirectional and unidirectional interaction vanishes if tree parity machines with more than three hidden units are used, so that those neural networks are not suitable for neural cryptography. In addition, the effective number of keys which can be generated by the neural key-exchange protocol is calculated using the entropy of the weight distribution. As this quantity increases exponentially with the system size, brute-force attacks on neural cryptography can easily be made unfeasible.

  1. Cheating prevention in visual cryptography.

    Science.gov (United States)

    Hu, Chih-Ming; Tzeng, Wen-Guey

    2007-01-01

    Visual cryptography (VC) is a method of encrypting a secret image into shares such that stacking a sufficient number of shares reveals the secret image. Shares are usually presented in transparencies, and each participant holds a transparency. Most of the previous research work on VC focuses on improving two parameters: pixel expansion and contrast. In this paper, we studied the cheating problem in VC and extended VC. We considered the attacks of malicious adversaries who may deviate from the scheme in any way. We presented three cheating methods and applied them to attack existing VC or extended VC schemes. We improved one cheat-preventing scheme. We proposed a generic method that converts a VCS to another VCS that has the property of cheating prevention. The overhead of the conversion is near optimal in both contrast degression and pixel expansion.

  2. Genetic attack on neural cryptography.

    Science.gov (United States)

    Ruttor, Andreas; Kinzel, Wolfgang; Naeh, Rivka; Kanter, Ido

    2006-03-01

    Different scaling properties for the complexity of bidirectional synchronization and unidirectional learning are essential for the security of neural cryptography. Incrementing the synaptic depth of the networks increases the synchronization time only polynomially, but the success of the geometric attack is reduced exponentially and it clearly fails in the limit of infinite synaptic depth. This method is improved by adding a genetic algorithm, which selects the fittest neural networks. The probability of a successful genetic attack is calculated for different model parameters using numerical simulations. The results show that scaling laws observed in the case of other attacks hold for the improved algorithm, too. The number of networks needed for an effective attack grows exponentially with increasing synaptic depth. In addition, finite-size effects caused by Hebbian and anti-Hebbian learning are analyzed. These learning rules converge to the random walk rule if the synaptic depth is small compared to the square root of the system size.

  3. Genetic attack on neural cryptography

    International Nuclear Information System (INIS)

    Ruttor, Andreas; Kinzel, Wolfgang; Naeh, Rivka; Kanter, Ido

    2006-01-01

    Different scaling properties for the complexity of bidirectional synchronization and unidirectional learning are essential for the security of neural cryptography. Incrementing the synaptic depth of the networks increases the synchronization time only polynomially, but the success of the geometric attack is reduced exponentially and it clearly fails in the limit of infinite synaptic depth. This method is improved by adding a genetic algorithm, which selects the fittest neural networks. The probability of a successful genetic attack is calculated for different model parameters using numerical simulations. The results show that scaling laws observed in the case of other attacks hold for the improved algorithm, too. The number of networks needed for an effective attack grows exponentially with increasing synaptic depth. In addition, finite-size effects caused by Hebbian and anti-Hebbian learning are analyzed. These learning rules converge to the random walk rule if the synaptic depth is small compared to the square root of the system size

  4. Applied quantum cryptography

    International Nuclear Information System (INIS)

    Kollmitzer, Christian; Pivk, Mario

    2010-01-01

    Using the quantum properties of single photons to exchange binary keys between two partners for subsequent encryption of secret data is an absolutely novel technology. Only a few years ago quantum cryptography - or better: quantum key distribution - was the domain of basic research laboratories at universities. But during the last few years things changed. QKD left the laboratories and was picked up by more practically oriented teams that worked hard to develop a practically applicable technology out of the astonishing results of basic research. One major milestone towards a QKD technology was a large research and development project funded by the European Commission that aimed at combining quantum physics with the complementary technologies that are necessary to create a technical solution: electronics, software, and network components were added within the project SECOQC (Development of a Global Network for Secure Communication based on Quantum Cryptography), which teamed up all expertise on the European level to get a technology for future encryption. The practical application of QKD in a standard optical fibre network was demonstrated in October 2008 in Vienna, giving a glimpse of the future of secure communication. Although many steps still have to be taken in order to achieve a really mature technology, the cornerstone for future secure communication is already laid. QKD will not be the Holy Grail of security; it will not be able to solve all problems for evermore. But QKD has the potential to replace one of the weakest parts of symmetric encryption: the exchange of the key. It can be proven that the key exchange process cannot be corrupted and that keys that are generated and exchanged quantum cryptographically will be secure forever (as long as some additional conditions are kept). This book will show the state of the art of quantum cryptography and sketch how it can be implemented in standard communication infrastructure. The growing vulnerability of sensitive

  5. Counterfactual quantum cryptography.

    Science.gov (United States)

    Noh, Tae-Gon

    2009-12-04

    Quantum cryptography allows one to distribute a secret key between two remote parties using the fundamental principles of quantum mechanics. The well-established paradigm for quantum key distribution relies on the actual transmission of a signal particle through a quantum channel. In this Letter, we show that the task of secret key distribution can be accomplished even though the particle carrying the secret information is not in fact transmitted through the quantum channel. The proposed protocols can be implemented with current technologies and provide practical security advantages by eliminating the possibility that an eavesdropper can directly access the entire quantum system of each signal particle.

  6. Cryptography cracking codes

    CERN Document Server

    2014-01-01

    While cracking a code might seem like something few of us would encounter in our daily lives, it is actually far more prevalent than we may realize. Anyone who has had personal information taken because of a hacked email account can understand the need for cryptography and the importance of encryption: essentially, the need to code information to keep it safe. This detailed volume examines the logic and science behind various ciphers, their real world uses, how codes can be broken, and the use of technology in this oft-overlooked field.

  7. Autocompensating quantum cryptography

    International Nuclear Information System (INIS)

    Bethune, Donald S.; Risk, William P.

    2002-01-01

    Quantum cryptographic key distribution (QKD) uses extremely faint light pulses to carry quantum information between two parties (Alice and Bob), allowing them to generate a shared, secret cryptographic key. Autocompensating QKD systems automatically and passively compensate for uncontrolled time-dependent variations of the optical fibre properties by coding the information as a differential phase between orthogonally polarized components of a light pulse sent on a round trip through the fibre, reflected at mid-course using a Faraday mirror. We have built a prototype system based on standard telecom technology that achieves a privacy-amplified bit generation rate of ∼1000 bits s⁻¹ over a 10 km optical fibre link. Quantum cryptography is an example of an application that, by using quantum states of individual particles to represent information, accomplishes a practical task that is impossible using classical means. (author)

  8. Introduction to cryptography

    CERN Document Server

    Buchmann, Johannes A

    2004-01-01

    Cryptography is a key technology in electronic key systems. It is used to keep data secret, to digitally sign documents, for access control, and so on. Therefore, users should not only know how its techniques work, but they must also be able to estimate their efficiency and security. For this new edition, the author has updated the discussion of the security of encryption and signature schemes and recent advances in factoring and computing discrete logarithms. He has also added descriptions of time-memory trade-off attacks and algebraic attacks on block ciphers, the Advanced Encryption Standard, the Secure Hash Algorithm, secret sharing schemes, and undeniable and blind signatures. Johannes A. Buchmann is a Professor of Computer Science and Mathematics at the Technical University of Darmstadt, and the Associate Editor of the Journal of Cryptology. In 1985, he received the Feodor Lynen Fellowship of the Alexander von Humboldt Foundation. Furthermore, he has received the most prestigious award in science in Germany, the Leib...

  9. Lightweight cryptography for constrained devices

    DEFF Research Database (Denmark)

    Alippi, Cesare; Bogdanov, Andrey; Regazzoni, Francesco

    2014-01-01

    Lightweight cryptography is a rapidly evolving research field that responds to the request for security in resource constrained devices. This need arises from crucial pervasive IT applications, such as those based on RFID tags where cost and energy constraints drastically limit the solution...... complexity, with the consequence that traditional cryptography solutions become too costly to be implemented. In this paper, we survey design strategies and techniques suitable for implementing security primitives in constrained devices....

  10. Composability in quantum cryptography

    International Nuclear Information System (INIS)

    Mueller-Quade, Joern; Renner, Renato

    2009-01-01

    If we combine two secure cryptographic systems, is the resulting system still secure? Answering this question is highly nontrivial and has recently sparked a considerable research effort, in particular, in the area of classical cryptography. A central insight was that the answer to the question is yes, but only within a well-specified composability framework and for carefully chosen security definitions. In this article, we review several aspects of composability in the context of quantum cryptography. The first part is devoted to key distribution. We discuss the security criteria that a quantum key distribution (QKD) protocol must fulfill to allow its safe use within a larger security application (e.g. for secure message transmission); and we demonstrate, by an explicit example, what can go wrong if conventional (non-composable) security definitions are used. Finally, to illustrate the practical use of composability, we show how to generate a continuous key stream by sequentially composing rounds of a QKD protocol. In the second part, we take a more general point of view, which is necessary for the study of cryptographic situations involving, for example, mutually distrustful parties. We explain the universal composability (UC) framework and state the composition theorem that guarantees that secure protocols can securely be composed into larger applications. We focus on the secure composition of quantum protocols into unconditionally secure classical protocols. However, the resulting security definition is so strict that some tasks become impossible without additional security assumptions. Quantum bit commitment is impossible in the UC framework even with mere computational security. Similar problems arise in the quantum bounded storage model and we observe a trade-off between the UC and the use of the weakest possible security assumptions.

  11. Understanding and applying cryptography and data security

    CERN Document Server

    Elbirt, Adam J

    2009-01-01

    Introduction: A Brief History of Cryptography and Data Security; Cryptography and Data Security in the Modern World; Existing Texts; Book Organization. Symmetric-Key Cryptography: Cryptosystem Overview; The Modulo Operator; Greatest Common Divisor; The Ring Zm; Homework Problems. Symmetric-Key Cryptography: Substitution Ciphers: Basic Cryptanalysis; Shift Ciphers; Affine Ciphers; Homework Problems. Symmetric-Key Cryptography: Stream Ciphers: Random Numbers; The One-Time Pad; Key Stream Generators; Real-World Applications; Homework Problems. Symmetric-Key Cryptography: Block Ciphers: The Data Encryption Standard; The Advance

  12. Cryptography Engineering Design Principles and Practical Applications

    CERN Document Server

    Ferguson, Niels; Kohno, Tadayoshi

    2012-01-01

    The ultimate guide to cryptography, updated from an author team of the world's top cryptography experts. Cryptography is vital to keeping information safe, in an era when the formula to do so becomes more and more challenging. Written by a team of world-renowned cryptography experts, this essential guide is the definitive introduction to all major areas of cryptography: message security, key negotiation, and key management. You'll learn how to think like a cryptographer. You'll discover techniques for building cryptography into products from the start and you'll examine the many technical chan

  13. An Improved and Secure Biometric Authentication Scheme for Telecare Medicine Information Systems Based on Elliptic Curve Cryptography.

    Science.gov (United States)

    Chaudhry, Shehzad Ashraf; Mahmood, Khalid; Naqvi, Husnain; Khan, Muhammad Khurram

    2015-11-01

    Telecare medicine information system (TMIS) offers patients convenient and expedited healthcare services remotely, from anywhere. Patient security and privacy have emerged as key issues during remote access because of the underlying open architecture. An authentication scheme can verify the patient's as well as the TMIS server's legitimacy during remote healthcare services. To achieve security and privacy, a number of authentication schemes have been proposed. Very recently Lu et al. (J. Med. Syst. 39(3):1-8, 2015) proposed a biometric based three-factor authentication scheme for TMIS to remove the vulnerabilities of Arshad et al.'s (J. Med. Syst. 38(12):136, 2014) scheme. Further, they emphasized the robustness of their scheme against several attacks. However, in this paper we establish that Lu et al.'s scheme is vulnerable to numerous attacks, including (1) a patient anonymity violation attack, (2) a patient impersonation attack, and (3) a TMIS server impersonation attack. Furthermore, their scheme does not provide patient untraceability. We then propose an improvement of Lu et al.'s scheme. We have analyzed the security of the improved scheme using the popular automated tool ProVerif. The proposed scheme, while retaining the merits of Lu et al.'s scheme, is also robust against known attacks.

  14. Power-Split Hybrid Electric Vehicle Energy Management Based on Improved Logic Threshold Approach

    Directory of Open Access Journals (Sweden)

    Zhumu Fu

    2013-01-01

    Full Text Available We design an improved logic threshold approach to energy management for a power-split HEV assisted by an integrated starter generator (ISG). By combining the efficiency map and the optimum torque curve of the internal combustion engine (ICE) with the state of charge (SOC) of the batteries, the improved logic threshold controller first manages the ICE within its peak efficiency region. The electrical power demand is then established based on the ICE energy output. On that premise, a variable logic threshold value K is defined to achieve the power distribution between the ISG and the electric motor/generator (EMG). Finally, simulation models for the power-split HEV with the improved logic threshold controller are established in ADVISOR. Compared to the same power-split HEV with the plain logic threshold controller, the improved controller improves the battery power consumption, the ICE efficiency, the fuel consumption, and the motor driving system efficiency.
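The logic-threshold idea can be illustrated with a minimal rule-based power split. The thresholds, the charging bias, and the efficient-region bounds below are made-up numbers for illustration, not the paper's calibrated values, and the ISG/EMG split via the variable threshold K is omitted:

```python
def split_power(p_dem, soc, soc_low=0.4, ice_min=10.0, ice_max=60.0, p_chg=5.0):
    """Return (p_ice, p_elec) in kW; negative p_elec means the ICE/ISG is
    charging the battery. The ICE is held inside [ice_min, ice_max], a crude
    stand-in for its peak-efficiency region from the efficiency map."""
    if p_dem < ice_min and soc >= soc_low:
        return 0.0, p_dem                       # light load: electric-only
    target = p_dem + p_chg if soc < soc_low else p_dem
    p_ice = min(max(target, ice_min), ice_max)  # clamp ICE to efficient band
    return p_ice, p_dem - p_ice                 # electric path covers the rest

split_power(30.0, 0.7)  # cruising, healthy battery: ICE alone
split_power(80.0, 0.7)  # hard acceleration: ICE saturates, motor assists
split_power(20.0, 0.3)  # low SOC: ICE makes surplus power to recharge
```

By construction the ICE and electric powers always sum to the demand, so the rule only decides how the load is shared, never whether it is met.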

  15. Coding Theory, Cryptography and Related Areas

    DEFF Research Database (Denmark)

    Buchmann, Johannes; Stichtenoth, Henning; Tapia-Recillas, Horacio

    Proceedings of an International Conference on Coding Theory, Cryptography and Related Areas, held in Guanajuato, Mexico, in April 1998.

  16. Coding, cryptography and combinatorics

    CERN Document Server

    Niederreiter, Harald; Xing, Chaoping

    2004-01-01

    It has long been recognized that there are fascinating connections between coding theory, cryptology, and combinatorics. Therefore it seemed desirable to us to organize a conference that brings together experts from these three areas for a fruitful exchange of ideas. We decided on a venue in the Huang Shan (Yellow Mountain) region, one of the most scenic areas of China, so as to provide the additional inducement of an attractive location. The conference was planned for June 2003 with the official title Workshop on Coding, Cryptography and Combinatorics (CCC 2003). Those who are familiar with events in East Asia in the first half of 2003 can guess what happened in the end, namely the conference had to be cancelled in the interest of the health of the participants. The SARS epidemic posed too serious a threat. At the time of the cancellation, the organization of the conference was at an advanced stage: all invited speakers had been selected and all abstracts of contributed talks had been screened by the p...

  17. Crossmodal integration improves sensory detection thresholds in the ferret.

    Directory of Open Access Journals (Sweden)

    Karl J Hollensteiner

    Full Text Available During the last two decades ferrets (Mustela putorius) have been established as a highly efficient animal model in different fields of neuroscience. Here we asked whether ferrets integrate sensory information according to the same principles established for other species. Since only a few methods and protocols are available for behaving ferrets, we developed a head-free, body-restrained approach allowing a standardized stimulation position and the utilization of the ferret's natural response behavior. We established a behavioral paradigm to test audiovisual integration in the ferret. Animals had to detect a brief auditory and/or visual stimulus presented either left or right of their midline. We first determined detection thresholds for auditory amplitude and visual contrast. In a second step, we combined both modalities and compared psychometric fits and reaction times between all conditions. We employed maximum likelihood estimation (MLE) to model bimodal psychometric curves and to investigate whether ferrets integrate modalities in an optimal manner. Furthermore, to test for a redundant signal effect, we pooled the reaction times of all animals to calculate a race model. We observed that bimodal detection thresholds were reduced and reaction times were faster in the bimodal compared to the unimodal conditions. The race model and MLE modeling showed that ferrets integrate modalities in a statistically optimal fashion. Taken together, the data indicate that principles of multisensory integration previously demonstrated in other species also apply to crossmodal processing in the ferret.
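The MLE prediction used in such studies has a closed form: the optimal combined estimate has variance sigma_av^2 = sigma_a^2 * sigma_v^2 / (sigma_a^2 + sigma_v^2), which is always below either unimodal variance. A quick numeric check, with made-up unimodal thresholds rather than the paper's data:

```python
import math

def mle_sigma(sigma_a, sigma_v):
    """Noise (threshold) of the MLE-optimal audio-visual estimate:
    inverse variances add under optimal cue combination."""
    va, vv = sigma_a ** 2, sigma_v ** 2
    return math.sqrt(va * vv / (va + vv))

# hypothetical unimodal detection thresholds, arbitrary units
sigma_av = mle_sigma(2.0, 1.5)  # → 1.2, below both unimodal values
```

Comparing the measured bimodal threshold against this prediction (rather than merely against the better unimodal threshold) is what lets the authors call the integration statistically optimal.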

  18. Comment on "Cheating prevention in visual cryptography".

    Science.gov (United States)

    Chen, Yu-Chi; Horng, Gwoboa; Tsai, Du-Shiau

    2012-07-01

    Visual cryptography (VC), proposed by Naor and Shamir, has numerous applications, including visual authentication and identification, steganography, and image encryption. In 2006, Horng showed that cheating is possible in VC, where some participants can deceive the remaining participants by forged transparencies. Since then, designing cheating-prevention visual secret-sharing (CPVSS) schemes has been studied by many researchers. In this paper, we cryptanalyze the Hu-Tzeng CPVSS scheme and show that it is not cheating immune. We also outline an improvement that helps to overcome the problem.

  19. Mesoscopic quantum cryptography

    Energy Technology Data Exchange (ETDEWEB)

    Molotkov, S. N., E-mail: sergei.molotkov@gmail.com [Russian Academy of Sciences, Institute of Solid State Physics (Russian Federation)

    2017-03-15

    Since a strictly single-photon source is not yet available, in quantum cryptography systems, one uses, as information quantum states, coherent radiation of a laser with an average number of photons of μ ≈ 0.1–0.5 in a pulse, attenuated to the quasi-single-photon level. The linear independence of a set of coherent quasi-single-photon information states leads to the possibility of unambiguous measurements that, in the presence of losses in the line, restrict the transmission range of secret keys. Starting from a certain value of critical loss (the length of the line), the eavesdropper knows the entire key, does not make errors, and is not detected—the distribution of secret keys becomes impossible. This problem is solved by introducing an additional reference state with an average number of photons of μ_cl ≈ 10³–10⁶, depending on the length of the communication line. It is shown that the use of a reference state does not allow the eavesdropper to carry out measurements with conclusive outcome while remaining undetected. A reference state guarantees detecting an eavesdropper in a channel with high losses. In this case, information states may contain a mesoscopic average number of photons in the range of μ_q ≈ 0.5–10². The protocol proposed is easy to implement technically, admits flexible adjustment of parameters to the length of the communication line, and is simple and transparent for proving the secrecy of keys.

  20. Cryptography as a Pedagogical Tool

    Science.gov (United States)

    Kaur, Manmohan

    2008-01-01

    In order to get undergraduates interested in mathematics, it is necessary to motivate them, give them good reasons to spend time on a subject that requires hard work, and, if possible, involve them in undergraduate research. This article discusses how cryptography can be used for all these purposes. In particular, a special topics course on…

  1. Finding Cryptography in Object Code

    Energy Technology Data Exchange (ETDEWEB)

    Jason L. Wright

    2008-10-01

    Finding and identifying cryptography is a growing concern in the malware analysis community. In this paper, a heuristic method for determining the likelihood that a given function contains a cryptographic algorithm is discussed, and the results of applying this method in various environments are shown. The algorithm is based on frequency analysis of the opcodes that make up each function within a binary.
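The opcode-frequency idea can be sketched as follows. The opcode set, the sample instruction streams, and the simple ratio are invented for illustration and are much cruder than the paper's heuristic, which works on disassembled functions of real binaries:

```python
# Bitwise/arithmetic opcodes that tend to dominate cipher inner loops
# (illustrative set, not the paper's exact feature list).
CRYPTO_OPS = {"xor", "rol", "ror", "shl", "shr", "and", "or", "not", "add", "mul"}

def crypto_likelihood(opcodes):
    """Fraction of a function's instructions that are bitwise/arithmetic;
    higher values suggest a cryptographic inner loop."""
    if not opcodes:
        return 0.0
    hits = sum(1 for op in opcodes if op in CRYPTO_OPS)
    return hits / len(opcodes)

# A cipher-like mixing loop vs. a plain data-movement routine (toy streams).
rc4_like = ["mov", "xor", "add", "xor", "rol", "mov", "xor", "and"]
string_copy = ["mov", "mov", "cmp", "jne", "mov", "ret"]
```

Ranking functions by this score and triaging the top of the list is the intended workflow; the score alone is noisy, since media codecs and checksums also lean on the same opcodes.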

  2. Practical free space quantum cryptography

    International Nuclear Information System (INIS)

    Schmitt-Manderbach, T.; Weier, H.; Regner, N.; Kurtsiefer, C.; Weinfurter, H.

    2005-01-01

    Full text: Quantum cryptography, the secure key distribution between two parties, is the first practical application of quantum information technology. By encoding digital information into different polarization states of single photons, a string of key bits can be established between two parties, where the laws of quantum mechanics ensure that a possible eavesdropper has negligible knowledge of it. Having shown the feasibility of a long distance quantum key distribution scheme, the emphasis of this work is to incorporate the previously developed compact sender and receiver modules into a quantum cryptography system suitable for every-day use in metropolitan areas. The permanent installation with automatic alignment allows us to investigate in detail the sensitivity of the free space optical link to weather conditions and air turbulences commonly encountered in urban areas. We report on a successful free space quantum cryptography experiment over a distance of 500 m between the rooftops of two university buildings using the BB84 protocol. The bit error rates obtained in first runs of this experiment, using faint coherent pulses with an average photon number ranging from 0.1 to 1.0, were measured to be below 3 percent for experiments carried out at night, leading to average raw key rates (before error correction and privacy amplification) of 50 kbits per second. Thanks to its simplicity of implementation, our experiment brings free space quantum key distribution a big step closer to practical usability in metropolitan networks and on a level with fibre-based quantum cryptography, which up to now offers the only ready-to-use systems available. Compact and automated free space hardware is also a prerequisite for a possible earth-satellite quantum key distribution system in order to break the distance limit of about 100 km of current quantum cryptography schemes. (author)
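The sifting step at the heart of BB84 can be simulated classically in a few lines. This toy assumes a noiseless channel with no eavesdropper and omits the error correction and privacy amplification mentioned in the abstract; the variable names and parameters are mine:

```python
import random

rng = random.Random(42)
n = 2000
# Alice picks random key bits and random encoding bases
# (0 = rectilinear, 1 = diagonal).
alice_bits = [rng.randint(0, 1) for _ in range(n)]
alice_bases = [rng.randint(0, 1) for _ in range(n)]
# Bob measures each photon in an independently chosen random basis;
# a mismatched basis yields a uniformly random outcome.
bob_bases = [rng.randint(0, 1) for _ in range(n)]
bob_bits = [bit if ab == bb else rng.randint(0, 1)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
# Sifting: after publicly comparing bases, keep only matching positions
# (about half of the pulses on average).
sifted = [(a, b)
          for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
          if ab == bb]
key_alice = [a for a, _ in sifted]
key_bob = [b for _, b in sifted]
```

In the ideal case both sifted keys are identical; in the real experiment the residual disagreement between them is exactly the quantum bit error rate (below 3 percent here) that the sifted key is then corrected and amplified against.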

  3. Performance improvement of per-user threshold based multiuser switched scheduling system

    KAUST Repository

    Nam, Haewoon

    2013-01-01

    This letter proposes a multiuser switched scheduling scheme with per-user threshold and post user selection and provides a generic analytical framework for determining the optimal feedback thresholds. The proposed scheme applies an individual feedback threshold for each user rather than a single common threshold for all users to achieve some capacity gain, thanks to the flexibility of threshold selection as well as a lower scheduling outage probability. In addition, since scheduling outage may occur with a non-negligible probability, the proposed scheme employs post user selection in order to further improve the ergodic capacity, where the user with the highest potential for a higher channel quality than the other users is selected. Numerical and simulation results show that the capacity gain from post user selection is significant when a random sequence is used. Copyright © 2013 The Institute of Electronics, Information and Communication Engineers.

  4. A Quantum Cryptography Communication Network Based on Software Defined Network

    Directory of Open Access Journals (Sweden)

    Zhang Hongliang

    2018-01-01

    Full Text Available With the development of the Internet, information security has attracted great attention in today's society, and a quantum cryptography communication network based on quantum key distribution (QKD) is a very important part of this field, since quantum key distribution combined with the one-time-pad encryption scheme can guarantee the unconditional security of the information. The secret key generated by quantum key distribution protocols is a very valuable resource, so making full use of key resources is particularly important. Software defined network (SDN) is a new type of network architecture; it separates the control plane and the data plane of network devices through OpenFlow technology, thus realizing flexible control of the network resources. In this paper, a quantum cryptography communication network model based on SDN is proposed to realize flexible control of quantum key resources in the whole cryptography communication network. Moreover, we propose a routing algorithm which takes into account both the hop count and the end-to-end available keys, so that the secret key generated by QKD can be used effectively. We also simulate this quantum cryptography communication network, and the results show that, based on SDN and the proposed routing algorithm, the performance of the network is improved through the effective use of the quantum key resources.
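
A routing metric that weighs hop count against scarce per-link key pools, as described above, can be sketched as a weighted shortest-path search (an illustrative cost function and graph encoding, not the paper's exact algorithm; the `alpha` penalty is an assumption):

```python
import heapq

def route_with_keys(graph, src, dst, alpha=1.0):
    """Dijkstra search where each link costs 1 hop plus a penalty that
    grows as the link's pool of available QKD keys shrinks.

    graph: {node: {neighbor: available_keys}}; links with no remaining
    keys are unusable. Cost per link = 1 + alpha / keys.
    """
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    visited = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in visited:
            continue
        visited.add(u)
        if u == dst:
            break
        for v, keys in graph[u].items():
            if keys <= 0:           # no secret key left on this link
                continue
            nd = d + 1.0 + alpha / keys
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    if dst not in dist:
        return None
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

With this metric a two-hop path over well-stocked links beats an equally short path whose links are nearly out of keys, which matches the intent of balancing hops against end-to-end available keys.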

  5. Everyday cryptography fundamental principles and applications

    CERN Document Server

    Martin, Keith M

    2012-01-01

    Cryptography is a vital technology that underpins the security of information in computer networks. This book presents a comprehensive introduction to the role that cryptography plays in providing information security for technologies such as the Internet, mobile phones, payment cards, and wireless local area networks. Focusing on the fundamental principles that ground modern cryptography as they arise in modern applications, it avoids both an over-reliance on transient current technologies and overwhelming theoretical research. Everyday Cryptography is a self-contained and widely accessible in…

  6. Optical hiding with visual cryptography

    Science.gov (United States)

    Shi, Yishi; Yang, Xiubo

    2017-11-01

    We propose an optical hiding method based on visual cryptography. In the hiding process, we convert the secret information into a set of fabricated phase-keys, which are completely independent of each other, intensity-detected-proof and image-covered, leading to the high security. During the extraction process, the covered phase-keys are illuminated with laser beams and then incoherently superimposed to extract the hidden information directly by human vision, without complicated optical implementations and any additional computation, resulting in the convenience of extraction. Also, the phase-keys are manufactured as the diffractive optical elements that are robust to the attacks, such as the blocking and the phase-noise. Optical experiments verify that the high security, the easy extraction and the strong robustness are all obtainable in the visual-cryptography-based optical hiding.
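
The overlay-and-see principle that the scheme builds on can be sketched for the classic black-and-white (2,2) visual cryptography scheme (a minimal illustration of visual cryptography in general, not of the paper's phase-key construction):

```python
import random

# Complementary 2-subpixel patterns for the classic (2,2) scheme (1 = opaque).
PATTERNS = [(0, 1), (1, 0)]

def make_shares(secret_bits, rng=random):
    """Split a row of secret bits (1 = black) into two noise-like shares."""
    s1, s2 = [], []
    for bit in secret_bits:
        p = rng.choice(PATTERNS)
        s1.extend(p)
        # white pixel: identical patterns (overlay is half black)
        # black pixel: complementary patterns (overlay is fully black)
        s2.extend(p if bit == 0 else (1 - p[0], 1 - p[1]))
    return s1, s2

def stack(s1, s2):
    """Physically overlaying transparencies acts as a per-subpixel OR."""
    return [a | b for a, b in zip(s1, s2)]

def reconstruct(stacked):
    """A subpixel pair that is fully black decodes to a black pixel."""
    return [int(stacked[i] & stacked[i + 1]) for i in range(0, len(stacked), 2)]
```

Each share on its own is uniformly random, so a single share leaks nothing; only the incoherent superposition of both reveals the secret, which is the property the optical hiding scheme exploits.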

  7. Cryptography for Big Data Security

    Science.gov (United States)

    2015-07-13

    constructions based on this paradigm include Ishai, Kushilevitz and Ostrovsky [97]; Ben-Sasson, Chiesa, Genkin, Tromer and Virza [17]; and Bitansky… Security and Privacy. [15] Assaf Ben-David, Noam Nisan, and Benny Pinkas. FairplayMP: A system for secure multi-party computation. In ACM Conference… cryptographic fault-tolerant distributed computation (extended abstract). In STOC, 1988. [17] Eli Ben-Sasson…

  8. National Workshop on Coding Theory and Cryptography

    Indian Academy of Sciences (India)

    Coding theory and cryptography are two inter-related branches of applied algebra that find increasing applications in communication theory, data security and many other areas of information technology. This workshop will discuss the basics and applications of algebraic coding theory and cryptography, public key ...

  9. Report of the Public Cryptography Study Group.

    Science.gov (United States)

    American Council on Education, Washington, DC.

    Concerns of the National Security Agency (NSA) that information contained in some articles about cryptography in learned and professional journals and in monographs might be inimical to the national security are addressed. The Public Cryptography Study Group, with one dissenting opinion, recommends that a voluntary system of prior review of…

  10. Protocols and plan of quantum cryptography

    Directory of Open Access Journals (Sweden)

    Milorad S. Markagić

    2012-01-01

    Full Text Available Along with the growing importance of the confidentiality of data and resources, there is a need to develop systems that would provide such confidentiality. Currently, the most used systems are classical cryptographic systems and public key encryption systems. However, none of these systems provides a solution for the famous 'catch 22' of cryptography. Owing to the intensive development of quantum mechanics, in the last 30 years an entirely new kind of cryptography has emerged: quantum cryptography. Its greatest contribution is the possibility of discovering the interception of a communication channel by a third party. The question is: is this really true? A further question arises: 'If quantum cryptography is so good, why is it not widely used?' The aim of this paper is, on the one hand, to define the basic mechanisms and protocols of quantum cryptography, and, on the other hand, to point out its shortcomings as they relate to the capabilities of today's devices and the flaws in the protocols.

  11. Developmental Mechanisms Underlying Improved Contrast Thresholds for Discriminations of Orientation Signals Embedded in Noise

    Directory of Open Access Journals (Sweden)

    Seong Taek Jeon

    2014-09-01

    Full Text Available We combined an external noise paradigm with an efficient procedure for obtaining contrast thresholds (Lesmes et al., 2006) in order to model developmental changes during childhood. Specifically, we measured the contrast thresholds of 5-, 7-, 9-year-olds and adults (n = 20/age) in a two alternative forced-choice orientation discrimination task over a wide range of external noise levels and at three levels of accuracy. Overall, as age increased, contrast thresholds decreased over the entire range of external noise levels tested. The decrease was greatest between 5 and 7 years of age. The reduction in threshold after age 5 was greater in the high than the low external noise region, a pattern implying greater tolerance to the irrelevant background noise as children became older. To model the mechanisms underlying these developmental changes in terms of internal noise components, we adapted the original perceptual template model (Lu and Dosher, 1998) and normalized the magnitude of performance changes against the performance of 5-year-olds. The resulting model provided an excellent fit (r² = 0.985) to the contrast thresholds at multiple levels of accuracy (60, 75, and 90%) across a wide range of external noise levels. The improvements in contrast thresholds with age were best modelled by a combination of reductions in internal additive noise, reductions in internal multiplicative noise, and improvements in excluding external noise by template retuning. In line with the data, the improvement was greatest between 5 and 7 years of age, accompanied by a 39% reduction in additive noise, 71% reduction in multiplicative noise, and 45% improvement in external noise exclusion. The modelled improvements likely reflect developmental changes at the cortical level, rather than changes in front-end structural properties (Kiorpes et al., 2003).

  12. An Improved Quantum-Inspired Genetic Algorithm for Image Multilevel Thresholding Segmentation

    Directory of Open Access Journals (Sweden)

    Jian Zhang

    2014-01-01

    Full Text Available A multilevel thresholding algorithm for histogram-based image segmentation is presented in this paper. The proposed algorithm introduces an adaptive adjustment strategy for the rotation angle and a cooperative learning strategy into the quantum genetic algorithm (the result is called IQGA). The adaptive adjustment strategy of the quantum rotation introduced in this study helps to improve the convergence speed, search ability, and stability. Cooperative learning enhances the search ability in a high-dimensional solution space by splitting a high-dimensional vector into several one-dimensional vectors. The experimental results demonstrate the good performance of the IQGA in solving the multilevel thresholding segmentation problem compared with QGA, GA, and PSO.
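
The core quantum-inspired primitives mentioned above can be sketched as follows (an illustrative sketch only: the rotation-gate update and the adaptive-angle rule are generic quantum-GA ingredients, and the specific `adaptive_delta` scheme is an assumption, not the paper's exact strategy):

```python
import math
import random

def rotate_qubit(alpha, beta, delta):
    """Apply a rotation gate to a qubit's amplitudes (alpha, beta);
    delta is the (adaptively chosen) rotation angle."""
    a = alpha * math.cos(delta) - beta * math.sin(delta)
    b = alpha * math.sin(delta) + beta * math.cos(delta)
    return a, b

def observe(qubits, rng=random):
    """Collapse each qubit to a classical bit with P(1) = beta**2,
    yielding a candidate threshold vector to evaluate."""
    return [1 if rng.random() < b * b else 0 for _, b in qubits]

def adaptive_delta(fitness, best_fitness,
                   d_min=0.001 * math.pi, d_max=0.05 * math.pi):
    """Hypothetical adaptive angle: rotate more aggressively the further
    a solution's fitness lies below the current best."""
    if best_fitness == 0:
        return d_min
    gap = max(0.0, (best_fitness - fitness) / abs(best_fitness))
    return d_min + gap * (d_max - d_min)
```

A full IQGA would loop observe → evaluate (e.g., Otsu's between-class variance over the candidate thresholds) → rotate each qubit toward the best individual, with cooperative learning applied per dimension.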

  13. Key distillation in quantum cryptography

    Science.gov (United States)

    Slutsky, Boris Aron

    1998-11-01

    Quantum cryptography is a technique which permits two parties to communicate over an open channel and establish a shared sequence of bits known only to themselves. This task, provably impossible in classical cryptography, is accomplished by encoding the data on quantum particles and harnessing their unique properties. It is believed that no eavesdropping attack consistent with the laws of quantum theory can compromise the secret data unknowingly to the legitimate users of the channel. Any attempt by a hostile actor to monitor the data carrying particles while in transit reveals itself through transmission errors it must inevitably introduce. Unfortunately, in practice a communication is not free of errors even when no eavesdropping is present. Key distillation is a technique that permits the parties to overcome this difficulty and establish a secret key despite channel defects, under the assumption that every particle is handled independently from other particles by the enemy. In the present work, key distillation is described and its various aspects are studied. A relationship is derived between the average error rate resulting from an eavesdropping attack and the amount of information obtained by the attacker. Formal definition is developed of the security of the final key. The net throughput of secret bits in a quantum cryptosystem employing key distillation is assessed. An overview of quantum cryptographic protocols and related information theoretical results is also given.

  14. Cryptography and computational number theory

    CERN Document Server

    Shparlinski, Igor; Wang, Huaxiong; Xing, Chaoping; Workshop on Cryptography and Computational Number Theory, CCNT'99

    2001-01-01

    This volume contains the refereed proceedings of the Workshop on Cryptography and Computational Number Theory, CCNT'99, which was held in Singapore during the week of November 22-26, 1999. The workshop was organized by the Centre for Systems Security of the National University of Singapore. We gratefully acknowledge the financial support from the Singapore National Science and Technology Board under the grant number RP960668/M. The idea for this workshop grew out of the recognition of the recent, rapid development in various areas of cryptography and computational number theory. The event followed the concept of the research programs at such well-known research institutions as the Newton Institute (UK), Oberwolfach and Dagstuhl (Germany), and Luminy (France). Accordingly, there were only invited lectures at the workshop with plenty of time for informal discussions. It was hoped and successfully achieved that the meeting would encourage and stimulate further research in information and computer s...

  15. Analysis of order-statistic CFAR threshold estimators for improved ultrasonic flaw detection.

    Science.gov (United States)

    Saniie, J; Nagle, D T

    1992-01-01

    In the pulse-echo method using broadband transducers, flaw detection can be improved by using optimal bandpass filtering to resolve flaw echoes surrounded by grain scatterers. Optimal bandpass filtering is achieved by examining spectral information of the flaw and grain echoes, whose frequency differences have been experimentally shown to be predictable in the Rayleigh scattering region. Using optimal frequency band information, flaw echoes can then be discriminated by applying adaptive thresholding techniques based on surrounding range cells. The authors present order-statistic (OS) processors, ranked and trimmed mean (TM), to robustly estimate the threshold while censoring outliers. The design of these OS processors is accomplished analytically based on constant false-alarm rate (CFAR) detection. It is shown that OS-CFAR and TM-CFAR processors can detect flaw echoes robustly with a CFAR of 10^-4 even where the range cells used for the threshold estimate contain outliers.
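
The OS-CFAR idea can be sketched in a few lines: estimate the local noise level from a ranked reference sample so that strong outlier echoes in the window are censored, then scale that estimate into a detection threshold (parameter values below are illustrative, not the paper's designs):

```python
def os_cfar_threshold(reference_cells, k, scale):
    """Order-statistic CFAR: take the k-th smallest reference sample
    (1-indexed) as the noise estimate, so outlier echoes in the window
    are censored, then scale it for the desired false-alarm rate."""
    ranked = sorted(reference_cells)
    return scale * ranked[k - 1]

def detect(signal, guard=1, window=4, k=3, scale=2.0):
    """Slide a reference window (excluding guard cells around the cell
    under test) along the signal and flag cells exceeding the locally
    estimated threshold."""
    hits = []
    for i in range(len(signal)):
        ref = [signal[j]
               for j in range(i - guard - window, i + guard + window + 1)
               if 0 <= j < len(signal) and abs(j - i) > guard]
        if len(ref) >= k and signal[i] > os_cfar_threshold(ref, k, scale):
            hits.append(i)
    return hits
```

Because the estimate is an order statistic rather than a mean, a single large flaw echo inside a neighbour's reference window does not inflate that neighbour's threshold, which is exactly the robustness the abstract claims.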

  16. Large change in voltage at phase reversal improves biphasic defibrillation thresholds. Parallel-series mode switching.

    Science.gov (United States)

    Yamanouchi, Y; Mowrey, K A; Nadzam, G R; Hills, D G; Kroll, M W; Brewer, J E; Donohoo, A M; Wilkoff, B L; Tchou, P J

    1996-10-01

    Multiple factors contribute to an improved defibrillation threshold of biphasic shocks. The leading-edge voltage of the second phase may be an important factor in reducing the defibrillation threshold. We tested two experimental biphasic waveforms with large voltage changes at phase reversal. The phase 2 leading-edge voltage was twice the phase 1 trailing-edge voltage. This large voltage change was achieved by switching two capacitors from parallel to series mode at phase reversal. Two capacitor configurations were tested (60/15 microfarads [microF] and 90/22.5 microF) and compared with two control biphasic waveforms for which the phase 1 trailing-edge voltage equaled the phase 2 leading-edge voltage. The control waveforms were incorporated into clinical (135/135 microF) or investigational devices (90/90 microF). Defibrillation threshold parameters were evaluated in eight anesthetized pigs by use of a nonthoracotomy transvenous lead to a can electrode system. The stored energy at the defibrillation threshold (in joules) was 8.2 +/- 1.5 for 60/15 microF (P …). Large voltage changes at phase reversal caused by parallel-series mode switching appeared to improve the ventricular defibrillation threshold in a pig model compared with a currently available biphasic waveform. The 60/15-microF capacitor performed as well as the 90/22.5-microF capacitor in the experimental waveform. Thus, smaller capacitors may allow a reduction in device size without sacrificing defibrillation threshold energy requirements.

  17. A Foundational Proof Framework for Cryptography

    Science.gov (United States)

    2015-05-01

    Distribution A: Public Release. I present a state-of-the-art mechanized framework for developing and checking proofs of security for cryptographic schemes in the computational model. This system, called the Foundational Cryptography Framework (FCF), is based on the Coq proof assistant, and it provides a sophisticated…

  18. Quantum cryptography: The power of independence

    Science.gov (United States)

    Ekert, Artur

    2018-02-01

    Device-independent quantum cryptography promises unprecedented security, but it is regarded as a theorist's dream and an experimentalist's nightmare. A new mathematical tool has now pushed its experimental demonstration much closer to reality.

  19. Automatic video shot boundary detection using k-means clustering and improved adaptive dual threshold comparison

    Science.gov (United States)

    Sa, Qila; Wang, Zhihui

    2018-03-01

    At present, content-based video retrieval (CBVR) is the most mainstream video retrieval method, using the video's own features to perform automatic identification and retrieval. This method involves a key technology, i.e. shot segmentation. In this paper, a method of automatic video shot boundary detection with K-means clustering and improved adaptive dual threshold comparison is proposed. First, the visual features of every frame are extracted and divided into two categories using the K-means clustering algorithm, namely, one with significant change and one with no significant change. Then, based on the classification results, the improved adaptive dual threshold comparison method is used to determine the abrupt as well as gradual shot boundaries. Finally, an automatic video shot boundary detection system is achieved.
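
The pipeline above (2-means clustering of frame-difference values, then dual thresholds for abrupt vs. gradual boundaries) can be sketched on scalar frame differences (an illustrative sketch; the threshold ratios and the accumulation rule for gradual transitions are assumptions, not the paper's exact formulas):

```python
def kmeans_2(values, iters=20):
    """Plain 1-D 2-means: split frame-difference values into
    'significant change' and 'no significant change' clusters."""
    c = [min(values), max(values)]
    for _ in range(iters):
        groups = ([], [])
        for v in values:
            groups[abs(v - c[0]) > abs(v - c[1])].append(v)
        c = [sum(g) / len(g) if g else c[i] for i, g in enumerate(groups)]
    return c  # cluster centres

def detect_cuts(diffs, high_ratio=0.8, low_ratio=0.3):
    """Dual-threshold comparison: a diff above T_high is an abrupt cut;
    a run of diffs between T_low and T_high whose accumulated sum
    exceeds T_high marks a gradual transition."""
    c_lo, c_hi = sorted(kmeans_2(diffs))
    t_high = c_lo + high_ratio * (c_hi - c_lo)
    t_low = c_lo + low_ratio * (c_hi - c_lo)
    abrupt, gradual, acc, start = [], [], 0.0, None
    for i, d in enumerate(diffs):
        if d >= t_high:
            abrupt.append(i)
            acc, start = 0.0, None
        elif d >= t_low:
            if start is None:
                start = i
            acc += d
            if acc >= t_high:
                gradual.append((start, i))
                acc, start = 0.0, None
        else:
            acc, start = 0.0, None
    return abrupt, gradual
```

Deriving both thresholds from the cluster centres is what makes the comparison adaptive: the same code works whether the video is low-motion or high-motion, since T_low and T_high scale with the observed difference statistics.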

  20. Bent functions results and applications to cryptography

    CERN Document Server

    Tokareva, Natalia

    2015-01-01

    Bent Functions: Results and Applications to Cryptography offers a unique survey of the objects of discrete mathematics known as Boolean bent functions. As these maximal, nonlinear Boolean functions and their generalizations have many theoretical and practical applications in combinatorics, coding theory, and cryptography, the text provides a detailed survey of their main results, presenting a systematic overview of their generalizations and applications, and considering open problems in classification and systematization of bent functions. The text is appropriate for novices and advanced

  1. Mathematical Background of Public Key Cryptography

    DEFF Research Database (Denmark)

    Frey, Gerhard; Lange, Tanja

    2005-01-01

    The two main systems used for public key cryptography are RSA and protocols based on the discrete logarithm problem in some cyclic group. We focus on the latter problem and state cryptographic protocols and mathematical background material.

  2. Adaptive Wavelet Threshold Denoising Method for Machinery Sound Based on Improved Fruit Fly Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Jing Xu

    2016-07-01

    Full Text Available As the sound signal of a machine contains abundant information and is easy to measure, acoustic-based monitoring or diagnosis systems exhibit obvious superiority, especially in some extreme conditions. However, sound directly collected in an industrial field is always polluted by noise. In order to eliminate noise components from machinery sound, a wavelet threshold denoising method optimized by an improved fruit fly optimization algorithm (WTD-IFOA) is proposed in this paper. The sound is first decomposed by the wavelet transform (WT) to obtain the coefficients of each level. As the wavelet threshold functions proposed by Donoho were discontinuous, many modified functions with continuous first and second order derivatives were presented to realize adaptive denoising. However, the function-based denoising process is time-consuming and it is difficult to find optimal thresholds. To overcome these problems, the fruit fly optimization algorithm (FOA) was introduced into the process. Moreover, to avoid falling into local extremes, an improved fly distance range obeying a normal distribution was proposed on the basis of the original FOA. Then, the sound signal of a motor was recorded in a soundproof laboratory, and Gaussian white noise was added to the signal. The simulation results illustrated the effectiveness and superiority of the proposed approach through a comprehensive comparison among five typical methods. Finally, an industrial application on a shearer in a coal mining working face was performed to demonstrate the practical effect.
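
The underlying wavelet-threshold step can be sketched with a one-level Haar transform and Donoho's soft-threshold rule (an illustrative sketch only: here the threshold is supplied directly, whereas the paper's contribution is tuning it with the improved FOA):

```python
import math

def haar_dwt(x):
    """One-level Haar transform: (approximation, detail) coefficients.
    Assumes len(x) is even."""
    approx = [(x[i] + x[i + 1]) / math.sqrt(2) for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / math.sqrt(2) for i in range(0, len(x), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse one-level Haar transform."""
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / math.sqrt(2), (a - d) / math.sqrt(2)])
    return out

def soft_threshold(coeffs, t):
    """Donoho's soft-threshold rule: shrink each coefficient toward
    zero by t, zeroing anything smaller in magnitude than t."""
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

def denoise(x, t):
    """Threshold the detail (noise-dominated) band, keep the approximation."""
    a, d = haar_dwt(x)
    return haar_idwt(a, soft_threshold(d, t))
```

In the WTD-IFOA setting, `t` (one per decomposition level) is exactly what the fruit fly optimization algorithm searches for.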

  3. Improved bounds on the epidemic threshold of exact SIS models on complex networks

    KAUST Repository

    Ruhi, Navid Azizan

    2017-01-05

    The SIS (susceptible-infected-susceptible) epidemic model on an arbitrary network, without making approximations, is a 2^n-state Markov chain with a unique absorbing state (the all-healthy state). This makes analysis of the SIS model and, in particular, determining the threshold of epidemic spread quite challenging. It has been shown that the exact marginal probabilities of infection can be upper bounded by an n-dimensional linear time-invariant system, a consequence of which is that the Markov chain is “fast-mixing” when the LTI system is stable, i.e. when βλmax(A) < δ (where β is the infection rate per link, δ is the recovery rate, and λmax(A) is the largest eigenvalue of the network's adjacency matrix). This well-known threshold has been recently shown not to be tight in several cases, such as in a star network. In this paper, we provide tighter upper bounds on the exact marginal probabilities of infection, by also taking pairwise infection probabilities into account. Based on this improved bound, we derive tighter eigenvalue conditions that guarantee fast mixing (i.e., logarithmic mixing time) of the chain. We demonstrate the improvement of the threshold condition by comparing the new bound with the known one on various networks with various epidemic parameters.
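
The classical condition βλmax(A) < δ can be checked numerically with a small power-iteration routine (a sketch of the well-known baseline condition only; the paper's tighter pairwise bounds are not reproduced here):

```python
def lambda_max(adj, iters=500):
    """Largest eigenvalue of a nonnegative adjacency matrix A via power
    iteration on A + I (the shift avoids the +/-lambda oscillation that
    bipartite graphs such as stars would otherwise cause)."""
    n = len(adj)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [v[i] + sum(adj[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(w)
        v = [x / lam for x in w]
    return lam - 1.0

def below_classical_threshold(adj, beta, delta):
    """Classical fast-mixing condition beta * lambda_max(A) < delta;
    the paper derives conditions that are less conservative than this."""
    return beta * lambda_max(adj) < delta
```

For a star on n nodes λmax(A) = sqrt(n - 1), which is the example where the abstract notes the classical condition is not tight.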

  4. Improvement of the damage threshold of high reflectivity multidielectric coatings at 1.06 μm

    International Nuclear Information System (INIS)

    Geenen, B.; Malherbes, A.; Guerain, J.; Boisgard, D.

    1985-01-01

    Development of new high power lasers for laser-matter interaction at C.E.A. Limeil requires the realization of H.R. coatings with damage thresholds above 8 J/cm². The damage threshold of commercial mirrors produced by MATRA's laboratory ''couches minces optiques'' (thin optical layers) was around 3.5 J/cm² in 1982. In order to obtain better results the authors decided to improve the control of evaporation parameters such as: vacuum and regulation of oxygen pressure by means of a mass spectrometer; better measurement of the evaporation temperature and regulation of the evaporation rate; measurement and control of the substrate temperature by pyrometric observation; and to automate the process. These different measurements and controls enabled them to establish new processing operations giving better evaporation conditions. The result was an increase of the damage threshold from 3.5 J/cm² to 8 J/cm².

  5. Microscopy mineral image enhancement based on improved adaptive threshold in nonsubsampled shearlet transform domain

    Science.gov (United States)

    Li, Liangliang; Si, Yujuan; Jia, Zhenhong

    2018-03-01

    In this paper, a novel microscopy mineral image enhancement method based on an adaptive threshold in the non-subsampled shearlet transform (NSST) domain is proposed. First, the image is decomposed into one low-frequency sub-band and several high-frequency sub-bands. Second, gamma correction is applied to process the low-frequency sub-band coefficients, and the improved adaptive threshold is adopted to suppress the noise in the high-frequency sub-band coefficients. Third, the processed coefficients are reconstructed with the inverse NSST. Finally, an unsharp filter is used to enhance the details of the reconstructed image. Experimental results on various microscopy mineral images demonstrate that the proposed approach achieves a better enhancement effect in terms of both objective and subjective metrics.

  6. Fast and simple high-capacity quantum cryptography with error detection

    Science.gov (United States)

    Lai, Hong; Luo, Ming-Xing; Pieprzyk, Josef; Zhang, Jun; Pan, Lei; Li, Shudong; Orgun, Mehmet A.

    2017-04-01

    Quantum cryptography is commonly used to generate fresh secure keys with quantum signal transmission for instant use between two parties. However, research shows that the relatively low key generation rate hinders its practical use where a symmetric cryptography component consumes the shared key. That is, the security of the symmetric cryptography demands a frequent rate of key updates, which leads to a higher consumption of the internal one-time-pad communication bandwidth, since it requires the length of the key to be as long as that of the secret. In order to alleviate these issues, we develop a matrix algorithm for fast and simple high-capacity quantum cryptography. Our scheme can achieve secure private communication with fresh keys generated from Fibonacci- and Lucas-valued orbital angular momentum (OAM) states for the seed to construct recursive Fibonacci and Lucas matrices. Moreover, the proposed matrix algorithm for quantum cryptography can ultimately be simplified to matrix multiplication, which is implemented and optimized in modern computers. Most importantly, the information capacity can be improved considerably, effectively, and efficiently by the recursive property of Fibonacci and Lucas matrices, thereby avoiding the restriction of physical conditions, such as the communication bandwidth.
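
The recursive Fibonacci/Lucas matrix property the abstract relies on, and its reduction to plain matrix multiplication, can be sketched with the Fibonacci Q-matrix (an illustration of the underlying recursion only; the paper's mapping from OAM measurement outcomes to matrices is not reproduced here):

```python
def mat_mult(a, b):
    """2x2 integer matrix product."""
    return [[a[0][0] * b[0][0] + a[0][1] * b[1][0],
             a[0][0] * b[0][1] + a[0][1] * b[1][1]],
            [a[1][0] * b[0][0] + a[1][1] * b[1][0],
             a[1][0] * b[0][1] + a[1][1] * b[1][1]]]

def fib_matrix(n):
    """Q^n for the Fibonacci Q-matrix [[1,1],[1,0]], computed by fast
    exponentiation; Q^n = [[F(n+1), F(n)], [F(n), F(n-1)]]."""
    q = [[1, 1], [1, 0]]
    result = [[1, 0], [0, 1]]
    while n:
        if n & 1:
            result = mat_mult(result, q)
        q = mat_mult(q, q)
        n >>= 1
    return result

def lucas(n):
    """Lucas numbers from the same recursion: L(n) = F(n-1) + F(n+1)."""
    m = fib_matrix(n)
    return m[1][1] + mat_mult(m, [[1, 1], [1, 0]])[0][1]
```

Because Q^(m+n) = Q^m · Q^n, long runs of the recursion collapse into a handful of 2x2 multiplications, which is the "simplified to matrix multiplication" efficiency the abstract points to.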

  7. Approach to design neural cryptography: a generalized architecture and a heuristic rule.

    Science.gov (United States)

    Mu, Nankun; Liao, Xiaofeng; Huang, Tingwen

    2013-06-01

    Neural cryptography, a type of public key exchange protocol, is widely considered an effective method for sharing a common secret key between two neural networks over public channels. How to design neural cryptography remains a great challenge. In this paper, in order to provide an approach to solving this challenge, a generalized network architecture and a significant heuristic rule are designed. The proposed generic framework is named the tree state classification machine (TSCM), which extends and unifies the existing structures, i.e., the tree parity machine (TPM) and the tree committee machine (TCM). Furthermore, we carefully study and find that the heuristic rule can improve the security of TSCM-based neural cryptography. Therefore, TSCM and the heuristic rule can guide us in designing a great many effective neural cryptography candidates, among which it is possible to achieve more secure instances. Significantly, in the light of TSCM and the heuristic rule, we further show that our designed neural cryptography outperforms TPM (the most secure model at present) in security. Finally, a series of numerical simulation experiments is provided to verify the validity and applicability of our results.
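
The TPM baseline that TSCM generalizes can be sketched as two machines synchronizing by mutual learning (a minimal sketch with small, assumed parameters K=3, N=4, L=3; the zero-field sign convention and the Hebbian rule variant are common choices, not the paper's specification):

```python
import random

class TreeParityMachine:
    """Minimal tree parity machine: K hidden units with N inputs each
    and integer weights bounded by L."""
    def __init__(self, k=3, n=4, l=3, rng=None):
        self.k, self.n, self.l = k, n, l
        self.rng = rng or random.Random()
        self.w = [[self.rng.randint(-l, l) for _ in range(n)] for _ in range(k)]

    def output(self, x):
        # hidden unit signs, then the parity (product) as the public output
        self.sigma = [1 if sum(w * xi for w, xi in zip(row, xrow)) > 0 else -1
                      for row, xrow in zip(self.w, x)]
        tau = 1
        for s in self.sigma:
            tau *= s
        return tau

    def update(self, x, tau):
        """Hebbian rule: adapt only units agreeing with the output,
        clamping weights to [-L, L]."""
        for i in range(self.k):
            if self.sigma[i] == tau:
                for j in range(self.n):
                    w = self.w[i][j] + self.sigma[i] * x[i][j]
                    self.w[i][j] = max(-self.l, min(self.l, w))

def synchronize(a, b, rng, max_steps=20000):
    """Both parties exchange outputs on shared random inputs and update
    only when the outputs agree; identical weights form the shared key."""
    for step in range(max_steps):
        x = [[rng.choice((-1, 1)) for _ in range(a.n)] for _ in range(a.k)]
        ta, tb = a.output(x), b.output(x)
        if ta == tb:
            a.update(x, ta)
            b.update(x, tb)
        if a.w == b.w:
            return step
    return None
```

Only the parity bits cross the public channel; an attacker observing them must synchronize a third machine faster than the partners, which is where the security analysis (and the TSCM generalization) comes in.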

  9. Neural Network Approach to Locating Cryptography in Object Code

    Energy Technology Data Exchange (ETDEWEB)

    Jason L. Wright; Milos Manic

    2009-09-01

    Finding and identifying cryptography is a growing concern in the malware analysis community. In this paper, artificial neural networks are used to classify functional blocks from a disassembled program as being either cryptography related or not. The resulting system, referred to as NNLC (Neural Net for Locating Cryptography) is presented and results of applying this system to various libraries are described.

  10. Quantum cryptography approaching the classical limit.

    Science.gov (United States)

    Weedbrook, Christian; Pirandola, Stefano; Lloyd, Seth; Ralph, Timothy C

    2010-09-10

    We consider the security of continuous-variable quantum cryptography as we approach the classical limit, i.e., when the unknown preparation noise at the sender's station becomes significantly noisy or thermal (even as much as 10^4 times greater than the variance of the vacuum mode). We show that, provided the channel transmission losses do not exceed 50%, the security of quantum cryptography is not dependent on the channel transmission, and is therefore incredibly robust against significant amounts of excess preparation noise. We extend these results to consider for the first time quantum cryptography at wavelengths considerably longer than optical and find that regions of security still exist all the way down to the microwave regime.

  11. Color extended visual cryptography using error diffusion.

    Science.gov (United States)

    Kang, InKoo; Arce, Gonzalo R; Lee, Heung-Kyu

    2011-01-01

    Color visual cryptography (VC) encrypts a color secret message into n color halftone image shares. Previous methods in the literature show good results for black and white or gray scale VC schemes, however, they are not sufficient to be applied directly to color shares due to different color structures. Some methods for color visual cryptography are not satisfactory in terms of producing either meaningless shares or meaningful shares with low visual quality, leading to suspicion of encryption. This paper introduces the concept of visual information pixel (VIP) synchronization and error diffusion to attain a color visual cryptography encryption method that produces meaningful color shares with high visual quality. VIP synchronization retains the positions of pixels carrying visual information of original images throughout the color channels and error diffusion generates shares pleasant to human eyes. Comparisons with previous approaches show the superior performance of the new method.
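
The error-diffusion ingredient can be sketched with the classic Floyd-Steinberg kernel on a grayscale channel (an illustration of error diffusion itself; the paper's scheme additionally pins VIP positions per channel before diffusing, which is not reproduced here):

```python
def error_diffuse(image, threshold=128):
    """Floyd-Steinberg error diffusion: binarize each pixel to 0/255 and
    push the quantization error onto the not-yet-processed neighbours,
    so local average intensity is preserved in the halftone."""
    h, w = len(image), len(image[0])
    img = [list(map(float, row)) for row in image]  # working copy
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            old = img[y][x]
            new = 255 if old >= threshold else 0
            out[y][x] = new
            err = old - new
            if x + 1 < w:
                img[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1][x - 1] += err * 3 / 16
                img[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1][x + 1] += err * 1 / 16
    return out
```

Spreading the error rather than discarding it is what makes the resulting shares "pleasant to human eyes": the halftone's local density tracks the original gray level.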

  12. An adaptive threshold based image processing technique for improved glaucoma detection and classification.

    Science.gov (United States)

    Issac, Ashish; Partha Sarathi, M; Dutta, Malay Kishore

    2015-11-01

    Glaucoma is an optic neuropathy which is one of the main causes of permanent blindness worldwide. This paper presents an automatic image processing based method for detection of glaucoma from digital fundus images. In this work, the discriminatory parameters of glaucoma infection, such as the cup to disc ratio (CDR), neuro retinal rim (NRR) area and blood vessels in different regions of the optic disc, have been used as features and fed as inputs to learning algorithms for glaucoma diagnosis. These features, which show discriminatory changes with the occurrence of glaucoma, are strategically used for training the classifiers to improve the accuracy of identification. The segmentation of the optic disc and cup is based on adaptive thresholding of the pixel intensities lying in the optic nerve head region. Unlike existing methods, the proposed algorithm uses an adaptive threshold derived from local features of the fundus image for segmentation of the optic cup and optic disc, making it invariant to image quality and noise content, which may allow wider acceptability. The experimental results indicate that such features are more significant than the statistical or textural features considered in existing works. The proposed work achieves an accuracy of 94.11% with a sensitivity of 100%. A comparison with existing methods indicates that the proposed approach has improved accuracy of classifying glaucoma from a digital fundus image, which may be considered clinically significant. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  13. Cyber Security for Smart Grid, Cryptography, and Privacy

    Directory of Open Access Journals (Sweden)

    Swapna Iyer

    2011-01-01

    Full Text Available The invention of the “smart grid” promises to improve the efficiency and reliability of the power system. As the smart grid is turning out to be one of the most promising technologies, its security concerns are becoming more crucial. The grid is susceptible to different types of attacks. This paper focuses on these threats and risks, especially those relating to cyber security. Cyber security is a vital topic, since the smart grid relies on a high level of computation, much like IT systems. We also review the cryptography and key management techniques required to counter these attacks. Privacy of consumers is another important security concern that this paper addresses.

  14. Security, Privacy, and Applied Cryptography Engineering

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the Second International Conference on Security, Privacy and Applied Cryptography Engineering held in Chennai, India, in November 2012. The 11 papers presented were carefully reviewed and selected from 61 submissions. The papers are organized in topical sections on symmetric-key algorithms and cryptanalysis, cryptographic implementations, side channel analysis and countermeasures, fault tolerance of cryptosystems, physically unclonable functions, public-key schemes and cryptanalysis, analysis and design of security protocols, security of systems and applications, high-performance computing in cryptology, and cryptography in ubiquitous devices.

  15. Acute exercise performed close to the anaerobic threshold improves cognitive performance in elderly females

    Directory of Open Access Journals (Sweden)

    C. Córdova

    2009-05-01

    Full Text Available The objective of the present study was to compare the effect of acute exercise performed at different intensities in relation to the anaerobic threshold (AT) on abilities requiring control of executive functions or alertness in physically active elderly females. Forty-eight physically active elderly females (63.8 ± 4.6 years old) were assigned to one of four groups by drawing lots: control group without exercise or trial groups with exercise performed at 60, 90, or 110% of AT (watts) and submitted to 5 cognitive tests before and after exercise. Following cognitive pretesting, an incremental cycle ergometer test was conducted to determine AT using a fixed blood lactate concentration of 3.5 mmol/L as cutoff. Acute exercise executed at 90% of AT resulted in significant (P < 0.05, ANOVA) improvement in the performance of executive functions when compared to control in 3 of 5 tests (verbal fluency, Tower of Hanoi test (number of movements), and Trail Making test B). Exercising at 60% of AT did not improve results of any tests for executive functions, whereas exercise executed at 110% of AT only improved the performance in one of these tests (verbal fluency) compared to control. Women from all trial groups exhibited a remarkable reduction in the Simple Response Time (alertness) test (P = 0.001). Thus, physical exercise performed close to AT is more effective to improve cognitive processing of older women even if conducted acutely, and using a customized exercise prescription based on the anaerobic threshold should optimize the beneficial effects.

  16. An Anti-Cheating Visual Cryptography Scheme Based on Chaotic Encryption System

    Science.gov (United States)

    Han, Yanyan; Xu, Zhuolin; Ge, Xiaonan; He, Wencai

    By using a chaotic encryption system and introducing a trusted third party (TTP), this paper proposes an anti-cheating visual cryptography scheme (VCS). The scheme solves the problem of dishonest participants and improves the security of the chaotic encryption system. Simulation results and analysis show that the recovered image is acceptable, and that the system can effectively detect cheating by participants while maintaining high security.

  17. Is Calculus a Failure in Cryptography?

    Indian Academy of Sciences (India)

    Is Calculus a Failure in Cryptography? P Vanchinathan. General Article, Resonance – Journal of Science Education, Volume 21, Issue 3, March 2016, pp 239-245. Permanent link: http://www.ias.ac.in/article/fulltext/reso/021/03/0239-0245

  18. Report on Pairing-based Cryptography

    Science.gov (United States)

    Moody, Dustin; Peralta, Rene; Perlner, Ray; Regenscheid, Andrew; Roginsky, Allen; Chen, Lily

    2015-01-01

    This report summarizes study results on pairing-based cryptography. The main purpose of the study is to form NIST’s position on standardizing and recommending pairing-based cryptography schemes currently published in research literature and standardized in other standard bodies. The report reviews the mathematical background of pairings. This includes topics such as pairing-friendly elliptic curves and how to compute various pairings. It includes a brief introduction to existing identity-based encryption (IBE) schemes and other cryptographic schemes using pairing technology. The report provides a complete study of the current status of standard activities on pairing-based cryptographic schemes. It explores different application scenarios for pairing-based cryptography schemes. As an important aspect of adopting pairing-based schemes, the report also considers the challenges inherent in validation testing of cryptographic algorithms and modules. Based on the study, the report suggests an approach for including pairing-based cryptography schemes in the NIST cryptographic toolkit. The report also outlines several questions that will require further study if this approach is followed. PMID:26958435

  19. Harry Potter and the Cryptography with Matrices

    Science.gov (United States)

    Chua, Boon Liang

    2006-01-01

    This article describes cryptography, the science of encrypting and deciphering messages written in secret codes, which has played a vital role in securing information since ancient times. There are several cryptographic techniques, and many make extensive use of mathematics to secure information. The author discusses an activity built…

  20. Report on Pairing-based Cryptography.

    Science.gov (United States)

    Moody, Dustin; Peralta, Rene; Perlner, Ray; Regenscheid, Andrew; Roginsky, Allen; Chen, Lily

    2015-01-01

    This report summarizes study results on pairing-based cryptography. The main purpose of the study is to form NIST's position on standardizing and recommending pairing-based cryptography schemes currently published in research literature and standardized in other standard bodies. The report reviews the mathematical background of pairings. This includes topics such as pairing-friendly elliptic curves and how to compute various pairings. It includes a brief introduction to existing identity-based encryption (IBE) schemes and other cryptographic schemes using pairing technology. The report provides a complete study of the current status of standard activities on pairing-based cryptographic schemes. It explores different application scenarios for pairing-based cryptography schemes. As an important aspect of adopting pairing-based schemes, the report also considers the challenges inherent in validation testing of cryptographic algorithms and modules. Based on the study, the report suggests an approach for including pairing-based cryptography schemes in the NIST cryptographic toolkit. The report also outlines several questions that will require further study if this approach is followed.

  1. Number Theory and Public-Key Cryptography.

    Science.gov (United States)

    Lefton, Phyllis

    1991-01-01

    Described are activities in the study of techniques used to conceal the meanings of messages and data. Some background information and two BASIC programs that illustrate the algorithms used in a new cryptographic system called "public-key cryptography" are included. (CW)
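    The article's BASIC programs are not reproduced here, but the public-key idea they illustrate can be sketched with a toy RSA round trip in Python. These textbook parameters are far too small to be secure and are our own illustrative choice, not taken from the article.

```python
# Toy RSA: key generation, encryption, decryption with tiny textbook primes.
# Requires Python 3.8+ for the modular-inverse form of pow().

p, q = 61, 53
n = p * q                      # public modulus: 3233
phi = (p - 1) * (q - 1)        # Euler's totient: 3120
e = 17                         # public exponent, coprime to phi
d = pow(e, -1, phi)            # private exponent: modular inverse of e mod phi

def encrypt(m):
    """Encrypt an integer message m < n with the public key (e, n)."""
    return pow(m, e, n)

def decrypt(c):
    """Decrypt a ciphertext with the private key (d, n)."""
    return pow(c, d, n)
```

    Publishing (e, n) while keeping d secret is exactly the "public-key" property the article describes: anyone can encrypt, only the key holder can decrypt.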

  2. Quantum cryptography beyond quantum key distribution

    NARCIS (Netherlands)

    A. Broadbent (Anne); C. Schaffner (Christian)

    2016-01-01

    textabstractQuantum cryptography is the art and science of exploiting quantum mechanical effects in order to perform cryptographic tasks. While the most well-known example of this discipline is quantum key distribution (QKD), there exist many other applications such as quantum money, randomness

  3. CRYPTOGRAPHY- AN IDEAL SOLUTION TO PRIVACY, DATA ...

    African Journals Online (AJOL)

    and non-repudiation through digital signatures; and has become the issue of today's communication ... Encryption, hashing and digital signatures are the three primitives of Cryptography and these have been treated in ... encryption and decryption transformation algorithms in such a way that publicizing the encryption key.

  4. Is Calculus a Failure in Cryptography?

    Indian Academy of Sciences (India)

    Is Calculus a Failure in Cryptography? P Vanchinathan. General Article, Resonance – Journal of Science Education, Volume 21, Issue 3, March 2016, pp 239-245. Permanent link: https://www.ias.ac.in/article/fulltext/reso/021/03/0239-0245

  5. Quantum cryptography beyond quantum key distribution

    NARCIS (Netherlands)

    Broadbent, A.; Schaffner, C.

    2016-01-01

    Quantum cryptography is the art and science of exploiting quantum mechanical effects in order to perform cryptographic tasks. While the most well-known example of this discipline is quantum key distribution (QKD), there exist many other applications such as quantum money, randomness generation,

  6. Cryptography, quantum computation and trapped ions

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, Richard J.

    1998-03-01

    The significance of quantum computation for cryptography is discussed. Following a brief survey of the requirements for quantum computational hardware, an overview of the ion trap quantum computation project at Los Alamos is presented. The physical limitations to quantum computation with trapped ions are analyzed and an assessment of the computational potential of the technology is made.

  7. Architecture for the Secret-Key BC3 Cryptography Algorithm

    Directory of Open Access Journals (Sweden)

    Arif Sasongko

    2011-08-01

    Full Text Available Cryptography is a very important aspect of data security. The focus of research in this field is shifting from merely the security aspect to also consider the implementation aspect. This paper aims to introduce the BC3 algorithm with focus on its hardware implementation, and proposes an architecture for the hardware implementation of this algorithm. BC3 is a secret-key cryptography algorithm developed with two considerations: robustness and implementation efficiency. The algorithm has been implemented in software and has good performance compared to the AES algorithm. BC3 is an improvement of the BC2 and AE cryptographic algorithms and is expected to have the same level of robustness while gaining competitive advantages in the implementation aspect. The development of the architecture gives much attention to (1) resource sharing and (2) having a single clock for each round, exploiting the regularity of the algorithm. This architecture was then implemented on an FPGA. The implementation occupies three times less area than AES, but is about five times faster. Furthermore, this BC3 hardware implementation has better performance than BC3 software, both in the key expansion stage and the randomizing stage. In the future, the security of this implementation must be reviewed, especially against side channel attacks.

  8. An Improved Otsu Threshold Segmentation Method for Underwater Simultaneous Localization and Mapping-Based Navigation

    Directory of Open Access Journals (Sweden)

    Xin Yuan

    2016-07-01

    Full Text Available The main focus of this paper is on extracting features with SOund Navigation And Ranging (SONAR) sensing for further underwater landmark-based Simultaneous Localization and Mapping (SLAM). According to the characteristics of sonar images, in this paper, an improved Otsu threshold segmentation method (TSM) has been developed for feature detection. In combination with a contour detection algorithm, the foreground objects, although presenting different feature shapes, are separated much faster and more precisely than by other segmentation methods. Tests have been made with side-scan sonar (SSS) and forward-looking sonar (FLS) images in comparison with other four TSMs, namely the traditional Otsu method, the local TSM, the iterative TSM and the maximum entropy TSM. For all the sonar images presented in this work, the computational time of the improved Otsu TSM is much lower than that of the maximum entropy TSM, which achieves the highest segmentation precision among the four above mentioned TSMs. As a result of the segmentations, the centroids of the main extracted regions have been computed to represent point landmarks which can be used for navigation, e.g., with the help of an Augmented Extended Kalman Filter (AEKF)-based SLAM algorithm. The AEKF-SLAM approach is a recursive and iterative estimation-update process, which besides a prediction and an update stage (as in classical Extended Kalman Filter (EKF)), includes an augmentation stage. During navigation, the robot localizes the centroids of different segments of features in sonar images, which are detected by our improved Otsu TSM, as point landmarks. Using them with the AEKF achieves more accurate and robust estimations of the robot pose and the landmark positions, than with those detected by the maximum entropy TSM. Together with the landmarks identified by the proposed segmentation algorithm, the AEKF-SLAM has achieved reliable detection of cycles in the map and consistent map update on loop closure.

  9. An Improved Otsu Threshold Segmentation Method for Underwater Simultaneous Localization and Mapping-Based Navigation.

    Science.gov (United States)

    Yuan, Xin; Martínez, José-Fernán; Eckert, Martina; López-Santidrián, Lourdes

    2016-07-22

    The main focus of this paper is on extracting features with SOund Navigation And Ranging (SONAR) sensing for further underwater landmark-based Simultaneous Localization and Mapping (SLAM). According to the characteristics of sonar images, in this paper, an improved Otsu threshold segmentation method (TSM) has been developed for feature detection. In combination with a contour detection algorithm, the foreground objects, although presenting different feature shapes, are separated much faster and more precisely than by other segmentation methods. Tests have been made with side-scan sonar (SSS) and forward-looking sonar (FLS) images in comparison with other four TSMs, namely the traditional Otsu method, the local TSM, the iterative TSM and the maximum entropy TSM. For all the sonar images presented in this work, the computational time of the improved Otsu TSM is much lower than that of the maximum entropy TSM, which achieves the highest segmentation precision among the four above mentioned TSMs. As a result of the segmentations, the centroids of the main extracted regions have been computed to represent point landmarks which can be used for navigation, e.g., with the help of an Augmented Extended Kalman Filter (AEKF)-based SLAM algorithm. The AEKF-SLAM approach is a recursive and iterative estimation-update process, which besides a prediction and an update stage (as in classical Extended Kalman Filter (EKF)), includes an augmentation stage. During navigation, the robot localizes the centroids of different segments of features in sonar images, which are detected by our improved Otsu TSM, as point landmarks. Using them with the AEKF achieves more accurate and robust estimations of the robot pose and the landmark positions, than with those detected by the maximum entropy TSM. 
Together with the landmarks identified by the proposed segmentation algorithm, the AEKF-SLAM has achieved reliable detection of cycles in the map and consistent map update on loop closure.
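    For reference, the classical Otsu method that serves as the baseline in both records above picks the threshold maximizing the between-class variance of the intensity histogram. The sketch below implements only that baseline, not the paper's improved variant; `otsu_threshold` is an illustrative name.

```python
# Classical Otsu thresholding on a list of grayscale pixel values (0..255):
# exhaustively scan candidate thresholds and keep the one that maximizes the
# between-class variance  w_bg * w_fg * (mean_bg - mean_fg)^2.

def otsu_threshold(pixels, levels=256):
    """Return the intensity maximizing between-class variance."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))

    best_t, best_var = 0, -1.0
    w_bg, sum_bg = 0, 0.0
    for t in range(levels):
        w_bg += hist[t]            # pixels at or below candidate threshold
        if w_bg == 0:
            continue
        w_fg = total - w_bg        # pixels above candidate threshold
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

    On a cleanly bimodal image the maximizer lands between the two modes, which is why Otsu works well for separating sonar foreground objects from background.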

  10. Cryptography and the Internet: lessons and challenges

    Energy Technology Data Exchange (ETDEWEB)

    McCurley, K.S.

    1996-12-31

    The popularization of the Internet has brought fundamental changes to the world, because it allows a universal method of communication between computers. This carries enormous benefits with it, but also raises many security considerations. Cryptography is a fundamental technology used to provide security of computer networks, and there is currently a widespread engineering effort to incorporate cryptography into various aspects of the Internet. The system-level engineering required to provide security services for the Internet carries some important lessons for researchers whose study is focused on narrowly defined problems. It also offers challenges to the cryptographic research community by raising new questions not adequately addressed by the existing body of knowledge. This paper attempts to summarize some of these lessons and challenges for the cryptographic research community.

  11. Asymmetric cryptography based on wavefront sensing.

    Science.gov (United States)

    Peng, Xiang; Wei, Hengzheng; Zhang, Peng

    2006-12-15

    A system of asymmetric cryptography based on wavefront sensing (ACWS) is proposed for the first time to our knowledge. One of the most significant features of the asymmetric cryptography is that a trapdoor one-way function is required and constructed by analogy to wavefront sensing, in which the public key may be derived from optical parameters, such as the wavelength or the focal length, while the private key may be obtained from a kind of regular point array. The ciphertext is generated by the encoded wavefront and represented with an irregular array. In such an ACWS system, the encryption key is not identical to the decryption key, which is another important feature of an asymmetric cryptographic system. The processes of asymmetric encryption and decryption are formulized mathematically and demonstrated with a set of numerical experiments.

  12. DRM: tales from the crypt(ography).

    Science.gov (United States)

    Kabachinski, Jeff

    2007-01-01

    That quenches my immediate thirst for researching cryptography. You can see how quickly these methods can get complicated. I don't know, and based on DMCA maybe I don't want to know, how encryption is used in DRM. We know that it includes encryption techniques to protect content from being copied and to control ultimate usage of the content. As with just about any technology, there are good and not so good uses. DRM is just another example. We barely touched the surface of the world of cryptography--you can find so much more with simple internet searches. I've included several of the encryption techniques I came across in the glossary to whet your appetite for more (see www.aami.org/publications/bit). Remember, the key to encryption is the key to encryption!

  13. Spectral coherent-state quantum cryptography.

    Science.gov (United States)

    Cincotti, Gabriella; Spiekman, Leo; Wada, Naoya; Kitayama, Ken-ichi

    2008-11-01

    A novel implementation of quantum-noise optical cryptography is proposed, which is based on a simplified architecture that allows long-haul, high-speed transmission in a fiber optical network. By using a single multiport encoder/decoder and 16 phase shifters, this new approach can provide the same confidentiality as other implementations of Yuen's encryption protocol, which use a larger number of phase or polarization coherent states. Data confidentiality and error probability for authorized and unauthorized receivers are carefully analyzed.

  14. Contributions to Provable Security and Efficient Cryptography

    OpenAIRE

    Schmidt-Samoa, Katja

    2006-01-01

    This thesis deals with two main matters of modern public key cryptography: provable security and efficient implementation. Indubitably, security is the most important property of any cryptographic scheme. Nevertheless, cryptographic algorithms have often been designed on a trial-and-error basis, i.e., a system has been regarded as secure as long as it withstood cryptanalytic attacks. In contrast, the provable security approach provides rigorous mathematical proofs within well-defined models. ...

  15. Improved Acuity and Dexterity but Unchanged Touch and Pain Thresholds following Repetitive Sensory Stimulation of the Fingers

    Directory of Open Access Journals (Sweden)

    Rebecca Kowalewski

    2012-01-01

    Full Text Available Neuroplasticity underlies the brain’s ability to alter perception and behavior through training, practice, or simply exposure to sensory stimulation. Improvement of tactile discrimination has been repeatedly demonstrated after repetitive sensory stimulation (rSS) of the fingers; however, it remains unknown if such protocols also affect hand dexterity or pain thresholds. We therefore stimulated the thumb and index finger of young adults to investigate, besides testing tactile discrimination, the impact of rSS on dexterity, pain, and touch thresholds. We observed an improvement in the pegboard task where subjects used the thumb and index finger only. Accordingly, stimulating 2 fingers simultaneously potentiates the efficacy of rSS. In fact, we observed a higher gain of discrimination performance as compared to a single-finger rSS. In contrast, pain and touch thresholds remained unaffected. Our data suggest that selecting particular fingers modulates the efficacy of rSS, thereby affecting processes controlling sensorimotor integration.

  16. Improvement of the drift chamber system in the SAPHIR detector and first measurements of the Φ meson production at threshold

    International Nuclear Information System (INIS)

    Scholmann, J.N.

    1996-09-01

    The SAPHIR detector at ELSA enables the measurement of photon-induced Φ meson production from threshold up to 3 GeV in the full kinematical range. A considerable improvement of the drift chamber system is a precondition for gaining the necessary data rate in an acceptable time. The research focuses on the choice of the chamber gas and on a different mechanical construction, so as to minimize the negative influences of the photon beam crossing the sensitive volume of the drift chamber system. In addition, first preliminary results for the total and the differential cross section of Φ meson production close to threshold were evaluated. (orig.)

  17. Improving Image Quality of Coronary Computed Tomography Angiography Using Patient Weight and Height-Dependent Scan Trigger Threshold.

    Science.gov (United States)

    Kang, Deqiang; Hua, Haiqin; Peng, Nan; Zhao, Jing; Wang, Zhiqun

    2017-04-01

    We aim to improve the image quality of coronary computed tomography angiography (CCTA) by using personalized weight and height-dependent scan trigger threshold. This study was divided into two parts. First, we performed and analyzed the 100 scheduled CCTA data, which were acquired by using body mass index-dependent Smart Prep sequence (trigger threshold ranged from 80 Hu to 250 Hu based on body mass index). By identifying the cases of high quality image, a linear regression equation was established to determine the correlation among the Smart Prep threshold, height, and body weight. Furthermore, a quick search table was generated for weight and height-dependent Smart Prep threshold in CCTA scan. Second, to evaluate the effectiveness of the new individual threshold method, an additional 100 consecutive patients were divided into two groups: individualized group (n = 50) with weight and height-dependent threshold and control group (n = 50) with the conventional constant threshold of 150 HU. Image quality was compared between the two groups by measuring the enhancement in coronary artery, aorta, left and right ventricle, and inferior vena cava. By visual inspection, image quality scores were performed to compare between the two groups. Regression equation between Smart Prep threshold (K, Hu), height (H, cm), and body weight (BW, kg) was K = 0.811 × H + 1.917 × BW - 99.341. When compared to the control group, the individualized group presented an average overall increase of 12.30% in enhancement in left main coronary artery, 12.94% in proximal right coronary artery, and 10.6% in aorta. Correspondingly, the contrast-to-noise ratios increased by 26.03%, 27.08%, and 23.17%, respectively, and by 633.1% in contrast between aorta and left ventricle. Meanwhile, the individualized group showed an average overall decrease of 22.7% in enhancement of right ventricle and 32.7% in inferior vena cava. There was no significant difference of the
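    The fitted equation quoted above translates directly into code (`smart_prep_threshold` is an illustrative name; units as in the abstract: HU, cm, kg).

```python
# Weight- and height-dependent Smart Prep trigger threshold, transcribed from
# the regression equation reported in the abstract:
#   K (HU) = 0.811 * H (cm) + 1.917 * BW (kg) - 99.341

def smart_prep_threshold(height_cm, weight_kg):
    """Return the individualized CCTA scan trigger threshold in Hounsfield units."""
    return 0.811 * height_cm + 1.917 * weight_kg - 99.341
```

    For a 170 cm, 70 kg patient this gives roughly 173 HU, somewhat above the conventional constant threshold of 150 HU used in the control group.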

  18. Gröbner Bases, Coding, and Cryptography

    CERN Document Server

    Sala, Massimiliano; Perret, Ludovic

    2009-01-01

    Coding theory and cryptography allow secure and reliable data transmission, which is at the heart of modern communication. This book offers a comprehensive overview on the application of commutative algebra to coding theory and cryptography. It analyzes important properties of algebraic/geometric coding systems individually.

  19. Position-based quantum cryptography and catalytic computation

    NARCIS (Netherlands)

    Speelman, F.

    2016-01-01

    In this thesis, we present several results along two different lines of research. The first part concerns the study of position-based quantum cryptography, a topic in quantum cryptography. By combining quantum mechanics with special relativity theory, new cryptographic tasks can be developed that

  20. An improved method to set significance thresholds for β diversity testing in microbial community comparisons

    DEFF Research Database (Denmark)

    Gülay, Arda; Smets, Barth F.

    2015-01-01

    , including those associated with the process of subsampling. These components exist for any proposed β diversity measurement procedure. Further, we introduce a strategy to set significance thresholds for β diversity of any group of microbial samples using rarefaction, invoking the notion of a meta-community...

  1. Adaptive and distributed cryptography for signature biometrics protection

    Science.gov (United States)

    Campisi, Patrizio; Maiorana, Emanuele; Gonzalez Prats, Miguel; Neri, Alessandro

    2007-02-01

    The most emerging technology for people identification and authentication is biometrics. In contrast with traditional recognition approaches, biometric authentication relies on who a person is or what a person does, being based on strictly personal traits, much more difficult to be forgotten, lost, stolen, copied or forged than traditional data. In this paper, we focus on two vulnerable points of biometric systems: the database where the templates are stored and the communication channel between the stored templates and the matcher. Specifically, we propose a method, based on user adaptive error correction codes, to achieve securitization and cancelability of the stored templates applied to dynamic signature features. More in detail, the employed error correction code is tailored to the intra-class variability of each user's signature features. This leads to an enhancement of the system performance expressed in terms of false acceptance rate. Moreover, in order to avoid corruption or interception of the stored templates in the transmission channels, we propose a scheme based on threshold cryptography: the distribution of the certificate authority functionality among a number of nodes provides distributed, fault-tolerant, and hierarchical key management services. Experimental results show the effectiveness of our approach, when compared to traditional non-secure correlation-based classifiers.
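    The threshold-cryptography building block behind a distributed key-management service like the one described above is typically Shamir secret sharing. Below is a minimal (t, n) sketch over a prime field; the prime, function names, and API are illustrative assumptions, not taken from the paper.

```python
# Shamir (t, n) secret sharing: a random degree-(t-1) polynomial with the
# secret as constant term is evaluated at n points; any t points reconstruct
# the secret by Lagrange interpolation at x = 0.
import random

P = 2**127 - 1  # a Mersenne prime, large enough for a toy secret

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is the modular inverse of den (Fermat)
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

    Distributing shares of a certificate authority's signing key across nodes gives exactly the fault-tolerant, hierarchical property the paper relies on: any threshold-sized subset can act, and fewer shares reveal nothing about the key.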

  2. A renormalization group improved calculation of top quark production near threshold

    CERN Document Server

    Hoang, A H; Stewart, I W; Teubner, Thomas

    2001-01-01

    The top quark cross section close to threshold in $e^+e^-$ annihilation is computed including the summation of logarithms of the velocity at next-to-next-to-leading-logarithmic order in QCD. The remaining theoretical uncertainty in the normalization of the total cross section is at the few percent level, an order of magnitude smaller than in previous next-to-next-to-leading order calculations. This uncertainty is smaller than the effects of a light standard model Higgs boson.

  3. Nanoscale cryptography: opportunities and challenges.

    Science.gov (United States)

    Masoumi, Massoud; Shi, Weidong; Xu, Lei

    2015-01-01

    While most of the electronics industry is dependent on the ever-decreasing size of lithographic transistors, this scaling cannot continue indefinitely. To improve the performance of integrated circuits, new emerging paradigms are needed. In recent years, nanoelectronics has become one of the most important and exciting frontiers in science and engineering. It shows great promise for providing us in the near future with many breakthroughs that change the direction of technological advances in a wide range of applications. In this paper, we discuss the contribution that nanotechnology may offer to the evolution of cryptographic hardware and embedded systems and demonstrate how nanoscale devices can be used for constructing security primitives. Using a custom set of design automation tools, it is demonstrated that relative to a conventional 45-nm CMOS system, performance gains can be obtained of up to two orders of magnitude reduction in area and up to 50% improvement in speed.

  4. Improvement Thresholds for Morning Stiffness Duration in Patients Receiving Delayed- Versus Immediate-Release Prednisone for Rheumatoid Arthritis.

    Science.gov (United States)

    Buttgereit, Frank; Kent, Jeffrey D; Holt, Robert J; Grahn, Amy Y; Rice, Patricia; Alten, Rieke; Yazici, Yusuf

    2015-07-01

    Morning stiffness, a common patient-reported symptom in rheumatoid arthritis, is associated with an increase in early morning inflammatory cytokines and significant disability. Little is known about categorical morning stiffness responses to glucocorticoid use in rheumatoid arthritis patients. Chronic pain threshold models have indicated previously that response rates of 15% to 30% indicate minimally important relief, 40% to 50% indicate substantial pain relief, and greater than 70% represents extensive pain relief. The objective of the present analysis was to assess differences in the percentages of patients achieving 25% (minimally important change), 50% (substantial change), and 75% (extensive change) reduction in the duration of patient-reported morning stiffness between patients receiving DR- and IR-prednisone in the Circadian Administration of Prednisone in Rheumatoid Arthritis (CAPRA-1) trial. The CAPRA-1 trial was a 12-week, double-blind study followed by an additional 9-month open-label extension. Patients in the CAPRA-1 trial were randomized to IR-prednisone in the morning or DR-prednisone at bedtime in addition to stable disease modifying antirheumatic drug therapy. After the double-blind phase, patients randomized to IR-prednisone (N = 110) were switched to DR-prednisone and followed at 3, 6, and 9 months in an open-label extension phase. Patients originally randomized to DR-prednisone (N = 97) continued that therapy in the open-label extension. Patient morning stiffness diary entries from 4 weeks before and 4 weeks after each scheduled visit were analyzed over 1 year for threshold response. The number of patients reaching threshold response (25%, 50%, and 75% improvement) and time to morning stiffness response were examined. The DR-prednisone arm had significantly more responders in all three morning stiffness threshold response categories at the end of the double-blind period compared with IR-prednisone (p ≤ 0.05). 
Patients who switched from IR- to DR

  5. EDITORIAL: Focus on Quantum Cryptography: Theory and Practice FOCUS ON QUANTUM CRYPTOGRAPHY: THEORY AND PRACTICE

    Science.gov (United States)

    Lütkenhaus, N.; Shields, A. J.

    2009-04-01

    Quantum cryptography, and especially quantum key distribution (QKD), is steadily progressing to become a viable tool for cryptographic services. In recent years we have witnessed a dramatic increase in the secure bit rate of QKD, as well as its extension to ever longer fibre- and air-based links and the emergence of metro-scale trusted networks. In the foreseeable future even global-scale communications may be possible using quantum repeaters or Earth-satellite links. A handful of start-ups and some bigger companies are already active in the field. The launch of an initiative to form industrial standards for QKD, under the auspices of the European Telecommunication Standards Institute, described in the paper by Laenger and Lenhart in this Focus Issue, can be taken as a sign of the growing commercial interest. Recent progress has seen an increase in the secure bit rate of QKD links, by orders of magnitude, to over 1 Mb s-1. This has resulted mainly from an improvement in the detection technology. Here changes in the way conventional semiconductor detectors are gated, as well as the development of novel devices based on non-linear processes and superconducting materials, are leading the way. Additional challenges for QKD at GHz clock rates include the design of high speed electronics, remote synchronization and high rate random number generation. Substantial effort is being devoted to increasing the range of individual links, which is limited by attenuation and other losses in optical fibres and air links. An important advance in the past few years has been the introduction of protocols with the same scaling as an ideal single-photon set-up. The good news is that these schemes use standard optical devices, such as weak laser pulses. Thanks to these new protocols and improvements in the detection technology, the range of a single fibre link can exceed a few hundred km. Outstanding issues include proving the unconditional security of some of the schemes. Much of the

  6. Computation, cryptography, and network security

    CERN Document Server

    Rassias, Michael

    2015-01-01

Analysis, assessment, and data management are core competencies for operations research analysts. This volume addresses a number of issues and presents methods developed for improving those skills. It is an outgrowth of a conference held in April 2013 at the Hellenic Military Academy, and brings together a broad variety of mathematical methods and theories with several applications. It discusses directions and pursuits of scientists that pertain to engineering sciences. It also presents the theoretical background required for algorithms and techniques applied to a large variety of concrete problems. A number of open questions as well as new future areas are also highlighted. This book will appeal to operations research analysts, engineers, community decision makers, academics, the military community, practitioners sharing the current "state-of-the-art," and analysts from coalition partners. Topics covered include Operations Research, Games and Control Theory, Computational Number Theory and Information Securi...

  7. Improving occupational injury surveillance by using a severity threshold: development of a new occupational health indicator.

    Science.gov (United States)

    Sears, Jeanne M; Bowman, Stephen M; Rotert, Mary; Blanar, Laura; Hogg-Johnson, Sheilah

    2016-06-01

    Hospital discharge data are used for occupational injury surveillance, but observed hospitalisation trends are affected by trends in healthcare practices and workers' compensation coverage that may increasingly impair ascertainment of minor injuries relative to severe injuries. The objectives of this study were to (1) describe the development of a severe injury definition for surveillance purposes and (2) assess the impact of imposing a severity threshold on estimated occupational and non-occupational injury trends. Three independent methods were used to estimate injury severity for the severe injury definition. 10 population-based hospital discharge databases were used to estimate trends (1998-2009), including the National Hospital Discharge Survey (NHDS) and State Inpatient Databases (SID) from the Healthcare Cost and Utilization Project (HCUP), Agency for Healthcare Research and Quality. Negative binomial regression was used to model injury trends with and without severity restriction and to test trend divergence by severity. Trend estimates for occupational injuries were biased downwards in the absence of severity restriction, more so than for non-occupational injuries. Imposing a severity threshold resulted in a markedly different historical picture. Severity restriction can be used as an injury surveillance methodology to increase the accuracy of trend estimates, which can then be used by occupational health researchers, practitioners and policy-makers to identify prevention opportunities and to support state and national investments in occupational injury prevention efforts. The newly adopted state-based occupational health indicator, 'Work-Related Severe Traumatic Injury Hospitalizations', incorporates a severity threshold that will reduce temporal ascertainment threats to accurate trend estimates. Published by the BMJ Publishing Group Limited. 

  8. Practical Computer Security through Cryptography

    Science.gov (United States)

    McNab, David; Twetev, David (Technical Monitor)

    1998-01-01

The core protocols upon which the Internet was built are insecure. Weak authentication and the lack of low-level encryption services introduce vulnerabilities that propagate upwards in the network stack. Using statistics based on CERT/CC Internet security incident reports, the relative likelihood of attacks via these vulnerabilities is analyzed. The primary conclusion is that the standard UNIX BSD-based authentication system is by far the most commonly exploited weakness. Encryption of sensitive password data and the adoption of cryptographically-based authentication protocols can greatly reduce these vulnerabilities. Basic cryptographic terminology and techniques are presented, with attention focused on the ways in which technology such as encryption and digital signatures can be used to protect against the most commonly exploited vulnerabilities. A survey of contemporary security software demonstrates that tools based on cryptographic techniques, such as Kerberos, ssh, and PGP, are readily available and effectively close many of the most serious security holes. Nine practical recommendations for improving security are described.

  9. Lead-chalcogenide mid-infrared vertical external cavity surface emitting lasers with improved threshold: Theory and experiment

    Science.gov (United States)

    Fill, Matthias; Debernardi, Pierluigi; Felder, Ferdinand; Zogg, Hans

    2013-11-01

    Mid-infrared Vertical External Cavity Surface Emitting Lasers (VECSEL) based on narrow gap lead-chalcogenide (IV-VI) semiconductors exhibit strongly reduced threshold powers if the active layers are structured laterally for improved optical confinement. This is predicted by 3-d optical calculations; they show that lateral optical confinement is needed to counteract the anti-guiding features of IV-VIs due to their negative temperature dependence of the refractive index. An experimental proof is performed with PbSe quantum well based VECSEL grown on a Si-substrate by molecular beam epitaxy and emitting around 3.3 μm. With proper mesa-etching, the threshold intensity is about 8-times reduced.

  10. Lead-chalcogenide mid-infrared vertical external cavity surface emitting lasers with improved threshold: Theory and experiment

    Energy Technology Data Exchange (ETDEWEB)

    Fill, Matthias [ETH Zurich, Laser Spectroscopy and Sensing Lab, 8093 Zurich (Switzerland); Phocone AG, 8005 Zurich (Switzerland); Debernardi, Pierluigi [IEIIT-CNR, Torino 10129 (Italy); Felder, Ferdinand [Phocone AG, 8005 Zurich (Switzerland); Zogg, Hans [ETH Zurich (Switzerland)

    2013-11-11

    Mid-infrared Vertical External Cavity Surface Emitting Lasers (VECSEL) based on narrow gap lead-chalcogenide (IV-VI) semiconductors exhibit strongly reduced threshold powers if the active layers are structured laterally for improved optical confinement. This is predicted by 3-d optical calculations; they show that lateral optical confinement is needed to counteract the anti-guiding features of IV-VIs due to their negative temperature dependence of the refractive index. An experimental proof is performed with PbSe quantum well based VECSEL grown on a Si-substrate by molecular beam epitaxy and emitting around 3.3 μm. With proper mesa-etching, the threshold intensity is about 8-times reduced.

  11. An n -material thresholding method for improving integerness of solutions in topology optimization

    International Nuclear Information System (INIS)

    Watts, Seth; Tortorelli, Daniel A.

    2016-01-01

It is common in solving topology optimization problems to replace an integer-valued characteristic function design field with the material volume fraction field, a real-valued approximation of the design field that permits "fictitious" mixtures of materials during intermediate iterations in the optimization process. This is reasonable so long as one can interpolate properties for such materials and so long as the final design is integer valued. For this purpose, we present a method for smoothly thresholding the volume fractions of an arbitrary number of material phases which specify the design. This method is trivial for two-material design problems, for example, the canonical topology design problem of specifying the presence or absence of a single material within a domain, but it becomes more complex when three or more materials are used, as often occurs in material design problems. We take advantage of the similarity in properties between the volume fractions and the barycentric coordinates on a simplex to derive a thresholding method which is applicable to an arbitrary number of materials. As we show in a sensitivity analysis, this method has smooth derivatives, allowing it to be used in gradient-based optimization algorithms. Finally, we present results which show synergistic effects when used with Solid Isotropic Material with Penalty and Rational Approximation of Material Properties material interpolation functions, popular methods of ensuring integerness of solutions.
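The projection idea behind such thresholding can be illustrated with the standard two-material smoothed Heaviside (tanh) projection widely used in topology optimization. This is a sketch of the general approach only, not the paper's n-material barycentric construction; `beta` and `eta` are illustrative parameter names.

```python
import math

def smooth_threshold(x, beta=8.0, eta=0.5):
    """Smoothed Heaviside (tanh) projection: a differentiable map that pushes
    intermediate volume fractions toward 0 or 1. beta controls sharpness,
    eta is the threshold level; smooth derivatives suit gradient-based solvers."""
    num = math.tanh(beta * eta) + math.tanh(beta * (x - eta))
    den = math.tanh(beta * eta) + math.tanh(beta * (1.0 - eta))
    return num / den

# Fractions below eta are pushed toward 0, fractions above eta toward 1,
# while the map remains smooth everywhere.
print([round(smooth_threshold(x), 3) for x in (0.0, 0.1, 0.5, 0.9, 1.0)])
```

Increasing `beta` over the optimization iterations sharpens the projection, driving the final design toward integer values.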

  12. Clinically important improvement thresholds for Harris Hip Score and its ability to predict revision risk after primary total hip arthroplasty.

    Science.gov (United States)

    Singh, Jasvinder A; Schleck, Cathy; Harmsen, Scott; Lewallen, David

    2016-06-10

Some aspects of validity are missing for the Harris Hip Score (HHS). Our objective was to examine the clinically meaningful change thresholds, responsiveness and the predictive ability of the HHS questionnaire. We included a cohort of patients who underwent primary total hip arthroplasty (THA) and responded to the HHS preoperatively and at 2- or 5-years post-THA (change score) to examine the clinically meaningful change thresholds (minimal clinically important improvement, MCII; and moderate improvement), responsiveness (effect size (ES) and standardized response mean (SRM)) based on pre- to post-operative change, and the predictive ability of change score or absolute postoperative score at 2- and 5-years post-THA for future revision. Two thousand six hundred sixty-seven patients with a mean age of 64 years completed baseline HHS; 1036 completed both baseline and 2-year HHS and 669 both baseline and 5-year HHS. MCII and moderate improvement thresholds ranged 15.9-18 points and 39.6-40.1 points, respectively. ES was 3.12 and 3.02 at 2- and 5-years; respective SRM was 2.73 and 2.52. There were 3195 hips with HHS scores at 2-years and 2699 hips with HHS scores at 5-years (regardless of the completion of baseline HHS; absolute postoperative scores). Compared to patients with absolute HHS scores of 81-100 (score range, 0-100), patients with lower scores had an increased hazard of revision, 4.34 (2.14, 7.95). Compared to patients who improved by more than 50 points from preoperative to 2-years post-THA, lack of improvement/worsening or a 1-20 point improvement was associated with increased hazards of revision, 18.10 (1.41, 234.83; p = 0.02) and 6.21 (0.81, 60.73; p = 0.10), respectively. HHS is a valid measure of THA outcomes and is responsive to change. Both absolute HHS postoperative scores and HHS score change postoperatively are predictive of revision risk post-primary THA. We defined MCII and moderate improvement thresholds for HHS in this study.

  13. Event-by-event simulation of quantum cryptography protocols

    NARCIS (Netherlands)

    Zhao, S.; Raedt, H. De

    We present a new approach to simulate quantum cryptography protocols using event-based processes. The method is validated by simulating the BB84 protocol and the Ekert protocol, both without and with the presence of an eavesdropper.
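The event-by-event idea can be illustrated with a toy BB84 model. This is a hedged sketch of the protocol's qualitative behaviour (error-free sifted key without eavesdropping, errors under intercept-resend), not the authors' simulator; the function name and parameters are illustrative.

```python
import random

def bb84_sample(n_bits, eavesdrop=False, seed=0):
    """Toy event-by-event BB84: Alice sends bits in random bases, Bob measures
    in random bases; events with matching bases form the sifted key. An
    intercept-resend eavesdropper disturbs the states and induces errors."""
    rng = random.Random(seed)
    sifted_a, sifted_b = [], []
    for _ in range(n_bits):
        bit = rng.randrange(2)
        a_basis = rng.randrange(2)              # 0 = rectilinear, 1 = diagonal
        if eavesdrop:
            e_basis = rng.randrange(2)          # Eve measures in a random basis
            bit_sent = bit if e_basis == a_basis else rng.randrange(2)
            send_basis = e_basis                # ...and resends in her basis
        else:
            bit_sent, send_basis = bit, a_basis
        b_basis = rng.randrange(2)
        outcome = bit_sent if b_basis == send_basis else rng.randrange(2)
        if a_basis == b_basis:                  # basis reconciliation (sifting)
            sifted_a.append(bit)
            sifted_b.append(outcome)
    errors = sum(a != b for a, b in zip(sifted_a, sifted_b))
    return len(sifted_a), errors
```

Without an eavesdropper the sifted key is error-free; with intercept-resend a quantum bit error rate of roughly 25% appears, which is how the legitimate parties detect the attack.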

  14. Advanced Mitigation Process (AMP) for Improving Laser Damage Threshold of Fused Silica Optics

    Science.gov (United States)

    Ye, Xin; Huang, Jin; Liu, Hongjie; Geng, Feng; Sun, Laixi; Jiang, Xiaodong; Wu, Weidong; Qiao, Liang; Zu, Xiaotao; Zheng, Wanguo

    2016-08-01

The laser damage precursors in the subsurface of fused silica (e.g. photosensitive impurities, scratches and redeposited silica compounds) were mitigated by mineral acid leaching and by HF etching with multi-frequency ultrasonic agitation, respectively. Scratch morphologies after static etching and after high-frequency ultrasonic agitation etching were compared, and the laser-induced damage resistance of scratched and non-scratched fused silica surfaces after HF etching with high-frequency ultrasonic agitation was also investigated. The global laser-induced damage resistance increased significantly after the laser damage precursors were mitigated. Redeposition of the reaction products was avoided by combining the multi-frequency ultrasonic and chemical leaching processes, which made the increase in laser damage threshold more stable. In addition, no scratch-related damage initiation was found on the samples treated by the Advanced Mitigation Process.

  15. Hardware Accelerators for Elliptic Curve Cryptography

    Directory of Open Access Journals (Sweden)

    C. Puttmann

    2008-05-01

In this paper we explore different hardware accelerators for cryptography based on elliptic curves. Furthermore, we present a hierarchical multiprocessor system-on-chip (MPSoC) platform that can be used for fast integration and evaluation of novel hardware accelerators. With respect to two application scenarios, the hardware accelerators are coupled at different hierarchy levels of the MPSoC platform. The whole system is implemented in a state-of-the-art 65 nm standard-cell technology. Moreover, an FPGA-based rapid prototyping system for fast system verification is presented. Finally, a metric to analyze the resource efficiency by means of chip area, execution time and energy consumption is introduced.

  16. A NOVEL ROLLING BASED DNA CRYPTOGRAPHY

    Directory of Open Access Journals (Sweden)

    Rejwana Haque

    2017-05-01

DNA cryptography can be defined as hiding data in the form of a DNA sequence. In this paper we propose a new DNA encryption technique in which three different types of ordering are used to turn binary data into ciphertext. The main stages of this encryption technique are: key analysis, data and key arrangement, roll-in encoding, secondary arrangement and shifting. The decryption process has six main steps to obtain the original binary data from the encrypted data and key: key analysis, shifting, secondary arrangement, key arrangement, roll-out decoding and data arrangement. Here the key size is half that of the binary data, and the key varies from message to message, so keys are used as one-time pads. We also discuss the implementation on sample data and a security analysis of the given method.
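The basic binary-to-DNA encoding step underlying such schemes can be sketched with the conventional 2-bit base mapping. This is a generic illustration only, not the paper's rolling scheme, whose ordering and shifting stages are specific to that work.

```python
# Conventional 2-bit binary <-> DNA base mapping (A=00, C=01, G=10, T=11).
BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BITS = {v: k for k, v in BASE.items()}

def to_dna(data: bytes) -> str:
    """Encode bytes as a DNA sequence, two bits per base."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def from_dna(seq: str) -> bytes:
    """Decode a DNA sequence back to the original bytes."""
    bits = "".join(BITS[b] for b in seq)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

assert to_dna(b"\x1b") == "ACGT"       # 00 01 10 11 -> A C G T
assert from_dna(to_dna(b"key")) == b"key"
```

On top of this mapping, DNA-based schemes then permute, pad, or otherwise transform the base sequence under control of a key.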

  17. Introduction to number theory with cryptography

    CERN Document Server

    Kraft, James S

    2013-01-01

    IntroductionDiophantine EquationsModular ArithmeticPrimes and the Distribution of PrimesCryptographyDivisibilityDivisibilityEuclid's Theorem Euclid's Original Proof The Sieve of Eratosthenes The Division Algorithm The Greatest Common Divisor The Euclidean Algorithm Other BasesLinear Diophantine EquationsThe Postage Stamp Problem Fermat and Mersenne Numbers Chapter Highlights Problems Unique FactorizationPreliminary Results The Fundamental Theorem of Arithmetic Euclid and the Fundamental Theorem of ArithmeticChapter Highlights Problems Applications of Unique Factorization A Puzzle Irrationality Proofs The Rational Root Theorem Pythagorean Triples Differences of Squares Prime Factorization of Factorials The Riemann Zeta Function Chapter Highlights Problems CongruencesDefinitions and Examples Modular Exponentiation Divisibility TestsLinear Congruences The Chinese Remainder TheoremFractions mod m Fermat's Theorem Euler's Theorem Wilson's Theorem Queens on a Chessboard Chapter Highlights Problems Cryptographic App...

  18. Securing information display by use of visual cryptography.

    Science.gov (United States)

    Yamamoto, Hirotsugu; Hayasaki, Yoshio; Nishida, Nobuo

    2003-09-01

    We propose a secure display technique based on visual cryptography. The proposed technique ensures the security of visual information. The display employs a decoding mask based on visual cryptography. Without the decoding mask, the displayed information cannot be viewed. The viewing zone is limited by the decoding mask so that only one person can view the information. We have developed a set of encryption codes to maintain the designed viewing zone and have demonstrated a display that provides a limited viewing zone.
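The underlying 2-out-of-2 visual cryptography construction can be sketched as follows. This is a minimal model with one-dimensional pixel lists; the paper's encryption codes for controlling the viewing zone are more elaborate.

```python
import random

def make_shares(secret_bits, seed=0):
    """2-out-of-2 visual cryptography: each secret pixel (0=white, 1=black)
    expands to a pair of subpixels per share. For a white pixel both shares
    get the same random pair; for a black pixel they get complementary pairs."""
    rng = random.Random(seed)
    share1, share2 = [], []
    for bit in secret_bits:
        pattern = rng.choice([(0, 1), (1, 0)])          # random subpixel pair
        share1.append(pattern)
        if bit == 0:
            share2.append(pattern)                      # white: identical
        else:
            share2.append((1 - pattern[0], 1 - pattern[1]))  # black: complement
    return share1, share2

def stack(share1, share2):
    """Overlaying transparencies: a subpixel is dark if dark in either share."""
    return [(a[0] | b[0], a[1] | b[1]) for a, b in zip(share1, share2)]

def decode(stacked):
    """A fully dark pair decodes as black; a half-dark pair as white."""
    return [1 if pair == (1, 1) else 0 for pair in stacked]

secret = [0, 1, 1, 0, 1]
s1, s2 = make_shares(secret)
assert decode(stack(s1, s2)) == secret
```

Each share on its own is a uniformly random pattern, so the displayed information is unreadable without the decoding mask.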

  19. Design and Implementation of Lattice-Based Cryptography

    OpenAIRE

    Lepoint, Tancrède

    2014-01-01

    Today, lattice-based cryptography is a thriving scientific field. Its swift expansion is due, among others, to the attractiveness of fully homomorphic encryption and cryptographic multilinear maps. Lattice-based cryptography has also been recognized for its thrilling properties: a security that can be reduced to worst-case instances of problems over lattices, a quasi-optimal asymptotic efficiency and an alleged resistance to quantum computers. However, its practical use in r...

  20. High-rate measurement-device-independent quantum cryptography

    DEFF Research Database (Denmark)

    Pirandola, Stefano; Ottaviani, Carlo; Spedalieri, Gaetana

    2015-01-01

    Quantum cryptography achieves a formidable task - the remote distribution of secret keys by exploiting the fundamental laws of physics. Quantum cryptography is now headed towards solving the practical problem of constructing scalable and secure quantum networks. A significant step in this direction...... than those currently achieved. Our protocol could be employed to build high-rate quantum networks where devices securely connect to nearby access points or proxy servers....

  1. PREFACE: Quantum Information, Communication, Computation and Cryptography

    Science.gov (United States)

    Benatti, F.; Fannes, M.; Floreanini, R.; Petritis, D.

    2007-07-01

    The application of quantum mechanics to information related fields such as communication, computation and cryptography is a fast growing line of research that has been witnessing an outburst of theoretical and experimental results, with possible practical applications. On the one hand, quantum cryptography with its impact on secrecy of transmission is having its first important actual implementations; on the other hand, the recent advances in quantum optics, ion trapping, BEC manipulation, spin and quantum dot technologies allow us to put to direct test a great deal of theoretical ideas and results. These achievements have stimulated a reborn interest in various aspects of quantum mechanics, creating a unique interplay between physics, both theoretical and experimental, mathematics, information theory and computer science. In view of all these developments, it appeared timely to organize a meeting where graduate students and young researchers could be exposed to the fundamentals of the theory, while senior experts could exchange their latest results. The activity was structured as a school followed by a workshop, and took place at The Abdus Salam International Center for Theoretical Physics (ICTP) and The International School for Advanced Studies (SISSA) in Trieste, Italy, from 12-23 June 2006. The meeting was part of the activity of the Joint European Master Curriculum Development Programme in Quantum Information, Communication, Cryptography and Computation, involving the Universities of Cergy-Pontoise (France), Chania (Greece), Leuven (Belgium), Rennes1 (France) and Trieste (Italy). This special issue of Journal of Physics A: Mathematical and Theoretical collects 22 contributions from well known experts who took part in the workshop. They summarize the present day status of the research in the manifold aspects of quantum information. 
The issue is opened by two review articles, the first by G Adesso and F Illuminati discussing entanglement in continuous variable

  2. Guaranteerring Security of Financial Transaction by Using Quantum Cryptography in Banking Environment

    Science.gov (United States)

    Ghernaouti-Hélie, Solange; Sfaxi, Mohamed Ali

Protocols and applications could profit from quantum cryptography to secure communications. The applications of quantum cryptography are linked to telecommunication services that require a very high level of security, such as bank transactions.

  3. Implementation of diffie-Hellman key exchange on wireless sensor using elliptic curve cryptography

    DEFF Research Database (Denmark)

    Khajuria, Samant; Tange, Henrik

    2009-01-01

    This work describes a low-cost public key cryptography (PKC) based solution for security services such as authentication as required for wireless sensor networks. We have implemented a software approach using elliptic curve cryptography (ECC) over GF (2m) in order to obtain stronger cryptography...
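The elliptic-curve Diffie-Hellman exchange at the heart of such schemes can be sketched in pure Python. Note the hedges: this toy uses a prime-field curve (secp256k1) rather than the GF(2^m) arithmetic of the paper, affine coordinates, and no side-channel protection; real deployments should use a vetted library.

```python
import secrets

# secp256k1 domain parameters: y^2 = x^3 + 7 over GF(P), base point G of order N
P = 2**256 - 2**32 - 977
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

def point_add(p, q):
    """Affine point addition; None represents the identity element."""
    if p is None:
        return q
    if q is None:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P == 0:
        return None                                     # p + (-p) = identity
    if p == q:
        m = (3 * x1 * x1) * pow(2 * y1, -1, P) % P      # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P         # chord slope
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def scalar_mult(k, point=G):
    """Double-and-add scalar multiplication k * point."""
    result = None
    while k:
        if k & 1:
            result = point_add(result, point)
        point = point_add(point, point)
        k >>= 1
    return result

# Diffie-Hellman: both parties derive the same shared point
alice_priv = secrets.randbelow(N - 1) + 1
bob_priv = secrets.randbelow(N - 1) + 1
shared_a = scalar_mult(alice_priv, scalar_mult(bob_priv))
shared_b = scalar_mult(bob_priv, scalar_mult(alice_priv))
assert shared_a == shared_b
```

The exchanged public keys are `priv * G`; the shared secret `alice_priv * bob_priv * G` is then hashed into a symmetric key.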

  4. Handbook of elliptic and hyperelliptic curve cryptography

    CERN Document Server

    Cohen, Henri; Avanzi, Roberto; Doche, Christophe; Lange, Tanja; Nguyen, Kim; Vercauteren, Frederik

    2005-01-01

    … very comprehensive coverage of this vast subject area … a useful and essential treatise for anyone involved in elliptic curve algorithms … this book offers the opportunity to grasp the ECC technology with a diversified and comprehensive perspective. … This book will remain on my shelf for a long time and will land on my desk on many occasions, if only because the coverage of the issues common to factoring and discrete log cryptosystems is excellent.-IACR Book Reviews, June 2011… the book is designed for people who are working in the area and want to learn more about a specific issue. The chapters are written to be relatively independent so that readers can focus on the part of interest for them. Such readers will be grateful for the excellent index and extensive bibliography. … the handbook covers a wide range of topics and will be a valuable reference for researchers in curve-based cryptography. -Steven D. Galbraith, Mathematical Reviews, Issue 2007f.

  5. Practical device-independent quantum cryptography via entropy accumulation.

    Science.gov (United States)

    Arnon-Friedman, Rotem; Dupuis, Frédéric; Fawzi, Omar; Renner, Renato; Vidick, Thomas

    2018-01-31

    Device-independent cryptography goes beyond conventional quantum cryptography by providing security that holds independently of the quality of the underlying physical devices. Device-independent protocols are based on the quantum phenomena of non-locality and the violation of Bell inequalities. This high level of security could so far only be established under conditions which are not achievable experimentally. Here we present a property of entropy, termed "entropy accumulation", which asserts that the total amount of entropy of a large system is the sum of its parts. We use this property to prove the security of cryptographic protocols, including device-independent quantum key distribution, while achieving essentially optimal parameters. Recent experimental progress, which enabled loophole-free Bell tests, suggests that the achieved parameters are technologically accessible. Our work hence provides the theoretical groundwork for experimental demonstrations of device-independent cryptography.

  6. Low tube voltage CT for improved detection of pancreatic cancer: detection threshold for small, simulated lesions

    Directory of Open Access Journals (Sweden)

    Holm Jon

    2012-07-01

Background Pancreatic ductal adenocarcinoma is associated with a dismal prognosis. The detection of small pancreatic tumors which are still resectable remains a challenging problem. The aim of this study was to investigate the effect of decreasing the tube voltage from 120 to 80 kV on the detection of pancreatic tumors. Methods Three scanning protocols were used: one using the standard tube voltage (120 kV) and current (160 mA), and two using 80 kV but with different tube currents (500 and 675 mA) to achieve a dose (15 mGy) and noise (15 HU) equivalent to those of the standard protocol. Tumors were simulated into collected CT phantom images. The attenuation in normal parenchyma at 120 kV was set at 130 HU, as measured previously in clinical examinations, and the tumor attenuation was assumed to differ by 20 HU and was set at 110 HU. By scanning and measuring iodine solutions of different concentrations, the corresponding tumor and parenchyma attenuations at 80 kV were found to be 185 and 219 HU, respectively. To objectively evaluate the differences between the three protocols, a multi-reader multi-case receiver operating characteristic study was conducted, using three readers and 100 cases, each containing 0-3 lesions. Results The highest reader-averaged figure-of-merit (FOM) was achieved for 80 kV and 675 mA (FOM = 0.850), and the lowest for 120 kV (FOM = 0.709). There was a significant difference between the three protocols; pairwise testing showed a significant difference between 120 and 80 kV, but not between the two levels of tube current at 80 kV. Conclusion We conclude that decreasing the tube voltage yields a significant improvement in tumor conspicuity.

  7. Quantum discord as a resource for quantum cryptography.

    Science.gov (United States)

    Pirandola, Stefano

    2014-11-07

    Quantum discord is the minimal bipartite resource which is needed for a secure quantum key distribution, being a cryptographic primitive equivalent to non-orthogonality. Its role becomes crucial in device-dependent quantum cryptography, where the presence of preparation and detection noise (inaccessible to all parties) may be so strong to prevent the distribution and distillation of entanglement. The necessity of entanglement is re-affirmed in the stronger scenario of device-independent quantum cryptography, where all sources of noise are ascribed to the eavesdropper.

  8. Generalized logistic map and its application in chaos based cryptography

    Science.gov (United States)

    Lawnik, M.

    2017-12-01

The logistic map is commonly used in, for example, chaos-based cryptography. However, its properties do not render a safe construction of encryption algorithms. Thus, the scope of the paper is a proposal of a generalization of the logistic map by means of a well-recognized family of chaotic maps. In the next step, the Lyapunov exponent and the distribution of the iterative variable are analyzed. The obtained results confirm that the analyzed model can safely and effectively replace a classic logistic map in applications involving chaotic cryptography.
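For context, the classic logistic-map keystream construction that such papers analyze (and improve upon) might look like the sketch below. The parameter values and function names are illustrative, and floating-point chaos of this kind is explicitly not a secure primitive, which is the motivation for generalized maps.

```python
def logistic_keystream(x0, r, nbytes, burn_in=1000):
    """Derive a byte keystream from logistic-map iterates x -> r*x*(1-x).
    The transient is discarded, then low-order digits of each iterate
    are taken as keystream bytes. Illustration only; not secure."""
    x = x0
    for _ in range(burn_in):            # discard the initial transient
        x = r * x * (1 - x)
    out = bytearray()
    for _ in range(nbytes):
        x = r * x * (1 - x)
        out.append(int(x * 65536) % 256)
    return bytes(out)

def xor_cipher(data, key_x0, r=3.99):
    """XOR stream cipher keyed by the initial condition x0."""
    ks = logistic_keystream(key_x0, r, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

msg = b"chaos-based cryptography"
ct = xor_cipher(msg, 0.3141592653589793)
assert xor_cipher(ct, 0.3141592653589793) == msg   # XOR is an involution
```

Known weaknesses of this construction (non-uniform invariant density, windows of periodicity in `r`) are exactly what analyses of the Lyapunov exponent and iterate distribution probe.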

  9. Adaptive Hardware Cryptography Engine Based on FPGA

    International Nuclear Information System (INIS)

    Afify, M.A.A.

    2011-01-01

In the last two decades, with the spread of real-time applications over public networks and communications, the need for information security has become more important, together with the need for very high-speed data processing to keep up with real-time requirements; this is the reason for using an FPGA as the implementation platform for the proposed cryptography engine. Hence, in this thesis a new S-Box design has been demonstrated and implemented. Simulation results for the proposed S-Box are compared with those of the S-Box designs in the DES, Twofish and Rijndael algorithms, and proposed S-Boxes of different sizes are compared with one another. The proposed S-Box is implemented with 32-bit input data lines and compared with designs in the encryption algorithms with the same input lines: the proposed S-Box reaches a maximum frequency of 120 MHz, whereas the DES S-Box reaches 34 MHz and Rijndael 71 MHz. The proposed design also gives the best implementation area, occupying 50 configurable logic blocks (CLBs), whereas DES occupies 88 CLBs. The proposed S-Box was implemented in sizes of 64, 128, and 256 bits for the input data lines. The implementation was carried out using a UniDAq PCI card with an FPGA chip XCV 800; synthesis of all designs was carried out using Leonardo Spectrum, and simulation using the ModelSim simulator from the FPGA Advantage package. Finally, results evaluation and verification were carried out using the UniDAq FPGA PCI card with the XCV 800 chip. Several case studies have been implemented: data encryption, image encryption, voice encryption, and video encryption, and a prototype remote monitoring control system has been implemented. The proposed S-Box design achieves significant improvements in maximum frequency, implementation area, and encryption strength.

  10. Acrolein-stressed threshold adaptation alters the molecular and metabolic bases of an engineered Saccharomyces cerevisiae to improve glutathione production.

    Science.gov (United States)

    Zhou, Wenlong; Yang, Yan; Tang, Liang; Cheng, Kai; Li, Changkun; Wang, Huimin; Liu, Minzhi; Wang, Wei

    2018-03-14

Acrolein (Acr) was used as a selection agent to improve the glutathione (GSH) overproduction of the prototrophic strain W303-1b/FGP PT . After two rounds of adaptive laboratory evolution (ALE), an unexpected result was obtained wherein identical GSH production was observed in the selected isolates. A threshold selection mechanism of Acr-stressed adaptation was then clarified based on the formation of an Acr-GSH adduct, and a diffusion coefficient (0.36 ± 0.02 μmol·min⁻¹·OD₆₀₀⁻¹) was calculated. Metabolomic analysis was carried out to reveal the molecular bases that triggered GSH overproduction. The results indicated that all three precursors (glutamic acid (Glu), glycine (Gly) and cysteine (Cys)) needed for GSH synthesis were at relatively higher concentrations in the evolved strain and that the accumulation of homocysteine (Hcy) and cystathionine might promote Cys synthesis and then improve GSH production. In addition to GSH and Cys, other non-protein thiols and molecules related to ATP generation were observed at markedly different levels. To divert the accumulated thiols to GSH biosynthesis, combinatorial strategies, including deletion of cystathionine β-lyase (STR3), overexpression of cystathionine γ-lyase (CYS3) and cystathionine β-synthase (CYS4), and reduction of the unfolded protein response (UPR) through up-regulation of protein disulphide isomerase (PDI), were also investigated.

  11. Remote sensing of aquatic vegetation distribution in Taihu Lake using an improved classification tree with modified thresholds.

    Science.gov (United States)

    Zhao, Dehua; Jiang, Hao; Yang, Tangwu; Cai, Ying; Xu, Delin; An, Shuqing

    2012-03-01

    Classification trees (CT) have been used successfully in the past to classify aquatic vegetation from spectral indices (SI) obtained from remotely-sensed images. However, applying CT models developed for certain image dates to other time periods within the same year or among different years can reduce the classification accuracy. In this study, we developed CT models with modified thresholds using extreme SI values (CT(m)) to improve the stability of the models when applying them to different time periods. A total of 903 ground-truth samples were obtained in September of 2009 and 2010 and classified as emergent, floating-leaf, or submerged vegetation or other cover types. Classification trees were developed for 2009 (Model-09) and 2010 (Model-10) using field samples and a combination of two images from winter and summer. Overall accuracies of these models were 92.8% and 94.9%, respectively, which confirmed the ability of CT analysis to map aquatic vegetation in Taihu Lake. However, Model-10 had only 58.9-71.6% classification accuracy and 31.1-58.3% agreement (i.e., pixels classified the same in the two maps) for aquatic vegetation when it was applied to image pairs from both a different time period in 2010 and a similar time period in 2009. We developed a method to estimate the effects of extrinsic (EF) and intrinsic (IF) factors on model uncertainty using MODIS images. Results indicated that 71.1% of the instability in classification between time periods was due to EF, which might include changes in atmospheric conditions, sun-view angle and water quality. The remainder was due to IF, such as phenological and growth status differences between time periods. The modified version of Model-10 (i.e., CT(m)) performed better than the traditional CT with different image dates. When applied to 2009 images, the CT(m) version of Model-10 had very similar thresholds and performance to Model-09, with overall accuracies of 92.8% and 90.5% for Model-09 and the CT(m) version of Model-10, respectively.
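The decision logic of such a classification tree reduces to nested threshold tests on spectral indices. The sketch below illustrates the idea only; the index names and threshold values are hypothetical, not the fitted values from this study.

```python
# Hypothetical two-season threshold classifier for aquatic vegetation.
# Thresholds are illustrative stand-ins, not the study's fitted CT(m) values.
def classify(ndvi_summer: float, ndvi_winter: float) -> str:
    """Nested threshold tests, as in a shallow classification tree."""
    if ndvi_summer < 0.2:
        return "other"            # water / bare ground: low vegetation signal
    if ndvi_winter > 0.3:
        return "emergent"         # visible above water in both seasons
    if ndvi_summer > 0.5:
        return "floating-leaf"    # strong summer canopy, absent in winter
    return "submerged"

assert classify(0.1, 0.0) == "other"
assert classify(0.6, 0.4) == "emergent"
assert classify(0.6, 0.1) == "floating-leaf"
assert classify(0.3, 0.1) == "submerged"
```

Modifying the thresholds (the CT(m) idea) then amounts to adjusting these cut points toward extreme SI values so the same tree transfers across image dates.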

  12. Introduction to Cryptography and the Bitcoin Protocol (2/2)

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    The Bitcoin protocol not only supports an electronic currency, but also has the possibility for being (mis)used in other ways. Topics will include the basic operation of how Bitcoin operates including motivations and also such things as block chaining, bitcoin mining, and how financial transactions operate. A knowledge of the topics covered in the Basic Cryptography lecture will be assumed.
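The block-chaining idea the lecture covers can be sketched in a few lines: each block stores the hash of its predecessor, so altering any earlier block breaks every later link. The field names below are illustrative only, not Bitcoin's actual block format.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash the canonical JSON encoding of a block."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(prev_hash: str, transactions: list) -> dict:
    return {"prev_hash": prev_hash, "transactions": transactions}

# Build a three-block chain: each block commits to its predecessor's hash.
genesis = make_block("0" * 64, ["coinbase -> alice"])
b1 = make_block(block_hash(genesis), ["alice -> bob: 5"])
b2 = make_block(block_hash(b1), ["bob -> carol: 2"])
chain = [genesis, b1, b2]

def chain_valid(chain: list) -> bool:
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

assert chain_valid(chain)
# Tampering with an early block invalidates every later link.
genesis["transactions"][0] = "coinbase -> mallory"
assert not chain_valid(chain)
```

Mining adds a difficulty condition on each block's own hash, but the tamper-evidence shown here comes purely from the chaining.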

  13. Introduction to Cryptography and the Bitcoin Protocol (1/2)

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    The Bitcoin protocol not only supports an electronic currency, but also has the possibility for being (mis)used in other ways. Topics will include the basic operation of how Bitcoin operates including motivations and also such things as block chaining, bitcoin mining, and how financial transactions operate. A knowledge of the topics covered in the Basic Cryptography lecture will be assumed.

  14. Cryptography- An ideal solution to privacy, data integrity and non ...

    African Journals Online (AJOL)

    Encryption, hashing and digital signatures are the three primitives of cryptography. These have been treated in depth, and their performance on text data and image data has been studied. The most secure algorithms currently in use have been introduced, and the respective performance of each primitive's algorithm on ...
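Two of the three primitives named above can be exercised directly with Python's standard library; a MAC stands in here for shared-key authentication (true digital signatures require asymmetric keys and are omitted for brevity). The message and key values are arbitrary examples.

```python
import hashlib
import hmac

message = b"transfer 100 to account 42"

# Hashing: a fixed-size digest; any change to the message changes the digest.
digest = hashlib.sha256(message).hexdigest()
assert hashlib.sha256(b"transfer 999 to account 42").hexdigest() != digest

# MAC (shared-key authentication): only holders of `key` can produce or verify the tag.
key = b"shared-secret"
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# Verification recomputes the tag and compares in constant time.
expected = hmac.new(key, message, hashlib.sha256).hexdigest()
assert hmac.compare_digest(tag, expected)
```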

  15. Cryptography for a High-Assurance Web-Based Enterprise

    Science.gov (United States)

    2013-10-01

    STD 5, Internet Protocol, J. Postel, Sep 1981, and subsequent RFCs 791/950/919/922/792/1112. f. RFC 4510, Lightweight Directory Access Protocol (LDAP) ... July 2010. h. RFC 2829, Authentication Methods for LDAP, May 2000. [6]. PKCS #1: RSA Cryptography Standard, http://www.rsa.com/rsalabs/node.asp

  16. Efficient multiuser quantum cryptography network based on entanglement.

    Science.gov (United States)

    Xue, Peng; Wang, Kunkun; Wang, Xiaoping

    2017-04-04

    We present an efficient quantum key distribution protocol with a certain entangled state to solve a special cryptographic task. We also provide a proof of security of this protocol by generalizing the proof of the modified Lo-Chau scheme. Based on this two-user scheme, a quantum cryptography network protocol is proposed without any quantum memory.

  17. Steganography and Cryptography Inspired Enhancement of Introductory Programming Courses

    Science.gov (United States)

    Kortsarts, Yana; Kempner, Yulia

    2015-01-01

    Steganography is the art and science of concealing communication. The goal of steganography is to hide the very existence of information exchange by embedding messages into unsuspicious digital media covers. Cryptography, or secret writing, is the study of the methods of encryption, decryption and their use in communications protocols.…
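The embedding idea can be sketched with least-significant-bit (LSB) substitution, one common steganographic technique (chosen here purely for illustration; the course abstract does not specify a method). The cover bytes stand in for pixel data.

```python
def embed(cover: bytes, payload_bits: str) -> bytes:
    """Hide a bit string in the least significant bits of cover bytes."""
    assert len(payload_bits) <= len(cover)
    out = bytearray(cover)
    for i, bit in enumerate(payload_bits):
        out[i] = (out[i] & 0xFE) | int(bit)   # clear LSB, then set payload bit
    return bytes(out)

def extract(stego: bytes, n_bits: int) -> str:
    """Read the payload back out of the LSBs."""
    return "".join(str(b & 1) for b in stego[:n_bits])

cover = bytes(range(64))            # stand-in for image pixel data
secret = "1011001110001111"
stego = embed(cover, secret)
assert extract(stego, len(secret)) == secret
# LSB changes leave each byte within ±1 of its original value,
# which is why the cover medium looks unsuspicious.
assert all(abs(a - b) <= 1 for a, b in zip(cover, stego))
```

In practice the payload would first be encrypted, combining the two disciplines the abstract contrasts: cryptography hides the content, steganography hides the existence of the exchange.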

  18. The mathematics of ciphers number theory and RSA cryptography

    CERN Document Server

    Coutinho, S C

    1999-01-01

    This book is an introduction to the algorithmic aspects of number theory and its applications to cryptography, with special emphasis on the RSA cryptosystem. It covers many of the familiar topics of elementary number theory, all with an algorithmic twist. The text also includes many interesting historical notes.
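The RSA construction the book builds toward can be demonstrated with textbook-sized primes. The parameters below are the classic small worked example (wholly insecure at this size, but the arithmetic is the real thing).

```python
# Toy RSA with tiny primes (insecure; for exposition only).
p, q = 61, 53
n = p * q                  # 3233, the public modulus
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: d*e ≡ 1 (mod phi)

m = 65                     # message, must satisfy m < n
c = pow(m, e, n)           # encryption: c = m^e mod n
assert pow(c, d, n) == m   # decryption: c^d mod n recovers m
```

The three-argument `pow` performs fast modular exponentiation, and `pow(e, -1, phi)` (Python 3.8+) computes the modular inverse that the extended Euclidean algorithm provides in the book's treatment.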

  19. Enhancing Undergraduate Mathematics Curriculum via Coding Theory and Cryptography

    Science.gov (United States)

    Aydin, Nuh

    2009-01-01

    The theory of error-correcting codes and cryptography are two relatively recent applications of mathematics to information and communication systems. The mathematical tools used in these fields generally come from algebra, elementary number theory, and combinatorics, including concepts from computational complexity. It is possible to introduce the…

  20. Towards Practical Whitebox Cryptography: Optimizing Efficiency and Space Hardness

    DEFF Research Database (Denmark)

    Bogdanov, Andrey; Isobe, Takanori; Tischhauser, Elmar Wolfgang

    2016-01-01

    the practical requirements to whitebox cryptography in real-world applications such as DRM or mobile payments. Moreover, we formalize resistance towards decomposition in form of weak and strong space hardness at various security levels. We obtain bounds on space hardness in all those adversarial models...

  1. An improved experimental scheme for simultaneous measurement of high-resolution zero electron kinetic energy (ZEKE) photoelectron and threshold photoion (MATI) spectra

    Science.gov (United States)

    Michels, François; Mazzoni, Federico; Becucci, Maurizio; Müller-Dethlefs, Klaus

    2017-10-01

    An improved detection scheme is presented for threshold ionization spectroscopy with simultaneous recording of the Zero Electron Kinetic Energy (ZEKE) and Mass Analysed Threshold Ionisation (MATI) signals. The objective is to obtain accurate dissociation energies for larger molecular clusters by simultaneously detecting the fragment and parent ion MATI signals with identical transmission. The scheme preserves an optimal ZEKE spectral resolution together with excellent separation of the spontaneous ion and MATI signals in the time-of-flight mass spectrum. The resulting improvement in sensitivity will allow for the determination of dissociation energies in clusters with substantial mass difference between parent and daughter ions.

  2. Cryogenic ion implantation near amorphization threshold dose for halo/extension junction improvement in sub-30 nm device technologies

    International Nuclear Information System (INIS)

    Park, Hugh; Todorov, Stan; Colombeau, Benjamin; Rodier, Dennis; Kouzminov, Dimitry; Zou Wei; Guo Baonian; Khasgiwale, Niranjan; Decker-Lucke, Kurt

    2012-01-01

    We report on junction advantages of cryogenic ion implantation with medium-current implanters. We propose a methodical approach to maximizing cryogenic effects on junction characteristics near the amorphization threshold doses that are typically used for halo implants in sub-30 nm technologies. A BF₂⁺ implant at a dose of 8×10¹³ cm⁻² does not amorphize silicon at room temperature. When implanted at −100 °C, it forms a 30-35 nm thick amorphous layer. The cryogenic BF₂⁺ implant significantly reduces the depth of the boron distribution, both as-implanted and after anneals, which improves short-channel rolloff characteristics. It also creates a shallower n⁺-p junction by steepening the profiles of arsenic that is subsequently implanted in the surface region. We demonstrate the effects of implant sequences, germanium preamorphization, and indium and carbon co-implants for extension/halo process integration. When applied to sequences such as Ge+As+C+In+BF₂⁺, the cryogenic implants at −100 °C enable removal of the Ge preamorphization, and form more active n⁺-p junctions and steeper B and In halo profiles than sequences at room temperature.

  3. Inter-symbol interference and beat noise in flexible data-rate coherent OCDMA and the BER improvement by using optical thresholding.

    Science.gov (United States)

    Wang, Xu; Wada, Naoya; Kitayama, Ken-Ichi

    2005-12-26

    Impairments of inter-symbol interference and beat noise in coherent time-spreading optical code-division-multiple-access are investigated theoretically and experimentally by sweeping the data-rate from 622 Mbps up to 10 Gbps with 511-chip superstructured fiber Bragg grating. The BER improvement by using optical thresholding technique has been verified in the experiment.

  4. Modern cryptography and elliptic curves a beginner's guide

    CERN Document Server

    Shemanske, Thomas R

    2017-01-01

    This book offers the beginning undergraduate student some of the vista of modern mathematics by developing and presenting the tools needed to gain an understanding of the arithmetic of elliptic curves over finite fields and their applications to modern cryptography. This gradual introduction also makes a significant effort to teach students how to produce or discover a proof by presenting mathematics as an exploration, and at the same time, it provides the necessary mathematical underpinnings to investigate the practical and implementation side of elliptic curve cryptography (ECC). Elements of abstract algebra, number theory, and affine and projective geometry are introduced and developed, and their interplay is exploited. Algebra and geometry combine to characterize congruent numbers via rational points on the unit circle, and the group law for the set of points on an elliptic curve arises from geometric intuition provided by Bézout's theorem as well as the construction of projective space. The structure of the...
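Over a prime field, the group law mentioned above reduces to a few lines of modular arithmetic. The curve and base point below are a standard small teaching example (y² = x³ + 2x + 2 over F₁₇), not one taken from this book.

```python
# Affine point addition on y^2 = x^3 + a*x + b over F_p.
# None represents the point at infinity (the group identity).
p, a, b = 17, 2, 2          # small teaching curve, insecure by design

def ec_add(P, Q):
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                          # P + (-P) = O
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p     # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p            # chord slope
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

G = (5, 1)
# G lies on the curve: y^2 = x^3 + 2x + 2 (mod 17)
assert (G[1] ** 2 - (G[0] ** 3 + a * G[0] + b)) % p == 0

# Repeated addition walks through the cyclic group generated by G.
Pt, k = None, 0
while True:
    Pt = ec_add(Pt, G)
    k += 1
    if Pt is None:
        break
assert k == 19              # G generates a group of order 19
```

The chord-and-tangent slopes are exactly the geometric intuition from Bézout's theorem, translated into field arithmetic.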

  5. Electronic Voting Protocol Using Identity-Based Cryptography.

    Science.gov (United States)

    Gallegos-Garcia, Gina; Tapia-Recillas, Horacio

    2015-01-01

    Electronic voting protocols proposed to date meet their properties based on Public Key Cryptography (PKC), which offers high flexibility through key agreement protocols and authentication mechanisms. However, when PKC is used, it is necessary to implement Certification Authority (CA) to provide certificates which bind public keys to entities and enable verification of such public key bindings. Consequently, the components of the protocol increase notably. An alternative is to use Identity-Based Encryption (IBE). With this kind of cryptography, it is possible to have all the benefits offered by PKC, without neither the need of certificates nor all the core components of a Public Key Infrastructure (PKI). Considering the aforementioned, in this paper we propose an electronic voting protocol, which meets the privacy and robustness properties by using bilinear maps.

  6. Two-phase hybrid cryptography algorithm for wireless sensor networks

    Directory of Open Access Journals (Sweden)

    Rawya Rizk

    2015-12-01

    Full Text Available For achieving security in wireless sensor networks (WSNs), cryptography plays an important role. In this paper, a new security algorithm using a combination of both symmetric and asymmetric cryptographic techniques is proposed to provide high security with minimized key maintenance. It guarantees three cryptographic primitives: integrity, confidentiality and authentication. Elliptic Curve Cryptography (ECC) and the Advanced Encryption Standard (AES) are combined to provide encryption. The XOR-DUAL RSA algorithm is used for authentication and Message Digest-5 (MD5) for integrity. The results show that the proposed hybrid algorithm gives better performance in terms of computation time, cipher-text size, and energy consumption in the WSN. It is also robust against different types of attacks in the case of image encryption.
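As an illustration of the hybrid pattern (symmetric bulk encryption plus a separate integrity tag), here is a minimal encrypt-then-MAC sketch. It uses a hash-derived XOR keystream and HMAC-SHA-256 as stand-ins; it is not the paper's ECC/AES/XOR-DUAL-RSA/MD5 construction.

```python
import hashlib
import hmac
import os

def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    """Derive n keystream bytes by hashing key||nonce||counter (illustrative only)."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def encrypt(enc_key: bytes, mac_key: bytes, plaintext: bytes):
    nonce = os.urandom(12)
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(enc_key, nonce, len(plaintext))))
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()  # integrity + auth
    return nonce, ct, tag

def decrypt(enc_key: bytes, mac_key: bytes, nonce: bytes, ct: bytes, tag: bytes) -> bytes:
    good = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, good):
        raise ValueError("authentication failed")                 # reject before decrypting
    return bytes(c ^ k for c, k in zip(ct, keystream(enc_key, nonce, len(ct))))

k1, k2 = os.urandom(16), os.urandom(16)
nonce, ct, tag = encrypt(k1, k2, b"sensor reading: 23.5C")
assert decrypt(k1, k2, nonce, ct, tag) == b"sensor reading: 23.5C"
```

In the paper's scheme the symmetric key would itself be exchanged under ECC, which is the part this sketch leaves out.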

  7. Conference on Algebraic Geometry for Coding Theory and Cryptography

    CERN Document Server

    Lauter, Kristin; Walker, Judy

    2017-01-01

    Covering topics in algebraic geometry, coding theory, and cryptography, this volume presents interdisciplinary group research completed for the February 2016 conference at the Institute for Pure and Applied Mathematics (IPAM) in cooperation with the Association for Women in Mathematics (AWM). The conference gathered research communities across disciplines to share ideas and problems in their fields and formed small research groups made up of graduate students, postdoctoral researchers, junior faculty, and group leaders who designed and led the projects. Peer reviewed and revised, each of this volume's five papers achieves the conference’s goal of using algebraic geometry to address a problem in either coding theory or cryptography. Proposed variants of the McEliece cryptosystem based on different constructions of codes, constructions of locally recoverable codes from algebraic curves and surfaces, and algebraic approaches to the multicast network coding problem are only some of the topics covered in this vo...

  8. Multivariate Cryptography Based on Clipped Hopfield Neural Network.

    Science.gov (United States)

    Wang, Jia; Cheng, Lee-Ming; Su, Tong

    2018-02-01

    Designing secure and efficient multivariate public key cryptosystems [multivariate cryptography (MVC)] to strengthen the security of RSA and ECC in conventional and quantum computational environment continues to be a challenging research in recent years. In this paper, we will describe multivariate public key cryptosystems based on extended Clipped Hopfield Neural Network (CHNN) and implement it using the MVC (CHNN-MVC) framework operated in space. The Diffie-Hellman key exchange algorithm is extended into the matrix field, which illustrates the feasibility of its new applications in both classic and postquantum cryptography. The efficiency and security of our proposed new public key cryptosystem CHNN-MVC are simulated and found to be NP-hard. The proposed algorithm will strengthen multivariate public key cryptosystems and allows hardware realization practicality.

  9. ID based cryptography for secure cloud data storage

    OpenAIRE

    Kaaniche , Nesrine; Boudguiga , Aymen; Laurent , Maryline

    2013-01-01

    International audience; This paper addresses the security issues of storing sensitive data in a cloud storage service and the need for users to trust the commercial cloud providers. It proposes a cryptographic scheme for cloud storage, based on an original usage of ID-Based Cryptography. Our solution has several advantages. First, it provides secrecy for encrypted data which are stored in public servers. Second, it offers controlled data access and sharing among users, so that unauthorized us...

  10. Nonlinear laser dynamics from quantum dots to cryptography

    CERN Document Server

    Lüdge, Kathy

    2012-01-01

    A distinctive discussion of the nonlinear dynamical phenomena of semiconductor lasers. The book combines recent results of quantum dot laser modeling with mathematical details and an analytic understanding of nonlinear phenomena in semiconductor lasers and points out possible applications of lasers in cryptography and chaos control. This interdisciplinary approach makes it a unique and powerful source of knowledge for anyone intending to contribute to this field of research.By presenting both experimental and theoretical results, the distinguished authors consider solitary lase

  11. SHAMROCK: A Synthesizable High Assurance Cryptography and Key Management Coprocessor

    Science.gov (United States)

    2016-11-01

    designers to readily and correctly incorporate cryptography and key management into embedded systems. SHAMROCK has been incorporated in multiple... three devices are able to communicate using the same key. If device C is no longer used (or trusted) for the application, the system is reconfigured by... encrypted communications between SHAMROCK-embedded systems. The key generation request is initiated by the protocol control module to the HAKM module

  12. Experimental quantum secret sharing and third-man quantum cryptography.

    Science.gov (United States)

    Chen, Yu-Ao; Zhang, An-Ning; Zhao, Zhi; Zhou, Xiao-Qi; Lu, Chao-Yang; Peng, Cheng-Zhi; Yang, Tao; Pan, Jian-Wei

    2005-11-11

    Quantum secret sharing (QSS) and third-man quantum cryptography (TQC) are essential for advanced quantum communication; however, the low intensity and fragility of the multiphoton entanglement source in previous experiments have made their realization an extreme experimental challenge. Here, we develop and exploit an ultrastable high intensity source of four-photon entanglement to report an experimental realization of QSS and TQC. The technology developed in our experiment will be important for future multiparty quantum communication.

  13. An Incomplete Cryptography based Digital Rights Management with DCFF

    OpenAIRE

    Thanh, Ta Minh; Iwakiri, Munetoshi

    2014-01-01

    In general, a DRM (Digital Rights Management) system is responsible for the safe distribution of digital content; however, DRM systems are built from individual function modules for cryptography, watermarking and so on. This typical system flow has the problem that all original digital content is temporarily disclosed in perfect condition during the decryption process. In this paper, we propose the combination of the differential codes and fragile fingerprinting (DCFF) method based on incompl...

  14. Evolutionary Algorithms for Boolean Functions in Diverse Domains of Cryptography.

    Science.gov (United States)

    Picek, Stjepan; Carlet, Claude; Guilley, Sylvain; Miller, Julian F; Jakobovic, Domagoj

    2016-01-01

    The role of Boolean functions is prominent in several areas including cryptography, sequences, and coding theory. Therefore, various methods for the construction of Boolean functions with desired properties are of direct interest. New motivations on the role of Boolean functions in cryptography with attendant new properties have emerged over the years. There are still many combinations of design criteria left unexplored and in this matter evolutionary computation can play a distinct role. This article concentrates on two scenarios for the use of Boolean functions in cryptography. The first uses Boolean functions as the source of the nonlinearity in filter and combiner generators. Although relatively well explored using evolutionary algorithms, it still presents an interesting goal in terms of the practical sizes of Boolean functions. The second scenario appeared rather recently where the objective is to find Boolean functions that have various orders of the correlation immunity and minimal Hamming weight. In both these scenarios we see that evolutionary algorithms are able to find high-quality solutions where genetic programming performs the best.
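The nonlinearity criterion discussed above is computable directly from a function's Walsh spectrum, which is also how fitness would typically be evaluated inside an evolutionary search. A minimal sketch for small truth tables (the example function is my own choice, not one from the article):

```python
def nonlinearity(truth_table):
    """Nonlinearity of an n-variable Boolean function via its Walsh spectrum.

    truth_table[x] is f(x) for x = 0 .. 2^n - 1.
    NL(f) = 2^(n-1) - max_w |W_f(w)| / 2, the distance to the nearest affine function.
    """
    n = len(truth_table).bit_length() - 1
    assert len(truth_table) == 1 << n
    max_walsh = 0
    for w in range(1 << n):
        # W_f(w) = sum over x of (-1)^(f(x) XOR <w, x>)
        s = sum((-1) ** (truth_table[x] ^ bin(w & x).count("1") % 2)
                for x in range(1 << n))
        max_walsh = max(max_walsh, abs(s))
    return (1 << (n - 1)) - max_walsh // 2

# f(x1, x2, x3) = x1*x2 XOR x3: a quadratic example on 3 variables.
tt = [(((x >> 2) & 1) & ((x >> 1) & 1)) ^ (x & 1) for x in range(8)]
assert nonlinearity(tt) == 2       # the maximum possible for n = 3
assert nonlinearity([0] * 8) == 0  # affine (constant) functions have NL = 0
```

Correlation immunity and Hamming weight, the criteria in the second scenario, are read off the same Walsh spectrum (zero coefficients at low-weight w, and W_f(0) respectively).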

  15. Improved laser damage threshold performance of calcium fluoride optical surfaces via Accelerated Neutral Atom Beam (ANAB) processing

    Science.gov (United States)

    Kirkpatrick, S.; Walsh, M.; Svrluga, R.; Thomas, M.

    2015-11-01

    Optics are not keeping up with the pace of laser advancements. The laser industry is rapidly increasing its power capabilities and reducing wavelengths, which has exposed optics as a weak link in lifetime failures for these advanced systems. Nanometer-sized surface defects (scratches, pits, bumps and residual particles) on the surface of optics are a significant limiting factor to high-end performance. Angstrom-level smoothing of materials such as calcium fluoride, spinel, magnesium fluoride, zinc sulfide, LBO and others presents a unique challenge for traditional polishing techniques. Exogenesis Corporation, using its new and proprietary Accelerated Neutral Atom Beam (ANAB) technology, is able to remove nano-scale surface damage and particle contamination, leaving many material surfaces with roughness typically around one Angstrom. This surface-defect mitigation via ANAB processing can be shown to increase the performance properties of high-intensity optical materials. This paper describes the ANAB technology and summarizes smoothing results for calcium fluoride laser windows. It further correlates laser damage threshold improvements with the smoothing produced by ANAB surface treatment. All ANAB processing was performed at Exogenesis Corporation using an nAccel100™ Accelerated Particle Beam processing tool. All surface measurement data for the paper were produced via AFM analysis on a Park Model XE70 AFM, and all laser damage testing was performed at Spica Technologies, Inc. Exogenesis Corporation's ANAB processing technology is a new and unique surface modification technique that has been demonstrated to be highly effective at correcting nano-scale surface defects. ANAB is a non-contact vacuum process comprised of an intense beam of accelerated, electrically neutral gas atoms with average energies of a few tens of electron volts. The ANAB process does not apply the mechanical forces associated with traditional polishing techniques. ANAB efficiently removes surface

  16. High-intensity interval training and β-hydroxy-β-methylbutyric free acid improves aerobic power and metabolic thresholds

    Science.gov (United States)

    2014-01-01

    Background Previous research combining Calcium β-hydroxy-β-methylbutyrate (CaHMB) and running high-intensity interval training (HIIT) have shown positive effects on aerobic performance measures. The purpose of this study was to examine the effect of β-hydroxy-β-methylbutyric free acid (HMBFA) and cycle ergometry HIIT on maximal oxygen consumption (VO2peak), ventilatory threshold (VT), respiratory compensation point (RCP) and time to exhaustion (Tmax) in college-aged men and women. Methods Thirty-four healthy men and women (Age: 22.7 ± 3.1 yrs; VO2peak: 39.3 ± 5.0 mL·kg⁻¹·min⁻¹) volunteered to participate in this double-blind, placebo-controlled design study. All participants completed a series of tests prior to and following treatment. A peak oxygen consumption test was performed on a cycle ergometer to assess VO2peak, Tmax, VT, and RCP. Twenty-six participants were randomly assigned into either a placebo (PLA-HIIT) or 3 g per day of HMBFA (BetaTor™) (HMBFA-HIIT) group. Eight participants served as controls (CTL). Participants in the HIIT groups completed 12 HIIT (80-120% maximal workload) exercise sessions consisting of 5-6 bouts of a 2:1 minute cycling work to rest ratio protocol over a four-week period. Body composition was measured with dual-energy X-ray absorptiometry (DEXA). Outcomes were assessed by ANCOVA with posttest means adjusted for pretest differences. Results The HMBFA-HIIT intervention showed significant (p HIIT group. Both PLA-HIIT and HMBFA-HIIT treatment groups demonstrated significant (p HIIT and HMBFA-HIIT groups. Conclusions Our findings support the use of HIIT in combination with HMBFA to improve aerobic fitness in college age men and women. These data suggest that the addition of HMBFA supplementation may result in greater changes in VO2peak and VT than HIIT alone. Study registration The study was registered on ClinicalTrials.gov (ID NCT01941368). PMID:24782684

  17. A Generic Simulation Framework for Non-Entangled based Experimental Quantum Cryptography and Communication: Quantum Cryptography and Communication Simulator (QuCCs)

    Science.gov (United States)

    Buhari, Abudhahir; Zukarnain, Zuriati Ahmad; Khalid, Roszelinda; Zakir Dato', Wira Jaafar Ahmad

    2016-11-01

    The applications of quantum information science continue to advance toward the next-generation technology. In particular, in the fields of quantum cryptography and quantum computation, the world has already witnessed various ground-breaking tangible products and promising results. Quantum cryptography is one of the mature fields to emerge from quantum mechanics and is already available on the market. The current state of quantum cryptography is still under active research in order to reach the maturity of digital cryptography. The complexity of quantum cryptography is high due to the combination of hardware and software. The lack of an effective simulation tool to design and analyze quantum cryptography experiments delays progress. In this paper, we propose a framework for an effective non-entanglement-based quantum cryptography simulation tool. We apply hybrid simulation techniques, i.e. discrete-event, continuous-event and system-dynamics simulation. We also highlight the limitations of experiments based on a commercial photonic simulation tool. Finally, we discuss ideas for achieving a one-stop simulation package for quantum-based secure key distribution experiments. All modules of the simulation framework are viewed from a computer science perspective.

  18. Elliptic Curve Cryptography with Security System in Wireless Sensor Networks

    Science.gov (United States)

    Huang, Xu; Sharma, Dharmendra

    2010-10-01

    The rapid progress of wireless communications and embedded micro-electro-system technologies has made wireless sensor networks (WSNs) very popular and even part of our daily life. WSN designs are generally application-driven: a particular application's requirements determine how the network behaves. The nature of WSNs has attracted increasing attention in recent years due to linear scalability, a small software footprint, low hardware implementation cost, low bandwidth requirements, and high device performance. It is noted that today's software applications are mainly characterized by their component-based structures, which are usually heterogeneous and distributed, including WSNs; yet WSNs typically need to configure themselves automatically and support ad hoc routing. Agent technology provides a method for handling increasing software complexity and supporting rapid and accurate decision making. Building on our previous works [1, 2], this paper makes three contributions, namely (a) a fuzzy controller for a dynamic sliding-window size to improve the performance of running ECC, (b) the first presentation of a hidden generation point for protection from man-in-the-middle attacks, and (c) the first investigation of multi-agent techniques applied to key exchange. Security systems have been drawing great attention as cryptographic algorithms have gained popularity, owing to properties that make them suitable for use in constrained environments such as mobile sensor information applications, where computing resources and power availability are limited. Elliptic curve cryptography (ECC) is one of the high-potential candidates for WSNs, requiring less computational power, communication bandwidth, and memory than other cryptosystems. To save precomputed storage, there is a recent trend in sensor networks for group leaders, rather than individual sensors, to communicate with the end database, which highlights the need to prevent man-in-the-middle attacks.

  19. High-intensity interval training and β-hydroxy-β-methylbutyric free acid improves aerobic power and metabolic thresholds

    OpenAIRE

    Robinson, Edward H; Stout, Jeffrey R; Miramonti, Amelia A; Fukuda, David H; Wang, Ran; Townsend, Jeremy R; Mangine, Gerald T; Fragala, Maren S; Hoffman, Jay R

    2014-01-01

    Background Previous research combining Calcium β-hydroxy-β-methylbutyrate (CaHMB) and running high-intensity interval training (HIIT) have shown positive effects on aerobic performance measures. The purpose of this study was to examine the effect of β-hydroxy-β-methylbutyric free acid (HMBFA) and cycle ergometry HIIT on maximal oxygen consumption (VO2peak), ventilatory threshold (VT), respiratory compensation point (RCP) and time to exhaustion (Tmax) in college-aged men and women. Methods Thi...

  20. Cryptography in the Cloud Computing: the Current State and Logical Tasks

    OpenAIRE

    Sergey Nikolaevich Kyazhin; Andrey Vladimirovich Moiseev

    2013-01-01

    The current state of the cloud computing (CC) information security is analysed and logical problems of storage and data transmission security at CC are allocated. Cryptographic methods of data security in CC, in particular, lightweight cryptography and the cryptography based on bilinear pairings are described.

  1. Cryptography in the Cloud Computing: the Current State and Logical Tasks

    Directory of Open Access Journals (Sweden)

    Sergey Nikolaevich Kyazhin

    2013-09-01

    Full Text Available The current state of cloud computing (CC) information security is analysed, and logical problems of storage and data transmission security in CC are identified. Cryptographic methods of data security in CC, in particular lightweight cryptography and cryptography based on bilinear pairings, are described.

  2. A library for prototyping the computer arithmetic level in elliptic curve cryptography

    Science.gov (United States)

    Imbert, Laurent; Peirera, Agostinho; Tisserand, Arnaud

    2007-09-01

    This paper presents the first version of a software library called PACE ("Prototyping Arithmetic in Cryptography Easily"). This is a C++ library under LGPL license. It provides number systems and algorithms for prototyping the arithmetic layer in cryptographic applications. The first version of PACE includes basic support of prime finite fields and ECC (Elliptic Curve Cryptography) basic algorithms for software implementations.
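A prototyping layer of this kind typically starts from a prime-field element type on which curve-level code is then built. The following is a minimal Python analogue of that arithmetic layer (PACE itself is a C++ library; the class below is my own illustrative sketch, not its API):

```python
class Fp:
    """Minimal prime-field element, in the spirit of an arithmetic-layer prototype."""
    def __init__(self, v: int, p: int):
        self.p, self.v = p, v % p
    def __add__(self, o): return Fp(self.v + o.v, self.p)
    def __sub__(self, o): return Fp(self.v - o.v, self.p)
    def __mul__(self, o): return Fp(self.v * o.v, self.p)
    def inverse(self):
        # Modular inverse via Python's built-in pow (Python 3.8+).
        return Fp(pow(self.v, -1, self.p), self.p)
    def __eq__(self, o):  return self.p == o.p and self.v == o.v

p = 2**255 - 19                 # the Curve25519 prime, a typical ECC field size
a, b = Fp(7, p), Fp(11, p)
assert (a * b).v == 77
assert (a * a.inverse()).v == 1
assert a - a == Fp(0, p)
```

Swapping in alternative number systems (Montgomery form, residue number systems, and so on) behind the same interface is exactly the kind of experimentation such a library is meant to enable.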

  3. Improved security proofs and constructions for public-key cryptography

    OpenAIRE

    Pan, Jiaxin (M. Sc.)

    2016-01-01

    This thesis improves the security analysis and constructions of public-key cryptography: the first part proposes a simplified security proof for digital signature schemes derived from canonical identification schemes via the classical Fiat-Shamir transformation in the random oracle model. The second part proposes a new variant of message authentication codes (MACs), so-called affine MACs. In addition, a generic transform...

  4. Threshold Cryptography-based Group Authentication (TCGA) Scheme for the Internet of Things (IoT)

    DEFF Research Database (Denmark)

    Mahalle, Parikshit N.; Prasad, Neeli R.; Prasad, Ramjee

    2014-01-01

    Internet of things (IoT) is an emerging paradigm where the devices around us (persistent and non-persistent) are connected to each other to provide seamless communication, and contextual services. In the IoT, each device cannot be authenticated in the short time due to unbounded number of devices...
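The threshold idea behind such group-authentication schemes is typically built on secret sharing: a secret is split so that any t of n parties can reconstruct it, but fewer learn nothing. A minimal Shamir (t, n) sketch follows; it illustrates the underlying primitive only, not the TCGA construction itself.

```python
import random

P = 2**61 - 1          # a Mersenne prime; all arithmetic is mod P

def split(secret: int, n: int, t: int):
    """Split `secret` into n shares; any t of them reconstruct it (Shamir)."""
    # Random polynomial of degree t-1 with constant term = secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation of the polynomial at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = split(123456789, n=5, t=3)
assert reconstruct(shares[:3]) == 123456789    # any 3 shares suffice
assert reconstruct(shares[2:]) == 123456789    # a different subset of 3 also works
assert reconstruct(shares[:2]) != 123456789    # 2 shares yield garbage
```

In a threshold cryptography setting the shares would never be pooled; instead each party computes on its share locally, which is what lets a quorum authenticate a group without any single point of compromise.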

  5. Real Time MODBUS Transmissions and Cryptography Security Designs and Enhancements of Protocol Sensitive Information

    Directory of Open Access Journals (Sweden)

    Aamir Shahzad

    2015-07-01

    Full Text Available Information technology (IT) security has become a major concern due to the growing demand for information and the massive development of client/server applications running on modern IT infrastructure. How can security be taken into account, and which paradigms are necessary to minimize security issues while increasing efficiency, reducing the influence on transmissions, ensuring protocol independence and achieving substantial performance? We have found cryptography to be an essential security mechanism for client/server architectures, and in this study a new security design was developed for the MODBUS protocol, which is considered to offer strong performance for future development and enhancement of real IT infrastructure. This study is also considered a complete development because security is tested in almost all modes of MODBUS communication. The computed measurements are evaluated to validate the overall development, and the results indicate a substantial improvement in security compared with conventional methods.
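
One common way to retrofit integrity and authenticity onto a legacy frame-based protocol such as MODBUS is to append a message authentication code over each frame under a pre-shared key. The sketch below shows that generic pattern with Python's standard library; the key and frame bytes are illustrative assumptions, and this is not the specific design evaluated in the paper.

```python
# Generic MAC protection of a MODBUS-style frame: sender appends an
# HMAC-SHA256 tag, receiver verifies it in constant time before parsing.
import hmac
import hashlib

KEY = b"pre-shared-16B-k"  # hypothetical pre-shared key

def protect(frame: bytes) -> bytes:
    tag = hmac.new(KEY, frame, hashlib.sha256).digest()
    return frame + tag

def verify(msg: bytes) -> bytes:
    frame, tag = msg[:-32], msg[-32:]
    expected = hmac.new(KEY, frame, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed")
    return frame

# Illustrative MODBUS read-holding-registers request (unit 1, function 3)
request = bytes([0x01, 0x03, 0x00, 0x6B, 0x00, 0x03])
assert verify(protect(request)) == request
```

This adds 32 bytes per frame and one hash computation per endpoint, which is the kind of transmission overhead a design like the one above has to measure and minimize.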

  6. Cross-modal attention influences auditory contrast sensitivity: Decreasing visual load improves auditory thresholds for amplitude- and frequency-modulated sounds.

    Science.gov (United States)

    Ciaramitaro, Vivian M; Chow, Hiu Mei; Eglington, Luke G

    2017-03-01

    We used a cross-modal dual task to examine how changing visual-task demands influenced auditory processing, namely auditory thresholds for amplitude- and frequency-modulated sounds. Observers had to attend to two consecutive intervals of sounds and report which interval contained the auditory stimulus that was modulated in amplitude (Experiment 1) or frequency (Experiment 2). During auditory-stimulus presentation, observers simultaneously attended to a rapid sequential visual presentation (two consecutive intervals of streams of visual letters) and had to report which interval contained a particular color (low load, demanding less attentional resources) or, in separate blocks of trials, which interval contained more of a target letter (high load, demanding more attentional resources). We hypothesized that if attention is a shared resource across vision and audition, an easier visual task should free up more attentional resources for auditory processing on an unrelated task, hence improving auditory thresholds. Auditory detection thresholds were lower (that is, auditory sensitivity was improved) for both amplitude- and frequency-modulated sounds when observers engaged in a less demanding (compared to a more demanding) visual task. In accord with previous work, our findings suggest that visual-task demands can influence the processing of auditory information on an unrelated concurrent task, providing support for shared attentional resources. More importantly, our results suggest that attending to information in a different modality (cross-modal attention) can influence basic auditory contrast sensitivity functions, highlighting potential similarities between basic mechanisms for visual and auditory attention.

  7. Neutron-induced fission cross section of (nat)Pb and (209)Bi from threshold to 1 GeV: An improved parametrization

    CERN Document Server

    Tarrio, D; Audouin, L; Berthier, B; Duran, I; Ferrant, L; Isaev, S; Le Naour, C; Paradela, C; Stephan, C; Trubert, D; Abbondanno, U; Aerts, G; Alvarez-Velarde, F; Andriamonje, S; Andrzejewski, J; Assimakopoulos, P; Badurek, G; Baumann, P; Becvar, F; Belloni, F; Berthoumieux, E; Calvino, F; Calviani, M; Cano-Ott, D; Capote, R; Carrapico, C; Carrillo de Albornoz, A; Cennini, P; Chepel, V; Chiaveri, E; Colonna, N; Cortes, G; Couture, A; Cox, J; Dahlfors, M; David, S; Dillmann, I; Dolfini, R; Domingo-Pardo, C; Dridi, W; Eleftheriadis, C; Embid-Segura, M; Ferrari, A; Ferreira-Marques, R; Fitzpatrick, L; Frais-Koelbl, H; Fujii, K; Furman, W; Goncalves, I; Gonzalez-Romero, E; Goverdovski, A; Gramegna, F; Griesmayer, E; Guerrero, C; Gunsing, F; Haas, B; Haight, R; Heil, M; Herrera-Martinez, A; Igashira, M; Jericha, E; Kadi, Y; Kappeler, F; Karadimos, D; Karamanis, D; Kerveno, M; Ketlerov, V; Koehler, P; Konovalov, V; Kossionides, E; Krticka, M; Lampoudis, C; Leeb, H; Lederer, C; Lindote, A; Lopes, I; Losito, R; Lozano, M; Lukic, S; Marganiec, J; Marques, L; Marrone, S; Martinez, T; Massimi, C; Mastinu, P; Mendoza, E; Mengoni, A; Milazzo, P.M; Moreau, C; Mosconi, M; Neves, F; Oberhummer, H; O'Brien, S; Oshima, M; Pancin, J; Papachristodoulou, C; Papadopoulos, C; Patronis, N; Pavlik, A; Pavlopoulos, P; Perrot, L; Pigni, M.T; Plag, R; Plompen, A; Plukis, A; Poch, A; Praena, J; Pretel, C; Quesada, J; Rauscher, T; Reifarth, R; Rosetti, M; Rubbia, C; Rudolf, G; Rullhusen, P; Salgado, J; Santos, C; Sarchiapone, L; Sarmento, R; Savvidis, I; Tagliente, G; Tain, J.L; Tavora, L; Terlizzi, R; Vannini, G; Vaz, P; Ventura, A; Villamarin, D; Vlachoudis, V; Vlastou, R; Voss, F; Walter, S; Wendler, H; Wiescher, M; Wisshak, K

    2011-01-01

    Neutron-induced fission cross sections for (nat)Pb and (209)Bi were measured with a white-spectrum neutron source at the CERN Neutron Time-of-Flight (n_TOF) facility. The experiment, using neutrons from threshold up to 1 GeV, provides the first results for these nuclei above 200 MeV. The cross sections were measured relative to (235)U and (238)U in a dedicated fission chamber with parallel-plate avalanche counter detectors. Results are compared with previous experimental data. Upgraded parametrizations of the cross sections are presented, from threshold energy up to 1 GeV. The proposed new sets of fitting parameters improve on former results along the whole energy range.

  8. Intramuscular Neurotrophin-3 normalizes low threshold spinal reflexes, reduces spasms and improves mobility after bilateral corticospinal tract injury in rats

    Science.gov (United States)

    Kathe, Claudia; Hutson, Thomas Haynes; McMahon, Stephen Brendan; Moon, Lawrence David Falcon

    2016-01-01

    Brain and spinal injury reduce mobility and often impair sensorimotor processing in the spinal cord leading to spasticity. Here, we establish that complete transection of corticospinal pathways in the pyramids impairs locomotion and leads to increased spasms and excessive mono- and polysynaptic low threshold spinal reflexes in rats. Treatment of affected forelimb muscles with an adeno-associated viral vector (AAV) encoding human Neurotrophin-3 at a clinically-feasible time-point after injury reduced spasticity. Neurotrophin-3 normalized the short latency Hoffmann reflex to a treated hand muscle as well as low threshold polysynaptic spinal reflexes involving afferents from other treated muscles. Neurotrophin-3 also enhanced locomotor recovery. Furthermore, the balance of inhibitory and excitatory boutons in the spinal cord and the level of an ion co-transporter in motor neuron membranes required for normal reflexes were normalized. Our findings pave the way for Neurotrophin-3 as a therapy that treats the underlying causes of spasticity and not only its symptoms. DOI: http://dx.doi.org/10.7554/eLife.18146.001 PMID:27759565

  9. A "proof-reading" of Some Issues in Cryptography

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre

    2007-01-01

    In this paper, we identify some issues in the interplay between practice and theory in cryptography, issues that have repeatedly appeared in different incarnations over the years. These issues are related to fundamental concepts in the field, e.g., to what extent we can prove that a system is secure...... and what theoretic results on security mean for practical applications. We argue that several such issues are often overlooked or misunderstood, and that it may be very productive if both theoreticians and practitioners think more consciously about these issues and act accordingly....

  10. An Online Banking System Based on Quantum Cryptography Communication

    Science.gov (United States)

    Zhou, Ri-gui; Li, Wei; Huan, Tian-tian; Shen, Chen-yi; Li, Hai-sheng

    2014-07-01

    In this paper, an online banking system has been built. Based on quantum cryptography communication, this system is proved unconditionally secure. Two sets of GHZ states are applied, which can ensure the safety of purchase and payment, respectively. In other words, the three trading participants in each triplet state group form an interdependent and interactive relationship. In the meantime, trading authorization and blind signature are introduced by means of controllable quantum teleportation. Thus, effective monitoring is practiced on the premise that the privacy of trading partners is guaranteed. If there is a dispute or deceptive behavior, the system will identify the deceiver immediately according to the relationship mentioned above.

  11. Novel optical scanning cryptography using Fresnel telescope imaging.

    Science.gov (United States)

    Yan, Aimin; Sun, Jianfeng; Hu, Zhijuan; Zhang, Jingtao; Liu, Liren

    2015-07-13

    We propose a new method called modified optical scanning cryptography using Fresnel telescope imaging technique for encryption and decryption of remote objects. An image or object can be optically encrypted on the fly by Fresnel telescope scanning system together with an encryption key. For image decryption, the encrypted signals are received and processed with an optical coherent heterodyne detection system. The proposed method has strong performance through use of secure Fresnel telescope scanning with orthogonal polarized beams and efficient all-optical information processing. The validity of the proposed method is demonstrated by numerical simulations and experimental results.

  12. Geometry, algebra and applications from mechanics to cryptography

    CERN Document Server

    Encinas, Luis; Gadea, Pedro; María, Mª

    2016-01-01

    This volume collects contributions written by different experts in honor of Prof. Jaime Muñoz Masqué. It covers a wide variety of research topics, from differential geometry to algebra, but particularly focuses on the geometric formulation of variational calculus; geometric mechanics and field theories; symmetries and conservation laws of differential equations, and pseudo-Riemannian geometry of homogeneous spaces. It also discusses algebraic applications to cryptography and number theory. It offers state-of-the-art contributions in the context of current research trends. The final result is a challenging panoramic view of connecting problems that initially appear distant.

  13. A Luggage Control System Based on NFC and Homomorphic Cryptography

    Directory of Open Access Journals (Sweden)

    Néstor Álvarez-Díaz

    2017-01-01

    Full Text Available We propose an innovative luggage tracking and management system that can be used to secure airport terminal services and reduce the waiting time of passengers during check-in. This addresses an urgent need to streamline and optimize passenger flows at airport terminals and lowers the risk of terrorist threats. The system employs Near Field Communication (NFC) technology and homomorphic cryptography (the Paillier cryptosystem) to protect wireless communication and stored data. A security analysis and a performance test show the usability and applicability of the proposed system.
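
The Paillier cryptosystem is additively homomorphic: multiplying two ciphertexts modulo n^2 yields an encryption of the sum of the plaintexts, so a server can aggregate encrypted data without decrypting it. The sketch below uses toy-sized primes for illustration only; it shows the primitive, not the paper's system.

```python
# Minimal Paillier cryptosystem demonstrating its additive homomorphism.
# Toy primes only -- real deployments use primes of ~1024 bits or more.
import math
import random

def keygen(p, q):
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1                      # standard simple choice of generator
    mu = pow(lam, -1, n)           # valid since L(g^lam mod n^2) = lam mod n
    return (n, g), (lam, mu)

def encrypt(pk, m):
    n, g = pk
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:     # r must be a unit modulo n
        r = random.randrange(1, n)
    return pow(g, m, n * n) * pow(r, n, n * n) % (n * n)

def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    L = (pow(c, lam, n * n) - 1) // n   # the L function L(u) = (u - 1) / n
    return L * mu % n

pk, sk = keygen(1789, 1861)        # toy primes
c1, c2 = encrypt(pk, 41), encrypt(pk, 1)
assert decrypt(pk, sk, c1 * c2 % (pk[0] ** 2)) == 42  # homomorphic addition
```

In a luggage-control setting this property lets, for example, encrypted counters or weights be updated by readers that never hold the decryption key.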

  14. Microscale optical cryptography using a subdiffraction-limit optical key

    Science.gov (United States)

    Ogura, Yusuke; Aino, Masahiko; Tanida, Jun

    2018-04-01

    We present microscale optical cryptography using a subdiffraction-limit optical pattern, which is finer than the diffraction-limit size of the decrypting optical system, as a key and a substrate with a reflectance distribution as an encrypted image. Because of the subdiffraction-limit spatial coding, this method enables us to construct a secret image with the diffraction-limit resolution. Simulation and experimental results demonstrate, both qualitatively and quantitatively, that the secret image becomes recognizable when and only when the substrate is illuminated with the designed key pattern.

  15. Implementing SSL/TLS using cryptography and PKI

    CERN Document Server

    Davies, Joshua

    2011-01-01

    Hands-on, practical guide to implementing SSL and TLS protocols for Internet security If you are a network professional who knows C programming, this practical book is for you.  Focused on how to implement Secure Socket Layer (SSL) and Transport Layer Security (TLS), this book guides you through all necessary steps, whether or not you have a working knowledge of cryptography. The book covers SSLv2, TLS 1.0, and TLS 1.2, including implementations of the relevant cryptographic protocols, secure hashing, certificate parsing, certificate generation, and more.  Coverage includes: Underst

  16. One-way entangled-photon autocompensating quantum cryptography

    Science.gov (United States)

    Walton, Zachary D.; Abouraddy, Ayman F.; Sergienko, Alexander V.; Saleh, Bahaa E.; Teich, Malvin C.

    2003-06-01

    A quantum cryptography implementation is presented that uses entanglement to combine one-way operation with an autocompensating feature that has hitherto only been available in implementations that require the signal to make a round trip between the users. Using the concept of advanced waves, it is shown that this proposed implementation is related to the round-trip implementation in the same way that Ekert’s two-particle scheme is related to the original one-particle scheme of Bennett and Brassard. The practical advantages and disadvantages of the proposed implementation are discussed in the context of existing schemes.

  17. Finite key analysis in quantum cryptography

    International Nuclear Information System (INIS)

    Meyer, T.

    2007-01-01

    In view of experimental realization of quantum key distribution schemes, the study of their efficiency becomes as important as the proof of their security. The latter is the subject of most of the theoretical work about quantum key distribution, and many important results such as the proof of unconditional security have been obtained. The efficiency and also the robustness of quantum key distribution protocols against noise can be measured by figures of merit such as the secret key rate (the fraction of input signals that make it into the key) and the threshold quantum bit error rate (the maximal error rate such that one can still create a secret key). It is important to determine these quantities because they tell us whether a certain quantum key distribution scheme can be used at all in a given situation and, if so, how many secret key bits it can generate in a given time. However, these figures of merit are usually derived under the "infinite key limit" assumption, that is, one assumes that an infinite number of quantum states are sent and that all sub-protocols of the scheme (in particular privacy amplification) are carried out on these infinitely large blocks. Such an assumption usually eases the analysis, but also leads to (potentially) too optimistic values for the quantities in question. In this thesis, we explicitly avoid the infinite key limit for the analysis of the privacy amplification step, which plays the most important role in a quantum key distribution scheme. We still assume that an optimal error correction code is applied, and we do not take into account any statistical errors that might occur in the parameter estimation step. Renner and coworkers derived an explicit formula for the obtainable key rate in terms of Renyi entropies of the quantum states describing Alice's, Bob's, and Eve's systems. This result serves as a starting point for our analysis, and we derive an algorithm that efficiently computes the obtainable key rate for any

  18. Finite key analysis in quantum cryptography

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, T.

    2007-10-31

    In view of experimental realization of quantum key distribution schemes, the study of their efficiency becomes as important as the proof of their security. The latter is the subject of most of the theoretical work about quantum key distribution, and many important results such as the proof of unconditional security have been obtained. The efficiency and also the robustness of quantum key distribution protocols against noise can be measured by figures of merit such as the secret key rate (the fraction of input signals that make it into the key) and the threshold quantum bit error rate (the maximal error rate such that one can still create a secret key). It is important to determine these quantities because they tell us whether a certain quantum key distribution scheme can be used at all in a given situation and, if so, how many secret key bits it can generate in a given time. However, these figures of merit are usually derived under the "infinite key limit" assumption, that is, one assumes that an infinite number of quantum states are sent and that all sub-protocols of the scheme (in particular privacy amplification) are carried out on these infinitely large blocks. Such an assumption usually eases the analysis, but also leads to (potentially) too optimistic values for the quantities in question. In this thesis, we explicitly avoid the infinite key limit for the analysis of the privacy amplification step, which plays the most important role in a quantum key distribution scheme. We still assume that an optimal error correction code is applied, and we do not take into account any statistical errors that might occur in the parameter estimation step. Renner and coworkers derived an explicit formula for the obtainable key rate in terms of Renyi entropies of the quantum states describing Alice's, Bob's, and Eve's systems. This result serves as a starting point for our analysis, and we derive an algorithm that efficiently computes

  19. Additional Effects of a Physical Therapy Protocol on Headache Frequency, Pressure Pain Threshold, and Improvement Perception in Patients With Migraine and Associated Neck Pain: A Randomized Controlled Trial.

    Science.gov (United States)

    Bevilaqua-Grossi, Débora; Gonçalves, Maria Claudia; Carvalho, Gabriela Ferreira; Florencio, Lidiane Lima; Dach, Fabíola; Speciali, José Geraldo; Bigal, Marcelo Eduardo; Chaves, Thaís Cristina

    2016-06-01

    To evaluate the additional effect provided by physical therapy in migraine treatment. Randomized controlled trial. Tertiary university-based hospital. Among the 300 patients approached, 50 women (age range, 18-55y) diagnosed with migraine were randomized into 2 groups: a control group (n=25) and a physiotherapy plus medication group (n=25) (N=50). Both groups received medication for migraine treatment. Additionally, physiotherapy plus medication patients received 8 sessions of physical therapy over 4 weeks, comprised mainly of manual therapy and stretching maneuvers lasting 50 minutes. A blinded examiner assessed the clinical outcomes of headache frequency, intensity, and self-perception of global change and physical outcomes of pressure pain threshold and cervical range of motion. Data were recorded at baseline, posttreatment, and 1-month follow-up. Twenty-three patients experienced side effects from the medication. Both groups reported a significantly reduced frequency of headaches; however, no differences were observed between groups (physiotherapy plus medication patients showed an additional 18% improvement at posttreatment and 12% improvement at follow-up compared with control patients, P>.05). The reduction observed in the physiotherapy plus medication patients was clinically relevant at posttreatment, whereas clinical relevance for control patients was demonstrated only at follow-up. For pain intensity, physiotherapy plus medication patients showed statistical evidence and clinical relevance with reduction posttreatment (P<.05). In addition, they showed better self-perception of global change than control patients (P<.05). The cervical muscle pressure pain threshold increased significantly in the physiotherapy plus medication patients and decreased in the control patients, but statistical differences between groups were observed only in the temporal area (P<.05). No differences were observed between groups regarding cervical range of motion. We cannot

  20. Applications of Fast Truncated Multiplication in Cryptography

    Directory of Open Access Journals (Sweden)

    Laszlo Hars

    2006-12-01

    Full Text Available Truncated multiplications compute truncated products, contiguous subsequences of the digits of integer products. For an n-digit multiplication algorithm of time complexity O(n^α), with 1 < α ≤ 2, there is a truncated multiplication algorithm which is a constant factor faster when computing a short enough truncated product. Applying these fast truncated multiplications, several cryptographic long-integer arithmetic algorithms are improved, including integer reciprocals, divisions, Barrett and Montgomery multiplications, and 2n-digit modular multiplication on hardware for n-digit half products. For example, Montgomery multiplication is performed in 2.6 Karatsuba multiplication time.
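
As background for the Montgomery case, the sketch below is classic Montgomery reduction (REDC) on plain Python integers: it computes t * R^(-1) mod n without dividing by n, and its two inner multiplications are exactly the half-product shapes that fast truncated multiplication accelerates. Modulus and R are toy values chosen for the sketch.

```python
# Montgomery reduction (REDC): t * R^(-1) mod n with only multiplications
# and shifts by R. The low-half and high-half products it uses are where
# truncated multiplication pays off.

def montgomery_setup(n, R):
    # n' such that n * n' ≡ -1 (mod R)
    return (-pow(n, -1, R)) % R

def redc(t, n, R, n_prime):
    """Return t * R^(-1) mod n, valid for 0 <= t < n * R."""
    m = (t % R) * n_prime % R        # only the low half of the product matters
    u = (t + m * n) // R             # exact division: the low half cancels
    return u - n if u >= n else u

n, R = 97, 256                       # R a power of two > n with gcd(R, n) = 1
n_prime = montgomery_setup(n, R)
a, b = 61, 17
a_bar, b_bar = a * R % n, b * R % n  # convert to Montgomery form
prod_bar = redc(a_bar * b_bar, n, R, n_prime)     # Montgomery product
assert redc(prod_bar, n, R, n_prime) == a * b % n # convert back, check
```

Keeping operands in Montgomery form lets a chain of modular multiplications, as in modular exponentiation, pay the conversion cost only once at each end.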

  1. Implementation of diffie-Hellman key exchange on wireless sensor using elliptic curve cryptography

    DEFF Research Database (Denmark)

    Khajuria, Samant; Tange, Henrik

    2009-01-01

    This work describes a low-cost public key cryptography (PKC) based solution for security services such as authentication as required for wireless sensor networks. We have implemented a software approach using elliptic curve cryptography (ECC) over GF(2^m) in order to obtain stronger cryptography....... In our approach we are using Koblitz curves and TNAF (τ-adic non-adjacent form) with partial reduction modulo. A Diffie-Hellman key exchange is implemented. This public key exchange is done between two participants in the network. The actual implementation is done on an 8-bit ATmega128L MICAz platform...
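
The structure of the key exchange itself can be sketched independently of the curve arithmetic. The example below runs elliptic-curve Diffie-Hellman on a small textbook prime-field curve (y^2 = x^3 + 2x + 2 over GF(17), generator (5, 1)) with plain double-and-add; the paper's actual setting, Koblitz curves over GF(2^m) with TNAF scalar recoding, replaces only the arithmetic underneath, not the protocol.

```python
# ECDH on a toy prime-field curve: both parties derive the same shared
# point d_A * d_B * G. Curve, generator and scalars are textbook-sized.

def ec_add(P, Q, a, p):
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                   # point at infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def ec_mul(k, P, a, p):
    """Scalar multiplication by double-and-add (TNAF would go here)."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P, a, p)
        P = ec_add(P, P, a, p)
        k >>= 1
    return R

p, a, G = 17, 2, (5, 1)            # toy curve y^2 = x^3 + 2x + 2 over GF(17)
d_alice, d_bob = 3, 7              # private scalars
Q_a, Q_b = ec_mul(d_alice, G, a, p), ec_mul(d_bob, G, a, p)  # public keys
assert ec_mul(d_alice, Q_b, a, p) == ec_mul(d_bob, Q_a, a, p)  # shared secret
```

On a constrained 8-bit node, the whole cost of this exchange is dominated by the two scalar multiplications, which is why the paper's TNAF recoding and field choice matter.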

  2. Number Theory and Applications : Proceedings of the International Conferences on Number Theory and Cryptography

    CERN Document Server

    Ramakrishnan, B

    2009-01-01

    This collection of articles contains the proceedings of the two international conferences (on Number Theory and Cryptography) held at the Harish-Chandra Research Institute. In recent years the interest in number theory has increased due to its applications in areas like error-correcting codes and cryptography. These proceedings contain papers in various areas of number theory, such as combinatorial, algebraic, analytic and transcendental aspects, arithmetic algebraic geometry, as well as graph theory and cryptography. While some papers do contain new results, several of the papers are expository articles that mention open questions, which will be useful to young researchers.

  3. An improved Peltier effect-based instrument for critical temperature threshold measurement in cold- and heat-induced urticaria.

    Science.gov (United States)

    Magerl, M; Abajian, M; Krause, K; Altrichter, S; Siebenhaar, F; Church, M K

    2015-10-01

    Cold- and heat-induced urticaria are chronic physical urticaria conditions in which wheals, angioedema or both are evoked by skin exposure to cold and heat respectively. The diagnostic work-up of both conditions should include skin provocation tests and accurate determination of critical temperature thresholds (CTT) for producing symptoms, in order to be able to predict the potential risk that each individual patient faces and how this may be ameliorated by therapy. To develop and validate TempTest® 4, a simple and relatively inexpensive instrument for the accurate determination of CTT which may be used in clinical practice. TempTest® 4 has a single 2 mm wide, 350 mm U-shaped Peltier element generating a temperature gradient from 4 °C to 44 °C along its length. Using a clear plastic guide placed over the skin after provocation, CTT values may be determined with an accuracy of ±1 °C. Here, TempTest® 4 was compared with its much more expensive predecessor, TempTest® 3, in inducing wheals in 30 cold urticaria patients. Both TempTest® 4 and TempTest® 3 induced wheals in all 30 patients between 8 °C and 28 °C. There was a highly significant (P < 0.0001) correlation between the instruments in the CTT values in individual patients. The TempTest® 4 is a simple, easy to use, licensed, commercially available and affordable instrument for the determination of CTTs in both cold- and heat-induced urticaria. © 2014 European Academy of Dermatology and Venereology.

  4. Entropy in quantum information theory - Communication and cryptography

    DEFF Research Database (Denmark)

    Majenz, Christian

    Entropies have been immensely useful in information theory. In this Thesis, several results in quantum information theory are collected, most of which use entropy as the main mathematical tool. The first one concerns the von Neumann entropy. While a direct generalization of the Shannon entropy to density matrices, the von Neumann entropy behaves differently. The latter does not, for example, have the monotonicity property that the former possesses: when adding another quantum system, the entropy can decrease. A long-standing open question is whether there are quantum analogues of unconstrained non...... in quantum Shannon theory. While immensely more entanglement-consuming, the variant of port based teleportation is interesting for applications like instantaneous non-local computation and attacks on quantum position-based cryptography. Port based teleportation cannot be implemented perfectly....

  5. APE: Authenticated Permutation-Based Encryption for Lightweight Cryptography

    DEFF Research Database (Denmark)

    Andreeva, Elena; Bilgin, Begül; Bogdanov, Andrey

    2015-01-01

    The domain of lightweight cryptography focuses on cryptographic algorithms for extremely constrained devices. It is very costly to avoid nonce reuse in such environments, because this requires either a hardware source of randomness, or non-volatile memory to store a counter. At the same time, a lot...... of cryptographic schemes actually require the nonce assumption for their security. In this paper, we propose APE as the first permutation-based authenticated encryption scheme that is resistant against nonce misuse. We formally prove that APE is secure, based on the security of the underlying permutation......, and Spongent. For any of these permutations, an implementation that supports both encryption and decryption requires less than 1.9 kGE and 2.8 kGE for 80-bit and 128-bit security levels, respectively....

  6. Position-based quantum cryptography over untrusted networks

    International Nuclear Information System (INIS)

    Nadeem, Muhammad

    2014-01-01

    In this article, we propose quantum position verification (QPV) schemes where all the channels are untrusted except the position of the prover and the distant reference stations of the verifiers. We review and analyze the existing QPV schemes containing some pre-shared data between the prover and verifiers. Most of these schemes are based on non-cryptographic assumptions, i.e. that the quantum/classical channels between the verifiers are secure. This seems impractical in an environment fully controlled by adversaries and would lead to security compromise in practical implementations. However, our proposed scheme for QPV is more robust and secure, and follows the standard assumptions of cryptography. Furthermore, once the position of the prover is verified, our schemes establish secret keys in parallel and can be used for authentication and secret communication between the prover and verifiers. (paper)

  7. Scalable Normal Basis Arithmetic Unit for Elliptic Curve Cryptography

    Directory of Open Access Journals (Sweden)

    J. Schmidt

    2005-01-01

    Full Text Available The design of a scalable arithmetic unit for operations over elements of GF(2^m) represented in normal basis is presented. The unit is applicable in public-key cryptography. It comprises a pipelined Massey-Omura multiplier and a shifter. We equipped the multiplier with additional data paths to enable easy implementation of both multiplication and inversion in a single arithmetic unit. We discuss optimum design of the shifter with respect to the inversion algorithm and multiplier performance. The functionality of the multiplier/inverter has been tested by simulation and implemented in a Xilinx Virtex FPGA. We present implementation data for various digit widths, which exhibit a time minimum for digit width D = 15.
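
For readers unfamiliar with GF(2^m) arithmetic, the sketch below multiplies field elements in the simpler polynomial basis (carry-less schoolbook multiply with interleaved reduction), rather than the normal basis the Massey-Omura unit uses; both represent the same field, and the hardware unit computes the same products bit-serially.

```python
# Field multiplication in GF(2^m), polynomial basis: elements are bit
# vectors of polynomial coefficients, addition is XOR, and products are
# reduced modulo an irreducible polynomial.

def gf2m_mul(x, y, poly, m):
    """Multiply x and y in GF(2^m) defined by irreducible `poly`."""
    r = 0
    while y:                 # carry-less (XOR) schoolbook multiplication
        if y & 1:
            r ^= x
        y >>= 1
        x <<= 1
        if (x >> m) & 1:     # reduce as soon as the degree reaches m
            x ^= poly
    return r

# GF(2^8) with the AES polynomial x^8 + x^4 + x^3 + x + 1 (0x11B):
POLY, M = 0x11B, 8
print(hex(gf2m_mul(0x53, 0xCA, POLY, M)))  # 0x53 * 0xCA = 0x1 (inverses)
```

A normal-basis unit like the one in the paper trades this reduction logic for free squaring (a cyclic shift), which is why it pairs well with inversion by repeated squaring.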

  8. Tight finite-key analysis for quantum cryptography.

    Science.gov (United States)

    Tomamichel, Marco; Lim, Charles Ci Wen; Gisin, Nicolas; Renner, Renato

    2012-01-17

    Despite enormous theoretical and experimental progress in quantum cryptography, the security of most current implementations of quantum key distribution is still not rigorously established. One significant problem is that the security of the final key strongly depends on the number, M, of signals exchanged between the legitimate parties. Yet, existing security proofs are often only valid asymptotically, for unrealistically large values of M. Another challenge is that most security proofs are very sensitive to small differences between the physical devices used by the protocol and the theoretical model used to describe them. Here we show that these gaps between theory and experiment can be simultaneously overcome by using a recently developed proof technique based on the uncertainty relation for smooth entropies.

  9. Postselection technique for quantum channels with applications to quantum cryptography.

    Science.gov (United States)

    Christandl, Matthias; König, Robert; Renner, Renato

    2009-01-16

    We propose a general method for studying properties of quantum channels acting on an n-partite system, whose action is invariant under permutations of the subsystems. Our main result is that, in order to prove that a certain property holds for an arbitrary input, it is sufficient to consider the case where the input is a particular de Finetti-type state, i.e., a state which consists of n identical and independent copies of an (unknown) state on a single subsystem. Our technique can be applied to the analysis of information-theoretic problems. For example, in quantum cryptography, we get a simple proof for the fact that security of a discrete-variable quantum key distribution protocol against collective attacks implies security of the protocol against the most general attacks. The resulting security bounds are tighter than previously known bounds obtained with help of the exponential de Finetti theorem.

  10. Why cryptography should not rely on physical attack complexity

    CERN Document Server

    Krämer, Juliane

    2015-01-01

    This book presents two practical physical attacks. It shows how attackers can reveal the secret key of symmetric as well as asymmetric cryptographic algorithms based on these attacks, and presents countermeasures on the software and the hardware level that can help to prevent them in the future. Though their theory has been known for several years now, since neither attack has yet been successfully implemented in practice, they have generally not been considered a serious threat. In short, their physical attack complexity has been overestimated and the implied security threat has been underestimated. First, the book introduces the photonic side channel, which offers not only temporal resolution, but also the highest possible spatial resolution. Due to the high cost of its initial implementation, it has not been taken seriously. The work shows both simple and differential photonic side channel analyses. Then, it presents a fault attack against pairing-based cryptography. Due to the need for at least two indepe...

  11. Dynamic visual cryptography on deformable finite element grids

    Science.gov (United States)

    Aleksiene, S.; Vaidelys, M.; Aleksa, A.; Ragulskis, M.

    2017-07-01

    A dynamic visual cryptography scheme based on time-averaged moiré fringes on deformable finite element grids is introduced in this paper. A predefined eigenshape function is used for the selection of the pitch of the moiré grating. The relationship between the pitch of the moiré grating, the roots of the zero-order Bessel function of the first kind, and the amplitude of harmonic oscillations is derived and validated by computational experiments. A phase regularization algorithm is applied over the entire area of the cover image in order to embed the secret image and to avoid large fluctuations of the moiré grating. Computational simulations are used to demonstrate the efficiency and the applicability of the proposed image hiding technique.

  12. Security proof of quantum cryptography based entirely on entanglement purification

    International Nuclear Information System (INIS)

    Aschauer, Hans; Briegel, Hans J.

    2002-01-01

    We give a proof that entanglement purification, even with noisy apparatus, is sufficient to disentangle an eavesdropper (Eve) from the communication channel. In the security regime, the purification process factorizes the overall initial state into a tensor-product state of Alice and Bob, on one side, and Eve on the other side, thus establishing a completely private, albeit noisy, quantum communication channel between Alice and Bob. The security regime is found to coincide for all practical purposes with the purification regime of a two-way recurrence protocol. This makes two-way entanglement purification protocols, which constitute an important element in the quantum repeater, an efficient tool for secure long-distance quantum cryptography

  13. Design and implementation of lattice-based cryptography

    OpenAIRE

    Lepoint , Tancrède

    2014-01-01

    Today, lattice-based cryptography is a thriving scientific field. Its swift expansion is due, among others, to the attractiveness of fully homomorphic encryption and cryptographic multilinear maps. Lattice-based cryptography has also been recognized for its thrilling properties: a security that can be reduced to worst-case instances of problems over lattices, a quasi-optimal asymptotic efficiency and an alleged resistance to quantum computers. However, its practical use in real-world products...

  14. Security Enhanced User Authentication Protocol for Wireless Sensor Networks Using Elliptic Curves Cryptography

    Science.gov (United States)

    Choi, Younsung; Lee, Donghoon; Kim, Jiye; Jung, Jaewook; Nam, Junghyun; Won, Dongho

    2014-01-01

    Wireless sensor networks (WSNs) consist of sensors, gateways and users. Sensors are widely distributed to monitor various conditions, such as temperature, sound, speed and pressure but they have limited computational ability and energy. To reduce the resource use of sensors and enhance the security of WSNs, various user authentication protocols have been proposed. In 2011, Yeh et al. first proposed a user authentication protocol based on elliptic curve cryptography (ECC) for WSNs. However, it turned out that Yeh et al.'s protocol does not provide mutual authentication, perfect forward secrecy, and key agreement between the user and sensor. Later in 2013, Shi et al. proposed a new user authentication protocol that improves both security and efficiency of Yeh et al.'s protocol. However, Shi et al.'s improvement introduces other security weaknesses. In this paper, we show that Shi et al.'s improved protocol is vulnerable to session key attack, stolen smart card attack, and sensor energy exhausting attack. In addition, we propose a new, security-enhanced user authentication protocol using ECC for WSNs. PMID:24919012

  15. Security Enhanced User Authentication Protocol for Wireless Sensor Networks Using Elliptic Curves Cryptography

    Directory of Open Access Journals (Sweden)

    Younsung Choi

    2014-06-01

    Full Text Available Wireless sensor networks (WSNs) consist of sensors, gateways and users. Sensors are widely distributed to monitor various conditions, such as temperature, sound, speed and pressure but they have limited computational ability and energy. To reduce the resource use of sensors and enhance the security of WSNs, various user authentication protocols have been proposed. In 2011, Yeh et al. first proposed a user authentication protocol based on elliptic curve cryptography (ECC) for WSNs. However, it turned out that Yeh et al.’s protocol does not provide mutual authentication, perfect forward secrecy, and key agreement between the user and sensor. Later in 2013, Shi et al. proposed a new user authentication protocol that improves both security and efficiency of Yeh et al.’s protocol. However, Shi et al.’s improvement introduces other security weaknesses. In this paper, we show that Shi et al.’s improved protocol is vulnerable to session key attack, stolen smart card attack, and sensor energy exhausting attack. In addition, we propose a new, security-enhanced user authentication protocol using ECC for WSNs.

  16. Security enhanced user authentication protocol for wireless sensor networks using elliptic curves cryptography.

    Science.gov (United States)

    Choi, Younsung; Lee, Donghoon; Kim, Jiye; Jung, Jaewook; Nam, Junghyun; Won, Dongho

    2014-06-10

    Wireless sensor networks (WSNs) consist of sensors, gateways and users. Sensors are widely distributed to monitor various conditions, such as temperature, sound, speed and pressure but they have limited computational ability and energy. To reduce the resource use of sensors and enhance the security of WSNs, various user authentication protocols have been proposed. In 2011, Yeh et al. first proposed a user authentication protocol based on elliptic curve cryptography (ECC) for WSNs. However, it turned out that Yeh et al.'s protocol does not provide mutual authentication, perfect forward secrecy, and key agreement between the user and sensor. Later in 2013, Shi et al. proposed a new user authentication protocol that improves both security and efficiency of Yeh et al.'s protocol. However, Shi et al.'s improvement introduces other security weaknesses. In this paper, we show that Shi et al.'s improved protocol is vulnerable to session key attack, stolen smart card attack, and sensor energy exhausting attack. In addition, we propose a new, security-enhanced user authentication protocol using ECC for WSNs.
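    The ECC primitive these authentication protocols build on is elliptic-curve scalar multiplication, from which two parties can derive a shared secret (the Diffie-Hellman step underlying key agreement). The sketch below shows the mechanics over a deliberately tiny, hypothetical curve; it is not the protocol from the paper, and the parameters are far too small to be secure.

```python
# Toy elliptic-curve Diffie-Hellman over the hypothetical curve
# y^2 = x^3 + 2x + 3 over F_97 (illustrative only; real ECC uses ~256-bit fields).

p, a = 97, 2                     # field prime and curve coefficient a
G = (0, 10)                      # base point: 10^2 = 100 = 3 = 0^3 + 2*0 + 3 (mod 97)

def inv(x):
    return pow(x, p - 2, p)      # modular inverse via Fermat's little theorem

def add(P, Q):                   # affine point addition (None = point at infinity)
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:
        s = (3 * x1 * x1 + a) * inv(2 * y1) % p
    else:
        s = (y2 - y1) * inv((x2 - x1) % p) % p
    x3 = (s * s - x1 - x2) % p
    return (x3, (s * (x1 - x3) - y1) % p)

def mul(k, P):                   # double-and-add scalar multiplication
    R = None
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

ka, kb = 13, 29                  # Alice's and Bob's secret scalars (toy values)
shared_a = mul(ka, mul(kb, G))   # Alice combines her secret with Bob's public key
shared_b = mul(kb, mul(ka, G))   # Bob does the converse
assert shared_a == shared_b      # both arrive at the same shared point
```

    In a WSN setting the appeal of ECC is that such scalar multiplications give comparable security to RSA at far smaller key sizes, which matters for the constrained sensors described above.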

  17. Multiple Schemes for Mobile Payment Authentication Using QR Code and Visual Cryptography

    Directory of Open Access Journals (Sweden)

    Jianfeng Lu

    2017-01-01

    Full Text Available QR code (quick response code) is used due to its beneficial properties, especially in the mobile payment field. However, there exists an inevitable risk in the transaction process: it is not easily perceived that an attacker has tampered with or replaced the QR code that contains the merchant’s beneficiary account. Thus, it is of great urgency to authenticate QR codes. In this study, we propose a novel mechanism based on the visual cryptography scheme (VCS) and aesthetic QR codes, which contains three primary schemes for different concealment levels. The main steps of these schemes are as follows. Firstly, one original QR code is split into two shadows using VC multiple rules; secondly, the two shadows are embedded into the same background image and the embedded results are fused with the same carrier QR code, using the XOR mechanism of RS and the QR code error correction mechanism. Finally, the two aesthetic QR codes can be stacked precisely and the original QR code is restored according to the defined VCS. Experiments corresponding to the three proposed schemes are conducted and demonstrate the feasibility and security of the mobile payment authentication, the significant improvement of the concealment for the shadows in QR code, and the diversity of mobile payment authentication.
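    The share-splitting step can be illustrated with a minimal XOR-based (2,2) scheme: one share is uniformly random, the other is the secret XOR-ed with it, and stacking (XOR-ing) both restores the secret exactly. This sketch covers only the splitting and stacking; the paper's embedding of shadows into aesthetic QR codes is not modeled.

```python
import secrets

# Minimal (2,2) XOR-based visual secret sharing of a binary image row.
# Each share alone is uniformly random and reveals nothing about the secret.

def split(secret_bits):
    """Split a list of 0/1 pixels into two random-looking shares."""
    share1 = [secrets.randbelow(2) for _ in secret_bits]
    share2 = [s ^ r for s, r in zip(secret_bits, share1)]
    return share1, share2

def stack(share1, share2):
    """XOR-stacking the two shares restores the secret exactly (lossless)."""
    return [a ^ b for a, b in zip(share1, share2)]

secret = [1, 0, 1, 1, 0, 0, 1, 0]   # a toy 'image' row
s1, s2 = split(secret)
assert stack(s1, s2) == secret
```

    Unlike classical OR-based stacking of transparencies, the XOR variant used here reconstructs the secret without pixel expansion or contrast loss, which is why XOR-based VCS suits digital carriers such as QR codes.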

  18. System Level Design of Reconfigurable Server Farms Using Elliptic Curve Cryptography Processor Engines

    Directory of Open Access Journals (Sweden)

    Sangook Moon

    2014-01-01

    Full Text Available As today’s hardware architecture becomes more and more complicated, it is getting harder to modify or improve the microarchitecture of a design in register transfer level (RTL). Consequently, traditional methods we have used to develop a design are not capable of coping with complex designs. In this paper, we suggest a way of designing complex digital logic circuits with a soft and advanced type of SystemVerilog at an electronic system level. We apply the concept of design-and-reuse with a high level of abstraction to implement elliptic curve crypto-processor server farms. With the concept of the superior level of abstraction to the RTL used with the traditional HDL design, we successfully achieved the soft implementation of the crypto-processor server farms as well as robust test bench code with trivial effort in the same simulation environment. Otherwise, it could have required error-prone Verilog simulations for the hardware IPs and other time-consuming jobs such as C/SystemC verification for the software, sacrificing more time and effort. In the design of the elliptic curve cryptography processor engine, we propose a 3X faster GF(2^m) serial multiplication architecture.
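    A serial GF(2^m) multiplier of the kind referred to above processes one multiplier bit per cycle, interleaving shifts with modular reduction by the field polynomial. A software sketch of that shift-and-reduce loop, shown for GF(2^8) with the AES polynomial purely as a familiar instance of GF(2^m):

```python
# Bit-serial multiplication in GF(2^8) with the AES reduction polynomial
# x^8 + x^4 + x^3 + x + 1 (0x11B). One multiplier bit is consumed per
# iteration, mirroring the cycle-by-cycle behaviour of a serial hardware unit.

def gf_mul(a, b, poly=0x11B, m=8):
    acc = 0
    for i in reversed(range(m)):     # process multiplier bits MSB first
        acc <<= 1                    # shift partial product (multiply by x)
        if acc & (1 << m):
            acc ^= poly              # reduce modulo the field polynomial
        if (b >> i) & 1:
            acc ^= a                 # conditionally add (XOR) the multiplicand
    return acc

assert gf_mul(0x53, 0xCA) == 0x01    # a known multiplicative-inverse pair in the AES field
assert gf_mul(0x02, 0x80) == 0x1B    # x * x^7 reduces to x^4 + x^3 + x + 1
```

    In hardware, the same loop body becomes a shift register, an XOR network for the reduction, and an AND-gated XOR for the conditional add, which is why serial multipliers are compact at the cost of m cycles per product.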

  19. Novel Noncommutative Cryptography Scheme Using Extra Special Group

    Directory of Open Access Journals (Sweden)

    Gautam Kumar

    2017-01-01

    Full Text Available Noncommutative cryptography (NCC) is truly a fascinating area with great hope of advancing performance and security for high end applications. It provides a high level of safety measures. The basis of this group is established on the hidden subgroup or subfield problem (HSP). The major focus in this manuscript is to establish the cryptographic schemes on the extra special group (ESG). ESG is one of the most appropriate noncommutative platforms for the solution of an open problem. The working principle is based on the random polynomials chosen by the communicating parties to secure key exchange, encryption-decryption, and authentication schemes. This group supports the Heisenberg, dihedral order, and quaternion groups. Further, this is enhanced from the general group elements to equivalent ring elements, known by the monomials generations for the cryptographic schemes. In this regard, special or peculiar matrices show the potential advantages. The projected approach is exclusively based on the typical sparse matrices, and an analysis report is presented fulfilling the central cryptographic requirements. The order of this group makes it more resistant to attacks such as length-based, automorphism, and brute-force attacks.

  20. Optical cryptography with biometrics for multi-depth objects.

    Science.gov (United States)

    Yan, Aimin; Wei, Yang; Hu, Zhijuan; Zhang, Jingtao; Tsang, Peter Wai Ming; Poon, Ting-Chung

    2017-10-11

    We propose an optical cryptosystem for encrypting images of multi-depth objects based on the combination of optical heterodyne technique and fingerprint keys. Optical heterodyning requires two optical beams to be mixed. For encryption, each optical beam is modulated by an optical mask containing either the fingerprint of the person who is sending, or receiving the image. The pair of optical masks are taken as the encryption keys. Subsequently, the two beams are used to scan over a multi-depth 3-D object to obtain an encrypted hologram. During the decryption process, each sectional image of the 3-D object is recovered by convolving its encrypted hologram (through numerical computation) with the encrypted hologram of a pinhole image that is positioned at the same depth as the sectional image. Our proposed method has three major advantages. First, the lost-key situation can be avoided with the use of fingerprints as the encryption keys. Second, the method can be applied to encrypt 3-D images for subsequent decrypted sectional images. Third, since optical heterodyning scanning is employed to encrypt a 3-D object, the optical system is incoherent, resulting in negligible amount of speckle noise upon decryption. To the best of our knowledge, this is the first time optical cryptography of 3-D object images has been demonstrated in an incoherent optical system with biometric keys.

  1. Decoding chaotic cryptography without access to the superkey

    International Nuclear Information System (INIS)

    Vaidya, P.G.; Angadi, Savita

    2003-01-01

    Some chaotic systems can be synchronized by sending only a part of the state space information. This property is used to create keys for cryptography using the unsent state spaces. This idea was first used in connection with the Lorenz equation. It has been assumed for that equation that access to the unsent information is impossible without knowing the three parameters of the equation. This is why the values of these parameters are collectively known as the 'superkey'. The exhaustive search for this key from the existing data is time consuming and can easily be countered by changing the key. We show in this paper how the superkey can be found in a very rapid manner from the synchronizing signal. We achieve this by first transforming the Lorenz equation to a canonical form. Then we use our recently developed method to find highly accurate derivatives from data. Next we transform a nonlinear equation for the superkey to a linear form by embedding it in four dimensions. The final equations are solved by using the generalized inverse
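    The synchronization-by-partial-state idea that such schemes rely on can be reproduced in a few lines: in a Pecora-Carroll style x-drive, a receiver that knows the parameters (the 'superkey') reconstructs the unsent y and z states from the transmitted x alone. The sketch below uses the standard Lorenz parameters and simple Euler integration; it illustrates the synchronization itself, not the paper's key-recovery method.

```python
# Pecora-Carroll synchronization of the Lorenz system: the sender transmits
# only x(t); a receiver copy of the (y, z) subsystem driven by that signal
# converges to the sender's hidden states. The parameters (sigma, rho, beta)
# play the role of the 'superkey'. Simple Euler integration, illustrative only.

sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
dt, steps = 0.001, 40000

x, y, z = 1.0, 1.0, 20.0        # sender state
yr, zr = -5.0, 5.0              # receiver's (wrong) guesses for the hidden y, z

for _ in range(steps):
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    # receiver integrates the same equations, but driven by the transmitted x
    dyr = x * (rho - zr) - yr
    dzr = x * yr - beta * zr
    x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
    yr, zr = yr + dt * dyr, zr + dt * dzr

# after a transient, the receiver has locked onto the hidden states
assert abs(yr - y) < 1e-3 and abs(zr - z) < 1e-3
```

    The error dynamics of the driven (y, z) subsystem are strictly contracting for the x-drive, which is why the unsent states are recovered exactly; an eavesdropper without the correct parameters does not synchronize, which is the property the paper's superkey-recovery attack undermines.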

  2. Voting on Thresholds for Public Goods

    DEFF Research Database (Denmark)

    Rauchdobler, Julian; Sausgruber, Rupert; Tyran, Jean-Robert

    Introducing a threshold in the sense of a minimal project size transforms a public goods game with an inefficient equilibrium into a coordination game with a set of Pareto-superior equilibria. Thresholds may therefore improve efficiency in the voluntary provision of public goods. In our one-shot experiment, we find that coordination often fails and exogenously imposed thresholds are ineffective at best and often counter-productive. This holds under a range of threshold levels and refund rates. We test if thresholds perform better if they are endogenously chosen, i.e. if a threshold is approved...

  3. Strengthen Cloud Computing Security with Federal Identity Management Using Hierarchical Identity-Based Cryptography

    Science.gov (United States)

    Yan, Liang; Rong, Chunming; Zhao, Gansen

    More and more companies are beginning to provide different kinds of cloud computing services for Internet users, but at the same time these services also bring some security problems. Currently, the majority of cloud computing systems provide a digital identity for users to access their services, which brings some inconvenience for a hybrid cloud that includes multiple private clouds and/or public clouds. Today most cloud computing systems use asymmetric and traditional public key cryptography to provide data security and mutual authentication. Identity-based cryptography has some attractive characteristics that seem to fit well the requirements of cloud computing. In this paper, by adopting federated identity management together with hierarchical identity-based cryptography (HIBC), not only the key distribution but also the mutual authentication can be simplified in the cloud.

  4. Device-independent two-party cryptography secure against sequential attacks

    DEFF Research Database (Denmark)

    Kaniewski, Jedrzej; Wehner, Stephanie

    2016-01-01

    The goal of two-party cryptography is to enable two parties, Alice and Bob, to solve common tasks without the need for mutual trust. Examples of such tasks are private access to a database, and secure identification. Quantum communication enables security for all of these problems in the noisy-storage model by sending more signals than the adversary can store in a certain time frame. Here, we initiate the study of device-independent (DI) protocols for two-party cryptography in the noisy-storage model. Specifically, we present a relatively easy to implement protocol for a cryptographic building block known as weak string erasure and prove its security even if the devices used in the protocol are prepared by the dishonest party. DI two-party cryptography is made challenging by the fact that Alice and Bob do not trust each other, which requires new techniques to establish security. We fully analyse...

  5. AUDIO CRYPTANALYSIS- AN APPLICATION OF SYMMETRIC KEY CRYPTOGRAPHY AND AUDIO STEGANOGRAPHY

    Directory of Open Access Journals (Sweden)

    Smita Paira

    2016-09-01

    Full Text Available In the recent trend of network and technology, “Cryptography” and “Steganography” have emerged as essential elements of providing network security. Although cryptography plays a major role in the fabrication and modification of the secret message into an encrypted version, it has certain drawbacks. Steganography is the art that addresses one of the basic limitations of cryptography. In this paper, a new algorithm has been proposed based on both symmetric key cryptography and audio steganography. The combination of a randomly generated symmetric key along with the LSB technique of audio steganography sends a secret message through an insecure medium without it being recognized. The generated stego file is almost lossless, giving a 100 percent recovery of the original message. This paper also presents a detailed experimental analysis of the algorithm with a brief comparison with other existing algorithms and a future scope. The experimental verification and security issues are promising.
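    The encrypt-then-embed pipeline described above can be sketched in a few lines: a toy XOR cipher stands in for the paper's randomly generated symmetric key scheme, and the least significant bit of each audio sample carries one ciphertext bit. All values are illustrative.

```python
# LSB audio steganography after symmetric encryption (toy sketch, not the
# paper's exact algorithm). Samples model signed 16-bit audio; flipping only
# the LSB changes each sample by at most 1, which is inaudible.

def embed(samples, payload_bits):
    """Overwrite the LSB of the first len(payload_bits) samples."""
    stego = [(s & ~1) | b for s, b in zip(samples, payload_bits)]
    return stego + samples[len(payload_bits):]

def extract(samples, n_bits):
    return [s & 1 for s in samples[:n_bits]]

key = [1, 0, 1, 1]                                  # toy symmetric key (assumed)
message = [0, 1, 1, 0]
cipher = [m ^ k for m, k in zip(message, key)]      # encrypt with XOR
cover = [1000, -3, 77, 500, 12]                     # toy 'audio' samples
stego = embed(cover, cipher)
recovered = [c ^ k for c, k in zip(extract(stego, 4), key)]  # extract, decrypt
assert recovered == message                         # lossless recovery
```

    Because only LSBs are rewritten, the stego signal is perceptually indistinguishable from the cover, while the XOR layer ensures that even an extracted bit stream is useless without the key.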

  6. Special Issue on Entropy-Based Applied Cryptography and Enhanced Security for Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

    James (Jong Hyuk Park

    2016-09-01

    Full Text Available Entropy is a basic and important concept in information theory. It is also often used as a measure of the unpredictability of a cryptographic key in cryptography research areas. Ubiquitous computing (Ubi-comp) has emerged rapidly as an exciting new paradigm. For this special issue, we mainly selected and discussed papers related to core theories based on graph theory for solving computational problems in cryptography and security, and practical technologies, applications and services for Ubi-comp, including secure encryption techniques, identity and authentication; credential cloning attacks and countermeasures; a switching generator with resistance against algebraic and side channel attacks; entropy-based network anomaly detection; applied cryptography using chaos functions, information hiding and watermarking, secret sharing, message authentication, detection and modeling of cyber attacks with Petri Nets, and quantum flows for secret key distribution, etc.

  7. Two-out-of-two color matching based visual cryptography schemes.

    Science.gov (United States)

    Machizaud, Jacques; Fournel, Thierry

    2012-09-24

    Visual cryptography, which consists in sharing a secret message between transparencies, has been extended to color prints. In this paper, we propose a new visual cryptography scheme based on color matching. The stacked printed media reveal a uniformly colored message decoded by the human visual system. In contrast with previous color visual cryptography schemes, the proposed one makes it possible to share images without pixel expansion and to detect a forgery, as the color of the message is kept secret. In order to correctly print the colors on the media and to increase the security of the scheme, we use spectral models developed for color reproduction, describing printed colors from an optical point of view.

  8. An Efficient Interception Mechanism Against Cheating In Visual Cryptography With Non Pixel Expansion Of Images

    Directory of Open Access Journals (Sweden)

    Linju P.S

    2015-08-01

    Full Text Available Visual cryptography is a technique of cryptography in which secret images are divided into multiple shares that are distributed to different entities. The secret can be reconstructed by superimposing these shares using different operations. Common drawbacks of all existing methods are pixel expansion and noise at the output. Other major issues that can occur in existing visual cryptography systems are shareholders cheating each other and shareholders cheating the owner. In order to overcome these limitations, a sealing algorithm is used with two applications of VC, namely MIVC and EVC. Here, two secret images can be sent at the same time by converting them to halftone representations, which in turn are partitioned into three shares in total.

  9. Decoy state method for quantum cryptography based on phase coding into faint laser pulses

    Science.gov (United States)

    Kulik, S. P.; Molotkov, S. N.

    2017-12-01

    We discuss the photon number splitting attack (PNS) in systems of quantum cryptography with phase coding. It is shown that this attack, as well as the structural equations for the PNS attack for phase encoding, differs physically from the analogous attack applied to the polarization coding. As far as we know, in practice, in all works to date processing of experimental data has been done for phase coding, but using formulas for polarization coding. This can lead to inadequate results for the length of the secret key. These calculations are important for the correct interpretation of the results, especially if it concerns the criterion of secrecy in quantum cryptography.
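    The physical root of the PNS attack is that a faint laser pulse's photon number is Poisson distributed: with mean photon number μ, the fraction of pulses carrying two or more photons, and thus splittable by an eavesdropper, is 1 − e^(−μ)(1 + μ). A quick numeric check (the μ values are illustrative, not taken from the paper):

```python
import math

# Multiphoton (splittable) fraction of weak coherent pulses with Poissonian
# photon-number statistics: P(n >= 2) = 1 - e^(-mu) * (1 + mu).

def multiphoton_prob(mu):
    return 1.0 - math.exp(-mu) * (1.0 + mu)

# small mu keeps the splittable fraction low, at the cost of many empty pulses
assert multiphoton_prob(0.1) < 0.005   # roughly 0.47% of pulses
assert multiphoton_prob(0.5) > 0.09    # roughly 9% of pulses
```

    Decoy-state methods exploit exactly this statistic: by randomly interleaving pulses with different μ, the legitimate parties can bound how many multiphoton pulses Eve could have split, regardless of the coding (phase or polarization) used.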

  10. Walking training at the heart rate of pain threshold improves cardiovascular function and autonomic regulation in intermittent claudication: A randomized controlled trial.

    Science.gov (United States)

    Chehuen, Marcel; Cucato, Gabriel G; Carvalho, Celso Ricardo F; Ritti-Dias, Raphael M; Wolosker, Nelson; Leicht, Anthony S; Forjaz, Cláudia Lúcia M

    2017-10-01

    This study investigated the effects of walking training (WT) on cardiovascular function and autonomic regulation in patients with intermittent claudication (IC). Randomized controlled trial. Forty-two male patients with IC (≥50years) were randomly allocated into two groups: control (CG, n=20, 30min of stretching exercises) and WT (WTG, n=22, 15 bouts of 2min of walking interpolated by 2min of upright rest; walking intensity was set at the heart rate of pain threshold). Both interventions were performed twice/week for 12 weeks. Walking capacity (maximal treadmill test), blood pressure (auscultatory), cardiac output (CO 2 rebreathing), heart rate (ECG), stroke volume, systemic vascular resistance, forearm and calf vascular resistance (plethysmography), and low (LF) and high frequency (HF) components of heart rate variability and spontaneous baroreflex sensitivity were measured at baseline and after 12 weeks of the study. WT increased total walking distance (+302±85m, p=0.001) and spontaneous baroreflex sensitivity (+2.13±1.07ms/mmHg, p=0.02). Additionally, at rest, WT decreased systolic and mean blood pressures (-10±3 and -5±2mmHg, p=0.001 and p=0.01, respectively), cardiac output (-0.37±0.24l/min, p=0.03), heart rate (-4±2bpm, p=0.001), forearm vascular resistance (-8.5±2.8U, p=0.02) and LF/HF (-1.24±0.99, p=0.001). No change was observed in the CG. In addition to increasing walking capacity, WT improved cardiovascular function and autonomic regulation in patients with IC. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  11. The Design and Evaluation of a Cryptography Teaching Strategy for Software Engineering Students

    Science.gov (United States)

    Dowling, T.

    2006-01-01

    The present paper describes the design, implementation and evaluation of a cryptography module for final-year software engineering students. The emphasis is on implementation architectures and practical cryptanalysis rather than a standard mathematical approach. The competitive continuous assessment process reflects this approach and rewards…

  12. An Application-Independent Cryptography Model That Is Easy to Use for All Level Users

    Science.gov (United States)

    Gabrielson, Anthony J.

    2013-01-01

    Cryptography libraries are inflexible and difficult for developers to integrate with their applications. These difficulties are often encountered by applications, like PGP, which are non-intuitive for end-users and are often used improperly or not at all. This thesis discusses the negative impact of the current prevailing poor usability on…

  13. One-time pad, complexity of verification of keys, and practical security of quantum cryptography

    Energy Technology Data Exchange (ETDEWEB)

    Molotkov, S. N., E-mail: sergei.molotkov@gmail.com [Russian Academy of Sciences, Institute of Solid State Physics (Russian Federation)

    2016-11-15

    A direct relation between the complexity of the complete verification of keys, which is one of the main criteria of security in classical systems, and a trace distance used in quantum cryptography is demonstrated. Bounds for the minimum and maximum numbers of verification steps required to determine the actual key are obtained.

  14. Quantum cryptography using a photon source based on postselection from entangled two-photon states

    Czech Academy of Sciences Publication Activity Database

    Peřina ml., Jan; Haderka, Ondřej; Soubusta, Jan

    2001-01-01

    Roč. 64, - (2001), s. 052305-1-152305-13 ISSN 1050-2947 R&D Projects: GA MŠk LN00A015 Institutional research plan: CEZ:AV0Z1010914 Keywords : quantum cryptography * photon number squeezing Subject RIV: BH - Optics, Masers, Lasers Impact factor: 2.810, year: 2001

  15. Cryptographic Research and NSA: Report of the Public Cryptography Study Group.

    Science.gov (United States)

    Davida, George I.

    1981-01-01

    The Public Cryptography Study Group accepted the claim made by the National Security Agency that some information in some publications concerning cryptology could be inimical to national security, and is allowing the establishment of a voluntary mechanism, on an experimental basis, for NSA to review cryptology manuscripts. (MLW)

  16. Characterization of collective Gaussian attacks and security of coherent-state quantum cryptography.

    Science.gov (United States)

    Pirandola, Stefano; Braunstein, Samuel L; Lloyd, Seth

    2008-11-14

    We provide a simple description of the most general collective Gaussian attack in continuous-variable quantum cryptography. In the scenario of such general attacks, we analyze the asymptotic secret-key rates which are achievable with coherent states, joint measurements of the quadratures and one-way classical communication.

  17. UbiKiMa : Ubiquitous authentication using a smartphone, migrating from passwords to strong cryptography

    NARCIS (Netherlands)

    Everts, M.H.; Hoepman, J.H.; Siljee B.I.J.

    2013-01-01

    Passwords are the only ubiquitous form of authentication currently available on the web. Unfortunately, passwords are insecure. In this paper we therefore propose the use of strong cryptography, using the fact that users increasingly own a smartphone that can perform the required cryptographic

  18. Cryptography from quantum uncertainty in the presence of quantum side information

    NARCIS (Netherlands)

    Bouman, Niek Johannes

    2012-01-01

    The thesis starts with a high-level introduction into cryptography and quantum mechanics. Chapter 2 gives a theoretical foundation by introducing probability theory, information theory, functional analysis, quantum mechanics and quantum information theory. Chapters 3, 4 and 5 are editions of work

  19. Quantum cryptography using coherent states: Randomized encryption and key generation

    Science.gov (United States)

    Corndorf, Eric

    With the advent of the global optical-telecommunications infrastructure, an increasing number of individuals, companies, and agencies communicate information with one another over public networks or physically-insecure private networks. While the majority of the traffic flowing through these networks requires little or no assurance of secrecy, the same cannot be said for certain communications between banks, between government agencies, within the military, and between corporations. In these arenas, the need to specify some level of secrecy in communications is a high priority. While the current approaches to securing sensitive information (namely the public-key-cryptography infrastructure and deterministic private-key ciphers like AES and 3DES) seem to be cryptographically strong based on empirical evidence, there exist no mathematical proofs of secrecy for any widely deployed cryptosystem. As an example, the ubiquitous public-key cryptosystems infer all of their secrecy from the assumption that factoring of the product of two large primes is necessarily time consuming---something which has not, and perhaps cannot, be proven. Since the 1980s, the possibility of using quantum-mechanical features of light as a physical mechanism for satisfying particular cryptographic objectives has been explored. This research has been fueled by the hopes that cryptosystems based on quantum systems may provide provable levels of secrecy which are at least as valid as quantum mechanics itself. Unfortunately, the most widely considered quantum-cryptographic protocols (BB84 and the Ekert protocol) have serious implementation problems. Specifically, they require quantum-mechanical states which are not readily available, and they rely on unproven relations between intrusion-level detection and the information available to an attacker. As a result, the secrecy level provided by these experimental implementations is entirely unspecified. In an effort to provably satisfy the cryptographic

  20. Threshold models in radiation carcinogenesis

    International Nuclear Information System (INIS)

    Hoel, D.G.; Li, P.

    1998-01-01

    Cancer incidence and mortality data from the atomic bomb survivors cohort has been analyzed to allow for the possibility of a threshold dose response. The same dose-response models as used in the original papers were fit to the data. The estimated cancer incidence from the fitted models over-predicted the observed cancer incidence in the lowest exposure group. This is consistent with a threshold or nonlinear dose-response at low-doses. Thresholds were added to the dose-response models and the range of possible thresholds is shown for both solid tumor cancers as well as the different leukemia types. This analysis suggests that the A-bomb cancer incidence data agree more with a threshold or nonlinear dose-response model than a purely linear model although the linear model is statistically equivalent. This observation is not found with the mortality data. For both the incidence data and the mortality data the addition of a threshold term significantly improves the fit to the linear or linear-quadratic dose response for both total leukemias and also for the leukemia subtypes of ALL, AML, and CML
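    The threshold extension of a linear dose-response model compared above can be written as a hockey-stick function: excess risk is zero up to a threshold dose d0 and grows linearly beyond it. A minimal sketch with illustrative parameter values (not the fitted values from the analysis):

```python
# Hockey-stick (threshold) dose-response: excess risk is zero below the
# threshold dose d0 and linear in (dose - d0) above it. Setting d0 = 0
# recovers the purely linear model, which is why the two fits can be
# compared directly on the same data.

def excess_risk(dose, beta=0.5, d0=0.1):
    """beta: slope above threshold; d0: threshold dose (both illustrative)."""
    return beta * max(0.0, dose - d0)

assert excess_risk(0.05) == 0.0                  # below threshold: no excess risk
assert abs(excess_risk(0.3) - 0.1) < 1e-12       # above: beta * (dose - d0)
assert excess_risk(0.3, d0=0.0) == 0.15          # d0 = 0 gives the linear model
```

    Fitting amounts to estimating beta and d0 jointly; the statistical equivalence reported above reflects that the data cannot distinguish a small positive d0 from d0 = 0.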

  1. Theory of threshold phenomena

    International Nuclear Information System (INIS)

    Hategan, Cornel

    2002-01-01

    Theory of Threshold Phenomena in Quantum Scattering is developed in terms of Reduced Scattering Matrix. Relationships of different types of threshold anomalies both to nuclear reaction mechanisms and to nuclear reaction models are established. Magnitude of threshold effect is related to spectroscopic factor of zero-energy neutron state. The Theory of Threshold Phenomena, based on Reduced Scattering Matrix, does establish relationships between different types of threshold effects and nuclear reaction mechanisms: the cusp and non-resonant potential scattering, s-wave threshold anomaly and compound nucleus resonant scattering, p-wave anomaly and quasi-resonant scattering. A threshold anomaly related to resonant or quasi resonant scattering is enhanced provided the neutron threshold state has large spectroscopic amplitude. The Theory contains, as limit cases, Cusp Theories and also results of different nuclear reactions models as Charge Exchange, Weak Coupling, Bohr and Hauser-Feshbach models. (author)

  2. On a two-pass scheme without a faraday mirror for free-space relativistic quantum cryptography

    Energy Technology Data Exchange (ETDEWEB)

    Kravtsov, K. S.; Radchenko, I. V. [Russian Academy of Sciences, Prokhorov General Physics Institute (Russian Federation); Korol'kov, A. V. [Academy of Cryptography (Russian Federation); Kulik, S. P., E-mail: sergei.kulik@gmail.com [Moscow State University (Russian Federation); Molotkov, S. N., E-mail: sergei.molotkov@gmail.com [Academy of Cryptography (Russian Federation)

    2013-05-15

    The stability of destructive interference independent of the input polarization and the state of a quantum communication channel in fiber optic systems used in quantum cryptography plays a principal role in providing the security of communicated keys. A novel optical scheme is proposed that can be used both in relativistic quantum cryptography for communicating keys in open space and for communicating them over fiber optic lines. The scheme ensures stability of destructive interference and admits simple automatic balancing of a fiber interferometer.

  3. Resistive Threshold Logic

    OpenAIRE

    James, A. P.; Francis, L. R. V. J.; Kumar, D.

    2013-01-01

    We report a resistance based threshold logic family useful for mimicking brain like large variable logic functions in VLSI. A universal Boolean logic cell based on an analog resistive divider and threshold logic circuit is presented. The resistive divider is implemented using memristors and provides output voltage as a summation of weighted product of input voltages. The output of resistive divider is converted into a binary value by a threshold operation implemented by CMOS inverter and/or O...

  4. Device-independence for two-party cryptography and position verification

    DEFF Research Database (Denmark)

    Ribeiro, Jeremy; Thinh, Le Phuc; Kaniewski, Jedrzej

    Position-based cryptography aims to use the geographical location of an entity as its only identifying credential. Unfortunately, security of these protocols is not possible against an all-powerful adversary. However, if we impose some realistic physical constraints on the adversary, there exist protocols for which security can be proven, but these so far relied on the knowledge of the quantum operations performed during the protocols. In this work we give device-independent security proofs of two-party cryptography and position verification for memoryless devices under different physical constraints on the adversary. We assess the quality of the devices by observing a Bell violation, and we show that security can be attained for any violation of the Clauser-Horne-Shimony-Holt inequality.

  5. Insecurity of position-based quantum-cryptography protocols against entanglement attacks

    International Nuclear Information System (INIS)

    Lau, Hoi-Kwan; Lo, Hoi-Kwong

    2011-01-01

    Recently, position-based quantum cryptography has been claimed to be unconditionally secure. On the contrary, here we show that the existing proposals for position-based quantum cryptography are, in fact, insecure if entanglement is shared among two adversaries. Specifically, we demonstrate how the adversaries can incorporate ideas of quantum teleportation and quantum secret sharing to compromise the security with certainty. The common flaw to all current protocols is that the Pauli operators always map a codeword to a codeword (up to an irrelevant overall phase). We propose a modified scheme lacking this property in which the same cheating strategy used to undermine the previous protocols can succeed with a rate of at most 85%. We prove the modified protocol is secure when the shared quantum resource between the adversaries is a two- or three-level system.

  6. DNA-Cryptography-Based Obfuscated Systolic Finite Field Multiplier for Secure Cryptosystem in Smart Grid

    Science.gov (United States)

    Chen, Shaobo; Chen, Pingxiuqi; Shao, Qiliang; Basha Shaik, Nazeem; Xie, Jiafeng

    2017-05-01

    Elliptic curve cryptography (ECC) provides much stronger security per bit than traditional cryptosystems, and hence plays an ideal role in secure communication in smart grid. On the other hand, secure implementation of finite field multiplication over GF(2^m) is considered the bottleneck of ECC. In this paper, we present a novel obfuscation strategy for secure implementation of a systolic field multiplier for ECC in smart grid. First, we propose, for the first time, a novel obfuscation technique to derive an obfuscated systolic finite field multiplier for ECC implementation. Then, we employ the DNA cryptography coding strategy to obfuscate the field multiplier further. Finally, we obtain the area-time-power complexity of the proposed field multiplier to confirm the efficiency of the proposed design. The proposed design is highly obfuscated with low overhead, suitable for secure cryptosystems in smart grid.

  7. Optical cryptography topology based on a three-dimensional particle-like distribution and diffractive imaging.

    Science.gov (United States)

    Chen, Wen; Chen, Xudong

    2011-05-09

    In recent years, coherent diffractive imaging has been considered as a promising alternative for information retrieval instead of conventional interference methods. Coherent diffractive imaging using the X-ray light source has opened up a new research perspective for the measurement of non-crystalline and biological specimens, and can achieve unprecedentedly high resolutions. In this paper, we show how a three-dimensional (3D) particle-like distribution and coherent diffractive imaging can be applied for a study of optical cryptography. An optical multiple-random-phase-mask encoding approach is used, and the plaintext is considered as a series of particles distributed in a 3D space. A topology concept is also introduced into the proposed optical cryptosystem. During image decryption, a retrieval algorithm is developed to extract the plaintext from the ciphertexts. In addition, security and advantages of the proposed optical cryptography topology are also analyzed. © 2011 Optical Society of America

  8. Analysis of Multiple Data Hiding Combined Coloured Visual Cryptography and LSB

    Science.gov (United States)

    Maulana, Halim; Rahman Syahputra, Edy

    2017-12-01

    Currently, the level of data security is becoming a major factor in data transfer. As we know, whenever data are sent through any medium, the risk of those data being hacked remains. Techniques for securing data such as steganography and cryptography are often used as solutions, but their protection does not last once weaknesses in the underlying algorithms are discovered, so security can no longer be assured. A variety of new algorithms is therefore needed so that data security can be guaranteed. This study tries to combine two algorithms, steganography and visual cryptography. The experiments attempt to secure two types of data, image data and text data, where both are regarded as a message, so that the receiver must obtain both types of data to recover the correct information.

  9. Threshold Signature Schemes Application

    Directory of Open Access Journals (Sweden)

    Anastasiya Victorovna Beresneva

    2015-10-01

    Full Text Available This work is devoted to an investigation of threshold signature schemes. The threshold signature schemes were systematized, and cryptographic constructions based on the Lagrange interpolation polynomial, elliptic curves and bilinear pairings were examined. Different methods of generation and verification of threshold signatures were explored, and the practical applicability of threshold schemes to mobile agents, Internet banking and e-currency was shown. Topics for further investigation are given, which could reduce the level of counterfeit electronic documents signed by a group of users.
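
    The Lagrange-polynomial constructions mentioned in the abstract rest on Shamir-style secret sharing: a degree t-1 polynomial hides the key, and any t shares reconstruct it by interpolation at zero. A minimal sketch, with an illustrative field modulus not taken from any scheme in the article:

```python
# Toy (t, n) Shamir secret sharing over a prime field -- the Lagrange
# interpolation building block behind many threshold signature schemes.
import random

P = 2**127 - 1  # a Mersenne prime used here as the field modulus (illustrative)

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for c in reversed(coeffs):  # Horner evaluation of the polynomial at x
            y = (y * x + c) % P
        shares.append((x, y))
    return shares

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret
```

    With t = 3 and n = 5, any three of the five shares yield the secret while two reveal nothing about it; threshold signature schemes apply the same interpolation to partial signatures instead of the raw key.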

  10. Voting on Thresholds for Public Goods

    DEFF Research Database (Denmark)

    Rauchdobler, Julian; Sausgruber, Rupert; Tyran, Jean-Robert

    2010-01-01

    Introducing a threshold in the sense of a minimal project size transforms a public-good game with an inefficient equilibrium into a coordination game with a set of Pareto-superior equilibria. Thresholds may therefore improve efficiency in the voluntary provision of public goods. In our one-shot e...

  11. Perspectives on Entangled Nuclear Particle Pairs Generation and Manipulation in Quantum Communication and Cryptography Systems

    OpenAIRE

    Octavian Dănilă; Paul E. Sterian; Andreea Rodica Sterian

    2012-01-01

    Entanglement between two quantum elements is a phenomenon which presents a broad application spectrum, being used largely in quantum cryptography schemes and in physical characterisation of the universe. Commonly known entangled states have been obtained with photons and electrons, but other quantum elements such as quarks, leptons, and neutrinos have shown their informational potential. In this paper, we present the perspective of exploiting the phenomenon of entanglement that appears in nuc...

  12. SURVEY ON CLOUD SECURITY BY DATA ENCRYPTION USING ELLIPTIC CURVE CRYPTOGRAPHY

    OpenAIRE

    Akanksha Tomar*, Jamwant Kumbhre

    2016-01-01

    Cloud computing is one of the latest technology trends in the IT trade for the business area. Cloud computing security has become a demanding topic in information technology and computer science research programs. Cloud computing is a conceptual service-based technology which is widely used by many companies these days. An Elliptic Curve Cryptography based algorithm provides highly secure communication, data integrity and authentication, along with the non-repudiation communicat...

  13. Secure Programming Cookbook for C and C++ Recipes for Cryptography, Authentication, Input Validation & More

    CERN Document Server

    Viega, John

    2009-01-01

    Secure Programming Cookbook for C and C++ is an important new resource for developers serious about writing secure code for Unix® (including Linux®) and Windows® environments. This essential code companion covers a wide range of topics, including safe initialization, access control, input validation, symmetric and public key cryptography, cryptographic hashes and MACs, authentication and key exchange, PKI, random numbers, and anti-tampering.

  14. Entropy-as-a-Service: Unlocking the Full Potential of Cryptography.

    Science.gov (United States)

    Vassilev, Apostol; Staples, Robert

    2016-09-01

    Securing the Internet requires strong cryptography, which depends on the availability of good entropy for generating unpredictable keys and accurate clocks. Attacks abusing weak keys or old inputs portend challenges for the Internet. EaaS is a novel architecture providing entropy and timestamps from a decentralized root of trust, scaling gracefully across diverse geopolitical locales and remaining trustworthy unless much of the collective is compromised.

  15. Device-independent two-party cryptography secure against sequential attacks

    International Nuclear Information System (INIS)

    Kaniewski, Jędrzej; Wehner, Stephanie

    2016-01-01

    The goal of two-party cryptography is to enable two parties, Alice and Bob, to solve common tasks without the need for mutual trust. Examples of such tasks are private access to a database, and secure identification. Quantum communication enables security for all of these problems in the noisy-storage model by sending more signals than the adversary can store in a certain time frame. Here, we initiate the study of device-independent (DI) protocols for two-party cryptography in the noisy-storage model. Specifically, we present a relatively easy to implement protocol for a cryptographic building block known as weak string erasure and prove its security even if the devices used in the protocol are prepared by the dishonest party. DI two-party cryptography is made challenging by the fact that Alice and Bob do not trust each other, which requires new techniques to establish security. We fully analyse the case of memoryless devices (for which sequential attacks are optimal) and the case of sequential attacks for arbitrary devices. The key ingredient of the proof, which might be of independent interest, is an explicit (and tight) relation between the violation of the Clauser–Horne–Shimony–Holt inequality observed by Alice and Bob and uncertainty generated by Alice against Bob who is forced to measure his system before finding out Alice’s setting (guessing with postmeasurement information). In particular, we show that security is possible for arbitrarily small violation. (paper)
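
    The CHSH violation at the heart of this security proof can be checked numerically. A minimal sketch, assuming idealized quantum correlations E(a, b) = cos(a - b) for a maximally entangled pair (an illustrative model, not the paper's device description); the canonical settings exceed the classical bound |S| <= 2 up to Tsirelson's bound 2*sqrt(2):

```python
# Numeric check of the CHSH quantity: any S above 2 certifies that the
# devices behave nonclassically, which is what the protocol exploits.
import math

def E(a, b):
    # correlation of outcomes for measurement angles a, b (radians),
    # under the idealized quantum model assumed in this sketch
    return math.cos(a - b)

def chsh(a, a2, b, b2):
    return E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)

# canonical settings achieving the maximal quantum value
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, -math.pi / 4
S = chsh(a, a2, b, b2)  # equals 2*sqrt(2), above the classical bound 2
```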

  16. Design of an Elliptic Curve Cryptography processor for RFID tag chips.

    Science.gov (United States)

    Liu, Zilong; Liu, Dongsheng; Zou, Xuecheng; Lin, Hui; Cheng, Jian

    2014-09-26

    Radio Frequency Identification (RFID) is an important technique for wireless sensor networks and the Internet of Things. Recently, considerable research has been performed in the combination of public key cryptography and RFID. In this paper, an efficient architecture of Elliptic Curve Cryptography (ECC) Processor for RFID tag chip is presented. We adopt a new inversion algorithm which requires fewer registers to store variables than the traditional schemes. A new method for coordinate swapping is proposed, which can reduce the complexity of the controller and shorten the time of iterative calculation effectively. A modified circular shift register architecture is presented in this paper, which is an effective way to reduce the area of register files. Clock gating and asynchronous counter are exploited to reduce the power consumption. The simulation and synthesis results show that the time needed for one elliptic curve scalar point multiplication over GF(2^163) is 176.7 K clock cycles and the gate area is 13.8 K with UMC 0.13 μm Complementary Metal Oxide Semiconductor (CMOS) technology. Moreover, the low power and low cost consumption make the Elliptic Curve Cryptography Processor (ECP) a prospective candidate for application in the RFID tag chip.
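
    The scalar point multiplication that dominates the processor's cycle count is the classic double-and-add loop. A minimal sketch over a tiny prime-field curve y^2 = x^3 + 2x + 3 mod 97 (an illustrative toy, not the GF(2^163) curve the chip implements):

```python
# Left-to-right double-and-add scalar multiplication on a toy curve.
# Curve parameters and base point are illustrative only.
P_MOD, A, B = 97, 2, 3
O = None  # point at infinity (group identity)

def add(p, q):
    """Affine point addition on y^2 = x^3 + A*x + B over GF(P_MOD)."""
    if p is O: return q
    if q is O: return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return O                      # p == -q
    if p == q:                        # doubling
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:                             # general addition
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def mul(k, p):
    """Scalar multiplication k*p in O(log k) group operations."""
    acc = O
    for bit in bin(k)[2:]:
        acc = add(acc, acc)           # double for every bit
        if bit == "1":
            acc = add(acc, p)         # add when the bit is set
    return acc
```

    The hardware contributions in the abstract (inversion with fewer registers, coordinate swapping, circular shift registers) all serve to make exactly this loop cheaper per iteration.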

  17. Design of an Elliptic Curve Cryptography Processor for RFID Tag Chips

    Directory of Open Access Journals (Sweden)

    Zilong Liu

    2014-09-01

    Full Text Available Radio Frequency Identification (RFID) is an important technique for wireless sensor networks and the Internet of Things. Recently, considerable research has been performed in the combination of public key cryptography and RFID. In this paper, an efficient architecture of Elliptic Curve Cryptography (ECC) Processor for RFID tag chip is presented. We adopt a new inversion algorithm which requires fewer registers to store variables than the traditional schemes. A new method for coordinate swapping is proposed, which can reduce the complexity of the controller and shorten the time of iterative calculation effectively. A modified circular shift register architecture is presented in this paper, which is an effective way to reduce the area of register files. Clock gating and asynchronous counter are exploited to reduce the power consumption. The simulation and synthesis results show that the time needed for one elliptic curve scalar point multiplication over GF(2^163) is 176.7 K clock cycles and the gate area is 13.8 K with UMC 0.13 μm Complementary Metal Oxide Semiconductor (CMOS) technology. Moreover, the low power and low cost consumption make the Elliptic Curve Cryptography Processor (ECP) a prospective candidate for application in the RFID tag chip.

  18. A copyright protection scheme for digital images based on shuffled singular value decomposition and visual cryptography.

    Science.gov (United States)

    Devi, B Pushpa; Singh, Kh Manglem; Roy, Sudipta

    2016-01-01

    This paper proposes a new watermarking algorithm based on the shuffled singular value decomposition and the visual cryptography for copyright protection of digital images. It generates the ownership and identification shares of the image based on visual cryptography. It decomposes the image into low and high frequency sub-bands. The low frequency sub-band is further divided into blocks of same size after shuffling it and then the singular value decomposition is applied to each randomly selected block. Shares are generated by comparing one of the elements in the first column of the left orthogonal matrix with its corresponding element in the right orthogonal matrix of the singular value decomposition of the block of the low frequency sub-band. The experimental results show that the proposed scheme clearly verifies the copyright of the digital images, and is robust to withstand several image processing attacks. Comparison with the other related visual cryptography-based algorithms reveals that the proposed method gives better performance. The proposed method is especially resilient against the rotation attack.
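
    The share-generation idea underlying the scheme can be illustrated with the simplest (2, 2) visual cryptography construction. The paper's actual shares come from comparing elements of the SVD orthogonal matrices; the sketch below instead uses the classic random-pattern construction in its XOR-recoverable form, as an assumption-labeled stand-in:

```python
# Toy XOR-based (2, 2) visual cryptography on a binary image (a flat
# list of 0/1 pixels). Neither share alone carries any information;
# combining both recovers the secret exactly.
import random

def make_shares(secret_bits):
    s1 = [random.randint(0, 1) for _ in secret_bits]   # pure random share
    s2 = [r ^ b for r, b in zip(s1, secret_bits)]      # complement where secret is 1
    return s1, s2

def stack(s1, s2):
    # pixelwise combination of the two transparencies recovers the secret
    return [a ^ b for a, b in zip(s1, s2)]
```

    In the watermarking setting, one share (the ownership share) is registered with an authority and the other (the identification share) is derived from the suspect image; only stacking both reveals the watermark.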

  19. DNA Cryptography and Deep Learning using Genetic Algorithm with NW algorithm for Key Generation.

    Science.gov (United States)

    Kalsi, Shruti; Kaur, Harleen; Chang, Victor

    2017-12-05

    Cryptography is not only the science of applying complex mathematics and logic to design strong methods to hide data, called encryption, but also of retrieving the original data back, called decryption. The purpose of cryptography is to transmit a message between a sender and a receiver such that an eavesdropper is unable to comprehend it. To accomplish this, we need not only a strong algorithm, but a strong key and a strong concept for the encryption and decryption process. We have introduced the concept of DNA Deep Learning Cryptography, defined as a technique of concealing data in terms of a DNA sequence and deep learning. In the cryptographic technique, each letter of the alphabet is converted into a different combination of the four bases, namely Adenine (A), Cytosine (C), Guanine (G) and Thymine (T), which make up human deoxyribonucleic acid (DNA). Actual implementations with DNA do not exceed laboratory level and are expensive. To bring DNA computing to a digital level, easy and effective algorithms are proposed in this paper. In the proposed work we introduce, firstly, a method and its implementation for key generation based on the theory of natural selection, using a Genetic Algorithm with the Needleman-Wunsch (NW) algorithm, and secondly, a method for the implementation of encryption and decryption based on DNA computing using the biological operations Transcription, Translation, DNA Sequencing and Deep Learning.
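
    The digital DNA-coding step the abstract describes amounts to writing each byte as four bases, two bits per base. The particular bit-to-base table below (00->A, 01->C, 10->G, 11->T) is a common convention, not necessarily the one these authors use:

```python
# Reversible byte <-> DNA-sequence coding, two bits per base.
B2D = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
D2B = {v: k for k, v in B2D.items()}

def to_dna(data: bytes) -> str:
    """Encode each byte as four bases, most significant bit pair first."""
    return "".join(B2D[(byte >> s) & 0b11] for byte in data for s in (6, 4, 2, 0))

def from_dna(seq: str) -> bytes:
    """Invert to_dna: pack every four bases back into one byte."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for base in seq[i:i + 4]:
            byte = (byte << 2) | D2B[base]
        out.append(byte)
    return bytes(out)
```

    This coding alone is not encryption; in the paper's scheme it is the representation on which the GA/NW key generation and the biologically inspired operations act.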

  20. Improving the understanding of sleep apnea characterization using Recurrence Quantification Analysis by defining overall acceptable values for the dimensionality of the system, the delay, and the distance threshold.

    Science.gov (United States)

    Martín-González, Sofía; Navarro-Mesa, Juan L; Juliá-Serdá, Gabriel; Ramírez-Ávila, G Marcelo; Ravelo-García, Antonio G

    2018-01-01

    Our contribution focuses on the characterization of sleep apnea from a cardiac rate point of view, using Recurrence Quantification Analysis (RQA), based on a Heart Rate Variability (HRV) feature selection process. Three parameters are crucial in RQA: those related to the embedding process (dimension and delay) and the threshold distance. There are no overall accepted parameters for the study of HRV using RQA in sleep apnea. We focus on finding an overall acceptable combination, sweeping a range of values for each of them simultaneously. Together with the commonly used RQA measures, we include features related to recurrence times, and features originating in the complex network theory. To the best of our knowledge, no author has used them all for sleep apnea previously. The best performing feature subset is entered into a Linear Discriminant classifier. The best results in the "Apnea-ECG Physionet database" and the "HuGCDN2014 database" are, according to the area under the receiver operating characteristic curve, 0.93 (Accuracy: 86.33%) and 0.86 (Accuracy: 84.18%), respectively. Our system outperforms, using a relatively small set of features, previously existing studies in the context of sleep apnea. We conclude that working with dimensions around 7-8 and delays about 4-5, and using for the threshold distance the Fixed Amount of Nearest Neighbours (FAN) method with 5% of neighbours, yield the best results. Therefore, we would recommend these reference values for future work when applying RQA to the analysis of HRV in sleep apnea. We also conclude that, together with the commonly used vertical and diagonal RQA measures, there are newly used features that contribute valuable information for apnea minutes discrimination. Therefore, they are especially interesting for characterization purposes. Using two different databases supports that the conclusions reached are potentially generalizable, and are not limited by database variability.

  1. Threshold concepts in prosthetics.

    Science.gov (United States)

    Hill, Sophie

    2017-12-01

    Curriculum documents identify key concepts within learning prosthetics. Threshold concepts provide an alternative way of viewing the curriculum, focussing on the ways of thinking and practicing within prosthetics. Threshold concepts can be described as an opening to a different way of viewing a concept. This article forms part of a larger study exploring what students and staff experience as difficult in learning about prosthetics. To explore possible threshold concepts within prosthetics. Qualitative, interpretative phenomenological analysis. Data from 18 students and 8 staff at two universities with undergraduate prosthetics and orthotics programmes were generated through interviews and questionnaires. The data were analysed using an interpretative phenomenological analysis approach. Three possible threshold concepts arose from the data: 'how we walk', 'learning to talk' and 'considering the person'. Three potential threshold concepts in prosthetics are suggested with possible implications for prosthetics education. These possible threshold concepts involve changes in both conceptual and ontological knowledge, integrating into the persona of the individual. This integration occurs through the development of memories associated with procedural concepts that combine with disciplinary concepts. Considering the prosthetics curriculum through the lens of threshold concepts enables a focus on how students learn to become prosthetists. Clinical relevance This study provides new insights into how prosthetists learn. This has implications for curriculum design in prosthetics education.

  2. Inclusion of Exercise Intensities Above the Lactate Threshold in VO2/Running Speed Regression Does not Improve the Precision of Accumulated Oxygen Deficit Estimation in Endurance-Trained Runners.

    Science.gov (United States)

    Reis, Victor M; Silva, António J; Ascensão, António; Duarte, José A

    2005-12-01

    The present study intended to verify if the inclusion of intensities above the lactate threshold (LT) in the VO2/running speed regression (RSR) affects the estimation error of accumulated oxygen deficit (AOD) during treadmill running performed by endurance-trained subjects. Fourteen male endurance-trained runners performed a submaximal treadmill running test followed by an exhaustive supramaximal test 48 h later. The total energy demand (TED) and the AOD during the supramaximal test were calculated from the RSR established on first testing. For those purposes two regressions were used: a complete regression (CR) including all available submaximal VO2 measurements, and a sub-threshold regression (STR) including solely the VO2 values measured during exercise intensities below the LT. TED mean values obtained with CR and STR were not significantly different under the two conditions of analysis (177.71 ± 5.99 and 174.03 ± 6.53 ml·kg(-1), respectively). Also the mean values of AOD obtained with CR and STR did not differ under the two conditions (49.75 ± 8.38 and 45.89 ± 9.79 ml·kg(-1), respectively). Moreover, the precision of those estimations was also similar under the two procedures. The mean error for TED estimation was 3.27 ± 1.58 and 3.41 ± 1.85 ml·kg(-1) (for CR and STR, respectively) and the mean error for AOD estimation was 5.03 ± 0.32 and 5.14 ± 0.35 ml·kg(-1) (for CR and STR, respectively). The results indicated that the inclusion of exercise intensities above the LT in the RSR does not improve the precision of the AOD estimation in endurance-trained runners. However, the use of STR may induce an underestimation of AOD compared to the use of CR. Key points:
    - It has been suggested that the inclusion of exercise intensities above the lactate threshold in the VO2/power regression can significantly affect the estimation of the energy cost and, thus, the estimation of the AOD.
    - However, data on the precision of those AOD measurements are rarely provided.
    - We have

  3. Implementation Cryptography Data Encryption Standard (DES) and Triple Data Encryption Standard (3DES) Method in Communication System Based Near Field Communication (NFC)

    Science.gov (United States)

    Ratnadewi; Pramono Adhie, Roy; Hutama, Yonatan; Saleh Ahmar, A.; Setiawan, M. I.

    2018-01-01

    Cryptography is a method used to create secure communication by manipulating messages as they are sent, so that only the intended party can know their content. Two of the most commonly used cryptographic methods for protecting sent messages, especially text, are the DES and 3DES methods. This research explains the DES and 3DES cryptography methods and their use for securing data stored in smart cards that work in an NFC-based communication system. It covers how the DES and 3DES methods protect data, and the software engineering, through an application written in C++, used to realize and test their performance when writing encrypted data to smart cards and reading and decrypting data from them. The execution time of writing data to and reading data from a smart card is faster with the DES cryptography method than with 3DES.
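
    The speed gap the study measures follows directly from 3DES's structure: it runs the DES block operation three times in an encrypt-decrypt-encrypt (EDE) chain. A structural sketch, with a stand-in toy block operation (XOR with the key) replacing the real DES round function so the composition is self-contained:

```python
# EDE keying structure of 3DES. `toy_enc`/`toy_dec` are placeholders for
# DES encryption/decryption of one block -- NOT a real cipher.
def toy_enc(block: int, key: int) -> int:
    return block ^ key  # stand-in for DES encryption

def toy_dec(block: int, key: int) -> int:
    return block ^ key  # stand-in for DES decryption

def ede_encrypt(block, k1, k2, k3):
    # encrypt with k1, decrypt with k2, encrypt with k3: three passes,
    # hence roughly triple the per-block cost of single DES
    return toy_enc(toy_dec(toy_enc(block, k1), k2), k3)

def ede_decrypt(block, k1, k2, k3):
    return toy_dec(toy_enc(toy_dec(block, k3), k2), k1)
```

    The EDE order (rather than three encryptions) was chosen so that setting all three keys equal degenerates to single DES, giving backward compatibility with legacy hardware.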

  4. Regional Seismic Threshold Monitoring

    National Research Council Canada - National Science Library

    Kvaerna, Tormod

    2006-01-01

    ... model to be used for predicting the travel times of regional phases. We have applied these attenuation relations to develop and assess a regional threshold monitoring scheme for selected subregions of the European Arctic...

  5. Geospatial cryptography: enabling researchers to access private, spatially referenced, human subjects data for cancer control and prevention.

    Science.gov (United States)

    Jacquez, Geoffrey M; Essex, Aleksander; Curtis, Andrew; Kohler, Betsy; Sherman, Recinda; Emam, Khaled El; Shi, Chen; Kaufmann, Andy; Beale, Linda; Cusick, Thomas; Goldberg, Daniel; Goovaerts, Pierre

    2017-07-01

    As the volume, accuracy and precision of digital geographic information have increased, concerns regarding individual privacy and confidentiality have come to the forefront. Not only do these challenge a basic tenet underlying the advancement of science by posing substantial obstacles to the sharing of data to validate research results, but they are obstacles to conducting certain research projects in the first place. Geospatial cryptography involves the specification, design, implementation and application of cryptographic techniques to address privacy, confidentiality and security concerns for geographically referenced data. This article defines geospatial cryptography and demonstrates its application in cancer control and surveillance. Four use cases are considered: (1) national-level de-duplication among state or province-based cancer registries; (2) sharing of confidential data across cancer registries to support case aggregation across administrative geographies; (3) secure data linkage; and (4) cancer cluster investigation and surveillance. A secure multi-party system for geospatial cryptography is developed. Solutions under geospatial cryptography are presented and computation time is calculated. As services provided by cancer registries to the research community, de-duplication, case aggregation across administrative geographies and secure data linkage are often time-consuming and in some instances precluded by confidentiality and security concerns. Geospatial cryptography provides secure solutions that hold significant promise for addressing these concerns and for accelerating the pace of research with human subjects data residing in our nation's cancer registries. Pursuit of the research directions posed herein conceivably would lead to a geospatially encrypted geographic information system (GEGIS) designed specifically to promote the sharing and spatial analysis of confidential data. 

  6. Hydrodynamics of sediment threshold

    Science.gov (United States)

    Ali, Sk Zeeshan; Dey, Subhasish

    2016-07-01

    A novel hydrodynamic model for the threshold of cohesionless sediment particle motion under a steady unidirectional streamflow is presented. The hydrodynamic forces (drag and lift) acting on a solitary sediment particle resting over a closely packed bed formed by the identical sediment particles are the primary motivating forces. The drag force comprises the form drag and form-induced drag. The lift force includes the Saffman lift, Magnus lift, centrifugal lift, and turbulent lift. The points of action of the force system are appropriately obtained, for the first time, from the basics of micro-mechanics. The sediment threshold is envisioned as the rolling mode, which is the plausible mode to initiate a particle motion on the bed. The moment balance of the force system on the solitary particle about the pivoting point of rolling yields the governing equation. The conditions of sediment threshold under the hydraulically smooth, transitional, and rough flow regimes are examined. The effects of velocity fluctuations are addressed by applying the statistical theory of turbulence. This study shows that for a hindrance coefficient of 0.3, the threshold curve (threshold Shields parameter versus shear Reynolds number) has an excellent agreement with the experimental data of uniform sediments. However, most of the experimental data are bounded by the upper and lower limiting threshold curves, corresponding to the hindrance coefficients of 0.2 and 0.4, respectively. The threshold curve of this study is compared with those of previous researchers. The present model also agrees satisfactorily with the experimental data of nonuniform sediments.

  7. A Robust Threshold for Iterative Channel Estimation in OFDM Systems

    Directory of Open Access Journals (Sweden)

    A. Kalaycioglu

    2010-04-01

    Full Text Available A novel threshold computation method for pilot symbol assisted iterative channel estimation in OFDM systems is considered. As the bits are transmitted in packets, the proposed technique is based on calculating a particular threshold for each data packet in order to select the reliable decoder output symbols to improve the channel estimation performance. Iteratively, additional pilot symbols are established according to the threshold and the channel is re-estimated with the new pilots inserted to the known channel estimation pilot set. The proposed threshold calculation method for selecting additional pilots performs better than non-iterative channel estimation, no threshold and fixed threshold techniques in poor HF channel simulations.
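
    The per-packet selection step can be sketched as follows. The specific rule here (thresholding the absolute soft decoder outputs at their packet mean) is a hypothetical illustration of the idea, not the paper's formula:

```python
# Select "reliable" decoder output symbols to promote to extra pilots,
# using a threshold recomputed for every packet (illustrative rule only).
def select_extra_pilots(llrs):
    """Return indices of symbols whose reliability exceeds the packet threshold."""
    mags = [abs(x) for x in llrs]           # reliability = |soft output|
    threshold = sum(mags) / len(mags)       # packet-specific threshold
    return [i for i, m in enumerate(mags) if m > threshold]
```

    The selected symbols are then appended to the known pilot set and the channel is re-estimated, iterating until the estimate stabilizes.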

  8. A secure RFID mutual authentication protocol for healthcare environments using elliptic curve cryptography.

    Science.gov (United States)

    Jin, Chunhua; Xu, Chunxiang; Zhang, Xiaojun; Zhao, Jining

    2015-03-01

    Radio Frequency Identification (RFID) is an automatic identification technology which can be widely used in healthcare environments to locate and track staff, equipment and patients. However, potential security and privacy problems in RFID systems remain a challenge. In this paper, we design a mutual authentication protocol for RFID based on elliptic curve cryptography (ECC). We use a pre-computing method in the tag's communication to achieve better efficiency. In terms of security, our protocol achieves confidentiality, unforgeability, mutual authentication, tag anonymity, availability and forward security. Our protocol also overcomes the weaknesses of existing protocols. Therefore, our protocol is well suited for healthcare environments.

  9. Information hiding based on double random-phase encoding and public-key cryptography.

    Science.gov (United States)

    Sheng, Yuan; Xin, Zhou; Alam, Mohammed S; Xi, Lu; Xiao-Feng, Li

    2009-03-02

    A novel information hiding method based on double random-phase encoding (DRPE) and the Rivest-Shamir-Adleman (RSA) public-key cryptosystem is proposed. In the proposed technique, the inherent diffusion property of DRPE is cleverly utilized to compensate for the diffusion insufficiency of RSA public-key cryptography, while the RSA cryptosystem is utilized for simultaneous transmission of the cipher text and the two phase-masks, which is not possible under the DRPE technique alone. This technique combines the complementary advantages of the DRPE and RSA encryption techniques and brings security and convenience for efficient information transmission. Extensive numerical simulation results are presented to verify the performance of the proposed technique.

  10. Optical double-image cryptography based on diffractive imaging with a laterally-translated phase grating.

    Science.gov (United States)

    Chen, Wen; Chen, Xudong; Sheppard, Colin J R

    2011-10-10

    In this paper, we propose a method using structured-illumination-based diffractive imaging with a laterally-translated phase grating for optical double-image cryptography. An optical cryptosystem is designed, and multiple random phase-only masks are placed in the optical path. When a phase grating is laterally translated just before the plaintexts, several diffraction intensity patterns (i.e., ciphertexts) can be correspondingly obtained. During image decryption, an iterative retrieval algorithm is developed to extract plaintexts from the ciphertexts. In addition, security and advantages of the proposed method are analyzed. Feasibility and effectiveness of the proposed method are demonstrated by numerical simulation results. © 2011 Optical Society of America

  11. Field test of a practical secure communication network with decoy-state quantum cryptography.

    Science.gov (United States)

    Chen, Teng-Yun; Liang, Hao; Liu, Yang; Cai, Wen-Qi; Ju, Lei; Liu, Wei-Yue; Wang, Jian; Yin, Hao; Chen, Kai; Chen, Zeng-Bing; Peng, Cheng-Zhi; Pan, Jian-Wei

    2009-04-13

    We present a secure network communication system that operated with decoy-state quantum cryptography in a real-world application scenario. The full key exchange and application protocols were performed in real time among three nodes, in which two adjacent nodes were connected by approximate 20 km of commercial telecom optical fiber. The generated quantum keys were immediately employed and demonstrated for communication applications, including unbreakable real-time voice telephone between any two of the three communication nodes, or a broadcast from one node to the other two nodes by using one-time pad encryption.
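    The one-time pad used here is the cipher whose unconditional security matches the guarantees of the quantum keys it consumes. A minimal sketch in Python; the message and the use of the `secrets` module as a stand-in for a quantum key source are illustrative assumptions, not details of the deployed system:

```python
import secrets

def otp_encrypt(data: bytes, key: bytes) -> bytes:
    # One-time pad: XOR each data byte with a key byte. Security requires a
    # truly random key at least as long as the message, used exactly once.
    assert len(key) >= len(data)
    return bytes(d ^ k for d, k in zip(data, key))

message = b"hello node B"
key = secrets.token_bytes(len(message))  # stand-in for a quantum-generated key
cipher = otp_encrypt(message, key)
recovered = otp_encrypt(cipher, key)     # XOR is its own inverse
print(recovered == message)  # True
```

    Because decryption is the same XOR, each key segment must be discarded after one use; key reuse destroys the unconditional security.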

  12. Implementation of Pollard Rho attack on elliptic curve cryptography over binary fields

    Science.gov (United States)

    Wienardo, Yuliawan, Fajar; Muchtadi-Alamsyah, Intan; Rahardjo, Budi

    2015-09-01

    Elliptic Curve Cryptography (ECC) is a public key cryptosystem whose security rests on a discrete logarithm problem, the Elliptic Curve Discrete Logarithm Problem (ECDLP). John M. Pollard proposed an algorithm for the discrete logarithm problem based on the Monte Carlo method, known as the Pollard Rho algorithm. The best currently known generic attack on ECC is the Pollard Rho algorithm. In this research we implement a modified Pollard Rho algorithm on ECC over GF(2^41). As a result, the runtime of the Pollard Rho algorithm increases exponentially with the ECC key length. This work also presents the estimated runtime of the Pollard Rho attack on ECC over longer bit lengths.
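    The random-walk idea behind the attack can be sketched in Python. For brevity the walk below runs in the multiplicative group modulo a small prime rather than on an elliptic curve over GF(2^41); the parameters `g`, `h`, `p`, `n` are toy values chosen for illustration, and on a curve the same walk uses point addition in place of modular multiplication:

```python
from math import gcd
import random

def pollard_rho_dlog(g, h, p, n):
    """Solve g^x = h (mod p), where g has order n, with Pollard's rho."""
    while True:
        a, b = random.randrange(n), random.randrange(n)
        x = pow(g, a, p) * pow(h, b, p) % p  # start point x = g^a * h^b

        def step(x, a, b):
            # a three-way partition of the group defines a pseudo-random walk
            if x % 3 == 0:
                return x * x % p, 2 * a % n, 2 * b % n
            if x % 3 == 1:
                return x * g % p, (a + 1) % n, b
            return x * h % p, a, (b + 1) % n

        # Floyd cycle detection: the hare moves twice per tortoise step
        slow = step(x, a, b)
        fast = step(*step(x, a, b))
        while slow[0] != fast[0]:
            slow = step(*slow)
            fast = step(*step(*fast))

        # collision g^a1 h^b1 = g^a2 h^b2  =>  (b1 - b2)*x = a2 - a1 (mod n)
        (_, a1, b1), (_, a2, b2) = slow, fast
        m, r = (b1 - b2) % n, (a2 - a1) % n
        d = gcd(m, n)
        if r % d != 0:
            continue  # unlucky collision, restart the walk
        x0 = (r // d) * pow(m // d, -1, n // d) % (n // d)
        for k in range(d):  # up to d candidate solutions; verify each
            cand = (x0 + k * (n // d)) % n
            if pow(g, cand, p) == h:
                return cand

random.seed(7)
x = pollard_rho_dlog(g=2, h=5, p=1019, n=1018)
print(x)  # 10, since 2**10 = 1024 is congruent to 5 (mod 1019)
```

    The expected number of walk steps grows like the square root of the group order, which is why the runtime of the attack grows exponentially with the key length in bits.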

  13. Disorder generated by interacting neural networks: application to econophysics and cryptography

    International Nuclear Information System (INIS)

    Kinzel, Wolfgang; Kanter, Ido

    2003-01-01

    When neural networks are trained on their own output signals they generate disordered time series. In particular, when two neural networks are trained on their mutual output they can synchronize; they relax to a time-dependent state with identical synaptic weights. Two applications of this phenomenon are discussed for (a) econophysics and (b) cryptography. (a) When agents competing in a closed market (minority game) are using neural networks to make their decisions, the total system relaxes to a state of good performance. (b) Two partners communicating over a public channel can find a common secret key
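    The key-exchange idea in (b) can be sketched with tree parity machines, the architecture commonly used for neural key exchange. The sizes `K`, `N`, `L` below are illustrative toy parameters, and the Hebbian update rule is one of several rules studied in this setting:

```python
import numpy as np

K, N, L = 3, 10, 3  # hidden units, inputs per unit, weight bound (toy sizes)
rng = np.random.default_rng(0)

def tpm_output(w, x):
    # hidden-unit outputs are the signs of the local fields (sign(0) -> -1);
    # the single public output bit is their product
    sigma = np.where((w * x).sum(axis=1) > 0, 1, -1)
    return sigma, int(sigma.prod())

def hebbian_update(w, x, sigma, tau):
    # adjust only hidden units that agree with the output; clip to [-L, L]
    for k in range(K):
        if sigma[k] == tau:
            w[k] = np.clip(w[k] + tau * x[k], -L, L)

# two parties start from independent secret weights
wA = rng.integers(-L, L + 1, size=(K, N))
wB = rng.integers(-L, L + 1, size=(K, N))

steps = 0
while not np.array_equal(wA, wB):
    x = rng.choice([-1, 1], size=(K, N))  # public random input
    sA, tauA = tpm_output(wA, x)
    sB, tauB = tpm_output(wB, x)
    if tauA == tauB:  # only the one-bit outputs are exchanged publicly
        hebbian_update(wA, x, sA, tauA)
        hebbian_update(wB, x, sB, tauB)
    steps += 1

print("synchronized after", steps, "inputs")  # identical weights = shared key
```

    An eavesdropper who sees only the inputs and the exchanged output bits learns the weights much more slowly than the mutually trained partners, which is what makes the synchronized weights usable as a common secret key.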

  14. Perspectives on Entangled Nuclear Particle Pairs Generation and Manipulation in Quantum Communication and Cryptography Systems

    Directory of Open Access Journals (Sweden)

    Octavian Dănilă

    2012-01-01

    Full Text Available Entanglement between two quantum elements is a phenomenon which presents a broad application spectrum, being used largely in quantum cryptography schemes and in physical characterisation of the universe. Commonly known entangled states have been obtained with photons and electrons, but other quantum elements such as quarks, leptons, and neutrinos have shown their informational potential. In this paper, we present the perspective of exploiting the phenomenon of entanglement that appears in nuclear particle interactions as a resource for quantum key distribution protocols.

  15. Authentication in insecure environments using visual cryptography and non-transferable credentials in practise

    CERN Document Server

    Pape, Sebastian

    2014-01-01

    Sebastian Pape discusses two different scenarios for authentication. On the one hand, users cannot trust their devices and nevertheless want to be able to authenticate securely. On the other hand, users may not want to be tracked, while their service provider does not want them to share their credentials. Many users may not be able to determine whether their device is trustworthy, i.e. it might contain malware. One solution is to use visual cryptography for authentication. The author generalizes this concept to human-decipherable encryption schemes and establishes a relationship to CAPTCHAs.

  16. SD-EQR: A New Technique To Use QR CodesTM in Cryptography

    OpenAIRE

    Dey, Somdip

    2012-01-01

    In this paper the author presents a new technique of using QR Codes (commonly known as 'Quick Response Codes') in the field of cryptography. QR Codes are mainly used to convey or store messages because they have a larger storage capacity than conventional barcodes. In this paper the primary focus will be on storing messages in encrypted format with a password and sending them to the required destination hidden in a QR Code, without being tracked or decrypted properly by any...

  17. Improving sensitivity of linear regression-based cell type-specific differential expression deconvolution with per-gene vs. global significance threshold.

    Science.gov (United States)

    Glass, Edmund R; Dozmorov, Mikhail G

    2016-10-06

    of target cell (cell type being analyzed). We demonstrate that LRCDE, which uses Welch's t-test to compare per-gene cell type-specific gene expression estimates, is more sensitive in detecting cell type-specific differential expression at α < 0.05 missed by the global false discovery rate threshold FDR < 0.3.

  18. Automatic histogram threshold using fuzzy measures.

    Science.gov (United States)

    Vieira Lopes, Nuno; Mogadouro do Couto, Pedro A; Bustince, Humberto; Melo-Pinto, Pedro

    2010-01-01

    In this paper, an automatic histogram threshold approach based on a fuzziness measure is presented. This work is an improvement of an existing method. Using fuzzy logic concepts, the problems involved in finding the minimum of a criterion function are avoided. Similarity between gray levels is the key to find an optimal threshold. Two initial regions of gray levels, located at the boundaries of the histogram, are defined. Then, using an index of fuzziness, a similarity process is started to find the threshold point. A significant contrast between objects and background is assumed. Previous histogram equalization is used in small contrast images. No prior knowledge of the image is required.
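    The flavor of fuzziness-based threshold selection can be sketched as follows. This is a simplified Huang-Wang-style fuzziness minimization, not the authors' region-growing similarity procedure, and the toy histogram is invented for illustration:

```python
import numpy as np

def fuzzy_threshold(hist):
    # For each candidate threshold, measure how "fuzzy" the two-region
    # partition is: gray levels close to their region mean get membership
    # near 1, and the linear index of fuzziness is minimized over thresholds.
    levels = np.arange(len(hist))
    total = hist.sum()
    C = len(hist) - 1  # maximum gray-level distance, normalizes memberships
    best_t, best_f = None, np.inf
    for t in range(1, len(hist) - 1):
        w0, w1 = hist[:t].sum(), hist[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:t] * hist[:t]).sum() / w0  # background mean
        mu1 = (levels[t:] * hist[t:]).sum() / w1  # object mean
        mu = np.where(levels < t, mu0, mu1)
        m = 1.0 / (1.0 + np.abs(levels - mu) / C)  # memberships in [0.5, 1]
        # linear index of fuzziness, weighted by pixel counts
        f = 2.0 * (hist * np.minimum(m, 1.0 - m)).sum() / total
        if f < best_f:
            best_t, best_f = t, f
    return best_t

# toy bimodal histogram: dark peak near level 2, bright peak near level 11
hist = np.array([0, 5, 9, 5, 1, 0, 0, 0, 1, 4, 8, 10, 6, 2, 0, 0])
print(fuzzy_threshold(hist))  # a cut in the valley between the two modes
```

    The appeal of such fuzziness measures is that they avoid the ill-behaved criterion functions mentioned in the abstract: the index is evaluated exhaustively over candidate thresholds rather than minimized by gradient descent.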

  19. COALA-System for Visual Representation of Cryptography Algorithms

    Science.gov (United States)

    Stanisavljevic, Zarko; Stanisavljevic, Jelena; Vuletic, Pavle; Jovanovic, Zoran

    2014-01-01

    Educational software systems have an increasingly significant presence in engineering sciences. They aim to improve students' attitudes and knowledge acquisition typically through visual representation and simulation of complex algorithms and mechanisms or hardware systems that are often not available to the educational institutions. This paper…

  20. Cryptography Based E-Commerce Security: A Review

    OpenAIRE

    Shazia Yasin; Khalid Haseeb; Rashid Jalal Qureshi

    2012-01-01

    E-commerce is a powerful tool for business transformation that allows companies to enhance their supply-chain operation, reach new markets, and improve services for customers as well as for providers. Implementing the E-commerce applications that provide these benefits may be impossible without a coherent, consistent approach to E-commerce security. E-commerce has presented a new way of doing transactions all over the world using the internet. Organizations have changed their way of doing busines...

  1. High-Rate Strong-Signal Quantum Cryptography

    Science.gov (United States)

    Yuen, Horace P.

    1996-01-01

    Several quantum cryptosystems utilizing different kinds of nonclassical lights, which can accommodate high intensity fields and high data rate, are described. However, they are all sensitive to loss and both the high rate and the strong-signal character rapidly disappear. A squeezed light homodyne detection scheme is proposed which, with present-day technology, leads to more than two orders of magnitude data rate improvement over other current experimental systems for moderate loss.

  2. One Time Password Security through Cryptography For Mobile Banking

    OpenAIRE

    Dr.D.S.Rao; Gurleen Kour; Divya Jyoti

    2011-01-01

    Electronic banking, which provides economic services through the internet, changed the business trade of banks drastically, decreasing costs and improving convenience for the user. The high usage of internet-enabled phones makes the shift of banking operations from desktop computers to mobile phones a natural growth of electronic banking. This newly formed division of electronic banking is called mobile banking. Mobile banking is explained as: “the exe...

  3. Hadron production near threshold

    Indian Academy of Sciences (India)

    Final state interaction effects in pp → pΛK+ and pd → 3He η reactions are explored near threshold to study the sensitivity of the cross-sections to the pΛ potential and the ηN scattering matrix. The final state scattering wave functions between Λ and p and between η and 3He are described rigorously. The Λ production is ...

  4. Elaborating on Threshold Concepts

    Science.gov (United States)

    Rountree, Janet; Robins, Anthony; Rountree, Nathan

    2013-01-01

    We propose an expanded definition of Threshold Concepts (TCs) that requires the successful acquisition and internalisation not only of knowledge, but also its practical elaboration in the domains of applied strategies and mental models. This richer definition allows us to clarify the relationship between TCs and Fundamental Ideas, and to account…

  5. Boundaries, Thresholds, and Consequences.

    Science.gov (United States)

    Smith, Carl R.

    1997-01-01

    Highlights issues in the debate concerning Individuals with Disabilities Education Act (IDEA) special education legislation as it relates to student discipline and incarcerated juveniles. Focuses on assessment issues and thresholds for diagnosable conditions. Looks at debates surrounding IDEA and some of the consequences of new legislation. (RJM)

  6. VLSI IMPLEMENTATION OF NOVEL ROUND KEYS GENERATION SCHEME FOR CRYPTOGRAPHY APPLICATIONS BY ERROR CONTROL ALGORITHM

    Directory of Open Access Journals (Sweden)

    B. SENTHILKUMAR

    2015-05-01

    Full Text Available A novel implementation of a code based cryptography (Cryptocoding) technique for a multi-layer key distribution scheme is presented. A VLSI chip is designed for storing information on the generation of round keys. A new algorithm is developed for reduced key size with optimal performance. An Error Control Algorithm is employed both for the generation of round keys and for the diffusion of non-linearity among them. Two new functions for bit inversion and its reversal are developed for cryptocoding. The probability of retrieving the original key from any other round key is reduced by diffusing nonlinear selective bit inversions on the round keys. Randomized selective bit inversions are performed on equal lengths of key bits by a Round Constant Feedback Shift Register within the error correction limits of the chosen code. The complexity of retrieving the original key from any other round key is increased by optimal hardware usage. The proposed design is simulated and synthesized using VHDL coding for a Spartan3E FPGA, and the results are shown. A comparative analysis between 128-bit Advanced Encryption Standard round keys and the proposed round keys demonstrates the security strength of the proposed algorithm. This paper concludes that the chip based multi-layer key distribution of the proposed algorithm is an enhanced solution to the existing threats on cryptography algorithms.

  7. Encrypted Objects and Decryption Processes: Problem-Solving with Functions in a Learning Environment Based on Cryptography

    Science.gov (United States)

    White, Tobin

    2009-01-01

    This paper introduces an applied problem-solving task, set in the context of cryptography and embedded in a network of computer-based tools. This designed learning environment engaged students in a series of collaborative problem-solving activities intended to introduce the topic of functions through a set of linked representations. In a…

  8. Trichocyanines: a Red-Hair-Inspired Modular Platform for Dye-Based One-Time-Pad Molecular Cryptography.

    Science.gov (United States)

    Leone, Loredana; Pezzella, Alessandro; Crescenzi, Orlando; Napolitano, Alessandra; Barone, Vincenzo; d'Ischia, Marco

    2015-06-01

    Current molecular cryptography (MoCryp) systems are almost exclusively based on DNA chemistry and reports of cryptography technologies based on other less complex chemical systems are lacking. We describe herein, as proof of concept, the prototype of the first asymmetric MoCryp system, based on an 8-compound set of a novel bioinspired class of cyanine-type dyes called trichocyanines. These novel acidichromic cyanine-type dyes inspired by red hair pigments were synthesized and characterized with the aid of density functional theory (DFT) calculations. Trichocyanines consist of a modular scaffold easily accessible via an expedient condensation of 3-phenyl- or 3-methyl-2H-1,4-benzothiazines with N-dimethyl- or o-methoxyhydroxy-substituted benzaldehyde or cinnamaldehyde derivatives. The eight representative members synthesized herein can be classified as belonging to two three-state systems tunable through four different control points. This versatile dye platform can generate an expandable palette of colors and appears to be specifically suited to implement an unprecedented single-use asymmetric molecular cryptography system. With this system, we intend to pioneer the translation of digital public-key cryptography into a chemical-coding one-time-pad-like system.

  9. Laser damage helps the eavesdropper in quantum cryptography.

    Science.gov (United States)

    Bugge, Audun Nystad; Sauge, Sebastien; Ghazali, Aina Mardhiyah M; Skaar, Johannes; Lydersen, Lars; Makarov, Vadim

    2014-02-21

    We propose a class of attacks on quantum key distribution (QKD) systems where an eavesdropper actively engineers new loopholes by using damaging laser illumination to permanently change properties of system components. This can turn a perfect QKD system into a completely insecure system. A proof-of-principle experiment performed on an avalanche photodiode-based detector shows that laser damage can be used to create loopholes. After ∼1 W illumination, the detectors' dark count rate decreases by a factor of 2-5, permanently improving single-photon counting performance. After ∼1.5 W, the detectors switch permanently into the linear photodetection mode and become completely insecure for QKD applications.

  10. Stroke rehabilitation reaches a threshold.

    Directory of Open Access Journals (Sweden)

    Cheol E Han

    2008-08-01

    Full Text Available Motor training with the upper limb affected by stroke partially reverses the loss of cortical representation after lesion and has been proposed to increase spontaneous arm use. Moreover, repeated attempts to use the affected hand in daily activities create a form of practice that can potentially lead to further improvement in motor performance. We thus hypothesized that if motor retraining after stroke increases spontaneous arm use sufficiently, then the patient will enter a virtuous circle in which spontaneous arm use and motor performance reinforce each other. In contrast, if the dose of therapy is not sufficient to bring spontaneous use above threshold, then performance will not increase and the patient will further develop compensatory strategies with the less affected hand. To refine this hypothesis, we developed a computational model of bilateral hand use in arm reaching to study the interactions between adaptive decision making and motor relearning after motor cortex lesion. The model contains a left and a right motor cortex, each controlling the opposite arm, and a single action choice module. The action choice module learns, via reinforcement learning, the value of using each arm for reaching in specific directions. Each motor cortex uses a neural population code to specify the initial direction along which the contralateral hand moves towards a target. The motor cortex learns to minimize directional errors and to maximize neuronal activity for each movement. The derived learning rule accounts for the reversal of the loss of cortical representation after rehabilitation and the increase of this loss after stroke with insufficient rehabilitation. Further, our model exhibits nonlinear and bistable behavior: if natural recovery, motor training, or both, brings performance above a certain threshold, then training can be stopped, as the repeated spontaneous arm use provides a form of motor learning that further bootstraps performance and

  11. Azygos Vein Lead Implantation For High Defibrillation Thresholds In Implantable Cardioverter Defibrillator Placement

    Directory of Open Access Journals (Sweden)

    Naga VA Kommuri

    2010-01-01

    Full Text Available Evaluation of defibrillation threshold is a standard of care during implantation of implantable cardioverter defibrillator. High defibrillation thresholds are often encountered and pose a challenge to electrophysiologists to improve the defibrillation threshold. We describe a case series where defibrillation thresholds were improved after implanting a defibrillation lead in the azygos vein.

  12. ECG-cryptography and authentication in body area networks.

    Science.gov (United States)

    Zhang, Zhaoyang; Wang, Honggang; Vasilakos, Athanasios V; Fang, Hua

    2012-11-01

    Wireless body area networks (BANs) have drawn much attention from the research community and industry in recent years. Multimedia healthcare services provided by BANs can be available to anyone, anywhere, and anytime seamlessly. A critical issue in BANs is how to preserve the integrity and privacy of a person's medical data over wireless environments in a resource-efficient manner. This paper presents a novel key agreement scheme that allows neighboring nodes in BANs to share a common key generated from electrocardiogram (ECG) signals. The improved Jules Sudan (IJS) algorithm is proposed to set up the key agreement for message authentication. The proposed ECG-IJS key agreement can secure data communications over BANs in a plug-n-play manner without any key distribution overhead. Both simulation and experimental results are presented, which demonstrate that the proposed ECG-IJS scheme achieves better security performance in terms of several metrics, such as false acceptance rate (FAR) and false rejection rate (FRR), than other existing approaches. In addition, the power consumption analysis also shows that the proposed ECG-IJS scheme can achieve energy efficiency for BANs.

  13. Hadron production near threshold

    Indian Academy of Sciences (India)

    Abstract. Final state interaction effects in pp → pΛK+ and pd → 3He η reactions are explored near threshold to study the sensitivity of the cross-sections to the pΛ potential and the ηN scattering matrix. The final state scattering wave functions between Λ and p and η and 3He are described rigorously. The Λ production is ...

  14. Control of distributed interference in the one-way quantum cryptography system

    Science.gov (United States)

    Balygin, K. A.; Klimov, A. N.; Kulik, S. P.; Molotkov, S. N.

    2017-07-01

    The possibility of controlling interference in two spaced fiber Mach-Zehnder interferometers and maintaining a nearly ideal visibility has been demonstrated for the one-way quantum cryptography system directly in the key distribution process, through a communication channel with a length of 50 km. It has been shown that the deviation of the visibility from the ideal value is due to the detected difference between the numbers of 0's and 1's in the raw (sifted) key. For this reason, an interferometer can be balanced only in the quasi-single-photon mode, without interrupting the key distribution process, by using the difference between the numbers of 0's and 1's in the raw key as an error indicator. The proposed approach reduces the balancing time and, furthermore, does not require additional exchanges over an open communication channel.

  15. A Novel Image Steganography Technique for Secured Online Transaction Using DWT and Visual Cryptography

    Science.gov (United States)

    Anitha Devi, M. D.; ShivaKumar, K. B.

    2017-08-01

    The online payment ecosystem is a prime target for cyber fraud. End-to-end encryption is therefore needed to maintain the integrity of secret information related to online transactions. With access to payment-related sensitive information, which enables many money transactions every day, the payment infrastructure is a major target for hackers. The proposed system highlights an approach for secure online fund transfer using a unique combination of visual cryptography and a Haar-based discrete wavelet transform steganography technique. This combination of data hiding techniques reduces the amount of information shared between the consumer and the online merchant needed for a successful online transaction, while providing enhanced security for the customer's account details, thereby increasing customer confidence and preventing “identity theft” and “phishing”. To evaluate the effectiveness of the proposed algorithm, root mean square error and peak signal-to-noise ratio have been used as evaluation parameters.
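    The embedding half of such a DWT-based scheme can be sketched with a one-level integer (lifting) Haar transform, hiding message bits in the least significant bits of the detail coefficients. This is a simplified 1-D stand-in for the paper's 2-D Haar DWT plus visual cryptography pipeline, and the pixel row is invented for illustration:

```python
import numpy as np

def haar_forward(x):
    # one level of the integer (lifting) Haar transform on adjacent pairs:
    # d = difference (detail), s = approximation; exactly invertible over ints
    a, b = x[0::2].astype(np.int64), x[1::2].astype(np.int64)
    d = a - b
    s = b + (d >> 1)
    return s, d

def haar_inverse(s, d):
    b = s - (d >> 1)
    a = b + d
    x = np.empty(2 * len(s), dtype=np.int64)
    x[0::2], x[1::2] = a, b
    return x

def embed(cover, bits):
    # hide one bit in the LSB of each detail (high-frequency) coefficient
    s, d = haar_forward(cover)
    d[: len(bits)] = (d[: len(bits)] & ~1) | bits
    return haar_inverse(s, d)

def extract(stego, nbits):
    _, d = haar_forward(stego)
    return d[:nbits] & 1

cover = np.array([52, 55, 61, 59, 79, 61, 76, 61], dtype=np.int64)  # toy row
bits = np.array([1, 0, 1, 1])
stego = embed(cover, bits)
print(extract(stego, 4))  # [1 0 1 1], the hidden bits recovered exactly
```

    Using the lifting form of the Haar transform keeps everything in integers, so the stego pixels round-trip through the transform without loss and the hidden bits survive exactly; embedding in detail coefficients perturbs the cover only by small high-frequency changes.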

  16. A User Authentication Scheme Based on Elliptic Curves Cryptography for Wireless Ad Hoc Networks

    Science.gov (United States)

    Chen, Huifang; Ge, Linlin; Xie, Lei

    2015-01-01

    The feature of non-infrastructure support in a wireless ad hoc network (WANET) makes it suffer from various attacks. Moreover, user authentication is the first safety barrier in a network. A mutual trust is achieved by a protocol which enables communicating parties to authenticate each other at the same time and to exchange session keys. For the resource-constrained WANET, an efficient and lightweight user authentication scheme is necessary. In this paper, we propose a user authentication scheme based on the self-certified public key system and elliptic curves cryptography for a WANET. Using the proposed scheme, an efficient two-way user authentication and secure session key agreement can be achieved. Security analysis shows that our proposed scheme is resilient to common known attacks. In addition, the performance analysis shows that our proposed scheme performs similar or better compared with some existing user authentication schemes. PMID:26184224

  17. Algebra for applications cryptography, secret sharing, error-correcting, fingerprinting, compression

    CERN Document Server

    Slinko, Arkadii

    2015-01-01

    This book examines the relationship between mathematics and data in the modern world. Indeed, modern societies are awash with data which must be manipulated in many different ways: encrypted, compressed, shared between users in a prescribed manner, protected from unauthorised access and transmitted over unreliable channels. All of these operations can be understood only by a person with knowledge of basics in algebra and number theory. This book provides the necessary background in arithmetic, polynomials, groups, fields and elliptic curves that is sufficient to understand such real-life applications as cryptography, secret sharing, error-correcting, fingerprinting and compression of information. It is the first to cover many recent developments in these topics. Based on a lecture course given to third-year undergraduates, it is self-contained with numerous worked examples and exercises provided to test understanding. It can additionally be used for self-study.
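    Of the applications listed, secret sharing is perhaps the most compact to illustrate, and it is also the building block of the threshold cryptography this collection is about. A sketch of Shamir's (t, n) threshold scheme over a prime field; the field size and secret below are arbitrary illustrative choices:

```python
import random

P = 2**61 - 1  # a Mersenne prime defining the field GF(P)

def make_shares(secret, t, n):
    # Shamir (t, n): random polynomial of degree t-1 with f(0) = secret;
    # each share is a point (x, f(x)), and any t points determine f
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    f = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 over GF(P) recovers f(0) = secret
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(123456789, t=3, n=5)
print(reconstruct(shares[:3]) == 123456789)  # True: any 3 of 5 shares suffice
```

    Fewer than t shares reveal nothing about the secret, because a degree t-1 polynomial through t-1 known points is still consistent with every possible value of f(0).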

  18. An efficient RFID authentication protocol to enhance patient medication safety using elliptic curve cryptography.

    Science.gov (United States)

    Zhang, Zezhong; Qi, Qingqing

    2014-05-01

    Medication errors are dangerous, since they can cause serious or even fatal harm to patients. In order to reduce medication errors, automated patient medication systems using Radio Frequency Identification (RFID) technology have been adopted in many hospitals. The data transmitted in these medication systems is important and sensitive. In the past decade, many security protocols have been proposed to ensure its secure transmission, and they have attracted wide attention. Because it provides mutual authentication between the medication server and the tag, the RFID authentication protocol is considered the most important security protocol in these systems. In this paper, we propose an RFID authentication protocol based on elliptic curve cryptography (ECC) to enhance patient medication safety. Our analysis shows that the proposed protocol overcomes the security weaknesses of previous protocols and offers better performance. Therefore, the proposed protocol is well suited for automated patient medication systems.
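    The primitive underlying such ECC-based protocols is scalar multiplication of a curve point. A sketch on a textbook-sized curve (y^2 = x^3 + 2x + 2 over GF(17) with generator G = (5, 1) of order 19, a standard toy example; real protocols use curves of roughly 256 bits):

```python
P, A = 17, 2        # field prime and curve coefficient a in y^2 = x^3 + ax + b
G = (5, 1)          # generator of the order-19 curve group
O = None            # point at infinity (the group identity)

def point_add(p1, p2):
    if p1 is O: return p2
    if p2 is O: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return O  # p2 is the inverse of p1
    if p1 == p2:  # point doubling: tangent-line slope
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:         # chord-line slope
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def scalar_mul(k, pt):
    # double-and-add, scanning the bits of k from most significant down
    result = O
    for bit in bin(k)[2:]:
        result = point_add(result, result)
        if bit == '1':
            result = point_add(result, pt)
    return result

# Diffie-Hellman-style exchange: both parties derive the same shared point
a, b = 7, 11
assert scalar_mul(a, scalar_mul(b, G)) == scalar_mul(b, scalar_mul(a, G))
print(scalar_mul(2, G))  # (6, 3)
```

    Recovering k from k*G is exactly the ECDLP, which is what lets a resource-constrained tag prove knowledge of its secret scalar without revealing it.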

  19. A User Authentication Scheme Based on Elliptic Curves Cryptography for Wireless Ad Hoc Networks.

    Science.gov (United States)

    Chen, Huifang; Ge, Linlin; Xie, Lei

    2015-07-14

    The feature of non-infrastructure support in a wireless ad hoc network (WANET) makes it suffer from various attacks. Moreover, user authentication is the first safety barrier in a network. A mutual trust is achieved by a protocol which enables communicating parties to authenticate each other at the same time and to exchange session keys. For the resource-constrained WANET, an efficient and lightweight user authentication scheme is necessary. In this paper, we propose a user authentication scheme based on the self-certified public key system and elliptic curves cryptography for a WANET. Using the proposed scheme, an efficient two-way user authentication and secure session key agreement can be achieved. Security analysis shows that our proposed scheme is resilient to common known attacks. In addition, the performance analysis shows that our proposed scheme performs similar or better compared with some existing user authentication schemes.

  20. Full-field implementation of a perfect eavesdropper on a quantum cryptography system.

    Science.gov (United States)

    Gerhardt, Ilja; Liu, Qin; Lamas-Linares, Antía; Skaar, Johannes; Kurtsiefer, Christian; Makarov, Vadim

    2011-06-14

    Quantum key distribution (QKD) allows two remote parties to grow a shared secret key. Its security is founded on the principles of quantum mechanics, but in reality it significantly relies on the physical implementation. Technological imperfections of QKD systems have been previously explored, but no attack on an established QKD connection has been realized so far. Here we show the first full-field implementation of a complete attack on a running QKD connection. An installed eavesdropper obtains the entire 'secret' key, while none of the parameters monitored by the legitimate parties indicate a security breach. This confirms that non-idealities in physical implementations of QKD can be fully practically exploitable, and must be given increased scrutiny if quantum cryptography is to become highly secure.

  1. Optical asymmetric cryptography based on elliptical polarized light linear truncation and a numerical reconstruction technique.

    Science.gov (United States)

    Lin, Chao; Shen, Xueju; Wang, Zhisong; Zhao, Cheng

    2014-06-20

    We demonstrate a novel optical asymmetric cryptosystem based on the principle of elliptical polarized light linear truncation and a numerical reconstruction technique. The device of an array of linear polarizers is introduced to achieve linear truncation on the spatially resolved elliptical polarization distribution during image encryption. This encoding process can be characterized as confusion-based optical cryptography that involves no Fourier lens and diffusion operation. Based on the Jones matrix formalism, the intensity transmittance for this truncation is deduced to perform elliptical polarized light reconstruction based on two intensity measurements. Use of a quick response code makes the proposed cryptosystem practical, with versatile key sensitivity and fault tolerance. Both simulation and preliminary experimental results that support theoretical analysis are presented. An analysis of the resistance of the proposed method against a known public key attack is also provided.

  2. Optical asymmetric cryptography based on amplitude reconstruction of elliptically polarized light

    Science.gov (United States)

    Cai, Jianjun; Shen, Xueju; Lei, Ming

    2017-11-01

    We propose a novel optical asymmetric image encryption method based on amplitude reconstruction of elliptically polarized light, which is free from the silhouette problem. The original image is first analytically separated into two phase-only masks, and then the two masks are encoded into the amplitudes of the orthogonal polarization components of an elliptically polarized light beam. Finally, the elliptically polarized light propagates through a linear polarizer, and the output intensity distribution is recorded by a CCD camera to obtain the ciphertext. The whole encryption procedure can be implemented with commonly used optical elements, and it combines a diffusion process and a confusion process. As a result, the proposed method achieves high robustness against iterative-algorithm-based attacks. Simulation results are presented to prove the validity of the proposed cryptography.

  3. Image size invariant visual cryptography for general access structures subject to display quality constraints.

    Science.gov (United States)

    Lee, Kai-Hui; Chiu, Pei-Ling

    2013-10-01

    Conventional visual cryptography (VC) suffers from a pixel-expansion problem, or an uncontrollable display quality problem for recovered images, and lacks a general approach to constructing visual secret sharing schemes for general access structures. We propose a general and systematic approach to address these issues without sophisticated codebook design. This approach can be used for binary secret images in non-computer-aided decryption environments. To avoid pixel expansion, we design a set of column vectors to encrypt secret pixels rather than using the conventional VC-based approach. We begin by formulating a mathematical model for the VC construction problem to find the column vectors for the optimal VC construction, after which we develop a simulated-annealing-based algorithm to solve the problem. The experimental results show that the display quality of the recovered image is superior to that of previous approaches.
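
A minimal sketch of size-invariant (pixel-expansion-free) visual sharing, assuming the simplest probabilistic (2, 2) case rather than the paper's optimized column-vector construction:

```python
import random

def encrypt_pixel(secret_bit):
    """Probabilistic size-invariant (2, 2) sharing of one pixel.
    0 = white, 1 = black; stacking the shares is a logical OR."""
    r = random.randint(0, 1)
    if secret_bit == 0:      # white pixel: identical share bits
        return r, r
    return r, 1 - r          # black pixel: complementary share bits

random.seed(0)
secret = [0, 1, 1, 0, 1, 0, 0, 1]
shares = [encrypt_pixel(b) for b in secret]
stacked = [s1 | s2 for s1, s2 in shares]
# Black secret pixels always stack to black; white pixels stack to
# black only about half the time (the usual VC contrast loss), and
# each share on its own is a uniformly random bit string.
```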

  4. Synchronization of random bit generators based on coupled chaotic lasers and application to cryptography.

    Science.gov (United States)

    Kanter, Ido; Butkovski, Maria; Peleg, Yitzhak; Zigzag, Meital; Aviad, Yaara; Reidler, Igor; Rosenbluh, Michael; Kinzel, Wolfgang

    2010-08-16

    Random bit generators (RBGs) constitute an important tool in cryptography, stochastic simulations and secure communications. The latter in particular has demanding requirements: high generation rates of unpredictable bit strings and secure key-exchange protocols over public channels. Deterministic algorithms generate pseudo-random number sequences at high rates; however, their unpredictability is limited by the very nature of their deterministic origin. Recently, physical RBGs based on chaotic semiconductor lasers were shown to exceed Gbit/s rates. Whether secure synchronization of two high-rate physical RBGs is possible remains an open question. Here we propose a method whereby two fast RBGs based on mutually coupled chaotic lasers are synchronized. Using information-theoretic analysis we demonstrate security against a powerful computational eavesdropper, capable of noiseless amplification, where all parameters are publicly known. The method is also extended to secure synchronization of a small network of three RBGs.

  5. Separable Reversible Data Hiding in Encrypted Signals with Public Key Cryptography

    Directory of Open Access Journals (Sweden)

    Wei-Liang Tai

    2018-01-01

    Full Text Available We propose separable reversible data hiding in an encrypted signal with public key cryptography. In our separable framework, the image owner encrypts the original image by using a public key. On receipt of the encrypted signal, the data-hider embeds data in it by using a data-hiding key. The image decryption and data extraction are independent and separable at the receiver side. Even though the receiver, who has only the data-hiding key, does not learn about the decrypted content, he can extract data from the received marked encrypted signal. However, the receiver who has only the private key cannot extract the embedded data, but he can directly decrypt the received marked encrypted signal to obtain the original image without any error. Compared with other schemes using a cipher stream to encrypt the image, the proposed scheme is more appropriate for cloud services without degrading the security level.

  6. True random number generation from mobile telephone photo based on chaotic cryptography

    International Nuclear Information System (INIS)

    Zhao Liang; Liao Xiaofeng; Xiao Di; Xiang Tao; Zhou Qing; Duan Shukai

    2009-01-01

    A cheap, convenient and universal TRNG based on mobile telephone photos for producing random bit sequences is proposed. To settle the problems of sequential pixels and comparability, three chaos-based approaches are applied to post-process the generated binary image. The random numbers produced by three users are tested using the US NIST RNG statistical test software. The experimental results indicate that the Arnold cat map is the fastest way to generate a random bit sequence and can be run on a general PC. The 'MASK' algorithm also performs well. Finally, compared with the TRNG of Hu et al. [Hu Y, Liao X, Wong KW, Zhou Q. A true random number generator based on mouse movement and chaotic cryptography. Chaos, Solitons and Fractals 2007. doi: 10.1016/j.chaos.2007.10.022], the TRNG proposed in this paper is found to have many merits.
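
The Arnold cat map named above is simple to state; a minimal sketch on an illustrative 8×8 binary image (not the paper's telephone-photo data):

```python
import numpy as np

def arnold_cat(img):
    """One iteration of the Arnold cat map on an N x N image:
    (x, y) -> (x + y, x + 2y) mod N, a bijection on the pixel grid."""
    n = img.shape[0]
    out = np.empty_like(img)
    for x in range(n):
        for y in range(n):
            out[(x + y) % n, (x + 2 * y) % n] = img[x, y]
    return out

rng = np.random.default_rng(42)
img = rng.integers(0, 2, size=(8, 8))  # toy 8x8 binary "photo"
scrambled = arnold_cat(img)
# The map permutes pixels, so bit counts are preserved while
# neighbouring (correlated) pixels are dispersed across the image.
```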

  7. Effect of imperfect Faraday mirrors on the security of a Faraday–Michelson quantum cryptography system

    International Nuclear Information System (INIS)

    Wang, Wei-Long; Gao, Ming; Ma, Zhi

    2013-01-01

    The one-way Faraday–Michelson system is a very useful practical quantum cryptography system where Faraday mirrors (FMs) play an important role. In this paper we analyze the security of this system against imperfect FMs. We consider the security loophole caused by imperfect FMs in Alice’s and Bob’s security zones. Then we implement a passive FM attack in this system. By changing the values of the imperfection parameters of Alice’s FMs, we calculate the quantum bit error rate between Alice and Bob induced by Eve and the probability that Eve obtains outcomes successfully. It is shown that the imperfection of one of Alice’s two FMs makes the system sensitive to an attack. Finally we give a modified key rate as a function of the FM imperfections. The security analysis indicates that both Alice’s and Bob’s imperfect FMs can compromise the secure key. (paper)

  8. Intraoperative transfusion threshold and tissue oxygenation

    DEFF Research Database (Denmark)

    Nielsen, K; Dahl, B; Johansson, P I

    2012-01-01

    Transfusion with allogeneic red blood cells (RBCs) may be needed to maintain oxygen delivery during major surgery, but the appropriate haemoglobin (Hb) concentration threshold has not been well established. We hypothesised that a higher level of Hb would be associated with improved subcutaneous...

  9. Oscillatory Threshold Logic

    Science.gov (United States)

    Borresen, Jon; Lynch, Stephen

    2012-01-01

    In the 1940s, the first generation of modern computers used vacuum tube oscillators as their principal components; however, with the development of the transistor, such oscillator-based computers quickly became obsolete. As the demand for faster and lower-power computers continues, transistors are themselves approaching their theoretical limit and emerging technologies must eventually supersede them. With the development of optical oscillators and Josephson junction technology, we are again presented with the possibility of using oscillators as the basic components of computers, and it is possible that the next generation of computers will be composed almost entirely of oscillatory devices. Here, we demonstrate how coupled threshold oscillators may be used to perform binary logic in a manner entirely consistent with modern computer architectures. We describe a variety of computational circuitry and demonstrate working oscillator models of both computation and memory. PMID:23173034

  10. All-optical cryptography of M-QAM formats by using two-dimensional spectrally sliced keys.

    Science.gov (United States)

    Abbade, Marcelo L F; Cvijetic, Milorad; Messani, Carlos A; Alves, Cleiton J; Tenenbaum, Stefan

    2015-05-10

    There has been an increased interest in enhancing the security of optical communications systems and networks. All-optical cryptography methods have been considered as an alternative to electronic data encryption. In this paper we propose and verify the use of a novel all-optical scheme based on cryptographic keys applied on the spectral signal for encryption of the M-QAM modulated data with bit rates of up to 200 gigabits per second.

  11. Wigner representation for experiments on quantum cryptography using two-photon polarization entanglement produced in parametric down-conversion

    Energy Technology Data Exchange (ETDEWEB)

    Casado, A [Departamento de Fisica Aplicada III, Escuela Superior de Ingenieros, Universidad de Sevilla, 41092 Sevilla (Spain); Guerra, S [Centro Asociado de la Universidad Nacional de Educacion a Distancia de Las Palmas de Gran Canaria (Spain); Placido, J [Departamento de Fisica, Universidad de Las Palmas de Gran Canaria (Spain)], E-mail: acasado@us.es

    2008-02-28

    In this paper, the theory of parametric down-conversion in the Wigner representation is applied to Ekert's quantum cryptography protocol. We analyse the relation between two-photon entanglement and (non-secure) quantum key distribution within the Wigner framework in the Heisenberg picture. Experiments using two-qubit polarization entanglement generated in nonlinear crystals are analysed in this formalism, along with the effects of eavesdropping attacks in the case of projective measurements.

  13. Threshold concepts in finance: conceptualizing the curriculum

    Science.gov (United States)

    Hoadley, Susan; Tickle, Leonie; Wood, Leigh N.; Kyng, Tim

    2015-08-01

    Graduates with well-developed capabilities in finance are invaluable to our society and in increasing demand. Universities face the challenge of designing finance programmes to develop these capabilities and the essential knowledge that underpins them. Our research responds to this challenge by identifying threshold concepts that are central to the mastery of finance and by exploring their potential for informing curriculum design and pedagogical practices to improve student outcomes. In this paper, we report the results of an online survey of finance academics at multiple institutions in Australia, Canada, New Zealand, South Africa and the United Kingdom. The outcomes of our research are recommendations for threshold concepts in finance endorsed by quantitative evidence, as well as a model of the finance curriculum incorporating finance, modelling and statistics threshold concepts. In addition, we draw conclusions about the application of threshold concept theory supported by both quantitative and qualitative evidence. Our methodology and findings have general relevance to the application of threshold concept theory as a means to investigate and inform curriculum design and delivery in higher education.

  14. Iris pigmentation and AC thresholds.

    Science.gov (United States)

    Roche, A F; Mukherjee, D; Chumlea, W C; Siervogel, R M

    1983-03-01

    Data from 160 White children were used to analyze possible associations between iris pigmentation and AC pure-tone thresholds. Iris pigmentation was graded from iris color using glass models of eyes, and AC thresholds were obtained under carefully controlled conditions. Analyses of variance using two groupings of iris color grades showed no evidence of an association between iris color grade and AC thresholds. Furthermore, inspection of arrays of the actual glass eye models, in conjunction with the order of mean thresholds at each test frequency, did not indicate the presence of an association between iris color grades and thresholds. It was concluded that while iris pigmentation may be related to some aspects of hearing ability, it does not appear to be related to AC thresholds in children.

  15. Crossing the threshold

    Science.gov (United States)

    Bush, John; Tambasco, Lucas

    2017-11-01

    First, we summarize the circumstances in which chaotic pilot-wave dynamics gives rise to quantum-like statistical behavior. For ``closed'' systems, in which the droplet is confined to a finite domain either by boundaries or applied forces, quantum-like features arise when the persistence time of the waves exceeds the time required for the droplet to cross its domain. Second, motivated by the similarities between this hydrodynamic system and stochastic electrodynamics, we examine the behavior of a bouncing droplet above the Faraday threshold, where a stochastic element is introduced into the drop dynamics by virtue of its interaction with a background Faraday wave field. With a view to extending the dynamical range of pilot-wave systems to capture more quantum-like features, we consider a generalized theoretical framework for stochastic pilot-wave dynamics in which the relative magnitudes of the drop-generated pilot-wave field and a stochastic background field may be varied continuously. We gratefully acknowledge the financial support of the NSF through their CMMI and DMS divisions.

  16. Albania - Thresholds I and II

    Data.gov (United States)

    Millennium Challenge Corporation — From 2006 to 2011, the government of Albania (GOA) received two Millennium Challenge Corporation (MCC) Threshold Programs totaling $29.6 million. Albania received...

  17. Olfactory threshold in Parkinson's disease.

    Science.gov (United States)

    Quinn, N P; Rossor, M N; Marsden, C D

    1987-01-01

    Olfactory threshold to differing concentrations of amyl acetate was determined in 78 subjects with idiopathic Parkinson's disease and 40 age-matched controls. Impaired olfactory threshold (previously reported by others) was confirmed in Parkinsonian subjects compared with controls. There was no significant correlation between olfactory threshold and age, sex, duration of disease, or current therapy with levodopa or anticholinergic drugs. In a sub-group of 14 levodopa-treated patients with severe "on-off" fluctuations, no change in olfactory threshold between the two states was demonstrable. Olfactory impairment in Parkinson's disease may involve mechanisms that are not influenced by pharmacologic manipulation of dopaminergic or cholinergic status. PMID:3819760

  18. Learning foraging thresholds for lizards

    Energy Technology Data Exchange (ETDEWEB)

    Goldberg, L.A. [Univ. of Warwick, Coventry (United Kingdom). Dept. of Computer Science; Hart, W.E. [Sandia National Labs., Albuquerque, NM (United States); Wilson, D.B. [Massachusetts Inst. of Tech., Cambridge, MA (United States)

    1996-01-12

    This work gives a proof of convergence for a randomized learning algorithm that describes how anoles (lizards found in the Caribbean) learn a foraging threshold distance. This model assumes that an anole will pursue a prey if and only if it is within this threshold of the anole's perch. The learning algorithm was proposed by the biologist Roughgarden and his colleagues, who experimentally confirmed that it quickly converges to the foraging threshold predicted by optimal foraging theory. Our analysis provides an analytic confirmation that the learning algorithm converges to this optimal foraging threshold with high probability.

  19. Threshold Concepts and Information Literacy

    Science.gov (United States)

    Townsend, Lori; Brunetti, Korey; Hofer, Amy R.

    2011-01-01

    What do we teach when we teach information literacy in higher education? This paper describes a pedagogical approach to information literacy that helps instructors focus content around transformative learning thresholds. The threshold concept framework holds promise for librarians because it grounds the instructor in the big ideas and underlying…

  20. Second threshold in weak interactions

    NARCIS (Netherlands)

    Veltman, M.J.G.

    1977-01-01

    The point of view that weak interactions must have a second threshold below 300 – 600 GeV is developed. Above this threshold new physics must come in. This new physics may be the Higgs system, or some other nonperturbative system possibly having some similarities to the Higgs system. The limit of

  1. The Nature of Psychological Thresholds

    Science.gov (United States)

    Rouder, Jeffrey N.; Morey, Richard D.

    2009-01-01

    Following G. T. Fechner (1966), thresholds have been conceptualized as the amount of intensity needed to transition between mental states, such as between a states of unconsciousness and consciousness. With the advent of the theory of signal detection, however, discrete-state theory and the corresponding notion of threshold have been discounted.…

  2. Compilation Techniques Specific for a Hardware Cryptography-Embedded Multimedia Mobile Processor

    Directory of Open Access Journals (Sweden)

    Masa-aki FUKASE

    2007-12-01

    Full Text Available The development of single-chip VLSI processors is a key technology of ever-growing pervasive computing, answering overall demands for usability, mobility, speed, security, etc. We have so far developed a hardware cryptography-embedded multimedia mobile processor architecture, HCgorilla. Since HCgorilla integrates a wide range of techniques from architectures to applications and languages, a one-sided design approach is not sufficient. HCgorilla needs a more elaborate strategy, that is, hardware/software (H/S) codesign. Thus, we exploit the software support of HCgorilla, composed of a Java interface and parallelizing compilers. They are assumed to be installed in servers in order to reduce the load and increase the performance of HCgorilla-embedded clients. Since compilers are the essence of the software side of this codesign, we focus in this article on our recent results on the design, specifications, and prototyping of parallelizing compilers for HCgorilla. The parallelizing compilers are composed of a multicore compiler and a LIW compiler. They are specified to extract parallelism from executable serial codes or the Java interface output and to emit codes executable in parallel by HCgorilla. The prototype compilers are written in Java. An evaluation using an arithmetic test program shows that the prototype compilers are reasonable compared with hand compilation.

  3. Elliptic Curve Cryptography-Based Authentication with Identity Protection for Smart Grids.

    Directory of Open Access Journals (Sweden)

    Liping Zhang

    Full Text Available In a smart grid, the power service provider enables the expected power generation amount to be measured according to current power consumption, thus stabilizing the power system. However, the data transmitted over smart grids are not protected, and thus suffer from several types of security threats and attacks. A robust and efficient authentication protocol should therefore be provided to strengthen the security of smart grid networks. As the Supervisory Control and Data Acquisition system provides security protection between the control center and substations in most smart grid environments, we focus on how to secure the communications between the substations and smart appliances. Existing security approaches fail to address the performance-security balance. In this study, we suggest a mitigation authentication protocol based on Elliptic Curve Cryptography with privacy protection, using a tamper-resistant device at the smart appliance side, to achieve a delicate balance between the performance and security of smart grids. The proposed protocol provides attractive features such as identity protection, mutual authentication and key agreement. Finally, we demonstrate the completeness of the proposed protocol using the Gong-Needham-Yahalom logic.

  4. Composite Field Multiplier based on Look-Up Table for Elliptic Curve Cryptography Implementation

    Directory of Open Access Journals (Sweden)

    Marisa W. Paryasto

    2013-09-01

    Full Text Available Implementing a secure cryptosystem requires operations involving hundreds of bits. One of the most recommended algorithms is Elliptic Curve Cryptography (ECC). The complexity of elliptic curve algorithms and parameters with hundreds of bits requires a specific design and implementation strategy. The design architecture must be customized according to security requirements, available resources and parameter choices. In this work we propose the use of a composite field to implement finite field multiplication for ECC. We use a 299-bit key length represented in GF((2^13)^23) instead of in GF(2^299). A composite field multiplier can be implemented using different multipliers for the ground field and the extension field. In this paper, a LUT is used for multiplication in the ground field and a classic multiplier is used for the extension field multiplication. A generic architecture for the multiplier is presented. Implementation is done in VHDL with the target device Altera DE2. The work in this paper uses the simplest algorithms to confirm the idea that splitting the field into a composite one and using different multipliers for the base and extension fields gives a better time-area trade-off. This work is the beginning of our more advanced further research implementing composite fields using Mastrovito Hybrid, KOA and LUT.
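
LUT-based ground-field multiplication can be sketched with log/antilog tables; the example below uses GF(2^4) for readability instead of the paper's GF(2^13) ground field (the irreducible polynomial and generator here are illustrative assumptions):

```python
# LUT-based multiplication in a small ground field GF(2^4).
POLY = 0b10011  # x^4 + x + 1, irreducible (and primitive) over GF(2)

def gf_mul_slow(a, b):
    """Reference multiply: carry-less product reduced modulo POLY."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x10:     # degree reached 4: reduce
            a ^= POLY
        b >>= 1
    return r

# Build log/antilog tables from the generator g = x (i.e. 2).
g = 2
exp_t, log_t = [0] * 15, [0] * 16
x = 1
for i in range(15):
    exp_t[i] = x
    log_t[x] = i
    x = gf_mul_slow(x, g)

def gf_mul_lut(a, b):
    """Table-lookup multiply: a*b = g^(log a + log b mod 2^4 - 1)."""
    if a == 0 or b == 0:
        return 0
    return exp_t[(log_t[a] + log_t[b]) % 15]
```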

  6. Deciphering the language of nature: cryptography, secrecy, and alterity in Francis Bacon.

    Science.gov (United States)

    Clody, Michael C

    2011-01-01

    The essay argues that Francis Bacon's considerations of parables and cryptography reflect larger interpretative concerns of his natural philosophic project. Bacon describes nature as having a language distinct from those of God and man, and, in so doing, establishes a central problem of his natural philosophy—namely, how can the language of nature be accessed through scientific representation? Ultimately, Bacon's solution relies on a theory of differential and duplicitous signs that conceal within them the hidden voice of nature, which is best recognized in the natural forms of efficient causality. The "alphabet of nature"—those tables of natural occurrences—consequently plays a central role in his program, as it renders nature's language susceptible to a process of decryption that mirrors the model of the bilateral cipher. It is argued that while the writing of Bacon's natural philosophy strives for literality, its investigative process preserves a space for alterity within scientific representation that is made accessible to those with the interpretative key.

  7. An Interoperability Consideration in Selecting Domain Parameters for Elliptic Curve Cryptography

    Science.gov (United States)

    Ivancic, Will (Technical Monitor); Eddy, Wesley M.

    2005-01-01

    Elliptic curve cryptography (ECC) will be an important technology for electronic privacy and authentication in the near future. There are many published specifications for elliptic curve cryptosystems, most of which contain detailed descriptions of the process for the selection of domain parameters. Selecting strong domain parameters ensures that the cryptosystem is robust to attacks. Due to a limitation in several published algorithms for doubling points on elliptic curves, some ECC implementations may produce incorrect, inconsistent, and incompatible results if domain parameters are not carefully chosen under a criterion that we describe. Few documents specify the addition or doubling of points in such a manner as to avoid this problematic situation. The safety criterion we present is not listed in any ECC specification we are aware of, although several other guidelines for domain selection are discussed in the literature. We provide a simple example of how a set of domain parameters not meeting this criterion can produce catastrophic results, and outline a simple means of testing curve parameters for interoperable safety over doubling.
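
The doubling hazard described above can be illustrated in affine coordinates, where the slope (3x^2 + a)/(2y) is undefined at a point with y = 0 and the correct result is the point at infinity. A sketch over a small illustrative prime field (not a cryptographically strong curve, and not the specific criterion of the report):

```python
# Affine point doubling on y^2 = x^3 + a*x + b over GF(p).
# An unguarded doubling that always computes (3x^2 + a)/(2y) breaks
# when y = 0: the correct result there is the point at infinity.
p, a, b = 23, 1, 0   # illustrative curve y^2 = x^3 + x over GF(23)
INF = None           # representation of the point at infinity

def double_point(P):
    if P is INF:
        return INF
    x, y = P
    if y == 0:       # order-2 point: 2P is the point at infinity
        return INF
    lam = (3 * x * x + a) * pow(2 * y, -1, p) % p
    x3 = (lam * lam - 2 * x) % p
    y3 = (lam * (x - x3) - y) % p
    return (x3, y3)

def on_curve(P):
    if P is INF:
        return True
    x, y = P
    return (y * y - x * x * x - a * x - b) % p == 0

# (1, 5) lies on this curve; doubling it gives the order-2 point (0, 0),
# and doubling (0, 0) safely returns INF instead of dividing by zero.
```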

  8. Generalized optical angular momentum sorter and its application to high-dimensional quantum cryptography.

    Science.gov (United States)

    Larocque, Hugo; Gagnon-Bischoff, Jérémie; Mortimer, Dominic; Zhang, Yingwen; Bouchard, Frédéric; Upham, Jeremy; Grillo, Vincenzo; Boyd, Robert W; Karimi, Ebrahim

    2017-08-21

    The orbital angular momentum (OAM) carried by optical beams is a useful quantity for encoding information. This form of encoding has been incorporated into various works ranging from telecommunications to quantum cryptography, most of which require methods that can rapidly process the OAM content of a beam. Among current state-of-the-art schemes that can readily acquire this information are so-called OAM sorters, which consist of devices that spatially separate the OAM components of a beam. Such devices have found numerous applications in optical communications, a field that is in constant demand for additional degrees of freedom, such as polarization and wavelength, into which information can also be encoded. Here, we report the implementation of a device capable of sorting a beam based on its OAM and polarization content, which could be of use in works employing both of these degrees of freedom as information channels. After characterizing our fabricated device, we demonstrate how it can be used for quantum communications via a quantum key distribution protocol.

  9. Optical cryptography of gray-level image information using QPSK modulation and digital holography

    Science.gov (United States)

    Gil, Sang Keun; Jeon, Seok Hee; Jeong, Jong Rae

    2009-02-01

    We propose a novel optical cryptography method for gray-level image information using the QPSK digital modulation method and a digital holographic technique. A gray-level information image is digitized into 8-bit binary information data by ASCII encoding, and these binary data are expressed as quadrature phase values, four to a block of 2×2 pixels, by QPSK digital modulation. After encoding and modulation, the size of the data to be encrypted expands to twice the original size of the gray-level image. The modified information with the corresponding phase values is displayed on a phase-type spatial light modulator and is encrypted with a security key by optical digital holography. The security key is expressed as a random binary phase. The digital hologram in this method is a Fourier transform hologram and is recorded on a CCD camera with 256 gray-level quantized intensities. These encrypted digital holograms can be stored by computer and transmitted over a communication network. With the encrypted digital hologram, the phase values are reconstructed with the same security key by the holographic technique and are decrypted into the original gray-level information image by decoding. Simulation results show that the proposed method can be used for cipher and security systems.
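
The encoding step can be sketched as follows, assuming a Gray-coded bit-to-phase mapping (the abstract does not fix the exact constellation): each 8-bit ASCII character becomes four 2-bit symbols, each carried by one QPSK phase of a 2×2 pixel block.

```python
import math

# Gray-coded QPSK constellation (an assumed bit-to-phase assignment).
PHASES = {'00': math.pi / 4, '01': 3 * math.pi / 4,
          '11': 5 * math.pi / 4, '10': 7 * math.pi / 4}
INV = {v: k for k, v in PHASES.items()}

def char_to_phases(ch):
    """8-bit ASCII character -> four QPSK phases (one 2x2 pixel block)."""
    bits = format(ord(ch), '08b')
    return [PHASES[bits[i:i + 2]] for i in range(0, 8, 2)]

def phases_to_char(phases):
    """Inverse mapping: four phases -> one ASCII character."""
    bits = ''.join(INV[p] for p in phases)
    return chr(int(bits, 2))

encoded = [char_to_phases(c) for c in "Gray"]   # 4 phases per character
decoded = ''.join(phases_to_char(p) for p in encoded)
```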

  10. Clipper Meets Apple vs. FBI—A Comparison of the Cryptography Discourses from 1993 and 2016

    Directory of Open Access Journals (Sweden)

    Matthias Schulze

    2017-03-01

    Full Text Available This article analyzes two cryptography discourses dealing with the question of whether governments should be able to monitor secure and encrypted communication, for example via security vulnerabilities in cryptographic systems. The Clipper chip debate of 1993 and the FBI vs. Apple case of 2016 are analyzed to infer whether these discourses show similarities in their arguments and to draw lessons from them. The study is based on the securitization framework and analyzes the social construction of security threats in political discourses. The findings are that the arguments made by the proponents of exceptional access show major continuities between the two cases. In contrast, the arguments of the critics are more diverse. The critical arguments for stronger encryption remain highly relevant, especially in the context of the Snowden revelations. The article concludes that we need to adopt a more general cyber security perspective, considering the threat of cyber crime and state hacking, when debating whether the government should be able to weaken encryption.

  11. Quantum Information, computation and cryptography. An introductory survey of theory, technology and experiments

    Energy Technology Data Exchange (ETDEWEB)

    Benatti, Fabio [Trieste Univ., Miramare (Italy). Dipt. Fisica Teorica; Fannes, Mark [Leuven Univ. (Belgium). Inst. voor Theoretische Fysica; Floreanini, Roberto [INFN, Trieste (Italy). Dipt. di Fisica Teorica; Petritis, Dimitri (eds.) [Rennes 1 Univ., 35 (France). Inst. de Recherche Mathematique de Rennes

    2010-07-01

    This multi-authored textbook addresses graduate students with a background in physics, mathematics or computer science. No research experience is necessary. Consequently, rather than comprehensively reviewing the vast body of knowledge and literature gathered in the past twenty years, this book concentrates on a number of carefully selected aspects of quantum information theory and technology. Given the highly interdisciplinary nature of the subject, the multi-authored approach brings together different points of view from various renowned experts, providing a coherent picture of the subject matter. The book consists of ten chapters and includes examples, problems, and exercises. The first five present the mathematical tools required for a full comprehension of various aspects of quantum mechanics, classical information, and coding theory. Chapter 6 deals with the manipulation and transmission of information in the quantum realm. Chapters 7 and 8 discuss experimental implementations of quantum information ideas using photons and atoms. Finally, chapters 9 and 10 address ground-breaking applications in cryptography and computation. (orig.)

  12. Lightweight Data Aggregation Scheme against Internal Attackers in Smart Grid Using Elliptic Curve Cryptography

    Directory of Open Access Journals (Sweden)

    Debiao He

    2017-01-01

    Full Text Available Recent advances of Internet and microelectronics technologies have led to the concept of smart grid which has been a widespread concern for industry, governments, and academia. The openness of communications in the smart grid environment makes the system vulnerable to different types of attacks. The implementation of secure communication and the protection of consumers’ privacy have become challenging issues. The data aggregation scheme is an important technique for preserving consumers’ privacy because it can stop the leakage of a specific consumer’s data. To satisfy the security requirements of practical applications, a lot of data aggregation schemes were presented over the last several years. However, most of them suffer from security weaknesses or have poor performances. To reduce computation cost and achieve better security, we construct a lightweight data aggregation scheme against internal attackers in the smart grid environment using Elliptic Curve Cryptography (ECC. Security analysis of our proposed approach shows that it is provably secure and can provide confidentiality, authentication, and integrity. Performance analysis of the proposed scheme demonstrates that both computation and communication costs of the proposed scheme are much lower than the three previous schemes. As a result of these aforementioned benefits, the proposed lightweight data aggregation scheme is more practical for deployment in the smart grid environment.

  13. Elliptic Curve Cryptography-Based Authentication with Identity Protection for Smart Grids.

    Science.gov (United States)

    Zhang, Liping; Tang, Shanyu; Luo, He

    2016-01-01

    In a smart grid, the power service provider enables the expected power generation amount to be measured according to current power consumption, thus stabilizing the power system. However, the data transmitted over smart grids are not protected and therefore suffer from several types of security threats and attacks. A robust and efficient authentication protocol is thus needed to strengthen the security of smart grid networks. As the Supervisory Control and Data Acquisition system provides security protection between the control center and substations in most smart grid environments, we focus on how to secure the communications between the substations and smart appliances. Existing security approaches fail to address the performance-security balance. In this study, we suggest an authentication protocol based on Elliptic Curve Cryptography with privacy protection, using a tamper-resistant device at the smart appliance side to achieve a delicate balance between the performance and security of smart grids. The proposed protocol provides attractive features such as identity protection, mutual authentication, and key agreement. Finally, we demonstrate the completeness of the proposed protocol using the Gong-Needham-Yahalom logic.
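    The protocol details are not given in this record, but the elliptic-curve Diffie-Hellman key agreement that underlies most ECC authentication protocols can be sketched as follows. The toy curve, the private scalars, and the hashed-pseudonym trick for identity protection are all illustrative assumptions, not the construction of Zhang et al.:

```python
import hashlib

# Toy curve y^2 = x^3 + 2x + 3 over F_97 with base point G -- illustrative
# parameters only, far too small for real use.
P, A, B = 97, 2, 3
G = (3, 6)

def ec_add(p1, p2):
    """Affine point addition; None is the point at infinity."""
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None
    if p1 == p2:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def ec_mul(k, pt):
    """Double-and-add scalar multiplication."""
    acc = None
    while k:
        if k & 1:
            acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

# ECDH: each side combines its private scalar with the other side's public
# point; both arrive at the same shared point (a*b*G = b*a*G).
a, b = 13, 29                                 # private keys (appliance, substation)
A_pub, B_pub = ec_mul(a, G), ec_mul(b, G)
shared_ab = ec_mul(a, B_pub)
shared_ba = ec_mul(b, A_pub)
assert shared_ab == shared_ba

# Identity protection (one illustrative option): transmit a hashed pseudonym
# of the public key instead of the appliance's real identity.
pseudonym = hashlib.sha256(str(A_pub).encode()).hexdigest()[:16]
```

    A session key would then be derived from the shared point; the actual protocol additionally relies on the tamper-resistant device mentioned in the abstract.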

  14. Percolation Threshold in Polycarbonate Nanocomposites

    Science.gov (United States)

    Ahuja, Suresh

    2014-03-01

    Nanocomposites have unique mechanical, electrical, magnetic, optical, and thermal properties. Many methods can be applied to prepare polymer-inorganic nanocomposites, such as sol-gel processing, in-situ polymerization, in-situ particle formation, blending, and radiation synthesis. Polymer nanocomposites offer the possibility of substantial improvements in material properties such as shear and bulk modulus, yield strength, toughness, film scratch resistance, optical properties, electrical conductivity, and gas and solvent transport, with only very small amounts of nanoparticles. Experimental results are compared against analytical composite models that have been put forth, including the Voigt and Reuss bounds, the Hashin-Shtrikman bounds, the Halpin-Tsai model, the Cox model, and various Mori-Tanaka models. Examples of numerical modeling are molecular dynamics modeling and finite element modeling of reduced modulus and hardness, which take into account the moduli of the components and the effect of the interface between the hard filler and the relatively soft polymer, polycarbonate. Higher nanoparticle concentration results in poor dispersion and adhesion to the polymer matrix, which lowers modulus and hardness and causes departure from the existing composite models. As the level of silica increases beyond a threshold level, aggregates form, weakening the structure. The polymer-silica interface is found to be weak, as non-interacting silica promotes interfacial slip at silica-matrix junctions. Our experimental results compare favorably with those for polyester nanocomposites, where the effect of nanoclay on composite hardness and modulus depended on the dispersion of the nanoclay in the polyester.
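    Of the analytical models named above, the Halpin-Tsai model has a particularly compact form: E_c = E_m (1 + ζηφ)/(1 − ηφ) with η = (E_f/E_m − 1)/(E_f/E_m + ζ). The numerical values below are illustrative assumptions, not data from the abstract:

```python
def halpin_tsai(E_m, E_f, phi, zeta):
    """Halpin-Tsai estimate of a composite modulus E_c.
    E_m, E_f : matrix and filler moduli (same units)
    phi      : filler volume fraction
    zeta     : shape factor set by filler geometry/aspect ratio
    """
    r = E_f / E_m
    eta = (r - 1.0) / (r + zeta)
    return E_m * (1.0 + zeta * eta * phi) / (1.0 - eta * phi)

# Illustrative values: polycarbonate matrix ~2.3 GPa, silica filler ~70 GPa,
# 3 vol% loading, zeta = 2 for roughly spherical particles (assumptions).
E_c = halpin_tsai(2.3, 70.0, 0.03, 2.0)

# Sanity checks: no filler recovers the matrix modulus, and the prediction
# stays between the matrix and filler moduli.
assert halpin_tsai(2.3, 70.0, 0.0, 2.0) == 2.3
assert 2.3 < E_c < 70.0
```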

  15. Threshold Concepts in Higher Education: A Synthesis of the Literature Relating to Measurement of Threshold Crossing

    Science.gov (United States)

    Nicola-Richmond, Kelli; Pépin, Geneviève; Larkin, Helen; Taylor, Charlotte

    2018-01-01

    In relation to teaching and learning approaches that improve student learning outcomes, threshold concepts have generated substantial interest in higher education. They have been described as "portals" that lead to a transformed way of understanding or thinking, enabling learners to progress, and have been enthusiastically adopted to…

  16. Photoproduction of Charm Near Threshold

    Energy Technology Data Exchange (ETDEWEB)

    Brodsky, Stanley J.

    2000-10-31

    Charm and bottom production near threshold is sensitive to the multi-quark, gluonic, and hidden-color correlations of hadronic and nuclear wavefunctions in QCD since all of the target's constituents must act coherently within the small interaction volume of the heavy quark production subprocess. Although such multi-parton subprocess cross sections are suppressed by powers of 1/m{sub Q}{sup 2}, they have less phase-space suppression and can dominate the contributions of the leading-twist single-gluon subprocesses in the threshold regime. The small rates for open and hidden charm photoproduction at threshold call for a dedicated facility.

  17. Parton distributions with threshold resummation

    CERN Document Server

    Bonvini, Marco; Rojo, Juan; Rottoli, Luca; Ubiali, Maria; Ball, Richard D.; Bertone, Valerio; Carrazza, Stefano; Hartland, Nathan P.

    2015-01-01

    We construct a set of parton distribution functions (PDFs) in which fixed-order NLO and NNLO calculations are supplemented with soft-gluon (threshold) resummation up to NLL and NNLL accuracy respectively, suitable for use in conjunction with any QCD calculation in which threshold resummation is included at the level of partonic cross sections. These resummed PDF sets, based on the NNPDF3.0 analysis, are extracted from deep-inelastic scattering, Drell-Yan, and top quark pair production data, for which resummed calculations can be consistently used. We find that, close to threshold, the inclusion of resummed PDFs can partially compensate the enhancement in resummed matrix elements, leading to resummed hadronic cross-sections closer to the fixed-order calculation. On the other hand, far from threshold, resummed PDFs reduce to their fixed-order counterparts. Our results demonstrate the need for a consistent use of resummed PDFs in resummed calculations.

  18. Examination of China's performance and thematic evolution in quantum cryptography research using quantitative and computational techniques.

    Science.gov (United States)

    Olijnyk, Nicholas V

    2018-01-01

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. 
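    Among the performance indicators mentioned, the H-index has a particularly simple definition: the largest h such that h publications have at least h citations each. A minimal sketch:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank          # the paper at this rank still "covers" its rank
        else:
            break
    return h

# Five papers with 10, 8, 5, 4 and 3 citations: four papers have >= 4 citations,
# but not five papers with >= 5, so h = 4.
assert h_index([10, 8, 5, 4, 3]) == 4
assert h_index([]) == 0
```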

  20. Advances in threshold photoelectron spectroscopy (TPES) and threshold photoelectron photoion coincidence (TPEPICO).

    Science.gov (United States)

    Baer, Tomas; Tuckett, Richard P

    2017-04-12

    The history and evolution of molecular threshold photoelectron spectroscopy and threshold photoelectron photoion coincidence spectroscopy (TPEPICO) over the last fifty years are reviewed. Emphasis is placed on instrumentation and the extraction of dynamical information about energy-selected ion dissociation, not on the detailed spectroscopy of certain molecules. Three important advances have greatly expanded the power of the technique and permitted its implementation on modern synchrotron radiation beamlines. The use of velocity focusing of threshold electrons onto an imaging detector in the 1990s simultaneously improved the sensitivity and the electron energy resolution, and also facilitated the subtraction of the hot-electron background in both threshold electron spectroscopy and TPEPICO studies. The development of multi-start multi-stop collection detectors for both electrons and ions in the 2000s permitted the use of the full intensity of modern synchrotron radiation, thereby greatly improving the signal-to-noise ratio. Finally, recent developments involving the imaging of electrons over a range of energies, as well as of ions, onto separate position-sensitive detectors have further improved the collection sensitivity, so that the low-density samples found in a variety of studies can be investigated. As a result, photoelectron photoion coincidence spectroscopy is now well positioned to address a range of challenging problems, including the quantitative determination of the composition of isomer mixtures and the detection and spectroscopy of free radicals produced in pyrolysis or discharge sources as well as in combustion studies.

  1. Conceptions of nuclear threshold status

    International Nuclear Information System (INIS)

    Quester, G.H.

    1991-01-01

    This paper reviews some alternative definitions of nuclear threshold status. Each of them is important, and major analytical confusion would result if one sense of the term were mistaken for another. The motives for nations entering into such threshold status are a blend of civilian and military gains, and of national interests versus parochial or bureaucratic interests. A portion of the rationale for threshold status emerges inevitably from the pursuit of economic goals, and another portion is made more attractive by the drives of the domestic political process. Yet the impact on international security cannot be dismissed, especially where conflicts among states remain real. Among the military or national security motives are basic deterrence, psychological warfare, war-fighting and, more generally, national prestige. In the end, as the threshold phenomenon is assayed for lessons concerning the role of nuclear weapons more generally in international relations and security, one might conclude that threshold status and outright proliferation converge to a degree in the motives of the states involved and in the advantages attained. As this paper has illustrated, nuclear threshold status is more subtle and more ambiguous than outright proliferation, and it takes considerable time to sort out the complexities. Yet the world has now had a substantial amount of time to deal with this ambiguous status, and this may tempt more states to exploit it.

  2. Tactile thresholds in healthy subjects

    Directory of Open Access Journals (Sweden)

    Metka Moharić

    2014-10-01

    Full Text Available Background: The assessment of sensory thresholds provides a method of examining the function of peripheral nerve fibers and their central connections. Quantitative sensory testing is a variant of conventional sensory testing in which the goal is the quantification of the level of stimulation needed to produce a particular sensation. While thermal and vibratory testing are established methods for assessing sensory thresholds, the assessment of tactile thresholds with monofilaments is not used routinely. The purpose of this study was to assess tactile thresholds in a normal healthy population. Methods: In 39 healthy volunteers (19 men), aged 21 to 71 years, tactile thresholds were assessed with von Frey hairs at 7 sites on the body bilaterally. Results: We found touch sensitivity not to be dependent on age or gender. The right side was significantly more sensitive in the lateral part of the leg (p=0.011) and the left side in the medial part of the arm (p=0.022). There were also significant differences between sites (p<0.001), whereby distal parts of the body were more sensitive. Conclusions: Von Frey filaments allow the estimation of tactile thresholds without the need for complicated instrumentation.

  3. Technology Thresholds for Microgravity: Status and Prospects

    Science.gov (United States)

    Noever, D. A.

    1996-01-01

    The technological and economic thresholds for microgravity space research are estimated for materials science and biotechnology. In the 1990s, the improvement of materials processing was identified as a national scientific priority, particularly for stimulating entrepreneurship. The substantial US investment at stake in these critical technologies includes six broad categories: aerospace, transportation, health care, information, energy, and the environment. Microgravity space research addresses key technologies in each area. The viability of selected space-related industries is critically evaluated and a market-share philosophy is developed, namely that incremental improvements in a large market's efficiency are a tangible reward from space-based research.

  4. Perioperative transfusion threshold and ambulation after hip revision surgery

    DEFF Research Database (Denmark)

    Nielsen, Kamilla; Johansson, Pär I; Dahl, Benny

    2014-01-01

    BACKGROUND: Transfusion with red blood cells (RBC) may be needed during hip revision surgery but the appropriate haemoglobin concentration (Hb) threshold for transfusion has not been well established. We hypothesized that a higher transfusion threshold would improve ambulation after hip revision...... surgery. METHODS: The trial was registered at Clinicaltrials.gov ( NCT00906295). Sixty-six patients aged 18 years or older undergoing hip revision surgery were randomized to receive RBC at a Hb threshold of either 7.3 g/dL (restrictive group) or 8.9 g/dL (liberal group). Postoperative ambulation...... received RBC. CONCLUSIONS: A Hb transfusion threshold of 8.9 g/dL was associated with a statistically significantly faster TUG after hip revision surgery compared to a threshold of 7.3 g/dL but the clinical importance is questionable and the groups did not differ in Hb at the time of testing....

  5. Change Detection by Fusing Advantages of Threshold and Clustering Methods

    Science.gov (United States)

    Tan, M.; Hao, M.

    2017-09-01

    In change detection (CD) of medium-resolution remote sensing images, threshold and clustering methods are two of the most popular approaches. It is found that the threshold method based on the expectation-maximization (EM) algorithm usually generates a CD map containing many false alarms while detecting almost all changes, whereas the fuzzy local information c-means algorithm (FLICM) obtains a homogeneous CD map but with some missed detections. Therefore, we aim to design a framework that improves CD results by fusing the advantages of the threshold and clustering methods. Experimental results indicate the effectiveness of the proposed method.
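    The abstract does not spell out the fusion rule, but the motivation — a permissive threshold map combined with a conservative clustering map — can be illustrated on a toy 1-D difference image. The global-mean threshold and the plain 2-means below are simplified stand-ins for the EM threshold and FLICM, and intersecting the two maps is one plausible fusion rule, not necessarily the paper's:

```python
# Toy "difference image": 50 unchanged pixels (~0.1), two noisy pixels (0.4),
# and ten genuinely changed pixels (0.9).
diff = [0.1] * 50 + [0.4] * 2 + [0.9] * 10

# 1) Threshold detector (stand-in for the EM threshold): flag everything above
#    the global mean -- catches all real changes plus the noisy false alarms.
t = sum(diff) / len(diff)
thr_map = [v > t for v in diff]

# 2) Clustering detector (stand-in for FLICM): 1-D 2-means, changed class = the
#    cluster with the larger center -- homogeneous but can miss weak changes.
c0, c1 = min(diff), max(diff)
for _ in range(20):
    lo = [v for v in diff if abs(v - c0) <= abs(v - c1)]
    hi = [v for v in diff if abs(v - c0) > abs(v - c1)]
    c0, c1 = sum(lo) / len(lo), sum(hi) / len(hi)
clu_map = [abs(v - c0) > abs(v - c1) for v in diff]

# 3) Fusion: keep only the pixels both detectors agree on, suppressing the
#    threshold method's false alarms.
fused = [a and b for a, b in zip(thr_map, clu_map)]
```

    On this toy data the threshold map flags 12 pixels (including the two noisy ones), the clustering map flags the 10 real changes, and the fused map keeps exactly those 10.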

  7. DOE approach to threshold quantities

    International Nuclear Information System (INIS)

    Wickham, L.E.; Kluk, A.F.; Department of Energy, Washington, DC)

    1985-01-01

    The Department of Energy (DOE) is developing the concept of threshold quantities for use in determining which waste materials must be handled as radioactive waste and which may be disposed of as nonradioactive waste at its sites. Waste above this concentration level would be managed as radioactive or mixed waste (if hazardous chemicals are present); waste below this level would be handled as sanitary waste. Ideally, the threshold must be set high enough to significantly reduce the amount of waste requiring special handling, yet low enough that waste at the threshold quantity poses a very small health risk and multiple exposures to such waste would still constitute a small health risk. It should also be practical to segregate waste above or below the threshold quantity using available instrumentation. Guidance is being prepared to aid DOE sites in establishing threshold quantity values based on pathways analysis using site-specific parameters (waste stream characteristics, maximum exposed individual, population considerations, and site-specific factors such as rainfall). A guidance dose of between 0.001 and 1.0 mSv/y (0.1 to 100 mrem/y) was recommended, with 0.3 mSv/y (30 mrem/y) selected as the guidance dose upon which to base calculations. Several tasks were identified, beginning with the selection of a suitable pathway model for relating dose to the concentration of radioactivity in the waste. Threshold concentrations corresponding to the guidance dose were determined for waste disposal sites at a selected humid site and a selected arid site. Finally, cost-benefit considerations at the example sites were addressed. The results of the various tasks are summarized, and the relationship of this effort to related developments at other agencies is discussed.

  8. Thresholds in chemical respiratory sensitisation.

    Science.gov (United States)

    Cochrane, Stella A; Arts, Josje H E; Ehnes, Colin; Hindle, Stuart; Hollnagel, Heli M; Poole, Alan; Suto, Hidenori; Kimber, Ian

    2015-07-03

    There is a continuing interest in determining whether it is possible to identify thresholds for chemical allergy. Here, allergic sensitisation of the respiratory tract by chemicals is considered in this context. This is an important occupational health problem, being associated with rhinitis and asthma, and in addition it provides toxicologists and risk assessors with a number of challenges. In common with all forms of allergic disease, chemical respiratory allergy develops in two phases. In the first (induction) phase, exposure to a chemical allergen (by an appropriate route of exposure) causes immunological priming and sensitisation of the respiratory tract. The second (elicitation) phase is triggered if a sensitised subject is subsequently exposed to the same chemical allergen via inhalation; a secondary immune response will be provoked in the respiratory tract, resulting in inflammation and the signs and symptoms of a respiratory hypersensitivity reaction. In this article, attention is focused on the identification of threshold values during the acquisition of sensitisation. Current mechanistic understanding of allergy is such that it can be assumed that the development of sensitisation (and also the elicitation of an allergic reaction) is a threshold phenomenon; there will be levels of exposure below which sensitisation will not be acquired. That is, all immune responses, including allergic sensitisation, have a threshold requirement for the availability of antigen/allergen, below which a response will fail to develop. The issue addressed here is whether there are methods available, or clinical/epidemiological data, that permit the identification of such thresholds. This document briefly reviews relevant human studies of occupational asthma, and experimental models that have been developed (or are being developed) for the identification and characterisation of chemical respiratory allergens. The main conclusion drawn is that although there is evidence that the

  9. On computational Gestalt detection thresholds.

    Science.gov (United States)

    Grompone von Gioi, Rafael; Jakubowicz, Jérémie

    2009-01-01

    The aim of this paper is to show some recent developments of computational Gestalt theory, as pioneered by Desolneux, Moisan and Morel. The new results allow the detection thresholds to be predicted much more accurately. This step is unavoidable if one wants to analyze visual detection thresholds in the light of computational Gestalt theory. The paper first recalls the main elements of computational Gestalt theory. It points out a precision issue in this theory, essentially due to the use of discrete probability distributions. It then proposes to overcome this issue by using continuous probability distributions, and illustrates the approach on the meaningful alignment detector of Desolneux et al.
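    In the a-contrario framework of Desolneux, Moisan and Morel, a structure is declared "meaningful" when its expected number of chance occurrences under a background noise model — the number of false alarms (NFA) — falls below 1. The sketch below uses the commonly presented form for the alignment detector; the N⁴ count of candidate segments and the orientation precision p = 1/16 are assumptions of that standard presentation, not details from this abstract:

```python
from math import comb

def binom_tail(n, k, p):
    """P[S_n >= k] for S_n ~ Binomial(n, p): the discrete tail used by the
    a-contrario detectors."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def nfa_alignment(N, n, k, p=1/16):
    """Number of false alarms for a candidate segment with k aligned points
    out of n sampled points in an N x N image. N**4 bounds the number of
    candidate segments (one per ordered pair of pixels)."""
    return N**4 * binom_tail(n, k, p)

# Many aligned points out of few samples -> small tail -> meaningful (NFA < 1);
# few aligned points -> the same segment is easily explained by chance.
```

    The precision issue mentioned in the abstract arises because this tail is a step function of k; replacing the discrete model with a continuous one smooths the resulting thresholds.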

  10. Nuclear threshold effects and neutron strength function

    International Nuclear Information System (INIS)

    Hategan, Cornel; Comisel, Horia

    2003-01-01

    It is shown that a nuclear threshold effect depends, via the neutron strength function, on the spectroscopy of the ancestral neutron threshold state, and that the magnitude of the effect is proportional to the neutron strength function. Evidence for the relation of nuclear threshold effects to neutron strength functions is obtained from the isotopic threshold effect and the deuteron stripping threshold anomaly; empirical and computational analysis of these two effects demonstrates their close relationship to neutron strength functions in their dependence on mass number. It is thus established that nuclear threshold effects depend, in addition to genuine nuclear reaction mechanisms, on the spectroscopy of the (ancestral) neutron threshold state. This result also constitutes a proof that the origins of these threshold effects are neutron single-particle states at zero energy. (author)

  11. Threshold models of technological transitions

    NARCIS (Netherlands)

    Zeppini, P.; Frenken, K.; Kupers, R.

    2014-01-01

    We present a systematic review of seven threshold models of technological transitions from physics, biology, economics and sociology. The very same phenomenon of a technological transition can be explained by very different logics, ranging from economic explanations based on price, performance and

  12. Risk thresholds for alcohol consumption

    DEFF Research Database (Denmark)

    Wood, Angela M; Kaptoge, Stephen; Butterworth, Adam S

    2018-01-01

    BACKGROUND: Low-risk limits recommended for alcohol consumption vary substantially across different national guidelines. To define thresholds associated with lowest risk for all-cause mortality and cardiovascular disease, we studied individual-participant data from 599 912 current drinkers withou...

  13. Weights of Exact Threshold Functions

    DEFF Research Database (Denmark)

    Babai, László; Hansen, Kristoffer Arnsfelt; Podolskii, Vladimir V.

    2010-01-01

    We consider Boolean exact threshold functions defined by linear equations, and in general degree d polynomials. We give upper and lower bounds on the maximum magnitude (absolute value) of the coefficients required to represent such functions. These bounds are very close and in the linear case in ...... leave a substantial gap, a challenge for future work....
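    In the linear case, an exact threshold function fires only when the weighted sum of its inputs hits the threshold exactly, rather than meeting or exceeding it as in an ordinary threshold function. A minimal sketch (the weights and threshold below are an arbitrary illustration):

```python
def exact_threshold(weights, t):
    """Boolean exact threshold function: f(x) = 1 iff sum_i w_i * x_i == t."""
    def f(x):
        return int(sum(w * xi for w, xi in zip(weights, x)) == t)
    return f

# With unit weights and t = 2, the function fires only when exactly two of
# the three Boolean inputs are set.
f = exact_threshold([1, 1, 1], 2)
assert f((1, 1, 0)) == 1
assert f((1, 1, 1)) == 0   # three inputs set: over the threshold, so 0
```

    The bounds discussed in the abstract concern how large the integer weights w_i must be allowed to grow to represent every such function.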

  14. Threshold quantities for helminth infections

    NARCIS (Netherlands)

    Heesterbeek, J.A.P.; Roberts, M.G.

    1995-01-01

    For parasites with a clearly defined life-cycle we give threshold quantities that determine the stability of the parasite-free steady state for autonomous and periodic deterministic systems formulated in terms of mean parasite burdens. We discuss the biological interpretations of the quantities, how

  15. Percolation Threshold Parameters of Fluids

    Czech Academy of Sciences Publication Activity Database

    Škvor, J.; Nezbeda, Ivo

    2009-01-01

    Roč. 79, č. 4 (2009), 041141-041147 ISSN 1539-3755 Institutional research plan: CEZ:AV0Z40720504 Keywords : percolation threshold * universality * infinite cluster Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 2.400, year: 2009

  16. Threshold enhancement of diphoton resonances

    Directory of Open Access Journals (Sweden)

    Aoife Bharucha

    2016-10-01

    Full Text Available We revisit a mechanism to enhance the decay width of (pseudo-)scalar resonances to photon pairs when the process is mediated by loops of charged fermions produced near threshold. Motivated by the recent LHC data, indicating the presence of an excess in the diphoton spectrum at approximately 750 GeV, we illustrate this threshold enhancement mechanism in the case of a 750 GeV pseudoscalar boson A with a two-photon decay mediated by a charged and uncolored fermion having a mass close to the MA/2 threshold and a small decay width, <1 MeV. The implications of such a threshold enhancement are discussed in two explicit scenarios: (i) the Minimal Supersymmetric Standard Model, in which the A state is produced via the top quark mediated gluon fusion process and decays into photons predominantly through loops of charginos with masses close to MA/2, and (ii) a two Higgs doublet model, in which A is again produced by gluon fusion but decays into photons through loops of vector-like charged heavy leptons. In both of these scenarios, while the mass of the charged fermion has to be adjusted to be extremely close to half of the A resonance mass, the small total widths are naturally obtained if only suppressed three-body decay channels occur. Finally, the implications of some of these scenarios for dark matter are discussed.

  17. Crossing Thresholds in Academic Reading

    Science.gov (United States)

    Abbott, Rob

    2013-01-01

    This paper looks at the conceptual thresholds in relation to academic reading which might be crossed by undergraduate English Literature students. It is part of a wider study following 16 students through three years of undergraduate study. It uses theoretical ideas from Bakhtin and Foucault to analyse interviews with English lecturers. It…

  18. Quantifying ecological thresholds from response surfaces

    Science.gov (United States)

    Heather E. Lintz; Bruce McCune; Andrew N. Gray; Katherine A. McCulloh

    2011-01-01

    Ecological thresholds are abrupt changes of ecological state. While an ecological threshold is a widely accepted concept, most empirical methods detect them in time or across geographic space. Although useful, these approaches do not quantify the direct drivers of threshold response. Causal understanding of thresholds detected empirically requires their investigation...

  19. Swimming training at anaerobic threshold intensity improves the functional fitness of older rats

    Directory of Open Access Journals (Sweden)

    Verusca Najara de Carvalho Cunha

    2008-12-01

    Full Text Available The effects of aerobic training at an intensity corresponding to the lactate threshold (LL) were analyzed in 15 older rats (~448 days old). The trained (n=9) and control (n=6) groups were submitted to an incremental exercise test before and after four weeks of training. The incremental test consisted of an initial load of 1% of body weight (BW) with increments of 1% every three minutes, with blood lactate measured to identify the LL by visual inspection of the inflection point of the curve. The training program consisted of 30 minutes of swimming per day, five days per week, with a load of 5% BW; the control group did not exercise. A significant increase in LL intensity was observed after training (pre = 4.5 ± 1.1 vs. post = 5.4 ± 0.9% BW). The maximal load reached at the end of the incremental test increased significantly from 39.7 ± 7.5 g before to 48.4 ± 10.5 g after training, with no change in the control group (44.7 ± 8 vs. 45.3 ± 9.3 g). Body weight of the trained group did not change as a result of the four weeks of swimming at LL intensity (641.0 ± 62.0 to 636.0 ± 72.7 g; p>0.05), whereas the untrained group gained weight significantly (614.0 ± 8.0 to 643.0 ± 74.1 g). The maximal load reached, expressed in both absolute and relative (%BW) values, increased significantly after training. It is concluded that four weeks of swimming training at an intensity corresponding to the lactate threshold improved aerobic fitness and maintained body weight in older rats.

  20. Perceptual learning: psychophysical thresholds and electrical brain topography.

    Science.gov (United States)

    Skrandies, W; Jedynak, A; Fahle, M

    2001-06-01

We studied perceptual learning by determining psychophysical discrimination thresholds for visual hyperacuity targets (vernier stimuli) as a function of stimulus orientation. One aim was to relate perceptual improvements to changes of electrophysiological activity of the human brain. A group of 43 healthy adults participated in a psychophysical experiment where vernier thresholds for vertical and horizontal vernier targets were compared. In 16 subjects thresholds were measured for each orientation twice at an interval of 25 min. Between threshold estimations, evoked brain activity was recorded from 30 electrodes over the occipital brain areas while the subjects observed appearance and disappearance of supra-threshold vernier offsets. Mean evoked potentials were computed for the first and second 600 stimulus presentations, and the scalp topography of electrical brain activity was analyzed. Vertically oriented stimuli yielded significantly better performance than horizontal targets, and thresholds were significantly lower in the second half of the experiment, i.e. after prolonged viewing of stimuli. The improvements in discrimination performance were specific for stimulus orientation and did not generalize. Learning effects were also observed with electrical brain activity, and field strength of the potentials increased significantly as a function of time. Scalp topography of the evoked components was significantly affected, indicating a shift of activation between different neuronal elements induced by perceptual learning.

  1. An Advanced Encryption Standard Powered Mutual Authentication Protocol Based on Elliptic Curve Cryptography for RFID, Proven on WISP

    Directory of Open Access Journals (Sweden)

    Alaauldin Ibrahim

    2017-01-01

    Full Text Available Information in patients’ medical histories is subject to various security and privacy concerns. Meanwhile, any modification or error in a patient’s medical data may cause serious or even fatal harm. To protect and transfer this valuable and sensitive information in a secure manner, radio-frequency identification (RFID technology has been widely adopted in healthcare systems and is being deployed in many hospitals. In this paper, we propose a mutual authentication protocol for RFID tags based on elliptic curve cryptography and advanced encryption standard. Unlike existing authentication protocols, which only send the tag ID securely, the proposed protocol could also send the valuable data stored in the tag in an encrypted pattern. The proposed protocol is not simply a theoretical construct; it has been coded and tested on an experimental RFID tag. The proposed scheme achieves mutual authentication in just two steps and satisfies all the essential security requirements of RFID-based healthcare systems.

  2. Analysis of the width-w non-adjacent form in conjunction with hyperelliptic curve cryptography and with lattices☆

    Science.gov (United States)

    Krenn, Daniel

    2013-01-01

    In this work the number of occurrences of a fixed non-zero digit in the width-w non-adjacent forms of all elements of a lattice in some region (e.g. a ball) is analysed. As bases, expanding endomorphisms with eigenvalues of the same absolute value are allowed. Applications of the main result are on numeral systems with an algebraic integer as base. Those come from efficient scalar multiplication methods (Frobenius-and-add methods) in hyperelliptic curves cryptography, and the result is needed for analysing the running time of such algorithms. The counting result itself is an asymptotic formula, where its main term coincides with the full block length analysis. In its second order term a periodic fluctuation is exhibited. The proof follows Delange’s method. PMID:23805020
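The width-w non-adjacent form itself is easy to compute in the ordinary base-2 case. The sketch below is a standard textbook digit-recoding algorithm, not code from the paper; it illustrates the digit set and the non-adjacency property whose digit counts the analysis concerns.

```python
def width_w_naf(n, w):
    """Width-w non-adjacent form of a non-negative integer n (base 2).

    Digits are returned least-significant first; every non-zero digit is
    odd with absolute value below 2**(w-1), and any w consecutive digits
    contain at most one non-zero entry.
    """
    digits = []
    while n != 0:
        if n % 2 == 1:
            d = n % (1 << w)              # residue mod 2^w ...
            if d >= (1 << (w - 1)):       # ... shifted into (-2^(w-1), 2^(w-1))
                d -= 1 << w
            n -= d                        # n is now divisible by 2^w
        else:
            d = 0
        digits.append(d)
        n //= 2
    return digits
```

For example, `width_w_naf(7, 2)` returns `[-1, 0, 0, 1]`, i.e. 7 = -1 + 2**3; recodings like this are what make Frobenius-and-add scalar multiplication cheap, since each non-zero digit costs one precomputed-point addition.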

  3. Number Theory in Science and Communication With Applications in Cryptography, Physics, Digital Information, Computing, and Self-Similarity

    CERN Document Server

    Schroeder, Manfred

    2009-01-01

    "Number Theory in Science and Communication" is a well-known introduction for non-mathematicians to this fascinating and useful branch of applied mathematics . It stresses intuitive understanding rather than abstract theory and highlights important concepts such as continued fractions, the golden ratio, quadratic residues and Chinese remainders, trapdoor functions, pseudoprimes and primitive elements. Their applications to problems in the real world are one of the main themes of the book. This revised fifth edition is augmented by recent advances in coding theory, permutations and derangements and a chapter in quantum cryptography. From reviews of earlier editions – "I continue to find [Schroeder’s] Number Theory a goldmine of valuable information. It is a marvellous book, in touch with the most recent applications of number theory and written with great clarity and humor.’ Philip Morrison (Scientific American) "A light-hearted and readable volume with a wide range of applications to which the author ha...

  4. Multiratio fusion change detection with adaptive thresholding

    Science.gov (United States)

    Hytla, Patrick C.; Balster, Eric J.; Vasquez, Juan R.; Neuroth, Robert M.

    2017-04-01

    A ratio-based change detection method known as multiratio fusion (MRF) is proposed and tested. The MRF framework builds on other change detection components proposed in this work: dual ratio (DR) and multiratio (MR). The DR method involves two ratios coupled with adaptive thresholds to maximize detected changes and minimize false alarms. The use of two ratios is shown to outperform the single ratio case when the means of the image pairs are not equal. MR change detection builds on the DR method by including negative imagery to produce four total ratios with adaptive thresholds. Inclusion of negative imagery is shown to improve detection sensitivity and to boost detection performance in certain target and background cases. MRF further expands this concept by fusing together the ratio outputs using a routine in which detections must be verified by two or more ratios to be classified as a true changed pixel. The proposed method is tested with synthetically generated test imagery and real datasets with results compared to other methods found in the literature. DR is shown to significantly outperform the standard single ratio method. MRF produces excellent change detection results that exhibit up to a 22% performance improvement over other methods from the literature at low false-alarm rates.
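The dual-ratio idea can be illustrated in a few lines. The toy sketch below works on flat pixel lists and sets each adaptive threshold at mean + k·std of the corresponding ratio image; the constant k, the epsilon guard, and the flat representation are illustrative assumptions, not details from the paper.

```python
def dual_ratio_change_map(img1, img2, k=2.0, eps=1e-6):
    """Toy dual-ratio change detection: a pixel is flagged when either
    ratio image exceeds an adaptive threshold of mean + k * std of that
    ratio image (k and eps are illustrative, not the paper's values)."""
    def mean_std(xs):
        m = sum(xs) / len(xs)
        return m, (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

    r12 = [(a + eps) / (b + eps) for a, b in zip(img1, img2)]  # bright -> dark
    r21 = [(b + eps) / (a + eps) for a, b in zip(img1, img2)]  # dark -> bright
    m12, s12 = mean_std(r12)
    m21, s21 = mean_std(r21)
    t12, t21 = m12 + k * s12, m21 + k * s21                    # adaptive thresholds
    return [int(a > t12 or b > t21) for a, b in zip(r12, r21)]
```

Using the two reciprocal ratios catches both brightening and darkening changes, which is why the single-ratio detector breaks down when the image-pair means differ.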

  5. Elicitation threshold of cobalt chloride

    DEFF Research Database (Denmark)

    Fischer, Louise A; Johansen, Jeanne D; Voelund, Aage

    2016-01-01

    BACKGROUND: Cobalt is a strong skin sensitizer (grade 5 of 5 in the guinea-pig maximization test) that is used in various industrial and consumer applications. To prevent sensitization to cobalt and elicitation of allergic cobalt dermatitis, information about the elicitation threshold level...... of cobalt is important. OBJECTIVE: To identify the dermatitis elicitation threshold levels in cobalt-allergic individuals. MATERIALS AND METHODS: Published patch test dose-response studies were reviewed to determine the elicitation dose (ED) levels in dermatitis patients with a previous positive patch test...... reaction to cobalt. A logistic dose-response model was applied to data collected from the published literature to estimate ED values. The 95% confidence interval (CI) for the ratio of mean doses that can elicit a reaction in 10% (ED(10)) of a population was calculated with Fieller's method. RESULTS...

  6. Scaling behavior of threshold epidemics

    Science.gov (United States)

    Ben-Naim, E.; Krapivsky, P. L.

    2012-05-01

We study the classic Susceptible-Infected-Recovered (SIR) model for the spread of an infectious disease. In this stochastic process, there are two competing mechanisms: infection and recovery. Susceptible individuals may contract the disease from infected individuals, while infected ones recover from the disease at a constant rate and are never infected again. Our focus is the behavior at the epidemic threshold where the rates of the infection and recovery processes balance. In the infinite population limit, we establish analytically scaling rules for the time-dependent distribution functions that characterize the sizes of the infected and the recovered sub-populations. Using heuristic arguments, we also obtain scaling laws for the size and duration of the epidemic outbreaks as a function of the total population. We perform numerical simulations to verify the scaling predictions and discuss the consequences of these scaling laws for near-threshold epidemic outbreaks.
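The competition between the two rates can be simulated directly. The minimal sketch below (an assumed well-mixed discrete-event model, not the authors' code) runs one outbreak until the infected sub-population dies out; the threshold studied in the abstract corresponds to beta == gamma.

```python
import random

def sir_outbreak(n, beta, gamma, seed=None):
    """Minimal stochastic SIR in a well-mixed population of size n with
    one initial infective.  At each event, infection (rate beta*S*I/n)
    competes with recovery (rate gamma*I).  Returns the outbreak size,
    i.e. the number recovered when the epidemic ends."""
    rng = random.Random(seed)
    s, i, r = n - 1, 1, 0
    while i > 0:
        infection_rate = beta * s * i / n
        recovery_rate = gamma * i
        if rng.random() < infection_rate / (infection_rate + recovery_rate):
            s, i = s - 1, i + 1   # a susceptible becomes infected
        else:
            i, r = i - 1, r + 1   # an infective recovers
    return r
```

At the threshold most outbreaks die out quickly, but repeated runs develop the broad outbreak-size distribution whose scaling with total population the paper analyzes.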

  7. Roots at the Percolation Threshold

    Science.gov (United States)

    Kroener, E.; Ahmed, M. A.; Kaestner, A.; Vontobel, P.; Zarebanadkouki, M.; Carminati, A.

    2014-12-01

Much of the carbon assimilated by plants during photosynthesis is lost to the soil via rhizodeposition. One component of rhizodeposition is mucilage, a hydrogel that dramatically alters the soil physical properties. Mucilage was assumed to explain unexpectedly low rhizosphere rewetting rates during irrigation (Carminati et al. 2010) and temporary water repellency in the rhizosphere after severe drying (Moradi et al. 2012). Here, we present an experimental and theoretical study of the rewetting behaviour of a soil mixed with mucilage, which was used as an analogue of the rhizosphere. Our samples were made of two layers of untreated soils separated by a thin layer (ca. 1 mm) of soil treated with mucilage. We prepared soil columns of varying particle size, mucilage concentration and height of the middle layer above the water table. The dry soil columns were re-wetted by capillary rise from the bottom. The rewetting of the middle layer showed a distinct dual behavior. For mucilage concentrations lower than a certain threshold, water could cross the thin layer almost immediately after rewetting of the bulk soil. At slightly higher mucilage concentrations, the thin layer was almost impermeable. The mucilage concentration at the threshold strongly depended on particle size: the smaller the particle size, the larger the soil specific surface and the more mucilage was needed to cover the entire particle surface and to induce water repellency. We applied a classic pore network model to simulate the experimental observations. In the model a certain fraction of nodes were randomly disconnected to reproduce the effect of mucilage in temporarily blocking the flow. The percolation model could qualitatively reproduce the threshold characteristics of the experiments. Our experiments, together with former observations of water dynamics in the rhizosphere, suggest that the rhizosphere is near the percolation threshold, where small variations in mucilage concentration sensitively
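The randomly-disconnected-node model described above behaves like ordinary site percolation. The sketch below is a generic site-percolation analogue (a square grid and BFS connectivity check, assumed here rather than taken from the study): blocked nodes stand in for mucilage-plugged pores, and the question is whether water can still cross the layer.

```python
import random
from collections import deque

def percolates(n, p_block, seed=None):
    """Toy pore-network analogue: on an n x n grid each node is blocked
    with probability p_block; return whether an open 4-connected path
    links the top row to the bottom row."""
    rng = random.Random(seed)
    is_open = [[rng.random() >= p_block for _ in range(n)] for _ in range(n)]
    queue = deque((0, x) for x in range(n) if is_open[0][x])
    seen = set(queue)
    while queue:
        y, x = queue.popleft()
        if y == n - 1:
            return True                      # reached the bottom row
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < n and 0 <= nx < n and is_open[ny][nx] and (ny, nx) not in seen:
                seen.add((ny, nx))
                queue.append((ny, nx))
    return False
```

For a large square lattice the site-percolation threshold sits near an open fraction of 0.59, so around p_block ≈ 0.41 small changes in the blocked fraction flip the outcome, mirroring the sharp permeable/impermeable transition observed in the layered columns.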

  8. Realistic Realizations Of Threshold Circuits

    Science.gov (United States)

    Razavi, Hassan M.

    1987-08-01

Threshold logic, in which each input is weighted, has many theoretical advantages over the standard gate realization, such as reducing the number of gates, interconnections, and power dissipation. However, because of the difficult synthesis procedure and complicated circuit implementation, its use in the design of digital systems is almost nonexistent. In this study, three methods of NMOS realizations are discussed, and their advantages and shortcomings are explored. Also, the possibility of using the methods to realize multi-valued logic is examined.
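As a concrete illustration of the weighted-input model (a generic software sketch, not one of the NMOS realizations discussed), a single threshold gate covers several standard Boolean functions just by moving the threshold:

```python
def threshold_gate(inputs, weights, theta):
    """Threshold-logic gate: fire (output 1) iff the weighted input sum
    reaches the threshold theta."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= theta)

# With unit weights on three binary inputs, one gate realizes OR,
# majority (the full-adder carry), or AND depending only on theta:
OR  = lambda a, b, c: threshold_gate((a, b, c), (1, 1, 1), 1)
MAJ = lambda a, b, c: threshold_gate((a, b, c), (1, 1, 1), 2)
AND = lambda a, b, c: threshold_gate((a, b, c), (1, 1, 1), 3)
```

Realizing MAJ with ordinary two-input gates needs several of them, which is the gate-count saving the abstract refers to.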

  9. Root finding with threshold circuits

    Czech Academy of Sciences Publication Activity Database

    Jeřábek, Emil

    2012-01-01

    Roč. 462, Nov 30 (2012), s. 59-69 ISSN 0304-3975 R&D Projects: GA AV ČR IAA100190902; GA MŠk(CZ) 1M0545 Institutional support: RVO:67985840 Keywords : root finding * threshold circuit * power series Subject RIV: BA - General Mathematics Impact factor: 0.489, year: 2012 http://www.sciencedirect.com/science/article/pii/S0304397512008006#

  10. Color difference thresholds in dentistry.

    Science.gov (United States)

    Paravina, Rade D; Ghinea, Razvan; Herrera, Luis J; Bona, Alvaro D; Igiel, Christopher; Linninger, Mercedes; Sakai, Maiko; Takahashi, Hidekazu; Tashkandi, Esam; Perez, Maria del Mar

    2015-01-01

The aim of this prospective multicenter study was to determine the 50:50% perceptibility threshold (PT) and 50:50% acceptability threshold (AT) of dental ceramic under simulated clinical settings. The spectral radiance of 63 monochromatic ceramic specimens was determined using a non-contact spectroradiometer. A total of 60 specimen pairs, divided into 3 sets of 20 specimen pairs (medium to light shades, medium to dark shades, and dark shades), were selected for the psychophysical experiment. The coordinating center and seven research sites obtained Institutional Review Board (IRB) approvals prior to the beginning of the experiment. Each research site had 25 observers, divided into five groups of five observers: dentists-D, dental students-S, dental auxiliaries-A, dental technicians-T, and lay persons-L. There were 35 observers per group (five observers per group at each site × 7 sites), for a total of 175 observers. Visual color comparisons were performed using a viewing booth. Takagi-Sugeno-Kang (TSK) fuzzy approximation was used for fitting the data points. The 50:50% PT and 50:50% AT were determined in CIELAB and CIEDE2000. The t-test was used to evaluate the statistical significance of threshold differences. The CIELAB 50:50% PT was ΔEab = 1.2, whereas the 50:50% AT was ΔEab = 2.7. Corresponding CIEDE2000 (ΔE00) values were 0.8 and 1.8, respectively. The 50:50% PT by observer group revealed differences among groups D, A, T, and L as compared with the 50:50% PT for all observers. The 50:50% AT for all observers was statistically different from the 50:50% AT in groups T and L. The 50:50% perceptibility and acceptability thresholds were significantly different, as were the two color difference formulas (ΔE00/ΔEab). Observer groups and sites showed a high level of statistical difference in all thresholds. Visual color difference thresholds can serve as a quality control tool to guide the selection of esthetic dental materials, evaluate clinical performance, and
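The CIELAB metric behind these thresholds is the CIE76 colour difference, a plain Euclidean distance in L*a*b* space (CIEDE2000 adds much more elaborate weighting and is omitted here). A minimal sketch that classifies a specimen pair against the study's reported 50:50% thresholds:

```python
def delta_e_ab(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in CIELAB space."""
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

def classify_pair(lab1, lab2, pt=1.2, at=2.7):
    """Compare a colour difference against the study's 50:50%
    perceptibility (PT) and acceptability (AT) thresholds in CIELAB
    units."""
    de = delta_e_ab(lab1, lab2)
    if de <= pt:
        return "below perceptibility"
    if de <= at:
        return "perceptible but acceptable"
    return "unacceptable"
```

For example, two shades differing by Δa* = 3 and Δb* = 4 give ΔEab = 5.0, well past the 2.7 acceptability cut.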

  11. Improvement in Brightness Uniformity by Compensating for the Threshold Voltages of Both the Driving Thin-Film Transistor and the Organic Light-Emitting Diode for Active-Matrix Organic Light-Emitting Diode Displays

    Directory of Open Access Journals (Sweden)

    Ching-Lin Fan

    2014-01-01

    Full Text Available This paper proposes a novel pixel circuit design and driving method for active-matrix organic light-emitting diode (AM-OLED displays that use low-temperature polycrystalline-silicon thin-film transistors (LTPS-TFTs as driving element. The automatic integrated circuit modeling simulation program with integrated circuit emphasis (AIM-SPICE simulator was used to verify that the proposed pixel circuit, which comprises five transistors and one capacitor, can supply uniform output current. The voltage programming method of the proposed pixel circuit comprises three periods: reset, compensation with data input, and emission periods. The simulated results reflected excellent performance. For instance, when ΔVTH=±0.33 V, the average error rate of the OLED current variation was low (<0.8%, and when ΔVTH_OLED=+0.33 V, the error rate of the OLED current variation was 4.7%. Moreover, when the I×R (current × resistance drop voltage of a power line was 0.3 V, the error rate of the OLED current variation was 5.8%. The simulated results indicated that the proposed pixel circuit exhibits high immunity to the threshold voltage deviation of both the driving poly-Si TFTs and OLEDs, and simultaneously compensates for the I×R drop voltage of a power line.

  12. Anaerobic threshold: its concept and role in endurance sport.

    Science.gov (United States)

    Ghosh, Asok Kumar

    2004-01-01

The aerobic to anaerobic transition intensity is one of the most significant physiological variables in endurance sports. Scientists have described it in various ways: Lactate Threshold, Ventilatory Anaerobic Threshold, Onset of Blood Lactate Accumulation, Onset of Plasma Lactate Accumulation, Heart Rate Deflection Point and Maximum Lactate Steady State. All of these play a great role both in monitoring training schedules and in determining sports performance. Individuals endowed with a high attainable oxygen uptake need to complement it with a rigorous training program in order to achieve maximal performance. If they engage in endurance events, they must also develop the ability to sustain a high fractional utilization of their maximal oxygen uptake (%VO(2) max) and become physiologically efficient in performing their activity. The anaerobic threshold is more highly correlated with distance running performance than maximum aerobic capacity or VO(2) max, because sustaining a high fractional utilization of VO(2) max for a long time delays metabolic acidosis. Training at or slightly above the anaerobic threshold intensity improves both the aerobic capacity and the anaerobic threshold level. The anaerobic threshold can also be determined from the speed-heart rate relationship in the field, without sophisticated laboratory techniques. However, controversies remain among scientists regarding its role in high-performance sports.

  13. Optimizing Systems of Threshold Detection Sensors

    National Research Council Canada - National Science Library

    Banschbach, David C

    2008-01-01

    .... Below the threshold all signals are ignored. We develop a mathematical model for setting individual sensor thresholds to obtain optimal probability of detecting a significant event, given a limit on the total number of false positives allowed...

  14. Nuclear thermodynamics below particle threshold

    International Nuclear Information System (INIS)

    Schiller, A.; Agvaanluvsan, U.; Algin, E.; Bagheri, A.; Chankova, R.; Guttormsen, M.; Hjorth-Jensen, M.; Rekstad, J.; Siem, S.; Sunde, A. C.; Voinov, A.

    2005-01-01

From a starting point of experimentally measured nuclear level densities, we discuss thermodynamical properties of nuclei below the particle emission threshold. Since nuclei are essentially mesoscopic systems, a straightforward generalization of macroscopic ensemble theory often yields unphysical results. A careful critique of traditional thermodynamical concepts reveals problems commonly encountered in mesoscopic systems: one is the fact that microcanonical and canonical ensemble theory yield different results; another concerns the introduction of temperature for small, closed systems. Finally, the concept of phase transitions is investigated for mesoscopic systems

  15. Level reduction and the quantum threshold theorem

    Science.gov (United States)

    Aliferis, Panagiotis (Panos)

    quantum threshold theorem for coherent and leakage noise and for quantum computation by measurements. In addition, the proof provides a methodology which allows us to establish improved rigorous lower bounds on the value of the quantum accuracy threshold.

  16. Threshold Concepts in Finance: Student Perspectives

    Science.gov (United States)

    Hoadley, Susan; Kyng, Tim; Tickle, Leonie; Wood, Leigh N.

    2015-01-01

    Finance threshold concepts are the essential conceptual knowledge that underpin well-developed financial capabilities and are central to the mastery of finance. In this paper we investigate threshold concepts in finance from the point of view of students, by establishing the extent to which students are aware of threshold concepts identified by…

  17. Threshold enhancement of diphoton resonances

    CERN Document Server

    Bharucha, Aoife; Goudelis, Andreas

    2016-10-10

    The data collected by the LHC collaborations at an energy of 13 TeV indicates the presence of an excess in the diphoton spectrum that would correspond to a resonance of a 750 GeV mass. The apparently large production cross section is nevertheless very difficult to explain in minimal models. We consider the possibility that the resonance is a pseudoscalar boson $A$ with a two--photon decay mediated by a charged and uncolored fermion having a mass at the $\\frac12 M_A$ threshold and a very small decay width, $\\ll 1$ MeV; one can then generate a large enhancement of the $A\\gamma\\gamma$ amplitude which explains the excess without invoking a large multiplicity of particles propagating in the loop, large electric charges and/or very strong Yukawa couplings. The implications of such a threshold enhancement are discussed in two explicit scenarios: i) the Minimal Supersymmetric Standard Model in which the $A$ state is produced via the top quark mediated gluon fusion process and decays into photons predominantly through...

  18. Dynamic-Threshold-Limited Timed-Token (DTLTT) Protocol | Kalu ...

    African Journals Online (AJOL)

An improved version of the Static-Threshold-Limited On-Demand Guaranteed Service Timed-Token (STOGSTT) Media Access Control (MAC) protocol for channel capacity allocation to the asynchronous traffic in Multiservice Local Area Networks (MLANs) was developed and analyzed. TLODGSTT protocol uses a static value of ...

  19. An integrative perspective of the anaerobic threshold.

    Science.gov (United States)

    Sales, Marcelo Magalhães; Sousa, Caio Victor; da Silva Aguiar, Samuel; Knechtle, Beat; Nikolaidis, Pantelis Theodoros; Alves, Polissandro Mortoza; Simões, Herbert Gustavo

    2017-12-14

The concept of the anaerobic threshold (AT) was introduced during the nineteen-sixties. Since then, several methods to identify the AT have been studied and suggested as novel 'thresholds' named after the variable used for their detection (i.e. lactate threshold, ventilatory threshold, glucose threshold). These different techniques have brought some confusion about how we should name this parameter: for instance, the anaerobic threshold or the physiological measure used (i.e. lactate, ventilation). On the other hand, the modernization of scientific methods and apparatus to detect the AT, as well as the body of literature formed in the past decades, could provide a more cohesive understanding of the AT and the multiple physiological systems involved. Thus, the purpose of this review was to provide an integrative perspective of the methods to determine the AT. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Epidemic thresholds for bipartite networks

    Science.gov (United States)

    Hernández, D. G.; Risau-Gusman, S.

    2013-11-01

    It is well known that sexually transmitted diseases (STD) spread across a network of human sexual contacts. This network is most often bipartite, as most STD are transmitted between men and women. Even though network models in epidemiology have quite a long history now, there are few general results about bipartite networks. One of them is the simple dependence, predicted using the mean field approximation, between the epidemic threshold and the average and variance of the degree distribution of the network. Here we show that going beyond this approximation can lead to qualitatively different results that are supported by numerical simulations. One of the new features, that can be relevant for applications, is the existence of a critical value for the infectivity of each population, below which no epidemics can arise, regardless of the value of the infectivity of the other population.

  1. Detection thresholds of macaque otolith afferents.

    Science.gov (United States)

    Yu, Xiong-Jie; Dickman, J David; Angelaki, Dora E

    2012-06-13

    The vestibular system is our sixth sense and is important for spatial perception functions, yet the sensory detection and discrimination properties of vestibular neurons remain relatively unexplored. Here we have used signal detection theory to measure detection thresholds of otolith afferents using 1 Hz linear accelerations delivered along three cardinal axes. Direction detection thresholds were measured by comparing mean firing rates centered on response peak and trough (full-cycle thresholds) or by comparing peak/trough firing rates with spontaneous activity (half-cycle thresholds). Thresholds were similar for utricular and saccular afferents, as well as for lateral, fore/aft, and vertical motion directions. When computed along the preferred direction, full-cycle direction detection thresholds were 7.54 and 3.01 cm/s(2) for regular and irregular firing otolith afferents, respectively. Half-cycle thresholds were approximately double, with excitatory thresholds being half as large as inhibitory thresholds. The variability in threshold among afferents was directly related to neuronal gain and did not depend on spike count variance. The exact threshold values depended on both the time window used for spike count analysis and the filtering method used to calculate mean firing rate, although differences between regular and irregular afferent thresholds were independent of analysis parameters. The fact that minimum thresholds measured in macaque otolith afferents are of the same order of magnitude as human behavioral thresholds suggests that the vestibular periphery might determine the limit on our ability to detect or discriminate small differences in head movement, with little noise added during downstream processing.

  2. Projection–Based Text Line Segmentation with a Variable Threshold

    Directory of Open Access Journals (Sweden)

    Ptak Roman

    2017-03-01

    Full Text Available Document image segmentation into text lines is one of the stages in unconstrained handwritten document recognition. This paper presents a new algorithm for text line separation in handwriting. The developed algorithm is based on a method using the projection profile. It employs thresholding, but the threshold value is variable. This permits determination of low or overlapping peaks of the graph. The proposed technique is shown to improve the recognition rate relative to traditional methods. The algorithm is robust in text line detection with respect to different text line lengths.
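The variable-threshold idea can be sketched compactly: compare each row's projection value against a locally computed threshold instead of one global cut. The window size and the k factor below are illustrative assumptions, not the paper's parameters.

```python
def segment_lines(binary_img, window=3, k=0.5):
    """Projection-profile text line segmentation with a variable
    threshold: each row's ink count is compared against k times the mean
    profile over a local window, so low or overlapping peaks can still
    be separated (window and k are illustrative)."""
    profile = [sum(row) for row in binary_img]   # horizontal projection
    lines, start = [], None
    for y, v in enumerate(profile):
        lo, hi = max(0, y - window), min(len(profile), y + window + 1)
        local_thr = k * sum(profile[lo:hi]) / (hi - lo)
        if v > local_thr and start is None:
            start = y                            # a text line begins
        elif v <= local_thr and start is not None:
            lines.append((start, y))             # the text line ends
            start = None
    if start is not None:
        lines.append((start, len(profile)))
    return lines
```

Each returned pair is a half-open row range `(top, bottom)` of one detected line; a fixed global threshold would instead need a single value that works for both dense and faint lines at once.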

  3. Visual evoked potential and psychophysical contrast thresholds in glaucoma.

    Science.gov (United States)

    Abdullah, Siti Nurliyana; Sanderson, Gordon F; James, Andrew C; Vaegan; Maddess, Ted

    2014-04-01

We compared the diagnostic power of electrophysiologically and psychophysically measured contrast thresholds for the diagnosis of glaucoma. Additionally, we investigated whether combining results from the two methods improved diagnostic power. Seventy-eight subjects between 40 and 88 years formed the main study group: 21 normal controls (9 males) and 57 glaucoma patients (30 males) were tested. Twenty-two younger control subjects were also tested. Contrast thresholds were determined for a 1 cpd sinusoidal grating, subtending 41° × 52°, modulated at 14.3 rps. The thresholds were based on the same staircase method applied to visual evoked potential (VEP) and psychophysical responses (Psyc). Diagnostic power was assessed by the percent area under the curve (%AUC) of receiver operating characteristic plots. Psyc showed significant age dependence, -0.10 ± 0.02 dB, while VEPs did not. Diagnostic performance for moderate and severe eyes combined was modest, Psyc 74 ± 9.0% and VEP 72 ± 9.1%, but combining the VEP and Psyc thresholds appeared to improve diagnostic power significantly. Canonical correlation analysis indicated that they measured statistically independent aspects of glaucoma possibly related to disease severity. Adding the 20-s psychophysical test to a VEP test produced a significant benefit for a small time cost.
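The "same staircase method" applied to both response types can be sketched as a standard 1-up/2-down rule; the reversal count, step handling, and averaging below are generic textbook choices, not the study's exact protocol.

```python
def staircase_threshold(respond, start, step, reversals_needed=8):
    """Generic 1-up/2-down staircase: the stimulus level drops after two
    consecutive correct responses and rises after each error, converging
    near the 70.7%-correct point.  `respond(level)` returns True for a
    correct response; the estimate is the mean of the last six reversal
    levels."""
    level, correct_run, last_dir = start, 0, 0
    reversals = []
    while len(reversals) < reversals_needed:
        if respond(level):
            correct_run += 1
            if correct_run == 2:
                correct_run = 0
                if last_dir == +1:
                    reversals.append(level)      # peak reversal
                level -= step
                last_dir = -1
        else:
            correct_run = 0
            if last_dir == -1:
                reversals.append(level)          # trough reversal
            level += step
            last_dir = +1
    tail = reversals[-6:]
    return sum(tail) / len(tail)
```

With an idealized deterministic observer whose true threshold is 5, the track oscillates between 4 and 5 and the estimate lands midway at 4.5.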

  4. Energy Detector Using a Hybrid Threshold in Cognitive Radio Systems

    Science.gov (United States)

    Kim, Jong-Ho; Hwang, Seung-Hoon; Hwang, Deok-Kyu

Cognitive radio systems offer the opportunity to improve spectrum utilization by detecting unused frequency bands while avoiding interference to primary users. This paper proposes a new algorithm for spectrum sensing, an energy detector using a hybrid (adaptive and fixed) threshold, in order to compensate for the weak points of the existing energy detector in distorted communication channel environments. Simulation results are presented which show that the performance of the proposed scheme is better than the existing schemes using a fixed threshold or an adaptive threshold. Additionally, the performance is investigated in terms of several parameters such as the mobile speed and the probability of false alarms. The simulation results also show that the proposed algorithm makes the detector highly robust against fading, shadowing, and interference.
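A minimal sketch of energy detection with a hybrid decision rule follows. Combining the two thresholds with `max` and the noise-variance scaling constant `alpha` are assumptions made for illustration, not the paper's design.

```python
def energy_detect(samples, noise_var, fixed_thr, alpha=2.0):
    """Energy detector with a hybrid threshold: the test statistic (mean
    sample energy) must exceed both a fixed floor and an adaptive level
    scaled from the estimated noise variance (alpha is an assumed
    scaling constant)."""
    energy = sum(s * s for s in samples) / len(samples)
    return energy > max(fixed_thr, alpha * noise_var)
```

The fixed floor guards against a corrupted noise estimate driving the adaptive level too low, while the adaptive term tracks genuine changes in the noise floor; the paper's simulations compare exactly these two failure modes.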

  5. Evaluating the "Threshold Theory": Can Head Impact Indicators Help?

    Science.gov (United States)

    Mihalik, Jason P; Lynall, Robert C; Wasserman, Erin B; Guskiewicz, Kevin M; Marshall, Stephen W

    2017-02-01

    This study aimed to determine the clinical utility of biomechanical head impact indicators by measuring the sensitivity, specificity, positive predictive value (PV+), and negative predictive value (PV-) of multiple thresholds. Head impact biomechanics (n = 283,348) from 185 football players in one Division I program were collected. A multidisciplinary clinical team independently made concussion diagnoses (n = 24). We dichotomized each impact using diagnosis (yes = 24, no = 283,324) and across a range of plausible impact indicator thresholds (10g increments beginning with a resultant linear head acceleration of 50g and ending with 120g). Some thresholds had adequate sensitivity, specificity, and PV-. All thresholds had low PV+, with the best recorded PV+ less than 0.4% when accounting for all head impacts sustained by our sample. Even when conservatively adjusting the frequency of diagnosed concussions by a factor of 5 to account for unreported/undiagnosed injuries, the PV+ of head impact indicators at any threshold was no greater than 1.94%. Although specificity and PV- appear high, the low PV+ would generate many unnecessary evaluations if these indicators were the sole diagnostic criteria. The clinical diagnostic value of head impact indicators is considerably questioned by these data. Notwithstanding, valid sensor technologies continue to offer objective data that have been used to improve player safety and reduce injury risk.
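The base-rate effect driving the low PV+ is easy to reproduce: with so few true events, even a highly specific threshold is swamped by false alarms. The sketch below uses the study's event counts with hypothetical sensitivity and specificity figures for illustration.

```python
def indicator_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, and predictive values of a head-impact
    indicator threshold treated as a binary classifier."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),   # PV+
        "npv": tn / (tn + fn),   # PV-
    }

# 24 diagnosed concussions among 283,348 recorded impacts; suppose a
# threshold caught 18 of the 24 (75% sensitivity) at 99% specificity
# (both operating-point figures are hypothetical):
non_events = 283324
tp, fn = 18, 6
fp = round(0.01 * non_events)          # ~2,833 false alarms
m = indicator_metrics(tp, fp, non_events - fp, fn)
# m["ppv"] comes out under 1%: the rarity of true events dominates.
```

This is why the abstract reports high specificity and PV- alongside a PV+ below 0.4%: predictive values depend on prevalence, while sensitivity and specificity do not.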

  6. Modified Discrete Grey Wolf Optimizer Algorithm for Multilevel Image Thresholding

    Directory of Open Access Journals (Sweden)

    Linguo Li

    2017-01-01

Full Text Available The computation of image segmentation has become more complicated with the increasing number of thresholds, and the selection and application of thresholds in image thresholding has at the same time become an NP problem. The paper puts forward the modified discrete grey wolf optimizer algorithm (MDGWO), which improves the optimal-solution updating mechanism of the search agent by means of weights. Taking Kapur's entropy as the optimized function and based on the discreteness of thresholds in image segmentation, the paper firstly discretizes the grey wolf optimizer (GWO) and then proposes a new attack strategy by using the weight coefficient to replace the search formula for the optimal solution used in the original algorithm. The experimental results show that MDGWO can search out the optimal thresholds efficiently and precisely, which are very close to the results examined by exhaustive search. In comparison with the electromagnetism optimization (EMO), the differential evolution (DE), the Artificial Bee Colony (ABC), and the classical GWO, it is concluded that MDGWO has advantages over the latter four in terms of image segmentation quality, objective function values, and their stability.
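Kapur's criterion itself is compact. The exhaustive single-threshold version below is a sketch, not the paper's implementation; the multilevel case that MDGWO searches heuristically generalizes the same objective to several thresholds, where exhaustive search becomes infeasible.

```python
import math

def kapur_threshold(hist):
    """Exhaustive single-threshold Kapur selection: choose the split t
    that maximizes the summed Shannon entropies of the two classes the
    grey-level histogram is divided into."""
    total = float(sum(hist))
    p = [h / total for h in hist]
    best_t, best_h = 0, float("-inf")
    for t in range(len(p) - 1):
        p0 = sum(p[:t + 1])
        p1 = 1.0 - p0
        if p0 <= 0.0 or p1 <= 0.0:
            continue                    # an empty class has no entropy
        h0 = -sum(x / p0 * math.log(x / p0) for x in p[:t + 1] if x > 0)
        h1 = -sum(x / p1 * math.log(x / p1) for x in p[t + 1:] if x > 0)
        if h0 + h1 > best_h:
            best_t, best_h = t, h0 + h1
    return best_t
```

On a bimodal histogram the maximizer falls in the empty valley between the two modes, which is the behavior the metaheuristic search must reproduce per threshold.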

  7. An Enhanced Biometric Based Authentication with Key-Agreement Protocol for Multi-Server Architecture Based on Elliptic Curve Cryptography

    Science.gov (United States)

    Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young

    2016-01-01

Biometric based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. A careful investigation in this paper shows that Lu et al.’s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks, and clock synchronization problems. In addition, this paper proposes an enhanced biometric based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We prove that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.’s protocol and existing similar protocols. PMID:27163786

  8. Provable Secure and Efficient Digital Rights Management Authentication Scheme Using Smart Card Based on Elliptic Curve Cryptography

    Directory of Open Access Journals (Sweden)

    Yuanyuan Zhang

    2015-01-01

Full Text Available Since the concept of ubiquitous computing was first proposed by Mark Weiser, its connotation has been extended and expanded by many scholars. In pervasive computing environments, many kinds of small devices containing smart cards are used to communicate with others. In 2013, Yang et al. proposed an enhanced authentication scheme using smart cards for digital rights management and demonstrated that their scheme is secure enough. However, Mishra et al. pointed out that Yang et al.’s scheme suffers from the password guessing attack and the denial of service attack. Moreover, they also demonstrated that Yang et al.’s scheme is not efficient enough when the user inputs an incorrect password. In this paper, we analyze Yang et al.’s scheme again and find that their scheme is vulnerable to the session key attack, and that there are some mistakes in their scheme. To surmount the weaknesses of Yang et al.’s scheme, we propose a more efficient and provably secure digital rights management authentication scheme using smart cards based on elliptic curve cryptography.

  11. Threshold behavior in electron-atom scattering

    International Nuclear Information System (INIS)

    Sadeghpour, H.R.; Greene, C.H.

    1996-01-01

Ever since the classic work of Wannier in 1953, the process of treating two threshold electrons in the continuum of a positively charged ion has been an active field of study. The authors have developed a treatment motivated by the physics below the double ionization threshold. By modeling the double ionization as a series of Landau-Zener transitions, they obtain an analytical formulation of the absolute threshold probability which has a leading power law behavior, akin to Wannier's law. Some of the noteworthy aspects of this derivation are that it can be conveniently continued below threshold, giving rise to a "cusp" at threshold, and that on both sides of the threshold, absolute values of the cross sections are obtained.

  12. Iran: the next nuclear threshold state?

    OpenAIRE

    Maurer, Christopher L.

    2014-01-01

    Approved for public release; distribution is unlimited A nuclear threshold state is one that could quickly operationalize its peaceful nuclear program into one capable of producing a nuclear weapon. This thesis compares two known threshold states, Japan and Brazil, with Iran to determine if the Islamic Republic could also be labeled a threshold state. Furthermore, it highlights the implications such a status could have on U.S. nonproliferation policy. Although Iran's nuclear program is mir...

  13. Roots at the percolation threshold.

    Science.gov (United States)

    Kroener, Eva; Ahmed, Mutez Ali; Carminati, Andrea

    2015-04-01

The rhizosphere is the layer of soil around the roots where complex and dynamic interactions between plants and soil affect the capacity of plants to take up water. The physical properties of the rhizosphere are affected by mucilage, a gel exuded by roots. Mucilage can absorb large volumes of water, but it becomes hydrophobic after drying. We use a percolation model to describe the rewetting of dry rhizosphere. We find that at a critical mucilage concentration the rhizosphere becomes impermeable. The critical mucilage concentration depends on the size of the soil particles. Capillary rise experiments with neutron radiography show that for concentrations below the critical mucilage concentration water could easily cross the rhizosphere, while above the critical concentration water could no longer percolate through it. Our studies, together with former observations of water dynamics in the rhizosphere, suggest that the rhizosphere is near the percolation threshold, where small variations in mucilage concentration sensitively alter the soil hydraulic conductivity. Is mucilage exudation a plant mechanism to efficiently control the rhizosphere conductivity and the access to water?

  14. Dynamical thresholds for complete fusion

    International Nuclear Information System (INIS)

    Davies, K.T.R.; Sierk, A.J.; Nix, J.R.

    1983-01-01

    It is our purpose here to study the effect of nuclear dissipation and shape parametrization on dynamical thresholds for compound-nucleus formation in symmetric heavy-ion reactions. This is done by solving numerically classical equations of motion for head-on collisions to determine whether the dynamical trajectory in a multidimensional deformation space passes inside the fission saddle point and forms a compound nucleus, or whether it passes outside the fission saddle point and reseparates in a fast-fission or deep-inelastic reaction. Specifying the nuclear shape in terms of smoothly joined portions of three quadratic surfaces of revolution, we take into account three symmetric deformation coordinates. However, in some cases we reduce the number of coordinates to two by requiring the ends of the fusing system to be spherical in shape. The nuclear potential energy of deformation is determined in terms of a Coulomb energy and a double volume energy of a Yukawa-plus-exponential folding function. The collective kinetic energy is calculated for incompressible, nearly irrotational flow by means of the Werner-Wheeler approximation. Four possibilities are studied for the transfer of collective kinetic energy into internal single-particle excitation energy: zero dissipation, ordinary two body viscosity, one-body wall-formula dissipation, and one-body wall-and-window dissipation

  15. Efficient threshold for volumetric segmentation

    Science.gov (United States)

    Burdescu, Dumitru D.; Brezovan, Marius; Stanescu, Liana; Stoica Spahiu, Cosmin; Ebanca, Daniel

    2015-07-01

Image segmentation plays a crucial role in effective understanding of digital images. However, research on a general-purpose segmentation algorithm that suits a variety of applications is still very much active. Among the many approaches to image segmentation, the graph-based approach is gaining popularity primarily due to its ability to reflect global image properties. Volumetric image segmentation can simply result in an image partition composed of relevant regions, but the most fundamental challenge for a segmentation algorithm is to precisely define the volumetric extent of some object, which may be represented by the union of multiple regions. The aim of this paper is to present a new method to detect visual objects from color volumetric images using an efficient threshold. We present a unified framework for volumetric image segmentation and contour extraction that uses a virtual tree-hexagonal structure defined on the set of the image voxels. The advantage of using a virtual tree-hexagonal network superposed over the initial image voxels is that it reduces the execution time and the memory space used, without losing the initial resolution of the image.

  16. Error Thresholds on Dynamic Fitness-Landscapes

    OpenAIRE

    Nilsson, Martin; Snoad, Nigel

    1999-01-01

In this paper we investigate error thresholds on dynamic fitness-landscapes. We show that there exist both a lower and an upper threshold, representing limits to the copying fidelity of simple replicators. The lower bound can be expressed as a correction term to the error threshold present on a static landscape. The upper error threshold is a new limit that only exists on dynamic fitness-landscapes. We also show that for long genomes on highly dynamic fitness-landscapes there exists a lower b...

  17. A Methodology for Phased Array Radar Threshold Modeling Using the Advanced Propagation Model (APM)

    Science.gov (United States)

    2017-10-01

TECHNICAL REPORT 3079, October 2017: A Methodology for Phased Array Radar Threshold Modeling Using the Advanced Propagation Model (APM). This report summarizes the methodology developed to improve radar threshold modeling for a phased array radar configuration using APM.

  18. A Threshold Accepting Metaheuristic for the Vehicle Routing Problem with Time Windows

    NARCIS (Netherlands)

    Bräysy, Olli; Berger, Jean; Barkaoui, Mohamed; Dullaert, Wout

    2003-01-01

Threshold Accepting, a variant of Simulated Annealing, is applied for the first time to a set of 356 benchmark instances for the Vehicle Routing Problem with Time Windows. The Threshold Accepting metaheuristic is used to improve upon results obtained with a recent parallel genetic algorithm and a

  20. Ankle Accelerometry for Assessing Physical Activity among Adolescent Girls: Threshold Determination, Validity, Reliability, and Feasibility

    Science.gov (United States)

    Hager, Erin R.; Treuth, Margarita S.; Gormely, Candice; Epps, LaShawna; Snitker, Soren; Black, Maureen M.

    2015-01-01

    Purpose: Ankle accelerometry allows for 24-hr data collection and improves data volume/integrity versus hip accelerometry. Using Actical ankle accelerometry, the purpose of this study was to (a) develop sensitive/specific thresholds, (b) examine validity/reliability, (c) compare new thresholds with those of the manufacturer, and (d) examine…

  1. General immunity and superadditivity of two-way Gaussian quantum cryptography.

    Science.gov (United States)

    Ottaviani, Carlo; Pirandola, Stefano

    2016-03-01

We consider two-way continuous-variable quantum key distribution, studying its security against general eavesdropping strategies. Assuming the asymptotic limit of many signals exchanged, we prove that two-way Gaussian protocols are immune to coherent attacks. More precisely, we show the general superadditivity of the two-way security thresholds, which are proven to be higher than the corresponding one-way counterparts in all cases. We perform the security analysis by first reducing the general eavesdropping to a two-mode coherent Gaussian attack, and then showing that the superadditivity is achieved by exploiting the random on/off switching of the two-way quantum communication. This allows the parties to choose the appropriate communication instances to prepare the key, according to the tomography of the quantum channel. The random opening and closing of the circuit represents, in fact, an additional degree of freedom allowing the parties to convert, a posteriori, the two-mode correlations of the eavesdropping into noise. The eavesdropper is assumed to have no access to the on/off switching and thus cannot adapt her attack. We explicitly prove that this mechanism enhances the security performance, whether the eavesdropper performs collective or coherent attacks.

  2. Tuning the threshold voltage in electrolyte-gated organic field-effect transistors

    Science.gov (United States)

    Kergoat, Loïg; Herlogsson, Lars; Piro, Benoit; Pham, Minh Chau; Horowitz, Gilles; Crispin, Xavier; Berggren, Magnus

    2012-01-01

Low-voltage organic field-effect transistors (OFETs) are promising for low-power-consumption logic circuits. To enhance the efficiency of such logic circuits, control of the threshold voltage of the transistors they are based on is crucial. We report the systematic control of the threshold voltage of electrolyte-gated OFETs by using various gate metals. The influence of the work function of the metal is investigated in metal-electrolyte-organic semiconductor diodes and electrolyte-gated OFETs. A good correlation is found between the flat-band potential and the threshold voltage. The possibility of tuning the threshold voltage over half the applied potential range and of obtaining depletion-like (positive threshold voltage) and enhancement (negative threshold voltage) transistors is of great interest when integrating these transistors in logic circuits. The combination of a depletion-like and an enhancement transistor leads to a clear improvement of the noise margins in depleted-load unipolar inverters. PMID:22586088

  3. Time-efficient multidimensional threshold tracking method

    DEFF Research Database (Denmark)

    Fereczkowski, Michal; Kowalewski, Borys; Dau, Torsten

    2015-01-01

Traditionally, adaptive methods have been used to reduce the time it takes to estimate psychoacoustic thresholds. However, even with adaptive methods, there are many cases where the testing time is too long to be clinically feasible, particularly when estimating thresholds as a function of another…

  4. Applying Threshold Concepts to Finance Education

    Science.gov (United States)

    Hoadley, Susan; Wood, Leigh N.; Tickle, Leonie; Kyng, Tim

    2016-01-01

    Purpose: The purpose of this paper is to investigate and identify threshold concepts that are the essential conceptual content of finance programmes. Design/Methodology/Approach: Conducted in three stages with finance academics and students, the study uses threshold concepts as both a theoretical framework and a research methodology. Findings: The…

  5. Intelligence and Creativity: Over the Threshold Together?

    Science.gov (United States)

    Welter, Marisete Maria; Jaarsveld, Saskia; van Leeuwen, Cees; Lachmann, Thomas

    2016-01-01

    Threshold theory predicts a positive correlation between IQ and creativity scores up to an IQ level of 120 and no correlation above this threshold. Primary school children were tested at beginning (N = 98) and ending (N = 70) of the school year. Participants performed the standard progressive matrices (SPM) and the Test of Creative…

  6. Threshold Concepts, Systems and Learning for Sustainability

    Science.gov (United States)

    Sandri, Orana Jade

    2013-01-01

    This paper presents a framework for understanding the role that systems theory might play in education for sustainability (EfS). It offers a sketch and critique of Land and Meyer's notion of a "threshold concept", to argue that seeing systems as a threshold concept for sustainability is useful for understanding the processes of…

  7. Evaluation of the Detection Threshold of Three ...

    African Journals Online (AJOL)

    A mean count of 39 pigments per microlitre was obtained for these five patients. Both HEXAGON MALARIA and SD-BIOLINE had a detection threshold of 4 pigments per microlitre, while ACCU-STAT MALARIA had 20 pigments per microlitre. This suggests that these three kits have good detection thresholds and could ...

  8. Log canonical thresholds of smooth Fano threefolds

    International Nuclear Information System (INIS)

    Cheltsov, Ivan A; Shramov, Konstantin A

    2008-01-01

    The complex singularity exponent is a local invariant of a holomorphic function determined by the integrability of fractional powers of the function. The log canonical thresholds of effective Q-divisors on normal algebraic varieties are algebraic counterparts of complex singularity exponents. For a Fano variety, these invariants have global analogues. In the former case, it is the so-called α-invariant of Tian; in the latter case, it is the global log canonical threshold of the Fano variety, which is the infimum of log canonical thresholds of all effective Q-divisors numerically equivalent to the anticanonical divisor. An appendix to this paper contains a proof that the global log canonical threshold of a smooth Fano variety coincides with its α-invariant of Tian. The purpose of the paper is to compute the global log canonical thresholds of smooth Fano threefolds (altogether, there are 105 deformation families of such threefolds). The global log canonical thresholds are computed for every smooth threefold in 64 deformation families, and the global log canonical thresholds are computed for a general threefold in 20 deformation families. Some bounds for the global log canonical thresholds are computed for 14 deformation families. Appendix A is due to J.-P. Demailly.
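The global invariant described in the abstract can be written compactly; this display uses standard notation assumed from context rather than quoted from the paper:

```latex
\mathrm{lct}(X) \;=\;
  \inf\bigl\{\, \mathrm{lct}(X, D) \;:\;
      D \ \text{an effective } \mathbb{Q}\text{-divisor},\;
      D \equiv -K_X \,\bigr\},
```

i.e. the global log canonical threshold is the infimum of the log canonical thresholds over all effective Q-divisors numerically equivalent to the anticanonical divisor, and by the appendix it coincides with Tian's α-invariant for smooth Fano varieties.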

  9. Underestimation of pacing threshold as determined by an automatic ventricular threshold testing algorithm.

    Science.gov (United States)

    Sauer, William H; Cooper, Joshua M; Lai, Rebecca W; Verdino, Ralph J

    2006-09-01

    In this case report, we describe markedly different pacing thresholds determined by a manual threshold test and the automatic Ventricular Capture Management algorithm. The discrepancy in pacing threshold values reported was due to the difference in the AV intervals used with the different testing methods. We propose that the differences in right ventricular dimensions with altered diastolic filling periods affected the threshold in this patient with a new passive fixation lead in the right ventricular apex.

  10. A Threshold Continuum for Aeolian Sand Transport

    Science.gov (United States)

    Swann, C.; Ewing, R. C.; Sherman, D. J.

    2015-12-01

The threshold of motion for aeolian sand transport marks the initial entrainment of sand particles by the force of the wind. This is typically defined and modeled as a singular wind speed for a given grain size and is based on field and laboratory experimental data. However, the definition of threshold varies significantly between these empirical models, largely because the definition is based on visual observations of initial grain movement. For example, in his seminal experiments, Bagnold defined threshold of motion when he observed that 100% of the bed was in motion. Others have used 50% and lesser values. Differences in threshold models, in turn, result in large errors in predicting the fluxes associated with sand and dust transport. Here we use a wind tunnel and novel sediment trap to capture the fractions of sand in creep, reptation and saltation at Earth and Mars pressures and show that the threshold of motion for aeolian sand transport is best defined as a continuum in which grains progress through stages defined by the proportion of grains in creep and saltation. We propose the use of scale-dependent thresholds modeled by distinct probability distribution functions that differentiate the threshold based on micro- to macro-scale applications. For example, a geologic-timescale application corresponds to a threshold at which 100% of the bed is in motion, whereas a sub-second application corresponds to a threshold at which a single particle is set in motion. We provide quantitative measurements (number and mode of particle movement) corresponding to visual observations, percent of bed in motion and degrees of transport intermittency for Earth and Mars. Understanding transport as a continuum provides a basis for reevaluating sand transport thresholds on Earth, Mars and Titan.

  11. MOS Current Mode Logic Near Threshold Circuits

    Directory of Open Access Journals (Sweden)

    Alexander Shapiro

    2014-06-01

Full Text Available Near threshold circuits (NTC) are an attractive and promising technology that provides significant power savings with some delay penalty. The combination of NTC technology with MOS current mode logic (MCML) is examined in this work. By combining MCML with NTC, the constant power consumption of MCML is reduced to leakage power levels that can be tolerated in certain modern applications. Additionally, the speed of NTC is improved due to the high speed nature of MCML technology. A 14 nm Fin field effect transistor (FinFET) technology is used to evaluate these combined circuit techniques. A 32-bit Kogge Stone adder is chosen as a demonstration vehicle for feasibility analysis. MCML with NTC is shown to yield enhanced power efficiency when operated above 1 GHz with a 100% activity factor as compared to standard CMOS. MCML with NTC is more power efficient than standard CMOS beyond 9 GHz over a wide range of activity factors. MCML with NTC also exhibits significantly lower noise levels as compared to standard CMOS. The results of the analysis demonstrate that pairing NTC and MCML is efficient when operating at high frequencies and activity factors.

  12. On the Instability Threshold of Journal Bearing Supported Rotors

    Directory of Open Access Journals (Sweden)

    Ricardo Ugliara Mendes

    2014-01-01

Full Text Available Journal bearing supported rotors present two kinds of self-excited vibrations: oil-whirl and oil-whip. The first one is commonly masked by the rotor unbalance, hence being rarely associated with instability problems. Oil-whip is a severe vibration which occurs when the oil-whirl frequency coincides with the first flexural natural frequency of the shaft. In many cases, oil-whip is the only fluid-induced instability considered during the design stage; however, experimental evidence has shown that the instability threshold may occur much sooner, demanding a better comprehension of the instability mechanism. In this context, numerical simulations were made in order to improve the identification of the instability threshold for two test rig configurations: one in which the instability occurs at the oil-whip frequency, and another which became unstable before this threshold. Therefore, the main contribution of this paper is to present an investigation of two different thresholds of fluid-induced instabilities and their detectability in design-stage simulations based on rotordynamic analysis using linear speed-dependent coefficients for the bearings.

  13. Optimal Selection of Threshold Value 'r' for Refined Multiscale Entropy.

    Science.gov (United States)

    Marwaha, Puneeta; Sunkaria, Ramesh Kumar

    2015-12-01

    Refined multiscale entropy (RMSE) technique was introduced to evaluate complexity of a time series over multiple scale factors 't'. Here threshold value 'r' is updated as 0.15 times SD of filtered scaled time series. The use of fixed threshold value 'r' in RMSE sometimes assigns very close resembling entropy values to certain time series at certain temporal scale factors and is unable to distinguish different time series optimally. The present study aims to evaluate RMSE technique by varying threshold value 'r' from 0.05 to 0.25 times SD of filtered scaled time series and finding optimal 'r' values for each scale factor at which different time series can be distinguished more effectively. The proposed RMSE was used to evaluate over HRV time series of normal sinus rhythm subjects, patients suffering from sudden cardiac death, congestive heart failure, healthy adult male, healthy adult female and mid-aged female groups as well as over synthetic simulated database for different datalengths 'N' of 3000, 3500 and 4000. The proposed RMSE results in improved discrimination among different time series. To enhance the computational capability, empirical mathematical equations have been formulated for optimal selection of threshold values 'r' as a function of SD of filtered scaled time series and datalength 'N' for each scale factor 't'.
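To make the role of the threshold 'r' concrete, here is a compact sample-entropy sketch with 'r' expressed as a multiple of the series SD, following the convention described above. This is a simplified single-scale variant for illustration, not the paper's full refined-MSE procedure, and the function names are illustrative:

```python
import math
import statistics

def sample_entropy(x, m=2, r_factor=0.15):
    """SampEn(m, r) with tolerance r = r_factor * SD of the series.
    Lower values indicate a more regular (predictable) series."""
    r = r_factor * statistics.pstdev(x)

    def count_matches(length):
        # Count template pairs whose maximum pointwise distance is <= r.
        tmpl = [x[i:i + length] for i in range(len(x) - length + 1)]
        hits = 0
        for i in range(len(tmpl)):
            for j in range(i + 1, len(tmpl)):
                if max(abs(a - b) for a, b in zip(tmpl[i], tmpl[j])) <= r:
                    hits += 1
        return hits

    b, a = count_matches(m), count_matches(m + 1)
    return -math.log(a / b) if a and b else float("inf")

# A perfectly periodic series yields a very low entropy value.
e_periodic = sample_entropy([0, 1] * 20)
```

Sweeping `r_factor` from 0.05 to 0.25, as the study does, changes how many template pairs count as matches and hence how well close-lying series are separated.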

  14. Hyper-arousal decreases human visual thresholds.

    Directory of Open Access Journals (Sweden)

    Adam J Woods

    Full Text Available Arousal has long been known to influence behavior and serves as an underlying component of cognition and consciousness. However, the consequences of hyper-arousal for visual perception remain unclear. The present study evaluates the impact of hyper-arousal on two aspects of visual sensitivity: visual stereoacuity and contrast thresholds. Sixty-eight participants participated in two experiments. Thirty-four participants were randomly divided into two groups in each experiment: Arousal Stimulation or Sham Control. The Arousal Stimulation group underwent a 50-second cold pressor stimulation (immersing the foot in 0-2° C water, a technique known to increase arousal. In contrast, the Sham Control group immersed their foot in room temperature water. Stereoacuity thresholds (Experiment 1 and contrast thresholds (Experiment 2 were measured before and after stimulation. The Arousal Stimulation groups demonstrated significantly lower stereoacuity and contrast thresholds following cold pressor stimulation, whereas the Sham Control groups showed no difference in thresholds. These results provide the first evidence that hyper-arousal from sensory stimulation can lower visual thresholds. Hyper-arousal's ability to decrease visual thresholds has important implications for survival, sports, and everyday life.

  15. Reaction thresholds in doubly special relativity

    International Nuclear Information System (INIS)

    Heyman, Daniel; Major, Seth; Hinteleitner, Franz

    2004-01-01

    Two theories of special relativity with an additional invariant scale, 'doubly special relativity', are tested with calculations of particle process kinematics. Using the Judes-Visser modified conservation laws, thresholds are studied in both theories. In contrast with some linear approximations, which allow for particle processes forbidden in special relativity, both the Amelino-Camelia and Magueijo-Smolin frameworks allow no additional processes. To first order, the Amelino-Camelia framework thresholds are lowered and the Magueijo-Smolin framework thresholds may be raised or lowered

  16. Digital IP Protection Using Threshold Voltage Control

    OpenAIRE

    Davis, Joseph; Kulkarni, Niranjan; Yang, Jinghua; Dengi, Aykut; Vrudhula, Sarma

    2016-01-01

    This paper proposes a method to completely hide the functionality of a digital standard cell. This is accomplished by a differential threshold logic gate (TLG). A TLG with $n$ inputs implements a subset of Boolean functions of $n$ variables that are linear threshold functions. The output of such a gate is one if and only if an integer weighted linear arithmetic sum of the inputs equals or exceeds a given integer threshold. We present a novel architecture of a TLG that not only allows a single...
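The gate's defining rule quoted above (output one if and only if an integer-weighted sum of the inputs meets or exceeds an integer threshold) can be sketched directly; the helper names are illustrative, not from the paper:

```python
def threshold_gate(inputs, weights, threshold):
    """Linear threshold function: output 1 iff the integer-weighted
    sum of the binary inputs equals or exceeds the threshold."""
    assert len(inputs) == len(weights)
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

# Example: 3-input majority is the linear threshold function
# with unit weights and threshold 2.
majority = lambda a, b, c: threshold_gate((a, b, c), (1, 1, 1), 2)
print(majority(1, 1, 0))  # → 1
```

Not every Boolean function is expressible this way (XOR is the classic counterexample), which is why a TLG implements only a subset of the functions of its inputs.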

  17. Precipitation thresholds for landslide occurrence near Seattle, Mukilteo, and Everett, Washington

    Science.gov (United States)

    Scheevel, Caroline R.; Baum, Rex L.; Mirus, Benjamin B.; Smith, Joel B.

    2017-04-27

    Shallow landslides along coastal bluffs frequently occur in the railway corridor between Seattle and Everett, Washington. These slides disrupt passenger rail service, both because of required track maintenance and because the railroad owner, Burlington Northern Santa Fe Railway, does not allow passenger travel for 48 hours after a disruptive landslide. Sound Transit, which operates commuter trains in the corridor, is interested in a decision-making tool to help preemptively cancel passenger railway service in dangerous conditions and reallocate resources to alternative transportation.Statistical analysis showed that a majority of landslides along the Seattle-Everett Corridor are strongly correlated with antecedent rainfall, but that 21-37 percent of recorded landslide dates experienced less than 1 inch of precipitation in the 3 days preceding the landslide and less than 4 inches of rain in the 15 days prior to the preceding 3 days. We developed two empirical thresholds to identify precipitation conditions correlated with landslide occurrence. The two thresholds are defined as P3 = 2.16-0.44P15 and P3 = 2.16-0.22P32, where P3 is the cumulative precipitation in the 3 days prior to the considered date and P15 or P32 is the cumulative precipitation in the 15 days or 32 days prior to P3 (all measurements given in inches). The two thresholds, when compared to a previously developed threshold, quantitatively improve the prediction rate.We also investigated rainfall intensity-duration (ID) thresholds to determine whether revision would improve identification of moderate-intensity, landslide-producing storms. New, optimized ID thresholds evaluate rainstorms lasting at least 12 hours and identify landslide-inducing storms that were typically missed by previously published ID thresholds. 
The main advantage of the ID thresholds appears when they are combined with recent-antecedent thresholds because rainfall conditions that exceed both threshold types are more likely to induce
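The two recent-antecedent thresholds quoted in the abstract are simple linear inequalities, so exceedance checking is straightforward. A minimal sketch (all precipitation totals in inches; the combined "flag when both are exceeded" rule is an illustrative assumption, not the authors' operational tool):

```python
# Recent-antecedent thresholds from the abstract: P3 = 2.16 - 0.44*P15 and
# P3 = 2.16 - 0.22*P32, all values in inches. Flagging a date only when both
# thresholds are exceeded is an assumption made here for illustration.

def exceeds_p15_threshold(p3: float, p15: float) -> bool:
    """3-day rainfall P3 exceeds the threshold based on the prior 15 days."""
    return p3 > 2.16 - 0.44 * p15

def exceeds_p32_threshold(p3: float, p32: float) -> bool:
    """3-day rainfall P3 exceeds the threshold based on the prior 32 days."""
    return p3 > 2.16 - 0.22 * p32

def landslide_prone(p3: float, p15: float, p32: float) -> bool:
    return exceeds_p15_threshold(p3, p15) and exceeds_p32_threshold(p3, p32)

print(landslide_prone(3.0, 1.0, 2.0))   # wet 3-day window over a wet month
print(landslide_prone(0.5, 0.5, 1.0))   # dry conditions
```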

  18. Improving laser damage threshold measurements: an explosive analogy

    Science.gov (United States)

    Arenberg, Jonathan W.; Thomas, Michael D.

    2012-11-01

Laser damage measurements share similarities with testing of explosives, namely that the sample or sample site is damaged or modified during the measurement and cannot be retested. An extensive literature exists for techniques of measuring the "all fire" and "no fire" levels for explosives. These levels are directly analogous to the "all damage" level (100% probability of damage) and the "all safe" level (0% probability of damage). The Maximum Likelihood Estimate method, which is the foundation of this technique, is introduced. These methods are applied to an archetypal damage probability model, and the results are shown to be accurate and unbiased.
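The maximum-likelihood idea the abstract borrows from explosives testing can be sketched for binary one-shot outcomes. The logistic damage-probability model, the fluence values, and the grid search below are all assumptions for illustration, not the authors' implementation:

```python
import math

# Hypothetical binary test outcomes: (fluence, damaged?). Each site is tested once.
tests = [(1.0, 0), (1.5, 0), (1.8, 0), (2.0, 0), (2.2, 1), (2.4, 1), (2.5, 1), (3.0, 1)]

def neg_log_likelihood(mu: float, sigma: float) -> float:
    """Negative log-likelihood of a logistic damage-probability curve."""
    nll = 0.0
    for fluence, damaged in tests:
        p = 1.0 / (1.0 + math.exp(-(fluence - mu) / sigma))
        p = min(max(p, 1e-12), 1.0 - 1e-12)       # guard against log(0)
        nll -= math.log(p) if damaged else math.log(1.0 - p)
    return nll

# A coarse grid search stands in for a proper optimizer (e.g. scipy.optimize).
candidates = ((1.0 + 0.01 * i, 0.05 + 0.01 * j) for i in range(200) for j in range(50))
mu_hat, sigma_hat = min(candidates, key=lambda ms: neg_log_likelihood(*ms))
print(f"estimated 50%-damage fluence: {mu_hat:.2f} (width {sigma_hat:.2f})")
```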

  19. Analysis and Enhancement of a Password Authentication and Update Scheme Based on Elliptic Curve Cryptography

    Directory of Open Access Journals (Sweden)

    Lili Wang

    2014-01-01

Full Text Available Recently, a password authentication and update scheme was presented by Islam and Biswas to remove the security weaknesses in Lin and Huang's scheme. Unfortunately, He et al., Wang et al., and Li have found that Islam and Biswas' improvement is vulnerable to offline password guessing, stolen verifier, privileged insider, and denial of service attacks. In this paper, we further analyze Islam and Biswas' scheme and demonstrate that it cannot resist a password compromise impersonation attack. To remedy the weaknesses mentioned above, we propose an improved anonymous remote authentication scheme using smart cards that avoids bilinear pairing computation. In addition, verifier tables are no longer needed, and the privacy of users is better protected. Furthermore, our proposal not only inherits the advantages of Islam and Biswas' scheme but also provides more features, including user anonymity, offline password change, revocation, reregistration with the same identifier, and system update. Finally, we compare our enhancement with related works to illustrate that the improvement is more secure and robust, while maintaining low performance cost.

  20. Secure information management using linguistic threshold approach

    CERN Document Server

    Ogiela, Marek R

    2013-01-01

    This book details linguistic threshold schemes for information sharing. It examines the opportunities of using these techniques to create new models of managing strategic information shared within a commercial organisation or a state institution.

  1. Melanin microcavitation threshold in the near infrared

    Science.gov (United States)

    Schmidt, Morgan S.; Kennedy, Paul K.; Vincelette, Rebecca L.; Schuster, Kurt J.; Noojin, Gary D.; Wharmby, Andrew W.; Thomas, Robert J.; Rockwell, Benjamin A.

    2014-02-01

    Thresholds for microcavitation of isolated bovine and porcine melanosomes were determined using single nanosecond (ns) laser pulses in the NIR (1000 - 1319 nm) wavelength regime. Average fluence thresholds for microcavitation increased non-linearly with increasing wavelength. Average fluence thresholds were also measured for 10-ns pulses at 532 nm, and found to be comparable to visible ns pulse values published in previous reports. Fluence thresholds were used to calculate melanosome absorption coefficients, which decreased with increasing wavelength. This trend was found to be comparable to the decrease in retinal pigmented epithelial (RPE) layer absorption coefficients reported over the same wavelength region. Estimated corneal total intraocular energy (TIE) values were determined and compared to the current and proposed maximum permissible exposure (MPE) safe exposure levels. Results from this study support the proposed changes to the MPE levels.

  2. A prototype threshold Cherenkov counter for DIRAC

    CERN Document Server

    Bragadireanu, M; Cima, E; Dulach, B; Gianotti, P; Guaraldo, C; Iliescu, M A; Lanaro, A; Levi-Sandri, P; Petrascu, C; Girolami, B; Groza, L; Kulikov, A; Kuptsov, A; Topilin, N; Trusov, S

    1999-01-01

    We have designed, built and tested a gas threshold Cherenkov counter as prototype for a larger counter foreseen for use in the DIRAC experiment, at CERN. We describe the performances of the counter on a test beam.

  3. Recent progress in understanding climate thresholds

    NARCIS (Netherlands)

    Good, Peter; Bamber, Jonathan; Halladay, Kate; Harper, Anna B.; Jackson, Laura C.; Kay, Gillian; Kruijt, Bart; Lowe, Jason A.; Phillips, Oliver L.; Ridley, Jeff; Srokosz, Meric; Turley, Carol; Williamson, Phillip

    2018-01-01

    This article reviews recent scientific progress, relating to four major systems that could exhibit threshold behaviour: ice sheets, the Atlantic meridional overturning circulation (AMOC), tropical forests and ecosystem responses to ocean acidification. The focus is on advances since the

  4. Deficiencies of the cryptography based on multiple-parameter fractional Fourier transform.

    Science.gov (United States)

    Ran, Qiwen; Zhang, Haiying; Zhang, Jin; Tan, Liying; Ma, Jing

    2009-06-01

Methods of image encryption based on the fractional Fourier transform have an inherent security flaw. We show that these schemes suffer from the deficiency that, for several reasons, one group of encryption keys admits many groups of keys that decrypt the encrypted image correctly. In some schemes, several factors produce the deficiency, as in the encryption scheme based on the multiple-parameter fractional Fourier transform [Opt. Lett. 33, 581 (2008)]. A modified method is proposed to avoid all the deficiencies. Security and reliability are greatly improved without increasing the complexity of the encryption process. (c) 2009 Optical Society of America.

  5. Combined threshold and transverse momentum resummation for inclusive observables

    International Nuclear Information System (INIS)

    Muselli, Claudio; Forte, Stefano; Ridolfi, Giovanni

    2017-01-01

We present a combined resummation for the transverse momentum distribution of a colorless final state in perturbative QCD, expressed as a function of transverse momentum p T and the scaling variable x. Its expression satisfies three requirements: it reduces to standard transverse momentum resummation to any desired logarithmic order in the limit p T →0 for fixed x, up to power suppressed corrections in p T ; it reduces to threshold resummation to any desired logarithmic order in the limit x→1 for fixed p T , up to power suppressed corrections in 1−x; upon integration over transverse momentum it reproduces the resummation of the total cross section at any given logarithmic order in the threshold x→1 limit, up to power suppressed corrections in 1−x. Its main ingredient, and our main new result, is a modified form of transverse momentum resummation, which leads to threshold resummation upon integration over p T , and for which we provide a simple closed-form analytic expression in Fourier-Mellin (b,N) space. We give explicit coefficients up to NNLL order for the specific case of Higgs production in gluon fusion in the effective field theory limit. Our result allows for a systematic improvement of the transverse momentum distribution through threshold resummation which holds for all p T , and elucidates the relation between transverse momentum resummation and threshold resummation at the inclusive level, specifically by providing within perturbative QCD a simple derivation of the main consequence of the so-called collinear anomaly of SCET.

  6. Combined threshold and transverse momentum resummation for inclusive observables

    Energy Technology Data Exchange (ETDEWEB)

    Muselli, Claudio; Forte, Stefano [Tif Lab, Dipartimento di Fisica, Università di Milano and INFN, Sezione di Milano,Via Celoria 16, I-20133 Milano (Italy); Ridolfi, Giovanni [Dipartimento di Fisica, Università di Genova and INFN, Sezione di Genova,Via Dodecaneso 33, I-16146 Genova (Italy)

    2017-03-21

We present a combined resummation for the transverse momentum distribution of a colorless final state in perturbative QCD, expressed as a function of transverse momentum p{sub T} and the scaling variable x. Its expression satisfies three requirements: it reduces to standard transverse momentum resummation to any desired logarithmic order in the limit p{sub T}→0 for fixed x, up to power suppressed corrections in p{sub T}; it reduces to threshold resummation to any desired logarithmic order in the limit x→1 for fixed p{sub T}, up to power suppressed corrections in 1−x; upon integration over transverse momentum it reproduces the resummation of the total cross section at any given logarithmic order in the threshold x→1 limit, up to power suppressed corrections in 1−x. Its main ingredient, and our main new result, is a modified form of transverse momentum resummation, which leads to threshold resummation upon integration over p{sub T}, and for which we provide a simple closed-form analytic expression in Fourier-Mellin (b,N) space. We give explicit coefficients up to NNLL order for the specific case of Higgs production in gluon fusion in the effective field theory limit. Our result allows for a systematic improvement of the transverse momentum distribution through threshold resummation which holds for all p{sub T}, and elucidates the relation between transverse momentum resummation and threshold resummation at the inclusive level, specifically by providing within perturbative QCD a simple derivation of the main consequence of the so-called collinear anomaly of SCET.

  7. Cost-effectiveness thresholds: pros and cons

    OpenAIRE

    Bertram, Melanie Y; Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-01-01

    Abstract Cost-effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost-effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost-effectiveness thresholds allow cost-effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization's Commission on Macroeconomics in Health suggested cost-effectiveness thresholds based...

  8. Threshold concepts as barriers to understanding climate science

    Science.gov (United States)

    Walton, P.

    2013-12-01

    scientific engagement with the public to develop climate literacy. The analysis of 3 successive cohorts of students' journals from the same degree module identified that threshold concepts do exist within the field, such as those related to: the role of ocean circulation, the use of proxy indicators, forcing factors and feedback mechanisms. Once identified, the study looked at possible strategies to overcome these barriers to support student climate literacy. It concluded that the use of threshold concepts could be problematic when trying to improve climate literacy, as each individual has their own concepts they find 'troublesome' that do not necessarily relate to others. For scientists this presents the difficulty of how to develop a strategy that supports the individual and is cost and time effective. However, the study identifies that eLearning can be used effectively to help people understand troublesome knowledge.

  9. Examination of China’s performance and thematic evolution in quantum cryptography research using quantitative and computational techniques

    Science.gov (United States)

    2018-01-01

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China’s quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001–2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China’s QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China’s performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China’s performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China’s H-index (a normalized indicator) has surpassed all other countries’ over the last several years. The second phase of analysis shows how China’s main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China’s QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology
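Of the performance indicators named in the abstract, the H-index has a simple operational definition: the largest h such that h publications each have at least h citations. A quick sketch (citation counts are invented for illustration):

```python
# H-index: the largest h such that h publications each have >= h citations.
def h_index(citations: list[int]) -> int:
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:   # the rank-th most-cited paper still has >= rank citations
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # → 4
print(h_index([0, 0, 1]))         # → 1
```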

  10. Study on PWSCC Susceptibility Index Considering Threshold Value

    International Nuclear Information System (INIS)

    Kim, Hyung Jun; Chung, June Ho; Kim, Tae Ryong

    2013-01-01

Primary Water Stress Corrosion Cracking (PWSCC) of Alloy 600 base metal is a unique aging mechanism that occurs in the primary water environment of nuclear power plants. To manage PWSCC, priority rankings are assigned using a PWSCC susceptibility index. The Westinghouse model is a widely used method for calculating the PWSCC susceptibility index, and Korea's previous Alloy 600 aging management program was also based on it. In this paper, the Westinghouse model is reviewed and an improvement of the PWSCC susceptibility index that considers a threshold value is discussed. A normalization process using a reference value or a threshold value was introduced into the calculation of the PWSCC susceptibility index. The result shows meaningful values of a similar order of magnitude, while the component ranking did not change much from the original Westinghouse model. It can be concluded that the normalization process can be implemented in an Alloy 600 aging management program.

  11. Rate modulation detection thresholds for cochlear implant users.

    Science.gov (United States)

    Brochier, Tim; McKay, Colette; McDermott, Hugh

    2018-02-01

    The perception of temporal amplitude modulations is critical for speech understanding by cochlear implant (CI) users. The present study compared the ability of CI users to detect sinusoidal modulations of the electrical stimulation rate and current level, at different presentation levels (80% and 40% of the dynamic range) and modulation frequencies (10 and 100 Hz). Rate modulation detection thresholds (RMDTs) and amplitude modulation detection thresholds (AMDTs) were measured and compared to assess whether there was a perceptual advantage to either modulation method. Both RMDTs and AMDTs improved with increasing presentation level and decreasing modulation frequency. RMDTs and AMDTs were correlated, indicating that a common processing mechanism may underlie the perception of rate modulation and amplitude modulation, or that some subject-dependent factors affect both types of modulation detection.

  12. A Multiserver Biometric Authentication Scheme for TMIS using Elliptic Curve Cryptography.

    Science.gov (United States)

    Chaudhry, Shehzad Ashraf; Khan, Muhammad Tawab; Khan, Muhammad Khurram; Shon, Taeshik

    2016-11-01

Recently, several authentication schemes have been proposed for telecare medicine information systems (TMIS). Many such schemes have been shown to be weak against known attacks. Furthermore, numerous such schemes cannot be used in real-time scenarios because they assume a single server for authentication across the globe. Very recently, Amin et al. (J. Med. Syst. 39(11):180, 2015) designed an authentication scheme for secure communication between a patient and a medical practitioner using a trusted central medical server. They claimed their scheme satisfies all security requirements and emphasized its efficiency. However, the analysis in this article proves that the scheme designed by Amin et al. is vulnerable to stolen smart card and stolen verifier attacks. Furthermore, their scheme has scalability issues along with inefficient password change and password recovery phases. We then propose an improved scheme. The proposed scheme is more practical, secure and lightweight than Amin et al.'s scheme. The security of the proposed scheme is proved using the popular automated tool ProVerif.

  13. A Visual Cryptography Based Watermark Technology for Individual and Group Images

    Directory of Open Access Journals (Sweden)

    Azzam Sleit

    2007-04-01

Full Text Available The ease with which digital information can be duplicated and distributed has led to the need for effective copyright protection tools. Various techniques, including watermarking, have been introduced in an attempt to address these growing concerns. Most watermarking algorithms call for a piece of information to be hidden directly in media content, in such a way that it is imperceptible to a human observer but detectable by a computer. This paper presents an improved cryptographic watermark method based on the Hwang and Naor-Shamir [1, 2] approaches. The technique does not require the watermark pattern to be embedded into the original digital image. Verification information is generated and used to validate the ownership of the image or a group of images. The watermark pattern can be any bitmap image. Experimental results show that the proposed method can recover the watermark pattern from the marked image (or group of images) even if major changes, such as rotation, scaling and distortion, are applied to the original digital image or any member of the image group.

  14. Objective definition of rainfall intensity-duration thresholds for the initiation of post-fire debris flows in southern California

    Science.gov (United States)

    Staley, Dennis; Kean, Jason W.; Cannon, Susan H.; Schmidt, Kevin M.; Laber, Jayme L.

    2012-01-01

    Rainfall intensity–duration (ID) thresholds are commonly used to predict the temporal occurrence of debris flows and shallow landslides. Typically, thresholds are subjectively defined as the upper limit of peak rainstorm intensities that do not produce debris flows and landslides, or as the lower limit of peak rainstorm intensities that initiate debris flows and landslides. In addition, peak rainstorm intensities are often used to define thresholds, as data regarding the precise timing of debris flows and associated rainfall intensities are usually not available, and rainfall characteristics are often estimated from distant gauging locations. Here, we attempt to improve the performance of existing threshold-based predictions of post-fire debris-flow occurrence by utilizing data on the precise timing of debris flows relative to rainfall intensity, and develop an objective method to define the threshold intensities. We objectively defined the thresholds by maximizing the number of correct predictions of debris flow occurrence while minimizing the rate of both Type I (false positive) and Type II (false negative) errors. We identified that (1) there were statistically significant differences between peak storm and triggering intensities, (2) the objectively defined threshold model presents a better balance between predictive success, false alarms and failed alarms than previous subjectively defined thresholds, (3) thresholds based on measurements of rainfall intensity over shorter duration (≤60 min) are better predictors of post-fire debris-flow initiation than longer duration thresholds, and (4) the objectively defined thresholds were exceeded prior to the recorded time of debris flow at frequencies similar to or better than subjective thresholds. Our findings highlight the need to better constrain the timing and processes of initiation of landslides and debris flows for future threshold studies. In addition, the methods used to define rainfall thresholds in this
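The objective threshold definition described above (maximize correct predictions while balancing Type I and Type II errors) can be sketched with a single skill score. The event data and the choice of the threat score (critical success index, CSI) as the objective are illustrative assumptions, not the study's data or exact objective function:

```python
# Hypothetical (intensity, debris-flow?) pairs: 60-min peak rainfall intensity (mm/h).
events = [(4, False), (5, False), (6, False), (7, True), (8, True),
          (9, False), (10, False), (11, True), (13, True), (15, True)]

def threat_score(threshold: float) -> float:
    """CSI = hits / (hits + false alarms + misses) for a candidate threshold."""
    tp = sum(1 for i, e in events if i >= threshold and e)       # hits
    fp = sum(1 for i, e in events if i >= threshold and not e)   # Type I errors
    fn = sum(1 for i, e in events if i < threshold and e)        # Type II errors
    denom = tp + fp + fn
    return tp / denom if denom else 0.0

# Objectively pick the threshold: try each observed intensity, keep the best CSI.
best = max((i for i, _ in events), key=threat_score)
print(f"objective threshold: {best} mm/h (CSI = {threat_score(best):.2f})")
```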

  15. Tracking of nociceptive thresholds using adaptive psychophysical methods

    NARCIS (Netherlands)

    Doll, Robert; Buitenweg, Jan R.; Meijer, Hil Gaétan Ellart; Veltink, Petrus H.

    Psychophysical thresholds reflect the state of the underlying nociceptive mechanisms. For example, noxious events can activate endogenous analgesic mechanisms that increase the nociceptive threshold. Therefore, tracking thresholds over time facilitates the investigation of the dynamics of these
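A common adaptive psychophysical method for tracking thresholds over time is the staircase procedure. Below is a minimal 1-up/1-down staircase against a simulated observer; the observer model, its 2.0 (arbitrary-unit) threshold, and the step size are all assumptions for illustration:

```python
import math
import random

# 1-up/1-down staircase: step down after a detection, up after a miss.
# This rule converges on the stimulus level detected 50% of the time.
random.seed(1)
TRUE_THRESHOLD = 2.0   # hypothetical amplitude the simulated observer detects 50% of the time

def observer_detects(amplitude: float) -> bool:
    """Simulated observer with a smooth psychometric function around the threshold."""
    p = 1.0 / (1.0 + math.exp(-(amplitude - TRUE_THRESHOLD) / 0.2))
    return random.random() < p

amplitude, step = 0.5, 0.1
history = []
for _ in range(300):
    history.append(amplitude)
    amplitude += -step if observer_detects(amplitude) else step

estimate = sum(history[-100:]) / 100   # average of late trials tracks the threshold
print(f"tracked threshold ~ {estimate:.2f}")
```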

  16. Identifying Threshold Concepts for Information Literacy: A Delphi Study

    Directory of Open Access Journals (Sweden)

    Lori Townsend

    2016-06-01

    Full Text Available This study used the Delphi method to engage expert practitioners on the topic of threshold concepts for information literacy. A panel of experts considered two questions. First, is the threshold concept approach useful for information literacy instruction? The panel unanimously agreed that the threshold concept approach holds potential for information literacy instruction. Second, what are the threshold concepts for information literacy instruction? The panel proposed and discussed over fifty potential threshold concepts, finally settling on six information literacy threshold concepts.

  17. Cost-effectiveness thresholds: pros and cons.

    Science.gov (United States)

    Bertram, Melanie Y; Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-12-01

    Cost-effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost-effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost-effectiveness thresholds allow cost-effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization's Commission on Macroeconomics in Health suggested cost-effectiveness thresholds based on multiples of a country's per-capita gross domestic product (GDP). In some contexts, in choosing which health interventions to fund and which not to fund, these thresholds have been used as decision rules. However, experience with the use of such GDP-based thresholds in decision-making processes at country level shows them to lack country specificity and this - in addition to uncertainty in the modelled cost-effectiveness ratios - can lead to the wrong decision on how to spend health-care resources. Cost-effectiveness information should be used alongside other considerations - e.g. budget impact and feasibility considerations - in a transparent decision-making process, rather than in isolation based on a single threshold value. Although cost-effectiveness ratios are undoubtedly informative in assessing value for money, countries should be encouraged to develop a context-specific process for decision-making that is supported by legislation, has stakeholder buy-in, for example the involvement of civil society organizations and patient groups, and is transparent, consistent and fair.

  18. Improving the security of arbitrated quantum signature against the forgery attack

    Science.gov (United States)

    Zhang, Ke-Jia; Zhang, Wei-Wei; Li, Dan

    2013-08-01

As a feasible model for signing quantum messages, arbitrated quantum signature (AQS) and its cryptanalysis and improvement have received a great deal of attention in recent years. However, in this paper we find that the previous improvement is not suitably implemented in some typical AQS protocols, in the sense that the receiver, Bob, can forge a valid signature under a known message attack. We describe the forgery strategy and present corresponding improved strategies to withstand the forgery attack by modifying the encryption algorithm, an important part of AQS. These works preserve the merits of AQS and suggest potential improvements to the security of quantum signatures and other cryptographic problems.

  19. Pursuing optimal thresholds to recommend breast biopsy by quantifying the value of tomosynthesis

    Science.gov (United States)

    Wu, Yirong; Alagoz, Oguzhan; Vanness, David J.; Trentham-Dietz, Amy; Burnside, Elizabeth S.

    2014-03-01

A 2% threshold has been traditionally used to recommend breast biopsy in mammography. We aim to characterize how the biopsy threshold varies to achieve the maximum expected utility (MEU) of tomosynthesis for breast cancer diagnosis. A cohort of 312 patients, imaged with standard full field digital mammography (FFDM) and digital breast tomosynthesis (DBT), was selected for a reader study. Fifteen readers interpreted each patient's images and estimated the probability of malignancy using two modes: FFDM versus FFDM + DBT. We generated receiver operating characteristic (ROC) curves with the probabilities for all readers combined. We found that FFDM+DBT provided improved accuracy and MEU compared with FFDM alone. When DBT was included in the diagnosis along with FFDM, the optimal biopsy threshold increased to 2.7% as compared with the 2% threshold for FFDM alone. While understanding the optimal threshold from a decision analytic standpoint will not help physicians improve their performance without additional guidance (e.g. decision support to reinforce this threshold), the discovery of this level does demonstrate the potential clinical improvements attainable with DBT. Specifically, DBT has the potential to lead to substantial improvements in breast cancer diagnosis since it could reduce the number of patients recommended for biopsy while preserving the maximal expected utility.
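The decision-analytic logic behind an optimal biopsy threshold can be sketched in a few lines. The utility values below are hypothetical, chosen so the optimum reproduces the traditional 2% rule; they are not taken from the study:

```python
# Expected utility of the two actions at malignancy probability p:
#   EU_biopsy    = p*U_TP + (1-p)*U_FP
#   EU_no_biopsy = p*U_FN + (1-p)*U_TN
# Setting them equal gives the optimal threshold p* = harm / (harm + benefit).
U_TP, U_FN = 1.00, 0.02   # biopsying a cancer vs. missing it (hypothetical utilities)
U_TN, U_FP = 1.00, 0.98   # sparing a benign case vs. an unnecessary biopsy

harm = U_TN - U_FP        # net harm of a false-positive biopsy
benefit = U_TP - U_FN     # net benefit of biopsying a true cancer
p_star = harm / (harm + benefit)
print(f"biopsy when estimated malignancy probability exceeds {p_star:.1%}")
```

With these utilities the threshold comes out to exactly 2%; under DBT's different accuracy trade-offs, the same calculation can shift the optimum, as the study reports.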

  20. Overwash threshold experiment for gravel barriers

    Science.gov (United States)

    Matias, Ana; Williams, Jon; Bradbury, Andrew; Masselink, Gerhard; Ferreira, Óscar

    2010-05-01

Field measurements of overwash effects, associated physical forcing, and determination of threshold conditions, are much less common for gravel than for sandy barriers (e.g., field measurements by Lorang, 2002; Bradbury et al., 2005; and laboratory studies by Obhrai et al., 2008). In order to define overwash thresholds for gravel there is a need for measurements under a variety of forcing conditions that include waves, tides and surges. Flume experiments allow the manipulation of physical forcing and can make a valuable contribution to improving the understanding and prediction of overwash. To study gravel barrier overwash processes, the BARDEX prototype-scale laboratory experiment was undertaken in the Delta flume (Williams et al., 2009). A 4 m high, 50 m wide gravel barrier composed of sediments with D50 = 10 mm was emplaced in the flume and subjected to a range of water levels, wave heights and wave periods. Barrier morphology was surveyed before and after each run. Two situations were simulated: overwashing and overtopping. Following Orford and Carter (1982) terminology, the distinction between overtopping and overwash was based on the type of morphological change over the barrier crest. Overtopping causes vertical accretion at the crest, whereas overwashing promotes the formation of washover deposits landwards from the crest. Ten overwash experiments were conducted (divided into 63 runs); overtopping was recorded in 22 runs and overwash in 20 runs. In the other runs, only the beach face was reworked by waves. In a systematic series of tests, water levels were varied between 3.00 m and 3.75 m (in steps of 0.125 m); wave height was varied between 0.8 m and 1.3 m (in steps of 0.05 or 0.1 m); and wave periods of 4.5, 6, 7 and 8 seconds were used. These hydrodynamic conditions were used to compute wave run-up using several well-known formulae (cf., Powell, 1990; Stockdon et al., 2007). Comparison between run-up estimations and the barrier crest elevation prior to wave

  1. Regression Discontinuity Designs Based on Population Thresholds

    DEFF Research Database (Denmark)

    Eggers, Andrew C.; Freier, Ronny; Grembi, Veronica

    In many countries, important features of municipal government (such as the electoral system, mayors' salaries, and the number of councillors) depend on whether the municipality is above or below arbitrary population thresholds. Several papers have used a regression discontinuity design (RDD) to measure the effects of these threshold-based policies on political and economic outcomes. Using evidence from France, Germany, and Italy, we highlight two common pitfalls that arise in exploiting population-based policies (confounded treatment and sorting) and we provide guidance for detecting and addressing these pitfalls. Even when these problems are present, population-threshold RDD may be the best available research design for studying the effects of certain policies and political institutions.

  2. Effects of pulse duration on magnetostimulation thresholds

    Energy Technology Data Exchange (ETDEWEB)

    Saritas, Emine U., E-mail: saritas@ee.bilkent.edu.tr [Department of Bioengineering, University of California, Berkeley, Berkeley, California 94720-1762 (United States); Department of Electrical and Electronics Engineering, Bilkent University, Bilkent, Ankara 06800 (Turkey); National Magnetic Resonance Research Center (UMRAM), Bilkent University, Bilkent, Ankara 06800 (Turkey); Goodwill, Patrick W. [Department of Bioengineering, University of California, Berkeley, Berkeley, California 94720-1762 (United States); Conolly, Steven M. [Department of Bioengineering, University of California, Berkeley, Berkeley, California 94720-1762 (United States); Department of EECS, University of California, Berkeley, California 94720-1762 (United States)

    2015-06-15

    Purpose: Medical imaging techniques such as magnetic resonance imaging and magnetic particle imaging (MPI) utilize time-varying magnetic fields that are subject to magnetostimulation limits, which often limit the speed of the imaging process. Various human-subject experiments have studied the amplitude and frequency dependence of these thresholds for gradient or homogeneous magnetic fields. Another contributing factor was shown to be number of cycles in a magnetic pulse, where the thresholds decreased with longer pulses. The latter result was demonstrated on two subjects only, at a single frequency of 1.27 kHz. Hence, whether the observed effect was due to the number of cycles or due to the pulse duration was not specified. In addition, a gradient-type field was utilized; hence, whether the same phenomenon applies to homogeneous magnetic fields remained unknown. Here, the authors investigate the pulse duration dependence of magnetostimulation limits for a 20-fold range of frequencies using homogeneous magnetic fields, such as the ones used for the drive field in MPI. Methods: Magnetostimulation thresholds were measured in the arms of six healthy subjects (age: 27 ± 5 yr). Each experiment comprised testing the thresholds at eight different pulse durations between 2 and 125 ms at a single frequency, which took approximately 30–40 min/subject. A total of 34 experiments were performed at three different frequencies: 1.2, 5.7, and 25.5 kHz. A solenoid coil providing homogeneous magnetic field was used to induce stimulation, and the field amplitude was measured in real time. A pre-emphasis based pulse shaping method was employed to accurately control the pulse durations. Subjects reported stimulation via a mouse click whenever they felt a twitching/tingling sensation. A sigmoid function was fitted to the subject responses to find the threshold at a specific frequency and duration, and the whole procedure was repeated at all relevant frequencies and pulse durations

  3. THRESHOLD PARAMETER OF THE EXPECTED LOSSES

    Directory of Open Access Journals (Sweden)

    Josip Arnerić

    2012-12-01

    Full Text Available The objective of extreme value analysis is to quantify the probabilistic behavior of unusually large losses using only extreme values above some high threshold, rather than all of the data, which gives a better fit to the tail distribution than traditional methods that assume normality. In our case we estimate market risk using daily returns of the CROBEX index at the Zagreb Stock Exchange. It is therefore necessary to define the distribution of excesses above some threshold; the Generalized Pareto Distribution (GPD) is used as much more reliable than the normal distribution, since it puts the accent on the extreme values. The parameters of the GPD are estimated by the maximum likelihood method (MLE). The contribution of this paper is to specify a threshold which is high enough that the GPD approximation is valid, but low enough that a sufficient number of observations are available for a precise fit.
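
    A minimal sketch of the GPD fit to threshold excesses described above. Synthetic excesses stand in for the CROBEX returns, and a coarse grid search replaces a proper numerical optimizer; the true parameters and grid ranges are assumptions of this sketch.

```python
import math
import random

random.seed(0)
true_xi, true_beta = 0.2, 1.0
# Synthetic excesses over a high threshold, drawn from a GPD via inverse CDF:
# y = (beta/xi) * ((1-U)^(-xi) - 1).
y = [true_beta / true_xi * ((1 - random.random()) ** (-true_xi) - 1)
     for _ in range(2000)]

def nll(xi, beta):
    # Negative log-likelihood of the GPD (xi > 0 branch).
    s = 0.0
    for v in y:
        s += math.log(beta) + (1 / xi + 1) * math.log(1 + xi * v / beta)
    return s

# Coarse grid-search MLE (a real analysis would use a numerical optimizer).
xi_hat, beta_hat = min(
    ((x / 100, b / 100) for x in range(5, 60, 5) for b in range(50, 200, 5)),
    key=lambda p: nll(*p))
print(xi_hat, beta_hat)  # should recover roughly (0.2, 1.0)
```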

  4. Threshold Theory Tested in an Organizational Setting

    DEFF Research Database (Denmark)

    Christensen, Bo T.; Hartmann, Peter V. W.; Hedegaard Rasmussen, Thomas

    2017-01-01

    A large sample of leaders (N = 4257) was used to test the link between leader innovativeness and intelligence. The threshold theory of the link between creativity and intelligence assumes that below a certain IQ level (approximately IQ 120), there is some correlation between IQ and creative potential, but above this cutoff point, there is no correlation. Support for the threshold theory of creativity was found, in that the correlation between IQ and innovativeness was positive and significant below a cutoff point of IQ 120. Above the cutoff, no significant relation was identified, and the two correlations differed significantly. The finding was stable across distinct parts of the sample, providing support for the theory, although the correlations in all subsamples were small. The findings lend support to the existence of threshold effects using perceptual measures of behavior in real organizational settings.
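
    The split-sample test described above can be sketched as follows; the (IQ, creativity) pairs are simulated with an assumed threshold at 120, not the study's leader data.

```python
import random
import statistics

random.seed(7)
# Hypothetical (IQ, creativity) pairs: correlated below 120, flat above.
pairs = []
for _ in range(500):
    iq = random.gauss(110, 15)
    creat = (0.04 * iq if iq < 120 else 0.04 * 120) + random.gauss(0, 0.5)
    pairs.append((iq, creat))

def pearson(xs, ys):
    # Plain Pearson correlation coefficient.
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

below = [(x, y) for x, y in pairs if x < 120]
above = [(x, y) for x, y in pairs if x >= 120]
r_below = pearson(*zip(*below))   # expected clearly positive
r_above = pearson(*zip(*above))   # expected near zero
print(round(r_below, 2), round(r_above, 2))
```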

  5. Thresholding functional connectomes by means of mixture modeling.

    Science.gov (United States)

    Bielczyk, Natalia Z; Walocha, Fabian; Ebel, Patrick W; Haak, Koen V; Llera, Alberto; Buitelaar, Jan K; Glennon, Jeffrey C; Beckmann, Christian F

    2018-05-01

    Functional connectivity has been shown to be a very promising tool for studying the large-scale functional architecture of the human brain. In network research in fMRI, functional connectivity is considered as a set of pair-wise interactions between the nodes of the network. These interactions are typically operationalized through the full or partial correlation between all pairs of regional time series. Estimating the structure of the latent underlying functional connectome from the set of pair-wise partial correlations remains an open research problem, however. Typically, this thresholding problem is approached by proportional thresholding, or by means of parametric or non-parametric permutation testing across a cohort of subjects at each possible connection. As an alternative, we propose a data-driven thresholding approach for network matrices based on mixture modeling. This approach allows for creating subject-specific sparse connectomes by modeling the full set of partial correlations as a mixture of low correlation values associated with weak or unreliable edges in the connectome and a sparse set of reliable connections. Consequently, we propose an alternative thresholding strategy based on the model fit, using pseudo-False Discovery Rates derived from the empirical null estimated as part of the mixture distribution. We evaluate the method on synthetic benchmark fMRI datasets where the underlying network structure is known, and demonstrate that it gives improved performance with respect to alternative methods for thresholding connectomes at the canonical thresholding levels. We also demonstrate that mixture modeling gives highly reproducible results when applied to the functional connectomes of the visual system derived from the n-back Working Memory task in the Human Connectome Project. The sparse connectomes obtained from mixture modeling are further discussed in the light of previous knowledge of the functional architecture
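
    The mixture idea can be sketched with a two-component Gaussian EM fit: one component plays the empirical null of weak edges, the other the reliable connections. The simulated "partial correlations" and the simple posterior-probability cut (in place of the paper's pseudo-FDR step) are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(1)
# Illustrative "partial correlations": many weak/null edges plus a sparse
# set of real connections (not actual connectome data).
null_edges = rng.normal(0.0, 0.05, 900)
real_edges = rng.normal(0.4, 0.08, 100)
r = np.concatenate([null_edges, real_edges])

# Two-component Gaussian mixture fitted by EM (empirical null + signal).
mu = np.array([0.0, 0.3])
sd = np.array([0.1, 0.1])
w = np.array([0.5, 0.5])
for _ in range(200):
    # E-step: posterior responsibility of each component for each edge.
    dens = w * np.exp(-0.5 * ((r[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: update weights, means, and standard deviations.
    nk = resp.sum(axis=0)
    w = nk / len(r)
    mu = (resp * r[:, None]).sum(axis=0) / nk
    sd = np.maximum(np.sqrt((resp * (r[:, None] - mu) ** 2).sum(axis=0) / nk), 1e-4)

signal = np.argmax(mu)              # component with the larger mean = signal
keep = resp[:, signal] > 0.5        # retain edges with high signal posterior
print(int(keep.sum()))              # size of the sparse connectome
```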

  6. Optimal Investment Under Transaction Costs: A Threshold Rebalanced Portfolio Approach

    Science.gov (United States)

    Tunc, Sait; Donmez, Mehmet Ali; Kozat, Suleyman Serdar

    2013-06-01

    We study optimal investment in a financial market having a finite number of assets from a signal processing perspective. We investigate how an investor should distribute capital over these assets and when to reallocate the distribution of funds over these assets to maximize the cumulative wealth over any investment period. In particular, we introduce a portfolio selection algorithm that maximizes the expected cumulative wealth in i.i.d. two-asset discrete-time markets where the market levies proportional transaction costs in buying and selling stocks. We achieve this using "threshold rebalanced portfolios", where trading occurs only if the portfolio breaches certain thresholds. Under the assumption that the relative price sequences have a log-normal distribution from the Black-Scholes model, we evaluate the expected wealth under proportional transaction costs and find the threshold rebalanced portfolio that achieves the maximal expected cumulative wealth over any investment period. Our derivations can be readily extended to markets having more than two stocks, as pointed out in the paper. As predicted by our derivations, we significantly improve the achieved wealth over portfolio selection algorithms from the literature on historical data sets.
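
    The no-trade-band mechanism can be sketched as below: trade only when the portfolio weight breaches the threshold band, paying a proportional cost on the traded amount. All parameters (costs, drift, volatility, band width) are illustrative assumptions, not the paper's optimized values.

```python
import math
import random

random.seed(42)
c = 0.005               # proportional transaction cost (illustrative)
target, eps = 0.5, 0.1  # target weight of asset 1 and no-trade band

v1 = v2 = 0.5           # capital held in each of the two assets
for _ in range(1000):
    # i.i.d. log-normal relative prices (parameters are illustrative).
    v1 *= math.exp(random.gauss(0.0003, 0.01))
    v2 *= math.exp(random.gauss(0.0003, 0.02))
    b = v1 / (v1 + v2)  # current weight of asset 1
    if not (target - eps <= b <= target + eps):
        # Weight breached the threshold band: rebalance back to target,
        # paying the proportional cost on both the sale and the purchase.
        total = v1 + v2
        traded = abs(v1 - target * total)
        total -= 2 * c * traded
        v1, v2 = target * total, (1 - target) * total

wealth = v1 + v2
print(wealth)
```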

  7. Social contagion with degree-dependent thresholds

    Science.gov (United States)

    Lee, Eun; Holme, Petter

    2017-07-01

    We investigate opinion spreading by a threshold model in a situation in which the influence of people is heterogeneously distributed. We assume that there is a coupling between the influence of an individual (measured by the out-degree) and the threshold for accepting a new opinion or habit. We find that if the coupling is strongly positive, the final state of the system will be a mix of different opinions. Otherwise, it will converge to a consensus state. This phenomenon cannot simply be explained as a phase transition, but it is a combined effect of mechanisms and their relative dominance in different regions of parameter space.
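
    A minimal sketch of a threshold cascade with the out-degree/threshold coupling described above. The graph model, seed count, and the linear coupling form are assumptions of this sketch, not the authors' exact model.

```python
import random

random.seed(3)
N = 300
# Heterogeneous influence: out-degree drawn uniformly from 1..10.
k = {i: random.randint(1, 10) for i in range(N)}
out = {i: random.sample([j for j in range(N) if j != i], k[i]) for i in range(N)}
nbrs_in = {i: [] for i in range(N)}
for i, targets in out.items():
    for j in targets:
        nbrs_in[j].append(i)

# Positive coupling (illustrative): influential nodes (high out-degree)
# require a larger fraction of adopting in-neighbors to adopt themselves.
phi = {i: 0.1 + 0.5 * k[i] / 10 for i in range(N)}

adopted = set(random.sample(range(N), 10))  # initial seed adopters
changed = True
while changed:
    changed = False
    for i in range(N):
        if i in adopted or not nbrs_in[i]:
            continue
        frac = sum(j in adopted for j in nbrs_in[i]) / len(nbrs_in[i])
        if frac >= phi[i]:          # threshold rule
            adopted.add(i)
            changed = True
print(len(adopted))                 # final cascade size
```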

  8. Thresholds in Xeric Hydrology and Biogeochemistry

    Science.gov (United States)

    Meixner, T.; Brooks, P. D.; Simpson, S. C.; Soto, C. D.; Yuan, F.; Turner, D.; Richter, H.

    2011-12-01

    Due to water limitation, thresholds in hydrologic and biogeochemical processes are common in arid and semi-arid systems. Some of these thresholds such as those focused on rainfall runoff relationships have been well studied. However to gain a full picture of the role that thresholds play in driving the hydrology and biogeochemistry of xeric systems a full view of the entire array of processes at work is needed. Here a walk through the landscape of xeric systems will be conducted illustrating the powerful role of hydrologic thresholds on xeric system biogeochemistry. To understand xeric hydro-biogeochemistry two key ideas need to be focused on. First, it is important to start from a framework of reaction and transport. Second an understanding of the temporal and spatial components of thresholds that have a large impact on hydrologic and biogeochemical fluxes needs to be offered. In the uplands themselves episodic rewetting and drying of soils permits accelerated biogeochemical processing but also more gradual drainage of water through the subsurface than expected in simple conceptions of biogeochemical processes. Hydrologic thresholds (water content above hygroscopic) results in a stop start nutrient spiral of material across the landscape since runoff connecting uplands to xeric perennial riparian is episodic and often only transports materials a short distance (100's of m). This episodic movement results in important and counter-intuitive nutrient inputs to riparian zones but also significant processing and uptake of nutrients. The floods that transport these biogeochemicals also result in significant input to riparian groundwater and may be key to sustaining these critical ecosystems. Importantly the flood driven recharge process itself is a threshold process dependent on flood characteristics (floods greater than 100 cubic meters per second) and antecedent conditions (losing to near neutral gradients). Floods also appear to influence where arid and semi

  9. Color image Segmentation using automatic thresholding techniques

    International Nuclear Information System (INIS)

    Harrabi, R.; Ben Braiek, E.

    2011-01-01

    In this paper, entropy- and between-class-variance-based thresholding methods for color image segmentation are studied. The maximization of the between-class variance (MVI) and of the entropy (ME) have been used as criterion functions to determine an optimal threshold for segmenting images into nearly homogeneous regions. Segmentation results from the two methods are validated, the segmentation sensitivity for the available test data is evaluated, and a comparative study between these methods in different color spaces is presented. The experimental results demonstrate the superiority of the MVI method for color image segmentation.
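
    The between-class-variance criterion (Otsu's method) can be sketched on a toy histogram: try every cut and keep the one maximizing the weighted variance between the two classes. The 8-bin histogram is an illustrative assumption.

```python
# Maximizing the between-class variance on a toy 8-level histogram
# (illustrative pixel counts, one channel; real images use 256 levels).
hist = [10, 40, 80, 20, 5, 30, 70, 25]
total = sum(hist)

best_t, best_var = 0, -1.0
for t in range(1, len(hist)):        # threshold splits [0, t) and [t, end)
    w0 = sum(hist[:t]) / total       # class weights
    w1 = 1 - w0
    if w0 == 0 or w1 == 0:
        continue
    mu0 = sum(i * h for i, h in enumerate(hist[:t])) / (w0 * total)
    mu1 = sum(i * h for i, h in enumerate(hist[t:], start=t)) / (w1 * total)
    var_between = w0 * w1 * (mu0 - mu1) ** 2
    if var_between > best_var:
        best_t, best_var = t, var_between
print(best_t)                        # optimal threshold level
```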

  10. The threshold effects of meteorological factors on Hand, foot, and mouth disease (HFMD) in China, 2011

    OpenAIRE

    Du, Zhicheng; Zhang, Wangjian; Zhang, Dingmei; Yu, Shicheng; Hao, Yuantao

    2016-01-01

    We explored the threshold effects of meteorological factors on hand, foot and mouth disease (HFMD) in mainland China to improve the prevention and early warning. Using HFMD surveillance and meteorological data in 2011, we identified the threshold effects of predictors on the monthly incidence of HFMD and predicted the high risk months, with classification and regression tree models (CART). The results of the classification tree showed that there was an 82.35% chance for a high risk of HFMD wh...
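
    The CART-style threshold detection can be sketched with a single regression split: choose the cut point of a predictor that minimizes the summed squared error of the two resulting groups. The (temperature, incidence) pairs are hypothetical, not the study's surveillance data.

```python
# One CART-style split: find the temperature cut that best separates
# high- from low-incidence months (illustrative data, not the study's).
data = [(8, 1.2), (10, 1.5), (14, 2.0), (18, 4.8),
        (22, 6.1), (25, 6.5), (27, 7.0), (30, 6.8)]
# (mean monthly temperature in deg C, HFMD incidence per 100k) -- hypothetical

def sse(ys):
    # Sum of squared errors around the group mean.
    if not ys:
        return 0.0
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys)

# Candidate cuts are midpoints between consecutive x values.
best = min(
    ((x0 + x1) / 2 for (x0, _), (x1, _) in zip(data, data[1:])),
    key=lambda cut: sse([y for x, y in data if x <= cut])
                  + sse([y for x, y in data if x > cut]))
print(best)  # detected threshold temperature
```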

  11. Input-output relation and energy efficiency in the neuron with different spike threshold dynamics

    Directory of Open Access Journals (Sweden)

    Guo-Sheng eYi

    2015-05-01

    Full Text Available Neuron encodes and transmits information through generating sequences of output spikes, which is a high energy-consuming process. The spike is initiated when membrane depolarization reaches a threshold voltage. In many neurons, threshold is dynamic and depends on the rate of membrane depolarization (dV/dt) preceding a spike. Identifying the metabolic energy involved in neural coding and their relationship to threshold dynamic is critical to understanding neuronal function and evolution. Here, we use a modified Morris-Lecar model to investigate neuronal input-output property and energy efficiency associated with different spike threshold dynamics. We find that the neurons with dynamic threshold sensitive to dV/dt generate discontinuous frequency-current curve and type II phase response curve (PRC) through Hopf bifurcation, and weak noise could prohibit spiking when bifurcation just occurs. The threshold that is insensitive to dV/dt, instead, results in a continuous frequency-current curve, a type I PRC and a saddle-node on invariant circle bifurcation, and simultaneously weak noise cannot inhibit spiking. It is also shown that the bifurcation, frequency-current curve and PRC type associated with different threshold dynamics arise from the distinct subthreshold interactions of membrane currents. Further, we observe that the energy consumption of the neuron is related to its firing characteristics. The depolarization of spike threshold improves neuronal energy efficiency by reducing the overlap of Na+ and K+ currents during an action potential. The high energy efficiency is achieved at more depolarized spike threshold and high stimulus current. These results provide a fundamental biophysical connection that links spike threshold dynamics, input-output relation, energetics and spike initiation, which could contribute to uncover neural encoding mechanism.
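
    A dynamic, dV/dt-dependent spike threshold can be illustrated with a toy leaky integrate-and-fire neuron. This is explicitly NOT the modified Morris-Lecar model used in the paper; all parameters are illustrative.

```python
# Toy leaky integrate-and-fire neuron whose spike threshold rises with the
# recent rate of depolarization (dV/dt). Illustrative parameters only.
dt, tau, R = 0.1, 10.0, 1.0   # time step (ms), membrane tau (ms), resistance
I = 25.0                      # constant stimulus current
theta0, k = 15.0, 0.5         # resting threshold and dV/dt sensitivity

v, dv_dt, spikes = 0.0, 0.0, 0
for _ in range(int(200 / dt)):            # simulate 200 ms
    theta = theta0 + k * max(dv_dt, 0.0)  # threshold depends on recent dV/dt
    dv_dt = (-v + R * I) / tau            # leaky integration
    v += dv_dt * dt
    if v >= theta:                        # spike initiation at the threshold
        spikes += 1
        v = 0.0                           # reset after a spike
print(spikes)
```

With a dV/dt-insensitive threshold (k = 0), the same input would produce a slightly higher firing rate, which is the kind of input-output difference the paper analyzes.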

  12. Input-output relation and energy efficiency in the neuron with different spike threshold dynamics.

    Science.gov (United States)

    Yi, Guo-Sheng; Wang, Jiang; Tsang, Kai-Ming; Wei, Xi-Le; Deng, Bin

    2015-01-01

    Neuron encodes and transmits information through generating sequences of output spikes, which is a high energy-consuming process. The spike is initiated when membrane depolarization reaches a threshold voltage. In many neurons, threshold is dynamic and depends on the rate of membrane depolarization (dV/dt) preceding a spike. Identifying the metabolic energy involved in neural coding and their relationship to threshold dynamic is critical to understanding neuronal function and evolution. Here, we use a modified Morris-Lecar model to investigate neuronal input-output property and energy efficiency associated with different spike threshold dynamics. We find that the neurons with dynamic threshold sensitive to dV/dt generate discontinuous frequency-current curve and type II phase response curve (PRC) through Hopf bifurcation, and weak noise could prohibit spiking when bifurcation just occurs. The threshold that is insensitive to dV/dt, instead, results in a continuous frequency-current curve, a type I PRC and a saddle-node on invariant circle bifurcation, and simultaneously weak noise cannot inhibit spiking. It is also shown that the bifurcation, frequency-current curve and PRC type associated with different threshold dynamics arise from the distinct subthreshold interactions of membrane currents. Further, we observe that the energy consumption of the neuron is related to its firing characteristics. The depolarization of spike threshold improves neuronal energy efficiency by reducing the overlap of Na(+) and K(+) currents during an action potential. The high energy efficiency is achieved at more depolarized spike threshold and high stimulus current. These results provide a fundamental biophysical connection that links spike threshold dynamics, input-output relation, energetics and spike initiation, which could contribute to uncover neural encoding mechanism.

  13. Low-threshold support for families with dementia in Germany

    Directory of Open Access Journals (Sweden)

    Hochgraeber Iris

    2012-06-01

    Full Text Available Abstract Background Low-threshold support services are a part of the German health care system and help relieve family caregivers. There is limited information available on how to construct and implement low-threshold support services for people with dementia and their families in Germany. Some studies separately describe different perspectives of experiences and expectations, but no study combines all the different perspectives of those involved and takes the arrangements and organisation, as well as opinions on supporting and inhibiting factors, into consideration. Findings This protocol describes the design of a study on low-threshold support services for families with a person with dementia in two German regions. The aim is to develop recommendations on how to build up these services and how to implement them in a region. A quantitative as well as a qualitative approach will be used. The quantitative part will be a survey on the characteristics of service users and providers, as well as the health care structures of the two project regions, and an evaluation of important aspects derived from a literature search. Group discussions and semi-structured interviews will be carried out to gain deeper insight into the facilitators and barriers for both using and providing these services. All people involved will be included, such as the people with dementia, their relatives, volunteers, coordinators and institution representatives. Discussion Results of this study will provide important aspects for policymakers who are interested in effective and low-threshold support for people with dementia. Furthermore, the emerging recommendations can help staff and institutions to improve quality of care and can contribute to developing health and social care structures in Germany.

  14. Comparison between intensity- duration thresholds and cumulative rainfall thresholds for the forecasting of landslide

    Science.gov (United States)

    Lagomarsino, Daniela; Rosi, Ascanio; Rossi, Guglielmo; Segoni, Samuele; Catani, Filippo

    2014-05-01

    This work makes a quantitative comparison between the results of landslide forecasting obtained using two different rainfall threshold models, one using intensity-duration thresholds and the other based on cumulative rainfall thresholds, in an area of northern Tuscany of 116 km2. The first methodology identifies rainfall intensity-duration thresholds by means of a software tool called MaCumBA (Massive CUMulative Brisk Analyzer) that analyzes rain-gauge records, extracts the intensities (I) and durations (D) of the rainstorms associated with the initiation of landslides, plots these values on a diagram, and identifies thresholds that define the lower bounds of the I-D values. A back analysis using data from past events can be used to identify the threshold conditions associated with the fewest false alarms. The second method (SIGMA) is based on the hypothesis that anomalous or extreme values of rainfall are responsible for landslide triggering: the statistical distribution of the rainfall series is analyzed, and multiples of the standard deviation (σ) are used as thresholds to discriminate between ordinary and extraordinary rainfall events. The name of the model, SIGMA, reflects the central role of the standard deviation in the proposed methodology. The definition of intensity-duration rainfall thresholds requires the combined use of rainfall measurements and an inventory of dated landslides, whereas the SIGMA model can be implemented using only rainfall data. These two methodologies were applied in an area of 116 km2 where a database of 1200 landslides was available for the period 2000-2012. The results obtained are compared and discussed. Although several examples of visual comparisons between different intensity-duration rainfall thresholds are reported in the international literature, a quantitative comparison between thresholds obtained in the same area using different techniques and approaches remains a relatively undebated research topic.
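
    The SIGMA-style criterion can be sketched as below: flag days whose rainfall exceeds the series mean by a chosen multiple of the standard deviation. The rainfall series and the 2σ multiple are illustrative assumptions, not the Tuscany dataset or the model's calibrated multiples.

```python
import statistics

# Illustrative daily rainfall series (mm); not the Tuscany dataset.
rain = [0, 0, 2, 5, 0, 1, 12, 0, 0, 3, 40, 0, 2,
        0, 8, 0, 0, 55, 1, 0, 4, 0, 0, 22, 0]

mu = statistics.mean(rain)
sigma = statistics.pstdev(rain)   # population standard deviation

# SIGMA-style threshold: multiples of sigma flag "extraordinary" rainfall
# that may be associated with landslide triggering.
alerts = [(day, r) for day, r in enumerate(rain) if r > mu + 2 * sigma]
print(alerts)
```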

  15. Analysis of the width-w non-adjacent form in conjunction with hyperelliptic curve cryptography and with lattices.

    Science.gov (United States)

    Krenn, Daniel

    2013-06-17

    In this work the number of occurrences of a fixed non-zero digit in the width-w non-adjacent forms of all elements of a lattice in some region (e.g. a ball) is analysed. As bases, expanding endomorphisms with eigenvalues of the same absolute value are allowed. Applications of the main result are on numeral systems with an algebraic integer as base. Those come from efficient scalar multiplication methods (Frobenius-and-add methods) in hyperelliptic curve cryptography, and the result is needed for analysing the running time of such algorithms. The counting result itself is an asymptotic formula, whose main term coincides with the full block length analysis. In its second-order term a periodic fluctuation is exhibited. The proof follows Delange's method.
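
    For the integer case, the width-w non-adjacent form mentioned above is computed by the standard digit-recoding algorithm: each non-zero digit is odd, bounded by 2^(w-1) in absolute value, and any w consecutive digits contain at most one non-zero. A minimal sketch:

```python
def wnaf(n: int, w: int):
    # Width-w non-adjacent form of a positive integer n, least significant
    # digit first. Digits are 0 or odd with |d| < 2^(w-1).
    digits = []
    while n != 0:
        if n % 2:
            d = n % (1 << w)             # n mod 2^w
            if d >= (1 << (w - 1)):
                d -= (1 << w)            # choose the signed representative
            n -= d
        else:
            d = 0
        digits.append(d)
        n //= 2
    return digits

print(wnaf(11, 2))  # classic NAF of 11
```

By construction, sum(d * 2**i) over the digits recovers n, which is why the sparse digit expansion speeds up double-and-add scalar multiplication.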

  16. A Theoretical and Experimental Comparison of One Time Pad Cryptography using Key and Plaintext Insertion and Transposition (KPIT) and Key Coloumnar Transposition (KCT) Method

    Directory of Open Access Journals (Sweden)

    Pryo Utomo

    2017-06-01

    Full Text Available One Time Pad (OTP) is a cryptographic algorithm that is quite easy to implement. The algorithm works by converting the plaintext and the key into decimal, then into binary numbers, and computing the Exclusive-OR logic between them. In this paper, the authors compare OTP cryptography using KPI and KCT, so that the generated ciphertext is more difficult to recover. In the Key and Plaintext Insertion (KPI) method, we modify the OTP algorithm by inserting the key into the plaintext after it has been split. Meanwhile, in the Key Coloumnar Transposition (KCT) method, we modify the OTP algorithm by dividing the key into parts in a matrix of rows and columns. The algorithms were implemented in the PHP programming language.
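
    The underlying XOR step of plain OTP (without the paper's KPI/KCT modifications, and in Python rather than the paper's PHP) can be sketched as:

```python
import secrets

def otp_xor(message: bytes, key: bytes) -> bytes:
    # One-Time Pad core: XOR each byte with a same-length random key.
    assert len(key) == len(message), "OTP key must match message length"
    return bytes(m ^ k for m, k in zip(message, key))

msg = b"THRESHOLD"
key = secrets.token_bytes(len(msg))  # must be truly random and used once
ct = otp_xor(msg, key)               # encrypt
pt = otp_xor(ct, key)                # decrypt: XOR is its own inverse
print(pt)
```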

  17. On the two steps threshold selection for over-threshold modelling of extreme events

    Science.gov (United States)

    Bernardara, Pietro; Mazas, Franck; Weiss, Jerome; Andreewsky, Marc; Kergadallan, Xavier; Benoit, Michel; Hamm, Luc

    2013-04-01

    The estimation of the probability of occurrence of extreme events is traditionally achieved by fitting a probability distribution to a sample of extreme observations. In particular, extreme value theory (EVT) states that values exceeding a given threshold converge to a Generalized Pareto Distribution (GPD) if the original sample is composed of independent and identically distributed values. However, time series of sea and ocean variables usually show strong temporal autocorrelation. Traditionally, in order to select independent events for the subsequent statistical analysis, the concept of a physical threshold is introduced: events that exceed that threshold are defined as "extreme events". This is the so-called "Peak Over Threshold (POT)" sampling, widespread in the literature and currently used for engineering applications, among many others. In the past, the threshold for the statistical sampling of extreme values asymptotically convergent toward the GPD and the threshold for the physical selection of independent extreme events were confused, as the same threshold was used both to sample the data and to meet the hypothesis of extreme value convergence, leading to some incoherencies. In particular, if the two steps are performed simultaneously, the number of peaks over the threshold can increase but also decrease when the threshold decreases. This is logical from a physical point of view, since the definition of the sample of "extreme events" changes, but it is not coherent with the statistical theory. We introduce a two-step threshold selection for over-threshold modelling, aiming to discriminate (i) a physical threshold for the selection of extreme and independent events, and (ii) a statistical threshold for optimizing coherence with the hypotheses of the EVT. The former is a physical event identification procedure (also called "declustering") aimed at selecting independent extreme events. The latter is a purely statistical optimization
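
    The two-step scheme can be sketched as below: a physical threshold plus a separation window declusters the series into independent events, and a higher statistical threshold then selects the sample for GPD fitting. The series, thresholds, and window are illustrative assumptions.

```python
# Illustrative time series (e.g. hourly significant wave height in m).
series = [0.5, 1.2, 3.4, 3.1, 0.8, 0.4, 2.9, 3.8,
          3.6, 0.7, 0.3, 4.5, 0.6, 0.2, 3.2, 0.5]

phys_u = 2.5   # step 1: physical threshold defining candidate events
window = 2     # exceedances closer than this belong to the same event

# Declustering: group nearby exceedances, keep one peak per cluster.
clusters = []
for t, x in enumerate(series):
    if x > phys_u:
        if clusters and t - clusters[-1][-1][0] < window:
            clusters[-1].append((t, x))
        else:
            clusters.append([(t, x)])
events = [max(c, key=lambda p: p[1]) for c in clusters]

stat_u = 3.0   # step 2: statistical threshold for the GPD sample
sample = [x - stat_u for _, x in events if x > stat_u]
print(events, sample)
```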

  18. Mesoscale spatial variability in seawater cavitation thresholds

    Science.gov (United States)

    Mel'nikov, N. P.; Elistratov, V. P.

    2017-03-01

    The paper presents the spatial variability of cavitation thresholds and some hydrological and hydrochemical parameters of seawater in the interfrontal zone of the Pacific Subarctic Front, in the Drake Passage, and in the equatorial part of the Pacific Ocean, measured in the near-surface layer to a depth of 70 m.

  19. Threshold Concepts in Finance: Conceptualizing the Curriculum

    Science.gov (United States)

    Hoadley, Susan; Tickle, Leonie; Wood, Leigh N.; Kyng, Tim

    2015-01-01

    Graduates with well-developed capabilities in finance are invaluable to our society and in increasing demand. Universities face the challenge of designing finance programmes to develop these capabilities and the essential knowledge that underpins them. Our research responds to this challenge by identifying threshold concepts that are central to…

  20. Heritability estimates derived from threshold analyses for ...

    African Journals Online (AJOL)

    Unknown

    Abstract. The object of this study was to estimate heritabilities and sire breeding values for stayability and reproductive traits in a composite multibreed beef cattle herd using a threshold model. A GFCAT set of programmes was used to analyse reproductive data. Heritabilities and product-moment correlations between.

  1. Design of Threshold Controller Based Chaotic Circuits

    DEFF Research Database (Denmark)

    Mohamed, I. Raja; Murali, K.; Sinha, Sudeshna

    2010-01-01

    We propose a very simple implementation of a second-order nonautonomous chaotic oscillator, using a threshold controller as the only source of nonlinearity. We demonstrate the efficacy and simplicity of our design through numerical and experimental results. Further, we show that this approach...

  2. Grid - a fast threshold tracking procedure

    DEFF Research Database (Denmark)

    Fereczkowski, Michal; Dau, Torsten; MacDonald, Ewen

    2016-01-01

    A new procedure, called “grid”, is evaluated that allows rapid acquisition of threshold curves for psychophysics and, in particular, psychoacoustic, experiments. In this method, the parameterresponse space is sampled in two dimensions within a single run. This allows the procedure to focus more e...

  3. Atherogenic Risk Factors and Hearing Thresholds

    DEFF Research Database (Denmark)

    Frederiksen, Thomas Winther; Ramlau-Hansen, Cecilia Høst; Stokholm, Zara Ann

    2014-01-01

    The objective of this study was to evaluate the influence of atherogenic risk factors on hearing thresholds. In a cross-sectional study we analyzed data from a Danish survey in 2009-2010 on physical and psychological working conditions. The study included 576 white- and blue-collar workers from c...

  4. 40 CFR 68.115 - Threshold determination.

    Science.gov (United States)

    2010-07-01

    ... in accordance with the definition of flammability hazard rating 4 in the NFPA 704, Standard System... more than a threshold quantity is present at a stationary source. (iii) Naturally occurring hydrocarbon..., regulated substances in naturally occurring hydrocarbon mixtures need not be considered when determining...

  5. Identification of Threshold Concepts for Biochemistry

    Science.gov (United States)

    Loertscher, Jennifer; Green, David; Lewis, Jennifer E.; Lin, Sara; Minderhout, Vicky

    2014-01-01

    Threshold concepts (TCs) are concepts that, when mastered, represent a transformed understanding of a discipline without which the learner cannot progress. We have undertaken a process involving more than 75 faculty members and 50 undergraduate students to identify a working list of TCs for biochemistry. The process of identifying TCs for…

  6. Determining lower threshold concentrations for synergistic effects

    DEFF Research Database (Denmark)

    Bjergager, Maj-Britt Andersen; Dalhoff, Kristoffer; Kretschmann, Andreas

    2017-01-01

    …which proven synergists cease to act as synergists towards the aquatic crustacean Daphnia magna. To do this, we compared several approaches and test-setups to evaluate which approach gives the most conservative estimate of the lower threshold for synergy for three known azole synergists. We focus… …619±8.555μgL(-1)) and 0.122±0.0417μM (40.236±13.75μgL(-1)), respectively, in the 14-day tests. Testing synergy in relation to concentration addition provided the most conservative values. The threshold values for the vertical assessments, in tests where the two could be compared, were in general 1.2 to 4.7 fold higher than for the horizontal assessments. Using passive dosing rather than dilution series or spiking did not lower the threshold significantly. Below the threshold for synergy, slight antagonism could often be observed. This is most likely due to induction of enzymes active in the metabolization of alpha…

  7. Microplastic effect thresholds for freshwater benthic macroinvertebrates

    NARCIS (Netherlands)

    Redondo Hasselerharm, P.E.; Dede Falahudin, Dede; Peeters, E.T.H.M.; Koelmans, A.A.

    2018-01-01

    Now that microplastics have been detected in lakes, rivers and estuaries all over the globe, evaluating their effects on biota has become an urgent research priority. This is the first study that aims at determining the effect thresholds for a battery of six freshwater benthic macroinvertebrates

  8. Distribution of sensory taste thresholds for phenylthiocarbamide ...

    African Journals Online (AJOL)

    The ability to taste Phenylthiocarbamide (PTC), a bitter organic compound has been described as a bimodal autosomal trait in both genetic and anthropological studies. This study is based on the ability of a person to taste PTC. The present study reports the threshold distribution of PTC taste sensitivity among some Muslim ...

  9. A low-threshold erbium glass minilaser

    Science.gov (United States)

    Gapontsev, V. P.; Gromov, A. K.; Izyneev, A. A.; Sadovskii, P. I.; Stavrov, A. A.

    1989-04-01

    Minilasers emitting in the 1.54-micron region with an emission threshold less than 5 J and efficiency up to 1.7 percent have been constructed using a Cr-Y-Er laser glass, LGS-Kh. Under repetitively-pulsed operation, an average lasing power of 0.7 W and a pulse repetition rate of 7 Hz have been achieved.

  10. Thresholding methods for PET imaging: A review

    International Nuclear Information System (INIS)

    Dewalle-Vignion, A.S.; Betrouni, N.; Huglo, D.; Vermandel, M.; Dewalle-Vignion, A.S.; Hossein-Foucher, C.; Huglo, D.; Vermandel, M.; Dewalle-Vignion, A.S.; Hossein-Foucher, C.; Huglo, D.; Vermandel, M.; El Abiad, A.

    2010-01-01

    This work deals with positron emission tomography segmentation methods for tumor volume determination. We present a state of the art of techniques based on fixed or adaptive thresholds. Methods found in the literature are analysed from an objective point of view regarding their methodology, advantages and limitations. Finally, a comparative study is presented. (authors)

  11. Low-threshold conical microcavity dye lasers

    DEFF Research Database (Denmark)

    Grossmann, Tobias; Schleede, Simone; Hauser, Mario

    2010-01-01

    element simulations confirm that lasing occurs in whispering gallery modes which corresponds well to the measured multimode laser-emission. The effect of dye concentration on lasing threshold and lasing wavelength is investigated and can be explained using a standard dye laser model....

  12. Classification error of the thresholded independence rule

    DEFF Research Database (Denmark)

    Bak, Britta Anker; Fenger-Grøn, Morten; Jensen, Jens Ledet

    We consider classification in the situation of two groups with normally distributed data in the ‘large p small n’ framework. To counterbalance the high number of variables we consider the thresholded independence rule. An upper bound on the classification error is established which is tailored...

  13. The acoustic reflex threshold in aging ears.

    Science.gov (United States)

    Silverman, C A; Silman, S; Miller, M H

    1983-01-01

    This study investigates the controversy regarding the influence of age on the acoustic reflex threshold for broadband noise, 500-, 1000-, 2000-, and 4000-Hz activators between Jerger et al. [Mono. Contemp. Audiol. 1 (1978)] and Jerger [J. Acoust. Soc. Am. 66 (1979)] on the one hand and Silman [J. Acoust. Soc. Am. 66 (1979)] and others on the other. The acoustic reflex thresholds for broadband noise, 500-, 1000-, 2000-, and 4000-Hz activators were evaluated under two measurement conditions. Seventy-two normal-hearing ears were drawn from 72 subjects ranging in age from 20-69 years. The results revealed that age was correlated with the acoustic reflex threshold for BBN activator but not for any of the tonal activators; the correlation was stronger under the 1-dB than under the 5-dB measurement condition. Also, the mean acoustic reflex thresholds for broadband noise activator were essentially similar to those reported by Jerger et al. (1978) but differed from those obtained in this study under the 1-dB measurement condition.

  14. Public-Key Cryptography

    Science.gov (United States)

    1991-04-01

    primes, which is probabilistically computable in polynomial time (e.g., [SOLO77]), but for which no deterministic algorithm is known. Let BPP ... it follows easily that P ⊆ BPP ⊆ NP. An important question, with implications for schemes such as RSA as well as zero-knowledge schemes ... [MORR75] M. A. Morrison and J. Brillhart, "A method of factoring and the factorization of F7," Mathematics of Computation, Vol. 29, No. 129

  15. Algebraic curves and cryptography

    CERN Document Server

    Murty, V Kumar

    2010-01-01

    It is by now a well-known paradigm that public-key cryptosystems can be built using finite Abelian groups and that algebraic geometry provides a supply of such groups through Abelian varieties over finite fields. Of special interest are the Abelian varieties that are Jacobians of algebraic curves. All of the articles in this volume are centered on the theme of point counting and explicit arithmetic on the Jacobians of curves over finite fields. The topics covered include Schoof's \\ell-adic point counting algorithm, the p-adic algorithms of Kedlaya and Denef-Vercauteren, explicit arithmetic on

  16. Cryptography from noisy storage.

    Science.gov (United States)

    Wehner, Stephanie; Schaffner, Christian; Terhal, Barbara M

    2008-06-06

    We show how to implement cryptographic primitives based on the realistic assumption that quantum storage of qubits is noisy. We thereby consider individual-storage attacks; i.e., the dishonest party attempts to store each incoming qubit separately. Our model is similar to the model of bounded-quantum storage; however, we consider an explicit noise model inspired by present-day technology. To illustrate the power of this new model, we show that a protocol for oblivious transfer is secure for any amount of quantum-storage noise, as long as honest players can perform perfect quantum operations. Our model also allows us to show the security of protocols that cope with noise in the operations of the honest players and achieve more advanced tasks such as secure identification.

  17. Quantum-chaotic cryptography

    Science.gov (United States)

    de Oliveira, G. L.; Ramos, R. V.

    2018-03-01

    In this work, it is presented an optical scheme for quantum key distribution employing two synchronized optoelectronic oscillators (OEO) working in the chaotic regime. The produced key depends on the chaotic dynamic, and the synchronization between Alice's and Bob's OEOs uses quantum states. An attack on the synchronization signals will disturb the synchronization of the chaotic systems increasing the error rate in the final key.

  18. Cryptography: Cracking Codes.

    Science.gov (United States)

    Myerscough, Don; And Others

    1996-01-01

    Describes an activity whose objectives are to encode and decode messages using linear functions and their inverses; to use modular arithmetic, including use of the reciprocal for simple equation solving; to analyze patterns and make and test conjectures; to communicate procedures and algorithms; and to use problem-solving strategies. (ASK)
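The activity's core idea, encoding with a linear function and decoding with its inverse under modular arithmetic, is an affine cipher. A minimal sketch follows; the parameters a = 5 and b = 8 are arbitrary illustrative choices, not taken from the article (a must be coprime with 26 so its modular inverse exists):

```python
# Toy affine cipher over the 26-letter uppercase alphabet:
# encode: E(x) = (a*x + b) mod 26; decode uses the modular inverse of a.

def encode(msg, a=5, b=8):
    return "".join(chr((a * (ord(c) - 65) + b) % 26 + 65) for c in msg)

def decode(cipher, a=5, b=8):
    a_inv = pow(a, -1, 26)  # modular inverse of a mod 26 (Python 3.8+)
    return "".join(chr(a_inv * (ord(c) - 65 - b) % 26 + 65) for c in cipher)

print(decode(encode("THRESHOLD")))  # round-trips to "THRESHOLD"
```

Decoding works precisely because the linear map x ↦ ax + b (mod 26) is invertible when gcd(a, 26) = 1, the "use of the reciprocal for simple equation solving" the abstract mentions.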

  19. Relativistic quantum cryptography

    International Nuclear Information System (INIS)

    Molotkov, S. N.

    2011-01-01

    A new protocol of quantum key distribution is proposed to transmit keys through free space. Along with quantum-mechanical restrictions on the discernibility of nonorthogonal quantum states, the protocol uses additional restrictions imposed by special relativity theory. Unlike all existing quantum key distribution protocols, this protocol ensures key secrecy for a not strictly one-photon source of quantum states and an arbitrary length of a quantum communication channel.

  20. Cryptography with chaotic mixing

    International Nuclear Information System (INIS)

    Oliveira, Luiz P.L. de; Sobottka, Marcelo

    2008-01-01

    We propose a cryptosystem based on one-dimensional chaotic maps of the form H_p(x) = r_p^{-1} ∘ G ∘ r_p(x), defined on the interval [0, 10^p) for a positive integer parameter p, where G(x) = 10x (mod 10) and r_p(x) = x^{1/p}, which is a topological conjugacy between G and the shift map σ on the space Σ of the sequences with 10 symbols. There are three advantages in comparison with the recently proposed cryptosystem based on chaotic logistic maps F_μ(x) = μx(1-x) with 3 < μ ≤ 4: (a) H_p is always chaotic for all parameters p, (b) the knowledge of an ergodic measure allows assignment of the alphabetic symbols to equiprobable sites of H_p's domain and (c) for each p, the security of the cryptosystem is manageable against brute force attacks
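The conjugacy to the shift map can be seen numerically: G acts on a number in [0, 10) as a shift on its decimal digits, dropping the leading digit at each iteration. A small illustrative sketch (not from the paper; floating point limits how many digits survive):

```python
# G(x) = 10x (mod 10) behaves as a decimal-digit shift on [0, 10):
# each iteration discards the leading digit of x.

def G(x):
    return (10 * x) % 10

x = 3.14159
orbit = []
for _ in range(4):
    x = G(x)
    orbit.append(round(x, 4))
print(orbit)  # -> [1.4159, 4.159, 1.59, 5.9]
```

This digit-shift behaviour is exactly what makes the symbolic-dynamics (shift map) description of the cryptosystem possible.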

  1. Small circuits for cryptography.

    Energy Technology Data Exchange (ETDEWEB)

    Torgerson, Mark Dolan; Draelos, Timothy John; Schroeppel, Richard Crabtree; Miller, Russell D.; Anderson, William Erik

    2005-10-01

    This report examines a number of hardware circuit design issues associated with implementing certain functions in FPGA and ASIC technologies. Here we show circuit designs for AES and SHA-1 that have an extremely small hardware footprint, yet show reasonably good performance characteristics as compared to the state of the art designs found in the literature. Our AES performance numbers are fueled by an optimized composite field S-box design for the Stratix chipset. Our SHA-1 designs use register packing and feedback functionalities of the Stratix LE, which reduce the logic element usage by as much as 72% as compared to other SHA-1 designs.

  2. Fault Analysis in Cryptography

    CERN Document Server

    Joye, Marc

    2012-01-01

    In the 1970s researchers noticed that radioactive particles produced by elements naturally present in packaging material could cause bits to flip in sensitive areas of electronic chips. Research into the effect of cosmic rays on semiconductors, an area of particular interest in the aerospace industry, led to methods of hardening electronic devices designed for harsh environments. Ultimately various mechanisms for fault creation and propagation were discovered, and in particular it was noted that many cryptographic algorithms succumb to so-called fault attacks. Preventing fault attacks without

  3. Distance Discrimination Thresholds During Flight Simulation in a Maritime Environment

    Science.gov (United States)

    2011-11-01

    UNCLASSIFIED. Jessica Parker, Air Operations ... Executive Summary: The Aeronautical Design Standard ... position to be perceived. This minimum distance was defined as the distance discrimination threshold. For both high and low sea states, the thresholds

  4. Cost–effectiveness thresholds: pros and cons

    Science.gov (United States)

    Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-01-01

    Abstract Cost–effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost–effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost–effectiveness thresholds allow cost–effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization’s Commission on Macroeconomics in Health suggested cost–effectiveness thresholds based on multiples of a country’s per-capita gross domestic product (GDP). In some contexts, in choosing which health interventions to fund and which not to fund, these thresholds have been used as decision rules. However, experience with the use of such GDP-based thresholds in decision-making processes at country level shows them to lack country specificity and this – in addition to uncertainty in the modelled cost–effectiveness ratios – can lead to the wrong decision on how to spend health-care resources. Cost–effectiveness information should be used alongside other considerations – e.g. budget impact and feasibility considerations – in a transparent decision-making process, rather than in isolation based on a single threshold value. Although cost–effectiveness ratios are undoubtedly informative in assessing value for money, countries should be encouraged to develop a context-specific process for decision-making that is supported by legislation, has stakeholder buy-in, for example the involvement of civil society organizations and patient groups, and is transparent, consistent and fair. PMID:27994285

  5. Multimodal distribution of human cold pain thresholds.

    Science.gov (United States)

    Lötsch, Jörn; Dimova, Violeta; Lieb, Isabel; Zimmermann, Michael; Oertel, Bruno G; Ultsch, Alfred

    2015-01-01

    It is assumed that different pain phenotypes are based on varying molecular pathomechanisms. Distinct ion channels seem to be associated with the perception of cold pain, in particular TRPM8 and TRPA1 have been highlighted previously. The present study analyzed the distribution of cold pain thresholds with focus at describing the multimodality based on the hypothesis that it reflects a contribution of distinct ion channels. Cold pain thresholds (CPT) were available from 329 healthy volunteers (aged 18 - 37 years; 159 men) enrolled in previous studies. The distribution of the pooled and log-transformed threshold data was described using a kernel density estimation (Pareto Density Estimation (PDE)) and subsequently, the log data was modeled as a mixture of Gaussian distributions using the expectation maximization (EM) algorithm to optimize the fit. CPTs were clearly multi-modally distributed. Fitting a Gaussian Mixture Model (GMM) to the log-transformed threshold data revealed that the best fit is obtained when applying a three-model distribution pattern. The modes of the identified three Gaussian distributions, retransformed from the log domain to the mean stimulation temperatures at which the subjects had indicated pain thresholds, were obtained at 23.7 °C, 13.2 °C and 1.5 °C for Gaussian #1, #2 and #3, respectively. The localization of the first and second Gaussians was interpreted as reflecting the contribution of two different cold sensors. From the calculated localization of the modes of the first two Gaussians, the hypothesis of an involvement of TRPM8, sensing temperatures from 25 - 24 °C, and TRPA1, sensing cold from 17 °C can be derived. In that case, subjects belonging to either Gaussian would possess a dominance of the one or the other receptor at the skin area where the cold stimuli had been applied. The findings therefore support a suitability of complex analytical approaches to detect mechanistically determined patterns from pain phenotype data.
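The fitting procedure described, a Gaussian mixture model optimized with the EM algorithm on log-transformed thresholds, can be sketched with a minimal stdlib-only 1-D EM. The data below is synthetic and bimodal, standing in for log-transformed thresholds; it is not the study's dataset or code:

```python
import math
import random

def em_gmm_1d(data, k, iters=100):
    """Fit a k-component 1-D Gaussian mixture with EM (illustrative sketch)."""
    lo, hi = min(data), max(data)
    mu = [lo + (hi - lo) * j / (k - 1) for j in range(k)]  # evenly spread means
    var = [(hi - lo) ** 2 / 12.0] * k                      # crude initial variance
    w = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            p = [w[j] / math.sqrt(2 * math.pi * var[j])
                 * math.exp(-(x - mu[j]) ** 2 / (2 * var[j])) for j in range(k)]
            s = sum(p) or 1e-300
            resp.append([pj / s for pj in p])
        # M-step: re-estimate weights, means, variances
        for j in range(k):
            nj = sum(r[j] for r in resp) or 1e-300
            w[j] = nj / len(data)
            mu[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var[j] = sum(r[j] * (x - mu[j]) ** 2 for r, x in zip(resp, data)) / nj + 1e-9
    return w, mu, var

# Synthetic bimodal data standing in for log-transformed thresholds:
random.seed(1)
data = [random.gauss(0.0, 0.5) for _ in range(300)] + \
       [random.gauss(5.0, 0.5) for _ in range(300)]
w, mu, var = em_gmm_1d(data, k=2)
print(sorted(mu))  # recovers component means near 0.0 and 5.0
```

As in the study, the number of components would be chosen by comparing the fit of models with different k; here two well-separated modes make k = 2 the obvious choice.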

  6. Do multiple body modifications alter pain threshold?

    Science.gov (United States)

    Yamamotová, A; Hrabák, P; Hříbek, P; Rokyta, R

    2017-12-30

    In recent years, epidemiological data has shown an increasing number of young people who deliberately self-injure. There have also been parallel increases in the number of people with tattoos and those who voluntarily undergo painful procedures associated with piercing, scarification, and tattooing. People with self-injury behaviors often say that they do not feel the pain. However, there is no information regarding pain perception in those that visit tattoo parlors and piercing studios compared to those who don't. The aim of this study was to compare nociceptive sensitivity in four groups of subjects (n=105, mean age 26 years, 48 women and 57 men) with different motivations to experience pain (i.e., with and without multiple body modifications) in two different situations; (1) in controlled, emotionally neutral conditions, and (2) at a "Hell Party" (HP), an event organized by a piercing and tattoo parlor, with a main event featuring a public demonstration of painful techniques (burn scars, hanging on hooks, etc.). Pain thresholds of the fingers of the hand were measured using a thermal stimulator and mechanical algometer. In HP participants, information about alcohol intake, self-harming behavior, and psychiatric history were used in the analysis as intervening variables. Individuals with body modifications as well as without body modifications had higher thermal pain thresholds at Hell Party, compared to thresholds measured at control neutral conditions. No such differences were found relative to mechanical pain thresholds. Increased pain threshold in all HP participants, irrespectively of body modification, cannot be simply explained by a decrease in the sensory component of pain; instead, we found that the environment significantly influenced the cognitive and affective component of pain.

  7. A brief peripheral motion contrast threshold test predicts older drivers' hazardous behaviors in simulated driving.

    Science.gov (United States)

    Henderson, Steven; Woods-Fry, Heather; Collin, Charles A; Gagnon, Sylvain; Voloaca, Misha; Grant, John; Rosenthal, Ted; Allen, Wade

    2015-05-01

    Our research group has previously demonstrated that the peripheral motion contrast threshold (PMCT) test predicts older drivers' self-report accident risk, as well as simulated driving performance. However, the PMCT is too lengthy to be a part of a battery of tests to assess fitness to drive. Therefore, we have developed a new version of this test, which takes under two minutes to administer. We assessed the motion contrast thresholds of 24 younger drivers (19-32) and 25 older drivers (65-83) with both the PMCT-10min and the PMCT-2min test and investigated if thresholds were associated with measures of simulated driving performance. Younger participants had significantly lower motion contrast thresholds than older participants and there were no significant correlations between younger participants' thresholds and any measures of driving performance. The PMCT-10min and the PMCT-2min thresholds of older drivers' predicted simulated crash risk, as well as the minimum distance of approach to all hazards. This suggests that our tests of motion processing can help predict the risk of collision or near collision in older drivers. Thresholds were also correlated with the total lane deviation time, suggesting a deficiency in processing of peripheral flow and delayed detection of adjacent cars. The PMCT-2min is an improved version of a previously validated test, and it has the potential to help assess older drivers' fitness to drive. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Determination of prospective displacement-based gate threshold for respiratory-gated radiation delivery from retrospective phase-based gate threshold selected at 4D CT simulation

    International Nuclear Information System (INIS)

    Vedam, S.; Archambault, L.; Starkschall, G.; Mohan, R.; Beddar, S.

    2007-01-01

    and delivery gate thresholds to within 0.3%. For patient data analysis, differences between simulation and delivery gate thresholds are reported as a fraction of the total respiratory motion range. For the smaller phase interval, the differences between simulation and delivery gate thresholds are 8±11% and 14±21% with and without audio-visual biofeedback, respectively, when the simulation gate threshold is determined based on the mean respiratory displacement within the 40%-60% gating phase interval. For the longer phase interval, corresponding differences are 4±7% and 8±15% with and without audio-visual biofeedback, respectively. Alternatively, when the simulation gate threshold is determined based on the maximum average respiratory displacement within the gating phase interval, greater differences between simulation and delivery gate thresholds are observed. A relationship between retrospective simulation gate threshold and prospective delivery gate threshold for respiratory gating is established and validated for regular and nonregular respiratory motion. Using this relationship, the delivery gate threshold can be reliably estimated at the time of 4D CT simulation, thereby improving the accuracy and efficiency of respiratory-gated radiation delivery

  9. Measuring Input Thresholds on an Existing Board

    Science.gov (United States)

    Kuperman, Igor; Gutrich, Daniel G.; Berkun, Andrew C.

    2011-01-01

    A critical PECL (positive emitter-coupled logic) to Xilinx interface needed to be changed on an existing flight board. The new Xilinx input interface used a CMOS (complementary metal-oxide semiconductor) type of input, and the driver could meet its thresholds typically, but not in worst-case, according to the data sheet. The previous interface had been based on comparison with an external reference, but the CMOS input is based on comparison with an internal divider from the power supply. A way to measure the exact input threshold of this device for 64 inputs on a flight board was needed. The measurement technique allowed an accurate measurement of the voltage required to switch a Xilinx input from high to low for each of the 64 lines, while only probing two of them. Directly driving an external voltage was considered too risky, and tests done on any other unit could not be used to qualify the flight board. The two lines directly probed gave an absolute voltage threshold calibration, while data collected on the remaining 62 lines without probing gave relative measurements that could be used to identify any outliers. The PECL interface was forced to a long-period square wave by driving a saturated square wave into the ADC (analog to digital converter). The active pull-down circuit was turned off, causing each line to rise rapidly and fall slowly according to the input's weak pull-down circuitry. The fall time shows up as a change in the pulse width of the signal read by the Xilinx. This change in pulse width is a function of capacitance, pull-down current, and input threshold. Capacitance was known from the different trace lengths, plus a gate input capacitance, which is the same for all inputs. The pull-down current is the same for all inputs including the two that are probed directly. The data was combined, and the Excel solver tool was used to find input thresholds for the 62 lines. This was repeated over different supply voltages and

  10. Threshold pion electroproduction at large momentum transfers; Threshold Pion-Elektroproduktion bei grossen Energieuebertraegen

    Energy Technology Data Exchange (ETDEWEB)

    Peters, Andreas

    2008-02-15

    We consider pion electroproduction close to threshold for Q^2 in the region 1-10 GeV^2 on a nucleon target. The momentum transfer dependence of the S-wave multipoles at threshold, E_0+ and L_0+, is calculated in the chiral limit using light-cone sum rules. Predictions for the cross sections in the threshold region are given taking into account P-wave contributions that, as we argue, are model independent to a large extent. The results are compared with the SLAC E136 data on the structure function F_2(W, Q^2) in the threshold region. (orig.)

  11. Predicting visual acuity from detection thresholds.

    Science.gov (United States)

    Newacheck, J S; Haegerstrom-Portnoy, G; Adams, A J

    1990-03-01

    Visual performance based exclusively on high luminance and high contrast letter acuity measures often fails to predict individual performance at low contrast and low luminance. Here we measured visual acuity over a wide range of contrasts and luminances (low mesopic to photopic) for 17 young normal observers. Acuity vs. contrast functions appear to fit a single template which can be displaced laterally along the log contrast axis. The magnitude of this lateral displacement for different luminances was well predicted by the contrast threshold difference for a 4 min arc spot. The acuity vs. contrast template, taken from the mean of all 17 subjects, was used in conjunction with individual spot contrast threshold measures to predict an individual's visual acuity over a wide range of luminance and contrast levels. The accuracy of the visual acuity predictions from this simple procedure closely approximates test-retest accuracy for both positive (projected Landolt rings) and negative contrast (Bailey-Lovie charts).

  12. Edith Wharton's threshold phobia and two worlds.

    Science.gov (United States)

    Holtzman, Deanna; Kulish, Nancy

    2014-08-01

    The American novelist Edith Wharton suffered an unusual childhood neurotic symptom, a fear of crossing thresholds, a condition that might be called a "threshold phobia." This symptom is identified and examined in autobiographical material, letters, diaries, and selected literary fiction and nonfiction left by Wharton to arrive at a formulation not previously drawn together. A fascinating theme-living or being trapped between "two worlds"-runs through much of the writer's life and work. The phobia is related to this theme, and both can be linked more broadly to certain sexual conflicts in women. This understanding of Wharton's phobia, it is argued, throws new light on the developmental issues and conflicts related to the female "oedipal" or triadic phase, characterized by the need to negotiate the two worlds of mother and of father. © 2014 by the American Psychoanalytic Association.

  13. Multiparty Computation from Threshold Homomorphic Encryption

    DEFF Research Database (Denmark)

    Cramer, Ronald; Damgård, Ivan Bjerre; Nielsen, Jesper Buus

    2001-01-01

    We introduce a new approach to multiparty computation (MPC), basing it on homomorphic threshold cryptosystems. We show that given keys for any sufficiently efficient system of this type, general MPC protocols for n parties can be devised which are secure against an active adversary that corrupts any minority of the parties. The total number of bits broadcast is O(nk|C|), where k is the security parameter and |C| is the size of a (Boolean) circuit computing the function to be securely evaluated. An earlier proposal by Franklin and Haber with the same complexity was only secure for passive adversaries, while all earlier protocols with active security had complexity at least quadratic in n. We give two examples of threshold cryptosystems that can support our construction and lead to the claimed complexities.
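The threshold ingredient of such a system can be illustrated at toy scale with Shamir secret sharing, which is itself additively homomorphic: adding shares pointwise yields shares of the sum, so no party ever sees the individual secrets. This is a stdlib-only sketch of that property, not the paper's cryptosystem:

```python
import random

P = 2**127 - 1  # a Mersenne prime; all arithmetic is in GF(P)

def share(secret, t, n):
    """Split `secret` into n Shamir shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(i, sum(c * pow(i, k, P) for k, c in enumerate(coeffs)) % P)
            for i in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation of the sharing polynomial at x = 0."""
    total = 0
    for i, yi in shares:
        num = den = 1
        for j, _ in shares:
            if j != i:
                num = num * (-j) % P
                den = den * (i - j) % P
        total = (total + yi * num * pow(den, -1, P)) % P  # Python 3.8+
    return total

# Additive homomorphism: adding shares pointwise shares the sum.
a = share(20, t=3, n=5)
b = share(22, t=3, n=5)
summed = [(i, (ya + yb) % P) for (i, ya), (_, yb) in zip(a, b)]
print(reconstruct(summed[:3]))  # -> 42
```

The full protocol in the paper needs much more (a homomorphic threshold *encryption* scheme, multiplication subprotocols, and zero-knowledge proofs against active adversaries), but the pointwise-addition trick above is the basic reason addition gates come essentially for free in such constructions.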

  14. The Resting Motor Threshold - Restless or Resting?

    DEFF Research Database (Denmark)

    Karabanov, Anke Ninija; Raffin, Estelle Emeline; Siebner, Hartwig Roman

    2015-01-01

    Background: The resting motor threshold (RMT) is used to individually adjust the intensity of transcranial magnetic stimulation (TMS) and is assumed to be stable. Here we challenge this notion by showing that RMT expresses acute context-dependent fluctuations. Method: In twelve participants, the RMT of the right first dorsal interosseus muscle was repeatedly determined using a threshold-hunting procedure while participants performed motor imagery and visual attention tasks with the right or left hand. Data were analyzed using repeated-measures ANOVA. Results: RMT differed depending on which hand performed the task (P = 0.003). RMT of the right FDI was lower during motor imagery than during visual attention of the right hand (P = 0.002), but did not differ between left-hand tasks (P = 0.988). Conclusions: State-dependent changes of RMT occur in the absence of overt motor activity and can...

  15. Gamin portable radiation meter with alarm threshold

    International Nuclear Information System (INIS)

    Payat, Rene.

    1981-10-01

    The Gamin radiation meter is a direct-reading, portable, battery-powered gamma dose-rate meter featuring alarm thresholds. Dose rate is read on a micro-ammeter with a millirad-per-hour logarithmic scale, covering a range of 0.1 to 1000 millirads/hour. The instrument issues an audible warning signal when the dose-rate level exceeds a threshold value, which can be selected. The detector tube is of the Geiger-Muller counter, energy-compensated type. Because of its low battery drain, the instrument can be operated continuously for 1000 hours. It is powered by four 1.5-volt alkaline batteries of the R6 type. The electronic circuitry is housed in a small, lightweight case made of impact-resistant plastic. Applications of the Gamin portable radiation monitor are found in health physics, safety departments, medical facilities, teaching, and civil defense [fr]

  16. Rayleigh scattering from ions near threshold

    International Nuclear Information System (INIS)

    Roy, S.C.; Gupta, S.K.S.; Kissel, L.; Pratt, R.H.

    1988-01-01

    Theoretical studies of Rayleigh scattering of photons from neon atoms with different degrees of ionization, for energies both below and above the K-edges of the ions, are presented. Some unexpected structures both in Rayleigh scattering and in photoionization from neutral and weakly ionized atoms, very close to threshold, have been reported. It has recently been realized that some of the predicted structures may have a nonphysical origin and are due to the limitation of the independent-particle model and also to the use of a Coulombic Latter tail. Use of a K-shell vacancy potential - in which an electron is assumed to be removed from the K-shell - in calculating K-shell Rayleigh scattering amplitudes removes some of the structure effects near threshold. We present in this work a discussion of scattering angular distributions and total cross sections, obtained utilizing vacancy potentials, and compare these predictions with those previously obtained in other potential models. (author) [pt]

  17. Infrared small target detection method based on local threshold attenuation of constant false alarm

    Science.gov (United States)

    Yu, Ke-feng; Shi, Zhi-guang; Lu, Xin-ping

    2016-03-01

    In infrared small-target detection systems, CFAR (Constant False Alarm Rate) processing is a commonly used technique, but in traditional single-frame detection the detection rate can only be improved at the cost of a higher false alarm rate. This paper proposes a threshold-attenuation CFAR detection method based on a Gaussian distribution. After preprocessing the infrared images, a CFAR detector based on a Gaussian distribution is designed. Using the target location from the previous frame, the local threshold is attenuated in the target's neighbourhood, improving the local detection rate and yielding the current target location. The experimental results show that the proposed method can effectively control the threshold and, provided the background clutter is suppressed at a low global false alarm rate, it improves the local detection rate and reduces the probability of target loss.
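The threshold-attenuation idea can be sketched as a Gaussian (mean-plus-k-sigma) local threshold that is scaled down near the previous frame's target location. The factor names and numeric values below are assumptions for illustration, not taken from the paper:

```python
import statistics

def cfar_threshold(background, k=4.0):
    """Gaussian CFAR: threshold = mean + k*sigma of the local background.
    k controls the false-alarm rate (illustrative value, an assumption)."""
    mu = statistics.fmean(background)
    sigma = statistics.pstdev(background)
    return mu + k * sigma

def attenuated_threshold(background, near_previous_target, alpha=0.7):
    """Attenuate the local threshold near the previous frame's target
    location; alpha < 1 is an assumed attenuation factor."""
    t = cfar_threshold(background)
    return alpha * t if near_previous_target else t
```

Lowering the threshold only where a target was just seen raises the local detection rate without raising the false alarm rate over the rest of the image, which is the trade-off the paper targets.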

  18. The monolithic double-threshold discriminator

    International Nuclear Information System (INIS)

    Baturitsky, M.A.; Dvornikov, O.V.

    1999-01-01

    A double-threshold discriminator capable of processing input signals of different duration is described. Simplicity of the discriminator circuitry makes it possible to embody the discriminator in multichannel ICs using microwave bipolar-JFET technology. Time walk is calculated to be less than 0.35 ns for the input ramp signals with rise times 25-100 ns and amplitudes 50 mV-1 V

  19. Bivariate hard thresholding in wavelet function estimation

    OpenAIRE

    Piotr Fryzlewicz

    2007-01-01

    We propose a generic bivariate hard thresholding estimator of the discrete wavelet coefficients of a function contaminated with i.i.d. Gaussian noise. We demonstrate its good risk properties in a motivating example, and derive upper bounds for its mean-square error. Motivated by the clustering of large wavelet coefficients in real-life signals, we propose two wavelet denoising algorithms, both of which use specific instances of our bivariate estimator. The BABTE algorithm uses basis averaging...
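Hard thresholding keeps a wavelet coefficient only when its magnitude exceeds the threshold; a bivariate rule additionally conditions on a neighbouring coefficient, exploiting the clustering of large coefficients the abstract mentions. The exact BABTE rule is not given in this abstract, so the bivariate variant below is one common instantiation, not necessarily the authors':

```python
# Univariate hard thresholding: zero out small coefficients.
def hard_threshold(coeffs, lam):
    return [c if abs(c) > lam else 0.0 for c in coeffs]

# Toy bivariate rule: keep c if the joint magnitude with its neighbour
# exceeds lam (an assumed instantiation for illustration).
def bivariate_hard_threshold(coeffs, neighbours, lam):
    return [c if (c * c + n * n) ** 0.5 > lam else 0.0
            for c, n in zip(coeffs, neighbours)]

print(hard_threshold([0.1, -2.0, 0.4, 3.0], 1.0))  # -> [0.0, -2.0, 0.0, 3.0]
```

The bivariate rule can retain a moderate coefficient that the univariate rule would kill, provided its neighbour is also large, which is exactly the clustering behaviour seen in real-life signals.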

  20. Estimasi Regresi Wavelet Thresholding Dengan Metode Bootstrap

    OpenAIRE

    Suparti, Suparti; Mustofa, Achmad; Rusgiyono, Agus

    2007-01-01

    A wavelet is a function with characteristic properties: it oscillates about zero, is localized in both the time and frequency domains, and constructs orthogonal bases of the space L2(R). One application of wavelets is the estimation of nonparametric regression functions. There are two kinds of wavelet estimators, linear and nonlinear; the nonlinear wavelet estimator is called a thresholding wavelet estimator. The application of the bootstrap method...