WorldWideScience

Sample records for infrared face verification

  1. Face recognition in the thermal infrared domain

    Science.gov (United States)

    Kowalski, M.; Grudzień, A.; Palka, N.; Szustakowski, M.

    2017-10-01

    Biometrics refers to unique human characteristics. Each unique characteristic may be used to label and describe individuals and for automatic recognition of a person based on physiological or behavioural properties. One of the most natural and most popular biometric traits is the face. Most research on face recognition is based on visible light, and state-of-the-art face recognition systems operating in the visible spectrum achieve very high recognition accuracy under controlled environmental conditions. Thermal infrared imagery seems to be a promising alternative or complement to visible-range imaging due to its relatively high resistance to illumination changes. A thermal infrared image of the human face presents its unique heat signature and can be used for recognition. The characteristics of thermal images maintain advantages over visible-light images and can be used to improve face recognition algorithms in several respects; mid-wavelength and long-wavelength infrared, also referred to as thermal infrared, therefore seem to be promising alternatives. We present a study on 1:1 recognition in the thermal infrared domain. The two approaches we consider are stand-off face verification of a non-moving person and stop-less face verification on the move. The paper presents the methodology of our studies and the challenges for face recognition systems in the thermal infrared domain.

  2. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we present a detailed study of the face verification problem on mobile devices, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face…

  3. Face Verification using MLP and SVM

    OpenAIRE

    Cardinaux, Fabien; Marcel, Sébastien

    2002-01-01

    The performance of machine learning algorithms such as the MLP and, more recently, the SVM has steadily improved over the past few years. In this paper, we compare these two successful discriminant machine learning algorithms applied to the problem of face verification: MLP and SVM. The two algorithms are tested on a benchmark database, namely XM2VTS. Results show that the MLP performs better than the SVM on this particular task.

  4. Near infrared face recognition: A literature survey

    Czech Academy of Sciences Publication Activity Database

    Farokhi, Sajad; Flusser, Jan; Sheikh, U. U.

    2016-01-01

    Roč. 21, č. 1 (2016), s. 1-17 ISSN 1574-0137 R&D Projects: GA ČR GA15-16928S Institutional support: RVO:67985556 Keywords : Literature survey * Biometrics * Face recognition * Near infrared * Illumination invariant Subject RIV: JD - Computer Applications, Robotics http://library.utia.cas.cz/separaty/2016/ZOI/flusser-0461834.pdf

  5. Compressive sensing using optimized sensing matrix for face verification

    Science.gov (United States)

    Oey, Endra; Jeffry; Wongso, Kelvin; Tommy

    2017-12-01

    Biometrics appears to be one of the solutions to the problems that arise from password-based data access, such as forgetting a password or struggling to recall many different passwords. With biometrics, physical characteristics of a person can be captured and used in the identification process. In this research, facial biometrics is used in the verification process to determine whether a user has the authority to access the data. The face is chosen because it allows a low-cost implementation and yields reasonably accurate identification results. The face verification system adopted in this research is based on the Compressive Sensing (CS) technique, which aims to reduce the dimensionality of, and effectively encrypt, the facial test image by representing it as a sparse signal. The compressed data can be reconstructed using a sparse coding algorithm. Two types of sparse coding, Orthogonal Matching Pursuit (OMP) and Iteratively Reweighted Least Squares-ℓp (IRLS-ℓp), are compared in this face verification study. The reconstructed sparse signal is then compared, via the Euclidean norm, with the sparse signal of the user previously enrolled in the system to determine the validity of the facial test image. With a non-optimized sensing matrix, the system achieves 99% accuracy with IRLS (verification response time of 4.917 s) and 96.33% with OMP (0.4046 s); with an optimized sensing matrix, it achieves 99% with IRLS (13.4791 s) and 98.33% with OMP (3.1571 s).
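
    A minimal sketch (not the authors' implementation) of compressive-sensing face verification with Orthogonal Matching Pursuit: a random Gaussian sensing matrix compresses a vectorised face image, scikit-learn's OMP recovers a sparse code against the compressed dictionary of training faces, and verification compares the recovered code with an enrolled template by Euclidean norm. The array sizes, sparsity level, and acceptance threshold below are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.linear_model import OrthogonalMatchingPursuit

    rng = np.random.default_rng(0)
    n_pixels, n_measurements, n_atoms = 1024, 256, 100     # assumed problem sizes
    D = rng.standard_normal((n_pixels, n_atoms))            # dictionary: training face images as columns
    Phi = rng.standard_normal((n_measurements, n_pixels))   # random (non-optimized) sensing matrix

    def sparse_code(image_vec, k=10):
        """Recover a k-sparse code of the compressed image against the compressed dictionary."""
        y = Phi @ image_vec                                  # compressed measurements
        omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k)
        omp.fit(Phi @ D, y)
        return omp.coef_

    def verify(test_vec, enrolled_code, threshold=5.0):
        """Accept if the recovered sparse code is close (Euclidean norm) to the enrolled one."""
        return np.linalg.norm(sparse_code(test_vec) - enrolled_code) < threshold

    # Enroll one image, then test the same image: this should be accepted.
    probe = rng.standard_normal(n_pixels)
    print(verify(probe, sparse_code(probe)))
    ```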

  6. MobileFaceNets: Efficient CNNs for Accurate Real-time Face Verification on Mobile Devices

    OpenAIRE

    Chen, Sheng; Liu, Yang; Gao, Xiang; Han, Zhen

    2018-01-01

    In this paper, we propose a class of extremely efficient CNN models, MobileFaceNets, which use fewer than 1 million parameters and are specifically tailored for high-accuracy real-time face verification on mobile and embedded devices. We first make a simple analysis of the weaknesses of common mobile networks for face verification. These weaknesses are overcome by our specifically designed MobileFaceNets. Under the same experimental conditions, our MobileFaceNets achieve significantly sup...
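
    The efficiency of networks in this family comes largely from depthwise separable convolutions. The block below is a generic PyTorch sketch of that building block (depthwise 3x3 + pointwise 1x1, each followed by BatchNorm and PReLU); it is illustrative only and not the published MobileFaceNets architecture, and the channel counts in the example are arbitrary.

    ```python
    import torch
    import torch.nn as nn

    class DepthwiseSeparableConv(nn.Module):
        """Depthwise 3x3 convolution followed by a pointwise 1x1 convolution."""
        def __init__(self, in_ch, out_ch, stride=1):
            super().__init__()
            self.block = nn.Sequential(
                nn.Conv2d(in_ch, in_ch, 3, stride=stride, padding=1, groups=in_ch, bias=False),
                nn.BatchNorm2d(in_ch),
                nn.PReLU(in_ch),
                nn.Conv2d(in_ch, out_ch, 1, bias=False),
                nn.BatchNorm2d(out_ch),
                nn.PReLU(out_ch),
            )

        def forward(self, x):
            return self.block(x)

    # One block mapping a 64-channel feature map to 128 channels at half resolution.
    x = torch.randn(1, 64, 56, 56)
    print(DepthwiseSeparableConv(64, 128, stride=2)(x).shape)  # torch.Size([1, 128, 28, 28])
    ```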

  7. Invariant Face recognition Using Infrared Images

    International Nuclear Information System (INIS)

    Zahran, E.G.

    2012-01-01

    Over the past few decades, face recognition has become a rapidly growing research topic due to increasing demands in many applications of our daily life, such as airport surveillance, personal identification in law enforcement, surveillance systems, information safety, securing financial transactions, and computer security. The objective of this thesis is to develop a face recognition system capable of recognizing persons with high accuracy and low processing time, under different illumination conditions and different facial expressions. The thesis presents a study of the performance of the face recognition system using two techniques: Principal Component Analysis (PCA) and Zernike Moments (ZM). The performance of the recognition system is evaluated according to several aspects, including the recognition rate and the processing time. Face recognition systems that use visual images are sensitive to variations in lighting conditions and facial expressions. The performance of these systems may be degraded under poor illumination conditions or for subjects of various skin colors. Several solutions have been proposed to overcome these limitations. One of them is to work in the infrared (IR) spectrum. IR images have been suggested as an alternative source of information for the detection and recognition of faces when there is little or no control over lighting conditions. This arises from the fact that these images are formed by thermal emissions from the skin, an intrinsic property that depends on the distribution of blood vessels under the skin. On the other hand, IR face recognition systems still have limitations related to temperature variations and to the recognition of persons wearing eyeglasses. In this thesis we fuse IR images with visible images to enhance the performance of face recognition systems. Images are fused using the wavelet transform. Simulation results show that the fusion of visible and…
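
    As a rough illustration of the wavelet-based fusion of visible and IR face images mentioned above, the sketch below uses PyWavelets to fuse one decomposition level by averaging the approximation bands and keeping the larger-magnitude detail coefficients. The fusion rule and the Haar wavelet are assumptions for illustration, not necessarily the thesis's exact scheme.

    ```python
    import numpy as np
    import pywt

    def fuse_wavelet(visible, infrared, wavelet="haar"):
        """Single-level DWT fusion of two registered images of the same size."""
        cA_v, det_v = pywt.dwt2(visible.astype(float), wavelet)
        cA_i, det_i = pywt.dwt2(infrared.astype(float), wavelet)
        cA = (cA_v + cA_i) / 2.0                                      # average the approximation bands
        details = tuple(np.where(np.abs(dv) >= np.abs(di), dv, di)    # keep stronger details
                        for dv, di in zip(det_v, det_i))
        return pywt.idwt2((cA, details), wavelet)

    # Example with random arrays; real use would pass registered visible/IR face images.
    vis, ir = np.random.rand(64, 64), np.random.rand(64, 64)
    print(fuse_wavelet(vis, ir).shape)  # (64, 64)
    ```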

  8. Infrared and visible fusion face recognition based on NSCT domain

    Science.gov (United States)

    Xie, Zhihua; Zhang, Shuai; Liu, Guodong; Xiong, Jinquan

    2018-01-01

    Visible face recognition systems, being vulnerable to illumination, expression, and pose variations, cannot achieve robust performance in unconstrained situations. Meanwhile, near infrared face images, being light-independent, can avoid or limit the drawbacks of face recognition in visible light, but their main challenges are low resolution and signal-to-noise ratio (SNR). Therefore, near infrared and visible fusion face recognition has become an important direction in unconstrained face recognition research. In this paper, a novel fusion algorithm in the non-subsampled contourlet transform (NSCT) domain is proposed for infrared and visible fusion face recognition. First, NSCT is applied to the infrared and visible face images, exploiting the image information at multiple scales, orientations, and frequency bands. Then, to extract effective discriminant features and balance the high- and low-frequency bands of the NSCT coefficients, the local Gabor binary pattern (LGBP) and the local binary pattern (LBP) are applied to the different frequency parts to obtain robust representations of the infrared and visible face images. Finally, score-level fusion is used to fuse all the features for final classification. The approach is tested on the HITSZ Lab2 visible and near infrared face database. Experimental results show that the proposed method extracts the complementary features of near-infrared and visible-light images and improves the robustness of unconstrained face recognition.

  9. Simple thermal to thermal face verification method based on local texture descriptors

    Science.gov (United States)

    Grudzien, A.; Palka, Norbert; Kowalski, M.

    2017-08-01

    Biometrics is the science of analysing the physical characteristics and behaviour of people. It has found many applications, ranging from border control and forensic systems for criminal investigations to access control systems. Unique identifiers, also referred to as modalities, are used to distinguish individuals, and one of the most common and natural human identifiers is the face. As a result of decades of research, face recognition has reached a high level of maturity; however, recognition in the visible spectrum is still challenging due to illumination effects and new ways of spoofing. One of the alternatives is recognition of the face in other parts of the light spectrum, e.g. in the infrared. Thermal infrared offers new possibilities for human recognition due to its specific properties as well as mature equipment. In this paper we present a methodology for verifying subjects using facial images in the thermal range. The study focuses on local feature extraction methods and on similarity metrics. We present a comparison of two local texture-based descriptors for thermal 1-to-1 face recognition.
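
    A minimal sketch of thermal 1-to-1 verification with a local texture descriptor, assuming scikit-image's uniform LBP, block-wise histograms, and a chi-square distance; the two specific descriptors and similarity metrics compared in the paper are not reproduced here, and the decision threshold is an illustrative assumption.

    ```python
    import numpy as np
    from skimage.feature import local_binary_pattern

    def lbp_histogram(image, P=8, R=1, grid=(4, 4)):
        """Concatenate per-block histograms of uniform LBP codes."""
        codes = local_binary_pattern(image, P, R, method="uniform")
        n_bins = P + 2                     # uniform patterns plus one "non-uniform" bin
        feats = []
        for rows in np.array_split(np.arange(codes.shape[0]), grid[0]):
            for cols in np.array_split(np.arange(codes.shape[1]), grid[1]):
                hist, _ = np.histogram(codes[np.ix_(rows, cols)], bins=n_bins, range=(0, n_bins))
                feats.append(hist / max(hist.sum(), 1))
        return np.concatenate(feats)

    def chi_square(a, b, eps=1e-10):
        return 0.5 * np.sum((a - b) ** 2 / (a + b + eps))

    def verify(probe_img, gallery_img, threshold=0.8):
        """1:1 decision: accept when the chi-square distance falls below a tuned threshold."""
        return chi_square(lbp_histogram(probe_img), lbp_histogram(gallery_img)) < threshold
    ```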

  10. Design of an Active Multispectral SWIR Camera System for Skin Detection and Face Verification

    Directory of Open Access Journals (Sweden)

    Holger Steiner

    2016-01-01

    Biometric face recognition is becoming more frequently used in different application scenarios. However, spoofing attacks with facial disguises are still a serious problem for state-of-the-art face recognition algorithms. This work proposes an approach to face verification based on spectral signatures of material surfaces in the short-wave infrared (SWIR) range. They allow authentic human skin to be distinguished reliably from other materials, independent of the skin type. We present the design of an active SWIR imaging system that acquires four-band multispectral image stacks in real time. The system uses pulsed small-band illumination, which allows for fast image acquisition and high spectral resolution and renders it widely independent of ambient light. After extracting the spectral signatures from the acquired images, detected faces can be verified or rejected by classifying the material as “skin” or “no-skin.” The approach is extensively evaluated with respect to both acquisition and classification performance. In addition, we present a database containing RGB and multispectral SWIR face images, as well as spectrometer measurements, of a variety of subjects, which is used to evaluate our approach and will be made available to the research community by the time this work is published.
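
    Per-pixel classification of SWIR signatures into “skin” and “no-skin” can be sketched as below. The per-pixel L2 normalisation and the SVM classifier are illustrative assumptions, not the authors' published classifier; labelled example signatures are assumed to be available from enrolment data.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def normalise(stack):
        """L2-normalise each pixel's 4-band signature so the decision depends on spectral shape, not intensity."""
        flat = stack.reshape(-1, stack.shape[-1]).astype(float)
        return flat / (np.linalg.norm(flat, axis=1, keepdims=True) + 1e-10)

    def train_skin_classifier(signatures, labels):
        """signatures: (n_samples, 4) normalised band values; labels: 1 = skin, 0 = other material."""
        return SVC(kernel="rbf", gamma="scale").fit(signatures, labels)

    def skin_mask(clf, stack):
        """Per-pixel skin / no-skin map for a (H, W, 4) multispectral image stack."""
        return clf.predict(normalise(stack)).reshape(stack.shape[:2])
    ```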

  11. Evaluation of Face Detection Algorithms for the Bank Client Identity Verification

    Directory of Open Access Journals (Sweden)

    Szczodrak Maciej

    2017-06-01

    Results of an investigation of the efficiency of face detection algorithms in a banking client visual verification system are presented. The video recordings were made under real conditions in three bank operating outlets using a miniature industrial USB camera. The aim of the experiments was to check the practical usability of the face detection method in a biometric bank client verification system. The main assumption was to keep user interaction with the application as simple as possible. The applied face detection algorithms are described, and the results of face detection achieved in real bank environment conditions are presented. Practical limitations of the application, based on the problems encountered, are discussed.

  12. Near infrared face recognition using Zernike moments and Hermite kernels

    Czech Academy of Sciences Publication Activity Database

    Farokhi, Sajad; Sheikh, U.U.; Flusser, Jan; Yang, Bo

    2015-01-01

    Roč. 316, č. 1 (2015), s. 234-245 ISSN 0020-0255 R&D Projects: GA ČR(CZ) GA13-29225S Keywords : face recognition * Zernike moments * Hermite kernel * Decision fusion * Near infrared Subject RIV: JD - Computer Applications, Robotics Impact factor: 3.364, year: 2015 http://library.utia.cas.cz/separaty/2015/ZOI/flusser-0444205.pdf

  13. Probabilistic Elastic Part Model: A Pose-Invariant Representation for Real-World Face Verification.

    Science.gov (United States)

    Li, Haoxiang; Hua, Gang

    2018-04-01

    Pose variation remains a major challenge for real-world face recognition. We approach this problem through a probabilistic elastic part model. We extract local descriptors (e.g., LBP or SIFT) from densely sampled multi-scale image patches. By augmenting each descriptor with its location, a Gaussian mixture model (GMM) is trained to capture the spatial-appearance distribution of the face parts of all face images in the training corpus, namely the probabilistic elastic part (PEP) model. Each mixture component of the GMM is confined to be a spherical Gaussian to balance the influence of the appearance and the location terms, which naturally defines a part. Given one or multiple face images of the same subject, the PEP model builds its PEP representation by sequentially concatenating descriptors identified by each Gaussian component in a maximum likelihood sense. We further propose a joint Bayesian adaptation algorithm to adapt the universally trained GMM to better model the pose variations between the target pair of faces/face tracks, which consistently improves face verification accuracy. Our experiments show that we achieve state-of-the-art face verification accuracy with the proposed representations on the Labeled Faces in the Wild (LFW) dataset, the YouTube Faces video database, and the CMU Multi-PIE dataset.
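
    A simplified sketch of the PEP idea with scikit-learn: a spherical-covariance GMM is fitted to location-augmented local descriptors, and a face representation is built by keeping, for each mixture component, the descriptor with the highest responsibility. Descriptor extraction and the joint Bayesian adaptation step are assumed to happen elsewhere.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def augment(descriptors, locations):
        """Append the (x, y) patch location to each local descriptor."""
        return np.hstack([descriptors, locations])

    def train_pep(train_descriptors, train_locations, n_parts=64):
        """Spherical GMM over location-augmented descriptors; each component acts as a 'part'."""
        gmm = GaussianMixture(n_components=n_parts, covariance_type="spherical", random_state=0)
        gmm.fit(augment(train_descriptors, train_locations))
        return gmm

    def pep_representation(gmm, descriptors, locations):
        """For each part, keep the best-matching descriptor and concatenate them all."""
        resp = gmm.predict_proba(augment(descriptors, locations))   # (n_descriptors, n_parts)
        best = resp.argmax(axis=0)                                  # one descriptor index per part
        return np.concatenate([descriptors[i] for i in best])
    ```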

  14. Anti-Makeup: Learning A Bi-Level Adversarial Network for Makeup-Invariant Face Verification

    OpenAIRE

    Li, Yi; Song, Lingxiao; Wu, Xiang; He, Ran; Tan, Tieniu

    2017-01-01

    Makeup is widely used to improve facial attractiveness and is well accepted by the public. However, different makeup styles will result in significant facial appearance changes. It remains a challenging problem to match makeup and non-makeup face images. This paper proposes a learning from generation approach for makeup-invariant face verification by introducing a bi-level adversarial network (BLAN). To alleviate the negative effects from makeup, we first generate non-makeup images from makeu...

  15. Near infrared and visible face recognition based on decision fusion of LBP and DCT features

    Science.gov (United States)

    Xie, Zhihua; Zhang, Shuai; Liu, Guodong; Xiong, Jinquan

    2018-03-01

    Visible face recognition systems, being vulnerable to illumination, expression, and pose variations, cannot achieve robust performance in unconstrained situations. Meanwhile, near infrared face images, being light-independent, can avoid or limit the drawbacks of face recognition in visible light, but their main challenges are low resolution and signal-to-noise ratio (SNR). Therefore, near infrared and visible fusion face recognition has become an important direction in unconstrained face recognition research. In order to extract discriminative complementary features from near infrared and visible images, this paper proposes a novel near infrared and visible face fusion recognition algorithm based on DCT and LBP features. First, effective features are extracted from the near-infrared face image using the low-frequency part of the DCT coefficients and the partition histograms of the LBP operator. Second, LBP features are extracted from the visible-light face image to compensate for the detail features lacking in the near-infrared image. The LBP features of the visible-light image and the DCT and LBP features of the near-infrared image are then sent to separate classifiers for labelling. Finally, a decision-level fusion strategy is used to obtain the final recognition result. The approach is tested on the HITSZ Lab2 visible and near infrared face database. The experimental results show that the proposed method extracts the complementary features of near-infrared and visible face images and improves the robustness of unconstrained face recognition. In particular, for small training samples, the recognition rate of the proposed method reaches 96.13%, a significant improvement over the 92.75% of the method based on statistical feature fusion.
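
    A much-simplified sketch of the two feature types involved, assuming SciPy's DCT and scikit-image's LBP: the low-frequency DCT block scores the near-infrared probe against a gallery template, an LBP histogram scores the visible probe, and the two similarities are combined at the score level (the paper itself fuses three classifier outputs at the decision level).

    ```python
    import numpy as np
    from scipy.fft import dctn
    from skimage.feature import local_binary_pattern

    def dct_low_freq(image, k=8):
        """Low-frequency part of the 2-D DCT (top-left k x k block) as a feature vector."""
        return dctn(image.astype(float), norm="ortho")[:k, :k].ravel()

    def lbp_hist(image, P=8, R=1):
        codes = local_binary_pattern(image, P, R, method="uniform")
        hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2))
        return hist / max(hist.sum(), 1)

    def fused_score(nir_probe, vis_probe, nir_gallery, vis_gallery, w=0.5):
        """Weighted sum of per-modality similarities (higher means more likely the same person)."""
        s_nir = -np.linalg.norm(dct_low_freq(nir_probe) - dct_low_freq(nir_gallery))
        s_vis = -np.linalg.norm(lbp_hist(vis_probe) - lbp_hist(vis_gallery))
        return w * s_nir + (1 - w) * s_vis
    ```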

  16. Currency verification by a 2D infrared barcode

    International Nuclear Information System (INIS)

    Schirripa Spagnolo, Giuseppe; Cozzella, Lorenzo; Simonetti, Carla

    2010-01-01

    Nowadays all the National Central Banks are continuously studying innovative anti-counterfeiting systems for banknotes. In this note, an innovative solution is proposed, which combines the potentiality of a hylemetric approach (methodology conceptually similar to biometry), based on notes' intrinsic characteristics, with a well-known and consolidated 2D barcode identification system. In particular, in this note we propose to extract from the banknotes a univocal binary control sequence (template) and insert an encrypted version of it in a barcode printed on the same banknote. For a more acceptable look and feel of a banknote, the superposed barcode can be stamped using IR ink that is visible to near-IR image sensors. This makes the banknote verification simpler. (technical design note)

  17. Weighted piecewise LDA for solving the small sample size problem in face verification.

    Science.gov (United States)

    Kyperountas, Marios; Tefas, Anastasios; Pitas, Ioannis

    2007-03-01

    A novel algorithm that can be used to boost the performance of face-verification methods that utilize Fisher's criterion is presented and evaluated. The algorithm is applied to similarity, or matching error, data and provides a general solution for overcoming the "small sample size" (SSS) problem, where the lack of sufficient training samples causes improper estimation of a linear separation hyperplane between the classes. Two independent phases constitute the proposed method. Initially, a set of weighted piecewise discriminant hyperplanes are used in order to provide a more accurate discriminant decision than the one produced by the traditional linear discriminant analysis (LDA) methodology. The expected classification ability of this method is investigated throughout a series of simulations. The second phase defines proper combinations for person-specific similarity scores and describes an outlier removal process that further enhances the classification ability. The proposed technique has been tested on the M2VTS and XM2VTS frontal face databases. Experimental results indicate that the proposed framework greatly improves the face-verification performance.

  18. Infrared face recognition based on LBP histogram and KW feature selection

    Science.gov (United States)

    Xie, Zhihua

    2014-07-01

    The conventional LBP-based feature, represented by the local binary pattern (LBP) histogram, still has room for performance improvement. This paper focuses on dimension reduction of LBP micro-patterns and proposes an improved infrared face recognition method based on the LBP histogram representation. To extract robust local features from infrared face images, LBP is used to obtain the composition of micro-patterns in sub-blocks. Based on statistical test theory, a Kruskal-Wallis (KW) feature selection method is proposed to retain the LBP patterns that are suitable for infrared face recognition. The experimental results show that the combination of LBP and KW feature selection improves the performance of infrared face recognition; the proposed method outperforms traditional methods based on the LBP histogram, the discrete cosine transform (DCT), or principal component analysis (PCA).
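
    Kruskal-Wallis feature selection over LBP histogram bins can be sketched with scipy.stats.kruskal: each bin is scored by its H statistic across subject classes and only the highest-scoring bins are retained. The number of retained bins is an illustrative parameter.

    ```python
    import numpy as np
    from scipy.stats import kruskal

    def kw_select(features, labels, n_keep=100):
        """Return indices of the feature dimensions with the largest Kruskal-Wallis H statistic."""
        classes = np.unique(labels)
        scores = []
        for j in range(features.shape[1]):
            groups = [features[labels == c, j] for c in classes]
            try:
                h, _ = kruskal(*groups)
            except ValueError:          # all values identical across every group
                h = 0.0
            scores.append(h)
        return np.argsort(scores)[::-1][:n_keep]

    # Usage: X is (n_images, n_lbp_bins), y holds subject identities.
    # selected = kw_select(X, y, n_keep=200); X_reduced = X[:, selected]
    ```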

  19. Physiology-based face recognition in the thermal infrared spectrum.

    Science.gov (United States)

    Buddharaju, Pradeep; Pavlidis, Ioannis T; Tsiamyrtzis, Panagiotis; Bazakos, Mike

    2007-04-01

    The current dominant approaches to face recognition rely on facial characteristics that are on or over the skin. Some of these characteristics have low permanency, can be altered, and their phenomenology varies significantly with environmental factors (e.g., lighting). Many methodologies have been developed to address these problems to various degrees. However, the current framework of face recognition research has a potential weakness due to its very nature. We present a novel framework for face recognition based on physiological information. The motivation behind this effort is to capitalize on the permanency of innate characteristics that are under the skin. To establish feasibility, we propose a specific methodology to capture facial physiological patterns using the bioheat information contained in thermal imagery. First, the algorithm delineates the human face from the background using a Bayesian framework. Then, it localizes the superficial blood vessel network using image morphology. The extracted vascular network produces contour shapes that are characteristic of each individual. The branching points of the skeletonized vascular network are referred to as Thermal Minutia Points (TMPs) and constitute the feature database. To render the method robust to facial pose variations, we collect five different pose images for each subject stored in the database (center, mid-left profile, left profile, mid-right profile, and right profile). During the classification stage, the algorithm first estimates the pose of the test image. Then, it matches the local and global TMP structures extracted from the test image with those of the corresponding pose images in the database. We have conducted experiments on a multipose database of thermal facial images collected in our laboratory, as well as on the time-gap database of the University of Notre Dame. The good experimental results show that the proposed methodology has merit, especially with respect to the problem of…

  20. Bimodal Biometric Verification Using the Fusion of Palmprint and Infrared Palm-Dorsum Vein Images

    Directory of Open Access Journals (Sweden)

    Chih-Lung Lin

    2015-12-01

    In this paper, we present a reliable and robust biometric verification method based on bimodal physiological characteristics of palms, including the palmprint and palm-dorsum vein patterns. The proposed method consists of five steps: (1) automatically aligning and cropping the same region of interest from different palm or palm-dorsum images; (2) applying the digital wavelet transform and inverse wavelet transform to fuse palmprint and vein pattern images; (3) extracting the line-like features (LLFs) from the fused image; (4) obtaining multiresolution representations of the LLFs by using a multiresolution filter; and (5) using a support vector machine to verify the multiresolution representations of the LLFs. The proposed method possesses four advantages: first, both modal images are captured in peg-free scenarios to improve the user-friendliness of the verification device. Second, palmprint and vein pattern images are captured using a low-resolution digital scanner and an infrared (IR) camera. The use of low-resolution images results in a smaller database. In addition, the vein pattern images are captured in the invisible IR spectrum, which improves antispoofing. Third, since the physiological characteristics of palmprint and vein pattern images are different, a hybrid fusing rule can be introduced to fuse the decomposition coefficients of different bands. The proposed method fuses decomposition coefficients at different decomposition levels, with different image sizes, captured from different sensor devices. Finally, the proposed method operates automatically, and hence no parameters need to be set manually. Three thousand palmprint images and 3000 vein pattern images were collected from 100 volunteers to verify the validity of the proposed method. The results show a false rejection rate of 1.20% and a false acceptance rate of 1.56%, demonstrating the validity and excellent performance of the proposed method compared with other methods.

  3. NIRFaceNet: A Convolutional Neural Network for Near-Infrared Face Identification

    Directory of Open Access Journals (Sweden)

    Min Peng

    2016-10-01

    Near-infrared (NIR) face recognition has attracted increasing attention because of its advantage of illumination invariance. However, traditional face recognition methods based on NIR are designed for, and tested in, cooperative-user applications. In this paper, we present a convolutional neural network (CNN) for NIR face recognition (specifically, face identification) in non-cooperative-user applications. The proposed NIRFaceNet is modified from GoogLeNet but has a more compact structure designed specifically for the Chinese Academy of Sciences Institute of Automation (CASIA) NIR database, and it achieves higher identification rates with less training time and less processing time. The experimental results demonstrate that NIRFaceNet has an overall advantage compared to other methods in the NIR face recognition domain when image blur and noise are present. The performance suggests that the proposed NIRFaceNet method may be more suitable for non-cooperative-user applications.
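
    NIRFaceNet itself is a modified GoogLeNet; as a generic illustration of CNN-based NIR face identification only, the minimal PyTorch classifier below maps a single-channel NIR crop to per-subject logits. The layer sizes, input resolution, and number of subjects are assumptions, not the published architecture.

    ```python
    import torch
    import torch.nn as nn

    class SmallNIRNet(nn.Module):
        """Tiny CNN classifier for NIR face identification (illustrative only)."""
        def __init__(self, n_subjects):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.AdaptiveAvgPool2d(4),
            )
            self.classifier = nn.Linear(64 * 4 * 4, n_subjects)

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    # A batch of four 128x128 single-channel NIR crops, 50 enrolled subjects.
    logits = SmallNIRNet(50)(torch.randn(4, 1, 128, 128))
    print(logits.shape)  # torch.Size([4, 50])
    ```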

  4. Near infrared face recognition by combining Zernike moments and undecimated discrete wavelet transform

    Czech Academy of Sciences Publication Activity Database

    Farokhi, Sajad; Shamsuddin, S.M.; Sheikh, U.U.; Flusser, Jan; Khansari, M.; Jafari-Khouzani, K.

    2014-01-01

    Roč. 31, č. 1 (2014), s. 13-27 ISSN 1051-2004 R&D Projects: GA ČR GAP103/11/1552 Institutional support: RVO:67985556 Keywords : Zernike moments * Undecimated discrete wavelet transform * Decision fusion * Near infrared * Face recognition Subject RIV: JD - Computer Applications, Robotics Impact factor: 1.256, year: 2014 http://library.utia.cas.cz/separaty/2014/ZOI/flusser-0428536.pdf

  5. Rotation and Noise Invariant Near-Infrared Face Recognition by means of Zernike Moments and Spectral Regression Discriminant Analysis

    Czech Academy of Sciences Publication Activity Database

    Farokhi, S.; Shamsuddin, S. M.; Flusser, Jan; Sheikh, U. U.; Khansari, M.; Jafari-Khouzani, K.

    2013-01-01

    Roč. 22, č. 1 (2013), s. 1-11 ISSN 1017-9909 R&D Projects: GA ČR GAP103/11/1552 Keywords : face recognition * infrared imaging * image moments Subject RIV: JD - Computer Applications, Robotics Impact factor: 0.850, year: 2013 http://library.utia.cas.cz/separaty/2013/ZOI/flusser-rotation and noise invariant near-infrared face recognition by means of zernike moments and spectral regression discriminant analysis.pdf

  6. Self-face recognition in children with autism spectrum disorders: a near-infrared spectroscopy study.

    Science.gov (United States)

    Kita, Yosuke; Gunji, Atsuko; Inoue, Yuki; Goto, Takaaki; Sakihara, Kotoe; Kaga, Makiko; Inagaki, Masumi; Hosokawa, Toru

    2011-06-01

    It is assumed that children with autism spectrum disorders (ASD) have specificities for self-face recognition, which is known to be a basic cognitive ability for social development. In the present study, we investigated neurological substrates and potentially influential factors for self-face recognition of ASD patients using near-infrared spectroscopy (NIRS). The subjects were 11 healthy adult men, 13 normally developing boys, and 10 boys with ASD. Their hemodynamic activities in the frontal area and their scanning strategies (eye-movement) were examined during self-face recognition. Other factors such as ASD severities and self-consciousness were also evaluated by parents and patients, respectively. Oxygenated hemoglobin levels were higher in the regions corresponding to the right inferior frontal gyrus than in those corresponding to the left inferior frontal gyrus. In two groups of children these activities reflected ASD severities, such that the more serious ASD characteristics corresponded with lower activity levels. Moreover, higher levels of public self-consciousness intensified the activities, which were not influenced by the scanning strategies. These findings suggest that dysfunction in the right inferior frontal gyrus areas responsible for self-face recognition is one of the crucial neural substrates underlying ASD characteristics, which could potentially be used to evaluate psychological aspects such as public self-consciousness. Copyright © 2010 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.

  7. Illumination normalization based on simplified local binary patterns for a face verification system

    NARCIS (Netherlands)

    Tao, Q.; Veldhuis, Raymond N.J.

    2007-01-01

    Illumination normalization is a very important step in face recognition. In this paper we propose a simple implementation of Local Binary Patterns, which effectively reduces the variability caused by illumination changes. In combination with a likelihood ratio classifier, this illumination

  8. Implementation of an RBF neural network on embedded systems: real-time face tracking and identity verification.

    Science.gov (United States)

    Yang, Fan; Paindavoine, M

    2003-01-01

    This paper describes a real-time vision system that allows us to localize faces in video sequences and verify their identity. These processes are image processing techniques based on the radial basis function (RBF) neural network approach. The robustness of this system has been evaluated quantitatively on eight video sequences. We have adapted our model for an application of face recognition using the Olivetti Research Laboratory (ORL), Cambridge, UK, database so as to compare the performance against other systems. We also describe three hardware implementations of our model on embedded systems based on a field programmable gate array (FPGA), zero instruction set computer (ZISC) chips, and a digital signal processor (DSP) TMS320C62, respectively. We analyze the algorithm complexity and present results of the hardware implementations in terms of the resources used and processing speed. The success rates of face tracking and identity verification are 92% (FPGA), 85% (ZISC), and 98.2% (DSP), respectively. For the three embedded systems, the processing speeds for images of size 288 × 352 are 14 images/s, 25 images/s, and 4.8 images/s, respectively.
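
    A software analogue of a classical RBF network (k-means prototypes, Gaussian hidden units, least-squares output weights) is sketched below. It only illustrates the model family; it is unrelated to the paper's FPGA, ZISC, and DSP implementations, and the number of centres and the width parameter are assumptions.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    class RBFNetwork:
        """K-means centres, Gaussian activations, linear output weights fitted by least squares."""
        def __init__(self, n_centers=20, gamma=1.0):
            self.n_centers, self.gamma = n_centers, gamma

        def _phi(self, X):
            d2 = ((X[:, None, :] - self.centers[None]) ** 2).sum(-1)
            return np.exp(-self.gamma * d2)

        def fit(self, X, y):
            """X: (n_samples, n_features); y: integer class labels 0..K-1."""
            self.centers = KMeans(self.n_centers, n_init=10, random_state=0).fit(X).cluster_centers_
            targets = np.eye(y.max() + 1)[y]                      # one-hot encoding
            self.W, *_ = np.linalg.lstsq(self._phi(X), targets, rcond=None)
            return self

        def predict(self, X):
            return self._phi(X) @ self.W                          # class scores; argmax gives the label
    ```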

  10. Differences in the Pattern of Hemodynamic Response to Self-Face and Stranger-Face Images in Adolescents with Anorexia Nervosa: A Near-Infrared Spectroscopic Study.

    Science.gov (United States)

    Inoue, Takeshi; Sakuta, Yuiko; Shimamura, Keiichi; Ichikawa, Hiroko; Kobayashi, Megumi; Otani, Ryoko; Yamaguchi, Masami K; Kanazawa, So; Kakigi, Ryusuke; Sakuta, Ryoichi

    2015-01-01

    There have been no reports concerning the self-face perception in patients with anorexia nervosa (AN). The purpose of this study was to compare the neuronal correlates of viewing self-face images (i.e. images of familiar face) and stranger-face images (i.e. images of an unfamiliar face) in female adolescents with and without AN. We used near-infrared spectroscopy (NIRS) to measure hemodynamic responses while the participants viewed full-color photographs of self-face and stranger-face. Fifteen females with AN (mean age, 13.8 years) and 15 age- and intelligence quotient (IQ)-matched female controls without AN (mean age, 13.1 years) participated in the study. The responses to photographs were compared with the baseline activation (response to white uniform blank). In the AN group, the concentration of oxygenated hemoglobin (oxy-Hb) significantly increased in the right temporal area during the presentation of both the self-face and stranger-face images compared with the baseline level. In contrast, in the control group, the concentration of oxy-Hb significantly increased in the right temporal area only during the presentation of the self-face image. To our knowledge the present study is the first report to assess brain activities during self-face and stranger-face perception among female adolescents with AN. There were different patterns of brain activation in response to the sight of the self-face and stranger-face images in female adolescents with AN and controls.

  11. Improving Face Verification in Photo Albums by Combining Facial Recognition and Metadata With Cross-Matching

    Science.gov (United States)

    2017-12-01

    …with our method. The third chapter presents the description and implementation of our approach. We provide a definition of the dataset, the… a means of classification using the shape and the texture. [Figure 7: 3D Alignment Pipeline. Adapted from [20].] In 2014, Facebook announced… Stating that face recognition consists of four main stages, detect ⟹ align ⟹ represent ⟹ classify, the Facebook team’s intent is to revisit the…

  12. A possible method of carbon deposit mapping on plasma facing components using infrared thermography

    International Nuclear Information System (INIS)

    Mitteau, R.; Spruytte, J.; Vallet, S.; Travere, J.M.; Guilhem, D.; Brosset, C.

    2007-01-01

    The material eroded from the surface of plasma facing components is partly redeposited close to high heat flux areas. At these locations, the deposit is heated by the plasma and the deposition pattern evolves depending on the operating parameters. Mapping the deposit is still a matter of intense scientific activity, especially during the course of experimental campaigns. A method based on the comparison of surface temperature maps, obtained in situ by infrared cameras and by theoretical modelling, is proposed. The difference between the two is attributed to the thermal resistance added by the deposited material and is expressed as a deposit thickness. The method benefits from elaborate imaging techniques such as possibility theory and fuzzy logic. The results are consistent with deposit maps obtained by visual inspection during shutdowns.

  13. Patient set-up verification by infrared optical localization and body surface sensing in breast radiation therapy

    International Nuclear Information System (INIS)

    Spadea, Maria Francesca; Baroni, Guido; Riboldi, Marco; Orecchia, Roberto; Pedotti, Antonio; Tagaste, Barbara; Garibaldi, Cristina

    2006-01-01

    Background and purpose: The aim of the study was to investigate the clinical application of a technique for patient set-up verification in breast cancer radiotherapy, based on the 3D localization of a hybrid configuration of surface control points. Materials and methods: An infrared optical tracker provided the 3D position of two passive markers and 10 laser spots placed around and within the irradiation field on nine patients. A fast iterative constrained minimization procedure was applied to detect and compensate patient set-up errors, through registration of the control points with reference data coming from the treatment plan (markers' reference positions, CT-based surface model). Results: The application of the corrective spatial transformation estimated by the registration procedure led to significant improvement of patient set-up. The median value of the 3D errors affecting three additional verification markers within the irradiation field decreased from 5.7 to 3.5 mm. The variability of the errors (25-75%) decreased from 3.2 to 2.1 mm. Laser spot registration on the reference surface model was documented to contribute substantially to set-up error compensation. Conclusions: Patient set-up verification through a hybrid set of control points and a constrained surface minimization algorithm was confirmed to be feasible in clinical practice and to provide valuable information for improving the quality of patient set-up, with minimal requirement for operator-dependent procedures. The technique conveniently combines the advantages of passive-marker-based methods and surface registration techniques, featuring immediate and robust estimation of the set-up accuracy from a redundant dataset.

  14. An embedded face-classification system for infrared images on an FPGA

    Science.gov (United States)

    Soto, Javier E.; Figueroa, Miguel

    2014-10-01

    We present a face-classification architecture for long-wave infrared (IR) images implemented on a Field Programmable Gate Array (FPGA). The circuit is fast, compact, and low power; it can recognize faces in real time and be embedded in a larger image-processing and computer vision system operating locally on an IR camera. The algorithm uses Local Binary Patterns (LBP) to perform feature extraction on each IR image. First, each pixel in the image is represented as an LBP pattern that encodes the similarity between the pixel and its neighbors. Uniform LBP codes are then used to reduce the number of patterns to 59 while preserving more than 90% of the information contained in the original LBP representation. Then, the image is divided into 64 non-overlapping regions, and each region is represented as a 59-bin histogram of patterns. Finally, the algorithm concatenates all 64 regions to create a 3,776-bin spatially enhanced histogram. We reduce the dimensionality of this histogram using Linear Discriminant Analysis (LDA), which improves clustering and enables us to store an entire database of 53 subjects on-chip. During classification, the circuit applies LBP and LDA to each incoming IR image in real time and compares the resulting feature vector to each pattern stored in the local database using the Manhattan distance. We implemented the circuit on a Xilinx Artix-7 XC7A100T FPGA and tested it with the UCHThermalFace database, which consists of 28 images (81 × 150 pixels) of each of 53 subjects in indoor and outdoor conditions. The circuit achieves a 98.6% hit ratio, trained with 16 images and tested with 12 images of each subject in the database. Using a 100 MHz clock, the circuit classifies 8,230 images per second and consumes only 309 mW.
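
    The LBP-histogram / LDA / Manhattan-distance pipeline described above can be approximated in software as follows, using scikit-image's non-rotation-invariant uniform LBP (59 bins for 8 neighbours), an 8 x 8 grid of regions, and scikit-learn's LDA. This is a sketch of the algorithm only, not the FPGA implementation; the synthetic data at the end merely exercises the code.

    ```python
    import numpy as np
    from skimage.feature import local_binary_pattern
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def spatial_lbp_histogram(image, P=8, R=1, grid=(8, 8)):
        """Uniform LBP codes, histogrammed per region and concatenated (64 x 59 = 3776 bins)."""
        codes = local_binary_pattern(image, P, R, method="nri_uniform")
        n_bins = P * (P - 1) + 3            # 59 bins for P = 8
        feats = []
        for rows in np.array_split(codes, grid[0], axis=0):
            for block in np.array_split(rows, grid[1], axis=1):
                hist, _ = np.histogram(block, bins=n_bins, range=(0, n_bins))
                feats.append(hist)
        return np.concatenate(feats).astype(float)

    def classify(lda, gallery_proj, gallery_ids, probe_hist):
        """Nearest gallery template in LDA space under the Manhattan (L1) distance."""
        p = lda.transform(probe_hist[None])[0]
        return gallery_ids[np.argmin(np.abs(gallery_proj - p).sum(axis=1))]

    # Synthetic check: 5 subjects x 4 gallery images of 64x64 pixels.
    rng = np.random.default_rng(0)
    imgs = rng.integers(0, 256, size=(20, 64, 64), dtype=np.uint8)
    y = np.repeat(np.arange(5), 4)
    X = np.array([spatial_lbp_histogram(im) for im in imgs])
    lda = LinearDiscriminantAnalysis().fit(X, y)
    print(classify(lda, lda.transform(X), y, spatial_lbp_histogram(imgs[0])))  # expected: 0
    ```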

  16. Data merging of infrared and ultrasonic images for plasma facing components inspection

    Energy Technology Data Exchange (ETDEWEB)

    Richou, M. [CEA, IRFM, F-13108 Saint Paul-lez-Durance (France)], E-mail: marianne.richou@cea.fr; Durocher, A. [CEA, IRFM, F-13108 Saint Paul-lez-Durance (France); Medrano, M. [Association EURATOM - CIEMAT, Avda. Complutense 22, 28040 Madrid (Spain); Martinez-Ona, R. [Tecnatom, 28703 S. Sebastian de los Reyes, Madrid (Spain); Moysan, J. [LCND, Universite de la Mediterranee, F-13625 Aix-en-Provence (France); Riccardi, B. [Fusion For Energy, 08019 Barcelona (Spain)

    2009-06-15

    For steady-state magnetic thermonuclear fusion devices which need large power exhaust capability, actively cooled plasma facing components have been developed. In order to guarantee the integrity of these components during the required lifetime, their thermal and mechanical behaviour must be assessed. Before the procurement of the ITER Divertor, the examination of the heat sink to armour joints with non-destructive techniques is an essential topic to be addressed. Defects may be localised at different bonding interfaces. In order to improve the defect detection capability of the SATIR technique, the possibility of merging the infrared thermography test data coming from SATIR with ultrasonic test data has been identified. The data merging of SATIR and ultrasonic results has been performed on Carbon Fiber Composite (CFC) monoblocks with calibrated defects, identified by their position and extension. These calibrated defects were realised by machining, with 'stop-off', or by a lack of CFC activation, the last two representing a real defect more accurately. A batch of 56 samples was produced to simulate every combination of interface location, defect position and extension, and defect realisation technique. The use of a data merging method based on Dempster-Shafer theory significantly improves the detection sensitivity and the reliability of defect location and sizing.

  18. Reconstructing Face Image from the Thermal Infrared Spectrum to the Visible Spectrum

    Directory of Open Access Journals (Sweden)

    Brahmastro Kresnaraman

    2016-04-01

    During the night or in poorly lit areas, thermal cameras are a better choice than normal cameras for security surveillance because they do not rely on illumination. A thermal camera is able to detect a person within its view, but identification from thermal information alone is not an easy task. The purpose of this paper is to reconstruct the face image of a person from the thermal spectrum to the visible spectrum. After the reconstruction, further image processing can be employed, including identification/recognition. Concretely, we propose a two-step thermal-to-visible-spectrum reconstruction method based on Canonical Correlation Analysis (CCA). The reconstruction is done by utilizing the relationship between images in the thermal infrared and visible spectra obtained by CCA. The whole image is processed in the first step, while the second step processes patches of the image. Results show that the proposed method gives satisfying results with the two-step approach and outperforms comparative methods in both quality and recognition evaluations.
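
    A single-step sketch of CCA-based thermal-to-visible mapping with scikit-learn (the paper uses a two-step whole-image plus patch scheme): fit CCA on paired, flattened and registered training images, then learn a linear read-out from the thermal canonical variates back to visible pixels. The number of components is an illustrative parameter.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import CCA
    from sklearn.linear_model import LinearRegression

    def fit_thermal_to_visible(thermal, visible, n_components=32):
        """thermal, visible: (n_pairs, n_pixels) flattened, registered face images."""
        cca = CCA(n_components=n_components).fit(thermal, visible)
        latent = cca.transform(thermal)                       # thermal canonical variates
        readout = LinearRegression().fit(latent, visible)     # map variates back to visible pixels
        return cca, readout

    def reconstruct(cca, readout, thermal_probe):
        """Estimate visible-spectrum pixels for new flattened thermal images."""
        return readout.predict(cca.transform(thermal_probe))
    ```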

  19. Mid-infrared volume diffraction gratings in IG2 chalcogenide glass: fabrication, characterization, and theoretical verification

    Science.gov (United States)

    Butcher, Helen L.; MacLachlan, David G.; Lee, David; Brownsword, Richard A.; Thomson, Robert R.; Weidmann, Damien

    2018-02-01

    Ultrafast laser inscription (ULI) has previously been employed to fabricate volume diffraction gratings in chalcogenide glasses, which operate in transmission mode in the mid-infrared spectral region. Prior gratings were manufactured for applications in astrophotonics, at wavelengths around 2.5 μm. Rugged volume gratings also have potential use in remote atmospheric sensing and molecular spectroscopy; for these applications, longer wavelength operation is required to coincide with atmospheric transparency windows (3-5 μm) and intense ro-vibrational molecular absorption bands. We report on ULI gratings inscribed in IG2 chalcogenide glass, enabling access to the full 3-5 μm window. High-resolution broadband spectral characterization of the fabricated gratings was performed using a Fourier transform spectrometer. The zeroth-order transmission was characterized to derive the diffraction efficiency into higher orders, up to the fourth order in the case of gratings optimized for first-order diffraction at 3 μm. The outcomes imply that ULI in IG2 is well suited for the fabrication of volume gratings in the mid-infrared, provided that the impact of the ULI fabrication parameters on the grating properties is well understood. To develop this understanding, grating modelling was conducted. Parameters studied include grating thickness, refractive index modification, and the aspect ratio of the modulation achieved by ULI. Knowledge of the contribution and sensitivity of these parameters was used to inform the design of a 4.3 μm grating expected to achieve >95% first-order efficiency. We also present the characterization of these latest mid-infrared diffraction gratings in IG2.

  20. Faces

    DEFF Research Database (Denmark)

    Mortensen, Kristine Køhler; Brotherton, Chloe

    2018-01-01

    …for the face to be put into action. Based on an ethnographic study of Danish teenagers’ use of SnapChat, we demonstrate how the face is used as a central medium for interaction with peers. Through the analysis of visual SnapChat messages we investigate how SnapChat requires the sender to put an ‘ugly’ face… already secured their popular status on the heterosexual marketplace in the broad context of the school. Thus SnapChat functions both as a challenge to beauty norms of ‘flawless faces’ and as a reinscription of these same norms by further manifesting the exclusive status of the popular girl…

  1. Verification of Ganoderma (lingzhi) commercial products by Fourier Transform infrared spectroscopy and two-dimensional IR correlation spectroscopy

    Science.gov (United States)

    Choong, Yew-Keong; Sun, Su-Qin; Zhou, Qun; Lan, Jin; Lee, Han-Lim; Chen, Xiang-Dong

    2014-07-01

    Ganoderma commercial products are typically based on two sources: raw material (powder and/or spores) and extract (water and/or solvent). This study compared three types of Ganoderma commercial products using one-dimensional Fourier transform infrared (FTIR) and second-derivative spectroscopy. The spectra of Ganoderma raw material products were compared with spectra of cultivated Ganoderma raw material powder from different mushroom farms in Malaysia. The Ganoderma extract product was also compared with three types of cultivated Ganoderma extracts. Other medicinal Ganoderma contents in the commercial extract product, including glucan and triterpenoid, were analyzed using FTIR and two-dimensional IR correlation spectroscopy (2DIR). The results showed that the water extract of cultivated Ganoderma had spectra comparable to those of the Ganoderma product water extract. By comparing the contents of Ganoderma commercial products using FTIR and 2DIR, product content profiles could be detected. In addition, the geographical origin of the Ganoderma products could be verified by comparing their spectra with those of Ganoderma products from known areas. This study demonstrates the possibility of developing a verification tool to validate the purity of commercial medicinal herbal and mushroom products.

  2. Neural correlates of own- and other-race face recognition in children: a functional near-infrared spectroscopy study.

    Science.gov (United States)

    Ding, Xiao Pan; Fu, Genyue; Lee, Kang

    2014-01-15

    The present study used the functional Near-infrared Spectroscopy (fNIRS) methodology to investigate the neural correlates of elementary school children's own- and other-race face processing. An old-new paradigm was used to assess children's recognition ability of own- and other-race faces. FNIRS data revealed that other-race faces elicited significantly greater [oxy-Hb] changes than own-race faces in the right middle frontal gyrus and inferior frontal gyrus regions (BA9) and the left cuneus (BA18). With increased age, the [oxy-Hb] activity differences between own- and other-race faces, or the neural other-race effect (NORE), underwent significant changes in these two cortical areas: at younger ages, the neural response to the other-race faces was modestly greater than that to the own-race faces, but with increased age, the neural response to the own-race faces became increasingly greater than that to the other-race faces. Moreover, these areas had strong regional functional connectivity with a swath of the cortical regions in terms of the neural other-race effect that also changed with increased age. We also found significant and positive correlations between the behavioral other-race effect (reaction time) and the neural other-race effect in the right middle frontal gyrus and inferior frontal gyrus regions (BA9). These results taken together suggest that children, like adults, devote different amounts of neural resources to processing own- and other-race faces, but the size and direction of the neural other-race effect and associated functional regional connectivity change with increased age. © 2013.

  3. Combining Dark Energy Survey Science Verification data with near-infrared data from the ESO VISTA Hemisphere Survey

    Energy Technology Data Exchange (ETDEWEB)

    Banerji, M.; Jouvel, S.; Lin, H.; McMahon, R. G.; Lahav, O.; Castander, F. J.; Abdalla, F. B.; Bertin, E.; Bosman, S. E.; Carnero, A.; Kind, M. C.; da Costa, L. N.; Gerdes, D.; Gschwend, J.; Lima, M.; Maia, M. A. G.; Merson, A.; Miller, C.; Ogando, R.; Pellegrini, P.; Reed, S.; Saglia, R.; Sanchez, C.; Allam, S.; Annis, J.; Bernstein, G.; Bernstein, J.; Bernstein, R.; Capozzi, D.; Childress, M.; Cunha, C. E.; Davis, T. M.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Doel, P.; Findlay, J.; Finley, D. A.; Flaugher, B.; Frieman, J.; Gaztanaga, E.; Glazebrook, K.; Gonzalez-Fernandez, C.; Gonzalez-Solares, E.; Honscheid, K.; Irwin, M. J.; Jarvis, M. J.; Kim, A.; Koposov, S.; Kuehn, K.; Kupcu-Yoldas, A.; Lagattuta, D.; Lewis, J. R.; Lidman, C.; Makler, M.; Marriner, J.; Marshall, J. L.; Miquel, R.; Mohr, J. J.; Neilsen, E.; Peoples, J.; Sako, M.; Sanchez, E.; Scarpine, V.; Schindler, R.; Schubnell, M.; Sevilla, I.; Sharp, R.; Soares-Santos, M.; Swanson, M. E. C.; Tarle, G.; Thaler, J.; Tucker, D.; Uddin, S. A.; Wechsler, R.; Wester, W.; Yuan, F.; Zuntz, J.

    2014-11-25

    We present the combination of optical data from the Science Verification phase of the Dark Energy Survey (DES) with near-infrared (NIR) data from the European Southern Observatory VISTA Hemisphere Survey (VHS). The deep optical detections from DES are used to extract fluxes and associated errors from the shallower VHS data. Joint seven-band (grizYJK) photometric catalogues are produced in a single 3 sq-deg dedicated camera field centred at 02h26m-04d36m where the availability of ancillary multiwavelength photometry and spectroscopy allows us to test the data quality. Dual photometry increases the number of DES galaxies with measured VHS fluxes by a factor of ~4.5 relative to a simple catalogue level matching and results in a ~1.5 mag increase in the 80 per cent completeness limit of the NIR data. Almost 70 per cent of DES sources have useful NIR flux measurements in this initial catalogue. Photometric redshifts are estimated for a subset of galaxies with spectroscopic redshifts and initial results, although currently limited by small number statistics, indicate that the VHS data can help reduce the photometric redshift scatter at both z < 0.5 and z > 1. We present example DES+VHS colour selection criteria for high-redshift luminous red galaxies (LRGs) at z ~ 0.7 as well as luminous quasars. Using spectroscopic observations in this field we show that the additional VHS fluxes enable a cleaner selection of both populations with <10 per cent contamination from galactic stars in the case of spectroscopically confirmed quasars and <0.5 per cent contamination from galactic stars in the case of spectroscopically confirmed LRGs. The combined DES+VHS data set, which will eventually cover almost 5000 sq-deg, will therefore enable a range of new science and be ideally suited for target selection for future wide-field spectroscopic surveys.

  4. Improvement of non destructive infrared test bed SATIR for examination of actively cooled tungsten armour Plasma Facing Components

    Energy Technology Data Exchange (ETDEWEB)

    Vignal, N., E-mail: nicolas.vignal@cea.fr; Desgranges, C.; Cantone, V.; Richou, M.; Courtois, X.; Missirlian, M.; Magaud, Ph.

    2013-10-15

    Highlights: • Non-destructive infrared techniques for the control of ITER-like PFCs. • Reflective surfaces such as W induce a temperature measurement error. • Numerical data processing by evaluation of the local emissivity. • The SATIR test bed can control metallic surfaces with low and variable emissivity. -- Abstract: For steady state (magnetic) thermonuclear fusion devices which need large power exhaust capability and have to withstand heat fluxes in the range 10–20 MW m{sup −2}, advanced Plasma Facing Components (PFCs) have been developed. The importance of PFCs for operating tokamaks requires their manufacturing quality to be verified before mounting. SATIR is an IR test bed validated and recognized as a reliable and suitable tool to detect cooling defects on PFCs with CFC armour material. Current tokamak developments implement metallic armour materials for the first wall and divertor; their low emissivity causes several difficulties for infrared thermography control. We present improvements of the SATIR infrared thermography test bed for W monoblock components without defects and with calibrated defects. These results are compared to ultrasonic inspection. This study demonstrates that the SATIR method is fully usable for PFCs with low-emissivity armour material.
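
    The highlights above mention correcting for the low, variable emissivity of the tungsten surface. A minimal sketch of such a correction is given below, assuming a single-band camera, graybody behaviour and negligible reflected ambient radiation; the wavelength, emissivity and temperature values are illustrative, not SATIR calibration data.

        # Sketch: emissivity correction for single-band IR thermography of a
        # low-emissivity metallic surface (graybody assumption, reflections ignored).
        import numpy as np

        C1 = 1.191e-16   # first radiation constant for spectral radiance, W m^2 sr^-1
        C2 = 1.4388e-2   # second radiation constant, m K

        def planck_radiance(wavelength_m, temp_k):
            return C1 / (wavelength_m**5 * (np.exp(C2 / (wavelength_m * temp_k)) - 1.0))

        def inverse_planck(wavelength_m, radiance):
            return C2 / (wavelength_m * np.log(C1 / (wavelength_m**5 * radiance) + 1.0))

        wavelength = 4.0e-6    # mid-wave IR band centre (4 um), assumed
        emissivity = 0.25      # assumed local emissivity of the tungsten surface
        t_apparent = 350.0     # blackbody-equivalent temperature seen by the camera, K

        measured_radiance = planck_radiance(wavelength, t_apparent)
        t_corrected = inverse_planck(wavelength, measured_radiance / emissivity)
        print(f"apparent {t_apparent:.1f} K -> emissivity-corrected {t_corrected:.1f} K")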

  5. Improvement of non destructive infrared test bed SATIR for examination of actively cooled tungsten armour Plasma Facing Components

    International Nuclear Information System (INIS)

    Vignal, N.; Desgranges, C.; Cantone, V.; Richou, M.; Courtois, X.; Missirlian, M.; Magaud, Ph.

    2013-01-01

    Highlights: • Non-destructive infrared techniques for the control of ITER-like PFCs. • Reflective surfaces such as W induce a temperature measurement error. • Numerical data processing by evaluation of the local emissivity. • The SATIR test bed can control metallic surfaces with low and variable emissivity. -- Abstract: For steady state (magnetic) thermonuclear fusion devices which need large power exhaust capability and have to withstand heat fluxes in the range 10–20 MW m⁻², advanced Plasma Facing Components (PFCs) have been developed. The importance of PFCs for operating tokamaks requires their manufacturing quality to be verified before mounting. SATIR is an IR test bed validated and recognized as a reliable and suitable tool to detect cooling defects on PFCs with CFC armour material. Current tokamak developments implement metallic armour materials for the first wall and divertor; their low emissivity causes several difficulties for infrared thermography control. We present improvements of the SATIR infrared thermography test bed for W monoblock components without defects and with calibrated defects. These results are compared to ultrasonic inspection. This study demonstrates that the SATIR method is fully usable for PFCs with low-emissivity armour material

  6. Temporal lobe and inferior frontal gyrus dysfunction in patients with schizophrenia during face-to-face conversation: a near-infrared spectroscopy study.

    Science.gov (United States)

    Takei, Yuichi; Suda, Masashi; Aoyama, Yoshiyuki; Yamaguchi, Miho; Sakurai, Noriko; Narita, Kosuke; Fukuda, Masato; Mikuni, Masahiko

    2013-11-01

    Schizophrenia (SC) is marked by poor social-role performance and social-skill deficits that are well reflected in daily conversation. Although the mechanism underlying these impairments has been investigated by functional neuroimaging, technical limitations have prevented the investigation of brain activation during conversation in typical clinical situations. To fill this research gap, this study investigated and compared frontal and temporal lobe activation in patients with SC during face-to-face conversation. Frontal and temporal lobe activation in 29 patients and 31 normal controls (NC) (n = 60) was measured during 180-s conversation periods using near-infrared spectroscopy (NIRS). The grand average values of oxyhemoglobin concentration ([oxy-Hb]) changes during task performance were analyzed to determine their correlation with clinical variables and Positive and Negative Syndrome Scale (PANSS) subscores. Compared to NCs, patients with SC exhibited decreased performance in the conversation task and decreased activation in both the temporal lobes and the right inferior frontal gyrus (IFG) during task performance, as indicated by the grand average of [oxy-Hb] changes. The decreased activation in the left temporal lobe was negatively correlated with the PANSS disorganization and negative symptom subscores, and that in the right IFG was negatively correlated with illness duration, PANSS disorganization, and negative symptom subscores. These findings indicate that brain dysfunction in SC during conversation is related to functional deficits in both the temporal lobes and the right IFG and manifests primarily in the form of disorganized thinking and negative symptomatology. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    Science.gov (United States)

    Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method will use state-of-the-art uncertainty analysis, applying different turbulence models, and draw conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward facing step.
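
    A standard ingredient of such verification studies is the observed order of convergence and a Richardson-extrapolated estimate obtained from three systematically refined grids. The sketch below shows that calculation with made-up values (not the paper's data).

        # Sketch: observed order of accuracy, Richardson extrapolation and a simple
        # grid convergence index (GCI) from three grids with refinement ratio r.
        import math

        f_coarse, f_medium, f_fine = 2.140, 2.068, 2.049   # e.g. reattachment length
        r = 2.0                                             # grid refinement ratio

        # Observed order of convergence from the three solutions.
        p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

        # Richardson extrapolation toward the zero-spacing solution.
        f_extrapolated = f_fine + (f_fine - f_medium) / (r**p - 1.0)

        # Fine-grid GCI with the customary safety factor of 1.25.
        gci_fine = 1.25 * abs((f_fine - f_medium) / f_fine) / (r**p - 1.0)

        print(f"observed order p = {p:.2f}")
        print(f"extrapolated value = {f_extrapolated:.4f}")
        print(f"fine-grid GCI = {100 * gci_fine:.2f}%")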

  8. Infrared

    Science.gov (United States)

    Vollmer, M.

    2013-11-01

    'Infrared' is a very wide field in physics and the natural sciences which has evolved enormously in recent decades. It all started in 1800 with Friedrich Wilhelm Herschel's discovery of infrared (IR) radiation within the spectrum of the Sun. Thereafter a few important milestones towards widespread use of IR were the quantitative description of the laws of blackbody radiation by Max Planck in 1900; the application of quantum mechanics to understand the rotational-vibrational spectra of molecules starting in the first half of the 20th century; and the revolution in source and detector technologies due to micro-technological breakthroughs towards the end of the 20th century. This has led to much high-quality and sophisticated equipment in terms of detectors, sources and instruments in the IR spectral range, with a multitude of different applications in science and technology. This special issue tries to focus on a few aspects of the astonishing variety of different disciplines, techniques and applications concerning the general topic of infrared radiation. Part of the content is based upon an interdisciplinary international conference on the topic held in 2012 in Bad Honnef, Germany. It is hoped that the information provided here may be useful for teaching the general topic of electromagnetic radiation in the IR spectral range in advanced university courses for postgraduate students. In the most general terms, the infrared spectral range is defined to extend from wavelengths of 780 nm (upper range of the VIS spectral range) up to wavelengths of 1 mm (lower end of the microwave range). Various definitions of near, middle and far infrared or thermal infrared, and lately terahertz frequencies, are used, which all fall in this range. These special definitions often depend on the scientific field of research. Unfortunately, many of these fields seem to have developed independently from neighbouring disciplines, although they deal with very similar topics in respect of the

  9. Data combination of infrared thermography images and lock-in thermography images for NDE of plasma facing components

    International Nuclear Information System (INIS)

    Moysan, J.; Gueudre, C.; Corneloup, G.; Durocher, A.

    2006-01-01

    A pioneering activity has been developed by CEA and the European industry in the field of actively cooled high heat flux plasma facing components (PFC) from the very beginning of the Tore Supra project. These components have been developed in order to enable a large power exhaust capability. The goal of this study is to improve the Non Destructive Evaluation (NDE) of these components. The difficulty encountered is the evaluation of the junction between a carbon and a metallic substrate. This is even more difficult when complex designs have to be implemented. A first NDE solution was based on the so-called SATIR test. The method is based on infrared measurements of tile surface temperatures during a thermal transient produced by hot/cold water flowing in the heat sink cooling channel. In order to improve the definition of acceptance rules for the PFCs, a second NDE method based on Lock-in Thermography has been developed. In this work we present how the two resulting images can be combined in order to accept or reject a component. This prospective study allows the experimental setup and the definition of acceptance criteria to be improved. The experimental study was conducted on trial components for the Wendelstein 7-X stellarator. The conclusions will also influence future non-destructive projects dedicated to the ITER project. (orig.)

  10. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluation of the measurement system and determination of systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  11. Infrared reflection properties and modelling of in situ reflection measurements on plasma-facing materials in Tore Supra

    International Nuclear Information System (INIS)

    Reichle, R; Desgranges, C; Faisse, F; Pocheau, C; Lasserre, J-P; Oelhoffen, F; Eupherte, L; Todeschini, M

    2009-01-01

    Tore Supra has, like ITER, reflecting internal surfaces, which can perturb the machine protection systems based on infrared (IR) thermography. To ameliorate this situation, we have measured and modelled in the 3-5 μm wavelength range the bi-directional reflection distribution function (BRDF) of wall material samples from Tore Supra, and conducted in situ reflection measurements and simulated them with the CEA COSMOS code. BRDF results are presented for B4C and carbon fibre composite (CFC) tiles. The hemispherical integrated reflection ranges from 0.12 for the B4C sample to 0.39 for a CFC tile from the limiter erosion zone. In situ measurements of the IR reflection of a blackbody source off an ICRH and an LHCD antenna of Tore Supra are well reproduced by the simulation.

  12. Infrared reflection properties and modelling of in situ reflection measurements on plasma-facing materials in Tore Supra

    Energy Technology Data Exchange (ETDEWEB)

    Reichle, R; Desgranges, C; Faisse, F; Pocheau, C [CEA, IRFM, F-13108 Saint-Paul-lez-Durance (France); Lasserre, J-P; Oelhoffen, F; Eupherte, L; Todeschini, M [CEA, DAM, CESTA, F-33114 Le Barp (France)

    2009-12-15

    Tore Supra has, like ITER, reflecting internal surfaces, which can perturb the machine protection systems based on infrared (IR) thermography. To ameliorate this situation, we have measured and modelled in the 3-5 {mu}m wavelength range the bi-directional reflection distribution function (BRDF) of wall material samples from Tore Supra, and conducted in situ reflection measurements and simulated them with the CEA COSMOS code. BRDF results are presented for B{sub 4}C and carbon fibre composite (CFC) tiles. The hemispherical integrated reflection ranges from 0.12 for the B{sub 4}C sample to 0.39 for a CFC tile from the limiter erosion zone. In situ measurements of the IR reflection of a blackbody source off an ICRH and an LHCD antenna of Tore Supra are well reproduced by the simulation.

  13. Study of heat fluxes on plasma facing components in a tokamak from measurements of temperature by infrared thermography

    International Nuclear Information System (INIS)

    Daviot, R.

    2010-05-01

    The goal of this thesis is the development of a method for computing heat loads on plasma facing components from measurements of temperature by infrared thermography. The research was conducted on three issues arising in current tokamaks as well as in future ones like ITER: the measurement of temperature on reflecting walls, the determination of thermal properties of deposits observed on the surface of tokamak components, and the development of a three-dimensional, non-linear computation of heat loads. A comparison of several pyrometry methods (monochromatic, bi-chromatic and photothermal) is performed on a temperature measurement experiment. We show that this measurement is sensitive to temperature gradients on the observed area. Layers resulting from carbon deposition by the plasma on the surface of components are modeled through a field of equivalent thermal resistance, without thermal inertia. The field of this resistance is determined, for each measurement point, from a comparison of the surface temperature from infrared thermographs with the result of a simulation based on a one-dimensional linear model of the components. The spatial distribution of the deposit on the component surface is obtained. Finally, a three-dimensional and non-linear computation of heat flux fields, based on a finite element method, is developed. Exact geometries of the components are used. The sensitivity of the computed heat fluxes to the accuracy of the temperature measurements is discussed. This computation is applied to two-dimensional temperature measurements from the JET tokamak. Several components of this tokamak are modeled, such as tiles of the divertor, the upper limiter, and the inner and outer poloidal limiters. The distribution of heat fluxes on the surface of these components is computed and studied along the two main tokamak directions, poloidal and toroidal. Toroidal symmetry of the heat loads from one tile to another is shown. The influence of the spatial resolution of the measurements
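
    In steady state, the deposit model described above reduces to a surface heat flux divided by a stack of thermal resistances in series. The sketch below shows that one-dimensional picture with illustrative numbers; it is not the thesis' three-dimensional finite-element computation.

        # Sketch: 1-D steady-state estimate of the heat flux through a plasma facing
        # tile whose carbon deposit is modelled as a pure thermal resistance.
        def heat_flux(t_surface, t_coolant, r_deposit, r_tile, r_convection):
            """Heat flux (W/m^2) through deposit + tile + coolant film in series."""
            return (t_surface - t_coolant) / (r_deposit + r_tile + r_convection)

        t_surface = 900.0       # surface temperature from IR thermography, K (assumed)
        t_coolant = 400.0       # cooling-water temperature, K (assumed)
        r_tile = 0.02 / 200.0   # 2 cm of CFC at ~200 W/m/K -> 1e-4 K m^2/W
        r_conv = 1.0 / 3.0e4    # convective film coefficient ~3e4 W/m^2/K
        r_deposit = 5.0e-5      # equivalent resistance of the deposit (fitted in practice)

        q = heat_flux(t_surface, t_coolant, r_deposit, r_tile, r_conv)
        print(f"estimated heat flux: {q / 1e6:.2f} MW/m^2")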

  14. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

    for states that have traditionally had 'less transparency' in their military sectors. As case studies, first we investigate how to apply verification measures, including remote sensing, off-site environmental sampling and on-site inspections, to monitor the shutdown status of plutonium production facilities, and what measures could be taken to prevent the disclosure of sensitive information at the site. We find that the most effective verification measure to monitor the status of the reprocessing plant would be on-site environmental sampling. Some countries may worry that sample analysis could disclose sensitive information about their past plutonium production activities. However, we find that sample analysis at the reprocessing site need not reveal such information. Sampling would not reveal such information as long as inspectors are not able to measure total quantities of Cs-137 and Sr-90 from HLW produced at former military plutonium production facilities. Secondly, we consider verification measures for shutdown gaseous diffusion uranium-enrichment plants (GDPs). The GDPs could be monitored effectively by satellite imagery, as one telltale operational signature of the GDP would be the water-vapor plume coming from the cooling tower, which should be easy to detect with satellite images. Furthermore, the hot roof of the enrichment building could be detectable using satellite thermal-infrared images. Finally, some on-site verification measures should be allowed, such as visual observation, surveillance and tamper-indicating seals. Finally, the FMCT verification regime would have to be designed to detect undeclared fissile material production activities and facilities. These verification measures could include something like special or challenge inspections or complementary access. There would need to be provisions to prevent the abuse of such inspections, especially at sensitive and non-proscribed military and nuclear activities. In particular, to protect sensitive

  15. Near-Infrared Spectroscopy for Zeeman Spectra of Ti I in Plasma Using a Facing Target Sputtering System

    Science.gov (United States)

    Kobayashi, Shinji; Nishimiya, Nobuo; Suzuki, Masao

    2017-10-01

    The saturated absorption lines of neutral titanium were measured in the region of 9950-14380 cm-1 using a Ti:sapphire ring laser. A facing target sputtering system was used to obtain the gaseous state of a Ti I atom. The Zeeman splitting of 38 transitions was observed under the condition that the electric field component of a linearly polarized laser beam was parallel to the magnetic field. The gJ factors of the odd parity states were determined for 28 states belonging to 3d24s4p and 3d34p using those of the even parity states reported by Stachowska in 1997. The gJ factors of z5P1,2,3 levels were newly determined. gJ of y3F2, y3D2, z3P2, and z5S2 levels were refined.

  16. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.
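
    To convey the flavour of such a translation (this is not the authors' tool), the toy script below maps a gate-level netlist into HOL-style relational definitions: each gate becomes a predicate and a module becomes an existentially quantified conjunction over its internal wires. The netlist format and the emitted syntax are assumptions for illustration.

        # Toy sketch of an HDL-netlist to HOL-style definition translator.
        GATE_RELATIONS = {
            "AND": "AND_spec {ins} {out}",
            "OR":  "OR_spec {ins} {out}",
            "NOT": "NOT_spec {ins} {out}",
        }

        def to_hol(module, ports, internals, gates):
            body = " /\\ ".join(
                GATE_RELATIONS[kind].format(ins=" ".join(ins), out=out)
                for kind, ins, out in gates
            )
            quantified = f"?{' '.join(internals)}. {body}" if internals else body
            return f"|- {module}_imp {' '.join(ports)} = ({quantified})"

        # Half adder: carry = a AND b, sum = a XOR b built from AND/OR/NOT.
        netlist = [
            ("AND", ["a", "b"], "carry"),
            ("OR",  ["a", "b"], "w1"),
            ("NOT", ["carry"],  "w2"),
            ("AND", ["w1", "w2"], "sum"),
        ]
        print(to_hol("HALF_ADDER", ["a", "b", "sum", "carry"], ["w1", "w2"], netlist))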

  17. Correlative two-photon and serial block face scanning electron microscopy in neuronal tissue using 3D near-infrared branding maps.

    Science.gov (United States)

    Lees, Robert M; Peddie, Christopher J; Collinson, Lucy M; Ashby, Michael C; Verkade, Paul

    2017-01-01

    Linking cellular structure and function has always been a key goal of microscopy, but obtaining high resolution spatial and temporal information from the same specimen is a fundamental challenge. Two-photon (2P) microscopy allows imaging deep inside intact tissue, bringing great insight into the structural and functional dynamics of cells in their physiological environment. At the nanoscale, the complex ultrastructure of a cell's environment in tissue can be reconstructed in three dimensions (3D) using serial block face scanning electron microscopy (SBF-SEM). This provides a snapshot of high resolution structural information pertaining to the shape, organization, and localization of multiple subcellular structures at the same time. The pairing of these two imaging modalities in the same specimen provides key information to relate cellular dynamics to the ultrastructural environment. Until recently, approaches to relocate a region of interest (ROI) in tissue from 2P microscopy for SBF-SEM have been inefficient or unreliable. However, near-infrared branding (NIRB) overcomes this by using the laser from a multiphoton microscope to create fiducial markers for accurate correlation of 2P and electron microscopy (EM) imaging volumes. The process is quick and can be user defined for each sample. Here, to increase the efficiency of ROI relocation, multiple NIRB marks are used in 3D to target ultramicrotomy. A workflow is described and discussed to obtain a data set for 3D correlated light and electron microscopy, using three different preparations of brain tissue as examples. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Photodissociation of CH3CHO at 248 nm by time-resolved Fourier-transform infrared emission spectroscopy: Verification of roaming and triple fragmentation

    Science.gov (United States)

    Hung, Kai-Chan; Tsai, Po-Yu; Li, Hou-Kuan; Lin, King-Chuen

    2014-02-01

    By using time-resolved Fourier-transform infrared emission spectroscopy, the HCO fragment dissociated from acetaldehyde (CH3CHO) at 248 nm is found to partially decompose to H and CO. The fragment yields are enhanced by Ar addition, which facilitates collision-induced internal conversion. The channels to CH2CO + H2 and CH3CO + H are not detected significantly. The rotational population distribution of CO, after removing the Ar collision effect, shows a bimodal feature comprising both low- and high-rotational (J) components, sharing fractions of 19% and 81%, respectively, for the vibrational state v = 1. The low-J component is ascribed to both the roaming pathway and triple fragmentation. They are each determined to have a branching ratio of 0.06 relative to the whole v = 1 population. The CO roaming is accompanied by a highly vibrationally excited population of CH4 that yields a vibrational bimodality.

  19. Pedestrian Detection Based on Adaptive Selection of Visible Light or Far-Infrared Light Camera Image by Fuzzy Inference System and Convolutional Neural Network-Based Verification.

    Science.gov (United States)

    Kang, Jin Kyu; Hong, Hyung Gil; Park, Kang Ryoung

    2017-07-08

    A number of studies have been conducted to enhance the pedestrian detection accuracy of intelligent surveillance systems. However, detecting pedestrians under outdoor conditions is a challenging problem due to the varying lighting, shadows, and occlusions. In recent times, a growing number of studies have been performed on visible light camera-based pedestrian detection systems using a convolutional neural network (CNN) in order to make the pedestrian detection process more resilient to such conditions. However, visible light cameras still cannot detect pedestrians during nighttime, and are easily affected by shadows and lighting. There are many studies on CNN-based pedestrian detection through the use of far-infrared (FIR) light cameras (i.e., thermal cameras) to address such difficulties. However, when the solar radiation increases and the background temperature reaches the same level as the body temperature, it remains difficult for the FIR light camera to detect pedestrians due to the insignificant difference between the pedestrian and non-pedestrian features within the images. Researchers have been trying to solve this issue by inputting both the visible light and the FIR camera images into the CNN as the input. This, however, takes a longer time to process, and makes the system structure more complex as the CNN needs to process both camera images. This research adaptively selects a more appropriate candidate between two pedestrian images from visible light and FIR cameras based on a fuzzy inference system (FIS), and the selected candidate is verified with a CNN. Three types of databases were tested, taking into account various environmental factors using visible light and FIR cameras. The results showed that the proposed method performs better than the previously reported methods.
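
    A minimal sketch of the selection step described above is given below: fuzzy memberships computed from simple image statistics decide whether the visible-light or the FIR candidate is passed on to a verifier. The membership functions, features and placeholder verifier are assumptions for illustration, not the paper's trained design.

        # Sketch: choose the visible-light or FIR pedestrian candidate with simple
        # fuzzy rules, then hand the chosen crop to a (placeholder) verifier.
        import numpy as np

        def tri(x, a, b, c):
            """Triangular fuzzy membership on [a, c] peaking at b."""
            return max(0.0, min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)))

        def select_camera(visible_crop, fir_crop):
            brightness = visible_crop.mean() / 255.0     # low at night
            thermal_contrast = fir_crop.std() / 255.0    # low when the background is warm
            prefer_visible = tri(brightness, 0.2, 0.6, 1.0)
            prefer_fir = max(tri(1.0 - brightness, 0.2, 0.6, 1.0),
                             tri(thermal_contrast, 0.1, 0.4, 0.9))
            return ("visible", visible_crop) if prefer_visible >= prefer_fir else ("fir", fir_crop)

        def verify_candidate(crop):
            # Placeholder for the CNN verifier; a real system would run a trained network.
            return float(crop.std() > 20.0)

        rng = np.random.default_rng(0)
        visible = rng.integers(0, 40, (64, 32)).astype(float)   # dark night-time crop
        fir = rng.integers(80, 200, (64, 32)).astype(float)     # clear thermal signature
        source, chosen = select_camera(visible, fir)
        print(source, "candidate selected, verifier score:", verify_candidate(chosen))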

  20. Decoding of faces and face components in face-sensitive human visual cortex

    Directory of Open Access Journals (Sweden)

    David F Nichols

    2010-07-01

    Full Text Available A great challenge to the field of visual neuroscience is to understand how faces are encoded and represented within the human brain. Here we show evidence from functional magnetic resonance imaging (fMRI) for spatially distributed processing of the whole face and its components in face-sensitive human visual cortex. We used multi-class linear pattern classifiers constructed with a leave-one-scan-out verification procedure to discriminate brain activation patterns elicited by whole faces, the internal features alone, and the external head outline alone. Furthermore, our results suggest that whole faces are represented disproportionately in the fusiform cortex (FFA), whereas the building blocks of faces are represented disproportionately in occipitotemporal cortex (OFA). Faces and face components may therefore be organized with functional clustering within both the FFA and OFA, but with specialization for face components in the OFA and the whole face in the FFA.
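
    The decoding scheme above can be pictured as a multi-class linear classifier evaluated with leave-one-scan-out cross-validation. The sketch below reproduces only that procedure on synthetic data with scikit-learn; it is not the study's pipeline.

        # Sketch: decode three stimulus categories (whole face, internal features,
        # external outline) with leave-one-scan-out cross-validation. Synthetic data.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

        rng = np.random.default_rng(1)
        n_scans, trials_per_category, n_voxels = 8, 10, 200
        X, y, groups = [], [], []
        for scan in range(n_scans):
            for label in range(3):
                pattern = np.zeros(n_voxels)
                pattern[label * 20:(label + 1) * 20] = 0.8   # weak category-specific signal
                X.append(rng.normal(pattern, 1.0, (trials_per_category, n_voxels)))
                y += [label] * trials_per_category
                groups += [scan] * trials_per_category
        X = np.vstack(X)

        scores = cross_val_score(LogisticRegression(max_iter=2000), X, np.array(y),
                                 groups=np.array(groups), cv=LeaveOneGroupOut())
        print(f"leave-one-scan-out accuracy: {scores.mean():.2f} (chance = 0.33)")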

  1. Hawk-I - First results from science verification

    NARCIS (Netherlands)

    Kissler-Patig, M.; Larsen, S.S.|info:eu-repo/dai/nl/304833347; Wehner, E.M.|info:eu-repo/dai/nl/314114688

    2008-01-01

    The VLT wide-field near-infrared imager HAWK-I was commissioned in 2007 and Science Verification (SV) programmes were conducted in August 2007. A selection of results from among the twelve Science Verification proposals is summarised.

  2. Swarm Verification

    Science.gov (United States)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
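
    A toy model of the swarm idea is sketched below: many small, differently randomized, depth-bounded searches of a large state space run in parallel, and the run stops as soon as any worker reaches a violating state. The transition system, bounds and pool size are arbitrary stand-ins, not the SPIN-based implementation behind the paper.

        # Toy sketch of swarm verification with independent randomized searches.
        import multiprocessing as mp
        import random

        STATE_SPACE = 1_000_003      # toy state space size
        DEPTH_BOUND = 40

        def successors(state):
            # Arbitrary toy transition relation over integers.
            return [(3 * state + 1) % STATE_SPACE, (state + 7) % STATE_SPACE]

        def is_violation(state):
            return state == 4242     # the "needle" the swarm is searching for

        def worker(seed):
            rng = random.Random(seed)
            visited = set()
            for _ in range(50_000):                  # each worker covers a small slice
                state, depth = rng.randrange(STATE_SPACE), 0
                while depth < DEPTH_BOUND and state not in visited:
                    visited.add(state)
                    if is_violation(state):
                        return seed, state
                    state = rng.choice(successors(state))
                    depth += 1
            return None

        if __name__ == "__main__":
            with mp.Pool(processes=8) as pool:
                for result in pool.imap_unordered(worker, range(8)):
                    if result is not None:
                        print(f"worker {result[0]} reached violating state {result[1]}")
                        pool.terminate()
                        break
                else:
                    print("no violation found by any worker")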

  3. Face to Face

    OpenAIRE

    Robert Leckey

    2013-01-01

    This paper uses Queer theory, specifically literature on Bowers v. Hardwick, to analyze debates over legislation proposed in Quebec regarding covered faces. Queer theory sheds light on legal responses to the veil. Parliamentary debates in Quebec reconstitute the polity, notably as secular and united. The paper highlights the contradictory and unstable character of four binaries: legislative text versus social practice, act versus status, majority versus minority, and knowable versus unknowable...

  4. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and of the kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is then used to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and a contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model, and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product-of-likelihood-ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.
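
    The soft-biometric fusion mentioned at the end can be pictured as converting a face-verification score and a kinship score into likelihood ratios and multiplying them. The sketch below does exactly that with invented Gaussian score distributions standing in for the trained models.

        # Sketch: product-of-likelihood-ratios fusion of a face score and a kin score.
        from math import exp, pi, sqrt

        def gaussian_pdf(x, mean, std):
            return exp(-0.5 * ((x - mean) / std) ** 2) / (std * sqrt(2 * pi))

        def likelihood_ratio(score, genuine, impostor):
            return gaussian_pdf(score, *genuine) / gaussian_pdf(score, *impostor)

        # (mean, std) of scores under genuine and impostor hypotheses (assumed).
        FACE_GENUINE, FACE_IMPOSTOR = (0.70, 0.12), (0.40, 0.12)
        KIN_GENUINE, KIN_IMPOSTOR = (0.65, 0.15), (0.45, 0.15)

        face_score, kin_score = 0.62, 0.60      # scores for one comparison (invented)
        fused_lr = (likelihood_ratio(face_score, FACE_GENUINE, FACE_IMPOSTOR)
                    * likelihood_ratio(kin_score, KIN_GENUINE, KIN_IMPOSTOR))
        print("accept" if fused_lr > 1.0 else "reject", f"(fused LR = {fused_lr:.2f})")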

  5. Investigations on in situ diagnostics by an infrared camera to distinguish between the plasma facing tiles with carbonaceous surface layer and defect in the underneath junction

    International Nuclear Information System (INIS)

    Cai, Laizhong; Gauthier, Eric; Corre, Yann; Liu, Jian

    2013-01-01

    Both a surface deposition layer and a delamination of the underlying junction on plasma facing components (PFCs) can result in abnormally high surface temperatures under normal heating conditions. A tile with delamination has to be replaced to prevent a critical failure (complete delamination) during plasma operation, while a carbon deposit can be removed without any repair. Therefore, distinguishing in situ between deposited tiles and junction-defect tiles is crucial to avoid critical failures without unwanted shutdowns. In this paper, the thermal behaviors of junction-defect tiles and carbon-deposit tiles are simulated numerically. A modified time-constant method is then introduced to analyze the thermal behaviors of deposited tiles and junction-defect tiles. The feasibility of discrimination by analyzing the thermal behaviors of tiles is discussed and the requirements of this method for discrimination are described. Finally, the time resolution required of IR cameras to perform the discrimination is mentioned

  6. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    The reasons why verification of software products throughout the software life cycle is necessary are considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  7. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features of the IAEA safeguards verification system that non-nuclear-weapon states parties to the NPT are obliged to accept are described. Verification activities and problems in Iraq and North Korea are discussed.

  8. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features of the IAEA safeguards verification system that non-nuclear-weapon states parties to the NPT are obliged to accept are described. Verification activities and problems in Iraq and North Korea are discussed

  9. Verification and nuclear material security

    International Nuclear Information System (INIS)

    ElBaradei, M.

    2001-01-01

    Full text: The Director General will open the symposium by presenting a series of challenges facing the international safeguards community: the need to ensure a robust system, with strong verification tools and a sound research and development programme; the importance of securing the necessary support for the system, in terms of resources; the effort to achieve universal participation in the non-proliferation regime; and the necessity of re-energizing disarmament efforts. Special focus will be given to the challenge underscored by recent events, of strengthening international efforts to combat nuclear terrorism. (author)

  10. Face to Face

    Directory of Open Access Journals (Sweden)

    Robert Leckey

    2013-12-01

    Full Text Available This paper uses Queer theory, specifically literature on Bowers v. Hardwick, to analyze debates over legislation proposed in Quebec regarding covered faces. Queer theory sheds light on legal responses to the veil. Parliamentary debates in Quebec reconstitute the polity, notably as secular and united. The paper highlights the contradictory and unstable character of four binaries: legislative text versus social practice, act versus status, majority versus minority, and knowable versus unknowable. As with contradictory propositions about homosexuality, contradiction does not undermine discourse but makes it stronger and more agile.

  11. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  12. Improving Face Detection with TOF Cameras

    DEFF Research Database (Denmark)

    Hansen, Dan Witzner; Larsen, Rasmus; Lauze, F

    2007-01-01

    A face detection method based on a boosted classifier using images from a time-of-flight sensor is presented. We show that the performance of face detection can be improved when using both depth and gray scale images and that the common use of integration of hypotheses for verification can...... be relaxed. Based on the detected face we employ an active contour method on depth images for full head segmentation....

  13. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); to evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and to develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.

  14. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  15. The effect of image resolution on the performance of a face recognition system

    NARCIS (Netherlands)

    Boom, B.J.; Beumer, G.M.; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.

    2006-01-01

    In this paper we investigate the effect of image resolution on the error rates of a face verification system. We do not restrict ourselves to the face recognition algorithm only, but we also consider the face registration. In our face recognition system, the face registration is done by finding

  16. About Face

    Medline Plus

    Full Text Available ... PTSD We've been there. After ...

  17. About Face

    Medline Plus

    Full Text Available ... PTSD We've been there. After a traumatic ...

  18. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing and in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal and the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SOC so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager that is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.

  19. Quantified Faces

    DEFF Research Database (Denmark)

    Sørensen, Mette-Marie Zacher

    2016-01-01

    artist Marnix de Nijs' Physiognomic Scrutinizer is an interactive installation whereby the viewer's face is scanned and identified with historical figures. The American artist Zach Blas' project Fag Face Mask consists of three-dimensional portraits that blend biometric facial data from 30 gay men's faces...... and critically examine bias in surveillance technologies, as well as scientific investigations, regarding the stereotyping mode of the human gaze. The American artist Heather Dewey-Hagborg creates three-dimensional portraits of persons she has “identified” from their garbage. Her project from 2013 entitled...

  20. Reading faces and Facing words

    DEFF Research Database (Denmark)

    Robotham, Julia Emma; Lindegaard, Martin Weis; Delfi, Tzvetelina Shentova

    unilateral lesions, we found no patient with a selective deficit in either reading or face processing. Rather, the patients showing a deficit in processing either words or faces were also impaired with the other category. One patient performed within the normal range on all tasks. In addition, all patients......It has long been argued that perceptual processing of faces and words is largely independent, highly specialised and strongly lateralised. Studies of patients with either pure alexia or prosopagnosia have strongly contributed to this view. The aim of our study was to investigate how visual...... perception of faces and words is affected by unilateral posterior stroke. Two patients with lesions in their dominant hemisphere and two with lesions in their non-dominant hemisphere were tested on sensitive tests of face and word perception during the stable phase of recovery. Despite all patients having...

  1. About Face

    Medline Plus

    Full Text Available ...

  2. About Face

    Medline Plus

    Full Text Available ... not feeling better, you may have PTSD (posttraumatic stress disorder). Watch the intro This is AboutFace In these videos, Veterans, family members, and clinicians share their experiences with PTSD ...

  3. About Face

    Medline Plus

    Full Text Available ...

  4. About Face

    Medline Plus

    Full Text Available ...

  5. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  6. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  7. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ..... scenario under which a similar error could occur. There are various other ...

  8. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  9. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  10. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  11. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Generally, progress in multilateral arms control verification is painstakingly slow, but from time to time 'windows of opportunity' - that is, moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  12. Enhanced detectability of fluorinated derivatives of N,N-dialkylamino alcohols and precursors of nitrogen mustards by gas chromatography coupled to Fourier transform infrared spectroscopy analysis for verification of chemical weapons convention.

    Science.gov (United States)

    Garg, Prabhat; Purohit, Ajay; Tak, Vijay K; Dubey, D K

    2009-11-06

    N,N-Dialkylamino alcohols, and N-methyldiethanolamine, N-ethyldiethanolamine and triethanolamine, are the precursors of VX-type nerve agents and of three different nitrogen mustards, respectively. Their detection and identification are of paramount importance for verification analysis under the Chemical Weapons Convention. GC-FTIR is used as a complementary technique to GC-MS analysis for the identification of these analytes. One constraint of GC-FTIR, its low sensitivity, was overcome by converting the analytes to their fluorinated derivatives. Owing to their high absorptivity in the IR region, these derivatives facilitated detection by GC-FTIR analysis. Derivatizing reagents having trimethylsilyl, trifluoroacyl and heptafluorobutyryl groups on an imidazole moiety were screened. The derivatives formed were analyzed quantitatively by GC-FTIR. Of the reagents studied, heptafluorobutyrylimidazole (HFBI) produced the greatest increase in sensitivity in GC-FTIR detection. Sensitivity enhancements of 60-125-fold were observed for the analytes after HFBI derivatization. Absorbances due to the various functional groups responsible for the enhanced sensitivity were compared by determining their corresponding relative molar extinction coefficients ( [Formula: see text] ), assuming a uniform optical path length. The RSDs for intraday repeatability and interday reproducibility for the various derivatives were 0.2-1.1% and 0.3-1.8%, respectively. Limits of detection (LOD) of 10-15 ng were achieved, and the applicability of the method was tested with unknown samples obtained in international proficiency tests.

  13. Method for secure electronic voting system: face recognition based approach

    Science.gov (United States)

    Alim, M. Affan; Baig, Misbah M.; Mehboob, Shahzain; Naseem, Imran

    2017-06-01

    In this paper, we propose a framework for a low-cost, secure electronic voting system based on face recognition. Essentially, Local Binary Patterns (LBP) are used for face feature characterization in texture format, followed by a chi-square distribution for image classification. Two parallel systems, based on smartphone and web applications, are developed for the face learning and verification modules. The proposed system has two tiers of security, using a person ID followed by face verification. A class-specific threshold is associated with each enrolled identity for controlling the security level of face verification. Our system is evaluated on three standard databases and one real home-based database and achieves satisfactory recognition accuracies. Consequently, our proposed system provides a secure, hassle-free voting system that is less intrusive compared with other biometrics.
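
    A minimal sketch of the LBP-plus-chi-square pipeline described above follows, using scikit-image for the LBP operator and synthetic images; the neighbourhood parameters and the acceptance threshold are placeholders, not the proposed system's settings.

        # Sketch: LBP histogram features and a chi-square distance for verification.
        import numpy as np
        from skimage.feature import local_binary_pattern

        P, R = 8, 1                          # LBP neighbourhood: 8 samples, radius 1

        def lbp_histogram(gray_image):
            codes = local_binary_pattern(gray_image, P, R, method="uniform")
            n_bins = P + 2                   # number of uniform LBP codes for (P, R)
            hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
            return hist

        def chi_square_distance(h1, h2, eps=1e-10):
            return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

        rng = np.random.default_rng(7)
        enrolled = rng.integers(0, 256, (96, 96)).astype(float)
        probe = np.clip(enrolled + rng.normal(0, 8, enrolled.shape), 0, 255)

        distance = chi_square_distance(lbp_histogram(enrolled), lbp_histogram(probe))
        THRESHOLD = 0.05                     # a class-specific threshold would be tuned
        print("verified" if distance < THRESHOLD else "rejected", f"(chi2 = {distance:.4f})")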

  14. Multithread Face Recognition in Cloud

    Directory of Open Access Journals (Sweden)

    Dakshina Ranjan Kisku

    2016-01-01

    Full Text Available Faces are highly challenging and dynamic objects that are employed as biometric evidence in identity verification. Recently, biometric systems have proven to be an essential security tool, in which bulk matching of enrolled people and watch lists is performed every day. To facilitate this process, organizations need to maintain large computing facilities. To minimize the burden of maintaining these costly facilities for enrollment and recognition, multinational companies can transfer this responsibility to third-party vendors who can maintain cloud computing infrastructures for recognition. In this paper, we showcase cloud computing-enabled face recognition, which utilizes PCA-characterized face instances and reduces the number of invariant SIFT points that are extracted from each face. To achieve high interclass and low intraclass variances, a set of six PCA-characterized face instances is computed on columns of each face image by varying the number of principal components. Extracted SIFT keypoints are fused using sum and max fusion rules. A novel cohort selection technique is applied to increase the overall performance. The proposed protomodel is tested on the BioID and FEI face databases, and the efficacy of the system is proven based on the obtained results. We also compare the proposed method with other well-known methods.
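
    The sum and max fusion rules mentioned above amount to a few lines once the per-instance match scores are normalized; the scores and the min-max normalization below are invented for illustration.

        # Sketch: min-max normalization followed by sum-rule and max-rule fusion.
        import numpy as np

        def min_max_normalize(scores, low, high):
            return (np.asarray(scores, dtype=float) - low) / (high - low)

        # Hypothetical SIFT match scores of one probe against the six
        # PCA-characterized instances of an enrolled face.
        raw_scores = [34, 41, 28, 52, 39, 45]
        norm = min_max_normalize(raw_scores, low=0, high=60)

        sum_fused = norm.sum() / len(norm)    # sum rule (averaged)
        max_fused = norm.max()                # max rule
        print(f"sum-rule score: {sum_fused:.2f}, max-rule score: {max_fused:.2f}")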

  15. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  16. About Face

    Medline Plus

    Full Text Available ... at first. But if it's been months or years since the trauma and you're not feeling better, you may have PTSD (posttraumatic stress disorder). Watch the intro This is AboutFace In these videos, Veterans, family members, ...

  17. About Face

    Medline Plus

    Full Text Available ... What is AboutFace? Resources for Professionals Get Help PTSD We've been there. After a traumatic event — ... you're not feeling better, you may have PTSD (posttraumatic stress disorder). Watch the intro This is ...

  18. Infrared astronomy

    International Nuclear Information System (INIS)

    Setti, G.; Fazio, G.

    1978-01-01

    This volume contains lectures describing the important achievements in infrared astronomy. The topics included are galactic infrared sources and their role in star formation, the nature of the interstellar medium and galactic structure, the interpretation of infrared, optical and radio observations of extra-galactic sources and their role in the origin and structure of the universe, instrumental techniques and a review of future space observations. (C.F.)

  19. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (''AI'') concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions. This evaluation uses logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation

  20. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification

  1. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).

  2. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  3. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  4. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  5. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in the Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. The comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. The introduction of two-level criteria for the evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values (Authors)

  6. Sensor-fusion-based biometric identity verification

    International Nuclear Information System (INIS)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm
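
    As a rough sketch of the fusion idea in this report, several coarse per-modality scores (hand, face, ear, voice) can be combined by a learned rule and thresholded; the toy data, the choice of logistic regression, and the threshold below are placeholders, not the system described.

```python
# Sketch: fuse per-modality biometric scores into one verification decision.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy training data: each row is [hand, face, ear, voice]; label 1 = genuine.
X_train = np.array([[0.9, 0.8, 0.7, 0.9],
                    [0.2, 0.3, 0.4, 0.1],
                    [0.8, 0.9, 0.6, 0.7],
                    [0.3, 0.2, 0.5, 0.2]])
y_train = np.array([1, 0, 1, 0])

fuser = LogisticRegression().fit(X_train, y_train)

def verify(scores, threshold=0.5):
    """Fuse the per-modality scores into one posterior and threshold it."""
    return fuser.predict_proba(np.atleast_2d(scores))[0, 1] >= threshold

print(verify([0.85, 0.7, 0.75, 0.8]))   # True for this toy model
```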

  7. Collaborative Random Faces-Guided Encoders for Pose-Invariant Face Representation Learning.

    Science.gov (United States)

    Shao, Ming; Zhang, Yizhe; Fu, Yun

    2018-04-01

    Learning discriminant face representation for pose-invariant face recognition has been identified as a critical issue in visual learning systems. The challenge lies in the drastic changes of facial appearance between the test face and the registered face. To that end, we propose a high-level feature learning framework called "collaborative random faces (RFs)-guided encoders" toward this problem. The contributions of this paper are threefold. First, we propose a novel supervised autoencoder that is able to capture the high-level identity feature despite pose variations. Second, we enrich the identity features by replacing the target values of conventional autoencoders with random signals (RFs in this paper), which are unique for each subject under different poses. Third, we further improve the performance of the framework by incorporating deep convolutional neural network facial descriptors and linking discriminative identity features from different RFs to form augmented identity features. Finally, we conduct face identification experiments on the Multi-PIE database, and face verification experiments on the Labeled Faces in the Wild and YouTube Faces databases, where face recognition rate and verification accuracy with Receiver Operating Characteristic curves are reported. In addition, discussions of model parameters and connections with existing methods are provided. These experiments demonstrate that our learning system works fairly well on handling pose variations.

  8. Verification of a Fissile Material Cut-off Treaty (FMCT): The Potential Role of the IAEA

    International Nuclear Information System (INIS)

    Chung, Jin Ho

    2016-01-01

    The objective of a future verification of a FMCT(Fissile Material Cut-off Treaty) is to deter and detect non-compliance with treaty obligations in a timely and non-discriminatory manner with regard to banning the production of fissile material for nuclear weapons or other nuclear devices. Since the International Atomic Energy Agency (IAEA) has already established the IAEA safeguards as a verification system mainly for Non -Nuclear Weapon States (NNWSs), it is expected that the IAEA's experience and expertise in this field will make a significant contribution to setting up a future treaty's verification regime. This paper is designed to explore the potential role of the IAEA in verifying the future treaty by analyzing verification abilities of the Agency in terms of treaty verification and expected challenges. Furthermore, the concept of multilateral verification that could be facilitated by the IAEA will be examined as a measure of providing a credible assurance of compliance with a future treaty. In this circumstance, it is necessary for the IAEA to be prepared for playing a leading role in FMCT verifications as a form of multilateral verification by taking advantage of its existing verification concepts, methods, and tools. Also, several challenges that the Agency faces today need to be overcome, including dealing with sensitive and proliferative information, attribution of fissile materials, lack of verification experience in military fuel cycle facilities, and different attitude and culture towards verification between NWSs and NNWSs

  9. Verification of a Fissile Material Cut-off Treaty (FMCT): The Potential Role of the IAEA

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Jin Ho [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of)

    2016-05-15

    The objective of a future verification of a FMCT(Fissile Material Cut-off Treaty) is to deter and detect non-compliance with treaty obligations in a timely and non-discriminatory manner with regard to banning the production of fissile material for nuclear weapons or other nuclear devices. Since the International Atomic Energy Agency (IAEA) has already established the IAEA safeguards as a verification system mainly for Non -Nuclear Weapon States (NNWSs), it is expected that the IAEA's experience and expertise in this field will make a significant contribution to setting up a future treaty's verification regime. This paper is designed to explore the potential role of the IAEA in verifying the future treaty by analyzing verification abilities of the Agency in terms of treaty verification and expected challenges. Furthermore, the concept of multilateral verification that could be facilitated by the IAEA will be examined as a measure of providing a credible assurance of compliance with a future treaty. In this circumstance, it is necessary for the IAEA to be prepared for playing a leading role in FMCT verifications as a form of multilateral verification by taking advantage of its existing verification concepts, methods, and tools. Also, several challenges that the Agency faces today need to be overcome, including dealing with sensitive and proliferative information, attribution of fissile materials, lack of verification experience in military fuel cycle facilities, and different attitude and culture towards verification between NWSs and NNWSs.

  10. Infrared thermography

    CERN Document Server

    Meola, Carosena

    2012-01-01

    This e-book conveys information about basic IRT theory, infrared detectors, signal digitalization and applications of infrared thermography in many fields such as medicine, foodstuff conservation, fluid-dynamics, architecture, anthropology, condition monitoring, non destructive testing and evaluation of materials and structures.

  11. Solar and infrared radiation measurements

    CERN Document Server

    Vignola, Frank; Michalsky, Joseph

    2012-01-01

    The rather specialized field of solar and infrared radiation measurement has become more and more important in the face of growing demands by the renewable energy and climate change research communities for data that are more accurate and have increased temporal and spatial resolution. Updating decades of acquired knowledge in the field, Solar and Infrared Radiation Measurements details the strengths and weaknesses of instruments used to conduct such solar and infrared radiation measurements. Topics covered include: Radiometer design and performance Equipment calibration, installation, operati

  12. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper

  13. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...
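
    A toy illustration of the type-level abstract interpretation idea behind bytecode verification: the interpreter tracks the type of each stack slot instead of its value and rejects ill-typed operations. The miniature instruction set and type names are invented for brevity and are not the authors' algorithm.

```python
# Toy abstract interpreter over a miniature stack language: track types, not values.
def verify_types(program):
    stack = []                      # abstract stack holding type names
    for op, *args in program:
        if op == "push_int":
            stack.append("int")
        elif op == "push_ref":
            stack.append("ref")
        elif op == "iadd":          # integer add: consumes two ints, yields one
            if stack[-2:] != ["int", "int"]:
                raise TypeError(f"iadd applied to {stack[-2:]}")
            stack[-2:] = ["int"]
        elif op == "getfield":      # field read: needs a reference on top
            if not stack or stack[-1] != "ref":
                raise TypeError("getfield on non-reference")
            stack[-1] = args[0]     # replaced by the field's declared type
    return stack

# A well-typed snippet passes; starting with "push_ref" instead would raise.
print(verify_types([("push_int",), ("push_int",), ("iadd",)]))   # ['int']
```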

  14. UNSCOM faces entirely new verification challenges in Iraq

    International Nuclear Information System (INIS)

    Trevan, T.

    1993-01-01

    Starting with the very first declarations and inspections, it became evident that Iraq was not acting in good faith, would use every possible pretext to reinterpret UNSCOM's inspection rights, and occasionally would use harassment tactics to make inspections as difficult as possible. Topics considered in detail include: initial assumptions, outstanding issues, and UNSCOM's future attitude

  15. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

    additional paintbrushes. Additionally, in Paradox, human players are never given small optimization problems (for example, toggling the values of 50... were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify...

  16. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  17. Montana Cropland Enrolled Farm Fields Carbon Sequestration Field Sampling, Measurement, Monitoring, and Verification: Application of Visible-Near Infrared Diffuse Reflectance Spectroscopy (VNIR) and Laser-induced Breakdown Spectroscopy (LIBS)

    Energy Technology Data Exchange (ETDEWEB)

    Lee Spangler; Ross Bricklemyer; David Brown

    2012-03-15

    There is a growing need for rapid, accurate, and inexpensive methods to measure and verify soil organic carbon (SOC) change for national greenhouse gas accounting and the development of a soil carbon trading market. Laboratory-based soil characterization typically requires significant soil processing, which is time and resource intensive. This severely limits application for large-region soil characterization. Thus, development of rapid and accurate methods for characterizing soils is needed to map soil properties for precision agriculture applications, improve regional and global soil carbon (C) stock and flux estimates, and efficiently map sub-surface metal contamination, among others. The greatest gains for efficient soil characterization will come from collecting soil data in situ, thus minimizing soil sample transportation, processing, and lab-based measurement costs. Visible and near-infrared diffuse reflectance spectroscopy (VisNIR) and laser-induced breakdown spectroscopy (LIBS) are two complementary, yet fundamentally different spectroscopic techniques that have the potential to meet this need. These sensors have the potential to be mounted on a soil penetrometer and deployed for rapid soil profile characterization at field and landscape scales. Details of sensor interaction, efficient data management, and appropriate statistical analysis techniques for model calibrations are first needed. In situ or on-the-go VisNIR spectroscopy has been proposed as a rapid and inexpensive tool for intensively mapping soil texture and organic carbon (SOC). While lab-based VisNIR has been established as a viable technique for estimating various soil properties, few experiments have compared the predictive accuracy of on-the-go and lab-based VisNIR. Eight north central Montana wheat fields were intensively interrogated using on-the-go and lab-based VisNIR. Lab-based spectral data consistently provided more accurate predictions than on-the-go data. However, neither in situ

  18. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering, and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, a fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework is developed, and a cost-sensitive classifier is found to produce the best results. The system has been evaluated on a fingerprint database, and the experimental results show that the system produces a verification rate of 96%. This system plays an important role in forensic and civilian applications.
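
    The final, cost-sensitive classification stage can be pictured as below: features derived from the fingerprint matcher are classified with a higher penalty on false accepts than on false rejects. The feature layout, toy data, and class weights are placeholders, not values from the paper.

```python
# Sketch: cost-sensitive decision over matcher-derived features.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Toy rows: [match_score, cluster_distance, cluster_density]; label 1 = genuine.
X = np.array([[0.82, 0.10, 0.9], [0.35, 0.60, 0.3],
              [0.75, 0.20, 0.8], [0.20, 0.70, 0.2]])
y = np.array([1, 0, 1, 0])

# Weighting the impostor class makes false accepts costlier than false rejects.
clf = DecisionTreeClassifier(class_weight={0: 5.0, 1: 1.0}).fit(X, y)
print(clf.predict([[0.78, 0.15, 0.85]]))   # expected: array([1]) (genuine)
```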

  19. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified

  20. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, and compatibility. The methods vary both in how they operate and in how they achieve their results. The article describes static and dynamic methods of software verification and pays particular attention to symbolic execution. Within the review of static analysis, the deductive method and model-checking techniques are discussed and described, and the pros and cons of each method are emphasized. The article considers a classification of testing techniques for each method. We also present and analyze the characteristics and mechanisms of static dependency analysis, as well as the kinds of dependencies, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either along different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring, and profiling are presented and analyzed, along with some of the tools that can be applied to software when using dynamic analysis. Based on this work, a conclusion is drawn describing the most relevant problems of these analysis techniques, methods for their solution and

  1. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel-dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates on the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  2. Face Liveness Detection Using Defocus

    Directory of Open Access Journals (Sweden)

    Sooyeon Kim

    2015-01-01

    Full Text Available In order to develop security systems for identity authentication, face recognition (FR) technology has been applied. One of the main problems in applying FR technology is that the systems are especially vulnerable to attacks with spoofing faces (e.g., 2D pictures). To defend against these attacks and to enhance the reliability of FR systems, many anti-spoofing approaches have recently been developed. In this paper, we propose a method for face liveness detection using the effect of defocus. From two images sequentially taken at different focuses, three features, namely focus, power histogram, and gradient location and orientation histogram (GLOH), are extracted. Afterwards, we detect forged faces through a feature-level fusion approach. For reliable performance verification, we develop two databases with a handheld digital camera and a webcam. The proposed method achieves a 3.29% half total error rate (HTER) at a given depth of field (DoF) and can be extended to camera-equipped devices, like smartphones.
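
    A rough sketch of the defocus cue exploited above: compare a simple sharpness measure between two shots taken at different focus settings, since a flat spoof and a real, three-dimensional face should respond differently. The variance-of-Laplacian measure and the threshold are stand-ins for the paper's focus, power-histogram, and GLOH features.

```python
# Sketch: defocus-based liveness cue from two shots at different focus settings.
import cv2

def sharpness(gray):
    """Variance of the Laplacian as a simple focus/sharpness measure."""
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def liveness_score(img_near_focus, img_far_focus):
    """Relative change in sharpness between the two focus settings."""
    s1, s2 = sharpness(img_near_focus), sharpness(img_far_focus)
    return abs(s1 - s2) / max(s1, s2, 1e-9)

def is_live(img_near_focus, img_far_focus, threshold=0.15):
    """Illustrative threshold; a real system would tune it on genuine/spoof data."""
    return liveness_score(img_near_focus, img_far_focus) > threshold
```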

  3. Famous face recognition, face matching, and extraversion.

    Science.gov (United States)

    Lander, Karen; Poyarekar, Siddhi

    2015-01-01

    It has been previously established that extraverts who are skilled at interpersonal interaction perform significantly better than introverts on a face-specific recognition memory task. In our experiment we further investigate the relationship between extraversion and face recognition, focusing on famous face recognition and face matching. Results indicate that more extraverted individuals perform significantly better on an upright famous face recognition task and show significantly larger face inversion effects. However, our results did not find an effect of extraversion on face matching or inverted famous face recognition.

  4. The infrared spectrum of Jupiter

    Science.gov (United States)

    Ridgway, S. T.; Larson, H. P.; Fink, U.

    1976-01-01

    The principal characteristics of Jupiter's infrared spectrum are reviewed with emphasis on their significance for our understanding of the composition and temperature structure of the Jovian upper atmosphere. The spectral region from 1 to 40 microns divides naturally into three regimes: the reflecting region, thermal emission from below the cloud deck (5-micron hot spots), and thermal emission from above the clouds. Opaque parts of the Jovian atmosphere further subdivide these regions into windows, and each is discussed in the context of its past or potential contributions to our knowledge of the planet. Recent results are incorporated into a table of atmospheric composition and abundance which includes positively identified constituents as well as several which require verification. The limited available information about spatial variations of the infrared spectrum is presented

  5. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it.

  6. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries - this property is not directly related to the possibility of classical verification, nevertheless none of the earlier quantum money constructions is known to possess it

  7. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  8. A Practitioner's Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  9. Discriminative Projection Selection Based Face Image Hashing

    Science.gov (United States)

    Karabat, Cagatay; Erdogan, Hakan

    Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.
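
    A hedged sketch of the projection-selection idea: candidate random-projection rows are scored with a Fisher-like criterion computed from a user's genuine and impostor feature vectors, the most discriminative rows are kept, and the projections are binarised. A sign quantiser stands in for the paper's bimodal Gaussian mixture model; all names and sizes are illustrative.

```python
# Sketch: Fisher-criterion selection of random projection rows, then binarisation.
import numpy as np

def select_projections(genuine, impostor, n_candidates=512, n_keep=64, seed=0):
    """genuine, impostor: (n_samples, dim) arrays of face feature vectors."""
    rng = np.random.default_rng(seed)
    R = rng.standard_normal((n_candidates, genuine.shape[1]))
    pg, pi = genuine @ R.T, impostor @ R.T           # projections per candidate row
    fisher = (pg.mean(0) - pi.mean(0)) ** 2 / (pg.var(0) + pi.var(0) + 1e-9)
    keep = np.argsort(fisher)[-n_keep:]              # most discriminative rows
    return R[keep]

def face_hash(feature, R_selected):
    """Binarise the selected projections (sign quantiser instead of a GMM)."""
    return (feature @ R_selected.T > 0).astype(np.uint8)
```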

  10. High-level verification

    CERN Document Server

    Lerner, Sorin; Kundu, Sudipta

    2011-01-01

    Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based

  11. Virtual & Real Face to Face Teaching

    Science.gov (United States)

    Teneqexhi, Romeo; Kuneshka, Loreta

    2016-01-01

    In traditional "face to face" lessons, during the time the teacher writes on a black or white board, the students are always behind the teacher. Sometimes, this happens even in the recorded lesson in videos. Most of the time during the lesson, the teacher shows to the students his back not his face. We do not think the term "face to…

  12. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    Fingerprint verification means matching a user's fingerprint against the single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological

  13. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with development international standards shall be developed and maintained, - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities, defining a validated domain (domain in which the difference between the results of the tools and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available, - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - equations are solved correctly. The functional verification can be demonstrated through certification or report of Quality Assurance. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can

  14. Empirical Tests and Preliminary Results with the Krakatoa Tool for Full Static Program Verification

    Directory of Open Access Journals (Sweden)

    Ramírez-de León Edgar Darío

    2014-10-01

    Full Text Available XJML (Ramírez et al., 2012) is a modular external platform for Verification and Validation of Java classes using the Java Modeling Language (JML) through contracts written in XML. One problem faced in the XJML development was how to integrate Full Static Program Verification (FSPV). This paper presents the experiments and results that allowed us to define what tool to embed in XJML to execute FSPV.

  15. Shift Verification and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.

  16. Online fingerprint verification.

    Science.gov (United States)

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on the emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback with this representation is that it does not utilize a significant component of the rich discriminatory information available in the fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing a different number of unregistered minutiae points. In this study, a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.
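
    A simplified sketch of a filter-bank representation in the spirit of this study: the fingerprint is convolved with a bank of oriented Gabor filters and per-cell response statistics form the feature vector, which is then compared by Euclidean distance. The kernel parameters, grid, and threshold are assumptions, not the settings used in the study.

```python
# Sketch: Gabor filter-bank features for a fingerprint, compared by distance.
import cv2
import numpy as np

def filterbank_features(fp_gray, n_orientations=8, grid=(8, 8)):
    """Per-cell mean absolute response for each oriented Gabor filter."""
    feats = []
    for k in range(n_orientations):
        theta = k * np.pi / n_orientations
        kern = cv2.getGaborKernel(ksize=(15, 15), sigma=4.0, theta=theta,
                                  lambd=10.0, gamma=0.5)
        resp = cv2.filter2D(fp_gray.astype(np.float32), cv2.CV_32F, kern)
        h, w = resp.shape
        for i in range(grid[0]):
            for j in range(grid[1]):
                cell = resp[i * h // grid[0]:(i + 1) * h // grid[0],
                            j * w // grid[1]:(j + 1) * w // grid[1]]
                feats.append(np.abs(cell).mean())
    return np.asarray(feats)

def verify(probe_gray, template_feats, threshold):
    """Accept when the Euclidean distance between feature vectors is small."""
    return np.linalg.norm(filterbank_features(probe_gray) - template_feats) < threshold
```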

  17. First Images from VLT Science Verification Programme

    Science.gov (United States)

    1998-09-01

    .1 hrs); Edge-on Galaxies (7.4 hrs); Globular cluster cores (6.7 hrs); QSO Hosts (4.4 hrs); TNOs (3.4 hrs); Pulsars (1.3 hrs); Calibrations (22.7 hrs). All of the SV data are now in the process of being prepared for public release by September 30, 1998 to the ESO and Chilean astronomical communities. It will be possible to retrieve the data from the VLT archive, and a set of CDs will be distributed to all astronomical research institutes within the ESO member states and Chile. Moreover, data obtained on the HDF-S will become publicly available worldwide, and retrievable from the VLT archive. Updated information on this data release can be found on the ESO web site at http://www.eso.org/vltsv/. It is expected that the first scientific results based on the SV data will become available in the course of October and November 1998. First images from the Science Verification programme This Press Release is accompanied by three photos that reproduce some of the images obtained during the SV period. ESO PR Photo 35a/98: This colour composite was constructed from the U+B, R and I Test Camera Images of the Hubble Deep Field South (HDF-S) NICMOS field. These images are displayed as blue, green and red, respectively. The first photo is a colour composite of the HDF-S NICMOS sky field that combines exposures obtained in different wavebands: ultraviolet (U) + blue (B), red (R) and near-infrared (I). For all of them, the image quality is better than 0.9 arcsec. Most of the objects seen in the field are distant galaxies. The image is reproduced in such a way that it shows the faintest features scaled, while rendering the image of the star below the large spiral galaxy approximately white. The spiral galaxy is displayed in such a way that the internal structure is visible. A provisional analysis has shown that limiting magnitudes that were predicted for the HDF-S observations (27.0 - 28

  18. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF 6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  19. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF{sub 6} cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost efficient and an effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  20. European cinema: face to face with Hollywood

    NARCIS (Netherlands)

    Elsaesser, T.

    2005-01-01

    In the face of renewed competition from Hollywood since the early 1980s and the challenges posed to Europe's national cinemas by the fall of the Wall in 1989, independent filmmaking in Europe has begun to re-invent itself. European Cinema: Face to Face with Hollywood re-assesses the different

  1. A Face Inversion Effect without a Face

    Science.gov (United States)

    Brandman, Talia; Yovel, Galit

    2012-01-01

    Numerous studies have attributed the face inversion effect (FIE) to configural processing of internal facial features in upright but not inverted faces. Recent findings suggest that face mechanisms can be activated by faceless stimuli presented in the context of a body. Here we asked whether faceless stimuli with or without body context may induce…

  2. Human body region enhancement method based on Kinect infrared imaging

    Science.gov (United States)

    Yang, Lei; Fan, Yubo; Song, Xiaowei; Cai, Wenjing

    2016-10-01

    To effectively improve the low contrast of the human body region in infrared images, a combination of several enhancement methods is utilized to enhance the human body region. Firstly, for the infrared images acquired by Kinect, an Optimal Contrast-Tone Mapping (OCTM) method with multiple iterations is applied to balance the contrast of low-luminosity infrared images and improve their overall contrast. Secondly, to enhance the human body region further, a Level Set algorithm is employed to improve the contour edges of the human body region. Finally, to further improve the human body region in infrared images, Laplacian Pyramid decomposition is adopted to enhance the contour-improved human body region. Meanwhile, the background area outside the human body region is processed by bilateral filtering to improve the overall effect. Theoretical analysis and experimental verification show that the proposed method can effectively enhance the human body region of such infrared images.
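
    Two stages of the pipeline above, sketched under the assumption that a body mask is already available from the segmentation step: a Laplacian-pyramid detail boost for the body region and bilateral filtering for the background, recombined with the mask. OCTM and the level-set contour refinement are not reproduced; the gain, level count, and filter parameters are illustrative.

```python
# Sketch: detail boost for the body region, smoothing for the background.
import cv2
import numpy as np

def laplacian_pyramid_boost(gray, levels=3, gain=1.5):
    """Rebuild the image with amplified Laplacian (detail) layers."""
    g = [gray.astype(np.float32)]
    for _ in range(levels):
        g.append(cv2.pyrDown(g[-1]))
    out = g[-1]
    for lo in reversed(g[:-1]):
        up = cv2.pyrUp(out, dstsize=(lo.shape[1], lo.shape[0]))
        out = up + gain * (lo - up)        # amplify the per-level detail
    return np.clip(out, 0, 255).astype(np.uint8)

def enhance_frame(ir_gray, body_mask):
    """Enhance the body region, smooth the background, and recombine."""
    body = laplacian_pyramid_boost(ir_gray)
    background = cv2.bilateralFilter(ir_gray, d=9, sigmaColor=50, sigmaSpace=50)
    return np.where(body_mask > 0, body, background)
```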

  3. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  4. Infrared Heaters

    Science.gov (United States)

    1979-01-01

    The heating units shown in the accompanying photos are Panelbloc infrared heaters, energy savers which burn little fuel in relation to their effective heat output. Produced by Bettcher Manufacturing Corporation, Cleveland, Ohio, Panelblocs are applicable to industrial or other facilities which have ceilings more than 12 feet high, such as those pictured: at left the Bare Hills Tennis Club, Baltimore, Maryland and at right, CVA Lincoln- Mercury, Gaithersburg, Maryland. The heaters are mounted high above the floor and they radiate infrared energy downward. Panelblocs do not waste energy by warming the surrounding air. Instead, they beam invisible heat rays directly to objects which absorb the radiation- people, floors, machinery and other plant equipment. All these objects in turn re-radiate the energy to the air. A key element in the Panelbloc design is a coating applied to the aluminized steel outer surface of the heater. This coating must be corrosion resistant at high temperatures and it must have high "emissivity"-the ability of a surface to emit radiant energy. The Bettcher company formerly used a porcelain coating, but it caused a production problem. Bettcher did not have the capability to apply the material in its own plant, so the heaters had to be shipped out of state for porcelainizing, which entailed extra cost. Bettcher sought a coating which could meet the specifications yet be applied in its own facilities. The company asked The Knowledge Availability Systems Center, Pittsburgh, Pennsylvania, a NASA Industrial Applications Center (IAC), for a search of NASA's files

  5. Development of optical ground verification method for μm to sub-mm reflectors

    Science.gov (United States)

    Stockman, Y.; Thizy, C.; Lemaire, P.; Georges, M.; Mazy, E.; Mazzoli, A.; Houbrechts, Y.; Rochus, P.; Roose, S.; Doyle, D.; Ulbrich, G.

    2017-11-01

    Large reflectors and antennas for the IR to mm wavelength range are being planned for many Earth observation and astronomical space missions and for commercial communication satellites as well. Scientific observatories require large telescopes with precisely shaped reflectors for collecting the electro-magnetic radiation from faint sources. The challenging tasks of on-ground testing are to achieve the required accuracy in the measurement of the reflector shapes and antenna structures and to verify their performance under simulated space conditions (vacuum, low temperatures). Due to the specific surface characteristics of reflectors operating in these spectral regions, standard optical metrology methods employed in the visible spectrum do not provide useful measurement results. The current state-of-the-art commercial metrology systems are not able to measure these types of reflectors because they have to face the measurement of shape and waviness over relatively large areas with a large deformation dynamic range and encompassing a wide range of spatial frequencies. 3-D metrology (tactile coordinate measurement) machines are generally used during the manufacturing process. Unfortunately, these instruments cannot be used in the operational environmental conditions of the reflector. The application of standard visible wavelength interferometric methods is very limited or impossible due to the large relative surface roughnesses involved. A small number of infrared interferometers have been commercially developed over the last 10 years but their applications have also been limited due to poor dynamic range and the restricted spatial resolution of their detectors. These restrictions also affect the surface error slopes that can be captured and make their application to surfaces manufactured using CFRP honeycomb technologies rather difficult or impossible. It has therefore been considered essential, from the viewpoint of supporting future ESA exploration missions, to

  6. Experimental inventory verification system

    International Nuclear Information System (INIS)

    Steverson, C.A.; Angerman, M.I.

    1991-01-01

    As Low As Reasonably Achievable (ALARA) goals and Department of Energy (DOE) inventory requirements are frequently in conflict at facilities across the DOE complex. The authors wish, on one hand, to verify the presence of correct amounts of nuclear materials that are in storage or in process; yet on the other hand, we wish to achieve ALARA goals by keeping individual and collective exposures as low as social, technical, economic, practical, and public policy considerations permit. The Experimental Inventory Verification System (EIVSystem) is a computer-based, camera-driven system that utilizes image processing technology to detect change in vault areas. Currently in the test and evaluation phase at Idaho National Engineering Laboratory, this system guards personnel. The EIVSystem continually monitors the vault, providing proof of changed status for objects stored within the vault. This paper reports that these data could provide the basis for reducing inventory requirements when no change has occurred, thus helping implement ALARA policy; the data will also help describe the target area of an inventory when change has been shown to occur

  7. Woodward Effect Experimental Verifications

    Science.gov (United States)

    March, Paul

    2004-02-01

    The work of J. F. Woodward (1990; 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of ``mass fluctuations'' and their use in exotic propulsion schemes was examined for possible application in improving space flight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT, as well as gravitational/inertial Wheeler-Feynman-like radiation reaction forces, holds, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during its acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations, or the ``Woodward effect'' (W-E). Later, in collaboration with his former graduate student T. Mahood, he also pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT-based phenomenon per Mach's Principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, will also be presented.

  8. Verification of hypergraph states

    Science.gov (United States)

    Morimae, Tomoyuki; Takeuchi, Yuki; Hayashi, Masahito

    2017-12-01

    Hypergraph states are generalizations of graph states where controlled-Z gates on edges are replaced with generalized controlled-Z gates on hyperedges. Hypergraph states have several advantages over graph states. For example, certain hypergraph states, such as the Union Jack states, are universal resource states for measurement-based quantum computing with only Pauli measurements, while graph state measurement-based quantum computing needs non-Clifford basis measurements. Furthermore, it is impossible to classically efficiently sample measurement results on hypergraph states unless the polynomial hierarchy collapses to the third level. Although several protocols have been proposed to verify graph states with only sequential single-qubit Pauli measurements, there was no verification method for hypergraph states. In this paper, we propose a method for verifying a certain class of hypergraph states with only sequential single-qubit Pauli measurements. Importantly, no i.i.d. property of samples is assumed in our protocol: any artificial entanglement among samples cannot fool the verifier. As applications of our protocol, we consider verified blind quantum computing with hypergraph states, and quantum computational supremacy demonstrations with hypergraph states.

  9. Calibration and verification of thermographic cameras for geometric measurements

    Science.gov (United States)

    Lagüela, S.; González-Jorge, H.; Armesto, J.; Arias, P.

    2011-03-01

    Infrared thermography is a technique with an increasing degree of development and applications. Quality assessment of the measurements performed with thermal cameras should be achieved through metrological calibration and verification. Infrared cameras acquire temperature and geometric information, although calibration and verification procedures are usual only for the thermal data; black bodies are used for these purposes. Moreover, the geometric information is important for many fields such as architecture, civil engineering and industry. This work presents a calibration procedure that allows photogrammetric restitution and a portable artefact to verify the geometric accuracy, repeatability and drift of thermographic cameras. These results allow the incorporation of this information into the quality control processes of the companies. A grid based on burning lamps is used for the geometric calibration of thermographic cameras. The artefact designed for the geometric verification consists of five Delrin spheres and seven cubes of different sizes. Metrological traceability for the artefact is obtained from a coordinate measuring machine. Two sets of targets with different reflectivity are fixed to the spheres and cubes to make data processing and photogrammetric restitution possible. Reflectivity was the chosen material property because both the thermographic and visible cameras are able to detect it. Two thermographic cameras from the Flir and Nec manufacturers, and one visible camera from Jai, are calibrated, verified and compared using calibration grids and the standard artefact. The calibration system based on burning lamps shows its capability to perform the internal orientation of the thermal cameras. Verification results show repeatability better than 1 mm for all cases, and better than 0.5 mm for the visible camera. As expected, accuracy also appears higher for the visible camera, and the geometric comparison between thermographic cameras shows slightly better

  10. Attention Capture by Faces

    Science.gov (United States)

    Langton, Stephen R. H.; Law, Anna S.; Burton, A. Mike; Schweinberger, Stefan R.

    2008-01-01

    We report three experiments that investigate whether faces are capable of capturing attention when in competition with other non-face objects. In Experiment 1a participants took longer to decide that an array of objects contained a butterfly target when a face appeared as one of the distracting items than when the face did not appear in the array.…

  11. Individuation instructions decrease the Cross-Race Effect in a face matching task

    Directory of Open Access Journals (Sweden)

    2015-09-01

    Conclusions: Individuation instructions are an effective moderator of the CRE even within a face matching paradigm. Since unfamiliar face matching tasks most closely simulate document verification tasks, specifically passport screening, instructional techniques such as these may improve task performance within applied settings of significant practical importance.

  12. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    Effective verification is crucial to the fulfillment of the objectives of any disarmament treaty, not least as regards the proliferation of weapons of mass destruction (WMD). The effectiveness of the verification package depends on a number of factors, some inherent in the agreed structure and others related to the type of responses demanded by emerging challenges. The verification systems of three global agencies-the IAEA, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO, currently the Preparatory Commission), and the Organization for the Prohibition of Chemical Weapons (OPCW)-share similarities in their broad objectives of confidence-building and deterrence by assuring members that rigorous verification would deter or otherwise detect non-compliance. Yet they are up against various constraints and other issues, both internal and external to the treaty regime. These constraints pose major challenges to the effectiveness and reliability of the verification operations. In the nuclear field, the IAEA safeguards process was the first to evolve incrementally from modest Statute beginnings to a robust verification system under the global Treaty on the Non-Proliferation of Nuclear Weapons (NPT). The nuclear non-proliferation regime is now being supplemented by a technology-intensive verification system of the nuclear test-ban treaty (CTBT), a product of over three decades of negotiation. However, there still remain fundamental gaps and loopholes in the regime as a whole, which tend to diminish the combined effectiveness of the IAEA and the CTBT verification capabilities. The three major problems are (a) the lack of universality of membership, essentially because of the absence of three nuclear weapon-capable States-India, Pakistan and Israel-from both the NPT and the CTBT, (b) the changes in US disarmament policy, especially in the nuclear field, and (c) the failure of the Conference on Disarmament to conclude a fissile material cut-off treaty. The world is

  13. Iterative closest normal point for 3D face recognition.

    Science.gov (United States)

    Mohammadzade, Hoda; Hatzinakos, Dimitrios

    2013-02-01

    The common approach for 3D face recognition is to register a probe face to each of the gallery faces and then calculate the sum of the distances between their points. This approach is computationally expensive and sensitive to facial expression variation. In this paper, we introduce the iterative closest normal point method for finding the corresponding points between a generic reference face and every input face. The proposed correspondence finding method samples a set of points for each face, denoted as the closest normal points. These points are effectively aligned across all faces, enabling effective application of discriminant analysis methods for 3D face recognition. As a result, the expression variation problem is addressed by minimizing the within-class variability of the face samples while maximizing the between-class variability. As an important conclusion, we show that the surface normal vectors of the face at the sampled points contain more discriminatory information than the coordinates of the points. We have performed comprehensive experiments on the Face Recognition Grand Challenge database, which is presently the largest available 3D face database. We have achieved verification rates of 99.6 and 99.2 percent at a false acceptance rate of 0.1 percent for the all versus all and ROC III experiments, respectively; to the best of our knowledge, these error rates are seven and four times lower, respectively, than those of the best existing methods on this database.

  14. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  15. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  16. Familiar face + novel face = familiar face? Representational bias in the perception of morphed faces in chimpanzees

    Directory of Open Access Journals (Sweden)

    Yoshi-Taka Matsuda

    2016-08-01

    Full Text Available Highly social animals possess a well-developed ability to distinguish the faces of familiar from novel conspecifics to induce distinct behaviors for maintaining society. However, the behaviors of animals when they encounter ambiguous faces of familiar yet novel conspecifics, e.g., strangers with faces resembling known individuals, have not been well characterised. Using a morphing technique and a preferential-looking paradigm, we address this question via the chimpanzee's facial-recognition abilities. We presented eight subjects with three types of stimuli: (1) familiar faces, (2) novel faces, and (3) intermediate morphed faces that were 50% familiar and 50% novel faces of conspecifics. We found that chimpanzees spent more time looking at novel faces and scanned novel faces more extensively than familiar or intermediate faces. Interestingly, chimpanzees looked at intermediate faces in a manner similar to familiar faces with regard to fixation duration, fixation count, and saccade length for facial scanning, even though the participant was encountering the intermediate faces for the first time. We excluded the possibility that subjects merely detected and avoided traces of morphing in the intermediate faces. These findings suggest a feeling-of-familiarity bias: chimpanzees perceive familiarity with an intermediate face by detecting traces of a known individual, as a 50% alteration is sufficient to preserve perceived familiarity.

  17. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. It will be a living document that tracks CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done, and additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.

  18. Technical challenges for dismantlement verification

    International Nuclear Information System (INIS)

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-01-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms controls. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, in the process identifying limitations and vulnerabilities. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion

  19. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously...

  20. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic...... losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due...

  1. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  2. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of IMRT (intensity modulated radiation therapy) plans which is used at the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of IMRT technology and of the deployment of the IMRT planning system CORVUS 6.0 and the MIMiC device (multilamellar intensity-modulated collimator), as well as the overall process of verifying the created plan. The aim of verification is in particular good control of the functions of the MIMiC and evaluation of the overall reliability of IMRT planning. (author)

  3. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing

  4. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    Full Text Available In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward chaining-based monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This allows for a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
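
    As a rough illustration of the forward-chaining idea (a hedged sketch of the general technique, not the authors' FLTL monitor; the rules, atoms and trace below are invented), a monitor can keep a growing set of observed facts and fire Horn rules to a fixpoint after each event, flagging a violation as soon as the corresponding atom becomes derivable:

```python
# Illustrative forward-chaining monitor over propositional Horn rules.
from typing import Iterable, List, Set, Tuple

Rule = Tuple[frozenset, str]  # (body atoms, head atom)

def forward_chain(facts: Set[str], rules: Iterable[Rule]) -> Set[str]:
    """Return the least fixpoint of `facts` under the Horn rules."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in derived and body <= derived:
                derived.add(head)
                changed = True
    return derived

# Hypothetical safety property: the motor must never run while the door is open.
rules: List[Rule] = [(frozenset({"motor_on", "door_open"}), "violation")]

trace = [{"motor_on"}, {"door_open"}]          # finite but expanding trace
facts: Set[str] = set()
for step, event in enumerate(trace, start=1):
    facts |= event
    if "violation" in forward_chain(facts, rules):
        print(f"property violated at event {step}")
        break
```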

  5. A study of compositional verification based IMA integration method

    Science.gov (United States)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

    The rapid development of avionics systems is driving the application of integrated modular avionics (IMA) systems. While IMA improves avionics system integration, it also increases the complexity of system test, so the IMA system test method needs to be simplified. An IMA system provides a modular platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, it is more difficult to isolate failures in an IMA system. IMA system verification therefore faces the critical problem of how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily cover the whole system, but it is hard to completely test a large, integrated avionics system. This paper therefore proposes applying compositional-verification theory to IMA system testing, reducing the number of test steps and improving efficiency, and consequently lowering the cost of IMA system integration.

  6. Editing faces in videos

    OpenAIRE

    Amberg, Brian

    2011-01-01

    Editing faces in movies is of interest in the special effects industry. We aim at producing effects such as the addition of accessories interacting correctly with the face or replacing the face of a stuntman with the face of the main actor. The system introduced in this thesis is based on a 3D generative face model. Using a 3D model makes it possible to edit the face in the semantic space of pose, expression, and identity instead of pixel space, and due to its 3D nature allows...

  7. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  8. Series 'Facing Radiation'. 2 Facing radiation is facing residents

    International Nuclear Information System (INIS)

    Hanzawa, Takahiro

    2013-01-01

    The series reports how ordinary people, who are not radiological experts at all, have faced and understood the problems and tasks concerning radiation posed by the Fukushima Daiichi Nuclear Power Plant Accident (March 2011). Section 2 is reported by an officer of Date City, which lies about 60 km northwest of the Plant, borders on Iitate Village in Fukushima prefecture, and is designated as an important area of contamination search (IACS), for which the reporter has been the responsible official. In July 2011, the ambient dose rate was as high as 3.0-3.5 μSv/h, and the tentative storage place for contaminated materials was decided on the residents' own initiative in a small community, from which the real decontamination work in the City started. The target dose rate after decontamination was defined as 1.0 μSv/h; however, 28 of the 32 IACS municipalities in the prefecture had not defined a target even after working for two years since the Accident on areas exceeding the standard of 0.23 μSv/h. When his own house was decontaminated, the reporter noticed that residents' concern was directed toward the work itself, not toward the target dose rate, and wondered whether these figures had been an obstacle to facing radiation correctly. At present, about 2.5 years after the Accident, all Date citizens carry personal glass dosimeters to record their accumulated effective external dose, and it appears that their dose will not exceed 1 mSv/y if the estimated ambient dose rate is 0.3-5 μSv/h. The media chase popularity rather than facing radiation, experts tend to hesitate to face the media and residents, and the radiation dose will hardly be reduced to zero, even though a correct understanding of radiation is the shorter way to residents' own peace of mind: facing radiation is facing residents. (T.T.)

  9. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.

  10. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.
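
    As a hedged sketch of the likelihood-ratio idea for fixed-length feature vectors (the Gaussian models, feature dimension and threshold below are illustrative assumptions, not the authors' exact formulation), verification reduces to comparing log p(x | claimed user) against log p(x | background population):

```python
import numpy as np
from scipy.stats import multivariate_normal

def llr_score(x, user_mean, within_cov, bg_mean, bg_cov):
    """Log-likelihood ratio for a probe feature vector x and a claimed identity:
    log p(x | user) - log p(x | background). Accept when it exceeds a threshold
    tuned on development data."""
    return (multivariate_normal.logpdf(x, mean=user_mean, cov=within_cov)
            - multivariate_normal.logpdf(x, mean=bg_mean, cov=bg_cov))

# Toy usage with invented 2-D features: the user model is the mean of a few
# enrollment vectors with an assumed within-class covariance; the background
# model is fitted on the whole training population.
enroll = np.array([[1.0, 0.9], [1.2, 1.1], [0.9, 1.0]])
background = np.random.default_rng(0).normal(size=(500, 2))
score = llr_score(np.array([1.05, 1.0]),
                  user_mean=enroll.mean(axis=0), within_cov=0.05 * np.eye(2),
                  bg_mean=background.mean(axis=0), bg_cov=np.cov(background.T))
print("accept" if score > 0.0 else "reject")
```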

  11. Face Detection and Recognition

    National Research Council Canada - National Science Library

    Jain, Anil K

    2004-01-01

    .... Specifically, the report addresses the problem of detecting faces in color images in the presence of various lighting conditions and complex backgrounds as well as recognizing faces under variations...

  12. Measuring External Face Appearance for Face Classification

    OpenAIRE

    Masip, David; Lapedriza, Agata; Vitria, Jordi

    2007-01-01

    In this chapter we introduce the importance of the external features in face classification problems, and propose a methodology to extract the external features obtaining an aligned feature set. The extracted features can be used as input to any standard pattern recognition classifier, just as with the classic feature extraction approaches in the literature that deal with internal face regions. The resulting scheme follows a top-down segmentation approach to deal with the diversity inherent to the extern...

  13. Examining the examiners: an online eyebrow verification experiment inspired by FISWG

    NARCIS (Netherlands)

    Zeinstra, Christopher Gerard; Veldhuis, Raymond N.J.; Spreeuwers, Lieuwe Jan

    2015-01-01

    In forensic face comparison, one of the features taken into account is the eyebrows. In this paper, we investigate human performance on an eyebrow verification task. This task is executed twice by participants: a "best-effort" approach and an approach using features based on forensic knowledge. The

  14. Automated Verification of Virtualized Infrastructures

    DEFF Research Database (Denmark)

    Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present a pla...

  15. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga, reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage verification may lead into consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  16. Hot cell verification facility update

    International Nuclear Information System (INIS)

    Titzler, P.A.; Moffett, S.D.; Lerch, R.E.

    1985-01-01

    The Hot Cell Verification Facility (HCVF) provides a prototypic hot cell mockup to check equipment for functional and remote operation, and provides actual hands-on training for operators. The facility arrangement is flexible and assists in solving potential problems in a nonradioactive environment. HCVF has been in operation for six years, and the facility is a part of the Hanford Engineering Development Laboratory

  17. Static Verification for Code Contracts

    Science.gov (United States)

    Fähndrich, Manuel

    The Code Contracts project [3] at Microsoft Research enables programmers on the .NET platform to author specifications in existing languages such as C# and VisualBasic. To take advantage of these specifications, we provide tools for documentation generation, runtime contract checking, and static contract verification.

  18. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing, checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1, 2) for the Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy the new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  19. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements concluded directly between the IAEA and non-nuclear weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system was based on nuclear material accountancy. It was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Their experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are however not part of an international police force that will intervene to prevent a violation taking place. To be credible they should be technically qualified, with substantial experience in industry or in research and development before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and that it provides them with new skills. Over the years, the inspectors, and through them the safeguards verification system, gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  20. Eggspectation : organic egg verification tool

    NARCIS (Netherlands)

    Ruth, van S.M.; Hoogenboom, L.A.P.

    2011-01-01

    In 2009 RIKILT conducted a study on about 2,000 eggs to evaluate three different analytical verification methods: carotenoid profiling, fatty acid profiling and isotope ratio mass spectrometry. The eggs were collected from about 50 Dutch farms. The selection was based on the farms’ location and

  1. Face identification with frequency domain matched filtering in mobile environments

    Science.gov (United States)

    Lee, Dong-Su; Woo, Yong-Hyun; Yeom, Seokwon; Kim, Shin-Hwan

    2012-06-01

    Face identification at a distance is very challenging since captured images are often degraded by blur and noise. Furthermore, the computational resources and memory are often limited in the mobile environments. Thus, it is very challenging to develop a real-time face identification system on the mobile device. This paper discusses face identification based on frequency domain matched filtering in the mobile environments. Face identification is performed by the linear or phase-only matched filter and sequential verification stages. The candidate window regions are decided by the major peaks of the linear or phase-only matched filtering outputs. The sequential stages comprise a skin-color test and an edge mask filtering test, which verify color and shape information of the candidate regions in order to remove false alarms. All algorithms are built on the mobile device using Android platform. The preliminary results show that face identification of East Asian people can be performed successfully in the mobile environments.
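
    A minimal sketch of the phase-only variant of such frequency-domain matched filtering is shown below (array names and the peak-picking step are placeholders; the paper's full pipeline additionally applies the skin-color and edge-mask verification stages described above):

```python
import numpy as np

def phase_only_match(scene: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Correlate a grayscale scene with a face template using only the
    template's spectral phase; sharp peaks mark candidate face locations."""
    S = np.fft.fft2(scene)
    T = np.fft.fft2(template, s=scene.shape)   # zero-pad template to scene size
    eps = 1e-12
    pof = np.conj(T) / (np.abs(T) + eps)       # phase-only matched filter
    return np.abs(np.fft.ifft2(S * pof))

# corr = phase_only_match(scene, template)
# y, x = np.unravel_index(np.argmax(corr), corr.shape)   # strongest candidate
```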

  2. Face time: educating face transplant candidates.

    Science.gov (United States)

    Lamparello, Brooke M; Bueno, Ericka M; Diaz-Siso, Jesus Rodrigo; Sisk, Geoffroy C; Pomahac, Bohdan

    2013-01-01

    Face transplantation is the innovative application of microsurgery and immunology to restore appearance and function to those with severe facial disfigurements. Our group aims to establish a multidisciplinary education program that can facilitate informed consent and build a strong knowledge base in patients to enhance adherence to medication regimes, recovery, and quality of life. We analyzed handbooks from our institution's solid organ transplant programs to identify topics applicable to face transplant patients. The team identified unique features of face transplantation that warrant comprehensive patient education. We created a 181-page handbook to provide subjects interested in pursuing transplantation with a written source of information on the process and team members and to address concerns they may have. While the handbook covers a wide range of topics, it is easy to understand and visually appealing. Face transplantation has many unique aspects that must be relayed to the patients pursuing this novel therapy. Since candidates lack third-party support groups and programs, the transplant team must provide an extensive educational component to enhance this complex process. As face transplantation continues to develop, programs must create sound education programs that address patients' needs and concerns to facilitate optimal care.

  3. Face averages enhance user recognition for smartphone security.

    Science.gov (United States)

    Robertson, David J; Kramer, Robin S S; Burton, A Mike

    2015-01-01

    Our recognition of familiar faces is excellent, and generalises across viewing conditions. However, unfamiliar face recognition is much poorer. For this reason, automatic face recognition systems might benefit from incorporating the advantages of familiarity. Here we put this to the test using the face verification system available on a popular smartphone (the Samsung Galaxy). In two experiments we tested the recognition performance of the smartphone when it was encoded with an individual's 'face-average'--a representation derived from theories of human face perception. This technique significantly improved performance for both unconstrained celebrity images (Experiment 1) and for real faces (Experiment 2): users could unlock their phones more reliably when the device stored an average of the user's face than when they stored a single image. This advantage was consistent across a wide variety of everyday viewing conditions. Furthermore, the benefit did not reduce the rejection of imposter faces. This benefit is brought about solely by consideration of suitable representations for automatic face recognition, and we argue that this is just as important as development of matching algorithms themselves. We propose that this representation could significantly improve recognition rates in everyday settings.
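
    The enrolment representation the study describes is simply a pixel-wise average of several aligned photographs of the user; a hedged sketch (assuming same-size, pre-aligned grayscale images and a placeholder matcher) might look like this:

```python
import numpy as np

def face_average(aligned_faces):
    """Pixel-wise mean of aligned, equally sized grayscale face images.
    Storing this average instead of a single enrollment photo is the
    'face-average' representation evaluated in the study above."""
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in aligned_faces])
    return stack.mean(axis=0)

def match_score(probe, average):
    """Simple normalized-correlation score between a probe image and the stored
    average (a stand-in for whatever matcher the device actually uses)."""
    p = (probe - probe.mean()) / (probe.std() + 1e-12)
    a = (average - average.mean()) / (average.std() + 1e-12)
    return float((p * a).mean())
```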

  4. Extraction and fusion of spectral parameters for face recognition

    Science.gov (United States)

    Boisier, B.; Billiot, B.; Abdessalem, Z.; Gouton, P.; Hardeberg, J. Y.

    2011-03-01

    Many methods have been developed in image processing for face recognition, especially in recent years with the increase of biometric technologies. However, most of these techniques are used on grayscale images acquired in the visible range of the electromagnetic spectrum. The aims of our study are to improve existing tools and to develop new methods for face recognition. The techniques used take advantage of the different spectral ranges, the visible, optical infrared and thermal infrared, by either combining them or analyzing them separately in order to extract the most appropriate information for face recognition. We also verify the consistency of several keypoints extraction techniques in the Near Infrared (NIR) and in the Visible Spectrum.

  5. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  6. Ontology Matching with Semantic Verification.

    Science.gov (United States)

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.

  7. The Secrets of Faces

    OpenAIRE

    Enquist, Magnus; Ghirlanda, Stefano

    1998-01-01

    This is a comment on an article by Perrett et al. in the same issue of Nature, investigating face perception. With computer graphics, Perrett and colleagues have produced exaggerated male and female faces, and asked people to rate them with respect to femininity or masculinity, and personality traits such as intelligence, emotionality and so on. The key question is: what information do faces (and sexual signals in general) convey? One view, supported by Perrett and colleagues, is that all a...

  8. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards including international coordinations. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  9. Learning discriminant face descriptor.

    Science.gov (United States)

    Lei, Zhen; Pietikäinen, Matti; Li, Stan Z

    2014-02-01

    Local feature descriptor is an important module for face recognition and those like Gabor and local binary patterns (LBP) have proven effective face descriptors. Traditionally, the form of such local descriptors is predefined in a handcrafted way. In this paper, we propose a method to learn a discriminant face descriptor (DFD) in a data-driven way. The idea is to learn the most discriminant local features that minimize the difference of the features between images of the same person and maximize that between images from different people. In particular, we propose to enhance the discriminative ability of face representation in three aspects. First, the discriminant image filters are learned. Second, the optimal neighborhood sampling strategy is soft determined. Third, the dominant patterns are statistically constructed. Discriminative learning is incorporated to extract effective and robust features. We further apply the proposed method to the heterogeneous (cross-modality) face recognition problem and learn DFD in a coupled way (coupled DFD or C-DFD) to reduce the gap between features of heterogeneous face images to improve the performance of this challenging problem. Extensive experiments on FERET, CAS-PEAL-R1, LFW, and HFB face databases validate the effectiveness of the proposed DFD learning on both homogeneous and heterogeneous face recognition problems. The DFD improves POEM and LQP by about 4.5 percent on LFW database and the C-DFD enhances the heterogeneous face recognition performance of LBP by over 25 percent.
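
    For reference, the handcrafted LBP baseline that DFD improves upon can be sketched in a few lines (a basic 3x3, 256-bin variant; the learned filters, soft sampling strategy and dominant-pattern construction of DFD itself are not reproduced here):

```python
import numpy as np

def lbp_histogram(img: np.ndarray) -> np.ndarray:
    """Basic 3x3 local binary pattern histogram of a grayscale image
    (the handcrafted baseline mentioned above, not the learned DFD)."""
    img = img.astype(np.int32)
    center = img[1:-1, 1:-1]
    codes = np.zeros_like(center)
    # Eight neighbours, clockwise from the top-left pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        codes |= ((neighbour >= center).astype(np.int32) << bit)
    hist, _ = np.histogram(codes, bins=256, range=(0, 256))
    return hist / hist.sum()
```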

  10. Oracle ADF Faces cookbook

    CERN Document Server

    Gawish, Amr

    2014-01-01

    This is a cookbook that covers more than 80 different recipes to teach you about different aspects of Oracle ADF Faces. It follows a practical approach and covers how to build your components for reuse in different applications. This book will also help you in tuning the performance of your ADF Faces application. If you are an ADF developer who wants to harness the power of Oracle ADF Faces to create exceptional user interfaces and reactive applications, this book will provide you with the recipes needed to do just that. You will not need to be familiar with Oracle ADF Faces, but you should be

  11. Face inversion increases attractiveness.

    Science.gov (United States)

    Leder, Helmut; Goller, Juergen; Forster, Michael; Schlageter, Lena; Paul, Matthew A

    2017-07-01

    Assessing facial attractiveness is a ubiquitous, inherent, and hard-wired phenomenon in everyday interactions. As such, it has highly adapted to the default way that faces are typically processed: viewing faces in upright orientation. By inverting faces, we can disrupt this default mode and study how facial attractiveness is assessed. Faces rotated by 90° (tilting to either side) and by 180° were rated on attractiveness and distinctiveness scales. For both orientations, we found that faces were rated more attractive and less distinctive than upright faces. Importantly, these effects were more pronounced for faces rated low in upright orientation, and smaller for highly attractive faces. In other words, the less attractive a face was, the more it gained in attractiveness by inversion or rotation. Based on these findings, we argue that facial attractiveness assessments might not rely on the presence of attractive facial characteristics, but on the absence of distinctive, unattractive characteristics. These unattractive characteristics are potentially weighed against an individual, attractive prototype in assessing facial attractiveness. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  12. Entropy Measurement for Biometric Verification Systems.

    Science.gov (United States)

    Lim, Meng-Hui; Yuen, Pong C

    2016-05-01

    Biometric verification systems are designed to accept multiple similar biometric measurements per user due to inherent intrauser variations in the biometric data. This is important to preserve reasonable acceptance rate of genuine queries and the overall feasibility of the recognition system. However, such acceptance of multiple similar measurements decreases the imposter's difficulty of obtaining a system-acceptable measurement, thus resulting in a degraded security level. This deteriorated security needs to be measurable to provide truthful security assurance to the users. Entropy is a standard measure of security. However, the entropy formula is applicable only when there is a single acceptable possibility. In this paper, we develop an entropy-measuring model for biometric systems that accepts multiple similar measurements per user. Based on the idea of guessing entropy, the proposed model quantifies biometric system security in terms of adversarial guessing effort for two practical attacks. Excellent agreement between analytic and experimental simulation-based measurement results on a synthetic and a benchmark face dataset justify the correctness of our model and thus the feasibility of the proposed entropy-measuring approach.
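
    The underlying notion of guessing entropy, the expected number of guesses for an adversary who tries candidate measurements in decreasing order of acceptance probability, is easy to illustrate (the probabilities below are invented; the paper's model additionally ties them to the verifier's acceptance region and to specific attacks):

```python
import numpy as np

def guessing_entropy(acceptance_probs):
    """Expected number of guesses for an adversary who tries candidates in
    decreasing order of their probability of being accepted (textbook
    guessing entropy, not the paper's full model)."""
    p = np.sort(np.asarray(acceptance_probs, dtype=float))[::-1]
    p = p / p.sum()                      # normalise to a distribution
    ranks = np.arange(1, len(p) + 1)
    return float(np.sum(ranks * p))

# Toy example: a peaked distribution is easier to guess (lower guessing
# entropy, weaker security) than a uniform one.
print(guessing_entropy([0.7, 0.1, 0.1, 0.05, 0.05]))  # ~1.65 expected guesses
print(guessing_entropy([0.25, 0.25, 0.25, 0.25]))     # 2.5 expected guesses
```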

  13. Multi-task pose-invariant face recognition.

    Science.gov (United States)

    Ding, Changxing; Xu, Chang; Tao, Dacheng

    2015-03-01

    Face images captured in unconstrained environments usually contain significant pose variation, which dramatically degrades the performance of algorithms designed to recognize frontal faces. This paper proposes a novel face identification framework capable of handling the full range of pose variations within ±90° of yaw. The proposed framework first transforms the original pose-invariant face recognition problem into a partial frontal face recognition problem. A robust patch-based face representation scheme is then developed to represent the synthesized partial frontal faces. For each patch, a transformation dictionary is learnt under the proposed multi-task learning scheme. The transformation dictionary transforms the features of different poses into a discriminative subspace. Finally, face matching is performed at patch level rather than at the holistic level. Extensive and systematic experimentation on FERET, CMU-PIE, and Multi-PIE databases shows that the proposed method consistently outperforms single-task-based baselines as well as state-of-the-art methods for the pose problem. We further extend the proposed algorithm for the unconstrained face verification problem and achieve top-level performance on the challenging LFW data set.

  14. Improving Shadow Suppression for Illumination Robust Face Recognition

    KAUST Repository

    Zhang, Wuming

    2017-10-13

    2D face analysis techniques, such as face landmarking, face recognition and face verification, are strongly dependent on illumination conditions, which are usually uncontrolled and unpredictable in the real world. An illumination-robust preprocessing method thus remains a significant challenge in reliable face analysis. In this paper we propose a novel approach for improving lighting normalization by building the underlying reflectance model which characterizes interactions between skin surface, lighting source and camera sensor, and elaborates the formation of face color appearance. Specifically, the proposed illumination processing pipeline enables the generation of a Chromaticity Intrinsic Image (CII) in a log chromaticity space which is robust to illumination variations. Moreover, as an advantage over most prevailing methods, a photo-realistic color face image is subsequently reconstructed which eliminates a wide variety of shadows whilst retaining the color information and identity details. Experimental results under different scenarios and using various face databases show the effectiveness of the proposed approach in dealing with lighting variations, including both soft and hard shadows, in face recognition.
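
    A hedged sketch of the kind of log-chromaticity mapping such a pipeline builds on is given below (band-ratio chromaticity with respect to the green channel; the paper's CII construction and photo-realistic reconstruction are considerably more involved):

```python
import numpy as np

def log_chromaticity(rgb: np.ndarray) -> np.ndarray:
    """Map an H x W x 3 RGB image to 2-channel log-chromaticity coordinates
    log(R/G), log(B/G), which are far less sensitive to the intensity of the
    light source than raw RGB values."""
    eps = 1e-6
    rgb = rgb.astype(np.float64) + eps
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return np.stack([np.log(r / g), np.log(b / g)], axis=-1)
```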

  15. Feldspar, Infrared Stimulated Luminescence

    DEFF Research Database (Denmark)

    Jain, Mayank

    2014-01-01

    This entry primarily concerns the characteristics and the origins of infrared-stimulated luminescence in feldspars.

  16. A Feature Subtraction Method for Image Based Kinship Verification under Uncontrolled Environments

    DEFF Research Database (Denmark)

    Duan, Xiaodong; Tan, Zheng-Hua

    2015-01-01

    The most fundamental problem of local feature based kinship verification methods is that a local feature can capture the variations of environmental conditions and the differences between two persons having a kin relation, which can significantly decrease the performance. To address this problem...... the feature distance between face image pairs with kinship and maximize the distance between non-kinship pairs. Based on the subtracted feature, the verification is realized through a simple Gaussian based distance comparison method. Experiments on two public databases show that the feature subtraction method...
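
    One plausible reading of the verification step, subtracting the feature vectors of an image pair and scoring the residual against a Gaussian model fitted on known kin pairs, can be sketched as follows (the feature extractor, model form and threshold are assumptions for illustration, not the authors' exact method):

```python
import numpy as np

def fit_kin_model(diff_vectors):
    """Fit a diagonal Gaussian to feature differences of known kin pairs."""
    d = np.asarray(diff_vectors, dtype=float)
    return d.mean(axis=0), d.var(axis=0) + 1e-6

def kinship_score(f1, f2, mean, var):
    """Negative squared Mahalanobis distance of the subtracted feature from
    the kin-pair Gaussian; higher scores mean 'more likely kin'."""
    d = np.asarray(f1, dtype=float) - np.asarray(f2, dtype=float)
    return -float(np.sum((d - mean) ** 2 / var))

# verify: kinship_score(feat_a, feat_b, *model) > threshold   (threshold tuned
# on held-out kin and non-kin pairs)
```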

  17. Autistic traits and brain activation during face-to-face conversations in typically developed adults.

    Science.gov (United States)

    Suda, Masashi; Takei, Yuichi; Aoyama, Yoshiyuki; Narita, Kosuke; Sakurai, Noriko; Fukuda, Masato; Mikuni, Masahiko

    2011-01-01

    Autism spectrum disorders (ASD) are characterized by impaired social interaction and communication, restricted interests, and repetitive behaviours. The severity of these characteristics is posited to lie on a continuum that extends into the general population. Brain substrates underlying ASD have been investigated through functional neuroimaging studies using functional magnetic resonance imaging (fMRI). However, fMRI has methodological constraints for studying brain mechanisms during social interactions (for example, noise, lying on a gantry during the procedure, etc.). In this study, we investigated whether variations in autism spectrum traits are associated with changes in patterns of brain activation in typically developed adults. We used near-infrared spectroscopy (NIRS), a recently developed functional neuroimaging technique that uses near-infrared light, to monitor brain activation in a natural setting that is suitable for studying brain functions during social interactions. We monitored regional cerebral blood volume changes using a 52-channel NIRS apparatus over the prefrontal cortex (PFC) and superior temporal sulcus (STS), 2 areas implicated in social cognition and the pathology of ASD, in 28 typically developed participants (14 male and 14 female) during face-to-face conversations. This task was designed to resemble a realistic social situation. We examined the correlations of these changes with autistic traits assessed using the Autism-Spectrum Quotient (AQ). Both the PFC and STS were significantly activated during face-to-face conversations. AQ scores were negatively correlated with regional cerebral blood volume increases in the left STS during face-to-face conversations, especially in males. Our results demonstrate successful monitoring of brain function during realistic social interactions by NIRS as well as lesser brain activation in the left STS during face-to-face conversations in typically developed participants with higher levels of autistic

  18. Reliability-Based Decision Fusion in Multimodal Biometric Verification Systems

    Directory of Open Access Journals (Sweden)

    Kryszczuk Krzysztof

    2007-01-01

    Full Text Available We present a methodology of reliability estimation in the multimodal biometric verification scenario. Reliability estimation has been shown to be an efficient and accurate way of predicting and correcting erroneous classification decisions in both unimodal (speech, face, online signature) and multimodal (speech and face) systems. While the initial research results indicate the high potential of the proposed methodology, the performance of the reliability estimation in a multimodal setting has not been sufficiently studied or evaluated. In this paper, we demonstrate the advantages of using the unimodal reliability information in order to perform an efficient biometric fusion of two modalities. We further show the presented method to be superior to state-of-the-art multimodal decision-level fusion schemes. The experimental evaluation presented in this paper is based on the popular benchmarking bimodal BANCA database.
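    A minimal sketch of reliability-weighted score fusion in the spirit of this record; the abstract does not give the exact combination rule, so the normalised weighting below is only an illustrative assumption.

```python
import numpy as np

def reliability_weighted_fusion(scores, reliabilities):
    """Fuse unimodal match scores (e.g. speech and face) using per-decision
    reliability estimates in [0, 1]; a higher fused score supports 'genuine'."""
    w = np.asarray(reliabilities, dtype=float)
    w = w / w.sum()                 # normalise the reliability weights
    return float(np.dot(w, np.asarray(scores, dtype=float)))

# Example: the face score is high but judged unreliable, so speech dominates.
fused = reliability_weighted_fusion(scores=[0.9, 0.3], reliabilities=[0.2, 0.9])
```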

  19. Random-Profiles-Based 3D Face Recognition System

    Directory of Open Access Journals (Sweden)

    Joongrock Kim

    2014-03-01

    Full Text Available In this paper, a novel nonintrusive three-dimensional (3D) face modeling system for random-profile-based 3D face recognition is presented. Although recent two-dimensional (2D) face recognition systems can achieve a reliable recognition rate under certain conditions, their performance is limited by internal and external changes, such as illumination and pose variation. To address these issues, 3D face recognition, which uses 3D face data, has recently received much attention. However, the performance of 3D face recognition highly depends on the precision of the acquired 3D face data, while also requiring more computational power and storage capacity than 2D face recognition systems. In this paper, we present a developed nonintrusive 3D face modeling system composed of a stereo vision system and an invisible near-infrared line laser, which can be directly applied to profile-based 3D face recognition. We further propose a novel random-profile-based 3D face recognition method that is memory-efficient and pose-invariant. The experimental results demonstrate that the reconstructed 3D face data consist of more than 50 k 3D point clouds and that the method achieves a reliable recognition rate under pose variation.

  20. Formal Verification of Digital Protection Logic and Automatic Testing Software

    Energy Technology Data Exchange (ETDEWEB)

    Cha, S. D.; Ha, J. S.; Seo, J. S. [KAIST, Daejeon (Korea, Republic of)

    2008-06-15

    - Technical aspect: Digital I and C software is intended to be safe and reliable, and the project results help such software obtain a licence. The software verification techniques resulting from this project can be used for digital I and C in nuclear power plants (NPPs) in the future. This research introduces many meaningful results of verification on digital protection logic and suggests an I and C software testing strategy. These results can be applied to verify nuclear fusion devices, accelerators, nuclear waste management systems and nuclear medical devices that require dependable software and highly reliable controllers; moreover, they can be used for military, medical or aerospace-related software. - Economic and industrial aspect: Since the safety of digital I and C software is highly important, it is essential for the software to be verified, but verification and licence acquisition for digital I and C software are costly. This project benefits the domestic economy by using the introduced verification and testing techniques instead of foreign techniques. The operating rate of NPPs will rise when NPP safety-critical software is verified with an intellectual V and V tool, and such software is expected to substitute for safety-critical software that currently depends wholly on foreign suppliers. Consequently, the result of this project has high commercial value, and recognition of the software development work can spread to the industrial circles. - Social and cultural aspect: People expect nuclear power generation to contribute to relieving environmental problems because it does not emit as many harmful air pollutants as other forms of power generation. To give society more trust in and expectation of nuclear power generation, people must be convinced that an NPP is a highly safe system. From that point of view, highly reliable I and C proved by intellectual V and V techniques can be presented as evidence.

  1. Extragalactic infrared astronomy

    International Nuclear Information System (INIS)

    Gondhalekar, P.M.

    1985-05-01

    The paper concerns the field of Extragalactic Infrared Astronomy, discussed at the Fourth RAL Workshop on Astronomy and Astrophysics. Fifteen papers were presented on infrared emission from extragalactic objects. Both ground-(and aircraft-) based and IRAS infrared data were reviewed. The topics covered star formation in galaxies, active galactic nuclei and cosmology. (U.K.)

  2. Morphing morphing faces

    NARCIS (Netherlands)

    Lier, R.J. van

    2009-01-01

    We have made cyclic morphing animations using two different faces. The morphing animations gradually evolved from one face to the other, and vice versa. When free viewing, the perceived changes were not very large, but the changes could easily be observed. Observers were asked to fixate on a dot

  3. ECG based biometrics verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Singla

    2010-07-01

    Full Text Available Biometric based authentication systems provide solutions to the high-security problems that remain with conventional security systems. In a biometric verification system, a human's biological parameters (such as voice, fingerprint, palm print or hand geometry, face, iris, etc.) are used to verify the authenticity of a person. These parameters are good biometric parameters but do not guarantee that the person is present and alive. A voice can be copied, a fingerprint can be lifted from glass or reproduced on synthetic skin, and in face recognition systems, due to genetic factors, identical twins or a father and son may have the same facial appearance. ECG does not have these problems: it cannot be recorded without the knowledge of the person, and the ECG of every person is unique; even identical twins have different ECGs. In this paper an ECG-based biometric verification system developed using Laboratory Virtual Instruments Engineering Workbench (LabVIEW version 7.1) is discussed. Experiments were conducted on a laboratory database of 20 individuals with 10 samples each, and the results revealed a false rejection rate (FRR) of 3% and a false acceptance rate (FAR) of 3.21%.
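    The FRR and FAR figures quoted above are standard verification error rates; a small sketch of how they are computed from genuine and impostor match scores is given below (illustrative only, not the LabVIEW implementation; the convention "higher score = better match" is an assumption).

```python
import numpy as np

def far_frr(genuine_scores, impostor_scores, threshold):
    """False acceptance rate (impostors accepted) and false rejection rate
    (genuine users rejected) at a given decision threshold."""
    genuine = np.asarray(genuine_scores, dtype=float)
    impostor = np.asarray(impostor_scores, dtype=float)
    far = float(np.mean(impostor >= threshold))
    frr = float(np.mean(genuine < threshold))
    return far, frr

# Toy example with simulated match scores.
rng = np.random.default_rng(0)
print(far_frr(rng.normal(0.8, 0.1, 200), rng.normal(0.4, 0.1, 200), threshold=0.6))
```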

  4. Experimental verification of layout physical verification of silicon photonics

    Science.gov (United States)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

    Silicon photonics has been established as one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology which supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process assures reliable fabrication of the PICs, as it checks both the manufacturability and the reliability of the circuit. However, the PV process is challenging in the case of PICs because it requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and the waveguide bend based on the SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models avoid the huge time cost of 3D EM simulations and can easily be included in any electronic design automation (EDA) flow, as the equation parameters can be extracted directly from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions have been fabricated using electron beam lithography and measured. The measurement results for the fabricated devices have been compared with the derived models and show very good agreement; the match can reach 100% by calibrating a certain parameter in the model.

  5. SHIELD verification and validation report

    International Nuclear Information System (INIS)

    Boman, C.

    1992-02-01

    This document outlines the verification and validation effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system code. Along with its predecessors, SHIELD has been in use at the Savannah River Site (SRS) for more than ten years. During this time the code has been extensively tested and a variety of validation documents have been issued. The primary function of this report is to specify the features and capabilities for which SHIELD is to be considered validated, and to reference the documents that establish the validation

  6. Trojan technical specification verification project

    International Nuclear Information System (INIS)

    Bates, L.; Rickenback, M.

    1991-01-01

    The Trojan Technical Specification Verification (TTSV) project at the Trojan plant of Portland General Electric Company was motivated by the recognition that many numbers in the Trojan technical specifications (TTS) potentially lacked the consideration of instrument- and/or process-related errors. The plant setpoints were known to consider such errors, but many of the values associated with the limiting conditions for operation (LCO) did not. In addition, the existing plant instrument error analyses were based on industry values that do not reflect the Trojan plant-specific experience. The purpose of this project is to ensure that the Trojan plant setpoint and LCO values include plant-specific instrument error

  7. A verification environment for bigraphs

    DEFF Research Database (Denmark)

    Perrone, Gian David; Debois, Søren; Hildebrandt, Thomas

    2013-01-01

    We present the BigMC tool for bigraphical reactive systems that may be instantiated as a verification tool for any formalism or domain-specific modelling language encoded as a bigraphical reactive system. We introduce the syntax and use of BigMC, and exemplify its use with two small examples......: a textbook “philosophers” example, and an example motivated by a ubiquitous computing application. We give a tractable heuristic with which to approximate interference between reaction rules, and prove this analysis to be safe. We provide a mechanism for state reachability checking of bigraphical reactive...

  8. Hot-cell verification facility

    International Nuclear Information System (INIS)

    Eschenbaum, R.A.

    1981-01-01

    The Hot Cell Verification Facility (HCVF) was established as the test facility for the Fuels and Materials Examination Facility (FMEF) examination equipment. HCVF provides a prototypic hot cell environment to check the equipment for functional and remote operation. It also provides actual hands-on training for future FMEF Operators. In its two years of operation, HCVF has already provided data to make significant changes in items prior to final fabrication. It will also shorten the startup time in FMEF since the examination equipment will have been debugged and operated in HCVF

  9. Multimodal Personal Verification Using Likelihood Ratio for the Match Score Fusion

    Directory of Open Access Journals (Sweden)

    Long Binh Tran

    2017-01-01

    Full Text Available In this paper, the authors present a novel personal verification system based on the likelihood ratio test for fusion of match scores from multiple biometric matchers (face, fingerprint, hand shape, and palm print). In the proposed system, multimodal features are extracted by Zernike Moments (ZM). After matching, the match scores from the multiple biometric matchers are fused based on the likelihood ratio test. A finite Gaussian mixture model (GMM) is used for estimating the genuine and impostor densities of match scores for personal verification. Our approach is also compared with well-known approaches such as the support vector machine and the sum rule with min-max normalization. The experimental results confirm that the proposed system achieves excellent identification performance, with higher accuracy than these well-known approaches, and can thus be utilized in further applications related to person verification.
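    A compact sketch of likelihood-ratio fusion with finite Gaussian mixture models, as described in this record; the number of mixture components, the use of scikit-learn, and the accept threshold are assumptions made for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_score_densities(genuine_scores, impostor_scores, n_components=2):
    """genuine_scores / impostor_scores: arrays of shape (n_samples, n_matchers)
    holding match-score vectors from the individual biometric matchers."""
    gen = GaussianMixture(n_components=n_components, random_state=0).fit(genuine_scores)
    imp = GaussianMixture(n_components=n_components, random_state=0).fit(impostor_scores)
    return gen, imp

def log_likelihood_ratio(score_vector, gen_model, imp_model):
    """Fused score for one claim: log p(s | genuine) - log p(s | impostor)."""
    x = np.asarray(score_vector, dtype=float).reshape(1, -1)
    return float((gen_model.score_samples(x) - imp_model.score_samples(x))[0])

# Accept the identity claim when the log-likelihood ratio exceeds a chosen threshold (e.g. 0).
```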

  10. Discrimination between authentic and adulterated liquors by near-infrared spectroscopy and ensemble classification

    Science.gov (United States)

    Chen, Hui; Tan, Chao; Wu, Tong; Wang, Li; Zhu, Wanping

    2014-09-01

    Chinese liquor is one of the most famous distilled spirits, and counterfeit liquor is becoming a serious problem in the market. In particular, aged liquor is facing a crisis of confidence because it is difficult for consumers to verify the marked age, which prompts unscrupulous traders to pass off low-grade liquors as high-grade ones. An ideal method for authenticity confirmation of liquors should be non-invasive, non-destructive and timely. The combination of near-infrared spectroscopy with chemometrics proves to be a good way to meet these requirements. A new strategy is proposed for classification and verification of the adulteration of liquors by using NIR spectroscopy and chemometric classification, i.e., ensemble support vector machines (SVM). Three measures, i.e., accuracy, sensitivity and specificity, were used for performance evaluation. The results confirmed that the strategy can serve as a screening tool for verifying adulteration of the liquor, that is, a preliminary step in which a sample is subjected to deeper analysis only when the proposed methodology gives a positive result for adulteration.
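    For reference, the three performance measures named in this record can be computed from a binary confusion matrix as sketched below; treating "adulterated" as the positive class is our assumption.

```python
import numpy as np

def accuracy_sensitivity_specificity(y_true, y_pred):
    """Binary evaluation with 1 = adulterated (positive) and 0 = authentic."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    accuracy = (tp + tn) / y_true.size
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0   # adulterated samples caught
    specificity = tn / (tn + fp) if (tn + fp) else 0.0   # authentic samples cleared
    return accuracy, sensitivity, specificity
```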

  11. Gaze Cueing by Pareidolia Faces

    Directory of Open Access Journals (Sweden)

    Kohske Takahashi

    2013-12-01

    Full Text Available Visual images that are not faces are sometimes perceived as faces (the pareidolia phenomenon. While the pareidolia phenomenon provides people with a strong impression that a face is present, it is unclear how deeply pareidolia faces are processed as faces. In the present study, we examined whether a shift in spatial attention would be produced by gaze cueing of face-like objects. A robust cueing effect was observed when the face-like objects were perceived as faces. The magnitude of the cueing effect was comparable between the face-like objects and a cartoon face. However, the cueing effect was eliminated when the observer did not perceive the objects as faces. These results demonstrated that pareidolia faces do more than give the impression of the presence of faces; indeed, they trigger an additional face-specific attentional process.

  12. Gaze cueing by pareidolia faces.

    Science.gov (United States)

    Takahashi, Kohske; Watanabe, Katsumi

    2013-01-01

    Visual images that are not faces are sometimes perceived as faces (the pareidolia phenomenon). While the pareidolia phenomenon provides people with a strong impression that a face is present, it is unclear how deeply pareidolia faces are processed as faces. In the present study, we examined whether a shift in spatial attention would be produced by gaze cueing of face-like objects. A robust cueing effect was observed when the face-like objects were perceived as faces. The magnitude of the cueing effect was comparable between the face-like objects and a cartoon face. However, the cueing effect was eliminated when the observer did not perceive the objects as faces. These results demonstrated that pareidolia faces do more than give the impression of the presence of faces; indeed, they trigger an additional face-specific attentional process.

  13. Hyper-Spectral Imager in visible and near-infrared band for lunar ...

    Indian Academy of Sciences (India)

    India's first lunar mission, Chandrayaan-1, will have a Hyper-Spectral Imager in the visible and near-infrared spectral ... mapping of the Moon's crust in a large number of spectral channels. The planned .... In-flight verification may be done.

  14. Face Detection and Recognition

    National Research Council Canada - National Science Library

    Jain, Anil K

    2004-01-01

    This report describes research efforts towards developing algorithms for a robust face recognition system to overcome many of the limitations found in existing two-dimensional facial recognition systems...

  15. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    Full Text Available This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described. The verification steps and necessary conditions are presented, and the factors that influence verification quality are established. Software optimality verification is analyzed and some metrics are defined for the verification process.

  16. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  17. RISKIND verification and benchmark comparisons

    International Nuclear Information System (INIS)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and dose estimates were compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  18. RISKIND verification and benchmark comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and dose estimates were compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  19. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness , which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  20. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  1. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification o...... on a number of case studies, tackled using a prototypical implementation....

  2. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  3. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  4. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  5. The monitoring and verification of nuclear weapons

    International Nuclear Information System (INIS)

    Garwin, Richard L.

    2014-01-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  6. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis associated fingerprint changes is a significant problem and affects fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. A case-control study involving 100 patients with hand dermatitis. All patients verified their thumbprints against their identity card. Registered fingerprints were randomized into a model derivation and model validation group. Predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts it will almost always fail verification, while presence of both minor criteria and presence of one minor criterion predict high and low risk of fingerprint verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes the verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected number (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting risk of fingerprint verification in patients with hand dermatitis. © 2014 The International Society of Dermatology.
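    The decision rule described in this abstract (one major criterion and two minor criteria) can be written down directly; the sketch below is our reading of that rule, not the published regression model.

```python
def fingerprint_failure_risk(dystrophy_area_pct, long_horizontal_lines, long_vertical_lines):
    """Qualitative risk of fingerprint verification failure in hand dermatitis."""
    if dystrophy_area_pct >= 25:                       # major criterion
        return "almost always fails verification"
    minor = int(bool(long_horizontal_lines)) + int(bool(long_vertical_lines))
    if minor == 2:
        return "high risk of verification failure"
    if minor == 1:
        return "low risk of verification failure"
    return "almost always passes verification"

print(fingerprint_failure_risk(10, True, False))       # -> low risk of verification failure
```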

  7. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  8. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  9. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  10. Infrared microscope inspection apparatus

    Science.gov (United States)

    Forman, Steven E.; Caunt, James W.

    1985-02-26

    Apparatus and system for inspecting infrared transparents, such as an array of photovoltaic modules containing silicon solar cells, includes an infrared microscope, at least three sources of infrared light placed around and having their axes intersect the center of the object field and means for sending the reflected light through the microscope. The apparatus is adapted to be mounted on an X-Y translator positioned adjacent the object surface.

  11. Synergies across verification regimes: Nuclear safeguards and chemical weapons convention compliance

    International Nuclear Information System (INIS)

    Kadner, Steven P.; Turpen, Elizabeth

    2001-01-01

    For example, just as cost-effective and readily applicable technologies can solve the problems faced by the nuclear safeguards community, these same technologies offer solutions for the CWC safeguards regime. This paper discusses similarities between nuclear and chemical weapons arms control in terms of verification methodologies and the potential for shared applications of safeguards technologies. (author)

  12. Far infrared supplement: Catalog of infrared observations, second edition

    International Nuclear Information System (INIS)

    Gezari, D.Y.; Schmitz, M.; Mead, J.M.

    1988-08-01

    The Far Infrared Supplement: Catalog of Infrared Observations summarizes all infrared astronomical observations at far infrared wavelengths (5 to 1000 microns) published in the scientific literature from 1965 through 1986. The Supplement list contains 25 percent of the observations in the full Catalog of Infrared Observations (CIO), and essentially eliminates most visible stars from the listings. The Supplement is thus more compact than the main catalog, and is intended for easy reference during astronomical observations. The Far Infrared Supplement (2nd Edition) includes the Index of Infrared Source Positions and the Bibliography of Infrared Astronomy for the subset of far infrared observations listed.

  13. Mid-Infrared Lasers

    Data.gov (United States)

    National Aeronautics and Space Administration — Mid infrared solid state lasers for Differential Absorption Lidar (DIAL) systems required for understanding atmospheric chemistry are not available. This program...

  14. Buzz: Face-to-Face Contact and the Urban Economy

    OpenAIRE

    Michael Storper; Anthony J. Venables

    2003-01-01

    This paper argues that existing models of urban concentrations are incomplete unless grounded in the most fundamental aspect of proximity: face-to-face contact. Face-to-face contact has four main features: it is an efficient communication technology; it can help solve incentive problems; it can facilitate socialization and learning; and it provides psychological motivation. We discuss each of these features in turn, and develop formal economic models of two of them. Face-to-face is particular...

  15. Facing Aggression: Cues Differ for Female versus Male Faces

    OpenAIRE

    Geniole, Shawn N.; Keyes, Amanda E.; Mondloch, Catherine J.; Carr?, Justin M.; McCormick, Cheryl M.

    2012-01-01

    The facial width-to-height ratio (face ratio), is a sexually dimorphic metric associated with actual aggression in men and with observers' judgements of aggression in male faces. Here, we sought to determine if observers' judgements of aggression were associated with the face ratio in female faces. In three studies, participants rated photographs of female and male faces on aggression, femininity, masculinity, attractiveness, and nurturing. In Studies 1 and 2, for female and male faces, judge...

  16. The Complete Gabor-Fisher Classifier for Robust Face Recognition

    Directory of Open Access Journals (Sweden)

    Štruc Vitomir

    2010-01-01

    Full Text Available This paper develops a novel face recognition technique called Complete Gabor Fisher Classifier (CGFC). Different from existing techniques that use Gabor filters for deriving the Gabor face representation, the proposed approach does not rely solely on Gabor magnitude information but effectively uses features computed based on Gabor phase information as well. It represents one of the few successful attempts found in the literature of combining Gabor magnitude and phase information for robust face recognition. The novelty of the proposed CGFC technique comes from (1) the introduction of a Gabor phase-based face representation and (2) the combination of the recognition technique using the proposed representation with classical Gabor magnitude-based methods into a unified framework. The proposed face recognition framework is assessed in a series of face verification and identification experiments performed on the XM2VTS, Extended YaleB, FERET, and AR databases. The results of the assessment suggest that the proposed technique clearly outperforms state-of-the-art face recognition techniques from the literature and that its performance is almost unaffected by the presence of partial occlusions of the facial area, changes in facial expression, or severe illumination changes.
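    To make the distinction between the two sources of information concrete, the sketch below extracts Gabor magnitude and phase responses from a grayscale face image; the kernel parameters are arbitrary assumptions, and the CGFC's phase encoding and Fisher analysis are not reproduced here.

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(ksize=31, wavelength=8.0, theta=0.0, sigma=4.0):
    """Complex 2-D Gabor kernel at one scale and orientation."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_theta = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
    carrier = np.exp(1j * 2.0 * np.pi * x_theta / wavelength)
    return envelope * carrier

def gabor_magnitude_phase(image, thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Return per-orientation magnitude and phase responses for a 2-D float image."""
    magnitudes, phases = [], []
    for theta in thetas:
        response = fftconvolve(image, gabor_kernel(theta=theta), mode="same")
        magnitudes.append(np.abs(response))
        phases.append(np.angle(response))
    return magnitudes, phases
```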

  17. Vertical vector face lift.

    Science.gov (United States)

    Somoano, Brian; Chan, Joanna; Morganroth, Greg

    2011-01-01

    Facial rejuvenation using local anesthesia has evolved in the past decade as a safer option for patients seeking fewer complications and minimal downtime. Mini- and short-scar face lifts using more conservative incision lengths and extent of undermining can be effective in the younger patient with lower face laxity and minimal loose, elastotic neck skin. By incorporating both an anterior and posterior approach and using an incision length between the mini and more traditional face lift, the Vertical Vector Face Lift can achieve longer-lasting and natural results with lesser cost and risk. Submentoplasty and liposuction of the neck and jawline, fundamental components of the vertical vector face lift, act synergistically with superficial musculoaponeurotic system plication to reestablish a more youthful, sculpted cervicomental angle, even in patients with prominent jowls. Dramatic results can be achieved in the right patient by combining with other procedures such as injectable fillers, chin implants, laser resurfacing, or upper and lower blepharoplasties. © 2011 Wiley Periodicals, Inc.

  18. Successful decoding of famous faces in the fusiform face area.

    Directory of Open Access Journals (Sweden)

    Vadim Axelrod

    Full Text Available What are the neural mechanisms of face recognition? It is believed that the network of face-selective areas, which spans the occipital, temporal, and frontal cortices, is important in face recognition. A number of previous studies indeed reported that face identity could be discriminated based on patterns of multivoxel activity in the fusiform face area and the anterior temporal lobe. However, given the difficulty in localizing the face-selective area in the anterior temporal lobe, its role in face recognition is still unknown. Furthermore, previous studies limited their analysis to occipito-temporal regions without testing identity decoding in more anterior face-selective regions, such as the amygdala and prefrontal cortex. In the current high-resolution functional Magnetic Resonance Imaging study, we systematically examined the decoding of the identity of famous faces in the temporo-frontal network of face-selective and adjacent non-face-selective regions. A special focus has been put on the face-area in the anterior temporal lobe, which was reliably localized using an optimized scanning protocol. We found that face-identity could be discriminated above chance level only in the fusiform face area. Our results corroborate the role of the fusiform face area in face recognition. Future studies are needed to further explore the role of the more recently discovered anterior face-selective areas in face recognition.

  19. How Well Do Computer-Generated Faces Tap Face Expertise?

    Directory of Open Access Journals (Sweden)

    Kate Crookes

    Full Text Available The use of computer-generated (CG stimuli in face processing research is proliferating due to the ease with which faces can be generated, standardised and manipulated. However there has been surprisingly little research into whether CG faces are processed in the same way as photographs of real faces. The present study assessed how well CG faces tap face identity expertise by investigating whether two indicators of face expertise are reduced for CG faces when compared to face photographs. These indicators were accuracy for identification of own-race faces and the other-race effect (ORE-the well-established finding that own-race faces are recognised more accurately than other-race faces. In Experiment 1 Caucasian and Asian participants completed a recognition memory task for own- and other-race real and CG faces. Overall accuracy for own-race faces was dramatically reduced for CG compared to real faces and the ORE was significantly and substantially attenuated for CG faces. Experiment 2 investigated perceptual discrimination for own- and other-race real and CG faces with Caucasian and Asian participants. Here again, accuracy for own-race faces was significantly reduced for CG compared to real faces. However the ORE was not affected by format. Together these results signal that CG faces of the type tested here do not fully tap face expertise. Technological advancement may, in the future, produce CG faces that are equivalent to real photographs. Until then caution is advised when interpreting results obtained using CG faces.

  20. MFTF sensor verification computer program

    International Nuclear Information System (INIS)

    Chow, H.K.

    1984-01-01

    The design, requirements document and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids, housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid helium level and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained as to their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed, as well as checks of sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE data base computer system.

  1. Numerical Verification Of Equilibrium Chemistry

    International Nuclear Information System (INIS)

    Piro, Markus; Lewis, Brent; Thompson, William T.; Simunovic, Srdjan; Besmann, Theodore M.

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing boundary conditions in heat and mass transport modules. However, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes.

  2. Seismic verification of underground explosions

    International Nuclear Information System (INIS)

    Glenn, L.A.

    1985-06-01

    The first nuclear test agreement, the test moratorium, was made in 1958 and lasted until the Soviet Union unilaterally resumed testing in the atmosphere in 1961. It was followed by the Limited Test Ban Treaty of 1963, which prohibited nuclear tests in the atmosphere, in outer space, and underwater. In 1974 the Threshold Test Ban Treaty (TTBT) was signed, limiting underground tests after March 1976 to a maximum yield of 250 kt. The TTBT was followed by a treaty limiting peaceful nuclear explosions and both the United States and the Soviet Union claim to be abiding by the 150-kt yield limit. A comprehensive test ban treaty (CTBT), prohibiting all testing of nuclear weapons, has also been discussed. However, a verifiable CTBT is a contradiction in terms. No monitoring technology can offer absolute assurance that very-low-yield illicit explosions have not occurred. The verification process, evasion opportunities, and cavity decoupling are discussed in this paper

  3. Retail applications of signature verification

    Science.gov (United States)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenience in checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point of sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.
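    The equal error rate quoted here is the operating point where the false acceptance and false rejection rates coincide; a simple threshold sweep to approximate it is sketched below (illustrative only, with "higher score = more genuine" assumed).

```python
import numpy as np

def equal_error_rate(genuine_scores, forgery_scores):
    """Approximate the EER by sweeping the decision threshold over all observed scores."""
    genuine = np.asarray(genuine_scores, dtype=float)
    forgery = np.asarray(forgery_scores, dtype=float)
    best_gap, eer = np.inf, None
    for t in np.unique(np.concatenate([genuine, forgery])):
        frr = np.mean(genuine < t)        # authentic signatures rejected
        far = np.mean(forgery >= t)       # forgeries accepted
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2.0
    return eer
```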

  4. The verification of ethnographic data.

    Science.gov (United States)

    Pool, Robert

    2017-09-01

    Anthropologists are increasingly required to account for the data on which they base their interpretations and to make it available for public scrutiny and re-analysis. While this may seem straightforward (why not place our data in online repositories?), it is not. Ethnographic 'data' may consist of everything from verbatim transcripts ('hard data') to memories and impressions ('soft data'). Hard data can be archived and re-analysed; soft data cannot. The focus on hard 'objective' data contributes to the delegitimizing of the soft data that are essential for ethnographic understanding, and without which hard data cannot be properly interpreted. However, the credibility of ethnographic interpretation requires the possibility of verification. This could be achieved by obligatory, standardised forms of personal storage with the option for audit if required, and by being more explicit in publications about the nature and status of the data and the process of interpretation.

  5. The NRC measurement verification program

    International Nuclear Information System (INIS)

    Pham, T.N.; Ong, L.D.Y.

    1995-01-01

    A perspective is presented on the US Nuclear Regulatory Commission (NRC) approach for effectively monitoring the measurement methods and directly testing the capability and performance of licensee measurement systems. A main objective in material control and accounting (MC and A) inspection activities is to assure the accuracy and precision of the accounting system and the absence of potential process anomalies through overall accountability. The primary means of verification remains the NRC random sampling during routine safeguards inspections. This involves the independent testing of licensee measurement performance with statistical sampling plans for physical inventories, item control, and auditing. A prospective cost-effective alternative overcheck is also discussed in terms of an externally coordinated sample exchange or ''round robin'' program among participating fuel cycle facilities in order to verify the quality of measurement systems, i.e., to assure that analytical measurement results are free of bias

  6. Digital correlation applied to recognition and identification faces

    International Nuclear Information System (INIS)

    Arroyave, S.; Hernandez, L. J.; Torres, Cesar; Matos, Lorenzo

    2009-01-01

    A system was developed that is capable of recognizing people's faces from their facial features. Images are captured automatically by the software through a process that validates the presence of a face in front of the camera lens; the digitized image is then compared with a database of previously captured images, so that the face can subsequently be recognized and finally identified. The contribution of the proposed system is that data acquisition is done in real time using a commercial USB web cam, offering a system that is equally effective but much more economical. This tool is very useful in systems where security is of vital importance, providing a high degree of verification support to entities that hold databases of people's faces. (Author)
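    A hypothetical sketch of the face-presence validation step described above, using OpenCV's bundled Haar cascade for frontal faces; the capture parameters are assumptions, and the subsequent correlation-based matching against the database is not shown.

```python
import cv2

def capture_face_if_present(camera_index=0):
    """Grab one frame from the webcam and return it only if a face is detected."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        return None
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return frame if len(faces) > 0 else None
```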

  7. Facing Sound - Voicing Art

    DEFF Research Database (Denmark)

    Lønstrup, Ansa

    2013-01-01

    This article is based on examples of contemporary audiovisual art, with a special focus on the Tony Oursler exhibition Face to Face at Aarhus Art Museum ARoS in Denmark in March-July 2012. My investigation involves a combination of qualitative interviews with visitors, observations of the audience's...... interactions with the exhibition and the artwork in the museum space and short analyses of individual works of art based on reception aesthetics and phenomenology and inspired by newer writings on sound, voice and listening....

  8. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.

  9. Facing Aggression: Cues Differ for Female versus Male Faces

    Science.gov (United States)

    Geniole, Shawn N.; Keyes, Amanda E.; Mondloch, Catherine J.; Carré, Justin M.; McCormick, Cheryl M.

    2012-01-01

    The facial width-to-height ratio (face ratio), is a sexually dimorphic metric associated with actual aggression in men and with observers' judgements of aggression in male faces. Here, we sought to determine if observers' judgements of aggression were associated with the face ratio in female faces. In three studies, participants rated photographs of female and male faces on aggression, femininity, masculinity, attractiveness, and nurturing. In Studies 1 and 2, for female and male faces, judgements of aggression were associated with the face ratio even when other cues in the face related to masculinity were controlled statistically. Nevertheless, correlations between the face ratio and judgements of aggression were smaller for female than for male faces (F1,36 = 7.43, p = 0.01). In Study 1, there was no significant relationship between judgements of femininity and of aggression in female faces. In Study 2, the association between judgements of masculinity and aggression was weaker in female faces than for male faces in Study 1. The weaker association in female faces may be because aggression and masculinity are stereotypically male traits. Thus, in Study 3, observers rated faces on nurturing (a stereotypically female trait) and on femininity. Judgements of nurturing were associated with femininity (positively) and masculinity (negatively) ratings in both female and male faces. In summary, the perception of aggression differs in female versus male faces. The sex difference was not simply because aggression is a gendered construct; the relationships between masculinity/femininity and nurturing were similar for male and female faces even though nurturing is also a gendered construct. Masculinity and femininity ratings are not associated with aggression ratings nor with the face ratio for female faces. In contrast, all four variables are highly inter-correlated in male faces, likely because these cues in male faces serve as “honest signals”. PMID:22276184

  10. Facing aggression: cues differ for female versus male faces.

    Directory of Open Access Journals (Sweden)

    Shawn N Geniole

    Full Text Available The facial width-to-height ratio (face ratio) is a sexually dimorphic metric associated with actual aggression in men and with observers' judgements of aggression in male faces. Here, we sought to determine if observers' judgements of aggression were associated with the face ratio in female faces. In three studies, participants rated photographs of female and male faces on aggression, femininity, masculinity, attractiveness, and nurturing. In Studies 1 and 2, for female and male faces, judgements of aggression were associated with the face ratio even when other cues in the face related to masculinity were controlled statistically. Nevertheless, correlations between the face ratio and judgements of aggression were smaller for female than for male faces (F(1,36) = 7.43, p = 0.01). In Study 1, there was no significant relationship between judgements of femininity and of aggression in female faces. In Study 2, the association between judgements of masculinity and aggression was weaker in female faces than for male faces in Study 1. The weaker association in female faces may be because aggression and masculinity are stereotypically male traits. Thus, in Study 3, observers rated faces on nurturing (a stereotypically female trait) and on femininity. Judgements of nurturing were associated with femininity (positively) and masculinity (negatively) ratings in both female and male faces. In summary, the perception of aggression differs in female versus male faces. The sex difference was not simply because aggression is a gendered construct; the relationships between masculinity/femininity and nurturing were similar for male and female faces even though nurturing is also a gendered construct. Masculinity and femininity ratings are not associated with aggression ratings nor with the face ratio for female faces. In contrast, all four variables are highly inter-correlated in male faces, likely because these cues in male faces serve as "honest signals".

  11. Facing aggression: cues differ for female versus male faces.

    Science.gov (United States)

    Geniole, Shawn N; Keyes, Amanda E; Mondloch, Catherine J; Carré, Justin M; McCormick, Cheryl M

    2012-01-01

    The facial width-to-height ratio (face ratio), is a sexually dimorphic metric associated with actual aggression in men and with observers' judgements of aggression in male faces. Here, we sought to determine if observers' judgements of aggression were associated with the face ratio in female faces. In three studies, participants rated photographs of female and male faces on aggression, femininity, masculinity, attractiveness, and nurturing. In Studies 1 and 2, for female and male faces, judgements of aggression were associated with the face ratio even when other cues in the face related to masculinity were controlled statistically. Nevertheless, correlations between the face ratio and judgements of aggression were smaller for female than for male faces (F(1,36) = 7.43, p = 0.01). In Study 1, there was no significant relationship between judgements of femininity and of aggression in female faces. In Study 2, the association between judgements of masculinity and aggression was weaker in female faces than for male faces in Study 1. The weaker association in female faces may be because aggression and masculinity are stereotypically male traits. Thus, in Study 3, observers rated faces on nurturing (a stereotypically female trait) and on femininity. Judgements of nurturing were associated with femininity (positively) and masculinity (negatively) ratings in both female and male faces. In summary, the perception of aggression differs in female versus male faces. The sex difference was not simply because aggression is a gendered construct; the relationships between masculinity/femininity and nurturing were similar for male and female faces even though nurturing is also a gendered construct. Masculinity and femininity ratings are not associated with aggression ratings nor with the face ratio for female faces. In contrast, all four variables are highly inter-correlated in male faces, likely because these cues in male faces serve as "honest signals".
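    The facial width-to-height ratio discussed in these records is bizygomatic width divided by upper-face height; a small sketch of that measurement from four landmark coordinates is given below (the landmark names and the assumption of an upright, frontal face are ours).

```python
def facial_width_to_height_ratio(left_zygion, right_zygion, brow_midpoint, upper_lip_midpoint):
    """Compute the face ratio from (x, y) landmarks on an upright frontal face:
    bizygomatic width divided by the distance between brow and upper lip."""
    width = abs(right_zygion[0] - left_zygion[0])
    height = abs(upper_lip_midpoint[1] - brow_midpoint[1])
    return width / height

# Example with hypothetical pixel coordinates.
print(facial_width_to_height_ratio((40, 120), (160, 120), (100, 80), (100, 150)))
```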

  12. Gaze Cueing by Pareidolia Faces

    OpenAIRE

    Kohske Takahashi; Katsumi Watanabe

    2013-01-01

    Visual images that are not faces are sometimes perceived as faces (the pareidolia phenomenon). While the pareidolia phenomenon provides people with a strong impression that a face is present, it is unclear how deeply pareidolia faces are processed as faces. In the present study, we examined whether a shift in spatial attention would be produced by gaze cueing of face-like objects. A robust cueing effect was observed when the face-like objects were perceived as faces. The magnitude of the cuei...

  13. On infrared divergences

    International Nuclear Information System (INIS)

    Parisi, G.

    1979-01-01

    The structure of infrared divergences is studied in superrenormalizable interactions. It is conjectured that there is an extension of the Bogoliubov-Parasiuk-Hepp theorem which copes also with infrared divergences. The consequences of this conjecture on the singularities of the Borel transform in a massless asymptotic free field theory are discussed. The application of these ideas to gauge theories is briefly discussed. (Auth.)

  14. A Verification Logic for GOAL Agents

    Science.gov (United States)

    Hindriks, K. V.

    Although there has been a growing body of literature on verification of agent programs, it has been difficult to design a verification logic for agent programs that fully characterizes such programs and to connect agent programs to agent theory. The challenge is to define an agent programming language that defines a computational framework but also allows for a logical characterization useful for verification. The agent programming language GOAL has been originally designed to connect agent programming to agent theory and we present additional results here that GOAL agents can be fully represented by a logical theory. GOAL agents can thus be said to execute the corresponding logical theory.

  15. A framework for nuclear agreement and verification

    International Nuclear Information System (INIS)

    Ali, A.

    1991-01-01

    This chapter assesses the prospects for a nuclear agreement between India and Pakistan. The chapter opens with a review of past and present political environments of the two countries. The discussion proceeds to describe the linkage of global arms control agreements, prospects for verification of a Comprehensive Test Ban Treaty, the role of nuclear power in any agreements, the intrusiveness of verification, and possible post-proliferation agreements. Various monitoring and verification technologies are described (mainly satellite oriented). The chapter concludes with an analysis of the likelihood of persuading India and Pakistan to agree to a nonproliferation arrangement

  16. Verification of DRAGON: the NXT tracking module

    International Nuclear Information System (INIS)

    Zkiek, A.; Marleau, G.

    2007-01-01

    The version of DRAGON-IST that has been verified for the calculation of the incremental cross sections associated with CANDU reactivity devices is version 3.04Bb that was released in 2001. Since then, various improvements were implemented in the code including the NXT: module that can track assemblies of clusters in 2-D and 3-D geometries. Here we will discuss the verification plan for the NXT: module of DRAGON, illustrate the verification procedure we selected and present our verification results. (author)

  17. Robust Statistical Face Frontalization

    NARCIS (Netherlands)

    Sagonas, Christos; Panagakis, Yannis; Zafeiriou, Stefanos; Pantic, Maja

    2015-01-01

    Recently, it has been shown that excellent results can be achieved in both facial landmark localization and pose-invariant face recognition. These breakthroughs are attributed to the efforts of the community to manually annotate facial images in many different poses and to collect 3D facial data. In

  18. PrimeFaces blueprints

    CERN Document Server

    Jonna, Sudheer

    2014-01-01

    If you are a Java developer with experience of frontend UI development, and want to take the plunge to develop stunning UI applications with the most popular JSF framework, PrimeFaces, then this book is for you. For those with entrepreneurial aspirations, this book will provide valuable insights into how to utilize successful business models.

  19. Face-Lift

    Science.gov (United States)

    ... or sun damage, you might also consider a skin-resurfacing procedure. A face-lift can be done in combination with some other cosmetic procedures, such as a brow lift or eyelid surgery. Why it's done As you get older, your facial skin changes — sagging and becoming loose. This can make ...

  20. Facing competitive pressures

    International Nuclear Information System (INIS)

    Weinrich, H.

    1994-01-01

    This article discusses the problems facing the electric power industry and professional personnel as a result of the economic downturn and the resulting downsizing of individual companies and utilities. The author proposes that the most efficient use of technology will have greater impact in making a utility more competitive than reducing the head count

  1. Mechanical Face Seal Dynamics.

    Science.gov (United States)

    1985-12-01


  2. Sensual expressions on faces

    NARCIS (Netherlands)

    Hendriks, A.W.C.J.; Engels, R.C.M.E.; Roek, M.A.E.

    2009-01-01

    We explored the possibility that an emotional facial expression exists specifically for signalling sexual interest. We selected photographs of twenty-eight fashion models (male and female) with large portfolios (range 81 - 1593), choosing only face photographs in which the model was looking into the

  3. Problems Facing Rural Schools.

    Science.gov (United States)

    Stewart, C. E.; And Others

    Problems facing rural Scottish schools range from short term consideration of daily operation to long term consideration of organizational alternatives. Addressed specifically, such problems include consideration of: (1) liaison between a secondary school and its feeder primary schools; (2) preservice teacher training for work in small, isolated…

  4. Problems facing developing countries

    International Nuclear Information System (INIS)

    Anon.

    1986-01-01

    Financing, above all political and technical considerations, remains the major obstacle faced by developing countries who wish to embark on a nuclear power programme. According to the IAEA, the support of the official lending agencies of the suppliers is essential. (author)

  5. Neural synchronization during face-to-face communication.

    Science.gov (United States)

    Jiang, Jing; Dai, Bohan; Peng, Danling; Zhu, Chaozhe; Liu, Li; Lu, Chunming

    2012-11-07

    Although the human brain may have evolutionarily adapted to face-to-face communication, other modes of communication, e.g., telephone and e-mail, increasingly dominate our modern daily life. This study examined the neural difference between face-to-face communication and other types of communication by simultaneously measuring two brains using a hyperscanning approach. The results showed a significant increase in the neural synchronization in the left inferior frontal cortex during a face-to-face dialog between partners but none during a back-to-back dialog, a face-to-face monologue, or a back-to-back monologue. Moreover, the neural synchronization between partners during the face-to-face dialog resulted primarily from the direct interactions between the partners, including multimodal sensory information integration and turn-taking behavior. The communicating behavior during the face-to-face dialog could be predicted accurately based on the neural synchronization level. These results suggest that face-to-face communication, particularly dialog, has special neural features that other types of communication do not have and that the neural synchronization between partners may underlie successful face-to-face communication.

  6. Voicing on Virtual and Face to Face Discussion

    Science.gov (United States)

    Yamat, Hamidah

    2013-01-01

    This paper presents and discusses findings of a study conducted on pre-service teachers' experiences in virtual and face to face discussions. Technology has brought learning nowadays beyond the classroom context or time zone. The learning context and process no longer rely solely on face to face communications in the presence of a teacher.…

  7. Bayesian Face Recognition and Perceptual Narrowing in Face-Space

    Science.gov (United States)

    Balas, Benjamin

    2012-01-01

    During the first year of life, infants' face recognition abilities are subject to "perceptual narrowing", the end result of which is that observers lose the ability to distinguish previously discriminable faces (e.g. other-race faces) from one another. Perceptual narrowing has been reported for faces of different species and different races, in…

  8. Optimal Face-Iris Multimodal Fusion Scheme

    Directory of Open Access Journals (Sweden)

    Omid Sharifi

    2016-06-01

    Full Text Available Multimodal biometric systems are considered a way to minimize the limitations raised by single traits. This paper proposes new schemes based on score level, feature level and decision level fusion to efficiently fuse face and iris modalities. Log-Gabor transformation is applied as the feature extraction method on face and iris modalities. At each level of fusion, different schemes are proposed to improve the recognition performance and, finally, a combination of schemes at different fusion levels constructs an optimized and robust scheme. In this study, CASIA Iris Distance database is used to examine the robustness of all unimodal and multimodal schemes. In addition, Backtracking Search Algorithm (BSA, a novel population-based iterative evolutionary algorithm, is applied to improve the recognition accuracy of schemes by reducing the number of features and selecting the optimized weights for feature level and score level fusion, respectively. Experimental results on verification rates demonstrate a significant improvement of proposed fusion schemes over unimodal and multimodal fusion methods.
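
The fusion described above combines face and iris matcher outputs at several levels; the simplest to illustrate is weighted score-level fusion. The sketch below assumes min-max normalized scores and placeholder weights; the Backtracking Search Algorithm the paper uses to tune those weights is not reproduced here.

```python
import numpy as np

def min_max_normalize(scores):
    """Map raw matcher scores to [0, 1] so the two modalities are comparable."""
    scores = np.asarray(scores, dtype=float)
    lo, hi = scores.min(), scores.max()
    return (scores - lo) / (hi - lo) if hi > lo else np.zeros_like(scores)

def fuse_scores(face_scores, iris_scores, w_face=0.6, w_iris=0.4):
    """Weighted-sum score-level fusion of two normalized score sets.

    The weights would normally be chosen by an optimizer (e.g. the paper's
    BSA); the values used here are placeholders.
    """
    return w_face * min_max_normalize(face_scores) + w_iris * min_max_normalize(iris_scores)

# Toy example: a higher fused score is taken as a genuine match.
face = [0.82, 0.40, 0.91]   # hypothetical face-matcher scores
iris = [0.75, 0.30, 0.88]   # hypothetical iris-matcher scores
fused = fuse_scores(face, iris)
print(fused.round(2), fused >= 0.5)   # [0.8 0. 1.] [True False True]
```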

  9. Real Time Face Quality Assessment for Face Log Generation

    DEFF Research Database (Denmark)

    Kamal, Nasrollahi; Moeslund, Thomas B.

    2009-01-01

    Summarizing a long surveillance video to just a few best quality face images of each subject, a face-log, is of great importance in surveillance systems. Face quality assessment is the back-bone for face log generation and improving the quality assessment makes the face logs more reliable. Developing a real time face quality assessment system using the most important facial features and employing it for face logs generation are the concerns of this paper. Extensive tests using four databases are carried out to validate the usability of the system.

  10. Face-to-Face Activities in Blended Learning

    DEFF Research Database (Denmark)

    Kjærgaard, Annemette

    While blended learning combines online and face-to-face teaching, research on blended learning has primarily focused on the role of technology and the opportunities it creates for engaging students. Less focus has been put on face-to-face activities in blended learning. This paper argues that it is not only the online activities in blended learning that provide new opportunities for rethinking pedagogy in higher education, it is also imperative to reconsider the face-to-face activities when part of the learning is provided online. Based on a review of blended learning in business and management education, we identify what forms of teaching and learning are suggested to take place face-to-face when other activities are moved online. We draw from the Community of Inquiry framework to analyze how face-to-face activities contribute to a blended learning pedagogy and discuss the implications...

  11. Human faces are slower than chimpanzee faces.

    Directory of Open Access Journals (Sweden)

    Anne M Burrows

    Full Text Available While humans (like other primates) communicate with facial expressions, the evolution of speech added a new function to the facial muscles (facial expression muscles). The evolution of speech required the development of a coordinated action between visual (movement of the lips) and auditory signals in a rhythmic fashion to produce "visemes" (visual movements of the lips that correspond to specific sounds). Visemes depend upon facial muscles to regulate shape of the lips, which themselves act as speech articulators. This movement necessitates a more controlled, sustained muscle contraction than that produced during spontaneous facial expressions which occur rapidly and last only a short period of time. Recently, it was found that human tongue musculature contains a higher proportion of slow-twitch myosin fibers than in rhesus macaques, which is related to the slower, more controlled movements of the human tongue in the production of speech. Are there similar unique, evolutionary physiologic biases found in human facial musculature related to the evolution of speech? Using myosin immunohistochemistry, we tested the hypothesis that human facial musculature has a higher percentage of slow-twitch myosin fibers relative to chimpanzees (Pan troglodytes) and rhesus macaques (Macaca mulatta). We sampled the orbicularis oris and zygomaticus major muscles from three cadavers of each species and compared proportions of fiber-types. Results confirmed our hypothesis: humans had the highest proportion of slow-twitch myosin fibers while chimpanzees had the highest proportion of fast-twitch fibers. These findings demonstrate that the human face is slower than that of rhesus macaques and our closest living relative, the chimpanzee. They also support the assertion that human facial musculature and speech co-evolved. Further, these results suggest a unique set of evolutionary selective pressures on human facial musculature to slow down while the function of this muscle

  12. Face recognition system and method using face pattern words and face pattern bytes

    Science.gov (United States)

    Zheng, Yufeng

    2014-12-23

    The present invention provides a novel system and method for identifying individuals and for face recognition utilizing facial features for face identification. The system and method of the invention comprise creating facial features or face patterns called face pattern words and face pattern bytes for face identification. The invention also provides for pattern recognitions for identification other than face recognition. The invention further provides a means for identifying individuals based on visible and/or thermal images of those individuals by utilizing computer software implemented by instructions on a computer or computer system and a computer readable medium containing instructions on a computer system for face recognition and identification.

  13. Neural synchronization during face-to-face communication

    OpenAIRE

    Jiang, J.; Dai, B.; Peng, D.; Zhu, C.; Liu, L.; Lu, C.

    2012-01-01

    Although the human brain may have evolutionarily adapted to face-to-face communication, other modes of communication, e.g., telephone and e-mail, increasingly dominate our modern daily life. This study examined the neural difference between face-to-face communication and other types of communication by simultaneously measuring two brains using a hyperscanning approach. The results showed a significant increase in the neural synchronization in the left inferior frontal cortex during a face-to-...

  14. Data Exchanges and Verifications Online (DEVO)

    Data.gov (United States)

    Social Security Administration — DEVO is the back-end application for processing SSN verifications and data exchanges. DEVO uses modern technology for parameter driven processing of both batch and...

  15. 10 CFR 300.11 - Independent verification.

    Science.gov (United States)

    2010-01-01

    ... DEPARTMENT OF ENERGY CLIMATE CHANGE VOLUNTARY GREENHOUSE GAS REPORTING PROGRAM: GENERAL GUIDELINES § 300.11... managing an auditing or verification process, including the recruitment and allocation of other individual.... (c) Qualifications of organizations accrediting verifiers. Organizations that accredit individual...

  16. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation to be performed, and the documentation of the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented

  17. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  18. Standard Verification System Lite (SVS Lite)

    Data.gov (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  19. The Caledonian face test: A new test of face discrimination.

    Science.gov (United States)

    Logan, Andrew J; Wilkinson, Frances; Wilson, Hugh R; Gordon, Gael E; Loffler, Gunter

    2016-02-01

    This study aimed to develop a clinical test of face perception which is applicable to a wide range of patients and can capture normal variability. The Caledonian face test utilises synthetic faces which combine simplicity with sufficient realism to permit individual identification. Face discrimination thresholds (i.e. minimum difference between faces required for accurate discrimination) were determined in an "odd-one-out" task. The difference between faces was controlled by an adaptive QUEST procedure. A broad range of face discrimination sensitivity was determined from a group (N=52) of young adults (mean 5.75%; SD 1.18; range 3.33-8.84%). The test is fast (3-4 min), repeatable (test-retest r² = 0.795) and demonstrates a significant inversion effect. The potential to identify impairments of face discrimination was evaluated by testing LM who reported a lifelong difficulty with face perception. While LM's impairment for two established face tests was close to the criterion for significance (Z-scores of -2.20 and -2.27) for the Caledonian face test, her Z-score was -7.26, implying a more than threefold higher sensitivity. The new face test provides a quantifiable and repeatable assessment of face discrimination ability. The enhanced sensitivity suggests that the Caledonian face test may be capable of detecting more subtle impairments of face perception than available tests. Copyright © 2015 Elsevier Ltd. All rights reserved.
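
The Z-scores quoted above are simply distances from the control group mean in standard-deviation units. A worked check against the group statistics reported in the abstract (mean 5.75%, SD 1.18) is sketched below; the observer thresholds in the example are illustrative values, since LM's raw threshold is not given, and the abstract's sign convention reports below-normal performance as a negative Z.

```python
def z_score(value, mean, sd):
    """Number of standard deviations `value` lies from the group mean."""
    return (value - mean) / sd

# Group statistics reported in the abstract (face discrimination thresholds, %).
group_mean, group_sd = 5.75, 1.18

# Hypothetical observer thresholds (%), for illustration only.
for threshold in (5.0, 8.84, 14.3):
    print(threshold, round(z_score(threshold, group_mean, group_sd), 2))
# 5.0  -> -0.64  (well within the reported normal range 3.33-8.84%)
# 8.84 ->  2.62  (upper end of the reported normal range)
# 14.3 ->  7.25  (roughly the magnitude of LM's reported |Z| = 7.26)
```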

  20. Inventory verification measurements using neutron multiplicity counting

    International Nuclear Information System (INIS)

    Ensslin, N.; Foster, L.A.; Harker, W.C.; Krick, M.S.; Langner, D.G.

    1998-01-01

    This paper describes a series of neutron multiplicity measurements of large plutonium samples at the Los Alamos Plutonium Facility. The measurements were corrected for bias caused by neutron energy spectrum shifts and nonuniform multiplication, and are compared with calorimetry/isotopics. The results show that multiplicity counting can increase measurement throughput and yield good verification results for some inventory categories. The authors provide recommendations on the future application of the technique to inventory verification

  1. Verification of Open Interactive Markov Chains

    OpenAIRE

    Brazdil, Tomas; Hermanns, Holger; Krcal, Jan; Kretinsky, Jan; Rehak, Vojtech

    2012-01-01

    Interactive Markov chains (IMC) are compositional behavioral models extending both labeled transition systems and continuous-time Markov chains. IMC pair modeling convenience - owed to compositionality properties - with effective verification algorithms and tools - owed to Markov properties. Thus far however, IMC verification did not consider compositionality properties, but considered closed systems. This paper discusses the evaluation of IMC in an open and thus compositional interpretation....

  2. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

    Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...

  3. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    FRAPCON fuel performance code is being modified to be able to model performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort for verification of the FRAPCON thermal model. It was found that, with minor modifications, FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, code input, and calculation results.

  4. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  5. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  6. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.

  7. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables

  8. Verification of RESRAD-RDD. (Version 2.01)

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Jing-Jy [Argonne National Lab. (ANL), Argonne, IL (United States); Flood, Paul E. [Argonne National Lab. (ANL), Argonne, IL (United States); LePoire, David [Argonne National Lab. (ANL), Argonne, IL (United States); Kamboj, Sunita [Argonne National Lab. (ANL), Argonne, IL (United States); Yu, Charley [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-09-01

    In this report, the results generated by RESRAD-RDD version 2.01 are compared with those produced by RESRAD-RDD version 1.7 for different scenarios with different sets of input parameters. RESRAD-RDD version 1.7 is spreadsheet-driven, performing calculations with Microsoft Excel spreadsheets. RESRAD-RDD version 2.01 revamped version 1.7 by using command-driven programs designed with Visual Basic.NET to direct calculations with data saved in Microsoft Access database, and re-facing the graphical user interface (GUI) to provide more flexibility and choices in guideline derivation. Because version 1.7 and version 2.01 perform the same calculations, the comparison of their results serves as verification of both versions. The verification covered calculation results for 11 radionuclides included in both versions: Am-241, Cf-252, Cm-244, Co-60, Cs-137, Ir-192, Po-210, Pu-238, Pu-239, Ra-226, and Sr-90. At first, all nuclidespecific data used in both versions were compared to ensure that they are identical. Then generic operational guidelines and measurement-based radiation doses or stay times associated with a specific operational guideline group were calculated with both versions using different sets of input parameters, and the results obtained with the same set of input parameters were compared. A total of 12 sets of input parameters were used for the verification, and the comparison was performed for each operational guideline group, from A to G, sequentially. The verification shows that RESRAD-RDD version 1.7 and RESRAD-RDD version 2.01 generate almost identical results; the slight differences could be attributed to differences in numerical precision with Microsoft Excel and Visual Basic.NET. RESRAD-RDD version 2.01 allows the selection of different units for use in reporting calculation results. The results of SI units were obtained and compared with the base results (in traditional units) used for comparison with version 1.7. The comparison shows that RESRAD

  9. Anatomy of ageing face.

    Science.gov (United States)

    Ilankovan, V

    2014-03-01

    Ageing is a biological process that results from changes at a cellular level, particularly modification of mRNA. The face is affected by the same physiological process and results in skeletal, muscular, and cutaneous ageing; ligamentous attenuation, descent of fat, and ageing of the appendages. I describe these changes on a structural and clinical basis and summarise possible solutions for a rejuvenation surgeon. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  10. IntraFace

    OpenAIRE

    De la Torre, Fernando; Chu, Wen-Sheng; Xiong, Xuehan; Vicente, Francisco; Ding, Xiaoyu; Cohn, Jeffrey

    2015-01-01

    Within the last 20 years, there has been an increasing interest in the computer vision community in automated facial image analysis algorithms. This has been driven by applications in animation, market research, autonomous-driving, surveillance, and facial editing among others. To date, there exist several commercial packages for specific facial image analysis tasks such as facial expression recognition, facial attribute analysis or face tracking. However, free and easy-to-use software that i...

  11. Tomotherapy: IMRT and tomographic verification

    International Nuclear Information System (INIS)

    Mackie, T.R.

    2000-01-01

    include MLC's and many clinics use them to replace 90% or more of the field-shaping requirements of conventional radiotherapy. Now, several academic centers are treating patients with IMRT using conventional MLC's to modulate the field. IMRT using conventional MLC's have the advantage that the patient is stationary during the treatment and the MLC's can be used in conventional practice. Nevertheless, tomotherapy using the Peacock system delivers the most conformal dose distributions of any commercial system to date. The biggest limitation with the both the NOMOS Peacock tomotherapy system and conventional MLC's for IMRT delivery is the lack of treatment verification. In conventional few-field radiotherapy one relied on portal images to determine if the patient was setup correctly and the beams were correctly positioned. With IMRT the image contrast is superimposed on the beam intensity variation. Conventional practice allowed for monitor unit calculation checks and point dosimeters placed on the patient's surface to verify that the treatment was properly delivered. With IMRT it is impossible to perform hand calculations of monitor units and dosimeters placed on the patient's surface are prone to error due to high gradients in the beam intensity. NOMOS has developed a verification phantom that allows multiple sheets of film to be placed in a light-tight box that is irradiated with the same beam pattern that is used to treat the patient. The optical density of the films are adjusted, normalized, and calibrated and then quantitatively compared with the dose calculated for the phantom delivery. However, this process is too laborious to be used for patient-specific QA. If IMRT becomes ubiquitous and it can be shown that IMRT is useful on most treatment sites then there is a need to design treatment units dedicated to IMRT delivery and verification. Helical tomotherapy is such a redesign. Helical tomotherapy is the delivery of a rotational fan beam while the patient is

  12. Verification of excess defense material

    International Nuclear Information System (INIS)

    Fearey, B.L.; Pilat, J.F.; Eccleston, G.W.; Nicholas, N.J.; Tape, J.W.

    1997-01-01

    The international community in the post-Cold War period has expressed an interest in the International Atomic Energy Agency (IAEA) using its expertise in support of the arms control and disarmament process in unprecedented ways. The pledges of the US and Russian presidents to place excess defense materials under some type of international inspections raises the prospect of using IAEA safeguards approaches for monitoring excess materials, which include both classified and unclassified materials. Although the IAEA has suggested the need to address inspections of both types of materials, the most troublesome and potentially difficult problems involve approaches to the inspection of classified materials. The key issue for placing classified nuclear components and materials under IAEA safeguards is the conflict between these traditional IAEA materials accounting procedures and the US classification laws and nonproliferation policy designed to prevent the disclosure of critical weapon-design information. Possible verification approaches to classified excess defense materials could be based on item accountancy, attributes measurements, and containment and surveillance. Such approaches are not wholly new; in fact, they are quite well established for certain unclassified materials. Such concepts may be applicable to classified items, but the precise approaches have yet to be identified, fully tested, or evaluated for technical and political feasibility, or for their possible acceptability in an international inspection regime. Substantial work remains in these areas. This paper examines many of the challenges presented by international inspections of classified materials

  13. Dosimetric verification of IMRT plans

    International Nuclear Information System (INIS)

    Bulski, W.; Cheimicski, K.; Rostkowska, J.

    2012-01-01

    Intensity modulated radiotherapy (IMRT) is a complex procedure requiring proper dosimetric verification. IMRT dose distributions are characterized by steep dose gradients which enable to spare organs at risk and allow for an escalation of the dose to the tumor. They require large number of radiation beams (sometimes over 10). The fluence measurements for individual beams are not sufficient for evaluation of the total dose distribution and to assure patient safety. The methods used at the Centre of Oncology in Warsaw are presented. In order to measure dose distributions in various cross-sections the film dosimeters were used (radiographic Kodak EDR2 films and radiochromic Gafchromic EBT films). The film characteristics were carefully examined. Several types of tissue equivalent phantoms were developed. A methodology of comparing measured dose distributions against the distributions calculated by treatment planning systems (TPS) was developed and tested. The tolerance level for this comparison was set at 3% difference in dose and 3 mm in distance to agreement. The so called gamma formalism was used. The results of these comparisons for a group of over 600 patients are presented. Agreement was found in 87 % of cases. This film dosimetry methodology was used as a benchmark to test and validate the performance of commercially available 2D and 3D matrices of detectors (ionization chambers or diodes). The results of these validations are also presented. (authors)
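
The 3% / 3 mm comparison using the gamma formalism mentioned above can be illustrated with a simplified one-dimensional gamma index. The sketch below is a generic implementation of the formalism under a global dose-normalization assumption; it is meant to show the calculation, not to reproduce the Centre's actual verification software.

```python
import numpy as np

def gamma_index_1d(measured, calculated, positions_mm,
                   dose_tol=0.03, dist_tol_mm=3.0):
    """Simplified 1-D gamma index (global 3%/3 mm by default).

    For every measured point, gamma is the minimum over all calculated points
    of sqrt((dose difference / dose tolerance)^2 + (distance / distance
    tolerance)^2). A point passes the test when gamma <= 1.
    """
    measured = np.asarray(measured, float)
    calculated = np.asarray(calculated, float)
    positions_mm = np.asarray(positions_mm, float)
    d_ref = measured.max()                       # global normalization dose
    gammas = np.empty_like(measured)
    for i, (d_m, x_m) in enumerate(zip(measured, positions_mm)):
        dose_term = (calculated - d_m) / (dose_tol * d_ref)
        dist_term = (positions_mm - x_m) / dist_tol_mm
        gammas[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
    return gammas

# Toy profiles (arbitrary dose units) on a 1 mm grid.
x = np.arange(0, 10)                              # positions in mm
calc = np.array([10, 20, 40, 70, 90, 100, 90, 70, 40, 20], float)
meas = calc * 1.02                                # measured profile 2% hot
g = gamma_index_1d(meas, calc, x)
print((g <= 1.0).mean())                          # fraction passing 3%/3 mm
```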

  14. Beyond Faces and Expertise

    Science.gov (United States)

    Zhao, Mintao; Bülthoff, Heinrich H.; Bülthoff, Isabelle

    2016-01-01

    Holistic processing—the tendency to perceive objects as indecomposable wholes—has long been viewed as a process specific to faces or objects of expertise. Although current theories differ in what causes holistic processing, they share a fundamental constraint for its generalization: Nonface objects cannot elicit facelike holistic processing in the absence of expertise. Contrary to this prevailing view, here we show that line patterns with salient Gestalt information (i.e., connectedness, closure, and continuity between parts) can be processed as holistically as faces without any training. Moreover, weakening the saliency of Gestalt information in these patterns reduced holistic processing of them, which indicates that Gestalt information plays a crucial role in holistic processing. Therefore, holistic processing can be achieved not only via a top-down route based on expertise, but also via a bottom-up route relying merely on object-based information. The finding that facelike holistic processing can extend beyond the domains of faces and objects of expertise poses a challenge to current dominant theories. PMID:26674129

  15. Barrier Infrared Detector (BIRD)

    Data.gov (United States)

    National Aeronautics and Space Administration — A recent breakthrough in MWIR detector design, has resulted in a high operating temperature (HOT) barrier infrared detector (BIRD) that is capable of spectral...

  16. Infrared Sky Surveys

    Science.gov (United States)

    Price, Stephan D.

    2009-02-01

    A retrospective is given on infrared sky surveys from Thomas Edison’s proposal in the late 1870s to IRAS, the first sensitive mid- to far-infrared all-sky survey, and the mid-1990s experiments that filled in the IRAS deficiencies. The emerging technology for space-based surveys is highlighted, as is the prominent role the US Defense Department, particularly the Air Force, played in developing and applying detector and cryogenic sensor advances to early mid-infrared probe-rocket and satellite-based surveys. This technology was transitioned to the infrared astronomical community in relatively short order and was essential to the success of IRAS, COBE and ISO. Mention is made of several of the little known early observational programs that were superseded by more successful efforts.

  17. Infrared emission from protostars

    International Nuclear Information System (INIS)

    Adams, F.C.; Shu, F.H.

    1985-01-01

    The emergent spectral energy distribution at infrared to radio wavelengths is calculated for the simplest theoretical construct of a low-mass protostar. It is shown that the emergent spectrum in the infrared is insensitive to the details assumed for the temperature profile as long as allowance is made for a transition from optically thick to optically thin conditions and luminosity conservation is enforced at the inner and outer shells. The radiation in the far infrared and submillimeter wavelengths depends on the exact assumptions made for grain opacities at low frequencies. An atlas of emergent spectral energy distributions is presented for a grid of values of the instantaneous mass of the protostar and the mass infall rate. The attenuated contribution of the accretion shock to the near-infrared radiation is considered. 50 references

  18. Visual search of Mooney faces

    Directory of Open Access Journals (Sweden)

    Jessica Emeline Goold

    2016-02-01

    Full Text Available Faces spontaneously capture attention. However, which special attributes of a face underlie this effect are unclear. To address this question, we investigate how gist information, specific visual properties and differing amounts of experience with faces affect the time required to detect a face. Three visual search experiments were conducted investigating the rapidness of human observers to detect Mooney face images. Mooney images are two-toned, ambiguous images. They were used in order to have stimuli that maintain gist information but limit low-level image properties. Results from the experiments show: 1) although upright Mooney faces were searched inefficiently, they were detected more rapidly than inverted Mooney face targets, demonstrating the important role of gist information in guiding attention towards a face. 2) Several specific Mooney face identities were searched efficiently while others were not, suggesting the involvement of specific visual properties in face detection. 3) By providing participants with unambiguous gray-scale versions of the Mooney face targets prior to the visual search task, the targets were detected significantly more efficiently, suggesting that prior experience with Mooney faces improves the ability to extract gist information for rapid face detection. However, a week of training with Mooney face categorization did not lead to even more efficient visual search of Mooney face targets. In summary, these results reveal that specific local image properties cannot account for how faces capture attention. On the other hand, gist information alone cannot account for how faces capture attention either. Prior experience facilitates the effect of gist on visual search of faces, making faces a special object category for guiding attention.

  19. Early results from the Infrared Astronomical Satellite

    International Nuclear Information System (INIS)

    Neugebauer, G.; Beichman, C.A.; Soifer, B.T.

    1984-01-01

    For 10 months the Infrared Astronomical Satellite (IRAS) provided astronomers with what might be termed their first view of the infrared sky on a clear, dark night. Without IRAS, atmospheric absorption and the thermal emission from both the atmosphere and Earthbound telescopes make the task of the infrared astronomer comparable to what an optical astronomer would face if required to work only on cloudy afternoons. IRAS observations are serving astronomers in the same manner as the photographic plates of the Palomar Observatory Sky Survey; just as the optical survey has been used by all astronomers for over three decades, as a source of quantitative information about the sky and as a roadmap for future observations, the results of IRAS will be studied for years to come. IRAS has demonstrated the power of infrared astronomy from space. Already, from a brief look at a minuscule fraction of the data available, we have learned much about the solar system, about nearby stars, about the Galaxy as a whole and about distant extragalactic systems. Comets are much dustier than previously thought. Solid particles, presumably the remnants of the star-formation process, orbit around Vega and other stars and may provide the raw material for planetary systems. Emission from cool interstellar material has been traced throughout the Galaxy all the way to the galactic poles. Both the clumpiness and breadth of the distribution of this material were previously unsuspected. The far-infrared sky away from the galactic plane has been found to be dominated by spiral galaxies, some of which emit more than 50% and as much as 98% of their energy in the infrared - an exciting and surprising revelation. The IRAS mission is clearly the pathfinder for future missions that, to a large extent, will be devoted to the discoveries revealed by IRAS. 8 figures

  20. History of infrared detectors

    Science.gov (United States)

    Rogalski, A.

    2012-09-01

    This paper overviews the history of infrared detector materials starting with Herschel's experiment with thermometer on February 11th, 1800. Infrared detectors are in general used to detect, image, and measure patterns of the thermal heat radiation which all objects emit. At the beginning, their development was connected with thermal detectors, such as thermocouples and bolometers, which are still used today and which are generally sensitive to all infrared wavelengths and operate at room temperature. The second kind of detectors, called the photon detectors, was mainly developed during the 20th Century to improve sensitivity and response time. These detectors have been extensively developed since the 1940's. Lead sulphide (PbS) was the first practical IR detector with sensitivity to infrared wavelengths up to ˜3 μm. After World War II infrared detector technology development was and continues to be primarily driven by military applications. Discovery of variable band gap HgCdTe ternary alloy by Lawson and co-workers in 1959 opened a new area in IR detector technology and has provided an unprecedented degree of freedom in infrared detector design. Many of these advances were transferred to IR astronomy from Departments of Defence research. Later on civilian applications of infrared technology are frequently called "dual-use technology applications." One should point out the growing utilisation of IR technologies in the civilian sphere based on the use of new materials and technologies, as well as the noticeable price decrease in these high cost technologies. In the last four decades different types of detectors are combined with electronic readouts to make detector focal plane arrays (FPAs). Development in FPA technology has revolutionized infrared imaging. Progress in integrated circuit design and fabrication techniques has resulted in continued rapid growth in the size and performance of these solid state arrays.

  1. Additive Manufacturing Infrared Inspection

    Science.gov (United States)

    Gaddy, Darrell; Nettles, Mindy

    2015-01-01

    The Additive Manufacturing Infrared Inspection Task started the development of a real-time dimensional inspection technique and digital quality record for the additive manufacturing process using infrared camera imaging and processing techniques. This project will benefit additive manufacturing by providing real-time inspection of internal geometry that is not currently possible, and by reducing the time and cost of additively manufactured parts through automated real-time dimensional inspections that eliminate the need for post-production inspection.

  2. Challenges facing production grids

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, Ruth; /Fermilab

    2007-06-01

    Today's global communities of users expect quality of service from distributed Grid systems equivalent to that of their local data centers. This must be coupled to ubiquitous access to the ensemble of processing and storage resources across multiple Grid infrastructures. We are still facing significant challenges in meeting these expectations, especially in the underlying security, a sustainable and successful economic model, and smoothing the boundaries between administrative and technical domains. Using the Open Science Grid as an example, I examine the status and challenges of Grids operating in production today.

  3. Mining face equipment

    Energy Technology Data Exchange (ETDEWEB)

    G, Litvinskiy G; Babyuk, G V; Yakovenko, V A

    1981-01-07

    Mining face equipment includes drilling advance wells, drilling using explosives on the contour bore holes, loading and transporting the crushed mass, drilling reinforcement shafts, injecting reinforcement compounds and moving the timber. Camouflet explosives are used to form relaxed rock stress beyond the mining area to decrease costs of reinforcing the mining area by using nonstressed rock in the advance well as support. The strengthening solution is injected through advanced cementing wells before drilling the contour bores as well as through radial cementing wells beyond the timbers following loading and transport of the mining debris. The advance well is 50-80 m.

  4. Face the voice

    DEFF Research Database (Denmark)

    Lønstrup, Ansa

    2014-01-01

    will be based on a reception aesthetic and phenomenological approach, the latter as presented by Don Ihde in his book Listening and Voice: Phenomenologies of Sound, and my analytical sketches will be related to theoretical statements concerning the understanding of voice and media (Cavarero, Dolar, LaBelle, Neumark). Finally, the article will discuss the specific artistic combination and our auditory experience of mediated human voices and sculpturally projected faces in an art museum context under the general conditions of the societal panophonia of disembodied and mediated voices, as promoted by Steven...

  5. Use of social media to encourage face to face communication

    OpenAIRE

    Čufer, Matija; Knežević, Anja

    2017-01-01

    Face-to-face communication is of key importance for successful socialization of a person into a society. Social media makes a good complement to such form of communication. Parents and pedagogical workers must be aware of children not replacing face-to-face communication for communication through the social media in the process of education and growing up. Young people nevertheless frequently communicate through the social media. For this reason, we tried to extract positive features of those...

  6. Face-to-Face Interference in Typical and Atypical Development

    Science.gov (United States)

    Riby, Deborah M.; Doherty-Sneddon, Gwyneth; Whittle, Lisa

    2012-01-01

    Visual communication cues facilitate interpersonal communication. It is important that we look at faces to retrieve and subsequently process such cues. It is also important that we sometimes look away from faces as they increase cognitive load that may interfere with online processing. Indeed, when typically developing individuals hold face gaze…

  7. Assessing Students Perceptions on Intensive Face to Face in Open ...

    African Journals Online (AJOL)

    Therefore, this study assessed students' perception on Intensive Face to Face sessions. The study specifically aimed at identifying students' perception on quality of interaction between tutors and students and between students on the other hand. It also explored the nature of challenges students meet in attending face to ...

  8. Face recognition : implementation of face recognition on AMIGO

    NARCIS (Netherlands)

    Geelen, M.J.A.J.; Molengraft, van de M.J.G.; Elfring, J.

    2011-01-01

    In this (traineeship) report two possible methods of face recognition were presented. The first method describes how to detect and recognize faces by using the SURF algorithm. This algorithm was ultimately not used for recognizing faces, because the Eigenface algorithm was an already tested

  9. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism that trans...

  10. IntraFace.

    Science.gov (United States)

    De la Torre, Fernando; Chu, Wen-Sheng; Xiong, Xuehan; Vicente, Francisco; Ding, Xiaoyu; Cohn, Jeffrey

    2015-05-01

    Within the last 20 years, there has been an increasing interest in the computer vision community in automated facial image analysis algorithms. This has been driven by applications in animation, market research, autonomous-driving, surveillance, and facial editing among others. To date, there exist several commercial packages for specific facial image analysis tasks such as facial expression recognition, facial attribute analysis or face tracking. However, free and easy-to-use software that incorporates all these functionalities is unavailable. This paper presents IntraFace (IF), a publicly-available software package for automated facial feature tracking, head pose estimation, facial attribute recognition, and facial expression analysis from video. In addition, IF includes a newly developed technique for unsupervised synchrony detection to discover correlated facial behavior between two or more persons, a relatively unexplored problem in facial image analysis. In tests, IF achieved state-of-the-art results for emotion expression and action unit detection in three databases, FERA, CK+ and RU-FACS; measured audience reaction to a talk given by one of the authors; and discovered synchrony for smiling in videos of parent-infant interaction. IF is free of charge for academic use at http://www.humansensing.cs.cmu.edu/intraface/.

  11. ITER plasma facing components

    International Nuclear Information System (INIS)

    Kuroda, T.; Vieider, G.; Akiba, M.

    1991-01-01

    This document summarizes results of the Conceptual Design Activities (1988-1990) for the International Thermonuclear Experimental Reactor (ITER) project, namely those that pertain to the plasma facing components of the reactor vessel, of which the main components are the first wall and the divertor plates. After an introduction and an executive summary, the principal functions of the plasma-facing components are delineated, i.e., (i) define the low-impurity region within which the plasma is produced, (ii) absorb the electromagnetic radiation and charged-particle flux from the plasma, and (iii) protect the blanket/shield components from the plasma. A list of critical design issues for the divertor plates and the first wall is given, followed by discussions of the divertor plate design (including the issues of material selection, erosion lifetime, design concepts, thermal and mechanical analysis, operating limits and overall lifetime, tritium inventory, baking and conditioning, safety analysis, manufacture and testing, and advanced divertor concepts) and the first wall design (armor material and design, erosion lifetime, overall design concepts, thermal and mechanical analysis, lifetime and operating limits, tritium inventory, baking and conditioning, safety analysis, manufacture and testing, an alternative first wall design, and the limiters used instead of the divertor plates during start-up). Refs, figs and tabs

  12. Complementary technologies for verification of excess plutonium

    International Nuclear Information System (INIS)

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-01-01

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies: high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy, are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630--670 keV region of the emitted gamma-ray spectrum to determine the ratio of 240Pu to 239Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters the verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime

  13. Aging changes in the face

    Science.gov (United States)

    MedlinePlus medical encyclopedia entry on aging changes in the face: //medlineplus.gov/ency/article/004004.htm

  14. Characterization of double face adhesive sheets for ceramic tile installation

    International Nuclear Information System (INIS)

    Nascimento, Otavio L.; Mansur, Alexandra A.P.; Mansur, Herman S.

    2011-01-01

    The main goal of this work was the characterization of an innovative ceramic tile installation product based on double face adhesive sheets. Density, hardness, tensile strength, x-ray diffraction, infrared spectroscopy, and scanning electron microscopy coupled with energy dispersive spectroscopy assays were conducted. The results are in agreement with the manufacturer specifications and the obtained information will be crucial in the analysis of durability and stability of the ceramic tile system installed with this new product. (author)

  15. Enabling dynamics in face analysis

    NARCIS (Netherlands)

    Dibeklioğlu, H.

    2014-01-01

    Most of the approaches in automatic face analysis rely solely on static appearance. However, temporal analysis of expressions reveals interesting patterns. For a better understanding of the human face, this thesis focuses on temporal changes in the face, and dynamic patterns of expressions. In

  16. Matching score based face recognition

    NARCIS (Netherlands)

    Boom, B.J.; Beumer, G.M.; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.

    2006-01-01

    Accurate face registration is of vital importance to the performance of a face recognition algorithm. We propose a new method: matching score based face registration, which searches for optimal alignment by maximizing the matching score output of a classifier as a function of the different

  17. Side-View Face Recognition

    NARCIS (Netherlands)

    Santemiz, P.; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.

    2010-01-01

    Side-view face recognition is a challenging problem with many applications. Especially in real-life scenarios where the environment is uncontrolled, coping with pose variations up to side-view positions is an important task for face recognition. In this paper we discuss the use of side view face

  18. Forensic Face Recognition: A Survey

    NARCIS (Netherlands)

    Ali, Tauseef; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.; Quaglia, Adamo; Epifano, Calogera M.

    2012-01-01

    The improvements of automatic face recognition during the last 2 decades have disclosed new applications like border control and camera surveillance. A new application field is forensic face recognition. Traditionally, face recognition by human experts has been used in forensics, but now there is a

  19. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparing records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality check. The analysis of the different measurement methods and their limits of accuracy are discussed in the paper

  20. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL)

  1. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

    The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested to improve this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...

  2. Facing the Challenges

    DEFF Research Database (Denmark)

    He, Kai

    2014-01-01

    China's rise signifies a gradual transformation of the international system from unipolarity to a non-unipolar world. As an organization of small and middle powers, ASEAN faces strategic uncertainties brought about by the power transition in the system. Deepening economic interdependence between ... Summit (EAS), the Regional Comprehensive Economic Partnership (RCEP), and the ASEAN Community, to constrain and shape China's behaviour in the region in the post-Cold War era. It argues that due to globalization and economic interdependence, the power transition in the 21st century is different from ... the previous ones. ASEAN can potentially make a great contribution to a peaceful transformation of the international system. How to resolve the South China Sea disputes peacefully will be a critical task for both the ASEAN and Chinese leaders in the next decade or two...

  3. Faced with a dilemma

    DEFF Research Database (Denmark)

    Christensen, Anne Vinggaard; Christiansen, Anne Hjøllund; Petersson, Birgit

    2013-01-01

    ...: A qualitative study consisting of ten individual interviews with Danish midwives, all of whom had taken part in late TOP. RESULTS: Current practice of late TOP resembles the practice of normal deliveries and is influenced by a growing personalisation of the aborted foetus. The midwives strongly supported women's legal right to choose TOP and considerations about the foetus' right to live were suppressed. Midwives experienced a dilemma when faced with aborted foetuses that looked like newborns and when aborted foetuses showed signs of life after a termination. Furthermore, they were critical of how physicians counsel women/couples after prenatal diagnosis. CONCLUSIONS: The midwives' practice in relation to late TOP was characterised by an acknowledgement of the growing ethical status of the foetus and the emotional reactions of the women/couples going through late TOP. Other professions as well as structural...

  4. Infrared source test

    Energy Technology Data Exchange (ETDEWEB)

    Ott, L.

    1994-11-15

    The purpose of the Infrared Source Test (IRST) is to demonstrate the ability to track a ground target with an infrared sensor from an airplane. The system is being developed within the Advance Technology Program's Theater Missile Defense/Unmanned Aerial Vehicle (UAV) section. The IRST payload consists of an Amber Radiance 1 infrared camera system, a computer, a gimbaled mirror, and a hard disk. The processor is a custom R3000 CPU board made by Risq Modular Systems, Inc. for LLNL. The board has ethernet, SCSI, parallel I/O, and serial ports, a DMA channel, a video (frame buffer) interface, and eight MBytes of main memory. The real-time operating system VxWorks has been ported to the processor. The application code is written in C on a host SUN 4 UNIX workstation. The IRST is the result of a combined effort by physicists, electrical and mechanical engineers, and computer scientists.

  5. On Backward-Style Anonymity Verification

    Science.gov (United States)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  6. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  7. 340 and 310 drawing field verification

    International Nuclear Information System (INIS)

    Langdon, J.

    1996-01-01

    The purpose of the drawing field verification work plan is to provide reliable drawings for the 310 Treated Effluent Disposal Facility (TEDF) and 340 Waste Handling Facility (340 Facility). The initial scope of this work plan is to provide field verified and updated versions of all the 340 Facility essential drawings. This plan can also be used for field verification of any other drawings that the facility management directs to be so updated. Any drawings revised by this work plan will be issued in an AutoCAD format

  8. Key Nuclear Verification Priorities - Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. The

  9. Experimental preparation and verification of quantum money

    Science.gov (United States)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10⁶ states in one verification round, limiting the forging probability to 10⁻⁷ based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.

  10. Core power capability verification for PWR NPP

    International Nuclear Information System (INIS)

    Xian Chunyu; Liu Changwen; Zhang Hong; Liang Wei

    2002-01-01

    The principle and methodology of pressurized water reactor nuclear power plant core power capability verification for reload are introduced. The radial and axial power distributions of normal operation (category I or condition I) and abnormal operation (category II or condition II) are simulated by using a neutronics calculation code. The linear power density margin and DNBR margin for both categories, which reflect core safety, are analyzed from the point of view of reactor physics and T/H, and thus the category I operating domain and category II protection set point are verified. Besides, the verification results of the reference NPP are also given

  11. The case for a United Nations verification agency. Disarmament under effective international control. Working paper 26

    International Nuclear Information System (INIS)

    Dorn, A.W.

    1990-07-01

    It is now universally recognized that arms control treaties should be effectively verified. The most objective, flexible and cost-effective means to verify the majority of multilateral treaties would be through a new agency under the United Nations. As a cooperative international effort to develop both the technology and the political framework for arms control verification, a United Nations verification agency (UNVA) would speed up and help secure the disarmament process by: verifying a number of existing and future treaties; investigating alleged breaches of treaties; and certifying, upon request, that voluntary arms control and confidence-building measures have been carried out. This paper presents the case for such a proposal, outlines a possible institutional configuration, considers the possibilities for growth and discusses the challenges facing the establishment of such an agency. (author). 16 refs., 1 tab

  12. The case for a United Nations verification agency. Disarmament under effective international control. Working paper 26

    Energy Technology Data Exchange (ETDEWEB)

    Dorn, A W

    1990-07-01

    It is now universally recognized that arms control treaties should be effectively verified. The most objective, flexible and cost-effective means to verify the majority of multilateral treaties would be through a new agency under the United Nations. As a cooperative international effort to develop both the technology and the political framework for arms control verification, a United Nations verification agency (UNVA) would speed up and help secure the disarmament process by: verifying a number of existing and future treaties; investigating alleged breaches of treaties; and certifying, upon request, that voluntary arms control and confidence-building measures have been carried out. This paper presents the case for such a proposal, outlines a possible institutional configuration, considers the possibilities for growth and discusses the challenges facing the establishment of such an agency. (author). 16 refs., 1 tab.

  13. Powerful infrared emitting diodes

    Directory of Open Access Journals (Sweden)

    Kogan L. M.

    2012-02-01

    Powerful infrared LEDs with emission wavelengths 805 ± 10, 870 ± 20 and 940 ± 10 nm developed at SPC OED "OPTEL" are presented in the article. The radiant intensity of beam diodes is up to 4 W/sr in the continuous mode and up to 100 W/sr in the pulse mode. The radiation power of wide-angle LEDs reaches 1 W in continuous mode. The external quantum efficiency of emission of the IR diodes runs up to 30%. Infrared diode modules with a block of flat Fresnel lenses and radiant intensity up to 70 W/sr have also been created.

  14. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and

  15. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Andersen, J.H.; Skou, A.

    1995-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  16. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Andersen, J.H.; Kristensen, C.H.; Skou, A.

    1996-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  17. Exploring the unconscious using faces.

    Science.gov (United States)

    Axelrod, Vadim; Bar, Moshe; Rees, Geraint

    2015-01-01

    Understanding the mechanisms of unconscious processing is one of the most substantial endeavors of cognitive science. While there are many different empirical ways to address this question, the use of faces in such research has proven exceptionally fruitful. We review here what has been learned about unconscious processing through the use of faces and face-selective neural correlates. A large number of cognitive systems can be explored with faces, including emotions, social cueing and evaluation, attention, multisensory integration, and various aspects of face processing. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Face Attention Network: An Effective Face Detector for the Occluded Faces

    OpenAIRE

    Wang, Jianfeng; Yuan, Ye; Yu, Gang

    2017-01-01

    The performance of face detection has been largely improved with the development of convolutional neural network. However, the occlusion issue due to mask and sunglasses, is still a challenging problem. The improvement on the recall of these occluded cases usually brings the risk of high false positives. In this paper, we present a novel face detector called Face Attention Network (FAN), which can significantly improve the recall of the face detection problem in the occluded case without comp...

  19. Heterogeneous sharpness for cross-spectral face recognition

    Science.gov (United States)

    Cao, Zhicheng; Schmid, Natalia A.

    2017-05-01

    Matching images acquired in different electromagnetic bands remains a challenging problem. An example of this type of comparison is matching active or passive infrared (IR) against a gallery of visible face images, known as cross-spectral face recognition. Among many unsolved issues is the one of quality disparity of the heterogeneous images. Images acquired in different spectral bands are of unequal image quality due to distinct imaging mechanism, standoff distances, or imaging environment, etc. To reduce the effect of quality disparity on the recognition performance, one can manipulate images to either improve the quality of poor-quality images or to degrade the high-quality images to the level of the quality of their heterogeneous counterparts. To estimate the level of discrepancy in quality of two heterogeneous images a quality metric such as image sharpness is needed. It provides a guidance in how much quality improvement or degradation is appropriate. In this work we consider sharpness as a relative measure of heterogeneous image quality. We propose a generalized definition of sharpness by first achieving image quality parity and then finding and building a relationship between the image quality of two heterogeneous images. Therefore, the new sharpness metric is named heterogeneous sharpness. Image quality parity is achieved by experimentally finding the optimal cross-spectral face recognition performance where quality of the heterogeneous images is varied using a Gaussian smoothing function with different standard deviation. This relationship is established using two models; one of them involves a regression model and the other involves a neural network. To train, test and validate the model, we use composite operators developed in our lab to extract features from heterogeneous face images and use the sharpness metric to evaluate the face image quality within each band. Images from three different spectral bands visible light, near infrared, and short
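
    The quality-parity procedure outlined in this abstract (progressively smoothing the sharper modality with a Gaussian filter and keeping the standard deviation that maximizes cross-spectral recognition performance) can be prototyped roughly as below. The gradient-based sharpness measure and the user-supplied matching function are stand-ins for the composite-operator features and recognition pipeline used by the authors; treat this as a sketch, not their implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def gradient_sharpness(img):
    """A simple sharpness proxy: mean gradient magnitude of the image."""
    gx = sobel(img.astype(float), axis=0)
    gy = sobel(img.astype(float), axis=1)
    return np.mean(np.hypot(gx, gy))

def best_smoothing(visible_imgs, ir_imgs, match_rate, sigmas=np.linspace(0, 4, 17)):
    """Sweep the Gaussian std dev applied to the sharper (visible) images and keep
    the value that maximizes the cross-spectral matching rate returned by match_rate."""
    scores = []
    for s in sigmas:
        blurred = [gaussian_filter(v, s) for v in visible_imgs]
        scores.append(match_rate(blurred, ir_imgs))  # user-supplied recognition rate
    best = sigmas[int(np.argmax(scores))]
    return best, scores
```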

  20. Energy conservation using face detection

    Science.gov (United States)

    Deotale, Nilesh T.; Kalbande, Dhananjay R.; Mishra, Akassh A.

    2011-10-01

    Computerized face detection is concerned with the difficult task of automatically locating faces in a video signal of a person. It has several applications like face recognition, simultaneous multiple face processing, biometrics, security, video surveillance, human computer interface, image database management, autofocus in digital cameras, and selecting regions of interest in photo slideshows that use pan-and-scale. The present paper deals with energy conservation using face detection. Automating the process on a computer requires the use of various image processing techniques. There are various methods that can be used for face detection, such as contour tracking methods, template matching, controlled background, model based, motion based and color based. Basically, the video of the subject is converted into images, which are further selected manually for processing. However, several factors like poor illumination, movement of the face, viewpoint-dependent physical appearance, acquisition geometry, imaging conditions and compression artifacts make face detection difficult. This paper reports an algorithm for conservation of energy using face detection for various devices. The paper suggests that energy conservation can be achieved by detecting the face, reducing the brightness of the complete image, and then adjusting the brightness of the particular area of the image where the face is located using histogram equalization.
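
    As a rough illustration of the scheme this abstract describes (detect the face, dim the rest of the frame, and equalize the face region), one possible OpenCV sketch is given below. The Haar-cascade detector, the dimming factor and the luma equalization step are assumptions for illustration, not the authors' implementation.

```python
import cv2

def dim_except_face(frame_bgr, dim_factor=0.4):
    """Detect the largest face, dim everything else, and equalize the face region."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    out = (frame_bgr * dim_factor).astype("uint8")  # dimmed copy of the whole frame
    if len(faces) > 0:
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])   # keep the largest detection
        roi = cv2.cvtColor(frame_bgr[y:y+h, x:x+w], cv2.COLOR_BGR2YCrCb)
        roi[:, :, 0] = cv2.equalizeHist(roi[:, :, 0])        # histogram equalization on luma
        out[y:y+h, x:x+w] = cv2.cvtColor(roi, cv2.COLOR_YCrCb2BGR)
    return out
```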

  1. [Comparative studies of face recognition].

    Science.gov (United States)

    Kawai, Nobuyuki

    2012-07-01

    Every human being is proficient in face recognition. However, the reason for and the manner in which humans have attained such an ability remain unknown. These questions can be best answered through comparative studies of face recognition in non-human animals. Studies in both primates and non-primates show that not only primates, but also non-primates possess the ability to extract information from their conspecifics and from human experimenters. Neural specialization for face recognition is shared with mammals in distant taxa, suggesting that face recognition evolved earlier than the emergence of mammals. A recent study indicated that a social insect, the golden paper wasp, can distinguish their conspecific faces, whereas a closely related species, which has a less complex social lifestyle with just one queen ruling a nest of underlings, did not show strong face recognition for their conspecifics. Social complexity and the need to differentiate between one another likely led humans to evolve their face recognition abilities.

  2. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    Science.gov (United States)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  3. A Syntactic-Semantic Approach to Incremental Verification

    OpenAIRE

    Bianculli, Domenico; Filieri, Antonio; Ghezzi, Carlo; Mandrioli, Dino

    2013-01-01

    Software verification of evolving systems is challenging mainstream methodologies and tools. Formal verification techniques often conflict with the time constraints imposed by change management practices for evolving systems. Since changes in these systems are often local to restricted parts, an incremental verification approach could be beneficial. This paper introduces SiDECAR, a general framework for the definition of verification procedures, which are made incremental by the framework...

  4. Drunk identification using far infrared imagery based on DCT features in DWT domain

    Science.gov (United States)

    Xie, Zhihua; Jiang, Peng; Xiong, Ying; Li, Ke

    2016-10-01

    The drunk driving problem is a serious threat to traffic safety, and automatic drunk driver identification is vital to improving it. This paper deals with automatic drunk driver detection using far infrared thermal images and holistic features. To improve the robustness of drunk driver detection, instead of traditional local pixels, a holistic feature extraction method is proposed to obtain compact and discriminative features for infrared face drunk identification. Discrete cosine transform (DCT) in the discrete wavelet transform (DWT) domain is used to extract the useful features in infrared face images because of its high speed. Then, the first six DCT coefficients are retained for drunk classification by means of "Z" scanning. Finally, an SVM is applied to classify the drunk person. Experimental results illustrate that the accuracy rate of the proposed infrared face drunk identification can reach 98.5% with high computation efficiency, which can be applied in a real drunk driver detection system.
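
    A minimal sketch of the feature pipeline described above (DWT of the thermal face image, 2-D DCT of the low-frequency subband, retention of the first six coefficients in zigzag order, SVM classification) might look as follows. The choice of wavelet, the interpretation of "Z" scanning as zigzag ordering, and the SVM settings are assumptions, not the authors' exact configuration.

```python
import numpy as np
import pywt
from scipy.fft import dctn
from sklearn.svm import SVC

def zigzag_first_k(block, k=6):
    """Return the first k coefficients of a 2-D array in zigzag order."""
    h, w = block.shape
    order = sorted(((i, j) for i in range(h) for j in range(w)),
                   key=lambda p: (p[0] + p[1], p[0] if (p[0] + p[1]) % 2 else p[1]))
    return np.array([block[i, j] for i, j in order[:k]])

def holistic_features(face_img, wavelet="haar", k=6):
    """DCT-in-DWT features: DWT low-frequency subband, 2-D DCT, first k zigzag coefficients."""
    approx, _ = pywt.dwt2(face_img.astype(float), wavelet)  # LL subband
    coeffs = dctn(approx, norm="ortho")                     # 2-D DCT of the subband
    return zigzag_first_k(coeffs, k)

def train_classifier(images, labels):
    """Hypothetical training step: thermal face images with sober(0)/drunk(1) labels."""
    X = np.vstack([holistic_features(img) for img in images])
    clf = SVC(kernel="rbf", gamma="scale")
    clf.fit(X, labels)
    return clf
```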

  5. Mathematical description for the measurement and verification of energy efficiency improvement

    International Nuclear Information System (INIS)

    Xia, Xiaohua; Zhang, Jiangfeng

    2013-01-01

    Highlights: • A mathematical model for the measurement and verification problem is established. • Criteria to choose the four measurement and verification options are given. • Optimal measurement and verification plan is defined. • Calculus of variations and optimal control can be further applied. - Abstract: Insufficient energy supply is a problem faced by many countries, and energy efficiency improvement is identified as the quickest and most effective solution to this problem. Many energy efficiency projects are therefore initiated to reach various energy saving targets. These energy saving targets need to be measured and verified, and in many countries such a measurement and verification (M and V) activity is guided by the International Performance Measurement and Verification Protocol (IPMVP). However, M and V is widely regarded as an inaccurate science: an engineering practice relying heavily on professional judgement. This paper presents a mathematical description of the energy efficiency M and V problem and thus casts into a scientific framework the basic M and V concepts, propositions, techniques and methodologies. For this purpose, a general description of energy system modeling is provided to facilitate the discussion, strict mathematical definitions for baseline and baseline adjustment are given, and the M and V plan development is formulated as an M and V modeling problem. An optimal M and V plan is therefore obtained through solving a calculus of variation, or equivalently, an optimal control problem. This approach provides a fruitful source of research problems by which optimal M and V plans under various practical constraints can be determined. With the aid of linear control system models, this mathematical description also provides sufficient conditions for M and V practitioners to determine which one of the four M and V options in IPMVP should be used in a practical M and V project
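
    A concrete, if much simplified, instance of the baseline and baseline-adjustment concepts formalized in this paper is the routine savings calculation: savings equal adjusted-baseline energy minus reporting-period energy. The linear regression baseline below is only an illustrative model choice, not the optimal M and V plan derived in the paper.

```python
import numpy as np

def fit_baseline(driver, energy):
    """Fit a simple linear baseline model: energy ~ a + b * driver (e.g. degree-days)."""
    b, a = np.polyfit(driver, energy, 1)  # slope first, intercept second
    return a, b

def avoided_energy(a, b, driver_reporting, energy_reporting):
    """Adjusted baseline minus measured consumption over the reporting period."""
    adjusted_baseline = a + b * np.asarray(driver_reporting)
    return float(np.sum(adjusted_baseline - np.asarray(energy_reporting)))

# Example with made-up monthly data (degree-days and kWh):
a, b = fit_baseline(np.array([100, 150, 200, 250]), np.array([5200, 6100, 7050, 7900]))
savings = avoided_energy(a, b, [120, 180], [5100, 6000])
```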

  6. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  7. Verification ghosts. The changing political environment of the IAEA

    International Nuclear Information System (INIS)

    Redden, K.J.

    2003-01-01

    Six years ago, Dr. Hans Blix wrote in the IAEA Bulletin of a 'general optimism about further arms control and verification.' At the time, world events warranted such a prognosis; the IAEA was riding a wave of momentum after its instrumental role in the roll-back of the South African nuclear weapons program and bringing Ukraine, Belarus, and Kazakhstan into the Nuclear Non Proliferation Treaty (NPT) as non-nuclear-weapon States. The NPT's indefinite extension was only two years old, and the most pressing challenges, while recognizable, were somewhat stagnant. Today, some tidings elicit similar optimism. The IAEA's increasing efforts to combat terrorism and the decision by Member States to depart from nearly 20 years of zero real growth budgetary policy are remarkable testaments to the Agency's adaptability and credibility in the face of new threats. And with the worldwide frenzy over terrorism and redoubled phobia of weapons of mass destruction (WMD), the Agency garners public attention now as never before. Emblematic of this recent upsurge in political attention, US President George W. Bush's annual State of the Union address in 2003 mentioned supporting the IAEA as a specific priority of his administration, the first mention of the Agency in that speech since President Eisenhower in 1961 lauded its creation under 'Atoms for Peace'. Such visibility portends a future with prospects for overcoming bureaucratic inertia and effecting significant changes to the Agency's benefit. But with that visibility has come an uncertainty about the IAEA's role in world affairs. Despite being able to resolve most benign problems more easily, the Agency must operate in an environment haunted by the non-proliferation analogue of Charles Dickens' triumvirate specters: the ghosts of verification challenges past, present and future -namely, the cessation of UN-mandated inspections in Iraq, the difficulties ensuring compliance in North Korea and Iran, and the need to maintain the IAEA

  8. The construction FACE database - Codifying the NIOSH FACE reports.

    Science.gov (United States)

    Dong, Xiuwen Sue; Largay, Julie A; Wang, Xuanwen; Cain, Chris Trahan; Romano, Nancy

    2017-09-01

    The National Institute for Occupational Safety and Health (NIOSH) has published reports detailing the results of investigations on selected work-related fatalities through the Fatality Assessment and Control Evaluation (FACE) program since 1982. Information from construction-related FACE reports was coded into the Construction FACE Database (CFD). Use of the CFD was illustrated by analyzing major CFD variables. A total of 768 construction fatalities were included in the CFD. Information on decedents, safety training, use of PPE, and FACE recommendations were coded. Analysis shows that one in five decedents in the CFD died within the first two months on the job; 75% and 43% of reports recommended having safety training or installing protection equipment, respectively. Comprehensive research using FACE reports may improve understanding of work-related fatalities and provide much-needed information on injury prevention. The CFD allows researchers to analyze the FACE reports quantitatively and efficiently. Copyright © 2017 Elsevier Ltd and National Safety Council. All rights reserved.

  9. The infrared retina

    International Nuclear Information System (INIS)

    Krishna, Sanjay

    2009-01-01

    As infrared imaging systems have evolved from the first generation of linear devices to the second generation of small format staring arrays to the present 'third-gen' systems, there is an increased emphasis on large area focal plane arrays (FPAs) with multicolour operation and higher operating temperature. In this paper, we discuss how one needs to develop an increased functionality at the pixel level for these next generation FPAs. This functionality could manifest itself as spectral, polarization, phase or dynamic range signatures that could extract more information from a given scene. This leads to the concept of an infrared retina, which is an array that works similarly to the human eye that has a 'single' FPA but multiple cones, which are photoreceptor cells in the retina of the eye that enable the perception of colour. These cones are then coupled with powerful signal processing techniques that allow us to process colour information from a scene, even with a limited basis of colour cones. Unlike present day multi or hyperspectral systems, which are bulky and expensive, the idea would be to build a poor man's 'infrared colour' camera. We use examples such as plasmonic tailoring of the resonance or bias dependent dynamic tuning based on quantum confined Stark effect or incorporation of avalanche gain to achieve embodiments of the infrared retina.

  10. Feasibility of tropospheric water vapor profiling using infrared heterodyne differential absorption lidar

    Energy Technology Data Exchange (ETDEWEB)

    Grund, C.J.; Hardesty, R.M. [National Oceanic and Atmospheric Administration Environmental Technology Laboratoy, Boulder, CO (United States); Rye, B.J. [Univ. of Colorado, Boulder, CO (United States)

    1996-04-01

    The development and verification of realistic climate model parameterizations for clouds and net radiation balance and the correction of other site sensor observations for interferences due to the presence of water vapor are critically dependent on water vapor profile measurements. In this study, we develop system performance models and examine the potential of infrared differential absorption lidar (DIAL) to determine the concentration of water vapor.

  11. Elektronická komunikace vs. komunikace face to face

    OpenAIRE

    Pipková, Zuzana

    2009-01-01

    This thesis deals with new forms of communication particularly electronic ones. The main goal is to distinguish electronic communication from face to face communication in a way that differs from traditional media theories. By using examples of the most important medium in electronic communication, Internet, it is shown that nowadays we have such forms of electronic communication that surpass the traditional classification of oral/written communication, immediate/mediate communication, face t...

  12. Face au risque

    CERN Document Server

    Grosse, Christian; November, Valérie

    2007-01-01

    This collective volume on risk inaugurates the L'ÉQUINOXE collection. Anchored in history in order to measure continuities and ruptures, it illustrates the way in which the human sciences assess and measure the collective stakes of risk on the political, scientific, energy, legal and ethical levels. May it nourish reflection on the culture and prevention of risk. Its epidemic, ecological, social, terrorist and military forms feed current fears, structure security projects and constitute - no doubt - the major challenges to our modernity. In the wake of the scientific richness of Equinoxe, L'ÉQUINOXE inherits its spirit by taking up in turn the wager of contributing - not without risk - to enriching, in French-speaking Switzerland and elsewhere, the editorial field of the human sciences that our society needs to forge its bearings. After Face au risque, Du sens des Lumières will follow this autumn. (MICHEL PORRET, Full Professor at the F...

  13. Many Faces of Migrations

    Directory of Open Access Journals (Sweden)

    Milica Antić Gaber

    2013-12-01

    We believe that in the present thematic issue we have succeeded in capturing an important part of the modern European research dynamic in the field of migration. In addition to well-known scholars in this field, several young authors at the beginning of their research careers have been shortlisted for publication. We are glad of their success, as it bodes well for the vibrancy of this research area in the future. At the same time, we were pleased to receive responses to the invitation from representatives of so many disciplines, and that the number of papers received significantly exceeded the maximum volume of the journal. Recognising and understanding the many faces of migration are important steps towards the comprehensive knowledge needed to successfully meet the challenges of migration issues today and even more so in the future. It is therefore of utmost importance that researchers find ways of transferring their academic knowledge into practice – to all levels of education, the media, the wider public and, of course, the decision makers in local, national and international institutions. The call also applies to all authors in this issue of the journal.

  14. Facing the Crises

    Directory of Open Access Journals (Sweden)

    Moira Baker

    2014-12-01

    Timely, provocative, and theoretically sophisticated, the essays comprising In the Face of Crises: Anglophone Literature in the Postmodern World situate their work amid several critical global concerns: the devastation wreaked by global capitalism following the worldwide financial crash, the financial sector’s totalizing grip upon the world economy, the challenge to traditional definitions of “human nature” and identity posed by technologies of the body and of warfare, the quest of indigenous communities for healing from the continuing traumatic effects of colonization, and the increasing corporatization of the academy as an apparatus of the neo-liberal state – to specify only a few. Edited by Professors Ljubica Matek and Jasna Poljak Rehlicki, these essays deploy a broad range of contemporary theories, representing recent developments in cultural studies, the new economic criticism, postcolonial film studies, feminism and gender studies, and the new historicism. The eleven essays selected by Matek and Rehlicki offer convincing support for their claim that humanistic research delving into Anglophone literature, far from being a “non-profitable” pursuit in an increasingly technologized society, affords clarifying insights into contemporary “economic, cultural, and social processes in the globalizing and globalized culture of the West” (ix.

  15. Photographic infrared spectroscopy and near infrared photometry of Be stars

    International Nuclear Information System (INIS)

    Swings, J.P.

    1976-01-01

    Two topics are tackled in this presentation: spectroscopy and photometry. The following definitions are chosen: photographic infrared spectroscopy (wavelengths Hα ≤ λ < 1.2 μm); near infrared photometry (wavebands: 1.6 μm ≤ λ ≤ 20 μm). Near infrared spectroscopy and photometry of classical and peculiar Be stars are discussed and some future developments in the field are outlined. (Auth.)

  16. Standardized Definitions for Code Verification Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-14

    This document contains standardized definitions for several commonly used code verification test problems. These definitions are intended to contain sufficient information to set up the test problem in a computational physics code. These definitions are intended to be used in conjunction with exact solutions to these problems generated using ExactPack, www.github.com/lanl/exactpack.

  17. 9 CFR 417.8 - Agency verification.

    Science.gov (United States)

    2010-01-01

    ....8 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis to...

  18. Timed verification with µCRL

    NARCIS (Netherlands)

    Blom, S.C.C.; Ioustinova, N.; Sidorova, N.; Broy, M.; Zamulin, A.V.

    2003-01-01

    µCRL is a process algebraic language for specification and verification of distributed systems. µCRL allows the description of temporal properties of distributed systems, but it has no explicit reference to time. In this work we propose a manner of introducing discrete time without extending the language.

  19. Programmable electronic system design & verification utilizing DFM

    NARCIS (Netherlands)

    Houtermans, M.J.M.; Apostolakis, G.E.; Brombacher, A.C.; Karydas, D.M.

    2000-01-01

    The objective of this paper is to demonstrate the use of the Dynamic Flowgraph Methodology (DFM) during the design and verification of programmable electronic safety-related systems. The safety system consists of hardware as well as software. This paper explains and demonstrates the use of DFM to

  20. Verification of Software Components: Addressing Unbounded Parallelism

    Czech Academy of Sciences Publication Activity Database

    Adámek, Jiří

    2007-01-01

    Roč. 8, č. 2 (2007), s. 300-309 ISSN 1525-9293 R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords : software components * formal verification * unbounded parallelism Subject RIV: JC - Computer Hardware ; Software

  1. A Comparison of Modular Verification Techniques

    DEFF Research Database (Denmark)

    Andersen, Henrik Reif; Staunstrup, Jørgen; Maretti, Niels

    1997-01-01

    This paper presents and compares three techniques for mechanized verification of state oriented design descriptions. One is a traditional forward generation of a fixed point characterizing the reachable states. The two others can utilize a modular structure provided by the designer. One requires

  2. Formal Verification of Circuits and Systems

    Indian Academy of Sciences (India)

    The problem of validation and verification of correctness of present day hardware and software systems has become extremely complex due to the enormous growth in the size of the designs. Today typically 50% to 70% of the design cycle time is spent in verifying correctness. While simulation remains a predominant form ...

  3. Model Checking - Automated Verification of Computational Systems

    Indian Academy of Sciences (India)

    Madhavan Mukund. General Article, Resonance – Journal of Science Education, Volume 14, Issue 7, July 2009, pp. 667-681.

  4. Formal Verification of Quasi-Synchronous Systems

    Science.gov (United States)

    2015-07-01


  5. Behaviour Protocols Verification: Fighting State Explosion

    Czech Academy of Sciences Publication Activity Database

    Mach, M.; Plášil, František; Kofroň, Jan

    2005-01-01

    Roč. 6, č. 2 (2005), s. 22-30 ISSN 1525-9293 R&D Projects: GA ČR(CZ) GA102/03/0672 Institutional research plan: CEZ:AV0Z10300504 Keywords : formal verification * software components * state explosion * behavior protocols * parse trees Subject RIV: JC - Computer Hardware ; Software

  6. Verification of Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Jacobsen, Lasse; Jacobsen, Morten; Møller, Mikael Harkjær

    2011-01-01

    of interesting theoretical properties distinguishing them from other time extensions of Petri nets. We shall give an overview of the recent theory developed in the verification of TAPN extended with features like read/transport arcs, timed inhibitor arcs and age invariants. We will examine in detail...

  7. Unification & sharing in timed automata verification

    DEFF Research Database (Denmark)

    David, Alexandre; Behrmann, Gerd; Larsen, Kim Guldstrand

    2003-01-01

    We present the design of the model-checking engine and internal data structures for the next generation of UPPAAL. The design is based on a pipeline architecture where each stage represents one independent operation in the verification algorithms. The architecture is based on essentially one shar...

  8. A Verification Framework for Agent Communication

    NARCIS (Netherlands)

    Eijk, R.M. van; Boer, F.S. de; Hoek, W. van der; Meyer, J-J.Ch.

    2003-01-01

    In this paper, we introduce a verification method for the correctness of multiagent systems as described in the framework of acpl (Agent Communication Programming Language). The computational model of acpl consists of an integration of the two different paradigms of ccp (Concurrent Constraint

  9. A Typical Verification Challenge for the GRID

    NARCIS (Netherlands)

    van de Pol, Jan Cornelis; Bal, H. E.; Brim, L.; Leucker, M.

    2008-01-01

    A typical verification challenge for the GRID community is presented. The concrete challenge is to implement a simple recursive algorithm for finding the strongly connected components in a graph. The graph is typically stored in the collective memory of a number of computers, so a distributed

  10. Zero leakage quantization scheme for biometric verification

    NARCIS (Netherlands)

    Groot, de J.A.; Linnartz, J.P.M.G.

    2011-01-01

    Biometrics gain increasing interest as a solution for many security issues, but privacy risks exist in case we do not protect the stored templates well. This paper presents a new verification scheme, which protects the secrets of the enrolled users. We will show that zero leakage is achieved if

  11. Hydrostatic Paradox: Experimental Verification of Pressure Equilibrium

    Science.gov (United States)

    Kodejška, C.; Ganci, S.; Ríha, J.; Sedlácková, H.

    2017-01-01

    This work is focused on the experimental verification of the balance between the atmospheric pressure acting on the sheet of paper, which encloses the cylinder completely or partially filled with water from below, where the hydrostatic pressure of the water column acts against the atmospheric pressure. First of all this paper solves a theoretical…
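
    For orientation, the balance being tested can be checked with a one-line estimate: atmospheric pressure (about 101 kPa) corresponds to a water column of roughly 10 m, so for any laboratory cylinder the hydrostatic pressure of the enclosed column is far smaller and the paper sheet stays in place. A small illustrative calculation (values assumed, not taken from the paper):

```python
RHO_WATER = 1000.0   # density of water, kg/m^3
G = 9.81             # gravitational acceleration, m/s^2
P_ATM = 101325.0     # standard atmospheric pressure, Pa

def hydrostatic_pressure(height_m):
    """Pressure exerted by a water column of the given height."""
    return RHO_WATER * G * height_m

h = 0.30  # a 30 cm cylinder of water
print(hydrostatic_pressure(h))          # ~2.9 kPa
print(hydrostatic_pressure(h) < P_ATM)  # True: atmospheric pressure dominates
```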

  12. Verification of wet blasting decontamination technology

    International Nuclear Information System (INIS)

    Matsubara, Sachito; Murayama, Kazunari; Yoshida, Hirohisa; Igei, Shigemitsu; Izumida, Tatsuo

    2013-01-01

    Macoho Co., Ltd. participated in the projects of 'Decontamination Verification Test FY 2011 by the Ministry of the Environment' and 'Decontamination Verification Test FY 2011 by the Cabinet Office.' In these projects we tested the use of a wet blasting technology for decontamination of rubble and roads contaminated by the accident at the Fukushima Daiichi Nuclear Power Plant of the Tokyo Electric Power Company. As a result of the verification test, the wet blasting decontamination technology showed a decontamination rate of 60-80% for concrete paving, interlocking and dense-graded asphalt pavement when applied to the decontamination of roads. When it was applied to rubble decontamination, the decontamination rate was 50-60% for gravel and approximately 90% for concrete and wood. It was thought that Cs-134 and Cs-137 attached to the fine sludge scraped off from the decontamination object, and the sludge was found to be separated from the abrasives by wet cyclone classification: the activity concentration of the abrasives is 1/30 or less than that of the sludge. The result shows that the abrasives can be reused without problems when the wet blasting decontamination technology is used. (author)

  13. Using timing information in speaker verification

    CSIR Research Space (South Africa)

    Van Heerden, CJ

    2005-11-01

    This paper presents an analysis of temporal information as a feature for use in speaker verification systems. The relevance of temporal information in a speaker’s utterances is investigated, both with regard to improving the robustness of modern...

  14. Sampling for the verification of materials balances

    International Nuclear Information System (INIS)

    Avenhaus, R.; Goeres, H.J.; Beedgen, R.

    1983-08-01

    The results of a theory for verification of nuclear materials balance data are presented. The sampling theory is based on two diversion models where also a combination of models is taken into account. The theoretical considerations are illustrated with numerical examples using the data of a highly enriched uranium fabrication plant. (orig.) [de

  15. A multistage framework for dismount spectral verification in the VNIR

    Science.gov (United States)

    Rosario, Dalton

    2013-05-01

    A multistage algorithm suite is proposed for a specific target detection/verification scenario, where a visible/near infrared hyperspectral (HS) sample is assumed to be available as the only cue from a reference image frame. The target is a suspicious dismount. The suite first applies a biometric based human skin detector to focus the attention of the search. Using as reference all of the bands in the spectral cue, the suite follows with a Bayesian Lasso inference stage designed to isolate pixels representing the specific material type cued by the user and worn by the human target (e.g., hat, jacket). In essence, the search focuses on testing material types near skin pixels. The third stage imposes an additional constraint through RGB color quantization and distance metric checking, limiting even further the search for material types in the scene having visible color similar to the target visible color. Using the proposed cumulative evidence strategy produced some encouraging range-invariant results on real HS imagery, dramatically reducing to zero the false alarm rate on the example dataset. These results were in contrast to the results independently produced by each one of the suite's stages, as the spatial areas of each stage's high false alarm outcome were mutually exclusive in the imagery. These conclusions also apply to results produced by other standard methods, in particular the kernel SVDD (support vector data description) and matched filter, as shown in the paper.
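
    The cumulative-evidence strategy of the suite (focus attention with a skin detector, test material types near skin against the spectral cue, then check visible-color consistency) can be summarized by the skeleton below. The stage implementations here (a precomputed skin mask, a spectral-angle test, an RGB distance check) are deliberate simplifications standing in for the biometric skin detector, the Bayesian Lasso inference and the color quantization described in the paper.

```python
import numpy as np

def spectral_angle(a, b):
    """Angle between two spectra; small values indicate similar material."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def verify_pixels(cube, skin_mask, cue_spectrum, cue_rgb, rgb_image,
                  angle_thresh=0.1, rgb_thresh=40.0, radius=15):
    """Cascade: keep only pixels near detected skin, spectrally close to the cue,
    and with a visible color close to the cued garment color."""
    h, w, _ = cube.shape
    near_skin = np.zeros((h, w), bool)
    for y, x in zip(*np.nonzero(skin_mask)):      # dilate skin detections by `radius`
        near_skin[max(0, y - radius):y + radius, max(0, x - radius):x + radius] = True

    hits = np.zeros((h, w), bool)
    for y, x in zip(*np.nonzero(near_skin)):
        if spectral_angle(cube[y, x], cue_spectrum) < angle_thresh:
            if np.linalg.norm(rgb_image[y, x].astype(float) - cue_rgb) < rgb_thresh:
                hits[y, x] = True
    return hits
```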

  16. Adaptation improves face trustworthiness discrimination

    Directory of Open Access Journals (Sweden)

    Bruce D Keefe

    2013-06-01

    Adaptation to facial characteristics, such as gender and viewpoint, has been shown to both bias our perception of faces and improve facial discrimination. In this study, we examined whether adapting to two levels of face trustworthiness improved sensitivity around the adapted level. Facial trustworthiness was manipulated by morphing between trustworthy and untrustworthy prototypes, each generated by morphing eight trustworthy and eight untrustworthy faces respectively. In the first experiment, just-noticeable differences (JNDs) were calculated for an untrustworthy face after participants adapted to an untrustworthy face, a trustworthy face, or did not adapt. In the second experiment, the three conditions were identical, except that JNDs were calculated for a trustworthy face. In the third experiment we examined whether adapting to an untrustworthy male face improved discrimination to an untrustworthy female face. In all experiments, participants completed a two-interval forced-choice adaptive staircase procedure, in which they judged which face was more untrustworthy. JNDs were derived from a psychometric function fitted to the data. Adaptation improved sensitivity to faces conveying the same level of trustworthiness when compared to no adaptation. When adapting to and discriminating around a different level of face trustworthiness there was no improvement in sensitivity and JNDs were equivalent to those in the no adaptation condition. The improvement in sensitivity was found to occur even when adapting to a face with different gender and identity. These results suggest that adaptation to facial trustworthiness can selectively enhance mechanisms underlying the coding of facial trustworthiness to improve perceptual sensitivity. These findings have implications for the role of our visual experience in the decisions we make about the trustworthiness of other individuals.
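
    The JND estimation this abstract relies on (a two-interval forced-choice staircase with a psychometric function fitted to the resulting data) is standard psychophysics. One possible way to recover a JND from proportion-correct data is to fit a cumulative Gaussian with a 50% floor and read off the stimulus difference at a criterion level; the 75%-correct criterion below is an illustrative assumption.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(delta, mu, sigma):
    """Cumulative-Gaussian psychometric function for a 2IFC task (50% chance floor)."""
    return 0.5 + 0.5 * norm.cdf(delta, loc=mu, scale=sigma)

def estimate_jnd(deltas, prop_correct, criterion=0.75):
    """Fit the psychometric function and return the stimulus difference at `criterion`."""
    (mu, sigma), _ = curve_fit(psychometric, deltas, prop_correct,
                               p0=[np.mean(deltas), 1.0])
    # Invert: criterion = 0.5 + 0.5 * Phi((jnd - mu) / sigma)
    jnd = mu + sigma * norm.ppf((criterion - 0.5) / 0.5)
    return jnd
```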

  17. About-face on face recognition ability and holistic processing.

    Science.gov (United States)

    Richler, Jennifer J; Floyd, R Jackie; Gauthier, Isabel

    2015-01-01

    Previous work found a small but significant relationship between holistic processing measured with the composite task and face recognition ability measured by the Cambridge Face Memory Test (CFMT; Duchaine & Nakayama, 2006). Surprisingly, recent work using a different measure of holistic processing (Vanderbilt Holistic Face Processing Test [VHPT-F]; Richler, Floyd, & Gauthier, 2014) and a larger sample found no evidence for such a relationship. In Experiment 1 we replicate this unexpected result, finding no relationship between holistic processing (VHPT-F) and face recognition ability (CFMT). A key difference between the VHPT-F and other holistic processing measures is that unique face parts are used on each trial in the VHPT-F, unlike in other tasks where a small set of face parts repeat across the experiment. In Experiment 2, we test the hypothesis that correlations between the CFMT and holistic processing tasks are driven by stimulus repetition that allows for learning during the composite task. Consistent with our predictions, CFMT performance was correlated with holistic processing in the composite task when a small set of face parts repeated over trials, but not when face parts did not repeat. A meta-analysis confirms that relationships between the CFMT and holistic processing depend on stimulus repetition. These results raise important questions about what is being measured by the CFMT, and challenge current assumptions about why faces are processed holistically.

  18. Calibration procedures of the Tore-Supra infrared endoscopes

    Science.gov (United States)

    Desgranges, C.; Jouve, M.; Balorin, C.; Reichle, R.; Firdaouss, M.; Lipa, M.; Chantant, M.; Gardarein, J. L.; Saille, A.; Loarer, T.

    2018-01-01

    Five endoscopes equipped with infrared cameras working in the mid-infrared range (3-5 μm) are installed on the controlled thermonuclear fusion research device Tore-Supra. These endoscopes monitor the surface temperature of the plasma facing components to prevent their overheating. Signals delivered by the infrared cameras through the endoscopes are analysed and used, on the one hand, in a real-time feedback control loop acting on the plasma heating systems to decrease plasma facing component surface temperatures when necessary and, on the other hand, for physics studies such as determination of the incoming heat flux. To fulfil these two roles, very accurate knowledge of the absolute surface temperatures is mandatory. Consequently, the infrared endoscopes must be calibrated through a very careful procedure, which means determining their transmission coefficients, a delicate operation. Methods to calibrate the infrared endoscopes during the shutdown period of the Tore-Supra machine are presented. As these methods cannot track the possible evolution of the transmittances during operation, an in-situ method is also presented. It permits validation of the calibration performed in the laboratory as well as monitoring of the transmittance evolution during machine operation. This is made possible by the use of the endoscope shutter and a dedicated plasma scenario developed to heat it. Possible improvements of this method are briefly discussed.
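
    A minimal sketch of the kind of radiometric bookkeeping such a calibration involves is given below, assuming a grey, band-integrated model over the 3-5 μm range: the endoscope transmission is estimated from the ratio of the signal from a reference blackbody viewed through the endoscope to the signal viewed directly. The reference temperature and measured values are hypothetical, and the real procedure is considerably more involved.

        import numpy as np

        def planck_radiance(T, lam):
            """Blackbody spectral radiance (W m^-2 sr^-1 m^-1) at temperature T [K]."""
            h, c, kB = 6.626e-34, 2.998e8, 1.381e-23
            return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * kB * T))

        def band_signal(T, lam_lo=3e-6, lam_hi=5e-6, n=200):
            """Radiance integrated over the 3-5 micron camera band."""
            lam = np.linspace(lam_lo, lam_hi, n)
            return np.trapz(planck_radiance(T, lam), lam)

        # Transmission from a reference blackbody viewed directly and through the endoscope.
        T_ref = 473.15                    # hypothetical 200 degC reference source
        S_direct = band_signal(T_ref)
        S_endo = 0.35 * S_direct          # stand-in for a measured through-endoscope signal
        tau = S_endo / S_direct
        print(f"estimated endoscope transmission: {tau:.2f}")
        # In operation the measured signal would be divided by tau before being
        # converted back to an absolute surface temperature.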

  19. Comparing Face Detection and Recognition Techniques

    OpenAIRE

    Korra, Jyothi

    2016-01-01

    This paper implements and compares different techniques for face detection and recognition. The first task is finding where the face is located in an image (face detection); the second is identifying the person (face recognition). We study three techniques in this paper: face detection using a self-organizing map (SOM), face recognition by projection and nearest neighbor, and face recognition using an SVM.

  20. Infrared thermography of loose hangingwalls

    CSIR Research Space (South Africa)

    Kononov, VA

    2002-09-01

    Full Text Available This project is the continuation of GAP706 “Pre-feasibility investigation of infrared thermography for the identification of loose hangingwall and impending falls of ground”. The main concept behind the infrared thermography method...

  1. Face adaptation improves gender discrimination.

    Science.gov (United States)

    Yang, Hua; Shen, Jianhong; Chen, Juan; Fang, Fang

    2011-01-01

    Adaptation to a visual pattern can alter the sensitivities of neuronal populations encoding the pattern. However, the functional roles of adaptation, especially in high-level vision, are still equivocal. In the present study, we performed three experiments to investigate if face gender adaptation could affect gender discrimination. Experiments 1 and 2 revealed that adapting to a male/female face could selectively enhance discrimination for male/female faces. Experiment 3 showed that the discrimination enhancement induced by face adaptation could transfer across a substantial change in three-dimensional face viewpoint. These results provide further evidence suggesting that, similar to low-level vision, adaptation in high-level vision could calibrate the visual system to current inputs of complex shapes (i.e. face) and improve discrimination at the adapted characteristic. Copyright © 2010 Elsevier Ltd. All rights reserved.

  2. Post-silicon and runtime verification for modern processors

    CERN Document Server

    Wagner, Ilya

    2010-01-01

    The purpose of this book is to survey the state of the art and evolving directions in post-silicon and runtime verification. The authors start by giving an overview of the state of the art in verification, particularly current post-silicon methodologies in use in the industry, both for the domain of processor pipeline design and for memory subsystems. They then dive into the presentation of several new post-silicon verification solutions aimed at boosting the verification coverage of modern processors, dedicating several chapters to this topic. The presentation of runtime verification solution

  3. Recent advances in infrared astronomy

    International Nuclear Information System (INIS)

    Robson, E.I.

    1980-01-01

    A background survey is given of developments in infrared astronomy during the last decade. Advantages obtained in using infrared wavelengths to penetrate the Earth's atmosphere and the detectors used for this work are considered. Infrared studies of, among other subjects, the stars, dust clouds, the centre of our galaxy and the 3 K cosmic background radiation are discussed. (UK)

  4. Infrared up-conversion microscope

    DEFF Research Database (Denmark)

    2014-01-01

    There is presented an up-conversion infrared microscope (110) arranged for imaging an associated object (130), wherein the up-conversion infrared microscope (110) comprises a non-linear crystal (120) arranged for up-conversion of infrared electromagnetic radiation, and wherein an objective optical...

  5. Infrared up-conversion telescope

    DEFF Research Database (Denmark)

    2014-01-01

    There is presented an up-conversion infrared telescope (110) arranged for imaging an associated scene (130), wherein the up-conversion infrared telescope (110) comprises a non-linear crystal (120) arranged for up-conversion of infrared electromagnetic radiation, and wherein a first optical...

  6. Infrared emission and extragalactic starbursts

    International Nuclear Information System (INIS)

    Telesco, C.M.

    1985-01-01

    The paper examines the belief that recent star formation plays a significant role in determining many of the infrared properties of galaxies. Pertinent types of infrared observations and the infrared properties of starbursts are briefly summarized. Recently developed models which describe the evolution of starbursts are also considered. (U.K.)

  7. Optical Verification Laboratory Demonstration System for High Security Identification Cards

    Science.gov (United States)

    Javidi, Bahram

    1997-01-01

    Document fraud including unauthorized duplication of identification cards and credit cards is a serious problem facing the government, banks, businesses, and consumers. In addition, counterfeit products such as computer chips and compact discs are arriving on our shores in great numbers. With the rapid advances in computers, CCD technology, image processing hardware and software, printers, scanners, and copiers, it is becoming increasingly easy to reproduce pictures, logos, symbols, paper currency, or patterns. These problems have stimulated an interest in research, development and publications in security technology. Some ID cards, credit cards and passports currently use holograms as a security measure to thwart copying. The holograms are inspected by the human eye. In theory, the hologram cannot be reproduced by an unauthorized person using commercially-available optical components; in practice, however, technology has advanced to the point where the holographic image can be acquired from a credit card (photographed or captured with a CCD camera) and a new hologram synthesized using commercially-available optical components or hologram-producing equipment. Therefore, a pattern that can be read by a conventional light source and a CCD camera can be reproduced. An optical security and anti-copying device that provides significant security improvements over existing security technology was demonstrated. The system can be applied for security verification of credit cards, passports, and other IDs so that they cannot easily be reproduced. We have used a new scheme of complex phase/amplitude patterns that cannot be seen and cannot be copied by an intensity-sensitive detector such as a CCD camera. A random phase mask is bonded to a primary identification pattern which could also be phase encoded. The pattern could be a fingerprint, a picture of a face, or a signature. The proposed optical processing device is designed to identify both the random phase mask and the

  8. Emotion Words: Adding Face Value.

    Science.gov (United States)

    Fugate, Jennifer M B; Gendron, Maria; Nakashima, Satoshi F; Barrett, Lisa Feldman

    2017-06-12

    Despite a growing number of studies suggesting that emotion words affect perceptual judgments of emotional stimuli, little is known about how emotion words affect perceptual memory for emotional faces. In Experiments 1 and 2 we tested how emotion words (compared with control words) affected participants' abilities to select a target emotional face from among distractor faces. Participants were generally more likely to false alarm to distractor emotional faces when primed with an emotion word congruent with the face (compared with a control word). Moreover, participants showed both decreased sensitivity (d') to discriminate between target and distractor faces, as well as altered response biases (c; more likely to answer "yes") when primed with an emotion word (compared with a control word). In Experiment 3 we showed that emotion words had more of an effect on perceptual memory judgments when the structural information in the target face was limited, as well as when participants were only able to categorize the face with a partially congruent emotion word. The overall results are consistent with the idea that emotion words affect the encoding of emotional faces in perceptual memory. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
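
    For readers unfamiliar with the signal-detection quantities reported above, the short sketch below computes sensitivity (d') and response bias (c) from response counts, using a standard correction to avoid infinite z-scores; the counts shown are hypothetical, not data from the experiments.

        from scipy.stats import norm

        def dprime_and_c(hits, misses, false_alarms, correct_rejections):
            """Sensitivity d' and criterion c with a log-linear correction."""
            hr = (hits + 0.5) / (hits + misses + 1.0)
            far = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
            z_hr, z_far = norm.ppf(hr), norm.ppf(far)
            return z_hr - z_far, -0.5 * (z_hr + z_far)

        # Hypothetical counts for emotion-word vs. control-word priming conditions.
        print(dprime_and_c(35, 15, 20, 30))   # emotion-word prime
        print(dprime_and_c(38, 12, 10, 40))   # control-word prime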

  9. Matching faces with emotional expressions

    Directory of Open Access Journals (Sweden)

    Wenfeng eChen

    2011-08-01

    Full Text Available There is some evidence that faces with a happy expression are recognized better than faces with other expressions. However, little is known about whether this happy face advantage also applies to perceptual face matching, and whether similar differences exist among other expressions. Using a sequential matching paradigm, we systematically compared the effects of seven basic facial expressions on identity recognition. Identity matching was quickest when a pair of faces had an identical happy/sad/neutral expression, poorer when they had a fearful/surprise/angry expression, and poorest when they had a disgust expression. Faces with a happy/sad/fear/surprise expression were matched faster than those with an anger/disgust expression when the second face in a pair had a neutral expression. These results demonstrate that effects of facial expression on identity recognition are not limited to happy faces when a learned face is immediately tested. The results suggest different influences of expression in perceptual matching and long-term recognition memory.

  10. Face Recognition using Approximate Arithmetic

    DEFF Research Database (Denmark)

    Marso, Karol

    Face recognition is an image processing technique which aims to identify human faces and has found use in various fields, for example security. Over the years this field has evolved, and there are many approaches and many different algorithms which aim to make face recognition as effective...... processing applications the results do not need to be completely precise, and use of approximate arithmetic can lead to reductions in delay, space and power consumption. In this paper we examine the possible use of approximate arithmetic in face recognition using the Eigenfaces algorithm.
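
    As context for where approximate multipliers could be substituted, the sketch below shows a conventional (exact-arithmetic) Eigenfaces pipeline: PCA on vectorised training faces followed by nearest-neighbour matching in the reduced space. It is a baseline illustration only; the approximate-arithmetic variants studied in the paper are not reproduced here.

        import numpy as np

        def train_eigenfaces(train_images, n_components=20):
            """PCA on vectorised faces; returns the mean, basis and gallery projections."""
            X = train_images.reshape(len(train_images), -1).astype(float)
            mean = X.mean(axis=0)
            Xc = X - mean
            _, _, Vt = np.linalg.svd(Xc, full_matrices=False)   # rows of Vt are eigenfaces
            basis = Vt[:n_components]
            return mean, basis, Xc @ basis.T

        def recognise(probe, mean, basis, gallery_proj, gallery_labels):
            """Project a probe face and return the label of the nearest gallery face."""
            w = (probe.ravel().astype(float) - mean) @ basis.T
            return gallery_labels[int(np.argmin(np.linalg.norm(gallery_proj - w, axis=1)))]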

  11. The Kent Face Matching Test.

    Science.gov (United States)

    Fysh, Matthew C; Bindemann, Markus

    2018-05-01

    This study presents the Kent Face Matching Test (KFMT), which comprises 200 same-identity and 20 different-identity pairs of unfamiliar faces. Each face pair consists of a photograph from a student ID card and a high-quality portrait that was taken at least three months later. The test is designed to complement existing resources for face-matching research, by providing a more ecologically valid stimulus set that captures the natural variability that can arise in a person's appearance over time. Two experiments are presented to demonstrate that the KFMT provides a challenging measure of face matching but correlates with established tests. Experiment 1 compares a short version of this test with the optimized Glasgow Face Matching Test (GFMT). In Experiment 2, a longer version of the KFMT, with infrequent identity mismatches, is correlated with performance on the Cambridge Face Memory Test (CFMT) and the Cambridge Face Perception Test (CFPT). The KFMT is freely available for use in face-matching research. © 2017 The British Psychological Society.

  12. High precision automated face localization in thermal images: oral cancer dataset as test case

    Science.gov (United States)

    Chakraborty, M.; Raman, S. K.; Mukhopadhyay, S.; Patsa, S.; Anjum, N.; Ray, J. G.

    2017-02-01

    Automated face detection is the pivotal step in computer-vision-aided facial medical diagnosis and biometrics. This paper presents an automatic, subject-adaptive framework for accurate face detection in the long infrared spectrum on our database for oral cancer detection, consisting of malignant, precancerous and normal subjects of varied age groups. Previous works on oral cancer detection using Digital Infrared Thermal Imaging (DITI) reveal that patients and normal subjects differ significantly in their facial thermal distribution. Therefore, it is a challenging task to formulate a completely adaptive framework to accurately localize the face in such a subject-specific modality. Our model consists of first extracting the most probable facial regions by minimum error thresholding, followed by adaptive methods that leverage the horizontal and vertical projections of the segmented thermal image. Additionally, the model incorporates our domain knowledge of exploiting temperature differences between strategic locations of the face. To the best of our knowledge, this is the pioneering work on detecting faces in thermal facial images comprising both patients and normal subjects. Previous works on face detection have not specifically targeted automated medical diagnosis; the face bounding boxes returned by those algorithms are thus loose and not apt for further medical automation. Our algorithm significantly outperforms contemporary face detection algorithms in terms of commonly used metrics for evaluating face detection accuracy. Since our method has been tested on a challenging dataset consisting of both patients and normal subjects of diverse age groups, it can be seamlessly adapted to any DITI-guided facial healthcare or biometric application.
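
    A simplified sketch of the projection-based localization step is given below; it substitutes Otsu's method for the paper's minimum-error thresholding and omits the subject-adaptive refinements and the temperature-difference domain knowledge, so the thresholds and structure are illustrative assumptions.

        import numpy as np

        def otsu_threshold(img, bins=256):
            """Global threshold maximising between-class variance (stand-in for
            minimum-error thresholding)."""
            hist, edges = np.histogram(img, bins=bins)
            centers = 0.5 * (edges[:-1] + edges[1:])
            best_t, best_v = centers[0], -1.0
            for i in range(1, bins):
                w0, w1 = hist[:i].sum(), hist[i:].sum()
                if w0 == 0 or w1 == 0:
                    continue
                m0 = (hist[:i] * centers[:i]).sum() / w0
                m1 = (hist[i:] * centers[i:]).sum() / w1
                v = w0 * w1 * (m0 - m1) ** 2
                if v > best_v:
                    best_v, best_t = v, centers[i]
            return best_t

        def localize_face(thermal):
            """Face bounding box from horizontal/vertical projections of the
            thresholded (warm) regions of a thermal image."""
            mask = thermal > otsu_threshold(thermal)
            col_proj, row_proj = mask.sum(axis=0), mask.sum(axis=1)
            cols = np.nonzero(col_proj > 0.1 * col_proj.max())[0]
            rows = np.nonzero(row_proj > 0.1 * row_proj.max())[0]
            return rows[0], rows[-1], cols[0], cols[-1]   # top, bottom, left, right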

  13. At face value : categorization goals modulate vigilance for angry faces

    NARCIS (Netherlands)

    Van Dillen, L.F.; Lakens, D.; Bos, van den K.

    2010-01-01

    The present research demonstrates that the attention bias to angry faces is modulated by how people categorize these faces. Since facial expressions contain psychologically meaningful information for social categorizations (i.e., gender, personality) but not for non-social categorizations (i.e.,

  14. Alternative face models for 3D face registration

    Science.gov (United States)

    Salah, Albert Ali; Alyüz, Neşe; Akarun, Lale

    2007-01-01

    3D has become an important modality for face biometrics. The accuracy of a 3D face recognition system depends on a correct registration that aligns the facial surfaces and makes a comparison possible. The best results obtained so far use a one-to-all registration approach, which means each new facial surface is registered to all faces in the gallery, at a great computational cost. We explore the approach of registering the new facial surface to an average face model (AFM), which automatically establishes correspondence to the pre-registered gallery faces. Going one step further, we propose that using a couple of well-selected AFMs can trade-off computation time with accuracy. Drawing on cognitive justifications, we propose to employ category-specific alternative average face models for registration, which is shown to increase the accuracy of the subsequent recognition. We inspect thin-plate spline (TPS) and iterative closest point (ICP) based registration schemes under realistic assumptions on manual or automatic landmark detection prior to registration. We evaluate several approaches for the coarse initialization of ICP. We propose a new algorithm for constructing an AFM, and show that it works better than a recent approach. Finally, we perform simulations with multiple AFMs that correspond to different clusters in the face shape space and compare these with gender and morphology based groupings. We report our results on the FRGC 3D face database.
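
    The core of registering a new scan to an average face model is a rigid ICP loop; a minimal sketch is given below, assuming point clouds as plain N x 3 arrays and omitting the coarse landmark-based initialization and the TPS variant discussed in the paper.

        import numpy as np
        from scipy.spatial import cKDTree

        def icp_to_afm(probe, afm, n_iter=30):
            """Rigidly align a probe facial point cloud (Nx3) to an average face
            model (Mx3) by iterating closest-point matching and the Kabsch solution."""
            tree = cKDTree(afm)
            P = probe.astype(float).copy()
            for _ in range(n_iter):
                _, idx = tree.query(P)              # closest AFM point for each probe point
                Q = afm[idx]
                p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
                H = (P - p_mean).T @ (Q - q_mean)
                U, _, Vt = np.linalg.svd(H)
                R = Vt.T @ U.T
                if np.linalg.det(R) < 0:            # guard against reflections
                    Vt[-1] *= -1
                    R = Vt.T @ U.T
                t = q_mean - R @ p_mean
                P = P @ R.T + t                     # apply the rigid update
            return P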

  15. Cyber- and Face-to-Face Bullying: Who Crosses Over?

    Science.gov (United States)

    Shin, Hwayeon Helene; Braithwaite, Valerie; Ahmed, Eliza

    2016-01-01

    A total of 3956 children aged 12-13 years who completed the Longitudinal Study of Australian Children (LSAC Wave 5) were studied about their experiences of traditional face-to-face bullying and cyberbullying in the last month. In terms of prevalence, sixty percent of the sample had been involved in traditional bullying as the victim and/or the…

  16. Attention to internal face features in unfamiliar face matching.

    Science.gov (United States)

    Fletcher, Kingsley I; Butavicius, Marcus A; Lee, Michael D

    2008-08-01

    Accurate matching of unfamiliar faces is vital in security and forensic applications, yet previous research has suggested that humans often perform poorly when matching unfamiliar faces. Hairstyle and facial hair can strongly influence unfamiliar face matching but are potentially unreliable cues. This study investigated whether increased attention to the more stable internal face features of eyes, nose, and mouth was associated with more accurate face-matching performance. Forty-three first-year psychology students decided whether two simultaneously presented faces were of the same person or not. The faces were displayed for either 2 or 6 seconds, and had either similar or dissimilar hairstyles. The level of attention to internal features was measured by the proportion of fixation time spent on the internal face features and the sensitivity of discrimination to changes in external feature similarity. Increased attention to internal features was associated with increased discrimination in the 2-second display-time condition, but no significant relationship was found in the 6-second condition. Individual differences in eye-movements were highly stable across the experimental conditions.

  17. Forensic face recognition as a means to determine strength of evidence: A survey.

    Science.gov (United States)

    Zeinstra, C G; Meuwly, D; Ruifrok, A Cc; Veldhuis, R Nj; Spreeuwers, L J

    2018-01-01

    This paper surveys the literature on forensic face recognition (FFR), with a particular focus on the strength of evidence as used in a court of law. FFR is the use of biometric face recognition for several applications in forensic science. It includes scenarios of ID verification and open-set identification, investigation and intelligence, and evaluation of the strength of evidence. We present FFR from operational, tactical, and strategic perspectives. We discuss criticism of FFR and we provide an overview of research efforts from multiple perspectives that relate to the domain of FFR. Finally, we sketch possible future directions for FFR. Copyright © 2018 Central Police University.

  18. Facing My Fears (Editorial)

    Directory of Open Access Journals (Sweden)

    Lindsay Glynn

    2008-03-01

    Full Text Available I’m scared. I’m nervous. In a few short weeks the contractors and electricians will take over my library for several months. They will drill huge gouges in the concrete floor, hammer, saw, scrape, move, wire, etc. No doubt they may have to be asked to keep their voices down once or twice. Half of the print journal collection will be relocated to accommodate a new teaching lab that will also double as an information commons. The planning has been going on for many months. We have consulted with other libraries, reviewed the literature, identified the needs of our various user groups, measured space, tested technical possibilities, and met with architects and engineers. Up until now the new lab was an organic idea on paper, discussed over coffee and in meetings. That’s fairly easy to deal with. But just around the corner it becomes a reality and I’m a bag of nerves. Have we made the right decisions? Will it address all our needs? Is there anything I forgot to consider? What if our users don’t like it? What if it is a complete failure?! Theoretically, it should be ok. I’ve followed the right steps and worked with a creative, talented and dedicated team. This is different from trying out a new instructional technique or reorganizing the information desk. This is big. I talk the evidence based talk regularly, but now I am walking the walk in a bigger way than I had ever imagined. Change can be frightening. Moving out of comfort zones is not easy. Having said that, the challenge can be invigorating and the change, refreshing. I find myself welcoming the change as much as I dread it. I’ll face my fears and see it through to the implementation and evaluations and beyond. And hey, no matter what the outcome, it should make for a good paper. If anyone else out there is going through a similar process, I’d be interested in comparing notes. I invite you to try something new this year in your work environment or in your professional activities

  19. Uncooled tunneling infrared sensor

    Science.gov (United States)

    Kenny, Thomas W. (Inventor); Kaiser, William J. (Inventor); Podosek, Judith A. (Inventor); Vote, Erika C. (Inventor); Muller, Richard E. (Inventor); Maker, Paul D. (Inventor)

    1995-01-01

    An uncooled infrared tunneling sensor in which the only moving part is a diaphragm which is deflected into contact with a micromachined silicon tip electrode prepared by a novel lithographic process. Similarly prepared deflection electrodes employ electrostatic force to control the deflection of a silicon nitride, flat diaphragm membrane. The diaphragm exhibits a high resonant frequency which reduces the sensor's sensitivity to vibration. A high bandwidth feedback circuit controls the tunneling current by adjusting the deflection voltage to maintain a constant deflection of the membrane. The resulting infrared sensor can be miniaturized to pixel dimensions smaller than 100 μm. An alternative embodiment is implemented using a corrugated membrane to permit large deflection without complicated clamping and high deflection voltages. The alternative embodiment also employs a pinhole aperture in a membrane to accommodate environmental temperature variation and a sealed chamber to eliminate environmental contamination of the tunneling electrodes and undesirable acoustic coupling to the sensor.

  20. Wireless infrared computer control

    Science.gov (United States)

    Chen, George C.; He, Xiaofei

    2004-04-01

    A wireless mouse is not restricted by cable length and has an advantage over its wired counterpart. However, the mice available on the market have a detection range of less than 2 meters and angular coverage of less than 180 degrees. Furthermore, commercial infrared mice are based on a track ball and rollers to detect movements. This restricts their use on occasions where users want dynamic movement, such as presentations and meetings. This paper presents our newly developed infrared wireless mouse, which has a detection range of 6 meters and angular coverage of 180 degrees. The new mouse uses buttons instead of the traditional track ball and is designed as a hand-held device like a remote controller. It enables users to control the cursor from a distance and frees mouse operation from the computer.

  1. Infrared Astronomy Satellite

    Science.gov (United States)

    Ferrera, G. A.

    1981-09-01

    In 1982, the Infrared Astronomy Satellite (IRAS) will be launched into a 900-km sun-synchronous (twilight) orbit to perform an unbiased, all-sky survey of the far-infrared spectrum from 8 to 120 microns. Observations telemetered to ground stations will be compiled into an IR astronomy catalog. Attention is given to the cryogenically cooled, 60-cm Ritchey-Chretien telescope carried by the satellite, whose primary and secondary mirrors are fabricated from beryllium by means of 'Cryo-Null Figuring'. This technique anticipates the mirror distortions that will result from cryogenic cooling of the telescope and introduces dimensional compensations for them during machining and polishing. Consideration is also given to the interferometric characterization of telescope performance and Cryo/Thermal/Vacuum simulated space environment testing.

  2. Far infrared photoconductors

    International Nuclear Information System (INIS)

    Leotin, J.; Meny, C.

    1990-01-01

    This paper presents the development of far infrared photoconductors for the focal plane of a spaceborne instrument named SAFIRE. SAFIRE (Spectroscopy of the Atmosphere using Far-Infrared Emission) belongs to the EOS program (Earth Observing System) and is now in the definition phase. It is a joint effort by scientists from the United States, Great Britain, Italy and France for a new generation of atmospheric sensor. The overall goal of the SAFIRE experiment is to improve the understanding of the ozone distribution in the middle atmosphere by conducting global-scale measurements of the important chemical, radiative and dynamical processes which influence its changes. This will be accomplished by the measurement of the far infrared thermal limb emission in seven spectral channels covering the range 80 to 400 cm^-1 with a maximum resolution of 0.004 cm^-1. For example, key gases like OH, O, HO2 and N2O5 will be probed for the first time. Achievement of the required detector sensitivity in the far-infrared imposes the choice of photoconductive detectors operating at liquid helium temperatures. Germanium doped with gallium is selected for six channels whereas germanium doped with beryllium is suitable for the N2O5 channel. Both photoconductors Ge:Ga and Ge:Be benefit from a well-established material technology. A better wavelength coverage of channel 1 is achieved by applying a small uniaxial stress of the order of 0.1 GPa on the Ge:Ga photoconductors. The channel 6B wavelength coverage could be improved by using zinc-doped germanium (Ge:Zn) or, much better, by using a blocked-impurity-band silicon detector doped with antimony (BIB Si:Sb). The latter is being developed on an optional basis.

  3. Infrared thermal annealing device

    International Nuclear Information System (INIS)

    Gladys, M.J.; Clarke, I.; O'Connor, D.J.

    2003-01-01

    A device for annealing samples within an ultrahigh vacuum (UHV) scanning tunneling microscopy system was designed, constructed, and tested. The device is based on illuminating the sample with infrared radiation from outside the UHV chamber with a tungsten projector bulb. The apparatus uses an elliptical mirror to focus the beam through a sapphire viewport for low absorption. Experiments were conducted on clean Pd(100) and annealing temperatures in excess of 1000 K were easily reached

  4. Ultrafast infrared vibrational spectroscopy

    CERN Document Server

    Fayer, Michael D

    2013-01-01

    The past ten years or so have seen the introduction of multidimensional methods into infrared and optical spectroscopy. The technology of multidimensional spectroscopy is developing rapidly and its applications are spreading to biology and materials science. Edited by a recognized leader in the field and with contributions from top researchers, including experimentalists and theoreticians, this book presents the latest research methods and results and will serve as an excellent resource for other researchers.

  5. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different
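
    One routine ingredient of the verification activities described above is a grid-convergence check: computing the observed order of accuracy from solutions on systematically refined grids and extrapolating toward the exact solution. The sketch below shows that standard calculation with hypothetical values; it is a generic illustration, not a computation taken from the paper.

        import numpy as np

        def observed_order(f_coarse, f_medium, f_fine, r=2.0):
            """Observed order of accuracy from three grids with refinement ratio r."""
            return np.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / np.log(r)

        def richardson(f_medium, f_fine, p, r=2.0):
            """Richardson estimate of the grid-converged value from the two finest grids."""
            return f_fine + (f_fine - f_medium) / (r ** p - 1.0)

        # Hypothetical drag-coefficient values on coarse, medium and fine grids.
        p = observed_order(0.5120, 0.5030, 0.5008)
        print(f"observed order = {p:.2f}, extrapolated value = {richardson(0.5030, 0.5008, p):.4f}")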

  6. Infrared Astronomy and Star Formation

    International Nuclear Information System (INIS)

    Evans, N.J.

    1985-01-01

    Infrared astronomy is a natural tool to use in studying star formation because infrared light penetrates the surrounding dust and because protostars are expected to emit infrared light. Infrared mapping and photometry have revealed many compact sources, often embedded in more extensive warm dust associated with a molecular cloud core. More detailed study of these objects is now beginning, and traditional interpretations are being questioned. Some compact sources are now thought to be density enhancements which are not self-luminous. Infrared excesses around young stars may not always be caused by circumstellar dust; speckle measurements have shown that at least some of the excess toward T Tauri is caused by an infrared companion. Spectroscopic studies of the dense, star-forming cores and of the compact objects themselves have uncovered a wealth of new phenomena, including the widespread occurrence of energetic outflows. New discoveries with IRAS and with other planned infrared telescopes will continue to advance this field. (author)

  7. Side-View Face Recognition

    NARCIS (Netherlands)

    Santemiz, P.; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.; van den Biggelaar, Olivier

    As a widely used biometrics, face recognition has many advantages such as being non-intrusive, natural and passive. On the other hand, in real-life scenarios with uncontrolled environment, pose variation up to side-view positions makes face recognition a challenging work. In this paper we discuss

  8. Forensic Face Recognition: A Survey

    NARCIS (Netherlands)

    Ali, Tauseef; Veldhuis, Raymond N.J.; Spreeuwers, Lieuwe Jan

    2010-01-01

    Beside a few papers which focus on the forensic aspects of automatic face recognition, there is not much published about it in contrast to the literature on developing new techniques and methodologies for biometric face recognition. In this report, we review forensic facial identification which is

  9. PrimeFaces beginner's guide

    CERN Document Server

    Reddy, K Siva Prasad

    2013-01-01

    A guide for beginners with step-by-step instructions and an easy-to-follow approach. PrimeFaces Beginner's Guide is a simple and effective guide for beginners wanting to learn and implement PrimeFaces in their JSF-based applications. Some basic JSF and jQuery skills are required before you start working through the book.

  10. Genetic specificity of face recognition.

    Science.gov (United States)

    Shakeshaft, Nicholas G; Plomin, Robert

    2015-10-13

    Specific cognitive abilities in diverse domains are typically found to be highly heritable and substantially correlated with general cognitive ability (g), both phenotypically and genetically. Recent twin studies have found the ability to memorize and recognize faces to be an exception, being similarly heritable but phenotypically substantially uncorrelated both with g and with general object recognition. However, the genetic relationships between face recognition and other abilities (the extent to which they share a common genetic etiology) cannot be determined from phenotypic associations. In this, to our knowledge, first study of the genetic associations between face recognition and other domains, 2,000 18- and 19-year-old United Kingdom twins completed tests assessing their face recognition, object recognition, and general cognitive abilities. Results confirmed the substantial heritability of face recognition (61%), and multivariate genetic analyses found that most of this genetic influence is unique and not shared with other cognitive abilities.

  11. Face-Lift Satisfaction Using the FACE-Q.

    Science.gov (United States)

    Sinno, Sammy; Schwitzer, Jonathan; Anzai, Lavinia; Thorne, Charles H

    2015-08-01

    Face lifting is one of the most common operative procedures for facial aging and perhaps the procedure most synonymous with plastic surgery in the minds of the lay public, but no verifiable documentation of patient satisfaction exists in the literature. This study is the first to examine face-lift outcomes and patient satisfaction using a validated questionnaire. One hundred five patients undergoing a face lift performed by the senior author (C.H.T.) using a high, extended-superficial musculoaponeurotic system with submental platysma approximation technique were asked to complete anonymously the FACE-Q by e-mail. FACE-Q scores were assessed for each domain (range, 0 to 100), with higher scores indicating greater satisfaction with appearance or superior quality of life. Fifty-three patients completed the FACE-Q (50.5 percent response rate). Patients demonstrated high satisfaction with facial appearance (mean ± SD, 80.7 ± 22.3), and quality of life, including social confidence (90.4 ± 16.6), psychological well-being (92.8 ± 14.3), and early life impact (92.2 ± 16.4). Patients also reported extremely high satisfaction with their decision to undergo face lifting (90.5 ± 15.9). On average, patients felt they appeared 6.9 years younger than their actual age. Patients were most satisfied with the appearance of their nasolabial folds (86.2 ± 18.5), cheeks (86.1 ± 25.4), and lower face/jawline (86.0 ± 20.6), compared with their necks (78.1 ± 25.6) and area under the chin (67.9 ± 32.3). Patients who responded in this study were extremely satisfied with their decision to undergo face lifting and the outcomes and quality of life following the procedure.

  12. Face Gear Technology for Aerospace Power Transmission Progresses

    Science.gov (United States)

    2005-01-01

    the effects of manufacturing process improvements on the operating characteristics of face gears. The program is being conducted with McDonnell Douglas Helicopter Co., Lucas Western Inc., the University of Illinois at Chicago, and a NASA/U.S. Army team. The goal of the project is to develop the grinding process, experimentally verify the improvement in face gear fatigue life, and conduct a full-scale helicopter transmission test. The theory and methodology to grind face gears have been completed, and manufacture of the test hardware is ongoing. Experimental verification on test hardware is scheduled to begin in fiscal 1996.

  13. Shield verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Boman, C.

    1992-02-01

    WSRC-RP-90-26, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification are an integral part of the certification process. This document identifies the work performed and documentation generated to satisfy these action items for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system; it is not a certification of the complete SHIELD system. Complete certification will follow at a later date. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but can be found in the references. The validation and verification effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system computer code is completed.

  14. Focussed approach to verification under FMCT

    International Nuclear Information System (INIS)

    Bragin, V.; Carlson, J.; Bardsley, J.; Hill, J.

    1998-01-01

    FMCT will have different impacts on individual states due to the enormous variance in their nuclear fuel cycles and the associated fissile material inventories. The problem is how to negotiate a treaty that would achieve results favourable for all participants, given that interests and priorities vary so much. We believe that focussed verification, confined to safeguarding of enrichment and reprocessing facilities in NWS and TS, coupled with verification of unirradiated direct-use material produced after entry-into-force of a FMCT and supported with measures to detect possible undeclared enrichment and reprocessing activities, is technically adequate for the FMCT. Eventually this would become the appropriate model for all states party to the NPT

  15. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, it is not yet common to use model checking in industry, as this method typically needs formal methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification in the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  16. Development and verification of the CATHENA GUI

    International Nuclear Information System (INIS)

    Chin, T.

    2008-01-01

    This paper presents the development and verification of a graphical user interface for CATHENA MOD-3.5d. The thermalhydraulic computer code CATHENA has been developed to simulate the physical behaviour of the hydraulic components in nuclear reactors and experimental facilities. A representation of the facility is developed as an ASCII text file and used by CATHENA to perform the simulation. The existing method of manual generation of idealizations of a physical system for performing thermal hydraulic analysis is complex, time-consuming and prone to errors. An overview is presented of the CATHENA GUI and its depiction of a CATHENA idealization through the manipulation of a visual collection of objects. The methodologies and rigour involved in the verification of the CATHENA GUI will be discussed. (author)

  17. Packaged low-level waste verification system

    Energy Technology Data Exchange (ETDEWEB)

    Tuite, K.; Winberg, M.R.; McIsaac, C.V. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1995-12-31

    The Department of Energy through the National Low-Level Waste Management Program and WMG Inc. have entered into a joint development effort to design, build, and demonstrate the Packaged Low-Level Waste Verification System. Currently, states and low-level radioactive waste disposal site operators have no method to independently verify the radionuclide content of packaged low-level waste that arrives at disposal sites for disposition. At this time, the disposal site relies on the low-level waste generator shipping manifests and accompanying records to ensure that low-level waste received meets the site's waste acceptance criteria. The subject invention provides the equipment, software, and methods to enable the independent verification of low-level waste shipping records to ensure that the site's waste acceptance criteria are being met. The objective of the prototype system is to demonstrate a mobile system capable of independently verifying the content of packaged low-level waste.

  18. Time Optimal Reachability Analysis Using Swarm Verification

    DEFF Research Database (Denmark)

    Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand

    2016-01-01

    Time optimal reachability analysis employs model-checking to compute goal states that can be reached from an initial state with a minimal accumulated time duration. The model-checker may produce a corresponding diagnostic trace which can be interpreted as a feasible schedule for many scheduling...... and planning problems, response time optimization etc. We propose swarm verification to accelerate time optimal reachability using the real-time model-checker Uppaal. In swarm verification, a large number of model checker instances execute in parallel on a computer cluster using different, typically randomized...... search strategies. We develop four swarm algorithms and evaluate them with four models in terms of scalability and time and memory consumption. Three of these cooperate by exchanging costs of intermediate solutions to prune the search using a branch-and-bound approach. Our results show that swarm

  19. Systems Approach to Arms Control Verification

    Energy Technology Data Exchange (ETDEWEB)

    Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M

    2015-05-15

    Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.

  20. GRIMHX verification and validation action matrix summary

    International Nuclear Information System (INIS)

    Trumble, E.F.

    1991-12-01

    WSRC-RP-90-026, Certification Plan for Reactor Analysis Computer Codes, describes a series of action items to be completed for certification of reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. Validation and verification of the code is an integral part of this process. This document identifies the work performed and documentation generated to satisfy these action items for the Reactor Physics computer code GRIMHX. Each action item is discussed with the justification for its completion. Specific details of the work performed are not included in this document but are found in the references. The publication of this document signals the validation and verification effort for the GRIMHX code is completed

  1. Automated Formal Verification for PLC Control Systems

    CERN Multimedia

    Fernández Adiego, Borja

    2014-01-01

    Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that the PLC software is compliant with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models, which represent the system, and the formalization of requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) to different modeling languages of verification tools. This approach has been applied to CERN PLC programs, validating the methodology.

  2. VERIFICATION OF 3D BUILDING MODELS USING MUTUAL INFORMATION IN AIRBORNE OBLIQUE IMAGES

    Directory of Open Access Journals (Sweden)

    A. P. Nyaruhuma

    2012-07-01

    Full Text Available This paper describes a method for automatic verification of 3D building models using airborne oblique images. The problem being tackled is identifying buildings that are demolished or changed since the models were constructed or identifying wrong models using the images. The models verified are of CityGML LOD2 or higher since their edges are expected to coincide with actual building edges. The verification approach is based on information theory. Corresponding variables between building models and oblique images are used for deriving mutual information for individual edges, faces or whole buildings, and combined for all perspective images available for the building. The wireframe model edges are projected to images and verified using low level image features – the image pixel gradient directions. A building part is only checked against images in which it may be visible. The method has been tested with models constructed using laser points against Pictometry images that are available for most cities of Europe and may be publically viewed in the so called Birds Eye view of the Microsoft Bing Maps. Results are that nearly all buildings are correctly categorised as existing or demolished. Because we now concentrate only on roofs we also used the method to test and compare results from nadir images. This comparison made clear that especially height errors in models can be more reliably detected in oblique images because of the tilted view. Besides overall building verification, results per individual edges can be used for improving the 3D building models.
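
    A reduced sketch of the mutual-information test is shown below: orientations sampled along a projected model edge are compared against the image gradient orientations at the same pixels, and the edge is accepted when the two share enough information. The histogram binning and the acceptance threshold are illustrative assumptions rather than the parameters used in the paper.

        import numpy as np

        def mutual_information(a, b, bins=8):
            """Mutual information (bits) between two samples of discretised orientations."""
            joint, _, _ = np.histogram2d(a, b, bins=bins)
            pxy = joint / joint.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

        def edge_verified(edge_orientations, gradient_orientations, mi_thresh=0.3):
            """Accept a projected model edge when its orientation statistics share
            sufficient information with the image gradient directions along it."""
            return mutual_information(edge_orientations, gradient_orientations) > mi_thresh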

  3. Verification of Spent Fuel Transfers in Germany — Linking Strategy, Implementation and People

    International Nuclear Information System (INIS)

    Tsvetkov, I.; Araujo, J.; Morris, G.; Vukadin, Z.; Wishard, B.; Kahnmeyer, W.; ); Trautwein, W.

    2015-01-01

    Following the decision of the German Government to completely phase out nuclear energy by 2022, the Agency is facing an increasing number of spent fuel (SF) transfers from nuclear power plants (NPP) to dry SF storage facilities. Verification of these transfers in the period 2015-2016 would have required about 1000 additional calendar-days in the field by inspectors. To meet the verification requirements with the available resources, the Agency together with the European Commission (EC) designed an innovative approach. The approach is making full use of safeguards cooperation with the EC and Germany's NPP operators to reduce the inspector's efforts, while fully adhering to the Agency's safeguards policy and requirements. The approach includes verification for partial defect test using digital Cerenkov viewing device (DCVD) of all SF assemblies in a reactor pond(s) before and after a SF loading campaign; during the SF loading campaign all SF in pond(s) is maintained under continuous surveillance, while the containment measures on SF casks, i.e., fibre-optic and electronic seals, and corresponding fibre-optic cables, are applied by the NPP operator in accordance with the agreed procedure. While the above approach allows for substantial reduction of the Agency inspector presence during the SF cask loading campaign, it can only be implemented when good cooperation exists between the Agency, the facility operator, and, as in the case of Germany, the regional safeguards authority. (author)

  4. Verification tests for CANDU advanced fuel

    International Nuclear Information System (INIS)

    Chung, Chang Hwan; Chang, S.K.; Hong, S.D.

    1997-07-01

    For the development of a CANDU advanced fuel, the CANFLEX-NU fuel bundles were tested under reactor operating conditions at the CANDU-Hot test loop. This report describes test results and test methods in the performance verification tests for the CANFLEX-NU bundle design. The main items described in the report are as follows. - Fuel bundle cross-flow test - Endurance fretting/vibration test - Freon CHF test - Production of technical document. (author). 25 refs., 45 tabs., 46 figs

  5. TWRS system drawings and field verification

    International Nuclear Information System (INIS)

    Shepard, D.G.

    1995-01-01

    The Configuration Management Program combines the TWRS Labeling and O and M drawing and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or have maintenance performed on the system, label individual pieces of equipment for proper identification, even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work was begun to label Tank Farm components and provide user-friendly, system-based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal 1995, the field verification program continued to convert TWRS drawings into CAD format and verify the accuracy based on visual inspections. During the remainder of fiscal year 1995 these efforts will be combined into a single program providing system-based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single shell tanks, i.e. the electrical distribution, HVAC, leak detection, and the radiation monitoring system. The tasks required to meet these objectives include the following: identify system boundaries or scope for drawings being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by "smart" drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user database application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O and M) drawings.

  6. Verification of the SLC wake potentials

    International Nuclear Information System (INIS)

    Bane, K.; Weiland, T.

    1983-01-01

    The accurate knowledge of the monopole, dipole, and quadrupole wake potentials is essential for SLC. These wake potentials were previously computed by the modal method. The time domain code TBCI allows independent verification of these results. This comparison shows that the two methods agree to within 10% for bunch lengths down to 1 mm. TBCI results also indicate that rounding the irises gives at least a 10% reduction in the wake potentials

  7. Safety Verification for Probabilistic Hybrid Systems

    Czech Academy of Sciences Publication Activity Database

    Zhang, J.; She, Z.; Ratschan, Stefan; Hermanns, H.; Hahn, E.M.

    2012-01-01

    Roč. 18, č. 6 (2012), s. 572-587 ISSN 0947-3580 R&D Projects: GA MŠk OC10048; GA ČR GC201/08/J020 Institutional research plan: CEZ:AV0Z10300504 Keywords: model checking * hybrid systems * formal verification Subject RIV: IN - Informatics, Computer Science Impact factor: 1.250, year: 2012

  8. Stamp Verification for Automated Document Authentication

    DEFF Research Database (Denmark)

    Micenková, Barbora; van Beusekom, Joost; Shafait, Faisal

    Stamps, along with signatures, can be considered as the most widely used extrinsic security feature in paper documents. In contrast to signatures, however, for stamps little work has been done to automatically verify their authenticity. In this paper, an approach for verification of color stamps ...... and copied stamps. Sensitivity and specificity of up to 95% could be obtained on a data set that is publicly available....

  9. Component Verification and Certification in NASA Missions

    Science.gov (United States)

    Giannakopoulou, Dimitra; Penix, John; Norvig, Peter (Technical Monitor)

    2001-01-01

    Software development for NASA missions is a particularly challenging task. Missions are extremely ambitious scientifically, have very strict time frames, and must be accomplished with a maximum degree of reliability. Verification technologies must therefore be pushed far beyond their current capabilities. Moreover, reuse and adaptation of software architectures and components must be incorporated in software development within and across missions. This paper discusses NASA applications that we are currently investigating from these perspectives.

  10. Survey of Existing Tools for Formal Verification.

    Energy Technology Data Exchange (ETDEWEB)

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.; Jackson, Mayo

    2014-12-01

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.

  11. Analyzing personalized policies for online biometric verification.

    Science.gov (United States)

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
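
    The likelihood-ratio decision rule described above can be illustrated with a small sketch. The example below is a minimal illustration, not the authors' model: it assumes, hypothetically, that genuine and impostor similarity scores for each modality follow Gaussian distributions with known parameters, sums the per-modality log-likelihood ratios for whatever subset of scores was acquired, and compares the total against a threshold chosen for a target false accept rate.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical per-modality score models (means/std devs are illustrative only).
SCORE_MODELS = {
    "finger_1": {"genuine": (70.0, 8.0), "impostor": (30.0, 10.0)},
    "finger_2": {"genuine": (68.0, 9.0), "impostor": (32.0, 11.0)},
    "iris_left": {"genuine": (85.0, 5.0), "impostor": (40.0, 12.0)},
}

def log_likelihood_ratio(scores):
    """Sum of per-modality log LRs for the subset of scores actually acquired."""
    total = 0.0
    for modality, s in scores.items():
        g_mu, g_sd = SCORE_MODELS[modality]["genuine"]
        i_mu, i_sd = SCORE_MODELS[modality]["impostor"]
        total += norm.logpdf(s, g_mu, g_sd) - norm.logpdf(s, i_mu, i_sd)
    return total

def verify(scores, threshold=0.0):
    """Accept the claim as genuine if the fused log LR exceeds the threshold."""
    return log_likelihood_ratio(scores) > threshold

# Example: only two fingerprints acquired in the first stage.
print(verify({"finger_1": 66.0, "finger_2": 71.0}))
```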

  12. System Description: Embedding Verification into Microsoft Excel

    OpenAIRE

    Collins, Graham; Dennis, Louise Abigail

    2000-01-01

    The aim of the PROSPER project is to allow the embedding of existing verification technology into applications in such a way that the theorem proving is hidden, or presented to the end user in a natural way. This paper describes a system built to test whether the PROSPER toolkit satisfied this aim. The system combines the toolkit with Microsoft Excel, a popular commercial spreadsheet application.

  13. Functional Verification of Enhanced RISC Processor

    OpenAIRE

    SHANKER NILANGI; SOWMYA L

    2013-01-01

    This paper presents the design and verification of a 32-bit enhanced RISC processor core with floating-point computations integrated within the core, designed to reduce cost and complexity. The 3-stage pipelined 32-bit RISC processor is based on the ARM7 processor architecture, with a single-precision floating-point multiplier, a floating-point adder/subtractor for floating-point operations, and a 32 x 32 Booth multiplier added to the integer core of ARM7. The binary representati...

  14. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community
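
    A dose/distance-to-agreement criterion such as 3%/3 mm is commonly evaluated with a gamma index. The sketch below is a simplified, hypothetical illustration on 1D dose profiles (global dose normalization, brute-force search over the same grid); it is not the analysis parameter used in the paper.

```python
import numpy as np

def gamma_index_1d(dose_ref, dose_eval, positions, dose_tol=0.03, dist_tol_mm=3.0):
    """Simplified global gamma index for two 1D dose profiles on the same grid."""
    d_max = dose_ref.max()
    gamma = np.empty_like(dose_ref)
    for i, (x_r, d_r) in enumerate(zip(positions, dose_ref)):
        dist2 = ((positions - x_r) / dist_tol_mm) ** 2
        dose2 = ((dose_eval - d_r) / (dose_tol * d_max)) ** 2
        gamma[i] = np.sqrt((dist2 + dose2).min())
    return gamma

# Illustrative profiles: measured profile slightly shifted and scaled.
x = np.linspace(-50, 50, 201)                       # mm
planned = np.exp(-(x / 20.0) ** 2) * 2.0            # Gy
measured = np.exp(-((x - 1.0) / 20.0) ** 2) * 2.02  # Gy
g = gamma_index_1d(planned, measured, x)
print(f"gamma pass rate (gamma <= 1): {np.mean(g <= 1.0):.1%}")
```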

  15. Initial Verification and Validation Assessment for VERA

    Energy Technology Data Exchange (ETDEWEB)

    Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States); Athe, Paridhi [North Carolina State Univ., Raleigh, NC (United States); Jones, Christopher [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hetzler, Adam [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sieger, Matt [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-04-01

    The Virtual Environment for Reactor Applications (VERA) code suite is assessed in terms of capability and credibility against the Consortium for Advanced Simulation of Light Water Reactors (CASL) Verification and Validation Plan (presented herein) in the context of three selected challenge problems: CRUD-Induced Power Shift (CIPS), Departure from Nucleate Boiling (DNB), and Pellet-Clad Interaction (PCI). Capability refers to evidence of required functionality for capturing phenomena of interest, while credibility refers to the evidence that provides confidence in the calculated results. For this assessment, each challenge problem defines a set of phenomenological requirements against which the VERA software is assessed. This approach, in turn, enables the focused assessment of only those capabilities relevant to the challenge problem. The evaluation of VERA against the challenge problem requirements represents a capability assessment. The mechanism for assessment is the Sandia-developed Predictive Capability Maturity Model (PCMM) that, for this assessment, evaluates VERA on 8 major criteria: (1) Representation and Geometric Fidelity, (2) Physics and Material Model Fidelity, (3) Software Quality Assurance and Engineering, (4) Code Verification, (5) Solution Verification, (6) Separate Effects Model Validation, (7) Integral Effects Model Validation, and (8) Uncertainty Quantification. For each attribute, a maturity score from zero to three is assigned in the context of each challenge problem. The evaluation of these eight elements constitutes the credibility assessment for VERA.

  16. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  17. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

    Full Text Available Background: Software reliability is of great importance for the development of embedded systems that are often used in applications with safety requirements. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required and closely linked in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system, and by controlling the choice among existing open-source model verification engines, the MODUS toolset produces the model-verification inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project

  18. IMRT delivery verification using a spiral phantom

    International Nuclear Information System (INIS)

    Richardson, Susan L.; Tome, Wolfgang A.; Orton, Nigel P.; McNutt, Todd R.; Paliwal, Bhudatt R.

    2003-01-01

    In this paper we report on the testing and verification of a system for IMRT delivery quality assurance that uses a cylindrical solid water phantom with a spiral trajectory for radiographic film placement. This spiral film technique provides more complete dosimetric verification of the entire IMRT treatment than perpendicular film methods, since it samples a three-dimensional dose subspace rather than using measurements at only one or two depths. As an example, the complete analysis of the predicted and measured spiral films is described for an intracranial IMRT treatment case. The results of this analysis are compared to those of a single field perpendicular film technique that is typically used for IMRT QA. The comparison demonstrates that both methods result in a dosimetric error within a clinical tolerance of 5%; however, the spiral phantom QA technique provides a more complete dosimetric verification while being less time-consuming. To independently verify the dosimetry obtained with the spiral film, the same IMRT treatment was delivered to a similar phantom in which LiF thermoluminescent dosimeters were arranged along the spiral trajectory. The maximum difference between the predicted and measured TLD data for the 1.8 Gy fraction was 0.06 Gy for a TLD located in a high dose gradient region. This further validates the ability of the spiral phantom QA process to accurately verify delivery of an IMRT plan

  19. Verification Survey of Uranium Mine Remediation

    International Nuclear Information System (INIS)

    Ron, Stager

    2009-01-01

    The Canadian Nuclear Safety Commission (CNSC) contracted an independent verification of an intensive gamma radiation survey conducted by a mining company to demonstrate that remediation of disturbed areas was complete. This site was the first of the recent mines being decommissioned in Canada, and experience gained here may be applied to other mines being decommissioned in the future. The review included examination of the site-specific basis for clean-up criteria and ALARA as required by CNSC guidance. A paper review of the company report was conducted to determine whether protocols were followed and whether the summarized results could be independently reproduced. An independent verification survey was conducted on parts of the site, and comparisons were made between gamma radiation measurements from the verification survey and the original company survey. Some aspects of data collection using rate meters linked to GPS data loggers are discussed, as are aspects of data management and the analysis methods required for the large amount of data collected during these surveys. Recommendations were made for implementation of future surveys and reporting of the data from those surveys in order to ensure that remediation was complete. (authors)

  20. Verification of Many-Qubit States

    Directory of Open Access Journals (Sweden)

    Yuki Takeuchi

    2018-06-01

    Full Text Available Verification is a task to check whether a given quantum state is close to an ideal state or not. In this paper, we show that a variety of many-qubit quantum states can be verified with only sequential single-qubit measurements of Pauli operators. First, we introduce a protocol for verifying ground states of Hamiltonians. We next explain how to verify quantum states generated by a certain class of quantum circuits. We finally propose an adaptive test of stabilizers that enables the verification of all polynomial-time-generated hypergraph states, which include output states of the Bremner-Montanaro-Shepherd-type instantaneous quantum polynomial time (IQP) circuits. Importantly, we do not make any assumption that the identically and independently distributed copies of the same states are given: Our protocols work even if some highly complicated entanglement is created among copies in any artificial way. As applications, we consider the verification of the quantum computational supremacy demonstration with IQP models, and verifiable blind quantum computing.

  1. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
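
    As a hypothetical illustration of the kind of fine-grained "unit" test discussed here, the snippet below checks a small numerical routine against properties it should satisfy analytically, rather than relying on a full system run. The advection routine, grids, and tolerances are invented for the example and are not taken from any particular climate model.

```python
import numpy as np

def advect_1d(q, u, dx, dt):
    """First-order upwind advection step for a 1D periodic tracer field."""
    return q - u * dt / dx * (q - np.roll(q, 1))

def test_advection_conserves_mass():
    x = np.linspace(0.0, 1.0, 100, endpoint=False)
    q = np.exp(-((x - 0.5) / 0.1) ** 2)
    q_new = advect_1d(q, u=1.0, dx=x[1] - x[0], dt=0.005)
    # Upwind advection on a periodic domain should conserve total tracer mass.
    assert abs(q_new.sum() - q.sum()) < 1e-12

def test_advection_translates_peak():
    x = np.linspace(0.0, 1.0, 1000, endpoint=False)
    q = np.exp(-((x - 0.3) / 0.05) ** 2)
    dx, dt, u, steps = x[1] - x[0], 0.0005, 1.0, 200
    for _ in range(steps):
        q = advect_1d(q, u, dx, dt)
    # After advecting for u * steps * dt = 0.1, the peak should sit near x = 0.4.
    assert abs(x[np.argmax(q)] - 0.4) < 0.01

if __name__ == "__main__":
    test_advection_conserves_mass()
    test_advection_translates_peak()
    print("all tests passed")
```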

  2. Application of Infrared Thermography in Power Distribution System

    Directory of Open Access Journals (Sweden)

    Anwer Ali Sahito

    2014-07-01

    Full Text Available The electricity sector of Pakistan is facing a daunting energy crisis. The generation deficit results in long periods of load shedding throughout the country. An aged distribution system, lack of maintenance, and equipment failures cause long unplanned outages and frequent supply interruptions. HESCO (Hyderabad Electric Supply Company) is facing high technical losses, supply interruptions, and financial loss due to transformer damage. Infrared thermography is a non-contact, safe and fast technique for distribution system inspection. In this paper, thermographic inspection of different distribution system equipment is carried out to identify developing faults. It is observed that IR (infrared) thermography is an effective means of detecting developing fault conditions at an early stage and avoiding unplanned outages

  3. Can Faces Prime a Language?

    Science.gov (United States)

    Woumans, Evy; Martin, Clara D; Vanden Bulcke, Charlotte; Van Assche, Eva; Costa, Albert; Hartsuiker, Robert J; Duyck, Wouter

    2015-09-01

    Bilinguals have two languages that are activated in parallel. During speech production, one of these languages must be selected on the basis of some cue. The present study investigated whether the face of an interlocutor can serve as such a cue. Spanish-Catalan and Dutch-French bilinguals were first familiarized with certain faces, each of which was associated with only one language, during simulated Skype conversations. Afterward, these participants performed a language production task in which they generated words associated with the words produced by familiar and unfamiliar faces displayed on-screen. When responding to familiar faces, participants produced words faster if the faces were speaking the same language as in the previous Skype simulation than if the same faces were speaking a different language. Furthermore, this language priming effect disappeared when it became clear that the interlocutors were actually bilingual. These findings suggest that faces can prime a language, but their cuing effect disappears when it turns out that they are unreliable as language cues. © The Author(s) 2015.

  4. Modeling human dynamics of face-to-face interaction networks

    OpenAIRE

    Starnini, Michele; Baronchelli, Andrea; Pastor-Satorras, Romualdo

    2013-01-01

    Face-to-face interaction networks describe social interactions in human gatherings, and are the substrate for processes such as epidemic spreading and gossip propagation. The bursty nature of human behavior characterizes many aspects of empirical data, such as the distribution of conversation lengths, of conversations per person, or of inter-conversation times. Despite several recent attempts, a general theoretical understanding of the global picture emerging from data is still lacking. Here ...

  5. FUSION DECISION FOR A BIMODAL BIOMETRIC VERIFICATION SYSTEM USING SUPPORT VECTOR MACHINE AND ITS VARIATIONS

    Directory of Open Access Journals (Sweden)

    A. Teoh

    2017-12-01

    Full Text Available This paper presents fusion decision technique comparisons based on the support vector machine and its variations for a bimodal biometric verification system that makes use of face images and speech utterances. The system is essentially constructed by a face expert, a speech expert and a fusion decision module. Each individual expert has been optimized to operate in automatic mode and designed for security access applications. The fusion decision schemes considered are linear SVM, weighted Support Vector Machine (SVM) and linear SVM with quadratic transformation. The conditions tested include balanced and unbalanced conditions between the two experts in order to obtain the optimum fusion module from these techniques best suited to the target application.
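
    A fusion decision module of this kind can be sketched as a two-dimensional classifier over (face score, speech score) pairs. The example below is a generic illustration using scikit-learn, not the authors' system; the expert scores are synthetic and the class means are invented for the example.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic expert scores: column 0 = face expert, column 1 = speech expert.
genuine = rng.normal(loc=[0.8, 0.7], scale=0.1, size=(200, 2))
impostor = rng.normal(loc=[0.4, 0.3], scale=0.15, size=(200, 2))
X = np.vstack([genuine, impostor])
y = np.hstack([np.ones(200), np.zeros(200)])

# Linear SVM fusion; a weighted or kernelized SVM would follow the same pattern.
fusion = SVC(kernel="linear", C=1.0)
fusion.fit(X, y)

claim_scores = np.array([[0.75, 0.55]])  # one verification attempt
print("accept" if fusion.predict(claim_scores)[0] == 1 else "reject")
print("distance to decision boundary:", fusion.decision_function(claim_scores)[0])
```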

  6. Faces in the Mist: Illusory Face and Letter Detection

    Directory of Open Access Journals (Sweden)

    Cory A. Rieth

    2011-06-01

    Full Text Available We report three behavioral experiments on the spatial characteristics evoking illusory face and letter detection. False detections made to pure noise images were analyzed using a modified reverse correlation method in which hundreds of observers rated a modest number of noise images (480 during a single session. This method was originally developed for brain imaging research, and has been used in a number of fMRI publications, but this is the first report of the behavioral classification images. In Experiment 1 illusory face detection occurred in response to scattered dark patches throughout the images, with a bias to the left visual field. This occurred despite the use of a fixation cross and expectations that faces would be centered. In contrast, illusory letter detection (Experiment 2 occurred in response to centrally positioned dark patches. Experiment 3 included an oval in all displays to spatially constrain illusory face detection. With the addition of this oval the classification image revealed an eyes/nose/mouth pattern. These results suggest that face detection is triggered by a minimal face-like pattern even when these features are not centered in visual focus.
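
    The modified reverse correlation analysis can be illustrated schematically: a classification image is formed by averaging the noise images that triggered "face present" responses and subtracting the average of those that did not. The sketch below uses purely synthetic data and is only a generic illustration of the method, not the authors' exact pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

n_images, size = 480, 64
noise_images = rng.normal(size=(n_images, size, size))

# Hypothetical observer responses: 1 = "saw a face", 0 = "no face".
responses = rng.integers(0, 2, size=n_images)

# Classification image: mean of "face" trials minus mean of "no face" trials.
ci = noise_images[responses == 1].mean(axis=0) - noise_images[responses == 0].mean(axis=0)

# Smooth lightly so spatial structure (e.g., eye/mouth regions) is easier to see.
kernel = np.ones((3, 3)) / 9.0
smoothed = np.zeros_like(ci)
for dy in (-1, 0, 1):
    for dx in (-1, 0, 1):
        smoothed += np.roll(np.roll(ci, dy, axis=0), dx, axis=1) * kernel[dy + 1, dx + 1]

print("classification image shape:", smoothed.shape)
```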

  7. Technical safety requirements control level verification

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  9. Information Theory for Gabor Feature Selection for Face Recognition

    Directory of Open Access Journals (Sweden)

    Shen Linlin

    2006-01-01

    Full Text Available A discriminative and robust feature—kernel enhanced informative Gabor feature—is proposed in this paper for face recognition. Mutual information is applied to select a set of informative and nonredundant Gabor features, which are then further enhanced by kernel methods for recognition. Compared with one of the top performing methods in the 2004 Face Verification Competition (FVC2004), our methods demonstrate a clear advantage over existing methods in accuracy, computation efficiency, and memory cost. The proposed method has been fully tested on the FERET database using the FERET evaluation protocol. Significant improvements on three of the test data sets are observed. Compared with the classical Gabor wavelet-based approaches using a huge number of features, our method requires less than 4 milliseconds to retrieve a few hundreds of features. Due to the substantially reduced feature dimension, only 4 seconds are required to recognize 200 face images. The paper also unified different Gabor filter definitions and proposed a training sample generation algorithm to reduce the effects caused by unbalanced number of samples available in different classes.

  11. Two-dimensional spectroscopy at infrared and optical frequencies

    OpenAIRE

    Hochstrasser, Robin M.

    2007-01-01

    This Perspective on multidimensional spectroscopy in the optical and infrared spectral regions focuses on the principles and the scientific and technical challenges facing these new fields. The methods hold great promise for advances in the visualization of time-dependent structural changes in complex systems ranging from liquids to biological assemblies, new materials, and fundamental physical processes. The papers in this special feature on multidimensional spectroscopy in chemistry, physic...

  12. Infrared diffuse interstellar bands

    Science.gov (United States)

    Galazutdinov, G. A.; Lee, Jae-Joon; Han, Inwoo; Lee, Byeong-Cheol; Valyavin, G.; Krełowski, J.

    2017-05-01

    We present high-resolution (R ˜ 45 000) profiles of 14 diffuse interstellar bands in the ˜1.45 to ˜2.45 μm range based on spectra obtained with the Immersion Grating INfrared Spectrograph at the McDonald Observatory. The revised list of diffuse bands with accurately estimated rest wavelengths includes six new features. The diffuse band at 15 268.2 Å demonstrates a very symmetric profile shape and thus can serve as a reference for finding the 'interstellar correction' to the rest wavelength frame in the H range, which suffers from a lack of known atomic/molecular lines.

  13. Infrared upconversion hyperspectral imaging

    DEFF Research Database (Denmark)

    Kehlet, Louis Martinus; Tidemand-Lichtenberg, Peter; Dam, Jeppe Seidelin

    2015-01-01

    In this Letter, hyperspectral imaging in the mid-IR spectral region is demonstrated based on nonlinear frequency upconversion and subsequent imaging using a standard Si-based CCD camera. A series of upconverted images are acquired with different phase match conditions for the nonlinear frequency conversion process. From this, a sequence of monochromatic images in the 3.2-3.4 mu m range is generated. The imaged object consists of a standard United States Air Force resolution target combined with a polystyrene film, resulting in the presence of both spatial and spectral information in the infrared image. (C) 2015 Optical Society of America

  14. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focused on the verification of scheduling analysis parameters. The proposal is part of a Model Driven Engineering based process for automating verification and validation of on-board satellite software, and it has been applied to the software control unit of the energetic particle detector that is a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on target, verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.
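
    The final step, feeding instrumented evidence into timing constraints, can be illustrated with a small sketch. The constraint form and the trace format below are hypothetical; they only show the general idea of checking measured task execution times and activation periods against declared bounds.

```python
from dataclasses import dataclass

@dataclass
class TimingConstraint:
    task: str
    wcet_us: float    # worst-case execution time bound
    period_us: float  # expected activation period

# Hypothetical instrumentation trace: (task, activation time, completion time) in microseconds.
trace = [
    ("acq_task", 0.0, 180.0),
    ("acq_task", 1000.0, 1220.0),
    ("acq_task", 2000.0, 2190.0),
]

def check(constraint: TimingConstraint, events):
    """Return True if every activation respects the WCET and period bounds."""
    ok = True
    starts = [start for _, start, _ in events]
    for _, start, end in events:
        if end - start > constraint.wcet_us:
            print(f"{constraint.task}: WCET violated ({end - start:.0f} us)")
            ok = False
    for prev, nxt in zip(starts, starts[1:]):
        if abs((nxt - prev) - constraint.period_us) > 0.05 * constraint.period_us:
            print(f"{constraint.task}: period deviation ({nxt - prev:.0f} us)")
            ok = False
    return ok

c = TimingConstraint(task="acq_task", wcet_us=200.0, period_us=1000.0)
events = [e for e in trace if e[0] == c.task]
print("constraint satisfied:", check(c, events))
```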

  15. A Brazing Defect Detection Using an Ultrasonic Infrared Imaging Inspection

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jai Wan; Choi, Young Soo; Jung, Seung Ho; Jung, Hyun Kyu [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2007-10-15

    When high-energy ultrasound propagates through a solid body that contains a crack or a delamination, the two faces of the defect do not ordinarily vibrate in unison, and dissipative phenomena such as friction, rubbing and clapping between the faces convert some of the vibrational energy to heat. By combining this heating effect with infrared imaging, one can detect a subsurface defect in a material in real time. In this paper, real-time detection of brazing defects in thin Inconel plates using UIR (ultrasonic infrared imaging) technology is described. A low-frequency (23 kHz) ultrasonic transducer was used to infuse the welded Inconel plates with a short pulse of sound for 280 ms. The ultrasonic source has a maximum power of 2 kW. The surface temperature of the area under inspection is imaged by an infrared camera that is coupled to a fast frame grabber in a computer. Hot spots, strongly heated small areas around the boundary between the two faces of the Inconel plates near the defective brazing points, are observed, and a weak thermal signal is also observed at the defect position of the brazed plate. Using image processing techniques such as background subtraction averaging and image enhancement by histogram equalization, the positions of defective brazing regions in the thin Inconel plates can be reliably located

  16. Similarity measures for face recognition

    CERN Document Server

    Vezzetti, Enrico

    2015-01-01

    Face recognition has several applications, including security (authentication and identification of device users and criminal suspects) and medicine (corrective surgery and diagnosis). Facial recognition programs rely on algorithms that can compare and compute the similarity between two sets of images. This eBook explains some of the similarity measures used in facial recognition systems in a single volume. Readers will learn about various measures including Minkowski distances, Mahalanobis distances, Hausdorff distances, and cosine-based distances, among other methods. The book also summarizes errors that may occur in face recognition methods. Computer scientists "facing face" and looking to select and test different methods of computing similarities will benefit from this book. The book is also a useful tool for students undertaking computer vision courses.
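
    As a generic illustration of such similarity measures (not taken from the book), the sketch below compares two hypothetical face feature vectors with a Minkowski distance, a cosine-based distance, and a Mahalanobis distance whose covariance is estimated from a small synthetic gallery.

```python
import numpy as np

rng = np.random.default_rng(2)

def minkowski(a, b, p=2):
    return np.sum(np.abs(a - b) ** p) ** (1.0 / p)

def cosine_distance(a, b):
    return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def mahalanobis(a, b, cov_inv):
    d = a - b
    return float(np.sqrt(d @ cov_inv @ d))

# Hypothetical 128-dimensional face feature vectors.
gallery = rng.normal(size=(500, 128))                # gallery used to estimate covariance
probe = rng.normal(size=128)
enrolled = probe + rng.normal(scale=0.3, size=128)   # same identity, small variation

cov_inv = np.linalg.inv(np.cov(gallery, rowvar=False) + 1e-6 * np.eye(128))
print("Euclidean (Minkowski p=2):", minkowski(probe, enrolled))
print("Cosine distance:          ", cosine_distance(probe, enrolled))
print("Mahalanobis distance:     ", mahalanobis(probe, enrolled, cov_inv))
```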

  17. Visual attention to dynamic faces and objects is linked to face processing skills: A combined study of children with autism and controls

    Directory of Open Access Journals (Sweden)

    Julia eParish-Morris

    2013-04-01

    Full Text Available Although the extant literature on face recognition skills in Autism Spectrum Disorder (ASD shows clear impairments compared to typically developing controls (TDC at the group level, the distribution of scores within ASD is broad. In the present research, we take a dimensional approach and explore how differences in social attention during an eye tracking experiment correlate with face recognition skills across ASD and TDC. Emotional discrimination and person identity perception face processing skills were assessed using the Let’s Face It! Skills Battery in 110 children with and without ASD. Social attention was assessed using infrared eye gaze tracking during passive viewing of movies of facial expressions and objects displayed together on a computer screen. Face processing skills were significantly correlated with measures of attention to faces and with social skills as measured by the Social Communication Questionnaire. Consistent with prior research, children with ASD scored significantly lower on face processing skills tests but, unexpectedly, group differences in amount of attention to faces (versus objects were not found. We discuss possible methodological contributions to this null finding. We also highlight the importance of a dimensional approach for understanding the developmental origins of reduced face perception skills, and emphasize the need for longitudinal research to truly understand how social motivation and social attention influence the development of social perceptual skills.

  18. Thermography by Infrared

    International Nuclear Information System (INIS)

    Harara, W.; Allouch, Y.; Altahan, A.

    2015-08-01

    This study explains the principles of testing metallic components and structures by thermography using infrared waves. The study confirms that thermal wave testing is one of the most important modern non-destructive testing methods: it is economical, easy to apply, and allows timely testing of components and metallic structures. The method is applicable to a wide variety of components, such as aircraft parts, power plants, electric transmission lines and aerospace components, in order to verify their structure, fabrication quality and conformance to international standards. Testing components by thermography using infrared radiation is easy and rapid compared to other NDT methods. The study includes an introduction to the thermography testing method, its equipment and components, and the applied technique. Finally, two practical applications of industrial interest are given: determining the liquid level in a tank and testing the stability of an electrical supply control box.(author)

  19. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for iterative reconstruction of nonnegative sparse signals, using parity check matrices of low-density parity-check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm alone. Simulation resul...
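
    For context, the basic verification mechanism referred to here can be sketched as follows: for a nonnegative sparse signal and a 0/1 measurement matrix, a zero measurement verifies all variables it touches as zero, and a measurement with only one unverified neighbor determines that remaining variable. The code below is a generic illustration of this idea under those assumptions, not the authors' IP variant.

```python
import numpy as np

def verification_decode(A, y, max_iters=50):
    """Recover a nonnegative sparse x from y = A @ x (A binary) via verification rules."""
    n = A.shape[1]
    x = np.full(n, np.nan)  # NaN = not yet verified
    for _ in range(max_iters):
        progress = False
        for j, yj in enumerate(y):
            neigh = np.flatnonzero(A[j])
            unknown = [v for v in neigh if np.isnan(x[v])]
            if not unknown:
                continue
            if yj == 0:                # zero measurement: all its neighbors are zero
                x[unknown] = 0.0
                progress = True
            elif len(unknown) == 1:    # single unknown neighbor: solve for it directly
                known_sum = np.nansum(x[neigh]) if len(unknown) < len(neigh) else 0.0
                x[unknown[0]] = yj - known_sum
                progress = True
        if not progress:
            break
    return x

# Small illustrative instance.
A = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1],
              [1, 0, 0, 1]])
x_true = np.array([0.0, 0.0, 2.0, 0.0])
print(verification_decode(A, A @ x_true))
```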

  20. 3D Face Appearance Model

    DEFF Research Database (Denmark)

    Lading, Brian; Larsen, Rasmus; Astrom, K

    2006-01-01

    We build a 3D face shape model, including inter- and intra-shape variations, derive the analytical Jacobian of its resulting 2D rendered image, and show examples of its fitting performance with light, pose, id, expression and texture variations.

  1. Low background infrared (LBIR) facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Low background infrared (LBIR) facility was originally designed to calibrate user supplied blackbody sources and to characterize low-background IR detectors and...

  2. Infrared emission from supernova condensates

    International Nuclear Information System (INIS)

    Dwek, E.; Werner, M.W.

    1981-01-01

    We examine the possibility of detecting grains formed in supernovae by observations of their emission in the infrared. The basic processes determining the temperature and infrared radiation of grains in supernovae environments are analyzed, and the results are used to estimate the infrared emission from the highly metal enriched ''fast moving knots'' in Cas A. The predicted fluxes lie within the reach of current ground-based facilities at 10 μm, and their emission should be detectable throughout the infrared band with cryogenic space telescopes

  3. Enhanced attention amplifies face adaptation.

    Science.gov (United States)

    Rhodes, Gillian; Jeffery, Linda; Evangelista, Emma; Ewing, Louise; Peters, Marianne; Taylor, Libby

    2011-08-15

    Perceptual adaptation not only produces striking perceptual aftereffects, but also enhances coding efficiency and discrimination by calibrating coding mechanisms to prevailing inputs. Attention to simple stimuli increases adaptation, potentially enhancing its functional benefits. Here we show that attention also increases adaptation to faces. In Experiment 1, face identity aftereffects increased when attention to adapting faces was increased using a change detection task. In Experiment 2, figural (distortion) face aftereffects increased when attention was increased using a snap game (detecting immediate repeats) during adaptation. Both were large effects. Contributions of low-level adaptation were reduced using free viewing (both experiments) and a size change between adapt and test faces (Experiment 2). We suggest that attention may enhance adaptation throughout the entire cortical visual pathway, with functional benefits well beyond the immediate advantages of selective processing of potentially important stimuli. These results highlight the potential to facilitate adaptive updating of face-coding mechanisms by strategic deployment of attentional resources. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of compositional verification, we perform an in-depth case study of a leader election protocol, modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference...

  5. Verification and Validation of Embedded Knowledge-Based Software Systems

    National Research Council Canada - National Science Library

    Santos, Eugene

    1999-01-01

    .... We pursued this by carefully examining the nature of uncertainty and information semantics and developing intelligent tools for verification and validation that provides assistance to the subject...

  6. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    Full Text Available Design process of computing systems gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure has many inherent miscomprehensions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure that encompasses definition of the specification language and denotation and execution of conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified and possible improvements proposed.

  7. Tolerance Verification of Micro and Nano Structures on Polycarbonate Substrates

    DEFF Research Database (Denmark)

    Gasparin, Stefania; Tosello, Guido; Hansen, Hans Nørgaard

    2010-01-01

    Micro and nano structures are an increasing challenge in terms of tolerance verification and process quality control: smaller dimensions lead to a smaller tolerance zone to be evaluated. This paper focuses on the verification of CD, DVD and HD-DVD nanoscale features. CD tolerance features are defi...

  8. Current status of verification practices in clinical biochemistry in Spain.

    Science.gov (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization mainly in the algorithms used and the criteria and verification limits applied. A survey in clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta Check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but amount of use (percentage of test autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of them, with autoverification ability. Criteria and rules for seven routine biochemical tests were obtained.
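
    A hypothetical sketch of the kind of rules mentioned above (autoverification limits, reference limits, and a delta check) is shown below; the analyte, the limits and the delta criterion are invented for illustration and are not the survey's values.

```python
# Hypothetical autoverification rules for a single analyte (illustrative values only).
RULES = {
    "potassium": {"ref_low": 3.5, "ref_high": 5.1,       # mmol/L reference interval
                  "verif_low": 2.8, "verif_high": 6.2,   # autoverification limits
                  "delta_abs": 1.0},                     # max change vs previous result
}

def autoverify(analyte, value, previous=None):
    """Return (released, reason). Results outside the limits are held for manual review."""
    r = RULES[analyte]
    if not (r["verif_low"] <= value <= r["verif_high"]):
        return False, "outside autoverification limits"
    if previous is not None and abs(value - previous) > r["delta_abs"]:
        return False, "delta check failed"
    flag = "" if r["ref_low"] <= value <= r["ref_high"] else " (outside reference interval)"
    return True, "released automatically" + flag

print(autoverify("potassium", 4.2, previous=4.0))
print(autoverify("potassium", 5.8, previous=4.0))  # fails delta check
print(autoverify("potassium", 6.8))                # outside autoverification limits
```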

  9. Investigation of Navier-Stokes Code Verification and Design Optimization

    Science.gov (United States)

    Vaidyanathan, Rajkumar

    2004-01-01

    With rapid progress made in employing computational techniques for various complex Navier-Stokes fluid flow problems, design optimization problems traditionally based on empirical formulations and experiments are now being addressed with the aid of computational fluid dynamics (CFD). To be able to carry out an effective CFD-based optimization study, it is essential that the uncertainty and appropriate confidence limits of the CFD solutions be quantified over the chosen design space. The present dissertation investigates the issues related to code verification, surrogate model-based optimization and sensitivity evaluation. For Navier-Stokes (NS) CFD code verification a least square extrapolation (LSE) method is assessed. This method projects numerically computed NS solutions from multiple, coarser base grids onto a finer grid and improves solution accuracy by minimizing the residual of the discretized NS equations over the projected grid. In this dissertation, the finite volume (FV) formulation is focused on. The interplay between these concepts and the outcome of LSE, and the effects of solution gradients and singularities, nonlinear physics, and coupling of flow variables on the effectiveness of LSE are investigated. A CFD-based design optimization of a single element liquid rocket injector is conducted with surrogate models developed using response surface methodology (RSM) based on CFD solutions. The computational model consists of the NS equations, finite rate chemistry, and the k-ε turbulence closure. With the aid of these surrogate models, sensitivity and trade-off analyses are carried out for the injector design whose geometry (hydrogen flow angle, hydrogen and oxygen flow areas and oxygen post tip thickness) is optimized to attain desirable goals in performance (combustion length) and life/survivability (the maximum temperatures on the oxidizer post tip and injector face and a combustion chamber wall temperature). A preliminary multi-objective optimization

  10. Near-infrared photometric study of open star cluster IC 1805

    International Nuclear Information System (INIS)

    Sagar, R.; Yu, Q.Z.

    1990-01-01

    The JHK magnitudes of 29 stars in the region of open star cluster IC 1805 were measured. These, and the existing infrared and optical observations, indicate a normal interstellar extinction law in the direction of the cluster. Further, most of the early-type stars have near-infrared fluxes as expected from their spectral types. Patchy distribution of ionized gas and dust appears to be the cause of nonuniform extinction across the cluster face. 36 refs

  11. Discrimination between smiling faces: Human observers vs. automated face analysis.

    Science.gov (United States)

    Del Líbano, Mario; Calvo, Manuel G; Fernández-Martín, Andrés; Recio, Guillermo

    2018-05-11

    This study investigated (a) how prototypical happy faces (with happy eyes and a smile) can be discriminated from blended expressions with a smile but non-happy eyes, depending on type and intensity of the eye expression; and (b) how smile discrimination differs for human perceivers versus automated face analysis, depending on affective valence and morphological facial features. Human observers categorized faces as happy or non-happy, or rated their valence. Automated analysis (FACET software) computed seven expressions (including joy/happiness) and 20 facial action units (AUs). Physical properties (low-level image statistics and visual saliency) of the face stimuli were controlled. Results revealed, first, that some blended expressions (especially, with angry eyes) had lower discrimination thresholds (i.e., they were identified as "non-happy" at lower non-happy eye intensities) than others (especially, with neutral eyes). Second, discrimination sensitivity was better for human perceivers than for automated FACET analysis. As an additional finding, affective valence predicted human discrimination performance, whereas morphological AUs predicted FACET discrimination. FACET can be a valid tool for categorizing prototypical expressions, but is currently more limited than human observers for discrimination of blended expressions. Configural processing facilitates detection of in/congruence(s) across regions, and thus detection of non-genuine smiling faces (due to non-happy eyes). Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Statistical Model-Based Face Pose Estimation

    Institute of Scientific and Technical Information of China (English)

    GE Xinliang; YANG Jie; LI Feng; WANG Huahua

    2007-01-01

    A robust face pose estimation approach is proposed by using face shape statistical model approach and pose parameters are represented by trigonometric functions. The face shape statistical model is firstly built by analyzing the face shapes from different people under varying poses. The shape alignment is vital in the process of building the statistical model. Then, six trigonometric functions are employed to represent the face pose parameters. Lastly, the mapping function is constructed between face image and face pose by linearly relating different parameters. The proposed approach is able to estimate different face poses using a few face training samples. Experimental results are provided to demonstrate its efficiency and accuracy.

  13. Specification and Verification of Hybrid System

    International Nuclear Information System (INIS)

    Widjaja, Belawati H.

    1997-01-01

    Hybrid systems are reactive systems that intermix two kinds of components: discrete components and continuous components. The continuous components, usually called plants, are subject to disturbances which cause the state variables of the system to change continuously according to physical laws and/or control laws. The discrete components can be digital computers, sensors and actuators controlled by programs. These programs are designed to select, control and supervise the behavior of the continuous components. Specification and verification of hybrid systems has recently become an active area of research in both computer science and control engineering, and many papers concerning hybrid systems have been published. This paper gives a design methodology for hybrid systems as an example of the specification and verification of hybrid systems. The design methodology is based on cooperation between two disciplines, control engineering and computer science, and brings together the design of control loops and decision loops. The external behavior of the control loops is specified in a notation that is understandable by both disciplines. The design of the control loops, which employs the theory of differential equations, is done by control engineers, and its correctness is guaranteed analytically or experimentally by control engineers. The decision loops are designed in computer science based on the specifications of the control loops. The verification of the system requirements can be done by computer scientists using a formal reasoning mechanism. To illustrate the proposed design, the problem of balancing an inverted pendulum, a popular experimental device in control theory, is considered, and the Mean Value Calculus is chosen as a formal notation for specifying the control loops and designing the decision loops

  14. VEG-01: Veggie Hardware Verification Testing

    Science.gov (United States)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept, and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted and lettuce plants were successfully grown in prototype Veggie hardware, microbial samples were taken, plants were harvested, frozen, stored and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  15. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against current observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we approach the challenges in spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a decaying rate with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
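
    The power-law smoothing idea can be sketched as follows: each simulated epicenter contributes rate to every grid cell, with a weight that decays with epicentral distance. The kernel exponent, grid, magnitude weighting and events below are invented for illustration and do not reproduce the paper's calibration.

```python
import numpy as np

def smoothed_rate_map(epicenters, magnitudes, grid_x, grid_y, r0_km=5.0, q=1.5):
    """Spread each simulated event over the grid with a power-law distance kernel."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    rate = np.zeros_like(gx)
    for (ex, ey), mag in zip(epicenters, magnitudes):
        r = np.hypot(gx - ex, gy - ey)
        kernel = (r0_km / (r + r0_km)) ** q   # decays with epicentral distance
        # Weight larger events more heavily (illustrative magnitude scaling).
        rate += 10.0 ** (mag - 6.0) * kernel / kernel.sum()
    return rate

# Hypothetical simulated catalog on a 100 km x 100 km region (km coordinates).
epicenters = [(20.0, 30.0), (60.0, 70.0), (65.0, 65.0)]
magnitudes = [6.5, 6.1, 7.0]
grid = np.linspace(0.0, 100.0, 101)
rates = smoothed_rate_map(epicenters, magnitudes, grid, grid)
print("cell with highest forecast rate:", np.unravel_index(rates.argmax(), rates.shape))
```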

  16. In-core Instrument Subcritical Verification (INCISV) - Core Design Verification Method - 358

    International Nuclear Information System (INIS)

    Prible, M.C.; Heibel, M.D.; Conner, S.L.; Sebastiani, P.J.; Kistler, D.P.

    2010-01-01

    According to the standard on reload startup physics testing, ANSI/ANS 19.6.1, a plant must verify that the constructed core behaves sufficiently close to the designed core to confirm that the various safety analyses bound the actual behavior of the plant. A large portion of this verification must occur before the reactor operates at power. The INCISV Core Design Verification Method uses the unique characteristics of a Westinghouse Electric Company fixed in-core self-powered detector design to perform core design verification after a core reload, before power operation. A vanadium self-powered detector that spans the length of the active fuel region is capable of confirming the required core characteristics prior to power ascension: reactivity balance, shutdown margin, temperature coefficient, and power distribution. Using a detector element that spans the length of the active fuel region inside the core provides a signal of total integrated flux. Measuring the integrated flux distributions and changes at various rodded conditions and plant temperatures, and comparing them to predicted flux levels, validates all necessary core design characteristics. INCISV eliminates the dependence on the various corrections and assumptions between the ex-core detectors and the core required by traditional physics testing programs. This program also eliminates the need for special rod maneuvers, which are infrequently performed by plant operators during typical core design verification testing, and allows for safer startup activities. (authors)
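
    The core of the comparison step can be pictured with a small, purely hypothetical sketch (the plant conditions, normalised signal values, and acceptance tolerance are assumptions, not Westinghouse acceptance criteria): measured integrated-flux signals at several rodded conditions are checked against predictions within a tolerance band.

        # Hypothetical, normalised integrated-flux signals at three plant conditions
        measured  = {"ARO": 1.000, "D-bank-in": 0.912, "C+D-in": 0.845}
        predicted = {"ARO": 1.000, "D-bank-in": 0.905, "C+D-in": 0.851}
        TOLERANCE = 0.02   # assumed fractional acceptance band

        for condition, m in measured.items():
            p = predicted[condition]
            deviation = abs(m - p) / p
            status = "OK" if deviation <= TOLERANCE else "REVIEW"
            print(f"{condition:>10s}: measured={m:.3f} predicted={p:.3f} dev={deviation:.3%} {status}")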

  17. MODELS CONCERNING PREVENTIVE VERIFICATION OF TECHNICAL EQUIPMENT

    Directory of Open Access Journals (Sweden)

    CÂRLAN M.

    2016-12-01

    Full Text Available The paper presents three operative models whose purpose is to improve the practice of preventive maintenance for a wide range of technical installations. Although the calculation criteria are different, the goal is the same: to determine the optimum time between two consecutive preventive interventions. The optimality criteria of these models are: the maximum share of technical-entity operating probabilities, in the case of the Ackoff-Sasieni method [1]; the optimum time interval for preventive verification depending on the preventive-corrective maintenance costs imposed by the deciding factor, for the Asturio-Baldin model [2]; and the minimum number of renewals, i.e. preventive and/or corrective maintenance operations [3].
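
    As a generic illustration of the underlying optimisation (this is a textbook age-replacement calculation under an assumed Weibull failure law, not a reproduction of the three models cited above), the preventive interval can be chosen to minimise the long-run cost rate:

        import math

        def cost_rate(T, beta=2.5, eta=1000.0, Cp=1.0, Cc=8.0, steps=2000):
            """Expected cost per unit time when preventive work is done every T hours."""
            R = lambda t: math.exp(-((t / eta) ** beta))   # Weibull reliability function
            dt = T / steps
            # expected cycle length = integral of R(t) from 0 to T (trapezoidal rule)
            cycle_length = sum(0.5 * (R(i * dt) + R((i + 1) * dt)) * dt for i in range(steps))
            cycle_cost = Cp * R(T) + Cc * (1.0 - R(T))     # preventive vs corrective outcome
            return cycle_cost / cycle_length

        best_T = min(range(100, 2001, 25), key=cost_rate)
        print(f"approximate optimum interval: {best_T} h, cost rate {cost_rate(best_T):.5f}")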

  18. Verification report for SIMREP 1.1

    International Nuclear Information System (INIS)

    Tarapore, P.S.

    1987-06-01

    SIMREP 1.1 is a discrete event computer simulation of repository operations in the surface waste-handling facility. The logic for this model is provided by Fluor Technology, Inc., the Architect/Engineer of the salt repository. The verification methods included a line-by-line review of the code, a detailed examination of a generated trace of all simulated events over a given period of operations, and a comparison of the simulation output results with expected values. SIMREP 1.1 performs in the required manner under the given range of input conditions

  19. Turf Conversion Measurement and Verification Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Kurnik, Charles W. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Stoughton, Kate M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Figueroa, Jorge [Western Resource Advocates, Boulder, CO (United States)

    2017-12-05

    This measurement and verification (M&V) protocol provides procedures for energy service companies (ESCOs) and water efficiency service companies (WESCOs) to determine water savings as a result of water conservation measures (WCMs) in energy performance contracts associated with converting turfgrass or other water-intensive plantings to water-wise and sustainable landscapes. The water savings are determined by comparing the baseline water use to the water use after the WCM has been implemented. This protocol outlines the basic structure of the M&V plan, and details the procedures to use to determine water savings.
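
    The basic savings arithmetic behind the protocol can be sketched as follows (the figures and the single adjustment term are hypothetical; the protocol itself details how baselines and adjustments are established):

        baseline_use_kgal  = 1200.0   # metered use over the baseline period
        reporting_use_kgal =  450.0   # metered use after the turf conversion
        adjustments_kgal   =   30.0   # assumed non-routine adjustment credited to the baseline

        savings_kgal = (baseline_use_kgal + adjustments_kgal) - reporting_use_kgal
        savings_pct  = 100.0 * savings_kgal / (baseline_use_kgal + adjustments_kgal)
        print(f"water savings: {savings_kgal:.0f} kgal ({savings_pct:.1f}%)")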

  20. Outdoor Irrigation Measurement and Verification Protocol

    Energy Technology Data Exchange (ETDEWEB)

    Kurnik, Charles W. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Stoughton, Kate M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Figueroa, Jorge [Western Resource Advocates, Boulder, CO (United States)

    2017-12-05

    This measurement and verification (M&V) protocol provides procedures for energy service companies (ESCOs) and water efficiency service companies (WESCOs) to determine water savings resulting from water conservation measures (WCMs) in energy performance contracts associated with outdoor irrigation efficiency projects. The water savings are determined by comparing the baseline water use to the water use after the WCM has been implemented. This protocol outlines the basic structure of the M&V plan, and details the procedures to use to determine water savings.

  1. Verification of product quality from process control

    International Nuclear Information System (INIS)

    Drobot, A.; Bunnell, L.R.; Freeborn, W.P.; Macedo, P.B.; Mellinger, G.B.; Pegg, I.L.; Piepel, G.F.; Reimus, M.A.H.; Routt, K.R.; Saad, E.

    1989-01-01

    Process models were developed to characterize the waste vitrification process at West Valley in terms of process operating constraints and achievable glass compositions. The need for verification of compliance with the proposed Waste Acceptance Preliminary Specification criteria led to the development of product models, the most critical being a glass durability model. Both process and product models were used in developing a target composition for the waste glass. This target composition is designed to ensure that glasses made to it will be of acceptable durability after all process variations have been accounted for. 4 refs., 11 figs., 5 tabs
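
    The role of the product models can be illustrated with a hedged sketch (the composition, process spreads, durability response, and acceptance limit below are invented for illustration and do not represent the West Valley models): a candidate target composition is perturbed within assumed process variation and each realisation is checked against a durability constraint.

        import random

        target = {"SiO2": 45.0, "B2O3": 10.0, "Na2O": 9.0, "Fe2O3": 12.0}   # wt%, assumed
        spread = {"SiO2": 1.0,  "B2O3": 0.5,  "Na2O": 0.5, "Fe2O3": 0.8}    # 1-sigma, assumed

        def durability_index(comp):
            # Purely hypothetical stand-in for a glass durability model (higher is better).
            return 0.9 * comp["SiO2"] - 0.6 * comp["Na2O"] - 0.3 * comp["B2O3"] + 0.2 * comp["Fe2O3"]

        LIMIT = 28.0   # assumed acceptance threshold
        N = 10_000
        failures = sum(
            durability_index({ox: random.gauss(mu, spread[ox]) for ox, mu in target.items()}) < LIMIT
            for _ in range(N)
        )
        print(f"fraction of simulated batches below the durability limit: {failures / N:.3%}")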

  2. FEFTRA {sup TM} verification. Update 2013

    Energy Technology Data Exchange (ETDEWEB)

    Loefman, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Meszaros, F. [The Relief Lab., Harskut, (Hungary)

    2013-12-15

    FEFTRA is a finite element program package developed at VTT for the analyses of groundwater flow in Posiva's site evaluation programme, which seeks a final repository for spent nuclear fuel in Finland. The code is capable of modelling steady-state or transient groundwater flow, solute transport and heat transfer as coupled or separate phenomena. Being a typical research tool used only by its developers, the FEFTRA code long lacked a competent testing system and precise documentation of the verification of the code. In 2006 a project was launched with the objective of reorganising all the material related to the existing verification cases and placing it in the FEFTRA program path under the version-control system. The work also included development of a new testing system, which automatically calculates the selected cases, checks the new results against the old approved results and constructs a summary of the test run. All the existing cases were gathered together, checked and added to the new testing system. The documentation of each case was rewritten with the LaTeX document preparation system and added to the testing system in such a way that the whole test documentation (this report) can easily be generated in PostScript or PDF format. The current report is the updated version of the verification report published in 2007. At the moment the report mainly includes the cases related to the testing of the primary result quantities (i.e. hydraulic head, pressure, salinity concentration, temperature). The selected cases, however, represent typical hydrological applications in which the program package has been and will be employed in Posiva's site evaluation programme, i.e. the simulations of groundwater flow, solute transport and heat transfer as separate or coupled phenomena. The comparison of the FEFTRA results to analytical, semianalytical and/or other numerical solutions proves the capability of FEFTRA to simulate such problems.
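
    The kind of automated regression harness described above can be pictured with a small sketch (the case names, runner command, file layout, and tolerance are assumptions, not the actual FEFTRA tooling): each case is run, its results are compared against the approved baseline, and a summary is printed.

        import pathlib
        import subprocess

        CASES = ["theis_well", "henry_saltwater", "coupled_heat"]   # hypothetical case names
        TOL = 1e-6                                                  # assumed comparison tolerance

        def read_values(path):
            return [float(x) for x in pathlib.Path(path).read_text().split()]

        summary = []
        for case in CASES:
            subprocess.run(["./feftra", f"cases/{case}/input.dat"], check=True)   # assumed CLI
            new = read_values(f"cases/{case}/result.out")
            ref = read_values(f"cases/{case}/approved.out")
            worst = max(abs(a - b) for a, b in zip(new, ref))
            summary.append((case, worst, "PASS" if worst <= TOL else "FAIL"))

        for case, worst, verdict in summary:
            print(f"{case:<20s} max deviation = {worst:.2e}  {verdict}")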

  3. CIT photoheliograph functional verification unit test program

    Science.gov (United States)

    1973-01-01

    Tests of the 2/3-meter photoheliograph functional verification unit (FVU) were performed with the FVU installed in its Big Bear Solar Observatory vacuum chamber. Interferometric tests were run in both the Newtonian (f/3.85) and Gregorian (f/50) configurations. Tests were run in both configurations with the optical axis horizontal, vertical, and at 45 deg to attempt to determine any gravity effects on the system. Gravity effects, if present, were masked by scatter in the data associated with the system wavefront error of 0.16λ rms (λ = 6328 Å), apparently due to problems in the primary mirror. Tests showed that the redesigned secondary mirror assembly works well.

  4. SCALE criticality safety verification and validation package

    International Nuclear Information System (INIS)

    Bowman, S.M.; Emmett, M.B.; Jordan, W.C.

    1998-01-01

    Verification and validation (V and V) are essential elements of software quality assurance (QA) for computer codes that are used for performing scientific calculations. V and V provides a means to ensure the reliability and accuracy of such software. As part of the SCALE QA and V and V plans, a general V and V package for the SCALE criticality safety codes has been assembled, tested and documented. The SCALE criticality safety V and V package is being made available to SCALE users through the Radiation Safety Information Computational Center (RSICC) to assist them in performing adequate V and V for their SCALE applications

  5. Accelerating functional verification of an integrated circuit

    Science.gov (United States)

    Deindl, Michael; Ruedinger, Jeffrey Joseph; Zoellin, Christian G.

    2015-10-27

    Illustrative embodiments include a method, system, and computer program product for accelerating functional verification in simulation testing of an integrated circuit (IC). Using a processor and a memory, a serial operation is replaced with a direct register access operation, wherein the serial operation is configured to perform a bit-shifting operation using a register in a simulation of the IC. The serial operation is blocked from manipulating the register in the simulation of the IC. Using the register in the simulation of the IC, the direct register access operation is performed in place of the serial operation.
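
    The idea can be pictured with a short sketch (rendered here in Python rather than the simulation environment of the patent): a register that would normally be filled one bit per simulated cycle through a serial path is instead written in a single direct access, and the serial path is skipped.

        class SimRegister:
            def __init__(self, width):
                self.width = width
                self.value = 0

            def shift_in(self, bit):          # serial path: one simulated cycle per bit
                self.value = ((self.value << 1) | (bit & 1)) & ((1 << self.width) - 1)

            def load_direct(self, value):     # direct register access: one operation
                self.value = value & ((1 << self.width) - 1)

        reg = SimRegister(32)
        pattern = 0xDEADBEEF

        # Serial version: 32 simulated shift cycles.
        for i in reversed(range(32)):
            reg.shift_in((pattern >> i) & 1)
        assert reg.value == pattern

        # Accelerated version: the same final register state in a single access.
        reg.load_direct(pattern)
        assert reg.value == pattern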

  6. Burnup verification using the FORK measurement system

    International Nuclear Information System (INIS)

    Ewing, R.I.

    1994-01-01

    Verification measurements may be used to help ensure nuclear criticality safety when burnup credit is applied to spent fuel transport and storage systems. The FORK measurement system, designed at Los Alamos National Laboratory for the International Atomic Energy Agency safeguards program, has been used to verify reactor site records for burnup and cooling time for many years. The FORK system measures the passive neutron and gamma-ray emission from spent fuel assemblies while in the storage pool. This report deals with the application of the FORK system to burnup credit operations based on measurements performed on spent fuel assemblies at the Oconee Nuclear Station of Duke Power Company
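
    A hedged sketch of the kind of consistency check such measurements support (the calibration constant, exponent, and decay correction below are illustrative assumptions, not the FORK algorithm): passive neutron emission from spent fuel is dominated by Cm-244 and rises steeply with burnup, so a calibrated power law can be inverted to test a declared burnup value.

        import math

        def inferred_burnup(neutron_rate, cooling_years, k=2.0e-5, exponent=4.0, half_life=18.1):
            """Invert rate = k * burnup**exponent * exp(-lambda*t) for burnup (GWd/tU)."""
            lam = math.log(2.0) / half_life                  # Cm-244 decay constant (1/yr)
            corrected = neutron_rate / math.exp(-lam * cooling_years)
            return (corrected / k) ** (1.0 / exponent)

        declared_burnup = 38.0    # GWd/tU from reactor records (hypothetical)
        measured_rate   = 31.0    # relative neutron count rate (hypothetical units)
        estimate = inferred_burnup(measured_rate, cooling_years=8.0)
        print(f"declared {declared_burnup:.1f} GWd/tU vs inferred {estimate:.1f} GWd/tU")
        print("consistent" if abs(estimate - declared_burnup) / declared_burnup < 0.10 else "flag for review")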

  7. The backfitting process and its verification

    International Nuclear Information System (INIS)

    Del Nero, G.; Grimaldi, G.

    1990-01-01

    Backfitting of plants in operation is based on compliance with new standards and regulations, and on lessons learned from operating experience. This goal can be achieved more effectively on the basis of a valid methodology of analysis and a consistent process of collection, storage and retrieval of the operating data. The general backfitting problem, the verification process and the utilization of TPA as a means to assess backfitting are illustrated. The results of the analyses performed on the Caorso plant are presented as well, using some specially designed software tools. The focus is on management rather than hardware problems. Some general conclusions are then presented as the final results of the whole work.

  8. CINE: Comet INfrared Excitation

    Science.gov (United States)

    de Val-Borro, Miguel; Cordiner, Martin A.; Milam, Stefanie N.; Charnley, Steven B.

    2017-08-01

    CINE calculates infrared pumping efficiencies that can be applied to the most common molecules found in cometary comae such as water, hydrogen cyanide or methanol. One of the main mechanisms for molecular excitation in comets is the fluorescence by the solar radiation followed by radiative decay to the ground vibrational state. This command-line tool calculates the effective pumping rates for rotational levels in the ground vibrational state scaled by the heliocentric distance of the comet. Fluorescence coefficients are useful for modeling rotational emission lines observed in cometary spectra at sub-millimeter wavelengths. Combined with computational methods to solve the radiative transfer equations based, e.g., on the Monte Carlo algorithm, this model can retrieve production rates and rotational temperatures from the observed emission spectrum.
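
    The heliocentric scaling mentioned above can be sketched very simply (this is not CINE's actual interface, and the 1 AU rate is a made-up value): a pumping rate tabulated at 1 AU falls off with the inverse square of the comet's heliocentric distance.

        def pumping_rate(g_1au, r_h_au):
            """Effective pumping rate (s^-1) at heliocentric distance r_h (AU)."""
            return g_1au / (r_h_au ** 2)

        g_1au = 2.5e-5   # hypothetical rate for one rotational level at 1 AU
        for r in (0.7, 1.0, 1.6, 3.0):
            print(f"r_h = {r:3.1f} AU  ->  g = {pumping_rate(g_1au, r):.2e} s^-1")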

  9. Infrared Quenched Photoinduced Superconductivity

    Science.gov (United States)

    Federici, J. F.; Chew, D.; Guttierez-Solana, J.; Molina, G.; Savin, W.; Wilber, W.

    1996-03-01

    Persistent photoconductivity (PPC) and photoinduced superconductivity (PISC) in oxygen-deficient YBa_2Cu_3O_6+x have received recent attention. It has been suggested that oxygen vacancy defects play an important role in the PISC/PPC mechanism (J. F. Federici, D. Chew, B. Welker, W. Savin, J. Gutierrez-Solana, and T. Fink, Phys. Rev. B, December 1995; supported by the National Science Foundation). In this model, defects trap photogenerated electrons so that electron-hole recombination cannot occur, thereby allowing photogenerated holes to contribute to the carrier density. Nominally, the photoinduced state is long-lived, persisting for days at low temperature. Experimental results will be presented demonstrating that the photoinduced superconducting state can be quenched using infrared radiation. Implications for the validity of the PISC/PPC defect model will be discussed.

  10. Near-infrared spectroscopy

    Directory of Open Access Journals (Sweden)

    Virendra Jain

    2015-01-01

    Full Text Available Tissue ischaemia can be a significant contributor to increased morbidity and mortality. Conventional oxygenation monitoring modalities measure systemic oxygenation, but regional tissue oxygenation is not monitored. Near-infrared spectroscopy (NIRS) is a non-invasive monitor for measuring regional oxygen saturation which provides real-time information. There has been increased interest in the clinical application of NIRS following numerous studies that show improved outcomes in various clinical situations, especially cardiac surgery. Its use has shown improved neurological outcome and decreased postoperative stay in cardiac surgery. Its usefulness has been investigated in various high-risk surgeries such as carotid endarterectomy and thoracic surgery, and in the paediatric population, with promising results. There is, however, limited data supporting its role in the neurosurgical population. We strongly feel it might play a key role in the future. It has significant advantages over other neuromonitoring modalities, but more technological advances are needed before it can be used more widely in clinical practice.

  11. Terahertz and Mid Infrared: Detection of Explosives and CBRN (Using Terahertz)

    CERN Document Server

    Shulika, Oleksiy

    2014-01-01

    The reader will find here a timely update on new THz sources and detection schemes as well as concrete applications to the detection of explosives and CBRN. Included is a method to identify hidden RDX-based explosives (pure and plastic ones) in a frequency-domain study by Fourier transformation, complemented by a demonstration of improved quality of the images captured by commercially available passive THz cameras. The presented examples show large potential for the detection of small hidden objects at long distances (6-10 m). Complementing the results in the short-wavelength range, laser spectroscopy with a mid-infrared, room-temperature, continuous-wave DFB laser diode and a high-performance DFB QCL has been demonstrated to offer excellent enabling sensor technologies for environmental monitoring, medical diagnostics, and industrial and security applications. From the new-source point of view, a number of systems have been presented - from superconductors to semiconductors, e.g. Det...

  12. Infrared laser system

    International Nuclear Information System (INIS)

    Cantrell, C.D.; Carbone, R.J.

    1977-01-01

    An infrared laser system and method for isotope separation may comprise a molecular gas laser oscillator to produce a laser beam at a first wavelength, Raman spin flip means for shifting the laser to a second wavelength, a molecular gas laser amplifier to amplify said second wavelength laser beam to high power, and optical means for directing the second wavelength, high power laser beam against a desired isotope for selective excitation thereof in a mixture with other isotopes. The optical means may include a medium which shifts the second wavelength high power laser beam to a third wavelength, high power laser beam at a wavelength coincidental with a corresponding vibrational state of said isotope and which is different from vibrational states of other isotopes in the gas mixture

  13. Mid-infrared upconversion spectroscopy

    DEFF Research Database (Denmark)

    Tidemand-Lichtenberg, Peter; Dam, Jeppe Seidelin; Andersen, H. V.

    2016-01-01

    Mid-infrared (MIR) spectroscopy is emerging as an attractive alternative to near-infrared or visible spectroscopy. MIR spectroscopy offers a unique possibility to probe the fundamental absorption bands of a large number of gases as well as the vibrational spectra of complex molecules. In this paper...

  14. The application of Near Infrared Reflectance Spectroscopy (NIRS) for the quantitative analysis of hydrocortisone in primary materials

    OpenAIRE

    A. PITTAS; C. SERGIDES; K. NIKOLICH

    2001-01-01

    Near Infrared Reflectance Spectroscopy (NIRS), coupled with fiber optic probes, has been shown to be a quick and reliable analytical tool for quality assurance and quality control in the pharmaceutical industry, both for the verification of raw materials and the quantification of the active ingredients in final products. In this paper, a typical pharmaceutical product, hydrocortisone sodium succinate, is used as an example for the application of NIR spectroscopy for quality control. In order to deve...

  15. [Treatment goals in FACE philosophy].

    Science.gov (United States)

    Martin, Domingo; Maté, Amaia; Zabalegui, Paula; Valenzuela, Jaime

    2017-03-01

    The FACE philosophy is characterized by clearly defined treatment goals: facial esthetics, dental esthetics, periodontal health, functional occlusion, neuromuscular mechanism and joint function. The purpose is to establish ideal occlusion with good facial esthetics and an orthopedic stable joint position. The authors present all the concepts of FACE philosophy and illustrate them through one case report. Taking into account all the FACE philosophy concepts increases diagnostic ability and improves the quality and stability of treatment outcomes. The goal of this philosophy is to harmonize the facial profile, tooth alignment, periodontium, functional occlusion, neuromuscular mechanism and joint function. The evaluation and treatment approach to vertical problems are unique to the philosophy. © EDP Sciences, SFODF, 2017.

  16. 'Pale Face'/'Pointy Face: SA Criminology in Denial | Henkeman ...

    African Journals Online (AJOL)

    This paper responds to key aspects of Bill Dixon's article, Understanding 'Pointy Face': What is criminology for? It suggests that criminology should unambiguously be 'for' social justice in South Africa's transhistorically unequal context. South African prison statistics are used as a conceptual shortcut to briefly highlight ...

  17. Registration of 3D Face Scans with Average Face Models

    NARCIS (Netherlands)

    A.A. Salah (Albert Ali); N. Alyuz; L. Akarun

    2008-01-01

    The accuracy of a 3D face recognition system depends on a correct registration that aligns the facial surfaces and makes a comparison possible. The best results obtained so far use a costly one-to-all registration approach, which requires the registration of each facial surface to all

  18. Advanced verification methods for OVI security ink

    Science.gov (United States)

    Coombs, Paul G.; McCaffery, Shaun F.; Markantes, Tom

    2006-02-01

    OVI security ink +, incorporating OVP security pigment* microflakes, enjoys a history of effective document protection. This security feature not only provides first-line recognition by the person on the street, but also facilitates machine readability. This paper explores the evolution of OVI reader technology from proof-of-concept to miniaturization. Three different instruments have been built to advance the technology of OVI machine verification. A bench-top unit has been constructed which allows users to automatically verify a multitude of different banknotes and OVI images. In addition, high-speed modules were fabricated and tested in a state-of-the-art banknote sorting machine. Both units demonstrate the ability of modern optical components to illuminate and collect light reflected from the interference platelets within OVI ink. Electronic hardware and software convert and process the optical information in milliseconds to accurately determine the authenticity of the security feature. Most recently, OVI ink verification hardware has been miniaturized and simplified, providing yet another platform for counterfeit protection. These latest devices provide a tool for store clerks and bank tellers to unambiguously determine the validity of banknotes in the time period it takes the cash drawer to be opened.

  19. CTBT integrated verification system evaluation model supplement

    Energy Technology Data Exchange (ETDEWEB)

    EDENBURN,MICHAEL W.; BUNTING,MARCUS; PAYNE JR.,ARTHUR C.; TROST,LAWRENCE C.

    2000-03-02

    Sandia National Laboratories has developed a computer-based model called IVSEM (Integrated Verification System Evaluation Model) to estimate the performance of a nuclear detonation monitoring system. The IVSEM project was initiated in June 1994 by Sandia's Monitoring Systems and Technology Center and has been funded by the U.S. Department of Energy's Office of Nonproliferation and National Security (DOE/NN). IVSEM is a simple, "top-level" modeling tool which estimates the performance of a Comprehensive Nuclear Test Ban Treaty (CTBT) monitoring system and can help explore the impact of various sensor system concepts and technology advancements on CTBT monitoring. One of IVSEM's unique features is that it integrates results from the various CTBT sensor technologies (seismic, infrasound, radionuclide, and hydroacoustic) and allows the user to investigate synergy among the technologies. Specifically, IVSEM estimates the detection effectiveness (probability of detection), location accuracy, and identification capability of the integrated system and of each technology subsystem individually. The model attempts to accurately estimate the monitoring system's performance at medium interfaces (air-land, air-water) and for some evasive testing methods such as seismic decoupling. The original IVSEM report, CTBT Integrated Verification System Evaluation Model, SAND97-2518, described version 1.2 of IVSEM. This report describes the changes made to IVSEM version 1.2 and the addition of identification capability estimates that have been incorporated into IVSEM version 2.0.
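
    One way to picture the integration of the technology subsystems (the per-technology probabilities are hypothetical, and IVSEM's actual combination rules are not reproduced here) is to treat the subsystems as independent, so the integrated probability of detection is one minus the product of the individual miss probabilities.

        subsystem_pd = {                 # hypothetical per-technology detection probabilities
            "seismic":       0.85,
            "infrasound":    0.40,
            "radionuclide":  0.55,
            "hydroacoustic": 0.10,       # e.g., a land event with weak hydroacoustic coupling
        }

        miss = 1.0
        for technology, pd in subsystem_pd.items():
            miss *= (1.0 - pd)
        print(f"integrated probability of detection: {1.0 - miss:.3f}")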

  20. Nonintrusive verification attributes for excess fissile materials

    International Nuclear Information System (INIS)

    Nicholas, N.J.; Eccleston, G.W.; Fearey, B.L.

    1997-10-01

    Under US initiatives, over two hundred metric tons of fissile materials have been declared to be excess to national defense needs. These excess materials are in both classified and unclassified forms. The US has expressed the intent to place these materials under international inspections as soon as practicable. To support these commitments, members of the US technical community are examining a variety of nonintrusive approaches (i.e., those that would not reveal classified or sensitive information) for verification of a range of potential declarations for these classified and unclassified materials. The most troublesome and potentially difficult issues involve approaches for international inspection of classified materials. The primary focus of the work to date has been on the measurement of signatures of relevant materials attributes (e.g., element, identification number, isotopic ratios, etc.), especially those related to classified materials and items. The authors are examining potential attributes and related measurement technologies in the context of possible verification approaches. The paper will discuss the current status of these activities, including their development, assessment, and benchmarking status