WorldWideScience

Sample records for infrared face verification

  1. Face recognition in the thermal infrared domain

    Science.gov (United States)

    Kowalski, M.; Grudzień, A.; Palka, N.; Szustakowski, M.

    2017-10-01

    Biometrics refers to unique human characteristics. Each unique characteristic may be used to label and describe individuals and for automatic recognition of a person based on physiological or behavioural properties. One of the most natural and most popular biometric traits is the face. Most research on face recognition is based on visible light, and state-of-the-art face recognition systems operating in the visible spectrum achieve very high recognition accuracy under controlled environmental conditions. Thermal infrared imagery seems to be a promising alternative or complement to visible-range imaging due to its relatively high resistance to illumination changes. A thermal infrared image of the human face presents its unique heat signature and can be used for recognition. The characteristics of thermal images maintain advantages over visible-light images and can be used to improve human face recognition algorithms in several respects. Mid-wavelength and far-wavelength infrared, also referred to as thermal infrared, therefore seem to be promising alternatives. We present a study of 1:1 recognition in the thermal infrared domain. The two approaches we consider are stand-off face verification of a non-moving person and stop-less face verification on the move. The paper presents the methodology of our studies and the challenges facing face recognition systems in the thermal infrared domain.

  2. Face Verification for Mobile Personal Devices

    NARCIS (Netherlands)

    Tao, Q.

    2009-01-01

    In this thesis, we present a detailed study of the face verification problem on mobile devices, covering every component of the system. The study includes face detection, registration, normalization, and verification. Furthermore, the information fusion problem is studied to verify face

  3. Simple thermal to thermal face verification method based on local texture descriptors

    Science.gov (United States)

    Grudzien, A.; Palka, Norbert; Kowalski, M.

    2017-08-01

    Biometrics is a science that studies and analyzes the physical structure of the human body and the behaviour of people. Biometrics has found many applications, ranging from border control systems and forensic systems for criminal investigations to systems for access control. Unique identifiers, also referred to as modalities, are used to distinguish individuals. One of the most common and natural human identifiers is the face. As a result of decades of investigation, face recognition has reached a high level of maturity; however, recognition in the visible spectrum is still challenging due to illumination effects and new ways of spoofing. One of the alternatives is recognizing the face in other parts of the light spectrum, e.g. the infrared spectrum. Thermal infrared offers new possibilities for human recognition due to its specific properties as well as mature equipment. In this paper we present a scheme for subject verification using facial images in the thermal range. The study focuses on local feature extraction methods and on similarity metrics. We present a comparison of two local texture-based descriptors for thermal 1-to-1 face recognition.
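    The pipeline described above, a local texture descriptor followed by a similarity metric, can be sketched in a few lines. This is a toy illustration rather than the paper's exact descriptors or metric: a basic 3x3 LBP with a chi-square histogram distance, applied here to synthetic arrays standing in for thermal face images.

```python
import numpy as np

def lbp_image(img):
    """Basic 3x3 LBP: compare each pixel with its 8 neighbours and
    pack the comparison bits into one 8-bit code per pixel."""
    img = np.asarray(img, dtype=np.int32)
    center = img[1:-1, 1:-1]
    # neighbour window offsets, clockwise from the top-left corner
    shifts = [(0, 0), (0, 1), (0, 2), (1, 2),
              (2, 2), (2, 1), (2, 0), (1, 0)]
    codes = np.zeros_like(center)
    for bit, (dy, dx) in enumerate(shifts):
        neigh = img[dy:dy + center.shape[0], dx:dx + center.shape[1]]
        codes |= (neigh >= center).astype(np.int32) << bit
    return codes

def lbp_histogram(img):
    """Normalized 256-bin histogram of LBP codes: the face descriptor."""
    hist = np.bincount(lbp_image(img).ravel(), minlength=256).astype(float)
    return hist / hist.sum()

def chi_square_distance(h1, h2, eps=1e-10):
    """Chi-square distance between two histograms (smaller = more similar)."""
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

rng = np.random.default_rng(0)
probe = rng.integers(0, 256, size=(64, 64))
same = probe + rng.integers(-2, 3, size=probe.shape)   # slightly perturbed copy
other = rng.integers(0, 256, size=(64, 64))            # unrelated image
d_same = chi_square_distance(lbp_histogram(probe), lbp_histogram(same))
d_other = chi_square_distance(lbp_histogram(probe), lbp_histogram(other))
```

    A 1-to-1 verification decision would then threshold the resulting distance; the perturbed copy yields a smaller distance than the unrelated image.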

  4. MobileFaceNets: Efficient CNNs for Accurate Real-time Face Verification on Mobile Devices

    OpenAIRE

    Chen, Sheng; Liu, Yang; Gao, Xiang; Han, Zhen

    2018-01-01

    In this paper, we propose a class of extremely efficient CNN models, MobileFaceNets, which use fewer than 1 million parameters and are specifically tailored for high-accuracy real-time face verification on mobile and embedded devices. We first briefly analyze the weaknesses of common mobile networks for face verification; these weaknesses are well overcome by our specifically designed MobileFaceNets. Under the same experimental conditions, our MobileFaceNets achieve significantly sup...

  5. Compressive sensing using optimized sensing matrix for face verification

    Science.gov (United States)

    Oey, Endra; Jeffry; Wongso, Kelvin; Tommy

    2017-12-01

    Biometrics offers a solution to problems arising from password-based data access, such as forgotten passwords and the difficulty of recalling many different ones. With biometrics, the physical characteristics of a person can be captured and used in the identification process. In this research, facial biometrics is used in the verification process to determine whether a user has the authority to access the data. The face is chosen for its low implementation cost and reasonably accurate identification results. The face verification system adopted in this research uses the Compressive Sensing (CS) technique, which reduces the dimensionality of, and encrypts, the facial test image by representing it as a sparse signal. The encrypted data can be reconstructed using a sparse coding algorithm. Two types of sparse coding, Orthogonal Matching Pursuit (OMP) and Iteratively Reweighted Least Squares-ℓp (IRLS-ℓp), are compared in this face verification research. The reconstructed sparse signal is then compared, via the Euclidean norm, with the user's sparse signal previously stored in the system to determine the validity of the facial test image. With a non-optimized sensing matrix, the system achieves 99% accuracy with IRLS (face verification response time 4.917 s) and 96.33% with OMP (0.4046 s); with an optimized sensing matrix, it achieves 99% with IRLS (13.4791 s) and 98.33% with OMP (3.1571 s).
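    The OMP reconstruction step can be sketched in NumPy. This is a minimal illustration on synthetic data rather than facial images; the sensing matrix, dimensions, and sparsity level below are made up for the example.

```python
import numpy as np

def omp(A, y, sparsity):
    """Orthogonal Matching Pursuit: greedily pick the dictionary column most
    correlated with the residual, then re-fit the support by least squares."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(sparsity):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

rng = np.random.default_rng(1)
n_meas, n_atoms, k = 40, 100, 3
A = rng.standard_normal((n_meas, n_atoms))
A /= np.linalg.norm(A, axis=0)            # unit-norm columns (sensing matrix)
x_true = np.zeros(n_atoms)
x_true[rng.choice(n_atoms, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x_true                            # compressed measurements
x_hat = omp(A, y, sparsity=k)
```

    In a verification system along these lines, the recovered sparse code would then be compared, via the Euclidean norm, against the sparse code enrolled for the claimed user.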

  6. Infrared and visible fusion face recognition based on NSCT domain

    Science.gov (United States)

    Xie, Zhihua; Zhang, Shuai; Liu, Guodong; Xiong, Jinquan

    2018-01-01

    Visible face recognition systems, being vulnerable to illumination, expression, and pose, cannot achieve robust performance in unconstrained situations. Meanwhile, near-infrared face images, being light-independent, can avoid or limit the drawbacks of face recognition in visible light, but their main challenges are low resolution and signal-to-noise ratio (SNR). Therefore, near-infrared and visible fusion face recognition has become an important direction in the field of unconstrained face recognition research. In this paper, a novel fusion algorithm in the non-subsampled contourlet transform (NSCT) domain is proposed for infrared and visible fusion face recognition. Firstly, NSCT is used to process the infrared and visible face images, exploiting the image information at multiple scales, orientations, and frequency bands. Then, to exploit the effective discriminant features and balance the power of the high- and low-frequency NSCT coefficients, the local Gabor binary pattern (LGBP) and local binary pattern (LBP) are applied in different frequency parts to obtain robust representations of the infrared and visible face images. Finally, score-level fusion is used to combine all the features for the final classification. The visible and near-infrared face recognition is tested on the HITSZ Lab2 visible and near-infrared face database. Experimental results show that the proposed method extracts the complementary features of near-infrared and visible-light images and improves the robustness of unconstrained face recognition.
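    The score-level fusion step can be illustrated with a minimal sketch. The scores and equal weights below are invented for the example; the paper's actual matchers (LGBP and LBP on NSCT sub-bands) are replaced by two arbitrary score lists with different ranges, which is why normalization comes first.

```python
import numpy as np

def minmax_norm(scores):
    """Map a matcher's raw scores onto [0, 1]."""
    s = np.asarray(scores, dtype=float)
    return (s - s.min()) / (s.max() - s.min())

def fuse_scores(score_lists, weights=None):
    """Score-level fusion: normalize each matcher's scores, then
    combine them with a weighted sum."""
    normed = [minmax_norm(s) for s in score_lists]
    if weights is None:
        weights = [1.0 / len(normed)] * len(normed)
    return sum(w * s for w, s in zip(weights, normed))

# two matchers scoring the same 4 gallery candidates (higher = more similar)
matcher_a = [0.2, 0.9, 0.4, 0.1]     # e.g. similarities from one feature set
matcher_b = [10.0, 55.0, 80.0, 5.0]  # e.g. similarities from another
fused = fuse_scores([matcher_a, matcher_b])
best = int(np.argmax(fused))         # fused decision: candidate index 1
```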

  7. Evaluation of Face Detection Algorithms for the Bank Client Identity Verification

    Directory of Open Access Journals (Sweden)

    Szczodrak Maciej

    2017-06-01

    Full Text Available Results of an investigation of the efficiency of face detection algorithms in a banking client visual verification system are presented. The video recordings were made under real conditions in three bank operating outlets, using a miniature industrial USB camera. The aim of the experiments was to check the practical usability of face detection methods in a biometric bank client verification system. The main assumption was to keep user interaction with the application as simple as possible. The face detection algorithms applied are described, and the results achieved under real bank environment conditions are presented. Practical limitations of the application, based on the problems encountered, are discussed.

  8. Face Verification using MLP and SVM

    OpenAIRE

    Cardinaux, Fabien; Marcel, Sébastien

    2002-01-01

    The performance of machine learning algorithms, such as MLPs and more recently SVMs, has steadily improved over the past few years. In this paper, we compare two successful discriminant machine learning algorithms applied to the problem of face verification: MLP and SVM. The two algorithms are tested on a benchmark database, namely XM2VTS. Results show that an MLP performs better than an SVM on this particular task.

  9. Design of an Active Multispectral SWIR Camera System for Skin Detection and Face Verification

    Directory of Open Access Journals (Sweden)

    Holger Steiner

    2016-01-01

    Full Text Available Biometric face recognition is being used more frequently in different application scenarios. However, spoofing attacks with facial disguises are still a serious problem for state-of-the-art face recognition algorithms. This work proposes an approach to face verification based on spectral signatures of material surfaces in the short-wave infrared (SWIR) range. These signatures allow authentic human skin to be distinguished reliably from other materials, independent of skin type. We present the design of an active SWIR imaging system that acquires four-band multispectral image stacks in real time. The system uses pulsed small-band illumination, which allows for fast image acquisition and high spectral resolution and renders it largely independent of ambient light. After extracting the spectral signatures from the acquired images, detected faces can be verified or rejected by classifying the material as “skin” or “no-skin.” The approach is extensively evaluated with respect to both acquisition and classification performance. In addition, we present a database containing RGB and multispectral SWIR face images, as well as spectrometer measurements of a variety of subjects, which is used to evaluate our approach and will be made available to the research community by the time this work is published.
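    The final "skin" vs. "no-skin" decision can be sketched as a similarity test against a reference spectral signature. The band values below are hypothetical, not measured SWIR reflectances, and the cosine-similarity classifier is an assumption for illustration, not the paper's actual classifier.

```python
import numpy as np

def normalized_signature(bands):
    """Scale a multi-band measurement to unit length so the comparison
    depends on spectral shape rather than overall brightness."""
    v = np.asarray(bands, dtype=float)
    return v / np.linalg.norm(v)

def classify_material(bands, skin_ref, threshold=0.99):
    """Label a pixel 'skin' if its signature is close enough (cosine
    similarity) to the reference skin signature, else 'no-skin'."""
    sim = float(normalized_signature(bands) @ normalized_signature(skin_ref))
    return "skin" if sim >= threshold else "no-skin"

SKIN_REF = [0.35, 0.18, 0.09, 0.06]   # hypothetical 4-band skin signature
label_skin = classify_material([0.36, 0.17, 0.09, 0.06], SKIN_REF)
label_mask = classify_material([0.20, 0.20, 0.20, 0.20], SKIN_REF)
```

    A near-skin signature passes the threshold while a spectrally flat material (e.g. a mask) is rejected.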

  10. Probabilistic Elastic Part Model: A Pose-Invariant Representation for Real-World Face Verification.

    Science.gov (United States)

    Li, Haoxiang; Hua, Gang

    2018-04-01

    Pose variation remains a major challenge for real-world face recognition. We approach this problem through a probabilistic elastic part model. We extract local descriptors (e.g., LBP or SIFT) from densely sampled multi-scale image patches. By augmenting each descriptor with its location, a Gaussian mixture model (GMM) is trained to capture the spatial-appearance distribution of the face parts of all face images in the training corpus, namely the probabilistic elastic part (PEP) model. Each mixture component of the GMM is confined to be a spherical Gaussian to balance the influence of the appearance and location terms, which naturally defines a part. Given one or multiple face images of the same subject, the PEP model builds its PEP representation by sequentially concatenating the descriptors identified by each Gaussian component in a maximum-likelihood sense. We further propose a joint Bayesian adaptation algorithm to adapt the universally trained GMM to better model the pose variations between the target pair of faces/face tracks, which consistently improves face verification accuracy. Our experiments show that we achieve state-of-the-art face verification accuracy with the proposed representations on the Labeled Faces in the Wild (LFW) dataset, the YouTube Faces video database, and the CMU Multi-PIE dataset.
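    The maximum-likelihood descriptor selection that builds a PEP representation can be sketched as follows. The sketch assumes a GMM has already been trained (the component means below are random stand-ins) and exploits the fact that, for spherical Gaussians of equal variance, the most likely descriptor under a component is simply the one nearest that component's mean.

```python
import numpy as np

def pep_representation(descriptors, component_means):
    """Toy PEP pooling: for each spherical-Gaussian part, keep the single
    descriptor with the highest likelihood under that component (for a
    spherical Gaussian this is the descriptor nearest the mean), then
    concatenate the selections into one fixed-length representation."""
    selected = []
    for mu in component_means:
        d2 = np.sum((descriptors - mu) ** 2, axis=1)
        selected.append(descriptors[np.argmin(d2)])
    return np.concatenate(selected)

rng = np.random.default_rng(2)
component_means = rng.standard_normal((5, 8))  # 5 "parts", 8-D descriptors
descriptors = rng.standard_normal((50, 8))     # descriptors from one face image
rep = pep_representation(descriptors, component_means)
```

    The representation length is fixed (parts x descriptor dimension), regardless of how many patches the face yields, which is what makes two faces directly comparable.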

  11. Infrared face recognition based on LBP histogram and KW feature selection

    Science.gov (United States)

    Xie, Zhihua

    2014-07-01

    The conventional local binary pattern (LBP) histogram feature still has room for performance improvement. This paper focuses on the dimension reduction of LBP micro-patterns and proposes an improved infrared face recognition method based on the LBP histogram representation. To extract local robust features from infrared face images, LBP is used to capture the micro-pattern composition of sub-blocks. Based on statistical test theory, a Kruskal-Wallis (KW) feature selection method is proposed to select the LBP patterns best suited to infrared face recognition. The experimental results show that combining LBP with KW feature selection improves the performance of infrared face recognition; the proposed method outperforms traditional methods based on the LBP histogram, the discrete cosine transform (DCT), or principal component analysis (PCA).
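    Kruskal-Wallis feature selection can be sketched directly from the rank-based H statistic. This is a simplified illustration on synthetic features rather than LBP histogram bins, and ties are ignored for brevity.

```python
import numpy as np

def kruskal_wallis_h(groups):
    """Kruskal-Wallis H statistic for one feature over class groups
    (ties ignored for simplicity); a large H means the feature
    separates the classes well."""
    values = np.concatenate(groups)
    ranks = np.empty(len(values))
    ranks[np.argsort(values)] = np.arange(1, len(values) + 1)
    n = len(values)
    h, start = 0.0, 0
    for g in groups:
        r = ranks[start:start + len(g)]
        h += len(g) * (r.mean() - (n + 1) / 2.0) ** 2
        start += len(g)
    return 12.0 / (n * (n + 1)) * h

def select_features(class_samples, keep):
    """Score every feature dimension with H and keep the highest-scoring."""
    n_features = class_samples[0].shape[1]
    scores = [kruskal_wallis_h([X[:, j] for X in class_samples])
              for j in range(n_features)]
    return np.argsort(scores)[::-1][:keep]

rng = np.random.default_rng(3)
subject_a = rng.standard_normal((20, 4))
subject_a[:, 2] += 3.0                     # only feature 2 separates the subjects
subject_b = rng.standard_normal((20, 4))
chosen = select_features([subject_a, subject_b], keep=1)
```

    The selector correctly ranks the shifted dimension first, mirroring how discriminative LBP bins would be retained.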

  12. Near infrared and visible face recognition based on decision fusion of LBP and DCT features

    Science.gov (United States)

    Xie, Zhihua; Zhang, Shuai; Liu, Guodong; Xiong, Jinquan

    2018-03-01

    Visible face recognition systems, being vulnerable to illumination, expression, and pose, cannot achieve robust performance in unconstrained situations. Meanwhile, near-infrared face images, being light-independent, can avoid or limit the drawbacks of face recognition in visible light, but their main challenges are low resolution and signal-to-noise ratio (SNR). Therefore, near-infrared and visible fusion face recognition has become an important direction in the field of unconstrained face recognition research. In order to extract the discriminative complementary features of near-infrared and visible images, we propose in this paper a novel near-infrared and visible face fusion recognition algorithm based on DCT and LBP features. Firstly, the effective features of the near-infrared face image are extracted from the low-frequency part of the DCT coefficients and the partition histograms of the LBP operator. Secondly, the LBP features of the visible-light face image are extracted to compensate for the missing detail features of the near-infrared face image. Then, the LBP features of the visible-light face image and the DCT and LBP features of the near-infrared face image are each sent to a classifier for labeling. Finally, a decision-level fusion strategy is used to obtain the final recognition result. The visible and near-infrared face recognition is tested on the HITSZ Lab2 visible and near-infrared face database. The experimental results show that the proposed method extracts the complementary features of near-infrared and visible face images and improves the robustness of unconstrained face recognition. Especially in the case of small training samples, the recognition rate of the proposed method reaches 96.13%, a significant improvement over the 92.75% of the method based on statistical feature fusion.
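    The low-frequency DCT feature extraction can be sketched with an orthonormal DCT-II built directly in NumPy; keeping only the top-left block of coefficients retains the coarse structure of the face in a compact vector. The image here is random noise standing in for a NIR face, and the block size is an arbitrary choice for illustration.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix (rows are basis vectors)."""
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    m = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    m[0, :] = np.sqrt(1.0 / n)
    return m

def dct2_lowfreq(img, keep=8):
    """2-D DCT of an image; the top-left (low-frequency) block carries the
    coarse structure and serves as a compact feature vector."""
    img = np.asarray(img, dtype=float)
    coeffs = dct_matrix(img.shape[0]) @ img @ dct_matrix(img.shape[1]).T
    return coeffs[:keep, :keep].ravel()

rng = np.random.default_rng(4)
face = rng.standard_normal((32, 32))       # stand-in for a NIR face image
feature = dct2_lowfreq(face, keep=8)       # 64-dimensional feature
```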

  13. Weighted piecewise LDA for solving the small sample size problem in face verification.

    Science.gov (United States)

    Kyperountas, Marios; Tefas, Anastasios; Pitas, Ioannis

    2007-03-01

    A novel algorithm that can be used to boost the performance of face-verification methods that utilize Fisher's criterion is presented and evaluated. The algorithm is applied to similarity, or matching-error, data and provides a general solution to the "small sample size" (SSS) problem, where the lack of sufficient training samples causes improper estimation of the linear separation hyperplane between the classes. Two independent phases constitute the proposed method. Initially, a set of weighted piecewise discriminant hyperplanes is used to provide a more accurate discriminant decision than the one produced by the traditional linear discriminant analysis (LDA) methodology. The expected classification ability of this method is investigated through a series of simulations. The second phase defines proper combinations of person-specific similarity scores and describes an outlier removal process that further enhances the classification ability. The proposed technique has been tested on the M2VTS and XM2VTS frontal face databases. Experimental results indicate that the proposed framework greatly improves face-verification performance.

  14. Near infrared face recognition: A literature survey

    Czech Academy of Sciences Publication Activity Database

    Farokhi, Sajad; Flusser, Jan; Sheikh, U. U.

    2016-01-01

    Roč. 21, č. 1 (2016), s. 1-17 ISSN 1574-0137 R&D Projects: GA ČR GA15-16928S Institutional support: RVO:67985556 Keywords : Literature survey * Biometrics * Face recognition * Near infrared * Illumination invariant Subject RIV: JD - Computer Applications, Robotics http://library.utia.cas.cz/separaty/2016/ZOI/flusser-0461834.pdf

  15. Anti-Makeup: Learning A Bi-Level Adversarial Network for Makeup-Invariant Face Verification

    OpenAIRE

    Li, Yi; Song, Lingxiao; Wu, Xiang; He, Ran; Tan, Tieniu

    2017-01-01

    Makeup is widely used to improve facial attractiveness and is well accepted by the public. However, different makeup styles result in significant changes in facial appearance, and it remains a challenging problem to match makeup and non-makeup face images. This paper proposes a learning-from-generation approach to makeup-invariant face verification by introducing a bi-level adversarial network (BLAN). To alleviate the negative effects of makeup, we first generate non-makeup images from makeu...

  16. Invariant Face recognition Using Infrared Images

    International Nuclear Information System (INIS)

    Zahran, E.G.

    2012-01-01

    Over the past few decades, face recognition has become a rapidly growing research topic due to increasing demands in many applications of our daily life, such as airport surveillance, personal identification in law enforcement, surveillance systems, information safety, securing financial transactions, and computer security. The objective of this thesis is to develop a face recognition system capable of recognizing persons with high recognition capability and low processing time, under different illumination conditions and different facial expressions. The thesis presents a study of the performance of the face recognition system using two techniques: Principal Component Analysis (PCA) and Zernike Moments (ZM). The performance of the recognition system is evaluated according to several aspects, including the recognition rate and the processing time. Face recognition systems that use visual images are sensitive to variations in lighting conditions and facial expressions. The performance of these systems may be degraded under poor illumination conditions or for subjects of various skin colors. Several solutions have been proposed to overcome these limitations. One of them is to work in the infrared (IR) spectrum. IR images have been suggested as an alternative source of information for the detection and recognition of faces when there is little or no control over lighting conditions. This arises from the fact that these images are formed by thermal emissions from the skin, which are an intrinsic property because they depend on the distribution of blood vessels under the skin. On the other hand, IR face recognition systems still have limitations with temperature variations and the recognition of persons wearing eyeglasses. In this thesis we fuse IR images with visible images to enhance the performance of face recognition systems. Images are fused using the wavelet transform. Simulation results show that the fusion of visible and

  17. Near infrared face recognition using Zernike moments and Hermite kernels

    Czech Academy of Sciences Publication Activity Database

    Farokhi, Sajad; Sheikh, U.U.; Flusser, Jan; Yang, Bo

    2015-01-01

    Roč. 316, č. 1 (2015), s. 234-245 ISSN 0020-0255 R&D Projects: GA ČR(CZ) GA13-29225S Keywords : face recognition * Zernike moments * Hermite kernel * Decision fusion * Near infrared Subject RIV: JD - Computer Applications, Robotics Impact factor: 3.364, year: 2015 http://library.utia.cas.cz/separaty/2015/ZOI/flusser-0444205.pdf

  18. NIRFaceNet: A Convolutional Neural Network for Near-Infrared Face Identification

    Directory of Open Access Journals (Sweden)

    Min Peng

    2016-10-01

    Full Text Available Near-infrared (NIR face recognition has attracted increasing attention because of its advantage of illumination invariance. However, traditional face recognition methods based on NIR are designed for and tested in cooperative-user applications. In this paper, we present a convolutional neural network (CNN for NIR face recognition (specifically face identification in non-cooperative-user applications. The proposed NIRFaceNet is modified from GoogLeNet, but has a more compact structure designed specifically for the Chinese Academy of Sciences Institute of Automation (CASIA NIR database and can achieve higher identification rates with less training time and less processing time. The experimental results demonstrate that NIRFaceNet has an overall advantage compared to other methods in the NIR face recognition domain when image blur and noise are present. The performance suggests that the proposed NIRFaceNet method may be more suitable for non-cooperative-user applications.

  19. Rotation and Noise Invariant Near-Infrared Face Recognition by means of Zernike Moments and Spectral Regression Discriminant Analysis

    Czech Academy of Sciences Publication Activity Database

    Farokhi, S.; Shamsuddin, S. M.; Flusser, Jan; Sheikh, U. U.; Khansari, M.; Jafari-Khouzani, K.

    2013-01-01

    Roč. 22, č. 1 (2013), s. 1-11 ISSN 1017-9909 R&D Projects: GA ČR GAP103/11/1552 Keywords : face recognition * infrared imaging * image moments Subject RIV: JD - Computer Applications, Robotics Impact factor: 0.850, year: 2013 http://library.utia.cas.cz/separaty/2013/ZOI/flusser-rotation and noise invariant near-infrared face recognition by means of zernike moments and spectral regression discriminant analysis.pdf

  20. Inspector measurement verification activities

    International Nuclear Information System (INIS)

    George, R.S.; Crouch, R.

    The most difficult and complex activity facing a safeguards inspector involves the verification of measurements and of the performance of the measurement system. Remeasurement is the key to measurement verification activities. Remeasurements using the facility's measurement system provide the bulk of the data needed for determining the performance of the measurement system. Remeasurements by reference laboratories are also important for evaluating the measurement system and determining systematic errors. The use of these measurement verification activities in conjunction with accepted inventory verification practices provides a better basis for accepting or rejecting an inventory. (U.S.)

  1. Hierarchical Representation Learning for Kinship Verification.

    Science.gov (United States)

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications, such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that provide kinship cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effects of participant gender and age and of the kin-relation pair of the stimulus are analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is used to learn the representation of different face regions in an unsupervised manner. We propose a novel approach to feature representation termed filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes the relational information present in images using filters and a contractive regularization penalty. A compact representation of the facial images of kin is extracted as the output of the learned model, and a multi-layer neural network is utilized to verify kinship accurately. A new WVU kinship database is created, consisting of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product-of-likelihood-ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in face verification performance.
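    The product-of-likelihood-ratio fusion mentioned at the end can be sketched as follows. The Gaussian genuine/impostor score models and their parameters are hypothetical stand-ins; in practice they would be estimated from training data.

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a univariate Gaussian at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def likelihood_ratio(score, genuine, impostor):
    """Ratio of the score's likelihood under the genuine model vs. the
    impostor model; each model is a (mean, std) pair."""
    return gaussian_pdf(score, *genuine) / gaussian_pdf(score, *impostor)

def fused_decision(face_score, kin_score, threshold=1.0):
    """Multiply the face matcher's LR by the kinship (soft biometric) LR
    and accept when the product clears the threshold."""
    lr_face = likelihood_ratio(face_score, genuine=(0.8, 0.10), impostor=(0.3, 0.15))
    lr_kin = likelihood_ratio(kin_score, genuine=(0.7, 0.15), impostor=(0.4, 0.15))
    return lr_face * lr_kin > threshold
```

    A pair scoring high on both matchers is accepted; a pair scoring near the impostor means is rejected, even though either score alone might be ambiguous.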

  2. Implementation of an RBF neural network on embedded systems: real-time face tracking and identity verification.

    Science.gov (United States)

    Yang, Fan; Paindavoine, M

    2003-01-01

    This paper describes a real-time vision system that allows us to localize faces in video sequences and verify their identity. These processes are based on image processing techniques using the radial basis function (RBF) neural network approach. The robustness of the system has been evaluated quantitatively on eight video sequences. We have adapted our model for a face recognition application using the Olivetti Research Laboratory (ORL), Cambridge, UK, database so as to compare its performance against other systems. We also describe three hardware implementations of our model on embedded systems based on the field-programmable gate array (FPGA), zero instruction set computer (ZISC) chips, and the digital signal processor (DSP) TMS320C62, respectively. We analyze the algorithm complexity and present the results of the hardware implementations in terms of resources used and processing speed. The success rates of face tracking and identity verification are 92% (FPGA), 85% (ZISC), and 98.2% (DSP), respectively. For the three embedded systems, the processing speeds for images of size 288 × 352 are 14 images/s, 25 images/s, and 4.8 images/s, respectively.
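    A minimal RBF network of the kind described, with Gaussian hidden units on fixed prototype centers and linear output weights fitted by least squares, can be sketched on synthetic 2-D data. The clusters, centers, and sigma below are invented for illustration; the paper's actual network and training procedure are not reproduced here.

```python
import numpy as np

def rbf_features(X, centers, sigma):
    """Gaussian activations of each input against each prototype center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

class RBFNet:
    """Minimal RBF classifier: fixed prototype centers in the hidden layer,
    linear output weights fitted by least squares."""
    def __init__(self, centers, sigma=1.0):
        self.centers = centers
        self.sigma = sigma

    def fit(self, X, y):
        phi = rbf_features(X, self.centers, self.sigma)
        self.w, *_ = np.linalg.lstsq(phi, y, rcond=None)
        return self

    def predict(self, X):
        return rbf_features(X, self.centers, self.sigma) @ self.w

rng = np.random.default_rng(5)
# two synthetic "identities" as 2-D clusters, labelled +1 / -1
pos = rng.standard_normal((30, 2)) + [3.0, 3.0]
neg = rng.standard_normal((30, 2)) - [3.0, 3.0]
X = np.vstack([pos, neg])
y = np.array([1.0] * 30 + [-1.0] * 30)
net = RBFNet(centers=np.array([[3.0, 3.0], [-3.0, -3.0]]), sigma=2.0).fit(X, y)
accuracy = float(((net.predict(X) > 0) == (y > 0)).mean())
```

    The fixed-center, least-squares formulation is what makes RBF networks attractive for embedded targets: inference is two small matrix products, easily mapped to FPGA or DSP hardware.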

  3. Near infrared face recognition by combining Zernike moments and undecimated discrete wavelet transform

    Czech Academy of Sciences Publication Activity Database

    Farokhi, Sajad; Shamsuddin, S.M.; Sheikh, U.U.; Flusser, Jan; Khansari, M.; Jafari-Khouzani, K.

    2014-01-01

    Roč. 31, č. 1 (2014), s. 13-27 ISSN 1051-2004 R&D Projects: GA ČR GAP103/11/1552 Institutional support: RVO:67985556 Keywords : Zernike moments * Undecimated discrete wavelet transform * Decision fusion * Near infrared * Face recognition Subject RIV: JD - Computer Applications, Robotics Impact factor: 1.256, year: 2014 http://library.utia.cas.cz/separaty/2014/ZOI/flusser-0428536.pdf

  4. Collaborative Random Faces-Guided Encoders for Pose-Invariant Face Representation Learning.

    Science.gov (United States)

    Shao, Ming; Zhang, Yizhe; Fu, Yun

    2018-04-01

    Learning discriminant face representations for pose-invariant face recognition has been identified as a critical issue in visual learning systems. The challenge lies in the drastic changes of facial appearance between the test face and the registered face. To that end, we propose a high-level feature learning framework called "collaborative random faces (RFs)-guided encoders" for this problem. The contributions of this paper are threefold. First, we propose a novel supervised autoencoder that is able to capture high-level identity features despite pose variations. Second, we enrich the identity features by replacing the target values of conventional autoencoders with random signals (RFs in this paper), which are unique for each subject under different poses. Third, we further improve the performance of the framework by incorporating deep convolutional neural network facial descriptors and linking discriminative identity features from different RFs to form augmented identity features. Finally, we conduct face identification experiments on the Multi-PIE database, and face verification experiments on the Labeled Faces in the Wild and YouTube Faces databases, reporting face recognition rates and verification accuracy with receiver operating characteristic curves. In addition, discussions of model parameters and connections with existing methods are provided. These experiments demonstrate that our learning system handles pose variations fairly well.

  5. Decoding of faces and face components in face-sensitive human visual cortex

    Directory of Open Access Journals (Sweden)

    David F Nichols

    2010-07-01

    Full Text Available A great challenge to the field of visual neuroscience is to understand how faces are encoded and represented within the human brain. Here we show evidence from functional magnetic resonance imaging (fMRI for spatially distributed processing of the whole face and its components in face-sensitive human visual cortex. We used multi-class linear pattern classifiers constructed with a leave-one-scan-out verification procedure to discriminate brain activation patterns elicited by whole faces, the internal features alone, and the external head outline alone. Furthermore, our results suggest that whole faces are represented disproportionately in the fusiform cortex (FFA whereas the building blocks of faces are represented disproportionately in occipitotemporal cortex (OFA. Faces and face components may therefore be organized with functional clustering within both the FFA and OFA, but with specialization for face components in the OFA and the whole face in the FFA.

  6. Patient set-up verification by infrared optical localization and body surface sensing in breast radiation therapy

    International Nuclear Information System (INIS)

    Spadea, Maria Francesca; Baroni, Guido; Riboldi, Marco; Orecchia, Roberto; Pedotti, Antonio; Tagaste, Barbara; Garibaldi, Cristina

    2006-01-01

    Background and purpose: The aim of the study was to investigate the clinical application of a technique for patient set-up verification in breast cancer radiotherapy, based on the 3D localization of a hybrid configuration of surface control points. Materials and methods: An infrared optical tracker provided the 3D positions of two passive markers and 10 laser spots placed around and within the irradiation field on nine patients. A fast iterative constrained minimization procedure was applied to detect and compensate patient set-up errors, through registration of the control points with reference data coming from the treatment plan (markers' reference positions, CT-based surface model). Results: The application of the corrective spatial transformation estimated by the registration procedure led to significant improvement of patient set-up. The median value of the 3D errors affecting three additional verification markers within the irradiation field decreased from 5.7 to 3.5 mm. Error variability (25-75%) decreased from 3.2 to 2.1 mm. Laser spot registration on the reference surface model was documented to contribute substantially to set-up error compensation. Conclusions: Patient set-up verification through a hybrid set of control points and a constrained surface minimization algorithm was confirmed to be feasible in clinical practice and to provide valuable information for improving the quality of patient set-up, with minimal requirement for operator-dependent procedures. The technique conveniently combines the advantages of passive-marker-based methods and surface registration techniques, by featuring immediate and robust estimation of set-up accuracy from a redundant dataset.
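    The core of set-up error estimation, aligning measured control points to their planned reference positions, can be illustrated with a standard rigid (Kabsch-style) least-squares alignment. This is a simplified stand-in for the paper's iterative constrained minimization, using synthetic points and a made-up displacement.

```python
import numpy as np

def rigid_registration(points, reference):
    """Least-squares rigid alignment (Kabsch): find rotation R and
    translation t such that R @ p + t best matches the reference points."""
    p_mean = points.mean(axis=0)
    r_mean = reference.mean(axis=0)
    H = (points - p_mean).T @ (reference - r_mean)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = r_mean - R @ p_mean
    return R, t

rng = np.random.default_rng(6)
reference = rng.standard_normal((12, 3))        # planned control-point positions
theta = 0.05                                    # small set-up rotation (radians)
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
measured = reference @ Rz.T + np.array([2.0, -1.0, 0.5])  # displaced set-up
R, t = rigid_registration(measured, reference)
aligned = measured @ R.T + t                    # corrective transformation applied
```

    The recovered (R, t) plays the role of the corrective spatial transformation: applying it to the measured points brings them back onto their reference positions.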

  7. Method for secure electronic voting system: face recognition based approach

    Science.gov (United States)

    Alim, M. Affan; Baig, Misbah M.; Mehboob, Shahzain; Naseem, Imran

    2017-06-01

In this paper, we propose a framework for a low-cost secure electronic voting system based on face recognition. Local Binary Patterns (LBP) are used to characterize face features in texture format, followed by classification based on the chi-square distribution. Two parallel systems, based on smartphone and web applications, are developed for the face learning and verification modules. The proposed system has two-tier security, using a person ID followed by face verification; a class-specific threshold controls the security level of the face verification. Our system is evaluated on three standard databases and one real home-based database, and achieves satisfactory recognition accuracies. Consequently, the proposed system provides a secure, hassle-free voting system that is less intrusive compared with other biometrics.
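The LBP-plus-chi-square pipeline described above can be sketched as follows (a toy 8-neighbour LBP over a raw grayscale array; a real system would work on detected and aligned face crops, and the threshold value is a placeholder):

```python
# Minimal LBP texture verification: code each interior pixel against its
# 8 neighbours, histogram the codes, compare histograms by chi-square.
def lbp_code(img, r, c):
    """8-bit LBP code: one bit per neighbour >= centre pixel."""
    center = img[r][c]
    nbrs = [img[r-1][c-1], img[r-1][c], img[r-1][c+1], img[r][c+1],
            img[r+1][c+1], img[r+1][c], img[r+1][c-1], img[r][c-1]]
    return sum(1 << i for i, v in enumerate(nbrs) if v >= center)

def lbp_histogram(img):
    """Normalized 256-bin histogram of LBP codes over interior pixels."""
    hist = [0] * 256
    for r in range(1, len(img) - 1):
        for c in range(1, len(img[0]) - 1):
            hist[lbp_code(img, r, c)] += 1
    total = sum(hist) or 1
    return [h / total for h in hist]

def chi_square(h1, h2):
    """Chi-square distance between two histograms."""
    return sum((a - b) ** 2 / (a + b) for a, b in zip(h1, h2) if a + b > 0)

def verify(probe, gallery, threshold):
    """Accept the claim if the texture distance is within the threshold."""
    return chi_square(lbp_histogram(probe), lbp_histogram(gallery)) <= threshold
```

The class-specific threshold mentioned in the abstract corresponds to choosing a different `threshold` per enrolled subject.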

  8. Differences in the Pattern of Hemodynamic Response to Self-Face and Stranger-Face Images in Adolescents with Anorexia Nervosa: A Near-Infrared Spectroscopic Study.

    Directory of Open Access Journals (Sweden)

    Takeshi Inoue

There have been no reports concerning the self-face perception in patients with anorexia nervosa (AN). The purpose of this study was to compare the neuronal correlates of viewing self-face images (i.e. images of familiar face) and stranger-face images (i.e. images of an unfamiliar face) in female adolescents with and without AN. We used near-infrared spectroscopy (NIRS) to measure hemodynamic responses while the participants viewed full-color photographs of self-face and stranger-face. Fifteen females with AN (mean age, 13.8 years) and 15 age- and intelligence quotient (IQ)-matched female controls without AN (mean age, 13.1 years) participated in the study. The responses to photographs were compared with the baseline activation (response to white uniform blank). In the AN group, the concentration of oxygenated hemoglobin (oxy-Hb) significantly increased in the right temporal area during the presentation of both the self-face and stranger-face images compared with the baseline level. In contrast, in the control group, the concentration of oxy-Hb significantly increased in the right temporal area only during the presentation of the self-face image. To our knowledge the present study is the first report to assess brain activities during self-face and stranger-face perception among female adolescents with AN. There were different patterns of brain activation in response to the sight of the self-face and stranger-face images in female adolescents with AN and controls.

  9. Differences in the Pattern of Hemodynamic Response to Self-Face and Stranger-Face Images in Adolescents with Anorexia Nervosa: A Near-Infrared Spectroscopic Study.

    Science.gov (United States)

    Inoue, Takeshi; Sakuta, Yuiko; Shimamura, Keiichi; Ichikawa, Hiroko; Kobayashi, Megumi; Otani, Ryoko; Yamaguchi, Masami K; Kanazawa, So; Kakigi, Ryusuke; Sakuta, Ryoichi

    2015-01-01

    There have been no reports concerning the self-face perception in patients with anorexia nervosa (AN). The purpose of this study was to compare the neuronal correlates of viewing self-face images (i.e. images of familiar face) and stranger-face images (i.e. images of an unfamiliar face) in female adolescents with and without AN. We used near-infrared spectroscopy (NIRS) to measure hemodynamic responses while the participants viewed full-color photographs of self-face and stranger-face. Fifteen females with AN (mean age, 13.8 years) and 15 age- and intelligence quotient (IQ)-matched female controls without AN (mean age, 13.1 years) participated in the study. The responses to photographs were compared with the baseline activation (response to white uniform blank). In the AN group, the concentration of oxygenated hemoglobin (oxy-Hb) significantly increased in the right temporal area during the presentation of both the self-face and stranger-face images compared with the baseline level. In contrast, in the control group, the concentration of oxy-Hb significantly increased in the right temporal area only during the presentation of the self-face image. To our knowledge the present study is the first report to assess brain activities during self-face and stranger-face perception among female adolescents with AN. There were different patterns of brain activation in response to the sight of the self-face and stranger-face images in female adolescents with AN and controls.

  10. HDL to verification logic translator

    Science.gov (United States)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

The increasingly high number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  11. Hawk-I - First results from science verification

    NARCIS (Netherlands)

    Kissler-Patig, M.; Larsen, S.S.; Wehner, E.M.

    2008-01-01

The VLT wide-field near-infrared imager HAWK-I was commissioned in 2007 and Science Verification (SV) programmes were conducted in August 2007. A selection of results from the twelve Science Verification proposals is summarised.

  12. Monitoring and verification R and D

    International Nuclear Information System (INIS)

    Pilat, Joseph F.; Budlong-Sylvester, Kory W.; Fearey, Bryan L.

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R and D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R and D required to address these gaps and other monitoring and verification challenges.

  13. Bimodal Biometric Verification Using the Fusion of Palmprint and Infrared Palm-Dorsum Vein Images

    Directory of Open Access Journals (Sweden)

    Chih-Lung Lin

    2015-12-01

In this paper, we present a reliable and robust biometric verification method based on bimodal physiological characteristics of palms, including the palmprint and palm-dorsum vein patterns. The proposed method consists of five steps: (1) automatically aligning and cropping the same region of interest from different palm or palm-dorsum images; (2) applying the digital wavelet transform and inverse wavelet transform to fuse palmprint and vein pattern images; (3) extracting the line-like features (LLFs) from the fused image; (4) obtaining multiresolution representations of the LLFs by using a multiresolution filter; and (5) using a support vector machine to verify the multiresolution representations of the LLFs. The proposed method possesses four advantages: first, both modal images are captured in peg-free scenarios to improve the user-friendliness of the verification device. Second, palmprint and vein pattern images are captured using a low-resolution digital scanner and infrared (IR) camera. The use of low-resolution images results in a smaller database. In addition, the vein pattern images are captured through the invisible IR spectrum, which improves antispoofing. Third, since the physiological characteristics of palmprint and vein pattern images are different, a hybrid fusing rule can be introduced to fuse the decomposition coefficients of different bands. The proposed method fuses decomposition coefficients at different decomposed levels, with different image sizes, captured from different sensor devices. Finally, the proposed method operates automatically and hence no parameters need to be set manually. Three thousand palmprint images and 3000 vein pattern images were collected from 100 volunteers to verify the validity of the proposed method. The results show a false rejection rate of 1.20% and a false acceptance rate of 1.56%. It demonstrates the validity and excellent performance of our proposed method comparing to other methods.

  14. Bimodal Biometric Verification Using the Fusion of Palmprint and Infrared Palm-Dorsum Vein Images.

    Science.gov (United States)

    Lin, Chih-Lung; Wang, Shih-Hung; Cheng, Hsu-Yung; Fan, Kuo-Chin; Hsu, Wei-Lieh; Lai, Chin-Rong

    2015-12-12

    In this paper, we present a reliable and robust biometric verification method based on bimodal physiological characteristics of palms, including the palmprint and palm-dorsum vein patterns. The proposed method consists of five steps: (1) automatically aligning and cropping the same region of interest from different palm or palm-dorsum images; (2) applying the digital wavelet transform and inverse wavelet transform to fuse palmprint and vein pattern images; (3) extracting the line-like features (LLFs) from the fused image; (4) obtaining multiresolution representations of the LLFs by using a multiresolution filter; and (5) using a support vector machine to verify the multiresolution representations of the LLFs. The proposed method possesses four advantages: first, both modal images are captured in peg-free scenarios to improve the user-friendliness of the verification device. Second, palmprint and vein pattern images are captured using a low-resolution digital scanner and infrared (IR) camera. The use of low-resolution images results in a smaller database. In addition, the vein pattern images are captured through the invisible IR spectrum, which improves antispoofing. Third, since the physiological characteristics of palmprint and vein pattern images are different, a hybrid fusing rule can be introduced to fuse the decomposition coefficients of different bands. The proposed method fuses decomposition coefficients at different decomposed levels, with different image sizes, captured from different sensor devices. Finally, the proposed method operates automatically and hence no parameters need to be set manually. Three thousand palmprint images and 3000 vein pattern images were collected from 100 volunteers to verify the validity of the proposed method. The results show a false rejection rate of 1.20% and a false acceptance rate of 1.56%. It demonstrates the validity and excellent performance of our proposed method comparing to other methods.

  15. Bimodal Biometric Verification Using the Fusion of Palmprint and Infrared Palm-Dorsum Vein Images

    Science.gov (United States)

    Lin, Chih-Lung; Wang, Shih-Hung; Cheng, Hsu-Yung; Fan, Kuo-Chin; Hsu, Wei-Lieh; Lai, Chin-Rong

    2015-01-01

    In this paper, we present a reliable and robust biometric verification method based on bimodal physiological characteristics of palms, including the palmprint and palm-dorsum vein patterns. The proposed method consists of five steps: (1) automatically aligning and cropping the same region of interest from different palm or palm-dorsum images; (2) applying the digital wavelet transform and inverse wavelet transform to fuse palmprint and vein pattern images; (3) extracting the line-like features (LLFs) from the fused image; (4) obtaining multiresolution representations of the LLFs by using a multiresolution filter; and (5) using a support vector machine to verify the multiresolution representations of the LLFs. The proposed method possesses four advantages: first, both modal images are captured in peg-free scenarios to improve the user-friendliness of the verification device. Second, palmprint and vein pattern images are captured using a low-resolution digital scanner and infrared (IR) camera. The use of low-resolution images results in a smaller database. In addition, the vein pattern images are captured through the invisible IR spectrum, which improves antispoofing. Third, since the physiological characteristics of palmprint and vein pattern images are different, a hybrid fusing rule can be introduced to fuse the decomposition coefficients of different bands. The proposed method fuses decomposition coefficients at different decomposed levels, with different image sizes, captured from different sensor devices. Finally, the proposed method operates automatically and hence no parameters need to be set manually. Three thousand palmprint images and 3000 vein pattern images were collected from 100 volunteers to verify the validity of the proposed method. The results show a false rejection rate of 1.20% and a false acceptance rate of 1.56%. It demonstrates the validity and excellent performance of our proposed method comparing to other methods. PMID:26703596
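The wavelet-based fusion step shared by records 13-15 can be sketched with a single-level 2D Haar transform (the papers use a hybrid rule over several decomposition levels; the average/max-abs rule below is a common simple choice, not necessarily the authors'):

```python
# Single-level 2D Haar DWT over 2x2 blocks, plus a basic fusion rule:
# average the approximation coefficients, keep the stronger detail
# coefficient from either source image.
def haar2_forward(img):
    """Return (approx, horiz, vert, diag) coefficient maps."""
    H, W = len(img) // 2, len(img[0]) // 2
    a = [[0.0] * W for _ in range(H)]; h = [[0.0] * W for _ in range(H)]
    v = [[0.0] * W for _ in range(H)]; d = [[0.0] * W for _ in range(H)]
    for i in range(H):
        for j in range(W):
            p, q = img[2*i][2*j], img[2*i][2*j+1]
            r, s = img[2*i+1][2*j], img[2*i+1][2*j+1]
            a[i][j] = (p + q + r + s) / 4
            h[i][j] = (p - q + r - s) / 4
            v[i][j] = (p + q - r - s) / 4
            d[i][j] = (p - q - r + s) / 4
    return a, h, v, d

def haar2_inverse(a, h, v, d):
    """Exact inverse of haar2_forward."""
    H, W = len(a), len(a[0])
    img = [[0.0] * (2 * W) for _ in range(2 * H)]
    for i in range(H):
        for j in range(W):
            img[2*i][2*j]     = a[i][j] + h[i][j] + v[i][j] + d[i][j]
            img[2*i][2*j+1]   = a[i][j] - h[i][j] + v[i][j] - d[i][j]
            img[2*i+1][2*j]   = a[i][j] + h[i][j] - v[i][j] - d[i][j]
            img[2*i+1][2*j+1] = a[i][j] - h[i][j] - v[i][j] + d[i][j]
    return img

def fuse(img1, img2):
    """Fuse two equal-size images in the wavelet domain."""
    a1, h1, v1, d1 = haar2_forward(img1)
    a2, h2, v2, d2 = haar2_forward(img2)
    H, W = len(a1), len(a1[0])
    mean = [[(a1[i][j] + a2[i][j]) / 2 for j in range(W)] for i in range(H)]
    def pick(x, y):  # max-abs selection per detail coefficient
        return [[x[i][j] if abs(x[i][j]) >= abs(y[i][j]) else y[i][j]
                 for j in range(W)] for i in range(H)]
    return haar2_inverse(mean, pick(h1, h2), pick(v1, v2), pick(d1, d2))
```

Fusing in the coefficient domain is what lets the papers apply different rules per band, since each band carries a different scale of line-like structure.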

  16. Self-face recognition in children with autism spectrum disorders: a near-infrared spectroscopy study.

    Science.gov (United States)

    Kita, Yosuke; Gunji, Atsuko; Inoue, Yuki; Goto, Takaaki; Sakihara, Kotoe; Kaga, Makiko; Inagaki, Masumi; Hosokawa, Toru

    2011-06-01

    It is assumed that children with autism spectrum disorders (ASD) have specificities for self-face recognition, which is known to be a basic cognitive ability for social development. In the present study, we investigated neurological substrates and potentially influential factors for self-face recognition of ASD patients using near-infrared spectroscopy (NIRS). The subjects were 11 healthy adult men, 13 normally developing boys, and 10 boys with ASD. Their hemodynamic activities in the frontal area and their scanning strategies (eye-movement) were examined during self-face recognition. Other factors such as ASD severities and self-consciousness were also evaluated by parents and patients, respectively. Oxygenated hemoglobin levels were higher in the regions corresponding to the right inferior frontal gyrus than in those corresponding to the left inferior frontal gyrus. In two groups of children these activities reflected ASD severities, such that the more serious ASD characteristics corresponded with lower activity levels. Moreover, higher levels of public self-consciousness intensified the activities, which were not influenced by the scanning strategies. These findings suggest that dysfunction in the right inferior frontal gyrus areas responsible for self-face recognition is one of the crucial neural substrates underlying ASD characteristics, which could potentially be used to evaluate psychological aspects such as public self-consciousness. Copyright © 2010 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.

  17. Improving Face Detection with TOE Cameras

    DEFF Research Database (Denmark)

    Hansen, Dan Witzner; Larsen, Rasmus; Lauze, F

    2007-01-01

A face detection method based on a boosted classifier using images from a time-of-flight sensor is presented. We show that the performance of face detection can be improved when using both depth and gray-scale images, and that the common use of integration of hypotheses for verification can be relaxed. Based on the detected face, we employ an active contour method on depth images for full head segmentation.

  18. Neural correlates of own- and other-race face recognition in children: a functional near-infrared spectroscopy study.

    Science.gov (United States)

    Ding, Xiao Pan; Fu, Genyue; Lee, Kang

    2014-01-15

    The present study used the functional Near-infrared Spectroscopy (fNIRS) methodology to investigate the neural correlates of elementary school children's own- and other-race face processing. An old-new paradigm was used to assess children's recognition ability of own- and other-race faces. FNIRS data revealed that other-race faces elicited significantly greater [oxy-Hb] changes than own-race faces in the right middle frontal gyrus and inferior frontal gyrus regions (BA9) and the left cuneus (BA18). With increased age, the [oxy-Hb] activity differences between own- and other-race faces, or the neural other-race effect (NORE), underwent significant changes in these two cortical areas: at younger ages, the neural response to the other-race faces was modestly greater than that to the own-race faces, but with increased age, the neural response to the own-race faces became increasingly greater than that to the other-race faces. Moreover, these areas had strong regional functional connectivity with a swath of the cortical regions in terms of the neural other-race effect that also changed with increased age. We also found significant and positive correlations between the behavioral other-race effect (reaction time) and the neural other-race effect in the right middle frontal gyrus and inferior frontal gyrus regions (BA9). These results taken together suggest that children, like adults, devote different amounts of neural resources to processing own- and other-race faces, but the size and direction of the neural other-race effect and associated functional regional connectivity change with increased age. © 2013.

  19. An embedded face-classification system for infrared images on an FPGA

    Science.gov (United States)

    Soto, Javier E.; Figueroa, Miguel

    2014-10-01

We present a face-classification architecture for long-wave infrared (IR) images implemented on a Field Programmable Gate Array (FPGA). The circuit is fast, compact and low power, can recognize faces in real time, and can be embedded in a larger image-processing and computer vision system operating locally on an IR camera. The algorithm uses Local Binary Patterns (LBP) to perform feature extraction on each IR image. First, each pixel in the image is represented as an LBP pattern that encodes the similarity between the pixel and its neighbors. Uniform LBP codes are then used to reduce the number of patterns to 59 while preserving more than 90% of the information contained in the original LBP representation. Then, the image is divided into 64 non-overlapping regions, and each region is represented as a 59-bin histogram of patterns. Finally, the algorithm concatenates all 64 regions to create a 3,776-bin spatially enhanced histogram. We reduce the dimensionality of this histogram using Linear Discriminant Analysis (LDA), which improves clustering and enables us to store an entire database of 53 subjects on-chip. During classification, the circuit applies LBP and LDA to each incoming IR image in real time, and compares the resulting feature vector to each pattern stored in the local database using the Manhattan distance. We implemented the circuit on a Xilinx Artix-7 XC7A100T FPGA and tested it with the UCHThermalFace database, which consists of 28 images (81 x 150 pixels) of each of 53 subjects in indoor and outdoor conditions. The circuit achieves a 98.6% hit ratio, trained with 16 images and tested with 12 images of each subject in the database. Using a 100 MHz clock, the circuit classifies 8,230 images per second and consumes only 309 mW.
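The uniform-LBP reduction to 59 bins and the Manhattan-distance matching mentioned above can be checked in a few lines (the circuit itself is hardware; this only reproduces the arithmetic):

```python
# An 8-bit LBP code is "uniform" if its circular bit string has at most
# two 0/1 transitions. There are exactly 58 such codes; all non-uniform
# codes share one extra histogram bin, giving 59 bins in total.
def transitions(code):
    bits = [(code >> i) & 1 for i in range(8)]
    return sum(bits[i] != bits[(i + 1) % 8] for i in range(8))

uniform_codes = [c for c in range(256) if transitions(c) <= 2]
lbp_to_bin = {c: i for i, c in enumerate(uniform_codes)}

def histogram_bin(code):
    """Map an LBP code to its bin; non-uniform codes land in bin 58."""
    return lbp_to_bin.get(code, len(uniform_codes))

def manhattan(u, v):
    """L1 distance used to match feature vectors against the database."""
    return sum(abs(a - b) for a, b in zip(u, v))
```

The Manhattan distance needs only absolute differences and additions, which is one reason it suits a compact FPGA datapath.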

  20. The effect of image resolution on the performance of a face recognition system

    NARCIS (Netherlands)

    Boom, B.J.; Beumer, G.M.; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.

    2006-01-01

    In this paper we investigate the effect of image resolution on the error rates of a face verification system. We do not restrict ourselves to the face recognition algorithm only, but we also consider the face registration. In our face recognition system, the face registration is done by finding

  1. FMCT verification: Case studies

    International Nuclear Information System (INIS)

    Hui Zhang

    2001-01-01

for states that have traditionally had 'less transparency' in their military sectors. As case studies, we first investigate how to apply verification measures, including remote sensing, off-site environmental sampling and on-site inspections, to monitor the shutdown status of plutonium production facilities, and what measures could be taken to prevent the disclosure of sensitive information at the site. We find the most effective verification measure to monitor the status of a reprocessing plant would be on-site environmental sampling. Some countries may worry that sample analysis could disclose sensitive information about their past plutonium production activities. However, we find that sample analysis at the reprocessing site need not reveal such information. Sampling would not reveal such information as long as inspectors are not able to measure total quantities of Cs-137 and Sr-90 from HLW produced at former military plutonium production facilities. Secondly, we consider verification measures for shutdown gaseous diffusion uranium-enrichment plants (GDPs). The GDPs could be monitored effectively by satellite imagery, as one telltale operational signature of a GDP would be the water-vapor plume coming from the cooling tower, which should be easy to detect with satellite images. Furthermore, the hot roof of the enrichment building could be detectable using satellite thermal-infrared images. Finally, some on-site verification measures should be allowed, such as visual observation, surveillance and tamper-indicating seals. An FMCT verification regime would also have to be designed to detect undeclared fissile material production activities and facilities. These verification measures could include something like special or challenge inspections or complementary access. There would need to be provisions to prevent the abuse of such inspections, especially at sensitive and non-proscribed military and nuclear activities. In particular, to protect sensitive

  2. Verification of Ganoderma (lingzhi) commercial products by Fourier Transform infrared spectroscopy and two-dimensional IR correlation spectroscopy

    Science.gov (United States)

    Choong, Yew-Keong; Sun, Su-Qin; Zhou, Qun; Lan, Jin; Lee, Han-Lim; Chen, Xiang-Dong

    2014-07-01

Ganoderma commercial products are typically based on two sources: raw material (powder form and/or spores) and extract (water and/or solvent). This study compared three types of Ganoderma commercial products using one-dimensional Fourier transform infrared (FTIR) spectroscopy and second-derivative spectra. The analyzed spectra of Ganoderma raw-material products were compared with spectra of cultivated Ganoderma raw-material powder from different mushroom farms in Malaysia. The Ganoderma extract product was also compared with three types of cultivated Ganoderma extracts. Other medicinal Ganoderma contents in the commercial extract product, including glucan and triterpenoid, were analyzed using FTIR and 2DIR. The results showed that the water extract of cultivated Ganoderma possessed spectra comparable with those of the Ganoderma product water extract. By comparing the content of Ganoderma commercial products using FTIR and 2DIR, product content profiles could be detected. In addition, the geographical origin of the Ganoderma products could be verified by comparing their spectra with those of Ganoderma products from known areas. This study demonstrated the possibility of developing a verification tool to validate the purity of commercial medicinal herbal and mushroom products.
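The value of second-derivative spectra for such comparisons can be seen in a small numerical sketch: a constant or linear baseline difference between two spectra vanishes under the second derivative, leaving only band shape to compare (toy data, not FTIR measurements):

```python
# Central-difference second derivative of a sampled spectrum.
# Baseline offsets (constant) and slopes (linear) differentiate to zero,
# which is why second-derivative spectra are compared instead of raw ones.
def second_derivative(y, h=1.0):
    return [(y[i - 1] - 2 * y[i] + y[i + 1]) / h**2
            for i in range(1, len(y) - 1)]

spectrum = [0.1 * i * i for i in range(10)]                    # hypothetical band
shifted = [v + 5.0 + 0.3 * i for i, v in enumerate(spectrum)]  # offset + slope added
```

After differentiation the two curves coincide, so a distance computed on the derivatives ignores the baseline drift entirely.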

  3. Improvement of non destructive infrared test bed SATIR for examination of actively cooled tungsten armour Plasma Facing Components

    Energy Technology Data Exchange (ETDEWEB)

    Vignal, N., E-mail: nicolas.vignal@cea.fr; Desgranges, C.; Cantone, V.; Richou, M.; Courtois, X.; Missirlian, M.; Magaud, Ph.

    2013-10-15

Highlights: • Non-destructive infrared techniques to control ITER-like PFCs. • Reflective surfaces such as W induce a measurement temperature error. • Numerical data processing by evaluation of the local emissivity. • The SATIR test bed can control metallic surfaces with low and variable emissivity. -- Abstract: For steady-state (magnetic) thermonuclear fusion devices, which need large power exhaust capability and have to withstand heat fluxes in the range of 10–20 MW m⁻², advanced Plasma Facing Components (PFCs) have been developed. The importance of PFCs for operating tokamaks requires that their manufacturing quality be verified before mounting. SATIR is an IR test bed validated and recognized as a reliable and suitable tool for detecting cooling defects on PFCs with CFC armour material. Current tokamak developments implement metallic armour materials for the first wall and divertor; their low emissivity causes several difficulties for infrared thermography control. We present SATIR infrared thermography test bed improvements for W monoblock components without defects and with calibrated defects. These results are compared to ultrasonic inspection. This study demonstrates that the SATIR method is fully usable for PFCs with low-emissivity armour material.

  4. Improvement of non destructive infrared test bed SATIR for examination of actively cooled tungsten armour Plasma Facing Components

    International Nuclear Information System (INIS)

    Vignal, N.; Desgranges, C.; Cantone, V.; Richou, M.; Courtois, X.; Missirlian, M.; Magaud, Ph.

    2013-01-01

Highlights: • Non-destructive infrared techniques to control ITER-like PFCs. • Reflective surfaces such as W induce a measurement temperature error. • Numerical data processing by evaluation of the local emissivity. • The SATIR test bed can control metallic surfaces with low and variable emissivity. -- Abstract: For steady-state (magnetic) thermonuclear fusion devices, which need large power exhaust capability and have to withstand heat fluxes in the range of 10–20 MW m⁻², advanced Plasma Facing Components (PFCs) have been developed. The importance of PFCs for operating tokamaks requires that their manufacturing quality be verified before mounting. SATIR is an IR test bed validated and recognized as a reliable and suitable tool for detecting cooling defects on PFCs with CFC armour material. Current tokamak developments implement metallic armour materials for the first wall and divertor; their low emissivity causes several difficulties for infrared thermography control. We present SATIR infrared thermography test bed improvements for W monoblock components without defects and with calibrated defects. These results are compared to ultrasonic inspection. This study demonstrates that the SATIR method is fully usable for PFCs with low-emissivity armour material.

  5. Currency verification by a 2D infrared barcode

    International Nuclear Information System (INIS)

    Schirripa Spagnolo, Giuseppe; Cozzella, Lorenzo; Simonetti, Carla

    2010-01-01

Nowadays all the national central banks are continuously studying innovative anti-counterfeiting systems for banknotes. In this note, an innovative solution is proposed that combines the potential of a hylemetric approach (a methodology conceptually similar to biometrics), based on the notes' intrinsic characteristics, with a well-known and consolidated 2D barcode identification system. In particular, we propose to extract from the banknote a unique binary control sequence (template) and insert an encrypted version of it in a barcode printed on the same banknote. For a more acceptable look and feel of the banknote, the superposed barcode can be stamped using IR ink, which is visible only to near-IR image sensors. This makes banknote verification simpler. (technical design note)
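The verify-by-barcode idea can be sketched as follows, with an HMAC standing in for the note's unspecified encryption scheme (the key and template values are invented for illustration):

```python
import hmac
import hashlib

# At print time the issuer derives an authentication tag from the note's
# intrinsic binary template and encodes it in the IR barcode; at
# verification time the template is re-extracted and checked against it.
SECRET = b"central-bank-issuing-key"   # hypothetical issuing key

def issue(template: bytes) -> bytes:
    """Payload to encode in the IR barcode when the note is printed."""
    return hmac.new(SECRET, template, hashlib.sha256).digest()

def verify(template: bytes, barcode_payload: bytes) -> bool:
    """Re-extract the template from the note and check it against the barcode."""
    return hmac.compare_digest(issue(template), barcode_payload)
```

`hmac.compare_digest` is used so that the comparison takes constant time; a forged note cannot be probed byte by byte.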

  6. A possible method of carbon deposit mapping on plasma facing components using infrared thermography

    International Nuclear Information System (INIS)

    Mitteau, R.; Spruytte, J.; Vallet, S.; Travere, J.M.; Guilhem, D.; Brosset, C.

    2007-01-01

The material eroded from the surface of plasma facing components is partly redeposited close to high-heat-flux areas. At these locations, the deposit is heated by the plasma and the deposition pattern evolves depending on the operation parameters. Mapping the deposit is still a matter of intense scientific activity, especially during the course of experimental campaigns. A method based on the comparison of surface temperature maps, obtained in situ by infrared cameras and by theoretical modelling, is proposed. The difference between the two is attributed to the thermal resistance added by the deposited material, and expressed as a deposit thickness. The method benefits from elaborate imaging techniques such as possibility theory and fuzzy logic. The results are consistent with deposit maps obtained by visual inspection during shutdowns.
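Treating the deposit as a plane conductive layer, the thickness read from the temperature excess follows from e = k * (T_meas - T_model) / q; a one-dimensional sketch with hypothetical numbers (the actual method works per pixel on the infrared maps):

```python
# Convert the excess of measured surface temperature over the
# clean-surface model into an equivalent deposit thickness, assuming
# steady-state 1D conduction through the layer.
def deposit_thickness_m(t_measured, t_modelled, heat_flux, k_deposit):
    """Thickness (m) of a deposit treated as a plane conductive layer.

    t_measured, t_modelled in K (or both in degC); heat_flux in W/m^2;
    k_deposit in W/(m.K).
    """
    return k_deposit * (t_measured - t_modelled) / heat_flux

# Hypothetical values: 1 MW/m^2 flux, 50 K excess, k = 1 W/(m.K)
e = deposit_thickness_m(450.0, 400.0, 1e6, 1.0)   # about 50 microns
```

The sensitivity is linear in the assumed conductivity of the deposit, which is why the method reports an equivalent thickness rather than an absolute one.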

  7. Extraction and fusion of spectral parameters for face recognition

    Science.gov (United States)

    Boisier, B.; Billiot, B.; Abdessalem, Z.; Gouton, P.; Hardeberg, J. Y.

    2011-03-01

Many methods have been developed in image processing for face recognition, especially in recent years with the rise of biometric technologies. However, most of these techniques are used on grayscale images acquired in the visible range of the electromagnetic spectrum. The aims of our study are to improve existing tools and to develop new methods for face recognition. The techniques used take advantage of the different spectral ranges (visible, optical infrared and thermal infrared) by either combining them or analyzing them separately in order to extract the most appropriate information for face recognition. We also verify the consistency of several keypoint extraction techniques in the near infrared (NIR) and in the visible spectrum.

  8. Calibration and verification of thermographic cameras for geometric measurements

    Science.gov (United States)

    Lagüela, S.; González-Jorge, H.; Armesto, J.; Arias, P.

    2011-03-01

    Infrared thermography is a technique with an increasing degree of development and a growing range of applications. Quality assessment of the measurements performed with thermal cameras should be achieved through metrological calibration and verification. Infrared cameras acquire both temperature and geometric information, although calibration and verification procedures are usual only for the thermal data; black bodies are used for these purposes. Moreover, the geometric information is important for many fields, such as architecture, civil engineering and industry. This work presents a calibration procedure that allows photogrammetric restitution, and a portable artefact to verify the geometric accuracy, repeatability and drift of thermographic cameras. These results allow the incorporation of this information into the quality control processes of the companies. A grid based on burning lamps is used for the geometric calibration of thermographic cameras. The artefact designed for the geometric verification consists of five delrin spheres and seven cubes of different sizes. Metrological traceability for the artefact is obtained from a coordinate measuring machine. Two sets of targets with different reflectivity are fixed to the spheres and cubes to make data processing and photogrammetric restitution possible. Reflectivity was the chosen material property because both the thermographic and the visible cameras are able to detect it. Two thermographic cameras from the Flir and Nec manufacturers, and one visible camera from Jai, are calibrated, verified and compared using the calibration grids and the standard artefact. The calibration system based on burning lamps shows its capability to perform the internal orientation of the thermal cameras. Verification results show repeatability better than 1 mm in all cases, and better than 0.5 mm for the visible camera. As expected, accuracy is also higher for the visible camera, and the geometric comparison between thermographic cameras shows slightly better

  9. Reconstructing Face Image from the Thermal Infrared Spectrum to the Visible Spectrum

    Directory of Open Access Journals (Sweden)

    Brahmastro Kresnaraman

    2016-04-01

    Full Text Available During the night or in poorly lit areas, thermal cameras are a better choice than normal cameras for security surveillance because they do not rely on illumination. A thermal camera is able to detect a person within its view, but identification from thermal information alone is not an easy task. The purpose of this paper is to reconstruct the face image of a person from the thermal spectrum to the visible spectrum. After the reconstruction, further image processing can be employed, including identification/recognition. Concretely, we propose a two-step thermal-to-visible-spectrum reconstruction method based on Canonical Correlation Analysis (CCA). The reconstruction is done by utilizing the relationship between images in both the thermal infrared and visible spectra obtained by CCA. The whole image is processed in the first step, while the second step processes patches in an image. Results show that the proposed method gives satisfying results with the two-step approach and outperforms comparative methods in both quality and recognition evaluations.
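
The CCA-based mapping described above can be sketched in plain numpy. This is only an illustration of the first (whole-image) step, assuming thermal and visible faces are given as feature vectors; it is not the authors' implementation, and the function names (`cca_fit`, `reconstruct`) are hypothetical.

```python
import numpy as np

def cca_fit(X, Y, k, reg=1e-6):
    """Fit CCA: return Wx, Wy projecting centred X and Y onto k
    maximally correlated canonical variates (plain-numpy sketch)."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    n = len(X)
    Sxx = Xc.T @ Xc / n + reg * np.eye(X.shape[1])
    Syy = Yc.T @ Yc / n + reg * np.eye(Y.shape[1])
    Sxy = Xc.T @ Yc / n

    def inv_sqrt(S):
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    Kx, Ky = inv_sqrt(Sxx), inv_sqrt(Syy)
    U, _, Vt = np.linalg.svd(Kx @ Sxy @ Ky)
    return Kx @ U[:, :k], Ky @ Vt[:k].T

def reconstruct(X_train, Y_train, X_test, k=3):
    """Map thermal features to the canonical space, then back to the
    visible space by least squares on the training pairs."""
    Wx, _ = cca_fit(X_train, Y_train, k)
    Zx_train = (X_train - X_train.mean(0)) @ Wx
    # regression from canonical variates to centred visible features
    A, *_ = np.linalg.lstsq(Zx_train, Y_train - Y_train.mean(0), rcond=None)
    Zx_test = (X_test - X_train.mean(0)) @ Wx
    return Zx_test @ A + Y_train.mean(0)
```

A patch-based second step would apply the same machinery to local patches instead of whole images.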

  10. Multi-task pose-invariant face recognition.

    Science.gov (United States)

    Ding, Changxing; Xu, Chang; Tao, Dacheng

    2015-03-01

    Face images captured in unconstrained environments usually contain significant pose variation, which dramatically degrades the performance of algorithms designed to recognize frontal faces. This paper proposes a novel face identification framework capable of handling the full range of pose variations within ±90° of yaw. The proposed framework first transforms the original pose-invariant face recognition problem into a partial frontal face recognition problem. A robust patch-based face representation scheme is then developed to represent the synthesized partial frontal faces. For each patch, a transformation dictionary is learnt under the proposed multi-task learning scheme. The transformation dictionary transforms the features of different poses into a discriminative subspace. Finally, face matching is performed at patch level rather than at the holistic level. Extensive and systematic experimentation on FERET, CMU-PIE, and Multi-PIE databases shows that the proposed method consistently outperforms single-task-based baselines as well as state-of-the-art methods for the pose problem. We further extend the proposed algorithm for the unconstrained face verification problem and achieve top-level performance on the challenging LFW data set.

  11. Individuation instructions decrease the Cross-Race Effect in a face matching task

    Directory of Open Access Journals (Sweden)

    2015-09-01

    Conclusions: Individuation instructions are an effective moderator of the CRE even within a face matching paradigm. Since unfamiliar face matching tasks most closely simulate document verification tasks, specifically passport screening, instructional techniques such as these may improve task performance within applied settings of significant practical importance.

  12. Sensor-fusion-based biometric identity verification

    International Nuclear Information System (INIS)

    Carlson, J.J.; Bouchard, A.M.; Osbourn, G.C.; Martinez, R.F.; Bartholomew, J.W.; Jordan, J.B.; Flachs, G.M.; Bao, Z.; Zhu, L.

    1998-02-01

    Future generation automated human biometric identification and verification will require multiple features/sensors together with internal and external information sources to achieve high performance, accuracy, and reliability in uncontrolled environments. The primary objective of the proposed research is to develop a theoretical and practical basis for identifying and verifying people using standoff biometric features that can be obtained with minimal inconvenience during the verification process. The basic problem involves selecting sensors and discovering features that provide sufficient information to reliably verify a person's identity under the uncertainties caused by measurement errors and tactics of uncooperative subjects. A system was developed for discovering hand, face, ear, and voice features and fusing them to verify the identity of people. The system obtains its robustness and reliability by fusing many coarse and easily measured features into a near minimal probability of error decision algorithm

  13. Temporal lobe and inferior frontal gyrus dysfunction in patients with schizophrenia during face-to-face conversation: a near-infrared spectroscopy study.

    Science.gov (United States)

    Takei, Yuichi; Suda, Masashi; Aoyama, Yoshiyuki; Yamaguchi, Miho; Sakurai, Noriko; Narita, Kosuke; Fukuda, Masato; Mikuni, Masahiko

    2013-11-01

    Schizophrenia (SC) is marked by poor social-role performance and social-skill deficits that are well reflected in daily conversation. Although the mechanism underlying these impairments has been investigated by functional neuroimaging, technical limitations have prevented the investigation of brain activation during conversation in typical clinical situations. To fill this research gap, this study investigated and compared frontal and temporal lobe activation in patients with SC during face-to-face conversation. Frontal and temporal lobe activation in 29 patients and 31 normal controls (NC) (n = 60) were measured during 180-s conversation periods by using near-infrared spectroscopy (NIRS). The grand average values of oxyhemoglobin concentration ([oxy-Hb]) changes during task performance were analyzed to determine their correlation with clinical variables and Positive and Negative Syndrome Scale (PANSS) subscores. Compared to NCs, patients with SC exhibited decreased performance in the conversation task and decreased activation in both the temporal lobes and the right inferior frontal gyrus (IFG) during task performance, as indicated by the grand average of [oxy-Hb] changes. The decreased activation in the left temporal lobe was negatively correlated with the PANSS disorganization and negative symptoms subscores and that in the right IFG was negatively correlated with illness duration, PANSS disorganization, and negative symptom subscores. These findings indicate that brain dysfunction in SC during conversation is related to functional deficits in both the temporal lobes and the right IFG and manifests primarily in the form of disorganized thinking and negative symptomatology. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Hyper-Spectral Imager in visible and near-infrared band for lunar ...

    Indian Academy of Sciences (India)

    India's first lunar mission, Chandrayaan-1, will have a Hyper-Spectral Imager in the visible and near-infrared spectral ... mapping of the Moon's crust in a large number of spectral channels. The planned .... In-flight verification may be done.

  15. Empirical Tests and Preliminary Results with the Krakatoa Tool for Full Static Program Verification

    Directory of Open Access Journals (Sweden)

    Ramírez-de León Edgar Darío

    2014-10-01

    Full Text Available XJML (Ramírez et al., 2012) is a modular external platform for Verification and Validation of Java classes using the Java Modeling Language (JML) through contracts written in XML. One problem faced in the XJML development was how to integrate Full Static Program Verification (FSPV). This paper presents the experiments and results that allowed us to define which tool to embed in XJML to execute FSPV.

  16. Face Liveness Detection Using Defocus

    Directory of Open Access Journals (Sweden)

    Sooyeon Kim

    2015-01-01

    Full Text Available In order to develop security systems for identity authentication, face recognition (FR) technology has been applied. One of the main problems in applying FR technology is that the systems are especially vulnerable to attacks with spoofing faces (e.g., 2D pictures). To defend against these attacks and to enhance the reliability of FR systems, many anti-spoofing approaches have recently been developed. In this paper, we propose a method for face liveness detection using the effect of defocus. From two images sequentially taken at different focuses, three features are extracted: focus, power histogram, and gradient location and orientation histogram (GLOH). Afterwards, we detect forged faces through a feature-level fusion approach. For reliable performance verification, we developed two databases with a handheld digital camera and a webcam. The proposed method achieves a 3.29% half total error rate (HTER) at a given depth of field (DoF) and can be extended to camera-equipped devices, like smartphones.
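
The defocus cue can be illustrated with a variance-of-Laplacian focus measure, a common sharpness proxy; the paper's actual focus feature may differ, so this is only a hedged sketch with illustrative function names.

```python
import numpy as np

def focus_measure(img):
    """Variance of a 4-neighbour Laplacian: high for sharp images,
    low for defocused ones."""
    lap = (-4.0 * img[1:-1, 1:-1] + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

def defocus_cue(img_near, img_far):
    """A real 3-D face changes sharpness between two focus settings;
    a flat printed photo changes much less. Return the relative change."""
    f1, f2 = focus_measure(img_near), focus_measure(img_far)
    return abs(f1 - f2) / max(f1, f2)
```

A liveness decision would threshold this cue (possibly fused with other features, as in the paper).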

  17. Discriminative Projection Selection Based Face Image Hashing

    Science.gov (United States)

    Karabat, Cagatay; Erdogan, Hakan

    Face image hashing is an emerging method used in biometric verification systems. In this paper, we propose a novel face image hashing method based on a new technique called discriminative projection selection. We apply the Fisher criterion for selecting the rows of a random projection matrix in a user-dependent fashion. Moreover, another contribution of this paper is to employ a bimodal Gaussian mixture model at the quantization step. Our simulation results on three different databases demonstrate that the proposed method has superior performance in comparison to previously proposed random projection based methods.
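
Fisher-criterion row selection can be sketched as follows, assuming each user has genuine and impostor training samples; the paper's subsequent quantization with a bimodal Gaussian mixture is omitted, and the function name is illustrative.

```python
import numpy as np

def select_rows(P, genuine, impostor, k):
    """Pick the k rows of random projection matrix P whose 1-D projections
    best separate this user's genuine samples from impostor samples,
    scored by the Fisher criterion (mu_g - mu_i)^2 / (var_g + var_i)."""
    g = genuine @ P.T           # (n_genuine, n_rows) projections
    i = impostor @ P.T
    score = (g.mean(0) - i.mean(0)) ** 2 / (g.var(0) + i.var(0) + 1e-12)
    return P[np.argsort(score)[::-1][:k]]
```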

  18. Verification and nuclear material security

    International Nuclear Information System (INIS)

    ElBaradei, M.

    2001-01-01

    Full text: The Director General will open the symposium by presenting a series of challenges facing the international safeguards community: the need to ensure a robust system, with strong verification tools and a sound research and development programme; the importance of securing the necessary support for the system, in terms of resources; the effort to achieve universal participation in the non-proliferation regime; and the necessity of re-energizing disarmament efforts. Special focus will be given to the challenge underscored by recent events, of strengthening international efforts to combat nuclear terrorism. (author)

  19. ECG based biometrics verification system using LabVIEW

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Singla

    2010-07-01

    Full Text Available Biometric-based authentication systems provide solutions to the high-security problems that remain with conventional security systems. In a biometric verification system, human biological parameters (such as voice, fingerprint, palm print or hand geometry, face, iris, etc.) are used to verify the authenticity of a person. These parameters are good candidates for biometrics but do not guarantee that the person is present and alive: a voice can be copied, a fingerprint can be lifted from glass onto synthetic skin, and in face recognition systems identical twins, or a father and son, may have the same facial appearance due to genetic factors. ECG does not have these problems. It cannot be recorded without the knowledge of the person, and every person's ECG is unique; even identical twins have different ECGs. In this paper an ECG-based biometric verification system developed using Laboratory Virtual Instruments Engineering Workbench (LabVIEW version 7.1) is discussed. Experiments were conducted on a laboratory database of 20 individuals with 10 samples each, and the results revealed a false rejection rate (FRR) of 3% and a false acceptance rate (FAR) of 3.21%.
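
The FRR and FAR figures reported above are computed from genuine and impostor score sets; a generic sketch (not the LabVIEW implementation) looks like this:

```python
import numpy as np

def frr_far(genuine_scores, impostor_scores, threshold):
    """FRR = fraction of genuine attempts rejected (score < threshold);
    FAR = fraction of impostor attempts accepted (score >= threshold)."""
    genuine_scores = np.asarray(genuine_scores, dtype=float)
    impostor_scores = np.asarray(impostor_scores, dtype=float)
    frr = np.mean(genuine_scores < threshold)
    far = np.mean(impostor_scores >= threshold)
    return frr, far
```

Sweeping the threshold trades one rate against the other; the point where they are equal is the usual equal error rate (EER).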

  20. Autistic traits and brain activation during face-to-face conversations in typically developed adults.

    Science.gov (United States)

    Suda, Masashi; Takei, Yuichi; Aoyama, Yoshiyuki; Narita, Kosuke; Sakurai, Noriko; Fukuda, Masato; Mikuni, Masahiko

    2011-01-01

    Autism spectrum disorders (ASD) are characterized by impaired social interaction and communication, restricted interests, and repetitive behaviours. The severity of these characteristics is posited to lie on a continuum that extends into the general population. Brain substrates underlying ASD have been investigated through functional neuroimaging studies using functional magnetic resonance imaging (fMRI). However, fMRI has methodological constraints for studying brain mechanisms during social interactions (for example, noise, lying on a gantry during the procedure, etc.). In this study, we investigated whether variations in autism spectrum traits are associated with changes in patterns of brain activation in typically developed adults. We used near-infrared spectroscopy (NIRS), a recently developed functional neuroimaging technique that uses near-infrared light, to monitor brain activation in a natural setting that is suitable for studying brain functions during social interactions. We monitored regional cerebral blood volume changes using a 52-channel NIRS apparatus over the prefrontal cortex (PFC) and superior temporal sulcus (STS), 2 areas implicated in social cognition and the pathology of ASD, in 28 typically developed participants (14 male and 14 female) during face-to-face conversations. This task was designed to resemble a realistic social situation. We examined the correlations of these changes with autistic traits assessed using the Autism-Spectrum Quotient (AQ). Both the PFC and STS were significantly activated during face-to-face conversations. AQ scores were negatively correlated with regional cerebral blood volume increases in the left STS during face-to-face conversations, especially in males. 
Our results demonstrate successful monitoring of brain function during realistic social interactions by NIRS as well as lesser brain activation in the left STS during face-to-face conversations in typically developed participants with higher levels of autistic

  1. Solar and infrared radiation measurements

    CERN Document Server

    Vignola, Frank; Michalsky, Joseph

    2012-01-01

    The rather specialized field of solar and infrared radiation measurement has become more and more important in the face of growing demands by the renewable energy and climate change research communities for data that are more accurate and have increased temporal and spatial resolution. Updating decades of acquired knowledge in the field, Solar and Infrared Radiation Measurements details the strengths and weaknesses of instruments used to conduct such solar and infrared radiation measurements. Topics covered include: Radiometer design and performance Equipment calibration, installation, operati

  2. Data merging of infrared and ultrasonic images for plasma facing components inspection

    International Nuclear Information System (INIS)

    Richou, M.; Durocher, A.; Medrano, M.; Martinez-Ona, R.; Moysan, J.; Riccardi, B.

    2009-01-01

    For steady-state magnetic thermonuclear fusion devices, which need large power-exhaust capability, actively cooled plasma facing components have been developed. In order to guarantee the integrity of these components over the required lifetime, their thermal and mechanical behaviour must be assessed. Before the procurement of the ITER Divertor, the examination of the heat-sink-to-armour joints with non-destructive techniques is an essential topic to be addressed. Defects may be localised at different bonding interfaces. In order to improve the defect detection capability of the SATIR technique, the possibility of merging the infrared thermography test data coming from SATIR with ultrasonic test data has been identified. The data merging of SATIR and ultrasonic results has been performed on Carbon Fiber Composite (CFC) monoblocks with calibrated defects, identified by their position and extension. These calibrated defects were realised by machining, with 'stop-off', or by a lack of CFC activation, the last two techniques representing a real defect more accurately. A batch of 56 samples was produced to cover every combination of interface location, defect position and extension, and defect fabrication method. The use of a data merging method based on Dempster-Shafer theory significantly improves detection sensitivity and the reliability of defect location and sizing.
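
Dempster's rule of combination, the core of the data-merging method mentioned above, can be sketched for a two-hypothesis frame ({defect, sound}, with 'theta' denoting ignorance); the mass values below are illustrative, not taken from the paper.

```python
def combine(m1, m2):
    """Dempster's rule for two mass functions over {'defect', 'sound'},
    with 'theta' the mass assigned to the whole frame (ignorance).
    K is the conflict: mass assigned to contradictory pairs."""
    hyps = ['defect', 'sound']
    K = sum(m1[a] * m2[b] for a in hyps for b in hyps if a != b)
    out = {}
    for h in hyps:
        # a singleton survives intersection with itself or with theta
        out[h] = (m1[h] * m2[h] + m1[h] * m2['theta']
                  + m1['theta'] * m2[h]) / (1.0 - K)
    out['theta'] = m1['theta'] * m2['theta'] / (1.0 - K)
    return out
```

When the infrared and ultrasonic sources agree, the combined belief in 'defect' exceeds either source's individual belief, which is how fusion sharpens the detection decision.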

  3. Multithread Face Recognition in Cloud

    Directory of Open Access Journals (Sweden)

    Dakshina Ranjan Kisku

    2016-01-01

    Full Text Available Faces are highly challenging and dynamic objects that are employed as biometric evidence in identity verification. Recently, biometric systems have proven to be essential security tools, in which bulk matching of enrolled people against watch lists is performed every day. To facilitate this process, organizations need to maintain large computing facilities. To minimize the burden of maintaining these costly facilities for enrollment and recognition, multinational companies can transfer this responsibility to third-party vendors who maintain cloud computing infrastructures for recognition. In this paper, we showcase cloud computing-enabled face recognition, which utilizes PCA-characterized face instances and reduces the number of invariant SIFT points that are extracted from each face. To achieve high interclass and low intraclass variance, a set of six PCA-characterized face instances is computed on the columns of each face image by varying the number of principal components. Extracted SIFT keypoints are fused using the sum and max fusion rules. A novel cohort selection technique is applied to increase the overall performance. The proposed prototype is tested on the BioID and FEI face databases, and the efficacy of the system is demonstrated by the obtained results. We also compare the proposed method with other well-known methods.
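
The sum and max fusion rules mentioned above can be sketched as follows, with min-max normalization applied per matcher before fusion (an illustrative sketch, not the paper's code):

```python
import numpy as np

def minmax(scores):
    """Rescale one matcher's scores to [0, 1]."""
    s = np.asarray(scores, dtype=float)
    return (s - s.min()) / (s.max() - s.min())

def fuse(score_lists, rule='sum'):
    """Fuse per-matcher score lists after min-max normalization.
    rule='sum' adds the normalized scores; rule='max' takes their maximum."""
    norm = np.stack([minmax(s) for s in score_lists])
    return norm.sum(0) if rule == 'sum' else norm.max(0)
```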

  4. The infrared spectrum of Jupiter

    Science.gov (United States)

    Ridgway, S. T.; Larson, H. P.; Fink, U.

    1976-01-01

    The principal characteristics of Jupiter's infrared spectrum are reviewed with emphasis on their significance for our understanding of the composition and temperature structure of the Jovian upper atmosphere. The spectral region from 1 to 40 microns divides naturally into three regimes: the reflecting region, thermal emission from below the cloud deck (5-micron hot spots), and thermal emission from above the clouds. Opaque parts of the Jovian atmosphere further subdivide these regions into windows, and each is discussed in the context of its past or potential contributions to our knowledge of the planet. Recent results are incorporated into a table of atmospheric composition and abundance which includes positively identified constituents as well as several which require verification. The limited available information about spatial variations of the infrared spectrum is presented

  5. Human body region enhancement method based on Kinect infrared imaging

    Science.gov (United States)

    Yang, Lei; Fan, Yubo; Song, Xiaowei; Cai, Wenjing

    2016-10-01

    To effectively improve the low contrast of the human body region in infrared images, a combination of several enhancement methods is used. Firstly, for the infrared images acquired by Kinect, an Optimal Contrast-Tone Mapping (OCTM) method with multiple iterations is applied to balance the contrast of low-luminosity infrared images and improve their overall contrast. Secondly, to enhance the human body region, a Level Set algorithm is employed to improve the contour edges of the body region. Finally, Laplacian Pyramid decomposition is adopted to further enhance the contour-improved human body region, while the background area is processed by bilateral filtering to improve the overall effect. With theoretical analysis and experimental verification, the results show that the proposed method can effectively enhance the human body region of such infrared images.
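
The Laplacian Pyramid decomposition used in the final step can be sketched in plain numpy; this shows only the decomposition/reconstruction machinery (with a box blur standing in for the usual Gaussian), not the paper's enhancement weighting, and the function names are illustrative.

```python
import numpy as np

def down(img):
    """2x decimation after a small box blur (stand-in for a Gaussian)."""
    p = np.pad(img, 1, mode='edge')
    blur = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2]
            + p[1:-1, 2:] + p[1:-1, 1:-1]) / 5.0
    return blur[::2, ::2]

def up(img, shape):
    """Nearest-neighbour upsampling back to `shape`."""
    out = np.repeat(np.repeat(img, 2, 0), 2, 1)
    return out[:shape[0], :shape[1]]

def laplacian_pyramid(img, levels=3):
    """Each level stores the detail lost by downsampling; the last entry
    is the coarse residual."""
    pyr, cur = [], img
    for _ in range(levels):
        nxt = down(cur)
        pyr.append(cur - up(nxt, cur.shape))
        cur = nxt
    pyr.append(cur)
    return pyr

def reconstruct(pyr):
    """Invert the decomposition by adding detail back level by level."""
    cur = pyr[-1]
    for lap in reversed(pyr[:-1]):
        cur = lap + up(cur, lap.shape)
    return cur
```

Enhancement would amplify the detail levels before reconstruction; as written, the round trip is exact.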

  6. A Feature Subtraction Method for Image Based Kinship Verification under Uncontrolled Environments

    DEFF Research Database (Denmark)

    Duan, Xiaodong; Tan, Zheng-Hua

    2015-01-01

    The most fundamental problem of local feature based kinship verification methods is that a local feature can capture the variations of environmental conditions and the differences between two persons having a kin relation, which can significantly decrease the performance. To address this problem...... the feature distance between face image pairs with kinship and maximize the distance between non-kinship pairs. Based on the subtracted feature, the verification is realized through a simple Gaussian based distance comparison method. Experiments on two public databases show that the feature subtraction method...

  7. Examining the examiners: an online eyebrow verification experiment inspired by FISWG

    NARCIS (Netherlands)

    Zeinstra, Christopher Gerard; Veldhuis, Raymond N.J.; Spreeuwers, Lieuwe Jan

    2015-01-01

    In forensic face comparison, one of the features taken into account are the eyebrows. In this paper, we investigate human performance on an eyebrow verification task. This task is executed twice by participants: a "best-effort" approach and an approach using features based on forensic knowledge. The

  8. Improving Shadow Suppression for Illumination Robust Face Recognition

    KAUST Repository

    Zhang, Wuming

    2017-10-13

    2D face analysis techniques, such as face landmarking, face recognition and face verification, are reasonably dependent on illumination conditions which are usually uncontrolled and unpredictable in the real world. An illumination robust preprocessing method thus remains a significant challenge in reliable face analysis. In this paper we propose a novel approach for improving lighting normalization through building the underlying reflectance model which characterizes interactions between skin surface, lighting source and camera sensor, and elaborates the formation of face color appearance. Specifically, the proposed illumination processing pipeline enables the generation of Chromaticity Intrinsic Image (CII) in a log chromaticity space which is robust to illumination variations. Moreover, as an advantage over most prevailing methods, a photo-realistic color face image is subsequently reconstructed which eliminates a wide variety of shadows whilst retaining the color information and identity details. Experimental results under different scenarios and using various face databases show the effectiveness of the proposed approach to deal with lighting variations, including both soft and hard shadows, in face recognition.

  9. Data merging of infrared and ultrasonic images for plasma facing components inspection

    Energy Technology Data Exchange (ETDEWEB)

    Richou, M. [CEA, IRFM, F-13108 Saint Paul-lez-Durance (France)], E-mail: marianne.richou@cea.fr; Durocher, A. [CEA, IRFM, F-13108 Saint Paul-lez-Durance (France); Medrano, M. [Association EURATOM - CIEMAT, Avda. Complutense 22, 28040 Madrid (Spain); Martinez-Ona, R. [Tecnatom, 28703 S. Sebastian de los Reyes, Madrid (Spain); Moysan, J. [LCND, Universite de la Mediterranee, F-13625 Aix-en-Provence (France); Riccardi, B. [Fusion For Energy, 08019 Barcelona (Spain)

    2009-06-15

    For steady-state magnetic thermonuclear fusion devices, which need large power-exhaust capability, actively cooled plasma facing components have been developed. In order to guarantee the integrity of these components over the required lifetime, their thermal and mechanical behaviour must be assessed. Before the procurement of the ITER Divertor, the examination of the heat-sink-to-armour joints with non-destructive techniques is an essential topic to be addressed. Defects may be localised at different bonding interfaces. In order to improve the defect detection capability of the SATIR technique, the possibility of merging the infrared thermography test data coming from SATIR with ultrasonic test data has been identified. The data merging of SATIR and ultrasonic results has been performed on Carbon Fiber Composite (CFC) monoblocks with calibrated defects, identified by their position and extension. These calibrated defects were realised by machining, with 'stop-off', or by a lack of CFC activation, the last two techniques representing a real defect more accurately. A batch of 56 samples was produced to cover every combination of interface location, defect position and extension, and defect fabrication method. The use of a data merging method based on Dempster-Shafer theory significantly improves detection sensitivity and the reliability of defect location and sizing.

  10. Verification of a Fissile Material Cut-off Treaty (FMCT): The Potential Role of the IAEA

    International Nuclear Information System (INIS)

    Chung, Jin Ho

    2016-01-01

    The objective of a future verification of an FMCT (Fissile Material Cut-off Treaty) is to deter and detect non-compliance with treaty obligations in a timely and non-discriminatory manner with regard to banning the production of fissile material for nuclear weapons or other nuclear devices. Since the International Atomic Energy Agency (IAEA) has already established the IAEA safeguards as a verification system mainly for Non-Nuclear Weapon States (NNWSs), it is expected that the IAEA's experience and expertise in this field will make a significant contribution to setting up a future treaty's verification regime. This paper is designed to explore the potential role of the IAEA in verifying the future treaty by analyzing the verification abilities of the Agency and the expected challenges. Furthermore, the concept of multilateral verification that could be facilitated by the IAEA will be examined as a measure of providing credible assurance of compliance with a future treaty. In this circumstance, it is necessary for the IAEA to be prepared to play a leading role in FMCT verification, as a form of multilateral verification, by taking advantage of its existing verification concepts, methods, and tools. Also, several challenges that the Agency faces today need to be overcome, including dealing with sensitive and proliferative information, attribution of fissile materials, lack of verification experience in military fuel-cycle facilities, and different attitudes and cultures towards verification between NWSs and NNWSs

  11. Verification of a Fissile Material Cut-off Treaty (FMCT): The Potential Role of the IAEA

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Jin Ho [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of)

    2016-05-15

    The objective of a future verification of an FMCT (Fissile Material Cut-off Treaty) is to deter and detect non-compliance with treaty obligations in a timely and non-discriminatory manner with regard to banning the production of fissile material for nuclear weapons or other nuclear devices. Since the International Atomic Energy Agency (IAEA) has already established the IAEA safeguards as a verification system mainly for Non-Nuclear Weapon States (NNWSs), it is expected that the IAEA's experience and expertise in this field will make a significant contribution to setting up a future treaty's verification regime. This paper is designed to explore the potential role of the IAEA in verifying the future treaty by analyzing the verification abilities of the Agency and the expected challenges. Furthermore, the concept of multilateral verification that could be facilitated by the IAEA will be examined as a measure of providing credible assurance of compliance with a future treaty. In this circumstance, it is necessary for the IAEA to be prepared to play a leading role in FMCT verification, as a form of multilateral verification, by taking advantage of its existing verification concepts, methods, and tools. Also, several challenges that the Agency faces today need to be overcome, including dealing with sensitive and proliferative information, attribution of fissile materials, lack of verification experience in military fuel-cycle facilities, and different attitudes and cultures towards verification between NWSs and NNWSs.

  12. Face averages enhance user recognition for smartphone security.

    Science.gov (United States)

    Robertson, David J; Kramer, Robin S S; Burton, A Mike

    2015-01-01

    Our recognition of familiar faces is excellent, and generalises across viewing conditions. However, unfamiliar face recognition is much poorer. For this reason, automatic face recognition systems might benefit from incorporating the advantages of familiarity. Here we put this to the test using the face verification system available on a popular smartphone (the Samsung Galaxy). In two experiments we tested the recognition performance of the smartphone when it was encoded with an individual's 'face-average'--a representation derived from theories of human face perception. This technique significantly improved performance for both unconstrained celebrity images (Experiment 1) and for real faces (Experiment 2): users could unlock their phones more reliably when the device stored an average of the user's face than when they stored a single image. This advantage was consistent across a wide variety of everyday viewing conditions. Furthermore, the benefit did not reduce the rejection of imposter faces. This benefit is brought about solely by consideration of suitable representations for automatic face recognition, and we argue that this is just as important as development of matching algorithms themselves. We propose that this representation could significantly improve recognition rates in everyday settings.
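
The idea of storing a face average rather than a single image can be sketched as follows; the images are assumed to be pre-aligned arrays, and the similarity score is a deliberately crude stand-in for the phone's actual matcher.

```python
import numpy as np

def face_average(aligned_images):
    """Pixel-wise mean of aligned face images of one person; averaging
    suppresses the image-specific variation that hurts single-image templates."""
    return np.mean(np.stack(aligned_images), axis=0)

def match_score(template, probe):
    """Negative mean squared error as a crude similarity score
    (higher = more similar)."""
    return -np.mean((template - probe) ** 2)
```

With per-image noise of variance v, the averaged template of n images carries only v/n of that noise, so it sits closer to any new probe of the same person than a single stored image does.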

  13. Multimodal Personal Verification Using Likelihood Ratio for the Match Score Fusion

    Directory of Open Access Journals (Sweden)

    Long Binh Tran

    2017-01-01

    Full Text Available In this paper, the authors present a novel personal verification system based on the likelihood ratio test for fusion of match scores from multiple biometric matchers (face, fingerprint, hand shape, and palm print). In the proposed system, multimodal features are extracted by Zernike Moments (ZM). After matching, the match scores from the multiple biometric matchers are fused based on the likelihood ratio test. A finite Gaussian mixture model (GMM) is used for estimating the genuine and impostor densities of match scores for personal verification. Our approach is also compared with several well-known approaches, such as the support vector machine and the sum rule with min-max normalization. The experimental results confirm that the proposed system achieves higher accuracy than these approaches and can thus be utilized in further applications related to person verification.
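    A minimal sketch of likelihood-ratio score fusion is shown below. It simplifies the paper's finite Gaussian mixture model to a single Gaussian per class per matcher, and the score statistics for the two matchers are invented for illustration; assuming independent matchers, the joint likelihood ratio factorises into a sum of per-matcher log-ratios, thresholded at zero (LR = 1).

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def fused_log_lr(scores, genuine_params, impostor_params):
    """Sum of per-matcher log-likelihood ratios log p(s|genuine)/p(s|impostor).

    Single-Gaussian class densities stand in for the paper's finite GMM;
    with independent matchers the joint likelihood ratio factorises.
    """
    total = 0.0
    for s, (mg, sg), (mi, si) in zip(scores, genuine_params, impostor_params):
        total += np.log(gaussian_pdf(s, mg, sg)) - np.log(gaussian_pdf(s, mi, si))
    return total

# Hypothetical score statistics for two matchers (e.g. face, fingerprint):
genuine = [(0.8, 0.1), (0.7, 0.15)]    # (mean, std) of genuine match scores
impostor = [(0.3, 0.1), (0.2, 0.15)]   # (mean, std) of impostor match scores

accept = fused_log_lr([0.75, 0.65], genuine, impostor) > 0.0
reject = fused_log_lr([0.35, 0.25], genuine, impostor) > 0.0
print(accept, reject)  # genuine-like scores accepted, impostor-like rejected
```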

  14. Physiology-based face recognition in the thermal infrared spectrum.

    Science.gov (United States)

    Buddharaju, Pradeep; Pavlidis, Ioannis T; Tsiamyrtzis, Panagiotis; Bazakos, Mike

    2007-04-01

    The current dominant approaches to face recognition rely on facial characteristics that are on or over the skin. Some of these characteristics have low permanency, can be altered, and their phenomenology varies significantly with environmental factors (e.g., lighting). Many methodologies have been developed to address these problems to various degrees. However, the current framework of face recognition research has a potential weakness due to its very nature. We present a novel framework for face recognition based on physiological information. The motivation behind this effort is to capitalize on the permanency of innate characteristics that are under the skin. To establish feasibility, we propose a specific methodology to capture facial physiological patterns using the bioheat information contained in thermal imagery. First, the algorithm delineates the human face from the background using a Bayesian framework. Then, it localizes the superficial blood vessel network using image morphology. The extracted vascular network produces contour shapes that are characteristic of each individual. The branching points of the skeletonized vascular network are referred to as Thermal Minutia Points (TMPs) and constitute the feature database. To render the method robust to facial pose variations, we collect five different pose images (center, midleft profile, left profile, midright profile, and right profile) for each subject stored in the database. During the classification stage, the algorithm first estimates the pose of the test image. Then, it matches the local and global TMP structures extracted from the test image with those of the corresponding pose images in the database. We have conducted experiments on a multipose database of thermal facial images collected in our laboratory, as well as on the time-gap database of the University of Notre Dame. The good experimental results show that the proposed methodology has merit, especially with respect to the problem of

  15. Synergies across verification regimes: Nuclear safeguards and chemical weapons convention compliance

    International Nuclear Information System (INIS)

    Kadner, Steven P.; Turpen, Elizabeth

    2001-01-01

    For example, just as cost-effective and readily applicable technologies can solve the problems faced by the nuclear safeguards community, these same technologies offer solutions for the CWC safeguards regime. This paper discusses similarities between nuclear and chemical weapons arms control in terms of verification methodologies and the potential for shared applications of safeguards technologies. (author)

  16. Digital correlation applied to recognition and identification faces

    International Nuclear Information System (INIS)

    Arroyave, S.; Hernandez, L. J.; Torres, Cesar; Matos, Lorenzo

    2009-01-01

    A system was developed that is capable of recognizing people's faces from their facial features. Images are captured automatically by the software after validating that a face is present in front of the camera lens; the digitized image is then compared with a database of previously captured images, so that the person can be recognized and finally identified. The contribution of the proposed system is that data acquisition is done in real time using a commercial USB webcam, offering an equally optimal but much more economical system. This tool is very effective in systems where security is of vital importance, providing a high degree of verification to entities that possess databases of people's faces. (Author)

  17. TECHNICAL DESIGN NOTE: Currency verification by a 2D infrared barcode

    Science.gov (United States)

    Schirripa Spagnolo, Giuseppe; Cozzella, Lorenzo; Simonetti, Carla

    2010-10-01

    Nowadays all the National Central Banks are continuously studying innovative anti-counterfeiting systems for banknotes. In this note, an innovative solution is proposed, which combines the potentiality of a hylemetric approach (methodology conceptually similar to biometry), based on notes' intrinsic characteristics, with a well-known and consolidated 2D barcode identification system. In particular, in this note we propose to extract from the banknotes a univocal binary control sequence (template) and insert an encrypted version of it in a barcode printed on the same banknote. For a more acceptable look and feel of a banknote, the superposed barcode can be stamped using IR ink that is visible to near-IR image sensors. This makes the banknote verification simpler.

  18. Advanced verification topics

    CERN Document Server

    Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan, Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat

    2011-01-01

    The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system-integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify them all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification, but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but building on it, to provide a way for SoC teams to stay productive and profitable.

  19. Drunk identification using far infrared imagery based on DCT features in DWT domain

    Science.gov (United States)

    Xie, Zhihua; Jiang, Peng; Xiong, Ying; Li, Ke

    2016-10-01

    Drunk driving is a serious threat to traffic safety, and automatic drunk driver identification is vital to improving it. This paper addresses automatic drunk driver detection in far infrared thermal images using holistic features. To improve the robustness of drunk driver detection, instead of traditional local pixel features, a holistic feature extraction method is proposed to obtain compact and discriminative features for infrared face drunk identification. The discrete cosine transform (DCT) in the discrete wavelet transform (DWT) domain is used to extract useful features from infrared face images, chosen for its high speed. The first six DCT coefficients, retained by "Z" (zigzag) scanning, are then used for drunk classification. Finally, an SVM is applied to classify drunk persons. Experimental results illustrate that the accuracy of the proposed infrared face drunk identification can reach 98.5% with high computational efficiency, so it can be applied in a real drunk driver detection system.
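    The holistic feature extraction step can be sketched as follows: a one-level Haar DWT approximation subband, a 2D DCT on that subband, and a zigzag ("Z") scan that keeps the first six coefficients. This is an illustrative reconstruction under those assumptions (the abstract does not specify the wavelet or decomposition level); the 8x8 random array merely stands in for a far-infrared face crop.

```python
import numpy as np

def haar_approx(img):
    """One-level Haar DWT approximation (LL) subband: scaled 2x2 block sums."""
    img = img.astype(np.float64)
    return (img[0::2, 0::2] + img[1::2, 0::2]
            + img[0::2, 1::2] + img[1::2, 1::2]) / 2.0

def dct2(block):
    """Orthonormal 2D DCT-II computed with an explicit DCT matrix."""
    n = block.shape[0]
    k, i = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c @ block @ c.T

def zigzag_features(coeffs, count=6):
    """First `count` coefficients of a square array in JPEG zigzag order."""
    n = coeffs.shape[0]
    order = sorted(((r, c) for r in range(n) for c in range(n)),
                   key=lambda rc: (rc[0] + rc[1],
                                   rc[0] if (rc[0] + rc[1]) % 2 else rc[1]))
    return np.array([coeffs[r, c] for r, c in order[:count]])

# Stand-in for an 8x8 far-infrared face crop.
rng = np.random.default_rng(1)
face = rng.uniform(0, 255, size=(8, 8))
features = zigzag_features(dct2(haar_approx(face)), count=6)
print(features.shape)  # (6,) compact holistic descriptor fed to the classifier
```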

  20. Discrimination between authentic and adulterated liquors by near-infrared spectroscopy and ensemble classification

    Science.gov (United States)

    Chen, Hui; Tan, Chao; Wu, Tong; Wang, Li; Zhu, Wanping

    2014-09-01

    Chinese liquor is one of the famous distilled spirits, and counterfeit liquor is becoming a serious problem in the market. In particular, aged liquor is facing a crisis of confidence because it is difficult for consumers to verify the marked age, which prompts unscrupulous traders to pass off low-grade liquors as high-grade ones. An ideal method for authenticity confirmation of liquors should be non-invasive, non-destructive and timely. The combination of near-infrared (NIR) spectroscopy with chemometrics proves to be a good way to meet these requirements. A new strategy is proposed for classification and verification of the adulteration of liquors by using NIR spectroscopy and chemometric classification, i.e., ensemble support vector machines (SVMs). Three measures, i.e., accuracy, sensitivity and specificity, were used for performance evaluation. The results confirmed that the strategy can serve as a screening tool for verifying adulteration of the liquor, that is, a prior step that conditions the sample for deeper analysis only when the proposed methodology returns a positive result for adulteration.
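    The three evaluation measures reduce to simple counts over the confusion matrix. The sketch below assumes a labelling convention (1 = adulterated, 0 = authentic) and uses invented predictions purely to show how accuracy, sensitivity, and specificity are computed.

```python
import numpy as np

def screening_metrics(y_true, y_pred):
    """Accuracy, sensitivity and specificity for a binary adulteration screen.

    Convention (an assumption): label 1 = adulterated, label 0 = authentic.
    Sensitivity is the fraction of adulterated samples flagged; specificity
    is the fraction of authentic samples passed.
    """
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    return {
        "accuracy": (tp + tn) / len(y_true),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

# Hypothetical screening outcome on 10 liquor samples:
m = screening_metrics([1, 1, 1, 1, 0, 0, 0, 0, 0, 0],
                      [1, 1, 1, 0, 0, 0, 0, 0, 0, 1])
print(m)  # accuracy 0.8, sensitivity 0.75, specificity ~0.83
```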

  1. Heterogeneous sharpness for cross-spectral face recognition

    Science.gov (United States)

    Cao, Zhicheng; Schmid, Natalia A.

    2017-05-01

    Matching images acquired in different electromagnetic bands remains a challenging problem. An example of this type of comparison is matching active or passive infrared (IR) images against a gallery of visible face images, known as cross-spectral face recognition. Among many unsolved issues is that of the quality disparity of the heterogeneous images. Images acquired in different spectral bands are of unequal image quality due to distinct imaging mechanisms, standoff distances, imaging environments, etc. To reduce the effect of quality disparity on recognition performance, one can manipulate images either to improve the quality of poor-quality images or to degrade the high-quality images to the quality level of their heterogeneous counterparts. To estimate the level of discrepancy in the quality of two heterogeneous images, a quality metric such as image sharpness is needed. It provides guidance on how much quality improvement or degradation is appropriate. In this work we consider sharpness as a relative measure of heterogeneous image quality. We propose a generalized definition of sharpness by first achieving image quality parity and then finding and building a relationship between the image qualities of two heterogeneous images; the new sharpness metric is therefore named heterogeneous sharpness. Image quality parity is achieved by experimentally finding the optimal cross-spectral face recognition performance where the quality of the heterogeneous images is varied using a Gaussian smoothing function with different standard deviations. The relationship is established using two models, one involving a regression model and the other a neural network. To train, test and validate the models, we use composite operators developed in our lab to extract features from heterogeneous face images and use the sharpness metric to evaluate the face image quality within each band. Images from three different spectral bands are used: visible light, near infrared, and short-wave infrared.
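    The quality-degradation side of the method can be sketched with a gradient-based sharpness proxy and separable Gaussian smoothing. The specific sharpness measure below (mean gradient magnitude) is an assumption for illustration, not the metric developed in the paper; it only demonstrates that increasing the smoothing standard deviation monotonically lowers measured sharpness, which is what the quality-parity sweep relies on.

```python
import numpy as np

def sharpness(img):
    """Mean gradient magnitude: a simple proxy for image sharpness."""
    gy, gx = np.gradient(img.astype(np.float64))
    return np.hypot(gx, gy).mean()

def gaussian_blur(img, sigma):
    """Separable Gaussian smoothing with reflective borders."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2))
    kernel /= kernel.sum()

    def conv(v):
        padded = np.pad(v, radius, mode="reflect")
        return np.convolve(padded, kernel, mode="valid")

    rows = np.apply_along_axis(conv, 1, img.astype(np.float64))
    return np.apply_along_axis(conv, 0, rows)

rng = np.random.default_rng(2)
img = rng.uniform(0, 255, size=(32, 32))   # stand-in for a face image
s0 = sharpness(img)
s1 = sharpness(gaussian_blur(img, 1.0))
s2 = sharpness(gaussian_blur(img, 2.0))
print(s0 > s1 > s2)  # wider smoothing kernels lower the measured sharpness
```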

  2. A Real-Time Angle- and Illumination-Aware Face Recognition System Based on Artificial Neural Network

    Directory of Open Access Journals (Sweden)

    Hisateru Kato

    2012-01-01

    Full Text Available Automatic authentication systems using biometric technology are becoming increasingly important with the increased need for person verification in our daily life. A few years back, fingerprint verification was done only in criminal investigations. Now fingerprints and face images are widely used in bank tellers, airports, and building entrances. Face images are easy to obtain, but successful recognition depends on proper orientation and illumination of the image compared to the one taken at registration time. Facial features change heavily with illumination and orientation angle, leading to increased false rejection as well as false acceptance, and registering face images for all possible angles and illuminations is impossible. In this work, we propose a memory-efficient way to register (store) multiple-angle and changing-illumination face image data, and a computationally efficient authentication technique using a multilayer perceptron (MLP). Though the MLP is trained using only a few registered images with different orientations, the generalization property of the MLP makes interpolation of features for intermediate orientation angles possible. The algorithm is further extended to include an illumination-robust authentication system. Results of extensive experiments verify the effectiveness of the proposed algorithm.

  3. A study of compositional verification based IMA integration method

    Science.gov (United States)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

    The rapid development of avionics systems is driving the application of integrated modular avionics (IMA) systems. While IMA improves avionics system integration, it also increases the complexity of system test, so the IMA system test method needs to be simplified. An IMA system provides a modular platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, failures in an IMA system are difficult to isolate. Therefore, the critical problem IMA system verification faces is how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily cover the whole system, but for a huge, integrated avionics system complete testing is hard to achieve. This paper therefore proposes applying compositional-verification theory to IMA system test, reducing the test process and improving efficiency, and consequently economizing the costs of IMA system integration.

  4. Application of Infrared Thermography in Power Distribution System

    Directory of Open Access Journals (Sweden)

    Anwer Ali Sahito

    2014-07-01

    Full Text Available The electricity sector of Pakistan is facing a daunting energy crisis. A generation deficit results in long periods of load shedding throughout the country. An aged distribution system, lack of maintenance and equipment failures cause long unplanned outages and frequent supply interruptions. HESCO (Hyderabad Electric Supply Company) is facing high technical losses, supply interruptions and financial loss due to transformer damage. Infrared thermography is a non-contact, safe and fast measure for distribution system inspection. In this paper, thermographic inspection of different distribution system equipment is carried out to identify possible developing faults. It is observed that IR (infrared) thermography is an effective measure for detecting developing fault conditions at an early stage to avoid unplanned outages.

  5. The Complete Gabor-Fisher Classifier for Robust Face Recognition

    Directory of Open Access Journals (Sweden)

    Štruc Vitomir

    2010-01-01

    Full Text Available Abstract This paper develops a novel face recognition technique called the Complete Gabor Fisher Classifier (CGFC). Different from existing techniques that use Gabor filters for deriving the Gabor face representation, the proposed approach does not rely solely on Gabor magnitude information but effectively uses features computed from Gabor phase information as well. It represents one of the few successful attempts found in the literature at combining Gabor magnitude and phase information for robust face recognition. The novelty of the proposed CGFC technique comes from (1) the introduction of a Gabor phase-based face representation and (2) the combination of the recognition technique using the proposed representation with classical Gabor magnitude-based methods into a unified framework. The proposed face recognition framework is assessed in a series of face verification and identification experiments performed on the XM2VTS, Extended YaleB, FERET, and AR databases. The results of the assessment suggest that the proposed technique clearly outperforms state-of-the-art face recognition techniques from the literature and that its performance is almost unaffected by the presence of partial occlusions of the facial area, changes in facial expression, or severe illumination changes.

  6. Multilateral disarmament verification

    International Nuclear Information System (INIS)

    Persbo, A.

    2013-01-01

    Non-governmental organisations, such as VERTIC (Verification Research, Training and Information Centre), can play an important role in the promotion of multilateral verification. Parties involved in negotiating nuclear arms accords are for the most part keen that such agreements include suitable and robust provisions for monitoring and verification. Progress in multilateral arms control verification is often painstakingly slow, but from time to time 'windows of opportunity' - moments where ideas, technical feasibility and political interests are aligned at both domestic and international levels - may occur, and we have to be ready, so the preparatory work is very important. In the context of nuclear disarmament, verification (whether bilateral or multilateral) entails an array of challenges, hurdles and potential pitfalls relating to national security, health, safety and even non-proliferation, so preparatory work is complex and time-consuming. A UK-Norway Initiative was established in order to investigate the role that a non-nuclear-weapon state such as Norway could potentially play in the field of nuclear arms control verification. (A.C.)

  7. Face identification with frequency domain matched filtering in mobile environments

    Science.gov (United States)

    Lee, Dong-Su; Woo, Yong-Hyun; Yeom, Seokwon; Kim, Shin-Hwan

    2012-06-01

    Face identification at a distance is very challenging since captured images are often degraded by blur and noise. Furthermore, computational resources and memory are often limited in mobile environments. Thus, it is very challenging to develop a real-time face identification system on a mobile device. This paper discusses face identification based on frequency-domain matched filtering in mobile environments. Face identification is performed by a linear or phase-only matched filter followed by sequential verification stages. The candidate window regions are determined by the major peaks of the linear or phase-only matched filtering output. The sequential stages comprise a skin-color test and an edge-mask filtering test, which verify the color and shape information of the candidate regions in order to remove false alarms. All algorithms are built on a mobile device using the Android platform. Preliminary results show that face identification of East Asian people can be performed successfully in mobile environments.
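    The phase-only matched filter stage can be sketched with FFT-based correlation: the filter keeps only the phase of the template spectrum, so the correlation output exhibits a sharp peak at the template's location. The scene and template below are synthetic stand-ins; peak localisation, not face matching, is all that is illustrated.

```python
import numpy as np

def phase_only_correlation(scene, template):
    """Correlate scene with the phase-only matched filter of template.

    H = conj(F) / |F| keeps only the template's spectral phase, which
    yields a much sharper correlation peak than a linear matched filter.
    """
    F = np.fft.fft2(template, s=scene.shape)
    H = np.conj(F) / (np.abs(F) + 1e-12)   # phase-only filter
    return np.real(np.fft.ifft2(np.fft.fft2(scene) * H))

rng = np.random.default_rng(3)
template = rng.uniform(0.0, 1.0, size=(16, 16))    # stand-in face template
scene = 0.01 * rng.uniform(0.0, 1.0, size=(64, 64))
scene[20:36, 30:46] += template                    # embed the "face" at (20, 30)

corr = phase_only_correlation(scene, template)
peak = tuple(int(v) for v in np.unravel_index(np.argmax(corr), corr.shape))
print(peak)  # → (20, 30): the major peak marks the candidate window region
```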

  8. Calibration procedures of the Tore-Supra infrared endoscopes

    Science.gov (United States)

    Desgranges, C.; Jouve, M.; Balorin, C.; Reichle, R.; Firdaouss, M.; Lipa, M.; Chantant, M.; Gardarein, J. L.; Saille, A.; Loarer, T.

    2018-01-01

    Five endoscopes equipped with infrared cameras working in the medium infrared range (3-5 μm) are installed on the controlled thermonuclear fusion research device Tore-Supra. These endoscopes monitor the surface temperature of the plasma-facing components to prevent their overheating. Signals delivered by the infrared cameras through the endoscopes are analysed and used, on the one hand, in a real-time feedback control loop acting on the plasma heating systems to decrease the surface temperature of the plasma-facing components when necessary and, on the other hand, for physics studies such as determination of the incoming heat flux. To fulfil these two roles, very accurate knowledge of the absolute surface temperatures is mandatory. Consequently, the infrared endoscopes must be calibrated through a very careful procedure, which means determining their transmission coefficients, a delicate operation. Methods to calibrate the infrared endoscopes during the shutdown period of the Tore-Supra machine are presented. As these do not allow the possible evolution of the transmittances during operation to be determined, an in-situ method is also presented. It permits validation of the calibration performed in the laboratory as well as monitoring of its evolution during machine operation. This is made possible by the use of the endoscope shutter and a dedicated plasma scenario developed to heat it. Possible improvements of this method are briefly discussed.

  9. Embedded software verification and debugging

    CERN Document Server

    Winterholer, Markus

    2017-01-01

    This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...

  10. Software verification for nuclear industry

    International Nuclear Information System (INIS)

    Wilburn, N.P.

    1985-08-01

    Why verification of software products throughout the software life cycle is necessary is considered. Concepts of verification, software verification planning, and some verification methodologies for products generated throughout the software life cycle are then discussed

  11. The design of verification regimes

    International Nuclear Information System (INIS)

    Gallagher, N.W.

    1991-01-01

    Verification of a nuclear agreement requires more than knowledge of relevant technologies and institutional arrangements. It also demands thorough understanding of the nature of verification and the politics of verification design. Arms control efforts have been stymied in the past because key players agreed to verification in principle, only to disagree radically over verification in practice. In this chapter, it is shown that the success and stability of arms control endeavors can be undermined by verification designs which promote unilateral rather than cooperative approaches to security, and which may reduce, rather than enhance, the security of both sides. Drawing on logical analysis and practical lessons from previous superpower verification experience, this chapter summarizes the logic and politics of verification and suggests implications for South Asia. The discussion begins by determining what properties all forms of verification have in common, regardless of the participants or the substance and form of their agreement. Viewing verification as the political process of making decisions regarding the occurrence of cooperation points to four critical components: (1) determination of principles, (2) information gathering, (3) analysis and (4) projection. It is shown that verification arrangements differ primarily in regards to how effectively and by whom these four stages are carried out

  12. Improved verification methods for safeguards verifications at enrichment plants

    International Nuclear Information System (INIS)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D.

    2009-01-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  13. Improved verification methods for safeguards verifications at enrichment plants

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, A.; Kane, S. C.; Bourva, L.; Poirier, S.; Loghin, N. E.; Langlands, D. [Department of Safeguards, International Atomic Energy Agency, Wagramer Strasse 5, A1400 Vienna (Austria)

    2009-07-01

    The International Atomic Energy Agency (IAEA) has initiated a coordinated research and development programme to improve its verification methods and equipment applicable to enrichment plants. The programme entails several individual projects to meet the objectives of the IAEA Safeguards Model Approach for Gas Centrifuge Enrichment Plants updated in 2006. Upgrades of verification methods to confirm the absence of HEU (highly enriched uranium) production have been initiated and, in particular, the Cascade Header Enrichment Monitor (CHEM) has been redesigned to reduce its weight and incorporate an electrically cooled germanium detector. Such detectors are also introduced to improve the attended verification of UF6 cylinders for the verification of the material balance. Data sharing of authenticated operator weighing systems such as accountancy scales and process load cells is also investigated as a cost-efficient and effective safeguards measure combined with unannounced inspections, surveillance and non-destructive assay (NDA) measurement. (authors)

  14. Iterative closest normal point for 3D face recognition.

    Science.gov (United States)

    Mohammadzade, Hoda; Hatzinakos, Dimitrios

    2013-02-01

    The common approach for 3D face recognition is to register a probe face to each of the gallery faces and then calculate the sum of the distances between their points. This approach is computationally expensive and sensitive to facial expression variation. In this paper, we introduce the iterative closest normal point method for finding the corresponding points between a generic reference face and every input face. The proposed correspondence finding method samples a set of points for each face, denoted as the closest normal points. These points are effectively aligned across all faces, enabling effective application of discriminant analysis methods for 3D face recognition. As a result, the expression variation problem is addressed by minimizing the within-class variability of the face samples while maximizing the between-class variability. As an important conclusion, we show that the surface normal vectors of the face at the sampled points contain more discriminatory information than the coordinates of the points. We have performed comprehensive experiments on the Face Recognition Grand Challenge database, which is presently the largest available 3D face database. We have achieved verification rates of 99.6 and 99.2 percent at a false acceptance rate of 0.1 percent for the all-versus-all and ROC III experiments, respectively; to the best of our knowledge, these error rates are seven and four times lower, respectively, than those of the best existing methods on this database.

  15. Physics Verification Overview

    Energy Technology Data Exchange (ETDEWEB)

    Doebling, Scott William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-12

    The purpose of the verification project is to: establish, through rigorous convergence analysis, that each ASC computational physics code correctly implements a set of physics models and algorithms (code verification); evaluate and analyze the uncertainties of code outputs associated with the choice of temporal and spatial discretization (solution or calculation verification); and develop and maintain the capability to expand and update these analyses on demand. This presentation describes project milestones.
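    Code verification by convergence analysis can be illustrated with the observed order of accuracy: if errors on two grids with refinement ratio r are e_c and e_f, then p = log(e_c/e_f)/log(r) should match the scheme's theoretical order. The sketch below applies this to composite trapezoidal integration (theoretical order 2); it is a generic illustration of the technique, not an ASC code.

```python
import numpy as np

def observed_order(e_coarse, e_fine, refinement=2.0):
    """Observed order of accuracy from errors on two grid resolutions."""
    return np.log(e_coarse / e_fine) / np.log(refinement)

def trapezoid_error(n):
    """|error| of the composite trapezoidal rule for the integral of
    sin(x) over [0, pi], whose exact value is 2."""
    x = np.linspace(0.0, np.pi, n + 1)
    y = np.sin(x)
    h = np.pi / n
    approx = h * (y[0] / 2 + y[1:-1].sum() + y[-1] / 2)
    return abs(approx - 2.0)

# Halving h should cut the error by ~4x for a correct second-order scheme.
p = observed_order(trapezoid_error(32), trapezoid_error(64))
print(round(p, 2))  # → 2.0
```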

  16. Random-Profiles-Based 3D Face Recognition System

    Directory of Open Access Journals (Sweden)

    Joongrock Kim

    2014-03-01

    Full Text Available In this paper, a novel nonintrusive three-dimensional (3D) face modeling system for random-profile-based 3D face recognition is presented. Although recent two-dimensional (2D) face recognition systems can achieve a reliable recognition rate under certain conditions, their performance is limited by internal and external changes, such as illumination and pose variation. To address these issues, 3D face recognition, which uses 3D face data, has recently received much attention. However, the performance of 3D face recognition highly depends on the precision of the acquired 3D face data, while also requiring more computational power and storage capacity than 2D face recognition systems. In this paper, we present a nonintrusive 3D face modeling system composed of a stereo vision system and an invisible near-infrared line laser, which can be directly applied to profile-based 3D face recognition. We further propose a novel random-profile-based 3D face recognition method that is memory-efficient and pose-invariant. The experimental results demonstrate that the reconstructed 3D face data consist of more than 50 k 3D point clouds and that a reliable recognition rate is achieved against pose variation.

  17. Feasibility of tropospheric water vapor profiling using infrared heterodyne differential absorption lidar

    Energy Technology Data Exchange (ETDEWEB)

    Grund, C.J.; Hardesty, R.M. [National Oceanic and Atmospheric Administration Environmental Technology Laboratory, Boulder, CO (United States)]; Rye, B.J. [Univ. of Colorado, Boulder, CO (United States)]

    1996-04-01

    The development and verification of realistic climate model parameterizations for clouds and net radiation balance, and the correction of other site sensor observations for interferences due to the presence of water vapor, are critically dependent on water vapor profile measurements. In this study, we develop system performance models and examine the potential of infrared differential absorption lidar (DIAL) to determine the concentration of water vapor.

  18. Reliability-Based Decision Fusion in Multimodal Biometric Verification Systems

    Directory of Open Access Journals (Sweden)

    Kryszczuk Krzysztof

    2007-01-01

    Full Text Available We present a methodology of reliability estimation in the multimodal biometric verification scenario. Reliability estimation has been shown to be an efficient and accurate way of predicting and correcting erroneous classification decisions in both unimodal (speech, face, online signature) and multimodal (speech and face) systems. While the initial research results indicate the high potential of the proposed methodology, the performance of reliability estimation in a multimodal setting has not been sufficiently studied or evaluated. In this paper, we demonstrate the advantages of using unimodal reliability information to perform efficient biometric fusion of two modalities. We further show the presented method to be superior to state-of-the-art multimodal decision-level fusion schemes. The experimental evaluation presented in this paper is based on the popular benchmarking bimodal BANCA database.

  19. Verification and disarmament

    Energy Technology Data Exchange (ETDEWEB)

    Blix, H. [IAEA, Vienna (Austria)

    1998-07-01

    The main features of the IAEA safeguards verification system that non-nuclear-weapon states parties to the NPT are obliged to accept are described. Verification activities and problems in Iraq and North Korea are discussed.

  20. Verification and disarmament

    International Nuclear Information System (INIS)

    Blix, H.

    1998-01-01

    The main features of the IAEA safeguards verification system that non-nuclear-weapon states parties to the NPT are obliged to accept are described. Verification activities and problems in Iraq and North Korea are discussed.

  1. Verification of RESRAD-RDD. (Version 2.01)

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Jing-Jy [Argonne National Lab. (ANL), Argonne, IL (United States); Flood, Paul E. [Argonne National Lab. (ANL), Argonne, IL (United States); LePoire, David [Argonne National Lab. (ANL), Argonne, IL (United States); Kamboj, Sunita [Argonne National Lab. (ANL), Argonne, IL (United States); Yu, Charley [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-09-01

    In this report, the results generated by RESRAD-RDD version 2.01 are compared with those produced by RESRAD-RDD version 1.7 for different scenarios with different sets of input parameters. RESRAD-RDD version 1.7 is spreadsheet-driven, performing calculations with Microsoft Excel spreadsheets. RESRAD-RDD version 2.01 revamped version 1.7 by using command-driven programs designed with Visual Basic.NET to direct calculations on data saved in a Microsoft Access database, and by reworking the graphical user interface (GUI) to provide more flexibility and choices in guideline derivation. Because version 1.7 and version 2.01 perform the same calculations, the comparison of their results serves as verification of both versions. The verification covered calculation results for 11 radionuclides included in both versions: Am-241, Cf-252, Cm-244, Co-60, Cs-137, Ir-192, Po-210, Pu-238, Pu-239, Ra-226, and Sr-90. First, all nuclide-specific data used in both versions were compared to ensure that they are identical. Then generic operational guidelines and measurement-based radiation doses or stay times associated with a specific operational guideline group were calculated with both versions using different sets of input parameters, and the results obtained with the same set of input parameters were compared. A total of 12 sets of input parameters were used for the verification, and the comparison was performed for each operational guideline group, from A to G, sequentially. The verification shows that RESRAD-RDD version 1.7 and RESRAD-RDD version 2.01 generate almost identical results; the slight differences can be attributed to differences in numerical precision between Microsoft Excel and Visual Basic.NET. RESRAD-RDD version 2.01 allows the selection of different units for use in reporting calculation results. Results in SI units were obtained and compared with the base results (in traditional units) used for comparison with version 1.7. The comparison shows that RESRAD

  2. Infrared surface temperature measurements for long pulse operation, and real time feedback control in Tore-Supra, an actively cooled Tokamak

    Energy Technology Data Exchange (ETDEWEB)

    Guilhem, D.; Adjeroud, B.; Balorin, C.; Buravand, Y.; Bertrand, B.; Bondil, J.L.; Desgranges, C.; Gauthier, E.; Lipa, M.; Messina, P.; Missirlian, M.; Mitteau, R.; Moulin, D.; Pocheau, C.; Portafaix, C.; Reichle, R.; Roche, H.; Saille, A.; Vallet, S

    2004-07-01

    Tore-Supra has a steady-state magnetic field produced by superconducting magnets and water-cooled plasma-facing components for high-performance long-pulse plasma discharges. When not actively cooled, plasma-facing components can only accumulate a limited amount of energy, since their temperature increases continuously (T proportional to √(t)) during the discharge until radiation cooling equals the incoming heat flux (T > 1800 K). Such an environment is found in most of today's tokamaks. In the present paper we report recent results from Tore-Supra, in particular the design of the new generation of infrared endoscopes used to measure the surface temperature of the plasma-facing components. The Tore-Supra infrared thermography system is composed of 7 infrared endoscopes and is described in detail; the new JET infrared thermography system is also presented, and an outlook on the ITER set of visible/infrared endoscopes is given. (authors)
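The radiative equilibrium the abstract refers to (temperature rising until radiation cooling balances the incoming heat flux) follows from the Stefan-Boltzmann law. A minimal sketch, with the heat flux and emissivity chosen for illustration rather than taken from the paper:

```python
def equilibrium_temperature(heat_flux, emissivity=0.8):
    """Surface temperature at which radiative cooling balances the
    incoming heat flux for an uncooled component:
    q = eps * sigma * T**4  ->  T = (q / (eps * sigma)) ** 0.25
    heat_flux in W/m^2, result in kelvin."""
    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
    return (heat_flux / (emissivity * SIGMA)) ** 0.25

# An illustrative 0.5 MW/m^2 flux on an uncooled tile settles
# near the >1800 K regime quoted in the abstract:
print(round(equilibrium_temperature(5e5)))  # ~1822
```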

  3. Infrared surface temperature measurements for long pulse operation, and real time feedback control in Tore-Supra, an actively cooled Tokamak

    International Nuclear Information System (INIS)

    Guilhem, D.; Adjeroud, B.; Balorin, C.; Buravand, Y.; Bertrand, B.; Bondil, J.L.; Desgranges, C.; Gauthier, E.; Lipa, M.; Messina, P.; Missirlian, M.; Mitteau, R.; Moulin, D.; Pocheau, C.; Portafaix, C.; Reichle, R.; Roche, H.; Saille, A.; Vallet, S.

    2004-01-01

    Tore-Supra has a steady-state magnetic field using super-conducting magnets and water-cooled plasma facing components for high performances long pulse plasma discharges. When not actively cooled, plasma-facing components can only accumulate a limited amount of energy since the temperature increase continuously (T proportional to √(t)) during the discharge until radiation cooling is equal to the incoming heat flux (T > 1800 K). Such an environment is found in most today Tokamaks. In the present paper we report the recent results of Tore-Supra, especially the design of the new generation of infrared endoscopes to measure the surface temperature of the plasma facing components. The Tore-Supra infrared thermography system is composed of 7 infrared endoscopes, this system is described in details in the paper, the new JET infrared thermography system is presented and some insights of the ITER set of visible/infrared endoscope is given. (authors)

  4. Resonance control of mid-infrared metamaterials using arrays of split-ring resonator pairs

    KAUST Repository

    Yue, Weisheng

    2016-01-11

    We present our design, fabrication and characterization of resonance-controllable metamaterials operating at mid-infrared wavelengths. The metamaterials are composed of pairs of back-to-back or face-to-face U-shaped split-ring resonators (SRRs). Transmission spectra of the metamaterials are measured using Fourier-transform infrared spectroscopy. The results show that the transmission resonance is dependent on the distance between the two SRRs in each SRR pair. The dips in the transmission spectrum shift to shorter wavelengths with increasing distance between the two SRRs for both the back-to-back and face-to-face SRR pairs. The position of the resonance dips in the spectrum can hence be controlled by the relative position of the SRRs. This mechanism of resonance control offers a promising way of developing metamaterials with tunability for optical filters and bio/chemical sensing devices in integrated nano-optics.

  5. Resonance control of mid-infrared metamaterials using arrays of split-ring resonator pairs

    KAUST Repository

    Yue, Weisheng; Wang, Zhihong; Whittaker, John; Schedin, Fredrik; Wu, Zhipeng; Han, Jiaguang

    2016-01-01

    We present our design, fabrication and characterization of resonance-controllable metamaterials operating at mid-infrared wavelengths. The metamaterials are composed of pairs of back-to-back or face-to-face U-shaped split-ring resonators (SRRs). Transmission spectra of the metamaterials are measured using Fourier-transform infrared spectroscopy. The results show that the transmission resonance is dependent on the distance between the two SRRs in each SRR pair. The dips in the transmission spectrum shift to shorter wavelengths with increasing distance between the two SRRs for both the back-to-back and face-to-face SRR pairs. The position of the resonance dips in the spectrum can hence be controlled by the relative position of the SRRs. This mechanism of resonance control offers a promising way of developing metamaterials with tunability for optical filters and bio/chemical sensing devices in integrated nano-optics.

  6. Optimal Face-Iris Multimodal Fusion Scheme

    Directory of Open Access Journals (Sweden)

    Omid Sharifi

    2016-06-01

    Full Text Available Multimodal biometric systems are considered a way to minimize the limitations raised by single traits. This paper proposes new schemes based on score level, feature level and decision level fusion to efficiently fuse face and iris modalities. Log-Gabor transformation is applied as the feature extraction method on face and iris modalities. At each level of fusion, different schemes are proposed to improve the recognition performance and, finally, a combination of schemes at different fusion levels constructs an optimized and robust scheme. In this study, CASIA Iris Distance database is used to examine the robustness of all unimodal and multimodal schemes. In addition, Backtracking Search Algorithm (BSA, a novel population-based iterative evolutionary algorithm, is applied to improve the recognition accuracy of schemes by reducing the number of features and selecting the optimized weights for feature level and score level fusion, respectively. Experimental results on verification rates demonstrate a significant improvement of proposed fusion schemes over unimodal and multimodal fusion methods.
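A common score-level fusion scheme of the kind evaluated above is a weighted sum of min-max-normalized matcher scores. The sketch below uses a fixed weight purely for illustration; in the paper the weights are optimized (via BSA), so the value of `w_face` and both helper functions are assumptions:

```python
def min_max_normalize(score, lo, hi):
    """Map a raw matcher score into [0, 1] given the matcher's
    observed score range [lo, hi]."""
    return (score - lo) / (hi - lo)

def fused_score(face_score, iris_score, w_face=0.5):
    """Weighted-sum score-level fusion of two normalized match
    scores; w_face is the (tunable) weight of the face modality."""
    return w_face * face_score + (1 - w_face) * iris_score

# Raw scores on different scales are normalized, then fused:
face = min_max_normalize(72.0, 0.0, 100.0)   # 0.72
iris = min_max_normalize(0.31, 0.0, 1.0)     # 0.31
print(fused_score(face, iris, w_face=0.7))   # ~0.597
```

The fused score is then compared against a single verification threshold, just as a unimodal score would be.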

  7. Mathematical description for the measurement and verification of energy efficiency improvement

    International Nuclear Information System (INIS)

    Xia, Xiaohua; Zhang, Jiangfeng

    2013-01-01

    Highlights: • A mathematical model for the measurement and verification problem is established. • Criteria to choose among the four measurement and verification options are given. • The optimal measurement and verification plan is defined. • Calculus of variations and optimal control can be further applied. - Abstract: Insufficient energy supply is a problem faced by many countries, and energy efficiency improvement is identified as the quickest and most effective solution to this problem. Many energy efficiency projects are therefore initiated to reach various energy saving targets. These energy saving targets need to be measured and verified, and in many countries such a measurement and verification (M and V) activity is guided by the International Performance Measurement and Verification Protocol (IPMVP). However, M and V is widely regarded as an inaccurate science: an engineering practice relying heavily on professional judgement. This paper presents a mathematical description of the energy efficiency M and V problem and thus casts the basic M and V concepts, propositions, techniques and methodologies into a scientific framework. For this purpose, a general description of energy system modeling is provided to facilitate the discussion, strict mathematical definitions for baseline and baseline adjustment are given, and M and V plan development is formulated as an M and V modeling problem. An optimal M and V plan is then obtained by solving a calculus-of-variations or, equivalently, an optimal control problem. This approach provides a fruitful source of research problems by which optimal M and V plans under various practical constraints can be determined. With the aid of linear control system models, this mathematical description also provides sufficient conditions for M and V practitioners to determine which of the four M and V options in IPMVP should be used in a practical M and V project.
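The baseline and baseline-adjustment concepts can be illustrated with the simplest possible M and V model: a linear baseline driven by one independent variable, with savings computed as the adjusted baseline minus measured use. The model form and numbers below are illustrative, not taken from the paper:

```python
def adjusted_baseline(base_intercept, base_slope, driver):
    """Linear baseline model E = a + b * driver (e.g. driver =
    heating degree-days), fitted on pre-retrofit data and then
    adjusted to the reporting-period conditions."""
    return base_intercept + base_slope * driver

def savings(base_a, base_b, driver, measured_energy):
    """IPMVP-style avoided energy: baseline adjusted to the
    reporting-period conditions, minus the measured use."""
    return adjusted_baseline(base_a, base_b, driver) - measured_energy

# Baseline: 100 kWh fixed + 2 kWh per degree-day; the reporting
# month had 300 degree-days and 550 kWh measured use:
print(savings(100.0, 2.0, 300.0, 550.0))  # 150.0 kWh avoided
```

The paper's contribution is to treat the choice of such a model, and of the whole M and V plan, as an optimization problem rather than a purely judgement-based exercise.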

  8. Development of optical ground verification method for μm to sub-mm reflectors

    Science.gov (United States)

    Stockman, Y.; Thizy, C.; Lemaire, P.; Georges, M.; Mazy, E.; Mazzoli, A.; Houbrechts, Y.; Rochus, P.; Roose, S.; Doyle, D.; Ulbrich, G.

    2017-11-01

    Large reflectors and antennas for the IR to mm wavelength range are being planned for many Earth observation and astronomical space missions, as well as for commercial communication satellites. Scientific observatories require large telescopes with precisely shaped reflectors for collecting the electromagnetic radiation from faint sources. The challenging tasks of on-ground testing are to achieve the required accuracy in the measurement of the reflector shapes and antenna structures and to verify their performance under simulated space conditions (vacuum, low temperatures). Due to the specific surface characteristics of reflectors operating in these spectral regions, standard optical metrology methods employed in the visible spectrum do not provide useful measurement results. The current state-of-the-art commercial metrology systems are not able to measure these types of reflectors because they have to face the measurement of shape and waviness over relatively large areas with a large deformation dynamic range encompassing a wide range of spatial frequencies. 3-D metrology (tactile coordinate measurement) machines are generally used during the manufacturing process. Unfortunately, these instruments cannot be used under the operational environmental conditions of the reflector. The application of standard visible-wavelength interferometric methods is very limited or impossible due to the large relative surface roughnesses involved. A small number of infrared interferometers have been commercially developed over the last 10 years, but their applications have also been limited due to poor dynamic range and the restricted spatial resolution of their detectors. These restrictions also limit the surface error slopes that can be captured and make the application of such instruments to surfaces manufactured using CFRP honeycomb technologies difficult or impossible. It has therefore been considered essential, from the viewpoint of supporting future ESA exploration missions, to

  9. Likelihood-ratio-based biometric verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    2002-01-01

    This paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that for single-user verification the likelihood ratio is optimal.

  10. Likelihood Ratio-Based Biometric Verification

    NARCIS (Netherlands)

    Bazen, A.M.; Veldhuis, Raymond N.J.

    The paper presents results on optimal similarity measures for biometric verification based on fixed-length feature vectors. First, we show that the verification of a single user is equivalent to the detection problem, which implies that, for single-user verification, the likelihood ratio is optimal.
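For fixed-length feature vectors, the likelihood-ratio score reduces to a sum of per-dimension log-density differences under a claimed-user model and a background (impostor) model. The Gaussian, diagonal-covariance models below are an illustrative simplification, not the paper's exact formulation:

```python
import math

def log_gauss(x, mean, var):
    """Log density of x under a 1-D Gaussian."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def log_likelihood_ratio(feat, user_mean, user_var, bg_mean, bg_var):
    """Log likelihood ratio between the claimed user's model and a
    background model, summed over the dimensions of a fixed-length
    feature vector. Accept when the sum exceeds a threshold chosen
    for the desired false-accept/false-reject trade-off."""
    return sum(
        log_gauss(x, mu, vu) - log_gauss(x, mb, vb)
        for x, mu, vu, mb, vb in zip(feat, user_mean, user_var, bg_mean, bg_var)
    )

llr = log_likelihood_ratio([1.0, 2.0], [1.0, 2.0], [0.5, 0.5],
                           [0.0, 0.0], [1.0, 1.0])
print(llr > 0)  # True: the sample fits the claimed user better
```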

  11. The data acquisition and interlock system for Tore Supra infrared imaging

    International Nuclear Information System (INIS)

    Moulin, D.; Balorin, C.; Buravand, Y.; Caulier, G.; Ducobu, L.; Guilhem, D.; Jouve, M.; Roche, H.

    2003-01-01

    The data acquisition for the infrared measurement system on Tore-Supra is a key element in ensuring the supervision of the new actively cooled plasma-facing components of the CIEL project. It will allow us to follow the thermal evolution of components of Tore-Supra, in particular the toroidal pumped limiter (LPT) (360 deg., 15 m long) and the five additional heating launchers. When fully installed, the infrared measurement system will be composed of 12 digital 16-bit infrared cameras. They cover a 100-1200 deg. C temperature range, and each picture has a resolution of 320x240 pixels with a 20 ms time resolution. The objectives of the data acquisition system are the real-time recording and analysis of each view element for further post-pulse analysis, in order to understand the physical phenomena, to ensure the supervision of the plasma-facing components, and to form part of the global feedback control system of Tore Supra.

  12. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification that seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking for a broad overview of the spectrum of formal verification techniques, as well as of approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. The book outlines both theoretical and practical issues.

  13. A Brazing Defect Detection Using an Ultrasonic Infrared Imaging Inspection

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jai Wan; Choi, Young Soo; Jung, Seung Ho; Jung, Hyun Kyu [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2007-10-15

    When high-energy ultrasound propagates through a solid body that contains a crack or a delamination, the two faces of the defect do not ordinarily vibrate in unison, and dissipative phenomena such as friction, rubbing and clapping between the faces convert some of the vibrational energy to heat. By combining this heating effect with infrared imaging, one can detect a subsurface defect in a material in real time. In this paper, real-time detection of brazing defects in thin Inconel plates using UIR (ultrasonic infrared imaging) technology is described. A low-frequency (23 kHz) ultrasonic transducer was used to infuse the welded Inconel plates with a short pulse of sound for 280 ms. The ultrasonic source has a maximum power of 2 kW. The surface temperature of the area under inspection is imaged by an infrared camera that is coupled to a fast frame grabber in a computer. Hot spots, small areas around the boundary between the two faces of the Inconel plates near the defective brazing points that heat up strongly, are observed, and a weak thermal signal is also observed at the defect position of the brazed plate. Using image-processing techniques such as background subtraction averaging and image enhancement by histogram equalization, the positions of defective brazing regions in the thin Inconel plates can be reliably located.
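Histogram equalization, one of the image-enhancement steps mentioned above, can be sketched as follows. This is the generic textbook form, not necessarily the exact variant used in the paper:

```python
def equalize(img, levels=256):
    """Histogram equalization: remap grey levels through the
    normalized cumulative histogram so that faint contrast (e.g. a
    weak thermal defect signal) spans the full intensity range.
    `img` is a list of equal-length rows of ints in [0, levels)."""
    flat = [v for row in img for v in row]
    n = len(flat)
    hist = [0] * levels
    for v in flat:
        hist[v] += 1
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = min(c for c in cdf if c > 0)
    if n == cdf_min:          # constant image: nothing to stretch
        return [row[:] for row in img]
    lut = [round((c - cdf_min) * (levels - 1) / (n - cdf_min)) for c in cdf]
    return [[lut[v] for v in row] for row in img]

# Four nearly identical grey levels are spread over the full range:
print(equalize([[50, 51], [52, 53]]))  # [[0, 85], [170, 255]]
```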

  14. Combining Dark Energy Survey Science Verification data with near-infrared data from the ESO VISTA Hemisphere Survey

    Energy Technology Data Exchange (ETDEWEB)

    Banerji, M.; Jouvel, S.; Lin, H.; McMahon, R. G.; Lahav, O.; Castander, F. J.; Abdalla, F. B.; Bertin, E.; Bosman, S. E.; Carnero, A.; Kind, M. C.; da Costa, L. N.; Gerdes, D.; Gschwend, J.; Lima, M.; Maia, M. A. G.; Merson, A.; Miller, C.; Ogando, R.; Pellegrini, P.; Reed, S.; Saglia, R.; Sanchez, C.; Allam, S.; Annis, J.; Bernstein, G.; Bernstein, J.; Bernstein, R.; Capozzi, D.; Childress, M.; Cunha, C. E.; Davis, T. M.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Doel, P.; Findlay, J.; Finley, D. A.; Flaugher, B.; Frieman, J.; Gaztanaga, E.; Glazebrook, K.; Gonzalez-Fernandez, C.; Gonzalez-Solares, E.; Honscheid, K.; Irwin, M. J.; Jarvis, M. J.; Kim, A.; Koposov, S.; Kuehn, K.; Kupcu-Yoldas, A.; Lagattuta, D.; Lewis, J. R.; Lidman, C.; Makler, M.; Marriner, J.; Marshall, J. L.; Miquel, R.; Mohr, J. J.; Neilsen, E.; Peoples, J.; Sako, M.; Sanchez, E.; Scarpine, V.; Schindler, R.; Schubnell, M.; Sevilla, I.; Sharp, R.; Soares-Santos, M.; Swanson, M. E. C.; Tarle, G.; Thaler, J.; Tucker, D.; Uddin, S. A.; Wechsler, R.; Wester, W.; Yuan, F.; Zuntz, J.

    2014-11-25

    We present the combination of optical data from the Science Verification phase of the Dark Energy Survey (DES) with near-infrared (NIR) data from the European Southern Observatory VISTA Hemisphere Survey (VHS). The deep optical detections from DES are used to extract fluxes and associated errors from the shallower VHS data. Joint seven-band (grizYJK) photometric catalogues are produced in a single 3 sq-deg dedicated camera field centred at 02h26m-04d36m where the availability of ancillary multiwavelength photometry and spectroscopy allows us to test the data quality. Dual photometry increases the number of DES galaxies with measured VHS fluxes by a factor of ~4.5 relative to a simple catalogue level matching and results in a ~1.5 mag increase in the 80 per cent completeness limit of the NIR data. Almost 70 per cent of DES sources have useful NIR flux measurements in this initial catalogue. Photometric redshifts are estimated for a subset of galaxies with spectroscopic redshifts and initial results, although currently limited by small number statistics, indicate that the VHS data can help reduce the photometric redshift scatter at both z < 0.5 and z > 1. We present example DES+VHS colour selection criteria for high-redshift luminous red galaxies (LRGs) at z ~ 0.7 as well as luminous quasars. Using spectroscopic observations in this field we show that the additional VHS fluxes enable a cleaner selection of both populations with <10 per cent contamination from galactic stars in the case of spectroscopically confirmed quasars and <0.5 per cent contamination from galactic stars in the case of spectroscopically confirmed LRGs. The combined DES+VHS data set, which will eventually cover almost 5000 sq-deg, will therefore enable a range of new science and be ideally suited for target selection for future wide-field spectroscopic surveys.

  15. Near-infrared photometric study of open star cluster IC 1805

    International Nuclear Information System (INIS)

    Sagar, R.; Yu, Q.Z.

    1990-01-01

    The JHK magnitudes of 29 stars in the region of open star cluster IC 1805 were measured. These, and the existing infrared and optical observations, indicate a normal interstellar extinction law in the direction of the cluster. Further, most of the early-type stars have near-infrared fluxes as expected from their spectral types. Patchy distribution of ionized gas and dust appears to be the cause of nonuniform extinction across the cluster face. 36 refs

  16. Can self-verification strivings fully transcend the self-other barrier? Seeking verification of ingroup identities.

    Science.gov (United States)

    Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B

    2009-12-01

    Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1-3), the findings also show that people strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.

  17. Formal Verification of Digital Protection Logic and Automatic Testing Software

    Energy Technology Data Exchange (ETDEWEB)

    Cha, S. D.; Ha, J. S.; Seo, J. S. [KAIST, Daejeon (Korea, Republic of)

    2008-06-15

    - Technical aspect • This project is intended to ensure that digital I and C software is safe and reliable; the project results help the software acquire a license. The software verification techniques resulting from this project can be used for digital NPP (nuclear power plant) I and C in the future. • This research presents many meaningful results of verification on digital protection logic and suggests an I and C software testing strategy. These results apply to verifying nuclear fusion devices, accelerators, nuclear waste management systems and nuclear medical devices that require dependable software and highly reliable controllers. Moreover, they can be used for military, medical or aerospace-related software. - Economical and industrial aspect • Since the safety of digital I and C software is highly important, it is essential for the software to be verified, but verification and license acquisition for digital I and C software are costly. This project benefits the domestic economy by using the verification and testing techniques introduced here instead of foreign techniques. • The operation rate of NPPs will rise when NPP safety-critical software is verified with an intelligent V and V tool. It is expected that such software will substitute for safety-critical software that currently depends wholly on foreign sources. Consequently, the result of this project has high commercial value, and recognition of the software development work will be able to spread to industrial circles. - Social and cultural aspect • People expect that nuclear power generation contributes to relieving environmental problems because it does not emit as much harmful air pollution as other forms of power generation. To give society more trust in nuclear power generation, we should convince people that NPPs are highly safe systems. From that point of view, we can present highly reliable I and C systems, proven by intelligent V and V techniques, as evidence

  18. Utterance Verification for Text-Dependent Speaker Recognition

    DEFF Research Database (Denmark)

    Kinnunen, Tomi; Sahidullah, Md; Kukanov, Ivan

    2016-01-01

    Text-dependent automatic speaker verification naturally calls for the simultaneous verification of speaker identity and spoken content. These two tasks can be achieved with automatic speaker verification (ASV) and utterance verification (UV) technologies. While both have been addressed previously...

  19. A Readout Integrated Circuit (ROIC) employing self-adaptive background current compensation technique for Infrared Focal Plane Array (IRFPA)

    Science.gov (United States)

    Zhou, Tong; Zhao, Jian; He, Yong; Jiang, Bo; Su, Yan

    2018-05-01

    A novel self-adaptive background current compensation circuit for infrared focal plane arrays is proposed in this paper; it can compensate the background current generated under different conditions. The designed double-threshold detection strategy estimates and eliminates the background currents, which significantly reduces the hardware overhead and improves the uniformity among pixels. In addition, the circuit is compatible with various categories of infrared thermo-sensitive materials. Test results from a 4 × 4 experimental chip show that the proposed circuit achieves high precision, wide applicability and a high degree of self-adaptation. Tape-out of the 320 × 240 readout circuit, as well as the bonding, encapsulation and imaging verification of the uncooled infrared focal plane array, has also been completed.
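A software analogue of the double-threshold compensation idea might look as follows. The acceptance band and the background-update rule here are assumptions for illustration only, not the chip's actual logic:

```python
def compensate(pixels, background, low, high):
    """Subtract a per-pixel background estimate, then use a
    double-threshold band to decide whether the residual is genuine
    signal (kept) or an out-of-band value that should be folded back
    into the background estimate (illustrative rule)."""
    out, new_bg = [], []
    for p, b in zip(pixels, background):
        residual = p - b
        if low <= residual <= high:       # plausible signal: keep it
            out.append(residual)
            new_bg.append(b)
        else:                             # out of band: update background
            out.append(0)
            new_bg.append(b + residual)
    return out, new_bg

# Pixel 1 carries a small signal; pixel 2's large residual is
# treated as background drift and absorbed into the estimate.
sig, bg = compensate([105, 300], [100, 100], low=0, high=50)
print(sig)  # [5, 0]
```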

  20. Illumination normalization based on simplified local binary patterns for a face verification system

    NARCIS (Netherlands)

    Tao, Q.; Veldhuis, Raymond N.J.

    2007-01-01

    Illumination normalization is a very important step in face recognition. In this paper we propose a simple implementation of Local Binary Patterns, which effectively reduces the variability caused by illumination changes. In combination with a likelihood ratio classifier, this illumination
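The core Local Binary Pattern operator (the standard 3 × 3 form; the paper's simplified variant may differ) can be sketched as follows:

```python
def lbp_image(img):
    """Local Binary Patterns: each interior pixel is replaced by an
    8-bit code, one bit per neighbour, set when the neighbour is >=
    the centre. Because the code depends only on the local intensity
    ordering, it is largely invariant to monotonic illumination
    changes. `img` is a list of equal-length rows of numbers."""
    h, w = len(img), len(img[0])
    # Neighbours clockwise from the top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    out = [[0] * (w - 2) for _ in range(h - 2)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            c = img[y][x]
            code = 0
            for bit, (dy, dx) in enumerate(offsets):
                if img[y + dy][x + dx] >= c:
                    code |= 1 << bit
            out[y - 1][x - 1] = code
    return out

# Doubling every pixel value (a global illumination change)
# leaves the LBP codes unchanged:
img = [[10, 20, 30], [40, 50, 60], [70, 80, 90]]
bright = [[2 * v for v in row] for row in img]
print(lbp_image(img) == lbp_image(bright))  # True
```

This ordering-based invariance is what makes LBP-style preprocessing attractive as an illumination-normalization step before verification.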

  1. Information Theory for Gabor Feature Selection for Face Recognition

    Directory of Open Access Journals (Sweden)

    Shen Linlin

    2006-01-01

    Full Text Available A discriminative and robust feature—kernel enhanced informative Gabor feature—is proposed in this paper for face recognition. Mutual information is applied to select a set of informative and nonredundant Gabor features, which are then further enhanced by kernel methods for recognition. Compared with one of the top-performing methods in the 2004 Face Verification Competition (FVC2004), our methods demonstrate a clear advantage over existing methods in accuracy, computation efficiency, and memory cost. The proposed method has been fully tested on the FERET database using the FERET evaluation protocol. Significant improvements on three of the test data sets are observed. Compared with the classical Gabor wavelet-based approaches using a huge number of features, our method requires less than 4 milliseconds to retrieve a few hundred features. Due to the substantially reduced feature dimension, only 4 seconds are required to recognize 200 face images. The paper also unifies different Gabor filter definitions and proposes a training sample generation algorithm to reduce the effects caused by the unbalanced number of samples available in different classes.

  2. Information Theory for Gabor Feature Selection for Face Recognition

    Science.gov (United States)

    Shen, Linlin; Bai, Li

    2006-12-01

    A discriminative and robust feature—kernel enhanced informative Gabor feature—is proposed in this paper for face recognition. Mutual information is applied to select a set of informative and nonredundant Gabor features, which are then further enhanced by kernel methods for recognition. Compared with one of the top-performing methods in the 2004 Face Verification Competition (FVC2004), our methods demonstrate a clear advantage over existing methods in accuracy, computation efficiency, and memory cost. The proposed method has been fully tested on the FERET database using the FERET evaluation protocol. Significant improvements on three of the test data sets are observed. Compared with the classical Gabor wavelet-based approaches using a huge number of features, our method requires less than 4 milliseconds to retrieve a few hundred features. Due to the substantially reduced feature dimension, only 4 seconds are required to recognize 200 face images. The paper also unifies different Gabor filter definitions and proposes a training sample generation algorithm to reduce the effects caused by the unbalanced number of samples available in different classes.
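The mutual-information-based selection step can be sketched as a greedy max-relevance ranking over discretized feature columns. The redundancy penalty that makes the selected set "informative and nonredundant" in the paper is omitted from this sketch:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information (in nats) between two discrete
    sequences of equal length."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum(
        (c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in pxy.items()
    )

def select_features(features, labels, k):
    """Rank candidate (discretized) feature columns by their MI with
    the class labels and keep the indices of the top k."""
    scored = sorted(range(len(features)),
                    key=lambda i: mutual_information(features[i], labels),
                    reverse=True)
    return scored[:k]

labels = [0, 0, 1, 1]
features = [
    [0, 0, 1, 1],   # perfectly informative about the labels
    [0, 1, 0, 1],   # independent of the labels
]
print(select_features(features, labels, 1))  # [0]
```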

  3. Forensic face recognition as a means to determine strength of evidence: A survey.

    Science.gov (United States)

    Zeinstra, C G; Meuwly, D; Ruifrok, A Cc; Veldhuis, R Nj; Spreeuwers, L J

    2018-01-01

    This paper surveys the literature on forensic face recognition (FFR), with a particular focus on the strength of evidence as used in a court of law. FFR is the use of biometric face recognition for several applications in forensic science. It includes scenarios of ID verification and open-set identification, investigation and intelligence, and evaluation of the strength of evidence. We present FFR from operational, tactical, and strategic perspectives. We discuss criticism of FFR and we provide an overview of research efforts from multiple perspectives that relate to the domain of FFR. Finally, we sketch possible future directions for FFR. Copyright © 2018 Central Police University.

  4. A Practitioners Perspective on Verification

    Science.gov (United States)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfying them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.
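As an example of the objective verification the abstract mentions, standard contingency-table scores for yes/no event forecasts can be computed as follows. The specific metrics are illustrative; the talk does not name the ones SWPC uses:

```python
def forecast_metrics(hits, misses, false_alarms):
    """Common objective verification scores from a 2x2 contingency
    table of event forecasts: probability of detection (POD, fraction
    of observed events that were forecast) and false alarm ratio
    (FAR, fraction of forecast events that did not occur)."""
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    return pod, far

# 40 forecast events observed, 10 missed, 20 false alarms:
pod, far = forecast_metrics(hits=40, misses=10, false_alarms=20)
print(pod, far)  # POD = 0.8, FAR ~ 0.33
```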

  5. High precision automated face localization in thermal images: oral cancer dataset as test case

    Science.gov (United States)

    Chakraborty, M.; Raman, S. K.; Mukhopadhyay, S.; Patsa, S.; Anjum, N.; Ray, J. G.

    2017-02-01

    Automated face detection is the pivotal step in computer vision aided facial medical diagnosis and biometrics. This paper presents an automatic, subject-adaptive framework for accurate face detection in the long-wave infrared spectrum on our oral cancer detection database, which consists of malignant, precancerous and normal subjects of varied age groups. Previous work on oral cancer detection using Digital Infrared Thermal Imaging (DITI) reveals that patients and normal subjects differ significantly in their facial thermal distribution, so it is a challenging task to formulate a completely adaptive framework that reliably localizes the face in such a subject-specific modality. Our model first extracts the most probable facial regions by minimum error thresholding, followed by adaptive methods that leverage the horizontal and vertical projections of the segmented thermal image. Additionally, the model incorporates domain knowledge by exploiting the temperature difference between strategic locations of the face. To the best of our knowledge, this is the first work on detecting faces in thermal facial images comprising both patients and normal subjects. Previous work on face detection has not specifically targeted automated medical diagnosis; the face bounding boxes returned by those algorithms are thus loose and not apt for further medical automation. Our algorithm significantly outperforms contemporary face detection algorithms in terms of the metrics commonly used to evaluate face detection accuracy. Since our method has been tested on a challenging dataset consisting of both patients and normal subjects of diverse age groups, it can be seamlessly adopted in any DITI-guided facial healthcare or biometric application.
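The projection-based localization idea can be sketched under strong simplifying assumptions: a fixed threshold stands in for minimum error thresholding, and none of the paper's subject-adaptive refinements are reproduced.

```python
import numpy as np

def localize_face(thermal, threshold):
    """Crude bounding box from row/column projections of a thresholded
    thermal image: keep the extent of rows and columns that contain any
    above-threshold (warm) pixels."""
    mask = thermal > threshold
    rows = mask.sum(axis=1)   # horizontal projection
    cols = mask.sum(axis=0)   # vertical projection
    ys = np.flatnonzero(rows)
    xs = np.flatnonzero(cols)
    if ys.size == 0 or xs.size == 0:
        return None           # nothing warm enough in the frame
    return (ys[0], xs[0], ys[-1], xs[-1])  # top, left, bottom, right

# Synthetic "thermal image": a warm 20x15 face region on a cool background.
img = np.full((64, 64), 25.0)      # background, degrees C
img[10:30, 20:35] = 34.0           # face region
box = localize_face(img, threshold=30.0)
```

On real data a single global threshold fails exactly because patients and controls have different thermal distributions, which is why the paper makes both the thresholding and the projection analysis adaptive.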

  6. Hybrid generative-discriminative approach to age-invariant face recognition

    Science.gov (United States)

    Sajid, Muhammad; Shafique, Tamoor

    2018-03-01

    Age-invariant face recognition is still a challenging research problem due to the complex aging process, which involves changes to facial tissues, skin, fat, muscles, and bones. Most related studies that have addressed the aging problem focus on either generative representations (aging simulation) or discriminative representations (feature-based approaches). Designing an appropriate hybrid approach that takes into account both the generative and the discriminative representations for age-invariant face recognition remains an open problem. We perform hybrid matching to achieve robustness to aging variations. The approach automatically segments the eyes, nose-bridge, and mouth regions, which are relatively less sensitive to aging variations than the rest of the facial regions. The aging variations of the age-sensitive facial parts are compensated using a demographic-aware generative model based on a bridged denoising autoencoder, while the age-insensitive facial parts are represented by pixel average vector-based local binary patterns. Deep convolutional neural networks are used to extract features of the age-sensitive and age-insensitive facial parts, and the two feature vectors are fused to obtain the recognition results. Extensive experimental results on the morphological face database II (MORPH II), the face and gesture recognition network (FG-NET) database, and the verification subset of the cross-age celebrity dataset (CACD-VS) demonstrate the effectiveness of the proposed method for age-invariant face recognition.

  7. Nuclear disarmament verification

    International Nuclear Information System (INIS)

    DeVolpi, A.

    1993-01-01

    Arms control treaties, unilateral actions, and cooperative activities - reflecting the defusing of East-West tensions - are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: converting nuclear-weapons production complexes, eliminating and monitoring nuclear-weapons delivery systems, disabling and destroying nuclear warheads, demilitarizing or non-military utilization of special nuclear materials, and inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  8. Verification Account Management System (VAMS)

    Data.gov (United States)

    Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...

  9. Quantum money with classical verification

    Energy Technology Data Exchange (ETDEWEB)

    Gavinsky, Dmitry [NEC Laboratories America, Princeton, NJ (United States)

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  10. Quantum money with classical verification

    International Nuclear Information System (INIS)

    Gavinsky, Dmitry

    2014-01-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  11. Particularities of Verification Processes for Distributed Informatics Applications

    Directory of Open Access Journals (Sweden)

    Ion IVAN

    2013-01-01

    Full Text Available This paper presents distributed informatics applications and the characteristics of their development cycle. It defines the concept of verification and identifies the differences from software testing. Particularities of the software testing and software verification processes are described, the verification steps and necessary conditions are presented, and the factors influencing verification quality are established. Software optimality verification is analyzed, and some metrics are defined for the verification process.

  12. FUSION DECISION FOR A BIMODAL BIOMETRIC VERIFICATION SYSTEM USING SUPPORT VECTOR MACHINE AND ITS VARIATIONS

    Directory of Open Access Journals (Sweden)

    A. Teoh

    2017-12-01

    Full Text Available This paper presents comparisons of fusion decision techniques based on the support vector machine and its variations for a bimodal biometric verification system that makes use of face images and speech utterances. The system is essentially constructed from a face expert, a speech expert and a fusion decision module. Each individual expert has been optimized to operate in automatic mode and designed for security access applications. The fusion decision schemes considered are linear, weighted Support Vector Machine (SVM) and linear SVM with quadratic transformation. The conditions tested include balanced and unbalanced conditions between the two experts, in order to obtain from these techniques the optimum fusion module best suited to the target application.
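A hedged sketch of the fusion step only: a tiny hinge-loss linear SVM, trained here by full-batch sub-gradient descent rather than a library solver, fuses the two experts' match scores into a single accept/reject decision. The score distributions below are invented for illustration and are not the paper's data.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.05, epochs=2000):
    """Minimal linear SVM: sub-gradient descent on the regularized hinge
    loss. X holds one (face_score, speech_score) pair per row; y is +/-1."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1                      # points violating the margin
        w -= lr * (lam * w - (y[viol, None] * X[viol]).sum(axis=0) / n)
        b -= lr * (-y[viol].sum() / n)
    return w, b

# Toy scores: genuine users score high with both experts, impostors low.
rng = np.random.default_rng(1)
genuine = rng.normal([0.8, 0.7], 0.05, size=(50, 2))
impostor = rng.normal([0.3, 0.2], 0.05, size=(50, 2))
X = np.vstack([genuine, impostor])
y = np.concatenate([np.ones(50), -np.ones(50)])
w, b = train_linear_svm(X, y)
accept = (X @ w + b) > 0    # fused accept/reject decision
```

The weighted-SVM and quadratic-transformation variants from the paper would change only the loss weighting and the feature map applied to the score pairs.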

  13. Nuclear test ban verification

    International Nuclear Information System (INIS)

    Chun, Kin-Yip

    1991-07-01

    This report describes verification and its rationale, the basic tasks of seismic verification, the physical basis for earthquake/explosion source discrimination and explosion yield determination, the technical problems pertaining to seismic monitoring of underground nuclear tests, the basic problem-solving strategy deployed by the forensic seismology research team at the University of Toronto, and the scientific significance of the team's research. The research carried out at the University of Toronto has two components: teleseismic verification using P wave recordings from the Yellowknife Seismic Array (YKA), and regional (close-in) verification using high-frequency Lg and Pn recordings from the Eastern Canada Telemetered Network. Major differences have been found in P wave attenuation among the propagation paths connecting the YKA listening post with seven active nuclear explosion testing areas in the world. Significant revisions have been made to previously published P wave attenuation results, leading to more interpretable nuclear explosion source functions. (11 refs., 12 figs.)

  14. Visual attention to dynamic faces and objects is linked to face processing skills: A combined study of children with autism and controls

    Directory of Open Access Journals (Sweden)

    Julia eParish-Morris

    2013-04-01

    Full Text Available Although the extant literature on face recognition skills in Autism Spectrum Disorder (ASD) shows clear impairments compared to typically developing controls (TDC) at the group level, the distribution of scores within ASD is broad. In the present research, we take a dimensional approach and explore how differences in social attention during an eye tracking experiment correlate with face recognition skills across ASD and TDC. Emotional discrimination and person identity perception face processing skills were assessed using the Let's Face It! Skills Battery in 110 children with and without ASD. Social attention was assessed using infrared eye gaze tracking during passive viewing of movies of facial expressions and objects displayed together on a computer screen. Face processing skills were significantly correlated with measures of attention to faces and with social skills as measured by the Social Communication Questionnaire. Consistent with prior research, children with ASD scored significantly lower on face processing skills tests but, unexpectedly, group differences in the amount of attention to faces (versus objects) were not found. We discuss possible methodological contributions to this null finding. We also highlight the importance of a dimensional approach for understanding the developmental origins of reduced face perception skills, and emphasize the need for longitudinal research to truly understand how social motivation and social attention influence the development of social perceptual skills.

  15. Java bytecode verification via static single assignment form

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian W.; Franz, Michael

    2008-01-01

    Java Virtual Machines (JVMs) traditionally perform bytecode verification by way of an iterative data-flow analysis. Bytecode verification is necessary to ensure type safety because temporary variables in the JVM are not statically typed. We present an alternative verification mechanism that transforms the bytecode into static single assignment form.

  16. Data combination of infrared thermography images and lock-in thermography images for NDE of plasma facing components

    International Nuclear Information System (INIS)

    Moysan, J.; Gueudre, C.; Corneloup, G.; Durocher, A.

    2006-01-01

    A pioneering activity in the field of actively cooled high heat flux plasma facing components (PFC) has been developed by CEA and European industry from the very beginning of the Tore Supra project. These components have been developed in order to enable a large power exhaust capability. The goal of this study is to improve the Non Destructive Evaluation (NDE) of these components. The difficulty encountered is the evaluation of the junction between the carbon and the metallic substrate, which becomes even more difficult when complex designs have to be implemented. A first NDE solution was based on the so-called SATIR test, in which tile surface temperatures are measured by infrared imaging during a thermal transient produced by hot/cold water flowing in the heat sink cooling channel. In order to improve the definition of acceptance rules for the PFCs, a second NDE method based on lock-in thermography was developed. In this work we present how the two resulting images can be combined in order to accept or reject a component. This prospective study makes it possible to improve the experimental setup and the definition of acceptance criteria. The experimental study was conducted on trial components for the Wendelstein 7-X stellarator. The conclusions will also influence future non-destructive evaluation projects dedicated to ITER. (orig.)

  17. Formal verification of algorithms for critical systems

    Science.gov (United States)

    Rushby, John M.; Von Henke, Friedrich

    1993-01-01

    We describe our experience with formal, machine-checked verification of algorithms for critical applications, concentrating on a Byzantine fault-tolerant algorithm for synchronizing the clocks in the replicated computers of a digital flight control system. First, we explain the problems encountered in unsynchronized systems and the necessity, and criticality, of fault-tolerant synchronization. We give an overview of one such algorithm, and of the arguments for its correctness. Next, we describe a verification of the algorithm that we performed using our EHDM system for formal specification and verification. We indicate the errors we found in the published analysis of the algorithm, and other benefits that we derived from the verification. Based on our experience, we derive some key requirements for a formal specification and verification system adequate to the task of verifying algorithms of the type considered. Finally, we summarize our conclusions regarding the benefits of formal verification in this domain, and the capabilities required of verification systems in order to realize those benefits.

  18. Challenges for effective WMD verification

    International Nuclear Information System (INIS)

    Andemicael, B.

    2006-01-01

    Effective verification is crucial to the fulfillment of the objectives of any disarmament treaty, not least as regards the proliferation of weapons of mass destruction (WMD). The effectiveness of the verification package depends on a number of factors, some inherent in the agreed structure and others related to the type of responses demanded by emerging challenges. The verification systems of three global agencies - the IAEA, the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO, currently the Preparatory Commission), and the Organization for the Prohibition of Chemical Weapons (OPCW) - share similarities in their broad objectives of confidence-building and deterrence by assuring members that rigorous verification would deter or otherwise detect non-compliance. Yet they are up against various constraints and other issues, both internal and external to the treaty regime. These constraints pose major challenges to the effectiveness and reliability of the verification operations. In the nuclear field, the IAEA safeguards process was the first to evolve incrementally from modest Statute beginnings to a robust verification system under the global Treaty on the Non-Proliferation of Nuclear Weapons (NPT). The nuclear non-proliferation regime is now being supplemented by a technology-intensive verification system of the nuclear test-ban treaty (CTBT), a product of over three decades of negotiation. However, there still remain fundamental gaps and loopholes in the regime as a whole, which tend to diminish the combined effectiveness of the IAEA and the CTBT verification capabilities. The three major problems are (a) the lack of universality of membership, essentially because of the absence of three nuclear weapon-capable States (India, Pakistan and Israel) from both the NPT and the CTBT, (b) the changes in US disarmament policy, especially in the nuclear field, and (c) the failure of the Conference on Disarmament to conclude a fissile material cut-off treaty.

  19. A Syntactic-Semantic Approach to Incremental Verification

    OpenAIRE

    Bianculli, Domenico; Filieri, Antonio; Ghezzi, Carlo; Mandrioli, Dino

    2013-01-01

    Software verification of evolving systems is challenging mainstream methodologies and tools. Formal verification techniques often conflict with the time constraints imposed by change management practices for evolving systems. Since changes in these systems are often local to restricted parts, an incremental verification approach could be beneficial. This paper introduces SiDECAR, a general framework for the definition of verification procedures, which are made incremental by the framework...

  20. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  1. Verification of Ceramic Structures

    Science.gov (United States)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).

  2. Is flow verification necessary

    International Nuclear Information System (INIS)

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper

  3. Person-Specific Face Detection in a Scene with Optimum Composite Filtering and Colour-Shape Information

    Directory of Open Access Journals (Sweden)

    Seokwon Yeom

    2013-01-01

    Full Text Available Face detection and recognition have wide applications in robot vision and intelligent surveillance. However, face identification at a distance is very challenging because long-distance images are often degraded by low resolution, blurring and noise. This paper introduces a person-specific face detection method that uses a nonlinear optimum composite filter and subsequent verification stages. The filter's optimum criterion minimizes the sum of the output energy generated by the input noise and the input image. The composite filter is trained with several training images under long-distance modelling. Candidate facial regions are provided by the filter's output over the input scene, and false alarms are eliminated by subsequent testing stages, which comprise skin colour and edge mask filtering tests. In the experiments, images captured by a webcam and a CCTV camera are processed to show the effectiveness of the person-specific face detection system at a long distance.
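As a rough illustration of the filtering stage only: a plain matched filter implemented via FFT cross-correlation, whose response peak marks the candidate face position. The paper's nonlinear optimum composite filter additionally minimizes the output energy from noise and from the input image, which this sketch does not reproduce.

```python
import numpy as np

def correlation_peak(scene, template):
    """Matched-filter response via zero-padded FFT cross-correlation;
    returns the (row, col) of the strongest response, i.e. the most
    likely top-left position of the template in the scene."""
    T = np.fft.fft2(template, scene.shape)
    response = np.real(np.fft.ifft2(np.fft.fft2(scene) * np.conj(T)))
    return np.unravel_index(np.argmax(response), response.shape)

# Synthetic scene: a bright 8x8 "face" patch at (20, 30) on a dark background.
scene = np.zeros((64, 64))
scene[20:28, 30:38] = 1.0
template = np.ones((8, 8))
peak = correlation_peak(scene, template)
```

In the paper the peak regions are only candidates; the skin-colour and edge-mask verification stages then discard false alarms.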

  4. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications. The Procedure Generation System and the Procedure Verification System can be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation.

  5. A Scalable Approach for Hardware Semiformal Verification

    OpenAIRE

    Grimm, Tomas; Lettnin, Djones; Hübner, Michael

    2018-01-01

    The current verification flow of complex systems uses different engines synergistically: virtual prototyping, formal verification, simulation, emulation and FPGA prototyping. However, none is able to verify a complete architecture. Furthermore, hybrid approaches aiming at complete verification use techniques that lower the overall complexity by increasing the abstraction level. This work focuses on the verification of complex systems at the RT level to handle the hardware peculiarities. Our r...

  6. Mid-infrared volume diffraction gratings in IG2 chalcogenide glass: fabrication, characterization, and theoretical verification

    Science.gov (United States)

    Butcher, Helen L.; MacLachlan, David G.; Lee, David; Brownsword, Richard A.; Thomson, Robert R.; Weidmann, Damien

    2018-02-01

    Ultrafast laser inscription (ULI) has previously been employed to fabricate volume diffraction gratings in chalcogenide glasses, which operate in transmission mode in the mid-infrared spectral region. Prior gratings were manufactured for applications in astrophotonics, at wavelengths around 2.5 μm. Rugged volume gratings also have potential use in remote atmospheric sensing and molecular spectroscopy; for these applications, longer-wavelength operation is required to coincide with atmospheric transparency windows (3-5 μm) and intense ro-vibrational molecular absorption bands. We report on ULI gratings inscribed in IG2 chalcogenide glass, enabling access to the full 3-5 μm window. High-resolution broadband spectral characterization of the fabricated gratings was performed using a Fourier transform spectrometer. The zeroth-order transmission was characterized to derive the diffraction efficiency into higher orders, up to the fourth order in the case of gratings optimized for first-order diffraction at 3 μm. The outcomes imply that ULI in IG2 is well suited for the fabrication of volume gratings in the mid-infrared, provided the impact of the ULI fabrication parameters on the grating properties is well understood. To develop this understanding, grating modeling was conducted. Parameters studied include grating thickness, refractive index modification, and the aspect ratio of the modulation achieved by ULI. Knowledge of the contribution and sensitivity of these parameters was used to inform the design of a 4.3 μm grating expected to achieve > 95% first-order efficiency. We will also present the characterization of these latest mid-infrared diffraction gratings in IG2.
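For volume phase gratings of this kind, the dependence of first-order efficiency on thickness and index modulation is commonly estimated with Kogelnik's two-wave coupled-wave result for a Bragg-matched transmission grating. A sketch of that relation follows; the 300 μm thickness and the resulting index modulation are made-up values for illustration, not parameters from the paper.

```python
import math

def kogelnik_efficiency(dn, d_um, wavelength_um, theta=0.0):
    """First-order diffraction efficiency of a Bragg-matched thick
    transmission phase grating (Kogelnik two-wave coupled-wave theory):
    eta = sin^2(pi * dn * d / (lambda * cos(theta)))."""
    nu = math.pi * dn * d_um / (wavelength_um * math.cos(theta))
    return math.sin(nu) ** 2

# Index modulation that maximizes first-order efficiency at 4.3 um for a
# hypothetical 300-um-thick grating: dn = lambda / (2 * d).
dn_opt = 4.3 / (2 * 300)
eta = kogelnik_efficiency(dn_opt, 300, 4.3)
```

The trade-off the abstract describes falls out of this form directly: a thicker grating needs a proportionally smaller ULI-induced index change to reach the same efficiency.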

  7. Survey on Offline Finger Print Verification System

    NARCIS (Netherlands)

    Suman, R.; Kaur, R.

    2012-01-01

    In fingerprint verification, "verification" implies a user matching a fingerprint against a single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioural (signature verification, keystroke dynamics, etc.) and physiological.

  8. In-core Instrument Subcritical Verification (INCISV) - Core Design Verification Method - 358

    International Nuclear Information System (INIS)

    Prible, M.C.; Heibel, M.D.; Conner, S.L.; Sebastiani, P.J.; Kistler, D.P.

    2010-01-01

    According to the standard on reload startup physics testing, ANSI/ANS 19.6.1, a plant must verify that the constructed core behaves sufficiently close to the designed core to confirm that the various safety analyses bound the actual behavior of the plant. A large portion of this verification must occur before the reactor operates at power. The INCISV Core Design Verification Method uses the unique characteristics of a Westinghouse Electric Company fixed in-core self-powered detector design to perform core design verification after a core reload, before power operation. A vanadium self-powered detector that spans the length of the active fuel region is capable of confirming the required core characteristics prior to power ascension: reactivity balance, shutdown margin, temperature coefficient and power distribution. Using a detector element that spans the length of the active fuel region inside the core provides a signal of total integrated flux. Measuring the integrated flux distributions and changes at various rodded conditions and plant temperatures, and comparing them to predicted flux levels, validates all necessary core design characteristics. INCISV eliminates the dependence on the various corrections and assumptions between the ex-core detectors and the core used by traditional physics testing programs. This program also eliminates the need for special rod maneuvers, which are infrequently performed by plant operators during typical core design verification testing, and allows for safer startup activities. (authors)

  9. The case for a United Nations verification agency. Disarmament under effective international control. Working paper 26

    International Nuclear Information System (INIS)

    Dorn, A.W.

    1990-07-01

    It is now universally recognized that arms control treaties should be effectively verified. The most objective, flexible and cost-effective means to verify the majority of multilateral treaties would be through a new agency under the United Nations. As a cooperative international effort to develop both the technology and the political framework for arms control verification, a United Nations verification agency (UNVA) would speed up and help secure the disarmament process by: verifying a number of existing and future treaties; investigating alleged breaches of treaties; and certifying, upon request, that voluntary arms control and confidence-building measures have been carried out. This paper presents the case for such a proposal, outlines a possible institutional configuration, considers the possibilities for growth and discusses the challenges facing the establishment of such an agency. (author). 16 refs., 1 tab

  10. The case for a United Nations verification agency. Disarmament under effective international control. Working paper 26

    Energy Technology Data Exchange (ETDEWEB)

    Dorn, A W

    1990-07-01

    It is now universally recognized that arms control treaties should be effectively verified. The most objective, flexible and cost-effective means to verify the majority of multilateral treaties would be through a new agency under the United Nations. As a cooperative international effort to develop both the technology and the political framework for arms control verification, a United Nations verification agency (UNVA) would speed up and help secure the disarmament process by: verifying a number of existing and future treaties; investigating alleged breaches of treaties; and certifying, upon request, that voluntary arms control and confidence-building measures have been carried out. This paper presents the case for such a proposal, outlines a possible institutional configuration, considers the possibilities for growth and discusses the challenges facing the establishment of such an agency. (author). 16 refs., 1 tab.

  11. Selection of the optimal hard facing (HF) technology of damaged forging dies based on cooling time t8/5

    Directory of Open Access Journals (Sweden)

    D. Arsić

    2016-01-01

    Full Text Available In exploitation, forging dies are exposed to heating to very high temperatures and to variable compressive, impact and shear loads. In this paper, the reparatory hard facing of damaged forging dies is considered. The objective was to establish the optimal reparatory technology based on the cooling time t8/5. The adopted technology was verified by investigating the microstructure of the hard-faced layers and by measuring hardness within the characteristic zones of the welded layers. The cooling time was determined theoretically, numerically and experimentally.

  12. Fingerprint verification prediction model in hand dermatitis.

    Science.gov (United States)

    Lee, Chew K; Chang, Choong C; Johor, Asmah; Othman, Puwira; Baba, Roshidah

    2015-07-01

    Hand dermatitis-associated fingerprint changes are a significant problem and affect fingerprint verification processes. This study was done to develop a clinically useful prediction model for fingerprint verification in patients with hand dermatitis. This was a case-control study involving 100 patients with hand dermatitis. All patients verified their thumbprints against their identity card. Registered fingerprints were randomized into a model derivation and a model validation group. The predictive model was derived using multiple logistic regression. Validation was done using the goodness-of-fit test. The fingerprint verification prediction model consists of a major criterion (fingerprint dystrophy area of ≥ 25%) and two minor criteria (long horizontal lines and long vertical lines). The presence of the major criterion predicts that verification will almost always fail, while the presence of both minor criteria or of one minor criterion predicts a high or low risk of verification failure, respectively. When none of the criteria are met, the fingerprint almost always passes the verification. The area under the receiver operating characteristic curve was 0.937, and the goodness-of-fit test showed agreement between the observed and expected numbers (P = 0.26). The derived fingerprint verification failure prediction model is validated and highly discriminatory in predicting the risk of fingerprint verification failure in patients with hand dermatitis. © 2014 The International Society of Dermatology.
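    The decision rule described in the abstract (one major criterion, two minor criteria) can be sketched directly. This is an illustrative reconstruction from the abstract's wording, not the authors' published logistic-regression model; the function name and risk labels are my own:

```python
def predict_verification_risk(dystrophy_pct, long_horizontal, long_vertical):
    """Risk of fingerprint verification failure per the derived criteria.

    Thresholds follow the abstract (major criterion: dystrophy area >= 25%;
    minor criteria: long horizontal/vertical lines); the function itself is
    an illustrative sketch, not the authors' fitted model.
    """
    if dystrophy_pct >= 25:              # major criterion: almost always fails
        return "fail"
    minors = int(long_horizontal) + int(long_vertical)
    if minors == 2:                      # both minor criteria: high risk
        return "high risk"
    if minors == 1:                      # one minor criterion: low risk
        return "low risk"
    return "pass"                        # no criteria met: almost always passes
```

    For example, a patient with 30% dystrophy is predicted to fail regardless of the minor criteria.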

  13. GARLIC — A general purpose atmospheric radiative transfer line-by-line infrared-microwave code: Implementation and evaluation

    International Nuclear Information System (INIS)

    Schreier, Franz; Gimeno García, Sebastián; Hedelt, Pascal; Hess, Michael; Mendrok, Jana; Vasquez, Mayte; Xu, Jian

    2014-01-01

    A suite of programs for high resolution infrared-microwave atmospheric radiative transfer modeling has been developed with emphasis on efficient and reliable numerical algorithms and a modular approach appropriate for simulation and/or retrieval in a variety of applications. The Generic Atmospheric Radiation Line-by-line Infrared Code — GARLIC — is suitable for arbitrary observation geometry, instrumental field-of-view, and line shape. The core of GARLIC's subroutines constitutes the basis of forward models used to implement inversion codes to retrieve atmospheric state parameters from limb and nadir sounding instruments. This paper briefly introduces the physical and mathematical basics of GARLIC and its descendants and continues with an in-depth presentation of various implementation aspects: An optimized Voigt function algorithm combined with a two-grid approach is used to accelerate the line-by-line modeling of molecular cross sections; various quadrature methods are implemented to evaluate the Schwarzschild and Beer integrals; and Jacobians, i.e. derivatives with respect to the unknowns of the atmospheric inverse problem, are implemented by means of automatic differentiation. For an assessment of GARLIC's performance, a comparison of the quadrature methods for solution of the path integral is provided. Verification and validation are demonstrated using intercomparisons with other line-by-line codes and comparisons of synthetic spectra with spectra observed on Earth and from Venus. - Highlights: • High resolution infrared-microwave radiative transfer model. • Discussion of algorithmic and computational aspects. • Jacobians by automatic/algorithmic differentiation. • Performance evaluation by intercomparisons, verification, validation
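    The Beer integral mentioned above reduces, for a single path, to computing an optical depth and exponentiating. A toy illustration using trapezoidal quadrature (one of several quadrature families a line-by-line code might offer); the grid, units and coefficient values are made up, and this is not GARLIC source code:

```python
import math

import numpy as np

def transmittance(k, s):
    """Beer-Lambert transmittance T = exp(-tau), where the optical depth
    tau = integral of k(s) ds is evaluated with trapezoidal quadrature.
    Illustrative stand-in for the quadrature options compared in the paper."""
    tau = float(np.sum(0.5 * (k[1:] + k[:-1]) * np.diff(s)))
    return math.exp(-tau)

# Constant absorption: tau = k0 * L exactly, so T = exp(-k0 * L).
s = np.linspace(0.0, 10.0, 101)     # path coordinate [km] (hypothetical)
k = np.full_like(s, 0.2)            # absorption coefficient [1/km]
T = transmittance(k, s)             # approx exp(-2)
```

    For a constant integrand the trapezoid rule is exact, which makes this a convenient self-check before moving to realistic, sharply peaked line profiles.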

  14. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  15. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  16. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I

    2005-01-01

    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community.
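    A dose/distance-to-agreement criterion such as 3%/3 mm is commonly evaluated with a gamma index. A minimal 1-D sketch of that idea; the synthetic profile, grid and tolerances are illustrative only (clinical tools operate on interpolated 2-D/3-D dose grids), and this is not the paper's analysis parameter:

```python
import numpy as np

def gamma_1d(x, d_ref, d_eval, dose_tol=0.03, dist_tol=3.0):
    """Simplified 1-D gamma index for a dose-difference / distance-to-agreement
    test (e.g. 3%/3 mm). A point passes where gamma <= 1. Illustrative only."""
    d_max = d_ref.max()
    gam = np.empty_like(d_ref)
    for i in range(len(x)):
        dd = (d_eval - d_ref[i]) / (dose_tol * d_max)   # dose-difference term
        dx = (x - x[i]) / dist_tol                      # distance term [mm]
        gam[i] = np.sqrt(dd ** 2 + dx ** 2).min()       # search over positions
    return gam

x = np.linspace(0.0, 100.0, 201)               # position [mm] (hypothetical)
planned = np.exp(-((x - 50.0) / 20.0) ** 2)    # synthetic dose profile
measured = 1.01 * planned                      # 1% global overdose
passed = bool((gamma_1d(x, planned, measured) <= 1.0).all())
```

    A uniform 1% dose error stays well inside a 3% tolerance, so every point passes in this toy case.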

  17. Post-silicon and runtime verification for modern processors

    CERN Document Server

    Wagner, Ilya

    2010-01-01

    The purpose of this book is to survey the state of the art and evolving directions in post-silicon and runtime verification. The authors start by giving an overview of the state of the art in verification, particularly current post-silicon methodologies in use in the industry, both for the domain of processor pipeline design and for memory subsystems. They then dive into the presentation of several new post-silicon verification solutions aimed at boosting the verification coverage of modern processors, dedicating several chapters to this topic. The presentation of runtime verification solution

  18. Verification and the safeguards legacy

    International Nuclear Information System (INIS)

    Perricos, Demetrius

    2001-01-01

    A number of inspection or monitoring systems throughout the world over the last decades have been structured drawing upon the IAEA experience of setting up and operating its safeguards system. The first global verification system was born with the creation of the IAEA safeguards system, about 35 years ago. With the conclusion of the NPT in 1968, inspections were to be performed under safeguards agreements, concluded directly between the IAEA and non-nuclear weapon states parties to the Treaty. The IAEA developed the safeguards system within the limitations reflected in the Blue Book (INFCIRC 153), such as limitations of routine access by the inspectors to 'strategic points', including 'key measurement points', and the focusing of verification on declared nuclear material in declared installations. The system, based as it was on nuclear material accountancy, was expected to detect a diversion of nuclear material with a high probability and within a given time, and therefore also to determine that there had been no diversion of nuclear material from peaceful purposes. The most vital element of any verification system is the inspector. Technology can assist but cannot replace the inspector in the field. Their experience, knowledge, intuition and initiative are invaluable factors contributing to the success of any inspection regime. The IAEA inspectors are however not part of an international police force that will intervene to prevent a violation taking place. To be credible they should be technically qualified, with substantial experience in industry or in research and development before they are recruited. An extensive training program has to make sure that the inspectors retain their professional capabilities and provides them with new skills. Over the years, the inspectors and through them the safeguards verification system gained experience in: organization and management of large teams; examination of records and evaluation of material balances

  19. The application of Near Infrared Reflectance Spectroscopy (NIRS) for the quantitative analysis of hydrocortisone in primary materials

    OpenAIRE

    A. PITTAS; C. SERGIDES; K. NIKOLICH

    2001-01-01

    Near Infrared Reflectance Spectroscopy (NIRS), coupled with fiber optic probes, has been shown to be a quick and reliable analytical tool for quality assurance and quality control in the pharmaceutical industry, both for verifications of raw materials and quantification of the active ingredients in final products. In this paper, a typical pharmaceutical product, hydrocortisone sodium succinate, is used as an example for the application of NIR spectroscopy for quality control. In order to deve...

  20. RESRAD-BUILD verification

    International Nuclear Information System (INIS)

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-01

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified

  1. Spent fuel verification options for final repository safeguards in Finland. A study on verification methods, their feasibility and safety aspects

    International Nuclear Information System (INIS)

    Hautamaeki, J.; Tiitta, A.

    2000-12-01

    The verification possibilities of the spent fuel assemblies from the Olkiluoto and Loviisa NPPs and the fuel rods from the research reactor of VTT are contemplated in this report. The spent fuel assemblies have to be verified at the partial defect level before the final disposal into the geologic repository. The rods from the research reactor may be verified at the gross defect level. Developing a measurement system for partial defect verification is a complicated and time-consuming task. The Passive High Energy Gamma Emission Tomography and the Fork Detector combined with Gamma Spectrometry are the most potential measurement principles to be developed for this purpose. The whole verification process has to be planned to be as streamlined as possible. An early start in the planning of the verification and developing the measurement devices is important in order to enable a smooth integration of the verification measurements into the conditioning and disposal process. The IAEA and Euratom have not yet concluded the safeguards criteria for the final disposal; e.g., criteria connected to the selection of the best place to perform the verification measurements have not yet been concluded. Options for the verification places have been considered in this report. One option for a verification measurement place is the intermediate storage. The other option is the encapsulation plant. Crucial viewpoints are such as which one offers the best practical possibilities to perform the measurements effectively and which would be the better place from the safeguards point of view. Verification measurements may be needed both in the intermediate storages and in the encapsulation plant. In this report also the integrity of the fuel assemblies after the wet intermediate storage period is assessed, because the assemblies have to stand the handling operations of the verification measurements. (orig.)

  2. Case Study: Test Results of a Tool and Method for In-Flight, Adaptive Control System Verification on a NASA F-15 Flight Research Aircraft

    Science.gov (United States)

    Jacklin, Stephen A.; Schumann, Johann; Guenther, Kurt; Bosworth, John

    2006-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable autonomous flight control and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments [1-2]. At the present time, however, it is unknown how adaptive algorithms can be routinely verified, validated, and certified for use in safety-critical applications. Rigorous methods for adaptive software verification and validation must be developed to ensure that the control software functions as required and is highly safe and reliable. A large gap appears to exist between the point at which control system designers feel the verification process is complete, and when FAA certification officials agree it is complete. Certification of adaptive flight control software verification is complicated by the use of learning algorithms (e.g., neural networks) and degrees of system non-determinism. Of course, analytical efforts must be made in the verification process to place guarantees on learning algorithm stability, rate of convergence, and convergence accuracy. However, to satisfy FAA certification requirements, it must be demonstrated that the adaptive flight control system is also able to fail and still allow the aircraft to be flown safely or to land, while at the same time providing a means of crew notification of the (impending) failure. It was for this purpose that the NASA Ames Confidence Tool was developed [3]. This paper presents the Confidence Tool as a means of providing in-flight software assurance monitoring of an adaptive flight control system. The paper will present the data obtained from flight testing the tool on a specially modified F-15 aircraft designed to simulate loss of flight control surfaces.

  3. Verification of Spent Fuel Transfers in Germany — Linking Strategy, Implementation and People

    International Nuclear Information System (INIS)

    Tsvetkov, I.; Araujo, J.; Morris, G.; Vukadin, Z.; Wishard, B.; Kahnmeyer, W.; ); Trautwein, W.

    2015-01-01

    Following the decision of the German Government to completely phase out nuclear energy by 2022, the Agency is facing an increasing number of spent fuel (SF) transfers from nuclear power plants (NPPs) to dry SF storage facilities. Verification of these transfers in the period 2015-2016 would have required about 1000 additional calendar-days in the field by inspectors. To meet the verification requirements with the available resources, the Agency together with the European Commission (EC) designed an innovative approach. The approach makes full use of safeguards cooperation with the EC and Germany's NPP operators to reduce inspector effort, while fully adhering to the Agency's safeguards policy and requirements. The approach includes a partial defect test, using a digital Cerenkov viewing device (DCVD), of all SF assemblies in the reactor pond(s) before and after an SF loading campaign; during the SF loading campaign all SF in the pond(s) is maintained under continuous surveillance, while the containment measures on SF casks, i.e., fibre-optic and electronic seals and the corresponding fibre-optic cables, are applied by the NPP operator in accordance with the agreed procedure. While the above approach allows for a substantial reduction of Agency inspector presence during the SF cask loading campaign, it can only be implemented when good cooperation exists between the Agency, the facility operator and, as in the case of Germany, the regional safeguards authority. (author)

  4. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety-critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing, or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high-quality, cost-effective product. 15 refs., 6 figs. (author)

  5. Future of monitoring and verification

    International Nuclear Information System (INIS)

    Wagenmakers, H.

    1991-01-01

    The organized verification entrusted to IAEA for the implementation of the NPT, of the Treaty of Tlatelolco and of the Treaty of Rarotonga, reaches reasonable standards. The current dispute with the Democratic People's Republic of Korea about the conclusion of a safeguards agreement with IAEA, by its exceptional nature, underscores rather than undermines the positive judgement to be passed on IAEA's overall performance. The additional task given to the Director General of IAEA under Security Council resolution 687 (1991) regarding Iraq's nuclear-weapons-usable material is particularly challenging. For the purposes of this paper, verification is defined as the process for establishing whether the States parties are complying with an agreement. In the final stage verification may lead to consideration of how to respond to non-compliance. Monitoring is perceived as the first level in the verification system. It is one generic form of collecting information on objects, activities or events and it involves a variety of instruments ranging from communications satellites to television cameras or human inspectors. Monitoring may also be used as a confidence-building measure

  6. Concepts for inventory verification in critical facilities

    International Nuclear Information System (INIS)

    Cobb, D.D.; Sapir, J.L.; Kern, E.A.; Dietz, R.J.

    1978-12-01

    Materials measurement and inventory verification concepts for safeguarding large critical facilities are presented. Inspection strategies and methods for applying international safeguards to such facilities are proposed. The conceptual approach to routine inventory verification includes frequent visits to the facility by one inspector, and the use of seals and nondestructive assay (NDA) measurements to verify the portion of the inventory maintained in vault storage. Periodic verification of the reactor inventory is accomplished by sampling and NDA measurement of in-core fuel elements combined with measurements of integral reactivity and related reactor parameters that are sensitive to the total fissile inventory. A combination of statistical sampling and NDA verification with measurements of reactor parameters is more effective than either technique used by itself. Special procedures for assessment and verification for abnormal safeguards conditions are also considered. When the inspection strategies and inventory verification methods are combined with strict containment and surveillance methods, they provide a high degree of assurance that any clandestine attempt to divert a significant quantity of fissile material from a critical facility inventory will be detected. Field testing of specific hardware systems and procedures to determine their sensitivity, reliability, and operational acceptability is recommended. 50 figures, 21 tables
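    The detection assurance of the statistical sampling component above can be illustrated with the standard attribute-sampling formula: the probability that a random sample of n items, drawn from N items of which d have been tampered with, contains at least one tampered item. The population, defect and sample sizes below are hypothetical, not figures from the report:

```python
from math import comb

def detection_probability(N, d, n):
    """Probability that a random sample of n items from a population of N,
    containing d diverted/defective items, includes at least one of them.
    Standard attribute-sampling result, shown here for illustration."""
    return 1 - comb(N - d, n) / comb(N, n)

# Sampling 10 of 100 fuel elements when 20 have been substituted:
p = detection_probability(100, 20, 10)   # roughly 0.9
```

    Even a modest 10% sample gives high detection probability against a large diversion, which is why sampling combined with NDA measurement is effective; small diversions (small d) are what drive the need for the complementary reactivity measurements.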

  7. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

    Full Text Available As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large amount of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystem can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.

  8. Two-dimensional spectroscopy at infrared and optical frequencies

    OpenAIRE

    Hochstrasser, Robin M.

    2007-01-01

    This Perspective on multidimensional spectroscopy in the optical and infrared spectral regions focuses on the principles and the scientific and technical challenges facing these new fields. The methods hold great promise for advances in the visualization of time-dependent structural changes in complex systems ranging from liquids to biological assemblies, new materials, and fundamental physical processes. The papers in this special feature on multidimensional spectroscopy in chemistry, physic...

  9. A new paradigm of oral cancer detection using digital infrared thermal imaging

    Science.gov (United States)

    Chakraborty, M.; Mukhopadhyay, S.; Dasgupta, A.; Banerjee, S.; Mukhopadhyay, S.; Patsa, S.; Ray, J. G.; Chaudhuri, K.

    2016-03-01

    Histopathology is considered the gold standard for oral cancer detection, but a major fraction of the patient population is incapable of accessing such healthcare facilities due to poverty. Moreover, such analysis may report false negatives when the test tissue is not collected from the exact cancerous location. The proposed work introduces a pioneering computer-aided paradigm of fast, non-invasive and non-ionizing oral cancer detection using Digital Infrared Thermal Imaging (DITI). Due to aberrant metabolic activities in carcinogenic facial regions, the heat signatures of patients differ from those of normal subjects. The proposed work utilizes asymmetry of the temperature distribution of facial regions as the principal cue for cancer detection. Three views of a subject, viz. front, left and right, are acquired using a long-wave infrared (7.5-13 μm) camera for analysing the distribution of temperature. We study the asymmetry of facial temperature distribution between: a) left and right profile faces and b) the left and right halves of the frontal face. Comparison of temperature distributions suggests that patients manifest greater asymmetry than normal subjects. For classification, we initially use k-means and fuzzy k-means for unsupervised clustering, followed by cluster class prototype assignment based on majority voting. Average classification accuracies of 91.5% and 92.8% are achieved by the k-means and fuzzy k-means frameworks for the frontal face. The corresponding metrics for the profile face are 93.4% and 95%. Combining features of the frontal and profile faces, the average accuracies increase to 96.2% and 97.6% respectively for the k-means and fuzzy k-means frameworks.
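    The clustering step can be sketched with a tiny two-cluster k-means (Lloyd's algorithm). The asymmetry feature values below are fabricated for illustration; they only mimic the abstract's premise that patients show larger left/right temperature asymmetry than normal subjects:

```python
import numpy as np

def kmeans2(X, c0, c1, iters=20):
    """Minimal two-cluster k-means (Lloyd's algorithm) with fixed initial
    centroids; a stand-in for the clustering step described in the abstract."""
    c = np.array([c0, c1], dtype=float)
    for _ in range(iters):
        # assign each point to its nearest centroid
        labels = np.argmin(((X[:, None, :] - c[None]) ** 2).sum(-1), axis=1)
        # recompute centroids from the current assignment
        for k in range(2):
            if np.any(labels == k):
                c[k] = X[labels == k].mean(axis=0)
    return labels, c

# synthetic asymmetry features (made-up numbers): patients have larger values
normal  = np.array([[0.10, 0.20], [0.20, 0.10], [0.15, 0.15]])
patient = np.array([[1.00, 1.10], [0.90, 1.20], [1.10, 0.90]])
X = np.vstack([normal, patient])
labels, centers = kmeans2(X, X[0], X[-1])
```

    With well-separated features the two groups fall cleanly into distinct clusters, after which a majority vote over labelled examples assigns each cluster its class prototype, as in the paper.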

  10. Interpersonal brain synchronization in the right temporo-parietal junction during face-to-face economic exchange.

    Science.gov (United States)

    Tang, Honghong; Mai, Xiaoqin; Wang, Shun; Zhu, Chaozhe; Krueger, Frank; Liu, Chao

    2016-01-01

    In daily life, interpersonal interactions are influenced by uncertainty about other people's intentions. Face-to-face (FF) interaction reduces such uncertainty by providing external visible cues such as facial expressions or body gestures and facilitates shared intentionality, promoting beliefs in cooperative decisions and actual cooperative behaviors in interaction. However, so far little is known about interpersonal brain synchronization between two people engaged in naturally occurring FF interactions. In this study, we combined an adapted ultimatum game with functional near-infrared spectroscopy (fNIRS) hyperscanning to investigate how FF interaction impacts interpersonal brain synchronization during economic exchange. Pairs of strangers interacted repeatedly either FF or face-blocked (FB), while their activation was simultaneously measured in the right temporo-parietal junction (rTPJ) and the control region, the right dorsolateral prefrontal cortex (rDLPFC). Behaviorally, FF interactions increased shared intentionality between strangers, leading to more positive beliefs in cooperative decisions and greater actual gains in the game. fNIRS results indicated increased interpersonal brain synchronization during FF interactions in the rTPJ (but not in the rDLPFC) with greater shared intentionality between partners. These results highlight the importance of the rTPJ in collaborative social interactions during FF economic exchange and warrant future research that combines FF interactions with fNIRS hyperscanning to study social brain disorders such as autism. © The Author (2015). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
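    A minimal way to see what an inter-brain synchronization index measures is to correlate two noisy channels driven by a common task-locked signal. This is a deliberately crude stand-in: fNIRS hyperscanning studies typically use wavelet transform coherence rather than plain Pearson correlation, and the signals below are synthetic:

```python
import numpy as np

def sync_index(a, b):
    """Pearson correlation of two channel time series, used here as a crude
    inter-brain synchronization index (hyperscanning studies usually prefer
    wavelet transform coherence)."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

rng = np.random.default_rng(0)
t = np.linspace(0.0, 60.0, 600)
shared = np.sin(2 * np.pi * 0.1 * t)               # common task-locked component
s1 = shared + 0.3 * rng.standard_normal(t.size)    # "participant 1" channel
s2 = shared + 0.3 * rng.standard_normal(t.size)    # "participant 2" channel
r = sync_index(s1, s2)                             # high: channels co-vary
```

    Two channels sharing a strong common component correlate highly even under independent measurement noise, which is the intuition behind comparing synchronization across FF and FB conditions.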

  11. Entropy Measurement for Biometric Verification Systems.

    Science.gov (United States)

    Lim, Meng-Hui; Yuen, Pong C

    2016-05-01

    Biometric verification systems are designed to accept multiple similar biometric measurements per user due to inherent intra-user variations in the biometric data. This is important to preserve a reasonable acceptance rate for genuine queries and the overall feasibility of the recognition system. However, such acceptance of multiple similar measurements decreases the impostor's difficulty in obtaining a system-acceptable measurement, resulting in a degraded security level. This deteriorated security needs to be measurable to provide truthful security assurance to the users. Entropy is a standard measure of security. However, the entropy formula is applicable only when there is a single acceptable possibility. In this paper, we develop an entropy-measuring model for biometric systems that accept multiple similar measurements per user. Based on the idea of guessing entropy, the proposed model quantifies biometric system security in terms of adversarial guessing effort for two practical attacks. Excellent agreement between analytic and experimental simulation-based measurement results on a synthetic dataset and a benchmark face dataset justifies the correctness of our model and thus the feasibility of the proposed entropy-measuring approach.
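
    The guessing-entropy idea underlying this record can be illustrated with a toy sketch (not the authors' exact formulation): the expected number of sequential guesses an attacker needs when trying candidates in order of decreasing probability, and how accepting any of several similar measurements shrinks the effective search space. The template counts below are hypothetical.

```python
def guessing_entropy(probs):
    """Expected number of guesses when candidates are tried in order of
    decreasing probability (the classical guessing-entropy measure)."""
    ordered = sorted(probs, reverse=True)
    return sum(i * p for i, p in enumerate(ordered, start=1))

def uniform_guessing_entropy(n):
    """For n equally likely secrets this evaluates to (n + 1) / 2."""
    return guessing_entropy([1.0 / n] * n)

# Hypothetical numbers: 1024 raw feature values, but the matcher accepts any of
# 16 similar measurements, collapsing them into 64 distinguishable regions.
effort_exact = uniform_guessing_entropy(1024)    # 512.5 expected guesses
effort_tolerant = uniform_guessing_entropy(64)   # 32.5 expected guesses
```

    The drop from 512.5 to 32.5 expected guesses is the security degradation the paper sets out to quantify.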

  12. Gender Verification of Female Olympic Athletes.

    Science.gov (United States)

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  13. Multimodal Biometric System Based on the Recognition of Face and Both Irises

    Directory of Open Access Journals (Sweden)

    Yeong Gon Kim

    2012-09-01

    Full Text Available The performance of unimodal biometric systems (based on a single modality such as face or fingerprint) is limited by various problems, such as illumination variation, skin condition, environmental conditions, and device variations. Therefore, multimodal biometric systems have been used to overcome the limitations of unimodal biometrics and provide high-accuracy recognition. In this paper, we propose a new multimodal biometric system based on score-level fusion of face and both-iris recognition. Our study has the following novel features. First, the proposed device acquires images of the face and both irises simultaneously; it consists of a face camera, two iris cameras, near-infrared illuminators and cold mirrors. Second, fast and accurate iris detection is based on two circular edge detections, performed in the iris image on the basis of the size of the iris detected in the face image. Third, accuracy is enhanced by combining the scores for the face and both irises using a support vector machine. The experimental results show that the equal error rate for the proposed method is 0.131%, which is lower than that of face or iris recognition alone and of other fusion methods.
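
    Score-level fusion of the kind this record describes can be sketched as follows. The paper trains a support vector machine on the three matcher scores; this simplified stand-in uses fixed weights after min-max normalization, and all score ranges, weights and the threshold are hypothetical.

```python
def normalize(score, lo, hi):
    """Min-max normalize a raw matcher score into [0, 1]."""
    return max(0.0, min(1.0, (score - lo) / (hi - lo)))

def fuse(face, left_iris, right_iris, weights=(0.4, 0.3, 0.3)):
    """Weighted-sum score fusion over face and both-iris matcher outputs."""
    scores = (normalize(face, 0, 100),     # hypothetical face-matcher range
              normalize(left_iris, 0, 1),  # hypothetical iris-matcher range
              normalize(right_iris, 0, 1))
    return sum(w * s for w, s in zip(weights, scores))

THRESHOLD = 0.6  # accept the claimed identity if the fused score clears this

genuine = fuse(85, 0.90, 0.88)   # consistent high similarity on all matchers
impostor = fuse(30, 0.20, 0.25)
```

    An SVM learns the decision boundary from genuine/impostor score pairs instead of fixing the weights by hand; that trained boundary is what the paper evaluates against single-modality recognition.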

  14. Reload core safety verification

    International Nuclear Information System (INIS)

    Svetlik, M.; Minarcin, M.

    2003-01-01

    This paper presents a brief look at the process of reload core safety evaluation and verification in Slovak Republic. It gives an overview of experimental verification of selected nuclear parameters in the course of physics testing during reactor start-up. The comparison of IAEA recommendations and testing procedures at Slovak and European nuclear power plants of similar design is included. An introduction of two level criteria for evaluation of tests represents an effort to formulate the relation between safety evaluation and measured values (Authors)

  15. Characterization of double face adhesive sheets for ceramic tile installation; Caracterizacao de sistema de colagem dupla-face para assentamento de revestimento ceramico

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, Otavio L.; Mansur, Alexandra A.P.; Mansur, Herman S., E-mail: hmansur@demet.ufmg.br [Universidade Federal de Minas Gerais. UFMG, Departamento de Engenharia Metalurgica e de Materiais, Belo Horizonte, MG (Brazil)

    2011-07-01

    The main goal of this work was the characterization of an innovative ceramic tile installation product based on double-face adhesive sheets. Density, hardness, tensile strength, X-ray diffraction, infrared spectroscopy, and scanning electron microscopy coupled with energy-dispersive spectroscopy assays were conducted. The results are in agreement with some of the manufacturer's specifications, and the information obtained will be crucial in analysing the durability and stability of ceramic tile systems installed with this new product. (author)

  16. Validation of Embedded System Verification Models

    NARCIS (Netherlands)

    Marincic, J.; Mader, Angelika H.; Wieringa, Roelf J.

    The result of a model-based requirements verification shows that the model of a system satisfies (or not) formalised system requirements. The verification result is correct only if the model represents the system adequately. No matter what modelling technique we use, what precedes the model

  17. On Verification Modelling of Embedded Systems

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.

    Computer-aided verification of embedded systems hinges on the availability of good verification models of the systems at hand. Such models must be much simpler than full design models or specifications to be of practical value, because of the unavoidable combinatorial complexities in the

  18. Compositional verification of real-time systems using Ecdar

    DEFF Research Database (Denmark)

    David, Alexandre; Larsen, Kim Guldstrand; Legay, Axel

    2012-01-01

    We present a specification theory for timed systems implemented in the Ecdar tool. We illustrate the operations of the specification theory on a running example, showing the models and verification checks. To demonstrate the power of compositional verification, we perform an in-depth case study of a leader election protocol, modeling it in Ecdar as timed input/output automata specifications and performing both monolithic and compositional verification of two interesting properties on it. We compare the execution time of the compositional to the classical verification, showing a huge difference…

  19. Disarmament Verification - the OPCW Experience

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  20. Verification of Chemical Weapons Destruction

    International Nuclear Information System (INIS)

    Lodding, J.

    2010-01-01

    The Chemical Weapons Convention is the only multilateral treaty that bans completely an entire category of weapons of mass destruction under international verification arrangements. Possessor States, i.e. those that have chemical weapons stockpiles at the time of becoming party to the CWC, commit to destroying these. All States undertake never to acquire chemical weapons and not to help other States acquire such weapons. The CWC foresees time-bound chemical disarmament. The deadlines for destruction for early entrants to the CWC are provided in the treaty. For late entrants, the Conference of States Parties intervenes to set destruction deadlines. One of the unique features of the CWC is thus the regime for verifying destruction of chemical weapons. But how can you design a system for verification at military sites, while protecting military restricted information? What degree of assurance is considered sufficient in such circumstances? How do you divide the verification costs? How do you deal with production capability and initial declarations of existing stockpiles? The founders of the CWC had to address these and other challenges in designing the treaty. Further refinement of the verification system has followed since the treaty opened for signature in 1993 and since inspection work was initiated following entry-into-force of the treaty in 1997. Most of this work concerns destruction at the two large possessor States, Russia and the United States. Perhaps some of the lessons learned from the OPCW experience may be instructive in a future verification regime for nuclear weapons. (author)

  1. Cross spectral, active and passive approach to face recognition for improved performance

    Science.gov (United States)

    Grudzien, A.; Kowalski, M.; Szustakowski, M.

    2017-08-01

    Biometrics is a technique for the automatic recognition of a person based on physiological or behavioral characteristics. Since the characteristics used are unique, biometrics can create a direct link between a person and an identity based on a variety of characteristics. The human face is one of the most important biometric modalities for automatic authentication. The most popular methods of face recognition, which rely on processing visible-light imagery, remain imperfect. Thermal infrared imagery may be a promising alternative or complement to visible-range imaging for several reasons. This paper presents an approach that combines both methods.

  2. A Model for Collaborative Runtime Verification

    NARCIS (Netherlands)

    Testerink, Bas; Bulling, Nils; Dastani, Mehdi

    2015-01-01

    Runtime verification concerns checking whether a system execution satisfies a given property. In this paper we propose a model for collaborative runtime verification where a network of local monitors collaborates in order to verify properties of the system. A local monitor has only a local view on

  3. HDM/PASCAL Verification System User's Manual

    Science.gov (United States)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided such as maintaining a data base, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP. However, no knowledge is assumed of these operating systems or of INTERLISP. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system in order for the editor to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  4. Self-verification and contextualized self-views.

    Science.gov (United States)

    Chen, Serena; English, Tammy; Peng, Kaiping

    2006-07-01

    Whereas most self-verification research has focused on people's desire to verify their global self-conceptions, the present studies examined self-verification with regard to contextualized self-views: views of the self in particular situations and relationships. It was hypothesized that individuals whose core self-conceptions include contextualized self-views should seek to verify these self-views. In Study 1, the more individuals defined the self in dialectical terms, the more their judgments were biased in favor of verifying over nonverifying feedback about a negative, situation-specific self-view. In Study 2, consistent with research on gender differences in the importance of relationships to the self-concept, women but not men showed a similar bias toward feedback about a negative, relationship-specific self-view, a pattern not seen for global self-views. Together, the results support the notion that self-verification occurs for core self-conceptions, whatever form(s) they may take. Individual differences in self-verification and the nature of selfhood and authenticity are discussed.

  5. Verification of RESRAD-build computer code, version 3.1

    International Nuclear Information System (INIS)

    2003-01-01

    RESRAD-BUILD is a computer model for analyzing the radiological doses resulting from the remediation and occupancy of buildings contaminated with radioactive material. It is part of a family of codes that includes RESRAD, RESRAD-CHEM, RESRAD-RECYCLE, RESRAD-BASELINE, and RESRAD-ECORISK. The RESRAD-BUILD models were developed and codified by Argonne National Laboratory (ANL); version 1.5 of the code and the user's manual were publicly released in 1994. The original version of the code was written for the Microsoft DOS operating system, but subsequent versions were written for Microsoft Windows. The purpose of the present verification task (which includes validation as defined in the standard) is to provide an independent review of the latest version of RESRAD-BUILD under the guidance provided by ANSI/ANS-10.4 for verification and validation of existing computer programs. This approach consists of an a posteriori V and V review, which takes advantage of available program development products as well as user experience. The purpose, as specified in ANSI/ANS-10.4, is to determine whether the program produces valid responses when used to analyze problems within a specific domain of applications, and to document the level of verification. The culmination of these efforts is the production of this formal Verification Report. The first step in performing the verification of an existing program was the preparation of a Verification Review Plan. The review plan consisted of identifying: the reason(s) why a posteriori verification is to be performed; the scope and objectives for the level of verification selected; the development products to be used for the review; the availability and use of user experience; and the actions to be taken to supplement missing or unavailable development products. The purpose, scope and objectives for the level of verification selected are described in this section of the Verification Report. The development products that were used

  6. Optical Verification Laboratory Demonstration System for High Security Identification Cards

    Science.gov (United States)

    Javidi, Bahram

    1997-01-01

    Document fraud, including the unauthorized duplication of identification cards and credit cards, is a serious problem facing the government, banks, businesses, and consumers. In addition, counterfeit products such as computer chips and compact discs are arriving on our shores in great numbers. With the rapid advances in computers, CCD technology, image processing hardware and software, printers, scanners, and copiers, it is becoming increasingly easy to reproduce pictures, logos, symbols, paper currency, or patterns. These problems have stimulated an interest in research, development and publications in security technology. Some ID cards, credit cards and passports currently use holograms as a security measure to thwart copying. The holograms are inspected by the human eye. In theory, the hologram cannot be reproduced by an unauthorized person using commercially available optical components; in practice, however, technology has advanced to the point where the holographic image can be acquired from a credit card (photographed or captured by a CCD camera) and a new hologram synthesized using commercially available optical components or hologram-producing equipment. Therefore, a pattern that can be read by a conventional light source and a CCD camera can be reproduced. An optical security and anti-copying device that provides significant security improvements over existing security technology was demonstrated. The system can be applied to security verification of credit cards, passports, and other IDs so that they cannot easily be reproduced. We have used a new scheme of complex phase/amplitude patterns that cannot be seen and cannot be copied by an intensity-sensitive detector such as a CCD camera. A random phase mask is bonded to a primary identification pattern, which could also be phase encoded. The pattern could be a fingerprint, a picture of a face, or a signature. The proposed optical processing device is designed to identify both the random phase mask and the

  7. Early results from the Infrared Astronomical Satellite

    International Nuclear Information System (INIS)

    Neugebauer, G.; Beichman, C.A.; Soifer, B.T.

    1984-01-01

    For 10 months the Infrared Astronomical Satellite (IRAS) provided astronomers with what might be termed their first view of the infrared sky on a clear, dark night. Without IRAS, atmospheric absorption and the thermal emission of both the atmosphere and Earthbound telescopes make the task of the infrared astronomer comparable to what an optical astronomer would face if required to work only on cloudy afternoons. IRAS observations are serving astronomers in the same manner as the photographic plates of the Palomar Observatory Sky Survey; just as the optical survey has been used by all astronomers for over three decades, as a source of quantitative information about the sky and as a roadmap for future observations, the results of IRAS will be studied for years to come. IRAS has demonstrated the power of infrared astronomy from space. Already, from a brief look at a minuscule fraction of the data available, we have learned much about the solar system, about nearby stars, about the Galaxy as a whole and about distant extragalactic systems. Comets are much dustier than previously thought. Solid particles, presumably the remnants of the star-formation process, orbit around Vega and other stars and may provide the raw material for planetary systems. Emission from cool interstellar material has been traced throughout the Galaxy all the way to the galactic poles. Both the clumpiness and the breadth of the distribution of this material were previously unsuspected. The far-infrared sky away from the galactic plane has been found to be dominated by spiral galaxies, some of which emit more than 50%, and as much as 98%, of their energy in the infrared: an exciting and surprising revelation. The IRAS mission is clearly the pathfinder for future missions that, to a large extent, will be devoted to the discoveries revealed by IRAS. 8 figures

  8. VERIFICATION OF 3D BUILDING MODELS USING MUTUAL INFORMATION IN AIRBORNE OBLIQUE IMAGES

    Directory of Open Access Journals (Sweden)

    A. P. Nyaruhuma

    2012-07-01

    Full Text Available This paper describes a method for the automatic verification of 3D building models using airborne oblique images. The problem being tackled is identifying buildings that have been demolished or changed since the models were constructed, or identifying wrong models, using the images. The models verified are of CityGML LOD2 or higher, since their edges are expected to coincide with actual building edges. The verification approach is based on information theory: corresponding variables between building models and oblique images are used to derive mutual information for individual edges, faces or whole buildings, combined over all perspective images available for the building. The wireframe model edges are projected into the images and verified using low-level image features, namely the image pixel gradient directions. A building part is only checked against images in which it may be visible. The method has been tested with models constructed from laser points against Pictometry images, which are available for most cities of Europe and may be publicly viewed in the so-called Bird's Eye view of Microsoft Bing Maps. The results show that nearly all buildings are correctly categorised as existing or demolished. Because we currently concentrate only on roofs, we also used the method to test and compare results from nadir images. This comparison made clear that height errors in models, especially, can be more reliably detected in oblique images because of the tilted view. Besides overall building verification, results for individual edges can be used to improve the 3D building models.
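
    The mutual-information idea in this record can be sketched on quantized gradient directions. This toy example (hypothetical direction bins, not the authors' estimator) computes MI in bits from the joint histogram of projected model-edge directions and image gradient directions: a building that still stands yields high MI, an unrelated scene yields none.

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """MI in bits between two discrete sequences, from their joint histogram."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

# Gradient directions quantized into 8 bins (0..7) along sampled edge pixels.
model_dirs = [0, 0, 2, 2, 4, 4, 6, 6]   # projected wireframe edge directions
intact     = [0, 0, 2, 2, 4, 4, 6, 6]   # image gradients follow the model edges
demolished = [0, 2, 0, 2, 0, 2, 0, 2]   # gradients unrelated to the model edges

mi_intact = mutual_information(model_dirs, intact)       # 2.0 bits
mi_gone = mutual_information(model_dirs, demolished)     # 0.0 bits
```

    Thresholding such per-edge MI values, combined across all views of a building, is the kind of existing/demolished decision the paper reports.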

  9. A Verification Logic for GOAL Agents

    Science.gov (United States)

    Hindriks, K. V.

    Although there has been a growing body of literature on the verification of agent programs, it has been difficult to design a verification logic for agent programs that fully characterizes such programs and to connect agent programs to agent theory. The challenge is to define an agent programming language that defines a computational framework but also allows for a logical characterization useful for verification. The agent programming language GOAL was originally designed to connect agent programming to agent theory, and we present additional results here showing that GOAL agents can be fully represented by a logical theory. GOAL agents can thus be said to execute the corresponding logical theory.

  10. Secure optical verification using dual phase-only correlation

    International Nuclear Information System (INIS)

    Liu, Wei; Liu, Shutian; Zhang, Yan; Xie, Zhenwei; Liu, Zhengjun

    2015-01-01

    We introduce a security-enhanced optical verification system using dual phase-only correlation based on a novel correlation algorithm. By employing a nonlinear encoding, the inherent locks of the verification system are obtained as real-valued random distributions, and the identity keys assigned to authorized users are designed as pure phases. The verification process is implemented as a two-step correlation, so only authorized identity keys can output the discriminating auto-correlation and cross-correlation signals that satisfy the preset threshold values. Compared with traditional phase-only-correlation-based verification systems, a higher security level against counterfeiting and collisions is obtained, as demonstrated by cryptanalysis using known attacks, such as the known-plaintext attack and the chosen-plaintext attack. Optical experiments as well as the necessary numerical simulations are carried out to support the proposed verification method. (paper)
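
    A single phase-only correlation stage, the building block that this record's scheme applies twice with phase keys, can be sketched in one dimension. The signals below are arbitrary illustrations, and a naive O(n²) DFT keeps the sketch dependency-free.

```python
import cmath

def dft(x, inverse=False):
    """Naive discrete Fourier transform (O(n^2)), sufficient for a sketch."""
    n, sign = len(x), (1 if inverse else -1)
    out = [sum(v * cmath.exp(sign * 2j * cmath.pi * k * t / n)
               for t, v in enumerate(x)) for k in range(n)]
    return [v / n for v in out] if inverse else out

def phase_only_correlation(a, b):
    """Correlate using only spectral phase: the normalized cross-power
    spectrum yields a sharp delta-like peak exactly when the inputs match."""
    cross = []
    for fa, fb in zip(dft(a), dft(b)):
        p = fa * fb.conjugate()
        cross.append(p / abs(p) if abs(p) > 1e-12 else 0j)
    return [abs(v) for v in dft(cross, inverse=True)]

sig = [1, 2, 3, 4, 5, 6, 7, 8]
shifted = sig[-2:] + sig[:-2]    # the same signal, cyclically shifted by 2

self_corr = phase_only_correlation(sig, sig)        # unit peak at index 0
shift_corr = phase_only_correlation(sig, shifted)   # unit peak; position encodes the shift
```

    In the verification system, only an authorized phase key restores such a discriminating peak above the threshold; mismatched keys spread the correlation energy flat.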

  11. Verification of DRAGON: the NXT tracking module

    International Nuclear Information System (INIS)

    Zkiek, A.; Marleau, G.

    2007-01-01

    The version of DRAGON-IST that has been verified for the calculation of the incremental cross sections associated with CANDU reactivity devices is version 3.04Bb that was released in 2001. Since then, various improvements were implemented in the code including the NXT: module that can track assemblies of clusters in 2-D and 3-D geometries. Here we will discuss the verification plan for the NXT: module of DRAGON, illustrate the verification procedure we selected and present our verification results. (author)

  12. Experiences in Building Python Automation Framework for Verification and Data Collections

    Directory of Open Access Journals (Sweden)

    2010-09-01

    Full Text Available

    Full Text Available This paper describes our experiences in building a Python automation framework. Specifically, the automation framework is used to support verification and data-collection scripts. The scripts control various test equipment in addition to the device under test (DUT) to characterize a specific performance with a specific configuration, or to evaluate the correctness of the behaviour of the DUT. The specific focus of this paper is on documenting our experiences in building an automation framework using Python: on the purposes, goals and benefits, rather than on a tutorial of how to build such a framework.
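
    The kind of framework this record describes can be sketched in miniature: a runner that registers verification steps, executes them against a device under test, and collects pass/fail records. All class, step and DUT names here are hypothetical illustrations, not the authors' code.

```python
import time

class TestRunner:
    """Minimal sketch of a verification-script runner."""
    def __init__(self):
        self.steps = []

    def step(self, fn):
        """Decorator that registers a verification step."""
        self.steps.append(fn)
        return fn

    def run(self, dut):
        """Run every registered step against the DUT, recording outcomes."""
        results = []
        for fn in self.steps:
            t0 = time.time()
            try:
                fn(dut)
                results.append((fn.__name__, "PASS", time.time() - t0))
            except AssertionError:
                results.append((fn.__name__, "FAIL", time.time() - t0))
        return results

runner = TestRunner()

class FakeDut:
    """Stand-in for the device under test; real scripts would drive instruments."""
    def ping(self):
        return "pong"
    def voltage(self):
        return 3.31

@runner.step
def link_up(dut):
    assert dut.ping() == "pong"

@runner.step
def voltage_in_range(dut):
    assert 3.2 <= dut.voltage() <= 3.4

@runner.step
def deliberately_failing(dut):
    assert dut.voltage() < 3.0   # fails: recorded as FAIL, run continues

report = runner.run(FakeDut())
```

    Keeping failures as data rather than exceptions lets one characterization run sweep many configurations and report everything at the end.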

  13. Technical challenges for dismantlement verification

    International Nuclear Information System (INIS)

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-01-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms control. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, identifying limitations and vulnerabilities along the way. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion.

  14. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention, for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification of such systems … on a number of case studies, tackled using a prototypical implementation…

  15. SSN Verification Service

    Data.gov (United States)

    Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...

  16. Familiar face + novel face = familiar face? Representational bias in the perception of morphed faces in chimpanzees

    Directory of Open Access Journals (Sweden)

    Yoshi-Taka Matsuda

    2016-08-01

    Full Text Available Highly social animals possess a well-developed ability to distinguish the faces of familiar from novel conspecifics, inducing distinct behaviors that maintain their society. However, the behaviors of animals encountering ambiguous faces of familiar yet novel conspecifics, e.g., strangers with faces resembling known individuals, have not been well characterised. Using a morphing technique and a preferential-looking paradigm, we address this question via the chimpanzee's facial-recognition abilities. We presented eight subjects with three types of stimuli: (1) familiar faces, (2) novel faces and (3) intermediate morphed faces that were 50% familiar and 50% novel faces of conspecifics. We found that chimpanzees spent more time looking at novel faces and scanned novel faces more extensively than familiar or intermediate faces. Interestingly, chimpanzees looked at intermediate faces in a manner similar to familiar faces with regard to fixation duration, fixation count, and saccade length for facial scanning, even though they were encountering the intermediate faces for the first time. We excluded the possibility that subjects merely detected and avoided traces of morphing in the intermediate faces. These findings suggest a feeling-of-familiarity bias: chimpanzees perceive an intermediate face as familiar by detecting traces of a known individual, as 50% alteration is sufficient to induce perceived familiarity.

  17. Enhanced Verification Test Suite for Physics Simulation Codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with the mathematical correctness of the numerical algorithms in a code, while validation deals with the physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and the technology used in verification analysis have evolved and improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) hydrodynamics; (b) transport processes; and (c) dynamic strength of materials. (5) For several of the proposed problems we provide a 'strong-sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary, but not sufficient, step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of
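
    A concrete instance of the kind of check such verification problems support is computing an observed order of accuracy against a known exact solution and comparing it with the theoretical order of the discretization. This sketch (not taken from the report) verifies that a central difference converges at second order:

```python
from math import sin, cos, log

def central_diff(f, x, h):
    """Second-order central-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

def observed_order(f, fprime, x, h):
    """Halve the step and compare errors against the exact derivative:
    p = log(e_h / e_{h/2}) / log(2) should match the theoretical order."""
    e1 = abs(central_diff(f, x, h) - fprime(x))
    e2 = abs(central_diff(f, x, h / 2) - fprime(x))
    return log(e1 / e2) / log(2)

p = observed_order(sin, cos, 1.0, 1e-2)   # close to the theoretical order 2
```

    A code that fails to reproduce its theoretical convergence order on such a benchmark fails verification regardless of how physically plausible its output looks.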

  18. Lessons Learned From Microkernel Verification — Specification is the New Bottleneck

    Directory of Open Access Journals (Sweden)

    Thorsten Bormer

    2012-11-01

    Full Text Available Software verification tools have become a lot more powerful in recent years. Even verification of large, complex systems is feasible, as demonstrated in the L4.verified and Verisoft XT projects. Still, functional verification of large software systems is rare – for reasons beyond the large scale of verification effort needed due to the size alone. In this paper we report on lessons learned for verification of large software systems based on the experience gained in microkernel verification in the Verisoft XT project. We discuss a number of issues that impede widespread introduction of formal verification in the software life-cycle process.

  19. Advancing Disarmament Verification Tools: A Task for Europe?

    International Nuclear Information System (INIS)

    Göttsche, Malte; Kütt, Moritz; Neuneck, Götz; Niemeyer, Irmgard

    2015-01-01

    A number of scientific-technical activities have been carried out to establish more robust and irreversible disarmament verification schemes. Regardless of the actual path towards deeper reductions in nuclear arsenals or their total elimination in the future, disarmament verification will require new verification procedures and techniques. This paper discusses the information that would be required as a basis for building confidence in disarmament, how it could be principally verified and the role Europe could play. Various ongoing activities are presented that could be brought together to produce a more intensified research and development environment in Europe. The paper argues that if ‘effective multilateralism’ is the main goal of the European Union’s (EU) disarmament policy, EU efforts should be combined and strengthened to create a coordinated multilateral disarmament verification capacity in the EU and other European countries. The paper concludes with several recommendations that would have a significant impact on future developments. Among other things, the paper proposes a one-year review process that should include all relevant European actors. In the long run, an EU Centre for Disarmament Verification could be envisaged to optimize verification needs, technologies and procedures.

  20. The verification of DRAGON: progress and lessons learned

    International Nuclear Information System (INIS)

    Marleau, G.

    2002-01-01

    The general requirements for the verification of the legacy code DRAGON are somewhat different from those used for new codes. For example, the absence of a design manual for DRAGON makes it difficult to confirm that each part of the code performs as required, since these requirements are not explicitly spelled out for most of the DRAGON modules. In fact, this conformance of the code can only be assessed, in most cases, by making sure that the contents of the DRAGON data structures, which correspond to the output generated by a module of the code, contain adequate information. It is also possible in some cases to use the self-verification options in DRAGON to perform additional verification, or to evaluate, using independent software, the performance of specific functions in the code. Here, we describe the global verification process that was considered in order to bring DRAGON to industry standard tool-set (IST) status. We also discuss some of the lessons we learned in performing this verification and present some of the modifications to DRAGON that were implemented as a consequence of this verification. (author)

  1. Simulation Environment Based on the Universal Verification Methodology

    CERN Document Server

    AUTHOR|(SzGeCERN)697338

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach of verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer, by setting the verification goals, starts with a structured plan. Those goals are targeted further by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, the additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC desi...

  2. Intraretinal Correlates of Reticular Pseudodrusen Revealed by Autofluorescence and En Face OCT.

    Science.gov (United States)

    Paavo, Maarjaliis; Lee, Winston; Merriam, John; Bearelly, Srilaxmi; Tsang, Stephen; Chang, Stanley; Sparrow, Janet R

    2017-09-01

    We sought to determine whether information revealed from the reflectance, autofluorescence, and absorption properties of RPE cells situated posterior to reticular pseudodrusen (RPD) could provide insight into the origins and structure of RPD. RPD were studied qualitatively by near-infrared fundus autofluorescence (NIR-AF), short-wavelength fundus autofluorescence (SW-AF), and infrared reflectance (IR-R) images, and the presentation was compared to horizontal and en face spectral domain optical coherence tomographic (SD-OCT) images. Images were acquired from 23 patients (39 eyes) diagnosed with RPD (mean age 80.7 ± 7.1 [SD]; 16 female; 4 Hispanics, 19 non-Hispanic whites). In SW-AF, NIR-AF, and IR-R images, fundus RPD were recognized as interlacing networks of small scale variations in IR-R and fluorescence (SW-AF, NIR-AF) intensities. Darkened foci of RPD colocalized in SW-AF and NIR-AF images, and in SD-OCT images corresponded to disturbances of the interdigitation (IZ) and ellipsoid (EZ) zones and to more pronounced hyperreflective lesions traversing photoreceptor-attributable bands in SD-OCT images. Qualitative assessment of the outer nuclear layer (ONL) revealed thinning as RPD extended radially from the outer to inner retina. In en face OCT, hyperreflective areas in the EZ band correlated topographically with hyporeflective foci at the level of the RPE. The hyperreflective lesions corresponding to RPD in SD-OCT scans are likely indicative of degenerating photoreceptor cells. The darkened foci at positions of RPD in NIR-AF and en face OCT images indicate changes in the RPE monolayer with the reduced NIR-AF and en face OCT signal suggesting a reduction in melanin that could be accounted for by RPE thinning.

  3. Self-verification motives at the collective level of self-definition.

    Science.gov (United States)

    Chen, Serena; Chen, Karen Y; Shaw, Lindsay

    2004-01-01

    Three studies examined self-verification motives in relation to collective aspects of the self. Several moderators of collective self-verification were also examined--namely, the certainty with which collective self-views are held, the nature of one's ties to a source of self-verification, the salience of the collective self, and the importance of group identification. Evidence for collective self-verification emerged across all studies, particularly when collective self-views were held with high certainty (Studies 1 and 2), perceivers were somehow tied to the source of self-verification (Study 1), the collective self was salient (Study 2), and group identification was important (Study 3). To the authors' knowledge, these studies are the first to examine self-verification at the collective level of self-definition. The parallel and distinct ways in which self-verification processes may operate at different levels of self-definition are discussed.

  4. 3D face recognition with asymptotic cones based principal curvatures

    KAUST Repository

    Tang, Yinhang

    2015-05-01

    The classical curvatures of smooth surfaces (Gaussian, mean and principal curvatures) have been widely used in 3D face recognition (FR). However, facial surfaces resulting from 3D sensors are discrete meshes. In this paper, we present a general framework and define three principal curvatures on discrete surfaces for the purpose of 3D FR. These principal curvatures are derived from the construction of asymptotic cones associated to any Borel subset of the discrete surface. They describe the local geometry of the underlying mesh. The first two of them correspond to the classical principal curvatures in the smooth case. We isolate the third principal curvature, which carries meaningful geometric shape information. The three principal curvatures at different Borel subset scales give multi-scale local facial surface descriptors. We combine the proposed principal curvatures with the LNP-based facial descriptor and SRC for recognition. The identification and verification experiments demonstrate the practicability and accuracy of the third principal curvature and the fusion of multi-scale Borel subset descriptors on 3D faces from FRGC v2.0.

  5. 3D face recognition with asymptotic cones based principal curvatures

    KAUST Repository

    Tang, Yinhang; Sun, Xiang; Huang, Di; Morvan, Jean-Marie; Wang, Yunhong; Chen, Liming

    2015-01-01

    The classical curvatures of smooth surfaces (Gaussian, mean and principal curvatures) have been widely used in 3D face recognition (FR). However, facial surfaces resulting from 3D sensors are discrete meshes. In this paper, we present a general framework and define three principal curvatures on discrete surfaces for the purpose of 3D FR. These principal curvatures are derived from the construction of asymptotic cones associated to any Borel subset of the discrete surface. They describe the local geometry of the underlying mesh. The first two of them correspond to the classical principal curvatures in the smooth case. We isolate the third principal curvature, which carries meaningful geometric shape information. The three principal curvatures at different Borel subset scales give multi-scale local facial surface descriptors. We combine the proposed principal curvatures with the LNP-based facial descriptor and SRC for recognition. The identification and verification experiments demonstrate the practicability and accuracy of the third principal curvature and the fusion of multi-scale Borel subset descriptors on 3D faces from FRGC v2.0.

  6. 24 CFR 5.512 - Verification of eligible immigration status.

    Science.gov (United States)

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status of...

  7. Verification of RADTRAN

    International Nuclear Information System (INIS)

    Kanipe, F.L.; Neuhauser, K.S.

    1995-01-01

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes

  8. CASL Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dinh, Nam [North Carolina State Univ., Raleigh, NC (United States)

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This document will be a living document that will track progress on CASL to do verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and for the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.

  9. Far infrared supplement: Catalog of infrared observations, second edition

    International Nuclear Information System (INIS)

    Gezari, D.Y.; Schmitz, M.; Mead, J.M.

    1988-08-01

    The Far Infrared Supplement: Catalog of Infrared Observations summarizes all infrared astronomical observations at far infrared wavelengths (5 to 1000 microns) published in the scientific literature from 1965 through 1986. The Supplement listing contains 25 percent of the observations in the full Catalog of Infrared Observations (CIO), and essentially eliminates most visible stars from the listings. The Supplement is thus more compact than the main catalog, and is intended for easy reference during astronomical observations. The Far Infrared Supplement (2nd Edition) includes the Index of Infrared Source Positions and the Bibliography of Infrared Astronomy for the subset of far infrared observations listed

  10. The monitoring and verification of nuclear weapons

    International Nuclear Information System (INIS)

    Garwin, Richard L.

    2014-01-01

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers

  11. DarcyTools, Version 2.1. Verification and validation

    International Nuclear Information System (INIS)

    Svensson, Urban

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured medium in mind is a fractured rock and the porous medium the soil cover on top of the rock; it is hence groundwater flow that is the class of flows in mind. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparisons with analytical solutions for idealized situations. The five verification groups thus reflect the main areas of recent developments. The present report will focus on the Verification and Validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases will be reported in Appendix A (verification cases) and Appendix B (validation cases)
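    A verification case of the kind described, comparing a code result with an analytical solution for an idealized situation, can be sketched generically. Everything below (the 1-D Darcy problem, the noise standing in for discretization error, and the tolerance) is illustrative and not taken from the DarcyTools reports:

```python
import numpy as np

# 1-D steady Darcy flow between fixed heads has the analytical
# solution h(x) = h0 + (h1 - h0) * x / L.
L_len, h0, h1 = 10.0, 5.0, 2.0
x = np.linspace(0.0, L_len, 51)
h_exact = h0 + (h1 - h0) * x / L_len

# Stand-in for a code result: exact solution plus small numerical noise
rng = np.random.default_rng(0)
h_num = h_exact + 1e-4 * rng.standard_normal(x.size)

# Acceptance check: relative L2 error against the analytical solution
rel_l2 = np.linalg.norm(h_num - h_exact) / np.linalg.norm(h_exact)
assert rel_l2 < 1e-3, "verification case failed tolerance"
print(f"relative L2 error: {rel_l2:.2e}")
```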

  12. Verification and quality control of routine hematology analyzers.

    Science.gov (United States)

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise, among others: precision, accuracy, comparability, carryover, background and linearity throughout the expected range of results. Yet, which standard should be met or which verification limit should be used is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. Therefore (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods on quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.
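    As an illustration of one item in such a verification plan, within-run precision is commonly expressed as a coefficient of variation over replicate measurements of a control sample. The replicate values and the 3% limit below are hypothetical, since the paper stresses that verification limits are at the discretion of the laboratory specialist:

```python
import statistics

def cv_percent(replicates):
    """Within-run precision as coefficient of variation (%)."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical replicate WBC counts (10^9/L) on one control sample
wbc = [6.1, 6.0, 6.2, 6.1, 5.9, 6.0, 6.1, 6.2, 6.0, 6.1]
cv = cv_percent(wbc)
print(f"WBC CV = {cv:.2f}%")
assert cv < 3.0, "exceeds the (illustrative) verification limit for WBC precision"
```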

  13. DarcyTools, Version 2.1. Verification and validation

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Urban [Computer-aided Fluid Engineering AB, Norrkoeping (Sweden)

    2004-03-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured media in mind is a fractured rock and the porous media the soil cover on the top of the rock; it is hence groundwater flows, which is the class of flows in mind. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparisons with analytical solutions for idealized situations. The five verification groups, thus reflect the main areas of recent developments. The present report will focus on the Verification and Validation of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods, Equations and Demo Simulations. - User's Guide. The objective of this report is to compile all verification and validation studies that have been carried out so far. After some brief introductory sections, all cases will be reported in Appendix A (verification cases) and Appendix B (validation cases)

  14. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for iterative reconstruction of nonnegative sparse signals using parity check matrices of low-density parity check (LDPC) codes as measurement matrices. The proposed algorithm can be considered as an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm. Simulation resul...
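    The verification mechanism that the authors incorporate can be illustrated in isolation: with a 0/1 measurement matrix, a zero measurement verifies all of its neighbors as zero, and a measurement with a single unverified neighbor determines that neighbor's value. The toy decoder below shows only this verification idea, not the interval-passing part, and the matrix and signal are made up:

```python
import numpy as np

def verification_decode(A, y, max_iter=50):
    """Toy verification decoder for nonnegative sparse x with A @ x = y,
    where A is a 0/1 (LDPC-like) measurement matrix."""
    m, n = A.shape
    x = np.full(n, np.nan)           # NaN = not yet verified
    for _ in range(max_iter):
        progress = False
        for i in range(m):
            support = np.flatnonzero(A[i])
            unknown = [j for j in support if np.isnan(x[j])]
            if not unknown:
                continue
            if y[i] == 0:            # zero measurement: all neighbors are 0
                x[unknown] = 0.0
                progress = True
            elif len(unknown) == 1:  # single unknown: its value is determined
                known_sum = sum(x[j] for j in support if not np.isnan(x[j]))
                x[unknown[0]] = y[i] - known_sum
                progress = True
        if not progress:
            break
    return x

A = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1],
              [1, 0, 0, 1]])
x_true = np.array([0.0, 3.0, 0.0, 0.0])
y = A @ x_true
x_hat = verification_decode(A, y)
print(x_hat)                         # recovers x_true
```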

  15. Multi-canister overpack project - verification and validation, MCNP 4A

    International Nuclear Information System (INIS)

    Goldmann, L.H.

    1997-01-01

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to ease the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison on the new output file and the old output file. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error
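    The install-time check described, rerunning sample problems and diffing new output against reference output while tolerating benign differences, can be sketched as follows; the token filter and the file contents in the demo are hypothetical:

```python
import difflib
import os
import tempfile

def compare_outputs(new_path, old_path, ignore_tokens=("date", "time", "version")):
    """Flag differences between a fresh sample-problem output and a
    reference output, skipping lines that legitimately vary per run."""
    def keep(line):
        return not any(tok in line.lower() for tok in ignore_tokens)
    with open(new_path) as f:
        new_lines = [ln for ln in f.read().splitlines() if keep(ln)]
    with open(old_path) as f:
        old_lines = [ln for ln in f.read().splitlines() if keep(ln)]
    return list(difflib.unified_diff(old_lines, new_lines, lineterm=""))

# Tiny self-contained demo with two scratch files that differ only in a
# run date, which the filter ignores
tmp = tempfile.mkdtemp()
for name, date in (("new.out", "2024"), ("old.out", "1999")):
    with open(os.path.join(tmp, name), "w") as f:
        f.write(f"run date {date}\n total tally = 1.2345\n")
diff = compare_outputs(os.path.join(tmp, "new.out"), os.path.join(tmp, "old.out"))
print("clean" if not diff else diff)
```

    Any remaining diff still needs a closer look by hand, since it may be benign (e.g. last-digit roundoff on a different platform).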

  16. Key Nuclear Verification Priorities: Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. 

  17. Key Nuclear Verification Priorities - Safeguards and Beyond

    International Nuclear Information System (INIS)

    Carlson, J.

    2010-01-01

    In addressing nuclear verification priorities, we should look beyond the current safeguards system. Non-proliferation, which the safeguards system underpins, is not an end in itself, but an essential condition for achieving and maintaining nuclear disarmament. Effective safeguards are essential for advancing disarmament, and safeguards issues, approaches and techniques are directly relevant to the development of future verification missions. The extent to which safeguards challenges are successfully addressed - or otherwise - will impact not only on confidence in the safeguards system, but on the effectiveness of, and confidence in, disarmament verification. To identify the key nuclear verification priorities, we need to consider the objectives of verification, and the challenges to achieving these. The strategic objective of IAEA safeguards might be expressed as: To support the global nuclear non-proliferation regime by: - Providing credible assurance that states are honouring their safeguards commitments - thereby removing a potential motivation to proliferate; and - Early detection of misuse of nuclear material and technology - thereby deterring proliferation by the risk of early detection, enabling timely intervention by the international community. Or to summarise - confidence-building, detection capability, and deterrence. These will also be essential objectives for future verification missions. The challenges to achieving these involve a mix of political, technical and institutional dimensions. Confidence is largely a political matter, reflecting the qualitative judgment of governments. Clearly assessments of detection capability and deterrence have a major impact on confidence. Detection capability is largely thought of as 'technical', but also involves issues of legal authority, as well as institutional issues. Deterrence has both political and institutional aspects - including judgments on risk of detection and risk of enforcement action being taken. 

  18. On the organisation of program verification competitions

    NARCIS (Netherlands)

    Huisman, Marieke; Klebanov, Vladimir; Monahan, Rosemary; Klebanov, Vladimir; Beckert, Bernhard; Biere, Armin; Sutcliffe, Geoff

    In this paper, we discuss the challenges that have to be addressed when organising program verification competitions. Our focus is on competitions for verification systems where the participants both formalise an informally stated requirement and (typically) provide some guidance for the tool to

  19. Simulation environment based on the Universal Verification Methodology

    International Nuclear Information System (INIS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach of verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer, by setting the verification goals, starts with a structured plan. Those goals are targeted further by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, the additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, introduces briefly UVM and presents a set of tips and advices applicable at different stages of the verification process-cycle.
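    The coverage-driven flow described, constrained-random stimulus, a scoreboard against a reference model, and coverage bins that expose non-exercised functionality, is normally written in SystemVerilog/UVM; a language-neutral sketch of the same ideas, with a made-up saturating-adder DUT, looks like this:

```python
import random

# Hypothetical DUT: a saturating 4-bit adder
def dut_add(a, b):
    return min(a + b, 15)

def ref_model(a, b):          # golden reference used by the scoreboard
    return min(a + b, 15)

random.seed(1)
coverage = {"no_sat": False, "sat": False, "zero_inputs": False}
for _ in range(200):          # constrained-random stimulus
    a, b = random.randint(0, 15), random.randint(0, 15)
    out = dut_add(a, b)
    assert out == ref_model(a, b), "scoreboard mismatch"
    coverage["sat" if a + b > 15 else "no_sat"] = True
    if a == b == 0:
        coverage["zero_inputs"] = True

# Unhit bins point at functionality the stimulus never exercised
print("coverage:", coverage)
```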

  20. HTGR analytical methods and design verification

    International Nuclear Information System (INIS)

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier

  1. IMRT plan verification in radiotherapy

    International Nuclear Information System (INIS)

    Vlk, P.

    2006-01-01

    This article describes the procedure for verification of IMRT (intensity modulated radiation therapy) plans used at the Oncological Institute of St. Elisabeth in Bratislava. It contains a basic description of IMRT technology, the deployment of the IMRT planning system CORVUS 6.0 and the MIMIC (multileaf intensity modulating collimator) device, and the overall process of verifying the created plan. The aim of verification is, in particular, good control of the functions of MIMIC and an evaluation of the overall reliability of IMRT planning. (author)

  2. Automatic Verification of Timing Constraints for Safety Critical Space Systems

    Science.gov (United States)

    Fernandez, Javier; Parra, Pablo; Sanchez Prieto, Sebastian; Polo, Oscar; Bernat, Guillem

    2015-09-01

    This paper presents an automatic verification process, focusing on the verification of scheduling analysis parameters. The proposal is part of a process, based on Model Driven Engineering, to automate the verification and validation of on-board satellite software. The process is implemented in the software control unit of the Energetic Particle Detector, a payload of the Solar Orbiter mission. From the design model, a scheduling analysis model and its verification model are generated. The verification is defined as constraints in the form of finite timed automata. When the system is deployed on target, verification evidence is extracted at instrumented points. The constraints are fed with this evidence; if any constraint is not satisfied by the on-target evidence, the scheduling analysis is not valid.
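    The kind of check described, feeding constraints with evidence from instrumentation points, reduces in the simplest case to validating release jitter and response times against a period and deadline; the timestamps and bounds below are invented for illustration:

```python
# Hypothetical evidence: (release_time_ms, completion_time_ms) pairs captured
# at instrumentation points for one periodic task
evidence = [(0.0, 3.1), (10.0, 13.5), (20.0, 22.9), (30.0, 34.2)]
PERIOD_MS, DEADLINE_MS = 10.0, 5.0

violations = []
for k, (release, done) in enumerate(evidence):
    if abs(release - k * PERIOD_MS) > 1e-9:       # release-jitter constraint
        violations.append((k, "release"))
    if done - release > DEADLINE_MS:              # response-time constraint
        violations.append((k, "deadline"))

print("schedule valid" if not violations else f"violations: {violations}")
```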

  3. A framework for nuclear agreement and verification

    International Nuclear Information System (INIS)

    Ali, A.

    1991-01-01

    This chapter assesses the prospects for a nuclear agreement between India and Pakistan. The chapter opens with a review of past and present political environments of the two countries. The discussion proceeds to describe the linkage of global arms control agreements, prospects for verification of a Comprehensive Test Ban Treaty, the role of nuclear power in any agreements, the intrusiveness of verification, and possible post-proliferation agreements. Various monitoring and verification technologies are described (mainly satellite oriented). The chapter concludes with an analysis of the likelihood of persuading India and Pakistan to agree to a nonproliferation arrangement

  4. Verification of Many-Qubit States

    Directory of Open Access Journals (Sweden)

    Yuki Takeuchi

    2018-06-01

    Full Text Available Verification is a task to check whether a given quantum state is close to an ideal state or not. In this paper, we show that a variety of many-qubit quantum states can be verified with only sequential single-qubit measurements of Pauli operators. First, we introduce a protocol for verifying ground states of Hamiltonians. We next explain how to verify quantum states generated by a certain class of quantum circuits. We finally propose an adaptive test of stabilizers that enables the verification of all polynomial-time-generated hypergraph states, which include output states of the Bremner-Montanaro-Shepherd-type instantaneous quantum polynomial time (IQP) circuits. Importantly, we do not make any assumption that independent and identically distributed copies of the same state are given: our protocols work even if some highly complicated entanglement is created among copies in any artificial way. As applications, we consider the verification of the quantum computational supremacy demonstration with IQP models, and verifiable blind quantum computing.

  5. Formal Verification -26 ...

    Indian Academy of Sciences (India)

    by testing of the components and successful testing leads to the software being ... Formal verification is based on formal methods which are mathematically based ..... scenario under which a similar error could occur. There are various other ...

  6. Characterization of double face adhesive sheets for ceramic tile installation

    International Nuclear Information System (INIS)

    Nascimento, Otavio L.; Mansur, Alexandra A.P.; Mansur, Herman S.

    2011-01-01

    The main goal of this work was the characterization of an innovative ceramic tile installation product based on double face adhesive sheets. Density, hardness, tensile strength, X-ray diffraction, infrared spectroscopy, and scanning electron microscopy coupled with energy dispersive spectroscopy assays were conducted. The results are in agreement with some manufacturer specifications, and the information obtained will be crucial in the analysis of the durability and stability of ceramic tile systems installed with this new product. (author)

  7. Neural synchronization during face-to-face communication.

    Science.gov (United States)

    Jiang, Jing; Dai, Bohan; Peng, Danling; Zhu, Chaozhe; Liu, Li; Lu, Chunming

    2012-11-07

    Although the human brain may have evolutionarily adapted to face-to-face communication, other modes of communication, e.g., telephone and e-mail, increasingly dominate our modern daily life. This study examined the neural difference between face-to-face communication and other types of communication by simultaneously measuring two brains using a hyperscanning approach. The results showed a significant increase in the neural synchronization in the left inferior frontal cortex during a face-to-face dialog between partners but none during a back-to-back dialog, a face-to-face monologue, or a back-to-back monologue. Moreover, the neural synchronization between partners during the face-to-face dialog resulted primarily from the direct interactions between the partners, including multimodal sensory information integration and turn-taking behavior. The communicating behavior during the face-to-face dialog could be predicted accurately based on the neural synchronization level. These results suggest that face-to-face communication, particularly dialog, has special neural features that other types of communication do not have and that the neural synchronization between partners may underlie successful face-to-face communication.

  8. Verification of wet blasting decontamination technology

    International Nuclear Information System (INIS)

    Matsubara, Sachito; Murayama, Kazunari; Yoshida, Hirohisa; Igei, Shigemitsu; Izumida, Tatsuo

    2013-01-01

    Macoho Co., Ltd. participated in the projects 'Decontamination Verification Test FY 2011 by the Ministry of the Environment' and 'Decontamination Verification Test FY 2011 by the Cabinet Office,' and we performed verification tests of a wet blasting technology for the decontamination of rubble and roads contaminated by the accident at the Fukushima Daiichi Nuclear Power Plant of the Tokyo Electric Power Company. As a result of the verification tests, the wet blasting decontamination technology achieved a decontamination rate of 60-80% for concrete paving, interlocking blocks, and dense-graded asphalt pavement when applied to road decontamination. When applied to rubble decontamination, the decontamination rate was 50-60% for gravel and approximately 90% for concrete and wood. Cs-134 and Cs-137 were found to attach to the fine sludge scraped off the decontamination object, and the sludge was separated from the abrasives by wet cyclone classification: the activity concentration of the abrasives is 1/30 or less that of the sludge. This result shows that the abrasives can be reused without problems when the wet blasting decontamination technology is used. (author)

  9. Combining Task Execution and Background Knowledge for the Verification of Medical Guidelines

    Science.gov (United States)

    Hommersom, Arjen; Groot, Perry; Lucas, Peter; Balser, Michael; Schmitt, Jonathan

    The use of a medical guideline can be seen as the execution of computational tasks, sequentially or in parallel, in the face of patient data. It has been shown that many such guidelines can be represented as a 'network of tasks', i.e., as a number of steps, each with a specific function or goal. To investigate the quality of such guidelines we propose a formalization of the criteria for good practice medicine with which a guideline should comply. We use this theory in conjunction with medical background knowledge to verify the quality of a guideline dealing with diabetes mellitus type 2, using the interactive theorem prover KIV. Verification using task execution and background knowledge is a novel approach to quality checking of medical guidelines.

  10. K Basins Field Verification Program

    International Nuclear Information System (INIS)

    Booth, H.W.

    1994-01-01

    The Field Verification Program establishes a uniform and systematic process to ensure that technical information depicted on selected engineering drawings accurately reflects the actual existing physical configuration. This document defines the Field Verification Program necessary to perform the field walkdown and inspection process that identifies the physical configuration of the systems required to support the mission objectives of K Basins. This program is intended to provide an accurate accounting of the actual field configuration by documenting the as-found information on a controlled drawing.

  11. Engineering drawing field verification program. Revision 3

    International Nuclear Information System (INIS)

    Ulk, P.F.

    1994-01-01

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation being performed, and documenting the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification, for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer-aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented.

  12. Verification of Open Interactive Markov Chains

    OpenAIRE

    Brazdil, Tomas; Hermanns, Holger; Krcal, Jan; Kretinsky, Jan; Rehak, Vojtech

    2012-01-01

    Interactive Markov chains (IMC) are compositional behavioral models extending both labeled transition systems and continuous-time Markov chains. IMC pair modeling convenience - owed to compositionality properties - with effective verification algorithms and tools - owed to Markov properties. Thus far, however, IMC verification did not consider compositionality properties, but considered closed systems. This paper discusses the evaluation of IMC in an open and thus compositional interpretation....

  13. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    Science.gov (United States)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  14. Face Gear Technology for Aerospace Power Transmission Progresses

    Science.gov (United States)

    2005-01-01

    the effects of manufacturing process improvements on the operating characteristics of face gears. The program is being conducted with McDonnell Douglas Helicopter Co., Lucas Western Inc., the University of Illinois at Chicago, and a NASA/U.S. Army team. The goal of the project is to develop the grinding process, experimentally verify the improvement in face gear fatigue life, and conduct a full-scale helicopter transmission test. The theory and methodology to grind face gears have been completed, and manufacture of the test hardware is ongoing. Experimental verification on test hardware is scheduled to begin in fiscal 1996.

  15. Infrared pre-drying and dry-dehulling of walnuts for improved processing efficiency and product quality

    Science.gov (United States)

    The walnut industry is faced with an urgent need to improve post-harvest processing efficiency, particularly drying and dehulling operations. This research investigated the feasibility of dry-dehulling and infrared (IR) pre-drying of walnuts for improved processing efficiency and dried product quali...

  16. Local gradient Gabor pattern (LGGP) with applications in face recognition, cross-spectral matching, and soft biometrics

    Science.gov (United States)

    Chen, Cunjian; Ross, Arun

    2013-05-01

    Researchers in face recognition have been using Gabor filters for image representation due to their robustness to complex variations in expression and illumination. Numerous methods have been proposed to model the output of filter responses by employing either local or global descriptors. In this work, we propose a novel but simple approach for encoding gradient information on Gabor-transformed images to represent the face, which can be used for identity, gender and ethnicity assessment. Extensive experiments on the standard face benchmark FERET (Visible versus Visible), as well as the heterogeneous face dataset HFB (Near-infrared versus Visible), suggest that the matching performance of the proposed descriptor is comparable to state-of-the-art descriptor-based approaches in face recognition applications. Furthermore, the same feature set is used in the framework of a Collaborative Representation Classification (CRC) scheme for deducing soft biometric traits such as gender and ethnicity from face images in the AR, Morph and CAS-PEAL databases.
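    The pipeline described in this abstract (Gabor filtering followed by gradient encoding) can be sketched in a few lines of NumPy. This is an illustrative toy, not the authors' LGGP implementation: the kernel parameters, the four orientations, and the one-bit-per-orientation encoding are all assumptions chosen for brevity.

    ```python
    import numpy as np

    def gabor_kernel(size=15, sigma=3.0, theta=0.0, lam=6.0, gamma=0.5):
        """Real part of a Gabor kernel (illustrative parameter choices)."""
        half = size // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        xr = x * np.cos(theta) + y * np.sin(theta)
        yr = -x * np.sin(theta) + y * np.cos(theta)
        envelope = np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
        return envelope * np.cos(2 * np.pi * xr / lam)

    def local_gradient_pattern(image, thetas=(0, np.pi/4, np.pi/2, 3*np.pi/4)):
        """Filter with several Gabor orientations, then encode the sign of the
        local gradient direction of each response as one bit per pixel."""
        codes = []
        for theta in thetas:
            k = gabor_kernel(theta=theta)
            # circular convolution via FFT; k is zero-padded to the image shape
            resp = np.real(np.fft.ifft2(np.fft.fft2(image) *
                                        np.fft.fft2(k, image.shape)))
            gy, gx = np.gradient(resp)
            codes.append((np.arctan2(gy, gx) > 0).astype(np.uint8))
        # pack one bit per orientation into a single code image (0..15 here)
        return sum(c << i for i, c in enumerate(codes))
    ```

    The resulting per-pixel codes would typically be histogrammed over a spatial grid and the histograms concatenated into the final face descriptor.
    
    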

  17. Design verification for large reprocessing plants (Proposed procedures)

    International Nuclear Information System (INIS)

    Rolandi, G.

    1988-07-01

    In the 1990s, four large commercial reprocessing plants will progressively come into operation. If an effective and efficient safeguards system is to be applied to these large and complex plants, several important factors have to be considered. One of these factors, addressed in the present report, concerns plant design verification. Design verification provides an overall assurance of plant measurement data. To this end design verification, although limited to the safeguards aspects of the plant, must be a systematic activity, which starts during the design phase, continues during the construction phase and is particularly performed during the various steps of the plant's commissioning phase. The detailed procedures for design information verification on commercial reprocessing plants must be defined within the frame of the general provisions set forth in INFCIRC/153 for any type of safeguards-related activities and specifically for design verification. The present report is intended as a preliminary contribution on a purely technical level, and focuses on the problems within the Agency. For the purpose of the present study the most complex case was assumed: i.e. a safeguards system based on conventional materials accountancy, accompanied both by special input and output verification and by some form of near-real-time accountancy involving in-process inventory taking, based on authenticated operator's measurement data. C/S measures are also foreseen, where necessary to supplement the accountancy data. A complete 'design verification' strategy comprises: informing the Agency of any changes in the plant system which are defined as 'safeguards relevant'; and reverification by the Agency, upon receiving notice from the Operator of any changes, of the 'design information'. 13 refs

  18. Material integrity verification radar

    International Nuclear Information System (INIS)

    Koppenjan, S.K.

    1999-01-01

    The International Atomic Energy Agency (IAEA) has the need for verification of 'as-built' spent fuel dry storage containers and other concrete structures. The IAEA has tasked the Special Technologies Laboratory (STL) to fabricate, test, and deploy a stepped-frequency Material Integrity Verification Radar (MIVR) system to nondestructively verify the internal construction of these containers. The MIVR system is based on previously deployed high-frequency, ground penetrating radar (GPR) systems that have been developed by STL for the U.S. Department of Energy (DOE). Whereas GPR technology utilizes microwave radio frequency energy to create subsurface images, MIVR is a variation for which the medium is concrete instead of soil. The purpose is to nondestructively verify the placement of concrete-reinforcing materials, pipes, inner liners, and other attributes of the internal construction. The MIVR system underwent an initial field test on CANDU reactor spent fuel storage canisters at Atomic Energy of Canada Limited (AECL), Chalk River Laboratories, Ontario, Canada, in October 1995. A second field test at the Embalse Nuclear Power Plant in Embalse, Argentina, was completed in May 1996. The DOE GPR also was demonstrated at the site. Data collection and analysis were performed for the Argentine National Board of Nuclear Regulation (ENREN). IAEA and the Brazilian-Argentine Agency for the Control and Accounting of Nuclear Material (ABACC) personnel were present as observers during the test. Reinforcing materials were evident in the color, two-dimensional images produced by the MIVR system. A continuous pattern of reinforcing bars was evident and accurate estimates of the spacing, depth, and size were made. The potential uses for safeguard applications were jointly discussed. The MIVR system, as successfully demonstrated in the two field tests, can be used as a design verification tool for IAEA safeguards. A deployment of MIVR for Design Information Questionnaire (DIQ

  19. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Andersen, J.H.; Skou, A.

    1995-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  20. Specification and Automated Verification of Real-Time Behaviour

    DEFF Research Database (Denmark)

    Andersen, J.H.; Kristensen, C.H.; Skou, A.

    1996-01-01

    In this paper we sketch a method for specification and automatic verification of real-time software properties.

  1. Formal Verification of Continuous Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer

    2012-01-01

    and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic...... losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due...

  2. Biometric Technologies and Verification Systems

    CERN Document Server

    Vacca, John R

    2007-01-01

    Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior

  3. Runtime Verification Through Forward Chaining

    Directory of Open Access Journals (Sweden)

    Alan Perotti

    2014-12-01

    In this paper we present a novel rule-based approach for Runtime Verification of FLTL properties over finite but expanding traces. Our system exploits Horn clauses in implication form and relies on a forward-chaining monitoring algorithm. This approach avoids the branching structure and exponential complexity typical of tableaux-based formulations, creating monitors with a single state and a fixed number of rules. This yields a fast and scalable tool for Runtime Verification: we present the technical details together with a working implementation.
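    The core idea in this abstract, a monitor that saturates a fact set with Horn-clause rules as each event arrives, can be sketched as follows. This is a minimal illustration, not the authors' FLTL monitor: the rule encoding (body atoms, head atom), the `Monitor` class, and the example property ("ack" and "nack" are mutually exclusive responses) are all hypothetical.

    ```python
    def forward_chain(facts, rules):
        """Saturate the fact set: each rule is a Horn clause in implication
        form, (body_atoms, head_atom); fire until a fixed point is reached."""
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for body, head in rules:
                if head not in facts and all(a in facts for a in body):
                    facts.add(head)
                    changed = True
        return facts

    class Monitor:
        """Feed events one at a time; the monitor keeps a single cumulative
        state (the fact set) and a fixed number of rules, as in the paper."""
        def __init__(self, rules):
            self.rules = rules
            self.facts = set()

        def step(self, event):
            self.facts.add(event)
            self.facts = forward_chain(self.facts, self.rules)
            return "violation" not in self.facts  # True while the trace is safe
    ```

    Because forward chaining is monotone, a violation, once derived, persists for the rest of the expanding trace, which matches the usual monitoring semantics for safety properties.
    
    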

  4. Are all types of expertise created equal? Car experts use different spatial frequency scales for subordinate categorization of cars and faces.

    Science.gov (United States)

    Harel, Assaf; Bentin, Shlomo

    2013-01-01

    A much-debated question in object recognition is whether expertise for faces and expertise for non-face objects utilize common perceptual information. We investigated this issue by assessing the diagnostic information required for different types of expertise. Specifically, we asked whether face categorization and expert car categorization at the subordinate level rely on the same spatial frequency (SF) scales. Fifteen car experts and fifteen novices performed a category verification task with spatially filtered images of faces, cars, and airplanes. Images were categorized based on their basic (e.g. "car") and subordinate level (e.g. "Japanese car") identity. The effect of expertise was not evident when objects were categorized at the basic level. However, when the car experts categorized faces and cars at the subordinate level, the two types of expertise required different kinds of SF information. Subordinate categorization of faces relied on low SFs more than on high SFs, whereas subordinate expert car categorization relied on high SFs more than on low SFs. These findings suggest that expertise in the recognition of objects and expertise in the recognition of faces do not utilize the same type of information. Rather, different types of expertise require different types of diagnostic visual information.

  5. Current status of verification practices in clinical biochemistry in Spain.

    Science.gov (United States)

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
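    Two of the criteria surveyed above, reference limits and the delta check, can be combined into a simple rule-based autoverification sketch. The limits and delta thresholds below are illustrative placeholders, not values from the survey; a real laboratory would configure its own.

    ```python
    # Hypothetical verification limits per analyte (illustrative only).
    REFERENCE = {"glucose": (70, 110),    # mg/dL
                 "potassium": (3.5, 5.1)} # mmol/L
    DELTA_LIMIT = {"glucose": 0.25,       # maximum relative change
                   "potassium": 0.15}     # vs. the patient's previous result

    def autoverify(test, value, previous=None):
        """Return (released, reasons): release automatically only if every
        rule passes; otherwise hold the result for manual review."""
        reasons = []
        low, high = REFERENCE[test]
        if not (low <= value <= high):
            reasons.append("outside reference limits")
        if previous is not None and previous > 0:
            if abs(value - previous) / previous > DELTA_LIMIT[test]:
                reasons.append("delta check failed")
        return (not reasons, reasons)
    ```

    A production system would add the survey's other criteria (internal QC status, instrument flags, sample-deterioration flags, cross-parameter concordance) as further rules in the same hold-or-release pattern.
    
    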

  6. PANIC: A General-purpose Panoramic Near-infrared Camera for the Calar Alto Observatory

    Science.gov (United States)

    Cárdenas Vázquez, M.-C.; Dorner, B.; Huber, A.; Sánchez-Blanco, E.; Alter, M.; Rodríguez Gómez, J. F.; Bizenberger, P.; Naranjo, V.; Ibáñez Mengual, J.-M.; Panduro, J.; García Segura, A. J.; Mall, U.; Fernández, M.; Laun, W.; Ferro Rodríguez, I. M.; Helmling, J.; Terrón, V.; Meisenheimer, K.; Fried, J. W.; Mathar, R. J.; Baumeister, H.; Rohloff, R.-R.; Storz, C.; Verdes-Montenegro, L.; Bouy, H.; Ubierna, M.; Fopp, P.; Funke, B.

    2018-02-01

    PANIC is the new PAnoramic Near-Infrared Camera for Calar Alto and is a project jointly developed by the MPIA in Heidelberg, Germany, and the IAA in Granada, Spain, for the German-Spanish Astronomical Center at Calar Alto Observatory (CAHA; Almería, Spain). This new instrument works with the 2.2 m and 3.5 m CAHA telescopes, covering a field of view of 30 × 30 arcmin and 15 × 15 arcmin, respectively, with a sampling of 4096 × 4096 pixels. It is designed for the spectral bands from Z to Ks, and can also be equipped with narrowband filters. The instrument was delivered to the observatory in 2014 October and was commissioned at both telescopes between 2014 November and 2015 June. Science verification at the 2.2 m telescope was carried out during the second semester of 2015 and the instrument is now in full operation. We describe the design, assembly, integration, and verification process, the final laboratory tests, and the PANIC instrument performance. We also present first-light data obtained during the commissioning and preliminary results of the scientific verification. The final optical model and the theoretical performance of the camera were updated according to the as-built data. The laboratory tests were made with a star simulator. Finally, the commissioning phase was done at both telescopes to validate the camera's real performance on sky. The final laboratory tests confirmed the expected camera performance, complying with the scientific requirements. The commissioning phase on sky has been accomplished.

  7. Complementary technologies for verification of excess plutonium

    International Nuclear Information System (INIS)

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-01-01

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies (high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy) are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630-670 keV region of the emitted gamma-ray spectrum to determine the ratio of 240Pu to 239Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters the verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime.

  8. Improving Face Verification in Photo Albums by Combining Facial Recognition and Metadata With Cross-Matching

    Science.gov (United States)

    2017-12-01

    ...with our method. The third chapter presents the description and implementation of our approach. We provide a definition of the dataset, the... a means of classification using the shape and the texture. [Figure 7: 3D Alignment Pipeline, adapted from [20].] In 2014, Facebook announced... Stating that face recognition consists of four main stages, detect ⟹ align ⟹ represent ⟹ classify, the Facebook team's intent is to revisit the...

  9. A Synthesized Framework for Formal Verification of Computing Systems

    Directory of Open Access Journals (Sweden)

    Nikola Bogunovic

    2003-12-01

    The design process of computing systems has gradually evolved to a level that encompasses formal verification techniques. However, the integration of formal verification techniques into a methodical design procedure suffers from many inherent misconceptions and problems. The paper explicates the discrepancy between the real system implementation and the abstracted model that is actually used in the formal verification procedure. Particular attention is paid to the seamless integration of all phases of the verification procedure, which encompasses the definition of the specification language and the denotation and execution of the conformance relation between the abstracted model and its intended behavior. The concealed obstacles are exposed, computationally expensive steps identified, and possible improvements proposed.

  10. Independent verification in operations at nuclear power plants

    International Nuclear Information System (INIS)

    Donderi, D.C.; Smiley, A.; Ostry, D.J.; Moray, N.P.

    1995-09-01

    A critical review of approaches to independent verification in operations used in nuclear power plant quality assurance programs in other countries was conducted for this study. This report identifies the uses of independent verification and provides an assessment of the effectiveness of the various approaches. The findings indicate that at Canadian nuclear power plants as much, if not more, independent verification is performed than at power plants in the other countries included in the study. Additional requirements in this area are not proposed for Canadian stations. (author)

  11. Famous face recognition, face matching, and extraversion.

    Science.gov (United States)

    Lander, Karen; Poyarekar, Siddhi

    2015-01-01

    It has been previously established that extraverts who are skilled at interpersonal interaction perform significantly better than introverts on a face-specific recognition memory task. In our experiment we further investigate the relationship between extraversion and face recognition, focusing on famous face recognition and face matching. Results indicate that more extraverted individuals perform significantly better on an upright famous face recognition task and show significantly larger face inversion effects. However, our results did not find an effect of extraversion on face matching or inverted famous face recognition.

  12. Verification in Referral-Based Crowdsourcing

    Science.gov (United States)

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  13. Nuclear Data Verification and Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  14. Verification and quality control of routine hematology analyzers

    NARCIS (Netherlands)

    Vis, J Y; Huisman, A

    2016-01-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items which comprise among others: precision, accuracy, comparability, carryover, background and

  15. Solid waste operations complex engineering verification program plan

    International Nuclear Information System (INIS)

    Bergeson, C.L.

    1994-01-01

    This plan supersedes, but does not replace, the previous Waste Receiving and Processing/Solid Waste Engineering Development Program Plan. In doing this, it does not repeat the basic definitions of the various types or classes of development activities, nor provide the rigorous written description of each facility and assign the equipment to development classes. The methodology described in the previous document is still valid and was used to determine the types of verification efforts required. This Engineering Verification Program Plan will be updated on a yearly basis. This EVPP provides programmatic definition of all engineering verification activities for the following SWOC projects: (1) Project W-026 - Waste Receiving and Processing Facility Module 1; (2) Project W-100 - Waste Receiving and Processing Facility Module 2A; (3) Project W-112 - Phase V Storage Facility; and (4) Project W-113 - Solid Waste Retrieval. No engineering verification activities are defined for Project W-112, as no verification work was identified. The Acceptance Test Procedures/Operational Test Procedures will be part of each project's Title III operation test efforts. The ATPs/OTPs are not covered by this EVPP.

  16. 21 CFR 21.44 - Verification of identity.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Verification of identity. 21.44 Section 21.44 Food... Verification of identity. (a) An individual seeking access to records in a Privacy Act Record System may be... identity. The identification required shall be suitable considering the nature of the records sought. No...

  17. NASA/IPAC Infrared Science Archive (IRSA) in the 2020s.

    Science.gov (United States)

    Desai, Vandana; Rebull, Luisa M.; IRSA Team

    2018-06-01

    I will discuss challenges faced by IRSA in the next decade due to changes in our user base: the dissolution of wavelength boundaries among astronomers, and the education of astronomers as data scientists. While the fraction of astronomers who use infrared data has increased drastically in the era of Spitzer, Herschel, and WISE, most people who do science with those data sets don't use infrared data exclusively or identify as "infrared astronomers". Our archive, and others, need to be responsive to the needs of an increasingly multiwavelength community, and to those exploring time-domain astronomy. That means making the archives interlink seamlessly, while preserving expert knowledge so that data don't get misused. As astronomical data sets grow in volume, users will increasingly expect server-side resources, including both storage and analysis resources. These expectations come with a host of ramifications, from cost to security. Our archives must be built to satisfy the needs of both the power user and the beginning astronomer. I will discuss how IRSA plans to meet the evolving needs of our user community.

  18. A methodology for the rigorous verification of plasma simulation codes

    Science.gov (United States)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, a mathematical exercise targeted at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, targeted at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
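
    The solution-verification step described above can be illustrated with a short sketch. The following is a hypothetical Python example (not from the talk) of Richardson extrapolation: given results from two grid resolutions and the scheme's formal order of accuracy, it estimates the grid-converged value and the discretization error of the fine-grid solution.

```python
def richardson(f_coarse, f_fine, r, p):
    """Richardson extrapolation from two grid resolutions.

    r: grid refinement ratio (e.g. 2 for halving the spacing)
    p: formal order of accuracy of the scheme
    Returns the extrapolated value and an estimate of the
    discretization error of the fine-grid result.
    """
    f_exact = f_fine + (f_fine - f_coarse) / (r**p - 1.0)
    return f_exact, abs(f_exact - f_fine)

# Illustrative data: a second-order scheme (p = 2) on grids with
# spacing h and h/2, for a problem whose exact answer is 1.0.
h = 0.1
f_h = 1.0 + 4.0 * h**2          # coarse-grid result with O(h^2) error
f_h2 = 1.0 + 4.0 * (h / 2)**2   # fine-grid result

est, err = richardson(f_h, f_h2, r=2, p=2)
print(est)  # extrapolated value, very close to 1.0
```

    Because the manufactured error here is exactly of order p, the extrapolated value recovers the exact answer; on real simulation data the estimate is only asymptotically exact as the grid is refined.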

  19. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.

  20. Tolerance Verification of Micro and Nano Structures on Polycarbonate Substrates

    DEFF Research Database (Denmark)

    Gasparin, Stefania; Tosello, Guido; Hansen, Hans Nørgaard

    2010-01-01

    Micro and nano structures are an increasing challenge in terms of tolerance verification and process quality control: smaller dimensions lead to a smaller tolerance zone to be evaluated. This paper focuses on the verification of CD, DVD and HD-DVD nanoscale features. CD tolerance features…

  1. Face-to-Face Activities in Blended Learning

    DEFF Research Database (Denmark)

    Kjærgaard, Annemette

    While blended learning combines online and face-to-face teaching, research on blended learning has primarily focused on the role of technology and the opportunities it creates for engaging students. Less focus has been put on face-to-face activities in blended learning. This paper argues that it is not only the online activities in blended learning that provide new opportunities for rethinking pedagogy in higher education; it is also imperative to reconsider the face-to-face activities when part of the learning is provided online. Based on a review of blended learning in business and management education, we identify what forms of teaching and learning are suggested to take place face-to-face when other activities are moved online. We draw on the Community of Inquiry framework to analyze how face-to-face activities contribute to a blended learning pedagogy and discuss the implications…

  2. Stability of transition to a world without nuclear weapons: Technical problems of verification

    International Nuclear Information System (INIS)

    Zhigalov, V.

    1998-01-01

    A serious psychological barrier to acceptance of the concept of achieving a nuclear-weapon-free world is the fear that one or more nations or extremist political groups might develop their own nuclear weapons. This is, in effect, a question of the stability of the nuclear-weapon-free world. From this point of view, a maximally effective system of verification is an absolute necessity. This system must ensure detection of so-called undeclared nuclear activity at an early stage. Scientists of Russian nuclear centers are working today on solving this problem. This paper is a comprehensive attempt to analyze the technical and organizational aspects of the problems of transition to a nuclear-weapon-free world, setting aside the difficulties of resolving purely political problems.

  3. Standard Verification System (SVS)

    Data.gov (United States)

    Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...

  4. Histochemistry and infrared microspectroscopy of lignified tissue in young stems of Struthanthus vulgaris Mart.

    Directory of Open Access Journals (Sweden)

    Gisely de Lima Oliveira

    In this study, we aimed to identify lignified tissue in young stems of Struthanthus vulgaris Mart. by infrared microspectroscopy and histochemical methods as well as by fluorescence microscopy. Struthanthus vulgaris Mart. is a mistletoe species that belongs to the Loranthaceae family. A brief anatomical description was also carried out. The first step of the analysis was to prepare anatomical cross sections (20-30 µm) from young stems before and after treatment with 1% NaOH; this treatment was applied to release possible low-molecular-mass phenolic compounds. Safranin-astra blue was used to distinguish anatomical tissues, while the Wiesner test enabled verification of lignified pericyclic fibers. Infrared microspectroscopy confirmed the presence of lignin in this region according to the following spectral signals: 1600 (shoulder), 1511, 1453, 1338 and 1244 cm-1. Analysis of cross sections of young stems under fluorescence microscopy before and after treatment with 1% NaOH allowed us to confirm the presence of low-mass phenolic compounds in the region of the pericyclic fibers.
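
    A minimal sketch of how measured spectral signals could be checked against the lignin reference bands listed above. The tolerance value and the extra 1030 cm-1 peak in the example are illustrative assumptions, not from the study.

```python
# Lignin reference bands reported in the study, in cm^-1.
LIGNIN_BANDS = [1600, 1511, 1453, 1338, 1244]

def match_bands(measured_peaks, reference=LIGNIN_BANDS, tol=5):
    """For each reference band, report whether any measured peak
    lies within +/- tol cm^-1 of it (tol is an assumed tolerance)."""
    return {ref: any(abs(p - ref) <= tol for p in measured_peaks)
            for ref in reference}

# Hypothetical measured peak positions from a spectrum.
peaks = [1602, 1510, 1452, 1340, 1246, 1030]
print(all(match_bands(peaks).values()))  # True: every lignin band matched
```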

  5. Packaged low-level waste verification system

    International Nuclear Information System (INIS)

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-01-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW arriving at disposal sites. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL)

  6. Face recognition system and method using face pattern words and face pattern bytes

    Science.gov (United States)

    Zheng, Yufeng

    2014-12-23

    The present invention provides a novel system and method for identifying individuals and for face recognition utilizing facial features for face identification. The system and method of the invention comprise creating facial features or face patterns called face pattern words and face pattern bytes for face identification. The invention also provides for pattern recognition for identification other than face recognition. The invention further provides a means for identifying individuals based on visible and/or thermal images of those individuals by utilizing computer software implemented by instructions on a computer or computer system and a computer-readable medium containing instructions on a computer system for face recognition and identification.

  7. Spent Nuclear Fuel (SNF) Project Design Verification and Validation Process

    International Nuclear Information System (INIS)

    OLGUIN, L.J.

    2000-01-01

    This document provides a description of design verification and validation activities implemented by the Spent Nuclear Fuel (SNF) Project. During the execution of early design verification, a management assessment (Bergman, 1999) and external assessments on configuration management (Augustenburg, 1999) and testing (Loscoe, 2000) were conducted and identified potential uncertainties in the verification process. This led the SNF Chief Engineer to implement corrective actions to improve process and design products. This included Design Verification Reports (DVRs) for each subproject, validation assessments for testing, and verification of the safety function of systems and components identified in the Safety Equipment List to ensure that the design outputs were compliant with the SNF Technical Requirements. Although some activities are still in progress, the results of the DVR and associated validation assessments indicate that Project requirements for design verification are being effectively implemented. These results have been documented in subproject-specific technical documents (Table 2). Identified punch-list items are being dispositioned by the Project. As these remaining items are closed, the technical reports (Table 2) will be revised and reissued to document the results of this work

  8. The City’s Many Faces: Proceedings of the RAND Arroyo-MCWL-J8 UWG Urban Operations Conference, April 13-14 1999

    Science.gov (United States)

    1999-04-14

    [Abstract not indexed; the extracted text is a fragment of the report's acronym glossary, including FDA Food and Drug Administration, FEMA Federal Emergency Management Agency, FLIR Forward Looking Infrared, IPB Intelligence Preparation of the Battlefield, IR Infrared, WSO Weapons System Officer, WTI Weapons and Tactics Instructor, and Y2K Year 2000.]

  9. As-Built Verification Plan Spent Nuclear Fuel Canister Storage Building MCO Handling Machine

    International Nuclear Information System (INIS)

    SWENSON, C.E.

    2000-01-01

    This as-built verification plan outlines the methodology and responsibilities that will be implemented during the as-built field verification activity for the Canister Storage Building (CSB) MCO Handling Machine (MHM). This as-built verification plan covers the electrical portion of the construction performed by Power City under contract to Mowat. The as-built verifications will be performed in accordance with Administrative Procedure AP 6-012-00, Spent Nuclear Fuel Project As-Built Verification Plan Development Process, Revision 1. The results of the verification walkdown will be documented in a verification walkdown completion package, approved by the Design Authority (DA), and maintained in the CSB project files.

  10. 37 CFR 262.7 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a Performer may conduct a single audit of the Designated Agent upon reasonable notice and... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR CERTAIN ELIGIBLE...

  11. 40 CFR 1065.675 - CLD quench verification calculations.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false CLD quench verification calculations... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calculations and Data Requirements § 1065.675 CLD quench verification calculations. Perform CLD quench-check calculations as follows: (a) Perform a CLD analyzer quench...

  12. Analysis and Transformation Tools for Constrained Horn Clause Verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2014-01-01

    Several techniques and tools have been developed for verification of properties expressed as Horn clauses with constraints over a background theory (CHC). Current CHC verification tools implement intricate algorithms and are often limited to certain subclasses of CHC problems. Our aim in this work is to investigate the use of a combination of off-the-shelf techniques from the literature in analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.
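
    As a toy illustration of the Horn-clause setting (a hypothetical example, far simpler than the constrained problems and tools discussed above), the least model of a set of propositional Horn clauses can be computed bottom-up, and a safety query answered by checking whether an error atom is derivable:

```python
def least_model(facts, rules):
    """Naive bottom-up evaluation of propositional Horn clauses.

    facts: set of atoms that hold unconditionally
    rules: list of (body_atoms, head_atom) pairs, read as
           "head holds if every atom in body holds"
    Returns the least model (all derivable atoms).
    """
    model = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in model and all(b in model for b in body):
                model.add(head)
                changed = True
    return model

# Hypothetical safety query: is the "error" atom derivable?
facts = {"init"}
rules = [({"init"}, "reach1"),
         ({"reach1"}, "reach2"),
         ({"reach2", "bad_guard"}, "error")]  # guard is never derived
m = least_model(facts, rules)
print("error" in m)  # False: the error atom is unreachable
```

    Real CHC solvers work over clauses with theory constraints (arithmetic, arrays) rather than propositions, but the fixpoint intuition is the same.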

  13. Verification of the thermal design of electronic equipment

    Energy Technology Data Exchange (ETDEWEB)

    Hienonen, R.; Karjalainen, M.; Lankinen, R. [VTT Automation, Espoo (Finland). ProTechno

    1997-12-31

    The project "Verification of the thermal design of electronic equipment" studied the methodology to be followed in the verification of thermal design of electronic equipment. This project forms part of the "Cool Electronics" research programme funded by TEKES, the Finnish Technology Development Centre. This project was carried out jointly by VTT Automation, Lappeenranta University of Technology, Nokia Research Center and ABB Industry Oy VSD-Technology. The thermal design of electronic equipment has a significant impact on the cost, reliability, tolerance to different environments, selection of components and materials, and ergonomics of the product. This report describes the method for verification of thermal design. It assesses the goals set for thermal design, environmental requirements, technical implementation of the design, thermal simulation and modelling, and design qualification testing and the measurements needed. The verification method covers all packaging levels of electronic equipment from the system level to the electronic component level. The method described in this report can be used as part of the quality system of a corporation. The report includes information about the measurement and test methods needed in the verification process. Some measurement methods for the temperature, flow and pressure of air are described. (orig.) Published in Finnish VTT Julkaisuja 824. 22 refs.

  14. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    This article is devoted to the problem of software verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability and compatibility. The methods differ both in how they operate and in how they achieve their results. The article describes static and dynamic methods of software verification and pays particular attention to symbolic execution. In its review of static analysis, it discusses the deductive method and model-checking methods, emphasizing the pros and cons of each, and considers a classification of test techniques for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either on different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the number of initialized array elements in code verified by static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring and profiling are presented and analyzed, together with the kinds of tools that can be applied when using them.
    Based on this work a conclusion is drawn describing the most relevant problems of the analysis techniques, methods for their solution and…
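
    One of the static techniques surveyed above, abstract interpretation, can be sketched with a toy sign-domain analysis. This is a hypothetical example, not taken from the article: each integer is abstracted to its sign, and abstract operators over-approximate arithmetic so that properties can be proved without running the code.

```python
# Toy abstract interpretation over the sign domain {NEG, ZERO, POS, TOP}.
NEG, ZERO, POS, TOP = "neg", "zero", "pos", "top"  # TOP = sign unknown

def alpha(n):
    """Abstraction: map a concrete integer to its sign."""
    return NEG if n < 0 else ZERO if n == 0 else POS

def abs_mul(a, b):
    """Abstract multiplication: sound over-approximation of *."""
    if ZERO in (a, b):
        return ZERO
    if TOP in (a, b):
        return TOP
    return POS if a == b else NEG

def abs_add(a, b):
    """Abstract addition: sound over-approximation of +."""
    if a == ZERO:
        return b
    if b == ZERO:
        return a
    if a == b:
        return a
    return TOP  # pos + neg could be anything

# Statically verify that y = x*x + 1 is positive whenever x is negative:
y_sign = abs_add(abs_mul(NEG, NEG), POS)
print(y_sign)  # "pos": positivity is proved without executing the program
```

    The loss of precision characteristic of static analysis shows up in `abs_add(POS, NEG)`, which must return TOP; the dependency-tracking refinements described in the article exist precisely to recover precision in such cases.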

  15. A COMPARISON OF STUDY RESULTS OF BUSINESS ENGLISH STUDENTS IN E-LEARNING AND FACE-TO-FACE COURSES

    Directory of Open Access Journals (Sweden)

    Petr Kučera

    2012-09-01

    The paper deals with the comparison of results of students in the lessons of a Business English e-learning course with face-to-face teaching at the Faculty of Economics and Management of the CULS in Prague. E-learning as a method of instruction refers to learning using technology, such as the Internet, CD-ROMs and portable devices. A current trend in university teaching is a particular focus on e-learning methods of study, enhancing the quality and effectiveness of studies and self-study. In the paper we have analysed the current state of research in English for Specific Purposes (ESP) e-learning, pointed out the results of a pilot ESP e-learning course in testing a control and an experimental group of students, and the results of questionnaires on students' views of e-learning. The paper focuses on the experimental verification of the influence of e-learning on the results of both groups of students. The online study material supports an interactive form of teaching by means of a multimedia application. It could be used not only by full-time students but also by distance students and centres of lifelong learning.

  16. Verification and Validation in Systems Engineering

    CERN Document Server

    Debbabi, Mourad; Jarraya, Yosr; Soeanu, Andrei; Alawneh, Luay

    2010-01-01

    "Verification and validation" represents an important process used for the quality assessment of engineered systems and their compliance with the requirements established at the beginning of or during the development cycle. Debbabi and his coauthors investigate methodologies and techniques that can be employed for the automatic verification and validation of systems engineering design models expressed in standardized modeling languages. Their presentation includes a bird's eye view of the most prominent modeling languages for software and systems engineering, namely the Unified Model

  17. Virtual & Real Face to Face Teaching

    Science.gov (United States)

    Teneqexhi, Romeo; Kuneshka, Loreta

    2016-01-01

    In traditional "face to face" lessons, during the time the teacher writes on a black or white board, the students are always behind the teacher. Sometimes, this happens even in the recorded lesson in videos. Most of the time during the lesson, the teacher shows to the students his back not his face. We do not think the term "face to…

  18. 37 CFR 260.6 - Verification of royalty payments.

    Science.gov (United States)

    2010-07-01

    ... verification of the payment of royalty fees to those parties entitled to receive such fees, according to terms... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Verification of royalty... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR PREEXISTING SUBSCRIPTION...

  19. A Roadmap for the Implementation of Continued Process Verification.

    Science.gov (United States)

    Boyer, Marcus; Gampfer, Joerg; Zamamiri, Abdel; Payne, Robin

    2016-01-01

    In 2014, the members of the BioPhorum Operations Group (BPOG) produced a 100-page continued process verification case study, entitled "Continued Process Verification: An Industry Position Paper with Example Protocol". This case study captures the thought processes involved in creating a continued process verification plan for a new product in response to the U.S. Food and Drug Administration's guidance on the subject introduced in 2011. In so doing, it provided the specific example of a plan developed for a new monoclonal antibody product based on the "A MAb Case Study" that preceded it in 2009. This document provides a roadmap that draws on the content of the continued process verification case study to provide a step-by-step guide in a more accessible form, with reference to a process map of the product life cycle. It could be used as a basis for continued process verification implementation in a number of different scenarios: for a single product and process; for a single site; to assist in the sharing of data-monitoring responsibilities among sites; or to assist in establishing data-monitoring agreements between a customer company and a contract manufacturing organization. The U.S. Food and Drug Administration issued guidance on the management of manufacturing processes designed to improve quality and control of drug products. This involved increased focus on regular monitoring of manufacturing processes, reporting of the results, and the taking of opportunities to improve. The guidance and the practice associated with it are known as continued process verification. This paper summarizes good practice in responding to continued process verification guidance, gathered from subject matter experts in the biopharmaceutical industry. © PDA, Inc. 2016.
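
    Continued process verification centers on routine statistical monitoring of process outputs. As a much-simplified sketch (not from the BPOG protocol), Shewhart-style 3-sigma limits derived from a baseline campaign can flag excursions in new batches; a real individuals chart would derive its limits from the average moving range rather than the sample standard deviation.

```python
from statistics import mean, stdev

def control_limits(baseline):
    """Simplified 3-sigma control limits from a baseline campaign."""
    m, s = mean(baseline), stdev(baseline)
    return m - 3 * s, m + 3 * s

def flag_excursions(values, lo, hi):
    """Indices of batches falling outside the control limits."""
    return [i for i, v in enumerate(values) if not (lo <= v <= hi)]

# Hypothetical batch data for a monitored quality attribute.
baseline = [99.8, 100.1, 100.0, 99.9, 100.2, 100.0, 99.9, 100.1]
lo, hi = control_limits(baseline)
new_batches = [100.0, 99.95, 101.5]  # third batch drifts high
print(flag_excursions(new_batches, lo, hi))  # [2]
```

    Flagged batches would then trigger the investigation and improvement steps the guidance calls for.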

  20. Land surface Verification Toolkit (LVT)

    Science.gov (United States)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.
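
    The core of such model evaluation is pairwise model-versus-observation statistics. The following is a simplified illustration of that idea (not LVT's actual API): computing bias and RMSE over a time series while skipping missing in-situ observations.

```python
import math

def bias_rmse(model, obs):
    """Pairwise bias and RMSE, skipping missing observations (None)."""
    pairs = [(m, o) for m, o in zip(model, obs) if o is not None]
    diffs = [m - o for m, o in pairs]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse

# Hypothetical soil-moisture values (m^3/m^3) at one grid point.
model = [0.30, 0.28, 0.35, 0.31]
obs   = [0.28, None, 0.33, 0.29]  # one missing in-situ sample
b, r = bias_rmse(model, obs)
print(round(b, 3), round(r, 3))  # consistent +0.02 wet bias
```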

  1. On Backward-Style Anonymity Verification

    Science.gov (United States)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  2. 78 FR 27882 - VA Veteran-Owned Small Business (VOSB) Verification Guidelines

    Science.gov (United States)

    2013-05-13

    ... Verification Self-Assessment Tool that walks the veteran through the regulation and how it applies to the...) Verification Guidelines AGENCY: Department of Veterans Affairs. ACTION: Advanced notice of proposed rulemaking... regulations governing the Department of Veterans Affairs (VA) Veteran-Owned Small Business (VOSB) Verification...

  3. Verification and validation for waste disposal models

    International Nuclear Information System (INIS)

    1987-07-01

    A set of evaluation criteria has been developed to assess the suitability of current verification and validation techniques for waste disposal methods. A survey of current practices and techniques was undertaken and evaluated using these criteria with the items most relevant to waste disposal models being identified. Recommendations regarding the most suitable verification and validation practices for nuclear waste disposal modelling software have been made

  4. Reading faces and Facing words

    DEFF Research Database (Denmark)

    Robotham, Julia Emma; Lindegaard, Martin Weis; Delfi, Tzvetelina Shentova

    It has long been argued that perceptual processing of faces and words is largely independent, highly specialised and strongly lateralised. Studies of patients with either pure alexia or prosopagnosia have strongly contributed to this view. The aim of our study was to investigate how visual perception of faces and words is affected by unilateral posterior stroke. Two patients with lesions in their dominant hemisphere and two with lesions in their non-dominant hemisphere were tested on sensitive tests of face and word perception during the stable phase of recovery. Despite all patients having unilateral lesions, we found no patient with a selective deficit in either reading or face processing. Rather, the patients showing a deficit in processing either words or faces were also impaired with the other category. One patient performed within the normal range on all tasks. In addition, all patients…

  5. Property-driven functional verification technique for high-speed vision system-on-chip processor

    Science.gov (United States)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This verification complexity is also related to the fact that in most vision chip design cycles, extensive effort is focused on optimizing chip metrics such as performance, power, and area, while design functional verification is not explicitly considered at the earlier stages at which the most consequential decisions are made. In this paper, we propose a semi-automatic property-driven verification technique in which the implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can improve verification efficiency by up to 20% for a complex vision chip design while reducing the simulation and debugging overheads.

  6. Design verification methodology for a solenoid valve for industrial applications

    International Nuclear Information System (INIS)

    Park, Chang Dae; Lim, Byung Ju; Chun, Kyung Yul

    2015-01-01

    Solenoid operated valves (SOV) are widely used in many applications due to their fast dynamic response, cost effectiveness, and low sensitivity to contamination. In this paper, we provide a convenient method of design verification of SOVs to design engineers who rely on experience and experiment during the SOV design and development process. First, we summarize a detailed procedure for designing SOVs for industrial applications: all of the design constraints are defined in the first step of the design, and then the detailed design procedure is presented based on design experience as well as various physical and electromagnetic relationships. Second, we suggest a method of verifying this design using theoretical relationships, which enables optimal design of the SOV from the point of view of the safety factor of the design attraction force. Last, experimental performance tests using several prototypes manufactured with this design method show that the suggested design verification methodology is appropriate for designing new models of solenoids. We believe that this verification process is a novel approach and useful for saving time and expense during SOV development, because verification tests with manufactured specimens may be partly replaced by this verification methodology.
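
    The safety factor on attraction force mentioned above can be sketched with the textbook flat-face electromagnet approximation F = B²A/(2μ₀) with B = μ₀NI/g. This is a simplified assumption (no fringing, no core saturation, uniform gap field), and all numerical values below are illustrative, not from the paper.

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability [H/m]

def attraction_force(n_turns, current, gap, area):
    """Flat-face electromagnet approximation:
    B = mu0 * N * I / g   (air-gap flux density, no fringing)
    F = B^2 * A / (2*mu0) (attraction force on the armature)
    """
    b = MU0 * n_turns * current / gap
    return b * b * area / (2 * MU0)

def safety_factor(n_turns, current, gap, area, required_force):
    """Ratio of available attraction force to the required force."""
    return attraction_force(n_turns, current, gap, area) / required_force

# Illustrative design point: 500 turns, 0.5 A, 1 mm gap, 1 cm^2 pole face,
# against a 2 N spring-plus-pressure load.
sf = safety_factor(n_turns=500, current=0.5, gap=1e-3, area=1e-4,
                   required_force=2.0)
print(round(sf, 2))  # ~1.96: force margin of roughly 2x at this gap
```

    In a real verification the safety factor would be evaluated at the worst case (largest gap, lowest supply voltage, hottest coil), where the available force is smallest.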

  7. Verification test report on a solar heating and hot water system

    Science.gov (United States)

    1978-01-01

    Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification covers the performance, the efficiencies, and the various methods used, such as similarity, analysis, inspection and test, that are applicable to satisfying the verification requirements.

  8. Face Attention Network: An Effective Face Detector for the Occluded Faces

    OpenAIRE

    Wang, Jianfeng; Yuan, Ye; Yu, Gang

    2017-01-01

    The performance of face detection has been largely improved with the development of convolutional neural networks. However, occlusion due to masks and sunglasses is still a challenging problem. Improving the recall of these occluded cases usually brings the risk of high false positives. In this paper, we present a novel face detector called Face Attention Network (FAN), which can significantly improve the recall of the face detection problem in the occluded case without comp...

  9. Correlative two-photon and serial block face scanning electron microscopy in neuronal tissue using 3D near-infrared branding maps.

    Science.gov (United States)

    Lees, Robert M; Peddie, Christopher J; Collinson, Lucy M; Ashby, Michael C; Verkade, Paul

    2017-01-01

    Linking cellular structure and function has always been a key goal of microscopy, but obtaining high resolution spatial and temporal information from the same specimen is a fundamental challenge. Two-photon (2P) microscopy allows imaging deep inside intact tissue, bringing great insight into the structural and functional dynamics of cells in their physiological environment. At the nanoscale, the complex ultrastructure of a cell's environment in tissue can be reconstructed in three dimensions (3D) using serial block face scanning electron microscopy (SBF-SEM). This provides a snapshot of high resolution structural information pertaining to the shape, organization, and localization of multiple subcellular structures at the same time. The pairing of these two imaging modalities in the same specimen provides key information to relate cellular dynamics to the ultrastructural environment. Until recently, approaches to relocate a region of interest (ROI) in tissue from 2P microscopy for SBF-SEM have been inefficient or unreliable. However, near-infrared branding (NIRB) overcomes this by using the laser from a multiphoton microscope to create fiducial markers for accurate correlation of 2P and electron microscopy (EM) imaging volumes. The process is quick and can be user defined for each sample. Here, to increase the efficiency of ROI relocation, multiple NIRB marks are used in 3D to target ultramicrotomy. A workflow is described and discussed to obtain a data set for 3D correlated light and electron microscopy, using three different preparations of brain tissue as examples. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Temporal Specification and Verification of Real-Time Systems.

    Science.gov (United States)

    1991-08-30

    of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.

  11. Composite multi-lobe descriptor for cross spectral face recognition: matching active IR to visible light images

    Science.gov (United States)

    Cao, Zhicheng; Schmid, Natalia A.

    2015-05-01

    Matching facial images across the electromagnetic spectrum presents a challenging problem in the field of biometrics and identity management. An example of this problem includes cross spectral matching of active infrared (IR) face images or thermal IR face images against a dataset of visible light images. This paper describes a new operator named Composite Multi-Lobe Descriptor (CMLD) for facial feature extraction in cross spectral matching of near-infrared (NIR) or short-wave infrared (SWIR) against visible light images. The new operator is inspired by the design of ordinal measures. The operator combines Gaussian-based multi-lobe kernel functions, Local Binary Pattern (LBP), generalized LBP (GLBP) and Weber Local Descriptor (WLD) and modifies them into multi-lobe functions with smoothed neighborhoods. The new operator encodes both the magnitude and phase responses of Gabor filters. The combination of LBP and WLD utilizes both the orientation and intensity information of edges. The introduction of multi-lobe functions with smoothed neighborhoods further makes the proposed operator robust against noise and poor image quality. Output templates are transformed into histograms and then compared by means of a symmetric Kullback-Leibler metric, resulting in a matching score. The performance of the multi-lobe descriptor is compared with that of other operators such as LBP, Histogram of Oriented Gradients (HOG), ordinal measures, and their combinations. The experimental results show that in many cases the proposed method, CMLD, outperforms the other operators and their combinations. In addition to different infrared spectra, various standoff distances from close-up (1.5 m) to intermediate (50 m) and long (106 m) are also investigated in this paper. The performance of CMLD is evaluated for each of the three distance cases.
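
    The matching pipeline described above (texture encoding, histogram templates, symmetric Kullback-Leibler scoring) can be sketched as follows. This is a minimal illustration using a plain 8-neighbour LBP, not the full CMLD operator; the image sizes and the smoothing constant are assumptions.

```python
import numpy as np

def lbp_image(gray):
    """Basic 8-neighbour Local Binary Pattern codes for interior pixels."""
    c = gray[1:-1, 1:-1]
    codes = np.zeros_like(c, dtype=np.uint8)
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = gray.shape
    for bit, (dy, dx) in enumerate(shifts):
        neigh = gray[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= (neigh >= c).astype(np.uint8) << bit
    return codes

def histogram(codes):
    """256-bin normalized histogram used as the face template."""
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    hist += 1e-9                      # small floor avoids log(0) in the KL terms
    return hist / hist.sum()

def symmetric_kl(p, q):
    """Symmetric Kullback-Leibler divergence used as the matching score."""
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

# Toy probe/gallery "faces": random textures stand in for NIR and visible images.
rng = np.random.default_rng(0)
probe = rng.integers(0, 256, (32, 32)).astype(np.uint8)
gallery = rng.integers(0, 256, (32, 32)).astype(np.uint8)
score = symmetric_kl(histogram(lbp_image(probe)), histogram(lbp_image(gallery)))
```

    A lower score indicates a better match; identical templates score exactly zero.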

  12. A verification regime for the spatial discretization of the SN transport equations

    Energy Technology Data Exchange (ETDEWEB)

    Schunert, S.; Azmy, Y. [North Carolina State Univ., Dept. of Nuclear Engineering, 2500 Stinson Drive, Raleigh, NC 27695 (United States)

    2012-07-01

    The order-of-accuracy test in conjunction with the method of manufactured solutions is the current state of the art in computer code verification. In this work we investigate the application of a verification procedure, including the order-of-accuracy test, to a generic SN transport solver that implements the AHOTN spatial discretization. Different types of semantic errors, e.g. removal of a line of code or changing a single character, are introduced randomly into the previously verified S{sub N} code, and the proposed verification procedure is used to identify the coding mistakes (if possible) and classify them. Itemized by error type, we record the stage of the verification procedure at which the error is detected and report the frequency with which the errors are correctly identified at the various stages of the verification. Errors that remain undetected by the verification procedure are further scrutinized to determine why the introduced coding mistake eluded the verification procedure. The result of this work is that the verification procedure based on an order-of-accuracy test finds almost all detectable coding mistakes, but rarely (1.44% of the time) and under certain circumstances it can fail. (authors)
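
    The order-of-accuracy test at the heart of this procedure can be illustrated on a toy solver. In the sketch below, a composite midpoint quadrature stands in for the SN transport solver (an assumption for illustration); the observed order is computed from errors on two mesh sizes, and a large deviation from the formal order flags a coding mistake.

```python
import math

def observed_order(h1, e1, h2, e2):
    """Observed order p from errors e ~ C*h**p on two mesh sizes h1 > h2."""
    return math.log(e1 / e2) / math.log(h1 / h2)

def midpoint(f, a, b, n):
    """Composite midpoint quadrature: a stand-in second-order 'solver'."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

exact = 2.0  # manufactured/known solution: integral of sin(x) over [0, pi]
e1 = abs(midpoint(math.sin, 0.0, math.pi, 16) - exact)
e2 = abs(midpoint(math.sin, 0.0, math.pi, 32) - exact)
p = observed_order(math.pi / 16, e1, math.pi / 32, e2)
# p should sit near the formal order 2; a seeded coding mistake typically degrades it
```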

  13. How Well Do Computer-Generated Faces Tap Face Expertise?

    Directory of Open Access Journals (Sweden)

    Kate Crookes

    Full Text Available The use of computer-generated (CG) stimuli in face processing research is proliferating due to the ease with which faces can be generated, standardised and manipulated. However there has been surprisingly little research into whether CG faces are processed in the same way as photographs of real faces. The present study assessed how well CG faces tap face identity expertise by investigating whether two indicators of face expertise are reduced for CG faces when compared to face photographs. These indicators were accuracy for identification of own-race faces and the other-race effect (ORE), the well-established finding that own-race faces are recognised more accurately than other-race faces. In Experiment 1, Caucasian and Asian participants completed a recognition memory task for own- and other-race real and CG faces. Overall accuracy for own-race faces was dramatically reduced for CG compared to real faces and the ORE was significantly and substantially attenuated for CG faces. Experiment 2 investigated perceptual discrimination for own- and other-race real and CG faces with Caucasian and Asian participants. Here again, accuracy for own-race faces was significantly reduced for CG compared to real faces. However the ORE was not affected by format. Together these results signal that CG faces of the type tested here do not fully tap face expertise. Technological advancement may, in the future, produce CG faces that are equivalent to real photographs. Until then caution is advised when interpreting results obtained using CG faces.

  14. Accuracy of an infrared marker-based patient positioning system (ExacTrac®) for stereotactic body radiotherapy in localizing the planned isocenter using fiducial markers

    Science.gov (United States)

    Montes-Rodríguez, María de los Ángeles; Hernández-Bojórquez, Mariana; Martínez-Gómez, Alma Angélica; Contreras-Pérez, Agustín; Negrete-Hernández, Ingrid Mireya; Hernández-Oviedo, Jorge Omar; Mitsoura, Eleni; Santiago-Concha, Bernardino Gabriel

    2014-11-01

    Stereotactic Body Radiation Therapy (SBRT) requires controlled immobilization and position monitoring of the patient and target. The purpose of this work is to analyze the performance of the ExacTrac® (ETX) imaging system using infrared and fiducial markers. Materials and methods: In order to assure the accuracy of isocenter localization, a quality assurance procedure was applied using an infrared marker-based positioning system. Scans were acquired of an in-house agar gel and solid water phantom with infrared spheres. In the inner part of the phantom, three markers were delineated as references and one pellet was placed internally, which was assigned as the isocenter. The plan was prepared in the iPlan® RT Dose treatment planning system and the images were exported to the ETX console. Images were acquired with the ETX to check the correctness of the isocenter placement. Adjustments were made in 6D; the reference markers were used to fuse the images. Couch shifts were registered. The procedure was repeated for verification purposes. Results: The data recorded from the verifications showed averaged 3D spatial uncertainties of 0.31 ± 0.42 mm and 0.82° ± 0.46° in translational and rotational movements, respectively, in the phantom; the first correction of these uncertainties was 1.51 ± 1.14 mm and 1.37° ± 0.61°, respectively. Conclusions: This study shows high accuracy and repeatability in positioning the selected isocenter. The ETX system for verifying the treatment isocenter position has the ability to track the position of interest, making it possible to use it for SBRT positioning within an uncertainty of ≤1 mm.
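
    The averaged 3D spatial uncertainties reported above are, in essence, the mean and spread of 3D displacement magnitudes computed from the registered couch shifts. A minimal sketch of that computation (illustrative only, with invented shift values; this is not the ETX software):

```python
import math

def spatial_stats(shifts_mm):
    """Mean and population std of 3D displacement magnitudes from couch shifts."""
    mags = [math.sqrt(dx * dx + dy * dy + dz * dz) for dx, dy, dz in shifts_mm]
    n = len(mags)
    mean = sum(mags) / n
    std = math.sqrt(sum((m - mean) ** 2 for m in mags) / n)
    return mean, std

# Hypothetical registered couch corrections (mm) from repeated verifications
mean_mm, std_mm = spatial_stats([(0.2, 0.1, 0.2), (0.1, 0.3, 0.1), (0.4, 0.1, 0.2)])
```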

  15. Accuracy of an infrared marker-based patient positioning system (ExacTrac®) for stereotactic body radiotherapy in localizing the planned isocenter using fiducial markers

    Energy Technology Data Exchange (ETDEWEB)

    Montes-Rodríguez, María de los Ángeles, E-mail: angy24538@yahoo.com; Mitsoura, Eleni [Medical Physics Graduate Programme, Universidad Autónoma del Estado de México, Facultad de Medicina, Paseo Tollocan esquina Jesús Carranza Colonia Moderna de la Cruz, C.P. 50180, Toluca, Estado de México (Mexico); Hernández-Bojórquez, Mariana; Martínez-Gómez, Alma Angélica; Contreras-Pérez, Agustín; Negrete-Hernández, Ingrid Mireya; Hernández-Oviedo, Jorge Omar; Santiago-Concha, Bernardino Gabriel [Centro de Cáncer ABC, The American British Cowdray Medical Center I.A.P. (CMABC), Calle Sur 136, no. 116, Colonia las Américas, C.P. 01120, México, D.F. (Mexico)

    2014-11-07

    Stereotactic Body Radiation Therapy (SBRT) requires controlled immobilization and position monitoring of the patient and target. The purpose of this work is to analyze the performance of the ExacTrac® (ETX) imaging system using infrared and fiducial markers. Materials and methods: In order to assure the accuracy of isocenter localization, a quality assurance procedure was applied using an infrared marker-based positioning system. Scans were acquired of an in-house agar gel and solid water phantom with infrared spheres. In the inner part of the phantom, three markers were delineated as references and one pellet was placed internally, which was assigned as the isocenter. The plan was prepared in the iPlan® RT Dose treatment planning system and the images were exported to the ETX console. Images were acquired with the ETX to check the correctness of the isocenter placement. Adjustments were made in 6D; the reference markers were used to fuse the images. Couch shifts were registered. The procedure was repeated for verification purposes. Results: The data recorded from the verifications showed averaged 3D spatial uncertainties of 0.31 ± 0.42 mm and 0.82° ± 0.46° in translational and rotational movements, respectively, in the phantom; the first correction of these uncertainties was 1.51 ± 1.14 mm and 1.37° ± 0.61°, respectively. Conclusions: This study shows high accuracy and repeatability in positioning the selected isocenter. The ETX system for verifying the treatment isocenter position has the ability to track the position of interest, making it possible to use it for SBRT positioning within an uncertainty of ≤1 mm.

  16. Environmental technology verification methods

    CSIR Research Space (South Africa)

    Szewczuk, S

    2016-03-01

    Full Text Available Environmental Technology Verification (ETV) is a tool that has been developed in the United States of America, Europe and many other countries around the world to help innovative environmental technologies reach the market. Claims about...

  17. Verification and Optimization of a PLC Control Schedule

    NARCIS (Netherlands)

    Brinksma, Hendrik; Mader, Angelika H.; Havelund, K.; Penix, J.; Visser, W.

    We report on the use of the SPIN model checker for both the verification of a process control program and the derivation of optimal control schedules. This work was carried out as part of a case study for the EC VHS project (Verification of Hybrid Systems), in which the program for a Programmable

  18. Neural synchronization during face-to-face communication

    OpenAIRE

    Jiang, J.; Dai, B.; Peng, D.; Zhu, C.; Liu, L.; Lu, C.

    2012-01-01

    Although the human brain may have evolutionarily adapted to face-to-face communication, other modes of communication, e.g., telephone and e-mail, increasingly dominate our modern daily life. This study examined the neural difference between face-to-face communication and other types of communication by simultaneously measuring two brains using a hyperscanning approach. The results showed a significant increase in the neural synchronization in the left inferior frontal cortex during a face-to-...

  19. 340 and 310 drawing field verification

    International Nuclear Information System (INIS)

    Langdon, J.

    1996-01-01

    The purpose of the drawing field verification work plan is to provide reliable drawings for the 310 Treated Effluent Disposal Facility (TEDF) and 340 Waste Handling Facility (340 Facility). The initial scope of this work plan is to provide field verified and updated versions of all the 340 Facility essential drawings. This plan can also be used for field verification of any other drawings that the facility management directs to be so updated. Any drawings revised by this work plan will be issued in an AutoCAD format

  20. Verification of Scientific Simulations via Hypothesis-Driven Comparative and Quantitative Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Ahrens, James P [ORNL; Heitmann, Katrin [ORNL; Petersen, Mark R [ORNL; Woodring, Jonathan [Los Alamos National Laboratory (LANL); Williams, Sean [Los Alamos National Laboratory (LANL); Fasel, Patricia [Los Alamos National Laboratory (LANL); Ahrens, Christine [Los Alamos National Laboratory (LANL); Hsu, Chung-Hsing [ORNL; Geveci, Berk [ORNL

    2010-11-01

    This article presents a visualization-assisted process that verifies scientific-simulation codes. Code verification is necessary because scientists require accurate predictions to interpret data confidently. This verification process integrates iterative hypothesis verification with comparative, feature, and quantitative visualization. Following this process can help identify differences in cosmological and oceanographic simulations.

  1. Formal verification of Simulink/Stateflow diagrams a deductive approach

    CERN Document Server

    Zhan, Naijun; Zhao, Hengjun

    2017-01-01

    This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.

  2. Ontology Matching with Semantic Verification.

    Science.gov (United States)

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.
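
    The lexical component of a matcher like ASMOV can be illustrated by a simple token-overlap similarity between concept labels. The Jaccard measure below is a hedged stand-in for illustration, not the actual ASMOV similarity calculation:

```python
def lexical_similarity(label_a, label_b):
    """Token-level Jaccard similarity between two ontology concept labels."""
    ta = set(label_a.lower().replace('_', ' ').replace('-', ' ').split())
    tb = set(label_b.lower().replace('_', ' ').replace('-', ' ').split())
    union = ta | tb
    return len(ta & tb) / len(union) if union else 0.0
```

    In a full matcher, scores like this would be combined with structural and extensional measures and iterated to a fixed point before the semantic verification step.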

  3. Interpolant tree automata and their application in Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2016-01-01

    This paper investigates the combination of abstract interpretation over the domain of convex polyhedra with interpolant tree automata, in an abstraction-refinement scheme for Horn clause verification. These techniques have been previously applied separately, but are combined in a new way in this paper. Evaluation on a set of Horn clause verification problems indicates that the combination of interpolant tree automata with abstract interpretation gives some increase in the power of the verification tool, while sometimes incurring a performance overhead.

  4. Hierarchical Spatio-Temporal Probabilistic Graphical Model with Multiple Feature Fusion for Binary Facial Attribute Classification in Real-World Face Videos.

    Science.gov (United States)

    Demirkus, Meltem; Precup, Doina; Clark, James J; Arbel, Tal

    2016-06-01

    Recent literature shows that facial attributes, i.e., contextual facial information, can be beneficial for improving the performance of real-world applications, such as face verification, face recognition, and image search. Examples of face attributes include gender, skin color, facial hair, etc. How to robustly obtain these facial attributes (traits) is still an open problem, especially in the presence of the challenges of real-world environments: non-uniform illumination conditions, arbitrary occlusions, motion blur and background clutter. What makes this problem even more difficult is the enormous variability presented by the same subject, due to arbitrary face scales, head poses, and facial expressions. In this paper, we focus on the problem of facial trait classification in real-world face videos. We have developed a fully automatic hierarchical and probabilistic framework that models the collective set of frame class distributions and feature spatial information over a video sequence. The experiments are conducted on a large real-world face video database that we have collected, labelled and made publicly available. The proposed method is flexible enough to be applied to any facial classification problem. Experiments on the large, real-world video database McGillFaces [1] of 18,000 video frames reveal that the proposed framework outperforms alternative approaches by up to 16.96% and 10.13% for the facial attributes of gender and facial hair, respectively.

  5. Heavy water physical verification in power plants

    International Nuclear Information System (INIS)

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency's experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparing records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality check. The analysis of the different measurement methods and their limits of accuracy are discussed in the paper

  6. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    International Nuclear Information System (INIS)

    Kim, Eui Sub; Yoo, Jun Beom; Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo

    2014-01-01

    Once FPGA (Field-Programmable Gate Array) designers write Verilog programs, commercial synthesis tools automatically translate the Verilog programs into EDIF programs, so that the designers can focus largely on HDL designs for correctness of functionality. Nuclear regulation authorities, however, require a more considered demonstration of the correctness and safety of the mechanical synthesis processes of FPGA synthesis tools, even though the FPGA industry has acknowledged them empirically as correct and safe processes and tools. In order to assure safety, the industry standards for the safety of electronic/electrical devices, such as IEC 61508 and IEC 60880, recommend using formal verification techniques. There are several formal verification tools (i.e., 'FormalPro', 'Conformal', 'Formality', and so on) to verify the correctness of the translation from Verilog into EDIF programs, but they are too expensive to use and hard to apply to the work of 3rd-party developers. This paper proposes a formal verification technique which can contribute to the correctness demonstration in part. It formally checks the behavioral equivalence between Verilog and the subsequently synthesized netlist with the VIS verification system. A netlist is an intermediate output of the FPGA synthesis process, and EDIF is used as a standard format for netlists. If the formal verification succeeds, then we can assure that the synthesis process from Verilog into netlist worked correctly, at least for the Verilog used. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIFMV,' which translates EDIF into BLIF-MV as an input front-end of the VIS system, while preserving their behavioral equivalence. We performed a case study with an example of a preliminary version of an RPS in a Korean nuclear power plant in order to demonstrate the efficiency of the proposed formal verification technique and the implemented translator. It

  7. A Correctness Verification Technique for Commercial FPGA Synthesis Tools

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Eui Sub; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of); Choi, Jong Gyun; Kim, Jang Yeol; Lee, Jang Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    Once FPGA (Field-Programmable Gate Array) designers write Verilog programs, commercial synthesis tools automatically translate the Verilog programs into EDIF programs, so that the designers can focus largely on HDL designs for correctness of functionality. Nuclear regulation authorities, however, require a more considered demonstration of the correctness and safety of the mechanical synthesis processes of FPGA synthesis tools, even though the FPGA industry has acknowledged them empirically as correct and safe processes and tools. In order to assure safety, the industry standards for the safety of electronic/electrical devices, such as IEC 61508 and IEC 60880, recommend using formal verification techniques. There are several formal verification tools (i.e., 'FormalPro', 'Conformal', 'Formality', and so on) to verify the correctness of the translation from Verilog into EDIF programs, but they are too expensive to use and hard to apply to the work of 3rd-party developers. This paper proposes a formal verification technique which can contribute to the correctness demonstration in part. It formally checks the behavioral equivalence between Verilog and the subsequently synthesized netlist with the VIS verification system. A netlist is an intermediate output of the FPGA synthesis process, and EDIF is used as a standard format for netlists. If the formal verification succeeds, then we can assure that the synthesis process from Verilog into netlist worked correctly, at least for the Verilog used. In order to support the formal verification, we developed the mechanical translator 'EDIFtoBLIFMV,' which translates EDIF into BLIF-MV as an input front-end of the VIS system, while preserving their behavioral equivalence. We performed a case study with an example of a preliminary version of an RPS in a Korean nuclear power plant in order to demonstrate the efficiency of the proposed formal verification technique and the implemented translator. It

  8. Logic verification system for power plant sequence diagrams

    International Nuclear Information System (INIS)

    Fukuda, Mitsuko; Yamada, Naoyuki; Teshima, Toshiaki; Kan, Ken-ichi; Utsunomiya, Mitsugu.

    1994-01-01

    A logic verification system for sequence diagrams of power plants has been developed. The system's main function is to verify correctness of the logic realized by sequence diagrams for power plant control systems. The verification is based on a symbolic comparison of the logic of the sequence diagrams with the logic of the corresponding IBDs (interlock Block Diagrams) in combination with reference to design knowledge. The developed system points out the sub-circuit which is responsible for any existing mismatches between the IBD logic and the logic realized by the sequence diagrams. Applications to the verification of actual sequence diagrams of power plants confirmed that the developed system is practical and effective. (author)
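
    The symbolic comparison of sequence-diagram logic against the corresponding IBD logic can be illustrated, for small combinational functions, by exhaustive equivalence checking. A toy sketch (the example functions are invented, not taken from the paper):

```python
from itertools import product

def first_mismatch(f, g, n_inputs):
    """Exhaustively compare two combinational logic functions; return the
    first input assignment where they disagree, or None if equivalent."""
    for bits in product([False, True], repeat=n_inputs):
        if f(*bits) != g(*bits):
            return bits
    return None

# Toy stand-ins: sequence-diagram logic vs. the corresponding IBD logic
seq_logic = lambda a, b, c: (a and b) or c
ibd_logic = lambda a, b, c: (a or c) and (b or c)  # same function, distributed form
```

    A returned assignment plays the role of the sub-circuit responsible for a mismatch; a real system would use BDDs or SAT rather than enumeration to scale beyond a handful of inputs.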

  9. Formal verification of complex properties on PLC programs

    CERN Document Server

    Darvas, D; Voros, A; Bartha, T; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Formal verification has become a recommended practice in the safety-critical application areas. However, due to the complexity of practical control and safety systems, the state space explosion often prevents the use of formal analysis. In this paper we extend our former verification methodology with effective property preserving reduction techniques. For this purpose we developed general rule-based reductions and a customized version of the Cone of Influence (COI) reduction. Using these methods, the verification of complex requirements formalised with temporal logics (e.g. CTL, LTL) can be orders of magnitude faster. We use the NuSMV model checker on a real-life PLC program from CERN to demonstrate the performance of our reduction techniques.
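
    The Cone of Influence reduction mentioned above keeps only the variables that can affect the property being checked, shrinking the state space before model checking. A minimal sketch of the underlying dependency-graph reachability (the variable names are invented):

```python
def cone_of_influence(deps, prop_vars):
    """deps maps each variable to the set of variables its next-state
    function reads; returns all variables that can influence prop_vars."""
    keep = set(prop_vars)
    frontier = list(prop_vars)
    while frontier:
        v = frontier.pop()
        for u in deps.get(v, ()):
            if u not in keep:
                keep.add(u)
                frontier.append(u)
    return keep

# Hypothetical PLC variables: 'motor'/'clock' cannot affect the 'alarm' property
deps = {'alarm': {'sensor'}, 'sensor': set(), 'motor': {'clock'}, 'clock': set()}
reduced = cone_of_influence(deps, {'alarm'})
```

    Dropping the unreachable variables preserves the truth of temporal-logic properties stated over `prop_vars`, which is what makes the reduction property-preserving.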

  10. Implementation and verification of global optimization benchmark problems

    Science.gov (United States)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, and the interval estimates of a function and its gradient on a given box, using a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
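
    One standard automatic check for such a benchmark suite is to verify each stated analytic gradient against central finite differences. A sketch in Python rather than the paper's C++ (using the Rosenbrock benchmark as an example; the step size and tolerance are assumptions):

```python
def rosenbrock(x):
    """Classic box-constrained benchmark objective."""
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2
               for i in range(len(x) - 1))

def rosenbrock_grad(x):
    """Analytic gradient to be verified."""
    n = len(x)
    g = [0.0] * n
    for i in range(n - 1):
        g[i] += -400.0 * x[i] * (x[i + 1] - x[i] ** 2) - 2.0 * (1.0 - x[i])
        g[i + 1] += 200.0 * (x[i + 1] - x[i] ** 2)
    return g

def check_gradient(f, grad, x, h=1e-6, tol=1e-4):
    """Compare an analytic gradient against central finite differences."""
    g = grad(x)
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        fd = (f(xp) - f(xm)) / (2.0 * h)
        if abs(fd - g[i]) > tol * max(1.0, abs(fd)):
            return False  # mismatch: likely a mistake in the benchmark description
    return True
```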

  11. Facing aggression: cues differ for female versus male faces.

    Directory of Open Access Journals (Sweden)

    Shawn N Geniole

    Full Text Available The facial width-to-height ratio (face ratio) is a sexually dimorphic metric associated with actual aggression in men and with observers' judgements of aggression in male faces. Here, we sought to determine if observers' judgements of aggression were associated with the face ratio in female faces. In three studies, participants rated photographs of female and male faces on aggression, femininity, masculinity, attractiveness, and nurturing. In Studies 1 and 2, for female and male faces, judgements of aggression were associated with the face ratio even when other cues in the face related to masculinity were controlled statistically. Nevertheless, correlations between the face ratio and judgements of aggression were smaller for female than for male faces (F(1,36) = 7.43, p = 0.01). In Study 1, there was no significant relationship between judgements of femininity and of aggression in female faces. In Study 2, the association between judgements of masculinity and aggression was weaker in female faces than for male faces in Study 1. The weaker association in female faces may be because aggression and masculinity are stereotypically male traits. Thus, in Study 3, observers rated faces on nurturing (a stereotypically female trait) and on femininity. Judgements of nurturing were associated with femininity (positively) and masculinity (negatively) ratings in both female and male faces. In summary, the perception of aggression differs in female versus male faces. The sex difference was not simply because aggression is a gendered construct; the relationships between masculinity/femininity and nurturing were similar for male and female faces even though nurturing is also a gendered construct. Masculinity and femininity ratings are not associated with aggression ratings nor with the face ratio for female faces. In contrast, all four variables are highly inter-correlated in male faces, likely because these cues in male faces serve as "honest signals".

  12. Facing aggression: cues differ for female versus male faces.

    Science.gov (United States)

    Geniole, Shawn N; Keyes, Amanda E; Mondloch, Catherine J; Carré, Justin M; McCormick, Cheryl M

    2012-01-01

    The facial width-to-height ratio (face ratio), is a sexually dimorphic metric associated with actual aggression in men and with observers' judgements of aggression in male faces. Here, we sought to determine if observers' judgements of aggression were associated with the face ratio in female faces. In three studies, participants rated photographs of female and male faces on aggression, femininity, masculinity, attractiveness, and nurturing. In Studies 1 and 2, for female and male faces, judgements of aggression were associated with the face ratio even when other cues in the face related to masculinity were controlled statistically. Nevertheless, correlations between the face ratio and judgements of aggression were smaller for female than for male faces (F(1,36) = 7.43, p = 0.01). In Study 1, there was no significant relationship between judgements of femininity and of aggression in female faces. In Study 2, the association between judgements of masculinity and aggression was weaker in female faces than for male faces in Study 1. The weaker association in female faces may be because aggression and masculinity are stereotypically male traits. Thus, in Study 3, observers rated faces on nurturing (a stereotypically female trait) and on femininity. Judgements of nurturing were associated with femininity (positively) and masculinity (negatively) ratings in both female and male faces. In summary, the perception of aggression differs in female versus male faces. The sex difference was not simply because aggression is a gendered construct; the relationships between masculinity/femininity and nurturing were similar for male and female faces even though nurturing is also a gendered construct. Masculinity and femininity ratings are not associated with aggression ratings nor with the face ratio for female faces. In contrast, all four variables are highly inter-correlated in male faces, likely because these cues in male faces serve as "honest signals".

  13. Facing Aggression: Cues Differ for Female versus Male Faces

    Science.gov (United States)

    Geniole, Shawn N.; Keyes, Amanda E.; Mondloch, Catherine J.; Carré, Justin M.; McCormick, Cheryl M.

    2012-01-01

    The facial width-to-height ratio (face ratio) is a sexually dimorphic metric associated with actual aggression in men and with observers' judgements of aggression in male faces. Here, we sought to determine if observers' judgements of aggression were associated with the face ratio in female faces. In three studies, participants rated photographs of female and male faces on aggression, femininity, masculinity, attractiveness, and nurturing. In Studies 1 and 2, for female and male faces, judgements of aggression were associated with the face ratio even when other cues in the face related to masculinity were controlled statistically. Nevertheless, correlations between the face ratio and judgements of aggression were smaller for female than for male faces (F(1,36) = 7.43, p = 0.01). In Study 1, there was no significant relationship between judgements of femininity and of aggression in female faces. In Study 2, the association between judgements of masculinity and aggression was weaker in female faces than for male faces in Study 1. The weaker association in female faces may be because aggression and masculinity are stereotypically male traits. Thus, in Study 3, observers rated faces on nurturing (a stereotypically female trait) and on femininity. Judgements of nurturing were associated with femininity (positively) and masculinity (negatively) ratings in both female and male faces. In summary, the perception of aggression differs in female versus male faces. The sex difference was not simply because aggression is a gendered construct; the relationships between masculinity/femininity and nurturing were similar for male and female faces even though nurturing is also a gendered construct. Masculinity and femininity ratings are not associated with aggression ratings nor with the face ratio for female faces. In contrast, all four variables are highly inter-correlated in male faces, likely because these cues in male faces serve as "honest signals". PMID:22276184

  14. Consideration of the Verleur model of far-infrared spectroscopy of ternary compounds

    International Nuclear Information System (INIS)

    Robouch, B. V.; Kisiel, A.; Sheregii, E. M.

    2001-01-01

    The clustering model proposed by Verleur and Barker [Phys. Rev. 149, 715 (1966)] to interpret far infrared data for face-centered-cubic ternary compounds is critically analyzed. It is shown that their approach, satisfactory for fitting some ternary compound spectral curves, is too restricted by its one-parameter β model to be able to describe preferences (with respect to a random distribution case) for the five tetrahedron configurations
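For reference, the random-distribution baseline against which such configuration preferences are measured is binomial: in an A(1-x)B(x)C zinc-blende alloy, the five nearest-neighbour tetrahedron configurations around a C atom (k = 0..4 B atoms) occur with probability C(4, k)·x^k·(1-x)^(4-k). A small sketch of that baseline (the function name is ours):

```python
from math import comb

def random_tetrahedron_probs(x):
    """Bernoulli (random-alloy) probabilities of the five nearest-neighbour
    tetrahedron configurations A(4-k)B(k) in an A(1-x)B(x)C ternary:
    P_k = C(4, k) * x**k * (1 - x)**(4 - k), for k = 0..4."""
    return [comb(4, k) * x**k * (1 - x)**(4 - k) for k in range(5)]

# Composition x = 0.3: the five probabilities sum to 1 by construction.
probs = random_tetrahedron_probs(0.3)
```

The Verleur-Barker β parameter skews these five probabilities toward (or away from) clustering; the criticism above is that a single parameter cannot describe arbitrary preferences over all five configurations.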

  15. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    Science.gov (United States)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

The projected impact of compositional verification research conducted by the National Aeronautic and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software and compositional verification was described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, leaving some errors undetected until used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  16. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

    International audience; Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...

  17. Inventory verification measurements using neutron multiplicity counting

    International Nuclear Information System (INIS)

    Ensslin, N.; Foster, L.A.; Harker, W.C.; Krick, M.S.; Langner, D.G.

    1998-01-01

    This paper describes a series of neutron multiplicity measurements of large plutonium samples at the Los Alamos Plutonium Facility. The measurements were corrected for bias caused by neutron energy spectrum shifts and nonuniform multiplication, and are compared with calorimetry/isotopics. The results show that multiplicity counting can increase measurement throughput and yield good verification results for some inventory categories. The authors provide recommendations on the future application of the technique to inventory verification

  18. The MODUS Approach to Formal Verification

    Directory of Open Access Journals (Sweden)

    Brewka Lukasz

    2014-03-01

Full Text Available Background: Software reliability is of great importance for the development of embedded systems, which are often used in applications with safety requirements. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required and closely linked in the process of providing competitive products. Objectives: In relation to this, the MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses with ways to improve their position in the embedded market through a pragmatic and viable solution. Methods/Approach: This paper describes the MODUS project, with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system, and by controlling the choice among the existing open-source model verification engines, the MODUS toolset produces the inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use, and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project.

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION: TEST/QA PLAN FOR THE VERIFICATION TESTING OF SELECTIVE CATALYTIC REDUCTION CONTROL TECHNOLOGIES FOR HIGHWAY, NONROAD, AND STATIONARY USE DIESEL ENGINES

    Science.gov (United States)

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  20. On the Privacy Protection of Biometric Traits: Palmprint, Face, and Signature

    Science.gov (United States)

    Panigrahy, Saroj Kumar; Jena, Debasish; Korra, Sathya Babu; Jena, Sanjay Kumar

Biometrics are expected to add a new level of security to applications, as a person attempting access must prove who he or she really is by presenting a biometric to the system. The recent developments in the biometrics area have led to smaller, faster and cheaper systems, which in turn has increased the number of possible application areas for biometric identity verification. Biometric data, being derived from human bodies (and especially when used to identify or verify those bodies), are considered personally identifiable information (PII). The collection, use and disclosure of biometric data — image or template — invokes rights on the part of an individual and obligations on the part of an organization. As biometric uses and databases grow, so do concerns that the personal data collected will not be used in reasonable and accountable ways. Privacy concerns arise when biometric data are used for secondary purposes, invoking function creep, data matching, aggregation, surveillance and profiling. Biometric data transmitted across networks and stored in various databases by others can also be stolen, copied, or otherwise misused in ways that can materially affect the individual involved. As biometric systems are vulnerable to replay, database and brute-force attacks, such potential attacks must be analysed before these systems are massively deployed in security systems. Along with security, the privacy of users is also an important factor, as the constructions of lines in palmprints contain personal characteristics, a person can be recognised from face images, and fake signatures can be practised by carefully watching the signature images available in the database. We propose a cryptographic approach to encrypt the images of palmprints, faces, and signatures by an advanced Hill cipher technique for hiding the information in the images. It also protects these images from the above-mentioned attacks. So, during the feature extraction, the
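The classical Hill cipher underlying this approach multiplies blocks of pixel values by an invertible key matrix modulo the alphabet size. A minimal sketch over bytes (mod 256, 2×2 key) that illustrates only the basic cipher, not the authors' advanced variant; the key and "image" bytes are arbitrary:

```python
def hill_encrypt(data, key):
    """Classical Hill cipher over Z_256 on byte pairs: each plaintext pair p
    is mapped to c = K.p mod 256. `key` is a 2x2 matrix whose determinant
    must be odd (i.e. invertible modulo 256). `data` must have even length."""
    (a, b), (c, d) = key
    out = []
    for i in range(0, len(data), 2):
        p0, p1 = data[i], data[i + 1]
        out += [(a * p0 + b * p1) % 256, (c * p0 + d * p1) % 256]
    return bytes(out)

def hill_decrypt(data, key):
    """Invert the key matrix modulo 256 and reapply the cipher."""
    (a, b), (c, d) = key
    det = (a * d - b * c) % 256
    det_inv = pow(det, -1, 256)          # exists because det is odd
    inv = ((d * det_inv % 256, -b * det_inv % 256),
           (-c * det_inv % 256, a * det_inv % 256))
    return hill_encrypt(data, inv)

key = ((3, 3), (2, 5))                   # det = 9, odd, so invertible mod 256
pixels = bytes([10, 200, 17, 4])         # toy "image" data, even length
cipher = hill_encrypt(pixels, key)
```

Note that the plain Hill cipher is linear and therefore weak against known-plaintext attacks, which is precisely why hardened variants such as the one the abstract mentions are used.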

  1. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    International Nuclear Information System (INIS)

    2001-01-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security

  2. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the additional protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguard system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security.

  3. Verification Games: Crowd-Sourced Formal Verification

    Science.gov (United States)

    2016-03-01

additional paintbrushes. Additionally, in Paradox, human players are never given small optimization problems (for example, toggling the values of 50... were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam, and Paradox. Verification tools and games were integrated to verify...

  4. Verification of product design using regulation knowledge base and Web services

    Energy Technology Data Exchange (ETDEWEB)

Kim, Ik June [KAERI, Daejeon (Korea, Republic of); Lee, Jae Chul; Mun Du Hwan [Kyungpook National University, Daegu (Korea, Republic of); Kim, Byung Chul [Dong-A University, Busan (Korea, Republic of); Hwang, Jin Sang [PartDB Co., Ltd., Daejeon (Korea, Republic of); Lim, Chae Ho [Korea Institute of Industrial Technology, Incheon (Korea, Republic of)

    2015-11-15

Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of product design against the regulations related to a product is necessary. To this end, this study presents a new method for the verification of product design using a regulation knowledge base and Web services. The regulation knowledge base, consisting of product ontology and rules, was built with a hybrid technique combining ontology and programming languages. A Web service for design verification was developed, ensuring the flexible extension of the knowledge base. By virtue of these two technical features, design verification can be provided for various products while changes to the system architecture are minimized.

  5. Verification of product design using regulation knowledge base and Web services

    International Nuclear Information System (INIS)

    Kim, Ik June; Lee, Jae Chul; Mun Du Hwan; Kim, Byung Chul; Hwang, Jin Sang; Lim, Chae Ho

    2015-01-01

Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of product design against the regulations related to a product is necessary. To this end, this study presents a new method for the verification of product design using a regulation knowledge base and Web services. The regulation knowledge base, consisting of product ontology and rules, was built with a hybrid technique combining ontology and programming languages. A Web service for design verification was developed, ensuring the flexible extension of the knowledge base. By virtue of these two technical features, design verification can be provided for various products while changes to the system architecture are minimized.

  6. Compromises produced by the dialectic between self-verification and self-enhancement.

    Science.gov (United States)

    Morling, B; Epstein, S

    1997-12-01

    Three studies of people's reactions to evaluative feedback demonstrated that the dialectic between self-enhancement and self-verification results in compromises between these 2 motives, as hypothesized in cognitive-experiential self-theory. The demonstration was facilitated by 2 procedural improvements: Enhancement and verification were established by calibrating evaluative feedback against self appraisals, and degree of enhancement and of verification were varied along a continuum, rather than categorically. There was also support for the hypotheses that processing in an intuitive-experiential mode favors enhancement and processing in an analytical-rational mode favors verification in the kinds of situations investigated.

  7. Calibration and verification of surface contamination meters --- Procedures and techniques

    International Nuclear Information System (INIS)

    Schuler, C; Butterweck, G.; Wernli, C.; Bochud, F.; Valley, J.-F.

    2007-03-01

A standardised measurement procedure for surface contamination meters (SCM) is presented. The procedure aims at rendering surface contamination measurements simply and safely interpretable. Essential for the approach is the introduction and common use of the radionuclide-specific quantity 'guideline value', specified in the Swiss Radiation Protection Ordinance, as the unit for the measurement of surface activity. The corresponding radionuclide-specific 'guideline value count rate' can be summarized as a verification reference value for a group of radionuclides ('basis guideline value count rate'). The concept can be generalized for SCM of the same type or for SCM of different types using the same principle of detection. An SCM multi-source calibration technique is applied for the determination of the instrument efficiency. Four different electron radiation energy regions, four different photon radiation energy regions and an alpha radiation energy region are represented by a set of calibration sources built according to ISO standard 8769-2. A guideline value count rate, representing the activity per unit area of a surface contamination of one guideline value, can be calculated for any radionuclide using instrument efficiency, radionuclide decay data, contamination source efficiency, guideline value averaging area (100 cm²), and the radionuclide-specific guideline value. In this way, instrument responses for the evaluation of surface contaminations are obtained for radionuclides without available calibration sources as well as for short-lived radionuclides, for which the continuous replacement of certified calibration sources can lead to unreasonable costs. SCM verification is based on surface emission rates of reference sources with an active area of 100 cm². The verification for a given list of radionuclides is based on the radionuclide-specific quantity guideline value count rate.
Guideline value count rates for groups of radionuclides can be represented within the maximum
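The guideline value count rate described above can be sketched as a product of the listed factors. A hedged sketch only: parameter names and numbers are illustrative, and a real implementation would also fold in emission probabilities from radionuclide decay data:

```python
def guideline_value_count_rate(guideline_value_bq_cm2,
                               instrument_efficiency,
                               source_efficiency,
                               averaging_area_cm2=100.0):
    """Net count rate an SCM should read on a surface contaminated at exactly
    one guideline value: activity over the 100 cm2 averaging area, converted
    to emitted particles via the source (surface-emission) efficiency, then
    to counts via the instrument efficiency."""
    activity_bq = guideline_value_bq_cm2 * averaging_area_cm2
    emission_rate = activity_bq * source_efficiency   # emitted particles per second
    return emission_rate * instrument_efficiency      # counts per second

# Illustrative: guideline value 3 Bq/cm2, instrument efficiency 0.35,
# source efficiency 0.5 -> 52.5 counts per second.
rate = guideline_value_count_rate(3.0, 0.35, 0.5)
```

This is the sense in which instrument responses can be derived for radionuclides with no physical calibration source: every factor on the right-hand side is either measured once or tabulated.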

  8. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks, from knowledge acquisition to knowledge verification, consistently

  9. Transmutation Fuel Performance Code Thermal Model Verification

    Energy Technology Data Exchange (ETDEWEB)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    FRAPCON fuel performance code is being modified to be able to model performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort for verification of the FRAPCON thermal model. It was found that, with minor modifications, FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, code input, and calculation results.
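Thermal-model verification of this kind can be illustrated on a toy problem: a finite-difference steady-state conduction solve checked against the exact solution, here the linear profile of a rod with fixed end temperatures. This is a generic sketch of the verification idea, not FRAPCON's or ABAQUS's model:

```python
def solve_steady_rod(n, t_left, t_right, iters=20000):
    """Jacobi iteration for steady 1-D heat conduction with fixed end
    temperatures; the converged interior should match the linear analytic
    profile. Purely illustrative of thermal-model verification."""
    t = [t_left] + [0.0] * (n - 2) + [t_right]
    for _ in range(iters):
        t = [t[0]] + [(t[i - 1] + t[i + 1]) / 2
                      for i in range(1, n - 1)] + [t[-1]]
    return t

n = 11
numeric = solve_steady_rod(n, 400.0, 600.0)
analytic = [400.0 + 200.0 * i / (n - 1) for i in range(n)]   # exact solution
max_err = max(abs(a - b) for a, b in zip(numeric, analytic))
```

Agreement of `numeric` with `analytic` to within a stated tolerance is the same kind of evidence the report draws from the FRAPCON-versus-ABAQUS comparison, only with an analytic rather than code-to-code reference.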

  10. Verification and Validation of RADTRAN 5.5.

    Energy Technology Data Exchange (ETDEWEB)

    Osborn, Douglas.; Weiner, Ruth F.; Mills, George Scott; Hamp, Steve C.

    2005-02-01

This document contains a description of the verification and validation process used for the RADTRAN 5.5 code. The verification and validation process ensured the proper calculational models and mathematical and numerical methods were used in the RADTRAN 5.5 code for the determination of risk and consequence assessments. The differences between RADTRAN 5 and RADTRAN 5.5 are the addition of tables, an expanded isotope library, and the additional User-Defined meteorological option for accident dispersion.

  11. Comparing formal verification approaches of interlocking systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Nguyen, Hoang Nga; Roggenbach, Markus

    2016-01-01

The verification of railway interlocking systems is a challenging task, and therefore several research groups have suggested improving this task by using formal methods, but they use different modelling and verification approaches. To advance this research, there is a need to compare these approaches. As a first step towards this, in this paper we suggest a way to compare different formal approaches for verifying designs of route-based interlocking systems, and we demonstrate it on modelling and verification approaches developed within the research groups at DTU/Bremen and at Surrey/Swansea. The focus is on designs that are specified by so-called control tables. The paper can serve as a starting point for further comparative studies. The DTU/Bremen research has been funded by the RobustRailS project granted by Innovation Fund Denmark. The Surrey/Swansea research has been funded by the Safe...

  12. Software Testing and Verification in Climate Model Development

    Science.gov (United States)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
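A fine-grained "unit" test of the kind advocated here checks one small numerical kernel against conservation properties and hand-computed values, with no full-model run. A hypothetical example (the kernel and values are ours, not from any climate model):

```python
def diffuse_step(q, kappa):
    """One explicit diffusion step on a periodic 1-D field; a stand-in for
    the kind of small kernel found inside a larger model component."""
    n = len(q)
    return [q[i] + kappa * (q[(i - 1) % n] - 2 * q[i] + q[(i + 1) % n])
            for i in range(n)]

# Unit tests: mass conservation and one hand-computed point, checked in
# milliseconds rather than via a full simulation.
q0 = [1.0, 3.0, 2.0, 0.0]
q1 = diffuse_step(q0, 0.1)
assert abs(sum(q1) - sum(q0)) < 1e-12              # diffusion conserves mass
assert abs(q1[1] - (3.0 + 0.1 * (1.0 - 6.0 + 2.0))) < 1e-12
```

Such tests isolate defects to a single kernel, which is exactly the diagnostic power that system-level regression tests on full simulations lack.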

  13. Implementation and verification of global optimization benchmark problems

    Directory of Open Access Journals (Sweden)

    Posypkin Mikhail

    2017-12-01

Full Text Available The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, and the interval estimates of a function and its gradient on a given box, from a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that the literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
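One standard way such a library can produce a function value and its gradient from a single expression description is forward-mode automatic differentiation with dual numbers. A minimal sketch of that technique (not the actual C++ library's API):

```python
class Dual:
    """Forward-mode dual number: carries a value and a gradient vector, so one
    evaluation of an expression yields f(x) and grad f(x) together."""
    def __init__(self, val, grad):
        self.val, self.grad = val, grad
    def __add__(self, o):
        return Dual(self.val + o.val,
                    [a + b for a, b in zip(self.grad, o.grad)])
    def __mul__(self, o):
        # Product rule applied component-wise to the gradient vectors.
        return Dual(self.val * o.val,
                    [self.val * b + o.val * a
                     for a, b in zip(self.grad, o.grad)])

def value_and_gradient(f, point):
    """Seed each variable with a unit gradient, then evaluate f once."""
    n = len(point)
    seeds = [Dual(x, [1.0 if i == j else 0.0 for j in range(n)])
             for i, x in enumerate(point)]
    out = f(*seeds)
    return out.val, out.grad

# f(x, y) = x*y + x at (2, 3): value 8, gradient (y + 1, x) = (4, 2).
val, grad = value_and_gradient(lambda x, y: x * y + x, [2.0, 3.0])
```

Interval estimates on a box, which the library also produces, follow the same pattern with interval arithmetic substituted for real arithmetic.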

  14. Integrated Java Bytecode Verification

    DEFF Research Database (Denmark)

    Gal, Andreas; Probst, Christian; Franz, Michael

    2005-01-01

    Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program...

  15. Radiochemical verification and validation in the environmental data collection process

    International Nuclear Information System (INIS)

    Rosano-Reece, D.; Bottrell, D.; Bath, R.J.

    1994-01-01

    A credible and cost effective environmental data collection process should produce analytical data which meets regulatory and program specific requirements. Analytical data, which support the sampling and analysis activities at hazardous waste sites, undergo verification and independent validation before the data are submitted to regulators. Understanding the difference between verification and validation and their respective roles in the sampling and analysis process is critical to the effectiveness of a program. Verification is deciding whether the measurement data obtained are what was requested. The verification process determines whether all the requirements were met. Validation is more complicated than verification. It attempts to assess the impacts on data use, especially when requirements are not met. Validation becomes part of the decision-making process. Radiochemical data consists of a sample result with an associated error. Therefore, radiochemical validation is different and more quantitative than is currently possible for the validation of hazardous chemical data. Radiochemical data include both results and uncertainty that can be statistically compared to identify significance of differences in a more technically defensible manner. Radiochemical validation makes decisions about analyte identification, detection, and uncertainty for a batch of data. The process focuses on the variability of the data in the context of the decision to be made. The objectives of this paper are to present radiochemical verification and validation for environmental data and to distinguish the differences between the two operations
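The statistical comparison the abstract alludes to can be sketched directly: two radiochemical results, each a value with an uncertainty, differ significantly when their difference exceeds a multiple of the combined uncertainty. A hedged sketch with an assumed coverage factor (k = 1.96, roughly 95 % for 1-sigma inputs):

```python
import math

def results_differ(x1, u1, x2, u2, k=1.96):
    """True when two results (value, 1-sigma uncertainty) differ
    significantly: |x1 - x2| > k * sqrt(u1^2 + u2^2)."""
    return abs(x1 - x2) > k * math.sqrt(u1**2 + u2**2)

# 10.0 +/- 0.5 vs 10.4 +/- 0.5: difference 0.4, combined sigma ~0.71,
# threshold ~1.39 -> not a significant difference.
significant = results_differ(10.0, 0.5, 10.4, 0.5)
```

This is what makes radiochemical validation more quantitative than validation of hazardous chemical data: the reported uncertainty gives a defensible yardstick for whether two numbers genuinely disagree.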

  16. Automatic verification of a lip-synchronisation protocol using Uppaal

    NARCIS (Netherlands)

    Bowman, H.; Faconti, G.; Katoen, J.-P.; Latella, D.; Massink, M.

    1998-01-01

    We present the formal specification and verification of a lip-synchronisation protocol using the real-time model checker Uppaal. A number of specifications of this protocol can be found in the literature, but this is the first automatic verification. We take a published specification of the

  17. Numident Online Verification Utility (NOVU)

    Data.gov (United States)

    Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...

  18. Revitalization of the damaged machine parts by hard facing as a way of saving funds

    Directory of Open Access Journals (Sweden)

    Vukić Lazić

    2016-09-01

Full Text Available The objective of the research presented in this paper was to demonstrate the superiority of hard facing as a revitalization technology for various damaged machine parts. The analysis of two different revitalization methods for damaged machine parts is presented: replacement of the damaged part by a new spare part, and reparation by hard facing. The comparison is done on the example of hard facing versus replacement of damaged loader's teeth. The paper presents a method for calculating the costs of the two revitalization technologies based on their profitability, and for their comparison. That method could be applied to similar calculations for any machine part, with small or no adjustments. The paper presents a verification of the advantage of applying hard facing as the machine-part reparatory technology with respect to the other revitalization technology. The savings realized by application of hard facing reparation of the loader's teeth reach 73.5 % for one set of teeth and 82.40 % per annum of the costs for purchasing the new spare parts. The analysis was conducted under the assumption that the organization of the maintenance function is at an exceptionally high level, so that the purchasing of the new part/repairing of the damaged one is always done in time. This idealized approach was adopted since in that way one obtains the least economic effects of the reparatory technology application with respect to replacing the part with the spare one. In any other case the economic effects would be significantly higher, namely even more positive in favor of the hard facing revitalization technology.
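At its core, the profitability comparison reduces to the fraction of the spare-part cost avoided by repairing instead of replacing. A sketch with illustrative costs chosen to reproduce the reported one-set figure (the cost values themselves are not from the paper):

```python
def savings_fraction(cost_new_part, cost_repair):
    """Fraction of the spare-part purchase cost saved by hard-facing
    reparation instead of replacement."""
    return (cost_new_part - cost_repair) / cost_new_part

# Illustrative: repairing at 265 cost units vs buying at 1000 -> 73.5 % saving,
# matching the reported figure for one set of loader's teeth.
saving = savings_fraction(1000.0, 265.0)
```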

  19. Systematic study of source mask optimization and verification flows

    Science.gov (United States)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique, and a systematic study of the different flows and their possible unification is missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand their commonality and differences, and provide a generic guideline for RET selection via SMO. The paper discusses three different types of variation that commonly arise in SMO, namely pattern preparation and selection, availability of a relevant OPC recipe for a freeform source, and finally the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with the quality metrics of DOF (depth of focus) and MEEF (mask error enhancement factor) is examined.

  20. Organics Verification Study for Sinclair and Dyes Inlets, Washington

    Energy Technology Data Exchange (ETDEWEB)

    Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.

    2006-09-28

    Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the

  1. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    Science.gov (United States)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasts, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete time Markov chains to assess and improve the performance of our geomagnetic storm forecasts.
We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help
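    The 2x2 contingency-table skill assessment described above can be sketched as follows. The scores (probability of detection, false alarm ratio, Heidke skill score) are standard verification measures, but the counts and the `contingency_scores` helper here are illustrative placeholders, not MOSWOC data.

```python
# Sketch of 2x2 contingency-table verification, as used for CME arrival
# forecasts (a forecast counts as a hit if the predicted arrival falls
# within a tolerance window of the observed arrival). Counts are invented.

def contingency_scores(hits, misses, false_alarms, correct_negatives):
    """Standard skill scores from a 2x2 contingency table."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    pod = a / (a + c)                      # probability of detection
    far = b / (a + b)                      # false alarm ratio
    # Heidke Skill Score: fraction correct relative to random chance
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
    hss = (a + d - expected) / (n - expected)
    return pod, far, hss

pod, far, hss = contingency_scores(hits=18, misses=7,
                                   false_alarms=5, correct_negatives=30)
print(f"POD={pod:.2f}  FAR={far:.2f}  HSS={hss:.2f}")
```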

  2. Standard artifact for the geometric verification of terrestrial laser scanning systems

    Science.gov (United States)

    González-Jorge, H.; Riveiro, B.; Armesto, J.; Arias, P.

    2011-10-01

    Terrestrial laser scanners are geodetic instruments with applications in areas such as architecture, civil engineering or environment. Although it is common to receive the technical specifications of the systems from their manufacturers, no data-verification solutions are available on the market for users. This work proposes a standard artifact and a methodology to perform, in a simple way, the metrology verification of laser scanners. The artifact is manufactured using aluminium and delrin, materials that make the artifact robust and portable. The system consists of a set of five spheres situated at equal distances to one another, and a set of seven cubes of different sizes. A coordinate measuring machine with sub-millimetre precision is used for calibration purposes under controlled environmental conditions. After its calibration, the artifact can be used for the verification of metrology specifications given by manufacturers of laser scanners. The elements of the artifact are intended to test different metrological characteristics, such as accuracy, precision and resolution. The distance between centres of the spheres is used to obtain the accuracy data, the standard deviation of the top face of the largest cube is used to establish the precision (repeatability), and the error in the measurement of the cubes provides the resolution value in axes X, Y and Z. The methodology for the evaluation is mainly supported by least-squares fitting algorithms developed using Matlab programming. The artifact and methodology proposed were tested using a terrestrial laser scanner Riegl LMSZ-390i at three different ranges (10, 30 and 50 m) and four stepwidths (0.002°, 0.005°, 0.010° and 0.020°), both for horizontal and vertical displacements. Results obtained are in agreement with the accuracy and precision data given by the manufacturer, 6 and 4 mm, respectively. On the other hand, important influences between resolution and range and between resolution and
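    The least-squares fitting step used to locate the sphere centres can be sketched with a standard algebraic sphere fit. The abstract mentions Matlab; this is an equivalent NumPy sketch with invented test data, not the authors' code.

```python
import numpy as np

# Algebraic least-squares sphere fit: for points (x, y, z) on a sphere,
# x^2 + y^2 + z^2 = 2*cx*x + 2*cy*y + 2*cz*z + (r^2 - |c|^2),
# which is linear in the unknowns (cx, cy, cz, r^2 - |c|^2).

def fit_sphere(points):
    """Fit centre and radius of a sphere to an (N, 3) point cloud."""
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([2.0 * pts, np.ones(len(pts))])
    b = (pts ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    centre = sol[:3]
    radius = np.sqrt(sol[3] + centre @ centre)
    return centre, radius

# Noisy synthetic scan of a sphere of radius 0.0725 m centred at (1, 2, 0.5)
rng = np.random.default_rng(0)
u = rng.normal(size=(500, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)
pts = np.array([1.0, 2.0, 0.5]) + 0.0725 * u \
      + rng.normal(scale=1e-4, size=(500, 3))
centre, radius = fit_sphere(pts)
```

The recovered centres of neighbouring spheres can then be differenced against the calibrated inter-centre distances to obtain the accuracy figure.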

  3. The role of the United Nations in the field of verification

    International Nuclear Information System (INIS)

    1991-01-01

    By resolution 43/81 B of 7 December 1988, the General Assembly requested the Secretary-General to undertake, with the assistance of a group of qualified governmental experts, an in-depth study of the role of the United Nations in the field of verification. In August 1990, the Secretary-General transmitted to the General Assembly the unanimously approved report of the experts. The report is structured in six chapters and contains a bibliographic appendix on technical aspects of verification. The Introduction provides a brief historical background on the development of the question of verification in the United Nations context, culminating with the adoption by the General Assembly of resolution 43/81 B, which requested the study. Chapters II and III address the definition and functions of verification and the various approaches, methods, procedures and techniques used in the process of verification. Chapters IV and V examine the existing activities of the United Nations in the field of verification, possibilities for improvements in those activities as well as possible additional activities, while addressing the organizational, technical, legal, operational and financial implications of each of the possibilities discussed. Chapter VI presents the conclusions and recommendations of the Group.

  4. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Directory of Open Access Journals (Sweden)

    Jin-Won Park

    2009-01-01

    Full Text Available As VLSI technology has improved, a smart card employing 32-bit processors has been released, and more personal information such as medical and financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.
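    A rough sketch of the trade-off analysed above: where each verification module runs determines the total time, since the card CPU is orders of magnitude slower than the reader, but keeping the template on-card avoids encrypting and transmitting it. All instruction counts, MIPS figures, and the crypto overhead below are invented placeholders, not the paper's measurements.

```python
# Back-of-the-envelope comparison of two module-placement scenarios for
# smart-card fingerprint verification. All figures are illustrative.

CARD_MIPS, READER_MIPS = 5e6, 1e9       # instructions per second

stages = {                               # instruction counts per stage
    "feature_extraction": 1e8,
    "alignment": 2e7,
    "matching": 1e6,
}
AES_TEMPLATE_MS = 40.0                   # encrypting the template on-card

def ms(instructions, ips):
    """Execution time in milliseconds."""
    return 1000.0 * instructions / ips

# Scenario A: match-on-card (extraction/alignment on reader, matching on card;
# the template never leaves the card)
a = ms(stages["feature_extraction"] + stages["alignment"], READER_MIPS) \
    + ms(stages["matching"], CARD_MIPS)

# Scenario B: match-off-card (card releases an encrypted template, the
# reader does all computation)
b = AES_TEMPLATE_MS + ms(sum(stages.values()), READER_MIPS)

print(f"match-on-card: {a:.0f} ms, match-off-card: {b:.0f} ms")
```

The faster scenario is not automatically the right one; as the abstract notes, the choice also weighs the security level of letting fingerprint data leave the card.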

  5. Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System

    Science.gov (United States)

    Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won

    2009-12-01

    As VLSI technology has improved, a smart card employing 32-bit processors has been released, and more personal information such as medical and financial data can be stored in the card. Thus, it becomes important to protect personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies to implement fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.

  6. 45 CFR 1626.7 - Verification of eligible alien status.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Verification of eligible alien status. 1626.7... CORPORATION RESTRICTIONS ON LEGAL ASSISTANCE TO ALIENS § 1626.7 Verification of eligible alien status. (a) An alien seeking representation shall submit appropriate documents to verify eligibility, unless the only...

  7. A Quantitative Approach to the Formal Verification of Real-Time Systems.

    Science.gov (United States)

    1996-09-01

    A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos, September 1996, CMU-CS-96-199. The views and conclusions contained in this document are those of the author and should not be interpreted as representing the official policies, either expressed or implied, of NSF, the Semiconductor Research Corporation, ARPA or the U.S. government. Keywords: real-time systems, formal verification, symbolic

  8. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    Science.gov (United States)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  9. Distorted Fingerprint Verification System

    Directory of Open Access Journals (Sweden)

    Divya KARTHIKAESHWARAN

    2011-01-01

    Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by non-linear distortion introduced in the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, a fuzzy clustering based on distance and density has been used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework has been developed, and a cost-sensitive classifier was found to produce the best results. The system has been evaluated on a fingerprint database and the experimental results show that the system achieves a verification rate of 96%. This system plays an important role in forensic and civilian applications.
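    The reported verification rate is the fraction of genuine attempts accepted at the operating threshold, typically read off alongside the false accept rate. A minimal sketch with invented matcher scores (not the paper's data):

```python
# Verification rate and false accept rate at a fixed decision threshold.
# The score lists below are invented for illustration.

genuine  = [0.91, 0.85, 0.78, 0.95, 0.58, 0.88, 0.74, 0.92, 0.81, 0.97]
impostor = [0.12, 0.33, 0.25, 0.41, 0.08, 0.19, 0.52, 0.28, 0.15, 0.37]
THRESHOLD = 0.6

tp = sum(s >= THRESHOLD for s in genuine)    # genuine pairs accepted
fa = sum(s >= THRESHOLD for s in impostor)   # impostor pairs accepted

verification_rate = tp / len(genuine)        # a.k.a. true accept rate
false_accept_rate = fa / len(impostor)
print(f"verification rate: {verification_rate:.0%}, "
      f"FAR: {false_accept_rate:.0%}")
```

Sweeping the threshold over its range traces out the trade-off curve from which an operating point is chosen.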

  10. Enhanced verification test suite for physics simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.; Cotrell, David L.; Johnson, Bryan; Knupp, Patrick; Rider, William J.; Trucano, Timothy G.; Weirs, V. Gregory

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.
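    The core quantitative check in such verification analyses is whether errors against an exact (often manufactured) solution shrink at the discretization scheme's theoretical rate as the grid is refined. A minimal sketch, with illustrative error values:

```python
import math

# Observed order of convergence from errors on two successively refined
# grids: if error ~ h^p, then p = log(e_coarse/e_fine) / log(h_coarse/h_fine).
# The error values below are illustrative, not from the test suite.

def observed_order(e_coarse, e_fine, refinement=2.0):
    """Observed convergence order p from errors at two grid spacings."""
    return math.log(e_coarse / e_fine) / math.log(refinement)

# Errors from a nominally second-order scheme on grids of spacing h and h/2
p = observed_order(e_coarse=4.1e-3, e_fine=1.05e-3)
print(f"observed order: {p:.2f}")
```

An observed order well below the theoretical one on a verification problem flags a defect in the discretization or its implementation.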

  11. Test-driven verification/validation of model transformations

    Institute of Scientific and Technical Information of China (English)

    László LENGYEL; Hassan CHARAF

    2015-01-01

    Why is it important to verify/validate model transformations? The motivation is to improve the quality of the transformations, and therefore the quality of the generated software artifacts. Verified/validated model transformations make it possible to ensure certain properties of the generated software artifacts. In this way, verification/validation methods can guarantee different requirements stated by the actual domain against the generated/modified/optimized software products. For example, a verified/validated model transformation can ensure the preservation of certain properties during the model-to-model transformation. This paper emphasizes the necessity of methods that make model transformations verified/validated, discusses the different scenarios of model transformation verification and validation, and introduces the principles of a novel test-driven method for verifying/validating model transformations. We provide a solution that makes it possible to automatically generate test input models for model transformations. Furthermore, we collect and discuss the actual open issues in the field of verification/validation of model transformations.

  12. Lightweight Methods for Effective Verification of Software Product Lines with Off-the-Shelf Tools

    DEFF Research Database (Denmark)

    Iosif-Lazar, Alexandru Florin

    Certification is the process of assessing the quality of a product and whether it meets a set of requirements and adheres to functional and safety standards. It is often legally required to provide guarantee for human safety and to make the product available on the market. The certification process...... relies on objective evidence of quality, which is produced by using qualified and state-of-the-art tools and verification and validation techniques. Software product line (SPL) engineering distributes costs among similar products that are developed simultaneously. However, SPL certification faces major...... SPL reengineering projects that involve complex source code transformations. To facilitate product (re)certification, the transformation must preserve certain qualitative properties such as code structure and semantics—a difficult task due to the complexity of the transformation and because certain...

  13. Technical safety requirements control level verification

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. 
The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  14. Verification Survey of Uranium Mine Remediation

    International Nuclear Information System (INIS)

    Ron, Stager

    2009-01-01

    The Canadian Nuclear Safety Commission (CNSC) contracted an independent verification of an intensive gamma radiation survey conducted by a mining company to demonstrate that remediation of disturbed areas was complete. This site was the first of the recent mines being decommissioned in Canada and experience gained here may be applied to other mines being decommissioned in the future. The review included examination of the site-specific basis for clean-up criteria and ALARA as required by CNSC guidance. A paper review of the company report was conducted to determine if protocols were followed and that the summarized results could be independently reproduced. An independent verification survey was conducted on parts of the site and comparisons were made between gamma radiation measurements from the verification survey and the original company survey. Some aspects of data collection using rate meters linked to GPS data loggers are discussed, as are aspects of data management and the analysis methods required for the large amount of data collected during these surveys. Recommendations were made for implementation of future surveys and reporting the data from those surveys in order to ensure that remediation was complete. (authors)
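    The comparison between the verification survey and the original company survey typically reduces to paired statistics at co-located points. A minimal sketch with invented gamma dose-rate readings, not data from the report:

```python
import numpy as np

# Paired comparison of co-located gamma dose-rate measurements from the
# independent verification survey and the original company survey.
# Values are invented placeholders.

company      = np.array([0.21, 0.35, 0.18, 0.42, 0.27, 0.31])  # uGy/h
verification = np.array([0.23, 0.33, 0.20, 0.45, 0.26, 0.30])  # uGy/h

diff = verification - company
print(f"mean difference: {diff.mean():+.3f} uGy/h")
print(f"RMS difference:  {np.sqrt((diff ** 2).mean()):.3f} uGy/h")
```

A mean difference near zero with an RMS difference within instrument uncertainty supports the conclusion that the original survey is reproducible.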

  15. Verification and validation of RADMODL Version 1.0

    International Nuclear Information System (INIS)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  16. Verification and validation of RADMODL Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A), were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  17. Successful decoding of famous faces in the fusiform face area.

    Directory of Open Access Journals (Sweden)

    Vadim Axelrod

    Full Text Available What are the neural mechanisms of face recognition? It is believed that the network of face-selective areas, which spans the occipital, temporal, and frontal cortices, is important in face recognition. A number of previous studies indeed reported that face identity could be discriminated based on patterns of multivoxel activity in the fusiform face area and the anterior temporal lobe. However, given the difficulty in localizing the face-selective area in the anterior temporal lobe, its role in face recognition is still unknown. Furthermore, previous studies limited their analysis to occipito-temporal regions without testing identity decoding in more anterior face-selective regions, such as the amygdala and prefrontal cortex. In the current high-resolution functional Magnetic Resonance Imaging study, we systematically examined the decoding of the identity of famous faces in the temporo-frontal network of face-selective and adjacent non-face-selective regions. A special focus has been put on the face-area in the anterior temporal lobe, which was reliably localized using an optimized scanning protocol. We found that face-identity could be discriminated above chance level only in the fusiform face area. Our results corroborate the role of the fusiform face area in face recognition. Future studies are needed to further explore the role of the more recently discovered anterior face-selective areas in face recognition.

  18. Online 3D EPID-based dose verification: Proof of concept

    Energy Technology Data Exchange (ETDEWEB)

    Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozendaal@nki.nl; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben [Department of Radiation Oncology, The Netherlands Cancer Institute, Amsterdam 1066 CX (Netherlands); Herk, Marcel van [University of Manchester, Manchester Academic Health Science Centre, The Christie NHS Foundation Trust, Manchester M20 4BX (United Kingdom)

    2016-07-15

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame
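    The per-volume comparison described in the Methods can be sketched as below: the mean dose is compared in both volumes, and the near-maximum dose D2 (taken here as the 98th percentile of voxel doses) in the nontarget volume. The dose arrays are synthetic placeholders, not EPID reconstructions:

```python
import numpy as np

# Sketch of comparing a planned with a reconstructed 3D dose distribution
# per volume of interest, via the mean dose and the near-maximum dose D2
# (the dose exceeded in 2% of voxels). Dose arrays are synthetic.

rng = np.random.default_rng(1)
planned       = rng.normal(200.0, 5.0, size=10_000)            # cGy/voxel
reconstructed = planned + rng.normal(0.0, 2.0, size=10_000)

def mean_dose_diff(plan, recon):
    """Difference of mean doses over a volume of interest."""
    return recon.mean() - plan.mean()

def d2(dose):
    """Near-maximum dose: the 98th percentile of the voxel doses."""
    return np.percentile(dose, 98)

print(f"mean dose difference: "
      f"{mean_dose_diff(planned, reconstructed):+.2f} cGy")
print(f"D2 planned/reconstructed: "
      f"{d2(planned):.1f} / {d2(reconstructed):.1f} cGy")
```

In an online setting, these comparisons would be re-evaluated on the cumulative reconstructed dose as each EPID frame arrives, triggering a halt when a tolerance is exceeded.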

  19. Online 3D EPID-based dose verification: Proof of concept

    International Nuclear Information System (INIS)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; Herk, Marcel van

    2016-01-01

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame

  20. Online 3D EPID-based dose verification: Proof of concept.

    Science.gov (United States)

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. 
The complete processing of a single portal frame, including dose verification, took

  1. Real Time Face Quality Assessment for Face Log Generation

    DEFF Research Database (Denmark)

    Kamal, Nasrollahi; Moeslund, Thomas B.

    2009-01-01

    Summarizing a long surveillance video to just a few best-quality face images of each subject, a face-log, is of great importance in surveillance systems. Face quality assessment is the backbone for face log generation, and improving the quality assessment makes the face logs more reliable....... Developing a real time face quality assessment system using the most important facial features and employing it for face logs generation are the concerns of this paper. Extensive tests using four databases are carried out to validate the usability of the system....
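    A face quality score of this kind is commonly a weighted combination of normalized per-feature scores, with the highest-scoring frames entering the face log. The features and weights below are illustrative assumptions, not those of the paper:

```python
# Toy face quality score as a weighted sum of per-feature scores, each
# assumed normalized to [0, 1]. Feature names and weights are invented.

def face_quality(features, weights):
    """Weighted combination of normalized facial quality features."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * features[k] for k in weights)

weights = {"sharpness": 0.3, "brightness": 0.2,
           "resolution": 0.2, "pose": 0.3}
frame = {"sharpness": 0.8, "brightness": 0.9,
         "resolution": 0.6, "pose": 0.7}

q = face_quality(frame, weights)
print(f"quality: {q:.2f}")
```

For real-time use, each feature score must itself be cheap to compute per frame, which is why only the most important facial features are used.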

  2. Formal Verification of Real-Time System Requirements

    Directory of Open Access Journals (Sweden)

    Marcin Szpyrka

    2000-01-01

    Full Text Available The methodology of system requirements verification presented in this paper is a proposition of a practical procedure for reducing some negatives of the specification of requirements. The main problem that is considered is to create a complete description of the system requirements without any negatives. Verification of the initially defined requirements is based on coloured Petri nets. Those nets are useful for testing some properties of system requirements such as completeness, consistency and optimality. An example of the lift controller is presented.

  3. Finite Countermodel Based Verification for Program Transformation (A Case Study

    Directory of Open Access Journals (Sweden)

    Alexei P. Lisitsa

    2015-12-01

    Full Text Available Both automatic program verification and program transformation are based on program analysis. In the past decade a number of approaches using various automatic general-purpose program transformation techniques (partial deduction, specialization, supercompilation) for verification of unreachability properties of computing systems were introduced and demonstrated. On the other hand, the semantics-based unfold-fold program transformation methods pose diverse kinds of reachability tasks and try to solve them, aiming at improving the semantics tree of the program being transformed. That means some general-purpose verification methods may be used for strengthening program transformation techniques. This paper considers the question of how the finite-countermodel method for safety verification might be used in Turchin's supercompilation method. We extract a number of supercompilation sub-algorithms trying to solve reachability problems and demonstrate the use of an external countermodel finder for solving some of the problems.

  4. Technical workshop on safeguards, verification technologies, and other related experience

    International Nuclear Information System (INIS)

    1998-01-01

    The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System, its origins and evolution, and the present state of the art. Presentations by IAEA officials and outside experts also examined other components of the non-proliferation regime, the current practices and procedures, and the future prospects. A series of presentations described the characteristics of the interaction between global and regional verification systems and described relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation.

  5. Technical workshop on safeguards, verification technologies, and other related experience

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-12-31

    The aim of the Technical Workshop on safeguards was to encourage a clearer understanding of the IAEA Safeguards System: its origins, its evolution, and the present state of the art. Presentations by IAEA officials and outside experts also examined other components of the non-proliferation regime, current practices and procedures, and future prospects. A series of presentations described the interaction between global and regional verification systems and relevant past and present experience. The prominence given to such state-of-the-art verification technologies as environmental sampling, satellite imaging and monitoring through remote and unattended techniques demonstrated, beyond any doubt, the essentially dynamic nature of verification. It is generally acknowledged that there have been major achievements in preventing the spread of nuclear weapons, but no verification system can in itself prevent proliferation. Refs, figs, tabs.

  6. Learning a Genetic Measure for Kinship Verification Using Facial Images

    Directory of Open Access Journals (Sweden)

    Lu Kou

    2015-01-01

    Full Text Available Motivated by the key observation that children generally resemble their parents more than other persons with respect to facial appearance, distance metric (similarity) learning has been the dominant choice for state-of-the-art kinship verification via facial images in the wild. Most existing learning-based approaches to kinship verification, however, learn a genetic similarity measure in a batch manner, which limits scalability for practical applications with an ever-growing amount of data. To address this, we propose a new kinship verification approach that learns a sparse similarity measure in an online fashion. Experimental results on kinship datasets show that our approach is highly competitive with state-of-the-art alternatives in terms of verification accuracy, yet superior in terms of scalability for practical applications.
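The abstract does not give the update rule, but an OASIS-style online bilinear similarity update matches the "online similarity learning" description in spirit. The sketch below is an assumption, not the paper's algorithm; the learning rate, margin, and feature vectors are invented for illustration.

```python
def sim(W, x, y):
    """Bilinear similarity s(x, y) = x^T W y over feature vectors x, y."""
    return sum(W[i][j] * x[i] * y[j]
               for i in range(len(x)) for j in range(len(y)))

def online_update(W, anchor, kin, nonkin, margin=1.0, lr=0.1):
    """One online step on a triplet: if the kin pair is not scored at
    least `margin` above the non-kin pair, nudge W toward satisfying
    that constraint (hinge-loss / passive-aggressive style)."""
    loss = margin - sim(W, anchor, kin) + sim(W, anchor, nonkin)
    if loss > 0:
        for i in range(len(anchor)):
            for j in range(len(kin)):
                W[i][j] += lr * anchor[i] * (kin[j] - nonkin[j])
    return W
```

Processing one triplet at a time is what gives the online formulation its scalability: no pass over the full dataset is needed per update.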

  7. Towards Model Validation and Verification with SAT Techniques

    OpenAIRE

    Gogolla, Martin

    2010-01-01

    After sketching how system development and the UML (Unified Modeling Language) and the OCL (Object Constraint Language) are related, validation and verification with the tool USE (UML-based Specification Environment) is demonstrated. As a more efficient alternative for verification tasks, two approaches using SAT-based techniques are put forward: First, a direct encoding of UML and OCL with Boolean variables and propositional formulas, and second, an encoding employing an...

  8. Engineering a static verification tool for GPU kernels

    OpenAIRE

    Bardsley, E; Betts, A; Chong, N; Collingbourne, P; Deligiannis, P; Donaldson, AF; Ketema, J; Liew, D; Qadeer, S

    2014-01-01

    We report on practical experiences over the last 2.5 years related to the engineering of GPUVerify, a static verification tool for OpenCL and CUDA GPU kernels, plotting the progress of GPUVerify from a prototype to a fully functional and relatively efficient analysis tool. Our hope is that this experience report will serve the verification community by helping to inform future tooling efforts. © 2014 Springer International Publishing.

  9. Symposium on international safeguards: Verification and nuclear material security. Book of extended synopses. Addendum

    International Nuclear Information System (INIS)

    2001-01-01

    The symposium covered the topics related to international safeguards, verification and nuclear materials security, namely: verification and nuclear material security; the NPT regime: progress and promises; the Additional Protocol as an important tool for the strengthening of the safeguards system; the nuclear threat and the nuclear threat initiative. Eighteen sessions dealt with the following subjects: the evolution of IAEA safeguards (including strengthened safeguards, present and future challenges; verification of correctness and completeness of initial declarations; implementation of the Additional Protocol, progress and experience; security of material; nuclear disarmament and ongoing monitoring and verification in Iraq; evolution of IAEA verification in relation to nuclear disarmament); integrated safeguards; physical protection and illicit trafficking; destructive analysis for safeguards; the Additional Protocol; innovative safeguards approaches; IAEA verification and nuclear disarmament; environmental sampling; safeguards experience; safeguards equipment; panel discussion on development of state systems of accountancy and control; information analysis in the strengthened safeguards system; satellite imagery and remote monitoring; emerging IAEA safeguards issues; verification technology for nuclear disarmament; the IAEA and the future of nuclear verification and security

  10. Cross-modal face recognition using multi-matcher face scores

    Science.gov (United States)

    Zheng, Yufeng; Blasch, Erik

    2015-05-01

    The performance of face recognition can be improved using information fusion of multimodal images and/or multiple algorithms. When multimodal face images are available, cross-modal recognition is meaningful for security and surveillance applications. For example, a probe face may be a thermal image (especially at nighttime), while only visible face images are available in the gallery database. Matching a thermal probe face onto the visible gallery faces requires cross-modal matching approaches. A few such studies, implemented in facial feature space, achieved only medium recognition performance. In this paper, we propose a cross-modal recognition approach in which multimodal faces are cross-matched in feature space and the recognition performance is enhanced with stereo fusion at the image, feature and/or score level. In the proposed scenario, there are two cameras for stereo imaging, two face imagers (visible and thermal) in each camera, and three recognition algorithms (circular Gaussian filter, face pattern byte, linear discriminant analysis). A score vector is formed from the three cross-matched face scores produced by these algorithms. A classifier (e.g., k-nearest neighbor, support vector machine, binomial logistic regression [BLR]) is trained and then tested with the score vectors using 10-fold cross validation. The proposed approach was validated with a multispectral stereo face dataset from 105 subjects. Our experiments show very promising results: ACR (accuracy rate) = 97.84% and FAR (false accept rate) = 0.84% when cross-matching the fused thermal faces onto the fused visible faces using three face scores and the BLR classifier.
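The score-vector classification step described above (three matcher scores per probe-gallery comparison, fed to a k-NN, SVM or BLR classifier) can be sketched with a minimal k-nearest-neighbour rule. The score values and labels below are invented, not the paper's data.

```python
from math import dist  # Euclidean distance, Python 3.8+

def knn_fuse(train, probe, k=3):
    """Score-level fusion by k-NN: `train` is a list of
    (score_vector, label) pairs, `probe` a 3-element score vector
    (one score per matcher). Returns the majority label among the
    k nearest training score vectors."""
    neighbours = sorted(train, key=lambda sv: dist(sv[0], probe))[:k]
    votes = [label for _, label in neighbours]
    return max(set(votes), key=votes.count)

# Illustrative training score vectors: (CGF, FPB, LDA) scores.
train = [
    ([0.90, 0.80, 0.70], "match"),
    ([0.85, 0.90, 0.75], "match"),
    ([0.20, 0.10, 0.30], "nonmatch"),
    ([0.15, 0.20, 0.25], "nonmatch"),
    ([0.10, 0.30, 0.20], "nonmatch"),
]
```

In the paper the classifier is trained and evaluated under 10-fold cross validation; here a single train/probe split suffices to show the mechanics.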

  11. Meta-analytic review of the development of face discrimination in infancy: Face race, face gender, infant age, and methodology moderate face discrimination.

    Science.gov (United States)

    Sugden, Nicole A; Marquis, Alexandra R

    2017-11-01

    Infants show facility for discriminating between individual faces within hours of birth. Over the first year of life, infants' face discrimination continues to improve with familiar face types, such as own-race faces, but not with unfamiliar face types, like other-race faces. The goal of this meta-analytic review is to provide an effect size for infants' face discrimination ability overall, with own-race faces, and with other-race faces within the first year of life; how this differs with age; and how it is influenced by task methodology. Inclusion criteria were (a) infant participants aged 0 to 12 months, (b) completing a human own- or other-race face discrimination task, (c) with discrimination determined by infant looking. Our analysis included 30 works (165 samples; 1,926 participants completing 2,623 tasks). The effect size for infants' face discrimination was small, 6.53% greater than chance (i.e., equal looking to the novel and familiar). There was a significant difference in discrimination by race, overall (own-race, 8.18%; other-race, 3.18%) and between ages (own-race: 0- to 4.5-month-olds, 7.32%; 5- to 7.5-month-olds, 9.17%; and 8- to 12-month-olds, 7.68%; other-race: 0- to 4.5-month-olds, 6.12%; 5- to 7.5-month-olds, 3.70%; and 8- to 12-month-olds, 2.79%). Multilevel linear (mixed-effects) models were used to predict face discrimination; infants' capacity to discriminate faces is sensitive to face characteristics including race, gender, and emotion as well as the methods used, including task timing, coding method, and visual angle. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
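The pooled effects quoted above come from multilevel models; the basic inverse-variance pooling step underlying any such meta-analysis can be sketched as follows. The per-study effects and variances here are invented, not the review's data.

```python
def pooled_effect(effects, variances):
    """Fixed-effect meta-analytic pooling: inverse-variance weighted
    mean of per-study effect sizes, with its standard error."""
    weights = [1.0 / v for v in variances]
    estimate = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    std_error = (1.0 / sum(weights)) ** 0.5
    return estimate, std_error

# Two hypothetical studies with equal precision.
est, se = pooled_effect([8.0, 4.0], [1.0, 1.0])
```

A multilevel model additionally accounts for samples nested within works, which simple fixed-effect pooling does not.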

  12. Formal verification of reactor process control software using assertion checking environment

    International Nuclear Information System (INIS)

    Sharma, Babita; Balaji, Sowmya; John, Ajith K.; Bhattacharjee, A.K.; Dhodapkar, S.D.

    2005-01-01

    Assertion Checking Environment (ACE) was developed in-house for carrying out formal (rigorous/mathematical) functional verification of embedded software written in MISRA C. MISRA C is an industrially sponsored safe subset of the C programming language and is well accepted in the automotive and aerospace industries. ACE uses static assertion checking for verification of MISRA C programs. First, the functional specifications of the program are derived in the form of pre- and post-conditions for each C function. These pre- and post-conditions are then introduced as assertions (formal comments) in the program code. The annotated C code is then formally verified using ACE. In this paper we present our experience of using ACE for the formal verification of the process control software of a nuclear reactor. The Software Requirements Document (SRD) contained textual specifications of the process control software. The SRD was used by the designers to draw logic diagrams, which were given as input to a code generator. Verification of the generated C code was done at two levels, viz. (i) verification against specifications derived from the logic diagrams, and (ii) verification against specifications derived from the SRD. In this work we checked approximately 600 functional specifications of the software, which has roughly 15,000 lines of code. (author)
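The pre-/post-condition annotation style that ACE checks statically on MISRA C can be illustrated with a runtime analogue in Python. The decorator, function names, and the temperature-trip specification below are all invented for illustration; ACE itself proves such assertions without executing the code.

```python
def contract(pre, post):
    """Attach a precondition and postcondition to a function, in the
    spirit of ACE's pre-/post-condition assertions (checked here at
    runtime rather than statically)."""
    def wrap(f):
        def checked(*args):
            assert pre(*args), "precondition violated"
            result = f(*args)
            assert post(result, *args), "postcondition violated"
            return result
        return checked
    return wrap

@contract(pre=lambda t: 0 <= t <= 1000,            # sensor range, assumed
          post=lambda trip, t: trip == (t > 350))  # spec: trip iff t > 350
def overtemp_trip(temp_c):
    """Hypothetical trip signal: assert the spec 'trip iff temperature
    exceeds 350 C' as a postcondition."""
    return temp_c > 350
```

A static checker would prove the postcondition for every input admitted by the precondition; the runtime version only checks the inputs it actually sees.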

  13. IP cores design from specifications to production modeling, verification, optimization, and protection

    CERN Document Server

    Mohamed, Khaled Salah

    2016-01-01

    This book describes the life cycle process of IP cores, from specification to production, including IP modeling, verification, optimization, and protection. Various trade-offs in the design process are discussed, including those associated with many of the most common memory cores, controller IPs and system-on-chip (SoC) buses. Readers will also benefit from the author's practical coverage of new verification methodologies, such as bug localization, UVM, and scan-chain. A SoC case study is presented to compare traditional verification with the new verification methodologies. The book: discusses the entire life cycle process of IP cores, from specification to production, including IP modeling, verification, optimization, and protection; introduces Verilog from both the implementation and the verification points of view; demonstrates how to use IP in applications such as memory controllers and SoC buses; describes a new ver...

  14. Investigation of depth dependent changes in cerebral haemodynamics during face perception in infants

    International Nuclear Information System (INIS)

    Blasi, A; Fox, S; Everdell, N; Volein, A; Tucker, L; Csibra, G; Gibson, A P; Hebden, J C; Johnson, M H; Elwell, C E

    2007-01-01

    Near-infrared spectroscopy has been used to record oxygenation changes in the visual cortex of 4-month-old infants. Our in-house topography system, with 30 channels and 3 different source-detector separations, recorded changes in the concentration of oxy-, deoxy- and total haemoglobin (HbO2, HHb and HbT) in response to visual stimuli (face, scrambled visual noise, and cartoons as rest). The aim of this work was to demonstrate the capability of the system to spatially localize functional activation and to study the possibility of depth discrimination in the haemodynamic response. The group data show that both face stimulation and visual-noise stimulation induced significant increases in HbO2 from rest, but the increase in HbO2 with face stimulation was not significantly different from that seen with visual-noise stimulation. The face-induced increases in HbO2 were spread across a greater area at all depths than the visual-noise-induced changes. In results from a single subject there was a significant increase in HbO2 in the inferior area of the visual cortex in response to both types of stimuli, and a larger number of channels (source-detector pairs) showed an HbO2 increase to face stimuli, especially at the greatest depth. Activation maps were obtained using 3D reconstruction methods on multi-source-detector-separation optical topography data

  15. VAMOS: The verification and monitoring options study: Current research options for in-situ monitoring and verification of contaminant remediation and containment within the vadose zone

    International Nuclear Information System (INIS)

    Betsill, J.D.; Gruebel, R.D.

    1995-09-01

    The Verification and Monitoring Options Study Project (VAMOS) was established to identify high-priority options for future vadose-zone environmental research in the areas of in-situ remediation monitoring, post-closure monitoring, and containment emplacement and verification monitoring. VAMOS examined projected needs not currently being met with applied technology in order to develop viable monitoring and verification research options. The study emphasized a compatible systems approach to reinforce the need for utilizing compatible components to provide user-friendly site monitoring systems. To identify the needs and research options related to vadose-zone environmental monitoring and verification, a literature search and expert panel forums were conducted. The search covered present drivers for environmental monitoring technology, technology applications, and research efforts. The forums included scientific, academic, industry, and regulatory environmental professionals as well as end users of environmental technology. The experts evaluated current and future monitoring and verification needs, methods for meeting these needs, and viable research options and directions. A variety of high-priority technology development, user facility, and technology guidance research options were developed and presented as an outcome of the literature search and expert panel forums

  16. Voicing on Virtual and Face to Face Discussion

    Science.gov (United States)

    Yamat, Hamidah

    2013-01-01

    This paper presents and discusses findings of a study conducted on pre-service teachers' experiences in virtual and face-to-face discussions. Technology has brought learning beyond the classroom context or time zone. The learning context and process no longer rely solely on face-to-face communication in the presence of a teacher.…

  17. Facing Aggression: Cues Differ for Female versus Male Faces

    OpenAIRE

    Geniole, Shawn N.; Keyes, Amanda E.; Mondloch, Catherine J.; Carré, Justin M.; McCormick, Cheryl M.

    2012-01-01

    The facial width-to-height ratio (face ratio) is a sexually dimorphic metric associated with actual aggression in men and with observers' judgements of aggression in male faces. Here, we sought to determine if observers' judgements of aggression were associated with the face ratio in female faces. In three studies, participants rated photographs of female and male faces on aggression, femininity, masculinity, attractiveness, and nurturing. In Studies 1 and 2, for female and male faces, judge...

  18. Buzz: Face-to-Face Contact and the Urban Economy

    OpenAIRE

    Michael Storper; Anthony J. Venables

    2003-01-01

    This paper argues that existing models of urban concentrations are incomplete unless grounded in the most fundamental aspect of proximity: face-to-face contact. Face-to-face contact has four main features: it is an efficient communication technology; it can help solve incentive problems; it can facilitate socialization and learning; and it provides psychological motivation. We discuss each of these features in turn, and develop formal economic models of two of them. Face-to-face is particular...

  19. Remote infrared signage evaluation for transit stations and intersections.

    Science.gov (United States)

    Crandall, W; Brabyn, J; Bentzen, B L; Myers, L

    1999-10-01

    Opportunities for education and employment depend upon effective and independent travel. For mainstream society, this is accomplished to a large extent by printed signs. People who are print disabled, visually impaired, or totally blind are at a disadvantage because they do not have access to signage. Remote infrared signage, such as the Talking Signs (TS) system, provides a solution to this need by labeling the environment for distant viewing. The system uses a transmitting "sign" and a hand-held receiver to tell people about their surroundings. In a seamless infrared signage environment, a visually impaired traveler could walk safely across an intersection to an ATM or fare machine, from fare machine to bus stop, from bus stop to bus, from bus to building, from building to elevator, from elevator to office, from office to restroom, and so forth. This paper focuses on two problems that are among the most challenging and dangerous faced by blind travelers: negotiating complex transit stations and controlled intersections. We report on human factors studies of TS in these critical tasks, examining such issues as how much training is needed to use the system, its impact on performance and safety, benefits for different population subgroups, and user opinions of its value. Results indicate that blind people can quickly and easily learn to use remote infrared signage effectively, and that its use improves travel safety, efficiency, and independence.

  20. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    International Nuclear Information System (INIS)

    Luke, S.J.

    2011-01-01

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers when used with a measurement system allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led on the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether - in a biological weapons verification regime - the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the

  1. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    Energy Technology Data Exchange (ETDEWEB)

    Luke, S J

    2011-12-20

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers when used with a measurement system allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led on the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether - in a biological weapons verification regime - the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the

  2. Verification of industrial x-ray machine: MINT's experience

    International Nuclear Information System (INIS)

    Aziz Amat; Saidi Rajab; Eesan Pasupathi; Saipo Bahari Abdul Ratan; Shaharudin Sayuti; Abd Nassir Ibrahim; Abd Razak Hamzah

    2005-01-01

    Radiation and electrical safety of industrial x-ray equipment is required to meet Atomic Energy Licensing Board (AELB) guidelines (LEM/TEK/42) at the time of installation, and periodic verification should subsequently be ensured. The purpose of the guide is to explain the requirements for testing industrial x-ray apparatus so that it can be certified as meeting local legislation and regulation. Verification aims to provide safety assurance on electrical requirements and minimum radiation exposure to the operator. This regulation applies to new models imported into the Malaysian market. Since June 1997, the Malaysian Institute for Nuclear Technology Research (MINT) has been approved by the AELB to provide verification services to private companies, government and corporate bodies throughout Malaysia. In early January 1997, the AELB made it mandatory that all x-ray equipment for industrial purposes (especially industrial radiography) fulfil performance tests based on the LEM/TEK/42 guidelines. MINT, as the third-party verifier, encourages users to improve maintenance of the equipment. MINT's experience in measuring the performance of intermittent and continuous duty rating single-phase industrial x-ray machines in 2004 indicated that all irradiating apparatus tested passed and met the requirements of the guideline. From MINT records for 1997 to 2005, three x-ray models did not meet the requirements and thus were not allowed to be used unless the manufacturers were willing to modify them to meet AELB requirements. These verification procedures for the electrical and radiation safety of industrial x-rays have significantly improved the maintenance culture and safety awareness in the use of x-ray apparatus in the industrial environment. (Author)

  3. Infrared astronomy

    International Nuclear Information System (INIS)

    Setti, G.; Fazio, G.

    1978-01-01

    This volume contains lectures describing the important achievements in infrared astronomy. The topics included are galactic infrared sources and their role in star formation, the nature of the interstellar medium and galactic structure, the interpretation of infrared, optical and radio observations of extra-galactic sources and their role in the origin and structure of the universe, instrumental techniques and a review of future space observations. (C.F.)

  4. Experimental preparation and verification of quantum money

    Science.gov (United States)

    Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei

    2018-03-01

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10^6 states in one verification round, limiting the forging probability to 10^-7 based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.

  5. VBMC: a formal verification tool for VHDL programs

    International Nuclear Information System (INIS)

    Ajith, K.J.; Bhattacharjee, A.K.

    2014-01-01

    The design of Control and Instrumentation (C and I) systems used in safety critical applications such as nuclear power plants involves partitioning of the overall system functionality into subparts and implementing each subpart in hardware and/or software as appropriate. With the increasing use of programmable devices like FPGAs, the hardware subsystems are often implemented in Hardware Description Languages (HDL) like VHDL. Since functional bugs in such hardware subsystems used in safety critical C and I systems have disastrous consequences, it is important to use rigorous reasoning to verify the functionalities of the HDL models. This paper describes an indigenously developed software tool named VBMC (VHDL Bounded Model Checker) for mathematically proving/refuting functional properties of hardware designs described in VHDL. VBMC accepts the hardware design as a VHDL program file, a functional property in PSL, and a verification bound (number of cycles of operation) as inputs. It either reports that the design satisfies the functional property for the given verification bound or generates a counterexample showing the reason for the violation. In case of satisfaction, the proof holds good for the verification bound. VBMC has been used for the functional verification of FPGA based intelligent I/O boards developed at Reactor Control Division, BARC. (author)

  6. VBMC: a formal verification tool for VHDL program

    International Nuclear Information System (INIS)

    Ajith, K.J.; Bhattacharjee, A.K.

    2014-08-01

    The design of Control and Instrumentation (C and I) systems used in safety critical applications such as nuclear power plants involves partitioning of the overall system functionality into sub-parts and implementing each sub-part in hardware and/or software as appropriate. With the increasing use of programmable devices like FPGAs, the hardware subsystems are often implemented in Hardware Description Languages (HDL) like VHDL. Since functional bugs in such hardware subsystems used in safety critical C and I systems have serious consequences, it is important to use rigorous reasoning to verify the functionalities of the HDL models. This report describes the design of a software tool named VBMC (VHDL Bounded Model Checker), which proves or refutes functional properties of hardware designs described in VHDL. VBMC accepts the design as a VHDL program file, a functional property in PSL, and a verification bound (number of cycles of operation) as inputs. It either reports that the design satisfies the functional property for the given verification bound or generates a counterexample showing the reason for the violation. In case of satisfaction, the proof holds good for the verification bound. VBMC has been used for the functional verification of FPGA based intelligent I/O boards developed at Reactor Control Division, BARC. (author)
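The report-satisfy-or-produce-counterexample behaviour described above can be illustrated with a toy bounded check: unroll a design for a fixed number of cycles and search for a property violation, returning a violating input trace if one exists. VBMC does this symbolically via a SAT/SMT encoding; the exhaustive simulation and the 2-bit-counter "design" below are illustrative assumptions only.

```python
from itertools import product

def bmc(step, init, prop, inputs, bound):
    """Bounded check: explore all input sequences of length `bound`
    from state `init`. Return a counterexample trace of (input, state)
    pairs if `prop` is violated, or None if it holds up to the bound."""
    for seq in product(inputs, repeat=bound):
        state, trace = init, []
        for inp in seq:
            state = step(state, inp)
            trace.append((inp, state))
            if not prop(state):
                return trace  # counterexample found
    return None

# Illustrative "design": a 2-bit counter with an increment input.
# Property under check: the counter never reaches 3.
step = lambda s, inc: (s + inc) % 4
cex = bmc(step, 0, lambda s: s != 3, inputs=[0, 1], bound=3)
```

With bound 3 the property is refuted (three increments reach 3); with bound 2 the same check reports satisfaction, mirroring how a bounded proof holds only up to the verification bound.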

  7. European cinema: face to face with Hollywood

    NARCIS (Netherlands)

    Elsaesser, T.

    2005-01-01

    In the face of renewed competition from Hollywood since the early 1980s and the challenges posed to Europe's national cinemas by the fall of the Wall in 1989, independent filmmaking in Europe has begun to re-invent itself. European Cinema: Face to Face with Hollywood re-assesses the different

  8. Independent verification: operational phase liquid metal breeder reactors

    International Nuclear Information System (INIS)

    Bourne, P.B.

    1981-01-01

    The Fast Flux Test Facility (FFTF) recently achieved 100-percent power and now is in the initial stages of operation as a test reactor. An independent verification program has been established to assist in maintaining stable plant conditions, and to assure the safe operation of the reactor. Independent verification begins with the development of administrative procedures to control all other procedures and changes to the plant configurations. The technical content of the controlling procedures is subject to independent verification. The actual accomplishment of test procedures and operational maneuvers is witnessed by personnel not responsible for operating the plant. Off-normal events are analyzed, problem reports from other operating reactors are evaluated, and these results are used to improve on-line performance. Audits are used to confirm compliance with established practices and to identify areas where individual performance can be improved

  9. TWRS system drawings and field verification

    International Nuclear Information System (INIS)

    Shepard, D.G.

    1995-01-01

    The Configuration Management Program combines the TWRS Labeling and O and M drawing and drawing verification programs. The combined program will produce system drawings for systems that are normally operated or have maintenance performed on them, label individual pieces of equipment for proper identification even if system drawings are not warranted, and perform verification of drawings that are identified as essential in Tank Farm Essential Drawing Plans. During fiscal year 1994, work was begun to label Tank Farm components and provide user-friendly system-based drawings for Tank Waste Remediation System (TWRS) operations and maintenance. During the first half of fiscal year 1995, the field verification program continued to convert TWRS drawings into CAD format and verify their accuracy based on visual inspections. During the remainder of fiscal year 1995 these efforts will be combined into a single program providing system-based drawings and field verification of TWRS equipment and facilities. This combined program for TWRS will include all active systems for tank farms. Operations will determine the extent of drawing and labeling requirements for single shell tanks, i.e. the electrical distribution, HVAC, leak detection, and radiation monitoring systems. The tasks required to meet these objectives include the following: identify system boundaries or scope for each drawing being verified; label equipment/components in the process systems with a unique Equipment Identification Number (EIN) per the TWRS Data Standard; develop system drawings that are coordinated by "smart" drawing numbers and/or drawing references as identified on H-14-020000; develop a Master Equipment List (MEL) multi-user database application which will contain key information about equipment identified in the field; and field verify and release TWRS Operation and Maintenance (O and M) drawings

  10. The Caledonian face test: A new test of face discrimination.

    Science.gov (United States)

    Logan, Andrew J; Wilkinson, Frances; Wilson, Hugh R; Gordon, Gael E; Loffler, Gunter

    2016-02-01

    This study aimed to develop a clinical test of face perception which is applicable to a wide range of patients and can capture normal variability. The Caledonian face test utilises synthetic faces which combine simplicity with sufficient realism to permit individual identification. Face discrimination thresholds (i.e. the minimum difference between faces required for accurate discrimination) were determined in an "odd-one-out" task. The difference between faces was controlled by an adaptive QUEST procedure. A broad range of face discrimination sensitivity was determined from a group (N=52) of young adults (mean 5.75%; SD 1.18; range 3.33-8.84%). The test is fast (3-4 min), repeatable (test-retest r² = 0.795) and demonstrates a significant inversion effect. The potential to identify impairments of face discrimination was evaluated by testing LM, who reported a lifelong difficulty with face perception. While LM's impairment on two established face tests was close to the criterion for significance (Z-scores of -2.20 and -2.27), for the Caledonian face test her Z-score was -7.26, implying a more than threefold higher sensitivity. The new face test provides a quantifiable and repeatable assessment of face discrimination ability. The enhanced sensitivity suggests that the Caledonian face test may be capable of detecting more subtle impairments of face perception than available tests. Copyright © 2015 Elsevier Ltd. All rights reserved.
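
    Thresholds on a test like this are commonly compared against group norms via a Z-score; a minimal sketch using the means and SD reported above (the computation itself is assumed, and sign conventions for impairment scores vary between tests):

```python
def z_score(threshold, group_mean=5.75, group_sd=1.18):
    """Express an individual's discrimination threshold (in %) as a
    Z-score relative to the reported group norms. A positive z here
    simply means a higher (worse) threshold than the group mean."""
    return (threshold - group_mean) / group_sd

# The top of the reported normal range (8.84%) sits about 2.6 SD above the mean:
print(round(z_score(8.84), 2))  # 2.62
```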

  11. Face-Lift Satisfaction Using the FACE-Q.

    Science.gov (United States)

    Sinno, Sammy; Schwitzer, Jonathan; Anzai, Lavinia; Thorne, Charles H

    2015-08-01

    Face lifting is one of the most common operative procedures for facial aging and perhaps the procedure most synonymous with plastic surgery in the minds of the lay public, but no verifiable documentation of patient satisfaction exists in the literature. This study is the first to examine face-lift outcomes and patient satisfaction using a validated questionnaire. One hundred five patients undergoing a face lift performed by the senior author (C.H.T.) using a high, extended-superficial musculoaponeurotic system with submental platysma approximation technique were asked to complete the FACE-Q anonymously by e-mail. FACE-Q scores were assessed for each domain (range, 0 to 100), with higher scores indicating greater satisfaction with appearance or superior quality of life. Fifty-three patients completed the FACE-Q (50.5 percent response rate). Patients demonstrated high satisfaction with facial appearance (mean ± SD, 80.7 ± 22.3), and quality of life, including social confidence (90.4 ± 16.6), psychological well-being (92.8 ± 14.3), and early life impact (92.2 ± 16.4). Patients also reported extremely high satisfaction with their decision to undergo face lifting (90.5 ± 15.9). On average, patients felt they appeared 6.9 years younger than their actual age. Patients were most satisfied with the appearance of their nasolabial folds (86.2 ± 18.5), cheeks (86.1 ± 25.4), and lower face/jawline (86.0 ± 20.6), compared with their necks (78.1 ± 25.6) and area under the chin (67.9 ± 32.3). Patients who responded in this study were extremely satisfied with their decision to undergo face lifting and the outcomes and quality of life following the procedure.

  12. CATS Deliverable 5.1 : CATS verification of test matrix and protocol

    OpenAIRE

    Uittenbogaard, J.; Camp, O.M.G.C. op den; Montfort, S. van

    2016-01-01

    This report summarizes the work conducted within work package (WP) 5 "Verification of test matrix and protocol" of the Cyclist AEB testing system (CATS) project. It describes the verification process of the draft CATS test matrix resulting from WP1 and WP2, and the feasibility of meeting requirements set by CATS consortium based on requirements in Euro NCAP AEB protocols regarding accuracy, repeatability and reproducibility using the developed test hardware. For the cases where verification t...

  13. A Face Inversion Effect without a Face

    Science.gov (United States)

    Brandman, Talia; Yovel, Galit

    2012-01-01

    Numerous studies have attributed the face inversion effect (FIE) to configural processing of internal facial features in upright but not inverted faces. Recent findings suggest that face mechanisms can be activated by faceless stimuli presented in the context of a body. Here we asked whether faceless stimuli with or without body context may induce…

  14. Formal verification of industrial control systems

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Verification of critical software is a high priority but a challenging task for industrial control systems. For many kinds of problems, testing is not an efficient method. Formal methods, such as model checking, appear to be an appropriate complementary method. However, model checking is not yet common in industry, as it typically requires formal-methods expertise and huge computing power. In the EN-ICE-PLC section, we are working on a [methodology][1] and a tool ([PLCverif][2]) to overcome these challenges and to integrate formal verification into the development process of our PLC-based control systems. [1]: http://cern.ch/project-plc-formalmethods [2]: http://cern.ch/plcverif

  15. Focussed approach to verification under FMCT

    International Nuclear Information System (INIS)

    Bragin, V.; Carlson, J.; Bardsley, J.; Hill, J.

    1998-01-01

    FMCT will have different impacts on individual states due to the enormous variance in their nuclear fuel cycles and the associated fissile material inventories. The problem is how to negotiate a treaty that would achieve results favourable for all participants, given that interests and priorities vary so much. We believe that focussed verification, confined to safeguarding of enrichment and reprocessing facilities in NWS and TS, coupled with verification of unirradiated direct-use material produced after entry-into-force of a FMCT and supported with measures to detect possible undeclared enrichment and reprocessing activities, is technically adequate for the FMCT. Eventually this would become the appropriate model for all states party to the NPT

  16. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1999-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic...

  17. Formal Development and Verification of a Distributed Railway Control System

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan

    1998-01-01

    In this article we introduce the concept for a distributed railway control system and present the specification and verification of the main algorithm used for safe distributed control. Our design and verification approach is based on the RAISE method, starting with highly abstract algebraic spec...

  18. Verification ghosts. The changing political environment of the IAEA

    International Nuclear Information System (INIS)

    Redden, K.J.

    2003-01-01

    Six years ago, Dr. Hans Blix wrote in the IAEA Bulletin of a 'general optimism about further arms control and verification.' At the time, world events warranted such a prognosis; the IAEA was riding a wave of momentum after its instrumental role in the roll-back of the South African nuclear weapons program and bringing Ukraine, Belarus, and Kazakhstan into the Nuclear Non Proliferation Treaty (NPT) as non-nuclear-weapon States. The NPT's indefinite extension was only two years old, and the most pressing challenges, while recognizable, were somewhat stagnant. Today, some tidings elicit similar optimism. The IAEA's increasing efforts to combat terrorism and the decision by Member States to depart from nearly 20 years of zero real growth budgetary policy are remarkable testaments to the Agency's adaptability and credibility in the face of new threats. And with the worldwide frenzy over terrorism and redoubled phobia of weapons of mass destruction (WMD), the Agency garners public attention now as never before. Emblematic of this recent upsurge in political attention, US President George W. Bush's annual State of the Union address in 2003 mentioned supporting the IAEA as a specific priority of his administration, the first mention of the Agency in that speech since President Eisenhower in 1961 lauded its creation under 'Atoms for Peace'. Such visibility portends a future with prospects for overcoming bureaucratic inertia and effecting significant changes to the Agency's benefit. But with that visibility has come an uncertainty about the IAEA's role in world affairs. Despite being able to resolve most benign problems more easily, the Agency must operate in an environment haunted by the non-proliferation analogue of Charles Dickens' triumvirate specters: the ghosts of verification challenges past, present and future -namely, the cessation of UN-mandated inspections in Iraq, the difficulties ensuring compliance in North Korea and Iran, and the need to maintain the IAEA

  19. Time-space modal logic for verification of bit-slice circuits

    Science.gov (United States)

    Hiraishi, Hiromi

    1996-03-01

    The major goal of this paper is to propose a new modal logic aimed at the formal verification of bit-slice circuits. The new logic is called time-space modal logic; its major feature is that it handles two transition relations: one for time transitions and the other for space transitions. As a verification algorithm, a symbolic model checking algorithm for the new logic is presented. The approach could be applicable to the verification of bit-slice microprocessors of infinite bit width and 1D systolic arrays of infinite length. A simple benchmark result shows the effectiveness of the proposed approach.
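
    The two-relation idea can be illustrated with a tiny explicit-state model: states are (time, slice) pairs, and reachability is a least fixpoint over both transition relations. The states, relations, and property below are invented for the demo; a real symbolic checker would represent the relations with BDDs rather than dictionaries:

```python
from itertools import product

# A 3x3 grid of (time step, bit-slice) states with one relation per dimension.
states = set(product(range(3), range(3)))
time_next  = {st: {(st[0] + 1, st[1])} for st in states if st[0] < 2}
space_next = {st: {(st[0], st[1] + 1)} for st in states if st[1] < 2}

def reachable(start, relations):
    """Least-fixpoint reachability over a list of transition relations."""
    seen, frontier = {start}, [start]
    while frontier:
        st = frontier.pop()
        for rel in relations:
            for nxt in rel.get(st, ()):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return seen

# Interleaving time and space transitions covers the whole grid:
print(len(reachable((0, 0), [time_next, space_next])))  # 9
```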

  20. SoC Design Approach Using Convertibility Verification

    Directory of Open Access Journals (Sweden)

    Basu Samik

    2008-01-01

    Compositional design of systems on chip from preverified components helps to achieve shorter design cycles and time to market. However, the design process is affected by the issue of protocol mismatches, where two components fail to communicate with each other due to protocol differences. Convertibility verification, which involves the automatic generation of a converter to facilitate communication between two mismatched components, is a collection of techniques to address protocol mismatches. We present an approach to convertibility verification using module checking. We use Kripke structures to represent protocols and temporal logic to describe desired system behavior. A tableau-based converter generation algorithm is presented and shown to be sound and complete. We have developed a prototype implementation of the proposed algorithm and have used it to verify that it can handle many classical protocol mismatch problems along with SoC problems. The initial idea for this temporal-logic-based convertibility verification was presented at SLA++P '07, in the work by Roopak Sinha et al., 2008.
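
    The mismatch detection that motivates converter generation can be sketched on a toy scale: two component protocols, modelled here as labelled transition systems (a simplification of the Kripke structures above), are composed by synchronising on shared actions, and a reachable state with no agreed action signals a mismatch a converter would have to bridge. The protocols below are invented:

```python
# A producer that emits 'req' then expects 'ack'; a consumer that answers
# 'req' with 'nack'. The composition deadlocks, exposing the mismatch.
producer = {"p0": {"req": "p1"}, "p1": {"ack": "p0"}}
consumer = {"c0": {"req": "c1"}, "c1": {"nack": "c0"}}

def mismatched(a, b, start=("p0", "c0")):
    """Reachability over the synchronous product; True if some reachable
    composite state offers no action both components can take."""
    seen, frontier = {start}, [start]
    while frontier:
        sa, sb = frontier.pop()
        shared = set(a[sa]) & set(b[sb])
        if not shared:
            return True  # deadlock: no common action
        for act in shared:
            nxt = (a[sa][act], b[sb][act])
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False

print(mismatched(producer, consumer))  # True: ack/nack disagreement after 'req'
```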

  1. Human ear detection in the thermal infrared spectrum

    Science.gov (United States)

    Abaza, Ayman; Bourlai, Thirimachos

    2012-06-01

    In this paper, the problem of human ear detection in the thermal infrared (IR) spectrum is studied in order to illustrate the advantages and limitations of the most important steps of ear-based biometrics that can operate in day and night environments. The main contributions of this work are two-fold. First, a dual-band database is assembled that consists of visible and thermal profile face images. The thermal data was collected using a high-definition mid-wave infrared (3-5 microns) camera capable of acquiring thermal imprints of human skin. Second, a fully automated, thermal-imaging-based ear detection method is developed for real-time segmentation of human ears in either day or night environments. The proposed method is based on Haar features forming a cascaded AdaBoost classifier (our modified version of the original Viola-Jones approach, which was designed to be applied mainly to visible-band images). The main advantage of the proposed method, applied to our profile face image data set collected in the thermal band, is that it reduces the learning time required by the original Viola-Jones method from several weeks to several hours. Unlike other approaches reported in the literature, which have been tested but not designed to operate in the thermal band, our method yields a high detection accuracy that reaches ~91.5%. Further analysis of our data set showed that: (a) photometric normalization techniques do not directly improve ear detection performance; however, when using a certain photometric normalization technique (CLAHE) on falsely detected images, the detection rate improved by ~4%; (b) the high detection accuracy of our method did not degrade when we lowered the original spatial resolution of the thermal ear images; for example, even after using one third of the original spatial resolution (i.e., ~20% of the original computational time) of the thermal profile face images, the high ear detection accuracy of our method
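
    The speed of Haar-feature cascades such as the Viola-Jones variant above rests on the integral-image trick: any rectangle sum costs four lookups. A minimal sketch (the image values and feature window are invented; real detectors evaluate thousands of such features per window):

```python
import numpy as np

def integral_image(img):
    """Cumulative row/column sums; ii[r, c] = sum of img[:r+1, :c+1]."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] (exclusive ends) from the integral image."""
    total = ii[r1 - 1, c1 - 1]
    if r0 > 0: total -= ii[r0 - 1, c1 - 1]
    if c0 > 0: total -= ii[r1 - 1, c0 - 1]
    if r0 > 0 and c0 > 0: total += ii[r0 - 1, c0 - 1]
    return total

def two_rect_feature(img, r0, c0, r1, c1):
    """Left-minus-right two-rectangle Haar-like feature."""
    ii = integral_image(img)
    cm = (c0 + c1) // 2
    return rect_sum(ii, r0, c0, r1, cm) - rect_sum(ii, r0, cm, r1, c1)

img = np.array([[1, 1, 5, 5],
                [1, 1, 5, 5],
                [1, 1, 5, 5],
                [1, 1, 5, 5]], dtype=float)
print(two_rect_feature(img, 0, 0, 4, 4))  # -32.0: a strong vertical edge
```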

  2. THE BOLOCAM GALACTIC PLANE SURVEY. VIII. A MID-INFRARED KINEMATIC DISTANCE DISCRIMINATION METHOD

    Energy Technology Data Exchange (ETDEWEB)

    Ellsworth-Bowers, Timothy P.; Glenn, Jason; Battersby, Cara; Ginsburg, Adam; Bally, John [CASA, University of Colorado, UCB 389, University of Colorado, Boulder, CO 80309 (United States); Rosolowsky, Erik [Department of Physics and Astronomy, University of British Columbia Okanagan, 3333 University Way, Kelowna, BC V1V 1V7 (Canada); Mairs, Steven [Department of Physics and Astronomy, University of Victoria, 3800 Finnerty Road, Victoria, BC V8P 1A1 (Canada); Evans, Neal J. II [Department of Astronomy, University of Texas, 1 University Station C1400, Austin, TX 78712 (United States); Shirley, Yancy L., E-mail: timothy.ellsworthbowers@colorado.edu [Steward Observatory, University of Arizona, 933 North Cherry Avenue, Tucson, AZ 85721 (United States)

    2013-06-10

    We present a new distance estimation method for dust-continuum-identified molecular cloud clumps. Recent (sub-)millimeter Galactic plane surveys have cataloged tens of thousands of these objects, plausible precursors to stellar clusters, but detailed study of their physical properties requires robust distance determinations. We derive Bayesian distance probability density functions (DPDFs) for 770 objects from the Bolocam Galactic Plane Survey in the Galactic longitude range 7.5° ≤ l ≤ 65°. The DPDF formalism is based on kinematic distances, and uses any number of external data sets to place prior distance probabilities to resolve the kinematic distance ambiguity (KDA) for objects in the inner Galaxy. We present here priors related to the mid-infrared absorption of dust in dense molecular regions and the distribution of molecular gas in the Galactic disk. By assuming a numerical model of Galactic mid-infrared emission and simple radiative transfer, we match the morphology of (sub-)millimeter thermal dust emission with mid-infrared absorption to compute a prior DPDF for distance discrimination. Selecting objects first from (sub-)millimeter source catalogs avoids a bias towards the darkest infrared dark clouds (IRDCs), extends the range of heliocentric distance probed by mid-infrared extinction, and includes lower-contrast sources. We derive well-constrained KDA resolutions for 618 molecular cloud clumps, with approximately 15% placed at or beyond the tangent distance. Objects with mid-infrared contrast sufficient to be cataloged as IRDCs are generally placed at the near kinematic distance. Distance comparisons with Galactic Ring Survey KDA resolutions yield a 92% agreement. A face-on view of the Milky Way using resolved distances reveals sections of the Sagittarius and Scutum-Centaurus Arms. This KDA-resolution method for large catalogs of sources through the combination of (sub-)millimeter and mid-infrared observations of molecular
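
    The Bayesian machinery is simple to sketch: a bimodal kinematic-distance likelihood (near/far solutions of the KDA) is multiplied by a prior that favours the near distance when the clump is seen in mid-infrared absorption. All distances, widths, and weights below are invented for illustration, not taken from the survey:

```python
import numpy as np

d = np.linspace(0.0, 16.0, 1601)  # heliocentric distance grid (kpc)

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# KDA: kinematics alone allow a near (3 kpc) or far (11 kpc) solution.
kinematic = gauss(d, 3.0, 0.5) + gauss(d, 11.0, 0.5)

# Toy mid-IR absorption prior: seen in absorption, so likely nearby.
irdc_prior = gauss(d, 3.0, 2.0)

dpdf = kinematic * irdc_prior
dpdf /= dpdf.sum()  # normalise over the grid

best = d[np.argmax(dpdf)]
print(round(best, 1))  # 3.0 -> the near kinematic distance wins
```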

  3. Survey of Verification and Validation Techniques for Small Satellite Software Development

    Science.gov (United States)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to note the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need further advances to deal with the state-explosion problem and to become usable enough for small-satellite software engineers to apply them regularly to software verification. Lastly, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or that are produced after launch through the effects of ionizing radiation.
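
    Run-time monitoring of the kind surveyed above can be reduced to a small pattern: check an invariant after every state update and invoke a recovery handler instead of letting the fault propagate. The invariant, state fields, and fallback below are illustrative assumptions, not any flight-software API:

```python
class RuntimeMonitor:
    """Checks an invariant on each state update; on violation, applies a
    recovery action (e.g. a safe-mode fallback) and counts the event."""

    def __init__(self, invariant, recover):
        self.invariant = invariant
        self.recover = recover
        self.violations = 0

    def check(self, state):
        if not self.invariant(state):
            self.violations += 1
            return self.recover(state)
        return state

monitor = RuntimeMonitor(
    invariant=lambda s: -100.0 <= s["wheel_rpm"] <= 100.0,
    recover=lambda s: {**s, "wheel_rpm": 0.0},  # fall back to a safe value
)

# An out-of-bounds command, e.g. from a radiation-induced upset, is caught:
state = monitor.check({"wheel_rpm": 250.0})
print(state["wheel_rpm"], monitor.violations)  # 0.0 1
```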

  4. Series 'Facing Radiation'. 2 Facing radiation is facing residents

    International Nuclear Information System (INIS)

    Hanzawa, Takahiro

    2013-01-01

    The series reports how ordinary people, who are not radiological experts, have faced and understood the problems and tasks of radiation caused by the Fukushima Daiichi Nuclear Power Plant Accident (March 2011). Section 2 is reported by an officer of Date City, which lies 60 km northwest of the Plant, borders Iitate Village in Fukushima prefecture, and is designated an important area of contamination search (IACS), for which the reporter has served as the responsible official. In July 2011, the ambient dose was as high as 3.0-3.5 μSv/h, and the tentative storage place for contaminated materials was decided on the initiative of the residents of a small community, from which real decontamination in the City started. The target dose after decontamination was defined as 1.0 μSv/h; however, 28 of the 32 IACS municipalities in the prefecture had not defined a target, even though they had worked for two years after the Accident on areas exceeding the 0.23 μSv/h standard. At the decontamination of his own house, the reporter noticed that residents' concerns were directed toward the work itself rather than the target dose, and wondered whether these figures had obstructed correctly facing the radiation. Now that about 2.5 years have passed since the Accident, all Date citizens carry personal accumulating glass dosimeters to track effective external dose, and it seems their dose will not exceed 1 mSv/y if the estimated ambient dose is 0.3-5 μSv/h. Media chase popularity rather than facing radiation, experts tend to hesitate to face media and residents, and the radiation dose will hardly be reduced to zero, despite the fact that a correct understanding of radiation is a shorter way to residents' own peace of mind: facing radiation is facing residents. (T.T.)

  5. Verification of 3-D generation code package for neutronic calculations of WWERs

    International Nuclear Information System (INIS)

    Sidorenko, V.D.; Aleshin, S.S.; Bolobov, P.A.; Bolshagin, S.N.; Lazarenko, A.P.; Markov, A.V.; Morozov, V.V.; Syslov, A.A.; Tsvetkov, V.M.

    2000-01-01

    Materials on verification of the 3-rd generation code package for WWER neutronic calculations are presented. The package includes: the spectral code TVS-M; the 2-D fine-mesh diffusion code PERMAK-A for 4- or 6-group calculation of WWER core burnup; and the 3-D coarse-mesh diffusion code BIPR-7A for 2-group calculations of quasi-stationary WWER regimes. The materials include both TVS-M verification data and verification data for the PERMAK-A and BIPR-7A codes using constant libraries generated with TVS-M. All materials relate to fuel without Gd. The TVS-M verification materials include results of comparison both with benchmark calculations obtained by other codes and with experiments carried out at the ZR-6 critical facility. The PERMAK-A verification materials contain results of comparison with TVS-M calculations and with ZR-6 experiments. The BIPR-7A materials include comparison with operation data for the Dukovany-2 and Loviisa-1 NPPs (WWER-440) and for Balakovo NPP Unit 4 (WWER-1000). The verification materials demonstrate rather good accuracy of the calculations obtained with the 3-rd generation code package. (Authors)

  6. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    Science.gov (United States)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
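
    The planning structure described above, one Verification Plan per requirement, whose methods become activities grouped into executable events, maps naturally onto a small data model. The field names follow the paper loosely and the example content is invented:

```python
from dataclasses import dataclass, field

@dataclass
class VerificationPlan:
    requirement: str
    verification_requirement: str
    success_criteria: str
    methods: list  # e.g. ["Test", "Analysis"]
    level: str     # e.g. "System", "Subsystem"
    owner: str

@dataclass
class VerificationEvent:
    """A collection of activities executed together, as in the paper."""
    name: str
    activities: list = field(default_factory=list)  # (requirement, method)

plan = VerificationPlan(
    requirement="REQ-0042 image quality",        # hypothetical requirement
    verification_requirement="Verify the delivered image quality budget",
    success_criteria="PSF within budget across the field",
    methods=["Test", "Analysis"],
    level="System",
    owner="Systems Engineering",
)

# Each Verification Method becomes a Verification Activity in an event,
# giving traceability from requirement to scheduled activity.
event = VerificationEvent("Commissioning run 1")
for method in plan.methods:
    event.activities.append((plan.requirement, method))

print(len(event.activities))  # 2
```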

  7. Automated Inspection of Defects in Optical Fiber Connector End Face Using Novel Morphology Approaches.

    Science.gov (United States)

    Mei, Shuang; Wang, Yudan; Wen, Guojun; Hu, Yang

    2018-05-03

    Increasing deployment of optical fiber networks and the need for reliable high bandwidth make the task of inspecting optical fiber connector end faces a crucial process that must not be neglected. Traditional end face inspections are usually performed by manual visual methods, which are low in efficiency and poor in precision for long-term industrial applications. More seriously, the inspection results cannot be quantified for subsequent analysis. Aiming at the characteristics of typical defects in the inspection process for optical fiber end faces, we propose a novel method, “difference of min-max ranking filtering” (DO2MR), for detection of region-based defects, e.g., dirt, oil, contamination, pits, and chips, and a special model, a “linear enhancement inspector” (LEI), for the detection of scratches. The DO2MR is a morphology method that determines whether a pixel belongs to a defective region by comparing the difference of gray values of pixels in the neighborhood around the pixel. The LEI is also a morphology method, designed to search for scratches at different orientations with a special linear detector. These two approaches can be easily integrated into optical inspection equipment for automatic quality verification. As far as we know, this is the first time that complete defect detection methods for optical fiber end faces are available in the literature. Experimental results demonstrate that the proposed DO2MR and LEI models yield good comprehensive performance with high precision and acceptable recall rates, and the image-level detection accuracies reach 96.0 and 89.3%, respectively.
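
    The core of the DO2MR idea, as described, is a per-pixel range filter: the difference between the maximum and minimum gray value in a small neighborhood is large at defect boundaries. A pure-NumPy sketch; the window size, threshold, and test image are invented, and this simplification omits the paper's ranking refinements:

```python
import numpy as np

def do2mr(img, win=3, thresh=4.0):
    """Flag pixels whose local max-min gray-value range exceeds a threshold."""
    r = win // 2
    padded = np.pad(img.astype(float), r, mode="edge")
    h, w = img.shape
    diff = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + win, j:j + win]
            diff[i, j] = patch.max() - patch.min()  # min-max range
    return diff > thresh                            # defect mask

img = np.full((7, 7), 10.0)   # uniform end-face background
img[3, 3] = 20.0              # a small "dirt" spot
mask = do2mr(img)
print(int(mask.sum()))  # 9: the spot and its 8 neighbours are flagged
```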

  8. Automated Inspection of Defects in Optical Fiber Connector End Face Using Novel Morphology Approaches

    Directory of Open Access Journals (Sweden)

    Shuang Mei

    2018-05-01

    Increasing deployment of optical fiber networks and the need for reliable high bandwidth make the task of inspecting optical fiber connector end faces a crucial process that must not be neglected. Traditional end face inspections are usually performed by manual visual methods, which are low in efficiency and poor in precision for long-term industrial applications. More seriously, the inspection results cannot be quantified for subsequent analysis. Aiming at the characteristics of typical defects in the inspection process for optical fiber end faces, we propose a novel method, “difference of min-max ranking filtering” (DO2MR), for detection of region-based defects, e.g., dirt, oil, contamination, pits, and chips, and a special model, a “linear enhancement inspector” (LEI), for the detection of scratches. The DO2MR is a morphology method that determines whether a pixel belongs to a defective region by comparing the difference of gray values of pixels in the neighborhood around the pixel. The LEI is also a morphology method, designed to search for scratches at different orientations with a special linear detector. These two approaches can be easily integrated into optical inspection equipment for automatic quality verification. As far as we know, this is the first time that complete defect detection methods for optical fiber end faces are available in the literature. Experimental results demonstrate that the proposed DO2MR and LEI models yield good comprehensive performance with high precision and acceptable recall rates, and the image-level detection accuracies reach 96.0 and 89.3%, respectively.

  9. Thermal loads on tokamak plasma-facing components during normal operation and disruptions

    International Nuclear Information System (INIS)

    McGrath, R.T.

    1990-01-01

    Power loadings experienced by tokamak plasma-facing components during normal operation and during off-normal events are discussed. A model for power and particle flow in the tokamak boundary layer is presented and model predictions are compared to infrared measurements of component heating. The inclusion of the full three-dimensional geometry of the components and of the magnetic flux surface is very important in the modeling. Experimental measurements show that misalignment of component armour tile surfaces by only a millimeter can lead to significant localized heating. An application to the design of plasma-facing components for future machines is presented. Finally, thermal loads expected during tokamak disruptions are discussed. The primary problems are surface melting and vaporization due to localized intense heating during the disruption thermal quench and volumetric heating of the component armour and structure due to localised impact of runaway electrons. (author)

  10. Validation and verification plan for safety and PRA codes

    International Nuclear Information System (INIS)

    Ades, M.J.; Crowe, R.D.; Toffer, H.

    1991-04-01

    This report discusses a verification and validation (V&V) plan for computer codes used for safety analysis and probabilistic risk assessment calculations. The present plan fulfills the commitments by Westinghouse Savannah River Company (WSRC) to the Department of Energy Savannah River Office (DOE-SRO) to bring the essential safety analysis and probabilistic risk assessment codes in compliance with verification and validation requirements

  11. Verification of a CT scanner using a miniature step gauge

    DEFF Research Database (Denmark)

    Cantatore, Angela; Andreasen, J.L.; Carmignato, S.

    2011-01-01

    The work deals with performance verification of a CT scanner using a 42 mm miniature replica step gauge developed for optical scanner verification. Quantification of errors and optimization of the CT system set-up in terms of resolution and measurement accuracy are fundamental for the use of CT scanning...

  12. Trends in business process analysis: from verification to process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Cardoso, J.; Cordeiro, J.; Filipe, J.

    2007-01-01

    Business process analysis ranges from model verification at design-time to the monitoring of processes at runtime. Much progress has been achieved in process verification. Today we are able to verify the entire reference model of SAP without any problems. Moreover, more and more processes leave

  13. DOE handbook: Integrated safety management systems (ISMS) verification. Team leader's handbook

    International Nuclear Information System (INIS)

    1999-06-01

    The primary purpose of this handbook is to provide guidance to the ISMS verification Team Leader and the verification team in conducting ISMS verifications. The handbook describes methods and approaches for the review of the ISMS documentation (Phase 1) and ISMS implementation (Phase 2) and provides information useful to the Team Leader in preparing the review plan, selecting and training the team, coordinating the conduct of the verification, and documenting the results. The process and techniques described are based on the results of several pilot ISMS verifications that have been conducted across the DOE complex. A secondary purpose of this handbook is to provide information useful in developing DOE personnel to conduct these reviews. Specifically, this handbook describes methods and approaches to: (1) Develop the scope of the Phase 1 and Phase 2 review processes to be consistent with the history, hazards, and complexity of the site, facility, or activity; (2) Develop procedures for the conduct of the Phase 1 review, validating that the ISMS documentation satisfies the DEAR clause as amplified in DOE Policies 450.4, 450.5, 450.6 and associated guidance and that DOE can effectively execute responsibilities as described in the Functions, Responsibilities, and Authorities Manual (FRAM); (3) Develop procedures for the conduct of the Phase 2 review, validating that the description approved by the Approval Authority, following or concurrent with the Phase 1 review, has been implemented; and (4) Describe a methodology by which the DOE ISMS verification teams will be advised, trained, and/or mentored to conduct subsequent ISMS verifications. The handbook provides proven methods and approaches for verifying that commitments related to the DEAR, the FRAM, and associated amplifying guidance are in place and implemented in nuclear and high risk facilities. This handbook also contains useful guidance to line managers when preparing for a review of ISMS for radiological

  14. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1998-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, verification of their knowledge bases becomes increasingly important. The conventional Petri net approach, recently studied for knowledge base verification, is found to be inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)
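
    The reachability analysis mentioned above can be sketched for an ordinary (uncolored) place/transition net; the tiny knowledge base below is invented for illustration and is not from the paper. Facts are places, rules are transitions, and a rule whose precondition holds in no reachable marking is a dead rule, an anomaly to report:

```python
from collections import deque

def reachable_markings(transitions, initial):
    """BFS over markings; each transition is a (consume, produce) pair of fact sets."""
    seen = {frozenset(initial)}
    queue = deque(seen)
    while queue:
        m = queue.popleft()
        for pre, post in transitions:
            if pre <= m:                        # transition (rule) is enabled
                m2 = frozenset((m - pre) | post)
                if m2 not in seen:
                    seen.add(m2)
                    queue.append(m2)
    return seen

# Toy knowledge base: rule 0 fires from facts {a, b}; rule 1 needs fact d,
# which is never derivable, so rule 1 is dead.
transitions = [({'a', 'b'}, {'c'}), ({'d'}, {'e'})]
marks = reachable_markings(transitions, {'a', 'b'})
dead_rules = [i for i, (pre, _) in enumerate(transitions)
              if not any(pre <= m for m in marks)]
```

    A colored Petri net additionally attaches typed data to tokens, which keeps the model compact for large rule bases; the uncolored sketch above only illustrates the reachability check itself.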

  15. Knowledge base verification based on enhanced colored petri net

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-12-31

    Verification is a process aimed at demonstrating whether a system meets its specified requirements. As expert systems are used in various applications, verification of their knowledge bases becomes increasingly important. The conventional Petri net approach, recently studied for knowledge base verification, is found to be inadequate for the knowledge bases of large and complex systems, such as the alarm processing system of a nuclear power plant. Thus, we propose an improved method that models the knowledge base as an enhanced colored Petri net. In this study, we analyze the reachability and the error characteristics of the knowledge base and apply the method to the verification of a simple knowledge base. 8 refs., 4 figs. (Author)

  16. Neighbors Based Discriminative Feature Difference Learning for Kinship Verification

    DEFF Research Database (Denmark)

    Duan, Xiaodong; Tan, Zheng-Hua

    2015-01-01

    In this paper, we present a discriminative feature difference learning method for facial-image-based kinship verification. To make the feature difference of an image pair discriminative for kinship verification, a linear transformation matrix for the feature difference between an image pair...... than the commonly used feature concatenation, leading to low complexity. Furthermore, there is no positive semi-definite constraint on the transformation matrix, unlike in metric learning methods, leading to an easy solution for the transformation matrix. Experimental results on two public...... databases show that the proposed method combined with an SVM classification method outperforms or is comparable to state-of-the-art kinship verification methods. © Springer International Publishing AG, Part of Springer Science+Business Media...
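
    The core idea, working on the difference of two d-dimensional face features rather than their 2d-dimensional concatenation, can be sketched as follows; the feature vectors and the identity matrix standing in for the learned transform are invented for illustration:

```python
import numpy as np

def pair_score(x1, x2, A):
    """Norm of the transformed feature difference; smaller => more kin-like."""
    return float(np.linalg.norm(A @ (x1 - x2)))

d = 4
A = np.eye(d)                 # stand-in for the learned transformation matrix

parent   = np.array([0.9, 0.1, 0.4, 0.6])
child    = np.array([0.85, 0.15, 0.35, 0.65])   # kin pair: nearby features
stranger = np.array([0.1, 0.8, 0.9, 0.2])

kin_score = pair_score(parent, child, A)        # small distance
non_kin_score = pair_score(parent, stranger, A) # large distance
```

    In the paper a classifier (an SVM) is trained on such transformed differences; thresholding the distance as above is only the simplest stand-in, and the difference vector stays d-dimensional regardless of the classifier.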

  17. Bibliography for Verification and Validation in Computational Simulation

    International Nuclear Information System (INIS)

    Oberkampf, W.L.

    1998-01-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering

  18. Bibliography for Verification and Validation in Computational Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.

    1998-10-01

    A bibliography has been compiled dealing with the verification and validation of computational simulations. The references listed in this bibliography are concentrated in the field of computational fluid dynamics (CFD). However, references from the following fields are also included: operations research, heat transfer, solid dynamics, software quality assurance, software accreditation, military systems, and nuclear reactor safety. This bibliography, containing 221 references, is not meant to be comprehensive. It was compiled during the last ten years in response to the author's interest and research in the methodology for verification and validation. The emphasis in the bibliography is in the following areas: philosophy of science underpinnings, development of terminology and methodology, high accuracy solutions for CFD verification, experimental datasets for CFD validation, and the statistical quantification of model validation. This bibliography should provide a starting point for individual researchers in many fields of computational simulation in science and engineering.

  19. Is identity per se irrelevant? A contrarian view of self-verification effects

    OpenAIRE

    Gregg, Aiden P.

    2008-01-01

    Self-verification theory (SVT) posits that people who hold negative self-views, such as depressive patients, ironically strive to verify that these self-views are correct, by actively seeking out critical feedback or interaction partners who evaluate them unfavorably. Such verification strivings are allegedly directed towards maximizing subjective perceptions of prediction and control. Nonetheless, verification strivings are also alleged to stabilize maladaptive self-perceptions, and thereby ...

  20. Class 1E software verification and validation: Past, present, and future

    Energy Technology Data Exchange (ETDEWEB)

    Persons, W.L.; Lawrence, J.D.

    1993-10-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  1. A COMPREHENSIVE SEARCH FOR STELLAR BOWSHOCK NEBULAE IN THE MILKY WAY: A CATALOG OF 709 MID-INFRARED SELECTED CANDIDATES

    Energy Technology Data Exchange (ETDEWEB)

    Kobulnicky, Henry A.; Chick, William T.; Schurhammer, Danielle P.; Andrews, Julian E.; Munari, Stephan A.; Olivier, Grace M.; Sorber, Rebecca L.; Wernke, Heather N.; Dale, Daniel A. [Dept. of Physics and Astronomy, University of Wyoming, Laramie, WY 82070 (United States); Povich, Matthew S.; Dixon, Don M. [Department of Physics and Astronomy, California State Polytechnic University, 3801 West Temple Avenue, Pomona, CA 91768 (United States)

    2016-12-01

    We identify 709 arc-shaped mid-infrared nebulae in 24 μm Spitzer Space Telescope or 22 μm Wide-field Infrared Survey Explorer surveys of the Galactic Plane as probable dusty interstellar bowshocks powered by early-type stars. About 20% are visible at 8 μm or at shorter mid-infrared wavelengths. The vast majority (660) have no previous identification in the literature. These extended infrared sources are strongly concentrated near the Galactic mid-plane, with an angular scale height of ∼0.°6. All host a symmetrically placed star implicated as the source of a stellar wind sweeping up interstellar material. These are candidate “runaway” stars potentially having high velocities in the reference frame of the local medium. Among the 286 objects with measured proper motions, we find an unambiguous excess with velocity vectors aligned with the infrared morphology—kinematic evidence that many of these are “runaway” stars with large peculiar motions responsible for the bowshock signature. We discuss a population of “in situ” bowshocks (∼103 objects) that face giant H ii regions where the relative motions between the star and ISM may be caused by bulk outflows from an overpressured bubble. We also identify ∼58 objects that face 8 μm bright-rimmed clouds and apparently constitute a sub-class of in situ bowshocks where the stellar wind interacts with a photoevaporative flow (PEF) from an eroding molecular cloud interface (i.e., “PEF bowshocks”). Orientations of the arcuate nebulae exhibit a correlation over small angular scales, indicating that external influences such as H ii regions are responsible for producing some bowshock nebulae. However, the vast majority of the nebulae in this sample appear to be isolated (499 objects) from obvious external influences.

  2. Core power capability verification for PWR NPP

    International Nuclear Information System (INIS)

    Xian Chunyu; Liu Changwen; Zhang Hong; Liang Wei

    2002-01-01

    The principle and methodology of core power capability verification for reloads of pressurized water reactor nuclear power plants are introduced. The radial and axial power distributions of normal operation (category I or condition I) and abnormal operation (category II or condition II) are simulated using a neutronics calculation code. The linear power density margin and DNBR margin for both categories, which reflect core safety, are analyzed from the points of view of reactor physics and thermal-hydraulics, and thus the category I operating domain and category II protection set points are verified. The verification results for a reference NPP are also given
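
    The margin checks at the heart of such a verification reduce to simple normalized comparisons against safety limits; the limit and computed values below are illustrative placeholders, not data for any actual plant:

```python
def margins(dnbr_min, dnbr_limit, lpd_peak, lpd_limit):
    """Positive margins mean the core state satisfies both safety criteria."""
    dnbr_margin = (dnbr_min - dnbr_limit) / dnbr_limit   # DNBR must stay above its limit
    lpd_margin = (lpd_limit - lpd_peak) / lpd_limit      # peak LPD must stay below its limit
    return dnbr_margin, lpd_margin

# Illustrative numbers only (linear power density in kW/m).
dnbr_margin, lpd_margin = margins(dnbr_min=1.85, dnbr_limit=1.30,
                                  lpd_peak=38.0, lpd_limit=59.0)
acceptable = dnbr_margin > 0 and lpd_margin > 0
```

    In practice both margins are evaluated over the whole simulated operating domain (category I) and for the limiting category II transients, and the protection set points are chosen so the margins never go negative.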

  3. Data Exchanges and Verifications Online (DEVO)

    Data.gov (United States)

    Social Security Administration — DEVO is the back-end application for processing SSN verifications and data exchanges. DEVO uses modern technology for parameter driven processing of both batch and...

  4. Face time: educating face transplant candidates.

    Science.gov (United States)

    Lamparello, Brooke M; Bueno, Ericka M; Diaz-Siso, Jesus Rodrigo; Sisk, Geoffroy C; Pomahac, Bohdan

    2013-01-01

    Face transplantation is the innovative application of microsurgery and immunology to restore appearance and function to those with severe facial disfigurements. Our group aims to establish a multidisciplinary education program that can facilitate informed consent and build a strong knowledge base in patients to enhance adherence to medication regimes, recovery, and quality of life. We analyzed handbooks from our institution's solid organ transplant programs to identify topics applicable to face transplant patients. The team identified unique features of face transplantation that warrant comprehensive patient education. We created a 181-page handbook to provide subjects interested in pursuing transplantation with a written source of information on the process and team members and to address concerns they may have. While the handbook covers a wide range of topics, it is easy to understand and visually appealing. Face transplantation has many unique aspects that must be relayed to the patients pursuing this novel therapy. Since candidates lack third-party support groups and programs, the transplant team must provide an extensive educational component to enhance this complex process. As face transplantation continues to develop, programs must create sound education programs that address patients' needs and concerns to facilitate optimal care.

  5. A transformation of SDL specifications : a step towards the verification

    NARCIS (Netherlands)

    Ioustinova, N.; Sidorova, N.; Bjorner, D.; Broy, M.; Zamulin, A.

    2001-01-01

    Industrial-size specifications/models (whose state space is often infinite) cannot be model checked directly; instead, a verification model of the system is model checked. Program transformation is a way to build a finite-state verification model that can be submitted to a model checker.

  6. Design, Development, and Automated Verification of an Integrity-Protected Hypervisor

    Science.gov (United States)

    2012-07-16

    also require considerable manual effort. For example, the verification of the seL4 operating system [45] required several man-years of effort. In...Winwood. seL4: formal verification of an OS kernel. In Proc. of SOSP, 2009. [46] K. Kortchinsky. Cloudburst: A VMware guest to host escape story

  7. A Proof-checked Verification of a Real-Time Communication Protocol

    NARCIS (Netherlands)

    Polak, I.

    We present an analysis of a protocol developed by Philips to connect several components of an audio system. The verification of the protocol is carried out using the timed I/O-automata model of Lynch and Vaandrager. The verification has been partially proof-checked with the interactive proof

  8. Modular Verification of Linked Lists with Views via Separation Logic

    DEFF Research Database (Denmark)

    Jensen, Jonas Braband; Birkedal, Lars; Sestoft, Peter

    2011-01-01

    We present a separation logic specification and verification of linked lists with views, a data structure from the C5 collection library for .NET. A view is a generalization of the well-known concept of an iterator. Linked lists with views form an interesting case study for verification since...

  9. Verification and validation in computational fluid dynamics

    Science.gov (United States)

    Oberkampf, William L.; Trucano, Timothy G.

    2002-04-01

    Verification and validation (V&V) are the primary means to assess accuracy and reliability in computational simulations. This paper presents an extensive review of the literature in V&V in computational fluid dynamics (CFD), discusses methods and procedures for assessing V&V, and develops a number of extensions to existing ideas. The review of the development of V&V terminology and methodology points out the contributions from members of the operations research, statistics, and CFD communities. Fundamental issues in V&V are addressed, such as code verification versus solution verification, model validation versus solution validation, the distinction between error and uncertainty, conceptual sources of error and uncertainty, and the relationship between validation and prediction. The fundamental strategy of verification is the identification and quantification of errors in the computational model and its solution. In verification activities, the accuracy of a computational solution is primarily measured relative to two types of highly accurate solutions: analytical solutions and highly accurate numerical solutions. Methods for determining the accuracy of numerical solutions are presented and the importance of software testing during verification activities is emphasized. The fundamental strategy of validation is to assess how accurately the computational results compare with the experimental data, with quantified error and uncertainty estimates for both. This strategy employs a hierarchical methodology that segregates and simplifies the physical and coupling phenomena involved in the complex engineering system of interest. A hypersonic cruise missile is used as an example of how this hierarchical structure is formulated. The discussion of validation assessment also encompasses a number of other important topics. 
A set of guidelines is proposed for designing and conducting validation experiments, supported by an explanation of how validation experiments are different
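
    A representative code-verification calculation from this literature is the observed order of accuracy obtained by Richardson extrapolation from three systematically refined grids; the solution values below are manufactured so that the exact order is 2:

```python
import math

def observed_order(f_fine, f_medium, f_coarse, r):
    """Observed order of accuracy from three grid levels with refinement ratio r."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# Manufactured second-order behaviour: numerical error = C * h^2, with the
# fine, medium, and coarse grids at spacings h, 2h, 4h (r = 2).
r = 2.0
f_fine, f_medium, f_coarse = 1.0 + 0.01, 1.0 + 0.04, 1.0 + 0.16
p = observed_order(f_fine, f_medium, f_coarse, r)
```

    Agreement of the observed order p with the formal order of the discretization scheme is one of the checks performed against analytical or highly accurate numerical solutions during verification.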

  10. Verification of road databases using multiple road models

    Science.gov (United States)

    Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian

    2017-08-01

    In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions, the first one for the state of a database object (correct or incorrect), and a second one for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer Theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with higher completeness. Additional experiments reveal that, based on the proposed method, a highly reliable semi-automatic approach for road database verification can be designed.
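
    The Dempster-Shafer combination step described above can be sketched on the two-element frame {correct, incorrect}, with mass assigned to the full frame encoding the "unknown" state (e.g. a module whose road model is not applicable); the module outputs below are illustrative numbers, not values from the paper:

```python
C, I, U = 'correct', 'incorrect', 'unknown'    # U stands for the full frame {C, I}

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions on the frame {C, I}."""
    conflict = m1[C] * m2[I] + m1[I] * m2[C]
    k = 1.0 - conflict                          # normalization constant
    combined = {
        C: (m1[C] * m2[C] + m1[C] * m2[U] + m1[U] * m2[C]) / k,
        I: (m1[I] * m2[I] + m1[I] * m2[U] + m1[U] * m2[I]) / k,
    }
    combined[U] = 1.0 - combined[C] - combined[I]
    return combined

module_a = {C: 0.6, I: 0.1, U: 0.3}   # confident road model, says "correct"
module_b = {C: 0.5, I: 0.2, U: 0.3}   # weaker module, same tendency
m = combine(module_a, module_b)        # agreement strengthens belief in "correct"
```

    When more than two modules report on a road object, the rule is applied repeatedly; it is associative, so the order of combination does not matter.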

  11. Server-Aided Verification Signature with Privacy for Mobile Computing

    Directory of Open Access Journals (Sweden)

    Lingling Xu

    2015-01-01

    Full Text Available With the development of wireless technology, much data communication and processing is conducted on mobile devices with wireless connections. Mobile devices will always be resource-poor relative to static ones, even as they improve in absolute capability, and therefore cannot perform some expensive computational tasks with their constrained computational resources. To address this problem, server-aided computing has been studied, in which power-constrained mobile devices outsource expensive computations to a server with powerful resources in order to reduce their computational load. However, in existing server-aided verification signature schemes, the server can learn some information about the message-signature pair to be verified, which is undesirable, especially when the message includes secret information. In this paper, we study server-aided verification signatures with privacy, in which the message-signature pair to be verified is protected from the server. Two definitions of privacy for server-aided verification signatures are presented under collusion attacks between the server and the signer. Then, based on existing signatures, two concrete server-aided verification signature schemes with privacy are proposed, and both are proved secure.

  12. Automated Verification of Virtualized Infrastructures

    DEFF Research Database (Denmark)

    Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander

    2011-01-01

    Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present a pla...

  13. Near Infrared Characterization of Hetero-Core Optical Fiber SPR Sensors Coated with Ta2O5 Film and Their Applications

    Directory of Open Access Journals (Sweden)

    Kazuhiro Watanabe

    2012-02-01

    Full Text Available This paper describes the characteristics of optical fiber sensors with surface plasmon resonance (SPR) at 1,310 nm, where the scattering loss of silica optical fiber is low. SPR operation in the infrared wavelength range is achieved by coating a thin tantalum pentoxide (Ta2O5) film. The novelty of this paper lies in verifying that the hetero-core scheme can operate as a candidate for commercial deployment, offering easy fabrication, sufficient mechanical strength, and significant sensitivity as a liquid detector over a low-loss transmission network in the near-infrared wavelength region. The effect of Ta2O5 layer thickness has been experimentally revealed in the wavelength region extending to 1,800 nm using the hetero-core structured optical fiber. SPR characterizations have been made in the wavelength region 1,000–1,300 nm, showing feasible operation at near-infrared wavelengths and possible practical applications. In addition, the technique developed in this work has been applied to multi-point water detection and a water-level gauge, in which a tandem-connected SPR sensor system using hetero-core structured fibers was incorporated. The detailed performance characteristics of these applications are also shown.

  14. Photographic infrared spectroscopy and near infrared photometry of Be stars

    International Nuclear Information System (INIS)

    Swings, J.P.

    1976-01-01

    Two topics are tackled in this presentation: spectroscopy and photometry. The following definitions are chosen: photographic infrared spectroscopy (wavelengths Hα ≤ λ < 1.2 μm); near infrared photometry (wavebands 1.6 μm ≤ λ ≤ 20 μm). Near infrared spectroscopy and photometry of classical and peculiar Be stars are discussed and some future developments in the field are outlined. (Auth.)

  15. Image Processing Based Signature Verification Technique to Reduce Fraud in Financial Institutions

    Directory of Open Access Journals (Sweden)

    Hussein Walid

    2016-01-01

    Full Text Available Handwritten signatures are broadly used for personal verification in financial institutions, which creates the need for a robust automatic signature verification tool. This tool aims to reduce fraud in all related financial transaction sectors. This paper proposes an online, robust, and automatic signature verification technique using recent advances in image processing and machine learning. Once the image of a customer's handwritten signature is captured, several pre-processing steps are performed on it, including filtration and detection of the signature edges. Afterwards, a feature extraction process is applied to the image to extract Speeded-Up Robust Features (SURF) and Scale-Invariant Feature Transform (SIFT) features. Finally, a verification process is developed and applied to compare the extracted image features with those stored in the database for the specified customer. Results indicate the high accuracy, simplicity, and rapidity of the developed technique, which are the main criteria by which to judge a signature verification tool in banking and other financial institutions.
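
    The final comparison step can be sketched as nearest-neighbour descriptor matching with Lowe's ratio test; the two-dimensional "descriptors" below are toy vectors, not real SURF/SIFT output, and the acceptance threshold is an arbitrary illustrative choice:

```python
import numpy as np

def ratio_matches(query, reference, ratio=0.75):
    """Count query descriptors whose best match clearly beats the second best."""
    good = 0
    for q in query:
        dists = np.sort(np.linalg.norm(reference - q, axis=1))
        if dists[0] < ratio * dists[1]:
            good += 1
    return good

reference = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])   # enrolled descriptors
genuine = reference + 0.05          # query close to the enrolled descriptors
forged = np.array([[0.5, 0.5], [0.6, 0.4], [0.4, 0.6]])      # ambiguous matches

MIN_GOOD = 2                        # arbitrary acceptance threshold
accept_genuine = ratio_matches(genuine, reference) >= MIN_GOOD
accept_forged = ratio_matches(forged, reference) >= MIN_GOOD
```

    The ratio test rejects descriptors whose best and second-best matches are about equally distant, which is what makes the forged query above fail even though each of its points has some nearest neighbour.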

  16. From face processing to face recognition: Comparing three different processing levels.

    Science.gov (United States)

    Besson, G; Barragan-Jason, G; Thorpe, S J; Fabre-Thorpe, M; Puma, S; Ceccaldi, M; Barbeau, E J

    2017-01-01

    Verifying that a face is from a target person (e.g. finding someone in the crowd) is a critical ability of the human face processing system. Yet how fast this can be performed is unknown. The 'entry-level shift due to expertise' hypothesis suggests that - since humans are face experts - processing faces should be as fast - or even faster - at the individual than at superordinate levels. In contrast, the 'superordinate advantage' hypothesis suggests that faces are processed from coarse to fine, so that the opposite pattern should be observed. To clarify this debate, three different face processing levels were compared: (1) a superordinate face categorization level (i.e. detecting human faces among animal faces), (2) a face familiarity level (i.e. recognizing famous faces among unfamiliar ones) and (3) verifying that a face is from a target person, our condition of interest. The minimal speed at which faces can be categorized (∼260ms) or recognized as familiar (∼360ms) has largely been documented in previous studies, and thus provides boundaries to compare our condition of interest to. Twenty-seven participants were included. The recent Speed and Accuracy Boosting procedure paradigm (SAB) was used since it constrains participants to use their fastest strategy. Stimuli were presented either upright or inverted. Results revealed that verifying that a face is from a target person (minimal RT at ∼260ms) was remarkably fast but longer than the face categorization level (∼240ms) and was more sensitive to face inversion. In contrast, it was much faster than recognizing a face as familiar (∼380ms), a level severely affected by face inversion. Face recognition corresponding to finding a specific person in a crowd thus appears achievable in only a quarter of a second. In favor of the 'superordinate advantage' hypothesis or coarse-to-fine account of the face visual hierarchy, these results suggest a graded engagement of the face processing system across processing

  17. Standard Verification System Lite (SVS Lite)

    Data.gov (United States)

    Social Security Administration — SVS Lite is a mainframe program used exclusively by the Office of Child Support Enforcement (OCSE) to perform batch SSN verifications. This process is exactly the...

  18. Towards a CPN-Based Modelling Approach for Reconciling Verification and Implementation of Protocol Models

    DEFF Research Database (Denmark)

    Simonsen, Kent Inge; Kristensen, Lars Michael

    2013-01-01

    Formal modelling of protocols is often aimed at one specific purpose such as verification or automatically generating an implementation. This leads to models that are useful for one purpose, but not for others. Being able to derive models for verification and implementation from a single model...... is beneficial both in terms of reduced total modelling effort and confidence that the verification results are valid also for the implementation model. In this paper we introduce the concept of a descriptive specification model and an approach based on refining a descriptive model to target both verification...... how this model can be refined to target both verification and implementation....

  19. Preliminary Validation and Verification Plan for CAREM Reactor Protection System

    International Nuclear Information System (INIS)

    Fittipaldi, Ana; Maciel Felix

    2000-01-01

    The purpose of this paper is to present a preliminary validation and verification plan for a particular architecture proposed for the CAREM reactor protection system with software modules (computer-based system). These software modules can be either in-house designs or systems based on commercial modules such as last-generation redundant programmable logic controllers (PLCs). During this study, it was found that this plan can also serve as a validation and verification plan for commercial off-the-shelf (COTS) products and/or smart transmitters. The proposed software life cycle and its features are presented, together with the advantages of the preliminary validation and verification plan

  20. FMEF Electrical single line diagram and panel schedule verification process

    International Nuclear Information System (INIS)

    Fong, S.K.

    1998-01-01

    Since the FMEF did not have a mission, a formal drawing verification program was not developed; however, a verification process for essential electrical single-line drawings and panel schedules was established to support the operations lock and tag program and to enhance the electrical safety culture of the facility. The purpose of this document is to provide a basis by which future landlords and cognizant personnel can understand the degree of verification performed on the electrical single lines and panel schedules. It is the intent that this document be revised or replaced by a more formal requirements document if a mission is identified for the FMEF

  1. Verification of Java Programs using Symbolic Execution and Invariant Generation

    Science.gov (United States)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.
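
    A drastically simplified illustration of iterative invariant discovery: for the loop `i = 0; while i < 10: i += 2`, an interval for `i` is grown until it is inductive, i.e. adding the effect of one more guarded iteration yields nothing new. This toy interval analysis is only a stand-in for the paper's invariant strengthening over symbolic constraints:

```python
def step(inv):
    """Join inv with the abstract post of one iteration of `while i < 10: i += 2`."""
    lo, hi = inv
    if lo >= 10:                                # guard can never hold: nothing to add
        return inv
    post_lo, post_hi = lo + 2, min(hi, 9) + 2   # states with i < 10, after i += 2
    return (min(lo, post_lo), max(hi, post_hi))

inv = (0, 0)                                    # from the initialization i = 0
while step(inv) != inv:                         # iterate to a fixed point
    inv = step(inv)
# inv is now an inductive over-approximation of the reachable values of i
```

    At loop exit the verifier can conjoin the invariant with the negated guard, here i ≥ 10, to prove post-conditions; the paper's technique works analogously but over richer, symbolically executed constraints.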

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: DUST SUPPRESSANT PRODUCTS: SYNTECH PRODUCTS CORPORATION'S PETROTAC

    Science.gov (United States)

    Dust suppressant products used to control particulate emissions from unpaved roads are among the technologies evaluated by the Air Pollution Control Technology (APCT) Verification Center, part of the U.S. Environmental Protection Agency's Environmental Technology Verification (ET...

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: DUST SUPPRESSANT PRODUCTS: SYNTECH PRODUCTS CORPORATION'S TECHSUPPRESS

    Science.gov (United States)

    Dust suppressant products used to control particulate emissions from unpaved roads are among the technologies evaluated by the Air Pollution Control Technology (APCT) Verification Center, part of the U.S. Environmental Protection Agency's Environmental Technology Verification (ET...

  4. Parallel verification of dynamic systems with rich configurations

    OpenAIRE

    Pessoa, Eduardo José Dias

    2016-01-01

    Master's dissertation in Informatics Engineering (specialization in Informatics). Model checking is a technique used to automatically verify a model which represents the specification of some system. To ensure the correctness of the system, the verification of both static and dynamic properties is often needed. The specification of a system is made through modeling languages, while the respective verification is made by its model checker. Most modeling frameworks are not...

  5. Tree dimension in verification of constrained Horn clauses

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick; Ganty, Pierre

    2018-01-01

    In this paper, we show how the notion of tree dimension can be used in the verification of constrained Horn clauses (CHCs). The dimension of a tree is a numerical measure of its branching complexity and the concept here applies to Horn clause derivation trees. Derivation trees of dimension zero c...... algorithms using these constructions to decompose a CHC verification problem. One variation of this decomposition considers derivations of successively increasing dimension. The paper includes descriptions of implementations and experimental results....
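
    The dimension of a tree in this sense coincides with the Horton-Strahler number: a leaf has dimension 0, and an internal node takes the maximum dimension of its children, plus one when that maximum is attained by at least two children. A dimension-zero derivation tree is therefore a single path. A small sketch (the nested-tuple tree encoding is my own, not from the paper):

```python
def dimension(tree):
    """Tree dimension (Horton-Strahler number); tree = (label, [children])."""
    _, children = tree
    if not children:
        return 0
    dims = sorted(dimension(child) for child in children)
    if len(dims) >= 2 and dims[-1] == dims[-2]:
        return dims[-1] + 1            # the maximum occurs at least twice
    return dims[-1]

path = ('a', [('b', [('c', [])])])             # linear derivation: dimension 0
branching = ('r', [path, ('d', [('e', [])])])  # two dimension-0 subtrees: dimension 1
```

    Bounding the dimension of the derivation trees considered thus bounds their branching complexity, which is what allows a CHC verification problem to be decomposed by successively increasing dimension.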

  6. Bayesian Face Recognition and Perceptual Narrowing in Face-Space

    Science.gov (United States)

    Balas, Benjamin

    2012-01-01

    During the first year of life, infants' face recognition abilities are subject to "perceptual narrowing", the end result of which is that observers lose the ability to distinguish previously discriminable faces (e.g. other-race faces) from one another. Perceptual narrowing has been reported for faces of different species and different races, in…

  7. Experimental verification of layout physical verification of silicon photonics

    Science.gov (United States)

    El Shamy, Raghi S.; Swillam, Mohamed A.

    2018-02-01

    Silicon photonics has been established as one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology which supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process assures reliable fabrication of the PICs, as it checks both the manufacturability and the reliability of the circuit. However, PV is challenging in the case of PICs, as it requires running exhaustive electromagnetic (EM) simulations. Our group recently proposed empirical closed-form models for the directional coupler and waveguide bends based on SOI technology. The models show very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models avoid time-consuming 3D EM simulations and can easily be included in any electronic design automation (EDA) flow, as the equation parameters can be extracted directly from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions were fabricated using electron beam lithography and measured. The measurement results from the fabricated devices agree very well with the derived models, and the matching can reach 100% by calibrating certain parameters in the model.

  8. Verification of Triple Modular Redundancy (TMR) Insertion for Reliable and Trusted Systems

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems. If a system is expected to be protected using TMR, improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. This manuscript addresses the challenge of confirming that TMR has been inserted without corruption of functionality and with correct application of the expected TMR topology. The proposed verification method combines the usage of existing formal analysis tools with a novel search-detect-and-verify tool. Keywords: field programmable gate array (FPGA), triple modular redundancy (TMR), verification, trust, reliability.
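
    TMR protects a circuit by triplicating the logic and voting bit-wise on the three replica outputs, so a single faulty replica is masked. As a rough illustration of the redundancy whose insertion is being verified (not of the paper's search-detect-and-verify tool itself), a 2-of-3 majority voter can be sketched as:

```python
def tmr_vote(a: int, b: int, c: int) -> int:
    """Bit-wise 2-of-3 majority voter: each output bit equals the value
    held by at least two of the three replicas, masking one faulty copy."""
    return (a & b) | (b & c) | (a & c)

golden = 0b1011
faulty = 0b1111  # one replica upset by a single-event bit flip
print(bin(tmr_vote(golden, golden, faulty)))  # → 0b1011
```

    Verifying TMR insertion then amounts to confirming that every protected signal is indeed driven by three independent replicas feeding such a voter, with no shared logic that could defeat the redundancy.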

  9. Machinability studies of infrared window materials and metals

    International Nuclear Information System (INIS)

    Arnold, J.B.; Morris, T.O.; Sladky, R.E.; Steger, P.J.

    1976-01-01

    Diamond machining of materials for optical applications is becoming an important fabrication process. Development work in material-removal technology to better understand the mechanics of the diamond-turning process, its limitations, and its applications is described. The technique is presently limited to a select group of metals, most of which have a face-centered-cubic crystal structure. Machinability studies designed to better understand diamond compatibility, and thus expand the range of applicable materials, were performed. Nonconventional methods such as ultrasonic tool stimulation were investigated. Work done to determine the machinability of infrared window materials indicates that this is a viable fabrication technique for many materials, although additional effort is needed to optimize the process for particular materials.

  10. Attention to internal face features in unfamiliar face matching.

    Science.gov (United States)

    Fletcher, Kingsley I; Butavicius, Marcus A; Lee, Michael D

    2008-08-01

    Accurate matching of unfamiliar faces is vital in security and forensic applications, yet previous research has suggested that humans often perform poorly when matching unfamiliar faces. Hairstyle and facial hair can strongly influence unfamiliar face matching but are potentially unreliable cues. This study investigated whether increased attention to the more stable internal face features of eyes, nose, and mouth was associated with more accurate face-matching performance. Forty-three first-year psychology students decided whether two simultaneously presented faces were of the same person or not. The faces were displayed for either 2 or 6 seconds, and had either similar or dissimilar hairstyles. The level of attention to internal features was measured by the proportion of fixation time spent on the internal face features and the sensitivity of discrimination to changes in external feature similarity. Increased attention to internal features was associated with increased discrimination in the 2-second display-time condition, but no significant relationship was found in the 6-second condition. Individual differences in eye-movements were highly stable across the experimental conditions.

  11. Face recognition : implementation of face recognition on AMIGO

    NARCIS (Netherlands)

    Geelen, M.J.A.J.; Molengraft, van de M.J.G.; Elfring, J.

    2011-01-01

    In this (traineeship) report, two possible methods of face recognition are presented. The first method describes how to detect and recognize faces using the SURF algorithm. This algorithm was ultimately not used for recognizing faces, because the Eigenface algorithm was an already tested

  12. Verification and Diagnostics Framework in ATLAS Trigger/DAQ

    CERN Document Server

    Barczyk, M.; Caprini, M.; Da Silva Conceicao, J.; Dobson, M.; Flammer, J.; Jones, R.; Kazarov, A.; Kolos, S.; Liko, D.; Lucio, L.; Mapelli, L.; Soloviev, I.; Hart, R.; Amorim, A.; Klose, D.; Lima, J.; Pedro, J.; Wolters, H.; Badescu, E.; Alexandrov, I.; Kotov, V.; Mineev, M.; Ryabov, Yu.; Ryabov, Yu.

    2003-01-01

    Trigger and data acquisition (TDAQ) systems for modern HEP experiments are composed of thousands of hardware and software components depending on each other in a very complex manner. Typically, such systems are operated by non-expert shift operators who are not aware of the details of system functionality. It is therefore necessary to help the operator control the system and to minimize system down-time by providing knowledge-based facilities for automatic testing and verification of system components, and also for error diagnostics and recovery. For this purpose, a verification and diagnostic framework was developed within the scope of ATLAS TDAQ. The verification functionality of the framework allows developers to configure simple low-level tests for any component in a TDAQ configuration. A test can be configured as one or more processes running on different hosts. The framework organizes tests in sequences, using knowledge about component hierarchy and dependencies, and allows the operator to verify the fun...

  13. Functional verification of dynamically reconfigurable FPGA-based systems

    CERN Document Server

    Gong, Lingkan

    2015-01-01

    This book analyzes the challenges in verifying Dynamically Reconfigurable Systems (DRS) with respect to the user design and the physical implementation of such systems. The authors describe the use of a simulation-only layer to emulate the behavior of target FPGAs and accurately model the characteristic features of reconfiguration. Readers are enabled with this simulation-only layer to maintain verification productivity by abstracting away the physical details of the FPGA fabric.  Two implementations of the simulation-only layer are included: Extended ReChannel is a SystemC library that can be used to check DRS designs at a high level; ReSim is a library to support RTL simulation of a DRS reconfiguring both its logic and state. Through a number of case studies, the authors demonstrate how their approach integrates seamlessly with existing, mainstream DRS design flows and with well-established verification methodologies such as top-down modeling and coverage-driven verification. Provides researchers with an i...

  14. Class 1E software verification and validation: Past, present, and future

    International Nuclear Information System (INIS)

    Persons, W.L.; Lawrence, J.D.

    1993-10-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  15. Class 1E software verification and validation: Past, present, and future

    International Nuclear Information System (INIS)

    Persons, W.L.; Lawrence, J.D.

    1994-01-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  16. Investigation of depth dependent changes in cerebral haemodynamics during face perception in infants

    Energy Technology Data Exchange (ETDEWEB)

    Blasi, A [Biomedical Optics Research Lab, Department of Medical Physics and Bioengineering, Malet Place Engineering Building, University College London, London WC1E 6BT (United Kingdom); Fox, S [Centre for Brain and Cognitive Development, Henry Wellcome Building, Birkbeck College, University of London, Malet Street, London WC1E 7HX (United Kingdom); Everdell, N [Biomedical Optics Research Lab, Department of Medical Physics and Bioengineering, Malet Place Engineering Building, University College London, London WC1E 6BT (United Kingdom); Volein, A [Centre for Brain and Cognitive Development, Henry Wellcome Building, Birkbeck College, University of London, Malet Street, London WC1E 7HX (United Kingdom); Tucker, L [Centre for Brain and Cognitive Development, Henry Wellcome Building, Birkbeck College, University of London, Malet Street, London WC1E 7HX (United Kingdom); Csibra, G [Centre for Brain and Cognitive Development, Henry Wellcome Building, Birkbeck College, University of London, Malet Street, London WC1E 7HX (United Kingdom); Gibson, A P [Biomedical Optics Research Lab, Department of Medical Physics and Bioengineering, Malet Place Engineering Building, University College London, London WC1E 6BT (United Kingdom); Hebden, J C [Biomedical Optics Research Lab, Department of Medical Physics and Bioengineering, Malet Place Engineering Building, University College London, London WC1E 6BT (United Kingdom); Johnson, M H [Centre for Brain and Cognitive Development, Henry Wellcome Building, Birkbeck College, University of London, Malet Street, London WC1E 7HX (United Kingdom); Elwell, C E [Biomedical Optics Research Lab, Department of Medical Physics and Bioengineering, Malet Place Engineering Building, University College London, London WC1E 6BT (United Kingdom)

    2007-12-07

    Near-infrared spectroscopy has been used to record oxygenation changes in the visual cortex of 4-month-old infants. Our in-house topography system, with 30 channels and 3 different source-detector separations, recorded changes in the concentration of oxy-, deoxy- and total haemoglobin (HbO{sub 2}, HHb and HbT) in response to visual stimuli (face, scrambled visual noise, and cartoons as rest). The aim of this work was to demonstrate the capability of the system to spatially localize functional activation and to study the possibility of depth discrimination in the haemodynamic response. The group data show that both face stimulation and visual noise stimulation induced significant increases in HbO{sub 2} from rest, but the increase in HbO{sub 2} with face stimulation was not significantly different from that seen with visual noise stimulation. The face-stimulus-induced increases in HbO{sub 2} were spread across a greater area at all depths than the visual-noise-induced changes. In results from a single subject there was a significant increase of HbO{sub 2} in the inferior area of the visual cortex in response to both types of stimuli, and a larger number of channels (source-detector pairs) showed an HbO{sub 2} increase to face stimuli, especially at the greatest depth. Activation maps were obtained using 3D reconstruction methods on multi-source-detector-separation optical topography data.

  17. Jet-stirred reactor oxidation of alkane-rich FACE gasoline fuels

    KAUST Repository

    Chen, Bingjie

    2016-06-23

    Understanding species evolution upon gasoline fuel oxidation can aid in mitigating harmful emissions and improving combustion efficiency. Experimentally measured speciation profiles are also important targets for surrogate fuel kinetic models. This work presents the low- and high-temperature oxidation of two alkane-rich FACE gasolines (A and C, Fuels for Advanced Combustion Engines) in a jet-stirred reactor at 10 bar and equivalence ratios from 0.5 to 2 by probe sampling combined with gas chromatography and Fourier Transformed Infrared Spectrometry analysis. Detailed speciation profiles as a function of temperature are presented and compared to understand the combustion chemistry of these two real fuels. Simulations were conducted using three surrogates (i.e., FGA2, FGC2, and FRF 84), which have similar physical and chemical properties as the two gasolines. The experimental results reveal that the reactivity and major product distributions of these two alkane-rich FACE fuels are very similar, indicating that they have similar global reactivity despite their different compositions. The simulation results using all the surrogates capture the two-stage oxidation behavior of the two FACE gasolines, but the extent of low temperature reactivity is over-predicted. The simulations were analyzed, with a focus on the n-heptane and n-butane sub-mechanisms, to help direct the future model development and surrogate fuel formulation strategies.

  18. Jet-stirred reactor oxidation of alkane-rich FACE gasoline fuels

    KAUST Repository

    Chen, Bingjie; Togbé, Casimir; Wang, Zhandong; Dagaut, Philippe; Sarathy, Mani

    2016-01-01

    Understanding species evolution upon gasoline fuel oxidation can aid in mitigating harmful emissions and improving combustion efficiency. Experimentally measured speciation profiles are also important targets for surrogate fuel kinetic models. This work presents the low- and high-temperature oxidation of two alkane-rich FACE gasolines (A and C, Fuels for Advanced Combustion Engines) in a jet-stirred reactor at 10 bar and equivalence ratios from 0.5 to 2 by probe sampling combined with gas chromatography and Fourier Transformed Infrared Spectrometry analysis. Detailed speciation profiles as a function of temperature are presented and compared to understand the combustion chemistry of these two real fuels. Simulations were conducted using three surrogates (i.e., FGA2, FGC2, and FRF 84), which have similar physical and chemical properties as the two gasolines. The experimental results reveal that the reactivity and major product distributions of these two alkane-rich FACE fuels are very similar, indicating that they have similar global reactivity despite their different compositions. The simulation results using all the surrogates capture the two-stage oxidation behavior of the two FACE gasolines, but the extent of low temperature reactivity is over-predicted. The simulations were analyzed, with a focus on the n-heptane and n-butane sub-mechanisms, to help direct the future model development and surrogate fuel formulation strategies.

  19. The Effect of Mystery Shopper Reports on Age Verification for Tobacco Purchases

    Science.gov (United States)

    KREVOR, BRAD S.; PONICKI, WILLIAM R.; GRUBE, JOEL W.; DeJONG, WILLIAM

    2011-01-01

    Mystery shops (MS) involving attempted tobacco purchases by young buyers have been employed to monitor retail stores’ performance in refusing underage sales. Anecdotal evidence suggests that MS visits with immediate feedback to store personnel can improve age verification. This study investigated the impact of monthly and twice-monthly MS reports on age verification. Forty-five Walgreens stores were each visited 20 times by mystery shoppers. The stores were randomly assigned to one of three conditions. Control group stores received no feedback, whereas two treatment groups received feedback communications every visit (twice monthly) or every second visit (monthly) after baseline. Logit regression models tested whether each treatment group improved verification rates relative to the control group. Post-baseline verification rates were higher in both treatment groups than in the control group, but only the stores receiving monthly communications had a significantly greater improvement than control group stores. Verification rates increased significantly during the study period for all three groups, with delayed improvement among control group stores. Communication between managers regarding the MS program may account for the delayed age-verification improvements observed in the control group stores. Encouraging inter-store communication might extend the benefits of MS programs beyond those stores that receive this intervention. PMID:21541874

  20. A Secure Framework for Location Verification in Pervasive Computing

    Science.gov (United States)

    Liu, Dawei; Lee, Moon-Chuen; Wu, Dan

    The relatively new pervasive computing paradigm has changed the way people use computing devices. For example, a person can use a mobile device to obtain its location information at any time and anywhere. There are several security issues concerning whether this information is reliable in a pervasive environment. For example, a malicious user may disable the localization system by broadcasting a forged location, or impersonate other users by eavesdropping on their locations. In this paper, we address the verification of location information in a secure manner. We first present the design challenges for location verification, and then propose a two-layer framework, VerPer, for secure location verification in a pervasive computing environment. Real-world GPS-based wireless sensor network experiments confirm the effectiveness of the proposed framework.

  1. The complex duration perception of emotional faces: Effects of face direction

    Directory of Open Access Journals (Sweden)

    Katrin Martina Kliegl

    2015-03-01

    The perceived duration of emotional face stimuli strongly depends on the expressed emotion. But emotional faces also differ in a number of other features, such as gaze, face direction, or sex. Usually, these features have been controlled by using only pictures of female models with straight gaze and face direction. Doi and Shinohara (2009) reported that an overestimation of angry faces could only be found when the model's gaze was oriented towards the observer. We aimed at replicating this effect for face direction. Moreover, we explored the effect of face direction on the duration perception of sad faces. Controlling for the sex of the face model and the participant, female and male participants rated the duration of neutral, angry and sad face stimuli of both sexes photographed from different perspectives in a bisection task. In line with current findings, we report a significant overestimation of angry compared to neutral face stimuli that was modulated by face direction. Moreover, the perceived duration of sad face stimuli did not differ from that of neutral faces and was not influenced by face direction. Furthermore, we found that faces of the opposite sex appeared to last longer than those of the same sex. This outcome is discussed with regard to stimulus parameters such as the induced arousal, social relevance, and an evolutionary context.

  2. Static Verification for Code Contracts

    Science.gov (United States)

    Fähndrich, Manuel

    The Code Contracts project [3] at Microsoft Research enables programmers on the .NET platform to author specifications in existing languages such as C# and VisualBasic. To take advantage of these specifications, we provide tools for documentation generation, runtime contract checking, and static contract verification.

  3. In vivo near-infrared dual-axis confocal microendoscopy in the human lower gastrointestinal tract

    Science.gov (United States)

    Piyawattanametha, Wibool; Ra, Hyejun; Qiu, Zhen; Friedland, Shai; Liu, Jonathan T. C.; Loewke, Kevin; Kino, Gordon S.; Solgaard, Olav; Wang, Thomas D.; Mandella, Michael J.; Contag, Christopher H.

    2012-02-01

    Near-infrared confocal microendoscopy is a promising technique for deep in vivo imaging of tissues and can generate high-resolution cross-sectional images at the micron scale. We demonstrate the use of a dual-axis confocal (DAC) near-infrared fluorescence microendoscope with a 5.5-mm outer diameter for obtaining clinical images of human colorectal mucosa. High-speed two-dimensional en face scanning was achieved through a microelectromechanical systems (MEMS) scanner while a micromotor was used for adjusting the axial focus. In vivo images of human patients are collected at 5 frames/sec with a field of view of 362×212 μm² and a maximum imaging depth of 140 μm. During routine endoscopy, indocyanine green (ICG) was topically applied as a nonspecific optical contrast agent to regions of the human colon. The DAC microendoscope was then used to obtain microanatomic images of the mucosa by detecting near-infrared fluorescence from ICG. These results suggest that DAC microendoscopy may have utility for visualizing the anatomical and, perhaps, functional changes associated with colorectal pathology for the early detection of colorectal cancer.

  4. A causal relationship between face-patch activity and face-detection behavior.

    Science.gov (United States)

    Sadagopan, Srivatsun; Zarco, Wilbert; Freiwald, Winrich A

    2017-04-04

    The primate brain contains distinct areas densely populated by face-selective neurons. One of these, face-patch ML, contains neurons selective for contrast relationships between face parts. Such contrast-relationships can serve as powerful heuristics for face detection. However, it is unknown whether neurons with such selectivity actually support face-detection behavior. Here, we devised a naturalistic face-detection task and combined it with fMRI-guided pharmacological inactivation of ML to test whether ML is of critical importance for real-world face detection. We found that inactivation of ML impairs face detection. The effect was anatomically specific, as inactivation of areas outside ML did not affect face detection, and it was categorically specific, as inactivation of ML impaired face detection while sparing body and object detection. These results establish that ML function is crucial for detection of faces in natural scenes, performing a critical first step on which other face processing operations can build.

  5. Multi parametric sensitivity study applied to temperature measurement of metallic plasma facing components in fusion devices

    International Nuclear Information System (INIS)

    Aumeunier, M-H.; Corre, Y.; Firdaouss, M.; Gauthier, E.; Loarer, T.; Travere, J-M.; Gardarein, J-L.; EFDA JET Contributor

    2013-06-01

    In nuclear fusion experiments, the protection of the plasma facing components (PFCs) is commonly ensured by infrared (IR) thermography. Nevertheless, the surface monitoring of new metallic plasma facing components, as in JET and ITER, is challenging. Indeed, the analysis of infrared signals is more complicated in such a metallic environment, since the signals are perturbed by reflected photons coming from high-temperature regions. To address and anticipate this new measurement environment, predictive photonic models based on Monte Carlo ray tracing (SPEOS CAA V5 Based) have been developed to assess the contribution of the reflected part of the total flux collected by the camera and the resulting temperature error. This paper deals with the effects of metal surface features, such as the emissivity and reflectivity models, on the accuracy of the surface temperature estimation. The reliability of the feature models is discussed by comparing the simulation with experimental data obtained with the wide-angle IR thermography system of the JET ITER-like wall. The impact of the temperature distribution is studied by considering two typical plasma scenarios, in limiter (ITER start-up scenario) and X-point (standard divertor scenario) configurations. The achievable measurement performance of the IR system and a risk analysis of its functionalities are discussed. (authors)

  6. Technical safety requirements control level verification; TOPICAL

    International Nuclear Information System (INIS)

    STEWART, J.L.

    1999-01-01

    A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL

  7. SENSOR++: Simulation of Remote Sensing Systems from Visible to Thermal Infrared

    Science.gov (United States)

    Paproth, C.; Schlüßler, E.; Scherbaum, P.; Börner, A.

    2012-07-01

    During the development process of a remote sensing system, the optimization and the verification of the sensor system are important tasks. To support these tasks, the simulation of the sensor and its output is valuable. This enables the developers to test algorithms, estimate errors, and evaluate the capabilities of the whole sensor system before the final remote sensing system is available and produces real data. The presented simulation concept, SENSOR++, consists of three parts. The first part is the geometric simulation, which calculates where the sensor looks by using a ray tracing algorithm. This also determines whether the observed part of the scene is shadowed or not. The second part describes the radiometry and results in the spectral at-sensor radiance from the visible spectrum to the thermal infrared, according to the simulated sensor type. In the case of earth remote sensing, it also includes a model of the radiative transfer through the atmosphere. The final part uses the at-sensor radiance to generate digital images by using an optical and an electronic sensor model. Using SENSOR++ for an optimization requires the additional application of task-specific data processing algorithms. The principle of the simulation approach is explained, all relevant concepts of SENSOR++ are discussed, and first examples of its use are given, such as a camera simulation for a moon lander. Finally, the verification of SENSOR++ is demonstrated.

  8. Alternative face models for 3D face registration

    Science.gov (United States)

    Salah, Albert Ali; Alyüz, Neşe; Akarun, Lale

    2007-01-01

    3D has become an important modality for face biometrics. The accuracy of a 3D face recognition system depends on a correct registration that aligns the facial surfaces and makes a comparison possible. The best results obtained so far use a one-to-all registration approach, which means each new facial surface is registered to all faces in the gallery, at a great computational cost. We explore the approach of registering the new facial surface to an average face model (AFM), which automatically establishes correspondence to the pre-registered gallery faces. Going one step further, we propose that using a couple of well-selected AFMs can trade off computation time against accuracy. Drawing on cognitive justifications, we propose to employ category-specific alternative average face models for registration, which is shown to increase the accuracy of the subsequent recognition. We inspect thin-plate spline (TPS) and iterative closest point (ICP) based registration schemes under realistic assumptions on manual or automatic landmark detection prior to registration. We evaluate several approaches for the coarse initialization of ICP. We propose a new algorithm for constructing an AFM, and show that it works better than a recent approach. Finally, we perform simulations with multiple AFMs that correspond to different clusters in the face shape space and compare these with gender and morphology based groupings. We report our results on the FRGC 3D face database.
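
    The ICP registration step mentioned above alternates between matching each surface point to its nearest neighbour on the model and refitting a rigid transform. A generic sketch under stated assumptions (brute-force nearest neighbours, a Kabsch least-squares rigid fit, NumPy; the paper's actual pipeline additionally involves landmarks, TPS warping and AFM construction):

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Kabsch algorithm: least-squares rotation R and translation t
    such that R @ src_i + t approximates dst_i."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)          # cross-covariance of centred clouds
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T)) # guard against reflections
    R = Vt.T @ np.diag([1.0] * (src.shape[1] - 1) + [d]) @ U.T
    return R, cd - R @ cs

def icp(src, model, iters=20):
    """Align point cloud `src` to `model` by iterating nearest-neighbour
    matching and rigid refitting; returns the transformed points."""
    pts = src.copy()
    for _ in range(iters):
        # brute-force nearest neighbour of each point in the model surface
        d2 = ((pts[:, None, :] - model[None, :, :]) ** 2).sum(-1)
        matched = model[d2.argmin(1)]
        R, t = best_rigid_transform(pts, matched)
        pts = pts @ R.T + t
    return pts
```

    Registering each probe once against an AFM (or a few category-specific AFMs) instead of against every gallery face is what reduces the one-to-all cost: the gallery faces are pre-registered to the AFM offline, so correspondence to all of them is established by a single ICP run.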

  9. Withholding response to self-face is faster than to other-face.

    Science.gov (United States)

    Zhu, Min; Hu, Yinying; Tang, Xiaochen; Luo, Junlong; Gao, Xiangping

    2015-01-01

    The self-face advantage refers to the finding that adults respond faster to their own face than to other faces. A stop-signal task was used to explore how this advantage interacts with response inhibition. The results showed that reaction times for self-face were shorter than those for other-face in the stop-response trials, but not in the go task. The novel finding was that self-face had a shorter stop-signal reaction time than other-face in the successful inhibition trials. These results indicate that the processing of self-face may be characterized by a strong response tendency and a correspondingly strong inhibitory control.
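    The stop-signal reaction time (SSRT) used in such studies is commonly estimated with the integration method: the quantile of the go-RT distribution at the observed probability of responding on stop trials, minus the mean stop-signal delay. The numbers below are made up for illustration; they are not data from the study.

```python
import numpy as np

def ssrt_integration(go_rts, ssds, stop_respond):
    """Integration method: SSRT = go-RT quantile at p(respond | stop signal)
    minus the mean stop-signal delay (SSD)."""
    p_respond = np.mean(stop_respond)                 # fraction of failed stops
    nth_rt = np.quantile(np.sort(go_rts), p_respond)  # corresponding go-RT quantile
    return nth_rt - np.mean(ssds)

go = np.array([400, 420, 450, 480, 500, 520, 540, 560, 580, 600.0])  # go RTs (ms)
ssd = np.array([200, 250, 200, 250.0])                # stop-signal delays (ms)
responded = np.array([1, 1, 0, 0])                    # responded on half the stop trials
print(ssrt_integration(go, ssd, responded))           # → 285.0
```

    A shorter SSRT for self-face than for other-face, computed this way per condition, is what the abstract reports as stronger inhibitory control.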

  10. Experience in non-proliferation verification: The Treaty of Rarotonga

    International Nuclear Information System (INIS)

    Walker, R.A.

    1998-01-01

    The verification provisions of the Treaty of Rarotonga are subdivided into two categories: those performed by the IAEA and those performed by other entities. A further provision of the Treaty of Rarotonga is relevant to IAEA safeguards, as it supports the continued effectiveness of the international non-proliferation system based on the Non-Proliferation Treaty and the IAEA safeguards system. The non-IAEA verification process is described as well

  11. Timed verification with µCRL

    NARCIS (Netherlands)

    Blom, S.C.C.; Ioustinova, N.; Sidorova, N.; Broy, M.; Zamulin, A.V.

    2003-01-01

    µCRL is a process algebraic language for the specification and verification of distributed systems. µCRL makes it possible to describe temporal properties of distributed systems, but it has no explicit reference to time. In this work we propose a way of introducing discrete time without extending the language.

  12. Eggspectation : organic egg verification tool

    NARCIS (Netherlands)

    Ruth, van S.M.; Hoogenboom, L.A.P.

    2011-01-01

    In 2009 RIKILT conducted a study on about 2,000 eggs to evaluate three different analytical verification methods: carotenoid profiling, fatty acid profiling and isotope ratio mass spectrometry. The eggs were collected from about 50 Dutch farms. The selection was based on the farms’ location and

  13. About-face on face recognition ability and holistic processing.

    Science.gov (United States)

    Richler, Jennifer J; Floyd, R Jackie; Gauthier, Isabel

    2015-01-01

    Previous work found a small but significant relationship between holistic processing measured with the composite task and face recognition ability measured by the Cambridge Face Memory Test (CFMT; Duchaine & Nakayama, 2006). Surprisingly, recent work using a different measure of holistic processing (Vanderbilt Holistic Face Processing Test [VHPT-F]; Richler, Floyd, & Gauthier, 2014) and a larger sample found no evidence for such a relationship. In Experiment 1 we replicate this unexpected result, finding no relationship between holistic processing (VHPT-F) and face recognition ability (CFMT). A key difference between the VHPT-F and other holistic processing measures is that unique face parts are used on each trial in the VHPT-F, unlike in other tasks where a small set of face parts repeat across the experiment. In Experiment 2, we test the hypothesis that correlations between the CFMT and holistic processing tasks are driven by stimulus repetition that allows for learning during the composite task. Consistent with our predictions, CFMT performance was correlated with holistic processing in the composite task when a small set of face parts repeated over trials, but not when face parts did not repeat. A meta-analysis confirms that relationships between the CFMT and holistic processing depend on stimulus repetition. These results raise important questions about what is being measured by the CFMT, and challenge current assumptions about why faces are processed holistically.

  14. BEval: A Plug-in to Extend Atelier B with Current Verification Technologies

    Directory of Open Access Journals (Sweden)

    Valério Medeiros Jr.

    2014-01-01

    Full Text Available This paper presents BEval, an extension of Atelier B to improve automation in the verification activities in the B method or Event-B. It combines a tool for managing and verifying software projects (Atelier B) and a model checker/animator (ProB), so that the verification conditions generated in the former are evaluated with the latter. In our experiments, the two main verification strategies (manual and automatic) showed significant improvement, as ProB's evaluator proves complementary to Atelier B's built-in provers. We conducted experiments with the B model of a micro-controller instruction set; several verification conditions that we were not able to discharge automatically or manually with Atelier B's provers were automatically verified using BEval.

  15. Cognitive Bias in Systems Verification

    Science.gov (United States)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> questionable decisions to deploy. Availability -> inability to conceive critical tests. Representativeness -> overinterpretation of results. Positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  16. The joint verification experiments as a global non-proliferation exercise

    International Nuclear Information System (INIS)

    Shaner, J.W.

    1998-01-01

    This conference commemorates the 10th anniversary of the second of two Joint Verification Experiments conducted by the Soviet Union and the US. These two experiments, one at the Nevada test site in the US and the second here at the Semipalatinsk test site, were designed to test the verification of a nuclear testing treaty limiting the size of underground explosions to 150 kilotons. By building trust and technical respect between the weapons scientists of the two most powerful adversaries, the Joint Verification Experiment (JVE) had the unanticipated result of initiating a suite of cooperative projects and programs aimed at reducing Cold War threats and preventing the proliferation of weapons of mass destruction

  17. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    International Nuclear Information System (INIS)

    Crowell, Michael W

    2015-01-01

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to 'hand' comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory's High Flux Isotope Reactor (HFIR).
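    The pattern behind such automated installation verification is simple: re-run reference models on the local installation and compare key outputs against stored reference values within a tolerance. The sketch below is generic Python, not the COMSOL/LiveLink API; the quantity names and values are hypothetical.

```python
import math

def verify_installation(results, reference, rel_tol=1e-6):
    # compare freshly computed outputs against stored reference values
    report = {}
    for name, ref in reference.items():
        got = results.get(name)
        report[name] = got is not None and math.isclose(got, ref, rel_tol=rel_tol)
    return all(report.values()), report

# quantity names and values are illustrative placeholders only
reference = {"peak_temperature_K": 341.25, "max_velocity_m_s": 0.082}
computed = {"peak_temperature_K": 341.2500001, "max_velocity_m_s": 0.082}
ok, report = verify_installation(computed, reference)
print(ok)  # → True
```

    A missing or out-of-tolerance quantity fails the whole verification, which is the conservative behaviour wanted for safety-related codes.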

  19. SSME Alternate Turbopump Development Program: Design verification specification for high-pressure fuel turbopump

    Science.gov (United States)

    1989-01-01

    The design and verification requirements appropriate to hardware at the detail, subassembly, component, and engine levels are defined, and these requirements are correlated to the development demonstrations which provide verification that design objectives are achieved. The high-pressure fuel turbopump requirements verification matrix provides correlation between design requirements and the tests required to verify that the requirements have been met.

  20. Tore-Supra infrared thermography system, a real steady-state diagnostic

    International Nuclear Information System (INIS)

    Guilhem, D.; Bondil, J.L.; Bertrand, B.; Desgranges, C.; Lipa, M.; Messina, P.; Missirlian, M.; Portafaix, C.; Reichle, R.; Roche, H.; Saille, A.

    2005-01-01

    The Tore-Supra Tokamak (Ip = 1.5 MA, Bt = 4 T) has been constructed with a steady-state magnetic field using superconducting magnets and water-cooled plasma facing components (PFCs) for high-performance long-pulse plasma discharges. When not actively cooled, plasma facing components can only accumulate a limited amount of energy, since the temperature increases continuously during the discharge until radiation cooling equals the incoming heat flux. Such an environment is found in the JET Tokamak [JET Team, IAEA-CN-60/A1-3, Seville, 1994] and on TRIAM [M. Sakamoto, H. Nakashima, S. Kawasaki, A. Iyomasa, S.V. Kulkarni, M. Hasegawa, E. Jotaki, H. Zushi, K. Nakamura, K. Hanada, S. Itoh, Static and dynamic properties of wall recycling in TRIAM-1M, J. Nucl. Mater. 313-316 (2003) 519-523] [Y. Kamada, et al., Nucl. Fusion 3 (1999) 1845]. In Tore-Supra, the surface temperature of the actively cooled plasma facing components reaches steady state within a second. We present here the Tore-Supra thermographic system, made of seven endoscope bodies equipped so far with eight infrared (IR) cameras. It has to be noted that this is the first such diagnostic to be actively cooled, as required for steady-state operation. The main purpose of the diagnostic is to prevent the plasma from damaging the actively cooled plasma facing components (ACPFCs), which consist of the toroidal pumped limiter (TPL), 7 m², and of five radio-frequency antennae, 1.5 m² each

  1. Extragalactic infrared astronomy

    International Nuclear Information System (INIS)

    Gondhalekar, P.M.

    1985-05-01

    The paper concerns the field of Extragalactic Infrared Astronomy, discussed at the Fourth RAL Workshop on Astronomy and Astrophysics. Fifteen papers were presented on infrared emission from extragalactic objects. Both ground-(and aircraft-) based and IRAS infrared data were reviewed. The topics covered star formation in galaxies, active galactic nuclei and cosmology. (U.K.)

  2. Wavelet-based verification of the quantitative precipitation forecast

    Science.gov (United States)

    Yano, Jun-Ichi; Jakubiak, Bogumil

    2016-06-01

    This paper explores the use of wavelets for spatial verification of quantitative precipitation forecasts (QPF), and especially the capacity of wavelets to provide both localization and scale information. Two 24-h forecast experiments using the two versions of the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS) on 22 August 2010 over Poland are used to illustrate the method. Strong spatial localizations and associated intermittency of the precipitation field make verification of QPF difficult using standard statistical methods. The wavelet becomes an attractive alternative, because it is specifically designed to extract spatially localized features. The wavelet modes are characterized by the two indices for the scale and the localization. Thus, these indices can simply be employed for characterizing the performance of QPF in scale and localization without any further elaboration or tunable parameters. Furthermore, spatially-localized features can be extracted in wavelet space in a relatively straightforward manner with only a weak dependence on a threshold. Such a feature may be considered an advantage of the wavelet-based method over more conventional "object" oriented verification methods, as the latter tend to exhibit strong threshold sensitivities. The present paper also points out limits of the so-called "scale separation" methods based on wavelets. Our study demonstrates how these wavelet-based QPF verifications can be performed straightforwardly. Possibilities for further developments of the wavelet-based methods, especially towards a goal of identifying a weak physical process contributing to forecast error, are also pointed out.
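    The core idea, that a wavelet transform attributes forecast-error energy to discrete scales and locations, can be sketched with an orthonormal 2-D Haar decomposition in NumPy (an illustrative toy, not the COAMPS verification setup):

```python
import numpy as np

def haar_level(x):
    # one orthonormal 2-D Haar analysis step: coarse approximation + detail energy
    a = (x[0::2, 0::2] + x[0::2, 1::2] + x[1::2, 0::2] + x[1::2, 1::2]) / 2.0
    detail = x - np.kron(a / 2.0, np.ones((2, 2)))   # deviation from 2x2 block means
    return a, float((detail ** 2).sum())

def scale_energies(err, levels=3):
    # forecast-error energy captured at each dyadic scale (finest first)
    energies = []
    for _ in range(levels):
        err, e = haar_level(err)
        energies.append(e)
    return energies

err = np.zeros((8, 8))
err[3, 3] = 4.0             # a spatially localised precipitation error
print(scale_energies(err))  # → [12.0, 3.0, 0.75]: energy concentrated at the finest scale
```

    Parseval's relation holds here: the per-scale energies plus the energy of the remaining coarse field sum to the total error energy, which is why no tunable threshold is needed to apportion error across scales.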

  3. Societal Verification: Intellectual Game or International Game-Changer

    International Nuclear Information System (INIS)

    Hartigan, Kelsey; Hinderstein, Corey

    2013-01-01

    Within the nuclear nonproliferation and arms control field, there is an increasing appreciation for the potential of open source information technologies to supplement existing verification and compliance regimes. While clearly not a substitute for on-site inspections or national technical means, it may be possible to better leverage information gleaned from commercial satellite imagery, international trade records and the vast amount of data being exchanged online and between publics (including social media) so as to develop a more comprehensive set of tools and practices for monitoring and verifying a state’s nuclear activities and helping judge compliance with international obligations. The next generation “toolkit” for monitoring and verifying items, facility operations and activities will likely include a more diverse set of analytical tools and technologies than are currently used internationally. To explore these and other issues, the Nuclear Threat Initiative has launched an effort that examines, in part, the role that emerging technologies and “citizen scientists” might play in future verification regimes. This paper will include an assessment of past proliferation and security “events” and whether emerging tools and technologies would have provided indicators concurrently or in advance of these actions. Such case studies will be instrumental in understanding the reliability of these technologies and practices and in thinking through the requirements of a 21st century verification regime. Keywords: Verification, social media, open-source information, arms control, disarmament.

  4. Infrared thermography

    CERN Document Server

    Meola, Carosena

    2012-01-01

    This e-book conveys information about basic IRT theory, infrared detectors, signal digitalization and applications of infrared thermography in many fields such as medicine, foodstuff conservation, fluid dynamics, architecture, anthropology, condition monitoring, and non-destructive testing and evaluation of materials and structures.

  5. Portable system for periodical verification of area monitors for neutrons

    International Nuclear Information System (INIS)

    Souza, Luciane de R.; Leite, Sandro Passos; Lopes, Ricardo Tadeu; Patrao, Karla C. de Souza; Fonseca, Evaldo S. da; Pereira, Walsan W.

    2009-01-01

    The Neutrons Laboratory is developing a project for the construction of a portable test system for verification of the functioning conditions of neutron area monitors. This device will allow users to verify, at the installations where the instruments are used, that the calibration of their monitors is maintained, avoiding the use of equipment whose response to the neutron beam is inadequate

  6. Neutron spectrometric methods for core inventory verification in research reactors

    International Nuclear Information System (INIS)

    Ellinger, A.; Filges, U.; Hansen, W.; Knorr, J.; Schneider, R.

    2002-01-01

    As a consequence of Non-Proliferation Treaty safeguards, inspections are periodically made in nuclear facilities by the IAEA and the EURATOM Safeguards Directorate. The inspection methods are continuously improved. Therefore, the Core Inventory Verification method is being developed as an indirect method for the verification of the core inventory and for checking the declared operation of research reactors

  7. Tree automata-based refinement with application to Horn clause verification

    DEFF Research Database (Denmark)

    Kafle, Bishoksan; Gallagher, John Patrick

    2015-01-01

    In this paper we apply tree-automata techniques to refinement of abstract interpretation in Horn clause verification. We go beyond previous work on refining trace abstractions; firstly we handle tree automata rather than string automata and thereby can capture traces in any Horn clause derivation...... compare the results with other state of the art Horn clause verification tools....

  8. Design of verification platform for wireless vision sensor networks

    Science.gov (United States)

    Ye, Juanjuan; Shang, Fei; Yu, Chuang

    2017-08-01

    At present, the majority of research on wireless vision sensor networks (WVSNs) still remains in the software simulation stage, and very few verification platforms for WVSNs are available for use. This situation seriously restricts the transformation from theoretical research on WVSNs to practical application. Therefore, it is necessary to study the construction of a verification platform for WVSNs. This paper combines a wireless transceiver module, a visual information acquisition module and a power acquisition module, designs a high-performance wireless vision sensor node whose core is an ARM11 microprocessor, and selects AODV as the routing protocol to set up a verification platform called AdvanWorks for WVSNs. Experiments show that AdvanWorks can successfully achieve the functions of image acquisition, coding and wireless transmission, and can obtain the effective distance parameters between nodes, which lays a good foundation for the follow-up application of WVSNs.

  9. Darcy Tools version 3.4. Verification, validation and demonstration

    International Nuclear Information System (INIS)

    Svensson, Urban

    2010-12-01

    DarcyTools is a computer code for simulation of flow and transport in porous and/or fractured media. The fractured media in mind is a fractured rock, and the porous media the soil cover on top of the rock; the class of flows in mind is hence groundwater flow. A number of novel methods and features form the present version of DarcyTools. In the verification studies, these methods are evaluated by comparison with analytical solutions for idealized situations. The five verification groups (see Table 3-1 below) thus reflect the scope of DarcyTools. The present report focuses on the Verification, Validation and Demonstration of DarcyTools. Two accompanying reports cover other aspects: - Concepts, Methods and Equations, /Svensson et al. 2010/ (hereafter denoted Report 1). - User's Guide, /Svensson and Ferry 2010/ (hereafter denoted Report 2)
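    The verification-by-analytical-solution approach can be illustrated with the simplest possible case: steady 1-D Darcy flow between two fixed heads, where the exact solution is a linear head profile. This is a generic finite-difference sketch, not DarcyTools code.

```python
import numpy as np

def solve_darcy_1d(n, h_left, h_right):
    # steady 1-D Darcy flow, homogeneous conductivity: discrete Laplace solve
    A = np.zeros((n, n))
    b = np.zeros(n)
    for i in range(n):
        if i in (0, n - 1):                      # Dirichlet boundary heads
            A[i, i] = 1.0
            b[i] = h_left if i == 0 else h_right
        else:                                    # interior mass balance
            A[i, i - 1], A[i, i], A[i, i + 1] = 1.0, -2.0, 1.0
    return np.linalg.solve(A, b)

n = 11
numeric = solve_darcy_1d(n, 10.0, 2.0)
analytic = np.linspace(10.0, 2.0, n)             # exact solution: linear head profile
print(np.abs(numeric - analytic).max())          # discrepancy at solver precision
```

    Each of the code's verification groups follows this pattern at larger scale: pick an idealized configuration with a known closed-form solution and bound the numerical discrepancy.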

  11. Faces in places: humans and machines make similar face detection errors.

    Directory of Open Access Journals (Sweden)

    Bernard Marius 't Hart

    Full Text Available The human visual system seems to be particularly efficient at detecting faces. This efficiency sometimes comes at the cost of wrongfully seeing faces in arbitrary patterns, including famous examples such as a rock configuration on Mars or the roast patterns on a piece of toast. In machine vision, face detection has made considerable progress and has become a standard feature of many digital cameras. The arguably most widespread algorithm for such applications (the "Viola-Jones" algorithm) achieves high detection rates at high computational efficiency. To what extent do the patterns that the algorithm mistakenly classifies as faces also fool humans? We selected three kinds of stimuli from real-life, first-person perspective movies based on the algorithm's output: correct detections ("real faces"), false positives ("illusory faces") and correctly rejected locations ("non faces"). Observers were shown pairs of these for 20 ms and had to direct their gaze to the location of the face. We found that illusory faces were mistaken for faces more frequently than non faces. In addition, rotation of the real face yielded more errors, while rotation of the illusory face yielded fewer errors. Using colored stimuli increases overall performance, but does not change the pattern of results. When replacing the eye movement by a manual response, however, the preference for illusory faces over non faces disappeared. Taken together, our data show that humans make similar face-detection errors as the Viola-Jones algorithm when directing their gaze to briefly presented stimuli. In particular, the relative spatial arrangement of oriented filters seems of relevance. This suggests that efficient face detection in humans is likely to be pre-attentive and based on rather simple features such as those encoded in the early visual system.
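    The efficiency of the Viola-Jones detector rests on the integral image (summed-area table), which lets any rectangular Haar-like feature be evaluated with four lookups regardless of its size. A minimal sketch:

```python
import numpy as np

def integral_image(img):
    # ii[y, x] = sum of img[:y, :x]; zero-padded first row/column
    return np.pad(np.cumsum(np.cumsum(img, axis=0), axis=1), ((1, 0), (1, 0)))

def box_sum(ii, y0, x0, y1, x1):
    # sum of img[y0:y1, x0:x1] in O(1), independent of rectangle size
    return ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]

img = np.arange(16.0).reshape(4, 4)
ii = integral_image(img)
print(box_sum(ii, 1, 1, 3, 3))                            # → 30.0 (5 + 6 + 9 + 10)
# a two-rectangle "edge" feature: top half minus bottom half
print(box_sum(ii, 0, 0, 2, 4) - box_sum(ii, 2, 0, 4, 4))  # → -64.0
```

    A boosted cascade of thousands of such thresholded rectangle features forms the full detector; these rectangle patterns are the "relative spatial arrangement of oriented filters" the abstract refers to.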

  12. 7 CFR 272.11 - Systematic Alien Verification for Entitlements (SAVE) Program.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false Systematic Alien Verification for Entitlements (SAVE... FOR PARTICIPATING STATE AGENCIES § 272.11 Systematic Alien Verification for Entitlements (SAVE... and Naturalization Service (INS), in order to verify the validity of documents provided by aliens...

  13. 12 CFR 715.8 - Requirements for verification of accounts and passbooks.

    Science.gov (United States)

    2010-01-01

    ...' share and loan accounts; (2) Statistical method. A sampling method which provides for: (i) Random... CREDIT UNIONS SUPERVISORY COMMITTEE AUDITS AND VERIFICATIONS § 715.8 Requirements for verification of... jurisdiction in which the credit union is principally located, the auditor may choose among the sampling...

  14. Age verification cards fail to fully prevent minors from accessing tobacco products.

    Science.gov (United States)

    Kanda, Hideyuki; Osaki, Yoneatsu; Ohida, Takashi; Kaneita, Yoshitaka; Munezawa, Takeshi

    2011-03-01

    Proper age verification can prevent minors from accessing tobacco products. For this reason, electronic locking devices based on a proof-of-age system utilising cards were installed in almost every tobacco vending machine across Japan and Germany to restrict sales to minors. We aimed to clarify the associations between the amount smoked by high school students and the usage of age verification cards by conducting a nationwide cross-sectional survey of students in Japan. This survey was conducted in 2008. We asked high school students, aged 13-18 years, in Japan about their smoking behaviour, where they purchase cigarettes, whether or not they had used age verification cards, and if so, how they obtained the card. As the amount smoked increased, the prevalence of purchasing cigarettes from vending machines also rose for both males and females. The percentage of those with experience of using an age verification card was also higher among those who smoked more. Somebody outside the family was the most common source of cards. Surprisingly, around 5% of males and females belonging to the group with the highest smoking levels applied for cards themselves. Age verification cards cannot fully prevent minors from accessing tobacco products. These findings suggest that a total ban of tobacco vending machines, not an age verification system, is needed to prevent sales to minors.

  15. A Cache System Design for CMPs with Built-In Coherence Verification

    Directory of Open Access Journals (Sweden)

    Mamata Dalui

    2016-01-01

    Full Text Available This work reports an effective design of a cache system for Chip Multiprocessors (CMPs). It introduces built-in logic for verification of cache coherence in CMPs realizing a directory-based protocol. It is developed around the cellular automata (CA) machine, invented by John von Neumann in the 1950s. A special class of CA referred to as single length cycle 2-attractor cellular automata (TACA) has been employed to detect the inconsistencies in cache line states of processors' private caches. The TACA module captures the coherence status of the CMPs' cache system and memorizes any inconsistent recording of the cache line states during the processors' reference to a memory block. Theory has been developed to empower a TACA to analyse the cache state updates and then to settle to an attractor state indicating a quick decision on a faulty recording of cache line status. The introduction of segmentation of the CMPs' processor pool ensures a better efficiency, in determining the inconsistencies, by reducing the number of computation steps in the verification logic. The hardware requirement for the verification logic points to the fact that the overhead of the proposed coherence verification module is much lower than that of conventional verification units and is insignificant with respect to the cost involved in the CMPs' cache system.
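    The attractor idea can be sketched abstractly: iterate a deterministic state-transition function until a state repeats, and classify the run by the length of the cycle reached; in the paper's scheme, consistent cache-line recordings settle on single-length-cycle (point) attractors. The transition functions below are toy stand-ins, not the TACA rules.

```python
def cycle_length(next_state, start, max_steps=1000):
    # iterate a deterministic transition function until a state repeats;
    # return the length of the cycle it settles into (1 = point attractor)
    seen = {}
    s, step = start, 0
    while s not in seen and step < max_steps:
        seen[s] = step
        s = next_state(s)
        step += 1
    return step - seen[s]

print(cycle_length(lambda s: s // 2, 13))  # → 1: every state drains to the fixed point 0
print(cycle_length(lambda s: 1 - s, 0))    # → 2: a two-state oscillation, not a point attractor
```

    Hardware realizations of such machines decide this in a bounded number of clock cycles, which is what makes the verification logic cheap relative to conventional checkers.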

  16. Software verification, model validation, and hydrogeologic modelling aspects in nuclear waste disposal system simulations. A paradigm shift

    International Nuclear Information System (INIS)

    Sheng, G.M.

    1994-01-01

    This work reviewed the current concept of nuclear waste disposal in stable, terrestrial geologic media with a system of natural and man-made multi-barriers. Various aspects of this concept and supporting research were examined, with the emphasis on the Canadian Nuclear Fuel Waste Management Program. Several of the crucial issues and challenges facing the current concept were discussed. These include: the difficulties inherent in a concept that centres around lithologic studies; the unsatisfactory state of software quality assurance in the present computer simulation programs; and the lack of a standardized, comprehensive, and systematic procedure to carry out a rigorous process of model validation and assessment of simulation studies. An outline of such an approach was presented, and some of the principles, tools and techniques for software verification were introduced and described. A case study involving an evaluation of the Canadian performance assessment computer program is presented. A new paradigm for nuclear waste disposal was advocated to address the challenges facing the existing concept. The RRC (Regional Recharge Concept) was introduced and its many advantages were described and shown through a modelling exercise. (orig./HP)

  17. Broadband infrared beam splitter for spaceborne interferometric infrared sounder.

    Science.gov (United States)

    Yu, Tianyan; Liu, Dingquan; Qin, Yang

    2014-10-01

    A broadband infrared beam splitter (BS) on a ZnSe substrate used for the spaceborne interferometric infrared sounder (SIIRS) is studied in the spectral range of 4.44-15 μm. Both the broadband antireflection coating and the broadband beam-splitter coating of this BS are designed and tested. To optimize the optical properties and the stability of the BS, suitable infrared materials were selected and improved deposition techniques were applied. The designed structures matched the experimental data well, and the properties of the BS met the application specification of SIIRS.
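    Coating designs of this kind are normally evaluated with the characteristic (transfer) matrix method. The sketch below implements it for normal incidence with lossless layers; the refractive indices are rough illustrative values (ZnSe ≈ 2.4 in the mid-IR), not the published design.

```python
import numpy as np

def reflectance(n_layers, d_layers, n_inc, n_sub, wavelength):
    # characteristic-matrix method: normal incidence, non-absorbing layers
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        delta = 2.0 * np.pi * n * d / wavelength       # phase thickness of the layer
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])
    r = (n_inc * B - C) / (n_inc * B + C)              # amplitude reflection coefficient
    return abs(r) ** 2

# bare ZnSe in air: R = ((1 - 2.4) / (1 + 2.4))**2 ≈ 0.17
print(round(reflectance([], [], 1.0, 2.4, 10.0), 4))   # → 0.1696
# a single quarter-wave layer of index sqrt(2.4) nulls the reflection at 10 µm
n_ar = np.sqrt(2.4)
print(round(reflectance([n_ar], [10.0 / (4.0 * n_ar)], 1.0, 2.4, 10.0), 4))  # → 0.0
```

    A real broadband AR or beam-splitter coating stacks many such layers and optimizes their indices and thicknesses over the full 4.44-15 μm band rather than at a single wavelength.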

  18. Face-to-face or face-to-screen? Undergraduates' opinions and test performance in classroom vs. online learning

    Science.gov (United States)

    Kemp, Nenagh; Grieve, Rachel

    2014-01-01

    As electronic communication becomes increasingly common, and as students juggle study, work, and family life, many universities are offering their students more flexible learning opportunities. Classes once delivered face-to-face are often replaced by online activities and discussions. However, there is little research comparing students' experience and learning in these two modalities. The aim of this study was to compare undergraduates' preference for, and academic performance on, class material and assessment presented online vs. in traditional classrooms. Psychology students (N = 67) at an Australian university completed written exercises, a class discussion, and a written test on two academic topics. The activities for one topic were conducted face-to-face, and the other online, with topics counterbalanced across two groups. The results showed that students preferred to complete activities face-to-face rather than online, but there was no significant difference in their test performance in the two modalities. In their written responses, students expressed a strong preference for class discussions to be conducted face-to-face, reporting that they felt more engaged, and received more immediate feedback, than in online discussion. A follow-up study with a separate group (N = 37) confirmed that although students appreciated the convenience of completing written activities online in their own time, they also strongly preferred to discuss course content with peers in the classroom rather than online. It is concluded that online and face-to-face activities can lead to similar levels of academic performance, but that students would rather do written activities online but engage in discussion in person. Course developers could aim to structure classes so that students can benefit from both the flexibility of online learning, and the greater engagement experienced in face-to-face discussion. PMID:25429276

  20. A knowledge-base verification of NPP expert systems using extended Petri nets

    International Nuclear Information System (INIS)

    Kwon, Il Won; Seong, Poong Hyun

    1995-01-01

    The verification phase of a knowledge base is an important part of developing reliable expert systems, especially in the nuclear industry. Although several strategies or tools have been developed to perform potential error checking, they often neglect the reliability of verification methods. Because a Petri net provides a uniform mathematical formalization of a knowledge base, it has been employed for knowledge base verification. In this work, we devise and suggest an automated tool, called COKEP (Checker Of Knowledge base using Extended Petri net), for detecting incorrectness, inconsistency, and incompleteness in a knowledge base. The scope of the verification problem is expanded to chained errors, unlike in previous studies that assumed error incidence to be limited to rule pairs only. In addition, we consider certainty factors in checking, because most knowledge bases have certainty factors
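    The kind of chained error targeted here can be illustrated without Petri nets: saturate a rule base by forward chaining and flag any literal whose negation is also derivable. This toy (plain Python, no certainty factors) shows only the chained-inconsistency idea, not COKEP's extended-Petri-net formalization; the rules are hypothetical.

```python
def forward_chain(rules, facts):
    # saturate: repeatedly fire every rule whose premises are all derived
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

def find_contradictions(rules, facts):
    # a literal and its negation ('~x') both derivable marks the base inconsistent
    derived = forward_chain(rules, facts)
    return sorted(a for a in derived if not a.startswith('~') and '~' + a in derived)

rules = [(('high_temp',), 'alarm'),        # illustrative rule base
         (('alarm',), 'shutdown'),
         (('maintenance',), '~shutdown')]
print(find_contradictions(rules, {'high_temp', 'maintenance'}))  # → ['shutdown']
```

    Note that no single rule pair conflicts here; the contradiction only emerges through the chain high_temp -> alarm -> shutdown, which is why rule-pair checking alone misses it.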